# [Various] Radeon RX 5700 XT and 5700 reviews



## ToTheSun!

The performance of the 5700 XT, relative to the rest of cards tested, is so bipolar. In the same review, it can be seen surpassing the 2080 and falling behind the 2070 by some 30%.


----------



## Rei86

RVII is dead.


----------



## SoloCamo

ToTheSun! said:


> The performance of the 5700 XT, relative to the rest of cards tested, is so bipolar. In the same review, it can be seen surpassing the 2080 and falling behind the 2070 by some 30%.


Yea, it seems all over the place. Overall I think once drivers are matured we should see more consistency.

And, just for the obligatory bit to annoy everyone:

"I'm glad I picked up this V64 for $399 a few months ago with 3 free games" 

I just want to see OC vs OC against Vega 56/64. I think a 1600MHz core / 1100MHz mem V64 should easily match the 5700 XT, especially at 1440p or higher. A stock V64 already has a ~40GB/s bandwidth lead, and putting the memory at 1100MHz brings that up to 563.2GB/s, which is quite a jump over the 5700 XT's 448GB/s. Memory overclocking on Vega makes a big difference.

Edit: maybe not, these benchmarks are all over the place
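For reference, those bandwidth numbers are just effective memory clock times bus width; a quick sanity-check sketch (using Vega 64's 2048-bit HBM2 bus and the 5700 XT's 256-bit GDDR6 at 14 Gbps effective, both public specs):

```python
def bandwidth_gbs(clock_mhz: float, bus_bits: int, data_rate: int) -> float:
    """Peak memory bandwidth in GB/s: transfers per second times bus width in bytes."""
    return clock_mhz * 1e6 * data_rate * (bus_bits / 8) / 1e9

v64_stock = bandwidth_gbs(945, 2048, 2)   # HBM2 at 945 MHz, double data rate -> ~483.8 GB/s
v64_oc    = bandwidth_gbs(1100, 2048, 2)  # memory OC'ed to 1100 MHz -> 563.2 GB/s
rx5700xt  = bandwidth_gbs(1750, 256, 8)   # GDDR6, 14 Gbps effective -> 448.0 GB/s

print(v64_stock, v64_oc, rx5700xt)
```

So the "~40GB/s lead" at stock and the 563.2GB/s figure both check out.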


----------



## AlphaC

https://www.computerbase.de/2019-07/radeon-rx-5700-xt-test/

"it's not loud" - famous last words.

The blower is louder than the GTX 1080 FE and spins up to 2,100 RPM.


Igor's Lab / Tomshw Germany hit 2.1GHz on water though. https://www.tomshw.de/2019/07/07/am...n-vega-und-fast-2-1-ghz-takt-unter-wasser/13/


Also, Linux results on the Mesa 19.2 driver:

https://www.phoronix.com/scan.php?page=article&item=radeon-5700-linuxgl&num=6


----------



## Gunderman456

Paper launch? Nothing is available in Canada yet, at least on Newegg or Amazon.


----------



## Imouto

https://www.eurogamer.net/articles/...5700-xt-review-head-to-head-with-nvidia-super


----------



## 113802

Decent stock blower for a change. The card does have a hotspot sensor that starts throttling at 115°C.


----------



## NightAntilli




----------



## Heuchler

Igor Labs' RX 5700 XT at 2.1GHz with EK prototype block


----------



## Heuchler

Guru3D Radeon RX 5700 and 5700 XT review
https://www.guru3d.com/articles-pages/amd-radeon-rx-5700-and-5700-xt-review,1.html


----------



## rv8000

Glad to see most of the reviews confirm there was some great progress made in engines where Nvidia was dusting AMD. Bought one for fun; it should be here by the end of this week. Wish I had a universal block to slap on it.


----------



## AmericanLoco

rv8000 said:


> Glad to see most of the reviews confirm there was some great progress made in engines where Nvidia was dusting AMD. Bought one for fun, should be here the end of this week, wish I had universal block to slap on it for fun.


Where did you buy it from?


----------



## geoxile

Disappointing results for a 7nm card. But it looks like AMD finally caught up in media playback efficiency, which is great news for next-gen mobile APUs; probably the most hype part of this for me. The boost behavior seems to have changed to Nvidia's model as well, clocking above the rated boost when there's headroom for it. So maybe custom cards will hit up to 2.1 GHz.


----------



## prjindigo

Keep in mind that the "RX 460" just kept up with the "GTX 1070" in this circumstance... this is going to be hilarious.


----------



## SoloCamo

Heuchler said:


> Guru3D Radeon RX 5700 and 5700 XT review
> https://www.guru3d.com/articles-pages/amd-radeon-rx-5700-and-5700-xt-review,1.html


Have to admit, their results seem to be fairly spot on. I did a non-OC'ed and OC'ed run of my V64 in Deus Ex matching their listed settings, and at stock my V64 matched their 38fps at 4K. When I bumped it to never drop below 1605MHz core / 1100 mem, it put me at 43.6fps. 45fps on a stock 5700 XT at 4K is better than I was anticipating at that high a res.



geoxile said:


> AMD finally caught up to media playback efficiency, which is great news for next gen mobile APUs. Probably the most hype part about this for me. The boost seems to have changed to nvidia's model as well, overclocking over the boost when there's headroom for it. So maybe custom cards will hit up to 2.1ghz


Yup, going to finally replace my laptop (A8-6410 Puma+ 15w quad core cpu) with Ryzen 3's APU line.


----------



## keikei

Gunderman456 said:


> Paper launch? Nothing is available in Canada yet at least on newegg or amazon.





@ Newegg (U.S.): The 5700 is posted, but is pushed back to later in the week. The XT isn't even posted yet. Looks like both will have a *slight* delay.


----------



## Heuchler

AMD Navi 5700 XT Live-Test from PCGH


----------



## ibb27

Anandtech

https://www.anandtech.com/show/14618/the-amd-radeon-rx-5700-xt-rx-5700-review


----------



## mohit9206

Reference card sucks; wait for aftermarket cards.


----------



## NightAntilli

ibb27 said:


> Anandtech
> 
> https://www.anandtech.com/show/14618/the-amd-radeon-rx-5700-xt-rx-5700-review


This is particularly interesting... Navi is weaker at everything compared to Vega, except FP texture fillrate... The Navi cards lose in tessellation, pixel fillrate, integer texture fillrate, INT8 buffer compression and FP32 buffer compression. And then...

Also, when they tested Metro Exodus, the Navi cards actually did quite well, while at other sites they didn't. On the other websites, HairWorks, which uses tessellation, is likely hampering the Navi cards. And since AnandTech specifically disabled HairWorks, other websites most likely didn't, skewing the results in Nvidia's favor. Keep in mind, the 5700 XT actually comes very close to rivaling the Radeon VII LOL.


----------



## EastCoast

LOL, this is awesome.
This is one of Radeon's best launches in a long time. Sure, they should have had some aftermarket coolers at launch, but from what I see so far the 5700 is a good card.
Can't wait for the aftermarket coolers. The Nitro 5700 XT is going to be a beast.
Muhahaha

As with most Radeon video card launches, the press-release drivers tend to be more about stability, with performance drivers coming later on (weeks later...).




NightAntilli said:


> This is particularly interesting... Navi is weaker at everything compared to Vega, except FP texture fillrate... The Navi cards lose in Tessellation, Pixel fillrate, Integer texture fillrate, INT8 buffer compression and FP32 buffer compression. And then...;


That has always been the case for Radeon, from what I recall. Looks like they were telling the truth about improving the uarch.



NightAntilli said:


> Also, when they tested Metro Exodus, the Navi cards actually did quite well, while at other sites they didn't. On the other websites, Hairworks, which uses tessellation, is likely hampering the Navi cards. And since Anandtech specifically disabled hairworks, other websites most likely didn't, skewing the results in nVidia's favor. Keep in mind, the 5700XT actually is very close to rivaling the Radeon VII LOL.


I blame NDA 2.0. Some sites might prefer to enable Radeon-crippling features found in some games. Are they still posting a "Test Methodology" anymore? If so, it should be included.


----------



## Heuchler

[Coreteks] RX 5700 & 5700 XT REVIEW


----------



## keikei




----------



## crazycrave

I tested the new 19.7.1 driver on my X58 system with Anti-Lag on in the driver, running DX12 on CrossFired RX 570s at stock 1250/1750 clocks: 67 fps at Ultra 4K. On a 10-year-old platform, that's fast.


----------



## tpi2007

Gamers Nexus review is up, third video in the OP. Or here:







The 5700 (non-XT) is artificially locked in terms of overclocking potential so that you have to buy the XT, according to GN. No, AMD, this is not how you do this.


----------



## runwiththedevil

https://www.techpowerup.com/review/amd-radeon-rx-5700/34.html


^ Odd results on the fan speed: the RPM slows down a bit after hitting around 71°C, and thus the card gets even hotter.
The voltage is quite odd as well: at idle it's at 0.775V (the 5700 XT is lower at 0.725V... doesn't make much sense), while under load it seems locked at 0.987V no matter the frequency.


I guess the 5700 still needs to be fixed.


----------



## EastCoast

keikei said:


> snip


Yeah, I unsubscribed from that guy a while back. Ever since he tried to defend NDA 2.0 with a lawyer who clearly contradicted him. We had a discussion about this on this forum.
Furthermore, he comes off as ultra hyper critical of Radeon as a whole. The guy just turns me off.


----------



## 113802

NightAntilli said:


> This is particularly interesting... Navi is weaker at everything compared to Vega, except FP texture fillrate... The Navi cards lose in Tessellation, Pixel fillrate, Integer texture fillrate, INT8 buffer compression and FP32 buffer compression. And then...;
> 
> Also, when they tested Metro Exodus, the Navi cards actually did quite well, while at other sites they didn't. On the other websites, Hairworks, which uses tessellation, is likely hampering the Navi cards. And since Anandtech specifically disabled hairworks, other websites most likely didn't, skewing the results in nVidia's favor. Keep in mind, the 5700XT actually is very close to rivaling the Radeon VII LOL.


Not surprising at all, just look at the compute workloads where the Vega 64 crushes even the RTX 2070. Vega is a high performance compute architecture while RDNA was built purely for gaming. Vega won't be going anywhere any time soon and we'll continue to see it until it's replaced with Arcturus.


----------



## Heuchler

SWE overclockers RX 5700 XT overclocking/undervolting results
https://www.sweclockers.com/test/27832-amd-radeon-rx-5700-och-rx-5700-xt-navi/10#content


----------



## keikei

If you *must* get the cheapest card, then opt for the reference. I'd gladly pay a bit more for better acoustics/thermals. I'm sure cheaping out (is that a word?) on the cooler helped enable AMD to drop their prices.


----------



## tpi2007

EastCoast said:


> As with most Radeon released video cards press release drivers tend to be more about stability. With performance drivers coming later on (weeks later...).




Right, right...

https://www.anandtech.com/show/14618/the-amd-radeon-rx-5700-xt-rx-5700-review/3


> The big issue at the moment is that while AMD’s drivers are in fairly good shape for gaming, the same cannot be said for compute. Most of our compute benchmarks either failed to have their OpenCL kernels compile, triggered a Windows Timeout Detection and Recovery (TDR), or would just crash. As a result, only three of our regular benchmarks were executable here, with Folding@Home, parts of CompuBench, and Blender all getting whammied.
> 
> And "executable" is the choice word here, because even though benchmarks like LuxMark would run, the scores the RX 5700 cards generated were nary better than the Radeon RX 580. This is a part that they can easily beat on raw FLOPs, let alone efficiency. So even when it runs, the state of AMD's OpenCL drivers is at a point where these drivers are likely not indicative of anything about Navi or the RDNA architecture; only that AMD has a lot of work left to go with their compiler.





> Finally, for whatever reason, the RX 5700 cards wouldn’t display the boot/BIOS screens when hooked up to my testbed monitor over HDMI. This problem did not occur with DisplayPort, which is admittedly the preferred connection anyhow. But it’s an odd development, since this behavior doesn’t occur with Vega or Polaris cards – or any other cards I’ve tested, for that matter.


----------



## EastCoast

tpi2007 said:


> Right, right...
> 
> https://www.anandtech.com/show/14618/the-amd-radeon-rx-5700-xt-rx-5700-review/3


> **The big issue at the moment is that while AMD’s drivers are in fairly good shape for gaming**

There, I gotcha. Bold for emphasis.
Anyone who makes a post is not speaking in absolutes. You need to work on how to properly interpret what's being inferred, implied, or the general context of the message. Therefore, as mentioned in your own post, the drivers are stable. That does not mean there is no room for improvement.

Therefore, I still stand by what I said: press-release drivers tend to be more about stability, performance drivers later on.
Enjoy! :thumb:


----------



## ilmazzo

Very good gaming card. I wonder if someone was able to undervolt it, to see how good this 7nm really is?
It would get noise under control and reduce the frequency roller coaster.
I don't get why we again have the graphite thermal pad and the hotspot temp; I thought those were a legacy of Vega....

The VII was the most interesting card to get for optimization, but now we have a new king here... lots of driver optimizations will come in the next releases; this is a completely new beast imho


----------



## 113802

ilmazzo said:


> very good gaming card, I wonder if someone was able to downvolt it to see this 7nm how are good?
> it will get noise under control and reduce frequency roller coaster
> I don’t get why we have again the graphite thermal pad and the hotspot temp, thought were a legacy of vegas....
> 
> if vii was the most interesting card to get for optimization but now we have a new king here...lot of optimizations on driver will came in the next releases, this is completely a new beast imho


The graphite pad is used since it's consistent, compared to thermal paste which will have an inconsistent spread. Clearly the hotspot sensors are there to prevent heat from damaging the GPU, given the heat density of 7nm. The VII's selling point is prosumer performance.


----------



## EastCoast

ilmazzo said:


> very good gaming card, I wonder if someone was able to downvolt it to see this 7nm how are good?
> it will get noise under control and reduce frequency roller coaster
> I don’t get why we have again the graphite thermal pad and the hotspot temp, thought were a legacy of vegas....
> 
> if vii was the most interesting card to get for optimization but now we have a new king here...lot of optimizations on driver will came in the next releases, this is completely a new beast imho


I still think this card needs better cooling. Not just for the GPU but for the vregs/mosfets/ram/etc. Once Sapphire intro their lineup (Nitro, etc) I will be looking into one of these myself.


----------



## AlphaC

I think you can summarize all the RX 5700 series reviews as "wait for the Sapphire NITRO/Vapor-X/TOXIC cards", unless you plan on shelling out for a non-existent waterblock (EKWB has a prototype) or trying your luck with an Arctic Accelero for ~$50.


----------



## rv8000

AmericanLoco said:


> rv8000 said:
> 
> 
> 
> Glad to see most of the reviews confirm there was some great progress made in engines where Nvidia was dusting AMD. Bought one for fun, should be here the end of this week, wish I had universal block to slap on it for fun.
> 
> 
> 
> Where did you buy it from?

Newegg, around 10 AM EST. Sapphire was the only card I could find at that time.


----------



## Heuchler

WannaBeOCer said:


> The graphite pad is used since it's consistent compared to thermal paste which will have inconsistent spread. Clearly the hotspot sensors are on it to prevent heat from damaging the GPU due to the heat density of 7nm. VII selling point is prosumer performance.


RX 5700 cards use the same Hitachi HM03 thermal solution used in RVII [8:40 min] in the new Bring Up video https://youtu.be/-ruSlwmcOAI?t=473

Over 40 W/mK according to Hitachi - https://www.hitachi-chem.co.jp/english/products/cc/026.html


----------



## Heuchler

ilmazzo said:


> very good gaming card, I wonder if someone was able to downvolt it to see this 7nm how are good?
> it will get noise under control and reduce frequency roller coaster
> I don’t get why we have again the graphite thermal pad and the hotspot temp, thought were a legacy of vegas....
> 
> if vii was the most interesting card to get for optimization but now we have a new king here...lot of optimizations on driver will came in the next releases, this is completely a new beast imho


Yes, SWE overclockers provided undervolting results in their review.
https://www.overclock.net/forum/28029650-post30.html


----------



## 113802

Heuchler said:


> RX 5700 cards use the same Hitachi HM03 thermal solution used in RVII [8:40 min] in the new Bring Up video https://youtu.be/-ruSlwmcOAI?t=473
> 
> Over 40 W/mK according to Hitachi - https://www.hitachi-chem.co.jp/english/products/cc/026.html


I'm aware they're using the same graphite pad, and as I stated, they also said the graphite pad allows for more consistent thermal transfer compared to thermal paste.


----------



## NightAntilli

WannaBeOCer said:


> Not surprising at all, just look at the compute workloads where the Vega 64 crushes even the RTX 2070. Vega is a high performance compute architecture while RDNA was built purely for gaming. Vega won't be going anywhere any time soon and we'll continue to see it until it's replaced with Arcturus.


Maybe... But why are the particle physics scores of these cards through the roof? That doesn't really make it seem like a purely gaming graphics card.

https://browser.geekbench.com/v4/compute/4259036 (scroll to bottom, 200k FPS, not a typo).


----------



## Defoler

NightAntilli said:


> Also, when they tested Metro Exodus, the Navi cards actually did quite well, while at other sites they didn't. On the other websites, Hairworks, which uses tessellation, is likely hampering the Navi cards. And since Anandtech specifically disabled hairworks, other websites most likely didn't, skewing the results in nVidia's favor. Keep in mind, the 5700XT actually is very close to rivaling the Radeon VII LOL.


A few other reviews I have seen also disabled HairWorks. Their results are similar to AnandTech's in terms of relative performance.
Now, while using HairWorks does change the results, if you have that tech and it adds good visuals, why not use it?
Would you not use TressFX in games just because "oh, it hinders Nvidia, so I shouldn't use it"?


----------



## 113802

NightAntilli said:


> Maybe... But why are particle physics scores of these cards through the roof? Doesn't really make it seem like a gaming graphics card in that regard.
> 
> https://browser.geekbench.com/v4/compute/4259036 (scroll to bottom, 200k FPS, not a typo).


Particle physics is often used when developing games, especially for water, and in games that utilize PhysX.


----------



## tpi2007

PC Perspective review added to the OP.






EastCoast said:


> There, I gotcha. Bold for emphasis.
> Any one who makes a post is not making an absolute. You need to work on how to properly interpret what's being inferred, implied, or the general context of the message. Therefore, as mentioned in your own post, the drivers are stable. Does not mean there is no room for improvement.
> 
> Therefore, I still stand by what I said, press drivers release tend to be more about stability. Performance drivers later on.
> Enjoy! :thumb:



Right, right, make it all relative, nice way to get out of it. Keep going.


----------



## ilmazzo

Nice find

I was quite sure that the UV profile would squeeze out more perf than the broken OC setting.


----------



## 113802

Gamers Nexus talks about boost on Navi and states that AMD renamed the boost states. Expect boost to work exactly the same as Vega 20, where the turbo boost will only run for a few seconds, just like peak boost.

Talk about boost starts @ 2:00






They artificially locked the overclock frequency on the RX 5700. What an anti-consumer way to get sales on their higher end cards.


----------



## Heuchler

WannaBeOCer said:


> Gamer Nexus talks about boost on Navi and states that they renamed the boost names. Expect boost to work exactly the same as Vega 20 where the turbo boost will only run for a few seconds just like peak boost.
> 
> Talk about boost starts @ 2:00
> 
> https://youtu.be/-SAWtKEIYbw
> 
> They artificially locked the overclock frequency on the RX 5700. What an anti-consumer way to get sales on their higher end cards.



Could just be a driver bug, since they stated that the RX 5700 will overclock in the PCWorld video last week with Scott Herkelman and Robert Hallock [44 minutes, or 2700 seconds I guess]
https://youtu.be/OY8qvK5XRgA?t=2700

SWE Overclockers did overclock their RX 5700 with decent (but not super) results, as in not as high as RX 5700 XT stock results
https://www.sweclockers.com/test/27832-amd-radeon-rx-5700-och-rx-5700-xt-navi/10#content

Gamers Nexus needs to reach out to Buildzoid for consulting/content. But if I ever need a mod mat I will watch one of GN's videos.


----------



## PontiacGTX

Just wondering: has anyone found a review where they benchmarked a 5700/XT in AIDA64? Has AMD improved 32-bit integer performance with RDNA?


----------



## tyvar

With the complaints about borked drivers, does this mean Finewine(tm) is back?


----------



## tpi2007

Digital Foundry RX 5700 XT video review added to the OP:


----------



## 113802

Heuchler said:


> Could just be a driver bug since they stated that the RX 5700 will overclock at PCWorld video last week with Scott Herkelman and Robert Hallock [44 minutes or 2700 seconds i guess]
> https://youtu.be/OY8qvK5XRgA?t=2700
> 
> SWE Overclockers did overclock their RX 5700 with decent (but not super results) as in not as high as RX 5700 XT stock results
> https://www.sweclockers.com/test/27832-amd-radeon-rx-5700-och-rx-5700-xt-navi/10#content
> 
> Gamers Nexus needs to high out to Buildzoid for consulting/content. But if I even need a mod mat I will watch one of GN videos.


I hope Steve from Gamers Nexus is incorrect. 



> The 5700 non XT can't be overclocked past a certain frequency


----------



## SoloCamo

SoloCamo said:


> Have to admit, their results seem to be fairly spot on. I did a non oc'ed and oc'ed run of my V64 on DeusEx mactching their listed settings and at stock my V64 matched their 38fps at 4k. When I bumped it to never drop below 1605mhz core /1100mem it put me at 43.6fps. 45fps on a stock 5700XT at 4k is better than I was anticipating at that high of a res.


Just to follow up on this: going to the latest drivers brought me up to 44.6fps on the same overclock. So a decently OC'ed, reference-cooled V64 is about on par with a stock reference 5700 XT, at least as far as Deus Ex goes. Will have to test other games. Keep in mind, I'm also on a lowly 4.6GHz 4790K w/ 16GB DDR3 CAS10 2400MHz vs. their 9900K.


----------



## JackCY

Borked drivers on launch, again. OC near impossible in most reviews especially VRAM.

Seems rushed again on software side. Price not competitive, again. Custom cards nowhere to be seen on launch, again. AMD doesn't learn from repeated mistakes, again.

---

Performance-wise, Navi sometimes does OK and other times it plummets. The focus on single-threaded performance shows in the results, to me: the cards tend to jump up in performance (to the top level of all GPUs) in high-FPS, low-latency situations, but hammer them with some engines at 1440p or 4K and they suffer, dropping to 2060 level.


----------



## SoloCamo

JackCY said:


> Borked drivers on launch, again. OC near impossible in most reviews especially VRAM.
> 
> Seems rushed again on software side. Price not competitive, again. Custom cards nowhere to be seen on launch, again. AMD doesn't learn from repeated mistakes, again.


Beats competition with ease at the same price points but hey, clearly the price is not competitive because of the 2060's awesome ray tracing performance.


----------



## Gunderman456

R9 3900X does not beat the 9900k in gaming and they are about the same price.

5700 is locked down by AMD and cannot be overclocked. Most likely to protect the 5700 XT.

5700 XT is overclocked by AMD to the extreme. Right now it can't be overclocked, as drivers seem broken, and there does not seem to be much headroom anyway. It is loud and hot.

The 2060 Super, the 2070, the 5700 XT, the 2070 Super and the Radeon VII are all about the same performance.

Better luck next time AMD.

/Thread


----------



## JackCY

While having fewer features and lower support overall. Drivers don't allow a working OC yet either. Only reference cards are available, running at 90°C under load, and loud. No thanks. Please compare custom cards vs. custom cards at retail prices people can actually buy, not magical MSRPs and reference cards almost no one wants.

12% behind 2070S.
2% ahead of a 175W 2070.
https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/28.html

Value wise it's really a tie on custom cards especially if you consider outgoing 2070 sales.
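As an aside, "X% behind" flips depending on which card you divide by, which explains some of the disagreement between numbers quoted in different reviews; a quick illustration (the fps values here are made up):

```python
def percent_diff(card: float, ref: float) -> float:
    """Performance of `card` relative to `ref`, as a signed percentage."""
    return (card / ref - 1) * 100

# Hypothetical average fps, for illustration only
fps = {"5700 XT": 100.0, "2070S": 113.6}

print(percent_diff(fps["5700 XT"], fps["2070S"]))  # ~ -12: the XT is ~12% behind the 2070S
print(percent_diff(fps["2070S"], fps["5700 XT"]))  # ~ +13.6: but the 2070S is ~14% ahead of the XT
```

Same data, different baseline, different headline number.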


----------



## 113802

Gunderman456 said:


> 5700 is locked down by AMD and cannot be overclocked. Most likely to protect the 5700 XT.
> 
> 5700 XT is overclocked by AMD to the extreme. Right now can't be overclocked as drivers seem broken and there does not seem to be much headway anyway. It is loud and hot.
> 
> The 2060 Super, the 2070, the 5700 XT, the 2070 Super and the Radeon VII are all about the same performance.
> 
> Better luck next time AMD.
> 
> /Thread


Steve from Gamers Nexus said it can be overclocked to a certain point. Hopefully this is wrong. Hopefully it overclocks decently. 

The RX 5700 XT is on average 8% slower than the Radeon VII @ 1440p and 12% slower @ 4k

The RX 5700 XT is on average 12% slower than the RTX 2070 Super at 1440p and 14% slower @ 4k

Edit: Anyone find any reviews regarding RIS? Can't find any.


----------



## DNMock

Gunderman456 said:


> R9 3900X does not beat the 9900k in gaming and they are about the same price.
> 
> 5700 is locked down by AMD and cannot be overclocked. Most likely to protect the 5700 XT.
> 
> 5700 XT is overclocked by AMD to the extreme. Right now can't be overclocked as drivers seem broken and there does not seem to be much headway anyway. It is loud and hot.
> 
> The 2060 Super, the 2070, the 5700 XT, the 2070 Super and the Radeon VII are all about the same performance.
> 
> Better luck next time AMD.
> 
> /Thread



Get that garbage trolling out of here...

The 3900X matches the 9900K in gaming (within about 5% in low-resolution gaming and identical above 1080p) and smashes it in everything else at the same price point.

The GPUs seem competitive at their price points, thanks to the knee-jerk price drop at least.


----------



## bigjdubb

There seems to be some pretty good potential buried in there. I don't know if anything great will come of these cards, but they give me some optimism for what's coming. It would be great if they could hold on to their comparable 1080p performance levels at 4K.


----------



## Ha-Nocri

So, the 5700 is ~10% faster than the 2060 at 1440p and costs the same. Seems like a no-brainer to me. That will be my new GPU... once the Sapphire cards come out. I don't even bother thinking about ray tracing; at 1440p the frame rate is terrible on a 2060. RT is just not ready atm.


----------



## magnek

ToTheSun! said:


> The performance of the 5700 XT, relative to the rest of cards tested, is so bipolar. In the same review, it can be seen surpassing the 2080 and falling behind the 2070 by some 30%.


Blame Unreal Engine 4. Compare the following results:

*Unreal Engine 4*
Ace Combat 7
DarkSiders 3

*Dunia Engine*
Far Cry 5

*Frostbite Engine*
Battlefield V

*REDEngine 3*
Witcher 3

UE4 simply does not play well with AMD for whatever reason, and represents one extreme of the spectrum where 5700 XT can't even keep up with a 2060 (Super)! Then on the other extreme, you've got the Frostbite engine which allows the 5700 XT to zoom past even a 2080. And of course in between those two extremes you've got a bunch of other games running on various engines where the general trend is 5700 XT slots somewhere in between 2070 and 2070 Super (or "expected performance" I guess).

Hopefully we don't get to a point where we have to buy our GPUs based on what game engine a game uses LOL. Seriously, if someone spends most of their time playing BF V, they truly have no reason to get an $800 nVidia card when a $400 AMD card smashes it. OTOH, if one plays a lot of UE4 based games, it would make no sense to buy anything other than nVidia. Such is the sad reality in 2019 I guess...


----------



## ToTheSun!

magnek said:


> Blame Unreal Engine 4. Compare the following results:
> 
> *Unreal Engine 4*
> Ace Combat 7
> DarkSiders 3
> 
> *Dunia Engine*
> Far Cry 5
> 
> *Frostbite Engine*
> Battlefield V
> 
> *REDEngine 3*
> Witcher 3
> 
> UE4 simply does not play well with AMD for whatever reason, and represents one extreme of the spectrum where 5700 XT can't even keep up with a 2060 (Super)! Then on the other extreme, you've got the Frostbite engine which allows the 5700 XT to zoom past even a 2080. And of course in between those two extremes you've got a bunch of other games running on various engines where the general trend is 5700 XT slots in right between 2070 and 2070 Super (or "expected performance" I guess).
> 
> Hopefully we don't get to a point where we have to buy our GPUs based on what game engine a game uses LOL. Seriously, if someone spends most of their time playing BF V, they truly have no reason to get an $800 nVidia card when a $400 AMD card smashes it. OTOH, if one plays a lot of UE4 based games, it would make no sense to buy anything other than nVidia. Such is the sad reality in 2019 I guess.


Well, damn, you're right. The 5700 XT looks pretty good otherwise. Epic have been a complete letdown lately.

If it weren't for UE4 games, the 5700 XT would be the absolute best perf/price, hands down.


----------



## SoloCamo

magnek said:


> Blame Unreal Engine 4. Compare the following results:
> 
> *Unreal Engine 4*
> Ace Combat 7
> DarkSiders 3
> 
> *Dunia Engine*
> Far Cry 5
> 
> *Frostbite Engine*
> Battlefield V
> 
> *REDEngine 3*
> Witcher 3
> 
> UE4 simply does not play well with AMD for whatever reason, and represents one extreme of the spectrum where 5700 XT can't even keep up with a 2060 (Super)! Then on the other extreme, you've got the Frostbite engine which allows the 5700 XT to zoom past even a 2080. And of course in between those two extremes you've got a bunch of other games running on various engines where the general trend is 5700 XT slots somewhere in between 2070 and 2070 Super (or "expected performance" I guess).
> 
> Hopefully we don't get to a point where we have to buy our GPUs based on what game engine a game uses LOL. Seriously, if someone spends most of their time playing BF V, they truly have no reason to get an $800 nVidia card when a $400 AMD card smashes it. OTOH, if one plays a lot of UE4 based games, it would make no sense to buy anything other than nVidia. Such is the sad reality in 2019 I guess...


Frostbite is pretty much the reason I've been on AMD.


----------



## magnek

ToTheSun! said:


> Well, damn, you're right. The 5700 XT looks pretty good otherwise. Epic have been a complete letdown lately.
> 
> If it weren't for UE4 games, the 5700 XT would be the absolute best perf/price, hands down.


Absolutely. If AMD fixed their UE4 weakness, their cards would really shine against nVidia's when it came to perf/price. 

The other alternative is to not play UE4 games.


----------



## ToTheSun!

magnek said:


> Absolutely. If AMD fixed their UE4 weakness, their cards would really shine against nVidia's when it came to perf/price.
> 
> The other alternative is to not play UE4 games.


Have any id tech 6 games been tested yet in reviews? I'd like to see the 5700 XT's performance there.


----------



## looniam

magnek said:


> Blame Unreal Engine 4. Compare the following results:
> 
> *Unreal Engine 4*
> Ace Combat 7
> DarkSiders 3
> 
> *Dunia Engine*
> Far Cry 5
> 
> *Frostbite Engine*
> Battlefield V
> 
> *REDEngine 3*
> Witcher 3
> 
> UE4 simply does not play well with AMD for whatever reason, and represents one extreme of the spectrum where 5700 XT can't even keep up with a 2060 (Super)! Then on the other extreme, you've got the Frostbite engine which allows the 5700 XT to zoom past even a 2080. And of course in between those two extremes you've got a bunch of other games running on various engines where the general trend is 5700 XT slots somewhere in between 2070 and 2070 Super (or "expected performance" I guess).
> 
> Hopefully we don't get to a point where we have to buy our GPUs based on what game engine a game uses LOL. Seriously, if someone spends most of their time playing BF V, they truly have no reason to get an $800 nVidia card when a $400 AMD card smashes it. OTOH, if one plays a lot of UE4 based games, it would make no sense to buy anything other than nVidia. Such is the sad reality in 2019 I guess...


*^THAT^* is why there is a rep button people.



that is all, carry on. :thumb:


----------



## ilmazzo

ToTheSun! said:


> Well, damn, you're right. The 5700 XT looks pretty good otherwise. Epic have been a complete letdown lately.
> 
> If it weren't for UE4 games, the 5700 XT would be the absolute best perf/price, hands down.


Don't know what's wrong with UE, but it's trash on the red side... they lose at least one tier in performance...


----------



## 113802

They're in stock on Newegg: 

RX 5700 XT: https://www.newegg.com/GraphicsCardsPromoStore/EventSaleStore/ID-1473

RX 5700: https://www.newegg.com/GraphicsCardsPromoStore/EventSaleStore/ID-1474

Also AMD's site

AMD: https://www.amd.com/en/where-to-buy/promotions


----------



## paulerxx

tpi2007 said:


> Digital Foundry RX 5700 XT video review added to the OP:
> 
> https://www.youtube.com/watch?v=1QYeR2HJplk


Digital Foundry dropped the ball on this review, comparing a $500 card to a $400 one and then talking down to the $400 card as if it weren't $100 cheaper. If that's not a clear-cut sign of bias, I don't know what is.


If you're buying a new Ryzen CPU with a new 5700 XT, you can get the 5700 XT for $350 at Microcenter (at least the one in my area).

You also save $$ on Mobos.


----------



## magnek

ToTheSun! said:


> Have any id tech 6 games been tested yet in reviews? I'd like to see the 5700 XT's performance there.


Could've sworn Wolfenstein II is supposed to be Id Tech 6, but for some reason TPU reports Id Tech 5. In any case:

Sauces: TPU, Toms

Then some interactive graphs on Tweakers.net which I'm too lazy to screenshot then copypasta here.

Slightly faster (~5%) than 2070, but behind a 2070 Super. So "as expected" performance.


----------



## ToTheSun!

magnek said:


> Slightly faster (~5%) than 2070, but behind a 2070 Super. So "as expected" performance.


I suppose looniam is right; REPs all around.


----------



## PontiacGTX

No site benchmarked AIDA64, then?


----------



## NightAntilli




----------



## 113802

NightAntilli said:


>


Just like the Radeon VII, these cards are going to be impressive under water. Most of the review cards were most likely thermal throttling. EK blocks are already available.


----------



## PontiacGTX

WannaBeOCer said:


> Just like the Radeon VII these cards are going to be impressive under water. Most of the reviews are most likely thermal throttled.


If AMD knows the OC potential, why do they keep repeating the same error year after year? 290X, 390X (ref), 480, RX Vega 56/64, RX 5700/XT. Why not improve the reference cooling, or not release a reference design at all, like they did with the 580/590?


----------



## Ha-Nocri

Steve from Hardware Unboxed also hit the frequency limit on the RX 5700... don't like it.


----------



## 113802

PontiacGTX said:


> If AMD knows the OC potential. why they keep repeating the same error year after year. 290x,390x(ref),480, RX vega 56/64,RX 5700 /XT


I can't comment on Hawaii or Polaris. The Vega 56/64 could overclock a decent amount, but the boost behavior was terrible. They fixed the boost with the Radeon VII, and most of them can hit 1950MHz with proper cooling and 1200MHz on memory. If you're referring to the stock cooler, I do not know why they keep failing with stock coolers.

I bugged my Vega 64 and was able to run it at a sustained 1800MHz

https://www.3dmark.com/fs/18270986

https://hwbot.org/submission/407152...___1080p_xtreme_radeon_rx_vega_64_5503_points

https://youtu.be/GXDced_nNPw
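As a rough sanity check on that 1200MHz memory figure: on a double-data-rate bus, bandwidth scales linearly with the memory clock. A minimal sketch, assuming the Radeon VII's 4096-bit HBM2 interface at 1.0GHz stock:

```python
# Effective bandwidth of a double-data-rate memory bus:
#   GB/s = (bus width in bits / 8 bytes) * 2 transfers per clock * clock in GHz
def bandwidth_gbs(bus_width_bits: int, clock_ghz: float) -> float:
    return bus_width_bits / 8 * 2 * clock_ghz

print(bandwidth_gbs(4096, 1.0))  # 1024.0 GB/s at stock
print(bandwidth_gbs(4096, 1.2))  # 1228.8 GB/s at the 1200MHz OC
```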


----------



## Heuchler

magnek said:


> Could've sworn Wolfenstein II is supposed to be Id Tech 6, but for some reason TPU reports Id Tech 5.


Should be, as you said, id Tech 6:
https://en.wikipedia.org/wiki/Id_Tech_6

Doom (2016) – by id Software
Wolfenstein II: The New Colossus (2017) – by MachineGames
Doom VFR (2017) – by id Software
Wolfenstein: Youngblood (2019) – by MachineGames and Arkane Studios

Ace Combat 7 - Unreal Engine 4
Assassin's Creed Odyssey - AnvilNext 2.0
Battlefield V - Frostbite 3
Hunt: Showdown - CryEngine/Lumberyard
Civilization VI - Firaxis Engine
Darksiders 3 - Unreal Engine 4
Devil May Cry 5 - RE Engine
Divinity Original Sin 2 - Divinity Engine 2
F1 2018 - EGO Engine 3.0
Far Cry 5 - Dunia 2
Hitman 2 - Glacier Engine II
Metro Exodus - 4A Engine
Monster Hunter World - MT Framework 2.x
Rage 2 - Apex Engine
Rainbow Six Siege - AnvilNext 2.0
Wildlands - AnvilNext 2.0
Sekiro - PhyreEngine 
Shadow of the Tomb Raider - Foundation Engine
Shadow of War - LithTech Firebird
Strange Brigade - Asura Engine
The Division 1 & 2 - Snowdrop Engine
Witcher 3 - REDEngine 3
Wolfenstein 2 - idTech 6


----------



## gamervivek

WannaBeOCer said:


> Just like the Radeon VII these cards are going to be impressive under water. Most of the reviews are most likely thermal throttled. EK blocks are already available.


From German Tom's,



> If you give the little monkey really sugar, then it is even close to the 2.1 GHz mark and in places even more! Of course, then, what flows through the veins, no more blood stream, but high-energy electricity needed life juice. But it works. Easier and more stable than the Radeon VII has ever done. At about 2.1 GHz, there seems to be a kind of physical barrier, but what the hell. Exactly there will be a follow-up as article and video.


https://www.tomshw.de/2019/07/07/am...n-vega-und-fast-2-1-ghz-takt-unter-wasser/12/


----------



## PontiacGTX

WannaBeOCer said:


> I can't comment on Hawaii or Polaris. The Vega 56/64 could overclock a decent amount but the boost was terrible. They fixed the boost with the Radeon VII and most of them can hit 1950Mhz with proper cooling and 1200Mhz on memory. If you're referring to a stock cooler I do not know why they keep failing with stock coolers.
> 
> I bugged my Vega 64 and was able to run it at a sustained 1800Mhz
> 
> https://www.3dmark.com/fs/18270986
> 
> https://hwbot.org/submission/407152...___1080p_xtreme_radeon_rx_vega_64_5503_points
> 
> https://youtu.be/GXDced_nNPw


Stock coolers limit the OC potential, at least with thermal throttling; I don't think anyone wants a card throttling when they overclock. Why did the R9 Fury/X, RX 590 and Radeon VII get decent cooling solutions at release? Is it too expensive for a $450 card? Then make a special edition with a non-reference cooler for a slightly higher cost.



gamervivek said:


> From German Tom's,
> 
> 
> 
> https://www.tomshw.de/2019/07/07/am...n-vega-und-fast-2-1-ghz-takt-unter-wasser/12/


Well, did they measure the power consumption?

Also, gaming > torture test?


----------



## magnek

gamervivek said:


> From German Tom's,
> 
> 
> 
> If you give the little monkey really sugar, then it is even close to the 2.1 GHz mark and in places even more! Of course, then, what flows through the veins, no more blood stream, but high-energy electricity needed life juice. But it works. Easier and more stable than the Radeon VII has ever done. At about 2.1 GHz, there seems to be a kind of physical barrier, but what the hell. Exactly there will be a follow-up as article and video.
> 
> 
> 
> https://www.tomshw.de/2019/07/07/am...n-vega-und-fast-2-1-ghz-takt-unter-wasser/12/

All I got from this was sugar = high electricity life juice for little monkeys.

...which actually kinda makes sense, given how they're always eating bananas. I think the Germans are onto something! :thinking:


----------



## gamervivek

100% fan speed for >2GHz clocks, often at 2.1GHz, wow.

https://www.youtube.com/watch?v=rz47WqRDDK4

Timestamp 11:30


----------



## Gunderman456

Canada Computers have the 5700 XT now for between $540-$560. Exchange rate is off as usual as $399 US is $521.87 CAD. The $499 US 3900X is selling for $689 which should be $652.52. Sigh...


----------



## criminal

I am happy with the results and can hope for more as drivers mature. And finally, I have an all AMD rig again after many, many years. Today was a lovely day.


----------



## Gunderman456

criminal said:


> I am happy with the results and can hope for more as drivers mature. And finally, I have an all AMD rig again after many, many years. Today was a lovely day.


So what will you be getting?


----------



## magnek

Gunderman456 said:


> Canada Computers have the 5700 XT now for between $540-$560. Exchange rate is off as usual as $399 US is $521.87 CAD. The $499 US 3900X is selling for $689 which should be $652.52. Sigh...


Uhhhh that's a 3.5-7.3% markup, which honestly is kinda amazing. Literally everything in Canada is more expensive than the exchange rate would indicate, and a single digit markup is something to be celebrated.
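The markup math checks out; a quick sketch using the exchange rate implied by the $399 USD = $521.87 CAD conversion quoted above:

```python
# Implied USD -> CAD rate from the quoted conversion ($399 USD = $521.87 CAD)
RATE = 521.87 / 399  # ~1.308

def markup_pct(cad_street_price: float, usd_msrp: float) -> float:
    """Percent markup of a CAD street price over the converted USD MSRP."""
    return (cad_street_price / (usd_msrp * RATE) - 1) * 100

print(f"5700 XT @ $540 CAD: {markup_pct(540, 399):.1f}%")  # 3.5%
print(f"5700 XT @ $560 CAD: {markup_pct(560, 399):.1f}%")  # 7.3%
print(f"3900X   @ $689 CAD: {markup_pct(689, 499):.1f}%")  # 5.6%
```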



criminal said:


> I am happy with the results and can hope for more as drivers mature. And finally, I have an all AMD rig again after many, many years. Today was a lovely day.


:thumb:

As usual I'm taking the "wait at least 2 months for everything to be ironed out" approach. Still would like to see what the 3950X and 5700 XT 50th Anniversary Edition + 2 months of optimization can do. I could probably hold out on the GPU a bit longer but I'm getting more frequent BSODs (code points to RAM issues -- either actual sticks or IMC starting to degrade  ) so would definitely like to do a platform upgrade in the foreseeable future (Black Friday???).


----------



## criminal

Gunderman456 said:


> So what will you be getting?


It is in my sig.


----------



## Gunderman456

magnek said:


> Uhhhh that's a 3.5-7.3% markup, which honestly is kinda amazing. Literally everything in Canada is more expensive than the exchange rate would indicate, and a single digit markup is something to be celebrated.


I guess. Let's see what happens in a few weeks; we need to wait until they fix the BIOS issues for the CPU boost and the drivers for the cards.

I may go for a 3900X and a 5700 XT or wait and go for a 3950X and a Navi 20. Not sure what to do!!


----------



## Gunderman456

criminal said:


> It is in my sig.


Wow, proactive, I like it!


----------



## Faster_is_better

Going to have to watch these for a while. Will need to get away from these 290s at some point, and one of these may be the ticket. If they are amazing under water will have to get a block too


----------



## criminal

Gunderman456 said:


> Wow, proactive, I like it!




I am excited. Hopefully I get decent results all around.


----------



## 113802

Anyone find any Radeon Image Sharpening reviews?


----------



## NightAntilli

WannaBeOCer said:


> Anyone find any Radeon Image Sharpening reviews?


Not yet. Hardware Unboxed will be doing them though, but we don't know when exactly. There was one review that mentioned it, but it didn't really go in depth. Their criticism was that it's a toggle rather than a slider, but that was it. I think it was Coreteks.


----------



## 113802

NightAntilli said:


> Not yet. Hardware Unboxed will be doing them though, but we don't know when exactly. There was one review that mentioned it, but it didn't really go in depth. Their criticism was that it's a toggle rather than a slider, but that was it. I think it was Coreteks.


Thanks, I don't pay attention to those Youtubers that have a ton of different theories/rumors. I'll wait for reviews.


----------



## Blackops_2

criminal said:


> I am happy with the results and can hope for more as drivers mature. And finally, I have an all AMD rig again after many, many years. Today was a lovely day.


Agreed man. I'll be holding out till Christmas to build again, as I just updated Purple Haze with a 1080 Mini (which, coincidentally, I couldn't find a block for) and Green Envy with a Vega 56, though the 3770K is getting long in the tooth by now. But yeah, a 3800X and big Navi, or whatever AMD has by Christmas, is what I'm going to run for sure in a Micro ATX setup.


----------



## sjwpwpro

NightAntilli said:


> Not yet. Hardware Unboxed will be doing them though, but we don't know when exactly. There was one review that mentioned it, but it didn't really go in depth. Their criticism was that it's a toggle rather than a slider, but that was it. I think it was Coreteks.


It is the Coreteks review, and it does not go into great detail, though it does have side-by-side footage and says that 4K is still better.


----------



## ZealotKi11er

gamervivek said:


> 100% fan speed for >2GHz clocks, often at 2.1GHz, wow.
> 
> https://www.youtube.com/watch?v=rz47WqRDDK4
> 
> Timestamp 11:30


That is amazing. Can't wait for the AIB cards. Moral of the story: don't get the non-XT if you want 2GHz+ clock speeds.


----------



## NightAntilli

I'll definitely consider the Nitro+ version of the 5700XT. If the 5700 from AIBs doesn't have a limiter, I'd get that instead, but I refuse to buy a card that has an artificial limit in place.


----------



## PriestOfSin

Don't know why, but I have an urge to get two 5700XTs for Crossfire. Even though I know crossfire sucks, and isn't worth it in any way, shape, or form.

But the urge is there. Will probably be ignored, but it's there. Regardless, I want to see Sapphire do a decent cooler for this thing. Can't wait to see the water results!


----------



## NightAntilli

PriestOfSin said:


> Don't know why, but I have an urge to get two 5700XTs for Crossfire. Even though I know crossfire sucks, and isn't worth it in any way, shape, or form.
> 
> But the urge is there. Will probably be ignored, but it's there. Regardless, I want to see Sapphire do a decent cooler for this thing. Can't wait to see the water results!


If I'm not mistaken, Navi no longer supports Crossfire.

Statement by AMD:
"Radeon RX 5700 Series GPU's support CrossFire in 'Explicit' multi-GPU mode when running a DX12 or Vulkan game that supports multiple GPU's. The older 'implicit' mode used by legacy DX9/11/OpenGL titles is not supported."


----------



## ElectroManiac

Not enough performance to upgrade my 1070, but I'm happy with the results, I have to say. Good price/performance ratio, and it brings some good competition to Nvidia. I'm amazed that, 3 years after the release of the 1070, I haven't found a card that is a worthwhile upgrade for me in terms of price/performance. GPU tech has definitely slowed down in recent years, sadly.


----------



## EastCoast

If the rumors are true that the Nitro and other AIB cards let the 5700 XT sustain 1900MHz+, this might force Nvidia to react with price cuts.
I'm sure that's realistic with a waterblock, though none have been announced yet. Usually the block makers are pretty quick to get in the fray of new releases to create a buzz for their WB products, so I'm not sure why nothing has been announced yet.


----------



## PriestOfSin

NightAntilli said:


> If I'm not mistaken, Navi no longer supports Crossfire.
> 
> Statement by AMD:
> "Radeon RX 5700 Series GPU's support CrossFire in 'Explicit' multi-GPU mode when running a DX12 or Vulkan game that supports multiple GPU's. The older 'implicit' mode used by legacy DX9/11/OpenGL titles is not supported."


Ah, well that makes sense. Multi-GPU hasn't been worth it since the HD6900 series, IMO. And to think in 2008, I thought multi-GPU was both the shiznit and the future.


----------



## Heuchler

Faster_is_better said:


> Going to have to watch these for a while. Will need to get away from these 290s at some point, and one of these may be the ticket. If they are amazing under water will have to get a block too


Hey Faster_is_better

nothing to see here, move along. Nothing personal, but if you get one, then Ramzinho will want one. Then I have to get one. And the whole vicious cycle of never-ending upgrades will start all over again. Nobody wants that.


----------



## The Robot

RDNA is looking great. The PS5/XB2 could very well match or beat a 2080 Ti/Titan RTX with binned Navi 2; console gaming would finally be exciting again for the first time since the PS3 era. I feel for VII owners, though, if they bought it purely for gaming; it was a bad call for AMD to market it as such, and they should've just used the Frontier branding.


----------



## EastCoast

The Robot said:


> RDNA is looking great. PS5/XB2 could very well match or beat a 2080Ti/Titan RTX with binned Navi2, finally console gaming would be exciting again since PS3 times. I feel for VII owners though if they bought it purely for gaming, it was a bad call for AMD to market it as such, should've just used Frontier branding.


I recall some saying the R7 was a filler card. I'm afraid they were correct. Once Navi 20/21 hits the market, I seriously doubt they will be cheap. It's already rumored to be at 2080 Ti level of performance. Looking at the 5700 XT results today, I tend to believe it for now.

As it stands now, AMD's SKU stack could look like this:
5700 Mid
5800 High End
5900 Ultra High End

Just like before. Now let's think for a minute: if their mid-range can do 1440p comfortably...


----------



## Jarhead




----------



## JackCY

WannaBeOCer said:


> Thanks, I don't pay attention to those Youtubers that have a ton of different theories/rumors. I'll wait for reviews.


You don't have to watch the videos, there is a written review as well: https://www.techspot.com/reviews/graphics-cards/
It just takes a while to get there compared to the video summary.


----------



## D-S-J

The Robot said:


> RDNA is looking great. PS5/XB2 could very well match or beat a 2080Ti/Titan RTX with binned Navi2, finally console gaming would be exciting again since PS3 times. I feel for VII owners though if they bought it purely for gaming, it was a bad call for AMD to market it as such, should've just used Frontier branding.


On what data are you basing this off of? The custom SOC or APU powering the PS5/XB2 is limited by power, cooling, and case form factor. How is it going to beat a desktop flagship GPU from either Nvidia or AMD that's not as restricted with power, cooling, and form factor size?


----------



## magnek

So umm, fair to say that after all this time, Wait for Navi™ has finally somewhat paid off?


----------



## D-S-J

magnek said:


> So umm, fair to say that after all this time, Wait for Navi™ has finally somewhat paid off?


"Wait for Navi" is already being pushed under the rug. "Wait for Big Navi" or "Wait for Arcturus" is the current hype.


----------



## 113802

JackCY said:


> You don't have to watch the videos, there is a written review as well: https://www.techspot.com/reviews/graphics-cards/
> It only takes a while for it to get there compared to video summary.


I do not understand your response. I said I'll be waiting for reviews about Radeon Image Sharpening.


----------



## JackCY

I think all those waiting for Navi already gave up or passed away. It has taken too long. Also keep in mind these are 7nm GPUs competing with a year old 12nm. NV is working on 7nm EUV.


----------



## lightsout

What's this people are talking about, an artificial limit placed on the 5700? Does it power throttle?


Sent from my iPad using Tapatalk


----------



## D-S-J

JackCY said:


> I think all those waiting for Navi already gave up or passed away. It has taken too long. Also keep in mind these are 7nm GPUs competing with a year old 12nm. NV is working on 7nm EUV.


In terms of performance the 5700 XT still loses to the 1080 Ti overall, which was released about 28 months ago on 16nm/14nm. The 5700 XT only beats that 2+ year old flagship on MSRP.


----------



## Gunderman456

EastCoast said:


> I recall some saying that the R7 was a filler card. I'm afraid they are correct. Once the Navi20/21 hits the market I seriously doubt they will be cheap. It's already rumored that it's 2080TI level of performance. Looking at the 5700XT results today I tend to believe it for now.
> 
> As it's stands now AMD sku stack could looks like this:
> 5700 Mid
> 5800 High End
> 5900 Ultra High End
> 
> Just like before. Not lets think for a minute. If their Mid can do 1440p comfortably...


Fixed that for you!

5700 Low
5800 Mid
5900 High

Ultra High? What the frig is that? Justifying high prices, or what? Don't do them any favors, thanks.


----------



## Majentrix

I still don't understand why AMD doesn't let AIB partners sell custom cooling solutions at launch. You'd think they would've learned after the 290X disaster.


----------



## Gunderman456

Majentrix said:


> I still don't understand why AMD don't let AIB partners launch custom cooling solutions on launch. You think they would've learned after the 290X disaster.


They want their full cut first?


----------



## D-S-J

Majentrix said:


> I still don't understand why AMD don't let AIB partners sell custom cooling solutions on launch. You think they would've learned after the 290X disaster.


Basically what Gunderman456 posted. AMD is first and foremost a business. Being first to market as the sole supplier gives a large influx of much-needed cash, hopefully profit. Delaying the AIB partners is just a typical business move. It will also appease shareholders and investors after the so-called planned price reduction that came before the product even released, after an official MSRP had been announced back at E3. If I were a shareholder or investor, I would have been pissed as well.


----------



## keikei

D-S-J said:


> In terms of performance the 5700 XT still loses to the 1080 TI overall, which was released about 27 months ago, on 16nm/14nm. 5700 XT only beats that 2+ year flagship with MSRP.



Nvidia does have the better architecture, I'll admit that, but the comparison is not a fair one. You are comparing a flagship GPU (no longer in production, launched at $700) vs a mid-tier card going for $400. A fairer comparison would be the 2070S vs the 5700 XT. Both cards perform virtually identically, but one costs an additional $100.
https://www.youtube.com/watch?v=fkA-yCrl_A4&t=507s

----------



## doom26464

They're decent cards, though I'm not sure why AMD doesn't wait a few more weeks and polish up the drivers a bit to help with the sore spots in reviews.

The blower fan is bad; AMD needs to stop with that. Nvidia doesn't even use that nonsense anymore. However, once AIB cards come out, it is the clear better purchase to me over the 2060 Super and 2060. If drivers improve and AIB cards can unlock a little more gusto, then even against the 2070 Super it could be a worthy choice (considering price, of course). Overclocking seems a bit of a mess, but maybe in a few weeks that will get figured out, or driver fixes will sort some of it out.

I'm not sure Nvidia will do much at this point, though; they will probably still sell Super cards off mindshare and RTX hype to the masses. It also depends on how fast AMD can get bigger Navi chips to market, as time is of the essence with 7nm Nvidia cards, and Intel, creeping closer.

Also, poor Radeon VII; they should not have even marketed that card for gaming. I hope nobody bought it purely for gaming and at least uses it for workstation applications.


----------



## D-S-J

keikei said:


> Nvidia does have the better architecture. I'll admit that, but the comparison is not a fair one. You are comparing a flagship gpu (no longer in production that launched for $700) vs a midtier card going for $400. A more fair comparison would be the 2070S vs 5700XT. Both cards perform virtually identical, but one costs an additional $100.
> https://www.youtube.com/watch?v=fkA-yCrl_A4&t=507s


I agree it's not an apples-to-apples comparison; that's why I stated the price difference between the two in my post. My post was piggybacking off the other post. It was more about how the 7nm 5700 XT compares to last year's 12nm 2070S, and how it compares to the aging Pascal arch from 2-3 years ago, depending on SKU, built on 16nm/14nm.


----------



## Defoler

keikei said:


> Nvidia does have the better architecture. I'll admit that, but the comparison is not a fair one. You are comparing a flagship gpu (no longer in production that launched for $700) vs a midtier card going for $400. A more fair comparison would be the 2070S vs 5700XT. Both cards perform virtually identical, but one costs an additional $100.


That depends on your gaming suite.
TPU's suite shows a 14% advantage for the 2070 Super over the 5700 XT at 4K, Guru3D is around a 6-7% difference, AnandTech with their small suite shows them a lot closer, etc.
So having a bit higher performance, plus added DXR performance and other techs that don't run as fast on AMD, allows Nvidia to ask $100 more.


----------



## magnek

Defoler said:


> That depends on your gaming suite.
> TPU's suite shows a 14% advantage for the 2070 Super over the 5700 XT at 4K, Guru3D is around a 6-7% difference, AnandTech with their small suite shows them a lot closer, etc.
> So having a bit higher performance, plus added DXR performance and other techs that don't run as fast on AMD, allows Nvidia to ask $100 more.


See this post: https://www.overclock.net/forum/225...n-rx-5700-xt-5700-reviews-3.html#post28030118

UE4 is the culprit. Once you remove the two UE4 games in TPU's suite (Ace Combat 7, DarkSiders 3), 5700XT slots in right between 2070 and 2070 Super, closer to one end in some games, and closer to the other end in other games. Notably, on the Frostbite engine the 5700XT surpasses even a 2080! 

If you play a lot of UE 4 games, then don't even bother with AMD. Conversely if you play a lot of Battlfield games, AMD might be a far better choice.


----------



## huzzug

JackCY said:


> Borked drivers on launch, again. OC near impossible in most reviews especially VRAM.
> 
> Seems rushed again on software side. Price not competitive, again. Custom cards nowhere to be seen on launch, again. AMD doesn't learn from repeated mistakes, again.


They seem to be doing this on purpose. A 2-year-old learns better than a multi-billion-dollar entity. They seem to want Nvidia to grab the better share of the market, as always, by playing the ugly duckling and then later bringing in their makeup team (which was on standby during the opening performance) to apply the proverbial lipstick.

This launch looks good if you're a sweet summer child comparing it to their contemporaries. If you know the history, though, it's just two competitors trying to save each other by keeping the status quo.


----------



## Defoler

magnek said:


> See this post: https://www.overclock.net/forum/225...n-rx-5700-xt-5700-reviews-3.html#post28030118
> 
> UE4 is the culprit. Once you remove the two UE4 games in TPU's suite (Ace Combat 7, DarkSiders 3), 5700XT slots in right between 2070 and 2070 Super, closer to one end in some games, and closer to the other end in other games. Notably, on the Frostbite engine the 5700XT surpasses even a 2080!
> 
> If you play a lot of UE 4 games, then don't even bother with AMD. Conversely if you play a lot of Battlfield games, AMD might be a far better choice.


Well, we can also remove games like Witcher 3; maybe a lot of people don't play them.
How about we remove the AnvilNext 2.0 game engine? That engine seems pretty bad. Or how about IO's Glacier engine? What a crappy engine, right? Or the 4A Engine; Metro games are horrible anyway.
How about we use only game engines that AMD is good at? There we go, best gaming suite available, and we don't have to worry about "game engine problems".

In other words, as long as a game engine exists, games use it, and people play those games, its presence in a review suite should not be removed, so it is pointless to say "yeah, let's just remove X".

You know what, next time we review Nvidia cards, we should remove the Frostbite engine. It is too good on AMD, right? Isn't fair. That will make it better. I'm sure no one will care if we omit AMD-favouring games at all.


----------



## headd

So the 5700 is the first locked GPU ever? This is the most anti-consumer move I have seen in the GPU space. AMD is really selling a locked GPU so you need to buy the more expensive 5700 XT to be able to OC it? WOOOOW.


----------



## Blackops_2

ZealotKi11er said:


> That is amazing. Can't wait for AIBs card. Moral of the story is don't get the non-XT if you want 2GHz+ clock speeds.


While I always planned on the 5700 XT or better (assuming AMD puts out anything else by Christmas), that's honestly the big negative of this launch if the 5700 is indeed locked. It just bothers me, and it could very well start setting an Intel-reminiscent "K series" precedent in the GPU sector. Which is such a reversal for them, considering they've marketed to OCers ever since Bulldozer.


----------



## magnek

Defoler said:


> Well we can also remove games like wither 3, maybe a lot of people don't play them.
> How about we remove AnvilNext 2.0 game engine. That engine seems pretty bad. Or how about IO Engine? What a crappy engine right? Or 4A Engine. Metro games are horrible anyway.
> How about we use only game engines that AMD are good at? There we go, best gaming suit available, and we don't have to worry about "game engine problems".
> 
> In other words, as long as the game engine exist, games use it, and people play those games, its existence in a game suit review should not be removed, so it is pointless to say "yeah let's just remove X".
> 
> You know what, next time we review nvidia games, we should remove frostbite game engine. It is too good for AMD right? Isn't fair. That will make it better. I'm sure no one will care if we omit AMD favouring games at all.


:doh:

Totally missed the point. I'm saying we all expected 5700XT to fit in somewhere between 2070 and 2070 Super, and indeed except for the two UE4 games tested, it does just that. So it's more of a single issue with AMD cards doing very badly with UE4 (and thus skewing overall results), than some luck of the draw with the gaming suite.


----------



## Hwgeek

I knew it: the new RDNA can be OCed easily over 2GHz, and it's only limited to 2150MHz since that's the max the MSI slider allows:
https://www.tomshw.de/2019/07/08/am...los-auf-21-ghz-uebertaktet-wasser-sei-dank/2/


----------



## ILoveHighDPI

I couldn’t care less about AMD vs. Nvidia (RTX has killed any hope I have for buying another green GPU ever again).
As long as Navi 64/Big Navi can give me double the performance of Vega 64 I’m gonna gobble up that thing like a fat kid on ice cream.


----------



## tpi2007

Majentrix said:


> I still don't understand why AMD don't let AIB partners sell custom cooling solutions on launch. You think they would've learned after the 290X disaster.





D-S-J said:


> Basically what Gunderman456 posted. AMD is first and foremost a business. Being the first as well as the supplier will give a large influx of much needed cash, hopefully profit. Delaying the AIB partners is just a typical business move. And this will appease the shareholders and investors after the so called planned price reduction before the product even releases after an official MSRP was released back at E3. If I was a shareholder or investor I would have been pissed as well.



I think that the answer is much simpler than that: as happened with the R9 290X & 290 launch, AMD simply wasn't ready in time, and thus AIBs didn't have the chips long enough to make custom cards for a 7/7 launch.

Here's the thing: if you're going to sell $350-$400+ cards with 180W-225W+ of board power, the target audience that can afford them will have the means to run a well-ventilated case, so AMD's reasoning makes no sense; it's just an excuse.

Now, a blower design on a card in the $200-$300 Polaris-successor area... ahem... RX 690 territory... would have made more sense for the target audience. The less powerful card doesn't even have a backplate...


----------



## dantoddd

ILoveHighDPI said:


> I couldn’t care less about AMD vs. Nvidia (RTX has killed any hope I have for buying another green GPU ever again).
> As long as Navi 64/Big Navi can give me double the performance of Vega 64 I’m gonna gobble up that thing like a fat kid on ice cream.


Well, anyone would buy a card with twice the performance of Vega 64; that would put it about 20-30% above the RTX 2080 Ti. I doubt that will happen, though.


----------



## Shatun-Bear

The 5700XT is seriously impressive looking across benches. A good AIB overclocked is going to be faster than a Radeon VII even, and faster than a stock 1080 Ti and 2070 Super. This is my next card to replace my 1070 Ti.

The 5700XT is managing this performance level with only 40 CUs too. Imagine a 64 CU Navi (rumours suggest 64 compute units might not even be the max with RDNA) clocked at a 2000 MHz core clock (the XT is at 1900 already). If performance scales up decently, we'll be looking at performance between a 2080 and a 2080 Ti, probably landing around the same as the yet-to-be-seen 2080 Super.


----------



## tpi2007

Shatun-Bear said:


> The 5700XT is seriously impressive looking across benches. A good AIB overclocked is going to be faster than a Radeon VII even, and faster than a stock 1080 Ti and 2070 Super. This is my next card to replace my 1070 Ti.
> 
> The 5700XT is managing this performance level with only 40 CUs too. Imagine a 64 CU Navi (rumours suggest 64 compute units might not even be the max with RDNA) clocked at a 2000 MHz core clock (the XT is at 1900 already). If performance scales up decently, we'll be looking at performance between a 2080 and a 2080 Ti, probably landing around the same as the yet-to-be-seen 2080 Super.



The problem is that they need to make further enhancements to the Navi arch and/or wait for 7nm EUV in order to produce something that is meaningfully faster than the RX 5700 XT and the Radeon VII. If the RX 5700 has a 180W board power and the XT has a 225W board power, how much of a power hog would a 64 CU card be at the 5700 XT's clocks? It's not doable right now with their current arch design, and that's why they had to release the VII earlier this year; they likely won't have anything until January of next year, at best. If we go by the RX 480 -> Vega 64 release dates, we may be waiting even longer, closer to the next consoles' release:

RX 480 release date: 29 June 2016 
Vega 64 release date: 7 August 2017


Potentially:


RX 5700 XT release date: 7 July 2019
RX 5800 XT release date: 15 August 2020


----------



## Blze001

These numbers are pretty good, especially for the prices. Nicely done AMD.


----------



## keikei

ILoveHighDPI said:


> I couldn’t care less about AMD vs. Nvidia (RTX has killed any hope I have for buying another green GPU ever again).
> As long as Navi 64/Big Navi can give me double the performance of Vega 64 I’m gonna gobble up that thing like a fat kid on ice cream.


How come? Nvidia managed to boost performance in both RT and non-RT titles with this refresh. We're not quite there yet, probably a few more gens, but we're seeing improvement. Yes, I'd rather not have alpha RT hardware forced onto the cards, but Green has the clout to do so. It's a chicken-and-egg situation. The RTX 2080 S is also on the horizon. Let's see what Nvidia has in store for us.


----------



## tpi2007

TechSpot published their written review of the Navi cards; I've just added it to the OP (also below).

https://www.techspot.com/review/1870-amd-radeon-rx-5700/


----------



## NightAntilli

D-S-J said:


> In terms of performance the 5700 XT still loses to the 1080 TI overall, which was released about 27 months ago, on 16nm/14nm. 5700 XT only beats that 2+ year flagship with MSRP.


It loses to AIB 1080Ti cards but is on par with reference ones. This is the first card that gives you near 1080Ti performance levels at a price of $399. 

Anyone that doesn't think this is good value at this point is crazy. I can't wait to see the Nitro+ version.


----------



## keikei

Incoming blocks from EK: https://www.guru3d.com/news-story/ek-vector-blocks-engineered-for-amd-navi-gpus.html


----------



## Rei86

EDIT:

I hope this stuttering issue he's talking about is just a driver thing.





Go to 4:35 mark.


----------



## fragamemnon

Gunderman456 said:


> R9 3900X does not beat the 9900k in gaming and they are about the same price.
> 
> 5700 is locked down by AMD and cannot be overclocked. Most likely to protect the 5700 XT.
> 
> 5700 XT is overclocked by AMD to the extreme. Right now can't be overclocked as drivers seem broken and there does not seem to be much headway anyway. It is loud and hot.
> 
> The 2060 Super, the 2070, the 5700 XT, the 2070 Super and the Radeon VII are all about the same performance.
> 
> Better luck next time AMD.
> 
> /Thread



https://www.overclock.net/forum/28030022-post84.html


Do we have anything to actually contribute to the thread with? Or should I expect to see this crap in every Ryzen/RX thread?


----------



## spyshagg

Any teardown yet? I want to see if I can use the stock blower cooler together with a GPU waterblock.


----------



## 113802

spyshagg said:


> Any teardown yet? I want to see if I can use the stock blower cooler together with a GPU waterblock.


Gamers Nexus said in his 3900X review that he'll be uploading a teardown. He showed the bare card at the end of the video.


----------



## EastCoast

Super price cuts incoming...

I still believe the rumor about AIB custom coolers allowing the 5700XT @ 1900MHz and higher...


----------



## Blze001

EastCoast said:


> Super price drops incoming...


Doubtful. Price cuts reduce the leather jacket budget.


----------



## D-S-J

NightAntilli said:


> It loses to AIB 1080Ti cards but is on par with reference ones. This is the first card that gives you near 1080Ti performance levels at a price of $399.
> 
> Anyone that doesn't think this is good value at this point is crazy. I can't wait to see the Nitro+ version.


"Value" is subjective. Those who bought a 1080 Ti back in 2017 probably feel like it was a good value, since they've been enjoying that goodness for 2+ years now and who knows for how many additional years to come. Plus they can opt to sell it for additional "value".


----------



## EastCoast

EK waterblocks incoming....
https://www.tomshw.de/2019/07/08/am...emlos-auf-21-ghz-uebertaktet-wasser-sei-dank/




Undervolting results
https://www.hardwareluxx.de/index.p...0-und-radeon-rx-5700-xt-im-test.html?start=25













5700XT @ 2100MHz






This is getting "gooder" by the hour 
It's tied with a 2070 Super at max OC 


Hey Nvidia, that wasn't supposed to happen with the 2070 Super :doh:


----------



## NightAntilli

D-S-J said:


> "Value" is subjective. Those who bought a 1080 Ti back in 2017 probably feel like it was a good value, since they've been enjoying that goodness for 2+ years now and who knows for how many additional years to come. Plus they can opt to sell it for additional "value".


You can choose to interpret it that way. But... It is still a FACT that the 5700 series cards have better performance per dollar. That IS good value in that regard, and if you deny that, you are simply biased. 

It's quite obvious you're deliberately trying to confuse things by conflating the same word used as a personal emotional standard and as a judgment of reality. I guess these cards stepped on some toes.


----------



## AlphaC

spyshagg said:


> Any teardown yet? I want to see if I can use the stock blower cooler together with a GPU waterblock.


 tomshw.de


---


Wait for custom cards IMO


https://videocardz.com/newz/msi-to-launch-seven-custom-radeon-rx-5700-graphics-cards


----------



## maltamonk

Was the blower card and the incoming waterblocks/AIB coolers really a surprise to anyone? I saw a comment about repeating the R9 290/X debacle and thought: oh, a repeat of some of the best cards in recent memory... sweet!


----------



## Hwgeek

EastCoast said:


> EK waterblocks incoming....
> https://www.tomshw.de/2019/07/08/am...emlos-auf-21-ghz-uebertaktet-wasser-sei-dank/
> 
> 
> 
> 
> Undervolting results
> https://www.hardwareluxx.de/index.p...0-und-radeon-rx-5700-xt-im-test.html?start=25
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 5700XT @ 2100MHz
> 
> https://youtu.be/kK3isGg9nDw
> 
> This is getting "gooder" by the hour
> It's tied with a 2070 Super at max OC
> 
> 
> Hey Nvidia, that wasn't supposed to happen with the 2070 Super :doh:


2150 MHz was the max only because that was the most MSI AB could apply. To confirm this isn't the silicon max, you can see that only an extra 20W was needed to get 2150 MHz, so who knows, maybe the max OC will be around 2300~2400? LOL RIP SUPER.


----------



## D-S-J

NightAntilli said:


> You can choose to interpret it that way. But... It is still a FACT that the 5700 series cards have better performance per dollar. That IS good value in that regard, and if you deny that, you are simply biased.
> 
> It's quite obvious you're deliberately trying to confuse things by conflating the same word used as a personal emotional standard and as a judgment of reality. I guess these cards stepped on some toes.


It's not an interpretation, it's a fact of reality that everybody values things differently. AMD put 2 different values on their 5700 series, the original E3 MSRP and the later price dropped MSRP. Potential customers stated their opinions. Some valued a lower price in general, some valued more features for the price compared to the RTX series, some valued a lower price for the amount of power it consumed, some valued a lower price because they felt entitled, the list can go on forever. This is why I stated "value is subjective."

I couldn't care less whether Nvidia or AMD wins; I'm not a shareholder or investor. A corporation is a corporation. I don't buy in this GPU tier, so the 5700 series or the RTX 2070 series doesn't matter to me. I buy from whoever has the most powerful and feature-rich card at the time of purchase regardless of pricing. I don't sidegrade or upgrade unless a component dies; I do full system builds every 4-6 years depending on the tech and needs at the time. If I were to buy today, it would be the RTX 2080 Ti for gaming or a Titan RTX/Quadro RTX 5000 for workstation.


----------



## maltamonk

D-S-J said:


> It's not an interpretation, it's a fact of reality that everybody values things differently. AMD put 2 different values on their 5700 series, the original E3 MSRP and the later price dropped MSRP. Potential customers stated their opinions. Some valued a lower price in general, some valued more features for the price compared to the RTX series, some valued a lower price for the amount of power it consumed, some valued a lower price because they felt entitled, the list can go on forever. This is why I stated "value is subjective."
> 
> I couldn't care less whether Nvidia or AMD wins; I'm not a shareholder or investor. A corporation is a corporation. I don't buy in this GPU tier, so the 5700 series or the RTX 2070 series doesn't matter to me. I buy from whoever has the most powerful and feature-rich card at the time of purchase regardless of pricing. I don't sidegrade or upgrade unless a component dies; I do full system builds every 4-6 years depending on the tech and needs at the time. If I were to buy today, it would be the RTX 2080 Ti for gaming or a Titan RTX/Quadro RTX 5000 for workstation.


If you're not interested in this tier and have a view on value that doesn't really apply here, why are you interjecting? Boredom?


----------



## D-S-J

maltamonk said:


> If you're not interested in this tier and have a view on value that doesn't really apply here, why are you interjecting? Boredom?


Seems I stepped on someone's toes. Just because I personally wouldn't buy in this tier doesn't mean I'm not interested. I'm interested in all tech, especially PCs and mobile. Also I'm interested for info. As one of the few tech nerds of the clan, I want to give the most informed advice to family, friends, clan, acquaintances if asked for recommendations.


----------



## maltamonk

D-S-J said:


> Seems I stepped on someone's toes. Just because I personally wouldn't buy in this tier doesn't mean I'm not interested. I'm interested in all tech, especially PCs and mobile. Also I'm interested for info. As one of the few tech nerds of the clan, I want to give the most informed advice to family, friends, clan, acquaintances if asked for recommendations.


No toes, just curious. Fair enough.


----------



## NightAntilli

D-S-J said:


> It's not an interpretation, it's a fact of reality that everybody values things differently. AMD put 2 different values on their 5700 series, the original E3 MSRP and the later price dropped MSRP. Potential customers stated their opinions. Some valued a lower price in general, some valued more features for the price compared to the RTX series, some valued a lower price for the amount of power it consumed, some valued a lower price because they felt entitled, the list can go on forever. This is why I stated "value is subjective."


Here we go again with the whole 'value' thing... >_< ugh...

People may believe that smoking adds great value to their daily life. We all know how that ultimately turns out. There's subjective value and objective value; not all value is subjective. You might place a higher value on teal cars, but objectively, if you want to resell one, its market value will always remain below the black, white and grey cars of the same make and model, because the majority of people are looking for those colors, not teal, even if you like teal more.
These are gaming cards. On pretty much all the gaming points, the 5700 series is a better deal. We'll see how much better when the AIB cards are released. There really isn't much that the nVidia Super cards bring to the table regarding gaming compared to the 5700 series. They offer the same or slightly more performance for an astronomically higher price. That's it. The rest is noise. 

What argument is there really, to NOT go for these GPUs? Sure, right now, noise and thermals due to the blower cooler. When the AIBs come out, what will it be? Ray Tracing? Anyone that believes ray tracing on anything other than a 2080 Ti is a good feature to have, needs a reality check. 



D-S-J said:


> I could care less whether Nvidia or AMD wins, I'm not a shareholder or investor. A corporation is a corporation. I don't buy in this GPU tier so the 5700 series or the 2070 RTX series doesn't matter to me. I buy from who ever has the most powerful and feature rich card at the time of purchase regardless of pricing. I don't sidegrade or upgrade unless a component dies, I do full system builds every 4-6 years depending on the tech and needs at the time. If I were to buy today, it would be the RTX 2080 TI for gaming or RTX Titan/RTX Quadro 5000 for workstation.


And yet here you are, trying to downplay these cards...


----------



## PontiacGTX

Does this imply that Vega is still better for compute?












EastCoast said:


> EK waterblocks incoming....
> https://www.tomshw.de/2019/07/08/am...emlos-auf-21-ghz-uebertaktet-wasser-sei-dank/
> 
> 
> 
> 
> Undervolting results
> https://www.hardwareluxx.de/index.p...0-und-radeon-rx-5700-xt-im-test.html?start=25
> 
> 
> 
> 
> 
> Spoiler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 5700XT @ 2100MHz
> 
> https://youtu.be/kK3isGg9nDw
> 
> This is getting "gooder" by the hour
> It's tied for a 2070 Super at Max OC
> 
> 
> Hey Nvidia, that wasn't suppose to happen with the 2070 Super :doh:


The only reason the undervolt works is because the 5700 XT is thermally limited while the 5700 isn't. But thanks for the Tom's Hardware link; too bad the author is using a game which favors Nvidia. Did he do that on purpose?
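For what it's worth, the reason an undervolt frees up headroom on a thermally limited card is that dynamic power scales roughly with frequency times voltage squared, so shaving voltage at the same clock cuts heat disproportionately. A rough sketch of that scaling (the baseline clock/voltage figures are illustrative assumptions, not measured 5700 XT values):

```python
# Rough dynamic-power scaling: P is roughly proportional to f * V^2
# (capacitance folded into the constant). The clock/voltage numbers here
# are illustrative assumptions, not measured RX 5700 XT values.
def relative_power(freq_mhz, volts, base_freq=1900.0, base_volts=1.20):
    """Power draw relative to a baseline operating point."""
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

# Same clock, 1.20 V -> 1.10 V undervolt: ~16% less dynamic power,
# which is thermal headroom a temperature-limited card can spend on clocks.
print(round(relative_power(1900, 1.10), 2))  # 0.84
```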



AlphaC said:


> tomshw.de
> 
> 
> ---
> 
> 
> Wait for custom cards IMO
> 
> 
> https://videocardz.com/newz/msi-to-launch-seven-custom-radeon-rx-5700-graphics-cards



Will this be enough for 2.1 GHz+? It looks similar to the design they used on the 580s.


----------



## ILoveHighDPI

dantoddd said:


> Well, anyone would buy a card with twice the performance of Vega 64; that would put it about 20-30% above the RTX 2080 Ti. I doubt that will happen, though.


The 5700 XT is already 20% better than Vega 64 in best-case scenarios; if they can refine the architecture a little more over the next year, then a 2x jump at 64 CUs could be done.
Even if those performance margins only apply at 4K I'd still take it; Vega is especially weak at 4K and that's one of my biggest regrets with the card.


----------



## D-S-J

NightAntilli said:


> Here we go again with the whole 'value' thing... >_< ugh...
> 
> People may be believe that smoking adds great value to their daily life. We all know how that ultimately turns out. There's subjective value and objective value. Not all value is subjective. You might give a higher value to teal cars, but objectively, if you want to re-sell it, its market value will always remain below the black, white and grey cars of the same make and model, because the majority of people are looking for those colors, and not teal, even if you like teal more.
> These are gaming cards. On pretty much all the gaming points, the 5700 series is a better deal. We'll see how much better when the AIB cards are released. There really isn't much that the nVidia Super cards bring to the table regarding gaming compared to the 5700 series. They offer the same or slightly more performance for an astronomically higher price. That's it. The rest is noise.
> 
> What argument is there really, to NOT go for these GPUs? Sure, right now, noise and thermals due to the blower cooler. When the AIBs come out, what will it be? Ray Tracing? Anyone that believes ray tracing on anything other than a 2080 Ti is a good feature to have, needs a reality check.
> 
> 
> And yet here you are, trying to downplay these cards...


Downplay these cards? I never stated they were bad; these cards have issues like all other cards. I wouldn't have any issues recommending them depending on needs and budget. I wouldn't personally buy in this tier because my budget is higher, which affords me the option of buying something better. I made no mention of DXR; even without RT cores or Tensor cores, the 2080 Ti would still be king of gaming today.


----------



## paulerxx

If you can't find the value in these cards, I don't know what to tell you other than take off the blinders.


----------



## doom26464

I'm skeptical Nvidia will touch Super card pricing.

AIB cards are still a month away, so Super cards will still sell well in that time. I want to see more solid OC data for the 5700 XT though; it has me intrigued for when AIB cards hit with beefed-up power delivery and cooling. The whole overclocking issue with Navi leaves me a bit wishy-washy at the moment.


----------



## PontiacGTX

youtu.be/IrKUZoEzAfY?t=385
youtu.be/IrKUZoEzAfY?t=423
youtu.be/IrKUZoEzAfY?t=459


----------



## rdr09

paulerxx said:


> If you can't find the value in these cards, I don't know what to tell you other than take off the blinders.


How much do you think the XT should cost? I bought my R9 290 for $400 almost six years ago and I'm still using it. It's faster than my 1060. This is probably twice as fast. I surely can't afford the 2080 or 2080 Ti.



PontiacGTX said:


> https://youtu.be/IrKUZoEzAfY?t=385
> https://youtu.be/IrKUZoEzAfY?t=423
> https://youtu.be/IrKUZoEzAfY?t=459


How you doing?


----------



## PontiacGTX

rdr09 said:


> How much do you think the XT should cost? I bought my R9 290 for $400 and I'm still using it. It's faster than my 1060.
> 
> 
> 
> How you doing?


I'm fine. And you, are you still at that internship/job?


----------



## rdr09

PontiacGTX said:


> I'm fine. And you, are you still at that internship/job?


Job. Nothing to buy here where I'm at; have to wait till I go back home. You still have your Vega?

I want the XT.


----------



## bigjdubb

NightAntilli said:


> Ray Tracing? Anyone that believes ray tracing on anything other than a 2080 Ti is a good feature to have, needs a reality check.


Ray Tracing is a bad feature on the 2080ti as well.



D-S-J said:


> Downplay these cards? I never stated they were bad; these cards have issues like all other cards. I wouldn't have any issues recommending them depending on needs and budget. I wouldn't personally buy in this tier because my budget is higher, which affords me the option of buying something better. I made no mention of DXR; even without RT cores or Tensor cores, the 2080 Ti would still be king of gaming today.


The 2080ti is king in spite of having RT cores, certainly not because of them. There is no doubt that the 2080ti (and every other GeForce card) would be better without the RT cores.


----------



## PontiacGTX

rdr09 said:


> Job. Nothing to buy here where I'm at; have to wait till I go back home. You still have your Vega?


I need to find a heatsink for it; the noise levels are as bad as the RX 5700 XT's HSF. If it were me, I wouldn't buy a reference AMD card again.


----------



## rdr09

PontiacGTX said:


> I need to find a heatsink for it; the noise levels are as bad as the RX 5700 XT design. If it were me, I wouldn't buy a reference AMD card again.


I can imagine. I watercooled both my 290s. I'm down to one and it too is under water. May have to get a block for the XT.

You play PUBG?


----------



## PontiacGTX

rdr09 said:


> I can imagine. I watercooled both my 290s. I'm down to one and it too is under water. May have to get a block for the XT.
> 
> You play PUBG?


Not really; also, I assume I'd need a better CPU. But I play Rainbow Six Siege/Apex from time to time... too bad they're server-based and your ping wouldn't be good since you're far from the NA servers.


If you're going to get one of these, you can't CrossFire anymore.


----------



## rdr09

PontiacGTX said:


> Not really; also, I assume I'd need a better CPU. But I play Rainbow Six Siege/Apex from time to time... too bad they're server-based and your ping wouldn't be good since you're far from the NA servers.


Might see you playing Apex; then we can team up. Did you see how the 3600 does in Rainbow?

https://imgur.com/ZblOCJF

Not sure if it's real.

I'll pm you about my i7.

No need to crossfire. Im down to 1440.


----------



## AlphaC

PontiacGTX said:


> Does this imply that Vega is still better for compute?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The only reason the undervolt works is because the 5700 XT is thermally limited while the 5700 isn't. But thanks for the Tom's Hardware link; too bad the author is using a game which favors Nvidia. Did he do that on purpose?
> 
> 
> 
> 
> will this be enough for 2.1ghz+? looks like a similar design they used in 580s


These are neutered in terms of compute, probably due to the lower shader count. They might be able to get more done per core due to higher clocks and a greater ROP-to-core ratio, but a large part of that is also filling wavefronts in blocks of 32 instead of 64. This means that when wavefronts aren't fully filled, you can get the equivalent of Vega 64 +20% (roughly Radeon VII).
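To illustrate the wavefront-fill point, here's a quick sketch (illustrative work counts only, not from any real shader) of how the same amount of work fills wave64 vs wave32 lanes:

```python
import math

# Illustrative sketch only -- not real shader code. Shows why dispatching in
# wave32 (RDNA) instead of wave64 (GCN) can waste fewer SIMD lanes when a
# workload doesn't divide evenly into wavefronts.
def wave_utilization(work_items, wave_size):
    """Fraction of SIMD lanes doing useful work for one dispatch."""
    waves = math.ceil(work_items / wave_size)
    return work_items / (waves * wave_size)

# 96 work items: two wave64s occupy 128 lanes (75% used),
# while three wave32s occupy exactly 96 lanes (100% used).
print(wave_utilization(96, 64))  # 0.75
print(wave_utilization(96, 32))  # 1.0
print(wave_utilization(65, 64))  # ~0.51 -- 65 items straddle two wave64s
print(wave_utilization(65, 32))  # ~0.68 -- wave32 wastes fewer idle lanes
```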





https://www.extremetech.com/gaming/293107-meet-rdna-amds-long-awaited-new-gpu-architecture



Optimum Tech was hitting a 2100 MHz target clock on the blower RX 5700 XT.


The MSI cooler looks to be using 3 heatpipes to augment the cooler base, which is basically more than adequate if they're 8mm diameter. 6mm = 40W or so, 8mm = 60W or so. A cooler for a 160W RTX 2060 tends to have 2 heatpipes at a minimum; the RTX 2060 Super uses 3 larger heatpipes, probably 10mm flattened.


STRIX RTX 2060 = 6x6mm , overkill unless over 200W

"Dual" RTX 2060 = 4x6mm

RTX 2060 Gaming X = 4x 6mm
RTX 2060 Ventus = 4x6mm direct touch
EVGA RTX 2060 xc ultra = 3 heatpipes of larger diameter (8mm-10mm)

Aorus Xtreme RTX 2060 = full copper base 5 heatpipe (might be larger diameter) 

Gigabyte RTX 2060 OC pro = 4x6mm direct touch

Leadtek RTX 2060 = 4x6mm

Zotac RTX 2060 Twin fan / amp = 3x6mm
Galax RTX 2060 = 2 heatpipes


If you look at CPU coolers, 3x8mm heatpipes tend to be on par with 4x6mm: roughly 160W TDP with ~1300 RPM fan speeds and about 180-200W with 2000 RPM.
To deal with 220-250W you are looking at 5x6mm or 4x8mm.
Unlike CPUs, gaming GPUs generally aren't at 100% load all the time, whereas folding/BOINC users will need to opt for better-cooled cards.
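Adding up the rough per-pipe figures above (6mm ~ 40W, 8mm ~ 60W) gives a quick budget check; note these are strictly ballpark numbers that vary with fan speed, fin area, and orientation, which is why the simple sum overshoots the "on par" pairings a bit:

```python
# Quick heatpipe budget using the rough per-pipe figures quoted above
# (6 mm ~ 40 W, 8 mm ~ 60 W). Real capacity varies with fan speed,
# fin area, and orientation, so treat these strictly as ballpark numbers.
PIPE_WATTS = {6: 40, 8: 60}

def cooler_budget(pipe_diameters_mm):
    """Sum the rule-of-thumb wattage for a list of heatpipe diameters."""
    return sum(PIPE_WATTS[d] for d in pipe_diameters_mm)

print(cooler_budget([6, 6, 6, 6]))     # 4x6mm -> 160 W
print(cooler_budget([8, 8, 8]))        # 3x8mm -> 180 W
print(cooler_budget([6, 6, 6, 6, 6]))  # 5x6mm -> 200 W
```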


EDIT: I think everyone should wait for Sapphire Nitro.


For Vega 56 they had 3x 8mm + 3x6mm which can dissipate 300W into the finstack. For RX Vega 56 Pulse it was a tamer 4x 8mm heatpipe solution with a 6mm heatpipe VRM heatsink and 2x ball bearing fans <1650 RPM.


----------



## JackCY

Wait for custom cards as always. That will decide just how good performance actually is on these and also what prices will be of cards that people actually buy.

The hardware isn't bad; as I've said many times, there are hardly any bad products, but there are plenty of badly priced ones, be it in general or vs. the competition = poor overall value.
If all one cares about is gaming and gaming only, then sure, these aren't as bad. But there is more to GPUs than gaming. In fact, none of today's GPUs are made primarily for gaming anyway; those times passed long ago. It's all about server compute and re-purposing the chips for everything else.

$400 + tax etc. isn't great, and custom cards are not gonna be $400 either. It's likely going to turn into a wash once custom Navi cards are compared vs custom Turing cards in terms of perf/price, while overall value is likely to stay with Turing, as there is no way around it if you can use those extra features.

There is no doubt that the reference blower is pathetic; it can have all the vapor chamber it wants when the finstack is tiny and most of the fan's airflow is wasted.


----------



## EastCoast

PontiacGTX said:


> The only reason the undervolt works is because the 5700 XT is thermally limited while the 5700 isn't. But thanks for the Tom's Hardware link; too bad the author is using a game which favors Nvidia. Did he do that on purpose?


Absolutely, yes.
If a 5700 XT has better cooling and thus overclocks higher, it competes very well with a 2070S in a game that favors Nvidia GPUs.
This is a pure win for RTG.


----------



## PostalTwinkie

Too hot. Too loud. Too late. Maybe the AIBs can make something of this. It is hard to justify the price/perf ratio when the card is loud and overly hot.


----------



## EastCoast

PostalTwinkie said:


> Too hot. Too loud. Too late. Maybe the AIBs can make something of this. It is hard to justify the price/perf ratio when the card is loud and overly hot.


You know, I like hostess twinkies too. Brings back fond childhood memories. But I digress.


----------



## Damage Inc

magnek said:


> So umm, fair to say that after all this time, Wait for Navi™ has finally somewhat paid off?


The wait continues. The wait for the glorious Navi 3 that might finally match the 2080. Speaking of 2080, I might replace mine with the 2080Ti Super once it comes out.


----------



## maltamonk

https://videocardz.com/newz/msi-to-launch-seven-custom-radeon-rx-5700-graphics-cards

MSI is bringing 4 XT models and 3 5700 models.

There will be lots of AIB cards to choose from if MSI is any indication of the rest of the board partners.


----------



## NightAntilli

PostalTwinkie said:


> Too hot. Too loud. Too late. Maybe the AIBs can make something of this. It is hard to justify the price/perf ratio when the card is loud and overly hot.


Blower cards... So hot and loud, I can agree with. Nothing AIB cards can't fix.
Too late...? Why? It's the first card that gives you 1080 Ti class performance at $399. Why is it too late?


----------



## bigjdubb

Damage Inc said:


> The wait continues. The wait for the glorious Navi 3 that might finally match the 2080. Speaking of 2080, I might replace mine with the 2080Ti Super once it comes out.


Is there going to be a 2080ti Super?


----------



## keikei

bigjdubb said:


> Is there going to be a 2080ti Super?


I think he means the 2080 S. We'll find out in 2 weeks. Nvidia wants to let the waters settle before dropping a bomb.


----------



## doom26464

OK, overclocking is meh, but I'm suspecting the blower cooler is not helping. What I'm seeing is about 6-7% overclocking headroom @ 2100ish.

AIBs might be able to push that up to 2150 MHz or a bit beyond. So let's take a stab and say at best 9-10% OC headroom.

Decent for the price if the AIBs don't go too over the top on price. The 2070S could be an interesting matchup against this. Also need to factor in the 2070S's OC headroom.


I need more data, I hate waiting 😞


----------



## magnek

Damage Inc said:


> The wait continues. The wait for the glorious Navi 3 that might finally match the 2080. Speaking of 2080, I might replace mine with the 2080Ti Super once it comes out.


The 2080 Ti Super is already out, it's called the Titan RTX.


----------



## ilmazzo

doom26464 said:


> OK, overclocking is meh, but I'm suspecting the blower cooler is not helping. What I'm seeing is about 6-7% overclocking headroom @ 2100ish.
> 
> AIBs might be able to push that up to 2150 MHz or a bit beyond. So let's take a stab and say at best 9-10% OC headroom.
> 
> Decent for the price if the AIBs don't go too over the top on price. The 2070S could be an interesting matchup against this. Also need to factor in the 2070S's OC headroom.
> 
> 
> I need more data, I hate waiting 😞


Since the guaranteed gaming frequency is ~1750, 2100 would be a 20% OC.
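Quick sanity check on the percentages, using the 5700 XT's ~1755 MHz game clock and ~1905 MHz boost clock as the two possible baselines:

```python
# OC headroom depends on the baseline you pick. The 5700 XT's advertised
# game clock is ~1755 MHz and its boost clock ~1905 MHz, which is why the
# same 2100 MHz result reads as either ~20% or ~10% headroom.
def oc_headroom_pct(oc_mhz, base_mhz):
    return (oc_mhz / base_mhz - 1.0) * 100.0

print(round(oc_headroom_pct(2100, 1755)))  # 20 -- vs game clock
print(round(oc_headroom_pct(2100, 1905)))  # 10 -- vs boost clock
```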


----------



## doom26464

ilmazzo said:


> doom26464 said:
> 
> 
> 
> OK, overclocking is meh, but I'm suspecting the blower cooler is not helping. What I'm seeing is about 6-7% overclocking headroom @ 2100ish.
> 
> AIBs might be able to push that up to 2150 MHz or a bit beyond. So let's take a stab and say at best 9-10% OC headroom.
> 
> Decent for the price if the AIBs don't go too over the top on price. The 2070S could be an interesting matchup against this. Also need to factor in the 2070S's OC headroom.
> 
> 
> I need more data, I hate waiting 😞
> 
> 
> 
> Since the guaranteed gaming frequency is ~1750, 2100 would be a 20% OC.

You're right, my math is off. Well, I'm going off the Optimum Tech review numbers, so something's not right here...

Need to do more digging.

Edit: the more digging I do, the more I find that OC is pretty much broken at this point. Not sure how or what Optimum Tech was able to do when everyone else is pretty much at the "OC is broken" point... hmm


----------



## bigjdubb

Yeah, the software side of overclocking is a stumbling block for AMD right now. I gave up messing with it on my RVII and it looks like the 5000 series cards are going through the same teething process.


----------



## keikei

doom26464 said:


> OK, overclocking is meh, but I'm suspecting the blower cooler is not helping. What I'm seeing is about 6-7% overclocking headroom @ 2100ish.
> 
> AIBs might be able to push that up to 2150 MHz or a bit beyond. So let's take a stab and say at best 9-10% OC headroom.
> 
> Decent for the price if the AIBs don't go too over the top on price. The 2070S could be an interesting matchup against this. Also need to factor in the 2070S's OC headroom.
> 
> 
> I need more data, I hate waiting 😞



You gonna put a block on dat badboi?!


----------



## tpi2007

keikei said:


> You gonna put a block on dat badboi?!
> 
> 
> https://www.youtube.com/watch?v=BpctR5TFJjk



So they put on a metal backplate, but then forgot to put thermal pads on it.


----------



## SoloCamo

Bleh all this too hot and too loud. I've been on a reference Sapphire 290x and reference Sapphire Vega 64. Just use headphones and pump the fan speed - aka be a man.

In all seriousness, I wish they'd step away from blower cards, but they do have their purpose.

That said, at the stock 'balanced' settings my reference V64 rarely creeps above 1500 MHz core / 945 mem, and that's in a VERY well ventilated case. With a UV and PL increase it rarely dips below 1625 MHz core and 1100 MHz mem, putting me between a 5700 and a 5700 XT.

Not going to lie, the 5700xt performs better than expected compared to my oc'ed V64.


----------



## Leopardi

SoloCamo said:


> Bleh all this too hot and too loud. I've been on a reference Sapphire 290x and reference Sapphire Vega 64. Just use headphones and pump the fan speed - aka be a man.
> 
> In all seriousness, I wish they'd step away from blower cards, but they do have their purpose.
> 
> That said, at the stock 'balanced' settings my reference V64 rarely creeps above 1500 MHz core / 945 MHz mem, and that's in a VERY well ventilated case. With an undervolt and power-limit increase it rarely dips below 1625 MHz core and 1100 MHz mem, putting me between a 5700 and a 5700 XT.
> 
> Not going to lie, the 5700xt performs better than expected.


I'm not against blowers. The MSI GTX 980 OC I had would stay inaudible with an 80°C maximum load target, while helping extract that heat out of my silent Fractal Define case.


----------



## 113802

SoloCamo said:


> Bleh all this too hot and too loud. I've been on a reference Sapphire 290x and reference Sapphire Vega 64. Just use headphones and pump the fan speed - aka be a man.
> 
> In all seriousness, *I wish they'd step away from blower cards, but they do have their purpose.*
> 
> That said, at the stock 'balanced' settings my reference V64 rarely creeps above 1500 MHz core / 945 MHz mem, and that's in a VERY well ventilated case. With an undervolt and power-limit increase it rarely dips below 1625 MHz core and 1100 MHz mem, putting me between a 5700 and a 5700 XT.
> 
> Not going to lie, the 5700xt performs better than expected compared to my oc'ed V64.


AMD already explained the reason, and it's accurate. They understand that we nerds (enthusiasts) know how to cool a GPU with axial fans while normal users do not. They'd rather ship a heatsink design that can be installed in any system than have the majority of users complain, giving them an excuse to buy nVidia cards.


----------



## Kaltenbrunner

If I could have waited, I would have gone AMD again; instead I have an RTX 2070, and that's that.


----------



## JackCY

magnek said:


> The 2080 Ti Super is already out, it's called the Titan RTX.


It's called the Quadro RTX 8000. The Titan RTX has features cut down in comparison.



WannaBeOCer said:


> AMD already explained the reason, and it's accurate. They understand that we nerds (enthusiasts) know how to cool a GPU with axial fans while normal users do not. They'd rather ship a heatsink design that can be installed in any system than have the majority of users complain, giving them an excuse to buy nVidia cards.


A dual-axial cooler is better than a blower even in cheap pre-built, poorly cooled systems; that's the problem. They're using a blower because they want to sell these to data centers, which is what the cards are designed for anyway, but even there a blower isn't ideal. They can't be bothered to develop a proper cooler for any GPU, except when they know they can't get away with it, and then they outsource it to Cooler Master etc. to do water cooling for them.


----------



## Blackops_2

Kaltenbrunner said:


> If I could have waited, I would have gone AMD again; instead I have an RTX 2070, and that's that.


I wish I'd held off on the 1080 Mini. I could have a 5700 and a waterblock for a $15 difference.


----------



## 113802

JackCY said:


> It's called the Quadro RTX 8000. The Titan RTX has features cut down in comparison.


Same GPU die, nothing is disabled. 



JackCY said:


> A dual-axial cooler is better than a blower even in cheap pre-built, poorly cooled systems; that's the problem. They're using a blower because they want to sell these to data centers, which is what the cards are designed for anyway, but even there a blower isn't ideal. They can't be bothered to develop a proper cooler for any GPU, except when they know they can't get away with it, and then they outsource it to Cooler Master etc. to do water cooling for them.


nVidia and AMD both outsource the manufacturing of their heatsinks to Cooler Master; the question is who designs them. Aside from cloud gaming, Navi won't be used much in the data center. Most of them are using the Radeon Pro V340 for cloud gaming or Vega for ML.


----------



## NightAntilli

Interesting video... Is nVidia really worried...?


----------



## 113802

NightAntilli said:


> Interesting video... Is nVidia really worried...?


Not interesting, just another one of those conspiracy-theory youtubers. 

No, they aren't. Their cards are selling to researchers, freelancers and gamers. AMD's 7nm card is slower and uses more power in gaming than nVidia's 12nm card.


----------



## JackCY

WannaBeOCer said:


> Same GPU die, nothing is disabled.
> 
> 
> 
> nVidia and AMD both out source manufacturing to Cooler Master of their heatsinks. The question is who designs them. Aside from cloud gaming Navi won't be used much in the data center. Most of them are using Radeon Pro V340 for cloud gaming or Vega for ML.


Well: 24GB less VRAM, no 10-bit OpenGL for Photoshop etc. in the driver, maximum concurrent NVENC sessions down from unlimited to only 2; you can search for the rest. The Titan RTX, same as any other Titan, is still a cut-down part.

Same dies as in the server GPUs.


----------



## 113802

JackCY said:


> Well: 24GB less VRAM, no 10-bit OpenGL for Photoshop etc. in the driver, maximum concurrent NVENC sessions down from unlimited to only 2; you can search for the rest. The Titan RTX, same as any other Titan, is still a cut-down part.
> 
> Same dies as in the server GPUs.


Completely different markets. The question was where the RTX 2080 Ti Super is; the response was another GeForce card, the Titan RTX. Quadros are for professionals who require ECC, which isn't on GeForce cards. 

The other things you listed are software-locked to sell professional cards, just like it's been for years. At least we now have Studio drivers that bring the performance of Quadro drivers without the certifications.


----------



## AmericanLoco

SoloCamo said:


> Bleh all this too hot and too loud. I've been on a reference Sapphire 290x and reference Sapphire Vega 64. Just use headphones and pump the fan speed - aka be a man.
> 
> In all seriousness, I wish they'd step away from blower cards, but they do have their purpose.


It's not like modern blower cards are even "loud". Modern PC gamers like to whine about just about everything. My computer makes some mild whooshing fan sounds if I fire up FurMark and Prime95, and any kind of game audio instantly drowns out the fan noise. 

Back in the day we used to purposely put 5,000 RPM "Delta Screamers" and "Vantec Tornados" in our rigs.


----------



## magnek

AmericanLoco said:


> It's not like modern blower cards are even "loud". Modern PC gamers like to whine about just about everything. My computer makes some mild whooshing fan sounds if I fire up FurMark and Prime95, and any kind of game audio instantly drowns out the fan noise.
> 
> Back in the day we used to purposely put *5,000 RPM "Delta Screamers"* and "Vantec Tornados" in our rigs.







What a classic. Really does make the fans (and gamers) of today look like sissies.


----------



## Blackops_2

Benches at 2110 on the core


----------



## doom26464

NightAntilli said:


> Interesting video... Is nVidia really worried...?


Pure clickbait nonsense. I hope people don't support a channel like that. 


Is Nvidia aware of Navi? Yes

Are they scared? Absolutely not


----------



## magnek

Dude can't even pronounce Navi correctly...


----------



## looniam

AmericanLoco said:


> It's not like modern blower cards are even "loud". Modern PC gamers like to whine about just about everything. My computer makes some mild whooshing fan sounds if I fire up FurMark and Prime95, and any kind of game audio instantly drowns out the fan noise.
> 
> Back in the day we used to purposely put 5,000 RPM "Delta Screamers" and "Vantec Tornados" in our rigs.


my last blower was gtx570. and/but once you go :Snorkle: .


----------



## 113802

LinusTechTips' review: their RX 5700 XT overheats to the point that it causes the system to shut down. They need to stop locking the fan to a specific acoustic level and let it ramp up. 

https://youtu.be/3bmQPx9EJLA


----------



## steelbom

magnek said:


> Dude can't even pronounce Navi correctly...


That's not how it's pronounced??? How do you say it then?


----------



## ilmazzo

WannaBeOCer said:


> LinusTechTips' review: their RX 5700 XT overheats to the point that it causes the system to shut down. They need to stop locking the fan to a specific acoustic level and let it ramp up.
> 
> https://youtu.be/3bmQPx9EJLA


don't worry

this is not a new GTX 480


----------



## magnek

steelbom said:


> That's not how it's pronounced??? How do you say it then?


Well considering Navi is short for navigation/navigator, neh-vee. Or some like to pronounce it nay-vee as in navy. But definitely not nuh-vee.


----------



## steelbom

magnek said:


> Well considering Navi is short for navigation/navigator, neh-vee. Or some like to pronounce it nay-vee as in navy. But definitely not nuh-vee.


Ah, interesting. Didn't know that. I pronounced it nuh-vee, like the blue things from Avatar hahahah


----------



## Imouto

magnek said:


> steelbom said:
> 
> 
> 
> That's not how it's pronounced??? How do you say it then?
> 
> 
> 
> Well considering Navi is short for navigation/navigator, neh-vee. Or some like to pronounce it nay-vee as in navy. But definitely not nuh-vee.
Click to expand...

Navi comes from Ivan spelled backwards; it's an informal name for the star Gamma Cassiopeiae.

Pretty much like Vega, which is the name of another star.


----------



## huzzug

magnek said:


> Well considering Navi is short for navigation/navigator, neh-vee. Or some like to pronounce it nay-vee as in navy. But definitely not nuh-vee.


Umm.... Nah!!! V


----------



## ibb27

So... undervolting the RX 5700 brings better performance at lower wattage. Why am I not surprised? 

https://translate.google.com/transl...0-und-radeon-rx-5700-xt-im-test.html?start=25


----------



## Malinkadink

ibb27 said:


> So... undervolting the RX 5700 brings better performance at lower wattage. Why am I not surprised?
> 
> https://translate.google.com/transl...0-und-radeon-rx-5700-xt-im-test.html?start=25


AMD keeping it interesting


----------



## ilmazzo

ibb27 said:


> So... undervolting the RX 5700 brings better performance at lower wattage. Why am I not surprised?
> 
> https://translate.google.com/transl...0-und-radeon-rx-5700-xt-im-test.html?start=25


shocked, I'm totally shocked!!!11!


----------



## momonz

I would rather listen to a few YouTube analysts like AdoredTV, Coretek, Good Ol' Gamer and Moore's Law Is Dead with their "OPINIONS" than believe Intel, AMD or NVIDIA's false marketing. Take NVIDIA's ray tracing, for example: in practice it is hardly an advantage over AMD at all, unlike AMD's DX12 advantage when Polaris launched.


----------



## tpi2007

WannaBeOCer said:


> LinusTechTips' review: their RX 5700 XT overheats to the point that it causes the system to shut down. They need to stop locking the fan to a specific acoustic level and let it ramp up.
> 
> https://youtu.be/3bmQPx9EJLA




Added to the OP.


----------



## ilmazzo

Let me guess without even watching the video:

wait for custom models?


----------



## NightAntilli

doom26464 said:


> Pure click bait nonsense. I Hope people don't support a channel like that.
> 
> 
> Is Nvidia aware of Navi? Yes
> 
> Are they scared? Absolutely not


I don't know if they are or not, and I can't know. However... if it is true that they have been trying to intimidate reviewers into skewing their opinions, then it doesn't inspire confidence in nVidia. Note that The Good Old Gamer and Not An Apple Fan also mentioned this in their respective videos. I doubt they're simply making it up; why would they? It wouldn't be the first time nVidia has resorted to shady tactics either. Remember GPP? Companies generally take a longer-term view of things, unlike gamers, who seem to have the attention span of someone with ADHD and the memory of Dory from Finding Nemo. You call it pure nonsense, but apparently someone had to come forward and notify him of this, since he mentions there is at least one channel that experienced such a situation.

And considering AMD has practically every other gaming market under its belt right now (consoles, Stadia, Samsung phones), and is competing with a smaller chip, it is possible that nVidia is worried. To dismiss that possibility simply because they have the advantage at the top end right now is short-sighted.

Another one;


----------



## Hwgeek




----------



## Blze001

Hopefully Navi can put pressure on Nvidia; right now they have no motivation to really push the envelope on the high end or to keep prices reasonable.


----------



## NightAntilli

Another interesting thing... the 5700 XT gains 5 fps going from a 9900K to a 3900X, while the 2070 loses 10 fps. Note that the majority of reviewers tested with a 9900K.


----------



## diggiddi

NightAntilli said:


> Another interesting thing... 5700XT gains 5fps going from 9900k to 3900x, while 2070 loses 10fps. Note that the majority of reviewers tested with a 9900K.


Sauce?


----------



## ILoveHighDPI

If someone can demonstrate a similar uptick in performance with Vega64 and 3800X I’d probably ditch my 6600K.


----------



## maltamonk

NightAntilli said:


> Another interesting thing... 5700XT gains 5fps going from 9900k to 3900x, while 2070 loses 10fps. Note that the majority of reviewers tested with a 9900K.


I'll wager that that will be the next youtube topic to make the rounds. Once one of them does it they all follow suit.


----------



## lombardsoup

Can't believe people are complaining about blowers, when aftermarket hits in a mere week. Less entitlement, more patience!


----------



## Imouto

NightAntilli said:


> Another interesting thing... 5700XT gains 5fps going from 9900k to 3900x, while 2070 loses 10fps. Note that the majority of reviewers tested with a 9900K.


Secret handshake?


----------



## NightAntilli

maltamonk said:


> I'll wager that that will be the next youtube topic to make the rounds. Once one of them does it they all follow suit.


I hope so. Right now it seems to come from some random Reddit user, which makes me wonder how true it actually is.


----------



## ToTheSun!

Imouto said:


> Secret handshake?


It does seem like a lot of weird coincidences (the Ryzen boost-clock bug on nVidia cards, and now this).

That is, assuming this is true.


----------



## maltamonk

Interestingly enough, some of the AIB Super card reviews have been put up: https://videocardz.com/81446/nvidia-geforce-rtx-2070-2060-super-custom-review-roundup 

Imo they're not looking good. Hopefully the AIBs for the 5700/XT do a better job (well, it's not really fair to the AIBs, as Nvidia locked the cards down).


----------



## magnek

ToTheSun! said:


> It does seem like a lot of weird coincidences (Ryzen boost clocks nVidia bug and now this).
> 
> That is, assuming this is true.


pffft it's all very simple you see, the infamous Gimpworks™ is out in full force again. Except this time it's at the driver level.


----------



## NightAntilli

Info on the decoder and contrast adaptive sharpening. (sorry if already posted)







tl;dw:
- Some issues with VCE/AMF
- The HEVC encoder is faster than the 2080 Ti's
- Upscaling from 1440p to 4K + CAS (contrast adaptive sharpening) = almost native 4K visuals; they specifically mention it looks better than nVidia's scaling equivalent
- Anti-lag works


----------



## AlphaC

Imouto said:


> Secret handshake?


More like nVidia GPUs lean on the CPU, and the nVidia driver defaults to multithreading (so once threads spill across CCXs, the R9 3900X takes a performance hit because it spans two CCDs). The effect wouldn't be as pronounced on an R7 3700X / R5 3600X, I think.


see https://3dnews.ru/990367/obzor-amd-ryzen-9-3900x





maltamonk said:


> Interesting enough some of the aib super card reviews have been put up. https://videocardz.com/81446/nvidia-geforce-rtx-2070-2060-super-custom-review-roundup
> 
> Imo they are not looking good, hopefully the aibs for the 5700/xt do a better job (well not really fair to aib as Nvidia locked the cards down).


You'd get an AIB RTX card for fan-stop and sub-1500 RPM fan speeds at load (mostly due to more fin area and larger fans) and possibly a higher power limit, not because they're magically better performing.


https://www.techpowerup.com/review/msi-geforce-rtx-2070-super-gaming-x-trio/31.html , https://www.kitguru.net/components/...i-rtx-2070-super-gaming-x-trio-8gb-review/20/

1120RPM at load



https://www.techpowerup.com/review/asus-geforce-rtx-2070-super-strix-oc/31.html

Normal BIOS 1850RPM or so

Quiet BIOS 1300RPM or so at load


https://lab501.ro/placi-video/review-gigabyte-geforce-rtx-2070-super-gaming-oc-8g/19


https://www.techpowerup.com/review/palit-geforce-rtx-2060-super-jetstream/33.html
1500RPM at load


I didn't look at RTX 2060 Super reviews since it's a pointless card when you can get RTX 2070 for more or less the same price.


----------



## Heuchler

Radeon RX 5700 with R9 290/390 AIB cooler
https://youtu.be/D3AoenEBM2E?t=266


----------



## KyadCK

magnek said:


> Well considering Navi is short for navigation/navigator, neh-vee. Or some like to pronounce it nay-vee as in navy. But definitely not nuh-vee.


Uh... As Imouto said, it's named after the Star.

https://en.wikipedia.org/wiki/Gamma_Cassiopeiae


> It sometimes goes by the informal name Navi.


You know, like, Vega? Polaris? Arcturus?

I choose to pronounce it as "Nah-Vee", as in that annoying thing from The Legend of Zelda. As for the video, how do you even get a "nuh" sound from only an a?



huzzug said:


> Umm.... Nah!!! V


Yea, like that.


----------



## Heuchler

RX 5700 XT overclocked with an EKWB Vector full-cover waterblock:

> For overclocking, I used two different power limits: the factory default, and one increased by the maximum of 50%. The effect on the clock is more obvious than on the average FPS. We see that the blue curve (+50% PL) is much more stable, and even peaks beyond 2.1 GHz are possible.

Igor Wallossek on Tom's Hardware DE
https://www.tomshw.de/2019/07/08/am...emlos-auf-21-ghz-uebertaktet-wasser-sei-dank/


----------



## keikei

https://www.techpowerup.com/257212/custom-radeon-rx-5700-series-only-by-mid-august-amd


> Herkelman stated that custom-design graphics cards based on the Radeon RX 5700 XT and RX 5700 will start hitting the shelves only by *mid-August*.


----------



## paulerxx

What's everyone's opinion on this?


----------



## 113802

paulerxx said:


> What's everyone's opinions on this?


RTX 2070 Super cards are sold out on Newegg; nVidia has 8 cards faster than the Radeon VII, and they're on 12nm. What is nVidia scared of?


----------



## ToTheSun!

paulerxx said:


> What's everyone's opinions on this?


My opinion is that it sufficed the first 2 times it was posted on this very thread.


----------



## EastCoast

paulerxx said:


> What's everyone's opinions on this?
> 
> https://www.youtube.com/watch?v=nMZNCxmFURA


The guy is probably telling you the truth. NDA 2.0 is in full effect. No GPP now, so they need some other flavor of shady. 
I wouldn't be surprised if they reinstate their "focus group". 

I'm sure Nvidia knows what Big Navi is capable of, and it's rumored to be faster than the 2080 Ti.


----------



## Kpjoslee

EastCoast said:


> The guy is probably telling you the truth. NDA 2.0 is in full effect. No GPP now, so they need some other flavor of shady.
> I wouldn't be surprised if they reinstate their "focus group".
> 
> I'm sure Nvidia knows what Big Navi is capable of, and it's rumored to be faster than the 2080 Ti.


I'm not sure they have anything to worry about when the competition needed 7nm to catch up to their 12nm products.


----------



## NightAntilli

Kpjoslee said:


> I'm not sure they have anything to worry about when the competition needed 7nm to catch up to their 12nm products.


I don't see why it's constantly attributed to 7nm. The Radeon VII was also 7nm and look how that turned out.


----------



## doom26464

paulerxx said:


> What's everyone's opinions on this?


Its been posted already twice. Its pure nonsense. Clickbait trash.


----------



## Kpjoslee

NightAntilli said:


> I don't see why it's constantly attributed to 7nm. The Radeon VII was also 7nm and look how that turned out.


Imagine Navi on 12nm. It would probably be trading blows with the non-Super 2060.


----------



## EastCoast

Kpjoslee said:


> Imagine Navi on 12nm. It would probably be trading blows with the non-Super 2060.


:doh:

You didn't understand his post at all.


----------



## bigjdubb

I have a hard time imagining that Nvidia is concerned about Navi right now. AMD has come out with comparable hardware at comparable prices; I think history has shown us that they need a better product at a better price to make a real difference to Nvidia's bottom line.


----------



## ToTheSun!

bigjdubb said:


> I have a hard time imagining that Nvidia is concerned about Navi right now. AMD has come out with comparable hardware at comparable prices; I think history has shown us that they need a better product at a better price to make a real difference to Nvidia's bottom line.


People forget very quickly how many times Jensen turned out to be the man with the plan. He isn't just some promising new CEO for nVidia; he has withstood it all since his company's inception.

But, of course, rumors and conjecture are always enough to get "some" people going.

If my previous card had died, I know very well that I'd be sitting on a 3900X + 5700 XT build right now. AMD is currently too strong on CPU performance and GPU price/performance. But this window of opportunity for the red team, *historically*, is a small one.


----------



## 113802

ilmazzo said:


> dont' worry
> 
> this is not a new GTX480


It's definitely not a new GTX 480; that was Vega 10. This is a fan-profile issue that needs to be addressed ASAP. There's no reason a fan should lock itself at 2100 RPM to the point that the card overheats and shuts down.


----------



## Newbie2009

Nice cards, but still overpriced. I'd say I would pick one up to play with, but they look as locked down as nVidia GPUs.


----------



## Kpjoslee

EastCoast said:


> :doh:
> 
> You didn't understand his post at all.



Oh, I did. The biggest reason Navi is competing against Turing is that it is on 7nm; the Radeon VII wouldn't even have been possible without 7nm. 
My point still stands.


----------



## EastCoast

bigjdubb said:


> I have a hard time imagining that Nvidia is concerned about Navi right now. AMD has come out with comparable hardware at comparable prices, I think history has shown us that they need a better product at a better price to make a real difference in Nvidias bottom line.


We all found it hard to believe that nvidia would cheat by adding code that hampers AMD cards in games they already win, and yet they have. 





Kpjoslee said:


> Oh, I did. The biggest reason Navi is competing against Turing is that it is on 7nm; the Radeon VII wouldn't even have been possible without 7nm.
> My point still stands.


I love a double down. I get the tingles just reading this.




> AMD's cards handily win on performance, according to Tom's Hardware, marking the first time in a long time that NVIDIA has felt genuine competitive pressure from its rival.
> ...
> AMD's RX 5700 has an MSRP of $349 following a $30 price cut. The faster RX 5700 XT goes for $399 following a $50 price cut. These cards go head-to-head with NVIDIA's $349 RTX 2060 and $399 RTX 2060 SUPER.
> 
> Tom's Hardware found that the RX 5700 produced 11% higher frame rates, averaged across its benchmark suite, than the RTX 2060. Had AMD kept its original pricing, the comparison would have been muddled by a higher price. But with both cards now priced the same, AMD's entry clearly comes out on top.
> 
> The RX 5700 XT also bests its competition, beating the RTX 2060 SUPER by 9.9% on average. It even comes close to the performance of NVIDIA's $499 RTX 2070 SUPER, which beats AMD's card by just 6.9% despite costing 25% more.
> 
> On top of beating NVIDIA on raw performance, AMD made huge gains in power efficiency. The company's new cards aren't quite as power efficient as NVIDIA's products, but the gap has been substantially narrowed. Power efficiency has been one of AMD's big weaknesses for years, but a new architecture coupled with the move to a 7nm manufacturing process has allowed the company to nearly catch up with NVIDIA.


https://finance.yahoo.com/news/amd-leapfrogs-nvidia-first-time-123000900.html

Now, before you start to disagree with the article, don't miss the point: AMD is trending and is increasing its own mindshare. This is what has Nvidia worried. It's not about you, how you feel, or whether you find the source valid.
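Just to put the quoted numbers together, here's a rough perf-per-dollar sketch (the prices and the 6.9% figure come straight from the quoted article above; treat this as back-of-the-envelope, not a benchmark):

```python
# Back-of-the-envelope perf/$ using the Tom's Hardware figures quoted above.
cards = {
    # name: (price in USD, performance relative to the RX 5700 XT = 1.000)
    "RX 5700 XT":     (399, 1.000),
    "RTX 2070 SUPER": (499, 1.069),  # "beats AMD's card by just 6.9%"
}

for name, (price, perf) in cards.items():
    # performance per dollar, scaled for readability
    print(f"{name}: {perf / price * 1000:.2f} perf points per $1000")
```

By this math the 5700 XT delivers roughly 17% more performance per dollar, which is exactly the article's point.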


----------



## Blackops_2

Newbie2009 said:


> Nice cards but still overpriced. I'd say I would pick one up to play with but looks like locked down as much as nvidia gpus.


I wish they'd revert their decision to lock the 5700; I really dislike that it's locked down. Though I suppose it's just a matter of time before a custom BIOS appears, and it would be reminiscent of the 290 and Vega 56 getting within spitting distance of their bigger, more expensive brothers. It still needs to be $300, though. At $350, which (everyone is forgetting) is already a $30 reduction, it's still overpriced; at that point $50 more is (to me anyway) warranted to get the XT.


----------



## NightAntilli

Blackops_2 said:


> I wish they'd revert their decision to lock the 5700; I really dislike that it's locked down. Though I suppose it's just a matter of time before a custom BIOS appears, and it would be reminiscent of the 290 and Vega 56 getting within spitting distance of their bigger, more expensive brothers. It still needs to be $300, though. At $350, which (everyone is forgetting) is already a $30 reduction, it's still overpriced; at that point $50 more is (to me anyway) warranted to get the XT.


I heard somewhere that the 5700XT is also locked down to 2150 MHz max... And I also remember reading a while back that locking down GPUs is some sort of new demand that Microsoft has within its compliance specifications for official drivers or something like that. Can't remember where I read that, and I don't know if it's true.


----------



## ilmazzo

Please stop this 7 vs 12 nonsense; there is no rule that both need to use the same process node. For what? This is not a game with fair, fixed rules. What if nvidia had amd's R&D money instead? 

AMD did it, nvidia didn't; just judge what is real, not wishful thinking.

Navi is more than a 7nm Vega shrink. This should not be the Yahoo Answers of hardware.


----------



## ilmazzo

NightAntilli said:


> Blackops_2 said:
> 
> 
> 
> Wish they'd revert their decision to lock the 5700. I really dislike that it's locked down. Though i suppose it's just a matter of time before custom BIOS but it would be reminiscent of the 290/vega56 being with in spitting distance of it's bigger more expensive brother. Still needs to be $300 though also. At $350 which everyone is forgetting is a $30 reduction it's still overpriced. At that point $50 more (to me anyway) is warranted to get the XT.
> 
> 
> 
> I heard somewhere that the 5700XT is also locked down to 2150 MHz max... And I also remember reading a while back that locking down GPUs is some sort of new demand that Microsoft has within its compliance specifications for official drivers or something like that. Can't remember where I read that, and I don't know if it's true.
Click to expand...

Before pumping out UFO conspiracy theories:

right now, 2150 is simply the max value the Afterburner slider allows.

That's all, folks.

If the 5700 is confirmed to be BIOS-limited, which is something I don't appreciate, we will find a way to work around it.

The best-selling GPUs of the past few years are the most locked-down ever, so...


----------



## 113802

ilmazzo said:


> Please stop this 7 vs 12 nonsense; there is no rule that both need to use the same process node. For what? This is not a game with fair, fixed rules. What if nvidia had amd's R&D money instead?
> 
> AMD did it, nvidia didn't; just judge what is real, not wishful thinking.
> 
> Navi is more than a 7nm Vega shrink. This should not be the Yahoo Answers of hardware.


You're absolutely right; it's AMD's GTX 680. That's why I'll wait for its bigger brother.


----------



## teh n00binator

I'm hoping to see some more water-cooling/hybrid results (or even alternative air coolers) from reviewers. I'm wondering whether these chips are naturally toasty or the cooler design is just stupidly poor, even worse than Nvidia's blowers.


----------



## NightAntilli

Kpjoslee said:


> Oh, I did. Biggest reason Navi is competing against Turing is because it is on 7nm. Radeon VII wasn't even possible if it wasn't for 7nm.
> My point still stands.


It really isn't 7nm... That's a bonus. 
The biggest reason Navi is competing against Turing is RDNA. Whether the Radeon VII was possible without 7nm or not, 7nm did not help the Radeon VII even beat the 2080, despite it having 34% more FLOPS and around the same transistor count and clock speeds.

As a comparison between Vega and Navi: the 5700 XT almost matches the Radeon VII in gaming with similar clock speeds, 33% fewer compute units, 24% less die area, and close to 25% less power _without using HBM_. Part of the power consumption and frequency is definitely due to 7nm, but we're comparing it to the Radeon VII, another 7nm part, which really helps show the architectural improvement rather than the node improvement. Also, remember that since it's not using HBM and is still more power efficient, a huge part is confirmed to be due to the new architecture. 

And more importantly, AMD is now matching nVidia's IPC. How do we know? Running the cards at the same frequency nets similar performance while they also have a similar amount of FLOPS. Basically, they jumped over the more efficient Maxwell and Pascal and got right up to Turing. See here:

https://www.computerbase.de/2019-07/radeon-rx-5700-xt-test/4/
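To make the comparison above concrete, the same arithmetic in a few lines (the spec figures are the commonly published ones for the two cards, so treat them as approximate):

```python
# Rough RX 5700 XT vs Radeon VII comparison using commonly published specs
# (approximate figures, not measurements).
specs = {
    # name: (compute units, die area in mm^2, typical board power in W)
    "Radeon VII": (60, 331, 300),
    "RX 5700 XT": (40, 251, 225),
}

cu_vii, area_vii, power_vii = specs["Radeon VII"]
cu_navi, area_navi, power_navi = specs["RX 5700 XT"]

print(f"CUs:   {(1 - cu_navi / cu_vii) * 100:.0f}% fewer")      # ~33% fewer
print(f"Die:   {(1 - area_navi / area_vii) * 100:.0f}% smaller")  # ~24% smaller
print(f"Power: {(1 - power_navi / power_vii) * 100:.0f}% lower")  # 25% lower
```

Those ratios line up with the percentages cited in the post, which is the whole argument: near-Radeon-VII gaming performance from a much smaller, lower-power part on the same node.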


----------



## DrFPS

Is it just me, or is the clear winner the 1080 Ti? It's 2 years old. What are they thinking? I have no idea what you're complaining about regarding nVidia's cooling; this is what mine looks like.








Another
Major
Design flaw.


----------



## Kpjoslee

NightAntilli said:


> It really isn't 7nm... That's a bonus.
> The biggest reason Navi is competing against Turing is because of RDNA. Whether Radeon VII was possible on 7nm or not, 7nm did not help the Radeon VII even beat the 2080, despite it having 34% more flops, around the same amount of transistors and clock speeds.
> 
> As a comparison between Vega and Navi, the 5700XT almost matches the Radeon VII in gaming with similar clock speeds, 33% less compute units, 24% less die size area, and close to 25% less power _without using HBM_. A part of the power consumption and frequency is definitely due to 7nm, but we're comparing it to Radeon VII, another 7nm part, which really helps to show the architectural improvement rather than the node improvement. Also, remember that since it's not using HBM and is still more power efficient, a huge part is also confirmed to be due to the new architecture.
> 
> And more importantly, AMD is now matching nVidia's IPC. How we know? Running the cards at the same frequency nets similar performance while they also have a similar amount of FLOPS. Basically they jumped the more efficient Maxwell and Pascal and got right up to Turing. See here;
> 
> https://www.computerbase.de/2019-07/radeon-rx-5700-xt-test/4/


While you credit RDNA as the biggest reason Navi is able to compete against Turing, being on 7nm is THE biggest reason Navi is competing in the first place. If Navi were still on 12nm, you would be looking at quite a bit lower clockspeeds with much higher power consumption, which would have been a disaster against the Turing lineup despite the small IPC advantage. I am not discrediting RDNA, since it is a much-needed step in the right direction, but their early move to 7nm was more important, because that is what enabled them to compete.


----------



## magnek

ToTheSun! said:


> My opinion is that it sufficed the first 2 times it was posted on this very thread.


My opinion is that your opinion is correct.



DrFPS said:


> Is it me, or is the clear winner the 1080 Ti? It's 2 years old. What are they thinking? I have no idea what you're complaining about with Nvidia's cooling. This is what mine looks like.
> 
> 
> 
> 
> 
> 
> 
> 
> Another
> Major
> Design flaw.


Ummm dude, you know that kinda goes against the point you're trying to make, right? Needing overkill triple-fan, triple-slot cooling just means the card runs hot and is power hungry. Pretty sure that's not the point you were trying to make, but yeah, just an FYI.



Kpjoslee said:


> While you credit RDNA as the biggest reason Navi is able to compete against Turing, being on 7nm is THE biggest reason Navi is competing in the first place. If Navi were still on 12nm, you would be looking at a considerably lower clock speed with much higher power consumption, which would have been a disaster against the Turing lineup despite the small IPC advantage. I am not discrediting RDNA, since it is a much-needed step in the right direction, but the early move to 7nm was more important because that is what enabled AMD to compete.


:thumb:

I have rose-colored glasses on most of the time, but even *I* can't stand the double standards being peddled around in this thread. When nVidia wins in Hz/IPC/efficiency, it's because "yeah great, they did a node shrink, big deal". But when AMD does the same, it's some rainbow pixie dust that has absolutely nothing to do with a node shrink whatsoever, none, zilch, nada, no-siree.


----------



## guttheslayer

ilmazzo said:


> Please stop this 7vs 12 non sense, there is no rule that both need to use same pp for ...what?


You are wrong on many levels. The 7nm vs 12nm difference is the main reason these cards perform the way they do.

The next iteration, 7nm+ with EUV, promises a 2x density shrink over the existing 12nm. AMD had the die shrink and the power savings of the smaller node, and yet the card is still consuming a lot of power.


Yes, big Navi can probably offer at least 60% more SPs, bringing its performance to 10-20% above the 2080 Ti, but that die is 60% bigger as well, or 400mm^2 at least; even leveraging EUV's better density it will still be around 380mm^2, and it will be a 350W monster.


NV porting over to 7nm EUV could see a massive 20% performance gain at just a ~350mm^2 die (a 4096-core Turing at 7nm+ would probably be around 340mm^2). And at that die size they will probably consume 180W (as they always do), while the Navi variant will find it extremely difficult to stay below 300W at 380mm^2.


The power required per mm^2 just isn't in AMD's favour, no matter how you look at it. Most likely the mid-size-die RTX 3080 will surpass Navi 20 without Nvidia having to go to a big 7nm+ Turing GPU.
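The scaling argument above can be sanity-checked with a quick back-of-the-envelope script. This is only a sketch: the 60% SP increase and the ~15% EUV density gain are assumptions taken from the post (and TSMC's N7+ marketing claims), and linear area/power scaling is a deliberate simplification.

```python
# Back-of-the-envelope check of the die-size/power speculation above.
# Inputs are the post's own assumptions (hypothetical), except Navi 10's
# die size and board power, which are public specs.

NAVI10_DIE_MM2 = 251   # RX 5700 XT (Navi 10) die size
NAVI10_TBP_W = 225     # RX 5700 XT total board power

def scale_die(base_mm2: float, sp_increase: float, density_gain: float = 1.0) -> float:
    """Naively assume die area grows linearly with shader count,
    then shrinks by the claimed node density gain."""
    return base_mm2 * (1 + sp_increase) / density_gain

# "+60% SPs" on the same 7nm node:
big_navi_7nm = scale_die(NAVI10_DIE_MM2, 0.60)              # ~402 mm^2
# Same chip on 7nm+ EUV, assuming ~15% better logic density:
big_navi_7nm_plus = scale_die(NAVI10_DIE_MM2, 0.60, 1.15)   # ~349 mm^2
# Pessimistic linear power scaling (ignores clock/voltage tuning):
big_navi_power = NAVI10_TBP_W * 1.60

print(f"7nm:  {big_navi_7nm:.0f} mm^2")
print(f"7nm+: {big_navi_7nm_plus:.0f} mm^2")
print(f"TBP:  {big_navi_power:.0f} W")
```

The naive +60% area lands right at the post's ~400mm^2 figure, but the linear power estimate is a worst case: wider GPUs are usually clocked lower on the voltage/frequency curve, so a real product would likely come in under it.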


----------



## ZealotKi11er

guttheslayer said:


> You are wrong on many levels. The 7nm vs 12nm difference is the main reason these cards perform the way they do.
> 
> The next iteration, 7nm+ with EUV, promises a 2x density shrink over the existing 12nm. AMD had the die shrink and the power savings of the smaller node, and yet the card is still consuming a lot of power.
> 
> 
> Yes, big Navi can probably offer at least 60% more SPs, bringing its performance to 10-20% above the 2080 Ti, but that die is 60% bigger as well, or 400mm^2 at least; even leveraging EUV's better density it will still be around 380mm^2, and it will be a 350W monster.
> 
> 
> NV porting over to 7nm EUV could see a massive 20% performance gain at just a ~350mm^2 die (a 4096-core Turing at 7nm+ would probably be around 340mm^2). And at that die size they will probably consume 180W (as they always do), while the Navi variant will find it extremely difficult to stay below 300W at 380mm^2.
> 
> 
> The power required per mm^2 just isn't in AMD's favour, no matter how you look at it. Most likely the mid-size-die RTX 3080 will surpass Navi 20 without Nvidia having to go to a big 7nm+ Turing GPU.


We can't see the future, but if Nvidia keeps adding RT and Tensor cores they will have larger dies than AMD, meaning worse yields and higher cost. 7nm helps Navi for sure; 12nm Navi would have used 10-20% more power but probably cost less. Let's get this clear: Nvidia would be using 7nm if they could, not staying on 12nm because they can compete there. It is simply not viable for them to build Turing on 7nm; even if Turing had come by the end of 2019, anything over 350mm2 would have been too expensive.
I like to believe AMD has learned from Vega, and you can already see it: no more single GPU trying to do both datacenter/enterprise and gaming. I hope we don't get 2080 Ti-priced GPUs from Nvidia with their next gen.


----------



## magnek

ZealotKi11er said:


> We can't see the future, but if Nvidia keeps adding RT and Tensor cores they will have larger dies than AMD, meaning worse yields and higher cost. 7nm helps Navi for sure; 12nm Navi would have used 10-20% more power but probably cost less. Let's get this clear: Nvidia would be using 7nm if they could, not staying on 12nm because they can compete there. It is simply not viable for them to build Turing on 7nm; even if Turing had come by the end of 2019, anything over 350mm2 would have been too expensive.
> I like to believe AMD has learned from Vega, and you can already see it: no more single GPU trying to do both datacenter/enterprise and gaming. *I hope we don't get 2080 Ti-priced GPUs from Nvidia with their next gen.*


We absolutely will if AMD has nothing to compete with their 2080 Ti successor. And to be totally honest, assuming AMD does have something up their sleeves, don't expect it to be cheap either. Likely not $1200 level insanity, but definitely an $800+ card.


----------



## NightAntilli

ZealotKi11er said:


> We can't see the future, but if Nvidia keeps adding RT and Tensor cores they will have larger dies than AMD, meaning worse yields and higher cost. *7nm helps Navi for sure; 12nm Navi would have used 10-20% more power but probably cost less.* Let's get this clear: Nvidia would be using 7nm if they could, not staying on 12nm because they can compete there. It is simply not viable for them to build Turing on 7nm; even if Turing had come by the end of 2019, anything over 350mm2 would have been too expensive.
> I like to believe AMD has learned from Vega, and you can already see it: no more single GPU trying to do both datacenter/enterprise and gaming. I hope we don't get 2080 Ti-priced GPUs from Nvidia with their next gen.


Glad to see someone understands. No one has given a sound argument for how I'm wrong about RDNA being the main reason for the improvement; they have simply repeated the same stuff over and over. Even after I clearly showed that the Radeon VII, also a 7nm part, didn't do anything for AMD, people still tout 7nm as the main reason AMD is able to compete with Navi. Some real mental gymnastics are necessary for that kind of conclusion.

And then there's this;



magnek said:


> I have rose-colored glasses on most of the time, but even *I* can't stand the double standards being peddled around in this thread. When nVidia wins in Hz/IPC/efficiency, it's because "yeah great, they did a node shrink, big deal". But when AMD does the same, it's some rainbow pixie dust that has absolutely nothing to do with a node shrink whatsoever, none, zilch, nada, no-siree.


Can't people argue without falsely accusing others of things they clearly didn't say? Statements like these make me want to punch somebody. But whatever. I'm just leaving the IPC improvement of RDNA here and calling it a day. People can believe whatever nonsense they wish.


----------



## magnek

I was actually referring to other posters but whatever indeed. Nobody is saying RDNA didn't improve IPC, but I also think it's disingenuous to ignore the node shrink as a major contributing factor.


----------



## 113802

NightAntilli said:


> Glad to see someone understands. No one has given a sound argument for how I'm wrong about RDNA being the main reason for the improvement; they have simply repeated the same stuff over and over. *Even after I clearly showed that the Radeon VII, also a 7nm part, didn't do anything for AMD*, people still tout 7nm as the main reason AMD is able to compete with Navi. Some real mental gymnastics are necessary for that kind of conclusion.


Yet it sits between the RTX 2080 and RTX 2080 Ti in compute workloads and content creation. Vega 20 did so well that we're seeing it in the Mac Pro. The 5700 XT merely trades blows with the Vega 64 in compute workloads.


----------



## Heuchler




----------



## 113802

Heuchler said:


>


Another person just talking purely about gaming performance. 

Look at the TensorFlow benchmarks of the RTX 2060, which is mostly faster than the GTX 1080 Ti:
https://www.phoronix.com/scan.php?page=article&item=nvidia-rtx2060-linux&num=7

The RTX 2060 Super is most likely consistently faster than the GTX 1080 Ti at machine learning, especially when Tensor cores are in use.



RTX 2080 Ti - FP16 TensorFlow Performance (1 GPU)

For FP16 training of neural networks, the RTX 2080 Ti is:

72% faster than GTX 1080 Ti
59% faster than Titan XP
32% faster than RTX 2080
81% as fast as Titan V
71% as fast as Titan RTX
55% as fast as Tesla V100 (32 GB)
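One nit on the list above: "X% faster than" and "Y% as fast as" are different arithmetic, and mixing them makes the cards hard to compare at a glance. A small sketch (figures copied from the post) normalizes everything to a single scale where the RTX 2080 Ti = 1.0:

```python
# Normalize the mixed "faster than" / "as fast as" FP16 figures above
# to a single scale where the RTX 2080 Ti = 1.00.

# "The 2080 Ti is X% faster than <card>" -> card = 1 / (1 + X/100)
faster_than = {"GTX 1080 Ti": 72, "Titan XP": 59, "RTX 2080": 32}
# "The 2080 Ti is Y% as fast as <card>" -> card = 1 / (Y/100)
as_fast_as = {"Titan V": 81, "Titan RTX": 71, "Tesla V100 32GB": 55}

relative = {"RTX 2080 Ti": 1.0}
for card, pct in faster_than.items():
    relative[card] = 1 / (1 + pct / 100)
for card, pct in as_fast_as.items():
    relative[card] = 1 / (pct / 100)

# Print fastest first
for card, r in sorted(relative.items(), key=lambda kv: -kv[1]):
    print(f"{card}: {r:.2f}x")
```

On these figures the GTX 1080 Ti lands at roughly 0.58x of the 2080 Ti, and the V100 at roughly 1.82x.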


----------



## Imouto

WannaBeOCer said:


> Another person just talking purely about gaming performance.


He forgot to mention its performance as a doorstop too. On par with an RTX 2080 Ti, and much better value.


----------



## Heuchler

WannaBeOCer said:


> Another person just talking purely about gaming performance.
> 
> Look at the TensorFlow benchmarks of the RTX 2060, which is mostly faster than the GTX 1080 Ti:
> https://www.phoronix.com/scan.php?page=article&item=nvidia-rtx2060-linux&num=7
> 
> The RTX 2060 Super is most likely consistently faster than the GTX 1080 Ti at machine learning, especially when Tensor cores are in use.
> 
> 
> 
> RTX 2080 Ti - FP16 TensorFlow Performance (1 GPU)
> 
> For FP16 training of neural networks, the RTX 2080 Ti is..
> 
> 72% faster than GTX 1080 Ti
> 59% faster than Titan XP
> 32% faster than RTX 2080
> 81% as fast as Titan V
> 71% as fast as Titan RTX
> 55% as fast as Tesla V100 (32 GB)



So how many people that purchase the RTX 2060 SUPER will use it for anything other than gaming?


----------



## 113802

Heuchler said:


> So how many people that purchase the RTX 2060 SUPER will use it for anything other than gaming?


There are a lot of users already on the RTX 2060 since it's cost-effective: it had the highest performance per dollar of the entire RTX 20 series lineup when comparing deep learning performance. Many pre-assembled compute machines ship with blower-style RTX 2060s, so I'm sure they're around.

The increase in memory and performance will definitely attract more of those users to the RTX 2060 Super.

https://timdettmers.com/2019/04/03/which-gpu-for-deep-learning/

https://towardsdatascience.com/rtx-...t-rtx-vs-most-expensive-gtx-card-cd47cd9931d2


----------



## ilmazzo

guttheslayer said:


> You are very wrong on many level. 7 vs 12 is the main differences these cards are performing.
> 
> The next iteration of 7nm+ with EUV promises 2x density shrink from the existing 12nm. AMD had the die shrink and power saving from smaller node advantage, and yet its still consuming alot of power.
> 
> 
> Yes, big NAVI probably can offer at least 60% more SPs bringing its performance to 10-20% above 2080 Ti, but that die is 60% bigger as well, or 400mm^2 at least, even leveraging EUV better density shrink it will still be around 380mm^2, and will be a 350W monster.
> 
> 
> NV porting over to 7nm EUV could have massive 20% performance gain at just 350~ mm^2 die (A 4096 CORES Turing at 7nm+ probably has the size of 340mm^2). And at that die size they will probably consume 180W (as they always do), while the NAVI variant will be extremely difficult to keep it below 300W at 380mm^2.
> 
> 
> The power required per mm^2 is just isnt in the favour of AMD, no matter how you looked at it. Most likely the mid-size die RTX 3080 will surpassed NAVI 20 without having to go to the big 7nm+ turning GPU.


No, the main difference is that one has a green box and the other a red one... if this is your level of analysis.....

Anyway, I trust you. Can you link me to where I can buy this legendary-tier Nvidia 3080 7nm+++++ you are referring to, please? And a review too?

Much appreciated.

Have a nice day.

P.S.: A 350W monster? Have a look in the 2080 Ti official thread; people are struggling to get 500W BIOSes, lulz. Enthusiasts don't give a penny about power efficiency.


----------



## maltamonk

Imouto said:


> He forgot to mention its performance as a door stopper too. On par with a RTX 2080 Ti and a much better value.


I laughed...lol


----------



## criminal

The cooler on the 5700 XT is very bad. I got mine yesterday, and out of the box it isn't very loud, but the card idled at 48-50C and got to 101C after playing Battlefield V for a few minutes. I opened up Wattman and set a more aggressive fan profile, which helped with the temps (40C idle and 88C load), but the noise was unbearable to me. I was able to undervolt the card to 1.095v and set the fan to a constant 45%, which was as loud as I could tolerate, and after a few minutes of stress saw temps around 86-87C.

So I decided to tear it down and put my Kraken G12 on it. That didn't go as planned since I couldn't get the mounting holes lined up no matter which brackets I tried. So I guess the Kraken G12 isn't compatible. Finally I decided to pull the graphite pad off, use thermal paste instead and remounted the cooler with some plastic washers to improve pressure. After putting the card back under load using Time Spy stress test for 10 minutes the card hit 111C. I went to bed at that point.

Tonight I will try the Accelero Twin Turbo III and see if it will mount. If not I will be water blocking the card asap.



TLDR; Cooler sucks, Kraken G12 isn't compatible and re-pasting with plastic washers made temps worse. Will try Accelero Twin Turbo III next.


----------



## 113802

criminal said:


> :(
> 
> The cooler on the 5700 XT is very bad. I got mine yesterday, and out of the box it isn't very loud, but the card idled at 48-50C and got to 101C after playing Battlefield V for a few minutes. I opened up Wattman and set a more aggressive fan profile, which helped with the temps (40C idle and 88C load), but the noise was unbearable to me. I was able to undervolt the card to 1.095v and set the fan to a constant 45%, which was as loud as I could tolerate, and after a few minutes of stress saw temps around 86-87C.
> 
> So I decided to tear it down and put my Kraken G12 on it. That didn't go as planned since I couldn't get the mounting holes lined up no matter which brackets I tried. So I guess the Kraken G12 isn't compatible. Finally I decided to pull the graphite pad off, use thermal paste instead and remounted the cooler with some plastic washers to improve pressure. After putting the card back under load using Time Spy stress test for 10 minutes the card hit 111C. I went to bed at that point.
> 
> Tonight I will try the Accelero Twin Turbo III and see if it will mount. If not I will be water blocking the card asap.
> 
> 
> 
> TLDR; Cooler sucks, Kraken G12 isn't compatible and re-pasting with plastic washers made temps worse. Will try Accelero Twin Turbo III next.


For a blower it's not bad but the fan profile is horrible.


----------



## PontiacGTX

WannaBeOCer said:


> For a blower it's not bad but the fan profile is horrible.


Still, the noise level isn't acceptable. While Nvidia decided to release a dual-fan solution, AMD released blower-based cooling, which keeps some people from buying or keeping these cards because of the noise. Also keep in mind that not many will want to spend $60+ on a heatsink or watercooling solution (assuming they don't own one that fits), or even consider "modding" their card, due to warranty restrictions.


----------



## Newbie2009

criminal said:


> The cooler on the 5700 XT is very bad. I got mine yesterday, and out of the box it isn't very loud, but the card idled at 48-50C and got to 101C after playing Battlefield V for a few minutes. I opened up Wattman and set a more aggressive fan profile, which helped with the temps (40C idle and 88C load), but the noise was unbearable to me. I was able to undervolt the card to 1.095v and set the fan to a constant 45%, which was as loud as I could tolerate, and after a few minutes of stress saw temps around 86-87C.
> 
> So I decided to tear it down and put my Kraken G12 on it. That didn't go as planned since I couldn't get the mounting holes lined up no matter which brackets I tried. So I guess the Kraken G12 isn't compatible. Finally I decided to pull the graphite pad off, use thermal paste instead and remounted the cooler with some plastic washers to improve pressure. After putting the card back under load using Time Spy stress test for 10 minutes the card hit 111C. I went to bed at that point.
> 
> Tonight I will try the Accelero Twin Turbo III and see if it will mount. If not I will be water blocking the card asap.
> 
> 
> 
> TLDR; Cooler sucks, Kraken G12 isn't compatible and re-pasting with plastic washers made temps worse. Will try Accelero Twin Turbo III next.


Try without washers. AMD messed up by using a pad, but if it's worse with paste, it needs a remount. I will be putting liquid metal on mine as soon as it arrives.


----------



## keikei

Newbie2009 said:


> Try without washers. AMD messed up by using a pad, but if it's worse with paste, it needs a remount. I will be putting *liquid metal* on mine as soon as it arrives.


Fancy.  I wonder if Morpheus fits these?


----------



## criminal

Newbie2009 said:


> Try without washers. AMD messed up by using a pad, but if it's worse with paste, it needs a remount. I will be putting liquid metal on mine as soon as it arrives.


Okay, I will give that a try. I do have some liquid metal myself. Hmmm...


Bright side is that the performance in BFV (the main game I play right now) was phenomenal. On Ultra 1440p I saw over 130FPS often.


----------



## PontiacGTX

keikei said:


> Fancy.  I wonder if Morpheus fits these?


Yes, it does, but then the card would be $475, very close to the RTX 2070 Super's MSRP. Are there any benchmarks yet, aside from Tom's Hardware's, of an overclocked RX 5700 with a proper cooling solution?


----------



## Newbie2009

keikei said:


> Fancy.  I wonder if Morpheus fits these?


I would imagine one could remove the fan and shroud and cable-tie two Noctua fans to the heatsink, and it would be grand.

People get so bent out of shape over blower cards.


----------



## ToTheSun!

criminal said:


> Bright side is that the performance in BFV (the main game I play right now) was phenomenal. On Ultra 1440p I saw over 130FPS often.


More than 2080 performance for less than 2070 prices. Can't beat that value!


----------



## tpi2007

At the end of the day, Navi with AIB coolers (depending on the premium they'll ask for them) may be a better deal than Nvidia's offers, but, all in all, this whole generation of GPUs is a pass, from both Nvidia and AMD. Samsung 7nm EUV / TSMC 6nm is where it's at. 

The current gens at their going prices show beyond doubt that this generation is experimental on both sides, and that the costs are being passed on to consumers. Nvidia needs a more balanced RTX lineup in terms of performance with RTX turned on, at sensible price points, and AMD needs improved energy efficiency, more performance, and hardware ray tracing support in Navi 2.0.

The truth is, we are currently getting x60-x70 class performance for $399 from both camps, a long time after the previous gen, and even longer in AMD's case. So no, Navi isn't great value; it's just slightly better than Nvidia's Super series, and that isn't saying much. With Navi you're simply getting in 2019 the level of price/performance improvement that we should have gotten last year.

In order for Navi to be as good an arch as Turing in rasterization performance, the 7nm RX 5700 XT at $399 would have to perform on par with a 12nm RTX 2080 Ti while using 150W, so they have a lot of catching up to do if they want to scale the design to compete at the high end.




Then there's the fact that you only get one chance to make a good first impression, and AMD's Radeon group still doesn't understand this simple fact of life after all these years. They keep making beginners' mistakes over and over, since at least 2013, which is even more inexcusable when they are coming late to market and there is no reason to rush a release at this point.

Take the R9 290X and 290: great cards, but due to AMD's incompetence the talk was about the crappy reference cooler, the lack of AIB models at release, what clocks the card could actually achieve vs. stated, Uber mode vs. non-Uber mode, and, importantly, the fact that the 290X only beat the Titan in Uber mode (Uber loud, that is). None of this would have been the subject of analysis if the cards had come with a better cooler or had aftermarket models at release. The 290X wouldn't have needed an Uber mode switch to begin with; it would always have reached 1 GHz, and run cooler and quieter, because the cooler would have been properly built to handle the heat.

Then there's Polaris, a reasonable card, but they tried to portray it as a 150W-or-lower card (most probably sensing the 120W GTX 1060 coming just a few weeks later) by only putting a single 6-pin power connector on it. When reviewed, the card was actually using 163W, and, instead of exceeding the spec on the 6-pin connector, it did so on the PCIe slot, which is much worse. They then had to release a driver update with a specific toggle to bring the card back within PCIe spec.

Then comes Vega, and here comes the talk of undervolting to make the power consumption less obscene compared to the 1080. That just seems like poor product binning at the factory and puts the burden and risk of not getting any better results in the hands of the buyer.

Now comes Navi: late, with last-minute price cuts, and the same undervolting talk. The media engine, according to the LTT review above, produces poor results, and the cooler, in order not to be as loud as before (though it's still loud), was hamstrung, so now the cards get super hot (as Criminal posted above) and, in LTT's case, even produced a system shutdown. The backplate on the 5700 XT doesn't even have thermal pads.

AMD needs to think long and hard why their GPUs are seen as second rate. Even when they have a good card like the R9 290X, they make beginners' mistakes with it.



And now Scott Herkelman, due to the backlash over the cooler, says he likes the idea of open-air coolers at launch. Well, they should have done that from the start; now it's only going to happen with Navi 2.0, so again one step behind Nvidia:

https://www.reddit.com/r/Amd/comments/catck3/psa_5700_series_custom_aib_designs/


> But the feedback over the past few weeks has been really good for us to read. Going forward:
> 
> 1). If blower design are used also offer dual/tri-axial options at launch for the enthusiasts. I like this idea.


----------



## Blackops_2

tpi2007 said:


> At the end of the day, Navi with AIB coolers (depending on the premium they'll ask for them) may be a better deal than Nvidia's offers, but, all in all, this whole generation of GPUs is a pass, from both Nvidia and AMD. Samsung 7nm EUV / TSMC 6nm is where it's at.
> 
> The current gens at their going prices are just showing beyond doubt that they are just experimental from both sides and they are passing the costs to consumers. Nvidia needs a more balanced RTX lineup in terms of performance with RTX turned on and with sensible price points and AMD needs Navi with improved energy efficiency, more performance and hardware ray tracing support in Navi 2.0.


That's my biggest issue with the current market. They've moved midrange cards from $250-300 to $400, and while, yes, Navi is better value, as you said, it's as late as can be, and by conventional naming still not great price/performance. They should've launched it as the fan shroud originally said, as the RX 690, and priced it accordingly; then the blower and the disorganized launch wouldn't matter, because price/performance would be normal again and people would buy them in droves.

Zen 2 hit the mark for the most part. By Christmas I expect Navi to be doing very well, but it really needs to be $400 for AIB cards. Still upset about the 5700 being locked, too.


----------



## AlphaC

I am kind of disappointed with Navi, but I did notice they set new highs in the energy benchmark, which is more or less a point cloud:









Radeon Pro WX 3200 (an RX 550 rehash, ~65W) launched this month, so it seems they are moving on to WX x200 naming. The WX 8200 is a Vega 56. Maybe a cut-down Navi will slot in at WX 7200, ~150W, as a replacement for the Polaris-based WX 7100.


The WX 8200 is $1K; the WX 3200 is $200. They're missing the $400-700 cards.


The RTX 4000 is $900; the P2200 is a ~$400 Pascal GP106 rehash with GDDR5X (sub-75W).


----------



## PontiacGTX

AlphaC said:


> I am kind of disappointed with Navi but I did notice they set new highs for energy benchmark which more or less is a pointcloud:
> View attachment 279038
> 
> 
> 
> Radeon Pro WX3200 (RX550 rehash ~65W) launched this month so it seems they are moving on to WX_200 naming. WX 8200 is a Vega 56. Maybe cut down Navi will slot in at WX 7200 ~150W as a Polaris WX 7100 replacement.
> 
> 
> WX 8200 is $1K , WX 3200 is $200. They're missing the $400-700 cards.
> 
> 
> RTX 4000 is $900 , P2200 is ~$400 Pascal GP106 rehash with GDDR5X (sub 75W).


can you link this article/review?


----------



## AlphaC

PontiacGTX said:


> can you link this article/review?


It's the LinusTechTips video mentioned earlier.


https://www.youtube.com/watch?v=3bmQPx9EJLA

Before people say "it's CPU bound": it's not.
https://diit.cz/clanek/recenze-amd-...-cinebench-r15r15er20-blender-specviewperf-13
2080 ti = 32-33 FPS

https://www.spec.org/gwpg/gpc.data/vp13/summary.html
RTX 5000 = 56 FPS (CPU = Xeon Gold 8c/16t Skylake, max 4.2GHz)

https://techgage.com/article/nvidia-quadro-rtx-4000-review/4/
P6000 = 51.9FPS (GP102)
RTX 4000 = 39.5 FPS (RTX 2070 super with RTX 2070 CUDA count, pro drivers)
Radeon VII = 35.3 FPS
2080 Ti = 31.4 FPS 

https://hothardware.com/reviews/nvidia-quadro-rtx-4000-review?page=4
TITAN RTX = 52.33
P6000 = 53.42
RTX 4000 = 39.55
P5000 = 39.03 (GTX 1080 pro drivers)
2080 ti = 32.27
P4000 = 30.19 (GTX 1070 underclocked + pro drivers)
WX 8200 = 23.15


RTX 4000 = 40 (https://www.engineering.com/Hardware/ArticleID/18495/NVIDA-RTX-4000-GPU-A-Hands-On-Review.aspx)


----------



## 113802

AlphaC said:


> I am kind of disappointed with Navi but I did notice they set new highs for energy benchmark which more or less is a pointcloud:
> View attachment 279038
> 
> 
> 
> Radeon Pro WX3200 (RX550 rehash ~65W) launched this month so it seems they are moving on to WX_200 naming. WX 8200 is a Vega 56. Maybe cut down Navi will slot in at WX 7200 ~150W as a Polaris WX 7100 replacement.
> 
> 
> WX 8200 is $1K , WX 3200 is $200. They're missing the $400-700 cards.
> 
> 
> RTX 4000 is $900 , P2200 is ~$400 Pascal GP106 rehash with GDDR5X (sub 75W).


We're going to continue to see Vega since it's a high-performance computing architecture, while Navi is a ground-up architecture for gaming.


----------



## PontiacGTX

AlphaC said:


> It's the Linustechtips video mentioned earlier.
> 
> 
> https://www.youtube.com/watch?v=3bmQPx9EJLA
> 
> Before people say "it's CPU bound": it's not.
> https://diit.cz/clanek/recenze-amd-...-cinebench-r15r15er20-blender-specviewperf-13
> 2080 ti = 32-33 FPS
> 
> https://www.spec.org/gwpg/gpc.data/vp13/summary.html
> RTX 5000 = 56 FPS (CPU = Xeon Gold 8c/16t Skylake, max 4.2GHz)
> 
> https://techgage.com/article/nvidia-quadro-rtx-4000-review/4/
> P6000 = 51.9FPS (GP102)
> RTX 4000 = 39.5 FPS (RTX 2070 super with RTX 2070 CUDA count, pro drivers)
> Radeon VII = 35.3 FPS
> 2080 Ti = 31.4 FPS
> 
> https://hothardware.com/reviews/nvidia-quadro-rtx-4000-review?page=4
> TITAN RTX = 52.33
> P6000 = 53.42
> RTX 4000 = 39.55
> P5000 = 39.03 (GTX 1080 pro drivers)
> 2080 ti = 32.27
> P4000 = 30.19 (GTX 1070 underclocked + pro drivers)
> WX 8200 = 23.15
> 
> 
> RTX 4000 = 40 (https://www.engineering.com/Hardware/ArticleID/18495/NVIDA-RTX-4000-GPU-A-Hands-On-Review.aspx)


found this 







link

If a reference Vega 56 scores 2200 at stock, do you think Navi improved 32-bit integer performance?


----------



## Newbie2009

tpi2007 said:


> At the end of the day, Navi with AIB coolers (depending on the premium they'll ask for them) may be a better deal than Nvidia's offers, but, all in all, this whole generation of GPUs is a pass, from both Nvidia and AMD. Samsung 7nm EUV / TSMC 6nm is where it's at.
> 
> The current gens at their going prices are just showing beyond doubt that they are just experimental from both sides and they are passing the costs to consumers. Nvidia needs a more balanced RTX lineup in terms of performance with RTX turned on and with sensible price points and AMD needs Navi with improved energy efficiency, more performance and hardware ray tracing support in Navi 2.0.
> 
> The truth is, we are currently having x60 - x70 performance class tier for $399 from both camps, after a long time after the previous gen, even more so for AMD. So no, Navi isn't great value, it's just slightly better than Nvidia's Super series and that isn't saying much. With Navi you're simply getting in 2019 the level of price/performance improvement that we should have gotten last year.
> 
> In order for Navi to be as good an arch as Turing in rasterization performance, the 7nm RX 5700 XT at $399 should be performing on par with a 12nm RTX 2080 Ti while using 150w, so they have a lot of catching up to do if they want to scale the design to compete at the high end.
> 
> 
> 
> 
> Then, there's the fact that you only get one chance to make a first good impression and AMD's Radeon group still doesn't understand this simple fact of life after all these years. They keep making beginners' mistakes over and over again since at least 2013, which is even more inexcusable when they are coming late to the market and there is no reason to rush a release at this point.
> 
> Take the R9 290X and 290, great cards, but due to AMD's incompetence, the talk was about the crappy reference cooler, the lack of AIB models at release, what clocks the card can actually achieve vs stated, etc, Uber mode vs non Uber mode, importantly when the 290X only beat the Titan in Uber mode (Uber loud that is). None of this would have been the subject of analysis if the cards came with a better cooler or had aftermarket cards on release. The 290X wouldn't need an Uber mode switch to begin with, it would always reach 1 Ghz, run cooler and quieter because the cooler would be properly made to handle the heat.
> 
> Then there's Polaris, a reasonable card, but they tried to portray it as a 150W-or-lower card (most probably sensing the 120W GTX 1060 coming just a few weeks later) by only putting a single 6-pin power connector on it. When reviewed, the card was actually using 163W, and, instead of exceeding the spec on the 6-pin connector, it did so on the PCIe slot, which is much worse. They then had to release a driver update with a specific toggle to bring the card back within PCIe spec.
> 
> Then comes Vega, and here comes the talk of undervolting to make the power consumption less obscene compared to the 1080. That just seems like poor product binning at the factory and puts the burden and risk of not getting any better results in the hands of the buyer.
> 
> Now Navi, late and with last minute price cuts, and the same undervolting talk. The media engine, according to the LTT review above, produces poor results, and the cooler, in order not to be as loud as before (but it's still loud), was hamstrung, so now the cards get super hot (like Criminal posted above) and, in LTT's case, even produced a system shutdown. The backplate on the 5700 XT doesn't have thermal pads.
> 
> AMD needs to think long and hard why their GPUs are seen as second rate. Even when they have a good card like the R9 290X, they make beginners' mistakes with it.
> 
> 
> 
> And now Scott Herkelman, due to the backlash over the cooler, says he has come around to open air coolers at launch. Well, they should have done that here; now it's only going to happen with Navi 2.0, so again one step behind Nvidia:
> 
> https://www.reddit.com/r/Amd/comments/catck3/psa_5700_series_custom_aib_designs/


Yeah, they don't learn from their mistakes. The only conclusion I can come to is that they don't feel in a strong enough position to piss off their partners with competing dual fan cards. They did it with Vega 7, but that wasn't really a normal card launch.

Vega 64 was probably the most disappointing AMD card I've owned, and I had it under water.
The market is bad, no doubt, but I have stuck to my guns by never paying more than 500€ for a card. Yeah it's a rip off (especially the anniversary edition), but what other option do we have? Hold out until next gen when prices are even worse? That's the view I took anyway, and I picked up an anniversary one to play with.

Prices are only going to go up, currencies around the world are being debased, race to the bottom.


----------



## magnek

criminal said:


> The cooler on the 5700XT is very bad. I got mine yesterday and out of the box it isn't very loud, but the card idled at 48-50C and got to 101C after playing Battlefield V for a few minutes. I opened up Wattman and set a more aggressive fan profile which helped with the temps (40C idle and 88C load), but the noise was unbearable to me. I was able to undervolt the card to 1.095v and I set the fan to a constant 45%, which was as loud as I could tolerate, and after a few minutes of stress saw temps around 86-87C.
> 
> So I decided to tear it down and put my Kraken G12 on it. That didn't go as planned since I couldn't get the mounting holes lined up no matter which brackets I tried. So I guess the Kraken G12 isn't compatible. Finally I decided to pull the graphite pad off, use thermal paste instead and remounted the cooler with some plastic washers to improve pressure. After putting the card back under load using Time Spy stress test for 10 minutes the card hit 111C. I went to bed at that point.
> 
> Tonight I will try the Accelero Twin Turbo III and see if it will mount. If not I will be water blocking the card asap.
> 
> 
> 
> TLDR; Cooler sucks, Kraken G12 isn't compatible and re-pasting with plastic washers made temps worse. Will try Accelero Twin Turbo III next.


Almost sounds like die contact (i.e. the heatsink's contact with the die) is the problem.



ToTheSun! said:


> More than 2080 performance for less than 2070 prices. Can't beat that value!


Yep as mentioned in one of my earlier posts. Navi loooooooooooves the Frostbite engine. For someone who plays BF V a lot, nVidia becomes a very hard sell.


----------



## PontiacGTX

magnek said:


> Almost sounds like die contact (i.e. the heatsink's contact with the die) is the problem.


It was the same story back in 2013:
https://www.computerbase.de/2013-11/amd-radeon-r9-290-test/10/


----------



## tpi2007

Newbie2009 said:


> Yeah they don’t learn from their mistakes. Only conclusion I can come to is they don’t feel in a strong enough position to piss off their partners with competing dual fan cards. They did it with Vega 7 but that wasn’t really a normal card launch.
> 
> Vega 64 was probably the most disappointing AMD card I've owned, and I had it under water.
> The market is bad, no doubt, but I have stuck to my guns by never paying more than 500€ for a card. Yeah it's a rip off (especially the anniversary edition), but what other option do we have? Hold out until next gen when prices are even worse? That's the view I took anyway, and I picked up an anniversary one to play with.
> 
> Prices are only going to go up, currencies around the world are being debased, race to the bottom.



7nm EUV is supposed to make manufacturing less complex, more reliable, and cheaper, so I'm certainly hoping this situation won't continue.

As for AMD, if they didn't feel in a strong enough position to upset their partners with a competing dual fan card at launch, then they should have arranged the launch timeframe so that the partners had time to get custom cards ready for launch.


----------



## NightAntilli




----------



## The Robot

tpi2007 said:


> As to AMD, in case they didn't feel in a strong enough position to upset their partners with a competing dual fan card at launch, then they should have arranged the launch timeframe in a way that the partners could have the time to have custom made cards for launch.


Then everyone would complain that AMD is late as usual to respond to Nvidia. Sort of a catch-22 for AMD.


----------



## looniam

criminal said:


> Okay, I will give that a try. I do have some liquid metal myself. Hmmm...
> 
> 
> Bright side is that the performance in BFV (the main game I play right now) was phenomenal. On Ultra 1440p I saw over 130FPS often.


if you could measure the mounting holes? my uniblock maxes out at ~58.4mm and I'm curious.
really thinking about popping out the vapor chamber and cutting a notch in the shroud. 

TIA.


----------



## criminal

NightAntilli said:


> https://www.youtube.com/watch?v=Ud8Bco0dk6Q


Now I am wondering if the plastic washers I used were not thick enough. Either way, that's still not enough improvement for my liking.



looniam said:


> if you could measure the mounting holes? my uniblock maxes out at ~58.4mm and I'm curious.
> really thinking about popping out the vapor chamber and cutting a notch in the shroud.
> 
> TIA.


Okay, I will let you know. I have been contemplating doing something similar myself.


----------



## tpi2007

The Robot said:


> Then everyone would complain that AMD is late as usual to respond to Nvidia. Sort of a catch-22 for AMD.



AMD was already almost nine months late to compete with the RTX 2070 and a full six months late compared to the RTX 2060.


----------



## magnek

looniam said:


> if you could measure the mounting holes? my uniblock maxes out at ~58.4mm and I'm curious.
> really thinking about popping out the vapor chamber and cutting a notch in the shroud.
> 
> TIA.


ಠ_ಠ







tpi2007 said:


> AMD was already almost nine months late to compete with the RTX 2070 and a full six months late compared to the RTX 2060.


They just priced it wrong. As I always like to say, there are no bad products, just bad pricing. If the 5700 XT had launched at $349, it'd be an instant hit.


----------



## looniam

magnek said:


> ಠ_ಠ
> 
> 
> Spoiler
> 
> 
> 
> https://www.youtube.com/watch?v=31g0YE61PLQ


what can i say?
been wanting to do another ghetto mod for awhile.
and i ain't scared of power tools. 


srly, a dremel and steady hand will make it look factory. 

you freaked when you saw tech jesus steve take a hammer to the one, eh?


----------



## keikei

NightAntilli said:


> https://www.youtube.com/watch?v=Ud8Bco0dk6Q



How can AMD design a halfway decent architecture, but fail so miserably at cooling it?


----------



## tpi2007

magnek said:


> They just priced it wrong. As I always like to say, there are no bad products, just bad pricing. If 5700XT launched at $349 it'd be an instant hit.



It would be ok. A real hit would be $329, in my opinion.

Anyway, in the end this probably has nothing to do with timing; it's more about garnering positive perceptions from the lower prices of blower cards compared to the competition. Because from what I'm seeing of AIB RTX 2060 Super prices where I am, getting a decent dual fan RX 5700 XT with decent thermals and acoustics is probably going to end up costing ~450-480 €, and considering that the RX 5700 XT has a higher board power, I'm leaning towards the higher end of that range.


----------



## AlphaC

WannaBeOCer said:


> We're going to continue to see Vega since it's a high performance computing architecture. While Navi is a ground up architecture for gaming.


Yes, I know Vega is a compute-oriented architecture. Navi is a graphics-oriented architecture similar to Polaris, Tonga, or Pitcairn (1st-gen GCN Pitcairn vs Kepler GK106 was a great accomplishment by AMD). Navi is basically AMD's Maxwell, where FP64 and non-graphical compute are more or less cut out in favor of higher clocks and a narrower memory bus clocked far higher.

The issue at hand is the mid-tier, where the 150W Radeon Pro WX 7100 has absolutely no shot in graphics against the 160W RTX 4000 (RTX 2070 on TU104) or the P2200 (GP106 with GDDR5X). The RX 5700 XT's Navi 10 is likely primarily a testbed for console & cloud Navi rather than for professional graphics, though. Fiji (R9 Fury) never made it to workstation pro graphics due to its 4GB frame buffer, as Hawaii already had a 512-bit bus with 8GB VRAM, so Fiji ended up exclusively in Radeon Instinct.

The ~$400 P2200 is a 1280 CUDA core GP106 chip with 5GB GDDR5X on a 160-bit bus, clocked to ~1500MHz boost to fit in a 75W power envelope. The ~$450 WX 7100 is Polaris 10 on a single slot blower constrained to 150W.

Unless they can bring in a professional 120-175W Navi at ~$550-750 and outperform the ~$900 RTX 4000 (pro version of RTX 2070 on TU104) across the board by something ridiculous such as 30-50% (it's been done before, but for both GPU driver sets: https://www.engineering.com/DesignS...ith-SOLIDWORKS-2019s-New-Graphics-Engine.aspx), AMD needs to address ray tracing. Unlike gamers, professional markets aren't going to put up with "FineWine" memes. 



Cloud Navi referenced in Investor presentation (Q3 2019 7nm Navi Cloud Gaming). http://ir.amd.com/static-files/9c985e84-bbb6-4e23-99bd-dcbb21f18592


What Jon Peddie research had to say about RTX :


https://www.jonpeddie.com/reviews/nvidias-most-accessible-turing-class-quadro-the-rtx-4000 said:


> As such, whether you as a buyer have ray tracing performance as a key criterion or not, it’s worth considering what the RTX 4000 can achieve in the non-traditional areas Nvidia chose to focus on: ray tracing and machine learning.
> ...
> In terms of price–performance, it delivers roughly the same scores/dollar (though that will likely improve over the product lifetime) as the P4000. And finally, it has about 9% better performance/watt than the P4000. By contrast, the P4000 delivered a very impressive 87%, 82%, and 113% improvements, respectively, over the M4000.
> ...
> For the first time since Fermi, Nvidia’s Turing is a graphics-first chip with costly features that don’t directly serve Nvidia’s traditional market focus of 3D raster graphics


AMD doesn't have much market penetration in Media/Entertainment (other than Mac) and physics solvers due to CUDA prevalence, but for engineering visualization (especially Solidworks + NX) they were the de facto choice when GCN came out. It's getting better in the Media/Entertainment space due to the Mac Pro and the overall DaVinci Resolve performance of Vega. Autodesk Maya and DS CATIA seem to favor Nvidia drivers; maybe it's due to Nvidia drivers forcing multi-threading via command lists.



https://www.pugetsystems.com/labs/articles/DaVinci-Resolve-14-NVIDIA-GeForce-vs-AMD-Radeon-Vega-1213/ said:


> The AMD Radeon cards give amazing performance for their cost in DaVinci Resolve and if they are what fit within your budget you should definitely consider using them. However, there are two main reasons why we won't be listing them in our own Resolve workstations. The first is simply because every single Resolve workstation we have sold over the last year has used a GTX 1080 Ti or higher GPU. That isn't to say that every Resolve user has the budget for a GTX 1080 Ti, but rather that our customers in particular overwhelmingly do have that budget.





https://www.pugetsystems.com/labs/articles/DaVinci-Resolve-15-AMD-Radeon-VII-16GB-Performance-1382/ said:


> The short answer is: YES! The AMD Radeon VII 16GB is an excellent GPU for DaVinci Resolve. We had issues with BRAW footage, but the fact that the Radeon VII has 16GB of VRAM and matches the RTX 2080 Ti 11GB in terms of single-GPU performance (and with a MSRP that is $500 less) makes it a very, very strong GPU for DaVinci Resolve.
> However, nothing is perfect and there are two issues with the Radeon VII that you may want to consider:
> 
> 
> First, supply is extremely tight at the moment. This usually improves over time, but if you are trying to get your hands on a Radeon VII at the time of this article, don't expect to get one quickly.
> Second, the cooler design is not very good for multi-GPU setups which are very common on high-end DaVinci Resolve workstations.


Current pro product stack:
Radeon Instinct MI60 (MI50 is cut down, but the MI60 is 60 CU) = Radeon VII
WX9100 = Vega 64, 16GB VRAM, 230W official TDP --- replaced Hawaii, FirePro W9100
WX8200 = Vega 56 (very late), ~225W --- replaced Hawaii, FirePro W8100
WX7100 (36 CU) = Polaris 10 (RX 480), 150W --- replaced Tonga, FirePro W7100
WX5100 (28 CU) = cut-down Polaris 10 (RX 470D), 75W TDP but clocks low: Firestrike / Timespy / Superposition only ~10% over the WX4100 despite double the CUs
WX4100 (16 CU) = Polaris 11 (RX 560)
WX3200 (10 CU) = RX 550 rehash, _APU level of CUs_
WX2100 (8 CU) = _APU level of CUs_

Key non-gaming metrics to look at:
* *Architecture & Engineering*: Revit, Autocad, Metashape, Archicad, Bentley Microstation, LabView (not very GPU-bound), Altium (not very GPU-bound), Sketchup
--> rendering: Maxwell, Lumion 3D, Corona, Autodesk VRED
* *Design & Manufacturing*: CATIA / Siemens NX / Solidworks / PTC Creo / SolidEdge: these all have pro driver optimizations
---> Autodesk Inventor is DirectX and I believe Fusion360 is as well
---> RTX 4000 claims 2x performance in Solidworks Visualize vs P4000 (GTX 1070 level) due to RTX & tensor cores
* *Media & Entertainment* (including VR): Autodesk Maya / After Effects / Premiere Pro / RealityCapture / Lightroom / ZBrush / Mudbox / Houdini / Modo / KeyShot / Pix4D / 3dsmax / Showcase / Blender / Cinema4D / Avid / Unreal Engine 4 / Unity / Rhino
--> rendering such as Octane (Radeon support is missing at the moment), Autodesk Arnold, V-Ray, etc.
* *Compute & solver integration*: CAE/CFD (ANSYS, Abaqus, ADINA, LS-DYNA, Simulia, COMSOL, NASTRAN/PATRAN), MATLAB+Simulink, Mathematica, PyTorch, TensorFlow, etc.
--> Radeon Instinct mentions TensorFlow, PyTorch, Caffe 2
--> see https://www.nvidia.com/content/dam/...-Quadro-RTX-4000-for-Real-Time-Simulation.pdf , https://www.nvidia.com/content/dam/...re/NVIDIA-Quadro-for-ANSYS-Discovery-Live.pdf
* *Virtualization* (pass-through performance / concurrent users): VMware, Xen, Linux driver support, etc.
* *Signage*: this probably leans more towards _display outputs_ than raw performance (Nvidia NVS / Radeon V1000 / embedded Polaris / embedded Vega / Vega on APUs)

https://www.pny.com/file library/su...o graphics cards/linecards/quadrolinecard.pdf

Market outlook:
Workstation by field, Q1 2019







https://www.cadalyst.com/hardware/w...-evolution-modern-workstation-46557?page_id=2
_some Vendors_
* BOXX -- https://www.boxx.com/systems/workstations/t-class , https://www.boxx.com/systems/workstations/x-class
* Velocity Micro --- https://www.velocitymicro.com/professional-workstations.php
* Pugetsystems (zero AMD products anymore) --- https://www.pugetsystems.com/nav/genesis/II/customize.php
* Xicomputer -- http://www.xicomputer.com/products/welcome.asp?content=mtowerone&microsite
* Scan UK -- https://www.scan.co.uk/shop/pro-graphics , https://www.scan.co.uk/3xs/custom/hd-4k-video-editing-pcs-laptops/workstations#anc , https://www.scan.co.uk/3xs/custom/cad-graphics-workstations/workstations#anc
* Exxact (not GPU-focused) -- https://www.exxactcorp.com/Standard-Workstations
* HP Z series -- https://store.hp.com/us/en/pdp/hp-z8-g4-workstation-customizable-z3z16av-1
* Dell Precision --- https://www.dell.com/en-us/work/sho...op-workstation/spd/precision-7920-workstation , https://www.dell.com/en-us/work/sho...op-workstation/spd/precision-5820-workstation

Your scope keeps focusing on one small section of the GPU market: deep learning. People can use AWS compute instances for that, for example, but for other things (especially air-gapped systems) that simply isn't possible. At this point, even if AMD made a 120W GPU that performs as fast as the RTX 2080 Ti in gaming, it would be pointless if all you do is compare deep learning performance, since the ROCm deep learning infrastructure isn't quite there.

That's the main problem with Navi: where the money is being made. Pro users are used to paying $500-1000 for "60 class cards", as the youtubers like to call them; gamers aren't. For Navi to succeed on desktop, it has to do a particular type of workload VERY well, I mean 1.5-2x sort of numbers, or 1.5-3x performance per watt. Remember the Tahiti HD 7950, R9 290, RX 470, & Vega 56 flying off the shelves due to mining? 

In industries where millions of dollars are being made, the $50 Navi price cuts make zero difference because the performance isn't there. AMD's graphics division isn't going to make real money off desktop GPUs, other than the Mac Pro. The real money is in consoles, Samsung mobile GPUs, and the Google Stadia shenanigans (eposvox claims Navi can do 6x HEVC streams, so it can replace Vega in that respect). Nvidia revenue is growing in datacenter and gaming; AMD graphics is growing in markets where branding matters less because decisions are made by OEMs.

Have you seen any announcement of OEMs adopting Navi? It doesn't seem to be catching on; it looks like a beta test product (akin to Fiji's R9 Fury), so to speak, until the 2nd RDNA GPU.



PontiacGTX said:


> found this
> link
> 
> if a reference Vega 56 scores 2200 at stock do you think that Navi improved the 32 bit int perf?
> (cut image)


It's something to do with the Tensor cores on Nvidia's GPUs. I think AMD figures on using the CPU for integer and the GPU for floating point (the CPU would use AVX).

On Navi, the main difference for this integer workload is probably the halving of the wavefront size to improve utilization. 

Navi is all about graphics (viewport)
https://hothardware.com/reviews/amd-navi-radeon-rx-5700-architecture , https://hexus.net/tech/news/graphics/131555-the-architecture-behind-amds-rdna-navi-gpus/
"What you need to know is that GCN is great for complex instructions often present in the scientific space - it's a fantastic calculator - but not so hot for gaming code unless pipelined very well. And that's the crux of it; GCN needs sufficiently complex work and excellent scheduling if it's to hit its rated throughput specification.
...

Rather than use a SIMD16 with a four-clock issue, RDNA uses dual SIMD32s with a single-clock issue, meaning that the Compute Unit can be kept better utilised for gaming code.
...
That said, you may wonder why all this focus on single-threaded performance and efficiency when graphics computation is actually a very parallel exercise. The reason is that, although there are tens of thousands of threads in flight, it's not easy keeping a GCN-type machine totally full across a wide range of diverse workloads, and that's the reason for the manifest changes in RDNA."
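The issue-rate difference described in that excerpt comes down to trivial arithmetic. Here's a back-of-envelope sketch (an illustration of the issue timing only, not a real GPU simulator):

```python
# GCN runs a 64-thread wavefront on a SIMD16 unit, so one instruction
# needs 64/16 = 4 clocks to issue; RDNA runs a 32-thread wavefront on
# a SIMD32 unit, so an instruction issues in a single clock.

def clocks_per_instruction(wavefront_size: int, simd_width: int) -> int:
    """Clocks needed to issue one instruction of one wavefront."""
    return wavefront_size // simd_width

gcn = clocks_per_instruction(64, 16)    # GCN: wave64 on SIMD16
rdna = clocks_per_instruction(32, 32)   # RDNA: wave32 on SIMD32

print(f"GCN: {gcn} clocks/instr, RDNA: {rdna} clock/instr")
```

Single-clock issue means a wavefront's results are available sooner, which is why short, dependent chains of gaming shader code keep RDNA's units fuller than GCN's.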


----------



## magnek

looniam said:


> what can i say?
> been wanting to do another ghetto mod for awhile.
> and i ain't scared of power tools.
> 
> 
> srly, a dremel and steady hand will make it look factory.
> 
> you freaked when you saw tech jesus steve take a hammer to the one, eh?


I'm just allergic to notches is all. 

Carry on.


----------



## tpi2007

I just came across a very interesting comment in the TPU article about custom RX 5700 series cards; I strongly suggest people read it and look at the pictures. It seems that AMD has had alternative open air, dual fan reference designs since the R9 290X days, but for some mind-boggling reason opted not to use them (they did use a triple fan design on the HD 7990). Oh, and look at the size of the Navi vapour chamber compared to the HD 7970's. And to think that they want $449 for the RX 5700 XT Anniversary Edition with a 235w board power.


----------



## tyvar

I strongly suspect that AMD sticks with blowers as a deliberate policy of not stepping on AIB toes. 

The blowers encourage everybody to wait for customs, and thus AIBs get more sales.


----------



## NightAntilli

tyvar said:


> I strongly suspect that AMD sticks with blowers as a deliberate policy of not stepping on AIB toes.
> 
> The blowers encourage everybody to wait for customs, and thus AIBs get more sales.


I agree with this... AIBs generally don't launch until well over a month later, so if AMD had good coolers, many more people would get a card before the AIBs release theirs. But I don't think this reasoning really flies... Because right now AMD is hurting their own launches with these blowers. If they released with a decent cooler, more people would buy their cards in general, and that would mean increased sales for AIBs as well. Right now, I wouldn't be surprised if some people simply get a Super card because of the blower issues rather than waiting a month for the AIBs.


----------



## criminal

Got the Accelero Twin Turbo III installed on my 5700XT. 

Idle is now 38C, GPU edge tops out at 57C under load, GPU hotspot at 82C. Much better, and quiet!


----------



## guttheslayer

ZealotKi11er said:


> We can't see the future but if Nvidia keeps adding RT and Tensor core they will have larger dies than AMD meaning worse yield and higher cost. 7nm helps Navi for sure but 12nm Navi would have used 10-20% more power but probably cost less. Let's get this clear. Nvidia would be using 7nm if they could and not because they can compete with 12nm. It is impossible for them to use 7nm for Turing. Even if Turing came by the end of 2019, anything over 350mm2 would have been too expensive.
> I like to believe AMD has learned from Vega and you can already see it. No more 1 GPU to do both datacenter/enterprise and gaming. I hope we don't have 2080 Ti priced GPU from Nvidia with their next gen.


Do you know how Turing scales? The RT cores and Tensor cores are tied to the CUDA cores in an SM at a fixed ratio (if they stick to the same layout and architecture). Increasing them would also mean increasing the CUDA cores, which is essential for standard gaming.


Like I said, they won't need to start off big. A 4096 CUDA core die with the same ratio of RT / Tensor cores would be significantly smaller (~340mm^2) on TSMC 7nm+. It would have fewer Tensor / RT cores than the 2080 Ti but would be faster due to the massively improved speed going from 12nm to 7nm+ (a two-step process node jump).


A 6144-core big die would be massive, but would still occupy a reasonable ~520mm^2, which is significantly less than GV100 or TU102.


----------



## EastCoast

criminal said:


> Got the accelero turbo III installed on my 5700XT.
> 
> Idle is now 38C, gpu edge load tops out at 57C, gpu hotspot tops out at 82C. Much better and quiet!


Can we see it through pics and/or video?


----------



## tpi2007

tyvar said:


> I strongly suspect that AMD sticks with blowers as a deliberate policy of not stepping on AIB toes.
> 
> The blowers encourage everybody to wait for customs, and thus AIBs get more sales.



I can understand that, but by not giving AIBs time to have their cards ready by launch, AMD either wants to have their cake and eat it too (i.e., they don't want to step on AIBs' toes with reference open air coolers, but end up doing it anyway via delayed AIB card availability, so that their first party blower cards have time to sell), or they are just incompetent and can't get the launches synchronized properly.


----------



## ZealotKi11er

guttheslayer said:


> Do you know how Turing scales? The RT cores and Tensor cores are tied to the CUDA cores in an SM at a fixed ratio (if they stick to the same layout and architecture). Increasing them would also mean increasing the CUDA cores, which is essential for standard gaming.
> 
> 
> Like I said, they won't need to start off big. A 4096 CUDA core die with the same ratio of RT / Tensor cores would be significantly smaller (~340mm^2) on TSMC 7nm+. It would have fewer Tensor / RT cores than the 2080 Ti but would be faster due to the massively improved speed going from 12nm to 7nm+ (a two-step process node jump).
> 
> 
> A 6144-core big die would be massive, but would still occupy a reasonable ~520mm^2, which is significantly less than GV100 or TU102.


Nvidia needs a GPU at least faster than 2080 Ti. I am not saying they can't do 7nm. I am saying RTX price will only go up with 7nm.


----------



## 113802

criminal said:


> Got the accelero turbo III installed on my 5700XT.
> 
> Idle is now 38C, gpu edge load tops out at 57C, gpu hotspot tops out at 82C. Much better and quiet!


Awesome! I'm curious to see how high it boosts now that it's nowhere near the 115C hotspot limit. How well does Navi undervolt?

Would be a perfect card to throw a Raijintek Morpheus II on.


----------



## tpi2007

ZealotKi11er said:


> Nvidia needs a GPU at least faster than 2080 Ti. I am not saying they can't do 7nm. I am saying RTX price will only go up with 7nm.




No.


https://www.anandtech.com/show/1349...ction-of-chips-using-its-7nm-euv-process-tech


> Samsung produces its 7LPP EUV chips at its Fab S3 in Hwaseong, South Korea. The company can process 1500 wafers a day on each of its ASML Twinscan NXE:3400B EUVL step and scan systems with a 280 W light source. Samsung does not say whether it uses pellicles that protect photomasks from degradation, but only indicates that *usage of EUV enables it to cut the number of masks it requires for a chip by 20%*. In addition, the company says that it had *developed a proprietary EUV mask inspection tool to perform early defect detection and eliminate flaws early in the manufacturing cycle (which will likely have a positive effect on yields)*.





> “With the introduction of its EUV process node, Samsung has led a quiet revolution in the semiconductor industry,” — said Charlie Bae, executive vice president of foundry sales and marketing team at Samsung Electronics. “This fundamental shift in how wafers are manufactured gives our customers the *opportunity to significantly improve their products’ time to market with superior throughput, reduced layers, and better yields.*





> Ultimately, EUVL is expected to reduce usage of multi-patterning when producing complex elements of a chip and therefore simplify design process, improve yields, and shrink cycle times (or rather not make them longer in the foreseeable future).














And judging by the ramp-up times of both Samsung 7nm EUV and TSMC 7nm to put out mass market products, if Nvidia really wanted to, I'd say it would be doable to launch Ampere in December of this year / January 2020.


----------



## magnek

criminal said:


> Got the accelero turbo III installed on my 5700XT.
> 
> Idle is now 38C, gpu edge load tops out at 57C, gpu hotspot tops out at 82C. Much better and quiet!


Stahhhhhhhhhhp, if you keep posting these I'm gonna have to buy a waterblock for this just so I can brag about my temps. 



tpi2007 said:


> No.
> 
> 
> https://www.anandtech.com/show/1349...ction-of-chips-using-its-7nm-euv-process-tech
> View attachment 279172
> 
> 
> 
> 
> 
> And judging by the ramp up times of both Samsung 7nm EUV and TSMC 7nm to put out mass market products, if Nvidia really wanted, I'd say it would be doable to launch Ampere in December of this year / January 2020.


Well you're assuming Leatherman is gonna pass on the savings; I don't see that happening unless AMD brings strong competition. Why lower your margins when the consumers will trip over themselves to buy your $1200 GPU?


----------



## tpi2007

magnek said:


> Well you're assuming Leatherman is gonna pass on the savings; I don't see that happening unless AMD brings strong competition. Why lower your margins when the consumers will trip over themselves to buy your $1200 GPU?



After the Kepler drama they appeased customers with Maxwell. I think they will do the same thing again. They've already admitted that Turing is priced too high.


----------



## AlphaC

Maxwell completely cut FP64 to 1/32 even on the larger die. Even TITAN X Maxwell was 1/32.


----------



## tpi2007

AlphaC said:


> Maxwell completely cut FP64 to 1/32 even on the larger die. Even TITAN X Maxwell was 1/32.



It went from 1:24 to 1:32. If that's what it takes to make Ampere more palatable than Turing for the consumer market, I'll take it, although it's been at 1:32 for a while now, so they'll just have to get creative elsewhere, I'd say.
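For a sense of scale, here's what those ratios mean in raw throughput. The 6 TFLOPS FP32 figure is a made-up round number purely for illustration; only the 1:N divisors come from the discussion above:

```python
def fp64_tflops(fp32_tflops: float, divisor: int) -> float:
    """FP64 throughput for a card with a 1:N FP64 rate."""
    return fp32_tflops / divisor

# The same hypothetical 6 TFLOPS FP32 card under each ratio discussed:
for label, divisor in [("1:3  (original Titan)", 3),
                       ("1:24 (consumer Kepler)", 24),
                       ("1:32 (Maxwell onward)", 32)]:
    print(f"{label}: {fp64_tflops(6.0, divisor):.4f} TFLOPS FP64")
```

The jump from 1:24 to 1:32 is small next to the gulf between either of those and the original Titan's 1:3, which is why the Titan's FP64 actually justified its price to some buyers.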


----------



## AlphaC

Original TITAN (Kepler) was 1/3


----------



## guttheslayer

ZealotKi11er said:


> Nvidia needs a GPU at least faster than 2080 Ti. I am not saying they can't do 7nm. I am saying RTX price will only go up with 7nm.


They used 12nm for Turing because they're going straight to 7nm+, which comes a year after 7nm. EUV promises lower operating cost per wafer as well.



4096 Turing cores clocked at a 2GHz boost (7nm+) would definitely be a lot faster than the 2080 Ti's 4352 cores at 1545MHz (I'd estimate probably 15-20%). And that's a 340mm^2 die vs a 754mm^2 die.


Here's some simple math for you: by cutting the die size in half, you slash the cost by more than half. You extract >100% more dies due to better yield and less wasted space at the edge of the wafer. 7nm wafers might cost more, but I'm sure the chip will be cheaper as a result.
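That claim checks out on paper. A rough sketch using the standard gross-dies-per-wafer approximation and a simple Poisson yield model (the defect density here is a made-up illustrative figure, not TSMC's actual number):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Gross dies per wafer: wafer area / die area, minus edge loss."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

D0 = 0.001  # hypothetical defect density (defects/mm^2), illustration only
for area in (754, 340):  # TU102-sized die vs the ~340 mm^2 estimate above
    good = dies_per_wafer(area) * yield_rate(area, D0)
    print(f"{area} mm^2: {dies_per_wafer(area)} gross, ~{good:.0f} good dies/wafer")
```

With these numbers the smaller die yields well over twice as many gross dies per wafer, and since larger dies are also more likely to contain a fatal defect, the good-die count improves by even more than the area ratio alone would suggest.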


----------



## tpi2007

AlphaC said:


> Original TITAN (Kepler) was 1/3



Yes, and consumer cards like the 680, 780 and 780 Ti are 1:24.


----------



## magnek

tpi2007 said:


> After the Kepler drama they appeased customers with Maxwell. I think they will do the same thing again. They've already admitted that Turing is priced too high.


Yeah, but like AlphaC said, they completely neutered double precision, and of course there's the Forceman's Law card lol. As with Intel, don't expect nVidia to give you deals without a giant * attached. The 980 Ti only existed because of the Fury X; I do not for a second believe it would've been priced at $649 otherwise (or even existed, for that matter). Yeah, I definitely owe AMD one...



tpi2007 said:


> Yes, and consumer cards like the 680, 780 and 780 Ti are 1:24.


I think his point was more that for $1000, you at least had something to show for it. Or conversely, if you had no need for FP64, you could find equivalent hardware (sans VRAM, I guess) for 65-70% of the price. Where is my $840 GTX 2080 Ti without all the useless gimmicky RT cores?

(inb4 something about machine learning/AI and Tensorflow lol)


----------



## JackCY

For all those blower lovers and defenders:

https://youtu.be/3bmQPx9EJLA?t=504


----------



## AlphaC

TITAN X Pascal was a ripoff initially.
When AMD put out Vega FE with pro driver switching, the TITAN X Pascal and TITAN Xp then gained driver optimizations for Solidworks and such.
https://www.eteknix.com/nvidia-385-12-driver-unlocks-titan-xp-prosumer-performance/ ; https://blogs.nvidia.com/blog/2017/07/31/titan-xp-drivers-new-levels-of-performance-for-creatives/

Catia – *72%* Increase
Creo – *107%* Increase
Energy – *54%* Increase
Medical – *53%* Increase
Solidworks – *95%* Increase

Siemens NX went from unusable <10FPS to ~ 70FPS.

Now Nvidia is making their "Creator driver" for Turing & Titan XP that boosts ~ 9% in Adobe products via software.

AMD should have done something similar with the Radeon VII. All AMD did was unlock FP64 (not even to the full 1/2 rate, only 1/4), so it was too small a concession on a $700 card, and it doesn't really make sense for the consumer market, where what matters is applications such as Maya & 3dsmax, which _didn't see any change from the Titan XP driver unlock_.


There is little rationale for enabling a high FP64 rate on Radeon VII, whereas creatives working on game mods (i.e. using Blender/3ds Max/Unreal Engine/Unity/Maya/ZBrush/Mudbox/etc.) are more likely to buy a gaming GPU.
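For scale, here's a rough back-of-the-envelope of what that 1/4-rate unlock buys. This is only a sketch: the 3840-shader count is Radeon VII's published spec, but the 1.75 GHz clock is an assumed typical boost, so treat the results as ballpark peak figures.

```python
# Rough FP64 throughput estimate for Radeon VII's driver unlock.
# 3840 stream processors is the published spec; 1.75 GHz is an
# assumed typical boost clock, so these are ballpark numbers only.

def tflops(shaders, clock_ghz, flops_per_clock=2):
    """Peak TFLOPS: shaders * clock * 2 (a fused multiply-add counts as 2 FLOPs)."""
    return shaders * clock_ghz * flops_per_clock / 1000

shaders = 3840
clock = 1.75  # GHz, assumed boost

fp32 = tflops(shaders, clock)
fp64_quarter = fp32 / 4   # the rate AMD actually unlocked
fp64_half = fp32 / 2      # the silicon's full MI50-class rate

print(f"FP32:       {fp32:.2f} TFLOPS")
print(f"FP64 (1/4): {fp64_quarter:.2f} TFLOPS")
print(f"FP64 (1/2): {fp64_half:.2f} TFLOPS")
```

Even at the limited 1/4 rate that is still roughly 3.4 TFLOPS of FP64, far beyond any GeForce consumer card's 1/32-rate throughput.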


----------



## guttheslayer

magnek said:


> I think his point was more for $1000, you at least had something to show for it. Or conversely, if you had no need for FP64, you could find equivalent hardware (sans VRAM I guess) for 65-70% the price. Where is my $840 GTX 2080 Ti without all the useless gimmicky RT cores?
> 
> (inb4 something about machine learning/AI and Tensorflow lol)



To be very honest, the RT cores and TENSOR cores actually add IPC to the architecture.

I am not sure how, but comparing the GTX 1080 and RTX 2070 SUPER, both have the same 2560 CUDA cores, yet for some reason the SUPER is a lot faster even though they are very similarly clocked.


----------



## AlphaC

That has more to do with the concurrent float+int. That's why the GTX 1660 Ti without RTX gets more than GTX 1070 performance.


----------



## magnek

guttheslayer said:


> To be very honest, the RT cores and TENSOR cores actually add IPC to the architecture.
> 
> I am not sure how, but comparing the GTX 1080 and RTX 2070 SUPER, both have the same 2560 CUDA cores, yet for some reason the SUPER is a lot faster even though they are very similarly clocked.


That's more to do with the substantial architecture revision between Pascal and Turing than the RT and tensor hardware directly. Kepler to Maxwell improved "IPC" by 35% without needing to introduce any extra hardware, so likely the IPC uplift is also because of the architecture. Also GDDR6 vs GDDR5X.


----------



## Caffinator

Can't wait to get my hands on the RTX 2070 Super from Gigabyte - they have a white edition just like the 2080! Looks clean


----------



## tpi2007

magnek said:


> Yeah but like alphacool said they completely neutered double precision, and of course there's the Forceman's Law card lol. As with Intel, don't expect nVidia to give you deals without a giant * attached. *980 Ti only existed because of Fury X*, I do not for a second believe it would've been priced at $649 otherwise (or even existed for that matter). Yeah I definitely owe AMD one...
> 
> 
> 
> I think his point was more for $1000, you at least had something to show for it. Or conversely, if you had no need for FP64, you could find equivalent hardware (sans VRAM I guess) for 65-70% the price. Where is my $840 GTX 2080 Ti without all the useless gimmicky RT cores?
> 
> (inb4 something about machine learning/AI and Tensorflow lol)



Bold for emphasis. And what was the reason for the 1080 Ti to exist?


----------



## huzzug

tpi2007 said:


> Then comes Vega, and here comes the talk of undervolting to make the power consumption less obscene compared to the 1080. That just seems like poor product binning at the factory and puts the burden and risk of not getting any better results in the hands of the buyer.


AMD also bundled Vega with Freesync screens to justify their prices.


----------



## dieanotherday

same story again right? underwhelming?


----------



## magnek

tpi2007 said:


> Bold for emphasis. And what was the reason for the 1080 Ti to exist?


Not sure what you're getting at here, but definitely NOT Vega. 

Key difference is, the 980 Ti came only 3 months after the Titan X because of the impending Fury X launch, making Maxwell's Titan X the shortest-lived and most pointless and foolish Titan purchase ever. Whereas the 1080 Ti came a full 10 months later, after the 1080 cow had been milked dry and 1080 buyers were hungry for more performance, so they'd happily shell out another $700.


----------



## JackCY

AlphaC said:


> TITAN X Pascal was a ripoff initially.
> When AMD put out Vega FE with pro-driver switching, the TITAN X Pascal and Titan Xp then gained driver optimizations for SolidWorks and the like.
> https://www.eteknix.com/nvidia-385-12-driver-unlocks-titan-xp-prosumer-performance/ ; https://blogs.nvidia.com/blog/2017/07/31/titan-xp-drivers-new-levels-of-performance-for-creatives/
> 
> Catia – *72%* Increase
> Creo – *107%* Increase
> Energy – *54%* Increase
> Medical – *53%* Increase
> Solidworks – *95%* Increase
> 
> Siemens NX went from unusable <10FPS to ~ 70FPS.
> 
> Now Nvidia is making their "Creator driver" for Turing & Titan Xp that boosts Adobe products by ~9% via software.
> 
> AMD should have done something similar with the Radeon VII. All AMD did was unlock FP64 (not even to the silicon's full 1/2 rate, only 1/4), which was too small a concession on a $700 card and doesn't really make sense for the consumer market, where what matters is applications such as Maya and 3ds Max that saw no change at all from the Titan Xp driver unlock.
> 
> 
> There is little rationale for enabling a high FP64 rate on Radeon VII, whereas creatives working on game mods (i.e. using Blender/3ds Max/Unreal Engine/Unity/Maya/ZBrush/Mudbox/etc.) are more likely to buy a gaming GPU.


Do you have any source that goes into the details of the Studio driver, i.e. what precisely does it improve and add? All I've seen so far is marketing mumbo jumbo and basic tests showing no improvement in apps.
Does it add 10-bit support to GeForce (OpenGL 10-bit for Photoshop etc.)? Which apps run faster with the Studio driver compared to the regular one, and vice versa?


----------



## ilmazzo

Caffinator said:


> Can't wait to get my hands on the RTX 2070 Super from Gigabyte - they have a white edition just like the 2080! Looks clean


very on topic

well, the last few pages are way off... go on talking about Nvidia please


----------



## ToTheSun!

ilmazzo said:


> very on topic
> 
> well, the last few pages are way off... go on talking about Nvidia please


Sweet irony.


----------



## guttheslayer

magnek said:


> That's more to do with the substantial architecture revision between Pascal and Turing than the RT and tensor hardware directly. Kepler to Maxwell improved "IPC" by 35% without needing to introduce any extra hardware, so likely the IPC uplift is also because of the architecture. Also GDDR6 vs GDDR5X.


Not really, I didn't see any changes in the block diagram from Pascal to Turing; there were big changes from Kepler to Maxwell though, the latter becoming a lot more scalar as well. If I'm not wrong, Turing allows asynchronous processing with its RT/Tensor cores, which is why the IPC is improved.


Also, GDDR has no effect on performance except to provide enough bandwidth not to bottleneck the GPU, and G5X to G6 is just a memory bandwidth increase.
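For what it's worth, the bandwidth side of that comparison is easy to quantify. A quick sketch using the cards' published per-pin data rates and bus widths (peak bandwidth = data rate × bus width / 8):

```python
def peak_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s from per-pin data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

# Published specs for the cards under discussion.
cards = {
    "GTX 1080 (GDDR5X)":      (10, 256),
    "RTX 2070 Super (GDDR6)": (14, 256),
    "RX 5700 XT (GDDR6)":     (14, 256),
}
for name, (rate, bus) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(rate, bus):.0f} GB/s")
```

So the G5X-to-G6 move on the same 256-bit bus is a 320 → 448 GB/s jump, a 40% bandwidth increase on top of whatever the architecture changed.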


----------



## tpi2007

magnek said:


> Not sure what you're getting at here, but definitely NOT Vega.
> 
> Key difference is, the 980 Ti came only 3 months after the Titan X because of the impending Fury X launch, making Maxwell's Titan X the shortest-lived and most pointless and foolish Titan purchase ever. Whereas the 1080 Ti came a full 10 months later, after the 1080 cow had been milked dry and 1080 buyers were hungry for more performance, so they'd happily shell out another $700.



Well, exactly: if the theory was that Nvidia wouldn't do anything without competition from AMD, why would the 1080 Ti exist? Which is to say, why wouldn't Nvidia provide lower prices with Ampere on 7nm EUV? And by the way, passing the cost savings of the process on to consumers also applies to AMD; I think they too will have lower prices on 7nm EUV or 6nm. Neither came out looking too good this generation in my opinion, so they'll do it for consumers, not because of each other.


----------



## keikei

dieanotherday said:


> same story again right? underwhelming?


Absolutely not. You are getting solid 1440p cards here for a good price. The 5700 is the best perf/$ midrange card at the moment, and the 5700 XT is close to 1080 Ti performance for $400. The only big criticism is the garbage reference cooler, so wait a bit for aftermarket cards in mid-August. For the most part, gamers are very happy.


----------



## ilmazzo

Any quality image review anywhere for the sharpening feature? Maybe something comparing it to the DLSS?


----------



## tpi2007

ASUS custom cards will only be available sometime in September: https://www.techpowerup.com/257269/asus-to-release-custom-navi-gpus-in-september



> In a blog post on Edge UP, ASUS said that "Our initial Navi offerings will use AMD's reference cooler design and clock speeds, but we'll be tweaking, tuning, and powering up these new Radeons with coolers of our own design soon. Stay tuned for more details in September."


----------



## keikei

ilmazzo said:


> Any quality image review anywhere for the sharpening feature? Maybe something comparing it to the DLSS?


Oddly, I haven't found anything yet even though there is a recent driver for it. The tech 'just works', so devs don't need to do anything, and there's virtually no performance impact. Here's an example from AMD's presentation, but we should be seeing better examples. As a Vega owner, I'm a little peeved we didn't get the tech, but then again it's not a Navi card. We got the anti-lag though! https://hothardware.com/news/amd-radeon-rx-5700-image-sharpening-comparison


----------



## Leopardi

ilmazzo said:


> Any quality image review anywhere for the sharpening feature? Maybe something comparing it to the DLSS?







https://www.youtube.com/watch?v=Yi-_T3vsv-Q

CAS is a whole lot better


----------



## ilmazzo

keikei said:


> Oddly, I haven't found anything yet even though there is a recent driver for it. The tech 'just works', so devs don't need to do anything, and there's virtually no performance impact. Here's an example from AMD's presentation, but we should be seeing better examples. As a Vega owner, I'm a little peeved we didn't get the tech, but then again it's not a Navi card. We got the anti-lag though! https://hothardware.com/news/amd-radeon-rx-5700-image-sharpening-comparison





Leopardi said:


> https://www.youtube.com/watch?v=Yi-_T3vsv-Q
> 
> CAS is whole lot better


much appreciated, I'll take a look into it asap


----------



## ilmazzo

yep....underwhelming.....


----------



## Frosted racquet

ilmazzo said:


> Any quality image review anywhere for the sharpening feature? Maybe something comparing it to the DLSS?


----------



## Hwgeek

RIP DLSS!
So you can get a 30% performance uplift without quality loss, or double the FPS with slight image softness, thanks to Navi's image sharpening,
all with the click of a button, unlike DLSS.

So with the RX 5700 XT you can get a "4K" gameplay experience at RTX 2080 Ti performance levels


----------



## tpi2007

Rendering at 1800p instead of 4K + FXAA and upscaling had already put DLSS in the recycle bin long ago. DLSS is trying to solve a problem in a convoluted way when a much simpler and effective method (because it's not game dependent) already existed.
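The arithmetic behind that trick is simple. A quick sketch (assuming "1800p" means 3200x1800, the usual 16:9 figure):

```python
# Pixel-count arithmetic behind "render at 1800p, upscale to 4K".
res_4k = 3840 * 2160      # 8,294,400 pixels
res_1800p = 3200 * 1800   # 5,760,000 pixels

ratio = res_1800p / res_4k
print(f"1800p renders {ratio:.1%} of 4K's pixels")        # ~69.4%
print(f"i.e. roughly {1 - ratio:.0%} less shading work per frame")
```

That ~30% reduction in shaded pixels is in the same ballpark as DLSS's claimed performance uplift, which is exactly why the upscale-plus-post-AA comparison is so unflattering for it.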


----------



## ToTheSun!

tpi2007 said:


> Rendering at 1800p instead of 4K + FXAA and upscaling had already put DLSS in the recycle bin long ago. DLSS is trying to solve a problem in a convoluted way when a much simpler and effective method (because it's not game dependent) already existed.


Yeah. DLSS was DOA. Hwgeek has been asleep.


----------



## Hwgeek

Sry, 1st year as a CS student, no more gaming for me :-(.

Also - with this upscaling tech - AMD can offer a tiny Navi silicon under $200 that will be able to offer ~RTX 2070 performance; who would say no to that?


----------



## AlphaC

JackCY said:


> Do you have any source that goes into detail of difference for studio driver, aka what precisely does it improve and add? All I could see so far are just marketing mumbo jumbo and basic tests showing no improvement in apps.
> Does it add 10bit support to GeForce (OGL 10bit for Photoshop etc.)? What apps run faster with studio driver compared to regular and vice versa.


Puget Systems tested it and the difference was within the margin of error: https://www.pugetsystems.com/labs/a...ey-faster-in-Premiere-Pro-and-Photoshop-1392/


> Creator Ready drivers get extra testing in top creative applications to ensure performance and stability
> Creator Ready drivers have all the game optimizations that are present in the "Game Ready" drivers, but they may be a few revisions behind
> GeForce Experience software can be set to auto-update based on whether you want to use the "Creator Ready" or "Game Ready" drivers



Anyway, I think AMD needs to do something to address the creative market without resorting to Radeon Pro, maybe unlocking Autodesk Maya/3ds Max performance on Navi, as those aren't compute- or precision-oriented apps. I have a feeling most Autodesk apps these days are DirectX.


----------



## grifers

Hi!! My card arrived today (Sapphire 5700 XT) and FreeSync doesn't work :S. It shows as enabled in Adrenalin under Video, but it doesn't work. With my previous card (Vega 64) there was no problem. Is this normal?

P.S. - Sorry for my language, I'm using Google Translate. My monitor is a BenQ EX3203R (FreeSync 2 monitor)

Thanks


----------



## keikei

grifers said:


> Hi!! My card arrived today (Sapphire 5700 XT) and FreeSync doesn't work :S. It shows as enabled in Adrenalin under Video, but it doesn't work. With my previous card (Vega 64) there was no problem. Is this normal?
> 
> P.S. - Sorry for my language, I'm using Google Translate. My monitor is a BenQ EX3203R (FreeSync 2 monitor)
> 
> Thanks


It's not normal. Try a reinstall, and if the issue persists, then it's a driver bug. You may want to report it to AMD as well.


----------



## Frosted racquet

grifers said:


> Hi!! My card arrived today (Sapphire 5700 XT) and FreeSync doesn't work :S. It shows as enabled in Adrenalin under Video, but it doesn't work. With my previous card (Vega 64) there was no problem. Is this normal?
> 
> P.S. - Sorry for my language, I'm using Google Translate. My monitor is a BenQ EX3203R (FreeSync 2 monitor)
> 
> Thanks


Just in case, check the monitor's FreeSync settings to see if it's on. Disable and re-enable it just in case.


----------



## Heuchler

tpi2007 said:


> ASUS custom cards will only be available sometime in September: https://www.techpowerup.com/257269/asus-to-release-custom-navi-gpus-in-september


"Custom AIB designs will be hitting the market ~mid August" -Scott Herkelman (on Reddit)
https://www.reddit.com/r/Amd/comments/catck3/psa_5700_series_custom_aib_designs/

ASUS custom Radeon design usually means either a GeForce cooler slapped on OR a VRM thermal pad not making any contact.


----------



## magnek

guttheslayer said:


> Not really, I didnt see any changes from the block diagram from Pascal to Turing, there is big changes from Kepler to Maxwell though, the latter become alot more scalar as well. If I am not wrong Turing allows asynchronous processing with their RT/TENSOR cores, which is why the IPC is improved.
> 
> 
> Also GDDR have no effect on the performance except that it provide enough bandwidth not to bottleneck the GPU, and G5X to G6 is just memory bandwidth increase.


Don't take my word for it, see what Anandtech has to say: https://www.anandtech.com/show/13282/nvidia-turing-architecture-deep-dive/4



> Diving straight into the microarchitecture, the new Turing SM looks very different to the Pascal SM, but those who’ve been keeping track of Volta will notice a lot of similarities to the NVIDIA’s more recent microarchitecture. In fact, on a high-level, the Turing SM is fundamentally the same, with the notable exception of a new IP block: the RT Core. Putting the RT Cores and Tensor Cores aside for now, the most drastic changes from Pascal are same ones that differentiated Volta from Pascal. Turing’s advanced shading features are also in the same bucket in needing explicit developer support.
> 
> Like Volta, the Turing SM is partitioned into 4 sub-cores (or processing blocks) with each sub-core having a single warp scheduler and dispatch unit, as opposed Pascal’s 2 partition setup with two dispatch ports per sub-core warp scheduler. There are some fairly major implications with change, and broadly-speaking this means that Volta/Turing loses the capability to issue a second, non-dependent instruction from a thread for a single clock cycle. Turing is presumably identical to Volta performing instructions over two cycles but with schedulers that can issue an independent instruction every cycle, so ultimately Turing can maintain 2-way instruction level parallelism (ILP) this way, while still having twice the amount of schedulers over Pascal.
> 
> Like we saw in Volta, these changes go hand-in-hand with the new scheduling/execution model with independent thread scheduling that Turing also has, though differences were not disclosed at this time. Rather than per-warp like Pascal, Volta and Turing have per-thread scheduling resources, with a program counter and stack per-thread to track thread state, as well as a convergence optimizer to intelligently group active same-warp threads together into SIMT units. So all threads are equally concurrent, regardless of warp, and can yield and reconverge.
> 
> In terms of the CUDA cores and ALUs, the Turing sub-core has 16 INT32 cores, 16 FP32 cores, and 2 Tensor Cores, the same setup as the Volta sub-core. With the split INT/FP datapath model like Volta, Turing can also concurrently execute FP and INT instructions, which as we will see, is much more relevant with the RT cores involved. Where Turing differs is in lacking Volta’s full complement of FP64 cores, instead having a token amount (2 per SM) for compatibility reasons and resulting in FP64 throughput being 1/32 the TFLOP rate of FP32. Maimed FP64 is standard for NVIDIA’s consumer GPUs, but what has not been standard until now is Turing’s full 2x FP16 throughput, which was available in GP100 but was crippled in the other Pascal GPUs.


You may also like to note that the intra-SM memory subsystem has been revamped.

The end result, as nVidia claims, is 50% higher shading performance.


----------



## grifers

keikei said:


> It's not normal. Try a reinstall, and if the issue persists, then it's a driver bug. You may want to report it to AMD as well.





Frosted racquet said:


> Just in case, check the monitor's FreeSync settings to see if it's on. Disable and re-enable it just in case.


Hi. By default my monitor has FreeSync enabled (Normal or Premium). I reinstalled the drivers and it works fine again.

For comparison (with my old Vega 64): in games like Battlefield V, Assassin's Creed Origins, Far Cry 5, and Quantum Break, performance is a little faster. In games like Rage 2, Metro Exodus, and Shadow of the Tomb Raider, performance is marginally faster. In games like GTA 5, Watch Dogs 2, and Just Cause 4, performance is quite a bit faster.

That's the 5700 XT at stock vs. a reference Vega 64 undervolted (1630 MHz at 1000 mV core, and the HBM memory at 1100 MHz and 1000 mV).

Hope you all understand my poor English haha


----------



## magnek

tpi2007 said:


> Well, exactly: if the theory was that Nvidia wouldn't do anything without competition from AMD, why would the 1080 Ti exist? Which is to say, why wouldn't Nvidia provide lower prices with Ampere on 7nm EUV? And by the way, passing the cost savings of the process on to consumers also applies to AMD; I think they too will have lower prices on 7nm EUV or 6nm. Neither came out looking too good this generation in my opinion, so they'll do it for consumers, not because of each other.


Well I've already pointed out the differences between the 980 Ti and 1080 Ti. One came just 3 months after its Titan, the other came 7 months later. One was priced competitively; the other technically had an MSRP of $700, but good luck finding one at that price. Had Vega been competitive, the 1080 Ti launch would've looked a lot more like the 980 Ti's.

Why would nVidia not provide lower prices? Because as I've said, if they can get away with charging higher prices, _why should they lower prices?_ If AMD does pass on the cost savings AND assuming the cards are competitive, then yes there will be incentive for nVidia to not gouge as much, but then that's strictly because of competition. If AMD also follows in nVidia's steps and doesn't pass on the cost savings, well we're finished. To be honest, given the pricing trends of Vega and Radeon VII, I'm really not convinced AMD is going to be significantly cheaper. Sure it might have 10% better perf/$, but at that point it's more or less a wash.


----------



## kd5151

Would like to see users results with auto undervolting and manual undervolting. Mainly with the 5700 non xt.


----------



## EastCoast

Leopardi said:


> https://www.youtube.com/watch?v=Yi-_T3vsv-Q
> 
> CAS is whole lot better













It's even better than DLSS.

Hint:
Look on top of the tank.


----------



## chas1723

Is the Anniversary Edition just for sale on AMD's website? Do you think it's worth the $50 premium? I will be matching it with a 3700X or 3800X. I will also be slapping a block on everything, as I already have a full custom loop.

Sent from my SM-N950U using Tapatalk


----------



## tyvar

tpi2007 said:


> I can understand that, but by not giving AIBs time to have their cards ready by launch, AMD is either wanting to have their cake and eat it (aka, they don't want to step on AIBs' toes with reference open-air coolers, but then end up doing it anyway with delayed AIB card availability, so that their first-party blower cards have time to sell), or they are just incompetent and can't get the launches synchronized properly.


I think it's because there is still a third player involved: Nvidia. AMD wanted something out ahead of, or contemporary with, the Super release.

A lot of people now are seeing that the 5700 and 5700 XT, even with horrible coolers, trade blows with a 2060S or 2070S in certain titles, at lower cost. Those people might just wait the extra month.

If AMD had absolutely zilch out, I think the number of people who would wait would be a lot smaller.


----------



## bigjdubb

That DLSS stuff is so crappy. The canvas roll on the back of the tank looks like something from CSGO.


----------



## Asmodian

I am not at all impressed by the sharpening... I've spent too much time looking at various sharpening filters for video, and it has the classic over-sharpened look.

DLSS is bad too; I want DLSS that runs as pure AA on native-resolution rendering, not this upscaling crap.


----------



## doom26464

DLSS is trash. Just a market gimmick to sell tensor cores aka left over AI parts of the die.

AMD sharpening actually has some use to it. More press should be testing this and point that out.


----------



## 113802

doom26464 said:


> DLSS is trash. Just a market gimmick to sell tensor cores aka left over AI parts of the die.
> 
> AMD sharpening actually has some use to it. More press should be testing this and point that out.


They should review FreeStyle vs Radeon Image Sharpening, since AMD is working on a DLSS competitor using DirectML.


----------



## EastCoast

Another example. Note the artifacts in the water, which is supposed to be a reflection of the mountain range.
Among other obvious areas.


----------



## 113802

EastCoast said:


> Another example. Note the artifacts in the water, which is supposed to be a reflection of the mountain range.
> Among other obvious areas.


I'd like to see DLSS with Freestyle sharpening enabled against it.


----------



## grifers

Not HDR in Netflix app (From Microsoft store). Vega 64 did not have either.


----------



## magnek

Asmodian said:


> I am not at all impressed by the sharpening... too much time looking at various sharpening filters for video, it has the classic look of over sharpened.
> 
> DLSS is bad too, I want DLSS that runs as pure AA on native resolution rendering, not this upscaling crap.


Apart from actually increasing the monitor's resolution, you still can't beat good ol' OGSSAA. If there was one reason to go severely overkill on the GPU side it would've been for OGSSAA. Too bad traditional MSAA died and most post-DX9 games don't support it anymore.


----------



## Asmodian

magnek said:


> Apart from actually increasing the monitor's resolution, you still can't beat good ol' OGSSAA. If there was one reason to go severely overkill on the GPU side it would've been for OGSSAA. Too bad traditional MSAA died and most post-DX9 games don't support it anymore.


I think a neural net trained to AA a 4K image, using a 64-sample SSAA version as the ground truth, could do a very good job without needing to render at an even higher resolution. If only we had that instead of something that is also trying to upscale from 1440p. At 1440p we simply do not have enough pixels to generate a useful 4K image with a trained neural net, e.g. all the shimmer in small details. Actually running 8x8 SSAA would be even better, of course, but the performance hit from needing to render at 30720x17280 would be pretty extreme.

I am not sure how AMD's sharpening is even comparable to DLSS, AMD is not upscaling and does not attempt to do AA with RIS. Why are people comparing AMD rendering to 4K and then sharpening to Nvidia upscaling to 4K and attempting to do AA with a neural net? Shouldn't we be comparing AMD running at 1440p with RIS to "4K" DLSS? 



doom26464 said:


> AMD sharpening actually has some use to it. More press should be testing this and point that out.


Yes, because applying a contrast-adaptive sharpening filter after rendering is so new and helpful.
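(For anyone curious what "contrast-adaptive" actually means here, below is a toy NumPy sketch of the idea. This is not AMD's actual CAS shader — the real one is an open-sourced FidelityFX pixel shader with a different weighting scheme — just the gist: sharpen less where local contrast is already high, so you avoid the classic over-sharpened halos.)

```python
import numpy as np

def cas_like_sharpen(img, strength=0.2):
    """Toy contrast-adaptive sharpening on a 2D grayscale array in [0, 1].

    Per pixel: look at the 4-neighbour min/max, derive a weight that is
    large in flat regions and small where contrast is already high, then
    apply a cross-shaped sharpening kernel scaled by that weight.
    """
    p = np.pad(img, 1, mode="edge")
    n, s = p[:-2, 1:-1], p[2:, 1:-1]   # north / south neighbours
    w, e = p[1:-1, :-2], p[1:-1, 2:]   # west / east neighbours
    c = img

    lo = np.minimum.reduce([n, s, w, e, c])
    hi = np.maximum.reduce([n, s, w, e, c])

    # Adaptive amount: headroom above/below the local extremes, so pixels
    # whose neighbourhood already spans a wide range get little extra boost.
    amp = np.sqrt(np.clip(np.minimum(lo, 1.0 - hi) / np.maximum(hi, 1e-6), 0.0, 1.0))

    # Standard cross sharpening kernel, modulated per-pixel by amp.
    out = c + strength * amp * (4.0 * c - n - s - w - e)
    return np.clip(out, 0.0, 1.0)
```

On a hard black/white edge `amp` collapses to zero and the pixel passes through untouched, which is precisely the behaviour that separates this from a naive unsharp mask.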


----------



## EastCoast

Asmodian said:


> Yes, because applying a contrast-adaptive sharpening filter after rendering is so new and helpful.


With a performance hit of 1-2 frames, vs. ReShade's performance hit... vs. DLSS blurring... vs. nothing at all... it is.


----------



## magnek

Why are we even comparing RIS to DLSS when DSR is still a thing? Did everybody forget DSR still exists as a "sorta kinda" spiritual successor to OGSSAA? Or better yet, compare it to SMAA?

Yeah I get RIS is not AA and does not claim to be AA, but if the end results are largely the same (specifically SMAA), wouldn't that be a much more worthwhile comparison?
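As a refresher on what DSR-style downsampling actually does: render above the target resolution, then filter down. A minimal sketch with a plain 2x2 box filter (DSR's real filter is a tunable Gaussian; the box average here is just the simplest stand-in for the idea):

```python
import numpy as np

def downsample_2x(img):
    """Average each 2x2 block: a crude stand-in for DSR's downscale pass.

    Assumes even dimensions. The real DSR filter is a tunable Gaussian;
    a plain box average is just the simplest way to show the idea.
    """
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# "4x DSR" in miniature: shade at 2x the width and height of the target,
# then filter down to the displayed resolution.
hi_res = np.random.rand(8, 8)   # stand-in for the 2x-resolution render
lo_res = downsample_2x(hi_res)  # the frame actually displayed
print(lo_res.shape)             # (4, 4)
```

Every displayed pixel is the average of four shaded samples, which is exactly why it behaves like ordered-grid supersampling, and exactly why it costs 4x the shading work.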


----------



## 113802

EastCoast said:


> With a performance hit of 1-2 frames, vs. ReShade's performance hit... vs. DLSS blurring... vs. nothing at all... it is. :D


nVidia FreeStyle does the same thing as RIS and more. AMD is working on a DLSS competitor using DirectML, and DLSS can be used with FreeStyle's sharpening.


----------



## maltamonk

WannaBeOCer said:


> nVidia FreeStyle does the same thing as RIS and more. AMD is working on a DLSS competitor using DirectML, and DLSS can be used with FreeStyle's sharpening.


I had to look up freestyle... What is the point of dlss if freestyle can do the same thing w/o the performance hit?


----------



## 113802

maltamonk said:


> I had to look up freestyle... What is the point of dlss if freestyle can do the same thing w/o the performance hit?


One super-samples, the other sharpens. If I had an RX 5700 XT I would enable VSR along with RIS.


----------



## maltamonk

OK... knowing what we know about filters, if we can do it in real time... then why not focus on them?


----------



## tpi2007

tyvar said:


> I think it's because there is still a third player involved: Nvidia. AMD wanted something out ahead of, or contemporary with, the Super release.
> 
> A lot of people now are seeing that the 5700 and 5700 XT, even with horrible coolers, trade blows with a 2060S or 2070S in certain titles, at lower cost. Those people might just wait the extra month.
> 
> If AMD had absolutely zilch out, I think the number of people who would wait would be a lot smaller.



Fair point, although that would mean that Nvidia wasn't exactly responding with Super to AMD's Navi. Or if they were, it still means that AMD can't get a synchronized release with AIBs without Nvidia knowing.


----------



## ZealotKi11er

tpi2007 said:


> Fair point, although that would mean that Nvidia wasn't exactly responding with Super to AMD's Navi. Or if they were, it still means that AMD can't get a synchronized release with AIB's without Nvidia knowing.


It takes time to validate each GPU. AMD does its own testing with the reference GPU before giving it to AIBs. The only time AIBs launch at the same time is when it's something like Super, where Nvidia is just holding back the launch for AIBs to get ready. AMD doesn't have time to hold back.


----------



## ilmazzo

Nvidia gave AMD time due to their RTX bet. Navi seems a strong arch, matching Turing's rasterization performance; only the reference blower is holding it back a little, which is easily solved by AIBs next month or by liquid cooling it. Big Navi will be a gaming monster (but I won't be able to afford it, lol). Thanks nVidia! Oh, and thanks Raja!!!!!


----------



## Gunderman456

AMD needs to play Nvidia's game by releasing cards that can boost all the way to their rated speeds while staying cool with a better cooler.

AIBs should then earn their money with better components/cooling to permit max overclocks.

That's how these two should be segmented and always working for the benefit of the consumer. 

AMD can't continue to keep crippling their reference cards because many people will just grab Nvidia's reference cards (when they're not gouging with their founder's cards).


----------



## dagget3450

Has anyone seen anything on the Anniversary edition rx 5700xt? I wanted to see it vs a regular 5700xt but i'm not finding anything.


----------



## maltamonk

Gunderman456 said:


> AMD needs to play Nvidia's game by releasing cards that can boost all the way to their rated speeds while staying cool with a better cooler.
> 
> AIBs should then earn their money with better components/cooling to permit max overclocks.
> 
> That's how these two should be segmented and always working for the benefit of the consumer.
> 
> AMD can't continue to keep crippling their reference cards because many people will just grab Nvidia's reference cards (when they're not gouging with their founder's cards).


From what I've read of the AIB Super cards, they are a worse option than the Nvidia reference design this time around. Yes, they are cooler and quieter, but they don't boost/perform any better. So we have AMD, where the AIBs are the better choice, and Nvidia, where the reference is the better choice. This kinda screws the AIB partners this time around for the Super cards, but works well for them for the 5700 cards.


----------



## bigjdubb

maltamonk said:


> From what I've read of the AIB Super cards, they are a worse option than the Nvidia reference design this time around. Yes, they are cooler and quieter, but they don't boost/perform any better. So we have AMD, where the AIBs are the better choice, and Nvidia, where the reference is the better choice. This kinda screws the AIB partners this time around for the Super cards, but works well for them for the 5700 cards.


Wait a minute! Cooler and quieter with the same boost/performance IS better. I'm not sure anyone else would think that with performance being equal, the louder/hotter card is the better option....


----------



## maltamonk

bigjdubb said:


> Wait a minute! Cooler and quieter with the same boost/performance IS better. I'm not sure anyone else would think that with performance being equal, the louder/hotter card is the better option....


While I understand that historically we get cooler, quieter, AND better-performing AIB models, they are actually performing worse (in some cases) due to binning (and already being maxed out) this time around on Nvidia's side.


----------



## ilmazzo

maltamonk said:


> Gunderman456 said:
> 
> 
> 
> AMD needs to play Nvidia's game by releasing cards that can boost all the way to their rated speeds while staying cool with a better cooler.
> 
> AIBs should then earn their money with better components/cooling to permit max overclocks.
> 
> That's how these two should be segmented and always working for the benefit of the consumer.
> 
> AMD can't continue to keep crippling their reference cards because many people will just grab Nvidia's reference cards (when they're not gouging with their founder's cards).
> 
> 
> 
> From what I've read, the AIB Super cards are a worse option than the Nvidia reference design this time around. Yes, they're cooler and quieter, but they don't boost/perform any better. So we have AMD, where the AIB cards are the better choice, and Nvidia, where the reference is the better choice. This kinda screws the AIB partners on the Super cards, but works well for them on the 5700 cards.
Click to expand...

The only “problem” with the reference Navi is the noise level required to let the card stretch its legs.

So any AIB card delivering the cooling needed to boost to 2000+ MHz below 50 dB will be fine.

Reference Nvidia cards cost $100 more than entry-level AIB cards, so they're pointless; they're custom cards, but from Nvidia.


----------



## EastCoast

Softmod for 5700 reviewed using a 390 cooler.


----------



## bigjdubb

Wow, I didn't catch on that he was using a 5700 and not an XT until he started talking about the xt. Getting the 5700 up to 2080 performance levels is pretty impressive.


----------



## EastCoast

bigjdubb said:


> Wow, I didn't catch on that he was using a 5700 and not an XT until he started talking about the xt. Getting the 5700 up to 2080 performance levels is pretty impressive.


But if you believe what was said... the 5700 series has some "catching up to do" :doh:


----------



## bigjdubb

I didn't actually listen to most of it, the introduction caused me to mute and start forwarding until graphs started popping up.


----------



## EastCoast

bigjdubb said:


> I didn't actually listen to most of it, the introduction caused me to mute and start forwarding until graphs started popping up.


LOL, cringy and annoying isn't it?








5700xt.


----------



## ilmazzo

Love this guy.


----------



## EastCoast

Radeon 7 is rumored to be EOL


----------



## bigjdubb

I'm curious to see how the 5700 XT does at 4K with the powerplay tables; it seems to fall flat on its face at 4K in the reviews I read (not many of them). That was the main reason I decided to keep the RVII instead of selling it to buy an AIB 5700 XT (the cooler on the RVII is a touch loud).


----------



## Diffident

bigjdubb said:


> Wow, I didn't catch on that he was using a 5700 and not an XT until he started talking about the xt. Getting the 5700 up to 2080 performance levels is pretty impressive.



He has a 5700; it was Igor's 5700 XT that he mentions getting close to a 2080. His 5700 with the mod gets an 8682 graphics score in Time Spy, which he shows is lower than a 5700 XT's. He also shows a chart with the 2080 FE at a score of 11138. He isn't getting a 5700 close to a 2080.


----------



## bigjdubb

Makes more sense. See what happens when these youtubers go full tard at the beginning of their videos.... causing me to skip around and miss most of the important information.


----------



## AlphaC

maltamonk said:


> I understand that historically we get cooler, quieter, AND better-performing AIB models. But this time around, on Nvidia's side, they're actually performing worse (in some cases) due to binning (the chips are already maxed out).


I've seen AIB RTX cards go down to 1200-1500RPM at load. The Nvidia cooler seems to be 1500-1800RPM without fan stop.


----------



## 113802

Diffident said:


> He has a 5700; it was Igor's 5700 XT that he mentions getting close to a 2080. His 5700 with the mod gets an 8682 graphics score in Time Spy, which he shows is lower than a 5700 XT's. He also shows a chart with the 2080 FE at a score of 11138. He isn't getting a 5700 close to a 2080.


Is that 11138 in Time Spy a single-GPU graphics score or the total score? I doubt he got near 11k with an RX 5700 XT.


----------



## Ha-Nocri

Diffident said:


> He has a 5700; it was Igor's 5700 XT that he mentions getting close to a 2080. His 5700 with the mod gets an 8682 graphics score in Time Spy, which he shows is lower than a 5700 XT's. He also shows a chart with the 2080 FE at a score of 11138. He isn't getting a 5700 close to a 2080.


This is true, but there is no way to increase voltage manually atm. He said that a small bump in voltage is added with the power table, but I'm not sure about that. I don't see a reason the 5700 wouldn't be able to go over 2000 MHz, as 5700 XTs are hitting 2100+.


----------



## ilmazzo

2200+ actually (Igor's liquid-cooled one)


----------



## bigjdubb

I misunderstood.


----------



## Diffident

WannaBeOCer said:


> A score of 11138 in Time Spy for a single GPU score or total? I doubt he got near 11k with a RX 5700 XT.



The chart he shows in the video was of a 2080 FE with a graphics score of 11138. The getting "close" to a 2080 with a 5700 XT was just in Tomb Raider, not in Time Spy.


----------



## Ultracarpet

In Igor's video he mentioned that the super aggressive powerplay mod that got his card to 2200+ MHz wasn't for everyday use. Is 1.25 V actually that high for these GPUs? I haven't overclocked a GPU since the 290X, so I guess I'm out of the loop.


----------



## bigjdubb

Diffident said:


> The chart he shows in the video was of a 2080 FE with a graphics score of 11138. The getting "close" to a 2080 with a 5700 XT was just in Tomb Raider, not in Time Spy.


Yes. I was referring to the Tomb Raider chart.


----------



## ToTheSun!

Well, at 2200+ MHz, surely a 5700 XT would be 2080ti ballpark in Frostbite titles, at least. That's a more humbling comparison than Tomb Raider.


----------



## mouacyk

ToTheSun! said:


> Well, at 2200+ MHz, surely a 5700 XT would be *2080ti* ballpark in Frostbite titles, at least. That's a more humbling comparison than Tomb Raider.


gotta be a typo


----------



## 113802

mouacyk said:


> ToTheSun! said:
> 
> 
> 
> Well, at 2200+ MHz, surely a 5700 XT would be *2080ti* ballpark in Frostbite titles, at least. That's a more humbling comparison than Tomb Raider.
> 
> 
> 
> gotta be a typo
Click to expand...

Not a typo: https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/10.html


----------



## EastCoast

One retailer indicates that he had around 1000 units for launch
https://forums.overclockers.co.uk/posts/32841426/

I'm not sure they would allocate that much inventory space if the price was going to be what we thought it was before Jebaited.


----------



## mouacyk

WannaBeOCer said:


> Not a typo: https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/10.html


That doesn't look very representative of most games, so stretching it to DICE titles may be a bit much. We all know what happens when you use an overpowered GPU at a sufficiently low resolution/setting -- the GPU usage drops. The first cause is usually a CPU/RAM bottleneck, followed by the GPU kicking into a lower power mode (Adaptive Power mode is the default).


----------



## PontiacGTX

mouacyk said:


> That doesn't look very representative of most games, so stretching it to DICE titles may be a bit much. We all know what happens when you use an overpowered GPU at a sufficiently low resolution/setting -- the GPU usage drops. *The first cause is usually a CPU/RAM bottleneck*, followed by the GPU kicking into a lower power mode (Adaptive Power mode is the default).


More like the cause is the bottleneck the API creates in how it handles those frames. I'm still wondering if this is true on DX12/Vulkan, but in DX11 it is.


----------



## magnek

mouacyk said:


> That doesn't look very representative of most games, so stretching it to DICE titles may be a bit much. We all know what happens when you use an overpowered GPU at a sufficiently low resolution/setting -- the GPU usage drops. The first cause is usually a CPU/RAM bottleneck, followed by the GPU kicking into a lower power mode (Adaptive Power mode is the default).


Uhhh 1440p isn't exactly "low resolution". I don't know if this is game specific or general to Frostbite engine titles, so more testing will need to be done.


----------



## maltamonk

Seems to be favouring Frostbite, but really, who cares. It's priced the same as the 2060 Super. The mere mention of a comparison to a 2080 is phenomenal.


----------



## Blackops_2

Didn't know if anyone had seen this, but Paul is apparently stoked. 2100 MHz core clock with a 90% power limit, drawing 230 W.


----------



## magnek

maltamonk said:


> Seems to be favouring Frostbite, but really, who cares. It's priced the same as the 2060 Super. The mere mention of a comparison to a 2080 is phenomenal.


Well, that's the favorable extreme; the unfavorable extreme is UE4, where the 5700 XT falls behind a 2060. Everything else lands between the 2070 and the 2070 Super, some titles closer or equal to the 2070, others closer or on par with the 2070 Super.


----------



## Blackops_2




----------



## Hwgeek

If we can judge by the reviews, the RX 5700 was running at 1650~1700 MHz on average, so if AIB cards can OC to at least 2100 MHz, that's ~23% OC headroom and will bring performance above the Radeon VII!


----------



## EastCoast

Hwgeek said:


> If we can judge by the reviews, the RX 5700 was running at 1650~1700 MHz on average, so if AIB cards can OC to at least 2100 MHz, that's ~23% OC headroom and will bring performance above the Radeon VII!


There is still the pesky silicon lottery to contend with though.


-------------------------------





https://www.ekwb.com/news/ek-vector-blocks-engineered-for-amd-navi-gpus/


----------



## kd5151

Was watching a guy on YouTube. His 5700 XT @ 1900 MHz was drawing 132 watts undervolted to 966 mV. That's GPU-only power draw, I know.

I would love to see the cards locked at 1400 MHz or so and see how they compare to Polaris and all the Vega GPUs. ^_^

Navi is very efficient as long as you're not pumping 1.2 volts into it and trying to hit over 2000 MHz. It's got quite a range.
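As a rough rule of thumb, dynamic power scales with voltage squared times frequency, which is why a figure like 132 W at 966 mV is plausible. A minimal sketch of that scaling (the 1.2 V baseline here is an assumption taken from the thread, not a measured stock value):

```python
def dynamic_power_ratio(v_new, v_ref, f_new=1.0, f_ref=1.0):
    """Approximate dynamic power scaling: P is proportional to V^2 * f."""
    return (v_new / v_ref) ** 2 * (f_new / f_ref)

# Undervolting from an assumed 1.2 V to 0.966 V at the same clock:
ratio = dynamic_power_ratio(0.966, 1.2)
print(f"{ratio:.2f}")  # roughly 0.65, i.e. about a 35% cut in core power
```

Static leakage and memory power don't follow this curve, so the real-world saving is usually a bit smaller than the formula suggests.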


----------



## 113802

kd5151 said:


> Was watching a guy on YouTube. His 5700 XT @ 1900 MHz was drawing 132 watts undervolted to 966 mV. That's GPU-only power draw, I know.
> 
> I would love to see the cards locked at 1400 MHz or so and see how they compare to Polaris and all the Vega GPUs. ^_^
> 
> Navi is very efficient as long as you're not pumping 1.2 volts into it and trying to hit over 2000 MHz. It's got quite a range.


I believe it. My Vega 64 LC undervolted. 

https://youtu.be/_-RmzKtHTic


----------



## NightAntilli




----------



## kd5151

WannaBeOCer said:


> I believe it. My Vega 64 LC undervolted.
> 
> https://youtu.be/_-RmzKtHTic


Well, the card was boosting to 1820 MHz and the fan was locked at like 50%. Sorry, on mobile and got a little ahead of myself.

AMD cards shine when undervolted! C'mon reviewers. Wake up!


----------



## tpi2007

If all cards were able to do this, it would seem that there is an overvolting department over at AMD that needs to be fired asap.


----------



## AlphaC

Higher default voltage allows for higher yields. It's probably the binning team that needs to be fired lol.


Also that youtuber may have gotten a golden sample.


----------



## 113802

kd5151 said:


> Well, the card was boosting to 1820 MHz and the fan was locked at like 50%. Sorry, on mobile and got a little ahead of myself.
> 
> AMD cards shine when undervolted! C'mon reviewers. Wake up!


Just undervolted my Radeon VII to 900 mV @ 1650MHz/1200. It used between 80-140W.

https://www.3dmark.com/3dm/37628591?


----------



## ZealotKi11er

tpi2007 said:


> If all cards were able to do this, it would seem that there is an overvolting department over at AMD that needs to be fired asap.


No. All cards support this to different degrees, even Nvidia. My 1080 Ti did 1860 MHz @ 0.875 V locked. Stock was hitting the same clocks but using 0.975-1.06 V. Also, it's all about performance. The way it works is that you have slow parts, typical parts, and fast parts. They all have to hit a certain level of perf. Some slow parts get cut down (the non-XT 5700) and the best parts usually go to special editions (XT 50th, Vega 64 LC). Nvidia's Turing FE cards were fast parts.


----------



## magnek

tpi2007 said:


> If all cards were able to do this, it would seem that there is an overvolting department over at AMD that needs to be *fired* asap.


Fired or fried? 

Or did you mean fired literally?


----------



## kd5151

WannaBeOCer said:


> Just undervolted my Radeon VII to 900mV @ 1650Mhz/1200. Used between 80-140w
> 
> https://www.3dmark.com/3dm/37628591?


The guy I was referring to originally also tested his Radeon VII against a Vega 56 and Vega 64 at the same clocks, but with much lower voltage on the Radeon VII.


----------



## tpi2007

ZealotKi11er said:


> No. All cards support this to different degrees, even Nvidia. My 1080 Ti did 1860 MHz @ 0.875 V locked. Stock was hitting the same clocks but using 0.975-1.06 V. Also, it's all about performance. The way it works is that you have slow parts, typical parts, and fast parts. They all have to hit a certain level of perf. Some slow parts get cut down (the non-XT 5700) and the best parts usually go to special editions (XT 50th, Vega 64 LC). Nvidia's Turing FE cards were fast parts.



Soo... yes? It's basically what you said after saying "No."





magnek said:


> Fired or fried?
> 
> Or did you mean fired literally?



Fired (aka dismissed), but not literally, as there is no such thing as an "overvolting department" over at AMD. But if there is, yes, they should all be fired.

What I'm basically saying is that getting a Navi card in hopes that undervolting will make it use way less power and clock way better is like hoping that every triple-core Phenom II from back in the day would unlock to a quad core and, if it did, work reliably. It may, or it may not.


----------



## JackCY

It's hot and loud out of the box because AMD overvolts their GPUs and puts blowers on them. Undervolting is a form of OCing and in essence not a stock, out-of-the-box configuration.
NV cards also pull close to nothing when undervolted by 200 mV.

It's like someone at AMD decided back in 2012, when GCN launched, to set their GPU Vcore to 1.2 V and roll with it until 2020 or later.


----------



## bigjdubb

JackCY said:


> It's like someone at AMD decided back in 2012, when GCN launched, to set their GPU Vcore to 1.2 V and roll with it until 2020 or later.


Maybe 1.2 volts is the happy place between performance, power consumption, and yield for AMD's designs. We don't really know how well all of the cards undervolt, because only a percentage of overclockers undervolt and only a small percentage of buyers overclock.


----------



## EastCoast

I recall way back that they used higher voltage to compensate for different PC cases or something like that. It allowed the gpu (at the time) to run well even when ambient temps were not optimal. 
But that was a long time ago. I've not read why they still do it.


----------



## magnek

tpi2007 said:


> Fired (aka dismissed), but not literally, as there is no such thing as an "overvolting department" over at AMD. But if there is, yes, they should all be fired.
> 
> What I'm basically saying is that getting a Navi card in hopes that by undervolting it will use way less power and clock way better is like hoping that every triple core Phenom II from back in the day will unlock to a quad core and if it does, works reliably. It may, or it may not.


I was being facetious. Overvolting = higher temps = crispy and fried (chicken) taken to an extreme. Could also start fires, hence fired literally.


----------



## tpi2007

magnek said:


> I was being facetious. Overvolting = higher temps = crispy and fried (chicken) taken to an extreme. Could also start fires, hence fired literally.



Oh, ok, I guess that does make sense lol.


----------



## runwiththedevil

I've seen that the 5700 XT benefits quite a bit from undervolting, but what about the non-XT model?


----------



## Damage Inc

You just gotta love AMD/ATi. They always let you put a finishing touch to their cards. File down a corner here, put some extra washers there and you might just bring the temps down to 95C.


----------



## ZealotKi11er

runwiththedevil said:


> I've seen that the 5700 XT benefits quite a bit from undervolting, but what about the non-XT model?


Probably less. The 5700 is already running at lower clocks with a slower fan.


----------



## Hwgeek

*Bykski Water Block for AMD Radeon RX 5700 / 5700XT is out for US $96.61-$105.84*
https://www.aliexpress.com/item/4000004998452.html


----------



## Blackops_2

2150 on the core with powerplay tables and an aftermarket cooler. These cards could be phenomenal under water. A Lightning will be my go-to if they make one. At the same time, I hope we get rumors of Big Navi somehow coming in early 2020. If it actually scales well, it could be monstrous; albeit late to market, if it comes in at a cheaper price than a 2080/2080S, it's a win/win.


----------



## 113802

Blackops_2 said:


> 2150 on the core with powerplay tables and an aftermarket cooler. These cards could be phenomenal under water. A Lightning will be my go-to if they make one. At the same time, I hope we get rumors of Big Navi somehow coming in early 2020. If it actually scales well, it could be monstrous; albeit late to market, if it comes in at a cheaper price than a 2080/2080S, it's a win/win.


Doesn't seem like it scales as well as Vega 10 when overclocking. I'm pretty sure my Vega 64 would crush a 2100 MHz Navi 10 GPU.


----------



## ilmazzo

WannaBeOCer said:


> Blackops_2 said:
> 
> 
> 
> > 2150 on the core with powerplay tables and an aftermarket cooler. These cards could be phenomenal under water. A Lightning will be my go-to if they make one. At the same time, I hope we get rumors of Big Navi somehow coming in early 2020. If it actually scales well, it could be monstrous; albeit late to market, if it comes in at a cheaper price than a 2080/2080S, it's a win/win.
> 
> 
> 
> > Doesn't seem like it scales as well as Vega 10 when overclocking. I'm pretty sure my Vega 64 would crush a 2100 MHz Navi 10 GPU.
Click to expand...

You know... 64 CUs vs 40...

Vega right now is a mid-tier performer; Navi is AMD's Maxwell moment.


----------



## Hwgeek

I think we will learn more on the earnings call on Jul 23rd; Lisa will talk about upcoming new products/timelines.


----------



## ZealotKi11er

WannaBeOCer said:


> Doesn't seem like it scales as well as Vega 10 when overclocking. I'm pretty sure my Vega 64 would crush a 2100 MHz Navi 10 GPU.


Why Vega 10? You can take a Radeon VII to those clock speeds. Try something like Time Spy. I get 9.8K with the stock cooler and memory at only 900 MHz. Vega has much faster memory, especially the Radeon VII.


----------



## Newbie2009

Vega 64 gets clobbered by Navi, no amount of overclocking will change that.


----------



## 113802

Newbie2009 said:


> Vega 64 gets clobbered by Navi, no amount of overclocking will change that.


Guess you never ran your Vega 64 at 1800Mhz.


----------



## Newbie2009

WannaBeOCer said:


> Guess you never ran your Vega 64 at 1800Mhz.


1760mhz max, I doubt 50mhz over stock LC will blow my socks off lol


----------



## 113802

Newbie2009 said:


> 1760mhz max, I doubt 50mhz over stock LC will blow my socks off lol


Even though it's clocked at 1760 MHz, what does it actually run at? I'm talking about sustaining 1800 MHz. It's pretty much even with a Radeon VII at stock.


----------



## Newbie2009

WannaBeOCer said:


> Even though it's clocked at 1760 MHz, what does it actually run at? I'm talking about sustaining 1800 MHz. It's pretty much even with a Radeon VII at stock.


About 1720mhz under water. 1145mhz on the HBM.


----------



## 113802

Newbie2009 said:


> About 1720mhz under water. 1145mhz on the HBM.


Exactly, my RX Vega 64 can run at 1800Mhz sustained.


----------



## Newbie2009

WannaBeOCer said:


> Exactly, my RX Vega 64 can run at 1800Mhz sustained.
> 
> https://www.youtube.com/watch?v=GXDced_nNPw&t=57s


80 MHz over mine, not exactly impressive. For a Vega it's good, but I found memory OC yielded more performance than core anyway.
It might beat mine in some benchmarks, but for gaming I'd take Navi every day of the week. It's about 20% faster overall than Vega and the GTX 1080.

https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/28.html


----------



## 113802

Newbie2009 said:


> 80 MHz over mine, not exactly impressive. For a Vega it's good, but I found memory OC yielded more performance than core anyway.
> It might beat mine in some benchmarks, but for gaming I'd take Navi every day of the week. It's about 20% faster overall than Vega and the GTX 1080.
> 
> https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/28.html


The 5700 XT at stock is 15% faster than a stock Vega 64 at 1440p. The reason people said Vega benefited more from memory overclocking is that the core couldn't overclock. I bugged mine into running 1800 MHz sustained and did testing. Here is my Radeon VII testing at 1820MHz/1145MHz and 1900MHz/1000MHz, and of course the higher clock speed was faster.


----------



## NightAntilli

Blackops_2 said:


> 2150 on the core with powerplay tables and an aftermarket cooler. These cards could be phenomenal under water. A lightening will be my go to if they make one. At the same time i hope we get rumor of Big Navi somehow coming early 2020. If it actually scales well it could be monstrous, albeit late to the market but if it's at a cheaper price than a 2080/2080S it's a win/win.
> 
> https://www.youtube.com/watch?v=Ux79hBmmWq0


I doubt it would scale poorly. The main weakness of GCN has been ironed out with RDNA; it's the reason 40 CUs can compete with 64 CUs. The architectural efficiency has gone through the roof, and there should be far fewer idling stream processors. At least not nearly as many as before.

The main concern for big Navi would be its power consumption. But I guess it will depend on what sacrifices AMD makes on clock speeds. The wider they go, the lower the clock speeds need to be to keep power in check. So there is an optimum somewhere, and AMD will shoot for that.
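The width-versus-clock tradeoff above can be sketched with toy numbers. The CU counts, clocks, and voltages below are made up purely to illustrate the idea, not real Navi figures:

```python
def perf(cus, mhz):
    # Idealized throughput: scales with shader width times clock.
    return cus * mhz

def power(cus, mhz, volts):
    # Idealized dynamic power: scales with width * frequency * voltage^2.
    return cus * mhz * volts ** 2

# Hypothetical designs: a narrow chip pushed hard vs a wide chip clocked low.
narrow = dict(cus=40, mhz=2000, volts=1.20)
wide = dict(cus=64, mhz=1250, volts=0.90)

for name, cfg in (("narrow", narrow), ("wide", wide)):
    p = perf(cfg["cus"], cfg["mhz"])
    w = power(**cfg)
    print(name, p, round(p / w, 3))
```

Both toy configs deliver the same theoretical throughput, but the wide, low-clocked part comes out well ahead on perf per watt because the lower clock needs less voltage; that sweet spot is what a big-die design would be hunting for.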


----------



## ilmazzo

They will stick with legacy HBM2 on the top tier; that will help on the consumption side but not on cost... oh well, we'll see what the end of the year brings to the table...


----------



## Blackops_2

NightAntilli said:


> I doubt it would scale poorly. The main weakness of GCN has been ironed out with RDNA; it's the reason 40 CUs can compete with 64 CUs. The architectural efficiency has gone through the roof, and there should be far fewer idling stream processors. At least not nearly as many as before.
> 
> The main concern for big Navi would be its power consumption. But I guess it will depend on what sacrifices AMD makes on clock speeds. The wider they go, the lower the clock speeds need to be to keep power in check. So there is an optimum somewhere, and AMD will shoot for that.


Personally, if it performed as well as the 2080 Ti for $600, I honestly wouldn't care if it sat at 500W. I know most won't see it that way, and I do like efficiency, but if it brings parity to the market, I don't know that I'll care. HBM, as mentioned above, would certainly help, albeit make it more expensive. Either way, exciting times ahead, because RDNA is looking better and better. I just hope AMD can be ahead of schedule for once.


----------



## JackCY

Maybe by 2030 they will also fix their H264 encoder, by which time H264 will likely be "discontinued".


----------



## 113802

ilmazzo said:


> They will stick with legacy HBM2 on the top tier; that will help on the consumption side but not on cost... oh well, we'll see what the end of the year brings to the table...


I only see them using HBM2 with their compute cards, just like nVidia. That means RDNA won't get HBM2. The other question is whether they're going to release their Vega successor as an affordable prosumer card or restrict it to Radeon Pro/Instinct.


----------



## PontiacGTX

WannaBeOCer said:


> I only see them using HBM2 with their compute cards, just like nVidia. That means RDNA won't get HBM2. The other question is whether they're going to release their Vega successor as an affordable prosumer card or restrict it to Radeon Pro/Instinct.


Do we know if Navi would benefit from using HBM2 like Vega did? Did any review test memory overclocking performance gains?


----------



## Hwgeek

Blackops_2 said:


> Personally, if it performed as well as the 2080 Ti for $600, I honestly wouldn't care if it sat at 500W. I know most won't see it that way, and I do like efficiency, but if it brings parity to the market, I don't know that I'll care. HBM, as mentioned above, would certainly help, albeit make it more expensive. Either way, exciting times ahead, because RDNA is looking better and better. I just hope AMD can be ahead of schedule for once.


Where do you live? Siberia?

I had a 2700X + Zotac AMP Extreme 1080 Ti, and I could not play a game without turning the central AC on. No way will I ever use an over-500W gaming PC again.


----------



## Blackops_2

Hwgeek said:


> Where do you live? Siberia?
> 
> I had a 2700X + Zotac AMP Extreme 1080 Ti, and I could not play a game without turning the central AC on. No way will I ever use an over-500W gaming PC again.


Mississippi actually lol i just crank the AC.


----------



## bigjdubb

Hwgeek said:


> Where do you live? Siberia?
> 
> I had a 2700X + Zotac AMP Extreme 1080 Ti, and I could not play a game without turning the central AC on. No way will I ever use an over-500W gaming PC again.


Do you live in a place where you can be indoors without the AC on? I can't even think about playing a game without AC, not even on my phone.


----------



## Hwgeek

We only need AC in the summer, but over 500W next to me is too much. I'd prefer the 7nm refresh to give me the same performance at half the power; that's good enough for me, and I don't mind losing a little sharpness with RIS upscaling to get good FPS at 4K+. In winter, maybe.

I did think about a watercooling mod so I could push all the hot air out the window; that's the only way I'd run a 500W+ gaming PC.


----------



## 113802

Hwgeek said:


> We only need AC in the summer, but over 500W next to me is too much. I'd prefer the 7nm refresh to give me the same performance at half the power; that's good enough for me, and I don't mind losing a little sharpness with RIS upscaling to get good FPS at 4K+. In winter, maybe.
> 
> I did think about a watercooling mod so I could push all the hot air out the window; that's the only way I'd run a 500W+ gaming PC.


But AMD's 7nm parts use more power than nVidia's 12nm parts. Or are you waiting for nVidia's 7nm parts? You could also just use nVidia's FreeStyle.


----------



## AlphaC

Blackops_2 said:


> Personally if it performed as well as the 2080Ti for $600 i honestly wouldn't care if it sat at 500W. I know most wont see it that way and i do like efficiency but if it brings parity to the market idk that i'll care. HBM as mentioned above would certainly help albeit make it more expensive. Either way exciting times ahead because RDNA is looking better and better. I just hope AMD can be ahead of schedule for once.



The cost of the cooling would outweigh any savings on the GPU...

Anyway, it's not feasible to make a 500W consumer GPU; 400W is more or less the limit unless it uses additional power connectors, even on a reference card. Just cooling it with air would be a tall order without resorting to a vapor chamber with heatpipes on top of it; the Sapphire Nitro+ designed for the Vega 64 has enough heatpipes to dissipate 300W into the fin stack quickly (3x 8mm + 5x 6mm heatpipes, including the two for the VRM).

The Nitro+ Vega 64 had 3x 8-pins = 450W available, not including the 75W from the PCIe slot. For comparison, a standard reference-style card with two 8-pins gets:

75W PCIe slot
150W 8-pin
150W 8-pin
-----------
375W
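The connector math above can be sanity-checked against the PCIe spec limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin); a quick sketch:

```python
# PCIe spec power limits, in watts.
PCIE_SLOT = 75
SIX_PIN = 75
EIGHT_PIN = 150

def board_power_budget(eight_pin=0, six_pin=0, slot=True):
    """Maximum spec-compliant board power for a given connector layout."""
    return (PCIE_SLOT if slot else 0) + eight_pin * EIGHT_PIN + six_pin * SIX_PIN

print(board_power_budget(eight_pin=2))  # 375 -- the reference-style breakdown above
print(board_power_budget(eight_pin=3))  # 525 -- a Nitro+ Vega 64 style layout
```

These are spec ceilings, not what cards actually draw; well-built boards routinely exceed them when overclocked, at the PSU's mercy.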


----------



## 113802

AlphaC said:


> The cost of the cooling would outweigh any savings on the GPU...
> 
> Anyway, it's not feasible to make a 500W consumer GPU; 400W is more or less the limit unless it uses additional power connectors, even on a reference card. Just cooling it with air would be a tall order without resorting to a vapor chamber with heatpipes on top of it; the Sapphire Nitro+ designed for the Vega 64 has enough heatpipes to dissipate 300W into the fin stack quickly (3x 8mm + 5x 6mm heatpipes, including the two for the VRM).
> 
> The Nitro+ Vega 64 had 3x 8-pins = 450W available, not including the 75W from the PCIe slot. For comparison, a standard reference-style card with two 8-pins gets:
> 
> 75W PCIe slot
> 150W 8-pin
> 150W 8-pin
> -----------
> 375W


I'd like to introduce you to the R9 295x2

https://www.guru3d.com/articles-pages/amd-radeon-r9-295x2-review,12.html


----------



## AlphaC

That's a dual GPU with a closed loop liquid cooler


----------



## 113802

AlphaC said:


> That's a dual GPU with a closed loop liquid cooler


You said it wasn't feasible to make a 500W consumer GPU, but they already did. We're talking about AMD's reference cards, not AIB cards.


----------



## AlphaC

It also cost $1500 at launch, so how do you expect them to sell an RTX 2080 Ti competitor at that price?


----------



## 113802

AlphaC said:


> It also cost $1500 at launch, so how do you expect them to sell an RTX 2080 Ti competitor at that price?


If the rumors that people are spreading in here are true and it outperforms a RTX 2080 Ti they could price it at $1500.


----------



## AlphaC

Let's be realistic here. A 500W card with a closed-loop cooler costing $1500 versus a $1200 MSRP RTX 2080 Ti (you can get them for ~$1050) that needs less than 300W TDP. RTX alone already sways people towards the RTX 2080 Ti, let alone power/temp/noise and the supposed cost.

For one, there are going to be people with 650-750W PSUs, so those with overclocked CPUs would need to buy a new PSU to have enough headroom.


----------



## maltamonk

AlphaC said:


> Let's be realistic here. A 500W card with a closed-loop cooler costing $1500 versus a $1200 MSRP RTX 2080 Ti (you can get them for ~$1050) that needs less than 300W TDP. RTX alone already sways people towards the RTX 2080 Ti, let alone power/temp/noise and the supposed cost.
> 
> For one, there are going to be people with 650-750W PSUs, so those with overclocked CPUs would need to buy a new PSU to have enough headroom.


People who buy at the high end aren't historically known for having much concern for value. A great example is people who buy Titans for gaming; for that usage, I think they hold the title of worst price/performance ratio. On the same note, those same people wouldn't hesitate to buy a new PSU to accommodate their new top-tier toy.


----------



## ilmazzo

Yeah

why not a full kilowatt? this is ocn, right?


----------



## AlphaC

maltamonk said:


> People who buy at the high end aren't historically known for having much concern for value. A great example is people who buy Titans for gaming; for that usage, I think they hold the title of worst price/performance ratio. On the same note, those same people wouldn't hesitate to buy a new PSU to accommodate their new top-tier toy.


People who buy at the high end also tend to want extra features such as RTX.


Also the RTX TITAN has pro driver optimizations similar to VEGA FE.


----------



## magnek

Blackops_2 said:


> Personally, if it performed as well as the 2080 Ti for $600, I honestly wouldn't care if it sat at 500W. I know most won't see it that way, and I do like efficiency, but if it brings parity to the market, I don't know that I'll care. HBM, as mentioned above, would certainly help, albeit make it more expensive. Either way, exciting times ahead, because RDNA is looking better and better. I just hope AMD can be ahead of schedule for once.


If it performed as well as 2080 Ti I can pretty much guarantee it'll be $700 minimum, and more likely $800+.



WannaBeOCer said:


> If the rumors that people are spreading in here are true and it outperforms a RTX 2080 Ti they could price it at $1500.


EL-OH-EL, nobody would buy it if they priced it at $1500 unless it was at least 30% faster, and even then it'd be a hard sell. I certainly would not buy such an overpriced card.



maltamonk said:


> People who buy at the high end aren't historically known for much concern about value. A great example is people who buy Titans for gaming; for that usage, I think they hold the title of worst price/performance ratio. On that same note, those same people wouldn't hesitate to buy a new PSU to accommodate their new top-tier toy.


People who buy Titans are the least likely to even _look_ at AMD nevermind actually buying it.


----------



## looniam

AlphaC said:


> Let's be realistic here.
> snip


:notontopi


----------



## magnek

looniam said:


> :notontopi


:jerry:


----------



## AlphaC

More on topic: I think Navi is supposed to scale down, not up.


https://www.tweaktown.com/news/66602/amds-new-radeon-rx-5600-series-leaked-navi-14-gpu/index.html
https://www.tomshardware.com/news/amd-navi-14-gpu-compubench,39887.html
https://www.pcgamesn.com/amd/navi-14-gpu-linux-kernel-patches






https://www.notebookcheck.net/AMD-N...-replacement-for-Polaris-RX-580.427350.0.html


> Komachi notes that Compubench displays only half of the actual CU count in RDNA GPUs implying that the 7nm Navi 14 is a 24 CU part with 1,536 SPs. The listing also indicates that the maximum clock speed of this chip is 1,900 MHz with 4 GB of VRAM. While there's no official marketing designation to this card yet, we presume it will be called the RX 5600 and will be AMD's answer to NVIDIA's GTX 1660 series. We see the GPU listed with _just_ 4 GB VRAM, but there is definitely a possibility of both 4 GB and 8 GB SKUs like we saw with the Polaris cards such as the RX 480 and RX 580.


1536 shaders would basically make it a GTX 1650 through GTX 1660 GDDR5 (1408 CUDA) contender, unless it can somehow match the GTX 1660 Ti (1536 CUDA on a 192-bit memory bus with GDDR6).


----------



## 113802

AlphaC said:


> More on topic I think Navi is supposed to scale down not scale up.
> 
> 
> https://www.tweaktown.com/news/66602/amds-new-radeon-rx-5600-series-leaked-navi-14-gpu/index.html
> https://www.tomshardware.com/news/amd-navi-14-gpu-compubench,39887.html
> https://www.pcgamesn.com/amd/navi-14-gpu-linux-kernel-patches
> 
> 
> 
> 
> 
> 
> https://www.notebookcheck.net/AMD-N...-replacement-for-Polaris-RX-580.427350.0.html
> 
> 
> 
> Komachi notes that Compubench displays only half of the actual CU count in RDNA GPUs implying that the 7nm Navi 14 is a 24 CU part with 1,536 SPs. The listing also indicates that the maximum clock speed of this chip is 1,900 MHz with 4 GB of VRAM. While there's no official marketing designation to this card yet, we presume it will be called the RX 5600 and will be AMD's answer to NVIDIA's GTX 1660 series. We see the GPU listed with _just_ 4 GB VRAM, but there is definitely a possibility of both 4 GB and 8 GB SKUs like we saw with the Polaris cards such as the RX 480 and RX 580.

Arcturus is the Vega replacement and it's going to be a monster. Hopefully it doesn't consume power like one.


----------



## Diffident

WannaBeOCer said:


> Arcturus is the Vega replacement and it's going to be a monster. Hopefully it doesn't consume power like one.



It probably will and be super hot. It seems to be an AMD thing.


----------



## 113802

Diffident said:


> It probably will and be super hot. It seems to be an AMD thing.


I undervolted my Radeon VII yesterday to 1V and it runs at 1900/1250MHz just fine. Set my fans to 300 RPM and temps only hit 45/65. Power usage is between 130-220W.


----------



## magnek

WannaBeOCer said:


> Arcturus is the Vega replacement and it's going to be a monster. Hopefully it doesn't consume power like one.


If the performance is there, IDGAF if it's a 350W card. If the performance isn't there, I'll be laughing at AMD if it's a 350W card.


----------



## Diffident

WannaBeOCer said:


> I undervolted my Radeon VII yesterday to 1v and it runs at 1900/1250Mhz just fine. Set my fans to 300 rpm and temps only hit 45/65. Power usage is between 130-220w



In Linux I can undervolt to 1v and crunch [email protected] tasks, but in Windows trying to play The Division II, the game crashes. It's crazy how hot it gets if I do the "Auto Overclock" in Wattman. Peaks at 73C and 112C tjunction....and that's with a waterblock. It puts off enough heat to increase my CPU temp by 15C since they are in the same loop. With Nvidia, I never had a GPU that went over 55C.


----------



## 113802

Diffident said:


> In Linux I can undervolt to 1v and crunch [email protected] tasks, but in Windows trying to play The Division II, the game crashes. It's crazy how hot it gets if I do the "Auto Overclock" in Wattman. Peaks at 73C and 112C tjunction....and that's with a waterblock. It puts off enough heat to increase my CPU temp by 15C since they are in the same loop. With Nvidia, I never had a GPU that went over 55C.


I haven't seen 73C since using my waterblock, but I haven't run my fans low since setting my rig's overclock. I haven't touched The Division 2 in a while, but I swear I remember seeing it overboost just like my RX Vega 64 did. That's the only game I've seen cause my Radeon VII to overboost insanely high when undervolted (2200MHz+).

The highest I've seen with my waterblock is 51/88 with my six fans at 2000 RPM. I prefer running them at 300 RPM, so if I have to sacrifice 200MHz it's worth it.


----------



## Imouto

I haven't even met some users' hardware and I feel they're family already.


----------



## ToTheSun!

ilmazzo said:


> Yeah
> 
> why not a full kilowatt? this is ocn, right?


Does it come with an unlimited supply of LN2?


----------



## magnek

ToTheSun! said:


> Does it come with an unlimited supply of LN2?


No but it definitely comes with an unlimited supply of (hot) GN2.


----------



## AlphaC

Indeed, Arcturus is the Vega replacement:
https://www.phoronix.com/scan.php?page=news_item&px=Arcturus-Linux-Driver-Patches



> So all things considered, Arcturus appears to be a forthcoming Vega workstation "Radeon Instinct" type offering we'd bet could be announced at SIGGRAPH or Hot Chips based on the timing. Digging through this code drop today, it further points at a compute accelerator without any 3D support: "It's because Arcturus has not 3D engine."
> 
> Arcturus isn't a small change over Vega 20 but amounts to 102 patches to the AMDGPU kernel driver and 100,491 lines of new code. Granted, lots of that new code is auto-generated header files for the registers.


----------



## 113802

AlphaC said:


> Indeed Arcturus is the vega replacement
> https://www.phoronix.com/scan.php?page=news_item&px=Arcturus-Linux-Driver-Patches
> 
> 
> 
> 
> So all things considered, Arcturus appears to be a forthcoming Vega workstation "Radeon Instinct" type offering we'd bet could be announced at SIGGRAPH or Hot Chips based on the timing. Digging through this code drop today, it further points at a compute accelerator without any 3D support: "It's because Arcturus has not 3D engine."
> 
> Arcturus isn't a small change over Vega 20 but amounts to 102 patches to the AMDGPU kernel driver and 100,491 lines of new code. Granted, lots of that new code is auto-generated header files for the registers.

Just as I thought, their compute cards will continue to use HBM2. The question now is whether they're going to release a prosumer card to compete with the Titan. That's the card I want.



> We've also heard from AMD directly at the Navi event and E3 over their plans to continue to use Vega+HBM when it comes to workstation/compute offerings as Vega is still quite good in that regard.


----------



## JackCY

Doubtful, from that info, that Arcturus will even have 3D, let alone be sold in the consumer/gaming market if it does in the end.


----------



## Section31

Anyone know how much performance I'd lose going from a 2080 Ti to a 5700 XT? This card's OC potential seems very tempting.


----------



## keikei

Section31 said:


> Anyone know how much performance I'd lose going from a 2080 Ti to a 5700 XT? This card's OC potential seems very tempting.


Depends on what resolution you're measuring at. It's a bigger performance loss @ 4K.


----------



## Section31

keikei said:


> Depends on what resolution you're measuring. Its more performance loss @ 4K.


I mainly play at UW 1440p, 100Hz. That being said, the better financial decision is to wait for the next AMD 7nm GPU or Nvidia 7nm GPU.


----------



## ilmazzo

Well, the better financial decision would have been not buying the RTX Ti in the first place... it makes no sense to drop to a mid-range card unless you need some money back...


----------



## technodanvan

ilmazzo said:


> Well, the better financial decision would have been not buying the RTX Ti in the first place... it makes no sense to drop to a mid-range card unless you need some money back...


That depends; the 2080 Ti isn't losing all that much value on the resale market right now. You could pocket a few hundred while still getting a capable card, especially if you aren't really pushing the Ti all that much and like playing around with new toys.


----------



## Gunderman456

Section31 said:


> Anyone know how much performance I'd lose going from a 2080 Ti to a 5700 XT? This card's OC potential seems very tempting.


~40%


----------



## jay2nice000

Gunderman456 said:


> ~40%


40 percent........... overdramatic much?


----------



## keikei

Section31 said:


> I mainly play at UW 1440p, 100Hz. That being said, the better financial decision is to wait for the next AMD 7nm GPU or Nvidia 7nm GPU.



Unless I borked the math, it's about *32%* slower at that res vs. the RTX 2080 Ti. Hardware Unboxed does great comparisons:


----------



## JackCY

100% in Q2RTX. Too dramatic?

https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/28.html

2080Ti +46% elsewhere. Or 5700XT -32% elsewhere. Depends how you calculate the %, what you use as reference.
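Those two figures describe the same gap measured against different baselines; a quick sketch of the arithmetic:

```python
def pct_faster(fast, slow):
    """How much faster `fast` is, with `slow` as the baseline."""
    return (fast / slow - 1) * 100

def pct_slower(slow, fast):
    """How much slower `slow` is, with `fast` as the baseline."""
    return (1 - slow / fast) * 100

# Normalize the 5700 XT to 100; the 2080 Ti then sits at ~146.
xt, ti = 100, 146
print(f"2080 Ti is {pct_faster(ti, xt):.0f}% faster")  # prints 46% (baseline: 5700 XT)
print(f"5700 XT is {pct_slower(xt, ti):.0f}% slower")  # prints 32% (baseline: 2080 Ti)
```

Same gap, two different reference points, which is why "+46%" and "-32%" are both correct.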


----------



## ToTheSun!

JackCY said:


> 100% in Q2RTX. Too dramatic?


GOTTEM


----------



## guttheslayer

JackCY said:


> 100% in Q2RTX. Too dramatic?
> 
> https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/28.html
> 
> 2080Ti +46% elsewhere. Or 5700XT -32% elsewhere. Depends how you calculate the %, what you use as reference.



At 4K the 2080 Ti was 56% faster. I guess next-gen Navi, if it comes with 60% more shaders (4096 SPs), will just about compete with the 2080 Ti, and will overtake it once AMD's RDNA drivers mature.


This actually shows that, shader for shader, AMD has superior IPC compared to Turing; pretty impressive.


----------



## 113802

guttheslayer said:


> At 4K the 2080 Ti was 56% faster. I guess next-gen Navi, if it comes with 60% more shaders (4096 SPs), will just about compete with the 2080 Ti, and will overtake it once AMD's RDNA drivers mature.
> 
> 
> This actually shows that, shader for shader, AMD has superior IPC compared to Turing; pretty impressive.


If AMD's RDNA has superior IPC compared to Turing wouldn't it have outperformed the RTX 2070 Super? We can't even do a proper comparison due to AMD's screwed up turbo boosting on their cards starting with Vega and beyond.


----------



## Imouto

If you compare the 10.3b transistors in the 5700 XT against the 10.8b in the RTX 2070 (non-super) I'd say RDNA is pretty much on par or slightly better than Turing after factoring clocks and transistors.
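A rough sketch of that normalization (transistor counts as quoted above; treating the two cards as roughly equal performers is my assumption, which the reviews only loosely support):

```python
# Transistor counts in billions: 5700 XT (Navi 10) vs RTX 2070 (TU106).
navi10 = 10.3
tu106 = 10.8

# If the two cards perform about the same, perf-per-transistor
# favors Navi by the inverse ratio of transistor counts.
advantage = (tu106 / navi10 - 1) * 100
print(f"Navi perf/transistor advantage: ~{advantage:.1f}%")  # prints ~4.9%
```

So "on par or slightly better" is about a ~5% edge at best under that assumption, before factoring in clocks.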


----------



## Newbie2009

Picked up an xt anniversary , does 2030 MHz or so at stock on fire strike. I put liquid metal on it. Drivers seem a bit wonky and software is broken.. haven’t had a chance to play with it much yet but looks good.


----------



## 113802

Imouto said:


> If you compare the 10.3b transistors in the 5700 XT against the 10.8b in the RTX 2070 (non-super) I'd say RDNA is pretty much on par or slightly better than Turing after factoring clocks and transistors.


Without Tensor cores and RT cores, a RTX 2070 is around 9.9b and a RTX 2070 Super is 10.9b.

From all the reviews I've seen, a 2090MHz RX 5700 XT is slower than a RTX 2070 Super @ 1900MHz.


----------



## Imouto

WannaBeOCer said:


> Imouto said:
> 
> 
> 
> If you compare the 10.3b transistors in the 5700 XT against the 10.8b in the RTX 2070 (non-super) I'd say RDNA is pretty much on par or slightly better than Turing after factoring clocks and transistors.
> 
> 
> 
> Without Tensor cores and RT cores a RTX 2070 is around 9.9b and a RTX 2070 super is 10.9b.
> 
> From all the reviews I've seen a 2090Mhz RX 5700 XT is slower than a RTX 2070 Super @ 1900Mhz.

I'll repeat myself.

Same performance as a door stopper. Better value.

I thought we were talking about gaming. Tensor cores are not used for gaming and DXR is used in a grand total of... maybe 5 relevant games? Perhaps 8 by the end of the year? 20 until Turing is completely phased out?

Those are wasted transistors for gaming.


----------



## 113802

Imouto said:


> I'll repeat myself.
> 
> Same performance as a door stopper. Better value.
> 
> I thought we were talking about gaming. Tensor cores are not used for gaming and DXR is used in a grand total of... maybe 5 relevant games? Perhaps 8 by the end of the year? 20 until Turing is completely phased out?
> 
> Those are wasted transistors for gaming.


I'm not arguing that the 5700 series is a better value for gamers. I am comparing the two architectures. Turing has better IPC.


----------



## ILoveHighDPI

WannaBeOCer said:


> If AMD's RDNA has superior IPC compared to Turing wouldn't it have outperformed the RTX 2070 Super? We can't even do a proper comparison due to AMD's screwed up turbo boosting on their cards starting with Vega and beyond.


https://www.techpowerup.com/gpu-specs/geforce-rtx-2070.c3252

https://www.techpowerup.com/gpu-specs/radeon-rx-5700-xt.c3339

https://www.techpowerup.com/gpu-specs/geforce-rtx-2070-super.c3440

The 5700XT is a 251mm² die; the 2070 is 77% larger at 445mm², and the 2070 Super is a 545mm² TU104, 117% more die space.

Hopefully RTX is cutting into Nvidia’s profits enough to actually get another GTX xx80 GPU in the next 18 months.
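Those die-area percentages check out against the listed sizes:

```python
# Die sizes in mm^2, from TechPowerUp's spec pages.
navi10, tu106, tu104 = 251, 445, 545

tu106_extra = (tu106 / navi10 - 1) * 100  # TU106 (RTX 2070) vs Navi 10
tu104_extra = (tu104 / navi10 - 1) * 100  # TU104 (RTX 2070 Super) vs Navi 10

print(f"TU106 is {tu106_extra:.0f}% larger than Navi 10")  # prints 77%
print(f"TU104 is {tu104_extra:.0f}% larger than Navi 10")  # prints 117%
```

Keep in mind the Turing dies are on 12nm while Navi 10 is on 7nm, so raw area isn't an apples-to-apples cost comparison.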


----------



## ZealotKi11er

WannaBeOCer said:


> Without Tensor cores and RT cores a RTX 2070 is around 9.9b and a RTX 2070 super is 10.9b.
> 
> From all the reviews I've seen a 2090Mhz RX 5700 XT is slower than a RTX 2070 Super @ 1900Mhz.


You got to give RDNA some time to mature.


----------



## 113802

ZealotKi11er said:


> You got to give RDNA some time to mature.


I will, I'm also impatiently waiting on their DLSS competitor. Adam Kozak kinda hyped it up.


----------



## ZealotKi11er

WannaBeOCer said:


> I will, I'm also impatiently waiting on their DLSS competitor. Adam Kozak kinda hyped it up.


Have you tried DLSS? I tried it with a 2080 Ti and was not a fan. I play on a 55" OLED 4K, and anything but native is very noticeable.


----------



## 113802

ZealotKi11er said:


> Have you tried DLSS? I tried it with a 2080 Ti and was not a fan. I play on a 55" OLED 4K, and anything but native is very noticeable.


Just like RDNA it will improve with time. So will AMD's DirectML super sampling.


----------



## rv8000

Just chiming in here:

My 5700 XT showed up today, and after fixing a botched driver install this card is pretty neat.

Some very preliminary testing shows that for roughly a 7.5% increase in avg core clock I got roughly a 7% increase in Time Spy GPU score; the stock run was 8706 at ~1860MHz avg vs 9342 at 2010MHz avg.

Side note: memory overclocking is 115% broken atm. The card also seems to be running at unnecessarily high voltage; I had no problem bumping it down from the stock 1200mV to 1100mV while pushing the core clock up as well.

Aftermarket card reviews should be really good. I have high hopes: if some of the top-tier models can maintain a 1950-2000 core out of the box, we'll see something like a 5-8% boost across games with vastly better thermals.

More testing to come at some point (won't be posting here though).
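Those numbers imply near-linear clock scaling; a quick sanity check of the ratios (note the quoted averages work out to closer to an 8% clock gain than 7.5%):

```python
# Time Spy GPU scores and average core clocks from the runs above.
stock_score, stock_clock = 8706, 1860
oc_score, oc_clock = 9342, 2010

clock_gain = oc_clock / stock_clock - 1  # ~8.1% more clock
score_gain = oc_score / stock_score - 1  # ~7.3% more score
scaling = score_gain / clock_gain        # ~0.91: score grows almost 1:1 with clock

print(f"clock +{clock_gain:.1%}, score +{score_gain:.1%}, scaling efficiency {scaling:.2f}")
```

A scaling efficiency near 0.9 suggests the core isn't badly bandwidth-starved yet, even with memory overclocking broken.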


----------



## JackCY

Newbie2009 said:


> Picked up an xt anniversary , does 2030 MHz or so at stock on fire strike. I put liquid metal on it. Drivers seem a bit wonky and software is broken.. haven’t had a chance to play with it much yet but looks good.


Did you add washers to the cooler mount or using a custom cooler instead?


----------



## Newbie2009

JackCY said:


> Did you add washers to the cooler mount or using a custom cooler instead?


Washers. I'm not going to bother overclocking until new drivers fix things.

Just to note, I don't find the blower cooler very loud. Quieter than the 290X and Vega. My CPU fan is probably louder; the card only really ramps up when benchmarking.

I think people who prefer quiet over performance should leave the fan on auto.


----------



## rdr09

rv8000 said:


> Just chiming in here:
> 
> My 5700 XT showed up today, and after fixing a botched driver install this card is pretty neat.
> 
> Some very preliminary testing shows that for roughly a 7.5% increase in avg core clock I got roughly a 7% increase in Time Spy GPU score; the stock run was 8706 at ~1860MHz avg vs 9342 at 2010MHz avg.
> 
> Side note: memory overclocking is 115% broken atm. The card also seems to be running at unnecessarily high voltage; I had no problem bumping it down from the stock 1200mV to 1100mV while pushing the core clock up as well.
> 
> Aftermarket card reviews should be really good. I have high hopes: if some of the top-tier models can maintain a 1950-2000 core out of the box, we'll see something like a 5-8% boost across games with vastly better thermals.
> 
> More testing to come at some point (won't be posting here though).


Thank you for this info. +rep.


----------



## Newbie2009

Fire Strike Ultra:
https://www.3dmark.com/fs/19883245

Excuse the CPU score; I was just playing with the APU recently.


----------



## NightAntilli

WannaBeOCer said:


> I'm not arguing that the 5700 series is a better value for gamers. I am comparing the two architectures. Turing has better IPC.


ComputerBase tested this... According to them, RDNA has a 1% advantage over Turing. I also posted this a few pages back, although not as extensively. Note that this is based on an average across multiple games; it can change completely as drivers mature, or as the games themselves change.
https://www.computerbase.de/2019-07/radeon-rx-5700-xt-test/4/

Google translated;
_The comparison of Navi against Turing works out almost perfectly. The Radeon RX 5700 and the GeForce RTX 2070 share not only the same number of shader units but also identical memory bandwidth, so at identical clocks a direct comparison can be drawn.

All benchmarks were run at 2,560 × 1,440. The exact configuration of the graphics cards is shown in the following table:

Navi vs. Turing: directly comparable configurations
RX 5700: 2,304 ALUs @ ~1.5 GHz, GDDR6 @ 7,000 MHz, 256-bit
RTX 2070: 2,304 ALUs @ ~1.5 GHz, GDDR6 @ 7,000 MHz, 256-bit

Although Nvidia made big leaps in performance per shader unit with Maxwell and Turing, AMD manages to come out an average of one percent ahead with RDNA.
In individual games there are sometimes massive differences. Anno 1800 and Hitman 2 are Turing's strongest showings; there, Nvidia's architecture is 10 and 18 percent ahead of RDNA. RDNA scores best in Battlefield V and Metro Exodus, where the GCN successor works 16 and 13 percent faster at the same compute throughput.

In Shadow of the Tomb Raider, both architectures perform identically despite their big internal differences. In the other games the designs are close to each other: in Resident Evil 2, Turing is two percent faster; in COD: Black Ops 4, RDNA is._

Summary compared to other architectures;
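Averaging the per-game deltas quoted above gives a feel for how close the two architectures land overall (this covers only the games named in the excerpt, not ComputerBase's full suite, so it won't reproduce their exact 1% figure):

```python
from math import prod

# RDNA-over-Turing performance ratios for the games named in the excerpt.
ratios = {
    "Anno 1800":        1 / 1.10,  # Turing +10%
    "Hitman 2":         1 / 1.18,  # Turing +18%
    "Battlefield V":    1.16,      # RDNA +16%
    "Metro Exodus":     1.13,      # RDNA +13%
    "SotTR":            1.00,      # tie
    "Resident Evil 2":  1 / 1.02,  # Turing +2%
    "CoD: Black Ops 4": 1.02,      # RDNA +2%
}

# Geometric mean is the right average for performance ratios.
geomean = prod(ratios.values()) ** (1 / len(ratios))
print(f"geometric mean: {geomean:.3f}")  # ~1.00, i.e. roughly even
```

The big per-game swings cancel out to near parity, which is exactly what the quoted average suggests.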


----------



## rluker5

NightAntilli said:


> Computer base tested this... According to them, RDNA has a 1% advantage on turing. I also already posted this a few pages back, although not so extensively. Note that this is based on an average of multiple games. This can change completely as drivers mature, or as games themselves change.
> https://www.computerbase.de/2019-07/radeon-rx-5700-xt-test/4/
> 
> Google translated;
> _The comparison Navi against Turing succeeds almost perfectly. The Radeon RX 5700 and the GeForce RTX 2070 share not only the same number of shader units, but also the memory bandwidth is identical. With identical clock so you can draw a direct comparison.
> 
> All benchmarks were done in 2,560 × 1,440. The exact details of how the graphics cards are configured are shown in the following table:
> Navi Vs. Turing : Configuration directly comparable
> RX 5700	2,304 ALUs @ ~ 1.5 GHz	GDDR6 @ 7,000 MHz, 256 bits
> RTX 2070	2,304 ALUs @ ~ 1.5 GHz	GDDR6 @ 7,000 MHz, 256 bits
> 
> Not surprisingly, although Nvidia has made big leaps in performance per shader unit with Maxwell and Turing, AMD manages to cut off an average of one percent better on average with RDNA.
> In the individual games sometimes massive differences. Anno 1800 is the master discipline of Turing with Hitman 2. There, Nvidia's architecture is 10 and 18 percent ahead of RDNA. RDNA scores best in Battlefield V and Metro Exodus, where the GCN successor works 16 and 13 percent faster at the same processing power.
> 
> In Shadow of the Tomb Raider, both technologies manage to cut it off in the same way, despite the huge differences. In other games, the designs are close to each other. In Resident Evil 2, Turing is two percent faster, in COD: Black Ops 4 is RDNA.
> _
> 
> Summary compared to other architectures;


That is pretty significant. 

I hope AMD puts out a beast that Nvidia can't match with ray tracing hardware taking up die space.


----------



## rage fuury

*Undervolting RX 5700 XT | Power Play Tables | Testing Results:*





... with an undervolt to 0.966V, the card consumes slightly above 130W at 1820MHz (40MHz lower than the default 1860MHz)!?
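For a rough sense of the trade-off, here's a sketch; the ~220W stock gaming draw is an assumed review-typical figure for the 5700 XT, not a number from the video:

```python
# Undervolt result from the video: ~130W at 1820MHz, 0.966V.
# Stock: 1860MHz at ~220W -- the 220W is an assumed review-typical
# gaming draw, not measured in the video.
stock_clock, stock_power = 1860, 220
uv_clock, uv_power = 1820, 130

clock_loss = 1 - uv_clock / stock_clock   # ~2.2% fewer MHz
power_saved = 1 - uv_power / stock_power  # ~41% less power

print(f"clock -{clock_loss:.1%}, power -{power_saved:.1%}")
```

Giving up ~2% of clock for ~40% of the power is a huge efficiency win if those figures hold up.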


----------



## ilmazzo

They'll be forced to jump on the RT wagon, imo.

Whether via software, generic hardware, or dedicated hardware remains to be seen, though.


----------



## rdr09

rage fuury said:


> https://www.youtube.com/watch?v=7_GvOe1_UKs
> ... with an undervolt to 0.966V, the card consumes slightly above 130W at 1820MHz (40MHz lower than the default 1860MHz)!?


Consumes just a tad higher than my GTX 1060!


----------



## The Robot

ilmazzo said:


> They'll be forced to jump on the RT wagon, imo.
> 
> Whether via software, generic hardware, or dedicated hardware remains to be seen, though.


They already do in next-gen consoles with Navi 2. It would really matter only when proper native RT console ports come out; until then it's a useless gimmick, just like PhysX. I can see Nvidia dumping RT cores altogether if they really need the die space to compete with AMD on raw performance.


----------



## Newbie2009

The Robot said:


> They already do in next gen consoles with Navi2. It would
> really matter only when the proper native RT console ports come out, until then it's a useless gimmick just like PhysX. I can see Nvidia dumping RT cores altogether if they will really need to increase useful die space to compete with AMD in raw performance.


Doubt it; being a GPU-only company, they need to keep discrete graphics relevant.

A 20 CU Navi APU with a quad-core/8-thread Ryzen would be something I'd like to see: 290X-level performance in an APU.


----------



## ilmazzo

Raw performance gains will come just from 7nm; it is very unlikely they will drop RT hardware and lose face... they still have a big (too big) market share and mindshare with the gaming crowd.

Never thought I'd look at a mid-range card for my next upgrade, but it seems this Navi XT is going to be my next card within the year...


----------



## 113802

rluker5 said:


> That is pretty significant.
> 
> I hope AMD puts out a beast that Nvidia can't match with ray tracing hardware taking up die space.


Until there is a proper way to disable turbo boost on both cards, that test is useless. We have no clue what either GPU is boosting to.


----------



## maltamonk

RT cores have their place even if they don't become relevant in gaming. RT cores are great for particle-physics AI, which is becoming more prevalent in the medical industry. Want to know exactly where a cancer starts and stops in the body? RT cores can help with that. I'm sure there are many other uses; it's just that the medical uses are a bit close to my realm.


----------



## NightAntilli

WannaBeOCer said:


> Until there is a proper way to disable turbo boost on both cards that test is useless. We have no clue what either GPU is boosting too.


They claimed to have maintained identical clocks, although they don't explain how they achieved that.


----------



## 113802

NightAntilli said:


> They claimed to have maintained identical clocks, although they don't explain how they achieved that.


That's not possible with AMD's boost. It might run at a given frequency 80% of the time, but it will fluctuate further down at times, driving performance down. For example, I bugged my Vega 64 to run at 1800MHz/1105MHz sustained and it outperforms my Radeon VII at 1850MHz/1105MHz.

If anything, AMD's RDNA advantage would be higher than 1%.


----------



## EastCoast

(Rumored) RDNA isn't fully optimized yet. If so, I would assume that 1% gets higher; by how much is yet to be seen.


----------



## NightAntilli

WannaBeOCer said:


> That's not possible with AMD's boost. It might run at a frequency for 80% of the time but it will fluctuate further down at times driving performance down. For example I bugged my Vega 64 to run at 1800Mhz/1105Mhz sustained and it outperforms my Radeon VII at 1850Mhz/1105.
> 
> If anything AMD's RDNA would be higher than 1%


Doesn't lowering the voltage enough automatically reduce clocks? Can't it be tuned that way to reach a stable clock around ~1500 MHz? I know my R9 Fury Nitro works that way, but I'm not sure if later cards do the same thing.


----------



## AlphaC

RX 5700 vs RTX 2080 in video editing.


Adobe Premiere Pro: RX 5700 1s faster, 8s faster in x264



DaVinci Resolve: relatively close


No out of memory errors on RX 5700.


Still some bugs in Radeon RX 5700 driver. Handbrake x264 is slower due to a bug.


----------



## Newbie2009

Sweet spot for my 5700 XT is 2125 @ 1126mV, auto fans.

https://www.3dmark.com/fs/19892399

Need more cooling and power limit extension for anything more.


----------



## ZealotKi11er

Newbie2009 said:


> Sweet spot for my 5700xt is 2125 @ 1126mv, auto fans.
> 
> https://www.3dmark.com/fs/19892399
> 
> Need more cooling and power limit extension for anything more.


Can you try Timespy?


----------



## Newbie2009

ZealotKi11er said:


> Can you try Timespy?


Will run it this evening.


----------



## criminal

Newbie2009 said:


> Washers. I’m not going to bother overclocking until new drivers fix things.
> 
> Just to note, I don’t find the blower cooler very loud. Quieter than 290x and Vega . My cpu fan is probably louder , only really ramps up when benchmarking.
> 
> I think people who prefer quiet over performance should leave fan to auto.


The last time I posted, I had put an Accelero cooler on the card, but after further testing the VRAM was getting over 100C. This appeared to be causing horrible stuttering in the games I was playing, so I ended up putting the stock cooler back on. With the stock cooler back on, none of the tweaking I did to the fan curve satisfied me with the noise level (yeah, yeah, it may not be that bad to some, but to me = nails on a chalkboard), and having to reinstall the driver 3 times in 4 days (due to games crashing) finally convinced me the 5700XT was not for me. The older I get, the less I want to "troubleshoot" issues at home after doing so all day at work (FYI, I work in IT). So, long story short, I got a 2070 Super. I may have to admit that at this point in time, I am just more partial to Nvidia.

Having said all that, I don't want to come off as anti-AMD. I am completely ecstatic with my 3700X. It runs extremely well and performance is exceptional.


----------



## bigjdubb

criminal said:


> The last I posted I had put an acclero cooler on the card, but after further testing the VRAM on the card was getting over 100C. This appeared to be causing horrible stuttering in games I was playing, so I ended up putting the stock cooler back on. With the stock cooler back on, none of the tweaking I did to the fan curve satisfied me with the noise level (yeah, yeah it may not be that bad to some, but to me = nails on a chalkboard) and having to reinstall the driver 3 times in 4 days(due to games crashing) finally convinced me the 5700XT was not for me. The older I get, the less I want to "troubleshoot" issues at home after doing so all day at work(FYI I work in IT). So long story short, I got a 2070 Super. I may have to admit at this point in time, I am just more partial to Nvidia.
> 
> Having said all that, I don't want to come off as anti-AMD. I am completely ecstatic with my 3700X. It runs extremely well and performance is exceptional.


I feel your pain. I gave up on tweaking my Radeon VII for software reasons as well. At least you got a 3700X, though; AMD is about as good at keeping their new products in stock as they are at writing software.


----------



## PontiacGTX

criminal said:


> The last I posted I had put an acclero cooler on the card, but after further testing the VRAM on the card was getting over 100C. This appeared to be causing horrible stuttering in games I was playing, so I ended up putting the stock cooler back on. With the stock cooler back on, none of the tweaking I did to the fan curve satisfied me with the noise level (yeah, yeah it may not be that bad to some, but to me = nails on a chalkboard) and having to reinstall the driver 3 times in 4 days(due to games crashing) finally convinced me the 5700XT was not for me. The older I get, the less I want to "troubleshoot" issues at home after doing so all day at work(FYI I work in IT). So long story short, I got a 2070 Super. I may have to admit at this point in time, I am just more partial to Nvidia.
> 
> Having said all that, I don't want to come off as anti-AMD. I am completely ecstatic with my 3700X. It runs extremely well and performance is exceptional.


Not sure if the steep price is worth it, but have you tried a Morpheus heatsink? I see it comes with heatsinks for the VRAM/VRM.


----------



## criminal

bigjdubb said:


> I feel your pain. I gave up on tweaking my radeon vii for software reasons as well. At least you got a 3700x though, AMD is about as good at keeping their new products in stock as they are at writing software.


Yep. I hate to say it because I know the hate I will probably get, but Nvidia's drivers just seem to work, and they know how to design a quiet cooler. Having said that, the R9 380 in my son's computer has been rock solid. So it is probably just growing pains for AMD's driver team and the RDNA architecture.



PontiacGTX said:


> not sure if the expensive price is worth it but have you tried a morpheus heatsink? I see it has some heatsinks for the vram/vrm


I was looking at getting something like that because it would have still been cheaper than a 2070 Super, but the weird software issues I was having with the 5700XT sealed the deal. I honestly don't want to deal with that sort of stuff when I just want to come home and relax. The way it's going, in a few more years I might be into consoles because of the simplicity... lol


----------



## AlphaC

https://videocardz.com/press-release/asrock-announces-radeon-rx-5700-challenger-series


> Radeon RX 5700 XT Challenger 8G OC graphics card provides base/boost/game GPU clock at 1650/1795/1905 MHz, and on the other hand, Radeon RX 5700 Challenger 8G OC graphics card features with base/boost/game GPU clock at 1515/1675/1725 MHz
> ...
> 
> The Radeon RX 5700 Challenger 8G OC series graphics cards are specially designed with a long-life 10 cm dual fan, and 4 copper heat-pipe up to 8mm to enhance the heat dissipation effect.


Seems ASRock is going to bring a relatively decent offering, although if you don't like bare copper it's probably not going to be an option.


edit: it's up on Asrock's site, comes with metal backplate
https://www.asrock.com/Graphics-Card/AMD/RX%205700%20XT%20Challenger%208G%20OC/


https://www.asrock.com/Graphics-Card/AMD/RX 5700 Challenger 8G OC/index.asp


----------



## 113802

NightAntilli said:


> Doesn't lowering the voltage enough automatically reduce clocks? Can't it be tuned in that way to reach a stable clock around ~1500 MHz? I know my R9 Fury Nitro works that way, but I'm not sure if the cards that came later do the same thing.


I haven't touched an R9 Fury so I'm not sure how its boost works. Lowering voltage reduced clocks on Vega 10, but on Vega 20 it works like it should and reduces only voltage, not clocks. I have my card set to 1900MHz/1250MHz @ 1V, watercooled; it draws 130-230W and never touches the 300W PL I set. I do see the card fluctuate with load: at around 130-180W it downclocks itself to 1680-1780MHz, but if it's pegged with an intensive scene it runs at 1850MHz+ (1900MHz is the peak, which it rarely touches).

Radeon Chill is off and nothing is power limiting or thermal throttling the card.

From what I've seen in videos of the 5700 XT, its boost is similar to Vega 20's.


----------



## bigjdubb

criminal said:


> Yep. I hate to say it because I know the hate I will probably get, but Nvidia's drivers just seem to work, and they know how to design a quiet cooler. Having said that, the R9 380 in my son's computer has been rock solid. So it is probably just growing pains for AMD's driver team and the RDNA architecture.
> 
> I was looking at getting something like that because it would have still been cheaper than a 2070 Super, but the weird software issues I was having with the 5700XT sealed the deal. I honestly don't want to deal with that sort of stuff when I just want to come home and relax. The way it is going, in a few more years I might be into consoles because of the simplicity... lol


I think you made the right call because it isn't just an RDNA thing, or even just a graphics thing, the software for the cpu stuff is just as annoying. I feel like every software release comes with a problem that leaves you longing for the next release (or rolling back because the old problem wasn't as bad as the new one).


----------



## criminal

bigjdubb said:


> I think you made the right call because it isn't just an RDNA thing, or even just a graphics thing, the software for the cpu stuff is just as annoying. I feel like every software release comes with a problem that leaves you longing for the next release (or rolling back because the old problem wasn't as bad as the new one).


:thumb:


----------



## rdr09

NightAntilli said:


> Doesn't lowering the voltage enough automatically reduce clocks? Can't it be tuned in that way to reach a stable clock around ~1500 MHz? I know my R9 Fury Nitro works that way, but I'm not sure if the cards that came later do the same thing.


Not sure exactly how you do it on Fury, but it might be the way the guy in the vid posted earlier did it (see the 5 min mark).


----------



## Newbie2009

criminal said:


> The last I posted I had put an acclero cooler on the card, but after further testing the VRAM on the card was getting over 100C. This appeared to be causing horrible stuttering in games I was playing, so I ended up putting the stock cooler back on. With the stock cooler back on, none of the tweaking I did to the fan curve satisfied me with the noise level (yeah, yeah it may not be that bad to some, but to me = nails on a chalkboard) and having to reinstall the driver 3 times in 4 days(due to games crashing) finally convinced me the 5700XT was not for me. The older I get, the less I want to "troubleshoot" issues at home after doing so all day at work(FYI I work in IT). So long story short, I got a 2070 Super. I may have to admit at this point in time, I am just more partial to Nvidia.
> 
> Having said all that, I don't want to come off as anti-AMD. I am completely ecstatic with my 3700X. It runs extremely well and performance is exceptional.


I would say I'm anti-Nvidia, yet I'll pick up a 2070 Super this month for another mini-ITX build (moving away from big cases and water cooling). I'll just throw it in and forget about it.

Couldn't help myself getting a 5700 though; it's been ages since I had a card I liked and wanted to tweak/mess around with.


----------



## 113802

rdr09 said:


> Not sure how exactly you do it in Fury but it might be the way the guy in the vid that was posted earlier did it (see 5 min mark).


AMD's clock speed fluctuates far more than Nvidia's. It's as though AMD uses a dynamic clock that depends on load, whereas Nvidia's depends only on power limit and temperature.

Watercooled Radeon VII: fluctuates 





5700 XT: Fluctuates





RTX 2080 Ti - stays at the same frequency:
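
To make the contrast concrete, here's a toy sketch of the two boost policies being described. All the numbers (bin sizes, thresholds, state clocks) are invented for illustration; neither vendor publishes its exact algorithm:

```python
# Toy contrast of the two boost policies described above.
# All numbers are made up for illustration only.

def nvidia_style_boost(power_w, temp_c, max_clock=1900,
                       power_limit_w=250, temp_limit_c=55, bin_mhz=13):
    """Clock depends only on power limit and temperature bins, not load."""
    if power_w > power_limit_w:
        return max_clock - bin_mhz * 4            # hard power throttle
    bins_over = max(0, (temp_c - temp_limit_c) // 10)
    return max_clock - bin_mhz * bins_over        # one bin per ~10 °C over

def amd_style_boost(load_pct, max_clock=1900):
    """Clock tracks instantaneous load, even with thermal/power headroom."""
    if load_pct > 90:
        return max_clock - 50                     # just below peak boost
    return int(1680 + (load_pct / 90) * 170)      # scales down at light load

# Nvidia-style: identical clock for any load while cool and under the PL
assert nvidia_style_boost(180, 50) == nvidia_style_boost(230, 50)
# AMD-style: clock moves with load even though nothing is limiting it
assert amd_style_boost(60) < amd_style_boost(95)
```

The point of the sketch is only the shape of the behavior: the first function ignores load entirely, the second never holds one frequency.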


----------



## Blackops_2

Think 5700XT nanos will be coming?


----------



## AlphaC

Blackops_2 said:


> Think 5700XT nanos will be coming?


Anything not based on the reference PCB is likely on a September timeframe.

MSI's stream stated August for the reference styled PCB with aftermarket coolers (so Mech, Ventus type) while Gaming X and such will likely be September.


ASUS also stated September for their boards.


----------



## EastCoast

Are they using their own PCBs?
I'd be fine with Radeon's reference PCB with a 3-fan cooler on it.


----------



## rluker5

WannaBeOCer said:


> I haven't touched an R9 Fury so I'm not sure how its boost works. Lowering voltage reduced clocks on Vega 10, but on Vega 20 it works like it should and reduces only voltage, not clocks. I have my card set to 1900MHz/1250MHz @ 1V, watercooled; it draws 130-230W and never touches the 300W PL I set. I do see the card fluctuate with load: at around 130-180W it downclocks itself to 1680-1780MHz, but if it's pegged with an intensive scene it runs at 1850MHz+ (1900MHz is the peak, which it rarely touches).
> 
> Radeon Chill is off and nothing is power limiting or thermal throttling the card.
> 
> From what I've seen in videos of the 5700 XT, its boost is similar to Vega 20's.


You make me feel old  
My Fury Nitro boosts to whatever I set in AB and holds it. Pascal and Turing can be set to any stable clock speed/voltage combination using the voltage curve in MSI AB, which performs equivalently to BIOS-flashing Kepler cards to eliminate boost and setting clocks and volts manually in AB. I just assumed clock speed control was attainable on all cards, but I may be mistaken.

I don't know about this new weird floaty boost in AMD cards. Seems to have good performance but may be hard to control clocks.


----------



## rdr09

WannaBeOCer said:


> AMD's clock speed fluctuates far more than Nvidia's. It's as though AMD uses a dynamic clock that depends on load, whereas Nvidia's depends only on power limit and temperature.
> 
> Watercooled Radeon VII: fluctuates
> https://www.youtube.com/watch?v=dWkoJqALIoo


I think it depends on the polling rate.


----------



## 113802

rdr09 said:


> I think it depends on the polling rate.


That link you provided shows an RTX card being thermally throttled. I tried to provide videos of cards that don't have a thermal limitation. Nvidia GPUs need to stay below 55°C or they'll start to throttle down a few MHz.


----------



## rdr09

WannaBeOCer said:


> That link you provided shows an RTX card being thermally throttled. I tried to provide videos of cards that don't have a thermal limitation. Nvidia GPUs need to stay below 55°C or they'll start to throttle down a few MHz.


Wut? That needs a minimum of 360.


----------



## bigjdubb

Keep in mind that it's throttling down from an automatic overclock, not below the rated boost clock. It drops a few MHz for every 5 degrees (not sure of the actual numbers), something along those lines. It's worked that way since they introduced the boost feature.
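
A minimal sketch of the bin behavior described above: the clock drifts down from the opportunistic auto-OC as temperature rises, but never below the advertised boost. Bin size, step temperature, and all clock values here are guesses for illustration only:

```python
# Hypothetical GPU-Boost-style temperature bins, per the description above.
# The constants are invented; the real bin sizes/thresholds are undocumented.

RATED_BOOST = 1750   # MHz, advertised boost clock (floor)
AUTO_BOOST  = 1900   # MHz, opportunistic boost when the card is cold
BIN_MHZ     = 13     # clock dropped per bin
DEG_PER_BIN = 5      # one bin lost per 5 °C over the start temperature

def effective_clock(temp_c, start_c=40):
    """Clock after temperature binning; never drops below rated boost."""
    bins = max(0, (temp_c - start_c) // DEG_PER_BIN)
    return max(RATED_BOOST, AUTO_BOOST - bins * BIN_MHZ)

assert effective_clock(35) == 1900            # cold: full auto-OC held
assert effective_clock(60) == 1900 - 4 * 13   # four 5 °C bins dropped
assert effective_clock(120) == 1750           # floor: never below rated boost
```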


----------



## 113802

rdr09 said:


> Wut? That needs a minimum of 360.


Maxwell was 65c if I recall and Pascal brought it down to 55c and Turing seems to be the same.


----------



## rdr09

WannaBeOCer said:


> Maxwell was 65c if I recall and Pascal brought it down to 55c and Turing seems to be the same.


You serious? No wonder my 1060 fluctuates so much. It goes to 83°C in gaming.

55 is hard to achieve. lol


----------



## 113802

rdr09 said:


> You serious? No wonder my 1060 fluctuates much. It goes to 83 in gaming.
> 
> 55 is hard to achieve. lol


I always water cooled, so I never had an issue. Vega pisses me off because I can't run it at peak frequency 100% of the time. I bugged it on my RX Vega 64 and it ran faster than my Radeon VII at 1850/1105MHz, since the VII fluctuates. I'm using much less power and it's a bit quicker, though, with my Radeon VII at 1900/1250MHz @ 1V.


----------



## AlphaC

Navi is such a beast in pure graphics performance that I expect a Radeon Pro variant to be a complete winner on many fronts, especially since you can swap to the gaming driver on the fly. Without pro drivers it gets about the same FPS in CATIA as a Titan XP at 2076MHz, and the Titan XP has optimizations from the pro driver.

http://www.icpcw.com/Parts/Graphics/xkpc/3328/332857.htm
















This result is consistent with what I saw in Linus Tech Tips' SPECviewperf 13 video. The RX 5700 XT and RX 5700 outperformed the RTX 2070 Super in CATIA (1.6x) and SolidWorks (+25%), and by about 3x in Siemens NX (RTX cards don't get pro driver support and are severely hampered there). In DirectX viewsets (i.e. Maya, 3ds Max) the deficit to the $100-more RTX 2070 Super is around 15% using the blower, so aftermarket cards should close that gap a bit, while Showcase performance is better than on RTX cards (not that it matters, since Autodesk Showcase is discontinued).


----------



## magnek

WannaBeOCer said:


> Maxwell was 65c if I recall and Pascal brought it down to 55c and Turing seems to be the same.


I think Maxwell was also 55C or might've been lower even. I remember seeing my 980 Ti downclock by one bin (10-13 MHz) during summer (where it hits 50C+) but don't see the same behavior during winter.


----------



## Newbie2009

ZealotKi11er said:


> Can you try Timespy?


I think 1080p is CPU limited (the 2200G is just a placeholder), so I also ran Ultra to take the CPU out of the equation.

https://www.3dmark.com/spy/7826911

https://www.3dmark.com/spy/7827435


----------



## bigjdubb

magnek said:


> I think Maxwell was also 55C or might've been lower even. I remember seeing my 980 Ti downclock by one bin (10-13 MHz) during summer (where it hits 50C+) but don't see the same behavior during winter.


That seems correct. It was a long time ago but I do recall trying to keep my cards under 50. It wasn't easy with the modified bios and 1600ish clocks.


----------



## 113802

AlphaC said:


> Navi is such a beast in pure graphics performance that I expect a Radeon Pro variant to be a complete winner on many fronts especially since you can swap to gaming driver on the fly. Without pro drivers it is within same FPS as _2076MHz _Titan XP in CATIA. Titan XP has optimizations from the pro driver.
> 
> 
> This result is consistent with what I saw on Linustechtips' video with Specviewperf13. RX 5700XT and RX 5700 outperformed RTX 2070 Super in CATIA (1.6x) and Solidworks (+25%) and about 3x in Siemens NX (RTX super doesn't get pro driver support and is super hampered there). In DirectX (i.e. Maya , 3dsMax) the performance deficit to the $100 more RTX 2070 Super is around 15% using the blower, so aftermarket cards should close that gap a bit while Showcase performance is better than on RTX cards (not that it matters since Autodesk Showcase is discontinued).


Depends on the workload. Vega crushes Navi in most of these tests if you look at Linus' Radeon VII video.


----------



## AlphaC

https://www.anandtech.com/show/13923/the-amd-radeon-vii-review/17
with i7-7820x CPU

Autodesk 3dsmax: 
RTX 2080 : 221.75
GTX 1080 Ti: 212.66 ---> similar to OC ru's result with R7 3700X CPU
_Radeon VII : 169.57 --- minimal improvement over Vega 64 ~7%
_Vega 64: 159.25

Autodesk Maya:
RTX 2080: 330.55
GTX 1080 Ti: 316.64 ---> 299+ FPS even with R7 2700X
Radeon VII : 229.15 ---> about 12% improvement over Vega64
Vega 64: 204.07

Dassault Systems CATIA:
Radeon VII : 272.75 --> +13% over Vega64
Vega64: 242.06
RTX 2080: 160.44
GTX 1080 Ti: 154.03 ---> R7 3700X result on OC ru varies between 147-154

Dassault systems Solidworks:
Radeon VII : 109.62 --- minimal improvement ~5% (mostly CPU bound)
Vega64: 104.22
RTX 2080: 100.49
GTX 1080 Ti: 90.41

PTC Creo:
GTX 1080 Ti : 234.14 ---> 224-241 on OC ru result using R7 3700X/R7 2700X / i7-9700K
RTX 2080: 220.37
Radeon VII: 123.59 --> ~+10% over Vega64
Vega64: 112.66

Siemens NX (heavily benefits from pro drivers to the extreme):
Vega64: 44.71 
Radeon VII: 37.81 --> performance regression...
RTX 2080: 19.73 --> an utter joke
GTX 1080 Ti : 17.66 --> also a joke


https://overclockers.ru/lab/show/98...md-ryzen-7-3700x-vot-ono-realnoe-vozrozhdenie
R7 3700x/R7 2700X/i7-9700k + 1080 Ti results


----------



## 113802

AlphaC said:


> https://www.anandtech.com/show/13923/the-amd-radeon-vii-review/17
> with i7-7820x CPU
> 
> Autodesk 3dsmax:
> RTX 2080 : 221.75
> GTX 1080 Ti: 212.66 ---> similar to OC ru's result with R7 3700X CPU
> _Radeon VII : 169.57 --- minimal improvement over Vega 64 ~7%
> _Vega 64: 159.25
> 
> Autodesk Maya:
> RTX 2080: 330.55
> GTX 1080 Ti: 316.64 ---> 299+ FPS even with R7 2700X
> Radeon VII : 229.15 ---> about 12% improvement over Vega64
> Vega 64: 204.07
> 
> Dassault Systems CATIA:
> Radeon VII : 272.75 --> +13% over Vega64
> Vega64: 242.06
> RTX 2080: 160.44
> GTX 1080 Ti: 154.03 ---> R7 3700X result on OC ru varies between 147-154
> 
> Dassault systems Solidworks:
> Radeon VII : 109.62 --- minimal improvement ~5% (mostly CPU bound)
> Vega64: 104.22
> RTX 2080: 100.49
> GTX 1080 Ti: 90.41
> 
> PTC Creo:
> GTX 1080 Ti : 234.14 ---> 224-241 on OC ru result using R7 3700X/R7 2700X / i7-9700K
> RTX 2080: 220.37
> Radeon VII: 123.59 --> ~+10% over Vega64
> Vega64: 112.66
> 
> Siemens NX (heavily benefits from pro drivers to the extreme):
> Vega64: 44.71
> Radeon VII: 37.81 --> performance regression...
> RTX 2080: 19.73 --> an utter joke
> GTX 1080 Ti : 17.66 --> also a joke
> 
> 
> https://overclockers.ru/lab/show/98...md-ryzen-7-3700x-vot-ono-realnoe-vozrozhdenie
> R7 3700x/R7 2700X/i7-9700k + 1080 Ti results


specviewperf13 != specviewperf12


----------



## AlphaC

That's version 13, same as LTT.

edit: also if you compare to Quadros & TITAN RTX

https://hothardware.com/reviews/nvidia-quadro-rtx-4000-review?page=4
*CPU was i7-9980XE*

3dsmax
RTX 2080 Ti = 259.13
TITAN RTX = 238.28
RTX 4000 = 182.31
WX 8200 (Vega56 Pro) = 146.31

Maya
RTX 2080 Ti = 381.23
TITAN RTX = 371.51
RTX 4000 = 263.93
WX 8200 (Vega56 Pro) = 243.45

CATIA
Titan RTX = 274.09
RTX 4000 = 237.73
WX 8200 (Vega 56 Pro) = 233.89
RTX 2080 Ti = 175.8

Solidworks
Quadro RTX 4000 = 138.56
WX 8200 (Vega 56 Pro) = 134.2
TITAN RTX = 126.16
RTX 2080 Ti = 103.68

Creo
RTX 2080 Ti = 267.11
TITAN RTX = 263.41
RTX 4000 = 208.24
WX8200 (Vega 56 Pro) = 176.06

Siemens NX 
TITAN RTX = 432.58
RTX 4000 = 360.88
WX 8200 = 273.15
RTX 2080 Ti = 22.01 <-- gimped by drivers

https://www.spec.org/gwpg/gpc.data/vp13/summary.html


----------



## ToTheSun!

rdr09 said:


> You serious? No wonder my 1060 fluctuates much. It goes to 83 in gaming.
> 
> 55 is hard to achieve. lol


It's very easy to achieve if you have an AIO model or a custom water-cooled card.

Mine throttles below 55°C with a manual OC, so I figure it's power limited before it even gets to be temp limited.


----------



## ilmazzo

What? 

Did I really just read that Nvidia cards' boost algorithm keeps cards steady on frequency without effort from the user?

Are we serious? This is becoming a joke...


----------



## ToTheSun!

ilmazzo said:


> What?
> 
> I have just read that nvidia cards boost algorithm keep cards steady on frequency?
> 
> Are we serious? this is becoming a joke......


If you have a good BIOS, watercool below the temperature threshold for Turing, and set performance to max in the NVCP, the boost clocks will not budge.


----------



## ilmazzo

ToTheSun! said:


> If you have a good BIOS, watercool below the temperature threshold for Turing, and set performance to max in the NVCP, the boost clocks will not budge.


lol

ok

seems legit

no effort at all, "just install and forget"


----------



## ToTheSun!

ilmazzo said:


> lol
> 
> ok
> 
> seems legit
> 
> no effort at all, "just install and forget"


It is install and forget, especially if you buy an AIO model that's not severely power limited. Mine boosts well above spec without budging; it only throttles because it's overclocked a little too much.


----------



## 0razor1

*He's right you know*



ilmazzo said:


> lol
> 
> ok
> 
> seems legit
> 
> no effort at all, "just install and forget"


Cause I put a G12 + H80i v2 on a 2070 and it hummed at 1890 MHz no matter what I threw at it. Nothing overclocked.


----------



## ilmazzo

why did you water cool it?


----------



## ToTheSun!

ilmazzo said:


> why did you water cool it?


Better temps (=better OC and/or clock stability) and less noise, if I had to hazard a guess.


----------



## ilmazzo

ToTheSun! said:


> Better temps (=better OC and/or *clock stability*) and less noise, if I had to hazard a guess.


perfect

"Well, meeting adjourned, gentlemen."


----------



## 113802

ilmazzo said:


> What?
> 
> Did I really just read that Nvidia cards' boost algorithm keeps cards steady on frequency without effort from the user?
> 
> Are we serious? This is becoming a joke...


Where did anyone state it's without effort? Keep it below 55°C and the card will stay at its highest boost clock. Vega 20 and Navi 10 clocks go all over the place even when thermals and power aren't a problem.


----------



## rdr09

WannaBeOCer said:


> Where did anyone state without effort? Keep it below 55c and the card will stay at the highest boost clock. Vega 20 and Navi 10 clocks go all over the place even when thermals and power aren't a problem.


You've seen a NAVI tested with a waterblock?


----------



## ilmazzo

Nvidia's boost algorithm cuts frequency a step for every 10°C increase, btw.

AMD has 64 temperature sensors (the hotspot is the max across all of them), while Nvidia is in the stone age regarding temp sensors, so AMD's boost algorithm is more temperature-sensitive because the reported temps are waaaaaay wider-ranging than Nvidia's... and never mind that this is the 4th implementation of the boost algorithm on green cards.

Anyway, this is not a problem at all, so why am I bothering?

cheers
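
A tiny illustration of the hotspot point above: the junction/"hotspot" temperature is just the maximum over the die's sensor grid, so it's always at least as high as any single-point "edge" reading. The 64-sensor count and the 110°C throttle point come from this thread; the temperature values themselves are randomly generated:

```python
import random

random.seed(7)
# Fake 64-sensor grid, matching the sensor count mentioned above (values in °C)
sensors = [random.uniform(60, 105) for _ in range(64)]

edge_temp = sensors[0]   # stand-in for a single-point "edge" reading
hotspot = max(sensors)   # junction temp = hottest sensor on the die

# Vega 20 / Navi reportedly throttle on junction temperature at 110 °C
throttling = hotspot >= 110

assert hotspot >= edge_temp
assert not throttling    # all sampled values here are below 110 °C
```

With many sensors, the max naturally runs well above any single reading, which is one reason the hotspot number looks scarier than the edge temperature people are used to.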


----------



## 113802

rdr09 said:


> You've seen a NAVI tested with a waterblock?



No, but Navi doesn't thermal throttle until the junction temperature hits 110°C, just like the Radeon VII. AMD's boost is load dependent.

You can see in this video that the junction temperature isn't near 110°C:

https://youtu.be/7_GvOe1_UKs



ilmazzo said:


> Nvidia's boost algorithm cuts frequency a step for every 10°C increase, btw.
> 
> AMD has 64 temperature sensors (the hotspot is the max across all of them), while Nvidia is in the stone age regarding temp sensors, so AMD's boost algorithm is more temperature-sensitive because the reported temps are waaaaaay wider-ranging than Nvidia's... and never mind that this is the 4th implementation of the boost algorithm on green cards.
> 
> Anyway, this is not a problem at all, so why am I bothering?
> 
> cheers


AMD's boost is load dependent. I don't want it to be load dependent; just let me set my clocks to the frequency I desire unless temperature or power limits are hit.


----------



## rdr09

ilmazzo said:


> Nvidia's boost algorithm cuts frequency a step for every 10°C increase, btw.
> 
> AMD has 64 temperature sensors (the hotspot is the max across all of them), while Nvidia is in the stone age regarding temp sensors, so AMD's boost algorithm is more temperature-sensitive because the reported temps are waaaaaay wider-ranging than Nvidia's... and never mind that this is the 4th implementation of the boost algorithm on green cards.
> 
> Anyway, this is not a problem at all, so why am I bothering?
> 
> cheers


lol. I want to know before I receive my card. My 290s are steady in any scenario, especially gaming. Do you see a problem with the clocks here?


----------



## rdr09

WannaBeOCer said:


> No but Navi doesn't thermal throttle until the junction temperature hits 110C just like the Radeon VII. AMD's boost is load dependent.
> 
> You can see in this video the junction temperature isn't near 110c:
> 
> https://youtu.be/7_GvOe1_UKs


I've seen it. The GPU clock isn't as steady as on my old 290. The guy didn't mention any issues, though. The way you're describing it, there's a problem.


----------



## looniam

ilmazzo said:


> What?
> 
> Did I really just read that *nvidia cards boost algorithm* keep cards steady on frequency without effort form the user?
> 
> Are we serious? this is becoming a joke......





ilmazzo said:


> why did you water cool it?


Because since GPU Boost 2, Nvidia's algorithm uses temps along with voltage and power limits. :thumb:
It really started with Maxwell; no BS, my 980 Ti won't OC to 1500+ above 34°C.

The arch, with node shrinks (which throw out voltage scaling), makes temps matter the most in OCing. It's the hardest thing to explain to previous AMD users, who just want to dump more voltage.

FWIW, it took a lot of effort to figure all that out.


----------



## 113802

rdr09 said:


> I've seen it. The gpu clock is not that straight as my old 290. The guy did not mention any issues, though. The way you are describing it is there is a problem.


Vega/Navi will always run at the second-from-last power state, usually 50MHz below peak, since AMD considers the highest power state the peak boost. On top of that, the frequency will dip under load while gaming.

You'll see frames dip by a few when the frequency dips.

If power and thermal limits aren't being hit, why does AMD insist on dropping the frequency?
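
Sketching that claim as a hypothetical DPM-style state table (the clock values and the load-to-state mapping are invented; only the "second-from-last state under full load" behavior follows the post):

```python
# Hypothetical DPM-style state table, low clock to high clock (MHz).
# The top state is the advertised "peak boost" that is rarely selected.
P_STATES = [800, 1100, 1400, 1600, 1750, 1850, 1900]

def select_state(load_pct):
    """Pick a power state from load; full load sits one state below peak."""
    if load_pct >= 95:
        return P_STATES[-2]   # second-from-last state, ~50 MHz below peak
    # lighter load maps to proportionally lower states
    idx = min(len(P_STATES) - 2, int(load_pct / 100 * len(P_STATES)))
    return P_STATES[idx]

assert select_state(100) == 1850            # never the 1900 MHz "peak" state
assert select_state(40) < select_state(100) # lighter load -> lower clock
```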


----------



## rdr09

WannaBeOCer said:


> Vega/Navi will always run at the second-from-last power state, usually 50MHz below peak, since AMD considers the highest power state the peak boost. On top of that, the frequency will dip under load while gaming.
> 
> You'll see frames dip by a few when the frequency dips.
> 
> If power and thermal limits aren't being hit, why does AMD insist on dropping the frequency?


In that video, I don't see any of those wild fluctuations running Strange Brigade in the background.

I checked a bench of Division 2. Wouldn't the 1% lows show those dips?


----------



## maltamonk

Let me see if I can get this new bit straight: Nvidia 10xx-and-later cards, when kept below certain thermal limits, will retain boost clocks no matter the load? Navi, on the other hand, has boost clocks that fluctuate with load regardless of thermals?

Does either matter? Does a constant boost affect power draw? Does a fluctuating one? Does either affect performance to a noticeable degree?


----------



## miklkit

Strange Brigade is very well optimized and runs very smoothly. I still play it a lot when I need some mindless bodycount action. 



Overall I have been happy with my AMD GPUs. The 280X is an exception as it performed well when it felt like it but was usually in a power saving mode that had it delivering poor frame rates. I got rid of it ASAP.


The 290X was a good one. It always delivered 100% and it was up to me to take care of temps. The Fury was another great one as it always delivered 100%. This Vega 64 is all over the place and is usually underperforming badly with low power use and frame rates. 16fps with the V64 running just above idle clocks is no fun.


I want to get rid of this thing and now it seems the 5700 is doing the same thing? That's just great.....


----------



## epic1337

maltamonk said:


> Let me see if I can get this new bit straight: Nvidia 10xx-and-later cards, when kept below certain thermal limits, will retain boost clocks no matter the load? Navi, on the other hand, has boost clocks that fluctuate with load regardless of thermals?
> 
> Does either matter? Does a constant boost affect power draw? Does a fluctuating one? Does either affect performance to a noticeable degree?


Fluctuating boost clocks do mean fluctuating power consumption; it's their way of hiding the card's high power draw.
Switching between low and high boost clocks can also affect frametimes, especially when the boost can't clock up fast enough to catch sudden changes in load.


----------



## BradleyW

My STRIX VEGA 64 does this. The higher the load, the lower the clocks, despite thermals being low.


----------



## maltamonk

epic1337 said:


> Fluctuating boost clocks do mean fluctuating power consumption; it's their way of hiding the card's high power draw.
> Switching between low and high boost clocks can also affect frametimes, especially when the boost can't clock up fast enough to catch sudden changes in load.


Not sure I understand what you mean by "hide". That said wasn't there something about bringing frame time down to 1-3ns vs >20ns?


----------



## rdr09

miklkit said:


> Strange Brigade is very well optimized and runs very smoothly. I still play it a lot when I need some mindless bodycount action.
> 
> 
> 
> Overall I have been happy with my AMD GPUs. The 280X is an exception as it performed well when it felt like it but was usually in a power saving mode that had it delivering poor frame rates. I got rid of it ASAP.
> 
> 
> The 290X was a good one. It always delivered 100% and it was up to me to take care of temps. The Fury was another great one as it always delivered 100%. This Vega 64 is all over the place and is usually underperforming badly with low power use and frame rates. 16fps with the V64 running just above idle clocks is no fun.
> 
> 
> I want to get rid of this thing and now it seems the 5700 is doing the same thing? That's just great.....


That's what I am concerned about. Is your Vega underperforming in Strange Brigade? How about Division 2 (if you've played it)? Thanks.

NVM. Saw a vid and I don't see any of the behaviors you guys are describing. The clocks fluctuate, but not enough to cause an fps drop that would make the game unplayable. Could it be a driver issue?


----------



## AlphaC

It's probably Radeon Chill to be honest


----------



## 113802

AlphaC said:


> It's probably Radeon Chill to be honest


It's not Radeon Chill; it's AMD's BIOS on the card.


----------



## epic1337

maltamonk said:


> Not sure I understand what you mean by "hide". That said wasn't there something about bringing frame time down to 1-3ns vs >20ns?


Not when they throttle clocks; whatever load is on the GPU then takes more time to execute.


----------



## miklkit

The only way my V64 is underperforming in SB is that textures keep dropping from DX12 to DX7 levels. Other than that it is very consistent. As near as I can tell this has something to do with vram.



I'm not into shooters so I don't have the Division 2. You want to see frame rate fluctuations? Get Subnautica. FPS is all over the place (under 20 to over 100) with the lowest fps using the least power and the highest fps using the highest power. I see this in other games as well but not nearly as bad.


----------



## rdr09

WannaBeOCer said:


> It's not Radeon Chill. It's AMD's bios on the card.


Ugh. Don't say it needs a BIOS editor.



miklkit said:


> The only way my V64 is underperforming in SB is that textures keep dropping from DX12 to DX7 levels. Other than that it is very consistent. As near as I can tell this has something to do with vram.
> 
> 
> 
> I'm not into shooters so I don't have the Division 2. You want to see frame rate fluctuations? Get Subnautica. FPS is all over the place (under 20 to over 100) with the lowest fps using the least power and the highest fps using the highest power. I see this in other games as well but not nearly as bad.


Thank you. I plan to play D2, so I asked.

Anyways, I'll find out myself.


----------



## ilmazzo

WannaBeOCer said:


> AMD's boost is load dependent. I do not want it to be load dependent just let me set my clocks to the frequency I desire unless temperature or power limits are hit.


the tjunction was below max temp but pl?


----------



## 113802

ilmazzo said:


> the tjunction was below max temp but pl?


My card is way below the power limit; I have it set to 1V @ 1900/1250MHz. It's just the way AMD's boost works.


----------



## keikei

rdr09 said:


> Ugh. Don't say it needs a BIOS editor.
> 
> 
> 
> Thank you. I plan to play D2, so i asked.
> 
> Anyways, I'll find out myself.


TD2 isn't as unstable as Subnautica. Stick with DX11, though. The game is graphically demanding, but deservedly so; it looks amazing.


----------



## Imouto

D2 is Diablo 2!!! Nothing else!

ALSO STOP MAKING ME FEEL OLD!


----------



## rdr09

keikei said:


> TD2 isnt as unstable as Subnautica. Stick with DX11 though. The game is graphically demanding, but deservedly so. Game looks amazing.


Looking forward to it.



Imouto said:


> D2 is Diablo 2!!! Nothing else!
> 
> ALSO STOP MAKING ME FEEL OLD!


Now I'm really looking forward to it!


----------



## The Robot

Imouto said:


> D2 is Diablo 2!!! Nothing else!
> 
> ALSO STOP MAKING ME FEEL OLD!


It's Destiny 2


----------



## PontiacGTX

rdr09 said:


> lol. I want to know before i receive my card. My 290s are steady in any scenario, especially gaming. You see a problem with the clocks here?


AMD has had some form of boost since GCN3, but GCN4+ is totally different from older GCN. For example, if you set Vega to 1600MHz, under load it won't stay at 1600MHz (or even touch it); it will fluctuate (unless it isn't power or temperature limited).



WannaBeOCer said:


> No but Navi doesn't thermal throttle until the junction temperature hits 110C just like the Radeon VII. AMD's boost is load dependent.
> 
> You can see in this video the junction temperature isn't near 110c:
> 
> https://youtu.be/7_GvOe1_UKs
> 
> 
> 
> AMD's boost is load dependent. I do not want it to be load dependent just let me set my clocks to the frequency I desire unless temperature or power limits are hit.


I wonder if there is a way to force AMD cards to hold a constant core clock. Doesn't MSI AB have an option for that?


----------



## rdr09

PontiacGTX said:


> AMD has had some form of boost since GCN3, but GCN4+ is totally different from older GCN. For example, if you set Vega to 1600MHz, under load it won't stay at 1600MHz (or even touch it); it will fluctuate (unless it isn't power or temperature limited).


Does it affect your fps badly when it fluctuates? I'll post this vid again. Do you see it happening in this vid? Around 6 min mark.


----------



## PontiacGTX

rdr09 said:


> Does it affect your fps badly when it fluctuates?


I haven't measured, but I would expect it does. These boost clock and dynamic voltage states seem like a way to save power or stay under a certain temperature limit.

You mean that 50MHz difference? Sometimes I see 150MHz swings on my Vega 56.


----------



## Dmac73

PontiacGTX said:


> AMD has had some form of boost since GCN3, but GCN4+ is totally different from older GCN. For example, if you set Vega to 1600MHz, under load it won't stay at 1600MHz (or even touch it); it will fluctuate (unless it isn't power or temperature limited).
> 
> 
> 
> I wonder if there is a way to force AMD cards to hold a constant core clock. Doesn't MSI AB have an option for that?



You have the force-constant-voltage option, but that's it, I believe. It will still fluctuate. Looking forward to a new AB with 5700 support; I've had a lot of issues since launch, even after fresh installs.

Dmac


----------



## rdr09

PontiacGTX said:


> I havent measure but I would expect it does.. these boost clock and dynamic voltage states seems like a way to save power or stay under certain temperature limit


Check out the vid pls.


----------



## runwiththedevil

Imouto said:


> D2 is Diablo 2!!! Nothing else!
> 
> ALSO STOP MAKING ME FEEL OLD!


 Are people refering D2 to anything else? :thumbsdow burn them at the stake.


----------



## PontiacGTX

rdr09 said:


> Check out the vid pls.


I skipped his overclocking process... but he overclocked the core? because if he did maybe had to increase power limit? he left it at stock if he didnt overclock probably might be a power saving feature or the voltage fluctuation reduced to the power state with the 2nd stable clock speed/power state?


anyway people who do some sort of undervolting or overclocking on these cards with blower fan should increase power limit just in case, aswell check if the other Power states' voltage make any difference to the overall clock speed


----------



## rdr09

PontiacGTX said:


> I skipped his overclocking process... but he overclocked the core? because if he did maybe had to increase power limit? he left it at stock if he didnt overclock probably might be a power saving feature or the voltage fluctuation reduced to the power state with the 2nd stable clock speed/power state?
> 
> 
> 
> 
> anyway people who do some sort of undervolting or overclocking on these cards with blower fan should increase power limit just in case, aswell check if the other Power states' voltage make any difference to the overall clock speed


He lowered the clock to about 1820MHz but he said it was still higher than the game clock of 1700+MHz. Did not touch the Power Limit and kept the Fan curve at a constant 50%. It was using like 130W. Almost same as my GTX 1060. Check out the Orange line (2nd from top).


----------



## bigjdubb

runwiththedevil said:


> Are people refering D2 to anything else? :thumbsdow burn them at the stake.


Kind of makes me laugh. Diablo was the game that put me in the "I don't like RPG's" category and Division was the game that moved me into the "Not all RPG's suck" category.


Either way, Mighty Ducks is the OG D2.


----------



## ilmazzo

The cap should be the p7 in wattman which is the max boost pstate, if it throttles for some mhz down whatever pl is set and used I don't know what to say


----------



## 113802

ilmazzo said:


> The cap should be the p7 in wattman which is the max boost pstate, if it throttles for some mhz down whatever pl is set and used I don't know what to say


It's the way AMD's boost works. Max power state on Vega 10 was 7 and Vega 20 has 8. I don't know about Navi since I don't have one. What ever they changed with Vega 20 carried over to Navi. 

The Radeon VII in the video hits 300w at times but I can assure you my Radeon VII that's 70w below 300w also fluctuates the same.


----------



## PontiacGTX

WannaBeOCer said:


> It's the way AMD's boost works. Max power state on Vega 10 was 7 and Vega 20 has 8. I don't know about Navi since I don't have one. What ever they changed with Vega 20 carried over to Navi.
> 
> The Radeon VII in the video hits 300w at times but I can assure you my Radeon VII that's 70w below 300w also fluctuates the same.
> 
> https://youtu.be/4y8tyDEXNC0


in this video Radeon RX 5700 isnt as fast as most review shows compared to RX Vega 64, that or RX Vega 64 is performing properly


----------



## 113802

PontiacGTX said:


> in this video Radeon RX 5700 isnt as fast as most review shows compared to RX Vega 64, that or RX Vega 64 is performing properly


That RX Vega 64 is performing properly. Stock Vega 64's run between 1400-1500Mhz. This review is showing all the cards overclocked and undervolted. I'm pretty sure my RX Vega 64 I have would beat a RX 5700 XT. Here's a result of my bugged RX Vega 64 LC I was able to sustain 1827/1145Mhz for the entire run without it dipping: https://www.3dmark.com/fs/18270986 

AMD is just leaving performance on the table and I'm not sure why. The cards undervolt beautifully, why am I not allowed to run it at the highest power state 100% of the time when gaming?

Here's further proof a Radeon VII clocks fluctuates and it's at 120w.


----------



## PontiacGTX

WannaBeOCer said:


> That RX Vega 64 is performing properly. Stock Vega 64's run between 1400-1500Mhz. This review is showing all the cards overclocked and undervolted. I'm pretty sure my RX Vega 64 I have would beat a RX 5700 XT. Here's a result of my bugged RX Vega 64 LC I was able to sustain 1827/1145Mhz for the entire run without it dipping: https://www.3dmark.com/fs/18270986
> 
> AMD is just leaving performance on the table and I'm not sure why. The cards undervolt beautifully, why am I not allowed to run it at the highest power state 100% of the time when gaming?
> 
> Here's further proof a Radeon VII clocks fluctuates and it's at 120w.
> 
> https://youtu.be/OEIVn9lk9To


have you tried clockblocker on vega...navi?


----------



## 113802

PontiacGTX said:


> have you tried clockblocker on vega...navi?


No, only Wattman for the Vega 64(it had an option to lock power states which didn't work) and WattTool which also didn't work. What ever AMD did with the bios the peak boost(highest power state) can not be forced to run 100% of the time. For some odd reason flashing a 8GB FE bios on the Vega 64 LC and restarted allowed it to be bugged for a couple restarts. Until someone figures out how get around AMD Secure Technology's were stuck with what AMD provides.


----------



## PontiacGTX

WannaBeOCer said:


> No, only Wattman for the Vega 64(it had an option to lock power states which didn't work) and WattTool which also didn't work. What ever AMD did with the bios the peak boost(highest power state) can not be forced to run 100% of the time. For some odd reason flashing a 8GB FE bios on the Vega 64 LC and restarted allowed it to be bugged for a couple restarts. Until someone figures out how get around AMD Secure Technology's were stuck with what AMD provides.


I will try clockblocker within some days and compare the performance, though I dont know if it will hold the clock speed on vega since it is designed for fiji but the opencl app should work on either
https://www.guru3d.com/files-details/clockblocker-download.html


----------



## ZealotKi11er

This is gathered playing Anthem @ 4K.


----------



## treetops422

PontiacGTX said:


> in this video Radeon RX 5700 isnt as fast as most review shows compared to RX Vega 64, that or RX Vega 64 is performing properly


 You probably want to look at the links in the first post of this page rather then believing some random youtuber. 

https://www.techspot.com/review/1870-amd-radeon-rx-5700/


----------



## 113802

treetops422 said:


> You probably want to look at the links in the first post of this page rather then believing some random youtuber.
> 
> https://www.techspot.com/review/1870-amd-radeon-rx-5700/


They're comparing a stock Vega 64 that runs between 1400-1500Mhz. The video shows both cards undervolted/overclocked. A Vega 64 runs between 1550-1630Mhz uv/oc while a Vega 64 LC runs between 1650-1730Mhz.


----------



## rdr09

ZealotKi11er said:


> This is gathered playing Anthem @ 4K.


.
No data on fps and frametime. Have sauce? Thanks.


----------



## rdr09

WannaBeOCer said:


> It's the way AMD's boost works. Max power state on Vega 10 was 7 and Vega 20 has 8. I don't know about Navi since I don't have one. What ever they changed with Vega 20 carried over to Navi.
> 
> The Radeon VII in the video hits 300w at times but I can assure you my Radeon VII that's 70w below 300w also fluctuates the same.
> 
> https://youtu.be/4y8tyDEXNC0


Thanks for this video. I did not see any drops in any of the gpu clocks. Was thinking of drops to 1000MHz, 800MHz, or even lower. All these cards seem to handle power just fine. The XT was clearly more efficient in this vid. They were all oc'ed. Really curious to the power gating of Navi. The next NAVI should be able to handle 4K 144Hz for sure. +rep.


----------



## Ragsters

Thinking of getting a 5700 xt. Does anyone know where to get one? Trying to get some kinda deal. Maybe a free game or no tax?


----------



## PontiacGTX

https://youtu.be/ycMWmQ-4-ww?t=418
in soviet russia Vsync is faster than Freesync or no-vsync


----------



## ToTheSun!

PontiacGTX said:


> https://youtu.be/ycMWmQ-4-ww?t=418
> in soviet russia Vsync is faster than Freesync or no-vsync


The results are weird, but I'd be willing to accept the discrepancies as a direct consequence of a testing method that's not very robust.

However, the Metro Exodus latencies make absolutely no sense whatsoever. I'd rather see someone else testing this.


----------



## ZealotKi11er

Is there a club for 5700/XT?


----------



## Imouto

ZealotKi11er said:


> Is there a club for 5700/XT?


Peasant GPUs = No club.


----------



## Heuchler

ToTheSun! said:


> The results are weird, but I'd be willing to accept the discrepancies as a direct consequence of a testing method that's not very robust.
> 
> However, the Metro Exodus latencies make absolutely no sense whatsoever. I'd rather see someone else testing this.


The conclusion that they reached had me scratching my head. So it works. Biggest impact at high resolutions but just test at 1080p @240Hz. Pretty sure 3440x1440p gamers would like to know the results.


----------



## treetops422

Ragsters said:


> Thinking of getting a 5700 xt. Does anyone know where to get one? Trying to get some kinda deal. Maybe a free game or no tax?


 Right now all they are offering is 3 months of Xbox game pass for PC. It lets you play like around 100 games for free.



https://www.newegg.com/asrock-radeo...=5700 xt&cm_re=5700_xt-_-14-930-018-_-Product


----------



## rdr09

Heuchler said:


> The conclusion that they reached had me scratching my head. So it works. Biggest impact at high resolutions but just test at 1080p @240Hz. Pretty sure 3440x1440p gamers would like to know the results.


Is this card even meant for 240Hz? You need a high-end card for that much like you need a fast cpu.


----------



## Heuchler

rdr09 said:


> Is this card even meant for 240Hz? You need a high-end card for that much like you need a fast cpu.


That is the problem with the review. Professional eSport gamers already have cards that can do 240 FPS at 1080p. Down playing the latency improvement by Radeon Anti-Lag. I play on older card at 2560x1440 @100Hz and it is very noticeable in Siege. Have a cousin that plays on 3440x1440
and it is even more notable. Hardware Unbox pointing out that Anti-Lag is of the most benefit at 4K then tests at 1080p ?!? 

At least they didn't do 640x480 test I guess.


----------



## rdr09

Heuchler said:


> That is the problem with the review. Professional eSport gamers already have cards that can do 240 FPS at 1080p. Down playing the latency improvement by Radeon Anti-Lag. I play on older card at 2560x1440 @100Hz and it is very noticeable in Siege. Have a cousin that plays on 3440x1440
> and it is even more notable. Hardware Unbox pointing out that Anti-Lag is of the most benefit at 4K then tests at 1080p ?!?
> 
> At least they didn't do 640x480 test I guess.


Would be nice if they do the same test on the next NAVI.


----------



## treetops422

PontiacGTX said:


> https://youtu.be/ycMWmQ-4-ww?t=418
> in soviet russia Vsync is faster than Freesync or no-vsync


I guess that video for the 1% of people who have monitor over 144hz? That likely game at 720p or lower to get 240 fps?


----------



## nolive721

i am always doubtful in these benchmark about the level of VEGA64 

because if thats the GPU set out of the box of course its going to perform badly considering the crazy default Vcore AMD has defined for the thing


----------



## Fediuld

PontiacGTX said:


> in soviet russia Vsync is faster than Freesync or no-vsync


There is no Soviet Russia for over 25 years now........ except if you sleeping in some cave.


----------



## ToTheSun!

Fediuld said:


> There is no Soviet Russia for over 25 years now........ except if you sleeping in some cave.


There's no Russia, period.


----------



## PontiacGTX

Fediuld said:


> There is no Soviet Russia for over 25 years now........ except if you sleeping in some cave.


https://knowyourmeme.com/memes/in-soviet-russia

the meme sometimes is used when thing are done or happen backwards


----------



## Ragsters

treetops422 said:


> Right now all they are offering is 3 months of Xbox game pass for PC. It lets you play like around 100 games for free.
> 
> 
> 
> https://www.newegg.com/asrock-radeo...=5700 xt&cm_re=5700_xt-_-14-930-018-_-Product


This is what I ended up getting. Thanks!


----------



## PontiacGTX

Ragsters said:


> This is what I ended up getting. Thanks!


maybe you should have gone with MSI, their warranty is better
https://www.newegg.com/msi-radeon-r...=5700 xt&cm_re=5700_xt-_-14-137-432-_-Product


----------



## Hwgeek

Heuchler said:


> That is the problem with the review. Professional eSport gamers already have cards that can do 240 FPS at 1080p. Down playing the latency improvement by Radeon Anti-Lag. I play on older card at 2560x1440 @100Hz and it is very noticeable in Siege. Have a cousin that plays on 3440x1440
> and it is even more notable. Hardware Unbox pointing out that Anti-Lag is of the most benefit at 4K then tests at 1080p ?!?
> 
> At least they didn't do 640x480 test I guess.


I also found out their review stupid, AMD presented this feture to reduce Lag on 60~90 FPS range where GPU is the bottleneck and they go and test Fastest 240Hz monitor at over 200FPS @1080P LOL.
Since even old AMD GPU's support it they should tested it with Polaris 580 for example 1080P so it would be more interesting if this Free new Feature can help Non Rich Gamers with 240Hz/RTX 2080Ti/9900K.


----------



## PontiacGTX

Hwgeek said:


> I also found out their review stupid, AMD presented this feture to reduce Lag on 60~90 FPS range where GPU is the bottleneck and they go and test Fastest 240Hz monitor at over 200FPS @1080P LOL.
> Since even old AMD GPU's support it they should tested it with Polaris 580 for example 1080P so it would be more interesting if this Free new Feature can help Non Rich Gamers with 240Hz/RTX 2080Ti/9900K.


PCGH testing had shown Nvidia latency compared to anti lag as higher , even without anti lag it was higher not sure what you mean with the last sentence, if nvidia cant improve that maybe they should introduce a similar feature?


----------



## Ragsters

PontiacGTX said:


> maybe you should have gone with MSI, their warranty is better
> https://www.newegg.com/msi-radeon-r...=5700 xt&cm_re=5700_xt-_-14-137-432-_-Product


Where do you see that? I see limited 3 year warranty for both.


----------



## PontiacGTX

Ragsters said:


> Where do you see that? I see limited 3 year warranty for both.


Well at least MSI allow you can replace the cooler-repaste, I dont know asrock, but some other AIB partner would void the warranty


----------



## Ragsters

PontiacGTX said:


> Well at least MSI allow you can replace the cooler-repaste, I dont know asrock, but some other AIB partner would void the warranty


Yeah, Asrock is new to the graphic card world which is why I chose them. I love Asrock and have had great experience with their customer support when dealing with their motherboards.


----------



## PontiacGTX

Ragsters said:


> Yeah, Asrock is new to the graphic card world which is why I chose them. I love Asrock and have had great experience with their customer support when dealing with their motherboards.


their warranty terms
"AUTHORIZED DISTRIBUTOR ONLY Manufacturer's warranty will be null and void if products are *modified*, damaged or otherwise *tampered with*, for example, the outer case is opened or *additional optional parts/components are installed/removed*."

https://www.asrock.com/support/index.asp?cat=Policy

so yeah still I see MSI has better warranty,while they state the same they arent as strict, we dont know if Asrock would be less strict than most brands fulfilling the warranty terms


----------



## Hwgeek

PontiacGTX said:


> PCGH testing had shown Nvidia latency compared to anti lag as higher , even without anti lag it was higher not sure what you mean with the last sentence, if nvidia cant improve that maybe they should introduce a similar feature?


I meant to say that HU should have tested if this future helps regular gamer's that cannot efford super low Input lags because they don't have 240Hz/RTX2080Ti/9900K Gaming PC's, this new feture is free for most of the AMD gpu owners so they shpuld have included RX 580 for example and show if we have better chance to compete vs high end PC Gamers,


----------



## PontiacGTX

Hwgeek said:


> I meant to say that HU should have tested if this future helps regular gamer's that cannot efford super low Input lags because they don't have 240Hz/RTX2080Ti/9900K Gaming PC's, this new feture is free for most of the AMD gpu owners so they shpuld have included RX 580 for example and show if we have better chance to compete vs high end PC Gamers,



PCGH tested RX 590 in PUGB and there was some minimal gain from anti lag


----------



## Ragsters

PontiacGTX said:


> their warranty terms
> "AUTHORIZED DISTRIBUTOR ONLY Manufacturer's warranty will be null and void if products are *modified*, damaged or otherwise *tampered with*, for example, the outer case is opened or *additional optional parts/components are installed/removed*."
> 
> https://www.asrock.com/support/index.asp?cat=Policy
> 
> so yeah still I see MSI has better warranty


Well.. MSI exclusion are pretty much identical

https://us.msi.com/page/warranty
"Unauthorized changes of non MSI parts, modifications or alterations , parts removal in or to the products"


----------



## PontiacGTX

Ragsters said:


> Well.. MSI exclusion are pretty much identical
> 
> https://us.msi.com/page/warranty
> "Unauthorized changes of non MSI parts, modifications or alterations , parts removal in or to the products"


well I mean MSI doesnt seem to enforce that clause as often, at least reading their forum, also read 
post #6
https://forum-en.msi.com/index.php?topic=302869.0
post #4
https://forum-en.msi.com/index.php?topic=266114.0
https://forum-en.msi.com/index.php?topic=301153.0

but if ASrock has similar policy then...it should be fine otherwise if they dont have similar policy as MSI,then I would take MSI specially if you plan to keep it long term


----------



## ZealotKi11er

I would choose ASRock over MSI because they are AMD only.


----------



## AlphaC

https://www.forbes.com/sites/jasone...arly-linux-gaming-benchmarks-on-ubuntu-18-04/
Radeon 5700 XT performance on Ubuntu 18.04LTS



https://www.techspot.com/article/1879-amd-radeon-anti-lag/
Anti-lag needs experimentation on settings


----------



## JackCY

What experimentation when there are not setting to change for it. All it does is alter the CPU pre render to be shorter. The results are as expected, lower latency but also slightly lower performance because the "queue" is shorter.


----------



## Heuchler

AlphaC said:


> https://www.forbes.com/sites/jasone...arly-linux-gaming-benchmarks-on-ubuntu-18-04/
> Radeon 5700 XT performance on Ubuntu 18.04LTS



In his previous review [Ubuntu-based] Pop!_OS 19.04 vs Win 10 it was interesting to see F1 2018, Dirt Rally and Total War: Three Kingdoms being a win and Strange Brigade being a tie for Pop OS
https://www.forbes.com/sites/jasone...-about-linux-gaming-performance/#66ea0a795e74


----------



## ilmazzo

Heuchler said:


> AlphaC said:
> 
> 
> 
> https://www.forbes.com/sites/jasone...arly-linux-gaming-benchmarks-on-ubuntu-18-04/
> Radeon 5700 XT performance on Ubuntu 18.04LTS
> 
> 
> 
> 
> In his previous review [Ubuntu-based] Pop!_OS 19.04 vs Win 10 it was interesting to see F1 2018, Dirt Rally and Total War: Three Kingdoms being a win and Strange Brigade being a tie for Pop OS
> https://www.forbes.com/sites/jasone...-about-linux-gaming-performance/#66ea0a795e74
Click to expand...

The more i read about strange brigade the more I like how they made it....all aaa games should be done like that one from a programming/optimization pov


----------



## Blackops_2

Couldn't go past 2100 on the core but that's still an 18% OC. Right behind the 1080Ti/2080. With the registry modifications he could increase the core to 2300 but couldn't achieve stability on anything above 2100. I have to say, i really hope a lightening version of this card is coming $400-450 1080Ti performance sounds wonderful even if years late. Also hope the 5800 is right around the corner. Gains are a little disappointing considering the high OC, i wonder why that is?


----------



## Hwgeek

Who did he got 100W increase in power usage while Igor got only 35w at 2.2Ghz?


----------



## treetops422

Hwgeek said:


> Who did he got 100W increase in power usage while Igor got only 35w at 2.2Ghz?


Tomshardware Germany showed while it can go higher there was no need to at 2.2. I don't think Hardware Unboxed did it right or they have low end lottery.


----------



## Newbie2009

Blackops_2 said:


> Couldn't go past 2100 on the core but that's still an 18% OC. Right behind the 1080Ti/2080. With the registry modifications he could increase the core to 2300 but couldn't achieve stability on anything above 2100. I have to say, i really hope a lightening version of this card is coming $400-450 1080Ti performance sounds wonderful even if years late. Also hope the 5800 is right around the corner. Gains are a little disappointing considering the high OC, i wonder why that is?
> 
> https://www.youtube.com/watch?v=xYFB3Z0RpLU


My fanboy edition does 2100mhz @ 200w, power slide left at zero.


----------



## Newbie2009

Blackops_2 said:


> Couldn't go past 2100 on the core but that's still an 18% OC. Right behind the 1080Ti/2080. With the registry modifications he could increase the core to 2300 but couldn't achieve stability on anything above 2100. I have to say, i really hope a lightening version of this card is coming $400-450 1080Ti performance sounds wonderful even if years late. Also hope the 5800 is right around the corner. Gains are a little disappointing considering the high OC, i wonder why that is?
> 
> https://www.youtube.com/watch?v=xYFB3Z0RpLU


It’s likely clocked higher than the sweet spot at stock so gains aren’t great.

Memory overclocking seems totally fubar


----------



## Heuchler

I think you correct. Igor Wallossek has been doing quality reviews for Tom Hardware DE since 2010. Most likely he did work before that as well.
He broke the story that RX 480 violated the PCIe spec and had scientific testing to proof his claim.


Hardware Unboxed one of the many youtuber's that claimed Win10 1903 Ryzen aware schedule improvements would need a new chipset and all the reports of performance gains on the internet pre Ryzen 3000 launch where wrong.
https://youtu.be/5XMWS0_9gNo?t=127

I actually like Steve from Hardware Unboxed but he is no Igor'sLAB. And sometimes Steve is just plain wrong. Test methodology or conclusion. Roman "der8auer" Hartung also has that great test methodology without that techtube emotional drama to his videos.



[3dcenter] Launch analysis AMD Radeon RX 5700 & 5700 XT [661 Tests|5130 Benchmarks]
https://www.3dcenter.org/artikel/la...ch-analyse-amd-radeon-rx-5700-5700-xt-seite-2


Google Translate Link
https://translate.google.com/transl...ch-analyse-amd-radeon-rx-5700-5700-xt-seite-2


----------



## EastCoast

I never rely on or watch videos from HU when steve is the one doing Radeon reviews. In the past I've never seen his reviews to ever be consistent with the reviews I've watched or read from other reviewers.


----------



## Hwgeek

Newbie2009 said:


> It’s likely clocked higher than the sweet spot at stock so gains aren’t great.
> 
> Memory overclocking seems totally fubar


TPU found the memory chips on RX 5700XT are K4Z80325BC-HC14 and on RTX cards they overclock over 1050Mhz, so maybe it's locked for now until AIB cards will show up and will get better reviews thanks to the cooling and CORE/Mem OC headroom?


----------



## Blackops_2

Heuchler said:


> I think you correct. Igor Wallossek has been doing quality reviews for Tom Hardware DE since 2010. Most likely he did work before that as well.
> He broke the story that RX 480 violated the PCIe spec and had scientific testing to proof his claim.
> 
> 
> Hardware Unboxed one of the many youtuber's that claimed Win10 1903 Ryzen aware schedule improvements would need a new chipset and all the reports of performance gains on the internet pre Ryzen 3000 launch where wrong.
> https://youtu.be/5XMWS0_9gNo?t=127
> 
> I actually like Steve from Hardware Unboxed but he is no Igor'sLAB. And sometimes Steve is just plain wrong. Test methodology or conclusion. Roman "der8auer" Hartung also has that great test methodology without that techtube emotional drama to his videos.


Roman "der8auer" is probably my favorite reviewer when he does put out videos. Steve mentioned he suspected the GPU was memory starved but again could do nothing on the memory. Hopefully it's just locked because the headroom is there on the core. I like GN and HW unboxed but they're two very different channels in a lot of respects. Honestly find myself liking "not an apple fan" also lol. Might just be his rants with an irish accent but he's pretty good. Little Red leaning but i find myself doing the same these days.


----------



## Imouto

I miss good ol' written reviews.


----------



## 113802

Imouto said:


> I miss good ol' written reviews.


https://www.tomshardware.com/news/nvidia-rtx-gpus-worth-the-money,37689.html


----------



## AlphaC

Igor's one of my favorite reviewers because he tears down the cards , uses a proper Infrared device and not a $200 one, and does a sound spectrum graph (with mention of coil whine/electrical noise).
https://www.igorslab.media/amd-rade...los-auf-21-ghz-uebertaktet-wasser-sei-dank/3/

https://www.igorslab.media/amd-rade...n-vega-und-fast-2-1-ghz-takt-unter-wasser/13/



For example for RTX 4000 , he showed 3000RPM fan speeds with min 1500RPM: https://www.igorslab.media/nvidia-q...ger-mit-ueberraschender-leistung-igorslab/11/


How many people review Quadros for noise? Probably none other than him.


----------



## magnek

AlphaC said:


> Igor's one of my favorite reviewers because he tears down the cards , uses a proper Infrared device and not a $200 one, and does a sound spectrum graph (with mention of coil whine/electrical noise).
> https://www.igorslab.media/amd-rade...los-auf-21-ghz-uebertaktet-wasser-sei-dank/3/
> 
> https://www.igorslab.media/amd-rade...n-vega-und-fast-2-1-ghz-takt-unter-wasser/13/
> 
> 
> 
> For example for RTX 4000 , he showed 3000RPM fan speeds with min 1500RPM: https://www.igorslab.media/nvidia-q...ger-mit-ueberraschender-leistung-igorslab/11/
> 
> 
> How many people review Quadros for noise? Probably none other than him.


Unfortunately mein Deutsch ist schlecht, and I'm assuming Igor does not sprechen Englisch. For written reviews I can Google translate it and get most of it, but video reviews I'm forced to just stare at graphs lol.


----------



## Heuchler

Google Translate does a decent and sometimes hilarious job.

The content is in German but English subtitles are available for his youtube content.


Just looking over the section on audio on his MSI MEG Z390 Ace review one can see they quality of work. 
https://www.igorslab.media/msi-meg-z390-ace-im-test-nicht-ganz-godlike-aber-recht-solide-igorslab/2/


"frequency salad" from Einstreuung (elektromagnetisch Einstreuung) is a world salad in book. Well it made me laugh.
https://translate.google.com/transl...ht-ganz-godlike-aber-recht-solide-igorslab/2/


----------



## Newbie2009

Has anyone else run into an issue with manual fan curve? Seems I’m stable in auto fans but manual fan curve it green screens A LOT


----------



## maltamonk

This is what I was waiting to see in a sense.


----------



## 113802

https://www.techspot.com/review/1883-overclocking-radeon-rx-5700/

lol



maltamonk said:


> This is what I was waiting to see in a sense.


Why would you want to see a review of two completely different systems? 

Same Youtube channel using the same system:


----------



## maltamonk

So I could compare it to ones like the one you linked. Most reviews are 5700xt using 9900k or a 3600 with a 2080ti.


----------



## rdr09

WannaBeOCer said:


> https://www.techspot.com/review/1883-overclocking-radeon-rx-5700/
> 
> lol


From that review. Not bad for 400$. BFV 1440.


----------



## Darklyric

PontiacGTX said:


> Well at least MSI allow you can replace the cooler-repaste, I dont know asrock, but some other AIB partner would void the warranty


I don't think they can do that in the us... yet. Right to repair and what not. The stickers are for some other countries iirc.


----------



## 113802

rdr09 said:


> From that review. Not bad for 400$. BFV 1440.


You're suppose to cherry pick Forza not BFV. Anyway, the RTX 2070 Super and 2080 Super gains around 8% from overclocking as well.


----------



## rdr09

WannaBeOCer said:


> You're suppose to cherry pick Forza not BFV. Anyway, the RTX 2070 Super and 2080 Super gains around 8% from overclocking as well.


I did not include the oc'ed XT. It was right above the Super.


----------



## 113802

rdr09 said:


> I did not include the oc'ed XT. It was right above the Super.


I'm aware of where it was at, the 5700 XT was 2 FPS below and 2 FPS above the RTX 2080 Ti on Forza Horizon 4 when overclocked.


----------



## AlphaC

I don't know why people are having so much discussion about a garbage blower, I bet the sentiment will be much different once better drivers come around that aren't designed around compatability with GCN. In addition, next month or September we should be getting Sapphire Nitro+ or Toxic. 

Also if you want an apples to apples comparison you'd need to do something like MSI 2070 Super/non-Super Gaming X vs MSI RX 5700XT Gaming X (or ASUS STRIX/Gigabyte AORUS) instead of this hackjob comparison guesstimate.


The only results really worth looking at are Igor's results on water.


----------



## Heuchler

Igor's LAB amicable divorce from Tom's Hardware is final now and Igor's LAB can now have none-german content.
All major things should be covered in English now. 



AlphaC said:


> I don't know why people are having so much discussion about a garbage blower, I bet the sentiment will be much different once better drivers come around that aren't designed around compatability with GCN. In addition, next month or September we should be getting Sapphire Nitro+ or Toxic.
> 
> Also if you want an apples to apples comparison you'd need to do something like MSI 2070 Super/non-Super Gaming X vs MSI RX 5700XT Gaming X (or ASUS STRIX/Gigabyte AORUS) instead of this hackjob comparison guesstimate.
> 
> 
> The only results really worth looking at are Igor's results on water.




"better drivers come around that aren't designed around compatability with GCN" aka 'FineWine' (for folks that spend their days on youtube instead of hardware enthusiast forums...pretty sure that comment is going to bit me in the ASUS someday).


----------



## AlphaC

The FineWine meme is lame for everything from Tonga to Vega as all were GCN at their core. RDNA is a new architecture , it doesn't even have [email protected] support right now due to OpenCL being broken.


I don't know how people are taking blower results seriously when LinusTechTips stalled their RX5700XT and TimmyJoe on Youtube actually artifacted on his Anniversary edition without overclocking.


----------



## maltamonk

AlphaC said:


> I don't know why people are having so much discussion about a garbage blower, I bet the sentiment will be much different once better drivers come around that aren't designed around compatability with GCN. In addition, next month or September we should be getting Sapphire Nitro+ or Toxic.
> 
> Also if you want an apples to apples comparison you'd need to do something like MSI 2070 Super/non-Super Gaming X vs MSI RX 5700XT Gaming X (or ASUS STRIX/Gigabyte AORUS) instead of this hackjob comparison guesstimate.
> 
> 
> The only results really worth looking at are Igor's results on water.


Wouldn't you do it vs the 2060s though?


----------



## AlphaC

The RTX 2060 Super costs as much as an RTX 2070 and has fewer CUDA cores, as well as fewer RT/tensor cores.


----------



## 113802

AlphaC said:


> I don't know why people are having so much discussion about a garbage blower, I bet the sentiment will be much different once better drivers come around that aren't designed around compatability with GCN. In addition, next month or *September we should be getting Sapphire Nitro+ or Toxic.*
> 
> Also if you want an apples to apples comparison you'd need to do something like MSI 2070 Super/non-Super Gaming X vs MSI RX 5700XT Gaming X (or ASUS STRIX/Gigabyte AORUS) instead of this hackjob comparison guesstimate.
> 
> 
> The only results really worth looking at are Igor's results on water.


The card is power hungry at high frequencies, just like pretty much every architecture run out of spec. The question is how much those AIB cards will cost. I bet they'll be between $450 and $480, at which point it might be better to just buy an RTX 2070 Super FE.


----------



## maltamonk

Yes. 'Tis the reason the comparison needs to be done vs the 2060s and not the 2070s. Ofc that all changes if the AIB cards come out >$100 over ref.

Edit: @ WannaBeOCer I thought the 2070s was a 2070 replacement? If that's the case, that option won't be available for too long.


----------



## 113802

maltamonk said:


> Yes. Tis the reason the comparison needs to be done vs the 2060s and not the 2070s. Ofc that all changes if the aib come out >$100 over ref.
> 
> Edit: @ WannaBeOCer I thought the 2070s was a 2070 replacement? If that's the case , that option won't be available for too long.


If someone doesn't care about noise or overclocking, get an RX 5700 XT. People are already buying them, replacing the heatsink with $50-$75 aftermarket coolers, and losing their warranty. At that point they should have just bought an RTX 2070 Super.


----------



## AlphaC

It's naive to think that the card isn't less leaky with a better cooler; lower temperatures mean less leakage current. Nothing to do with Sapphire overclocking it.


----------



## maltamonk

WannaBeOCer said:


> If someone doesn't care about noise or overclocking get a RX 5700 XT. People are already buying them and replacing the heatsink with $50-$75 heatsinks and losing their warranty. At that point they should of just bought a RTX 2070 Super.


No one should be buying the ref design unless using water or for niche use cases, even though most power/noise/heat complaints have been blown out of proportion.

That said, at +$50-75 they are still cheaper than the 2070s. I guess what I'm getting at is: why has the tier been moved up to the 2070s when the real tier is the 2060s?


----------



## 113802

maltamonk said:


> No one should be buying the ref design unless using water or for niche case scenarios, even though most power/noise/heat references have been blown out of proportion.
> 
> That said at +$50-75 they are still cheaper than the 2070s. I guess what I'm getting at is why has the tier been moved up to the 2070s when the real tier is vs the 2060s?


The $25-$50 saved isn't worth the work and losing the card's warranty. 

Again if you are fine with a blower get it. It's cheaper than the RTX 2070 Super.


----------



## maltamonk

Can you answer the question? Why has the 5700xt been moved up a pricing tier?


----------



## 113802

maltamonk said:


> Can you answer the question? Why has the 5700xt been moved up a pricing tier?


How many times do I have to say it? The cooling solution on the reference 5700 XT is trash.


----------



## treetops422

maltamonk said:


> Can you answer the question? Why has the 5700xt been moved up a pricing tier?


Wannabe is a super Nvidia fanboy... The "trash cooler" still beats the 2060 Super by 10%+ at the same price. Look at the links at the start of this topic.


----------



## maltamonk

WannaBeOCer said:


> How many times do I have to say it? The cooling solution on the reference 5700 XT is trash.


Ughh hate to break this to you, but that's not a good reason.


----------



## 113802

maltamonk said:


> Ughh hate to break this to you, but that's not a good reason.


It's the reason we pay extra for AIB cards.


----------



## ZealotKi11er

WannaBeOCer said:


> How many times do I have to say it? The cooling solution on the reference 5700 XT is trash.


It is not trash. You cannot say the cooler is trash just because you can't OC. It's more than enough for a 180W TDP.


----------



## 113802

ZealotKi11er said:


> It is not trash. You can not say the cooler is trash because you cant OC. It's more than enough for 180W TDP.


It's trash because it's around 52dBA while the RTX 2070 Super is around 42dBA and you can get a RTX 2070 Super Ventus for $510 which is around 35dBA.
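Since the argument here hinges on those dBA figures, it helps to remember the scale is logarithmic: a difference of X dB corresponds to a sound-pressure ratio of 10^(X/20), and roughly every 10 dB reads as a doubling of perceived loudness. A quick sketch using the numbers quoted above (they vary between reviews, so treat them as rough):

```python
def spl_ratio(delta_db: float) -> float:
    """Sound-pressure ratio corresponding to a dB difference: 10^(dB/20)."""
    return 10 ** (delta_db / 20)

# Figures quoted above: reference blower ~52 dBA, RTX 2070 Super ~42 dBA,
# 2070 Super Ventus ~35 dBA. Review-dependent, so rough numbers only.
for quiet_card_dba in (42, 35):
    ratio = spl_ratio(52 - quiet_card_dba)
    print(f"52 dBA vs {quiet_card_dba} dBA: ~{ratio:.1f}x the sound pressure")
```

By that rule of thumb, a 10 dBA gap is roughly twice as loud to the ear, which is why the blower keeps coming up in every review.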


----------



## maltamonk

WannaBeOCer said:


> It's the reason we pay extra for AIB cards.


The cooler is acceptable when compared to its tier. It only becomes unacceptable when you try to compare it to the next tier up. Granted, I'm not one for ref coolers, but I understand my dislike for them is not justification to pit the card against the next tier up.

Saying it's hotter/louder/consumes more power than the ref 2060s is all fair. By the same token, saying it performs better than the ref 2060s is also fair. 

Once the AIB models come out... if... big if... they cost enough to move their pricing tier up into the 2070s range, then by all means compare them as equivalents.


----------



## 113802

maltamonk said:


> The cooler is acceptable when compared to its tier. It only becomes unacceptable when you try to compare it to the next tier up. Granted I'm not one for ref coolers, but I understand my dislike for them is not justification to pit the card against the next tier up.
> 
> Saying it's hotter/louder/consumes more power than the ref 2060s is all fair. By the same token saying it performs better than the ref 2060s is also fair.
> 
> Once the aib models come out...if...big if.... they cost enough to move their pricing tier up to be in the 2070s range then, by all means compare them as equivalents.


The only reason it's priced against the RTX 2060 Super is the blower. Wait until AIB cards are out and they will be competing with the RTX 2070 Super. 

Like I said multiple times, if you are fine with the reference cooler, you are getting a bargain. If someone plans on buying an RX 5700 XT just to replace the heatsink with one they have to purchase, they are dumb.


----------



## maltamonk

WannaBeOCer said:


> The only reason it's priced against a RTX 2060 Super is due to the blower. Wait until AIB cards are out and they will be competing with the RTX 2070 Super.
> 
> Like I said multiple times, if you are fine with the reference cooler you are getting a bargain. If they plan on just buying a RX 5700 XT just to replace the heatsink that they have to purchase they are dumb.


And why wouldn't you compare it to the 2060s AIB models at that point? I doubt all AIB models will add $75-100. Heck, even the Strix 2060s is only $470.

That said, if people already have custom cooling from previous cards, I don't think I'd consider them "dumb" for buying a ref model with the intention of using it. Generalized statements, however....


----------



## 113802

maltamonk said:


> And why wouldn't you compare it to the 2060s aib models at that point? I doubt all aib models will add $75-100. Heck even the strix 2060s is only $470.
> 
> That said if ppl already have custom cooling from previous cards, I don't think I'd considering them "dumb" for buying a ref model with the intention of using it. Generalized statements however....


Learn to re-read before assuming I made a generalized statement.


----------



## maltamonk

WannaBeOCer said:


> Learn to re-read before assuming I made a generalized statement.


You said they will compete against the 2070s... that's generalized when AIB models range in price.

I admit I wrote that poorly; it was put in the wrong place and taken as meaning you need to buy a cooler.


----------



## 113802

maltamonk said:


> You said they will compete against the 2070s....that's generalized when aib models range in price.
> 
> I admit I wrote that poorly as it was put into the wrong place as was taken for needing to buy a cooler.


I see them being priced that high since I believe they'll be competing against the RX 5700 XT Anniversary Edition.


----------



## maltamonk

WannaBeOCer said:


> I see them being priced that high since I believe they'll be competing against the RX 5700 XT anniversary edition.


Now that I can understand, as we have precedent for it with the Founders Editions and the AIBs that followed.


----------



## Heuchler

WannaBeOCer said:


> It's trash because it's around 52dBA while the RTX 2070 Super is around 42dBA and you can get a RTX 2070 Super Ventus for $510 which is around 35dBA.


Currently unavailable. 

Coming Soon

Expected availability: Aug 16, 2019



So, I'm confused: can you get this card for $510 right now?

One reason to get a reference card is that it's easy to find a full-cover water block. And not everybody that overclocks cares about warranty. This is an overclocking forum, and some manufacturers will say overclocking voids the warranty.


----------



## 113802

Heuchler said:


> WannaBeOCer said:
> 
> 
> 
> It's trash because it's around 52dBA while the RTX 2070 Super is around 42dBA and you can get a RTX 2070 Super Ventus for $510 which is around 35dBA.
> 
> 
> 
> Currently unavailable.
> 
> Coming Soon
> 
> Expected availability: Aug 16, 2019
> 
> 
> 
> So, I'm confused: can you get this card for $510 right now?
> 
> One reason to get a reference card is that it's easy to find a full-cover water block. And not everybody that overclocks cares about warranty. This is an overclocking forum, and some manufacturers will say overclocking voids the warranty.

Just shows how great the RTX 2070 Ventus is. 

I didn't know that was a reason to buy a reference card. Thank you very much for educating me.

Or you can be a logical person instead of spending $550 on a card when an RTX 2070 Super barely benefits from water cooling.


----------



## Heuchler

WannaBeOCer said:


> Just shows how great the RTX 2070 Ventus is.
> 
> I didn't know that was a reason to buy a reference card. Thank you very much for educating me.
> 
> Or you can be a logical person instead of spending $550 for a card that barely benefits from water cooling on a RTX 2070 Super.



Posting Gamers Nexus' "buy my $80 anti-static mat" really makes me see the logic behind you posting the same thing for the last 10 pages.

Complain about a noisy PC but don't like water cooling. 

Any reviews on the MSI RTX 2070 SUPER VENTUS OC? Nope, because it hasn't been released yet. It might be a good product, but it isn't available for sale, just like board partner RX 5700 XT cards.


So to save your reply to my reply: upcoming non-reference cards will be out in a few weeks [fill in maker and model of your choice]. Wait and see for results.


----------



## 113802

Heuchler said:


> Posting Gamers Nexus' "buy my $80 anti-static mat" really makes me see the logic behind you posting the same thing for the last 10 pages.
> 
> Complain about a noisy PC but don't like water cooling.
> 
> Any reviews on the MSI RTX 2070 SUPER VENTUS OC? Nope, because it hasn't been released yet. It might be a good product, but it isn't available for sale, just like board partner RX 5700 XT cards.
> 
> 
> So to save your reply to my reply: upcoming non-reference cards will be out in a few weeks [fill in maker and model of your choice]. Wait and see for results.


Yes I hate watercooling, I'd never touch it.


----------



## Heuchler

That is fine. Personal preference. Even just undervolting [UV] can bring major performance gains. The other side of the coin is overvolting too much and harming performance.


5700 XT vs Radeon VII vs Vega 64 vs Vega 56 Undervolted & Overclocked @1440p using 3900X






5700 XT UV vs Vega 64 UV using 3900X @ 4.425Ghz CCD oc @1080P


----------



## Blackops_2

WannaBeOCer said:


> Yes I hate watercooling, I'd never touch it.


*looks at sig* *looks for sarcasm tag*

I do hope something changes with the AIB cards. Disappointing to see this kind of headroom and no significant gains.


----------



## rdr09

WannaBeOCer said:


> Yes I hate watercooling, I'd never touch it.


Don't start. Watercooling is addicting. And 50 dB is fine. Does the refrigerator bother you when you grab that beer from it? Doubt it. The AC puts out 60 dB and doesn't bother me. Three other gamers in the house and we all use cans.

Being so close to the 2080 Super, a $700 card, in BFV at 1440p does not bother me at all! lol


----------



## 113802

rdr09 said:


> Don't start. Watercooling is addicting. And 50 dB is fine. Does the refrigerator bother you when you grab that beer in it? Doubt it. The ac puts out 60, does not bother me. Three other gamers in the house and we all use cans.
> 
> To be so close to the 2080 Super, a 700$ card, in BFV at 1440 does not bother me at all!lol


If the fridge was right next to me I would have a problem with it. There's a reason why I undervolted my CPU/GPU heavily. The loudest thing in my system is the coil whine of my video card. 

If the card performs extraordinarily well in a game that you play often, go for it.


----------



## rdr09

WannaBeOCer said:


> If the fridge was right next to me I would have a problem with it. There's a reason why I undervolted my CPU/GPU heavily. The loudest thing in my system is the coil whine of my video card.
> 
> If the card performs extraordinary in a game that you play often go for it.


Coil whine? That's a totally diff issue. You need noise cancelling with that. Loudest thing in my sig is the pump.


----------



## miklkit

Interesting conversation. I'm unhappy with my Vega 64 due to some sort of memory problem that I do not have the knowledge to fix. The 5700XT looks to be its replacement. I have never bought a blower card and have also never used a warranty. 



What interests me is that Arctic Cooling makes a cooler that just bolts onto the 5700XT. I have a soft spot in me head for Arctic and now want an excuse to get that Arctic Cooler. It's a win/win!


----------



## 113802

miklkit said:


> Interesting conversation. I'm unhappy with my Vega 64 due to some sort of memory problem that I do not have the knowledge to fix. The 5700XT looks to be its replacement. I have never bought a blower card and have also never used a warranty.
> 
> 
> 
> What interests me is that Arctic Cooling makes a cooler that just bolts onto the 5700XT. I have a soft spot in me head for Arctic and now want an excuse to get that Arctic Cooler. It's a win/win!


Memory problems on Vega are usually due to browser hardware acceleration.


----------



## miklkit

No. Not when browsing, when gaming.


----------



## 113802

miklkit said:


> No. Not when browsing, when gaming.


If you have a browser open when gaming the memory will drop and run at a lower frequency. Close the browser or disable hardware acceleration to prevent memory from running lower when gaming.


----------



## bigjdubb

WannaBeOCer said:


> Yes I hate watercooling, I'd never touch it.


Haven't I read about your watercooled RVII in the Owners Club?


I feel like maybe I'm on the outside of an inside joke or something.



Also, I was a little disappointed with the results GN got using the hybrid mod approach. They usually get a pretty decent improvement by just adding the cooler, so it seems like the stock blower's cooling potential is fine, just loud. Makes me wonder if the partner cards are really going to be much faster, or maybe just a lot quieter.


----------



## ZealotKi11er

bigjdubb said:


> Haven't I read about your watercooled RVII in the Owners Club?
> 
> 
> I feel like maybe I'm on the outside of an inside joke or something.
> 
> 
> 
> Also, I was a little disappointed with the results GN got using the hybrid mod approach. They usually get a pretty decent improvement by just adding the cooler so it seems like the stock blowers cooling potential is fine, just loud. Makes me wonder if the partner cards are really going to be much faster, maybe just a lot quieter.


The stock GPU is TDP-limited, not temp-limited.


----------



## bigjdubb

Wouldn't the PowerPlay tables take care of TDP limitations? They did get a little bit of extra performance with the PowerPlay mod and the hybrid mod, but it wasn't as much as I thought it would be.


----------



## 113802

bigjdubb said:


> Wouldn't the powerplay tables take care of tdp limitations? They did get a little bit of extra performance with the powerplay mod and the hybrid mod, but it wasn't as much as I thought it would be.


Yes, per both Gamers Nexus' and Igor's testing, the 5700 XT didn't benefit that much from the increase in core clock. I kinda want to buy an RX 5700 XT and put it up against my RX Vega 64 at 1800MHz/1105MHz. I believe the RX Vega 64 at that frequency will outperform the RX 5700 XT.


----------



## criminal

I guess I am in a twilight zone when it comes to some of the responses... I had the 5700 XT for over a week and found the card and software to be a miserable experience. The noise from the cooler was unbearable. The drivers were horrible. Having to reinstall 3 times in a 4-day stretch is not something I enjoy. 

AIB cards will be much, much better. But the driver issue was just a killer for me. I am not claiming to be a computer genius! 

All I know is that in my experience:

Nvidia - install driver - use computer - no more worrying until time to update driver.
AMD - install driver - use computer - games start crashing - install driver again - games start crashing - install driver again.

No thanks.

Sucks because the main game I play is Battlefield, and the 5700 XT HAD KILLER performance. Plus it was $120 cheaper than the 2070 Super I bought.


----------



## EastCoast

Wow, Radeon really gained some mindshare with the 5700 series. Nearly 20 days after release, people are still buzzing about the card. 
Looks like they hit a home run.


----------



## ZealotKi11er

criminal said:


> I guess I am in a twilight zone when it comes to some of the responses... I had the 5700XT for over a week and found the card and software to be a miserable experience. The noise from the cooler was unbearable. The drivers were horrible. Having to reinstall 3 times in a 4 day stretch is not something I enjoy.
> 
> AIB cards will be much, much better. But the driver issue was just killer to me. I am not claiming to be a Computer genius!
> 
> All I know is that in my experience:
> 
> Nvidia - install driver - use computer - no more worrying until time to update driver.
> AMD - install driver - use computer - games start crashing - install driver again - games start crashing - install driver again.
> 
> No thanks.
> 
> Sucks because the main game(s) I play is Battlefield and the 5700XT HAD KILLER performance. Plus it was $120 cheaper than the 2070 Super I bought.


But you got a gpu with mature drivers. RDNA just came out.


----------



## criminal

EastCoast said:


> Wow Radeon really gain some mindshare with the 5700 series. Nearly 20 days after release people are still buzzing about the card.
> Looks like they hit a home run. I didn't notice that with the 2060 Super release.


Navi, RDNA, or whatever you want to call it has so much potential. Again, I'm very disappointed with my experience on the software side. BUT if people knew me in real life, they would completely understand why I can't and won't put up with the nonsense.



ZealotKi11er said:


> But you got a gpu with mature drivers. RDNA just came out.



Oh, I agree. That's why I kinda feel bad talking bad about the card and AMD. But I am not a beta tester, and the truth is the truth. Why do I care about how new something is if I get a negative experience from it?


----------



## EastCoast

ZealotKi11er said:


> But you got a gpu with mature drivers. RDNA just came out.


If he paid $120 more, it was never his intention to keep the card. If he had it at all.


----------



## criminal

EastCoast said:


> If he paid $120 more it was never his intention on keeping the card. If he had it at all.


Oh so you are one of those... lol

https://www.3dmark.com/tsst/567707

Oh I had the card. And $120 is nothing to me if it means I can plug and play. When you have been in IT as long as I have, going home to "fix" more issues is not something I enjoy.

:thumb:


----------



## EastCoast

criminal said:


> Oh so you are one of those... lol
> 
> https://www.3dmark.com/tsst/567707
> 
> Oh I had the card. And $120 is nothing to me if it means I can plug and play. When you have been in IT as long as I have, going home to "fix" more issues is not something I enjoy.
> 
> :thumb:


Your narrative has no effect on me though. As already pointed out, you failed to acknowledge that this is a new uarch, and whatever problems you claim to have had, be they real or imagined, would have been ironed out. 
However, I'm sure you are scrambling to update your drivers now to get that mouse pointer fixed for that 2070, right?
https://www.techpowerup.com/257708/...431-68-driver-to-address-broken-mouse-pointer

Well, what do you know. The grass isn't greener on the other side. Yet you paid $120 more just to make sure your mouse is working. Have you conceded that mature drivers are "subjective" yet? Because RTX drivers have been out for quite a while, and I wouldn't expect this kind of problem. Be that as it may, though... Congrats!


----------



## 113802

criminal said:


> I guess I am in a twilight zone when it comes to some of the responses... I had the 5700XT for over a week and found the card and software to be a miserable experience. The noise from the cooler was unbearable. The drivers were horrible. Having to reinstall 3 times in a 4 day stretch is not something I enjoy.
> 
> AIB cards will be much, much better. But the driver issue was just killer to me. I am not claiming to be a Computer genius!
> 
> All I know is that in my experience:
> 
> Nvidia - install driver - use computer - no more worrying until time to update driver.
> AMD - install driver - use computer - games start crashing - install driver again - games start crashing - install driver again.
> 
> No thanks.
> 
> Sucks because the main game(s) I play is Battlefield and the 5700XT HAD KILLER performance. Plus it was $120 cheaper than the 2070 Super I bought.


I stopped using AMD cards due to their horrible drivers that were released with the HD 5000 series. I stopped using nVidia cards due to their DPC latency issues. I picked up a RX Vega 64 LC at launch and haven't had any major driver problems that prevented me from using the card. I liked their driver so much I picked up a Radeon VII at launch instead of a RTX 2080. I wish they would change the GUI back to how it looked with the Catalyst Control Center. I do not like how it's difficult to navigate the GUI with a keyboard. If I could swap their current GUI with Intel's Command Center I would.


----------



## ZealotKi11er

criminal said:


> Navi, RDNA or whatever you want to call it has so much potential. Again, very disappointed from my experience with the software side. BUT if people knew me in real life, they would completely understand why I can't and won't put up with the nonsense.
> 
> 
> Oh I agree. That's why I kinda feel bad, talking bad about the card and AMD. But I am not a beta tester and the truth is the truth. Why do I car about how new something is if I get a negative experience from it?


You should not have gotten a GPU with a new architecture at launch. It was going to leave you disappointed from day 1. It's been like this with all GPUs. You should change your upgrade cycle. I used a 5700 XT for 3 days and did not face any issues. Also, it was no louder than my 2080 Ti.


----------



## miklkit

WannaBeOCer said:


> If you have a browser open when gaming the memory will drop and run at a lower frequency. Close the browser or disable hardware acceleration to prevent memory from running lower when gaming.



Again, no. I never have a browser or anything else running when gaming. That is just silly.


----------



## PontiacGTX

Darklyric said:


> I don't think they can do that in the us... yet. Right to repair and what not. The stickers are for some other countries iirc.


XFX does that though



WannaBeOCer said:


> I stopped using AMD cards due to their horrible drivers that were released with the HD 5000 series. I stopped using nVidia cards due to their DPC latency issues. I picked up a RX Vega 64 LC at launch and haven't had any major driver problems that prevented me from using the card. I liked their driver so much I picked up a Radeon VII at launch instead of a RTX 2080. I wish they would change the GUI back to how it looked with the Catalyst Control Center. I do not like how it's difficult to navigate the GUI with a keyboard. If I could swap their current GUI with Intel's Command Center I would.


The only real problem with AMD is the CPU performance in DX11 games, but other than that I haven't seen anything that seems bugged/glitched or causes problems on the software side (I haven't used Nvidia since 2011).



ZealotKi11er said:


> You should have not gotten a new GPU with new architecture and *reference cooling* at launch. It was going to leave you disappointed from day 1. Its been like this with all GPUs. You should change your upgrade cycle. I did use 5700 XT for 3 days and did not face any issues. Also, it was no louder than my 2080 Ti.


ftfy


----------



## criminal

EastCoast said:


> Your narrative has no effect on me though. As already pointed out, you failed to acknowledge that this is a new uarch, and whatever problems you claim to have had, be they real or imagined, would have been ironed out.
> However, I'm sure you are scrambling to update your drivers now to get that mouse pointer fixed for that 2070, right?
> https://www.techpowerup.com/257708/...431-68-driver-to-address-broken-mouse-pointer
> 
> Well, what do you know. The grass isn't greener on the other side. Yet you paid $120 more just to make sure your mouse is working. Have you conceded that mature drivers are "subjective" yet? Because RTX drivers have been out for quite a while, and I wouldn't expect this kind of problem. Be that as it may, though... Congrats!


I don't appear to have or notice the issue you speak of. It might just be the bias in me not noticing though...lol 

Does that make you feel better?


----------



## criminal

ZealotKi11er said:


> You should have not gotten a new GPU with new architecture at launch. It was going to leave you disappointed from day 1. Its been like this with all GPUs. You should change your upgrade cycle. I did use 5700 XT for 3 days and did not face any issues. Also, it was no louder than my 2080 Ti.


So trying to support an AMD product at launch is wrong? You guys... I just don't get it. AMD/Nvidia is responsible for making a product stable upon release. It's not my job to give them a break if the product I paid my hard-earned money for doesn't work correctly. 

I have had no issues with X570... so should I have waited on upgrading that too, since it is brand new?


----------



## bigjdubb

I guess it's ok to KNOW that we shouldn't buy an AMD card at launch because the drivers will be bunk and the cooler will be trash, but it's not ok to be disappointed by this? There is no excuse for first-day drivers that don't function properly; they may not be at peak optimization for performance, but they should work. Since graphics cards don't work without drivers, selling me a graphics card with malfunctioning drivers is no different than selling me a graphics card with a malfunctioning VRM. Resolving the problem is easier with drivers than with a bad VRM, but there is nothing wrong with returning a card with either problem.

Neither team is hitting the nail on the head with drivers; I have had problems with both and I don't give a pass to either one of them. Nvidia's driver problems seem to be easy to fix: don't update them.


----------



## 113802

Wait until AIB cards are out but until then Just Buy It


----------



## Newbie2009

keikei said:


> What did you end up doing with your card?


I still have it, if that's what you're asking?


----------



## rdr09

Newbie2009 said:


> My fanboy edition does 2100mhz @ 200w, power slide left at zero.


On air? You got it at 100%? 

One review showed about 250W at 2100.


----------



## 113802

rdr09 said:


> Newbie2009 said:
> 
> 
> 
> My fanboy edition does 2100mhz @ 200w, power slide left at zero.
> 
> 
> 
> On air? You got it at 100%?
> 
> One review showed about 250W at 2100.
Click to expand...

Probably didn't have a fanboy edition. $50 for a binned card. I paid $200 extra for the Vega 64 LC fanboy edition.


----------



## rdr09

WannaBeOCer said:


> Probably didn't have a fanboy edition.


Oh, you are right. The fanboy edition, I guess, is binned.

$200 extra! Ugh.


----------



## keikei

Newbie2009 said:


> I still have it if that’s what your asking?



Did you do anything to the cooler?


----------



## Newbie2009

rdr09 said:


> On air? You got it at 100%?
> 
> One review showed about 250W at 2100.


No, auto fans, large undervolt.
2100 is 1112mV I think. Stable.
2125 is 1128mV, stable.
2150 is 1175mV, but not sure if 100% stable; it has crashed a few times in Timespy but seems to game fine.

Ninja edit

Once past 2100 (stock clock is 2078MHz) you have to raise the power limit.

I’m talking target now, clocks bounce around.
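For what it's worth, those target clock/voltage pairs line up with the usual first-order CMOS model, where dynamic power scales roughly as f * V^2 (leakage and memory power ignored, so this is a sketch, not a measurement):

```python
def relative_dynamic_power(f_base: float, v_base: float, f: float, v: float) -> float:
    """First-order CMOS estimate: dynamic power ~ frequency * voltage^2."""
    return (f / f_base) * (v / v_base) ** 2

# Target clocks/voltages reported above (MHz, mV), normalized to 2100 MHz @ 1112 mV
base_f, base_v = 2100, 1112
for freq, mv in [(2125, 1128), (2150, 1175)]:
    ratio = relative_dynamic_power(base_f, base_v, freq, mv)
    print(f"{freq} MHz @ {mv} mV: ~{ratio:.2f}x the dynamic power of {base_f} MHz @ {base_v} mV")
```

Under this model the last 50 MHz (about 2.4% more clock) costs on the order of 14% more dynamic power, which is consistent with having to raise the power limit past 2100.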


----------



## Newbie2009

keikei said:


> Did you do anything to the cooler?


I put liquid metal on it.


----------



## ilmazzo

WannaBeOCer said:


> Wait until AIB cards are out but until then Just Buy It
> 
> https://youtu.be/08QgzHFBjq0


I have a new rule (after the glorious "enjoy small things") 

never ever watch a youtube video with a facepalm somewhere

Gonna go back to my doctor or something


----------



## rdr09

Newbie2009 said:


> No auto fans large undervolt.
> 2100 is 1112mv I think. Stable
> 2125 is 1128mv stable
> 2150 is 1175mv, but not sure if 100% stable, has crashes a few times in timespy but seems to game fine .
> 
> Ninja edit
> 
> Once past 2100 (stock clock is 2078mhz) you have you raise power limit.
> 
> I’m talking target now, clocks bounce around.


Makes sense. AIB cards might do higher. Or yours too, if watercooled.


----------



## ilmazzo

criminal said:


> I don't appear to have or notice the issue you speak of. It might just be the bias in me not noticing though...lol
> 
> Does that make you feel better?
> 
> 
> 
> I'm just at a loss why people get their panties in a bunch because someone critiques a product? Either of you care to explain?
> 
> I openly admitted that I was probably hard on the 5700XT, but it is what it is. And if you see it as just whining and moaning... I will see my way out. :thumb:


cmon guys

what's this witch hunt going on?

You had a negative experience with it, that's all. Let's move on, nothing to see here... I read no ***** talk at all, so why so nasty?

Cheers

ps: whoa, this forum has parental control embedded!!!


----------



## Newbie2009

rdr09 said:


> Make sense. AIB might do higher. Or yours too if watercooled.


I dunno. You can add up to 1.3V, but I wouldn't.
And you cannot use a custom fan profile, as it will green screen even at stock. 

I'd like to put it at 100% to see what it can do, but it will just crash regardless.


----------



## rdr09

Newbie2009 said:


> I dunno. You can add up to 1.3v but I wouldn’t .
> And you cannot use a custom fan profile, as it will green screen even at stock.
> 
> I’d like to put it at 100% to see what it can do but it will just crash regardless.


You are using Wattman? I read the new version of AB (Afterburner) works. I never use Wattman.


----------



## Newbie2009

rdr09 said:


> You are using Wattman? I read the new version of AB works. I Never use wattman.


Yeah, Wattman. You can set the manual fan fine; I think there's some bug in the drivers that makes it crash all the time.


----------



## The Robot

Gunderman456 said:


> He's not even allowed one post on his negative experience with the card?


Exactly, there's nothing wrong with voicing his opinion if he did in fact own the card and not just blowing hot air like one of the posters here claiming "noise bad, 2070S good" (well duh, of course open air is better than a blower).


----------



## rluker5

PontiacGTX said:


> the only real problem with AMD is the cpu performance on dx11 games but other than that I havent seen anything that seems bugged/glitched or cause problems on software side(I havent used nvidia since 2011)


That's my biggest worry with getting an AMD card, but the newer stuff like the VII and Navi seem to be closing the gap despite the drivers. That, and DX11 is quietly fading away.
Hopefully a big Navi comes out that is close to as good as Nvidia's next big card (hoping for 2080 Ti +25% for a significant upgrade over my 1080 Ti). My Furies put out a great image, but I want smoothness too, and if I have to upgrade my whole rig to go with AMD to get that, it costs a lot more than the Nvidia tax. Better CPU use across all games would take away my worry of getting an awesome card and not being able to use it due to CPU bottlenecking. Right now my CPU is the bottleneck for my 1080 Ti at 1080p, but it could easily be poorly utilized and not be able to hold 60.

But I prefer playing with frame interpolation on where the importance of holding a smooth 60 is magnified, and I have a relatively computationally weak cpu that games well due to good time efficiency so I'm in the minority.


----------



## PontiacGTX

https://www.feedback.amd.com/se/5A1E27D203B57D32 now you know what to vote for...(hint: Integer scaling)


----------



## PontiacGTX

I don't get how the 1080 Ti holds up so well while the 1070/1080 fall behind where they usually perform.


----------



## rluker5

Imouto said:


> Of course he is. What I doubt is the worth of the chain of such posts and the usual clutter it generates in the news forum.
> 
> I'm as tired of some people berating the noise from this card than WannaBeOCer's fabled and jeweled RVII, 9900K or whatever. I don't know about you, but I expect to find new info about the topic at hand when I enter a thread and not an AMA of a rabid fanboy or hater.


I don't think he put jewels on it. But I'd rather hear mentions of significant success with a card with a lot of OC potential than the latest Nvidia that you can just maybe get around artificial limitations with. 

Not everything has to be perfectly organized, you mentioned jewels and my daughter would probably love to stick jewel stickers all over her case. And llama stickers.


----------



## 113802

rluker5 said:


> I don't think he put jewels on it. But I'd rather hear mentions of significant success with a card with a lot of OC potential than the latest Nvidia that you can just maybe get around artificial limitations with.
> 
> Not everything has to be perfectly organized, you mentioned jewels and my daughter would probably love to stick jewel stickers all over her case. And llama stickers.


Last I checked, Vega 20 and Navi have artificial limitations. On top of those limitations, both Gamers Nexus and Jayztwocents show that overclocking Navi doesn't provide a decent boost.

Meanwhile, an overclocked RTX 2070 Super with a stock cooler is faster and cheaper than a 5700 XT plus waterblock.


----------



## rluker5

PontiacGTX said:


> I dont get how the 1080Ti holds so well and the 1070/1080 fall behind where they usually perform


You likely were used to seeing the 1080ti cpu bottlenecked. It has 40% more shaders than the 1080. This should become more apparent as more games become less optimized for the Pascal arch.


----------



## Imouto

WannaBeOCer said:


> So people buying a reference card and installing a waterblock still get a card that costs more and is slower than an RTX 2070 Super. While an overclocked RTX 2070 Super with a stock cooler is faster and cheaper than a 5700 XT + waterblock.


A dated comment is dated. Honestly, how hard is it to tell people to wait two weeks for AIBs instead of this bull? You don't know if they're going to perform better than reference or hold the same price.


----------



## 113802

Imouto said:


> A dated comment is dated. Honestly, how hard is telling people to wait two weeks for AIBs instead of this bull? You don't know if they're going to perform better than reference or hold the same price.


When was the last time AMD AIB cards performed better than reference? The anniversary edition will be the best Navi 10 card. It's not bull; it's OC reviews by two YouTubers who provide excellent content and are quoted here very often.

Since you don't like the results, it's bull?


----------



## rdr09

WannaBeOCer said:


> Last I checked Vega 20 and Navi have artificial limitations. Along with the artificial limitations both Gamers Nexus and Jayztwocents prove that overclocking Navi doesn't provide a decent boost when overclocking.
> 
> While an overclocked RTX 2070 Super with a stock cooler is faster and cheaper than a 5700 XT + waterblock.


People who buy AMD reference cards to waterblock will waterblock any card they're gonna buy regardless. Those who buy reference cards and whine about the noise are . . .

We watercool for silence. The sound of water trickling.


----------



## 113802

rdr09 said:


> People who buy AMD reference cards to waterblock will waterblock any cards they gonna buy regardless. Those who buy reference cards and whine about the noise are . . .
> 
> We watercool for silence. The sound of water trickling.


You're on the wrong forum if you think that's the only reason we water cool.


----------



## rluker5

WannaBeOCer said:


> Last I checked Vega 20 and Navi have artificial limitations. Along with the artificial limitations both Gamers Nexus and Jayztwocents prove that overclocking Navi doesn't provide a decent boost when overclocking.
> 
> While an overclocked RTX 2070 Super with a stock cooler is faster and cheaper than a 5700 XT + waterblock.



A lot of stuff doesn't overclock much anymore. I've had two 1080 Tis and both did 2 GHz at stock with the power limit raised to 375 W; one could give one to three dozen more MHz, and the other maybe five dozen MHz. Turing and Ryzen are similar. Even new Intel CPUs don't OC much. My old Intel CPUs do and did: 14-25%. I like hearing of stuff like Vega and VII that still OC well, even if it is a bit of a hassle to do.

Oh, and I installed 1903 on my gaming rig and the mitigations still took a bite out of performance, even disabled, so I reverted back to 1709 :/ It wasn't as noticeable with lesser demands on the living room PC with a SATA SSD + Fury Nitro, I guess. I feel kind of cheated by that dubious mitigation fiasco.

Hopefully the AIB 5700s will OC better. The more success out there the better.


----------



## rdr09

WannaBeOCer said:


> You're on the wrong forum if you think that's the only reason we water cool.


No, it is a hobby. lol


----------



## ZealotKi11er

The reason you do not see more perf from the 5700 XT is that the memory does not overclock as much as on Nvidia GPUs. When was the last time AMD had parity with an Nvidia GPU on memory BW?
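For reference, the bandwidth figures being thrown around in this thread work out like this (a quick sketch using the commonly quoted reference clocks; the helper names are mine, not from any spec sheet):

```python
def gddr6_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    # Bandwidth in GB/s = per-pin data rate (Gb/s) * bus width (bits) / 8 bits per byte
    return data_rate_gbps * bus_width_bits / 8

def hbm2_bandwidth_gbps(mem_clock_mhz: float, bus_width_bits: int) -> float:
    # HBM2 is double data rate: 2 transfers per clock, then MHz -> GHz
    return mem_clock_mhz * 2 * bus_width_bits / 8 / 1000

print(gddr6_bandwidth_gbps(14, 256))    # 5700 XT, 14 Gbps GDDR6 on 256-bit: 448.0 GB/s
print(hbm2_bandwidth_gbps(945, 2048))   # Vega 64 stock HBM2: 483.84 GB/s
print(hbm2_bandwidth_gbps(1100, 2048))  # Vega 64 with 1100 MHz HBM2 OC: 563.2 GB/s
```

This is why a memory OC moves the needle so differently on the two designs: Navi's narrow GDDR6 bus gains linearly with data rate, while Vega's wide HBM2 bus already starts ahead.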


----------



## 113802

ZealotKi11er said:


> The reason you do not see more perf from 5700 XT is the memory does not overclock as much as Nvidia GPUs. When was the last time AMD had parity with Nvidia GPU on memory BW?


Do you really want me to answer that question? I suggest you look at AMD's last few generations all the way back to the 7970.


----------



## Hwgeek

ZealotKi11er said:


> The reason you do not see more perf from 5700 XT is the memory does not overclock as much as Nvidia GPUs. When was the last time AMD had parity with Nvidia GPU on memory BW?


Wait a little and memory tweaker programs will soon pop out for Navi too, thanks to the miners @bitcointalk.
At least the chips TPU found on their 5700 XT are the same ones used on many RTX cards, and those should OC over 1000 MHz easily; maybe there is some kind of limitation for now, or more relaxed timings could help the OC.


----------



## ZealotKi11er

WannaBeOCer said:


> Do you really want me to answer that question? I suggest you look at AMD's last few generations all the way back to the 7970.


Apart from the 290X, everything on AMD's side had more theoretical BW. Now, with all the delta compression Nvidia has applied since Maxwell, AMD is next to Nvidia.



Hwgeek said:


> Wait a little and soon another Mem tweaker programs will pop out for navi too thanks to miners @bitcointalk
> At-least the chips that TPU found ion their 5700XT are the same that are used on many RTX cards and they should OC over 1000Mhz easy, maybe there is some kind of limitation on it for now or more relax timing can help the OC.


Even if they are the same model, it does not mean AMD gets the good ones. Just like GPUs, memory chips can differ. Also, it's the memory controller that differs, just like in CPUs.


----------



## EastCoast

LOL,
Looks like Radeon is back on the menu boys!

And from what I've seen, the blower isn't that bad after all. If price is an absolute factor and you haven't upgraded in years, the 5700/5700 XT is the card to get.


----------



## TrueForm

The 5700 XT is over $250 cheaper than the 2070S here. Though that's blower vs AIB coolers.


----------



## AlphaC

This thread is so derailed.

AIB boards have already been spotted.

ASROCK PHANTOM GAMING X
ASROCK TAICHI
ASROCK CHALLENGER

ASUS STRIX
ASUS TUF
ASUS Dual


Gigabyte : ?

HIS : ?

MSI GAMING X 
MSI EVOKE
MSI MECH (looks like VENTUS)

Powercolor : ?

SAPPHIRE TOXIC / VAPOR-X (rumored) / NITRO+

XFX : ?

https://portal.eaeunion.org/sites/o...8&ListId=d84d16d7-2cc9-4cff-a13b-530f96889dbc
https://portal.eaeunion.org/sites/o...2&ListId=d84d16d7-2cc9-4cff-a13b-530f96889dbc

---------------------------


It simply isn't ready yet:


https://www.servethehome.com/sth-amd-radeon-rx-5700-xt-and-rx-5700-review-update/ said:


> During the course of this week, William worked to get his pieces finished for the new AMD GPUs. He ran into issues due to the current state of the AMD OpenCL support for the cards. A quick look shows that even Ryan Smith at Anandtech found similar challenges during his compute benchmarks.
> As a quick note, we also heard through back-channels that AMD is aware of this challenge. The company focused on gaming compatibility at launch. Frankly, that is a valid direction for AMD as the largest segment for the company is gaming.
> Last night, I asked William to stop making progress on the Radeon RX 5700 XT and RX 5700 reviews. We are shelving the reviews until AMD remedies its OpenCL support.
> 
> *Final Words*
> 
> I requested that William spend his time on his Super reviews instead. The first one up will likely be the NVIDIA GeForce RTX 2060 Super review since the 2060 level saw an increase from 6GB to 8GB of GDDR6 which will have an impact on many of the workloads we run. I am personally excited to see how Navi performs once drivers are in place as competition is good for the market.
> At this time, if you want a GPU for compute, or compute and gaming, NVIDIA seems to have better options. We cannot recommend someone use a Navi desktop GPU for professional applications when we are seeing errors and program crashes using the current (as of 7/11/2019) driver stack.


----------



## ZealotKi11er

TrueForm said:


> 5700XT is over $250 cheaper than the 2070S here. Though, blower vs Aib coolers.


AIB for sure if you have good case airflow. Make sure the cooler is big enough. You do not want one of those crap MSI Armor coolers.


----------



## criminal

rdr09 said:


> People who buy AMD reference cards to waterblock will waterblock any cards they gonna buy regardless. Those who buy reference cards and whine about the noise are . . .
> 
> We watercool for silence. The sound of water trickling.


You are wrong actually. I custom water cool my CPU because 1) it cools better and 2) it looks better. I only water cool a GPU if 1) it is loud and/or 2) it improves performance. Cards can be silent with air coolers, you know. Watercooling does nothing to help the quality of the software.

Why would anyone pay an additional $100+ to make an unstable product quiet? None of you have explained how that's beneficial.


----------



## ryan92084

Last several pages cleaned. As usual if you are here to talk about the hardware then stick around but if you are here to discuss each other then move along.


----------



## treetops422

Waiting on one of these
https://www.newegg.com/sapphire-rad...iption=5700&cm_re=5700-_-14-202-342-_-Product
Radeon 5700 $330 Brand New


Eventually I'll throw my all-in-one closed-loop 120 mm CPU cooler on it for funzies. Until I start messing around with a 35 gallon open loop and beyond :Snorkle:. It's been too long since I refreshed my rig :thumb:.


I wonder if the blower shroud will ever sell for anything on ebay? ^^ maybe in a few months, oo I could always use it as an extra vent fan.

p.s. yeah I know pci fan on the vrm


----------



## rdr09

treetops422 said:


> Waiting on one of these
> https://www.newegg.com/sapphire-rad...iption=5700&cm_re=5700-_-14-202-342-_-Product
> Radeon 5700 $330 Brand New
> 
> 
> Eventually I'll throw my all in one closed loop 120 mm cpu cooler on it for funzie. Until I start messing around with a 35 gallon open loop and beyond :Snorkle:. it's been to long since I refreshed my rig :thumb:.
> 
> 
> I wonder if the blower shroud will ever sell for anything on ebay? ^^ maybe in a few months, oo I could always use it as an extra vent fan.
> 
> p.s. yeah I know pci fan on the vrm


Someone watercooled an XT then UV'ed it. Got a 12122 graphics score in FSE. My OC'ed 290s got 11500.

1860/910 MHz, 908 mV, 105 W, sustained 1810 MHz, 0%, 52 degrees, 12122 (210-215 W of power draw at the wall :O)

Found a 5700 unlocked and OC'ed to 2000 MHz.


----------



## ilmazzo

AlphaC said:


> This thread is so derailed.
> 
> AIB boards have already been spotted.
> 
> ASUS STRIX
> ASUS TUF
> ASUS Dual


Where did you see the ASUS boards? Or are you just pointing out announcements from AIBs? Thanks!


----------



## keikei

treetops422 said:


> Waiting on one of these
> https://www.newegg.com/sapphire-rad...iption=5700&cm_re=5700-_-14-202-342-_-Product
> Radeon 5700 $330 Brand New
> 
> Eventually I'll throw my all in one closed loop 120 mm cpu cooler on it for funzie. Until I start messing around with a 35 gallon open loop and beyond :Snorkle:. it's been to long since I refreshed my rig :thumb:.
> 
> *I wonder if the blower shroud will ever sell for anything on ebay*? ^^ maybe in a few months, oo I could always use it as an extra vent fan.
> 
> p.s. yeah I know pci fan on the vrm



If you plan on reselling the card, the buyer may want the stock cooler.


----------



## AlphaC

ilmazzo said:


> where did you see the asus boards? or you just are pointing out announces from AIBs? thanks!


 https://videocardz.com/81086/asus-g...uper-and-radeon-rx-5700-series-spotted-at-eec


It's been speculated this picture from ASUS Australia is the RX 5700 series though it could be a GTX 1660 or similar

https://www.facebook.com/asusaustralia/photos/a.197366076963568/2583701074996711/?type=3&theater


Vietnamese listing: sg.h2gaming.vn/ROG-STRIX-RX5700XT-O8G-GAMING-1


edit: Also Alphacool testing by IgorsLab


https://www.igorslab.media/zurueck-...r-wasserkuehler-fuer-amds-radeon-rx-5700xt/2/


----------



## EastCoast

Gameplay immersion is nearly identical, yet the averages are different.
Goes to show you that the FPS number alone isn't enough to get the whole picture when gaming.

I honestly don't see why someone should pay much more for a GPU when a cheaper one gives you the same gameplay experience/immersion.


----------



## ToTheSun!

That's a good narrative, I guess. "It's better" when it's more performant, and "it doesn't matter" when it's less performant.

I guess you really can't go wrong with AMD. Green man bad.


----------



## ZealotKi11er

EastCoast said:


> https://youtu.be/r49Ijb7c5Ic
> 
> Gameplay immersion is nearly identical yet the averages are different.
> Goes to show you that FPS number alone isn't enough to get the whole picture when gaming.
> 
> I honestly don't see why someone should pay much more for a gpu when a cheaper one give you the same game play experience/immersion.


I have a 2080 Ti, and really it's only when a normal GPU drops below 40 fps that you care about 2080 Ti performance.


----------



## looniam

ToTheSun! said:


> That's a good narrative, I guess. "It's better" when it's more performant, and "it doesn't matter" when it's less performant.
> 
> I guess you really can't go wrong with AMD. Green man bad.


inb4 "you can't see more than 30fps anyways" . . .

Reminds me of when Kyle got a stick up his bum w/ Nvidia and did those V64/FreeSync vs 1080 Ti/G-Sync comparisons and proclaimed "the experience" was the same for hundreds less. Hate does things to perceptions.


----------



## EastCoast

ZealotKi11er said:


> I have 2080 Ti and really its only when a normal GPU drops below 40 fps that you care about 2080 Ti performance.


And that's really the point of the video. The 5700 XT is sustaining minimum frame rates high enough that, compared to the Super, there is no difference in movement, smoothness, immersion, etc.

Back in the day when hardware was still teething, FPS numbers did matter. LOL, even max FPS mattered. Now we've reached parity with the HW available today, yet some still place an inordinate, unrealistic and unhealthy expectation on the highest FPS when minimum FPS are above the threshold needed to play the game as intended by the developer(s).

Even as the video shows a disparity in FPS between GPUs, gameplay is similar.


----------



## 113802

EastCoast said:


> ZealotKi11er said:
> 
> 
> 
> I have 2080 Ti and really its only when a normal GPU drops below 40 fps that you care about 2080 Ti performance.
> 
> 
> 
> And that's the point really what the video shows. A 5700XT is sustaining minimal frame rates high enough that when compared to the Super there is no difference in movement, smoothness, immersion, etc.
> 
> Back in the day when hardware was still teething FPS NUMBERS did matter. LOL, even max FPS mattered. Now we've reached parity with the HW available today yet some still place an inordinate, unrealistic and unhealthy expectation on highest FPS when minimal FPS are above the threshold needed to play the game as intended from developer(s).
> 
> As the video shows disparity in FPS between GPUs. Gameplay is similar.

They shouldn't be using a RTX 2080 Ti with a low resolution unless they are enabling DxR. A RTX 2080 Ti vs RX 5700 XT is noticeable when gaming at 4k.


----------



## EastCoast

WannaBeOCer said:


> They shouldn't be using a RTX 2080 Ti with a low resolution unless they are enabling DxR. A RTX 2080 Ti vs RX 5700 XT is noticeable when gaming at 4k.


Well, in the video they used a 2080 Super. Unless you were talking about something else?


----------



## 113802

EastCoast said:


> Well, in the video they used a 2080 Super. Unless you were talking about something else?


Re-read what I quoted.


----------



## rluker5

EastCoast said:


> https://youtu.be/r49Ijb7c5Ic
> 
> Gameplay immersion is nearly identical yet the averages are different.
> Goes to show you that FPS number alone isn't enough to get the whole picture when gaming.
> 
> I honestly don't see why someone should pay much more for a gpu when a cheaper one give you the same game play experience/immersion.


To be fair, the heavily overclocked one is likely louder, less stable, and has less remaining OC headroom.
But on the other hand, it is in a different price bracket as well, and is clearly the better buy when anything more than pure performance is considered.


----------



## keikei

rluker5 said:


> To be fair, the heavily overclocked one is likely louder, less stable and has less remaining oc headroom.
> But on the other hand is in a different price bracket as well and is clearly the better buy when any more than pure performance is considered.



It's not mentioned to be fair, but to emphasize the value of the 5700 XT. A better comparison would have been vs a standard 5700. It's the 1440p king for perf/$ atm.


----------



## EastCoast

WannaBeOCer said:


> Re-read what I quoted.


Your quote was my previous post, in which I made no mention of the 2080 Ti.
ZealotKi11er mentioned the 2080 Ti but didn't indicate he was using a lower resolution.
Your reply was therefore disjointed, so I inferred you meant the video, which discussed the 2080 Super, not the 2080 Ti.
But no worries, it's not that serious 



keikei said:


> Is not mention to be fair, but to emphasize the value of the 5700XT. A better comparison would have been vs a standard 5700. Its the 1440p king for perf/$ atm.


Exactly!


----------



## ejb222

ToTheSun! said:


> That's a good narrative, I guess. "It's better" when it's more performant, and "it doesn't matter" when it's less performant.
> 
> I guess you really can't go wrong with AMD. Green man bad.


I understand your point...but for me, with a 60hz 3440x1440 monitor, why should I pay $800 for 75fps instead of $400 for 68fps????


----------



## ToTheSun!

ejb222 said:


> I understand your point...but for me, with a 60hz 3440x1440 monitor, why should I pay $800 for 75fps instead of $400 for 68fps????


You should absolutely buy whatever makes the most sense to you.

I've simply been reading the news sub-forum long enough to know who means what with which post. The only remedy for bad speech is more speech.


----------



## maltamonk

Nice to see some of these selling under MSRP already. It gives me hope for the AIBs.


----------



## ZealotKi11er

WannaBeOCer said:


> They shouldn't be using a RTX 2080 Ti with a low resolution unless they are enabling DxR. A RTX 2080 Ti vs RX 5700 XT is noticeable when gaming at 4k.


It is, but it also depends on the game. I only game at 4K, hence the 2080 Ti, but going from a 1080 Ti to a 2080 Ti in older (2017-2018) games I did not feel much difference. The only demanding game I played was Witcher 3, which is 4 years old now. BFV is easy even with a 5700 XT at 4K.


----------



## The Robot

EastCoast said:


> https://youtu.be/r49Ijb7c5Ic
> 
> Gameplay immersion is nearly identical yet the averages are different.
> Goes to show you that FPS number alone isn't enough to get the whole picture when gaming.
> 
> I honestly don't see why someone should pay much more for a gpu when a cheaper one give you the same game play experience/immersion.


Some on OCN would defend even a 1% increase for 10% the price; it's gotten really lame lately. The good thing is that a lot of people are starting to wake up from a decade of green and blue slumber, when they were fleeced and accustomed to the idea that they really need $400 CPUs and $700 GPUs just to play console ports.


----------



## Newbie2009

My green screen issue with the fan is gone with the new drivers. Did a DDU again and this time it seems fine.
So now I can set clocks to target 2150 at stock volts (1198 mV).
Benches Time Spy stable at 2175 at the same volts, but performance regresses slightly.

I've not tried any extra voltage yet; 1.3 V is the limit I see in Wattman.

Power peaks at 250 W, so I should have another 50 W to play with. That said, the slider maxes out at 2200 MHz.


----------



## 113802

ZealotKi11er said:


> It is but also depends on the game. I only game at 4K hence the 2080 Ti but going from 1080 Ti to 2080 Ti in older games 2017-2018 I did not feel much difference. The only depending game I played was Witcher 3 which is 4 years old now. BFV is easy with even 5700 XT at 4K.


My Radeon VII dips below 60 FPS pretty often at 4K in Battlefield V. I'm sure the difference between a 2080 Ti and a 5700 XT is very noticeable in BFV. I use Virtual Super Resolution to gauge which games my Radeon VII can play at 4K, because at one point I was thinking about upgrading my monitor.

So far the experience has been horrible with most FPS games I played. Destiny 2 and BFV dip into the high 40s, while Overwatch and Devil May Cry 5 are playable.


----------



## PontiacGTX

The Robot said:


> Some on OCN would defend even a 1% increase for 10% the price, it's got really lame lately. Good thing is that a lot of people are starting to wake up from a decade of green and blue slumber, when they were fleeced and accustomed that they really need $400 CPUs and $700 GPUs just to play console ports.


$700 isn't even the flagship; for AMD the flagship will maybe be $700-900, thanks to the new prices driven by new tech and by Nvidia enabling AMD to sell more expensive cards. Remember when AMD released the R9 290X, which was pretty much a WX8100 with less VRAM, and there was no special edition (FE)? Or when the high end didn't cost more than $550-650, whereas suddenly a cut-down chip is $700? Or when the midrange was $200-300, now moved to $350-400? As long as performance increases, (some) people don't mind paying more.


----------



## miklkit

Newbie2009 said:


> My green screen issue re fan is gone with new drivers. Did a ddu again and this time seems fine.
> So now I can set clocks to target 2150 at stock volts (1198mv)
> Benches timespy stable at 2175 same volts but performance regresses slightly.
> 
> I’ve not tried any extra voltage yet ,1.3v is the limit I see in wattman.
> 
> Power peaks at 250w so should have another 50w to play with. Saying that the slider maxes out at 2200mhz



Ah, an on topic post! Thank you. The drivers are improving and it would seem that it has solidly better performance than my Vega 64 at less watts. I could live with that, especially since it already costs less.


----------



## ZealotKi11er

WannaBeOCer said:


> My Radeon VII dips below 60 FPS pretty often at 4k in Battlefield V. I'm sure the difference between a 2080 Ti and 5700 XT is very noticeable in BFV. I use Virtual Super Sampling to gauge what games my Radeon VII can play at 4k because at a point I was thinking about upgrading my monitor.
> 
> So far the experience was horrible with most FPS games I played. Destiny 2 and BFV dipping in the high 40's while Overwatch and Devil May Cry 5 are playable.


I played with 1080 Ti and Vega 64 LC at 4K with no issues. I dropped below 60 fps for sure but it did not bother me that much.


----------



## 113802

ZealotKi11er said:


> I played with 1080 Ti and Vega 64 LC at 4K with no issues. I dropped below 60 fps for sure but it did not bother me that much.


Are you referring to the story mode or multiplayer? Multiplayer dropped below 60 FPS. I can see people playing through that fine with an RX 5700 XT. I purchase cards based off of multiplayer performance. If I didn't, I would still be using my RX Vega 64.


----------



## bigjdubb

Newbie2009 said:


> My green screen issue re fan is gone with new drivers. Did a ddu again and this time seems fine.
> So now I can set clocks to target 2150 at stock volts (1198mv)
> Benches timespy stable at 2175 same volts but performance regresses slightly.
> 
> I’ve not tried any extra voltage yet ,1.3v is the limit I see in wattman.
> 
> Power peaks at 250w so should have another 50w to play with. Saying that the slider maxes out at 2200mhz


Did you watch the GamersNexus hybrid mod video? He mentioned that he could set the frequency higher but performance would drop off; he ended up getting the best performance from a lower clock. The moral of his story was that clocks need to be verified with a performance test, because the highest clock you can reach probably won't give the highest performance.
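That tuning loop can be sketched in a few lines: pick the winner by measured score, not by the highest clock that survives. This is purely illustrative; `run_benchmark` here is a hypothetical stand-in for whatever scored test you actually run (e.g. a Time Spy graphics-score pass at each setting).

```python
def best_clock(candidates, run_benchmark):
    """Return (clock_mhz, score) for the candidate with the best measured score."""
    # Score every candidate clock, then take the max by score (first tuple element).
    scored = [(run_benchmark(mhz), mhz) for mhz in candidates]
    best_score, best_mhz = max(scored)
    return best_mhz, best_score

# Toy scores shaped like the results reported above: 2175 MHz "passes" but regresses.
toy_scores = {2100: 9800, 2150: 9950, 2175: 9900}
print(best_clock(sorted(toy_scores), toy_scores.get))  # picks 2150, not 2175
```

The point of the sketch is just the selection criterion: the max-stable clock (2175 here) is not the argmax of the score.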


----------



## AlphaC

https://www.techpowerup.com/257773/...ase-rx-5700-series-idle-fan-speeds-by-over-50




> In our reviews we praised AMD for excellent idle fan speeds that make their cards whisper quiet in idle, despite the lack of the idle-fan-stop feature. With 19.7.2, the card's idle noise output was measured at 27 dBA. With 19.7.3 and its idle fan speed raised, our RX 5700 XT now has an idle fan-noise output of 29.4 dBA. The RX 5700 went from 27.8 dBA to 29.7 dBA.
> 
> Maximum fan speeds are unaffected, because both cards are configured with a certain maximum RPM fan speed limit, which is reached after around half a minute of gaming. It seems that during gaming, that limit takes priority over the change in fan speed observed with 19.7.3 Beta.
> 
> It's also extremely surprising that AMD's driver patch notes don't mention such a significant change at all



All is not rosy in Navi-land.
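For perspective on those dBA deltas, a dB difference can be converted into a sound-pressure ratio with the standard 20·log10 relation (a back-of-the-envelope sketch; the function name is mine):

```python
def spl_pressure_ratio(db_delta: float) -> float:
    # Convert a sound-pressure-level difference in dB to a pressure ratio:
    # ratio = 10 ** (dB / 20)
    return 10 ** (db_delta / 20)

# TPU's 5700 XT idle numbers: 27.0 dBA on 19.7.2 vs 29.4 dBA on 19.7.3
print(round(spl_pressure_ratio(29.4 - 27.0), 2))  # roughly 1.32x the sound pressure
```

So the change is measurable but modest at idle; the unchanged maximum fan speed is why load noise stayed the same.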


----------



## rdr09

AlphaC said:


> https://www.techpowerup.com/257773/...ase-rx-5700-series-idle-fan-speeds-by-over-50
> 
> 
> 
> 
> 
> All is not rosy in Navi-land.


A whisper is 30 dB. Adrenalin 19.7.4 is out.


----------



## Newbie2009

bigjdubb said:


> Did you watch the GamersNexus hybrid mod video? He mentioned in that video that he could set the frequency higher but the performance would drop off, he ended up getting the best performance from a lower clock. The moral of his story was that the clocks needed to be verified with a performance test because the highest clock you can reach probably won't be the highest performance you can get.


Yeah, it's nothing new. You might be hitting a clock without enough juice, so performance drops off. 2150 is the best performance at stock volts.


----------



## Newbie2009

rdr09 said:


> A whisper is 30dB. Adrenelin 19.7.4 is out.


Can't say I even noticed. Fans below 25% are inaudible for me, which they still are.

I can only hear the card over the CPU fan past 25%.


----------



## treetops422

I know people are concerned with noise, but personally I don't play video games muted, so it's never bothered me. I always turn my fans to max before a gaming session. Windforce x3 fans on the GPU, 2x 120 mm radiator fans atm. I used to have 2x 240 mm fans as well, but those died out long ago. I have a little fan controller with an adjustment knob; cost me around $5 10 years ago. Still works. And for the gpu ab


----------



## PontiacGTX

If the reviews (ComputerBase) are accurate and the reference model produces as much noise as reference Vega, that kind of sound level is awful, unless you don't mind the people around you and use headphones.


----------



## treetops422

PontiacGTX said:


> if reviews(computerbase) are accurate and the reference model produce as much noise level as reference vega that kind of sound level is awful unless you dont mind (about people) around your surrouding and use a headphone


lol I've been gaming since the 90s, and I've never heard of anyone complaining about someone else's computer fans being too loud. Coil whine sucks, but that's another story.


----------



## DaaQ

treetops422 said:


> lol I've been gaming since the 90s, I've never heard of anyone complaining about someone else's computer fans being to loud. Coil whine sucks, but that's another story.


To be honest, I haven't had an issue with fan noise since GeForce Ti 4600 days. That was during EQ1 days, using a Thermaltake (??) 93 CFM 80 mm fan. It sounded like a vacuum cleaner. This was while two-boxing as well.


----------



## Hwgeek

The orange fan? Yep, it was like a Delta fan, super high RPM.


----------



## ilmazzo

Lag reduction tested

https://www.pcgameshardware.de/Rade...3/Specials/Radeon-Anti-Lag-Test-Navi-1293817/
The "nvidia has it by 10 years (prerendered frame = 1)" story seems way off from what is working here behind the scenes...gg amd

Other thing: your thoughts regarding this tests?



















This seems to prove that 5700 XT has 48rops working like the 5700 whilst having 64rops so it is missing a raster seems..... I would point to the benchmarks or driver issue with them...... the 5700xt edges anyway the 5700 due to clocks and ram.


----------



## rdr09

Newbie2009 said:


> Can't say I even noticed. Fans below 25% are inaudible to me, which they still are.
> 
> I can only hear the card over the CPU fan past 25%.


Well, check out who did the test on the GPU at TPU. Not surprised. At idle it should be whisper quiet at 25dB. Gonna put a waterblock on it eventually anyway. I think 1850MHz on the core is enough, once we're hopefully able to play with the VRAM.
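As an aside, since dB figures keep coming up in this thread, the standard rule-of-thumb conversions look like this; a quick sketch in Python (generic acoustics math, not measurements of any particular card):

```python
# Two standard rules of thumb for comparing noise figures:
#  - sound power scales as 10^(dB/10)
#  - perceived loudness roughly doubles for every +10 dB
def power_ratio(delta_db: float) -> float:
    """Sound power ratio corresponding to a dB difference."""
    return 10 ** (delta_db / 10)

def loudness_ratio(delta_db: float) -> float:
    """Approximate perceived-loudness ratio (psychoacoustic rule of thumb)."""
    return 2 ** (delta_db / 10)

# A 35 dB fan vs. a 25 dB idle card: 10x the sound power,
# but only about twice as loud to the ear.
print(power_ratio(35 - 25))     # -> 10.0
print(loudness_ratio(35 - 25))  # -> 2.0
```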


----------



## PontiacGTX

treetops422 said:


> lol I've been gaming since the 90s, and I've never heard of anyone complaining about someone else's computer fans being too loud. Coil whine sucks, but that's another story.


Try having a reference blower card hold a decent temperature and then you will... because the reference cooler is just annoying on Vega, and reviews show the same noise level on the RX 5700.


----------



## ilmazzo

PontiacGTX said:


> Try having a reference blower card hold a decent temperature and then you will... because the reference cooler is just annoying on Vega, and reviews show the same noise level on the RX 5700.


Nope, mate.

It's not as quiet as an Nvidia FE, but it's also not like a Vega 64 (even without undervolting it).

Wait for the AIBs if you're very concerned about noise; this should have been clear 100 pages ago in this topic, but still...


----------



## miklkit

Fan noise on video cards...........
I've spent enough years playing with air cooling (there is a big bin full of fans) that I can tell roughly how a fan will perform just by looking at it, and all the GPUs I've owned have had the worst of the worst fans. Lotsa noise and poor cooling.



They are high-speed fans designed to move a lot of air in an empty room, not to push air through fin stacks. All they do is make noise; good case fans help more with cooling. This thread has convinced me: hopefully I will be buying a 5700XT and an Arctic cooler. I know those fans, and they are quiet and designed to push air through fin stacks.


----------



## doom26464

AMD really did hurt overall impressions with the blower fan.

I think Navi has great potential, and this is the first time in a long while I would consider AMD cards for upcoming builds.

But as has been echoed, the AIB cards will be the ones to wait for, along with pricing. Great potential for the AIBs to shine here if they play it smart.


----------



## AlphaC

https://www.phoronix.com/scan.php?page=news_item&px=RadeonSI-Navi-10-Evolve-Launch


Linux AMD RX 5700 series performance gains since launch (less than 1 month ago)


Unigine Heaven
RX 5700 : 117 to 126
RX 5700XT : 132 to 140


Result overview vs Nvidia cards
https://www.phoronix.com/scan.php?page=article&item=rx-5700-july&num=4
_If taking a look at today's metrics, the Radeon RX 5700 XT based on the geometric mean of all these gaming benchmarks was 5% faster than the GeForce RTX 2070 while the Radeon RX 5700 was just above the GTX 1080. But in some of the games performing very well on Linux, the Radeon RX 5700 XT was similar to the RTX 2080 and even the RX 5700 hitting the RTX 2070 performance._
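Those Heaven numbers work out to mid-single-digit driver gains in under a month; a quick sanity check on the quoted figures:

```python
# Percentage gains implied by the Unigine Heaven scores quoted above.
def pct_gain(before: float, after: float) -> float:
    return (after - before) / before * 100

print(round(pct_gain(117, 126), 1))  # RX 5700:    -> 7.7 (%)
print(round(pct_gain(132, 140), 1))  # RX 5700 XT: -> 6.1 (%)
```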


----------



## PontiacGTX

ilmazzo said:


> Nope, mate.
> 
> It's not as quiet as an Nvidia FE, but it's also not like a Vega 64 (even without undervolting it).
> 
> Wait for the AIBs if you're very concerned about noise; this should have been clear 100 pages ago in this topic, but still...


https://www.computerbase.de/2019-07/geforce-rtx-2080-super-test/3/#abschnitt_lautstaerke__kuehlung
Vega 56*


----------



## Hwgeek

So AMD has N7P to play with; maybe this is what they're gonna use for the 5800/5900?


----------



## Newbie2009

Query: I haven't seen anyone mention voltage overclocking? What's the max voltage allowed in Wattman for a vanilla 5700 XT?

I read about the PowerPlay mod and thought it was odd to need more power. +25% puts my card at 250W max and +50% at 300W, although at stock volts I've never seen it go that high.

1.3V is the max I can apply in Wattman, which I've yet to do.


----------



## keikei

So 4k gaming. 5700XT vs. 2070S. Who wins?


----------



## ilmazzo

Newbie2009 said:


> Query: I haven't seen anyone mention voltage overclocking? What's the max voltage allowed in Wattman for a vanilla 5700 XT?
> 
> I read about the PowerPlay mod and thought it was odd to need more power. +25% puts my card at 250W max and +50% at 300W, although at stock volts I've never seen it go that high.
> 
> 1.3V is the max I can apply in Wattman, which I've yet to do.


1.3V is the max even in the power tables, so...


----------



## ilmazzo

ilmazzo said:


> Lag reduction tested
> 
> .....
> 
> Other thing: your thoughts regarding this tests?
> 
> .......
> 
> 
> This seems to prove that 5700 XT has 48rops working like the 5700 whilst having 64rops so it is missing a raster seems..... I would point to the benchmarks or driver issue with them...... the 5700xt edges anyway the 5700 due to clocks and ram.


boring stuff?!?!? better go on with noise whining....


----------



## Ashura

keikei said:


> So 4k gaming. 5700XT vs. 2070S. Who wins?


Results are quite conflicting. They both seem pretty close. The 2070S wins mostly, but not by much.


----------



## Rei86

doom26464 said:


> AMD really did hurt overall impressions with the blower fan.
> 
> I think Navi has great potential, and this is the first time in a long while I would consider AMD cards for upcoming builds.
> 
> But as has been echoed, the AIB cards will be the ones to wait for, along with pricing. Great potential for the AIBs to shine here if they play it smart.


Nah man the real issue is

1. Why did the Vega VII get an open-air design that didn't carry over to the RX 690... I mean, RX 5700 series?
2. Why the soft launch on 7/7/2019 when AIBs won't have their versions out till mid-August? Why can Nvidia have their AIBs launch custom coolers pretty close to the FE launch, but AMD can't?


----------



## ToTheSun!

keikei said:


> So 4k gaming. 5700XT vs. 2070S. Who wins?


General look? Seems like a wash.

Some games could decide either way, depending on what those are.


----------



## keikei

Ashura said:


> Results are quite conflicting. They both seem pretty close. The 2070S wins mostly, but not by much.





ToTheSun! said:


> General look? Seems like a wash.
> 
> Some games could decide either way, depending on what those are.


Unreal engine, mainly, Street Fighter V.


----------



## Ashura

keikei said:


> Unreal engine, mainly, Street Fighter V.


Difficult to answer. But I reckon the 5700xt would be the winner based on price to perf.


----------



## ZealotKi11er

keikei said:


> So 4k gaming. 5700XT vs. 2070S. Who wins?


From the looks of things, 5700 XT perf drops at 4K relative to other resolutions. The 2070S will probably be a bit faster there relative to the 1440p/1080p fps difference.


----------



## chas1723

I was all set to go with the 5700 XT, as I watercool my CPU and GPU so noise is not an issue. I then noticed Nvidia has unlocked 10-bit color for OpenGL apps such as Photoshop. This will allow me to use my 10-bit color-accurate monitor to its full potential. This is something AMD has not done; you have to buy a professional card from AMD to get that feature. I will probably now be going with Nvidia because of this.

Sent from my SM-N950U using Tapatalk


----------



## 113802

chas1723 said:


> I was all set to go with the 5700 XT, as I watercool my CPU and GPU so noise is not an issue. I then noticed Nvidia has unlocked 10-bit color for OpenGL apps such as Photoshop. This will allow me to use my 10-bit color-accurate monitor to its full potential. This is something AMD has not done; you have to buy a professional card from AMD to get that feature. I will probably now be going with Nvidia because of this.
> 
> Sent from my SM-N950U using Tapatalk


You should have been buying an RTX 2070 Super in the first place. RX 5700 XT + water block = $520+.

The RTX 2070 Super AIB cards are already quiet, and you do not need a waterblock.

You should wait for AMD's SIGGRAPH presentation before making a decision on a card.


----------



## ToTheSun!

Ashura said:


> Difficult to answer. But I reckon the 5700xt would be the winner based on price to perf.


If UE4 is the decider, the choice is pretty simple: nVidia.


----------



## chas1723

WannaBeOCer said:


> You should have been buying an RTX 2070 Super in the first place. RX 5700 XT + water block = $520+.
> 
> The RTX 2070 Super AIB cards are already quiet, and you do not need a waterblock.
> 
> You should wait for AMD's SIGGRAPH presentation before making a decision on a card.


Nope, because I like maximum cooling at the lowest noise level. Turing loses clock speed somewhere around 45°C. I will watercool whatever I get.

Sent from my SM-N950U using Tapatalk


----------



## PontiacGTX

chas1723 said:


> Nope, because I like maximum cooling at the lowest noise level. Turing loses clock speed somewhere around 45°C. I will watercool whatever I get.
> 
> Sent from my SM-N950U using Tapatalk


There are AIOs for $55 which work on Navi, but it will run at a higher temperature than 45°C.


----------



## looniam

ToTheSun! said:


> If UE4 is the decider, the choice is pretty simple: nVidia.


you are so WISE! :glasses


----------



## ToTheSun!

looniam said:


> you are so WISE! :glasses


I'm actually surprised I was allowed to recommend nVidia. Maybe they're all at SIGGRAPH.


----------



## keikei

ToTheSun! said:


> If UE4 is the decider, the choice is pretty simple: nVidia.



See, now you've gone and made me seriously consider Green. I'm very partial to fighting games and dat engine is very popular due to the hit mechanics/physx.


----------



## chas1723

PontiacGTX said:


> There are AIOs for $55 which work on Navi, but it will run at a higher temperature than 45°C.


I've already got everything for a good loop minus the block. 

Sent from my SM-N950U using Tapatalk


----------



## magnek

ToTheSun! said:


> General look? Seems like a wash.
> 
> Some games could decide either way, depending on what those are.


Play UE4 games, get nVidia. Do Battles in the Vield all day long, get AMD. Fairly simple dichotomy.


----------



## miklkit

It's always been like that. A little research into which software works best with your hardware helps a lot. For instance a few years ago I was trying to decide between The Witcher 3 or Fallout 4. When I found out that Fallout 4 is very AMD hostile the choice became easy. Currently the only UE4 based games I own are so old it doesn't matter as they are stuck on the fps limiter anyway.


----------



## Heuchler

AMD RDNA 1.0 Instruction Set Architecture [PDF]
https://gpuopen.com/wp-content/uploads/2019/08/RDNA_Shader_ISA_7July2019.pdf


----------



## treetops422

I've had my 5700 for a few days; it's not loud. It hits 72°C without using PowerPlay, OC'ing to what Afterburner will allow while running the Fire Strike demo. I manually turned the fan up to 100% in AB and it's a jet engine, lol. Luckily the fan doesn't go beyond 35% or so on auto. I spent most of my time reinstalling Windows 7 and such for my new system; freakin' MS has made 7 a huuuuge hassle.


----------



## ejb222

My 5700 XT is not too bad at all either. Just got it today and have been playing around with undervolting... never done it before. Is this a bad start?


----------



## ilmazzo

Nothing new

Some people like to have onanistic talks over graphs, where 3% makes a product good or trash... enjoy your card.


----------



## JackCY

Yes because clocks mean nothing without performance validation.


----------



## bigjdubb

keikei said:


> See, now you've gone and made me seriously consider Green. I'm very partial to fighting games and dat engine is very popular due to the hit mechanics/physx.


Do any cards have a difficult time running the fighting games you like? Games should be the deciding factor, but don't choose games that will run at high frame rates no matter the card.


----------



## ZealotKi11er

bigjdubb said:


> Do any cards have a difficult time running the fighting games you like? Games should be the deciding factor, but don't choose games that will run at high frame rates no matter the card.


Would Anti-Lag benefit fighting games?


----------



## Blackops_2

Lisa Su apparently mentioned recently that big Navi is coming. Of course, completely ambiguous as to when; December would be incredible (highly unlikely). Looks like an AIB 5700 XT is the route I'll head; hope MSI releases a Lightning. I feel like with better memory we'd see more gains from the OCs.


----------



## ilmazzo

Blackops_2 said:


> Lisa Su apparently mentioned recently that big Navi is coming. Of course, completely ambiguous as to when; December would be incredible (highly unlikely). Looks like an AIB 5700 XT is the route I'll head; hope MSI releases a Lightning. I feel like with better memory we'd see more gains from the OCs.


16 Gbps memory saw the light only on 2080 Ss (the second s is for plural).

I think it will be fine if Navi's RAM can be overclocked in the usual software, but as with every AMD release, the software needs to catch up (and it's usually AMD's "fault", due to deep changes in the API implementation at the driver level).

big bamboo navi will be THE card to LC for the next 10 years, raven segna.


----------



## maltamonk

Bah.....big navi is okay I reckon, but I want rx570-590 replacements. I guess they don't really have any motivation as the 570 and 580 simply crush everything in that segment p/p wise.


----------



## bigjdubb

ZealotKi11er said:


> Would Anti-Lag benefit fighting games?



I'm not the right person to answer that, I think anti lag is pointless no matter what the game is.


----------



## bucdan

maltamonk said:


> Bah.....big navi is okay I reckon, but I want rx570-590 replacements. I guess they don't really have any motivation as the 570 and 580 simply crush everything in that segment p/p wise.


I'm sure it's coming with the 5600 series. Some cut-down 5700s will fill that role in the $150-250 range.



bigjdubb said:


> I'm not the right person to answer that, I think anti lag is pointless no matter what the game is.


At least it doesn't cost anything, so it's just value added.


----------



## bigjdubb

Agreed, I'm not dogging the feature. I just don't think it's one of those features that should be a deciding factor between AMD or Nvidia. 

I don't think either company has a "feature" I would consider worthy of factoring into the choice between Nvidia and AMD. Performance (for your needs) and cost; everything else is garbage they use to try and sway your decision.


----------



## PontiacGTX

maltamonk said:


> Bah.....big navi is okay I reckon, but I want rx570-590 replacements. I guess they don't really have any motivation as the 570 and 580 simply crush everything in that segment p/p wise.


The 570/580 and 590 replacements are the RX 5700 series. That they cost more is a different thing, but they aren't high-end cards.


----------



## keikei

ASUS Radeon RX 5700 (XT) ROG STRIX & TUF pictured. No pricing info yet.



ROG STRIX RX 5700 available August 16th
TUF RX 5700 available August 23rd


----------



## bucdan

The TUF looks like they just copied the Radeon VII and changed it a little. Still waiting on what Sapphire and AsRock put out. I have no hopes for a decent XFX design.


----------



## Gunderman456

Blackops_2 said:


> Lisa Su apparently mentioned recently that big Navi is coming. Of course, completely ambiguous as to when; December would be incredible (highly unlikely). Looks like an AIB 5700 XT is the route I'll head; hope MSI releases a Lightning. I feel like with better memory we'd see more gains from the OCs.
> 
> https://www.youtube.com/watch?v=Zve0lJKHQ08


The 5700 and 5700XT are low end cards, should have been priced between $200-$250. MSI won't be doing lightning versions of these cards. Also anyone putting these under water would be like taking an R9 270 and 270X and putting them under water. Don't drink the Nvidia and now the AMD Koolaid.


----------



## rdr09

Gunderman456 said:


> The 5700 and 5700XT are low end cards, should have been priced between $200-$250. MSI won't be doing lightning versions of these cards. Also anyone putting these under water would be like taking an R9 270 and 270X and putting them under water. Don't drink the Nvidia and now the AMD Koolaid.


If an AMD low-end with an oc can nearly match a 700$ nvidia card, i'll take it. This is twice as fast as my current R9 290. Heck, locally where im at, a GTX 1050 Ti costs like 300$. lol


----------



## Gunderman456

rdr09 said:


> If an AMD low-end with an oc can nearly match a 700$ nvidia card, i'll take it. This is twice as fast as my current R9 290.


The only reason the 5700XT comes close is because Nvidia did not slip the 2080 = 1080 Ti performance level down to where it was supposed to be in the first place, as a 2070. For the first time ever they kept the 1080 Ti performance level and charged you the same as last gen with the 2080. With this simple skew and the naming schemes, manipulation of the consumer is now at an all-time high with both companies. Instead of correcting the market, AMD decided to succumb to it. I did not drink the Koolaid.


----------



## keikei

bucdan said:


> The TUF looks like they just copied the Radeon VII and changed it a little. Still waiting on what Sapphire and AsRock put out. I have no hopes for a decent XFX design.



No extra RGB is fine with me as i dont have a showcase tower. All i care about is quiet & cool for the cheapest.


----------



## rdr09

Gunderman456 said:


> The only reason it comes close is because Nvidia did not slip the 2080 = 1080 Ti performance level down to where it was supposed to be in the first place, as a 2070. For the first time ever they kept the 1080 Ti performance level and charged you the same as last gen with the 2080. With this simple skew, manipulation of the consumer is now at an all-time high with both companies. Instead of correcting the market, AMD decided to succumb to it. I did not drink the Koolaid.


The folks who bought a 2080 have their reason. They prolly want to use RT without paying 1200$. Check it out.


----------



## keikei

^True. RT has gotten better with Supa, but no one wants to talk about that.


----------



## AlphaC

Gunderman456 said:


> The 5700 and 5700XT are low end cards, should have been priced between $200-$250. MSI won't be doing lightning versions of these cards. Also anyone putting these under water would be like taking an R9 270 and 270X and putting them under water. Don't drink the Nvidia and now the AMD Koolaid.


The RX 5700 is a low-to-midrange card, since it has a 256-bit memory bus. It has a 180W TDP, which is unsuitable for a low-end card even with an undervolt. Also, it's no different than watercooling an RTX 2070 or a Vega 56 (which underperformed for its stock TDP).

The real "low end cards" are the rumored Navi 14 / RX 5600 on 128-bit memory bus and supposedly 4GB VRAM which will most likely be 75-120W TDP. They'll out-compete GTX 1650 for sure, maybe GTX 1660 (1408 CUDA, still on GDDR5 with 192-bit bus), and highly unlikely GTX 1660 Ti (1536 CUDA, Vega 56 performance level). Because the RX 570 and RX 580 still exist on the shelf at $150-180 prices, this hasn't had a reason to arrive yet.

https://www.notebookcheck.net/AMD-N...-replacement-for-Polaris-RX-580.427350.0.html

With 1536 shaders clocked similarly to Nvidia's GPUs, it should be at least GTX 1660 level (448 GB/s on a 256-bit memory bus = ~224 GB/s on a 128-bit bus, which is more than the GTX 1660's 192 GB/s and close to the 224-256 GB/s the 256-bit bus provided on the RX 470/RX 480).
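The bus-width scaling in that parenthetical is plain linear arithmetic; here's a quick sketch (the per-pin data rates are the commonly reported ones, so treat them as assumptions rather than confirmed specs):

```python
# Peak GDDR bandwidth: (bus width in bits / 8 bits per byte) * per-pin data rate.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 14.0))  # RX 5700 XT: 14 Gbps GDDR6, 256-bit -> 448.0
print(bandwidth_gb_s(128, 14.0))  # rumored Navi 14: same GDDR6, 128-bit -> 224.0
print(bandwidth_gb_s(192, 8.0))   # GTX 1660: 8 Gbps GDDR5, 192-bit -> 192.0
```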


----------



## bigjdubb

Gunderman456 said:


> The 5700 and 5700XT are low end cards, should have been priced between $200-$250. MSI won't be doing lightning versions of these cards. Also anyone putting these under water would be like taking an R9 270 and 270X and putting them under water. Don't drink the Nvidia and now the AMD Koolaid.


Why does a card have to be expensive or high end to be worthy of water cooling? There is nothing wrong with water cooling an RX 560, especially if your system already has a loop in it. 

Maybe it would seem silly if you were trying to build a showpiece system (hard lines, compression fittings etc..) around a cheap gpu, but slapping on a universal block with barb fittings and soft tubes is a good idea for any gpu.


----------



## PontiacGTX

bigjdubb said:


> Why does a card have to be expensive or high end to be worthy of water cooling? There is nothing wrong with water cooling an RX 560, especially if your system already has a loop in it.
> 
> Maybe it would seem silly if you were trying to build a showpiece system (hard lines, compression fittings etc..) around a cheap gpu, but slapping on a universal block with barb fittings and soft tubes is a good idea for any gpu.


Maybe consider it has limited overclocking potential? The RX 5700 non-XT has proven to be an awful overclocker, or maybe the card can only do it with non-reference cooling. Though if it overclocks beyond 2.2GHz, maybe it's worth water cooling.


----------



## rdr09

PontiacGTX said:


> Maybe consider it has limited overclocking potential? The RX 5700 non-XT has proven to be an awful overclocker, or maybe the card can only do it with non-reference cooling. Though if it overclocks beyond 2.2GHz, maybe it's worth water cooling.


No need to oc. Don't even need to install the block. Just stare at it all day.


----------



## bigjdubb

Well overclocking isn't all that great of a reason to watercool anymore. Noise and aesthetics are where the cost of water cooling starts to make sense. Even watercooling on the cheap is a bit of a waste if all you want it for is overclocking.


----------



## ilmazzo

Huuuuuuuuuuuuuuge


----------



## 113802

bigjdubb said:


> Well overclocking isn't all that great of a reason to watercool anymore. Noise and aesthetics are where the cost of water cooling starts to make sense. Even watercooling on the cheap is a bit of a waste if all you want it for is overclocking.


Water cooling is still valuable for overclocking. Just look at Vega 10 and Vega 20: Vega 20 outperforms an RTX 2080, OC vs OC, with water. The Radeon VII thermal throttles out of the box with the stock heatsink. AIB cards are quiet with these sub-250W cards, and water cooling helps sustain high frequencies.

The only reason water cooling isn't as beneficial for Navi is the memory clock.


----------



## Michael Paul

Created an account to show my watercooled 5700xt AE results using the tables. Thought I'd shoot holes in some of the things I was reading here. I didn't get to do much tweaking yet as I had to rebend a few tubing runs and irl stuff etc. 

I've managed 995mhz on my Samsung memory and the core boosts just over 2200mhz at times.

https://www.3dmark.com/fs/20048410
https://www.3dmark.com/spy/8005170


----------



## Michael Paul

Those obviously aren't my everyday settings but I'm very happy with the performance. Things will get better as they optimize the driver as well. There's some small bugs but I sorta figured out a workaround to dealing with Radeon settings glitches.


----------



## 113802

Michael Paul said:


> Created an account to show my watercooled 5700xt AE results using the tables. Thought I'd shoot holes in some of the things I was reading here. I didn't get to do much tweaking yet as I had to rebend a few tubing runs and irl stuff etc.
> 
> I've managed 995mhz on my Samsung memory and the core boosts just over 2200mhz at times.
> 
> https://www.3dmark.com/fs/20048410
> https://www.3dmark.com/spy/8005170


Nice results; what were you trying to shoot holes in with them? It's clear a water-cooled, overclocked RX 5700 XT isn't faster than an air-cooled RTX 2070 Super overclocked.


----------



## Blackops_2

I'll be honest, I love water cooling for the added performance and acoustics, but a lot of it is aesthetic for me. My first hardline rig is by far the cleanest rig I've ever done. I'll be building again come December. I guess at worst I could pick up a Vega 64 to hold me over until big Navi instead of a 5700XT, though if AIB 5700XTs have memory that's clockable I'm going to get one. A V64 is still about $300 used.


----------



## treetops422

PontiacGTX said:


> Maybe consider it has limited overclocking potential? The RX 5700 non-XT has proven to be an awful overclocker, or maybe the card can only do it with non-reference cooling. Though if it overclocks beyond 2.2GHz, maybe it's worth water cooling.


 1. What overclock gains does the 5700 get with the blower? 

2. What overclock gains does the 2060 get?
3. What do you consider good gains?
4. What do you consider awful gains?
5. What do you consider normal gains?


----------



## PontiacGTX

treetops422 said:


> 1. What overclock gains does the 5700 get with the blower?
> 
> 2. What overclock gains does the 2060 get?
> 3. What do you consider good gains?
> 4. What do you consider awful gains?
> 5. What do you consider normal gains?


The 5700 is objectively worse than the 5700 XT, so overclocking that model is not going to net as much performance gain as the 5700 XT. So far the reference 5700 XT can reach 2.1-2.2GHz, while the RX 5700 can't reach similar clock speeds.


----------



## Blackops_2

Thought the 5700 was locked?


----------



## 113802

Blackops_2 said:


> Thought the 5700 was locked?


It is locked, but it can be unlocked by using PowerPlay tables.


----------



## Blackops_2

WannaBeOCer said:


> It is locked, but it can be unlocked by using PowerPlay tables.


Gotcha and it still doesn't clock well?


----------



## treetops422

PontiacGTX said:


> The 5700 is objectively worse than the 5700 XT, so overclocking that model is not going to net as much performance gain as the 5700 XT. So far the reference 5700 XT can reach 2.1-2.2GHz, while the RX 5700 can't reach similar clock speeds.


lol why would you expect it to reach the same clock speeds as the 5700 xt?


----------



## treetops422

https://translate.google.com/transl...playtables-fuer-die-rx-5700-und-rx-5700-xt/3/



So yeah, you can overclock these beasts. Those results are on liquid cooling; I saw a guy post 2000MHz on the 5700 with the stock blower at 90% fan speed. I've never used PowerPlay tables, so I need to do some more research before I dive in.



**Radeon RX 5700**
- *Vanilla5700*: restores the card's original factory values
- *MorePower5700*: max 2100 MHz GPU, max 1000 MHz VMem, max +50% power limit
- *EvenMorePower5700*: max 2100 MHz GPU, max 1000 MHz VMem, max +90% power limit, 1225 mV Vcore (experimental, not final)

**Radeon RX 5700 XT**
- *Vanilla5700XT*: restores the card's original factory values
- *MorePower5700XT*: max 2300 MHz GPU, max 1000 MHz VMem, max +70% power limit
- *EvenMorePower5700XT*: max 2300 MHz GPU, max 1000 MHz VMem, max +90% power limit, 1225 mV Vcore
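For a rough sense of what those percentage limits mean in watts, here's a sketch assuming the percentage applies to AMD's quoted total board power (180 W for the 5700, 225 W for the XT); some tools scale a different power figure, so treat the baselines as assumptions:

```python
# Translate the percentage power limits above into watts, assuming they
# scale total board power (an assumption -- some tools apply the slider
# to GPU-only power instead).
BASE_TBP_W = {"RX 5700": 180, "RX 5700 XT": 225}

def power_limit_w(card: str, pct: int) -> float:
    return BASE_TBP_W[card] * (1 + pct / 100)

print(power_limit_w("RX 5700", 50))     # MorePower5700:       -> 270.0 W
print(power_limit_w("RX 5700 XT", 90))  # EvenMorePower5700XT: -> 427.5 W
```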


----------



## ilmazzo

Silicon lottery

Once unlocked, the GPUs will OC according to silicon quality and power delivery/cooling, whether vanilla or XT.

Like every other gpus of the last century


----------



## ToTheSun!

ilmazzo said:


> Like every other gpus of the last century


I'd go as far as saying it's like every GPU of the last millennium!


----------



## EastCoast

criminal said:


> I guess I am in a twilight zone when it comes to some of the responses... I had the 5700XT *for over a week and found the card and software to be a miserable experience*.





criminal said:


> *I don't appear to have or notice the issue you speak of.* It might just be the bias in me not noticing though...lol





criminal said:


> When you have been in IT as long as I have, *going home to "fix" more issues is not something I enjoy.*
> 
> :thumb:





criminal said:


> Navi, RDNA or whatever you want to call it has so much potential. Again, *very disappointed from my experience with the software side*. BUT if people knew me in real life, they would completely understand *why I can't and won't put up with the nonsense.
> *
> 
> 
> Oh I agree. That's why I kinda feel bad, talking bad about the card and AMD. *But I am not a beta tester* and the truth is the truth. Why do I care about how new something is if I get a negative experience from it?





NVIDIA Issues Windows 10 Warning: Update GeForce, NVS, Quadro And Tesla Drivers Now


> No less than five security vulnerabilities have been found in the NVIDIA GeForce, NVS, Quadro And Tesla graphics processing unit (GPU) display drivers for Windows.
> 
> Three of these are rated as high severity and unless updated could enable a denial of service attack as well as the escalation of privileges to enable an attacker to execute malware on your Windows computer.


https://www.forbes.com/sites/daveyw...vs-quadro-and-tesla-drivers-now/#67bff24710de

LOL, these kinds of posts simply write themselves, in a horrific-comedy sense.
Sometimes one doesn't even know when they're a beta tester, nor understand when there's actually a real issue on the software side.

Meanwhile...
Adrenalin 2019 Edition 19.7.*5* Optional
File Size: 411 MB
Release Date: 7/31/2019


----------



## ZealotKi11er

WannaBeOCer said:


> Water cooling is still valuable for overclocking. Just look at Vega 10 and Vega 20. Vega 20 outperforms a RTX 2080 OC vs OC with water. The Radeon VII thermal throttles out the box with the stock heatsink. AIB cards are quiet with these sub 250w cards. Water cooling helps sustain high frequencies.
> 
> Only reason water cooling isn't beneficial for Navi is due to the memory clock.


AMD should improve memory performance with Navi 2. They could get a 40 CU card to 2080 level with improvements on the memory side alone.


----------



## ToTheSun!

Looks like SIGGRAPH is over. They started coming back.


----------



## AlphaC

https://www.igorslab.media/morepowe...x-5700-xt-tweaking-and-overclocking-software/
*"The program changes the upper and lower limits of the Wattman settings or blocks input options or releases them. However, the software does not replace the Wattman from the Radeon software or other overclocking tools, but extends their input options instead! But what is absolutely necessary: the program must be started as administrator!"*




> *Overdrive Limits*
> 
> The name actually says it all and the labeling of the individual fields is also unambiguous and self-explanatory (bottom right picture). Values that exceed or fall below the possible limits are ignored by the Wattman.


----------



## PontiacGTX

treetops422 said:


> lol why would you expect it to reach the same clock speeds as the 5700 xt?


The 7970 and 7950 reached the same clock speeds.
The R9 290X and 290 reached the same clock speeds.
The R9 Fury and Fury X reached the same clock speeds.
The RX 470 and RX 480 reached similar clock speeds.
Vega 56 and Vega 64 are more or less the same; they're too power limited.

Why wouldn't it? AMD binning chips for the XT model is the only explanation. GPUs before the RX Vega/500 series weren't binned unless they were non-reference models; maybe the one exception is the 7970 GHz Edition, but that probably wasn't much better than a regular 7970.


----------



## treetops422

PontiacGTX said:


> The 7970 and 7950 reached the same clock speeds.
> The R9 290X and 290 reached the same clock speeds.
> The R9 Fury and Fury X reached the same clock speeds.
> The RX 470 and RX 480 reached similar clock speeds.
> Vega 56 and Vega 64 are more or less the same; they're too power limited.
> 
> Why wouldn't it? AMD binning chips for the XT model is the only explanation. GPUs before the RX Vega/500 series weren't binned unless they were non-reference models; maybe the one exception is the 7970 GHz Edition, but that probably wasn't much better than a regular 7970.


So you're saying the 2060 and 2060 Super are bad at overclocking because the 2060 doesn't match the 2060 Super core clock speed? If you say so. And the same goes for the 2070 and 2070 Super, and the 2080 and 2080 Super. Man, that's harsh. But I'll take your word for it.


----------



## ZealotKi11er

PontiacGTX said:


> The 7970 and 7950 reached the same clock speeds.
> The R9 290X and 290 reached the same clock speeds.
> The R9 Fury and Fury X reached the same clock speeds.
> The RX 470 and RX 480 reached similar clock speeds.
> RX Vega 56 and Vega 64 were more or less the same; both were too power limited.
> 
> Why wouldn't it? AMD releasing only binned chips for the XT model is the only explanation. GPUs before the RX Vega/500 series weren't binned unless they were non-reference models; maybe an exception is the 7970 GHz Edition, but that probably wasn't much better than a regular 7970.


1) As GPUs get newer, you can see that the Vega 64 in fact overclocks better on average. The same goes for the 480/290X/7970.
2) The 5700 and 5700 XT could be binned differently than before.
3) Some R9 290 and HD 7950 cards could unlock to the full die. The 5700 could just be an XT that's a poor clocker, not a failed chip that can't run 40 CUs.


----------



## PontiacGTX

treetops422 said:


> So you're saying the 2060 and 2060 Super are bad at overclocking because the 2060 doesn't match the 2060 Super core clock speed? If you say so. And the same goes for the 2070 and 2070 Super, the 2080 and 2080 Super, man that's harsh. But I'll take you're word for it.


Well, at least for AMD GCN; I never mentioned Nvidia. I didn't know that AMD made the Turing architecture.



ZealotKi11er said:


> 1) As GPUs get newer, you can see that the Vega 64 in fact overclocks better on average. The same goes for the 480/290X/7970.
> 2) The 5700 and 5700 XT could be binned differently than before.
> 3) Some R9 290 and HD 7950 cards could unlock to the full die. The 5700 could just be an XT that's a poor clocker, not a failed chip that can't run 40 CUs.


Well, Vega 64 had a less restrictive power limit in its BIOS. And that's not really right: a 7970 and a 7950 could both reach similar clock speeds; my 7950 and 7970 had no problem doing so. I've seen most people with a 290 and a 290X hit similar clocks, and the performance difference was minimal. The 470? I could OC mine to 1410 MHz with ease, but the VRM temps were the limit; AFAIK the 480 couldn't go much past ~1400 MHz.

Points 2 and 3 are true, though.


----------



## PontiacGTX

Powercolor


----------



## 113802

ZealotKi11er said:


> 1) As GPUs get newer, you can see that the Vega 64 in fact overclocks better on average. The same goes for the 480/290X/7970.
> 2) The 5700 and 5700 XT could be binned differently than before.
> 3) Some R9 290 and HD 7950 cards could unlock to the full die. The 5700 could just be an XT that's a poor clocker, not a failed chip that can't run 40 CUs.


AMD set artificial limitations on the Vega 56. Pretty much all Vega 56 reference cards could run at Vega 64 speeds after flashing the 64 bios. I'm sure when a flash tool is released for Navi we'll be seeing users flash the XT bios on the 5700 and get similar clock speeds. 

The only cards that AMD bins are the RX Vega 64 LC and the 5700 XT 50th Anniversary Edition.

This is the reason why AMD set these artificial limitations: https://videocardz.com/72299/amd-radeon-rx-vega-56-gets-faster-with-vega-64-bios


----------



## criminal

EastCoast said:


> NVIDIA Issues Windows 10 Warning: Update GeForce, NVS, Quadro And Tesla Drivers Now
> 
> https://www.forbes.com/sites/daveyw...vs-quadro-and-tesla-drivers-now/#67bff24710de
> 
> LOL, these kind of posts simply write themselves in a horrific comedy sense.
> Sometimes one doesn't even know when they are a beta tester nor understand when there is actually a real issue on the software side.
> 
> Meanwhile...
> Adrenalin 2019 Edition 19.7.*5* Optional
> File Size: 411 MB
> Release Date: 7/31/2019


Holy bring up the past Batman. You are one of those people that just hang on to things huh? 

And ah... 


> However, the warning is tempered somewhat by the fact, as reported by Bleeping Computer, that all the vulnerabilities "require local user access and cannot be exploited remotely, with potential attackers having to rely on user interaction to execute malicious code designed to exploit one of the fixed bugs on machines with unpatched display drivers."


Although not something that I would shrug off, it is not like this is something that will just happen without user interaction.


----------



## maltamonk

XFX brought back the 280x DD Cooler design and are calling it Thicc2.......lol

https://videocardz.com/newz/xfx-radeon-rx-5700-xt-thicc2-pictured


----------



## 113802

maltamonk said:


> XFX brought back the 280x DD Cooler design and are calling it Thicc2.......lol
> 
> https://videocardz.com/newz/xfx-radeon-rx-5700-xt-thicc2-pictured


Plastic chrome grille on a video card?


----------



## ToTheSun!

criminal said:


> Holy bring up the past Batman. You are one of those people that just hang on to things huh?
> 
> And ah...
> 
> 
> Although not something that I would shrug off, it is not like this is something that will just happen without user interaction.


Didn't you know? We've entered the age when hardware that could leak even the most modest bit of data, when tampered with locally by a million hackers, is considered vulnerable and must absolutely be patched into near non-functionality.

nVidia released a fix so quickly for this issue that even the AMD squad was late to talk about it.


----------



## AlphaC

https://techgage.com/article/amd-radeon-navi-vs-nvidia-geforce-super-proviz/2/

Garbage in 3ds Max, mediocre in Maya & PTC Creo, decent in SolidWorks, great in CATIA.

Also, the RTX Supers are better when RT cores are used (such as Blender rendering and Agisoft Metashape).




> Unfortunately for AMD, its Navi cards didn’t fare too well across a bunch of our tests. It did perform extremely well in SolidWorks, CATIA and Blender viewport tests, but in some others, like Metashape’s depth map generation and LuxMark’s Hotel render, both 5700-series cards nonsensically fell behind the older RX 590.


----------



## maltamonk

AlphaC said:


> https://techgage.com/article/amd-radeon-navi-vs-nvidia-geforce-super-proviz/2/
> 
> garbage in 3dsmax, mediocre in Maya & PTC Creo, decent Solidworks, great in CATIA
> 
> Also RTX super is better when RTX cores are used (such as Blender render and Agisoft Metashape)


Isn't that the point of these cards though? They did the whole non-gaming thing in the past and it didn't sell so well and they got blasted for it. Now they go the gaming route and get criticized for it. Kinda a Catch-22 don't ya think?


----------



## AlphaC

In the past their cards didn't sell well because of power/temp/noise.

The HD 7950 / HD 7970 provided a lot of FP64 performance for the masses; now they don't have decently high FP64 except on the Radeon VII.

If they don't have the compute performance, they need perf/watt and perf/area to drop pricing or drop tiers.

They should've been releasing Navi with RX 590-like performance at 75W, Vega 56 performance at 150W, and Radeon VII performance at ~175W (the Radeon RX 5700 XT basically is this).

Anything without RT/tensor cores would need to be up against the GTX 1650 through GTX 1660 Ti, which is the <$350 market, unless the performance is over a tier higher. Right now it isn't:
RX 5700 is on par with RTX 2060 Super/RTX 2070 for a $350 non-AIB model
RX 5700 XT is behind RTX 2070 Super for a $400 non-AIB model

*~$150* GTX 1660 performance in 75-100W for GTX 1650 pricing would sell ... there's no RTX here; this would also appeal to Linux users as a host GPU for Threadripper or Skylake-X (GT 1030 and GTX 1050 cards are often used as host cards for GPU passthrough)
---> also competing against their own RX 570 and severely discounted RX 580 until they are sold out
*~$220* close to GTX 1660 Ti performance in ~150W for GTX 1660 pricing would sell ... there's no RTX here
*~$300* RTX 2060 performance for GTX 1660 Ti pricing would sell (RX 5700); 180W or so is reasonable but higher, and at $350 + AIB partner cooler cost it's near $400
---> also competing against their own Vega 56 cards that have not sold out
(RTX becomes a larger factor at ~$400)

*$400* RTX 2070 performance for RTX 2060 pricing might not necessarily sell if RT features are utilized and the bulk of the market is ~$200-350 (see GTX 1660 Ti, GTX 1660, GTX 1060, GTX 970, GTX 960 which had a gimped memory bus, GTX 760 = GTX 670, GTX 660 Ti, GTX 660, GTX 560 Ti, GTX 460, GTX 260 Core 216 which was GDDR3, etc.)
*$500+*: RTX 2080 Super performance at a minimum, unless there's a feature that makes Radeon stand out...

AMD needed to take into account the number of people who will see similar pricing and opt for the RTX cards because of the extra options/features in the same power and monetary budget. Radeon Anti-Lag isn't enough to sway content creators, and Freesync has "G-Sync Compatible" to deal with. They needed Maya/3ds Max performance (which are non-engineering, so more likely to be on GeForce) to match RTX Quadros or something similar. Given the poor showing by the Radeon Pro WX 8200, that needs to happen at the pro driver level and not just on consumer Radeons. That's not even including the Blender render results.


----------



## ZealotKi11er

maltamonk said:


> Isn't that the point of these cards though? They did the whole non-gaming thing in the past and it didn't sell so well and they got blasted for it. Now they go the gaming route and get criticized for it. Kinda a Catch-22 don't ya think?


Cant make anyone happy lol. What RDNA is now, it is perfect. Let AMD GCN handle compute applications.


----------



## PontiacGTX

AlphaC said:


> In the past their cards didn't sell well because of power/temp/noise.
> 
> HD 7950 / HD7970 provided a lot of FP64 performance for the masses , now they don't have decently high FP64 except on Radeon VII.
> 
> If they don't have the compute performance they need to have perf/watt and perf/area to drop pricing or drop tiers.
> 
> Should've been releasing Navi with RX 590-like performance at 75W , Vega 56 performance at 150W , Radeon VII performance ~ 175W (Radeon RX 5700 XT basically is this).
> 
> Anything without RT/tensor cores would need to be up against GTX 1650 through GTX 1660Ti, which is <$350 market unless the performance is over a tier higher. Right now it isn't:
> RX 5700 is on par with RTX 2060 Super/RTX 2070 for a $350 non-AIB model
> RX 5700XT is behind RTX 2070 Super for a $400 non-AIB model
> 
> *~$150* GTX 1660 performance in 75-100W for GTX 1650 pricing would sell ... there's no RTX here , this would also appeal to Linux users as a host GPU for Threadripper or Skylake-X (GT 1030 cards and GTX 1050 host cards are often used for GPU passthrough)
> ---> also competing against their own RX 570 and severely discounted RX 580 until they are sold out
> *~$220 *close to GTX 1660 Ti performance in ~150W for GTX 1660 pricing would sell ... there's no RTX here
> *~$300* RTX 2060 performance for GTX 1660 Ti pricing would sell (RX 5700) , 180W or so is reasonable but higher, at $350 + AIB partner cooler cost it's near $400
> ---> also competing against their own Vega 56 cards that have not sold out
> (RTX becomes a larger factor at ~$400)
> 
> *$400 *RTX 2070 performance for RTX 2060 pricing might not necessarily sell if RT features are utilized and the bulk of the market is ~$200-350 (see GTX 1660 Ti, GTX 1660, GTX 1060, GTX 970, GTX 960 which had gimped memory bus, GTX 760 = GTX 670 , GTX 660 Ti, GTX 660 , GTX 560 Ti, GTX 460 , GTX 260 core 216 which is GDDR3, etc)
> *$500+* : RTX 2080 Super performance at a minimum unless there's feature that makes Radeon standout...
> 
> AMD needed to take into account the amount of people that will see similar pricing and opt for the RTX cards because of having extra options/features in the same power & monetary budget. Radeon Anti-lag isn't enough to sway content creators and such and Freesync has "Gsync compatible" to deal with. They needed Maya/3dsmax performance (which are non engineering so more likely to be on Geforce) to match RTX Quadros or something similar to that. Given the poor showing by Radeon Pro WX8200 that needs to happen at the pro driver level and not just on consumer Radeons. That's not even including the Blender render results.


So you suggest AMD should drop the prices? Because if they can't compete against similarly priced cards with more features, maybe lowering the price might do it.


----------



## AlphaC

Before launch I predicted ~$330 / <$450 for the RX 5700 and RX 5700 XT, since they'd need to undercut RTX cards by at least 10% for the full die. The RTX 2070 was already selling for $430-450 at the time. The cut-down Navi version was clock limited, so the mere 10% cut for the RX 5700 was surprising; I expected something along the lines of a 15-25% shader reduction to fit a tighter TDP such as 150W. That was before the RTX 2070 Super (with NVLink) and RTX 2060 Super. With the memory bandwidth increase on the RTX 2060 Super, in addition to the 2 GB of added VRAM to achieve VRAM parity, AMD can't afford to price their RX 5700 merely close to it. They need to price it accordingly and match or undercut RTX 2060 non-Super AIBs (while keeping GTX 1660 Ti pricing in mind, so $260-280 is a price floor for a non-RTX card) while offering non-RTX value adds: an Xbox trial is a non-factor if you're using the GPU for Blender, for example.

That's on top of the issues people are having with the Navi GPUs' drivers not working for encoding or other tasks, Navi's Linux drivers being proprietary and not part of the kernel (which means the same install process as Nvidia GPUs), etc. Their own Radeon ProRender doesn't work on the RX 5700 series, and NVENC reportedly offers a superior encoding experience...

For some reason people think a reference blower RX 5700 for $330 is a deal right now... it really isn't when you need to spend at least $40 to make it cool and quiet, which pushes it to RTX 2060 Super / RTX 2070 pricing. The RX 5700 XT is more competitive for gaming-only workloads because the RTX 2070 Super is $500, but keep in mind that the RTX 2070 Super is also a bit faster. Until AIB partner cards come along for a reasonable price and the outstanding issues are fixed, they really need to step up competitiveness all around. I suspect the only people buying RX 5700 cards right now are AMD-only buyers; they haven't done enough to sway the brand-agnostic or even gain attention from Nvidia-biased people.

Even if you look at it from a strictly gaming point of view: in unfavorable titles the Navi cards are slower, and in favored titles such as Battlefield V the RTX cards are still within margin of error for 1% lows, except for Forza Horizon 4. For popular Unreal Engine titles such as Fortnite, the RX 5700 XT is minimally slower than the RTX 2060 Super. In Rainbow Six Siege, a top-10 Steam title, the RX 5700 XT manages to be faster than the RTX 2060 Super and within margin of the RTX 2070, but 10 FPS slower than the RTX 2070 Super. What I'm getting at is: in gaming there isn't a pure advantage, and in non-gaming it just isn't comparable.


----------



## Heuchler

With the Micro Center combo discount the RX 5700 drops to $250 ($299 - $50 for buying a Ryzen 3000-series processor).
I expect that will be the Black Friday sale price at a lot of retailers. Entry-level non-reference cards seem to be targeting the current MSRP.


----------



## Newbie2009

AlphaC said:


> Before launch I predicted ~$330 / <$450 for the RX 5700 and RX 5700XT since they'd need to undercut RTX cards by at least 10% for the full die. RTX 2070 was already selling for $430-450 at the time. The cut down Navi version was clock limited so the 10% cut for RX 5700 was surprising , I expected something along the lines of 15-25% shader reduction to fit a tighter TDP such as 150W. That was before RTX 2070 Super (with NVLink) and RTX 2060 Super. With the memory bandwidth increase on RTX 2060 Super in addition to the 2 GB added VRAM to achieve VRAM parity, AMD can't afford to price their RX 5700 merely close to it. They need to price it accordingly and match or undercut RTX 2060 non-Super AIBs (while keeping GTX 1660 Ti pricing in mind , so $260-280 is a price floor for a non-RTX card) while offering non-RTX value adds: a XBox trial is a noncontender if you're using the GPU for Blender for example.
> 
> That's on top of the issues people are having with the Navi GPUs' drivers not working for encoding or other tasks, Navi's Linux drivers being proprietary and not part of kernel (which means the same install process as Nvidia GPUs), etc. Their own Radeon Prorender doesn't work on RX 5700 series and NVENC reportedly offers a superior experience for encoding...
> 
> For some reason people think a reference blower RX 5700 for $330 is a deal right now... it really isn't when you need to spend at least $40 to make it cool and quiet which pushes it to RTX 2060 Super / RTX 2070 pricing. The RX 5700XT is more competitive for gaming-only workloads because RTX 2070 Super is $500 , but keep in mind that RTX 2070 Super is also a bit faster. Until AIB partner cards come along for a reasonable price and the outstanding issues are fixed, they really need to step up competitiveness all around. I suspect the only people buying the RX 5700 cards right now are AMD-only buyers , they haven't done enough to sway the brand-agnostic or even gain attention from Nvidia-biased people.
> 
> Even if you look at it from a strictly gaming point of view: in unfavorable titles the Navi cards are slower and in favored titles such as Battlefield V the RTX cards are still within margin of error for 1% lows except for Forza Horizon 4. For popular Unreal Engine titles such as Fortnite, the RX 5700XT is minimally slower than RTX 2060 Super. In Rainbow Six Siege which is a top 10 steam title, the RX 5700XT manages to be faster than RTX 2060 Super and within margin of RTX 2070 but 10 FPS slower than the RTX 2070 Super. What I'm getting at is : in gaming there isn't a pure advantage but in non-gaming it just isn't comparable.


If you want to keep the Navi cards cooler, they're great undervolters; no cost involved.

5700xt is going for £375, 2070 super is £550 if you want it now, in stock. 

Navi is a total steal.


----------



## treetops422

PontiacGTX said:


> Maybe consider that it has limited overclocking potential? The RX 5700 non-XT has proven to be an awful overclocker, or maybe the card can only do it with non-reference cooling; though if it overclocks beyond 2.2 GHz, it may be worth water cooling.





treetops422 said:


> 1. What overclock gains does the 5700 get with the blower?
> 2. What overclock gains does the 2060 get?
> 3. What do you consider good gains?
> 4. What do you consider awful gains?
> 5. What do you consider normal gains?





PontiacGTX said:


> The 5700 is objectively worse than the 5700 XT, so overclocking that model isn't going to net as much performance gain. So far the reference 5700 XT can reach 2.1-2.2 GHz, while the RX 5700 can't reach similar clock speeds.





treetops422 said:


> lol why would you expect it to reach the same clock speeds as the 5700 xt?





PontiacGTX said:


> The 7970 and 7950 reached the same clock speeds.
> The R9 290X and 290 reached the same clock speeds.
> The R9 Fury and Fury X reached the same clock speeds.
> The RX 470 and RX 480 reached similar clock speeds.
> RX Vega 56 and Vega 64 were more or less the same; both were too power limited.
> 
> Why wouldn't it? AMD releasing only binned chips for the XT model is the only explanation. GPUs before the RX Vega/500 series weren't binned unless they were non-reference models; maybe an exception is the 7970 GHz Edition, but that probably wasn't much better than a regular 7970.





treetops422 said:


> So you're saying the 2060 and 2060 Super are bad at overclocking because the 2060 doesn't match the 2060 Super core clock speed? If you say so. And the same goes for the 2070 and 2070 Super, and the 2080 and 2080 Super. Man, that's harsh. But I'll take your word for it.





PontiacGTX said:


> Well, at least for AMD GCN; I never mentioned Nvidia. I didn't know that AMD made the Turing architecture.


So you're saying that only AMD cards are bad at overclocking if they don't reach the same core clock...? Nvidia cards can have different overclocked core clocks, but AMD cards have to overclock to the same core clock. :thumb:




bigjdubb said:


> He didn't mention a single Nvidia card


Here you go, I went ahead and made it easier to understand why I mentioned Nvidia cards.


----------



## 113802

treetops422 said:


> So you're saying that only AMD cards are bad at overclocking if they don't reach the same core clock...? Nvidia cards can have different overclocked core clocks, but AMD cards have to overclock to the same core clock. :thumb:
> 
> 
> Here you go, I went ahead and made it easier to understand why I mentioned Nvidia cards.


I do expect the same GPU die to overclock the same. As stated previously, only the RX Vega 64 LC and the 5700 XT 50th Anniversary Edition are binned. The RX Vega 56 and RX 5700 have artificial power limitations set. Once people figure out how to flash the XT BIOS on a RX 5700, we'll see that the 5700 clocks similarly. If users had been able to do this on their stock RX Vega 56, the number of RX Vega 64 sales would have dropped by a decent amount: https://videocardz.com/72299/amd-radeon-rx-vega-56-gets-faster-with-vega-64-bios


----------



## treetops422

WannaBeOCer said:


> I do expect the same GPU die to overclock the same. As stated previously, only the RX Vega 64 LC and the 5700 XT 50th Anniversary Edition are binned. The RX Vega 56 and RX 5700 have artificial power limitations set. Once people figure out how to flash the XT BIOS on a RX 5700, we'll see that the 5700 clocks similarly. If users had been able to do this on their stock RX Vega 56, the number of RX Vega 64 sales would have dropped by a decent amount: https://videocardz.com/72299/amd-radeon-rx-vega-56-gets-faster-with-vega-64-bios


So these are all bad overclockers?
2080 Super 545 mm² 
2080 545 mm² 
2070 Super 545 mm² 
2070 445 mm² 
2060 Super 445 mm² 
2060 445 mm²


----------



## 113802

treetops422 said:


> So these are all bad overclockers?
> 2080 Super 545 mm²
> 2080 545 mm²
> 2070 Super 545 mm²
> 2070 445 mm²
> 2060 Super 445 mm²
> 2060 445 mm²


Guru3D's overclocking results for Founders cards:

2080 Super 545 mm² - TU104 - 2050Mhz
2080 545 mm² - TU104 - 2030~2115 MHz
2070 Super 545 mm² - TU104 - 2050Mhz
2070 445 mm² - TU106 - Couldn't find a review
2060 Super 445 mm² - TU106 - 2025Mhz
2060 445 mm² - TU106 - 2000Mhz-2050Mhz

The RX Vega 56 and RX 5700 are power limited and restrained by AMD's firmware/drivers. Luckily the Vega 56 could flash the Vega 64 BIOS, while Nvidia cards aren't restrained by firmware/drivers.


----------



## JackCY

Newbie2009 said:


> If you want to keep the Navi cards cooler, they're great undervolters; no cost involved.
> 
> 5700xt is going for £375, 2070 super is £550 if you want it now, in stock.
> 
> Navi is a total steal.


So where are all the custom-cooler Navi cards? As expected, it's more of a September card. It always takes AMD's partners months to get products into stores.

The blower is not worth it. AMD needs to finally learn.


----------



## ZealotKi11er

JackCY said:


> So where are all the custom cooler Navi cards? As expected it's more of a September card. Always takes AMD's partners months to get products to stores.
> 
> The blower is not worth it. AMD needs to learn finally.


Lol. It's only the beginning of August.


----------



## Heuchler

Another Look At The Maturing AMD Radeon RX 5700 Series Linux Performance
https://www.phoronix.com/scan.php?page=article&item=navi-august-2019&num=1

"The Radeon RX 5700 Linux performance was just shy of the Radeon RX Vega 64 which in turn is just below the GeForce RTX 2070. For The AMD Radeon RX 5700 XT, the performance came in mid-way between the RTX 2070 and RTX 2080.

We're now nearly one month in since the Radeon RX 5700 series launch and the Linux driver support is becoming quite good though there still are some cases with select games where the performance is coming in short, as shown in this article"


----------



## ilmazzo

WannaBeOCer said:


> The RX Vega 56 and RX 5700 are power limited and restrained by AMD's firmware/drivers. Luckily the Vega 56 could flash the Vega 64 BIOS, while Nvidia cards aren't restrained by firmware/drivers.


Nonsense talk.

Nvidia even binned GPUs into A and non-A models, come on... every GPU is different from the others, like every CPU and so on. What are we talking about?


----------



## PontiacGTX

ilmazzo said:


> Nonsense talk.
> 
> Nvidia even binned GPUs into A and non-A models, come on... every GPU is different from the others, like every CPU and so on. What are we talking about?


Thing is, he isn't comparing AMD vs Nvidia. Is it hard to see that AMD can reduce overclocking headroom just to make the more expensive product appealing via the performance gap? (For Vega that was the case; is it the same for Navi? I haven't seen the first 2150/2100 MHz RX 5700 cards compared to the 5700 XT in reviews.) There is also the possibility that some chips are highly binned and can reach much higher clocks, but so far Navi's release started with some serious driver bugs, and the cards didn't have as much moddability until now.


----------



## 113802

ilmazzo said:


> Nonsense talk.
> 
> Nvidia even binned GPUs into A and non-A models, come on... every GPU is different from the others, like every CPU and so on. What are we talking about?


Most CPUs/GPUs have an average speed they can overclock to. The RX Vega 56 and RX 5700 are restricted so they can't touch that average. Pretty much all reference Vega 56 cards with Samsung memory, after flashing the RX Vega 64 BIOS, can run their memory at 1100 MHz compared to 900-950 MHz on the stock Vega 56 BIOS.

They are artificially limiting their GPUs to prevent losing sales of their top-tier cards.
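For context on why that memory-clock jump matters, here's a minimal sketch of the peak-bandwidth math, assuming the reference Vega 56/64's commonly cited 2048-bit HBM2 bus (double data rate); the clocks plugged in below are the figures from this thread, not measurements:

```python
# Peak HBM2 bandwidth: clock (MHz) * 2 transfers/clock * bus width in bytes.
def hbm2_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 2048) -> float:
    """Return theoretical peak bandwidth in GB/s."""
    return mem_clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

print(hbm2_bandwidth_gbs(800))   # stock Vega 56 memory clock: 409.6 GB/s
print(hbm2_bandwidth_gbs(950))   # typical OC on the stock BIOS: 486.4 GB/s
print(hbm2_bandwidth_gbs(1100))  # after a Vega 64 BIOS flash: 563.2 GB/s
```

So the 900-950 MHz to 1100 MHz flash headroom is worth roughly another 75-100 GB/s of theoretical bandwidth.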


----------



## ZealotKi11er

WannaBeOCer said:


> Most CPUs/GPUs have an average speed they can overclock to. The RX Vega 56 and RX 5700 are restricted so they can't touch that average. Pretty much all reference Vega 56 cards with Samsung memory, after flashing the RX Vega 64 BIOS, can run their memory at 1100 MHz compared to 900-950 MHz on the stock Vega 56 BIOS.
> 
> They are artificially limiting their GPUs to prevent losing sales of their top-tier cards.


Most Vega 56 cards had Hynix memory.


----------



## 113802

ZealotKi11er said:


> Most v56 had hynix for memory.


Pretty much all the reference cards at launch had Samsung memory, since Samsung was the only HBM supplier available. That still has nothing to do with AMD purposely limiting OC potential.


----------



## maltamonk

WannaBeOCer said:


> Most CPUs/GPUs have an average speed they can overclock to. The RX Vega 56 and RX 5700 are restricted so they can't touch that average. Pretty much all reference Vega 56 cards with Samsung memory, after flashing the RX Vega 64 BIOS, can run their memory at 1100 MHz compared to 900-950 MHz on the stock Vega 56 BIOS.
> 
> They are artificially limiting their GPUs to prevent losing sales of their top-tier cards.


Why is that an issue? It may be limited, but it's also cheaper. Not only that, but it's possible someone will find a work around in the future to unlock that potential. For us the consumers.....that's a plus, over not having that possibility. Would you rather the cheaper variant be hardware restricted at the same price? If so why?


----------



## ZealotKi11er

5700 non XT with unlocked clocks. Not sure where these scores sit.

https://www.3dmark.com/3dm/38304658

https://www.3dmark.com/3dm/38304576


----------



## PontiacGTX

ZealotKi11er said:


> 5700 non XT with unlocked clocks. Not sure where these scores sit.
> 
> https://www.3dmark.com/3dm/38304658
> 
> https://www.3dmark.com/3dm/38304576


with max oc?


----------



## treetops422

ZealotKi11er said:


> 5700 non XT with unlocked clocks. Not sure where these scores sit.
> 
> https://www.3dmark.com/3dm/38304658
> 
> https://www.3dmark.com/3dm/38304576


I looked at the highest scores on Fire Strike, #1 combined, then looked at the graphics score. I don't know how to filter by graphics score only, or if that's an option.
Graphics scores on Fire Strike:
Best 2070 Super 29665
Best 5700 XT 28907
Best Vega 64 28194
Radeon 5700 (the score you listed) 27516 (the best score I've seen, grats)
Best 2070 26476
Best 5700 26470
My best 5700 score with the blower fan so far 25003 (still rising; Igor's MorePowerTool ftw, and I'm no pro)
Best 2060 Super 24441
Best Vega 56 24090
Best 2060 22713
My stock Fire Strike score 22381

Best 1660 Ti 19232

Again, all the ones labeled "Best" are the number-one combined scores on Fire Strike, then their graphics scores. The older cards could be LN2 cooled for all we know.

P.S. If someone knows of a good free benchmark that only tests your GPU and lets you compare results against other video cards and people using your card, lmk.
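To put those numbers in perspective, here's a quick sketch computing each card's gap from the top score in the list; the inputs are the user-reported bests quoted above, so treat the output as illustrative rather than a controlled comparison:

```python
# Percentage gap of each quoted Fire Strike graphics score from the leader.
scores = {
    "2070 Super": 29665,
    "5700 XT": 28907,
    "Vega 64": 28194,
    "5700 (unlocked)": 27516,
    "2070": 26476,
    "5700": 26470,
    "2060 Super": 24441,
    "Vega 56": 24090,
    "2060": 22713,
    "1660 Ti": 19232,
}
best = max(scores.values())
for card, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{card:16s} {s}  ({100 * (best - s) / best:4.1f}% behind leader)")
```

By this math the 5700 XT sits about 2.6% behind the best 2070 Super run, and the unlocked 5700 about 7.2% behind.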


----------



## 113802

ZealotKi11er said:


> 5700 non XT with unlocked clocks. Not sure where these scores sit.
> 
> https://www.3dmark.com/3dm/38304658
> 
> https://www.3dmark.com/3dm/38304576


Fire Strike scores the same as an RX Vega 64 LC w/ UV. The Time Spy score is definitely a bit higher than a Vega 64 LC.


----------



## EastCoast

criminal said:


> Holy bring up the past Batman. You are one of those people that just hang on to things huh?
> 
> And ah...


Oh, hi... there you are. Got some DLSS vs FidelityFX (AMD's open source) comparisons. It's unfortunate that this information is only just getting out there, but here you go nonetheless.

Review here:
https://www.overclock3d.net/reviews/software/f1_2019_-_nvidia_dlss_vs_amd_fidelityfx/1

Below DLSS is compared to FidelityFX





I certainly would not use DLSS in that game. It looks similar to having AA disabled. But hey, at least it gives that frame-rate performance boost. :thumb:


----------



## rluker5

I wouldn't even use DLSS like that playing Dishonored. It would be a lot funnier how bad it looks if you couldn't turn it off. That's probably a feature best forgotten. 
The FidelityFX does help fix what TAA is breaking in that game though. It is a good feature to have.


----------



## 113802

EastCoast said:


> Oh, Hi...there you are. Got some DLSS vs FidelityFX (AMD's Open source). It's unfortunate that this information is just getting out there but here you go none the less.
> 
> Review here:
> https://www.overclock3d.net/reviews/software/f1_2019_-_nvidia_dlss_vs_amd_fidelityfx/1
> 
> Below DLSS is compared to FidelityFX
> 
> 
> 
> 
> 
> I certainly would not use DLSS in that game. It looks similar to having disabled AA. But hey, at least it gives that frame rate performance boost. /forum/images/smilies/thumb.gif


Nvidia must have done something right if AMD is also using machine learning to upscale.

https://www.guru3d.com/news-story/a...tive-with-radeon-vii-though-directml-api.html

Followed by their announcement at Siggraph: https://www.overclock.net/forum/226...ions-new-plugin-versions-coming-features.html


----------



## keikei

rluker5 said:


> I wouldn't even use DLSS like that playing Dishonored. It would be a lot funnier how bad it looks if you couldn't turn it off. That's probably a feature best forgotten.
> The FidelityFX does help fix what TAA is breaking in that game though. It is a good feature to have.



DLSS seems to be a 4K hack with the current mainstream lineup. The cards aren't fast enough to push native 4K most of the time, but with DLSS, you make the game resemble it.


----------



## The Robot

keikei said:


> DLSS seems to be a 4k hack with the current mainstream lineup. The cards arent fast enough to push native 4k most of the time, but with DLSS, you make the game resemble it.


It's like PS4 Pro's "peasant 4k"


----------



## keikei

The Robot said:


> It's like PS4 Pro's "peasant 4k"



Don't most Pro games have some sort of '4K' option now? I can't say the same for DLSS. I just saw this:

SAPPHIRE Radeon RX 5700 XT PULSE available for preorder



> Sapphire have a new Radeon RX 5700 XT graphics card, based on Navi 10 XT, coming on *August 30th*.
> The PULSE RX 5700XT features dual-fan design, which appear to be 10cm wide. The card has standard power connectors: 8-pin and 6-pin. It is equipped with a full cover backplate.
> The card comes with undisclosed clock speeds, but at least the estimated pricing has been revealed. You can preorder the card for *429 GBP*.


----------



## ZealotKi11er

Some quick overclocking of 5700 XT

https://www.3dmark.com/spy/8051896

https://www.3dmark.com/fs/20084614


----------



## looniam

ZealotKi11er said:


> Some quick overclocking of 5700 XT
> 
> https://www.3dmark.com/spy/8051896
> 
> https://www.3dmark.com/fs/20084614


took me a second to realize it was a different rig. 

but that non XT looks really good:
https://www.3dmark.com/compare/fs/20084614/fs/20082654
https://www.3dmark.com/compare/spy/8051896/spy/8048700

~8% difference . . . what's the price difference? Like ~$50 US?
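That ~8% gap against a ~$50 price delta can be sanity-checked with a quick script. This is just a sketch: the normalized scores and the $349/$399 reference MSRPs are approximate figures from this thread, not authoritative numbers.

```python
# Rough price/performance check for 5700 vs 5700 XT,
# using approximate figures quoted in this thread.

def perf_per_dollar(score: float, price: float) -> float:
    """Benchmark points per US dollar."""
    return score / price

# Assumed reference MSRPs: $349 (5700) and $399 (5700 XT);
# the 5700's score is normalized ~8% behind the XT's.
cards = {
    "RX 5700":    {"score": 0.92, "price": 349},
    "RX 5700 XT": {"score": 1.00, "price": 399},
}

for name, c in cards.items():
    ppd = perf_per_dollar(c["score"], c["price"])
    print(f"{name}: {ppd * 1000:.2f} relative points per $1000")
```

Under those assumptions the non-XT comes out slightly ahead on points per dollar, which matches the "no-brainer" sentiment elsewhere in the thread.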


----------



## Hwgeek

OMG- DLSS is so bad in this game:


----------



## treetops422

ZealotKi11er said:


> Some quick overclocking of 5700 XT
> 
> https://www.3dmark.com/spy/8051896
> 
> https://www.3dmark.com/fs/20084614


Wow, less than 1% off of the best 2070 Super graphical Firestrike score, at a $100 price difference. Navi = overclocking fun on a bun. So far I've gotten about 12% over stock with my 5700 blower model, and I'm no expert; my power level is still rising.


----------



## ilmazzo

looniam said:


> took me a second to realize it was a different rig.
> 
> but that non XT looks really good:
> https://www.3dmark.com/compare/fs/20084614/fs/20082654
> https://www.3dmark.com/compare/spy/8051896/spy/8048700
> 
> ~8% difference . . . whats the price difference? like ~$50US?


The cut version has always been "near" (I mean not over 10% behind) the XT version on AMD cards; AMD provides better power delivery and better RAM chips to widen the difference between the versions (and a different power limit too, of course). Right now I think the XT, when overclocked, is coming up short on bandwidth; pushing the RAM on both would put the XT model a bit ahead.
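The bandwidth side of that argument is simple arithmetic. A minimal sketch, using the figures quoted earlier in the thread (the 5700 XT's 14 Gbps GDDR6 on a 256-bit bus, and Vega's 2048-bit HBM2 overclocked to 1100 MHz):

```python
def gddr6_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Effective bandwidth in GB/s from per-pin data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

def hbm2_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """HBM2 is double data rate: 2 transfers per clock across the bus."""
    return mem_clock_mhz * 2 * bus_width_bits / 8 / 1000

# RX 5700 XT: 14 Gbps GDDR6 on a 256-bit bus
print(gddr6_bandwidth_gbs(14, 256))    # 448.0 GB/s
# Vega 64 with HBM2 overclocked to 1100 MHz on a 2048-bit bus
print(hbm2_bandwidth_gbs(1100, 2048))  # 563.2 GB/s
```

That reproduces the 448 GB/s and 563.2 GB/s numbers cited earlier, and shows why a memory overclock moves the needle so much on Vega's wide bus.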

Anyway, a 5700 right now is an utter no-brainer, a steal if paired with some AMD parts and their discount...

A 3600 and 16 GB of 3200 RAM on a B450, plus a 5700, make a perfect Full HD/QHD cruncher for the masses. Bring it on, Lisa!!!!


----------



## ilmazzo

Hwgeek said:


> OMG- DLSS is so bad in this game:


If you start a race with a Ferrari and then activate DLSS, you actually get a McLaren.


----------



## BUFUMAN

ilmazzo said:


> if you start a race with a ferrari and then activate DLSS you actually get a mc laren


Hahaha

Sent from my LYA-L29 using Tapatalk


----------



## Newbie2009

ZealotKi11er said:


> Some quick overclocking of 5700 XT
> 
> https://www.3dmark.com/spy/8051896
> 
> https://www.3dmark.com/fs/20084614


High-end CPU showing the potential the card has, with early drivers and all.


----------



## ZealotKi11er

looniam said:


> took me a second to realize it was a different rig.
> 
> but that non XT looks really good:
> https://www.3dmark.com/compare/fs/20084614/fs/20082654
> https://www.3dmark.com/compare/spy/8051896/spy/8048700
> 
> ~8% difference . . . whats the price difference? like ~$50US?


Well, it's only 4 more CUs, but the XT can clock a bit more: 2200 vs 2150. Memory is a miss. Also, Time Spy uses the extra 4 CUs much better than Firestrike. Personally, I would get the XT unless the price difference is more than $50. Once AIB cards are out, a <$380 AIB 5700 will be a better buy than a reference 5700 XT.
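The 4-CU gap can be put in perspective with peak FP32 throughput. This is a back-of-the-envelope sketch, assuming Navi's 64 stream processors per CU and 2 FLOPs per shader per clock (FMA), with the 2200/2150 MHz clocks quoted above:

```python
def fp32_tflops(cus: int, clock_mhz: float, shaders_per_cu: int = 64) -> float:
    """Peak FP32 TFLOPS: CUs * shaders/CU * 2 FLOPs per clock (FMA) * clock."""
    return cus * shaders_per_cu * 2 * clock_mhz / 1e6

xt     = fp32_tflops(40, 2200)   # 5700 XT at 2200 MHz
non_xt = fp32_tflops(36, 2150)   # 5700 at 2150 MHz
print(f"XT: {xt:.2f} TFLOPS, non-XT: {non_xt:.2f} TFLOPS "
      f"({(xt / non_xt - 1) * 100:.1f}% gap on paper)")
```

On paper that works out to roughly a 13-14% gap, noticeably larger than the ~8% seen in the benchmark comparisons, consistent with the overclocked XT not fully feeding its extra CUs (e.g. the bandwidth limit mentioned above).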


----------



## Newbie2009

Finally got around to changing out the 2200G for a 3600X.

Card is 2150/900 and cpu is stock.

https://www.3dmark.com/spy/8060197

Not bad for a budget build in a micro atx case imo.


----------



## treetops422

ZealotKi11er said:


> Well its only 4 more CUs but XT can clock a bit more. 2200 vs 2150. Memory is a miss. Also TimeSpy uses the extra 4 CUs much better than FireStrike. Personally, I would get XT unless the price difference is more than $50. Once AIB cards are out <$380 AIB 5700 will be a better buy than 5700 XT Reference.



What kind of cooler\settings are you using on the 5700?


----------



## ZealotKi11er

treetops422 said:


> What kind of cooler\settings are you using on the 5700?


For these runs, I used either the reference cooler @ 100% fan or a Morpheus II.
I did not put memory heatsinks on with the Morpheus.


----------



## 113802

Installed F1 2019 yesterday and am able to enable FidelityFX on the Radeon VII. Anyone have a preference on which looks better: FidelityFX sharpening or upscaling?


----------



## JackCY

keikei said:


> Don't most Pro games have some sort of '4k' option now? I can't say that with DLSS. I just saw this:
> 
> SAPPHIRE Radeon RX 5700 XT PULSE available for preorder


Same poor cooler design, no thanks at that price. Sapphire cards went from great to mediocre in one generation after AMD stopped being competitive, forcing Sapphire to streamline and simplify their designs and offerings.
The front-to-back-blowing fin stack is subpar compared to side-blowing. The endless metal covers block exhaust too. Those sunk-in PCIe power connectors are going to be a pain to work with on this Sapphire.



Hwgeek said:


> OMG- DLSS is so bad in this game:


"motion blur"



treetops422 said:


> Wow less then 1% off of the best 2070 Super graphical Firestrike score. $100 difference. Navi=Overclocking fun on a bun. So far I've gotten about 12% over stock with my 5700 blower model. And I'm no expert my power level is still rising.


Best what? A 2070 is easily over 10k GPU score in Time Spy, no mods, on an old quad-core with 4 threads.


----------



## maltamonk

JackCY said:


> Best what? A 2070 is easily over 10k in TimeSpy, no mods on an old quad core 4 threads. GPU score.


I'm no genius, but from what I can ascertain, the "best what" was "2070 super graphical firestrike score". I may have misread it.


----------



## bucdan

JackCY said:


> Same poor cooler design, no thanks for that price. Saphire cards went from great to mediocre in one generation after AMD stopped being competitive, forcing Saphire to stream line and simplify their designs and offerings.
> The front-back blowing fin stack is subpar compared to side blowing. The endless metal covers block exhaust too. Those sunk in PCIe connectors are going to be a pain to work with on this Sapphire.
> 
> 
> "motion blur"
> 
> 
> Best what? A 2070 is easily over 10k in TimeSpy, no mods on an old quad core 4 threads. GPU score.


Agreed. Basically the airflow from the two fans runs into itself. The power connectors look like the biggest pain. Hopefully they don't turn into another brand with ASUS-style pricing. PowerColor and XFX seem the best this go-around.


----------



## treetops422

ZealotKi11er said:


> For these run, I used both Reference @ 100% or Morpheus II.
> I did not put memory heatsinks with the Morpheus.


I see you are at 2060 core / 925 memory on the 5700 Firestrike run. What power max were you set at? Also, I really don't know if it matters at all, but what's your voltage curve? (Yeah, I know the blower fan has its limits.)



P.S. Hmm, just thought of something; yeah, I know it's probably been done: a mirror below your GPU so you can see its purdy face. I get that people don't like the blower etc., but I think the reference card looks slick. The Sapphire box is ugly though.


----------



## NightAntilli

JackCY said:


> Same poor cooler design, no thanks for that price. Saphire cards went from great to mediocre in one generation after AMD stopped being competitive, forcing Saphire to stream line and simplify their designs and offerings.
> The front-back blowing fin stack is subpar compared to side blowing. The endless metal covers block exhaust too. Those sunk in PCIe connectors are going to be a pain to work with on this Sapphire.


I don't like Pulse either, but I don't think there was anything wrong with the Nitro Vega or Polaris cards. There must be a Nitro version on the way.


----------



## ZealotKi11er

treetops422 said:


> I see you are at 2060 core 925 memory on the 5700 firestrike. What power max were you set at? Also I really don't know if it matters at all, but what's your voltage curve? (Yeah I know the blower fan has it's limits
> 
> 
> 
> P.S. Hmm just thought of something, yeah I know it's probably been done. A mirror below your GPU so you can see it's purdy face. I get that people don't like the blower etc, but I think the reference card looks slick. The saphire box is ugly though.


 
You do not want the curve to go over 1.2 V. Mine was set to 2125 MHz / 1200 mV. The strange part is that I got it to 28.5k with a stock 6700K + 8 GB DDR3-2133, but with the 9900K I could not go over 27.9k no matter what I tried.


----------



## treetops422

JackCY said:


> Best what? A 2070 is easily over 10k in TimeSpy, no mods on an old quad core 4 threads. GPU score.





maltamonk said:


> I'm no genius, but from what I can ascertain, the "best what" was "2070 super graphical firestrike score". I may have misread it.


 Correct of course. But now I just looked up the highest combined scores for the 2070 super and the 5700 xt. I literally said holy shhh when I saw them.


Best ever combined score 5700 xt Graphical Score 16937 Time Spy

Best ever combined score 2070 Super Graphical Score 11616 Time Spy


As mentioned before, I do not know how to sort these by graphics score, or if that's even an option; I looked up the best combined score for each card and this is what I see. Again, these are benchmarks and the cooling is unknown. Ugh, one of the few times I've wanted Windows 10; there's no way to run Time Spy on 7.


----------



## ilmazzo

The difference between a GPU card and a potato.

How could it be?


----------



## ilmazzo

BTW, ROG tweeted some teasers of the Strix 5700 XT card.

Seems quite good, of course (at least I hope ASUS did a good job this time), but it will be at the same price level as a 2080, I presume, so lol.


----------



## 113802

treetops422 said:


> Correct of course. But now I just looked up the highest combined scores for the 2070 super and the 5700 xt. I literally said holy shhh when I saw them.
> 
> 
> Best ever combined score 5700 xt Graphical Score 16937 Time Spy
> 
> Best ever combined score 2070 Super Graphical Score 11616 Time Spy
> 
> 
> As mentioned before I do not know how to sort these by graphics score or if that's even an option. I looked up the best combined score for each card and this is what I see. Again these are benchmarks, cooling is unknown. Ug one of the few times I've wanted windows 10, no way to test timespy on 7.


Use common sense: somehow a 5700 XT at 1919MHz/875MHz has a much higher GPU score than a 5700 XT @ 2153/926. It's obvious 3DMark isn't correctly reporting the 2 cards in that system for some odd reason. It did the same thing with the Radeon VII and screwed up the scores.

Result you captured a screenshot from: https://www.3dmark.com/spy/7965849
HWBot highest score: https://www.3dmark.com/spy/7931309


----------



## ilmazzo

2 cards?

I thought Navi skipped CrossFire support entirely; or can Time Spy take advantage anyway through DX12 mGPU?


----------



## Heuchler

Shouldn't ASUS name their RX 5700 cards Arez... if anybody should be quiet about VRMs and temps, it should be those guys.


----------



## 113802

ilmazzo said:


> 2 cards?
> 
> I knew that navi skipped CF support at all, or timespy can take advantage anyway through DX12 mgpu?


Time Spy supports explicit multi-adapter, but only in linked-display-adapter (LDA) mode, which both Vega 20 and Navi support.


----------



## treetops422

WannaBeOCer said:


> Use common sense, some how a 5700 XT at 1919Mhz/875Mhz has a much higher GPU score than a 5700 XT @ 2153/926.



I knew something was off; I just looked it up real quick. I don't use Time Spy: Windows 7, blah blah, no DirectX 12. Too bad the likes of Igor don't show us peasants some 2300/1k results.


----------



## keikei

This card is not gonna be cheap: https://videocardz.com/newz/asus-rog-strix-radeon-rx-5700-xt-review-kit-leaks-out


----------



## PontiacGTX

keikei said:


> This card is not gonna be cheap: https://videocardz.com/newz/asus-rog-strix-radeon-rx-5700-xt-review-kit-leaks-out


Why does ASUS use useless metrics? 3DMark 11, and at 1080p? Most brands don't learn: people play games, not benchmarks.


----------



## JackCY

"ULMark" 2011 is useless in 2019; it's paid for by companies. Not that game benchmarks aren't either, but at least a game is something you may actually play for a short time.


----------



## ilmazzo

Yep, "cheap" and ROG/Strix products is an oxymoron.


----------



## PontiacGTX

JackCY said:


> ULMark 2011 is useless in 2019, paid for by companies. Not that games aren't either but at least it's something you may actually use for a short time.


I would rather see 2018-2019 game benchmarks than Time Spy benchmarks, and the games themselves probably won't show any noticeable gain in performance unless the card was limited in the first place (like the non-XT model), and provided it holds the reference model's core clocks.


----------



## maltamonk

keikei said:


> This card is not gonna be cheap: https://videocardz.com/newz/asus-rog-strix-radeon-rx-5700-xt-review-kit-leaks-out


Yup, and it'll totally blow the price/performance advantage that makes these cards attractive.


----------



## keikei

maltamonk said:


> Yup and it'll totally blow the p/p advantage that makes these cards attractive.



If the competitor's version is any clue, we're looking at an $85 premium over stock. https://www.bhphotovideo.com/c/prod...MIl5m7runz4wIVkp6fCh0UHAqNEAQYASABEgKS4PD_BwE


----------



## AlphaC

ASUS STRIX isn't about price/performance. It's about charging people a massive ROG tax on a power-limited card, as well as ridiculousness such as dual-BIOS Nvidia cards when Nvidia cards carry encrypted, locked-down BIOSes.

By their own slides the RX 5700 XT STRIX is ~5% faster and supposedly costs much more (~35%, if you go by OCUK's 530 vs 390). OCUK also happens to have their "Dual" card at 430 GBP, which is far more reasonable; unlike their GTX 1660 and GTX 1660 Ti Dual boards, it actually has a proper heatsink. At a certain point you have to ask "why buy this midrange card with a premium cooler that costs more than a reference RTX Super or a 2070 Super Ventus/Dual/Windforce/etc?"

If you want price/performance you should look at the Sapphire Pulse, XFX dual-fan "RAW" cards, the ASRock Challenger series (maybe not the Taichi/Phantom Gaming X), or possibly MSI's EVOKE/Armor (the Armor is sometimes a bad card on the AMD side, but sometimes it's just a Gaming X sans RGB and a power stage or two). From what we've seen, those will charge around 30-50 US dollars/Euros/GBP over reference. PowerColor even claims its aftermarket cards will match the reference pricing of $400, but I suspect they mean something like the Red Dragon rather than the "Red Devil" cards (I don't understand the marketing).


----------



## PontiacGTX

AlphaC said:


> ASUS STRIX isn't about price/performance. It's about charging people a massive ROG tax on a power limited card as well as ridiculousness such as Dual BIOS Nvidia cards when Nvidia cards are carrying encrypted locked down BIOS.
> 
> By their own slides RX 5700 XT STRIX is ~5% faster and supposedly costs much more (~35% if you go by OCUK's 530 vs 390). OCUK also happens to have their "Dual" card at 430 GBP which is far more reasonable ; unlike their GTX 1660 and GTX 1660 Ti Dual boards it actually has a proper heatsink. At a certain point you have to go "why buy this midrange card with a premium cooler that costs more than a reference RTX Super or RTX 2070 Super Ventus/Dual/Windforce/etc".
> 
> If you want price/performance you should look at Sapphire Pulse , XFX dual fan "RAW" cards, Asrock Challenger series (maybe not Taichi/Phantom Gaming X), or possibly MSI's EVOKE/Armor (sometimes it's a bad card on AMD side, but sometimes it's just Gaming X sans RGB and a powerstage or two missing). From what we've seen those will charge around 30-50 US dollars/Euros/GBP over reference. Powercolor even claims its aftermarket cards will match reference pricing of $400 but I suspect they mean something like a Red Dragon rather than the "Red Devil" cards (I don't understand the marketing).


I don't know why anyone would pay 530 GBP for that; at that point people would consider spending on Nvidia. But yes, the Red Dragon is sometimes as cheap as a reference model, though the marketing on social media has been about the Red Devil. I wonder what the Red Dragon looks like; would it be a 3-fan design? I hope so.


----------



## maltamonk

AlphaC said:


> ASUS STRIX isn't about price/performance. It's about charging people a massive ROG tax on a power limited card as well as ridiculousness such as Dual BIOS Nvidia cards when Nvidia cards are carrying encrypted locked down BIOS.
> 
> By their own slides RX 5700 XT STRIX is ~5% faster and supposedly costs much more (~35% if you go by OCUK's 530 vs 390). OCUK also happens to have their "Dual" card at 430 GBP which is far more reasonable ; unlike their GTX 1660 and GTX 1660 Ti Dual boards it actually has a proper heatsink. At a certain point you have to go "why buy this midrange card with a premium cooler that costs more than a reference RTX Super or RTX 2070 Super Ventus/Dual/Windforce/etc".
> 
> If you want price/performance you should look at Sapphire Pulse , XFX dual fan "RAW" cards, Asrock Challenger series (maybe not Taichi/Phantom Gaming X), or possibly MSI's EVOKE/Armor (sometimes it's a bad card on AMD side, but sometimes it's just Gaming X sans RGB and a powerstage or two missing). From what we've seen those will charge around 30-50 US dollars/Euros/GBP over reference. Powercolor even claims its aftermarket cards will match reference pricing of $400 but I suspect they mean something like a Red Dragon rather than the "Red Devil" cards (I don't understand the marketing).


Well, for the most part they have to be. The only segments not subject to price/performance are the very top models; all of the lower tiers are subject to the next tier up's price/performance. So while ATM the 5700 XT ROG/Strix price/performance doesn't matter as much (even though one would have to consider the Nvidia cards at that point), the 5700 is subject to the 5700 XT, sans other AIB top models.


----------



## treetops422

Meanwhile, in the Nvidia Super star camp: custom coolers cost just as much on the Super cards... but you get 10% less FPS for the price.


----------



## Heuchler

PontiacGTX said:


> why ASUS uses useless metrics?3dmark11 AND at 1080p? most brands dont learn people plays games not benchmarks


Some motherboards have legacy 3DMark benchmark boosters; that's the only thing that comes to mind. Those might only apply to 3DMark2001.



ASRock Phantom Gaming and Taichi


----------



## Heuchler

HIS Radeon RX 5700 XT IceQX2 8GB


XFX Radeon RX 5700 XT 8GB RAW


----------



## Gunderman456

Everything looks ugly and cheap, from plasticky shrouds to dinky fans. The XFX THICC2 looks cool though (still plasticky/shiny, dinky-looking fans on it, mind you). Man, look at the Nvidia reference, or is it (let's rip you off) Founders, cards and stop making junk-looking fans which are probably not as effective.

Asus, same Strix design since the stone ages, yawn... Come up with something different and exciting for goodness sake. The copy-paste in design is getting tedious.

Yeah, been following that thread. The Chinese cards say a lot about where our culture is these days. Hold on, lost my train of thought, just got a text on my cell phone. Oh, I also forgot I'm getting a pink hairdo tonight to match my cards. Whatev, later...
VVV


----------



## Heuchler

I just realized I posted in the wrong RX 5700 [XT] thread in the Hardware News 



ASRock Taichi reminds me of Nitro card.


----------



## keikei

Heuchler said:


> some motherboards have 3DMark legacy benchmark Boosters is the only thing that comes to mind. Those might only apply to 3DMark2001.
> 
> 
> 
> ASRock Phantom Gaming and Taichi



Some more pics: https://wccftech.com/amd-radeon-rx-5000-navi-gpu-7nm-asrock-two-variants-report/


I'm eager to see prices and release dates. These cards are beasts. :devil:


----------



## Heuchler

So far the Taichi is my favorite in the looks department.


Sapphire GPUs enable a boost setting on newer ASRock motherboards. I like overclocking in the BIOS as much as the next OCN guy, but I can overclock my video card in the OS. Makes me feel more normal.

[April 11,2017] ASRock Strategic Technology Partnership with SAPPHIRE - https://www.asrock.com/news/index.asp?id=3580


----------



## PontiacGTX

Heuchler said:


> some motherboards have 3DMark legacy benchmark Boosters is the only thing that comes to mind. Those might only apply to 3DMark2001.
> 
> 
> 
> ASRock Phantom Gaming and Taichi


The 3rd one looks cool, but doesn't that go against the purpose of a backplate?


----------



## keikei

PontiacGTX said:


> the 3rd one look cool but that doesnt goes against the purpose of backplate?



You mean just the role of support?

GIGABYTE Radeon RX 5700 XT GAMING OC pictured


----------



## PontiacGTX

keikei said:


> You mean just the role of support?


Well, backplates sometimes help with heat, and that LED seems to reduce their effectiveness, but I guess that's part of it too?


----------



## keikei

PontiacGTX said:


> well backplates sometimes help with heat and that led seems to reduce the efficiency but well that too i guess?



Depends on the design. With aluminum and direct contact with the components it does help with heat (indirectly), but otherwise it's board support for the sometimes-heavy cooler. Dat's my limited understanding anyways.


----------



## AlphaC

PontiacGTX said:


> well backplates sometimes help with heat and that led seems to reduce the efficiency but well that too i guess?


They're gaming cards, not compute or workstation cards. People want all the RGB all over the place.

When was the last time you heard a "gamer" ask about efficiency or power consumption, unless it was about GPU coolers?

There's no guarantee the backplate is metal; we've seen this happen with less expensive GeForce GPUs already, where the backplate is plastic "for support and aesthetics".


----------



## Heuchler

On better GPU designs the backplate is thermally coupled with the back of the VRM, adding additional surface area. On others it just traps heat, or even worse thermally couples things like VRAM, which you don't want.

At least that's what I have seen; I could be wrong about it. I haven't seen any real testing done on this subject. YouTube reviewers are too busy pumping too much voltage into CPUs and GPUs, it seems.


----------



## PontiacGTX

AlphaC said:


> They're gaming cards not compute or workstation cards. People want all the RGB all over the place.
> 
> When is the last time you heard a "gamer" ask about efficiency or power consumption unless it was about GPU coolers?
> 
> There's no guarantee the backplate is metal , we've seen this happen with less expensive Geforce GPUs already where the backplate is plastic "for support and aesthetics".


Well, I will never understand the RGB trend; you end up spending more time looking at the screen rather than the components inside.


----------



## AlphaC

Timmy Joe said he cannot get Arctic Accelero Xtreme III to work


PCGH had the Xtreme IV working


----------



## keikei

PontiacGTX said:


> Well I will never understand the RGB trend you will end up more time looking the screen rather the components inside



I understand it. I just can't afford it. If you're gonna pimp the pc, you gotta go all the way right?


----------



## KyadCK

PontiacGTX said:


> Well I will never understand the RGB trend you will end up more time looking the screen rather the components inside


I spend more time inside my car than looking at it, I still care that it looks decent.

I spend almost none of my time looking at a mirror, but I care that my hair looks acceptable, that my beard is shaved or trimmed, and that my clothes look alright.

I spend most of my time in my house, but I still...

You get the idea. Pride in ownership is a thing, and RGB is by far the best color scheme for compatibility.



keikei said:


> I understand it. I just can't afford it. If you're gonna pimp the pc, you gotta go all the way right?


Obviously.


----------



## PontiacGTX

https://www.facebook.com/notes/powercolor-usa/update-powercolor-5700-xt-red-devil/1184626441711118/


----------



## keikei

PontiacGTX said:


> https://www.facebook.com/notes/powercolor-usa/update-powercolor-5700-xt-red-devil/1184626441711118/



The 15th, nice.


----------



## Shatun-Bear

Where are Sapphire's custom cards? Conspicuous by their absence.


----------



## PontiacGTX

Shatun-Bear said:


> Where's Sapphire's custom cards? Conspicuous by their absence.


The Pulse model, meh... Either way, the best-looking card at the moment is the PowerColor (knowing their history of cooler designs it will be unrivaled; a Pulse cooler isn't a match for the Red Devil).


----------



## Rei86

Heuchler said:


> Shouldn't ASUS name their RX 5700 cards Arez....if anybody should be quite about VRM and temps it should be those guys.



LOL, nVidia backed off their green partner program, so they don't have to.

And the REAL ARES cards are dead, just like the Asus MARS line.

Happy ASRock is now making GPUs. Their design aesthetic is... ... ... ... ... OK...
But good god, do we REALLY need triple-slot-thick coolers?



PontiacGTX said:


> Pulse model meh... either way the best looking card until the moment is the powercolor (knowing their history of previous cooler designs it will be unrivaled, a Pulse cooler isnt a match for the Red Devil)


Really? What about Sapphires Toxic, MSi Lighting, and Asus Matrix series?


----------



## PontiacGTX

Rei86 said:


> LOL nVidia backed off their Green partner program so they don't have too.
> 
> And the REAL ARES cards are dead, just like Asus MARS line too.
> 
> Happy ASrock is now making GPU's. Their design aesthetics is... ... ... ... ... Ok...
> But good god do we REALLY need triple slot thick coolers?
> 
> 
> 
> Really? What about Sapphires Toxic, MSi Lighting, and Asus Matrix series?


Toxic? It didn't exist at the time of the Red Devil cooler, but the PCS+ was just as good as the Vapor-X, and the Lightning similar to the PCS+. The Asus Matrix on the 290X, if you think 90C is fine... sure, it was worse. Only the Nitro+ cooler is more or less equivalent to the Red Devil, and the new ASRock, ASUS, and Gigabyte coolers can't compare; but objectively the Pulse model is worse.


----------



## Rei86

PontiacGTX said:


> Toxic? didnt exist during the Red Devil cooling, but PCS+ was just as good as Vapor X, Lightining similar as PCS+, Asus Matrix on 290x if you think 90c is fine.. sure it was worse.only the Nitro+ cooler is more or less equivalent to the Red Devil, and for the new ASROCK, ASUS and Gigabyte coolers cant compare, but objectively the pulse model is worse


I thought Toxic was the top-end line for Sapphire, like how Lightning is for MSI and Classified used to be for EVGA?
Looks like the Nitro+ line is the top end for Sapphire, but they did hint that Toxic models will be coming out with Navi.

Kind of surprised the PCS+ cooler is on the same level as MSI's Lightning.


----------



## Heuchler

Was it the Asus R9 290X DirectCU II that re-used a GTX 770 cooler (the outer two direct-touch heatpipes touched nothing)?

We all learn from our mistakes, right? Like the Vega Strix VRM pads being too thin and not making contact with the cooler.

The PCS+ was on the R9 290X and R9 390X. They also had a dual-GPU R9 295X2 Devil13; they might have dropped the PCS+ name for Devil after that. Sapphire had the Toxic R9 280X, then a Toxic R9 290X afterwards.

The MARS might have been the best GTX 295 card. That doesn't make it a good product though, if it's the one that came in a suitcase package. I might be remembering it wrong.


----------



## keikei

Not bad.


----------



## Rei86

Heuchler said:


> Was it the Asus R9 290X Direct CU2 that re-used a GTX 770 cooler (outer two direct-touch heatpipes well touched nothing) ?
> 
> MARS might have been the best GTX 295 card. Doesn't make it a good product tho. If it the one that came in a suitcase package. Might be remembering it wrong.


Yes the DCUII was just reused nVidia cooler on top of AMD GPU.

The Asus R9 295X2 wasn't given a name, but it would have been called the Ares if Asus hadn't already built an R9 290X dual-GPU card, which came with a waterblock on top, called the Ares III.


----------



## treetops422

KyadCK said:


> I spend more time inside my car than looking at it, I still care that it looks decent.
> 
> I spend almost none of my time looking at a mirror, but I care that my hair looks acceptable, that my beard is shaved or trimmed, and that my clothes look alright.
> 
> I spend most of my time in my house, but I still...
> 
> You get the idea. Pride in ownership is a thing, and RGB is by far the best color scheme for compatibility.
> 
> 
> 
> Obviously.


Do you put your car inside a case at an angle where it can't be seen by you or anyone else? Do you keep the pipes in your walls squeaky clean? How about under the engine; that would be a better comparison. Do you go under the engine and make it nice and shiny?

You and your car's exterior are visible to yourself and others. That's the difference.

P.S. I like all the neat-looking cards; it's a shame you have to yank them out of your computer to see them.




keikei said:


> Not bad.


Now that's a smoking deal, $420 for 3x fans. If I wasn't going to eventually water cool I'd be jelly.


----------



## criminal

keikei said:


> Not bad.


Very nice. Should be whisper quiet too.


----------



## maltamonk

keikei said:


> Not bad.


Oooooohhhhh....pleasantly surprised!


----------



## KyadCK

treetops422 said:


> Do you put you're car inside a case at a angle that it can't be seen by you or anyone else? Do you keep the pipes in your walls squeaky clear? How about under the engine, that would be a better comparison. Do you go under the engine and make it nice and shiny?
> 
> 
> You and your cars exterior are visible to yourself and others. That's the difference.
> 
> 
> p.s. I like all the neat looking cards it's a shame you have to yank it out of your computer to see it.


Uh........ :h34r-smi

I am confused as to why I would need to remove the card to see it, or why others would be unable to see it upon walking into the room.

You should spend some time in the build log and case mod sections of the forum; my stuff is tame.


----------



## Jedi Mind Trick

I can't tell if I like that the 5700 didn't come with a backplate. On one hand, the 480 reference card had one (and looked great, IMO); on the other, it makes taking the card out of my mITX build 100x easier (I have to use a chopstick to hit the PCIe release tab on cards with backplates). I like the cards a lot so far: a great deal (with the $50 off at Microcenter, especially when already open-box), and the performance/temps/fan noise seem a lot better than I was expecting, especially in a pretty cramped case (I forgot how much better blower coolers are in situations like this).

Really wanted to wait for a custom card, but I was getting a 3600 for a friend (which gave me the $50 off) and a lot of the cards seem like they are a decent bit longer than what I can fit.



KyadCK said:


> Uh........ :h34r-smi
> 
> 
> I am confused as to why I would need to remove the card to see it or why others would be unable to see it upon walking into the room.
> 
> You should spend some time in the build log and case mod sections of the forum, my stuff is tame.


I think he is getting at the fact that 99% of the cool stuff on cards is visible only if the card is flipped (short of the stuff on the side of the GPU). Back when cards had designs on them, the only way you could really see them was when they were out of the case (or if you had a mirrored case where the mobo was upside down). That is why it seems like case manufacturers started adding vertical expansion slots with adapters to let you 'show off your GPU.'

I don't disagree with him, but I also agree with you; it becomes a 'pride' thing to me (my PC looks nowhere near as nice as yours though! :thumb:). I might be a little different, but I look at my PC all the time (the only things about it that bother me are the Noctua fans that do not fit in with my black+white+grey PSU cables and my purple and white fans).


----------



## looniam

thats it!

i am definitely putting a disco ball in my next build. :coolsmile


----------



## Heuchler

"Those that know [what they are doing] can get excellent results; those that don't should proceed with caution and should contact [PCGH] forums before proceeding" - PCGH (around the 10 minute mark)

https://youtu.be/WGo6s-2RXwo?t=605


PCGH comments at the end of the video about using the included washers as intended: as spacers to get proper contact between the die and the cooler [Xtreme IV].

They had used the cooler around 30 times before, so it is a little warped. They say the spacers should help prevent that as well.


Edit: Not sure if PCGH Review Navi 10 got listed here

https://www.pcgameshardware.de/Rade...ase-Benchmark-Preis-Kaufen-Vega-64-1293229/3/


----------



## Hwgeek

*ASUS Radeon RX 5700 XT STRIX OC Review*
https://www.techpowerup.com/review/asus-radeon-rx-5700-xt-strix-oc/









----------



## Heuchler

ComputerBase Radeon RX 5700 XT Strix OC Review [german]
https://www.computerbase.de/2019-08/asus-radeon-rx-5700-xt-strix-test/


Google Translate
https://translate.google.com/transl...de/2019-08/asus-radeon-rx-5700-xt-strix-test/


----------



## keikei

https://wccftech.com/review/sapphire-pulse-rx-5700xt-navi-heartbeat-elevated/



> At *$409* for the SAPPHIRE PULSE RX 5700XT and *$359* for the 5700 non-XT variants, SAPPHIRE has completely shifted the focus off the reference models in the best way possible. They have crafted a card that performs admirably at 1440p, but Ultrawide 1440p would demand a bit more if that is your cup of tea.


----------



## 113802

keikei said:


> https://wccftech.com/review/sapphire-pulse-rx-5700xt-navi-heartbeat-elevated/


Still prefer the MSI Ventus cooler over all of the other manufacturers' current leaks. Hopefully some other elegant shrouds are released.

I assume the mesh is RGB like the EVGA GTX 10 series?

https://www.techpowerup.com/review/asus-radeon-rx-5700-xt-strix-oc/

Strix PCB breakdown: 
https://youtu.be/ioJHGyivLpQ


----------



## keikei

WannaBeOCer said:


> Still prefer the MSI Ventus cooler over all of the other manufacturers' current leaks. Hopefully some other elegant shrouds are released.
> 
> I assume the mesh is RGB like the EVGA GTX 10 series?
> 
> https://www.techpowerup.com/review/asus-radeon-rx-5700-xt-strix-oc/
> 
> Strix PCB breakdown:
> https://youtu.be/ioJHGyivLpQ


For the price, I don't think the Pulse has RGB. I could not even find any mention of the word in the review.


----------



## 113802

keikei said:


> For the price, I don't think the Pulse has RGB. I could not even find any mention of the word in the review.


Cool, glad our complaining about noise and price kept cost low for AIB cards. Now if retailers keep the $410 price it's an easy choice to recommend.

Pulse review: https://youtu.be/FQJCm7bnOfU


----------



## KyadCK

Jedi Mind Trick said:


> I cannot tell if I like that the 5700 didn't come with a backplate. On one hand, the 480 reference card had one (and looked great IMO); but on the other, it makes taking it out of my mITX build 100x easier (I have to use a chopstick to hit the PCIe release tab on cards with backplates). I like the cards a lot so far, great deal (with the $50 off at Microcenter [especially when already open box]), and the performance/temps/fan noise seem a lot better than I was expecting (especially in a pretty cramped case [I forgot how much better blower coolers are in situations like this]).
> 
> Really wanted to wait for a custom card, but I was getting a 3600 for a friend (which gave me the $50 off) and a lot of the cards seem like they are a decent bit longer than what I can fit.
> 
> 
> 
> *I think he is getting at the fact that 99% of the cool stuff on cards is visible only if the card is flipped* (short of the stuff on the side of the GPU). Back when cards had designs on them, the only way you could really see them was when they were out of the case (or if you had a mirrored case where the mobo was upside down). That is why it seems like case manufacturers started adding vertical expansion slots with adapters to let you 'show off your GPU.'
> 
> I don't disagree with him, but I also agree with you; it becomes a 'pride' thing to me (my PC looks nowhere near as nice as yours though! :thumb:). I might be a little different, but I look at my PC all the time (the only things about it that bother me are the Noctua fans that do not fit in with my black+white+grey PSU cables and my purple and white fans).


True, but as you noted, these days it's extremely easy to find a case with even 3 vertical slots to rotate the GPU to face the window, and now you can even get tempered glass for cheap. Even $50 cases do not require you to take out the GPU to see it.

I dropped too much money on "custom" cables, LEDs, and MagLev fans, and faaaaaaar too much time. I even put a MagLev in the WraithRipper (worth)  

But yes, my desktop is directly next to my monitors and I do look at it every day. People who visit my house see it when they walk in the room. The rig itself is just money; anyone can throw $5,000 at a computer if they have the money, be it parts or maybe something like CyberPower or Origin. PCs are my hobby, and the pride comes from seeing my rig and knowing that the "dayum that looks good" (in my mind or from others, thank you by the way) comes from my own skill and time, and is one of my achievements. I did pour several hours into each of those configurations after all.

Also dang, that AX1200 has been in my rigs for over 6 years now. That red rig is oooooooooooold, those are 7950s. I had the 1200 with 6970s before that too.


----------



## Mand12

Wait, so which is the GPU and which is the box it came in?


----------



## PontiacGTX

https://be.hardware.info/artikel/95...-test-asus-rog-strix-vs-sapphire-pulse-review


----------



## AlphaC

Hwgeek said:


> *ASUS Radeon RX 5700 XT STRIX OC Review*
> https://www.techpowerup.com/review/asus-radeon-rx-5700-xt-strix-oc/



Looking at the noise levels and fan speeds, the RX 5700 series PULSE with Quiet BIOS (1300RPM per TPU on quiet, 1500RPM on the normal BIOS) makes the STRIX (1700RPM or so on quiet BIOS, apparently) look like a terrible buy, as the PULSE is a mere $10 more than reference. Per ComputerBase the STRIX weighs 1,434 g yet does not have better thermals overall vs the <1kg Pulse cards.

With 7 power stages (Vishay SiC), a 5-heatpipe cooler, replaceable fans, dual BIOS, VRM cooling all around with a finned secondary heatsink, and lower idle power (per hardware.info https://be.hardware.info/artikel/95...-test-asus-rog-strix-vs-sapphire-pulse-review), it should be the clear choice for the money.
Fan RPM on Pulse XT version seems to be 1500RPM (1000RPM on Quiet BIOS) https://www.gamersnexus.net/hwreviews/3498-sapphire-rx-5700-xt-pulse-review

I think it's SiC620A powerstages on the Pulse https://www.vishay.com/docs/62922/sic620a.pdf

"Just to give some extra context for these results, these readings under load came with the Pulse RX 5700 running its fans at 44%, or 1430rpm." - https://www.kitguru.net/components/graphic-cards/dominic-moass/sapphire-rx-5700-pulse-8gb-review/15/

https://www.bit-tech.net/reviews/tech/graphics/sapphire-radeon-rx-5700-xt-pulse-review/10/

https://www.computerbase.de/2019-08...700-xt-pulse-test/2/#diagramm-luefterdrehzahl
^ faulty silent BIOS fan speed? 1450RPM on Silent for RX 5700 and 1750RPM on silent for RX 5700XT


Average power using the Cybenetics system and FurMark:
~238W, 296W peak

https://www.tomshardware.com/reviews/sapphire-pulse-radeon-rx-5700-xt-amd-navi-review,6276-4.html



---

RX 5700 XT Pulse vs RX 5700 Pulse : different heatsink design mainly leading to lower weight and fewer parts (872 g vs 925 g).

RX 5700XT teardown :
https://www.techpowerup.com/review/sapphire-radeon-rx-5700-xt-pulse/4.html
https://www.modders-inc.com/sapphire-pulse-radeon-rx-5700-xt-review/4/



RX 5700XT breakaway diagram:







https://www.tomshardware.com/reviews/sapphire-pulse-radeon-rx-5700-xt-amd-navi-review,6276.html

RX 5700 has 3 heatpipes (165W main BIOS, 150W secondary BIOS).






- https://www.computerbase.de/2019-08...est/#abschnitt_sapphire_pulse_mit_amd_navi_10








https://www.pcworld.com/article/3431151/sapphire-pulse-rx-5700-review.html








https://www.kitguru.net/components/graphic-cards/dominic-moass/sapphire-rx-5700-pulse-8gb-review/2/
3x 6mm heatpipes, which means it is undersized: an 8mm pipe is capable of roughly 60W each, while a 6mm pipe is widely considered good for 40W each, so 3x 6mm = 120W.
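That rule-of-thumb heatpipe math can be sketched out quickly; note the per-pipe wattages (~40 W for 6 mm, ~60 W for 8 mm) are just the forum rules of thumb quoted above, not measured figures:

```python
# Rough heatpipe transport budget, using the rule-of-thumb capacities
# from the post above (~40 W per 6 mm pipe, ~60 W per 8 mm pipe).
PIPE_CAPACITY_W = {6: 40, 8: 60}

def cooler_budget(pipe_diameters_mm):
    """Sum the nominal transport capacity for a list of pipe diameters (mm)."""
    return sum(PIPE_CAPACITY_W[d] for d in pipe_diameters_mm)

# RX 5700 Pulse: three 6 mm pipes vs. its 165 W main-BIOS power limit.
print(cooler_budget([6, 6, 6]))        # 120 -> short of 165 W
# For comparison, a four 6 mm + one 8 mm layout (e.g. the Red Devil):
print(cooler_budget([6, 6, 6, 6, 8]))  # 220
```

By this crude metric the Pulse leans on its fin stack and baseplate contact to make up the remaining ~45 W.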


----------



## doom26464

Seems like AIBs should keep pricing in line.

Improvements to thermals and noise are what I was looking for.

Perfect, no reason now to use a 2060/2060 Super over the 5700 XT/5700. Hell, even the 2070 Super at its price makes no sense over a 5700 XT.

It's going to be weird doing a few of the fall builds coming up with AMD GPUs.


----------



## Section31

Anyone heard about a mini-PCB 5700 XT being released?

Thinking of getting one and then swapping a full-size RTX 2060 to my cousin.


----------



## Jedi Mind Trick

doom26464 said:


> Seems like AIBs should keep pricing in line.
> 
> Improvements to thermals and noise are what I was looking for.
> 
> Perfect, no reason now to use a 2060/2060 Super over the 5700 XT/5700. Hell, even the 2070 Super at its price makes no sense over a 5700 XT.
> 
> It's going to be weird doing a few of the fall builds coming up with AMD GPUs.


RT; but yea, I agree. The 5700(xt) are really solid cards. Glad AMD seems to be a bit more competitive (not that the 56/64 weren't but these cards must have better margins for AMD than those did).


----------



## PontiacGTX

It is curious no one is talking about TriXX boost


----------



## Aussiejuggalo

PontiacGTX said:


> It is curious no one is talking about TriXX boost


Still slower than the 2080Ti in other games though:




Spoiler



30% slower in Metro Exodus & 20% in Witcher 3. BF5 seems to be the exception.

I had high hopes for Navi but this really is disappointing. It does keep up with the 2070 / 2080 in some things, but most of the time it's behind, not to mention the power consumption. That said, if an AIO version comes out for a reasonable price, I'd seriously consider it just to get away from Nvidia's buggy drivers.


----------



## Heuchler

PontiacGTX said:


> It is curious no one is talking about TriXX boost




Yeah, why didn't you include a game with HairWorks or other GameWorks features enabled? Still love how DLSS is marketed as a performance-enhancing feature.

Totally unfair. Totally cherry-picking results. Why would anybody care about shooters in PC gaming? Not like that genre has been pushing PC gaming for decades.

Not like people will play hundreds or even thousands of hours in games like Battlefield. Metro Exodus was so great I stopped playing it after 5 minutes.


[I was joking above] Picking BF V over a game that nobody will play for more than 20 hours makes sense to me.


Massive's Snowdrop engine was developed with Nvidia as a GameWorks title for The Division. They partnered up with AMD for the sequel, but I don't expect them to have rewritten most of the code. Dropping two settings from Ultra to High boosts performance on Radeon cards by a lot.



FidelityFX CAS for everyone
https://www.overclock.net/forum/28085782-post7.html


----------



## treetops422

PontiacGTX said:


> It is curious no one is talking about TriXX boost




It looks really handy, I can't wait to try it out. You could just do it manually with custom resolutions (I think), but that would be a pain in the u-know-what. Having these preset at the click of a button = noice. Now all we need is RIS on DirectX 11, although you can already use it now with RIS added to ReShade if you don't want to wait. Real-time adaptive target FPS would be tits: set the FPS and let a program pick out what to change in real time, even if it just picked out certain areas like WoW cities where your fps dips.
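For anyone wondering what a TriXX-Boost-style preset actually computes: it renders at a scaled-down resolution and relies on RIS to sharpen the upscale back to native. A minimal sketch; the 85% default here is illustrative, not necessarily Sapphire's exact figure:

```python
# Compute the reduced render resolution for a TriXX-Boost-style scale factor.
def boost_resolution(width, height, scale=0.85):
    """Return (w, h) with each axis scaled down by `scale`."""
    return int(width * scale), int(height * scale)

print(boost_resolution(2560, 1440))       # (2176, 1224) at an 85% preset
print(boost_resolution(3840, 2160, 0.7))  # (2688, 1512)
```

Scaling each axis to 85% means rendering only ~72% of the pixels, which is where most of the FPS gain comes from.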


----------



## keikei

Looks like Sapphire is first out of the gate: https://www.newegg.com/sapphire-radeon-rx-5700-xt-100416p8gl/p/N82E16814202349


----------



## ZealotKi11er

keikei said:


> Looks like Sapphire is first out of the gate: https://www.newegg.com/sapphire-radeon-rx-5700-xt-100416p8gl/p/N82E16814202349


Can people complain now about Navi being hot and loud?


----------



## criminal

ZealotKi11er said:


> Can people complain now about Navi being hot and loud?


Nope, seems like that issue is solved now. 

But it does appear that AMD should have just let AIB partners handle the launch so the cards would have had a better reputation from the start. Seems that at only $10 more Sapphire released the 5700XT as it should have been released to begin with.






There still seems to be a "bug" that causes games to crash without warning. Hopefully AMD gets that fixed now.


----------



## Nick the Slick

ZealotKi11er said:


> Can people complain now about Navi being hot and loud?


Nope. So now they'll keep pushing the narrative of power draw and how it doesn't beat the 2080Ti at $200 and also doesn't have ray tracing so therefore it's a complete and utter failure and no one ever should buy it.


----------



## keikei

Seriously, which one do I choose!?! I don't have a showcase rig, so the RGB cards are pointless, but the other cards are fair game. Sux that xfire is a moot tech.


----------



## criminal

keikei said:


> Seriously, which one do I choose!?! I don't have a showcase rig, so the RGB cards are pointless, but the other cards are fair game. Sux that xfire is a moot tech.


The cheapest that isn't reference.


----------



## 7850K

Nick the Slick said:


> Nope. So now they'll keep pushing the narrative of power draw and how it doesn't beat the 2080Ti at $200 and also doesn't have ray tracing so therefore it's a complete and utter failure and no one ever should buy it.


heh, sounds like how I expected the AMD subreddit to respond. They were in general very critical of the previous gens like Vega, Polaris and Fury. Navi, though, has been surprisingly well received.


----------



## Aussiejuggalo

Anyone know if there will be an AIO version from a company? Tempted to get one but not looking for an air-cooled version; too much power and too hot.

Vega was a failure though, really: too hot, too power hungry and far too expensive. Navi would've been the same if AMD stuck with HBM; thankfully they didn't.

Really, the 5700 XT is a damn good card, coming close to the 2070 Super for $210 less, when comparing the most expensive 5700 XT ($699) to the cheapest 2070 Super ($909), at least here in Aus.


----------



## Heuchler

Aussiejuggalo said:


> Anyone know if there will be an AIO version from a company? Tempted to get one but not looking for an air-cooled version; too much power and too hot.
> 
> Vega was a failure though, really: too hot, too power hungry and far too expensive. Navi would've been the same if AMD stuck with HBM; thankfully they didn't.
> 
> Really, the 5700 XT is a damn good card, coming close to the 2070 Super for $210 less, when comparing the most expensive 5700 XT ($699) to the cheapest 2070 Super ($909), at least here in Aus.





How would HBM be more power hungry?


----------



## Aussiejuggalo

Heuchler said:


> How would HBM be more power hungry ?



Not the HBM; the Vega architecture was power hungry, 295-300W for jack-all performance.


----------



## The Robot

Aussiejuggalo said:


> Not the HBM; the Vega architecture was power hungry, 295-300W for jack-all performance.


Yeah, it was AMD's Fermi: a compute card with display outputs. At least now RDNA and the compute arch are separated, just like Nvidia has done since Kepler.


----------



## 113802

The Robot said:


> Aussiejuggalo said:
> 
> 
> 
> Not the HBM; the Vega architecture was power hungry, 295-300W for jack-all performance.
> 
> 
> 
> Yeah, it was AMD's Fermi: a compute card with display outputs. At least now RDNA and the compute arch are separated, just like Nvidia has done since Kepler.

I wouldn't call Kepler a separated arch, since big Kepler crushed the GTX 680 in gaming performance. Nvidia didn't split their archs up until just recently with Volta and Turing.


----------



## Heuchler

Sapphire Pulse Radeon RX 5700 XT test
https://www.hardwareluxx.de/index.p...sapphire-pulse-radeon-rx-5700-xt-im-test.html









TriXX boost


----------



## keikei




----------



## rdr09

criminal said:


> Nope, seems like that issue is solved now.
> 
> But it does appear that AMD should have just let AIB partners handle the launch so the cards would have had a better reputation from the start. Seems that at only $10 more Sapphire released the 5700XT as it should have been released to begin with.
> 
> There still seems to be a "bug" that causes games to crash without warning. Hopefully AMD gets that fixed now.


I agree. It would have been a better show if AIBs were reviewed at the same time as reference, but we all know why it hasn't happened that way in a while.

Considering that, the 5700 covers a good range of the price and performance categories, from the GTX 1660 Ti all the way up to the 2060 Super. Spend $50 or so more and you have the XT, which is nipping at the heels of the RTX 2080. So these two cards combined cover the low to mid-high-end tiers.


----------



## Ashura

Any point in choosing the top tier cards? Those triple fan ones, Strix for e.g.


----------



## ilmazzo

Ashura said:


> Any point in choosing the top tier cards? Those triple fan ones, Strix for e.g.


e-peen


----------



## Offler

Comparing 5700(xt) with my Radeon VII...

Scores/FPS from Guru3D reviews on games i play are almost exactly same as mine...

There is one specific i am interested in. I realized on Radeon VII that when i play games on fullHD (1920x1080), frame capped (60) the total power consumption of my system is 130-160 watts on full details and full quality.

Does 5700(xt) behave similarly regarding to power consumption?


----------



## keikei

Offler said:


> Comparing 5700(xt) with my Radeon VII...
> 
> Scores/FPS from Guru3D reviews on games i play are almost exactly same as mine...
> 
> There is one specific i am interested in. I realized on Radeon VII that when i play games on fullHD (1920x1080), frame capped (60) the total power consumption of my system is 130-160 watts on full details and full quality.
> 
> Does 5700(xt) behave similarly regarding to power consumption?


You'll probably only get that info from an owner, but reviews have the power consumption similar; the 5700 XT does eke out slightly better efficiency vs the Radeon VII.

*Former Radeon VII owner myself. I will be getting Navi. Good to have the new tech for cheaper and very similar performance. Unless Nvidia does something crazy good, Navi 2 will be another purchase.


----------



## Offler

keikei said:


> You'll probably only get that info from an owner, but reviews have the power consumption similar; the 5700 XT does eke out slightly better efficiency vs the Radeon VII.
> 
> *Former Radeon VII owner myself. I will be getting Navi. Good to have the new tech for cheaper and very similar performance. Unless Nvidia does something crazy good, Navi 2 will be another purchase.


I will keep my Radeon VII at least for crunching big numbers (in case that might happen), which the 5700 is not capable of... The low power consumption at lower resolutions with a frame cap was an unexpected bonus, because even in that case I expected 300+W.


To answer some discussion above:
For now my system seems to have buses wider than required (PCI-E, SSD, HBM, RAM), with a power cascade much higher than required (on both GPU and CPU), using a PSU which has 90% efficiency at 10% load, so I kinda expect very smooth performance, low heat and low power consumption even when more CPUs or newer GPUs might provide more total performance.

It's not just about e-peen; you have to justify things like the super-wide HBM bus on Vega. In my case the goal is to achieve lower GPU latency and better concurrent writes into VRAM compared to standard GDDR.
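On the PSU point above, wall draw is just the DC load divided by the efficiency at that load point; a one-liner sketch with illustrative numbers:

```python
# Wall power for a given DC load and PSU efficiency at that load point.
def wall_draw_w(dc_load_w, efficiency):
    return dc_load_w / efficiency

# A 150 W system load on a PSU that is ~90% efficient at that load point:
print(round(wall_draw_w(150, 0.90)))  # 167 W at the wall
```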


----------



## PontiacGTX

Offler said:


> I will keep my Radeon VII at least for crunching big numbers (in case that might happen), which is 5700 not capable of... The low power consumption on lower resolutions with framecap was an unexpected bonus, because even in that case i expected 300+W.
> 
> 
> To answer some discussion above:
> For now my system seems to have buses wider than required (PCI-E, SSD, HBM, RAM), with power cascade much higher than required (on both GPU and CPU), using PSU which has 90% effectivity at 10% load so i kinda expect very smooth performance, low heat and low power consumption even when more CPUs or newer GPUs might provide more total performance.
> 
> Its not just about e-peen, but you have to justify things like superwide bus on HBMs of Vega. In my case the goal is to achieve lower GPU latency, and better concurrent writes into VRAM compared to standard GDDRs.


With big numbers, do you refer to OpenCL applications, or games? Also, memory speed seems better on Navi: https://img1.mydrivers.com/img/20190710/a3fb2354-77f1-4fe7-a50a-939374f63e48.png


----------



## criminal

https://www.techspot.com/review/1896-msi-radeon-5700-xt-evoke/

I actually like the look of this card.


----------



## ZealotKi11er

criminal said:


> https://www.techspot.com/review/1896-msi-radeon-5700-xt-evoke/
> 
> I actually like the look of this card.


Looks like RTX FE


----------



## Heuchler

[Igor'sLAB] MSI Radeon RX 5700 XT Evoke OC Edition tested - English
https://www.igorslab.media/msi-rade...-butter-or-margarine-on-bread-english-review/

Card is 23.5 cm long (fits most mATX cases), 5x 6mm heatpipes, 460 euros for a mid-tier card [Deutsch]
https://www.igorslab.media/msi-rade...tion-im-test-butter-oder-margarine-aufs-brot/


PWM Controller: IR35217 (7 used), 7x NCP302155 (OnSemi)
Mem: Micron MT61K256M32, NCP81022 (OnSemi), 2x NCP302155 (OnSemi)







Fixed the typo of 4x 6mm to 5x 6mm heatpipes. 7+2+1 VRM. The leaked MSI MECH photo was wrong, as they are using the new VENTUS cooler rather than ARMOR. And no DVI.


----------



## Imouto

The performance discrepancy among reviewers is exasperating.

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-evoke/
https://www.guru3d.com/articles_pages/msi_radeon_rx_5700_xt_evoke_review,1.html


----------



## AlphaC

There's no discrepancy where it matters though re: the EVOKE.

In a nutshell: worse power design (OnSemi NCP power stages are less efficient), cooling (4-5 heatpipes with similar fin area; non-thermal backplate), and pricing vs the Sapphire PULSE.


2000RPM on the fans in both the Igor's Lab and TechPowerUp reviews. Guru3D (which I don't find to be as credible) also has its noise level in the top half of its charts.


----------



## Heuchler

pre-order Gigabyte Radeon RX 5700 XT Gaming OC 8G $420 
https://www.amazon.com/dp/B07W95D5V3/


Should ship before the 1-2 month estimate.


----------



## Melcar

Heuchler said:


> pre-order Gigabyte Radeon RX 5700 XT Gaming OC 8G $420
> https://www.amazon.com/dp/B07W95D5V3/
> 
> 
> Should ship before the 1-2 month estimate.



Had several GB GPUs in the past and the fans always end up crapping out in a short time. Good cards besides that.



So far the Pulse seems to be the best one. Too bad these cards are out of my price range (or rather, that I don't even need a new card).


----------



## Heuchler

Totally agree, but let's keep on looking at hardware and trying to get talked into unnecessary upgrades - Game of OCN




[4-6 days] ASRock Radeon RX 5700 XT CHALLENGER D 8G OC 8 $410
https://www.newegg.com/asrock-radeon-rx-5700-xt-rx-5700-xt-challenger-d-8g-oc/p/N82E16814930020

[4-6 days] ASRock Radeon RX 5700 CHALLENGER D 8G OC 8 $360
https://www.newegg.com/asrock-radeon-rx-5700-rx-5700-challenger-d-8g-oc/p/N82E16814930021


----------



## Offler

PontiacGTX said:


> With big numbers, do you refer to OpenCL applications, or games? Also, memory speed seems better on Navi: https://img1.mydrivers.com/img/20190710/a3fb2354-77f1-4fe7-a50a-939374f63e48.png


OpenCL.

Higher frequency with a smaller bus width has its benefits over low frequency and a large bus width. Better/worse is arbitrary, depending on the application and whether the RAM is being taxed by a single task or multiple threads at the same time.

For the purposes of 3D rendering/gaming, high-frequency VRAM is a benefit. For a system that should run multiple 3D instances at the same time, where a "hiccup" might be a problem (usually gradual degradation of performance), it's better to have HBM.

It's much harder to hit a bottleneck on HBM and a system with quad RAM channels, in a similar manner to how it's harder to hit critical temps on a CPU power cascade with 8 physical cores when it was designed for up to 32.

(sorry for OT)


----------



## Heuchler

[Igor'sLAB] Powercolor RX 5700 XT Red Devil [English]
https://www.igorslab.media/powercol...ce-is-mass-times-acceleration-english-review/

cooler: four 6 mm and one 8 mm heatpipe
7 + 2 Phase design (10 + 2 VR circuits)
8-Pin + 6-Pin Dual-BIOS (5WP080)

PWM Controller: IR35217 (7 used), 7x NCP302155 (OnSemi)
Mem: Micron MT61K256M32, NCP81022 (OnSemi), 2x NCP302155 (OnSemi)

Infrared Images Page 7


Bottom Line

One could also have given the "Recommended Buy" award, because those who want the big Navi card and have enough space can buy here almost without hesitation! Especially as the MSRP of 439 euros including VAT is hardly an obstacle, if the card really arrives in shops at that price. After all, you get the performance of a GeForce RTX 2070 (and sometimes even the Super) for the price of a GeForce RTX 2060 Super. Nevertheless, the "Excellent Hardware" award is a class higher, because the technical implementation leaves almost nothing to be desired.

If you can do without ray tracing, play predominantly at WQHD, enjoy a modern driver interface and in return nonchalantly tolerate the frivolous Wattman as a tool, you have no reason to regret buying such a card. That I would write this about an AMD board-partner card, which board partners tinkered together rather emotionlessly in recent years, I would not have dreamed of before the test. Now it has happened.


[German]
https://www.igorslab.media/powercol...l-im-test-kraft-ist-masse-mal-beschleunigung/


----------



## Heuchler

PowerColor RX 5700 XT Red Devil Limited Edition Review by Hot Hardware
https://hothardware.com/reviews/tul-powercolor-radeon-rx-5700-xt-red-devil-review

PowerColor 5700 RED DEVIL (non-XT) by Guru3D
https://www.guru3d.com/articles-pages/powercolor-5700-red-devil-review,1.html

PowerColor Radeon RX 5700 XT Red Devil @ ComputerBase [German]
https://www.computerbase.de/2019-08/powercolor-radeon-rx-5700-xt-red-devil-test/







subtitles available (can use auto-translate feature)


PowerColor RX 5700 XT Red Devil Limited Edition XRX 5700XT AXRX 5700XT 8GBD6-3DHEP/OC 
https://www.overclockers.co.uk/powe...ddr6-pci-express-graphics-card-gx-19a-pc.html

Limited Edition with mouse pad is 449 USD. Overclockers UK has it for £499.99


----------



## criminal

Heuchler said:


> PowerColor RX 5700 XT Red Devil Limited Edition Review by Hot Hardware
> https://hothardware.com/reviews/tul-powercolor-radeon-rx-5700-xt-red-devil-review
> 
> PowerColor 5700 RED DEVIL (non-XT) by Guru3D
> https://www.guru3d.com/articles-pages/powercolor-5700-red-devil-review,1.html
> 
> PowerColor Radeon RX 5700 XT Red Devil @ ComputerBase [German]
> https://www.computerbase.de/2019-08/powercolor-radeon-rx-5700-xt-red-devil-test/
> 
> 
> https://www.youtube.com/watch?v=5U2s7CaWUy0
> 
> subtitles available (can use auto-translate feature)
> 
> 
> PowerColor RX 5700 XT Red Devil Limited Edition XRX 5700XT AXRX 5700XT 8GBD6-3DHEP/OC
> https://www.overclockers.co.uk/powe...ddr6-pci-express-graphics-card-gx-19a-pc.html
> 
> Limited Edition with mouse pad is 449 USD. Overclockers UK has it for £499.99


Those are great looking cards.


----------



## AlphaC

https://www.pugetsystems.com/labs/a...UPER-vs-AMD-RX-5700-XT-1552/#BenchmarkResults

RX 5700XT series isn't for Photoshop users 

The Red Devil is a bit large, and at $440 it's only for people who want the highest-performing RX 5700 XT with reasonable fan noise at the expense of power and size. Peak efficiency of the OnSemi NCP302155 is lower than that of the power stages used on the reference card and the Sapphire Pulse. Additionally, the normal BIOS still retains a 225W-250W (discrepancy among reviewers) power limit while the silent BIOS has a 185W power limit, meaning the extra power on tap is largely wasted.

For most people a $410 Sapphire Pulse is the best option, as on the Quiet BIOS it is simply the only card smaller than the reference design that comes with good price/performance as well as a delicate noise and power balance. Sapphire probably designed their card to come in just under 1 kg for shipping reasons, to save massively on shipping costs per pallet and to the end user.


----------



## Ashura

AlphaC said:


> https://www.pugetsystems.com/labs/a...UPER-vs-AMD-RX-5700-XT-1552/#BenchmarkResults
> 
> RX 5700XT series isn't for Photoshop users


Yes, the 2070S is faster, but PS can run absolutely fine even on low-end GPUs.

From the article:


> To be fair, however, Photoshop is not exactly a GPU powerhouse. There isn't much reason to use a higher-end NVIDIA GPU, and even if we only look at the tasks that utilize the GPU, there is only about a 10% advantage at most for using NVIDIA over AMD. This isn't nothing, but it also isn't likely to be a deal breaker for many users.


----------



## 113802

Ashura said:


> Yes, the 2070S is faster, but PS can run absolutely fine even on low-end GPUs.
> 
> From the article:


Very true, hopefully Puget does a DaVinci Resolve 5700 XT review. 

ROCm 2.7 was released today, no Navi support:

https://github.com/RadeonOpenCompute/ROCm


----------



## bigjdubb

Heuchler said:


> PowerColor RX 5700 XT Red Devil Limited Edition XRX 5700XT AXRX 5700XT 8GBD6-3DHEP/OC
> https://www.overclockers.co.uk/powe...ddr6-pci-express-graphics-card-gx-19a-pc.html
> 
> Limited Edition with mouse pad is 449 USD. Overclockers UK has it for £499.99


Why does the mouse pad have a USB cable? The card itself looks like a robot in disguise, I wonder what it transforms into...


----------



## 113802

bigjdubb said:


> Why does the mouse pad have a USB cable? The card itself looks like a robot in disguise, I wonder what it transforms into...


Doesn't transform into anything special except for a RGB mousepad. The edge of the mousepad lights up.


----------



## maltamonk

bigjdubb said:


> Why does the mouse pad have a USB cable? The card itself looks like a robot in disguise, I wonder what it transforms into...


It's a RGB mousepad.


----------



## bigjdubb

I'm really missing out, might be time to up my mousepad game.


----------



## AlphaC

Ashura said:


> Yes,2070S is faster, but PS can run absolutely fine even on low end GPUs.
> 
> From the article:


 It's currently running slower than Vega 64 and GTX 1660 Ti. You don't see that as a downside?

_"We spoke with our contacts at AMD and what appears to be happening is that the Radeon RX 5700XT (and Radeon VII) currently include software optimizations for gaming, but not yet for office/professional workloads."_

If the RX 5700 XT can perform as well as the RTX 2070 Super / RTX 2080 in DaVinci Resolve it would probably be of interest. Radeon VII had decent performance: _https://www.pugetsystems.com/labs/articles/DaVinci-Resolve-15-AMD-Radeon-VII-16GB-Performance-1382/
_
Also see RX 5700 XT vs RTX 2080:


Spoiler











https://forum.blackmagicdesign.com/viewtopic.php?f=21&t=94979 <--- RX 5700 XT having issues, possibly due to the AMD driver

*OpenCL performance in Geekbench:*
RTX 2080 SUPER = 304,556 which is ~ +20%
RTX 2080 = 286,048 which is ~ +13%
RTX 2070 SUPER = 270,427 which is ~+7%
Quadro RTX 5000 = 264,488
RX 5700XT = 253,336 <---- 
RTX 2070 = 246,441
Quadro RTX 4000 = 233,092
TITAN Xp Collectors Edition = 227,635 
GTX 1080 Ti = 215,361
TITAN X (Pascal) = 210,733
Radeon VII = 210,044
RX 5700 = 208,790 <---- around 82% of the RX 5700 XT
Vega 64 = 204,168 
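The percentage figures above are just ratios against the RX 5700 XT's score; here's a quick Python sketch to double-check them (scores copied from the list above, rounded to whole percent):

```python
# Sanity-check the relative-percentage figures quoted above, using the
# Geekbench OpenCL scores exactly as listed in the post.
scores = {
    "RTX 2080 SUPER": 304556,
    "RTX 2080": 286048,
    "RTX 2070 SUPER": 270427,
    "RX 5700 XT": 253336,
    "RX 5700": 208790,
}

baseline = scores["RX 5700 XT"]

def pct_vs_xt(name):
    """Percentage lead (or deficit) relative to the RX 5700 XT."""
    return round((scores[name] / baseline - 1) * 100)

assert pct_vs_xt("RTX 2080 SUPER") == 20   # ~ +20%
assert pct_vs_xt("RTX 2080") == 13         # ~ +13%
assert pct_vs_xt("RTX 2070 SUPER") == 7    # ~ +7%
# RX 5700 lands at ~82% of the XT's score
assert round(scores["RX 5700"] / baseline * 100) == 82
```

All the quoted deltas hold to within rounding.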


*Sisoft Sandra results*








https://www.sisoftware.co.uk/2019/08/07/amd-radeon-5700xt/



Given that people don't need or want professional GPUs for DirectX workloads such as 3ds Max / Maya and other such applications (although you'd want the added VRAM), as well as for compute workloads where there aren't driver lockdowns, AMD needs to roll out their non-gaming optimizations as soon as they can if they want to gain marketshare outside of gaming. When mining was prevalent, their GPUs were being bought en masse.

FineWine memes and all, but it's not fully baked. People buying right now are paying full price for something that isn't fully ready, much like buying an RTX 2080 for gaming when only a few games supported RTX, and at <60 FPS. At least RTX compute performance is useful for computation.


----------



## Imouto

AlphaC said:


> It's currently running slower than Vega 64 and GTX 1660 Ti. You don't see that as a downside?


Point being whether that "running slower" is something you would even be aware of.


----------



## Heuchler

PowerColor RED DEVIL Radeon RX 5700 XT AXRX 5700XT 8GBD6-3DHEP/OC $449.99
https://www.newegg.com/powercolor-radeon-rx-5700-xt-axrx-5700xt-8gbd6-3dhep-oc/p/N82E16814131751


PowerColor RED DEVIL Radeon RX 5700 AXRX 5700 8GBD6-3DHE/OC 8GB $389.99
https://www.newegg.com/powercolor-radeon-rx-5700-axrx-5700-8gbd6-3dhe-oc/p/N82E16814131750

both out of stock now


"The PowerColor RX 5700 XT Red Devil Limited Edition retails for $450 and includes an RGB mousepad. The regular edition, with exactly the same card with identical specs retails for $440"
https://www.techpowerup.com/review/powercolor-radeon-rx-5700-xt-red-devil/35.html

PowerColor's Radeon RX 5700 XT Red Devil is expected to retail for $440. We reviewed the $450 Limited Edition, which has a flashier package, bundles an RGB mousepad and a graphics card holder. PowerColor assures us that the Limited Edition just has the bigger bundle—the card, all its specs, clocks, etc will be identical on the regular $440 edition. At that price, the card is not unreasonably priced, I'd say maybe $10 too high.


The Sapphire Pulse RX 5700 XT at $410 is a better buy unless you want an RGB mousepad (they can range from $25 to $125).


----------



## keikei




----------



## Imouto

Just noticed the Pulse and Evoke boards are almost a carbon copy of each other.

https://www.techpowerup.com/review/sapphire-radeon-rx-5700-xt-pulse/5.html
https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-evoke/5.html


----------



## EastCoast

keikei said:


> https://www.youtube.com/watch?v=hpyE0S2yKJY


He actually told the truth *@ 9:38-10:35*, giving the RX 5700 XT the nod over the 2060S and 2070S. Something we've been seeing all along.
I don't believe it! :wheee:


So...ahhhh, yeah!


----------



## AlphaC

All they did was basically condense the reference PCB and put different power stages on it, with the MSI one having cheaper power stages. The reference PCB has dead space to fit the radial fan.


----------



## Heuchler

PowerColor RED DEVIL cards used Victory Giant Technology(HuiZhou) for PCB. Just like the reference card.


----------



## ZealotKi11er

AlphaC said:


> All they did was basically condense reference PCB and put different powerstages on it , with the MSI one having cheaper powerstages. The reference PCB has dead space to put the radial fan.


And use cheaper Micron memory same as 5700. 5700 XT Reference has Samsung.


----------



## Imouto

ZealotKi11er said:


> And use cheaper Micron memory same as 5700. 5700 XT Reference has Samsung.


The Evoke card seems to be missing the dual BIOS switch too.


----------



## Section31

I hope that with the 5700 XT and AMD competitive again, we go back to using Nvidia's old lineup as the yardstick:

The RTX 2080 becomes the RTX 3060, the 2080 Ti becomes the 3070. Then you have the 3080 at a 30-ish percent improvement and the 3080 Ti at a 60% jump. Reasonable pricing, with the 3080 Ti in the $699-899 USD range.


----------



## ZealotKi11er

Section31 said:


> I hope with 5700xt and amd competitive we go back to path using nvidia as example only.
> 
> RTX2080 becomes RTX3060, 2080ti becomes 3070. Then you have 3080 at 30ish percent improvement and 3080ti at 60% jump. Reasonable pricing with 3080ti running at 699-899usd range.


It depends. Probably more like the 3070 beats the 2080S but doesn't get to 2080 Ti level. I think people need to forget the 2080 Ti. You can't expect to match its perf within one gen with a $400 GPU.


----------



## Ashura

AlphaC said:


> It's currently running slower than Vega 64 and GTX 1660 Ti. You don't see that as a downside?
> 
> _"We spoke with our contacts at AMD and what appears to be happening is that the Radeon RX 5700XT (and Radeon VII) currently include software optimizations for gaming, but not yet for office/professional workloads."_
> 
> If the RX 5700 XT can perform as well as RTX 2070 Super /RTX 2080 in Da VInci Resolve it would probably be of interest. Radeon VII had decent performance: _https://www.pugetsystems.com/labs/articles/DaVinci-Resolve-15-AMD-Radeon-VII-16GB-Performance-1382/
> _


For PS? No. There isn't enough of a difference to matter. I've worked on a 560, 760, 780, 960, 970, 290X, 1060 & 1080. Nothing noticeable.

As for Max/Maya, in general it wouldn't make much of a difference. But when it comes to rendering, that's where Nvidia takes the lead;
it also depends on the renderer.

E.g., Arnold is a CPU renderer, whereas V-Ray/Redshift are GPU renderers & can take advantage of CUDA cores. I don't personally know the gap in perf. between the 5700 XT & 2070S in real-world V-Ray/Redshift renders.

Having said that, I totally agree with your sentiment that AMD should improve on the professional workloads side.


----------



## rdr09

Ashura said:


> For PS? No. There isn't much of a difference for it to matter. I've worked on 560,760,780,960,970,290x, 1060 & 1080. Nothing noticeable.
> 
> As for Max/Maya, In general it wouldn't make much of a difference. But when it comes to rendering, that's where nvidia takes the lead.
> it also depends on the renderer.
> 
> For eg. Arnold is a CPU renderer, whereas Vray/ Redshift is GPU & can take advantage of CUDA cores. Now I don't personally know the gap in perf. between 5700xt & 2070S in real world Vray/Redshift renders.
> 
> Having said that, I totally agree with your sentiment that AMD should improve on the professional workloads side.


I agree. But for professional workloads, I don't think any $400 card would suffice. They buy a Mac.


----------



## AlphaC

Adobe After effects performance is up:
https://www.pugetsystems.com/labs/a...-Roundup-NVIDIA-SUPER-vs-AMD-RX-5700-XT-1553/


RX 5700XT nearly on par with Radeon VII


----------



## bigjdubb

AlphaC said:


> Adobe After effects performance is up:
> https://www.pugetsystems.com/labs/a...-Roundup-NVIDIA-SUPER-vs-AMD-RX-5700-XT-1553/
> 
> 
> RX 5700XT nearly on par with Radeon VII


It's adobe after effects, pretty much every video card is on par with each other!


----------



## EastCoast

The RX 5700 XT is again the better card over the 2070S at its cheaper price point.


----------



## Rei86

Question, does AMD still make great reference boards?


----------



## 113802

Rei86 said:


> Question, does AMD still make great reference boards?


Of course they do, but the BIOS on their cards is anti-consumer.


----------



## JackCY

Do they now come with a working driver, or is it still a DIY thing of figuring out 10 bugs and then realizing all you can do is plug it in and leave it be, not touching anything in the graphics control panel? It seemed from GN that the driver is still the same poo as at launch.

If I had to choose, probably the Sapphire Pulse or PowerColor Red Devil, depending on which one is cheaper and available in store. The Red Devil is better cooling-wise. However "nice"/clever Sapphire made their cooling, they made a major mistake in blowing hot air back into the case: barely anything escapes these coolers out the back of the case through the slot, and it almost always gets blown back to the front of the case at great pressure. (I had a cooler like that, and two with side blow towards the top and the mobo like PowerColor has it, and those always ran so much better cooling-wise.) Plus, the Red Devil at least has a bit of a cutout on the back to blow straight through. Not a fan of 3x90mm fans; 2x100-120mm fans are better. Still, Sapphire did at least some things right, unlike ASUS and MSI. The Sapphire Pulse and MSI EVOKE share an AMD reference partner PCB, the same PCB with minor tweaks.


----------



## ilmazzo

Rei86 said:


> Question, does AMD still make great reference boards?


Best of the west

serious answer: still pretty much true, yes


----------



## PontiacGTX

Offler said:


> OpenCL.
> 
> Higher frequency with smaller bus widht has its benefits over low frequency and large bus width. Better/worse is arbitrary, depending on application and whether the RAM is being taxed by a single task or multiple threads at the same time.
> 
> For purposes of 3d Rendering/Gaming is high frequency VRam a benefit. For system that should run multiple 3d instances at the same time, while "hiccup" might be a problem (usually gradual degradation of performance) its better to have HBM.
> 
> Its quite harder to hit a bottleneck on a HBM and system with Quad RAM channels, in a similar manner its harder to hit critical temps on CPU power cascade with 8 physical cores, while it was designed for up to 32.
> 
> (sorry for OT)


well if you are on a R VII, you better stick with it because anandtech and and serverthehome had issues running opencl on this card. though. I think in linux AMD's rocm is going to work fine.


----------



## 113802

PontiacGTX said:


> well if you are on a R VII you better stick with it because anandtech and and serverthehome had issues running opencl on this card. though. I think in linux AMD's rocm is going to work fine.


ROCm still doesn't support GFX10(Navi) even with the latest version 2.7. Navi users have to use the AMDGPU-Pro driver for OpenCL.


----------



## PontiacGTX

WannaBeOCer said:


> ROCm still doesn't support GFX10(Navi) even with the latest version 2.7. Navi users have to use the AMDGPU-Pro driver for OpenCL.


I don't know if the AMDGPU-Pro driver exists on Windows for consumers, but many sites couldn't even launch an OpenCL-based application due to drivers. I wonder if AMD is blocking OpenCL support on Windows on purpose, even though they acknowledge their lack of proper support there. One would also expect AMD to prioritize GPUOpen platforms over open-source ones, since they haven't updated their OpenCL/compute APIs on Windows for a while.


----------



## ejb222

JackCY said:


> Do they now come with a working driver or is it a DIY thing still of figuring out 10 bugs and then realizing all you can do is plug it in and leave it be, don't touch anything in the graphics control panel? It seemed from GN that the driver is still same poo as on launch.
> 
> If I had to choose probably Sapphire Pulse or PowerColor Red Devil depending which one is cheaper in store and available. Red Devil is better cooling wise despite how "nice"/clever Sapphire made their cooling they made a major mistake of blowing hot air back into case, there is barely anything escaping from these coolers out the case through back of case slot, it almost always gets all blown back to front of case at great pressure (had a cooler like that and 2 with side blow to top and mobo like PowerColor has it and those always run so much better cooling wise), plus Red Devil has at least a bit of a cut out on back to blow straight through. Not a fan of 3x90mm fans, 2x100-120mm fans are better. Still Sapphire did at least some things right unlike ASUS and MSI. Sapphire Pulse and MSI EVOKE have some AMD reference partner PCB, same PCB with minor tweaks.


I've had no driver issues with stock settings. But Radeon Settings sometimes crashes when I undervolt. I'd like to think a mild undervolt isn't the problem and it's the driver, but I don't know how to tell the difference between a driver issue and an unstable undervolt.


----------



## rdr09

ejb222 said:


> I've had no driver issues with stock settings. But Radeon Settings sometimes crashes when I undervolt. I'd like to think a mild undervolt isn't the problem and it's the driver, but I don't know how to tell the difference between a driver issue and an unstable undervolt.


I started using Auto Undervolt in Wattman. Works great. Temps down and it boosts higher (2000MHz). Been playing Roblox for hours now. Yah, Roblox.


----------



## keikei

*Heard there is a Sapphire Nitro incoming from GN live streams.


----------



## treetops422

keikei said:


> https://www.youtube.com/watch?v=SQjTYbqG1CI
> 
> 
> *Heard there is a Sapphire Nitro incoming from GN live streams.


For those who don't click on the video: it's actually nothing unexpected. A 90C TJ max on the stock fan curve with the blower during a torture test is expected.


----------



## JackCY

They won't say that officially, because then they would get slammed with RMAs for poor cooling performance on the dud cards. So instead they say the thermal throttling temperature, the max a user can see, is fine, lol.

Why do they bother with graphite pads... a blower... and a metal shroud... when for the same money they could make a so much better cooler? And the OEM argument about poor cases is moot, as axial coolers perform better even there.
At least NV finally took the freakin' hint after decades and switched their reference cards to more reasonable coolers. AMD is still stuck with DO NOT BUY reference cards. The hot-and-loud meme is only being fueled by AMD's heat and unwillingness to change their cooler design to anything sensible.


----------



## ZealotKi11er

JackCY said:


> They won't say that officially because then they would get slammed with RMAs for poor cooling performance on some of the dud cards. So instead they will say thermal throttling temperature, the max user can see, is fine lol.
> 
> Why do they bother with graphite pads... blower... and metal shroud... when for same money they could make so much better cooler and the OEM argument of poor cases is moot as axial coolers perform better even there.
> At least NV took the freakin' hint after decades and made a switch on their reference cards finally to more reasonable coolers. AMD still stuck with DO NOT BUY reference cards. The hot and loud meme is only being fueled by AMD's heat and unwillingness to change their cooler design to anything sensible.


Are you going to complain for 1 month? 5700/XT are going to be in the market for 1 year+.


----------



## sjwpwpro

Can anyone point me in the direction of a 3rd party 5700/5700 XT or 2070 super with LED's that is at least close to MSRP?


----------



## paulerxx

2450 on the core before he crashed.


----------



## keikei

sjwpwpro said:


> Can anyone point me in the direction of a 3rd party 5700/5700 XT or 2070 super with LED's that is at least close to MSRP?



The Red Devil does have rgb. The non limited ed is $440 (not released yet). I recommend playing vid on mute:


----------



## sjwpwpro

Yeah, but it's not in stock; sorry, I should have added that to start with. I have the money to buy now and I can't find either anywhere, at least not anywhere near MSRP. I hadn't built one in a while and finished one a few months ago, but waited on the card for all the new ones to come out to see where they fell. Newegg and Amazon are the only places I know to order from.


----------



## sjwpwpro

keikei said:


> The Red Devil does have rgb. The non limited ed is $440 (not released yet). I recommend playing vid on mute:
> 
> 
> https://www.youtube.com/watch?v=i0kXPWE5o58


That card is sexy and i will buy the second that i can find one.


----------



## keikei

^Auto-notify. Hell, auto-notify all the aftermarket ones. For whatever reason it's super hard to get a card during its initial launch.


----------



## sjwpwpro

I feel it's that way on purpose. When the 2060/2070... launched there was an abundance of cards and I almost got one, but held out. Now there are none of the new ones; I'm beginning to wonder what is going on here.


----------



## treetops422

sjwpwpro said:


> I feel it's that way on purpose. When the 2060/2070... launched there was an abundance of cards and I almost got one, but held out. Now there are none of the new ones; I'm beginning to wonder what is going on here.


This one is on 4-6 day back order, the quickest AIB I could find. Idk anything about back orders, so don't yell at me later, lol. I checked Walmart, Microcenter, Fry's Electronics, TigerDirect, and Best Buy; Newegg, Amazon and eBay are of course the only ones that have AIB cards.

https://www.newegg.com/asrock-radeo...t&cm_re=radeon_5700_xt-_-14-930-020-_-Product


----------



## PontiacGTX

paulerxx said:


> https://www.youtube.com/watch?v=JS9in-RHbjw
> 
> 2450 on the core before he crashed.


Is it me, or does the core clock scale much better from 2100 to 2400 than from stock to 2100? Or is it because Time Spy is synthetic? It would be far more interesting if people tested in games.


----------



## AlphaC

https://www.igorslab.media/en/makin...ntly-quieter-with-the-morepowertool-tutorial/


> Powercolor showed us the way with the Silent-BIOS, today we follow with a tutorial for everyone that is suitable for everyday use: no more BIOS flashing and no more Wattman tricks, but a comprehensible manual that is easy to implement with our free MorePowerTool...
> I’ve been asked again and again why we don’t write undervolting tutorials for AMD’s new Navi cards analogous to the RX Vega mandatory user program. I’ve tested a lot with Wattman and the possibilities and find that a simple undervolting guide for these new graphics cards wouldn’t be really practical and honest. The series dispersion in chip qualities would mean that a few people might think that I’m a scared rabbit who doesn’t want (or can’t) exploit the full potential of the (very good) chips. And the vast majority of the rest could be left disappointed, whose cards don’t even reach my goal.





> From 44.4 dB(A), which is quite loud, the whole thing drops to only 35.7 dB(A), which is not only bearable, but now also quiet! Well, the motor noises are still there, but the air noises of the rotor blades are history. And it’s even quiet enough to finally hear the voltage regulators again during load changes. This is not loud, but noticeable again, especially in the menus in the game.
> 
> 
> ...
> Everything that the manufacturers themselves can still modify in the BIOS to adapt the cards can also be done with the MorePowerTool. And this is exactly where the advantage is obvious, because on the one hand you don’t have to risk a BIOS to flash and on the other hand you can adapt the card to your individual needs. So a Radeon RX 5700 XT with only 160 watts limit and adjusted Ampere and fan values can still act faster than a RX 5700 @Stock, but can act much more efficient and quieter.



RX 5700XT Evoke was used


----------



## treetops422

AlphaC said:


> https://www.igorslab.media/en/makin...ntly-quieter-with-the-morepowertool-tutorial/


Igor is a god and I am not worthy. (And _*hellm*_) 


Edit: check out this list of where to buy the 5700 XT:
https://www.asrock.com/general/buy....0 XT Challenger D 8G OC&Country=United States
The Radeon 5700 reference card is back down to $330 with a 3-year warranty, or 2-year if you want Sapphire:

https://www.newegg.com/asrock-radeo...5700&cm_re=radeon_5700-_-14-930-019-_-Product


----------



## Heuchler

treetops422 said:


> For those who don't click on the video, it's actually not expected. 90c tj max on the stock fan curve with the blower during a torture test is expected.



Thanks. I had no plan on clicking on the click-bait. As for taking safety advice from that channel, the one that shows Matisse overclocking at 1.40v: hard pass.

As far as I know, nVidia doesn't expose a T-Junction reading.

ComputerBase will get my clicks
https://www.computerbase.de/2019-08/radeon-rx-5700-xt-185-watt-custom-vergleich/


----------



## sjwpwpro

Thank you guys for trying. The AMD cards I understand but I figured NVIDIA would have been better.


----------



## ilmazzo

of course, nvidia is always better no matter what


----------



## paulerxx

The 5700 XT's peak overclock matches an RTX 2080 (2450MHz on the core).


----------



## Buris

paulerxx said:


> https://www.youtube.com/watch?v=IDO-UHZebV4
> 
> 5700XT's peak overclock matches a RTX2080. ( 2450mhz on the core)


Even at stock it's severely limited by Memory Bandwidth. Raising the clocks helps a little, but with faster memory, this puppy could probably be significantly faster than the 2080.


----------



## paulerxx

Buris said:


> Even at stock it's severely limited by Memory Bandwidth. Raising the clocks helps a little, but with faster memory, this puppy could probably be significantly faster than the 2080.


I totally agree, as did the man himself. He got a very small OC on the memory, it's basically topped out as is.

I'm curious to see what AMD's 5800/5900 bring to the table and what AMD will do to keep those GPUs cool.

From Steve's video linked above:

"The memory was one of our challenges, as we couldn't drive the temperature lower without a memory heater. We also need more voltage and power, something that additional softmods or hardmods will fix."


----------



## maltamonk

Buris said:


> Even at stock it's severely limited by Memory Bandwidth. Raising the clocks helps a little, but with faster memory, this puppy could probably be significantly faster than the 2080.


Enter the 5800/XT SKUs, possibly with HBM2?


----------



## ilmazzo

maltamonk said:


> Enter the 5800/XT SKUs, possibly with HBM2?


Well, according to past amd's top gpu releases it should have HBM2 (or HBM2E on dual stacks) 

but

Dunno what Raja changed before leaving... For profitability and availability they should have ditched it on gaming cards, but since memory compression on Navi still seems to be behind Nvidia's, I think they will have to stick with HBM2 even this time... A 384-bit GDDR6 bus at maybe 16Gbps could suffice to feed a big Navi (768GB/s, maybe?), but these are only feelings, not calculations... For sure they need to get close to the VII's bandwidth, but at a lower cost (otherwise I won't be able to afford it, lol).

dunno, interesting times next 6 months
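For what it's worth, the bandwidth arithmetic above is easy to check; here's a minimal sketch, keeping in mind the 384-bit / 16 Gbps configuration is pure speculation from the post, not a known SKU:

```python
# GDDR6 peak bandwidth: (bus width in bits / 8) * per-pin data rate in Gbps -> GB/s.
def gddr_bandwidth_gbps(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

# RX 5700 XT as shipped: 256-bit bus, 14 Gbps GDDR6
assert gddr_bandwidth_gbps(256, 14) == 448.0   # GB/s, matches the spec sheet

# The speculative "big Navi" config from the post: 384-bit, 16 Gbps
assert gddr_bandwidth_gbps(384, 16) == 768.0   # GB/s
```

So a hypothetical 384-bit / 16 Gbps card would land at 768 GB/s, about 71% more than the 5700 XT's 448 GB/s.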


----------



## keikei

The size of dat heatsink. :drool:
https://videocardz.com/newz/msi-showcases-radeon-rx-5700-xt-gaming


----------



## ZealotKi11er

ilmazzo said:


> Well, according to past amd's top gpu releases it should have HBM2 (or HBM2E on dual stacks)
> 
> but
> 
> Dunno what Raja changed before leaving... For profitability and availability they should have ditched it on gaming cards, but since memory compression on Navi still seems to be behind Nvidia's, I think they will have to stick with HBM2 even this time... A 384-bit GDDR6 bus at maybe 16Gbps could suffice to feed a big Navi (768GB/s, maybe?), but these are only feelings, not calculations... For sure they need to get close to the VII's bandwidth, but at a lower cost (otherwise I won't be able to afford it, lol).
> 
> dunno, interesting times next 6 months


I do not think AMD is going to make a gaming HBM2 card. They are not going to make that mistake again. 384-Bit seems interesting. Probably they have something else up their sleeve that they might have learned from Zen.


----------



## homestyle

Which 5700xt is better?

Gigabyte 3 fan cooler, or Sapphire Pulse?


----------



## ilmazzo

powercolor atm


----------



## keikei

Available as i type this: https://www.newegg.com/asrock-radeo...=5700 xt&cm_re=5700_xt-_-14-930-020-_-Product


----------



## Heuchler

Guess the Sapphire Pulse RX 5700 XT and ASRock Challenger were in stock today at Newegg.


GIGABYTE Radeon RX 5700 XT GAMING OC $420 currently on backorder
https://www.newegg.com/gigabyte-radeon-rx-5700-xt-gv-r57xtgaming-oc-8gd/p/N82E16814932208

Sapphire Radeon Pulse RX 5700 XT $420 currently on backorder
https://www.amazon.com/Sapphire-Radeon-Triple-Backplate-Graphics/dp/B07WC7683C/




ASRock Radeon RX 5700 and Sapphire PULSE RX 5700 are in-stock
https://www.newegg.com/asrock-radeon-rx-5700-rx-5700-challenger-d-8g-oc/p/N82E16814930021
https://www.newegg.com/sapphire-radeon-rx-5700-100417p8gl/p/N82E16814202350


----------



## ilmazzo

Did I miss the Gigabyte reviews? I can't remember any... So I just won't take it into account when choosing; they made a lot of errors with previous AMD card releases...


----------



## keikei

ilmazzo said:


> I missed gigabyte reviews? Can't remember anyone.... so i won't just take it into account when choosing, they did a lot of errors with previous amd cards releases.......


I've found multiple Sapphire Pulse reviews and all speak well of the card, but not a single Gigabyte OC one. That's telling in and of itself.


----------



## Heuchler

keikei said:


> I've found multiple Sapphire Pulse reviews and all speak well of the card, but not a single Gigabyte OC one. That's telling in and of itself.


As in what? NDA probably expires near or at launch date for those cards. 

RDNA Architecture Whitepaper [PDF]
https://www.amd.com/system/files/documents/rdna-whitepaper.pdf


----------



## zGunBLADEz

ZealotKi11er said:


> It depends. Probably more like 3700 it beat 2080S but not get to 2080 Ti level. I think people need to forget 2080 Ti. You cant expect to match its perf within 1 gen with $400 GPU.


That GPU doesn't even exist in my book anyway; anything over the $699 bracket is non-existent, actually XD


----------



## tpi2007

First WHQL certified drivers for Navi:

https://www.techpowerup.com/258507/...radeon-software-adrenalin-19-8-1-whql-drivers


If you already have 19.8.1 Beta, there is no change. That was the version submitted for validation and passed as is.


----------



## AlphaC

_Davinci Resolve result_
https://www.pugetsystems.com/labs/a...-Roundup-NVIDIA-SUPER-vs-AMD-RX-5700-XT-1563/


> *RTX 2060 SUPER 8GB is overall about 24% faster than the Radeon RX 5700 XT 8GB* while the *RTX 2070 SUPER 8GB is 11% faster than the Radeon RX Vega 64 8GB*. This is a pretty significant lead for NVIDIA and one that is going to require a lot of work on AMD's part to overcome.


RX 5700 needs work on the software side


----------



## maltamonk

AlphaC said:


> https://www.pugetsystems.com/labs/a...-Roundup-NVIDIA-SUPER-vs-AMD-RX-5700-XT-1563/
> 
> 
> 
> RX 5700 needs work on the software side


Might want to add that's for DaVinci Resolve. Without that info.....the quote is misleading since it's taken out of context.


----------



## Imouto

Someone please tell that Pudget Systems guy about weighted averages.

What a trainwreck of a comparison.
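To illustrate the complaint, here's a toy sketch (made-up numbers, not Puget's actual data) of how an unweighted mean across sub-tests can diverge from a workload-weighted one:

```python
# Toy example: an unweighted mean across benchmark sub-tests can mislead
# when the sub-tests matter unequally. All numbers here are made up.
def unweighted_mean(scores):
    return sum(scores.values()) / len(scores)

def weighted_mean(scores, weights):
    total_w = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_w

# Hypothetical sub-test scores for one card
scores = {"playback": 120, "export": 80, "openfx": 40}
# Suppose a typical workflow spends most of its time in playback
weights = {"playback": 0.7, "export": 0.2, "openfx": 0.1}

print(round(unweighted_mean(scores), 1))         # 80.0  -- dragged down by a minor sub-test
print(round(weighted_mean(scores, weights), 1))  # 104.0 -- reflects the workload mix
```

Same card, same sub-scores, a 30% swing in the headline number depending on how you average.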


----------



## AlphaC

Eh, look at the sub scores. It's slower than Vega 64. You're free to post a comment on their page to be honest.


People reading that page probably want an executive summary and if you want a subtest breakdown it's there.


----------



## b.walker36

I'm so tempted to buy the Red Devil; it would be a nice step up from my 980 Ti, but I know as soon as I pull the trigger Nvidia will drop 7nm and big Navi will come out, haha. Whichever comes first, I'm buying something; I need more power.


----------



## 113802

AlphaC said:


> _Davinci Resolve result_
> https://www.pugetsystems.com/labs/a...-Roundup-NVIDIA-SUPER-vs-AMD-RX-5700-XT-1563/
> 
> RX 5700 needs work on the software side


Biased nVidia article; the Radeon VII slaughters the RTX 2080 and competes with $2500+ workstation cards. The Radeon VII isn't EOL and will still have decent availability until the Instinct MI50/MI60 are EOL.

Lisa Su and Simon Ng stated that we're going to continue to see Vega since it's a compute-focused architecture, while Navi (RDNA) was built from the ground up for gaming. I do not believe we'll see Navi parts replace the WX 9100 (Vega 10) or Instinct cards. We'll also be seeing Arcturus soon, which is another GCN compute card.

AMD statement regarding Radeon VII EoL:



> We continue to see strong availability of Radeon VII in the channel for both gamers and creators.


----------



## b.walker36

b.walker36 said:


> I'm so tempted to buy the red evil, it would be a nice step up from my 980TI but I know as soon as I pull the trigger Nvidia drops 7nm and big navi comes out haha. Whichever comes first I'm buying something I need more power.






keikei said:


> I expect the next showdown summer 2020. I could be wrong. Either way, grabbing a Navi to game at decent settings. Devil on my list as well. :devil:


 You really think almost a year for nvidia 7nm?


----------



## keikei

b.walker36 said:


> You really think almost a year for nvidia 7nm?



My best guess. Supa just dropped. You sound like it's coming much sooner.


----------



## paulerxx

AlphaC said:


> _Davinci Resolve result_
> https://www.pugetsystems.com/labs/a...-Roundup-NVIDIA-SUPER-vs-AMD-RX-5700-XT-1563/
> 
> RX 5700 needs work on the software side


I'm not sure if you know this or not, but that article is CLEARLY biased. Please keep nonsense like this off this site; you're not going to fool people like us with an article like this. The majority of us are highly intelligent in this field (that's why we're here to begin with) and we can read between the lines relatively easily.

Sincerely, The Overclock.net community.


----------



## Heuchler

paulerxx said:


> I'm not sure if you know this or not, but that article is CLEARLY biased. Please keep nonsense like this off this site; you're not going to fool people like us with an article like this. The majority of us are highly intelligent in this field (that's why we're here to begin with) and we can read between the lines relatively easily.
> 
> Sincerely, The Overclock.net community.


So when the TR 2970WX beats the i9-9980XE in the [CPU] DaVinci Resolve bench, it's fine. But if a new gaming architecture does worse than a previous GCN card like the Radeon VII, then it's a BIASED test? The Radeon VII isn't EOL.

https://www.pugetsystems.com/recomm...-DaVinci-Resolve-187/Hardware-Recommendations


----------



## Imouto

Heuchler said:


> So when the TR 2970WX beats the i9-9980XE in the [CPU] DaVinci Resolve bench, it's fine. But if a new gaming architecture does worse than a previous GCN card like the Radeon VII, then it's a BIASED test? The Radeon VII isn't EOL.
> 
> https://www.pugetsystems.com/recomm...-DaVinci-Resolve-187/Hardware-Recommendations


Beats? When there are more than 10 CPUs within the top 20%, I'd say "beats" is a bit too strong. And the price bracket makes a 20% gap even more glaring.


----------



## 113802

Heuchler said:


> So when the TR 2970WX beats the i9-9980XE in the [CPU] DaVinci Resolve bench, it's fine. But if a new gaming architecture does worse than a previous GCN card like the Radeon VII, then it's a BIASED test. The Radeon VII isn't EOL.
> 
> https://www.pugetsystems.com/recomm...-DaVinci-Resolve-187/Hardware-Recommendations


It's biased because they are just trying to sell their products. The Radeon VII isn't EOL, but instead of recommending it they state the clear choice is nVidia cards.



> What this means is that with the exception of the AMD Radeon VII 16GB (which is no longer being made), *NVIDIA GPUs are the clear choice for anyone using DaVinci Resolve.* While the exact performance gain over AMD changes based on the type of grade and codec used, the RTX 2060 SUPER 8GB is overall about 24% faster than the Radeon RX 5700 XT 8GB while the RTX 2070 SUPER 8GB is 11% faster than the Radeon RX Vega 64 8GB. This is a pretty significant lead for NVIDIA and one that is going to require a lot of work on AMD's part to overcome.


----------



## PontiacGTX

Nvidia released a driver with integer scaling support. Will AMD ever catch up? I mean, at least let people who own Vega/Navi get it?
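For anyone wondering what integer scaling actually does, here is a toy sketch (the image-as-nested-lists representation and the function name are purely illustrative): each source pixel becomes an exact k x k block, so low-res content stays sharp instead of being bilinearly blurred.

```python
# Toy integer (nearest-neighbor) scaling: repeat every row k times,
# and every pixel within a row k times, so pixels stay as crisp blocks.
def integer_scale(img, k):
    return [[px for px in row for _ in range(k)]
            for row in img for _ in range(k)]

img = [[1, 2],
       [3, 4]]
print(integer_scale(img, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

The GPU driver feature is exactly this idea done in the display pipeline instead of software.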


----------



## treetops422

Heuchler said:


> So when the TR 2970WX beats the i9-9980XE in the [CPU] DaVinci Resolve bench, it's fine. But if a new gaming architecture does worse than a previous GCN card like the Radeon VII, then it's a BIASED test. The Radeon VII isn't EOL.
> 
> https://www.pugetsystems.com/recomm...-DaVinci-Resolve-187/Hardware-Recommendations


In the comments the author says normal hardware reviews go off MSRP prices, not current prices. Thought that was pretty lol. Nvidia is one of the only hardware companies that does not lower its products below MSRP over time. Not to hate on Nvidia; that's just how they operate. Intel dropped the 9400F to like $150, that's a damn good deal. The 3600 is $200. Both great deals. You wouldn't list the 9400F at its $200 MSRP and not mention the current price in a review.


I should shut up since I don't know anything about DaVinci. But when general knowledge is ignored, it's usually a red flag.


----------



## Heuchler

treetops422 said:


> In the comments the author says normal hardware reviews go off MSRP prices, not current prices. Thought that was pretty lol. Nvidia is one of the only hardware companies that does not lower its products below MSRP over time. Not to hate on Nvidia; that's just how they operate. Intel dropped the 9400F to like $150, that's a damn good deal. The 3600 is $200. Both great deals. You wouldn't list the 9400F at its $200 MSRP and not mention the current price in a review.
> 
> 
> I should shut up since I don't know anything about DaVinci. But when general knowledge is ignored, it's usually a red flag.


Competition is good for consumers. Hardware is limited by software at times. AMD has been ahead with innovations such as the first 64-bit x86 processor, the first true dual-core processor, and the first real quad-core CPU. But if software doesn't take advantage, we don't get the full benefit of these advances.
Professional software is hardly ever developed for AMD products. Vegas Pro video editing software is the only one that comes to mind. Autodesk and Adobe don't seem very optimized for AMD products. Hopefully that will change over time.

Single Precision (FP32)
AMD Radeon VII: 13.8 TFLOPS
AMD Radeon RX Vega 64: 12.7 TFLOPS
AMD Radeon RX 5700 XT: 9.75 TFLOPS
AMD Radeon R9 Fury X: 8.6 TFLOPS
AMD Radeon RX 5700: 7.95 TFLOPS
AMD Radeon RX 590: 7.1 TFLOPS
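For reference, FP32 figures like the ones above follow from the usual shaders x clock x 2 formula (an FMA counts as two FLOPs per shader per clock). A quick sketch, using the reference boost clocks as illustrative inputs:

```python
# FP32 throughput = shader count * boost clock (Hz) * 2 FLOPs per clock (FMA)
def fp32_tflops(shaders, boost_mhz):
    return shaders * boost_mhz * 1e6 * 2 / 1e12

print(round(fp32_tflops(2560, 1905), 2))  # RX 5700 XT -> 9.75
print(round(fp32_tflops(2304, 1725), 2))  # RX 5700    -> 7.95
print(round(fp32_tflops(4096, 1546), 2))  # RX Vega 64 -> 12.66
```

These are peak theoretical numbers; sustained clocks (and therefore real throughput) are lower on the blower cards.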


----------



## tpi2007

WannaBeOCer said:


> Bias nVidia article, the Radeon VII slaughters the RTX 2080 and competes with $2500+ workstation cards. The Radeon VII isn't EOL and still has decent availability until the Instinct Mi50/Mi60 are EoL.
> 
> Lisa Su and Simon Ng stated that we're going to continue to see Vega since it's a high computing architecture while Navi(RDNA) was built ground up for gaming. I do not believe we'll see Navi parts replace the WX 9100(Vega 10) or Instinct cards. We'll be seeing Arcturus soon also which is another GCN compute card.
> 
> AMD statement regarding Radeon VII EoL:



That carefully crafted PR statement basically means that the Radeon VII is actually EOL. They deflected the real answer by pointing to the store shelves and saying that there is still plenty of stock. And while that is so, they don't have to address whether the card is actually EOL on their end.

Take the GT 710, you can still buy it from many places, but I just learned that the card is EOL since at least June of 2017. See the comment from an Asus Rep here: https://www.newegg.com/asus-geforce...on=GT 710&cm_re=GT_710-_-14-126-052-_-Product

That is to say, a card may be EOL for months, sometimes even years, all the while you can still find them for sale in multiple places around the world.



Anyway, back to Navi:






Seems like a low effort design.


----------



## ilmazzo

Not everything that shines is actually goooouuuld (bad translation of an Italian proverb)


----------



## The Robot

tpi2007 said:


> That carefully crafted PR statement basically means that the Radeon VII is actually EOL. They deflected the real answer by pointing to the store shelves and saying that there is still plenty of stock . And while that is so, they don't have to address whether the card is actually EOL on their end.
> 
> Take the GT 710, you can still buy it from many places, but I just learned that the card is EOL since at least June of 2017. See the comment from an Asus Rep here: https://www.newegg.com/asus-geforce...on=GT 710&cm_re=GT_710-_-14-126-052-_-Product
> 
> That is to say, a card may be EOL for months, sometimes even years, all the while you can still find them for sale in multiple places around the world.
> 
> 
> 
> Anyway, back to Navi:
> 
> https://www.youtube.com/watch?v=morJq0HJoCc
> 
> Seems like a low effort design.



Man, this is even worse than the heatpipe fiasco on the Asus DirectCU 290X and EVGA ACX 970.


----------



## Lexi is Dumb

Sort of on topic: anyone else seen the XFX 5700 XT RAW yet? Looks like they learned something after the whole <>< phase. It looks mean.
Hope they learned some cooling lessons after the fat boy too, but I won't hold my breath.
https://www.pccasegear.com/products/47618/xfx-radeon-rx-5700-xt-raw-ii-oc-8gb


----------



## ToTheSun!

ilmazzo said:


> Not everything that shines is actually goooouuuld (bad translation of a italian proverb)


Pretty sure that saying is used in almost all romance and germanic languages.


----------



## criminal

tpi2007 said:


> Anyway, back to Navi:
> 
> https://www.youtube.com/watch?v=morJq0HJoCc
> 
> Seems like a low effort design.


I saw that earlier this morning. Did MSI even try? I can't help but feel that they didn't do anything but find a cooler that would fit into the gold shroud they designed and called it a day.



Lexi is Dumb said:


> Sort of on topic, anyone else seen the XFX 5700 XT RAW yet, looks like they learned something after the whole <>< phase. It looks mean.
> Hope they learned some cooling lessons after the fat boy too, but I won't hold my breath.
> https://www.pccasegear.com/products/47618/xfx-radeon-rx-5700-xt-raw-ii-oc-8gb


That is a great looking card.


----------



## 113802

Lexi is Dumb said:


> Sort of on topic, anyone else seen the XFX 5700 XT RAW yet, looks like they learned something after the whole <>< phase. It looks mean.
> Hope they learned some cooling lessons after the fat boy too, but I won't hold my breath.
> https://www.pccasegear.com/products/47618/xfx-radeon-rx-5700-xt-raw-ii-oc-8gb


The game clock is 60 MHz slower than the Sapphire Pulse's. The Red Devil looks to have the best clocks out of the box, but it comes at a steep price.

Aesthetically I prefer the MSI Mech but everyone is different.


----------



## Jamelcr

Any Mech OC review?


----------



## criminal

Jamelcr said:


> Any Mech OC review?


https://lmgtfy.com/?q=msi+5700+xt+mech


----------



## Heuchler

The new Radeon Software Adrenalin 2019 Edition 19.8.2 is now available, with up to 10% improved performance in Control,
support for Man of Medan, HDCP 2.3 support on Radeon RX 5700 series graphics cards, and various bug fixes.
https://www.amd.com/en/support/kb/release-notes/rn-rad-win-19-8-2

The new driver was released an hour or so ago.


HardwareLuxx test of Control @1440p (with RTX off) using GeForce 436.02 and Adrenalin 2019 Edition 19.8.1 
https://www.hardwareluxx.de/index.p...rks-control-mit-rtx-und-dlss-ausprobiert.html


----------



## PriestOfSin

Pretty good vid. Honestly, both cards seem like winners, but I'd personally go with the 5700XT with a good cooling solution. Maybe if I had a use for RTX I'd go for the 2070S.


----------



## ZealotKi11er

PriestOfSin said:


> https://www.youtube.com/watch?v=0ptGG0lIKYI
> 
> Pretty good vid. Honestly, both cards seem like winners, but I'd personally go with the 5700XT with a good cooling solution. Maybe if I had a use for RTX I'd go for the 2070S.


Overclocked 2070S vs "stock" 5700 XT Reference.


----------



## maltamonk

ZealotKi11er said:


> Overclocked 2070S vs "stock" 5700 XT Reference.


$591 vs $449... idk why there is this insistence on comparing the 5700xt to the 2070s. It should be vs the 2060s, where the prices actually line up.


----------



## Newbie2009

Tip for anyone with a reference card: I found that a manual fan profile, although it keeps the card cooler, for some reason tanks performance.

Maybe it's just me, but this issue has persisted across multiple drivers and a DDU now.

The first drivers would cause a green screen if I tried a manual fan profile.

My performance jumped 400+ points by using auto fans.
https://www.3dmark.com/spy/8297449
https://www.3dmark.com/fs/20271366

Don't think I've ever come across this with reference AMD cards before, and I've owned a lot.


----------



## grifers

Newbie2009 said:


> Tip for anyone with a reference card: I found that a manual fan profile, although it keeps the card cooler, for some reason tanks performance.
> 
> Maybe it's just me, but this issue has persisted across multiple drivers and a DDU now.
> 
> The first drivers would cause a green screen if I tried a manual fan profile.
> 
> My performance jumped 400+ points by using auto fans.
> https://www.3dmark.com/spy/8297449
> https://www.3dmark.com/fs/20271366
> 
> Don't think I've ever come across this with reference AMD cards before, and I've owned a lot.



Hi! Can you post a screenshot of your manual fan profile? Thanks a lot! I have a 5700 XT reference card too.


----------



## keikei

maltamonk said:


> $591 vs $449......idk why there is this insistence to compare the 5700xt to the 2070s. It should be vs the 2060s where the prices actually line up



Even the Supa cards are very limited. Retailers jacking dem $ up too, so it's impossible to find MSRP. Nvidia pulling another unicorn @ $499.


----------



## The Robot

maltamonk said:


> $591 vs $449......idk why there is this insistence to compare the 5700xt to the 2070s. It should be vs the 2060s where the prices actually line up


Because of similar performance. Jacketman just wants $100 for his RTX vaporware.


----------



## ToTheSun!

The Robot said:


> Because of similar performance. Jacketman just wants $100 for his RTX vaporware.


One is "but muh rasterization" and the other is "vaporware". Can you tell which is which?



Spoiler


----------



## 113802

ToTheSun! said:


> One is "but muh rasterization" and the other is "vaporware". Can you tell which is which?


The top image has anti-reflective window film. You have it reversed: the bottom should be "before" and the top "after" installing the film on the glass.

Reflections look fine without that vaporware.


----------



## treetops422

maltamonk said:


> $591 vs $449......idk why there is this insistence to compare the 5700xt to the 2070s. It should be vs the 2060s where the prices actually line up


Nvidia needs a 25% price reduction handicap.



Heuchler said:


> new Radeon Software Adrenalin 2019 Edition 19.8.2 is now available, with up to 10% improved performance in Control,
> support for Man of Medan, HDCP 2.3 support on Radeon RX 5700 series graphics cards, and various bug fixes
> https://www.amd.com/en/support/kb/release-notes/rn-rad-win-19-8-2
> 
> New driver where just released an hour or so ago
> 
> 
> HardwareLuxx test of Control @*1440* p (with RTX off) using GeForce 436.02 and Adrenalin 2019 Edition 19.8.1
> https://www.hardwareluxx.de/index.p...rks-control-mit-rtx-und-dlss-ausprobiert.html


Downloading as I type!:thumb:


----------



## Dmac73

Little playing around with 5700XT:

19.8.2
2080/900
Fire Strike - 29,357 gpuScore

19.8.2
2110/900
TimeSpy - 9,776 gpuScore

Still messing. I have subpar memory.
Not bad for a $399 card. Strapped a cheap old Antec Kuhler to it.


Dmac


----------



## Heuchler

PowerColor Red Dragon RX 5700 XT (AXRX 5700 XT 8GBD6-3DHR/OC)
https://www.powercolor.com/product?id=1565953800

1795 MHz (Game) / up to 1650 MHz (Base) / up to 1905 MHz (Boost)

5x 6 mm copper heatpipes
dual 100 mm dual-ball-bearing fans
1.5 mm metal backplate
240 mm x 132 mm x 41 mm

PowerColor RX 5700 XT Red Dragon $409
PowerColor RX 5700 Red Dragon $359

https://www.powercolor.com/new?id=1565837617


Prices include VAT (tax):

£424.99 - https://www.scan.co.uk/products/pow...40-gpu-7nm-rdna-2560-streams-1650mhz-gpu-1905
£439.99 - https://www.overclockers.co.uk/powe...ddr6-pci-express-graphics-card-gx-19d-pc.html
PLN 2,099.00 - https://www.proshop.pl/Karta-grafic...-Dragon-8GB-GDDR6-RAM-Karta-graficzna/2794196
SEK 5,399.00 - https://www.proshop.se/Grafikkort/P...T-Red-Dragon-8GB-GDDR6-RAM-Grafikkort/2794196
out-of-stock - https://www.newegg.com/powercolor-radeon-rx-5700-xt-axrx-5700xt-8gbd6-3dh/p/N82E16814131753


----------



## rdr09

Dmac73 said:


> Little playing around with 5700XT:
> 
> 19.8.2
> 2080/900
> Fire Strike - 29,357 gpuScore
> 
> 19.8.2
> 2110/900
> TimeSpy - 9,776 gpuScore
> 
> Still messing. I have subpar memory.
> Not bad for a $399 card. Strapped a cheap old Antec Kuhler to it.
> 
> 
> Dmac


Nice. How's that 3770K holding up?


----------



## ILoveHighDPI

ToTheSun! said:


> One is "but muh rasterization" and the other is "vaporware". Can you tell which is which?
> 
> 
> 
> Spoiler


It took me a solid minute to figure out what reflections were missing.

RTX is a waste of time.


----------



## ToTheSun!

ILoveHighDPI said:


> It took me a solid minute to figure out what reflections were missing.
> 
> RTX is a waste of time.


I won't hold that against you; it's only, like, half the total visual content in the scene.


----------



## Dmac73

rdr09 said:


> Nice. How's that 3770K holding up?



Not great. Zen is in my near future.


Dmac


----------



## rdr09

Dmac73 said:


> Not great. Zen is in my near future.
> 
> 
> Dmac



You just need an R5 3600 and a cheap B450 motherboard. Cheap RAM will work too. Even this will work . . .

https://www.newegg.com/ballistix-16...s&cm_re=ballistics-_-0RN-00A4-000N8-_-Product


----------



## JackCY

WannaBeOCer said:


> The top image has anti-reflective window film. You have it reversed the bottom should be "before" and the top is "after" installing the film on the glass.
> 
> Reflections look fine without that vaporware.


There is a big difference between static baked reflections and fully dynamic reflections, lighting, etc. Sure, you can make games look OK with static baking everywhere; some engines even support somewhat "dynamic" baking by placing sample probes throughout the 3D space, but compared to tracing it's miles away in quality for dynamic situations.

Sure, the average Joe has no idea how much effort and work is put into making these graphics look more realistic, and how big the difference in approach and performance can be between various techniques for the same visual effect: static vs dynamic, etc.

Reflections, shadows/GI, ... look pretty horrible without tracing in any dynamic scene. Dynamic as in moving reflective, emissive, etc. objects, not just a static reflective mirror or a static light that you can bake. AO as we have had it until now is horrible, adding shadows where they are not supposed to be. Lighting overall is very bad in games in terms of realism and requires a lot of developer work to set up and bake even half decently.
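To make the static-vs-dynamic point concrete, here is a deliberately tiny sketch (the one-object "scene", the function names, and the colors are all made up for illustration): a baked probe returns whatever was captured at bake time, while a traced reflection re-intersects the live scene every frame.

```python
def reflect(d, n):
    # mirror direction: r = d - 2 (d . n) n, for a unit normal n
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2 * dot * b for a, b in zip(d, n))

def trace(ray_dir, object_present, object_color):
    # toy "scene": the reflected ray pointing up hits the object
    # only if the object is actually there *right now*
    return object_color if ray_dir[1] > 0 and object_present else "sky"

# looking straight down at a floor mirror (normal points up)
r = reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0))

baked_probe = "sky"  # captured at bake time, before the object moved in
print(trace(r, object_present=True, object_color="red"))  # traced: red
print(baked_probe)                                        # baked: sky (stale)
```

Probe re-baking, screen-space reflections, etc. narrow the gap, but each has its own failure cases; tracing just pays the full cost of asking the live scene.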


----------



## overpass

It isn't so much about what's missing in my case as about confusion. What is being reflected, and what is on the other side? Why is the subject closer to the glass rendered darker, in stark contrast to the other in bright light? Is it trying to tell me something, perhaps the next objective in the quest? It's all too confusing, and it subtracts from delivering the necessary visual cues rather than adding to them. In that comparison at least, RTX is a failure.


----------



## Heuchler

GIGABYTE RX5700/5700XT GAMING OC 8G Mini-Review
7 + 2 Phase design: IR35217 (PWM-Controller) using ON Semiconductor NCP81022
Micron D9WCW MT61K256M32 - https://www.micron.com/products/graphics-memory/gddr6/part-catalog/mt61k256m32je-14


http://www.coolpc.com.tw/phpBB2/viewtopic.php?f=71&t=262005


Manufacturer page
https://www.gigabyte.com/us/Graphics-Card/GV-R57XTGAMING-OC-8GD#kf

GIGABYTE Radeon RX 5700 XT GAMING OC $419.99
https://www.newegg.com/gigabyte-radeon-rx-5700-xt-gv-r57xtgaming-oc-8gd/p/N82E16814932208


----------



## criminal

ZealotKi11er said:


> Overclocked 2070S vs "stock" 5700 XT Reference.





maltamonk said:


> $591 vs $449......idk why there is this insistence to compare the 5700xt to the 2070s. It should be vs the 2060s where the prices actually line up



About $130 total difference after tax between the 5700 XT I had and the 2070S I have now. No hesitation or doubt whatsoever about what I am about to say... the $130 extra was worth every penny. ¯\_(ツ)_/¯


----------



## 113802

JackCY said:


> There is a big difference between static baked reflections and fully dynamic reflections, lighting, etc. Sure, you can make games look OK with static baking everywhere; some engines even support somewhat "dynamic" baking by placing sample probes throughout the 3D space, but compared to tracing it's miles away in quality for dynamic situations.
> 
> Sure, the average Joe has no idea how much effort and work is put into making these graphics look more realistic, and how big the difference in approach and performance can be between various techniques for the same visual effect: static vs dynamic, etc.
> 
> Reflections, shadows/GI, ... look pretty horrible without tracing in any dynamic scene. Dynamic as in moving reflective, emissive, etc. objects, not just a static reflective mirror or a static light that you can bake. AO as we have had it until now is horrible, adding shadows where they are not supposed to be. Lighting overall is very bad in games in terms of realism and requires a lot of developer work to set up and bake even half decently.


Overwatch has dynamic reflections; it looks great without the need for tracing of the rays.


----------



## Heuchler

Excel has Ray Tracing 
https://github.com/s0lly/Raytracer-In-Excel


----------



## The Robot

overpass said:


> It isn't so much about what's missing in my case as about confusion. What is being reflected, and what is on the other side? Why is the subject closer to the glass rendered darker, in stark contrast to the other in bright light? Is it trying to tell me something, perhaps the next objective in the quest? It's all too confusing, and it subtracts from delivering the necessary visual cues rather than adding to them. In that comparison at least, RTX is a failure.


This. If the price for "physically accurate rendering" is halving your framerate and doubling the GPU cost, is it really worth it? Especially when talented developers can achieve the same immersion using "fake" methods. I'm with Raja on this one, the tech is cool but not ready for prime time yet.
https://www.dsogaming.com/news/raja-koduri-talks-about-real-time-ray-tracing-and-intels-xe-gpus-starting-price-200-in-2020/


----------



## ToTheSun!

The Robot said:


> This. If the price for "physically accurate rendering" is halving your framerate and doubling the GPU cost, is it really worth it? Especially when talented developers can achieve the same immersion using "fake" methods. I'm with Raja on this one, the tech is cool but not ready for prime time yet.
> https://www.dsogaming.com/news/raja...nd-intels-xe-gpus-starting-price-200-in-2020/


Yes. I'm sure Intel's and AMD's stance on raytracing has absolutely nothing to do with the fact that they don't have hardware powerful enough to even think about implementing it.

When they do have it, I'm sure it'll be a tremendous coincidence that, then, raytracing will be absolutely right.


----------



## criminal

ToTheSun! said:


> Yes. I'm sure Intel's and AMD's stance on raytracing has absolutely nothing to do with the fact that they don't have hardware powerful enough to even think about implementing it.
> 
> When they do have it, I'm sure it'll be a tremendous coincidence that, then, raytracing will be absolutely right.


Although not overly impressed at the moment with Ray Tracing myself, when AMD gets it, I'm sure many will think it is the bee's knees.


----------



## ToTheSun!

criminal said:


> Although not overly impressed at the moment with Ray Tracing myself, when AMD gets it, I'm sure many will think it is the bee's knees.


It amazes me that a tech forum is frequented by so many technophobes.

Even the 8K guy says it's hard to spot transparency reflections, a beautiful product of tracing that is impossible to do dynamically and in real time with rasterization.


----------



## maltamonk

criminal said:


> Although not overly impressed at the moment with Ray Tracing myself, when AMD gets it, I'm sure many will think it is the bee's knees.


I'm pretty sure that'll all depend on the performance hit at that time and the added cost. For many it's not that they don't like it, it's just not worth it yet. It's very much like HBM2....is it cool, yes, is it worth it now,....uhmmm not really.


----------



## 113802

maltamonk said:


> I'm pretty sure that'll all depend on the performance hit at that time and the added cost. For many it's not that they don't like it, it's just not worth it yet. It's very much like HBM2....is it cool, yes, is it worth it now,....uhmmm not really.


HBM2 is awesome, especially the HBCC. It's too bad the R9 Fury/X didn't have an HBCC; it would probably have been the best long-term video card ever.


----------



## criminal

ToTheSun! said:


> It amazes me that a tech forum is frequented by so many technophobes.
> 
> Even the 8K guy says it's hard to spot transparency reflections, a beautiful product of tracing that is impossible to do dynamically and in real time by rasterization.





maltamonk said:


> I'm pretty sure that'll all depend on the performance hit at that time and the added cost. For many it's not that they don't like it, it's just not worth it yet. It's very much like HBM2....is it cool, yes, is it worth it now,....uhmmm not really.


Just played Control for about 30 minutes. Ray Tracing on High settings. All I can say is wow. Performance is perfectly fine for a single player game. The reflections and lighting in that game are amazing. https://arstechnica.com/gaming/2019...-best-game-yet-and-a-ray-tracing-masterpiece/


----------



## Jedi Mind Trick

criminal said:


> Just played Control for about 30 minutes. Ray Tracing on High settings. All I can say is wow. Performance is perfectly fine for a single player game. The reflections and lighting in that game are amazing. https://arstechnica.com/gaming/2019...-best-game-yet-and-a-ray-tracing-masterpiece/


LMAO! I came to say the same thing. HOLY JESUS. I was playing it on max (1440p with a 2060S), getting 30fps (or less) and realized the lower framerate was 100% worth the increased visuals. 

And here I (honestly) thought RTX was a useless gimmick...


----------



## ZealotKi11er

Jedi Mind Trick said:


> LMAO! I came to say the same thing. HOLY JESUS. I was playing it on max (1440p with a 2060S), getting 30fps (or less) and realized the lower framerate was 100% worth the increased visuals.
> 
> And here I (honestly) thought RTX was a useless gimmick...


This dev has always had amazing visuals.


----------



## treetops422

Jedi Mind Trick said:


> LMAO! I came to say the same thing. HOLY JESUS. I was playing it on max (1440p with a 2060S), getting 30fps (or less) and realized the lower framerate was 100% worth the increased visuals.
> 
> And here I (honestly) thought RTX was a useless gimmick...


How does it look compared to 4k RT off? What FPS you gettin on 4k?


----------



## ilmazzo

criminal said:


> About $130 total difference after tax between the 5700 XT I had and the 2070S I have now. No hesitation or doubt whatsoever about what I am about to say... the $130 extra was worth every penny. ¯\_(ツ)_/¯


The 5700 XT AIBs have cleared up the choice; the 2070S now has to be discounted.


----------



## ilmazzo

criminal said:


> Just played Control for about 30 minutes. Ray Tracing on High settings. All I can say is wow. Performance is perfectly fine for a single player game. The reflections and lighting in that game are amazing. https://arstechnica.com/gaming/2019...-best-game-yet-and-a-ray-tracing-masterpiece/


I have just seen a screenshot of a woman looking at a wall/mirror/corridor.

Seems very exciting lol.

Anyway, I want an 8fps "real world graphics" RT HDR4 game, anyone with me?


----------



## Offler

ilmazzo said:


> anyway I want a 8fps "real world graphics" RT HDR4 game, anyone with me?


I know it's meant to be a joke, but realistically... how much data can be generated is one thing; how much data human senses can process is a different thing.


----------



## ToTheSun!

ilmazzo said:


> I have just seen screenshot of a woman looking at a wall/mirror/corridor
> 
> seems very exciting lol
> 
> anyway I want a 8fps "real world graphics" RT HDR4 game, anyone with me?


Are you guys going to belittle criminal's personal experience again, like you did with his reference card?


----------



## Jedi Mind Trick

treetops422 said:


> How does it look compared to 4k RT off? What FPS you gettin on 4k?


Mid 20's with 4k; I think 1440p RT looks a bit better than 4k no RT (the lighting looks much, much better).


----------



## ilmazzo

ToTheSun! said:


> Are you guys going to belittle criminal's personal experience again, like you did with his reference card?


Sure, I already sent an anonymous note to the KGB (you know, red team....), they're gonna take him and his 2070S within minutes.....



Jedi Mind Trick said:


> Mid 20's with 4k; I think 1440p RT looks a bit better than 4k no RT (the lighting looks much, much better).



Naaaaaa, DLSS is the (only) way for 4K; JHH would be disappointed if not, all those neural networks put to waste!


----------



## Jedi Mind Trick

ilmazzo said:


> Sure, I already sent a anonymous note to KGB (you know, red team....), they gonna take him and his 2070S within minutes.....
> 
> 
> 
> 
> naaaaaa DLSS is the (only) way for 4K, JHH would be disappointed if not, all those neural networks put in waste!


Lolol!


----------



## ToTheSun!

ilmazzo said:


> Sure, I already sent a anonymous note to KGB (you know, red team....), they gonna take him and his 2070S within minutes.....


If AMD is, at least, 2 years behind nVidia on RT, what makes you think they'd do anything within minutes?


----------



## ilmazzo

We agree that you and I don't understand each other, lol, and for sure it is my fault, don't get me wrong.

RT software and gaming are 4 years away from it being a real option, so AMD is still in time imho.


----------



## maltamonk

This is about the best I could find atm.





Meh.......


----------



## criminal

ilmazzo said:


> 5700XT aibs cleared the choice, 2070S now have to be discounted


I didn't know the AIBs fixed the cumbersome software and driver issues too.

If money were the only factor in what stuff I buy, then yeah, the 5700 XT is a no-brainer. But even taking the "noise problem" I experienced out of the equation, it doesn't overcome the overall experience I had with the 5700 XT versus the 2070. Personal experience > price.



ToTheSun! said:


> Are you guys going to belittle criminal's personal experience again, like you did with his reference card?


That's just what brand boys do. They would rather argue for their brand than actually argue for better tech.

Like I said in an earlier post... Ray Tracing was kind of meh up until Control. Videos and screenshots just don't do it justice.


----------



## ToTheSun!

ilmazzo said:


> We agree that me and you don't understand each other, lol and for sure it is my fault, don't get me wrong
> 
> RT software and gaming is 4 years behind to be a option so amd is still in time imho


Why do you say it's 4 years behind when people in this very thread who have actually looked at and played Control with RTX on personally say it looks astonishing and framerates can actually be made playable with the right compromise of settings?

Aren't owners of RT capable cards playing RT titles entitled to their personal observations? More importantly still, don't they have the legitimacy to do so? And, in contrast, what would you say about people who do NOT own RT capable cards and have NOT played RT titles criticizing and belittling the experience of those who do?



criminal said:


> Ray Tracing was kind of meh up until Control. Videos and screenshots just don't do it justice.


Absolutely agree.


----------



## maltamonk

Can you explain how videos don't do it justice?


----------



## skline00

Just bought Control. Anxious to try it tonight with RT and the 9900k/2080TI rig.


----------



## Jedi Mind Trick

maltamonk said:


> Can you explain how videos don't do it justice?


I couldn't post my screenshots from earlier (when I was comparing 1440p w/ & w/o RT to 4k w/ & w/o RT), but I thought it was pretty noticeable in screenshots/videos. When I get home, I'll try to post them again (I think they were a bit too large to attach directly).

In game the non-RT graphics made the MC's jacket look unnatural (kind of 'clay' like, as opposed to looking like the swish-y material it looks like it was made of). Haven't gotten very far yet, so I haven't really noticed other things, but I think the differences between RT and non RT are pretty big (big enough that I probably should not have enabled it in the first place [the 2060S I am currently using is for a friend; my 1080ti does not support RT :sad-smile]).


----------



## ilmazzo

skline00 said:


> Just bought Control. Anxious to try it tonight with RT and the 9900k/2080TI rig.


A little OT but yeah nice for ya


----------



## paulerxx

skline00 said:


> Just bought Control. Anxious to try it tonight with RT and the 9900k/2080TI rig.




"Radeon RX 5700 XT and 5700 reviews"



?????? 

Did you post this here by accident?


----------



## Jedi Mind Trick

ilmazzo said:


> A little OT but yeah nice for ya





paulerxx said:


> "Radeon RX 5700 XT and 5700 reviews"
> 
> 
> 
> ??????
> 
> Did you post this here by accident?


I mean, we've been talking about Control and RTX features (or the lack-thereof) for quite some time now.

Seems a bit weird to call this user out specifically about being OT/lost...

Edit: I guess we've only been talking about it for a day or so, but the point still stands.


----------



## The Robot

maltamonk said:


> This is about the best I could find atm.
> https://www.youtube.com/watch?v=av9_rBMvvhc
> 
> Meh.......


Yeah, looks the same as Quantum Break. Anyway, who is really excited to play it other than to look at pretty reflections? Remedy haven't made an interesting game since Max Payne. I see mostly newbie players praising it, then again they praise CoD too.


----------



## PontiacGTX

The Robot said:


> Yeah, looks the same as Quantum Break. Anyway, who is really excited to play it other than to look at pretty reflections? Remedy haven't made an interesting game since Max Payne. I see mostly newbie players praising it, then again they praise CoD too.


The mechanics are kinda like Rage 2/Singularity, with control of gravity, telekinetic control of objects, and stuff. Few games implement fun mechanics like these. As for the story, it makes little to no sense; same for Rage 2.


----------



## Imouto

maltamonk said:


> This is about the best I could find atm.
> https://www.youtube.com/watch?v=av9_rBMvvhc
> 
> Meh.......


80 FPS at 720p with a RTX 2060S

That's beyond sad.


----------



## The Robot

Imouto said:


> 80 FPS at 720p with a RTX 2060S
> 
> That's beyond sad.


It's irresponsible performance


----------



## PontiacGTX

nvm, switched to 1080p and still reach 90-100 fps with RTX off vs 40-60 fps at 1080p with RTX on


----------



## NightAntilli

criminal said:


> Although not overly impressed at the moment with Ray Tracing myself, when AMD gets it, I'm sure many will think it is the bee's knees.


Let's clarify this, because people keep saying that.

Firstly, ray tracing has been around for quite a while. The technology hasn't been used in gaming because it simply was (and arguably still is) too slow. This is three years ago;






AMD cards CAN do ray tracing. It has already been done. See here (start at 5:30);






The issue is not with ray tracing. Ray tracing technology can become useful. The issue is that RTX cards are overpriced for their performance, and on top of that, RTX tanks performance even more, for what in reality is a minor graphical upgrade. Those facts by default definitely do not make RTX awesome or innovative or whatever.

The ones that think RTX is awesome are either the ones that have bought the nVidia propaganda, or blind nVidia fanboys that want to convince everyone to waste money on a gimmick. RTX seems to be the next GameWorks, in the sense that they are putting specific hardware in their GPUs and doing things deliberately in an inefficient manner to prop up GPU prices and single out the competition at the same time, just like they did with tessellation. And people are still gullible enough to fall for it.

You have direct proof right there, that AMD cards can do ray tracing without requiring RTX cores or dedicated hardware or whatever. Can it be faster with dedicated hardware? Sure. But it becomes a problem when GPU prices were jacked up with the same performance but some gimmick added on top as the justification for the price hike. If you can't see it, I can't help you. It has everything to do with being scammed. Listen to that same second video, about what he says around 8:46.


----------



## 113802

NightAntilli said:


> Let's clarify this, because people keep saying that.
> 
> Firstly, ray tracing has been around for quite a while. The technology hasn't been used in gaming because it simply was (and arguably still is) too slow. This is three years ago;
> 
> AMD cards CAN do ray tracing. It has already been done. See here (start at 5:30);
> 
> The issue is not with ray tracing. Ray tracing technology can become useful. The issue is that RTX cards are overpriced for their performance, and on top of that, RTX tanks performance even more, for what in reality is a minor graphical upgrade. Those facts by default definitely do not make RTX awesome or innovative or whatever.
> 
> The ones that think RTX is awesome are either the ones that have bought the nVidia propaganda, or blind nVidia fanboys that want to convince everyone to waste money on a gimmick. RTX seems to be the next GameWorks, in the sense that they are putting specific hardware in their GPUs and doing things deliberately in an inefficient manner to prop up GPU prices and single out the competition at the same time, just like they did with tessellation. And people are still gullible enough to fall for it.
> 
> You have direct proof right there, that AMD cards can do ray tracing without requiring RTX cores or dedicated hardware or whatever. Can it be faster with dedicated hardware? Sure. But it becomes a problem when GPU prices were jacked up with the same performance but some gimmick added on top as the justification for the price hike. If you can't see it, I can't help you. It has everything to do with being scammed. Listen to that same second video, about what he says around 8:46.


Yes, we get it, AMD got into ray tracing way too late. If ATi were still separate we would have seen this much sooner. At the end of the day AMD will still release hardware dedicated to accelerating BVH, just like how they increased the number of tessellation units when they were behind with DX11 tessellation. 

This was 9 years ago and nVidia clearly stated their vision that we would see hybrid Ray Tracing in future games. Thanks to their RT cores it's already possible. 

https://youtu.be/oK4UGnwwuEM


----------



## NightAntilli

WannaBeOCer said:


> Yes, we get it, AMD got into ray tracing way too late. If ATi were still separate we would have seen this much sooner.


Too late based on what? There's this from 2008;
https://www.tomshardware.com/news/Larrabee-Ray-Tracing,5769.html 



WannaBeOCer said:


> At the end of the day AMD will still release hardware dedicated to accelerate BVH. Just like how they increased the number of tessellation units when they were behind with DX11 tessellation.


Yeah. They will. So what? Is this another example of 'yeah, but nVidia did it first'? When will AMD ever get credit for doing something first? They did tessellation first; nobody cared. nVidia does it, boom, everyone sees it as the best thing ever. Radeon Image Sharpening? Everyone claims they're copying nVidia; then nVidia releases an equivalent method and suddenly they are great for surpassing AMD's version, even though they are technically 'late'. G-Sync vs FreeSync? Same story. Async compute? Same story. DX10.1? Same story. Radeon Anti-Lag? Same story. 

It vehemently annoys me that nVidia gets the props whether they do something first or last, and AMD gets slammed whether they do it first or last. Like, seriously. I cannot understand where this bias comes from. 



WannaBeOCer said:


> This was 9 years ago and nVidia clearly stated their vision that we would see hybrid Ray Tracing in future games. *Thanks to their RT cores it's already possible. *
> 
> https://youtu.be/oK4UGnwwuEM


As for the bold part: I have no idea why you're saying this. I just posted a video of the 5700 XT and even the RX 580 doing hybrid RT without any RT cores. The 5700 XT is definitely playable, and even the RX 580 is playable if you expect console-like performance (as in, 30 fps). So really, what is your point? If these cards can achieve this, what did the RT cores really bring to the table, other than a marketing gimmick that apparently has everybody fooled?

I guarantee that if AMD comes with a superior solution, no one will be praising them like they praised nVidia for being late with FreeSync support, late with Async compute support, late with their sharpening filter, late with their low latency mode etc. etc. etc. They will find something to slam AMD about, just like the Navi cards are being slammed on trivial irrelevant nonsense.


----------



## 113802

NightAntilli said:


> Too late based on what? There's this from 2008;
> https://www.tomshardware.com/news/Larrabee-Ray-Tracing,5769.html
> 
> Yeah. They will. So what? Is this another example of 'yeah but nVidia did it first'? When will AMD ever get credited for doing something first? They did tessellation first. Nobody cared. nVidia does it, boom, everyone sees it as the best thing ever. Radeon image sharpening? Everyone claims they're copying nVidia, then nVidia releases an equivalent method, suddenly they are great for surpassing AMD's version, even though they are technically 'late'. G-sync vs FreeSync? Same story. Async compute? Same story. DX10.1? Same story. Radeon Anti-lag? Same story.
> 
> It vehemently annoys me that nVidia gets the props whether they do something first or last, and AMD gets slammed whether they do it first or last. Like, seriously. I cannot understand where this bias comes from.
> 
> 
> At bold part; I have no idea why you're saying this. I just posted a video of the 5700XT and even the RX 580 doing hybrid RT without any RT cores. The 5700XT is definitely playable, and even the RX 580 is playable if you expect console-like performance (as in, 30fps). So really, what is your point? If these cards can achieve this, what did the RT cores really bring to the table, other than a marketing gimmick that apparently has everybody fooled?
> 
> I guarantee that if AMD comes with a superior solution, no one will be praising them like they praised nVidia for being late with FreeSync support, late with Async compute support, late with their sharpening filter, late with their low latency mode etc. etc. etc. They will find something to slam AMD about, just like the Navi cards are being slammed on trivial irrelevant nonsense.


nVidia has had sharpening filters via FreeStyle since driver 390.65, released Jan 8, 2018. AMD does it, boom, everyone sees it as the best thing ever. nVidia has had pre-rendered frame controls in their drivers for years. AMD releases it, names it Anti-Lag, boom, everyone sees it as the best thing ever. All nVidia did was rename their old technologies to counter AMD's canny marketing. 

https://www.geforce.com/en_GB/gfecn...ansel-enhancements-geforce-experience-article



> Some of our favorite combinations are “war cinema,” which adds a Vignette filter, reduces the vibrance using the Color filter, and decreases Gamma using Contrast. Another favorite is “Eagle Eye,” which increases vibrance through Color filter while *sharpening the image using Details filter*. With a full suite of powerful filters, you can customize your game to your liking.


They have superior technology to VESA Adaptive-Sync, and I would also take my time to support a technology that ripped off mine. 

I've been using SEUS PTGI E9 since launch day on my Radeon VII. It does look nice and uses voxels for GI, just like the Crytek demo, which ran at 1080p 30 FPS on an RX Vega 56. Crytek already said that if they used RT cores it would run faster. The same can be said about Minecraft, and we'll see that soon when the RTX patch is released. AMD released cards that sucked at tessellation; nVidia didn't. nVidia released an architecture (Fermi) that revolutionized the technology industry and caused the "big bang" of deep learning.


----------



## NightAntilli

^Keep up the lies bro.

Aside from all that nonsense, it's still clear that AMD's cards are viable for ray tracing, without RT cores. I already said that hardware acceleration would make it better. But there is nothing in nVidia's hardware that really makes it that much faster, if we use that Minecraft video as a reference.


----------



## 113802

NightAntilli said:


> ^Keep up the lies bro.
> 
> Aside from all that nonsense, it's still clear that AMD's cards are viable for ray tracing, without RT cores. I already said that hardware acceleration would make it better. But there is nothing in nVidia's hardware that really makes it that much faster, if we use that Minecraft video as a reference.


If their current cards were viable they would have already enabled DxR. We've already seen how the GTX 10 series performs, and even the Titan V. Minecraft SEUS PTGI doesn't utilize RT cores.


----------



## Imouto

What people refuse to understand is that RT cores accelerate just a part of the whole process. That's why the performance uplift in path tracers (the whole package) like Blender Cycles is around 30%.

Once you start doing real world lighting the performance tanks badly. Unless you make dedicated hardware that can take care of the whole thing in real time, RTX is vaporware.

The RTX implementation for ray tracing in gaming is lackluster to say the least because it is at least a decade early in its current form. For path tracer render software like Cycles or Octane it wins hands down because the speed up actually matters.

And before anyone starts with that: it only matters for amateurs and small jobs. The tiny VRAM on GPUs, and the complexity of movie shading that tanks performance even further, are why animation studios like Pixar are still doing CPU rendering.
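The "RT cores only accelerate part of the process" point is just Amdahl's law. A toy sketch with made-up numbers (neither the fraction nor the stage speedup comes from actual profiling; they're chosen to show why the overall uplift lands around the ~30% mark):

```python
def amdahl_speedup(accelerated_fraction, stage_speedup):
    """Overall speedup when only a fraction of the workload is accelerated."""
    return 1.0 / ((1.0 - accelerated_fraction) + accelerated_fraction / stage_speedup)

# Hypothetical: RT cores make ray/box and ray/triangle tests 10x faster,
# but those tests are only ~35% of total render time. The rest (shading,
# BVH build, denoising, ...) is untouched, so the overall gain is modest.
print(round(amdahl_speedup(0.35, 10.0), 2))  # ~1.46x overall
```

Even an infinitely fast intersection unit caps out at 1/(1-0.35) ≈ 1.54x under these assumptions, which is why the rest of the pipeline matters so much.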


----------



## ilmazzo

Let's change discussion title

"RT discussion thread" 

since it seems that's the only way actual RT gets used: in forum debates. F**k Navi and AMD GPUs... too harsh? Don't think so...


----------



## tpi2007

maltamonk said:


> This is about the best I could find atm.
> https://www.youtube.com/watch?v=av9_rBMvvhc
> 
> Meh.......



Yeah, in the video (at 2:14) they say that the frame rate is "pretty good" at ~70 fps with a 2060 Super, but at 2:44 you can see in the game's menu that the game is being rendered at 720p and then upscaled to 1080p with DLSS. Later you can see that at actual 1080p with RTX on, the fps drops to as low as 41 and usually stays in the mid-40s, and that's with MSAA turned off.

And the game is Epic Games Store exclusive for a year.

Oh yeah, I'm going to rush right out the door and delve into all this RTX glory. LOL
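For scale, that 720p-internal-render trick is mostly a pixel-count play; the back-of-envelope arithmetic lines up with the ~70 fps vs mid-40s gap (`pixels` is just an illustrative helper, not anything from the game):

```python
def pixels(width, height):
    """Number of shaded pixels per frame at a given resolution."""
    return width * height

# DLSS in the video renders internally at 1280x720, then upscales to 1920x1080.
ratio = pixels(1920, 1080) / pixels(1280, 720)
print(ratio)  # 2.25 -> native 1080p shades 2.25x as many pixels as 720p
```

So the "pretty good" frame rate is bought by shading well under half the pixels and letting the upscaler fill in the rest.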




Anyway, back to the topic:


----------



## Leopardi

WannaBeOCer said:


> nVidia has had pre-rendered frames in their drivers for years. AMD releases it and names it Anti-Lag boom, everyone sees it as the best thing ever. All nVidia did was rename their old technologies to counter AMD's canny marketing.


nvidia didn't have a zero pre-rendered frames option, so it wasn't just a rename, and they were also caught with much higher input lag than AMD even without anti-lag features.


----------



## NightAntilli

WannaBeOCer said:


> If their current cards were viable they would have already enabled DxR. We've already seen how the GTX 10 series performs, and even the Titan V. Minecraft SEUS PTGI doesn't utilize RT cores.


Yeah. Just keep ignoring this;


----------



## 113802

NightAntilli said:


> WannaBeOCer said:
> 
> 
> 
> If their current cards were viable they would have already enabled DxR. We've already seen how the GTX 10 series performs, and even the Titan V. *Minecraft SEUS PTGI doesn't utilize RT cores.*
> 
> 
> 
> Yeah. Just keep ignoring this;

You're going to quote me talking about that Minecraft video and say I ignored it? 

Anyway, that Minecraft shader pack is voxel based, not BVH. DxR/Radeon Rays utilize BVH, and they're currently working on a way to accelerate BVH traversal. Again, if their current cards were viable, AMD would have enabled DxR in their current lineup.
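For anyone wondering what "accelerating BVH" means in practice: traversing a BVH is mostly repeated ray vs axis-aligned bounding box (slab) tests, and that per-node test is what dedicated RT hardware implements in fixed function. A minimal software version of the test (illustrative Python, not any real API):

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned box?
    inv_dir holds 1/direction per axis (pass float('inf') for zero components)."""
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        # Shrink the interval where the ray is inside all three slabs.
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far  # BVH traversal runs this at every visited node

# Ray shooting down +Z from in front of a unit box: hit.
print(ray_hits_aabb((0.5, 0.5, -5.0), (float("inf"), float("inf"), 1.0),
                    (0.0, 0.0, 0.0), (1.0, 1.0, 1.0)))  # True
```

A voxel shader like SEUS PTGI marches a regular grid instead, which is why it doesn't touch this code path or benefit from hardware built for it.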


----------



## criminal

NightAntilli said:


> Let's clarify this, because people keep saying that.
> 
> The issue is not with ray tracing. Ray tracing technology can become useful. The issue is that RTX cards are overpriced for their performance, and on top of that, RTX tanks performance even more, for what in reality is a minor graphical upgrade. Those facts by default definitely do not make RTX awesome or innovative or whatever.
> 
> The ones that think RTX is awesome are either the ones that have bought the nVidia propaganda, or blind nVidia fanboys that want to convince everyone to waste money on a gimmick. RTX seems to be the next GameWorks, in the sense that they are putting specific hardware in their GPUs and doing things deliberately in an inefficient manner to prop up GPU prices and single out the competition at the same time, just like they did with tessellation. And people are still gullible enough to fall for it.
> 
> You have direct proof right there, that AMD cards can do ray tracing without requiring RTX cores or dedicated hardware or whatever. Can it be faster with dedicated hardware? Sure. But it becomes a problem when GPU prices were jacked up with the same performance but some gimmick added on top as the justification for the price hike. If you can't see it, I can't help you. It has everything to do with being scammed. Listen to that same second video, about what he says around 8:46.


Thanks for letting me know AMD can do ray tracing. Still doesn't change my point: if AMD had dedicated hardware and Nvidia didn't, most everyone screaming "It is useless" would be screaming "Look what AMD cards can do and Nvidia's can't!". Anyway, up until playing Control I didn't give two craps about ray tracing. And honestly I don't see myself buying a game just because it has ray tracing, unless it was a game I planned to buy anyway. 

Also, I am not trying to convince anyone to buy an RTX card to get RTX, because up until Control I hadn't seen a reason to even use RTX. I may be wrong to assume Control is amazing to anyone else, but to me it makes the best use of the RTX features to date, and performance is good enough at native 1440p for me to play and enjoy the game.

Nvidia and AMD cards ARE both overpriced right now. Nvidia cards are obviously more overpriced; I won't argue that. As I said and I stand by it, the added benefit of Nvidia's better laid out and less cumbersome control panel (my personal experience) and better, more consistent drivers (again, my personal experience) is more than worth the little bit of extra cost I paid. And you get RTX features if you choose to use them, which, despite what many of you think, is an added feature.

So gimmick or not... that's for the person to decide, along with whether it is worth spending the extra money. To me, I didn't care either way. I went with the best experience, and after using both, the best experience I had was the Nvidia card. Sorry if it hurts some of your feelings. :sad-smile



tpi2007 said:


> Yeah, in the video (at 2:14) they say that the frame rate is "pretty good" at ~70 fps with a 2060 Super, but at 2:44 you can see in the game's menu that the game is being rendered at 720p and then upscaled to 1080p with DLSS. Later you can see that at actual 1080p with RTX on the fps drops down to as low as 41 fps and usually stays in the mid-40 fps and that's with MSAA turned off.
> 
> And the game is Epic Games Store exclusive for a year.
> 
> Oh yeah, I'm going to rush right out the door and delve into all this RTX glory. LOL


I didn't put up a FPS counter, so I can't verify what my frame rates were. But I didn't use DLSS or anything special. Native 1440p and performance was good enough for a single player game. Control was free with my RTX 2070S which cost me about $40 bucks after regrettably selling my 1080Ti. 

No, don't rush out and get RTX. But if you already have it; it is a nice treat. Not sure how some can't wrap their heads around that fact. :thumb:


----------



## ZealotKi11er

criminal said:


> Thanks for letting me know AMD can do Ray tracing. Still doesn't change my point if AMD had dedicated hardware and Nvidia didn't, most everyone screaming "It is Useless" would be screaming "Look what AMD cards can do and Nvidia's can't!". Anyway, up until playing Control I didn't give two craps about Ray Tracing. And honestly I don't see myself buying a game just because it has Ray Tracing, unless it was a game I planned to buy anyway.
> 
> Also, I am not trying to convince anyone to buy an RTX card to get RTX. Because up until Control, I hadn't seen a reason to even use RTX. Even still I may be wrong to assume that Control is amazing to anyone else, but to me it makes the best use of the RTX features to date. And performance is good enough at native 1440p for me to play and enjoy the game. Nvidia and AMD cards ARE both overpriced right now. Nvidia cards are obviously more overpriced; I won't argue that. As I said and I stand by it, the added benefit of Nvidia's better laid out and less cumbersome control panel(my personal experience) and better, more consistent drivers(again my personal experience) is more than worth the little bit of extra cost I paid. And yeah you get RTX features if you choose to use them, which despite what many of you think is an added feature. So gimmick or not... that's for the person to decide and decide if it is worth spending the extra money. To me, I didn't care either way. I went with the best experience and after using both, the best experience I had was the Nvidia card. Sorry if it hurts some of your feelings. :sad-smile
> 
> 
> I didn't put up a FPS counter, so I can't verify what my frame rates were. But I didn't use DLSS or anything special. Native 1440p and performance was good enough for a single player game. Control was free with my RTX 2070S which cost me about $40 bucks after regrettably selling my 1080Ti.
> 
> No, don't rush out and get RTX. But if you already have it; it is a nice treat. Not sure how some can't wrap their heads around that fact. :thumb:


I was getting about 60 fps at the beginning of the game, with no action, on a 2080 Ti. You are probably around 40 fps with a 2070S. Good enough for the game.


----------



## ilmazzo

So tired of reading about nvidia gaming experiences in amd review threads............. every forum is like that, jeezzzzzz 

unsubscribed

cheers


----------



## 113802

ilmazzo said:


> So tired of reading about nvidia gaming experiences in amd review threads............. every forum is like that , jeezzzzzz
> 
> unsubscribed
> 
> cheers


It's what happens when AMD releases a card with 2-year-old performance and nVidia has 7 cards faster than AMD's top-of-the-line card.


----------



## ZealotKi11er

WannaBeOCer said:


> It's what happens when AMD releases a card with 2 year old performance and nVidia has 7 cards faster than AMD's top of the line card.


That comment is super ignorant. 

Nvidia has the 2070/2070S/1080Ti/2080/2080S/2080Ti/Titan RTX/Titan V, which are all more expensive than the card with 2-year-old performance. By your logic, AMD and Nvidia can't release any GPUs slower than the 2080 Ti.


----------



## keikei

I'm hoping 1 will be in stock next week. Lol. Seriously.


----------



## Imouto

WannaBeOCer said:


> It's what happens when AMD releases a card with 2 year old performance and nVidia has 7 cards faster than AMD's top of the line card.


Excuse the poor car analogy but would that mean that Renault shouldn't bother releasing new cars because Lamborghini or Bugatti outperform them in every metric?

The problem seems to be more from perception and ignorance than a real one.

Nvidia seems to have mastered both engineering and marketing and now if Toyota releases a Corolla someone will put it down comparing it to a Veyron.


----------



## JackCY

keikei said:


> I'm hoping 1 will be in stock next week. Lol. Seriously.


Get a bunch of Chinese/Taiwanese/... locals to surround the factory and demand a truckload of cards, then hire a private cargo jet to fly them over to Iceland. That's how it's done :thumbsdown:

No seriously.

AMD can never keep up with GPU demand, even when there isn't a mining craze. They don't care to be competitive and available in the GPU market; they can do it with CPUs but never with GPUs since GCN. Competition in the GPU market is dead; it's all set up. And when they do get competitive, Nvidia pulls out all the stops and goes bonkers with shady practices.

In the EU the supply of AMD GPUs has always been miserable. It always takes them a year to have both stock and acceptable prices. Right now there's some leftover stock, 1-3 units locally, and that's it, so supply is damn low; the only reason the cards haven't sold out outright is that they cost so much. $490 for a 5700 XT Pulse... no thank you.


----------



## tpi2007

criminal said:


> I didn't put up a FPS counter, so I can't verify what my frame rates were. But I didn't use DLSS or anything special. Native 1440p and performance was good enough for a single player game. Control was free with my RTX 2070S which cost me about $40 bucks after regrettably selling my 1080Ti.
> 
> No, don't rush out and get RTX. But if you already have it; it is a nice treat. Not sure how some can't wrap their heads around that fact. :thumb:



Don't get me wrong, at one point I was contemplating getting one of those decent discounted AIB RTX 2070 models, which would still be around 480 €, and then I realized that Control is an Epic exclusive for a year. Everything is telling me to just wait for Ampere on 7nm EUV (I guess that custom title beneath the username checks out) next year, along with Navi 2.0 with support for ray-traced lighting effects, so basically wait until the stars are aligned. Both Turing and Navi are overpriced, so it would take quite something to make such a move right now, and unfortunately, after all is said and done, there is nothing in my opinion that would make me buy either. That Pulse card from Sapphire, along with the atrociously named model from XFX, look nice though (not sure about the thermals on the XFX model, haven't looked into it). Maybe once things settle down in the coming months both Nvidia and AMD will do some discounts.


----------



## 113802

Imouto said:


> Excuse the poor car analogy but would that mean that Renault shouldn't bother releasing new cars because Lamborghini or Bugatti outperform them in every metric?
> 
> The problem seems to be more from perception and ignorance than a real one.
> 
> Nvidia seems to have mastered both engineering and marketing and now if Toyota releases a Corolla someone will put it down comparing it to a Veyron.


Like you stated, it's a poor car analogy when comparing electronics. Both AMD's and nVidia's cards were a disappointment this generation when it comes down to rasterization performance. 

Moore's Law:



> The principle that the speed and capability of computers can be expected to double every two years, as a result of increases in the number of transistors a microchip can contain.
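Taking the quoted definition literally, the expected scaling is trivial to compute (idealized, of course; the real cadence has slipped well past two years):

```python
def moores_law(initial_transistors, years, doubling_period_years=2.0):
    """Idealized Moore's law: transistor count doubles every fixed period."""
    return initial_transistors * 2 ** (years / doubling_period_years)

# Navi 10 is ~10.3B transistors; under the idealized curve you'd expect
# a same-segment part two years later to have roughly double the budget.
print(round(moores_law(10.3e9, 2) / 1e9, 1))  # 20.6 (billion)
```

Which is exactly the expectation neither vendor met this generation on rasterization performance per dollar.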


----------



## NightAntilli

WannaBeOCer said:


> You're going to quote me talking about that Minecraft video and say I ignored it?


You are ignoring the fact that today Minecraft is running with ray tracing on AMD cards. 



WannaBeOCer said:


> Anyway that Minecraft shader pack is voxel based not BVH. DxR/Radeon Rays utilizes BVH and they're currently working on a way to accelerate BVH. Again if their current cards were viable AMD would of enabled DxR in their current lineup.


And that matters because...? It's ultimately the same effect, achieved in a different way. What you're saying is the equivalent of claiming that because HairWorks uses tessellation and TressFX doesn't, and AMD improved their tessellation because of HairWorks, HairWorks is therefore better.


----------



## 113802

NightAntilli said:


> You are ignoring the fact that today Minecraft is running with ray tracing on AMD cards.
> 
> 
> And that matters because...? It's ultimately the same effect, achieved in a different way. What you're saying is the equivalent of saying HairWorks uses Tessellation while TressFX doesn't and AMD improved tessellation because of HairWorks and thus HairWorks is better.


And again you are speaking about Minecraft; once you add a resource (texture) pack, FPS tanks. My Radeon VII, which outperforms the 5700 XT, drops to 30 FPS. I'd love to see the 5700 XT with a resource pack. Again, if AMD cards were viable for ray tracing current-generation games, AMD would have enabled DxR already.


----------



## PontiacGTX

https://www.phoronix.com/scan.php?page=article&item=3900x-9900k-gaming&num=6 Why is the 3900X's power draw so high with a 5700 XT?


----------



## Dmac73

PontiacGTX said:


> https://www.phoronix.com/scan.php?page=article&item=3900x-9900k-gaming&num=6 why the 3900x power draw is so high with a 5700 xt?


Linux might have something to do with it on top of X570 chipset and PCIe4.


----------



## NightAntilli

WannaBeOCer said:


> And again you are speaking about Minecraft; once you add a resource (texture) pack, FPS tanks. My Radeon VII, which outperforms the 5700 XT, drops to 30 FPS. I'd love to see the 5700 XT with a resource pack. Again, if AMD cards were viable for ray tracing current-generation games, AMD would have enabled DxR already.


If it can run Minecraft, what is stopping it from running Quake 2? 

Wanna play this game? OK. Maybe you're right. AMD wants to do it when their cards are viable, unlike nVidia, where the RTX 2060, and arguably even the RTX 2070, are not viable but are sold as if they were.

Jokes aside, a GTX 970 can still run with texture packs (8:01, and more importantly 12:20);






So, why wouldn't a 5700XT be able to do it? 
What's your next excuse?


----------



## 113802

NightAntilli said:


> If it can run Minecraft, what is stopping it from running Quake 2?
> 
> Wanna play this game? Ok. Maybe you're right. AMD wants to do it when their cards are viable, unlike nVidia, where the RTX 2060 and arguably even the RTX 2070 are not viable but are sold as being viable.
> 
> Jokes aside, a GTX 970 can still run with texture packs (8:01, and more importantly 12:20);
> 
> 
> So, why wouldn't a 5700XT be able to do it?
> What's your next excuse?


Only AMD is stopping it from running Quake 2, since nVidia's VKRay is hardware agnostic, just like Microsoft's DxR. Like I stated earlier, if ray tracing were viable on AMD's cards they would have enabled it by now. 

My excuse? You're comparing terrible texture packs to current-generation games. I was referring to Stratum, which tanked my FPS: https://continuum.graphics/stratum-rp


----------



## keikei




----------



## NightAntilli

WannaBeOCer said:


> Only AMD is stopping it from running Quake 2, since nVidia's VKRay is hardware agnostic, just like Microsoft's DxR. Like I stated earlier, if ray tracing were viable on AMD's cards they would have enabled it by now.
> 
> My excuse? You're comparing terrible texture packs to current generation games. I was referring to Stratum that tanked my FPS: https://continuum.graphics/stratum-rp


Those textures would kill performance with or without ray tracing. So yeah... 

In any case... Repeating something multiple times doesn't make it true. It more likely has to do with software than their cards not being capable, since they obviously are. Not being able to render hair with tessellation doesn't mean that those same cards can't render hair nor that they can't do tessellation. But whatever. I'll leave it at this, and let people make up their own minds.


----------



## criminal

tpi2007 said:


> Both Turing and Navi are overpriced, so it would take quite something to make such a move right now, and unfortunately, after all is said and done, there is nothing in my opinion that would make me buy either. That Pulse card from Sapphire, along with the atrociously named model from XFX look nice though (not sure about the thermals on the XFX model, haven't looked into it). Maybe once things settle down in the coming months both Nvidia and AMD will do some discounts.


Graphics cards have been overpriced for years, in my opinion. This is probably the worst it has ever been, but compared to some other hobbies it is still relatively cheap. Truth be told, if I had the money, time, and physical space I would be building muscle cars instead of computers. 

Things are tiered enough with graphics cards that my "overpaying" $200 for a graphics card in the $300-600 price range isn't going to kill me. For others, overpaying any amount is silly. But don't complain about or disregard tech if you can't or won't pay for it, especially since it is a hobby and brings people enjoyment. Even I have my limits though. I am not willing to spend $1200 on a GPU. At the moment, anyway.


----------



## EastCoast

I was hoping for a Nitro variant by now. 
RD version reminds me of it though.


----------



## 113802

EastCoast said:


> I was hoping for a Nitro variant by now.
> RD version reminds me of it though.


I doubt we'll see a Nitro version anytime soon since AMD is keeping the binned chips for their fanboy edition.


----------



## The Robot

WannaBeOCer said:


> I doubt we'll see a Nitro version anytime soon since AMD is keeping the binned chips for their fanboy edition.


I guess Nitro is like Lightning, reserved only for big boy chips. Maybe they'll do it for Big Navi.


----------



## keikei

The Robot said:


> I guess Nitro is like Lightning, reserved only for big boy chips. Maybe they'll do it for Big Navi.



GN said they were getting a Nitro. No mention of date though.


----------



## NightAntilli

The Robot said:


> I guess Nitro is like Lightning, reserved only for big boy chips. Maybe they'll do it for Big Navi.


It's possible... There was a Nitro RX 460, but that was before the Pulse existed. The thing is, they released Nitro versions for the RX 470, RX 480, RX 570, RX 580, RX 590... None of those are really 'big boy chips'. And they released a Pulse for all of the 500 series and the Vega 56... No Pulse for the Vega 64, but Nitro for both the Vega 56 and the Vega 64... 

There doesn't seem to be a reason not to release a Nitro version, unless they can't give much better performance compared to the Pulse to warrant the price difference between them.


----------



## DiHydrogenMonOxide

Has anyone done a comparison with release drivers to current drivers? I'm curious if the performance has evened out in some of those games that were poor at release.


----------



## tpi2007

criminal said:


> Graphics card have been overpriced for years in my opinion. This is probably the worst it has ever been, but compared to some other hobbies, it is still relatively cheap. Truth be told if I had the money, time and physical space I would be building muscle cars instead of computers.
> 
> Things are tiered enough with graphics cards that me "overpaying" $200 for a graphics card at the price range of $300-600 isn't going to kill me. For others, overpaying any amount is silly. But don't complain or disregard tech if you can't or won't pay for it. Especially since it is a hobby and brings people enjoyment. Even I have my limits though. I am not willing to spend $1200 on gpu. At the moment anyway.







> But don't complain or disregard tech if you can't or won't pay for it. Especially since it is a hobby and brings people enjoyment.





> Even I have my limits though. I am not willing to spend $1200 on gpu. At the moment anyway.



Cyclic redundancy check error.


----------



## keikei

https://videocardz.com/newz/asrock-radeon-rx-5700-xt-taichi-oc-pictured


----------



## Imouto

You have to really try to make a card uglier than that.


----------



## ToTheSun!

Imouto said:


> You have to really try to make a card uglier than that.


I think it's fine (traditional taichi look) till you see the RGB.


----------



## AuraNova

Imouto said:


> You have to really try to make a card uglier than that.


I kinda like the design. Mostly the backplate, and that's the part I would mostly see anyway. I love the "steampunk" element Taichi always had.


----------



## keikei

https://videocardz.com/newz/sapphire-teases-radeon-rx-5700-xt-nitro


----------



## NightAntilli

keikei said:


> https://videocardz.com/newz/sapphire-teases-radeon-rx-5700-xt-nitro


I really wonder what this will bring over the Pulse. Honestly, I'm not expecting much better performance... Probably better looks, LED options and a better cooler, and that's it.


----------



## keikei

NightAntilli said:


> I really wonder what this will bring over the Pulse. Honestly, I'm not expecting much better performance... Probably better looks, LED options and a better cooler, and that's it.


I suspect much better OC'ing. 2x 8-pin vs 1x 8-pin + 1x 6-pin.


----------



## Dmac73

keikei said:


> I suspect much better oc'in. 2 X 8 pins vs 1 x 8 and 1 x 6.



That's not going to make much of a difference at all. It's all in the cooling and the VRM. Even with a +90% power mod these cards aren't pulling enough to warrant 2x 8-pin. 

Dmac


----------



## keikei

Dmac73 said:


> That's not going to make much of a difference at all. *It's all in the cooling and VRM*. Even with +90% power mod these cards aren't pulling enough to warrant 2x 8 pins.
> 
> Dmac


I suspect better cooling of the components as well, but we weren't shown that. I also expect a large premium. The question is how large? It explains why the Pulse was so 'cheap'.


----------



## Dmac73

keikei said:


> I suspect better cooling as well in regards to the components, but we werent shown that. I also expect a large premium. The question is how large? It explains why the Pulse was so 'cheap'.


The Pulse was designed very well and thought out. It's one of the better cards, but it isn't over the top. Not sure how well it cools the junction temperature, which is what really matters now on AMD.

The Nitro looks impressive just on the teaser alone. It will have a robust VRM with good VRM cooling and hopefully the cooler is up to the task for the GPU/Junction not to mention memory.


----------



## keikei

*Rumor: SAPPHIRE RX 5700 NITRO arrives September 16th*

https://videocardz.com/newz/sapphire-radeon-rx-5700-xt-nitro-oc-pictured

€479 for the Nitro.


----------



## ilmazzo

Finally Gigabyte seems to have made an AMD GPU that doesn't shoot itself in the foot. Nice....


----------



## PontiacGTX

https://www.pcgameshardware.de/Gears-5-Spiel-62011/Specials/PC-Release-Benchmarks-Review-1331739/

Well, Navi supremacy is not showing up in Gears 5's performance. I wonder if a 1700 MHz Vega 64 would beat an RX 5700 XT; a Vega 56 can certainly beat an RX 5700 here.


----------



## rdr09

PontiacGTX said:


> https://www.pcgameshardware.de/Gears-5-Spiel-62011/Specials/PC-Release-Benchmarks-Review-1331739/
> 
> well Navi supremacy is not showing up in gears 5's performance, I wonder if a 1700MHz Vega 64 would beat a RX 5700 XT, for sure a Vega 56 can beat a RX 5700 here


Pretty close to these results.

https://www.techpowerup.com/review/gears-5-benchmark-test-performance-analysis/4.html


----------



## PontiacGTX

rdr09 said:


> Pretty close to these results.
> 
> https://www.techpowerup.com/review/gears-5-benchmark-test-performance-analysis/4.html


For some reason the RX Vega 56 is underperforming, but then again their 1080p results might be wrong.


----------



## rdr09

PontiacGTX said:


> for some reason the RX vega 56 is underperforming,but again might be their 1080p results are wrong


Could be a different test scenario. If it was an in-game benchmark, a different one could have been used; I have not seen the game. Or it could be a different brand of GPU.


----------



## 113802

keikei said:


> *Rumor: SAPPHIRE RX 5700 NITRO arrives September 16th*
> 
> https://videocardz.com/newz/sapphire-radeon-rx-5700-xt-nitro-oc-pictured
> 
> *479 EURO: Nitro.


Might as well buy a RTX 2070 Super at that price.


----------



## PontiacGTX

WannaBeOCer said:


> Might as well buy a RTX 2070 Super at that price.


Europe has inflated prices (due to taxes?), so a 2070 Super will probably be more expensive than that.
https://www.alternate.de/html/search.html?query=rtx+2070&x=0&y=0


----------



## Imouto

PontiacGTX said:


> Europe has inflated prices (due to Taxes?) so probably a 2070 super might be more expensive than that
> https://www.alternate.de/html/search.html?query=rtx+2070&x=0&y=0


It's on demand.

RX 5700 XT: https://geizhals.eu/?cat=gra16_512&xf=9809_16+8218+-+RX+5700+XT
RTX 2070 Super: https://geizhals.eu/?cat=gra16_512&xf=9810_9+8218+-+RTX+2070+SUPER

Reference Navis fell off a cliff.


----------



## Dmac73

PontiacGTX said:


> for some reason the RX vega 56 is underperforming,but again might be their 1080p results are wrong


TPU is majority nVidia happy games/benchmarks.


----------



## PontiacGTX

Imouto said:


> It's on demand.
> 
> RX 5700 XT: https://geizhals.eu/?cat=gra16_512&xf=9809_16+8218+-+RX+5700+XT
> RTX 2070 Super: https://geizhals.eu/?cat=gra16_512&xf=9810_9+8218+-+RTX+2070+SUPER
> 
> Reference Navis fell of a cliff.


Well, so AMD is in the same position as before: people prefer Nvidia over AMD even when it is more expensive.



Dmac73 said:


> TPU is majority nVidia happy games/benchmarks.


The RX 590 is an AMD GPU, and an inferior one even if it runs at a higher clock speed. And then they don't compare aftermarket RX Vega 56s?


----------



## keikei

https://www.youtube.com/watch?v=WdSqeWB-p-E




----------



## 113802

keikei said:


> https://www.youtube.com/watch?v=WdSqeWB-p-E


Did they get inspiration from a Chrysler 300?


----------



## maltamonk

WannaBeOCer said:


> Did they get inspiration from a Chrysler 300?


Considering the 300 got its inspiration from an RR Phantom... I guess it's a compliment?


----------



## Synoxia

For me this GPU is not worth it. A Vega 64 UV+OC is easily on par with, or only marginally worse than, this. Better to hold off until the PS5 release.


----------



## ZealotKi11er

Synoxia said:


> For me this gpu is not worth it. Vega 64 UV+OC is easily on par/marginally worse than this. Better hold off until PS5 release.


It all depends. With my Vega 64 I always had problems with UV; it was a battle for stability, with constant crashing. In the end I just ran the card at stock. The 5700 XT is very good out of the box.


----------



## ilmazzo

https://www.tomshw.it/hardware/corsair-xg7-waterblock-radeon-rx-5700/

Corsair is making Navi waterblocks now....


----------



## bucdan

maltamonk said:


> Considering the 300 got it's inspiration from a RR Phantom...I guess it's a compliment?





ZealotKi11er said:


> It all depends. With my Vega 64 I always would have problems with UV. It was always a battle for stability with contant crashing. In the end, I just ran the card at stock. WIth 5700 XT is very good out of the box.


You guys think a used Vega 64 would be a better buy than the RX 5700? I'm seeing Vega 64's going for about $250, and it seems to perform on par with a RX 5700.


----------



## PontiacGTX

bucdan said:


> You guys think a used Vega 64 would be a better buy than the RX 5700? I'm seeing Vega 64's going for about $250, and it seems to perform on par with a RX 5700.


If it is $100 cheaper, yes. If it is only $50 cheaper, probably invest more in the RX 5700, though if you program in OpenCL you want Vega. Also, is this Vega 64 the reference cooler? If it is, keep looking for a non-reference card, unless you can find a cheap way to cool it (a 120mm AIO + your own bracket).


----------



## bucdan

PontiacGTX said:


> if it is 100usd cheaper, yeah if it is 50usd cheaper probably invest more on the RX 5700 though if you program in opencl you want vega.. also is this vega 64 reference cooler? if it is keep looking for a non reference card, unless you can find a cheap way cool it(some 120mm AIO+your own bracket)


Yeah, reference cooler. I was thinking UV+OC. There are pretty much no non-reference cards for sub-$250; above that it pretty much gets into RX 5700 territory. A reference V56 can be had for as cheap as $200.


----------



## PontiacGTX

bucdan said:


> Yeah, reference cooler. Was thinking UV+OC. There's pretty much no non reference cards for Sub $250, pretty much gets into the territory of the RX 5700. V56 Reference can be had for as cheap as $200.


If you go with a reference card, look into some alternative cooling then; I don't like the temperature under load. Vega also has to stay cool enough to perform properly, otherwise you will end up with a card slower than the benchmarks. For raw compute applications it matters less, but temperatures still have to be under control. I had a reference-cooler Vega 56 and it overheated (not sure if this is a problem specific to that model) and ended up performing like my R9 Fury because of the temperature. So the hotspot has to stay below 100C and the core below 75C for it to perform fine.
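A minimal sketch of that rule of thumb (the 100C hotspot and 75C core thresholds are the ones quoted in this post, not official AMD throttle points):

```python
def vega_thermals_ok(core_c: float, hotspot_c: float) -> bool:
    """Rule-of-thumb check: Vega tends to hold its clocks when the
    core stays below ~75C and the hotspot below ~100C (thresholds
    quoted in the post above, not official AMD limits)."""
    return core_c < 75 and hotspot_c < 100

# Example readings: a well-cooled card vs. a reference blower under load.
print(vega_thermals_ok(68, 92))    # within both limits
print(vega_thermals_ok(82, 104))   # too hot on both counts
```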


----------



## bucdan

PontiacGTX said:


> if you go with a reference card look on some alternative cooling then, I dont like the temperature under load, also Vega has to get cool enough to perform properly otherwise you will end with a card slower than benchmarks...but if this is for raw compute application yeah doesnt matter much it is fine but still temperature have to be under control, I have had a Vega 56 ref cooler and it overheated (not sure if this is a problem specific to this model) and ended performing like my R9 FURY due to the temperature.. so hotspot has to be below 100c and the core temp below 75c to perform fine


I guess I'll patiently wait for a Navi sale somewhere (it isn't for me). Thanks. I'm waiting for big Navi as well. Itching for that 4K@120/144 performance.


----------



## diggiddi

bucdan said:


> Yeah, reference cooler. Was thinking UV+OC. There's pretty much no non reference cards for Sub $250, pretty much gets into the territory of the RX 5700. V56 Reference can be had for as cheap as $200.


Where are you seeing those prices? I'm looking for a used Vega but can't find anything on Amazon (I have a gift card).


----------



## treetops422

diggiddi said:


> Where are you seeing those prices? I'm looking for a used Vega can't find anything on Amazon(I have giftcard)


Vega 56 was down to $270 for almost 6 months, but it shot up recently on Newegg and Amazon new, likely because of Bitcoin prices rising. They are great miners. However, you can get a used card for about $220 on eBay. I've never seen a new one at $200.
https://www.ebay.com/sch/i.html?_from=R40&_nkw=vega+56&_sacat=0&_sop=15&rt=nc&LH_BIN=1


You can get a new 5700 non XT for $330
https://www.amazon.com/Sapphire-Rad...s=radeon+5700&qid=1568171136&s=gateway&sr=8-5

Amazon used prices usually suck compared to eBay. Maybe you can exchange it for an eBay card or sell it for cash, I don't know. Or wait for the low end; prices might also go down again.


https://www.google.com/search?client=firefox-b-1-d&q=bitcoin+prices


----------



## buttface420

Has anyone found any successful aftermarket cooling for the 5700 XT that is not a waterblock? I was going to get a G12, a FrostFlow, or even an Accelero Xtreme IV, but after watching a couple of YouTube videos it seems none of those are working right because the hotspot skyrockets.


----------



## PontiacGTX

buttface420 said:


> has anyone found any successful aftermarket cooling for the 5700xt that is not a waterblock? was gonna get a g12 or a frostflow or even a accelero extreme iv but after watching a couple youtube vids it seems none of those are even working right because hotspot is skyrocketing?

https://www.youtube.com/watch?v=ojgK4K7nliE


----------



## PontiacGTX

diggiddi said:


> Where are you seeing those prices? I'm looking for a used Vega can't find anything on Amazon(I have giftcard)


2nd hand on ebay or maybe on craiglist or...


----------



## keikei

buttface420 said:


> has anyone found any successful aftermarket cooling for the 5700xt that is not a waterblock? was gonna get a g12 or a frostflow or even a accelero extreme iv but after watching a couple youtube vids it seems none of those are even working right because hotspot is skyrocketing?


Morpheus cooler maybe?


----------



## treetops422

buttface420 said:


> has anyone found any successful aftermarket cooling for the 5700xt that is not a waterblock? was gonna get a g12 or a frostflow or even a accelero extreme iv but after watching a couple youtube vids it seems none of those are even working right because hotspot is skyrocketing?


You need to know where the memory is, so your fan(s) on, say, the G12 are hitting the hot points; the normal fan bracket is way off. I have the non-XT, but it's the same layout. For the most part you need to aim at the black squares around the core (some grey in this pic). I put some old CPU fans on it in the 2nd pic and it did pretty well without any heatsinks. You can zip-tie whatever fan(s) you use with an AIO to hang over the hot spots. If you want to slap your blower fan back on with a cardboard shroud, combined with an AIO, you can get some amazing temps even without heatsinks. 



I'm moving a lot of stuff around so it's all sloppy atm. Don't use tape on any hotspots if you do make a cardboard shroud; you can zip-tie it instead.


----------



## buttface420

treetops422 said:


> You need to know where the memory is at, so your fan(s) on say the g12 are hitting the hot points. The normal fan bracket is way off. I have the non XT. But it's the same layout. For the most part you need to aim at the black squares around the core. (some grey in this pic). I put some old cpu fans in the 2nd pic up on it and it did pretty good without any heat sinks. You can zip tie whatever fan(s) you use with a aio to hang around the hot spots. If you want to slap your Blower fan back on with a cardboard shroud, combined with a AIO.... You can get some amazing temps even without heat sinks.
> 
> 
> 
> I'm moving a lot of stuff around so it's all sloppy atm. Don't use tape on any hotspots if you do make a cardboard shroud, you can zip tie it.


lol this is amazing on so many levels..never thought of a cardboard shroud lmao, got me thinking about cutting some thin plexiglass/plastic sheet and making my own!!!!



PontiacGTX said:


> https://www.youtube.com/watch?v=ojgK4K7nliE


This is something I considered. If it works out as well as he said in the video, this may be the best option. My only worry is how good a Chinese AIO is and when it's going to start leaking; otherwise this is perfect.


----------



## treetops422

buttface420 said:


> lol this is amazing on so many levels..never thought of a cardboard shroud lmao, got me thinking about cutting some thin plexiglass/plastic sheet and making my own!!!!


Just remember the little half-curve for the blower around the intake circle. You can see it on the plate if you decide to copy it. I made a quick one; you can kind of see it in the pic. Really simple stuff.


----------



## keikei

https://www.guru3d.com/articles-pages/msi-radeon-rx-5700-xt-gaming-x-review,1.html

MSRP: $449. Interesting what undervolting will do as the card sucks up moar power than other versions. Temps & acoustics are very noice.


----------



## EastCoast

*It's here*

https://youtu.be/KEvafTEjFVo


----------



## rdr09

Reference RX 5700, stock. About 13% slower than my stock XT. Got it for $331, tax and shipping free, at B&H.


----------



## PontiacGTX

buttface420 said:


> this is something i considered, if it works out well like he said in the video this may be the best option, my only worry is how good is a chinese aio and when is it gonna start leaking? otherwise this is perfect.


The other alternative is a massive Morpheus II, which will take 4-5 slots with 25mm fans, less if they are 12mm/10mm.


----------



## criminal

treetops422 said:


> SNIP


Now that is some ghetto modding right there... lol


----------



## Heuchler

SAPPHIRE NITRO+ Radeon RX 5700 XT listed $440
https://www.newegg.com/sapphire-radeon-rx-5700-xt-100416nt-8gsr/p/N82E16814202351




PC Gaming Hardware PowerColor Red Devil RX 5700 XT - PCI-E Gen 3.0 Vs. Gen 4.0 

https://www.youtube.com/watch?v=d5Y3pBmwKuA





X570 vs B450, tested at 1440p. Assassin's Creed Odyssey and Battlefield showed the biggest gains; some titles didn't show any gains at all.
It would have been a better test if they had simply used X570 in Gen 4.0 [Auto] mode vs Gen 3.0 mode.


----------



## 113802

Heuchler said:


> SAPPHIRE NITRO+ Radeon RX 5700 XT listed $440
> https://www.newegg.com/sapphire-radeon-rx-5700-xt-100416nt-8gsr/p/N82E16814202351
> 
> 
> 
> 
> PC Gaming Hardware PowerColor Red Devil RX 5700 XT - PCI-E Gen 3.0 Vs. Gen 4.0
> https://www.youtube.com/watch?v=d5Y3pBmwKuA
> 
> X570 vs B450. Tested at 1440p. Assassin's Creed Odyssey and Battefield showed the most games. Some titles it didn't show any gains.
> Would have been a better test if they simply used used X570 in Gen 4.0 [Auto] mode vs Gen 3.0 mode.


It seems AMD is leaving performance on the table in some titles by leaving PCIe 4.0 disabled on the Radeon VII?


----------



## Dmac73

Heuchler said:


> Would have been a better test if they simply used used X570 in Gen 4.0 [Auto] mode vs Gen 3.0 mode.


Exactly. It's possible the B450 was throttling the CPU, etc etc..


----------



## ilmazzo

keikei said:


> https://www.guru3d.com/articles-pages/msi-radeon-rx-5700-xt-gaming-x-review,1.html
> 
> MSRP: $449. Interesting what undervolting will do as the card sucks up moar power than other versions. Temps & acoustics are very noice.


Yup, great card. Navi is already nicely tuned from the factory; it's not another Vega 64. Undervolting it will usually bring little extra performance because temps are already in check. On the reference card it can gain something due to less throttling, but the voltage curve seems fine most of the time.



EastCoast said:


> *It's here*
> 
> https://youtu.be/KEvafTEjFVo


I got a fetish feeling for some reason looking at this vid


----------



## keikei

https://www.guru3d.com/news-story/asrock-officially-launches-radeon-rx-5700-xt-yaichi-x-8g-oc.html


----------



## PontiacGTX

WannaBeOCer said:


> AMD is leaving performance on the table by leaving PCI-E 4.0 disabled on the Radeon VII for some titles it seems?


Can the RVII use PCIe 4.0? I didn't know. I also wonder whether the next Nvidia GPU will benefit much from PCIe 4.0.

Just wondering, has anyone tested whether PCIe 3.0 cards improve performance in a 4.0 slot? Because the RVII shows PCIe 3.0 in GPU-Z.


----------



## EastCoast

Live feed of the 5700 XT Nitro testing


----------



## Dmac73

PontiacGTX said:


> Can RVII use PCIE4.0? I didnt know also I wonder if then the next nvidia gpu will benefit as much from PCIE4.0?
> 
> just wondering anyone has tested if PCIE3.0 cards improve performance on 4.0? because RVII In GPUz shows PCIE3.0



VII is 3.0.


----------



## homestyle

Heuchler said:


> X570 vs B450. Tested at 1440p. Assassin's Creed Odyssey and Battefield showed the most games. Some titles it didn't show any gains.
> Would have been a better test if they simply used used X570 in Gen 4.0 [Auto] mode vs Gen 3.0 mode.


Not really. In theory it makes sense to keep all the other variables constant when comparing 3 vs 4.

But in reality, the choice to move from 3 to 4 is a matter of deciding to upgrade your 450 (or last gen) chipset to X570, and not a matter of enabling pcie 4.0 on your existing x570 mobo.

So it's good to use the same cpu and ram, but test by using the different chipsets.


----------



## 113802

Dmac73 said:


> PontiacGTX said:
> 
> 
> 
> Can RVII use PCIE4.0? I didnt know also I wonder if then the next nvidia gpu will benefit as much from PCIE4.0?
> 
> just wondering anyone has tested if PCIE3.0 cards improve performance on 4.0? because RVII In GPUz shows PCIE3.0
> 
> 
> 
> 
> VII is 3.0.

The Radeon VII supports PCIe 4.0, but they disabled it at launch to sell Navi cards. The first AMD PCIe 4.0 video cards were the Vega 20 Instinct cards. They hinted at enabling it a few times, but I doubt they will, since it would require a firmware update; maybe if some YouTubers bring it up we'll see support.

Their reason for disabling PCIe 4.0 was that gamers don't need the throughput of PCIe 4.0.
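For context on that throughput claim: PCIe 3.0 signals at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both with 128b/130b encoding, so the per-direction x16 bandwidth gap is easy to sketch:

```python
def pcie_x16_gbps(gt_per_s: float, lanes: int = 16) -> float:
    """Theoretical usable GB/s per direction for a PCIe link,
    after 128b/130b encoding overhead (Gen 3 and Gen 4 both use it)."""
    bits_per_second = gt_per_s * 1e9 * (128 / 130) * lanes
    return bits_per_second / 8 / 1e9  # bits -> bytes, then to GB/s

gen3 = pcie_x16_gbps(8.0)   # PCIe 3.0 x16: ~15.75 GB/s
gen4 = pcie_x16_gbps(16.0)  # PCIe 4.0 x16: ~31.51 GB/s
print(f"Gen 3 x16: {gen3:.2f} GB/s, Gen 4 x16: {gen4:.2f} GB/s")
```

Doubling the link bandwidth rarely shows up in average FPS, since games seldom saturate even a Gen 3 x16 link, which fits the "gamers don't need it" reasoning above.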


----------



## ZealotKi11er

WannaBeOCer said:


> Radeon VII supports PCIe-4.0 but they disabled it at launch to sell Navi cards. First AMD PCIe-4.0 video cards were Vega 20 Instinct cards. They hinted at enabling it a few times but I doubt they will. Maybe some YouTubers can bring it up and maybe we'll see support. I doubt they will though since it will require a firmware update.
> 
> Their reason for disabling PCIe 4.0 was that gamers don't need the throughput of PCI-e 4.0.


V20 was most likely validated on Epyc systems and not Ryzen 2 systems. Navi was validated on Ryzen 2 systems, hence it got mainstream PCIe 4 support.


----------



## keikei

https://www.youtube.com/watch?v=-VB_E4R69DU




----------



## ilmazzo

So far the Gaming X seems the best one: not too big, great temperatures, and a lot of overclocking headroom if you sacrifice noise for max performance without going to liquid cooling. The non-X Gaming should be just as good while being a little cheaper; second is the Pulse.


If you don't mind the dimensions, the Red Devil and the Gigabyte are good too. We will see about the Nitro, but they never fail to deliver, so there are a lot of nice options out there.


----------



## keikei

ilmazzo said:


> So far the Gaming X seems the best one.... not too big, great temperatures and lot of overclocking headroom if you sacrifice noise to max performance without going to LC, and the Gaming not X will be good the same while being a little cheaper...second is the Pulse.....
> 
> 
> If you don't mind dimensions instead Red Devil and Gigabyte are good too....we will see the Nitro but they never fail delivering so there are a lot of nice options out there....


The hardest part is which one. Lol. The supply is an issue though. Atm, aftermarket cards are just trickling in and are gobbled up as soon as they're in stock. In general, this is the case (not including ref versions). PCMR has gotten crowded. Nitro and Taichi reviews soon. :thumb:


----------



## JackCY

I wouldn't be surprised if AMD has a deal with NV to limit GPU supply and keep prices high; this has been going on for ages now: they fix prices and performance, fix launch dates, ...
For quite a while now, every AMD GPU launch has been plagued by horrible availability, while NV cards launched at the same time, at the same prices and performance, are available in stores up to even 6 months more readily.

This Gaming X design on Navi looks nice, simplistic, functional unlike their crazy designs for NV cards.

AMD mainstream CPUs... almost never a problem with supply.
AMD GPUs... it takes 6-12 months after the official launch to get them into stores in any acceptable quantity. Gotta feed Stadia and the server farms first, ya know...

Another problem with some of these cards is crazy pricing; you're better off getting a near-MSRP aftermarket card than spending much more on a barely better cooler with the same performance.

https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-gaming-x/

Wow, power consumption of a 2080 Ti on that 5700 XT Gaming X... damn. 1.2V Vcore... yeah, that will do it. Typical AMD GPU overvolting to push out-of-the-box performance as high as possible and stay competitive.
The Gaming X Navi cooler is not new; it's 99% the same cooler MSI puts on its NV cards. It's a nice cooler, but it's not something revolutionary made just for Navi; it's a reuse of an existing product.
It could have been better if they had at least kept the voltage under 1.1V.

Still the same performance as the old RTX 2070, which had a similar price to these newer Navis.

Navi... nice new chip, but the value is not great; this performance is at least a year late for the price. If you can even buy one yet.

5700 Pulse probably being the best so far.


----------



## homestyle

JackCY said:


> I wouldn't be surprised if AMD has a deal with NV on limiting supply of GPUs to keep prices high, this is going on for ages now, they fix prices and performance, fix launch dates, ...
> For quite a while now every AMD GPU launch is plagued by horrible availability while NV cards launched at same prices and performances at the same time are available in stores to purchase up to even 6 months more readily.
> 
> This Gaming X design on Navi looks nice, simplistic, functional unlike their crazy designs for NV cards.
> 
> AMD mainstream CPUs... no problem with supply, almost ever.
> AMD GPUs... takes 6-12 months after official launch to get them to stores in any acceptable quantity. Gotta feed that Stadia and server farms first ya know...
> 
> Another problem with some of these cards is crazy pricing and you're better off getting that near MSRP aftermarket card than spending much more on a barely any better cooler with same performance.
> 
> https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-gaming-x/
> 
> Wow, power consumption of a 2080Ti on that 5700 XT Gaming X... damn. 1.2V Vcore... yeah that will do it. Typical AMD GPU overvolt to push performance as high as possible out of the box to be competitive performance wise.
> Gaming X Navi is not a new cooler, it's 99% same cooler as on NV card from MSI. It's a nice cooler but it's not new or something revolutionary made just for Navi. It's a reuse of existing product.
> Could have been better if they kept the voltages in check under 1.1V at least.
> 
> Still same performance as the old RTX 2070, which had similar price as these newer Navis.
> 
> Navi... nice new chip but the value is not great, performance 1 year late for that price at the least. If you can even buy one yet.
> 
> 5700 Pulse probably being the best so far.


AMD's prices are the ones staying low (at launch price) even with low supply. With supply this low, the market could bear higher prices.

That makes you wonder about Nvidia's higher-than-launch prices.


----------



## PontiacGTX

WannaBeOCer said:


> Radeon VII supports PCIe-4.0 but they disabled it at launch to sell Navi cards. First AMD PCIe-4.0 video cards were Vega 20 Instinct cards. They hinted at enabling it a few times but I doubt they will. Maybe some YouTubers can bring it up and maybe we'll see support. I doubt they will though since it will require a firmware update.
> 
> Their reason for disabling PCIe 4.0 was that gamers don't need the throughput of PCI-e 4.0.


Still curious whether PCIe 4 improves performance on flagship cards. Also, AMD has a driver bug where tools like GPU-Z/HWiNFO/etc. show the full PCIe bandwidth/speed in use; I'm not sure whether this affects only old platforms or everyone.


----------



## ZealotKi11er

PontiacGTX said:


> still curious if PCIE4 improves performance on flagship cards, though AMD has a bug with the driver where tools like gpu-z/hwinfo/etc shows you use full pcie bw/speed I am not sure if this only affecting old platforms or just anyone


Would only benefit in a server environment.


----------



## keikei




----------



## PlugSeven

JackCY said:


> https://www.techpowerup.com/review/msi-radeon-rx-5700-xt-gaming-x/
> 
> Wow, p
> 
> 
> Still same performance as the old RTX 2070, which had similar price as these newer Navis.
> 
> Navi... nice new chip but the value is not great, performance 1 year late for that price at the least. If you can even buy one yet.
> 
> 5700 Pulse probably being the best so far.


Seriously dude?!?! What review are you looking at? This thing is more often than not matching or besting the 2070S, even embarrassing the 2080S in a couple of titles and somehow you drag it down to 2070 level performance. Are you joking?


----------



## PontiacGTX

keikei said:


> https://www.youtube.com/watch?v=-VB_E4R69DU


A non-XT comparison is worthless.


----------



## JackCY

PlugSeven said:


> Seriously dude?!?! What review are you looking at? This thing is more often than not matching or besting the 2070S, even embarrassing the 2080S in a couple of titles and somehow you drag it down to 2070 level performance. Are you joking?


It's right in your quote, dude. Maybe learn to read and do more research.
The value of the 5700 XT is not impressive at all, since aftermarket designs cost pretty much the same as a 2070 and only perform a tiny bit better, if at all. Usually the 5700 XT sits between the 2070 and the 2070S, which is a pretty narrow range. And with AMD you give up quite a few useful features.
The 5700 XT is not priced well against the "competition", simple as that.


----------



## maltamonk

JackCY said:


> It's right in your quote, dude. Maybe learn to read and do more research.
> The value of 5700 XT is not impressive at all since they cost in aftermarket designs pretty same as a 2070 and only perform tiny bit better if at all. Usually 5700 XT sits between 2070 and 2070S, which is a pretty narrow range. And with AMD you give up quite a few useful features.
> 5700 XT is not priced well against "competition", simple as that.


Well yeah, that seems reasonable. Take the most expensive 5700 XT models and compare them to the cheapest 2070 (EOL) and 2070S models.


----------



## JackCY

Here are more comparisons if you care to read:

https://www.techspot.com/review/1902-geforce-rtx-2070-super-vs-radeon-5700-xt/

The 2070 is still being sold and costs the same as the RX 5700 XT, both in aftermarket models. In some markets, yes, the 5700 XT can be a little cheaper if you're comparing a Sapphire Pulse against a not-so-low-end RTX model. The prices of aftermarket 5700 XT cards are, as expected, nothing to write home about, not impressive at all.

Lately some people are so delusional they even say the 5700 XT is faster than the 2070 Super on average, etc., when no exhaustive comparison has ever shown that.

Navi as used in the 5700/XT is fine (traditional-performance-wise), but the prices suck, and the same price issue goes for NV cards.

The 5700/XT isn't as cheap everywhere as it may be in the US. In the EU, AMD GPUs have struggled to be competitive for at least a decade, and that isn't changing now either. Pricing the 5700 XT the same as the RTX 2070 is a fail for AMD; it offers less overall for the same cost.


----------



## ZealotKi11er

JackCY said:


> Here are more comparisons if you care to read:
> 
> https://www.techspot.com/review/1902-geforce-rtx-2070-super-vs-radeon-5700-xt/
> 
> The 2070 is still being sold and costs the same as the RX 5700 XT, both in aftermarket models. In some markets, yes, the 5700 XT can be a little cheaper if you're comparing a Sapphire Pulse against a not-so-low-end RTX model. The prices of aftermarket 5700 XT cards are, as expected, nothing to write home about, not impressive at all.
> 
> Lately some people are so delusional they even say the 5700 XT is faster than the 2070 Super on average, etc., when no exhaustive comparison has ever shown that.
> 
> Navi as used in the 5700/XT is fine (traditional-performance-wise), but the prices suck, and the same price issue goes for NV cards.
> 
> The 5700/XT isn't as cheap everywhere as it may be in the US. In the EU, AMD GPUs have struggled to be competitive for at least a decade, and that isn't changing now either. Pricing the 5700 XT the same as the RTX 2070 is a fail for AMD; it offers less overall for the same cost.


RTX 2070 is not the same as 2070 Super. 2070 = 2060S.


----------



## Imouto

JackCY said:


> The 5700/XT isn't as cheap everywhere as it may be in the US. In the EU, AMD GPUs have struggled to be competitive for at least a decade, and that isn't changing now either. Pricing the 5700 XT the same as the RTX 2070 is a fail for AMD; it offers less overall for the same cost.


What the actual flying duck?

https://geizhals.eu/?cat=gra16_512&xf=9809_16+8218+-+RX+5700+XT

https://geizhals.eu/?cat=gra16_512&xf=9810_9+6497+-+RTX+2070

Custom RX 5700XTs start at 410-430€ while custom RTX 2070 non-S start at 447€ for *ONE* model and then jumping to 462€+ for the rest. Custom RTX 2070S start at 515€.

Also, from the review you posted:



> Overall the 5700 XT is very punchy at 1440p and we think it’s fair to say that the 5700 XT eliminates the RTX 2060 Super and it almost does the same to the 2070 Super, but the fact that the 2070 Super was faster by a noteworthy margin in about a dozen of the games tested, means it’s a worthwhile option for those seeking some extra performance.
> 
> We already spelled this out in our recent update to The Best Graphics Cards. For the best high-end 1440p GPU we're going with the Radeon 5700 XT as the best option in the $400 - $600 range. Unless you’re spending considerably more -- there's a brief window for buying the RTX 2080 at $660 which is a better deal than buying the Super version -- there are no other options.


----------



## PlugSeven

JackCY said:


> It's right in your quote, dude. Maybe learn to read and do more research.
> The value of the 5700 XT is not impressive at all, since aftermarket designs cost pretty much the same as a 2070 and perform only a tiny bit better, if at all. Usually the 5700 XT sits between the 2070 and 2070S, which is a pretty narrow range. And with AMD you give up quite a few useful features.
> The 5700 XT is not priced well against the "competition", simple as that.


If they were priced the same, you might have had a semblance of a point. The crap I responded to is right in your post, the one I quoted. You put it as "same performance as the old 2070" and now you're revising your crap. Maybe take your own advice about learning to read and doing research.


----------



## 113802

Turing has more use cases, which means more demand, which means a higher cost. If you're just gaming, just buy a Radeon RX 5700 series AIB card.


----------



## PontiacGTX

WannaBeOCer said:


> Turing has more use cases which equals more demand which means a higher cost. If you're just gaming just buy a Radeon RX 5700 series AIB card.


I wonder if, at some point, with Nvidia continuing to sell specialized video cards to consumers that don't truly belong in the price segment they're being sold at (performance/arch-design-wise), and with prices not dropping, AMD will be selling more expensive cards as well, because going by the price history since release, prices have increased (of course, they do use 7nm and GDDR6).


----------



## PlugSeven

WannaBeOCer said:


> Turing has more use cases which equals more demand which means a higher cost. If you're just gaming just buy a Radeon RX 5700 series AIB card.


Is that why sales have been "disappointing" because of "more demand"? 97%(# pulled out of my rear) of these things are being bought by ppl who don't have the first clue or any use for these "use cases".


----------



## keikei

https://wccftech.com/review/sapphire-nitro-radeon-rx-5700xt-top-tier-navi/


----------



## 113802

keikei said:


> https://wccftech.com/review/sapphire-nitro-radeon-rx-5700xt-top-tier-navi/


Went from a simple nice design to an odd insect inspired one.


----------



## keikei

WannaBeOCer said:


> Went from a simple nice design to an odd insect inspired one.


I like it, but the backplate is a little too much aesthetically, imo. Price-wise ($440) it's not too much over the Pulse. You get better cooling and RGB for the $ bump.


----------



## Dmac73

The Nitro is a great card and the edge/junction temps are pretty phenomenal. It should be the go-to choice for the 5700 XT.

Dmac


----------



## 113802

Dmac73 said:


> The Nitro is a great card and the edge/junction temps are pretty phenomenal. It should be the go to choice for 5700XT.
> 
> Dmac


Hopefully the anniversary edition products aren't sold out. If they are, all the AIB 5700 XT cards overclock the same anyway. As long as the junction temp is below 115C, get a card based on aesthetics, or an anniversary edition with a block.


----------



## Dmac73

WannaBeOCer said:


> Hopefully the anniversary edition products is finished. If not all the AIB 5700 XTs cards overclock the same. As long as junction temp is below 115c get a card based off of aesthics or an anniversary edition with a block.


The lower the junction the better. Scaling is better. It's not just about being 115c or under. Plus your room is going to be hot as crap after a gaming session. The Nitro is the card to get IMO.


----------



## Jedi Mind Trick

PlugSeven said:


> Seriously dude?!?! What review are you looking at? This thing is more often than not matching or besting the 2070S, even embarrassing the 2080S in a couple of titles and somehow you drag it down to 2070 level performance. Are you joking?


Lol, the 5700XT beats the 2080s in one game according to your link (BF) and is comparable to the 2060S in more titles than that. Heck, it even loses to the 1070 Ti in one!



maltamonk said:


> Well yeah that seems reasonable. Take the most expensive 5700xt models and compare it to the cheapest 2070 (eol) and 2070s models.


I agree it isn't the best comp; but I see an EVGA 2070 Black at $450 [$435 with rebate] at my local microcenter, which I guess could be "cheating" as I do not really see it elsewhere. At that price, I think $30 to get a card sooner rather than later is fair. Honestly, it seems hard to go wrong in the ~$400-450 bracket this go around. Really depends on what is available to you I guess.



PlugSeven said:


> If they were priced the same, you might have had a semblance of a point. The crap I responded to is right in your post, the one I quoted. You put it as "same performance as the old 2070" and now you're revising your crap. Maybe take your own advice about learning to read and doing research.


Are they not the same as the "old 2070"? Again, according to your link, the Gaming X 5700 XT is ~6.4% better than the 2070 they compared it against at 1440p (the difference between the two is smaller at 1080p and 4K). ~5% (TO ME) is close enough that they might as well be the same. Saying it is the same as the 2070 seems a lot more accurate than saying the XT beats the 2080S in a couple of titles without noting that it loses to lesser cards in others.



Dmac73 said:


> The Nitro is a great card and the edge/junction temps are pretty phenomenal. It should be the go to choice for 5700XT.
> 
> Dmac


Yep, I like the Nitro a lot, can't really go wrong with any of the Red Devil/Gaming X/Nitro/Pulse IMO. All really good, just comes down to availability.


----------



## keikei

https://www.youtube.com/watch?v=kh1DeO4yz2s

*Currently the best cooler for a 5700 XT that GN has tested.


----------



## PlugSeven

Jedi Mind Trick said:


> Lol, the 5700XT beats the 2080s in one game according to your link (BF) and is comparable to the 2060S in more titles than that. Heck, it even loses to the 1070 Ti in one!
> 
> Are they not the same as the "old 2070?" Again, according to your link, the Gaming X 5700 XT is at ~6.4% better than the 2070 they compared it against at 1440p (the difference between the two is less at 1080p and 4k). ~5% (TO ME) is close enough that they might as well be the same. Saying it is the same as the 2070 seems a lot more accurate than saying the XT beats the 2080S in a couple of titles without noting that it loses to lesser cards in others.


So by your logic, given that the gap between the 2070S and the XT is similar to that between the 2060S and the XT (two cards you deem "comparable"), can we say the same for the former pair? If so, then I guess you're a half-empty guy and I'm a half-full guy and we're just talking around each other.


----------



## Dmac73

Jedi Mind Trick said:


> Lol, the 5700XT beats the 2080s in one game according to your link (BF) and is comparable to the 2060S in more titles than that. Heck, it even loses to the 1070 Ti in one!
> 
> 
> Are they not the same as the "old 2070?" Again, according to your link, the Gaming X 5700 XT is at ~6.4% better than the 2070 they compared it against at 1440p (the difference between the two is less at 1080p and 4k). ~5% (TO ME) is close enough that they might as well be the same. Saying it is the same as the 2070 seems a lot more accurate than saying the XT beats the 2080S in a couple of titles without noting that it loses to lesser cards in others.


https://www.techspot.com/review/1902-geforce-rtx-2070-super-vs-radeon-5700-xt/

They trade blows: 2% and 6%. Might as well be the same, right?

It all depends on the review suite as well. Some sites have a very biased slate of games and game engines, still using Heaven for comparisons, etc...
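
Since the game suite keeps coming up: the headline "X% faster" figure is usually a geometric mean of per-game FPS ratios, and it moves with the game list. A quick sketch with made-up FPS numbers (not data from any review linked here) showing how two suites for the same pair of cards can yield opposite verdicts:

```python
from math import prod

def geomean_ratio(fps_a, fps_b):
    """Geometric mean of per-game FPS ratios (card A relative to card B)."""
    ratios = [a / b for a, b in zip(fps_a, fps_b)]
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical per-game FPS for the same two cards under two test suites:
suite1_a, suite1_b = [100, 90, 120], [95, 88, 115]   # games favoring card A
suite2_a, suite2_b = [100, 80, 110], [98, 85, 112]   # games favoring card B

print(f"suite 1: {(geomean_ratio(suite1_a, suite1_b) - 1) * 100:+.1f}%")
print(f"suite 2: {(geomean_ratio(suite2_a, suite2_b) - 1) * 100:+.1f}%")
```

Same cards, different suites, different headline number, which is why one site's "2%" and another's "6%" can both be honest.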


----------



## Jedi Mind Trick

PlugSeven said:


> So by your logic, given that the gap between the 2070S and the XT is similar to that between the 2060S and the XT,(two cards you deem "comparable") *Can we say the same for the former pair? *If so, then I guess you're a half empty guy and I'm a half full guy and we're just talking around each other.


Without a doubt! But that isn't what it seemed like was being argued (at least that wasn't how I read what was going on), which was: someone said the XT is comparable to the 2070, someone [not naming names!] said it was better than the 2080S! I was just saying that it being comparable to the 2070 is more accurate than the 2080S. I honestly think that without counting the beans, it would be really hard to tell the difference between any of the cards in this range [2060S, 2070, 2070S, XT, VII, 1080ti, even the 2080]. Great time to buy cards in this range (even if the prices are 'high,' really no obviously wrong choices).

I am 110% glass half empty though!



Dmac73 said:


> https://www.techspot.com/review/1902-geforce-rtx-2070-super-vs-radeon-5700-xt/
> 
> They trade blows. 2% and 6%. Might as well be the same right?
> 
> It all depends on the review suite as well. Some sites have a very biased slate of games and game engines, still using Heaven for comparisons, etc...


Agreed, TPU is my go to, as I usually go from nV to nV cards anyways (stupid g-sync monitor! [always been an ATI/AMD fan]); that and someone used it 'first,' but they could really use a suite overhaul (though, I guess their games are pretty relevant for me at least).


----------



## Dmac73

Jedi Mind Trick said:


> Without a doubt! But that isn't what it seemed like was being argued (at least that wasn't how I read what was going on), which was: someone said the XT is comparable to the 2070, someone [not naming names!] said it was better than the 2080S! I was just saying that it being comparable to the 2070 is more accurate than the 2080S. I honestly think that without counting the beans, it would be really hard to tell the difference between any of the cards in this range [2060S, 2070, 2070S, XT, VII, 1080ti, even the 2080]. Great time to buy cards in this range (even if the prices are 'high,' really no obviously wrong choices).
> 
> I am 110% glass half empty though!
> 
> 
> 
> Agreed, TPU is my go to, as I usually go from nV to nV cards anyways (stupid g-sync monitor! [always been an ATI/AMD fan]); that and someone used it 'first,' but they could really use a suite overhaul (though, I guess their games are pretty relevant for me at least).


TPU has a pretty biased suite; they have some games that hardly anyone plays that are heavily Nvidia-optimized, and they still use Heaven as an OC comparison like it's 2009 again. I like TPU but I don't care for their reviews or overall test suite. I came from a 2070 to a 5700 XT and it spanks the 2070.


----------



## PontiacGTX

keikei said:


> https://www.youtube.com/watch?v=kh1DeO4yz2s
> 
> 
> *Currently the best cooler for a 5700 XT GN has tested.


2140 isn't close to the upper limit of the RX 5700 XT. I wonder why they don't try a 1.9 or 2GHz clock on the Radeon VII.


----------



## Dmac73

PontiacGTX said:


> 2140 isn't close to the upper limit of the RX 5700 XT. I wonder why they don't try a 1.9 or 2GHz clock on the Radeon VII.



That's a high game-stable core. It's pretty close to the max daily game-stable clocks; you're not going to see much higher through a test bed, regardless of the "upper limit" or a quick suicide benchmark run. Same with the VII, seeing as that is the reference cooler on the VII, with results from their original review or around that time.


----------



## PontiacGTX

Dmac73 said:


> That's a high game-stable core. It's pretty close to the max daily game-stable clocks; you're not going to see much higher through a test bed, regardless of the "upper limit" or a quick suicide benchmark run. Same with the VII, seeing as that is the reference cooler on the VII, with results from their original review or around that time.


The RVII is capable of 2GHz, and those aren't "suicide runs"; from testing it gives at least 9+ fps.
https://www.youtube.com/watch?v=BQ63Sagz6-U


----------



## 113802

PontiacGTX said:


> The RVII is capable of 2GHz, and those aren't "suicide runs"; from testing it gives at least 9+ fps.


That's obviously some sort of aftermarket cooling. With the stock cooler I could run my card at 1950/1200 with an undervolt to 1070mV. My stock voltage is 1138mV.

Edit: title says LC
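
As a side note, the gain from an undervolt like that can be sanity-checked with the usual dynamic-power approximation P ∝ V²·f. A rough sketch only: the 225 W baseline below is a hypothetical illustration, and only the voltages come from the post above:

```python
# Rough dynamic-power scaling estimate: P ~ C * V^2 * f.
def scaled_power(p_stock, v_stock, v_new, f_stock, f_new):
    """Estimate new power assuming power scales with V^2 * f."""
    return p_stock * (v_new / v_stock) ** 2 * (f_new / f_stock)

p = scaled_power(p_stock=225.0,   # hypothetical stock board power (W), not a measurement
                 v_stock=1.138,   # stock voltage from the post (1138 mV)
                 v_new=1.070,     # undervolt from the post (1070 mV)
                 f_stock=1950.0,  # clock held the same in both cases
                 f_new=1950.0)
print(f"estimated power after undervolt: {p:.0f} W")
```

So a ~6% voltage drop at the same clock works out to roughly a 12% power reduction under this approximation, which is why undervolting Navi and Vega pays off so well.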


----------



## PontiacGTX

WannaBeOCer said:


> That's obviously some sort of aftermarket cooling. With the stock cooler I could run my card at 1950/1200 with an undervolt to 1070mV. My stock voltage is 1138mV.
> 
> Edit: title says LC


missed the LC

air cooled 2ghz
http://oc.jagatreview.com/2019/02/eksperimen-overclocking-amd-radeon-vii-ke-2-ghz/


----------



## 113802

PontiacGTX said:


> missed the LC
> 
> air cooled 2ghz
> http://oc.jagatreview.com/2019/02/eksperimen-overclocking-amd-radeon-vii-ke-2-ghz/


That's a pretty golden Radeon VII 

2 GHz at 1150–1175 mV


----------



## JackCY

Imouto said:


> What the actual flying duck?
> 
> https://geizhals.eu/?cat=gra16_512&xf=9809_16+8218+-+RX+5700+XT
> 
> https://geizhals.eu/?cat=gra16_512&xf=9810_9+6497+-+RTX+2070
> 
> Custom RX 5700XTs start at 410-430€ while custom RTX 2070 non-S start at 447€ for *ONE* model and then jumping to 462€+ for the rest. Custom RTX 2070S start at 515€.
> 
> Also, from the review you posted:


That's almost exclusively Germany, which is the largest electronics market in the EU, with western distribution.
Here (central EU) it's more or less like this:

5700 XT Pulse 475 EUR
5700 XT Nitro+ 512 EUR (not available yet, preorder only in 1 decent shop)
cheapest acceptable shop and model 453 EUR

2070 Gaming 490 EUR
2070 Strix 491 EUR
cheapest acceptable shop and model 404 EUR

For lulz, from the same shop as the 2070 Strix, there's the 5700 XT Strix: 543 EUR, or 517 EUR if you hunt for the cheapest acceptable shop. An MSI 5700 XT Gaming is nowhere to be found at all, not listed anywhere.

Why are the 2070s cheaper than the RX 5700 XT? Because shops are more likely to want to sell them, so they lower prices a little or even run time-limited discounts. All prices are with taxes, converted to EUR, and from central EU distribution, not western (which is 99% of what you see on the German geizhals; it's convenient for all the US people here because it also has an English version and is a decent product database as well). And buying from, say, Germany doesn't magically get you better prices: only some shops ship outside Germany; you have to pay your local tax, which for a large part of the EU, if not all of it, is higher than Germany's; and shipping from Germany isn't exactly cheap, and may even carry an extra fee for international orders. Amazon.de and its 3rd-party sellers also do not ship all products everywhere. If you're in Germany, yes, it's all great, most products are sold there at good prices; elsewhere... not so much, and there are parts of the EU where it's far worse than central EU, distribution- and price-wise.

Do those 5700 XT prices look so damn hot? Not really, if you look around for the best price on good models. Price-wise, it's pretty much a wash in the end between the RX 5700 XT and the RTX 2070. Are the extra and missing features/support of Navi worth the same price as Turing? Nope; Navi needs to be cheaper than Turing to sell and gain mind share and market share, but those two things are not what AMD wants, or rather what AMD is allowed by NV in their duopoly stranglehold on the market. AMD is not competitive and likely won't be until a 3rd major player manages to disrupt the GPU market. NV would have killed ATI long ago had it not been saved by AMD, only to struggle some more like a fish on dry land. For how much AMD invests into GPUs, their results, while 1-2 years late, are still pretty good, but the prices are fixed and suck.

---

The Nitro looks OK if one can buy it and it's priced well.


----------



## Dmac73

PontiacGTX said:


> The RVII is capable of 2GHz, and those aren't "suicide runs"; from testing it gives at least 9+ fps.
> https://www.youtube.com/watch?v=BQ63Sagz6-U



That one is LC.

Also, I'm pretty sure Steve from GN got a pretty bad Radeon VII, so most should do more than 1885, but remember that's his average. It was probably moving around from 1850-1950 on the core. Who knows; AMD clocks like to bounce around.


----------



## Imouto

JackCY said:


> That's almost only Germany, which is the largest electronics market in EU and western distribution.
> Here (central EU) it's like this more or less:


I hope you know you can buy from wherever you want in the EU. Shipping is usually 10 € to 15 € from Germany to Spain. I regularly buy books on Amazon.co.uk and they send them here for free.

Shopping in the UK when the pound is being pounded nets some interesting discounts too.

I didn't know you were shopping impaired. My bad.


----------



## ilmazzo

The 2070 is going EOL and has been on the market for more than a year, so it's an apples-to-oranges comparison, and you should know it.


----------



## JackCY

Imouto said:


> I hope you know you can buy wherever you want in the EU. Ports are usually 10 € to 15 € from Germany to Spain. I regularly buy books on Amazon.co.uk and they send them here for free.
> 
> Shopping in the UK when the pound is being pounded nets some interesting discounts too.
> 
> I didn't know you were shopping impaired. My bad.


I already covered in my previous post how "easy" it is to shop from German electronics shops. Yeah, I bought a book from a UK store too, before Brexit. Electronics from Germany... only from Amazon.de; the rest never turned out worth it, always costing more than other options, local or Amazon. While you may get free shipping on books from the UK, it's another story trying to get stuff from Mindfactory, Caseking and so on in Germany; at times some of these get moody and disable exporting altogether, only to enable it again the next year, etc. Fees and shipping costs aren't cheap with some of these either; even for a whole PC order worth a lot, it still comes out more expensive than getting the parts locally (from one's own country rather than shopping internationally).

I didn't know you had reading problems, skipping parts of posts to try to argue over an already-answered issue.


----------



## maltamonk

Just an FYI, Brexit hasn't actually happened yet. They are still in the EU.


----------



## keikei

Depending on the model and the res played at, this can be a compelling buy if someone is debating an aftermarket 5700 XT vs an entry-level 2070: https://wccftech.com/call-of-duty-m...orce-rtx-graphics-cards-laptops-and-desktops/


----------



## paulerxx

I'm buying an RX 5700 XT within a week, the Sapphire Pulse edition to be specific. Is there anything I should know or consider before making this purchase?


----------



## Dmac73

paulerxx said:


> I'm buying a RX 5700XT within a week, the SAPPHIRE PULSE edition to be specific. Is there anything I should know or consider before making this purchase?


Not really. If you can afford the Nitro, get it, but the Pulse is a great card, especially for the small premium over reference.


----------



## paulerxx

Dmac73 said:


> Not really. If you can afford the Nitro get it, but the pulse is a great card especially for the small premium over reference.


The XFX Thicc II model is actually the same price as Sapphire's Pulse where I'm buying the card, so I'll probably grab that instead.


----------



## keikei

paulerxx said:


> I'm buying a RX 5700XT within a week, the SAPPHIRE PULSE edition to be specific. Is there anything I should know or consider before making this purchase?


The Pulse is the no nonsense card, but if you want the epeen (slightly better cooling, some triple-fan, beefy card), consider the Devil, Nitro, Gaming X, or Taichi.


----------



## jsriolo

Any other information on upcoming AIB models? On TPU I saw a link to the XFX 5700 XT RAW model http://www.madshrimps.be/articles/article/1001164/ that I had not previously heard about. Are there other manufacturers lacking flagship cards? Perhaps an Aorus model from Gigabyte (mentioned as a possibility here https://www.reddit.com/r/Amd/comments/d3xffl/second_batch_of_the_gigabyte_rx_5700_xt_gaming_oc/)?

Once all of the options are known I'll make my decision. If I had to choose right now it would probably be one of the Sapphire cards depending on availability.


----------



## rv8000

paulerxx said:


> The XFX Thic II model is actually the same price as SAPPHIRE's PULSE where I'm buying the card, probably going to grab that instead.


The Nitro and Red Devil are your best top-tier cards, and the Pulse is the best budget-friendly option (not that $20-30 should be a deal breaker at this price range).

The Thicc is one of the hotter, if not the hottest, AIB cards, weighs a ton, and aside from liking it for aesthetic reasons, it's a bad buy.


----------



## PontiacGTX

paulerxx said:


> The XFX Thic II model is actually the same price as SAPPHIRE's PULSE where I'm buying the card, probably going to grab that instead.


I would consider the Gaming X; its warranty is 3 years, and it seems to be close to the PowerColor in temps with slightly more noise. Now, if you prefer a lower noise level, the PowerColor is the quietest.

https://www.techpowerup.com/review/sapphire-radeon-rx-5700-xt-nitro/31.html

edit:
The Nitro+ has the lowest edge temp; I assume the pasting job affects the temperature a lot.
https://www.computerbase.de/2019-08/radeon-rx-5700-xt-185-watt-custom-vergleich/


----------



## huzzug

rv8000 said:


> (not that $20-30 should be a deal breaker at this price range).


$20-30 used to be a deal breaker when buying a 7850 years ago.


----------



## rv8000

huzzug said:


> $20-30 used to be a deal breaker when buying a 7850 years ago.


The 7850 was in a totally different price category, and a $25 markup over reference MSRP would have been double the markup we see on AIB 5700 XTs. $20-30 means a lot more when you're only paying for a card in the $200 price range.


----------



## JackCY

maltamonk said:


> Just a FYI Brexit hasn't actually happened yet. They are still in the EU.


After, what is it by now, 3 extensions of the deadline? (it was 2 last time I checked) because all the sensible UK politicians resigned, etc., and want nothing to do with that fiasco. Unable to even exit with any dignity. Well done, really, well done :applaud:

Any review of the Taichi out yet?


----------



## huzzug

rv8000 said:


> The 7850 was in a totally different price category and $25 would be double the mark up over reference msrp vs the markup on AIB 5700 XT's. $20-30 means a lot more when you're only paying for a card in the $200 price range.


Umm....you have it all backwards. The 5700 & the 5700XT are in a totally different price bracket because $$


----------



## ilmazzo

paulerxx said:


> Dmac73 said:
> 
> 
> 
> Not really. If you can afford the Nitro get it, but the pulse is a great card especially for the small premium over reference.
> 
> 
> 
> The XFX Thic II model is actually the same price as SAPPHIRE's PULSE where I'm buying the card, probably going to grab that instead.
Click to expand...

The XFX is not so great on RAM and VRM cooling, btw.


----------



## rv8000

huzzug said:


> Umm....you have it all backwards. The 5700 & the 5700XT are in a totally different price bracket because $$


A) No one said a thing about the 5700 non XT in any of the posts I'm referencing/quoting.
B) Launch MSRP for the 7850 was $249 which quickly dropped to $239.
C) The 5700 XT is $150-$160 more than the 7850 reference MSRP and is in a completely different price bracket and performance tier, where $20-$30 is a lot less of a difference to someone thinking about buying a $400 card as opposed to a $240-$250 card (you can't even compare the 5700 non-XT).

:headscrat


----------



## maltamonk

JackCY said:


> After what is it by now, 3 extensions of the deadline? (was 2 last time I checked) because all sensible UK politicians resigned etc. and don't want anything to do with that fiasco. Unable to even exit with any dignity. Well done really, well done :applaud:
> 
> Any review of the Taichi out yet?


Right, but you were using it as a counterpoint to debate someone's position on EU pricing. So you can insult it all you want, but it doesn't make you right. It actually strengthens their point because the pound is dropping due to the fiasco.


----------



## ZealotKi11er

huzzug said:


> Umm....you have it all backwards. The 5700 & the 5700XT are in a totally different price bracket because $$


5700 XT > $400, HD 7870 > $350 ($393 after inflation)
5700 non XT > $350 , HD 7850 > $250 ($280 after inflation)
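
Those inflation-adjusted figures are just the 2012 MSRPs multiplied by a cumulative CPI factor. A quick sketch; the 1.123 factor is an assumed approximation of 2012→2019 US CPI inflation, which is why the outputs land within a dollar of the numbers above:

```python
# Adjust a 2012 USD launch price to ~2019 dollars using a cumulative
# CPI factor. The factor below is an assumption inferred from the
# figures quoted in the post, not an official BLS number.
CPI_2012_TO_2019 = 1.123

def adjust(msrp_2012: float) -> float:
    """Scale a 2012 price into approximate 2019 dollars."""
    return msrp_2012 * CPI_2012_TO_2019

for name, msrp in [("HD 7870", 350), ("HD 7850", 250)]:
    print(f"{name}: ${msrp} in 2012 ~ ${adjust(msrp):.0f} in 2019")
```

Even with inflation factored in, the 5700 XT still sits a bracket above where the 7870 launched, which is the point being argued here.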


----------



## huzzug

rv8000 said:


> A) No one said a thing about the 5700 non XT in any of the posts I'm referencing/quoting.
> B) Launch MSRP for the 7850 was $249 which quickly dropped to $239.
> C) The 5700 XT is $150-$160 more than the 7850 reference msrp, and is at a completely different price bracket and performance tier, where $20-$30 is a lot less of a difference to anyone thinking about buying a $400 card as opposed to a $240-$250 card (you can't even compare the 5700 non XT).
> 
> :headscrat





ZealotKi11er said:


> 5700 XT > $400, HD 7870 > $350 ($393 after inflation)
> 5700 non XT > $350 , HD 7850 > $250 ($280 after inflation)


Time sure does fly and marketing sure does work.


----------



## rv8000

huzzug said:


> Time sure does fly and marketing sure does work.


Whether or not you like the current prices is completely irrelevant. The unfortunate situation is that these cards now exist in today's price brackets. It doesn't change the fact that an extra $20 when buying a $400 product means a lot less to a consumer than when buying a $250 product.


----------



## keikei

Interesting trend with these aftermarket cards. Normally, the pricing goes a little out of whack in the wrong direction, against the buyer. Here, it's about who can go the lowest price-wise while also providing the most incentives. Even Green had to concede a bit with Navi. I like it. We're still missing the Taichi and Phantom Gaming cards from ASRock.


----------



## Dmac73

keikei said:


> Interesting trend with these aftermarket cards. Normally, the pricing goes a little out of whack in the wrong direction, against the buyer. Here, it's about who can go the lowest price-wise while also providing the most incentives. Even Green had to concede a bit with Navi. I like it. We're still missing the Taichi and Phantom Gaming cards from ASRock.


Newegg has the Taichi, although it's auto-notify. $479.99. Tough choice over a cheap 2070 Super.


----------



## keikei

Dmac73 said:


> Newegg has the taichi although its auto notify. $479.99. Tough choice over a cheap 2070 Super.


$40 above the Nitro. Either the heatsink is all copper or there is an insane amount of RGB. Possibly a triple-slot card. Yeah, at that price it's either a top premium 5700 XT or an entry-level 2070S. Personally, I'd take the 5700 XT given equal pricing.


----------



## huzzug

rv8000 said:


> Whether or not you like the current prices is completely irrelevant. The unfortunate situation is these cards now exist in today's price brackets. It doesn't change the fact that 20$ extra when buying a $400 product means a lot less to a consumer than when buying a $250 product.


Then I take it you didn't read my post and what it conveyed.


----------



## PontiacGTX

ZealotKi11er said:


> 5700 XT > $400, HD 7870 > $350 ($393 after inflation)
> 5700 non XT > $350 , HD 7850 > $250 ($280 after inflation)


One thing to note: these use a smaller node, which not even Nvidia is using, and GDDR6 is probably more expensive than GDDR5 was. But anyway, midrange prices increased $50-100 (non-reference).


----------



## rv8000

huzzug said:


> Then I take you didn't read my post and what it conveyed.


Feel free to continue on whatever tangent it is that you're going off on. Clearly we aren't talking about the same thing at this point.



keikei said:


> $40 above Nitro. Either the heatsink is all copper or there is an insane amount of rgb. Possibly triple slot card. Yeah, with that price: top premium 5700 XT or entry level 2070 S. Personally, I'd take the 5700 XT given equal pricing.


The tackiest and most over-designed card is obviously worth another $30-$40 over the Nitro/Red Devil 

I'll be curious to see their first stab at a real AIB card in terms of thermals/acoustics though. I wish them the best but dang that thing is ugly.


----------



## huzzug

rv8000 said:


> Feel free to continue on whatever tangent it is that you're going off on. Clearly we aren't talking about the same thing at this point.


Guess you finally got the hint. Next time try harder before responding.


----------



## Section31

I just got a reference 5700 XT (to put a waterblock on), but I will say AMD did a great job making the GPU easy to disassemble compared to Nvidia Founders cards with their insane number of screws. Also, Corsair did a decent job on its block. The quickest install I have ever done.


----------



## diggiddi

Section31 said:


> I just got a reference 5700 XT (to put a waterblock on), but I will say AMD did a great job making the GPU easy to disassemble compared to the Nvidia Founders card with its insane number of screws. Also, Corsair did a decent job on its block. The quickest install I have ever done.


 Pictures? or it didn't happen


----------



## Heuchler

Dmac73 said:


> Newegg has the taichi although its auto notify. $479.99. Tough choice over a cheap 2070 Super.


The ASRock Radeon RX 5700 XT Taichi OC has four DisplayPort outputs and two HDMI ports. For reference, ATI's Radeon HD 5870 Eyefinity 6 Edition was also $479.


----------



## Figueiredo

Hi guys, 

Not seeing an owners thread so I might just ask here for some feedback:

Well, this card runs pretty hot, even watercooled. I've never had a watercooled GPU run this hot. I don't know if it's the way AMD measures TJ or whatever, but I feel a bit disappointed. None of my nVidia cards ever went over 45ºC under full load, even overclocked, and this RX 5700 XT climbs fast and easy to 50ºC after just 5 min. Not even trying to overclock yet. All stock. The loop is powerful enough and I'm sure there's nothing wrong with the mounting; I checked, and contact is perfect. Also used premium Thermal Grizzly paste.
VRM and memory running cooler than the GPU die. What is this?

Loop: Res -> DDC-1T PLus -> CG480 + CG120 -> CPU -> GPU
Flow is 230L/h
CPU very cool 40ºC , GPU is the hottest part while gaming no doubt.

Anyone experiencing the same behavior ??? IS THIS NORMAL ???


----------



## rdr09

Figueiredo said:


> Hi guys,
> 
> Not seeing an owners thread so I might just ask here for some feedback:
> 
> Well, this card runs pretty hot, even watercooled. I've never had a watercooled GPU run this hot. I don't know if it's the way AMD measures TJ or whatever, but I feel a bit disappointed. None of my nVidia cards ever went over 45ºC under full load, even overclocked, and this RX 5700 XT climbs fast and easy to 50ºC after just 5 min. Not even trying to overclock yet. All stock. The loop is powerful enough and I'm sure there's nothing wrong with the mounting; I checked, and contact is perfect. Also used premium Thermal Grizzly paste.
> VRM and memory running cooler than the GPU die. What is this?
> 
> Loop: Res -> DDC-1T PLus -> CG480 + CG120 -> CPU -> GPU
> Flow is 230L/h
> CPU very cool 40ºC , GPU is the hottest part while gaming no doubt.
> 
> Anyone experiencing the same behavior ??? IS THIS NORMAL ???


What waterblock are you using? Might avoid it. But, yes, VRM and Memory will run cooler than the Core when watercooled. Have you seen this?

https://www.igorslab.media/ungefess...erplaytables-fuer-die-rx-5700-und-rx-5700-xt/

And the temps, were they during gaming?

BTW, not sure about the AIBs, but the reference stock volt is kinda high at 1200mv. Too much. I undervolt mine and maintain same boost clocks.


----------



## Figueiredo

Temps @ full load stressing the card.
Gaming it's something in the 42-46.

I'm using an EK Vector, so quality is no issue. I've been using their GPU blocks since the 9800 GTX. Never had a GPU over 45c while stressing, and gaming is always around 40.
Might try lowering the voltage on this sucker, granted it's more than enough to run 2K.


----------



## rdr09

Figueiredo said:


> Temps @ full load stressing the card.
> Gaming it's something in the 42-46.
> 
> I'm using an EK Vector, so quality is no issue. I've been using their GPU blocks since the 9800 GTX. Never had a GPU over 45c while stressing, and gaming is always around 40.
> Might try lowering the voltage on this sucker, granted it's more than enough to run 2K.


Doubt if liquid metal will help. Your flow is kinda low, tho.


----------



## Figueiredo

rdr09 said:


> Doubt if liquid metal will help. Your flow is kinda low, tho.


Liquid metal might be a good option to shave 2-3ºC, yes, but 1 gallon per minute is a nice sweet spot for flow. Sure, 2 GPM would shave another 1ºC, but I don't need the extra noise. I prefer silence over performance. I could also fit some high-CFM, noisy fans, which would help more than adding a second pump, but there's no real need. I'll live with the temperature, but as a remark, these cards run hotter than I expected.
Guess I'll experiment with undervolting before liquid metal, as I've seen good reports that Navi responds very well to it. If I can get around 90% performance at 1V, I believe temps will max out around my 45ºC target. Will share some results soon.


----------



## rdr09

Figueiredo said:


> Liquid metal might be a good option to shave 2-3ºC, yes, but 1 gallon per minute is a nice sweet spot for flow. Sure, 2 GPM would shave another 1ºC, but I don't need the extra noise. I prefer silence over performance. I could also fit some high-CFM, noisy fans, which would help more than adding a second pump, but there's no real need. I'll live with the temperature, but as a remark, these cards run hotter than I expected.
> Guess I'll experiment with undervolting before liquid metal, as I've seen good reports that Navi responds very well to it. If I can get around 90% performance at 1V, I believe temps will max out around my 45ºC target. Will share some results soon.


Yes, undervolt. Figure out the lowest your card will do at stock clocks. I think I got mine to 1012mv before instability set in.


----------



## treetops422

Figueiredo said:


> Temps @ full load stressing the card.
> Gaming it's something in the 42-46.
> 
> I'm using EK Vector so quality is no issue. Been using their GPU blocks since GTX 9800. Never had a GPU over 45c while stressing and gaming always around 40.
> Might try and lower voltage on this sucker granted it's more than enough to run 2K.


I use a $20 CPU block on my 5700 non-XT, with about 40 gallons of water as cooling, no rads (big Walmart tote). Mid-range paste, $10 for 3 grams. I stay barely under 40c benchmarking/gaming at really high overclocks: 2040+ core boost, 960ish ram (remember, it's a non-XT), drawing 225 to even 260 watts (5700 base is 150 watts?). Almost 30c ambient. Maybe 2 GPM. With water, memory/VRM will always run lower on any card; they don't output as many watts per square inch as the core.



Why undervolt? What's the point of low temps if you're not OCing?


----------



## 113802

treetops422 said:


> I use a $20 CPU block on my 5700 non-XT, with about 40 gallons of water as cooling, no rads (big Walmart tote). Mid-range paste, $10 for 3 grams. I stay barely under 40c benchmarking/gaming at really high overclocks: 2040+ core boost, 960ish ram (remember, it's a non-XT), drawing 225 to even 260 watts (5700 base is 150 watts?). Almost 30c ambient. Maybe 2 GPM. With water, memory/VRM will always run lower on any card; they don't output as many watts per square inch as the core.
> 
> 
> 
> Why undervolt? what's the point of low temps if you are not OCing?


I'm going to answer your question from Vega experience. There are a few reasons to undervolt, saving power and preventing a card from power throttling. The other method to prevent power throttling is to modify the powerplay by increasing the power limit.
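As a rough sketch of why undervolting saves power: dynamic power scales roughly with V² at a given clock, so even a modest voltage drop cuts draw noticeably. This is a first-order estimate that ignores leakage, and the wattage numbers below are illustrative, not measurements:

```python
# First-order estimate of power savings from undervolting.
# Dynamic power scales roughly with V^2 * f; leakage and static draw
# are ignored, so treat this as a ballpark only.

def undervolt_power(p_stock_w: float, v_stock: float, v_new: float,
                    f_ratio: float = 1.0) -> float:
    """Estimate board power after undervolting, assuming P ~ V^2 * f."""
    return p_stock_w * (v_new / v_stock) ** 2 * f_ratio

# Illustrative numbers: ~225 W at the stock 1.2 V, undervolted to
# 1.0 V while holding the same boost clock.
print(round(undervolt_power(225, 1.2, 1.0)))  # ~156 W
```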


----------



## Figueiredo

treetops422 said:


> I use a $20 CPU block on my 5700 non-XT, with about 40 gallons of water as cooling, no rads (big Walmart tote). Mid-range paste, $10 for 3 grams. I stay barely under 40c benchmarking/gaming at really high overclocks: 2040+ core boost, 960ish ram (remember, it's a non-XT), drawing 225 to even 260 watts (5700 base is 150 watts?). Almost 30c ambient. Maybe 2 GPM. With water, memory/VRM will always run lower on any card; they don't output as many watts per square inch as the core.
> 
> 
> 
> Why undervolt? what's the point of low temps if you are not OCing?


I have no need to overclock, really... Gaming at 2K 60Hz, I just want the water temp under 30ºC so the fans spin at only 600rpm and stay completely silent. I've set up the Aquaero with a fan curve so that above a 30ºC water temp the fans spin up to try to hold that temp. I always aim at 30ºC, or a delta of 6-8ºC; it's been like this forever, it's how I like to run things, and with this AMD card I notice my fans ramping up more, so I need to tame it.
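For anyone curious, a water-temperature-based fan curve like that can be sketched in a few lines. The 38°C ceiling and RPM values below are made-up illustrations, not Aquaero defaults:

```python
# Sketch of a water-temp fan curve: idle at 600 RPM up to the 30 C
# target, then ramp linearly to full speed. Thresholds/RPMs are
# illustrative, not Aquaero defaults.

def fan_rpm(water_temp_c: float,
            target_c: float = 30.0,
            max_c: float = 38.0,
            idle_rpm: int = 600,
            max_rpm: int = 1500) -> int:
    if water_temp_c <= target_c:
        return idle_rpm
    if water_temp_c >= max_c:
        return max_rpm
    frac = (water_temp_c - target_c) / (max_c - target_c)
    return int(idle_rpm + frac * (max_rpm - idle_rpm))

print(fan_rpm(28.0))  # 600 (below target, stays silent)
print(fan_rpm(34.0))  # 1050 (halfway up the ramp)
```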


----------



## 113802

Figueiredo said:


> I have no need to overclock, really... Gaming at 2K 60Hz, I just want the water temp under 30ºC so the fans spin at only 600rpm and stay completely silent. I've set up the Aquaero with a fan curve so that above a 30ºC water temp the fans spin up to try to hold that temp. I always aim at 30ºC, or a delta of 6-8ºC; it's been like this forever, it's how I like to run things, and with this AMD card I notice my fans ramping up more, so I need to tame it.


As long as your junction temp is below 110c you can run the fans at any speed. I have my radiator fans at 300 RPM and my Radeon VII junction temp hits around 90c on hot days. It's not like a nVidia card where it starts to throttle at 55c or 65c.


----------



## Figueiredo

Ok, sitting at 2000mhz GPU and 900mhz mem, 1.035v. Max temp 47º after 30 min at full load. Normal gaming locked at 60Hz is really low temp now, around 42º. Pretty happy now.
Temps are much better, and power consumption is really low at full load, a mere 166w. Gaming around 140w.


----------



## paulerxx

I just bought a 5700XT ASUS TUF Gaming X3! Awaiting its arrival :/


----------



## rdr09

Figueiredo said:


> Ok, sitting at 2000mhz GPU and 900mhz for MEM, 1.035v. Max temp 47º after 30 min at full load. Normal gaming locked at 60hz is really low temp now, around 42º. Pretty happy now.
> Temps are getting better and also power consumption is really low at full load, mere 166w. Gaming around 140w.


A stock 5700 gets about 140w. Nice.


----------



## EastCoast




----------



## keikei




----------



## 113802

Looks average. Buy reference for water cooling if you're going to purchase a block for this mid-range card.

https://youtu.be/9Nve6XruPgw


----------



## treetops422

WannaBeOCer said:


> Looks average. Buy reference for water cooling if you're going to purchase a block for this mid-range card.
> 
> https://youtu.be/9Nve6XruPgw


Lotsa advanced info and basic info for the likes of me. I wonder if the reference Sapphire XT is also missing those phases; my non-XT ref Sapphire sure is. I assumed it was a non-XT thing when I compared my card to all the XT breakdown vids.


----------



## ZealotKi11er

WannaBeOCer said:


> Looks average. Buy reference for water cooling if you're going to purchase a block for this mid-range card.
> 
> https://youtu.be/9Nve6XruPgw


For water cooling, you need Fanboy Edition. I have always told people that reference AMD GPUs are the best. AIB cards are cost reduced.


----------



## paulerxx

I've been messing around with ASUS's 5700XT TUF Gaming. Stock temps were high af, in the upper 80s/90s range. With a new fan curve + undervolting, the card now sits in the high 60s/low 70s range. If it goes above 70, it's only for a moment.

The card at 100% fan sounds like a blow dryer lol


----------



## skupples

my 1080ti is throwing artifacts... 7nm calls my name... to block or not to block, that is the question.


----------



## Dmac73

skupples said:


> my 1080ti is throwing artifacts... 7nm calls my name... to block or not to block, that is the question.


Ref / Block or get a Nitro.


----------



## rluker5

skupples said:


> my 1080ti is throwing artifacts... 7nm calls my name... to block or not to block, that is the question.


Those 1080tis have not been good to you. I never had a problem with one. Almost wish I would because I have close to 2 years left on my 4 year warranty and AIOs have a limited life.


----------



## skupples

rluker5 said:


> Those 1080tis have not been good to you. I never had a problem with one. Almost wish I would because I have close to 2 years left on my 4 year warranty and AIOs have a limited life.


it's my fault. I tried making block changes while between living arrangements and starting a new job. This led to my overlooking an EK-FC terminal O-ring, which led to high-pressure water spraying everywhere. Highly likely both cards received water damage on the sides where the terminal would be. 

luckily, i can still play 90% of my games, and the only performance issue is a random artifact here and there. Some other titles just take longer and longer to load as the session goes on.

it's not my storage, or my memory, or my board. I know for sure.



Dmac73 said:


> Ref / Block or get a Nitro.


so business as usual as it were. 

I was looking @ the thicc ultra, but knowing myself I'd end up getting a block anyways. so ref/block it'll likely be, like always. this will be my last pc purchase until 4K120 rev finally gets rolling.


----------



## Section31

skupples said:


> it's my fault. I tried making block changes while between living arrangements and starting a new job. This led to my overlooking an EK-FC terminal O-ring, which led to high-pressure water spraying everywhere. Highly likely both cards received water damage on the sides where the terminal would be.
> 
> luckily, i can still play 90% of my games, and the only performance issue is a random artifact here and there. Some other titles just take longer and longer to load as the session goes on.
> 
> it's not my storage, or my memory, or my board. I know for sure.
> 
> 
> 
> so business as usual as it were.
> 
> I was looking @ the thicc ultra, but knowing myself I'd end up getting a block anyways. so ref/block it'll likely be, like always. this will be my last pc purchase until 4K120 rev finally gets rolling.


Best bet is to get reference plus a cheap China waterblock. The China blocks were going for around $70 on AliExpress, and if you add that to a reference GPU, it's about the same price as the better AIB models. This still allows you to get a 5900XT/3080Ti down the road.

That's what i'm doing myself for the build in the PC-011XL. Going to use it and 3700X parts (meant for my work PC) to fully plan the system and test out the water loop before I transfer my 3900X/2080Ti (Heatkiller block) over to the PC-011XL. That way I don't have a system down in the meantime. My main rig can still get a 3080Ti/5900XT when they come out, and I can continue to use my CaseLabs S8 too.


----------



## skupples

i'm not in a huge rush. I can still get most of what I want done atm, and I've still gotta swap in the other card & see if its a miracle or not.


----------



## diggiddi

Dmac73 said:


> Ref / Block or get a Nitro.


Is nitro/block necessary if noise is not an issue?


----------



## skupples

if you want temps below 50c, yes. Same if you worry about memory and VRM temps. 

cpu air coolers may be trading blows with some basic water solutions these days, but water is still king in the gpu world. I assume this is only magnified at 7nm.


----------



## 113802

skupples said:


> if you want temps below 50c, yes.


As long as junction temp is 110C and below you're fine, it's not a nVidia card.


----------



## skupples

what's the definition of fine here?

are you implying that its not gonna clock any better when kept cool? that hurts my brain, you're triggering my CD.


----------



## 113802

skupples said:


> what's the definition of fine here?
> 
> are you implying that its not gonna clock any better when kept cool? that hurts my brain, you're triggering my CD.


AMD cards won't throttle if the junction temp is 109C and below. Unlike nVidia where they throttle at 55C.

Edit: Navi doesn't throttle until 115c


----------



## skupples

hmmmm

I've gotta digest the actual implications of that for a sec. 

i really like the look of the thicc ultra, clocked @ 1980 outta the box. A Nitro won't show up at my door in two days with no extra fees (on top of my subscription!), so clearly not a valid option!!!


----------



## 113802

skupples said:


> hmmmm
> 
> I've gotta digest the actual implications of that for a sec.
> 
> i really like the look of the thicc ultra, clocked @ 1980 outta the box. Nitro won't show up at my door in two days with no extra fees(on top of subscription!), clearly not a valid option!!!


Best PCB for overclocking is the PowerColor Red Devil, followed by the fanboy edition, and finally reference. I suggest getting a block if you do plan on overclocking a fanboy edition/reference card; I can't tolerate a blower. The Thicc Ultra's memory cooling is garbage. I wouldn't touch it.


----------



## skupples

a block it'll be. 

just gotta see how hosed my other 1080ti is. I'm not gonna play RMA games. I fudged hard. My fuxup.


----------



## diggiddi

WannaBeOCer said:


> Best PCB for overclocking is the PowerColor Red Devil followed by the fanboy edition and finally reference. I suggest getting a block if you do plan on overclocking a fanboy edition/reference card. I can't tolerate a blower. The Thicc ultra's memory cooling is garbage. I wouldn't touch it


So if you crank fans up to 110% on ref card you can still get a decent OC?


----------



## 113802

diggiddi said:


> So if you crank fans up to 110% on ref card you can still get a decent OC?


Just crank it to 100% and you'll be able to hit the max stable OC on the core. I couldn't stand the 60+ dB fan, though. Memory would definitely benefit from water. My Radeon VII with the fan at 100% (which is 3850 RPM) had a junction temp of 95c at 1150mV with a 2GHz OC. 

I honestly prefer AMD's junction-temperature approach over nVidia's throttling at 55C.


----------



## rluker5

skupples said:


> it's my fault. I tried making block changes while between living arrangements and starting a new job. This led to my overlooking an EK-FC terminal O-ring, which led to high-pressure water spraying everywhere. Highly likely both cards received water damage on the sides where the terminal would be.
> 
> luckily, i can still play 90% of my games, and the only performance issue is a random artifact here and there. Some other titles just take longer and longer to load as the session goes on.
> 
> it's not my storage, or my memory, or my board. I know for sure.
> 
> 
> 
> so business as usual as it were.
> 
> I was looking @ the thicc ultra, but knowing myself I'd end up getting a block anyways. so ref/block it'll likely be, like always. this will be my last pc purchase until 4K120 rev finally gets rolling.


I've seen stuff like that before; even done stuff like that. But not with a computer. I work in a chemical factory, and the gasket somebody forgot, or that stuck to something and then fell off, can lead to adventure. 
I'd use them until they bust. Graphics cards depreciate in value anyway. You might even try some far-fetched idea, like reflashing the BIOS, to get one to last through more of its useful life. Not that reflashing is the answer, but given time you might come up with something. You never know.


----------



## skupples

yeah, the other card has been in a tub with a jug of DampRid, out in the 100f garage for 2 weeks now. Hopefully that's the ticket.


----------



## Dmac73

skupples said:


> it's my fault. I tried making block changes while between living arrangements and starting a new job. This led to my overlooking an EK-FC terminal O-ring, which led to high-pressure water spraying everywhere. Highly likely both cards received water damage on the sides where the terminal would be.
> 
> luckily, i can still play 90% of my games, and the only performance issue is a random artifact here and there. Some other titles just take longer and longer to load as the session goes on.
> 
> it's not my storage, or my memory, or my board. I know for sure.
> 
> 
> 
> so business as usual as it were.
> 
> I was looking @ the thicc ultra, but knowing myself I'd end up getting a block anyways. so ref/block it'll likely be, like always. this will be my last pc purchase until 4K120 rev finally gets rolling.


Yeah, don't get a Thicc. They're trash according to GN, and the design is not good performance-wise.

The Nitro gets very respectable junction temps for AIR cooling, which is one of your top priorities on AMD cards. Plus it's sexy as hell. It will be ~$100 cheaper than ref/block, though.

Dmac


----------



## skupples

i appreciate the advice on the thicc, that being the case, ref & block it's gonna be. Kinda the norm for me anywhoo. maybe big navi will be out by the time I get around to it.

my only hold out is having a shield, though it doesn't get much use in my current living arrangement. My folks offered me a place to stay when they heard i was getting a job near by, so i'm saving all sorts of cash atm  

i'm planning a gift to myself when i finally move back out... an all new dream PC in my STH10 + LG OLED


----------



## 113802

skupples said:


> i appreciate the advice on the thicc, that being the case, ref & block it's gonna be. Kinda the norm for me anywhoo. maybe big navi will be out by the time I get around to it.


I suggest an RTX 2070 Super at that price range. If you're going to get one anyway, though, I do suggest getting a Heatkiller block. 

http://shop.watercool.de/epages/Wat...ies/Wasserkühler/GPU_Kuehler/"Radeon RX Navi"

https://youtu.be/JbwE6p41OGo


----------



## skupples

i'd go used 2080ti hunting if going nv.

as to the 2070 @ that price range - i won't be using ray tracing any time soon, and i don't need green team software. I like the progress AMD is making, so I can throw them a bone. Pretty sure my last AMD acquisition was so long ago that my folks paid for it.


----------



## Dmac73

skupples said:


> i'd go used 2080ti hunting if going nv
> 
> as to the 2070 @ price range - i won't be using ray tracing any time soon, and don't need green green software. I like the progress AMD is making, I can throw them a bone. Pretty sure my last AMD acquisition was so long ago that my folks provided it.


This is nothing more than an opinion, but i think we will see an RX 5800 by year's end, or Q1 at the latest, so you might be able to hold off. 

Although i'll say a 2100MHz-core 5700XT is insanely fast in games, particularly the ones i play (Apex, CoD, OW, BF).

Dmac


----------



## skupples

yeah, i'm in no rush. I likely won't even swap to the other 1080ti to see if its OK until i run into an actual game stopping issue. I'm also on late shift this week, and outta town the next 2 weekends. 

I really have no interest in paying the RTX tax until its something i'll actually use. Not to mention intel joining the fray soon enough. 

in short, we're still at the beginning of the dx12 revolution. These last few monoliths aren't going to age very well. I'm hyper-focused on what's coming 2021/2022. $1,000 4K120 panels, & $1,000 GPUs that can drive em well.


----------



## Jedi Mind Trick

skupples said:


> yeah, i'm in no rush. I likely won't even swap to the other 1080ti to see if its OK until i run into an actual game stopping issue. I'm also on late shift this week, and outta town the next 2 weekends.
> 
> I really have no interest in paying the RTX tax until its something i'll actually use. Not to mention intel joining the fray soon enough.
> 
> in short, we're still at the beginning of the dx12 revolution. These last few monoliths aren't going to age very well. I'm hyper-focused on what's coming 2021/2022. $1,000 4K120 panels, & $1,000 GPUs that can drive em well.


I feel like I'm kind of in the same boat, even though I keep buying and returning RTX cards [though TBF, I seem to keep getting artifacting on them]. 

It is killing me that nV seems to be able to increase the costs for things all the time with seemingly no negative effect (to them). The 1080ti to a 2080S and lower isn't enough added performance to justify the ~$200 needed to upgrade (or ~$120 for the 2080, ~$50+ for the 2070S; the 2070/5700XT could end up being a no-cost trade for slightly less performance). I know the performance is technically there ($400ish +tax for something within spitting distance of the 1080ti is pretty solid [the XT]; the $500 2070S doesn't seem like an awful value) and I could always look at a 5700 if I wanted [which has a great P ratio], but I'm greedy! *I want it all for nothing...* It just bothers me how costly the 'high' end has gotten. $700 for a base 2080S vs $500 for a base 2070S is the same difference in cost that the 1080 Ti had vs the 1080, but the performance difference between the two 'upgrades' was much higher (40% more money [$200] for a ~15% boost from the 2070S to the 2080S, vs 43% [$210] for a 27.5% boost from the 1080 to the 1080Ti; and this isn't even bringing up the 2080->2080Ti...). All of that extra money for some mediocre (okay, I think it _*can *_look pretty cool) ray tracing.

I sold my 1080Ti Trio and went back to my 980ti (after going through a 2080, a 2070/2070S [gave the S to a friend], and a 2060S [*in hindsight, I am 100% an idiot for selling the 1080ti without wanting to spend more $$ or sidegrade...*]).

Luckily, I found a used 1080ti Sea Hawk [the hybrid variant, not the FC-EK block one] for $427.50 [~$12.50 less than what I sold my Trio for and it will fit in my MITX case if I ever want to downsize] and a 'new' (open box) 5700XT red devil for ~$405 after tax. Been super busy (work and exams this week), but as soon as I play with each of them for a bit, I'll hopefully keep whichever I like more (or sell the Ti and return the XT because I have a problem). 

Sorry for the rant. 

On a different note, I really need to stop buying/selling cards and just stick with one.


----------



## keikei

Dmac73 said:


> This is nothing more than an opinion but i think we will see a RX5800 by years end or Q1 at the latest so you might be able to hold off.
> 
> Although i'll say a 2100 core 5700XT is insanely fast in games, particularly the ones i play(Apex,CoD,OW,BF).
> 
> Dmac


That timeline seems more legit, considering info on the smaller Navi parts has already leaked a bit.


----------



## Dmac73

keikei said:


> That timeline seems more legit considering the smaller Navi info has been leaked a bit.


Yep. Then you'll see a high end 5900XT middle of next year is my guess. 

I think 5800XT will retain GDDR6, 5900XT could be going HBM2e. 

Also, https://www.amazon.com/gp/product/B07XMNGVVD/ref=ppx_yo_dt_b_asin_title_o00_s00?ie=UTF8&psc=1 Nitro in stock the 13th. $439 free prime shipping. Really a fantastic deal.


----------



## 113802

Dmac73 said:


> Yep. Then you'll see a high end 5900XT middle of next year is my guess.
> 
> I think 5800XT will retain GDDR6, 5900XT could be going HBM2e.
> 
> Also, https://www.amazon.com/gp/product/B07XMNGVVD/ref=ppx_yo_dt_b_asin_title_o00_s00?ie=UTF8&psc=1 Nitro in stock the 13th. $439 free prime shipping. Really a fantastic deal.


Pretty sure AMD will use HBM only for their accelerator cards, just like nVidia. We won't see it in consumer cards, and I doubt we'll see an Arcturus consumer card, even though I hope we do.


----------



## Dmac73

WannaBeOCer said:


> Pretty sure AMD will only use HBM just for their accelerator cards just like nVidia. We won't see it in consumer cards and I doubt we'll see an Arcturus consumer card even though I hope we do.


I realize Navi is a different direction than Vega but i still think AMD will use HBM2 on the high end card. I'm pretty confident in that. You can quote me and we will revisit it in a few months.

One of the main reasons is that it's going to be a power-hungry card. We both know this. AMD isn't near nVidia's level in the power consumption / performance department. This is still true on 7nm when you take all the facts into account. We are looking at a 300w TDP card, and there's no way they're going to fit a 352-bit/11GB bus into that power envelope; not to mention they are quite seasoned with HBM on consumer cards now, dating back to Fury. This would be nothing new for AMD. 

It also allows them to continue with 16gb on their flagship cards coming from VII --> 5900.

Only time will tell; we will revisit it down the road and one of us can say i told ya so. 

Dmac


----------



## ZealotKi11er

Dmac73 said:


> I realize Navi is a different direction than Vega but i still think AMD will use HBM2 on the high end card. I'm pretty confident in that. You can quote me and we will revisit it in a few months.
> 
> One of the main reasons is that it's going to be a power hungry card. We both know this. AMD isn't near nVidias level in the power consumption / performance department. This is still true on 7nm when you take all the facts into account. We are looking at a 300w TDP card and there's no way they're going to fit a 352/11gb bus into that power envelope; not to mention they are quite seasoned with HBM on consumer cards now dating back to Fury. This would be nothing new for AMD.
> 
> It also allows them to continue with 16gb on their flagship cards coming from VII --> 5900.
> 
> Time will only tell, we will visit it down the road and one of us can say i told ya so.
> 
> Dmac


The card will only be power-hungry if it uses 12GB/384-bit or 16GB/512-bit. If it's still 8/16GB on 256-bit, it will be fine. I too think AMD is done with HBM for gaming GPUs. Both Fiji and Vega showed that in games it's a waste of money.


----------



## Dmac73

ZealotKi11er said:


> The card will only be power-hungry if it uses 12GB-384-Bit or 16GB-512-Bit. If its still 16/8GB-256-Bit it will be fine. I too think AMD is done with HBM for gaming GPUs. Both Fiji and Vega showed that in games its a waste of money.


I'm not talking about a 5800XT, i'm talking about a theoretical 5900XT, a top dog. And a 256-bit bus ain't enough. Plain and simple. Not in that segment; and we are so past 512-bit buses it's not even funny. So you either go 384-bit or you go HBM2e, and that's what i see AMD doing.

Dmac


----------



## ZealotKi11er

Dmac73 said:


> I'm not talking about a 5800XT i'm talking about a theoretical 5900XT, a top dog. And 256 bit bus ain't enough. Plain and simple. Not in that segment; and we are so past 512 bit buses it's not even funny. So you either go 384 or you go HBM2e and that's what i see AMD doing.
> 
> Dmac


What about Nvidia? Is 384-bit @ 18Gbps enough for a 3080 Ti? The problem with 384-bit is they're limited to 12GB or 24GB. 12 is too small, and 24 is too much for cost reasons.
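The 12GB/24GB limit falls out of the channel math: a 384-bit bus is twelve 32-bit channels, each serving one GDDR6 chip, or two in clamshell mode. A quick sketch, assuming the 8Gb (1GB) chip densities common at the time:

```python
# Possible VRAM capacities for a GDDR6 bus: one chip per 32-bit channel
# (normal) or two (clamshell). Assumes 8 Gb (1 GB) chips, the common
# density when this was written.

def vram_options_gb(bus_width_bits: int, chip_gb: int = 1) -> list:
    channels = bus_width_bits // 32
    return [channels * chip_gb, channels * chip_gb * 2]  # [normal, clamshell]

print(vram_options_gb(384))  # [12, 24]
print(vram_options_gb(256))  # [8, 16]
```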


----------



## 113802

ZealotKi11er said:


> What about Nvidia? 384-Bit @ 18GBps enough for 3080 Ti? Problem with 384-Bit is they are limited to 12GB/24GB. 12 is too small and 24 is too much for cost reasons.


nVidia stated desktop GPUs don't need the throughput of PCIe 4.0, yet one of AMD's selling points with Navi was PCIe 4.0. AMD even worked with Futuremark on a PCIe bandwidth benchmark. AMD should sell another generation with GDDR6, followed by a generation with HBM; AMD fanboys will jump on it seeing that HBM is back.


----------



## Dmac73

ZealotKi11er said:


> What about Nvidia? 384-Bit @ 18GBps enough for 3080 Ti? Problem with 384-Bit is they are limited to 12GB/24GB. 12 is too small and 24 is too much for cost reasons.


You missed one of my main points in there bud. nVidia gpus are far more power efficient, they can afford to stay under a tdp envelope with 384 bit bus etc


----------



## ilmazzo

Every GPU is efficient in the right power window (voltage vs frequency table). The problem for AMD was that the uarch was less efficient and had to be clocked outside its ideal voltage region to get some more performance, ruining power and thermal efficiency even at stock.

Now the uarch is the right one (the 40-CU Navi (5700 XT) performs in games more or less like the old gen's 60 CUs (VII)), but it is still overvolted (voltages vary from 1.05 to 1.2V - that's a 28nm-era voltage). nVidia cards are not efficient outside their fixed voltage window either, but they are built with that voltage range in mind, since this is the third generation with fixed voltage they've released, and for a reason...

The 5800 should be done on 7nm+, which will help with power efficiency while allowing nice base clocks at a feasible voltage; even a 64-CU version would be a nice little monster.


----------



## PontiacGTX

WannaBeOCer said:


> They stated desktop GPUs don't need the throughput of PCIe 4.0. Yet one of their selling points was PCIe 4.0 with Navi. They even worked with Futuremark on a PCIe bandwidth benchmark. AMD should sell another generation with GDDR6 followed by a generation with HBM. AMD fanboys will jump on it seeing that HBM is back.


I mean, HBM helped with power consumption on Fiji and Vega, which are power hungry; Navi doesn't need it unless a big Navi GPU becomes power hungry too. Also, the HBM saving isn't that much compared to GDDR6, unlike GDDR5, which uses a lot more power for less bandwidth. And GDDR5X wasn't an option, since it was probably a Micron-only technology.


----------



## KyadCK

PontiacGTX said:


> I mean, HBM helped with power consumption on Fiji and Vega, which are power hungry; Navi doesn't need it unless a big Navi GPU becomes power hungry too. Also, *the HBM saving isn't that much compared to GDDR6, unlike GDDR5, which uses a lot more power for less bandwidth*, and GDDR5X wasn't an option, since it was probably a Micron-only technology.


https://www.eeweb.com/profile/schar...-memory-ready-for-ai-primetime-hbm2e-vs-gddr6


> Power consumption for the HBM2e device is about 5 watts at 2.8-Gb/s bandwidth. By comparison, in the case of GDDR6, each of the four devices consumes about 2.5 W, giving a total power consumption of 10 W. Thus, it is evident that a *single HBM2e device consumes almost half the power as for a GDDR6 solution.*
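The quoted comparison reduces to a two-line calculation (a sketch using only the figures in the quote; real-world memory power varies with workload and clocks):

```python
# Totals behind the quoted EEWeb comparison: one HBM2e stack vs. a
# four-chip GDDR6 configuration. Figures are as quoted, i.e. approximate.
hbm2e_watts = 5.0              # single HBM2e device
gddr6_watts = 4 * 2.5          # four GDDR6 devices at ~2.5 W each
savings = gddr6_watts - hbm2e_watts
print(gddr6_watts, savings)    # 10.0 W total for GDDR6, 5.0 W saved by HBM2e
```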


----------



## Section31

Let's see what happens with the high-end GPUs. Nvidia also has access to HBM memory, but they only use it on the Titan V.


----------



## ilmazzo

It's not only the memory chips' own power usage; even the memory controller inside the GPU saves power with HBM, afaik.


----------



## Heuchler

RX 5600 XT vs RX 5500 XT vs GTX 1660 Super







https://www.igorslab.media/nvidia-k...pannt-mit-der-alten-2060-auf-eine-rx-5600-xt/


----------



## EastCoast

Has anyone else mentioned this Global Wattman issue from this video on reddit?

https://www.reddit.com/r/Amd/comments/dcz9k6/5700_xt_not_game_boosting_as_it_should_in_games/


----------



## PontiacGTX

KyadCK said:


> https://www.eeweb.com/profile/schar...-memory-ready-for-ai-primetime-hbm2e-vs-gddr6


So 15 W? Not worth it. Also, if HBM2 on Vega is any indication, it's even more pointless, since HBM2 chips use more power than HBM1.


----------



## treetops422

EastCoast said:


> Has anyone else mentioned this issue about Global Wattman from this video in reddit?
> 
> https://www.reddit.com/r/Amd/comments/dcz9k6/5700_xt_not_game_boosting_as_it_should_in_games/


I had a similar issue; I reinstalled Windows without a third-party AV active and poof, it worked. Keep in mind the known issues, like the enhanced sync and hardware acceleration bugs, listed in the driver update notes.


----------



## keikei

Possibly improved over the initial Thicc: https://videocardz.com/82200/xfx-launches-radeon-rx-5700-xt-thicc-iii-ultra


----------



## skupples

improved memory cooling? that's the only complaint i've seen repeated.


----------



## keikei

skupples said:


> improved memory cooling? that's the only complaint i've seen repeated.


GN did a review of the Thicc II and it wasn't favorable: mediocre build quality, and while the OC was decent, it came with louder noise levels than the competition.


----------



## homestyle

GN and XFX must have some beef going back. Or GN is looking to be over-the-top to somehow prove to consumers they are legit, honest reviewers who keep manufacturers accountable.

GN was way overly dramatic.

The plastic bits have been a terrible trend on all computer parts, especially motherboards.

Yes, it performs worse than the other tested cards, but you don't have to go on a rant. It's the same mindless droning he does in his other videos, which should have been cut in half; that style doesn't translate well when you have a bad review.

Especially when you see that PowerColor advertises their cards on GN.


----------



## rv8000

homestyle said:


> GN and XFX must have some beef going back. Or GN is looking to be over-the-top to somehow prove to consumers they are legit and honest reviewers that keep manufacturers accountable.
> 
> GN was way overly dramatic.
> 
> The plastic bits has been a terrible trend on all computer parts, especially motherboards.
> 
> Yes, it performs worse than the other tested cards. But you don't have to go on a rant. It's that same mindless droning on and on that he goes on in his other videos about something that should've been cut in half. That same style doesn't translate well when you have a bad review.
> 
> Especially when you see that powercolor advertises their cards on GN.


That's just Steve's thing in the past year or so; he seems to have gotten quite a big head. I can't take him seriously anymore because of it. I mostly mute his cooler breakdowns now and just watch him disassemble the parts without the commentary. Either way, the review of the Thicc definitely brought up some valid points about unnecessary parts of the shroud design hurting thermals.

I was hoping the XFX card would be a bit better, but the thermals are lackluster and the noise levels are pretty bad compared to the MSI Gaming X, the Red Devil, and the Nitro+.


----------



## skupples

its the age + experience myth. "i'm 28, been doing this for 12 years" that's good, your next lesson in life is controlling ego. 
(quoting my good friend's grandfather, from a few years back)


----------



## The Robot

homestyle said:


> GN and XFX must have some beef going back. Or GN is looking to be over-the-top to somehow prove to consumers they are legit and honest reviewers that keep manufacturers accountable.
> 
> GN was way overly dramatic.
> 
> The plastic bits has been a terrible trend on all computer parts, especially motherboards.
> 
> Yes, it performs worse than the other tested cards. But you don't have to go on a rant. It's that same mindless droning on and on that he goes on in his other videos about something that should've been cut in half. That same style doesn't translate well when you have a bad review.
> 
> Especially when you see that powercolor advertises their cards on GN.


What's funny is that XFX got butthurt and started white-knighting themselves in the comments. Maybe Nvidia is not the only devil in their breakup story.


----------



## airisom2

Yeah, it's ironic: a couple of years ago, when people were complaining about Chinese-made Noctua fan tolerances, he was the first to remark on outrage culture. Well, who do we have here, neck-deep in it? This is all just a big soap opera.


----------



## skupples

its an easy rut to fall into, and highly profitable. 

angry people click more.

its the in thing for folks around his (my) age.  

all the rage, to rage rage rage. just not much worth while to rage about so they raged into the 72 variant intersecting nightmare.

(there's plenty to rage about, but folks are just way too distracted by complete and utter nonsense.)


----------



## braincracking

I've unsubbed because of this horse crap. I want some light, digestible content giving me the short version on a given hardware product; I don't need the dramatic nonsense with it.


----------



## keikei




----------



## treetops422

I think AIB cards are uselessly expensive if you're not using MPT or a BIOS mod. Any pre-OC'd gain non-OC people might get for the price is better saved and spent on the next tier. If you must get a quieter non-blower, don't spend more than $10 over MSRP. That's my opinion; probably not a popular one.


GamersNexus would disagree, but so what, he still rocks. If anyone knows of a better English-speaking YT channel for what he does, let it be known. I also like JayzTwoCents, Hardware Unboxed, and Igor's Lab. They all have their own style. Igor's Lab is my favorite; its major content is translated, ftw. They don't make stupid faces for clicks (well, unless it's in a mocking manner, lol), use cliché titles like "internet explorer sucks", spread rumors for clicks, or just copy-paste other people's content without any understanding of its validity.


Linus gets a pass since he's been doing stupid faces for over a decade? But it's still annoying.


----------



## PontiacGTX

braincracking said:


> I've unsubbed because of this horse crap. I want some light digestible content giving me the short of a certain hardware product, I don't need the dramatic nonsense with it.


I mean, his Yeston RX 580 review was quite objective. Not sure about the rest; I don't watch his reviews.


----------



## keikei

treetops422 said:


> I think aib cards are uselessly expensive if your not using mpt or a bios mod. Any pre oced gain, non oc people might get for the price is better saved and spent on the next tier. If you must get a quieter non blower don't spend more then $10 over msrp. That's my opinion. Probably not a popular one.
> 
> 
> GamerNexus would disagree, so what he still rocks. If anyone knows of a better English speaking YT for what he does, let it be known. I also like Jayz2cents, Hardware Unboxed and Igors Lab. They all have their own style. Igors Lab is my favorite, it's major content is translated ftw. They don't make stupid faces for clicks, well unless it's in a mocking manner lol. Or have cliche names like, internet explorer sucks. Spread rumors for clicks. Or just copy paste other peoples content without any understanding of it's validity.
> 
> 
> 
> Linus gets a pass since he's been doing stupid faces for over a decade? But it's still annoying.


 

The e-peen on these aftermarket cards is strong. I believe ASRock has one more card yet to release, with RGB on all the fans. If I were in the market for one, I'd get these over a low-end 2070.


----------



## Section31

I was hoping to be able to use my 5700 XT, but it's having an issue with no display output on a completely new setup (I only reused the NVMe drive from my Intel rig). I'm in the process of ruling out the possibilities on another rig (DDU issue, X570 PCIe 4, etc.), but in the meantime I have to use my RTX 2060. It was nice that I managed to find the funds to make my work rig an all-AMD build with a 3700X and 5700 XT (watercooled, too). I will get this to work somehow.

That being said, I hope the 5900 XT is not like this, or my main gaming rig will have to stay on an Nvidia GPU (RTX 3080 Ti).


----------



## nisc

What is the rx 5700 xt chip size?

https://tpucdn.com/gpu-specs/images/g/861-navi-10-xt.jpg


----------



## tpi2007

nisc said:


> What is the rx 5700 xt chip size?
> 
> https://tpucdn.com/gpu-specs/images/g/861-navi-10-xt.jpg



It's on the database page that contains that picture: 251 mm²

Here: https://www.techpowerup.com/gpu-specs/radeon-rx-5700-xt.c3339


----------



## nisc

tpi2007 said:


> nisc said:
> 
> 
> 
> What is the rx 5700 xt chip size?
> 
> https://tpucdn.com/gpu-specs/images/g/861-navi-10-xt.jpg
> 
> 
> 
> 
> It's on the database page that contains that picture: 251 mm²
> 
> Here: https://www.techpowerup.com/gpu-specs/radeon-rx-5700-xt.c3339

251 is the die area.

https://encrypted-tbn0.gstatic.com/...TURuxxrC3klTdVYy97Ch7jxl4Rj8G4v3Aa8BAj95fTU1Y


----------



## Section31

This might be off-topic, but once I get the 5700 XT working (give me until late November/early December), I will run a contest for a free RTX 2060 (out of fairness to everyone). Since the cards' actual resale value is approaching the $150-200 USD range, I'm not going to get much back, so I might as well give it away to the community. I will post an update on where and when. It will be open to US/Canada individuals (though it's up to them to arrange all shipping details).


----------



## PriestOfSin

Question before I pull the trigger on the Red Devil 5700 XT: I've heard reports of cards self-destructing, with performance taking a nosedive until they eventually stop working entirely. Any validity to those? The card is going in a secondary machine (Doom Ripper), so it won't be used a whole lot beyond entertaining little cousins, etc. If the card starts to slowly die, I probably won't notice it much; with my luck, I wouldn't notice until the warranty ran out.


----------



## keikei

PriestOfSin said:


> Question before I pull the trigger on the Red Devil 5700XT. I heard reports of cards self-destructing, with their performance taking a nosedive until they eventually stop working entirely. Any validity to those? The card is going in a secondary machine (Doom Ripper) so won't be used a whole lot beyond when I need to entertain little cousins, etc., so if the card starts to slowly die I probably won't notice it too much... with my luck I wouldn't notice until the warranty ran out.



Nothing widespread enough to cause concern. You could go with an ASRock Challenger; there are enough buyers for that card that if there were some issue, it would've come up. Plus, a 3-year warranty.


----------



## Newbie2009

PriestOfSin said:


> Question before I pull the trigger on the Red Devil 5700XT. I heard reports of cards self-destructing, with their performance taking a nosedive until they eventually stop working entirely. Any validity to those? The card is going in a secondary machine (Doom Ripper) so won't be used a whole lot beyond when I need to entertain little cousins, etc., so if the card starts to slowly die I probably won't notice it too much... with my luck I wouldn't notice until the warranty ran out.


Can't say I've heard of it as a widespread issue. The cards of choice are the Sapphire Nitro+, MSI Gaming X, PowerColor Red Devil, and maybe the Gigabyte Gaming OC. Pick any of them and you should be good.


----------



## Newbie2009

homestyle said:


> GN and XFX must have some beef going back. Or GN is looking to be over-the-top to somehow prove to consumers they are legit and honest reviewers that keep manufacturers accountable.
> 
> GN was way overly dramatic.
> 
> The plastic bits has been a terrible trend on all computer parts, especially motherboards.
> 
> Yes, it performs worse than the other tested cards. But you don't have to go on a rant. It's that same mindless droning on and on that he goes on in his other videos about something that should've been cut in half. That same style doesn't translate well when you have a bad review.
> 
> Especially when you see that powercolor advertises their cards on GN.


It can also be picked up for more or less the price of a stock blower card, so it's a good buy imo.


----------



## Section31

I have fixed my 5700 XT (driver issue, dumb me) and have started the process for the free RTX 2060 giveaway. It's in the Freebie section if you're all interested.


----------



## keikei

Section31 said:


> I have fixed my 5700XT (Driver Issue Dumb Me) and have started the process for the free RTX2060 giveaway. It's on Freebie Section if your all interested.


You are a Saint.


----------



## EastCoast

Section31 said:


> I have fixed my 5700XT (Driver Issue Dumb Me) and have started the process for the free RTX2060 giveaway. It's on Freebie Section if your all interested.


How did you fix it?


----------



## Section31

EastCoast said:


> How did you fix it?


Basically, I've been on Nvidia GPUs so long I didn't realize I needed to run DDU. I could remove and insert Nvidia GPUs with no issues.


----------



## PriestOfSin

keikei said:


> Nothing wide spread enough to cause a concern. You can go with an asrock challenger. There are enough buyers for that card, if there were some issue, it would've come up. Plus, 3 yr warranty.





Newbie2009 said:


> Can't say I heard it as being a widespread issue. Cards of choice are sapphire nitro, MSI gaming x, Power Color Red Devil and maybe the gigabyte gaming OC. Any of them and you should be good.


Awesome, good to hear it isn't a widespread issue. Normally I buy Sapphire; I wasn't totally sure what PowerColor's warranty was. The Red Devil just looks too awesome to pass up.


----------



## EastCoast

Looks like Navi is making AMD money
https://translate.google.com/transl...rbase.de/2019-10/navi-gpu-amd-radeon-rx-5700/


----------



## ToTheSun!

EastCoast said:


> Looks like Navi is making AMD money
> https://translate.google.com/transl...rbase.de/2019-10/navi-gpu-amd-radeon-rx-5700/


One would hope so.


----------



## Ha-Nocri

So, choices in my local store are XFX 5700XT THICC II or MSI 5700XT MECH OC. Both are bad it seems, but which is better?


----------



## ilmazzo

The Thicc II is not the original Thicc; I haven't seen it reviewed yet. As for the MSI, it's decent as long as you fix the thermal pads, afaik.


----------



## keikei

Ha-Nocri said:


> So, choices in my local store are XFX 5700XT THICC II or MSI 5700XT MECH OC. Both are bad it seems, but which is better?


I believe the Thicc had some early driver issues, but they were addressed pretty quickly. I would double-check that statement, though. My money is on the XFX.


----------



## keikei

https://www.youtube.com/watch?v=x4fKbE8QsFg


----------



## EastCoast

keikei said:


> https://www.youtube.com/watch?v=x4fKbE8QsFg


Finally the truth comes out: the RD was never "the best card"; it was simply one of the first to market that improved on AMD's blower-style cooler.

Goes to show that whoever offers an AIB card first wins the mind share.

I seriously doubt this will repeat with the 5800 series; AIB cards should be out at the same time as the blower style.


----------



## PriestOfSin

EastCoast said:


> Finally the truth comes out. RD was never "the best card" it was simply one of the 1st to market that improved on AMD blower style cooler.
> 
> Goes to show you he who offers the AIB card 1st wins the mind share.
> 
> I seriously doubt this will repeat with the 5800 series. AIB should be out the same time as blower style.


Seems to boil down to aesthetics within the price bracket you're looking at, and maybe warranty support? I grabbed one because it went with my build really well and the Nitro+ wasn't available at my local store.


----------



## EastCoast

PriestOfSin said:


> Seems to boil down to aesthetics within the price bracket you're looking at. Maybe warranty support? I grabbed one because it went with my build really well and the Nitro + wasn't available at my local store.


I'm willing to disagree on the aesthetics; it's a pretty generic-looking three-fan cooler.
As for availability, I seriously doubt the RD was the only 5700 at a particular brick-and-mortar or e-retailer, unless the others had already sold out.

I waited several months, so neither is a good excuse to cling to.


----------



## PriestOfSin

EastCoast said:


> I'm willing to disagree with the aesthetics, it's pretty generic looking 3 fan cooler.
> As for availability I seriously doubt that the RD was the only 5700 in a particular bm/e-retailer. Unless others have already sold out.
> 
> So neither is a good excuse to claim hold to.


Disagree with the aesthetics all you like; I don't particularly care, lol. And yeah, at my local store this morning I was able to choose from:

MSI Evoke
Powercolor Red Devil Special Edition
Powercolor Red Devil
Gigabyte Gaming Overclocked
MSI Mech
ASRock Challenger
Powercolor Red Dragon
Sapphire Pulse
Powercolor 5700XT (unnamed dual fan)
Powercolor 5700XT Reference

So maybe you're just being argumentative for the sake of being argumentative.


----------



## looniam

steve is really going to need help with his sandwich addiction; his health has taken a downturn (that girth, yo!) and he's shamelessly pandering for money to get his next one.

oh, how quickly the mighty fall . . .


----------



## 113802

EastCoast said:


> Finally the truth comes out. RD was never "the best card" it was simply one of the 1st to market that improved on AMD blower style cooler.
> 
> Goes to show you he who offers the AIB card 1st wins the mind share.
> 
> I seriously doubt this will repeat with the 5800 series. AIB should be out the same time as blower style.


Buildzoid showed it was the best 2 months ago and it still is. 

https://youtu.be/QYPysvNlt_0


----------



## EastCoast

PriestOfSin said:


> Disagree with the aesthetics all you like, I don't particularly care lol. And yeah, at my local store this morning I was able to choose from the:
> 
> MSI Evoke
> Powercolor Red Devil Special Edition
> Powercolor Red Devil
> Gigabyte Gaming Overclocked
> MSI Mech
> ASRock Challenger
> Powercolor Red Dragon
> Sapphire Pulse
> Powercolor 5700XT (unnamed dual fan)
> Powercolor 5700XT Reference
> 
> So maybe you're just being argumentative for the sake of being argumentative.


The aesthetics are pretty bland, and the color (besides the red) would fit in any case, windowed side panel or not. The point I'm referencing is the recommendation that it was a "go-to card/best in class," which I knew it was not.

And thanks for providing the 5700 XT availability. What's being cited as the go-to is (in no particular order):
Nitro+
Sapphire Pulse
RD


I can only assume the Sapphire Pulse is cheaper for some.



WannaBeOCer said:


> Buildzoid showed it was the best 2 months ago and it still is.
> 
> https://youtu.be/QYPysvNlt_0


Never post this video to indicate which card is better. These kinds of reviews are about components rather than the entire PCB build. At the end of the day, a lot of the AIB 5700 XT cards perform similarly in games, which makes a review of the components themselves unimportant in the grand scheme of things.

Well, unless you plan on modding the card for extreme OC, sub-zero cooling, etc.

Edit:
If I'm not mistaken, he likes the OEM cards the most, as they have very expensive, overbuilt components ("smart" VRMs or something like that). However, we all know how the OEM blower-style cards perform against AIB cards.

Sure, you could get a waterblock and "unleash it," but that's cost-prohibitive: by the time you pay for a 5700 XT LS ED. plus a waterblock (for example), you're better off getting a 2070S.


----------



## 113802

EastCoast said:


> The Aesthetics are pretty bland and the color (besides the red) would fit in any case be it one with a side window or not. The point I'm referencing is the recommendations that it was a "go to card/best in class" which I knew it was not.
> 
> And thanks for providing 5700 xt availability. What's being said as the go to is (not in any order):
> Nitro+
> Sapphire Pulse
> RD
> 
> 
> I can only assume that the Sapphire Pulse is cheaper for some.
> 
> 
> 
> Never post this video to indicate which card is better. These kind of reviews are talking about components over entire pcb build. At the end of the day a lot of the AIB 5700 xt cards perform similarly in game. Making the review of the components themselves not important in the grand scheme of things.
> 
> Well, unless you plan on modding the card for extreme OC/sub zero cooling, etc.
> 
> Edit:
> If i'm not mistaken he likes the OEM cards the most. As they have very expensive, overbuilt components. Smart vregs or something like that. However, we all know how the OEM blower style cards perform against AIB cards.
> 
> *Sure you could get a waterblock and "unleash it" but that's particularly "cost prohibitive" because by the time you pay for a 5700 XT LS ED. + waterblock (for example) you are better off getting a 2070S.*


I said the bolded text on launch day of the 5700 XT and everyone got angry. Like many people have already stated, there are fanboys who look past "cost-prohibitive," just like I did when I bought an RX Vega 64 LC for $699 instead of a GTX 1080 Ti for $600 on launch day.

He already said he prefers the Red Devil, followed by the OEM cards, followed by the Sapphire Nitro+, in his Nitro+ breakdown.


----------



## PriestOfSin

EastCoast said:


> The Aesthetics are pretty bland and the color (besides the red) would fit in any case be it one with a side window or not. The point I'm referencing is the recommendations that it was a "go to card/best in class" which I knew it was not.
> 
> And thanks for providing 5700 xt availability. What's being said as the go to is (not in any order):
> Nitro+
> Sapphire Pulse
> RD
> 
> 
> I can only assume that the Sapphire Pulse is cheaper for some.
> 
> 
> 
> Never post this video to indicate which card is better. These kind of reviews are talking about components over entire pcb build. At the end of the day a lot of the AIB 5700 xt cards perform similarly in game. Making the review of the components themselves not important in the grand scheme of things.
> 
> Well, unless you plan on modding the card for extreme OC/sub zero cooling, etc.
> 
> Edit:
> If i'm not mistaken he likes the OEM cards the most. As they have very expensive, overbuilt components. Smart vregs or something like that. However, we all know how the OEM blower style cards perform against AIB cards.
> 
> Sure you could get a waterblock and "unleash it" but that's particularly "cost prohibitive" because by the time you pay for a 5700 XT LS ED. + waterblock (for example) you are better off getting a 2070S.


The Pulse was $410, I believe, but I didn't want a dual-fan design, so I ended up paying $440 for the RD. The Nitro+ would have been $440 as well had it been in stock, looking at online pricing, with the Strix (which I believe is technically the best 5700 XT, hardware-wise) coming in at a whopping $460.


----------



## Gunderman456


https://www.youtube.com/watch?v=5McVc0SQMHU

----------



## 113802

Delete


----------



## Ha-Nocri

So I ended up buying the Gigabyte 5700 XT Gaming OC. It behaves nothing like in GamersNexus' review.

Under load the fans spin at ~2000 RPM; I guess GB updated the BIOS in the meantime. The worst thing is that the junction temperature hits 100°C in gaming, while the edge temp doesn't cross 70°C. It works fine like that, though, and at 2000 RPM the card is very quiet.

Then I saw that GB had another BIOS on their site, and that one increased the fan speed to 2400 RPM. Now the card is not that quiet, but not too loud either. They obviously had complaints about temperatures. Now the junction temp is around 90°C.

VRM and memory temps are excellent, though.

Now, there are a few things I could do. The card doesn't have a thermal pad between the GPU and the backplate; I could add one. And I could increase the mounting pressure, which might bring the junction temp down.


----------



## ilmazzo

Maybe just a TIM replacement with something good, plus the washer mod, could improve things, and it's almost free.


----------



## treetops422

Ha-Nocri said:


> So I ended up buying Gigabyte 5700 XT Gamin OC. It behaves nothing like in GamersNexus' review.
> 
> Under load fans spin @2000 RMP. I guess GB updated the BIOS in the mean time. The worst thing is that junction temperature hits 100 c in gaming, while the edge temp doesn't cross 70c. It works fine like that tho and at 2000 RPM card is very quiet.
> 
> Then I saw that GB had another BIOS on their site, and that one increased fan speed to 2400 rpm. Now card is not that quiet, but not too loud either. They obviously had complains about temperatures. Now junction temp is around 90 c.
> 
> VRM and memory temps are excellent tho.
> 
> Now, there are a few things I could do. The card doesn't have thermal pads between GPU and back-plate. I could add it. And I could increase mounting pressure, which might bring down junction temp.


If you're not using MPT or a BIOS mod, every single 5700 XT can hit max OC, and they all have the same performance; the only trade-off is how loud the fan has to be. If your TJ is 90°C, cool beans; your card will run just fine. If you want to tinker with it for fun, have fun, but there is nothing wrong with running it at those temps.


----------



## skupples

PriestOfSin said:


> Disagree with the aesthetics all you like, I don't particularly care lol. And yeah, at my local store this morning I was able to choose from the:
> 
> MSI Evoke
> Powercolor Red Devil Special Edition
> Powercolor Red Devil
> Gigabyte Gaming Overclocked
> MSI Mech
> ASRock Challenger
> Powercolor Red Dragon
> Sapphire Pulse
> Powercolor 5700XT (unnamed dual fan)
> Powercolor 5700XT Reference
> 
> So maybe you're just being argumentative for the sake of being argumentative.


i'm almost glad no one sells GPUs locally anymore. It prevents me from sampling and tying up money in return games.


----------



## EastCoast

Gunderman456 said:


> https://www.youtube.com/watch?v=5McVc0SQMHU


20:04 says it all, but we already knew that. For everyday gaming use, the PCB components used don't mean much.

@PriestOfSin @WannaBeOCer


----------



## PriestOfSin

skupples said:


> i'm almost glad no one sells GPUs locally anymore. It prevents me from sampling & tying up money in return games.


A friend ended up doing that over a weekend last year: picked up a 1060 3GB on Friday, fooled around with it Saturday, returned and exchanged it for a 1060 6GB on Sunday, which he returned and exchanged for a 1070 on Monday. It can be a blessing in a lot of ways, but GPUs are a gateway drug, lmao.



EastCoast said:


> 20:04 says it all. But we already knew that. For everyday gaming use pcb components used doesn't mean much.
> 
> @PriestOfSin
> @WannaBeOCer


I'd be lying if I said I didn't want to see that Strix card under LN2.


----------



## Fletcherea

Ha-Nocri said:


> So I ended up buying Gigabyte 5700 XT Gamin OC. It behaves nothing like in GamersNexus' review.
> 
> Under load fans spin @2000 RMP. I guess GB updated the BIOS in the mean time. The worst thing is that junction temperature hits 100 c in gaming, while the edge temp doesn't cross 70c. It works fine like that tho and at 2000 RPM card is very quiet.
> 
> Then I saw that GB had another BIOS on their site, and that one increased fan speed to 2400 rpm. Now card is not that quiet, but not too loud either. They obviously had complains about temperatures. Now junction temp is around 90 c.
> 
> VRM and memory temps are excellent tho.
> 
> Now, there are a few things I could do. The card doesn't have thermal pads between GPU and back-plate. I could add it. And I could increase mounting pressure, which might bring down junction temp.


Mine was pretty balmy too. This is my first AMD GPU ever, so I was lost and confused for a bit. I ended up undervolting and dropping the boost to 1800 MHz (I think that's roughly what the reference card runs). I managed to get a stable 960 mV out of it while sitting just a tiny bit under 1800 MHz (power hangs out at around 165 W, with occasional jumps). It's super cool and quiet for me now. I was having a bit of buyer's remorse before I did this; now I'm quite happy.
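That power drop is roughly what a first-order CMOS estimate (dynamic power scaling with f·V²) predicts. A sketch, where the stock 1905 MHz / 1.2 V point is an assumption for illustration; only the ~1800 MHz / 960 mV figures come from the post:

```python
def relative_dynamic_power(f_new: float, v_new: float,
                           f_old: float, v_old: float) -> float:
    """First-order CMOS estimate: dynamic power scales with frequency * V^2."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Post's undervolt (~1800 MHz @ 0.96 V) vs. an assumed stock operating
# point (1905 MHz @ 1.2 V -- illustrative, not measured from this card).
ratio = relative_dynamic_power(1800, 0.96, 1905, 1.20)
print(f"{ratio:.0%}")  # roughly 60% of stock dynamic power
```

Static leakage and fixed board power don't scale this way, so the real saving is smaller, but it shows why a modest undervolt tames these cards so effectively.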


----------



## Ha-Nocri

Fletcherea said:


> Mine was pretty balmy too. This is my 1st amd gpu ever so I was lost and confused for a bit. I ended up undervolting and dropping the boost to 1800mhz (i think thats what the reference card was ish) I managed to get a stable 960mv out of it while just a tiny bit under the 1800mhz(I think the power sits at around 165 ish watts, jumps occasionally but thats where it likes to hang out) . It's super cool and quiet for me now, was having a bit of buyers remorse before I did this, now I'm quite happy.


How bad was it? You are lowering performance because you were worried by the numbers. My card works just fine; I would just like the junction temp to be lower so I'd have some OC headroom and it could boost above 2000 MHz.


----------



## skupples

PriestOfSin said:


> A friend ended up doing that over a weekend last year. Picked up the 1060 3GB on Friday, fooled around with it Saturday, returned and exchanged for the 1060 6GB on Sunday- which he returned and exchanged for a 1070 on Monday. It can be a blessing in a lot of ways, but GPUs are a gateway drug lmao.
> 
> 
> 
> I'd be lying if I said I didn't want to see that Stryx card under LN2.


They most definitely are. The last GPU I purchased off the shelf was a GK110 Titan.

I went back two separate times, for a grand total of 3x GK110 Titans.

I started with a 670, got a second one... still not fast enough; returned both, got one Titan. Still not fast enough... etc., etc.

Now I know what to do next time mGPU becomes relevant: 2x flagships off the bat.


----------



## 113802

Ha-Nocri said:


> How bad was it? You are lowering the performance because you were worried by the numbers. My card works just fine, I would just like junction temp to be lower so I would have some OC headroom so it can boost above 2000MHz


Do you have a reference card? The junction temperature can reach 110C before it throttles, which is plenty of headroom.


----------



## Ha-Nocri

WannaBeOCer said:


> Do you have a reference card? The junction temperature can reach 110C before it throttles, which is plenty of headroom.


Gigabyte 5700 xt gaming oc


----------



## NightAntilli

https://www.youtube.com/watch?v=oej2ev3g-sM


----------



## b.walker36

NightAntilli said:


> https://www.youtube.com/watch?v=oej2ev3g-sM


He used non-reference models, then used MSRP with that data to calculate value, lol. Doesn't make sense. Granted, the XT is still a better value, but why do people do things like that?


----------



## treetops422

b.walker36 said:


> He used non-reference models, then used MSRP with that data to calculate value, lol. Doesn't make sense. Granted, the XT is still a better value, but why do people do things like that?


He already tested the 5700 XT, 5700, 2060, 2060 Super and 2070 Super reference models (17 cards total, actually) in the link below. That's why I'm guessing he used different cards this time. Perhaps he used MSRP prices because AIB prices fluctuate, I don't know. Or maybe he already had the MSRP graphs from his earlier benchmark. Pretty sure he's getting tired of testing these cards at this point. If someone wants to bust out a calculator, it's not too hard to plug in the current prices of those AIBs. I see your point, but he's only got so much time.



https://www.techspot.com/review/1870-amd-radeon-rx-5700/


----------



## Fletcherea

Ha-Nocri said:


> How bad was it? You're lowering performance just because you were worried by the numbers. My card works just fine; I would just like the junction temp to be lower so I'd have some OC headroom and it could boost above 2000MHz.


It wasn't awful, low 90s hotspot, but it was louder than I wanted. 100MHz slower, down to 960mV, and barely creeping over 80/81C hotspot. I'm quite happy with that (heck, if it had been advertised like that, it would have been my first choice without comparisons!)


----------



## b.walker36

treetops422 said:


> He already tested the 5700 XT, 5700, 2060, 2060 Super and 2070 Super reference models (17 cards total, actually) in the link below. That's why I'm guessing he used different cards this time. Perhaps he used MSRP prices because AIB prices fluctuate, I don't know. Or maybe he already had the MSRP graphs from his earlier benchmark. Pretty sure he's getting tired of testing these cards at this point. If someone wants to bust out a calculator, it's not too hard to plug in the current prices of those AIBs. I see your point, but he's only got so much time.
> 
> 
> 
> https://www.techspot.com/review/1870-amd-radeon-rx-5700/


If he used reference data to make that graph, then that is cool, but I assumed it was the two cards he tested, since he didn't say specifically unless I missed it. I get that you have limited time, but that is not a good place to cut corners, because it could be very misleading.


----------



## Ha-Nocri

Fletcherea said:


> It wasn't awful, low 90s hotspot, but it was louder than I wanted. 100MHz slower, down to 960mV, and barely creeping over 80/81C hotspot. I'm quite happy with that (heck, if it had been advertised like that, it would have been my first choice without comparisons!)


Yeah, I've tried some undervolting and card runs 10c cooler while keeping the same clock. Default voltage is 1.171, I can lower it to 1.140 for the same performance. Definitely worth doing on Navi cards.
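As a back-of-the-envelope check (my own sketch, not official figures: assuming dynamic power scales roughly with V² at a fixed clock), the drop from 1.171V to 1.140V works out to around 5% less dynamic power:

```python
# Rough estimate of dynamic power savings from undervolting at a fixed clock.
# Assumes P_dynamic ~ C * f * V^2, so at the same frequency the ratio is (V_new / V_old)^2.
# The voltages below are the ones reported in this thread, not AMD specs.

def undervolt_savings_pct(v_old: float, v_new: float) -> float:
    """Return the estimated % reduction in dynamic power from v_old to v_new."""
    return (1.0 - (v_new / v_old) ** 2) * 100.0

if __name__ == "__main__":
    saving = undervolt_savings_pct(1.171, 1.140)
    print(f"~{saving:.1f}% less dynamic power at the same clock")
```

Real-world savings are usually a bit larger than this simple model, since lower voltage also reduces leakage and lets the fan curve relax.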


----------



## Ice009

I wonder if you guys would mind me asking in this thread what you think I should upgrade to? I was looking at either a GTX 1080Ti (second hand) or an AMD RX 5700XT. I'm currently using a 1920 x 1200p Dell Monitor, so resolution is basically 1080p. What is the best overall card between those two, and/or, what is the best upgrade path that will also give me some headroom if I do upgrade my Monitor down the line to either a 2K or 4K resolution screen? I was leaning towards the RX 5700 XT, but I've also read it runs pretty hot and also how are AMD drivers these days? I haven't used an AMD card since the HD 5770 that I had in Crossfire.


----------



## keikei

Ice009 said:


> I wonder if you guys would mind me asking in this thread what you think I should upgrade to? I was looking at either a GTX 1080Ti (second hand) or an AMD RX 5700XT. I'm currently using a 1920 x 1200p Dell Monitor, so resolution is basically 1080p. What is the best overall card between those two, and/or, what is the best upgrade path that will also give me some headroom if I do upgrade my Monitor down the line to either a 2K or 4K resolution screen? I was leaning towards the RX 5700 XT, but I've also read it runs pretty hot and also how are AMD drivers these days? I haven't used an AMD card since the HD 5770 that I had in Crossfire.


Do you care about the best cooling/quietest or $/perf with adequate cooling or yolo epeen card? There is a 5700 XT for all dem.


----------



## Ice009

keikei said:


> Do you care about the best cooling/quietest or $/perf with adequate cooling or yolo epeen card? There is a 5700 XT for all dem.


If I had a choice, I'd like a cooler running one that isn't too loud (doesn't have to be ultra quiet, though). Do high temps affect the performance of the cards? Also, are you suggesting I go with the RX 5700 XT over the GTX 1080Ti? The average price of the GTX 1080Ti (used) is about $100-$150AUD more here.


----------



## skupples

yes, get the 5700xt over the old used nvidia card. 100%

5700xt is brand new, pascal is end of life.


----------



## The Robot

Ice009 said:


> If I had a choice, I'd like a cooler running one that isn't too loud (doesn't have to be ultra quiet, though). Do high temps affect the performance of the cards? Also, are you suggesting I go with the RX 5700 XT over the GTX 1080Ti? The average price of the GTX 1080Ti (used) is about $100-$150AUD more here.


Get cheapest non-blower 5700XT you can find, it'll be a lot quieter and cooler than any 1080ti. Don't worry about the temps. You can also enable auto undervolt in wattman.


----------



## AlphaC

There are definitely use cases where a Titan RTX or possibly a Titan Xp is better than an RX 5700 XT / RX 5700, mainly content creation where more than 8GB of VRAM is helpful, and any use of CUDA. There are also the professional driver optimizations for OpenGL, although in some applications such as CATIA and SolidWorks the RX 5700 XT cards should do fine. A 1080 Ti is a hard thing to buy in late 2019, though, and unless you can make use of the VRAM it is almost always better to just buy an RTX 2070 _Super_ if you need CUDA.

If you scroll down to compute performance with the reference RX 5700 series:
https://www.notebookcheck.net/AMD-R...success.428200.0.html#toc-compute-performance

*SolidWorks SPECviewperf 12 (sw-03 viewset), makes some use of professional driver optimizations*
Titan RTX + R7 2700X = 139.59 fps
RX 5700 XT + R7 2700X = 89.55 fps (64%)
RX 5700 + R7 2700X = 79.21 fps (57%)
RTX 2080 + R7 2700X = 70.14 fps (50%)
RTX 2070 _Super_ + R7 2700X = 65.36 fps (47%)

*SolidWorks SPECviewperf 13 (sw-04 viewset), makes some use of professional driver optimizations*
Titan RTX = 132.89 fps
RX 5700 XT = 114.44 fps (86%)
RX 5700 = 101.77 fps (77%)
RTX 2080 = 100.32 fps (75%)
RTX 2070 _Super_ = 93.26 fps (70%)

*Maya SPECviewperf 12 (maya-04 viewset), keeping in mind it is DirectX*
RTX 2080 = 148.72 fps
Titan RTX = 132.76 fps (flawed benchmark or driver difference?)
RTX 2070 _Super_ = 125.05 fps
Radeon VII = 88.09 fps
RX 5700 XT = 86.51 fps

*Maya SPECviewperf 13 (maya-05 viewset)*
Titan RTX = 367.78 fps
RTX 2070 _Super_ = 290.37 fps (79%)
RTX 2070 = 275.28 fps (75%)
RTX 2060 _Super_ = 262.37 fps (71%)
RX 5700 XT = 220.34 fps (60%)

*3ds Max SPECviewperf 12 (3dsmax-05 viewset), keeping in mind it is DirectX*
Titan RTX = 268.22 fps
RTX 2080 = 234.09 fps
RTX 2070 _Super_ = 211.78 fps
RX 5700 XT = 173.77 fps
Radeon VII = 166.92 fps

*CATIA SPECviewperf 12 (catia-04 viewset), makes use of professional driver optimizations*
Titan RTX = 178.15 fps
RX 5700 XT = 158.81 fps
RX 5700 = 135.74 fps
RTX 2080 = 109.41 fps
RTX 2070 _Super_ = 96.26 fps

*CATIA SPECviewperf 13 (catia-05 viewset), makes some use of professional driver optimizations*
Titan RTX = 260.44 fps
RX 5700 XT = 242.76 fps (93%)
Radeon VII = 239.18 fps (92%)
RX 5700 = 209.21 fps (80%)
RTX 2070 _Super_ = 146.32 fps (56%)
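Those "(NN%)" figures are just each card's score divided by the Titan RTX result. A quick sketch reproducing them from the catia-05 numbers quoted above:

```python
# Reproduce the "(NN%)" figures: each score relative to the fastest card in the list.
# The fps values are the catia-05 viewset numbers quoted from notebookcheck above.

catia_05 = {
    "Titan RTX": 260.44,
    "RX 5700 XT": 242.76,
    "Radeon VII": 239.18,
    "RX 5700": 209.21,
    "RTX 2070 Super": 146.32,
}

baseline = max(catia_05.values())
for card, fps in catia_05.items():
    # :.0% formats the ratio as a whole percentage, e.g. 0.932 -> "93%"
    print(f"{card} = {fps} fps ({fps / baseline:.0%})")
```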

*Also in FAH (folding) and some other compute loads such as Blender:*

https://www.extremetech.com/computi...-vs-5700-xt-rendering-and-compute-performance


Also it is reflected in the techgage article:
https://techgage.com/article/amd-radeon-navi-vs-nvidia-geforce-super-proviz/5/
_Unfortunately for AMD, its Navi cards didn’t fare too well across a bunch of our tests. It did perform extremely well in SolidWorks, CATIA and Blender viewport tests, but in some others, like Metashape’s depth map generation and LuxMark’s Hotel render, both 5700-series cards nonsensically fell behind the older RX 590._


----------



## PriestOfSin

https://www.youtube.com/watch?v=OJU8jKIYtS4

Oof. ASUS dropped the ball hard on the TUF model.


----------



## PriestOfSin

Ice009 said:


> I wonder if you guys would mind me asking in this thread what you think I should upgrade to? I was looking at either a GTX 1080Ti (second hand) or an AMD RX 5700XT. I'm currently using a 1920 x 1200p Dell Monitor, so resolution is basically 1080p. What is the best overall card between those two, and/or, what is the best upgrade path that will also give me some headroom if I do upgrade my Monitor down the line to either a 2K or 4K resolution screen? I was leaning towards the RX 5700 XT, but I've also read it runs pretty hot and also how are AMD drivers these days? I haven't used an AMD card since the HD 5770 that I had in Crossfire.


I've been able to play with mine for a while now, and I can safely say that as long as you get one of the better models (Sapphire Pulse, Sapphire Nitro, PowerColor Red Devil, PowerColor Red Dragon) you'll be totally fine in the temps department. I'm currently using a 4k panel for my 5700 XT, and it does pretty well, all things considered. Older games max out easily. Lately I've been playing Borderlands 3, which I can only run at High settings and achieve 60FPS, but the sharpness of 4k looks spectacular combined with BL3's art style; at 1440p the card does great, and it'll rip right through 1200p no sweat. The 5700 XT will be weaker than the 1080 Ti, but the 1080 Ti is quite old these days. Lots of gas in the tank (my wife uses one), but I don't think I'd seek one out specifically. I haven't had any driver issues with this card, but the driver hell I went through on my Radeon VII in my main system is fresh in my mind; sometimes AMD's drivers work fine, sometimes Wattman is a half-broken cluster of shattered dreams and broken OC settings.

If you can make use of CUDA, it may make more sense to go with a 2070 Super, or if you desire RTX eye candy. A 2070S will also perform better in general, and will do better in 4k.


----------



## paulerxx

PriestOfSin said:


> https://www.youtube.com/watch?v=OJU8jKIYtS4
> 
> Oof. ASUS dropped the ball hard on the TUF model.


I own this card and have never seen memory temps break 85C, let alone 100C+, and I have my memory set to 900MHz in Wattman.

The screenshot was taken after around 45 minutes of playing Forza 4 + Gears 5, on ultra settings.


----------



## Ice009

Thanks for the help so far, guys. I have another option: 2 x GTX 1070s in SLI for about the same price as either a GTX 1080 Ti or an AMD RX 5700 XT. How is SLI these days, or would you guys recommend not going that route? How would they perform compared to a single card? Are there many newer games that don't have SLI profiles?


----------



## looniam

Ice009 said:


> Thanks for the help so far, guys. I have another option: 2 x GTX 1070s in SLI for about the same price as either a GTX 1080 Ti or an AMD RX 5700 XT. How is SLI these days, or would you guys recommend not going that route? How would they perform compared to a single card? Are there many newer games that don't have SLI profiles?


There was a day when two x70 cards could slap around a big Ti, but that hasn't been the case for a while now. These days it's a game of hot potato between NV/AMD and the devs over who should support multi-GPU since DX12.

If you wanna play around with benchmarking or some compute (edit: which mostly blows chunks on Pascal) or rendering, sure... but not actual gameplay.


----------



## ilmazzo

mgpu is dead, hail to mgpu!


----------



## paulerxx

Ice009 said:


> Thanks for the help so far, guys. I have another option: 2 x GTX 1070s in SLI for about the same price as either a GTX 1080 Ti or an AMD RX 5700 XT. How is SLI these days, or would you guys recommend not going that route? How would they perform compared to a single card? Are there many newer games that don't have SLI profiles?


Definitely grab the GTX 1080 Ti or 5700 XT; I believe the 1080 Ti is slightly faster. The 5700 XT will likely get better over time, while the 1080 Ti is at the end of its life cycle.


----------



## NightAntilli

Ice009 said:


> Thanks for the help so far, guys. I have another option: 2 x GTX 1070s in SLI for about the same price as either a GTX 1080 Ti or an AMD RX 5700 XT. How is SLI these days, or would you guys recommend not going that route? How would they perform compared to a single card? Are there many newer games that don't have SLI profiles?


SLI is basically dead. Many of nVidia's GPUs don't even support it anymore, so don't bother...


----------



## keikei

Ice009 said:


> Thanks for the help so far, guys. I have another option: 2 x GTX 1070s in SLI for about the same price as either a GTX 1080 Ti or an AMD RX 5700 XT. How is SLI these days, or would you guys recommend not going that route? How would they perform compared to a single card? Are there many newer games that don't have SLI profiles?


Xfire is nonexistent. SLI is on a per-game basis. Take that as you will. Not quite dead, but absolutely not a priority.


----------



## skupples

Ice009 said:


> Thanks for the help so far, guys. I have another option: 2 x GTX 1070s in SLI for about the same price as either a GTX 1080 Ti or an AMD RX 5700 XT. How is SLI these days, or would you guys recommend not going that route? How would they perform compared to a single card? Are there many newer games that don't have SLI profiles?


SLI is 90% dead, and is only supported in a handful of titles from the last 5 years. Some titles can be "hacked" for support (like Frostbite engine titles). Stutters and massive drops in minimum FPS make the higher average FPS a wash. It's hard to get current SLI users to admit this, as it physically hurts them to admit they wasted money on a card that works 10% of the time, for only 40-50% more power.


----------



## rv8000

paulerxx said:


> I own this card and have never seen memory temps break 85C, let alone 100C+, and I have my memory set to 900MHz in Wattman.
> 
> The screenshot was taken after around 45 minutes of playing Forza 4 + Gears 5, on ultra settings.


Having your GPU fans hit 90% (4000 RPM) compared to the review's stock curve is hardly fair, that actually makes the TUF version look almost as bad as the reference cooler. With my reference card I was able to keep the GPU temp around 70-75c with memory temps around 90c at about 2800rpm.


----------



## AlphaC

paulerxx said:


> I own this card and have never seen memory temps break 85C, let alone 100C+, and I have my memory set to 900MHz in Wattman.
> 
> The screenshot was taken after around 45 minutes of playing Forza 4 + Gears 5, on ultra settings.


Context man, context.

The Nitro+ card is doing similar or better temps with 1500RPM fan speed on the default BIOS and ~1100RPM on the silent BIOS, and the PowerColor Red Devil has similar results as well.

Also, even the Pulse (which is $10 more than reference) has better results. So if someone is unbiased, doesn't mind the lack of RGB, and is buying based on a card's merits, the Pulse is a great choice.

A normal user isn't going to tolerate much more than 1800RPM.


----------



## skupples

in fact, lack of RGB is a selling point for many many people ages 14 and over. 

my B-die memory kit was a good deal less expensive simply because it's not the highly sought-after RGB crystal nonsense all the kids want.


----------



## b.walker36

skupples said:


> in fact, lack of RGB is a selling point for many many people ages 14 and over.


So true. I don't mind subtle white lighting, but the crazy RGB that exists today is not my cup of tea.


----------



## EastCoast

b.walker36 said:


> So true. I don't mind subtle white lighting, but the crazy RGB that exists today is not my cup of tea.



I don't mind RGB. As long as you can control it without needing to run a software app in the background. I'm fine changing it to either just one color, off or something else I might like.


----------



## skupples

b.walker36 said:


> So true. I don't mind subtle white lighting, but the crazy RGB that exists today is not my cup of tea.


likewise, I prefer classic backlighting, or nothing at all. however, i never got super caught up in pleasing the internet with a e s t h e t i c, and there's no one in my life that'll think any higher of me for having a PC that can land planes. 

I had an LED phase when I was a kid too, it was just inside my car. Not my PC. 151db pounding kickers with synced RGB , you know, where real people in the real world will actually see it    we almost blew the back window outta the Cav i had those squares in. 



EastCoast said:


> I don't mind RGB. As long as you can control it without needing to run a software app in the background. I'm fine changing it to either just one color, off or something else I might like.


yep, but I'll go with the NO RGB option for less money every.single.time.  

The board lights on my MSI MPG are so obscene.


----------



## treetops422

skupples said:


> likewise, I prefer classic backlighting, or nothing at all. however, i never got super caught up in pleasing the internet with a e s t h e t i c, and there's no one in my life that'll think any higher of me for having a PC that can land planes.
> 
> I had an LED phase when I was a kid too, it was just inside my car. Not my PC. 151db pounding kickers with synced RGB , you know, where real people in the real world will actually see it    we almost blew the back window outta the Cav i had those squares in.
> 
> 
> 
> yep, but I'll go with the NO RGB option for less money every.single.time.
> 
> The board lights on my MSI MPG are so obscene.


Especially if you are trying to watch a movie in the dark or have your PC in the room you sleep in. I usually turn my PC off unless I'm downloading huge amounts of data or running scans. It also heats up the room a bit more?


----------



## ilmazzo

american tape, the real way to do it


----------



## skupples

^^ I believe that's what we in America would call masking tape?


----------



## Ice009

PriestOfSin said:


> I've been able to play with mine for a while now, and I can safely say that as long as you get one of the better models (Sapphire Pulse, Sapphire Nitro, PowerColor Red Devil, PowerColor Red Dragon) you'll be totally fine in the temps department. I'm currently using a 4k panel for my 5700 XT, and it does pretty well, all things considered. Older games max out easily. Lately I've been playing Borderlands 3, which I can only run at High settings and achieve 60FPS, but the sharpness of 4k looks spectacular combined with BL3's art style; at 1440p the card does great, and it'll rip right through 1200p no sweat. The 5700 XT will be weaker than the 1080 Ti, but the 1080 Ti is quite old these days. Lots of gas in the tank (my wife uses one), but I don't think I'd seek one out specifically. I haven't had any driver issues with this card, but the driver hell I went through on my Radeon VII in my main system is fresh in my mind; sometimes AMD's drivers work fine, sometimes Wattman is a half-broken cluster of shattered dreams and broken OC settings.
> 
> If you can make use of CUDA, it may make more sense to go with a 2070 Super, or if you desire RTX eye candy. A 2070S will also perform better in general, and will do better in 4k.


Hi, thanks for all that info. The RTX 2070 Super would be my pick if money wasn't an issue, but it's a bit out of my price range/budget right now.

I'm now also looking to upgrade my monitor, as a friend of mine convinced me to get something along the lines of these specs: 27 inch, 1440p resolution, 144Hz refresh rate, and FreeSync or G-Sync.

Do you have any recommendations for monitors? I see you have some LG ones, and there was an LG one I was looking at that apparently can do both FreeSync and G-Sync. I think I would prefer one that can do both; that way I'm not stuck with only one manufacturer's (Nvidia or AMD) video card. I read you can do G-Sync on selected FreeSync monitors (I assume it's done through drivers? I also think I read that you can't use HDR on a FreeSync monitor if you run G-Sync on it, only with FreeSync?), but what I also wanted to know is: can you do FreeSync on any native G-Sync screens? From what I read, you can't?


----------



## PriestOfSin

Ice009 said:


> Hi, thanks for all that info. The RTX 2070 Super would be my pick if money wasn't an issue, but it's a bit out of my price range/budget right now.
> 
> I'm now also looking to upgrade my Monitor as a friend of mine convinced me to get something along the lines with these specs : 27 inch, 1440p resolution and 144Hz refresh rate, oh and also FreeSync or Gsync.
> 
> Do you have any recommendations for Monitors? I see you have some LG ones, and there was an LG one that I was looking at that apparently can do both FreeSync and Gsync. I think I would prefer one that can do both, that way I'm not stuck with only one manufacturer's (Nvidia or AMD) video card. I read you can do Gsync on selected Freesync Monitors (I assume it's done through drivers? I think I also read you can't use HDR if it's a Freesync Monitor and you use Gsync on it, can only use HDR with Freesync?), but what I also wanted to know, can you also do Freesync on any native Gsync screens? From what I read, you can't?


The 5700XT is a price to performance behemoth, for sure. I don't regret grabbing one at all.

1440p/144hz is sweet. I may be too old to fully appreciate it, but it looks good when I do notice it. I miss some sharpness of 4k IPS when I'm on my 1440p VA panel, but nothing major.

My understanding is that Nvidia has unlocked FreeSync support for newer GPUs, but an AMD GPU flat-out cannot do G-Sync (since the tech, while similar in nature, is different between the two). My large 4k panel is HDR, but I've never messed with it much.

As far as monitor recommendations, get a 27'' 1440p/144Hz IPS display. I got a 32'' display (too large), and it's VA (good, better than TN, but not as good as IPS).


----------



## Ice009

Hey PriestOfSin, I forgot to put a link of the Monitor I'm looking at https://www.lg.com/us/monitors/lg-27GL850-gaming-monitor what do you think?




looniam said:


> There was a day when two x70 cards could slap around a big Ti, but that hasn't been the case for a while now. These days it's a game of hot potato between NV/AMD and the devs over who should support multi-GPU since DX12.
> 
> If you wanna play around with benchmarking or some compute (edit: which mostly blows chunks on Pascal) or rendering, sure... but not actual gameplay.


Thanks for the info. I didn't realize that SLI and Crossfire aren't being utilized anymore.



ilmazzo said:


> mgpu is dead, hail to mgpu!





paulerxx said:


> Definitely grab the GTX 1080 Ti or 5700 XT; I believe the 1080 Ti is slightly faster. The 5700 XT will likely get better over time, while the 1080 Ti is at the end of its life cycle.


Great point about the GTX 1080Ti being EOL, so I guess if going Nvidia, I'd have to look at an RTX card. How would you guys compare the RTX 2060 Super Vs the RX 5700 XT? I can't afford much more than either of these cards as the prices here in Australia are very high. RTX 2070 Super is over $1000 AUD.



NightAntilli said:


> SLI is basically dead. Many of nVidia's GPUs don't even support it anymore, so don't bother...


Wow. I didn't know that you can't even use SLI anymore.



keikei said:


> Xfire is nonexistent. SLI is on a per-game basis. Take that as you will. Not quite dead, but absolutely not a priority.


So AMD are even less supportive of dual cards than Nvidia? I always thought AMD struggled with drivers for dual card setups, so it kind of makes sense that they were even quicker than Nvidia to drop it.



skupples said:


> SLI is 90% dead, and is only supported in a handful of titles from the last 5 years. Some titles can be "hacked" for support (like Frostbite engine titles). Stutters and massive drops in minimum FPS make the higher average FPS a wash. It's hard to get current SLI users to admit this, as it physically hurts them to admit they wasted money on a card that works 10% of the time, for only 40-50% more power.


So most games don't even support it anymore? That is crazy. I didn't know the support for it had dropped that much.


----------



## PriestOfSin

Ice009 said:


> I forgot to put a link of the Monitor I'm looking at https://www.lg.com/us/monitors/lg-27GL850-gaming-monitor what do you think?


Looks pretty much perfect to me!


----------



## diggiddi

Ice009 said:


> So AMD are even less supportive of dual cards than Nvidia? I always thought AMD struggled with drivers for dual card setups, so it kind of makes sense that they were even quicker than Nvidia to drop it.


I don't think that is entirely accurate, but on DX12 games that could be true; not so much on DX11 games, though.


----------



## rluker5

skupples said:


> SLI is 90% dead, and is only supported in a handful of titles from the last 5 years. Some titles can be "hacked" for support (like Frostbite engine titles). Stutters and massive drops in minimum FPS make the higher average FPS a wash. It's hard to get current SLI users to admit this, as it physically hurts them to admit they wasted money on a card that works 10% of the time, for only 40-50% more power.


I thought it ran OK in more than half of titles over the last few years, with near 100% scaling in a lot of those if you were GPU limited (4k).
That being said, I did sell my second 1080 Ti after I beat SOTTR and started on AC Odyssey. I thought the 3080 Ti was coming sooner, tbh. But one 1080 Ti has been decent at 4k with adjusted settings. I didn't see stutters, though; single GPU isn't that different for me in that regard.
And I probably wouldn't SLI 1080 Tis or greater cards again if I were playing 4k60 w/o raytracing. A 2080 Ti would already be better overall for that, and cheaper, and could do TAA, but not SLI AA (which was very nice in Dishonored 2).

So, if you can only SLI new cards that are already pretty much good solo for the max resolution I plan on playing for the foreseeable future (I could go 8k, but it would have to be a setup like I have now with the overhanging TV, since I can't quite fit a 65" flat, and cards can't handle it yet anyway), I probably won't do SLI for the foreseeable future either, even though I have a lot of fond memories and still like it.

Maybe if Samsung made a curved 60" 8k and the 3080 Ti in SLI could do 8k, I would consider it. But that would be a lot of money.


----------



## treetops422

rluker5 said:


> I thought it ran OK in more than half of titles over the last few years, with near 100% scaling in a lot of those if you were GPU limited (4k).
> That being said, I did sell my second 1080 Ti after I beat SOTTR and started on AC Odyssey. I thought the 3080 Ti was coming sooner, tbh. But one 1080 Ti has been decent at 4k with adjusted settings. I didn't see stutters, though; single GPU isn't that different for me in that regard.
> And I probably wouldn't SLI 1080 Tis or greater cards again if I were playing 4k60 w/o raytracing. A 2080 Ti would already be better overall for that, and cheaper, and could do TAA, but not SLI AA (which was very nice in Dishonored 2).
> 
> So, if you can only SLI new cards that are already pretty much good solo for the max resolution I plan on playing for the foreseeable future (I could go 8k, but it would have to be a setup like I have now with the overhanging TV, since I can't quite fit a 65" flat, and cards can't handle it yet anyway), I probably won't do SLI for the foreseeable future either, even though I have a lot of fond memories and still like it.
> 
> Maybe if Samsung made a curved 60" 8k and the 3080 Ti in SLI could do 8k, I would consider it. But that would be a lot of money.


Has anyone tried upscaled 8k?


----------



## rluker5

treetops422 said:


> Has anyone tried upscaled 8k?


Seeing as how we don't have 8k outputs yet and Samsung is already selling 8k TVs, I imagine a large fraction of those who have bought them have gone with Samsung's upscaling. It won't be as good as native 8k, and as a matter of principle I would find upscaling generally unacceptable if I had spent a fortune to get it, just like using upscaling on the 4k in my avatar. The upscaling on my 2018 QLED 4k in the living room doesn't look too bad because it is all the way across the room, unlike my gaming 4k, which is less than arm's reach from my face. I also don't think I would see much benefit from 8k unless it was this close.

Back to the upscaling: if you pop out the upscaling examples of the boat from here: https://www.rtings.com/tv/reviews/samsung/q900-q900r-8k-qled and zoom them in to the same spot, you can quickly switch tabs to see how Samsung's upscaling is doing on their 8k's.


----------



## skupples

It's official, I'm giving in. 5700 XT inbound, just gotta figure out which one. These reviews hurt my head for some reason. Seems like everyone just shills for their favorite. One thread says Red Devils FTW, others say it's garbage, etc etc.


----------



## tpi2007

skupples said:


> It's official, I'm giving in. 5700 XT inbound, just gotta figure out which one. These reviews hurt my head for some reason. Seems like everyone just shills for their favorite. One thread says Red Devils FTW, others say it's garbage, etc etc.



Sapphire Pulse. Problem solved.


----------



## skupples

the only one that won't be here in 48 hours, of course 

I was looking at the red devil.

this is one card that'll never see water while under my ownership.

anyone around here migrate from 1080ti? opinions?


----------



## rluker5

skupples said:


> the only one that won't be here in 48 hours, of course
> 
> I was looking at the red devil.
> 
> this is one card that'll never see water while under my ownership.
> 
> anyone around here migrate from 1080ti? opinions?


Sapphire Pulse has Trixx Boost. Maybe they all will get it, but it looks like a nice feature for Sapphire cards right now: https://www.kitguru.net/gaming/dominic-moass/sapphire-trixx-boost-analysed-dlss-killer/ . The version of Trixx with Boost that I'm using for my Fury Nitro doesn't let me OC, and I have to do that in Wattman. The Fury doesn't OC much anyway, so I usually just undervolt.

Just FYI. Maybe there is a better way to do this; AMD folks tend to make some complicated workarounds that I really haven't gotten into.

And I'm not migrating from a 1080ti. I plan on putting it in the living room in place of my Fury when I upgrade to whatever is the best option for a big upgrade. But that's just me. Your hobby is for you to enjoy.
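For reference, Trixx Boost works by rendering at a reduced resolution and sharpening the output with RIS. A quick sketch of what an 85% scale factor (the commonly cited default; treat the exact number as an assumption) does to the pixel count at 1440p:

```python
# Sketch: pixel-count reduction from a resolution-scale factor like Trixx Boost's.
# The 0.85 scale factor is an assumed default, and the idea that fps scales
# roughly with pixel count is a rule of thumb, not a guarantee.

def scaled_resolution(width: int, height: int, scale: float):
    """Return the scaled dimensions and the fraction of native pixels rendered."""
    w, h = round(width * scale), round(height * scale)
    return w, h, (w * h) / (width * height)

if __name__ == "__main__":
    w, h, frac = scaled_resolution(2560, 1440, 0.85)
    print(f"{w}x{h}, {frac:.0%} of native pixels")  # 2176x1224, ~72%
```

So an 85% scale renders only about 72% of the pixels, which is where most of the "free" performance comes from.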


----------



## skupples

thanks for the advice, maybe i'll go that route if this shows up defective. 

I'm just looking for it to work. It'll even get a fresh windows install. 

no waterblocks, no extreme tweaking.

dial it in, and drive it til 4080ti.


----------



## Section31

I agree. I went away from a waterblocked 5700 XT to an MSI Gaming X 5700 XT (got it for 300 CAD from a friend). Sold the previous 5700 XT and waterblock. Too much extra work (especially for a rig that won't game much). So far it's good.

Going forward, starting with the WIP build in the O11 XL, I am converting the GPU back to soft tubing with QDCs. It's just more convenient to change GPUs without a complete drain. The rest of the system is hard tubing with a drain valve.


----------



## skupples

yeah idk, i haven't really messed with OC since NV locked us outta the buck controllers after GK110, and didn't have much luck on my 1080tis. I mainly block GPUs for volume reasons these days. 

someone has repeated a mantra a few times in this thread about navi not benefiting as much as usual from water due to an incredibly high tjmax type deal? idr, and i'm too lazy to load back into page 38 of the thread.


----------



## ilmazzo

If you say that liquid-cooling the card, which lets you push it to 2100+ (2200-ish, while pumping up the power tables of course), is not "benefiting"..... then yes, it does not benefit that much. On the green side, water just avoids three thermal throttle steps (45MHz) and that's it, since voltage is locked and you can only mess with TDP by grabbing BIOSes left and right. LC right now is for silence and performance stability, not much else....

good ol' days gone, dudes


----------



## Koeman

Hi everyone. I am currently running a xeon 5650 @ 4.40ghz on a Rampage II Gene (bios 1704) with an rx580 and 16gb of hyperx fury ddr3 cl10.

I am playing flawlessly at 1080p but looking to move to a 2k monitor and my main question is:

will a 5700 xt work on my mobo?

I am reading some cases here: https://www.reddit.com/r/AMDHelp/comments/ckxs69/anyone_tried_to_run_the_5700xt_in_an_old_school/

Has anyone tried this on a rampage ii gene?

Thank you in advance.


----------



## skupples

seems like you'd need a custom bios, if the card can even run in 2.0

i'd say you're screwed & that it's a great time to jump onto AMD's new Ryzen chips. That Xeon has had a long life; put it to bed / home server work.


----------



## ilmazzo

Isn't PCIe backward compatible at every level? Modern cards roll back to PCIe 1.x at idle for power saving, so why should 2.0 be a problem?

Since this is OCN and we live in a consumerist world, jumping on a new platform is a must and will cost you about another XT, more or less.......


----------



## skupples

no clue, all conjecture. 

NV cards won't run in full size 4x pci-e slot, for example. 

Reading thru the reddit thread, sounds like he's gonna be boned either way. Whatever the hang up is (bios?)


----------



## looniam

skupples said:


> no clue, all conjecture.
> 
> *NV cards won't run in full size 4x pci-e slot, for example. *
> 
> Reading thru the reddit thread, sounds like he's gonna be boned either way. Whatever the hang up is (bios?)


?????????
i've run both fermi and kepler cards in x4 slot for physX w/o an issue, something "new?"


----------



## skupples

maybe that's because it was for physX? 

maybe i'm thinking of SLI support.


----------



## looniam

skupples said:


> maybe that's because it was for physX?
> 
> maybe i'm thinking of SLI support.


yeah, there is the x4 difference between SLI/XFire. and i think i know what it was... the X79 chipset getting wacky? i never had a HEDT, so i just read the postings about changing the BIOS, and it seemed there was always an NV card in the rig specs. it could just be coincidence, that.

and i think around the 1.0/2.0 transition NV cards might have had an issue (though some "future proof" MBs weren't really) . . so i wouldn't have been surprised.


----------



## skupples

yeah, you can do x16/x16, x16/x8, or x8/x8, but nothing with x4. (NV SLI)

googled


----------



## Koeman

skupples said:


> no clue, all conjecture.
> 
> NV cards won't run in full size 4x pci-e slot, for example.
> 
> Reading thru the reddit thread, sounds like he's gonna be boned either way. Whatever the hang up is (bios?)



Thank you for taking the time to read that thread in the first place; my thoughts exactly. I guess I have to let go of my beloved platform sooner or later...


----------



## skupples

just retire it to a home server.


----------



## rluker5

NV cards run over PCIe 2.0 x1, for normal use or PhysX, and I own a GT 730 that only has a PCIe x1 plug to show it: https://www.userbenchmark.com/UserRun/18097101 
The H81T R2.0 doesn't even have a PCIe slot, so I had to use an adapter. The GT 730 is slower than the iGPU on the 4980HQ, and hanging outside the case ruins the appearance, so I won't be using it like that; I stuck it in my main rig in case I play a game with PhysX.

Just not SLI; that needs x8, or at least imitation PLX x8.
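For anyone still weighing the PCIe 2.0 question from the Rampage II posts above, the link-bandwidth arithmetic is easy to sanity-check. A quick sketch using the standard per-generation transfer rates and encoding overheads (nothing here is card-specific):

```python
# Theoretical one-way PCIe bandwidth: lanes x per-lane rate x encoding efficiency.
# Gen 1/2 use 8b/10b encoding (80% efficient); Gen 3/4 use 128b/130b (~98.5%).
GEN = {  # generation -> (transfer rate in GT/s, encoding efficiency)
    1: (2.5, 8 / 10),
    2: (5.0, 8 / 10),
    3: (8.0, 128 / 130),
    4: (16.0, 128 / 130),
}

def pcie_gbps(gen: int, lanes: int) -> float:
    """Usable one-way bandwidth in GB/s for a given generation and lane count."""
    rate, eff = GEN[gen]
    return rate * eff * lanes / 8  # GT/s -> GB/s (one byte per 8 payload bits)

print(pcie_gbps(2, 16))  # Gen 2 x16: 8.0 GB/s -- what an X58 / Rampage II slot offers
print(pcie_gbps(3, 16))  # Gen 3 x16: ~15.75 GB/s -- what the 5700 XT targets
print(pcie_gbps(2, 1))   # Gen 2 x1: 0.5 GB/s -- the slot rluker5's GT 730 runs in
```

Note that Gen 2 x16 gives roughly the same usable bandwidth as Gen 3 x8, which is part of why the cards themselves usually run fine in older slots; the reported problems seem to be BIOS initialization, not bandwidth.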


----------



## rdr09

https://www.techpowerup.com/260696/...-cooler-offers-replacements-to-current-owners


----------



## skupples

almost like they'd benefit from one actual nerd on their internal QA team...


----------



## Hydroplane

Koeman said:


> Hi everyone. I am currently running a xeon 5650 @ 4.40ghz on a Rampage II Gene (bios 1704) with an rx580 and 16gb of hyperx fury ddr3 cl10.
> 
> I am playing flawlessly at 1080p but looking to move to a 2k monitor and my main question is:
> 
> will a 5700 xt work on my mobo?
> 
> I am reading some cases here: https://www.reddit.com/r/AMDHelp/comments/ckxs69/anyone_tried_to_run_the_5700xt_in_an_old_school/
> 
> Has anyone tried this on a rampage ii gene?
> 
> Thank you in advance.


May not be comparable to a 5700xt, but I know a Titan RTX can work flawlessly on PCI-E 2.0 x16 (LGA 775, in this case). Quite cpu limited though


----------



## skupples

gotta say, this red devil is handling the ringer with ease. running @ 2150 core, messing with the memory one inch slows things down though.


----------



## 99belle99

I ran a 5700 XT on an X58 board over PCIe 2.0 and it worked perfectly. Could play at 4K no problem.


----------



## skupples

in case anyone wasn't aware, AMD is forcing segmentation via clock speeds. This red devil won't go past 2150, period.  I guess protecting the card to reduce RMA occurrence is a good way to bring costs down  

so AMD and NV are officially on the same page, except you can shunt NV stuff still.

i mean, this is so hard coded in that its DIALED IN, when you open MSI-AB. It automatically jumps to 2150/875.


----------



## 99belle99

skupples said:


> in case anyone wasn't aware, AMD is forcing segmentation via clock speeds. This red devil won't go past 2150, period.  I guess protecting the card to reduce RMA occurrence is a good way to bring costs down
> 
> so AMD and NV are officially on the same page, except you can shunt NV stuff still.
> 
> i mean, this is so hard coded in that its DIALED IN, when you open MSI-AB. It automatically jumps to 2150/875.


You can use powerplay tables or morepowertool to go up to 2200MHz if you have the right cooling.


----------



## treetops422

99belle99 said:


> You can use powerplay tables or morepowertool to go up to 2200MHz if you have the right cooling.


Igor went to 2300 core / 1000 RAM stable on the ref XT under water; says you need to be below 45C, maybe under 50C. He said you can go even higher, but doesn't recommend 2300 MHz / 1000 RAM for daily use.
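To put the memory clocks in this exchange in perspective, GDDR6 bandwidth scales linearly with the memory clock. A minimal sketch, assuming the 5700 XT's 256-bit bus and the usual 16-bits-per-pin-per-clock GDDR6 data rate:

```python
# GDDR6 moves 16 bits per pin per memory-clock cycle, so the per-pin data
# rate is clock_mhz * 16 Mbps (875 MHz -> the familiar 14 Gbps spec number).
BUS_WIDTH_BITS = 256  # RX 5700 / 5700 XT memory bus

def gddr6_bandwidth_gbs(clock_mhz: float) -> float:
    """Total memory bandwidth in GB/s for a given GDDR6 clock in MHz."""
    per_pin_gbps = clock_mhz * 16 / 1000
    return per_pin_gbps * BUS_WIDTH_BITS / 8

print(gddr6_bandwidth_gbs(875))   # stock: 448.0 GB/s
print(gddr6_bandwidth_gbs(1000))  # the water OC mentioned above: 512.0 GB/s
```

So the 875 → 1000 MHz jump is about 14% more bandwidth, which lines up with why memory OC shows up so clearly at higher resolutions.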


----------



## Ha-Nocri

skupples said:


> gotta say, this red devil is handling the ringer with ease. running @ 2150 core, messing with the memory one inch slows things down though.


Without any mods/power tables, at 1.2V?

Mine is only stable at 2070/935 MHz, which usually means clocks will constantly be above 2000 MHz in games and gives 5-7% more performance


----------



## PontiacGTX

skupples said:


> in case anyone wasn't aware, AMD is forcing segmentation via clock speeds. This red devil won't go past 2150, period.  I guess protecting the card to reduce RMA occurrence is a good way to bring costs down
> 
> so AMD and NV are officially on the same page, except you can shunt NV stuff still.
> 
> i mean, this is so hard coded in that its DIALED IN, when you open MSI-AB. It automatically jumps to 2150/875.


it can be overridden, but I doubt it will go much further than 2.2; no one has posted stable clocks beyond that


----------



## skupples

Ha-Nocri said:


> Without any mods/power tables, at 1.2V?
> 
> Mine is only stable at 2070/935 MHz, which usually means clocks will constantly be above 2000 MHz in games and gives 5-7% more performance


it's a BIOS table wall, it seems. 2150 is what it clocks itself to in every OC utility I've tried, and it's super stable, so i'm happy 


PontiacGTX said:


> it can be overridden, but I doubt it will go much further than 2.2; no one has posted stable clocks beyond that



yeah that's what i've been seeing. I guess i got a decent sample, running 2150 no issues on air. Not sure I'm even gonna return it when my 2080ti shows up this week.



treetops422 said:


> Igor went to 2300 core / 1000 RAM stable on the ref XT under water; says you need to be below 45C, maybe under 50C. He said you can go even higher, but doesn't recommend 2300 MHz / 1000 RAM for daily use.


seems reasonable, except that it's like the card knows it can't go over 2150; it auto-dials it in. Not sure how I'd get around that, even with a block, aside from changing the BIOS.


----------



## Gunderman456

Wow, you broke down and purchased 2 cards not waiting till after the console releases!!

Patience, man, have patience (tells self).


----------



## skupples

i've learned my lesson trying to be cheap. 

the 5700xt will become a present for someone, and the 2080ti will stay on air as a hold over until ampere titan. 

the 1080TI's have been cleaned, one is up for parts, and one is up for auction. The one for parts is an awesome 2080 in waiting for someone.


----------



## TriWheel

Gunderman456 said:


> Wow, you broke down and purchased 2 cards not waiting till after the console releases!!
> 
> Patience, man, have patience (tells self).


You inspire me, any room left on that bandwagon?


----------



## Gunderman456

TriWheel said:


> You inspire me, any room left on that bandwagon?


All are welcome! I'm waiting to see what AMD's next video card release will be (RX 5800 XT & RX 5900 XT) in relation to price/perf and I'll make a decision then.

Personally, I'm hoping the RX 5700 XT drops in price once the RX 5800 XT and RX 5900 XT are released to permit more reasonable price brackets on the newer cards. If that is the case, I just may pick up an RX 5700 XT for 1440p, 144Hz gaming. Even though I still consider it fleecing, the RX 5700 XT should have been priced at the RX 5700 bracket.

We'll see, the older games I'm playing right now are still manageable.

If things don't pan out and hardware prices remain absurd then I'll relegate myself to being a few years behind when it comes to gaming in terms of both hardware and software.


----------



## Section31

Maybe with intel in the game we go back to the sub 600usd for top tier gpus. Hope intel gets more price aggressive so amd will match and give us even cheaper cpus.


----------



## 99belle99

Section31 said:


> Maybe with intel in the game we go back to the sub 600usd for top tier gpus. Hope intel gets more price aggressive so amd will match and give us even cheaper cpus.


I doubt it. Prices are only going to go up.


----------



## skupples

seems unlikely to budge much

smaller arch = more money
more memory = more money
high end memory = more money
more high end memory with smaller arch = lots more money.


----------



## treetops422

Shell shocker on Newegg radeon 5700 non xt $290. 

Comes with BL3 and 3 months xbox game pass. Only 2 year warranty though. Expires in 16 hours.

https://www.newegg.com/xfx-radeon-r...&cm_sp=Homepage_SS-_-P0_14-150-822-_-11042019


If crossfire had more support I'd grab it ^^


----------



## ZealotKi11er

treetops422 said:


> Shell shocker on Newegg radeon 5700 non xt $290.
> 
> Comes with BL3 and 3 months xbox game pass. Only 2 year warranty though. Expires in 16 hours.
> 
> https://www.newegg.com/xfx-radeon-r...&cm_sp=Homepage_SS-_-P0_14-150-822-_-11042019
> 
> 
> If crossfire had more support I'd grab it ^^


At that price 1660 to 2060S are dead.


----------



## skupples

side note - AMD just demanded POP for redemption.


----------



## treetops422

skupples said:


> side note - AMD just demanded POP for redemption.


 mmm I think you just need to have a 5700 or 5700 xt? idk


ZealotKi11er said:


> At that price 1660 to 2060S are dead.


My uneducated guess is that this is a result of XFX getting slammed by reviewers. Even though those reviewers were not talking about the ref card. I look at Newegg every day, this is only the second time the price has been below $330, XFX both times.


----------



## Section31

skupples said:


> side note - AMD just demanded POP for redemption.


I read online that some people spent hours just to get a new code from AMD. Not sure games that will be on sale soon are worth the time. That, and they are all Epic Games Store exclusives.


----------



## skupples

yep, I had to generate a ticket to register my product, this is weird.


----------



## treetops422

skupples said:


> yep, I had to generate a ticket to register my product, this is weird.


pretty sure you just have to restart your PC after installing the AMD verifier; it's some simple little thing, maybe disabling AV. Here is my old email response from AMD:


Please try the following:

1. Delete any previously used PVT and download the PVT again. The PVT by default is installed to C:\Program Files\AMD Product Verification Tool. Make sure to use the uninstaller.

2. Simply download the PVT from a different browser.

3. Temporarily disable your antivirus software 

4. Be sure to launch the program from your file browser and not your web browser. This has worked with other users having the same issue.


----------



## skupples

or the PVT installer pop up got blocked...
thanks! 

I'll check this evening.


----------



## qwertymac93

treetops422 said:


> mmm I think you just need to have a 5700 or 5700 xt? idk
> 
> My uneducated guess is that this is a result of XFX getting slammed by reviewers. Even though those reviewers were not talking about the ref card. I look at Newegg every day, this is only the second time the price has been below $330, XFX both times.


I bought a reference ASUS 5700 a few weeks ago on the egg for $279 ($299 - $20 MIR). People just don't want the reference cards, which is sad because the blower handles the non-XT's heat just fine.


----------



## skupples

well, i mean... AMD has done an amazing job of not improving their blower AT ALL. So most folks just avoid their stock coolers by default. They really need to do what NV did and drop a stock dual-fan option.


----------



## qwertymac93

The blower on the XT was a mistake, definitely. A dual axial cooler on the XT would have been a good differentiator.


----------



## rdr09

qwertymac93 said:


> The blower on the XT was a mistake, definitely. A dual axial cooler on the XT would have been a good differentiator.


I have a reference XT and it handles temps just as well as the reference 5700. Prolly the Air 540 helps, but i do undervolt mine, so with the AC on, the hotspot does not go over 90C. 

The stock fan setting is pretty slow. I guess by design to keep it silent.


----------



## Diffident

skupples said:


> well, i mean... AMD has done an amazing job of not improving their blower AT ALL. So most folks just avoid their stock coolers by default. They really need to do what NV did and drop a stock dual-fan option.



They did that for the Radeon Vii and it's still just as loud and hot as a blower. Everyone just needs to come to grips that AMD makes suck ass coolers.


----------



## The Robot

Diffident said:


> They did that for the Radeon Vii and it's still just as loud and hot as a blower. Everyone just needs to come to grips that AMD makes suck ass coolers.


Yeah, the VII fell victim to "slim and pretty" hipster design choice over performance. They should just let Sapphire do it. I think they still use blowers because of some stupid policy of "not competing with AIBs".


----------



## ZealotKi11er

The Robot said:


> Yeah, the VII fell victim to "slim and pretty" hipster design choice over performance. They should just let Sapphire do it. I think they still use blowers because of some stupid policy of "not competing with AIBs".


Not really. It's just a problem with 7nm/hotspot. Nvidia is using 12nm and much bigger dies. Wait until their 7nm.


----------



## skupples

Diffident said:


> They did that for the Radeon Vii and it's still just as loud and hot as a blower. Everyone just needs to come to grips that AMD makes suck ass coolers.


we have, long ago. thus why most folks are willing to shell out the extra $20-$50 for the fancy cooler. 

its a bit silly though, triple thick card that's ALSO wider? oy vey. i still like this card though, It's my new hold over card. I have a bad habit of not RMA'ing stuff that should be RMA'd simply because i don't like down time. 

so i'm finally gonna ship off these 1080tis, n see if evga or pny will do me a solid.


----------



## lightsout

So is it worth it to flash a reference 5700? I am going to grab one, just wondering if the stock cooler can handle the heat? 

Also anyone aware of some after market coolers that fit on the 5700's. Not looking to break the bank, but some thing under $50 would be nice.

I have an ITX case so it has to be dual slot. (which I know excludes a lot)


----------



## 99belle99

lightsout said:


> So is it worth it to flash a reference 5700? I am going to grab one, just wondering if the stock cooler can handle the heat?
> 
> Also anyone aware of some after market coolers that fit on the 5700's. Not looking to break the bank, but some thing under $50 would be nice.
> 
> I have an ITX case so it has to be dual slot. (which I know excludes a lot)


I saw a video a while back where the guy tried at least two or three aftermarket coolers that you slap onto the reference card, and after all the tests the stock cooler was just as good.

Now, this was a while back; unless an aftermarket cooler designed specifically for the 5700 has come to market since, which might be better, the coolers he used were generic ones that fit all sorts of different cards.


----------



## lightsout

99belle99 said:


> I saw a video a while back where the guy tried at least two or three aftermarket coolers that you slap onto the reference card, and after all the tests the stock cooler was just as good.
> 
> Now, this was a while back; unless an aftermarket cooler designed specifically for the 5700 has come to market since, which might be better, the coolers he used were generic ones that fit all sorts of different cards.


Yeah, after posting this I looked around, and it seems the stock cooler isn't terrible. I am mostly concerned with noise. If I can get it to a decent level and performance is still solid, I will probably just leave it alone.

Going to see how things go with the card before I make a decision. I know when I had a blower 1080 Ti, that thing would scream.


----------



## ZealotKi11er

Does anyone have a 5700 XT Stock/backplate Cooler for sale?


----------



## keikei

Took a while, but we're here: ASRock Announces Radeon RX 5700 Phantom Gaming Series


----------



## lightsout

Just personal preference but Asrock makes some pretty ugly cards.


----------



## NightAntilli

lightsout said:


> Just personal preference but Asrock makes some pretty ugly cards.


I agree. But they're new here, so, maybe they'll improve over time. I really like the look of the Red Dragon and the Thicc II, although I'd buy a Pulse over anything else at this point.


----------



## keikei

lightsout said:


> Just personal preference but Asrock makes some pretty ugly cards.



Yeah, its an 'attention grabber' no doubt. Lol.


----------



## The Robot

keikei said:


> Yeah, its an 'attention grabber' no doubt. Lol.


They have Asian market in their sights, they like flashy stuff over there.


----------



## Heuchler

GIGABYTE Radeon RX 5700 XT AORUS Edition

Brian - GIGABYTE Community Manager

https://www.reddit.com/r/Amd/comments/dtjztk/official_aorus_rx_5700_xt_coming_soon/


----------



## lightsout

Seeing some cards come out pretty late. Wonder if these guys were testing the market to see if these cards would sell?


----------



## PontiacGTX

lightsout said:


> So is it worth it to flash a reference 5700? I am going to grab one, just wondering if the stock cooler can handle the heat?
> 
> Also anyone aware of some after market coolers that fit on the 5700's. Not looking to break the bank, but some thing under $50 would be nice.
> 
> I have an ITX case so it has to be dual slot. (which I know excludes a lot)


https://es.aliexpress.com/item/4000311120335.html
https://www.newegg.com/p/2YM-0024-00001

the Chinese store's prices have increased, or the low-priced AIOs are OOS; well, this is one choice


----------



## lightsout

PontiacGTX said:


> https://es.aliexpress.com/item/4000311120335.html
> https://www.newegg.com/p/2YM-0024-00001
> 
> the Chinese store's prices have increased, or the low-priced AIOs are OOS; well, this is one choice


I saw that, pretty cheap price. The more I read I keep seeing that reference isn't too bad. We shall see.


----------



## ZealotKi11er

lightsout said:


> I saw that, pretty cheap price. The more I read I keep seeing that reference isn't too bad. We shall see.


The only issue with that AIO is that you have to worry about G6 memory/VRM temperature. The GPU will get very good temps and it will be quiet. The stock cooler is ok if you run stock.


----------



## lightsout

ZealotKi11er said:


> The only issue with that AIO is that you have to worry about G6 memory/VRM temperature. The GPU will get very good temps and it will be quiet. The stock cooler is ok if you run stock.


I'd like to overclock with the morepowertool. But don't expect to overvolt. Not sure how just raising the clocks will affect temps.


----------



## ZealotKi11er

lightsout said:


> I'd like to overclock with the morepowertool. But don't expect to overvolt. Not sure how just raising the clocks will affect temps.


More power means more voltage. There is no way around it. The only thing you can change is the curve. Let's say you do not want to go over 1.1v. You can set 2100MHz @ 1.1v and max out the power. The thing is that you do not need any third-party tools for the 5700 XT if you are not going over 50% power, which will not happen even with 1.2V. The 5700 non-XT is a different story.
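The "more clock needs more voltage, and power follows" reasoning matches the first-order CMOS dynamic power model, P ≈ C·V²·f. A rough sketch; the 1.2 V / 2000 MHz reference point is an assumption for illustration, and only the ratios between operating points are meaningful:

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f. The effective capacitance C
# is unknown for any given chip, so compare operating points as ratios instead
# of absolute watts.
def relative_power(v: float, f_mhz: float, v_ref: float = 1.2, f_ref: float = 2000) -> float:
    """Power at (v, f_mhz) relative to an assumed reference point on the same silicon."""
    return (v / v_ref) ** 2 * (f_mhz / f_ref)

# Hypothetical points from the discussion above:
print(relative_power(1.1, 2100))  # 2100 MHz @ 1.1 V: ~0.88x the reference power
print(relative_power(1.0, 1950))  # a 1000 mV / 1950 MHz undervolt: ~0.68x
```

The squared voltage term is why a modest undervolt saves so much more power than the small clock reduction costs in performance.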


----------



## lightsout

ZealotKi11er said:


> More power means more voltage. There is no way around it. The only thing you can change is the curve. Let's say you do not want to go over 1.1v. You can set 2100MHz @ 1.1v and max out the power. The thing is that you do not need any third-party tools for the 5700 XT if you are not going over 50% power, which will not happen even with 1.2V. The 5700 non-XT is a different story.


Yeah I have a reference 5700 on the way. I haven't messed with them at all, last AMD was R9 270. So got some reading to do. Which I have been doing.


----------



## rdr09

lightsout said:


> Yeah I have a reference 5700 on the way. I haven't messed with them at all, last AMD was R9 270. So got some reading to do. Which I have been doing.


When i switched from the R9 290 to 5700, i just downloaded the latest driver direct from the AMD site and ran it using Express Method. Nothing else.

On my other rig, tho, i switched from GTX 1060 to 5700XT and there i used Windows Uninstall. Others recommend DDU and have success. I did clear the CMOS for this switch. After installing the XT, i just installed the driver again using Express Method. 

With the 5700 at stock but undervolted, the Hot Spot temp stays in check in gaming. It can be hot in the area i live in, so i've got to have the AC on.


----------



## skupples

or you could be like me, and have both installed at the same time.


----------



## Newbie2009

Been playing with undervolting the PC for gaming without headphones.

A quick and dirty undervolt for me is 1000mV @ 1950MHz and 900MHz on the memory.
Fans float from 1600-1700 rpm; can't hear the card over the CPU cooler.

Artifacts at 1000mV @ 2GHz, but I'll see if there is more wiggle room. Peak wattage spikes to about 160W, average about 130W.


----------



## lightsout

rdr09 said:


> When i switched from the R9 290 to 5700, i just downloaded the latest driver direct from the AMD site and ran it using Express Method. Nothing else.
> 
> On my other rig, tho, i switched from GTX 1060 to 5700XT and there i used Windows Uninstall. Others recommend DDU and have success. I did clear the CMOS for this switch. After installing the XT, i just installed the driver again using Express Method.
> 
> With the 5700 at stock but undervolted, the Hot Spot temp stays in check in gaming. It can be hot in the area i live in, so i've got to have the AC on.


Not bad, thanks for the info.

So has this basically become the owners thread? Odd I don't see one.


----------



## PontiacGTX

lightsout said:


> I'd like to overclock with the morepowertool. But don't expect to overvolt. Not sure how just raising the clocks will affect temps.







https://www.youtube.com/watch?v=ojgK4K7nliE&t

You can use a reference RX 5700 XT BIOS and get an 8% boost, but also a noticeable power consumption increase, maybe 50-80W more


----------



## lightsout

PontiacGTX said:


> https://www.youtube.com/watch?v=ojgK4K7nliE&t
> 
> You can use a reference RX 5700 XT BIOS and get an 8% boost, but also a noticeable power consumption increase, maybe 50-80W more


I was looking at doing the same but without the flash. I may get brave after a while and do the flash, bummer AMD used to give us a bios switch. 

I'll check that video out when I get a minute.


----------



## PriestOfSin

lightsout said:


> I was looking at doing the same but without the flash. I may get brave after a while and do the flash, bummer AMD used to give us a bios switch.
> 
> I'll check that video out when I get a minute.


My Red Devil has some sort of toggle switch, anyone know if that's a dual bios switch? Given that I'd never use silent mode (because it's quiet enough as it is), I'd be willing to see if I could get a few gains with a new bios.


----------



## skupples

PriestOfSin said:


> My Red Devil has some sort of toggle switch, anyone know if that's a dual bios switch? Given that I'd never use silent mode (because it's quiet enough as it is), I'd be willing to see if I could get a few gains with a new bios.


that was my assumption; that's what they normally are. how else would you initiate quiet mode besides cutting fan speeds, clocks, and power usage, which would all be BIOS.


----------



## treetops422

lightsout said:


> I was looking at doing the same but without the flash. I may get brave after a while and do the flash, bummer AMD used to give us a bios switch.
> 
> I'll check that video out when I get a minute.


You can always use MPT; no need to flash if you don't want to, if you just want to simulate the stock ref XT with a bit more. I don't think you will have to adjust your fan curve a whole lot to run it like that. It's just 3 settings in MPT and 3 settings in Wattman. First save your BIOS with GPU-Z, then run MPT. Load the BIOS you saved. Here is a pic of the tab in MPT where you need to change the 3 settings. After changing, click Write at the bottom. Exit. Restart your computer.

In Wattman, click Gaming at the bottom, then Global Settings, then Global Wattman. Ramp up your fan if needed. Don't worry, Wattman only has like 4 settings. Yes, you can go higher. There is no need to adjust the voltage on air, or really even water, unless you are trying to set some record. Adjusting the voltage to try to go higher is a last resort. If you are even slightly wary of a BIOS flash, do not alter the voltage limits in MPT. Wattman by default won't let you mess up the voltage.

Max Power 40%
Boost Clock 1905 MHz
Ram 930


MPT
https://www.igorslab.media/morepowe...x-5700-xt-tweaking-and-overclocking-software/


https://www.overclock.net/forum/67-...adeon-vii-tweaking-overclocking-software.html


----------



## ZealotKi11er

lightsout said:


> I was looking at doing the same but without the flash. I may get brave after a while and do the flash, bummer AMD used to give us a bios switch.
> 
> I'll check that video out when I get a minute.


It should be fine. If you mess up the flash you can always use a secondary GPU to reflash. As long as the vBIOS is compatible nothing should break.


----------



## lightsout

treetops422 said:


> You can always use MPT; no need to flash if you don't want to, if you just want to simulate the stock ref XT with a bit more. I don't think you will have to adjust your fan curve a whole lot to run it like that. It's just 3 settings in MPT and 3 settings in Wattman. First save your BIOS with GPU-Z, then run MPT. Load the BIOS you saved. Here is a pic of the tab in MPT where you need to change the 3 settings. After changing, click Write at the bottom. Exit. Restart your computer.
> 
> In Wattman, click Gaming at the bottom, then Global Settings, then Global Wattman. Ramp up your fan if needed. Don't worry, Wattman only has like 4 settings. Yes, you can go higher. There is no need to adjust the voltage on air, or really even water, unless you are trying to set some record. Adjusting the voltage to try to go higher is a last resort. If you are even slightly wary of a BIOS flash, do not alter the voltage limits in MPT. Wattman by default won't let you mess up the voltage.
> 
> Max Power 40%
> Boost Clock 1905 MHz
> Ram 930
> 
> 
> MPT
> https://www.igorslab.media/morepowe...x-5700-xt-tweaking-and-overclocking-software/
> 
> 
> https://www.overclock.net/forum/67-...adeon-vii-tweaking-overclocking-software.html


Thanks a lot, so would that open up potential clocks to 2150 or just 1905?


ZealotKi11er said:


> It should be fine. If you mess up the flash you can always use a secondary GPU to reflash. As long as the vBIOS is compatible nothing should break.


Yeah I have done it a ton of times with my gtx 970 and 780 I believe. Just hesitant this time, hate when I have to play with something and it goes bad.


----------



## lightsout

treetops422 said:


> You can always use MPT; no need to flash if you don't want to, if you just want to simulate the stock ref XT with a bit more. I don't think you will have to adjust your fan curve a whole lot to run it like that. It's just 3 settings in MPT and 3 settings in Wattman. First save your BIOS with GPU-Z, then run MPT. Load the BIOS you saved. Here is a pic of the tab in MPT where you need to change the 3 settings. After changing, click Write at the bottom. Exit. Restart your computer.
> 
> In Wattman, click Gaming at the bottom, then Global Settings, then Global Wattman. Ramp up your fan if needed. Don't worry, Wattman only has like 4 settings. Yes, you can go higher. There is no need to adjust the voltage on air, or really even water, unless you are trying to set some record. Adjusting the voltage to try to go higher is a last resort. If you are even slightly wary of a BIOS flash, do not alter the voltage limits in MPT. Wattman by default won't let you mess up the voltage.
> 
> Max Power 40%
> Boost Clock 1905 MHz
> Ram 930
> 
> 
> MPT
> https://www.igorslab.media/morepowe...x-5700-xt-tweaking-and-overclocking-software/
> 
> 
> https://www.overclock.net/forum/67-...adeon-vii-tweaking-overclocking-software.html


Any reason you picked 1905 here?


----------



## PriestOfSin

Ran Crytek's new ray tracing benchmark for lolz and discovered that my 5700 XT is beating my VII in this test. It's not apples to apples (so this shouldn't be taken terribly seriously), but the 5700 XT system is overall slower than my VII system anyway. Looks like RDNA is better at ray tracing than Vega is?


----------



## ZealotKi11er

PriestOfSin said:


> Ran Crytek's new ray tracing benchmark for lolz and discovered that my 5700 XT is beating my VII in this test. It's not apples to apples (so this shouldn't be taken terribly seriously), but the 5700 XT system is overall slower than my VII system anyway. Looks like RDNA is better at ray tracing than Vega is?


RT has a lot to do with cache, so RDNA is better than the older GCN in the R7.


----------



## treetops422

lightsout said:


> Any reason you picked 1905 here?


It's the stock 5700 XT core boost. You can go higher if you like.
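If anyone wants to sanity-check what the "Ram 930" part of those settings buys you, here's a quick sketch. Two assumptions on my part (not read off the card): Wattman's memory number is the GDDR6 clock in MHz, and GDDR6 moves 16 bits per pin per clock on Navi 10's 256-bit bus.

```python
# Rough peak-bandwidth math for GDDR6 on a 256-bit bus.
# Assumption: Wattman's "Ram" value is the memory clock in MHz and
# GDDR6 transfers 16 bits per pin per clock (14 Gbps at 875 MHz).

def gddr6_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 256) -> float:
    """Peak bandwidth in GB/s: clock (MHz) * 16 transfers * bus width in bytes."""
    return mem_clock_mhz * 16 * (bus_width_bits // 8) / 1000

print(gddr6_bandwidth_gbs(875))  # stock: 448.0 GB/s
print(gddr6_bandwidth_gbs(930))  # the 930 OC: about 476.16 GB/s
```

So the 930 setting is roughly a 6% bump in peak bandwidth over stock, if those assumptions hold.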


----------



## ZealotKi11er

treetops422 said:


> It's the stock 5700 XT core boost. You can go higher if you like.


Nope. 1905MHz is not stock for 5700 XT. Mine says 2064MHz.


----------



## skupples

1905 is the "base clock" on my red devilXT


----------



## treetops422

The ref 5700 XT's. He bought the ref non-XT and spoke of updating the bios to an XT, so it's just somewhere to start, so he can see how easy it is. And if he wants the normal stock non-XT to stock XT bios "boost", the starting point I gave is pretty much the same thing, with higher ram.

Boost Clock 1905 MHz


----------



## lightsout

treetops422 said:


> The ref 5700 XT, he bought the ref non xt, spoke of updating the bios to an XT, it's just somewhere to start. So he can see how easy it is and if he wants the normal stock non xt to stock xt bios "boost", the starting point I gave is pretty much the same thing. With higher ram.
> 
> Boost Clock 1905 MHz




Perfect. Thanks man appreciate the info. I'll report once I have played with it. Is 40% a good number on power for a stock cooler?


----------



## rdr09

lightsout said:


> Perfect. Thanks man appreciate the info. I'll report once I have played with it. Is 40% a good number on power for a stock cooler?


Enjoy your card at stock first. The first thing we peasants coming from low-end cards notice going into our games is being able to push the settings just a little higher. If your ambient is high, undervolt to just a tad over a volt first, test, and go from there. Use HWiNFO to monitor your temps and clocks. Remember, the stock fan setting is a bit on the low side to keep it silent. I never really mess with it.


----------



## lightsout

rdr09 said:


> Enjoy your card at stock first. The first thing we peasants coming from low-end cards notice going into our games is being able to push the settings just a little higher. If your ambient is high, undervolt to just a tad over a volt first, test, and go from there. Use HWiNFO to monitor your temps and clocks. Remember, the stock fan setting is a bit on the low side to keep it silent. I never really mess with it.




Yeah, that's good advice. I like to save OCing for a little while usually, to not spoil all the fun up front. 

Downgraded from a 1080ti a while ago because I wasn't gaming. But I caught the itch again.


----------



## Ha-Nocri

My card's stock max boost is 1990 MHz, but it isn't stable above 2070.


----------



## rdr09

lightsout said:


> Yeah, that's good advice. I like to save OCing for a little while usually, to not spoil all the fun up front.
> 
> Downgraded from a 1080ti a while ago because I wasn't gaming. But I caught the itch again.


It will be a bit slower than the 1080 Ti, but you should be able to set your games similarly.



Ha-Nocri said:


> My card's stock max boost is 1990 MHz, but it isn't stable above 2070.


My reference XT only boosts to 1975MHz. Really depends on the ambient.


----------



## treetops422

lightsout said:


> Perfect. Thanks man appreciate the info. I'll report once I have played with it. Is 40% a good number on power for a stock cooler?


Ran fine at that max power running benchmarks. I think I had 26-28c ambient air temps. Looked up some of my very unorganized notes. You might not even have to up your fan curve at those settings. 40% max power seems to be a sweet spot. The benchmarks I ran though only lasted maybe 10 minutes, aside from the pre-bench GPU-Z heatup I did below. Gaming at 100% usage for hours on end will probably make it hotter.



Run 1 - stock fan curve, 40% max power, stock OC (core boost 1850, ram 930):
Graphics score 24,260 / tj 93c / 189 watts / 76c max edge temp (after a 15 min GPU-Z render warmup)

Run 2 - stock fan curve, 41% power, 2000 core boost, 930 ram:
Graphics score 25,330 / 214 watts / tj 102c / gpu 81c


p.s. I took the vent plate thing off for more outgoing airflow idk how much that matters
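If you want to guess where those wattage numbers should land before running anything, the power slider is just a percentage on top of the card's base ASIC power limit. The ~150 W base figure below is my assumption for the reference non-XT; MPT shows the real value for your bios.

```python
# What the Wattman/MPT "Max Power %" slider works out to in watts.
# The 150 W base limit below is an assumed figure for the reference
# RX 5700; read the actual limit out of MPT for your own card.

def power_ceiling_w(base_limit_w: float, max_power_pct: float) -> float:
    """New power ceiling after raising the power slider by max_power_pct."""
    return base_limit_w * (1 + max_power_pct / 100)

print(power_ceiling_w(150, 40))  # ~210 W ceiling at +40%
print(power_ceiling_w(150, 41))  # ~211.5 W, in the ballpark of the 214 W logged above
```

The measured draws landing just under those ceilings suggests the card was riding the power limit during the 41% run.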


----------



## lightsout

treetops422 said:


> Ran fine at that max power running benchmarks. I think I had 26-28c ambient air temps. Looked up some of my very unorganized notes . You might not even have to up your fan curve at those settings. 40% max power seems to be a sweet spot. The benchmarks I ran though only lasted for maybe 10 minutes, aside from the pre gpu-z heatup I did below. Gaming at 100% usage for hours on end will probably make it hotter.
> 
> 
> 
> Run 1 - stock fan curve, 40% max power, stock OC (core boost 1850, ram 930):
> Graphics score 24,260 / tj 93c / 189 watts / 76c max edge temp (after a 15 min GPU-Z render warmup)
> 
> Run 2 - stock fan curve, 41% power, 2000 core boost, 930 ram:
> Graphics score 25,330 / 214 watts / tj 102c / gpu 81c
> 
> 
> p.s. I took the vent plate thing off for more outgoing airflow idk how much that matters




Thanks for the results. The card finally shows up today. UPS ground cross-country is slooooooow.


----------



## lightsout

treetops422 said:


> You can always use MPT, no need to flash it if you don't want to. If you want to simulate the stock ref XT. With a bit more. I don't think you will have to adjust your fan curve a whole lot to run it like that. It's just 3 settings in MPT and 3 settings in Wattman. First save your bios with GPU-Z, then run MPT. Load the bios you saved. Here is a pic of the tab in MPT where you need to change the 3 settings. After changing, click write at the bottom. Exit. Restart your computer.
> 
> In Wattman, click Gaming on the bottom, Global Settings and then Global Wattman. Ramp up your fan if needed. Don't worry, Wattman only has like 4 settings. Yes you can go higher. There is no need to adjust the voltage on air or really even water unless you are trying to set some record. Adjusting the voltage to try to go higher is a last resort. If you are even slightly wary of a bios flash, do not alter the voltage limits in MPT. Wattman by default won't let you mess up the voltage.
> 
> Max Power 40%
> Boost Clock 1905 MHz
> Ram 930
> 
> 
> MPT
> https://www.igorslab.media/morepowe...x-5700-xt-tweaking-and-overclocking-software/
> 
> 
> https://www.overclock.net/forum/67-...adeon-vii-tweaking-overclocking-software.html




Got the card and changed these three settings. Put the max to 2000; so far it seems like I'm settling in the mid 1800s at 20% power. Raising all the way to 40% saw clocks in the low 1900s, but temps were high.

Wattman is pretty lame. Can't say I like it. Would much rather use AB, but I can't seem to get it to let me change clocks. The slider just says 0. It had numbers at one point but didn't seem to do anything. 

Also, I made another thread, but I'm getting a lot of freezing in COD. Basically any time a round starts the system locks up for about 10 seconds. Or if I alt-tab in and out of the game it does that every time I switch back. This is with or without any OC. 

I don't think it's a dud card because other than that it's really solid. Always seems like AMD cards are a pain though. Nvidia for me with my staple Afterburner is so easy.

But this thing smokes the 2060 I had and I only paid $275. You have to spend a lot more to get equal performance with Nvidia.


----------



## paulerxx

What does everyone's Wattman settings look like?


----------



## Ha-Nocri

I use this daily. So, the reference card has 1905 MHz; what is the voltage?


----------



## lightsout

How do I unlock core overclock on Afterburner with a reference card? I swear it worked at one point. Or at least the numbers were there. Now I just see a 0 and the slider won't move.

EDIT#1 Got it working, trying to get it with the registry unlocks and it seems they don't want to play nice.

EDIT #2 Started over and I think I am good with AB and MPT.


----------



## ZealotKi11er

Ha-Nocri said:


> I use this daily. So, the reference card has 1905 MHz; what is the voltage?


What do you have stock? Mine scales all the way to 1200mv @ 2064mhz


----------



## skupples

i uhhhh... lol.


----------



## lightsout

skupples said:


> i uhhhh... lol.


What in the world.


----------



## Ha-Nocri

ZealotKi11er said:


> What do you have stock? Mine scales all the way to 1200mv @ 2064mhz


Stock is 1990 MHz/1175 mV, but it rarely stays at that voltage because of the 200W power limit. Mine goes to 2070 at 1.2V stable.
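The power limit trimming the voltage makes sense if you ballpark it: dynamic power scales roughly with frequency times voltage squared. This is only a toy model, and the ~180 W draw at the stock point is my assumption for illustration, not a measured figure.

```python
# Toy estimate of power draw at a new clock/voltage point, using the
# P ~ f * V^2 rule of thumb for dynamic power. The 180 W baseline at
# 1990 MHz / 1175 mV is an assumed draw, purely for illustration.

def est_power_w(p0_w: float, f0_mhz: float, v0_mv: float,
                f_mhz: float, v_mv: float) -> float:
    """Scale a known draw p0_w from (f0, v0) to a new (f, v) point."""
    return p0_w * (f_mhz / f0_mhz) * (v_mv / v0_mv) ** 2

# Assumed ~180 W at 1990 MHz @ 1175 mV, pushed to 2070 MHz @ 1200 mV:
print(round(est_power_w(180, 1990, 1175, 2070, 1200), 1))  # ~195 W
```

Still under the 200 W limit in this model, which squares with 2070 MHz at 1.2 V being sustainable.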


----------



## treetops422

I like to set the 1.2V point where my boost usually peaks. I have no clue if this is the smart way to do it. btw my fan curve is just for the memory, see sig for why


----------



## 113802

This card has been out since July and it still doesn't have an owners thread?


----------



## skupples

welcome to newest revision OCN. They aren't saying get out, but they totally wouldn't mind if none of us were here tomorrow.


----------



## treetops422

WannaBeOCer said:


> This card has been out since July and it still doesn't have an owners thread?


There is an awesome owners thread here with an insane amount of info. It's for the XT. They have a non-XT owners club too; I think I'm the only member lol.
https://forums.overclockers.co.uk/threads/the-radeon-rx-5700-xt-owners-thread.18859271/
Sadly we're behind.


----------



## Gunderman456

The OP should rename this the owner's thread and ask a Mod to move it, or heck a Mod could just do it!


----------



## paulerxx

ZealotKi11er said:


> What do you have stock? Mine scales all the way to 1200mv @ 2064mhz


Those are my stock settings, ASUS TUF Gaming.

And yes, this should be the RX 5700 series owners' club since that's what we've turned it into!


----------



## skupples

if only mods existed to complete these simple housekeeping tasks... (nope, not volunteering, at all)


----------



## lightsout

skupples said:


> welcome to newest revision OCN. They aren't saying get out, but they totally wouldn't mind if none of us were here tomorrow.




Not sure how that's ocn's fault. It's usually just one of the early adopters that starts it. Someone start one and let's all jump over there. I've been wondering the same thing.


----------



## skupples

well, at least back in the day, OCN would grant the "official" status. that's what made it official, thus stickied.

either way, i assume OP could easily edit the title. Seems the folks are on vacation, and they left a fridge full of beer.


----------



## lightsout

skupples said:


> well, at least back in the day, OCN would grant the "official" status. that's what made it official, thus stickied.
> 
> 
> 
> either way, i assume OP could easily edit the title. Seems the folks are on vacation, and they left a fridge full of beer.




Yes but a thread would have been started in the proper section, and then moved to official status. Things aren't what they used to be around here but this seems like a stretch to pub on the site. Maybe back in the day mods would have been more proactive with something like this.

But really all it takes is a user to start a thread. Unless we are hoping all the info from this one is carried over.


----------



## ZealotKi11er

WannaBeOCer said:


> This card has been out since July and it still doesn't have an owners thread?


Buy a 5700 XT and make an Owners Club.


----------



## Ashura

skupples said:


> welcome to newest revision OCN. They aren't saying get out, but they totally wouldn't mind if none of us were here tomorrow.


lol



lightsout said:


> skupples said:
> 
> 
> 
> well, at least back in the day, OCN would grant the "official" status. that's what made it official, thus stickied.
> 
> either way, i assume OP could easily edit the title. Seems the folks are on vacation, and they left a fridge full of beer.
> 
> 
> 
> Yes but a thread would have been started in the proper section, and then moved to official status. Things aren't what they used to be around here but this seems like a stretch to pub on the site. Maybe back in the day mods would have been more proactive with something like this.
> 
> But really all it takes is a user to start a thread. Unless we are hoping all the info from this one is carried over.

Yes, it seems mods aren't really active, & those who are active aren't mods.


----------



## 113802

ZealotKi11er said:


> Buy a 5700 XT and make an Owners Club.


Why would I downgrade? Plus, I won't be able to get any work completed with a Navi card since there isn't any ROCm support for it, and it has half the memory. I'm hoping AMD releases a Frontier Edition of whatever architecture replaces Vega.


----------



## skupples

Ashura said:


> lol
> 
> 
> 
> Yes, It seems mods aren't really active, & those who are active aren't mods.


the active mods I know of live in off topic & news only, so i guess they're just ignoring it. no farks given. I may have to go check on my old forum accounts, see who's still up, and who's down. I consider reddit going public to essentially be the great forum purge. Reddit acted as a magnet for the clickbait fiends, and as a result left sites like OCN in the dust. Oh, it also captured the folks that learned about 4chan the hacker via CNN news.


----------



## BradleyW

I am soon to join the club. Have a 5700 XT (Sapphire Nitro+) on the way!


----------



## looniam

BradleyW said:


> I am soon to *start* the club. Have a 5700 XT (Sapphire Nitro+) on the way!



FTFY


----------



## ilmazzo

Yeah, it is a shame, no club after months of it being available


----------



## melodystyle2003

Quite a powerful beast, but junction temp goes to the high side, causing high rpm fan speeds. What's the safe temp to run this hot spot? Any workaround to reduce its readings? TIA as always
http://www.3dmark.com/fs/21019577


----------



## Ha-Nocri

Which card is that? Think the J. temp is on the high side for 220W and 3200 RPM


----------



## melodystyle2003

Ha-Nocri said:


> Which card is that? Think the J. temp is on the high side for 220W and 3200 RPM


Gigabyte 5700 XT Gaming OC. And it is also undervolted.

Here it is with [email protected]
http://www.3dmark.com/fs/21019665


----------



## Ha-Nocri

I have the same card and had high JT too, but it is not that bad anymore, might be that thermal paste needed time to cure.

Also, it doesn't matter what voltage is set if power(W) is the same (limited by power limit)


----------



## melodystyle2003

Ha-Nocri said:


> I have the same card and had high JT too, but it is not that bad anymore, might be that thermal paste needed time to cure.
> 
> Also, it doesn't matter what voltage is set if power(W) is the same (limited by power limit)


OK, I've also read that it is ok to have JT under 110°C. Not sure though about the validity of the articles that claim that; I surely need to dig in more. I will monitor how it goes; maybe I will change the thermal paste with a better one and tighten the screws a bit more, depending on the outcome, mood and free time. I do have many enquiries about the optimal performance of the rx 5700 xt in general, thus I always remain open to suggestions, comments, references and links regarding that subject. TIA


----------



## ilmazzo

AMD stated that below 110c you get no throttling; otherwise the card would take care of itself.


----------



## epic1337

ilmazzo said:


> AMD stated that below 110c you get no throttling; otherwise the card would take care of itself.


you know, this got me thinking. AMD may be saying it won't throttle, but will it boost when the chip is so hot?


----------



## tripleflip

I really hope these cards are going to be more sturdy with continued use, like the older generation cards. I heard the Vega series cards could easily blow capacitors and it could be a revolving RMA nightmare. I love the Radeon cards, I just hope they can last 5 years with some heavy usage.


----------



## treetops422

Gamers Nexus would disagree; 110c tj is not normal on any 5700 xt or non xt, even though it's rated as not throttling at that temp. At stock, from everything I've seen, that should not be happening. I'm no expert, but I've been watching a ton of reviews on Navi, plus my own exp with the non-xt ref card.


----------



## qwertymac93

melodystyle2003 said:


> Gigabyte 5700xt gaming oc. And it is also undervolted..
> 
> Here it is with [email protected]
> http://www.3dmark.com/fs/21019665




I'm fairly certain you should not have such a large delta between edge temp and hotspot temp. The delta goes up as voltage increases, but a full 35c seems excessive. My stock (blower) non-XT has a delta under 15c. If your card doesn't have tamper seals (or they aren't enforceable in your country) you should try and remount your cooler with a good thermal paste; it's possible the factory messed up the mount. You could also research your particular model and see if the issue is inherent to its cooler design.
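If you're logging edge and junction temps (HWiNFO etc.), the delta check is trivial to script. The 20c "remount?" threshold below is only a rule of thumb based on the deltas people are reporting in this thread, not an AMD spec, and the readings are made up for the example:

```python
# Worst-case edge-vs-junction delta from a list of logged samples.
# The 20c threshold is a rule of thumb, not an AMD spec; the sample
# readings below are hypothetical.

def max_hotspot_delta(samples):
    """samples: iterable of (edge_c, junction_c) pairs -> worst delta seen."""
    return max(junction - edge for edge, junction in samples)

readings = [(66, 80), (71, 101), (68, 96)]  # hypothetical HWiNFO log
delta = max_hotspot_delta(readings)
print(delta)                   # 30
print("remount?", delta > 20)  # remount? True
```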


----------



## melodystyle2003

qwertymac93 said:


> I'm fairly certain you should not have such a large delta between edge temp and hotspot temp. The delta goes up as voltage increases, but a full 30c seems excessive. My stock (blower) non-XT has a delta less than half that. If your card doesn't have tamper seals (or they aren't enforceable in your country) you should try and remount your cooler with a good thermal paste; it's possible the factory messed up the mount. You could also research your particular model and see if the issue is inherent to its cooler design.


I do believe the same as you. I tried to tighten the screws, and indeed 3 of the 4 holding the cooler to the gpu core were so loose that they were almost freely spinning. The same applied to 2 of the 3 VRM screws. But the 'problem' remains; I can easily observe a delta between core and JT of over 35°C. In this particular model, JT tends to hit 90°C (i.e. Gamers Nexus reviewed this card and averaged ~89°C), but mine goes up to 101°C at stock settings after just one Unigine Valley run with the ExtremeHD preset. The next step is what you have said.


----------



## melodystyle2003

Well, it needed to be reapplied; the stock TIM was dry. JT now is 5°C less, with a delta of 30°C, which I still find high.


----------



## Ha-Nocri

I was also thinking of putting a thermal pad between the GPU and back-plate. It might help, but probably only by a few degrees.

EDIT: just ran Valley on ExtremeHD, max JT was 85c, default card settings. Not sure if you have the newest bios; they changed the fan speed to go up to 2400 RPM


----------



## melodystyle2003

Ha-Nocri said:


> I was also thinking of putting a thermal pad between the GPU and back-plate. It might help, but probably only by a few degrees.
> 
> EDIT: just ran Valley on ExtremeHD, max JT was 85c, default card settings. Not sure if you have the newest bios; they changed the fan speed to go up to 2400 RPM


Thanks for this pic; mine's JT definitely goes up to 101°C at stock settings using the latest bios from Gigabyte.


----------



## PontiacGTX

epic1337 said:


> you know, this got me thinking. AMD may be saying it won't throttle, but will it boost when the chip is so hot?


Unlike Vega, this chip will hold the boost better, or so I've seen in reviews of the reference model and aftermarket cards.


----------



## lightsout

melodystyle2003 said:


> Thanks for this pic, mine's JT definitely goes up to 101°C with stock setting using the latest bios from gigabyte.


Man that's hot. Reference 5700 here at 1800mhz in a 69f room: edge temps in the high 60's and hotspot in the high 70's. 

When the room is warmer it's more like mid 70's and low 80's.


----------



## doom26464

How are the drivers and stability of these cards today? Nvidia cards never give me fuss and it's been a very long time since I played with an AMD card. 

Pricing out builds, and for the first time in a long-ass time I'm actually looking at AMD cards. 

I see average reviews have the 5700xt beating the 2060 Super by 10-13%. Pricing here in CAN has it roughly 30 bucks cheaper, give or take.


----------



## ZealotKi11er

doom26464 said:


> How are drivers and stability of these card today? Nvidia cards never give me fuss and its been a very long time Since I played with an AMD card.
> 
> Pricing out builds and for the first time in a long ass time Im actually looking at AMD cards.
> 
> I see average reviews have 5700xt beating the 2060 super 10-13%. Pricing here in CAN has it by roughly 30 bucks cheaper give or take.


No issues here. I think the most annoying thing with both Nvidia and AMD is the constant release of drivers. With Navi, in general, I had a better experience than with Vega 64.


----------



## BradleyW

looniam said:


> BradleyW said:
> 
> 
> 
> I am soon to *start* the club. Have a 5700 XT (Sapphire Nitro+) on the way!
> 
> 
> 
> 
> FTFY

Ha, I like the edit you did to my quote!

Need some advice please, everyone. I have the choice of the Nitro and Red Devil 5700 XT.

The Red Devil is better by 2c on the core and a smidge quieter, with 50w less draw and more VRM capability. The Nitro seems to have better cooling for the VRM and memory, larger fans, and is 2-3 FPS faster in every review I can find.

Why is the Nitro faster? Is it just clocked more aggressively via its bios? Can the Red Devil match or surpass it via upping things in Wattman? 

Basically, is it just software that sets them apart? The Red Devil appeals to me, but I want the performance of the Sapphire.

Thank you.


----------



## rdr09

BradleyW said:


> Ha, I like the edit you did to my quote!
> 
> Need some advice please everyone. I have the choice of the nitro and red devil. 5700xt.
> 
> Red devil is better by 2c on the core, and a smidge quieter, with 50w less draw and more vrm capability. Nitro seems to have better cooling for vrm, memory, larger fans, 2-3 FPS faster in every review I can find.
> 
> Why is the nitro faster? Is it just clocked more aggressively via its bios? Can the red devil match or surpass it via upping things in wattman?
> 
> Basically is it just software what sets them apart? The red devil appeals to me, but I want the performance of the sapphire.
> 
> Thank you.


Don't you have a custom loop? Why not one that comes with a block?


----------



## BradleyW

rdr09 said:


> BradleyW said:
> 
> 
> 
> Ha, I like the edit you did to my quote!
> 
> Need some advice please everyone. I have the choice of the nitro and red devil. 5700xt.
> 
> Red devil is better by 2c on the core, and a smidge quieter, with 50w less draw and more vrm capability. Nitro seems to have better cooling for vrm, memory, larger fans, 2-3 FPS faster in every review I can find.
> 
> Why is the nitro faster? Is it just clocked more aggressively via its bios? Can the red devil match or surpass it via upping things in wattman?
> 
> Basically is it just software what sets them apart? The red devil appeals to me, but I want the performance of the sapphire.
> 
> Thank you.
> 
> 
> 
> Isn't you have a custom loop? Why not one that comes with a block?

The card will be in a different system.

What sets the Nitro and Devil apart if they are the same card? Software? Silicon lottery? Every review has the Nitro performing better, yet the Devil has technical advantages.

https://bit-tech.net/reviews/tech/graphics/powercolor-radeon-rx-5700-xt-red-devil-review/11/

Edit:

After research, I think the Nitro uses better silicon compared to most Devil cards. Every review has the Nitro not only winning out of the box, but also overclocking further than the Devil. If the Nitro were winning purely due to bios settings, software overclocking would override that, putting both cards level. That leaves just the factor of silicon quality, and the Nitro cards consistently show they may be better in this respect, as they clock higher.


----------



## rdr09

BradleyW said:


> The card will be in a different system.
> 
> What sets the nitro and devil apart if they are the same card? Software? Silicone lottery? Every review has the nitro performing better, yet the devil has technical advantages.
> 
> https://bit-tech.net/reviews/tech/graphics/powercolor-radeon-rx-5700-xt-red-devil-review/11/
> 
> Edit:
> 
> After research, I think the nitro uses better silicone compared to most devil cards. Every review has the nitro not only winning out of the box, but further overclocking better than the devil. If the nitro was winning purely due to bios settings, the software overclocking would override, putting both cards level. So leaving just the factor of silicone quality, which the nitro cards also consistently show that they may be better in this respect as they clock higher.


It seems the nitro boosts higher in most scenarios. Scroll down to the Actual Clock Rates Table.

https://www.computerbase.de/2019-09/sapphire-radeon-rx-5700-xt-nitro-plus-test/


----------



## BradleyW

rdr09 said:


> BradleyW said:
> 
> 
> 
> The card will be in a different system.
> 
> What sets the nitro and devil apart if they are the same card? Software? Silicone lottery? Every review has the nitro performing better, yet the devil has technical advantages.
> 
> https://bit-tech.net/reviews/tech/graphics/powercolor-radeon-rx-5700-xt-red-devil-review/11/
> 
> Edit:
> 
> After research, I think the nitro uses better silicone compared to most devil cards. Every review has the nitro not only winning out of the box, but further overclocking better than the devil. If the nitro was winning purely due to bios settings, the software overclocking would override, putting both cards level. So leaving just the factor of silicone quality, which the nitro cards also consistently show that they may be better in this respect as they clock higher.
> 
> 
> 
> It seems the nitro boosts higher in most scenarios. Scroll down to the Actual Clock Rates Table.
> 
> https://www.computerbase.de/2019-09/sapphire-radeon-rx-5700-xt-nitro-plus-test/

What is the reason for it boosting higher? Better silicon quality or software? Surely I can just set a higher power target with the Devil to achieve higher clocks like the Nitro? However, the reviews I've looked at show the Nitro consistently overclocks higher and still takes the performance lead over an overclocked Devil.


Whatever the reason may be, the review you linked also points out that the Nitro is the best version for performance, overclocking and cooling. I think I might stick with the Nitro.

Cheers.


----------



## lightsout

Someone start an owners thread yet?


----------



## treetops422

BradleyW said:


> What is the reason for it boosting higher? Better silicone quality or software? Surely I can just set a higher power target with the devil to achieve higher clocks like the nitro? However, these reviews I've looked at show the nitro consistently overclockes higher and still takes the performance lead over an over clocked devil.
> 
> 
> Whatever the reason may be, that review you linked also point out that the nitro is the best version for performance, overclocking and cooling. I think I might stick with the nitro.
> 
> Cheers.


Every single XT will overclock the same with Wattman; some will just be louder than others. Unless you use MPT or change the BIOS.


----------



## BradleyW

treetops422 said:


> BradleyW said:
> 
> 
> 
> What is the reason for it boosting higher? Better silicone quality or software? Surely I can just set a higher power target with the devil to achieve higher clocks like the nitro? However, these reviews I've looked at show the nitro consistently overclockes higher and still takes the performance lead over an over clocked devil.
> 
> 
> Whatever the reason may be, that review you linked also point out that the nitro is the best version for performance, overclocking and cooling. I think I might stick with the nitro.
> 
> Cheers.
> 
> 
> 
> Every single XT will overclock the same with Wattman; some will just be louder than others. Unless you use MPT or change the BIOS.

This is what I thought, mate. However, every review comparing the Nitro and Devil shows that when both are overclocked via Wattman, the Nitro still takes the lead. This is why I thought perhaps Sapphire are picking better silicon to match their aggressive bios profile. Just a thought, not concrete, but the Nitro looks the better of the two despite the Devil having a slightly more refined PCB.


----------



## Gunderman456

treetops422 said:


> Every single XT will overclock the same with Wattman; some will just be louder than others. Unless you use MPT or change the BIOS.


True, that is why Gamers Nexus would not run benchmarks on every card, since they would all clock close to each other on FPS. Some cards would throttle before others, but that could be eliminated by upping the fans, which generated more noise on the cheaper models. So what the Sapphire Nitro and the Red Devil are affording you is less noise, since they all seem to be able to go over 2000MHz, with some needing more tweaking than others.

If I ever buy one, I would buy reference since I don't mind extra noise anyway. Or the best recommended and cheaper of the AIB models - the Sapphire Pulse.


----------



## PontiacGTX

BradleyW said:


> What is the reason for it boosting higher? Better silicone quality or software? Surely I can just set a higher power target with the devil to achieve higher clocks like the nitro? However, these reviews I've looked at show the nitro consistently overclockes higher and still takes the performance lead over an over clocked devil.
> 
> 
> Whatever the reason may be, that review you linked also point out that the nitro is the best version for performance, overclocking and cooling. I think I might stick with the nitro.
> 
> Cheers.


Have you checked this ASRock card? https://www.techpowerup.com/review/asrock-radeon-rx-5700-xt-taichi-oc-plus/32.html

It seems to be the second-highest-clocking card based on TPU.

https://www.computerbase.de/2019-09...hnitt_die_besten_getesteten_radeon_rx_5700_xt

Here the Asus seems the best performing, but the quietest is the PowerColor.


----------



## BradleyW

PontiacGTX said:


> BradleyW said:
> 
> 
> 
> What is the reason for it boosting higher? Better silicone quality or software? Surely I can just set a higher power target with the devil to achieve higher clocks like the nitro? However, these reviews I've looked at show the nitro consistently overclockes higher and still takes the performance lead over an over clocked devil.
> 
> 
> Whatever the reason may be, that review you linked also point out that the nitro is the best version for performance, overclocking and cooling. I think I might stick with the nitro.
> 
> Cheers.
> 
> 
> 
> Have you checked this asrock card? https://www.techpowerup.com/review/asrock-radeon-rx-5700-xt-taichi-oc-plus/32.html
> 
> it seems the 2nd card that reachest highest clock based on TPU
> 
> https://www.computerbase.de/2019-09...hnitt_die_besten_getesteten_radeon_rx_5700_xt
> 
> here the asus seems the best performing but the quietest is the powercolor

The Red Devil is only the quietest if you run the card with the economical bios.


----------



## skupples

BradleyW said:


> What is the reason for it boosting higher? Better silicone quality or software? Surely I can just set a higher power target with the devil to achieve higher clocks like the nitro? However, these reviews I've looked at show the nitro consistently overclockes higher and still takes the performance lead over an over clocked devil.
> 
> 
> Whatever the reason may be, that review you linked also point out that the nitro is the best version for performance, overclocking and cooling. I think I might stick with the nitro.
> 
> Cheers.


i believe it's memory temp? errrrg, lemme find it. Either way, with a bit of work they'll be identical. Case airflow would probably come into play before anything else. 

Aside from being YUUUUUUUUUUUGE, the Red Devil was a really nice card. Think I'll stick to waterblocks though. I've already got enough fan noise from the 3x thick and dense 480s.


----------



## SoloCamo

I've been contemplating moving to a 5700 XT, if only so I can upgrade the V56 in my VR box to my V64 and give the V56 to my nephew as an upgrade over his 1060 6GB.

Is a 5700xt even worth it over a V64 that holds 1630mhz core / 1100mhz HBM2 consistently during extended gaming? Mostly game at 3200x1800 to 4k depending on title.


----------



## rdr09

SoloCamo said:


> I've been contemplating moving to a 5700 XT, if only so I can upgrade the V56 in my VR box to my V64 and give the V56 to my nephew as an upgrade over his 1060 6GB.
> 
> Is a 5700xt even worth it over a V64 that holds 1630mhz core / 1100mhz HBM2 consistently during extended gaming? Mostly game at 3200x1800 to 4k depending on title.


Me thinks it is not worth it. Your nephew will disagree.


----------



## ilmazzo

well it's an improvement, especially in power usage, but not something that would blow your mind.... I would make the nephew happy though...


----------



## dainfamous

Which would you guys purchase between these two only: the reference 5700 XT or the THICC II?

Thanks


----------



## rdr09

dainfamous said:


> Which would you guys purchase between these two only: the reference 5700 XT or the THICC II?
> 
> Thanks


How much is the difference in price? I have to undervolt my XT even at stock to keep the Junction Temp/Hot spot in check.


----------



## lightsout

What sort of clocks should I expect from an unlocked 5700? I can set power at 40%, voltage at 1120mv and core at 2000 and loop fire strike ultra no problem. Although that bench heats up my card.

But running COD MW with that clock the game crashes right away.

I am using Afterburner as I hate Wattman; I haven’t confirmed that the voltage slider actually does anything, but I assume it does.


----------



## dainfamous

Same price.


----------



## Newbie2009

dainfamous said:


> Which would you guys purchase between these two only: the reference 5700 XT or the THICC II?
> 
> Thanks


Probably the THICC II if you don’t use headphones or won’t put a block on it down the road


----------



## skupples

dainfamous said:


> Which would you guys purchase between these two only: the reference 5700 XT or the THICC II?
> 
> Thanks


pretty sure the THICC II got replaced by the THICC III already, due to issues with the THICC II cooling.


----------



## PriestOfSin

SoloCamo said:


> I've been contemplating moving to a 5700 XT, if only so I can upgrade the V56 in my VR box to my V64 and give the V56 to my nephew as an upgrade over his 1060 6GB.
> 
> Is a 5700xt even worth it over a V64 that holds 1630mhz core / 1100mhz HBM2 consistently during extended gaming? Mostly game at 3200x1800 to 4k depending on title.


How does that V64 OC compare to a stock VII? My 5700XT Red Devil consistently matches my VII at lower res, but 1440p widens the gap slightly. At 2160p the VII takes the crown, I'm assuming due to the HBM. I'm not sure it'd be worth it for you at your high res; probably wait for the inevitable 5800XT.


----------



## skupples

agreed, i'd wait if you're trying to drive 4K with mid-range navi... unless you only play RTS and undemanding stuff. If not? You'll be spending most of your time at 1440p, which is blurry on 4K.


----------



## SoloCamo

PriestOfSin said:


> How does that V64 OC compare to a stock VII? My 5700XT Red Devil consistently matches my VII at lower res, but 1440p widens the gap slightly. 2160p the VII takes the crown, I'm assuming due to the HBM. I'm not sure it'd be worth it for you at your high res, probably wait for the inevitable 5800XT.


Couldn't say for sure. From my understanding, at my clocks I'm close to / matching a stock 5700 XT at 1440p+. The VII is going to be faster by a good margin at each resolution. It's actually on sale for $569 now, but I can't justify it with big Navi around the corner.



skupples said:


> agreed, i'd wait if you're trying to drive 4K w. mid range navi... unless you only play RTS and undemanding stuff. If not? You'll be spending most of your time at 1440p, which is blurry on 4K.



BFV, and mostly less demanding titles otherwise. I usually hover at 75+ fps on BFV all ultra at 3200x1800p. That res is still blurry compared to native 4k, even on my 28" monitor, but it's tolerable. 1440p on a native 4k monitor makes me want to stab my own eyes.


----------



## homestyle

PontiacGTX said:


> https://www.youtube.com/watch?v=ojgK4K7nliE&t
> 
> You can use a reference RX 5700 XT BIOS and get an 8% boost, but also a noticeable power consumption increase, maybe 50-80 W more


What kind of temps could I expect if I did this to my Sapphire 5700xt pulse?

The pulse cooler is good up to ~200 watts. It struggles once you overclock.


----------



## Newbie2009

SoloCamo said:


> Couldn't say for sure. From my understanding at my clocks I'm close / matching a stock 5700xt at 1440p+. VII is going to be faster by a good margin at each resolution. It's actually on sale for $569 now, but I can't justify it with big navi around the corner.
> 
> 
> 
> 
> BFV, and mostly less demanding titles otherwise. I usually hover at 75+ fps on BFV all ultra at 3200x1800p. That res is still blurry compared to native 4k, even on my 28" monitor, but it's tolerable. 1440p on a native 4k monitor makes me want to stab my own eyes.


You basically have a slower LC Vega 64; the performance isn’t all that. If you were hitting 1750 core gaming then maybe. I had a 64 under water in a custom loop for a long time at 1750, with a game clock of 1705 or so. The 5700 XT is just faster in everything; in only a few games might Vega compete.

If you want better perf at higher res, get a 2070 Super. I’m done waiting for Navi.


----------



## Newbie2009

homestyle said:


> What kind of temps could I expect if I did this to my Sapphire 5700xt pulse?
> 
> The pulse cooler is good up to ~200 watts. It struggles once you overclock.


Try it and find out; if it is too hot you can always lower the power limit to stay within the previous limits.


----------



## ZealotKi11er

lightsout said:


> What sort of clocks should I expect from an unlocked 5700? I can set power at 40%, voltage at 1120mv and core at 2000 and loop fire strike ultra no problem. Although that bench heats up my card.
> 
> But running COD MW with that clock the game crashes right away.
> 
> I am using after burner as I hate watt man, haven’t confirmed that the voltage slider actually does anything, assume it does.


In theory, the 5700 should clock as high as the 5700 XT. You might crash in games because the GPU probably boosts very high at the start of the game. 5700s are most likely failed 5700 XTs, either in core clock or in CUs. If you get one with a failed CU, you might get a high-clocking one. If you get one that failed on clocks, you will not hit as high as the 5700 XTs.


----------



## lightsout

ZealotKi11er said:


> In theory, the 5700 should clock as high as the 5700 XT. You might crash in games because the GPU probably boosts very high at the start of the game. 5700s are most likely failed 5700 XTs, either in core clock or in CUs. If you get one with a failed CU, you might get a high-clocking one. If you get one that failed on clocks, you will not hit as high as the 5700 XTs.




I haven’t seen the spikes, but I guess they could be super quick. I guess I’m happy in the low 1800s; trying to push it beyond that isn’t working in CoD.

Annoying though, cause like I said I can sit around 1950 in Fire Strike. Haven’t tried anything else.


----------



## treetops422

lightsout said:


> What sort of clocks should I expect from an unlocked 5700? I can set power at 40%, voltage at 1120mv and core at 2000 and loop fire strike ultra no problem. Although that bench heats up my card.
> 
> But running COD MW with that clock the game crashes right away.
> 
> I am using after burner as I hate watt man, haven’t confirmed that the voltage slider actually does anything, assume it does.


Yeah, benchmarks are less taxing on your GPU. Your GPU gets to cool down between loads, and when the CPU test is running. Check GPU-Z to see your volts in the sensor tab. Rest easy that a stock ref blower 5700 XT is only set to 1905 core boost, so you're already getting an extra $50 out of your card, plus whatever extra core boost/memory you're getting too. The non-XT will not match the XT core boost OC vs OC. Looks like it's working as expected. :thumb:

When I say core boost, I mean the core boost you set it at. It will not actually run at that. Usually 50-75+ MHz below the set core boost at 99% GPU utilization; it jumps around.


ZealotKi11er said:


> In theory, the 5700 should clock as high as the 5700 XT. You might crash in games because the GPU probably boosts very high at the start of the game. 5700s are most likely failed 5700 XTs, either in core clock or in CUs. If you get one with a failed CU, you might get a high-clocking one. If you get one that failed on clocks, you will not hit as high as the 5700 XTs.


I've yet to see any 5700 non-XT that can core boost as high as an XT, OC vs OC. Igor's Lab went to 2100 on the non-XT with a top-of-the-line waterblock and decades of know-how, and got to 2300 core boost on the XT. Excuse me if I'm misinterpreting what you're saying. If you're saying some people can't BIOS up to a ref 5700 XT on a ref non-XT, I haven't seen anyone report that being an issue either, barring a failed flash.


----------



## lightsout

treetops422 said:


> Yeah, benchmarks are less taxing on your GPU. Your GPU gets to cool down between loads, and when the CPU test is running. Check GPU-Z to see your volts in the sensor tab. Rest easy that a stock ref blower 5700 XT is only set to 1905 core boost, so you're already getting an extra $50 out of your card, plus whatever extra core boost/memory you're getting too. The non-XT will not match the XT core boost OC vs OC. Looks like it's working as expected. :thumb:
> 
> When I say core boost, I mean the core boost you set it at. It will not actually run at that. Usually 50-75+ MHz below the set core boost at 99% GPU utilization; it jumps around.
> 
> 
> I've yet to see any 5700 non-XT that can core boost as high as an XT, OC vs OC. Igor's Lab went to 2100 on the non-XT with a top-of-the-line waterblock and decades of know-how, and got to 2300 core boost on the XT. Excuse me if I'm misinterpreting what you're saying. If you're saying some people can't BIOS up to a ref 5700 XT on a ref non-XT, I haven't seen anyone report that being an issue either, barring a failed flash.




No you are reading me loud and clear. Thanks for the info, that’s all I really wanted to know if the card was doing ok. I’m very happy with performance it’s very solid.

It’s odd with Fire Strike, because when I run the stress test and not the bench, it heats up the card about 10°C hotter than the game does. Which makes me think it’s pushing the card harder. But as I said, I can’t run those clocks in game.

I’m happy with real game clocks of 1800. I have it set to 1875.


----------



## skupples

i assume 7nm navi is coming from TSMC just like 7nm ryzen? if so, then it'll probably be a while until we see a 58/59


----------



## PJVol

qwertymac93 said:


> I'm fairly certain you should not have such a large Delta between edge temp and hotspot temp.


You said large delta? Look at the Time Spy run below; that's the delta 








https://www.3dmark.com/3dm/41368504

On my previous Vega 56 (64), repasting helped (i.e. letting the TIM fill the cavities that had somehow formed between the GPU & HBM), although I think tightening the screws was more likely what made the difference.
What's left now? "washer mod" -> "try to re-seat the retention spring" -> "repaste" -> RMA? (Or leave it as is, as if everything is fine.)


PS: Wattman profile: 2100 MHz / 1135 mV / 900 MHz / +15%


----------



## Ha-Nocri

89°C is fine, and your card is much better than mine if 2100 MHz at 1135 mV is stable in games for you. I need 1200 mV for 2070 MHz.


----------



## SoloCamo

Newbie2009 said:


> You basically have a slower LC Vega 64, performance isn’t all that. If you were hitting 1750 core gaming then maybe. I had a 64 under water custom loop for a long time at 1750 and game clock of 1705 or so. 5700xt is just faster in everything, only a few games Vega might compete.
> 
> If you want better perf at higher Rez get a 2070super. I’m done waiting for Navi.


Stock vs stock, I can't see how a V64 at 1630 core (very, very few drops below) and 1100 MHz HBM2 could possibly be slower than an LC V64 at the same core clock (stock they boost to 1630 max, yes?) with slower HBM2. Unless you meant OC vs OC, which is understandable.

Either way, I just can't justify upgrading to either a 5700 XT or a VII, despite how much I'd like to at 4K.


----------



## PJVol

Ha-Nocri said:


> 89c is fine, and your card is much better than mine if 2100 MHz at 1135 mV is stable in games for you. I need 1200mV for 2070 MHz


I don't think a Tjunction value twice the T-edge one is fine. The same thing happened to me after installing a water block on a reference Vega. After repasting, the delta got back to an acceptable 30-32°C (at 280 W GPU power) instead of the initial 40-45°C.
I read a thread on Reddit, "Gigabyte RX 5700 XT Gaming OC - Issues", just before posting here, where users describe similar symptoms with GB cards. Their problems could have the same source as mine, i.e. in both cases an incorrect cooler mount may have occurred.


----------



## Gunderman456

I just purchased an MSI RX 5700 XT (reference) in line with my new Build Log entitled "Fast n' Cheap":

https://www.overclock.net/forum/18082-builds-logs-case-mods/1737250-build-log-fast-n-cheap.html

I listed parts I'm considering and asked for constructive criticism/suggestions.

It was the cheapest 5700 XT I could find in Canada. I bought it from Canada Computers for $458.88 (after a $20 rebate). I figure since it comes with Borderlands 3 and I was considering buying the game ($80 CAD) to play with my daughter and son in-law, I managed to gnash and claw my way to a card for under $400. I had to pull every trick in the book to match the US regular retail price. Sheesh...

The build will revolve around the RX 5700 XT and the Ryzen 7 3700X!


----------



## treetops422

lightsout said:


> No you are reading me loud and clear. Thanks for the info, that’s all I really wanted to know if the card was doing ok. I’m very happy with performance it’s very solid.
> 
> It’s odd with fire strike. Because when I run the stress and not the bench. It heats up the card about 10c hotter than the game. Which makes me think it’s pushing the card harder. But as I said I can’t run those clocks in game.
> 
> I’m happy with real game clocks of 1800. I have it set to 1875.


Well, the thing I would recommend is to uninstall AB and try it with Wattman, the reason being that they can conflict, or stack, for lack of a better word. Sometimes I could use AB, sometimes I couldn't. Really, use GPU-Z to see what's going on in the sensors. You might set it to 1875 in AB, but it could be spiking up to 19XX or 20XX in GPU-Z. That shouldn't happen. Even with Wattman left alone, it can sometimes conflict with AB. Or maybe your watts, volts or whatever are stacking. Just some peace of mind.
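On the "check GPU-Z" point: if you use its Log to file option during a gaming session, the sensor log is plain CSV, so you can summarize what the card actually did instead of eyeballing the graph. A rough sketch in Python; the column names here are assumptions that vary by card and GPU-Z version, so match them against the header row of your own log:

```python
import csv

# Assumed column names -- check the header row of your own GPU-Z log.
CLOCK_COL = "GPU Clock [MHz]"
VOLT_COL = "GPU Voltage [V]"

def summarize_log(path):
    """Return min/max/avg core clock and max voltage from a GPU-Z sensor log."""
    clocks, volts = [], []
    with open(path, newline="") as f:
        for raw in csv.DictReader(f):
            # GPU-Z pads fields with spaces; normalize keys and values.
            row = {k.strip(): v.strip() for k, v in raw.items() if k and v}
            try:
                c = float(row[CLOCK_COL])
                v = float(row[VOLT_COL])
            except (KeyError, ValueError):
                continue  # repeated headers / malformed lines
            clocks.append(c)
            volts.append(v)
    if not clocks:
        return None
    return {
        "clock_min": min(clocks),
        "clock_max": max(clocks),
        "clock_avg": sum(clocks) / len(clocks),
        "volt_max": max(volts),
    }
```

If `clock_max` comes back well above what you dialed in with AB, the two tools are probably fighting over the same limits.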


----------



## BradleyW

My Sapphire 5700 XT is currently set to core = 2090 MHz, voltage = 1069 mV, memory = 945 MHz. It boosts around 2000-2030 MHz and the fans are not audible. The centre of the GPU runs around 75-78°C under heavy gaming (ambient 24°C). Super card!
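A footnote on that 945 MHz memory figure: Wattman reports the base GDDR6 clock, and GDDR6 moves 16 bits per pin per base clock, so on Navi 10's 256-bit bus the effective bandwidth is easy to sanity-check (875 MHz stock works out to the advertised 448 GB/s):

```python
def gddr6_bandwidth_gb_s(mem_clock_mhz, bus_width_bits=256):
    """Effective memory bandwidth in GB/s.

    Wattman shows the base GDDR6 clock; each pin transfers 16 bits per
    base clock, so the per-pin data rate in Mbit/s is clock * 16.
    """
    data_rate_mbps = mem_clock_mhz * 16
    return data_rate_mbps * bus_width_bits / 8 / 1000  # bits -> bytes, M -> G

print(gddr6_bandwidth_gb_s(875))  # stock 5700 XT: 448.0 GB/s
print(gddr6_bandwidth_gb_s(945))  # the overclock above: 483.84 GB/s
```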


----------



## PontiacGTX

homestyle said:


> What kind of temps could I expect if I did this to my Sapphire 5700xt pulse?
> 
> The pulse cooler is good up to ~200 watts. It struggles once you overclock.


The non-XT model is the one that gets the performance boost from the flash; the XT model just has to use a tool to mod the overclocking limits.


----------



## Gunderman456

BradleyW said:


> My Sapphire 5700 XT. Currently set to Core = 2090 MHz, Voltage 1069Mv, Memory = 945 MHz. Boosts around 2000 - 2030 MHz and fans are not audible. The centre of the GPU runs around 75-78c under heavy gaming (ambient 24c). Super card!


Looks good BradleyW, congrats!

I'm sure my reference MSI RX 5700 XT won't be able to hit that, but I will try my best by introducing some washers, tightening the screws on the GPU bracket, using a reasonable fan curve and undervolting. Am I missing any other tricks?


----------



## BradleyW

Gunderman456 said:


> BradleyW said:
> 
> 
> 
> My Sapphire 5700 XT. Currently set to Core = 2090 MHz, Voltage 1069Mv, Memory = 945 MHz. Boosts around 2000 - 2030 MHz and fans are not audible. The centre of the GPU runs around 75-78c under heavy gaming (ambient 24c). Super card!
> 
> 
> 
> Looks good BradleyW, congrats!
> 
> I'm sure my reference MSI RX 5700 XT won't be able to hit that, but I will try my best by introducing some washers, tightening the screws on the GPU bracket, using a reasonable fan curve and undervolting. Am I missing any other tricks?
Click to expand...

I don't know a massive amount about this, but I think the key with the reference design especially is to undervolt the card so it runs cooler and boosts higher.

At least Navi doesn't boost as poorly as Vega does when it's running hot.


----------



## Gunderman456

BradleyW said:


> I don't know a massive amount on this, but I think the key to reference design especially is to undervolt the card so it runs cooler and boosts higher.
> 
> At least Navi doesn't boost poorly when it's running hot compared to VEGA.


A thorough discussion on this precise topic took place in my new build log:

https://www.overclock.net/forum/18082-builds-logs-case-mods/1737250-build-log-fast-cheap.html


----------



## lightsout

treetops422 said:


> Well the thing I would recommend is to uninstall AB and try it with Wattman, reason being it can conflict. Or stack for lack of better word. Sometimes I could use AB, sometimes I couldn't. really use GPU-Z see what's going on in the sensors. You might set it to 1875 in AB, but it's spiking up to 19XX or 20XX in GPU-Z. That shouldn't happen. Even with Wattman uninstalled it can sometimes conflict with AB. Or maybe your watts, volts or whatever are stacking. Just some peace of mind.




Yeah, thanks man. I’m happy where I’m at. Not looking to push too hard, just want things stable with a nice boost. It took me a bit to get AB to play nice with Wattman and the MorePowerTool mod, but I think things are ok now. I just really don’t like Wattman; only use it if I have to.


----------



## ZealotKi11er

What is the highest memory clock with 5700 XT so far?


----------



## rdr09

BradleyW said:


> I don't know a massive amount on this, but I think the key to reference design especially is to undervolt the card so it runs cooler and boosts higher.
> 
> At least Navi doesn't boost poorly when it's running hot compared to VEGA.


No OC'ing with the reference, especially if ambient is high. Here is mine gaming at an undervolt of 1.1 V.


----------



## doom26464

Just snagged the gigabyte OC 5700xt and a benq EX3203R. With sales on today I feel this is one super good combo for a client. 

Looking forward to playing around with the 5700xt @1440p 144hz


----------



## lightsout

rdr09 said:


> No oc'ing with the reference especially if ambient is high. Here is mine gaming at an undervolt of 1.1v.


Is 1.1 V an undervolt? My non-XT reports less than that at stock. Maybe I am measuring it with the wrong app.


----------



## rdr09

lightsout said:


> Is 1.1 V an undervolt? My non-XT reports less than that at stock. Maybe I am measuring it with the wrong app.


According to Wattman, stock is 1169 mV. It is actually 1096 mV, but HWiNFO reads it as 1100 mV. The lowest I got my 5700 to is 956 mV.


----------



## lightsout

rdr09 said:


> According to Wattman, stock is 1169 mV. It is actually 1096 mV, but HWiNFO reads it as 1100 mV. The lowest I got my 5700 to is 956 mV.


I was just going by what the slider in AB says. For some reason HWiNFO can't read my ref 5700. I think I will reinstall the program and see if that fixes it.

Is there any way to have profiles auto-load in Wattman? One of the reasons I don't like using it is opening it every time; they have it hidden away under a bunch of different menus.


----------



## rdr09

lightsout said:


> I was just going by what the slider in AB says. For some reason HWinfo can't read my ref 5700. I think I will reinstall the program and see if it fixes it.
> 
> Is there any way to have profiles auto load in wattman, one of the reasons I don't like using it is opening it every time. They have it hidden away under a bunch of different menus.


I hardly ever shut down my PC. I just put it to sleep, so the profile does not change. Sometimes it saves it even after a reboot. I just tested it, and it reset to default values.


----------



## lightsout

rdr09 said:


> According to Wattman, stock is 1169 mV. It is actually 1096 mV, but HWiNFO reads it as 1100 mV. The lowest I got my 5700 to is 956 mV.


My stock voltage is 1062mv. I have not had any luck adjusting it. Maybe I should try to lower it a bit.


----------



## rdr09

lightsout said:


> My stock voltage is 1062mv. I have not had any luck adjusting it. Maybe I should try to lower it a bit.


What do you mean? Undervolting makes it unstable or the undervolt is not sticking?

I just set Frequency/Voltage to manual, use the slider, then close Wattman. Screenshot is from XT.

EDIT: My 5700 stock voltage is 1042mv according to Wattman.


----------



## Gunderman456

Not sure whether to use Wattman or MSI Afterburner when I have my new build in place?!?


----------



## rdr09

Gunderman456 said:


> Not sure whether to use Wattman or MSI Afterburner when I have my new build in place?!?


I suggest not installing Afterburner or any other third-party app at first. You can use HWiNFO64 to monitor readings.

EDIT: On my 5700 on a R5 3600/B350 motherboard, Wattman profile stays even after reboot. Using 19.11.3 driver.


----------



## PJVol

rdr09 said:


> EDIT: On my 5700 on a R5 3600/B350 motherboard, Wattman profile stays even after reboot. Using 19.11.3 driver.


Yea, it was so annoying on Vega. But now I've noticed that sometimes the driver drags up the last freq/voltage point on the curve by itself. I can't even catch the moment when it does that (i.e. it moves it, for example, from 1100 to 1175 mV, leaving the PL and memclock untouched).


----------



## lightsout

rdr09 said:


> What do you mean? Undervolting makes it unstable or the undervolt is not sticking?
> 
> 
> 
> I just set Frequency/Voltage to manual, use the slider, then close Wattman. Screenshot is from XT.
> 
> 
> 
> EDIT: My 5700 stock voltage is 1042mv according to Wattman.




I meant overvolting seems to make it unstable. I haven't really undervolted at all.

I also don't have any luck with wattman keeping the settings. When I reboot they go back to stock.


----------



## Gunderman456

lightsout said:


> I meant overvolting seems to make it unstable. I haven't really undervolted at all.
> 
> I also don't have any luck with wattman keeping the settings. When I reboot they go back to stock.


It seems things go up and down depending on drivers, and that is why I'm hesitant to use Wattman myself.

I think I'll start with MSI Afterburner and, if I get problems, will delete it and use Wattman. Easier that way, since I think if you enable Wattman and decide it's no good, you can't take it back, and it will start conflicting with MSI Afterburner. Not sure if uninstalling and reinstalling Radeon would then disable Wattman (I guess you could choose not to keep the Radeon settings when uninstalling, and that would resolve the issue, but I'm unsure).


----------



## lightsout

Gunderman456 said:


> Is seems things go up and down dependent on drivers and that is why I'm hesitant to use Wattman myself.
> 
> I think I'll start with MSI Afterburner and if I get problems, will delete and use Wattman. Easier that way, since I think if you enable Wattman and decide it's no good, I don't think you can take it back and it will start conflicting with MSI Afterburner. Not sure if you uninstall Radeon and reinstall it will then disable Wattman (I guess you could choose to not keep the Radeon settings when uninstalling and that would resolve the issue but unsure).


I at first had issues with Afterburner; it wouldn't adjust clocks and messed up the MorePowerTool mod. But I started over and got things sorted now.

But I am having random system lock-ups. I had updated my BIOS (after a long time) because I was having issues in game (lock-ups), and had a few BSODs, so I turned off PBO. But now and then the system will lock up, which is unacceptable as I use this rig for work. I can't be in the middle of a paper and have it freeze.

AMD, while they are the best bang for your buck, this is just how it is. When it's good it's good, but there are always quirks, the main one being overclock software and the conflicts it may cause. 

For now my gaming experience is excellent, but the new CoD is super sensitive to OCs. I am just leaving that alone for now, running about 1800 MHz. I should just leave it stock, but I just can't leave those couple hundred MHz on the table.


----------



## Gunderman456

lightsout said:


> I at first had issues with afterburner, wouldn't adjust clocks, messed up the morepower mod. But started over and got things sorted now.
> 
> But I am having random system lock ups. I had updated my bios (after a long time) because I was having issues in game (lock ups), had a few bsod's so I turned off PBO. But now and then the system will lock up. Which is unacceptable as I use this rig for work. Can't be in the middle of a paper and have it freeze.
> 
> AMD, while they are the best bang for your buck, this is just how it is. When its good its good, but there are always quirks. The main one being overclock software, and conflicts it may cause.
> 
> For now my gaming experience is excellent, but the new COD is super sensitive to OC's. I am just leaving that alone for now. Running about 1800mhz. I should just leave it stock but I just can't leave those couple hundred Mhz on the table.


If you are still getting lock-ups, I'm thinking RAM.

I noticed that too with AMD drivers; some are better than others for OC.

Additionally, I used to OC the GPU and forget about it. But for a while now some OCs will not work in some newer games (most likely because they push the card harder) and I'd have to reassess.


----------



## lightsout

Gunderman456 said:


> If you are still getting lock-ups I'm thinking RAM.
> 
> 
> 
> I noticed that too with AMD drivers, some are better then others for OC.
> 
> 
> 
> Additionally, I use to OC the GPU and forget. But for a while now some OCs will not work on some newer games (most likely because they push the card harder) and I'd have to reassess.




You talking about system ram or GPU? I haven't touched the ram on the GPU. And I only turn on my oc profile for the GPU when I game.


----------



## Gunderman456

lightsout said:


> You talking about system ram or GPU? I haven't touched the ram on the GPU. And I only turn on my oc profile for the GPU when I game.


System RAM.


----------



## lightsout

Gunderman456 said:


> System RAM.


You know, these Flares always gave me a little trouble. The first boot at stock settings (3200 C14) will often fail after a clean BIOS, but rebooting will boot into Windows. I used to run them a hair overvolted to avoid issues. I may try that again.


----------



## Newbie2009

Personally, with AMD I've learned to avoid Afterburner; I had a lot of issues with previous-generation AMD cards that went away when I got rid of it.


----------



## Gunderman456

I've never had any issues with MSI AB, and I've mostly used ATI/AMD cards. If you have more than one card, just remember to disable ULPS.


----------



## PJVol

So I decided to put the original retention spring back, without a full backplate unmount and loop drain: just unscrew the 4 screws, put the spring in with PE washers, and tighten everything back down. This decreased Tjunction by 5°C on average. Now it's 42/81 (edge/junction); before it was ~42/86.

Is it advisable to completely unmount the waterblock and repaste further, or is it not worth it?
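For anyone following along, the mount-quality question above reduces to the edge-to-junction spread; a trivial helper makes the before/after comparison explicit (the ~30°C "acceptable" ceiling is just the rule of thumb quoted earlier in this thread for a water-cooled card, not an official AMD number):

```python
def junction_delta(t_edge_c, t_junction_c, acceptable_max_c=30):
    """Edge-to-junction spread in °C and a rough verdict on the cooler mount."""
    delta = t_junction_c - t_edge_c
    return delta, delta <= acceptable_max_c

print(junction_delta(42, 86))  # before re-seating the spring: (44, False)
print(junction_delta(42, 81))  # after: (39, False) -- better, but still high
```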


----------



## BradleyW

Gunderman456 said:


> Not sure whether to use Wattman or MSI Afterburner when I have my new build in place?!?


I'd strongly recommend wattman for clock and voltage changes.


----------



## ZealotKi11er

BradleyW said:


> I'd strongly recommend wattman for clock and voltage changes.


Yeah. I use Wattman for OC and MSI AB just for OSD. Since Vega, it has been pointless to use MSI AB for AMD GPUs.


----------



## lightsout

ZealotKi11er said:


> Yeah. I use Wattman for OC and MSI AB just for OSD. Since Vega, it has been pointless to use MSI AB for AMD GPUs.


What do you mean by pointless? I am using it with good results.


----------



## ZealotKi11er

lightsout said:


> What do you mean by pointless? I am using it with good results.


You can only do so much with MSI AB because you only control the highest DPM state. Whatever is possible and tested by AMD will be available via Wattman. Nvidia, on the other hand, is different; for them, you use EVGA Precision as a reference OC tool.


----------



## lightsout

ZealotKi11er said:


> You can only do so much with MSI AB because you only control the highest DPM. Whatever is possible and tested by AMD will be available via Wattman. Nvidia in the other hand is different. For them, you use Evga as a reference OC tool.


You mean the power limit or the more granular control? I do fine with AB; I'm also not a fan of EVGA Precision. 

I have not done a ton of testing, but Wattman hasn't seemed to net me anything better than I get with AB.


----------



## Petet1990

Hey guys, I just picked up the 5700 XT Red Devil. I wanted to know if there is a waterblock for it from EK.


----------



## rdr09

Petet1990 said:


> Hey guys i just picked up the 5700xt red devil. wanted to know if there is a waterblock for it from EK.


Have not seen any. Reference designs are the first to get waterblocks.

Double-check this:

https://www.aliexpress.com/i/4000161739526.html


----------



## BradleyW

Petet1990 said:


> Hey guys i just picked up the 5700xt red devil. wanted to know if there is a waterblock for it from EK.


I don't think any are available. I doubt anyone has plans to make one for this version. Always stick with reference if you plan to watercool.


----------



## PriestOfSin

Petet1990 said:


> Hey guys i just picked up the 5700xt red devil. wanted to know if there is a waterblock for it from EK.


The Liquid Devil uses the same PCB as the Red Devil, iirc, so hopefully that means it's incoming.


----------



## ZealotKi11er

Petet1990 said:


> Hey guys i just picked up the 5700xt red devil. wanted to know if there is a waterblock for it from EK.


If you are 100% set in liquid, get 5700 XT 50th Anniversary.


----------



## doom26464

https://youtu.be/hL7qGkTZgwo

Has the gap between the 5700 XT and 2060 Super gotten even larger lately?

Are the fine wine drivers starting to mature?


----------



## Gunderman456

doom26464 said:


> https://youtu.be/hL7qGkTZgwo
> 
> Has the gap between the 5700 XT and 2060 Super gotten even larger lately?
> 
> Are the fine wine drivers starting to mature?


The 5700 XT is faster than the RTX 2070, so the question would be: is the 5700 XT nibbling at the RTX 2070 Super's heels? The RTX 2070 Super is still faster. So let's hope we get to conquer that card with the fine wine (until AMD's smaller driver team catches up with Nvidia's more optimized day-one driver releases, AMD's illusory "fine wine" will continue to gain legendary status)!!


----------



## BradleyW

My 5700 XT (Nitro+) clock speed keeps randomly dropping. It'll run between 2030 and 2060 normally, then it'll randomly drop down to 1850-1950 for a second. It happens randomly, sometimes frequently. Not application-specific. Not driver-specific. It happens on stock settings and modified settings. Temperatures are not causing throttling, as TJ, core, memory and VRMs are all below 80°C. Thank you.


----------



## Ha-Nocri

BradleyW said:


> My 5700 XT (Nitro+) clock speed keeps randomly dropping. It'll run between 2030 to 2060 normally. Then it'll randomly drop down to 1850 - 1950 for a second. Happens randomly. Sometimes frequently. Not application specific. Not driver specific. Happens on stock settings and modified settings. Temperatures are not causing throttling as TJ, core, memory, VRM's are all below 80c. Thank you.


Could be a few things. The situation becomes bottlenecked by some other part of the system for a moment, and the GPU drops clocks since it doesn't have to work as hard. Or a spike in power/temperature happens and the card down-clocks briefly. Or AMD's drivers are not that polished yet. But if you don't see stutter in that moment, then there isn't a problem.
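For anyone who wants to check which case they're in, here's a minimal sketch that separates the brief power/temperature dips described above from sustained throttling. The log format (a list of `(seconds, MHz)` samples), the 100 MHz drop threshold, and the 1 s "transient" window are all illustrative assumptions; adapt them to whatever your monitoring tool exports.

```python
# Sketch: flag clock dips in a logged trace of (seconds, MHz) samples.
# Thresholds and log format are assumptions, not a vendor-defined API.

def find_dips(samples, baseline_mhz, drop_mhz=100, max_len_s=1.0):
    """Return (transient, sustained) lists of (start, end) dip spans.

    A span counts as a dip while the clock sits >= drop_mhz below
    baseline_mhz. Spans no longer than max_len_s look like the brief
    power/thermal-limit dips; longer spans look like real throttling.
    """
    dips, start = [], None
    for t, mhz in samples:
        low = mhz <= baseline_mhz - drop_mhz
        if low and start is None:
            start = t                      # dip begins
        elif not low and start is not None:
            dips.append((start, t))        # dip ends
            start = None
    if start is not None:                  # dip ran to end of log
        dips.append((start, samples[-1][0]))
    transient = [d for d in dips if d[1] - d[0] <= max_len_s]
    sustained = [d for d in dips if d[1] - d[0] > max_len_s]
    return transient, sustained

trace = [(0.0, 2040), (0.5, 2050), (1.0, 1880), (1.5, 2035), (2.0, 2045)]
transient, sustained = find_dips(trace, baseline_mhz=2040)
print(transient, sustained)  # one ~0.5 s transient dip, no sustained throttling
```

If the sustained list stays empty across a long session, the dips are probably just the boost algorithm doing its job.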


----------



## Ha-Nocri

doom26464 said:


> https://youtu.be/hL7qGkTZgwo
> 
> Has the gap between the 5700 XT and 2060 Super gotten even larger lately?
> 
> Are the fine wine drivers starting to mature?


In 4 new games (RDR2, BL3, CoD, Fallen Order) I think the 5700 XT is faster than the 2070 Super.


----------



## ryan92084

As others have said, it is well past time to create an owners thread and move your discussion there. Closure imminent.


----------



## lightsout

ryan92084 said:


> As others have said, it is well past time to create an owners thread and move your discussion there. Closure imminent.


Can this thread get transferred and renamed? So all the data goes with it? I pm'd another mod but was ignored.


----------



## Imouto

ryan92084 said:


> As others have said, it is well past time to create an owners thread and move your discussion there. Closure imminent.


----------



## doom26464

Ha-Nocri said:


> doom26464 said:
> 
> 
> 
> https://youtu.be/hL7qGkTZgwo
> 
> Has the gap between the 5700 XT and 2060 Super gotten even larger lately?
> 
> Are the fine wine drivers starting to mature?
> 
> 
> 
> In 4 new games, RDR2, BL3, CoD, Fallen Order, I think 5700 XT is faster than 2070 S

I've noticed it in older titles as well; even in titles like Fortnite that heavily favor Nvidia, the gap has closed up a lot.


----------



## BradleyW

Ha-Nocri said:


> Could be a few things. The situation becomes bottlenecked by some other part of the system for a moment, and the GPU drops clocks since it doesn't have to work as hard. Or a spike in power/temperature happens and the card down-clocks briefly. Or AMD's drivers are not that polished yet. But if you don't see stutter in that moment, then there isn't a problem.


No sudden spikes on any reported temperatures. Memory, VRMs, and GPU were 60c on average. The centre of the GPU sat at a steady 75-77c. GPU usage 99%, CPU usage low. I've tested in static scenes to eliminate changes in the game which may alter the usage.


----------



## ryan92084

lightsout said:


> ryan92084 said:
> 
> 
> 
> As others have said, it is well past time to create an owners thread and move your discussion there. Closure imminent.
> 
> 
> 
> Can this thread get transferred and renamed? So all the data goes with it? I pm'd another mod but was ignored.

Usually you'd want an owner to start the thread so they can maintain the first post. If no one steps up though...


----------



## lightsout

ryan92084 said:


> Usually you'd want an owner to start the thread so they can maintain the first post. If no one steps up though...


I'm down to do it, but there was talk of transferring this thread to preserve all the growing pains, etc. 

Transfer the thread and change the OP?


----------



## ZealotKi11er

BradleyW said:


> No sudden spikes on any reported temperatures. Memory, VRM's and GPU was 60c average. Centre of the GPU was sat at a steady 75-77c. GPU usage 99%. CPU usage low. I've tested in static scenes to eliminate changes in the game which may alter the usage.


If you do not see an fps drop then it's fine. Clocks are no longer static. There are way more parameters determining clocks than what's visible to you.


----------



## looniam

lightsout said:


> I'm down to do it, but there was talk of transferring this thread to preserve all the growing pains, etc.
> 
> Transfer the thread and change the OP?


if i may:

it is desirable to make the info you're referring to easily available/accessible to other owners, but the majority of the 1700+ posts here are a lot of "_noise_." in other words, it'll be a needle in a haystack (of _discussions_!) for someone looking for why their boost clock speeds are flonky or driver settings aren't sticking.

so yeah, it would be highly appreciated by untold masses of owners and others if someone would start an owners club and quote the pertinent info that's here into it. it would be a daunting task for one, but there are several owners here now (not going to throw anyone under the bus . . yet) who do appear helpful . . . 

just saying.

oh yeah i forgot, came here really to post something . ."new?"
filed under "can't beat em, join 'em" folder.
*HIS Announces Pink, Blue Army Versions of AMD's RX 5700 XT Graphics Card*


Spoiler






> HIS Announces Pink, Blue Army Versions of AMD's RX 5700 XT Graphics Card
> These are essentially adaptations of HIS' IceQ 2X coolers to AMD's latest, with a changed color scheme. The cards are 2.5 slots featuring dual-fan cooling solutions, and feature a full cover backplate that's also been "gaudied" up in the aforementioned colors. The OC versions will carry clocks set at 1730 MHz, 1870 MHz, and 1980 MHz for the Base, Game and Boost modes respectively, which are either higher or in line with AMD's own RX 5700 XT Anniversary graphics cards. The non-OC versions still see bumps relative to AMD's reference design, shipping at 1670 MHz, 1815 MHz, and 1925 MHz for the Base, Game and Boost modes. Outputs are taken care of by 3x DisplayPort and 1x HDMI.


----------



## lightsout

looniam said:


> if i may:
> 
> it is desirable to make the info you're referring to easily available/accessible to other owners, but the majority of the 1700+ posts here are a lot of "_noise_." in other words, it'll be a needle in a haystack (of _discussions_!) for someone looking for why their boost clock speeds are flonky or driver settings aren't sticking.
> 
> so yeah, it would be highly appreciated by untold masses of owners and others if someone would start an owners club and quote the pertinent info that's here into it. it would be a daunting task for one, but there are several owners here now (not going to throw anyone under the bus . . yet) who do appear helpful . . .
> 
> just saying.
> 
> oh yeah i forgot, came here really to post something . ."new?"
> filed under "can't beat em, join 'em" folder.
> *HIS Announces Pink, Blue Army Versions of AMD's RX 5700 XT Graphics Card*
> 
> 
> Spoiler


True, that's well beyond what I would be willing to do, but no one has stepped up and started it, so I figured why not. We could get the thread up and running, and people could begin to link to relevant info so it can be added to the OP.


----------



## looniam

just admit it, all you owners are really too disappointed and embarrassed to start an owners club. :heyyou: 

did anyone mention clubs being safe harbor from trolls?


----------



## ilmazzo

I LOVE, LOVE!!!, that pinky 5700!!!!!!!!!! I would throw my money to it immediately if I had ...cough! money cough! at all....


----------



## lightsout

I started an owner's thread.
https://www.overclock.net/forum/67-amd/1737804-amd-rx-5700-xt-5700-owner-s-club.html#post28222992

Currently tags don't want to work, tried with imgur and imgbb, not sure what the deal is. Please share helpful links and recommend things for the OP.


----------



## looniam

lightsout said:


> I started an owner's thread.
> https://www.overclock.net/forum/67-amd/1737804-amd-rx-5700-xt-5700-owner-s-club.html#post28222992
> 
> Currently tags don't want to work, tried with imgur and imgbb, not sure what the deal is. Please share helpful links and recommend things for the OP.


yeah, it seems using third party image hosting can be a chore around here. i've used a shareable folder in google photos (or whatever it's called now) and getting a/the proper link is hit or miss, a couple of right click/open image in tab to find a usable URL.

and that drag and drop stuff just adds a thumbnail at the end of the post. i found using the paperclip and uploading the images works better. however, if adding/uploading more images it gets a little flaky, not all seem available to insert, so i'll use the insert all and format as needed.

you may want to ask in *the help forum* (https://www.overclock.net/forum/15167-forum-platform-help-discussion/) how to add a google spreadsheet if you are going to have a members list.

and btw, thanks :thumb:


----------



## lightsout

looniam said:


> yeah, it seems using third party image hosting can be a chore around here. i've used a shareable folder in google photos (or whatever it's called now) and getting a/the proper link is hit or miss, a couple of right click/open image in tab to find a usable URL.
> 
> and that drag and drop stuff just adds a thumbnail at the end of the post. i found using the paperclip and uploading the images works better. however, if adding/uploading more images it gets a little flaky, not all seem available to insert, so i'll use the insert all and format as needed.
> 
> you may want to ask in *the help forum* how to add a google spreadsheet if you are going to have a members list.
> 
> and btw, thanks :thumb:


I did the image attachment thing, then used the link of the image once it was hosted at OCN. Then I removed the attachment, we'll see if they stick around.


----------



## looniam

lightsout said:


> I did the image attachment thing, then used the link of the image once it was hosted at OCN. Then I removed the attachment, we'll see if they stick around.


i don't think that worked. what i see is:

View attachment 310120

rightclick view image gets me:

View attachment 310122

deleting an attachment deletes the attachment, it's gone. are you using drag and drop? i find that the most useless feature for formatting a post and avoid it unless i just need a quick reference.


----------



## Gunderman456

lightsout said:


> I did the image attachment thing, then used the link of the image once it was hosted at OCN. Then I removed the attachment, we'll see if they stick around.


I was doing the same in my new build log. To me the pics were there, but when my brother browsed my thread he told me there were no pics. So I was stuck uploading the pics, leaving them hosted on OCN with additional thumbnails - you end up with one big picture and the same smaller one at the bottom of the post. I still can't figure out how to get around this, and I don't want to host my pictures anywhere else. I didn't have to before the last OCN migration. If you manage to figure it out, let me know.


----------



## lightsout

looniam said:


> i don't think that worked what i see is:
> View attachment 310120
> 
> 
> rightclick view image gets me:
> View attachment 310122
> 
> 
> deleting an attachment deletes the attachment, its gone. are you using drag and drop? i find that the most useless feature for formatting a post and avoid it unless i just need a quick reference.


The attachments are actually still accessible with the direct link, maybe only for the uploader?

Think I got it working?

Do the embedded youtube videos show up??



Gunderman456 said:


> I was doing the same on my new build log. To me the pics were there but when my brother browsed my thread he told me that there were no pics. So I was stuck in uploading the pics, leaving them uploaded on OCN and with additional thumbnails. So you end up with one big picture and the same smaller one at the bottom of the post. I still can't figure out how to get around this and I don't want to host my pictures anywhere. I did not have to before the last OCN migration. If you manage to figure it out, let me know.


I used a new site that gave me a link, and I think it's working. What's the big deal about hosting elsewhere? The problem for me was that even that wasn't working; let me know if you guys can see them.


----------



## Gunderman456

lightsout said:


> Oh weird, it still shows up for me, this site aye!!!
> 
> Do the embedded youtube videos show up??


Yes.


----------



## BradleyW

ZealotKi11er said:


> If you do not see an fps drop then it's fine. Clocks are no longer static. There are way more parameters determining clocks than what's visible to you.


I do see the FPS drop for a split second when the clock drops. It is random. Drivers, games, settings... doesn't matter, it still happens. Even if I stand still in a game and don't move the camera, the drop can still happen.

After Googling, it seems to be a widespread problem. People say to fiddle with the power target, but it hasn't done anything for me. Reported on both reference and aftermarket cards.


----------



## Chrono Detector

Currently experiencing an issue with my recently purchased ASUS TUF Gaming X3 RX 5700 XT. While gaming, often within half an hour, the video card crashes to either a black or green screen and my PC restarts. Is this a common issue with all 5700 XTs or this card in particular? Really disappointed I spent AU$600 on a card that crashes this often; it's close to unusable.

I haven't even overclocked my system and I'm running a Ryzen 3900x as I purchased a new PC.


----------



## skupples

have you tracked temps up to the crash? 

first things first, run Display Driver Uninstaller (DDU) from safe mode, restart, & re-install drivers. 

some models have had improper thermal pad coverage & such, causing the memory to heat stroke. I believe this is part of the reason why the THICC went to revision III so quickly.


----------



## ilmazzo

Since the issue seems to pop up after some gaming, you could have an over-temperature problem on some VGA components (e.g. the memory chips, as already pointed out). Try logging HWiNFO readings so you capture data from the very last seconds before the PC dies, then read the output and share it here.
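A minimal sketch of that "read the last seconds of the log" step, assuming a HWiNFO-style CSV export. The file name and the "temp" column-header convention are assumptions; match them to whatever your logging tool actually writes.

```python
# Sketch: pull the last rows of a sensor CSV log and report peak values
# of any temperature-ish columns, to inspect the moments before a crash.
import csv
from collections import deque

def tail_readings(path, n=10):
    """Return (header, last n rows) of a CSV log without loading it all."""
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        reader = csv.reader(f)
        header = next(reader)
        return header, list(deque(reader, maxlen=n))

def peak_values(header, rows, keyword="temp"):
    """Max value of every numeric column whose header mentions keyword."""
    peaks = {}
    for i, name in enumerate(header):
        if keyword.lower() in name.lower():
            vals = []
            for row in rows:
                try:
                    vals.append(float(row[i]))
                except (ValueError, IndexError):
                    pass  # skip non-numeric or short rows
            if vals:
                peaks[name] = max(vals)
    return peaks
```

Run it on the log after a crash (e.g. `peak_values(*tail_readings("hwinfo_log.csv", n=30))`) and any column sitting near its throttle point in those last rows is a prime suspect.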


----------



## Newbie2009

After going to the hassle of tearing down a big watercooling rig and building a Node 202 on air because I wanted to downsize, I just dropped 500 bucks on parts to watercool the Node 202.

I think I have a problem. I am looking forward to seeing what the Anniversary Edition can do under water, though.


----------



## lightsout

BradleyW said:


> I do see the FPS drop for a split second when the clock drops. It is random. Drivers, games, settings... doesn't matter, it still happens. Even if I stand still in a game and don't move the camera, the drop can still happen.
> 
> After Googling, it seems to be a widespread problem. People say to fiddle with the power target, but it hasn't done anything for me. Reported on both reference and aftermarket cards.




I'm not even sure I would call it a problem. It just seems to be the normal behavior of these cards; clocks fluctuate.


----------



## ryan92084

Continue discussions here 


lightsout said:


> I started an owner's thread.
> https://www.overclock.net/forum/67-amd/1737804-amd-rx-5700-xt-5700-owner-s-club.html#post28222992
> 
> Currently tags don't want to work, tried with imgur and imgbb, not sure what the deal is. Please share helpful links and recommend things for the OP.


Thank you lightsout


----------

