# [Various] AMD Ryzen 7 Reviews



## ryan92084

*1800x, 1700x & 1700*

Gigabyte AX370-Gaming 5 http://www.overclockersclub.com/reviews/amd_ryzen_7_1800x_1700x_1700/
Asus Crosshair VI http://www.anandtech.com/show/11170/the-amd-zen-and-ryzen-7-review-a-deep-dive-on-1800x-1700x-and-1700
[German] https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/
MSI x370 Titanium http://www.legitreviews.com/amd-ryzen-7-1800x-1700x-and-1700-processor-review_191753/
Gigabyte AX370-Gaming 5 http://hothardware.com/reviews/amd-ryzen-7-1800x-1700x-1700-benchmarks-and-review
versus Phenom II [German] https://www.computerbase.de/2017-03/benchmarks-phenom-ii-x6-ryzen-7-vergleich/
Gigabyte AX370-Gaming 5 http://techreport.com/review/31366/amd-ryzen-7-1800x-ryzen-7-1700x-and-ryzen-7-1700-cpus-reviewed/3

*1800x & 1700x*

Asus Crosshair VI [Dutch] https://nl.hardware.info/reviews/7223/amd-ryzen-7-1800x1700x-review-eindelijk-weer-concurrentie-voor-intel
Asus Crosshair VI [Swedish] http://www.sweclockers.com/test/23426-amd-ryzen-7-1800x-och-7-1700x
Asrock X370 Taichi http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/
Asrock X370 Taichi and Asus Crosshair VI with motherboard bugginess explained https://www.youtube.com/watch?v=mW1pzcdZxKc

*1800x*

MSI x370 Titanium [Chinese] http://3c.3dmgame.com/show-52-5363-1-all.html
Asus Crosshair VI [Chinese] https://www.chiphell.com/article-17555-4.html
Asus Crosshair VI https://www.guru3d.com/articles-pages/amd-ryzen-7-1800x-processor-review,1.html
Asus Crosshair VI http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74814-amd-ryzen-7-1800x-performance-review.html
Asus Crosshair VI http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks or YT https://www.youtube.com/watch?v=j7UBHjtCXhU
MSI x370 Titanium http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951.html
ASUS Prime X370 Pro (old BIOS tested) https://www.overclock3d.net/reviews/cpu_mainboard/amd_ryzen_7_1800x_cpu_review/1 or 1+ hour video https://youtu.be/0Yk-izRf2Ro
Asus Crosshair VI http://www.tweaktown.com/reviews/8072/amd-ryzen-7-1800x-cpu-review-intel-battle-ready/index.html
MSI x370 Titanium and Linux https://www.phoronix.com/scan.php?page=article&item=ryzen-1800x-linux&num=1
[Japanese] http://www.4gamer.net/games/300/G030061/20170302065/
[Portuguese] http://adrenaline.uol.com.br/2017/03/02/48469/analise-processador-amd-ryzen-7-1800x/
Asus Crosshair VI https://arstechnica.co.uk/gadgets/2017/03/amd-ryzen-review/
[Polish] http://www.benchmark.pl/testy_i_recenzje/amd-ryzen-7-1800x-test.html
MSI x370 Titanium http://www.eteknix.com/amd-ryzen-7-1800x-am4-8-core-processor-review/
[German] https://www.hardwareluxx.de/index.php/artikel/hardware/prozessoren/42068-acht-kerne-und-16-threads-fuer-550-euro-amd-ryzen-7-1800x-im-test.html
Asus Crosshair VI http://hexus.net/tech/reviews/cpu/102964-amd-ryzen-7-1800x-14nm-zen/
Asus Crosshair VI http://www.kitguru.net/components/cpu/luke-hill/amd-ryzen-7-1800x-cpu-review/
[German] http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/
Asus Crosshair VI https://www.pcper.com/reviews/Processors/AMD-Ryzen-7-1800X-Review-Now-and-Zen
[Romanian] http://lab501.ro/featured-articles/review-amd-ryzen-7-1800x-welcome-back-amd
Linus https://www.youtube.com/watch?v=9wJQEHNYE7M
Various with comments on bios [German] https://www.golem.de/news/ryzen-7-1800x-im-test-amd-ist-endlich-zurueck-1703-125996-4.html
http://www.overclockers.com/amd-ryzen-7-1800x-cpu-review/

*1700x*

Asus Crosshair VI Linux https://www.servethehome.com/amd-ryzen-7-1700x-linux-benchmarks/
Asus Crosshair VI includes SMT off http://www.guru3d.com/articles_pages/amd_ryzen_7_1700x_review,6.html

*1700*

Gigabyte AX370-Gaming 5 +notes on 1700x/1800x http://www.xtremesystems.org/forums/showthread.php?293130-Ryzen-Return-of-the-Jedi&p=5254423#post5254423
Gigabyte AX370-Gaming 5 https://www.youtube.com/watch?v=V5RP1CPpFVE
MSI x370 Titanium uefi 117 w/ overclocking http://www.legitreviews.com/amd-ryzen-7-1700-overclocking-best-ryzen-processor_192191
https://www.youtube.com/watch?v=PcbdN7vdCuQ

*Overclocking tests*

round up http://www.overclock.net/t/1624507/various-amd-ryzen-7-reviews/50#post_25885406
Record run https://www.youtube.com/watch?v=9XGvrfTwwNI
3.9ghz 1700 versus 5ghz 7700k also temps versus the 1800x https://www.youtube.com/watch?v=V5RP1CPpFVE or http://www.toptengamer.com/amd-ryzen-7-1700-vs-intel-i7-7700k-1800x/
Follow up to 3.9ghz to 5ghz https://www.youtube.com/watch?v=BXVIPo_qbc4
Follow up to the follow up with 720p low https://youtu.be/nsDjx-tW_WQ
ROG Crosshair VI overclocking thread with mem speed testing http://www.overclock.net/t/1624603/rog-crosshair-vi-overclocking-thread
10 1700s tested for OC potential https://www.reddit.com/r/Amd/comments/5xvv16/r7_1700_binning_data_from_10_cpus_indonesian/

*User Tests*

1700 @ 3.9GHz and RX 480 https://www.reddit.com/r/Amd/comments/5xugl2/i_have_a_170039_and_a_rx_480_are_there_any/

*Misc*

Ryzen Master overclocking tool http://www.amd.com/en/technologies/ryzen-master and the guide http://support.amd.com/TechDocs/AMD%20Ryzen%20Processor%20and%20AMD%20Ryzen%20Master%20Overclocking%20Users%20Guide.pdf
Latency evaluation [French] http://www.hardware.fr/articles/956-22/retour-sous-systeme-memoire.html
Ryzen AMA on reddit with Lisa Su et al https://www.reddit.com/r/Amd/comments/5x4hxu/we_are_amd_creators_of_athlon_radeon_and_other/
Joker and GamersNexus chat https://www.youtube.com/watch?v=04p_ryVM2ow
The Stilt analysis https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/
GamersNexus explaining review disparities https://www.youtube.com/watch?v=TBf0lwikXyU
Core Scaling http://www.phoronix.com/scan.php?page=article&item=amd-ryzen-cores&num=1
1700, 1700X & 1800X OC'd - Will It Help Gaming? https://youtu.be/IKGJshXgOwU
SMT better on win7 than win10? https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/page-8#post-38775732
Ryzen in 16 games https://www.youtube.com/watch?v=PoEcgfbhwTs
Titan X (Maxwell) SLI benching https://youtu.be/8-mMBbWHrwM
SSD testing http://www.tweaktown.com/articles/8073/amd-ryzen-ssd-storage-performance-preview/index.html
Cache, Compute Complexes & Scheduling https://www.youtube.com/watch?v=40h4skxDkh4
RTC bias in Win8/10? http://hwbot.org/newsflash/4335_ryzen_platform_affected_by_rtc_bias_w88.110_not_allowed_on_select_benchmarks
Sleep bug? http://forum.hwbot.org/showthread.php?t=167702
4c/8t 1800x tests http://www.zolkorn.com/reviews/amd-ryzen-7-1800x-vs-intel-core-i7-7700k-mhz-by-mhz-core-by-core/
Quote:


> Originally Posted by *cjwilson*
> 
> *Ryzen 7 1700*
> *Stock:* 3000 MHz
> *Multi-Thread Turbo:* 3300 MHz _ryan edit_: might be 3100?
> *Single-Thread Turbo:* 3700 MHz
> *XFR:* No _ryan edit_: you get some with an x370
> 
> *Ryzen 7 1700X*
> *Stock:* 3400 MHz
> *Multi-Thread Turbo:* 3500 MHz
> *Single-Thread Turbo:* 3800 MHz
> *XFR:* Yes
> *XFR Two-Thread Turbo:* 3900 MHz
> 
> *Ryzen 7 1800X*
> *Stock:* 3600 MHz
> *Multi-Thread Turbo:* 3700 MHz
> *Single-Thread Turbo:* 4000 MHz
> *XFR:* Yes
> *XFR Two-Thread Turbo:* 4100 MHz
> 
> *SKUs with "X" Suffix:* XFR boosts by 100 MHz
> *SKUs without "X" Suffix:* XFR boosts by 50 MHz
> 
> * XFR has been observed to apply its frequency boost to up to four cores while gaming.
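The quoted tiers can be summarized in a tiny sketch. This is just the community-reported table above encoded as data (the `RYZEN7` dict and `xfr_ceiling` helper are made-up names for illustration, not anything from AMD's tooling):

```python
# Boost tiers as quoted above (values in MHz, per cjwilson's table;
# community-reported figures, not official AMD spec).
RYZEN7 = {
    "1700":  {"stock": 3000, "all_core_turbo": 3300, "single_turbo": 3700, "x_sku": False},
    "1700X": {"stock": 3400, "all_core_turbo": 3500, "single_turbo": 3800, "x_sku": True},
    "1800X": {"stock": 3600, "all_core_turbo": 3700, "single_turbo": 4000, "x_sku": True},
}

def xfr_ceiling(sku: str) -> int:
    """XFR adds 100 MHz on 'X' SKUs and 50 MHz on others, per the quoted rule."""
    chip = RYZEN7[sku]
    return chip["single_turbo"] + (100 if chip["x_sku"] else 50)

for sku in RYZEN7:
    print(f"{sku}: up to {xfr_ceiling(sku)} MHz with XFR")
```

Applying the +100/+50 rule reproduces the quoted XFR two-thread turbos (3900 MHz for the 1700X, 4100 MHz for the 1800X).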


I think we've got a good set of reviews now, but if you find anything particularly interesting or niche, feel free to post it and I'll add it. Using the mention feature to bring it to my attention would be helpful too.


----------



## jprovido

Let's goo


----------



## RedM00N

Ryzen shine....AMD....Ryzen..shine.


props if you get the reference and let us hope for the best for the CPU world


----------



## wolfxing

Let's go~~~ glad to be able to read Chinese


----------



## Megaman_90

Cool pictures.


----------



## Mahigan

I don't think that those are legit benchmarks. The combined score for 3D Mark is very odd...


----------



## Wishmaker

Some results.


----------



## JackCY

Oh please, not these Asian leaks again. List some more proper reviews :/


----------



## ryan92084

First English review is up


----------



## daviejams

Quote:


> Originally Posted by *JackCY*
> 
> Oh please, not these Asian leaks again. List some more proper reviews :/


The first link is an English review


----------



## AlphaC

Here we go!

Looks like 4GHz required 1.425V on the Ryzen 7 1700, which somehow clocked higher than the R7 1800X

http://www.overclockersclub.com/reviews/amd_ryzen_7_1800x_1700x_1700/4.htm

http://www.guru3d.com/articles_pages/amd_ryzen_7_1800x_processor_review,23.html - power use is a bit worrying when OC'd, but nothing crazy

Let me know if anyone simulates a Ryzen 5 via the MASTER utility.
Quote:


> Originally Posted by *axiumone*
> 
> So, looking at gaming benchmarks and overclocking headroom, my hype train is getting derailed pretty hard.


Ryzen 5.
Also need to look at the Ryzen 7 1800X @ 4+ GHz (the Ryzen 7 1700 appears to hit a 4GHz OC).
It's a software design issue ("old games"), since the Firestrike score is very high.

http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951-9.html Guess I'm getting only one. It's good for threaded stuff (simulation/render/computation), not as good for modeling & gaming.

P.S. Cinebench single thread is largely _academic_; the 200ms "PDF opening" difference is hilarious.


----------



## Tobiman

The R7 in the Chiphell review isn't boosting to 4.1GHz; that 149cb score proves it. The MT score is what you'd expect at 3.7GHz.


----------



## looniam

if you care:


----------



## Quantium40

Well...

http://www.overclockersclub.com/reviews/amd_ryzen_7_1800x_1700x_1700/

The link appears to be down. That many people want to see a real bench, huh?


----------



## Arturo.Zise

Looks like early motherboard BIOSes are hampering benchmarking. Might need to revisit in a few weeks and see if it improves.


----------



## BURGER4life

http://www.guru3d.com/articles-pages/amd-ryzen-7-1800x-processor-review,1.html


----------



## amd-dude

I'll wait for the usual youtube reviewers, LTT, gamers nexus, bitwit, jay. Soon my sub box will be filled, I'll be occupied for the rest of the day.


----------



## Tobiman

Ocuk servers are being hammered. YT vids should be going up soon.


----------



## NoDestiny

Anandtech's up: http://www.anandtech.com/show/11170/the-amd-zen-and-ryzen-7-review-a-deep-dive-on-1800x-1700x-and-1700


----------



## BURGER4life

Another one
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74814-amd-ryzen-7-1800x-performance-review.html


----------



## amd-dude

https://www.youtube.com/watch?v=j7UBHjtCXhU

HERE WE GO... as Mike Goldberg used to say.


----------



## Jebicore

First 2 Passmark's Official Ryzen 7 1800X Benchmarks! ... https://www.cpubenchmark.net/high_end_cpus.html


----------



## Code-Red

Looks like the 1700 was the right purchase, glad I saved myself a few hundred bucks. Can't wait for it to arrive


----------



## dejau

Damn, looks like 4-4.2GHz is the average overclock Ryzen chips (8-core, mind you) are able to achieve, and it doesn't look like more voltage improves the situation. For those using emulators (Dolphin and PCSX2 mainly) it doesn't bode too well.


----------



## Newbie2009

lol i think some servers are creaking


----------



## looniam

(made a boo boo. )

quick OC 1700 vs 1800X



more at:
http://www.toptengamer.com/amd-ryzen-7-1700-vs-intel-i7-7700k-1800x/


----------



## Wishmaker

Quote:


> Originally Posted by *Newbie2009*
> 
> lol i think some servers are creaking


They must be on Ryzen!


----------



## 428cobra

Gonna get a 1700X for a streaming PC. AMD is back in the game!


----------



## NoDestiny

Toms 1800X: http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951.html


----------



## Wishmaker

Quote:


> In-game performance is just about the only performance-driven metric that doesn't fall into AMD's happy Ryzen narrative. In titles that weren't capped by the game engine or bottlenecked by the TITAN X, Ryzen's winning streak came to an end. The 1800X still provided highly competitive results but in many situations it fell behind the less expensive 7700K and 7600K. This issue isn't unique to Ryzen since even the once-mighty Broadwell-E processors had problems keeping up as well. As I said when Broadwell-E was launched, mammoth 8-core, 16-thread CPUs are great for people who need that excess horsepower but they go largely underutilized in gaming rigs.


A solid observation, rather than 'INTEL IS DEAD, AMD WINS'. Intel will slash prices and will still be in business with this generation. OCN made a big fuss as always.


----------



## BURGER4life

https://www.overclock3d.net/reviews/cpu_mainboard/amd_ryzen_7_1800x_cpu_review/1
http://www.tweaktown.com/reviews/8072/amd-ryzen-7-1800x-cpu-review-intel-battle-ready/index.html

and so on


----------



## daviejams

Quote:


> Originally Posted by *NoDestiny*
> 
> Toms 1800X: http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951.html


I think BF4 on their test might just be GPU bound


----------



## ryanrenolds08

So if you are in the market for a new system, Ryzen looks incredibly solid; if you are primarily a gamer with at least a modern K-series CPU, the holdout for a substantial upgrade continues...

I do love the fact that this injects some life into the rivalry! The thing is a numbers beast, but Guru3D still has it lagging a bit in game-centric situations. Overall, good news for "both sides".


----------



## Artikbot

The excitement is real!! Sifting through all shops atm to grab myself a 1700+X370 Killer SLI.


----------



## Seyumi

Looks like the gaming CPU crown still belongs to the 7700K / 7600K


----------



## axiumone

So, looking at gaming benchmarks and overclocking headroom, my hype train is getting derailed pretty hard.


----------



## Wishmaker

Quote:


> Originally Posted by *Seyumi*
> 
> Looks like the gaming CPU crown still belongs on the 7700k / 7600k


As expected!


----------



## Artikbot

Quote:


> Originally Posted by *axiumone*
> 
> So, looking at gaming benchmarks and overclocking headroom, my hype train is getting derailed pretty hard.


Is it? Mine is going full steam ahead and is about to crash into a storefront so I can grab myself some goods.

Quote:


> Originally Posted by *Wishmaker*
> 
> As expected!

Of course!


----------



## Praetorr

lol @ Hardwarecanucks

_"the 1800X competes very well against and even beats the *nearly three year old Broadwell-E design*."_

Hmmmm.... I wonder why they would so explicitly mention this?

Perhaps doing a little shilling? Perhaps big papa Intel had a few talking points they wanted mentioned?

Then again I'm sure I'm just being paranoid. Intel would never do that.


----------



## Sonikku13

Ryzen on Linux benchies.

https://www.phoronix.com/scan.php?page=article&item=ryzen-1800x-linux&num=1


----------



## Wishmaker

Quote:


> Originally Posted by *Praetorr*
> 
> lol @ Hardwarecanucks
> 
> _"the 1800X competes very well against and even beats the *nearly three year old Broadwell-E design*."_
> 
> Hmmmm.... I wonder why they would so explicitly mention this?
> 
> Perhaps doing a little shilling? Perhaps big papa Intel had a few talking points they wanted mentioned?
> 
> Then again I'm sure I'm just being paranoid. Intel would never do that.


..so Anand got paid too, yeah?

Quote:


> The CPU has a traditional uArch and does well, especially compared to last generation AMD, and a new high-perf core will be a feather in their cap. We see a lot of benchmark results where AMD is clearly equal or above Intel's HEDT parts in both ST and MT. *However there are a few edge cases where AMD is lacking behind 10-20% still, even to Broadwell*.


----------



## ryan92084

Already one reviewer (overclock3d.net) has admitted they didn't redo their tests with the last-minute RAM-boosting BIOS. Ugh.


----------



## Ding Chavez

Gaming benches about what I expected.

OC a bit less than hoped for...

Overall pretty good IMO. The 6 cores will be the best value but not out yet for months.


----------



## Praetorr

You're not gonna have your "gotcha" moment with a strawman that weak.

I didn't say Intel "paid" anybody, that would be illegal.

I'm saying that maybe they wouldn't get all expense paid vacations to Intel events if they didn't hit certain talking points.

Is that clear enough?

(P.S. Yeah, you'll say "who cares?" and "that's Intel's right, to fly out whomever they want." Yeah, it is, but it also means we should take anything these reviewers say with a healthy dose of salt.)
Quote:


> Originally Posted by *Wishmaker*
> 
> ..so Anand got paid too, yeah?


----------



## PsyM4n

The war has begun. May the fanboys compete in mortal kombat.


----------



## Catscratch

http://www.anandtech.com/show/11170/the-amd-zen-and-ryzen-7-review-a-deep-dive-on-1800x-1700x-and-1700/17

Wow. Just look at the 65w cpu in those tests. 1700 shines so bright.


----------



## rt123

Quote:


> Originally Posted by *Wishmaker*
> 
> A solid observation, rather than 'INTEL IS DEAD, AMD WINS'. Intel will slash prices and will still be in business with this generation. OCN made a big fuss as always.


Lower clocked Ryzen falls behind. News @ 11.


----------



## Wishmaker

Now it is clear why AMD priced it like this. Lisa Su said AMD does not want to continue on the 'budget solution' path, yet Ryzen is extremely affordable. Had the 1800X been the best on the market, it would have been double the price. Ryzen is a good step forward, and if Ryzen 2 brings a 15% performance increase over Ryzen, then Intel will need to put out more than they have these past 5 years. The future looks good!


----------



## Blze001

I really don't understand Intel fans praying Ryzen fails. Intel having no competition is horrible for everyone.


----------



## Descadent

Which review actually has them overclocked beyond the stock XFR clock? Every one I've seen so far, videos included, has not overclocked them while comparing against an overclocked i7.

So many reviews are coming out so fast that it's hard to go through them all just to see if they've pushed past 4.1GHz.


----------



## naz2

pretty much exactly what i expected. generally competitive performance, intel quads are still king in games, can't overclock.

overall very good but didn't quite live up to the hype drummed up by the amd fanboys frothing at the mouths for the past few weeks. looks like the base 1700 will be the best choice, and probably the one i'll be picking up


----------



## Samuris

Is it just me, or are the R7 1700, R7 1700X, and R7 1800X the same processor at different clocks? It looks like the R7 1700 after an overclock is the best of the three, yet it's at the lowest price. Strange, no?


----------



## SoloCamo

Ah, results more in line with realistic expectations vs the recent hype. Good for the price but again a jack of all trades and master of none.


----------



## Serios

Quote:


> Originally Posted by *Wishmaker*
> 
> A solid observation, rather than 'INTEL IS DEAD, AMD WINS'. Intel will slash prices and will still be in business with this generation. OCN made a big fuss as always.


Yeah, you can calm down now, but don't forget that in the upcoming months AMD will launch their hexa- and quad-core CPUs.


----------



## jezzer

Oh wow, I still need this for editing, and I won't lose FPS in games compared to the 4790K. And since it doesn't OC much, I can get a cheap B350 mobo and save some more.


----------



## BinaryDemon

Most of the reviews seem similar; here's what I'm taking away from them:

+Seems like most of the leaks were accurate; it's a very well-rounded processor that's good for gaming and excellent with most multithreaded workloads.
+Power draw seems to end up better than all the Intel Broadwell-E systems, but not quite as good as the i7 Kaby Lake.
+It has great price/performance value.
-Seems like AMD went really aggressive with stock clock speeds and turbo, not leaving a lot of room for overclocking. Probably a smart business move; it will be interesting to see what kind of overclocks OCN gets from Ryzen.

Nice work AMD.


----------



## BrainSplatter

Here is a selection of reviews with overclocking tests:

*overclockersclub.com: 4.1GHz on the 1700, 4.0GHz on the others*:
http://www.overclockersclub.com/reviews/amd_ryzen_7_1800x_1700x_1700/4.htm

*guru3d* also managed a *4.1GHz OC* on an 1800X:
https://www.guru3d.com/articles_pages/amd_ryzen_7_1800x_processor_review,22.html

*Hardware Canucks 4.0*:
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74814-amd-ryzen-7-1800x-performance-review-19.html
Quote:


> So what does this all mean for overclocking? I'm not quite sure yet since I'm still dialing things in. Right now it looks like overclocking will settle *somewhere between 3.85GHz and 4GHz*.


*ETeknix 4.1*
http://www.eteknix.com/amd-ryzen-7-1800x-am4-8-core-processor-review/6/
Quote:


> We did manage to get the system to POST at 4.2GHz with 1.5v, but the heat was creeping up and the system shut down. While I do think the system will bench at this level, we would need a more powerful cooler to keep it stable, but it seems possible with a custom loop. *4.1 GHz with 1.488v* on the other hand, was much more successful and the temperatures were much more manageable.


*Computerbase.de (German) 3.9-4.1*
https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/7/
Quote:


> The Ryzen 7 is completely stable @ 3.9 GHz. It manages 4.15 GHz with light load, 4.1 GHz for games and 4.05 GHz for Prime95


*Hexus.net 4.1*
http://hexus.net/tech/reviews/cpu/102964-amd-ryzen-7-1800x-14nm-zen/?page=11
Quote:


> Using a Noctua NH-D15S equipped with a single fan and overclocking by using the BIOS, we managed an all-core, stable speed of *4.1GHz*. Voltage was increased only slightly, from 1.3625V to 1.375V.


*Kitguru.net 4.1*
http://www.kitguru.net/components/cpu/luke-hill/amd-ryzen-7-1800x-cpu-review/3/
Quote:


> The highest Cinebench-stable processor frequency that we could achieve with all 8 cores and SMT enabled was 4100MHz. Frequencies such as 4150MHz and 4200MHz would boot into Windows 10 but could not retain stability under Cinebench loading.


*Tweaktown.com 4.0*
http://www.tweaktown.com/reviews/8072/amd-ryzen-7-1800x-cpu-review-intel-battle-ready/index12.html
Quote:


> we have overclocked the CPU to 4GHz on all cores and the memory to 2933MHz. The overclock you see here is the maximum of most first generation 1800X, at least from what I can gather, when overclocking all cores


*Hot Hardware 4.0*
http://hothardware.com/reviews/amd-ryzen-7-1800x-1700x-1700-benchmarks-and-review?page=10
Quote:


> topped out at an all-core overclock of 4.0GHz, with 1.35 volts supplied to the CPU. For cooling, we stuck with the tower-type Thermaltake Contact Silent 12 we showed you on a previous page. At these settings, the CPU topped out at just shy of 90°C and was perfectly stable. Bumping things up to 4.1GHz would result in frequent crashes and the system wouldn't boot properly at 4.2GHz.
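
To put the 1800X results above in context, here's a quick back-of-the-envelope headroom calculation over its 3.6GHz base clock. The `reported` dict just restates the all-core figures quoted above (the 1700 results are excluded since they have a different base clock):

```python
# Percent all-core OC headroom over the 1800X's 3.6 GHz base clock,
# using the all-core results from the reviews quoted above.
BASE_GHZ = 3.6
reported = {
    "guru3d": 4.1, "eteknix": 4.1, "hexus": 4.1, "kitguru": 4.1,
    "tweaktown": 4.0, "hothardware": 4.0,
}
for site, oc_ghz in sorted(reported.items()):
    headroom = (oc_ghz / BASE_GHZ - 1) * 100
    print(f"{site}: {oc_ghz} GHz (+{headroom:.1f}%)")
```

So the reviews cluster at roughly 11-14% over base, which lines up with the "aggressive stock clocks, little OC room" reading elsewhere in the thread.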


----------



## unseen0

Quote:


> Originally Posted by *naz2*
> 
> pretty much exactly what i expected. generally competitive performance, intel quads are still king in games, can't overclock.
> 
> overall very good but didn't quite live up to the hype drummed up by the amd fanboys frothing at the mouths for the past few weeks. looks like the base 1700 will be the best choice, and probably the one i'll be picking up


The R3s and R5s may overclock better, resulting in higher single-core performance, and they could still wipe the floor with Intel's flagship gaming CPUs.

All in all, you'd be foolish to get a Kaby Lake over Ryzen at this point. Sure, you get 5% more FPS in games, but in turn you lose 50% in multithreaded applications that are becoming more and more mainstream, games included.


----------



## Artikbot

Quote:


> Originally Posted by *Descadent*
> 
> which review actually has them overclocked beyond stock xfr clock? Everyone i've seen so far and videos has not overclocked them and compare to the i7 overclocked.
> 
> so many reviews coming out so fast it's hard trying to go through them all just to see if they have pushed past 4.1


They seem to struggle, although OCC admitted to not even having tried to overclock outside Ryzen Master.

I'll wait for the local retailers to start stocking boards and for more reviews to flood in.


----------



## looniam

Quote:


> Originally Posted by *Wishmaker*
> 
> ..so Anand got paid too, yeah?
> Quote:
> 
> 
> 
> The CPU has a traditional uArch and does well, especially compared to last generation AMD, and a new high-perf core will be a feather in their cap. We see a lot of benchmark results where AMD is clearly equal or above Intel's HEDT parts in both ST and MT. *However there are a few edge cases where AMD is lacking behind 10-20% still, even to Broadwell*.

just for posterity: (e: vocabulary FAIL!)
Quote:


> However there are a few edge cases where AMD is lacking behind 10-20% still, even to Broadwell. These edge cases are difficult to anticipate, and can stem from unoptimized code. One of the benefits of Intel's big R&D juggernaut is the ability to process those edge cases, through prefetch, memory algorithms, and extensive testing. So despite the best will, there's still a large element to having a substantial budget to hire 300+ more engineers to cater for that, which is something AMD wasn't able for Zen or Ryzen.


----------



## ryan92084

Added a whole slew of reviews. I've fallen behind on the motherboard portion though.


----------



## ondoy

*Ryzen - Return of the Jedi*


----------



## Wishmaker

Quote:


> Originally Posted by *Serios*
> 
> Yeah you can calm down now but don't forget that in the upcoming months AMD will launch their hexa and quad core CPUs.


No need to fret, I am sure Zen 2 will be better. This is another example of a product that OCN pushed to new levels of hype. Lisa Su should cut some cheques for certain people, because damn, everyone was thinking they'd made the wrong choice with Intel and should have waited for AMD.

Intel still remains a valid choice, contrary to what some people are saying here on OCN. Benches are out, things are public, and we got a good product from AMD; stop the poisonous statements that Intel is trash and AMD is better.


----------



## Olivon

Vs 7700K

Apps = +35%
Games = -20%

http://www.hardware.fr/articles/956-21/indices-performance.html


----------



## Quantium40

Quote:


> Originally Posted by *BinaryDemon*
> 
> Most of the reviews seem similar, here's what I'm taking away from them:
> -Seems like AMD went really aggressive with stock clock speeds and turbo, not leaving a lot of room for overclocking. Probably a smart business move, it will be interesting to see what kind of overclocks OCN gets from Ryzen.


It is sort of impressive in a way (although sad for OCers) that they can bin their chips so consistently close to their limits. It makes for a better chip from a business perspective for sure.


----------



## Descadent

so let me get this right?

you can't get an 1800x past 4.1?!??!! or are these guys taking it easy...can i not get it to 4.6-5.0?!?!


----------



## naz2

Quote:


> Originally Posted by *unseen0*
> 
> The R3 and R5's may overclock better, resulting in higher single core performance and they could still wipe the floor with Intel's flagship gaming cpu's.
> 
> All in all, you'd be foolish to get a kaby lake over Ryzen at this point. Sure u get 5% more fps in games. But in turn u lose 50% in multi threaded applications that are becoming more and more mainstream, games included.


yes will be interesting to see how the r5 quads compare and overclock, but that's only coming in H2. kaby lake is still the best gaming CPU and with recent price cuts, r7 is not an automatic buy. average user gets these enthusiast CPUs exclusively for games in which ryzen is relatively weakest. anyone doing serious multithreading would also benefit from quad-channel memory which ryzen lacks. so no, in a word, intel is not "finished"


----------



## jezzer

Doesn't the CPU boost to 4GHz anyway? Of course it's stable at 4GHz then...


----------



## IRONPIG1

Well, thanks for the hype everyone. Turned out as expected (actually disappointing, but the market was GOOD!); I sold my shares at just under 15, so I am indeed very happy with this launch!

So, drawing from the released benchmarks so far, is it safe to say that a 2600K @ 4.9GHz will still hold its own against a 1800X in games, if not outperform it, considering the 1800X's minimum-FPS drops?


----------



## SirCumference

Quote:


> Originally Posted by *Descadent*
> 
> so let me get this right?
> 
> you can't get an 1800x past 4.1?!??!! or are these guys taking it easy...can i not get it to 4.6-5.0?!?!


Doesn't look like it. Reviewers pushing crazy voltages are barely able to get the 1800X to 4.1GHz.


----------



## naz2

also i think AMD made a mistake by leaving every CPU unlocked. now there is virtually no reason to buy the 1700x or 1800x over the base 1700. for most people the $70 premium would've been justified and still far cheaper than anything comparable from intel. amd just lost a lot of money there


----------



## IRONPIG1

Quote:


> Originally Posted by *jezzer*
> 
> Doesnt the cpu boost to 4ghz anyways? offcourse its stable at 4ghz then...


It boosts 1 to 2 of the cores to 4GHz whilst the others are idle, if that is what you mean; I haven't seen all 8 @ 4GHz.


----------



## SoloCamo

Quote:


> Originally Posted by *naz2*
> 
> also i think AMD made a mistake by leaving every CPU unlocked. now there is virtually no reason to buy the 1700x or 1800x over the base 1700. for most people the $70 premium would've been justified and still far cheaper than anything comparable from intel. amd just lost a lot of money there


There is still a rather large market that would rather not deal with overclocking. They are catering to both crowds here.


----------



## Rocozaur

Please add the Lab501 review located here:

REVIEW AMD RYZEN 7 1800X - WELCOME BACK AMD

Yes, you're gonna need a bit of Google Translate but the numbers and graphs speak for themselves


----------



## lombardsoup

Quote:


> Originally Posted by *SirCumference*
> 
> Doesn't look like it. Reviewer pushing crazy voltages are barely able to get the 1800X to 4.1Ghz


Highest I've seen one yet is 4.3, but that's on an EKWB block

I'm not sure if reviewers are just being conservative, or if that really is the upper limit.


----------



## Sonikku13

There's some other good news for people with pre-Ryzen AMD Wraith coolers. Old AMD stock coolers will probably work with Ryzen.

Edit - Well, I didn't read the article I was reading enough. I skimmed through it. lol


----------



## Wishmaker

Quote:


> Originally Posted by *Descadent*
> 
> so let me get this right?
> 
> you can't get an 1800x past 4.1?!??!! or are these guys taking it easy...can i not get it to 4.6-5.0?!?!


Quote:


> Originally Posted by *SirCumference*
> 
> Doesn't look like it. Reviewer pushing crazy voltages are barely able to get the 1800X to 4.1Ghz


It was a known fact for months that the Intel process is superior. People complained about Intel clocks on these chips and completely forgot AMD does not have the same process. So Intel's clocks seem rather good now.


----------



## DADDYDC650

So 1700 over 1700x/1800x and just OC it yourself to 4Ghz?


----------



## Samuris

I don't understand this.
So why get an R7 1800X?


----------



## ryan92084

Quote:


> Originally Posted by *DADDYDC650*
> 
> So 1700 over 1700x/1800x and just OC it yourself to 4Ghz?


http://www.xtremesystems.org/forums/showthread.php?293130-Ryzen-Return-of-the-Jedi&p=5254423#post5254423 has a good evaluation of that in the preamble


----------



## ondoy

*AMD RYZEN 7 1700X LINUX BENCHMARKS*


----------



## IRONPIG1

Quote:


> Originally Posted by *Olivon*
> 
> 
> 
> 
> 
> Vs 7700K
> 
> Apps = +35%
> Games = -20%
> 
> http://www.hardware.fr/articles/956-21/indices-performance.html


These graphs are very confusing.


----------



## Descadent

this is just stupid if all you can get is 4.1 in 2017. talk about disappointment. why would i buy ryzen over intel if 4.1 is IT?!

not to mention it's not even up to par in gaming performance. I need both gaming and rendering performance personally.


----------



## Kuivamaa

Quote:


> Originally Posted by *lombardsoup*
> 
> Highest I've seen one yet is 4.3, but that's on an EKWB block
> 
> I'm not sure if reviewers are just being conservative, or if that really is the upper limit.


Both, probably. Most do not want to go past 1.45V, which is where AMD says the chip will degrade in the long run. It seems the Samsung/GloFo process still needs to mature a bit.


----------



## sammkv

damn, in some games ryzen is only a little better than old intel quads


----------



## Cyph3r

I was very excited for Ryzen and was going to do a Ryzen build very soon, but after looking at the reviews, I'm feeling kind of "meh" about it...


----------



## kalelovil

Quote:


> Originally Posted by *sammkv*
> 
> damn, in some games ryzen is only a little better than old intel quads


Looks like AMD's SMT implementation is having a significant negative effect on some games.

http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7


----------



## SoloCamo

Quote:


> Originally Posted by *Descadent*
> 
> this is just stupid if all you can get is 4.1 in 2017. talk about disappointment. why would i buy ryzen over intel if 4.1 is IT?!


Are you serious? Did you buy the pentium 4 over athlon 64 due to much higher clock speeds, too?

I'll take a 4.1ghz 8c16t over a 5ghz 4c8t...

Really depends on your workload and more importantly, budget.


----------



## gigafloppy

Conclusion:
- If all you do is gaming, don't buy it.
- If all you do is rendering, BUY IT, NOW!!!


----------



## czin125

Quote:


> Originally Posted by *Descadent*
> 
> so let me get this right?
> 
> you can't get an 1800x past 4.1?!??!! or are these guys taking it easy...can i not get it to 4.6-5.0?!?!


Is anyone going to delid these and apply liquid metal both under the IHS and on top of it, to maximize heat transfer from CPU -> IHS -> waterblock, tested with at least a 240mm rad?

http://www.thermal-grizzly.com/en/products/26-conductonaut-en

Maybe it scales better with lower heat?


----------



## IRONPIG1

Quote:


> Originally Posted by *Cyph3r*
> 
> I was very excited for Ryzen and was going to do a Ryzen build very soon, but after looking at the reviews, I'm feeling kind of "meh" about it...


lol ^^ was kind of expected







I still have to upgrade from my 2600k, but it seems a 1800x wouldn't be much of an upgrade in terms of gaming performance.


----------



## Shiftstealth

Quote:


> Originally Posted by *kalelovil*
> 
> Looks like AMD's SMT implementation is having a significant negative effect on some games.
> 
> http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7


It loses 0.2 FPS with SMT.


----------



## Sonikku13

Quote:


> Originally Posted by *gigafloppy*
> 
> Conclusion:
> - If all you do is gaming, don't buy it.
> - If all you do is rendering, BUY IT, NOW!!!


If you're coming from an A10-7850K like me, or an FX-series processor, and want 8 cores, _buy it now!_


----------



## IRONPIG1

Quote:


> Originally Posted by *SoloCamo*
> 
> Are you serious? Did you buy the pentium 4 over athlon 64 due to much higher clock speeds, too?
> 
> I'll take a 4.1ghz 8c16t over a 5ghz 4c8t...
> 
> Really depends on your workload and more importantly, budget.


+


----------



## criminal

Intel







AMD


----------



## Slomo4shO

The 1700 at $329 seems to be a winner. Doesn't seem to be much of an improvement going up to a 1700X or 1800X. Very little OC headroom, so why pay a premium for an X variant?


----------



## Shiftstealth

Quote:


> Originally Posted by *Slomo4shO*
> 
> The 1700 at $329 seems to be a winner. Doesn't seem to be much of an improvement going up to a 1700X or 1800X. Very little OC headroom, so why pay a premium for an X variant?


I'm upset I got the 1700x now, but it is already on the FedEx truck. So meh. That 400 MHz higher base clock felt significant.


----------



## SoloCamo

Quote:


> Originally Posted by *Slomo4shO*
> 
> The 1700 at $329 seems to be a winner. Doesn't seem to be much of an improvement going up to a 1700X or 1800X. Very little OC headroom, so why pay a premium for an X variant?


Exactly. Everyone is looking at the 1800x but the 1700 is the go to chip here. This is the real star of the show much like the 8320 was compared to a 9590.
Quote:


> Originally Posted by *Shiftstealth*
> 
> I'm upset I got the 1700x now, but it is already on the FedEx truck. So meh. That 400 MHz higher base clock felt significant.


At least you didn't jump on the 1800x bandwagon.


----------



## Shiftstealth

Quote:


> Originally Posted by *SoloCamo*
> 
> Exactly. Everyone is looking at the 1800x but the 1700 is the go to chip here. This is the real star of the show much like the 8320 was compared to a 9590.
> At least you didn't jump on the 1800x bandwagon.


Yep. I'll only recommend the 1700 to people.

Yes. So happy i did not buy the 1800x.


----------



## ryan92084

Quote:


> Originally Posted by *SoloCamo*
> 
> Exactly. Everyone is looking at the 1800x but the 1700 is the go to chip here. This is the real star of the show much like the 8320 was compared to a 9590.
> At least you didn't jump on the 1800x bandwagon.


Quote:


> Originally Posted by *Shiftstealth*
> 
> Yep. I'll only recommend the 1700 to people.
> 
> Yes. So happy i did not buy the 1800x.


Yep, for anyone not afraid of overclocking the 1700 is the winner of the three so far. It also might explain why they've thinned the 6c/12t line out to just one chip versus the rumored 2-4.


----------



## Sonikku13

AMD redesigned their site as Ryzen launched.

https://www.amd.com/en


----------



## kalelovil

Quote:


> Originally Posted by *sammkv*
> 
> damn, in some games ryzen is only a little better than old intel quads


Quote:


> Originally Posted by *Shiftstealth*
> 
> It loses 0.2 FPS with SMT.


Depends on the game.
Quote:


> Total Warhammer shows the biggest change in performance when disabling SMT. The AMD R7 1800X moves from ~127FPS AVG (1% low: 90, 0.1% low: 65.7) to ~153FPS AVG with SMT0. That's an increase in performance of 20.5% by disabling AMD's most advertised property. In the 1% and 0.1% low values, AMD moves from 90 to 117.3FPS (1% low) and from 65.7 to 101.7FPS (0.1% low), indicating further that SMT hamstrings frame latency significantly.

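The percentages in that quote are easy to sanity-check. A quick sketch (FPS numbers are the ones quoted above; `pct_uplift` is just a throwaway helper, not anything from GamersNexus):

```python
def pct_uplift(before: float, after: float) -> float:
    """Percentage increase going from `before` to `after`."""
    return (after / before - 1.0) * 100.0

# Total War: Warhammer, R7 1800X, SMT enabled -> SMT disabled (values from the quote)
print(f"average FPS: +{pct_uplift(127.0, 153.0):.1f}%")  # ~ +20.5%, matching the quote
print(f"1% lows:     +{pct_uplift(90.0, 117.3):.1f}%")
print(f"0.1% lows:   +{pct_uplift(65.7, 101.7):.1f}%")
```

The low-end uplift (1% and 0.1% lows) is actually larger than the average uplift, which is why the quote calls out frame latency specifically.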

----------



## Descadent

Quote:


> Originally Posted by *SoloCamo*
> 
> Are you serious? Did you buy the pentium 4 over athlon 64 due to much higher clock speeds, too?
> 
> I'll take a 4.1ghz 8c16t over a 5ghz 4c8t...
> 
> Really depends on your workload and more importantly, budget.


yes i'm serious, and an 1800x is hardly different, if different at all, from my 6-year-old 2600k in gaming, which has been at 4.6 for 6 years now. yeah it will be better for 4k rendering, but i could get an intel that will do both just as well. ryzen is looking like a disappointment unless they can get the clock speed up when overclocking. why would i buy ryzen for 20% less performance in one of my applications compared to even a 7700k or 6850/6800k?


----------



## Slomo4shO

Just need a quality ITX board now.

Seems it is time to part out and rebuild Little Boy & Tsar Bomba


----------



## looniam

keep in mind reviews haven't had a lot of time OCing ryzen. nor did they have a lot of different mobos and mobos with a "beta" bios.

i'd be hesitant to make an OCing judgement until a few members here do some extensive testing . . .


----------



## PostalTwinkie

Why are people ragging on an 8 core hitting 4.2? That is damn good, especially round 1.

Come talk to all the Haswell-E owners that wish they could hit that on their 6 core.


----------



## RedM00N

Seems like, as someone said, so far it's a jack of all trades, master of none. Well-rounded CPUs, especially for the price. I will still withhold final judgment until user reviews/numbers come in.

As I'm on a gimped 3930k with only 4 cores active, this will look good in comparison for the most part









Now I just wait a couple months for Silicon Lottery to see if they can find some 4.3/4.4s (wasn't expecting more than Broadwell-E OCs anyway). Funny how the 1700s so far seem to be the better overclockers, but it's all a lottery really. If not, I'm no stranger to low overclocks; my 5820k was only at 4GHz after my folding days and still ran most games with more than enough FPS.


----------



## Mand12

I read through the Anandtech review and benchmarks, and they're all for things that I may do, but don't really care about. As someone who does not do rendering or really any CPU-intensive workloads and cares solely about gaming performance, how excited should I be and how much should I actually care which of the two companies' products I get?


----------



## Axon14

Quote:


> Originally Posted by *Shiftstealth*
> 
> I'm upset I got the 1700x now, but it is already on the FedEx truck. So meh. That 400 MHz higher base clock felt significant.


These results and your feelings are precisely why I cancelled my pre-order. I had the 1800x and the Crosshair VI. Total was $750+ USD. I have a 6700k that clocks to 4.5 and I mostly game. Ryzen had minimal leaks and a strict pre-release review embargo. Things looked good, and I knew Ryzen was going to be their best offering in years, but something still felt a little off to me. The clocks were low, and with 8 cores, I was wondering how far these chips could be pushed. Decided to wait and see, and now I'm glad I did. These results aren't _bad_, but they are not earth shattering either.

I'm sure I'll get the itch to upgrade my backup rig within the next couple of months. Hopefully by then the 1600x will have launched or microcenter will offer one of its $50 discounts and -$30 on a motherboard combo.


----------



## Cyph3r

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Why are people ragging on an 8 core hitting 4.2? That is damn good, especially round 1.
> 
> Come talk to all the Haswell-E owners that wish they could hit that on their 6 core.


I thought 4.2 was standard on H-E? And 4.4ghz was the typical OC from what I've seen.


----------



## SoloCamo

Quote:


> Originally Posted by *Descadent*
> 
> yes i'm serious, and an 1800x is hardly different, if different at all, from my 6-year-old 2600k in gaming, which has been at 4.6 for 6 years now. yeah it will be better for 4k rendering, but i could get an intel that will do both just as well. ryzen is looking like a disappointment unless they can get the clock speed up when overclocking. why would i buy ryzen for 20% less performance in one of my applications?


What intel cpu at $329 will match the multithreaded/rendering performance of Ryzen? Your argument falls flat. Your 4.6ghz 2600k is practically as fast as my stock 4790k as well in gaming and not far behind a 7700k, either.

You can get an intel as good as Ryzen in multi threaded loads, but you are going to pay A LOT more for it. The $329 1700 at 4.1ghz has no problem matching a $1000 6900k.. Even if you get 4.5ghz out of the 6900k (which isn't too common) is it worth near enough $700?
Quote:


> Originally Posted by *Mand12*
> 
> I read through the Anandtech review and benchmarks, and they're all for things that I may do, but don't really care about. As someone who does not do rendering or really any CPU-intensive workloads and cares solely about gaming performance, how excited should I be and how much should I actually care which of the two companies' products I get?


Depends on the cpu you are coming from and the games you play.


----------



## naz2

SMT is HT all over again. i remember how everyone emphatically advised to disable it in BIOS for improved gaming performance back with the original nehalem and sandy bridge. 8 years later intel has it figured out but amd is only starting from the beginning
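
Disabling SMT is a BIOS switch, but you can at least verify the current state from the OS. A minimal sketch, assuming a reasonably recent Linux kernel that exposes the sysfs SMT control file (older kernels and other OSes simply get `None`):

```python
from pathlib import Path
from typing import Optional

def smt_active() -> Optional[bool]:
    """Best-effort check of whether SMT (AMD) / Hyper-Threading (Intel) is enabled.

    Recent Linux kernels expose /sys/devices/system/cpu/smt/active ("1" or "0");
    returns None when the interface isn't available (older kernels, other OSes).
    """
    try:
        return Path("/sys/devices/system/cpu/smt/active").read_text().strip() == "1"
    except OSError:
        return None

print("SMT active:", smt_active())
```

Handy for checking before and after a BIOS toggle when re-running game benchmarks.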


----------



## 98uk

Many people are saying the 1700 is a better value chip than the 1700x, can someone summarise why?

What does the 1700x actually offer over the 1700 and does it offer value?


----------



## mistax

In front of Microcenter right now waiting for them to open. I'm so glad the reviews came out. Originally I was going to buy the 1800x, but if I can overclock the 1700 to 4-4.1 anyway, I'm saving $170. Not too concerned about the gaming benchmarks as I expected gaming performance similar to other hex/octo cores.


----------



## lombardsoup

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Why are people ragging on an 8 core hitting 4.2? That is damn good, especially round 1.
> 
> Come talk to all the Haswell-E owners that wish they could hit that on their 6 core.


Exactly. Overclocking any 8 core processor (Intel or AMD) isn't an easy task: both brands tend to crap out at around 4.2-4.3.


----------



## Mahigan

Quote:


> Originally Posted by *Wishmaker*
> 
> Now it is clear why AMD priced it like this. Lisa SU said AMD does not want to continue on the 'budget solution' path, yet Ryzen is extremely affordable. Had 1800x been the best on the market, it would have been double the price. Ryzen is a good step forward and if Ryzen2 brings 15% increase in performance vs Ryzen, then INTEL will need to put out more than they have done these past 5 years. The future looks good
> 
> 
> 
> 
> 
> 
> 


Uh?

So because it trades blows with a $1,000+ Intel CPU (6900K), AMD made the wise choice of pricing it at $499? No dude.. they could have priced it at $800 given its performance. They priced it lower because of their strategy (lowering the cost of a 4K gaming capable PC).


----------



## Descadent

Quote:


> Originally Posted by *SoloCamo*
> 
> What intel cpu at $329 will match the multithreaded/rendering performance of Ryzen? Your argument falls flat. Your 4.6ghz 2600k is practically as fast as my stock 4790k as well in gaming and not far behind a 7700k, either.
> 
> You can get an intel as good as Ryzen in multi threaded loads, but you are going to pay A LOT more for it. The $329 1700 at 4.1ghz has no problem matching a $1000 6900k.. Even if you get 4.5ghz out of the 6900k (which isn't too common) is it worth near enough $700?
> Depends on the cpu you are coming from and the games you play.


i'm talking about the 1800x not the 1700.


----------



## DADDYDC650

Quote:


> Originally Posted by *ryan92084*
> 
> http://www.xtremesystems.org/forums/showthread.php?293130-Ryzen-Return-of-the-Jedi&p=5254423#post5254423 has a good evaluation for that in the pre amble


So 1700 from Amazon or silicon lottery. Thank goodness I canceled my 1800x pre-order yesterday.


----------



## Shiftstealth

Quote:


> Originally Posted by *Axon14*
> 
> These results and your feelings are precisely why I cancelled my pre-order. I had the 1800x and the Crosshair VI. Total was $750+ USD. I have a 6700k that clocks to 4.5 and I mostly game. Ryzen had minimal leaks and a strict pre-release review embargo. Things looked good, and I knew Ryzen was going to be their best offering in years, but something still felt a little off to me. Decided to wait and see, and now I'm glad I did. These results aren't _bad_, but they are not earth shattering either.
> 
> I'm sure I'll get the itch to upgrade my backup rig within the next couple of months. Hopefully by then the 1600x will have launched or microcenter will offer one of its $50 discounts and -$30 on a motherboard combo.


I'm upset I got the 1700x over the 1700. Not upset I got Ryzen. I will be happy to support AMD once again.


----------



## Artikbot

Quote:


> Originally Posted by *98uk*
> 
> Many people are saying the 1700 is a better value chip than the 1700x, can someone summarise why?
> 
> What does the 1700x actually offer over the 1700 and does it offer value?


Both overclock to the same clockspeeds.

If you're not going to overclock (and I know you aren't going to), the 1700X is the best of the three in terms of value, considering XFR (the auto-overclock feature).


----------



## 98uk

Quote:


> Originally Posted by *Shiftstealth*
> 
> I'm upset I got the 1700x over the 1700. Not upset I got Ryzen. I will be happy to support AMD once again.


What is the difference between the two... apart from cost!?


----------



## Sonikku13

Quote:


> Originally Posted by *Mahigan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wishmaker*
> 
> Now it is clear why AMD priced it like this. Lisa SU said AMD does not want to continue on the 'budget solution' path, yet Ryzen is extremely affordable. Had 1800x been the best on the market, it would have been double the price. Ryzen is a good step forward and if Ryzen2 brings 15% increase in performance vs Ryzen, then INTEL will need to put out more than they have done these past 5 years. The future looks good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Uh?
> 
> So because it trades blows with a $1,000+ Intel CPU (6900K), AMD made the wise choice of pricing it at $499? No dude.. they could have priced it at $800 given its performance. They priced it lower because of their strategy (lowering the cost of a 4K gaming capable PC).

AMD needs market share. Sure, they could have priced Ryzen at $800, but at that price they wouldn't have gotten many buyers. At $500, a 1800X + a 1080 costs about the same as a 6900K alone. Or a 1700 + a 1080 Ti = a 6900K. Practically sells itself.


----------



## PureBlackFire

as I expected these are clearly not "gaming" cpus. hopefully AMD can get the 4-6 core Ryzen chips to overclock like the BD/PD chips did; those would be great gaming cpus, similar to how Intel's mainstream i7 chips clock a lot higher than the HEDT parts and sit atop virtually every gaming chart due to that high clock speed. having three unlocked 8 cores with a similar clock ceiling seems pointless. always said AMD should learn to segment their products better. good product overall though. went from non-competitive to right back in the mix.


----------



## 98uk

Quote:


> Originally Posted by *Artikbot*
> 
> Both overclock to the same clockspeeds.
> 
> If you're not going to overclock (and I know you aren't going to), 1700X is the best of the three in terms of value, considering XFR (auto overclock thing).


Ok cool, sounds good. Will be what I pick up then.


----------



## SoloCamo

Quote:


> Originally Posted by *Descadent*
> 
> i'm talking about the 1800x not the 1700.


The point still stands. The 1800x is $500, the 6900k is $1000. The 1700 at $329 just makes the 6900k pointless if you can overclock. That $700 saving can go a long way toward the rest of the system.

That said, the 4c8t and 6c12t cpu's should be quite interesting as well.

If either can hit 4.5ghz they will be in great shape and again, offer great price/performance.


----------



## BenchZowner

So... the average all-core OC ballpark seems to be 4GHz to 4.1GHz on air/water. (Note: a 24/7-stable OC, not running a few benchmarks at 4.2GHz like some reviewers did; if it crashes in Prime95 or BOINC, or during a long video encode or render, it isn't a stable OC.)

Gaming performance where the CPU counts (1080p) is still inferior to Intel's offerings (don't expect that to change with a BIOS update or 3600 memory instead of 3000, for example).
Good news for 1440p and 2160p users: they are on par with Intel there, since the CPU's role is a significantly smaller factor (the CPUs are done; graphics cards for decent 4k gaming are yet to really arrive).

Still sad to see the 8c Ryzen behind the 4c 7700K in h.265 (see the AVX2 disadvantage), which is key for video in the coming years.


----------



## Shiftstealth

Quote:


> Originally Posted by *98uk*
> 
> What is the difference between the two... apart from cost!?


1700X: 3.4GHz base / 3.8GHz boost

1700: 3.0GHz base / 3.7GHz boost

Boost clocks are single-threaded.
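
Put differently, the gap between the two chips is front-loaded at base clock. A throwaway sketch using the clocks from this post (the `specs` dict is just for illustration):

```python
# Stock clocks quoted above, in GHz
specs = {
    "1700":  {"base": 3.0, "boost": 3.7},
    "1700X": {"base": 3.4, "boost": 3.8},
}

for field in ("base", "boost"):
    lo, hi = specs["1700"][field], specs["1700X"][field]
    print(f"{field}: 1700X leads by +{(hi / lo - 1) * 100:.1f}%")
# base:  +13.3%  (the 400MHz out-of-the-box gap)
# boost: +2.7%   (nearly identical once single-core boost kicks in)
```

Which is why the premium only makes sense if you run the chips at stock.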


----------



## SirCumference

I just cancelled my 1700X preorder. Ryzen seems to be awesome for transcoding, a bit weak for gaming. I feel so torn lol. Think I may hold onto my 3570k till Zen+ hits.


----------



## naz2

Quote:


> Originally Posted by *SoloCamo*
> 
> What intel cpu at $329 will match the multithreaded/rendering performance of Ryzen? Your argument falls flat. Your 4.6ghz 2600k is practically as fast as my stock 4790k as well in gaming and not far behind a 7700k, either.
> 
> You can get an intel as good as Ryzen in multi threaded loads, but you are going to pay A LOT more for it. The $329 1700 at 4.1ghz has no problem matching a $1000 6900k.. Even if you get 4.5ghz out of the 6900k (which isn't too common) is it worth near enough $700?
> Depends on the cpu you are coming from and the games you play.


he's specifically talking about gaming. that's what the overwhelming majority of people use these CPUs for. reviews are clearly showing that intel quads are superior, and with price cuts they're still a legitimate option. let's not turn this thread into that 4 vs 6 vs 8 core article fiasco again


----------



## Descadent

Quote:


> Originally Posted by *SoloCamo*
> 
> The point still stands. The 1800x is $500, the 6900k is $1000.


i know that... but the 6850, 6800, and 7700k aren't far off in price from, or are cheaper than, the 1800x. that's MY point.


----------



## davidtran007

Cancelled my 1800X on Amazon. Will wait for the 1700 to come back in stock and try out my luck with the silicon lottery.


----------



## sepiashimmer

That's amazing performance! Almost as good as x99 at half the price.


----------



## Shiftstealth

Quote:


> Originally Posted by *davidtran007*
> 
> Cancelled my 1800X on Amazon. Will wait for the 1700 to come back in stock and try out my luck with the silicon lottery.


https://www.newegg.com/Product/Product.aspx?Item=N82E16819113428&cm_re=1700_amd-_-19-113-428-_-Product


----------



## Olivon

FX8350/Ryzen/6900k at 3GHz :



http://www.hardware.fr/articles/956-6/piledriver-zen-broadwell-e-3-ghz.html


----------



## CULLEN

Quote:


> Originally Posted by *PureBlackFire*
> 
> as I expected these are clearly not "gaming" cpus.


At this moment no, but it will be interesting to see what happens when game start utilizing more cores like demonstrated with few games today.


----------



## Descadent

Quote:


> Originally Posted by *CULLEN*
> 
> At this moment no, but it will be interesting to see what happens when game start utilizing more cores like demonstrated with few games today.


we've been begging for games to use more than 2-4 cores for almost 10 years now. don't hold your breath for 8 lol.


----------



## Serandur

I've got mixed thoughts. It's a good value proposition, especially for heavily-threaded workloads. But from an overclocking fun and gaming perspective, meh... not as tempted to buy it anymore.

If buying now, I honestly don't know what to recommend for people other than waiting for Intel's responses. I'm personally not regretting my old 6700K in the slightest right now though. It's not feeling quite as worthless as I hoped it would feel after Ryzen launched.


----------



## Axon14

Quote:


> Originally Posted by *Shiftstealth*
> 
> I'm upset I got the 1700x over the 1700. Not upset I got Ryzen. I will be happy to support AMD once again.


Yeah, if I was in your boat, I'd feel the same way. I think ultimately those extra cores will be a boon. Did you see Legit Reviews? They literally have one set of benchmarks where Ryzen is trash, and then another set where it's competitive with the 7700k. They claimed it was a driver issue.


----------



## Sonikku13

Gonna hang on to my 1800X preorder, quite simply because I haven't ever owned a top of the line processor from either AMD or Intel that actually performed like one. And I don't overclock.


----------



## Kuivamaa

Quote:


> Originally Posted by *naz2*
> 
> SMT is HT all over again. i remember how everyone emphatically advised to disable it in BIOS for improved gaming performance back with the original nehalem and sandy bridge. 8 years later intel has it figured out but amd is only starting from the beginning


Well AMD knows what's going on, hence the collaboration with Bethesda starting with Prey. They want to make sure their SMT does not hinder perf in games.


----------



## Ding Chavez

Quote:


> Originally Posted by *IRONPIG1*
> 
> Well thanks for the hype everyone
> 
> 
> 
> 
> 
> 
> 
> turned out as expected(actually disappointing, but the market was GOOD!) sold shares at just under 15
> 
> 
> 
> 
> 
> 
> 
> so I am indeed very happy with this launch!
> 
> So drawing from the released benchmarks so far, is it safe to say that a 2600k @4.9 will still hold its own against a 1800x in games if not outperform it considering the 1800x's minimum FPS drops?


Looks maybe possibly slightly overhyped, just a touch... IMO.

Not very flattering being slower than a stock 2700K in some of these gaming benches. Maybe disabling SMT will help, but that's also not a good thing... 4 years and this is it. Well, I suppose it shows how hard it is to make CPUs these days.

http://www.legitreviews.com/amd-ryzen-7-1800x-1700x-and-1700-processor-review_191753/10


----------



## huzzug

Quote:


> Originally Posted by *naz2*
> 
> he's specifically talking about gaming. *that's what the overwhelming majority of people use these CPUs for*. reviews are clearly showing that intel quads are superior, and with price cuts they're still a legitimate option. let's not turn this thread into that 4 vs 6 vs 8 core article fiasco again


I'm sorry, what ? You've been hanging out with wrong people.


----------



## Mad Pistol

Just read PCper's review... AMD is back baby!!!! Ryzen is the real deal!!!


----------



## Arturo.Zise

Looks like the R7 series does what it was supposed to do, equal if not beat the 6/8 core Intels at half the price. The Handbrake performance alone is enough for me to make the upgrade.

Will be interested to see what happens when the mobo manufacturers release more tweaked and improved bios updates.


----------



## Shiftstealth

Quote:


> Originally Posted by *Axon14*
> 
> Yeah, if I was in your boat, I'd feel the same way. I think ultimately those extra cores will be a boon. Did you see Legit Reviews? They literally have one set of benchmarks where Ryzen is trash, and then another set where it's competitive with the 7700k. They claimed it was a driver issue.


Unless it is AVX2-related, or down to the dual-channel memory, I'd venture to say that could be true (70%)


----------



## pez

I'm eager to see the market effect this has. I'm also super interested to see the $200 R5 chip when it comes time. Give us those ITX boards AMD.


----------



## GorillaSceptre

Confirms what has been known for a while now..

If you have a recent 4 core, and your system is primarily for gaming, then there's little to see here. If you are coming from an older I7, or gaming isn't all your system is used for, then you can get near 6900K performance for around $330.. If gaming is all you're interested in, then wait for the Ryzen 4 cores and compare them to something like the 7700K.

AMD are back.


----------



## DADDYDC650

Quote:


> Originally Posted by *Sonikku13*
> 
> Gonna hang on to my 1800X preorder, quite simply because I haven't ever owned a top of the line processor from either AMD or Intel that actually performed like one. And I don't overclock.


1800x is only top of the line in name if you don't overclock manually.


----------



## renx

Steve from Gamers Nexus completely obliterated it.


----------



## SoloCamo

Quote:


> Originally Posted by *renx*
> 
> Steve from Gamers Nexus completely obliterated it.


Not really. He's just realistic about it - which is why I watch him. The 1800x is not the best buy of the Ryzen lineup. As we have known, it depends on your workload. Just like people buy a 7700k or 6900k. Ryzen is a cheap 6900k.


----------



## Kevin Sia

I wasn't disappointed until I realized that the OC headroom is very limited.


----------



## mAs81

So, as predicted by all: the 1800x wins some, loses some, for $500 less. That's a big win for AMD in my book


----------



## Benny89

Hm, guess I will have to sit on my 4.7GHz 4790k a little longer...









But maybe the 7700K will drop in price! Or should I wait for Coffee Lake now....

Ughhhh!!!

All in all, very good AMD. Such power for 330 bucks is to be respected! Good to have you back! Now let's see the future CPUs as the architecture matures


----------



## Newbie2009

I'd like to see them disable 2 cores and see how far they can clock the remaining 6; it would give us an idea of the 1600X


----------



## Descadent

Quote:


> Originally Posted by *Kevin Sia*
> 
> I wasn't disappointed until I realized that the OC headroom is very limited.


more like nonexistent almost lol


----------



## SoloCamo

Quote:


> Originally Posted by *Benny89*
> 
> Hm, guess I will have to sit on my 4.7GHz 4790k a little longer...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But maybe the 7700K will drop in price! Or should I wait for Coffee Lake now....
> 
> Ughhhh!!!
> 
> All in all, very good AMD. Such power for 330 bucks is to be respected! Good to have you back! Now let's see the future CPUs as the architecture matures


A 7700k is a joke of an upgrade over a 4.7ghz 4790k...


----------



## CULLEN

Quote:


> Originally Posted by *Descadent*
> 
> we've been begging for games to use more than 2-4 cores for almost 10 years now. don't hold your breath for 8 lol.


Well, 8 cores haven't been the "standard" and have only been used by a tiny minority of gamers; Ryzen could easily make 8 cores the standard. It also gives developers much more horsepower to work with.


----------



## Rob27shred

Well the reviews are in & Ryzen is a monster compared to the FX series for sure, but it is not the panacea most were hoping for unfortunately. Seeing the gaming results I'm in no hurry to jump ship from my 6700k now. The OCing headroom & gaming performance of Ryzen are definitely a bit disappointing to me since those are the two things I predominately use my PC for. I still intend on picking up a 1700 & decent mobo sooner than later, but seeing the reviews Ryzen won't really give me any tangible benefits so I'm gonna wait on a good sale & more mobo options before I pull the trigger.

Overall though I am very happy to see AMD truly being competitive with Intel again. Ryzen will definitely shake the CPU market up a good bit & lower prices, especially in the HEDT sector!


----------



## ryan92084

Quote:


> Originally Posted by *Olivon*
> 
> FX8350/Ryzen/6900k at 3GHz :
> 
> http://www.hardware.fr/articles/956-6/piledriver-zen-broadwell-e-3-ghz.html


Why would any reviewer do a graph that way?
Quote:


> Originally Posted by *GorillaSceptre*
> 
> Confirms what has been known for a while now..
> 
> If you have a recent 4 core, and your system is primarily for gaming, then there's little to see here. If you are coming from an older I7, or gaming isn't all your system is used for, then you can get near 6900K performance for around $330.. If gaming is all you're interested in, then wait for the Ryzen 4 cores and compare them to something like the 7700K.
> 
> AMD are back.


Perfect summary IMO except I'd throw in the 1600x as a good balance option.


----------



## Benny89

Quote:


> Originally Posted by *SoloCamo*
> 
> A 7700k is a joke of an upgrade over a 4.7ghz 4790k...


But DDR4

I really want to switch to high-freq DDR4.

But I see where you're coming from. Please ignore me.


----------



## Wishmaker

Quote:


> Originally Posted by *Mahigan*
> 
> Uh?
> 
> So because it trades blows with a $1,000+ Intel CPU (6900K), AMD made the wise choice of pricing it at $499? No dude.. they could have priced it at $800 given its performance. They priced it lower because of their strategy (lowering the cost of a 4K gaming capable PC).


Key word: trades blows. If you remember, Conroe did not trade blows with the FX-62 at a quarter of its price, it obliterated it. As much as many, including you, want to push it, this is not a Conroe moment. AMD has more work to do to catch up with INTEL and DEMAND the $800 price you are suggesting. People would not even look at the $200 difference between the AMD and Intel counterparts, because the AMD platform is inferior. For the time being AMD needs to sort out their PCIe lanes and other RAM conundrums.

The last time they priced something at 1000 euros (I know, I bought one of their chips at that price) was so long ago that they need to rebuild their reputation. Cheerleaders like you support the underdog with each release; your async shader posts saying that AMD cards are superior did not bring about NVIDIA's demise. They are still king and are launching a 1080 Ti, which will sell better than sliced bread. While they launch the 1080 Ti, people are scouring web archives for AMD product names because nothing concrete is available.

INTEL is still a viable option performance-wise if *you have the money*. Just make sure you do not choke on this; it would be a shame to lose your entertaining posts on how everything AMD touches turns to gold and how INTEL is not an innovator.









----------



## delboy67

I wouldn't bash it for the gaming benches; there are cheaper Ryzens on the way with lower prices and core counts, not to mention Zen+ could well fit into the board you buy. Overall pleased, but not gonna upgrade my Ivy i7 just yet.


----------



## Slomo4shO

Next step, Zen APUs with HBM2


----------



## headd

Ryzen has an HT bug: a 10-15% performance drop in games with SMT enabled
http://www.hardware.fr/articles/956-7/impact-smt-ht.html


----------



## Evil Penguin

Damn fine processor the 1700 is...

1700 + B350/X370 + overclocking = major winner.

At this point they just need to refine the platform and maybe release a new higher-clocking stepping within a year.


----------



## SoloCamo

Quote:


> Originally Posted by *Wishmaker*
> 
> Key word, trades blows. If you remember, Conroe did not trade blows for 1/4 of the FX 62 price, it obliterated it. As much as many want to push it , including you, this is not a Conroe moment. AMD has more work to catch up with INTEL and DEMAND the 800 dollar price you are saying. People would not even look at the 200 dollar difference between the amd and intel counterpart because the AMD platform is inferior. For the time being AMD needs to sort out their lanes and other RAM conundrums.
> 
> Last time they priced something at 1000 euros, I know, I bought one of their chips for that price, was so long ago that they need to rebuild reputation. Cheerleaders like you supporting the underdog with each realease, your async shader posts saying that AMD cards are superior did not prevent NVIDIA's demise. They are still king and launching a 1080TI which will sell more than sliced bread.
> 
> INTEL is still a viable option performance wise if you have the money. Just make sure you do not choke on this, would be a shame to lose your entertaining posts on how everything AMD touches turns into gold and how INTEL is not an innovator
> 
> 
> 
> 
> 
> 
> 


Again I ask, where can I get the same multi-threaded performance at $329 from Intel, OC for OC?

As far as I'm seeing, I'd need to spend quite a bit more than an additional $200 for it.


----------



## NoDestiny

Here are my thoughts: $320 gives you a 3.0GHz 8-core/16-thread processor that can overclock to ~4.0GHz (a 33% gain). For me, that's great! I have an FX-8350 currently at 4.2GHz (it's 5.0GHz capable, I just don't have the cooling or noise suppression to handle higher). A little gaming, but I mainly use this computer as a DAW (using Reaper), so saving rendering time is huge for me. Very excited... whenever a decent mATX motherboard shows up for sale, at least.


----------



## Mad Pistol

Quote:


> Originally Posted by *delboy67*
> 
> I wouldnt bash it for gaming benches, theres cheaper ryzens otw with lower prices and core counts, not to mention zen+ could well fit into the board you buy, overall pleased but not gona upgrade my ivy i7 just yet.


I'm in the same boat. My i7 4790k is not only "good enough" for gaming, but I have yet to run into a situation for gaming that it cannot handle perfectly.

Overall, it is an excellent showing for AMD.


----------



## Motley01

Well I'm heading to Microcenter right now. I was planning on buying the 1700X, but now I'm getting the 1700. A 1GHz OC is still very impressive.

Can't wait to get back and build this sucker!!


----------



## jamaican voodoo

So it seems the new SMT technology is causing low game performance. I wonder what the fix for this would be?


----------



## Serandur

On a tangential note, I still have so much respect for Sandy Bridge. 6 years later and wow. Go 2700K!


----------



## naz2

Quote:


> Originally Posted by *jamaican voodoo*
> 
> so it seems the new SMT technology is causing low game performance i wonder if what would be the fix for this?.


8 years of refinement


----------



## CULLEN

If I'm getting this right, with AMD Ryzen Master (the Ryzen overclocking tool), couldn't you overclock for the few games that benefit from higher clock speed over more cores?


----------



## Mephistobr

I aim for 4K gaming. In the gaming benchmarks at 4K the difference between the 1700 and 7700K is negligible; the numbers ramp up in favor of the 7700K at 1080p and lower image settings, which I honestly don't care about.

So my question is, how important are these numbers at lower quality settings? At 4K I expect the GPU to bottleneck for years to come, and since the 1700 is brand new I expect that game patches will fix the performance in a lot of these titles.

It seems, at least for now, that the 1700 is the best buy for people like me who have older CPUs and are looking to upgrade.


----------



## Firann

So my thoughts from some reviews and the comments here:

1. The star of the show is the 1700. They all OC to about the same point, so people who aren't afraid to OC should get the 1700. The only reason I see for not getting the 1700 is if it is proven that the 1800X chips are better binned and ALL reach 4.0GHz+, whereas not all 1700s can.

2. The Ryzen 8-cores are great all-around chips, especially if you do rendering. However, the Intels do have the advantage in the memory department with quad-channel memory.

3. IF you buy a Ryzen for gaming, DISABLE SMT at all costs. This makes it an 8-core/8-thread CPU (so it matches a 4-core/8-thread chip in thread count), but you do see a boost in FPS.

It is safe to assume that Ryzen v2 will be much better once they work out all the kinks with memory, SMT and perhaps OCing.

[edit] It also seems the claimed 95W TDP, which looked highly favorable vs. Intel's 140W TDP, wasn't an apples-to-apples comparison; in reality power draw is quite similar.
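For reference, the effect of disabling SMT in the BIOS can be roughly approximated in software by pinning a game to one logical CPU per physical core. A minimal sketch, assuming SMT siblings are numbered adjacently (0/1, 2/3, ...), which you should verify with your OS's topology tools; the affinity call shown is Linux-only:

```python
import os

def one_thread_per_core(logical_count: int) -> list[int]:
    # Under the adjacent-sibling assumption, taking every second
    # logical CPU keeps exactly one hardware thread per core.
    return list(range(0, logical_count, 2))

def pin_to_physical_cores(pid: int = 0) -> set[int]:
    # Linux-only (os.sched_setaffinity); on Windows the equivalent
    # would be SetProcessAffinityMask via the Win32 API.
    cpus = set(one_thread_per_core(os.cpu_count()))
    os.sched_setaffinity(pid, cpus)
    return cpus

# An 8-core/16-thread Ryzen 7 would be restricted to CPUs 0,2,...,14:
print(one_thread_per_core(16))  # [0, 2, 4, 6, 8, 10, 12, 14]
```

This only mimics the scheduling side of disabling SMT; BIOS-level disabling can also change cache and boost behavior.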


----------



## hawker-gb

This is more than excellent.
Well done AMD


----------



## DarkRadeon7000

Well, I never thought I'd say this, but Intel seems better value for money in my country. The 7700K is like $40 more than the Ryzen 1700, and it's beating even the Ryzen 1800X in gaming benchmarks, meaning it's going to stomp the 1700. As usual, AMD's newest processors suck for gaming.


----------



## Wishmaker

Quote:


> Originally Posted by *SoloCamo*
> 
> Again I ask, where can I get the same multi threaded performance at $329 from Intel OC to OC?
> 
> As far I'm seeing I'd need to spend quite a bit more than an additional $200 for it.


A bit of Economics would answer your question. AMD has slotted their product in the bracket where it was needed to *maximize profits.* They identified a shortcoming in the market and took it. Smart move and should make INTEL think twice from now on if this makes a significant dent in their sales.

AMD wants to make money and keep investors happy at the end of the day. These products are released to restore investor confidence and make that balance sheet gain some weight. This is their short-term game, because they have been on the wrong side of the fence for ages. In the long run, once they put out a chip that is better than INTEL's, you will not have this price. I guarantee it.

If VEGA beats the TITAN X and 1080 Ti it will be at the same price or more expensive. AMD's target is profit maximization, not charity.


----------



## looniam

Quote:


> Originally Posted by *Serandur*
> 
> On a tangential note, I have so much respect for Sandy Bridge still. 6 years later and wow. Go 2700K:
> 
> 


STAHP!

you're making me reconsider "upgrading" my ivy to anything at all.


----------



## Serandur

Quote:


> Originally Posted by *looniam*
> 
> STAHP!
> 
> you're making me reconsider "upgrading" my ivy to anything at all.


Coming from the other side of the upgrade-from-Ivy fence, I regret it. My best friend enjoyed a sweet second-hand deal on it and the whole platform, though.


----------



## Steele84

Yeah, for my workload this is still a huge win. The biggest disappointment so far is the OC, but I didn't expect super clocks. I do wonder what the 1500 series will be like and if that will be better for the OC. Lots of things to do in the coming months!


----------



## darealist

So it's slower than even Sandy Bridge in video games... can't wait for dat Coffee Lake 6 core.


----------



## SoloCamo

Quote:


> Originally Posted by *Wishmaker*
> 
> A bit of Economics would answer your question. AMD has slotted their product in the bracket where it was needed to *maximize profits.* They identified a shortcoming in the market and took it. Smart move and should make INTEL think twice from now on if this makes a significant dent in their sales.
> 
> AMD wants to make money and keep investors happy at the end of the day. These products are releases to restore investor confidence and make that balance sheet gain some weight. This is their short term game because they have been on the wrong side of the fence for ages. In the long run, once they take out a chip that is better than INTEL, you will not have this price. I guarantee it.


That's all well and good, but from a consumer perspective you completely skipped my question. What can Intel offer *me* in that price range that competes with its multi-threaded performance?

I'm not new to the CPU scene; I've seen AMD price-match Intel on the high end.

Quote:


> Originally Posted by *Wishmaker*
> 
> If VEGA beats TITAN X and 1080TI it will be at the same price or more expensive. AMD has profit maximization as a target and not being charitable.


290x was half the cost of the Titan and beat it.


----------



## rv8000

Quote:


> Originally Posted by *naz2*
> 
> also i think AMD made a mistake by leaving every CPU unlocked. now there is virtually no reason to buy the 1700x or 1800x over the base 1700. for most people the $70 premium would've been justified and still far cheaper than anything comparable from intel. amd just lost a lot of money there


No they didn't; overclockers represent a very small portion of the total market. For the 90-95% of users that will leave these CPUs at stock, the 1700, 1700X and 1800X will all perform at different levels and warrant different pricing.


----------



## naz2

Quote:


> Originally Posted by *Wishmaker*
> 
> A bit of Economics would answer your question. AMD has slotted their product in the bracket where it was needed to *maximize profits.* They identified a shortcoming in the market and took it. Smart move and should make INTEL think twice from now on if this makes a significant dent in their sales.
> 
> AMD wants to make money and keep investors happy at the end of the day. These products are releases to restore investor confidence and make that balance sheet gain some weight. This is their short term game because they have been on the wrong side of the fence for ages. In the long run, once they take out a chip that is better than INTEL, you will not have this price. I guarantee it.
> 
> If VEGA beats TITAN X and 1080TI it will be at the same price or more expensive. AMD has profit maximization as a target and not being charitable.


gaming is the primary sales driver in this segment. most people will compare the 1700 to the 7700k, not the 6900k. extremely multi-threaded workloads where ryzen shines are a niche, usually occupied by professionals who will buy the best regardless of price. that's why intel has always priced its top-end parts at such a premium, and why they actually include useful features like quad-channel memory.

point is intel figured out this system years ago and have been milking it ever since. amd made a foolish move by under-pricing their best CPU so much. at the very least they could've better stratified the X series like intel does with the K's.


----------



## batmanwcm

I can understand why some people were disappointed with the overclocking potential of Ryzen but this is still a big win for AMD. The 1700 ended up being a great value.


----------



## Wishmaker

Quote:


> Originally Posted by *SoloCamo*
> 
> All good and all, but from a consumer perspective you completely skipped my question. What can intel offer *me* in that price range that competes with it's multi threaded performance?
> 
> I'm not new to the cpu scene, I've seen AMD price match intel on the high end.
> 290x was half the cost of the Titan and beat it.


I may be wrong on which TITAN, because NVIDIA has released so many, but the latest Titan is Pascal, right? AMD has had no competitor for it since it launched. It's been 8 months now, and NVIDIA is competing with themselves with the 1080 Ti. Even the 1080 Founders Edition is a top choice for 4K rigs.

Regarding your point, INTEL does not have any offering in that segment. This is the reason why AMD is so successful with this release: it killed two birds with one stone, performance and price.

There you go, tweaktown summarized this perfectly.
Quote:


> *AMD holds a very important piece of the CPU market, located right between Intel's mainstream desktop and high-end desktop segments.*
> 
> Read more: http://www.tweaktown.com/reviews/8072/amd-ryzen-7-1800x-cpu-review-intel-battle-ready/index13.html


----------



## jprovido

Performance hit in games with SMT enabled. 1st gen i7 all over again


----------



## MuscleBound

Here are my thoughts: for gaming and everyday productivity, the 6700K or 7700K is still the way to go.
For heavy Photoshop and Premiere Pro use, the Ryzen 1800X is KILLER at less than half the price of the Intel 6900K.
However, the power consumption is horrendous, so if you have to worry about your power bills, stick to Intel.
I am also betting the AMD runs way hotter than the Intel.
Fairly disappointed, actually.


----------



## CortexA99

Quote:


> Originally Posted by *jamaican voodoo*
> 
> so it seems the new SMT technology is causing low game performance i wonder if what would be the fix for this?.


This problem is old, but sometimes it cannot be avoided. Patches for some games, or for the Windows thread scheduler, are needed.


----------



## ryboto

Quote:


> Originally Posted by *jprovido*
> 
> Performance hit on games with SMT enabled. Ist gen i7 all over again


Maybe this will help?
Quote:


> They also have a Windows Driver coming in approximately one month that will help performance as the Windows High Precision Event Timer (HPET) isn't playing nice with the SenseMI sensors that poll the CPU status every millisecond.
> Read more at http://www.legitreviews.com/amd-ryzen-7-1800x-1700x-and-1700-processor-review_191753/15#t4rjzQpVwY4DAP47.99


No Windows driver yet... maybe that's affecting things?


----------



## SoloCamo

Quote:


> Originally Posted by *MuscleBound*
> 
> Here are my thoughts- For gaming and everyday productivity the 6700k or 7700k is still the way to go.
> For heavy Photoshop and Premiere Pro use the Ryzen 1800X is KILLER at less than half the price of the Intel 6900K.
> However the power consumption is Horrendous. So if you have to worry about ur Power Bills stick to Intel.
> Also I am betting the AMD runs waay too hotter than the Intel.
> Fairly Disappointed actually.


How is the power consumption of this 8c16t cpu horrendous? Also you "bet" it will run hotter? The reviews are right in this very thread...

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74814-amd-ryzen-7-1800x-performance-review-18.html

Power consumption is less than the 6900k.


----------



## sugalumps

This is almost exactly the same as the RX 480 thread(s): hyping it to OC to insane levels, only for it to fall way short. People need to stop falling for the false hype and self-manufactured graphs made under perfect conditions, from both sides!

I've been holding onto this i5 4670K for ages now and was sure this was going to be the CPU to finally get me to upgrade, but for gaming it still seems money is best spent on GPUs; these CPUs bring really negligible performance increases with only one card.


----------



## Rhuarc86

Playing at 3440x1440 @ 100hz, trying to decide between the 1800x, 7700k, or 6800k. All on a full custom loop, so temps aren't a problem.


----------



## alpsie

If you want, you can add this to the OP. OC3D have done a review of the 1800x too (ASUS Prime X370-Pro, 0504 BIOS): https://www.overclock3d.net/reviews/cpu_mainboard/amd_ryzen_7_1800x_cpu_review/1


----------



## Ximplicite

is Ryzen more future proof than i7-7700k?


----------



## SoloCamo

Quote:


> Originally Posted by *sugalumps*
> 
> This is almost exactly same as the r480 thread/threads, hyping it to oc to insane amounts only for it to fall way short. People need to stop falling for the false hype and self manufactured graphs with perfect conditions, from both sides!


AMD didn't hype it at all... The hype train on this site did. For those of us who kept expectations in check this falls right in line.


----------



## Wishmaker

Ryzen, like the latest AMD GPU releases, is not a high-end release. It is slotted exactly where it should be. Claiming that AMD could charge an arm and a leg for their products, like INTEL does, is not viable at this point in time. AMD is simply a few steps behind, but moving in the right direction.

I also have the feeling that a company like INTEL was aware of how good or bad Ryzen would be. Where are the huge price cuts everyone expected? If Intel slashed the 6900K to $500 tomorrow, AMD sales would dip more than people think. INTEL has a more mature platform for the time being.


----------



## rv8000

Quote:


> Originally Posted by *MuscleBound*
> 
> Here are my thoughts- For gaming and everyday productivity the 6700k or 7700k is still the way to go.
> For heavy Photoshop and Premiere Pro use the Ryzen 1800X is KILLER at less than half the price of the Intel 6900K.
> However the power consumption is Horrendous. So if you have to worry about ur Power Bills stick to Intel.
> Also I am betting the AMD runs waay too hotter than the Intel.
> Fairly Disappointed actually.


From the 7-8 reviews I've looked through, power consumption and temps end up between the 7700K and 6900K, though many reviews have done a rather poor job of documenting temps so far.


----------



## Arturo.Zise

Why are all the gaming benchmarks in 1080p? Surely 1440p and 4k are where the action is no?


----------



## ToTheSun!

Well... i guess i'll have to stick to my 6700K for my traditional use cases. There goes my dream of a summer all-AMD build.


----------



## SoloCamo

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Why are all the gaming benchmarks in 1080p? Surely 1440p and 4k are where the action is no?


Less likely to run into a gpu bottleneck.
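A toy model of why reviewers test CPUs at low resolution (the numbers below are illustrative only, not benchmark data): the displayed frame rate is capped by whichever of the CPU or GPU is slower, so raising the resolution lowers the GPU's ceiling and hides CPU differences.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    # The frame rate is limited by the slower of the two stages.
    return min(cpu_fps, gpu_fps)

# Hypothetical CPUs able to prepare 140 vs 170 frames per second:
# at "4K" the GPU caps both at 60 fps, so the CPUs look identical;
# at "1080p" the GPU cap rises and the CPU difference becomes visible.
print(effective_fps(140, 60), effective_fps(170, 60))    # 60 60
print(effective_fps(140, 200), effective_fps(170, 200))  # 140 170
```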


----------



## batmanwcm

Amazon USA just sold out of the 1700 and 1700X. I wonder if Newegg is going to be the same.


----------



## AmericanLoco

Quote:


> Originally Posted by *Wishmaker*
> 
> Ryzen, like the latest AMD GPU releases, are not high end releases. They are slotted perfectly where they should be. Claiming that AMD could charge almost an arm and a leg , like INTEL, for their products is not viable at this point in time. AMD is simply a few steps back but in the right direction.
> 
> I also have the feeling that a company like INTEL was aware how good or bad Ryzen would be. Where are the huge price cuts everyone expected. If Intel tomorrow slashes the 6900 to 500 dollars AMD sales would dip more than people think. INTEL has a more mature platform for the time being.


The 1800x is a high-end release for professional and server use. It's not a high-end chip for gamers. You need to remember the context. This current crop of Zen CPUs is competing with Intel 8-core chips, and they're directly comparable at half (or less) the cost.


----------



## AlphaC

Quote:


> Originally Posted by *MuscleBound*
> 
> Here are my thoughts- For gaming and everyday productivity the 6700k or 7700k is still the way to go.
> For heavy Photoshop and Premiere Pro use the Ryzen 1800X is KILLER at less than half the price of the Intel 6900K.
> However the power consumption is Horrendous. So if you have to worry about ur Power Bills stick to Intel.
> Also I am betting the AMD runs waay too hotter than the Intel.
> Fairly Disappointed actually.


R7 1700.

Should take a ~20% voltage bump to reach the power draw of the 95W TDP CPUs.

overclockersclub only used +50W relative to stock for 4.05GHz
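As a rough sanity check on that estimate: dynamic CPU power scales approximately with V²·f, so a 20% voltage bump alone implies around 44% more dynamic power (the 20% figure above is the poster's estimate, not measured data, and this ignores static leakage):

```python
def dynamic_power_ratio(v_ratio: float, f_ratio: float = 1.0) -> float:
    # Dynamic power ~ C * V^2 * f, so the relative change is v^2 * f.
    return v_ratio ** 2 * f_ratio

# ~20% more voltage at the same clock:
print(round(dynamic_power_ratio(1.20), 2))  # 1.44

# ~20% more voltage plus a hypothetical 3.0 -> 4.0 GHz overclock:
print(round(dynamic_power_ratio(1.20, 4.0 / 3.0), 2))  # 1.92
```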


----------



## hawker-gb

Quote:


> Originally Posted by *MuscleBound*
> 
> Here are my thoughts- For gaming and everyday productivity the 6700k or 7700k is still the way to go.
> For heavy Photoshop and Premiere Pro use the Ryzen 1800X is KILLER at less than half the price of the Intel 6900K.
> However the power consumption is Horrendous. So if you have to worry about ur Power Bills stick to Intel.
> Also I am betting the AMD runs waay too hotter than the Intel.
> Fairly Disappointed actually.


:facepalm:

What a nonsense.


----------



## Ultracarpet

Have any reviews messed with disabling cores while overclocking?


----------



## naz2

Quote:


> Originally Posted by *ryboto*
> 
> No Windows driver...maybe it's affecting things?


now it needs a windows update for full performance? literally bulldozer all over again
Quote:


> Originally Posted by *Arturo.Zise*
> 
> Why are all the gaming benchmarks in 1080p? Surely 1440p and 4k are where the action is no?


not for a few years; also, at those resolutions the gpu would bottleneck before the cpu does


----------



## 7850K

So the rhetoric now is that 8 Haswell-class cores at 4GHz are "clearly not a gaming CPU"...


----------



## SoloCamo

Quote:


> Originally Posted by *naz2*
> 
> now it needs a windows update for full performance? literally bulldozer all over again


You quoted the wrong person, that wasn't me.

I'll respond anyways. Use your brain. This is not bulldozer.


----------



## Benny89

Quote:


> Originally Posted by *Ximplicite*
> 
> is Ryzen more future proof than i7-7700k?


Mate, unless you still game at 1080p, your question is irrelevant. There was no reason to upgrade from a 4790K to a 7700K at all. Hell, there is little reason to upgrade even from a 2700K to a Ryzen 1700.

CPUs are very future-proof, at least in gaming, as the GPU does most of the work nowadays, especially at higher resolutions and refresh rates. CPUs are not GPUs, thankfully.


----------



## jezzer

Quote:


> Originally Posted by *SoloCamo*
> 
> Less likely to run into a gpu bottleneck.




Still a GPU bottleneck, but at least if you don't have anything higher than a GTX 1080 you will be on par with a 7700K at 5GHz.

With unlimited FPS the resolution doesn't really matter for the GPU bottleneck; the GPU's performance does.


----------



## sugalumps

Quote:


> Originally Posted by *SoloCamo*
> 
> AMD didn't hype it at all... The hype train on this site did. For those of us who kept expectations in check this falls right in line.


Well, at the AMD event with Linus they didn't show min/max percentages, and they tested at 4K so it was GPU-bound, not CPU-bound. Both sides obviously give themselves the best conditions at their press conferences, which never translates into real-world performance.


----------



## Wishmaker

Quote:


> Originally Posted by *AmericanLoco*
> 
> The 1800x is a high-end release for professional and server use. It's not a high-end chip for gamers. You need to remember the context. This current crop of Zen CPUs is competing with Intel 8-core chips, and they're directly comparable at half (or less) the cost.


...dictated by price. Eliminate the price gap, put the INTEL 8-core chips at the same price, and what happens? Sales will decrease for AMD and it will hurt them. That is why people should keep their fingers crossed and hope INTEL does not slash prices. The 6900K has the same performance as Ryzen, with an added twist: better gaming!
Quote:


> Originally Posted by *SoloCamo*
> 
> AMD didn't hype it at all... The hype train on this site did. For those of us who kept expectations in check this falls right in line.


Exactly, this is so toxic! OCN tends to do this every damn time an AMD product is released.


----------



## SoloCamo

Regardless, this is AMD's first comeback in a VERY long time. Can we all stop and realize just how much of a gap they were able to close? Zen+ is going to be very interesting as well, and unless Intel makes some major changes AMD will have no problem continuing to compete.


----------



## Benny89

Quote:


> Originally Posted by *jezzer*
> 
> 
> 
> Still a GPU bottleneck but at least if you dont have anything higher than a gtx 1080 u will be on par with 7700k on 5ghz
> 
> With unlimited fps the res doesnt really matter regarding GPU bottleneck, the GPU performance does


So a 1700 at 4.1GHz with a 2000MHz 1080 Ti would be bottlenecked in games on a 1440p 165Hz monitor??


----------



## naz2

Quote:


> Originally Posted by *SoloCamo*
> 
> I'll respond anyways. Use your brain. This is not bulldozer.


this is the identical rhetoric that was used when bulldozer came out. not sure what's so difficult to comprehend


----------



## SoloCamo

Quote:


> Originally Posted by *naz2*
> 
> this is the identical rhetoric that was used when bulldozer came out. not sure what's so difficult to comprehend


Bulldozer pretty much sucked in all respects; to compare Ryzen to that is nonsense and you know it.


----------



## Benny89

Quote:


> Originally Posted by *Wishmaker*
> 
> ...dictated by price. Eliminate the price gap, have the INTEL 8 CORE CHIPs at the same price and what happens? Sales will decrease for AMD and it will hurt them. That is why people should have their fingers crossed and hope INTEL does not slash prices. The 6900 has the same performance than Ryzen with an added twist : better gaming
> 
> 
> 
> 
> 
> 
> 
> 
> Exactly, this is so toxic! OCN tends to do this every damn time an AMD product is released.


Um, actually we all (at least I) hoped Ryzen would force Intel to drastically drop prices. I sure would like to have more options within budget. A 6900K for $500 would also be a great AMD win, for consumers at least.


----------



## naz2

Quote:


> Originally Posted by *SoloCamo*
> 
> Bulldozer pretty much sucked in all respects, to compare Ryzen to that is non sense and you know it.


are we looking at the same reviews? ryzen is losing 10% fps with smt enabled and there's already murmurings of a magical windows update to fix it. this is verbatim what happened when bulldozer dropped and people pointed to poor optimization with the windows scheduler as the culprit

not saying ryzen sucks or anything, just that the rhetoric is the same. hopefully they will actually fix it with a magical update but i doubt it. intel's been refining hyper-threading for nearly a decade to get to its current level


----------



## prznar1

The overclocking headroom is very minimal though. A bit sad, but you can't have everything, right?


----------



## Kalimera

As expected, Ryzen is a completely pointless upgrade for gamers with i5/i7 Haswell/Skylake processors and in many cases, it is worse.

If i had a 2500k i would consider it, but i would be also thinking that i could still get more mileage out of my system and wait for a better implementation in the future. It's not an easy slam dunk decision.

Might be better to keep the money and use it for other more important upgrades in the meantime (display, GPU etc.).

Ryzen is great though for people who rely on CPU intensive apps for their professional work, this is where the platform is really aimed at.

Still, can't ignore the issues that have already cropped up: worse gaming performance than Intel, not much overclocking headroom, high frequency ram not supported yet. Those things will probably get fixed in the next generation of AMD processors.

I'm not some company fanboy; I am currently using an AMD card.


----------



## Steele84

So is heat the problem?

https://www.youtube.com/watch?v=9XGvrfTwwNI


----------



## Blackops_2

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Confirms what has been known for a while now..
> 
> If you have a recent 4 core, and your system is primarily for gaming, then there's little to see here. If you are coming from an older I7, or gaming isn't all your system is used for, then you can get near 6900K performance for around $330.. If gaming is all you're interested in, then wait for the Ryzen 4 cores and compare them to something like the 7700K.
> 
> AMD are back.


This. There seems to be a lot of disappointment, which is just surprising as hell to me. 2-3 years ago most of us were hoping it would match Sandy/Ivy IPC; it's near Broadwell, the price is good, power consumption is good, and there is little to complain about at all. It won't clock to the moon like Intel's ever-improving arch that they've been working on for nearly a decade, but we all could expect that. Intel is pumping out Kaby Lake at 4.2GHz stock; we knew it was likely that Ryzen couldn't match Kaby Lake's clock speed, at least not the 8-core variant. Hell, there isn't an Intel 8-core offering that will match Kaby Lake's OCs.

AMD is back. We're about to have sub-$300 hexa-cores, for God's sake. I'm extremely happy with the results.


----------



## jezzer

Quote:


> Originally Posted by *Benny89*
> 
> So 1700 on 4,1 GHZ with 2000Hz 1080Ti would bottleneck on 1440p 165Hz monitor in games??


No, just saying that testing with a Titan XP and throwing those unlimited-FPS numbers around to show the difference between 250fps Ryzen vs 300fps 7700K isn't really a real-world scenario for most. People seeing that actually think Ryzen sucks in gaming and expect 50fps more with a 7700K.


----------



## CriticalOne

I bet that a lot of i7-7700K owners are feeling pretty vindicated right about now.

Anyway, I would be lying if I said i wasn't a little disappointed.


----------



## stargate125645

So does the horrible overclocking (edit: I say this because an overclocked Ryzen does not match a non-overclocked Intel in many situations, and the overclocking doesn't exceed the boost speed by much) of these Ryzen processors mean that AMD is squeezing all they can out of the platform at stock, or have the reviewers just not figured out the maximum voltages and temperatures yet?

If you are not overclocking, these are a great option given the price. Per the LegitReviews article, an overclocked Ryzen 7 processor will not touch an overclocked Haswell or Broadwell. Hopefully it encourages Intel to drop prices like has been rumored. A dropped Broadwell-E price is about the only reason for someone like myself (5930k) to upgrade or change.


----------



## geoxile

Strange that the 8-core Ryzens perform worse than the 7700K even in games that use multiple threads. There seems to be no obvious reason for it, considering how well the Ryzen chips perform in synthetic workloads. If it's just a software issue with Windows, hopefully they can fix it.


----------



## Slink3Slyde

Anyone remember the leak from OcUK where Gibbo (I think) said that you're going to need a top-end motherboard to overclock Ryzen to its limits? IIRC he tried the same chip in 3 boards and could only get to 3.8 on two; on the Asus Crosshair ROG he got it up to 4.0.

I noticed that the reviews I've read so far seem to be using mid/upper-mid-range boards like the Gigabyte Gaming 5 and the Asus Prime Pro. Wondering if something about the low-power process AMD is using makes the chips very sensitive to power fluctuations, in which case more phases helps quite a lot more than people on Intel chips are used to for everyday overclocks.

Not suggesting that we're going to be talking 4.5GHz+, but it might mean a couple of hundred MHz more if AMD are saying most chips should do 4.2. Of course it's early days as far as BIOSes and the process are concerned as well.


----------



## Ha-Nocri

This doesn't look bad at all. The 1700 (non-X) looks like the winner among the Ryzen CPUs, and Joker was able to clock it @ 3.9GHz with low voltage (1.3V).


----------



## BobiBolivia

After I took the time to read all 23 pages... I am literally speechless.

Who in their right mind would expect 8c Ryzen to be on par with 4c Intel offerings?!
In all 23 pages I haven't seen any focus on "OK, let's compare this to Intel's 8c offerings."


----------



## DarkRadeon7000

Quote:


> Originally Posted by *jezzer*
> 
> No, just saying that testing with a Titan XP and throwing those unlimited-FPS numbers around just to show the difference between 250fps Ryzen vs 300fps 7700K isn't really a real-world scenario for most people, and people actually come away thinking Ryzen sucks in gaming and expecting 50fps more with a 7700K.


Even at lower fps the difference is pretty big.

Watch Dogs 2 is one game that made my FX 8350 beg for mercy, and in these benches the 7700K destroyed the Ryzen by more than 20 fps.
It's performing more like an i5 than an i7, and at equivalent performance people will just get Intel.


----------



## Wishmaker

Quote:


> Originally Posted by *BobiBolivia*
> 
> After I took the time to read all 23 pages... I am literally speechless.
> 
> Who in their right mind would expect 8c Ryzen to be on par with 4c Intel offerings?!
> In all 23 pages I haven't seen any focus on "OK, let's compare this to Intel's 8c offerings."


People do not compare them to Intel's 8c offerings because of the price aspect. If you eliminate price, Ryzen is not the best option anymore. PRICE has been AMD's motto for the past decade.


----------



## Olivon

Über interesting, must-read!

*Understanding the Ryzen memory subsystem - HFR*

*Source*


----------



## lombardsoup

Quote:


> Originally Posted by *BobiBolivia*
> 
> After I took the time to read all 23 pages... I am literally speechless.
> 
> Who in their right mind would expect 8c Ryzen to be on par with 4c Intel offerings?!
> In all 23 pages I haven't seen any focus on "OK, let's compare this to Intel's 8c offerings."


Intel's 8c offerings overclock just as poorly. More cores = harder to OC, yet only a handful of people have mentioned this.


----------



## darealist

tl;dr: Intel is still superior and I don't see their hegemony being harmed anytime soon. Coffee Lake 6-cores for the mainstream platform will increase the lead even more.


----------



## IRobot23

Quote:


> Originally Posted by *DarkRadeon7000*
> 
> Even at lower fps the difference is pretty big.
> 
> Watch Dogs 2 is one game that made my FX 8350 beg for mercy, and in these benches the 7700K destroyed the Ryzen by more than 20 fps.
> It's performing more like an i5 than an i7, and at equivalent performance people will just get Intel.


Does anyone trust that guy? When I was watching his benchmarks I always had the feeling that the Fury X was as fast as a GTX 970.


----------



## SoloCamo

Quote:


> Originally Posted by *darealist*
> 
> tldr: Intel is still superior and I don't see their hegemony getting harmed anytime soon.


ITT: People only look at 7700K gaming performance and claim superiority as a whole. The $1000 6900K being equal or not much faster in multithreaded workloads than a $329-$500 AMD CPU is seemingly also ignored.


----------



## jezzer

Quote:


> Originally Posted by *DarkRadeon7000*
> 
> Even at lower fps the difference is pretty big.
> 
> Watch Dogs 2 is one game that made my FX 8350 beg for mercy, and in these benches the 7700K destroyed the Ryzen by more than 20 fps.
> It's performing more like an i5 than an i7, and at equivalent performance people will just get Intel.


Yeah, but there must be something wrong with that one. WD2 likes multiple threads, and with the 6900K in the top spot it just seems too odd to take at face value.


----------



## ryan92084

Quote:


> Originally Posted by *Olivon*
> 
> Über interesting, must-read!
> 
> *Understanding the Ryzen memory subsystem - HFR*
> 
> *Source*


Bing-translated conclusion, emphasis mine:
Quote:


> Whose fault is it?
> If you have followed us this far, we can begin to see the causes of the slowdowns we noted in some benchmarks. To sum up what we have seen so far:
> *Ryzen's memory latency is higher
> The bandwidth between CCXs is particularly small
> Cache accesses between CCXs are expensive*
> If we add Windows' tendency to constantly move threads around, we can start to better understand certain behaviors. Indeed, moving threads from one CCX to the other is excessively expensive on Ryzen, whose inter-CCX bandwidth is limited.
> One can easily imagine two cases that will cause slowdowns:
> threads that, when migrated, pay a high latency price to access data left in their old caches
> threads that, migrated or not, saturate the entire cache
> In the case of 7-Zip or WinRAR, we are looking at the second option: in practice this software uses a particularly large compression dictionary that it references constantly. We assume that in this case the limited inter-CCX bandwidth is a limiting factor.
> For games, we assume we are closer to the first case, or a mix of the two.
> *Is the problem intractable? Probably not; several solutions are possible, for example an adaptation of the Windows scheduler to limit thread migration out of a CCX, a bit like what was done for Bulldozer and its modules. One can imagine that AMD is working with Microsoft to implement a system of this type, even if the manufacturer could not confirm it.*
> Another change that could prove beneficial for games is the arrival of Windows 10's "Game Mode", one of whose peculiarities is, likewise, to move threads around less. Other mitigation techniques are conceivable, and AMD is expected to present a session at GDC; we'll talk about it then.
> What part of the gap will be recovered by these potential fixes, patches, and changes? It is impossible to say, and we must be very cautious about what the various developers involved will, or will not, do. *The limited inter-CCX bandwidth and the high latency will remain as they are, at least until a future revision of Zen.*
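The scheduler-side fix HFR describes boils down to keeping a process's threads on one CCX so they never cross the slow inter-CCX fabric. A minimal sketch of the idea using Linux's affinity API (hypothetical illustration only; the article is about Windows, and the assumption that logical CPUs 0-7 map to CCX0 should be checked against the real topology):

```python
import os

# Sketch only: on a Ryzen 7 with SMT on, logical CPUs 0-7 are assumed to map
# to CCX0 and 8-15 to CCX1 (verify against the actual topology, e.g. lscpu).
CCX0 = set(range(8))

def pin_to_ccx0(pid: int = 0) -> set:
    """Restrict a process (0 = current) to CCX0's CPUs so its threads never
    migrate across the inter-CCX fabric; falls back to the CPUs that
    actually exist on smaller machines."""
    available = os.sched_getaffinity(pid)
    target = (CCX0 & available) or available  # never pin to an empty set
    os.sched_setaffinity(pid, target)
    return os.sched_getaffinity(pid)

if __name__ == "__main__":
    print("pinned to:", sorted(pin_to_ccx0()))
```

On Windows the equivalent would be `SetProcessAffinityMask`, which is presumably what a scheduler-level or "Game Mode" fix would do transparently.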


----------



## BobiBolivia

Quote:


> Originally Posted by *Wishmaker*
> 
> People do not compare them to Intel's 8c offerings because of the price aspect. If you eliminate price, Ryzen is not the best option anymore. PRICE has been AMD's motto for the past decade.


Yes, but it's like everybody expected AMD to be perfect on the first try...
I just can't find the logic behind this seemingly irrelevant comparison...


----------



## naz2

Quote:


> Originally Posted by *SoloCamo*
> 
> ITT: People only look at 7700K gaming performance and claim superiority as a whole. The $1000 6900K being equal or not much faster in multithreaded workloads than a $329-$500 AMD CPU is seemingly also ignored.


people look at things that actually affect them, not irrelevant synthetic benchmarks


----------



## sugalumps

Quote:


> Originally Posted by *BobiBolivia*
> 
> After I took the time to read all 23 pages... I am literally speechless.
> 
> Who in their right mind would expect 8c Ryzen to be on par with 4c Intel offerings?!
> In all 23 pages I haven't seen any focus on "OK, let's compare this to Intel's 8c offerings."


Probably because most people eyeing this upgrade are on 4-core Intel offerings or Bulldozer chips and hoping for an upgrade.


----------



## DNMock

Quote:


> Originally Posted by *DarkRadeon7000*
> 
> Even at lower fps the difference is pretty big
> 
> 
> 
> Watch Dogs 2 is one game which made my FX 8350 beg for mercy and in these benches the 7700k destroyed the Ryzen by more than 20 fps
> Its performing more like an i5 than an i7 but at equivalent performance people will just get an Intel


SMT is bugged on ryzen atm, if you disable SMT then ryzen is within 1 fps across the board.


----------



## SoloCamo

Quote:


> Originally Posted by *naz2*
> 
> people look at things that actually affect them, not irrelevant synthetic benchmarks


Handbrake is irrelevant? Odd, considering I just had four instances of it running on my 4790K overnight. Certainly seems like an 8c/16t would make quite a difference.


----------



## DuraN1

Quote:


> Originally Posted by *lombardsoup*
> 
> Intel's 8c offerings overclock just as poorly. More cores = harder to OC, yet only a handful of people have mentioned this.


Just as poorly? Ryzen clocks to about 100MHz over boost, while the 6900K does 500MHz (3.7 to around 4.2). My 6900K runs between 4.4 and 4.6. Overall a good CPU at a very good price point, but let's not kid ourselves about the TERRIBLE overclocking.


----------



## BobiBolivia

Quote:


> Originally Posted by *lombardsoup*
> 
> Intel's 8c offerings overclock just as poorly. More cores = harder to OC, yet only a handful of people have mentioned this.


Thank you. I wanted to say exactly that, but couldn't figure the words for it...


----------



## prznar1

Guys, any reviews with 1700 overclock results? All I see is the 1800X.

NVM

http://www.overclockersclub.com/reviews/amd_ryzen_7_1800x_1700x_1700/4.htm

The old rule of buying the cheapest model and OCing it still works.


----------



## batmanwcm

Some of you guys are a little delusional. You do realize that even Intel's X99 6- and 8-core variants are worse than the 7700K in gaming as well.


----------



## ryan92084

Quote:


> Originally Posted by *prznar1*
> 
> Guys, any reviews with 1700 overclock results? All I see is the 1800X.


Check the OP


----------



## BobiBolivia

Quote:


> Originally Posted by *DuraN1*
> 
> Just as poorly? Ryzen clocks to about 100MHz over boost, while the 6900K does 500MHz (3.7 to around 4.2). My 6900K runs between 4.4 and 4.6. Overall a good CPU at a very good price point, but let's not kid ourselves about the TERRIBLE overclocking.


You do realize that process/stepping/binning can (and will) improve, right? RIGHT?


----------



## lombardsoup

Quote:


> Originally Posted by *DuraN1*
> 
> Just as poorly? Ryzen clocks to about 100MHz over boost, while the 6900K does 500MHz (3.7 to around 4.2). My 6900K runs between 4.4 and 4.6. Overall a good CPU at a very good price point, but let's not kid ourselves about the TERRIBLE overclocking.


Both brands' 8c offerings top out in the 4.2, maybe 4.3 range.


----------



## prznar1

Quote:


> Originally Posted by *batmanwcm*
> 
> Some of you guys are a little delusional. You guys do realize that even Intel's X99 6 & 8 Core variants are worse than the 7700k in gaming as well.


It's called hypocrisy.


----------



## tolga9009

If anyone finds anything about IOMMU / ECC, please let me know. The ECC support situation on the ASUS Prime X370-Pro is still unclear (ECC support is still listed on the German website) and various retailers also list ECC support. I just want someone to stick a pair of ECC DIMMs into that board and give us a screenshot of MemTest86+, so we can validate whether the ECC memory is operating in ECC mode (enabled) or not.

Why am I so curious? Because ASUS is known for supporting ECC in general on AMD motherboards, even when they only list non-ECC memory. E.g. the ASUS AM1M-A supports ECC memory, even though it's an AM1 motherboard (no other AM1 mobo supports ECC). Some manuals list ECC, others list non-ECC for that particular board.

Also, I'd like to see a single review / report telling me anything about IOMMU support (bugged? working?). Haven't seen that so far. Reviews say: not for gamers. But ECC and IOMMU support would make these an interesting option for workstation use.
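Short of a MemTest86+ screenshot, a Linux boot can answer the same question: the kernel's EDAC subsystem only registers a memory controller when ECC is actually active. A rough sketch of that check (the sysfs paths are standard EDAC; whether AM4 boards expose them at all is exactly the open question):

```python
from pathlib import Path

# Sketch only: on Linux, an active ECC configuration registers a memory
# controller under the EDAC sysfs tree; no controller usually means ECC is
# off or unsupported.
EDAC = Path("/sys/devices/system/edac/mc")

def ecc_status():
    """Return {controller: corrected-error count}, or None if no EDAC
    memory controller is registered."""
    if not EDAC.is_dir():
        return None
    counts = {}
    for mc in sorted(EDAC.glob("mc*")):
        ce = mc / "ce_count"
        if ce.is_file():
            counts[mc.name] = int(ce.read_text())
    return counts or None

if __name__ == "__main__":
    status = ecc_status()
    print("ECC active, CE counts:", status) if status else print("no EDAC controller found")
```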


----------



## ryan92084

Everyone saying that they don't OC past boost/XFR seems to be missing the fact that the boost/XFR speeds are for a single core only, whereas the overclocks are for all cores...


----------



## DADDYDC650

Purchased an ASRock X370 Fatal1ty Pro board. Love its 16 power phases and built-in WiFi/BT 4.2. Also ordered a Ryzen 1700 through Amazon. Will be exchanging it if the first chip doesn't hit 4GHz stable. Wanted the 1800X, but what's the point?


----------



## lombardsoup

Quote:


> Originally Posted by *batmanwcm*
> 
> Some of you guys are a little delusional. You guys do realize that even Intel's X99 6 & 8 Core variants are worse than the 7700k in gaming as well.


Shhhhhhhhhhhhhh

You're not supposed to mention that


----------



## Kevin Sia

Ryzen CPUs literally smash Intel in every aspect except overclocking, the only downside, which isn't a big deal for most people.


----------



## EniGma1987

Any reviews yet with a description of the layout of the entire die, and not just a single CCX? I looked at the die and labelled it myself a while ago. Got most of it right so far, but I was having trouble identifying 6 (or 3?) sections along the outside edges of the die. I'd like to see a review go over the full die in detail so I could figure out what those sections are.


----------



## prznar1

Quote:


> Originally Posted by *ryan92084*
> 
> Check the OP


Found a review with a 1700 OC. 4.1 from a 3.0 base is quite good, ain't it? Now I wonder how the 1700 will OC on something like the MSI B350 Tomahawk.


----------



## Motley01

Quote:


> Originally Posted by *Kevin Sia*
> 
> Ryzen CPUs literally smash Intel in every aspect except overclocking, the only downside, which isn't a big deal for most people.


Not the 1700; its 1GHz OC is very impressive. Not so much the 1700X & 1800X.


----------



## G woodlogger

I'm mostly interested in how compiler/game-development support will improve for Skylake and Ryzen over the coming years. I bet they will pull away from my 4790K.


----------



## Wishmaker

Quote:


> Originally Posted by *lombardsoup*
> 
> Intel's 8c offerings overclock just as poorly. More cores = harder to OC, yet there's only a handful of people that have mentioned this.


INTEL clocks:

1. 5960X overclocks close to 4.7GHz with a bit of tweaking. 8-core chip.
2. 6900K overclocks to 4.5GHz, with some users reporting 4.7GHz 24/7 stable. 8 cores.
3. 6950X overclocks to 4.4-4.5GHz, with some users reporting WC mandatory for this. 10 cores.

Where is the AMD equivalent, excluding LN2?


----------



## Lass3

I'm looking forward to the 1600X. Fewer cores should mean a higher OC. A 1600X @ 4.4+ would be very good price/perf at $259.


----------



## jezzer

Quote:


> Originally Posted by *Kevin Sia*
> 
> Ryzen CPUs literally smash Intel in every aspect except overclocking, the only downside, which isn't a big deal for most people.


It's like Pascal: out-of-the-box boost was so high, thanks to native clock speeds and Boost 3.0, that there wasn't much left to OC yourself. People said YAY BEST CARD then, but now, because it's not Nvidia or Intel, they go boohoo.
Don't like it, don't buy it, but there isn't really anything to boohoo about.


----------



## davidelite10

Quote:


> Originally Posted by *lombardsoup*
> 
> Shhhhhhhhhhhhhh
> 
> You're not supposed to mention that


The difference is marginal; the 6900K/6950X clock to 4.4GHz and do extremely well.
Not nearly as much disparity as Ryzen.

Not bashing AMD by any means; they made HUGEEEEEEEEEEEEEEEEEEE advancements and I will be picking up a 1700 for an ESXi box, but I'll wait for Coffee Lake for my main rig.


----------



## prznar1

Quote:


> Originally Posted by *Motley01*
> 
> Not the 1700, the 1GHz OC is very impressive.. Not so much for the 1700X & 1800X


Yea, the 1700 shines in performance/price after OC. The best octacore imo atm.


----------



## IRobot23

Quote:


> Originally Posted by *Wishmaker*
> 
> INTEL clocks :
> 
> 1. 5960X overclocks close to 4.7GHz with a bit of tweaking. 8-core chip.
> 2. 6900K overclocks to 4.5GHz, with some users reporting 4.7GHz 24/7 stable. 8 cores.
> 3. 6950X overclocks to 4.4-4.5GHz, with some users reporting WC mandatory for this. 10 cores.
> 
> Where is the AMD equivalent, excluding LN2?


If you get i7 6900K over 4.2-4.3GHz (24/7) you can be happy.


----------



## prznar1

Quote:


> Originally Posted by *Lass3*
> 
> I'm looking forward to the 1600X. Fewer cores should mean a higher OC. A 1600X @ 4.4+ would be very good price/perf at $259.


Doubtful. 4.0 is going to be average until they fix it with a better stepping, like they used to in the old days.


----------



## GoLDii3

Quote:


> Originally Posted by *Kevin Sia*
> 
> Ryzen CPUs literally smash Intel in every aspect except in overclocking, the only downside which not a big deal for most people.


----------



## Blackops_2

Quote:


> Originally Posted by *Lass3*
> 
> I'm looking forward to the 1600X. Fewer cores should mean a higher OC. A 1600X @ 4.4+ would be very good price/perf at $259.


My line of thinking as well, if the 1600x can hit 4.4/4.5 it would be perfect for a gaming upgrade for the money and extra cores.


----------



## DuraN1

Quote:


> Originally Posted by *lombardsoup*
> 
> Both brands 8c offerings end in the 4.2, maybe 4.3 range.


Key word: HEADROOM.

Also, 4.2-4.3 is the lower end of the spectrum for HW-E and BW-E. I have yet to see Ryzen overclock to 4.2-4.3; do you have any source?


----------



## Starbuck5000

I've been looking at the reviews, but are there any comparisons where both the Intel and AMD CPUs are overclocked (memory too, with proper OCs)?

Or is everything stock/light overclocks for now?


----------



## DADDYDC650

Quote:


> Originally Posted by *Wishmaker*
> 
> INTEL clocks :
> 
> 1. 5960X overclocks close to 4.7GHz with a bit of tweaking. 8-core chip.
> 2. 6900K overclocks to 4.5GHz, with some users reporting 4.7GHz 24/7 stable. 8 cores.
> 3. 6950X overclocks to 4.4-4.5GHz, with some users reporting WC mandatory for this. 10 cores.
> 
> Where is the AMD equivalent, excluding LN2?


A 6900K is lucky to get past 4.3GHz fully stable. You also need to apply an AVX offset, since anything past 4.1GHz is very iffy.


----------



## fateswarm

AMD screwed up its marketing by not leading with well-binned 4c/8t parts first, just like Intel does.


----------



## jezzer

Quote:


> Originally Posted by *GoLDii3*


They test in 720p? wow..


----------



## sage101

Solid showing by AMD. The only cons I see with Ryzen 7 are the very limited OCing and the memory issue. All the people crying over gaming performance: just wait till Ryzen 5 is released. I'll be getting the 1600X to replace my legendary 2500K.


----------



## lombardsoup

Quote:


> Originally Posted by *Starbuck5000*
> 
> I've been looking at the reviews but are there any comparisons where both the Intel or AMD CPUs are overclocked (memory too with proper oc's)
> 
> Or is everything stock/light overclocks for now?


Haven't seen a review yet that goes past 1.425 on core.


----------



## DarkRadeon7000

Quote:


> Originally Posted by *DNMock*
> 
> SMT is bugged on ryzen atm, if you disable SMT then ryzen is within 1 fps across the board.


That was the benchmark with SMT off


----------



## chuy409

GamersNexus said they were told by AMD to test at 4K to create a GPU bottleneck.
Quote:


> When we approached AMD with these results pre-publication, the company defended its product by suggesting that intentionally creating a GPU bottleneck (read: no longer benchmarking the CPU's performance) would serve as a great equalizer. AMD asked that we consider 4K benchmarks to more heavily load the GPU, thus reducing workload on the CPU and leveling the playing field. While we fundamentally disagree with this approach to testing, we decided to entertain a mid-step: 1440p, just out of respect for additional numbers driven by potentially realistic use cases. Of course, in some regard, benchmarking CPUs at 4K would be analogous to benchmarking GPUs at 720p: The conclusion would be that every GPU is "the same," since they'd all choke on the CPU. Same idea here, just the inverse.


http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7
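GN's objection can be illustrated with a toy frame-time model (all numbers, including the 1.2 ms/megapixel GPU cost, are made up): each frame takes the longer of the CPU's and the GPU's work, and only the GPU's share scales with resolution.

```python
# Toy frame-time model of CPU- vs GPU-bound benchmarking (all numbers
# hypothetical). Each frame costs the longer of the CPU's and GPU's work;
# only the GPU's cost scales with resolution.

RES_MPIX = {"720p": 0.92, "1080p": 2.07, "4k": 8.29}  # megapixels per frame

def fps(cpu_ms: float, gpu_ms_per_mpix: float, mpix: float) -> float:
    """Frames per second limited by the slower of the two processors."""
    gpu_ms = gpu_ms_per_mpix * mpix
    return 1000.0 / max(cpu_ms, gpu_ms)

# Two hypothetical CPUs driving the same hypothetical GPU (1.2 ms/Mpix):
for name, cpu_ms in [("fast CPU", 3.0), ("slow CPU", 5.0)]:
    for res, mpix in RES_MPIX.items():
        print(f"{name} @ {res}: {fps(cpu_ms, 1.2, mpix):6.1f} fps")
```

In this model the two CPUs differ by well over 100 fps at 720p but collapse to the identical GPU-limited figure at 4K, which is why a 4K test says nothing about the CPU.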


----------



## EniGma1987

Quote:


> Originally Posted by *Starbuck5000*
> 
> I've been looking at the reviews but are there any comparisons where both the Intel or AMD CPUs are overclocked (memory too with proper oc's)
> 
> Or is everything stock/light overclocks for now?


Most reviewers are saying they can only get 4.1GHz or so at best, which is why you aren't seeing higher. That is the max.

Memory has trouble too. Some boards are only getting 2666, but there are some reviews with DDR4-3000 at 15-15 1T.


----------



## Kevin Sia

Quote:


> Originally Posted by *jezzer*
> 
> They test in 720p? wow..


----------



## prznar1

Every review uses a top-end mobo... ain't people interested in B350 overclocking results?


----------



## DuraN1

Quote:


> Originally Posted by *BobiBolivia*
> 
> You do realize, that process/stepping/binning can (and will) improve, right ? RIGHT ?


Yes. But the post I replied to claimed that Intel's 8-core offerings are lousy overclockers. They are not. My 5960X did 4.625 with 1.25V. My 6900K does 4.5 with 1.34V. That's over 1GHz above boost for the HW-E and close to it for the BW-E. How is that lousy?


----------



## EniGma1987

Quote:


> Originally Posted by *lombardsoup*
> 
> Haven't seen a review yet that goes past 1.425 on core.


There are at least two I just finished reading that used 1.45V and 1.46V.

EDIT: found the 1.46 one again:
http://www.legitreviews.com/amd-ryzen-7-1800x-1700x-and-1700-processor-review_191753/14


----------



## comagnum

Quote:


> Originally Posted by *DarkRadeon7000*
> 
> That was the benchmark with SMT off


That's literally what he said.


----------



## comagnum

Quote:


> Originally Posted by *Wishmaker*
> 
> People do not compare them to Intel's 8c offerings because of the price aspect. If you eliminate price, Ryzen is not the best option anymore. PRICE has been AMD's motto for the past decade.


Quote:


> Originally Posted by *Wishmaker*
> 
> INTEL clocks :
> 
> 1. 5960x overclocks close to 4.7 GHz with a bit of tweaking. 8 core chip.
> 2. 6900K, overclocks to 4.5 GHZ with some users reporting 4.7 GHz 24/7 stable. 8 cores
> 3. 6950x overclocks to 4.4, 4.5 GHz, with some users reporting WC mandatory for this. Again 8 cores.
> 
> Where is the AMD equivalent , excluding LN2?


Please go away.


----------



## lombardsoup

Quote:


> Originally Posted by *EniGma1987*
> 
> Most are saying they can only get 4.1GHz or so at best which is why you arent seeing higher. That is the max.
> 
> memory has trouble too. Some boards are only getting 2666, but there are some reviews with DDR-3000 15/15 1T


4.2 @ 1.425V, but yeah, most of the reviewers are afraid to go past AMD's recommended voltage.


----------



## Motley01

Well, I'm heading to Microcenter now. I'll be back in a couple hours, then start my new build. Very excited!!!


----------



## GoLDii3

Quote:


> Originally Posted by *Kevin Sia*


Both of you are geniuses, seriously.

Not knowing that a lower resolution makes games CPU-bound rather than GPU-bound.


----------



## Starbuck5000

Quote:


> Originally Posted by *EniGma1987*
> 
> Most are saying they can only get 4.1GHz or so at best which is why you arent seeing higher. That is the max.
> 
> memory has trouble too. Some boards are only getting 2666, but there are some reviews with DDR-3000 15/15 1T


I'm OK with that being the max. I'm wondering if anything compares that to a 7700K at 4.8GHz, or whatever a typical 24/7 OC is, with the RAM pushed too.


----------



## DarkRadeon7000

Quote:


> Originally Posted by *jezzer*
> 
> Yea but there must be something wrong with that one, WD2 likes multithreads and seeing the 6900k in top spot it just seems too odd to take for granted


Quote:


> Originally Posted by *IRobot23*
> 
> Does anyone trust that guy? When I was watching his benchmark I always had felling that Fury X was as fast as GTX 970.


Techreport benchmarked that same game with both the 1700 and 1800X and the results are the same. Ryzen is being smashed.

http://techreport.com/review/31366/amd-ryzen-7-1800x-ryzen-7-1700x-and-ryzen-7-1700-cpus-reviewed



In fact, in all CPU-dependent games it's being wrecked.





At this point Intel is more value for money for gaming


----------



## Nautilus

Quote:


> Originally Posted by *jezzer*
> 
> They test in 720p? wow..


lol, 720p? Really?... You are very picky, aren't you...









No need to even mention that at 1440p the difference between the 7700K and 1800X is simply gone.


----------



## bigjdubb

It's pretty impressive, but I think my 4790K will hold me over until Intel responds with Skylake-E. I'll decide which one to get once those launch and we see how they're priced.


----------



## comagnum

Quote:


> Originally Posted by *DarkRadeon7000*
> 
> Techreport benchmarked that same game with both the 1700 and 1800 and the results are the same.Ryzen is being smashed
> 
> http://techreport.com/review/31366/amd-ryzen-7-1800x-ryzen-7-1700x-and-ryzen-7-1700-cpus-reviewed
> 
> 
> 
> In fact in all CPU dependant games games its being wrecked
> 
> 
> 
> 
> 
> At this point Intel is more value for money for gaming


If SMT is disabled, the gap disappears. Obviously there's a scheduling error or something that bugs SMT.


----------



## EniGma1987

Quote:


> Originally Posted by *Starbuck5000*
> 
> I'm ok with that being that max, I'm wondering if there is anything that would compare that to a 7700k at 4.8Ghz or whatever a typical 24/7 OC with the RAM pushed too.


I doubt it in gaming. Those Sky/Kaby Lake chips are beasts at high clock speeds in games. My guess is it will take significant maturation of the 14nm process node, or Zen+, to push these to the higher speeds needed to compete with 4.8+ Kabys.


----------



## Jpmboy

Looking good. Some believable benchmarks at stock. The lack of OC headroom is concerning, though.


----------



## EniGma1987

Quote:


> Originally Posted by *DarkRadeon7000*
> 
> Techreport benchmarked that same game with both the 1700 and 1800 and the results are the same.Ryzen is being smashed
> 
> http://techreport.com/review/31366/amd-ryzen-7-1800x-ryzen-7-1700x-and-ryzen-7-1700-cpus-reviewed
> 
> 
> 
> In fact in all CPU dependant games games its being wrecked
> 
> 
> 
> 
> 
> At this point Intel is more value for money for gaming


Ryzen is pulling some impressive minimums though, somewhat close to its averages and not too far behind the 6800K/6900K processors. Still left in the dust by the 7700K's minimum fps, though.


----------



## GoLDii3

Quote:


> Originally Posted by *DarkRadeon7000*
> 
> http://techreport.com/review/31366/amd-ryzen-7-1800x-ryzen-7-1700x-and-ryzen-7-1700-cpus-reviewed
> 
> At this point Intel is more value for money for gaming


lol at Ryzen having less single-core performance than Ivy Bridge here


----------



## Derp

The gaming performance, overclocking headroom, and RAM speeds are disappointing. Not remotely as disappointing as Bulldozer was, but it seems that no matter how pessimistic I am, I'm still somewhat blinded by hype.

The prices are "low" for a reason.

To all reviewers: stop making yourselves look like clowns with GPU-bottlenecked 4K gaming results when the product you're reviewing is a CPU. I thought we were past this by now.


----------



## Starbuck5000

Quote:


> Originally Posted by *EniGma1987*
> 
> I doubt it in gaming. Those Sky/Kaby lake are beasts at high clock speed in games. My guess is it will take significant maturity of the 14nm process node or Zen+ to be able to push these to the higher speeds needed to compete with 4.8+ Kaby's.


So once OCing is factored in, the gap between Kaby and Ryzen only grows in Intel's favour (when gaming)?


----------



## Pro3ootector

http://pclab.pl/art69780-15.html

Still costs waaaay more than Ryzen.


----------



## motoray

So.... everything near the same clocks.... why should I get anything over the 1700?


----------



## Lass3

Quote:


> Originally Posted by *prznar1*
> 
> Doubtful. 4.0 is going to be average until they fix it with a better stepping, like they used to in the old days.


It's pretty much a fact that 8 cores overclock worse than 4/6 cores.

But we'll see in a few weeks. I bet the 1600X is going to be very popular.


----------



## bigjdubb

Quote:


> Originally Posted by *motoray*
> 
> So.... everything near the same clocks.... why should i get anything over the 1700?


No reason right now, unless you are allergic to overclocking.


----------



## darealist

"Fastest eight-core processor" myth from AMD debunked.


----------



## dieanotherday

Reviewer samples are handpicked too, so don't expect high OCs on your 1700.

Probably still worth getting the 1700X.


----------



## mistax

Quote:


> Originally Posted by *bigjdubb*
> 
> No reason right now, unless you are allergic to overclocking.


Pretty much. I originally preordered an 1800X, and today I walked out with a 1700, going to play the silicon lottery. I was basically reading reviews for the hour I had before Microcenter opened, and I'm glad I did; saved $200.


----------



## jezzer

Quote:


> Originally Posted by *GoLDii3*
> 
> Both of you are geniuses, seriously.
> 
> Not knowing that a lower resolution makes games CPU-bound rather than GPU-bound.


Well, Mr. Genius, have fun playing at 720p. Maybe they should have tested an MS-DOS game too, because it scales better with CPU speed.


----------



## dieanotherday

Also, they need to disable cores and OC.


----------



## oxidized

I honestly thought we'd see better than this. At this point, all those leaked benchmarks were fake, or something wasn't right with them, because what I'm seeing today is very different in some situations. I only hope now that the 1600X comes closer to the 7700K in gaming performance. I have to say I'm a bit disappointed.


----------



## comagnum

Quote:


> Originally Posted by *jezzer*
> 
> Well, Mr. Genius, have fun playing at 720p. Maybe they should have tested an MS-DOS game too, because it scales better with CPU speed.


What?? It means that fps is more closely tied to raw CPU power than GPU power when run at 720p. Are you serious?


----------



## bigjdubb

Quote:


> Originally Posted by *jezzer*
> 
> Well mr genius have fun playing at 720p, maybe they should have tested an MSDOS game to because it scales better with CPU speed too.


It's the proper way to test CPUs. If you stick with high resolutions and high settings, you will be limited in fps by the graphics card installed. The portions of the reviews at 1080p and higher are less important.


----------



## Mahigan

Quote:


> Originally Posted by *DarkRadeon7000*
> 
> Techreport benchmarked that same game with both the 1700 and 1800 and the results are the same.Ryzen is being smashed
> 
> http://techreport.com/review/31366/amd-ryzen-7-1800x-ryzen-7-1700x-and-ryzen-7-1700-cpus-reviewed
> 
> 
> 
> In fact, in all CPU-dependent games it's being wrecked
> 
> 
> 
> 
> 
> At this point Intel is more value for money for gaming


Disable SMT: http://wccftech.com/ryzen-gaming-benchmarks-roundup/

Now it is competitive with the 7700K







Not that it is supposed to be.


----------



## prznar1

Quote:


> Originally Posted by *Lass3*
> 
> It's pretty much fact that 8 cores overclocks worse than 4/6 cores.
> 
> But we'll see in a few weeks. I bet the 1600X is going to be very popular.


Well, the 6-core doesn't have its own die. The 4-core will be half of it, with the same layout for that half. I wouldn't expect them to clock much higher. If they do, it will be great, but don't get your hopes up too much.


----------



## comagnum

Quote:


> Originally Posted by *oxidized*
> 
> I honestly thought we'd see better than this. At this point it seems all those leaked benchmarks were fake, or something wasn't right with them, because what I'm seeing today is very different in some situations. I only hope now that the 1600X comes closer to the 7700K in gaming performance. I have to say I'm a bit disappointed.


From what I've seen, disabling smt greatly diminishes the fps gap. Points to a scheduling issue within windows and/or drivers.


----------



## naz2

Quote:


> Originally Posted by *oxidized*
> 
> I honestly thought we'd see better than this. At this point it seems all those leaked benchmarks were fake, or something wasn't right with them, because what I'm seeing today is very different in some situations. I only hope now that the 1600X comes closer to the 7700K in gaming performance. I have to say I'm a bit disappointed.


i think most leaked benchmarks were just cherrypicked, not necessarily fabricated. problem was exacerbated by fanboys fueling the hype instead of following rule #1 of benchmarks, which is to wait for real, verified numbers


----------



## DADDYDC650

I game at 4K so all of these 1080p/1440p benchmarks are worthless to me. I'm guessing the 1700 is pretty much equal to a 7700k at 4k.


----------



## oxidized

Quote:


> Originally Posted by *comagnum*
> 
> From what I've seen, disabling smt greatly diminishes the fps gap. Points to a scheduling issue within windows and/or drivers.


I really hope it's something fixable, and yeah, SMT off seems to improve things quite a bit.
Quote:


> Originally Posted by *naz2*
> 
> i think most leaked benchmarks were just cherrypicked, not necessarily fabricated. problem was exacerbated by fanboys fueling the hype instead of following rule #1 of benchmarks, which is to wait for real, verified numbers


You're right

I seriously hope the 1600X can do better than this in gaming


----------



## RedM00N

Someone already got close to 6 GHz with everything active.

5.8 GHz at 1.97 V








http://hwbot.org/submission/3473875_der8auer_cpu_frequency_ryzen_7_1800x_5802.93_mhz

5.4 GHz GPUPI for CPU:
http://hwbot.org/submission/3473880_


----------



## lombardsoup

Quote:


> Originally Posted by *comagnum*
> 
> From what I've seen, disabling smt greatly diminishes the fps gap. Points to a scheduling issue within windows and/or drivers.


Some reviewers aren't mentioning this fact, I wonder why.


----------



## bigjdubb

Quote:


> Originally Posted by *DADDYDC650*
> 
> I game at 4K so all of these 1080p/1440p benchmarks are worthless to me. I'm guessing the 1700 is pretty much equal to a 7700k at 4k.


The 4K results are showing GPU limitations, not CPU limitations. 720p is the only resolution needed in CPU tests.


----------



## Iching

It's a very decent offering. It's priced well and performs as expected. The prices will drop even further within a couple of months. Good job, AMD!


----------



## CULLEN

*From what I understand from various sources.*

SMT is bugged and is no good for gaming. Disable it for more performance.

Pick *the right motherboard and memory* if you want to exceed 2667 MHz.

In multi-threaded workloads, Ryzen is equal to the i7 6900K.

Voltage readings are bugged; it is not actually running at 0.2 V or 1.5 V.

i7 7700K is the best for gaming.

But in the end, this seems to be the case, which is absolutely hilarious!
Quote:


> Originally Posted by *SoloCamo*
> 
> ITT: People only look at 7700k gaming performance and claim superiority as a whole. 6900k being $1000 and not much faster/equal in multithreaded workloads over $329-$500 AMD cpu seemingly also ignored.


Consumers are the winners today because Ryzen is simply AMAZING, and yet people are complaining that 6900K is still better.


----------



## DADDYDC650

Quote:


> Originally Posted by *bigjdubb*
> 
> The 4k results are showing gpu limitations, not cpu limitations. 720p is the only resolution needed in CPU tests.


Yes I know. Point is I won't see a difference between a 1700 and 7700k at 4k res.


----------



## jezzer

Quote:


> Originally Posted by *bigjdubb*
> 
> It's the proper way to test CPU's. If you stick with high res and high settings you will be limited in fps by the graphics card installed. The portions of the reviews that contain 1080p and higher are less important.


Quote:


> Originally Posted by *bigjdubb*
> 
> The 4k results are showing gpu limitations, not cpu limitations. 720p is the only resolution needed in CPU tests.


No one cares about 720p because it tells you nothing about what people will actually get and can expect.

Why not test in DOSBox then, lul? That would be the ultimate CPU test.


----------



## DigitrevX

Quote:


> Originally Posted by *Mahigan*
> 
> Disable SMT: http://wccftech.com/ryzen-gaming-benchmarks-roundup/
> 
> Now it is competitive with the 7700K
> 
> 
> 
> 
> 
> 
> 
> Not that it is supposed to be.


It's still getting wrecked with SMT disabled https://www.youtube.com/watch?v=j7UBHjtCXhU
And as a content creator doing 3D work, he's right. Oftentimes GPU rendering is king these days.


----------



## Pro3ootector

FoK JuLlE NaIeRs


----------



## jprovido

so basically the 1700, 1700X and 1800X have the same overclocking headroom? (around 4 GHz)


----------



## M3T4LM4N222

AMD is back. I was beginning to wonder if I would ever see them being competitive with Intel again. Now I guess the question becomes...can AMD maintain this?


----------



## DADDYDC650

Quote:


> Originally Posted by *jprovido*
> 
> so basically 1700, 1700x and 1800x have the same overclock headroom? (around 4ghz)


Seems like it so far.


----------



## DarkRadeon7000

Quote:


> Originally Posted by *Mahigan*
> 
> Disable SMT: http://wccftech.com/ryzen-gaming-benchmarks-roundup/
> 
> Now it is competitive with the 7700K
> 
> 
> 
> 
> 
> 
> 
> Not that it is supposed to be.


GamersNexus benchmarks show it still gets hammered with SMT disabled. Disabling SMT nets about 8 FPS more. Not enough.


----------



## 113802

Ryzen has its flaws, but it's perfectly positioned. The Ryzen 7 1700 is currently the best processor to get, and it's only $329.99.

They already know which improvements they have to make for Ryzen 2, so we can already expect a 10% increase in IPC.

Still in stock!

https://www.newegg.com/Product/Product.aspx?Item=N82E16819113428


----------



## DADDYDC650

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Ryzen has its flaws, but it's perfectly positioned. The Ryzen 7 1700 is currently the best processor to get, and it's only $329.99.
> 
> They already know which improvements they have to make for Ryzen 2, so we can already expect a 10% increase in IPC.
> 
> Still in stock!
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16819113428


I'd buy it from newegg if it wasn't for their no return policy. Amazon, where's your stock?


----------



## rv8000

Quote:


> Originally Posted by *RedM00N*
> 
> Someone already got close to 6ghz with everything active.
> 
> 5.8 1.97v
> 
> 
> 
> 
> 
> 
> 
> 
> http://hwbot.org/submission/3473875_der8auer_cpu_frequency_ryzen_7_1800x_5802.93_mhz
> 
> 5.4g GPUPI for CPU
> http://hwbot.org/submission/3473880_


This is what's bothering me about all the OC results in reviews: no one touched the BCLK.


----------



## Blackops_2

My question is this: were there really that many of you expecting it to dethrone Kaby Lake, or any other CPU with higher IPC, in the gaming sector, knowing that IPC/clock speed is still king and not core count? We all suspected the OC headroom would be rather iffy, especially after the leaked 5.1 GHz on LN2 at 1.5+ vcore.

It just baffles me, really. I think people are forgetting it was just a decade or so ago that hitting 4 GHz was a huge feat and anything thereafter was considered extremely good. A new arch comes out, doesn't hit 4.5 GHz, and people are far too disappointed considering what AMD has accomplished.


----------



## Kuivamaa

Quote:


> Originally Posted by *IRobot23*
> 
> Does anyone trust that guy? When I was watching his benchmarks I always had a feeling that the Fury X was as fast as a GTX 970.


He isn't really competent. Very shallow tests, and he often found the 9590 slower than the 83x0 back in the day, probably due to throttling boards. That said, Ryzen should be getting higher FPS in some games than what reviewers are getting.


----------



## zealord

Great CPU, but I am not really sure if I am that happy with the performance in games. For programs, synth benchmarks and some applications it is pretty cool, but a 7700K is better for games and that is what I do 99% of the time.

Hmm


----------



## ducegt

That's arguably the best overclocker on the planet. What's the best we have seen with an AIO water cooler?


----------



## 113802

Quote:


> Originally Posted by *RedM00N*
> 
> Someone already got close to 6ghz with everything active.
> 
> 5.8 1.97v
> 
> 
> 
> 
> 
> 
> 
> 
> http://hwbot.org/submission/3473875_der8auer_cpu_frequency_ryzen_7_1800x_5802.93_mhz
> 
> 5.4g GPUPI for CPU
> http://hwbot.org/submission/3473880_


He was able to successfully de-lid it! I was waiting to see if it was still functional. Wow, those results are insane. Can't wait for him to publish the temperature comparison to see if it was worth de-lidding.


----------



## M3T4LM4N222

It seems like AMD is betting on games and programs being able to benefit from more cores vs higher frame-rates in the coming years. Maybe a "Future Proof" kind of mentality.

It seems similar to the approach they took with the RX480 outperforming NVIDIA equivalents when using DX12 and Vulkan.

Basically in the current state, they're overall not going to blow the competition completely away, however, when the circumstances are right, they do.

I mean, realistically, they're releasing the best price-to-performance option for people who do heavy rendering or use heavily multi-threaded applications.

I don't know...I'm tired and haven't been active on these forums for years.


----------



## Defoler

So for content creation, where actual core count means something, AMD has taken a very good step into the game.
For gaming, you're better off sticking with Intel. In some cases even an i3 can beat the 1800X (I have no idea why, but it seems to happen).
OC is very limited, and it may have been released too early; it needs BIOS updates for memory fixes and, hopefully, higher OC headroom.

If they can only OC them to 4.2 GHz at most while Intel can go to 5 GHz, I don't see AMD breaking Intel's dominance in the mid and low-end range, which is disappointing.


----------



## cssorkinman

Quote:


> Originally Posted by *Kuivamaa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *IRobot23*
> 
> Does anyone trust that guy? When I was watching his benchmarks I always had a feeling that the Fury X was as fast as a GTX 970.
> 
> 
> 
> He isn't really competent. Very shallow tests, and he often found the 9590 slower than the 83x0 back in the day, probably due to throttling boards. That said, Ryzen should be getting higher FPS in some games than what reviewers are getting.

I think you'd sooner see Steve sporting a flat top haircut than see him give AMD any credit.


----------



## Defoler

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> I think AMD is betting on games and programs being able to benefit from more cores vs higher frame-rates in the coming years. It seems similar to the approach they took with the RX480 outperforming NVIDIA equivalents when using DX12 and Vulkan.
> 
> Basically in the current state, they're overall not going to blow the competition completely away, however, when the circumstances are right, they do.
> 
> I don't know...I'm tired and haven't been active on these forums for years.


If they are betting on the next 2-3 years, Intel has the answer with 6-core mid/high-end chips. We might see 6/6 and 6/12 core replacements for the 7600K/7700K this year, which will make it really hard for AMD if they can't beat the current lineup in games right now.

I do see people who need multiple cores picking up the AMD parts and threatening Intel over its higher prices, which I hope forces Intel to reduce prices and release Skylake-E soon.


----------



## jprovido

my e-peen made a comeback with my i7 7700k. and I'm not happy at all







I wanted an 1800X. I guess I'm going to settle for the 1700 and an mATX board for my VR rig; the 7700K is going to stay in my main PC.


----------



## SoloCamo

Quote:


> Originally Posted by *zealord*
> 
> Great CPU, but I am not really sure if I am that happy with the performance in games. For programs, synth benchmarks and some applications it is pretty cool, but a 7700K is better for games and that is what I do 99% of the time.
> 
> Hmm


I'm not sure where the notion came from that Ryzen was going to be a better _current_ gaming CPU than a 7700K. Most were expecting Ivy Bridge to Haswell gaming performance, and it seems we got it for the most part.


----------



## btupsx

Quote:


> Originally Posted by *Jpmboy*
> 
> Looking good. some "believable" benchmarks at stock. The lack of OC headroom is concerning tho.


Initial frequency limits were to be expected given the nature of the process node. I think there is still an easy 200-300 MHz in there once the best boards and tuning have been fleshed out. Process refinements will yield improvements as well. The 1600X/1600 should be the best overall clockers/value proposition once they are released shortly.


----------



## ducegt

Quote:


> Originally Posted by *jprovido*
> 
> my e-peen made a comeback with my i7 7700k. and I'm not happy at all
> 
> 
> 
> 
> 
> 
> 
> I wanted a 1800x. I guess I'm gonna settle for the 1700 and matx board for my vr rig. the 7700k is gonna stay with my main pc


At least your e-ego is okay. I wanted AMD to have been better as well, but the writing was on the wall. I'll be buying Vega regardless of how it turns out.


----------



## jamaican voodoo

I'm going to be happy with my 1800X. I game at 4K, so it's all good to go, and the processor will get better with updates.


----------



## Blackops_2

Quote:


> Originally Posted by *SoloCamo*
> 
> I'm not sure where the notion came from that Ryzen was going to be a better _current_ gaming CPU than a 7700K. Most were expecting Ivy Bridge to Haswell gaming performance, and it seems we got it for the most part.


My point entirely; I don't understand where all the expectations were coming from.


----------



## Defoler

Quote:


> Originally Posted by *SoloCamo*
> 
> ITT: People only look at 7700k gaming performance and claim superiority as a whole. 6900k being $1000 and not much faster/equal in multithreaded workloads over $329-$500 AMD cpu seemingly also ignored.


You are missing a simple point.
If someone has $400 or so to spend on a *gaming* CPU, he isn't looking at the 6900K anyway. But someone who does *productivity* work isn't going to look at the 7700K either; he will look at either Xeons or Broadwell-E for productivity. And for him, AMD makes a whole lot of sense.

If you are a gamer, you have no reason to look at AMD right now, at least until they fix their bugs. Or wait and see how Intel does with its next iteration and hope that AMD also gets better in gaming.


----------



## Defoler

Quote:


> Originally Posted by *Blackops_2*
> 
> My point entirely, don't understand where all the expectation was coming from.


Really?
Have you not seen the hype train going full speed ahead over the last month, with wild speculation about how Ryzen was going to beat Intel completely, the IPC comparisons, and the claims of the incoming death of the 7700K?


----------



## oxidized

Quote:


> Originally Posted by *Blackops_2*
> 
> My point entirely, don't understand where all the expectation was coming from.


The latest rumoured/leaked tests and benchmarks, I guess. I fell for that too.

Quote:


> Originally Posted by *Defoler*
> 
> Really?
> Have you not seen the hype train going full speed ahead in the last month with wild speculations on how ryzen is going to beat intel completely and IPC comparisons and claims of the incoming death of the 7700K?


This basically


----------



## zealord

Quote:


> Originally Posted by *SoloCamo*
> 
> I'm not sure where the notion came from that Ryzen was going to be a better _current_ gaming CPU than a 7700K. Most were expecting Ivy Bridge to Haswell gaming performance, and it seems we got it for the most part.


It's the first day of reviews. How could I have known before?

Somehow Ryzen looks like a CPU which should be very attractive to gamers, but really isn't all that great.

I am not sure why I should buy Ryzen as a gamer


----------



## DarkRadeon7000

Quote:


> Originally Posted by *SoloCamo*
> 
> I'm not sure where the notion came from that Ryzen was going to be a better _current_ gaming CPU than a 7700K. Most were expecting Ivy Bridge to Haswell gaming performance, and it seems we got it for the most part.


Then no gamer will buy it, as Intel's chips outperform it for a nominal price increase. The R7 1700 costs as much as a 7700K here, and it's getting kicked around by the Intel chip.


----------



## Newbie2009

Quote:


> Originally Posted by *WannaBeOCer*
> 
> He was able to successfully de-lid it! I was waiting to see if it was still functional. Wow that's insane the results. Can't wait for him to publish the temperature comparison to see if it was worth de-lidding.


1.97V lol and still alive


----------



## Axon14

Quote:


> Originally Posted by *Blackops_2*
> 
> My question is this: were there really that many of you expecting it to dethrone Kaby Lake, or any other CPU with higher IPC, in the gaming sector, knowing that IPC/clock speed is still king and not core count? We all suspected the OC headroom would be rather iffy, especially after the leaked 5.1 GHz on LN2 at 1.5+ vcore.
> 
> It just baffles me, really. I think people are forgetting it was just a decade or so ago that hitting 4 GHz was a huge feat and anything thereafter was considered extremely good. A new arch comes out, doesn't hit 4.5 GHz, and people are far too disappointed considering what AMD has accomplished.


I certainly was not. This is a huge win for AMD as the CPU is competitive with Intel's fastest 4c/8t in gaming. Ultimately we need to see what the 1600x can do with overclocking.


----------



## Phixit

Quote:


> Originally Posted by *zealord*
> 
> It's the first day of reviews. How could I have known before?
> 
> Somehow Ryzen looks like a CPU which should be very attractive to gamers, but really isn't all that great.
> 
> I am not sure why I should buy Ryzen as a gamer


Well, there is still a 4C/8T variant to come.


----------



## zealord

Quote:


> Originally Posted by *Blackops_2*
> 
> My point entirely, don't understand where all the expectation was coming from.


I know where all the expectations were coming from.

For example AMD preview events and youtubers showing it in comparison to Intel CPUs.


----------



## zealord

Quote:


> Originally Posted by *Phixit*
> 
> Well, there is still a 4C/8T variant to come.


Yeah, maybe the 1600X with 6C/12T is going to be interesting at a lower price point if it overclocks well.


----------



## oxidized

Quote:


> Originally Posted by *Axon14*
> 
> I certainly was not. This is a huge win for AMD as the CPU is competitive with Intel's fastest 4c/8t in gaming. Ultimately we need to see what the 1600x can do with overclocking.


Also this. I have more faith in the 1600X, actually, especially if it overclocks well.


----------



## Defoler

Quote:


> Originally Posted by *btupsx*
> 
> Initial frequency limits were to be expected given the nature of the process node. I think there is still another easy 200-300 MHz in there once best boards and tuning have been fleshed out. Process refinements will yield improvements as well. 1600X/1600 should be the overall best clockers/value proposition once they are released in short time.


Yeah, over time AMD will overcome the issues.
But it will take them a year or so. Don't expect Intel to sit idle on that one and let AMD play catch-up.
The 7700K already reaches 5 GHz, and if that OC headroom carries over to the next iteration, we should expect 6/12 CPUs from Intel reaching close to 5 GHz by the end of the year. That will definitely put a heavy lid on AMD's plans to gain momentum in that tier.


----------



## Axon14

Quote:


> Originally Posted by *Defoler*
> 
> Really?
> Have you not seen the hype train going full speed ahead in the last month with wild speculations on how ryzen is going to beat intel completely and IPC comparisons and claims of the incoming death of the 7700K?


Hardwareswap on Reddit was chock full of 6700K and 7700K CPUs for resale; everyone was "quitting gaming," but it was obvious it was because of all the Ryzen hype. Oops?


----------



## naz2

Quote:


> Originally Posted by *Blackops_2*
> 
> My point entirely, don't understand where all the expectation was coming from.


the poopstorm in a few recent threads could've fooled me. funny how so many people are backing off their original claims and falling back to the synthetic benchmark/"productivity" argument when we just had a 60 page thread exclusively about ryzen gaming performance


----------



## bigjdubb

Quote:


> Originally Posted by *Defoler*
> 
> You are messing a simple point.
> If someone has 400$ or so to spend on a *gaming* CPU, he isn't looking at the 6900K anyway. But someone who does *productivity* work, isn't going to look at the 7700K anyway, but he will look into either xeons or broadwell-Es for productivity. And for him, AMD makes a whole lot of point.
> 
> If you are a gamer, you have no reason to look into AMD right now. At least until they fix their bugs. Or wait and see how intel does with their next iteration and hope that AMD also get better in gaming.


There are also those who do both. Maybe they don't game at 1080p, where there is a difference; maybe they game at 1440p or 4K, where there is very little difference since things are GPU-bound.

Honestly, if all I did was play games I would still consider the 1700 over the 7700K, simply because they are close enough and the next revision of Ryzen will more than likely work on the board you purchase; not so likely with the 7700K.


----------



## Newbie2009

I'd like to see some OC results under a proper loop


----------



## Xuper

Overall it's decent. I will buy Ryzen for gaming/VirtualBox/heavily threaded apps. If you don't like it, well, that's your problem.


----------



## oxidized

Quote:


> Originally Posted by *naz2*
> 
> the poopstorm in a few recent threads could've fooled me. funny how so many people are backing off their original claims and falling back to the synthetic benchmark/"productivity" argument when we just had a 60 page thread exclusively about ryzen gaming performance


Agreed


----------



## Liranan

AMD have done an amazing job with Ryzen. The last time we saw such a leap was when Intel went from Netburst to Core 2 and that was a decade ago.

I will wait for Zen 2 and get that when it arrives, as Zen still has a few problems that need to be sorted, but for the price these chips are just ridiculously powerful.


----------



## kfxsti

Quote:


> Originally Posted by *bigjdubb*
> 
> There is also those who do both. Maybe they don't game at 1080p where there is a difference, maybe they game at 1440p or 4k where there is very little difference since things are gpu bound.
> 
> Honestly, if all I did was play games I would still consider the 1700 over the 7700k simply because they are close enough and the revision of Ryzen will more than likely work on the board you purchase, not so likely with the 7700k.


If I'm not mistaken, the next Intel chips are supposed to work on the Z270 platform.


----------



## moonroket

I don't understand: the 1800X matched the 6900K in single-threaded performance, but in game benchmarks there's a huge gap? People said there was a problem with SMT, but I can't clearly tell from the graphs which results have SMT enabled and which don't.


----------



## kd5151

AMD is trying to kill two birds with one stone, competing with both X99 and Z270 CPUs. The 1700/1700X is the better deal. I can't see paying $500 USD for the 1800X for its higher clocks when Intel has the highest out-of-the-box clocks and better IPC for $350 USD with the i7 7700K. But if you want a CPU like the 6900K, take your pick: the 1700/1700X/1800X are all better options. Most people don't spend over $500 on a CPU anyway.

I think the 1600X will be the better deal when it comes. If you can't wait, get the 1700, and good luck with overclocking, because you are going to need it, especially when the $350 i7 of the last two years can easily hit 4.5 GHz+. Ryzen's overclocking headroom sucks. Still looking into this.

For gaming, Ryzen is on par with an i5. But how long is that going to last with games like BF1, especially at higher resolutions where the GPU becomes the bottleneck and Intel's CPUs are pegged at 100%? Cough, i5, cough.

Choose your next CPU wisely, those of you looking to upgrade like me. This is the best AMD can do for now. I'm going to keep looking at benchmarks. The final verdict is yet to be seen.


----------



## M3T4LM4N222

How anybody can say Ryzen is disappointing baffles me. AMD is competitive again, that alone should make you excited. Intel is going to have to respond, by either offering more cores in the mainstream, lowering prices or both.

The 1800X is literally less than half the price of the 6900K and pretty much directly competing with it. Intel has a mainstream platform and a high-end platform, both with different feature sets, sockets, etc. AMD has managed to cram all of its offerings into one universal socket.

Yes, Ryzen is falling behind Skylake in single-threaded applications and gaming; a lot of games aren't made to utilize more than 2-4 cores. But if you do serious rendering, streaming, or use a lot of multi-threaded applications, Ryzen is by far the best option at its price.

Not to mention there's been talk of games down the road being able to take advantage of more cores.


----------



## Defoler

Quote:


> Originally Posted by *Axon14*
> 
> Hardwareswap on reddit was chock full of 6700k and 7700k cpus for resale, everyone was "quitting gaming" but it was obvious that it was because of all the Ryzen news.


I did expect people to rush and buy AMD based on the leaks. That's why the leaks were so frequent and numerous. The PR war at its best.

Yet once they go back to gaming, they will eventually "eat their hat" and go back to Intel until AMD's next iteration.
But it doesn't surprise me overall. A lot of people purchase a 6800K or 6900K just for gaming, for the e-peen. So people will buy a 1700X/1800X for the 3DMark physics scores and the productivity scores and claim their machine is the best in the world, while the guy who stayed with his Intel CPU will be more than happy to get better FPS without needing to replace anything.


----------



## JedixJarf

Quote:


> Originally Posted by *bigjdubb*
> 
> There is also those who do both. Maybe they don't game at 1080p where there is a difference, maybe they game at 1440p or 4k where there is very little difference since things are gpu bound.
> 
> Honestly, if all I did was play games I would still consider the 1700 over the 7700k simply because they are close enough and the revision of Ryzen will more than likely work on the board you purchase, not so likely with the 7700k.


Exactly.

I like to game @ 4k, but I also use my machine as a workstation for running multiple servers/vswitches at a time in vmware. 7700k couldn't handle that load.


----------



## Axon14

Quote:


> Originally Posted by *bigjdubb*
> 
> There is also those who do both. Maybe they don't game at 1080p where there is a difference, maybe they game at 1440p or 4k where there is very little difference since things are gpu bound.
> 
> Honestly, if all I did was play games I would still consider the 1700 over the 7700k simply because they are close enough and the revision of Ryzen will more than likely work on the board you purchase, not so likely with the 7700k.


Also consider that once
Quote:


> Originally Posted by *moonroket*
> 
> I don't understand: the 1800X matched the 6900K in single-threaded performance, but in game benchmarks there's a huge gap? People said there was a problem with SMT, but I can't clearly tell from the graphs which results have SMT enabled and which don't.


If you read these reviews closely, it's clear even some of the reviewers share your concern. PCPer said they had contacted AMD and not heard back. Legit Reviews literally has two sets of benches: one with obvious problems, where the 1800X loses to the likes of the i5 570, and another more in line with everything else we are seeing.


----------



## Xuper

*Ryzen isn't a pure gaming CPU.* It has good power consumption and decent value for multithreaded apps, and in games I don't see any issue.


----------



## ChronoBodi

Apparently 162 in single-threaded Cinebench, but it's worse in games?

If anything, the games are Intel-optimized; there is no other way to explain why a 1700X can have Broadwell-E IPC yet game like it has Sandy Bridge IPC.


----------



## budgetgamer120

Quote:


> Originally Posted by *Shiftstealth*
> 
> I'm upset i got the 1700x, over the 1700. Not upset i got ryzen. I will be happy to sport AMD for once again.


Excellent, AMD nailed it. Not so fast that Intel is triggered into price drops, but fast enough for the masses to buy. Intel fanboys' wallets will keep getting emptied while AMD makes money and gains market share.









Quote:


> Originally Posted by *sugalumps*
> 
> Well at the amd event with linus they didn't have min/max % showing and they tested at 4k so it was gpu bound not cpu bound, both sides obviously give themselves the best conditions at their press conferences which never translates into real world performance.


What is wrong with 4K benchmarks? That is where everyone is headed, and AMD made it clear that they want 4K for the masses.
Quote:


> Originally Posted by *naz2*
> 
> this is the identical rhetoric that was used when bulldozer came out. not sure what's so difficult to comprehend


Please don't mention Ryzen and Bulldozer in the same sentence... minus credibility.
Quote:


> Originally Posted by *naz2*
> 
> are we looking at the same reviews? ryzen is losing 10% fps with smt enabled and there's already murmurings of a magical windows update to fix it. this is verbatim what happened when bulldozer dropped and people pointed to poor optimization with the windows scheduler as the culprit
> 
> not saying ryzen sucks or anything, just that the rhetoric is the same. hopefully they will actually fix it with a magical update but i doubt it. intel's been refining hyper-threading for nearly a decade to get to its current level


Drivers are not magic; drivers tell hardware how to operate. Ryzen has a feature that Windows does not know how to handle, and a driver or Windows update will fix it.
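As an aside, one stopgap some users experimented with while waiting on scheduler fixes was process affinity: pinning a game to one logical CPU per physical core roughly mimics disabling SMT for that process, without a BIOS toggle. A minimal sketch; the helper below is hypothetical and assumes an interleaved SMT-sibling layout, which is common but not universal:

```python
import os

def physical_core_mask(logical_count, smt_ways=2):
    # Hypothetical helper: assumes SMT siblings are interleaved so that
    # every `smt_ways`-th logical CPU maps to a distinct physical core.
    # Real topology should come from /sys/devices/system/cpu on Linux or
    # GetLogicalProcessorInformationEx on Windows; this is only a sketch.
    return {cpu for cpu in range(logical_count) if cpu % smt_ways == 0}

# Restrict this process to one hardware thread per physical core.
mask = physical_core_mask(os.cpu_count() or 1)
# os.sched_setaffinity(0, mask)  # Linux-only; Windows needs SetProcessAffinityMask
```

Whether this recovers the same FPS as flipping SMT off in the BIOS depends on the game and the scheduler, so treat it as an experiment, not a fix.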


----------



## oxidized

Quote:


> Originally Posted by *Xuper*
> 
> *Ryzen isn't Pure gaming CPU*.it has good Power consumption/decent value for Multithread app also in game I don't see any issue.


Gaming was used many times in marketing content, though.


----------



## budgetgamer120

Quote:


> Originally Posted by *oxidized*
> 
> Gaming was used many times in marketing content, though.


It games, right?

What is your point?


----------



## Liranan

Quote:


> Originally Posted by *JedixJarf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bigjdubb*
> 
> There are also those who do both. Maybe they don't game at 1080p, where there is a difference; maybe they game at 1440p or 4K, where there is very little difference since things are GPU-bound.
> 
> Honestly, if all I did was play games I would still consider the 1700 over the 7700K, simply because they are close enough and the next revision of Ryzen will more than likely work on the board you purchase; not so likely with the 7700K.
> 
> 
> 
> Exactly.
> 
> I like to game @ 4k, but I also use my machine as a workstation for running multiple servers/vswitches at a time in vmware. 7700k couldn't handle that load.

Obviously Zen is the best option for you.

Personally I would like to get a 1700 or even 1600 for my Plex server. It would be glorious.


----------



## zealord

Quote:


> Originally Posted by *Xuper*
> 
> *Ryzen isn't a pure gaming CPU.* It has good power consumption and decent value for multithreaded apps, and in games I don't see any issue.


The issue, at least for me as a pure gamer, is that you don't want your CPU to be the bottleneck, and the CPU is less likely to be the bottleneck if you pick a 7700K over a Ryzen CPU.


----------



## Ha-Nocri

I was sure I was going to buy the 1600X; now I don't know anymore. But I also don't want to give Intel money for their overpriced 4-core crap... dunno what to do.


----------



## Artikbot

Quote:


> Originally Posted by *Xuper*
> 
> *Ryzen isn't a pure gaming CPU.* It has good power consumption and decent value for multithreaded apps, and in games I don't see any issue.


It games well enough, it is hella fast at processing photos and video transcoding and it is damn efficient at doing so, while setting you back a very reasonable amount.

It does everything I need it to do at the price I can afford.


----------



## Defoler

Quote:


> Originally Posted by *bigjdubb*
> 
> There are also those who do both. Maybe they don't game at 1080p, where there is a difference; maybe they game at 1440p or 4K, where there is very little difference since things are GPU-bound.
> 
> Honestly, if all I did was play games I would still consider the 1700 over the 7700K, simply because they are close enough and the next revision of Ryzen will more than likely work on the board you purchase; not so likely with the 7700K.


I'm not saying there aren't people who do both, but tbh they are very few, and there are more 6900K buyers who would be just fine with a 7700K (and in some places you can see the 7700K beat the 1800X even in productivity work).

And if you considered the 1700 over the 7700K, remember that the 7700K can easily be OCed to almost 5GHz, and even at 1440p and 4K it holds up better. In some cases even the 7600K runs just as well.
I see no reason to get AMD for anything gaming-related, given the low memory speeds (atm at least; no idea how it will be after the fix) and the very small OC headroom. There's no future-proofing tbh, even if games in 2-3 years really use many cores (which has been talked about for years, yet nothing has really improved, and I don't see how it will).


----------



## budgetgamer120

Quote:


> Originally Posted by *zealord*
> 
> The issue, at least for me as a pure gamer, is that you don't want your CPU to be the bottleneck, and the CPU is less likely to be the bottleneck if you pick a 7700K over a Ryzen CPU.


Ryzen is not for you sir. At least the 8 core Ryzen is not for you. Move on.


----------



## TomiKazi

Here's hoping that the lower performance in some instances where SMT is enabled is due to operating systems not yet being tailored to AMD's implementation, instead of an inherent hardware flaw. Is there any information about this yet?


----------



## Xuper

Quote:


> Originally Posted by *oxidized*
> 
> Gaming was used many times in marketing content, though.


Yes, but that's not most people. How many people only play games? I really don't see why you should be disappointed because of gaming. Who said Ryzen is for gaming? I'll go Ryzen for overall performance.


----------



## weebeast

Well, I upgraded from a 2500K to a 7600K and didn't really find it a big improvement, so I sold my motherboard and sent back my CPU. Ryzen doesn't perform great in most games, but I'll still get one. I don't game a lot, and it would be nice to have more CPU power for other stuff.


----------



## Blackops_2

Quote:


> Originally Posted by *Defoler*
> 
> Really?
> Have you not seen the hype train going full speed ahead in the last month with wild speculations on how ryzen is going to beat intel completely and IPC comparisons and claims of the incoming death of the 7700K?


Most of the leaks pointed to exactly what we got: Broadwell IPC. There was maybe one I recall, which wccftech posted (which should be a sign in itself), that had it somehow beating Kaby Lake. Other than that, I figured the hype was that AMD is back. Which they are; they're competing again. I guess some people really went over the top with their interpretation of "AMD being back."


----------



## Defoler

Quote:


> Originally Posted by *Xuper*
> 
> *Ryzen isn't a pure gaming CPU.* It has good power consumption and decent value for multithreaded apps, and in games I don't see any issue.


But AMD used gaming as a selling point as well, no?
If these weren't meant for gaming, AMD shouldn't have talked about gaming at all.


----------



## zealord

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ryzen is not for you sir. At least the 8 core Ryzen is not for you. Move on.


Yeah, but AMD presented it like it was great for exactly someone like me. They actually said Ryzen was going to be amazing for gamers, and they marketed the 1700 and 1700X as such.

I really love AMD, but there is no need to defend them. They are lacking where it matters most for a huge chunk of potential buyers.


----------



## jprovido

Quote:


> Originally Posted by *ducegt*
> 
> At least your e-ego is okay. I wanted AMD to have been better as well, but the writing was on the wall. I'll be buying Vega regardless of how it turns out.


It feels better, in a sense, that I'm not an idiot for buying the 7700K, but I wanted the 1800X to do better. Negative SMT scaling in games is concerning, and the OC headroom is disappointing. Well, at least the R7 1700 is a much better buy now, since they all have about the same OC headroom (around 4GHz).

AMD IS back, don't get me wrong, just not the beast I was expecting. A 4.4-4.5GHz overclock on the 1800X would've been the perfect fairy-tale ending, but sadly we didn't get it.


----------



## JedixJarf

Quote:


> Originally Posted by *Liranan*
> 
> Obviously Zen is the best option for you.
> 
> Personally I would like to get a 1700 or even 1600 for my Plex server. It would be glorious.


I've got my Plex server on a 12-core ES Xeon v3.


----------



## Xuper

Quote:


> Originally Posted by *Defoler*
> 
> But AMD used gaming as a selling point as well, no?
> If these weren't meant for gaming, AMD shouldn't have talked about gaming at all.


You don't get my point, do you? I said *pure* gaming CPU. For gaming it's OK; it's just not a CPU for the hardcore gamer. I don't know why you're focusing on gaming.


----------



## oxidized

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I was sure I was going to buy the 1600X; now I don't know anymore. But I also don't want to give Intel money for their overpriced 4-core crap... dunno what to do.


Yeah, same here.

But let's wait for the 1600X to come out.
Quote:


> Originally Posted by *Xuper*
> 
> Yes, but that's not most people. How many people only play games? I really don't see why you should be disappointed because of gaming. Who said Ryzen is for gaming? I'll go Ryzen for overall performance.


Many do. Actually, gaming is the biggest market for such high-end hardware; it's hands down the biggest slice of the PC hardware market.
I'm actually not that disappointed, but rumors and leaked tests set me on Ryzen pretty easily, and now that case isn't so solid anymore.
Who said Ryzen is for gaming? AMD did, multiple times, and keeps saying it.


----------



## Defoler

Quote:


> Originally Posted by *Blackops_2*
> 
> Most of the leaks pointed to exactly what we got: Broadwell IPC. There was maybe one I recall, which wccftech posted (which should be a sign in itself), that had it somehow beating Kaby Lake. Other than that, I figured the hype was that AMD is back. Which they are; they're competing again. I guess some people really went over the top with their interpretation of "AMD being back."


I agree with the last statement.
People never learn to be cautious when it comes to bold statements.

The hype that AMD is back is true, for productivity applications. To beat Intel, they need more than that: they need OC headroom and better gaming performance.
When I see an i3 running some games better than the 1800X, it makes me wonder.


----------



## budgetgamer120

Quote:


> Originally Posted by *weebeast*
> 
> Well, I upgraded from a 2500K to a 7600K and didn't really find it a big improvement, so I sold my motherboard and sent back my CPU. Ryzen doesn't perform great in most games, but I'll still get one. I don't game a lot, and it would be nice to have more CPU power for other stuff.


Smart move....

But why would you go from an i5 to an i5 in the first place?

I guess you had some people telling you an i5 was all you needed.


----------



## Blackops_2

But it isn't really lacking in gaming by any means. It's underperforming against a CPU with higher IPC and clock speed, but it's by no means the disparity that Bulldozer was. It's not as if it's impractical for gaming, especially if you have a use for more cores and gaming isn't the only thing you do. It isn't the best CPU for gaming, which again we all knew it wasn't going to be, and won't be until devs shift toward high thread counts in games across the board.


----------



## ryan92084

Quote:


> Originally Posted by *rv8000*
> 
> This is what's bothering me about all the OC results in reviews: no one touched the b-clock.


In the XtremeSystems review of the 1700 they said their motherboard had the b-clock locked; I'm not sure about the others. Their opinion of that board in general didn't seem very favorable.


----------



## Defoler

Quote:


> Originally Posted by *Xuper*
> 
> You don't get my point, do you? I said *pure* gaming CPU. For gaming it's OK; it's just not a CPU for the hardcore gamer. I don't know why you're focusing on gaming.


Has Intel PRed their CPUs as "pure gaming"?
Not really.
So that statement sounds like an excuse: "No wait, guys! That's not what we meant! It's not for gaming, guys!"


----------



## jprovido

Quote:


> Originally Posted by *Blackops_2*
> 
> But it isn't really lacking in gaming by any means. It's underperforming against a CPU with higher IPC and clock speed, but it's by no means the disparity that Bulldozer was. It's not as if it's impractical for gaming, especially if you have a use for more cores and gaming isn't the only thing you do. It isn't the best CPU for gaming, which again we all knew it wasn't going to be, and won't be until devs shift toward high thread counts in games across the board.


Yep, it's not a Bulldozer. Ryzen demolishes the 8350 (you couldn't say the same of Bulldozer over Thuban), and multithreaded performance is excellent. Gaming is "okay," not terrible; as long as you're not gaming at 120Hz and above, you'll be fine.


----------



## tajoh111

Quote:


> Originally Posted by *SoloCamo*
> 
> AMD didn't hype it at all... The hype train on this site did. For those of us who kept expectations in check this falls right in line.


AMD definitely applied guerrilla marketing here. The leaks, although partially true, generally highlighted where AMD does well while hiding the instances where AMD falls short.

"Trading blows" is a bit of a stretch when comparing the 6900K and the 1800X. It is more accurate to say the 1800X occasionally wins, but the 6900K is definitely the faster processor for productivity, winning in 80+ percent of instances, while the 1800X loses across the board in gaming.

Considering the 6900K's frequency disadvantage, IPC for AMD's processors is definitely behind Broadwell and more like Ivy Bridge.

Nonetheless, this is definitely a return to competition. AMD's chips are competitive with Intel's and provide superior value. But the work needs to continue, and AMD can't rest on Ryzen just yet.


----------



## M3T4LM4N222

Arguing that Ryzen isn't a compelling offer because it falls behind in gaming is laughable. If you're building a system solely for gaming, you really shouldn't be looking at anything above a Core i5 which is in the $200-$250 price range. Most games WILL NOT benefit from hyper-threading.

If you're doing anything multi-threaded, anything that benefits from more cores (Rendering, Encoding, Decoding, Streaming, etc) Ryzen is a compelling offer. You're going to get more performance for your dollar than anything Intel currently offers.

Despite the arguments, I highly doubt the difference in frame rate you'd see between a Ryzen CPU and a 7700K would be that noticeable, especially on a 60Hz monitor. Not to mention, if what we've been hearing recently is true, games will start to benefit from more cores in the near future, and that could completely change the playing field.


----------



## Liranan

Quote:


> Originally Posted by *JedixJarf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Liranan*
> 
> Obviously Zen is the best option for you.
> 
> Personally I would like to get a 1700 or even 1600 for my Plex server. It would be glorious.
> 
> 
> 
> I've got my Plex server on a 12-core ES Xeon v3.

Now that is sexy, but it's way beyond my reach, so I'll gladly settle for a 1700. If I had enough people accessing my server I might be able to justify such a cost, especially if I could get donations to offset the price of a setup like that, but I can't justify the cost for only a few people accessing the server.


----------



## oxidized

Quote:


> Originally Posted by *Blackops_2*
> 
> But it isn't really lacking in gaming by any means. It's underperforming against a CPU with higher IPC and clock speed, but it's by no means the disparity that Bulldozer was. It's not as if it's impractical for gaming, especially if you have a use for more cores and gaming isn't the only thing you do. It isn't the best CPU for gaming, which again we all knew it wasn't going to be, and won't be until devs shift toward high thread counts in games across the board.


No it isn't, but then again, it's not that close to what they showed even at GDC. I don't think anyone tried to compare this to Bulldozer; that would be very stupid. This is a HUGE jump in AMD's overall level of CPUs, but not quite what was shown and advertised.
Quote:


> Originally Posted by *Defoler*
> 
> Has intel PRed their CPUs as "pure gaming"?
> Not really.
> So that statement sounds like an excuse of "no wait guys! no! it is not what we meant! not for gaming guys! we didn't mean to!".


Agreed


----------



## weebeast

Quote:


> Originally Posted by *budgetgamer120*
> 
> Smart move....
> 
> But why would you go from an i5 to an i5 in the first place?
> 
> I guess you had some people telling you an i5 was all you needed.


A 2500K is also an i5, and it's a great CPU. Nobody told me to get the 7600K. Maybe the 7700K would have been a better move, but I still expected a bigger difference in five years' time.


----------



## budgetgamer120

Memory performance is boss. The Piledriver chips were so terrible.


----------



## Ha-Nocri

I think AMD should disable SMT on some chips and sell them at $250-$300. I would buy that, as it would be better in games and cheaper.


----------



## budgetgamer120

Quote:


> Originally Posted by *weebeast*
> 
> A 2500K is also an i5, and it's a great CPU. Nobody told me to get the 7600K. Maybe the 7700K would have been a better move, but I still expected a bigger difference in five years' time.


The i5 2500K was a great CPU... it is not 2013 anymore...


----------



## GTR Mclaren

I just want a damn 1600X at 4.0GHz to upgrade to.

Been using 4-core CPUs since forever!!


----------



## SoloCamo

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> If you're building a system solely for gaming, you really shouldn't be looking at anything above a Core i5 which is in the $200-$250 price range. Most games WILL NOT benefit from hyper-threading.


How is this still a thing in 2017, let alone on this site?


----------



## jprovido

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I think AMD should disable SMT on some chips and sell them at $250-$300. I would buy that, as it would be better in games and cheaper.


First-gen issues. I remember doing this on my i7 920: disabling HT increased gaming performance, decreased temps by 10 degrees, and OC headroom went up by 200MHz+.


----------



## Ultracarpet

I think it's hilarious that most people's expectations were Ivy Bridge IPC and low-3GHz clocks. AMD comes to the table with Broadwell IPC and 4GHz clocks, and all of a sudden it's a disappointment. How fast the goalposts shift.

The 1700 is cheaper than a 7700K where I live. I can disable cores, OC, and probably get within 5% of a 7700K in completely CPU-bound gaming scenarios, and absolutely rofl-stomp the 7700K in any parallel tasks beyond gaming.

This is much better than anyone was expecting even a month ago. It is priced lower than most everyone was expecting. It performs better and has higher IPC than most were expecting (AMD included, lol). It has higher clocks than most were expecting. It's more power efficient than most were expecting. And yet all of a sudden it's a letdown because the all-core OCs can't manage much beyond stock? Are you guys freaking kidding me?


----------



## oxidized

So the Ryzen 5 lineup should come out later this month or in early April, right?
Quote:


> Originally Posted by *Ultracarpet*
> 
> I think it's hilarious that most people's expectations were Ivy Bridge IPC and low-3GHz clocks. AMD comes to the table with Broadwell IPC and 4GHz clocks, and all of a sudden it's a disappointment. How fast the goalposts shift.
> 
> The 1700 is cheaper than a 7700K where I live. I can disable cores, OC, and probably get within 5% of a 7700K in completely CPU-bound gaming scenarios, and absolutely rofl-stomp the 7700K in any parallel tasks beyond gaming.
> 
> This is much better than anyone was expecting even a month ago. It is priced lower than most everyone was expecting. It performs better and has higher IPC than most were expecting (AMD included, lol). It has higher clocks than most were expecting. It's more power efficient than most were expecting. And yet all of a sudden it's a letdown because the all-core OCs can't manage much beyond stock? Are you guys freaking kidding me?


A 1700 isn't cheaper than a 7700K everywhere; I'm actually pretty sure it's usually the opposite. I can buy a 7700K for 330-340€ here in Italy, and a 1700 costs 40€ more than that.


----------



## Xuper

I remember the first reaction to the RX 480. How did people react to the first RX 480 benchmarks? It's the same here: after a couple of months, it will be fine for gaming.


----------



## X-Nine

Basically, everything I've seen so far suggests what I was hoping for and expecting. It's not the end-all be-all Intel crusher some people were saying, but a competitive offering at a lower price point. Which is good for everyone. The more competition we have in the CPU market the more prices come down and innovation starts kicking in again.


----------



## weebeast

Quote:


> Originally Posted by *budgetgamer120*
> 
> The i5 2500K was a great CPU... it is not 2013 anymore...


It still is a great CPU.


----------



## budgetgamer120

Quote:


> Originally Posted by *Ultracarpet*
> 
> I think it's hilarious that most people's expectations were Ivy Bridge IPC and low-3GHz clocks. AMD comes to the table with Broadwell IPC and 4GHz clocks, and all of a sudden it's a disappointment. How fast the goalposts shift.
> 
> The 1700 is cheaper than a 7700K where I live. I can disable cores, OC, and probably get within 5% of a 7700K in completely CPU-bound gaming scenarios, and absolutely rofl-stomp the 7700K in any parallel tasks beyond gaming.
> 
> This is much better than anyone was expecting even a month ago. It is priced lower than most everyone was expecting. It performs better and has higher IPC than most were expecting (AMD included, lol). It has higher clocks than most were expecting. It's more power efficient than most were expecting. And yet all of a sudden it's a letdown because the all-core OCs can't manage much beyond stock? Are you guys freaking kidding me?


Haters can never be pleased.


----------



## azanimefan

Quote:


> Originally Posted by *Wishmaker*
> 
> It was a known fact for months that the Intel process is superior. People complained about Intel clocks on these chips and completely forgot AMD does not have the same process. So Intel clocks are rather good now, it seems.


Some people forget Intel's first chip on this process (Broadwell) couldn't overclock either.

This is technically Intel's 3rd generation at 14nm; AMD's first.
Quote:


> Originally Posted by *Ultracarpet*
> 
> I think it's hilarious that most people's expectations were Ivy Bridge IPC and low-3GHz clocks. AMD comes to the table with Broadwell IPC and 4GHz clocks, and all of a sudden it's a disappointment. How fast the goalposts shift.
> 
> The 1700 is cheaper than a 7700K where I live. I can disable cores, OC, and probably get within 5% of a 7700K in completely CPU-bound gaming scenarios, and absolutely rofl-stomp the 7700K in any parallel tasks beyond gaming.
> 
> This is much better than anyone was expecting even a month ago. It is priced lower than most everyone was expecting. It performs better and has higher IPC than most were expecting (AMD included, lol). It has higher clocks than most were expecting. It's more power efficient than most were expecting. And yet all of a sudden it's a letdown because the all-core OCs can't manage much beyond stock? Are you guys freaking kidding me?


bingo


----------



## xlink

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I think AMD should disable SMT on some chips and sell them at $250-$300. I would buy that, as it would be better in games and cheaper.


No real reason to, unless that part of the chip is defective on multiple sets of cores AND nothing else is. Heck, if only one cluster of cores is defective in that area, it's probably better to just disable the whole cluster.

Remember, AMD is the underdog; they need every leg up they can get against Intel, and they're fighting a yields game.


----------



## Defoler

Quote:


> Originally Posted by *Blackops_2*
> 
> But it isn't really lacking in gaming by any means. It's underperforming against a CPU with higher IPC and clock speed, but it's by no means the disparity that Bulldozer was. It's not as if it's impractical for gaming, especially if you have a use for more cores and gaming isn't the only thing you do. It isn't the best CPU for gaming, which again we all knew it wasn't going to be, and won't be until devs shift toward high thread counts in games across the board.


I don't agree.
If cheaper CPUs can outperform it, it does lack in the gaming that AMD boasted about.
Getting an i5-7600K or an i3-7350K, which both cost a lot less than the 1700, and getting better performance (even at 1440p or 4K sometimes), more OC room, and higher supported memory speeds means it lacks in gaming.

I'm not saying it's impractical, but if you game and use some productivity tools, even the 7700K can do better. You gain more for gaming, in some tools the 7700K is very close behind (and its OC room sometimes lets it overcome its lack of cores), and it even outperforms in some tools that still rely on IPC rather than core count.

For pure productivity there is no doubt: AMD set out to win a war with the X99 platform, and they seem to do extremely well on price/performance against Intel.
For gaming, or combined work, I don't see a reason to move to AMD unless you have a 3-year-old platform and you want to upgrade right now.


----------



## IRONPIG1

Quote:


> Originally Posted by *jezzer*
> 
> Still a GPU bottleneck, but at least if you don't have anything higher than a GTX 1080 you will be on par with a 7700K at 5GHz.
> 
> With unlimited FPS the resolution doesn't really matter for the GPU bottleneck; the GPU performance does.


This doesn't take into account the 1% and 0.1% lows of the Ryzen CPUs... just like the Linus video, Bitwit or whichever YouTube channel it was showed significant drops in FPS across most games. To me it's all about the lows; it doesn't matter how high your average is, sub-60 drops mean you have to lower the graphics settings.


----------



## jprovido

This is reminiscent of the 1090T: it beat the i7 950 in multithreaded workloads, had a far cheaper sibling in the 1055T, and lost in gaming.


----------



## Xuper

Quote:


> Originally Posted by *Defoler*
> 
> I don't agree.
> If cheaper CPUs can outperform it, it does lack in the gaming that AMD boasted about.
> Getting an i5-7600K or an i3-7350K, which both cost a lot less than the 1700, and getting better performance (even at 1440p or 4K), more OC room, and higher supported memory speeds means it lacks in gaming.
> 
> I'm not saying it's impractical, but if you game and use some productivity tools, even the 7700K can do better. You gain more for gaming, in some tools the 7700K is very close behind, and it even outperforms in some tools that still rely on IPC rather than core count.
> 
> For pure productivity there is no doubt: AMD set out to win a war with the X99 platform, and they seem to do extremely well on price/performance against Intel.
> For gaming, or combined work, I don't see a reason to move to AMD unless you have a 3-year-old platform and you want to upgrade right now.


We will see in Ryzen R5 or R3.


----------



## budgetgamer120

Quote:


> Originally Posted by *JasonCL*
> 
> Basically, everything I've seen so far suggests what I was hoping for and expecting. It's not the end-all be-all Intel crusher some people were saying, but a competitive offering at a lower price point. Which is good for everyone. The more competition we have in the CPU market the more prices come down and innovation starts kicking in again.


The overclocked results are impressive:

https://www.guru3d.com/articles_pages/amd_ryzen_7_1800x_processor_review,23.html


----------



## bossie2000

Funny! We all knew Ryzen was not really up to it in gaming, but we also all knew it is awesome in multithreaded apps. Yet some are disappointed!?


----------



## M3T4LM4N222

Quote:


> Originally Posted by *SoloCamo*
> 
> How is this still a thing in 2017, let alone on this site?


I'm not sure what you mean?

It's the truth, and has been for years. You're not going to see a large difference, if any at all, game-wise using a Core i5 versus a Core i7 in your gaming system. So if you're solely looking to build a gaming system, you shouldn't even be looking into $300-$400 CPU territory. Buy an unlocked Core i5 and overclock it, because you'll gain more from the higher clock speed than from the extra threads the Core i7 has when it comes solely to gaming.

Now if you intend to play games, stream, produce content, etc., then of course you'll want to look into a Core i7 or Ryzen.


----------



## Ha-Nocri

Quote:


> Originally Posted by *jprovido*
> 
> This is reminiscent of the 1090T: it beat the i7 950 in multithreaded workloads, had a far cheaper sibling in the 1055T, and lost in gaming.


Yep, this is like the Phenom II X6 vs. the i7 860/920. I hope we won't be waiting another 7 years for AMD to improve performance.


----------



## Catscratch

I was planning to upgrade my rig at home and put this 2500K elsewhere for light database work. As it stands, I'll be going for a 7700K build, though not right away. I'll see if mobo manufacturers can improve performance with new BIOSes. I'm only considering the 1700 or the 7700K.


----------



## jprovido

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> I'm not sure what you mean?
> 
> It's the truth. It has been for years. You're not going to see a large difference, if any at all, game-wise using a Core i5 versus a Core i7 in your gaming system. So if you're solely looking to build a gaming system, you shouldn't even be looking into $300-$400 CPU territory.
> 
> Now if you intend to play games, stream, produce content, etc., then of course you'll want to look into a Core i7 or Ryzen.


Check Digital Foundry's videos: overclocked i5s lose to i7s now. That statement was correct 3-5 years ago; it's no longer the case. The i7 IS better in gaming.


----------



## xx9e02

I think I'm going to stick with my original plan of waiting for the R5 1600X and evaluating what I want to do at that point. I don't do anything that warrants 8 cores, and I don't want to pay $300 for 4 cores lol


----------



## dieanotherday

Is video editing mostly done on the GPU or the CPU nowadays?

I like more cores cuz epeen, but I'm not sure if I'll ever use them.

They should have released a high-clocked 4-core first.


----------



## Defoler

Quote:


> Originally Posted by *Xuper*
> 
> We will see in Ryzen R5 or R3.


Yup. And hopefully AMD can make up for it. I hope it will not be another pushed-to-the-limit CPU with zero OC headroom that gets out-crowned by any mere OC.


----------



## SoloCamo

Quote:


> Originally Posted by *dieanotherday*
> 
> Is video editing mostly done on the GPU or the CPU nowadays?
> 
> I like more cores cuz epeen, but I'm not sure if I'll ever use them.
> 
> They should have released a high-clocked 4-core first.


CPU if you want decent quality.


----------



## Defoler

Quote:


> Originally Posted by *dieanotherday*
> 
> Is video editing mostly done on the GPU or the CPU nowadays?
> 
> I like more cores cuz epeen, but I'm not sure if I'll ever use them.
> 
> They should have released a high-clocked 4-core first.


Depends on the program. Today there is a move toward GPU video rendering; Adobe has been moving to GPU-assisted rendering, which helps even with Intel iGPUs.
Quote:


> Originally Posted by *SoloCamo*
> 
> CPU if you want decent quality.


Again, it depends on the program.
Adobe has shown a huge performance increase using GPUs in their programs, even compared to high-core-count CPUs. If you have a good GPU, the difference between 4c/8t and 8c/16t CPUs starts to diminish quickly.


----------



## budgetgamer120

I am surprised by these gaming benchmarks here, seeing as these games use more than 4 cores.

http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page4.html


----------



## SuprUsrStan

Quote:


> Originally Posted by *kd5151*
> 
> AMD is trying to kill two birds with one stone, competing with both X99 and Z270 CPUs. The 1700/1700X is the better deal. I can't see paying $500 USD for the 1800X because of its higher clocks when Intel has the highest out-of-the-box clocks and better IPC for $350 USD in the i7 7700K. But if you want a CPU like the 6900K, take your pick: the 1700/1700X/1800X are all better options. Most people don't spend over $500 on a CPU anyway.
> 
> I think the 1600X is the better deal when it comes. If you can't wait, then get the 1700 and good luck with overclocking, because you are going to need it, especially when the $350 i7 of the last two years can easily hit 4.5GHz+. Ryzen's overclocking headroom sucks. Still looking into this.
> 
> For gaming, Ryzen is on par with an i5. But how long is that going to last with games like BF1, especially at higher resolutions where the GPU becomes the bottleneck and Intel's CPUs are being pegged at 100%? Cough, i5, cough.
> 
> Choose your next CPU wisely, those of you looking to upgrade like me. This is the best AMD can do for now. Gonna keep looking at benchmarks. The final verdict is yet to be seen.


So if Intel drops the 6900K to $600 and leaves the mainstream CPUs at the same price, Intel will crush AMD in all usage applications and segments.


----------



## Blackops_2

Quote:


> Originally Posted by *Defoler*
> 
> I don't agree.
> If cheaper CPUs can out-perform it, then it does lack in the gaming performance AMD boasted about.
> Getting an i5-7600K or an i3-7350K, which both cost a lot less than the 1700, with better performance (sometimes even at 1440p or 4K), more OC room, and higher supported memory speeds, means it lacks in gaming.
> 
> I'm not saying it's impractical, but if you do game and use some productivity tools, even the 7700K can do better. You gain more for gaming, and in some tools the 7700K is very close behind (its OC room sometimes lets it overcome its lack of cores), and it even outperforms in some cases on tools that still rely on IPC rather than core count.
> 
> For pure productivity there is no doubt: AMD set out to win a war with the X99 platform, and they seem to be doing extremely well on price/performance against Intel.
> For gaming, or combined work, I don't see a reason to move to AMD unless you have a 3-year-old platform and you want to upgrade right now.


Fair enough. But again, this is just considering the current 8c/16t variants. The 1600X is set to come in at around the $250 mark; if it OCs better (which is a big if) and can hit 4.4/4.5GHz, it's going to be well worth the extra cores to me. I guess I just don't feel there is any real need right now in gaming for anything beyond Sandy/Ivy at 4.5GHz; at least I haven't come across it with my 3770K rig yet. Just itching to build a new rig.


----------



## xlink

Quote:


> Originally Posted by *Ultracarpet*
> 
> I think it's hilarious that most people's expectations were Ivy Bridge IPC and low-3GHz clocks. AMD comes to the table with Broadwell IPC and 4GHz clocks and all of a sudden it's a disappointment. How fast the goal posts shift.
> 
> The 1700 is cheaper than a 7700K where I live. I can disable cores, OC, and probably get within 5% of a 7700K in completely CPU-bound gaming scenarios, and absolutely rofl-stomp the 7700K's face in any parallel task beyond gaming.
> 
> This is much better than anyone was expecting even a month ago. It is priced lower than most everyone was expecting. It performs better / has higher IPC than most were expecting (AMD included, lol). It has higher clocks than most were expecting. It's more power efficient than most were expecting. But yet all of a sudden it's a letdown because the all-core OCs can't manage much beyond stock? Are you guys freaking kidding me?


The price is slightly higher.

It clocks marginally lower than I expected (I was hoping for 4.4GHz, not 4GHz).
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74814-amd-ryzen-7-1800x-performance-review-19.html

It has marginally better performance per clock than I expected (I expected it to be around 10% slower per clock than it actually is).

All in all it's about as fast as I expected - around 25% slower in lightly threaded applications than a hex core intel and about 25% faster in highly multithreaded applications than a hex core intel.

AMD is back. They can match my 4.4 GHz ivy bridge CPU in nearly everything and give me 2x the multithreaded performance I crave for doing statistical computing and machine learning.


----------



## Ha-Nocri

There is no way 1600X will OC to 4.4/4.5, no way


----------



## Horsemama1956

Looking good. I'll be picking up whatever costs $200-250 Canadian whenever they launch. AMD is definitely headed in the right direction.


----------



## Xuper

One very important thing: power consumption is now on par with the 7700K. With a 4-core or 6-core Ryzen, AMD can deliver decent perf/watt in laptops. I'm waiting for a low-clocked 6-core Ryzen for a laptop.


----------



## budgetgamer120

Quote:


> Originally Posted by *Ha-Nocri*
> 
> There is no way 1600X will OC to 4.4/4.5, no way


Agreed. I do not think lowering the core count will give any advantage.

But I could be wrong.


----------



## NoDestiny

Cancelled my Amazon pre-order of the 1700. Without mATX mobos available, there's no reason to get it right now (for me, at least; I don't want a full-sized board). Though I do look forward to AMD's future.


----------



## PontiacGTX

https://arstechnica.com/gadgets/2017/03/amd-ryzen-review/2/


----------



## IRONPIG1

Well, what's clear is that if you have a 2600K there is really no reason to upgrade to Ryzen yet; otherwise the 7700K would've justified that upgrade long ago. I just find it sad that it struggles so much in games when AMD boasted about its gaming performance. Maybe the frame drops get sorted out with software, but I highly doubt it.
RIP hopes and dreams...


----------



## Ha-Nocri

Quote:


> Originally Posted by *budgetgamer120*
> 
> Agreed. I do not think lowering the core count will give any advantage.
> 
> But I could be wrong.


It should give some advantage, but I expect max 100 MHz over 8c/16t


----------



## Slomo4shO

Did people really expect RyZen to surpass Kaby Lake? Are people really that naive?

RyZen is competitive. At these price points, it is a damn attractive alternative to Intel. Gamers should wait and see how the 4- and 6-core variants fare against the 7700K before voicing an opinion on a part that doesn't compete directly with the 7700K. If AMD can deliver near-7600K gaming performance at $129 (R3 1100) and near-7700K at $175 (R5 1300), the savings can translate directly into a better GPU or other components.

Not everyone that builds a PC has the mentality of an OCer...


----------



## M3T4LM4N222

Quote:


> Originally Posted by *jprovido*
> 
> Check the Digital Foundry videos. Overclocked i5s lose to i7s now. That statement was correct 3-5 years ago; now it is not the case. The i7 IS better in gaming.


I'm unable to find anything from them indicating that this is the case. Even if it is, it takes more than one source to verify that the Core i7 is indeed better than a Core i5 for gaming overall. As far as I know, and from what I have seen over the years on several different websites, games are not going to benefit from hyperthreading.

I've cited several credible sources below, the only game I can find that seems to benefit from hyperthreading and more cores by a notable amount is Ashes of the Singularity.

http://www.techspot.com/review/1299-intel-core-kaby-lake-desktop/page9.html

https://www.pcgamesn.com/intel/intel-core-i5-7600k-review-benchmarks/#benchmarks

http://www.tomshardware.com/reviews/intel-kaby-lake-core-i7-7700k-i7-7700-i5-7600k-i5-7600,4870-7.html

http://www.kitguru.net/components/cpu/luke-hill/intel-core-i7-7700k-i5-7600k-kaby-lake-cpu-review/

Some of the differences between the Kaby Lake i5 and i7 are so negligible they could be considered within the margin of error.


----------



## IRONPIG1

Quote:


> Originally Posted by *PontiacGTX*
> 
> https://arstechnica.com/gadgets/2017/03/amd-ryzen-review/2/


Wow...that illustrates the drops in FPS...that's worse than a [email protected]


----------



## dieanotherday

Quote:


> Originally Posted by *Xuper*
> 
> One thing very important.Power consumption is now on par with 7700K.with 4 core or 6 core Ryzen , AMD can deliver decent perf/Power into Laptop.I'm waiting 6 core Ryzen at low clock for Laptop.


I think they can achieve 7W 4c/8t @ 3GHz.

Anyone think there's a chance future Ryzens can be unlocked like my Phenoms?


----------



## kingduqc

Glad they are competitive. I have to say I'm disappointed in the OC headroom and the gaming results; I expected more, and I waited for the reviews for this reason. IPC and clocks are always king when it comes to gaming and will be for a while.


----------



## M3T4LM4N222

Quote:


> Originally Posted by *dieanotherday*
> 
> I think they can achieve 7W 4c/8t @ 3GHz.
> 
> Anyone think there's a chance future Ryzens can be unlocked like my Phenoms?


That'd be amazing. I remember those good ol' days, when you could buy the Phenom X2's and X3's and unlock them into X4's if you were lucky enough to get one that was stable when unlocked. Heck, I remember you could do it with some of the Athlon's as well.


----------



## PontiacGTX

Quote:


> Originally Posted by *IRONPIG1*
> 
> Wow...that illustrates the drops in FPS...that's worse than a [email protected]


Games don't get all the performance from AMD's SMT implementation; developers need to update their game code, but some games are just fine.


----------



## Twirlz

I currently have a 4790K but was considering moving to a Ryzen 1700 and overclocking. Is there any point? Benchmarks are all over the place and I would appreciate anyone's opinions.

I often complete multithreaded workloads which favor core count but I also game a lot. I wouldn't want to upgrade if my gaming performance suffered due to having a 144Hz monitor.

How likely is it for Ryzen's gaming performance to improve over time (BIOS updates to fix issues or help SMT, future game optimization, Windows updates, etc.)? Is it likely to even be noticeable? Is it possible?

My heart says Ryzen, but my head refuses to if gaming performance will suffer noticeably despite helping in other areas.


----------



## dieanotherday

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> That'd be amazing. I remember those good ol' days, when you could buy the Phenom X2's and X3's and unlock them into X4's if you were lucky enough to get one that was stable when unlocked. Heck, I remember you could do it with some of the Athlon's as well.


I've unlocked and OC'd X2s to 3.5GHz+, on a $70 CPU and a $70 mobo.

good times


----------



## dieanotherday

Quote:


> Originally Posted by *Twirlz*
> 
> I currently have a 4790K but was considering moving to a Ryzen 1700 and overclocking. Is there any point? Benchmarks are all over the place and I would appreciate anyone's opinions.
> 
> I often complete multithreaded workloads which favor core count but I also game a lot. I wouldn't want to upgrade if my gaming performance suffered due to having a 144Hz monitor.
> 
> How likely is it for Ryzen's gaming performance to improve over time (BIOS updates to fix issues or help SMT, future game optimization, Windows updates, etc.)? Is it likely to even be noticeable?
> 
> My heart says Ryzen, but my head refuses to if gaming performance will suffer noticeably despite helping in other areas.


I'd do it if you do lots of multithreaded work.


----------



## IRONPIG1

Quote:


> Originally Posted by *PontiacGTX*
> 
> Most games don't get the most performance from AMD's SMT implementation; developers need to update their game code, but some games are just fine.


Those scores are with all CPUs locked at 3.5GHz, right? I don't really get how the 1800X does better when locked at 3.5GHz as opposed to having stock boost enabled (in their GTA V tests).
Care to elaborate? (I have no idea; it looks like a mistake to me.)


----------



## AuraNova

WHEW! I made it to the end. Read every page up to my post here. I don't have much to add myself. Just that my expectations were on par.
Quote:


> Originally Posted by *Ha-Nocri*
> 
> I was sure I was going to buy 1600X, now I don't know anymore. But also don't want to give money to Intel for their overpriced 4 core crap.... dunno what to do


I have a feeling Ryzen 5 is going to be a sleeper. Maybe not in the 1600X, but maybe in the 1500. Especially with the way the 1700 looks to be a better overall deal than the 1800X.
Quote:


> Originally Posted by *Ha-Nocri*
> 
> There is no way 1600X will OC to 4.4/4.5, no way


In due time, we'll see. I'm not expecting 4.5GHz either, but I think it might push a little further than the 7 is doing.

Overall, I am just glad AMD is in the same field of play as Intel. Maybe this could light a fire under Intel to innovate and focus on price/performance. Ryzen isn't perfect. It certainly isn't a Core i killer, but it's getting there. This is a win for the consumer in every way.

Anyway, I'm coming over from an i7 860, so I'm good.


----------



## pengs

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> It seems like AMD is betting on games and programs being able to benefit from more cores vs higher frame-rates in the coming years. Maybe a "Future Proof" kind of mentality.
> 
> It seems similar to the approach they took with the RX480 outperforming NVIDIA equivalents when using DX12 and Vulkan.
> 
> Basically in the current state, they're overall not going to blow the competition completely away, however, when the circumstances are right, they do.
> 
> I mean, realistically, they're releasing the best price-to-performance option for people who do heavy rendering or use heavily multi-threaded applications.
> 
> I don't know...I'm tired and haven't been active on these forums for years.


This is what I'm seeing.



When these new API's mature it will alleviate the dependency on IPC as far as gaming is concerned.

The marks in DX11 and current titles are acceptable: minimums aren't too terrible, and Ryzen can generally sustain 50-60fps at its worst in most current titles. If these Doom benchmarks are representative of a proper Vulkan implementation, Ryzen 7 and 5 should be close to future-proof, when an 8370 is within 10% of a 1800X at 200fps, which in turn is limited by Doom's internal frame cap.
The 9590 is 3fps from 200fps, about 1.5% behind the 1800X. If the frame cap were lifted, where would the 1800X fall?

What I see is that games' dependencies on IPC should be rendered mostly insignificant with the maturity of DX12 and Vulkan.

In my opinion, at Ryzen's price point, and knowing where gaming is headed, there is absolutely no reason to get a 4c/8t i7 over something like a 1700, unless one wants to maximize the chance of having the highest refresh rate right now with little interest in the future.


----------



## LancerVI

Did they touch on whether SMT can be turned off and what effect that may have on overclocks? I would be really curious if you could get a bit more headroom with SMT off.


----------



## SuprUsrStan

Quote:


> Originally Posted by *PontiacGTX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *IRONPIG1*
> 
> Wow...that illustrates the drops in FPS...that's worse than a [email protected]
> 
> 
> 
> Games don't get all the performance from AMD's SMT implementation; developers need to update their game code, but some games are just fine.

Perhaps, but would you really buy a 1800X over a 7700K based on these charts? *Not locked to 3.5GHz*







ESPECIALLY when you consider you can push the 7700K up to 5GHz when you overclock, while the 1800X is essentially already boosting to its 4GHz max.


----------



## M3T4LM4N222

Quote:


> Originally Posted by *pengs*
> 
> This is what I'm seeing.
> 
> 
> 
> When these new API's mature it will alleviate the dependency on IPC as far as gaming is concerned.
> 
> The marks in DX11 and current titles are acceptable: minimums aren't too terrible, and Ryzen can generally sustain 50-60fps at its worst in most current titles. If these Doom benchmarks are representative of a proper Vulkan implementation, Ryzen 7 and 5 should be close to future-proof, when an 8370 is within 10% of a 1800X at 200fps, which in turn is limited by Doom's internal frame cap.
> The 9590 is 3fps from 200fps, about 1.5% behind the 1800X. If the frame cap were lifted, where would the 1800X fall?
> 
> What I see is that games' dependencies on IPC should be rendered mostly insignificant with the maturity of DX12 and Vulkan.
> 
> In my opinion, at Ryzen's price point, and knowing where gaming is headed, there is absolutely no reason to get a 4c/8t i7 over something like a 1700, unless one wants to maximize the chance of having the highest refresh rate right now and has no interest in the future.


My thoughts pretty much to a T.


----------



## rick19011

Yep it lived up to the hype! Can't wait for it to be delivered


----------



## spddmn24

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> I'm unable to find anything from them indicating that this is the case. Even if it is, it takes more than one source to verify that the Core i7 is indeed better than a Core i5 for gaming overall. As far as I know, and from what I have seen over the years on several different websites, games are not going to benefit from hyperthreading.
> 
> I've cited several credible sources below, the only game I can find that seems to benefit from hyperthreading and more cores by a notable amount is Ashes of the Singularity.
> 
> http://www.techspot.com/review/1299-intel-core-kaby-lake-desktop/page9.html
> 
> https://www.pcgamesn.com/intel/intel-core-i5-7600k-review-benchmarks/#benchmarks
> 
> http://www.tomshardware.com/reviews/intel-kaby-lake-core-i7-7700k-i7-7700-i5-7600k-i5-7600,4870-7.html
> 
> http://www.kitguru.net/components/cpu/luke-hill/intel-core-i7-7700k-i5-7600k-kaby-lake-cpu-review/
> 
> Some of the differences between the Kaby Lake i5 and i7 are so negligible they could be considered within the margin of error.


The information is out there, and there are plenty of reviews in this very thread showing that Hyper-Threading is very beneficial in modern games. If you choose to be willfully ignorant and cherry-pick benchmarks of non-CPU-bound games to convince yourself an i7 isn't any better than an i5, that's your choice, but you shouldn't spread misinformation on the internet that could steer other people into the wrong purchase.

Here's an example: 7700K stock, 112.7 fps avg / 76 fps 0.1% low; 7700K with Hyper-Threading off, 87.3 fps avg / 60.3 fps 0.1% low.









i5 30%+ slower in some games

http://www.eurogamer.net/articles/digitalfoundry-2017-intel-kaby-lake-core-i7-7700k-review
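
A quick arithmetic check on the figures quoted above (taking the ".1% low" values as 0.1% lows, as written):

```python
# Percentage gap implied by the quoted 7700K numbers:
# stock (Hyper-Threading on) vs. the same chip with HT disabled.
avg_ht_on, avg_ht_off = 112.7, 87.3   # average fps, quoted above
low_ht_on, low_ht_off = 76.0, 60.3    # 0.1% low fps, quoted above

avg_gain = (avg_ht_on / avg_ht_off - 1.0) * 100  # % faster on average fps
low_gain = (low_ht_on / low_ht_off - 1.0) * 100  # % faster on 0.1% lows
print(f"avg: +{avg_gain:.1f}%, 0.1% lows: +{low_gain:.1f}%")
# prints: avg: +29.1%, 0.1% lows: +26.0%
```

So the quoted figures do work out to roughly the "30%+" claim, at least in that title.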


----------



## IRONPIG1

Quote:


> Originally Posted by *Syan48306*
> 
> Perhaps, but would you really buy a 1800X over a 7700K based on these charts? *Not locked to 3.5GHz*
> 
> *****(Don't want to fill with all the images)*****
> 
> ESPECIALLY when you consider you can push the 7700K up to 5GHz when you overclock, while the 1800X is essentially already boosting to its 4GHz max.


I fully agree. I'd pick a [email protected] over that 1800x.


----------



## DADDYDC650

Where are the Ryzen 1700 vs 1800x overclocking videos!?!?


----------



## Kinaesthetic

Quote:


> Originally Posted by *SoloCamo*
> 
> CPU if you want decent quality.


Actually, most have shifted over to GPU rendering, as the quality dip is almost unnoticeable these days compared to CPU-based output. ProRes/Cineform/RED workflows all use GPU-rendered output for the final product (Adobe Premiere Pro CC), and those are the most commonly used export codecs in the video industry these days.

Almost all hardware acceleration these days is done on the GPU too. Heck, even live streaming is starting to shift to GPU encoding, because it is reaching quality parity with CPU H.264 encoding.


----------



## IRONPIG1

Quote:


> Originally Posted by *rick19011*
> 
> Yep it lived up to the hype! Can't wait for it to be delivered


In certain cases, but regarding gaming... it was completely overhyped and fell short, given AMD's boasting about its gaming performance.


----------



## tajoh111

Quote:


> Originally Posted by *Slomo4shO*
> 
> Did people really expect RyZen to surpass Kaby Lake? Are people really that naive?
> 
> RyZen is competitive. At these price points, it is a damn attractive alternative to Intel. Gamers should wait and see how the 4- and 6-core variants fare against the 7700K before voicing an opinion on a part that doesn't compete directly with the 7700K. If AMD can deliver near-7600K gaming performance at $129 (R3 1100) and near-7700K at $175 (R5 1300), the savings can translate directly into a better GPU or other components.
> 
> Not everyone that builds a PC has the mentality of an OCer...


I think the people disappointed are the 1800x buyers.

People were expecting 6900k performance for 500 dollars.

What they are getting is gaming performance worse than Intel's $300 parts, and productivity performance generally closer to a 6850K/6800K than a 6900K in most applications. Overclock both and the 6800K becomes the better buy for most people, particularly if they have a Microcenter nearby.

The people who bought the standard 1700 are doing great. They are coming out ahead, unless they bought the 1700 purely for gaming.


----------



## Majin SSJ Eric

Other than the OCing, things look pretty good for Ryzen. I was never targeting KL (or even SL) performance as a possibility for Ryzen in the first place, so I'm not sure where the disappointment over it not beating a 7700K in gaming is coming from. My expectation going in several months ago was at least IB-E parity, with a hope for Broadwell-E parity, and Ryzen largely delivers just that. You're basically getting a 4GHz 6900K for $500, or really $329, since there seems to be little reason to spring for the 1800X. I understand the feeling of disappointment from those who thought (somehow) Ryzen was going to go toe-to-toe with the 7700K in lightly threaded scenarios like gaming, but I would guess that if these very reviews had "leaked" out 6 months to a year ago, people would have been losing their minds over how good Ryzen is.

Remember that all these guys who are downplaying Ryzen today with a glint of fanboy glee were also the same usual suspects that 6 months ago were telling us Ryzen wouldn't even reach SB IPC and would cost as much as Broadwell-E. Perspective people...


----------



## Mad Pistol

Quote:


> Originally Posted by *IRONPIG1*
> 
> In certain cases, but regarding gaming... it was completely overhyped and fell short, given AMD's boasting about its gaming performance.


Not really. Sure it doesn't match Intel at the moment for gaming, but give me a break... the R7 Ryzen CPUs are still great for gaming.

It's also possible that a few microcode tweaks may fix this. The games may not be utilizing AMD's SMT implementation correctly; for example, AMD had to release a BIOS tweak that made FX 8-core CPUs prioritize each of the 4 modules first before moving additional threads onto the second core of each module.

It is possible that current games are loading a core and its SMT sibling thread simultaneously while other cores sit idle.
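
The "fill physical cores before SMT siblings" idea can be sketched as below. This assumes logical CPUs come in sibling pairs (0,1), (2,3), ... sharing one physical core, which is a common Linux enumeration for Ryzen but is an assumption here, not something the thread confirms:

```python
# Sketch of SMT-aware thread placement: give every physical core one
# thread before any core gets its second (sibling) thread.
# Assumed topology: logical CPUs (0,1), (2,3), ... share a physical core.

def smt_aware_order(n_logical: int) -> list[int]:
    """Return logical CPU IDs ordered so the first sibling of each
    physical core is used before any second sibling."""
    primaries = list(range(0, n_logical, 2))  # one thread per core
    siblings = list(range(1, n_logical, 2))   # SMT siblings last
    return primaries + siblings

# 16 threads, like an R7 1800X: physical cores first, siblings after.
print(smt_aware_order(16))
# prints: [0, 2, 4, 6, 8, 10, 12, 14, 1, 3, 5, 7, 9, 11, 13, 15]
```

A scheduler following this order would avoid the pathological case described above, where a core and its SMT thread are both loaded while other cores idle.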


----------



## Kuivamaa

Quote:


> Originally Posted by *Twirlz*
> 
> I currently have a 4790K but was considering moving to a Ryzen 1700 and overclocking. Is there any point? Benchmarks are all over the place and I would appreciate anyone's opinions.
> 
> I often complete multithreaded workloads which favor core count but I also game a lot. I wouldn't want to upgrade if my gaming performance suffered due to having a 144Hz monitor.
> 
> How likely is it for Ryzen's gaming performance to improve over time (BIOS updates to fix issues or help SMT, future game optimization, Windows updates, etc.)? Is it likely to even be noticeable? Is it possible?
> 
> My heart says Ryzen, but my head refuses to if gaming performance will suffer noticeably despite helping in other areas.


Well, the 1800X for once seems to be almost on par with the 7700K (losing by 1-2%) on the site I trust the most:

https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/#diagramm-ryzen-7-1800x-1700x-und-1700-vs-broadwell-e-witcher-3

so I don't see how you would lose any gaming performance going with a 4GHz Ryzen. Bulldozer had an actual problem with Windows scheduling, but Ryzen is a more conventional design, so I don't expect dramatic change from Windows fixes unless there are obvious bugs. Come to think of it, there will be bugs. I expect the most from AMD collaborating with game devs; they've already started with Bethesda.

http://www.pcworld.com/article/3174806/gaming/amd-radeon-infuses-bethesda-games-with-vulkan-cozies-up-to-a-geforce-now-rival.html


----------



## M3T4LM4N222

Quote:


> Originally Posted by *spddmn24*
> 
> The information is out there, and there are plenty of reviews in this very thread showing that Hyper-Threading is very beneficial in modern games. If you choose to be willfully ignorant and cherry-pick benchmarks of non-CPU-bound games to convince yourself an i7 isn't any better than an i5, that's your choice, but you shouldn't spread misinformation on the internet that could steer other people into the wrong purchase.
> 
> Here's an example: 7700K stock, 112.7 fps avg / 76 fps 0.1% low; 7700K with Hyper-Threading off, 87.3 fps avg / 60.3 fps 0.1% low.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i5 30%+ slower in some games
> 
> http://www.eurogamer.net/articles/digitalfoundry-2017-intel-kaby-lake-core-i7-7700k-review


Care to explain to me how linking full Kaby Lake reviews from FOUR different review websites that show the gaming performance of both the Core i5 and Core i7 Kaby Lake processors being within several frames of each other is cherry picking? I'm certain had I put 4 more reviews from 4 different websites, the results would be the same.

I'm not spreading misinformation. I'm providing accurate information along with links to my actual sources. You can't say the same. You've literally provided a picture of ONE "Cherry picked" result that shows the Core i7 outperforming the Core i5.

I'll give you Ashes of the Singularity. There's a pretty big performance difference. Aside from that, I'm not seeing anything justifying purchasing an i7 over an i5 for SOLELY gaming.


----------



## PontiacGTX

Quote:


> Originally Posted by *IRONPIG1*
> 
> Those scores are with all CPUs locked at 3.5GHz, right? I don't really get how the 1800X does better when locked at 3.5GHz as opposed to having stock boost enabled (in their GTA V tests).
> Care to elaborate? (I have no idea; it looks like a mistake to me.)


Probably Intel gets higher performance from Boost than AMD does?
Quote:


> Originally Posted by *Syan48306*
> 
> Perhaps, but would you really buy a 1800X over a 7700K based on these charts? *Not locked to 3.5GHz*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ESPECIALLY when you consider you can push the 7700K up to 5GHz when you overclock, while the 1800X is essentially already boosting to its 4GHz max.


It depends: if games get proper SMT code for AMD, the 7700K probably won't be that far ahead. And if AMD releases a better-binned CPU, really, why not get a Ryzen 7 1900(X) when you can get better performance in upcoming games?


----------



## spddmn24

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> Care to explain to me how linking full Kaby Lake reviews from FOUR different review websites that show the gaming performance of both the Core i5 and Core i7 Kaby Lake processors being within several frames of each other is cherry picking? I'm certain had I put 4 more reviews from 4 different websites, the results would be the same.
> 
> I'm not spreading misinformation. I'm providing accurate information along with links to my actual sources. You can't say the same. You've literally provided a picture of ONE "Cherry picked" result that shows the Core i7 outperforming the Core i5.
> 
> I'll give you Ashes of the Singularity. There's a pretty big performance difference. Aside from that, I'm not seeing anything justifying purchasing an i7 over an i5 for SOLELY gaming.


So a 30% gain in fps in games isn't worth considering an i7 over an i5 for?


----------



## delboy67

For everyone saying it doesn't live up to the hype regarding gaming: the £150 4c/8t will be the gaming chip.


----------



## ducegt

Quote:


> Originally Posted by *budgetgamer120*
> 
> 
> 
> Memory performance is boss. The piledriver ones were so terrible.


The top Ryzen score isn't stable; the article says so, for the observant reader. No benchmarks, only Intel clocks compared.
Quote:


> Originally Posted by *AuraNova*
> 
> In due time, we'll see. I'm not expecting 4.5GHz either, but I think it might push a little further than the 7 is doing.
> 
> Overall, I am just glad AMD is in the same field of play as Intel. Maybe this could light a fire under Intel to innovate and focus on price/performance. Ryzen isn't perfect. It certainly isn't a Core i killer, but it's getting there. This is a win for the consumer in every way.
> 
> Anyway, I'm coming over from an i7 860, so I'm good.


I went from a 4GHz chip to a 7700K that does 5.2GHz. Triple the frames in Doom, and double the lows in most games. Anyone expecting 4.5 did so on blind faith/ignorance. Will Intel give us 6-core 1151 chips as a result of Ryzen? I think they are now motivated as well.


----------



## IRONPIG1

Quote:


> Originally Posted by *PontiacGTX*
> 
> Probably AMD gets higher performance from Boost than Intel does?
> It depends: if games get proper SMT code for AMD, the 7700K probably won't be that far ahead. And if AMD releases a better-binned CPU, really, why not get a Ryzen 7 1900(X) when you can get better performance in upcoming games?


No, no. I meant look at GTA V: compare the "stock" and "locked @ 3.5" graphs. Ryzen did better "locked @ 3.5" than "stock".

So "Probably AMD gets higher performance from Boost than Intel does?" can't be it; maybe AMD gets higher performance from a lower clock? Or the graph is wrong.


----------



## jprovido

I've had enough of Ryzen reviews; I think I've pretty much read/watched as much as I can. Good job, AMD.


----------



## M3T4LM4N222

Quote:


> Originally Posted by *spddmn24*
> 
> So a 30% gain in fps in games isn't worth considering an i7 over an i5 for?


In one "cherry picked" instance the Core i7 being 30% faster in a game? Worth $100 more? Hell no.

We're talking overall value here. If you're solely gaming, you're simply not going to see a large benefit between a Kaby Lake i5 and a Kaby Lake i7. Certainly
not a benefit worth the price difference.


----------



## Shogon

Quote:


> Official AMD excuse list:
> 
> -Games and apps aren't optimized yet
> -It's clearly a driver issue
> -Wait for BIOS updates
> -I like to play games with 300 browser tabs and Photoshop open
> -Just wait for DX12
> -Most games are GPU limited anyway
> -Things will clearly work better on the motherboard I ordered instead of the one in the review
> -People haven't figured out how to overclock these yet
> -What really matters is [pick any game that wasn't included in the review]
> -These are great for "office work"
> -Intel has been ripping people off for years, so I'm buying a slower CPU to support AMD
> -I encode Blu Rays 14 hours a day, delete those, then encode them again
> -The CPU will last longer in the future when programs support more threads


Saw this somewhere else; made me chuckle.

Looks like the quad cores/Zen+ chips will be more up my alley, as I doubt Intel will lower prices at all on Broadwell-E.


----------



## Butthurt Beluga

I am so ready to kick this i7 3770k quad core to the curb.


----------



## HeadlessKnight

It is a competitive CPU, no matter what the haters say. For gaming, in the worst cases it sits between Sandy and Ivy, and in the best cases it matches Kaby. It performs great in gaming and smokes the quad-core i7s in multithreaded workloads, sometimes matching the 6950X/6900K or coming pretty close. That's a very good feat considering how Bulldozer turned out, and considering that 8+ core Intel CPUs cost a grand or more.


----------



## IRONPIG1

Quote:


> Originally Posted by *Mad Pistol*
> 
> Not really. Sure it doesn't match Intel at the moment for gaming, but give me a break... the R7 Ryzen CPUs are still great for gaming.
> 
> It's also possible that a few microcode tweaks may fix this. The games may not be utilizing AMD's SMT implementation correctly; for example, AMD had to release a BIOS tweak that made FX 8-core CPUs prioritize each of the 4 modules first before moving additional threads onto the second core of each module.
> 
> It is possible that current games are loading a core and its SMT sibling thread simultaneously while other cores sit idle.


I hope you are right and that they can make adjustments for that via updates to fix the low lows. But you're right, it's still very early, and SMT handling may well improve.
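The SMT-scheduling theory quoted above is actually testable at home: confine a process to one logical CPU per physical core and see whether the lows improve. A rough sketch, assuming Linux (it relies on `os.sched_setaffinity` and the sysfs topology files; the helper name `physical_core_cpus` is mine, and CPU numbering varies by system):

```python
# Sketch: pin the current process to one logical CPU per physical core,
# to test the "game threads landing on SMT siblings" hypothesis.
# Linux-only: uses os.sched_*affinity and sysfs CPU topology files.
import os

def physical_core_cpus():
    """Pick the first logical CPU from each physical core's sibling group."""
    chosen, seen = set(), set()
    for cpu in sorted(os.sched_getaffinity(0)):
        path = f"/sys/devices/system/cpu/cpu{cpu}/topology/thread_siblings_list"
        try:
            with open(path) as f:
                siblings = f.read().strip()  # e.g. "0,8" or "0-1"
        except OSError:
            siblings = str(cpu)  # no SMT info: treat CPU as its own core
        if siblings not in seen:  # keep only one CPU per sibling group
            seen.add(siblings)
            chosen.add(cpu)
    return chosen

if __name__ == "__main__":
    cpus = physical_core_cpus()
    os.sched_setaffinity(0, cpus)  # workload now runs on physical cores only
    print(sorted(cpus))
```

If frame-time lows improve with the process confined this way, that points at threads being placed on SMT siblings while physical cores sit idle.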


----------



## Wishmaker

OCN needs to review its policies, because in two distinct situations the user base here acted as a magnifying lens, creating artificial expectations AMD never confirmed. The first case was the RX 480, where many claimed it would have 980 Ti performance. People pulled the trigger, and we know what happened. Then came statements of the sort: well, you can't expect this from the RX, it's not the proper card.

In a similar fashion, Ryzen was used to lure in certain buyers. OCN is a very reliable source for some users, and whatever people say here becomes the de facto argument for a purchase. From the first leak, Zen was being pushed as the Intel killer: better 8-core performance at half the price. These were the looped songs on OCN for the past weeks, and some began to believe them. Many bought the 1700X and ended up regretting their purchases, as the 1700 is the better chip. Another group purchased the 1800X on the premise that you get better-than-Intel performance at half the price.

It is not normal to misinform people and paint the competition in a bad light to support your arguments. The first hype started with Keller. People jumped on this wagon because he designed the old AMD chips. They assumed it would be a Conroe moment, and it was presented to them as a Conroe moment. They pushed and pushed to the last moment.

Results are out. AMD did very well, but they are nowhere near the Intel killers or the de facto choice in the enthusiast segment. They are slotted perfectly to make money. Lisa Su knows this, and even AMD engineers admit that Zen is not the best they can do.

PLEASE STOP WITH THE HYPE, BECAUSE PEOPLE SPEND MONEY BASED ON TWISTED STATEMENTS!


----------



## Vesku

Ryzen looks to be a solid production/server CPU. If I was building a gaming focused PC I'd wait to see if Intel will lower prices after the full Ryzen stack is available.

I'm interested to see performance revisited in 3-6 months, especially memory and SMT.


----------



## Kinaesthetic

Quote:


> Originally Posted by *HeadlessKnight*
> 
> It is a competitive CPU, no matter what the haters say. For gaming, in the worst cases it sits between Sandy Bridge and Ivy Bridge, and in the best cases it matches Kaby Lake. It performs great in gaming, smokes the quad-core i7s in multithreaded workloads, and sometimes matches the 6950X/6900K or comes pretty close. That's a very good feat considering how Bulldozer turned out and that Intel's 8+ core CPUs cost a grand or more.


It is competitive. Very much so.

It ISN'T AS competitive as the raging AMD fanboys (pretty obvious who those suspects are without naming them) and AMD themselves have been hyping it to be for the past month. That is why people are disappointed.

Honest to goodness, the damage control in this thread is utterly hilarious. Ironically, it is those raging AMD fanboys/fangirls who are hurting AMD the most.


----------



## Hueristic

Too much info too fast, so was I right about the 1700 being the best bet $/OC or not?


----------



## aDyerSituation

As primarily a gamer with multitasking coming second, this is disappointing.

I guess I'll be waiting for intel's next move


----------



## seven7thirty30

People are accountable for their own decisions and should make well-informed choices with discipline. I hear your frustration, but I won't feel responsible for anyone on this forum buying into the Ryzen hype. There were just as many pro-Intel comments as there were pro-AMD comments.


----------



## spddmn24

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> In one "cherry picked" instance the Core i7 being 30% faster in a game? Worth $100 more? Hell no.
> 
> We're talking overall value here. If you're solely gaming, you're simply not going to see a large benefit between a Kaby Lake i5 and a Kaby Lake i7. Certainly
> not a benefit worth the price difference.


In 1 cherry picked instance?

In 4 out of the 7 games they tested, the i7 was significantly faster. The 3770K @ 4.5GHz also beat the 7600K @ 4.8GHz in those instances. The other 3 games were GPU bound, since clock speed had minimal/no effect on FPS.


----------



## djfunz

Anyone seeing any Intel price drops on the 7700k? I guess they likely won't be coming now.


----------



## Xuper

Quote:


> Originally Posted by *djfunz*
> 
> Anyone seeing any Intel price drops on the 7700k? I guess they likely won't be coming now.


IF R5 1600X can be overclocked to 4.5Ghz, then maybe.


----------



## Derp

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> In one "cherry picked" instance the Core i7 being 30% faster in a game? Worth $100 more? Hell no.
> 
> We're talking overall value here. If you're solely gaming, you're simply not going to see a large benefit between a Kaby Lake i5 and a Kaby Lake i7. Certainly
> not a benefit worth the price difference.


It's not just a few games anymore. Many games can show a large performance increase going from i5 to i7. The price premium is absolutely worth it unless you're a 60Hz pleb.


----------



## DADDYDC650

Quote:


> Originally Posted by *aDyerSituation*
> 
> As primarily a gamer with multitasking coming second, this is disappointing.
> 
> I guess I'll be waiting for intel's next move


Unless you game at 1080p or less, you won't see much of a difference. Ryzen will pull ahead with games that utilize more cores vs a 4 core Intel CPU. Also, the upgrade path is A LOT better on AMD vs Intel.


----------



## ducegt

Quote:


> Originally Posted by *Wishmaker*
> 
> OCN needs to review its policies because in two distinct situations, the user base here acted as a magnifying lens creating artificial expectations AMD never confirmed.


What would such a policy even look like? AMD themselves gave skewed previews. It's called marketing; everyone does it. As an American, I think free speech is vital, even if I don't care for what most people have to say. We fact-check each other here, and on the flip side there are groups of people who will stand behind a false or misguided argument.


----------



## jprovido

Quote:


> Originally Posted by *djfunz*
> 
> Anyone seeing any Intel price drops on the 7700k? I guess they likely won't be coming now.


If you live in the US, you can buy a 7700K at Staples and then take advantage of their 110% price match with Microcenter. A friend of mine was able to get one for $290+.


----------



## BobiBolivia

Do we know (for a fact) if Zen+ will still be on AM4? Or will we be forced to change motherboards, as on Intel?


----------



## Pro3ootector

Any review with RYZEN OC without SMT?


----------



## neurotix

What happened to all the hype?

What happened to nearly every single user in the AMD CPU subforum preordering these things?

My 4790k is still better in nearly every game I've seen tested. Especially at 5ghz.

I have no compelling reason to upgrade (to anything) because I don't do any heavily multithreaded work, so this CPU isn't for me. The most I do is Plex streaming, and my i7 4790k still makes short work of it. I don't stream, I don't play multiplayer games, and I generally don't multitask while I play games (having Eyefinity and all). Even if I did do those things, it's not like my CPU is any slouch. Skylake or Kaby Lake are a sidegrade (except for fast DDR4) and Ryzen would basically be a downgrade.

I didn't let myself get hyped up at all because I'm sober and jaded and remember Bulldozer very well.

Seems like the second coming.


----------



## jprovido

Quote:


> Originally Posted by *Pro3ootector*
> 
> Any review with RYZEN OC without SMT?


Not at this time. Probably 100-200MHz more OC headroom.


----------



## GnarlyCharlie

Quote:


> Originally Posted by *budgetgamer120*
> 
> 
> 
> Memory performance is boss.


OK, if you say so (and my memory performance is not all that great, 2666 is as good as my old chip will do)

From Sept. 2015, FWIW


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> As primarily a gamer with multitasking coming second, this is disappointing.
> 
> I guess I'll be waiting for intel's next move


You shouldn't be looking at a CPU in that case... You are good for another 5 years.


----------



## ZealotKi11er

Meh, I will not be upgrading. Gaming performance is disappointing.


----------



## DADDYDC650

Quote:


> Originally Posted by *BobiBolivia*
> 
> Do we know (for a fact) if Zen+ will be still on AM4 ? Or will we be forced to change MBs as on Intel ?


I read that the AM4 upgrade path will last until 2019-2020.


----------



## Kuivamaa

Quote:


> Originally Posted by *Hueristic*
> 
> Too much info too fast, so was I right about the 1700 being the best bet $/OC or not?


Well, we do not quite know yet if there are Ryzen lemons that won't do 4GHz, but with what seems to be a hard ceiling at 4.2GHz on air/AIO, the 1700 seems to be the OC choice. The 1700X is for those who care about stock price/perf.


----------



## Wishmaker

Quote:


> Originally Posted by *ducegt*
> 
> What would such a policy even look like?, AMD themselves gave skewed previews. Its called marketing. Everyone does it. *As an American, I think free speech is vital.* Even if I dont care for what most people have to say. We fact check eachother here and on the flip side there are groups of people who will stand behind a false or misguided arguement.


Free speech is vital, but OCN has certain topics you are not allowed to approach. I will not name them, as I would be in violation of said policies even though I have the right to express myself. It should be the same with perpetuating inaccurate and false information.


----------



## ZealotKi11er

Now my money will just go for GPU.


----------



## ducegt

Quote:


> Originally Posted by *neurotix*
> 
> Seems like the second coming.


Cue @RyzenChrist


----------



## Alight

Any reviews comparing OCs on the same chip across all mobos? I'd like to pick up a gigabyte mobo, but if there is a significant difference in OCability between boards, I need to take that into account.


----------



## jprovido

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Meh, I will not be upgrading. Gaming performance is disappointing.


Yep. Negative scaling with SMT is a little bit worrying. This is not new, though; we've been here before with first-gen Hyper-Threading.

Would still get the 1700 just to play with it. Hopefully a 3.9GHz+ OC is possible with an mATX motherboard.


----------



## LancerVI

Quote:


> Originally Posted by *Pro3ootector*
> 
> Any review with RYZEN OC without SMT?


Exactly what I'm looking for. 8c/8t with SMT disabled overclock results.


----------



## fergsonfire

Looks like AMD built in its max overclocking with the XFR feature. Pretty disappointing. Looks like Intel allows more top end potential.


----------



## BobiBolivia

Quote:


> Originally Posted by *DADDYDC650*
> 
> I read that the AM4 upgrade path will last until 2019-2020.


Thanks. That's good enough for me.


----------



## opt33

Since I game at 1440p, where CPU differences will likely be minimal between Ryzen and Broadwell-E, and spend a fair amount of time encoding with 8 cores, I should have held off on buying the $1100 6900K and instead bought the AMD 1700 and OC'd it: similar performance for a third of the price. All the gaming benchmarks so far are 1080p, not that it is surprising. Kind of like buying a new LCD TV only to have it go 50% off on sale, or in this case way more than 50% lower.


----------



## ZealotKi11er

Quote:


> Originally Posted by *jprovido*
> 
> Yep. negative scaling on SMT is a little bit worrying. this is not new though we've been here before with the first gen Hyperthreading.
> 
> would still get the 1700 just to play with it. hopefully a 3.9ghz+ oc is possible with a matx motherboard


AMD never gets first gen right. Phenom I, Bulldozer and now Zen all failed the same way. It took Phenom II, Piledriver and probably Zen+ to iron out problems.


----------



## jprovido

Quote:


> Originally Posted by *ZealotKi11er*
> 
> AMD never gets first gen right. Phenom I, Bulldozer and now Zen all failed the same way. It took Phenom II, Piledriver and probably Zen+ to iron out problems.


good point. zen+ will be a beast


----------



## lombardsoup

Quote:


> Originally Posted by *opt33*
> 
> Since I game at 1440p, where CPU differences will likely be minimal between Ryzen and Broadwell-E, and spend a fair amount of time encoding with 8 cores, I should have held off on buying the $1100 6900K and instead bought the AMD 1700 and OC'd it: similar performance for a third of the price. All the gaming benchmarks so far are 1080p, not that it is surprising. Kind of like buying a new LCD TV only to have it go 50% off on sale, or in this case way more than 50% lower.


That was the path I ended up taking, used the rest on a GPU.


----------



## budgetgamer120

Quote:


> Originally Posted by *ZealotKi11er*
> 
> AMD never gets first gen right. Phenom I, Bulldozer and now Zen all failed the same way. It took Phenom II, Piledriver and probably Zen+ to iron out problems.


Not sure how Ryzen is a fail


----------



## DADDYDC650

Quote:


> Originally Posted by *budgetgamer120*
> 
> Not sure how Ryzen is a fail


Some people seem to have unrealistic expectations.


----------



## Slaughterem

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Now my money will just go for GPU.


In your case there was no reason to even consider a new cpu. You have a 4k monitor.


----------



## chuy409

Quote:


> Originally Posted by *ZealotKi11er*
> 
> AMD never gets first gen right. Phenom I, Bulldozer and now Zen all failed the same way. It took Phenom II, Piledriver and probably Zen+ to iron out problems.


Phenom II was the best release of the three you mentioned.


----------



## Descadent

Quote:


> Originally Posted by *budgetgamer120*
> 
> Not sure how Ryzen is a fail


Go re-read the reviews and you'll see why people think that, given the particular way they'd use Ryzen.


----------



## ZealotKi11er

Quote:


> Originally Posted by *jprovido*
> 
> good point. zen+ will be a beast


I was expecting ~10% more gaming performance than my 3770K, with the added benefit of 8 cores for future games, but for that to happen Zen has to run much faster DDR4 and OC as high as 4.6GHz. Looks like those rumors of Zen having clocking problems might have been true. It's not that Zen is a bad CPU; it's just not an upgrade for gaming, even coming from a 2600K.


----------



## Blackops_2

Quote:


> Originally Posted by *ZealotKi11er*
> 
> AMD never gets first gen right. Phenom I, Bulldozer and now Zen all failed the same way. It took Phenom II, Piledriver and probably Zen+ to iron out problems.


I was close to saying this, it took an improved generation on their last two architecture releases to get clocks where we all wanted them to be.


----------



## BobiBolivia

Quote:


> Originally Posted by *budgetgamer120*
> 
> Not sure how Ryzen is a fail


Everybody says it's a fail, so it has to be true... Deal with it.

/s


----------



## iRUSH

I'm very happy to see AMD offer a better CPU solution. As a high-refresh-rate panel user, I'm a little disappointed in the gaming results myself. Their 40-52% IPC improvement over the previous architecture seems about right. It also puts into perspective just how far ahead Intel actually is.

Nonetheless, I look forward to messing around with the 1700 here very soon.

These chips are far better suited to the current market than they would have been 5 years ago.


----------



## Mad Pistol

Quote:


> Originally Posted by *budgetgamer120*
> 
> Not sure how Ryzen is a fail


It's not. I don't understand why so many people are being so critical about it.


----------



## Jayjr1105

I'm still optimistic these numbers will improve through BIOS updates, bug fixes, microcode updates, motherboard revisions, chipset drivers, etc. It is extremely early to call anything a failure. But even if nothing improves, this chip is a winner, especially for its price.


----------



## pengs

Quote:


> Originally Posted by *LancerVI*
> 
> Exactly what I'm looking for. 8c/8t with SMT disabled overclock results.


http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7


----------



## ZealotKi11er

Quote:


> Originally Posted by *chuy409*
> 
> Phenom II was the better release from those 3 you mentioned.


The FX-8350 was a good step up from the FX-8150. Zen seems like a good CPU that needs 1-2 years of tuning. Also, motherboards/memory are not up to the task. We need AMP.


----------



## ducegt

Quote:


> Originally Posted by *Wishmaker*
> 
> Free speech is vital but OCN has certain topics you are not allowed to approach. I will not name them as I will be in violation of said policies even though I have the right to express myself. It should be the same with perpetuating inaccurate and false information.


Topics, or means of interaction between users? As someone who was once a young teenager posting on tech forums, I've learned how to deal with my former self. What inaccurate and false info did you see with regard to Ryzen? I saw some people who made claims about its OC potential and were way off; the same kind of people who tell others what choices to make without understanding all the variables in play. History has repeated itself, and generally those who made such predictions weren't well informed of it, perhaps because they were in their infancy when it happened.


----------



## Slaughterem

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I was expecting ~ 10% more gaming performance than my 3770K with the added benefit of 8-Core for future games but for that to happened Zen has to be running much faster DDR4 and OC as much as 4.6GHz. Look like those rumors of Zen having clocking problems might have been true. Its not like Zen is a bad CPU. It just that its not a upgrade even coming from 2600K for gaming.


You were hoping for what? Were you planning on playing games at 1080p?


----------



## Blackops_2

Quote:


> Originally Posted by *Mad Pistol*
> 
> It's not. I don't understand why so many people are being so critical about it.


Unrealistic expectations, driven by associating "AMD being back" with the nostalgia of A64, is the only plausible explanation I can come up with.


----------



## budgetgamer120

Quote:


> Originally Posted by *Descadent*
> 
> go re-read the reviews and you'll see why people think that for their certain way they would use ryzen.


I read the reviews, and they all show Ryzen is 40% or more faster per clock than the previous generation. So yeah, tell me how it is a fail, please.


----------



## Motley01

Quote:


> Originally Posted by *Motley01*
> 
> Well I'm heading to Microcenter now, I'll be back in a couple hours, then start my new build Very excited!!!


I just got back from Microcenter in Denver. I got there at 10:30am, and it was crazy! Well, not everybody was buying Ryzen; just a lot of people buying crap.

They had plenty of Ryzen processors, but their board inventory was whacked. No ASRock, no MSI. I ended up getting the Asus CH6, and they were selling like hotcakes. There were only like 3-4 of those left.

Well, now it's time to build this sucker. Should be fun...


----------



## Scotty99

Its literally fx launch V2, tell me how its any different.....please.

Man how pissed are the people who pre ordered a 1800x and could have literally saved 170 dollars with a 1700, from the reviews ive seen they OC to identical levels.


----------



## Xuper

Quote:


> Originally Posted by *budgetgamer120*
> 
> I read the reviews and they all prove Ryzen is 40% or more faster per clock than the previous. So yeah tell me how it is a fail please.












Dunno, man, why some feel it is!


----------



## budgetgamer120

Quote:


> Originally Posted by *Scotty99*
> 
> Its literally fx launch V2, tell me how its any different.....please.
> 
> Man how pissed are the people who pre ordered a 1800x and could have literally saved 170 dollars with a 1700, from the reviews ive seen they OC to identical levels.


Who are pissed? So far you are the only one


----------



## ZealotKi11er

Quote:


> Originally Posted by *Slaughterem*
> 
> You were hoping for what? Were you planning on playing games at 1080p?


I wanted to "upgrade" for the sake of upgrading. It's been 3 years with the 3770K, and Zen still struggles to keep up in games.


----------



## DADDYDC650

Quote:


> Originally Posted by *Scotty99*
> 
> Its literally fx launch V2, tell me how its any different.....please.
> 
> Man how pissed are the people who pre ordered a 1800x and could have literally saved 170 dollars with a 1700, from the reviews ive seen they OC to identical levels.


Seems people don't care much and will gladly pay for the higher number. I canceled my 1800X yesterday because I wanted to wait for reviews. Glad I did. Ordered a 1700 from Amazon.


----------



## Blackops_2

Quote:


> Originally Posted by *Scotty99*
> 
> Its literally fx launch V2, tell me how its any different.....please.
> 
> Man how pissed are the people who pre ordered a 1800x and could have literally saved 170 dollars with a 1700, from the reviews ive seen they OC to identical levels.


It's not even remotely comparable to any iteration of Bulldozer. No one should have to explain that; look at the disparity in performance between competing CPUs when BD launched then, versus Ryzen now.


----------



## jprovido

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I wanted to "upgrade" for the sake of upgrading. It's been 3 years with the 3770K, and Zen still struggles to keep up in games.


A 3770K @ 4.6GHz is still a beast. Sadly, you have to wait a bit more.


----------



## Slaughterem

Quote:


> Originally Posted by *Scotty99*
> 
> Its literally fx launch V2, tell me how its any different.....please.
> 
> Man how pissed are the people who pre ordered a 1800x and could have literally saved 170 dollars with a 1700, from the reviews ive seen they OC to identical levels.


Why would people be pissed? For everything but 1080p gaming this chip is very competitive. If you have a 1440p or 4K monitor, this chip is just as good as a 7700K.


----------



## 113802

Quote:


> Originally Posted by *neurotix*
> 
> What happened to all the hype?
> 
> What happened to nearly every single user in the AMD CPU subforum preordering these things?
> 
> My 4790k is still better in nearly every game I've seen tested. Especially at 5ghz.
> 
> I have no compelling reason to upgrade (to anything) because I don't do any heavily multithreaded work, so this CPU isn't for me. The most I do is Plex streaming, and my i7 4790k still makes short work of it. I don't stream, I don't play multiplayer games, and I generally don't multitask while I play games (having Eyefinity and all). Even if I did do those things, it's not like my CPU is any slouch. Skylake or Kaby Lake are a sidegrade (except for fast DDR4) and Ryzen would basically be a downgrade.
> 
> I didn't let myself get hyped up at all because I'm sober and jaded and remember Bulldozer very well.
> 
> Seems like the second coming.


It is a second coming for AMD!

This processor is not for you. It's not for me either, but I can't deny what AMD has accomplished.

Not sure if we are looking at the same release, but AMD's Ryzen 1700, a $329.99 processor, competes with the $1,049.99 Core i7-6900K.

AMD blows past Intel in parallel processing workloads, and every other workload besides gaming is on par with the Core i7-6900K. Everything AMD said about their processors was true. Even AnandTech had good things to say.


----------



## 98uk

Quote:


> Originally Posted by *Scotty99*
> 
> Its literally fx launch V2, tell me how its any different.....please.
> 
> Man how pissed are the people who pre ordered a 1800x and could have literally saved 170 dollars with a 1700, from the reviews ive seen they OC to identical levels.


Because some people are in a financial situation where they can easily afford it and don't want to waste time overclocking.


----------



## Derp

Remember OCN, when the results don't match up with your hype meter you can just blame the author of the review. Especially the reviews that are actually thorough and have the balls to call out AMD for their BS.

<3 you Gamers Nexus.


----------



## kaosstar

It's funny how the standards change here.
For years, efficiency/power consumption was one of the most important factors for the "enthusiasts" here.

Suddenly, power consumption doesn't matter - the important thing is 1080p gaming benchmarks. Since computer hardware enthusiasts are primarily 1080p gamers now, apparently.


----------



## DADDYDC650

Quote:


> Originally Posted by *98uk*
> 
> Because some people have a financial situation where they can easily afford it and don't want to waste time overclocking


Then those people aren't on OCN and won't be offended.


----------



## aDyerSituation

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I wanted to "upgrade" for the sake of upgrading. It's been 3 years with the 3770K, and Zen still struggles to keep up in games.


This. I traded for my 4790K a year after it came out and have had it about a year. I want to upgrade to DDR4 and all of that just because, but really I need a reason to, and I have yet to see one.


----------



## jprovido

Quote:


> Originally Posted by *Derp*
> 
> Remember OCN, when the results don't match up with your hype meter you can just blame the author of the review. Especially the reviews that are actually thorough and have the balls to call out AMD for their BS.
> 
> <3 you Gamers Nexus.


Steve and the rest of the GN gang nailed the Ryzen review. man I just hope they don't get too mainstream. I love their work. don't want them to turn into another Linustechtips


----------



## Slaughterem

Quote:


> Originally Posted by *kaosstar*
> 
> It's funny how the standards change here.
> For years, efficiency/power consumption was one of the most important factors for the "enthusiasts" here.
> 
> Suddenly, power consumption doesn't matter - the important thing is 1080p gaming benchmarks. Since computer hardware enthusiasts are primarily 1080p gamers now, apparently.


Exactly; the hypocrites are out in full force.


----------



## budgetgamer120

Quote:


> Originally Posted by *Slaughterem*
> 
> Why would people be pissed? For everything but 1080p gaming this chip is very competitive. If you have a 1440p or 4K monitor, this chip is just as good as a 7700K.


But gaming? Does something have to be the fastest at something to compete? Or did the meaning of "compete" change?


----------



## MrTOOSHORT

So keep the 1680 V2 then?

The 1700 looks like the cpu to get over the 1800x and 1700x.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kaosstar*
> 
> It's funny how the standards change here.
> For years, efficiency/power consumption was one of the most important factors for the "enthusiasts" here.
> 
> Suddenly, power consumption doesn't matter - the important thing is 1080p gaming benchmarks. Since computer hardware enthusiasts are primarily 1080p gamers now, apparently.


Nope. 1080p is a good test for a CPU. If you want to test a GPU, you test at 1440p and 4K. If you want a CPU review where you hear that all the CPUs perform the same, then test the CPUs at 4K.
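The point above is basically arithmetic: a frame takes roughly max(CPU time, GPU time) to produce, so lowering the resolution shrinks the GPU term until the CPU is the limiter. A toy illustration with invented millisecond figures (not measured data):

```python
# Toy model: frame rate is limited by whichever of CPU or GPU takes longer
# per frame. All millisecond figures below are invented for illustration.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 6.0, 8.0  # hypothetical CPU work per frame, in ms
for res, gpu_ms in [("1080p", 5.0), ("4K", 16.0)]:
    a, b = fps(cpu_fast, gpu_ms), fps(cpu_slow, gpu_ms)
    print(f"{res}: {a:.0f} fps vs {b:.0f} fps")
```

With these made-up numbers, the CPU gap shows up at 1080p (~167 vs 125 fps) but vanishes at 4K, where both CPUs land on the GPU's ~62 fps; that is why a 4K "CPU test" tells you nothing about the CPU.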


----------



## bigjdubb

Quote:


> Originally Posted by *jprovido*
> 
> Steve and the rest of the GN gang nailed the Ryzen review. man I just hope they don't get too mainstream. I love their work. *don't want them to turn into another Linustechtips*


That's not likely to happen, you get popular on youtube by being enjoyable to watch.


----------



## IRONPIG1

Quote:


> Originally Posted by *kaosstar*
> 
> It's funny how the standards change here.
> For years, efficiency/power consumption was one of the most important factors for the "enthusiasts" here.
> 
> Suddenly, power consumption doesn't matter - the important thing is 1080p gaming benchmarks. Since computer hardware enthusiasts are primarily 1080p gamers now, apparently.


Lel, heat maybe, but it's OVERCLOCK.net. I don't think anyone here wants to save power.

But there's no denying the power consumption is pretty impressive.


----------



## jprovido

Quote:


> Originally Posted by *kaosstar*
> 
> It's funny how the standards change here.
> For years, efficiency/power consumption was one of the most important factors for the "enthusiasts" here.
> 
> Suddenly, power consumption doesn't matter - *the important thing is 1080p gaming benchmarks*. Since computer hardware enthusiasts are primarily 1080p gamers now, apparently.


Once you get a high refresh rate monitor, you'd realize how important these "useless 1080p CPU-limited benchmarks" are. I learned it the hard way with my X99 setup; a 5820K @ 4.7GHz wasn't up to the task.


----------



## Slaughterem

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope. 1080p is a good test for a CPU. If you want to test a GPU, you test at 1440p and 4K. If you want a CPU review where you hear that all the CPUs perform the same, then test the CPUs at 4K.


And Gamers Nexus did it at 1440p, and the result was?


----------



## Alwrath

LOOKS LIKE RYZEN 1700 + 3200MHZ CL15 DDR4 + A GOOD MOTHERBOARD = THE BEST 8 CORE BARGAIN OF THE YEAR. GREAT JOB AMD, YOU DID IT. YOU GAVE A 4-4.2GHZ 8 CORE WITH WATER COOLING TO THE SMART ONES WHO NEEDED AN UPGRADE FOR $600. I FEEL REALLY BAD FOR PEOPLE WHO BOUGHT THE 1700X AND 1800X; THEY FLUSHED THEIR MONEY DOWN THE TOILET.


----------



## naz2

Quote:


> Originally Posted by *kaosstar*
> 
> It's funny how the standards change here.
> For years, efficiency/power consumption was one of the most important factors for the "enthusiasts" here.
> 
> Suddenly, power consumption doesn't matter - the important thing is 1080p gaming benchmarks. Since computer hardware enthusiasts are primarily 1080p gamers now, apparently.


Uhh, no. Performance is _always_ the #1 factor here. It's just that after years of Intel being automatically better, people had to find other criteria to compare.


----------



## Pro3ootector

AMD almost made socket 1151 toast. Now wait for the 4- and 6-core Ryzens!!


----------



## budgetgamer120

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> So keep the 1680 V2 then?
> 
> The 1700 looks like the cpu to get over the 1800x and 1700x.


Definitely, lol. Why were you even looking at Ryzen?


----------



## AuraNova

Quote:


> Originally Posted by *budgetgamer120*
> 
> Not sure how Ryzen is a fail


You have to read the context. I think "fail" is the wrong word to use in this case. It's that when AMD releases a new CPU architecture, the first iteration has significant bugs; Phenom I, Bulldozer, and Zen seem to share this, and their later counterparts were much improved. Ryzen itself isn't a fail, but its first iteration falls short compared to Intel. Of course, this is coupled with massive overhyping.

(Man, this is going to get lost under all the posts. lol)


----------



## tpi2007

I have only read three reviews so far (Tom's Hardware, PC Perspective and The Tech Report), but from the results I'd say that this quote from the TH review makes a good summary:
Quote:


> To that end, when we weigh the 1800X's strong showing in workstation and HPC workloads against its issues with games, we can't help but believe that AMD designed this specific configuration with a datacenter-driven mindset and didn't optimize it thoroughly for desktops. Much like Intel and Broadwell-E, in fact.


The idle power consumption of the platform is good, at 10w less than Intel's mainstream (although that can vary a lot by motherboard maker and specific model), but I still haven't read about the platform's overall performance, such as USB 3.1 and SATA 3 and NVMe, etc.

I still want to see how the CPU does with higher clocked RAM, as that may feed the cores better.

Overall, it's more or less as expected: good in some things, not so good in others; the AVX2 performance is lacking, as we already more or less knew, and there are teething issues here and there. If these Ryzen CPUs can't for the time being clock higher than 4.2 GHz, as I've read in a few posts here, then I'd say two things. First, Intel will probably lower their Broadwell-E CPU prices by a bit, but not as much as expected. Secondly, if we can make some extrapolations regarding the upcoming 1600X (for which AMD already announced the clockspeeds) and the quads later down the road - if they don't do any better clockspeed-wise, the 7700K's price won't be lowered by much, if at all, at least for now. Maybe that's why AMD isn't releasing the quads now. Perhaps to get more clockspeed with a refined process and better yields, otherwise they'll be competing with the 4770K (and with AVX2 at half throughput).


----------



## aDyerSituation

I know it's hard for some of you to comprehend, but not everyone has the same needs or uses a computer for the same thing you do.
If you need a chip that is good enough for gaming but has great multi-threaded performance for the low, Ryzen might be for you.

If you are trying to push 144 Hz at 1080p in games that use only a thread or two, like me, you might want to look elsewhere.


----------



## budgetgamer120

Quote:


> Originally Posted by *kaosstar*
> 
> It's funny how the standards change here.
> For years, efficiency/power consumption was one of the most important factors for the "enthusiasts" here.
> 
> Suddenly, power consumption doesn't matter - the important thing is 1080p gaming benchmarks. *Since computer hardware enthusiasts are primarily 1080p gamers now, apparently*.


----------



## LancerVI

Quote:


> Originally Posted by *pengs*
> 
> http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7


it shows it at stock. What's the point of disabling SMT if you're not overclocking???


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> I *know it's hard for some of you to comprehend, but not everyone has the same needs or uses a computer for the same thing you do.*
> If you need a chip that is a good enough for gaming, but has great multi threaded performance for the low, RyZen might be for you.
> 
> If you are trying to push 144hz 1080 in games that use only a thread or two like me, you might want to look elsewhere.


Yes maybe 24/7 gamers at 120hz need to understand that in bold too. I might as well buy a console


----------



## AuraNova

Quote:


> Originally Posted by *aDyerSituation*
> 
> I know it's hard for some of you to comprehend, but not everyone has the same needs or uses a computer for the same thing you do.
> If you need a chip that is a good enough for gaming, but has great multi threaded performance for the low, RyZen might be for you.
> 
> If you are trying to push 144hz 1080 in games that use only a thread or two like me, you might want to look elsewhere.


Ryzen is a balance of having both, especially if you just want to mess around with one computer. It still games well, and you get the added benefit of heavy workloads for multithreading.


----------



## kd5151

Quote:


> Originally Posted by *chuy409*
> 
> Phenom II was the better release from those 3 you mentioned.


Phenom II was shut down in about 1 month when the i5-750 was released. AMD had to lower prices down to i3 level to compete.


----------



## Jayjr1105

Quote:


> Originally Posted by *tpi2007*
> 
> I have only read three reviews so far (Tom's Hardware, PC Perspective and The Tech Report), but from the results I'd say that this quote from the TH review makes a good summary:
> The idle power consumption of the platform is good, at 10w less than Intel's mainstream (although that can vary a lot by motherboard maker and specific model), but I still haven't read about the platform's overall performance, such as USB 3.1 and SATA 3 and NVme, etc.
> 
> I still want to see how the CPU does with higher clocked RAM, as that may feed the cores better.
> 
> Overall, it's more or less as expected good in some things, not so good in others; the AVX2 performance is lacking as we already more or less knew, and there are teething issues here and there. If these Ryzen CPUs can't for the time being clock higher than 4.2 as I've read in a few posts here, then I'd say two things: Intel will probably lower their Broadwell-E CPUs by a bit, but not as much as expected and if we can make some extrapolations regarding the upcoming 1600X (for which AM already announced the clockspeeds) and the quads later down the road - if they don't do any better clockspeed wise, the 7700K's price won't be lowered by much if at all, at least for now. Maybe that's why AMD aren't releasing the quads now. Perhaps to get more clockspeed with a refined process and better yields, otherwise they'll be competing with the 4770K (and without having AVX2).


Very interesting catch


----------



## jprovido

Quote:


> Originally Posted by *LancerVI*
> 
> it shows it at stock. What's the point of disabling SMT if you're not overclocking???


he did it to make a point. in some cases stock with SMT disabled performed better than 1800x overclocked with SMT enabled.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *budgetgamer120*
> 
> Definitely lol why were you even looking at Ryzen.


OCN hype train, of course!









Also the curiosity of going back to AMD. If I was building a third computer, it would be a 1700 and a CH6. Great value there.


----------



## PontiacGTX

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Meh, I will not be upgrading. Gaming performance is disappointing.


On Nehalem/LGA1366, SMT performance decreased when it was on. Later, people had to wait for games to get proper coding for Hyper-Threading (AMD has never used SMT on their CPUs before, and their approach is probably different from Intel's), and you also have to account for memory latency.


----------



## budgetgamer120

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> OCN hype train ofcourse!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also the curiousity of going back to AMD. If I was building a third computer, it would be a 1700 and CH6. Great value there.


If Ryzen could hit the clocks you are at, then sure. But this ain't for you right now.


----------



## chuy409

I know it's an old game, but jesus. Beaten by FX.


----------



## Slaughterem

Quote:


> Originally Posted by *Alwrath*
> 
> HEY GUYS LET ME KNOW WHEN THE 4K REVIEWS COME OUT. IM NOT AS GOOD AS THE ELITE 1080P ENTHUSIAST GAMER CROWD IS. I ONLY HAVE A CRAPPY $1500 4K DISPLAY SO I GOTTA WAIT FOR THE LOWLY 4K BENCHMARKS TO COME IN. *BOWS BEFORE ALL THE ELITE ENTHUSIAST 1080P GAMERS*
> 
> *BUYS RYZEN 1700 ON NEWEGG*


Makes the smartest choice and has a great system for next few years


----------



## Derp

Quote:


> Originally Posted by *LancerVI*
> 
> it shows it at stock. What's the point of disabling SMT if you're not overclocking???


Because regardless of the clock speed SMT can show a rather large performance penalty on Ryzen. Lisa Su is blaming programs and games because they are optimized for Intel.

https://www.reddit.com/r/Amd/comments/5x4hxu/we_are_amd_creators_of_athlon_radeon_and_other/def5iab/


----------



## Motley01

Ok but can you please stop posting in all caps.


----------



## jprovido

Quote:


> Originally Posted by *chuy409*
> 
> 
> 
> I know its an old game but jesus. beaten by fx.


this is gonna trigger a lot of people. this benchmark is pretty useless imo.

I've had enough of this thread for now. bye guys keep on rockin


----------



## Scotty99

I am going to be honest: I've only been up for an hour, but the temperature results when overclocked are the biggest disappointment for me so far. I knew gaming was going to be behind (not quite this far, though); I was simply hoping for good temps given that the IHS is soldered to the die.

I dunno, man, I'm not sure I can even pull the trigger on a 1700 because I will be air cooling (albeit with a large cooler in the Dark Rock Pro 3).


----------



## criminal

Quote:


> Originally Posted by *Scotty99*
> 
> Its literally fx launch V2, tell me how its any different.....please.
> 
> Man how pissed are the people who pre ordered a 1800x and could have literally saved 170 dollars with a 1700, from the reviews ive seen they OC to identical levels.


Oh please go away! It is nowhere near that bad.

I think this is a good showing from AMD, especially considering the price points.


----------



## LancerVI

Quote:


> Originally Posted by *jprovido*
> 
> he did it to make a point. in some cases stock with SMT disabled performed better than 1800x overclocked with SMT enabled.


That wasn't my original question......

Are there any results for a better OC result with SMT off? With SMT off, I'd imagine you'd get a bit more headroom.


----------



## 113802

Quote:


> Originally Posted by *naz2*
> 
> uhh, no. performance is _always_ the #1 factor here. it's just that after years of intel being automatically better people had to find other criteria to compare


Nope; after years of Intel barely increasing performance while raising prices every generation.

Price/perf has been compared for years. Intel completely removed overclocking from non-K SKUs, forcing people to spend $40 more just to be able to overclock.

Now with the release of Ryzen we can see Intel's markup on their processors.


----------



## Kuivamaa

Quote:


> Originally Posted by *chuy409*
> 
> Phenom II was the better release from those 3 you mentioned.


That would be Zen by far. Phenom I was meh at best and even Phenom II had a worse initial showing than Zen.


----------



## budgetgamer120

Quote:


> Originally Posted by *criminal*
> 
> Oh please go away! It is nowhere near that bad.
> 
> I think this a good showing from AMD. Especially considering the price points.


Just look how bad Ryzen performs in gaming.

http://wccftech.com/ryzen-gaming-benchmarks-roundup/


----------



## sugarhell

Solid product but low quality reviews.

For 350 bucks I can get an 8-core that matches a 6900K most of the time. Great for a small workstation machine.

I think they will sort out all the bugs; it's a new platform and a new architecture. GJ AMD, now let's see Zen 2.


----------



## 113802

Quote:


> Originally Posted by *budgetgamer120*
> 
> Just look how bad Ryzen performs in gaming.
> 
> http://wccftech.com/ryzen-gaming-benchmarks-roundup/


Just look how great Ryzen performs in everything else

https://www.youtube.com/watch?v=Y7_1AQc6Xf8


----------



## bossie2000

Excellent job AAAAAAAAAAAMMMMMMMMMMMMMDDDDDDDDDDDDD!!!


----------



## budgetgamer120

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Just look how great Ryzen performs in everything else
> 
> https://www.youtube.com/watch?v=Y7_1AQc6Xf8


I was being sarcastic -_-


----------



## Scotty99

Does anyone know of an outlet that has done overwatch or WoW benchmarks? Techpowerup does not even have a ryzen review up yet.


----------



## budgetgamer120

Quote:


> Originally Posted by *Scotty99*
> 
> Does anyone know of an outlet that has done overwatch or WoW benchmarks? Techpowerup does not even have a ryzen review up yet.


Performs so badly.


http://wccftech.com/ryzen-gaming-benchmarks-roundup/


----------



## CriticalOne

This is what is really letting me down:


Spoiler: Processor load








I wanted to get a Ryzen eight-core processor not for peak performance, but to be able to play games and do other system tasks at the same time without much slowdown. The CPU gaming results are disappointing, but not exactly the end of the world. However, if Ryzen is only seeing 17-20% lower load in intensive, highly multithreaded games, then I'm not seeing a disadvantage to the 6700K/7700K for my uses.


----------



## -Sweeper_

PCGamesHardware

- Anno 2205
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 31.4 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 35.8 FPS (14% faster)

- AC: Syndicate
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 109.9 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 135.9 FPS (23.6% faster)

- Crysis 3
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 148.9 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 176.9 FPS (18.6% faster)

- Dragon Age Inquisition
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 112.3 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 135.6 FPS (20.7% faster)

- F1 2015
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 91.1 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 126.3 FPS (38.6% faster)

- Far Cry 4
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 76.8 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 89.3 FPS (16.2% faster)

- Starcraft 2
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 31.1 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 38.2 FPS (22.8% faster)

- The Witcher 3
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 80.2 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 134.7 FPS (68% faster)

www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033
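The "% faster" figures above are just FPS ratios; for anyone wanting to sanity-check them, a quick sketch that recomputes the deltas from the quoted numbers (rounding may land a few tenths away from the posted percentages):

```python
# FPS pairs (Ryzen R7 1800X, Core i7-6900K) from the PCGamesHardware table quoted above.
results = {
    "Anno 2205": (31.4, 35.8),
    "AC: Syndicate": (109.9, 135.9),
    "Crysis 3": (148.9, 176.9),
    "Dragon Age Inquisition": (112.3, 135.6),
    "F1 2015": (91.1, 126.3),
    "Far Cry 4": (76.8, 89.3),
    "Starcraft 2": (31.1, 38.2),
    "The Witcher 3": (80.2, 134.7),
}

for game, (ryzen_fps, intel_fps) in results.items():
    # Relative advantage of the 6900K over the 1800X, in percent.
    delta = (intel_fps / ryzen_fps - 1.0) * 100.0
    print(f"{game}: 6900K {delta:.1f}% faster")
```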


----------



## aDyerSituation

Quote:


> Originally Posted by *budgetgamer120*
> 
> Performs so badly.
> 
> 
> http://wccftech.com/ryzen-gaming-benchmarks-roundup/


I'll go ahead and tell you right now that benchmark is not legit.


----------



## darealist

The outrage is not that it's only a bit slower than Intel in gaming. It's that they lied to everyone, saying it's on par with a 6900K and that it's the fastest 8-core processor in the world. Intel engineers are most likely laughing their socks off right now.


----------



## 98uk

Quote:


> Originally Posted by *DADDYDC650*
> 
> Then those people aren't on OCN and won't be offended.


I'm one









Overclocking is boring and I'd rather spend the cash for the guaranteed extra!


----------



## davidelite10

Quote:


> Originally Posted by *-Sweeper_*
> 
> PCGamesHardware
> 
> - Anno 2205
> Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 31.4 FPS
> Core i7-6900K (8C/16T 3.2-3.7 GHz): 35.8 FPS (14% faster)
> 
> - AC: Syndicate
> Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 109.9 FPS
> Core i7-6900K (8C/16T 3.2-3.7 GHz): 135.9 FPS (23.6% faster)
> 
> - Crysis 3
> Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 148.9 FPS
> Core i7-6900K (8C/16T 3.2-3.7 GHz): 176.9 FPS (18.6% faster)
> 
> - Dragon Age Inquisition
> Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 112.3 FPS
> Core i7-6900K (8C/16T 3.2-3.7 GHz): 135.6 FPS (20.7% faster)
> 
> - F1 2015
> Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 91.1 FPS
> Core i7-6900K (8C/16T 3.2-3.7 GHz): 126.3 FPS (38.6% faster)
> 
> - Far Cry 4
> Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 76.8 FPS
> Core i7-6900K (8C/16T 3.2-3.7 GHz): 89.3 FPS (16.2% faster)
> 
> - Starcraft 2
> Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 31.1 FPS
> Core i7-6900K (8C/16T 3.2-3.7 GHz): 38.2 FPS (22.8% faster)
> 
> - The Witcher 3
> Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 80.2 FPS
> Core i7-6900K (8C/16T 3.2-3.7 GHz): 134.7 (68% faster)
> 
> www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033


That's fine and dandy, but the 6900K clocks to 4.3-4.4 GHz as well, so the gap grows even further.


----------



## Artikbot

Quote:


> Originally Posted by *darealist*
> 
> The outrage is not the fact it's only a bit slower than Intel in gaming. It's because they lied to everyone, saying it's on par with a 6900k and that it's the fastest 8 core processor in the world. Intel engineers are most likely laughing their socks off right now.


Let me guess, you were one of those people who extrapolated AMD's slides about Polaris into 'same power consumption as an R9 290, but 50% faster', right?

Quote:


> Originally Posted by *davidelite10*
> 
> That's fine and dandy but the 6900k clocks to 4.3-4.ghz as well, so even farther
> 
> 
> 
> 
> 
> 
> 
> .


It also costs almost 3 times as much.


----------



## Game256

Detailed and excellent Russian review of the Ryzen 7 1800X.

https://3dnews.ru/948466

Some interesting notes: overclocking on the Ryzen 7 1800X is pretty bad (but may be improved later).

The CPU couldn't surpass 4.0 GHz with stable performance (a Noctua NH-U14S was used for cooling).

The system was able to boot at frequencies of up to 4.25 GHz but was highly unstable. During testing with Prime95 at 4.05 GHz, the system crashed within minutes.

However, even running at 4.0 GHz was risky. To maintain stability at that frequency, the voltage had to be increased to 1.55 V. The temperature surpassed 100 °C even with the Noctua NH-U14S; however, throttling did not occur.


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> I'll go ahead and tell your right now that benchmark is not legit.


Let Hardware Canucks know. It is their benchmark:
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74814-amd-ryzen-7-1800x-performance-review-17.html


----------



## Scotty99

Have any reviewers tried overclocking the 1700 non x on a b350 board? That i am really curious about.


----------



## sugalumps

Quote:


> Originally Posted by *budgetgamer120*
> 
> Just look how bad Ryzen performs in gaming.
> 
> http://wccftech.com/ryzen-gaming-benchmarks-roundup/


At stock it looks to be a decent upgrade from my aging 4670K, and when you factor in the DDR4 upgrade as well it could be worth it, if only to support AMD. However, my i5 is sitting at 4.4 GHz; if the 1700X is unable to reach that, I'm not sure I would see any improvement at all in gaming.


----------



## AlphaC

Some reviews are alarming because they used coolers suitable for quad-cores, not 8-cores:

http://hothardware.com/reviews/amd-ryzen-7-1800x-1700x-1700-benchmarks-and-review?page=10
Quote:


> For cooling, we stuck with the tower-type Thermaltake Contact Silent 12 we showed you on a previous page. At these settings, the CPU topped out at just shy of 90°C and was perfectly stable. Bumping things up to 4.1GHz would result in frequent crashes and the system wouldn't boot properly at 4.2GHz. With more voltage and more powerful cooling, however, somewhat higher clocks would likely be possible, but don't expect to hit the same kind of frequencies Intel's 14nm processors are capable of, with this generation of Ryzen at least.


----------



## Artikbot

Why would they though? AMD provided decent cooling, there was no need to go with cheapo coolers.


----------



## Game256

Quote:


> Originally Posted by *AlphaC*
> 
> Some reviews are alarming because they used coolers suitable for quadcores not 8 cores:
> 
> http://hothardware.com/reviews/amd-ryzen-7-1800x-1700x-1700-benchmarks-and-review?page=10


From the Russian review of the Ryzen 1800X with a Noctua NH-U14S:

The CPU couldn't surpass 4.0 GHz with stable performance.

The system was able to boot at frequencies of up to 4.25 GHz but was highly unstable.

During testing with Prime95 at 4.05 GHz, the system crashed within minutes.

However, even running at 4.0 GHz was risky. To maintain stability at that frequency, the voltage had to be increased to 1.55 V. The temperature surpassed 100 °C even with the Noctua NH-U14S; however, throttling did not occur.


----------



## Master__Shake

Quote:


> Originally Posted by *Motley01*
> 
> Ok but can you please stop posting in all caps.


it's hard to troll if people can't see you


----------



## ducegt

Derp.







Not so good


----------



## kd5151

raw data. i like it!!!

https://m.youtube.com/watch?v=BXVIPo_qbc4


----------



## Derp

Quote:


> Originally Posted by *Scotty99*
> 
> Does anyone know of an outlet that has done overwatch or WoW benchmarks? Techpowerup does not even have a ryzen review up yet.


----------



## tpi2007

Quote:


> Originally Posted by *Jayjr1105*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> I have only read three reviews so far (Tom's Hardware, PC Perspective and The Tech Report), but from the results I'd say that this quote from the TH review makes a good summary:
> The idle power consumption of the platform is good, at 10w less than Intel's mainstream (although that can vary a lot by motherboard maker and specific model), but I still haven't read about the platform's overall performance, such as USB 3.1 and SATA 3 and NVme, etc.
> 
> I still want to see how the CPU does with higher clocked RAM, as that may feed the cores better.
> 
> Overall, it's more or less as expected good in some things, not so good in others; the AVX2 performance is lacking as we already more or less knew, and there are teething issues here and there. If these Ryzen CPUs can't for the time being clock higher than 4.2 as I've read in a few posts here, then I'd say two things: Intel will probably lower their Broadwell-E CPUs by a bit, but not as much as expected and if we can make some extrapolations regarding the upcoming 1600X (for which AM already announced the clockspeeds) and the quads later down the road - if they don't do any better clockspeed wise, the 7700K's price won't be lowered by much if at all, at least for now. Maybe that's why AMD aren't releasing the quads now. Perhaps to get more clockspeed with a refined process and better yields, otherwise they'll be competing with the 4770K (and without having AVX2).
> 
> 
> 
> 
> 
> 
> 
> Very interesting catch

Thanks.









Just a note: I corrected a few mistakes in the post (such as "AM" instead of "AMD"), but, more importantly, the last thing in parentheses, where I meant that Ryzen has half the AVX2 throughput, not that it doesn't have it. Earlier in the post I talked about AVX2, so it's easy to tell it's a mistake at the end, but still, on top of correcting it, since you quoted it before I corrected it, I thought it better to make a post about it.


----------



## DADDYDC650

Quote:


> Originally Posted by *98uk*
> 
> I'm one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Overclocking is boring and I'd rather spend the cash for the guaranteed extra!


Haven't heard of a 1700 not being able to hit 4 GHz YET.


----------



## Pro3ootector

Oh gaming laptops will no longer cost a fortune!!!


----------



## sugarhell

Quote:


> Originally Posted by *kd5151*
> 
> raw data. i like it!!!
> 
> https://m.youtube.com/watch?v=BXVIPo_qbc4


Thank you. Raw data is the best


----------



## Scotty99

Quote:


> Originally Posted by *Derp*


Damn, 50 FPS behind









That said, I've watched two reviews on YouTube, from Bitwit (Kyle) and Joker Productions, and their results showed it tying Intel.

Benchmarks are honestly all over the place right now, very frustrating.


----------



## tpi2007

I'm looking for a fourth review to read / watch, and I'm seeing GamersNexus mentioned as being good. Should I go for that one?


----------



## sugarhell

Quote:


> Originally Posted by *tpi2007*
> 
> I'm looking for a fourth review to read / watch and I'm seeing Gamersnexus mentioned as being good, should I go for that one?


I never liked GamerNexus

I disagree a lot with their methodology


----------



## Xuper

Quote:


> Originally Posted by *sugarhell*
> 
> I never liked GamerNexus
> 
> I disagree a lot with their methodology


GamersNexus is OK, what's wrong? Maybe they're complicated? I don't like their CSS style, though. Ugly.


----------



## Blaze051806

I love that AMD is back on the board! If I was into content creation I would def buy a 1700, but my rig is primarily used for gaming, so it looks like I'll be buying a 7700K. I'm glad it got a price drop though, thanks AMD!


----------



## tpi2007

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> I'm looking for a fourth review to read / watch and I'm seeing Gamersnexus mentioned as being good, should I go for that one?
> 
> 
> 
> I never liked GamerNexus
> 
> I disagree a lot with their methodology

Hmmm, so what would you advise next? I've read Tom's Hardware, PC Perspective and The Tech Report. All three are missing things they didn't have time to do for now, so I'm looking for a good one that has at least some of the missing stuff.


----------



## SoloCamo

Good side by side video of 5ghz 7700k vs 3.9ghz 1700






I'd say it games just fine considering its 1.1 GHz deficit and lower IPC.


----------



## DaaQ

From the Tom's conclusion, and to add to tpi2007's point.
Quote:


> _Achieving a 4 GHz overclock was straightforward enough through multiplier and voltage adjustments_, *and there are plenty of AMD-specific firmware settings we need to explore*. More headroom could certainly be available (though the Core i7-7700K is honestly more exciting to overclock if all you care about is higher numbers). On the memory overclocking side, AMD hasn't opened all of the sub-timings yet, and the Core i7-6900K has a throughput advantage with its quad-channel controller.


I will wait for member results before jumping to conclusions.
I think the emphasis (which is mine) of the above quote should speak for itself.


----------



## sugarhell

Quote:


> Originally Posted by *tpi2007*
> 
> Hmmm, so what would you advise next? I've read Tom's Hardware, PC Perspective and The Tech Report. All three are missing things they didn't have time to do for now, so I'm looking for a good one that has at least some of the missing stuff.


In general I read hardware.fr (they are the best) and computerbase.de


----------



## Game256

Well, it turned out that Ryzen is not as good as it was presented earlier, but it's still pretty excellent. The most important thing is that it has started competition, and there will be many more new CPUs, from AMD as well; this is still a new microarchitecture.


----------



## Vesku

Quote:


> Originally Posted by *darealist*
> 
> The outrage is not the fact it's only a bit slower than Intel in gaming. It's because they lied to everyone, saying it's on par with a 6900k and that it's the fastest 8 core processor in the world. Intel engineers are most likely laughing their socks off right now.


It's smaller, cheaper, and very close in performance to the 6900k in MT professional workloads. If there is laughter at Intel HQ it's probably not regarding Zen.


----------



## budgetgamer120

Quote:


> Originally Posted by *kd5151*
> 
> raw data. i like it!!!
> 
> https://m.youtube.com/watch?v=BXVIPo_qbc4


Ryzen seems very competitive in gaming.


----------



## SoloCamo

Quote:


> Originally Posted by *darealist*
> 
> The outrage is not the fact it's only a bit slower than Intel in gaming. It's because they lied to everyone, saying it's on par with a 6900k and that it's the fastest 8 core processor in the world. Intel engineers are most likely laughing their socks off right now.


Links? Don't remember them saying it's the fastest 8 core processor in the world.


----------



## Descadent

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ryzen seems very competitive in gaming.


as long as you accept 20% less performance


----------



## BinaryDemon

I wonder if the R5 and R3 parts will clock higher or if 4.1 GHz is a limitation of the architecture.


----------



## Scotty99

Can anyone link a review done with a B350 board?

According to this AMD video, B350 should be a fully capable OC board:
https://www.youtube.com/watch?v=xRiWY1xiqFM

If a 100-dollar B350 board can push an R7 1700 to 3.8 or 3.9 on all cores, I think that is the route I am going to go.


----------



## Alwrath

This is just the beginning. Once the motherboard manufacturers wake up and develop stronger boards for overclocking, we will probably see 4.4 GHz Ryzen 1700 sigs in the forums in a few months, once the really nice $300-$400 overclocking boards come out.


----------



## AlphaC

http://www.pcworld.com/article/3176100/computers/amd-ryzen-7-1700-vs-a-5-year-old-gaming-pc-or-why-you-should-never-preorder.html?page=2

A very harsh review of the R7 1700.
Quote:


> Don't get me wrong: There's a whole lot to like about Ryzen. Like I said, it's competitive at higher gaming resolutions, or if you pair it with a top-end graphics card like the GTX 1080. As evidenced by the Jekyll and Hyde Ashes of the Singularity results, Ryzen shines in games that actually make use of abundant cores, which is encouraging indeed if DirectX 12 and Vulkan games start gaining traction. Ryzen's incredibly, impressively power efficient. The platform supports every modern amenity you could ask for. And, as you'll discover in PCWorld's Ryzen review, AMD's CPUs actually whomp Intel's chips in multithreaded productivity tasks-and for a fraction of the price of comparable 8-core Core processors. The Ryzen 7 1700 is damned disruptive, and not a dud whatsoever.
> 
> No, it's not a dud-unless you're looking to replace a 5-year-old, quad-core Intel Core i5 chip for mainstream gaming at the most popular display resolution. There, the Ryzen 7 1700 can stumble, and stumble hard.


They should have overclocked the R7 1700 to 3.9 or 4 GHz and given it the TDP limit of an i7-7700K.

I'm astonished that there would be such a harsh review when the i7-6900K, i7-6800K, etc. were not given such harsh treatment.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Alwrath*
> 
> This is just the beginning. Once the motherboard manufacturers wake up and develop stronger boards for overclocking, we will probably see 4.4 ghz Ryzen 1700 sig's in the forums in a few months once the really nice $300-$400 overclock boards come out.


These are low watt cpus, can't see how $300-$400 boards would help overclocking. The Asus CH6 is more than enough.


----------



## budgetgamer120

Quote:


> Originally Posted by *Descadent*
> 
> as long as you accept 20% less performance


In the video I replied to, they are on par, while Ryzen gives smoother FPS.


----------



## Alwrath

Quote:


> Originally Posted by *Descadent*
> 
> as long as you accept 20% less performance


Not according to this review. Neck and neck against an Intel 7700K in every game.

https://www.youtube.com/watch?v=V5RP1CPpFVE


----------



## Jpmboy

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ryzen seems very competitive in gaming.


Aside from comparing a 4-core vs. an 8-core, look closely at the GPU frequencies. The only thing I can conclude from that YT video is that the 7700K is performing better than I would expect.


----------



## geriatricpollywog

I feel bad for anybody who preordered an 1800x for a gaming rig, unless they bought AMD for the sake of buying AMD.

This worries me, as I planned on getting a Vega card. If they can't deliver with Ryzen, there's no way Vega will live up to its promises.

Fine wine will prevail and Ryzen may hit its stride in a few years.


----------



## neurotix

So they had talked up how they were targeting a 40% IPC improvement, and then recently said that they exceeded that and it was closer to 56%...

...however, I find it interesting that, to my knowledge, they never really said compared to what. 56% compared to the original Bulldozer? Piledriver? Excavator? A potato?
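For what it's worth, an IPC-uplift claim is just a multiplier on per-clock throughput, which is why the baseline matters so much. A toy sketch under that reading (the 40% and 56% figures are the ones discussed above; the clock numbers in the last line are purely hypothetical):

```python
# Toy model: single-thread performance ~ IPC x clock.
# The uplift fractions are the figures discussed in the thread; the baseline
# is whatever AMD measured against (Excavator, per the reply below? unclear).
def relative_perf(ipc_uplift: float, clock_ratio: float = 1.0) -> float:
    """Performance vs. baseline for a fractional IPC uplift and clock ratio."""
    return (1.0 + ipc_uplift) * clock_ratio

print(relative_perf(0.40))             # 40% IPC target -> 1.4x at equal clocks
print(relative_perf(0.56))             # ~56% claim     -> 1.56x at equal clocks
print(relative_perf(0.56, 3.6 / 4.0))  # a hypothetical clock deficit erodes the uplift
```

The point being that a big IPC number over a low baseline says little on its own without knowing both the baseline and the clocks.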


----------



## Alwrath

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> These are low watt cpus, can't see how $300-$400 boards would help overclocking. The Asus CH6 is more than enough.


This is the first launch board from Asus. While impressive, there's always room for improvement. Just wait and see, Asus will release a stronger OC board in a few months, I guarantee it.


----------



## budgetgamer120

Quote:


> Originally Posted by *neurotix*
> 
> So they had talked up how they were targetting 40% IPC improvement, and then recently that they exceeded that and it was closer to 56%...
> 
> ...however I find it interesting that to my knowledge, they never really said compared to what. 56% compared to the original Bulldozer? Piledriver? Excavator? A potato?


Excavator sir... Excavator... They said so in the stream


----------



## Pro3ootector

It's a question of Ryzen at 4.1 GHz with SMT disabled vs. a 4C/8T Kaby Lake at 5.1 GHz.


----------



## EniGma1987

Quote:


> Originally Posted by *bigjdubb*
> 
> The 4k results are showing gpu limitations, not cpu limitations. 720p is the only resolution needed in CPU tests.


I'd like to see people test 4K for minimum-FPS differences between the CPUs, since CPU and memory can affect min FPS quite a bit.
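For reference, the usual minimum/"1% low" metric falls straight out of per-frame times; here's a minimal sketch (the frame-time list is invented sample data, not a real capture):

```python
# Compute average FPS and the "1% low" (average of the slowest 1% of
# frames) from a list of frame times in milliseconds.
# The sample data below is invented for illustration.
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000 * n / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

times = [8.0] * 95 + [25.0] * 5   # mostly fast frames with a few spikes
avg, low = fps_stats(times)
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")
```

A run like this shows why averages hide stutter: a handful of 25ms spikes barely moves the average but tanks the 1% low, which is exactly where CPU and memory differences tend to show up.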
Quote:


> Originally Posted by *RedM00N*
> 
> Someone already got close to 6ghz with everything active.
> 
> 5.8 1.97v
> 
> http://hwbot.org/submission/3473875_der8auer_cpu_frequency_ryzen_7_1800x_5802.93_mhz
> 
> 5.4g GPUPI for CPU
> http://hwbot.org/submission/3473880_


5.8GHz with all 16 threads is pretty crazy.


----------



## amstech

Yikes.

If you got rid of your i7 for a Ryzen CPU [for gaming purposes] and bought into the hype, you have learned a valuable lesson.
That being said, I think Ryzen is doing OK; it's competing with the 3770K, which is pretty respectable, but overall the overclock wall and gaming results tell no lies.
This means my 6-year-old i7 is competing with Ryzen!

Talk about bang for your buck!


----------



## Game256

Quote:


> Originally Posted by *AlphaC*
> 
> http://www.pcworld.com/article/3176100/computers/amd-ryzen-7-1700-vs-a-5-year-old-gaming-pc-or-why-you-should-never-preorder.html?page=2
> 
> very harsh review of the R7 1700.


I wouldn't call it harsh. He states the obvious: Ryzen doesn't bring exceptional improvements in terms of technology (except power efficiency); instead it brings competition and a similar CPU for a better price.


----------



## Xuper

Here is The Stilt's review from Anandtech:



Quote:


> 850 points in Cinebench 15 at 30W is quite telling. Or not telling, but absolutely massive. Zeppelin can reach absolutely monstrous and unseen levels of efficiency, as long as it operates within its ideal frequency range.


Holy cow, 850 points at 30W!!!
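For a sense of scale, here's a quick points-per-watt calculation. The Zen figure is the 850-point/30W number from the quote; the FX-8350 numbers (~640 points at ~170W package power) are rough ballpark assumptions for illustration only, not measurements:

```python
# Rough Cinebench R15 efficiency comparison.
# Zen figure is from The Stilt's post; the FX-8350 numbers are
# ballpark assumptions for illustration, not measurements.
chips = {
    "Zen @ ideal freq": (850, 30),     # (CB15 points, package watts)
    "FX-8350 (assumed)": (640, 170),
}

for name, (points, watts) in chips.items():
    print(f"{name}: {points / watts:.1f} pts/W")
```

Even if the FX figures are off by a fair margin, the gap is roughly an order of magnitude, which lines up with the "one tenth the power" comparisons to Vishera made later in the thread.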


----------



## BinaryDemon

I would say Ryzen is a very well-rounded processor, but if all you do is play games, then these current R7s might not represent the best choice. The R7s seem positioned against Intel's workstation CPUs, and I would think the R5 or R3 Ryzens will offer better value (not performance) against mainstream 1151 CPUs.


----------



## OutlawII

The Ryzen hype train has derailed as far as gaming is concerned, but I do believe this will still be good for the CPU market.


----------



## Descadent

Quote:


> Originally Posted by *budgetgamer120*
> 
> In this video which is what i replied they are on par while Ryzen gives smother fps.


Quote:


> Originally Posted by *Alwrath*
> 
> Not according to this review. Neck and neck against a intel 7700k in every game.
> 
> https://www.youtube.com/watch?v=V5RP1CPpFVE


Doesn't matter; it can't even get within 20 FPS, or sometimes WAY more, of Intel. The fact that you can't get Ryzen over 4.1-4.2 is why. It's actually bullcrap that this thing can't overclock, but some of you are just willing to roll over and accept it "because it's an 8 core" blah blah... and if we are talking about an 1800X, it's like, ***? This is it?! Gaming, I'm talking about, not rendering.

I am just so glad I had the restraint not to preorder any of it, as I need gaming and rendering, not one or the other... and my 2600K at 4.6 competes with ALL of the Ryzens on the game side, so I ain't buying in until Ryzen can overclock like a man.


----------



## EniGma1987

Quote:


> Originally Posted by *naz2*
> 
> the poopstorm in a few recent threads could've fooled me. funny how so many people are backing off their original claims and falling back to the synthetic benchmark/"productivity" argument when we just had a 60 page thread exclusively about ryzen gaming performance


What thread was that?


----------



## budgetgamer120

Quote:


> Originally Posted by *amstech*
> 
> Yikes.
> 
> If your got rid of your i7 for a Ryzen CPU [for gaming purposes] and bought into the hype, you have learned a valuable lesson.
> That being said I think Ryzen is doing ok, its competing with the 3770k which is pretty respectable, but overall the overclock wall and gaming results tell no lies.
> This means my 6 year old i7 is competing with Ryzen!
> 
> Talk about bang for your buck!


Yes, because I still play Far Cry 2 and those other dinosaur games.


----------



## ChronoBodi

Jesus, did it occur to anyone why it's weird that it has the IPC of Broadwell-E in everything but gaming?

It has a 162 CB single-threaded score, comparable to Broadwell.

Maybe it's the games having been Intel-optimized for 10 years, and this being literally the first day of a brand-new architecture from AMD in a very long time.

Maybe it's early firmware, or bugs with early review kits?

I'd rather have a CPU that's decent in gaming and better in everything else.

1080p FPS is not the be-all and end-all, people. At higher resolutions the Ryzens and i7s are close or tied.

I'd rather game and render videos with a CPU that's great at both
than with a one-trick-pony CPU.

For video editors, this is a huge deal: 6900K IPC for half the price.


----------



## CriticalOne

I am just going to put Zen on the back burner for longer. DDR4 prices are still far too high for my liking and I'm not desperate for a new CPU. If I wait long enough for Zen+, then I will get that instead.

I don't see why people are putting even more hope in the 1600X. If the 8C/16T version, the 1800X, isn't performing well in games, cutting off more cores is only going to hurt performance more.


----------



## zGunBLADEz

We need frame-pacing tests for the 7700K (à la the 7970 era XD). I just LOLed so hard seeing smooth AMD and a stutter/microstutter fest from Intel:
https://www.youtube.com/watch?v=BXVIPo_qbc4


----------



## DaaQ

Quote:


> Originally Posted by *neurotix*
> 
> So they had talked up how they were targetting 40% IPC improvement, and then recently that they exceeded that and it was closer to 56%...
> 
> ...however I find it interesting that to my knowledge, they never really said compared to what. 56% compared to the original Bulldozer? Piledriver? Excavator? A potato?


It was, has been and is Excavator.


----------



## budgetgamer120

Quote:


> Originally Posted by *ChronoBodi*
> 
> Jesus, did it occur to anyome why its weird that it has ipc of Broadwell-e in everything but gaming.
> 
> It has 162 CB single threaded score, comparable to Broadwell.
> 
> Maybe its the games being so Intel-optimized for 10 years, and this is literally first day ever of a brand new architecture from AMD in a very long time.
> 
> Maybe its early firmware, bugs with early review kits?
> 
> I rather have a cpu that's decent in gaming and better in everything else.
> 
> 1080p FPS is not the be-all end all, people. At higher resolutions the Ryzens and i7s are near or tied.
> 
> I rather game and render videos with the cpu being great at both,
> Rather than a one-trick pony cpu.
> 
> For video editors, this is a huge deal, 6900k ipc for 1/2 the price.


Well rounded CPU.


----------



## OutlawII

Quote:


> Originally Posted by *budgetgamer120*
> 
> Yes Cause I still play Far Cry 2 and those other dinosaur games


Maybe you should watch some other reviews; it falls behind in a lot of games.


----------



## Undervolter

Quote:


> Originally Posted by *Xuper*
> 
> GamerNexus is ok , what's wrong ? maybe they're complicated? I don't like their CSS Style.ugly


I remember GamersNexus being brought up in the AMD forum because, in their gaming benchmarks, they wrote more or less that the FX-95xx had a "characteristic behaviour observed by GamersNexus always, where it would drop FPS behind the 8350 or even the 8370E". Several members pointed out that it is impossible for a lower-clocked FX to beat the FX-95xx, because they are the same chips, just at higher frequency, and that obviously they were doing something wrong (like letting it throttle). It was to no avail.


----------



## Game256

The performance of Ryzen is more than enough for gaming, and Ryzen significantly surpasses its price competitors in everything else.


----------



## budgetgamer120

Quote:


> Originally Posted by *zGunBLADEz*
> 
> we need frame pacing tests for 7700k (ala 7970 era XD) i just loled so hard seeing smooth amd and stutter/microstutter fest from intel
> https://www.youtube.com/watch?v=BXVIPo_qbc4


Also notice the 70% CPU usage on the 7700k


----------



## tpi2007

Quote:


> Originally Posted by *SoloCamo*
> 
> Good side by side video of 5ghz 7700k vs 3.9ghz 1700
> 
> I'd say it games just fine considering it's 1.1ghz deficit and lower IPC.


Thanks! Rep+

From those games it seems fine. In other games, not so much apparently; the numbers seem to be a bit all over the place. But from what I'm seeing, the Ryzen 7 1700 does indeed appear to be the best value.

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> Hmmm, so what would you advise next? I've read Tom's Hardware, PC Perspective and The Tech Report. All three are missing things they didn't have time to do for now, so I'm looking for a good one that has at least some of the missing stuff.
> 
> 
> 
> In general i read hardware.fr(they are the best) and computerbase.de

Thanks! Rep+

Quote:


> Originally Posted by *neurotix*
> 
> So they had talked up how they were targetting 40% IPC improvement, and then recently that they exceeded that and it was closer to 56%...
> 
> ...however I find it interesting that to my knowledge, they never really said compared to what. 56% compared to the original Bulldozer? Piledriver? Excavator? A potato?


They claimed 52%, not 56%, and the graph Lisa Su presented explicitly mentioned it was against the last Bulldozer iteration, Excavator.
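Since per-thread performance is roughly IPC × clock, the quoted 52% IPC gain compounds with any clock difference. A toy sketch (Excavator normalized to 1.0 IPC; both clock figures below are illustrative assumptions, not AMD's numbers):

```python
# Per-thread throughput scales roughly as IPC * clock speed.
# Excavator is normalized to 1.0 IPC; Zen is ~1.52 per AMD's 52% claim.
# The clock speeds below are illustrative assumptions.
excavator = {"ipc": 1.00, "ghz": 3.5}
zen = {"ipc": 1.52, "ghz": 3.6}

def perf(chip):
    return chip["ipc"] * chip["ghz"]

uplift = perf(zen) / perf(excavator) - 1
print(f"Per-thread uplift: {uplift:.0%}")  # -> 56% with these assumed clocks
```

A modest clock bump on top of a 52% IPC claim lands in the mid-50s, which may be one source of the 52-vs-56% confusion in this thread.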


----------



## 7850K

Quote:


> Originally Posted by *neurotix*
> 
> So they had talked up how they were targetting 40% IPC improvement, and then recently that they exceeded that and it was closer to 56%...
> 
> *...however I find it interesting that to my knowledge*, they never really said compared to what. 56% compared to the original Bulldozer? Piledriver? Excavator? A potato?


It's been known since the early Zen leak info. You are just now catching up and looking the fool, as usual.


----------



## kingduqc

Quote:


> Originally Posted by *amstech*
> 
> Yikes.
> 
> If your got rid of your i7 for a Ryzen CPU [for gaming purposes] and bought into the hype, you have learned a valuable lesson.
> That being said I think Ryzen is doing ok, its competing with the 3770k which is pretty respectable, but overall the overclock wall and gaming results tell no lies.
> This means my 6 year old i7 is competing with Ryzen!
> 
> Talk about bang for your buck!


This reminded me that I have to PM a guy on OCN that he won't be getting my i7.


----------



## budgetgamer120

Quote:


> Originally Posted by *OutlawII*
> 
> Maybe you should watch some other reveiws it falls behind in alot of games.


It does not need to be the fastest to "compete". Did the meaning of compete change?


----------



## CriticalOne

Quote:


> Originally Posted by *budgetgamer120*
> 
> Also notice the 70% CPU usage on the 7700k


See my post from earlier:
Quote:


> Originally Posted by *CriticalOne*
> 
> This is what is really letting me down:
> 
> 
> Spoiler: Processor load
> 
> I wanted to get a Ryzen eight core processor not for peak performance, but to be able to play games and do other system tasks at the same time without much slowdown. The CPU gaming results are disappointing, but not exactly the end of the world. However, if Ryzen is only getting 17-20% lower load in intensive highly multithreaded games then i'm not seeing a disadvantage to the 6700K/7700K for my uses.


----------



## mouacyk

Quote:


> Originally Posted by *tpi2007*
> 
> They claimed 52%, not 56%, and the graph Lisa Su presented explicitly mentioned it was against the last Bulldozer iteration, Excavator.


It's funny that we're happy for AMD to bring some competition, but AMD is feeling anything but competitive. The big performance improvement of 40-52% is compared to *their* own *Bulldozer*! That does not bode well for competition.


----------



## ducegt

Quote:


> Originally Posted by *OutlawII*
> 
> Maybe you should watch some other reveiws it falls behind in alot of games.


That guy will ignore everything. He almost seems like a new Zen-powered forum marketing bot. That neural learning tech is taking off. Could be more interesting if and when it matures.


----------



## Mephistobr

Some people are bashing the gaming performance, but I don't think it will stay like that much longer. How many years has Intel been the go-to for gaming? All these games are clearly optimized for Intel CPUs. There's absolutely zero sense in a CPU being on par or better in everything but gaming unless there's a problem with the games themselves. I'm pretty sure that with new patches all those games can perform significantly better on Ryzen.


----------



## budgetgamer120

Quote:


> Originally Posted by *CriticalOne*
> 
> See my post from earlier:


A quad would be no good to you then.


----------



## OutlawII

Quote:


> Originally Posted by *budgetgamer120*
> 
> It does not need to be the fastest to "compete". Did the meaning of compete change?


Just saying that, the way everyone hyped it up, it was the best thing since sliced bread.


----------



## cssorkinman

Quote:


> Originally Posted by *Xuper*
> 
> Here The Stilt's Review from Anandtech
> 
> 
> 
> Quote:
> 
> 
> 
> 850 points in Cinebench 15 at 30W is quite telling. Or not telling, but absolutely massive. Zeppelin can reach absolutely monstrous and unseen levels of efficiency, as long as it operates within its ideal frequency range.
> 
> 
> 
> Holy cow 850 point in 30w !!!

I saw that - amazing to me. I have no idea how it compares to Intel's best effort, but that's about one tenth the power an 8-core Vishera would need to score 850. I'd think it speaks well for Zen in the server/mobile markets.


----------



## budgetgamer120

Quote:


> Originally Posted by *ducegt*
> 
> That guy will ignore everything. Almost seems like a new Zen powered forum marketing bot. That neural learning tech is taking off. Could be more interesting if and when it matures.


There are a lot more hater bots.


----------



## Ultracarpet

Quote:


> Originally Posted by *AlphaC*
> 
> http://www.pcworld.com/article/3176100/computers/amd-ryzen-7-1700-vs-a-5-year-old-gaming-pc-or-why-you-should-never-preorder.html?page=2
> 
> very harsh review of the R7 1700.
> Should have overclocked the R7 1700 to 3.9 or 4Ghz , give it a TDP limit of a i7-7700k.
> 
> Astonished that there would be such a harsh review when the i7-6900k, i7-6800k, etc were not given such harsh treatment.


He overclocked his 3570K to 4.2GHz and is using 2133 DDR3 RAM that is undoubtedly tweaked. He didn't mention what speed his sticks were running at on the 1700, nor did he overclock the 1700 at all.

Why would he not include an 8-core, or even a 6-core Intel for that matter, also running at stock?

This crap has been happening ever since AMD shocked people with the first few leaks indicating that these chips weren't going to be so firmly in second place to Intel. The R7 is a different beast from Intel's quad-core chips, and yet everyone and their grandparents keep comparing them to the mainstream Intel lineup in tasks that the Intel quad cores practically live off of. The 6950X, the 6900K, and the 6800K are not as good gaming chips as the 7700K, and no one is claiming Intel's HEDT platform is useless because of it. For AMD, however, that is suddenly a defining factor??? Such a joke.

The R5 and R3 chips are what need to be compared to Intel's mainstream lineup, for the simple reason that their core configs are close to or exactly the same as Intel's, and they will also be priced much cheaper relative to their Intel equivalents. The R3, for example, is going to be quite a bit cheaper than a non-HT i5. We will see the value proposition versus Intel's mainstream chips then. For now, the focus should be on Intel's HEDT platform, which undoubtedly has a few advantages, but at an absolutely massive price hike versus this Ryzen R7 lineup.

The 1700 is close in price to the 7700K, I get that, but that is not the point, nor the value proposition of the chip. It is something like 30-40% of the cost of a 6900K, and it can easily be OC'd to be competitive with the 6900K. That is mind-blowing to me, and should be for everyone.


----------



## mouacyk

Quote:


> Originally Posted by *kingduqc*
> 
> This reminded me that I have to PM a guy on OCN that he won't be getting my i7.


I happily offloaded mine -- need those 8 cores for compiling work


----------



## AmericanLoco

Quote:


> Originally Posted by *mouacyk*
> 
> It's funny that we're happy for AMD to bring some competition, but AMD is feeling anything but competitive. The big performance improvement of 40 - 52% are compared to *their* own *Bulldozer*! Does not bode well for competition.


You know, never mind the fact that it's pretty much comparable to the i7-6900 at half the cost. The i7 quad cores are not the 8-core Zen's target.


----------



## pengs

Quote:


> Originally Posted by *Scotty99*
> 
> Dam, 50 FPS behind
> 
> 
> Although ive watched two reviews on youtube from bitwit (kyle) and joker productions, and their results were that it ties intel.


50fps down... from 300fps


----------



## ducegt

Quote:


> Originally Posted by *mouacyk*
> 
> The big performance improvement of 40 - 52% are compared to *their* own *Bulldozer*! Does not bode well for competition.


Donald Trump rebuttal: Yeah, but can't you agree 52% is a very yuuuge number? ... Then he would thank you for agreeing. Oh 52, you are so huge it doesn't matter what context you are used in.


----------



## zGunBLADEz

We need more raw data results. The Joker video is causing a big stir here, and I don't trust these reviewers XD


----------



## mouacyk

Quote:


> Originally Posted by *AmericanLoco*
> 
> You know, never mind the fact that it's pretty much comparable to the i7-6900 at half the cost. The i7 quad cores are not the 8-core Zen's target.


I agree AMD has something when it comes to perf/dollar, absolutely no arguments there! My point is that the 52% improvement over Bulldozer is then meaningless, because Bulldozer is irrelevant. When it comes to pure performance, AMD still has nothing; try a 6950X at 4.4GHz on all 10 cores.


----------



## CriticalOne

Quote:


> Originally Posted by *budgetgamer120*
> 
> A quad would be of no good to you then.


I get that, but the Ryzen processors don't seem to be much better off than the i7s when it comes to CPU utilization in games.


----------



## LancerVI

Quote:


> Originally Posted by *ChronoBodi*
> 
> Jesus, did it occur to anyome why its weird that it has ipc of Broadwell-e in everything but gaming.
> 
> It has 162 CB single threaded score, comparable to Broadwell.
> 
> Maybe its the games being so Intel-optimized for 10 years, and this is literally first day ever of a brand new architecture from AMD in a very long time.
> 
> Maybe its early firmware, bugs with early review kits?
> 
> I rather have a cpu that's decent in gaming and better in everything else.
> 
> 1080p FPS is not the be-all end all, people. At higher resolutions the Ryzens and i7s are near or tied.
> 
> I rather game and render videos with the cpu being great at both,
> Rather than a one-trick pony cpu.
> 
> For video editors, this is a huge deal, 6900k ipc for 1/2 the price.


I'm going to predict it now: Ryzen will age VERY WELL.

All this talk about it being a failure is completely ridiculous. Let's have this conversation 6 months from now.

The 1700 non-X appears to be a GREAT value and Joker Productions shows it neck and neck with the 7700.

Remember, NO ONE EXPECTED RYZEN TO BEAT KABY LAKE IN GAMING. If you did, you're a complete ignoramus.

Having said that, I would say the 1800x or 1700x are a pretty hard sell for gaming.


----------



## looniam

none of these gaming benchmarks are relevant since no one was streaming to twitch and using handbrake at the same time.

*amirite?*


----------



## 7850K

"From the AMD AMA. CEO herself answering stuff."

https://forums.anandtech.com/threads/official-amd-ryzen-benchmarks-reviews-prices-and-discussion.2499879/page-103#post-38771019


----------



## ihatelolcats

So, what's the deal with XFR? Are there any reviews that go into detail about it?


----------



## Serios

Quote:


> Originally Posted by *Wishmaker*
> 
> I would like to give all those people who said the AMD process would be superior to the Intel one a huge hug! I won't say I told you so. No need, we got dozens of sites saying I told you so. You cannot have the same process as your competitor who is on its third iteration.


So you haven't calmed down.
Anyway, I don't see the big deal. It's AMD's first FinFET 14nm 8-core CPU and it's doing great, especially in efficiency.
Clocks will improve with BIOS updates and as the process continues to mature. AMD did a fine job with the first batch of CPUs.


----------



## Jayjr1105

How are some reviews so lopsided when ones like this are neck and neck...


----------



## sugarhell

Best review right here: The Stilt's review.

Performance in some benchmarks seems weird. It probably has to do with the memory.


----------



## ToTheSun!

Quote:


> Originally Posted by *Jayjr1105*
> 
> How are some reviews so lopsided then ones like this are neck and neck...


Because that's obviously insufficient data.


----------



## aDyerSituation

Quote:


> Originally Posted by *Jayjr1105*
> 
> How are some reviews so lopsided then ones like this are neck and neck...


Probably the fact that they are pushing the GPU too hard.

A CPU review with games maxed out makes sense from the average user's perspective, but it doesn't really show true CPU performance.


----------



## ducegt

Quote:


> Originally Posted by *Jayjr1105*
> 
> How are some reviews so lopsided then ones like this are neck and neck...


The close ones have the GPU being a bottleneck. It helps to hide the difference now, but 5 fps difference today will turn into a far greater amount in the future. More is always better. This "technique" is nothing new.
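The masking effect described here can be sketched with a toy model in which the frame rate you see is capped by the slower of the CPU and GPU; all FPS numbers below are invented for illustration:

```python
# Toy bottleneck model: observed FPS is limited by the slower component.
# All figures below are invented for illustration.
def observed_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_fast, cpu_slow = 160, 120      # two CPUs' uncapped frame rates
gpu_now, gpu_future = 100, 200     # today's GPU-bound card vs. a faster one

# GPU-bound today: both CPUs produce an identical 100 FPS.
print(observed_fps(cpu_fast, gpu_now), observed_fps(cpu_slow, gpu_now))
# With a faster GPU, the CPU difference reappears: 160 vs. 120.
print(observed_fps(cpu_fast, gpu_future), observed_fps(cpu_slow, gpu_future))
```

This is the whole argument for low-resolution CPU testing: drop the GPU load until the CPU is the limiting term, and the CPU gap that a future GPU upgrade would expose becomes visible today.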


----------



## Alwrath

Quote:


> Originally Posted by *Jayjr1105*
> 
> How are some reviews so lopsided then ones like this are neck and neck...


Most likely they are not very good at displaying data properly / human error / bad setups / paid by Intel / who knows????

All I know is the Ryzen 1700 is the best 8-core gaming CPU for the money right now. Period. AMD wins! They are kicking ass and chewing bubblegum again! And we the consumers will benefit!


----------



## Jayjr1105

Quote:


> Originally Posted by *ducegt*
> 
> The close ones have the GPU being a bottleneck. It helps to hide the difference now, but 5 fps difference today will turn into a far greater amount in the future. More is always better. This "technique" is nothing new.


A GTX 1080 bottlenecked @ 1080p??


----------



## Blackops_2

Quote:


> Originally Posted by *looniam*
> 
> none of these gaming benchmarks are relevant since no one was streaming to twitch and using handbrake at the same time.
> 
> *amirite?*


Right!


----------



## Asy

Quote:


> Originally Posted by *mouacyk*
> 
> I agree AMD has something when it comes to perf/dollar, absolutely no arguments there! My point is then the 52% improvement over Bulldozer is meaningless, because Bulldozer is irrelevant. When it comes to pure performance, AMD still has nothing - try a 6950x at 4.4GHz on all 10 cores.


How much is the 6950X again??? OH RIGHT, derp.


----------



## budgetgamer120

Quote:


> Originally Posted by *CriticalOne*
> 
> I get that, but the Ryzen processors don't seem to be much better off than the i7s when it comes to CPU utilization in games.


91% load vs 70% on Ryzen and you say Ryzen is not much better?


----------



## ChronoBodi

Oh Jesus, Joker just did a video that contradicts Gamers Nexus.

Which reviewers do we trust now? We have to do independent testing here. Something is off.


----------



## AmericanLoco

Quote:


> Originally Posted by *Ultracarpet*
> 
> This crap has been happening ever since AMD shocked people with the first few leaks indicating that these chips weren't going to be so firmly in 2nd place to Intel. the R7 is a different beast to Intel's quad core chips, and yet everyone and their grandparents keeps comparing it to them for tasks that the intel quad cores pretty much live off of. The 6950x, the 6900k, and the 6800k are not as good of gaming chips as the 7700k, and no one is claiming Intel's hedt platform is useless because of it. However, for AMD that is suddenly a defining factor??? Such a joke.


Exactly, so many people seem to be forgetting the context of this chip. AMD has been specifically advertising the HEDT aspects of the R7; I haven't seen much at all from AMD about gaming on Ryzen besides a few demos. The video encoding, rendering, number crunching, etc. The performance/dollar and performance/watt of Ryzen in these applications are amazing. This will be a huge chip for AMD in the workstation and server market, which is where AMD desperately needs market share.

The R7 is not an i7 quad competitor. It's an i7 8-core competitor, and it does an amazingly good job at that. Complaining that the R7 1800X does poorly in gaming compared to a 5GHz i7-7700K is like complaining that a dump truck makes for a poor sports car. Well no ****, they're designed to do completely different things.

Anyone who thought a ~3.7GHz 8-core was going to be comparable to a 5GHz quad in gaming was just deluding themselves.
Quote:


> Originally Posted by *ducegt*
> 
> The close ones have the GPU being a bottleneck. It helps to hide the difference now, but 5 fps difference today will turn into a far greater amount in the future. More is always better. This "technique" is nothing new.


No it won't. Future games will be better threaded, Ryzen will have had driver updates and Windows patches to improve SMT performance, and programmers will be able to program better for AMD's specific SMT implementation (which is clearly different from Intel's).


----------



## ducegt

Quote:


> Originally Posted by *Jayjr1105*
> 
> a GTX1080 bottlenecked @ 1080P??


Depends on the graphics settings as well. It's a powerful card, but there is more latency from having to perform more work. Maybe they clocked it like a potato, I don't know. All the benches favoring Intel for games mirror synthetic single-core performance, so have you taken that into consideration?


----------



## Kpjoslee

Disappointing OC results aside, Ryzen is performing as expected. Perhaps it wouldn't be the disappointment that some feel right now if people had stuck with the expectations we had last year, instead of buying into the hype all those rumors created over the last couple of months.


----------



## sugarhell

It has really good rendering/video encoding and general workstation performance.
It has good gaming performance, a bit lower than Broadwell/Kaby Lake, but with a good GPU you have nothing to worry about, just like with any Intel CPU from the last 5 years.
It has awesome perf/watt.
The price is awesome.

Negatives:
Problems with memory/cache.
Unstable mobos atm.
A bit lacking in 1080p gaming performance.

It is a solid product. Intel finally has competition. What else do you want?


----------



## Vesku

Quote:


> Originally Posted by *mouacyk*
> 
> I agree AMD has something when it comes to perf/dollar, absolutely no arguments there! My point is then the 52% improvement over Bulldozer is meaningless, because Bulldozer is irrelevant. When it comes to pure performance, AMD still has nothing - try a 6950x at 4.4GHz on all 10 cores.


Will you sell me one for $320? $500?


----------



## jeffdamann

So I can't decide! I will be primarily gaming. Do I drop my 1800X preorder/backorder and use the extra cash for RAM, get a 1700 that's in stock, or wait?

Can anyone point me to some benches to help me make my decision?


----------



## Ultracarpet

Quote:


> Originally Posted by *AmericanLoco*
> 
> Exactly, so many people seem to be forgetting the context of this chip. - AMD has been specifically advertising the HEDT aspects of r7. I haven't seen much at all about gaming on Ryzen, from AMD actually besides a few demos. The video encoding, rendering, number crunching, etc... The performance/dollar and performance/watt of Ryzen in these applications are amazing. This will be a huge chip for AMD in the workstation and server market, which is where AMD desperately needs market share.
> 
> The R7 is not a i7-Quad competitor. It's an i7-8 core competitor, and it does an amazingly good job at that. Complaining that the R7-1800x does poorly in gaming compared to a 5GHz i7-7700k is like complaining that a dump-truck makes for a poor sports car. Well no ****, they're designed to do completely different things.
> 
> Anyone who thought ~3.7Ghz 8-Core was going to comparable to a 5Ghz quad in gaming were just deluding themselves.


It's not like it's even that slow for those lightly threaded tasks either. To use your own analogy, it's like they made a decent (not the fastest) track car that doubles as one of the best dump trucks on the market.

Someone put it well above: the R7 lineup has the best all-round bang-for-buck 8-core chips available, gaming or not. Case closed.


----------



## lombardsoup

Quote:


> Originally Posted by *ChronoBodi*
> 
> Oh jesus, joker just did a video that contradicts gamer nexus.
> 
> What reviewers do we trust now? *We have to do independent testing here*. Something is off.


This. Way too much misinformation out there right now.


----------



## budgetgamer120

Quote:


> Originally Posted by *ducegt*
> 
> The close ones have the GPU being a bottleneck. It helps to hide the difference now, but 5 fps difference today will turn into a far greater amount in the future. More is always better. This "technique" is nothing new.


Do you think before you post?

How does 5 FPS in favor of the i7-7700K turn into more tomorrow, when the 7700K is already being used to its full potential?

The 1700 is the CPU that is under-utilized. It will have the upper hand tomorrow.

Man, Google or something before you post.


----------



## LancerVI

Quote:


> Originally Posted by *ducegt*
> 
> Depends on the graphic settings as well. It's a powerful card, but there is more latency from having to preform more things. Maybe they clocked it like a potato, I don't know. All the benches favoring Intel for games mirror synthetic single core performance so have you taken that into consideration?


As Ricky Gervais might say, "STOP TALKING ****"

Because that's complete nonsense.

Intel's own HEDT platforms lag behind the "gaming" CPUs. They always have. Do you think the HEDT platform sucks too???

I certainly don't. I'm running one. I can game at ultra @ 2560x1080 and it works great, but I'm not pigeonholed into gaming.

The 1700 seems to me to be the best all-around proc on the market RIGHT NOW.


----------



## Jayjr1105

Quote:


> Originally Posted by *sugarhell*
> 
> Negatives:
> Problems with memory/cache.
> Unstable mobos atm
> A bit lacking 1080p gaming performance
> 
> It is a solid product. Intel finally has competition. What else you want?


All of the negatives will be easily fixed in a month or so.


----------



## kd5151

Quote:


> Originally Posted by *zGunBLADEz*
> 
> we need more raw data results joker video do a big talk here i dont trust this reviewers XD


Techno-kitchen does; lots of raw data stuff.

https://m.youtube.com/user/irkutsk2mailru


----------



## FatalProximity

Sorry for my ignorant post because I'm at work and unable to read reviews.

If my friend is planning to build a new gaming PC, will he get better performance/price with Ryzen or should he stick with Intel? I'm not up to date on the gaming performance of Ryzen.


----------



## sugarhell

Quote:


> Originally Posted by *Jayjr1105*
> 
> All of the negatives will be easily fixed in a month or so.


Maybe there will be solutions. Probably the windows scheduler needs some love too.


----------



## CULLEN

Quote:


> Originally Posted by *jeffdamann*
> 
> So I can't decide! I will be primarily gaming. Drop my 1800X preorder/backorder and use the extra cash for RAM, get a 1700 that's in stock, or wait?
> 
> Can anyone point me to some benches to help me make my decision?


Get the 1700 (non X), find from any of the reviewers that got good results what motherboard and memory they have, and probably consider that a bit better.


----------



## CriticalOne

Quote:


> Originally Posted by *budgetgamer120*
> 
> 91% load vs 70% on Ryzen and you say Ryzen is not much better?


The opportunity cost is too much, in my opinion. For that 20% reduction in load I have to give up 20% or more of game performance to begin with. That implicit cost, along with the fact that upgrading to Ryzen would cost me at least $500, leaves me wanting when I look at the benchmarks.


----------



## Jayjr1105

Quote:


> Originally Posted by *FatalProximity*
> 
> Sorry for my ignorant post because I'm at work and unable to read reviews.
> 
> If my friend is planning to build a new gaming PC, will he get better performance/price with Ryzen or should he stick with Intel? I'm not up to date on the gaming performance of Ryzen.


Two things, short and sweet:

1. Wait for Intel price drops, which are likely to come.

2. Wait for Ryzen 5 (even more bang for the buck to come).


----------



## budgetgamer120

Quote:


> Originally Posted by *FatalProximity*
> 
> Sorry for my ignorant post because I'm at work and unable to read reviews.
> 
> If my friend is planning to build a new gaming PC, will he get better performance/price with Ryzen or should he stick with Intel? I'm not up to date on the gaming performance of Ryzen.


It comes down to: do you want a prosumer PC or a gaming PC for the same price? Clearly the Ryzen CPU is the better all-around performer, while an i7 quad will give you better gaming performance and do worse in everything else.

So your friend has to decide what is important.


----------



## Freakydude

What is that? The HYPE train screeching to a halt? AMD and all the phony leaks built this chip up, and now comes the end of the sugar rush.
Everybody's expectations are coming back to earth.
For me, Ryzen did what I wanted it to: just enough pressure to shake Intel out of its comfort zone.
Nvidia, on the other hand, has put the boots to Vega, and now Vega will not get the chance to even come close; it's win or probably die.


----------



## Ultracarpet

Quote:


> Originally Posted by *CriticalOne*
> 
> The opportunity cost is too much, in my opinion. For that 20% reduction in load I have to give up 20% or more of game performance to begin with. That implicit cost along with the fact that upgrading to Ryzen would cost me at least $500 leaves me wanting when I look at the benchmarks.


Here is something for you to consider. If you have a 7700k, and you find out you would benefit from having more cores, would you go Intel HEDT now, or would you go Ryzen? This was never a question of choosing a 7700k or Ryzen R7.


----------



## budgetgamer120

Quote:


> Originally Posted by *Jayjr1105*
> 
> Two things, short and sweet:
> 
> 1. Wait for Intel price drops, which are likely to come.
> 
> 2. Wait for Ryzen 5 (even more bang for the buck to come).


I am not convinced Ryzen 5 will be any better at gaming... unless it is different silicon.

Intel does not have to drop prices; look at the reaction in this thread. It says it all: AMD failed.


----------



## Kpjoslee

Quote:


> Originally Posted by *jeffdamann*
> 
> So I can't decide! I will be primarily gaming. Drop my 1800X preorder/backorder and use the extra cash for RAM, get a 1700 that's in stock, or wait?
> 
> Can anyone point me to some benches to help me make my decision?


ATM, it is hard to justify the 1800X over the 1700. You are looking at maybe a 100 MHz difference at most between those two after OC.


----------



## AmericanLoco

Quote:


> Originally Posted by *budgetgamer120*
> 
> Do you think before you post?
> 
> How does a 5 fps lead for the i7-7700K turn into more tomorrow, when the 7700K is already being used to its full potential?
> 
> The 1700 is the CPU that is under-utilized. It will have the upper hand tomorrow.
> 
> Man, Google it or something before you post.


He's some kind of troll/shill. His average post count is less than 0.5 posts/day, but he has over 30 posts in various Ryzen threads from just today, and every single one of them is negative. He's constantly comparing Ryzen to a 5 GHz i7-7700K.


----------



## Hueristic

Quote:


> Originally Posted by *Master__Shake*
> 
> it's hard to troll if people can't see you


Yup, been a member forever and still using caps; every post is an instant ignore.

Quote:


> Originally Posted by *DADDYDC650*
> 
> Haven't heard of a 1700 not being able to hit 4Ghz YET.


So who else called it? What do all the little TDP kids have to say now?


----------



## CriticalOne

Quote:


> Originally Posted by *Ultracarpet*
> 
> Here is something for you to consider. If you have a 7700k, and you find out you would benefit from having more cores, would you go Intel HEDT now, or would you go Ryzen? This was never a question of choosing a 7700k or Ryzen R7.


Personally I was looking at upgrading to a 4790K with my LGA 1150 system or going Ryzen.


----------



## amstech

Quote:


> Originally Posted by *budgetgamer120*
> 
> Yes, 'cause I still play Far Cry 2 and those other dinosaur games


Why so mean and sarcastic?
I don't care what you play; I love playing older games. Any real PC gamer knows it's about more than graphics and trying to look good.
And yes, those results apply to newer games as well; it's just the most recent comparison I have.


----------



## budgetgamer120

Quote:


> Originally Posted by *amstech*
> 
> Why so mean and sarcastic?
> I don't care what you play; I love playing older games. Any real PC gamer knows it's about more than graphics and trying to look good.
> And yes, those results apply to newer games as well.


Those results don't carry over into new games. So please.

Besides, if all you play are old games, what sort of performance improvement are you looking for?


----------



## motoray

Do we have a dedicated 1700 overclocking thread? Trying to see if it's the one to buy. Huge savings.


----------



## Hueristic

Quote:


> Originally Posted by *amstech*
> 
> Why so mean and sarcastic?
> I don't care what you play; I love playing older games. Any real PC gamer knows it's about more than graphics and trying to look good.
> And yes, those results apply to newer games as well.


Then you'd better up the textures on your avy!

OK, back to ten pages ago. Lol


----------



## Pro3ootector

Just as the R7 competes with Broadwell now, the R5 will compete with Kaby Lake. AMD is back in the game.


----------



## ducegt

Quote:


> Originally Posted by *Jayjr1105*
> 
> All of the negatives will be easily fixed in a month or so.


So easily fixed that they couldn't have been fixed in the 5-plus years they have been creating Zen? I can understand your optimism, but you really should say something like, "I hope they can fix those things." You, along with others, act like you have a crystal ball showing that the future is very different from the past... And I hope you are right, but I'll predict you're wrong. Accept Zen for what it is.


----------



## ZealotKi11er

There must be a reason for the poor gaming performance. It's beyond terrible. It's slower than Ivy.


----------



## jeffdamann

Quote:


> Originally Posted by *Kpjoslee*
> 
> ATM, it is hard to justify the 1800X over the 1700. You are looking at maybe a 100 MHz difference at most between those two after OC.


I can't find any good overclocking reviews; is this possibly an issue with motherboards or something? Is it possible that in a few months we will be seeing a 500 MHz difference between the two after OC?


----------



## Ultracarpet

Honestly, the biggest surprise about Ryzen is the people who are suddenly considering it.

Mid-summer, really, the only people considering Zen (what it was called then) were people on Bulldozer-lineage setups or Nehalem/Sandy Bridge chips, or without rigs at all. Over these past few months the leaks started to indicate that performance was better than expected, and suddenly there was a shift. There were then people with Ivy and Haswell setups considering making a purchase, which spiraled into comparisons to Skylake and even Kaby.

Ryzen was never meant to be an all-round upgrade for people on recent Intel chips. The fact that they have brought near parity in competition to the HEDT line is insane, and for people looking to make the jump from Intel's mainstream lineup to the HEDT lineup, AMD is suddenly something to consider.

We will get to see what impact the R5 and R3 chips have on the mainstream side, but I believe most demand for those chips will come from people stuck on older AMD systems or very old Intel systems.


----------



## Jayjr1105

Quote:


> Originally Posted by *ducegt*
> 
> So easily fixed that they couldn't have been fixed in the 5-plus years they have been creating Zen? I can understand your optimism, but you really should say something like, "I hope they can fix those things." You, along with others, act like you have a crystal ball showing that the future is very different from the past... And I hope you are right, but I'll predict you're wrong. Accept Zen for what it is.


You're right, they probably just say "eh screw it, it doesn't need any updates, bug fixes, or improvements"


----------



## Ultracarpet

Quote:


> Originally Posted by *CriticalOne*
> 
> Personally I was looking at upgrading to a 4790K with my LGA 1150 system or going Ryzen.


Then you should either wait for the R5/R3 chips, or just go with the 4790K if all you do is game. If Intel's HEDT platform was never a consideration for you, the R7 shouldn't be either.


----------



## CriticalOne

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There must be a reason for the poor gaming performance. It's beyond terrible. It's slower than Ivy.


This is what really has me intrigued. Ryzen is only marginally slower than Broadwell, but it's getting destroyed by Broadwell-E in games, despite having a clock rate advantage.

Something's not adding up. Maybe there is an erratum in SMT.


----------



## PostalTwinkie

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There must be a reason for the poor gaming performance. It's beyond terrible. It's slower than Ivy.


Maybe the RAM issues run deeper than just not supporting higher speeds? Maybe there is some other issue causing higher latency, impacting gaming results.

It will be fun to see what SR5 looks like.


----------



## Ultracarpet

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Maybe the RAM issues run deeper than just not supporting higher speeds? Maybe there is some other issue causing higher latency, impacting gaming results.
> 
> It will be fun to see what SR5 looks like.


Kinda looked like AMD's version of hyperthreading was also causing some dips in performance for some games.


----------



## Quantum Reality

Quote:


> Asus Crosshair VI https://arstechnica.co.uk/gadgets/2017/03/amd-ryzen-review/


"... but gamers should look elsewhere"???

Are you joking, Ars Technica? Your own benches show that the Ryzen 1800X you got is within 10% or so of the 7700K in games.


----------



## tygeezy

Quote:


> Originally Posted by *zGunBLADEz*
> 
> We need frame pacing tests for the 7700K (à la the 7970 era XD); I just LOLed so hard seeing smooth AMD and a stutter/microstutter fest from Intel
> https://www.youtube.com/watch?v=BXVIPo_qbc4


Looking at the frametimes briefly, from what I saw Intel was consistently lower. Both were very stable in this video; granted, I didn't sit and watch the entire thing. Can you please point out a sequence where Intel struggles with consistent frametimes and AMD is doing fine?


----------



## BinaryDemon

Quote:


> Originally Posted by *FatalProximity*
> 
> Sorry for my ignorant post because I'm at work and unable to read reviews.
> 
> If my friend is planning to build a new gaming PC, will he get better performance/price with Ryzen or should he stick with Intel? I'm not up to date on the gaming performance of Ryzen.


Your friend probably shouldn't be looking at R7s for best price/performance. I would guess the R5 1600X will offer basically the same gaming performance but cheaper than any of these R7s. Obviously, if you aren't afraid to overclock and want a good deal, then the R5 1500 or R5 1300 would start looking like a good price/performance value. If you want i7-7700K performance, then buy that.


----------



## Kyube

Are there any overclocking attempts with some cores disabled (making it a 6-core or 4-core) on any R7 chip? Just to check out overclocking possibilities on the 6-core & 4-core variants.
A 1700 + B350 seems like a sweet build to make c:


----------



## ryboto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There must be a reason for the poor gaming performance. It's beyond terrible. It's slower than Ivy.


Answer:
Quote:


> [-]AMD_LisaSuCEO of AMD 451 points 2 hours ago
> Thanks for the question. In general, we've seen great performance from SMT in applications and benchmarks but there are some games that are using code optimized for our competitor... we are confident that we can work through these issues with the game developers who are actively engaging with our engineering teams.


https://www.reddit.com/r/Amd/comments/5x4hxu/we_are_amd_creators_of_athlon_radeon_and_other/def5iab/

They say that, plus I heard something about the Windows driver in the Legitreviews review.
Quote:


> AMD firmly believes that they can improve gaming performance with Ryzen optimizations as all the games we tested with were optimized on Intel, so they feel the testing is one sided right now. They also have a Windows Driver coming in approximately one month that will help performance as the Windows High Precision Event Timer (HPET) isn't playing nice with the SenseMI sensors that poll the CPU status every millisecond.
> Read more at http://www.legitreviews.com/amd-ryzen-7-1800x-1700x-and-1700-processor-review_191753/15#t4rjzQpVwY4DAP47.99


----------



## PostalTwinkie

Quote:


> Originally Posted by *Ultracarpet*
> 
> Kinda looked like AMD's version of hyperthreading was also causing some dips in performance for some games.


Yea, that was mentioned as well and I forgot so thanks for pointing it out again.

This is all WAAAAYYYYY above my head, but possible microcode issues? Again going back to the memory issue being deeper.

Ignorant guessing on my part.


----------



## ducegt

Quote:


> Originally Posted by *Jayjr1105*
> 
> You're right, they probably just say "eh screw it, it doesn't need any updates, bug fixes, or improvements"


So by that logic, Intel is doing the same... happy? They are just waiting for AMD to act first, who are working day and night on "easy and massive" fixes/improvements to something that has taken many years and billions of dollars to develop. Makes sense.


----------



## Ultracarpet

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Yea, that was mentioned as well and I forgot so thanks for pointing it out again.
> 
> This is all WAAAAYYYYY above my head, but possible micro-code issues? Again going back to the memory issue being deeper.
> 
> Ignorant guessing on my part.


I honestly just can't wait for users here to get the chips in their hands and we can start messing around with stuff.


----------



## PostalTwinkie

Quote:


> Originally Posted by *ryboto*
> 
> Answer:
> https://www.reddit.com/r/Amd/comments/5x4hxu/we_are_amd_creators_of_athlon_radeon_and_other/def5iab/
> 
> They say that, plus I heard something about the Windows driver in the Legitreviews review.


@ryan92084

Can we get this put in the OP?


----------



## CULLEN

Quote:


> Originally Posted by *Freakydude*
> 
> What is that? The HYPE train screeching to a halt? AMD and all the phony leaks built this chip up, and now comes the end of the sugar rush.
> Everybody's expectations are coming back to earth.
> For me, Ryzen did what I wanted it to: just enough pressure to shake Intel out of its comfort zone.
> Nvidia, on the other hand, has put the boots to Vega, and now Vega will not get the chance to even come close; it's win or probably die.


_Read with David Attenborough voice_

The most extraordinary thing about the fanboy is his will to ignore great things happening all around him. He might have simple, yet small victories, but he will never back down and accept that he really isn't the king in all aspects. They do come in many different colors, but the most commonly seen are usually green, red or blue.

However, they all share blind faith and embrace a fruit which ultimately does them no good - it's extraordinary.

On a serious note, all I can think about..
Quote:


> Originally Posted by *SoloCamo*
> 
> ITT: People only look at 7700K gaming performance and claim superiority as a whole. The 6900K being $1000 and not much faster than (or equal to) the $329-$500 AMD CPU in multithreaded workloads is seemingly also ignored.


Freakydude, you might not know it yet, but you and I and all the consumers out there are the true winners.


----------



## PostalTwinkie

Quote:


> Originally Posted by *CULLEN*
> 
> _Read with David Attenborough voice_
> 
> The most extraordinary thing about the fanboy is his will to ignore great things happening all around him. He might have simple, yet small victories, but he will never back down and accept that he really isn't the king in all aspects. They do come in many different colors, but the most commonly seen are usually green, red or blue.
> 
> However, they all share blind faith and embrace a fruit which ultimately does them no good - it's extraordinary.
> 
> On a serious note, all I can think about..
> You might not know it yet, but you and I and all the consumers out there are the true winners.


It is pretty sad to see an 'enthusiast' community spitting in the face of a clear success. Honestly I think a lot of the people here have only grown up in the Intel era, not having been born or old enough when Intel was chasing AMD and there was true competition. So they feel like their childhood is being assaulted and are going to defend their memories.


----------



## rv8000

Man, is this thread entertaining; the e-peen from both extremes is ruining it in typical OCN fashion.


----------



## Xuper

Quote:


> Originally Posted by *PostalTwinkie*
> 
> It is pretty sad to see an 'enthusiast' community spitting in the face of a clear success. Honestly I think a lot of the people here have only grown up in the Intel era, not having been born or old enough when Intel was chasing AMD and there was true competition. So they feel like their childhood is being assaulted and are going to defend their memories.


Pretty Much.


----------



## Mand12

Quote:


> Originally Posted by *Wishmaker*
> 
> PLEASE STOP WITH THE HYPE BECAUSE PEOPLE SPEND MONEY BASED ON TWISTED STATEMENTS!


Can the people who are excited really be blamed for others preordering without waiting for benchmarks?

Isn't that something we criticize people for doing in games all the time?


----------



## zGunBLADEz

Quote:


> Originally Posted by *tygeezy*
> 
> Looking at the frametimes briefly, from what I saw Intel was consistently lower. Both were very stable in this video; granted, I didn't sit and watch the entire thing. Can you please point out a sequence where Intel struggles with consistent frametimes and AMD is doing fine?


It's all over the video; watch the Intel side closely, it shows microstutters.


----------



## Kpjoslee

Quote:


> Originally Posted by *jeffdamann*
> 
> I can't find any good overclocking reviews; is this possibly an issue with motherboards or something? Is it possible that in a few months we will be seeing a 500 MHz difference between the two after OC?


I really doubt that would be the case. I would just go with 1700 right now and maybe shoot for the glory with Zen+.


----------



## CriticalOne

I don't get why people expecting more are getting so much criticism.

Ryzen leaks confirmed Broadwell-E IPC multiple times, and even post-launch benchmarks confirm it. Ryzen processors even have a clock rate advantage over Broadwell-E. Multiple multithreaded tests showed the 1800X either matching or beating the 6900K. For all intents and purposes, Ryzen is interchangeable with Broadwell-E.

Then you look at game benchmarks and there is a massive gulf between the 1800X and the 6900K.

Something isn't adding up here; according to what we know, Ryzen should be performing much, much faster in games.
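The mental model behind that argument is the naive single-thread estimate perf ≈ IPC × clock. A minimal sketch with made-up, purely illustrative numbers (the function and its inputs are assumptions, not measured data):

```python
def expected_ratio(ipc_a, clock_ghz_a, ipc_b, clock_ghz_b):
    """Naive single-thread performance model: perf = IPC * clock.
    Real games deviate from this when memory latency, cache topology,
    or SMT scheduling dominate -- which is exactly the puzzle here."""
    return (ipc_a * clock_ghz_a) / (ipc_b * clock_ghz_b)

# Illustrative only: equal (Broadwell-E-class) IPC, a 4.0 GHz Ryzen OC
# vs. a 3.7 GHz Broadwell-E OC -- on paper Ryzen should be ~8% ahead.
ratio = expected_ratio(1.0, 4.0, 1.0, 3.7)
print(round(ratio, 3))  # 1.081
```

When the measured game results come out well below a ratio like this, the difference has to be explained by something outside the model, which is why the thread keeps circling back to memory latency and SMT.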


----------



## Blackops_2

Quote:


> Originally Posted by *PostalTwinkie*
> 
> It is pretty sad to see an 'enthusiast' community spitting in the face of a clear success. Honestly I think a lot of the people here have only grown up in the Intel era, not having been born or old enough when Intel was chasing AMD and there was true competition. So they feel like their childhood is being assaulted and are going to defend their memories.


I think his response was making fun of a supposed or alleged Intel fanboy, not the other way around. But yeah, I couldn't agree more. Like I said, people forget it wasn't that long ago that CPUs were capping at 4 GHz, and that was a great clock.


----------



## ryan92084

Quote:


> Originally Posted by *PostalTwinkie*
> 
> @ryan92084
> 
> Can we get this put in the OP?


Added the AMA to the misc. Good call, I totally forgot about it.


----------



## CULLEN

Quote:


> Originally Posted by *PostalTwinkie*
> 
> It is pretty sad to see an 'enthusiast' community spitting in the face of a clear success. Honestly I think a lot of the people here have only grown up in the Intel era, not having been born or old enough when Intel was chasing AMD and there was true competition. So they feel like their childhood is being assaulted and are going to defend their memories.


Wow.. I had never thought about it that way, I think you really might be onto something here.


----------



## Wishmaker

Quote:


> Originally Posted by *Quantum Reality*
> 
> "... but gamers should look elsewhere"???
> 
> Are you joking, Ars Technica? Your own benches show that the Ryzen 1800X you got is within 10% or so of the 7700K in games.


Quite a few reviews are saying that gamers should go for Intel, not just Ars Technica. In this day and age, Intel has pulled an Nvidia on AMD and is charging top dollar to have the highest performance. This is what happens when your competitors are lagging and do not manage to deliver products that sell. No capital, no R&D, no top performance.

In a galaxy far, far away, the gaming market had room for just one series of chips, and that was the FX. You wanted the best of the best? You bought the FX series.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Kpjoslee*
> 
> I really doubt that would be the case. I would just go with 1700 right now and maybe shoot for the glory with Zen+.


As is typically the case, if you are going to OC it is usually a lot more cost-effective to get the entry processor of a given stack: the 1700 instead of the 1800X, the 'old' 8320 instead of the 9590, etc.


----------



## Quantum Reality

Quote:


> Asus Crosshair VI Linux https://www.servethehome.com/amd-ryzen-7-1700x-linux-benchmarks/


Looks like Ryzen just blows the doors off expensive Intel server CPUs! Sounds like the Linux-based server market will get a real shake-up in the next few months, especially if AMD can get motherboard manufacturers to put out EATX dual-CPU versions for heavily multithreaded applications.


----------



## jeffdamann

Quote:


> Originally Posted by *Kpjoslee*
> 
> I really doubt that would be the case. I would just go with 1700 right now and maybe shoot for the glory with Zen+.


Done and done; if things change I will just sell the 1700 and go 1800X.

Forgive my noobishness; although I am very experienced, I have never been in on the front end of a new arch.


----------



## Mand12

Quote:


> Originally Posted by *PostalTwinkie*
> 
> It is pretty sad to see an 'enthusiast' community spitting in the face of a clear success. Honestly I think a lot of the people here have only grown up in the Intel era, not having been born or old enough when Intel was chasing AMD and there was true competition. So they feel like their childhood is being assaulted and are going to defend their memories.


Yeah, I'm really not sure what the benefit of the extreme tribalism is, either positive or negative.

For me, this is a big success, because I remember the days when I used to shop between Intel and AMD CPUs for gaming rigs. Even though I have pretty much ruled out Ryzen for a potential gaming upgrade, the fact that their chip is as good as it is in other areas bodes well for expanded offerings in the future.

The "other side" doing well is good for your favorite, too. That's what people don't understand.

And yes, I realize how odd that statement will read to the people who jumped on me for being critical of AMD in the past. Here's the deal, though: I want AMD to succeed by being awesome and delivering a quality product, as they have with Ryzen. I don't want them to merely avoid extinction by lying to us about both their products and others'.

This is a clear success for AMD. As good as could have been expected, and probably more than that.


----------



## ducegt

Quote:


> Originally Posted by *PostalTwinkie*
> 
> It is pretty sad to see an 'enthusiast' community spitting in the face of a clear success. Honestly I think a lot of the people here have only grown up in the Intel era, not having been born or old enough when Intel was chasing AMD and there was true competition. So they feel like their childhood is being assaulted and are going to defend their memories.


And it's the opposite for those of us who grew up on Pentiums, Athlon, Athlon 64. We love to support AMD even when it's inferior for our needs. I know I do with GPUs, but I went with the 7700K because it meets my needs better. My financial allowance is also a lot bigger now than back then.


----------



## ZealotKi11er

Quote:


> Originally Posted by *ryboto*
> 
> Answer:
> https://www.reddit.com/r/Amd/comments/5x4hxu/we_are_amd_creators_of_athlon_radeon_and_other/def5iab/
> 
> They say that, plus I heard something about the Windows driver in the Legitreviews review.


Yeah, those are probably the main reasons. The 3770K will last me even longer. It's not like I have to upgrade to Intel.


----------



## Quantum Reality

Quote:


> Originally Posted by *Wishmaker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Quantum Reality*
> 
> "... but gamers should look elsewhere"???
> 
> Are you joking, Ars Technica? Your own benches show that the Ryzen 1800X you got is within 10% or so of the 7700K in games.
> 
> Quite a few reviews are saying that gamers should go for Intel, not just Ars Technica. In this day and age, Intel has pulled an Nvidia on AMD and is charging top dollar to have the highest performance. This is what happens when your competitors are lagging and do not manage to deliver products that sell. No capital, no R&D, no top performance.
> 
> In a galaxy far, far away, the gaming market had room for just one series of chips, and that was the FX. You wanted the best of the best? You bought the FX series.

I'm just rolling my eyes at the whole "Well AMD is NO GOOD AT ALL for gaming" emanating from some reviews. It's like they can't bring themselves to admit AMD has brought a competitive CPU to the table for about half the cost of Intel's top flagship offerings that get 10% or so extra performance in game framerates.


----------



## orlfman

I don't know if this has been posted yet (way too many pages to sort through), but reading http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951-6.html shows that there may be scheduling and power management issues that could be hurting Zen's performance in gaming. Many of the benchmarks show increases when SMT is turned off and the High Performance power plan is used rather than Balanced. Granted, even in those scenarios Zen typically still trails behind Broadwell-E.

I agree that comparing Zen to Skylake and beyond (6700K / 7700K) is pointless, as its competitor has always been Broadwell / X99. Every Zen vs. mainstream-Skylake-and-above comparison should end with the same conclusion and tone as X99 vs. mainstream: you go mainstream for gaming. Don't ridicule beyond that.

Zen is a very strong competitor against Broadwell-E in workstation tasks. It offers very strong floating-point performance, but it appears its integer performance lags behind Broadwell-E. Couple that with what appear to be power management and SMT scheduling issues, and it makes for a weak contender against Broadwell-E in gaming, since gaming relies more heavily on integer than floating point.
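For anyone poking at the SMT-scheduling angle: on Linux you can see which logical CPUs share a physical core via `/sys/devices/system/cpu/cpu*/topology/thread_siblings_list`. A minimal sketch of grouping those entries into physical cores; the sibling strings below are hardcoded sample data (an assumed 4-core/8-thread layout), not read from a live system:

```python
def parse_siblings(sibling_lists):
    """Group logical CPU ids into physical cores.

    Each entry mirrors one thread_siblings_list file: "0,4" for an
    SMT pair, "0-1" as a range, or "3" when SMT is off. Duplicate
    entries (one per logical CPU) collapse into one core."""
    cores = set()
    for entry in sibling_lists:
        ids = tuple(sorted(int(x) for x in entry.replace("-", ",").split(",")))
        cores.add(ids)
    return sorted(cores)

# Hypothetical 8-thread chip where sibling ids are offset by 4
sample = ["0,4", "1,5", "2,6", "3,7", "0,4", "1,5", "2,6", "3,7"]
print(parse_siblings(sample))  # [(0, 4), (1, 5), (2, 6), (3, 7)]
```

If a scheduler keeps co-locating two busy game threads on one of those sibling pairs instead of spreading them across physical cores, you get exactly the kind of dips the reviews describe.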


----------



## jeffdamann

One last question, since I can't find it in reviews. I am an extremely competent overclocker. Can I achieve 4.0 GHz on a 1700 with a Noctua D15 and low ambients?


----------



## amstech

Quote:


> Originally Posted by *budgetgamer120*
> 
> Those results don't carry into new games. So please


Yeah, it's still right on par with a 3770K if the clock speeds are equal.
Isn't that great?


----------



## budgetgamer120

Quote:


> Originally Posted by *ryboto*
> 
> Answer:
> https://www.reddit.com/r/Amd/comments/5x4hxu/we_are_amd_creators_of_athlon_radeon_and_other/def5iab/
> 
> They say that, plus I heard something about the Windows driver in the Legitreviews review.


Interesting to see the turn of events.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Quantum Reality*
> 
> Looks like Ryzen just blows the doors off expensive Intel server CPUs! Sounds like the Linux-based server market will get a real shake-up in the next few months, especially if AMD can get motherboard manufacturers to put out EATX dual-CPU versions for heavily multithreaded applications.


Holy Hell.......

Intel is going to flip cookies if this translates over to Naples.


----------



## tygeezy

Quote:


> Originally Posted by *zGunBLADEz*
> 
> It's all over the video; watch the Intel side closely, it shows microstutters.


Those microstutters would show up in the frametimes. I watched a couple of minutes of it and the frametimes were generally better with Intel; there were no frametime spikes from what I saw.

So if you could point out a sequence, that would be excellent.
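Microstutter is easier to argue about with numbers than with eyeballed overlays: spikes show up as frames taking far longer than the median, and reviewers typically report percentiles. A minimal sketch of that kind of analysis; the frametime list and the "spike = more than 2x the median" threshold are my own illustrative assumptions:

```python
def frametime_stats(frametimes_ms):
    """Summarize frame pacing: mean, (approximate) 99th-percentile
    frametime, and the number of spike frames, i.e., frames taking
    more than twice the median -- a crude proxy for visible stutter."""
    xs = sorted(frametimes_ms)
    n = len(xs)
    median = xs[n // 2]
    p99 = xs[min(n - 1, int(n * 0.99))]
    spikes = sum(1 for t in frametimes_ms if t > 2 * median)
    return {"mean": sum(xs) / n, "p99": p99, "spikes": spikes}

# Hypothetical capture: steady ~60 fps (16.7 ms) with two stutter frames
stats = frametime_stats([16.7] * 98 + [40.0, 45.0])
print(stats["spikes"], stats["p99"])  # 2 45.0
```

Two runs can have nearly identical average fps while one has a much worse 99th percentile and more spikes, which is exactly the "smooth vs. microstutter" distinction being argued over here.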


----------



## ducegt

Quote:


> Originally Posted by *Quantum Reality*
> 
> Looks like Ryzen just blows the doors off expensive Intel server CPUs! Sounds like the Linux-based server market will get a real shake-up in the next few months, especially if AMD can get motherboard manufacturers to put out EATX dual-CPU versions for heavily multithreaded applications.


Enterprises won't change that fast overnight. There are "the more you spend, the bigger the discount" agreements that Intel has to hold its position in the market. You know how loyal personal consumers are? It's even more extreme in enterprise environments. I deal with this every day at my job.


----------



## GorillaSceptre

So, as expected, despite the fact that Ryzen is clearly a success... we have over 70 pages of people trying to justify their CPUs.

Sorry, but if you really thought a lower-clocked 8-core would beat your recent 4-core in _*games*_, then an 8-core is beyond your uses in the first place.

*Look what this thing does to the value of a $1100 6900K.* Yeah, terrible job, AMD, you've only given people the opportunity to buy $330 chips that can compete with a $1000 one. What a fail.

Someone should let Intel know what a failure their HEDT platform is; do those dummies know an i5 beats the 6950X in games??


----------



## kd5151

tomorrow i am going to wake up with a ryzen hangover.


----------



## ducegt

Quote:


> Originally Posted by *kd5151*
> 
> tomorrow i am going to wake up with a ryzen hangover.


Not as good as ryzen wood.


----------



## sugarhell

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Holy Hell.......
> 
> Intel is going to flip cookies if this translates over to Naples.


Wow, for real, this is unreal performance on Linux.

Now I am excited for Naples


----------



## PostalTwinkie

Quote:


> Originally Posted by *ducegt*
> 
> Enterprises won't change overnight. Intel holds its position in the market with "the more you spend, the bigger the discount" agreements. You see how loyal personal consumers are? It's even more extreme in enterprise environments. I deal with this every day at my job.


I just left the IT world after a couple of decades.

Data Centers will absolutely scoop up Naples if these sort of results translate to it.


----------



## Wishmaker

Even with the scheduler changes and power management tweaks, we are forgetting something: Skylake-E is around the corner. Intel's best and latest will bring certain improvements over the current generation, and AMD will not have Zen+ ready to counter. So AMD is pretty much a generation behind, and Zen+ will have to face Cannonlake.


----------



## AuraNova

Quote:


> Originally Posted by *kd5151*
> 
> tomorrow i am going to wake up with a ryzen hangover.


My head is already spinning from reading and watching all these reviews, let alone reading this thread.


----------



## ZealotKi11er

Quote:


> Originally Posted by *GorillaSceptre*
> 
> So, as expected, despite the fact that Ryzen is clearly a success... we have over 70 pages of people trying to justify their CPU's..
> 
> Sorry, but if you really thought a lower clocked 8 core would beat your recent 4 core in _*games*_, then an 8 core is beyond your uses in the first place..
> 
> *Look what this thing does to the value of a $1100 6900K*.. Yeah, terrible job AMD, you've only given people the opportunity to buy $330 chips that can compete with a $1000 one.. What a fail.
> 
> Someone should let Intel know what a failure their HEDT platform is, do those dummies know an i5 beats the 6950X in games??


My 3770K is 5 years old. I want to upgrade, but AMD does not want me to upgrade. Yes, a 1700 @ 3GHz is not going to beat a 4.6GHz i5/i7, lol. The problem is that something like the 1800X is not even a 4.0GHz CPU.


----------



## CriticalOne

Quote:


> Originally Posted by *Ultracarpet*
> 
> Then you should either wait for the r5/r3 chips, or just go with the 4790k if all you do is game. If Intel's HEDT platform was never a consideration for you, the r7 shouldn't be either.


Intel's HEDT platform wasn't in contention due to price. I'm not trying to pay $1,000 for an eight-core or $200 for a motherboard. I do CAD and video editing alongside gaming, so I need strong multithreaded performance.

The way Ryzen was presented, I could get a 1700 for $320, the same price as an i7 or whatever, and get more or less the same performance in games while having a little under twice the multithreaded performance. AMD gave multiple presentations showing a Ryzen processor and an Intel 6900K getting equal performance in games, which made me excited. Imagine my shock when I check gaming benchmarks and see a 7700K with no hyperthreading advantage beating an 1800X.

Along with that, there was a sizable faction on this forum declaring quad cores dead or not good enough, and saying how foolish it would be to choose a 6700K/7700K over a Ryzen processor because games can now use 8 threads well, and so on. It wasn't AMD, but there were a lot of people hyping Ryzen up as some sort of Intel killer.


----------



## Wishmaker

Speaking of tweaks, can people confirm that this is the second time AMD has needed such changes? I recall the BD era had a similar scenario where Windows tweaks were needed for a performance bump; a bump which wasn't enough to hang with the top range from the direct competitor.


----------



## ryboto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah that is probably the main reasons. 3770K will last me even longer. Its not like I have to upgrade to Intel.


Yea...I really wanted a good reason to ditch the 3570k...when Vega pops up, I still may go all AMD...by then who knows what the state of Ryzen performance might be..? Though it all hinges on AM4 performance mITX boards being available...


----------



## sugarhell

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My 3770K is 5 years old. I want to upgrade but AMD does not want me to upgrade. Yes 1700 @ 3GHz is not going to beat 4.6GHz i5/i7 lol. The problem is something like 1800X is not even 4.0GHz CPU.


Upgrade for what?

An 8-core is mainly focused on workstations, not gaming.

If you just want it for gaming, get a 7700K or wait for the R5? More logical...

You can always get a 6900K; I have one and it is awesome!


----------



## tpi2007

Quote:


> Originally Posted by *Quantum Reality*
> 
> Quote:
> 
> 
> 
> Asus Crosshair VI Linux https://www.servethehome.com/amd-ryzen-7-1700x-linux-benchmarks/
> 
> Looks like Ryzen just blows the doors off expensive Intel server CPUs! Sounds like the Linux-based server market will get a real shake-up in the next few months, especially if AMD can get motherboard manufacturers to put out EATX dual-CPU versions for heavily multithreaded applications.

Yep, the Tom's Hardware review pointed in the same direction; this is going to be interesting. It's not surprising, really: AMD lost the most market share in that department, down to 0.4%, so it's an uphill battle, and it seems Ryzen is focused on storming that lucrative segment.


----------



## Wishmaker

Quote:


> Originally Posted by *CriticalOne*
> 
> Intel's HEDT platform wasn't in contention due to price. I'm not trying to pay $1,000 for an eight core or $200 for a motherboard. I do CAD and video editing alongside gaming so I need strong multithreaded performance.
> 
> The way that Ryzen was presented is that I could get a 1700 for $320, the same price as an i7 or whatever, and get more or less the same performance in games while having a little under than twice the multithreaded performance. AMD gave multiple presentations showing a Ryzen processor and an Intel 6900k getting equal performance in games, which made me excited. Imagine my shock when I check gaming benchmarks and I see a 7700k with no hyperthreading beating a 1800X.
> 
> *Along with that, there was a sizable faction on this forum declaring quad cores as dead or not good enough and how foolish it would be to choose a 6/7700k over a Ryzen processor due to games now being able to use 8 threads well and so on. It wasnt AMD, but there were a lot of people hyping up as Ryzen being some sort of Intel killer*.


EXACTLY! This is why we need OCN to change the TOS and Policies to prevent this false information from being spread. Look at how many victims it is making!

STOP THE HYPE PEOPLE! YOU ARE HARMING THE COMMUNITY!!!


----------



## Kpjoslee

Quote:


> Originally Posted by *GorillaSceptre*
> 
> So, as expected, despite the fact that Ryzen is clearly a success... we have over 70 pages of people trying to justify their CPU's..
> 
> Sorry, but if you really thought a lower clocked 8 core would beat your recent 4 core in _*games*_, then an 8 core is beyond your uses in the first place..
> 
> *Look what this thing does to the value of a $1100 6900K*.. Yeah, terrible job AMD, you've only given people the opportunity to buy $330 chips that can compete with a $1000 one.. What a fail.
> 
> Someone should let Intel know what a failure their HEDT platform is, do those dummies know an i5 beats the 6950X in games??


Uh, you don't sound much better than the people you just mentioned, lol. This is just like every other AMD launch thread. Let things settle down a bit and we're all going to realize Ryzen is a pretty good CPU for the money.


----------



## Quantum Reality

Quote:


> Originally Posted by *Wishmaker*
> 
> Speaking of tweaks, can people confirm that this is the second time AMD needs to do such changes? I recall the BD era had a similar scenario where windows tweaks were needed for a performance bump. A performance bump which wasn't enough to hang with the top range from the direct competitor.


It is true the scheduler may need patching (again!), but that won't suddenly translate to 20+% increases in performance. That being said, someone has recommended switching to the High Performance power plan to cut down on processor overhead that slows things down a tad.


----------



## budgetgamer120

Quote:


> Originally Posted by *ryboto*
> 
> Yea...I really wanted a good reason to ditch the 3570k...when Vega pops up, I still may go all AMD...by then who knows what the state of Ryzen performance might be..? Though it all hinges on AM4 performance mITX boards being available...


I am sorry, but you have had years of reasons to ditch the 3570K.... You just don't want to.

Which is perfectly fine.


----------



## SoloCamo

Quote:


> Originally Posted by *Wishmaker*
> 
> EXACTLY! This is why we need OCN to change the TOS and Policies to prevent this false information to be spread. Look at how many victims it is making!
> 
> STOP THE HYPE PEOPLE! YOU ARE HARMING THE COMMUNITY!!!


It's not false information that games are becoming more multi threaded and will only continue to be more reliant on it in the future.


----------



## jeffdamann

Can anyone confirm, if I am a competent OCer, could I pull 4.0 on a 1700 with a Noctua D-15?


----------



## sugarhell

Quote:


> Originally Posted by *Wishmaker*
> 
> EXACTLY! This is why we need OCN to change the TOS and Policies to prevent this false information to be spread. Look at how many victims it is making!
> 
> STOP THE HYPE PEOPLE! YOU ARE HARMING THE COMMUNITY!!!


Change the policy on what? Discard people's opinions?

Nice attitude, dude


----------



## amstech

Ryzen is doing well against $1000 chips, but it's also losing to $200 chips (Core i5s) in some gaming benchmarks.
Even in games that really crunch a CPU, like Gears and Overwatch, it's losing to $350 i7s. (The 7700K is $350 on Newegg; it was $400 a week ago, I think.)
So it depends on how you look at it.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My 3770K is 5 years old. I want to upgrade but AMD does not want me to upgrade. Yes 1700 @ 3GHz is not going to beat 4.6GHz i5/i7 lol. The problem is something like 1800X is not even 4.0GHz CPU.


Your chip is a great CPU and your overclock is impressive, I see no reason for you to upgrade.


----------



## ducegt

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I just left the IT world after a couple of decades.
> 
> Data Centers will absolutely scoop up Naples if these sort of results translate to it.


I don't see that happening; not until it's more proven, given that Opteron has almost been forgotten. Did you work in IT dealing with thousands of customers with no loyalty to any particular platform? I doubt it. I'm in a rare position to get a feel for the diversity that exists.


----------



## aDyerSituation

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My 3770K is 5 years old. I want to upgrade but AMD does not want me to upgrade. Yes 1700 @ 3GHz is not going to beat 4.6GHz i5/i7 lol. The problem is something like 1800X is not even 4.0GHz CPU.


Exactly. Doesn't even match Haswell in gaming, which came out in 2013?

Doesn't overclock well either.


----------



## CriticalOne

Quote:


> Originally Posted by *SoloCamo*
> 
> It's not false information that games are becoming more multi threaded and will only continue to be more reliant on it in the future.


That's fair, but there are some logical inconsistencies when people say one should buy an eight-core for gaming because future titles will demand more resources and more threads, and then turn around and say it was foolish to consider the R7s as gaming processors. It can't be had both ways.


----------



## ryboto

Quote:


> Originally Posted by *Wishmaker*
> 
> Speaking of tweaks, can people confirm that this is the second time AMD needs to do such changes? I recall the BD era had a similar scenario where windows tweaks were needed for a performance bump. A performance bump which wasn't enough to hang with the top range from the direct competitor.


Deriding them as mere tweaks and dismissing the actual specifics is a bit ignorant if there is in fact a serious difference between the SMT implementation and HT, and the software is hindering it.

The Windows issue is that "...the Windows High Precision Event Timer (HPET) isn't playing nice with the SenseMI sensors that poll the CPU status every millisecond."
Read more at http://www.legitreviews.com/amd-ryzen-7-1800x-1700x-and-1700-processor-review_191753/15#0OPpM5w4u4IWvOGI.99


----------



## budgetgamer120

Quote:


> Originally Posted by *SoloCamo*
> 
> It's not false information that games are becoming more multi threaded and will only continue to be more reliant on it in the future.


Did you not know people still play Far Cry 2 and Resident Evil 5?


----------



## Quantum Reality

Another thing to keep in mind is that AMD will undoubtedly be able to learn from the real-world issues with R7 and push microcode updates that will make R3 and R5 more efficient when they are ready to launch.


----------



## dieanotherday

Quote:


> Originally Posted by *budgetgamer120*
> 
> Not sure how Ryzen is a fail


The big question is: if it's not for gamers and it's a new, unproven chip, who will be its market, and will sales be sufficient?

Gamers will stick with Intel, content creators will get Ryzen if they didn't already have an 8-core Intel, and large OEMs will only adopt it once they know it's a proven chip without problems.


----------



## looniam

Quote:


> Originally Posted by *jeffdamann*
> 
> One last question, since I can't find it in reviews. I am an extremely competent overclocker. Can I achieve 4.0 on a 1700 with a Noctua D15 and low ambients?


Quote:


> Originally Posted by *jeffdamann*
> 
> Can anyone confirm, if I am a competent OCer, could I pull 4.0 on a 1700 with a Noctua D-15?


I'm sure if you looked at the reviews, saw their cooling solutions, and then compared them to that D15 and the temps they got, you would very much have your answer.


----------



## GorillaSceptre

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My 3770K is 5 years old. I want to upgrade but AMD does not want me to upgrade. Yes 1700 @ 3GHz is not going to beat 4.6GHz i5/i7 lol. The problem is something like 1800X is not even 4.0GHz CPU.


Why don't you upgrade to Kaby then? If gaming is everything, then even my overclocked Sandy is good enough to drive any GPU out there..

*Ryzen offers 8 cores for just over $300*... I must be missing something here. Why the hell were any of you looking at an 8-core in the first place if all you're focusing on is games? I don't know if people are noticing, but in everything else the 4-core chips get slaughtered..
Quote:


> Originally Posted by *Kpjoslee*
> 
> *Uh, you don't sound much better than people you just mentioned lol.* This is just like every other AMD launch thread. Let the things settle down a bit and we all going to realize Ryzen is pretty good CPU for the money.


Oh yeah? How's that?


----------



## budgetgamer120

Quote:


> Originally Posted by *dieanotherday*
> 
> The big question is: if it's not for gamers and it's a new, unproven chip, who will be its market, and will sales be sufficient?
> 
> Gamers will stick with Intel, content creators will get Ryzen if they didn't already have an 8-core Intel, and large OEMs will only adopt it once they know it's a proven chip without problems.


I have looked at many reviews. I have not seen one thing Ryzen is "bad" at. So how is it a fail?


----------



## yesitsmario

Any gaming benchmarks for 1440p/4K? I'm mostly seeing 1080p.


----------



## Wishmaker

Quote:


> Originally Posted by *sugarhell*
> 
> Change the policy on what? Discard people opinions?
> 
> Nice attitude dude


Discard misleading opinions and false information.
This is the attitude OCN should be promoting: PROFESSIONALISM AND INTEGRITY. Not the following:

*INTEL 4 core gaming is dead, long live Ryzen 8 core gaming!*

The above was one of the tunes we had in threads for the past 2 weeks. A completely false claim that was debunked today. People were saying that leaks pointed to Intel 4-core gaming being dead. Was it? Not really, so people should stop spreading FALSE information based on nonexistent facts.


----------



## ZealotKi11er

Quote:


> Originally Posted by *sugarhell*
> 
> Upgrade for what?
> 
> An 8-core mainly focus on workstations not Gaming.
> 
> If you want just for gaming get a 7700k or wait for R5? More logical...
> 
> You can always get a 6900k i have one and it is awesome !


My thinking was BW-E IPC, which is an upgrade from Ivy, plus 8 cores as a long-term investment. The 3770K lasted me 5 years, so my expectation was for a CPU to last me as long; 8 cores were going to ensure that. I never wanted a 6900K. Zen's whole point was the price.
Quote:


> Originally Posted by *GorillaSceptre*
> 
> Why don't you upgrade to Kaby then? If gaming is everything, then even my overclocked Sandy is good enough to drive any GPU out there..
> 
> *Ryzen offers 8 cores for just over $300*... I must be missing something here. Why the hell were any of you looking at an 8 core in the first place if all you're focusing on is games? I don't know if people are noticing, but in everything else the 4 cores get slaughtered..
> 
> Oh yeah? How's that?


Because I do not want to buy an Intel CPU until I have to. Kaby does nothing for me. My thinking was for the future, plus having fun with it now. Games are starting to use more than 4 cores.


----------



## jeffdamann

Quote:


> Originally Posted by *looniam*
> 
> i'm sure if you looked at reviews, saw their cooling solution, then compared to that D15 and what temps they got; you would very much have your answer.


The thing is, I am not finding decent overclocking reviews, just very general broad reviews, or reviews where they only OC one of the chips.


----------



## AuraNova

Quote:


> Originally Posted by *Wishmaker*
> 
> EXACTLY! This is why we need OCN to change the TOS and Policies to prevent this false information to be spread. Look at how many victims it is making!
> 
> STOP THE HYPE PEOPLE! YOU ARE HARMING THE COMMUNITY!!!


I don't get this.

I look at this as a bunch of sports fans arguing over which team is better. You're going to have this argument all the time, and it's the same here with Intel (even Nvidia) and AMD. It's all about loyalty, even when the supporter is wrong. It's not anyone else's fault if one falls for the hype. If you're on a forum like this and have some knowledge of the subject at hand, you can make well-informed decisions. We don't need a policy change for that.


----------



## comagnum

Quote:


> Originally Posted by *Wishmaker*
> 
> EXACTLY! This is why we need OCN to change the TOS and Policies to prevent this false information to be spread. Look at how many victims it is making!
> 
> STOP THE HYPE PEOPLE! YOU ARE HARMING THE COMMUNITY!!!


Go away.. seriously. You're the worst kind of fanboy.


----------



## budgetgamer120

Quote:


> Originally Posted by *Wishmaker*
> 
> Discard misleading opinions and false information.
> This is the attitude OCN should be promoting. PROFESSIONALISM AND INTEGRITY. Not the following :
> 
> *INTEL 4 core gaming is dead, long live Ryzen 8 core gaming!*
> 
> The above was one of the tunes we had in threads for the past 2 weeks. A completely false claim that was debunked today. People were saying that leaks are pointing to INTEL 4 core being dead. Was it? Not really so people should stop spreading FALSE information based on non existent facts.


You said that, or made that up; I do not recall anyone saying it. People said i5s are dead, which is reasonable given the CPU load in recent games. If you are OK with near-100% CPU load in recent games, then great.

No one said i7s, which are quads, were dead, so I'm not sure what you are trying to start.


----------



## ryboto

Quote:


> Originally Posted by *Wishmaker*
> 
> Discard misleading opinions and false information.
> This is the attitude OCN should be promoting. PROFESSIONALISM AND INTEGRITY. Not the following :
> 
> *INTEL 4 core gaming is dead, long live Ryzen 8 core gaming!*
> 
> The above was one of the tunes we had in threads for the past 2 weeks. A completely false claim that was debunked today. People were saying that leaks are pointing to INTEL 4 core being dead. Was it? Not really so people should stop spreading FALSE information based on non existent facts.


It's a forum; there will always be misleading opinions. We have a rumors section, and if users can't figure it out, it's their own fault. I don't need my forum to be neutered because people can't be bothered to read.


----------



## Ultracarpet

Quote:


> Originally Posted by *ducegt*
> 
> Enterprises won't change so fast over night. There are "the more you spend, the bigger discount" agreements that Intel has to hold their position in the market. You are how loyal personal consumers are? It's more extreme in enterprise environments. I deal with this everyday at my job.


But for new server contracts this is a big deal.


----------



## SoCalMX70

Quote:


> Originally Posted by *Wishmaker*
> 
> Discard misleading opinions and false information.
> This is the attitude OCN should be promoting. PROFESSIONALISM AND INTEGRITY. Not the following :
> 
> *INTEL 4 core gaming is dead, long live Ryzen 8 core gaming!*
> 
> The above was one of the tunes we had in threads for the past 2 weeks. A completely false claim that was debunked today. People were saying that leaks are pointing to INTEL 4 core being dead. Was it? Not really so people should stop spreading FALSE information based on non existent facts.


People can use their brains to separate fact from fiction. We don't need hand-holding and censorship. Grow up.


----------



## ryboto

Quote:


> Originally Posted by *budgetgamer120*
> 
> I am sorry but you have years of reason to ditch the 3570K.... You don't want to.
> 
> Which is perfectly fine.


Years of reasons? I don't have a need to, that's the thing. AMD releasing new products does give me the itch though...


----------



## GorillaSceptre

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My thinking was BW-E IPC which is upgrade from IVY and 8-core for long term investment. 3770K lasted me 5 years so my expectation was for a CPU to last me as long. 8-Cores where going to ensure that. Never wanted 6900K. Zen point was all the price.
> Because I do not want to buy Intel CPU until I have too. Kaby does nothing for me. My thinking was for the future and having fun with it now. Games are starting to use more than 4 core.


That's my point.. Kaby, the fastest 4-core Intel can offer, does nothing for you, yet you expected higher gaming performance from an 8-core that can cost even less than a 7700K?


----------



## comagnum

Quote:


> Originally Posted by *budgetgamer120*
> 
> You said that or made that up. I do not recall anyone saying that. People said i5s are dead. Which is reasonable given cpu load in recent games. If you are ok with near 100% CPU load in recent games then great.
> 
> No one said i7s which are quads were dead, so not sure what you are trying to start.


He's delusional. He's done nothing but spout garbage all day about how superior Intel is and how everyone who got excited over this release was brainwashed and fed misinformation. He's either trolling or he's got issues. Either way.


----------



## jsc1973

After reading some of the reviews, it seems like it's a Broadwell-E killer for sure, but it's not a gaming powerhouse. We'll have to see if we get BIOS updates that improve the CPU's performance on that end. Some of those results just looked plain weird, and there's something that's not optimized there.

Still a much more competitive and interesting product than we've seen from AMD in many years.


----------



## IRobot23

https://www.youtube.com/watch?v=V5RP1CPpFVE

Hmm...


----------



## SoloCamo

Quote:


> Originally Posted by *Wishmaker*
> 
> Discard misleading opinions and false information.
> This is the attitude OCN should be promoting. PROFESSIONALISM AND INTEGRITY. Not the following :
> 
> *INTEL 4 core gaming is dead, long live Ryzen 8 core gaming!*
> 
> The above was one of the tunes we had in threads for the past 2 weeks. A completely false claim that was debunked today. People were saying that leaks are pointing to INTEL 4 core being dead. Was it? Not really so people should stop spreading FALSE information based on non existent facts.


http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page4.html

BF1 SINGLE PLAYER

1700x - 134avg / 127min

i5-7600k - 144avg / 81min

Higher IPC and a 400MHz clock advantage, and its minimum is still 46fps lower. I'd say 4c/4t is not a smart choice for a new build.. aka dead.
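Numbers like these are why reviewers report minimums (or 1% lows) alongside averages: a chip can win the average while hitching badly. A rough sketch of how such a summary comes out of a frame-time log, using hypothetical data and a simplified 1% low definition (not TechSpot's exact methodology):

```python
def fps_summary(frametimes_ms):
    """Return (average FPS, 1% low FPS) from a list of frame times in milliseconds.

    The 1% low here is the average FPS over the slowest 1% of frames, a common
    (though not universal) definition; reviewers' exact methods vary.
    """
    avg_ms = sum(frametimes_ms) / len(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # slowest 1%, at least one frame
    low_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms, 1000.0 / low_ms

# Two hypothetical runs: the spiky one wins the average but hitches hard.
smooth = [8.0] * 1000                 # steady 125 FPS
spiky = [7.0] * 990 + [40.0] * 10    # faster most of the time, heavy hitches
```

Here the spiky run posts the higher average FPS but a far worse 1% low, the same shape as the i5-7600K vs. 1700X result above.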


----------



## ZealotKi11er

Quote:


> Originally Posted by *GorillaSceptre*
> 
> That's my point.. Kaby does nothing for you, the fastest 4 cores Intel can offer, but you expected higher performance in games from 8 cores that can be even cheaper than a 7700K?


No, I expected the same performance. Kaby beats my CPU, but it's a side grade for me.


----------



## IRobot23

Quote:


> Originally Posted by *SoloCamo*
> 
> http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page4.html
> 
> BF1 SINGLE PLAYER
> 
> 1700x - 134avg / 127min
> 
> i5-7600k - 144avg / 81min
> 
> Higher IPC and 400mhz advantage and it still drops 46fps on the min. I'd say 4c4t is not a smart choice for a new build.. aka dead.


Nobody cares about SP; where the i7 7700K has problems is MP with 64 players.


----------



## PostalTwinkie

Quote:


> Originally Posted by *ducegt*
> 
> I don't see that happening; not until it's more proven, given that Opteron has almost been forgotten. Did you work in IT dealing with thousands of customers with no loyalty to any particular platform? I doubt it. I'm in a rare position to get a feel for the diversity that exists.


I was an Internet service provider and spent my time in and out of data centers dealing with those matters, with the people who ran and built data centers, etc.

It isn't a matter of loyalty, and you can try to argue that all you want; it just shows you lack the business management knowledge. This is a matter of money. If the results the 1700X just demonstrated translate over to Naples, Intel won't be able to overcome it with cost. Even if they dropped their parts to the same cost as AMD, they are being outperformed by an extreme margin (based on this sample).

It doesn't matter what my head of IT thinks when the performance delta is so extreme. The energy-savings potential for a data center running Naples, paired with the performance, will mandate the change. Even if Intel manages to match part cost 1:1.

It is absolutely crazy that the 1700X put up the numbers it did.


----------



## chuy409

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Why don't you upgrade to Kaby then? If gaming is everything, then even my overclocked Sandy is good enough to drive any GPU out there..
> 
> *Ryzen offers 8 cores for just over $300*... I must be missing something here. Why the hell were any of you looking at an 8 core in the first place if all you're focusing on is games? I don't know if people are noticing, but in everything else the 4 cores get slaughtered..
> 
> Oh yeah? How's that?


Because AMD themselves pitched Ryzen as an 8-core/16-thread chip accessible to the masses at an affordable price. That in itself is an odd position right now, as the masses aren't pushing all 8 cores to the brink; if we remove office-work-type PCs from the equation, they are more likely than not gamers.


----------



## SoloCamo

Quote:


> Originally Posted by *IRobot23*
> 
> Nobody cares about SP; where the i7 7700K has problems is MP with 64 players.


The 7700K doesn't have problems, just like my 4790K doesn't have problems; the i5 does. And that was exactly my point: the 64-player maps are far more multithreaded, so if single player is showing that much of a discrepancy, the gap is even larger in multiplayer.


----------



## SoCalMX70

Quote:


> Originally Posted by *jsc1973*
> 
> After reading some of the reviews, it seems like it's a Broadwell-E killer for sure, but it's not a gaming powerhouse. We'll have to see if we get BIOS updates that improve the CPU's performance on that end. Some of those results just looked plain weird, and there's something that's not optimized there.
> 
> Still a much more competitive and interesting product than we've seen from AMD in many years.


I agree. I think the biggest issue we see early on is that you have an entirely new architecture and platform competing with Intel, who has many years of very incremental updates...

I plan on doing a new build sometime in late summer, and I can't wait to see how things are shaking out for AMD at that time. BIOS and other updates might really smooth things out on the purely gaming side. Or maybe not? Either way, I like what I'm seeing (I'm not strictly a gamer).


----------



## Wishmaker

Quote:


> Originally Posted by *comagnum*
> 
> Go away.. seriously. You're the worst kind of fanboy.


You call me a fanboy because I am pointing out the harm certain members of this community are doing? Read the forums: people are annoyed that they bought Zen based on what OCN said, and now they have a product that does not fit their expectations. That is somewhat false advertising coming from a forum where people give advice to others.

When people hype a product, they create expectations.
When people twist the negatives into positives with bait and switch, it creates expectations.

Please continue to call me a fanboy for pointing out what is wrong with this community nowadays. It was the same with the RX 480: people were claiming 980 Ti performance, and OCN members started buying the product only to realize it was nowhere near the 980 Ti. How many times does this have to happen before anyone realizes that toxicity is not good for a community?

We have two events where OCN has misled people, and they are both related to AMD launches.


----------



## ZealotKi11er

All I am going to say, and this will be my last comment on Zen until things change, is that for gaming it is not the same CPU that it is for productivity, and this has nothing to do with it being 8-core. We will see Zen's gaming performance more clearly with the 4- and 6-core versions. As far as the *Overclock.net* community goes, I am appointing myself ambassador and stating that Zen is a poor overclocker, making BW-E look good, and is Fury X's ugly cousin.


----------



## SoloCamo

Quote:


> Originally Posted by *Wishmaker*
> 
> You call me a fanboy because I am pointing out the harm certain members of this community are doing? Read the forums, people are annoyed that they bought ZEN based on what OCN has said and now they have a product that does not fit their expectations. This is somewhat false advertising coming from a forum where people give advice to others.
> 
> When people hype a product, they create expectations.
> When people twist the negatives into a positive by using bait and switch, it creates expectations.
> 
> Please continue to call me a fanboy for pointing out what is wrong with this community nowadays. It was the same with the RX480. People were claiming 980 Ti performance, and OCN members started buying the product only to realize that it was nowhere near the 980 Ti. How many times does this have to happen for anyone to realize that toxicity is not good in a community?
> 
> We have two events where OCN has misled people, and both are related to AMD launches.


Sorry, it's called being a responsible consumer and taking *people's opinions* with a grain of salt. I read the rumors about the 480 as well and, as a smart consumer, waited for reviews and did hours' worth of research.

The problem with your train of thought is that you are not putting any blame on the person naive enough not to wait for reviews.


----------



## zGunBLADEz

Quote:


> Originally Posted by *SoloCamo*
> 
> 7700k doesn't have problems like my 4790k doesn't have problems, i5 does. And that was exactly my point, the 64 player maps are far more multithreaded so if the single player is showing that much of a discrepancy the gap is even larger in multiplayer.


I sometimes find myself turning HT off to prevent microstutters XD


----------



## SoCalMX70

Quote:


> Originally Posted by *Wishmaker*
> 
> You call me a fanboy because I am pointing out the harm certain members of this community are doing? *Read the forums, people are annoyed that they bought ZEN based on what OCN has said and now they have a product that does not fit their expectations.* This is somewhat false advertising coming from a forum where people give advice to others.
> 
> When people hype a product, they create expectations.
> When people twist the negatives into a positive by using bait and switch, it creates expectations.
> 
> Please continue to call me a fanboy for pointing out what is wrong with this community nowadays. It was the same with the RX480. People were claiming 980 Ti performance, and OCN members started buying the product only to realize that it was nowhere near the 980 Ti. How many times does this have to happen for anyone to realize that toxicity is not good in a community?
> 
> *We have two events where OCN has misled people, and both are related to AMD launches*.


If true, those people are stupid and should do better research... Wait for factual data. Pretty simple.

I always get a great laugh out of those calling for rules/hand-holding/censorship to "protect" people from bad information. You're wrong, just stop.


----------



## Ultracarpet

Quote:


> Originally Posted by *CriticalOne*
> 
> Intel's HEDT platform wasn't in contention due to price. I'm not trying to pay $1,000 for an eight core or $200 for a motherboard. I do CAD and video editing alongside gaming so I need strong multithreaded performance.
> 
> The way that Ryzen was presented is that I could get a 1700 for $320, the same price as an i7 or whatever, and get more or less the same performance in games while having a little under twice the multithreaded performance. AMD gave multiple presentations showing a Ryzen processor and an Intel 6900k getting equal performance in games, which made me excited. *Imagine my shock when I checked gaming benchmarks and saw a 7700k with no hyperthreading beating an 1800X.*
> 
> Along with that, there was a sizable faction on this forum declaring quad cores dead or not good enough, and saying how foolish it would be to choose a 6/7700k over a Ryzen processor because games can now use 8 threads well, and so on. It wasn't AMD, but there were a lot of people hyping Ryzen up as being some sort of Intel killer.


Were you not shocked when you saw gaming benchmarks of the 7700k vs the 5960x and 6900k??? You are now saying that the 1700 had to not only be as good as a 6900k, it also had to be equal to a 7700k in lightly threaded tasks, all at half or less than half the cost of a 6900k.... are you even serious right now? are you on the same planet as me? universe? dimension?????

Ryzen was never supposed to be an upgrade for people with recent Intel chips. In fact, it only became worth considering when leaks suddenly indicated better than expected performance. The r7 being near parity with Intel's HEDT lineup for 99% of tasks was unexpected, and amazing, as the chips are now suddenly worth considering for people looking to make the jump to Intel's HEDT platform. If you have a recent quad core from Intel's mainstream lineup, Ryzen is not going to be an upgrade for you. It never was supposed to be. No one is telling you it will be or should be. And just to remind you, there is no upgrade for you in Intel's lineup either.


----------



## looniam

Quote:


> Originally Posted by *jeffdamann*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> i'm sure if you looked at reviews, saw their cooling solution, then compared to that D15 and what temps they got; you would very much have your answer.
> 
> 
> 
> The thing is I am not finding decent overclock reviews, just very general broad reviews, or reviews where they only OC one of the chips

You didn't see the ComputerBase review? 3.9GHz @ ~60°C.

it looks like they are using a Noctua <- image of socket w/bracket.


----------



## GorillaSceptre

Quote:


> Originally Posted by *ZealotKi11er*
> 
> No, I expected the same performance. Kaby beats my CPU but it's a sidegrade for me.


Kaby is a sidegrade for anyone with a recent 4-core with an overclock; hell, even my old-ass chip isn't really worth the price to upgrade from..

This is competing with Intel's 8 cores..

Quote:


> Originally Posted by *chuy409*
> 
> *Because AMD themselves pitched Ryzen as an 8 core 16 thread accessible to the masses at an affordable price*. That in itself is an odd position right now, as the masses aren't leveraging all 8 cores to the brink but are more likely than not gamers, if we remove office work-type PCs from the equation.


As far as I can tell AMD delivered.. It is an 8 core at an affordable price, and it takes on Intel's damn $1100 6900K..

Once again, if gaming is all that matters, then go laugh at 6950X owners for losing to 7700Ks..


----------



## tygeezy

I pretty much game, surf, watch videos, and do some spreadsheets. I do a tiny bit of video editing with game captures from Shadowplay. Are there any Counter-Strike benchmarks?

My i7 860 at 3.5GHz can't maintain 140 fps 100% of the time. If the 1700 can maintain 140 fps in both CS:GO and Overwatch, this processor might be for me. I have a 144Hz G-Sync 1080p monitor, so anything over 140 fps is overkill.

So as long as I can maintain 140 fps in competitive shooters, I think this is a better buy for future games that are well threaded, judging by BF1.


----------



## b3machi7ke

Quote:


> Originally Posted by *SoCalMX70*
> 
> *If true, those people are stupid and should do better research... Wait for factual data. Pretty simple.*
> 
> I always get a great laugh out of those calling for rules/hand-holding/censorship to "protect" people from bad information. You're wrong, just stop.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Ultracarpet*
> 
> Were you not shocked when you saw gaming benchmarks of the 7700k vs the 5960x and 6900k??? You are now saying that the 1700 had to not only be as good as a 6900k, it also had to be equal to a 7700k in lightly threaded tasks, all at half or less than half the cost of a 6900k.... are you even serious right now? are you on the same planet as me? universe? dimension?????
> 
> Ryzen was never supposed to be an upgrade for people with recent Intel chips. In fact, it only became worth considering when leaks suddenly indicated better than expected performance. The r7 being near parity with Intel's HEDT lineup for 99% of tasks was unexpected, and amazing, as the chips are now suddenly worth considering for people looking to make the jump to Intel's HEDT platform. If you have a recent quad core from Intel's mainstream lineup, Ryzen is not going to be an upgrade for you. It never was supposed to be. No one is telling you it will be or should be. And just to remind you, there is no upgrade for you in Intel's lineup either.


Don't forget how everyone was screaming that it would be amazing if they got to Sandy Bridge IPC levels, and that Ivy Bridge was probably out of reach.

Here we are.....Broadwell-E+ and it is a failure.......










EDIT:

Man, I was expecting people to be celebrating today as a day of Nerds. We finally have actual competition, legitimate, real competition.

Yet people aren't happy.


----------



## DADDYDC650

Awesome review below. In short, Ryzen has better MINIMUM frames on average vs the 7700k, while the 7700k has better average/max frames. If you stream, multi-task, and encode, and plan on sticking with your motherboard for 2+ years, Ryzen all day.

https://www.youtube.com/watch?v=sciuiEcrnzg&t=520s
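The minimum-vs-average distinction in that review is worth making concrete. A minimal sketch of how benchmark tools typically derive average FPS and a "1% low" style minimum from a frame-time log (the numbers here are hypothetical, not data from the linked video):

```python
# Hypothetical frame-time log in milliseconds (not data from the linked review).
frame_times_ms = [6.9, 7.1, 7.0, 7.2, 6.8, 14.5, 7.0, 7.1, 6.9, 13.8]

# Average FPS: total frames divided by total elapsed time.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# "1% low": FPS implied by the slowest 1% of frames
# (with only 10 samples here, that is just the single worst frame).
worst_first = sorted(frame_times_ms, reverse=True)
slowest = worst_first[:max(1, len(worst_first) // 100)]
one_pct_low_fps = 1000 / (sum(slowest) / len(slowest))

print(f"avg: {avg_fps:.0f} fps, 1% low: {one_pct_low_fps:.0f} fps")
```

Two occasional stutters barely dent the average but dominate the 1% low, which is why a chip can win one metric and lose the other.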


----------



## GorillaSceptre

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Don't forget how everyone was screaming that it would be amazing if they got to Sandy Bridge IPC levels, and that Ivy Bridge was probably out of reach.
> 
> Here we are.....Broadwell-E+ and it is a failure.......


Yeah, it's unreal tbh..


----------



## Kpjoslee

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Oh yeah? How's that?


Because you are getting a bit emotional over nothing lol, just like the ones arguing about this whole matter. You realize there are only a few very vocal posters spreading this negativity.


----------



## CriticalOne

Quote:


> Originally Posted by *Ultracarpet*
> 
> Were you not shocked when you saw gaming benchmarks of the 7700k vs the 5960x and 6900k??? You are now saying that the 1700 had to not only be as good as a 6900k, it also had to be equal to a 7700k in lightly threaded tasks, all at half or less than half the cost of a 6900k.... are you even serious right now? are you on the same planet as me? universe? dimension?????
> 
> Ryzen was never supposed to be an upgrade for people with recent Intel chips. In fact, it only became worth considering when leaks suddenly indicated better than expected performance. The r7 being near parity with Intel's HEDT lineup for 99% of tasks was unexpected, and amazing, as the chips are now suddenly worth considering for people looking to make the jump to Intel's HEDT platform. If you have a recent quad core from Intel's mainstream lineup, Ryzen is not going to be an upgrade for you. It never was supposed to be. No one is telling you it will be or should be. And just to remind you, there is no upgrade for you in Intel's lineup either.


I'm not sure what you are trying to say.

I'm not saying that I expected Ryzen to beat a 7700k. However, when a 7700K *WITHOUT* *HYPERTHREADING*, which is essentially an i5, beats it in a game that I know is very well threaded and demanding (which, you know, would put the 4C/4T 7700K at a massive disadvantage), there are questions to be asked for my usage scenario.


----------



## Lee Patekar

Interesting. In three months or so we'll have a handle on which issues persist and which were fixed. Everything from AMD seems rushed out the door nowadays, from the RX480 to Ryzen.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Lee Patekar*
> 
> Interesting. In three months or so we'll have a handle on which issues persist and which were fixed. Everything from AMD seems rushed out the door nowadays, from the RX480 to Ryzen.


You want to back any of that up with facts? Or.......you just spewing it to spew?


----------



## Rhadamanthys

So what if I game exclusively in 4k? Any advice on what CPU I should be getting? Should I buy a 1800X, does it even matter? I guess it's more future-proof than current Intel offerings, so...


----------



## Ultracarpet

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Don't forget how everyone was screaming that it would be amazing if they got to Sandy Bridge IPC levels, and that Ivy Bridge was probably out of reach.
> 
> Here we are.....Broadwell-E+ and it is a failure.......


Honestly, you should see my face while I'm reading this crap.


----------



## Noufel

The 6950X losing to even a 6700K in gaming (considered normal).
The 1800X losing to a 7700K in gaming (absolute catastrophe).
The first one is $1,500+ and the second one is $500, and btw the cheaper one is on par with the $1,500 one in productivity.
The X370 is a very strong platform like the Z270 is, but who cares, AMD has given us a new Faildozer.


----------



## CriticalOne

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Don't forget how everyone was screaming that it would be amazing if they got to Sandy Bridge IPC levels, and that Ivy Bridge was probably out of reach.
> 
> Here we are.....Broadwell-E+ and it is a failure.......


The forum isn't a hivemind. Different people had different expectations.


----------



## amstech

Quote:


> Originally Posted by *Hueristic*
> 
> Then better up the textures on you avy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ok back to ten pages back. Lol


Lol, I know that my old beater will struggle a little more than some of the newer ones in certain demanding games, but at the right clock speed it still competes/games OK for what it is. That's all I was trying to say!


----------



## GorillaSceptre

Quote:


> Originally Posted by *Noufel*
> 
> The 6950X losing to even a 6700K in gaming (considered normal).
> The 1800X losing to a 7700K in gaming (absolute catastrophe).
> The first one is $1,500+ and the second one is $500, and btw the cheaper one is on par with the $1,500 one in productivity.
> The X370 is a very strong platform like the Z270 is, but who cares, AMD has given us a new Faildozer.


Perfectly sums up the thread.








Quote:


> Originally Posted by *Kpjoslee*
> 
> Because you are going bit emotional over nothing lol, just like the ones arguing about this whole matter. You realize there are only a few very vocal posters spreading this negativity.


Emotional? Lol, whatever.


----------



## PostalTwinkie

Quote:


> Originally Posted by *CriticalOne*
> 
> The forum isn't a hivemind. Different people had different expectations.


Sure. People are allowed to have that as well, but when you start keeping double standards in order to hold/justify your expectations, there is a problem.


----------



## Ultracarpet

Quote:


> Originally Posted by *ducegt*
> 
> Seems I've covered all the pitfalls of Ryzen so well that the AMD fan child club can't do anything, but attempt to insult my character. You all will have to try a lot given the most you have done thus far is go in for a group hug with each other.
> Sure, and even smaller businesses, *but you might be shocked how much cash businesses don't care to save because they fear change*. Sometimes their concerns are valid. A disruption period of 24 hours due to swapping Intel for AMD might not be acceptable. Those who use IT, maintain it, and ultimately purchase it in larger organizations are all different people with different purposes. More cores can also mean more software licensing expense, so raw speed is favored. I'm just a gamer at home, but I understand why the market is what it is. AMD will now make a change which, no matter how small, is better than the nothing it did before.


Completely understandable. There is definitely a large trust hurdle they need to jump in terms of demonstrating the reliability and strengths of their platform to the bigger clients.


----------



## SoCalMX70

Quote:


> Originally Posted by *Lee Patekar*
> 
> Interesting. In three months or so we'll have a handle on which issues persist and which were fixed. Everything from AMD seems rushed out the door nowadays, from the RX480 to Ryzen.


I feel like AMD walks a tough path these days. Everyone has heard about their lackluster budget for R&D, how they are behind here and there, etc. Meanwhile people are clamoring for something, ANYTHING, from them to help push back on prices and crappy incremental 5-10% performance updates from Intel and move technology forward. And of course, to fight Nvidia and their ballooning prices.

How long could AMD possibly wait on this release, or any release for that matter? I see mostly disappointment about how Vega is still a few months away. Will that be rushed too? Guess we will find out.


----------



## Lee Patekar

Quote:


> Originally Posted by *PostalTwinkie*
> 
> You want to back any of that up with facts? Or.......you just spewing it to spew?


You forgot about the PCIe voltage fix via drivers already? Quite a few reviewers saw an increase in performance by turning off SMT.. I suspect that should be fixed via BIOS or something in the near future, or at least addressed.

Tech isn't static. I ain't buying s**t until it's done its paces in the wild for three months.


----------



## DADDYDC650

I've been flip-flopping between a 1700 and an 1800X. Going back to the 1800X just to support AMD.


----------



## madweazl

Maybe I missed it, but all of the typical-resolution gaming benchmarks look very similar in performance. I get that at low res the 7700k shows a decent performance margin where a GPU bottleneck doesn't exist, but how long is it going to take before GPU performance exceeds what the Ryzen CPUs are capable of delivering? In real-world terms, I just don't see any reason the Ryzen CPU is a problem. It is delivering excellent performance at a great price, and will probably continue to do so for a few years before GPUs go beyond what it's capable of delivering. While I certainly don't build new rigs every year, I don't go more than three without some sort of upgrade either.

If the 7700k delivers 170fps and the Ryzen is only capable of 135fps in the same title, I won't be able to tell the difference to begin with. The very small OC potential is a bummer for tinkerers, but damn, there is a ton of bang for the buck on tap with the R7.
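The 170 vs 135 fps comparison is easier to judge in frame times, since displays draw one frame at a time. A quick sketch using the hypothetical figures from the post (not benchmark data):

```python
# Hypothetical figures from the post above, not benchmark data.
fps_7700k, fps_ryzen = 170, 135

ft_7700k = 1000 / fps_7700k     # ~5.9 ms per frame
ft_ryzen = 1000 / fps_ryzen     # ~7.4 ms per frame
delta_ms = ft_ryzen - ft_7700k  # ~1.5 ms difference per frame

# A 144 Hz display has a ~6.9 ms frame budget: the 7700k figure is already
# past the refresh rate, while the Ryzen figure sits just under it.
refresh_ms = 1000 / 144

print(f"frame-time delta: {delta_ms:.1f} ms; 144 Hz budget: {refresh_ms:.1f} ms")
```

A per-frame gap of about a millisecond and a half, most of it above a typical display's refresh rate, is the arithmetic behind "I won't be able to tell the difference."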


----------



## Quantum Reality

Quote:


> Originally Posted by *Noufel*
> 
> The 6950X losing to even a 6700K in gaming (considered normal).
> The 1800X losing to a 7700K in gaming (absolute catastrophe).
> The first one is $1,500+ and the second one is $500, and btw the cheaper one is on par with the $1,500 one in productivity.
> The X370 is a very strong platform like the Z270 is, but who cares, AMD has given us a new Faildozer.


That's one of the biggest differences, really. When BD came out and people realized that Phenom II, especially the Phenom II X6, could still exceed BD's performance, people were rightly dismayed and upset at AMD's failure to live up to expectations of at least a modest performance increase across the board.

Ryzen is a very different story, since they not only undid the losses of BD, but have soared well above Phenom II.

I would love to see comprehensive AMD-only benchmarks covering the range from Phenom II X4 965 all the way to Ryzen 1800X to show what the gains have been over the last decade.


----------



## DADDYDC650

Quote:


> Originally Posted by *madweazl*
> 
> Maybe I missed it, but all of the typical-resolution gaming benchmarks look very similar in performance. I get that at low res the 7700k shows a decent performance margin where a GPU bottleneck doesn't exist, but how long is it going to take before GPU performance exceeds what the Ryzen CPUs are capable of delivering? In real-world terms, I just don't see any reason the Ryzen CPU is a problem. It is delivering excellent performance at a great price, and will probably continue to do so for a few years before GPUs go beyond what it's capable of delivering. While I certainly don't build new rigs every year, I don't go more than three without some sort of upgrade either.
> 
> If the 7700k delivers 170fps and the Ryzen is only capable of 135fps in the same title, I won't be able to tell the difference to begin with. The very small OC potential is a bummer for tinkerers, but damn, there is a ton of bang for the buck on tap with the R7.


Seems as if Ryzen has better minimum frames on average, while the i7 wins out in average and high frames. I'm at 4K/60Hz, so Ryzen is def for me.


----------



## Blackops_2

Quote:


> Originally Posted by *ryboto*
> 
> Years of reasons? I don't have a need to, that's the thing. AMD releasing new products does give me the itch though...


This is my position. In reality I have no need to move from Ivy, though I'd like to retire it and the 780s as a backup rig and move to something newer with Vega and 1440p. Plus I'm just itching to build a new midsize watercooling rig with PETG.


----------



## Lee Patekar

Quote:


> Originally Posted by *SoCalMX70*
> 
> How long could AMD possibly wait on this release, or any release for that matter? I see mostly disappointment about how Vega is still a few months away. Will that be rushed too? Guess we will find out.


Vega will be rushed, I'm sure. Nonetheless, my next system is a Ryzen and Vega combo. I'll just give them three months to iron out the kinks. I'm in no hurry.


----------



## Seyumi

Jesus people. 2017:

Gaming - Intel 7700k
Everything else - AMD 1700

End of thread. Don't bother with the "future proofing" aspect, because both CPUs will be dinosaurs and worth very little by the time more CPU cores become more relevant.


----------



## SoCalMX70

Quote:


> Originally Posted by *Lee Patekar*
> 
> Vega will be rushed, I'm sure. Nonetheless, my next system is a Ryzen and Vega combo. I'll just give them three months to iron out the kinks. I'm in no hurry.


I'm right there with you. I've built nothing but Intel based systems since my first custom build in 2003. I want to give AMD a shot on the CPU side now.


----------



## Quantum Reality

The thing that is very encouraging is that video encoding promises to be very fast, according to the HardwareCanucks review. I know a couple of people who do video editing and Photoshop work - sounds like Ryzen will be perfect for their tasks.


----------



## ducegt

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I was an Internet Service Provider and spent my time in and out of data centers dealing with those matters, the people that ran and built data centers, etc.
> 
> It isn't a matter of loyalty, you can try and argue that all you want - as it just shows you lack the business management knowledge - this is a matter of money. If the results that the 1700x just demonstrated translate over to Naples Intel won't be able to overcome it with cost. Even if they dropped their parts to the same cost as AMD, they are being outperformed by an extreme margin (based off this sample).
> 
> It doesn't matter what my head of IT thinks when the performance delta is so extreme. The energy savings potential to a Data Center with Naples paired with the performance will mandate the change. Even if Intel manages to meet the part cost 1:1.
> 
> It is absolutely crazy that the 1700x put up the numbers it did.


Tell that to the head of IT, and if you are in good standing, they'll share with you the agreements they have signed. And if that isn't enough, you can read the agreement. It won't make sense why it's that way, but you'll understand why IT decision makers make the choices they do. Energy savings is a lower priority than support for many, and people trust those who have supported them well in the past. What more specifically was your role? Sys admin, DBA, IT cough people manager, architect, developer, or ... Also, you didn't seem to acknowledge some very strong points I made...


----------



## AuraNova

Quote:


> Originally Posted by *Seyumi*
> 
> Jesus people. 2017:
> 
> Gaming - Intel 7700k
> Everything else - AMD 1700
> 
> End of thread. Don't bother with the "future proofing" aspect, because both CPUs will be dinosaurs and worth very little by the time more CPU cores become more relevant.


This is pretty much it. The real story here is that AMD is once again competitive to Intel. This is a win for the consumer in the long run. Rome wasn't built in a day and AMD could very well have better plans as Zen matures. Like I said, this is only going to light a fire under Intel.


----------



## Hueristic

Can't keep up with this thread, but just wanted to say that, as far as gaming performance goes, I have no doubt driver/BIOS and microcode or stepping updates will all have a huge impact there. For what I want, the reviews look great; the 1700/B350 is going to make a great upgrade for those of us who multitask all day.

What I really want to see is comparative benches from different mobos. I have a feeling the BIOS on them is going to have a big impact, and that may be why we are seeing some manufacturers' boards not available at release.

Edited: double negative doh.


----------



## SinX7

Gaming wise it might not be the best, but it seems awesome for the price, especially for those who can utilize those cores and threads! Love seeing the competition!


----------



## chuy409

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Kaby is a sidegrade to anyone with a recent 4 core with an overclock, hell, even my old-ass chip isn't really worth the price to upgrade from..
> 
> This is competing with Intels 8 cores..
> As far as I can tell AMD delivered.. It is an 8 core at an affordable price, it takes on Intels damn $1100 6900K..
> 
> Once again, if gaming is all that matters then go laugh at 6950X owners for losing to 7700K's..


Yea, I understand that, but the 6950X doesn't lose to the 7700K anywhere near as badly as Ryzen does. I believe if we saw a bigger margin, I would probably call Ryzen a niche product for those professions that actually use all 8 cores. But Ryzen doesn't do TOO badly, I guess.


----------



## SoloCamo

Quote:


> Originally Posted by *Seyumi*
> 
> Jesus people. 2017:
> 
> Gaming - Intel 7700k
> Everything else - AMD 1700
> 
> End of thread. Don't bother with the "future proofing" aspect, because both CPUs will be dinosaurs and worth very little by the time more CPU cores become more relevant.


Tell that to 2600k owners, who are far better off than 2500k owners these days. That minor price difference went a very long way, and games are only going to utilize more threads.


----------



## jprovido

You guys think the Wraith Spire LED cooler is enough for the R7 1700? Saw one on Amazon bundled with the CPU for $329.99. I'm thinking of getting a very basic motherboard for my VR rig (probably B350). A 3.8GHz OC is what I'm aiming for. 3.8 seems easy to reach, right?


----------



## Ultracarpet

Quote:


> Originally Posted by *CriticalOne*
> 
> I'm not sure what you are trying to say.
> 
> I'm not saying that I expected Ryzen to beat a 7700k. However, when a 7700K *WITHOUT* *HYPERTHREADING*, which is essentially an i5, beats it in a game that I know is very well threaded and demanding (which, you know, would put the 4C/4T 7700K at a massive disadvantage), there are questions to be asked for my usage scenario.


Does this happen in every well-threaded game? No? Could it be something that can be remedied, and not a representation of the chip's overall gaming performance? Regardless, you are, again, comparing apples to oranges. If we compare the r7 to the chip it should be compared to, the advantage in gaming is not as big, and one can then go "oh well, it is less than half the freaking price of Intel's 8 core, maybe losing in gaming by 5-10% isn't so bad".

What I'm trying to say is that you really only have yourself to blame for being disappointed. The chip was laid out VERY well in terms of what was to be expected. Probably one of the best releases AMD has ever done, IMO. The r7 was NEVER meant to be an upgrade for your quad core Intel, UNLESS that upgrade was for an increase in cores. In which case, you would still take a hit in gaming performance even if you went Intel for the increased cores.


----------



## corky dorkelson

Man, my 4790K still looking like a REALLY good buy, especially since I got it used. For games, it's still only a few frames behind some of the current top dog chips.


----------



## DADDYDC650

Quote:


> Originally Posted by *SoloCamo*
> 
> Tell that to 2600k owners, who are far better off than 2500k owners these days. That minor price difference went a very long way, and games are only going to utilize more threads.


Exactly. No point in going 7700k unless you upgrade yearly and need really high frames over minimums. If you want a chip that will last for years, on a platform with a long upgrade path, go Ryzen.


----------



## SoloCamo

Quote:


> Originally Posted by *corky dorkelson*
> 
> Man, my 4790K still looking like a REALLY good buy, especially since I got it used. For games, it's still only a few frames behind some of the current top dog chips.


Same, (well I bought it new but w/e). I'll be waiting on Zen+ for my next upgrade most likely. Really can't justify switching yet.


----------



## Quantum Reality

Quote:


> Originally Posted by *corky dorkelson*
> 
> Man, my 4790K still looking like a REALLY good buy, especially since I got it used. For games, it's still only a few frames behind some of the current top dog chips.


Mmhmm. I'm not looking to go Ryzen ASAP, as my 4690K continues to serve me well. But in a few months to a year, I think I'll get the best Ryzen R5 there is to buy and make the switch







And that will serve well for a long time.


----------



## Ultracarpet

Quote:


> Originally Posted by *chuy409*
> 
> Yea, I understand that, but the 6950X doesn't lose to the 7700K anywhere near as badly as Ryzen does. I believe if we saw a bigger margin, I would probably call Ryzen a niche product for those professions that actually use all 8 cores. But Ryzen doesn't do TOO badly, I guess.


Right, it doesn't, but if you use the 6950x and 6900k as the comparison, that 10% difference in gaming starts to become a lot less important when the chip costs 1/3 of those two.


----------



## budgetgamer120

Quote:


> Originally Posted by *ducegt*
> 
> Seems I've covered all the pitfalls of Ryzen so well that the AMD fan child club can't do anything, but attempt to insult my character. You all will have to try a lot given the most you have done thus far is go in for a group hug with each other.
> Sure, and even smaller businesses, but you might be shocked how much cash businesses don't care to save because they fear change. Sometimes their concerns are valid. A disruption period of 24 hours due to swapping Intel for AMD might not be acceptable. Those who use IT, maintain it, and ultimately purchase it in larger organizations are all different people with different purposes. More cores can also mean more software licensing expense, so raw speed is favored. I'm just a gamer at home, but I understand why the market is what it is. AMD will now make a change which, no matter how small, is better than the nothing it did before.


You are the one insulting yourself









Did you read your posts?


----------



## CriticalOne

Quote:


> Originally Posted by *Ultracarpet*
> 
> What I'm trying to say is that you really only have yourself to blame for being disappointed. The chip was laid out VERY well in terms of what was to be expected. Probably one of the best releases AMD has ever done, IMO. The r7 was NEVER meant to be an upgrade for your quad core Intel, UNLESS that upgrade was for an increase in cores. In which case, you would still take a hit in gaming performance even if you went Intel for the increased cores.


Hold up.

I don't own a quad core Intel processor or an i7. I thought I said I was upgrading and I was considering getting a Core i7 processor or getting Ryzen.


----------



## Malinkadink

Looks like these 8 cores won't go much past 4GHz. Fine, but now I'm even more interested to see how the 1600X performs. 4.6GHz would be nice.


----------



## aberrero

With the overclocks we are seeing, is it enough to just let the 1800x do its turbo and not mess with it beyond that?


----------



## GorillaSceptre

Quote:


> Originally Posted by *chuy409*
> 
> Yea, I understand that, but the 6950X doesn't lose to the 7700K anywhere near as badly as Ryzen does. I believe if we saw a bigger margin, I would probably call Ryzen a niche product for those professions that actually use all 8 cores. But Ryzen doesn't do TOO badly, I guess.


Difference is you can get a Ryzen 8 core for cheaper than a 7700K, and the 6950X costs $1700..


----------



## ducegt

Quote:


> Originally Posted by *DADDYDC650*
> 
> Exactly. No point in going 7700k unless you upgrade yearly and need really high frames vs minimum. If you want a chip that will last for years and on a platform with a long upgrade path, go Ryzen.


Don't forget some of us went to Skylake and Kaby Lake from first- and second-generation i5/i7 chips. My 7700K triples the frames of the i7-860 @ 4 GHz I had. I'm guilty of being too patient. If I had known this is how things would be, I would have gone Sandy or Ivy.


----------



## Ultracarpet

Quote:


> Originally Posted by *CriticalOne*
> 
> Hold up.
> 
> I don't own a quad core Intel processor or an i7. I thought I said I was upgrading and I was considering getting a Core i7 processor or getting Ryzen.


Then it's an even simpler choice for you. If you were originally considering getting the Intel HEDT platform, go Ryzen instead and save yourself a boatload of cash. If not, then wait for the r5 and r3 if you want to save money, or just go Intel for the absolute best gaming performance.

SIMPLE.


----------



## tpi2007

Quote:


> Originally Posted by *Wishmaker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *comagnum*
> 
> Go away.. seriously. You're the worst kind of fanboy.
> 
> 
> 
> You call me a fanboy because I am pointing out the harm certain members of this community are doing? Read the forums: people are annoyed that they bought Zen based on what OCN said, and now they have a product that does not fit their expectations. This is somewhat false advertising coming from a forum where people give advice to others.
> 
> When people hype a product, they create expectations.
> When people twist negatives into positives with bait and switch, it creates expectations.
> 
> Please continue to call me a fanboy for pointing out what is wrong with this community nowadays. It was the same with the RX 480. People were claiming 980 Ti performance, and OCN members bought the product only to realize it was nowhere near the 980 Ti. How many times does this have to happen for anyone to realize that toxicity is not good in a community?
> 
> We have two events where OCN has misled people, and they are both related to AMD launches.

1. Please don't put the whole of OCN in that boat. We all know there are shills here, as there are everywhere else, some more sophisticated than others - and this applies to _all brands_ - but you can't put a label on it and say, as in your last sentence, that "OCN has misled people".

Many people were more reserved and had a more distanced opinion of it until more concrete data was available.

Also, people who pre-ordered took a risk. It was simply impossible to know, for example, whether the 1700 was a better value than the 1700X. Eventual binning for less leakage and lower TDP had to be taken into consideration. And even now it still depends: if you don't overclock, the 1700X offers more for $70 more. It's not the end of the world in any case.

As to the teething issues, it's also impossible to know the degree to which they will be fixed, but we historically can put into perspective that Intel had teething issues too and motherboard makers issued lots of BIOS updates for Intel boards too. My X79 board, for example, had 21 BIOS updates (22 versions, including the initial one) during its lifetime.

2. Don't exaggerate. 4C/4T is not worth the premium Intel is asking for it anymore, that was already true even before Ryzen was released. And the 4C/8T scenario is also shaken because this isn't exactly the FX-8150 days where you could spend less for the 2500K for great gaming or $72 more for the 2600K and all-round win in like 99% of cases at stock (not even including overclocking).

This is a case where for the same money you get a chip that doesn't do as well in gaming as a 7700K, but still does pretty well, but that competes with a $1050 CPU in multithreading. In 2011 if you wanted to compete there by going Intel it was +$72, now it's +$550. That has to be taken into account.

I, for one, don't value the 7700K nor the 7600K at their current prices, they need to come down to $250 and $180 respectively. Highly competitive gamers will still find value in the 7700K at the current pricing, but it's not the overall value it once was, that I'm sure of.


----------



## budgetgamer120

Quote:


> Originally Posted by *amstech*
> 
> Lol I know that my old beater will struggle a little more than some of the newer ones in certain demanding games, but at the right clock speed it still competes/games OK for what it is - that's all I was trying to say!


So you just said that an old i7 competes/games OK, but Ryzen, which is much faster, does not compete and is a fail?

Nice


----------



## variant

Quote:


> Originally Posted by *Malinkadink*
> 
> Looks like these 8 cores won't go much past 4ghz. Fine, but now i'm even more interested to see how the 1600X performs. 4.6ghz would be nice.


Someone could test this now by disabling two cores and trying to overclock it.


----------



## Lex Luger

I feel sorry for the people who think the 4- and 6-core Ryzen chips are going to magically clock much higher than the 8-core. I'm guessing 100-200 MHz more max, even for the 4-core with hyperthreading disabled.


----------



## Malinkadink

Quote:


> Originally Posted by *aberrero*
> 
> With the overclocks we are seeing, is it enough to just let the 1800x do its turbo and not mess with it beyond that?


I would find a stable core voltage and leave everything else alone. Chances are auto will give it more voltage than it needs; finding the minimum stable voltage yourself can shave off some temps.


----------



## darealist

For me, if you've waited this long you might as well wait for Coffee Lake, with mainstream 6 cores replacing the i7 7700K. Top-of-the-line IPC with high clocks, which should translate well to 6 cores.


----------



## tygeezy

Quote:


> Originally Posted by *ducegt*
> 
> Don't forget some of us went to sky and kaby lake from first and second generation chips. My 7700K triples the frames of the i7 860 4ghz I had. I'm guilty of being too patient. If I would have known this is how things would be, I would have gone sandy or ivy.


I can only hit 3.5 on my 860. I'm on a Cooler Master Hyper 212 EVO. So really, a 7700K or an AMD 1700 would be massive for me. What games are you seeing triple the framerate in?

I'm looking to keep it at 140 fps 100% of the time in both CS:GO and Overwatch (1080p, 144 Hz G-Sync). Overwatch does a better job of maintaining that fps, but I'm also looking toward future multithreaded titles where minimums will be key, as I like to cap at the low end to maintain consistent frame times and input latency.
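The frame-time arithmetic behind that cap can be sketched quickly (a toy calculation; the 60/140/144 fps figures are just the targets mentioned in this thread, nothing here is measured):

```python
# Frame-time budget implied by an FPS cap.
# Capping slightly below the refresh rate gives every frame the same
# time budget, which is what keeps frame pacing and input latency consistent.

def frame_budget_ms(fps_cap: float) -> float:
    """Milliseconds the CPU and GPU together have to produce each frame."""
    return 1000.0 / fps_cap

for cap in (60, 140, 144):
    print(f"{cap} fps cap -> {frame_budget_ms(cap):.2f} ms per frame")
```

At a 140 fps cap the whole pipeline has roughly 7.14 ms per frame; any frame where the CPU alone takes longer than that shows up as a dip below the cap.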


----------



## hokk

Quote:


> Originally Posted by *corky dorkelson*
> 
> Man, my 4790K still looking like a REALLY good buy, especially since I got it used. For games, it's still only a few frames behind some of the current top dog chips.


I was actually on a 2500K, then was offered this whole system for only $700, so I jumped on it.

I can't see myself upgrading for a long time, maybe 3 years or later.


----------



## dir_d

I'm still getting an 1800X to replace my 2500K. All I see are minor issues that will get hashed out with Windows patches and BIOS fixes. I game at 1440p anyway, but the extra cores are needed.


----------



## Lex Luger

Six-core 10nm Cannon Lake chips will allegedly be laptop-only.

14nm six-core Coffee Lake chips will be available, though.

At this point, I'm waiting for Ice Lake before I give Intel more of my money - that, or maybe Zen+ if AMD can improve IPC by 20 percent and achieve 5 GHz overclocks, both of which I highly doubt.


----------



## budgetgamer120

Quote:


> Originally Posted by *Lex Luger*
> 
> I feel sorry for the people that think the 4 and 6 core ryzen are going to magically clock much higher than the 8 core. I'm guessing 100-200 mhz more max, even the 4 core with hyperthreading disabled.


Agreed. I do not see a 5 GHz 4-core or 6-core Zen happening... maybe 4.4 GHz max.


----------



## Melan

I guess I'll be on my 3770K until it (or mobo) finally dies. Le sigh.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Lex Luger*
> 
> I feel sorry for the people that think the 4 and 6 core ryzen are going to magically clock much higher than the 8 core. I'm guessing 100-200 mhz more max, even the 4 core with hyperthreading disabled.


It has nothing to do with "magic", it's physics.

In any case, I highly, highly doubt they will be able to match Intel's process, but their prices make them very competitive.


----------



## madweazl

Quote:


> Originally Posted by *DADDYDC650*
> 
> Seems as if Ryzen has better minimum frames on average. i7 wins out in average and high frames. I'm at 4k/60Hz so Ryzen is def for me.


I don't know that the gap would ever be noticeable. This review was done at 1080p (mostly), but at higher resolutions I didn't see any difference (closer, if anything) in other reviews. I think the 1700 is a fantastic option right now (it's what I would do if I were in the market). All this GPU-bottleneck bickering is crazy when they're using the strongest GPUs on the market; what else are we gaming with? I think all the early adopters will be quite happy.


----------



## Quantum Reality

I will be particularly interested to see how Ryzen-based laptops shake out. With the kinds of prices AMD is giving on high-end retail, I would be unsurprised to see ~$500 gaming-capable laptops instead of the absurd $1200+ Intel-based machines that have become so commonplace.


----------



## microchidism

Quote:


> Originally Posted by *Seyumi*
> 
> Jesus people. 2017:
> 
> Gaming - Intel 7700k
> Everything else - AMD 1700
> 
> End of thread. Don't bother with the "future proofing" aspect because both CPUs will be dinosaurs and worth very little by the time more CPU cores become relevant.


I think this may be the best post in this entire thread


----------



## Lex Luger

Ryzen laptops are interesting, but Ryzen APU laptops are even more interesting. In laptops the lack of high overclocks matters zero, and Ryzen seems to have good energy efficiency.

A 4-core Ryzen APU with an RX 480-level GPU and 16 GB of HBM2 memory, all for 350 bucks - sounds like a game changer to me that will finally justify overpaying for ATI graphics all those years ago.


----------



## BroJin

GamersNexus and Joker discussing Ryzen Benchmarks
https://www.youtube.com/watch?v=04p_ryVM2ow


----------



## SoloCamo

Quote:


> Originally Posted by *Lex Luger*
> 
> Ryzen laptops are interesting, but Ryzen APU laptops are even more interesting. With laptops the lack of high overclocks matters zero, and Ryzen has good energy efficiency it seems.
> 
> Ryzen 4 core apu with rx480 level gpu and 16 gb of HBM2 memory all for 350 bucks. Sounds like a game changer to me that will finally justify overpaying for ATI graphics all those years ago.


We're not getting a 480-class APU for a very long time, let alone at $350.

But yes, the Ryzen APUs are by far what I'm looking forward to the most, especially after seeing how efficient these are.


----------



## 7850K

Quote:


> Originally Posted by *microchidism*
> 
> I think this may be the best post in this entire thread


Yeah, because all the Sandy Bridge owners dumped those dinosaurs years ago.


----------



## DADDYDC650

Quote:


> Originally Posted by *aberrero*
> 
> With the overclocks we are seeing, is it enough to just let the 1800x do its turbo and not mess with it beyond that?


It only turbos to 4-4.1 GHz using 1 or 2 cores. Otherwise it runs at 3.6 GHz.


----------



## SoloCamo

https://www.youtube.com/watch?v=04p_ryVM2ow

Turns out Joker's 1700X had 3000 MHz memory running, which is likely why his 3.9 GHz 1700X did so well compared to the 5 GHz 7700K. Seems these memory issues are hurting performance quite a bit, from what I gathered.


----------



## ducegt

Quote:


> Originally Posted by *tygeezy*
> 
> I can only hit 3.5 on my 860. I'm on a Cooler Master Hyper 212 EVO. So really, a 7700K or an AMD 1700 would be massive for me. What games are you seeing triple the framerate in?
> 
> I'm looking to keep it at 140 fps 100% of the time in both CS:GO and Overwatch (1080p, 144 Hz G-Sync). Overwatch does a better job of maintaining that fps, but I'm also looking toward future multithreaded titles where minimums will be key, as I like to cap at the low end to maintain consistent frame times and input latency.


Doom for the triple. I'm on a rather slow GPU, an R9 285, so I even tested with the lowest graphics settings, though they aren't very different from the medium preset. I was a hardcore CS player for 15 years, so I know what you're looking for - was GE for a period, did all the tweaks, even wrote guides/configs. My 860 dipped into the 70s and 80s during firefights in CS:GO; the 7700K holds above 200. I didn't care to get an accurate number - it's a big improvement.

Now I play mostly OW, and I can't tell if the CPU made a big change or if I'm GPU-limited, using mostly low settings at 1080p. It's safe to assume my GPU is the bottleneck, and all I can say is my experience is mostly the same in OW. The new COD and FC Primal went from an unplayable 30 fps to always above 60, and that is with the high presets. The 285 is only 2GB, so more than doubling minimum FPS with FreeSync has me very happy that I upgraded now.

I saw you mentioned BF1 earlier. Even though it multithreads well, the 7700K still outperforms Zen in it.


----------



## Slomo4shO

Quote:


> Originally Posted by *GorillaSceptre*
> 
> I highly, highly doubt they will be able to match Intel's process, but their prices make them very competitive.


Spend $100 more for a GPU or for a CPU. Oh, the choices we will make.


----------



## tygeezy

Quote:


> Originally Posted by *ducegt*
> 
> Doom for the triple. I'm on a rather slow GPU, an R9 285, so I even tested with the lowest graphics settings, though they aren't very different from the medium preset. I was a hardcore CS player for 15 years, so I know what you're looking for - was GE for a period, did all the tweaks, even wrote guides/configs. My 860 dipped into the 70s and 80s during firefights in CS:GO; the 7700K holds above 200. I didn't care to get an accurate number - it's a big improvement. Now I play mostly OW, and I can't tell if the CPU made a big change or if I'm GPU-limited, using mostly low settings at 1080p. It's safe to assume my GPU is the bottleneck, and all I can say is my experience is mostly the same in OW. The new COD and FC Primal went from an unplayable 30 fps to always above 60, and that is with the high presets. The 285 is only 2GB, so more than doubling minimum FPS with FreeSync has me very happy that I upgraded now. I saw you mentioned BF1 earlier. Even though it multithreads well, the 7700K still outperforms Zen in it.


Thanks for the input. I'm on a GeForce 1070, so I'm in the opposite direction, being CPU-bound. This old chip still does well, but it would be nice to have a 140 fps cap held 100% of the time. Were the minimums still better on the 7700K? How about the CPU usage? I know with the i5s it's at like 100% most of the time.


----------



## dragneel

Quote:


> Originally Posted by *dir_d*
> 
> I'm still getting a 1800x to replace my 2500k. All i see are minor issues that will get hashed out with windows patches and bios fixes. I game at 1440p anyways but the extra cores are needed.


Same boat, but I think I'm getting the 1700.


----------



## Lex Luger

With only dual-channel memory and twice as many cores as Intel's 4-core chips, yeah, you'd better take the time to fine-tune the memory and get it to 3200 with these chips. Maxing out what the memory controller can do will be key - more so than getting that last 100-200 MHz out of the chip.


----------



## Motley01

Quote:


> Originally Posted by *DaaQ*
> 
> From the Tom's conclusion, and to add to tpi2007's point.
> I will wait for member results before jumping to conclusions.
> I think the emphasis (which is mine) of the above quote should speak for itself.


Well, I'm still a couple of hours away from finishing the build. Then I'll install a fresh copy of Win 10 too.
I'll post some results later, and maybe some build pics too.

I don't care about any of the Intel trolls/haters. My new system will be AWESOME, trust me. I'm the only one I need to please, and I feel comfortable with the money I spent too.

So y'all keep arguing; I'll be playing with my new AMD rig.


----------



## dieanotherday

Quote:


> Originally Posted by *dragneel*
> 
> same boat, but i think im getting the 1700.


I'm on a 2600K @ 4.7.

The only reason I cancelled my order is that there are no good small motherboards.


----------



## looniam

Quote:


> Originally Posted by *SoloCamo*
> 
> https://www.youtube.com/watch?v=04p_ryVM2ow
> 
> Turns out Joker's 1700X had 3000 MHz memory running, which is likely why his 3.9 GHz 1700X did so well compared to the 5 GHz 7700K. Seems these memory issues are hurting performance quite a bit, from what I gathered.


I tuned in just before your post and caught the last part of Steve talking about the difference between Samsung and Hynix ICs.

I didn't catch what the difference is, just that there is one.


----------



## Quantum Reality

From this benchmark video: https://www.youtube.com/watch?v=V5RP1CPpFVE



The results and the red boxes really say it all for me - this is what AMD can deliver when paired with high-end motherboards and RAM modules - with a 1 GHz shortfall compared to Intel when overclocked.


----------



## ducegt

Quote:


> Originally Posted by *tygeezy*
> 
> Thanks for the input. I'm on a GeForce 1070, so I'm in the opposite direction, being CPU-bound. This old chip still does well, but it would be nice to have a 140 fps cap held 100% of the time. Were the minimums still better on the 7700K? How about the CPU usage? I know with the i5s it's at like 100% most of the time.


Minimums are a lot better. I don't pay attention to utilization; it's not a good indicator of much for gaming, IMO. With a 1070, you deserve an upgrade. Even a 6600K would improve your gaming over the 860.


----------



## tpi2007

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lex Luger*
> 
> I feel sorry for the people that think the 4 and 6 core ryzen are going to magically clock much higher than the 8 core. I'm guessing 100-200 mhz more max, even the 4 core with hyperthreading disabled.
> 
> 
> 
> It has nothing to do with "magic", it's physics..
> 
> In any case, I highly, highly doubt they will be able to match Intel's process, but their prices make them very competitive.

They have a lot of experience doing that. Not with this process node, of course, but back on 32nm, Bulldozer -> Piledriver got a 400 MHz base and 200 MHz turbo speed bump, and then on 28nm they took the Steamroller-based APUs from 3.7 GHz base / 4.0 GHz turbo on the 7850K to 4.1 GHz base / 4.3 GHz turbo on the 7890K.

Quote:


> Originally Posted by *SoloCamo*
> 
> https://www.youtube.com/watch?v=04p_ryVM2ow
> 
> Turns out Joker's 1700X had 3000 MHz memory running, which is likely why his 3.9 GHz 1700X did so well compared to the 5 GHz 7700K. Seems these memory issues are hurting performance quite a bit, from what I gathered.


Ah, so it may be akin to the 6700K being hurt by using the standard 2133 MHz DDR4 it was validated with.


----------



## TomiKazi

I'm now suffering from post-launch hype fatigue.


----------



## looniam

wow 5 minutes . . . *crickets*


----------



## tygeezy

Quote:


> Originally Posted by *ducegt*
> 
> Minimums are a lot better. I don't pay attention to utilization; it's not a good indicator of much for gaming, IMO. With a 1070, you deserve an upgrade. Even a 6600K would improve your gaming over the 860.


Oh yeah, for sure I'm in dire need. It's kind of annoying that Sandy Bridge gets all the love as the everlasting CPU when the i7 920/860 were both stellar chips.

So you think the 7700K would be the better buy primarily for gaming? Actually, the bigger upgrade I need is a desk chair, believe it or not. The one I have at work has really opened my eyes.

It's really annoying how DDR4 prices have skyrocketed right around the time I finally get to upgrade. On either platform I want to use minimum 3000 MHz DDR4.


----------



## lombardsoup

Quote:


> Originally Posted by *Quantum Reality*
> 
> From this benchmark video: https://www.youtube.com/watch?v=V5RP1CPpFVE
> 
> 
> 
> The results and the red boxes really say it all for me - this is what AMD can deliver when paired with high-end motherboards and RAM modules - with a 1 GHz shortfall compared to Intel when overclocked.


Interesting. That's very different from the narrative other reviewers are pushing.


----------



## budgetgamer120

Quote:


> Originally Posted by *ducegt*
> 
> Minimums are a lot better. *I don't pay attention to utilization. It's not a good indicator of much for gaming IMO*. With a 1070, you deserve an upgrade. Even a 6600K will improve your gaming over the 860.


----------



## Motley01

Quote:


> Originally Posted by *looniam*
> 
> wow 5 minutes . . . *crickets*


I think people are getting sick of arguing, I know I am.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Lee Patekar*
> 
> You forgot about the PCIe voltage fix via drivers already? Quite a few reviewers saw an increase in performance by turning off SMT.. I suspect that should be fixed via Bios or something in the near future, or at least addressed.
> 
> Tech isn't static. I aint buying s**t until its done its paces in the wilds for three months.


I see where you are coming from.

As always, early-adoption rules apply. Expect bumps and hiccups with first steppings and chipset releases.

Quote:


> Originally Posted by *Quantum Reality*
> 
> From this benchmark video: https://www.youtube.com/watch?v=V5RP1CPpFVE
> 
> 
> 
> The results and the red boxes really say it all for me - this is what AMD can deliver when paired with high-end motherboards and RAM modules - with a 1 GHz shortfall compared to Intel when overclocked.


Very nice.


----------



## hokk




----------



## GnarlyCharlie

Quote:


> Originally Posted by *Quantum Reality*
> 
> "... but gamers should look elsewhere"???
> 
> Are you joking, Ars Technica? Your own benches show that the Ryzen 1800X you got is within 10% or so of the 7700K in games.


Quote:


> Originally Posted by *Wishmaker*
> 
> Quite a few reviews are saying that gamers should go for Intel, and not just Ars Technica. In this day and age, Intel has pulled an NVIDIA on AMD and is charging top dollar to have the highest performance. This is what happens when your competitors are lagging and do not manage to deliver products that sell. No capital, no R&D, no top performance.
> 
> In a galaxy far far away, the gaming market had place for just one series of chips and that was the FX. You wanted the best of the best? You bought the FX series.


Isn't the 1800X about $200 more expensive than the 7700K, and giving up that 10% or so?

Don't get me wrong, the 1800X does very well in other things, but they qualified the statement with "but gamers". Not sure most gamers would throw down an extra $200 to give up performance unless they were AMD diehards, and then the $200 or the performance wouldn't matter.
Quote:


> Originally Posted by *Quantum Reality*
> 
> I'm just rolling my eyes at the whole "Well AMD is NO GOOD AT ALL for gaming" emanating from some reviews. It's like they can't bring themselves to admit AMD has brought a competitive CPU to the table for about half the cost of Intel's top flagship offerings that get 10% or so extra performance in game framerates.


But the 1800X is not half the cost of the 7700K that it's giving up the gaming performance to.

I understand the frustration when compared to a 6900K, but it was the 7700K that got this whole ball rolling.


----------



## PontiacGTX

Quote:


> Originally Posted by *sugarhell*
> 
> I never liked GamerNexus
> 
> I disagree a lot with their methodology


Quote:


> Originally Posted by *Gamer Nexus*
> When we approached AMD with these results pre-publication, the company defended its product by suggesting that intentionally creating a GPU bottleneck (read: no longer benchmarking the CPU's performance) would serve as a great equalizer. AMD asked that we consider 4K benchmarks to more heavily load the GPU, thus reducing workload on the CPU and leveling the playing field.


----------



## trism

Quote:


> Originally Posted by *Quantum Reality*
> 
> From this benchmark video: https://www.youtube.com/watch?v=V5RP1CPpFVE
> 
> 
> 
> The results and the red boxes really say it all for me - this is what AMD can deliver when paired with high-end motherboards and RAM modules - with a 1 GHz shortfall compared to Intel when overclocked.


The (game) tests in that review are GPU limited. Not a very good review when it comes to comparing 7700k vs Ryzen.


----------



## aDyerSituation

Quote:


> Originally Posted by *PontiacGTX*


----------



## SoloCamo

I wish the reviewers had focused on the 1700 instead of the 1800X. It's cheaper than the 7700K and OCs the same (from what we've seen).


----------



## sugarhell

Quote:


> Originally Posted by *PontiacGTX*


And how does this relate to my comment?


----------



## Kuivamaa

Quote:


> Originally Posted by *trism*
> 
> The (game) tests in that review are GPU limited. Not a very good review when it comes to comparing 7700k vs Ryzen.


It's still a GTX 1080 used at 1080p. Any more GPU power at this resolution or lower and it turns a useful bench that tells you "what do these CPUs do in contemporary gaming" into a semi-worthless synthetic test.


----------



## iLeakStuff

There is something extremely weird about the differences reviewers are getting in games, indeed.
AMD needs to get on top of that.

It's strange that synthetic benchmarks that don't scale linearly with cores do so much better on Ryzen than some games.


----------



## Xuper

Quote:


> Originally Posted by *iLeakStuff*
> 
> There is something extremely weird about the differences reviewers are getting in games, indeed.
> AMD needs to get on top of that.
> 
> It's strange that synthetic benchmarks that don't scale linearly with cores do so much better on Ryzen than some games.


I think it's different BIOSes/memory kits/mobos.


----------



## Artikbot

Quote:


> Originally Posted by *trism*
> 
> The (game) tests in that review are GPU limited. Not a very good review when it comes to comparing 7700k vs Ryzen.


It might not be a very good review, but it is the scenario myself and many others will encounter.

Sure, it's great to set the resolution to 1024x768 and details to rock bottom for the sake of a CPU comparison, but ask yourself: is that how anyone plays games?

I'm far more interested in knowing how it does the way I play games (that is, 1080p and beyond at mid-high graphics and above), mostly because that's how most of us play, and thus the way it's relevant to us.

Quote:


> Originally Posted by *iLeakStuff*
> 
> There is something extremely weird about the differences reviewers are getting in games, indeed.
> AMD needs to get on top of that.
> 
> It's strange that synthetic benchmarks that don't scale linearly with cores do so much better on Ryzen than some games.


It depends on how games make use of CPU resources I'd wager. Remember when Skyrim didn't even use SSE?


----------



## comagnum




----------



## ZealotKi11er

Quote:


> Originally Posted by *iLeakStuff*
> 
> There is something extremely weird about the differences reviewers are getting in games, indeed.
> AMD needs to get on top of that.
> 
> It's strange that synthetic benchmarks that don't scale linearly with cores do so much better on Ryzen than some games.


I am waiting for Digital Foundry.


----------



## trism

Quote:


> Originally Posted by *Kuivamaa*
> 
> It still is a GTX1080 used at 1080p. Any more GPU power on this resolution or lower and it turns a useful bench that tells you "what do these CPUs do in contemporary gaming" in a semiworthless synthetic test.


Not really. The point is to test CPU capability. You could achieve the same results with older CPUs too, so what value does this test hold, really?

If they ran the tests at lower settings, you could see the real differences - mainly, how well the CPUs will hold up in the future. If there is a massive performance gap at lower settings, it could mean that 2-3 years from now, with better GPUs and higher resolutions, you would have to update the entire system instead of just the GPU.
Quote:


> Originally Posted by *Artikbot*
> 
> It might not be a very good review, but it is the scenario myself and many others will encounter.
> 
> Sure, it is great to set the resolution to 1024x768 and details to rock bottom for the sake of doing a CPU comparison, but ask yourself, is that how anyone plays games?
> 
> I myself am far more interested in knowing how it does at the way I play games (that is, 1080p and beyond at mid-high graphics and above). Mostly because that's how most of us play, thus the way it is relevant for us.


Uh... when you get 30% more performance at lower settings with Intel, you are going to get AT LEAST the same as Ryzen at higher settings. Testing in a GPU-limited scenario does not test the CPU, it tests the GPU - and the Internet is already filled with those tests.

And speak for yourself. I play at 1024x768 in everything competitive and at most 1080p in single-player games. Frames per second is a much more important metric for me than graphics. 10-year-old games look fine; Minecraft too.
Quote:


> Originally Posted by *iLeakStuff*
> 
> That's exactly how you should review a CPU, because games will never be CPU-bottlenecked with either a 7700K or an 1800X.
> Doing the test the way you describe would make the game tests synthetic, which doesn't make them gaming tests.
> 
> So no


They could very well be in the future, like 2 years from now. I don't plan on upgrading my entire system every time if I can buy something now that means only a GPU upgrade a few years down the road.


----------



## iLeakStuff

Quote:


> Originally Posted by *trism*
> 
> The (game) tests in that review are GPU limited. Not a very good review when it comes to comparing 7700k vs Ryzen.


That's exactly how you should review a CPU, because games will never be CPU-bottlenecked with either a 7700K or an 1800X.
Doing the test the way you describe would make the game tests synthetic, which doesn't make them gaming tests.

So no


----------



## budgetgamer120

Quote:


> Originally Posted by *iLeakStuff*
> 
> That's exactly how you should review a CPU, because games will never be CPU-bottlenecked with either a 7700K or an 1800X.
> Doing the test the way you describe would make the game tests synthetic, which doesn't make them gaming tests.
> 
> So no


We game at 720p though.


----------



## Artikbot

Quote:


> Originally Posted by *iLeakStuff*
> 
> That's exactly how you should review a CPU, because games will never be CPU-bottlenecked with either a 7700K or an 1800X.
> Doing the test the way you describe would make the game tests synthetic, which doesn't make them gaming tests.
> 
> So no


I think this is the second time I ever agree with you.


----------



## ZealotKi11er

Quote:


> Originally Posted by *trism*
> 
> Not really. It is testing the CPU capability. You could achieve the same results with older CPUs too so what value does this test hold, really?
> 
> If they are running the tests on lower settings, you could see the real differences - mainly, how well the CPUs hold up in the future. If there is a massive performance issue at lower settings, it could mean that 2-3 years from now with better GPUs and higher resolutions, you would have to update the entire system instead of just the GPU.


Still can't believe people don't understand this much. CPUs get 5-10% faster every 1-2 years, but GPUs get 30-50% faster every year. In 3-4 years we'll have 2-3x faster GPUs with the same CPU. Even a Titan XP @ 1080p is a realistic configuration, because in 3-4 years $150 cards will have that performance.
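The argument above can be sketched with a toy frame-time model: frame time is roughly bounded by the slower of the CPU and GPU, so a GPU-bound test hides CPU differences that a faster future GPU will expose. All the millisecond figures below are hypothetical, purely for illustration.

```python
# Toy model of why low-resolution CPU tests predict future headroom:
# effective frame time is approximately the slower of the CPU and GPU stages.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work on frames in parallel."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_a, cpu_b = 6.0, 8.0   # two hypothetical CPUs (ms of game logic per frame)
gpu_today = 12.0          # GPU-bound at high settings today
gpu_future = 4.0          # a ~3x faster future GPU

# GPU-bound: both CPUs produce identical FPS, the gap is invisible.
print(fps(cpu_a, gpu_today), fps(cpu_b, gpu_today))

# Faster GPU: the same CPUs now differ noticeably.
print(fps(cpu_a, gpu_future), fps(cpu_b, gpu_future))
```

Lowering the resolution in a benchmark shrinks `gpu_ms` the same way a future GPU upgrade would, which is why the low-res numbers disagree with the GPU-limited ones.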


----------



## tpi2007

Quote:


> Originally Posted by *looniam*
> 
> wow 5 minutes . . . *crickets*


Some people were probably watching the livestream between Joker and Gamers Nexus that was linked a few posts back (it just ended).

According to what Steve from Gamers Nexus said AMD told them, most of these should do at most 3.9 GHz on all cores, with just a few being able to do 4.0-4.1 GHz on all cores.

So, if a 1700 (non X) can also do that in general, it's probably the best value for overclockers. Let's see more reviews confirm that.

Edit: Also, it's late into the discussion.


----------



## ducegt

Quote:


> Originally Posted by *tygeezy*
> 
> Oh yeah, for sure i'm in dire need. It's kind of annoying that Sandybridge gets all the love as the everlasting cpu when the i7 920/860 were both stellar chips.
> 
> So you think the 7700 k would be a better buy as primarily for gaming? Actually, the bigger upgrade I need is a desk chair believe it or not. The one I have at work has really opened up my eyes.
> 
> It's really annoying how DDR 4 prices have skyrocketed right around the time I finally go to upgrade. Either platform I want to use minimum 3000 mhz ddr 4.


The 7700K is a better choice for your needs from what I've gathered. I have always modified my desks for FPS gaming and recently modified my chair too. My wife isn't happy with that, so I should be looking for a chair as well, but Vega... My work chair is like 600 USD. Wish I had it at home as well. I went nuts with 3600 CL15 Trident Z 8x2 for 180. I've never spent so much on RAM since Corsair XMS DDR1. If you do decide to be cheap on the mobo and RAM, my 2 cents is to spend less on the mobo if you don't have extra needs and spend more on the RAM.


----------



## KarathKasun

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Still can't believe people don't understand this. CPUs get 5-10% faster every 1-2 years, but GPUs get 30-50% faster every year. In 3-4 years we'll have a 2-3x faster GPU with the same CPU. Even a Titan XP @ 1080p is a realistic configuration, because in 3-4 years $150 cards will have that performance.


30%-50% is somewhat of a stretch. How long was the GTX 980 Ti out before the GTX 1080 Ti was even announced?

Anyway, there were mumblings about SMT not playing well with the micro-op cache causing performance to crater pretty hard in a few situations. Looks like that turned out to be true, with ~5% gains from disabling SMT.

Also, something like 50 pages got nuked. XD


----------



## Kriant

Soooo Crosshair Hero VI or Titanium for mobo choice?


----------



## Artikbot

Someone on XS reported that his CHVI decided to update the BIOS all by itself and bricked itself.

So there's that.


----------



## Kpjoslee

Quote:


> Originally Posted by *Kriant*
> 
> Soooo Crosshair Hero VI or Titanium for mobo choice?


I've always preferred Asus, but I like the look of the Titanium better lol.


----------



## Potatolisk

Not the best. But I would still take the 1700 over the 7700K. Once the memory and SMT issues are resolved, it should be better. The 1600X might be the sweet spot when it's released.


----------



## C2H5OH

Quote:


> Originally Posted by *Quantum Reality*
> 
> From this benchmark video: https://www.youtube.com/watch?v=V5RP1CPpFVE
> 
> 
> 
> The results and the red boxes really say it all for me - this is what AMD can deliver when paired with high-end motherboards and RAM modules - with a 1 GHz shortfall compared to Intel when overclocked.


Quote:


> Originally Posted by *trism*
> 
> The (game) tests in that review are GPU limited. Not a very good review when it comes to comparing 7700k vs Ryzen.


That's not entirely one-sided. What Quantum Reality shows is a must in benchmarking.

- Yes, when you remove the GPU bottleneck you see what the CPU is capable of, and it can be considered some sort of IPC check.
- Showing scenarios at 1440p and 4K reveals that Ryzen will not bottleneck your GPU.

As such, if you game with a GTX Titan and a Ryzen 7, you will see the same FPS as with an Intel 7700K. At the end of the day, isn't that what high-end gamers are doing?
At the same time, Ryzen can do much more in non-gaming scenarios.

When reviewers show only one side of things, you get mass hysteria, which does no one any good. Sure, there are some quirks to be ironed out in Windows with scheduling and core parking, but I really liked what I saw with Ryzen.


----------



## tp4tissue

I think the 1800X would be a real hit at $300.

The encoding/streaming argument is old. They said that back with Bulldozer, and in the end, that didn't move the chip

outside of the couple hundred or so people on this forum. No one else who streams even has $1000 to buy a new computer.

hahahahaha.


----------



## mistax

3 days of delivery exceptions from FedEx. My raaam!


----------



## tpi2007

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *trism*
> 
> Not really. It is testing the CPU capability. You could achieve the same results with older CPUs too so what value does this test hold, really?
> 
> If they are running the tests on lower settings, you could see the real differences - mainly, how well the CPUs hold up in the future. If there is a massive performance issue at lower settings, it could mean that 2-3 years from now with better GPUs and higher resolutions, you would have to update the entire system instead of just the GPU.
> 
> 
> 
> Still can't believe people don't understand this. CPUs get 5-10% faster every 1-2 years, but GPUs get 30-50% faster every year. In 3-4 years we'll have a 2-3x faster GPU with the same CPU. Even a Titan XP @ 1080p is a realistic configuration, because in 3-4 years $150 cards will have that performance.

Yes, but in 3-4 years games won't be relying on fewer threads, so the situation for Ryzen will only get better from now on, not worse, even at 1080p, even if they don't do any further scheduler / game / BIOS optimizations.

Quote:


> Originally Posted by *Artikbot*
> 
> Someone on XS reported that his CHVI decided to update the BIOS all by itself and bricked itself.
> 
> So there's that.


What? Did they have any motherboard specific Windows software installed that does such stuff automatically or something?


----------



## PontiacGTX

Quote:


> Originally Posted by *sugarhell*
> 
> And how this relates to my comment?


That they didn't accept using a methodology that effectively hides CPU differences in games when the framerate isn't locked, like here.


----------



## sugarhell

Quote:


> Originally Posted by *PontiacGTX*
> 
> That they didn't accept using a methodology that effectively hides CPU differences in games when the framerate isn't locked.


But the way you quoted me implies that I have the same methodology as AMD? I don't understand.


----------



## trism

Quote:


> Originally Posted by *C2H5OH*
> 
> That's not entirely true!
> 
> Yes, when you remove the GPU bottleneck you see what the CPU is capable of and it can be considered some sort of IPC check, but showing scenarios at 1440p and 4K reveals that Ryzen will not bottleneck your GPU.
> 
> As such, if you game with a GTX Titan and a Ryzen 7, you will see the same FPS as with an Intel 7700K. At the end of the day, isn't that what high-end gamers are doing?
> At the same time, Ryzen can do much more in non-gaming scenarios.
> 
> When reviewers show only one side of things, you get mass hysteria, which does no one any good. Sure, there are some quirks to be ironed out in Windows with scheduling and core parking, but I really liked what I saw with Ryzen.


It is entirely true, because we could be doing these tests at 4K with low-tier i3s and get the same FPS on all systems. It does not tell *a single thing* about the processor, except that it is enough to cap the GPU. Sure, if that's your goal, then I'm not here judging you. I am more interested in the raw performance of the processor in non-limited scenarios, because that is a much better indicator of how long I can stay with the processor before upgrading. Judging by the reviews I've seen, at least for now the i7 7700K is a much better choice *if you only play games* and seek high FPS.

I am not saying the Ryzen is a bad processor and I am not an Intel fanboy. I just don't think that review really tells anything except that Ryzen can do what a lot of other CPUs can also do.
Quote:


> Originally Posted by *tpi2007*
> 
> Yes, but in 3-4 years games won't be relying on fewer threads, so the situation for Ryzen will only get better from now on, not worse, even at 1080p, even if they don't do any further scheduler / game / BIOS optimizations.


Possible. But that same thing has also been said for a decade now... I guess we could be at a turning point right now, who knows. All I know is that Ryzen is very well executed on AMD's part and I really like how it turned out. I'm guessing AMD will get quite a few sales, since the "power user" crowd is quite a minority. My only gripe was with that review, not the processor itself.
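The GPU-cap argument running through this exchange can be illustrated with a toy model in which delivered FPS is simply the minimum of what the CPU and GPU can each sustain; all the numbers here are invented for illustration, not benchmark results:

```python
# Toy bottleneck model: delivered FPS is capped by the slower component.
# All FPS figures are invented to illustrate the argument, not benchmarks.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frames per second the whole system can actually show."""
    return min(cpu_fps, gpu_fps)

cpu_a, cpu_b = 160.0, 120.0   # hypothetical CPU-side frame rates

# At 4K the GPU only manages 60 FPS: both CPUs look identical.
assert delivered_fps(cpu_a, 60.0) == delivered_fps(cpu_b, 60.0) == 60.0

# At 720p the GPU manages 300 FPS: the CPU difference is exposed.
print(delivered_fps(cpu_a, 300.0), delivered_fps(cpu_b, 300.0))  # 160.0 120.0
```

A GPU-bound test makes both hypothetical CPUs look identical, which is exactly the complaint here; dropping the resolution exposes the difference.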


----------



## iLeakStuff

Here is a summary of 12 games tested: https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/#abschnitt_vor_und_nachteile_durch_smt

7700K is only 12% faster than 1800X stock and 8% faster with 4.1GHz clock.
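As a quick sanity check on how those percentages relate (using only the figures quoted above; the 100 FPS baseline is arbitrary, not from the review):

```python
# Convert "CPU A is X% faster than CPU B" into FPS figures.
# Percentages are the ones quoted above; the 100 FPS baseline is arbitrary.

def faster_by(base_fps: float, percent: float) -> float:
    """FPS of a CPU that is `percent`% faster than `base_fps`."""
    return base_fps * (1.0 + percent / 100.0)

ryzen_stock = 100.0                       # arbitrary baseline
i7_vs_stock = faster_by(ryzen_stock, 12)  # 7700K vs stock 1800X
# If the gap shrinks to 8% once the 1800X runs at 4.1 GHz, the
# overclocked 1800X sits at i7 / 1.08:
ryzen_oc = i7_vs_stock / 1.08

print(f"7700K: {i7_vs_stock:.1f}, 1800X @ 4.1 GHz: {ryzen_oc:.1f}")
```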


----------



## Majin SSJ Eric

I'll be replacing my 4930K with a 1700 and CH6 over the summer I think. Will probably throw in a couple (hopefully) cheap 1080's to replace my OG Titans and call it a day. Will still have a very beastly setup for around $1200 total, or around the same amount I spent on each of my Titans when they came out!


----------



## aDyerSituation

Didn't people use the same "it will get better with time" argument 5 years ago with the FX series?

By the time that does happen, single threaded performance gap will be even greater. I made this mistake with the FX series, I won't make it again. I was expecting 5-10% faster than haswell in gaming on top of the more cores, and at least hitting 4.3-4.5.


----------



## ihatelolcats

is there any OC difference between these processors?


----------



## IRobot23

Quote:


> Originally Posted by *trism*
> 
> The (game) tests in that review are GPU limited. Not a very good review when it comes to comparing 7700k vs Ryzen.


Not all of them.
BF1 and Watch Dogs 2.
You will see the GTX 1080 dropping under 90% usage on an empty map in BF1. EMPTY. A 64-player server would be much more critical for the i7 7700K than for Ryzen.


----------



## comagnum

Quote:


> Originally Posted by *comagnum*


Just reposting because it was obviously missed.


----------



## PontiacGTX

Quote:


> Originally Posted by *sugarhell*
> 
> But the way that you quote you imply that i have the same methodology as amd? I dont understand


That you say their methodology is wrong, but it seems it could be worse.


----------



## C2H5OH

Quote:


> Originally Posted by *PontiacGTX*
> 
> that they didnt accept using a methology which implies to hide cpu diference at games if framerate isnt locked


Yes and no.
Nothing was stopping them from testing 720p, 1080p, 1440p, and 4K if they really were concerned that AMD wanted to cheat.

At higher resolutions you see that Ryzen is absolutely capable of feeding the GPU, and there will be little difference in FPS for gamers using 1440p or 4K. Currently, all those using such monitors have no idea how Ryzen will perform for them.


----------



## tygeezy

Quote:


> Originally Posted by *ducegt*
> 
> The 7700K is a better choice for your needs from what I've gathered. I have always modified my desks for FPS gaming and recently modified my chair too. My wife isn't happy with that, so I should be looking for a chair as well, but Vega... My work chair is like 600 USD. Wish I had it at home as well. I went nuts with 3600 CL15 Trident Z 8x2 for 180. I've never spent so much on RAM since Corsair XMS DDR1. If you do decide to be cheap on the mobo and RAM, my 2 cents is to spend less on the mobo if you don't have extra needs and spend more on the RAM.


I can't even find my chair online. It was built by CALPIA (Boston chair). I don't know the price of this chair, but it's somewhere between 600-800. Then again, CALPIA might charge more since it's a state agency (built by prisoners).

Anyway, I'd hate to get the 7700K and then be Sandy Bridge'd with Coffee Lake getting 6 cores.


----------



## sugarhell

Quote:


> Originally Posted by *aDyerSituation*
> 
> Didn't people use the same "it will get better with time" argument 5 years ago with the FX series?
> 
> By the time that does happen, single threaded performance gap will be even greater. I made this mistake with the FX series, I won't make it again. I was expecting 5-10% faster than haswell in gaming on top of the more cores, and at least hitting 4.3-4.5.


FX sucked from the start.

Ryzen has the IPC but doesn't perform in some workloads.

Probably the memory is the reason; we will soon see, with bug fixes and new BIOSes, what will happen. But Ryzen is not FX, it is a solid product.


----------



## PontiacGTX

Quote:


> Originally Posted by *C2H5OH*
> 
> Yes and no.
> Nothing was stopping them from testing 720p, 1080p, 1440p, and 4K if they really were concerned that AMD wanted to cheat.
> 
> At higher resolutions you see that Ryzen is absolutely capable of feeding the GPU, and there will be little difference in FPS for gamers using 1440p or 4K. Currently, all those using such monitors have no idea how Ryzen will perform for them.


Well, under the same presets an FX could play games just fine.


----------



## iLeakStuff

Quote:


> Originally Posted by *sugarhell*
> 
> FX sucked from the start.
> 
> Ryzen has the IPC but doesn't perform in some workloads.
> 
> Probably the memory is the reason; we will soon see, with bug fixes and new BIOSes, what will happen. But Ryzen is not FX, it is a solid product.


Latest rumors say it's the BIOS on some motherboards that impacted FPS. And since reviewers used so many different ones, the results are all over the place.


----------



## Artikbot

Quote:


> Originally Posted by *tpi2007*
> 
> What? Did they have any motherboard specific Windows software installed that does such stuff automatically or something?


I don't know actually. This is the post:

http://www.xtremesystems.org/forums/showthread.php?292937-Ryzen-and-memory&p=5254541&viewfull=1#post5254541


----------



## Quantum Reality

Quote:


> Originally Posted by *iLeakStuff*
> 
> Here is a summary of 12 games tested: https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/#abschnitt_vor_und_nachteile_durch_smt
> 
> 7700K is only 12% faster than 1800X stock and 8% faster with 4.1GHz clock.


What's also interesting is that 2667 MHz DDR4 is apparently the sweet spot right now for Ryzen.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *aDyerSituation*
> 
> Didn't people use the same "it will get better with time" argument 5 years ago with the FX series?
> 
> By the time that does happen, single threaded performance gap will be even greater. I made this mistake with the FX series, I won't make it again. I was expecting 5-10% faster than haswell in gaming on top of the more cores, and at least hitting 4.3-4.5.


Wait, after having owned FX, why were you expecting that in the first place? My baseline expectation for Zen has always been around IB IPC, which at the time I argued would already be a big accomplishment for AMD considering where they were with FX. Ryzen exceeds my expectations by a good bit. I guess SB and IB all suck now that Ryzen is faster?


----------



## Siezureboy

I think the biggest thing that has brought about the disappointment with this chip is that it has been in development for so long. I honestly feel development may have dragged on even longer because of the fiasco of AMD's big failure that was Bulldozer and the big company roster shake-up. So who's to say this chip wasn't originally intended to compete with earlier Intel chips like Skylake, and AMD, in panicked desperation, had to come up with a marketing scheme (as seen with a lot of their products lately) to build the hype over time and sell slightly short, while those who don't follow the technology closely buy into the product.


----------



## ducegt

Quote:


> Originally Posted by *iLeakStuff*
> 
> Here is a summary of 12 games tested: https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/#abschnitt_vor_und_nachteile_durch_smt
> 
> 7700K is only 12% faster than 1800X stock and 8% faster with 4.1GHz clock.


And most know this, but almost all 7700Ks OC 18% or more.


----------



## TopicClocker

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> Arguing that Ryzen isn't a compelling offer because it falls behind in gaming is laughable. If you're building a system solely for gaming, you really shouldn't be looking at anything above a Core i5 which is in the $200-$250 price range. Most games WILL NOT benefit from hyper-threading.
> 
> If you're doing anything multi-threaded, anything that benefits from more cores (Rendering, Encoding, Decoding, Streaming, etc) Ryzen is a compelling offer. You're going to get more performance for your dollar than anything Intel currently offers.
> 
> Despite the arguments, I highly doubt the difference in frame-rate you'd see with a Ryzen CPU vs a 7700K would be that noticeable, especially if you're using a 60HZ monitor. Not to mention, if what we've been hearing recently is true, games will start to benefit from more cores in the near future and that could completely change the playing field.


Quote:


> Originally Posted by *M3T4LM4N222*
> 
> In one "cherry picked" instance the Core i7 being 30% faster in a game? Worth $100 more? Hell no.
> 
> We're talking overall value here. If you're solely gaming, you're simply not going to see a large benefit between a Kaby Lake i5 and a Kaby Lake i7. Certainly
> not a benefit worth the price difference.


I just read your posts concerning the i5 vs i7. Wow, man, times have changed.
By saying there's no difference between them when gaming, you're simply misinforming people. There is a difference in CPU-bound titles that scale past 4 threads, and this list of games is constantly growing.

The extra money people spent on an i7 over an i5 many years ago with Sandy and Ivy processors has proven worth the cost for those who care about higher frame rates and keep CPUs for over 4 years; even today the additional threads are beneficial, and 4C/8T CPUs have distanced themselves from their 4-thread counterparts in many games. It's not 2011 anymore.



Assassin's Creed Syndicate
Battlefield 1


Digital Foundry - Intel Kaby Lake: Core i7 7700K review



Even the 4.8GHz i5 7600K cannot beat the lower-clocked 4.5GHz i7 4790K in most of the games here. Digital Foundry know how to test CPUs properly, which is why you're seeing these differences. A lot of the CPU tests you see are GPU-bound because people are not testing the CPUs properly.

Gamers Nexus - AMD Ryzen R7 1800X Review: An i5 in Gaming, i7 in Production



Quote:


> Originally Posted by *TopicClocker*
> 
> At the moment there's a couple of games which scale past 4+ core CPUs such as:
> Assassin's Creed Unity
> Assassin's Creed Syndicate
> Watch Dogs 2
> The Division
> Star Wars Battlefront
> The Witcher 3
> Dragon Age Inquisition
> Battlefield 1
> Fallout 4 (Performance improves notably on 4c8t processors over 4c4t processors, I haven't seen it scale much above that though, it seems to be quite dependent on single-threaded performance).
> Crysis 3
> Total War: Warhammer
> 
> Pretty much all the new games which utilize the Frostbite engine scale past 4 cores, so games such as Dragon Age Inquisition, Need For Speed 2015/6 and most likely the upcoming Mass Effect Andromeda as-well.


----------



## C2H5OH

Quote:


> Originally Posted by *trism*
> 
> It is entirely true, because we could be doing these tests at 4K with low-tier i3s and get the same FPS on all systems. It does not tell *a single thing* about the processor, except that it is enough to cap the GPU. Sure, if that's your goal, then I'm not here judging you. I am more interested in the raw performance of the processor in non-limited scenarios, because that is a much better indicator of how long I can stay with the processor before upgrading. Judging by the reviews I've seen, at least for now the i7 7700K is a much better choice *if you only play games* and seek high FPS.
> 
> I am not saying Ryzen is a bad processor and I am not an Intel fanboy. I just don't think that review really tells anything except that Ryzen can do what a lot of other CPUs can also do.
> Possible. But that same thing has also been said for a decade now... I guess we could be at a turning point right now, who knows. All I know is that Ryzen is very well executed on AMD's part and I really like how it turned out. I'm guessing AMD will get quite a few sales, since the "power user" crowd is quite a minority. My only gripe was with that review, not the processor itself.


If you say that an i3 will be enough to cap your GPU, how long will it take for Ryzen 7 to be outdated?

If you are interested in raw performance, then you shouldn't look at games.

I didn't say you were implying Ryzen was bad; I am just laying out my opinion, since we are commenting on benchmarks.


----------



## budgetgamer120

Quote:


> Originally Posted by *ducegt*
> 
> And most know this, but almost all 7700Ks OC 18% or more.


And the percentage of users that overclock is even lower than 18%.


----------



## IRobot23

*DEPENDS ON HOW YOU CHERRY-PICK BENCHMARKS!!!!!*

*People need to understand RYZEN needs fixes. It runs really well in some games, but in some it just does not! A crippled function???!*


----------



## aDyerSituation

Quote:


> Originally Posted by *budgetgamer120*
> 
> And the percentage of users that overclock is even lower than 18%.


that doesn't change the fact that it is there lmao


----------



## Oubadah

..


----------



## ducegt

Quote:


> Originally Posted by *budgetgamer120*
> 
> And the percentage of users that overclock is even lower than 18%.


On overclock.net?


----------



## PureBlackFire

Quote:


> Originally Posted by *Kriant*
> 
> Soooo Crosshair Hero VI or Titanium for mobo choice?


Crosshair VI easily, even though I'd given up on buying Asus. The Titanium isn't worth its price by any stretch. It's stripped down compared to its Z270 counterpart, and the most important thing outside of Intel's "Z" platform for CPU overclocking, the power delivery, looks stupidly weak for a $300 motherboard on the Titanium. I don't suggest even considering it. Meanwhile the Crosshair VI is clearly, unchallenged, the best launch motherboard for Ryzen: best power delivery and best cooler compatibility (important!!!).


----------



## aDyerSituation

Quote:


> Originally Posted by *Oubadah*
> 
> I'm glad to see that AMD is back in the game, but these CPUs are nothing that I would buy and I don't see them forcing Intel to clean its act up.


I don't see prices dropping either.

Maybe Intel will push a 6-core/12-thread processor in place of the typical 4C/8T sooner rather than later. But that's it.


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> that doesn't change the fact that it is there lmao


AMD does not care about that. The majority of processors sold will run at stock, which is why they implement boost and XFR.

As if mentioning overclocks like that will affect profits. AMD maxed out their CPUs from the get-go. "lmao"


----------



## Majin SSJ Eric

You have a 6700K. There would be no reason for you to buy a Ryzen unless you wanted to do some highly threaded workloads (which you don't since you bought a quad core). I OTOH am coming from a 4930K so a 1700 looks very good to me, especially for the low price of access.


----------



## criminal

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I'll be replacing my 4930K with a 1700 and CH6 over the summer I think. Will probably throw in a couple (hopefully) cheap 1080's to replace my OG Titans and call it a day. Will still have a very beastly setup for around $1200 total, or around the same amount I spent on each of my Titans when they came out!


Sounds like a solid plan to me.


----------



## tpi2007

Quote:


> Originally Posted by *aDyerSituation*
> 
> Didn't people use the same "it will get better with time" argument 5 years ago with the FX series?
> 
> By the time that does happen, single threaded performance gap will be even greater. I made this mistake with the FX series, I won't make it again. I was expecting 5-10% faster than haswell in gaming on top of the more cores, and at least hitting 4.3-4.5.


Back in those days the FX-8150 actually performed worse than its predecessor; it wasn't even funny, and you just had to spend $72 more to get the 2600K and win all-round compared to the FX-8150. Now you only get to choose the 7700K instead, which won't get better in 3-4 years because it doesn't have 8 cores and 16 threads to leverage. Games won't go back to using fewer threads; they are starting to use more, more than 4 certainly and more than 8 soon too. In any case, having more than 4 physical cores to leverage more than 4 threads right away is almost a no-brainer.

Of course IPC and overclockability aren't the same for Ryzen, but, well, they can't do miracles with their R&D budget, especially because they are doing something that not even Intel does, which is debut a new architecture on a new manufacturing process.

I'm not hugely impressed (although the workstation / server side is very interesting), but I'm not disappointed either. I'll wager that people will have reason to be disappointed if the 4C/8T CPUs later in the year only overclock as well as these 8-cores. That's probably why AMD isn't going there right now: to avoid direct comparisons. They are aiming at a well-rounded package for an attractive price.


----------



## fleetfeather

Can someone give me a TL;DR of stock voltages and OC voltages? I get the impression that OC Vcore was close to 1.50 V, but I have no idea about stock Vcore out of the box.

I ask because I'm trying to work out what role (if any) power delivery components are likely to play in OC; if OC'ing only adds an extra ~0.10 V to Vcore, I'd be tempted to choose motherboards for their feature set over their power delivery.


----------



## looniam

.


----------



## OutlawII

Quote:


> Originally Posted by *fleetfeather*
> 
> Can someone give me a TL;DR of stock voltages and OC voltages? I get the impression that OC Vcore was close to 1.50 V, but I have no idea about stock Vcore out of the box.
> 
> I ask because I'm trying to work out what role (if any) power delivery components are likely to play in OC; if OC'ing only adds an extra ~0.10 V to Vcore, I'd be tempted to choose motherboards for their feature set over their power delivery.


Go watch TTL's review of the 1800X; he could only get it up to 4.1, I believe.


----------



## trism

Quote:


> Originally Posted by *C2H5OH*
> 
> If you say that i3 will be enough to cap your GPU, how long will it take for Ryzen 7 to be outdated?
> 
> If you are interested in raw performance, then you shouldn't look at games.


I was merely giving a similar scenario. However, with your logic: why buy Ryzen if an i3 is already fast enough to run games at the same performance for a lower price? This is exactly why GPU-capped scenarios are bad in reviews. The reviews should test the absolute power of the chip, because that indicates the speed much better. Otherwise they could've just called it a "GTX 1080 review".

You do realize that 2-3 years from now, or maybe even further out, we could have scenarios where Ryzen (and certainly not the i3) wouldn't be able to cap the then-current GPU anymore, but Intel with its higher headroom could? There are people who still game at 1080p, and some (like me) are more interested in the FPS counter than in graphical fidelity.


----------



## fleetfeather

Quote:


> Originally Posted by *OutlawII*
> 
> Go watch TTL's review of the 1800X; he could only get it up to 4.1, I believe.


I'm asking about Voltages, not Frequencies


----------



## Quantum Reality

As I said earlier, I would definitely like to see a comprehensive set of AMD-only benchmarks tracking from the Phenom II X4 965BE all the way to Ryzen 1800X to see how much of an improvement Ryzen represents just from that sector alone.


----------



## Melan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I OTOH am coming from a 4930K so a 1700 looks very good to me, especially for the low price of access.


I'm coming from a 3770K, and the 1700 did look good as an upgrade, but Amazon lowered the price of the 7700K to the same 350-ish euros. Now I'm pretty much stuck.


----------



## Quantum Reality

Quote:


> Originally Posted by *Melan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Majin SSJ Eric*
> 
> I OTOH am coming from a 4930K so a 1700 looks very good to me, especially for the low price of access.
> 
> 
> 
> I'm coming from 3770K and 1700 did look good as an upgrade, but amazon lowered the price of 7700K to the same 350-ish euros. Now I'm pretty much stuck.

You have to purchase a new motherboard and RAM in any case. Who's to say Intel won't do another "Surprise! NEW SOCKET AGAIN!" this or next year? By contrast, AMD has a long and proven history of keeping the same socket design across multiple generations of CPUs for years on end.

Thus, your upgrade path is more well-defined with AMD.


----------



## aDyerSituation

Quote:


> Originally Posted by *budgetgamer120*
> 
> AMD does not care about that. Majority of processors sold will run at stock, which is why they implement boost and XFR.
> 
> Mentioning overclocks like that will affect profits. AMD maxed out their CPUs from the get go. "lmao"


You are joking, right? They are maxed out and still performing worse in games, which is the basis of this discussion.


----------



## C2H5OH

Quote:


> Originally Posted by *trism*
> 
> I was merely giving a similar scenario. However, with your logic: why buy Ryzen if an i3 is already fast enough to run games at the same performance for a lower price? This is exactly why GPU-capped scenarios are bad in reviews. The reviews should test the absolute power of the chip, because that indicates the speed much better. Otherwise they could've just called it a "GTX 1080 review".
> 
> You do realize that 2-3 years from now, or maybe even further out, we could have scenarios where Ryzen (and certainly not the i3) wouldn't be able to cap the then-current GPU anymore, but Intel with its higher headroom could? There are people who still game at 1080p, and some (like me) are more interested in the FPS counter than in graphical fidelity.


The i3 part wasn't my logic; I took that from your comment.
Currently there is stagnation in IPC advancements, and with Moore's law not helping, the only low-hanging fruit is more cores.

No CPU or GPU is everlasting, but it won't happen in a year or two. By that time, Intel may not have "more headroom", as at this point everything points to the industry moving to multi-threaded apps, games, etc.


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> You are joking right? They are maxed out and still performing worse in games. Which is the basis of this discussion.


The good thing is they are competing


----------



## aDyerSituation

Quote:


> Originally Posted by *budgetgamer120*
> 
> The good thing is they are competing


I'm not discrediting what they HAVE accomplished. But I'm not sure it will be enough


----------



## trism

Quote:


> Originally Posted by *C2H5OH*
> 
> The i3 part wasn't my logic; I took that from your comment.
> Currently there is stagnation in IPC advancements, and with Moore's law not helping, the only low-hanging fruit is more cores.
> 
> No CPU or GPU is everlasting, but it won't happen in a year or two. By that time, Intel may not have "more headroom", as at this point everything points to the industry moving to multi-threaded apps, games, etc.


You asked me about it and I used that logic as the counterpoint









Yes, I agree. However GPUs seem to advance quite a lot still (since they scale better). Everything has been moving to multi-threaded apps for a decade now. No one knows really, but I am pretty sure i7 7700k bought now is fine for the next 4-5 years easily.

Also, this was not even the original point. I only commented about the video because I don't think it's a good review nor an indicator of the CPU's gaming power.

EDIT:
http://www.hardocp.com/article/2017/03/02/amd_ryzen_1700x_cpu_review/4

Here's a review I like. Again, not dissing the Ryzen at all because I see it as a very good product and possibly an upgrade when the hexa-cores come. Hopefully AMD can optimize a bit though.


----------



## Xuper

Quote:


> Originally Posted by *aDyerSituation*
> 
> I'm not discrediting what they HAVE accomplished. But I'm not sure it will be enough


You are really funny. In order to match the Core i7, AMD needed 80% more IPC, but they were able to reach 52%. That doesn't turn BD into SKL/KBL overnight. AMD is now on par with Haswell, which is OK. What did you expect? Kaby Lake? Oh god.


----------



## tp4tissue

Quote:


> Originally Posted by *trism*
> 
> It is entirely true, because we could be doing these tests at 4K with low-tier i3s and get the same FPS with all systems. It does not tell *a single thing* about the processor, except that it is enough to cap the GPU. Sure, if that's your goal then I'm not here judging you. I am more interested in the raw performance of the processor in non-limited scenarios because that tells much better how long I can stay with the processor before upgrading. Judging based on the reviews I've seen, at least for now i7 7700k is a much better choice *if you only play games* and seek for high FPS.
> 
> I am not saying the Ryzen is a bad processor and I am not an Intel fanboy. I just don't think that review really tells anything except that Ryzen can do what a lot of other CPUs can also do.
> Possible. But that same thing has also been said for a decade now... I guess we could be in a turning point right now, who knows. All I know that the Ryzen is very well executed from AMDs part and I really like how it turned out. I'm guessing AMD will get quite a bit of sales since the "power user" crowd is quite a minority. My only gripe was with that review, not the processor itself.


GAMES for the foreseeable future will continue to be bottlenecked by single-thread performance due to the fact that the output HAS to be composited.

So no matter how fast your other stuff goes, in the end, when it's all put together and presented to the user, that thread is always just one thread. The CPU might in some cases let you render more physics-accurate birds flying on the screen, but that LAST thread will still depend on a single core.


----------



## Jayjr1105

Quote:


> Originally Posted by *aDyerSituation*
> 
> I'm not discrediting what they HAVE accomplished. But I'm not sure it will be enough


you do realize other than gaming the 1800x is kicking the snot out of the 6900K right? At less than half the price mind you.


----------



## aDyerSituation

Quote:


> Originally Posted by *Xuper*
> 
> You are really funny. In order to match the Core i7, AMD needed 80% more IPC, but they were able to reach 52%. That doesn't turn BD into SKL/KBL overnight. AMD is now on par with Haswell, which is OK. What did you expect? Kaby Lake? Oh god.


I am talking about gaming.


----------



## III-Method-III

I'm exhausted. I just read 100 pages. You guys are posting just fractionally slower than I can read in real time.









I have a 4790K (which I don't overclock because I'm lazy and don't really know how to do so properly) and a GTX 980. I'm running 16 GB of RAM on an Asus Ranger VII.

I'm a gamer (primarily Battlefield 1 and 4) and a YouTuber (so I use Premiere Pro a LOT, Photoshop a fair bit and After Effects occasionally).

I play at 1440p 80 FPS (limited, so that OBS can get some CPU and GPU cycles when it's recording) and record at 1440p 60 FPS. If I'm not recording, I take off the limiter and get about 100 FPS in BF1.

So, given I appear to be one of the rare creatures who wants both great gaming performance and great rendering/encoding performance, and given 100 pages of arguments, stick poking and burning torch waving...

Do you guys think a move to a Ryzen 1800X or 1700X is a good one for my specific use case? I'm planning a new build with a LOT more storage (recording takes up a comical amount of space, approx 1 GB per min) and 32 or 64 GB of RAM, as Premiere and After Effects benefit greatly from MOAR. I will also take advantage of the Ti launch and snag a cheaper 1080 to replace my 980.
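To put that storage figure in perspective, here is a quick back-of-the-envelope sketch (the ~1 GB/min rate comes from the post above; the recording schedule is a made-up example):

```python
# Rough storage estimate for game recording at ~1 GB per minute.
GB_PER_MIN = 1.0        # approximate recording rate from the post above
HOURS_PER_WEEK = 10     # hypothetical recording schedule

weekly_gb = GB_PER_MIN * 60 * HOURS_PER_WEEK
yearly_tb = weekly_gb * 52 / 1000

print(f"~{weekly_gb:.0f} GB per week")   # ~600 GB per week
print(f"~{yearly_tb:.1f} TB per year")   # ~31.2 TB per year
```

Even a modest recording schedule chews through multiple terabytes a year, which is why the build plan leans so heavily on storage.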

Pleaz helps a poor Youtuber


----------



## Malinkadink

So Ryzen is scaling very well with faster ram? 3k mhz + 1700 is painfully close to 5ghz 7700k if jokers benchmarks are to be believed. Thinking 3200mhz will give just a tiny bit more and be great to pair with ryzen with 14 cas no less.


----------



## blue1512

Quote:


> Originally Posted by *tp4tissue*
> 
> GAMES for the foreseeable future will continue to be bottlenecked by single-thread performance due to the fact that the output HAS to be composited.
> 
> So no matter how fast your other stuff goes, in the end, when it's all put together and presented to the user, that thread is always just one thread. The CPU might in some cases let you render more physics-accurate birds flying on the screen, but that LAST thread will still depend on a single core.


The final verdict on single-thread performance should be given once the DDR4 issue is figured out. Slow RAM = slow single thread, unfortunately.


----------



## Xuper

Quote:


> Originally Posted by *aDyerSituation*
> 
> I am talking about gaming.


I checked GamersNexus and ComputerBase. They show decent performance, on par with the Core i5-6600K and Core i5-4690. What's the problem? AMD doesn't need to beat Intel; AMD needs to balance perf/price, not beat Intel's CPUs.


----------



## tp4tissue

Quote:


> Originally Posted by *Malinkadink*
> 
> So Ryzen is scaling very well with faster ram? 3k mhz + 1700 is painfully close to 5ghz 7700k if jokers benchmarks are to be believed. Thinking 3200mhz will give just a tiny bit more and be great to pair with ryzen with 14 cas no less.


what page is the joker benchmark on.


----------



## tp4tissue

Quote:


> Originally Posted by *Xuper*
> 
> I checked GamersNexus and ComputerBase. They show decent performance, on par with the Core i5-6600K and Core i5-4690. What's the problem? AMD doesn't need to beat Intel; AMD needs to balance perf/price, not beat Intel's CPUs.


Of course.. price to perf.. $500 is too much.


----------



## fleetfeather

Quote:


> Originally Posted by *III-Method-III*
> 
> I'm exhausted. I just read 100 pages. You guys are posting just fractionally slower than I can read in real time.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have a 4790K (which I don't overclock because I'm lazy and don't really know how to do so properly) and a GTX 980. I'm running 16 GB of RAM on an Asus Ranger VII.
> 
> I'm a gamer (primarily Battlefield 1 and 4) and a YouTuber (so I use Premiere Pro a LOT, Photoshop a fair bit and After Effects occasionally).
> 
> I play at 1440p 80 FPS (limited, so that OBS can get some CPU and GPU cycles when it's recording) and record at 1440p 60 FPS. If I'm not recording, I take off the limiter and get about 100 FPS in BF1.
> 
> So, given I appear to be one of the rare creatures who wants both great gaming performance and great rendering/encoding performance, and given 100 pages of arguments, stick poking and burning torch waving...
> 
> Do you guys think a move to a Ryzen 1800X or 1700X is a good one for my specific use case? I'm planning a new build with a LOT more storage (recording takes up a comical amount of space, approx 1 GB per min) and 32 or 64 GB of RAM, as Premiere and After Effects benefit greatly from MOAR. I will also take advantage of the Ti launch and snag a cheaper 1080 to replace my 980.


A move to Intel's X99 platform would make more sense, particularly since it has 4 channel DDR4 support and doesn't have issues with heightened queue depth (which may or may not get fixed over time on Ryzen)

Not everyone has the budget for such a move though


----------



## Xuper

Quote:


> Originally Posted by *tp4tissue*
> 
> Of course.. price to perf.. $500 is too much.


lol, Just buy 1700 , that's done.


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> I am talking about gaming.


CPUs are used for other things.

In the live stream, AMD stated these are all-round performers for those who want to work and play on the same system.


----------



## Phixit

Quote:


> Originally Posted by *fleetfeather*
> 
> A move to Intel's X99 platform would make more sense, particularly since it has 4 channel DDR4 support and doesn't have issues with heightened queue depth (which may or may not get fixed over time on Ryzen)
> 
> Not everyone has the budget for such a move though


.. or wait for Intel X299 and see how it performs.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Jayjr1105*
> 
> you do realize other than gaming the 1800x is kicking the snot out of the 6900K right? At less than half the price mind you.


Even in gaming, the 1800X is holding its own against the 6900K. I'm still trying to work out why anybody thought Ryzen was going to be faster than a 7700K in gaming in the first place. If you are going to discredit Ryzen for not being faster than a 7700K, then you were always going to be discrediting Ryzen from the start.


----------



## trism

One possibility is to just calm down and wait whether Ryzen gets better support and fixes for the issues. It is still a very good CPU, especially if you game at a bit higher resolutions, stream/use other programs simultaneously and if your budget is limited.


----------



## III-Method-III

Quote:


> Originally Posted by *fleetfeather*
> 
> A move to Intel's X99 platform would make more sense, particularly since it has 4 channel DDR4 support and doesn't have issues with heightened queue depth (which may or may not get fixed over time on Ryzen)
> 
> Not everyone has the budget for such a move though


But if I am to improve rendering and encoding times, I need more cores. X99 is great, I'm sure, but the 6- and 8-core CPUs that plug in are a tad on the expensive side with no noticeable gains over the 8-core Ryzen... just checking I understand your logic.

I appreciate there is an issue with RAM speed on the Ryzen platform, but I expect it will be resolved soon(TM) such that a numpty like me can at least use 3200 MHz RAM?

The stuff about queue depth lost me... you might need to use smaller words for me.


----------



## tp4tissue

Quote:


> Originally Posted by *budgetgamer120*
> 
> CPUs are used for other things.
> 
> In the live stream, AMD stated these are all-round performers for those who want to work and play on the same system.


Except very few people actually do that..

There's a reason there's a need to separate the computer responsible for YOUR LIVELIHOOD, and your Gamer frags..


----------



## Melan

Quote:


> Originally Posted by *Quantum Reality*
> 
> You have to purchase a new motherboard and RAM in any case. Who's to say Intel won't do another "Surprise! NEW SOCKET AGAIN!" this or next year? By contrast, AMD has a long and proven history of keeping the same socket design across multiple generations of CPUs for years on end.
> 
> Thus, your upgrade path is more well-defined with AMD.


By the time I move on from a 7700K or 1700, I'll still have to buy a new mobo anyway. I don't upgrade as often as most people here do, lol.


----------



## $ilent

Not had much time to read reviews but is it safe to say there is no need to upgrade my 2700k to a 1700 if I am gaming?

Thanks


----------



## Xuper

Get over it! AMD is not God. I'm telling you, AMD kept repeating "40% more IPC" and they ended up over 50%, but it's not enough to beat SKL. So why are you worrying about gaming performance when you know that to be on par with the Core i7-7700K, AMD would need at least 80% more IPC? Heh? *Seriously?!*

*I wish AMD had said this before:*

Quote:


> I offer you i5 gaming performance with i7 production performance for just $350. How about it?


----------



## III-Method-III

Quote:


> Originally Posted by *tp4tissue*
> 
> Except very few people actually do that..
> 
> There's a reason there's a need to separate the computer responsible for YOUR LIVELIHOOD, and your Gamer frags..


I do...


----------



## Scotty99

Quote:


> Originally Posted by *$ilent*
> 
> Not had much time to read reviews but is it safe to say there is no need to upgrade my 2700k to a 1700 if I am gaming?
> 
> Thanks


Depends on the games tbh, a 1700 will beat a 2700k in the new titles quite handily.


----------



## tp4tissue

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Even in gaming, the 1800X is holding its own against the 6900K. I'm still trying to work out why anybody thought Ryzen was going to be faster than a 7700K in gaming in the first place. If you are going to discredit Ryzen for not being faster than a 7700K, then you were always going to be discrediting Ryzen from the start.


I was all over that rumor thread telling people Zen is at least 36% behind..

Now it seems that the 4.1 GHz cap puts them around exactly that...


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *fleetfeather*
> 
> A move to Intel's X99 platform would make more sense, particularly since it has 4 channel DDR4 support and doesn't have issues with heightened queue depth (which may or may not get fixed over time on Ryzen)
> 
> Not everyone has the budget for such a move though


Yes, I'm sure he can't wait to queue up to spend $1000 for a 6900K and $400 for a good X99 board because of a few FPS in games and lack of quad channel memory. Those things are definitely worth $1000 over Ryzen.


----------



## tp4tissue

Quote:


> Originally Posted by *III-Method-III*
> 
> I do...


exactly, you and the 5 other guys like you on OCN.. hahahahah..


----------



## budgetgamer120

Quote:


> Originally Posted by *tp4tissue*
> 
> Except very few people actually do that..
> 
> There's a reason there's a need to separate the computer responsible for YOUR LIVELIHOOD, and your Gamer frags..


Can you tell me the reasons? Lol


----------



## Rich84

1700 @3.9 vs 7700k @5 very close fps


----------



## Scotty99

Quote:


> Originally Posted by *Rich84*
> 
> 1700 @3.9 vs 7700k @5 very close fps


As much as i like that dude, you really have to take that with a grain of salt given the results of pretty much every other youtuber on the planet.


----------



## FoamyV

Eh, we'll be getting member benchmarks pretty soon and we'll have a larger base to test different configs. Next few days/weeks will be interesting


----------



## III-Method-III

Quote:


> Originally Posted by *budgetgamer120*
> 
> Can you tell me the reasons? Lol


^^This.

I was (and still am) dearly hoping that a Ryzen 1700X or 1800X will allow me to do exactly this: game on the same rig that I work on (by work I mean render/encode and generally thrash the **** out of my CPU at 100% for hours on end), and do so at about half the cost of an eye-watering X99 platform and 6900K.

My quandary is simple really: how much gaming goodness am I giving up going from a 4790K at 4 GHz to a Ryzen 1800X at 3.8 or whatever I might OC it to on water? (Let's assume for now I'm going to keep the GPU constant (980) for the sake of my sanity.)


----------



## Kuivamaa

Quote:


> Originally Posted by *trism*
> 
> Not really. It is testing the CPU capability. You could achieve the same results with older CPUs too so what value does this test hold, really?


You never, ever, ever test just the CPU capability in games. There is always a GPU there that does the heavy lifting; it is not just a variable. Hence all-new things come into play, like APIs, drivers, PCIe, etc. By reducing settings and resolution, or, in the extreme "CPU gaming tests", by reducing frequencies drastically in search of CPU bottlenecks, you usually create conditions that do not apply to actual gaming as most people do it (you force more cache misses than you normally would, and suddenly memory performance becomes more significant, for example).

A CPU is an instrument. It is used either on its own to run apps or in conjunction with the GPU to run games. What actually matters is how well it can run games in realistic situations. The only interesting question raised by people who lower settings and resolutions is "but in the future I will be using my CPU with a much stronger GPU, so why can't I see the max that my CPU can do anyway?". The problem is that your results are only going to be valid for the actual games you tested (when you only reduce resolution), and sometimes not even those games (if you also reduce settings that one way or another affect the CPU load when set to higher levels).


----------



## siren05

Some competition at last. Though I am gonna wait to see what the LGA 2066 platform and Intel's next-generation processors bring. I think I can wait till then.


----------



## comagnum

Quote:


> Originally Posted by *Scotty99*
> 
> As much as i like that dude, you really have to take that with a grain of salt given the results of pretty much every other youtuber on the planet.


He also tested on a different motherboard with a manual OC and high-performance power settings (no core parking), so he didn't have the issues that are causing the discrepancies other results are showing.


----------



## fleetfeather

Quote:


> Originally Posted by *Phixit*
> 
> .. or wait for Intel X299 and see how it performs.


I assumed that posting on a chipset launch thread meant that the intention to upgrade is imminent, but yes, X299 isn't too far away now either. That's personally what I'm waiting for








Quote:


> Originally Posted by *III-Method-III*
> 
> But if I am to improve rendering and encoding times, I need more cores. X99 is great, I'm sure, but the 6- and 8-core CPUs that plug in are a tad on the expensive side with no noticeable gains over the 8-core Ryzen... just checking I understand your logic.
> 
> I appreciate there is an issue with RAM speed on the Ryzen platform, but I expect it will be resolved soon(TM) such that a numpty like me can at least use 3200 MHz RAM?
> 
> The stuff about queue depth lost me... you might need to use smaller words for me.


You kind of understand me, but also kind of don't. I'll try to explain a bit better:

CPU for CPU, I think there are only marginal differences in the performance between Ryzen 8C/16T and Broadwell-E 8C/16T.

The larger differences are seen in the motherboard chipsets; AM4 for Ryzen, and X99 for Broadwell-E. On AM4, you have 2-channel DDR4 support (just like you have now on Z87/Z97). On X99, you have 4-channel DDR4 support. Productivity applications tend to benefit from these two additional channels.

The last comment I made was in reference to storage performance; best to check out the PCPerspective Ryzen review for a guide to what I'm talking about there. Unlike the gaming-centric approach PCPer has taken, higher queue depth performance is something worth considering when you're building for productivity applications (such as media creation).

Hope that helps a bit more
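To put rough numbers on the channel difference: peak theoretical DDR4 bandwidth is transfer rate × 8 bytes per 64-bit transfer × number of channels. A quick sketch (DDR4-2400 is just an example speed, and real-world sustained bandwidth is lower than these peaks):

```python
def ddr4_peak_bandwidth_gbs(transfers_mt_s: int, channels: int) -> float:
    """Theoretical peak: MT/s * 8 bytes per 64-bit transfer * channel count."""
    return transfers_mt_s * 8 * channels / 1000  # MB/s -> GB/s

# Dual-channel (AM4-style) vs quad-channel (X99-style) at DDR4-2400
print(ddr4_peak_bandwidth_gbs(2400, 2))  # 38.4 GB/s
print(ddr4_peak_bandwidth_gbs(2400, 4))  # 76.8 GB/s
```

So on paper the quad-channel platform has double the memory bandwidth at the same DIMM speed, which is where the productivity advantage comes from.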


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *III-Method-III*
> 
> ^^This.
> 
> I was (and still am) dearly hoping that a Ryzen 1700X or 1800X will allow me to do exactly this: game on the same rig that I work on (by work I mean render/encode and generally thrash the **** out of my CPU at 100% for hours on end), and do so at about half the cost of an eye-watering X99 platform and 6900K.
> 
> My quandary is simple really: how much gaming goodness am I giving up going from a 4790K at 4 GHz to a Ryzen 1800X at 3.8 or whatever I might OC it to on water? (Let's assume for now I'm going to keep the GPU constant (980) for the sake of my sanity.)


I doubt you're giving up much if any gaming performance compared to a 4GHz 4790K tbh. Your 980 would definitely be the bottleneck with either CPU. Even a 1080 is bottlenecking Ryzen it would seem.


----------



## chasefrench

Is anyone else feeling that this still is not enough to upgrade from a 2500k at 4.5ghz?

After the last generation of GPUs, at least in the UK, I found that a 980 Ti from eBay for 320 was better value than the 480, 1070 or 1080.

I really wanted to buy the 1600X, but as I'm purely interested in Excel, VBA, Python and gaming, I'm not sure I need to yet.

You have to think that the 6-core Intel in 12 months may be an interesting response?

The only interesting point here is that, like previous AMD generations, a good AM4 mobo could last 4-5 generations of Zen, which makes the current generation an interesting proposition...

Until there is a good AMD-sync 1440p ultrawide screen @ 120 FPS, the Sandy Bridge still keeps up at 60 FPS, not including the occasional abrupt drop.

The first game I have ever felt I am bottlenecking is Deus Ex: Mankind Divided, which is annoying me, but I'm not sure there is enough value in this upgrade...

Really confused as to what to do


----------



## frizo

Even if Ryzen hasn't _quite_ lived up to all the hype, it still appears to be a very capable platform performance-wise, and it's definitely a tremendous option when it comes to pricing, especially if you're looking for or need more cores for your tasks.

I'm just glad AMD's giving Intel's chips a run for their money. Competition is a good thing and Intel's been running largely unopposed for quite some time. Hats off to AMD and here's to hoping this is the beginning of a new round of CPU wars.


----------



## CriticalOne

Quote:


> Originally Posted by *chasefrench*
> 
> Is anyone else feeling that this still is not enough to upgrade from a 2500k at 4.5ghz?


Make the upgrade. The R7 1700 gets higher performance in games and with tasks outside of games.


----------



## NFL

Performance-wise, this makes me wonder where Ryzen 5 will fall


----------



## trism

Quote:


> Originally Posted by *Kuivamaa*
> 
> You never, ever, ever test just the CPU capability in games. There is always a GPU there that does the heavy lifting; it is not just a variable. Hence all-new things come into play, like APIs, drivers, PCIe, etc. By reducing settings and resolution, or, in the extreme "CPU gaming tests", by reducing frequencies drastically in search of CPU bottlenecks, you usually create conditions that do not apply to actual gaming as most people do it (you force more cache misses than you normally would, and suddenly memory performance becomes more significant, for example).
> 
> A CPU is an instrument. It is used either on its own to run apps or in conjunction with the GPU to run games. What actually matters is how well it can run games in realistic situations. The only interesting question raised by people who lower settings and resolutions is "but in the future I will be using my CPU with a much stronger GPU, so why can't I see the max that my CPU can do anyway?". The problem is that your results are only going to be valid for the actual games you tested (when you only reduce resolution), and sometimes not even those games (if you also reduce settings that one way or another affect the CPU load when set to higher levels).


You absolutely do, like in this scenario if you are interested in the raw performance of the CPU. As you are from Finland, you should check io-tech's review which tells a much better story about the performance than this test.


----------



## tpi2007

Quote:


> Originally Posted by *NFL*
> 
> Performance-wise, this makes me wonder where Ryzen 5 will fall


The hexa-core 1600X will probably be fighting the 6800K at around $260, and the quad-core will probably be around 4770K level while costing just a little more than the i3-7350K. It won't be touching the 7700K any day, but between a 2C/4T CPU for $179 with a bit higher IPC and a 4C/8T CPU with a bit-higher-than-Haswell IPC for $200, it'll probably reset expectations for that price point.


----------



## blue1512

Quote:


> Originally Posted by *III-Method-III*
> 
> I was (and still am) dearly hoping that a Ryzen 1700X or 1800X will allow me to do exactly this: game on the same rig that I work on (by work I mean render/encode and generally thrash the **** out of my CPU at 100% for hours on end), and do so at about half the cost of an eye-watering X99 platform and 6900K.
> 
> My quandary is simple really: how much gaming goodness am I giving up going from a *4790K at 4 GHz to a Ryzen 1800X at 3.8* or whatever I might OC it to on water? (Let's assume for now I'm going to keep the GPU constant (980) for the sake of my sanity.)


At this moment I would say they are close in most *games*, with Ryzen pulling ahead in newer titles that thread better. The difference in current benchmarks is that the 4790K needs ~100% CPU load to match an 1800X sitting at ~40%. IF AMD can figure out all the bottlenecks (very likely in the memory controller), the 1800X will pull ahead, while there is no headroom left for the 4790K.


----------



## Majin SSJ Eric

In all honesty, CPUs these days matter very little for gaming, as all recent Intel chips (and now Ryzen as well) will perform the same with even the strongest of video cards at the actual resolutions and settings we play games at. I really fail to see in what scenario testing CPUs against each other at 480p and low details is relevant to anything beyond going onto a forum and bragging that your CPU is better than another (even though they perform the same at the settings you actually play games at).


----------



## CriticalOne

Joker Productions and Steve Burke from GamersNexus speculate that the variance in game benchmarks and the overall lower-than-expected performance in games is down to immature motherboard EFIs and old/buggy firmware.


----------



## chasefrench

Quote:


> Originally Posted by *CriticalOne*
> 
> Make the upgrade. The R7 1700 gets higher performance in games and with tasks outside of games.


I really want to, but in the last 10 years I made one upgrade, from an E8500 at 4.3 to a 2500K at 4.5, which was sensational.

Maybe we will never see that again. I completely agree with the multi-thread argument and want to support AMD, and the similarity in console architecture can only drive that further. But as I get older, with buying a house and kids, you can't help but want that unanimous agreement amongst OCN that it is the best chip, like there was for the 2500K / 920 / E8500.

It seems like there is a growing specialisation in CPUs now, where productivity and gaming are no longer correlated.


----------



## rv8000

Quote:


> Originally Posted by *fleetfeather*
> 
> I'm asking about Voltages, not Frequencies


Not sure if this was addressed yet, so many posts...

From the few reviews I looked at, I think most were defaulting around 1.325 to 1.35 V, pushing upwards of 1.475 V for the single-core XFR boost to 4.1 GHz. There isn't a lot of clarity in any of the reviews regarding clock stability and voltage measurement (how it varies with clocks/time) from what I've seen. Need to wait for additions to the reviews and/or user experience here.


----------



## luisxd

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My 3770K is 5 years old. I want to upgrade but AMD does not want me to upgrade. Yes 1700 @ 3GHz is not going to beat 4.6GHz i5/i7 lol. The problem is something like 1800X is not even 4.0GHz CPU.


The problem is you only use your PC for gaming or casual use, 'cause you still think that after all those reviews, and that's why you don't see it as a good investment.


----------



## dieanotherday

If Intel decreases their 8-core pricing, then we can call Ryzen a success.

I've been stuck on 4 cores since the Q6600, it's ridiculous.


----------



## CriticalOne

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> In all honesty, CPUs these days matter very little for gaming, as all recent Intel chips (and now Ryzen as well) will perform the same with even the strongest of video cards at the actual resolutions and settings we play games at. I really fail to see in what scenario testing CPUs against each other at 480p and low details is relevant to anything beyond going onto a forum and bragging that your CPU is better than another (even though they perform the same at the settings you actually play games at).


Since these tests compare only CPU power for the most part, they can be used to extrapolate.

When you first build a rig, your system is generally GPU bound, but as you upgrade, your GPU gets much faster and in turn needs a faster CPU to feed it draw calls. This is what happened to me: I have a Haswell i3 and started off with a 750 Ti without bottlenecking problems, but upgraded to a 380X and my system got _extremely_ CPU bound. In theory, the same ratio or gap between 400 FPS for CPU X and 300 FPS for CPU Y will be maintained in other CPU-bound scenarios: CPU X will get 40 FPS but CPU Y will only get 30, if that makes sense.
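That extrapolation can be sketched as simple ratio math: every frame needs both CPU and GPU work, so the slower side caps the frame rate. All the FPS numbers below are made-up illustrations, not benchmark results:

```python
def effective_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Each frame needs both CPU and GPU work, so the slower side sets the rate."""
    return min(cpu_cap, gpu_cap)

# Low-resolution, CPU-bound testing exposes each CPU's ceiling:
cpu_x, cpu_y = 400.0, 300.0  # hypothetical CPU-limited results

# With a modest GPU, both CPUs look identical...
print(effective_fps(cpu_x, 60.0), effective_fps(cpu_y, 60.0))  # 60.0 60.0

# ...but a big GPU upgrade reveals the same 4:3 gap the CPU test predicted.
print(effective_fps(cpu_x, 1000.0), effective_fps(cpu_y, 1000.0))  # 400.0 300.0
```

It's obviously a simplification (real games don't bottleneck this cleanly), but it captures why low-resolution deltas can reappear after a GPU upgrade.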


----------



## SSJVegeta

Good to know my 5820k @ 4.4GHz is still relevant. Only bought it for £220 new last year









I'll probably upgrade in a few years.


----------



## trism

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> In all honesty, CPUs these days matter very little for gaming, as all recent Intel chips (and now Ryzen as well) will perform the same with even the strongest of video cards at the actual resolutions and settings we play games at. I really fail to see in what scenario testing CPUs against each other at 480p and low details is relevant to anything beyond going onto a forum and bragging that your CPU is better than another (even though they perform the same at the settings you actually play games at).


Let me guess, you play at 4K 30-60 FPS? I play 90% of the time at 1024x768 with 200+ FPS, and 10% of the time at 1080p with 100+ FPS. I *never, never, ever* play anything below 100 FPS because it is not enjoyable for me. And there are others who share this view. So stop talking only from your own perspective and let others prioritize different things. There is no reason to GPU-limit the tests, because that only tests the GPU, not the CPU.


----------



## blue1512

Quote:


> Originally Posted by *luisxd*
> 
> The problem is you only use your PC for gaming or casual use, 'cause you still think that after all those reviews, and that's why you don't see it as a good investment.


Modern gamers now should be able to play and stream at the same time. Ryzen is the perfect fit for them at this moment


----------



## frizo

Quote:


> Originally Posted by *dieanotherday*
> 
> I've been stuck on 4 cores since the Q6600, it's ridiculous.


At least the Q6600 is a friggin' tank. There are worse chips to be "stuck" on.


----------



## C2H5OH

An excellent read from @The Stilt (voltages also explained)

Ryzen: Strictly technical


----------



## Shatun-Bear

Quote:


> Originally Posted by *Kuivamaa*
> 
> You never, ever, ever test just the CPU capability in games. There is always a GPU there that does the heavy lifting; it is not just a variable. Hence all-new things come into play, like APIs, drivers, PCIe, etc. By reducing settings and resolution, or, in the extreme "CPU gaming tests", by reducing frequencies drastically in search of CPU bottlenecks, you usually create conditions that do not apply to actual gaming as most people do it (you force more cache misses than you normally would, and suddenly memory performance becomes more significant, for example).
> 
> A CPU is an instrument. It is used either on its own to run apps or in conjunction with the GPU to run games. What actually matters is how well it can run games in realistic situations. The only interesting question raised by people who lower settings and resolutions is "but in the future I will be using my CPU with a much stronger GPU, so why can't I see the max that my CPU can do anyway?". The problem is that your results are only going to be valid for the actual games you tested (when you only reduce resolution), and sometimes not even those games (if you also reduce settings that one way or another affect the CPU load when set to higher levels).


Absolutely spot on.

People saying the Ryzen gets 'stomped' or 'thrashed' by the 7700K apparently didn't realise this is only when benching with a Titan X and with the resolution reduced to 1080p!!

The reality is there is a 2-4fps gap between a 7700K and 1800X with a Titan X when gaming at 4k. *2-4 fps.*

http://www.kitguru.net/components/cpu/luke-hill/amd-ryzen-7-1800x-cpu-review/11/

Even more pertinently, when gaming with a card below a Titan X, which is what 99.5% of gamers have, the differences in fps in games _on average_ are even smaller. Completely irrelevant and unnoticeable, like I have been saying.


----------



## aDyerSituation

Quote:


> Originally Posted by *blue1512*
> 
> Modern gamers should be able to play and stream at the same time, and Ryzen is a perfect fit for them right now.


most people stream games that are easy to run

League, overwatch, CSGO, h1z1, etc

7700k does that while having higher fps in games


----------



## blue1512

Quote:


> Originally Posted by *aDyerSituation*
> 
> most people stream games that are easy to run
> 
> League, overwatch, CSGO, h1z1, etc
> 
> 7700k does that while having higher fps in games


I really doubt the 7700K can maintain its lead when streaming and gaming at the same time. In most of the current gaming benchmarks it is sitting at ~100% CPU load to pull ahead of the 1800X, which is only at ~40%.


----------



## trism

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Absolutely spot on.
> 
> People saying the Ryzen gets 'stomped' or 'thrashed' by the 7700K apparently didn't realise this is only when benching with a Titan X and with the resolution reduced to 1080p!!


This is how you test the gaming performance of a CPU.
Quote:


> Originally Posted by *Shatun-Bear*
> 
> The reality is there is a 2-4fps gap between a 7700K and 1800X with a Titan X when gaming at 4k. *2-4 fps.*


Yes, when the Titan X limits the test. It's like saying (a little exaggeration here) that a Ferrari and a Honda Civic are the same speed-wise, since the speed limit is 60 mph and they both can do that just fine.

It might not matter now, but it very well could in the future when you upgrade your GPU. Ryzen is still quite a bit slower as a gaming CPU, but hopefully better support and future optimizations turn things around.


----------



## aDyerSituation

I guess it's a crime to expect a new processor to have cores faster than a 4 year old processor


----------



## Shatun-Bear

Quote:


> Originally Posted by *trism*
> 
> Let me guess, you play with 4K 30-60fps? *I play 90% of the time at 1024x768 with +200 fps. 10% of the time I play 1080p at +100fps.* I *never, never, ever* play anything below 100 fps because it is not enjoyable for me. And there are others who share this view. So stop talking about your own perspective and let others prioritize different things. There is no reason to GPU limit the tests because that only tests the GPU, not the CPU.


In the whole market of gamers, people that use computers like you are like 0.01%.

Like people have been saying, which you don't seem to understand - nearly everyone reading the reviews doesn't play at silly and extreme settings like you do. And you probably don't even own a Titan X, which makes your post even more confusing, as any card below that at 1024x768 resolution will show no noticeable difference between Ryzen and a 7700K.


----------



## chrisjames61

Quote:


> Originally Posted by *lombardsoup*
> 
> Exactly. Overclocking any 8-core processor (Intel or AMD) isn't an easy task: both brands tend to crap out at around 4.2-4.3 GHz.


Are you crazy? Any Piledriver CPU is good for at least 4.6 to 4.8 GHz if you have a good board and good cooling, 8-core CPUs included.


----------



## CriticalOne

Quote:


> Originally Posted by *trism*
> 
> Yes, when the Titan X limits the test. It's like saying (a little exaggeration here) that a Ferrari and a Honda Civic are the same speed-wise, since the speed limit is 60 mph and they both can do that just fine.


Exactly.

Give a sports car and a modern grand touring endurance race car both some bald, worn out summer tires that have been left out in the sun for 5 years and they will perform the same.


----------



## C2H5OH

Quote:


> Originally Posted by *aDyerSituation*
> 
> I guess it's a crime to expect a new processor to have cores faster than a 4 year old processor


It's a crime for Intel to rake in massive profits while selling new-"gen" processors that bring little gain but constantly increasing prices.

It's also a crime that some think that's normal...


----------



## TheReciever

The argument fallacies and detachment from reality from Intel supporters are pretty laughable; thanks for supporting my popcorn addiction.

Overall I'm really liking what I'm seeing here. Looking forward to building a 1700 system in the future for recording/storage/streaming, unless I wait for the refresh. Don't have a need for a desktop at the moment.


----------



## Descadent

It really is Intel's fault for not innovating practically at all since Sandy Bridge, which is going on 7 years old.

Although I wish Ryzen were something that could have taken Intel out behind the school and beaten the crap out of it.


----------



## trism

Quote:


> Originally Posted by *Shatun-Bear*
> 
> In the whole market of gamers, people that use computers like you are like 0.01%.
> 
> Like people have been saying, which you don't seem to understand - nearly everyone reading the reviews doesn't play at silly and extreme settings like you do. And you probably don't even own a Titan X, which makes your post even more confusing, as any card below that at 1024x768 resolution will show no noticeable difference between Ryzen and a 7700K.


It doesn't matter what settings I use. Testing the CPU properly tells you its absolute SPEED, which can be extrapolated to any other resolution and settings. When you deliberately LIMIT it with the GPU, you are not getting any useful data; you are only testing the capabilities of the GPU.

1024x768 is far more likely to be CPU limited than GPU limited, so no, a Titan X is not required to show differences here. As seen in the reviews, Sandy Bridge with a 1080 gives roughly the same performance as Ryzen at those resolutions.
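The bottleneck argument above can be sketched with a toy model (all numbers hypothetical, not taken from any review): the fps you observe is roughly the minimum of what the CPU and the GPU can each deliver, so a GPU-bound test hides CPU differences while a CPU-bound one exposes them.

```python
def observed_fps(cpu_max_fps: float, gpu_max_fps: float) -> float:
    """Observed framerate is capped by the slower of the two components."""
    return min(cpu_max_fps, gpu_max_fps)

# Hypothetical CPU ceilings, as you might measure them at 1024x768
cpu_ceilings = {"fast CPU": 190, "slow CPU": 140}

# GPU-bound test (e.g. 4K): the GPU caps both chips at the same fps
for name, ceiling in cpu_ceilings.items():
    print(name, observed_fps(ceiling, 60))   # both read 60 fps

# CPU-bound test (low res, fast GPU): the real gap becomes visible
for name, ceiling in cpu_ceilings.items():
    print(name, observed_fps(ceiling, 300))  # 190 vs 140
```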


----------



## AmericanLoco

Quote:


> Originally Posted by *ducegt*
> 
> I don't see that happening. Not until it's more proven, seeing as Opteron has almost been forgotten. Did you work in IT, dealing with thousands of customers with no loyalty to any platform? I doubt it. I'm in a rare position to get a feel for the diversity that exists.


Enterprise can move quickly. Back in late 2003, when AMD released the first K8 Opterons, they went from 0 to 25% of enterprise market share in basically two years.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Scotty99*
> 
> As much as i like that dude, you really have to take that with a grain of salt given the results of pretty much every other youtuber on the planet.


I am sorry but he actually shows the fps in the game.


----------



## Shatun-Bear

Quote:


> Originally Posted by *CriticalOne*
> 
> Exactly.
> 
> Give a sports car and a modern grand touring endurance race car both some bald, worn out summer tires that have been left out in the sun for 5 years and they will perform the same.


No, what you are saying is illogical.

Exposing gains for the 7700K by selecting settings that no one uses is silly.


----------



## aDyerSituation

Quote:


> Originally Posted by *C2H5OH*
> 
> It's a crime for Intel to have a massive profits while selling, new "gen" processors, that have little gains but constantly increasing prices.
> 
> 
> 
> 
> 
> 
> 
> 
> It's also a crime, that some think it's normal...


What does that have to do with anything?

Even with Intel going stagnant, AMD still can't match their single-thread performance.


----------



## Echoa

Quote:


> Originally Posted by *chrisjames61*
> 
> Are you crazy? Any Piledriver cpu is good for at least 4.6 to 4.8 GHz if you have a good board and good cooling. 8 core cpu's included.


Piledriver has a longer pipeline, weaker and smaller cores, etc.

It's not even comparable; you can't compare the overclocking potential of the itty-bitty FX cores to the large cores of Ryzen and Intel. Not to mention that even at 5 GHz, Piledriver couldn't hope to match Ryzen.


----------



## CriticalOne

Quote:


> Originally Posted by *Shatun-Bear*
> 
> No what you are saying is illogical.
> 
> Exposing gains for the 7700K by selecting settings that no-one uses is silly.


Why not?

The settings are silly, but nobody was claiming it's a _realistic_ benchmark. I can take the results from these CPU tests and extrapolate to see which CPU gives me the best performance when it is the bottleneck.


----------



## LancerVI

Quote:


> Originally Posted by *CriticalOne*
> 
> Intel's HEDT platform wasn't in contention due to price. I'm not trying to pay $1,000 for an eight core or $200 for a motherboard. I do CAD and video editing alongside gaming so I need strong multithreaded performance.
> 
> The way that Ryzen was presented is that I could get a 1700 for $320, the same price as an i7 or whatever, and get more or less the same performance in games while having a little under twice the multithreaded performance. AMD gave multiple presentations showing a Ryzen processor and an Intel 6900K getting equal performance in games, which made me excited. Imagine my shock when I checked gaming benchmarks and saw a 7700K beating an 1800X.
> 
> Along with that, there was a sizable faction on this forum declaring quad cores dead or not good enough, and saying how foolish it would be to choose a 6700K/7700K over a Ryzen processor now that games can use 8 threads well, and so on. It wasn't AMD itself, but there were a lot of people hyping Ryzen up as some sort of Intel killer.


So you game at 1080p????

I don't remember AMD presenting their titles at 1080p. I could be wrong, of course, but I thought they showed games running mostly at 4K, or at 1080p while streaming on Twitch.

Sounds to me like if you game at any resolution above 1080p, you'll get great performance from Ryzen. Not world-beating, but still pretty great. Plus you can do all those other things you mentioned at great speed too.


----------



## chrisjames61

Quote:


> Originally Posted by *naz2*
> 
> he's specifically talking about gaming. that's what the overwhelming majority of people use these CPUs for.


You think the majority of people who use computers are gamers? What alternate universe do you come from?
Quote:


> Originally Posted by *naz2*
> 
> Are we looking at the same reviews? Ryzen is losing 10% fps with SMT enabled, and there are already murmurings of a magical Windows update to fix it. This is verbatim what happened when Bulldozer dropped and people pointed to poor optimization in the Windows scheduler as the culprit.
> 
> Not saying Ryzen sucks or anything, just that the rhetoric is the same. Hopefully they will actually fix it with a magical update, but I doubt it. Intel has been refining Hyper-Threading for nearly a decade to get it to its current level.


I don't remember ever seeing you post here before. Now you crawl out of the woodwork seeing an opportunity to fashion some blatant trolls. Glad you are enjoying the moment.


----------



## CriticalOne

Quote:


> Originally Posted by *LancerVI*
> 
> So you game at 1080p????


Is that a problem?


----------



## aDyerSituation

Quote:


> Originally Posted by *CriticalOne*
> 
> Is that a problem?


Like, seriously? Most of my friends game at 1080p 144 Hz, as do I.

And unfortunately AMD gets blown out of the water there


----------



## LancerVI

Quote:


> Originally Posted by *CriticalOne*
> 
> Is that a problem?


Not at all. I game at 2560x1080, so I get it.

You said AMD presented their gaming performance one way and delivered another. All I know is that the presentations I saw were NOT at 1080p.


----------



## trism

Quote:


> Originally Posted by *Shatun-Bear*
> 
> No what you are saying is illogical.
> 
> Exposing gains for the 7700K by selecting settings that no-one uses is silly.


You don't understand. Think of it as a power reserve. In the current situation (no real multicore advancement), the next GPU could theoretically be fast enough that Ryzen becomes the cap, while the 7700K still has power left to keep the GPU as the cap, and thus gets better performance. Ryzen is pretty much on par with Ivy Bridge when it comes to games. If 4K is currently your goal, you don't need to buy a Ryzen.


----------



## OutlawII

Quote:


> Originally Posted by *fleetfeather*
> 
> I'm asking about Voltages, not Frequencies


It showed voltages on there


----------



## blue1512

Quote:


> Originally Posted by *aDyerSituation*
> 
> like seriously? Most of my friends game at 1080 144hz, as do I.
> 
> And unfortunately AMD gets blown out of the water there


Seriously, you are still comparing apples to oranges here: a ~3.4 GHz 8-core 1800X vs a ~4.5 GHz 4-core 7700K makes little sense. If you want gaming only, wait for the lower-core-count Ryzens, which can reach higher clocks and better single-thread performance.


----------



## Xuper

Quote:


> Originally Posted by *blue1512*
> 
> Seriously, you are still trying to compare apple to orange here, ~3.4 GHz 8c 1800x vs ~ 4.5 GHz 4c 7700k makes little sense. If you want gaming only, wait for lower cores Ryzens, which can reach higher clock and better single thread performance.


He won't. Dunno what happened to him. He can't understand that AMD won't bring a CPU with Kaby Lake's IPC, yet he keeps mentioning it......


----------



## CriticalOne

Ryzen was shown to the public almost exclusively at 4K. I think the only exception was the x264 showcase.

It shows Ryzen performing well in a realistic use-case scenario, but it doesn't give much information about the processor's gaming prowess itself, and I honestly should have known better. Right now not even the fastest GPUs in SLI will be bottlenecked by Ryzen, but GPU tech tends to move very fast. AMD's RX 680 or whatever could be four times as fast as a Titan X; what happens then?


----------



## Shatun-Bear

Quote:


> Originally Posted by *trism*
> 
> It doesn't matter what settings I use. Testing the CPU properly tells the absolute SPEED which can be extrapolated on any other resolution and setting. When you are on purpose LIMITING it with the GPU, you are not getting any good data except testing the capabilities of the GPU.
> 
> 1024x768 is far more likely to be CPU limited than GPU limited, so no, Titan X is not required to show differences here. As seen from the reviews, Sandy Bridge with 1080 ~ the same performance as Ryzen at those resolutions.


Quote:


> Originally Posted by *CriticalOne*
> 
> Why not?
> 
> The settings are silly but nobody was claiming its a _realistic_ benchmark. I can take the result from these CPU test and extrapolate to see which CPU gives me the best performance when it is the bottleneck.


Ok let me try to understand your thinking.

You're saying that because the 7700k is faster in gaming settings no-one uses (Titan X @ 1080p), it's worth it.

But because the 1800X is faster in all the multi-threaded benches and CPU tasks that people use on a _daily basis_, it's not worth it?

Can you see how illogical this is? No?


----------



## $ilent

Quote:


> Originally Posted by *Scotty99*
> 
> Depends on the games tbh, a 1700 will beat a 2700k in the new titles quite handily.


I generally play older Steam games, plus the likes of Battlefield 4 and BF1.


----------



## AmericanLoco

Quote:


> Originally Posted by *dieanotherday*
> 
> It shows Ryzen performing well in a realistic use-case scenario, but it doesn't give much information about the processor's gaming prowess itself, and I honestly should have known better. Right now not even the fastest GPUs in SLI will be bottlenecked by Ryzen, but GPU tech tends to move very fast. AMD's RX 680 or whatever could be four times as fast as a Titan X; what happens then?


You go ahead and just upgrade your CPU, because AMD has pledged multiple years of support for the AM4 platform.


----------



## LancerVI

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Ok let me try to understand your thinking.
> 
> You're saying that because the 7700k is faster in gaming settings no-one uses (Titan X @ 1080p), it's worth it.
> 
> But because the 1800X is faster in all the multi-threaded benches and CPU tasks that people use on a _daily basis_, it's not worth it?
> 
> Can you see how illogical this is? No?


Pretty much this exactly.


----------



## Gunderman456

Seems that the consensus is "AMD Ryzen Disappointing Game Benchmarks Run Better with SMT DISABLED". The 7700K still presents better value for gamers.


----------



## trism

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Ok let me try to understand your thinking.
> 
> You're saying that because the 7700k is faster in gaming settings no-one uses (Titan X @ 1080p), it's worth it.
> 
> But because the 1800X is faster in all the multi-threaded benches and CPU tasks that people use on a _daily basis_, it's not worth it?
> 
> Can you see how illogical this is? No?


My point was that GPU-capped reviews alone are useless. Again: they do not test anything other than the GPU itself. When the benchmark is not limited by the GPU, you see the "true speed" of the CPU.

I am not saying anything like that; if you go back and read my posts, I said Ryzen is a good processor. But strictly for gaming, I would pick Intel.


----------



## JackCY

This thread will be endless.

Well done AMD, great CPUs with awesome performance for the money. There are still some tweaks that need to be done but that's to be expected with a completely new CPU launch after many years, mobo makers nor OS are prepared yet fully.
Memory latency seems high :/
Memory support should improve with updated UEFIs though. Hopefully the SMT and fractional multipliers get better OS support too.
Plus other tweaks on the software side overall.

Some reviews have wide variations in certain results even for Intel vs Intel CPU they have to be taken with a grain of salt and check the variances present.

Will keep my 4690k @ 4.7-7-6-6 GHz, don't have the money to burn on an 8 core system nor a big need for it.


----------



## Scotty99

I just realized Microcenter is selling the 6700K for 259 bucks; yeah, at that price I don't know if I can pull the trigger on a 1700.


----------



## blue1512

Quote:


> Originally Posted by *Gunderman456*
> 
> Seems that the consensus is "AMD Ryzen Disappointing Game Benchmarks Run Better with SMT DISABLED".


Sounds logical to me. SMT improves multithreaded throughput, but it tends to slow down single-thread performance, which games favor.

As a side note, judging from the benches, Ryzen's SMT actually seems to scale better than Intel's HT: Ryzen has lower IPC but better multithreaded results than the 6900K.
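For anyone who wants to try the SMT-off comparison without touching the BIOS, here is a minimal, Linux-only sketch (the function name is mine, not from any review): it picks one logical CPU per physical core by reading sysfs topology, so a game or benchmark pinned to that set effectively ignores the SMT siblings. On multi-socket machines you would also need to key on `physical_package_id`.

```python
import os

def one_cpu_per_core():
    """Return one logical CPU per physical core, read from Linux sysfs.

    Pinning a process to this set approximates running with SMT
    disabled, without a reboot or BIOS change.
    """
    chosen, seen_cores = set(), set()
    for cpu in sorted(os.sched_getaffinity(0)):
        path = f"/sys/devices/system/cpu/cpu{cpu}/topology/core_id"
        try:
            with open(path) as f:
                core_id = int(f.read())
        except OSError:
            chosen.add(cpu)  # no topology info available; keep the CPU
            continue
        if core_id not in seen_cores:
            seen_cores.add(core_id)
            chosen.add(cpu)
    return chosen

# Example: pin the current process (and children it spawns, e.g. a game
# launched from here) to the physical cores only:
# os.sched_setaffinity(0, one_cpu_per_core())
```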


----------



## yesitsmario

Hoping the Ryzen 5 1500 can go past 4 GHz; it would be a good buy for $229, right? Would the 1600X have an advantage over the 1500?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Scotty99*
> 
> I just realized microcenter is selling 6700k for 259 bucks, ya at that price i dont know if i can pull the trigger on a 1700.


I would not do it; the 1600X could be a better buy if it's $250.


----------



## CriticalOne

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Ok let me try to understand your thinking.
> 
> You're saying that because the 7700k is faster in gaming settings no-one uses (Titan X @ 1080p), it's worth it.
> 
> But because the 1800X is faster in all the multi-threaded benches and CPU tasks that people use on a _daily basis_, it's not worth it?
> 
> Can you see how illogical this is? No?


You're missing the forest for the trees.

The point of benchmarking CPUs at unrealistic settings isn't the settings themselves; it's to measure each chip's peak performance and compare them relative to one another, to get a good idea of how powerful they are in general. Magnitude doesn't matter; we could be comparing framerates in the single digits and the information would be just as valuable.



I don't know anyone who uses Solidworks with a 400 Hz monitor, but the information is still very valuable. That situation is simply a test that removes CPU influence in order to see which GPU has the most power.



As you can see, the performance of the cards in the first chart is a pretty strong ballpark for how performance looks in a much more demanding workload. Outside of the cheaper cards with small amounts of VRAM, each card's performance relative to the others correlates strongly with the first chart. This time, instead of the 970 getting 150 FPS and the M4000 getting 350, the 970 is struggling along at a very uncomfortable 24 FPS while the M4000 still manages a tolerable 64 FPS.
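That extrapolation can be written down directly (hypothetical numbers again, not benchmark data): take the CPU ceiling measured at CPU-bound settings and predict what happens once a future, faster GPU stops being the bottleneck.

```python
def predicted_fps(cpu_ceiling: float, gpu_fps_today: float,
                  gpu_speedup: float) -> float:
    """Predict fps after a GPU upgrade: the CPU ceiling measured at
    CPU-bound settings becomes the binding limit once the GPU is fast
    enough."""
    return min(cpu_ceiling, gpu_fps_today * gpu_speedup)

# Hypothetical ceilings from a low-res, CPU-bound benchmark
ceilings = {"CPU A": 180, "CPU B": 120}
gpu_today = 90  # fps the current GPU manages at the target settings

for speedup in (1, 2, 4):
    print(speedup, {name: predicted_fps(c, gpu_today, speedup)
                    for name, c in ceilings.items()})
# At 1x both CPUs sit at the 90 fps GPU cap; at 2x and beyond the
# 180 vs 120 CPU gap emerges.
```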


----------



## S.M.

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Even in gaming the 1800X is holding its own against the 6900K. I'm still trying to work out why anybody thought Ryzen was going to be faster than a 7700K in gaming in the first place? If you are going to discredit Ryzen for not being faster than a 7700K, then you were always going to be discrediting Ryzen from the start.


Self-inflicted wounds are an Overclock.net staple.

We got Skylake performance (and better) when we were told the performance would rival Ivy.

Can't wait to receive my 1700!


----------



## SoloCamo

Didn't see it posted, but I'm wondering if more binning is going on than realized.

L1Techs got their 1700 to 3.4 GHz on 8 cores and only 3.8 GHz on 2 cores...






Quote:


> Originally Posted by *comagnum*
> 
> He also tested on a different motherboard, with a manual OC and the High Performance power plan (no core parking), so he didn't have the issues that are causing the discrepancies in the results.


Also, he was able to run the memory at 3 GHz with the XMP profile.


----------



## motoray

Overall I'm pretty stoked with the results. It will get faster in games with updates. Half disappointed with the max overclocks, half excited that I only need to buy a 1700 and can save a ton of money. Am I the only one with this thought process?


----------



## aDyerSituation

Quote:


> Originally Posted by *S.M.*
> 
> Self inflicted wounds are an Overclock.net staple.
> 
> We got Skylake performance (and better) when we were told the performance would rival Ivy.
> 
> Can't wait to receive my 1700!


I doubt anyone is arguing that Ryzen isn't great value for multithreaded workloads,

but in gaming it is lagging well behind Ivy Bridge and Haswell in most games, which is not something that should just be overlooked. Those processors have been around for years.


----------



## inedenimadam

Not bad, AMD! Multithreaded performance is top notch, and single-threaded performance is nothing to scoff at. The pricing on these chips makes me think they may drag some market share back. I was already willing to sacrifice a bit of single-threaded gaming performance to get more than 4 cores with a 6800K, so these chips would have been in my crosshairs if I were purchasing today.

Movement in the CPU market... exciting!

Maybe AMD can lose the "New York Nationals" stigma.


----------



## blue1512

Quote:


> Originally Posted by *aDyerSituation*
> 
> I doubt anyone is arguing that Ryzen isn't great value for multi threaded workloads
> 
> but in gaming it is severely lacking behind Ivy bridge and haswell in most games, which is not something that should just be looked over. those processors have been around for years


Don't know what you're talking about. They are not behind Ivy/Haswell clock-for-clock.


----------



## aDyerSituation

Quote:


> Originally Posted by *blue1512*
> 
> Don't know what are you talking about. They are not behind ivy/haswell clock2clock.


Are you looking at the game benchmarks, or going off of Cinebench scores?


----------



## Oubadah

..


----------



## Scotty99

Quote:


> Originally Posted by *SoloCamo*
> 
> Didn't see it posted but I'm wondering if more binning is going on then realized.
> 
> L1Techs got their 1700 to 3.4ghz on 8 cores and only 3.8ghz on 2 cores...
> 
> 
> 
> 
> 
> Also, he was able to run the memory at 3ghz w/ XMP profile


No, read the comments; they didn't touch voltages at all.


----------



## blue1512

Quote:


> Originally Posted by *aDyerSituation*
> 
> are you looking at the game benchmarks? Or going off of cinebench scores?


Clock-for-clock, mate.

And in this thread we already have a little cherry-picking of our own. Now can you say that Ryzen is slower than Ivy/Haswell again?


----------



## rv8000

Quote:


> Originally Posted by *Oubadah*
> 
> You're absolutely right. This is an important point that so many people can't get their heads around.
> 
> "Unrealistically" low res CPU benchmarks help compensate for poor benching choices by reviewers.
> 
> If the reviews are conducted at a low res, the results might not be directly relevant to "real world" situations, but at least you still get a meaningful result that you can extrapolate to _other_, more CPU-dependent, real-world scenarios that the reviewer might have neglected to test. If they're just posting a bunch of flatline graphs from totally GPU-bound ("real world") scenarios, then you're getting nothing meaningful at all.


There's one huge problem with this theory: not a single lower-res 720p/1080p result correlates with the trend of the synthetic and real-world benchmarks for Ryzen. Excluding game benchmarks, the R7 line at worst ties and generally beats the 7700K by a fair amount, coming closer to the 6800K/6900K.


----------



## aDyerSituation

Quote:


> Originally Posted by *blue1512*
> 
> Clock2clock, mate.
> 
> And in this thread, we already have a little cherry pick. Now can you say that Ryzen is slower than ivy/haswell again


LOL, I am so done. You cherry-picked so hard right there it is unbelievable. You do understand single-core and multi-core performance are two completely different things, and they aren't handled the same way by each company.


----------



## DaaQ

Quote:


> Originally Posted by *fleetfeather*
> 
> Can someone give me a TL;DR of stock voltages and OC voltages? I get the impression that OC Vcore was close to 1.50, but have no idea on stock Vcore's out of the box
> 
> I ask because I'm looking to conclude for myself what role (if any) power delivery components are likely to have on OC - if OC'ing only adds an extra ~.10 to Vcore, I'd be tempted to consider motherboards for their feature set over their power delivery.


https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/

The Stilt's review is quite informative.


----------



## Marios145

It's funny, because calling Ryzen trash means that every Intel arch except Skylake/Kaby Lake is trash too.


----------



## VegetarianEater

The thing I don't get about the gaming performance is that Ryzen does so well in various single-threaded benchmarks compared to the Broadwell-E processors, so in theory it should have similar gaming performance to those chips...

Is there a Windows optimization problem with AMD CPUs? One of the reviews I read mentioned that despite getting lower FPS than most of the Intel competition in games, Ryzen had a lot of CPU headroom, whereas the Intel CPUs were getting maxed out. I still imagine the Ryzen chips will be great for streaming and multitasking compared to a quad-core Intel CPU, but I hope they solve this mystery soon. It just doesn't make sense that the same single-thread performance as the 6900K/6800K translates to much lower FPS in games.


----------



## Oubadah

..


----------



## blue1512

Quote:


> Originally Posted by *aDyerSituation*
> 
> LOL I am so done. You cherry picked so hard right there it is unbelievable. You do understand single core and multi core performance are 2 completely different things and not handled the sameway between each company.


I didn't make that chart, LOL. I just wanted to show that your point about "slower than Ivy/Haswell" is completely subjective and situational.

Also, as I already pointed out in this thread, the 1800X is comparable with a ~100% loaded 7700K in games while only utilizing ~40-60%. There is plenty of headroom for streaming, or for more CPU-intensive games in the future.


----------



## aDyerSituation

Quote:


> Originally Posted by *Oubadah*
> 
> Yes, and that's the sad truth.
> 
> The fact that it's still even possible for two people to argue about whether Ryzen is faster than Ivy Bridge is a sad indictment on both AMD _and_ Intel. Ivy Bridge. Ivy Bridge from _2012_. No-one was arguing that Wolfdale was still competitive with Haswell in 2013.


This is exactly my point.

We are praising AMD for matching processors from a couple of years ago and stacking cores on top.

For value it's great compared to what's out there, but from a pure performance standpoint there's nothing to be impressed about.


----------



## Phixit

Are most of the gaming benchmarks running the 7700K at stock speed?


----------



## aDyerSituation

Quote:


> Originally Posted by *Phixit*
> 
> Are most of the gaming benchmarks running the 7700k at stock speed ?


yes, but don't bother going that direction or you will get hit with the "most people don't OC" argument

(on overclock.net)


----------



## Quantum Reality

Quote:


> Originally Posted by *VegetarianEater*
> 
> The thing I don't get about the gaming performance is that it does so well in various single threaded benchmarks compared to the broadwell-e processors, so theoretically it should have similar gaming performance to those processors...
> 
> Is there a Windows optimization problem with AMD CPUS? One of the reviews I read mentioned that despite getting lower FPS than most of the Intel competition in games, there was a lot of CPU headroom, whereas the Intel CPUs were getting maxed out. I still imagine that the Ryzen chips will be great for streaming and multitasking compared to a quad-core intel CPU, but I hope they solve this mystery soon. Just doesn't make sense that the same single thread performance as the 6900k/6800k is translating to much lower FPS in games.


One thing I remember is that AMD has sometimes had to provide third-party driver support for what should be natively supported features in Windows. Back in the Socket 939 dual-core days, you needed a special driver because Windows XP's internal multiprocessor scheduler wasn't well enough optimized for two "CPUs" that were effectively sharing resources, rather than being truly independent as in the dual-Pentium scenario.


----------



## blue1512

Quote:


> Originally Posted by *aDyerSituation*
> 
> This is exactly my point.
> 
> We are praising AMD for matching processors from a couple years ago and stacking cores on top of it.
> 
> For value it's great compared to what's out, but as far as a performance standpoint goes there's nothing to be impressed about.


Pardon me, but who is praising AMD for that?

For most people praising Ryzen, the main point is that the 1800X can match Intel's best 8-core in CPU-intensive tasks while also being comparable with Intel's best gaming CPU.


----------



## orlfman

Quote:


> Originally Posted by *Scotty99*
> 
> No read the comments, they didn't touch voltages at all.


Quote:


> Originally Posted by *VegetarianEater*
> 
> The thing I don't get about the gaming performance is that it does so well in various single threaded benchmarks compared to the broadwell-e processors, so theoretically it should have similar gaming performance to those processors...
> 
> Is there a Windows optimization problem with AMD CPUS? One of the reviews I read mentioned that despite getting lower FPS than most of the Intel competition in games, there was a lot of CPU headroom, whereas the Intel CPUs were getting maxed out. I still imagine that the Ryzen chips will be great for streaming and multitasking compared to a quad-core intel CPU, but I hope they solve this mystery soon. Just doesn't make sense that the same single thread performance as the 6900k/6800k is translating to much lower FPS in games.


This is why this thread has over 100 pages. Ryzen gaming results are so contradictory it's baffling. I've seen so many reviews and benchmarks today that it's hard to wrap my head around it.

Then the possible causes... Performance issues may stem from Asus motherboards: TechReport noted that many reviewers are running Asus Crosshair VI boards and are seeing performance gains by switching to Gigabyte or ASRock boards. Level1 on YouTube reached the same conclusion: a simple UEFI update changed results so much that they threw out all their benchmarks and held off publishing, advising viewers to take all of this week's benchmarks with a huge grain of salt.

Then you have AMD claiming SMT scheduling issues. For the longest time Intel has been the only one shipping SMT-capable x86 processors, so most SMT optimizations target Intel, not AMD. On top of that there are Windows power management issues, with the Balanced plan reducing performance heavily compared to Intel.

Then there are memory issues as well. Level1 talked about timing and memory latency problems, which can very well affect gaming performance: switching from a two-DIMM kit to a four-DIMM kit increased latency, and higher-frequency kits showed everything from reduced performance due to incorrect timings to outright stability issues.

Synthetics and workstation-class benchmarks show Ryzen in a very positive light, so something iffy is going on, and honestly it could very well be all of the possible causes mentioned so far.

IMO, Ryzen is suffering from growing pains, and it will be a month or two before we get a good look at its true gaming performance. As Level1 put it, Ryzen seems to have been a server-class processor first and a gaming/mainstream processor second, so it's going to take some time before all the issues are ironed out.


----------



## Marios145

Quote:


> Originally Posted by *VegetarianEater*
> 
> The thing I don't get about the gaming performance is that it does so well in various single threaded benchmarks compared to the broadwell-e processors, so theoretically it should have similar gaming performance to those processors...
> 
> Is there a Windows optimization problem with AMD CPUS? One of the reviews I read mentioned that despite getting lower FPS than most of the Intel competition in games, there was a lot of CPU headroom, whereas the Intel CPUs were getting maxed out. I still imagine that the Ryzen chips will be great for streaming and multitasking compared to a quad-core intel CPU, but I hope they solve this mystery soon. Just doesn't make sense that the same single thread performance as the 6900k/6800k is translating to much lower FPS in games.


The whole problem lies in the fact that most software treats Ryzen SMT like Intel HT.
1. Instead of assigning threads to physical cores first, it assigns threads to both physical and virtual cores.

2. Then you have the fact that Windows or apps often move threads between cores. On Intel that's fine as long as they move around physical cores first, but on Ryzen, moving a thread from core 1 to core 7 sends it to a *different CCX*, and that creates a huge performance penalty.
I think Ryzen should be treated as 2x 4 cores to maximise performance.
Games will need patches; Windows will need updates.
Maybe setting affinity will help? Dunno, Ryzen owners should try and tell us.

How do I know this? Read this:
AMD's Ryzen CPU Series will Need Modern Linux Kernel for Proper Support
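For owners who want to test the affinity suggestion above, here is a rough sketch (my own illustration, not anything from AMD). The `ccx_mask` helper is made up for this example, and it assumes Windows enumerates SMT siblings in adjacent pairs, so logical CPUs 0-7 cover the four cores of the first CCX on an 8C/16T chip:

```python
# Rough sketch (assumption: logical CPUs 0..7 are the four cores of CCX0 plus
# their SMT siblings, enumerated in adjacent pairs). ccx_mask() is a made-up
# helper for illustration, not an AMD or Windows API.
def ccx_mask(cores_per_ccx=4, smt=2):
    """Bitmask with one bit set per logical CPU in the first CCX."""
    return (1 << (cores_per_ccx * smt)) - 1

mask = ccx_mask()  # 0xFF on an 8C/16T part: logical CPUs 0-7

# On Windows the mask could then be applied to the current process with:
#   import ctypes
#   kernel32 = ctypes.windll.kernel32
#   kernel32.SetProcessAffinityMask(kernel32.GetCurrentProcess(), mask)
```

Whether pinning to one CCX actually helps frame rates is exactly the open question; the sibling-enumeration assumption should be checked per board and BIOS before trusting the mask.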


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> most people stream games that are easy to run
> 
> League, overwatch, CSGO, h1z1, etc
> 
> 7700k does that while having higher fps in games


Please show us. AMD showed a 6700K struggling with streaming and gaming while a 6900K and Ryzen were smooth as butter.

So I'm curious to see how much better a 7700K is when it's less than 2% better than a 6700K.


----------



## ZealotKi11er

Quote:


> Originally Posted by *budgetgamer120*
> 
> Please show us. AMD showed a 6700K struggling with streaming and gaming while a 6900K and Ryzen were smooth as butter.
> 
> So I'm curious to see how much better a 7700K is when it's less than 2% better than a 6700K.


Key word "AMD"


----------



## aDyerSituation

Quote:


> Originally Posted by *budgetgamer120*
> 
> Please show us. AMD showed a 6700K struggling with streaming and gaming while a 6900K and Ryzen were smooth as butter.
> 
> So I'm curious to see how much better a 7700K is when it's less than 2% better than a 6700K.


As someone who streams these games occasionally and is always recording my gameplay I can assure you that whatever settings they were using were total overkill for streaming


----------



## blue1512

Quote:


> Originally Posted by *Marios145*
> 
> The whole problem lies in the fact that most software treats Ryzen SMT like Intel HT.
> Instead of assigning threads to physical cores first, it assigns threads to both physical and virtual cores.
> Then you have the fact that Windows or apps often move threads between cores. On Intel that's fine as long as they move around physical cores first, but on Ryzen, moving a thread from core 1 to core 7 sends it to a different CCX, and that creates a huge performance penalty.
> I think Ryzen should be treated as 2x 4 cores to maximise performance.
> Games will need patches; Windows will need updates.
> Maybe setting affinity will help? Dunno, Ryzen owners should try and tell us.
> 
> How do I know this? Read this:
> AMD's Ryzen CPU Series will Need Modern Linux Kernel for Proper Support


It is always a problem for the minority. AMD has sent dev kits covering over 1000 games, but they need time to be patched.


----------



## sugarhell

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Key word "AMD"


Try streaming with an i7 while you play. That's why a lot of streamers use a two-system setup: one for gaming and one just for streaming.


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> As someone who streams these games occasionally and is always recording my gameplay I can assure you that whatever settings they were using were total overkill for streaming


Sure man, sure. No one wants low-quality streams these days. High-quality streaming and gaming is affordable.


----------



## aDyerSituation

Quote:


> Originally Posted by *sugarhell*
> 
> Try to stream with a i7 while you play. That's why a lot of streamer use a setup with 2 systems one for gaming and one for streaming only.


Of course, if they do it for a living. Most people don't. And the performance difference is not worth it. I notice 0 difference in my games when I am streaming AND recording at the same time. Why? Because these games use a couple threads at best and the rest of the processor actually has something to do now.

However, streaming a game like Battlefield (which hardly anyone does) might see gains with more cores.


----------



## Marios145

Quote:


> Originally Posted by *aDyerSituation*
> 
> As someone who streams these games occasionally and is always recording my gameplay I can assure you that whatever settings they were using were total overkill for streaming


Overkill for a 4-core. It's a light workload for an 8-core.


----------



## VegetarianEater

Quote:


> Originally Posted by *Marios145*
> 
> The whole problem lies in the fact that most software treats Ryzen SMT like Intel HT.
> Instead of assigning threads to physical cores first, it assigns threads to both physical and virtual cores.
> Then you have the fact that Windows or apps often move threads between cores. On Intel that's fine as long as they move around physical cores first, but on Ryzen, moving a thread from core 1 to core 7 sends it to a different CCX, and that creates a huge performance penalty.
> I think Ryzen should be treated as 2x 4 cores to maximise performance.
> Games will need patches; Windows will need updates.
> Maybe setting affinity will help? Dunno, Ryzen owners should try and tell us.
> 
> How do I know this? Read this:
> AMD's Ryzen CPU Series will Need Modern Linux Kernel for Proper Support


Some reviews disabled SMT and gained only a bit of gaming performance (Tom's Hardware, I believe), so it's more than just that. Eight real cores should provide fantastic gaming performance compared to a 7700K in games that use 8 threads, but that just isn't the case so far.


----------



## aDyerSituation

Quote:


> Originally Posted by *Marios145*
> 
> Overkill for a 4-core. It's a light workload for an 8-core.


yes, except for the fact that the stream difference is hardly noticeable

and now your viewers can't watch the stream without it stuttering. If you don't know how streaming works, don't use that as an "advantage"


----------



## VegetarianEater

Quote:


> Originally Posted by *aDyerSituation*
> 
> yes, except for the fact that stream difference is hardly noticeable
> 
> and now your viewers can't watch the stream without it stuttering. If you don't know how streaming works don't use that as an "advantage"


I have a 3770K at 4.1GHz (stock voltage) and I can't stream consistently to Twitch using OBS (720p streaming; the stream is constantly laggy/bad). Granted, I game at 1440p max settings... not sure what you're gaming at.


----------



## Captain318

Wow, poor Ryzen is being crushed by its own hype. Is it as ideal for gaming as a 7700K right now in single-threaded and more lightly threaded games? No, probably not, but does Intel have an 8-core chip that fits that bill? Not that I'm aware of, and if they do, I know it's not for the cost of Ryzen.

I ordered a 1700X because I want something to play with. Maybe it will pay off down the road, maybe it won't, but I've been playing with Intel quad cores forever. I certainly won't retire my 6700K anytime soon, especially because I like emulators like Dolphin, but I'm happy that AMD finally has a chip good enough that I want a new rig in the house with AMD at its heart. It's been more than a decade since I even thought of doing an AMD rig.


----------



## blue1512

Quote:


> Originally Posted by *VegetarianEater*
> 
> I have a 3770k at 4.1ghz(stock voltage) and I can't stream consistently to twitch using OBS (720p streaming, stream is constantly laggy/bad). Granted I game at 1440p max settings... not sure what you're gaming at.


From his sig, his rig has 6 cores but he keeps talking about 4 cores...
PS: Oh, he does have a 4-core rig, my bad.


----------



## The-Beast

Quote:


> Originally Posted by *VegetarianEater*
> 
> The thing I don't get about the gaming performance is that it does so well in various single threaded benchmarks compared to the broadwell-e processors, so theoretically it should have similar gaming performance to those processors...
> 
> Is there a Windows optimization problem with AMD CPUS?


It looks to be a combination of things. Launch bugs are a big one, but there are also some shady benchmark suites being used by certain sites, where the entire library was built with Intel compilers.


----------



## Phixit

...but people won't buy Ryzen for gaming.

What are they going to stream? Their Blu-ray encoding sessions?


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> yes, except for the fact that stream difference is hardly noticeable
> 
> and now your viewers can't watch the stream without it stuttering. If you don't know how streaming works don't use that as an "advantage"


You serious? How is the stream difference not visible? So if I go on YouTube and stream a high-quality video, it won't look better?

Why are you worrying about viewer stutter? Lol, then shifting the goal post.

Fact is, the i7 has a hard time streaming and gaming. That was one of the selling points for Ryzen that the CEO spoke about and demonstrated in a live test.

She didn't say it beats the 7700K in gaming, which many in here are throwing a fit over.


----------



## aDyerSituation

If faster RAM and bug fixes shore up Ryzen's performance and close the gap in gaming, I'd consider it. But so far it doesn't seem like a good choice for a 90%-gaming build.


----------



## artemis2307

AMD - Murdering the competition


----------



## naved777

1700 vs 7700K
GAMEPLAY VIDEO FOOTAGE
It doesn't look that bad at all!
1700 @ 3.9GHz and 7700K @ 5GHz


----------



## blue1512

Quote:


> Originally Posted by *aDyerSituation*
> 
> If faster RAM and bug fixes shore up Ryzen's performance and close the gap in gaming, I'd consider it. But so far it doesn't seem like a good choice for a 90%-gaming build.


Fair enough. The 8-core Ryzens weren't advertised as gaming specialists, though.


----------



## sugarhell

Quote:


> Originally Posted by *aDyerSituation*
> 
> If faster RAM and bug fixes shore up Ryzen's performance and close the gap in gaming, I'd consider it. But so far it doesn't seem like a good choice for a 90%-gaming build.


I think it is 99.9% just for you


----------



## The-Beast

Quote:


> Originally Posted by *aDyerSituation*
> 
> If faster RAM and bug fixes shore up Ryzen's performance and close the gap in gaming, I'd consider it. But so far it doesn't seem like a good choice for a 90%-gaming build.


A good choice depends on the timeframe of your upgrades. If you cycle your PC every 2 years? Probably not. If you extend the life cycle beyond that it looks extraordinarily competitive for the current software lineup and future expectations.


----------



## CriticalOne

I would honestly just lock the thread until the issues surrounding the UEFI get solved. Apparently some boards are shipping with outdated or broken UEFIs that hurt Ryzen performance by a lot. It's the reason results are so inconsistent. The press have been talking about this.


----------



## savagebunny

The amount of angry people in this thread over a new AMD lineup, the best damn thing since the K8 glory days when DFI ruled the world... it's just insane. If you're going to cry over 5 FPS on an "average" that's minimal, just leave the thread and go buy Intel if that rustles your jimmies so much, instead of trying to tick off everyone in here who is excited.

If you expected AMD to blow Intel out of the water, or to just start taking names, well... you put way too much of your love and tears into AMD. Consider the amount of work AMD pulled off with far less money spent on R&D.

Oh, and have any of you actually done the $/performance math on paper for YOURSELF?

If you haven't looked at the AMD Linux benchmarks yet either, you're missing the big picture. Windows seems to be lacking something in general with Ryzen and its SMT. It's literally launch day; give it some damn time. Wait for a BIOS/microcode update and it should be a lot better, imho.


----------



## ZealotKi11er

Quote:


> Originally Posted by *aDyerSituation*
> 
> If faster RAM and bug fixes shore up Ryzen's performance and close the gap in gaming, I'd consider it. But so far it doesn't seem like a good choice for a 90%-gaming build.


Yes, it will, but at the same time it will help Intel too.


----------



## ZealotKi11er

Quote:


> Originally Posted by *blue1512*
> 
> 4.2 GHz, around the same as the 6900K


Stop spreading lies.


----------



## sLowEnd

Basically, gaming is brand agnostic now as far as CPUs are concerned.


----------



## dmasteR

Quote:


> Originally Posted by *blue1512*
> 
> 4.2 GHz, around the same as the 6900K


That's not average... 4-4.1GHz is average.



Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes it will but at the same time help Intel too.
> 3.8-3.9GHz


Where are you seeing people only hitting 3.8-3.9GHz?

----------



## ZealotKi11er

Quote:


> Originally Posted by *dmasteR*
> 
> That's not average... 3.9-4.1GHz is average.


And all those are using unsafe volts. Max for Zen is 1.365V.


----------



## C2H5OH

Quote:


> Originally Posted by *aDyerSituation*
> 
> If faster RAM and bug fixes shore up Ryzen's performance and close the gap in gaming, I'd consider it. But so far it doesn't seem like a good choice for a 90%-gaming build.


We'll see with the upcoming Windows update... and the further development of the platform. Let's hope it comes in a month or two.

I'm on the opposite side, as I game 10% of the time or even less, but it should be noted that it depends on the games. In low-threaded games, sure, Ryzen won't be as suitable. For multi-threaded games it's a personal choice, and at high resolutions Ryzen will do as much as a 7700K.

I understand the disappointment for some, but it's not at all doom and gloom.

Plus, I'm waiting for game developers to patch for Ryzen, as was implied by AMD... just out of curiosity.


----------



## VegetarianEater

Quote:


> Originally Posted by *ZealotKi11er*
> 
> And all those are using unsafe volts. Max for Zen is 1.365V.


How, if it's 1.34V stock?


----------



## ZealotKi11er

Quote:


> Originally Posted by *C2H5OH*
> 
> We'll see with the upcoming Windows update...and he further development of the platform. Let's hope in a month or two.
> 
> I'm on the opposite side, as I game 10% or even less, but it should be noted that it depends on the games. Low threaded games, sure - Ryzen won't be suitable. For multi-threaded it's a personal choice, but in high resolutions Ryzen will do as much as 7700k.
> 
> I understand the disappointment to some, but It's not at all doom and gloom.
> 
> Plus, I'm waiting for game developers to patch for Ryzen, as was implied by AMD...just out of curiosity.


The Core i7 750 is also fine for gaming at high resolutions. That's an 8-year-old architecture.


----------



## dmasteR

Quote:


> Originally Posted by *ZealotKi11er*
> 
> And all those are using unsafe volts. Max for Zen is 1.365V.


Do you have a Source for this?

I remember seeing a article that mentioned max voltage for Zen, but I can't seem to find it again.


----------



## C2H5OH

Quote:


> Originally Posted by *ZealotKi11er*
> 
> And all those are using unsafe volts. Max for Zen is 1.365V.


The Stilt was suggesting that "voltages higher than 1.4500V are generally not advisable for sustained use" but "Meanwhile the actual (effective) voltage for the highest single core boosted PState (XFR, e.g. 4.1GHz) can be as high as 1.47500V."


----------



## Captain318

Don't know if that was posted

AMD responds to 1080p gaming tests on Ryzen

https://www.pcper.com/news/Processors/AMD-responds-1080p-gaming-tests-Ryzen?utm_source=dlvr.it&utm_medium=facebook


----------



## Mad Pistol

As AMD themselves (including their CEO) have already said, most games currently on the market were built with Intel compilers. This isn't a bad thing in itself, but Ryzen's architecture is different, and because this is AMD's first generation of SMT, we should all expect gaming performance to improve over time.

I have to believe AMD's chips perform well in professional applications because those tasks use a "plan of attack," so to speak, before they begin the actual processing. You can see the same in how many language compilers work: they figure out what needs to be done, calculate the most efficient way to accomplish it, THEN actually process the task. Because of this, the task scheduler has a better idea of what will be processed and can optimize the logical path to make it work.

Unfortunately, games are far less linear in the way they are processed: they constantly require inputs and must react instantly to changing conditions. That can be a problem for SMT logic, because if you take 2 threads and push them both onto the same core (core + SMT sibling), you're suddenly pushing double the work into the same pipeline and can hit a pretty severe bottleneck and performance degradation.

More than likely the issue is with the scheduler and how it handles SMT-based workloads. This can probably be fixed with a driver or BIOS update that tells the system how to "behave" when bombarded with many threads from the same program. We will likely see tweaks, a la the FX 8-cores, that prioritize the 8 "real" cores in gaming workloads, instead of the OS throwing a thread wherever resources are available even if that means putting 2 threads on one core while another core sits completely open.

TL;DR: this is the launch of a brand-new product from AMD, one far more complex than anything they have attempted before. Give it time, and performance will improve. Believe it or not, the FX 8-core CPUs improved over time as well.
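The "real cores first" idea in the post above can be sketched in a few lines (my own illustration, not AMD's actual fix). The `physical_only` helper is hypothetical and assumes SMT siblings are enumerated in adjacent pairs (logical 0/1 share core 0, and so on):

```python
# Hypothetical sketch: pick one logical CPU per physical core, assuming SMT
# siblings are enumerated in adjacent pairs (logical 0/1 share core 0, etc.).
def physical_only(logical_cpus=16, smt=2):
    """List of logical CPU indices touching each physical core exactly once."""
    return [cpu for cpu in range(logical_cpus) if cpu % smt == 0]

cpus = physical_only()  # [0, 2, 4, 6, 8, 10, 12, 14] on an 8C/16T chip

# Cross-platform, this list could be applied with the third-party psutil package:
#   import psutil
#   psutil.Process().cpu_affinity(cpus)
```

This only sidesteps SMT contention; a proper scheduler fix would also need to keep threads from migrating between CCXes.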


----------



## C2H5OH

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Core i7 750 is also fine for gaming at high resolutions. That's an 8-year-old architecture.


At 1440p and 4K?
I'm not sure, but that's not what I was commenting on.

For people considering upgrades, Ryzen is not a bad choice, considering it has some limitations. Which Intel has as well, of course.


----------



## Forceman

Quote:


> Originally Posted by *dmasteR*
> 
> Do you have a Source for this?
> 
> I remember seeing a article that mentioned max voltage for Zen, but I can't seem to find it again.


At least one of the reviews I looked at, can't remember which, said that AMD recommended 1.425V. Could have been 1.45V though.


----------



## NameUnknown

Well, I'm impressed enough to say that I need to scrounge the cash together for real this time to replace my 1090T. Every time in the past I have delayed because AMD or Intel was nearing this or that. Now the only questions are 1700 or 1700X & new GPU or more 280Xs...


----------



## Marios145

Official guide by Asus:
At ambient temps you can safely go to 1.4V, and the max safe is 1.45V.
http://www.overclock.net/t/1624560/asus-rog-ryzen-overclocking-thread/0_50


----------



## Hueristic

https://www.extremetech.com/gaming/245204-amds-ryzen-7-1800x-reviewed-zen-amazing-workstation-chip-1080p-gaming-achilles-heel


----------



## artemis2307

Lemme get this straight:
a $1400 6900K + X99 combo has the same performance as the $650 1800X + X370 combo?
WHAT?


----------



## helis4life

Quote:


> Originally Posted by *artemis2307*
> 
> Lemme get this straight:
> a $1400 6900K + X99 combo has the same performance as the $650 1800X + X370 combo?
> WHAT?


Yeah, more or less, that appears to be the case.


----------



## Clocknut

Pick one of the following setup.

AMD R7 1800X + 1080Ti *VS* Intel 6900K + 1060 3GB

or

AMD R7 1800X + 1080 SLI *VS* Intel 6900K + 1080

tell me which one is the better gaming machine.

I can't believe the entire thread is complaining about the 1800X's performance without actually looking at the price. Have you guys got no common sense?


----------



## DaaQ

Quote:


> Originally Posted by *ZealotKi11er*
> 
> And all those are using unsafe volts. Max for Zen is 1.365V.


Have a read and learn something
https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/


----------



## littledonny

Quote:


> Originally Posted by *blue1512*
> 
> 4.2 GHz, around the same as the 6900K


Is this for all models? Is it heat limited?


----------



## Quantum Reality

Quote:


> Originally Posted by *Hueristic*
> 
> https://www.extremetech.com/gaming/245204-amds-ryzen-7-1800x-reviewed-zen-amazing-workstation-chip-1080p-gaming-achilles-heel


According to that page about half the games they tried have very comparable framerates. Calling it an "Achilles' Heel" is a bit much, especially as it's already noticeable that DX12-based games for some reason are the laggards.

I suspect DX12 does some CPU offloading and this is where the problem comes in, if it's been heavily optimized for an Intel-based architecture.


----------



## Mad Pistol

Quote:


> Originally Posted by *littledonny*
> 
> Is this for all models? Is it heat limited?


Don't think it's the heat. More than likely, it's a limit of the silicon or a bottleneck in the uarch.

I guess we will find out when AMD releases their 6 and 4 core parts.


----------



## WolfssFang

I guess it's time to give my little brother the 2600k and upgrade to the 1700x


----------



## dmasteR

Quote:


> Originally Posted by *Clocknut*
> 
> Pick one of the following setup.
> 
> AMD R7 1800X + 1080Ti *VS* Intel 6900K + 1060 3GB
> 
> or
> 
> AMD R7 1800X + 1080 SLI *VS* Intel 6900K + 1080
> 
> tell me which one is the better gaming machine.
> 
> I can't believe the entire thread is complaining about the 1800X's performance without actually looking at the price. Have you guys got no common sense?


Why are you using the 6900K when you're talking about a gaming machine over a 7700K/6700K?


----------



## Gunderman456

Quote:


> Originally Posted by *Clocknut*
> 
> Pick one of the following setup.
> 
> AMD R7 1800X + 1080Ti *VS* Intel 6900K + 1060 3GB
> 
> or
> 
> AMD R7 1800X + 1080 SLI *VS* Intel 6900K + 1080
> 
> tell me which one is the better gaming machine.
> 
> I can't believe the entire thread is complaining about the 1800X's performance without actually looking at the price. Have you guys got no common sense?


No, some are saying that a 7700K is cheaper and is better at gaming right now (this may change with optimization for Ryzen down the road) than the 1800X. Your example is disingenuous.


----------



## littledonny

Quote:


> Originally Posted by *Mad Pistol*
> 
> Don't think it's the heat. More than likely, it's a limit of the silicon or a bottleneck in the uarch.
> 
> I guess we will find out when AMD releases their 6 and 4 core parts.


A 6-core 4.4GHz part would be a huge hit with gamers.


----------



## SoloCamo

Quote:


> Originally Posted by *dmasteR*
> 
> Why are you using the 6900K when you're talking about a gaming machine over a 7700K/6700K?


Gaming machines can be used for more than games, though; this is what some people are missing. Ryzen is a lot closer to the 7700K in gaming, especially at realistic resolutions and settings, than the 7700K is to Ryzen when it comes to productivity.

If I had to buy a new CPU right now I'd definitely go the Ryzen route. Giving up 4 more real cores / 8 more threads at close enough IPC is absolutely not worth the minor gaming performance difference.


----------



## artemis2307

ANYBODY trashing AMD for losing: REMEMBER, FINEWINE TECHNOLOGY.
It's their first comeback in 10 years, give them a break.
The platform just launched; give it time to grow. Yields are going to get better, clocks are going to go higher.
Just be patient.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *CriticalOne*
> 
> Since these tests compare only CPU power for the most part, they can be used to extrapolate.
> 
> Once you first build a rig your system generally is GPU bound, but as you upgrade, your GPU gets much faster and in turn needs a faster CPU to feed it draw calls. This is what happened to me; I have a Haswell i3 and started off with a 750 Ti without bottlenecking problems, but upgraded to a 380X and my system got _extremely_ CPU bound. In theory, the same ratio or gap between 400 FPS for CPU X and 300 FPS for CPU Y will be maintained in other CPU-bound tasks. CPU X will get 40 FPS but CPU Y will only get 30, if that makes sense.


It's going to be a long time before you upgrade to a video card that seriously bottlenecks any of the current upper-range Intel or Ryzen chips. Gaming performance is simply not a metric people use to justify CPU purchases much anymore; I can still play all my games just fine on my old 2600K. Getting at least that baseline of gaming performance while adding everything else Ryzen offers (including twice the cores and threads) for just $329 is an absolute win for me. I can't see how it's being twisted into a loss for anybody, really.


----------



## GHADthc

This guy sums up what I am seeing right now:

Quote from RussianSensation
Elite Member Anandtechforum

"X10000!

Seriously, most professional review media is completely clueless nowadays (or are in bed with Intel for product samples and marketing dollars). Who the hell buys a $330-500 8-core 16T CPU that they want to be a well-rounded processor for encoding, rendering, well-threaded office applications, streaming, and still be great at games, but then pairs it with a GTX1070/1080/1080Ti (or in reviews $1200 Titan X Pascal) and then uses a POS $90-200 1080p 60Hz monitor? Give me a break! I will continue ripping into 1080p 60Hz gaming as we are now in 2017, and 1070, a 1440P card, is only $349. Not only is 1080p outdated for PC enthusiasts, but most 1080p monitors are small in real estate (24" or less) and tend to be budget in terms of IQ quality (a lot of them are mediocre IPS or TN panels).

Joker is one of the few true PC gamers on YouTube who did a Real World Ryzen vs. Intel PC gaming review.

The idea of testing the CPU at low resolutions or low GPU settings to "test CPUs" is marketing drivel that Intel has used for a decade, despite it not making any sense in a decade. If 95% of games are GPU limited at 4xMSAA 1080p/1440p/3440x1440 or 4K, those are the real world results us gamers actually should care about since those are the scenarios we use our PCs in.

I did not buy my 6700K and 6800K CPUs and GTX1070s to play games at 1080p low-medium settings. To exacerbate matters, these so-called professional reviewers used a Titan XP for gaming benchmarks. In fact, if there is ANY extra gaming performance on the table, I, and I am sure many of you, will increase all IQ settings to the max, and then if more performance is available, we would raise MSAA to 2-4X or even enable SSAA. The fact is a 4GHz R7 1700 thrashes my 6700K in so many other applications outside of games that it's a MUCH better well-rounded processor. Outside of professional gamers who need 200-300 fps, who wants to play games with tearing?

The comparisons of 7700K vs. R7 1800X are so stupid, it's beyond any reasonable logic. That's like claiming i7 6900K is a failed CPU because it loses to the 7700K in gaming. It's OK to claim that 7700K is a better CPU for gaming, but that doesn't suddenly make R7 Ryzen a bad CPU. I would hope so that someone buying a $330-500 CPU does something other than gaming with it; and especially not gaming at the peasant 1080p 60Hz resolution. Otherwise, you do not need to spend that much on a processor in the first place. I am sure you can pick up a used i7 4770K/4790K and that would be more than adequate for gaming.

There also seems to be a lot of criticism coming from i7 4770K/6700K owners that Ryzen didn't really change the landscape for them. Let's look at Steam survey and see just how many Steam users have a CPU as powerful as the i7 4770K/6700K? For someone who has an i7 920/860, i5 2500K/2600K, R7 1700 @ 4Ghz would be an excellent upgrade.

It's amazing how the pro-Intel media never criticized 6-8 core Intel CPUs starting with i7-990X and 3930K eras, but the minute AMD's Ryzen makes 5820K/6800K/6850K and 6900K irrelevant and frankly flat out horrible in value, all of a sudden ALL the focus shifts to 1080p and lower PC gaming benchmarks? What a joke our tech-review industry has become. How come the media is barely discussing that other than Thunderbolt and SLI/CF support, the $90-100 B350 boards are significantly cheaper than the X99 board despite offering all the latest modern features from PCIe 3.0 x4 M.2 support, to USB 3.1 Type-C to Ryzen overclocking support? The total platform cost rises even more with X99 for those who want a multi-threaded powerhouse of a CPU.

With Intel, we basically get amazing mainstream gaming CPUs that are slow for multi-threaded tasks OR extremely overpriced X99 parts. With Ryzen, we actually get the most well-rounded processor to keep for the next 4-5 years. With B350 chipset, a mobo can be purchased for $90-100 and Ryzen 2.0/3.0 7nm parts will likely be backwards compatible with AM4 socket come 2019-2020. OTOH, LGA1151 Z170/270 or X99 platforms are completely dead. It's interesting how most "professional" reviewers have no clue as to the forward looking advantage of the AM4 platform as well. It means a builder can pick up a 4C or a 6C Ryzen and just get the 2019-2020 8C Ryzen when he/she needs more performance. With Intel, not only are you forced to pay more for X99 chipset board, but these boards have no upgrade path at all.

If I had to build a new PC system now, I would purchase an R7 1700 over the 6700K/7700K. I'd rather have a CPU that's 95% as good for games but is 30-60% faster in anything else I want to throw at it today or in the future. I would also get a guaranteed upgrade path for faster Ryzen models. Win-Win.

Techno-Kitchen already showed that a 5GHz 7700K is barely faster in games than a 4GHz 7700K. Most games today are GPU-limited once we start using real-world gaming settings and anti-aliasing!"

Couldn't have expressed it better myself...

All of these reviews are fishy... no 4K resolution testing? No SLI/CFX testing? Very odd choices of games in some of the review suites... Looks like I am going to have to look at real-world performance figures from users here on OCN.


----------



## LancerVI

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> It's going to be a long time before you are going to be upgrading to a video card that seriously bottlenecks any of the current upper-range Intel or Ryzen chips. Gaming performance is simply not a metric that people use to justify CPU purchases much anymore. I can still play all my games just fine on my old 2600K. Having at least that baseline of gaming performance but being able to add all that Ryzen offers (including twice the cores and threads) for just $329 is an absolute win for me. Can't see how it's being twisted into a loss for anybody, really?


This exactly.

Talk about "missing the forest for the trees" as one of the detractors exclaimed.


----------



## tashcz

Guys, anyone got a link to a review where someone tried disabling cores in order to get a higher overclock? Could that be possible to push the limit further?


----------



## Clocknut

Quote:


> Originally Posted by *dmasteR*
> 
> Why are you using the 6900K when you're talking about a gaming machine over a 7700K/6700K?


Quote:


> Originally Posted by *Gunderman456*
> 
> No, some are saying that the 7700K is cheaper and is better at gaming right now (this may change with optimization for Ryzen down the road) than the 1800X. Your example is disingenuous.


If you compare a high-clocked dual-core Pentium against a lower-clocked quad-core i5, you get the same answer (well, that was a couple of years ago when games only used two cores; look what happened since).


----------



## czin125

Are they doing a clean install of Windows when they test these new CPUs?
Quote:


> Originally Posted by *artemis2307*
> 
> ANYBODY trashing AMD for losing, REMEMBER: FINEWINE TECHNOLOGY.
> It's their first comeback in 10 years, give them a break.
> The platform just launched; give it time to grow. Yields are gonna get better, clocks are gonna get higher.
> Just be patient


Every time they retest old AMD GPUs, they're using newer CPUs to go with them, no? It's just that their GPUs aren't being maxed out yet. The 7970 is from 2012, but they retest it with a 2016/2017 CPU. The RX 480 gains more from CPU clocks than the GTX 1060 does.


----------



## Code-Red

Lisa Su said the reason for the low gaming scores is probably Intel-focused programming (compilers targeting their SMT arch, etc.). It should balance out over time as developers get on board, which this time around is a hell of a lot more likely than asking them to jump on board with Bulldozer's architecture.

I think we'll see improvements over time once the SMT and memory issues are hashed out, and definitely when the higher-clocking R3 and R5 processors launch.


----------



## iRUSH

Remember how optimistic we were just to have an AMD CPU perform near Haswell?

Skylake needed a BIOS update to get RAM past 2133 too, and Skylake really wakes up with 3000+ MHz RAM. Perhaps Ryzen will do the same?

I have very little reason to not be happy about Ryzen now and definitely Ryzen later.

If you went back to the old Ryzen speculation threads knowing the performance we currently have, many would be ecstatic.

I know some of the press, and AMD themselves, have deliberately set up situations that favor their product from a price-to-performance standpoint. That's probably the big issue. But looking back at what we thought we'd get compared to what we actually have, it's pretty good.

If you have read this far, I want to add one more thing. IMO, this will help AMD's GPU sales too. For whatever reason, many people do not like to build a PC with an AMD CPU and an Nvidia GPU.

But more people than I can fathom want an all AMD build!


----------



## budgetgamer120

Quote:


> Originally Posted by *GHADthc*
> 
> This guy sums up what I am seeing right now:
> 
> Quote from RussianSensation
> Elite Member Anandtechforum
> 
> "X10000!
> 
> Seriously, most professional review media is completely clueless nowadays (or in bed with Intel for product samples and marketing dollars). Who the hell buys a $330-500 8-core/16T CPU that they want to be a well-rounded processor for encoding, rendering, well-threaded office applications, streaming, and still great at games, but then pairs it with a GTX 1070/1080/1080 Ti (or, in reviews, a $1200 Titan X Pascal) and a POS $90-200 1080p 60Hz monitor? Give me a break! I will continue ripping into 1080p 60Hz gaming now that we are in 2017 and the 1070, a 1440p card, is only $349. Not only is 1080p outdated for PC enthusiasts, but most 1080p monitors are small in real estate (24" or less) and tend to be budget in image quality (a lot of them are mediocre IPS or TN panels).
> 
> Joker is one of the few true PC gamers on YouTube who did a Real World Ryzen vs. Intel PC gaming review.
> 
> The idea of testing CPUs at low resolutions or low GPU settings is marketing drivel that Intel has used for a decade, despite it not making sense for a decade. If 95% of games are GPU-limited at 4xMSAA 1080p/1440p/3440x1440 or 4K, those are the real-world results we gamers should actually care about, since those are the scenarios we use our PCs in.
> 
> I did not buy my 6700K and 6800K CPUs and GTX 1070s to play games at 1080p low-medium settings. To exacerbate matters, these so-called professional reviewers used a Titan XP for gaming benchmarks. In fact, if there is ANY extra gaming performance on the table, I, and I am sure many of you, will increase all IQ settings to the max, and then, if more performance is available, raise MSAA to 2-4x or even enable SSAA. The fact is a 4GHz R7 1700 thrashes my 6700K in so many applications outside of games that it's a MUCH better-rounded processor. Outside of professional gamers who need 200-300 fps, who wants to play games with tearing?
> 
> *[remainder of quote snipped; the full text appears at the top of this page]*


So in other words 1080p is the new 720p?

I'll be on 1080p for a while though


----------



## S.M.

Quote:


> Originally Posted by *aDyerSituation*
> 
> I doubt anyone is arguing that Ryzen isn't great value for multi-threaded workloads,
> 
> but in gaming it is severely lagging behind Ivy Bridge and Haswell in most games, which is not something that should just be overlooked. Those processors have been around for years.


No it's not.


----------



## Mad Pistol

Quote:


> Originally Posted by *budgetgamer120*
> 
> So in other words 1080p is the new 720p?
> 
> I'll be on 1080p for a while though


1440p should be the new 1080p... but it isn't, and it probably won't be until we can get decent 1440p panels under $200.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ZealotKi11er*
> 
> And all those are using unsafe volts. Max for Zen is 1.365V.


HWCanucks said 1.4V was fine in their review.


----------



## redone13

Quote:


> Originally Posted by *S.M.*
> 
> No it's not.


https://i.imgur.com/z3fQeQZ.png

http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7


----------



## budgetgamer120

Quote:


> Originally Posted by *Mad Pistol*
> 
> 1440p should be the new 1080p... but it isn't, and it probably won't be until we can get decent 1440P panels under $200.


I use VSR to go 4k or 1440p sometimes but outside of gaming it makes my desktop blurry.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> HWCanucks said 1.4V was fine in their review.


I heard it from GamersNexus, where they talked to AMD and were told 1.365V is the safe voltage to run.


----------






## manitox

Quote:


> Originally Posted by *GHADthc*
> 
> This guy sums up what I am seeing right now:
> 
> Quote from RussianSensation
> Elite Member Anandtechforum
> 
> "X10000!
> 
> *[remainder of quote snipped; the full text appears earlier in the thread]*


THIS!


----------



## Vesku

Quote:


> Originally Posted by *blue1512*
> 
> Seriously, you are still trying to compare apple to orange here, ~3.4 GHz 8c 1800x vs ~ 4.5 GHz 4c 7700k makes little sense. If you want gaming only, wait for lower cores Ryzens, which can reach higher clock and better single thread performance.


R7 owners can simulate 6 cores to give us some expectations. I do not think we will see OCs much past 4.2 without extreme cooling even on the lower-core Ryzens; they are cut from the same die.


----------



## S.M.

Quote:


> Originally Posted by *redone13*
> 
> https://i.imgur.com/z3fQeQZ.png
> 
> http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7


I can cherry pick 1080p benchmarks, too.


----------



## 12Cores

It looks like these chips will do damage if you can get them to 3.9-4.1GHz. This is definitely not the Bulldozer launch; I know, I owned the FX 8120/8320/8350. If you are sitting on an FX 8XXX, the R7 1700 should be on your radar for sure.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Mad Pistol*
> 
> 1440p should be the new 1080p... but it isn't, and it probably won't be until we can get decent 1440P panels under $200.


It has been close to 5 years since I purchased my first 1440p monitor; it was a $250 Korean panel off eBay, amazing for the money, and still going to this day. How sad that 1440p hasn't overtaken 1080p in all that time.


----------



## Fifth Horseman

Idk, maybe I am easily impressed, but for a very affordable price you can get a chip that can touch Intel's 10-core in a CB15 multi run. Yeah, it looks like it's not working very well in gaming benches at 1080p, but given that I game at 1440p/4K this is a non-issue.

Also, from what I can tell, with the memory issues, immature BIOSes, the lack of Windows drivers, and no game optimizations, it would be foolish for anyone to jump to a pile of concrete conclusions; I'd wager that if we look at this again in a few months we will get a different picture.

Will I buy? Meh, probably not, but I am not really looking to buy any CPU. What I have works really well for now.


----------



## Clocknut

The real problem here is that AMD overhyped things; they cherry-picked benchmarks and claimed the R7 1800X is the best 8-core, beating the 6900K... remember Lisa saying "the fastest 8-core"? LOL

Had AMD said they reached i7-5960X performance, many would have been impressed.


----------



## Forceman

Quote:


> Originally Posted by *Vesku*
> 
> R7 owners can simulate 6 cores to give us some expectations. I do not think we will see OCs much past 4.2 without extreme cooling solutions even on lower core Ryzen, they are cuts of the same die.


There were some reviews that used good cooling and still didn't get past ~4.1, so I think you are right. It isn't a temperature limit, it is a process/architecture issue, and even the 4/6-cores won't clock much higher.


----------



## VegetarianEater

Quote:


> Originally Posted by *S.M.*
> 
> I can cherry pick 1080p benchmarks, too.


Like it or not, the vast majority of gamers still use 1080p. I personally use 1440p, so most games are GPU-bound at that resolution, but only for now.

The whole point of testing at 720p or 1080p is to see what raw gaming performance is like when there is no GPU bottleneck. GPUs are still getting faster every year, so in three years' time a single GPU might not be a bottleneck at 1440p the way it is right now for most games.

Theoretically, the Ryzen 1800X should get the same gaming performance as the 6900K or 6850K, but it's getting consistently worse results when CPU-bound, and that's what matters most. Obviously in GPU-bound games it isn't going to matter, but why the hell would you benchmark a CPU using GPU-bound benchmarks? A lot of the time even the FX-8350 is fine when GPU-bound at 1440p or 4K.

So you can argue that those results are "cherry-picked", but actually those are the results that matter most: CPU-bound scenarios versus GPU-bound scenarios.
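The reasoning above can be sketched with a toy bottleneck model (all the millisecond numbers here are made up for illustration, not taken from any review):

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: each frame waits on whichever stage is slower."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical numbers: CPU A preps a frame in 5 ms, CPU B in 7 ms.
# At 4K the GPU needs 16 ms per frame, so both CPUs look identical:
print(fps(5, 16), fps(7, 16))   # 62.5 62.5
# At 720p the GPU needs only 3 ms, and the CPU gap finally shows:
print(fps(5, 3), fps(7, 3))     # 200.0 vs ~142.9
```

Dropping the resolution shrinks the GPU term, so the CPU term is the one actually being measured; at high resolution the CPU difference is masked entirely.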


----------



## ZealotKi11er

Quote:


> Originally Posted by *Clocknut*
> 
> The real problem here is AMD overhype things, they cherry picked the benchmark and claim R7 1800x is the best 8 core beating 6900K..... remember Lisa said the fastest 8 core? LOL
> 
> Had AMD said they reach i7-5960X performance, many would have been impress.


The 1800X is really an impressive CPU except in gaming. That just does not make any sense, because the 6900K stomps it, and it has nothing to do with single-threaded performance, since the 6900K is not a higher-clocked CPU.


----------



## 12Cores

OP, can you add this 1700 (3.9GHz) vs 7700K (5GHz) benchmark video to the reviews? Good stuff here.

https://www.youtube.com/watch?v=BXVIPo_qbc4


----------



## redone13

Quote:


> Originally Posted by *S.M.*
> 
> I can cherry pick 1080p benchmarks, too.


Cherry-pick ones where they win, then, as long as they aren't from AMD's early propaganda, which clearly had no correlation to these results.


----------



## S.M.

Quote:


> Originally Posted by *Clocknut*
> 
> The real problem here is AMD overhype things, they cherry picked the benchmark and claim R7 1800x is the best 8 core beating 6900K..... remember Lisa said the fastest 8 core? LOL
> 
> Had AMD said they reach i7-5960X performance, many would have been impress.


It is the fastest 8-core on the market, clock for clock.


----------



## tashcz

Quote:


> Originally Posted by *Forceman*
> 
> There were some reviews that used good cooling and they still didn't get past 4.1ish, so I think you are right. It isn't a temp limit, it is a process/architecture issue and even the 4/6 cores won't clock much higher.


How much higher can a 5820K overclock compared to a 5960X?

I think you are just making bad assumptions. 8-core chips just DON'T clock high. EOS.


----------



## Forceman

Quote:


> Originally Posted by *Clocknut*
> 
> The real problem here is AMD overhype things, they cherry picked the benchmark and claim R7 1800x is the best 8 core beating 6900K..... remember Lisa said the fastest 8 core? LOL
> 
> Had AMD said they reach i7-5960X performance, many would have been impress.


I think the bigger issue on OCN was the recent leaks posted by WCCF, etc., with the Cinebench scores - they painted an unrealistic picture of the overall performance and got everyone hyped up. Those "theoretical" comparisons where they scaled the scores up to simulate higher clock speeds were especially bad. If we had had these full results a week ago, I think most everyone would be impressed, but all the "faster than Kaby Lake clock for clock" stuff skewed expectations.
Quote:


> Originally Posted by *tashcz*
> 
> How much can a 5820K overclock compared to 5960X?
> 
> I think you are just making bad assumptions. 8 Core chips just DONT clock high. EOS.


Well, the 6-core is just going to be a disabled 8-core, so power use will be about the only difference. Maybe the 4-cores will be better, but best not to overhype 6-core expectations.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Clocknut*
> 
> The real problem here is AMD overhype things, they cherry picked the benchmark and claim R7 1800x is the best 8 core beating 6900K..... remember Lisa said the fastest 8 core? LOL
> 
> Had AMD said they reach i7-5960X performance, many would have been impress.


Why are games the ONLY performance that counts? It IS the fastest 8-core processor in many, many other things and it is still pretty close to the 6900K in most games as well.


----------



## tpi2007

Quote:


> Originally Posted by *Quantum Reality*
> 
> One thing I remember is that AMD has sometimes had to provide third-party driver support for what should be natively supported features in Windows. Back in the S939 dual-core days I remember you needed a special driver because the windows XP internal multiprocessor scheduler wasn't optimized well enough when the two "CPUs" were effectively sharing resources rather than being truly independent as in the dual-Pentium scenario.


It's the Dual-Core Optimizer driver for Windows XP:

http://support.amd.com/en-us/search/utilities



Quote:


> Originally Posted by *SoloCamo*
> 
> Didn't see it posted but I'm wondering if more binning is going on then realized.
> 
> L1Techs got their 1700 to 3.4ghz on 8 cores and only 3.8ghz on 2 cores...


As someone also pointed out, they say in the video that they didn't touch voltages yet, so we'll have to wait on more results before reaching a conclusion. But there is a valuable piece of information they gave, starting at 11:30: they saw large performance discrepancies even between supposedly minor BIOS updates that kept coming, so wait a few more days or weeks before drawing a final conclusion.

I've said it before, but I'll just leave the example of my Intel X79 motherboard, it had 21 BIOS updates over its lifetime.


----------



## Marios145

The 1800X set 3 world records on HWBOT on its first day. Which is the fastest 8-core now?


----------



## artemis2307

They're having issues with SMT; in some games, disabling it yields better results.
And the memory situation is still a mess.
I'd say wait 3-4 months for things to even out and see.


----------



## S.M.

Quote:


> Originally Posted by *VegetarianEater*
> 
> Like it or not, the vast majority of gamers still use 1080p. I personally use 1440p, so most games are GPU bound at that resolution, but only for now.
> 
> The whole point of testing at 720p or 1080p is to see what raw gaming performance is like if there is no GPU bottleneck. GPUs are still getting faster every year, so in 3 years time a single GPU might not be a bottleneck at 1440p like it currently is right now for most games.
> 
> Theoretically, the Ryzen 1800x should get the same gaming performance as the 6900k or 6850k, but it's getting consistently worse results when CPU-bound, and that's what matters most. Yeah obviously in GPU games it isn't going to matter, but why the hell would you benchmark a CPU by using GPU bound benchmarks? A lot of the time even the FX-8350 is fine when GPU bound, 1440p or 4K.
> 
> So you can argue that those results are "cherry-picked" but actually those are the results that matter most, CPU bound scenarios vs GPU bound scenarios.


I game at 1080p. I agree with all your points.

Whoever I quoted earlier said Ryzen is significantly slower than Ivy Bridge in gaming. That is not true. The response I got showed the WORST benchmark disparity between Ryzen and Ivy as 13%, at framerates over 120 FPS.
Quote:


> Originally Posted by *redone13*
> 
> Cherry pick where they win and they aren't from AMD's early propaganda which clearly had no correlation.


That's not the definition of propaganda.

The person I quoted said Ryzen is "significantly" slower in gaming than Ivy Bridge. It is not. Even the benchmarks you provided prove my point.


----------



## GHADthc

Quote:


> Originally Posted by *Oubadah*
> 
> That guy is misguided for the reasons I've already stated. In fact I'm pretty sure I wrote a response to that same guy when he was spouting the same spiel about GPU bound CPU benchmarks a hardware generation or two back and he never gave a satisfactory counterargument.
> 
> EDIT: Not the one I was thinking about, but the same discussion. My response:
> 
> Most of the professional review media _is_ completely clueless nowadays, and that's precisely why low-res CPU benches matter. Otherwise you're just looking at a pile of GPU bound CPU benchmarks that tell you absolutely NOTHING about the relationship between two CPUs. Maybe it means that X CPU is equal to Y CPU at "real world" resolutions, or maybe it just tells you that the reviewer is not familiar with the game, and did not test the CPU in a highly CPU bound area.
> 
> Test a 2600K with Skyrim in the plains around Whiterun: "Oh look, the 2600K is good enough for 60fps".
> Test a 2600K with Skyrim inside Markarth city: Suddenly 60fps is out the window.
> 
> Ideally the reviewer tests the 2600K in Markarth city, but if they don't know that's where the game is most CPU bottlenecked, then having low res *CPU bound* CPU benchmarks from around Whiterun is the second best option. If you know the game, you can extrapolate those results (the difference between two CPUs) to Markarth. If they have a bunch of *GPU bound* CPU benchmarks from around Whiterun, then those are absolutely worthless to everyone.


Except none of those tests are real-world scenarios a user will actually experience. Who buys a high-end new gaming rig just to lower all the settings to maximize their FPS (other than a very select few niche hardcore FPS players)? Next to nobody. Now, I know those low-res tests are good for exposing a CPU bottleneck and seeing which CPU comes out on top, but they are irrelevant to 99% of consumers buying a new CPU, who plan to run their games at higher resolutions with settings as high as possible.

All in all, most of these tests that put Intel in a more positive light are deliberately orchestrated to do so. Nine times out of ten, no one is going to do that sort of thing when they buy a new PC; they will just play their games at as high a level of fidelity as they can.

And in the case of Ryzen, the gap between it and Intel's best shrinks as the resolution goes up... and yet I haven't seen any of the review sites showing 4K resolution numbers, or SLI/CFX numbers (which matters, because all the X99 users are hanging crap on AMD right now, thinking X370 isn't up to scratch for SLI/CFX, even though in reality the difference between PCIe 3.0 x8 and x16 is pretty negligible).


----------



## Majin SSJ Eric

Which is why I was always planning to do a Ryzen build over the summer at the earliest.


----------



## S.M.

Quote:


> Originally Posted by *Marios145*
> 
> The 1800x on its first day won 3 world records on hwbot. Which is the fastest 8core now?


Clearly a disappointing launch and a failure.

I should have bought a 7600K.


----------



## AuraNova

The thing I find funny is that when the rumours, speculation, and so-called leaks came out, a good number of us said, "Wait for legit reviews." Now that those legit reviews are out, nothing is consistent, more arguing ensues, and the term "cherry-picking" gets used quite a bit. It's the worst mentality I've seen around any major product launch so far.

I know the reasoning why things are the way they are for Ryzen at this moment, I just wanted to make this observation.


----------



## redone13

Quote:


> Originally Posted by *S.M.*
> 
> I game at 1080p. I agree with all your points.
> 
> Whoever I quoted earlier said Ryzen is significantly slower than Ivy bridge in gaming. That is not true. The response I got showed the WORST benchmark disparity between Ryzen and Ivy as 13%, with an FPS over 120FPS.
> That's not the definition of propaganda.
> 
> The person I quoted said Ryzen is "significantly" slower in gaming than Ivy bridge. It is not. Even the benchmarks you provided proved my point.


OK, focus on my word choice then. Those were selectively chosen benchmarks, picked to make it look like it wouldn't turn out the way it did in all these other FPS benches. I did see the 3770K above the Ryzen.


----------



## tashcz

Nobody liked the RX 480 at launch either, and after that the RX 470 was a ball buster. So don't judge immediately; let some time pass.


----------



## helis4life

Arguing that AMD is slower at 1080p medium settings with a GTX 1080 or higher seems pretty redundant. It feels like nothing but e-peen, particularly when you consider the ultra gaming tests and the non-gaming tests, which show a much more even AMD vs. Intel picture.


----------



## Marios145

Quote:


> Originally Posted by *AuraNova*
> 
> The thing I find funny is how when rumours and speculation and so-called leaks came out, a good collective of us said, "Wait for legit reviews." Now that those legit reviews are out, nothing is consistent, more arguing ensues, and the term "cherry-picking" seems to used quite a bit. This mentality being the worst in any major product review I've seen so far.
> 
> I know the reasoning why things are the way they are for Ryzen at this moment, I just wanted to make this observation.


You obviously weren't around during the Athlon 64 / Pentium 4 era. What you see is just the beginning.
People will try to maintain their "elitist" status any way they can.


----------



## AuraNova

Quote:


> Originally Posted by *Marios145*
> 
> You obviously weren't around during the athlon 64/pentium 4 era. What you see is just the beginning.
> People will try to maintain their "elitist" status every way that they can.


I actually was. I even thought about that era when this whole thing blew up. I don't think it was this bad back then.


----------



## tashcz

Anyone whining about Intel being better at gaming, while Ryzen kicks ass in everything else, think about this:

Intel is for boys, AMD is for real men.


----------



## DeadSkull

Ryzen has risen very impressively for development, computational, and simulation work. As a heavy user of Python modules for work and scientific research, I find Ryzen's computational prowess huge.

Games are meh, but I rarely game anymore, so to each his own.


----------



## Oubadah

..


----------



## Clocknut

Quote:


> Originally Posted by *Forceman*
> 
> I think the bigger issue on OCN was the recent leaks posted by WCCF, etc, with the Cinebench scores - they painted a unrealistic picture of the overall performance and got everyone hyped up. Those "theoretical" comparisons where they raised the scores to simulate higher clock speeds were especially bad. If we had these full results a week ago I think most everyone would be impressed, but all the "faster than Kaby Lake clock for clock" stuff skewed expectations.


Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Why are games the ONLY performance that counts? It IS the fastest 8-core processor in many, many other things and it is still pretty close to the 6900K in most games as well.


There is nothing wrong with the CPU; the 1800X is an impressive chip.

The problem is that AMD kept emphasizing and overhyping how the 1800X beats the 6900K, when in reality those results are cherry-picked. "Fastest 8-core in the market" came out of AMD's CEO's own mouth; that's not coming from fans.

Once again, had they compared against the i7-5960X, it would have been the clear winner, because the 1800X pretty much wins most of the time there.


----------



## redone13

Quote:


> Originally Posted by *tashcz*
> 
> Anyone whining about Intel being better at gaming, while Ryzen kicks ass in everything else, think about this:
> 
> Intel is for boys, AMD is for real men.


If by real men you mean a smaller market that needs heavy computational power for very specialized tasks... It is good that Ryzen made it more affordable. That may make higher thread and core counts more widely available and used further down the road through pricing.


----------



## DeadSkull

Quote:


> Originally Posted by *redone13*
> 
> If by real mean you mean a smaller market that needs heavy computational power for very specialized tasks.. It is good Ryzen made it more affordable. This may make more threads and cores more available and widely used further down the road through price.


Ryzen looks very good for anyone working in machine learning / deep learning development, even with the power that AWS offers.


----------



## AmericanLoco

Quote:


> Originally Posted by *Clocknut*
> 
> The problem is on AMD keep on emphasizing/overhype how 1800X beat 6900K, when the reality is those result are cherry picked. "Fastest 8 core in the market" is coming out from AMD CEO's mouth herself, thats not coming from fans.
> 
> once again, had they take comparison against i7 5960X, it would be the clear winner because 1800X pretty much win most of the time here.


There are plenty of benchmarks where the 1800X is faster, and some where it is slower than the 6900K. It shines in HEDT workloads, not so much in gaming.


----------



## blue1512

Quote:


> Originally Posted by *Oubadah*
> 
> You're not understanding my point. To render it down, I'll quote myself from the discussion I linked above:
> 
> *What is more futile, to:
> 
> a) Take CPU benchmarks in an artificially CPU bound scenario (ie. low res), or
> b) Take CPU benchmarks in a hopelessly GPU bound scenario?*
> 
> You don't need 4K Ryzen results because you can just look at existing Intel vs Intel results to see what happens at that res. Flatlined GPU bound CPU benchmarks are only useful to people who have some sort of agenda to prove that their pet CPU is "as good as" the competition. They offer no value. "GPU bound CPU benchmarks" Just consider that phrase for a second. That is the definition of futility.


Using games to assess a CPU is flawed from the start, as most studios don't put much effort into utilizing CPU power. They give the GPU first priority.


----------



## tashcz

Quote:


> Originally Posted by *redone13*
> 
> If by real mean you mean a smaller market that needs heavy computational power for very specialized tasks.. It is good Ryzen made it more affordable. This may make more threads and cores more available and widely used further down the road through price.


Well, were gamers buying 6900Ks or 5960Xs? Or were they buying 4790Ks/6700Ks/7700Ks?


----------



## VegetarianEater

Quote:


> Originally Posted by *S.M.*
> 
> I game at 1080p. I agree with all your points.
> 
> Whoever I quoted earlier said Ryzen is significantly slower than Ivy bridge in gaming. That is not true. The response I got showed the WORST benchmark disparity between Ryzen and Ivy as 13%, with an FPS over 120FPS.
> That's not the definition of propaganda.
> 
> The person I quoted said Ryzen is "significantly" slower in gaming than Ivy bridge. It is not. Even the benchmarks you provided proved my point.


Whoops, didn't see that post; this thread is moving so fast. Based on the single- and multi-thread benchmarks I've seen, I do feel Ryzen should crush Ivy Bridge in gaming and be right on par with the 6900K in gaming benchmarks, but unfortunately it's not, so far...

For example, at 4.1 GHz my 3770K gets around 1770 in CPU-Z single-thread and 7900 multi-thread, while Ryzen gets over 2000 single-thread (2200 for the 1800X) and over 16000 multi-thread. It should destroy it in gaming, but for some reason it doesn't, which is frustrating, as I was hoping Ryzen would be my next upgrade (I want 8 cores for streaming, but not if overall gaming performance is worse than what I already have).
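The quoted CPU-Z numbers actually quantify the puzzle. A rough sanity check (a sketch using the poster's self-reported scores; CPU-Z is a synthetic benchmark, so this only bounds expectations):

```python
# Self-reported CPU-Z scores from the post above (synthetic benchmark,
# illustrative only -- not review data).
ivy_st, ivy_mt = 1770, 7900        # i7-3770K @ 4.1 GHz
ryzen_st, ryzen_mt = 2000, 16000   # Ryzen 7, per the post

st_gain = ryzen_st / ivy_st - 1    # single-thread advantage
mt_gain = ryzen_mt / ivy_mt - 1    # multi-thread advantage

print(f"single-thread: +{st_gain:.0%}, multi-thread: +{mt_gain:.0%}")
```

Most game engines lean on one or two heavy threads, so only the modest single-thread edge (roughly 13% here) is in play, and a GPU bottleneck can hide even that. The 2x multi-thread advantage shows up in streaming and encoding instead, which is exactly the 8-core use case the poster mentions.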


----------



## Gunderman456

I think they tested at low res to show CPU prowess, which is not real-world.

If they had tested at high res with everything maxed out, the CPUs would all be very close to each other.

We are being taken for a ride, really: they give us the illusion that there is a great difference between recent generations of AMD and Intel CPUs, but for the most part a 4770K, a 7700K, an 1800X, and a lot of the time even a 3850 will all perform very close to each other in a lot of games, especially when GPU bound.
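That "close at high res" intuition can be made concrete with a toy bottleneck model (my own simplification, with made-up frame-rate ceilings, not numbers from any review): delivered FPS is capped by whichever of the CPU or GPU is slower.

```python
def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Toy model: the frame rate is limited by the slower component."""
    return min(cpu_cap, gpu_cap)

# Hypothetical per-CPU frame-rate ceilings (illustrative only).
cpu_caps = {"faster CPU": 160.0, "slower CPU": 140.0}

# A fast GPU at 1080p barely limits anything; at 4K it caps both CPUs.
for res, gpu_cap in [("1080p low", 300.0), ("4K ultra", 60.0)]:
    fps = {n: delivered_fps(c, gpu_cap) for n, c in cpu_caps.items()}
    gap = fps["faster CPU"] / fps["slower CPU"] - 1
    print(f"{res}: {fps}, gap {gap:.0%}")
```

At the GPU-bound end the measured "CPU gap" collapses toward zero, which is why low-res tests isolate the CPU while high-res tests mostly measure the graphics card.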


----------



## Oubadah

..


----------



## redone13

Quote:


> Originally Posted by *Gunderman456*
> 
> I think they tested at low res to show CPU prowess, which is not real world.
> 
> If they would have tested at high res and with everything maxed out, CPUs would all be very close to each other.
> 
> We are being taken really, they give us the illusion that there is a great difference between AMD and Intel CPUs of later and recent generations, but for the most part a 4770K, a 7700K a 1800X and a lot of times even a 3850 will all perform very close to each other in a lot of games especially when GPU bound.


Resolution is only just standardizing on 1080p. 4K is not realistic for anyone unwilling to buy at least a GTX 1080. It's another further-down-the-road thing, a way of future-proofing, which isn't really the here and now for the majority.


----------



## CULLEN

Quote:


> Originally Posted by *redone13*
> 
> I doubt anyone is arguing that Ryzen isn't great value for multi threaded workloads
> 
> but *in gaming it is severely lacking behind Ivy bridge and haswell in most games*, which is not something that should just be looked over. those processors have been around for years


Somebody pointed out you were wrong.

And what did you do?

Cherry-picked a single test, of course.

Quote:


> Originally Posted by *redone13*
> 
> https://i.imgur.com/z3fQeQZ.png
> 
> http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7


_Read with David Attenborough voice_

As the Intel fanboy stalks his prey in the night, his face lights up from the RGB-backlit keyboard. To him, this isn't just about making a statement; it's about what's true, in his opinion.

Arguing against his belief, even with proof, only reinforces it; all efforts to help him understand that this is a victory for everyone are in vain. An absolutely remarkable creature.


----------



## S.M.

Quote:


> Originally Posted by *VegetarianEater*
> 
> whoops, didn't see that post, this thread is moving by so fast. However I do feel that with Ryzen's single and multi thread performance based on the benchmarks i've seen, it should crush the Ivy in gaming, and should be exactly on par with the 6900k in gaming benchmarks, but unfortunately it's not so far...
> 
> For example at 4.1 ghz on my 3770k I get around 1770 in cpu-z single thread, and 7900 multi thread, while the Ryzen gets over 2000 single threaded (2200 for the 1800x) and over 16000 multi thread, so it should destroy it in gaming, but for some reason it's not, which is frustrating to me as I was hoping Ryzen would be my next upgrade (I want 8 cores for streaming, but not if overall gaming performance is worse than what I already have)


There's a brand-new piece of hardware between a Ryzen CPU and its GPU.

And it's currently running on beta and launch-day BIOS revisions that AMD themselves have said are bugged.


----------



## redone13

Quote:


> Originally Posted by *CULLEN*
> 
> Somebody pointed out you were wrong
> 
> And what did you do?
> 
> Cherry picked a single test of course.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _Read with David Attenborough voice_
> 
> As the Intel fanboy stalks his prey in the night, his face lights up from the RGB backlit keyboard. To him, this isn't just about making a statement, it's about what's true, in his opinion.
> 
> Arguing against his believe, even with proofs, just enforces his beliefs - all efforts to help him understand that this is a victory for all are in vain - absolutely remarkable creature.


Lol. Well, do show me other ones from the front page as a rebuttal where the Ryzen is on top.


----------



## LancerVI

Quote:


> Originally Posted by *redone13*
> 
> Lol. Well, do show me other ones *from the front page* as a rebuttal where the Ryzen is on top.


You're doing it again.


----------



## Majin SSJ Eric

AMD and Intel are BOTH for real men, Jeezus. I quite like Ryzen. I think the 1800X is a great processor. I think the 7700K is a great processor. I still think my 4930K and 2600K are both great processors. The point is, for gaming in 2017, which processor you have is nowhere near the worry it used to be. If you bought any Intel CPU since 2011, or you buy Ryzen tomorrow, they will all play games perfectly fine. At least Ryzen lets you spend the money you'd waste on a 6900K on a second 1080, for instance, while still offering capabilities a 7700K couldn't dream of.


----------



## cssorkinman

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vesku*
> 
> R7 owners can simulate 6 cores to give us some expectations. I do not think we will see OCs much past 4.2 without extreme cooling solutions even on lower core Ryzen, they are cuts of the same die.
> 
> 
> 
> There were some reviews that used good cooling and they still didn't get past 4.1ish, so I think you are right. It isn't a temp limit, it is a process/architecture issue and even the 4/6 cores won't clock much higher.

Could be the case; I've seen it before with a first gen. Wouldn't that basically give us a $259 5820K in the 1600X?

Quote:


> Originally Posted by *12Cores*
> 
> OP can you add this 1700(3.9ghz) vs 7700K(5ghz) benchmark video to the reviews, good stuff here.
> 
> https://www.youtube.com/watch?v=BXVIPo_qbc4


Joker's gaming benches look a lot more like what I would expect than some I've seen.


----------



## redone13

Quote:


> Originally Posted by *LancerVI*
> 
> You're doing it again.


If you mean proving that Intel is more practical for those not using Handbrake, then I suppose. All I asked for was some evidence of your own. I do see a YouTube video in the prior post in support of AMD.


----------



## Clocknut

Quote:


> Originally Posted by *AmericanLoco*
> 
> There are plenty of benchmarks where The 1800X is faster, and somewhere it is slower than the 6900. It shines in HEDT workloads, not so much in gaming.


Which is why I said AMD should have benched against the i7-5960X; the 1800X wins pretty much every time there.

IMO, AMD still hasn't learned from the RX 480 Polaris launch. Never overhype things with cherry-picked results. They should just show the public the best average result and the worst average result, then let the price decide the winner.


----------



## GHADthc

Quote:


> Originally Posted by *Oubadah*
> 
> You're not understanding my point. To render it down, I'll quote myself from the discussion I linked above:
> 
> *What is more futile, to:
> 
> a) Take CPU benchmarks in an artificially CPU bound scenario (ie. low res), or
> b) Take CPU benchmarks in a hopelessly GPU bound scenario?*
> 
> You don't need 4K Ryzen results because you can just look at existing Intel vs Intel results to see what happens at that res. Flatlined GPU bound CPU benchmarks are only useful to people who have some sort of agenda to prove that their pet CPU is "as good as" the competition. They offer no value. "GPU bound CPU benchmarks" Just consider that phrase for a second. That is the definition of futility.


Quote:


> Originally Posted by *Oubadah*
> 
> Yes, historically game devs have been slack about optimizing for CPUs, but if you're getting a CPU to run games, then what else are you going to use to assess it?


And there is my point, right there. In terms of gaming performance (which is what people seem to think AMD was heavily marketing Ryzen for), when you benchmark Ryzen alongside Intel's offerings in a GPU-bound test (which is what happens 9 times out of 10 while gaming), there will be very little difference between Ryzen and Intel's best. Then, when you run the rest of the gamut of tests, AMD shows it also has exceptional multithreaded performance compared to both Intel and its own last generation of CPUs as an added bonus.

Unfortunately, 90% of the consumer market will briefly skim the launch-day tests, go straight to the gaming benchmarks, see a very skewed bunch of results, and automatically assume Intel is far better on the gaming side than it really is. 4K testing would have painted a slightly different picture, as 4K is becoming more relevant, and it's especially important information for people who buy a PC and keep it for 4+ years; they would like to know what their system's capabilities will be at the highest resolutions and settings. It helps when reviewers give the full story.


----------



## S.M.

Quote:


> Originally Posted by *redone13*
> 
> If you mean prove that Intel is more practical for those not using handbrake then I suppose. All I asked for was some evidence of your own.


The only use case where an Intel CPU performs better dollar-for-dollar is gaming at 1080p or below.

That's not practical at all.


----------



## ryan92084

Quote:


> Originally Posted by *12Cores*
> 
> OP can you add this 1700(3.9ghz) vs 7700K(5ghz) benchmark video to the reviews, good stuff here.
> 
> https://www.youtube.com/watch?v=BXVIPo_qbc4


Caught up on the thread and added several interesting links, including this one.

Feel free to mention me in a post if you find something worth adding to the OP, so I'll definitely see it. I'll be back in the morning; good night, and fight nice, everyone.

_Personal note:_ now I'm extra glad I decided to wait for the 1600X. So many complaints about the sorry state of the motherboards; hopefully that and the memory issues will be sorted out by then.


----------



## jamaican voodoo

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> AMD and Intel are BOTH for real men, Jeezus. I quite like Ryzen. I think the 1800X is a great processor. I think the 7700K is a great processor. I still think my 4930K and 2600K are both great processors. The point is, for gaming in 2017 what processor you have is no where near the worry it used to be. If you bought any Intel CPU since 2011 or you buy Ryzen tomorrow, they will all play games perfectly fine. At least Ryzen allows you to spend the money you waste on a 6900K on a second 1080, for instance, while still offering capabilities to do other things that a 7700K couldn't dream of doing.


I second this all day; someone is making sense. The people trashing Ryzen already own great processors, so I think they should shut up and leave the thread. I'd rather hear bickering from people who actually want to upgrade.


----------



## littledonny

Quote:


> Originally Posted by *Forceman*
> 
> I think the bigger issue on OCN was the recent leaks posted by WCCF, etc, with the Cinebench scores - they painted a unrealistic picture of the overall performance and got everyone hyped up. Those "theoretical" comparisons where they raised the scores to simulate higher clock speeds were especially bad. If we had these full results a week ago I think most everyone would be impressed, but all the "faster than Kaby Lake clock for clock" stuff skewed expectations.
> Well the 6 core is just going to be a disabled 8 core, so power use will be about the only difference. Maybe the 4 cores will be better, but best not to overhype the 6 core expectations.


Since the IHS is soldered, the enabled cores should still run cooler, with the disabled cores acting as extra die area to spread heat. Whether that results in higher OCs is unknown.


----------



## redone13

Quote:


> Originally Posted by *S.M.*
> 
> The only use case an Intel CPU performs better dollar-per-dollar is gaming at 1080p or below.
> 
> That's not practical at all.


Read my prior post. It said this:

Resolution is only just standardizing on 1080p. 4K is not realistic for anyone unwilling to buy at least a GTX 1080. It's another further-down-the-road thing, a way of future-proofing, which isn't really the here and now for the majority.


----------



## Gunderman456

Psssst... high-end i3s and i5s are very close in real-world gaming to i7s, and so are last-gen AMD and new-gen Ryzen. You can't lose by going with Ryzen, and certainly not with all the optimization that is coming.

So the back and forth going on here is useless.


----------



## VegetarianEater

Quote:


> Originally Posted by *GHADthc*
> 
> And there is my point, right there, in terms of gaming performance (Which is what people seem to think AMD was heavily marketing Ryzen for), when you benchmark the Ryzen CPU alongside Intel's offerings, in a GPU bound CPU test (What will happen 9 times out of 10 whilst gaming)...there will be very little difference between Ryzen and Intel's best...and then when you run the rest of the gamut of tests..AMD show's its also got exceptional MT performance compared to both Intel and its own last generation of CPU as an added bonus.
> 
> Unfortunately, 90% of the consumer market will briefly go over the launch day tests, go straight to gaming benchmarks, and see a very skewed bunch of results...and automatically assume Intel is far better in the gaming side of things, than it really is...4K resolution testing would of painted a slightly different picture, as its becoming more relevant, and its an especially important bit of information for people who buy a PC and keep it for 4+ years...they would like to know what their systems capabilities will be at the highest resolutions and settings...it helps when reviewers give the full story.


At 4K, processor choice currently matters very little. My point is that GPUs will continue to get more powerful, so something that is GPU bound now, such as 1440p or 4K gaming, might not be in the future, and then CPU choice actually will matter quite a bit. We're seeing now that at 1080p a lot of games are still GPU bound, but some aren't even when maxed, and that's where CPU choice will matter.


----------



## Mad Pistol

Quote:


> Originally Posted by *S.M.*
> 
> The only use case an Intel CPU performs better dollar-per-dollar is gaming at 1080p or below.
> 
> That's not practical at all.


And you have to purchase a 7700K to hit that "value" (and I use that term extremely loosely). A 7600K is already a bottleneck in some newer games.

I have a feeling no Ryzen 8-core will suffer the same fate as the i5-7600K. That being said, if you want THE BEST GAMING EXPERIENCE RIGHT NOW AND HAVE NO REASON TO FUTUREPROOF YOUR RIG OMG 5 GIGGLEHURTZ LAWL... the i7-7700K is a great choice.


----------



## mcg75

Quote:


> Originally Posted by *GHADthc*
> 
> Except none of those tests are real-world scenario's where a user will actually experience them, who buys a high end new gaming rig, just to lower all the settings down to maximize their FPS (Other than a very select few niche hardcore FPS players?)...next to nobody is the answer to that question.....now I know those low res tests are good to see a CPU bottleneck and see which CPU comes out ontop...but they are irrelevant to 99% of consumers who will buy a new CPU, as they plan to run their games at higher resolution with all the settings to as high as possible.
> 
> All in all, most of these tests, that put Intel in a more positive light, are deliberately orchestrated to do so...no one is going to do that sort of thing 9 times out of 10 when they buy a new PC, they will just play their games at as high a level of fidelity that they can.


Anyone with an 1800X and a GTX 1080 is probably playing at 1440p minimum.

At the same time, someone with a 1700X and a GTX 1060/RX 480 is probably playing at 1080p.

Is there a reason the percentage gap would be smaller if they had used a 1060/480? I could be wrong here, but I'm willing to listen.

If not, all these 1080p results apply to the more mainstream cards as well, so those results are very important.
Quote:


> Originally Posted by *GHADthc*
> 
> And in the case of Ryzen, it bridges the gap between it and Intel's best, the higher the resolution goes up....and yet I haven't seen any of the review sites showing 4K resolution numbers...no SLI/CFX numbers (which is important, because all the X99 users are hanging crap on AMD right now, thinking that X370 isn't up to scratch with SLI/CFX, even though in reality the difference between 8x PCIE 3.0 and 16x is pretty negligible).


There are some 4K reviews mixed in there; that was specifically what I was looking for.

Ryzen is a solid triple. It's not the strikeout Intel fans wanted, nor the home run AMD fans wanted.

Let's hope AMD's continued refinements keep Intel on their toes.


----------



## tpi2007

Quote:


> Originally Posted by *cssorkinman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vesku*
> 
> R7 owners can simulate 6 cores to give us some expectations. I do not think we will see OCs much past 4.2 without extreme cooling solutions even on lower core Ryzen, they are cuts of the same die.
> 
> 
> 
> There were some reviews that used good cooling and they still didn't get past 4.1ish, so I think you are right. It isn't a temp limit, it is a process/architecture issue and even the 4/6 cores won't clock much higher.
> 
> 
> Could be the case, seen it before with a first gen . Wouldn't that basically give us a $259 5820 in the 1600X?

The 4-core may clock higher if they release it with a higher revision / better yields. They aren't releasing the quad cores until the second half of the year, right? Maybe that's why. The F3 revision CPUs got to 3.9 GHz Turbo and the F4 ones got to 4 GHz, so it's possible further fine-tuning is going on, especially since they are different dies.

In any case, there is one argument that still hasn't been made: the Z270 platform is a dead end. All you'll get is a 4C/8T chip that has already been released, whereas with the AMD platform you can buy a $260 Ryzen 5 1600X 6C/12T CPU or a $329 Ryzen 7 1700 8C/16T now and in a year or two upgrade to Zen+ or Zen++, which will perform better and most likely clock higher. With Intel you have to make that decision right away, go for Z270 or X99. And on that front, the 6900K is probably the best you'll get too (unless Intel decides to release some higher clocked CPUs at the last minute). (Edit: I forgot about the 6950X, but that one is priced way out of the competition, at least right now.)


----------



## redone13

Quote:


> Originally Posted by *VegetarianEater*
> 
> at 4K processor choice currently matters very little. My point is that in the future GPUs will continue to get more powerful, and so something that is GPU bound now might not be in the future, such as 1440p or 4k gaming, and then CPU choice actually will matter quite a bit. We're seeing now that at 1080p a lot of games are still GPU bound, but some aren't even when maxed, and that's when CPU choice will matter.


Everything is later on; that is my point. Ryzen is a good early step! I'm not arguing that it's bad at anything; I'm just stating facts.


----------



## S.M.

Quote:


> Originally Posted by *redone13*
> 
> Read my prior post. It said this:
> 
> Resolution is barely standardizing to 1080p. 4k is not realistic for anyone not willing to buy a GTX 1080 at the minimum. It's another, further down the road thing or a way of future proofing which isn't really the now for the majority.


The Steam hardware survey shows that 1080p has been the majority resolution since 2014.


----------



## redone13

Quote:


> Originally Posted by *S.M.*
> 
> Steam hardware survey states that 1080p has been the majority since 2014.


And still is. See the problem?


----------



## Hueristic

Couldn't care less about a few frames in games, but these BIOS issues do concern me. I'm waiting until they are squared away and I see some ASRock and Biostar reviews as well.


----------



## blue1512

Quote:


> Originally Posted by *Mad Pistol*
> 
> And you have to purchase a 7700k in order to hit that "value" (and I use that term extremely loosely). A 7600k is already a bottleneck for some newer games.
> 
> I have a feeling that any Ryzen 8 core will not suffer the same fate of an i5 7600k. That being said, if you want THE BEST GAMING EXPERIENCE RIGHT NOW AND HAVE NO REASON TO FUTUREPROOF YOUR RIG OMG 5 GIGGLEHURTZ LAWL... an i7 7700k is a great choice.


Ironically, not long ago people were bashing the 7700K for its small increase over Skylake :rofl:


----------



## Mad Pistol

Quote:


> Originally Posted by *Hueristic*
> 
> Could care less about a few frames in games but these bios issues do concern me, I'm waiting until they are squared away and I see some Asrock and Biostar reviews as well.


Intel has had growing pains like this as well.

People seem to forget that hardware nowadays is so complex that the best way to test it is to sell it. As big as Intel and Nvidia are, they cannot possibly find all the issues before a product launch; they just try to minimize their impact post-launch, usually through hotfixes or driver updates.


----------



## Oubadah

..


----------



## Hueristic

Quote:


> Originally Posted by *Mad Pistol*
> 
> *Intel has had growing pains like this as well.
> *
> People seem to forget that hardware nowadays is so complex that the best way to test it is to sell it. As big as Intel/Nvidia is, they cannot possibly figure out all of the issues before a product launch. They just try and minimize their impact post-launch, usually through hotfixes or driver updates.


Too many times to count, actually.


----------



## Alwrath

Quote:


> Originally Posted by *redone13*
> 
> Read my prior post. It said this:
> 
> Resolution is barely standardizing to 1080p. 4k is not realistic for anyone not willing to buy a GTX 1080 at the minimum. It's another, further down the road thing or a way of future proofing which isn't really the now for the majority.


You both make valid points. The problem is, just because 1080p is still the most popular resolution, or the "standardized" resolution most people play at, does not mean 1440p and 4K benches are worthless. Quite the opposite: I would argue 1440p and 4K benches are more important, because those are the resolutions enthusiast power users like you, me, and probably most of the people on this forum care about most; that is what enthusiasts (aka gamers who have a clue) run their games at nowadays. 1440p and 4K represent both the future and the here-and-now for people like us, and real, dedicated PC hardware enthusiasts throw lots of $$$ at a monitor because they know it's money well spent in game enjoyment.

Let the 1080p potatoes run their games on their little 24-inch 1080p monitors, and let the real PC hardware enthusiast crowd game at 1440p and 4K, where the benches really matter.

Also, you can easily game at 4K with a $180 Radeon RX 480; just turn some details down.


----------



## S.M.

Quote:


> Originally Posted by *redone13*
> 
> And still is. See the problem?


Yes, stagnant hardware advancement and a lack of affordable options.


----------



## Clocknut

Quote:


> Originally Posted by *Mad Pistol*
> 
> Intel has had growing pains like this as well.
> 
> People seem to forget that hardware nowadays is so complex that the best way to test it is to sell it. As big as Intel/Nvidia is, they cannot possibly figure out all of the issues before a product launch. They just try and minimize their impact post-launch, usually through hotfixes or driver updates.


The GTX 970... the Sandy Bridge CPU bug, the Sandy Bridge P67 SATA problem that caused a mass motherboard replacement...


----------



## Hueristic

Quote:


> Originally Posted by *Clocknut*
> 
> GTX970.... Sandy bridge cpu bug, Sandy Bridge P67 sata problems that cause a mass motherboard replacement..


The first Pentium was an utter disaster.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *tpi2007*
> 
> The 4 core may clock higher if they release it with a higher revision / better yields. They aren't releasing the quad cores until the second half of the year, right? Maybe that's why. The F3 revision CPUs got to 3.9 Turbo and the F4 ones got to 4 Ghz, so it's possible further fine tuning is going on. Especially since they are different dies.
> 
> *In any case, there is one argument that still hasn't been made: the Z270 platform is a dead end. All you'll get is a 4C/8T chip that has already been released, whereas with the AMD platform you can buy a $260 Ryzen 5 1600X 6C/12T CPU or a $329 Ryzen 7 1700 8C/16T now and in a year or two upgrade to Zen+ or Zen++, which will perform better and most likely clock higher.* With Intel you have to make that decision right away, go for Z270 or X99. And on that front, the 6900K is probably the best you'll get too (unless Intel decides to release some higher clocked CPUs at the last minute).


That is a very good point and something I hadn't even considered. But you have some people in here worrying about 720p performance because at some point, several years from now, the newest Titan Y will supposedly be bottlenecked by an 1800X but not a 7700K. Yeah, OK. Of course, by then both CPUs will be old news...


----------



## kaosstar

I think the Ryzen R5 1500 at $229 will end up being an awesome value gaming CPU.


----------



## Mad Pistol

Quote:


> Originally Posted by *kaosstar*
> 
> I think the Ryzen R5 1500 at $229 will end up being an awesome value gaming CPU.


FTFY


----------



## oxidized

Quote:


> Originally Posted by *Mad Pistol*
> 
> FTFY


They already gave us that, now we need a really good GAMING cpu...


----------



## Oubadah

..


----------



## Mad Pistol

Quote:


> Originally Posted by *oxidized*
> 
> They already gave us that, now we need a really good GAMING cpu...


True, but the 6 core will simply be a killer all-around CPU. Great price, great performance, great... everything.


----------



## redone13

Quote:


> Originally Posted by *Alwrath*
> 
> You both make valid points. The problem is, just because 1080p is still the most popular resolution, or the "standardized resolution" most people play at, does not mean 1440p and 4K benches are worthless. It's quite the opposite. I would argue that 1440p and 4K benches are more important, because those are the resolutions enthusiast power users like you, me, and probably most of the people on this very forum care about the most, because that is what enthusiast power users (aka gamers that have a clue) run their games at nowadays. 1440p and 4K represent the future, and the right here, right now, for people like us, and real, true, dedicated PC hardware enthusiasts throw lots of $$$ at a monitor because they know it's money well spent in game enjoyment.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let the 1080p potatoes run their games on their little 24-inch 1080p monitors, and let the real PC hardware enthusiast crowd game at 1440p and 4K, where the benches really matter.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, you can easily game at 4K with a $180 Radeon RX 480; just turn some details down.


I surely believe there is merit in 4K benchmarks. I want the market to start using more cores and threads. But if the current GPU cannot handle a given resolution because it is too far ahead, and then we add a processor that loses performance because many of its cores/threads aren't being utilized, then where is the real-world benefit? It is good for a workstation, heavy number crunching, and very specialized users.


----------



## DADDYDC650

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That is a very good point and something I hadn't even considered. But you have some people in here worrying about 720p performance because at some point, several years from now, the newest Titan Y will be bottlenecked by an 1800X but not a 7700K. Of course, by then both CPUs will be old news...


Exactly why I'm not bothered at all by the lower max frames Ryzen gets vs the 7700k. I'm currently using a 60Hz 4K monitor so the CPU isn't the bottleneck. By the time there's a GPU that can push 4K easily I'll be on Zen 2 or 3. Nice to be able to keep the same mobo and RAM along the way.


----------



## oxidized

Quote:


> Originally Posted by *Mad Pistol*
> 
> True, but the 6 core will simply be a killer all-around CPU. Great price, great performance, great... everything.


I hope so; I'm really counting on it. Gaming is the thing I do most, but for other stuff, and for future game designs, +2C/+4T is surely something I'd need.


----------



## Oubadah

..


----------



## SoloCamo

Quote:


> Originally Posted by *Mad Pistol*
> 
> True, but the 6 core will simply be a killer all-around CPU. Great price, great performance, great... everything.


And IIRC it will actually have some benefits over the 8C/16T CPU, as it has the same amount of cache but more dedicated to each core.


----------



## Majin SSJ Eric

I'll be right there with you. Getting a 1700X and CH6 over the summer to replace my X79 setup. Will take some of the money I save over X99 and get a couple of 1080s, and finally retire my OG Titans...


----------



## redone13

Quote:


> Originally Posted by *Oubadah*
> 
> But the higher res benchmarks potentially give the illusion of stagnancy even where there is none (all CPUs look the same in the graph).


See my edited post. I accidentally mashed two together.


----------



## kfxsti

128 pages of bickering...
What part of this lineup of CPUs from AMD is bad? A little overhyped to some extent, yes, maybe. Even owning a 7700K myself, I was super excited today for this release, regardless of who wins what performance-wise. This is a step in the right direction for everyone. AMD back on their feet in the CPU market? Hell yes!! Intel having to do some price shuffling and possibly push out something beyond a marginal increase in performance, at a better price? Hell yes!! Once the kinks are worked out with Ryzen, and even before that, I applaud AMD for this leap.


----------



## kaosstar

Quote:


> Originally Posted by *SoloCamo*
> 
> And IIRC it will actually have some benefits over the 8C/16T CPU, as it has the same amount of cache but more dedicated to each core.


Also the potential to OC higher, maybe.


----------



## Alwrath

Quote:


> Originally Posted by *redone13*
> 
> It is good for a workstation, heavy number crunching, and very specialized users.


You forgot to add 1440p/4K enthusiast power gamer users to your list.


----------



## redone13

Quote:


> Originally Posted by *redone13*
> 
> I surely believe there is merit in 4K benchmarks. I want the market to start using more cores and threads. But if the current GPU cannot handle a given resolution because it is too far ahead, and a processor loses performance because many of its cores/threads aren't being utilized, then where is the real-world benefit? It is good for a workstation, heavy number crunching, and very specialized users.


Quote:


> Originally Posted by *Alwrath*
> 
> You forgot to add 1440p/4K enthusiast power gamer users to your list.


That could either go the way of current benches or not.


----------



## Forceman

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I'll be right there with you. Getting a 1700X and CH6 over the summer to replace my X79 setup. Will take some of the money I save over X99 and get a couple of 1080s, and finally retire my OG Titans...


At least right now, it looks like the 1700 is the better choice - seems to overclock about the same.


----------



## Majin SSJ Eric

I have a new scenario for gaming. Which system is a better gaming system:

1700X + GTX 1080Ti for $1100

or

6900K + GTX 1060 for $1250

Which gives gamers the better option of those two choices? Obviously you could do a 7700K + GTX 1080Ti for around the same money as the Ryzen setup, but you would also be locked into a dead socket with no upgrade path on a quad core.

I think I'll take the 1700X thanks...


----------



## Alwrath

Quote:


> Originally Posted by *redone13*
> 
> That could either go the way of current benches or not.


I can still do 4K enthusiast gaming with my Core i5 760 @ 4.2GHz. A Ryzen CPU was seen neck and neck with an overclocked 7700K in 1080p benches. It's already been proven that Ryzen is awesome at gaming @ 4K.


----------



## LancerVI

Quote:


> Originally Posted by *mcg75*
> 
> Anyone with a 1800x and GTX 1080 is probably playing 1440p minimum.
> 
> At the same time, someone with a 1700x and GTX 1060/RX 480 is probably playing 1080p.
> 
> Is there a reason why the percentage gap would be smaller if they had used a 1060/480? I could be wrong here but am willing to listen.
> 
> If not, all these 1080p results can be applied to the more mainstream cards as well so those results are very important.
> There are some 4K reviews mixed in there. That was specifically what I was looking for.
> 
> Ryzen is a solid triple. It's not the strikeout Intel fans wanted nor is it the home run that AMD fans wanted.
> 
> Let's hope AMD's continued refinements keep Intel on their toes.


THIS is even-handed and fair. I can agree with this.
Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I have a new scenario for gaming. Which system is a better gaming system:
> 
> 1700X + GTX 1080Ti for $1100
> 
> or
> 
> 6900K + GTX 1060 for $1250
> 
> Which gives gamers the better option of those two choices? Obviously you could do a 7700K + GTX 1080Ti for around the same money as the Ryzen setup, but you would also be locked into a dead socket with no upgrade path on a quad core.
> 
> I think I'll take the 1700X thanks...


Yep. I'll take the 1700X w/1080ti please.


----------



## Mad Pistol

Quote:


> Originally Posted by *kfxsti*
> 
> 128 pages of bickering...
> What part of this lineup of CPUs from AMD is bad? A little overhyped to some extent, yes, maybe. Even owning a 7700K myself, I was super excited today for this release, regardless of who wins what performance-wise. This is a step in the right direction for everyone. AMD back on their feet in the CPU market? Hell yes!! Intel having to do some price shuffling and possibly push out something beyond a marginal increase in performance, at a better price? Hell yes!! Once the kinks are worked out with Ryzen, and even before that, I applaud AMD for this leap.


I mean, it's quite incredible. We FINALLY have a value-based 8-core in the R7 1700: for $329, you get an 8-core/16-thread part that is within spitting distance of Intel's i7 6900K... a $1000+ chip. Oh, and on top of that, AMD's motherboards are cheaper, dual-channel RAM kits are cheaper too... and AMD's platform uses slightly less power.

Except for the lack of significant overclocking headroom (both the Phenom I and Bulldozer suffered from this as well), the Ryzen launch is a slam dunk. Sure, it isn't the Intel killer that people were hoping it would be, but damn it if this isn't the most competitive product AMD has released in a decade!!!


----------



## CULLEN

Quote:


> Originally Posted by *redone13*
> 
> Lol. Well, do show me other ones from the front page as a rebuttal where the Ryzen is on top.


Quote:


> Originally Posted by *LancerVI*
> 
> You're doing it again.


And so, my series of lousy David Attenborough impressions must continue.

_Read with David Attenborough voice_

A well-known myth about ostriches putting their heads in the sand is anything but a myth when the mysterious foul creature, the fanboy, is in question.

Not a particularly graceful defense, but a surprisingly effective one.

No one will protect what they don't care about, and no one will care about what they have never experienced.

(the last one is actually by the man himself)


----------



## redone13

Quote:


> Originally Posted by *redone13*
> 
> That could either go the way of current benches or not.


Quote:


> Originally Posted by *Alwrath*
> 
> I can still do 4K enthusiast gaming with my Core i5 760 @ 4.2GHz. A Ryzen CPU was seen neck and neck with an overclocked 7700K in 4K benches. It's already been proven that Ryzen is awesome at gaming @ 4K.


If that is the case, it is still very niche. 4k is NOT the norm due to GPU cost.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Forceman*
> 
> At least right now, it looks like the 1700 is the better choice - seems to overclock about the same.


No, I know, but I want that "X" for epeen! Only $70 more. Don't ask, I'm stupid...


----------



## ChronoBodi

Blah blah blah. Who cares, let's look at more Ryzen pics. Feels good man.


----------



## Alwrath

Quote:


> Originally Posted by *LancerVI*
> 
> THIS is even handed and fair. I can agree with this.
> Yep. I'll take the 1700X w/1080ti please.


Same here. Superior choice.


----------



## Slomo4shO

Quote:


> Originally Posted by *tashcz*
> 
> Anyone whining about Intel being better at gaming, while Ryzen kicks ass in everything else, think about this:
> 
> Intel is for play, AMD is for productivity.


FTFY

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> No, I know, but I want that "X" for epeen! Only $70 more. Don't ask, I'm stupid...


At least you are honest with yourself


----------



## yesitsmario

1080p 4 life!

jk







My next monitor will be 1440p 144hz or 4K 60hz.


----------



## SoloCamo

Quote:


> Originally Posted by *redone13*
> 
> If that is the case, it is still very niche. 4k is NOT the norm due to GPU cost.


290x & 4k user reporting in.


----------



## Bruizer

I'm just glad I have something from AMD worthy to replace my old 1100T I've been rocking since 2011. Not real picky. Thanks AMD. Almost lost faith there for a second...or 6-7 years.


----------



## paskowitz

Just the fact that AMD is even moderately competitive should be enough to call Ryzen a win. I think the real test will come with how it holds up over time.


----------



## kd5151

Choice. Something I haven't seen in a while.


----------



## Forceman

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I have a new scenario for gaming. Which system is a better gaming system:
> 
> 1700X + GTX 1080Ti for $1100
> 
> or
> 
> 6900K + GTX 1060 for $1250
> 
> Which gives gamers the better option of those two choices? Obviously you could do a 7700K + GTX 1080Ti for around the same money as the Ryzen setup, but you would also be locked into a dead socket with no upgrade path on a quad core.
> 
> I think I'll take the 1700X thanks...


If all you want to do is game, the 4/8 7700K isn't going to be much of a limitation, so the dead socket argument isn't terribly compelling considering the 7700K isn't likely to bottleneck anything in the next several years.

At the same price on a new build, the 1700X is probably the better choice, but I don't think it's a slam dunk.

The 1600X is the real wildcard.


----------



## Mad Pistol

Quote:


> Originally Posted by *yesitsmario*
> 
> 1080p 4 life!
> 
> jk


If you're feeling really frisky, go play Battlefield 1 @ 720p and reduce the resolution scaling to 25%... It's fun in the same way that driving without your contacts/glasses is fun.


----------



## SoloCamo

Quote:


> Originally Posted by *Mad Pistol*
> 
> If you're feeling really frisky, go play Battlefield 1 @ 720p and reduce the resolution scaling to 25%... It's fun in the same way that driving without your contacts/glasses is fun.


I've actually done that for fun on this 4k monitor. I did pretty well considering, too.









Even at 100 feet away that image looks poor.


----------



## redone13

Quote:


> Originally Posted by *SoloCamo*
> 
> 290x & 4k user reporting in.


I was talking about typical video game websites' standards: Ultra, AA, tessellation. The GPU is the limit currently, not the CPU. 4K 144Hz is the sweet spot.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Forceman*
> 
> If all you want to do is game, the 4/8 7700K isn't going to be much of a limitation, so the dead socket argument isn't terribly compelling, considering the 7700K isn't likely to bottleneck anything in the next several years.


Maybe not, but you are locked into that chip forever until you decide to spring for a completely new platform. With the 1700X you can drop in a Zen+ chip that may well beat the 7700K in gaming for just the cost of the CPU.


----------



## Alwrath

Quote:


> Originally Posted by *redone13*
> 
> If that is the case, it is still very niche. 4k is NOT the norm due to GPU cost.


$180 RX 480. Turn some details down. We have been over this already. Oh sorry, I thought you actually read other people's posts in response to yours before you post again, my bad.


----------



## Dragonsyph

Toss Ryzen in the trash, when is Kabylake X coming out?


----------



## Clocknut

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I have a new scenario for gaming. Which system is a better gaming system:
> 
> 1700X + GTX 1080Ti for $1100
> 
> or
> 
> 6900K + GTX 1060 for $1250
> 
> Which gives gamers the better option of those two choices? Obviously you could do a 7700K + GTX 1080Ti for around the same money as the Ryzen setup, but you would also be locked into a dead socket with no upgrade path on a quad core.
> 
> I think I'll take the 1700X thanks...


Let's not forget, you won't need to change the motherboard for years to come. Zen+ or Zen++ might offer higher clock speeds, or maybe more than 8 cores. X99 pretty much locks you in at a maximum of 10 cores.


----------



## SoloCamo

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Maybe not, but you are locked into that chip forever until you decide to spring for a completely new platform. With the 1700X you can drop in a Zen+ chip that may well beat the 7700K in gaming for just the cost of the CPU.


Ironically this is why I'm not upgrading yet. Can do 4.7ghz on this 4790k with 16gb cas10 2400mhz ddr3. Really don't feel like doing a whole platform change just due to the annoyance of it though I'd benefit from the extra threads for what I do. Waiting on Zen+.


----------



## redone13

Quote:


> Originally Posted by *Alwrath*
> 
> $180 RX 480. Turn some details down. We have been over this already. Oh sorry, I thought you actually read other people's posts in response to yours before you post again, my bad.


I do read them.


----------



## Forceman

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Maybe not, but you are locked into that chip forever until you decide to spring for a completely new platform. With the 1700X you can drop in a Zen+ chip that may well beat the 7700K in gaming for just the cost of the CPU.


Yeah - maybe I'm just jaded because I can't make a convincing argument to myself to switch out my 4790K. Had both a 1700 and 1700X preordered, but ended up cancelling them both. I'm keeping the water block mount though, in case the 1600X hits the spot.

Quote:


> Originally Posted by *SoloCamo*
> 
> Ironically this is why I'm not upgrading yet. Can do 4.7ghz on this 4790k with 16gb cas10 2400mhz ddr3. Really don't feel like doing a whole platform change just due to the annoyance of it though I'd benefit from the extra threads for what I do.


You and me both.


----------



## Alwrath

Quote:


> Originally Posted by *redone13*
> 
> I do read them.


Haha ok, *brohug* its all good.


----------



## Mad Pistol

Quote:


> Originally Posted by *SoloCamo*
> 
> Ironically this is why I'm not upgrading yet. Can do 4.7ghz on this 4790k with 16gb cas10 2400mhz ddr3. Really don't feel like doing a whole platform change just due to the annoyance of it though I'd benefit from the extra threads for what I do. Waiting on Zen+.


I agree with this. Unless you need the extra cores for productivity (or are just itching to get an 8 core CPU), an i7 4790k or 6700k is still very much a standard for gaming. I have yet to run into a situation on my i7 4790k where I wanted more than what it gives.

If you're on a quad core i5... bleh... throw it in the trash and get a Ryzen CPU.


----------



## redone13

Quote:


> Originally Posted by *Forceman*
> 
> Yeah - maybe I'm just jaded because I can't make a convincing argument to myself to switch out my 4790K. Had both a 1700 and 1700X preordered, but ended up cancelling them both. I'm keeping the water block mount though, in case the 1600X hits the spot.


The 4790k is still a savage.
Quote:


> Originally Posted by *Alwrath*
> 
> Haha ok, *brohug* its all good.


----------



## Slomo4shO

Quote:


> Originally Posted by *Forceman*
> 
> If all you want to do is game, the 4/8 7700K isn't going to be much of a limitation, so the dead socket argument isn't terribly compelling, considering the 7700K isn't likely to bottleneck anything in the next several years.


How about the following:

R5 1300 + GTX 1070 = $575

VS

7700K + GTX 1060 = $579


----------



## Majin SSJ Eric

Having been on X79 since 2012, I'm simply never going back to quad cores again so Ryzen is pretty much my only option.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Forceman*
> 
> Yeah - maybe I'm just jaded because I can't make a convincing argument to myself to switch out my 4790K. Had both a 1700 and 1700X preordered, but ended up cancelling them both. I'm keeping the water block mount though, in case the 1600X hits the spot.
> You and me both.


Unless you actually need the cores you both have no reason to ditch those DC's...


----------



## kfxsti

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I have a new scenario for gaming. Which system is a better gaming system:
> 
> 1700X + GTX 1080Ti for $1100
> 
> or
> 
> 6900K + GTX 1060 for $1250
> 
> Which gives gamer's the better option of those two choices? Obviously you could do a 7700K + GTX 1080Ti for around the same money as the Ryzen setup but you would also be locked into a dead socket with no upgrade path on a quad core.
> 
> I think I'll take the 1700X thanks...


From browsing around the interwebs, Intel's Coffee Lake will be socket 1151, and possibly supported on the Z270 platform with a BIOS upgrade.


----------



## Forceman

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Unless you actually need the cores you both have no reason to ditch those DC's...


I know, but I want something new to play with. Darn fiscal responsibility.


----------



## blue1512

Quote:


> Originally Posted by *Mad Pistol*
> 
> I agree with this. Unless you need the extra cores for productivity (or are just itching to get an 8 core CPU), an i7 4790k or 6700k is still very much a standard for gaming. I have yet to run into a situation on my i7 4790k where I wanted more than what it gives.


Fair enough mate. People who see that Ryzen is more than what they need could just go with what they have, and stop calling Ryzen names


----------



## Majin SSJ Eric

I hear you there. I have no need to ditch my old as dirt 4930K either but if I do so its going to be for an 8-core or better. Not sidegrading or downgrading...


----------



## kfxsti

Quote:


> Originally Posted by *Mad Pistol*
> 
> I mean, it's quite incredible. We FINALLY have a value-based 8-core in the R7 1700: for $329, you get an 8-core/16-thread part that is within spitting distance of Intel's i7 6900K... a $1000+ chip. Oh, and on top of that, AMD's motherboards are cheaper, dual-channel RAM kits are cheaper too... and AMD's platform uses slightly less power.
> 
> Except for the lack of significant overclocking headroom (both the Phenom I and Bulldozer suffered from this as well), the Ryzen launch is a slam dunk. Sure, it isn't the Intel killer that people were hoping it would be, but damn it if this isn't the most competitive product AMD has released in a decade!!!


Agreed on the Phenom overclocking, lol. I still have a Phenom 940, and several 1100Ts and 1090Ts as well. Love them to this day.
The power consumption on Ryzen is what's compelling to me. It's freaking phenomenal.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I hear you there. I have no need to ditch my old as dirt 4930K either but if I do so its going to be for an 8-core or better. Not sidegrading or downgrading...


I have a feeling 1800X will never beat 4930K in gaming.


----------



## Quantum Reality

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *redone13*
> 
> If that is the case, it is still very niche. 4k is NOT the norm due to GPU cost.
> 
> 
> 
> 290x & 4k user reporting in.

HD7950 + 60Hz 1920x1200 for me, heh.


----------



## CULLEN

Quote:


> Originally Posted by *SoloCamo*
> 
> Ironically this is why I'm not upgrading yet. Can do 4.7ghz on this 4790k with 16gb cas10 2400mhz ddr3. Really don't feel like doing a whole platform change just due to the annoyance of it though I'd benefit from the extra threads for what I do. Waiting on Zen+.


And there is no reason for you to upgrade. You've got a brilliant chip running at a high clock speed, and it's probably not the bottleneck in your rig.

I feel like many of the users here are upgrading from a Tesla Model S to a Model X.

Sure, the Model S does the track faster, and while the Model X can still do the track in a great time, it's at everything else where it excels.


----------



## Quantum Reality

Quote:


> Originally Posted by *Clocknut*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Majin SSJ Eric*
> 
> I have a new scenario for gaming. Which system is a better gaming system:
> 
> 1700X + GTX 1080Ti for $1100
> 
> or
> 
> 6900K + GTX 1060 for $1250
> 
> Which gives gamers the better option of those two choices? Obviously you could do a 7700K + GTX 1080Ti for around the same money as the Ryzen setup, but you would also be locked into a dead socket with no upgrade path on a quad core.
> 
> I think I'll take the 1700X thanks...
> 
> 
> 
> Let's not forget, you won't need to change the motherboard for years to come. Zen+ or Zen++ might offer higher clock speeds, or maybe more than 8 cores. X99 pretty much locks you in at a maximum of 10 cores.

I keep saying this myself, but I'll say it again:

*AMD has a long history of keeping socket pins the same over many generations of CPUs.* AM2 --> AM3+ were to varying degrees upwards and downwards compatible over a broad range of motherboards going from ~2006 to ~2013. By contrast I think Intel changed socket sizes and pinouts at least three times.

This alone seals the deal as far as my next upgrade goes: I _will_ get a Ryzen + motherboard sometime late this year, because when it comes time to go to the next level up, it will be pin-compatible.


----------



## Mad Pistol

Quote:


> Originally Posted by *Quantum Reality*
> 
> I keep saying this myself, but I'll say it again:
> 
> *AMD has a long history of keeping socket pins the same over many generations of CPUs.* AM2 --> AM3+ were to varying degrees upwards and downwards compatible over a broad range of motherboards going from ~2006 to ~2013. By contrast I think Intel changed socket sizes and pinouts at least three times.
> 
> This alone seals the deal as far as my next upgrade goes: I _will_ get a Ryzen + motherboard sometime late this year, because when it comes time to go to the next level up, it will be pin-compatible.


AMD is definitely more friendly to people who wish to upgrade, that's for sure.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mad Pistol*
> 
> I mean, it's quite incredible. We FINALLY have a value-based 8-core in the R7 1700: for $329, you get an 8-core/16-thread part that is within spitting distance of Intel's i7 6900K... a $1000+ chip. Oh, and on top of that, AMD's motherboards are cheaper, dual-channel RAM kits are cheaper too... and AMD's platform uses slightly less power.
> 
> Except for the lack of significant overclocking headroom (both the Phenom I and Bulldozer suffered from this as well), the Ryzen launch is a slam dunk. Sure, it isn't the Intel killer that people were hoping it would be, but damn it if this isn't the most competitive product AMD has released in a decade!!!


I've been looking at motherboards, and no, they are not cheaper. They are the same price as Z270, and some, like the Taichi, are the same price across all three platforms. You do not need quad channel for a 6900K to match or beat the 1800X. Also, the power thing is completely negligible. Yes, this is just like Phenom I and Bulldozer; we just have to wait for Zen+ for the true Zen CPU. I just hope it does not take that long. My worry for AMD, and my excitement, is Intel replacing the 6850K with an 8-core Skylake-X. $600 for an 8-core Skylake is going to kill AMD's 1800X. Yes, you pay more for Intel, but think long term, because I do when it comes to the CPU. I could have gotten an FX-8350 for $120 less than a 3770K back in 2012, but made the right choice by thinking ahead. CPUs are about more than just right-now performance, unlike GPUs. It's about at least 3-4 years.
Quote:


> Originally Posted by *Mad Pistol*
> 
> AMD is definitely more friendly to people who wish to upgrade, that's for sure.


I'm sorry, but I do not want a trashy first-gen X370 board with a new Zen+ chip. S775 kept the same socket, but the chipset made a huge difference. The only advantage is backward compatibility, not forward. Motherboards are nice things to upgrade.


----------



## Slomo4shO

Quote:


> Originally Posted by *kfxsti*
> 
> From browsing around the interwebs, Intel's Coffee Lake will be socket 1151, and possibly supported on the Z270 platform with a BIOS upgrade.


It was also supposed to be out in 2018...
Quote:


> Originally Posted by *SoloCamo*
> 
> Ironically this is why I'm not upgrading yet. Can do 4.7ghz on this 4790k with 16gb cas10 2400mhz ddr3. Really don't feel like doing a whole platform change just due to the annoyance of it though I'd benefit from the extra threads for what I do. Waiting on Zen+.


My 4770K is stable at 4.8GHz. What I am currently looking for is a CPU that can deliver performance in a smaller package. I got a 6700K for the ITX build but I am on the lookout for something more well rounded since I barely game any more.

The irony is that as my Steam library grows, I play less and less









Tragically, I am more inclined to game on my PS4 Pro on my 4K TV than on my PCs these days. But who has time to game?


----------



## kaosstar

Quote:


> Originally Posted by *Slomo4shO*
> 
> It was also supposed to be out in 2018...
> My 4770K is stable at 4.8GHz. What I am currently looking for is a CPU that can deliver performance in a smaller package. I got a 6700K for the ITX build but I am on the lookout for something more well rounded since I barely game any more.
> 
> The irony is that as my Steam library grows, I play less and less
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tragically, I am more inclined to game on my PS4 Pro on my 4K TV than on my PCs these days. But who has time to game?


This is a little OT, but is your 4770k delidded? I can only manage 4.4 on mine with no delid.


----------



## czin125

They moved Coffee Lake up to Q4 2017 from Q1 2018.
Quote:


> Originally Posted by *kfxsti*
> 
> From browsing around the interwebs, Intel's Coffee Lake will be socket 1151, and possibly supported on the Z270 platform with a BIOS upgrade.


http://seekingalpha.com/article/4051541-suerte-de-capote-intels-cannonlake-leaked-patent
It'll be faster than the 7700K with that 3D stack too.

https://en.wikipedia.org/wiki/Spin-transfer_torque


----------



## tpi2007

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mad Pistol*
> 
> AMD is definitely more friendly to people who wish to upgrade, that's for sure.
> 
> 
> 
> I'm sorry, but I do not want a trashy first-gen X370 board with a new Zen+ chip. S775 kept the same socket, but the chipset made a huge difference. The only advantage is backward compatibility, not forward. Motherboards are nice things to upgrade.

The motherboards won't make that big of a difference nowadays when compared to the socket 775 days. Now the memory controller is on-die and so is the PCIe controller and on Ryzen you even have dedicated PCIe lanes for storage, so the crucial parts are on the CPU. The emphasis on the motherboard has more to do with whether it's made with quality components and how well it's supported with BIOS updates, etc.


----------



## budgetgamer120

Quote:


> Originally Posted by *Forceman*
> 
> If all you want to do is game, the 4/8 7700K isn't going to be much of a limitation, so the dead socket argument isn't terribly compelling considering the 7700K isn't likely to bottleneck anything in the next several years.
> 
> At the same price on a new build, the 1700X is probably the better choice, but I don't think it's a slam dunk.
> 
> The 1600X is the real wildcard.


Dead socket only has one meaning, and it is not bottlenecking, so I'm not sure what your argument is about.

Dead socket means no upgrade path.


----------



## jamaican voodoo

Joker's proof that the R7 1700 @ 3.9GHz is equal to an i7 7700K @ 5GHz, max settings @ 1080p


----------



## Mad Pistol

Quote:


> Originally Posted by *ZealotKi11er*
> 
> *Been looking at MBs and no, they are not cheaper.* They are the same price as Z270 boards, and some, like the Taichi, are the same price across all 3 platforms. You do not need quad channel for the 6900X to match or beat the 1800X. Also, the power thing is completely negligible. Yes, this is just like Phenom I and Bulldozer; we just have to wait for Zen+ for the true Zen CPU. I just hope it does not take that long. My worry for AMD, and my excitement, is Intel replacing the 6850K with an 8-core Skylake-X. $600 for an 8-core Skylake is going to kill AMD's 1800X. Yes, you pay more for Intel, but think long term, because I do when it comes to the CPU. I could have gotten an FX-8350 for $120 less than a 3770K back in 2012, but I made the right choice by thinking ahead. CPUs are about more than just today's performance, unlike GPUs; it's about at least 3-4 years.


Come on dude... do we really need to do this...

Here are some comparable systems.

AMD Ryzen 1700 (8 cores, 3.4GHz): $329.99
ASRock X370 Taichi: $189.99
*Total: $519.98*

Intel Core i7 6800K (6 cores, 3.4GHz): $424.99
ASRock X99 Taichi: $209.99
*Total: $634.98*

I see $115 difference, and the AMD setup will be as fast or faster in virtually every scenario.

All other parts equal, you can either save money with the AMD system, or buy better parts. Either way, the Ryzen build is the way to go.


----------



## redone13

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Been looking at MBs and no, they are not cheaper. They are the same price as Z270 boards, and some, like the Taichi, are the same price across all 3 platforms. You do not need quad channel for the 6900X to match or beat the 1800X. Also, the power thing is completely negligible. Yes, this is just like Phenom I and Bulldozer; we just have to wait for Zen+ for the true Zen CPU. I just hope it does not take that long. My worry for AMD, and my excitement, is Intel replacing the 6850K with an 8-core Skylake-X. $600 for an 8-core Skylake is going to kill AMD's 1800X. Yes, you pay more for Intel, but think long term, because I do when it comes to the CPU. I could have gotten an FX-8350 for $120 less than a 3770K back in 2012, but I made the right choice by thinking ahead. CPUs are about more than just today's performance, unlike GPUs; it's about at least 3-4 years.
> I am sorry, but I do not want a trashy first-gen X370 motherboard with a new Zen+ chip. S775 kept the same socket, but the chipset made a huge difference. The only advantage is backward compatibility, not forward. Motherboards are nice things to upgrade.


It comes down to whether you want the best possible right now for one purpose: gaming or productivity. If you get a six-core or eight-core now for gaming because it will be better later for gaming, that doesn't make much sense to me. That purchase is based on a "possible" future which may or may not meet expectations, especially compared with current benchmarks. If you get a Ryzen for pure productivity, power to ya. How's a higher-core-count CPU going to do 4K anyway without an affordable GPU?


----------



## jeffdamann

Quote:


> Originally Posted by *Quantum Reality*
> 
> HD7950 + 60Hz 1920x1200 for me, heh.


Try that in Forza Horizon 3 full settings


----------



## Marios145

1. Get Ryzen and create 2 VMs; install the same game (BF1, for example) on each VM, assign 3 cores to each, one keyboard to each, one mouse to each, one display/GPU to each, and play against a friend on one machine.
2. Stream the content to YouTube/Twitch using the remaining 2 cores.
3. ???
4. Profit


----------



## kfxsti

Quote:


> Originally Posted by *Slomo4shO*
> 
> It was also supposed to be out in 2018...
> My 4770K is stable at 4.8GHz. What I am currently looking for is a CPU that can deliver performance in a smaller package. I got a 6700K for the ITX build but I am on the lookout for something more well rounded since I barely game any more.
> 
> The irony is that as my Steam library grows, I play less and less
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tragically, I am more inclined to game on my PS4 Pro on my 4K TV than my PCs these days. But who has time to game:


"Supposed to"? A lot of research suggests it may make it out this year. Either way, I'm tickled with everything that's going on lol


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> It comes down to whether you want the best possible right now for one purpose: gaming or productivity. If you get a six-core now for gaming because it will be better later for gaming, that doesn't make much sense to me. That purchase is based on a "possible" future which may or may not meet expectations, especially compared with current benchmarks. If you get a Ryzen for pure productivity, power to ya.


For pure productivity, Ryzen is the best bang for the buck, and the 1800X even bests the twice-as-expensive 6900K.









For pure gamers, the 7700K is still the best for current games. But then, an old Sandy Bridge i5 is still capable of gaming, so I don't really understand the whining. Gamers should put more money into the GPU. But if you do want a more future-proof CPU, focus on multi-threaded performance, as games will definitely thread better over time.


----------



## ChronoBodi

Quote:


> Originally Posted by *jamaican voodoo*
> 
> Joker's proof that the R7 1700 @ 3.9GHz is equal to an i7 7700K @ 5GHz, max settings @ 1080p


Especially Watch Dogs 2....

7700k just got choked to 100% at 6:48 in the video.

Um, I'm just saying, there may be more Watch Dogs 2-like games in the future; 4c/8t is not going to cut it.


----------



## extracrunchy

Yay, competition. It appears that AMD loses most gaming benchmarks but is now competitive enough to be close (by "lose" I mean vs. the higher-clocking quad-core Intels).

Then in most multi-threaded apps AMD now kicks ass, either equaling or sometimes beating the more-than-twice-as-expensive i7.

Has anyone seen any useful photography benchmarks? (No lame Photoshop ones with one large image, though I saw that site... I mean any benchmarks where someone is processing 100 raw files in Camera Raw, Lightroom, DxO, etc. lol). I couldn't find any.


----------



## ChronoBodi

Quote:


> Originally Posted by *extracrunchy*
> 
> Yay, competition. It appears that AMD loses most gaming benchmarks but is now competitive enough to be close (by "lose" I mean vs. the higher-clocking quad-core Intels).
> 
> Then in most multi-threaded apps AMD now kicks ass, either equaling or sometimes beating the more-than-twice-as-expensive i7.
> 
> Has anyone seen any useful photography benchmarks? (No lame Photoshop ones with one large image, though I saw that site... I mean any benchmarks where someone is processing 100 raw files in Camera Raw, Lightroom, DxO, etc. lol). I couldn't find any.


I do Capture One Pro with 42MP pics from an A7R II... the 1700X chews through it just like my 5960X.


----------



## blue1512

Quote:


> Originally Posted by *ChronoBodi*
> 
> Especially Watch Dogs 2....
> 
> 7700k just got choked to 100% at 6:48 in the video.
> 
> Um, I'm just saying, there may be more Watch Dogs 2-like games in the future; 4c/8t is not going to cut it.


I already pointed out for the whiners...
Quote:


> Originally Posted by *blue1512*
> 
> The 1800X is comparable with a ~100% 7700K in games while only utilizing ~40-60%. There is plenty of headroom for streaming, or for more CPU-intensive games in the future.


----------



## momonz

Been reading reviews and I am pretty much happy with the result.

My thought on Ryzen 1080p gaming is that the negativity is overhyped. Yes, it's slower than Intel, but if you have at least an RX 480 or GTX 1060 you can still max out games. I am using an i5 2500K at stock, which already bottlenecks my RX 480, yet I can still max out games at 1080p, or at least at Ultra settings.

Obviously the hype went through the roof, with all of us expecting Ryzen to be competitive on all fronts. But if you think about it, its poorer performance in 1080p gaming is negligible. *Please stop harping on about poorer 1080p performance if your graphics card can handle 1080p easily.*

AMD brought back competition, which is really good and what we have been hoping for.


----------



## redone13

Quote:


> Originally Posted by *momonz*
> 
> Been reading reviews and I am pretty much happy with the result.
> 
> My thought on Ryzen 1080p gaming is that the negativity is overhyped. Yes, it's slower than Intel, but if you have at least an RX 480 or GTX 1060 you can still max out games. I am using an i5 2500K at stock, which already bottlenecks my RX 480, yet I can still max out games at 1080p, or at least at Ultra settings.
> 
> Obviously the hype went through the roof, with all of us expecting Ryzen to be competitive on all fronts. But if you think about it, its poorer performance in 1080p gaming is negligible. *Please stop harping on about poorer 1080p performance if your graphics card can handle 1080p easily.*
> 
> Competition is really good.


It's harder to get excited over, say, 2160p because realistically it requires at least a GTX 1080 to utilize. 1080p still has relevance.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mad Pistol*
> 
> Come on dude... do we really need to do this...
> 
> Here are some comparable systems.
> 
> AMD Ryzen 1700 (8 core, 3.4Ghz): $329.99
> ASRock X370 Taichi: $189.99
> *Total: $519.98*
> 
> Intel Core i7 6800k (6 core, 3.4Ghz): $424.99
> ASRock X99 Taichi: $209.99
> *Total: $634.98*
> 
> I see $115 difference, and the AMD setup will be as fast or faster in virtually every scenario.
> 
> All other parts equal, you can either save money with the AMD system, or buy better parts. Either way, the Ryzen build is the way to go.


I am sorry, but how is a $20 difference on the motherboard "much cheaper"? Also, I would get the 6800K simply because it's a better all-around CPU. Being faster at gaming is enough for me and most people. To make things worse, the 6800K is more than a year old; yeah, you save $115 after having waited for a slower CPU. I will wait for the 7800K and get even faster performance.


----------



## Master__Shake

Quote:


> Originally Posted by *Oubadah*
> 
> You're not understanding my point. To render it down, I'll quote myself from the discussion I linked above:
> 
> *What is more futile, to:
> 
> a) Take CPU benchmarks in an artificially CPU-bound scenario (i.e. low res), or
> b) Take CPU benchmarks in a hopelessly GPU-bound scenario?*
> 
> You don't need 4K Ryzen results because you can just look at existing Intel vs Intel results to see what happens at that res. Flatlined, GPU-bound CPU benchmarks are only useful to people who have some sort of agenda to prove that their pet CPU is "as good as" the competition. They offer no value. "GPU-bound CPU benchmarks" - just consider that phrase for a second. That is the definition of futility.


Quote:


> fu·til·i·ty
> ˌfyo͞oˈtilədē/
> noun
> noun: futility
> 
> pointlessness or uselessness.


low res benchmarks are pointless and useless

uselessness = futile

that's the definition of futility, my friend.


----------



## Partogi

So for gaming 7700K is better


----------



## redone13

Quote:


> Originally Posted by *Master__Shake*
> 
> low res benchmarks are pointless and useless
> 
> uselessness = futile
> 
> that's the definition of futility, my friend.


Yes, low res, aka the current standard of 1080p, is pointless and useless. Let's look for 2160p benchmarks instead, because the GTX 1080 is so wildly cost-efficient.


----------



## TheReciever

Quote:


> Originally Posted by *Partogi*
> 
> So for gaming 7700K is better


There was no reason to believe otherwise, even with the hype.

It's like people set themselves up for disappointment on every release.


----------



## jamaican voodoo

Quote:


> Originally Posted by *Partogi*
> 
> So for gaming 7700K is better


check out the video I posted; Joker shows raw game footage of his benchmark. Trust me, the R7 1700 is plenty of CPU for everything.


----------



## Mad Pistol

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am sorry, but how is a $20 difference on the motherboard "much cheaper"? Also, I would get the 6800K simply because it's a better all-around CPU. Being faster at gaming is enough for me and most people. To make things worse, the 6800K is more than a year old; yeah, you save $115 after having waited for a slower CPU. I will wait for the 7800K and get even faster performance.


Ok, I can see your point, but in the same vein, there is no denying that AMD now gives you a cheaper, comparable solution that has 2 cores/4 threads more than its competition. If the current trend towards highly threaded programs/games is any indication of what the future holds, then the R7 1700 is hands down the better solution... period.


----------



## Master__Shake

Quote:


> Originally Posted by *redone13*
> 
> Yes, low res aka the current standard is pointless and useless. Let's look for 2160p benchmarks because the gtx 1080 is wildly cost efficient.


hi, i'm on 4k and my rig doesn't have a 1080.

also, VR is great with a single 980 since it doesn't support SLI, and that's 2160x1200.

but you knew that obviously.


----------



## bmgjet

Looks like I'll be waiting for Gen 2 or Zen+, since it really needed to be hitting 4.4-4.5GHz to be a worthwhile upgrade for me.


----------



## redone13

Quote:


> Originally Posted by *Master__Shake*
> 
> hi, i'm on 4k and my rig doesn't have a 1080.
> 
> also, VR is great with a single 980 since it doesn't support SLI, and that's 2160x1200.
> 
> but you knew that obviously.


You and I both know even the GTX 1080 cannot maintain a constant 60 FPS with maxed settings in many games. Most gaming websites use Ultra, AA, etc. and FPS counts to benchmark. I give a lot of credit to the 980, though. It is a great card.


----------



## LancerVI

Quote:


> Originally Posted by *TheReciever*
> 
> There was no reason to believe otherwise, even with the hype.
> 
> It's like people set themselves up for disappointment on every release.


More like people WANT to be disappointed. That's the only explanation I can come up with for this reaction to Ryzen.

Disappointment over low overclocks? I can see that, but everything else seems to be in order, and a lot of the issues will resolve themselves with platform maturity.

Hell, my 5820K BSODs at anything over 4.2/4.3GHz.


----------



## momonz

I really don't get why people who have a GTX 1080 are worried about 1080p gaming with Ryzen today.


----------



## jamaican voodoo

this thread moves way too fast, so here you go again: I recommend you guys check this video Joker posted; it's new raw footage of his benchmark. Judge for yourself.


----------



## Master__Shake

Quote:


> Originally Posted by *momonz*
> 
> I really don't get why people who have a GTX 1080 are worried about 1080p gaming with Ryzen today.


because this



720p and 1080p are on top.


----------



## redone13

Quote:


> Originally Posted by *momonz*
> 
> I really don't get why people who have a GTX 1080 are worried about 1080p gaming with Ryzen today.


People with a 1080 might worry that the 1080 Ti is coming out and costs a large amount of dough to possibly do a proper 60FPS at 2160p (4K).

And yes, Jamaican, I see a YouTube video where AMD is keeping pace with Intel. That is the only thing I see. Not an actual benchmark.


----------



## momonz

Quote:


> Originally Posted by *redone13*
> 
> It's harder to get excited over say 2160p because realistically it requires a minimum GTX 1080 to utilize. 1080p has some relevancy.


I didn't say 1080p is irrelevant. But Ryzen's weakness at 1080p is negligible if your graphics card can handle games at that resolution.


----------



## AuraNova

Quote:


> Originally Posted by *jamaican voodoo*
> 
> this thread moves way too fast, so here you go again: I recommend you guys check this video Joker posted; it's new raw footage of his benchmark. Judge for yourself.






Basically, all the reviews differed from one another to some extent. This goes into greater detail about why Joker and Gamers Nexus had such different findings.


----------



## momonz

Quote:


> Originally Posted by *redone13*
> 
> Quote:
> 
> 
> 
> Originally Posted by *momonz*
> 
> I really don't get why people who have a GTX 1080 are worried about 1080p gaming with Ryzen today.
> 
> 
> 
> People with a 1080 might worry that the 1080 Ti is coming out and costs a large amount of dough to possibly do 60FPS at 2160p (4K).
> 
> And yes, Jamaican, I see a YouTube video where AMD is keeping pace with Intel. That is the only thing I see. Not an actual benchmark.
Click to expand...

Please check my post again; I am only talking about 1080p. In 4K, Ryzen seems competitive. And it's a new architecture, so it will improve. Look at what AMD did with the RX 480, and GPUs are not even their main focus yet.


----------



## Slomo4shO

Quote:


> Originally Posted by *kaosstar*
> 
> This is a little OT, but is your 4770k delidded? I can only manage 4.4 on mine with no delid.





Spoiler: Yes







Quote:


> Originally Posted by *kfxsti*
> 
> I'm tickled with everything that's going on lol


You won't see me arguing against competition.


----------



## momonz

Quote:


> Originally Posted by *Master__Shake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *momonz*
> 
> I really don't get why people who have a GTX 1080 are worried about 1080p gaming with Ryzen today.
> 
> 
> 
> because this
> 
> 
> 
> 720p and 1080p are on top.
Click to expand...

I have an RX 480 and an i5 2500K at stock clocks (way slower than the slowest Ryzen R7). I can max out games at 1080p. What is your point?


----------



## Oubadah

..


----------



## blue1512

Quote:


> Originally Posted by *momonz*
> 
> Please check my post again; I am only talking about 1080p. In 4K, Ryzen seems competitive. And it's a new architecture, so it will improve. Look at what AMD did with the RX 480, and GPUs are not even their main focus yet.


Even at 1080p, if you don't play on a 120/144Hz monitor, >60 fps is pretty much pointless.


----------



## Master__Shake

Quote:


> Originally Posted by *Oubadah*
> 
> I've already written a post detailing why they're not.


they are


----------



## momonz

Quote:


> Originally Posted by *blue1512*
> 
> Quote:
> 
> 
> 
> Originally Posted by *momonz*
> 
> Please check my post again; I am only talking about 1080p. In 4K, Ryzen seems competitive. And it's a new architecture, so it will improve. Look at what AMD did with the RX 480, and GPUs are not even their main focus yet.
> 
> 
> 
> Even at 1080p, if you don't play on a 120/144Hz monitor, >60 fps is pretty much pointless.
Click to expand...

That is so true. That's why I don't get people whining about 1080p performance.


----------



## tpi2007

The argument about testing at 1080p to see what reserves a CPU has does have its merits, but in this transitional time it also carries an implied contradiction. You're testing reserves at 1080p for the future... but the future is 1440p and 4K and games with much better multi-threaded engines. This isn't 2011 anymore - back then there was no Mantle, no Vulkan, no DX12, and no consoles with 8 x86 CPU cores (6/7 usable by games). The games being released and made now will spread the load much better and will use the cores for more advanced effects.

Back in the Summer of 2008 a Q6600 cost more or less the same as an E8400. Most people gaming went for the E8400, because games were still only starting to use two cores and the quad+ engine implementations still relied on one or two faster cores.

Fast forward six years, to the Summer of 2014, and Intel could charge no more than $72 for an unlocked Pentium with two cores. Even back then Anandtech was asking Intel to make an unlocked i3 because games were already using more threads. (It took them almost three years to do so.)

Now, fast forward from 2011 (Sandy Bridge and Bulldozer) to 2017 - that's also six years. Quads and quads with HT were the great price/performance kings back then, with a different set of technologies on the map. The game is changing.


----------



## Oubadah

..


----------



## redone13

Quote:


> Originally Posted by *jamaican voodoo*
> 
> this thread moves way too fast, so here you go again: I recommend you guys check this video Joker posted; it's new raw footage of his benchmark. Judge for yourself.


There are a lot more articles than just Gamers Nexus.

http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951.html

http://www.guru3d.com/articles-pages/amd-ryzen-7-1800x-processor-review,1.html

http://www.pcworld.com/article/3176191/computers/ryzen-review-amd-is-back.html?page=2

http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page4.html

https://www.pcper.com/reviews/Processors/AMD-Ryzen-7-1800X-Review-Now-and-Zen/Gaming-Performance

https://arstechnica.com/gadgets/2017/03/amd-ryzen-review/

Yes, there are differences in products and testing methodology, but it all doesn't end up producing very different results.


----------



## Forceman

Quote:


> Originally Posted by *budgetgamer120*
> 
> So I'm not sure what your argument is about.


Well, luckily for you, I wasn't talking to you. The people I was talking to understood just fine.


----------



## jeffdamann

Quote:


> Originally Posted by *blue1512*
> 
> Even at 1080p, if you don't play on a 120/144Hz monitor, >60 fps is pretty much pointless.


But you want to make sure your minimums never go under 60.


----------



## momonz

Quote:


> Originally Posted by *tpi2007*
> 
> The argument about testing at 1080p to see what reserves a CPU has does have its merits, but in this transitional time it also carries an implied contradiction. You're testing reserves at 1080p for the future... but the future is 1440p and 4K and games with much better multi-threaded engines. This isn't 2011 anymore - back then there was no Mantle, no Vulkan, no DX12, and no consoles with 8 x86 CPU cores (6/7 usable by games). The games being released and made now will spread the load much better and will use the cores for more advanced effects.
> 
> Back in the Summer of 2008 a Q6600 cost the same as an E8400. Most people gaming went for the E8400, because games were still only starting to use two cores and the quad+ engine implementations still relied on one or two faster cores.
> 
> Fast forward six years, to the Summer of 2014, and Intel could charge no more than $72 for an unlocked Pentium with two cores. Even back then Anandtech was asking Intel to make an unlocked i3 because games were already using more threads. (It took them almost three years to do so.)
> 
> Now, fast forward from 2011 (Sandy Bridge and Bulldozer) to 2017 - that's also six years. Quads and quads with HT were the great price/performance kings back then, with a different set of technologies on the map. The game is changing.


+1. You said it right. That's what AMD was telling us during the launch of Ryzen.


----------



## kishagi

Great production CPUs. They should generate a good amount of money for AMD.


----------



## Master__Shake

Quote:


> Originally Posted by *Oubadah*
> 
> I look forward to your deconstruction of my well reasoned post.


hi, i own a 4k monitor and i want to build a pc.

how is a 1080p benchmark going to help me do that?

hi, i own an HTC Vive and i want to build a PC for VR.

how is a 1080p benchmark going to help me?

am i screwed because my usage scenarios weren't covered today?


----------



## DaaQ

Quote:


> Originally Posted by *redone13*
> 
> It comes down to whether you want the best possible right now for one purpose: gaming or productivity. If you get a six-core or eight-core now for gaming because it will be better later for gaming, that doesn't make much sense to me. That purchase is based on a "possible" future which may or may not meet expectations, especially compared with current benchmarks. If you get a Ryzen for pure productivity, power to ya. How's a higher-core-count CPU going to do 4K anyway without an affordable GPU?


Best possible CPU right now but not best possible GPU right now? Doesn't make sense.


----------



## Oubadah

..


----------



## redone13

Quote:


> Originally Posted by *Master__Shake*
> 
> hi, i own a 4k monitor and i want to build a pc.
> 
> how is a 1080p benchmark going to help me do that?
> 
> hi, i own an HTC Vive and i want to build a PC for VR.
> 
> how is a 1080p benchmark going to help me?
> 
> am i screwed because my usage scenarios weren't covered today?


Hi, it looks like you have the money to buy the best item today and then upgrade when it isn't the best anymore. Like how it's been forever, even for AMD. Fact of the matter is, Intel is better for gaming, AMD for benches.


----------



## Mad Pistol

I just got this screen cap from the Hardware Unboxed Ryzen video.

https://youtu.be/mW1pzcdZxKc?t=8m



My theory as to why the minimums are competitive but the averages are not is AMD's current implementation of SMT. The game's master thread (used to balance all other workloads) has no issue maintaining a constant stream of data to the GPU. However, due to some wonky behavior in SMT, the instructions sometimes get distributed unevenly across the cores and SMT threads. That creates a bottleneck in processing time, but the chip is still able to match Intel on minimum framerates.

Seriously, if AMD can get this worked out, Ryzen 8-cores will be competitive in gaming as well.
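If the imbalance really does come from work landing on two SMT siblings of the same physical core, the first step in checking that is knowing which logical CPUs are siblings. A minimal Linux-only sketch (the `smt_siblings` helper is hypothetical; it just parses the kernel's sysfs topology file):

```python
from pathlib import Path

def smt_siblings(cpu):
    """Return the set of logical CPUs sharing a physical core with `cpu`.

    The sysfs file lists siblings either as ranges ("0-1") or as
    comma-separated IDs ("0,8"), so both forms are handled.
    """
    text = Path(f"/sys/devices/system/cpu/cpu{cpu}/topology/"
                "thread_siblings_list").read_text().strip()
    ids = set()
    for part in text.split(","):
        if "-" in part:                      # range form, e.g. "0-1"
            lo, hi = map(int, part.split("-"))
            ids.update(range(lo, hi + 1))
        else:                                # single ID, e.g. "8"
            ids.add(int(part))
    return ids

# Illustrative: show which logical CPUs share physical core 0.
print(smt_siblings(0))
```

Scheduling one heavy thread per physical core, rather than two onto one core's siblings, is exactly the kind of balancing the theory above suggests sometimes goes wrong.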


----------



## Master__Shake

Quote:


> Originally Posted by *redone13*
> 
> Hi, it looks like you have money to have the best item today and then upgrade when it isn't the best. Like how it's been forever, even for AMD.


but i don't have a pc i need to build one.

which platform is the best for 4k or VR?

where are the benchmarks for VR and 4k today?


----------



## redone13

Quote:


> Originally Posted by *Master__Shake*
> 
> but i don't have a pc i need to build one.
> 
> which platform is the best for 4k or VR?
> 
> where are the benchmarks for VR and 4k today?


Excellent point. We are all waiting and this is a scholarly discussion with only so much data as of right now.


----------



## Oubadah

..


----------



## blue1512

Quote:


> Originally Posted by *jeffdamann*
> 
> But you want to make sure your mins never go under 60


Min fps is very stable on Ryzen. The cores never reach 100% in gaming, so it doesn't choke like the 7700K does in Watch Dogs 2.


----------



## Forceman

Quote:


> Originally Posted by *Mad Pistol*
> 
> I just got this screen cap from the Hardware Unboxed Ryzen video.
> 
> My theory as to why the minimums are competitive but the averages are not is AMD's current implementation of SMT. The game's master thread (used to balance all other workloads) has no issue maintaining a constant stream of data to the GPU. However, due to some wonky behavior in SMT, the instructions sometimes get distributed unevenly across the cores and SMT threads. That creates a bottleneck in processing time, but the chip is still able to match Intel on minimum framerates.
> 
> Seriously, if AMD can get this worked out, Ryzen 8-cores will be competitive in gaming as well.


I think one of the prevailing theories is that thread switches between cores cause significant performance impacts, and the current Windows scheduler is not maintaining thread/core affinity in some cases. So possibly a driver-level fix could help.
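One way to experiment with that theory on Linux is to pin a process to a fixed set of logical cores, taking the scheduler's migration decisions out of the picture entirely. A minimal sketch using only the standard `os` module (the `pin_to_cores` helper and the choice of core 0 are illustrative):

```python
import os

def pin_to_cores(core_ids):
    """Restrict the current process to the given set of logical cores.

    Pinning a game's threads to one group of cores prevents the
    cross-core migrations this theory blames for lost performance.
    Uses the Linux-only os.sched_setaffinity API.
    """
    os.sched_setaffinity(0, core_ids)  # pid 0 = the calling process
    return os.sched_getaffinity(0)     # read back the effective mask

# Illustrative: keep everything on logical core 0. On a Ryzen chip you
# would instead pick the cores of a single CCX to keep threads together.
print(pin_to_cores({0}))
```

A driver or scheduler fix would do the equivalent automatically; this just makes the effect observable by hand.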


----------



## Master__Shake

Quote:


> Originally Posted by *Oubadah*
> 
> Read the bloody post.



i'm a consumer and i don't have time to go on bloody forums and figure out what i need for my usage scenarios.

please direct me to the reviews that will help me make an informed decision.

it's as if all the reviewers were looking at a car and, instead of telling me or anyone else what the car could do for them, were complaining about the spare tire.


----------



## Steele84

Quote:


> Originally Posted by *blue1512*
> 
> Even on 1080p, if you don't play on 120/144 Hz monitor, >60 fps is pretty much pointless.


This ^

I want to be able to get the most out of 1080p at 144Hz; we're not there yet, even with the biggest, baddest GPUs on the market.


----------



## budgetgamer120

Quote:


> Originally Posted by *Forceman*
> 
> Well, luckily for you, I wasn't talking to you. The people I was talking to understood just fine.


Sorry, your post did not make sense. No one said the 7700K was going to be a bottleneck, but you addressed bottlenecking.


----------



## blue1512

Quote:


> Originally Posted by *budgetgamer120*
> 
> Sorry, your post did not make sense. No one said the 7700K was going to be a bottleneck, but you addressed bottlenecking.


The 7700K already bottlenecks in Watch Dogs 2, choking really hard at 100%.
The 1800X does not.


----------



## budgetgamer120

Quote:


> Originally Posted by *blue1512*
> 
> The 7700K already bottlenecks in Watch Dogs 2, choking really hard at 100%.


Well then


----------



## redone13

Uh oh, not Watch Dogs 2, an absolutely terrible game, but I digress. Did you guys not see all the other benches from reputable sources?

http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951.html

http://www.guru3d.com/articles-pages/amd-ryzen-7-1800x-processor-review,1.html

http://www.pcworld.com/article/3176191/computers/ryzen-review-amd-is-back.html?page=2

http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page4.html

https://www.pcper.com/reviews/Processors/AMD-Ryzen-7-1800X-Review-Now-and-Zen/Gaming-Performance

https://arstechnica.com/gadgets/2017/03/amd-ryzen-review/


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> Uh oh, not Watch Dogs 2. An absolutely terrible game, but I digress. Did you guys not see all the other benches from reputable sources?
> 
> http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951.html
> 
> http://www.guru3d.com/articles-pages/amd-ryzen-7-1800x-processor-review,1.html
> 
> http://www.pcworld.com/article/3176191/computers/ryzen-review-amd-is-back.html?page=2
> 
> http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page4.html
> 
> https://www.pcper.com/reviews/Processors/AMD-Ryzen-7-1800X-Review-Now-and-Zen/Gaming-Performance
> 
> https://arstechnica.com/gadgets/2017/03/amd-ryzen-review/


Piledriver going strong


----------



## CULLEN

@ZealotKi11er: It's weird seeing such a respected member, who generally uses strong points in arguments, take such a weird view on Ryzen.

Saying that it isn't "that" impressive and that you'd rather get an i7 7800K is downright weird to read, mate.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Piledriver going strong


Thanks for confirming the general idea of my post about the majority of data we have so far.


----------



## blue1512

Quote:


> Originally Posted by *CULLEN*
> 
> @ZealotKi11er: It's weird seeing such a respected member, who generally uses strong points in arguments, take such a weird view on Ryzen.
> 
> Saying that it isn't "that" impressive and that you'd rather get an i7 7800K is downright weird to read, mate.


This.

People who are after raw single-thread performance have every right to skip Ryzen.

But proceeding to buy a 7800K after bashing the single-thread performance of Ryzen makes little to no sense to me.


----------



## Forceman

Quote:


> Originally Posted by *budgetgamer120*
> 
> Sorry, your post did not make sense. No one said the 7700K was going to be a bottleneck, but you addressed bottlenecking.


The only reason you would be concerned about the socket being a dead end is that you wouldn't be able to upgrade the CPU. The only reason you'd (presumably) want to upgrade the CPU is that it is becoming a bottleneck. So if there is no expectation that the 7700K is going to become said bottleneck in the near future, then the dead socket argument is not compelling. By the time the 7700K becomes untenable for gaming, forcing an upgrade, other motherboard features (like PCIe 4.0, or Optane, or whatever) may make you want to switch motherboards anyway.

That was the basis of the discussion.


----------



## budgetgamer120

Quote:


> Originally Posted by *blue1512*
> 
> This.
> 
> People who are after raw single-thread performance have every right to skip Ryzen.
> 
> But proceeding to buy a 7800K after bashing Ryzen's single-thread performance makes little to no sense to me.


Exactly


----------



## Oubadah

..


----------



## jologskyblues

Good thing I didn't pre-order Ryzen. From what AMD has been saying, it needs more of that Fine Wine™ for gaming.

This is why I hate Hype Trains. Especially if the company is behind it.


----------



## finalheaven

Quote:


> Originally Posted by *Oubadah*
> 
> You still haven't read the post.
> 
> No one is going to be making informed decisions off benchmarks using your protocol. If low res benchmarks have the potential to be misleading, then GPU bound CPU benchmarks have potential to be _even more_ misleading, as I quoted in the post that you haven't bothered to read:


I've been trying to find this post you're referring to. Can you link it please?


----------



## momonz

Even if Ryzen is slower than Intel at 4K, that deficit is going to be pointless. Given that AMD can still improve Ryzen in gaming, and that people gaming at 4K upgrade their GPU every year (see Vega and Volta), I don't see this being a problem for people building a new PC. Usually people upgrade the GPU more often than the CPU.

Let's not say Ryzen is only competitive in production or work; it's still good in gaming, albeit not what we wanted it to be.


----------



## blue1512

Quote:


> Originally Posted by *momonz*
> 
> Even if Ryzen is slower than Intel at 4K, that deficit is going to be pointless. Given that AMD can still improve Ryzen in gaming, and that people gaming at 4K upgrade their GPU every year (see Vega and Volta), I don't see this being a problem for people building a new PC. Usually people upgrade the GPU more often than the CPU.
> 
> Let's not say Ryzen is only competitive in production or work; it's still good in gaming, albeit not what we wanted it to be.


Let's not forget that the min fps of Ryzen is very stable as well.


----------



## budgetgamer120

Quote:


> Originally Posted by *jologskyblues*
> 
> Good thing I didn't pre-order Ryzen. From what AMD has been saying, it needs more of that Fine Wine™ for gaming.
> 
> This is why I hate Hype Trains. Especially if the company is behind it.


AMD has been saying this is for users who work and play and do not like stuttering. It's not for the gamers.


----------



## savagebunny

Quote:


> Originally Posted by *jologskyblues*
> 
> Good thing I didn't pre-order Ryzen. From what AMD has been saying, it needs more of that Fine Wine™ for gaming.
> 
> This is why I hate Hype Trains. Especially if the company is behind it.


The community as a whole, all the YouTube videos from third parties, this website, reddit, etc., hyped it up beyond belief and made everyone believe it was the next supercomputer chip. Look at all the threads from the past few months; I'm 99% sure those aren't AMD employees hyping it.

It gets a bit behind in single thread, and everyone throws a fit. It bogs down a bit due to SMT, immature BIOSes, etc.

When Skylake came out, I didn't see everyone cutting their wrists over this crap.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> AMD has been saying this is for users who work and play and do not like stuttering. It's not for the gamers.


The funny thing was, the original AMD reveal made the Intel PC look like a stutter-fest. And of course they hand-picked a gaming benchmark when the vast majority don't favor Ryzen. That matters not, though. It is a step in the right direction for computer tech and affordability.


----------



## Oubadah

..


----------



## budgetgamer120

Quote:


> Originally Posted by *savagebunny*
> 
> The community as a whole, all the YouTube videos from third parties, this website, reddit, etc., hyped it up beyond belief and made everyone believe it was the next supercomputer chip. Look at all the threads from the past few months; I'm 99% sure those aren't AMD employees hyping it.


They had a reason to be hyped... and it delivered on being a super computing chip, so what is the issue?

Did you read the reviews yourself, or are you just repeating what someone else said?

What is faster than Ryzen computationally? The 6950X, which is $1,500.


----------



## ChronoBodi

Quote:


> Originally Posted by *savagebunny*
> 
> The community as a whole, all the YouTube videos from third parties, this website, reddit, etc., hyped it up beyond belief and made everyone believe it was the next supercomputer chip. Look at all the threads from the past few months; I'm 99% sure those aren't AMD employees hyping it.


I wasn't expecting it to be a superchip, but a compelling 8c/16t alternative that can do decent gaming while being essentially half the price of i7s with similar core counts.

All the 1080p FPS freaks can go i7 7700K; I'd rather have a more flexible CPU that can handle more workloads than just "gamez".


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> The funny thing was, the original AMD reveal made the Intel PC look like a stutter-fest. And of course they hand-picked a gaming benchmark when the vast majority don't favor Ryzen. That matters not, though. It is a step in the right direction for computer tech and affordability.


The stutter-fest was streaming while gaming on a 6700K. AMD also showed Ryzen and the 6900K handling it well.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> The stutter-fest was streaming while gaming on a 6700K. AMD also showed Ryzen and the 6900K handling it well.


I would like to see an independent reviewer do that, given the bias of the parent companies. Even Intel you must take with a grain of salt.


----------



## jeffdamann

Quote:


> Originally Posted by *Oubadah*
> 
> The discussion started here:
> http://www.overclock.net/t/1624507/various-amd-ryzen-7-reviews/1100#post_25887927
> The post:
> http://www.overclock.net/t/1624507/various-amd-ryzen-7-reviews/1190#post_25888168
> My argument from 2014 that I quoted in the post, because it's still relevant, it's just 1080p instead of 720p now.
> https://forums.anandtech.com/threads/coolaler-devils-canyon-4-0-base-4-4-turbo-stock.2382450/page-5#post-36381781


Good god man, if this is such an issue, why not play games with no hardware acceleration that use ONLY the CPU for all tasks, including rendering? It would be impossible for the GPU to be the bottleneck, because the GPU isn't even utilized.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> I would like to see an independent reviewer do that, given the bias of the parent companies. Even Intel you must take with a grain of salt.


Linus and others have confirmed that many times, and it is no secret.

People who stream and game normally have two systems: one to play the game on and the other to stream.

I am not sure why this is a surprise to you. A quad core is simply a quad core; it has limitations 8 cores do not have.
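The quad-core vs. 8-core streaming argument above can be sketched with an ideal-scaling toy model. The numbers below are illustrative assumptions, not measurements; real games and encoders don't scale perfectly, and stutter is about scheduling spikes rather than averages, so treat this only as a sketch of the core-count reasoning:

```python
# Ideal-scaling toy model of a game plus a stream encode sharing one CPU.
# All numbers are illustrative assumptions, not measurements.
GAME_WORK = 16.0    # core-milliseconds of game work per frame
ENCODE_WORK = 8.0   # core-milliseconds of software (x264-style) encode per frame

def frame_ms(cores: int) -> float:
    """Per-frame time if the total work spreads perfectly over all cores."""
    return (GAME_WORK + ENCODE_WORK) / cores

# A quad core has a lower fps ceiling while encoding than an 8-core:
print(frame_ms(4))  # 6.0 ms/frame while streaming
print(frame_ms(8))  # 3.0 ms/frame while streaming
```

Under these assumptions the encoder's work simply disappears into the spare cores of an 8-core chip, which is the point being made about Ryzen.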


----------



## kx11

I guess AMD finally brought the challenge to Intel.

They didn't beat Intel yet, but they finally became a challenger, and they made Intel cut prices across its entire 2015/2016 line-up, which is great for everyone.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Linus and others have confirmed that many times, and it is no secret.
> 
> People who stream and game normally have two systems: one to play the game on and the other to stream.
> 
> I am not sure why this is a surprise to you. A quad core is simply a quad core; it has limitations 8 cores do not have.


Hmm, that is an excellent point. But it could also be argued that many streamers only have one PC: the ones that aren't major and don't have sponsors sending them a second one. It is not a surprise; it is a luxury and not a requirement to get your content out.


----------



## Oubadah

..


----------



## Hueristic

Quote:


> Originally Posted by *AuraNova*
> 
> 
> 
> 
> 
> Basically all the reviews were off from one another to some extent. This goes into greater detail of why Joker and Gamers Nexus had such different findings.


OK, so Nexus is too stupid to do a BIOS update. Got it. Got sick of listening to him blabber. Did he ever get his XMP working on the Asus? Nice to see the Gigabyte hit the XMP right out of the gate. I'm so shocked that on a new release getting the latest BIOS and the board matters. OMG


----------



## savagebunny

Quote:


> Originally Posted by *ChronoBodi*
> 
> I wasn't expecting it to be a superchip, but a compelling 8c/16t alternative that can do decent gaming while being essentially half the price of i7s with similar core counts.
> 
> All the 1080p FPS freaks can go i7 7700K; I'd rather have a more flexible CPU that can handle more workloads than just "gamez".


My understanding from the past 75 pages of this "discussion" tonight is that everyone wanted this chip to be the best all-around chip. I sure got my $/performance numbers run and can surely jump to Ryzen when the day presents itself.

I build off workload, not a synthetic benchmark train.


----------



## finalheaven

Quote:


> Originally Posted by *Oubadah*
> 
> The discussion started here:
> http://www.overclock.net/t/1624507/various-amd-ryzen-7-reviews/1100#post_25887927
> The post:
> http://www.overclock.net/t/1624507/various-amd-ryzen-7-reviews/1190#post_25888168
> My argument from 2014 that I quoted in the post, because it's still relevant, it's just 1080p instead of 720p now.
> https://forums.anandtech.com/threads/coolaler-devils-canyon-4-0-base-4-4-turbo-stock.2382450/page-5#post-36381781
> 
> I acknowledge that the low res benchmarks are imperfect, and always have. But it's the difference between imperfect (even heavily flawed) benchmarks and 100% worthless benchmarks.


So at 1440p (the resolution I play at) with a 1070 (my GPU), will there be cases where the 1700 fails miserably in fps currently? Or are you stating that in the future, when my 1070 is no longer the bottleneck at 1440p, I'll know why CPU-limited benchmarks matter?


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> Hmm, that is an excellent point. But it could also be argued that many streamers only have one PC: the ones that aren't major and don't have sponsors sending them a second one. It is not a surprise; it is a luxury and not a requirement to get your content out.


If you want to stream at good quality, you are going to need 2 systems, or put up with stuttering, or stream in low quality with an i7.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> If you want to stream at good quality, you are going to need 2 systems, or put up with stuttering, or stream in low quality with an i7.


You know that even if you have a good PC, you must stream at a lower bit rate for the audience? It isn't going to be top-notch quality, so that you can reach the widest audience. And I for one, as someone who watches multiple 1000+ viewer Twitch streamers, see many that have just one PC and do just fine.


----------



## ChronoBodi

Quote:


> Originally Posted by *Hueristic*
> 
> OK, so Nexus is too stupid to do a BIOS update. Got it. Got sick of listening to him blabber. Did he ever get his XMP working on the Asus? Nice to see the Gigabyte hit the XMP right out of the gate. I'm so shocked that on a new release getting the latest BIOS and the board matters. OMG


Ouch. I thought he was reputable, but this was a screwup on Steve's part and it seems he won't admit it.

So, question is, who are the reputable reviewers?


----------



## Master__Shake

Quote:


> Originally Posted by *Oubadah*
> 
> You still haven't read the post.
> 
> No one is going to be making informed decisions off benchmarks using your protocol. If low res benchmarks have the potential to be misleading, then GPU bound CPU benchmarks have potential to be _even more_ misleading, as I quoted in the post that you haven't bothered to read.


As a consumer, the first thing I do when trying to find out something is Google it.

So I Google "intel vs amd 4k gaming".

You know what I get? Nothing.

I guess no one cares about me or people like me.

So where do I go when I can't Google something?

I go buy a PS4 Pro or something.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Marios145*
> 
> 1. Get Ryzen and create 2 VMs, install the same game(bf1 for example) on each VM, assign 3 cores to each, one keyboard to each, one mouse to each, one display/GPU to each. and play vs a friend with one machine
> 3. Stream the content on Youtube/Twitch using the remaining 2 cores.
> 3. ???
> 4. Profit


And do all that for less money than it would take to build a 7700K rig that couldn't hope to do the same! Nice perspective!


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *redone13*
> 
> It's harder to get excited over, say, 2160p because realistically it requires at minimum a GTX 1080 to utilize. 1080p has some relevance.


The cool thing about Ryzen is that, at its low price of entry, you can afford a fancy GPU like a 1080 without having to settle for a 4-core that may beat it in games but nothing else, or a 6-core that doesn't really perform much better in games and again gets stomped in other stuff.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> You know that even if you have a good PC, you must stream at a lower bit rate for the audience? It isn't going to be top-notch quality, so that you can reach the widest audience. And I for one, as someone who watches multiple 1000+ viewer Twitch streamers, see many that have just one PC and do just fine.


Resolution matters also.

The stutter and lag will be present regardless of how low your quality is on a quad core. An 8-core will give a smoother experience.


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> The cool thing about Ryzen is that, at its low price of entry, you can afford a fancy GPU like a 1080 without having to settle for a 4-core that may beat it in games but nothing else, or a 6-core that doesn't really perform much better in games and again gets stomped in other stuff.


Go to Micro Center and get a 7700K at an awesome price thanks to Ryzen. I want to see more 4K benches, but current data says Intel for gaming, AMD for productivity.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Resolution matters also.
> 
> The stutter and lag will be present regardless of how low your quality is on a quad core. An 8-core will give a smoother experience.


How much smoother can it be if there is no problem already for those that have one PC? It is still a luxury for the majority and certainly not required. Most probably started with one PC but got another as it became their job to stream.


----------



## blue1512

https://www.reddit.com/r/Amd/comments/5x7iwf/did_i_just_get_lucky_with_my_1800x_my_results_are/

Some people are reporting better gaming performance on Gigabyte boards, which support faster RAM.

It seems there are some issues with the Asus board.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am sorry, but how is a $20 difference for the MB much cheaper? Also, I would get the 6800K simply because it's a better all-round CPU. Simply being faster at gaming is enough for me and most people. To make things worse, the 6800K is more than 1 year old. Yeah, you save $115 and waited for a slower CPU. I will wait for the 7800K and get even faster performance.


You are banging this "faster in gaming" drum everywhere you post as though the 1700X is like 20-30% slower than a 6800K in all games. That's not the case. In fact, it's very, very close in gaming performance to the 6900K and 6850K. The X99 chips may be faster overall, but it is by no means the wide chasm that you seem to be painting...


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> How much smoother can it be if there is no problem already for those that have one PC? It is still a luxury for the majority and certainly not required. Most probably started with one PC but got another as it became their job to stream.


What do you mean, no problem? Everyone who streams and games on the same PC with a quad core gets stutters. It might be less with an i7, but it is still there.


----------



## AuraNova

Quote:


> Originally Posted by *Hueristic*
> 
> OK, so Nexus is too stupid to do a BIOS update. Got it. Got sick of listening to him blabber. Did he ever get his XMP working on the Asus? Nice to see the Gigabyte hit the XMP right out of the gate. I'm so shocked that on a new release getting the latest BIOS and the board matters. OMG


I started getting the feeling as the day went on that it was the ASUS boards that were the issue, but it's really a bunch of things. As of right now, I don't know who to believe anymore. And so...
Quote:


> Originally Posted by *ChronoBodi*
> 
> So, question is, who are the reputable reviewers?


Your guess is as good as mine. I guess the only reputable reviewer is yourself.


----------



## jologskyblues

Quote:


> Originally Posted by *savagebunny*
> 
> The community as a whole, all the youtube videos from 3rd partys, this website, reddit etc hyped it up beyond belief and made everyone believe it was the next supercomputer chip. Look at all the threads the past few months, I'm 99% sure those aren't AMD employees hyping it.


Nope. Just go revisit all the official AMD PR events, announcements and presentation slides that fed the hype, especially the most recent ones prior to the launch. Even the fans had lowered their expectations months before. The so-called daily "leaks" leading up to the launch raised expectations, which set AMD up for disappointment again. They're doing the same with Vega. I'm seeing all the red flags. Manage your expectations, people. It's AMD we're talking about here.


----------



## tpi2007

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mad Pistol*
> 
> I just got this screen cap from the Hardware Unboxed Ryzen video.
> 
> My theory as to why the minimums are competitive but the averages are not is because of AMD's current implementation of SMT. The game's master thread (used to balance all other workloads) has no issues maintaining a constant stream of data to the GPU. However, due to some wonky stuff in SMT, sometimes the instructions get sent in an unbalanced way to the cores and SMT threads. Therefore, it creates a bottleneck in processing time, but it's able to match Intel on the minimum framerates.
> 
> Seriously, if AMD can get this worked out, Ryzen 8-cores will be competitive in gaming as well.
> 
> 
> 
> I think one of the prevailing theories is that thread switches between cores causes significant performance impacts, and the current Windows scheduler is not keeping thread/core affinity in some cases. So possibly a driver-level fix could help out.

It wouldn't be the first time. Even Windows XP didn't support HT properly at first, and when it did, there were bugs:

http://www.geek.com/blurb/hyperthreading-implementation-in-windows-xp-548442/

http://www.hardwareheaven.com/community/threads/microsoft-confirms-bug-in-hyper-threading-support-by-windows-xp.18170/

Quote:


> Originally Posted by *Hueristic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AuraNova*
> 
> 
> 
> 
> 
> Basically all the reviews were off from one another to some extent. This goes into greater detail of why Joker and Gamers Nexus had such different findings.
> 
> 
> 
> > OK, so Nexus is too stupid to do a BIOS update. Got it. Got sick of listening to him blabber. Did he ever get his XMP working on the Asus? Nice to see the Gigabyte hit the XMP right out of the gate. I'm so shocked that on a new release getting the latest BIOS and the board matters. OMG

Isn't he saying that he actually used the latest one?
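The thread/core affinity theory quoted above is easy to poke at yourself. A minimal Linux-only sketch in Python (on Windows you'd use Task Manager's affinity dialog or `start /affinity` instead); the even-numbered-CPUs-as-one-SMT-sibling-per-core layout is an assumption that varies by system, and this is a user-side experiment, not the driver-level fix being discussed:

```python
import os

# Current set of logical CPUs the scheduler may use for this process.
allowed = os.sched_getaffinity(0)
print("default affinity:", sorted(allowed))

# Pin the process to the even-numbered logical CPUs. On many (not all)
# topologies that is one SMT sibling per physical core; the layout is
# an assumption, so check your system's topology before relying on it.
# Fall back to the original set if no even-numbered CPU is available.
one_per_core = {cpu for cpu in allowed if cpu % 2 == 0} or allowed
os.sched_setaffinity(0, one_per_core)
print("pinned to:", sorted(os.sched_getaffinity(0)))

# Restore the original affinity so nothing else is affected.
os.sched_setaffinity(0, allowed)
```

Whether pinning like this helps is entirely workload-dependent; it just makes the "threads bouncing between cores" theory testable.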


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> What do you mean no problem? Everyone who streams and game on the same pc with a quadcore has stutters. Might be less with i7 but it is still there.


I don't know about everyone. And I don't know how to show you besides linking a streamer with one PC, but I neither want to endorse anyone nor divulge what I enjoy, lol. Take a look at any of the thousands of streamers on Twitch and see how one PC lags for "everyone" is all I can say.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> I don't know about everyone. And I don't know how to show you besides linking a streamer with one PC, but I neither want to endorse anyone nor divulge what I enjoy, lol.


Ok, I will link you one with a 7700K and a Ryzen 1700.

Streaming on the i7 = stutter and dropped frames.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Partogi*
> 
> So for gaming 7700K is better


Why would you have expected anything different? Most realistic people were targeting Haswell levels of IPC from Zen going back a while now. Even the reddest of fanboys didn't expect AMD to be able to hurdle Intel's 10-year gap in architecture superiority in one single release. The truly important takeaway from Ryzen is that AMD is finally competitive with Intel again and has finally opened the door to legitimate 8-core processors for those who are not swimming around in Scrooge McDuck's money bin...


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ok, I will link you one with a 7700K and a Ryzen 1700.
> 
> Streaming on the i7 = stutter and dropped frames.


You know that is the original AMD propaganda video, right? LOL. Straight from the parent company's first presentation, where they showed a cherry-picked bench?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Master__Shake*
> 
> because this
> 
> 
> 
> 720p and 1080p are on top.


Got news for you. Hardly any of those people are running anything like a GTX 1080...


----------



## Malinkadink

Games like Watch Dogs 2 and BF1, which hammer 8-thread i7s all day and won't be the last games to do so, are reason enough to consider an R7 Ryzen, especially if you want to stream.


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> You know that is the original AMD propaganda video, right? LOL. Straight from the parent company's first presentation, where they showed a cherry-picked bench?


It's the truth; the "propaganda video" was taken in front of reliable eyes.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> What do you mean no problem? Everyone who streams and game on the same pc with a quadcore has stutters. Might be less with i7 but it is still there.


I in no way endorse this one-PC streamer of a 3D game with 5000 viewers. I just randomly looked for someone on Twitch. 4770K. No lag.

https://www.twitch.tv/drriku

Two PCs are a luxury, not a requirement. There are thousands more.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> You know that is the original AMD propaganda video, right? LOL. Straight from the parent company's first presentation, where they showed a cherry-picked bench?


Why is it propaganda? The facts do not change because it is from an AMD event. Please, I like mature argument; if you see false stuff in the video, please point it out. "LOLOL it is from AMD" does nothing for me.

More propaganda for you.


----------



## DADDYDC650

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I have a new scenario for gaming. Which system is a better gaming system:
> 
> 1700X + GTX 1080Ti for $1100
> 
> or
> 
> 6900K + GTX 1060 for $1250
> 
> Which gives gamer's the better option of those two choices? Obviously you could do a 7700K + GTX 1080Ti for around the same money as the Ryzen setup but you would also be locked into a dead socket with no upgrade path on a quad core.
> 
> I think I'll take the 1700X thanks...


LoL, wise choice. More games will favor Zen over the 7700k as time goes by. Like you said, investing in 1151 is a dead end.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Why is it propaganda? The facts do not change because it is from an AMD event. Please, I like mature argument; if you see false stuff in the video, please point it out. "LOLOL it is from AMD" does nothing for me.
> 
> More propaganda for you.


Read my above post. Wow, that 4770K is... not lagging. As we speak, this VERY moment. There are many variables, such as bit rate.


----------



## Oubadah

..


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> Read my above post. Wow, that 4770K is... not lagging. As we speak, this VERY moment.


Which game, mate?


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> I in no way endorse this one-PC streamer of a 3D game with 5000 viewers. I just randomly looked for someone on Twitch. 4770K. No lag.
> 
> https://www.twitch.tv/drriku
> 
> Two PCs are a luxury, not a requirement. There are thousands more.


Say what you want to say. Facts are facts.

If you like stutter and low fps, then stream and game on your quad core. Real streamers do not have to settle or break the bank anymore now that Ryzen is here.


----------



## Quantum Reality

I was poking around a bit on reddit and noticed this thread:

https://www.reddit.com/r/Amd/comments/5x6q5e/this_is_whats_going_on_with_ryzen_explaining_some/

It seems to make the case that it really is simply a matter of teething issues in getting the OS and BIOS up to snuff.

Incidentally, will AMD at least be issuing a patch that goes back to Windows 7? Microsoft has been *very* heavily pushing Windows 10, and I wouldn't put it past them to "forget" to issue hotfixes for Win7 and 8/8.1 to make Ryzen users go to Win10 to get the full power of their systems.


----------



## redone13

Quote:


> Originally Posted by *blue1512*
> 
> Which game, mate?


Look at the previous link. I am not endorsing that streamer; it was just to prove a point. It is a complicated subject, and streamers will not use the highest bit rate, so as to reach a wider audience.

Quote:


> Originally Posted by *budgetgamer120*
> 
> Say what you want to say. Facts are facts.
> 
> If you like stutter and low fps, then stream and game on your quad core. Real streamers do not have to settle or break the bank anymore now that Ryzen is here.


I gave you a real-time result. You gave me an AMD presentation. We can leave it at this. That's a 4770K, btw, not a 7700K.


----------



## tpi2007

Quote:


> Originally Posted by *Oubadah*
> 
> Quote:
> 
> 
> 
> Originally Posted by *finalheaven*
> 
> So at 1440p (the resolution I play at) with a 1070 (my GPU), will there be cases where the 1700 fails miserably in fps currently? Or are you stating that in the future, when my 1070 is no longer the bottleneck at 1440p, I'll know why CPU-limited benchmarks matter?
> 
> 
> 
> Like I said, GPUs are advancing at a much greater rate than CPUs. So yes, is it not conceivable that in a couple of years you could replace the 1070 with something that could be significantly faster without also replacing the monitor? Then CPU limitations that were previously hidden behind GPU limitations may become visible.
> 
> Everyone plays different games and uses different configs so it's impossible to say whether anything is going to "fail miserably". I just think that when making decisions about what CPU to buy, it's valuable to be able to see what reserves each one might have above what is considered "enough" at that moment. Reviewers take one small snapshot of performance from one area of a game, and it's rarely the worst case scenario.

Sure enough, but I think that at this point in time core reserves are more important than IPC reserves (provided you have a good baseline, of course) when looking into the future. There will always be a need for single-threaded performance, but you've got to balance it against the need for more cores as time moves on.
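The low-res benchmarking argument being debated above boils down to a simple bottleneck model. A toy sketch with made-up numbers (not measurements from any review): frame time is roughly the slower of the CPU's and the GPU's per-frame work, so lowering the resolution shrinks the GPU term and exposes the CPU "reserves" in question:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: frame rate is limited by the slower of CPU and GPU work."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Two hypothetical CPUs: 5 ms vs 8 ms of game logic per frame.
fast_cpu, slow_cpu = 5.0, 8.0
# Hypothetical GPU cost per frame at two resolutions.
gpu_4k, gpu_1080p = 16.0, 4.0

# GPU-bound at 4K: both CPUs deliver the same 62.5 fps, hiding the gap.
print(fps(fast_cpu, gpu_4k), fps(slow_cpu, gpu_4k))
# CPU-bound at 1080p: the gap appears (200 fps vs 125 fps).
print(fps(fast_cpu, gpu_1080p), fps(slow_cpu, gpu_1080p))
```

In this model a 4K-only review would call the two CPUs identical, which is exactly why low-res runs are argued to reveal headroom for future, faster GPUs.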


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> Look at the previous link. I am not endorsing that streamer. Was just to prove a point.
> I gave you a realtime result. You gave me an AMD presentation. We can leave it at this.


What result did you give me? I saw you typed something; that means nothing. Show me a video of performance before streaming and after.

You can't, because we know for a fact the performance tanks.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> What result did you give me? I saw you typed something; that means nothing. Show me a video of performance before streaming and after.
> 
> You can't, because we know for a fact the performance tanks.


I showed you a link to a live streamer with a 4770K that is not lagging. This refutes your claim about needing two PCs. I can go back and quote it if you like. I'm getting less and less interested in debating with you, though, as you can't see fact or logic. See post #1447.


----------



## budgetgamer120

NVM


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> Look at the previous link. I am not endorsing that streamer. Was just to prove a point. It is a complicated subject and streamers will not use the highest bit rate to get a wider audience.
> I gave you a realtime result. You gave me an AMD presentation. We can leave it at this.


AMD streamed Dota, a very popular game that is quite demanding on the CPU. It presents very well what they are aiming for with Ryzen.

You don't just pick a streamer of a CPU-light game to deny this point. And I doubt that you have ever tried streaming.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *redone13*
> 
> Hi, it looks like you have money to have the best item today and then upgrade when it isn't the best. Like how it's been forever, even for AMD. Fact of the matter is, the intel is better for gaming. AMD for benches.


Actually, due to Ryzen's incredible pricing, people who could never have afforded something like a Vive or a 4K monitor on top of the cost of an expensive Intel platform to run it all on now have the option of going for those technologies. It's called a win for consumers...


----------



## DarkRadeon7000

Quote:


> Originally Posted by *Malinkadink*
> 
> Games like Watch Dogs 2 and BF1, which hammer 8-thread i7s all day and won't be the last games to do so, are reason enough to consider an R7 Ryzen, especially if you want to stream.


The 4-core 7700K beats the 8-core Ryzen in those very games.


----------



## redone13

Quote:


> Originally Posted by *blue1512*
> 
> AMD streamed Dota, a very popular game that is quite demanding on the CPU. It presents very well what they are aiming for with Ryzen.
> 
> You don't just pick a streamer of a CPU-light game to deny this point.


Do some research and look at the thousands of others, say League players. I posted hard data. There are THOUSANDS more on Twitch.


----------



## ChronoBodi

Quote:


> Originally Posted by *AuraNova*
> 
> I started getting the feeling as the day went on that it was the ASUS boards that were the issue, but it's really a bunch of things. As of right now, I don't know who to believe anymore. And so...
> Your guess is as good as mine. I guess the only reputable reviewer is yourself.


The only way to know for sure is to run user OCN benches here; there is too much rushing out the door to get YouTube clicks and views when the hardware just barely works together, with lots of unknowns about the actual performance/specs.

Who's to say ALL March 2nd reviews from ALL sites aren't valid then? Now what?

We need to wait and verify ourselves.


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> Do some research and look at the thousands of others. Say League players. I posted hard data. There is THOUSANDs more on twitch.


Don't get me started on LoL's crap engine, mate.

Every established streamer, or anyone trying to become one, spends heavily on the CPU. They normally dedicate one computer just to streaming.


----------



## redone13

Malinkadink,

Take a look. How about you find me a crappy, laggy stream that is running on an i7? How about you provide some evidence?


----------



## budgetgamer120

Quote:


> Originally Posted by *DarkRadeon7000*
> 
> The 4 core 7700k beats the Ryzen 8 core on those very games


I am watching the stream and it is clearly choppy and stuttering.

Here is the stream https://www.twitch.tv/drriku
Quote:


> Originally Posted by *redone13*
> 
> Malinkadink,
> 
> Take a look. How about you find me a crappy, laggy stream that is running on an i7? How about you provide some evidence?


Already did, in post 2.


----------



## redone13

That is very strange. I have 90Mb cable in Chicago and it is buttery smooth. The 5,000 other viewers probably think so too. Or do people just enjoy watching a choppy stream?


----------



## finalheaven

Quote:


> Originally Posted by *Oubadah*
> 
> Like I said, GPUs are advancing at a much greater rate than CPUs. So yes, is it not conceivable that in a couple of years you could replace the 1070 with something that could be significantly faster without also replacing the monitor? Then CPU limitations that were previously hidden behind GPU limitations may become visible.
> 
> Everyone plays different games and uses different configs so it's impossible to say whether anything is going to "fail miserably". I just think that when making decisions about what CPU to buy, it's valuable to be able to see what reserves each one might have above what is considered "enough" at that moment. Reviewers take one small snapshot of performance from one area of a game, and it's rarely the worst case scenario.


I'm trying to play out your argument. You make complete sense, btw, as to why there would be no point in that testing if everything is GPU-limited. But would you agree that if a person is GPU-limited for the whole life of the system, then CPU-limited video game benchmarks are completely useless? [For example, someone like me who has kept a 2500K paired with an Nvidia 1070.]

However, if most people here (overclock.net enthusiasts) are currently GPU-limited, whether CPU-limited benchmarks matter depends on the window before they replace their system: will they become CPU-limited because of a next-gen GPU?

With that in mind, do you believe GPU makers (Nvidia and AMD) and game developers will focus on making newer games use multiple cores more efficiently, or that chips will reach 6GHz? I ask because games all seem to be heading toward multiple cores now (DX12, for example). You are completely right that CPU-limited benchmarks matter if next-gen GPUs and games concentrate on clock speed, but if they concentrate on multiple cores, you might be wrong.

In other words, your argument only holds true if and when people become CPU-limited?


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> I am watching the stream and it is clearly choppy and stuttering.
> 
> Here is the stream https://www.twitch.tv/drriku
> Already post 2.


You can quit it with the eye-rolling thing. I am not trying to be condescending. I chose a 5,000-viewer streamer because people will not watch a choppy stream in those numbers. All I provide is evidence; all I ask is that you do the same.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> That is very strange. I have 90mb cable in chicago and it is buttery smooth. The 5000 other viewers probably think so too. Otherwise people just enjoy watching a choppy stream right?


And I have 150Mbps.

Also, the stream you presented did not have a webcam feed, which was present in the AMD demo.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> I am watching the stream and it is clearly choppy and stuttering.
> 
> Here is the stream https://www.twitch.tv/drriku
> Already post 2.


If I find one will you start being logical?


----------



## th3illusiveman

So my gaming resolution is 2560x1440. Is this comparable to Intel post-Sandy? As long as it's FASTER than Sandy ALL the time, I won't have any regrets.


----------



## Zahix

Amazing performance. Not disappointed. Overclocked 1700/1700x have insane value.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> If I find one will you start being logical?


Maybe you should start being logical... If you were, we wouldn't be having this discussion.

Also, there is no proof of lost performance on the gamer's side, so I don't know what you are getting at.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Maybe you should start being logical... If you were, we wouldn't be having this discussion.
> 
> Also, there is no proof of lost performance on the gamer's side, so I don't know what you are getting at.


6700K. There you go. You said all i7 streams lag:

https://www.twitch.tv/2mgovercsquared

Also, the previous example was a 4770K and it didn't lag.
Quote:


> Originally Posted by *budgetgamer120*
> 
> What do you mean, no problem? Everyone who streams and games on the same PC with a quad core has stutters. It might be less with an i7, but it is still there.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ChronoBodi*
> 
> Ouch. I thought he was reputable, but this was a screwup on Steve's part and it seems he won't admit it.
> 
> So, question is, who is the reputable reviewers?


We will be able to find them in the OCN Ryzen owner's club. I like real-world experiences here from a wide range of actual owners and that goes for any hardware. You know, people who will actually take the time to use the latest firmware and do the other things that matter when judging performance. I suspect that one month from now we will all be seeing much more clear evidence of Ryzen's true nature in gaming than we are currently. Which is nice since I don't plan on getting my Ryzen system set up until later this year anyway.


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> 6700k. There you go. You said all i7 streams lag
> 
> https://www.twitch.tv/2mgovercsquared
> 
> Also, the previous example was a 4770K and it didn't lag.


It's pretty much a pointless argument because most streamers don't bother updating their setup info. Why don't you just ask him on stream about his streaming setup?


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> 6700k. There you go. You said all i7 streams lag
> 
> https://www.twitch.tv/2mgovercsquared
> 
> Also, the previous example was a 4770k and it didnt lag.


There is no gaming in that one. I see a lady speaking.

How do you see the specs of their systems?


----------



## redone13

Either way, don't say this:
Quote:


> Originally Posted by *budgetgamer120*
> 
> What do you mean, no problem? Everyone who streams and games on the same PC with a quad core has stutters. It might be less with an i7, but it is still there.


If you can't provide hard evidence.
Quote:


> Originally Posted by *budgetgamer120*
> 
> There is no gaming in that one. I see a lady speaking.
> 
> How do you see the specs of their systems?


Just wait a while. She'll game. Or go see for your damn self that you're wrong.


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> Either way, don't say this:
> If you can't provide hard evidence.
> Just wait a while. She'll game. Or go see for your damn self that you're wrong.


Meanwhile, why don't you head to Arteezy's stream, who is streaming Dota 2 with [email protected] x264 CPU encode on Ryzen?

You can google around to figure out how impressive this feed is.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> Either way, don't say this:
> If you can't provide hard evidence.


I can and will say it. Streaming on an i7 leads to dropped frames, lost performance, and stuttering.

You have been posting 720p 30fps videos to prove your point.

Is this 2013?


----------



## redone13

Either way, don't say this:
Quote:


> Originally Posted by *budgetgamer120*
> 
> What do you mean, no problem? Everyone who streams and games on the same PC with a quad core has stutters. It might be less with an i7, but it is still there.


If you can't provide hard evidence.

And blue, I am sure it looks good. You don't need to instigate. Let budget read what he typed.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> Either way, don't say this:
> If you can't provide hard evidence.
> 
> And blue, I am sure it looks good. You don't need to instigate. Let budget read what he typed.


I am being logical.


----------



## TheReciever

Then bring us a 1080p60 stream and call it a day on that discussion?

Otherwise you guys are really working toward a lock...


----------



## redone13

Well, if we are going to do it like that, the streamer should have an established user base with high viewer numbers like the ones I provided. That way we know there are more than the 3 viewers you posted; this guy could have no idea how to set it up. The fact of the matter is, you said

Quote:


> Originally Posted by *budgetgamer120*
> 
> What do you mean, no problem? Everyone who streams and games on the same PC with a quad core has stutters. It might be less with an i7, but it is still there.


You said EVERYONE. You are wrong. Give it up. I showed you otherwise.


----------



## redone13

Quote:


> Originally Posted by *TheReciever*
> 
> Then bring us a 1080p60 stream and call it a day on that discussion?
> 
> Otherwise you guys are working towards a lock really...


There is no lock. Now he is scrambling for specifics when he made a generalization about EVERYONE.

Quote:


> Originally Posted by *budgetgamer120*
> 
> What do you mean, no problem? Everyone who streams and games on the same PC with a quad core has stutters. It might be less with an i7, but it is still there.


----------



## kd5151

Goodnight.


----------



## UZ7

For streaming, having multiple cores will benefit you if you're playing games that are CPU-intensive.

You can get away with lightening the CPU load of the encoding, but you will lose image quality unless you push more data (bandwidth/bitrate); in turn you use less CPU power but need better upload rates (for streamers).

Conversely, you can get away with lower upload rates/bitrates if you put more load on the CPU for encoding.

I think the sweet spot is Intel's 6-core, a 5820K, as a balance between encoding performance and keeping game FPS up.

But if you have 8 cores, with a few dedicated to streaming and the rest to gaming, you'll be good.
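
The trade-off above can be put in rough numbers. This is an illustrative sketch only: the bits-per-pixel targets below are assumptions chosen to show the shape of the trade-off (slower x264 presets compress more efficiently, so they need less upload bitrate for comparable quality at the cost of more CPU time); they are not measured values.

```python
# Illustrative sketch of the encode trade-off described above.
# The bits-per-pixel (bpp) targets are ASSUMED values, not measurements:
# slower presets spend more CPU per frame but reach comparable quality
# at a lower bpp, i.e. with less upload bandwidth.
BPP_TARGET = {
    "veryfast": 0.120,  # light CPU load, needs more bitrate
    "faster":   0.105,
    "fast":     0.095,
    "medium":   0.085,  # heavy CPU load, needs less bitrate
}

def required_kbps(width, height, fps, preset):
    """Estimated upload rate (kbit/s) for one stream format and preset."""
    return width * height * fps * BPP_TARGET[preset] / 1000

for preset in ("veryfast", "medium"):
    print(f"720p60 @ {preset}: ~{required_kbps(1280, 720, 60, preset):.0f} kbit/s")
```

Under these assumed numbers, dropping from `medium` to `veryfast` frees CPU but asks for roughly 40% more upload, which is the same lever described in the post above.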


----------



## TheReciever

Quote:


> Originally Posted by *redone13*
> 
> There is no lock. Now he is scrambling for specifics when he made a generalization about EVERYONE.


Any thread that gets out of hand gets a lock, especially a thread like this. They may be a little loose with the rules here out of understanding for the climate, but the first point remains.

Generally i7s will have dropped frames; it's why my friends like to stream 720p60, though they aren't huge streamers by any stretch.


----------



## Xuper

Anyway, LGA 1151/LGA 2011 are dead sockets going forward. With Ryzen you can have Zen+, and Dr. Lisa Su even mentioned that we will get Zen 2 and Zen 3, all on the same socket.


----------



## ChronoBodi

Quote:


> Originally Posted by *UZ7*
> 
> For streaming, having multiple cores will benefit you if you're playing games that are CPU intensive.
> 
> You can get away with lightening the CPU load of the encoding, but you will lose image quality unless you push more data (bandwidth/bitrate); in turn you use less CPU power but need better upload rates (for streamers).
> 
> You can get away with lower upload rates/bitrate if you put more load on the CPU encoding.
> 
> I think a sweet spot is Intel's 6 core.. a 5820K to be a balance with encoding performance and keeping game fps up.
> 
> But if you have 8 cores, a few dedicated for streaming and the rest for gaming then you'll be good.


Especially Watch Dogs 2 and BF1... those already push a 4c/8t CPU to 100% or close to it.

There's no room for encoding there; an 8c/16t will give you room to do so.


----------



## redone13

Quote:


> Originally Posted by *TheReciever*
> 
> Any thread that gets out of hand gets a lock, especially in a thread like this. They may be a little loose with the rules here to be understanding of the climate but first point remains.
> 
> Generally i7s will have dropped frames; it's why my friends like to stream 720p60, though they aren't huge streamers by any stretch.


I understand. I just wanted to prove in real time that if streamers who make a living off streaming are fine on one PC, he is exaggerating quite a bit. Not "everyone" lags.


----------



## lombardsoup

Quote:


> Originally Posted by *TheReciever*
> 
> Any thread that gets out of hand gets a lock, especially in a thread like this. They may be a little loose with the rules here to be understanding of the climate but first point remains.
> 
> Generally i7s will have dropped frames; it's why my friends like to stream 720p60, though they aren't huge streamers by any stretch.


Sure, it gets a little heated (how is that a bad thing?), but the thread has otherwise been very informative.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> Well if we are going to do it like that, the streamer should have an established user base with high numbers like I provided. That way we know there are more than the 3 viewers you posted. This guy could have no idea how to set it up. The fact of the matter is, you said
> You said EVERYONE. You are wrong. Give it up. I showed you otherwise.


Yes, I know what I said. Stutter is only valid if the streamer has a high user base?

Turning the quality down to where a scientific calculator could run it takes away some of the stutter. Ryzen owners don't need to.

On an i7 you either stream at outdated settings and deal with chop, or stream at good settings and deal with chop and stutters.

I don't build my high-end PC to stream at 720p and 30fps. My standards are a little higher.

https://www.twitch.tv/danelite


----------



## TheReciever

Quote:


> Originally Posted by *redone13*
> 
> I understand. I just wanted to prove in real time that if streamers who make a living off streaming are fine on one PC, he is exaggerating quite a bit.


Even I would rather have a second PC, and I'm not ballin'.

Just do it through the intranet if need be.


----------



## Majin SSJ Eric

Not sure what "redone" is still banging on about? We get it, Ryzen sucks and there's literally no compelling reason to buy a comparable 8-core AMD processor for less money than a more expensive Intel quad core because it has slightly higher FPS in games. Go enjoy said Intel quad core and stop trying so hard.

Btw, you have 54 posts on this forum all told and every single one of them is either bashing Ryzen today or chumming it up over in the KL OC thread. Kinda gives a little insight into your perspective on Ryzen I think...


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Yes, I know what I said. Stutter is only valid if the streamer has a high user base?
> 
> Turning the quality down to where a scientific calculator could run it takes away some of the stutter. Ryzen owners don't need to.
> 
> On an i7 you either stream at outdated settings and deal with chop, or stream at good settings and deal with chop and stutters.
> 
> I don't build my high-end PC to stream at 720p and 30fps. My standards are a little higher.
> 
> https://www.twitch.tv/danelite


I am repeating myself because you do not understand that not EVERYONE who streams on a quad core lags. Now you are reaching for specific formats as an out.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> I understand. I just wanted to prove realtime that if streamers that make a living off streaming are fine on one PC, he is exaggerating quite a bit. Not "everyone" lags.


Making a living or not does not change the facts. Everyone lags on a quad core. You can continue to repeat yourself.


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Not sure what "redone" is still banging on about? We get it, Ryzen sucks and there's literally no compelling reason to buy a comparable 8-core AMD processor for less money than a more expensive Intel quad core because it has slightly higher FPS in games. Go enjoy said Intel quad core and stop trying so hard.
> 
> Btw, you have 54 posts on this forum all told and every single one of them is either bashing Ryzen today or chumming it up over in the KL OC thread. Kinda gives a little insight into your perspective on Ryzen I think...


Just refuting misinformation, sir.


----------



## TheReciever

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Not sure what "redone" is still banging on about? We get it, Ryzen sucks and there's literally no compelling reason to buy a comparable 8-core AMD processor for less money than a more expensive Intel quad core because it has slightly higher FPS in games. Go enjoy said Intel quad core and stop trying so hard.
> 
> Btw, you have 54 posts on this forum all told and every single one of them is either bashing Ryzen today or chumming it up over in the KL OC thread. Kinda gives a little insight into your perspective on Ryzen I think...


BTW, that's an argument fallacy. I unhid your post just to see the same stuff that got you blocked some time ago. Some people never change :/


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Making a living or not does not change the facts. Everyone lags on a quad core. You can continue to repeat yourself.


No, not everyone, as evidenced by the streamers I presented. It is true, though, that this is getting a little stale. I just don't know how else to tell you: not EVERYONE.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *redone13*
> 
> Just refuting misinformation, sir.


Gotta defend that 7700K to the death! You have the greatest CPU mankind has ever envisioned and no third-rate microchip company like AMD is ever going to get away with taking attention away from your "baby", right?

In all seriousness, I agree with you. Your 7700K is an amazing processor and the best gaming processor there is currently. Still doesn't make the Ryzen CPUs any less impressive.


----------



## budgetgamer120

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Gotta defend that 7700K to the death! You have the greatest CPU mankind has ever envisioned and no third rate microchip company like AMD is ever going to get away with taking attention away from your "baby", right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In all seriousness, I agree with you. Your 7700K is an amazing processor and the best gaming processor there is currently. Still doesn't make the Ryzen CPU's any less impressive.


Streaming while gaming is not just "gaming", though, and it is not even close to being the best at that. Intel's 5000- and 6000-series chips all beat it in that area.


----------



## naz2

Why are we talking about streaming?

For every streamer on Twitch there are a hundred others who don't stream. Could you cherry-pick a more irrelevant niche?


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Gotta defend that 7700K to the death! You have the greatest CPU mankind has ever envisioned and no third rate microchip company like AMD is ever going to get away with taking attention away from your "baby", right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In all seriousness, I agree with you. Your 7700K is an amazing processor and the best gaming processor there is currently. Still doesn't make the Ryzen CPU's any less impressive.


I concede that it has its own purpose. It will be good for the future of computing in general. But if someone makes a generalization that even people who have no idea what we are talking about can see is incorrect, what then? lol


----------



## budgetgamer120

Quote:


> Originally Posted by *naz2*
> 
> why are we talking about streaming?
> 
> for every streamer on twitch there's a hundred others who don't stream. can you cherry pick a more irrelevant niche


Might you suggest more niches for a 16-thread CPU? It is part of Ryzen's pitch.


----------



## blue1512

Quote:


> Originally Posted by *naz2*
> 
> why are we talking about streaming?
> 
> for every streamer on twitch there's a hundred others who don't stream. can you cherry pick a more irrelevant niche


Believe it or not, it was part of AMD's advertising for Ryzen.

@redone13 It's true you can stream a light game at sub-par quality with a quad-core i7, but that doesn't mean Ryzen isn't a much better option at a sweet price point for most streamers.


----------



## Majin SSJ Eric

I have loved every single one of my Intel chips back to my C2D in 2007. They are fantastic CPUs and have absolutely obliterated AMD for so long that I never even bothered to put together an AMD build. I'm just glad to see AMD release a product that finally makes a viable alternative to Intel, and I believe Ryzen does just that. I didn't expect it to be better than every Intel CPU at every task, but it has exceeded my expectations on every level except OCing.


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I have loved every single one of my Intel chips back to my C2D in 2007. They are fantastic CPU's and have absolutely obliterated AMD for so long that I never even bothered to put together an AMD build. I'm just glad to see AMD release a product that finally makes a viable alternative to Intel and I believe Ryzen does just that. I didn't expect it to be better than every Intel CPU at every task but it has exceeded my expectations on every level except OCing.


Competition is good. And more cores/threads will be the norm. But that's a will be.


----------



## Malinkadink

A lot of what we do here is generalize and deduce assumptions from the data available to us. @redone13 how can you be so naive as to say the 4C/8T i7s are fine when gaming + streaming? Joker's video shows the i7, at 5GHz no less, choking at 100% usage, around 6:48 I think. You're gonna tell me that playing that game and trying to stream you won't have stutter and dropped frames? Get real.


----------



## Xuper

Quote:


> Originally Posted by *redone13*
> 
> Competition is good. And more cores/threads will be the norm. But that's a will be.


hmm ok.


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> Competition is good. And more cores/threads will be the norm. But that's a will be.


But it's a close "will be".

I can see the 7700K running at 100% to maintain its lead over a Ryzen sitting at ~40%. Throw in some tasks like streaming and BOOM, who is ahead now?


----------



## redone13

Quote:


> Originally Posted by *Malinkadink*
> 
> A lot of what we do here is generalize and deduce assumptions from the data available to us. @redone13 how can you be so naive to say the 4C/8T i7s are fine when gaming + streaming? Jokers video shows how the i7 @ 5ghz no less chokes 100% usage around 6:48 i think. You're gonna tell me playing that game and trying to stream you wont have stutter and dropped frames? get real.


Please stop using Jok3r's example. Didn't you watch his video about variance? If you did, then we all need to wait. I am just using the data that is currently available.


----------



## redone13

Quote:


> Originally Posted by *blue1512*
> 
> But it's a close "will be".
> 
> I can see the 7700K running at 100% to maintain its lead over a Ryzen sitting at ~40%. Throw in some tasks like streaming and BOOM, who is ahead now?


Lol, of course the Ryzen.


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> Please, stop using Jok3rs example. Didn't you watch his video about variance? If you did, then we all need to wait. I just am using the current data that is available.


For the Ryzen numbers you are correct, we should wait. But for the 7700K there is no reason to wait; that number will not change.


----------



## azanimefan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Gotta defend that 7700K to the death! You have the greatest CPU mankind has ever envisioned and no third rate microchip company like AMD is ever going to get away with taking attention away from your "baby", right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In all seriousness, I agree with you. Your 7700K is an amazing processor and the best gaming processor there is currently. Still doesn't make the Ryzen CPU's any less impressive.


Not worth the argument.

He's here to troll. Why else would someone who just got an i7-7700K spend 3 out of 3 hours since joining this site in an AMD thread (and make over 50 posts in that thread, too)? Just ignore him.

Personally, I can't wait to drop this i5-4690K for an 8c/16t Ryzen CPU (probably going with the R7 1700, as it seems to overclock to the same clocks the 1700X and 1800X do, right around 3.9-4.1GHz).

And yes, I'm going to do that with full knowledge that I'll be moving laterally in gaming performance, mostly because this chip has ticked me off ever since I got it. I use too many cores/too much CPU for a quad; buying this chip was a mistake. In fact, I'm not interested in the i7-7700K in the least. All my monitors are 60Hz, and there isn't a game benched out there where Ryzen failed to hit at least that AT STOCK as a minimum frame rate; not that I game much.

I do like how the goalposts moved today, though. Ryzen performs within 5% of the i7-7700K (in the benches it loses) in pretty much every gaming bench out there, and it's a "HUGE FAILURE" even though the clock speed difference between the chips is much larger than 5%. I for one won't give chipzilla another dime.


----------



## Majin SSJ Eric

I'm excited to put a new Ryzen system together with a CHVI and some 3000MHz DDR4 later this year when the typical teething issues of a new platform have been largely resolved. I don't think it will be as fast as a 7700K system in gaming but it should be a nice upgrade from my 4930K system especially in the other stuff I do that isn't game-related. The low cost will also allow me to finally retire my OG Titans to the shelf and replace them with a couple of (hopefully) cheap 1080's after the 1080Ti releases. I could certainly build a faster Intel system but it would cost way too much money (don't want any more quad cores personally).


----------



## budgetgamer120

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I have loved every single one of my Intel chips back to my C2D in 2007. They are fantastic CPU's and have absolutely obliterated AMD for so long that I never even bothered to put together an AMD build. I'm just glad to see AMD release a product that finally makes a viable alternative to Intel and I believe Ryzen does just that. I didn't expect it to be better than every Intel CPU at every task but it has exceeded my expectations on every level except OCing.


What I like the most:

http://www.guru3d.com/articles_pages/amd_ryzen_7_1800x_processor_review,11.html


----------



## redone13

Quote:


> Originally Posted by *blue1512*
> 
> For Ryzen number you are correct, we should wait. But for 7700k there is no reason to wait, that number will not change.


We always have to wait with Ryzen for the futureeee. I jest, I jest. I know what you mean. And azanimefan, all we can use to make informed decisions is the data available to us. Everyone is saying wait for more results. Ok, we can wait to see what happens. But we have a little idea of what is going on.


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> We always have to wait with Ryzen for the futureeee. I jest, I jest. I know what you mean.


It is a new CPU on a new platform, mate. It will improve.

For 7700k? Nah, nothing to improve here.


----------



## Poncho87e

Has anyone broken 4+GHz with a manual OC and a higher vcore?


----------



## redone13

Quote:


> Originally Posted by *blue1512*
> 
> It is a new CPU on a new platform, mate. It will improve.
> 
> For 7700k? Nah, nothing to improve here.


That kind of relates to the dead-socket argument. Ryzen _could_ improve. If it does, it will benefit the consumer, and even if it doesn't, it is better for the consumer anyway. I can live in the today while keeping an eye on the future. And Ryzen's timing is close to right, but not exactly. Perhaps we can say it's getting the ball rolling.


----------



## Quantum Reality

@budgetgamer120

That list is one of several I've been wanting to see practically all day. I've been saying I want to see Ryzen compared to all the past generations of AMD back to the Phenom II X4 965, and that list is an excellent all-in-one summary of just how huge a difference Ryzen makes.

Phenom II X4 975 (closest proxy to the mainstream 965) - 13.77 fps HB
Various FX-8xxx series average about 22-23 fps HB
Ryzen 1800X - 47 fps

That is double the performance of Bulldozer through Excavator!

So even ignoring the Intel comparisons, I would say AMD has delivered on their promise to erase the bad memory that is Bulldozer, which itself was at best a sidegrade from the Phenom II X6.
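
The ratios behind "double the performance" check out; here is a quick computation from the fps figures quoted above (taking 22.5 as the midpoint of the 22-23 fps FX range, which is an assumption):

```python
# Generational speedups from the HandBrake-style fps figures quoted above.
phenom_ii_fps = 13.77  # Phenom II X4 975
fx_8xxx_fps = 22.5     # assumed midpoint of the quoted 22-23 fps range
ryzen_fps = 47.0       # Ryzen 7 1800X

print(f"1800X vs Phenom II X4: {ryzen_fps / phenom_ii_fps:.2f}x")
print(f"1800X vs FX-8xxx:      {ryzen_fps / fx_8xxx_fps:.2f}x")
```

So the 1800X lands a bit over 2x the FX chips and about 3.4x the Phenom II X4 in this workload.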


----------



## n4p0l3onic

Anyone know why Ryzen requires so much voltage to get to a 4GHz clock? I mean, my 5930K is stable at 4.3 with just 1.25V, while from what I've seen Ryzen needs 1.4V to reach 4.0, on a newer manufacturing process? Odd.


----------



## blue1512

Quote:


> Originally Posted by *redone13*
> 
> That kind of relates to the dead cpu socket argument. Ryzen _could_ improve. If it does, it will benefit the consumer and even if it doesn't, it is better for the consumer anyways. I can live in the today while keeping an eye on the future. And Ryzen is close to the right timing but not exactly.


There is no "if". The newly released Ryzen is capable of beating Intel's best 8-cores; its gaming performance will only get better.


----------



## redone13

Quote:


> Originally Posted by *blue1512*
> 
> There is no "if". The newly released Ryzen is capable of beating Intel's best 8-cores; its gaming performance will only get better.


Now, now. If there is anything we should have learned from all this, it's that we should not assume. They both have their purpose regardless.


----------



## azanimefan

Quote:


> Originally Posted by *Poncho87e*
> 
> Has anyone broken 4+GHZ on manual with a higher vcore?


Some reviewers have. One hit 4.1GHz, with an R7 1700 no less; his 1700X and 1800X both topped out at 4.0.

Probably dumb luck, but I'm treating this release like the Phenom II. It took a while for people to tease clocks out of those chips. When they came out it was a huge achievement if you could clock to 3.8GHz, but by the time Piledriver released most people could hit 4.2-4.4GHz or higher on the Phenom II, mostly thanks to better motherboards, BIOSes, RAM, and understanding of what you needed to do to squeeze those clocks out of the chip.

I expect that as the chipset and BIOS kinks are worked out, we'll see some respectable clocks out of these chips.


----------



## Quantum Reality

Quote:


> Originally Posted by *n4p0l3onic*
> 
> Anyone know why ryzen requires so much electricity to get to 4 ghz clock? I mean my 5930k is stable at 4.3 with just 1.25 volt, while from what I saw ryzen need 1.4 volt to reach 4? With newer manufacturing tech? Odd.


I saw a nice post about that somewhere; someone probably has a link.

Basically, AMD has already tuned Ryzen near the edge of its performance envelope, such that when you push its clock speed much above ~4 GHz you're in a region where it takes a lot more voltage (relatively speaking) to get modest speed gains.
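Editor's note: the voltage numbers thrown around in this thread make the point on their own if you plug them into the usual first-order rule that dynamic CPU power scales with frequency times voltage squared. A minimal sketch (the reference operating point of 3.7 GHz / 1.35 V is an assumption for illustration, not a measured figure):

```python
# Illustrative sketch of why the last few hundred MHz are so expensive.
# First-order model: dynamic power P ~ C * V^2 * f, so relative power at
# two operating points is (f2/f1) * (V2/V1)^2. Voltages are the rough
# figures quoted in this thread, not measurements.

def relative_dynamic_power(freq_ghz, vcore, ref_freq=3.7, ref_v=1.35):
    """Dynamic power relative to a reference operating point (P ~ V^2 * f)."""
    return (freq_ghz / ref_freq) * (vcore / ref_v) ** 2

# Stock-ish 3.7 GHz / 1.35 V vs. an overclock to 4.0 GHz / 1.40 V
stock = relative_dynamic_power(3.7, 1.35)
oc = relative_dynamic_power(4.0, 1.40)

print(f"stock: {stock:.2f}x, 4.0 GHz @ 1.40 V: {oc:.2f}x")
# An ~8% clock bump costs ~16% more dynamic power in this simple model,
# and the needed vcore itself climbs steeply past the knee of the curve.
```

This ignores static/leakage power, which also rises with voltage, so the real penalty is worse than the model suggests.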


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Poncho87e*
> 
> Has anyone broken 4+GHZ on manual with a higher vcore?


I think at this point we have to concede that ~4GHz is going to be the mean for the Ryzen 8-cores. Some will probably do 4.2-4.3GHz eventually (and some will fall short of even 4GHz) but the expectation going in should be that this is a set-it-and-forget it CPU of about 4GHz. Can't really compare to Intel's typical OCing of 4.5-5GHz because the chip simply can't do it. I'm fine with that personally because I have largely stopped even bothering to OC my 4930K and 2600K since they are old and not worth benching anymore.

If this was 4 years ago then I'd still be chucking my money at Intel for a 6950X and a pair of 1080Ti's to hit the benching section with but I'm older and (much) poorer these days so Ryzen absolutely hits the sweet spot for me. I can get at least comparable performance to Intel's best CPU's for a fraction of the cost and the only sacrifice I have to make is that it won't OC much and it won't get quite as many FPS in games as a 7700K at 5GHz. Considering I'm on 1440p/60Hz monitors, I can live with that.


----------



## lombardsoup

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think at this point we have to concede that ~4GHz is going to be the mean for the Ryzen 8-cores. Some will probably do 4.2-4.3GHz eventually (and some will fall short of even 4GHz) but the expectation going in should be that this is a set-it-and-forget it CPU of about 4GHz. Can't really compare to Intel's typical OCing of 4.5-5GHz because the chip simply can't do it. I'm fine with that personally because I have largely stopped even bothering to OC my 4930K and 2600K since they are old and not worth benching anymore.
> 
> If this was 4 years ago then I'd still be chucking my money at Intel for a 6950X and a pair of 1080Ti's to hit the benching section with but I'm older and (much) poorer these days so Ryzen absolutely hits the sweet spot for me. I can get at least comparable performance to Intel's best CPU's for a fraction of the cost and the only sacrifice I have to make is that it won't OC much and it won't get quite as many FPS in games as a 7700K at 5GHz. Considering I'm on 1440p/60Hz monitors, I can live with that.


For me, I can't be bothered with paying a premium for single threaded performance anymore. Games are increasingly able to take advantage of more cores, there's no question at this point.


----------



## chuy409

Quote:


> Originally Posted by *Quantum Reality*
> 
> @budgetgamer120
> 
> That list is one of several I've been wanting to see practically all day. I've been saying I want to see Ryzen compared to all past generations of AMD back to the Phenom II X4 965, and that list is an excellent all-in-one summary of just how huge a difference Ryzen makes.
> 
> Phenom II X4 975 (closest proxy to the mainstream 965) - 13.77 fps HB
> Various FX-8xxx series average about 22-23 fps HB
> Ryzen 1800X - 47 fps
> 
> That is double the performance of Bulldozer-Excavator!
> 
> So even ignoring the Intel comparisons, I would say AMD has delivered on their promise to erase the bad memory that is Bulldozer, which itself was at best a sidegrade from the Phenom II X6.


I think I love my Phenom 960T more than I love Ryzen, my 5820K, any other CPU, and myself. If someone had stolen my 960T and offered me either an R7 1800X or my 960T back, I would pick the 960T 10/10 times. It was the first CPU I built my first PC with, and I used it throughout high school. Not only that, but it unlocks to an X6 and it's still a beast.


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think at this point we have to concede that ~4GHz is going to be the mean for the Ryzen 8-cores. Some will probably do 4.2-4.3GHz eventually (and some will fall short of even 4GHz) but the expectation going in should be that this is a set-it-and-forget it CPU of about 4GHz. Can't really compare to Intel's typical OCing of 4.5-5GHz because the chip simply can't do it. I'm fine with that personally because I have largely stopped even bothering to OC my 4930K and 2600K since they are old and not worth benching anymore.
> 
> If this was 4 years ago then I'd still be chucking my money at Intel for a 6950X and a pair of 1080Ti's to hit the benching section with but I'm older and (much) poorer these days so Ryzen absolutely hits the sweet spot for me. I can get at least comparable performance to Intel's best CPU's for a fraction of the cost and the only sacrifice I have to make is that it won't OC much and it won't get quite as many FPS in games as a 7700K at 5GHz. Considering I'm on 1440p/60Hz monitors, I can live with that.


I like the future-proofing side of things too regarding the same socket. There's something to be said for high clock speed though. It's like high RPMs in a car (unless you're turbo'd). I suppose IPC can make up for it. Ideally we get both high core count and high clock speed.


----------



## Quantum Reality

Hmm, something I just noticed on the Guru3D review page:
Quote:


> Delidding Ryzen - somebody is bound to ask it hence I'll address it right here and now, but delidding the processor will not be possible for the simple reason that *there are sensors on the heatspreader. Delidding the processor would break it.*


I don't know how important those sensors are, but given that the Ryzen CPU seems to do a lot of self-monitoring for dynamic overclocking, it would be a bad idea to de-lid.


----------



## Majin SSJ Eric

KL is perfect in terms of OCing. It's the first Intel chip since SB to reliably hit 5GHz, and the performance is blistering. Imagine if you could combine the attributes of KL and Ryzen into one chip: $399, 8-core/16-thread, 5GHz, fastest IPC, most efficient, etc. It would be the perfect CPU.

Ideally I'd get a 1700X and two 1080's to replace my 4930K rig and a 7700K and an RX 480 to replace my 2600K rig.


----------



## Slink3Slyde

People keep posting the link to The Stilt's review, but I'm not sure everyone has read it.

https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/

The current iteration of Ryzen isn't going to clock much higher than reviewers are showing; as I understand it, it's a limitation of the low-power process they used. Hence the low TDP and power usage, but low clocks. I'm not convinced the lower-core-count chips are going to clock much higher either.

I'm on the fence at the moment about my own upgrade from a 3570K. Most reviews seem to show Ryzen as a little better for gaming than what I have (I have a poor-clocking chip anyway), but I'm still considering that its gaming performance might improve as the bugs are ironed out, and as time goes on those extra cores might come in handy. It's a great chip for some people and obviously a massive improvement. I wasn't expecting it to beat a 5GHz 7700K in most games at release, but I was expecting it to be perhaps 10% faster than it is in games. I'm probably going to wait and see for a while.


----------



## Poncho87e

Quote:


> Originally Posted by *lombardsoup*
> 
> For me, I can't be bothered with paying a premium for single threaded performance anymore. Games are increasingly able to take advantage of more cores, there's no question at this point.


Exactly, multi-core parallelism is the future. The Clock wars are a thing of the past.


----------



## redone13

Quote:


> Originally Posted by *Poncho87e*
> 
> Exactly, multi-core parallelism is the future. The Clock wars are a thing of the past.


The problem is, that eclipse isn't complete yet. More cores is probably the direction we'll end up going; however, higher clock speeds right now show superior performance for gamers, who probably outnumber the heavy multitaskers and people who need specialized applications. If gaming is a non-issue for you, or if Ryzen's performance improves, then all the better.
Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> KL is perfect in terms of OCing. It's the first Intel chip since SB to reliably hit 5GHz, and the performance is blistering. Imagine if you could combine the attributes of KL and Ryzen into one chip: $399, 8-core/16-thread, 5GHz, fastest IPC, most efficient, etc. It would be the perfect CPU.
> 
> Ideally I'd get a 1700X and two 1080's to replace my 4930K rig and a 7700K and an RX 480 to replace my 2600K rig.


That would be ideal Majin.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Slink3Slyde*
> 
> People keep posting the link to The Stilt's review, but I'm not sure everyone has read it.
> 
> https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/
> 
> The current iteration of Ryzen isn't going to clock much higher than reviewers are showing; as I understand it, it's a limitation of the low-power process they used. Hence the low TDP and power usage, but low clocks. I'm not convinced the lower-core-count chips are going to clock much higher either.
> 
> I'm on the fence at the moment about my own upgrade from a 3570K. Most reviews seem to show Ryzen as a little better for gaming than what I have (I have a poor-clocking chip anyway), but I'm still considering that its gaming performance might improve as the bugs are ironed out, and as time goes on those extra cores might come in handy. It's a great chip for some people and obviously a massive improvement. I wasn't expecting it to beat a 5GHz 7700K in most games at release, but I was expecting it to be perhaps 10% faster than it is in games. I'm probably going to wait and see for a while.


I think you're absolutely right. But 4GHz is good enough for me considering the 1700X is now actually in the mix with Intel's chips in gaming (as opposed to the last 7 years where AMD has been relegated to the bottom of every graph) and there's nothing stopping me from dropping in a new 2700X next year or the year after when clocks will probably improve. With a 7700K, that's pretty much it. 4-cores is all you're ever going to get on 1151.


----------



## fatmario

surprise 4790k still holding its spot


----------



## tp4tissue

Quote:


> Originally Posted by *budgetgamer120*
> 
> Can you tell me the reasons? Lol


It would seem your money has never depended on your computer.

For example: if you're a trader using automated tools and you miss a call, boom, you're down 20 thousand dollars. What's the cost of buying another computer next to that?

Or you're a programmer on a deadline. Your laptop crashes because conflicting gaming drivers auto-downloaded and your PC can't boot. You miss the deadline, the boss is pissed, and you're passed up for a promotion with a 10-thousand-dollar pay increase.

This is the real world, where people WORK.


----------



## TheReciever

Quote:


> Originally Posted by *redone13*
> 
> The problem is, the eclipse isn't quite perfect. More cores is the direction we will probably end up in, however, the higher clock speeds right now show a superior performance for gamers which probably out number the amount of people that are heavy multitaskers or require specialized applications. If gaming is a non issue or if the performance of Ryzen increases, then all the better.


I'm curious how well Ryzen can run 2 games at once. I often have an MMO up while I play an FPS.

My 2500K @ 4.8GHz couldn't do more than 1 game; at 2 it would just keel over.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *fatmario*
> 
> surprise 4790k still holding its spot


Why? It's still a helluva CPU.


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> KL is perfect in terms of OCing. It's the first Intel chip since SB to reliably hit 5GHz, and the performance is blistering. Imagine if you could combine the attributes of KL and Ryzen into one chip: $399, 8-core/16-thread, 5GHz, fastest IPC, most efficient, etc. It would be the perfect CPU.
> 
> Ideally I'd get a 1700X and two 1080's to replace my 4930K rig and a 7700K and an RX 480 to replace my 2600K rig.


Quote:


> Originally Posted by *TheReciever*
> 
> I'm curious how well Ryzen can run 2 games at once. I often have an MMO up while I play an FPS.
> 
> My 2500K @ 4.8GHz couldn't do more than 1 game; at 2 it would just keel over.


Lol, I used to multibox WoW. I don't think I maxed everything out though, and it was on one of the first P4s at 3.0GHz. Could be interesting.


----------



## TheReciever

Quote:


> Originally Posted by *redone13*
> 
> Lol, I used to multibox WoW. I don't think I maxed everything out though, and it was on one of the first P4s at 3.0GHz. Could be interesting.


Never played WoW.

For me it was Black Desert Online while I played Rainbow Six: Siege


----------



## tp4tissue

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think you're absolutely right. But 4GHz is good enough for me considering the 1700X is now actually in the mix with Intel's chips in gaming (as opposed to the last 7 years where AMD has been relegated to the bottom of every graph) and there's nothing stopping me from dropping in a new 2700X next year or the year after when clocks will probably improve. With a 7700K, that's pretty much it. 4-cores is all you're ever going to get on 1151.


There's simply no point to buy the AMD if you PLAY GAMES..

The aspiring YouTube streaming star is really the ONLY demographic that would be better served by Zen at the moment.

The other thing we're banking on is AMD fine wine. But having led software development, this is one of those "two years out" kind of things. Hahahaha.


----------



## naz2

Quote:


> Originally Posted by *Poncho87e*
> 
> Exactly, multi-core parallelism is the future. The Clock wars are a thing of the past.


It's about having the right technology at the right time. By the time 8-cores go mainstream we'll have a new generation of CPUs anyway. It's basically the RX 480 argument all over again.


----------



## redone13

Quote:


> Originally Posted by *TheReciever*
> 
> Never played WoW.
> 
> For me it was Black Desert Online while I played Rainbow Six: Siege


Yea, BDO is really nice looking. I'd imagine it'd be hard to pair with RS:S though; it's hard enough to get headshots without being distracted by anything else, let alone an MMO.


----------



## Poncho87e

Quote:


> Originally Posted by *azanimefan*
> 
> Some reviewers have. With an R7 1700, no less, he hit 4.1GHz; his 1700X and 1800X both topped out at 4.0.
> 
> Probably dumb luck, but I'm treating this release like the Phenom II. It took a while for people to tease clocks out of those chips. It was a huge achievement at launch if you could clock to 3.8GHz, but by the time Piledriver released most people could hit 4.2-4.4GHz or higher on the Phenom II, mostly thanks to better motherboards, BIOSes, RAM, and an understanding of what you needed to do to squeeze those clocks out of the chip.
> 
> I expect as the chipset and BIOS kinks are worked out we'll see some respectable clocks out of these chips.


I agree. I have an FX-8350 right now, and I remember that early in the 8000 series it was hard to get high clocks. I finally got mine past 5GHz stable, but it took a while and heavy-duty water cooling.

I think we can all agree that what matters is that the info we have so far shows the chips are great for productivity jobs and will game above 60fps. After all, all the ridiculous fps numbers mean nothing if your monitor can't display them; they are wasted frames. Most people, me included, have a 1080p 60Hz monitor, and the info we are getting says the chip should push that without breaking a sweat, be it the 1800X, the 1700X, or even the 1700.
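Editor's note: the "wasted frames" point above is just arithmetic, but it is worth making concrete. A toy sketch (this ignores variable refresh, tearing, and frame pacing, and assumes a plain fixed-refresh monitor):

```python
# Simple illustration: on a fixed-refresh monitor without VRR, frames
# rendered faster than the refresh rate are never actually displayed.

def displayed_fps(rendered_fps, refresh_hz):
    """Frames per second the monitor can actually show (no VRR assumed)."""
    return min(rendered_fps, refresh_hz)

for fps in (55, 90, 144):
    shown = displayed_fps(fps, 60)
    wasted = fps - shown
    print(f"rendered {fps} fps -> displayed {shown} fps ({wasted} wasted)")
```

So on a 60Hz panel the difference between a CPU that renders 90 fps and one that renders 144 fps is invisible, which is the argument for not over-weighting low-resolution benchmark gaps.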


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *tp4tissue*
> 
> *There's simply no point to buy the AMD if you PLAY GAMES..*
> 
> The aspiring YouTube streaming star is really the ONLY demographic that would be better served by Zen at the moment.
> 
> The other thing we're banking on is AMD fine wine. But having led software development, this is one of those "two years out" kind of things. Hahahaha.


That's a curious statement to make. So I suppose there is no reason to buy any processor other than the 7700K if you ever play games?


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's a curious statement to make. So I suppose there is no reason to buy any processor other than the 7700K if you ever play games?


Yea, that is a bit of an extreme way of putting it. Ryzen really does have a role; like you said, it could maybe multibox with ease. But it's not a good upgrade from many i7s unless you're purely after heavy number crunching.


----------



## TheReciever

Quote:


> Originally Posted by *redone13*
> 
> Yea, BDO is really nice looking. I'd imagine it'd be hard to pair with RS:S though; it's hard enough to get headshots without being distracted by anything else, let alone an MMO.


I'm good at what I do.

Now I just run two laptops since it's easier for now. It will probably stay that way, but Ryzen is looking good for a streaming/recording/Plex/storage desktop.


----------



## momonz

"There's simply no point to buy the AMD if you PLAY GAMES.."

A nonsense statement, like the other articles that say it's not for gaming.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think you're absolutely right. But 4GHz is good enough for me considering the 1700X is now actually in the mix with Intel's chips in gaming (as opposed to the last 7 years where AMD has been relegated to the bottom of every graph) and there's nothing stopping me from dropping in a new 2700X next year or the year after when clocks will probably improve. With a 7700K, that's pretty much it. 4-cores is all you're ever going to get on 1151.


Yea, buy a 1700 and a very nice mobo; it will be just fine for now, and when Zen+ is released, sell the old chip and drop in a new one. That's one option. I'm not going to go for a 7700K, as Intel will move to a new socket and I'll be stuck, and I'd rather have something more than a quad with hyperthreading. My motherboard is creaking as well, as it's seen a bit of abuse, which is pushing me to upgrade, but Coffee Lake is also in the back of my mind. A high-clocking 6-core with that higher IPC would also be a great option.

It's going to depend for me on how much of the lower game performance is down to optimisation. Since Ryzen does so well against Broadwell in productivity, I can imagine it improving as time goes on.


----------



## pony-tail

Ryzen has not turned out to be good enough (at its price point, AU$699 for the 1800X) for me to make the change.
Others will undoubtedly get upset at me, but I just don't find it worth the asking price; most of what I do will run as fast on a 6700K or even a 4790K (I have both). That said, I have very high hopes for their top-SKU APUs when they come to market. I have a feeling they will impress.
Hope so, anyway.


----------



## blue1512

Quote:


> Originally Posted by *momonz*
> 
> "There's simply no point to buy the AMD if you PLAY GAMES.."
> 
> Nonsense statement like other articles who says it's not for gaming.


I would say Ryzen is not for *>120Hz* gaming at this moment.

For people who prefer resolution over refresh rate, Ryzen is perfectly fine for gaming.


----------



## Poncho87e

Quote:


> Originally Posted by *Slink3Slyde*
> 
> People keep posting the link to The Stilt's review, but I'm not sure everyone has read it.
> 
> https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/
> 
> The current iteration of Ryzen isn't going to clock much higher than reviewers are showing; as I understand it, it's a limitation of the low-power process they used. Hence the low TDP and power usage, but low clocks. I'm not convinced the lower-core-count chips are going to clock much higher either.
> 
> I'm on the fence at the moment about my own upgrade from a 3570K. Most reviews seem to show Ryzen as a little better for gaming than what I have (I have a poor-clocking chip anyway), but I'm still considering that its gaming performance might improve as the bugs are ironed out, and as time goes on those extra cores might come in handy. It's a great chip for some people and obviously a massive improvement. I wasn't expecting it to beat a 5GHz 7700K in most games at release, but I was expecting it to be perhaps 10% faster than it is in games. I'm probably going to wait and see for a while.


Hopefully they will release a BIOS feature that lets us disable the power-saving features, like they did with the Polaris-series GPUs.


----------



## redone13

Quote:


> Originally Posted by *TheReciever*
> 
> I'm good at what I do.
> 
> Now I just run two laptops since it's easier for now. It will probably stay that way, but Ryzen is looking good for a streaming/recording/Plex/storage desktop.


Lol, I wouldn't dare doubt that. Agreed, there is much promise here, and everyone will benefit from more options and a little competition.


----------



## Majin SSJ Eric

CPU upgrades aren't like GPU upgrades. I can still easily get by with gaming and other tasks on my 2600K system from 2011 (but my then mighty 580 Lightnings stopped cutting it years ago). Certainly I don't NEED to upgrade my 4930K to Ryzen, but it will provide a performance increase over my IB-E setup in nearly every scenario, and for a very manageable cost. Being a brand new platform that will be around for a long time as well as offering 2 more cores and 4 more threads is just the icing on the cake.


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> CPU upgrades aren't like GPU upgrades. I can still easily get by with gaming and other tasks on my 2600K system from 2011 (but my then mighty 580 Lightnings stopped cutting it years ago). Certainly I don't NEED to upgrade my 4930K to Ryzen, but it will provide a performance increase over my IB-E setup in nearly every scenario, and for a very manageable cost. Being a brand new platform that will be around for a long time as well as offering 2 more cores and 4 more threads is just the icing on the cake.


We can probably blame things on the hype that Intel was supposedly finished due to Ryzen's release. It will hopefully provide a fine platform from this point onwards, but it won't entice a large number of people, for the reasons many are stating.


----------



## Oubadah

..


----------



## Poncho87e

Good ol' AMD Fine Wine Technology. They think ahead. As multi-core-aware programs become mainstream, as they already are, added to the increasing parallelism in GPUs, we will hopefully see performance only increase from here. My FX-8350 @ 5GHz still charges ahead just fine. Just remember, everyone: this is a whole new architecture. Sit back, relax, and be patient while the programming world catches up.


----------



## budgetgamer120

Quote:


> Originally Posted by *tp4tissue*
> 
> It would seem your money has never depended on your computer.
> 
> For example: if you're a trader using automated tools and you miss a call, boom, you're down 20 thousand dollars. What's the cost of buying another computer next to that?
> 
> Or you're a programmer on a deadline. Your laptop crashes because conflicting gaming drivers auto-downloaded and your PC can't boot. You miss the deadline, the boss is pissed, and you're passed up for a promotion with a 10-thousand-dollar pay increase.
> 
> This is the real world, where people WORK.


I couldn't make sense of anything you said. I've yet to have any of those happen to me.

Quote:


> Originally Posted by *naz2*
> 
> It's about having the right technology at the right time. By the time 8-cores go mainstream we'll have a new generation of CPUs anyway. It's basically the RX 480 argument all over again.


What was wrong with the RX 480?
Quote:


> Originally Posted by *redone13*
> 
> Yea, that is a bit of an extreme way of putting it. The Ryzen really does have a role, even like you said it could maybe multibox with ease, but it is not a good upgrade for many i7s unless you are purely looking for heavy number crunching.


Is Ryzen a bad gamer? I keep hearing this talk about not buying Ryzen if you game.... I was not aware Ryzen owners have a hard time gaming.

Quote:


> Originally Posted by *redone13*
> 
> We can probably blame things on the hype that Intel was supposedly finished due to Ryzen's release. It will hopefully provide a fine platform from this point onwards, but it won't entice a large number of people, for the reasons many are stating.


There is no probably to it. Ryzen is a fine platform.

Just sorry I did not wait for it.

Besides, AMD never targeted Ryzen 7 at the gamer demographic. For its target audience it is perfect: a workstation PC at consumer prices.

Mission accomplished.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Poncho87e*
> 
> Hopefully they will release a BIOS feature that lets us disable the power-saving features, like they did with the Polaris-series GPUs.


It's not a power-saving feature; it's inherent in the way the chip is manufactured. It just hits a steep voltage slope over the late-3GHz range. I'd love to be proved wrong, but I'm pretty sure improvements are going to come from software using the clocks it can achieve more efficiently, and possibly over time from games using more cores better. Outside of gaming it's a fantastic chip; in gaming it's just OK based on the results seen so far.


----------



## budgetgamer120

Quote:


> Originally Posted by *Oubadah*
> 
> Not sure if it was a typo, but 2500K + 1070 is CPU-limited. CPU-limited benches would have been potentially useful to someone who'd ended up with that combo.
> 
> If someone was going to buy a CPU in 2011, they could have looked at something like this:
> 
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/44339-intel-core-i3-2120-core-i5-2400-lga1155-processors-review-12.html
> 
> And concluded that in Crysis, a i3-2120 is more or less equivalent to a i7-2600K. That is a terribly misleading and unhelpful benchmark because:
> 
> A) It's taken in a heavily GPU-limited scenario (ice map). There are parts of Crysis (eg. shipyard), where even at [email protected] High on a GTX 580 (the benchmark config), the game is going to be CPU bound and the framerate won't be anywhere near 50fps. In those parts the 2600K would have made a significant difference.
> 
> B) If they were to upgrade the GTX 580 to, say, a GTX 1070 or 970, the 2600K would have been far less limiting after that upgrade.
> 
> Ideally the reviewer should have run their bench in heavy combat in the shipyard, but maybe they didn't know about it. The second best option was to lower the resolution so that the benchmark would be CPU bound in the ice map, then at least the reader would have got some sense that the 2120 and 2600K are not equal in Crysis.
> They might, but I wouldn't be buying hardware now based on an assumption that all future games will use highly threaded engines. Another implied assumption seems to be that people only play new games. Some people still play older games with older, poorly threaded engines. It's only in the last year or so that we've started to see big movement on that front.
> 
> I'm not so sure that enthusiasts are necessarily less CPU-limited either. If your budget is high, it's easier to end up with a CPU bottleneck that you can't fix just by throwing more money at it. It's not like you can put two CPUs in a system (well, you could, but it wouldn't scale the same way). When I had my GTX 780 SLI build I was coming up against CPU bottlenecks regularly. Of course, if I'd thrown a 4K monitor into the mix it might have swung back in the other direction. That's why it's hard to make any sweeping statements.


Agreed.
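Editor's note: Oubadah's argument about GPU-limited benchmarks hiding CPU differences can be sketched with a toy frame-time model, where the slower of the CPU and GPU per-frame cost sets the framerate. All the millisecond numbers below are made up for illustration:

```python
# Toy frame-time model: with CPU and GPU work overlapped, framerate is
# bound by whichever takes longer per frame. At high settings (big GPU
# cost) two very different CPUs produce identical results; shrink the
# GPU cost (lower resolution) and the CPU gap appears.

def fps(cpu_ms, gpu_ms):
    """Framerate when per-frame CPU and GPU work overlap (bound by the slower)."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound at high settings: a fast CPU (8 ms) and a slow CPU (19 ms)
# both report the same framerate
print(fps(cpu_ms=8.0, gpu_ms=20.0), fps(cpu_ms=19.0, gpu_ms=20.0))   # both 50.0
# Drop the resolution (GPU cost shrinks) and the gap shows up
print(fps(cpu_ms=8.0, gpu_ms=5.0), fps(cpu_ms=19.0, gpu_ms=5.0))     # 125.0 vs ~52.6
```

This is why low-resolution benchmarks, for all the mockery they get, are the ones that predict how a CPU will hold up after a future GPU upgrade.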


----------



## umeng2002

Where are the 64-player online BF1 tests?


----------



## GorillaSceptre

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's a curious statement to make. So I suppose there is no reason to buy any processor other than the 7700K if you ever play games?


No, if you're on a modern i7 with an overclock, there's no reason to upgrade to anything if you only play games. Funny how certain individuals only highlight Ryzen.

Broadwell-E+ IPC with 8 cores, cheaper than Intel's 4-core. What a "fail".


----------



## Poncho87e

Quote:


> Originally Posted by *Slink3Slyde*
> 
> It's not a power-saving feature; it's inherent in the way the chip is manufactured. It just hits a steep voltage slope over the late-3GHz range. I'd love to be proved wrong, but I'm pretty sure improvements are going to come from software using the clocks it can achieve more efficiently, and possibly over time from games using more cores better. Outside of gaming it's a fantastic chip; in gaming it's just OK based on the results seen so far.


Ahhh okay. I didn't realize it was a manufacturing/architecture limit. Good to know. And I agree, you're right about the rest of what you said. As I said earlier, most people game at 1080p @60Hz. Which from what I've seen it can crunch that no problem.


----------



## 2010rig

I haven't read any of this thread *yet*, but what's the verdict, guys?

Both pro and against?

I've only been able to glance at the HWC review, and based on what I saw so far, it's on par with or faster than the 6900K, not as fast in single-thread, and not a good overclocker?

My early conclusion is: so what? All of that can be improved in the next iteration, and they made a huge leap forward. That 1800X is way better suited to my needs.


----------



## Poncho87e

Quote:


> Originally Posted by *blackhole2013*
> 
> I'm just happy that finally there's a reason for Intel to make a processor that blows away Ryzen instead of making tiny improvements, and I will go buy it.


Then we would be back to square one. No competition. Competition is a good thing. I don't want either company to "Blow the other one away!" That's ridiculous. I want both companies to do well, and stay competitive. As we all should.


----------



## TheReciever

Quote:


> Originally Posted by *2010rig*
> 
> I haven't read any of this thread *yet*, but what's the verdict guys?
> 
> Both pro and against?
> 
> I've only been able to glance at the HWC review, and based on what I saw so far, it's on par with or faster than the 6900K, not as fast in single-thread, and not a good overclocker?
> 
> My early conclusion is so what? All of that can be improved for the next iteration, and they made a huge leap forward...


SMT needs fixing, Windows scheduling needs fixing, and the high-speed RAM bug needs fixing (and will likely be number 1).

Tune in next month.


----------



## redone13

Quote:


> Originally Posted by *2010rig*
> 
> I haven't read any of this thread *yet*, but what's the verdict guys?
> 
> Both pro and against?
> 
> I've only been able to glance at the HWC review, and based on what I saw so far, it's on par or faster than the 6900K,not as fast in Single Thread, and not a good overclocker?
> 
> My early conclusion is so what? All of that can be improved for the next iteration, and they made a huge leap forward...


The consensus is that it's a plus overall for competition, price, and the advancement of technology. Also part of the consensus: it needs more time, along with many of the fixes you and TheReciever mentioned.


----------



## Poncho87e

Quote:


> Originally Posted by *2010rig*
> 
> I haven't read any of this thread *yet*, but what's the verdict guys?
> 
> Both pro and against?
> 
> I've only been able to glance at the HWC review, and based on what I saw so far, it's on par or faster than the 6900K, not as fast in Single Thread, and not a good overclocker?
> 
> My early conclusion is so what? All of that can be improved for the next iteration, and they made a huge leap forward...


Agreed. Good chip, huge leap forward from their last architecture and a good future ecosystem to build on.


----------



## 2010rig

Quote:


> Originally Posted by *TheReciever*
> 
> SMT needs fixing, Windows scheduling needs fixing, and the high-speed RAM bug needs fixing (that last one will likely be number 1).
> 
> Tune in next month.


Needs fixing as in we don't need to wait for V2?

Remember when Sandy Bridge came out, not as many kinks, but they can be ironed out


----------



## blue1512

Quote:


> Originally Posted by *2010rig*
> 
> Needs fixing as in we don't need to wait for V2?
> 
> Remember when Sandy Bridge came out, not as many kinks, but they can be ironed out


Apart from SMT, I don't think we need to wait for V2. SMT is a strange case: it provides better multi-thread scaling than Intel's HT, but hurts single-thread performance more.


----------



## TheReciever

Quote:


> Originally Posted by *2010rig*
> 
> Needs fixing as in we don't need to wait for V2?
> 
> Remember when Sandy Bridge came out, not as many kinks, but they can be ironed out


From what I understand, SMT is unlikely to be "fixed", but the other two points are likely to be ironed out without the need for a V2.


----------



## 2010rig

Hey, even I don't expect it to be perfect, which reviews cover the SMT issues?

Thanks!


----------



## Xuper

Quote:


> Originally Posted by *2010rig*
> 
> Hey, even I don't expect it to be perfect, which reviews cover the SMT issues?
> 
> Thanks!


http://www.hardware.fr/articles/956-17/jeux-3d-project-cars-f1-2016.html


----------



## TheReciever

Quote:


> Originally Posted by *2010rig*
> 
> Hey, even I don't expect it to be perfect, which reviews cover the SMT issues?
> 
> Thanks!


That I couldn't tell you; I've just been reading through the thread while working on my uni stuff.


----------



## redone13

Damn, lol. I bet it did a pretty good job as some side entertainment. I didn't realize how many hours passed by before some type of consensus was reached.


----------



## 2010rig

Hey, after a whole day since launch and 1,500 posts, I figured some clear conclusions would have been reached. I'm impressed; am I crazy for still wanting it? For my uses it's pretty great: just run games with SMT off, like we've always done with Intel. It's not like it's that far behind.

I wouldn't buy it Day 1, though; I'll give it a couple of months to see how things pan out and to learn more about the issues...


----------



## jasjeet

Quote:


> Originally Posted by *2010rig*
> 
> Hey, after a whole day launch and 1500 posts I figured some clear conclusions would be reached. I'm impressed and am I crazy for still wanting it? For my uses, it's pretty great, just run games with SMT OFF like we've always done with Intel, it's not like it's that far behind.
> 
> I wouldn't Day 1 buy it tho, will give it a couple months to see how things pan out, and learn more about the issues...


Nobody disables HT on Intel these days since it's actually being utilised by games quite well.


----------



## 2010rig

Quote:


> Originally Posted by *jasjeet*
> 
> Nobody disables HT on Intel these days since it's actually being utilised by games quite well.


I haven't really been gaming lately.

I guess I was willing to overlook that. What kind of boosts are games seeing with HT on?

What I care about most is rendering, and those benchmarks are delicious.


----------



## Oubadah

..


----------



## budgetgamer120

Quote:


> Originally Posted by *2010rig*
> 
> Hey, after a whole day launch and 1500 posts I figured some clear conclusions would be reached. I'm impressed and am I crazy for still wanting it? For my uses, it's pretty great, just run games with SMT OFF like we've always done with Intel, it's not like it's that far behind.
> 
> I wouldn't Day 1 buy it tho, will give it a couple months to see how things pan out, and learn more about the issues...


No need to run games with SMT off. Your first post on Ryzen is correct. Equal to or greater than 6900k. Not up there in ST when compared to 7700k.

Verdict: a CPU that is great at everything one could use a CPU for, including gaming.


----------



## DaaQ

Not sure if it's been posted here yet. Ryzen Master overclocking guide. @ryan92084


----------



## PsYcHo29388

I had a feeling Ryzen wouldn't come out on top in gaming performance; it's part of the reason I went ahead and switched to Intel last year. I just wasn't willing to wait that long for the possibility of an amazing product, and I'm sure plenty of people between then and now felt the same way.

With that said, I still think Ryzen is a very good choice for a dedicated workstation, and perhaps even for 4K gaming, where the CPU won't play too much of a factor.


----------



## Newbie2009

Well, the OC is a little disappointing. But 8 cores would be overkill for me; I might pick up a 1600X. Good job, AMD! Finally.


----------



## Oubadah

..


----------



## budgetgamer120

These comments are hilarious, lol. Ryzen is only good at 4K, where the CPU won't play too much of a factor? LOL.


----------



## comagnum

I'm considering dumping my Skylake setup and going for the 1700. Have there been any B350 reviews yet?


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> These comments are hilarious lol. Ryzen is only good at 4k where CPU wont play too much of a factor LOL.


If I am understanding Oubadah correctly, we need benches that are CPU-bound more than anything. If a bench is GPU-bound, then we learn nothing, and the truth of the matter is that 4K is rather GPU-bound at the moment. To make it not GPU-bound, we'd need a 1080 Ti, as even a 1080 struggles. This is taken from post 1561. Oubadah, feel free to correct me if I'm wrong, as I had to think about it for a bit to type this out.
Quote:


> Originally Posted by *Oubadah*
> 
> Not sure if it was a typo, but 2500K+1070 is CPU limited. CPU limited benches would have been potentially useful to someone who'd ended up with that combo.
> 
> If someone was going to buy a CPU in 2011, they could have looked at something like this:
> 
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/44339-intel-core-i3-2120-core-i5-2400-lga1155-processors-review-12.html
> 
> And concluded that in Crysis, a i3-2120 is more or less equivalent to a i7-2600K. That is a terribly misleading and unhelpful benchmark because:
> 
> A) It's taken in a heavily GPU-limited scenario (ice map). There are parts of Crysis (eg. shipyard), where even at [email protected] High on a GTX 580 (the benchmark config), the game is going to be CPU bound and the framerate won't be anywhere near 50fps. In those parts the 2600K would have made a significant difference.
> 
> B) If they were to upgrade the GTX 580 to, say, a GTX 1070 or 970, the 2600K would have been far less limiting after that upgrade.
> 
> Ideally the reviewer should have run their bench in heavy combat in the shipyard, but maybe they didn't know about it. The second best option was to lower the resolution so that the benchmark would be CPU bound in the ice map, then at least the reader would have got some sense that the 2120 and 2600K are not equal in Crysis.
> They might, but I wouldn't be buying hardware now based on an assumption that all future games will be using highly threaded engines. Another implied assumption seems to be that people are only going to be playing new games. Some people still play older games with older, poorly threaded engines. It's only been in the last year or so that we've started to see big movement on that front.
> 
> I'm not so sure that enthusiasts are necessarily likely to be less CPU limited either. If your budget is high, then it's easier to end up with a CPU bottleneck that you can't fix just by throwing more money at it. It's not like you can put two CPUs in a system (well, you could, but it wouldn't scale the same way). When I had my GTX 780 SLI build I was coming up against CPU bottlenecks regularly. Of course if I'd thrown a 4K monitor into the mix it might have swung back in the other direction. That's why it's hard to make any sweeping statements.
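The CPU-bound vs. GPU-bound reasoning in the quote above can be sketched as a toy frame-time model. Everything here is illustrative: the function and all the millisecond figures are hypothetical, not taken from any review.

```python
# Illustrative sketch (not from any review): a frame is ready only when both
# the CPU and GPU have finished their work, so the slower of the two sets
# the frame rate. All numbers below are made-up assumptions.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Estimated FPS when CPU and GPU work largely in parallel."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical CPUs: a slow one (10 ms/frame) and a fast one (6 ms/frame).
slow_cpu, fast_cpu = 10.0, 6.0

# GPU-bound scene (GPU needs 20 ms per frame): both CPUs look identical.
assert fps(slow_cpu, 20.0) == fps(fast_cpu, 20.0) == 50.0

# CPU-bound scene (GPU needs only 4 ms): the faster CPU pulls far ahead.
print(fps(slow_cpu, 4.0))  # 100 fps
print(fps(fast_cpu, 4.0))  # ~167 fps
```

This matches the point being made: a GPU-bound benchmark makes very different CPUs look identical, and the gap only appears once the GPU stops being the limiter (lower resolution, heavier scene, or a faster GPU later on).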


----------



## ducegt

GPU-bottlenecked workloads won't hide the difference from the 7700K for too long. A low-resolution delta of 30 fps, and only 3 fps at 4K, for a certain game engine means Intel will perform better in the future. When the next engine version is released with, say, more advanced physics, the 4K delta will increase. Is 10 fps at 4K not a big deal? I used to bench HL2 with video rendering disabled to show the true power of the Athlon 64 after its release. Many were confused, as it initially didn't change the user experience much from the P4, but guess which chip stayed relevant longer? If you mostly game, the 7700K is better today and will be in the future, even if you game at 4K.


----------



## TheReciever

It's a 7700k and 6950x fire sale in Korea right now, lol.

Dropping 6950x's for the 1800x doesn't make much sense to me, and I don't even like Intel all that much right now.


----------



## B NEGATIVE

Quote:


> Originally Posted by *TheReciever*
> 
> It's a 7700k and 6950x fire sale in Korea right now lol
> 
> Dropping 6950x's for 1800x doesn't make much sense to me and I don't even like Intel all that much right now


Agreed.

I'm keeping my 6900k.


----------



## comagnum

Quote:


> Originally Posted by *TheReciever*
> 
> It's a 7700k and 6950x fire sale in Korea right now lol
> 
> Dropping 6950x's for 1800x doesn't make much sense to me and I don't even like Intel all that much right now


Depends on what the use is. Being in Korea, they may want to take advantage of the multiboxing/streaming capabilities Ryzen offers for cheap. The 1700 is a better multitasker than the 7700k, and it won't hurt performance in MOBAs and similar games.


----------



## budgetgamer120

Quote:


> Originally Posted by *TheReciever*
> 
> It's a 7700k and 6950x fire sale in Korea right now lol
> 
> Dropping 6950x's for 1800x doesn't make much sense to me and I don't even like Intel all that much right now


It only makes sense if you can sell the 6950x. That's a lot of money back in your pocket.


----------



## redone13

Quote:


> Originally Posted by *ducegt*
> 
> GPU bottlenecked workloads won't hide the difference with the 7700K for too long. A low resolution Delta of 30fps, and only 3 fps at 4K for a certain game engine means Intel will perform better in the future. When the next engine version is released with say more advanced physics, the 4K delta will increase. Is 10FPS at 4K not a big deal? I used to bench HL2 with video rendering disabled to show the true power of the Athlon64 after its release. Many were confused as it initially didn't change the user experience too much from the P4, but guess which chip was relevant longer? If you mostly game, 7700K is better today and will be in the future. Even if you game in 4K.


Hmm, hello, duce. That is a very interesting point. I am eager to get absolutely all the data, but the last bunch of pages of this thread have kept me entertained in the meantime, if you caught any of it, lol.


----------



## TheReciever

Maybe just trading equity?

Who knows, but it's funny to see in the Korean forums.


----------



## 2010rig

Ok so... Why go for the 1800X over the 1700X?

Surely the 1700X can be OC'd to close to 4 GHz? It's disappointing that it doesn't clock higher, so why spend the extra $100?


----------



## Asy

For people who want a 4 GHz boost clock out of the box, I'd assume.


----------



## budgetgamer120

Quote:


> Originally Posted by *2010rig*
> 
> Ok so... Why go for the 1800X over the 1700X?
> 
> Surely the 1700X can be OC'd to close to 4 GHz? It's disappointing it doesn't clock higher and why spend that extra $100?


No reason.


----------



## S.M.

Quote:


> Originally Posted by *2010rig*
> 
> Ok so... Why go for the 1800X over the 1700X?
> 
> Surely the 1700X can be OC'd to close to 4 GHz? It's disappointing it doesn't clock higher and why spend that extra $100?


Why go for the 1700X over the 1700?

XFR isn't worth $70 imo.


----------



## Artikbot

Lots of people don't overclock, and XFR potentially brings an extra 20% burst performance, provided cooling is adequate. Plenty worth it for those people.


----------



## pony-tail

I am a mechanic, not an engineer, so this might be a dumb question, but is there any chance they could fix the issues with another stepping?


----------



## Artikbot

Mostly everything can be fixed either at the OS level (scheduling) or in the BIOS (RAM incompatibilities, etc.).

Not OCing, though. That's purely silicon.


----------



## czin125

Quote:


> Originally Posted by *blackhole2013*
> 
> I'm just happy that finally there's a reason for intel to make a processor to blow away ryzen instead of tiny improvements and i will go buy it .


That's still 5 months away for Skylake-X and 9 months away for Coffee Lake / 3D stacking.


----------



## Majin SSJ Eric

Guys, the "disappointing" gaming performance is right on par with the X99 chips from Intel. Can we stop calling it a turd for gaming? If you did the Pepsi challenge between a 5960X and the 1700X you would not be able to tell the difference, so saying "Ryzen can't play games" is the same thing as saying none of the Broadwell-E chips can play games either...


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Guys, the "disappointing" gaming performance is right on par with the X99 chips from Intel. Can we stop calling it a turd for gaming? If you did the Pepsi challenge between a 5960X and the 1700X you would not be able to tell the difference, so saying "Ryzen can't play games" is the same thing as saying none of the Broadwell-E chips can play games either...


For someone criticizing me for the number of posts I've accumulated in this thread on account of supporting Intel, you show no signs of stopping on behalf of AMD.

OK, you had that one coming. Anyway, yes, games can be played on Ryzen. To what extent, perhaps we don't fully know without the 4K benchmarks Oubadah has outlined. The 1080p benches are fine, considering all the multitasking and number crunching one can do as well.


----------



## boot318

Quote:


> Originally Posted by *redone13*
> 
> *For you criticizing me for the number of my posts in this thread* accumulating on the account of supporting intel, you show no signs of stopping on behalf of AMD.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OK, you had that one coming. Games can be played on Ryzen. To what extent, perhaps we don't fully know without the 4k benchmarks that Oubadah has outlined. The 1080p benches are fine considering all the multitasking and number crunching one can do as well.


You only have 78 posts in 15-ish months. I'm not sure why anyone would be critical of you.


----------



## redone13

Quote:


> Originally Posted by *boot318*
> 
> You only have 78 posts in 15-ish months. I'm not sure why anyone would be critical of you.


Hmm, perhaps because an interesting new piece of hardware was released and some possibly objective data came out about it?


----------



## The-Beast

Quote:


> Originally Posted by *redone13*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ducegt*
> 
> GPU bottlenecked workloads won't hide the difference with the 7700K for too long. A low resolution Delta of 30fps, and only 3 fps at 4K for a certain game engine means Intel will perform better in the future. When the next engine version is released with say more advanced physics, the 4K delta will increase. Is 10FPS at 4K not a big deal? I used to bench HL2 with video rendering disabled to show the true power of the Athlon64 after its release. Many were confused as it initially didn't change the user experience too much from the P4, but guess which chip was relevant longer? If you mostly game, 7700K is better today and will be in the future. Even if you game in 4K.
> 
> 
> 
> Hmm, hello duce. That is a very interesting point. I am eager to get absolutely all the data but the last bunch of pages of this thread have kept me entertained in the meantime if you caught any of it lol.

No, it's not an interesting point. It's an incredibly simplistic point that completely ignores the changing landscape of the technology. It's an argument based on pure and utter ignorance, treating IPC as the end-all-be-all when that argument died in 2006 for forward-thinking application needs. It died when the C2D completely succumbed to the changing needs of its users and left them wanting for performance before its life cycle was done.


----------



## Kuivamaa

Quote:


> Originally Posted by *Oubadah*
> 
> Not sure if it was a typo, but 2500K+1070 is CPU limited. CPU limited benches would have been potentially useful to someone who'd ended up with that combo.
> 
> If someone was going to buy a CPU in 2011, they could have looked at something like this:
> 
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/44339-intel-core-i3-2120-core-i5-2400-lga1155-processors-review-12.html
> 
> And concluded that in Crysis, a i3-2120 is more or less equivalent to a i7-2600K. That is a terribly misleading and unhelpful benchmark because:
> 
> A) It's taken in a heavily GPU-limited scenario (ice map). There are parts of Crysis (eg. shipyard), where even at [email protected] High on a GTX 580 (the benchmark config), the game is going to be CPU bound and the framerate won't be anywhere near 50fps. In those parts the 2600K would have made a significant difference.
> 
> B) If they were to upgrade the GTX 580 to, say, a GTX 1070 or 970, the 2600K would have been far less limiting after that upgrade.
> 
> Ideally the reviewer should have run their bench in heavy combat in the shipyard, but maybe they didn't know about it. The second best option was to lower the resolution so that the benchmark would be CPU bound in the ice map, then at least the reader would have got some sense that the 2120 and 2600K are not equal in Crysis.
> They might, but I wouldn't be buying hardware now based on an assumption that all future games will be using highly threaded engines. Another implied assumption seems to be that people are only going to be playing new games. Some people still play older games with older, poorly threaded engines. It's only been in the last year or so that we've started to see big movement on that front.
> 
> I'm not so sure that enthusiasts are necessarily likely to be less CPU limited either. If your budget is high, then it's easier to end up with a CPU bottleneck that you can't fix just by throwing more money at it. It's not like you can put two CPUs in a system (well, you could, but it wouldn't scale the same way). When I had my GTX 780 SLI build I was coming up against CPU bottlenecks regularly. Of course if I'd thrown a 4K monitor into the mix it might have swung back in the other direction. That's why it's hard to make any sweeping statements.


Are you sure about all that? Crysis 1 is a game that uses two cores max, after all, and the i3-2120 and i7-2600k are both Sandy. I doubt they would perform all that differently, even in CPU-heavy areas, and even when both are paired with a Titan XP in that game, just as the 2011 bench insinuates.
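Kuivamaa's point about a two-core game can be sketched with a toy scaling model. All numbers here are hypothetical; the idea is just that if the engine spawns only two heavy threads, core count beyond two contributes nothing, so two same-architecture chips at similar clocks land in the same place.

```python
# Toy model of why extra cores don't help a 2-thread game (numbers made up).
# Per-frame CPU work is fixed; only as many cores as the game has heavy
# threads can share it, assuming an idealized perfect split.

def frame_time_ms(work_ms: float, game_threads: int, cores: int) -> float:
    usable = min(game_threads, cores)
    return work_ms / usable

WORK = 20.0  # hypothetical per-frame CPU work in ms

# A 2-core i3-class chip and a 4-core i7-class chip, same IPC and clock:
# the 2-thread game runs identically on both.
assert frame_time_ms(WORK, game_threads=2, cores=2) == \
       frame_time_ms(WORK, game_threads=2, cores=4)

# Extra cores only pay off once the engine spawns more heavy threads.
assert frame_time_ms(WORK, game_threads=8, cores=4) < \
       frame_time_ms(WORK, game_threads=2, cores=4)
```

Of course real games are messier (one main thread usually dominates, and the split is never perfect), so treat this as the shape of the argument, not a prediction.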


----------



## randomizer

I may be interested in Ryzen when it's been given more time in the oven.


----------



## dragneel

Call me crazy, but if I only cared about gaming perf I wouldn't even bother wasting my money upgrading from my 2500k. In fact, I didn't: the 6700k already offered a sizeable gaming boost, but my 2500k still plays my games just fine.

What I really wanted was high MT performance for a reasonable cost, and AMD delivered it spectacularly. The extra ST over my 2500k is just icing on the cake.


----------



## TheReciever

Quote:


> Originally Posted by *boot318*
> 
> You only have 78 posts in 15-ish months. I'm not sure why anyone would be critical of you.


It's an argument fallacy and detracts from the subject matter.


----------



## ducegt

Quote:


> Originally Posted by *TheReciever*
> 
> It's an argument fallacy and detracts from the subject matter.


Known as the ad hominem fallacy.


----------



## Newbie2009

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Guys, the "disappointing" gaming performance is right on par with the X99 chips from Intel. Can we stop calling it a turd for gaming? If you did the Pepsi challenge between a 5960X and the 1700X you would not be able to tell the difference, so saying "Ryzen can't play games" is the same thing as saying none of the Broadwell-E chips can play games either...


Fanboys will always have something to moan about.

Quote:


> Originally Posted by *redone13*
> 
> For you criticizing me for the number of my posts in this thread accumulating on the account of supporting intel, you show no signs of stopping on behalf of AMD.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OK, you had that one coming. Anyways, yes, games can be played on Ryzen. To what extent, perhaps we don't fully know without the 4k benchmarks that Oubadah has outlined. The 1080p benches are fine considering all the multitasking and number crunching one can do as well.


Can it run Crysis?


----------



## 2010rig

Quote:


> Originally Posted by *S.M.*
> 
> Why go for the 1700X over the 1700?
> 
> XFR isn't worth $70 imo.


Well, yeah, I shouldn't have added the X. $330 vs. $500 ain't worth it for people like us who will overclock it to its limits.


----------



## Newbie2009

Quote:


> Originally Posted by *2010rig*
> 
> Well yea, I shouldn't have added the X, $330 vs $500 ain't worth it for people like us who will overclock it to its limits


It does seem the real star of the show is the 1700.


----------



## redone13

Quote:


> Originally Posted by *TheReciever*
> 
> It's an argument fallacy and detracts from the subject matter.


Quote:


> Originally Posted by *ducegt*
> 
> Known as Ad hominem fallacy.


I studied these at some point in school. Thanks for reminding me, and for actually giving me a little relief from everyone who thinks that using the data at hand makes you a troll if you don't already have a certain number of posts over a number of years.
Quote:


> Originally Posted by *Newbie2009*
> 
> Fanboys will always have something to moan about.
> Can it run crysis?


Mayhaps


----------



## 2010rig

Yeah, for that price it's the real MVP. I wonder how well binned the 1800Xs are vs. the 1700s.


----------



## Oubadah

..


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ducegt*
> 
> Known as Ad hominem fallacy.


Knowing the motivations behind someone's arguments is instructive in evaluating said arguments. Fans of a particular product or company tend to make exclusively positive or negative arguments about said products/companies depending on their fandom. It's fair game to point that out.


----------



## Oubadah

..


----------



## The-Beast

Quote:


> Originally Posted by *TheReciever*
> 
> It's an argument fallacy and detracts from the subject matter.


Right, and using techniques of derailment doesn't detract from the subject matter? Cool, gish gallop away.


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Knowing the motivations behind someone's arguments is instructive in evaluating said arguments. Fans of a particular product or company tend to make exclusively positive or negative arguments about said products/companies depending on their fandom. Its fair game to point that out.


It wasn't relevant when we were in a heated debate about numbers obtained from third parties that are supposed to be somewhat accurate or objective. Regardless, I didn't mention all the AMD-related posts I saw in your profile when we were debating, after you snooped mine.


----------



## Newbie2009

Quote:


> Originally Posted by *Oubadah*
> 
> Not at 60fps. It's all about that single threaded performance, yo.


lol, ok dude.


----------



## Oubadah

..


----------



## dragneel

Quote:


> Originally Posted by *Oubadah*
> 
> You answered your own question: Why would you upgrade from a 2500k? You'd upgrade from a 2500k if you wanted a "sizeable gaming boost".


But... it wasn't a question.

I stated, for myself, that if I only cared about gaming I wouldn't be upgrading. It seems you've deliberately ignored the second half of my post.


----------



## Oubadah

..


----------



## Newbie2009

Quote:


> Originally Posted by *Oubadah*
> 
> Crysis 3 isn't Crysis. If you want to talk about Crysis 3, you type the characters "Crysis 3". The "can it play Crysis" meme is quite specific to the original game, which, as someone correctly stated on the other page, uses only two cores.


Because Crysis is easier to run than Crysis 3. Idiotic to think otherwise.


----------



## Machspeed007

Quote:


> Originally Posted by *dragneel*
> 
> Call me crazy but if I only cared about gaming perf, I wouldnt even bother wasting my money upgrading from my 2500k, in fact I didn't and the 6700k already offered a sizeable gaming boost but my 2500k still plays my games just fine.
> 
> What I really wanted was high MT performance for a reasonable cost, and AMD delivered it spectacularly. The extra ST over my 2500k is just icing on the cake.


Well, I have a 2500k and just bought an R7.
I play a lot of 64+ player multiplayer BF1, and the 2500k just doesn't cut it anymore. I know it's a "specific" scenario, but that's mostly what I play, since my attention span isn't what it used to be with single-player games.

I feel that multi-threaded CPUs are more relevant in heavy multiplayer games. Unfortunately, there isn't a standardized benchmark for multiplayer; I would have loved to see how the CPU scaling looks.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *redone13*
> 
> It wasn't relevant when we were in a heated debate about numbers obtained from third parties that are supposed to be somewhat accurate or objective. Regardless, I didn't mention all the AMD-related posts I saw in your profile when we were debating, after you snooped mine.


I never claimed to be unbiased and freely admit that I root for AMD over Intel and Nvidia. I do at least attempt to be fair as I have admitted that the OCing is a disappointment and that the 7700K is still the best gaming CPU. Hell, I've never even owned an AMD CPU before!


----------



## redone13

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I never claimed to be unbiased and freely admit that I root for AMD over Intel and Nvidia. I do at least attempt to be fair as I have admitted that the OCing is a disappointment and that the 7700K is still the best gaming CPU. Hell, I've never even owned an AMD CPU before!


I too conceded, and will continue to do so, because both appear to have their place in the market. It was mostly budgetgamer who was arguing all that "everyone with quad cores lags while streaming" bit. I felt like that led me too far astray, because obviously not EVERYONE does, lol.


----------



## Oubadah

..


----------



## comagnum

Quote:


> Originally Posted by *Newbie2009*
> 
> lol, ok dude.


It's a joke...


----------



## Pantsu

For me personally, as a gamer, I'm leaning towards getting the 7700K now. There are several factors: I'd have to buy another cooler for AM4, or go with the expensive Crosshair mobo, so the cost would be higher if I went with the 1700 build I was planning. And it looks like the Ryzen platform is still too immature, with a bunch of software and mobo issues holding it back.

I could go for the Crosshair and the 1700 and hope AMD sorts these out, so that eventually the system might perform like a 6900K in games. But I think the better option for my situation is to just get the 7700K for now and upgrade the CPU+mobo in a year or two, when we hopefully have competitive 8-core products from both sides at reasonable prices. Of course, the catch is that the 7700K will probably lose a chunk of its value by then, but ultimately we're probably talking 100-200€ at best; not that big of a deal. I'll take it if I can guarantee that my current games, which don't have any Ryzen optimizations, run as well as they can.

If I knew for sure AMD's gaming performance would be fixed to 6900K levels in a few months, I'd certainly go for that over the 7700K. But if ifs and buts were FPS and nuts, we'd all be gaming on AMD.


----------



## Newbie2009

Quote:


> Originally Posted by *Oubadah*
> 
> That's an overly simplistic way of looking at it. You'll be able to maintain a consistent 60fps throughout the entirety of Crysis 3 (if you can't already) before you can do the same in Crysis. If you think consistently high framerates in Crysis are easy because it's almost 10 years old, you're mistaken. In the shipyard you will see ~40fps during combat on a [email protected] IIRC the worst CPU area in 3 is Welcome to the Jungle, and that game runs at about 75fps there on the same CPU.


LOL, oh yeah? Maybe on the iGPU.

I beat that game more than once using a Phenom II. You are talking from the anal passage. No insult.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Oubadah*
> 
> That's an overly simplistic way of looking at it. You'll be able to maintain a consistent 60fps throughout the entirety of Crysis 3 (if you can't already) before you can do the same in Crysis. If you think consistently high framerates in Crysis are easy because it's almost 10 years old, you're mistaken. In the shipyard you will see ~40fps during combat on a [email protected] IIRC the worst CPU area in 3 is Welcome to the Jungle, and that game runs at about 75fps there on the same CPU.


I tested Crysis and Crysis 3 quite extensively when I did the Titan vs 7970 comparison in my sig. Crysis 3 was much harder on my system than Crysis was, even with SLI Titans. I think I was using a 3960X at 4.7 GHz for that test.


----------



## Newbie2009

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I tested Crysis and Crysis 3 quite extensively when I did the Titan vs 7970 comparison in my sig. Crysis 3 was much harder on my system than Crysis was, even with SLI Titans. I think I was using a 3960X at 4.7 GHz for that test.


OF COURSE it is. People are just parroting nonsense, lying, or trolling.


----------



## Oubadah

..


----------



## redone13

Just some quick items I scanned from YT:

https://www.youtube.com/watch?v=DOVEHCM1gmU = Crysis 1, 6700k, GTX 970 - 1080p maxed

https://www.youtube.com/watch?v=TX0qctRZZMQ = the same with Crysis 3.

Hmm, the Crysis 3 970 is OC'ed. Might have helped a bit.


----------



## Oubadah

..


----------



## 2010rig

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I never claimed to be unbiased and freely admit that I root for AMD over Intel and Nvidia. I do at least attempt to be fair as I have admitted that the OCing is a disappointment and that the 7700K is still the best gaming CPU. Hell, I've never even owned an AMD CPU before!


I, on the other hand, have owned and built many AMD rigs, and I'm finally going to build another.


----------



## Newbie2009

Quote:


> Originally Posted by *Oubadah*
> 
> "I beat that game" meaning you completed the game? So what's your point? I "beat" the game on an E6600 when it was released.
> 
> I said 60fps. Are you going to try and tell me your Phenom II gave you a consistent 60 fps throughout the game?


Hell yeah. It's about graphics settings, not CPU. Crysis was always GPU bound.


----------



## Majin SSJ Eric

From my test, minimum fps for Crysis with two Titans was 85. In Crysis 3 it was 47. That was with a 3960X at 5GHz actually. Just from my experience, Crysis 3 was much harder to run than Crysis was, but that was in 2013.


----------



## Newbie2009

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> From my test, minimum fps for Crysis with two Titans was 85. In Crysis 3 it was 47. That was with a 3960X at 5GHz actually. Just from my experience, Crysis 3 was much harder to run than Crysis was, but that was in 2013.


Of course, it's a 10-year-old game.

Some perspective here. It's Warhead, but similar, at medium settings:

http://www.anandtech.com/bench/CPU/50


----------



## 364901

Quote:


> Originally Posted by *Artikbot*
> 
> Lots of people don't overclock and *XFR potentially brings an extra 20% burst performance* provided cooling is adequate. Plenty worth it for those people.


The XFR window is only 100MHz on X-series chips, and 50MHz on non-X chips. It isn't really that pertinent for performance, because you need really good cooling to keep XFR enabled when running heavy workloads, and you're limited to only having it work on two cores and four threads.
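A quick back-of-the-envelope check on that quoted "extra 20%" figure, using AMD's published turbo clocks (the helper function itself is just illustrative):

```python
# Rough sanity check: how much extra clock does XFR actually add?
# XFR window: +100 MHz on X-series parts, +50 MHz on non-X parts.

def xfr_uplift_pct(base_mhz: int, xfr_window_mhz: int) -> float:
    """Percentage clock uplift from the XFR window over the base turbo."""
    return 100.0 * xfr_window_mhz / base_mhz

# Ryzen 7 1800X: 4000 MHz single-thread turbo, +100 MHz XFR window.
uplift_1800x = xfr_uplift_pct(4000, 100)   # 2.5%
# Ryzen 7 1700: 3700 MHz single-thread turbo, +50 MHz XFR window.
uplift_1700 = xfr_uplift_pct(3700, 50)     # ~1.35%

print(f"1800X XFR uplift: {uplift_1800x:.1f}%")
print(f"1700  XFR uplift: {uplift_1700:.2f}%")
```

Nowhere near 20%; even in the best case XFR is a low-single-digit clock bump.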


----------



## Oubadah

..


----------



## De_stroyer

New bios showing improvement of around 30% in gaming

Can everyone not panic


----------



## redone13

Quote:


> Originally Posted by *Oubadah*
> 
> That's not the map you use if you want to bench CPUs in Crysis.
> Where did you test? People swan around Island for five minutes and think they have a comprehensive idea of the game's bottlenecks. Load up Assault, drive into the shipyard and start killing things, and you will see the framerate plummet.
> When was the last time you played it? Clearly it was some time ago. The game is frequently CPU bound (unless you're pushing some massive res or SGSSAA) and has been for several years. The CPU is responsible for the lowest-FPS areas in the game.


Oh, I stand corrected. Was just trying to get something besides rhetoric up there from you gents.
Quote:


> Originally Posted by *De_stroyer*
> 
> New bios showing improvement of around 30% in gaming
> 
> Can everyone not panic


Do you have an unbiased source?


----------



## Oubadah

.


----------



## Ha-Nocri

Quote:


> Originally Posted by *De_stroyer*
> 
> New bios showing improvement of around 30% in gaming
> 
> Can everyone not panic


Where did you see/hear this?


----------



## mAs81

Quote:


> Originally Posted by *De_stroyer*
> 
> New bios showing improvement of around 30% in gaming
> 
> Can everyone not panic


Source? 'Cause I'm feeling a little panicky


----------



## warr10r

Quote:


> Originally Posted by *De_stroyer*
> 
> New bios showing improvement of around 30% in gaming
> 
> Can everyone not panic


Excellent.

It's 2017 and people are still asking "Can it run Crysis?"

Geez people! I want to see a decent Star Citizen or Elite Dangerous or DayZ benchmark, and people are STILL using Crysis as their be-all-and-end-all benchmark! What is this, 2010?


----------



## Ha-Nocri

AMD said that they are working with developers who will be releasing patches for their games, which should improve performance. All games were coded for Intel CPUs, so we will have to wait and see.


----------



## 364901

Quote:


> Originally Posted by *De_stroyer*
> 
> New bios showing improvement of around 30% in gaming
> 
> Can everyone not panic


You're pulling everyone's leg, surely?


----------



## mAs81

Quote:


> Originally Posted by *Ha-Nocri*
> 
> AMD said that they are working with developers who will be releasing patches for their games, which should improve performance. All games were coded for Intel CPUs, so we will have to wait and see.


Yeah, I've seen that too, and it makes sense, I guess.


----------



## prznar1

Quote:


> Originally Posted by *tpi2007*
> 
> Yes, but in 3-4 years games won't be relying on less threads, so the situation for Ryzen will only get better from now on, not worse, even at 1080p, even if they didn't do any further scheduler / game / BIOS optimizations.
> What? Did they have any motherboard specific Windows software installed that does such stuff automatically or something?


Oh no. We've been hearing that for years, and it's not happening. Ryzen will be cool for multitasking etc., but games won't move away from single thread so easily. It's too hard for game developers and costs too much.


----------



## Wishmaker

Quote:


> "As we presented at Ryzen Tech Day, we are supporting 300+ developer kits with game development studios to optimize current and future game releases for the all-new Ryzen CPU. We are on track for 1000+ developer systems in 2017. For example, Bethesda at GDC yesterday announced its strategic relationship with AMD to optimize for Ryzen CPUs, primarily through Vulkan low-level API optimizations, for a new generation of games, DLC and VR experiences.
> 
> Oxide Games also provided a public statement today on the significant performance uplift observed when optimizing for the 8-core, 16-thread Ryzen 7 CPU design - optimizations not yet reflected in Ashes of the Singularity benchmarking. Creative Assembly, developers of the Total War series, made a similar statement today related to upcoming Ryzen optimizations.
> 
> CPU benchmarking deficits to the competition in certain games at 1080p resolution can be attributed to the development and optimization of the game uniquely to Intel platforms - until now. Even without optimizations in place, Ryzen delivers high, smooth frame rates on all "CPU-bound" games, as well as overall smooth frame rates and great experiences in GPU-bound gaming and VR. With developers taking advantage of Ryzen architecture and the extra cores and threads, we expect benchmarks to only get better, and enable Ryzen excel at next generation gaming experiences as well.
> 
> Game performance will be optimized for Ryzen and continue to improve from at-launch frame rate scores." John Taylor, AMD


Let us hope this is the case then!


----------



## redone13

Quote:


> Originally Posted by *Wishmaker*
> 
> Let us hope this is the case then!


That's from AMD's mouth though is it not? The same people that cherry picked their benchmark at the reveal and also showed a lagging quad core streaming versus their powerhouse. What else can they say but good stuff in all reality?


----------



## Newbie2009

Quote:


> Originally Posted by *Oubadah*
> 
> .


Meh, I'm not gonna argue about it anymore. I'll check it out again for myself; I may even stream the results.


----------



## Oubadah

..


----------



## redone13

Quote:


> Originally Posted by *Oubadah*
> 
> I can't count how many times I've had this argument about Crysis and I'm always right. It's always the same: first there's the denial, then there's the "you're crazy", then there's the "but I benched it myself", then there are the screenshots of the map/area I _wasn't_ referring to, then there are the screenshots of that area but conveniently it's all dead empty with no AI or anything, then there's the moving goalposts, the "oh, you had the game on Very High? no one plays like that"... and so on...
> 
> https://forums.anandtech.com/threads/geforce-titan-2500k-vs-4790k-66-games-tested.2389580/page-2#post-36501457
> 
> The game is heavily CPU bound on modern systems. This is because it doesn't multithread, and has been starved of significant single threaded performance improvements for the last years. The CPU bottleneck used to be concealed by a sizeable GPU bottleneck, but over the years that has been eroded and now modern GPUs can competently render Crysis at a high framerate.
> 
> I only commented on the Crysis joke, because although people say it facetiously, thinking that it's really not difficult to run anymore, the irony is that it's still impossible to maintain a *consistent* 60fps throughout the entire game.


Dude, I believe you lol. I even referenced you a couple times when **** was heated. I will take a closer look at the link.


----------



## Newbie2009

Who said 30% uplift in games? I'd guess AMD needs to improve their SMT.


----------



## lotzaramen

Yeah, there are lots of people saying it's *BIOS issues, microcode issues, etc.* Joker Productions released a podcast with Steve from GamersNexus explaining the difference in their results as one source. Well, two technically, since it's two channels on one podcast.

(Something about some reviewers had boards and cpus much earlier than others and were reviewing on dated firmware if I remember right)

HOWEVER, I'm not totally buying it until I see it. It just seems off; I like to approach with skepticism. To me it sounds like a rushed release.


----------



## redone13

Alright Oubadah, so if a 4790K is getting smacked down like that and the 6700K and 7700K are not that far above it respectively, then yea, I can see it being harder to run than Crysis 3. The real question is, does an equivalent study exist for Crysis 3? Is there an article on Crysis 3 that finds a similar bottleneck in a specific level or is the general benchmark adequate for doing so?


----------



## TheReciever

Quote:


> Originally Posted by *redone13*
> 
> I studied these at some point in school. Thanks for reminding me and actually giving me a little bit of relief from everyone that thinks that utilizing the data at hand makes you a troll if you don't have a certain number of posts over a number of years already.
> Mayhaps


It's like saying "well, you only have 1000 rep while having 20k posts"; drawing a conclusion from that alone is, well... anyways.

That being said, it might help to slow down on responses to keep the quality of the argument from falling apart lol

Quote:


> Originally Posted by *The-Beast*
> 
> Right and using techniques of derailment doesn't detract from the subject matter. Cool gish gallop away.


If you're unsure where it goes, then it's best to leave that door closed, no?

Quote:


> Originally Posted by *redone13*
> 
> It wasn't relevant when we were in heated debated about numbers obtained from 3rd parties that are supposed to be somewhat accurate or objective. Regardless, I didn't mention all the AMD related posts I saw in your profile when we were debating after you snooped mine.


Intent is one thing, but it's only a factor in how to approach the discussion. It doesn't change the fact that you should be arguing the idea, not the person proposing it.


----------



## 364901

Quote:


> Originally Posted by *Oubadah*
> 
> I can't count how many times I've had this argument about Crysis and I'm always right. It's always the same: First there's the denial, then there's the "you're crazy", then there's the "but I benched it myself", then there are the screenshots of the map/area I wasn't referring to, then there's the screenshots of that area but conveniently it's all dead empty with no AI or anything, then there's the moving goalposts, the "oh, you had the game on Very High? no one plays like that"... and so on...
> 
> https://forums.anandtech.com/threads/geforce-titan-2500k-vs-4790k-66-games-tested.2389580/page-2#post-36501457
> (this is to illustrate that the game can drop to 33fps on a 4790K. So that means to get a stable 60fps in worst-case Crysis, you're going to need a CPU with DOUBLE the single-threaded performance of a 4790K. No one has that).
> 
> The game is heavily CPU bound on modern systems. This is because it doesn't multithread, and has been starved of significant single-threaded performance improvements in the years since Sandy Bridge. The CPU bottleneck used to be concealed by a sizeable GPU bottleneck, but over the years that has been eroded and now modern GPUs can competently render Crysis at a high framerate.
> 
> I only commented on the Crysis joke, because although people say it facetiously, thinking that it's really not difficult to run anymore, the irony is that it's still impossible to maintain a *consistent* 60fps throughout the entire game.


This somewhat relates to the multiple times when I needed to remind people that Crysis 3 does not like, or run well on, Core i3 chips. It introduces so much resource contention that multi-threading might as well be turned off for all the good it does, and yet builds at the time still recommended Core i3s for playing Crysis 3 on a budget.
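The quoted back-of-the-envelope math checks out, under the simplifying (and optimistic) assumption that a fully CPU-bound framerate scales linearly with single-thread speed:

```python
# If a CPU-bound scene drops to 33 fps, how much faster (single-threaded)
# would the CPU need to be for a locked 60 fps? Assumes linear scaling,
# which is optimistic, but fine for a rough estimate.

def required_speedup(observed_fps: float, target_fps: float) -> float:
    return target_fps / observed_fps

speedup = required_speedup(33, 60)
print(f"Required single-thread speedup: {speedup:.2f}x")  # ~1.82x, i.e. roughly double
```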


----------



## Xuper

No comment

First = BF1

Second = Witcher 3

Third = Watch Dogs 2

Edit: Fixed


----------



## Oubadah

..


----------



## redone13

Quote:


> Originally Posted by *TheReciever*
> 
> Its like saying well you only have 1000rep while having 20k posts, drawing a conclusion from that alone is well...anyways.
> 
> That being said it might help to slow down on response to keep the quality of the argument from falling apart lol
> If you're unsure where it goes, then it's best to leave that door closed, no?
> Intent is one thing, but it's only a factor in how to approach the discussion. It doesn't change the fact that you should be arguing the idea, not the person proposing it.


Yeah, I read you loud and clear in regards to the first and third points. I will probably be retiring soon, involuntarily, due to lack of ability to sit much longer at my computer.


----------



## 364901

Quote:


> Originally Posted by *Xuper*
> 
> No comment
> 
> First = BF1
> 
> Second = Witcher 3
> 
> Third = GTAV


Third is Watch Dogs 2. GTA V doesn't have guidance arrows on the road when moving towards an objective.


----------



## cib24

Really happy for AMD, although pricing-wise they probably need to drop Ryzen by about $50 across the board. It's good to see them competitive again, and hopefully Ryzen 2.0 will see further improvements. Still, I don't see enough of a reason to move on from my i7-3770K at 4.2 GHz for gaming, which is what my PC is primarily used for. I'm still gaming at 1440p with a 980Ti, but I can see myself upgrading in another 2-3 years.


----------



## Xuper

Quote:


> Originally Posted by *CataclysmZA*
> 
> Third is Watch Dogs 2. GTA V doesn't have guidance arrows on the road when moving towards an objective.


Fixed.


----------



## Wishmaker

Quote:


> Originally Posted by *Xuper*
> 
> No comment
> 
> First = BF1
> Second = Witcher 3
> Third = Watch Dogs 2
> 
> Edit: Fixed


That is very interesting.


----------



## The-Beast

Quote:


> Originally Posted by *TheReciever*
> 
> Quote:
> 
> 
> 
> Originally Posted by *The-Beast*
> 
> Right and using techniques of derailment doesn't detract from the subject matter. Cool gish gallop away.
> 
> 
> 
> If you're unsure where it goes, then it's best to leave that door closed, no?

Or the door is open and you're letting the Scathophagidae in.


----------



## Newbie2009

Ah ok, I didn't see this before.
Quote:


> "As we presented at Ryzen Tech Day, we are supporting 300+ developer kits with game development studios to optimize current and future game releases for the all-new Ryzen CPU. We are on track for 1000+ developer systems in 2017. For example, Bethesda at GDC yesterday announced its strategic relationship with AMD to optimize for Ryzen CPUs, primarily through Vulkan low-level API optimizations, for a new generation of games, DLC and VR experiences.
> 
> Oxide Games also provided a public statement today on the significant performance uplift observed when optimizing for the 8-core, 16-thread Ryzen 7 CPU design - optimizations not yet reflected in Ashes of the Singularity benchmarking. Creative Assembly, developers of the Total War series, made a similar statement today related to upcoming Ryzen optimizations.
> 
> CPU benchmarking deficits to the competition in certain games at 1080p resolution can be attributed to the development and optimization of the game uniquely to Intel platforms - until now. Even without optimizations in place, Ryzen delivers high, smooth frame rates on all "CPU-bound" games, as well as overall smooth frame rates and great experiences in GPU-bound gaming and VR. With developers taking advantage of Ryzen architecture and the extra cores and threads, we expect benchmarks to only get better, and enable Ryzen excel at next generation gaming experiences as well.
> 
> Game performance will be optimized for Ryzen and continue to improve from at-launch frame rate scores." John Taylor, AMD


https://www.pcper.com/news/Processors/AMD-responds-1080p-gaming-tests-Ryzen

I suppose it does make sense; games, all games, have been optimized for high-end Intel to date.


----------



## Wishmaker

Quote:


> Originally Posted by *Newbie2009*
> 
> Ah ok, I didn't see this before.
> https://www.pcper.com/news/Processors/AMD-responds-1080p-gaming-tests-Ryzen
> 
> I suppose it does make sense, games, all games, optimized for high end intel to date.


There are issues that can be sorted now and issues for Zen+. Looking forward to seeing how these things pan out in terms of performance. I need Intel to drop prices so I can build a proper rig!


----------



## umeng2002

Quote:


> Originally Posted by *cib24*
> 
> Really happy for AMD although pricing wise they probably need to drop Ryzen by about $50 across the board. It's good to see them competitive again and hopefully Ryzen 2.0 will see further improvements. Still, I don't see enough of a reason to move on from my I7-3770k at 4.2Ghz for gaming which is what my PC is primarily used for. I'm still gaming at 1440p with a 980Ti, but I can see myself upgrading in another 2-3 years.


Ignore AMD's marketing.

Ryzen 7 isn't really meant for gaming.

Gaming + streaming, sure
Rendering movies, etc., sure

Gaming alone... wait for Ryzen 5.


----------



## Oubadah

..


----------



## 364901

Quote:


> Originally Posted by *Oubadah*
> 
> Crysis 3 was one of the first AAA games to really take advantage of >4 core CPUs, and it even makes great use of HT on i7 quads. That's why, despite using a hell of a lot more total computing power than Crysis 1, it can still do very well on modern systems.
> That's interesting because it responds well to HT on i7s. I don't recall what CPU this is, but it's some kind of Haswell i7 with HT off and HT on:
> 
> Next time I have it installed I might disable some cores and see how the HT benefit scales with cores.


I remember looking into it at the time when I was playing the game on my Athlon x3. Crysis 3 works well with HT turned on, but only if you have more than two physical cores and more than 3MB of cache. Some website at the time brought up testing data that showed that the game can use all four threads, but it can't schedule things properly on a dual-core chip with HT, resulting in lowered framerates. It might also fill up the L3 cache rather quickly because of all the simulation data, and when there's resource contention between a physical core and a virtual one, the physical core takes precedence.


----------



## DADDYDC650

Quote:


> Originally Posted by *umeng2002*
> 
> Ignore AMD's marketing.
> 
> Ryzen 7 isn't really meant for gaming.
> 
> Gaming + streaming, sure
> Rendering movies, etc., sure
> 
> Gaming alone... wait for Ryzen 5.


Of course Ryzen 7 is meant for gaming because it is. It's also meant for encoding and streaming. That's like saying Ivy Bridge/Broadwell-E isn't meant for gaming. You shouldn't expect a big boost with Ryzen 5 BTW.


----------



## Kosai

So anyone looking for a killer GAMING build today would be better off with an i7-7700K? Gaming only, no rendering. I was seriously considering Ryzen for a new build, but now I am unsure.


----------



## DADDYDC650

Quote:


> Originally Posted by *Kosai*
> 
> So anyone looking for a killer GAMING build today would be better off with an i7-7700K? Gaming only, no rendering. I was seriously considering Ryzen for a new build, but now I am unsure.


If you don't plan on upgrading for 2+ years, I'd go Ryzen. If you need the best now and/or have a 100+ Hz monitor, the 7700K.


----------



## TheReciever

Quote:


> Originally Posted by *The-Beast*
> 
> Or the door is open and you're letting the scathophagidae in.


Well I guess we both know what you should be expecting then


----------



## oxidized

Quote:


> Originally Posted by *DADDYDC650*
> 
> Of course Ryzen 7 is meant for gaming because it is. It's also meant for encoding and streaming. That's like saying Ivy Bridge/Broadwell-E isn't meant for gaming. *You shouldn't expect a big boost with Ryzen 5 BTW*.


----------



## Oubadah

..


----------



## DADDYDC650

Quote:


> Originally Posted by *oxidized*


Don't be sad. The difference in gaming performance isn't huge with a proper Ryzen setup, and performance might improve months from now with software updates. Everyone needs to remember that AM4/Ryzen is brand new, so it's buggy and has issues at the moment. Check out the video below of a Ryzen 1700 @ 3.9 GHz vs a 7700K @ 5 GHz.


----------



## BinaryDemon

Quote:


> Originally Posted by *DADDYDC650*
> 
> Of course Ryzen 7 is meant for gaming because it is. It's also meant for encoding and streaming. That's like saying Ivy Bridge/Broadwell-E isn't meant for gaming. You shouldn't expect a big boost with Ryzen 5 BTW.


I wouldn't expect much performance loss either, since most games don't fully utilize more than 8 threads yet. Something like the R5 1500 overclocked to 4 GHz should be a pretty sweet deal.


----------



## XHellAngelX

Still faster than 7700K at 720p



https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/#diagramm-watch-dogs-2-fps


----------



## oxidized

Quote:


> Originally Posted by *DADDYDC650*
> 
> Don't be sad. The difference in gaming performance isn't huge with a proper Ryzen setup, and performance might improve months from now with software updates. Everyone needs to remember that AM4/Ryzen is brand new, so it's buggy and has issues at the moment. Check out the video below of a Ryzen 1700 @ 3.9 GHz vs a 7700K @ 5 GHz.


I hope so. I also hope the 6-core/12-thread parts will give a decent boost to gaming performance, besides all the fixes and updates.


----------



## DADDYDC650

Quote:


> Originally Posted by *BinaryDemon*
> 
> I wouldn't expect much performance loss either, since most games don't fully utilize more than 8 threads yet. Something like the R5 1500 overclocked to 4 GHz should be a pretty sweet deal.


Shouldn't be much of a performance loss in most games by going with an R5. What's cool about AM4 is that you can always go for an R5 instead of an R7 for gaming and in 2-3 years upgrade to a Zen 3 8+ core CPU. Can't do that with any current Intel socket.


----------



## Xuper

Quote:


> Originally Posted by *Oubadah*
> 
> What point are you trying to make?
> 
> What software are they using to detect the per-core usage? Is it getting its data the same way Windows does in TaskMan (i.e. the results are deceptive because they don't actually match the usage on the cores; for example, if a game was using two full cores, the graph would look like a quarter load on all four cores.)


Where have you been? They've been using it for 5 years. The point is, good luck with the 7700K if someone wants to do some task while running a game in the background.
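On Oubadah's quoted point about Task Manager-style graphs: sampled per-core percentages really can hide a saturated thread, because the scheduler migrates threads between cores and the graph averages over the sampling window. A toy simulation (purely illustrative, not how Windows actually samples):

```python
# Toy model: 2 fully-busy threads on a 4-core CPU. Each timeslice the
# scheduler parks the busy threads on a different pair of cores. Averaged
# over the sampling window, every core reads ~50% busy, even though at any
# given instant exactly two cores are saturated and the game is core-limited.
from itertools import combinations

CORES = 4
placements = list(combinations(range(CORES), 2))  # every way to place 2 threads

busy_slices = [0] * CORES
for a, b in placements:          # scheduler rotates through placements
    busy_slices[a] += 1
    busy_slices[b] += 1

per_core_pct = [100 * n / len(placements) for n in busy_slices]
print(per_core_pct)  # [50.0, 50.0, 50.0, 50.0]
```

So a flat-looking graph doesn't rule out a two-thread bottleneck; you'd need per-thread (not per-core) counters to see it.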


----------



## ducegt

Can it run Crysis? It's only a meme.
Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Knowing the motivations behind someone's arguments is instructive in evaluating said arguments. Fans of a particular product or company tend to make exclusively positive or negative arguments about said products/companies depending on their fandom. Its fair game to point that out.


Your assumptions about their motivations may not be accurate. Budgetgamer, for example: given his user name, anyone would assume he cares about budget and gaming, but alas he does not. X99-based, and says he would buy a console if he wanted to game.

I've been personally attacked by a half dozen members because they falsely assume my motivations, even though I have articulated them well, although not thoroughly in every comment I've ever made. I'll share my 2 bits once more.

I've owned many Intel and AMD CPUs. I was OCing them when many here were pooping in their diapers. I've owned many nVidia cards, but as of the 5850 I have maintained loyalty to AMD, and I will continue to do so regardless of whether their products are modestly inferior. So I'm biased against nVidia, but since CPUs generally outlast video cards, I'm open to Intel as well. I, like the vast majority of enthusiasts, play games. I occasionally render videos and edit photos. It's no big deal that I encode a video in twice the time of an 8-core; I can run jobs when I'm away or sleeping. Again, the point is that more people will benefit from the 7700K. Now, because I feel this way, everything I say about the details of Zen must be false, right? Like how I pointed out that AMD benched the 7700K on slow standard RAM when in reality the vast majority use faster RAM. I was called a troll, etc. I do think some respectable people here can temporarily lose their heads when they make false assumptions about intentions. Simple exchanges of facts turn into mud slinging because someone like you believes they deserve it; because why? They have a different perspective than you?


----------



## ducegt

Quote:


> Originally Posted by *Xuper*
> 
> Where have you been? They've been using it for 5 years. The point is, good luck with the 7700K if someone wants to do some task while running a game in the background.


I have a 7700K and I can minimize any game and browse the web. I can save, quit, and reload most games in less than a minute. Hell, I can reboot my entire computer in 30 seconds. If I wanted to drive 10,000 km, I'd make a more intelligent decision by flying in a plane as opposed to making upgrades to my car to make it run faster. Also, if I'm gaming and feel the need to check my email, I have my smartphone.


----------



## umeng2002

Quote:


> Originally Posted by *DADDYDC650*
> 
> Of course Ryzen 7 is meant for gaming because it is. It's also meant for encoding and streaming. That's like saying Ivy Bridge/Broadwell-E isn't meant for gaming. You shouldn't expect a big boost with Ryzen 5 BTW.


No.

The cost/perf ratio of 8 core Ryzen and Intel isn't justified for gaming.

The only hope is that Ryzen 5 will be better in gaming than Ryzen 7 if it overclocks better.

But even if it doesn't, I don't think the lack of 2 cores will hurt Ryzen 5's gaming performance.


----------



## Samuris

https://youtu.be/U0jQ_gUqtgI?t=648 Look at the 3DMark test: the i7-7700K has a better overall score, but the R7 has about 4k more physics score than the i7, so why does the i7 have the better overall score? Look at the combined score: the R7's combined score suffers against the i7-7700K's. For me it's just an issue, an incompatibility with Ryzen that has to be fixed.
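The combined test dragging down the overall score is expected behaviour: 3DMark's overall result is a weighted *harmonic* mean of the sub-scores, so the weakest sub-score punches above its weight. The weights below are the Fire Strike ones as I recall them from UL's scoring guide, and the sub-scores are made-up illustrative numbers, not the actual results from that video:

```python
# 3DMark Fire Strike overall score: weighted harmonic mean of sub-scores.
# Weights (graphics 0.75, physics 0.15, combined 0.10) as per UL's guide;
# treat them as illustrative.

def fire_strike_overall(graphics: float, physics: float, combined: float) -> float:
    w_g, w_p, w_c = 0.75, 0.15, 0.10
    return (w_g + w_p + w_c) / (w_g / graphics + w_p / physics + w_c / combined)

# Hypothetical sub-scores: same graphics score, one chip ~4k ahead in
# physics but behind in the combined test.
i7 = fire_strike_overall(graphics=19000, physics=13000, combined=8000)
r7 = fire_strike_overall(graphics=19000, physics=17000, combined=5500)

print(f"i7 overall: {i7:.0f}, R7 overall: {r7:.0f}")
# The big physics lead is outweighed by the weak combined score.
```

With a harmonic mean, raising an already-high sub-score barely moves the total, while a low one drags it down hard; so a weak combined run can erase a 4k physics lead.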


----------



## BinaryDemon

Quote:


> Originally Posted by *ducegt*
> 
> I have a 7700K and I can minimize any game and browse the web. I can save, quit, and reload most games in less than a minute. Hell, I can reboot my entire computer in 30 seconds. If I wanted to drive 10,000km, I'd make a more intelligent decision by flying in plane as opposed to making upgrades to my car to make it run faster. Also, if I'm gaming now and I feel the need to check my email, I have my smart phone.


I think he meant other gaming related tasks, like streaming or recording video, although I suppose you could do that on your phone as well.


----------



## DADDYDC650

Quote:


> Originally Posted by *umeng2002*
> 
> No.
> 
> The cost/perf ratio of 8 core Ryzen and Intel isn't justified for gaming.
> 
> The only hope is that Ryzen 5 will be better in gaming than Ryzen 7 if it overclocks better.
> 
> But if it doesn't, I don't think the lack of 2 cores will hurt the Ryzen 5 gaming performance.


Ryzen 1700 is justified for gaming since you would have to settle for a 4 core without hyper threading otherwise and lose out on 5-30 frames in certain games. Once Ryzen 5 is out then sure, I agree R7 isn't justified.


----------



## umeng2002

Quote:


> Originally Posted by *DADDYDC650*
> 
> Ryzen 1700 is justified for gaming since you would have to settle for a 4 core without hyper threading otherwise and lose out on 5-30 frames in certain games. Once Ryzen 5 is out then sure, I agree R7 isn't justified.


To me it's not justified, because the R5 1600X will be cheaper, have higher default clocks, and will offer the same gaming performance as the $500 R7 1800X, since most games don't know what to do with more than 6 threads or cores.


----------



## DADDYDC650

Quote:


> Originally Posted by *umeng2002*
> 
> To me it's not justified, because the R5 1600X will be cheaper, have higher default clocks, and will offer the same gaming performance as the $500 R7 1800X, since most games don't know what to do with more than 6 threads or cores.


R5 won't be out for a few months and nobody knows what kind of performance it will bring. It's fine to guess but as of now, the R7 1700 is a great gaming CPU that should perform well for years like the 2600k has.


----------



## umeng2002

Take a 1800x and turn off 2 cores... that's a 1600x.

My reasoning that gaming performance will be almost the same as a 1800x is that most games aren't threaded well.

But I think most of the gaming improvements to Ryzen will come from better BIOSs and game patches/ optimizations.

For me, the OC speeds between a 1600x and 1700 would determine which is better for gaming.

Right now, the limit seems to be 3.9 or 4 GHz for the 1700.

If the limit on 1600x is around 4.1 to 4.3 GHz, those missing two cores won't hurt a thing in most games.

Yeah, the 1600x could just be leakier chips and hit the same limits as a 1800x, but right now there is hope of higher OC potential.

Look at what Intel can clock to with a 4 core vs 6/8 core.


----------



## ryan92084

Caught up with the overnight stuff. Mostly just bickering, so not much to add except the link to the Ryzen Master user guide and this handy chart that notes max all-core boost:
Quote:


> Originally Posted by *cjwilson*
> 
> *Ryzen 7 1700*
> *Stock:* 3000 MHz
> *Multi-Thread Turbo:* 3300 MHz
> *Single-Thread Turbo:* 3700 MHz
> *XFR:* No
> 
> *Ryzen 7 1700X*
> *Stock:* 3400 MHz
> *Multi-Thread Turbo:* 3500 MHz
> *Single-Thread Turbo:* 3800 MHz
> *XFR:* Yes
> *XFR Two-Thread Turbo:*\* 3900 MHz
> 
> *Ryzen 7 1800X*
> *Stock:* 3600 MHz
> *Multi-Thread Turbo:* 3700 MHz
> *Single-Thread Turbo:* 4000 MHz
> *XFR:* Yes
> *XFR Two-Thread Turbo:*\* 4100 MHz
> 
> *SKUs with "X" Suffix:* XFR boosts by 100 MHz
> *SKUs without "X" Suffix:* XFR boosts by 50 MHz
> 
> \* XFR has been observed to apply its frequency boost to up to four cores while gaming.


It's not perfect, but it gets the point across. A 4 GHz all-core overclock is actually a sizable overclock.
Quote:


> Originally Posted by *2010rig*
> 
> Yeah for that price, it's the real MVP, I wonder how well binned the 1800X's vs 1700's are.


I suggest taking a look at the 1700 review on XtremeSystems from the OP. The 1700X and 1800X seem to run about 20 °C hotter at the same clocks and require more volts to do so. They all hit about the same max overclock wall on air/water, though.
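To put "sizable" in numbers, here's a quick illustrative calculation against the all-core (multi-thread turbo) clocks from the quoted chart, assuming a 4.0 GHz all-core overclock:

```python
# How much of an uplift is a 4.0 GHz all-core overclock over each chip's
# stock all-core (multi-thread turbo) clock? Clocks from the chart above.

def oc_uplift_pct(stock_mhz: int, oc_mhz: int = 4000) -> float:
    return 100.0 * (oc_mhz - stock_mhz) / stock_mhz

for name, all_core in [("1700", 3300), ("1700X", 3500), ("1800X", 3700)]:
    print(f"Ryzen 7 {name}: +{oc_uplift_pct(all_core):.1f}% over stock all-core")
# 1700: +21.2%, 1700X: +14.3%, 1800X: +8.1%
```

Which is why the 1700 is the interesting overclocker's chip: it gains the most over stock from the same 4 GHz wall.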


----------



## Benny89

So for pure gaming, upgrading from a 4790K to a 1700 is as pointless as upgrading to a 7700K at this moment?


----------



## DADDYDC650

Quote:


> Originally Posted by *umeng2002*
> 
> Take a 1800x and turn off 2 cores... that's a 1600x.
> 
> My reasoning that gaming performance will be almost the same as a 1800x is that most games aren't threaded well.
> 
> But I think most of the gaming improvements to Ryzen will come from better BIOSs and game patches/ optimizations.
> 
> For me, the OC speeds between a 1600x and 1700 would determine which is better for gaming.
> 
> Right now, the limit seems to be 3.9 or 4 GHz for the 1700.
> 
> If the limit on 1600x is around 4.1 to 4.3 GHz, those missing two cores won't hurt a thing in most games.
> 
> Yeah, the 1600x could just be leakier chips and hit the same limits as a 1800x, but right now there is hope of higher OC potential.
> 
> Look at what Intel can clock to with a 4 core vs 6/8 core.


Take Broadwell (4 cores/8 threads) and Broadwell-E (6-10 cores) for example. Both max out around 4.3 GHz.


----------



## DADDYDC650

Quote:


> Originally Posted by *Benny89*
> 
> So for pure gaming, is upgrading from a 4790K to a 1700 as pointless as upgrading to a 7700K at this moment?


At this moment that is correct.


----------



## umeng2002

Quote:


> Originally Posted by *DADDYDC650*
> 
> Take Broadwell (4 cores/8 threads) and Broadwell-E (6-10 cores) for example. Both max out around 4.3 GHz.


True... but do those use the exact same dies?


----------



## 364901

Quote:


> Originally Posted by *Oubadah*
> 
> Oh, I think they released an HT patch for the game. Now that I think about it, when I first tried the game on a 2600K I remember HT in Crysis 3 being totally underwhelming, so when I tried it again later with a 4770K and saw that there was a large difference I incorrectly attributed it to the 4770K being "better at HT" until someone informed me about the patch. Not sure if that made any difference to <4 core CPUs though.
> 
> EDIT 1: Maybe I'm dreaming, because I can't find squat on a Crysis 3 "hyperthreading patch" aside from a few stray mentions like "Crysis 3 but ok fine, it's a poster child for CPU scaling and actually made good use of HT after the HT patch so I expected that"
> 
> EDIT 2: It was patch 1.3: http://www.pcgameshardware.de/Crysis-3-Spiel-20599/Tests/Crysis-3-CPU-Test-1068140/


Yeah, the site I remember reading from did test patch 1.3, but it still doesn't run as well on a Core i3 as it does a regular Core i5.

Quote:


> Originally Posted by *XHellAngelX*
> 
> Still faster than 7700K at 720p
> 
> 
> 
> https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/#diagramm-watch-dogs-2-fps


Computerbase's results are closer to what I'd expect from the 1800X, but a lot of other reviews are all over the place. I wonder what the underlying cause is for all the discrepancy.


----------



## DADDYDC650

Quote:


> Originally Posted by *umeng2002*
> 
> True... but do those use the exact same dies?


Not quite sure, but I don't see why they wouldn't be very similar.


----------



## umeng2002

Quote:


> Originally Posted by *CataclysmZA*
> 
> but a lot of other reviews are all over the place. I wonder what the underlying cause is for all the discrepancy.


Different motherboards apparently


----------



## umeng2002

Quote:


> Originally Posted by *DADDYDC650*
> 
> Not quite sure but I don't see why they aren't very similar.


Yeah, I see your point, but I'm going from my personal experience with my old 1090T, where disabling 2 cores increased the OC by 100 or 200 MHz.

I'm sure the 1800X and 1600X are 100% the same die, just with two cores turned off. From a thermal/power perspective, the 1600X should have more headroom unless they use leaky chips for it.

I doubt AMD would design a core that couldn't clock well into the 4 GHz range as long as the power envelope wasn't an issue.

A 1-CCX die would be interesting to test too.

But it could be a GloFo 14nm FinFET issue in this regard.


----------



## Pro3ootector

I suppose Skylake is no longer the best CPU for Dolphin, and maybe for emulation overall.

https://lanoc.org/review/cpus/7455-amd-ryzen-7-cpus?showall=&start=2

Edit: nope, it's seconds to complete, sorry all.


----------



## umeng2002

Quote:


> Originally Posted by *Pro3ootector*
> 
> 
> 
> I suppose Skylake is no longer the best CPU for Dolphin, and maybe for emulation overall.


Edit... isn't lower better though?


----------



## 364901

Quote:


> Originally Posted by *ryan92084*
> 
> Its not perfect but gets the point across. A 4ghz all core overclock is actually a sizable overclock.


The all-core boost on the R7 1700 should be 3.1 GHz, not 3.3 GHz.

Quote:


> Originally Posted by *umeng2002*
> 
> Different motherboards apparently


Windows 10's scheduler also throws a few spanners in there for good measure. Their Linux benchmarks are quite comparable to those from Phoronix.

Quote:


> Originally Posted by *Pro3ootector*
> 
> 
> 
> I suppose Skylake is no longer the best CPU for Dolphin, and maybe for emulation overall.
> 
> https://lanoc.org/review/cpus/7455-amd-ryzen-7-cpus?showall=&start=2


Lower is better when it comes to Dolphin, because that's the time taken to complete the overall run. It prefers high single-core performance and high clock speeds.


----------



## Pro3ootector

_The folks at PC Perspective have shared a statement from AMD in response to their question as to why AMD's Ryzen processors show lower than expected performance at 1080p resolution (despite posting good high-resolution, high-detail frame rates). Essentially, AMD is reinforcing the need for developers to optimize their games' performance to AMD's CPUs (claiming that these have only been properly tuned to Intel's architecture). AMD also puts weight behind the fact they have sent about 300 developer kits already, so that content creators can get accustomed to AMD's Ryzen, and expect this number to increase to about a thousand developers in the 2017 time-frame. AMD is expecting gaming performance to only increase from its launch-day level. Read AMD's statement after the break.

AMD's John Taylor had this to say:

"As we presented at Ryzen Tech Day, we are supporting 300+ developer kits with game development studios to optimize current and future game releases for the all-new Ryzen CPU. We are on track for 1000+ developer systems in 2017. For example, Bethesda at GDC yesterday announced its strategic relationship with AMD to optimize for Ryzen CPUs, primarily through Vulkan low-level API optimizations, for a new generation of games, DLC and VR experiences._

https://www.techpowerup.com/231198/amd-responds-to-ryzens-lower-than-expected-1080p-performance


----------



## ryan92084

Quote:


> Originally Posted by *CataclysmZA*
> 
> Computerbase's results are closer to what I'd expect from the 1800X, but a lot of other reviews are all over the place. I wonder what the underlying cause is for all the discrepancy.


RAM speed might be a chunk of it. Joker is using 3000 MHz and ComputerBase is using 3200 with tight timings (at least I think they are). I'd have to go back through and check, but I recall a few boards being unable to read XMP timings and also not allowing manual settings.


----------



## III-Method-III

I kind of asked this last night when people hadn't slept and some of the disappointment with Ryzen's (day 1) gaming performance was causing heated debate.

I game (2560x1440, 144 Hz monitor) and I also render, encode and work in Premiere Pro almost every day. My rig is a stock 4790K, 16 GB RAM, GTX 980.

I get reasonable frames at my gaming res in BF1 (90-100) at high but custom settings. When I play I'm recording with OBS, because Shadowplay doesn't let me record the 3 separate audio tracks I need.

So... Ryzen 1700/1800 build.

I plan to go to a GTX 1080 and 32 or 64 GB RAM (for Premiere).

Do you think, from what we see right now (not hopes for BIOS/game patches for SMT in the future), that my frames at 1440p on a 4790K will be higher than an 1800X? The same, or lower?

My view is that I'm GPU bound in BF1 at that res and high-ish detail while trying to hit 144 fps (to take advantage of my monitor), so moving to Ryzen and a 1080 shouldn't see an fps loss, right?

Meth


----------



## umeng2002

Yeah... seeing Joker's 3.9 GHz chip going toe to toe with a 5 GHz 7700K is promising for gaming.


----------



## jprovido

Stop arguing about streaming. I streamed Overwatch yesterday just to try it: 30 fps with max bitrate on Nvidia Experience.
https://www.twitch.tv/videos/125965145

I got A LOT of dropped frames, lol. Seems like the demo at the AMD event was accurate.

Edit:

I have an i7 7700K @ 5.1 GHz


----------



## Pro3ootector

Now if we could only spend the saved money on a Vega GPU.


----------



## tpi2007

Quote:


> Originally Posted by *ryan92084*
> 
> Caught up with the overnight stuff. Mostly just bickering, so not much to add except the link to the Ryzen Master user guide and this handy chart that notes max all-core boost
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *cjwilson*
> 
> *Ryzen 7 1700*
> *Stock:* 3000 MHz
> *Multi-Thread Turbo:* 3300 MHz
> *Single-Thread Turbo:* 3700 MHz
> *XFR:* No
> 
> *Ryzen 7 1700X*
> *Stock:* 3400 MHz
> *Multi-Thread Turbo:* 3500 MHz
> *Single-Thread Turbo:* 3800 MHz
> *XFR:* Yes
> *XFR Two-Thread Turbo:* 3900 MHz
> 
> *Ryzen 7 1800X*
> *Stock:* 3600 MHz
> *Multi-Thread Turbo:* 3700 MHz
> *Single-Thread Turbo:* 4000 MHz
> *XFR:* Yes
> *XFR Two-Thread Turbo:* 4100 MHz
> 
> *SKUs with "X" Suffix:* XFR boosts by 100 MHz
> *SKUs without "X" Suffix:* XFR boosts by 50 MHz
> 
> *** XFR has been observed to apply its frequency boost to up to four cores while gaming.

That list doesn't seem to be accurate. The 1700 also has XFR, although it will only boost to 3750 MHz.

It seems from all I've read that XFR is just a marketing gimmick; it's built into the boost algorithm and has a very well established limit for each CPU, just like the "regular" turbo, so technically speaking they are one and the same thing.


----------



## ryan92084

Quote:


> Originally Posted by *tpi2007*
> 
> That doesn't seem to be accurate. The 1700 also has XFR, although it will only boost to 3750 Mhz.
> 
> It seems from all I've read that XFR is just a marketing gimmick; it's built into the boosting algorithm and has a very well established limit for each CPU, just like the "regular" turbo, so technically speaking they are one and the same thing.


The X370 boards add XFR to non-X chips, but according to AMD non-X chips do not natively have XFR, so maybe I'm reading them wrong. It's a little confusing and seems a bit contradictory in that quote. Pretty sure they did swap cores and threads in there for XFR, though (it should be 2 cores and 4 threads instead of 4 cores and 2 threads).

X for XFR


Except when you have an X370 board, where you get half because the X chips get doubled


----------



## GorillaSceptre

Quote:


> Originally Posted by *III-Method-III*
> 
> I kind of asked this last night when people hadn't slept and some of the disappointment with Ryzen's (day 1) gaming performance was causing heated debate.
> 
> I game (2560x1440, 144 Hz monitor) and I also render, encode and work in Premiere Pro almost every day. My rig is a stock 4790K, 16 GB RAM, GTX 980.
> 
> I get reasonable frames at my gaming res in BF1 (90-100) at high but custom settings. When I play I'm recording with OBS, because Shadowplay doesn't let me record the 3 separate audio tracks I need.
> 
> So... Ryzen 1700/1800 build.
> 
> I plan to go to a GTX 1080 and 32 or 64 GB RAM (for Premiere).
> 
> Do you think, from what we see right now (not hopes for BIOS/game patches for SMT in the future), that my frames at 1440p on a 4790K will be higher than an 1800X? The same, or lower?
> 
> My view is that I'm GPU bound in BF1 at that res and high-ish detail while trying to hit 144 fps (to take advantage of my monitor), so moving to Ryzen and a 1080 shouldn't see an fps loss, right?
> 
> Meth


My guess would be that Ryzen mops the floor with it if you want higher-quality streams using the CPU and OBS. If you do a lot of rendering, encoding, etc., then it can compete with the likes of a 6900K; a 4-core wouldn't stand a chance in productivity.

My advice would be to just wait a bit; there are clearly some problems at the moment, and reviews/performance scores are all over the place. Once members here get their hands on them, you can ask them to run the exact scenarios/edge cases you're looking at, and then you can decide.

Obviously from a purely gaming perspective there's not much to see here (for those on recent i7's), but Ryzen will probably more or less perfectly fit your needs, especially if chips like the 6900K are out of your price range.


----------



## TinyRichard

Not trying to be any type of jerk here, I actually think competition is good for the market and I feel bad for anyone who spends hard earned money (i.e. good money (isn't it all?)) on anything that under-delivers.

But, without any intended snark (I'm asking an honest question here), for those of you who are disappointed in Ryzen's gaming performance as it (reportedly) stands today, where exactly in the company's past ten to fifteen years' worth of product history did you find any evidence to think the results would be different _this_ time?

Let me repeat, I feel bad for those who spend money and get less than optimal results. I just don't get the 'surprise' and disappointment I'm reading atm.


----------



## BobiBolivia

@TinyRichard - having followed the last circa 150 pages... simple over-hype, that's all.


----------



## tpi2007

Quote:


> Originally Posted by *ryan92084*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> That doesn't seem to be accurate. The 1700 also has XFR, although it will only boost to 3750 Mhz.
> 
> It seems from all I've read that XFR is just a marketing gimmick; it's built into the boosting algorithm and has a very well established limit for each CPU, just like the "regular" turbo, so technically speaking they are one and the same thing.
> 
> 
> 
> the x370 boards add xfr to non X chips but according to AMD non X chips do not natively have xfr. So its a little confusing and seems a bit contradictory in that quote. Pretty sure they did swap cores and threads in there though for XFR (should be 2 cores and 4 threads instead of 4 cores 2 threads)
> 
> X for xfr
> 
> 
> Except when you have an x370 but you get half because the X chips get doubled

Are those slides legitimate? The Ryzen logo on the top left over the top of the headlines looks weird. Anyway, those two slides are certainly confusing taken together. What do they mean on the first one by "High Performance, with XFR"? Does that exclude XFR from the non X chip or does it mean that non X chips just have a more "Pedestrian®" XFR?


----------



## Artikbot

Guess I'll find out when my 1700 gets here.


----------



## ryan92084

Quote:


> Originally Posted by *tpi2007*
> 
> Are those slides legitimate? The Ryzen logo on the top left over the top of the headlines looks weird. Anyway, those two slides are certainly confusing taken together. What do they mean on the first one by "High Performance, with XFR"? Does that exclude XFR from the non X chip or does it mean that non X chips just have a more "Pedestrian®" XFR?


I guess they could have been fake, but I think that was just the watermark added by the leaker; the embargo time on them did have the wrong time zone. You can see the whole set at http://www.overclock.net/t/1624452/leaked-amd-ryzen-press-briefing-slides-confirm-r7-1700-xfr-nda-date-and-more

EDIT: some of the other slides have been used in official reviews, but I don't recall seeing the XFR bar graph.


----------



## Phoenixlight

Has any review site benchmarked the chips in World of Warcraft Legion?


----------



## Echoa

Quote:


> Originally Posted by *Phoenixlight*
> 
> Has any review site benchmarked the chips in World of Warcraft Legion?


Or BDO in Calpheon would be another good one. Nobody ever benches CPUs in MMOs for some reason.


----------



## tpi2007

Quote:


> Originally Posted by *ryan92084*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> Are those slides legitimate? The Ryzen logo on the top left over the top of the headlines looks weird. Anyway, those two slides are certainly confusing taken together. What do they mean on the first one by "High Performance, with XFR"? Does that exclude XFR from the non X chip or does it mean that non X chips just have a more "Pedestrian®" XFR?
> 
> 
> 
> I guess they could have been fake but I think that was just from the watermark done by the leaker the embargo time did have the wrong time zone. You can see the whole set at http://www.overclock.net/t/1624452/leaked-amd-ryzen-press-briefing-slides-confirm-r7-1700-xfr-nda-date-and-more

Ah, thanks. I saw that thread yesterday briefly, but then this behemoth of a thread came about and I forgot about it. Rep+


----------



## iRUSH

Is Ryzen's IPC similar to Sandy, Ivy, or Haswell? Better?


----------



## AmericanLoco

About Broadwell-E levels, sometimes better depending on workload.


----------



## Gamingboy

Linus made a review of the Ryzen CPU a few days ago. Here is the link for those who are interested: https://www.youtube.com/watch?v=9wJQEHNYE7M&t=606s


----------



## DarthBaggins

From what I'm seeing I'll definitely continue using my 5930K for my main rig (might upgrade to a 6900K in the near future), since I don't do a crapton of encoding/rendering. But I would love to upgrade the wife's rig from her aging Phenom II X6 to a 1700X/1800X, especially since she runs Autodesk and would benefit from it more than I would, without having to break the bank on a secondary X99 rig.


----------



## OutlawII

Gaming-wise, if you have Ivy Bridge or newer there's no sense in buying Zen. I don't even know if this chip will drive down Intel prices very much.


----------



## tedman

Quote:


> Originally Posted by *OutlawII*
> 
> Gaming-wise, if you have Ivy Bridge or newer there's no sense in buying Zen. I don't even know if this chip will drive down Intel prices very much.


It might just about be worthwhile from SB or IB, but only just if you're gaming. I guess some advantages are having a newer platform and chipset/motherboard to play around with. My motherboard is 6 years old now! Also, I'm limited to DDR3.


----------



## Newbie2009

Not sure if someone already posted, but Guru 3D posted some benchmarks of 1700x with SMT on/off. It is definitely an issue in some games, but it is just one of a few issues it seems.

http://www.guru3d.com/articles_pages/amd_ryzen_7_1700x_review,22.html


----------



## 98uk

What's the story for Battlefield 1 then? It seems Ryzen isn't really leaping out as a winner in that specific game, right?


----------



## Wishmaker

Quote:


> Originally Posted by *98uk*
> 
> What's the story for Battlefield 1 then? Seems Ryzen isn't really leaping out as a winner in terms of performance in that specific game right?


Intel-coded game, it seems.


----------



## 98uk

Quote:


> Originally Posted by *Wishmaker*
> 
> Intel-coded game, it seems.


That's a shame; I'm sure that game could put multicore to use in great ways!

I was really considering going for Ryzen... but if it performs no better than a 4770K in games, then there's not much point.


----------



## Wishmaker

Quote:


> Originally Posted by *98uk*
> 
> That's a shame, i'm sure that game could put multicore to use in great ways!
> 
> I was really considering going for Ryzen... but if it performs no better than a 4770k in game, then not much point.


AMD says it is working closely to get this fixed. If that is truly the case and all the games out there are Intel-optimized, then Ryzen could still be an option. If the guy from AMD only said this as damage control, then you need to wait for Zen+.

We had two AMD statements these past days:

1. Ryzen has some SMT issues which will be improved with Zen+ (AMD engineer)
2. All games are Intel-coded and AMD is working to get this fixed (AMD guy)


----------



## DarthBaggins

Of course, with the launch of very new tech there are going to be issues. Ryzen is still definitely a contender against Intel's current line-up, but like any new tech it needs quite a bit of tweaking to correct the issues it has now and might have in the foreseeable future.


----------



## Ding Chavez

Good article here with a summary of Ryzen reviews. Has some early problems but should only get better.

Quote:


> Overall, AMD's solutions are slower than their Intel counterparts in pure FPS terms. Sometimes, the actual difference, while favoring Intel's CPUs, is negligible in pure fluidity terms; in others, the difference is markedly in Intel's favor. However, we can be looking at some bugs and needed optimizations on Ryzen's microcode and software updates for gaming and applications. AMD is fast to claim that it's working with software developers to patch and improve Ryzen support, claiming that its architecture is different enough to warrant some performance outliers. If one looks towards gaming at higher resolutions and with all eye-candy turned up to 11, however, like most of us like to play, suddenly the differences between chips seem to be minimal. Furthermore, AMD has recognized that its gaming performance needs improving, and says it expects "higher performance to occur throughout Q1 and Q2" as it works with developers and engine-makers to get Ryzen up to snuff. Oxide Games' Brad Wardell, for one, mentioned that Ashes of the Singularity currently isn't optimized for Ryzen, and promised future updates would increase performance.
> 
> Other things seem to point to some underlying issues: some games show better performance with AMD's SMT disabled. AMD also says that Windows' power profiles may adversely affect performance, recommending the OS's High Performance profile. Also, some games show higher performance on Ryzen with their DX12 code-paths (Hitman), while others show lower performance with it (Rise of the Tomb Raider.) AMD also suggests disabling the HPET (High Precision Event Timer), either in the BIOS or operating system, to gain a 5-8% advantage. AMD has also reportedly informed technology partners about the need to improve microcode on Ryzen in regards to memory performance, which could be negatively affecting performance. AMD has apparently put all of its available microcode optimizations in time for launch on the CPU cores side of the equation, somewhat leaving memory throughput, performance and latency a little on the green side. This could very well be part of the reason why the Ryzen processors don't fare so well under gaming environments.


https://www.techpowerup.com/231172/amds-ryzen-debut-onwards-to-the-hedt-market-or-the-stumbling-hype-train


----------



## 98uk

Quote:


> Originally Posted by *Wishmaker*
> 
> AMD says it is working closely to get this fixed. If that is truly the case and all the games out there are Intel-optimized, then Ryzen could still be an option. If the guy from AMD only said this as damage control, then you need to wait for Zen+.
> 
> We had two AMD statements these past days:
> 
> 1. Ryzen has some SMT issues which will be improved with Zen+ (AMD engineer)
> 2. All games are Intel-coded and AMD is working to get this fixed (AMD guy)


Quote:


> Originally Posted by *DarthBaggins*
> 
> Of course with the launch of a very new tech there are going to be issues. Ryzen is still definitely a contender for Intel's current line-up but of course as any new tech it needs quite a bit of tweaking to correct any issues it has now and might have in the forseeable future.


Fingers crossed they find some improvements in a new stepping or firmware!

I have until late March to make my mind up! I will buy when I visit the USA then.


----------



## Shiftstealth

Quote:


> Originally Posted by *98uk*
> 
> Fingers crossed they find some improvements in a new stepping or firmware!
> 
> I have until late March to make my mind up! I will buy when I visit the USA then.


I mean, you could look at it two ways. Keep in mind this is on an LPP node (previously used mainly for ARM CPUs).

First, the initial samples just 4-5 months ago maxed out at 3.15 GHz boost. We now see up to 4.1 GHz boost with XFR at stock.

Second, there is the possibility that, since it is an LPP node, 4 GHz essentially maxes out the process they are using. These nodes are most efficient from roughly 2.1 GHz to 3.3 GHz; above that, voltage scaling goes out the window.

So either they are making progress, or the node is tapped out.


----------



## Blackops_2

Quote:


> Originally Posted by *Wishmaker*
> 
> AMD says it is working closely to get this fixed. If that is truly the case and all the games out there are Intel-optimized, then Ryzen could still be an option. If the guy from AMD only said this as damage control, then you need to wait for Zen+.
> 
> We had two AMD statements these past days:
> 
> 1. Ryzen has some SMT issues which will be improved with Zen+ (AMD engineer)
> 2. All games are Intel-coded and AMD is working to get this fixed (AMD guy)


I think Zen+ will give us everything we want, especially if they increase IPC by 10% or so along with much higher clocks. But when the hell is that supposed to be here? I know they have a roadmap, but it's a day after Zen's release and we're already talking about SMT issues that won't be ironed out until the next iteration?








Quote:


> Originally Posted by *tedman*
> 
> It might just about be worthwhile from SB or IB, but only just if you're gaming. I guess some advantages is having a newer platform, chipset/motherboard to play around with. My motherboard is 6 years old now! Also, I'm limited to DDR3.


I'm not holding my breath, but I'm eager to see what the 1600X will do. I don't really know if it would be worth moving from my Ivy platform just yet, and at 4.5 GHz it's no slouch. That was obtained on an AIO as well; now that it's in a loop I might see if I can go further.


----------



## Shiftstealth

Quote:


> Originally Posted by *Blackops_2*
> 
> I think Zen+ will give us everything we want, especially if they increase IPC by 10% or so along with much higher clocks. But when the hell is that supposed to be here? I know they have a roadmap, but it's a day after Zen's release and we're already talking about SMT issues that won't be ironed out until the next iteration?
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not holding my breath, but I'm eager to see what the 1600X will do. I don't really know if it would be worth moving from my Ivy platform just yet, and at 4.5 GHz it's no slouch. That was obtained on an AIO as well; now that it's in a loop I might see if I can go further.


You will not see 4.5 GHz on Ryzen unless the 1600Xs are on a new stepping, but they are likely binning them as we speak, so I don't believe that will be the case. Perhaps in 6 months they might have a 1650X or something...


----------



## kaosstar

I'd consider this a worthwhile upgrade from: any previous AMD processor, any i5, any Intel processor from Sandy Bridge or before.


----------



## Shiftstealth

Quote:


> Originally Posted by *kaosstar*
> 
> I'd consider this a worthwhile upgrade from: any previous AMD processor, any i5, any Intel processor from Sandy Bridge or before.


That is a fair assessment.


----------



## tpi2007

Quote:


> Originally Posted by *Newbie2009*
> 
> Not sure if someone already posted, but Guru 3D posted some benchmarks of 1700x with SMT on/off. It is definitely an issue in some games, but it is just one of a few issues it seems.
> 
> http://www.guru3d.com/articles_pages/amd_ryzen_7_1700x_review,22.html


The first two games listed, the latest Hitman and Tomb Raider titles, show less than 3% and less than 1% variation respectively (edit: and Tom Clancy's The Division is at 1% variation), both at 100+ fps. I wonder if Intel's SMT implementation wouldn't yield the same kind of results. At least in the Sandy Bridge days that kind of variation was to be expected.

However, Far Cry Primal does show a much larger gap, at around 10%, so I'm inclined to think that part of it is definitely game code based, which might in addition not be playing properly with the way Windows is currently scheduling threads.
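For anyone wanting to sanity-check those percentages, here is a quick sketch of the comparison (the FPS figures below are hypothetical illustrations, not Guru3D's exact numbers):

```python
def smt_delta_pct(fps_smt_on, fps_smt_off):
    """Percent FPS change from disabling SMT, relative to the SMT-on result."""
    return (fps_smt_off - fps_smt_on) / fps_smt_on * 100

# Hypothetical averages: one game gaining ~10% with SMT off,
# another within typical run-to-run noise
print(smt_delta_pct(100.0, 110.0))  # 10.0 -> a real SMT penalty
print(smt_delta_pct(120.0, 121.0))  # ~0.83 -> likely just noise
```

Anything inside 1-3% is hard to distinguish from benchmark variance, which is why only the larger gaps (like Far Cry Primal's ~10%) clearly point at SMT scheduling.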


----------



## czin125

Quote:


> Originally Posted by *CataclysmZA*
> 
> Yeah, the site I remember reading from did test patch 1.3, but it still doesn't run as well on a Core i3 as it does a regular Core i5.
> 
> Computerbase's results are closer to what I'd expect from the 1800X, but a lot of other reviews are all over the place. I wonder what the underlying cause is for all the discrepancy.


Broadwell-E CPUs are paired with DDR4-2400 in ComputerBase's tests. Timings are probably 17+?

"In response to the many comments on the article 'CPU scaling tested: 6, 8 or 10 CPU cores beat 4 fast ones', ComputerBase also retested Broadwell-E in Anno 2205, Ashes of the Singularity, The Witcher 3, Watch Dogs 2 and F1 2016 with dual-channel DDR4-2400."

And they also have DDR4-3200 16-18-18-36 in their link for the AM4 test.

http://www.gskill.com/en/press/view/g-skill-announces-flare-x-series-and-fortis-series-ddr4-memory-for-amd-ryzen
New: DDR4-3200 14-14-14-34 CR1 1.35 V, designed for AM4
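Those timings matter because absolute (first-word) latency depends on both the CAS latency and the transfer rate. A rough sketch of the standard conversion:

```python
def first_word_latency_ns(transfer_rate_mts, cas):
    """Approximate first-word latency in ns. DDR does two transfers per
    clock, so one clock period is 2000 / (rate in MT/s) ns, and the
    first word arrives after CAS clocks."""
    return cas * 2000 / transfer_rate_mts

print(first_word_latency_ns(3200, 14))  # DDR4-3200 CL14 -> 8.75 ns
print(first_word_latency_ns(3200, 16))  # DDR4-3200 CL16 -> 10.0 ns
print(first_word_latency_ns(2400, 17))  # DDR4-2400 CL17 -> ~14.2 ns
```

So the new CL14 kit reaches the first word about 12% sooner than the CL16 kit, and is well ahead of DDR4-2400 CL17, which is part of why the memory configuration can skew review-to-review comparisons.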


----------



## OutlawII

I see this as a step in the right direction for the CPU industry, but to blame Zen's lackluster performance in games on Intel and game devs is absolute garbage.


----------



## Alwrath

Quote:


> Originally Posted by *jprovido*
> 
> Stop arguing about streaming. I streamed Overwatch yesterday just to try it: 30 fps with max bitrate on Nvidia Experience.
> https://www.twitch.tv/videos/125965145
> 
> I got A LOT of dropped frames, lol. Seems like the demo at the AMD event was accurate.
> 
> Edit:
> 
> I have an i7 7700K @ 5.1 GHz


You know what's funny? I could swear I sometimes see dropped frames from my old i5 760 in games like MechWarrior Online. Skipped frames are a serious issue; I wonder if people with Core i5s all experience it and just don't realize.


----------



## Alwrath

Quote:


> Originally Posted by *czin125*
> 
> Broadwell-E cpus are paired with DDR4-2400 in computerbase's tests. Timings are probably 17+?
> 
> "In response to the many comments on the article 'CPU scaling tested: 6, 8 or 10 CPU cores beat 4 fast ones', ComputerBase also retested Broadwell-E in Anno 2205, Ashes of the Singularity, The Witcher 3, Watch Dogs 2 and F1 2016 with dual-channel DDR4-2400."
> 
> And they also have DDR4-3200 16-18-18-36 in their link for the AM4 test
> 
> http://www.gskill.com/en/press/view/g-skill-announces-flare-x-series-and-fortis-series-ddr4-memory-for-amd-ryzen
> New
> DDR4-3200 14-14-14-34 CR1 1.35v designed for AM4


It's too bad they don't offer it in 2x16 GB sticks @ 14-14-14.


----------



## Samuris

Quote:


> Originally Posted by *Echoa*
> 
> Or BDO in Calpheon would be another good one. Nobody ever bench's CPUs in MMOs for some reason.


Yes, it would be very interesting. In the city I gain more than 30 fps with my i7 7700K at 5 GHz versus my stock i7 6700K, which constantly had 30-35 fps in the city. Trust me, it's weird, but I'm not crazy.


----------



## looniam

Quote:


> Originally Posted by *Shiftstealth*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kaosstar*
> 
> I'd consider this a worthwhile upgrade from: any previous AMD processor, any i5, any Intel processor from Sandy Bridge or before.
> 
> 
> 
> That is a fair assessment.

And I'll agree also, but unfortunately I was hoping for a "slam dunk" upgrade from Ivy...

Not that that's a bad thing. It just gives me ~6 months to see what happens.


----------



## Alwrath

Quote:


> Originally Posted by *Samuris*
> 
> Yes, it would be very interesting. In the city I gain more than 30 fps with my i7 7700K at 5 GHz versus my stock i7 6700K, which constantly had 30-35 fps in the city. Trust me, it's weird, but I'm not crazy.


MMOs are known to be extremely taxing on hardware at times, due to the area and the players running around. I remember joining raids in EverQuest with 100+ people back in 2002 on a 3 GHz Pentium 4 with a Radeon 9700 Pro and 2 GB of RAM, and it would slow to a crawl; you were lucky to get 10 fps.


----------



## Shiftstealth

Quote:


> Originally Posted by *looniam*
> 
> And I'll agree also, but unfortunately I was hoping for a "slam dunk" upgrade from Ivy...
> 
> Not that that's a bad thing. It just gives me ~6 months to see what happens.


If everyone waited on Zen for Zen+ AMD would go under, and we'd just have Intel.

That is really all i have to say.


----------



## JackCY

Anyone seen reviews or other benches done with 1 CCX disabled? As in 4C/8T only? Compared with 8C/16T in games or other software that had issues with the bad OS thread management?


----------



## Newbie2009

Quote:


> Originally Posted by *looniam*
> 
> and i'll agree also but unfortunately i was hoping for a "slam dunk" upgrade from ivy . . .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> not that's a bad thing. just gives me ~6 months to see what happens.


Give it a month or two I'd say for the gaming. Some results just look off.


----------



## Newbie2009

Quote:


> Originally Posted by *Shiftstealth*
> 
> If everyone waited on Zen for Zen+ AMD would go under, and we'd just have Intel.
> 
> That is really all i have to say.


Yeah, agree. AMD have done something amazing with Zen on a tiny fraction of Intel's R&D budget. And they have also forced Intel to pull the finger out.
I will have to pull the trigger this year on one of their new CPUs and one of their new GPUs for the good of AMD and the industry.


----------



## Alwrath

Quote:


> Originally Posted by *Newbie2009*
> 
> Give it a month or two I'd say for the gaming. Some results just look off.


On top of that, I want to get the 1700, but I want the best motherboard possible because I'll be stuck on AM4 for 4+ years. Better motherboards will be out in a few more months, and hopefully they improve overclocking. Once Zen 2 comes out I can just eBay my 1700.


----------



## Samuris

I can return my i7 7700K to Amazon and get an R7 1700 (Amazon's return policy allows it), even though I delidded and relidded it, applied Thermal Grizzly Conductonaut, and lowered my board's VCC PLL overvoltage to 1.17V to get it 100% stable at 5 GHz and 1.3V, never exceeding 45°C in Prime95. So I'm torn: I got a good chip with this i7 and it seems to perform better in games, but Ryzen probably has a better future and performs better in applications. So, any advice? What should I do?


----------



## JackCY

Quote:


> Originally Posted by *Newbie2009*
> 
> Give it a month or two I'd say for the gaming. Some results just look off.


The results are off; there are unresolved software issues: UEFI, OS, compiler optimizations. By the launch of the next Ryzen R5/R3 chips it will probably be smoothed out and the platform will perform much more like it should.
It is so obvious when some games prefer the 2-4C Intel no matter what and everything else tanks. Sometimes it's bad thread allocation from the OS, etc. I've seen games that ran an equal load on a 4C/8T Intel but ran like a 2C/2T on Ryzen because, who knows why, the game didn't detect the Core architecture CPU and started using threads in the worst way possible: 1 thread maxed out, a 2nd at 40-50%, and the remaining 14 barely doing anything, idling around 10-15%.
Sadly not all reviewers know much, or care to show the actual raw data and loads across CPU cores/threads. Most boards besides Gigabyte can't even hit 160+ in Cinebench 1T, which shows the system config is running worse than it should, as stated by AMD themselves; ASUS is especially one of those.
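For anyone who wants to test the bad-thread-allocation theory themselves, one quick experiment is to pin the game to a single CCX's worth of logical CPUs and compare framerates. A minimal stdlib sketch for Linux (the CPU numbering for one CCX is an assumption; check your own topology first):

```python
import os

def pin_to_cpus(pid, cpus):
    """Restrict a process (and all its threads) to the given set of CPUs.

    On an 8C/16T Ryzen, pinning a badly-scheduled game to one CCX
    (e.g. logical CPUs 0-7, which is an assumption -- verify with lscpu)
    keeps its threads off the cross-CCX path.
    """
    os.sched_setaffinity(pid, cpus)   # Linux-only stdlib call
    return os.sched_getaffinity(pid)  # confirm the new mask

# Example: pin the current process (pid 0 means "self") to CPUs 0-7.
# new_mask = pin_to_cpus(0, set(range(8)))
```

Windows has an equivalent via Task Manager's "Set affinity", so reviewers could run the same comparison there.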

---

Don't buy a 7700K; it's always been a rip-off to buy Intel i7s over i5s. You're paying +50% (+$100+) for HT and +2MB cache, that's all.
If you have a 7700K, keep it unless you want the Ryzen 8 cores.


----------



## dir_d

Quote:


> Originally Posted by *Samuris*
> 
> I can return my i7 7700K to Amazon and get an R7 1700 (Amazon's return policy allows it), even though I delidded and relidded it, applied Thermal Grizzly Conductonaut, and lowered my board's VCC PLL overvoltage to 1.17V to get it 100% stable at 5 GHz and 1.3V, never exceeding 45°C in Prime95. So I'm torn: I got a good chip with this i7 and it seems to perform better in games, but Ryzen probably has a better future and performs better in applications. So, any advice? What should I do?


Keep the i7


----------



## Newbie2009

Quote:


> Originally Posted by *Samuris*
> 
> I can return my i7 7700K to Amazon and get an R7 1700 (Amazon's return policy allows it), even though I delidded and relidded it, applied Thermal Grizzly Conductonaut, and lowered my board's VCC PLL overvoltage to 1.17V to get it 100% stable at 5 GHz and 1.3V, never exceeding 45°C in Prime95. So I'm torn: I got a good chip with this i7 and it seems to perform better in games, but Ryzen probably has a better future and performs better in applications. So, any advice? What should I do?


I presume you have forked out for expensive RAM and a motherboard already, so hold on to it. The 7700K is a lovely processor.


----------



## kaosstar

Phoronix posted Linux gaming benchmarks of the 1800x vs. 7700k. It's not pretty.

http://www.phoronix.com/scan.php?page=article&item=amd-ryzen-gaming&num=1


----------



## kd5151

6900k + ramen noodles
1700 + steak
7700k + happy meal


----------



## Charcharo

Quote:


> Originally Posted by *TinyRichard*
> 
> Not trying to be any type of jerk here, I actually think competition is good for the market and I feel bad for anyone who spends hard earned money (i.e. good money (isn't it all?)) on anything that under-delivers.
> 
> But, without any intended snark (I'm asking an honest question here), for those of you who are disappointed in Ryzen's gaming performance as it is (reportedly) today, where exactly in the company's past ten - fifteen year's worth of product history did you find any evidence to think the results would be any different _this_ time?
> 
> Let me repeat, I feel bad for those who spend money and get less than optimal results. I just don't get the 'surprise' and disappointment I'm reading atm.


The early 2000s really. Some good old fashioned AMD beating the hell out of Intel.

Comparisons to other technological fields as well (where pure willpower and talent overrule money).


----------



## looniam

Quote:


> Originally Posted by *Shiftstealth*
> 
> If everyone waited on Zen for Zen+ AMD would go under, and we'd just have Intel.
> 
> That is really all i have to say.


i'm not really thinking of zen+ but other SKUs being released and bugs worked out. and trying hard not to "offend" anyone - i am sure there are plenty of AMD enthusiasts who have an intel cpu and feel justified to swap out.
Quote:


> Originally Posted by *Newbie2009*
> 
> Give it a month or two I'd say for the gaming. Some results just look off.


AMD has given their reasoning for the differences and i think a month is too ambitious of a time period to make the appropriate corrections.

but believe me when i say i hope it does all work out soon.


----------



## Shiftstealth

Quote:


> Originally Posted by *looniam*
> 
> i'm not really thinking of zen+ but other SKUs being released and bugs worked out. and trying hard not to "offend" anyone - i am sure there are plenty of AMD enthusiasts who have an intel cpu and feel justified to swap out.
> AMD has given their reasoning for the differences and i think a month is too ambitious of a time period to make the appropriate corrections.
> 
> but believe me when i say i hope it does all work out soon.


I wasn't offended, just to clarify. I'm just saying, with AMD only having $1 billion in assets, if this processor tanks because people want to wait, they'll be waiting a lot longer than 5 years for competition.


----------



## looniam

Quote:


> Originally Posted by *Shiftstealth*
> 
> I wasn't offended, just to clarify. I'm just saying, with AMD only having $1 billion in assets, if this processor tanks because people want to wait, they'll be waiting a lot longer than 5 years for competition.


if they held it together during BD/PD years . . _i think_ this should be a walk in the park.


----------



## Shiftstealth

Quote:


> Originally Posted by *looniam*
> 
> if they held it together during BD/PD years . . _i think_ this should be a walk in the park.


If Zen only yields as much profit as BD/PD, or not much more, i could totally see AMD just exiting the CPU market and sticking to consoles or low-end stuff à la VIA.


----------



## kaosstar

Quote:


> Originally Posted by *looniam*
> 
> if they held it together during BD/PD years . . _i think_ this should be a walk in the park.


No doubt. In fact, I firmly believe AMD will gain back significant marketshare with Ryzen.


----------



## tpi2007

Quote:


> Originally Posted by *OutlawII*
> 
> I see this as a step in the right way for the cpu industry,but to blame the lackluster performance of Zen in games on Intel and Gaming devs is absolute garbage.


Absolutely. Game developers can't take advantage of an arch that isn't there, and games especially are less predictable than, say, an encoding application with its more constant flow of work.

The blame in this case lies with AMD, just as it did with Intel on other occasions (see below). They need to provide a driver for Windows to help it schedule things properly and manage power states on the two CCXes, because this isn't exactly a native eight-core CPU; it's two quad cores tightly put together with an advanced interface, but it's still two modules with 8 MB of L3 cache each. Interestingly, the quad cores should thus have fewer problems even without any fixes because of this.
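As an aside, the split L3 is something the OS can actually see: on Linux, each CPU's `/sys/devices/system/cpu/cpuN/cache/index3/shared_cpu_list` reports which logical CPUs share that L3. A minimal sketch of parsing that cpulist format (a hypothetical helper, just to illustrate the topology data a scheduler-aware tool would read):

```python
def parse_cpu_list(cpulist):
    """Parse a sysfs cpulist string such as '0-3,8-11' into a set of CPU ids.

    On a two-CCX Ryzen 7, index3 (the L3) would be expected to report two
    disjoint sets -- one per CCX -- rather than one set covering all 16
    logical CPUs, which is exactly the boundary a scheduler should respect.
    """
    cpus = set()
    for part in cpulist.strip().split(','):
        if '-' in part:
            lo, hi = part.split('-')
            cpus.update(range(int(lo), int(hi) + 1))
        elif part:
            cpus.add(int(part))
    return cpus

# e.g. parse_cpu_list(open(
#     '/sys/devices/system/cpu/cpu0/cache/index3/shared_cpu_list').read())
```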

The games, well, AMD can work with the developers, as they seem to be doing, to take advantage of the architecture better.

I'll just say, as a general note, that some people might have the erroneous idea that an operating system doesn't need drivers for the CPU, that it just works. It does, but only at the basic x86 level, just like any GPU will boot in VGA mode without any special driver, because that's the agreed-upon minimum common standard. It's just that CPUs change less often, and thus those drivers end up being integrated into the OS itself.

For example, Windows Vista no longer needed AMD's Dual Core Optimizer driver. And when Intel released Skylake, they were the ones providing Microsoft with a patch for the Speed Shift feature for Windows 10. Not even Windows 10 supported it out of the box, it only got support for it in November of 2015:
Quote:


> Speed Shift was noticeably absent at the time of the launch of the processors. This is due to one of the requirements for Speed Shift - it requires operating system support to be able to hand over control of the processor performance to the CPU, and Intel had to work with Microsoft in order to get this functionality enabled in Windows 10. As of right now, anyone with a Skylake processor is actually not getting the benefit of the technology, at least right now. A patch will be rolled out in November for Windows 10 which will enable this functionality


Quote:


> For this short piece, *Intel was able to provide us with the Windows 10 patch* for Speed Shift ahead of time, so that we could test and see what kind of gains it can achieve. This gives us a somewhat unique situation, since we can isolate this one variable on a new processor and measure its impact on various workloads.


So, to put it into perspective with a relatively new example from Intel, for more than 2 months after release, there was a feature in Skylake that simply wasn't working, because there wasn't software support for it.


----------



## JackCY

*For those that don't know, Aida64 will be fixed, it's just another software not ready for Ryzen.*


----------



## looniam

Quote:


> Originally Posted by *Shiftstealth*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> if they held it together during BD/PD years . . _i think_ this should be a walk in the park.
> 
> 
> 
> 
> 
> 
> 
> If Zen only yields as much profit as BD/PD, or not much more i could totally see AMD just exiting the CPU market and sticking to consoles or the low end stuff ala VIA.
Click to expand...

this is from PCper's amazon sales linked through their web site for feb 22 (represents $300K in PC related sales)

Anecdotal: Early AMD Ryzen Pre-orders Show Exceptional Demand
Quote:


> Though we may have to wait months before getting reports from market research firms or from AMD earnings calls, we have at least some anecdotal information to share. PC Perspective is a long running hardware review and information website catering to enthusiasts and PC gamers. Through a bit of luck (being placed near the top of February 22nd Google searches for "Ryzen") and an Amazon affiliate integration, PC Perspective had a sales day topping $300,000 in PC hardware!
> 
> From that information, and a breakdown of the specific items purchased and their categories, the following data was provided:


granted we can speculate there were some returns but then that is pre-order and i'm pretty sure quite a few went ahead and ordered plenty more after reviews. but w/o being any "expert" myself . . . that looks pretty healthy esp. compared to what BD went through several years ago.


----------



## Artikbot

So..

GSkill is releasing new super-tight latency memory kits especially marketed for Ryzen rigs

AMD is updating the microcode nearly day after day

Software devs are getting ahold of Ryzen optimization procedures

AMD is also developing a driver so Windows can schedule better and not crap out with power profiles

I wonder how long until we see Ryzen showing its computing muscle in games?


----------



## lombardsoup

Quote:


> Originally Posted by *looniam*
> 
> granted we can speculate there were some returns but then that is pre-order and i'm pretty sure quite a few went ahead and ordered plenty more after reviews. but w/o being any "expert" myself . . . that looks pretty healthy esp. compared to what BD went through several years ago.


Will that be enough to force Intel to make any meaningful price cuts?


----------



## Quantum Reality

Quote:


> Originally Posted by *lombardsoup*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> granted we can speculate there were some returns but then that is pre-order and i'm pretty sure quite a few went ahead and ordered plenty more after reviews. but w/o being any "expert" myself . . . that looks pretty healthy esp. compared to what BD went through several years ago.
> 
> 
> 
> Will that be enough to force Intel to make any meaningful price cuts?
Click to expand...

Well, Micro Center already tried launching fake price cuts (I say "fake" because apparently the "discounted" prices are really just the same as before), and the secondary market (eBay, etc.) is now flooded with i5s and i7s, and prices there are dropping quite handily.

Intel will eventually have to budge; the question is, by how much?


----------



## Zero_

Still nothing from Eurogamer/Digital Foundry? Must be working hard to find enough fault with these before publishing a review.


----------



## lombardsoup

Quote:


> Originally Posted by *Quantum Reality*
> 
> Well, Microcenter already tried launching fake price cuts (i say "fake" because apparently they just pretended to discount the prices which are really just the same as before), and the secondary market (ebay, etc) are now flooded with i5s and i7s and prices there are dropping quite handily.
> 
> Intel will eventually have to budge; the question is, by how much?


Music to my ears


----------



## ZealotKi11er

Quote:


> Originally Posted by *Artikbot*
> 
> So..
> 
> GSkill is releasing new super-tight latency memory kits especially marketed for Ryzen rigs
> AMD is updating the microcode nearly day after day
> Software devs are getting ahold of Ryzen optimization procedures
> AMD is also developing a driver so Windows can schedule better and not crap out with power profiles
> 
> I wonder how long until we see Ryzen showing its computing muscle in games?


Just in time for Zen+.


----------



## looniam

granted as i said i am far from being an expert

BUT









i don't see intel doing anything _until there is a large* change in market share_.

*large= ?????


----------



## budgetgamer120

I was going to reply to the trolls. But I decided not to help ruin the thread anymore.

Quote:


> Originally Posted by *Kuivamaa*
> 
> Are you sure about all that? Crysis 1 is a game that sees two cores max after all, i3 2120 and i7 2600k are both Sandy. I doubt that they will perform all that different even at cpu heavy areas, even when both paired with a Titan XP in that game, just like 2011 the bench insinuates.


How can it be CPU heavy if it only uses 2 cores while 4 are available?

Quote:


> Originally Posted by *oxidized*
> 
> I hope so, i also hope 6/12 will give a decent boost to gaming performance besides all fixing and updates


Why would a 6 core give a boost? It is the same silicon. All you can expect are lower prices.
Quote:


> Originally Posted by *jprovido*
> 
> stop arguing about streaming. I streamed overwatch yesterday just to try it. 30fps with max bitrate on nvidia experience
> https://www.twitch.tv/videos/125965145
> 
> I got A LOT Of dropped frames lol. seems like the demo on the AMD event is true
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit:
> 
> I have an i7 7700k @ 5.1ghz


Thank you sir








Quote:


> Originally Posted by *DarthBaggins*
> 
> From what I'm seeing I'll definitely continue using my 5930K (might upgrade to a 6900k in the near future) for my main rig since I don't do a crapton of encoding/rendering. But I would love to upgrade the wife's rig to a 1800x/1700x from the aging Phenom II x6, especially since she would benefit from it more than I would running Autodesk, without having to break the bank on a secondary x99 rig.


So why would you be buying a 6900k then?


----------



## sugarhell

Quote:


> Originally Posted by *Artikbot*
> 
> So..
> 
> GSkill is releasing new super-tight latency memory kits especially marketed for Ryzen rigs
> AMD is updating the microcode nearly day after day
> Software devs are getting ahold of Ryzen optimization procedures
> AMD is also developing a driver so Windows can schedule better and not crap out with power profiles
> 
> I wonder how long until we see Ryzen showing its computing muscle in games?


I think it will improve soon enough.

The Linux benchmarks show the computing muscle of Ryzen. I am excited about Naples in the server market.
But the Linux games show the same problems as the Windows games: weird performance all around. It probably needs a lot of work from both AMD and the software developers, and API changes in general.

Hmm, also, why does no one talk about perf/watt? It's insane compared to Bulldozer, and I believe a bit better than Intel's (except in gaming).


----------



## Brutuz

Quote:


> Originally Posted by *tpi2007*
> 
> Are those slides legitimate? The Ryzen logo on the top left over the top of the headlines looks weird. Anyway, those two slides are certainly confusing taken together. What do they mean on the first one by "High Performance, with XFR"? Does that exclude XFR from the non X chip or does it mean that non X chips just have a more "Pedestrian®" XFR?


I reckon XFR is using the new fine-grained Turbo multipliers to auto-OC, so a 1700 can "use" XFR, but only in that it can use auto-OCing and the new fine-grained clock increases if the mobo supports it.
Quote:


> Originally Posted by *OutlawII*
> 
> Gaming wise if you have from a Ivybridge or up no sense in buying Zen. I dont even know if this chip will drive down Intel prices very much


It's not worth upgrading from SB or IB if you're running one OCed at all atm...My 3570k is typically around 40% in Civ VI. AMD has mostly caught up to Intel but Intel has had such small increases since the massive jump with SB that it's just not noticeable for most people.

That said, with the variance in results on similar benchmarks I'm thinking like most AMD platforms it can be quite a bit faster than reviews show with the right tuning. (eg. Increasing CPU/NB frequency on Phenom II would net similar performance to simply OCing the core and could be done in addition, but most reviewers didn't bother with it) They never matched Nehalem but they came a lot closer when at 4Ghz with a 2.6Ghz CPU/NB.
Quote:


> Originally Posted by *OutlawII*
> 
> I see this as a step in the right way for the cpu industry,but to blame the lackluster performance of Zen in games on Intel and Gaming devs is absolute garbage.


Why? It's been well known Intel has been using their compiler to limit AMD CPUs to old code paths while their own get the latest, fastest stuff. It's only a few years ago that they let AMD get SSE3 instead of x87, for god's sake... As for devs, I can see why they're blaming them, but can also see why people could consider that AMD's fault: so many people have Intel CPUs that that's what they optimise for.
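On the compiler-dispatch point: the CPUID feature flags are what a vendor-neutral dispatcher should key off, and Linux exposes them directly in /proc/cpuinfo, the same way for AMD and Intel. A minimal sketch of pulling those flags out (a hypothetical helper; note SSE3 shows up under its 'pni' name):

```python
def cpu_flags(cpuinfo_text):
    """Extract the feature-flag set from /proc/cpuinfo-style text.

    The flags come straight from CPUID, so SSE3 (listed as 'pni'),
    SSE4.2, AVX, etc. appear identically on AMD and Intel chips that
    support them; a vendor-neutral dispatcher only needs these bits,
    not the "GenuineIntel"/"AuthenticAMD" vendor string.
    """
    for line in cpuinfo_text.splitlines():
        if line.startswith('flags'):
            return set(line.split(':', 1)[1].split())
    return set()

# e.g. on Linux: cpu_flags(open('/proc/cpuinfo').read())
```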
Quote:


> Originally Posted by *looniam*
> 
> and i'll agree also but unfortunately i was hoping for a "slam dunk" upgrade from ivy . . .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> not that's a bad thing. just gives me ~6 months to see what happens.


I think it is, but that it won't become obvious from sheer FPS for a few years much like the Q6600. We already know that games are trending to using more threads and that Ryzen is faster than Kaby Lake when all of its threads are used. Does it not stand to reason that KL will hit its limit sooner? Especially as most of the reason it can get higher FPS is because it can use more of its resources more easily right now.
Quote:


> Originally Posted by *JackCY*
> 
> The results are off, there are unresolved software issues, UEFI, OS, compiler optimizations. By the launch of next Ryzen R5/3 it will probably be smoothed out and the platform perform much more like it should.
> It is so obvious when some games prefer the 2-4C Intel no matter what and everything else tanks. Sometimes it's the bad thread allocation from OS etc. I've seen games that ran equal load on 4C/8T Intel but ran like a 2C/2T on Ryzen because who knows why, game didn't detect Core architecture CPU and started using threads in the worst way possible, 1 thread maxed out, 2nd 40-50%, the rest of 14 barely anything around 10-15% idling.
> Sadly not all reviewers know much or care to show the actual raw data and loads across CPU cores/threads. Most boards beside Gigabyte can't even hit 160+ in Cinebench 1T, which shows that the system config is running worse than it should as stated by AMD themselves, ASUS is one of those especially.
> 
> ---
> 
> Don't buy a 7700K it's always been a rip off to buy Intel i7s over i5s, you're paying +50% (+$100+) for HT and +2MB cache, that's all.
> If you have a 7700K, keep it unless you want the Ryzen 8 cores.


This. Especially the part about reviewers. They leave so much detail out, even the good ones because they have so much to cover. Not many sites reported on the increase from CPU/NB OCing on Phenom II, or take the time to retest old hardware with newer drivers in reviews. (eg. Vega comes out so the 1080, 1080Ti and RX 480 all get new results from latest drivers)

It's why you use reviews to get a general idea of OCing then wait for enthusiasts to buy the chips and play with them enough to figure it out, as well as any hidden gems that increase performance noticeably. (eg. CPU/NB OCing or Socket 775 benefitting more from tight timings than high bandwidth RAM)
Quote:


> Originally Posted by *kaosstar*
> 
> Phoronix posted Linux gaming benchmarks of the 1800x vs. 7700k. It's not pretty.
> 
> http://www.phoronix.com/scan.php?page=article&item=amd-ryzen-gaming&num=1


Yup. Think about how badly most games are threaded on Windows, then remember Linux games are ported from that. Ryzen will thrash Intel in general desktop usage on Linux because of GCC (with the right optimisations, even an FX could match a 3770k under Linux), but it will always remain a bit iffy in games because they'll be single-threaded for a while longer, at least until Vulkan picks up steam for Linux ports.

But you can also see from the Vulkan results that Ryzen isn't quite bug-free yet either. That kind of difference isn't just from a single-threaded application and hardware differences; it's a bug.


----------



## Ultracarpet

Quote:


> Originally Posted by *looniam*
> 
> granted as i said i am far from being an expert
> 
> BUT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i don't see intel doing anything _until there is a large* change in market share_.
> 
> *large= ?????


The marketshare and mindshare impact will be relatively small to Intel, but will still make a big impact on AMD's bottom line.

For the x86 marketshare and mindshare to be really shaken up, AMD would need a chip that has a complete victory across pretty much every metric. Only then would you ever hear a Best Buy employee recommend AMD. I mean, I can almost guarantee that even with Ryzen performing the way it does, Best Buy employees and the like will in many cases be recommending Pentiums/Celerons before any Ryzen chips.


----------



## Shiftstealth

Quote:


> Originally Posted by *Ultracarpet*
> 
> The marketshare and mindshare impact will be relatively small to Intel, but will still make a big impact on AMD's bottom line.
> 
> For the x86 marketshare and mindshare to be really shaken up, AMD would need a chip that has a complete victory across pretty much every metric. Only then would you ever hear a Best Buy employee recommend AMD. I mean, I can almost guarantee that even with Ryzen performing the way it does, Best Buy employees and the like will in many cases be recommending Pentiums/Celerons before any Ryzen chips.


Ha. Yep. They all play games, and will recommend Intel for users just because it plays their games faster. (5% faster)


----------



## tpi2007

Quote:


> Originally Posted by *Artikbot*
> 
> So..
> 
> GSkill is releasing new super-tight latency memory kits especially marketed for Ryzen rigs
> AMD is updating the microcode nearly day after day
> Software devs are getting ahold of Ryzen optimization procedures
> AMD is also developing a driver so Windows can schedule better and not crap out with power profiles
> 
> I wonder how long until we see Ryzen showing its computing muscle in games?


My guess is that the first microcode updates should be focusing on getting rid of the glaring and out of spec performance discrepancies across different motherboards from different makers, and ironing out the bugs. You can probably start feeling the benefits in one week, but it'll probably take up to two months to get most of it together for the majority of cases, like optimizing RAM profiles for a wide variety of kit speeds, number of modules and densities.

OS scheduling could be ready in one, two or three months. The Windows 7 scheduler patches took three months after Bulldozer was released, but that was an arch with much less interest and the results were mixed to begin with; this time around the market interest and pressure will probably make everybody work faster. The Speed Shift patch for Skylake took a bit over two months after the CPUs were released.

Software dev optimizations are probably on the same timescale. Simple benchmarks like AIDA64 should be fast to correct, and when it comes to game devs, we'll probably get one or two out of the gate fast (Oxide Games and Bethesda / Arkane Studios) to show that the optimization works, and then the rest will follow in the coming months.


----------



## ebduncan

Ryzen is a pretty good cpu.

I just find it odd that everyone wants to talk about 1080p gaming again. I mean, it's a legit concern whether Ryzen can reach the 144 Hz spec of some panels, but to be completely fair most games don't come close to 144 fps even at 1080p. The CPU is more than capable of pushing 1080p, just at a slightly slower speed, and mind you, that slightly slower speed is above 60 fps. I just don't see how it's important to all these people who are not professional gamers.

I'd buy an R7-1700 today If I needed to build a new pc. Right now I'm rockin the [email protected], and the [email protected] Gonna wait a little longer before upgrading the FX system just to see what maturity does with Ryzen.


----------



## sugarhell

Good info here :

http://www.overclock.net/t/1624603/rog-crosshair-vi-overclocking-thread


----------



## Ultracarpet

Quote:


> Originally Posted by *ebduncan*
> 
> Ryzen is a pretty good cpu.
> 
> I just find it odd that everyone wants to talk about 1080p gaming again. I mean, it's a legit concern whether Ryzen can reach the 144 Hz spec of some panels, but to be completely fair most games don't come close to 144 fps even at 1080p. The CPU is more than capable of pushing 1080p, just at a slightly slower speed, and mind you, that slightly slower speed is above 60 fps. I just don't see how it's important to all these people who are not professional gamers.
> 
> I'd buy an R7-1700 today If I needed to build a new pc. Right now I'm rockin the [email protected], and the [email protected] Gonna wait a little longer before upgrading the FX system just to see what maturity does with Ryzen.


This point of view is even furthered by VRR, which is becoming pretty commonplace in monitors now, especially in high refresh rate monitors.


----------



## Artikbot

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just in time for Zen+.


I'd like to see a retest in a month's time or so.

It's about as long as I can hold out on an upgrade; the Phenom has been increasingly throwing bluescreens for the past couple of months.

Quote:


> Originally Posted by *tpi2007*
> 
> My guess is that the first microcode updates should be focusing on getting rid of the glaring and out of spec performance discrepancies across different motherboards from different makers, and ironing out the bugs. You can probably start feeling the benefits in one week, but it'll probably take up to two months to get most of it together for the majority of cases, like optimizing RAM profiles for a wide variety of kit speeds, number of modules and densities.
> 
> OS scheduling could be ready in one, two or three months. The Windows 7 scheduler patches took three months after Bulldozer was released, but that was an arch with much less interest and the results were mixed to begin with; this time around the market interest and pressure will probably make everybody work faster. The Speed Shift patch for Skylake took a bit over two months after the CPUs were released.
> 
> Software devs optimizations is probably the same. Simple benchmarks like AIDA 64 should be fast to correct the software, and when it comes to game devs, we'll probably get one or two out of the gate fast (Oxide games and Bethesda - Arkane Studios) to show that the optimization works and then the rest will follow in the coming months.


I've got similar feelings for those timescales.

Gonna wait for GSKILL to put those memory kits out in the wild and then upgrade.


----------



## Shatun-Bear

I thought people were telling me before that *Gamers Nexus* were a good, unbiased review outlet.

But after reading their Ryzen review, I can't take them seriously. It was so negative and harsh, I wonder if they forgot about the relative cost of Ryzen versus a 6900K, or how, in normal gaming use cases (with cards below a GTX 1080, or resolutions above 1080p), it's about 5% slower than the 6900K for half the price. Yet apparently it's not worth buying. Complete nonsense. Did they receive a little cheque from Intel or something?


----------



## nakano2k1

After finally having some time to review a lot of the information, I have to say that i'm somewhat disappointed in the performance. I don't know if this is due to the relatively immature architecture or not but a lot of the "issues" that people are talking about with the memory should really have been ironed out before the release. It seems like especially with the Ryzen chips that high speed memory with tight timings is really vital in order to have if run at it's full potential.

Compared to Bulldozer this is a springboard forward for AMD, but it wasn't quite the massive pushback against Intel that I was hoping for.


----------



## maarten12100

Quote:


> Originally Posted by *Shatun-Bear*
> 
> I thought people were telling me before that *Gamers Nexus* were a good, unbiased review outlet.
> 
> But after reading their Ryzen review, I can't take them seriously. It was so negative and harsh that I wonder if they forgot about the relative cost of Ryzen versus a 6900K, or about how, in normal gaming use cases (with cards below a GTX 1080, or at resolutions above 1080p), it's about 5% slower than the 6900K for half the price. Yet apparently it's not worth buying. Complete nonsense. Did they receive a little cheque from Intel or something?


That is complete nonsense and you know it!#[email protected]#$!%!#[email protected]!


Spoiler: Warning: Spoiler!



It was a big, great cheque, it was huge (yuge!)


----------



## budgetgamer120

Quote:


> Originally Posted by *Shatun-Bear*
> 
> I thought people were telling me before that *Gamers Nexus* were a good, unbiased review outlet.
> 
> But after reading their Ryzen review, I can't take them seriously. It was so negative and harsh that I wonder if they forgot about the relative cost of Ryzen versus a 6900K, or about how, in normal gaming use cases (with cards below a GTX 1080, or at resolutions above 1080p), it's about 5% slower than the 6900K for half the price. Yet apparently it's not worth buying. Complete nonsense. Did they receive a little cheque from Intel or something?


And I thought Jayz was biased lol


----------



## Artikbot

Jayz is an utter monkey. I didn't like him when he started, even less when he got popular, and I can't stand the guy at all now.

Half the time he's spouting half-digested crap from the internet that he doesn't even understand. The other half he's bragging about something.


----------



## ciarlatano

Quote:


> Originally Posted by *Shatun-Bear*
> 
> I thought people were telling me before that *Gamers Nexus* were a good, unbiased review outlet.
> 
> But after reading their Ryzen review, I can't take them seriously. It was so negative and harsh that I wonder if they forgot about the relative cost of Ryzen versus a 6900K, or about how, in normal gaming use cases (with cards below a GTX 1080, or at resolutions above 1080p), it's about 5% slower than the 6900K for half the price. Yet apparently it's not worth buying. Complete nonsense. Did they receive a little cheque from Intel or something?


Unbiased or not, they have shown clearly through their reviews that they have very little knowledge or understanding of hardware, and no concept of how it fits in the market. I can't believe anyone reads that site.


----------



## rexolaboy

Quote:


> Originally Posted by *nakano2k1*
> 
> After finally having some time to review a lot of the information, *I have to say that I'm somewhat disappointed in the performance*. I don't know if this is due to the relatively immature platform or not, but a lot of the "issues" that people are talking about with the memory really should have been ironed out before release. It seems that, especially with the Ryzen chips, high-speed memory with tight timings is really vital for it to run at its full potential.
> 
> Compared to Bulldozer this is a springboard forward for AMD, but *it wasn't quite the massive pushback against Intel that I was hoping for*.


You are disappointed that Ryzen is at 93% of Kaby Lake's IPC? You note the clear issues with the motherboard manufacturers and blame AMD? The pushback against Intel was never going to happen in the consumer space; AMD already stated this architecture is aimed at servers and high-compute tasks, with gaming secondary. Your disappointment may well be warranted, but you didn't give a clear reason why. AMD is back. Ryzen 5 will be for gamers on a gamer's budget; Ryzen 7 is only attacking the X99 platform. Comparing the R7 1800X with DDR4-2400 RAM to an i7 7700K with DDR4-4000 RAM is a silly, misleading comparison.
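The "93% IPC" figure above is a clock-normalized estimate. A minimal sketch of how such a comparison is usually derived (the scores and clocks below are hypothetical placeholders, not benchmark results):

```python
# Crude clock-normalized ("IPC") comparison: divide a single-thread
# score by the clock it was achieved at. All numbers here are made up.

def per_clock(score: float, clock_ghz: float) -> float:
    """Score per GHz: a rough proxy for per-clock performance."""
    return score / clock_ghz

ryzen_per_clock = per_clock(score=160.0, clock_ghz=4.0)   # hypothetical
kaby_per_clock  = per_clock(score=180.0, clock_ghz=4.5)   # hypothetical

# Equal per-clock performance can still look like a raw-score deficit
# when one chip simply clocks higher.
print(f"relative per-clock: {ryzen_per_clock / kaby_per_clock:.2f}")  # prints 1.00
```

This is why clock-matched (or clock-normalized) runs matter when the argument is about IPC rather than raw fps.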


----------



## Tobiman

I wholly agree that GN's review was super harsh. Something else was definitely going on; my guess is that there was some back and forth between GN and AMD that didn't end well. Anyway, Ryzen seems to perform just well enough for me to consider moving from my i5, but I'll need to invest in a 1080 Ti/1080 first before changing my CPU.


----------



## budgetgamer120

Quote:


> Originally Posted by *rexolaboy*
> 
> You are disappointed that Ryzen is at 93% of Kaby Lake's IPC? You note the clear issues with the motherboard manufacturers and blame AMD? The pushback against Intel was never going to happen in the consumer space; AMD already stated this architecture is aimed at servers and high-compute tasks, with gaming secondary. Your disappointment may well be warranted, but you didn't give a clear reason why. AMD is back. Ryzen 5 will be for gamers on a gamer's budget; Ryzen 7 is only attacking the X99 platform. Comparing the R7 1800X with DDR4-2400 RAM to an i7 7700K with DDR4-4000 RAM is a silly, misleading comparison.


Most people are mad that Ryzen won't trigger price drops. I think that's great for AMD.
Quote:


> Originally Posted by *Tobiman*
> 
> I wholly agree that GN's review was super harsh. Something else was definitely going on; my guess is that there was some back and forth between GN and AMD that didn't end well. Anyway, Ryzen seems to perform just well enough for me to consider moving from my i5, but I'll need to invest in a 1080 Ti/1080 first before changing my CPU.


Their charts are normally nice.

I hope they have to buy their review CPU next.


----------



## tpi2007

Quote:


> Originally Posted by *nakano2k1*
> 
> After finally having some time to review a lot of the information, I have to say that i'm somewhat disappointed in the performance. I don't know if this is due to the relatively immature architecture or not but a lot of the "issues" that people are talking about with the memory should really have been ironed out before the release. It seems like especially with the Ryzen chips that high speed memory with tight timings is really vital in order to have if run at it's full potential.
> 
> Compared to bulldozer this is a springboard forward for AMD but it wasn't quite the massive pushback against Intel that I was hoping for.


Skylake also underperformed with the DDR4-2133 MHz RAM it was validated with, and there were BIOS updates to make boards work better with faster speeds, which it absolutely benefits from. Haswell-E was the same thing: in the beginning you could only go to 2667 MHz without touching the BCLK, and then after some patches you could do more.


----------



## looniam

Quote:


> Originally Posted by *Brutuz*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> Quote:
> 
> Originally Posted by *looniam*
> 
> and i'll agree also but unfortunately i was hoping for a "slam dunk" upgrade from ivy . . .
> 
> not that's a bad thing. just gives me ~6 months to see what happens.
> 
> 
> I think it is, but that it won't become obvious from sheer FPS for a few years, much like the Q6600. We already know that games are trending toward using more threads and that Ryzen is faster than Kaby Lake when all of its threads are used. Does it not stand to reason that KL will hit its limit sooner? Especially as most of the reason it can get higher FPS is that it can use more of its resources more easily right now.

I might have missed something, but that would all be pertinent if I had said I was going to get either KL or Ryzen. I didn't.

I'll admit to thinking about "upgrading" strictly for benchmarking Fire Strike, Time Spy and whatnot, but the bottom line is _I don't need to_, so I can wait until I do.

And seriously, I am so damned sick of hearing "games are going to use more threads!"

Well, DUH! But I've been hearing that for ~8+ years, and you'd think ~90% of them would be by now, but they aren't. So again, *I'll wait until it happens.*


----------



## JackCY

Quote:


> Originally Posted by *rexolaboy*
> 
> Comparing the R7 1800X with DDR4-2400 RAM to an i7 7700K with DDR4-4000 RAM is a silly, misleading comparison.


But but my minesweeper!


----------



## ducegt

I've always found Gamers Nexus to be more detailed and technical than others. The fact that Steve called out AMD for asking them to skew results shows how GN values integrity. If Intel had done this, you same haters would be going crazy. All reviewers are supposed to do is share their experience, and at least with GN you can gather a better sense of what it was.


----------



## bfedorov11

Quote:


> Originally Posted by *Shatun-Bear*
> 
> I thought people were telling me before that *Gamers Nexus* were a good, unbiased review outlet.
> 
> But after reading their Ryzen review, I can't take them seriously. It was so negative and harsh, I wonder if they forgot about the relative cost of Ryzen versus a 6900K, or how about how, in normal gaming use case scenarios (like with cards under GTX1080 or resolutions above 1080p
> 
> 
> 
> 
> 
> 
> 
> ) it's about 5% slower than the 6900K for half the price. Yet apparently it's not worth buying. Complete nonsense. Did they receive a little cheque from Intel or something?


If you watch the interview with Joker that went up last night, Steve did say he felt he was a little harsh on the overall recommendation.


----------



## JackCY

Quote:


> Originally Posted by *ducegt*
> 
> I've always found Gamers Nexus to be more detailed and technical than others. The fact that Steve called out AMD for asking them to skew results shows how GN values integrity. If Intel had done this, you same haters would be going crazy. All reviewers are supposed to do is share their experience, and at least with GN you can gather a better sense of what it was.


I prefer data, not the hurt or flattered feelings of reviewers. I don't care if they trash it or hype it, I just want the data.

There are even results where a 4790K beats a 7700K, etc. Not just the 5775C but even the 4790K, lol, which shows how great SL/KL really is in some tasks, or how variable some tests actually are. SL/KL is better than HW/DC, but not in 100% of cases; Intel did trade something for the extra clocks, it just rarely shows up.


----------



## iRUSH

Quote:


> Originally Posted by *ebduncan*
> 
> Ryzen is a pretty good CPU.
> 
> I just find it odd that everyone wants to talk about 1080p gaming again. I mean, it's a legit concern whether Ryzen will be able to reach the 144 Hz spec of some panels, but to be completely fair, most games don't come close to 144 fps even at 1080p. The CPU is more than capable of pushing 1080p, just at a slightly slower speed, and mind you that slightly slower speed is above 60 fps. I just don't see how it's important to all these people who are not professional gamers.
> 
> I'd buy an R7 1700 today if I needed to build a new PC. Right now I'm rockin the [email protected] and the [email protected] Gonna wait a little longer before upgrading the FX system, just to see what maturity does for Ryzen.


I'm that guy. I'd happily take 1080p LOW at 144+ fps over 1080p ULTRA at 60+. My favorite thing about PC gaming is fast refresh rate monitors. I know I'm outnumbered on this, especially here on OCN.


----------



## chuy409

Quote:


> Originally Posted by *ducegt*
> 
> I've always found Gamers Nexus to be more detailed and technical than others. The fact that Steve called out AMD on asking them to skew results shows how GN values integrity. If Intel had done this, you same haters would be going crazy. All reviewers are suppose to do is share their experience and at least with GN you can gather a better sense of what it was.


Gamers Nexus released an hour-long video revealing conversations between him and AMD. Pretty eye-opening what AMD was asking him to do.


----------



## Dagamus NM

Quote:


> Originally Posted by *tpi2007*
> 
> Skylake also underperformed with the DDR4 2133 Mhz RAM it was validated with and there were BIOS updates to make boards work better with faster speeds, which it absolutely benefits from. Haswell-E was the same thing, in the beginning you could only go to 2667 Mhz without touching the BCLK and then after some patches you could do more.


This. Comparing Ryzen at release to architectures/boards that have had several BIOS updates is not apples to apples.

Regardless, the 1800X seems a pretty awesome deal for what you currently get. It is only going to get better (incrementally but better nonetheless).

I think Ryzen is a pretty amazing bit of hardware for the price. I won't get one for work, but if I were to build a multipurpose gaming oriented rig, HTPC, homework, light productivity, then Ryzen would be at the top of my list.

I am hoping that AMD does well with Vega, and judging by the 1080 Ti coming in at ~90% of the Titan X Pascal, I think AMD has something that will compete.

I have faith in AMD. Not to the level that I do in Intel and NVidia, but the market needs this.

As far as people being upset that Intel has not lowered prices in response: just wait. I will be very surprised if Intel raises the price of their top-end Skylake-X chip compared to the 6950X. Most likely their pricing will stay about the same at release, but I expect to see prices trend down much faster at retailers now that AMD can actually eat market share.


----------



## SystemTech

I'm just going to leave this here:
http://hwbot.org/submission/3473875_der8auer_cpu_frequency_ryzen_7_1800x_5802.93_mhz

Germany der8auer's CPU Frequency score - Elite League
5802.93 MHz with AMD Ryzen 7 1800X

Sure, that's on [email protected], but it shows what's possible.
I feel that after some optimizations, BIOS updates, mobo updates, etc., we will be seeing better results on the OC front.
Give it a month.

This is typical AMD. Release a product, and within one month the product has been updated and is better than it was on day 1.


----------



## JackCY

Quote:


> Originally Posted by *chuy409*
> 
> gamer's nexus released a video an hour revealing conversations between him and AMD. Pretty eye-opening what amd was asking him to do.


Pretty eye-opening how he manages to portray it all negatively. They asked, they didn't command, and most of the stuff has been cleared up in the AMA on reddit. People are just too caught up with GN to read the details for themselves and form their own opinion from the data rather than from what reviewers think/feel. Use a brain, people.

Your own. Unless you sold it to buy an i7, in which case R.I.P.


----------



## piledragon

Quote:


> Ryzen is a pretty good CPU.
> 
> I just find it odd that everyone wants to talk about 1080p gaming again. I mean, it's a legit concern whether Ryzen will be able to reach the 144 Hz spec of some panels, but to be completely fair, most games don't come close to 144 fps even at 1080p. The CPU is more than capable of pushing 1080p, just at a slightly slower speed, and mind you that slightly slower speed is above 60 fps. I just don't see how it's important to all these people who are not professional gamers.
> 
> I'd buy an R7 1700 today if I needed to build a new PC. Right now I'm rockin the [email protected] and the [email protected] Gonna wait a little longer before upgrading the FX system, just to see what maturity does for Ryzen.


[email protected], you go boy! I got my [email protected] every day. I am definitely going to build a Ryzen rig; I feel there is a lot more potential in that chip, and it will only get better over time.


----------



## Master__Shake

Quote:


> Originally Posted by *Shatun-Bear*
> 
> I thought people were telling me before that *Gamers Nexus* were a good, unbiased review outlet.
> 
> But after reading their Ryzen review, I can't take them seriously. It was so negative and harsh, I wonder if they forgot about the relative cost of Ryzen versus a 6900K, or how about how, in normal gaming use case scenarios (like with cards under GTX1080 or resolutions above 1080p
> 
> 
> 
> 
> 
> 
> 
> ) it's about 5% slower than the 6900K for half the price. Yet apparently it's not worth buying. Complete nonsense. Did they receive a little cheque from Intel or something?


Intel doesn't mail reviewers cheques.

They have a special room for that.


Spoiler: Warning: Spoiler!


----------



## JackCY

Quote:


> Originally Posted by *Master__Shake*
> 
> Intel doesn't mail reviewers cheques.
> 
> They have a special room for that.
> 
> 
> Spoiler: Warning: Spoiler!


He wasn't all negative. There is the review, then the live talk with Joker where he's more sensible if I remember right, then you have reddit with a lot of fine details and answers to all the issues present, then you have the post-review publishing of private communication, etc.
I understand how he can easily get caught up in it and feel disappointed if he expected way too much and doesn't look at it all from a third-person view, from a distance. His review is very "personal"/non-objective, but as time goes by he tries to get back on the objective track somehow.
Maybe a haircut will help.


----------



## rexolaboy

Quote:


> Originally Posted by *chuy409*
> 
> Gamers Nexus released an hour-long video revealing conversations between him and AMD. Pretty eye-opening what AMD was asking him to do.


AMD was courteous and respectful even while Steve was acting like a baby. AMD never lied about anything; they paid respect to ASUS even though ASUS totally botched their own product release, and Steve thought AMD was acting like some criminal overlord because AMD was proud that frametimes and smoothness in 4K gaming were better than Intel's and wanted Steve to show that. Also, AMD has been working with all the motherboard manufacturers to get things right, and ASUS STILL sent out a junk BIOS, where other brands did a better job. Also, Steve couldn't wrap his head around the point he had already made (and AMD did as well): that in gaming, comparing a 3.5 GHz 8-core with DDR4-2933 RAM to an i7 7700K at 4.5 GHz with MUCH faster RAM is silly because of the clock disparity, and is not a good indicator of IPC, which was AMD's goal to increase with Ryzen. Essentially the Ryzen 7 1700 is TWO i7 6700Ts in one package... for the same cost. THAT IS AMAZING!


----------



## Tobiman

Quote:


> Originally Posted by *chuy409*
> 
> Gamers Nexus released an hour-long video revealing conversations between him and AMD. Pretty eye-opening what AMD was asking him to do.


Let me guess, they asked him to make the GPU the bottleneck? Regardless, AMD never pitched the R7 as the be-all end-all for gaming. With lower IPC and clocks, it was pretty much a given that the 7700K was going to be superior in gaming.
None of the available 8-cores are good-value gaming chips, so why did he have to be so condescending in both tone and remarks? Dude needs to take a chill pill.


----------



## scorch062

I like GN for their detailed gaming benchmarks (the 1% and 0.1% lows are great metrics), and their 1800X review was pretty valuable. They showed that there is currently an issue with SMT, as disabling it could boost fps (especially the lows) quite substantially. However, their conclusion is way too harsh, and they seem stuck on 1080p 144 Hz. For gaming the 1800X is not the best CPU, but it is good; not something I would say you should keep away from at all costs.

Personally, I am getting a 1700 to replace my 860K and will combo it with my current 380X. No bottleneck is possible there, and I will replace the GPU once I can get an affordable 3440x1440 monitor (preferably with FreeSync); I doubt I will be able to bottleneck that resolution with this CPU either.
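For reference, the 1% and 0.1% "lows" mentioned above are typically computed from frame-time logs: average the slowest X% of frames and express that as FPS. Definitions vary slightly between outlets, so this is one common interpretation rather than GN's exact pipeline:

```python
# One common way to compute "1% low" / "0.1% low" FPS from a frame-time
# log: average the slowest X% of frame times, then convert ms -> FPS.

def low_fps(frametimes_ms: list, fraction: float) -> float:
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * fraction))        # at least one frame
    avg_ms = sum(worst[:n]) / n                   # mean of the worst slice
    return 1000.0 / avg_ms                        # ms per frame -> FPS

# Synthetic log: 990 frames at ~60 FPS plus 10 stutter frames at ~30 FPS.
log = [16.7] * 990 + [33.3] * 10
print(round(low_fps(log, 0.01), 1))   # the ten 33.3 ms frames -> 30.0
```

The average FPS of this log would look close to 60, which is exactly why the lows are a better stutter metric than the mean.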


----------



## kd5151

Quote:


> Originally Posted by *Zero_*
> 
> Still nothing from Eurogamer/Digital Foundry? Must be working hard to find enough fault with these before publishing a review.


They do good work. When they do it!


----------



## Samuris

WE NEED a review where someone runs Titan XPs in SLI with an R7 1700 and an i7 7700K, please! For CPU bottlenecks at 1440p/4K.


----------



## Majin SSJ Eric

I was wondering where people were getting all of this "You can't play games on Ryzen" nonsense. Saying that an Intel CPU is a little faster than Ryzen is far different from saying "Do not buy a Ryzen CPU if you play games".


----------



## Xuper

Quote:


> Compared to the original BIOS, the new UEFI increases the frame rate in our game benchmark by between *4* and *26* percent, on average by *17 percent*! In view of this tremendous increase in performance, we had to be certain that our values were correct, so we re-measured with the Asus boards. These give a touch more speed in games than the updated MSI board.


https://translate.google.co.uk/translate?hl=en&sl=de&u=https://www.golem.de/news/ryzen-7-1800x-im-test-amd-ist-endlich-zurueck-1703-125996-4.html&prev=search

Whoa, it's official. Asus really screwed AMD Ryzen!


----------



## SoloCamo

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I was wondering where people were getting all of this "You can't play games on Ryzen" nonsense? Saying that an Intel CPU is a little faster than Ryzen is far different from saying "Do not buy a Ryzen CPU if you play games".


Yup, it's nuts. Even the FX 8-cores are plenty for most modern games. A 4.5 GHz Piledriver is still a great gaming CPU, and Ryzen obviously blows that out of the water while offering far better multi-threaded performance.


----------



## teh-yeti

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I was wondering where people were getting all of this "You can't play games on Ryzen" nonsense? Saying that an Intel CPU is a little faster than Ryzen is far different from saying "Do not buy a Ryzen CPU if you play games".


I mean, I can understand why you could say that about the 1800X. From a purely value standpoint, if you only game, the 1700 can be OC'd to basically the same gaming performance for $170 less, and honestly there is the 7700K, which costs less and performs better than both. The key point here is "if you only game". The 1800X is a monster of a CPU in the right application from what I can see.


----------



## budgetgamer120

Quote:


> Originally Posted by *scorch062*
> 
> I like GN for their detailed gaming benchmarks (the 1% and 0.1% lows are great metrics), and their 1800X review was pretty valuable. They showed that there is currently an issue with SMT, as disabling it could boost fps (especially the lows) quite substantially. However, their conclusion is way too harsh, and they seem stuck on 1080p 144 Hz. For gaming the 1800X is not the best CPU, but it is good; not something I would say you should keep away from at all costs.
> 
> Personally, I am getting a 1700 to replace my 860K and will combo it with my current 380X. No bottleneck is possible there, and I will replace the GPU once I can get an affordable 3440x1440 monitor (preferably with FreeSync); I doubt I will be able to bottleneck that resolution with this CPU either.


Even if you got a GTX 1080 you wouldn't be bottlenecked. You are talking like you bought a low-end CPU.


----------



## hokk

Steve from Gamers Nexus is biased: he was telling people not to order Ryzen, while saying one thing when another thing was happening.

He may as well be confirmed as an Intel shill by now.


----------



## SoloCamo

Update video from Steve at Gamers Nexus


----------



## paskowitz

IDK, I found the GN article to be pretty fair. I do think he could have worded his conclusion with a little more nuance: instead of "it's bad for gaming" I would have said acceptable, but keep in mind we don't know how the CPU will age, so buyer beware. It's not like somebody who buys an 1800X to game is going to get a crappy experience, and he kind of made it seem like that with his tone, even if his results were sound. Also, I would have liked to see him do some multitasking tests like streaming (even though that is hard to turn into a controlled test).


----------



## tpi2007

Quote:


> Originally Posted by *JackCY*
> 
> Quote:
> 
> Originally Posted by *chuy409*
> 
> Gamers Nexus released an hour-long video revealing conversations between him and AMD. Pretty eye-opening what AMD was asking him to do.
> 
> 
> Pretty eye-opening how he manages to portray it all negatively. They asked, they didn't command, and most of the stuff has been cleared up in the AMA on reddit. People are just too caught up with GN to read the details for themselves and form their own opinion from the data rather than from what reviewers think/feel. Use a brain, people.
> 
> Your own. Unless you sold it to buy an i7, in which case R.I.P.

Quote:


> Originally Posted by *JackCY*
> 
> Quote:
> 
> Originally Posted by *Master__Shake*
> 
> Intel doesn't mail reviewers cheques.
> 
> They have a special room for that.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> He wasn't all negative. There is the review, then the live talk with Joker where he's more sensible if I remember right, then you have reddit with a lot of fine details and answers to all the issues present, then you have the post-review publishing of private communication, etc.
> I understand how he can easily get caught up in it and feel disappointed if he expected way too much and doesn't look at it all from a third-person view, from a distance. His review is very "personal"/non-objective, but as time goes by he tries to get back on the objective track somehow.
> Maybe a haircut will help.

Quote:


> Originally Posted by *Tobiman*
> 
> Quote:
> 
> Originally Posted by *chuy409*
> 
> Gamers Nexus released an hour-long video revealing conversations between him and AMD. Pretty eye-opening what AMD was asking him to do.
> 
> 
> Let me guess, they asked him to make the GPU the bottleneck? Regardless, AMD never pitched the R7 as the be-all end-all for gaming. With lower IPC and clocks, it was pretty much a given that the 7700K was going to be superior in gaming.
> None of the available 8-cores are good-value gaming chips, so why did he have to be so condescending in both tone and remarks? Dude needs to take a chill pill.

In the phone call he published in that video they are very forthcoming about the IPC and clock-speed advantage that Kaby Lake has, so it's not as if they are hiding something or trying to bullcrap reviewers. Still, suggesting things is always going to make some journalists wary, and they may have historical reasons for it, so I understand that.

I get Steve's argument, but back when Bulldozer came out, even if you argued that in a few years support would come, at best it would match the 2600K, which only cost $72 more. Now you're making the argument that the IPC may not be there in the future if game engines don't become better multi-threaded (which they are becoming), but on the other side, with the 7700K, the number of cores definitely isn't there.

The other point is that making a value assessment based on the 1800X was always going to be poor; it's like evaluating Broadwell-E based on the 6950X, which goes for $1,700. AMD needs it both to not undersell the brand too much and to show the CPU in the best possible light against the direct competition, but the real value lies in the 1700X and 1700, where you save between $100 and $170 on the CPU. The 1800X was always going to be a halo product. Back in the Q6600 days, how many people actually bought a QX6700, a Q6700, a QX6800 or a QX6850?
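The $100-$170 savings figure follows directly from the Ryzen 7 launch list prices (assuming the commonly reported $499/$399/$329 MSRPs; street prices varied):

```python
# Ryzen 7 launch list prices in USD (commonly reported MSRPs; treat
# these as assumptions rather than quotes from this thread).
prices = {"1800X": 499, "1700X": 399, "1700": 329}

print(prices["1800X"] - prices["1700X"])  # 100: saving stepping down to the 1700X
print(prices["1800X"] - prices["1700"])   # 170: saving stepping down to the 1700
```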


----------



## SoloCamo

*720p low settings*: 5 GHz 7700K vs. 3.9 GHz 1700 (w/ 3000 MHz memory)

*Ryzen is clearly a fine gaming CPU if it's doing this well at 720p low, a setting nobody will use*


----------



## scorch062

Quote:


> Originally Posted by *budgetgamer120*
> 
> Even if you got a GTX 1080 you wouldn't be bottlenecked. *You are talking like you bought a low-end CPU*.


Considering what I have right now, I am very certain that the 1700 is nothing like a low-end CPU.

Wish there were more reviews of the 1700 atm, but the ones I saw point to a great value with very little fps loss.


----------



## Xuper

Because GN's title should have been something like this:

Quote:


> *Ryzen is King in Applications, Prince in Games*


Yep, he was harsher toward Ryzen. I like ComputerBase's title; they're more accurate than GN.


----------



## Vesku

Quote:


> Originally Posted by *ducegt*
> 
> I've always found Gamers Nexus to be more detailed and technical than others. The fact that Steve called out AMD for asking them to skew results shows how GN values integrity. If Intel had done this, you same haters would be going crazy. All reviewers are supposed to do is share their experience, and at least with GN you can gather a better sense of what it was.


Asking them to consider 1440p and 4K results isn't skewing unless they also requested that 1080p be removed.


----------



## GorillaSceptre

So BIOS/motherboard and microcode issues are playing a big role in some of the discrepancies between ST performance in synthetics (which looks very good for Ryzen) and performance in games..

I wonder how many follow-up reviews there will be; 20% performance swings are _*massive*_. Some reviewers' post-review comments make it sound like Ryzen was a nightmare to deal with (crashes, RAM not working right, etc.), so why didn't they emphasize those things more in their articles? Whatever board Joker Productions was using seemed to be fine, or at least far better than what some other reviewers got..

I don't blame the MB manufacturers; AMD pushed Ryzen out very early.. These aren't just growing pains, suppliers clearly weren't ready yet. They should have just waited a month to sort the software side out.


----------



## ryboto

Quote:


> Originally Posted by *SoloCamo*
> 
> *720p low settings*: 5 GHz 7700K vs. 3.9 GHz 1700 (w/ 3000 MHz memory)
> 
> *Ryzen is clearly a fine gaming CPU if it's doing this well at 720p low, a setting nobody will use*


He's definitely appealing to the masses! Maybe next he'll do a clock for clock?


----------



## Newbie2009

Quote:


> Originally Posted by *GorillaSceptre*
> 
> So bios/motherboards and microcode is playing a big role in some of the discrepancies between ST performance in synthetics (which looks very good for Ryzen), and performance in games..
> 
> I wonder how many follow up reviews there will be, 20% performance swings is _*massive*_, some reviewers comments post-reviews make it sound that Ryzen was a nightmare to deal with.. Crashes, ram not working right, etc., why didn't they emphasize those things more during their articles? Whatever board Joker Productions was using seemed to be fine, or at least far better than some other reviewers got..
> 
> I don't blame the MB manufacturers, AMD pushed Ryzen out very early.. These aren't just growing pains, suppliers clearly weren't ready yet, they should of just waited a month to sort the software side out.


Probably because most wanted to show it in a positive light.


----------



## Ultracarpet

Quote:


> Originally Posted by *GorillaSceptre*
> 
> So bios/motherboards and microcode is playing a big role in some of the discrepancies between ST performance in synthetics (which looks very good for Ryzen), and performance in games..
> 
> I wonder how many follow up reviews there will be, 20% performance swings is _*massive*_, some reviewers comments post-reviews make it sound that Ryzen was a nightmare to deal with.. Crashes, ram not working right, etc., why didn't they emphasize those things more during their articles? Whatever board Joker Productions was using seemed to be fine, or at least far better than some other reviewers got..
> 
> I don't blame the MB manufacturers, AMD pushed Ryzen out very early.. These aren't just growing pains, suppliers clearly weren't ready yet, they should of just waited a month to sort the software side out.


Well, I feel like MB manufacturers were taking their sweet time, I'm glad AMD forced their hand. Now they have to work fast, or people will be pissed that the board they got is buggy.

That anger won't be towards AMD either, just the motherboard manufacturer.


----------



## Newbie2009

Quote:


> Originally Posted by *Ultracarpet*
> 
> Well, I feel like MB manufacturers were taking their sweet time, I'm glad AMD forced their hand. Now they have to work fast, or people will be pissed that the board they got is buggy.
> 
> That anger won't be towards AMD either, just the motherboard manufacturer.


By the looks of it most places are sold out so this issue may not have a terrible effect on early adopters. Maybe they can get some updated bios out in the next couple of weeks, just in time for the restock.


----------



## motherpuncher

Quote:


> Originally Posted by *Vesku*
> 
> Asking to consider 1440p and 4K results isn't skewing unless they also requested 1080p be removed.


Yep... and I happen to appreciate reviews done at 1440p and up, even for CPUs. I want to know how MY performance will be affected. I left 1080p less than a year ago and won't go back. I can't make my purchasing decision based STRICTLY on how it games at low res. Ryzen looks great for the money in my opinion, though I'll be waiting for Zen+ because I just put my system together recently.


----------



## nakano2k1

What amazes me is that many vendors haven't even updated their main pages to list AM4-socket motherboards. Manually searching for the motherboard in the search field is the only way to locate it.

MSI literally JUST released an updated EFI on the 1st of March. If they've had three weeks to prepare and were given the AM4 BIOS template to work with from the beginning, then this is a real embarrassment for board partners.


----------



## Ultracarpet

Quote:


> Originally Posted by *Newbie2009*
> 
> By the looks of it most places are sold out so this issue may not have a terrible effect on early adopters. Maybe they can get some updated bios out in the next couple of weeks, just in time for the restock.


Yep, that's probably what will end up happening.


----------



## hellopppp

Quote:


> Originally Posted by *Newbie2009*
> 
> Probably because most wanted to show it in a positive light


Yes, exactly. These reviews are about AMD's product and AMD's platform, so they exclude faults associated with MB manufacturers when others are experiencing stable results. I think we'll see this issue resolved within a month or so, if not less.


----------



## LancerVI

I think I'm going to build me a Ryzen 1700 gaming build.

Is that sacrilege? Will I be burnt at the OCN stake??

I want to see how "Bad" it games.


----------



## Hequaqua

Quote:


> Originally Posted by *ryboto*
> 
> He's definitely appealing to the masses! Maybe next he'll do a clock for clock?


I just watched that.









I will say at least this guy is letting the numbers do all the talking. It does appear that he didn't have many issues, so yea, it seems to be on the MB mfgs. That's not to say there still aren't issues with Ryzen.

I've read about 95% of this thread over the last couple of days... and people were asking about the exact testing that Joker did. I would say that the Ryzen chip held its own against the higher-clocked i7-7700K.

I'm with you, would love to see some clock to clock benchmarks.


----------



## Ultracarpet

Quote:


> Originally Posted by *LancerVI*
> 
> I think I'm going to build me a Ryzen 1700 gaming build.
> 
> Is that sacrilege? Will I be burnt at the OCN stake??
> 
> I want to see how "Bad" it games.


I can almost guarantee you wouldn't notice a shred of difference from your 4770k in games.


----------



## redone13

Quote:


> Originally Posted by *Hequaqua*
> 
> I just watched that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will say at least this guy is letting the numbers do all the talking. It does appear that he didn't have many issues, so yea, it seems to be on the MB mfgs. That's not to say there still aren't issues with Ryzen.
> 
> I've read about 95% of this thread over the last couple of days...and people were asking about the exact testing that Joker did. I would say that the Ryzen chip held its own comparing it to the clock speed of the i7-7700k.
> 
> I'm with you, would love to see some clock to clock benchmarks.


Why would you do clock for clock when a limitation of more cores is lower clock speed? That's the point of the 7700K: it clocks high rather easily. There seem to be some limits on how far the AMD chips can be pushed, because they are already pushed rather far out of the box. I guess you could do clock for clock in the low-4 GHz range if you truly wanted to see some numbers.
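For what it's worth, a "clock for clock" comparison is really just normalizing a benchmark score by the clock it was run at to estimate per-clock throughput. A minimal sketch of that math; all scores and clocks below are made-up, purely illustrative numbers, not figures from any review:

```python
# Rough clock-for-clock (points-per-GHz) comparison from benchmark scores.
# All scores and clocks here are invented for illustration only.

def score_per_ghz(score, clock_ghz):
    """Normalize a benchmark score by the clock it was run at."""
    return score / clock_ghz

# Hypothetical single-threaded scores at each chip's tested clock.
results = {
    "Chip A": (180.0, 5.0),   # (score, clock in GHz)
    "Chip B": (150.0, 3.9),
}

for name, (score, ghz) in results.items():
    print(f"{name}: {score_per_ghz(score, ghz):.1f} points/GHz")
```

The per-GHz numbers are what a clock-for-clock test tries to measure directly by forcing both chips to the same frequency.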


----------



## Master__Shake

anyone else think this is odd?

HWC did a review of the 7700k and 7600k and here are the detail levels.



same when they did the unlocked i3 review



then the ryzen review



weird eh.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74160-intel-kaby-lake-i7-7700k-i5-7600k-review-11.html

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74814-amd-ryzen-7-1800x-performance-review-17.html

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74538-intel-kaby-lake-i3-7350k-review-10.html

It's not even an outlier (GTA V); they're all like that.

even the 10 core cpu review



http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/72695-intel-broadwell-e-i7-6950x-i7-6900k-review-11.html


----------



## LancerVI

Quote:


> Originally Posted by *Ultracarpet*
> 
> I can almost guarantee you wouldn't notice a shred of difference from your 4770k in games.


I know. The 4770k is my sons. I'm running a 5820k right now.

It was mostly tongue in cheek. I'm getting sick of morons saying the Ryzen "can't game"

It's a ridiculous conclusion and speaks to the amount of Intel's "junk" some here have in their mouths.


----------



## DarthBaggins

I would think the 4770K would beat the 1700(X) in gaming due to its single-threaded performance and optimizations, much like many of the regular i7s (4c/8t). But I would say it wouldn't be a bad upgrade overall, with the move to DDR4 and the new AM4 socket; the upgrade path is there when AMD drops the more refined 8c/16t CPUs in the months and years to come.


----------



## LancerVI

Quote:


> Originally Posted by *Master__Shake*
> 
> anyone else think this is odd?
> 
> HWC did a review of the 7700k and 7600k and here are the detail levels.
> 
> 
> 
> same when they did the unlocked i3 review
> 
> 
> 
> then the ryzen review
> 
> 
> 
> weird eh.
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74160-intel-kaby-lake-i7-7700k-i5-7600k-review-11.html
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74814-amd-ryzen-7-1800x-performance-review-17.html
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74538-intel-kaby-lake-i3-7350k-review-10.html
> 
> it's not even an outlier (GTA V) they're all like that.


2x vs 4x MSAA /16 AF maybe?? Why change your (HWC) methodology though?


----------



## motherpuncher

Quote:


> Originally Posted by *Master__Shake*
> 
> anyone else think this is odd?
> 
> HWC did a review of the 7700k and 7600k and here are the detail levels.
> 
> 
> 
> same when they did the unlocked i3 review
> 
> 
> 
> then the ryzen review
> 
> 
> 
> weird eh.
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74160-intel-kaby-lake-i7-7700k-i5-7600k-review-11.html
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74814-amd-ryzen-7-1800x-performance-review-17.html
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/74538-intel-kaby-lake-i3-7350k-review-10.html
> 
> it's not even an outlier (GTA V) they're all like that.


Different GPUs were used from the Intel tests to the AMD test. Could that be the difference? 980 vs Titan X.


----------



## Master__Shake

Quote:


> Originally Posted by *motherpuncher*
> 
> Different GPU's were used from the Intel tests to the AMD test. Could that be the difference? 980 vs Titan x


but the detail level across 3 different reviews were identical.

why change the 4th?


----------



## GorillaSceptre

Quote:


> Originally Posted by *Newbie2009*
> 
> Probably because most wanted to show it in a positive light


Perhaps.. But I think it has done Ryzen a great disservice actually.

Pretending that things are fine and dandy and this is the performance Ryzen offers, then after the fact talking about how buggy things were on the software side, using ram speeds and timings that are sub-par, etc.

They then (again after the reviews are posted), are saying the reason some reviewers are seeing far better results with Ryzen might be because they actually had boards that were working the way they should be..

That doesn't sound like wanting to show them in a positive light to me.. Quite the opposite in fact. It would be far more beneficial to their readers if they criticized AMD for rushing to release too early, and said that stuff clearly isn't adding up regarding its gaming performance.


----------



## LancerVI

Quote:


> Originally Posted by *Master__Shake*
> 
> but the detail level across 3 different reviews were identical.
> 
> why change the 4th?


No. 2x MSAA vs 4x MSAA w/ 16AF

I think that's the difference.


----------



## Master__Shake

Quote:


> Originally Posted by *LancerVI*
> 
> No. 2x MSAA vs 4x MSAA w/ 16AF
> 
> I think that's the difference.


The 7350K, the 7700K, and the 6900K/6950X reviews all had the same detail levels, as well as AA levels, on those 4 graphs I posted.

Why change the details for the Ryzen review?


----------



## teh-yeti

Quote:


> Originally Posted by *Master__Shake*
> 
> but the detail level across 3 different reviews were identical.
> 
> why change the 4th?


Not sure, it's weird. All in all, HWC gave a pretty positive review of the Ryzen processors, right? Didn't they give them the Dam Good and Dam innovative awards?


----------



## DarthBaggins

Still amazes me how well that little unlocked i3 is rocking the market. But once you get into video encoding/rendering you see its weak points, especially when the Ryzen CPUs are killing it in that category. So Ryzen is definitely a prime low-cost choice for content creators etc.

Also, seeing those benches makes me even happier with my choice of a 5930K back when I bought it - that has been an amazing CPU for me.


----------



## Derp

Joker's 720p low review is a good resource that shows actual CPU gaming performance that isn't held back by the GPU. Some games are acceptably slower considering the 3.9 GHz vs 5 GHz clock speed disadvantage, but in other games the fps difference is nearly 100 fps or more. Thousands of gamers are sitting in front of VRR 240 Hz monitors, so this kind of testing is valuable to them.


----------



## sugarhell

Quote:


> Originally Posted by *Derp*
> 
> Joker's 720p low review is a good resource of information. Some games are acceptably slower considering the 3.9GHz vs 5Ghz clock speed disadvantage but in other games the fps difference is nearly 100+ fps. Thousands of gamers are sitting in front of vrr 240Hz monitors so this kind of testing is valuable to them.


I think it's millions of players, actually.

If you are that much into high-end gaming, get a 7700K. It's even better than the 6900K for this kind of job.


----------



## Master__Shake

even this review from the AMD 880K is the same settings



http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/72286-amd-athlon-x4-880k-review-9.html


----------



## Hequaqua

Quote:


> Originally Posted by *redone13*
> 
> Why would you do clock for clock if a limitation of more cores is lower clock speed? That's the point of the 7700K. It clocks high rather easily. There seems to be some limitations in how far AMD chips can be pushed because they are already pushed rather far out of the box. I guess you could do clock for clock at low 4GHz if you truly wanted to see some numbers.


True.









I'm sorta addicted to numbers and benchmarking. So to me, I would find that interesting. I'm a bit odd though.







Wouldn't it also show how well each microarchitecture handles the same load at the same speed? I mean, of course the higher clocks will win out; I thought that was a given. That's why I can't understand all the grief a lot of people are giving the AMD side. The R7s weren't designed to be a gaming chip, or a content-creator chip; they were designed to fill a niche that wasn't being served yet, somewhere in between, and I think they do that.

I'll be looking at the hexa cores personally.


----------



## Kriant

I just want to point out, just like it was pointed out in
Quote:


> Originally Posted by *Derp*
> 
> Joker's 720p low review is a good resource of information that shows actual CPU gaming performance that isn't held back by the GPU. Some games are acceptably slower considering the 3.9GHz vs 5Ghz clock speed disadvantage but in other games the fps difference is nearly 100+ fps. Thousands of gamers are sitting in front of vrr 240Hz monitors so this kind of testing is valuable to them.


If you'll notice, say, The Division (where you see the fps difference is nearly 100 at times) doesn't seem to be loading the Ryzen cores all that much, which is a bit weird. Maybe AMD is right that proper optimization is needed and current engines are wasting resources.


----------



## kaosstar

Quote:


> Originally Posted by *Hequaqua*
> 
> True.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sorta addicted to numbers and benchmarking. So to me, I would find that interesting. I'm a bit odd though.
> 
> 
> 
> 
> 
> 
> 
> Wouldn't it also show how well each urarch handles the same load at the same speed. I mean, of course the higher clocks will win out. I thought that was a given. That's why I can't understand all the grief a lot of people are putting on the AMD side. The R7's weren't designed to be a gaming chip, or a content creator chip. It was designed to fit in a notch that wasn't there yet. Somewhere in between, I think it does that.
> 
> I'll be looking at the hexa cores personally.


Yeah, I've been saying for a while that I think the 6 core chips will be the real hit.

At the bottom end will be the 6c 12t R5 1500 for $229. Max OC to Max OC, it might perform even better than the 1800X in gaming. And at a price of $120 less than a 7700K it will look like a real bargain.


----------



## AuraNova

Personally, I don't feel the R5 6-core chips are going to perform any better or that much differently than the R7.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Derp*
> 
> Joker's 720p low review is a good resource of information that shows actual CPU gaming performance that isn't held back by the GPU. Some games are acceptably slower considering the 3.9GHz vs 5Ghz clock speed disadvantage but in other games the fps difference is nearly 100+ fps. Thousands of gamers are sitting in front of vrr 240Hz monitors so this kind of testing is valuable to them.


What his results also show, is that many other reviews/graphs in this thread would then be wrong..

In his results Ryzen holds up very well at 1080p; I'd think people with GTX 1080s and $350 CPUs wouldn't be playing at 720p with every setting at its lowest. That's obviously the best-case scenario for a 4-core clocked to 5 GHz.

Not to mention, in the titles with those 100+fps differences, Ryzen would also be fine driving a 240Hz monitor..


----------



## LancerVI

Quote:


> Originally Posted by *Master__Shake*
> 
> the 73xxk cpu, the 7700k, and the 6900k and 6950x reviews all had the same detail levels as well as AA levels on thosew 4 graphs i posted.
> 
> why change the details for the ryzen review.


Ok...I read ya. It is strange.

Quote:


> Originally Posted by *Kriant*
> 
> I just want to point out, just like it was pointed out in
> If you'll notice, say, the Division (where you see the fps difference is nearly 100 at times), doesn't seem to be loading Ryzen cores all that much, which is a bit weird. Maybe AMD is right in that proper optimization is needed and current engines are wasting resources


Yeah. The CPU utilization on Ryzen is extremely low.


----------



## Hequaqua

Quote:


> Originally Posted by *kaosstar*
> 
> Yeah, I've been saying for a while that I think the 6 core chips will be the real hit.
> 
> At the bottom end will be the 6c 12t R5 1500 for $229. Max OC to Max OC, it might perform even better than the 1800X in gaming. And at a price of $120 less than a 7700K it will look like a real bargain.


Hopefully by the time they get released a lot of these "issues" will be worked out too! I'm happy with my 4770K, but I want something new! The last AMD chip I had was... and get ready... a Duron 850! That was back when you could gain performance with a #2 pencil.


----------



## tygeezy

Some screen caps I took from Joker's 720p everything-low video.











I'd really like to see a raw video with BF1 multiplayer. You can't really make a fair comparison there, but from the graphs I've seen, Ryzen has better frame times in DX11 multiplayer.


----------



## Newbie2009

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Perhaps.. But I think it has done Ryzen a great disservice actually.
> 
> Pretending that things are fine and dandy and this is the performance Ryzen offers, then after the fact talking about how buggy things were on the software side, using ram speeds and timings that are sub-par, etc.
> 
> They then (again after the reviews are posted), are saying the reason some reviewers are seeing far better results with Ryzen might be because they actually had boards that were working the way they should be..
> 
> That doesn't sound like wanting to show them in a positive light to me.. Quite the opposite in fact. It would be far more beneficial to their readers if they criticized AMD for rushing to release too early, and said that stuff clearly isn't adding up regarding its gaming performance.


Yeah, I hear you. I presume we will need updated reviews, maybe by the end of March? Regardless, it looks like AMD won't keep up with demand for a couple of months, so it should all work itself out.


----------



## czin125

http://www.asrock.com/mb/AMD/X370%20Taichi/index.us.asp#Memory

DDR4 3200 8GB G.Skill F4-3200CL14D-16GBTZ SS 2pcs
DDR4 3200 8GB G.Skill F4-3200CL14D-16GFXR SS 2pcs
DDR4 3200 8GB G.Skill F4-3200CL14D-16GFX SS 2pcs

The newer RAM modules work fine at 3200.


----------



## Derp

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Not to mention, in the titles with those 100+fps differences, Ryzen would also be fine driving a 240Hz monitor..


This isn't true. For one example, at 3:22 you can see the Ryzen CPU at 116 fps while the 7700K is at 193 fps. In this situation Ryzen can't even keep up with 144 Hz panels.

Edit: This should be ignored. For some reason he tested DX11 on Ryzen and DX12 on the 7700K for this specific game.


----------



## redone13

Quote:


> Originally Posted by *Pro3ootector*


Quote:


> Originally Posted by *CataclysmZA*
> 
> Lower is better when it comes to Dolphin, because that's the time taken to complete the overall run. It prefers high single-core performance and high clock speeds.


I see plenty of gaming benchmarks and whatnot, but we can only wait and hope that reviewers address emulation benches on Ryzen.


----------



## tygeezy

Quote:


> Originally Posted by *Derp*
> 
> This isn't true. For one example at 3:22 you can see the Ryzen cpu at 116fps while the 7700k is at 193fps. In this situation Ryzen can't even keep up with 144Hz panels.
> 
> This should be ignored. For some reason he tested DX11 on Ryzen and DX12 on the 7700k for this specific game.


I was just about to say that!


----------



## GorillaSceptre

Quote:


> Originally Posted by *Derp*
> 
> This isn't true. For one example at 3:22 you can see the Ryzen cpu at 116fps while the 7700k is at 193fps. In this situation Ryzen can't even keep up with 144Hz panels.


Okay, that's disingenuous..

You just picked a time stamp where the Ryzen chip has a drop; most of the time it's far higher than that. Secondly, you said 100+ fps.

Not to mention (once again) that the 7700K is at 5 GHz, at 720p with everything set to low.. If you are treating Joker's 720p-low results as fact, then I'm sure you wouldn't mind treating his 1080p ones the same, right? At 1080p an 8-core is able to keep up with a 5 GHz 4-core in _*gaming*_; just think about that for a second..


----------



## ZealotKi11er

Quote:


> Originally Posted by *tygeezy*
> 
> Some screen caps i took from Joker 720 p everything low vid.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> id really like to see a raw video with bf1 multiplayer. You can't really make a fair comparison there, but from the graphs i've seen ryzen has better frame-times in dx11 multiplayer.


There is a problem with GTA: lower fps, but it's using the cores like crazy.


----------



## budgetgamer120

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Okay, that's disingenuous..
> 
> You just picked a time stamp where the Ryzen chip has a drop, most of the time it's far higher than that. Secondly, you said 100fps+.
> 
> Not to mention (once again), that the 7700K is at 5GHz, @ 720p with everything set to low.. If you are treating Joker's 720P-low results as fact then I'm sure you wouldn't mind treating his 1080P ones the same right? At 1080P an 8 core is able to keep up with a 5GHz 4core in _*gaming*_, just think about that for a second..


See the edit.


----------



## tygeezy

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There is a problem with GTA. Lower fps but its using the cores like crazy.


This is Watch Dogs 2, I believe. I did notice the high core usage and underwhelming performance. Watch Dogs 2 is heavily multithreaded, though.


----------



## GorillaSceptre

Quote:


> Originally Posted by *budgetgamer120*
> 
> See the edit.


Was disingenuous either way. I could go screen grab the 7700K during a drop..

What is strange is why Ryzen's performance is so erratic; the 7700K is mostly steady as a rock.


----------



## budgetgamer120

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Was disingenuous either way. I could go screen grab the 7700K during a drop..
> 
> What is strange is why Ryzens performance is so erratic, the 7700K is mostly like a rock.


Meh, if I cared about 720p low-settings gaming, my Xbox One would be my go-to machine.


----------



## GorillaSceptre

Quote:


> Originally Posted by *budgetgamer120*
> 
> Meh if I cared about 720p low settings gaming my Xbox One would be my go to machine.


Difference is even the xbone doesn't play on low..

So I guess if you are building a new $1500+ machine purely for gaming, with no productivity work of any sort, and you spent all that money to play at 720p with everything set to low, then the choice is easy.


----------



## tpi2007

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *budgetgamer120*
> 
> See the edit.
> 
> 
> 
> Was disingenuous either way. I could go screen grab the 7700K during a drop..
> 
> What is strange is why Ryzens performance is so erratic, the 7700K is mostly like a rock.

Threads are possibly being unnecessarily shuffled across the two CCXes, whereas the 7700K doesn't have that problem.

Maybe a quick fix would be to tell the Windows scheduler to treat the Ryzen 7 CPUs as a mix between a Core 2 Quad (two dual core dies on the same package) and a Pentium D Extreme Edition (two Pentium 4 with HT CPU dies on the same package). I'm oversimplifying, but an adapted fix like this should produce immediate results.
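In the meantime, CPU affinity offers a user-side workaround: pinning a game to the logical processors of a single CCX keeps its threads from migrating between the two. The bitmask math is simple; a minimal sketch, assuming (this mapping is an assumption and can vary by board and OS) that logical CPUs 0-7 are the four cores plus SMT siblings of CCX0 on an 8-core Ryzen 7:

```python
# Build a CPU-affinity mask covering a single CCX, so a game's threads
# stay on one CCX instead of bouncing across both.
# Assumption (may differ per system): logical processors 0-7 = CCX0.

def ccx_mask(first_logical, count=8):
    """Bitmask selecting `count` consecutive logical processors."""
    mask = 0
    for cpu in range(first_logical, first_logical + count):
        mask |= 1 << cpu
    return mask

mask = ccx_mask(0)   # CCX0 -> logical CPUs 0-7
print(hex(mask))     # 0xff

# On Windows this hex value can be passed to `start /affinity ff game.exe`;
# on Linux the equivalent is os.sched_setaffinity(pid, range(0, 8)).
```

This is only a stopgap; a real scheduler fix would make the OS CCX-aware instead of forcing users to pin processes by hand.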


----------



## PROBN4LYFE

Quote:


> Originally Posted by *ryboto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SoloCamo*
> 
> *720p low settings* 5ghz 7700k vs 3.9ghz 1700 (w/ 3000mhz memory)
> 
> 
> 
> 
> 
> 
> *Ryzen clearly is a fine gaming cpu if it's doing this well at 720p low, a setting nobody will use*
> 
> 
> 
> He's definitely appealing to the masses! Maybe next he'll do a clock for clock?

I like how the CPU usage on the 1700 rarely even hits 80% on one core, while the 7700K is consistently over 80% on multiple cores. Even if the FPS is lower, you've got to give AMD a win for coming out with a decent processor that doesn't use 450 watts at load.








I might have to build one just to say I have one now!


----------



## Wishmaker

When Intel does something anti-competitive, OCN goes nuts. When AMD tells people to bench at 4K to bottleneck the GPU, creating a GPU-side limitation where the difference in IPC is minimised, nobody bats an eye!!









----------



## GorillaSceptre

Quote:


> Originally Posted by *tpi2007*
> 
> Threads are possibly being unnecessarily shuffled across the two CCXes, whereas the 7700K doesn't have that problem.
> 
> Maybe a quick fix would be to tell the Windows scheduler to treat the Ryzen 7 CPUs as a Core 2 Quad (2 dual core dies on the same package), or even better, as a Pentium D Extreme Edition (two Pentium 4 with HT CPUs dies on the same package.) I'm oversimplifying, but an adapted fix like this should produce immediate results.


Man..

Between Windows scheduling, SMT, RAM bugs, UEFI and microcode problems, the logistics of the MB manufacturers, they then have to deal with games being optimized for Intel on top of it all.. I don't envy working at AMD for the next few months that's for sure.


----------



## BinaryDemon

Quote:


> Originally Posted by *AuraNova*
> 
> Personally, I don't feel the R5 6-core chips are going to perform any better or that much different than R7.


It's possible given that there will be more cache memory per core and less heat so it might stay at higher turbo speeds longer, or overclock better.


----------



## tacobob89

Anyone heard anything about which mobos are looking the best so far? I've been looking but haven't found much.


----------



## tpi2007

Quote:


> Originally Posted by *tacobob89*
> 
> Anyone heard anything about which mobos are looking the best so far? I've been looking but haven't found much.


The guy from Joker Productions has the Gigabyte Aorus AX370 Gaming 5 and it seems to be doing fine with the Corsair RAM at 3,000 MHz.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Wishmaker*
> 
> When INTEL does something anti-competitive, OCN goes nuts. When AMD tells people to bench 4k to bottleneck the GPU to create a gpu side limitation where the difference in IPC would be minimised nobody bats an eye!!
> 
> 
> 
> 
> 
> 
> 
> .


*Breaking news! Corporation supplies review guides in order to paint their product in the best possible light*










There's nothing anti-competitive about that.. Did they say "Don't post 720p, 1080p, etc., or we'll sue you"? Intel, Nvidia, AMD: they all supply review guides and want their products to be shown in the best, or least damaging, way possible.. Do you think the marketing guys who handle all of that are paid to sell the competition's products?


----------



## tacobob89

Quote:


> Originally Posted by *GorillaSceptre*
> 
> There's nothing anti-competitive about that.. Did they say - "Don't post 720P, 1080P, etc., etc., or we'll sue you"? Intel, Nvidia, AMD, they all supply review guides and want their products to be shown in the best, or least damaging way possible..


This


----------



## teh-yeti

Quote:


> Originally Posted by *tacobob89*
> 
> Anyone heard anything about which mobos are looking the best so far? I've been looking but haven't found much.


Guy from Hardware Unboxed mentioned that the Asus Crosshair was a bit of a nightmare, but his problems were resolved when ASRock sent him a Taichi that he used for his benches. Looks to me like ASRock is doing good AM4 boards. Hopefully we'll have some more specific reviews, though. And mini-ITX boards, which we still know NOTHING about. I promise I'm not salty.


----------



## AuraNova

Quote:


> Originally Posted by *BinaryDemon*
> 
> It's possible given that there will be more cache memory per core and less heat so it might stay at higher turbo speeds longer, or overclock better.


That's what a lot of us are hoping, but if the R5 is anything like the R7's build, you might only get a small clock increase over the R7. I am under the impression, like others, that it will perform and overclock better. I hope, with the updates on top of all this, that the 6-core R5 winds up being the sleeper of the Ryzen line.


----------



## oxidized

Quote:


> Originally Posted by *Wishmaker*
> 
> When INTEL does something anti-competitive, OCN goes nuts. When AMD tells people to bench 4k to bottleneck the GPU to create a gpu side limitation where the difference in IPC would be minimised nobody bats an eye!!
> 
> 
> 
> 
> 
> 
> 
> .


I noticed that too


----------



## SoloCamo

Quote:


> Originally Posted by *oxidized*
> 
> I noticed that too


Besides GN, did anyone else mention this?


----------



## tpi2007

Regarding XFR and the confusion over whether the Ryzen 7 1700 has it or not: Hardware Canucks explains in the video review at 4:05 (see below) that the X CPUs get a 100 MHz XFR boost, while the non-X CPUs get just a 50 MHz XFR boost. That's consistent with what I had read elsewhere, that the 1700 could XFR boost to 3.75 GHz.






So, that seems to be cleared up. On to the next batch of unsolved mysteries and problems.
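The headroom math is easy to lay out; a quick sketch using the rated boost clocks as commonly listed for the three chips (treat the exact base numbers as assumptions rather than verified spec, only the +100/+50 MHz split comes from the explanation above):

```python
# Max single-core XFR clocks implied by the HWC explanation:
# X-series parts get +100 MHz over rated boost, non-X parts get +50 MHz.
# Rated boost clocks below are as commonly listed; treat as assumptions.

XFR_BUMP_MHZ = {"X": 100, "non-X": 50}

chips = {
    "Ryzen 7 1800X": (4000, "X"),
    "Ryzen 7 1700X": (3800, "X"),
    "Ryzen 7 1700":  (3700, "non-X"),
}

for name, (boost_mhz, tier) in chips.items():
    xfr = boost_mhz + XFR_BUMP_MHZ[tier]
    print(f"{name}: boost {boost_mhz} MHz -> XFR {xfr} MHz")
```

With those numbers the 1700 lands at 3750 MHz, which matches the 3.75 GHz figure floating around.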


----------



## Kevin Sia

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Difference is even the xbone doesn't play on low..
> 
> So I guess if you are building a new $1500+ machine, and you don't do any sort of productivity work, it's only for gaming, and you also spent all that money to play at 720P with everything set to low, then the choice is easy.


Seriously, these tests are completely useless; nobody plays at 720p with a high-end rig.


----------



## Kand

Quote:


> Originally Posted by *tpi2007*
> 
> Regarding XFR and that confusion of whether the Ryzen 7 1700 has it or not, Hardware Canucks explains in the video review at 4:05 (see below) that the X CPUs have a 100 Mhz XFR boost, while the non X CPUs have just a 50 Mhz XFR boost. That's consistent with what I had read somewhere else that the 7 1700 could XFR boost to 3.75 Ghz.
> 
> 
> 
> 
> 
> 
> So, that seems to be cleared up. On to the next batch of unsolved mysteries and problems.


What an absolutely useless feature.............


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kand*
> 
> What an absolutely useless feature.............


That is why I am worried about Vega. All these useless features AMD overhyped.


----------



## Kand

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That is why I am worried about Vega. All these useless features AMD overhyped.


Vega doesn't exist.


----------



## kaosstar

Quote:


> Originally Posted by *SoloCamo*
> 
> Besides GN, did anyone else mention this?


From PC Gamer: "AMD's first suggestion was to test at 1440p or 4K, which is complete bunk. "
http://www.pcgamer.com/the-amd-ryzen-7-review/3/

However, this is par for the course for new hardware, from what I understand.


----------



## tacobob89

OK, XFR is a joke. But what are all these other useless features that were hyped up, though? 1080p is old tech at this point; shouldn't the focus be on 4K+ resolutions? I don't understand anyone building a nice rig who is aiming to run at 1080p, that's silly.


----------



## rt123

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Was disingenuous either way. I could go screen grab the 7700K during a drop..
> 
> What is strange is why Ryzens performance is so erratic, the 7700K is mostly like a rock.


Because one is an iterative improvement on an architecture that has been around for a few years, while the other is a brand-new architecture. Schedulers have caught up on how to properly utilize the established architectures, while Zen will need some work; hopefully it will achieve consistency and maybe even a small boost.


----------



## naz2

Quote:


> Originally Posted by *SoloCamo*
> 
> Besides GN, did anyone else mention this?


from amd's reddit interview:
Quote:


> Originally Posted by *AMD_LisaSu*
> Ryzen is doing really well in 1440p and 4K gaming when the applications are more graphics bound. And we do exceptionally well in rendering and workstation applications where more cores are really useful. In 1080p, we have tested over 100+ titles in the labs&#8230;. And depending on the test conditions, we do better in some games and worse in others. We hear people on wanting to see improved 1080p performance and we fully expect that Ryzen performance in 1080p will only get better as developers get more time with "Zen". We have over 300+ developers now working with "Zen" and several of the developers for Ashes of Singularity and Total Warhammer are actively optimizing now


basically corroborates what reviewers have been saying


----------



## SoloCamo

All I know is, if Zen+ can hit 4.5 GHz, that makes it a no-brainer upgrade for most people.


----------



## fidler2k

The problem with the argument that you game at 1440p or 4K so "who cares about 1080p" is that you will most likely upgrade your GPU before upgrading your CPU in the next couple of years (people generally keep their CPUs longer). In the near-term future, 1440p will become the new standard resolution for high-performance GPUs, thus making the CPU a more important aspect at 2K, like it is at 1080p today. Of course, this may take 2-3 years, but people investing more than $300 in a CPU won't be ready to upgrade in this timeframe.


----------



## AmericanLoco

But Ryzen can only get faster as software updates roll out and games become more threaded. Plus AMD commits to platforms, unlike Intel.


----------



## tacobob89

I agree with Loco; I think in time this CPU is going to come into its own. Multi-core CPUs have been relevant for a long time now; you'd think game developers would have begun optimizing games to take advantage of the extra horsepower long ago, or at least more than they have to this point. So much untapped power.


----------



## tpi2007

Quote:


> Originally Posted by *Kand*
> 
> What an absolutely useless feature.............


From what I said earlier, this doesn't seem like a feature at all; it's just an artificial segmentation of the Precision Boost feature made for PR purposes.

And in my opinion, the reason they did it is so they could present the Cinebench single-threaded CPU score as a tie versus the 6900K. Meanwhile, in the background, the 6900K was turboing a single core to 4 GHz, while the 1800X was turboing to 4.1 GHz; but since the slides present 3.6 GHz base / 4 GHz turbo, it looks more like a direct comparison. It's a small detail, because it ends up being a clock-for-clock deficit of about 2.5%, but still.

They do acknowledge this in the Press Briefing Call slides, but it doesn't look so good when actually doing a live benchmark on the Internet at the release event like Lisa Su did, because there is a single-threaded IPC stigma associated with the previous architecture, so anything to get rid of that perception, even if the actual difference is small, is a plus in the PR book. That's probably also why they went through those last revisions, where the F3 stepping that was initially presented boosts to 3.9 GHz and the F4 goes to 4 GHz.

I noticed back then that she said "it's roughly a tie" (see below), which was weird. Why say that when the score was exactly the same 162 CB for both the 1800X and the 6900K? That's why. She seems like an honest person who isn't much into this kind of small trickery to micro-influence the audience.
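The clock arithmetic above can be sanity-checked in a few lines (the boost and XFR values are assumed from launch-era coverage, so treat them as illustrative rather than authoritative):

```python
# XFR adds +100 MHz on X parts and +50 MHz on non-X parts (per the
# Hardware Canucks explanation quoted earlier in the thread).
# Clock values assumed from launch coverage; illustrative only.
boost_6900k = 4.0           # GHz, single-core turbo
boost_1800x = 4.0 + 0.100   # GHz, turbo + XFR
boost_1700 = 3.7 + 0.050    # GHz, turbo + XFR

deficit = boost_1800x / boost_6900k - 1
print(f"1800X XFR ceiling: {boost_1800x:.2f} GHz")   # 4.10
print(f"1700 XFR ceiling:  {boost_1700:.2f} GHz")    # 3.75
print(f"Clock-for-clock gap at the same CB score: {deficit:.1%}")  # 2.5%
```

Which matches both the 3.75 GHz figure for the 1700 quoted earlier and the roughly 2.5% clock-for-clock deficit mentioned above.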


----------



## CriticalOne

I've scrolled through the over 1,900 posts in this thread, and overall I have to say that I'm pretty disappointed in all the hostility.

People, we can have differences of opinion without the other person having to be a shill, a troll, an idiot, a fanboy, etc. What happens when it becomes okay to personally insult other users over opinions is that people start to leave the forum, as they feel their presence isn't welcome anymore. If you want to bash people for what they think, go to WCCFTech's comment section.

Relax. I know Ryzen is exciting and polarizing, but it's just computer hardware at the end of the day.


----------



## budgetgamer120

We just have to ignore the concern-trolling technique of some people.


----------



## JedixJarf

Quote:


> Originally Posted by *Liranan*
> 
> Now that is sexy but way beyond my reach so I will gladly settle for a 1700. If I had enough people accessing my server then I might be able to justify such a cost, especially if I could get donations to offset the price of a setup like this but I can't possibly justify the cost for only a few people accessing the server.


I paid $150 for a 12-core Xeon ES chip on eBay, lol.


----------



## kd5151

like me. AMD sucks.



----------



## IRobot23

Okay, guys.
Joker posted a 720p benchmark where the i7 [email protected] was sometimes just 10-15% faster than the [email protected], and sometimes way faster.

*also check out the frame times (ms)*

I saw this image on a forum. Ryzen has less fps yet way more GPU usage?


----------



## kaosstar

I'm not quite sure how to take Joker's benchmarks: http://www.toptengamer.com/amd-ryzen-7-1700-vs-intel-i7-7700k-1800x/

Is he a fanboy who rigged the testing, or is he just one of the only competent reviewers?
After seeing his videos, I'm leaning toward the latter.


----------



## tacobob89

Quote:


> Originally Posted by *kaosstar*
> 
> I'm not quite sure how to take Joker's benchmarks: http://www.toptengamer.com/amd-ryzen-7-1700-vs-intel-i7-7700k-1800x/
> 
> Is he a fanboy who rigged the testing, or is he just one of the only competent reviewers?
> After seeing his videos, I'm leaning toward the latter.


This is what I was thinking after doing my research also.


----------



## IRobot23

Quote:


> Originally Posted by *kaosstar*
> 
> I'm not quite sure how to take Joker's benchmarks: http://www.toptengamer.com/amd-ryzen-7-1700-vs-intel-i7-7700k-1800x/
> 
> Is he a fanboy who rigged the testing, or is he just one of the only competent reviewers?
> After seeing his videos, I'm leaning toward the latter.


check out picture.


----------



## RedM00N

Quote:


> Originally Posted by *IRobot23*
> 
> Okay, guys
> Joker posted 720P benchmark where i7 [email protected] was sometimes just 10-15% faster than [email protected] and sometimes way faster.
> 
> *also check out ms*
> 
> I saw this image on forum. Ryzen has less fps yet way more GPU usage?


Look at the VRAM usage too: over 1700 MB for the 7700K and 5700 MB for the Ryzen, so that could explain the higher GPU load, as this normally means different graphics settings. Now I know WD2 is a sandbox game and loads and numbers can vary, but that's way too much if the settings were the same.


----------



## ZealotKi11er

Quote:


> Originally Posted by *IRobot23*
> 
> Okay, guys
> Joker posted 720P benchmark where i7 [email protected] was sometimes just 10-15% faster than [email protected] and sometimes way faster.
> 
> *also check out ms*
> 
> I saw this image on forum. Ryzen has less fps yet way more GPU usage?


That one I do not understand. Look at the Ryzen usage: it's in the 70% range, roughly 6-core usage. At the same time, the GPU usage is double that of the 7700K, yet the same fps? That should not be possible.


----------



## naz2

The 7700K system has half the GPU usage and a third of the memory but gets more fps. What? That guy's numbers are just nonsense and should be discarded. Did he even double-check his results? Also, btw, that's not GTA, it's Watch Dogs 2.


----------



## edgeofblade

Quote:


> Originally Posted by *CriticalOne*
> 
> I've scrolled through the over 1900 posts on this thread and overall I have to say that i'm pretty disappointed in all the hostility.
> 
> People, we can have differences in opinion without the other person having to be a shill, a troll, an idiot, a fanboy, etc... What happens when it becomes okay to personally insult over users over opinions is that people start to leave the forum as they feel that their presence isn't welcome anymore. If you want to go bash people for what they think, go to WCCFTech's comment section.
> 
> Relax. I know Ryzen is exciting and polarizing, but its just computer hardware at the end of the day.


NO! This is OCN, where if you aren't using Windows begrudgingly enough and sighing every three minutes about how much better the world would be if corporations were only in it for the advancement of a Star Trek like socialist state, you're no better than a console peasant. Linux Master Race! Down with greedy corporations, like $pybook, Micro$oft, and Int€l. MY choices are CLEARLY more morally founded, and if this isn't immediately apparent to you, me and my clones will jump you in the back alley.


----------



## RedM00N

Quote:


> Originally Posted by *naz2*
> 
> the 7700k system has half of the GPU usage and a third of the memory but gets more fps. what? that guy's number are just nonsense and should be discarded. did he even double-check his results? also btw that's not gta it's watch dogs 2


Ah right. I keep getting them mixed up (the visuals throw me off).


----------



## Majin SSJ Eric

People have been saying members would abandon OCN over fanboy arguments for literally as long as I've been here. IMO people just need thicker skin around here...


----------



## Xuper

Quote:


> Originally Posted by *IRobot23*
> 
> Okay, guys
> Joker posted 720P benchmark where i7 [email protected] was sometimes just 10-15% faster than [email protected] and sometimes way faster.
> 
> *also check out ms*
> 
> I saw this image on forum. Ryzen has less fps yet way more GPU usage?


Read all (Link, Link 2, Link 3) and download Coreinfo v3.31.

Windows treats Ryzen like this (Link — you can see that the cache detection is completely broken), but a proper CPU like this (Core i7 3770K).

Quote:


> I noted above that the 8c/16t chips have 2 CCXs on them. Each CCX contains 8MB of the L3 cache, for a total of 16MB. Ryzen's architecture is such that if a thread on one CCX needs to access the cache in the other CCX, it needs to talk through a bus system that goes through the memory controller. The bandwidth of this interconnection is *only 22GB/s, about the speed of DDR3-1600. SLOW.* This introduces two possible problematic scenarios:
> 
> 
> 
> A thread saturates more than half of the total L3 cache. In this case, let's say a thread in CCX0 needs to access a library that is 12MB large. It's all in the L3 cache, but some of it is in the half of the cache that is in CCX1. It queries both caches simultaneously, but has to *wait* for some of the data in CCX1.
> 
> 
> 
> Windows likes to swap loads on threads often. Basically, it will switch serial computations (not serial in the traditional sense, but serial as in it will re-use data that has already been computed and put into cache) onto new threads with no regard for where that thread is physically located. This makes sense for Intel and older AMD products, where there exists a unified cache across all cores. This is bad for Ryzen, as moving cache accesses from one CCX to the other is extremely expensive.


Quote:


> As things stand right now, Windows are essentially mishandling/scheduling the threads on Ryzen, and this could be solved to a high extent (although not completely). Thing is, we don't exactly know how Windows treats Ryzen: Like a normal Intel CPU or like an old AMD bulldozer based CPU, however, a solution seems to be possible.
> 
> This could also explain the very odd performance behaviour of Watch Dogs 2. A game that can utilize a lot of threads yet performs poorly on Ryzen:


Unfortunately, the 6-core parts will still have this issue; however, the 4-core Ryzens will be faster than the 8/6-core parts in games.
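The "22 GB/s, about the speed of DDR3-1600" comparison quoted above is easy to sanity-check (assuming the standard 64-bit DDR channel width; a sketch, not a measurement):

```python
# DDR3-1600 theoretical bandwidth: 1600 MT/s * 8 bytes per transfer
# on a 64-bit channel. Assumed standard values, for illustration.
mt_per_s = 1600e6
bytes_per_transfer = 8  # 64-bit channel

single_channel = mt_per_s * bytes_per_transfer / 1e9  # GB/s
dual_channel = 2 * single_channel

print(f"DDR3-1600 single channel: {single_channel:.1f} GB/s")  # 12.8
print(f"DDR3-1600 dual channel:   {dual_channel:.1f} GB/s")    # 25.6
# The quoted ~22 GB/s cross-CCX figure sits between the two,
# i.e. roughly dual-channel DDR3-1600 territory.
```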


----------



## technodanvan

Quote:


> Originally Posted by *edgeofblade*
> 
> NO! This is OCN, where if you aren't using Windows begrudgingly enough and sighing every three minutes about how much better the world would be if corporations were only in it for the advancement of a Star Trek like socialist state, you're no better than a console peasant. Linux Master Race! Down with greedy corporations, like $pybook, Micro$oft, and Int€l. MY choices are CLEARLY more morally founded, and if this isn't immediately apparent to you, me and my clones will jump you in the back alley.


I can't like this comment enough.


----------



## JackCY

Quote:


> Originally Posted by *IRobot23*
> 
> Okay, guys
> Joker posted 720P benchmark where i7 [email protected] was sometimes just 10-15% faster than [email protected] and sometimes way faster.
> 
> *also check out ms*
> 
> I saw this image on forum. Ryzen has less fps yet way more GPU usage?


Joker has the more decent Gigabyte mobo and fewer issues; overall, the performance even of the 1700 vs. the 7700K is awesome from my POV.
His detailed videos also show why some games run better on Intel and worse on AMD Ryzen: they don't load the cores right at all on AMD. This especially hits in DX12, which overall has been a failure so far by M$ and the developers using it.


----------



## jprovido

This thread is dead to me. Calling someone an "Intel shill", questioning someone's integrity because you don't agree with their review/methodology, etc., is a disgrace. I'm out.


----------



## SuperZan

Quote:


> Originally Posted by *Xuper*
> 
> Read all ( Link , Link 2 , Link 3) , Download Coreinfo v3.31
> 
> Windows treats Ryzen like this ( Link  , You see that the cache detection is completely broken ) But a proper CPU like this ( Core i7 3770K ).
> 
> Unfortunately, the 6 cores will still have this issue, however Ryzen 4 Cores will be faster than 8/6 cores in game


Thank you for a useful post rather than regurgitated hyperbole.

As far as I understand it, these are not insurmountable problems.


----------



## AuraNova

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> People have been saying members would abandon OCN over fanboy arguments for literally as long as I've been here. IMO people just need thicker skin around here...


It's not like they won't find this in other computer forums. I think people need better ignoring skills. Granted, I'm not perfect either.

Some of the comments on this forum make me just shake my head. I just move on.


----------



## IRobot23

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *RedM00N*
> 
> Look at vram usage too. Over 1700mb for the 7700 and 5700mb for the Ryzen, so it could explain the higher gpu load as this normally means different graphics settings. Now I know gta is a sandbox and loads and numbers can vary, but that's way to much if settings were the same.


Quote:


> Originally Posted by *Xuper*
> 
> Read all ( Link , Link 2 , Link 3) , Download Coreinfo v3.31
> 
> Windows treats Ryzen like this ( Link  , You see that the cache detection is completely broken ) But a proper CPU like this ( Core i7 3770K ).
> 
> Unfortunately, the 6 cores will still have this issue, however Ryzen 4 Cores will be faster than 8/6 cores in game






check what I said!


----------



## Xuper

ATM, who here has a Ryzen? Please use the "Set Affinity" feature in Task Manager and test it. Also, if you disable the second CCX (cores 5 to 8), please overclock it to above 4 GHz and let us know. Thanks.
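For anyone trying the Set Affinity experiment, here is a small sketch of the mask arithmetic involved. The CCX-to-logical-CPU mapping below is an assumption (logical CPUs 2n and 2n+1 as the SMT siblings of physical core n, with CCX0 holding cores 0-3); it varies by system, so verify with a topology tool such as the Coreinfo mentioned earlier before pinning anything:

```python
# Hedged sketch: build the affinity bitmask for one CCX on an 8c/16t
# Ryzen, under the assumed logical-CPU numbering described above.
def ccx_mask(ccx_index, cores_per_ccx=4, smt=2):
    """Bitmask of the logical CPUs belonging to one CCX."""
    first = ccx_index * cores_per_ccx * smt
    mask = 0
    for cpu in range(first, first + cores_per_ccx * smt):
        mask |= 1 << cpu
    return mask

mask = ccx_mask(0)
print(f"CCX0 = logical CPUs 0-7 -> affinity mask {mask:#x}")  # 0xff
# Windows: start /affinity FF game.exe
# Linux:   taskset 0xff ./game
```

The same mask is what Task Manager's Set Affinity checkboxes build for you; ticking only CPUs 0-7 keeps all threads on one CCX and avoids the cross-CCX cache traffic discussed above.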


----------



## budgetgamer120

I wonder how low AMD stock is gonna go.


----------



## edgeofblade

Quote:


> Originally Posted by *jprovido*
> 
> This thread is dead to me. calling someone "intel shill" is a disgrace questioning someones integrity because you don't agree with their review/methodology etc. I'm out


To be fair, disagreements over methodology go to the root of what it means to use the scientific method.

Also required is plenty of name calling.


----------



## GorillaSceptre

The problem is it's always all or nothing..

At one point Intel 4 cores were "dead", and now Ryzen 8 cores are "trash" for gaming..

The choices are pretty easy as far as I'm concerned.. Ryzen 7 is a poor choice if you are strictly going for gaming, considering its price; the 7700K is far better for games and is cheaper/the same price. The 7700K is a poor choice for productivity, considering its price; the 1700 is far better for productivity and is cheaper/the same price. There's usually a compromise somewhere.

Which is better as a jack of all trades? Imo it's easily Ryzen 7, it also (imo) makes the 6900K irrelevant for 90% of people who won't take advantage of Intel's HEDT features.

Ryzen 7 has in a way shaken up the productivity market (the only competition for the 1800X costs twice as much), but it's very clear that 99% of the hype has been gamers, that's the market most people wanted a shake up in.. On that front Ryzen 7 has failed, so I get why so many people would be disappointed.

I think AMD made a big blunder by keeping Ryzen 5 so close to their chests, and sending Ryzen 7 out to battle chips like the 7700K for the same price.. I think the ultimate success of Ryzen lies with the 1600X/1500X, 8 cores were never going to challenge fast 4 cores today. I just hope they clock well, either way they'll at least bring some competition for gamers, Ryzen 7 hasn't done that.

Maybe Skylake-E will bring chips with no compromises, they'll rip games and productivity workloads to shreds, but I'm sure we'll have to pay a pretty penny for those.


----------



## aDyerSituation

Quote:


> Originally Posted by *GorillaSceptre*
> 
> The problem is it's always all or nothing..
> 
> At one point Intel 4 cores were "dead", and now Ryzen 8 cores are "trash" for gaming..
> 
> The choices are pretty easy as far as i'm concerned.. Ryzen 7 is a poor choice if you are strictly going for gaming considering its price, the 7700K is far better for games and is cheaper/the same price. The 7700K is a poor choice for productivity considering its price, the 1700 is far better for productivity and is cheaper/the same price. There's usually a compromise somewhere.
> 
> Which is better as a jack of all trades? Imo it's easily Ryzen 7, it also (imo) makes the 6900K irrelevant for 90% of people who won't take advantage of Intel's HEDT features.
> 
> Ryzen 7 has in a way shaken up the productivity market (the only competition for the 1800X costs twice as much), but it's very clear that 99% of the hype has been gamers, that's the market most people wanted a shake up in.. On that front Ryzen 7 has failed, so I get why so many people would be disappointed.
> 
> I think AMD made a big blunder by keeping Ryzen 5 so close to their chests, and sending Ryzen 7 out to battle chips like the 7700K for the same price.. I think the ultimate success of Ryzen lies with the 1600X/1500X, 8 cores were never going to challenge fast 4 cores today. I just hope they clock well, either way they'll at least bring some competition for gamers, Ryzen 7 hasn't done that.
> 
> Maybe Skylake-E will bring chips with no compromises, they'll rip games and productivity workloads to shreds, but I'm sure we'll have to pay a pretty penny for those.


This sums it up for me. I just wanted the best of both worlds mannnn


----------



## tpi2007

Quote:


> Originally Posted by *Xuper*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *IRobot23*
> 
> Okay, guys
> 
> Joker posted 720P benchmark where i7 [email protected] was sometimes just 10-15% faster than [email protected] and sometimes way faster.
> 
> 
> *also check out ms*
> 
> I saw this image on forum. Ryzen has less fps yet way more GPU usage?
> 
> 
> 
> Read all ( Link , Link 2 , Link 3) , Download Coreinfo v3.31
> 
> Windows treats Ryzen like this ( Link  , You see that the cache detection is completely broken ) But a proper CPU like this ( Core i7 3770K ).
> Quote:
> 
> 
> 
> I noted above that the 8c/16t chips have 2 CCXs on them. Each CCX contains 8MB of the L3 cache, for a total of 16MB. Ryzen's architecture is such that if a thread on one CCX needs to access the cache in the other CCX, it needs to talk through a bus system that goes through the memory controller. The bandwidth of this interconnection is *only 22GB/s, about the speed of DDR3-1600. SLOW.*
> This introduces two possible problematic scenarios:
> 
> 
> 
> A thread saturates more than half of the total L3 cache. In this case, let's say a thread in CCX0 needs to access a library that is 12MB large. It's all in the L3 cache, but some of it is in the half of the cache that is in CCX1. It queries both caches simultaneously, but has to *wait*
> for some of the data in CCX1.
> 
> 
> 
> Windows likes to swap loads on threads often. Basically, it will switch serial computations (not serial in the traditional sense, but serial as in it will re-use data that has already been computed and put into cache) onto new threads with no regard for where that thread is physically located. This makes sense for Intel and older AMD products, where there exists a unified cache across all cores. This is bad for Ryzen, as moving cache accesses from one CCX to the other is extremely expensive.
> 
> 
> 
> 
> Quote:
> 
> 
> 
> As things stand right now, Windows are essentially mishandling/scheduling the threads on Ryzen, and this could be solved to a high extent (although not completely). Thing is, we don't exactly know how Windows treats Ryzen: Like a normal Intel CPU or like an old AMD bulldozer based CPU, however, a solution seems to be possible.
> 
> This could also explain the very odd performance behaviour of Watch Dogs 2. A game that can utilize a lot of threads yet performs poorly on Ryzen:
> 
> 
> Unfortunately, the 6 cores will still have this issue, however Ryzen 4 Cores will be faster than 8/6 cores in game

Yeah, it's probably that. I was thinking along the same lines a few hours ago:

Quote:


> Originally Posted by *tpi2007*
> 
> The blame in this case lies on AMD, just like it did on Intel's side on other occasions (see below). They need to provide a driver for Windows to help it schedule things properly and manage power states on the two CCXes, because this isn't exactly a native eight-core CPU; it's two quad cores tightly put together with an advanced interface, but it's still two modules with 8 MB of L3 cache each. Interestingly, the quad cores should thus have fewer problems even without any fixes because of this.


Quote:


> Originally Posted by *tpi2007*
> 
> Threads are possibly being unnecessarily shuffled across the two CCXes, whereas the 7700K doesn't have that problem.
> 
> Maybe a quick fix would be to tell the Windows scheduler to treat the Ryzen 7 CPUs as a mix between a Core 2 Quad (two dual core dies on the same package) and a Pentium D Extreme Edition (two Pentium 4 with HT CPU dies on the same package). I'm oversimplifying, but an adapted fix like this should produce immediate results.


----------



## IRobot23

Quote:


> Originally Posted by *GorillaSceptre*
> 
> The problem is it's always all or nothing..
> 
> At one point Intel 4 cores were "dead", and now Ryzen 8 cores are "trash" for gaming..
> 
> The choices are pretty easy as far as i'm concerned.. Ryzen 7 is a poor choice if you are strictly going for gaming considering its price, the 7700K is far better for games and is cheaper/the same price. The 7700K is a poor choice for productivity considering its price, the 1700 is far better for productivity and is cheaper/the same price. There's usually a compromise somewhere.
> 
> Which is better as a jack of all trades? Imo it's easily Ryzen 7, it also (imo) makes the 6900K irrelevant for 90% of people who won't take advantage of Intel's HEDT features.
> 
> Ryzen 7 has in a way shaken up the productivity market (the only competition for the 1800X costs twice as much), but it's very clear that 99% of the hype has been gamers, that's the market most people wanted a shake up in.. On that front Ryzen 7 has failed, so I get why so many people would be disappointed.
> 
> I think AMD made a big blunder by keeping Ryzen 5 so close to their chests, and sending Ryzen 7 out to battle chips like the 7700K for the same price.. I think the ultimate success of Ryzen lies with the 1600X/1500X, 8 cores were never going to challenge fast 4 cores today. I just hope they clock well, either way they'll at least bring some competition for gamers, Ryzen 7 hasn't done that.
> 
> Maybe Skylake-E will bring chips with no compromises, they'll rip games and productivity workloads to shreds, but I'm sure we'll have to pay a pretty penny for those.


The i7 7700K bottlenecking a GTX 1070 at 1080p, not able to produce 120+ fps:
https://www.youtube.com/watch?v=CY3W_AXluYY


----------



## Majin SSJ Eric

It's still a massive oversimplification (and downright incorrect) to say the R7s are bad for gaming. They are actually quite good for gaming, even as-is. The 7700K may be a little better, but that's not to say that the R7s are in any way incapable of providing a great gaming experience right now. The likelihood that most of their deficiencies are going to be fixed with patches and optimizations just means that already good performance is only going to get even better.

None of this even considers the extra 4C/8T you get for everything else that ISN'T gaming basically for free compared to the 7700K's cost...


----------



## aDyerSituation

Quote:


> Originally Posted by *IRobot23*
> 
> i7 7700K bottlenecking GTX 1070 at 1080P not able of producing 120fps+
> https://www.youtube.com/watch?v=CY3W_AXluYY


He is using Action!, recording software that isn't friendly to your fps.


----------



## IRobot23

Quote:


> Originally Posted by *tpi2007*
> 
> Yeah, it's probably that. I was thinking along the same lines a few hours ago:


That's the biggest problem. Why would AMD build it that way if it only causes problems?
http://www.pcgameshardware.de/screenshots/original/2017/03/Ryzen-R7-1800X-Test-CPU-Core-Scaling-Battlefield-1-pcgh.png


----------



## SuperZan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Its still a massive oversimplification (and downright incorrect) to say R7's are bad for gaming. They are actually quite good for gaming, even as-is. The 7700K may be a little better but that's not to say that the R7's are actually incapable in any way of providing a great gaming experience right now. The likelihood that most of their deficiencies are going to be fixed with patches and optimizations just means that that already good performance is only going to get even better.
> 
> None of this even considers the extra 4C/8T you get for everything else that ISN'T gaming basically for free compared to the 7700K's cost...


Thank you, thank you, thank you.

The hyperbole has made reading this thread an act of supreme masochism.


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> He is using action, a recording software that isn't friendly on your fps.


Pretty sure an 8 core would do better regardless....


----------



## LancerVI

Quote:


> Originally Posted by *SuperZan*
> 
> Thank you, thank you, thank you.
> 
> The hyperbole has made reading this thread an act of supreme masochism.


Agreed. If I knew nothing, I'd come away with the conclusion that Ryzen can't game or is bad for gaming which is completely wrong.


----------



## aDyerSituation

Quote:


> Originally Posted by *budgetgamer120*
> 
> Pretty sure an 8 core would do better regardless....


it would do better while recording. I'll agree with that. But as far as just playing the game, I doubt it. But it'd be hard to test accurately.


----------



## Hequaqua

Just thought I would add this in here... A user here on OCN is doing some benchmarking on his Ryzen 1700 now.

He has posted some updates... he was able to get to around 4.1 GHz...

http://www.overclock.net/t/1624139/official-ryzen-7-1800x-1700x-1700-owners-club-4ghz-club/960#post_25891100

Should be interesting... finally a "real" user!


----------



## cssorkinman

Quote:


> Originally Posted by *98uk*
> 
> What's the story for Battlefield 1 then? Seems Ryzen isn't really leaping out as a winner in terms of performance in that specific game right?


Quote:


> Originally Posted by *Wishmaker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *98uk*
> 
> What's the story for Battlefield 1 then? Seems Ryzen isn't really leaping out as a winner in terms of performance in that specific game right?
> 
> 
> 
> INTEL coded game it seems
> 
> 

Quote:


> Originally Posted by *98uk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wishmaker*
> 
> INTEL coded game it seems
> 
> 
> 
> 
> 
> That's a shame, i'm sure that game could put multicore to use in great ways!
> 
> I was really considering going for Ryzen... but if it performs no better than a 4770k in game, then not much point.

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> *Its still a massive oversimplification (and downright incorrect) to say R7's are bad for gaming*. They are actually quite good for gaming, even as-is. The 7700K may be a little better but that's not to say that the R7's are actually incapable in any way of providing a great gaming experience right now. The likelihood that most of their deficiencies are going to be fixed with patches and optimizations just means that that already good performance is only going to get even better.
> 
> None of this even considers the extra 4C/8T you get for everything else that ISN'T gaming basically for free compared to the 7700K's cost...


You nailed that.
That and too many "reviewers" weren't having good luck with the new platform. Joker was one of the few that didn't seem to have many issues - maybe the Giga board is more mature.


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> it would do better while recording. I'll agree with that. But as far as just playing the game, I doubt it. But it'd be hard to test accurately.


It would do better while gaming and recording... what are you talking about?

Either way, with GPUs being able to handle recording these days, there would be no reason to use the CPU.


----------



## aDyerSituation

Quote:


> Originally Posted by *budgetgamer120*
> 
> It would do better while gaming and recording... What are you talking about?
> 
> Either way with GPUs being able to handle recording these days, there would be no reason to use CPU.


Okay buddy. You go on and test that for us then.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Its still a massive oversimplification (and downright incorrect) to say R7's are bad for gaming. They are actually quite good for gaming, even as-is. The 7700K may be a little better but that's not to say that the R7's are actually incapable in any way of providing a great gaming experience right now. The likelihood that most of their deficiencies are going to be fixed with patches and optimizations just means that that already good performance is only going to get even better.
> 
> None of this even considers the extra 4C/8T you get for everything else that ISN'T gaming basically for free compared to the 7700K's cost...


Not sure if that was addressed at me or not. But I'll answer anyway.

I never said it was bad for gaming; quite the contrary, actually (I've argued with a few in this thread about that). I said it's a *poor choice* if gaming is your objective. A $500 1800X gets a bit of a slap from a $350 7700K in most titles; if gaming is all someone cares about, then it is completely useless to them.

I easily know what choice I'd make, I did say Ryzen 7 is the better all-rounder. But I do see where gamers are coming from (so long as they don't imply Ryzen 8 cores are "trash" for gaming).


----------



## CriticalOne

Bad is relative. Compared to the 7700K, the 1700, 1700X, and 1800X are bad CPUs for gaming. Compared to all modern CPUs available, they're not bad, but not the best either.


----------



## DADDYDC650

The Ryzen 1700 not a good gaming CPU vs. Intel? Check out the CPU utilization. The Zen 1700 is at 3.9GHz and the 7700K is at 5GHz.


----------



## azanimefan

Quote:


> Originally Posted by *IRobot23*
> 
> Okay, guys
> Joker posted 720P benchmark where i7 [email protected] was sometimes just 10-15% faster than [email protected] and sometimes way faster.
> 
> *also check out ms*
> 
> I saw this image on forum. Ryzen has less fps yet way more GPU usage?


Well, that's because the Ryzen isn't bottlenecking the game the way the i7 is; this points to memory issues to me. Ryzen is being data-starved by slow RAM or cache. It looks like it's not a "slow core/IPC issue" but a memory issue.


----------



## Darklyric

Quote:


> Originally Posted by *cssorkinman*
> 
> You nailed that.
> That and too many "reviewers" weren't having good luck with the new platform. Joker was one of the few that didn't seem to have many issues - maybe the Giga board is more mature.


And to think that a Gigabyte + AMD + release BIOS combo used to be a flag-worthy offense.


----------



## prznar1

I think the primary problem is that Ryzen is basically a 2x 4-core CPU on your mobo. The chip's design depends HEAVILY on RAM and the interconnect between the two modules.


----------



## LancerVI

Quote:


> Originally Posted by *aDyerSituation*
> 
> I don't recall anyone calling RyZen "bad" for gaming. It is capable.
> 
> But being behind 4 year old chips in some games is pretty bad


Quote:


> Originally Posted by *CriticalOne*
> 
> Bad is relative. Compared to the 7700K, the 1700, 1700X, and 1800X are bad CPUs for gaming. Compared to all modern CPUs available, they're not bad, but not the best either.


There are plenty of people calling it "bad" and it's completely disingenuous and smacks of tribalism.


----------



## dragneel

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> It's still a massive oversimplification (and downright incorrect) to say R7's are bad for gaming. They are actually quite good for gaming, even as-is. The 7700K may be a little better but that's not to say that the R7's are actually incapable in any way of providing a great gaming experience right now. The likelihood that most of their deficiencies are going to be fixed with patches and optimizations just means that that already good performance is only going to get even better.
> 
> None of this even considers the extra 4C/8T you get for everything else that ISN'T gaming basically for free compared to the 7700K's cost...


Absolutely agreed
Quote:


> Originally Posted by *SuperZan*
> 
> Thank you, thank you, thank you.
> 
> The hyperbole has made reading this thread an act of supreme masochism.


Also agreed.

What a mess this is. How is it that AMD can deliver more than anyone was expecting just a few months ago, and still have people calling their CPUs a flop?
I just don't get it.


----------



## thethispointer

How did Intel kick their ass so badly? Honestly, how?


----------



## LancerVI

Quote:


> Originally Posted by *dragneel*
> 
> Absolutely agreed
> Also agreed.
> 
> What a mess this is. How is it that AMD can deliver more than anyone was expecting just a few months ago, and still have people calling their CPUs a flop?
> I just don't get it.


It's fairly easy to figure out.


----------



## Pro3ootector

It's not bad for gaming, but Intel holds the better-for-gaming crown for now. It is not bad at all, though.


----------



## DADDYDC650

Quote:


> Originally Posted by *thethispointer*
> 
> How did Intel kick their ass so badly? Honestly, how?


They didn't but nice try. I'll leave this here for you though.


----------



## tpi2007

Quote:


> Originally Posted by *IRobot23*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> Yeah, it's probably that. I was thinking along the same lines a few hours ago:
> 
> 
> 
> the biggest problem. Why would AMD make that if it only causes problems?
> http://www.pcgameshardware.de/screenshots/original/2017/03/Ryzen-R7-1800X-Test-CPU-Core-Scaling-Battlefield-1-pcgh.png

Excellent question. Maybe it's a forward-looking way they found to scale up cores easily in the server / workstation market. You'll see that performance in those segments is fine. Intel's core bus interconnect also becomes more complicated once you pass 10 cores, so it isn't easy for either company to build a system where you can easily add or subtract core counts and keep the chip performing uniformly, particularly when it comes to each core having access to shared cache and main memory.

Quote:


> Originally Posted by *dragneel*
> 
> Also agreed.
> 
> What a mess this is. How is it that AMD can deliver more than anyone was expecting just a few months ago, and still have people calling their CPUs a flop?
> I just don't get it.


It doesn't cook dinner. How are people supposed to recreate that romantic encounter between Raja and his Ryzen + Vega system?


----------



## thethispointer

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Uh, yeah, OK... Btw, I missed the 6950X in your sig, where's it at?


Lol

http://www.tomshardware.com/reviews/best-cpus,3986.html

I mean it all in good fun friends.

What does "6950X in your sig" mean? I assume that's a very good card or CPU.


----------



## redone13

Ryzen isn't bad for gaming; it's a jack of all trades. However, if one is interested in gaming and emulation, which is a very large perk of having a strong PC, the argument for gaming gets that much worse, as evidenced by the Dolphin benchmarks.


----------



## dragneel

Quote:


> Originally Posted by *tpi2007*
> 
> It doesn't cook dinner. How are people supposed to recreate that romantic encounter between Raja and his Ryzen + Vega system?


That is actually a pretty big downside tbh


----------



## thethispointer

Quote:


> Originally Posted by *DADDYDC650*
> 
> They didn't but nice try. I'll leave this here for you though.


I noticed this earlier; is this consistent across the board?


----------



## CriticalOne

I am going to wait for all the patches that supposedly improve performance to come in before making a final decision.

I do wish that AMD had all of this sorted out before releasing it, however.


----------



## Pro3ootector

Quote:


> Originally Posted by *DADDYDC650*
> 
> They didn't but nice try. I'll leave this here for you though.


I've seen that video. Isn't some bug causing Ryzen's framerate to drop all of a sudden? The minimum frame rate seems very good.


----------



## DADDYDC650

Quote:


> Originally Posted by *thethispointer*
> 
> I noticed this earlier, is this consistent across the board?


Consistently higher minimum frames.


----------



## Mad Pistol

I'm still waiting for a legitimate argument against the Ryzen 8 core CPUs. I understand that, for gaming, if you've already got an i7 quad core w/ Hyper Threading (or better), you're probably just fine for gaming. If you don't, then a Ryzen 8 core is an upgrade in almost every scenario.

I know many people were hoping for AMD to blow Intel out of the water, both on price and performance. Instead, AMD NAILED the price, and in the scenarios that AMD's Ryzen CPUs lose, it doesn't lose by much... and for the asking price, Ryzen is an amazing CPU.


----------



## Blackops_2

Quote:


> Originally Posted by *dragneel*
> 
> Absolutely agreed
> Also agreed.
> 
> What a mess this is. How is it that AMD can deliver more than anyone was expecting just a few months ago, and still have people calling their CPUs a flop?
> I just don't get it.


People don't know how to control their expectations, even taking sources like WCCF as credible and staking everything on the most favorable outcome... I've mentioned it several times in this thread: I can't fathom how people are disappointed when I know most of us, myself included, never thought AMD would hit Broadwell-E IPC.

It's really to be taken with a grain of salt; the enthusiast realm doesn't make up AMD's market share and hasn't for quite some time. The R7 chips are already providing incredible value per dollar, and the 6- and 4-core variants are set to do the same. While I might not be jumping ship just yet from my 3770K, I still have plans for a Vega rig, and in all likelihood it ends up with a 1600X. For the money it's going to be ridiculously hard to beat. That seems to be another point that some who downplay the performance are missing. With one launch we just went from $1,000 8-core CPUs to $300-500, and are set to have sub-$300 hexa-cores in the coming months. Most of us have been looking to move past 4 cores for some time.

I think we'll see different, more consistent results in the coming months once they get the kinks worked out.


----------



## thethispointer

Quote:


> Originally Posted by *DADDYDC650*
> 
> consistently higher minimum frames.


Hey, that's great! I was just giving AMD a hard time; I can't wait until they get a good rep again.

I hope they do come out with some lightning-fast CPUs, to give Intel some competition. Anything to get prices down.


----------



## budgetgamer120

Quote:


> Originally Posted by *Mad Pistol*
> 
> I'm still waiting for a legitimate argument against the Ryzen 8 core CPUs. I understand that, for gaming, if you've already got an i7 quad core w/ Hyper Threading (or better), you're probably just fine for gaming. If you don't, then a Ryzen 8 core is an upgrade in almost every scenario.
> 
> I know many people were hoping for AMD to blow Intel out of the water, both on price and performance. Instead, AMD NAILED the price, and in the scenarios that AMD's Ryzen CPUs lose, it doesn't lose by much... and for the asking price, Ryzen is an amazing CPU.


It's clear these people weren't intending to buy an AMD product. It's always something or other. No need to waste time replying over and over.


----------



## thethispointer

So what's the best build for the money now: a 1700X and a GTX 1080 Ti? Or do you not mix them up like that?


----------



## -Sweeper_

Zen is still below Haswell-E (2014) IPC in games; that's sad.

PurePC's review: Core i7-5960X relative to Ryzen 7 1800X

- Battlefield 1: 17.1% faster
- Crysis 3: 5% faster
- Dishonored 2: 19.5% faster
- Deus Ex (DX11): 21.8% faster
- Fallout 4: 4.4% faster
- Hitman: 28.8% faster
- Rise of the Tomb Raider: 58.4% faster
- Total War: 3.7% slower
- Watch Dogs 2: 15.5% faster
- The Witcher 3: 46.1% faster

https://www.purepc.pl/procesory/premiera_i_test_procesora_amd_ryzen_r7_1800x_dobra_zmiana
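For a rough sense of the overall gap, the per-game deltas above can be aggregated. A minimal sketch (a back-of-envelope summary of the list above, not a figure from PurePC's review), using a geometric mean of the performance ratios, which is the usual way to average relative benchmark results:

```python
import math

# Relative performance of the i7-5960X vs. the Ryzen 7 1800X in PurePC's
# ten game tests (positive = Intel faster), taken from the list above.
deltas = {
    "Battlefield 1": 17.1, "Crysis 3": 5.0, "Dishonored 2": 19.5,
    "Deus Ex (DX11)": 21.8, "Fallout 4": 4.4, "Hitman": 28.8,
    "Rise of the Tomb Raider": 58.4, "Total War": -3.7,
    "Watch Dogs 2": 15.5, "The Witcher 3": 46.1,
}

# Simple arithmetic mean of the percentage deltas.
mean_delta = sum(deltas.values()) / len(deltas)

# Geometric mean of the ratios, which weights gains and losses fairly.
ratios = [1 + d / 100 for d in deltas.values()]
geo_mean = math.prod(ratios) ** (1 / len(ratios))

print(f"arithmetic mean: {mean_delta:.1f}%")        # ~21.3%
print(f"geometric mean:  {(geo_mean - 1) * 100:.1f}%")  # ~20.0%
```

Either way, PurePC's numbers put the 5960X roughly 20% ahead on average across these ten titles, with the spread (from -3.7% to +58.4%) being the more interesting story.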


----------



## AuraNova

Well, at this point, all I have to say is bring on Ryzen 5. Either way, it should still be a great CPU for the price. If Zen+ is going to be a drop-in when it comes out, that's even more incentive to upgrade to an eventual R7 *2*700X. (Of course, I assume there's either some refreshes and new chipsets along the way.)


----------



## Pro3ootector

Vulkan and DX12 can really shake things up.


----------



## CriticalOne

Quote:


> Originally Posted by *budgetgamer120*
> 
> A Piledriver has no issues with dolphin, so I do not expect Ryzen to have any issues with over 50% performance improvements.


Yep.

People craved single-threaded performance back when PCSX2 and Dolphin were very new and unrefined. With the software and hardware improving, you don't really need to seek out high single-threaded performance to run emulators well.
Quote:


> Originally Posted by *Mad Pistol*
> 
> I'm still waiting for a legitimate argument against the Ryzen 8 core CPUs.


What's giving me pause is the high variance in performance between reviewers; the implications of that could be small or massive, who knows. I hate being left in the dark over what's going on, especially when I'm spending as much as I am on Ryzen. The launch just seems very uncoordinated in general, which isn't a knock against the CPU itself, but I'm not a big fan of buying something with problems at release.

Quote:


> Originally Posted by *-Sweeper_*
> 
> PurePC's review: Core i7-5960X relative to Ryzen 7 1800X
> 
> - Battlefield 1: 17.1% faster
> - Crysis 3: 5% faster
> - Dishonored 2: 19.5% faster
> - Deus Ex (DX11): 21.8% faster
> - Fallout 4: 4.4% faster
> - Hitman: 28.8% faster
> - Rise of the Tomb Raider: 58.4% faster
> - Total War: 3.7% slower
> - Watch Dogs 2: 15.5% faster
> - The Witcher 3: 46.1% faster


This is what is giving me hesitation. Ryzen still performs well enough for my games so that isn't the issue here.

The 1800X is a faster processor than the 5960X. It beats it in both multithreaded and single-threaded synthetic benchmarks.

When it comes to games, all of that gets thrown out the window for some reason, with the slower Intel processor beating the 1800X by wide margins.


----------



## dragneel

Quote:


> Originally Posted by *AuraNova*
> 
> Well, at this point, all I have to say is bring on Ryzen 5. Either way, it should still be a great CPU for the price. If Zen+ is going to be a drop-in when it comes out, that's even more incentive to upgrade to an eventual R7 *2*700X. (Of course, I assume there's either some refreshes and new chipsets along the way.)


That's the other thing, isn't it? We know the underlying architecture of Zen is good, so we can reasonably expect improvement from Zen+ and Zen 3.
Being able to hold onto the same board and RAM through that is actually something a lot of people want.


----------



## DarthBaggins

From watching the videos, in the end I still stand by Ryzen not being a bad series of CPUs at all. Sure, optimizations need to be made, but that's not AMD's fault, and the frame rates I'm seeing aren't really bad at all considering some of the games that were tested.


----------



## redone13

Quote:


> Originally Posted by *CriticalOne*
> 
> Yep.
> 
> People craved single threaded performance back in the days where PCSX2 and Dolphin were very new and unrefined. With the software and hardware improving, you don't really need to seek out high single threaded performance to run emulators well.


I will just touch on this portion since I reposted the graphic. As of right now, you do need the single-core performance, and the assumption that all future emulators will be heavily multithreaded is probably right... down the line.


----------



## looniam

Are people really comparing ONE (or even a few) frame(s) of a side-by-side comparison in a video?


----------



## fleetfeather

Quote:


> Originally Posted by *-Sweeper_*
> 
> Zen still below Haswell-E (2014) IPC in games, that's sad.
> 
> PurePC's review: Core i7-5960X relative to Ryzen 7 1800X
> 
> - Battlefield 1: 17.1% faster
> - Crysis 3: 5% faster
> - Dishonored 2: 19.5% faster
> - Deus Ex (DX11): 21.8% faster
> - Fallout 4: 4.4% faster
> - Hitman: 28.8% faster
> - Rise of the Tomb Raider: 58.4% faster
> - Total War: 3.7% slower
> - Watch Dogs 2: 15.5% faster
> - The Witcher 3: 46.1% faster
> 
> https://www.purepc.pl/procesory/premiera_i_test_procesora_amd_ryzen_r7_1800x_dobra_zmiana


Nearly 2000 posts in the thread, many of which are discussing the reasons why there are inconsistencies in gaming review results across reviewers, and this is your contribution?


----------



## Ultracarpet

Quote:


> Originally Posted by *aDyerSituation*
> 
> I don't recall anyone calling RyZen "bad" for gaming. It is capable.
> 
> *But being behind 4 year old chips in some games is pretty bad*


And you can probably overclock Sandy Bridge/Ivy chips to levels that make them better than Intel's HEDT lineup for gaming. Your point?

For strictly gaming, Intel hasn't made a 4-core chip worthy of an upgrade for Sandy Bridge users. So why is it a disappointment if AMD hasn't with Ryzen? Not only that, but for the 100th time, the R7 was billed as a great all-round chip for gaming AND working. It literally makes Intel's HEDT platform a huge waste of money as things stand right now, so mission accomplished.

As for the R5 and R3? Those are the chips that bring all those old and outdated Bulldozer, Conroe, Nehalem, etc. setups into the post-Sandy Bridge era with updated features at a really good price.

Again, people were expecting Ivy Bridge IPC and low-3GHz clocks out of Ryzen. We got much better than that, and yet somehow the expectations shifted EXTREMELY quickly to Ryzen needing to trade blows with Kaby in every metric.


----------



## blue1512

Quote:


> Originally Posted by *CriticalOne*
> 
> The 1800x is a faster processor than the 5960x. It beats it in both multithreaded and singlethreaded synthetic benchmarks.
> 
> When it comes to games, all of that gets thrown out of the window for some reason with the slower Intel processor beating the 1800x by wide margins.


Game engines have their own ways of handling the CPU, and they just don't use the 1800X properly at this moment.


----------



## Bruizer

Quote:


> Originally Posted by *Pro3ootector*
> 
> It's not bad for gaming but Intel has it's better for gaming crown, for now. But it is not bad at all


Intel has had that crown since before Bulldozer, so honestly it's an afterthought.

Side note (not directed at the person I quoted): I don't think people realize just how much computing is done outside of gaming. Haha. People are like, "I wonder how much AMD stock is going to drop and what's going to happen?!" And the answer is: none. AMD is finally going to pull in some money because they offered a more affordable solution that meets your average person's, and average gamer's, needs. At the end of the day, people can say what they will, but AMD did exactly what they needed to do and hit a home run (I'm sure we all would have loved a GRAND SLAM, but we've been dealing with base hits, so I'll take the home run). They have provided legitimate competition that undercuts their competitor. It never had to beat Intel, just be dang close at a more affordable price. People acting like these won't sell like hotcakes because they're slightly inferior at gaming... sure, whatever they say. There is a huge market outside of the desktop gaming world. AMD saved itself and gave the consumer healthy competition to push prices down.


----------



## IRobot23

Quote:


> Originally Posted by *-Sweeper_*
> 
> Zen still below Haswell-E (2014) IPC in games, that's sad.
> 
> PurePC's review: Core i7-5960X relative to Ryzen 7 1800X
> 
> - Battlefield 1: 17.1% faster
> - Crysis 3: 5% faster
> - Dishonored 2: 19.5% faster
> - Deus Ex (DX11): 21.8% faster
> - Fallout 4: 4.4% faster
> - Hitman: 28.8% faster
> - Rise of the Tomb Raider: 58.4% faster
> - Total War: 3.7% slower
> - Watch Dogs 2: 15.5% faster
> - The Witcher 3: 46.1% faster
> 
> https://www.purepc.pl/procesory/premiera_i_test_procesora_amd_ryzen_r7_1800x_dobra_zmiana


Carefully check the benchmarks again; sometimes the i7 5960X is OCed to 4.5GHz.
One thing to note: sometimes the 1800X does just a little better than an FX 8350, which is very weird.


----------



## budgetgamer120

Quote:


> Originally Posted by *looniam*
> 
> are people really comparing ONE (or even a few) frame(s) of a side by side comparison in a video?


The rest of the video is identical, so why not?


----------



## zGunBLADEz

Quote:


> Originally Posted by *DADDYDC650*
> 
> They didn't but nice try. I'll leave this here for you though.


People still don't realize that having a CPU loaded that heavily is bad news XD


----------



## CriticalOne

Quote:


> Originally Posted by *blue1512*
> 
> Game engines have their own ways of handling the CPU, and they are just don't use 1800x properly at this moment.


To me it appears that the Ryzen eight-core processors are effectively two four-core complexes (CCXs) stitched together, versus Intel's monolithic 10-core die with Broadwell-E. When AM4 motherboards receive EFI updates and other optimizations are done, I want to know whether this impacts performance in games in any way.


----------



## looniam

Quote:


> Originally Posted by *budgetgamer120*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> are people really comparing ONE (or even a few) frame(s) of a side by side comparison in a video?
> 
> 
> 
> The rest of the video is identical. So Why not?

the whole video is identical?

NO.

there were a few instances in WD2 that also raised my eyebrow (for a few dozen milliseconds the 1700 was performing much better) *but it was not the same across the other games shown.*

And didn't everyone hear Joker himself say at the beginning that there would be a few flaky things?

E: specifically mentioning it was due to the game engine.


----------



## prznar1

Just a thought: I wonder if Ryzen+ will be based on a 6-core CCX, 12 cores total.


----------



## CriticalOne

Quote:


> Originally Posted by *prznar1*
> 
> Just a thought: I wonder if Ryzen+ will be based on a 6-core CCX, 12 cores total.


I do not see the CCX configuration changing with Zen+.


----------



## AuraNova

Quote:


> Originally Posted by *prznar1*
> 
> Just a thought: I wonder if Ryzen+ will be based on a 6-core CCX, 12 cores total.


I wouldn't doubt it, but I would expect that to be quite a few years later. Let us get used to 8-core being the top dog for now. Then hit us with 12 core products down the line.


----------



## zGunBLADEz

Quote:


> Originally Posted by *looniam*
> 
> the whole video is identical?
> 
> NO.
> 
> there were a few instances in WD2 that also raised my eyebrow (for a few dozen milliseconds the 1700 was performing much better) *but it was not the same across the other games shown.*
> 
> and did not everyone hear joker himself say at the beginning that there will be a few flaky things?


Still, that doesn't change the fact that as a gamer you never want that CPU touching over 80% usage (as a max), for a bunch of reasons.

Microstutters to begin with, and weird anomalies all around. The CPU is the heart of the system, and you need to keep it in check in case something kicks in in the background; for a flawless experience, a CPU pegged that high is just bad news for gaming.


----------



## Blameless

Quote:


> Originally Posted by *Olivon*
> 
> Uber interesting, must read !
> 
> *Understand the Ryzen sub-system memory - HFR*
> 
> *Source*


This was foreseen.
Quote:


> Originally Posted by *redone13*
> 
> I'm talking about this budgetgamer.
> Edit: Lower is better for those who are wondering.


So was this.

Dolphin is fast on Intel because of Intel's L1/L2 cache bandwidth...Haswell and newer architectures have cache nearly twice as fast as Ivy and Ryzen.


----------



## looniam

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> the whole video is identical?
> 
> NO.
> 
> there were a few instances in WD2 that also raised my eyebrow (for a few dozen milliseconds the 1700 was performing much better) *but it was not the same across the other games shown.*
> 
> and did not everyone hear joker himself say at the beginning that there will be a few flaky things?
> 
> 
> 
> 
> 
> 
> Still, that doesn't change the fact that as a gamer you never want that CPU touching over 80% usage (as a max), for a bunch of reasons.
> 
> Microstutters to begin with, and weird anomalies all around. The CPU IS the heart of the system, and you need to keep it in check in case something kicks in in the background; a CPU pegged that high is just bad news for the experience.

Ah... you know the test IS TO MAKE the CPU the bottleneck, right?

So a test designed to cause a CPU bottleneck actually causing a CPU bottleneck is expected, no?

What you look at is the GPU usage WHEN it occurs.


----------



## zGunBLADEz

Quote:


> Originally Posted by *looniam*
> 
> Ah... you know the test IS TO MAKE the CPU the bottleneck, right?
> 
> So a test designed to cause a CPU bottleneck actually causing a CPU bottleneck is expected, no?


Not really, though; the test is still valid for 1080p @ 144Hz monitors, so it's still a valid bench to an extent.

But of course I'd be buying a 1080 Ti to run 720p/1080p with low to bare-minimum settings XD, just to make sure I create a CPU bottleneck XD


----------



## looniam

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> Ah... you know the test IS TO MAKE the CPU the bottleneck, right?
> 
> So a test designed to cause a CPU bottleneck actually causing a CPU bottleneck is expected, no?
> 
> 
> 
> Not really, though; the test is still valid for 1080p @ 144Hz monitors, so it's still a valid bench to an extent.
> 
> But of course I'd be buying a 1080 Ti to run 720p/1080p with low to bare-minimum settings XD

No, it isn't *valid* for 1080p 144Hz, since not only is the resolution lowered but so are the graphics settings.

also read my edit in the previous.


----------



## redone13

Quote:


> Originally Posted by *Blameless*
> 
> This was forseen.
> So was this.
> 
> Dolphin is fast on Intel because of Intel's L1/L2 cache bandwidth...Haswell and newer architectures have cache nearly twice as fast as Ivy and Ryzen.


I stand by my statement, Blameless. It's looking to be a jack of all trades, the first generation in the shift from four to six and eight cores. And because it's the first generation, there's overlap in the transition, which makes the notion of upgrading less compelling. Why future-proof when you can get the real deal later, unless you don't have anything right now?


----------



## zGunBLADEz

I don't see the point of looking at benches below 1080p with low settings, purposely creating a bottleneck on a very capable card; it just doesn't make any sense. I'm not doing that as a gamer myself...

That would be a synthetic "work" benchmark instead, to see if the CPU is capable...

What about showing a card more in line with 1080p, like the 1060 or 480...

I never understood lowering the resolution to create a bottleneck on purpose with a high-end card...

Also, I would love to see per-core load % displayed in benchmarks, as I like to know how loaded the CPU was when benched.


----------



## aberrero

Quote:


> Originally Posted by *zGunBLADEz*
> 
> I don't see the point of looking at benches below 1080p with low settings, purposely creating a bottleneck on a very capable card; it just doesn't make any sense.
> 
> What about showing a card more in line with 1080p, like the 1060 or 480...


These benchmarks tell us how the chip will perform with next-gen cards at higher resolutions.


----------



## mcg75

Didn't see this earlier.

Joker and Steve talking about bottlenecks and how ridiculous people can be talking about reviews.


----------



## zGunBLADEz

Quote:


> Originally Posted by *aberrero*
> 
> These benchmarks tell us how the chip will perform with next-gen cards at higher resolutions.


That's not going to indicate how well the next-gen card will perform, though XD


----------



## CriticalOne

Quote:


> Originally Posted by *zGunBLADEz*
> 
> That's not going to indicate how well the next-gen card will perform, though XD


It's not a strong indicator, but it still provides some insight into how a system will run when the CPU is the bottleneck. Giving some information and requiring interpretation/speculation is better than giving no meaningful information at all, which is what GPU-bound "CPU" benchmarks do.


----------



## Buris

Quote:


> Originally Posted by *redone13*
> 
> https://i.imgur.com/z3fQeQZ.png
> 
> http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7


Laughable; even GamersNexus admitted to screwing up their bench: "I stand behind my test, but I did these 3 things that hurt performance by 5% each."


----------



## redone13

Quote:


> Originally Posted by *Buris*
> 
> Laughable; even GamersNexus admitted to screwing up their bench: "I stand behind my test, but I did these 3 things that hurt performance by 5% each."


Now that we know about the debacle it has become, I hear ya. It is laughable. All I was doing was posting the data that is/was available.

http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951.html

http://www.guru3d.com/articles-pages/amd-ryzen-7-1800x-processor-review,1.html

http://www.pcworld.com/article/3176191/computers/ryzen-review-amd-is-back.html?page=2

http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page4.html

https://www.pcper.com/reviews/Processors/AMD-Ryzen-7-1800X-Review-Now-and-Zen/Gaming-Performance

https://arstechnica.com/gadgets/2017/03/amd-ryzen-review/

They all say a similar thing. The consensus is that we need to wait for tweaks.


----------



## jprovido

My 5820K @ 4.7GHz with 3200MHz memory couldn't keep Dota 2 and Overwatch stable at 144fps (I get a lot of fps drops, especially in Dota; it goes as low as 90fps in big team fights). Do you think Ryzen @ 4-4.1GHz would do better? I just preordered a 1700X (no motherboard yet); I'm deciding whether to put it in my main rig or install it in my VR rig.


----------



## CriticalOne

Any particular reason you are getting the 1700x over the 1700? Personally, I am having a hard time seeing the value.


----------



## Buris

Quote:


> Originally Posted by *jprovido*
> 
> My 5820K @ 4.7GHz with 3200MHz memory couldn't keep Dota 2 and Overwatch stable at 144fps (I get a lot of fps drops, especially in Dota; it goes as low as 90fps in big team fights). Do you think Ryzen @ 4-4.1GHz would do better? I just preordered a 1700X (no motherboard yet); I'm deciding whether to put it in my main rig or install it in my VR rig.


I would think the 1700X performs marginally better than the 5820K on average, but it's not going to blow you away coming from what you already had.


----------



## redone13

Quote:


> Originally Posted by *CriticalOne*
> 
> Any particular reason you are getting the 1700x over the 1700? Personally, I am having a hard time seeing the value.


The 1700 seems to be the sweet spot.


----------



## tygeezy

Quote:


> Originally Posted by *jprovido*
> 
> My 5820K @ 4.7GHz with 3200MHz memory couldn't keep Dota 2 and Overwatch stable at 144fps (I get a lot of fps drops, especially in Dota; it goes as low as 90fps in big team fights). Do you think Ryzen @ 4-4.1GHz would do better? I just preordered a 1700X (no motherboard yet); I'm deciding whether to put it in my main rig or install it in my VR rig.


Interesting... I don't play Dota 2, but I do play Overwatch and CS:GO. Overwatch is the more stable of the two games. I cap my fps at 140 (G-Sync, 144Hz) and I can maintain that most of the time. It did dip to the 110s on occasion, but it does a good job of holding 140. This is on an ancient i7 860 at 3.5GHz. I'm using a GeForce 1070 at 1080p with a setting or two turned down (reflections, I believe?).

What resolution and settings are you running? Perhaps a GPU bottleneck? Going to 1440p (res scale) I begin to get GPU-bottlenecked, so it doesn't take much.


----------



## Code-Red

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> People have been saying members would abandon OCN over fanboy arguments for literally as long as I've been here. IMO people just need thicker skin around here...


It's weird that this is a new concept for some people... OCN was always like this a number of years back. Funny to think that the stagnancy in the computer industry toned down the rabid fanboys on both sides so much.


----------



## ElectroGeek007

Wondering if I should return my (still in box) 1800X and C6H for a 1700 and some cheaper motherboard. Looking around for B350 mobo overclocking results but not seeing much.


----------



## redone13

Quote:


> Originally Posted by *ElectroGeek007*
> 
> Wondering if I should return my (still in box) 1800X and C6H for a 1700 and some cheaper motherboard. Looking around for B350 mobo overclocking results but not seeing much.


It's too early to tell. What is your primary use for the system?


----------



## Majin SSJ Eric

I'd just keep them if I were planning a Ryzen build anyway, unless I was really short on cash. You already have them and can build away. Return them and it could be a while before you can get new parts.


----------



## Shatun-Bear

Any word on when TechPowerUp's Ryzen review goes live?

That's another one I will read just out of morbid curiosity and to see how negative they spin these new chips. Because if anyone likes to put the boot in on AMD these days, it's W1zzard and TPU.


----------



## jprovido

Quote:


> Originally Posted by *tygeezy*
> 
> Interesting... I don't play Dota 2, but I do play Overwatch and CS:GO. Overwatch is the more stable of the two games. I cap my fps at 140 (G-Sync, 144Hz) and I can maintain that most of the time. It dips to the 110s on occasion, but it does a good job of holding 140. This is on an ancient i7 860 at 3.5GHz, with a GeForce 1070 at 1080p and a setting or two turned down (reflections, I believe?).
> 
> What resolution and settings are you running? Perhaps a GPU bottleneck? Going to 1440p (via render scale) I begin to get GPU bottlenecked, so it doesn't take much.


1440p @ 144Hz, max settings. 1080 SLI isn't even breaking a sweat; it's purely a CPU bottleneck. My 7700k has zero problems maintaining 144fps, but my Twitch stream is horrible: https://www.twitch.tv/videos/125965145 My 5820k streams to Twitch a lot better, but can't maintain 144fps like I mentioned, and takes another performance hit when you turn on streaming. My 7700k is at 144fps 100% of the time even when streaming to Twitch, but as you can see in the video, it skips frames a lot.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> 1440p @ 144Hz, max settings. 1080 SLI isn't even breaking a sweat; it's purely a CPU bottleneck. My 7700k has zero problems maintaining 144fps, but my Twitch stream is horrible: https://www.twitch.tv/videos/125965145 My 5820k streams to Twitch a lot better, but can't maintain 144fps like I mentioned, and takes another performance hit when you turn on streaming. My 7700k is at 144fps 100% of the time even when streaming to Twitch, but as you can see in the video, it skips frames a lot.


You do realize that you can lower the refresh rate, right? You should probably do so, as 144Hz is nowhere near the norm. It's wasted resources regardless of whether you are streaming from a Ryzen CPU or an Intel one. Are Twitch servers even capable of a 144Hz tick rate?


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> You do realize that you can lower the refresh rate, right? You should probably do so, as 144Hz is nowhere near the norm. It's wasted resources regardless of whether you are streaming from a Ryzen CPU or an Intel one. Are Twitch servers even capable of a 144Hz tick rate?


Have you even tried gaming at 144fps @ 144Hz? It's smooth as butter.







I'm only streaming at 60 or 30fps, but again, my 7700k can't handle it. The AMD demos ARE true; I can attest to that.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> 1440p @ 144Hz, max settings. 1080 SLI isn't even breaking a sweat; it's purely a CPU bottleneck. My 7700k has zero problems maintaining 144fps, but my Twitch stream is horrible: https://www.twitch.tv/videos/125965145 My 5820k streams to Twitch a lot better, but can't maintain 144fps like I mentioned, and takes another performance hit when you turn on streaming. My 7700k is at 144fps 100% of the time even when streaming to Twitch, but as you can see in the video, it skips frames a lot.


Thank you again sir

Smooth, they said... Didn't drop frames, they said...


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> Thank you again sir
> 
> Smooth, they said... Didn't drop frames, they said...


The AMD demos are 100% true. The 7700k can't handle gaming AND streaming, but as you can see, the in-game fps is a solid 144fps. Can't do that with my 5820k @ 4.7GHz. It seems like the CPU I need doesn't exist yet; Zen+ would probably be the answer.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Thank you again sir
> 
> Smooth, they said... Didn't drop frames, they said...


What you don't understand is that 90% of people _do not_ have a monitor with that high a refresh rate. So it is _pointless_ to stream that format. It is wasted resources. And if you consider that a server would need to "tick", so to speak, more frequently, you are wrong again.

It's like shooting fish in a bucket with you, budget.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> The AMD demos are 100% true. The 7700k can't handle gaming AND streaming, but as you can see, the in-game fps is a solid 144fps. Can't do that with my 5820k @ 4.7GHz. It seems like the CPU I need doesn't exist yet; Zen+ would probably be the answer.


I bet your 5820k would handle gaming and streaming better though.


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> I bet your 5820k would handle gaming and streaming better though.


it struggles a lot







In Dota 2's early game it can maintain 144fps no problem, but in the latter parts of a match, during big team fights, it drops as low as 90fps. My 7700k can maintain 144fps without breaking a sweat, BUT again, two fewer cores means no decent streaming on Twitch anymore.


----------



## tpi2007

Quote:


> Originally Posted by *jprovido*
> 
> Quote:
> 
> 
> 
> Originally Posted by *budgetgamer120*
> 
> I bet your 5820k would handle gaming and streaming better though.
> 
> 
> 
> it struggles a lot
> 
> 
> 
> 
> 
> 
> 
> In Dota 2's early game it can maintain 144fps no problem, but in the latter parts of a match, during big team fights, it drops as low as 90fps. My 7700k can maintain 144fps without breaking a sweat, BUT again, two fewer cores means no decent streaming on Twitch anymore.

The CPU you want doesn't exist yet. Maybe a 6-core Coffee Lake at stock, or a slightly overclocked Skylake-X.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> it struggles a lot
> 
> 
> 
> 
> 
> 
> 
> In Dota 2's early game it can maintain 144fps no problem, but in the latter parts of a match, during big team fights, it drops as low as 90fps. My 7700k can maintain 144fps without breaking a sweat, BUT again, two fewer cores means no decent streaming on Twitch anymore.


You are a prime example of why streams lag. Streaming at 144Hz is purely user error. You will have to lower the settings.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> it struggles a lot
> 
> 
> 
> 
> 
> 
> 
> In Dota 2's early game it can maintain 144fps no problem, but in the latter parts of a match, during big team fights, it drops as low as 90fps. My 7700k can maintain 144fps without breaking a sweat, BUT again, two fewer cores means no decent streaming on Twitch anymore.


How about streaming while gaming... how is the smoothness compared to the 7700k?

I wonder if they missed where you said you streamed at 720p 30fps lol


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> What you don't understand is that 90% of people _do not_ have a monitor with that high a refresh rate. So it is _pointless_ to stream that format. It is wasted resources. And if you consider that a server would need to "tick", so to speak, more frequently, you are wrong again.
> 
> It's like shooting fish in a bucket with you, budget.


Quote:


> Originally Posted by *tpi2007*
> 
> The CPU you want doesn't exist yet. Maybe a 6-core Coffee Lake at stock, or a slightly overclocked Skylake-X.


so true man. I need the best of both worlds
Quote:


> Originally Posted by *redone13*
> 
> You are a prime example of why streams lag. Streaming at 144Hz is purely user error. You will have to lower the settings.


Umm, like I said earlier, I only stream at 30 and 60fps. That's the only option available in GeForce Experience.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> You are a prime example of why streams lag. Streaming at 144Hz is purely user error. You will have to lower the settings.


You can't stream at 144Hz... You stream at an fps, and he is streaming at 30.









Anything else?


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> How about streaming while gaming... Smoothness compared to the 7700k?
> 
> I wonder if they missed where you said you streamed in 720p 30fps lol


The 7700k while streaming will still have 144fps locked. This was my test stream yesterday: https://www.twitch.tv/videos/125965145 It has a small fps counter in the top left of the screen: 144fps 100% of the time.

With the 5820k @ 4.7GHz I average around 120fps in Overwatch; when streaming is enabled there's another 5-10fps performance hit, but my Twitch streams are A LOT better compared to the 7700k's. The 7700k skips a lot of frames even at 30fps; it's pathetic lol

edit:

Here's a stream on my 5820k @ 4.7GHz. It's only on the training grounds with no other players, but the fps is still low (100-130fps). As you can see, though, it's a lot smoother than the 7700k stream: https://www.twitch.tv/videos/112076573


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> The 7700k while streaming will still have 144fps locked. This was my test stream yesterday: https://www.twitch.tv/videos/125965145 It has a small fps counter in the top left of the screen: 144fps 100% of the time.
> 
> With the 5820k @ 4.7GHz I average around 120fps in Overwatch; when streaming is enabled there's another 5-10fps performance hit, but my Twitch streams are A LOT better compared to the 7700k's. The 7700k skips a lot of frames even at 30fps; it's pathetic lol
> 
> edit:
> 
> Here's a stream on my 5820k @ 4.7GHz. It's only on the training grounds with no other players, but the fps is still low. As you can see, though, it's a lot smoother than the 7700k stream: https://www.twitch.tv/videos/112076573


Ok thanks, that is what I wanted to know


----------



## kd5151





edit: do you see what I see? 0%?


----------



## Quantum Reality

Quote:


> Originally Posted by *aDyerSituation*
> 
> I don't recall anyone calling RyZen "bad" for gaming. It is capable.
> 
> But being behind 4 year old chips in some games is pretty bad


You know, I seem to recall people starting to gripe that modern Intel CPUs aren't actually all that much better than their older counterparts. After all, weren't people noticing Intel was going for essentially cosmetic, incremental improvements?

So "behind 4-year-old chips" is a bit of an exaggeration given the relatively plodding improvements since then.


----------



## jezzer

These reviews are pretty good.

The 7700K appears to be the king of 720p; I don't really need that, so no 7700K for me.

Ryzen, on the other hand, can keep up with no real loss when gaming @ 1440p and can encode videos like a boss.

Glad I waited and didn't end up with the king-of-720p CPU.


----------



## budgetgamer120

Quote:


> Originally Posted by *kd5151*
> 
> 
> 
> 
> 
> edit: do you see what I see? 0%?


Meh... Single player


----------



## kd5151

Quote:


> Originally Posted by *budgetgamer120*
> 
> Meh... Single player


they have other vids also.


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ok thanks, that is what I wanted to know


If Ryzen clocked a little higher, maybe around 4.4GHz with 3200MHz memory, I think it would be possible to play at a perfect 144fps AND stream without skipped frames. Zen+ can't get here soon enough.


----------



## RedM00N

Quote:


> Originally Posted by *jprovido*
> 
> it struggles a lot
> 
> 
> 
> 
> 
> 
> 
> In Dota 2's early game it can maintain 144fps no problem, but in the latter parts of a match, during big team fights, it drops as low as 90fps. My 7700k can maintain 144fps without breaking a sweat, BUT again, two fewer cores means no decent streaming on Twitch anymore.


A bit OT, but anything beyond the medium preset has diminishing returns compared to the load increase (dunno if you're using it, just tossing it out there). Also, looking at your VODs, I'm seeing 1080p at very high bitrates (some even 60fps, possibly variable bitrate too), all of which increases CPU load.

720p 60/30, 3500 constant bitrate, medium/fast preset is what I'd try for the CPU, unless you already have.
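The advice above can be roughly quantified: software (x264) encode cost scales close to linearly with the pixels per second fed to the encoder, so dropping from 1080p60 to 720p30 cuts the workload by about 4.5x. A sketch of that arithmetic (assumes linear scaling; real x264 cost also varies with preset and scene content):

```python
def pixels_per_second(width, height, fps):
    """Raw pixel throughput the encoder must process each second."""
    return width * height * fps

# Settings seen in the VODs vs the suggested stream settings
current = pixels_per_second(1920, 1080, 60)   # 1080p60
suggested = pixels_per_second(1280, 720, 30)  # 720p30

print(f"1080p60: {current / 1e6:.1f} MP/s, 720p30: {suggested / 1e6:.1f} MP/s")
print(f"Workload reduction: ~{current / suggested:.1f}x")
```

Even the intermediate 720p60 option halves the throughput relative to 1080p60 while keeping the motion smooth.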


----------



## jprovido

Quote:


> Originally Posted by *RedM00N*
> 
> A bit OT, but anything beyond the medium preset has diminishing returns compared to the load increase (dunno if you're using it, just tossing it out there). Also, looking at your VODs, I'm seeing 1080p at very high bitrates (some even 60fps, possibly variable bitrate too), all of which increases CPU load.
> 
> 720p 60/30, 3500 constant bitrate, medium/fast preset is what I'd try for the CPU, unless you already have.


Thanks for the info, man. I'll try it right now; I hope I don't get as many skipped frames as with the settings I had before. I believe it was 1080p at max bitrate.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> If Ryzen clocked a little higher, maybe around 4.4GHz with 3200MHz memory, I think it would be possible to play at a perfect 144fps AND stream without skipped frames. Zen+ can't get here soon enough.


Well, we are all waiting to see what happens, as per AMD's instruction. Also, if the gaming FPS numbers have told us anything, it's that you won't be maintaining 144FPS on your own screen. The broadcast of 30 or 60 FPS content may be smooth, but you may as well turn down your own refresh rate to avoid wasted resources.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> If Ryzen clocked a little higher, maybe around 4.4GHz with 3200MHz memory, I think it would be possible to play at a perfect 144fps AND stream without skipped frames. Zen+ can't get here soon enough.


Definitely not for you if you want 144fps


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> Well, we are all waiting to see what happens as per AMDs instruction. Also, if gaming FPS have told us anything, it's that you won't be maintaining the 144FPS on your own screen. The broadcast of 30 or 60 FPS content may be smooth, but you may as well turn down your own refresh rate to avoid wasted resources.


Turning down the refresh rate is out of the question. 144fps @ 144Hz is my target; I can't really play any other way now that I'm used to it. It's smooth as silk.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> Turning down the refresh rate is out of the question. 144fps @ 144Hz is my target; I can't really play any other way now that I'm used to it. It's smooth as silk.


Well then, take a good hard look at where Ryzen had higher FPS in games than Intel. And I am talking the majority, not isolated benches. Thus the "we must wait" thing.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> Turning down the refresh rate is out of the question. 144fps @ 144Hz is my target; I can't really play any other way now that I'm used to it. It's smooth as silk.


How is streaming on Steam?


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> Well then, take a good hard look at where Ryzen had higher FPS in games than Intel. And I am talking the majority, not isolated benches. Thus the "we must wait" thing.


I watched the YouTube video comparing gaming performance, and Ryzen is really competitive despite the clock speed deficit. I figured the 1700x is perfect for my VR rig; I have no doubt it can maintain 90Hz without any problems, and it will only get better in the future.
Quote:


> Originally Posted by *budgetgamer120*
> 
> How is streaming on Steam?


never tried streaming there before


----------



## Jpmboy

not sure if this has been posted here yet.. 1800x looking good on LN2: http://hwbot.org/submission/3473862


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> I watched the YouTube video comparing gaming performance, and Ryzen is really competitive despite the clock speed deficit. I figured the 1700x is perfect for my VR rig; I have no doubt it can maintain 90Hz without any problems


I am simply stating that a compromise will have to be made unless the current gaming benchmarks change. 144FPS is not reasonable.


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> I am simply stating that a compromise will have to be made unless the current gaming benchmarks change. 144FPS is not reasonable.


Yep, I'm part of the minority on this one. That's why I spend a lot of money on my PC, hoping to get this experience in all the games I play.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> Yep, I'm part of the minority on this one. That's why I spend a lot of money on my PC, hoping to get this experience in all the games I play.


You don't get it. If the Intel PCs had higher FPS in games, how can the Ryzen attain 144FPS on your streaming rig? Sure, I get that you will broadcast in 60 or 30 FPS, but please think about it.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> Yep, I'm part of the minority on this one. That's why I spend a lot of money on my PC, hoping to get this experience in all the games I play.


He is bent on trying to get you not to buy Ryzen, but hasn't realized you aren't replacing your 7700k with Ryzen.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> He is bent on trying to get you not to buy Ryzen, but hasn't realized you aren't replacing your 7700k with Ryzen.


Not bent on anything. Just trying to get you to use your brain, budget.


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> You don't get it. If the Intel PCs had higher FPS in games, how can the Ryzen attain 144FPS on your streaming rig? Sure, I get that you will broadcast in 60 or 30 FPS, but please think about it.


Hmmm, it would be possible on Ryzen if it were clocked at 4.4-4.5GHz with 3200MHz memory. From all the reviews I've read, I think 4.0-4.1GHz would not cut it; I'd guess I'd get some minor frame drops, like 110-130fps.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> Hmmm, it would be possible on Ryzen if it were clocked at 4.4-4.5GHz with 3200MHz memory. From all the reviews I've read, I think 4.0-4.1GHz would not cut it; I'd guess I'd get some minor frame drops, like 110-130fps.


Operative words "would be." I understand we need to wait for more data. But do stop making assumptions.


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> Operative words "would be." I understand we need to wait for more data. But do stop making assumptions.


I'm just making a rough estimate here. The CPU that I "want" doesn't exist yet. A six-core Skylake or a higher-clocking Ryzen would be perfect!


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> I'm just making a rough estimate here. The CPU that I "want" doesn't exist yet. A six-core Skylake or a higher-clocking Ryzen would be perfect!


Indeed. Sorry if I came off as a turd. But you now see what I'm saying.


----------



## Quantum Reality

Speaking of higher-clocking Ryzens, who wants to bet AMD will put out a 1900X part once they've worked out the bugs?


----------



## jprovido

Quote:


> Originally Posted by *Quantum Reality*
> 
> Speaking of higher-clocking Ryzens, who wants to bet AMD will put out a 1900X part once they've worked out the bugs?


Not out of the question. With Phenom II, when they released the C3 stepping it clocked 200-300MHz higher. My Phenom II 955 BE's max OC was 3.8GHz, while the C3 steppings averaged 4-4.1GHz.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> Not out of the question. With Phenom II, when they released the C3 stepping it clocked 200-300MHz higher. My Phenom II 955 BE's max OC was 3.8GHz, while the C3 steppings averaged 4-4.1GHz.


My phenom II max was 3.8ghz too


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> My phenom II max was 3.8ghz too


A little OT, but I was really annoyed with Phenom II X2 555 BE owners back in the day. They could unlock their CPU to a quad-core AND overclock it to 4-4.1GHz because they were C3 stepping, all while being much cheaper lol


----------



## redone13

The X4 940 Black was a pretty sweet gig too, though.


----------



## ZealotKi11er

Quote:


> Originally Posted by *budgetgamer120*
> 
> My phenom II max was 3.8ghz too


Still have mine at 4.2GHz.


----------



## umeng2002

If your target is 144 fps gaming, turn down the game settings. That's the easiest way to achieve it.

Yeah, my C2 955 was 3.8 GHz.

My 1090T went up to 4 GHz. With 4 cores on, like 4.2 GHz.

Stepping matters.

But I would hope AMD is working on Zen+ quickly... like launching a year from now.

If they can get an update out within a year, AMD will gain a ton of market share.

I almost hope they skip Zen for an APU and just move straight to Zen+ for the first Zen-based APU.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> If your target is 144 fps gaming, turn down the game settings. That's the easiest way to achieve it.
> 
> Yeah, my C2 955 was 3.8 GHz.
> 
> My 1090t went up to 4 GHz. With 4 cores on, like 4.2 GHz.


Quote:


> Originally Posted by *redone13*
> 
> I am simply stating a compromise will be made unless the current gaming benchmarks change.


It could be argued that one could do that for the Intel CPU as well, to alleviate the strain and eliminate any lagginess that is present, if any. It's really about knowing what your hardware is capable of. It's all conjecture though.


----------



## jprovido

https://www.twitch.tv/videos/126205955 I just tried streaming with my 7700k @ 5.1GHz, adjusted to 720p at a 5mbps bitrate, but still got A LOT of skipped frames. I guess no amount of tweaking the settings will fix this.
Quote:


> Originally Posted by *umeng2002*
> 
> IF your target is 144 fps gaming, turn down the game settings. That's the easiest way to achieve it.
> 
> Yeah, my C2 955 was 3.8 GHz.
> 
> My 1090t went up to 4 GHz. With 4 cores on, like 4.2 GHz.
> 
> Stepping matters.
> 
> But I would hope, AMD is working on Zen+ quickly... like launching a year from now.
> 
> If they can get an update out within a year, AMD will gain a ton of market share.
> 
> I almost hope they skip Zen for an APU and just move straight to Zen+ for the first Zen# APU.


The thing is, with high-refresh-rate gaming, even if you lower the game settings the fps would not be affected, because there's no GPU bottleneck (I have two overclocked 1080's); you need a powerful CPU to maintain 144fps.
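To put the 144fps target in perspective, the per-frame time budget the CPU has to work with is tiny. A quick illustration (plain arithmetic, not a benchmark):

```python
# Frame-time budget at various refresh targets (milliseconds per frame).
for fps in (60, 120, 144):
    budget_ms = 1000 / fps
    print(f"{fps:>3} fps -> {budget_ms:.2f} ms per frame")

# At 144 fps the CPU has under 7 ms to finish game logic, draw calls,
# and any software encoding work before the frame is late.
```

Halving the refresh target from 144 to 60 fps more than doubles that budget, which is why lowering the refresh rate eases the CPU load where graphics settings do not.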


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> https://www.twitch.tv/videos/126205955 I just tried streaming with my 7700k @ 5.1GHz, adjusted to 720p at a 5mbps bitrate, but still got A LOT of skipped frames. I guess no amount of tweaking the settings will fix this.


Think logically, they said


----------



## umeng2002

Quote:


> Originally Posted by *jprovido*
> 
> https://www.twitch.tv/videos/126205955 I just tried streaming with my 7700k @ 5.1GHz, adjusted to 720p at a 5mbps bitrate, but still got A LOT of skipped frames. I guess no amount of tweaking the settings will fix this.


Try it with GPU encoding to see if it's really your CPU.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> https://www.twitch.tv/videos/126205955 I just tried streaming with my 7700k @ 5.1GHz, adjusted to 720p at a 5mbps bitrate, but still got A LOT of skipped frames. I guess no amount of tweaking the settings will fix this.


Try turning down your refresh rate from 144Hz. My point is, Ryzen will not be able to achieve that refresh rate on your own screen if it's weaker gaming-wise, and you end up in a similar position of compromise with both systems.


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> Try turning down your refresh rate from 144Hz. My point is, Ryzen will not be able to achieve that refresh rate on your own screen if it's weaker gaming-wise, and you end up in a similar position of compromise.


Yep. With Ryzen, fps will be below 144fps, BUT the Twitch stream would be smoother.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> https://www.twitch.tv/videos/126205955 I just tried streaming with my 7700k @ 5.1GHz, adjusted to 720p at a 5mbps bitrate, but still got A LOT of skipped frames. I guess no amount of tweaking the settings will fix this.
> The thing is, with high-refresh-rate gaming, even if you lower the game settings the fps would not be affected, because there's no GPU bottleneck (I have two overclocked 1080's); you need a powerful CPU to maintain 144fps.


I was just watching your stream... It is bad.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> Yep. With Ryzen, fps will be below 144fps, BUT the Twitch stream would be smoother.


Also, we have been through this many pages back. I posted a 4770k stream that was fine, and a 6700k one for budgetgamer. I can quote it if you like. It kind of proves user error when a 4770K is fine for 5000 viewers and you can't stream for one with a 7700k.


----------



## iRUSH

There are already some open-box AM4 boards at my local Microcenter. I might go up and snag some parts and give the 1700 a go.


----------



## budgetgamer120

Quote:


> Originally Posted by *iRUSH*
> 
> There are already some open-box AM4 boards at my local Microcenter. I might go up and snag some parts and give the 1700 a go.


What are the prices?


----------



## S.M.

Quote:


> Originally Posted by *pony-tail*
> 
> I am a mechanic not an engineer , so might be a dumb question , but is there any chance they could fix the issues with another stepping ?


Yes.

There's also a very good chance it will be fixed with a BIOS update or Windows patch.


----------



## jprovido

Quote:


> Originally Posted by *umeng2002*
> 
> Try it with GPU encoding to see if it's really your CPU.


Can you enlighten me on this one? I don't know where this setting is; I'm just using GeForce Experience.
Quote:


> Originally Posted by *budgetgamer120*
> 
> Just was watching your stream... It is bad.


Yep, it's freakin' horrible. My old streams with my 5820k weren't like this at all.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> Can you enlighten me on this one? I don't know where this setting is; I'm just using GeForce Experience.
> Yep, it's freakin' horrible. My old streams with my 5820k weren't like this at all.


It's definitely you. A 1000-viewer stream works on a 6700k, and you lag with 1 viewer on a 7700k....

https://www.twitch.tv/anthony_kongphan

I chose a viewer count of more than 1 or 2 or 3 because 1000 people won't put up with **** quality.


----------



## umeng2002

Well, GeForce Experience should be using the GPU encoder by default.









It might actually be an issue with GFE and the video frame rate being different from your monitor's refresh rate.

Stop using GFE and try OBS with both CPU encode and GPU encode (I think OBS has the option; GFE isn't the only way to do GPU encoding).
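For reference, the CPU-vs-GPU encode split above maps to two different encoders under the hood; OBS exposes the same choice. A hedged sketch in ffmpeg terms (the file names are placeholders, and NVENC preset names vary by ffmpeg version, so this builds the command lines without running them):

```python
# Two illustrative ffmpeg command lines for the same 3500 kbps CBR target:
# libx264 encodes on the CPU; h264_nvenc offloads to the NVIDIA GPU's encoder block.
common = ["ffmpeg", "-i", "capture.mp4",
          "-b:v", "3500k", "-maxrate", "3500k", "-bufsize", "7000k"]

cpu_encode = common + ["-c:v", "libx264", "-preset", "veryfast", "out_cpu.mp4"]
gpu_encode = common + ["-c:v", "h264_nvenc", "out_gpu.mp4"]

print(" ".join(cpu_encode))
print(" ".join(gpu_encode))
```

If the skipped frames disappear with the GPU encoder but persist with x264, that points at CPU encode load rather than the capture path.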


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> It's definitely you. A 1000-viewer stream works on a 6700k, and you lag with 1 viewer on a 7700k....
> 
> https://www.twitch.tv/anthony_kongphan
> 
> I chose a viewer count of more than 1 or 2 or 3 because 1000 people won't put up with **** quality.


The number of viewers does not matter. That part is handled by Twitch's servers.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> The number of viewers does not matter. That part is handled by Twitch's servers.


I explained the rationale. It proves that I am not making it up. I stated:

I chose a viewer count of more than 1 or 2 or 3 because 1000 people won't put up with **** quality. And jprovido only has 1 viewer: us.

My examples for you, maybe 30 pages back, with the 4770k and the 6700k had nearly 5000 viewers. So please give it a rest.


----------



## RedM00N

Quote:


> Originally Posted by *jprovido*
> 
> can you enlighten me on this one? I don't know where this setting is. I'm just using geforce experience
> Yep it's freakin horrible. my old streams with my 5820k wasn't like this at all


I see your rig has two 1080's. I'll assume you have SLI on for OW, and I know there is a performance hit for me when capturing OW with SLI enabled (and when capturing in general, but not as big). This is in OBS, though, but I'd imagine all capture software would probably have the issue. Just a thought. I usually rolled with SLI off in OW anyway.


----------



## jprovido

Quote:


> Originally Posted by *RedM00N*
> 
> I see your rig has two 1080's. I know there is a performance hit for me when capturing OW with SLI enabled (and when capturing in general, but not as big). This is in OBS, though, but I'd imagine all capture software would probably have the issue. Just a thought. I usually rolled with SLI off in OW anyway.


I didn't really think about SLI causing the problem. TBH, one GTX 1080 would be more than enough for Overwatch at max settings. I will try with SLI disabled to see if there's a difference.


----------



## czin125

Anyone going to review with a similar setup?
http://www.overclock.net/t/1621347/kaby-lake-overclocking-guide-with-statistics/430
2160cm^2 of cooling + delid = 5.3GHz 24/7 on a 7700K, it seems (5 x 360mm x 120mm rads for CPU + 2 GPUs)

DDR4-4133 16-16-16-36 or DDR4-4000 16-16-16-36


----------



## Malinkadink

Quote:


> Originally Posted by *jprovido*
> 
> I didn't really think about SLI causing the problem. TBH, one GTX 1080 would be more than enough for Overwatch at max settings. I will try with SLI disabled to see if there's a difference.


Playing OW at max settings is a mistake if you care about input lag; the reflections setting especially causes an increase in input lag. Just throwing it out there.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> I explained the rationale. It proves that I am not making it up. I stated:
> 
> I chose a viewer count of more than 1 or 2 or 3 because 1000 people won't put up with **** quality. And jprovido only has 1 viewer: us.
> 
> My examples for you, maybe 30 pages back, with the 4770k and the 6700k had nearly 5000 viewers. So please give it a rest.


The view count does not matter. Viewers are not pulling from the streamer; they are pulling from Twitch's servers, just as with YouTube streaming.


----------



## jprovido

Quote:


> Originally Posted by *Malinkadink*
> 
> Playing OW with max settings is a mistake if you care about input lag. Reflection graphics settings especially cause an increase in input lag. Just throwing it out there.


Mine has zero input lag at max settings.







Gaming at 144Hz is great.

I have tried with SLI disabled; the stream is still choppy: https://www.twitch.tv/videos/126213444


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> The view count does not matter. Viewers are not pulling from the streamer; they are pulling from Twitch's servers, just as with YouTube streaming.


Budget, I understand that. What I am saying is that I chose a stream with more viewers simply because a large number of viewers will not watch a choppy stream. It was multiple instances of proof that a quad-core can stream for many people to enjoy. Because this was your stance, was it not?

Quote:


> Originally Posted by *budgetgamer120*
> 
> What do you mean no problem? Everyone who streams and game on the same pc with a quadcore has stutters. Might be less with i7 but it is still there.


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> Budget, I understand that. What I am saying is that I chose a stream with more viewers simply because a large amount of viewers will not watch a choppy stream. It was multiple instances of proof that a quad core can stream for many people to enjoy. Because this was your stance was it not?


Why are you mentioning it when it has nothing to do with performance?


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Why are you mentioning it when it has nothing to do with performance?


What I am saying is that I chose a stream with more viewers simply because a large number of viewers will not watch a choppy stream. It was multiple instances of proof that a quad core can stream for many people to enjoy (a 4770K and two 6700Ks). It refutes your general sentiments about choppiness.
Quote:


> Originally Posted by *budgetgamer120*
> 
> What do you mean no problem? Everyone who streams and game on the same pc with a quadcore has stutters. Might be less with i7 but it is still there.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> mine has zero input lag @ max settings
> 
> 
> 
> 
> 
> 
> 
> gaming at 144hz is great
> 
> I have tried with SLI Disabled stream is still choppy https://www.twitch.tv/videos/126213444


You can only lower your refresh rate to take load off the CPU. If you don't mind, try 60hz or 120hz just to see if it is still choppy.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> You can only lower your refresh rate to take load from cpu. If you dont mind trying 60hz or 120hz just to see if it is still choppy.


Quote:


> Originally Posted by *redone13*
> 
> Try turning down your refresh rate from 144HZ. My point is, the Ryzen will not be able to achieve that refresh rate on your own screen if it's weaker gaming wise and you are ending up in a similar position of compromise with both systems.


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> You can only lower your refresh rate to take load from cpu. If you dont mind trying 60hz or 120hz just to see if it is still choppy.


yep im pretty sure it would be fine at 60hz but it's not something I'm willing to do. obviously I'm not a streamer (just trying it out for fun) just a bit sad that my 7700k is not enough to game @ 144hz and stream at the same time


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> yep but it's not something I'm willing to do. obviously I'm not a streamer (just trying it out for fun) just a bit sad that my 7700k is not enough to game @ 144hz and stream at the same time


From the looks of it, the Ryzen is going to end up compromising too if it gets lower FPS in games. Know your hardware's limits.


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> From the looks of it, the Ryzen is going to end up compromising too if it gets lower FPS in games. Know your hardware's limits.


why do you have to say that over and over again. it's been said a few times that Ryzen is not enough for this. it's annoying tbh


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> yep im pretty sure it would be fine at 60hz but it's not something I'm willing to do. obviously I'm not a streamer (just trying it out for fun) just a bit sad that my 7700k is not enough to game @ 144hz and stream at the same time


You are not a streamer so no worries. It is a quad core after all, it can only do so much.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> why do you have to say that over and over again. it's been said a few times that Ryzen is not enough for this. it's annoying tbh


Because you and budgetgamer don't seem to grasp it. AMD is not a miracle elixir. It is a task-oriented heavy lifter that is excellent at what it does but has its own limitations, just like even the "almighty" 7700K.


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> Because you and budgetgamer don't seem to grasp it. AMD is not a miracle elixir. It is a task-oriented heavy lifter that is excellent at what it does but has its own limitations.


have you even been reading the discussions we've been having? I've said this a million times. that cpu that I "want" doesn't exist yet.


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> have you even been reading the discussions we've been having? I've said this a million times. that cpu that I "want" doesn't exist yet.


I have been. But the whole "can't stream on my 7700K because it is a quadcore" bit really threw me off.


----------



## jprovido

Quote:


> Originally Posted by *redone13*
> 
> I have been. But the whole "can't stream on my 7700K because it is a quadcore" bit really threw me off


I've been posting my test streams for reference the whole time. A quadcore can stream on twitch no problem at 60hz in game and 30 or 60fps on twitch. what we're discussing is MY setup which is 144hz and twitch streaming. we're not questioning the fact that they CAN stream, it's the problem I have on MY system and MY setup. we're OT btw
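For anyone following along, the choppiness being described shows up in OBS as "skipped frames due to encoding lag" in the stats/settings readout. A minimal sketch of reading that as a percentage (the counts below are made-up for illustration, not taken from any stream in this thread):

```python
def encoder_skip_rate(frames_skipped: int, frames_total: int) -> float:
    """Percentage of frames the encoder dropped because it could not
    keep up (OBS calls these 'skipped frames due to encoding lag')."""
    if frames_total == 0:
        return 0.0
    return 100.0 * frames_skipped / frames_total

# Hypothetical 10-minute 60 fps session: 36000 frames captured total.
print(encoder_skip_rate(1800, 36000))  # -> 5.0 (visibly choppy)
print(encoder_skip_rate(0, 36000))     # -> 0.0 (smooth)
```

Anything much above roughly 1% tends to be noticeable to viewers, which is consistent with the choppy VODs linked above.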


----------



## redone13

Quote:


> Originally Posted by *jprovido*
> 
> I've been posting my test streams for reference the whole time. A quadcore can stream on twitch no problem at 60hz in game and 30 or 60fps at twitch. what we're discussing is MY setup which is 144hz and twitch streaming. we're not questioning the fact that they CAN stream it's just a problem on MY system. we're OT btw


Yea, you're right. My apologies.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> I've been posting my test streams for reference the whole time. A quadcore can stream on twitch no problem at 60hz in game and 30 or 60fps on twitch. what we're discussing is MY setup which is 144hz and twitch streaming. we're not questioning the fact that they CAN stream, it's the problem I have on MY system and MY setup. we're OT btw


And that your 5820K, which has more cores, performs better under the same conditions, but is still not enough for 144hz because it cannot reach a 5GHz clockspeed.


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> And that your 5820K, which has more cores, performs better under the same conditions, but is still not enough for 144hz because it cannot reach a 5GHz clockspeed.


Yes, and the minor IPC improvement also helped. I'm guessing a 6-core Skylake @ 5GHz or a Ryzen @ 4.5GHz+ would be the perfect CPU for it but as we all know, they do not exist yet.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> Yes and also the minor ipc improvement helped. I'm guessing a 6 core skylake @ 5ghz or ryzen @ 4.5ghz+ would be the perfect cpu for it but as we all know, they do not exist yet.


Now I want to create a twitch account


----------



## CULLEN

ITT.

*Example 1:*
Here's a proof that 7700K is best!

Picture of 7700K benching empty BF1 map.

SEE! Playing with people is overrated!

*Example 2:*
Ryzen is expensive plus 6900K is better, the price is irrelevant for the 6900K because it's best.

*Example 3:*
Ryzen is only good for multi-threaded not gaming.

_*shows proof where Ryzen @ stock has 274 fps but 7700K @ 5GHz has 281 fps*_

See! Ryzen is 2.49% slower! Checkmate.

*Example 4:*
Intel fanboy: Not impressed! And why would you even consider upgrading when Intel will be releasing 5nm in 3 years?


----------



## budgetgamer120

Quote:


> Originally Posted by *CULLEN*
> 
> ITT.
> 
> *Example 1:*
> Here's a proof that 7700K is best!
> 
> Picture of 7700K benching empty BF1 map.
> 
> SEE! Playing with people is overrated!
> 
> *Example 2:*
> Ryzen is expensive plus 6900K is better, the price is irrelevant for the 6900K because it's best.
> 
> *Example 3:*
> Ryzen is only good for multi-threaded not gaming.
> 
> _*shows proof where Ryzen @ stock has 274 fps but 7700K @ 5GHz has 281 fps*_
> 
> See! Ryzen is 2.49% slower! Checkmate.
> 
> *Example 4:*
> Intel fanboy: Not impressed! And why would you even consider upgrading when Intel will be releasing 5nm in 3 years?


----------



## redone13

Quote:


> Originally Posted by *CULLEN*
> 
> ITT.
> 
> *Example 1:*
> Here's a proof that 7700K is best!
> 
> Picture of 7700K benching empty BF1 map.
> 
> SEE! Playing with people is overrated!
> 
> *Example 2:*
> Ryzen is expensive plus 6900K is better, the price is irrelevant for the 6900K because it's best.
> 
> *Example 3:*
> Ryzen is only good for multi-threaded not gaming.
> 
> _*shows proof where Ryzen @ stock has 274 fps but 7700K @ 5GHz has 281 fps*_
> 
> See! Ryzen is 2.49% slower! Checkmate.
> 
> *Example 4:*
> Intel fanboy: Not impressed! And why would you even consider upgrading when Intel will be releasing 5nm in 3 years?


Cullen, you realize that the current state of things with AMD is about waiting, which is kind of the same as your point 4, right? Like wait for more threads and cores to be utilized, wait for optimizations to be made, etc. And the general consensus seems to be that Ryzen has a role in the market so I don't know why you are so salty.


----------



## RedM00N

Quote:


> Originally Posted by *redone13*
> 
> Cullen, you realize that the current state of things with AMD is about waiting which kind of is the same as your point 4 right? Like wait for more threads and cores to be utilized etc. And the general consensus seems to be that Ryzen has a role in the market so I don't know why you are so salty.


Yep, for non gaming, these chips are a steal for what they offer from what I've seen in reviews. Gaming they are great, but in due time it will mature and get past the teething phase and gaming will be up to full speed as well. Naples will be fun to read about just to see how it can shake up the Xeon world.


----------



## iRUSH

Quote:


> Originally Posted by *budgetgamer120*
> 
> What are the prices?


$203.96 for the Crosshair

$155.96 for the AORUS AX370

Then take off an additional $30 with the CPU


----------



## budgetgamer120

Quote:


> Originally Posted by *RedM00N*
> 
> Yep, for non gaming, these chips are a steal for what they offer from what I've seen in reviews. In due time it will mature and get past the teething phase and gaming will be up to speed as well. Naples will be fun to read about just to see how it can shake up the xeon world.


For non gaming? Does Ryzen not game?

I keep seeing this statement pop up everywhere and wondering if Ryzen is incapable of gaming or I missed something.

Can you point to a review where Ryzen is incapable of gaming?


----------



## RedM00N

Quote:


> Originally Posted by *budgetgamer120*
> 
> For non gaming? Does Ryzen not game?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I keep seeing this statement popup everywhere and wondering if Ryzen is incapable of gaming or i missed something
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you point to Review where Ryzen is incapable of gaming?


Ye, I added a part to the start of the second sentence to fix my horrid wording lol. They are good to go for games, and will get better with time.


----------



## aDyerSituation

Quote:


> Originally Posted by *jprovido*
> 
> yep im pretty sure it would be fine at 60hz but it's not something I'm willing to do. obviously I'm not a streamer (just trying it out for fun) just a bit sad that my 7700k is not enough to game @ 144hz and stream at the same time


I stream overwatch sometimes and still maintain 144+ fps


----------



## jprovido

Quote:


> Originally Posted by *aDyerSituation*
> 
> I stream overwatch sometimes and still maintain 144+ fps


I do too, but I get a lot of skipped frames on Twitch; you can see it on the settings button.


----------



## aDyerSituation

Quote:


> Originally Posted by *jprovido*
> 
> I do too but a lot of skipped frames on twitch


I don't have that problem. Probably your settings.


----------



## jprovido

Quote:


> Originally Posted by *aDyerSituation*
> 
> I don't have that problem. Probably your settings.


I played around with the settings just now as instructed to me earlier here but no dice. can you teach me your settings? ygpm


----------



## CULLEN

Quote:


> Originally Posted by *redone13*
> 
> Cullen, you realize that the current state of things with AMD is about waiting which kind of is the same as your point 4 right? Like wait for more threads and cores to be utilized etc. And the general consensus seems to be that Ryzen has a role in the market so *I don't know why you are so salty*.


Ha! Mate, you're the salty one!

The difference between me and the fanboys (of all colors) is that I'm making these jokes to point out the flawless logic of the men in blue.

I'll be doing the same thing when Intel does their next release and the Red Army dismisses everything.

This one is for you. (read this with a nice attitude and know that I'm just joking)

_Read with David Attenborough voice_

Swimming in the salty blue ocean at the bottom of the Mariana Trench, the fanboy never fully develops sight.

But how can the creature live at such insufferable conditions? Through the magic of evolution!

For the good part of a decade, it has paid a premium price for minor improvements, and this has made it resistant to abuse; some even develop Stockholm syndrome and protect the plague which haunts them.

There is no valid argument for denial.


----------



## budgetgamer120

Quote:


> Originally Posted by *aDyerSituation*
> 
> I stream overwatch sometimes and still maintain 144+ fps


Somehow I knew yours, out of everyone streaming on Twitch, would be OK and present no problems.


----------



## kfxsti

I don't have an issue streaming. But then again I cap my frames at 72fps to stay within my Freesync range.


----------



## Dragonsyph

50 pages of people talking about Watch Dogs 2? I thought Watch Dogs 2 was the world record holder for an unoptimized game. It is just ANOTHER console port that runs like garbage.


----------



## Liranan

Quote:


> Originally Posted by *iRUSH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *budgetgamer120*
> 
> What are the prices?
> 
> 
> 
> $203.96 for the Crosshair
> 
> $155.96 for the AORUS AX370
> 
> Then take off an additional $30 with the CPU
Click to expand...

A quick look here and the Crosshair VI Hero is over 400 USD and the Aorus isn't even available. The only board I see for 250 is the Asus Prime X370 Pro, which is most likely 4+1 doubled to 8+2 phases. As these chips don't overclock well, I don't really see a reason for a premium board, so a cheaper board would be just fine, unless the MOSFETs and other components are low quality and prone to failure.


----------



## budgetgamer120

Quote:


> Originally Posted by *Dragonsyph*
> 
> 50 pages of people talking about watch dogs 2???????? I thought that watch dogs 2 was the world record holder for an unoptimized game. It is just ANOTHER console port that runs like garbage.


The unoptimized games are the ones that use 1-2 cores.


----------



## budgetgamer120

What is he on about?


----------



## comagnum

Why can't there be praise for the massive jump AMD made with a severely limited budget? They jumped from the FX generation to trading blows with Intel's best. Of course there are new architecture bugs and hurdles; Intel isn't immune to them either.

AMD leaped ahead what, 5-6 CPU generations with one release? Competition is good. Don't spread false/negative rhetoric. Let the issues iron themselves out, and then criticize. For now, appreciate, as a consumer, that we'll finally see what the tech world has in store for us when two juggernauts are duking it out.

Same goes for Vega. Hope for success, because AMD succeeding means great things for everyone. Hopefully we'll no longer see such grossly overpriced hardware.

Be critical, sure, but keep things in perspective.

In short, use common sense. It's hard, as this thread and others have painfully shown us, but please try.


----------



## Quantum Reality

Quote:


> Originally Posted by *comagnum*
> 
> Why can't there be praise for the massive jump AMD made with a severely limited budget? They jumped from the fx generation to trading blows with intels best. Of course there are new architecture bugs and hurdles, intel isn't immune to them either.
> 
> Amd leaped ahead what 5-6 CPU generations with one release?? Competition is good. Don't spread false/negative rhetoric. Let the issues iron themselves out, and then criticize. For now, appreciate it as a consumer that we'll finally see what the tech world has in store for us when two juggernauts are duking it out.
> 
> Same goes for vega. Hope for success, because amd succeeding means great things for everyone. Hopefully we'll no longer see such grossly overpriced hardware.
> 
> Be critical, sure, but keep things in perspective.
> 
> In short, use common sense. It's hard, as this thread and others have painfully shown us, but please try.


This is what I keep arguing as well. In the AMD-only sector, AMD has more than delivered - both with regard to their goals, and for AMD customers. They've not only surpassed Bulldozer, they've exceeded Phenom II as well. By that standard, the fact that Ryzen generally delivers about twice the performance of an equivalent Bulldozer setup, with comparable or lower heat output, is a huge triumph.


----------



## redone13

Quote:


> Originally Posted by *CULLEN*
> 
> Ha! Mate, you're the salty one!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The difference between me and the fanboys (of all colors) is I'm making these jokes to point out the flawless logic by the men in blues.
> 
> I'll be doing the same thing when Intel does their next release and the Red Army dismisses everything.
> 
> This one is for you. (read this with a nice attitude and know that I'm just joking)
> 
> _Read with David Attenborough voice_
> 
> Swimming in the salty blue ocean at the bottom of the Mariana Trench, the fanboy never fully develops sight.
> 
> But how can the creature live at such insufferable conditions? Through the magic of evolution!
> 
> For the good part of a decade, it has paid a premium price for minor improvement and it has made it strong to abuse, some even develop Stockholm syndrome and protect the plague which hunts them.
> 
> There is no valid argument for denial.


Lol. Ok. Done!


----------



## Liranan

Quote:


> Originally Posted by *comagnum*
> 
> Why can't there be praise for the massive jump AMD made with a severely limited budget? They jumped from the fx generation to trading blows with intels best. Of course there are new architecture bugs and hurdles, intel isn't immune to them either.
> 
> Amd leaped ahead what 5-6 CPU generations with one release?? Competition is good. Don't spread false/negative rhetoric. Let the issues iron themselves out, and then criticize. For now, appreciate it as a consumer that we'll finally see what the tech world has in store for us when two juggernauts are duking it out.
> 
> Same goes for vega. Hope for success, because amd succeeding means great things for everyone. Hopefully we'll no longer see such grossly overpriced hardware.
> 
> Be critical, sure, but keep things in perspective.
> 
> In short, use common sense. It's hard, as this thread and others have painfully shown us, but please try.


I also don't understand it. AMD have gone from an underperforming architecture to a 4GHz CPU being compared with an Intel 5GHz one and still being able to hold its own. I think AMD have done an incredible job and Zen 2 or Zen+ should be even better.

The problems so far seem to be scheduling, program code and clock speeds. Once those three have been sorted Zen will be even better.


----------



## Liranan

Quote:


> Originally Posted by *Quantum Reality*
> 
> Quote:
> 
> 
> 
> Originally Posted by *comagnum*
> 
> Why can't there be praise for the massive jump AMD made with a severely limited budget? They jumped from the fx generation to trading blows with intels best. Of course there are new architecture bugs and hurdles, intel isn't immune to them either.
> 
> Amd leaped ahead what 5-6 CPU generations with one release?? Competition is good. Don't spread false/negative rhetoric. Let the issues iron themselves out, and then criticize. For now, appreciate it as a consumer that we'll finally see what the tech world has in store for us when two juggernauts are duking it out.
> 
> Same goes for vega. Hope for success, because amd succeeding means great things for everyone. Hopefully we'll no longer see such grossly overpriced hardware.
> 
> Be critical, sure, but keep things in perspective.
> 
> In short, use common sense. It's hard, as this thread and others have painfully shown us, but please try.
> 
> 
> 
> This is what I keep arguing as well. In the AMD-only sector, AMD has more than delivered - both with regard to their goals, and for AMD customers. They've not only surpassed Bulldozer, but they've also exceeded Phenom II as well. By that standard, the fact that Ryzen generally delivers about twice the performance of an equivalent Bulldozer setup, with comparable or lower heat output, is a huge triumph.
Click to expand...

Vishera is superior to Phenom II in every possible way apart from power consumption. My CPU at stock beats my old 955BE at 3.5GHz; sadly, it creates an ungodly amount of heat at 4.5GHz.


----------



## Mad Pistol

Quote:


> Originally Posted by *kd5151*
> 
> 
> 
> 
> 
> edit:do you see what i see? 0%?


Everyone... please watch this video. Joker ran an i7 7700k (the current pinnacle of gaming CPUs) against a "lowly" R7 1700. The settings were "low" and the resolution was 720p; bottlenecks from the GPU were minimized as much as possible.

If you will notice, at all times, the R7 1700 is within spitting distance of the i7 7700k in terms of framerates. You will also notice that for those of us who have high refresh rate monitors, the R7 1700 is above 144 fps for probably 80% of the scenarios shown here (except for WD2 and a few other dips, where the i7 7700k was also falling short).

Let's just get this one out of the way.

Is the R7 1700 as good for gaming as the i7 7700k is at the moment?
*No.*

Is the R7 1700 cheaper than the i7 7700k?
*Yes.*

Does the R7 1700 have 4 extra cores and 8 extra threads compared to the i7 7700k?
*HELL Yes.*

Is the R7 1700 good at other things than just games?
*BIG FREAKIN YES.*

Can the R7 1700 use the additional cores to stream games while you play them?
*Absolutely!*

Can the i7 7700k stream well using the CPU?
*Sure... if you like stuttering.*

Is the R7 1700 more futureproof because it has more cores and threads than the i7 7700k?
*Definitely.*

I guess the last question you must ask yourself is...

In what reality do you live in where the i7 7700k is the better product?
I would love to know the answer to this one.


----------



## Malinkadink

Looking forward to seeing Ryzen + Vega benchmarks this summer just to see not only how Vega competes with Nvidia but also what improvements Ryzen gets once they iron out the current kinks.


----------



## Malinkadink

Quote:


> Originally Posted by *Mad Pistol*
> 
> Everyone... please watch this video. Joker ran an i7 7700k (the current pinnacle of gaming CPUs) against a "lowly" R7 1700. The settings were "low" and the resolution was 720p; bottlenecks from the GPU were minimized as much as possible.
> 
> If you will notice, at all times, the R7 1700 is within spitting distance of the i7 7700k in terms of framerates. You will also notice that for those of us who have high refresh rate monitors, the R7 1700 is above 144 fps for probably 80% of the scenarios shown here (except for WD2 and a few other dips, where the i7 7700k was also falling short).
> 
> Let's just get this one out of the way.
> 
> Is the R7 1700 as good for gaming as the i7 7700k is at the moment?
> *No.*
> 
> Is the R7 1700 cheaper than the i7 7700k?
> *Yes.*
> 
> Does the R7 1700 have 4 extra cores and 8 extra threads compared to the i7 7700k?
> *HELL Yes.*
> 
> Is the R7 1700 good at other things than just games?
> *BIG FREAKIN YES.*
> 
> Can the R7 1700 use the additional cores to stream games while you play them?
> *Absolutely!*
> 
> Can the i7 7700k stream well using the CPU?
> *Sure... if you like stuttering.*
> 
> Is the R7 1700 more futureproof because it has more cores and threads than the i7 7700k?
> *Definitely.*
> 
> I guess the last question you must ask yourself is...
> 
> In what reality do you live in where the i7 7700k is the better product?
> I would love to know the answer to this one.


All well said, unsubbing from this thread now


----------



## CULLEN

Golem.de updated their BIOS and ran the tests again, *gained 4-27% additional performance in gaming, on average 17%.*

I knew the motherboards were buggy but I had NO idea a motherboard could affect the outcome so greatly.

source
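For reference, the per-title gain Golem.de reports is just the relative change (new − old) / old, averaged across titles. A quick sketch of that arithmetic with hypothetical before/after framerates (not their actual data):

```python
def percent_gain(old_fps: float, new_fps: float) -> float:
    """Relative improvement of new over old, in percent."""
    return 100.0 * (new_fps - old_fps) / old_fps

# Hypothetical per-title results (fps before BIOS update, fps after):
results = {"Title A": (80.0, 96.0), "Title B": (120.0, 126.0)}

gains = {title: percent_gain(old, new) for title, (old, new) in results.items()}
average = sum(gains.values()) / len(gains)

print(gains)    # Title A: 20.0, Title B: 5.0
print(average)  # -> 12.5
```

Note the average is computed over per-title percentages, which is presumably how a "4-27%, on average 17%" summary would be produced.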


----------



## redone13

Quote:


> Originally Posted by *Mad Pistol*
> 
> Everyone... please watch this video. Joker ran an i7 7700k (the current pinnacle of gaming CPUs) against a "lowly" R7 1700. The settings were "low" and the resolution was 720p; bottlenecks from the GPU were minimized as much as possible.
> 
> If you will notice, at all times, the R7 1700 is within spitting distance of the i7 7700k in terms of framerates. You will also notice that for those of us who have high refresh rate monitors, the R7 1700 is consistently above 144 fps in around 85-90% of the scenarios shown here (except for WD2 and a few other dips, where the i7 7700k was also falling short).
> 
> Let's just get this one out of the way.
> 
> Is the R7 1700 as good for gaming as the i7 7700k is at the moment?
> *No.*
> 
> Is the R7 1700 cheaper than the i7 7700k?
> *Yes.*
> 
> Does the R7 1700 have 4 extra cores and 8 extra threads compared to the i7 7700k?
> *HELL Yes.*
> 
> Is the R7 1700 good at other things than just games?
> *BIG FREAKIN YES.*
> 
> Can the R7 1700 use the additional cores to stream games while you play them?
> *Absolutely!*
> 
> Can the i7 7700k stream well using the CPU?
> *Sure... if you like stuttering.*
> 
> Is the R7 1700 more futureproof because it has more cores and threads than the i7 7700k?
> *Definitely.*
> 
> I guess the last question you must ask yourself is...
> 
> In what reality do you live in where the i7 7700k is the better product?
> I would love to know the answer to this one.


You guys are just reiterating the same thing that everyone has. The truth is 720p benches are not useful, Jok3r is unreliable, and benches need to be GPU bound, not CPU bound thus we are waiting for objective 4k data. A little objectiveness never hurt. All your points are valid aside from the quadcore lags thing. Read through the thread.


----------



## cssorkinman

Quote:


> Originally Posted by *CULLEN*
> 
> Golem.de updated their BIOS and ran the tests again, *gained 4-27% additional performance in gaming, on average 17%.*
> 
> I knew the motherboards were buggy but I had NO idea a motherboard could affect the outcome so greatly.


It's going to do nothing but get better - those updates combined with the coming improvements in RAM frequencies could be huge.


----------



## redone13

Quote:


> Originally Posted by *cssorkinman*
> 
> It's going to do nothing but get better - those updates combined with the coming improvements in RAM frequencies could be huge.


That too potentially.


----------



## CULLEN

I'm being mean and went overboard with a joke.

Weeeoooops


----------



## redone13

Quote:


> Originally Posted by *CULLEN*
> 
> All I read was "this doesn't fit my agenda, so no".


No, it could fit my agenda. However, you like to assume before you see cold hard facts. The current facts are NOT favorable outside of workstation use, primarily.


----------



## Mad Pistol

Quote:


> Originally Posted by *redone13*
> 
> Look, there are pluses and minuses to both. Your lack of objectiveness is glaring. It is good in its own right. However, Jok3r related benches are less than reliable and the *720p benches show nothing.* The benchmarks need to be GPU bound. I think it's been established that more data needs to be released. Also, it's established that quad cores can stream just fine.
> 
> You guys are just reiterating the same thing that everyone has. *The truth is 720p benches are not useful*, Jok3r is unreliable, and benches need to be GPU bound, not CPU bound thus we are waiting for objective 4k data. A little objectiveness never hurt. All your points are valid aside from the quadcore lags thing. Read through the thread.


Wait a moment... I could have sworn everyone and their dog was asking to put the R7 CPUs into scenarios where the CPU was being taxed completely independent of the GPU. Now that it has been presented correctly, you want to throw the results out?

I don't get this line of thinking.... damned if you do, damned if you don't. 4k and 1440p benchmarks are great, but at a certain point, the CPU is just along for the ride, and the GPU is doing all of the heavy lifting.

We need more testing. There is no doubt about that. However, as an initial showing from AMD, this is extremely encouraging; they have developed a well-rounded product that can do everything pretty well.


----------



## redone13

Quote:


> Originally Posted by *Mad Pistol*
> 
> Wait a moment... I could have sworn everyone and their dog was asking to put the R7 CPUs into scenarios where the CPU was being taxed completely independent of the GPU. Now that it has been presented correctly, you want to throw the results out?
> 
> I don't get this line of thinking.... damned if you do, damned if you don't. 4k and 1440p benchmarks are great, but at a certain point, the CPU is just along for the ride, and the GPU is doing all of the heavy lifting.
> 
> We need more testing. There is no doubt about that. However, as an initial showing from AMD, this is extremely encouraging; they have developed a well-rounded product that can do everything pretty well.


It was my understanding that GPU bound benchmarks prove nothing? Am I mistaken?

Edit, I reread Oubadah's post. It is GPU-bound. It was post 1561 and it is still a little confusing lol.


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> You guys are just reiterating the same thing that everyone has. The truth is 720p benches are not useful, Jok3r is unreliable, and benches need to be GPU bound, not CPU bound thus we are waiting for objective 4k data. A little objectiveness never hurt. All your points are valid aside from the quadcore lags thing. Read through the thread.


Completely wrong.

For a CPU comparison, you need to be CPU bound.

Today 4K is GPU bound. In 2 or 3 years when Ryzen owners upgrade their GPUs, 4K gaming might not be GPU bound with headroom to spare. This is where Ryzen might lag behind Intel if BIOS patches and game optimizations don't take place.

The only reason why 1440p and up show little difference is that the GPU is the overriding factor, today.

Would I have done 720p on low? No. I would have done a test at 720p with Ultra settings.
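umeng2002's point can be boiled down to a simple first-order bottleneck model: the framerate you observe is roughly the minimum of what the CPU and GPU can each deliver, so lowering resolution (which raises the GPU's ceiling) is what exposes the CPU difference. A toy sketch, with all numbers hypothetical:

```python
def observed_fps(cpu_cap: float, gpu_cap: float) -> float:
    """First-order model: the slower component caps the framerate."""
    return min(cpu_cap, gpu_cap)

# Hypothetical caps: CPU A can feed 160 fps of game logic, CPU B 200 fps.
cpu_a, cpu_b = 160.0, 200.0

# At 4K the GPU caps out around 70 fps -> both CPUs look identical.
print(observed_fps(cpu_a, 70.0), observed_fps(cpu_b, 70.0))   # -> 70.0 70.0

# At 720p/low the GPU ceiling rises to ~400 fps -> the CPU gap appears.
print(observed_fps(cpu_a, 400.0), observed_fps(cpu_b, 400.0))  # -> 160.0 200.0
```

This is why a GPU-bound 4K bench shows "no difference" between CPUs today, yet the gap can reappear after a future GPU upgrade raises the GPU ceiling.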


----------



## comagnum

Sigh.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> Completely wrong.
> 
> For a CPU comparison, you need to be CPU bound.
> 
> Today 4K is GPU bound. In 2 or 3 years when Ryzen owners upgrade their GPUs, 4K gaming might not be GPU bound with headroom to spare. This is where Ryzen might lag behind Intel if BIOS patches and game optimizations don't take place.
> 
> The only reason why 1440p and up show little difference is that the GPU is the overriding factor, today.
> 
> Would I have done 720p on low? No. I would have done a test at 720p with Ultra settings.


Yea, it was this post:
Quote:


> Originally Posted by *Oubadah*
> 
> Not sure if it was a typo, but 2500K+1070 is CPU limited. CPU-limited benches would have been potentially useful to someone who'd ended up with that combo.
> 
> If someone was going to buy a CPU in 2011, they could have looked at something like this:
> 
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/44339-intel-core-i3-2120-core-i5-2400-lga1155-processors-review-12.html
> 
> And concluded that in Crysis, a i3-2120 is more or less equivalent to a i7-2600K. That is a terribly misleading and unhelpful benchmark because:
> 
> A) It's taken in a heavily GPU-limited scenario (ice map). There are parts of Crysis (eg. shipyard), where even at [email protected] High on a GTX 580 (the benchmark config), the game is going to be CPU bound and the framerate won't be anywhere near 50fps. In those parts the 2600K would have made a significant difference.
> 
> B) If they were to upgrade the GTX 580 to, say, a GTX 1070 or 970, the 2600K would have been far less limiting after that upgrade.
> 
> Ideally the reviewer should have run their bench in heavy combat in the shipyard, but maybe they didn't know about it. The second best option was to lower the resolution so that the benchmark would be CPU bound in the ice map, then at least the reader would have got some sense that the 2120 and 2600K are not equal in Crysis.
> They might, but I wouldn't be buying hardware now based on an assumption that all future games will be using highly threaded engines. Another implied assumption seems to be that people are only going to be playing new games. Some people still play older games with older, poorly threaded engines. It's only been in the last year or so that we've started to see big movement on that front.
> 
> I'm not so sure that enthusiasts are necessarily likely to be less CPU limited either. If your budget is high, then it's easier to end up with a CPU bottleneck that you can't fix just by throwing more money at it. It's not like you can put two CPUs in a system (well, you could, but it wouldn't scale the same way). When I had my GTX 780 SLI build I was coming up against CPU bottlenecks regularly. Of course, if I'd thrown a 4K monitor into the mix it might have swung back in the other direction. That's why it's hard to make any sweeping statements.


----------



## Mad Pistol

Quote:


> Originally Posted by *redone13*
> 
> It was my understanding that GPU bound benchmarks prove nothing? Am I mistaken?


Independent of the software, there are 2 major components in game testing nowadays:

1. *GPU* - Affects visual settings
2. *CPU* - Affects output of game data and response

-so-

If you want to cause a GPU bottleneck, turn up the in-game visual settings
If you want to cause a CPU bottleneck, turn down the in-game visual settings and resolution

Now... there is a point where a CPU or GPU can be slow enough that it negatively affects gameplay. What Joker's video shows is that a Ryzen 8-core will not slow down a high-end graphics card to any degree that should be detrimental to the gaming experience. Couple that with the fact that AMD's R7 CPUs have more cores and more threads than the i7 7700k, and the R7 1700 is not only the more future-proof solution, it also represents a massive value proposition (the R7 1700 is $330; the i7 7700k is $350).

If I still had my FX 8320 from a couple of years ago, you can bet your butt that I would be getting a Ryzen setup without hesitation.


----------



## redone13

Quote:


> Originally Posted by *Mad Pistol*
> 
> Independent of the software, there are 2 major components in game testing nowadays
> 
> 1. *GPU* - Affects visual settings
> 2. *CPU* - Affects output of game data and response
> 
> -so-
> 
> If you want to cause a GPU bottleneck, turn up the in-game visual settings
> If you want to cause a CPU bottleneck, turn down the in-game visual settings and resolution
> 
> Now... there is a point where a CPU or GPU can be slow enough that it negatively affects gameplay. What Joker's video shows is that a Ryzen 8-core will not slow down a high-end graphics card to any degree that should be detrimental to the gaming experience. Couple that with the fact that AMD's R7 CPUs have more cores and more threads than the i7 7700k, and the R7 1700 is not only the more future-proof solution, it also represents a massive value proposition (the R7 1700 is $330; the i7 7700k is $350).
> 
> If I still had my FX 8320 from a couple of years ago, you can bet your butt that I would be getting a Ryzen setup without hesitation.


Understood, just trying to apply some of the theory from the thread.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> Completely wrong.
> 
> For a CPU comparison, you need to be CPU bound.
> 
> Today 4K is GPU bound. In 2 or 3 years when Ryzen owners upgrade their GPUs, 4K gaming might not be GPU bound with headroom to spare. This is where Ryzen might lag behind Intel if BIOS patches and game optimizations don't take place.
> 
> The only reason why 1440p and up show little difference is that the GPU is the overriding factor, today.
> 
> Would I have done 720p on low? No. I would have done a test at 720p with Ultra settings.


Riddle me this: in that video the GPU utilization was 85% or higher, with some spikes to 90+. The processor was not being utilized heavily on very many cores at all. What is the bottleneck here?


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> Understood, just trying to apply some of the theory from the thread.


The 720p tests are valid because they expose the R7's weakness in some games better than a higher-resolution test does.

Think of it like a magnifying glass.

TODAY, with TODAY'S GPUs, there is no issue with the R7 at 4K.

R7 might not be able to feed TOMORROW'S GPUs fast enough compared to a 7700k.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> The 720p tests are valid because they expose the R7's weakness in some games better than a higher-resolution test does.
> 
> Think of it like a magnifying glass.
> 
> TODAY, with TODAY'S GPUs, there is no issue with the R7 at 4K.
> 
> R7 might not be able to feed TOMORROW'S GPUs fast enough compared to a 7700k.


That video looked more GPU bound to me.


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> Riddle me this: in that video the GPU utilization was 85% or higher. The processor was not being utilized heavily on very many cores at all. What is the bottleneck here?


The bottleneck is probably the code the games use or the RAM speed or something.

The point in seeing how well a CPU can feed the GPU is to look at the GPU utilization, not really the CPU utilization.

The 7700k fed test had the GPU utilization a tad higher most of the time.

Again, the 720p test just makes the difference more noticeable, which might be important for future, more powerful GPUs.


----------



## Oubadah

..


----------



## Forceman

Quote:


> Originally Posted by *umeng2002*
> 
> The 720p tests are valid because they expose the R7's weakness in some games better than a higher-resolution test does.
> 
> Think of it like a magnifying glass.
> 
> TODAY, with TODAY'S GPUs, there is no issue with the R7 at 4K.
> 
> R7 might not be able to feed TOMORROW'S GPUs fast enough compared to a 7700k.


Did anyone test with any AMD cards? Curious how the driver overhead affects things, if at all.


----------



## Mad Pistol

Quote:


> Originally Posted by *redone13*
> 
> That video looked more GPU bound to me.


No. The GPU is just fast enough that it can push seriously high framerates when the settings are turned down.

CPUs are responsible for what are called "draw calls": the commands sent to the GPU describing how to plot and render a scene. The GPU then takes that data and creates the pretty pictures. The CPU is also responsible for AI logic and physics.

The more the CPU has to work on other things, the less time it has to send data to the GPU, thus causing lag and stutters in game.
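The hand-off described above can be sketched as a toy model (all timings invented for illustration; real engines overlap the CPU and GPU stages, but the slowest stage still sets the pace):

```python
# Toy model of the CPU-then-GPU hand-off: each frame, the CPU must prepare
# draw calls before the GPU can render. Whichever stage is slower sets the
# frame time, so a busy CPU starves even a fast GPU. Times are made up.

def frame_time_ms(cpu_ms, gpu_ms):
    # Simple serial model: the slower stage dominates.
    return max(cpu_ms, gpu_ms)

def gpu_utilization(cpu_ms, gpu_ms):
    # Fraction of each frame the GPU actually spends rendering.
    return 100.0 * gpu_ms / frame_time_ms(cpu_ms, gpu_ms)

# GPU-bound: 4K-ish load, the GPU takes longer than the CPU
print(gpu_utilization(cpu_ms=6.0, gpu_ms=16.0))  # 100.0
# CPU-bound: low-res load, the CPU can't keep the GPU fed
print(gpu_utilization(cpu_ms=6.0, gpu_ms=3.0))   # 50.0
```

Dropping the resolution shrinks `gpu_ms` while `cpu_ms` stays roughly fixed, which is exactly why low-res tests expose the CPU.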


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> That video looked more GPU bound to me.


Ideally, the GPUs should be at 98%+ with vsync off.

That's a totally fed GPU.

Anything less, and it's an issue where the CPU can't feed the GPU fast enough... whether or not the CPU is at 40% or 85% utilization.
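A minimal sketch of that heuristic, assuming utilization samples logged from a monitoring overlay (the 98% threshold is just this thread's rule of thumb, not an official figure):

```python
# Classify the likely bottleneck from logged GPU-utilization samples.
# Assumes vsync and frame caps are off, so anything keeping the GPU
# below saturation is the CPU (or memory/engine code) failing to feed it.

def classify_bottleneck(gpu_util_samples, threshold=98.0):
    """Return 'GPU-bound' if the GPU stays saturated on average,
    else 'CPU-bound' (the GPU is waiting to be fed, whatever the
    CPU percentage happens to show)."""
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return "GPU-bound" if avg >= threshold else "CPU-bound"

print(classify_bottleneck([99, 98, 100, 99]))  # GPU-bound
print(classify_bottleneck([85, 88, 90, 86]))   # CPU-bound
```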


----------



## Mad Pistol

Quote:


> Originally Posted by *umeng2002*
> 
> Ideally, the GPUs should be at 98%+ with vsync off.
> 
> That's a totally fed GPU.
> 
> *Anything less, and it's an issue where the CPU can't feed the GPU fast enough*... whether or not the CPU is at 40% or 85% utilization.


Bingo.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> Ideally, the GPUs should be at 98%+ with vsync off.
> 
> That's a totally fed GPU.
> 
> Anything less, and it's an issue where the CPU can't feed the GPU fast enough... whether or not the CPU is at 40% or 85% utilization.


Hmm, ok I get what you are saying. So real world, what does this mean if 1080p is the standard?


----------



## Mad Pistol

Quote:


> Originally Posted by *redone13*
> 
> Hmm, ok I get what you are saying. So real world, what does this mean if 1080p is the standard?


This means that the i7 7700k can push data more quickly to a GPU than an R7 1700... but not by much.

You should also expect to see this trend reverse in the coming years; games are beginning to be coded to use 8 core/16 thread CPUs more fully. As this happens, processors such as the i7 7700k will not be able to feed a GPU as efficiently as an R7 8 core can. Therefore, the R7 1700/X/1800X will pull ahead and ultimately be the better CPUs in the long run.


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> Hmm, ok I get what you are saying. So real world, what does this mean if 1080p is the standard?


Depends on your desired frame rate. I still just use 60 Hz, so it wouldn't be an issue for me. For 144 Hz people, it can be an issue.

The whole issue with Ryzen's 1080p performance in games is mostly about future titles and future GPUs and high resolutions.


----------



## Oubadah

..


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> Ideally, the GPUs should be at 98%+ with vsync off.
> 
> That's a totally fed GPU.
> 
> Anything less, and it's an issue where the CPU can't feed the GPU fast enough... whether or not the CPU is at 40% or 85% utilization.


I'm just trying to work out why the current data sets are unfavorable at 1080p, which is what the majority will be using. 4K is not the majority, and maybe 720p has a stronger argument. It's all conjecture at this point, but I want more tests. I know about all the controversy and how AMD says they will fix it.


----------



## redone13

Quote:


> Originally Posted by *Oubadah*
> 
> You have to be wary of how the per-core load is being measured. It's well known that Windows' Task Manager graphs, for example, are deceptive. Try running 1 thread of Prime95 on a 4-thread processor, and you will see Windows display it as 4 cores at 25% load*. A game could be bottlenecked on ONE core, pegging it at a solid 100%, but the software you're using to monitor it might not reflect this.
> 
> *Modern versions of Intel's Turbo will interfere with this test.


Yea, that was a little beyond my scope, but I can believe that Task Manager isn't entirely accurate. So what the hardware is actually doing doesn't correlate to what the software reports.
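Oubadah's Task Manager point can be illustrated with a toy simulation (synthetic numbers, not a real measurement): one fully busy thread that the scheduler rotates round-robin across 4 cores averages out to 25% per core over the sampling window, hiding the single-core bottleneck.

```python
def apparent_per_core_load(busy_threads, cores, sample_slices=100):
    """Simulate a monitor that averages load over a sampling window
    while the OS scheduler rotates each 100%-busy thread round-robin
    across cores."""
    busy = [0] * cores
    for s in range(sample_slices):
        for t in range(busy_threads):
            # In this time slice, thread t happens to run on this core.
            busy[(s + t) % cores] += 1
    return [100.0 * b / sample_slices for b in busy]

# One pegged thread on a 4-thread CPU reads as 25% on every core,
# even though the game would be hard-bottlenecked on that one thread.
print(apparent_per_core_load(busy_threads=1, cores=4))  # [25.0, 25.0, 25.0, 25.0]
```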


----------



## umeng2002

Quote:


> Originally Posted by *Oubadah*
> 
> You have to be wary of how the per-core load is being measured. It's well known that Windows' Task Manager graphs, for example, are deceptive. Try running 1 thread of Prime95 on a 4-thread processor, and you will see Windows display it as 4 cores at 25% load*. A game could be bottlenecked on ONE core, pegging it at a solid 100%, but the software you're using to monitor it might not reflect this.
> 
> *Modern versions of Intel's Turbo will interfere with this test.


Yeah, and for some reason joker didn't put TOTAL cpu utilization for Ryzen 7 on the video overlay like he did for the 7700k.

But you have to look at the GPU utilization most at the lower resolution. Vsync also needs to be off.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> Yeah, and for some reason joker didn't put TOTAL cpu utilization for Ryzen 7 on the video overlay like he did for the 7700k.
> 
> But you have to look at the GPU utilization most at the lower resolution. Vsync also needs to be off.


The thing is, are we being drawn into synthetics rather than the real world? Like a doctor looking at a patient's vital signs rather than at the patient themselves first?


----------



## oxidized

Quote:


> Originally Posted by *Mad Pistol*
> 
> This means that the i7 7700k can push data more quickly to a GPU than an R7 1700... but not by much.
> 
> You should also expect to see this trend reverse in the coming years; games are beginning to be coded to use 8 core/16 thread CPUs more fully. As this happens, processors such as the i7 7700k will not be able to feed a GPU as efficiently as an R7 8 core can. Therefore, the R7 1700/X/1800X will pull ahead and ultimately be the better CPUs in the long run.


Games using 8/16 efficiently won't happen anytime soon; it's probably better to settle for a 6/12, as 4/8 is still quite viable. We won't be seeing games using 8/16 tomorrow. That will most likely happen in a few years, when 4/8 will be pretty useless and 6/12 will be optimal in most situations.


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> Yea, that was a little beyond my scope but I can believe that Task Manager isn't entirely accurate. So the hardware reporting doesn't correlate to software reporting.


Again. Look at the GPU utilization. Not the CPU utilization.

GPU utilization is very mature and depends on the drivers from AMD and nVidia.

Less than high-90s GPU utilization and the GPU is WAITING for the CPU... even if the CPU utilization is low.


----------



## Mad Pistol

Quote:


> Originally Posted by *oxidized*
> 
> Games using 8/16 efficiently won't happen anytime soon; it's probably better to settle for a 6/12, as 4/8 is still quite viable. We won't be seeing games using 8/16 tomorrow. That will most likely happen in a few years, when 4/8 will be pretty useless and 6/12 will be optimal in most situations.


Considering that the PS4 and Xbox One both have 8 core APUs in them, it may actually happen more quickly than people think.


----------



## Oubadah

..


----------



## redone13

Quote:


> Originally Posted by *Oubadah*
> 
> I thought we were talking about games, not synthetics.


I mean the idea of gaming at 720p, which arguably applies to more people than gaming at 2160p does. It's all the idea "of", but the current 1080p benches, with as much controversy as there is, don't add up.


----------



## oxidized

Quote:


> Originally Posted by *Mad Pistol*
> 
> Considering that the PS4 and Xbox One both have 8 core APUs in them, it may actually happen more quickly than people think.


Well, consoles surely have a more gaming-focused OS and overall platform, so it's probably a different thing. Optimization like on consoles is something we had a long time ago on PC...


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> I mean the idea of gaming at 720p, which arguably applies to more people than gaming at 2160p does.


That's not a synthetic test at all.

A game at 720p runs the same code as the game at 4k.

The lower resolution just makes the GPU's job easier to see if the CPU can feed it fast enough to be totally utilized.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> That's not a synthetic test at all.
> 
> A game at 720p runs the same code as the game at 4k.
> 
> The lower resolution just makes the GPU's job easier to see if the CPU can feed it fast enough to be totally utilized.


It just doesn't answer the question raised by the majority of data available to us. We are in love with the idea OF testing at 720p and 2160p, but what does it translate to? Did Intel pay off reviewers? Maybe lol


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> It just doesn't answer the question raised by the majority of data available to us. We are in love with the idea OF testing at 720p and 2160p, but what does it translate to? Did Intel pay off reviewers? Maybe lol


The idea is to expose any weakness in Ryzen's game performance.

There are some weaknesses.

In fact, the bigger story is AMD wanting reviews at 1440p and 4K to cover up those weaknesses.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> The idea is to expose any weakness in Ryzen's game performance.
> 
> There are some weaknesses.
> 
> In fact, the bigger story is AMD wanting reviews at 1440p and 4K to cover up those weaknesses.


Yea, I heard about that.


----------



## Mad Pistol

Quote:


> Originally Posted by *redone13*
> 
> It just doesn't answer the question raised by the majority of data available to us. We are in love with the idea OF testing at 720p and 2160p, but what does it translate to? Did Intel pay off reviewers? Maybe lol


The problem with this launch is that reviewers were given a wide variety of hardware configurations between CPUs, motherboards, and RAM selection. Because of this, there are certain combinations of the 3 that may yield better overall results than others. This could be because the RAM doesn't play nice, or the motherboard's BIOS/microcode isn't mature, or perhaps some of the software used for testing just likes Intel's environment better.

The current PC ecosystem is highly fragmented. Getting consistent results across multiple systems, especially when launching a new product, can prove to be difficult sometimes.


----------



## redone13

Quote:


> Originally Posted by *Mad Pistol*
> 
> The problem with this launch is that reviewers were given a wide variety of hardware configurations between CPUs, motherboards, and RAM selection. Because of this, there are certain combinations of the 3 that may yield better overall results than others. This could be because the RAM doesn't play nice, or the motherboard's BIOS/microcode isn't mature, or perhaps some of the software used for testing just likes Intel's environment better.
> 
> The current PC ecosystem is highly fragmented. Getting consistent results across multiple systems, especially when launching a new product, can prove to be difficult sometimes.


They were pretty consistent.


----------



## Oubadah

..


----------



## umeng2002

It's actually a fairly standard method to test CPU gaming performance.

Turn off Vsync. Set the game at low resolution. Use the fastest GPU possible.

It's been known for decades now.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> It's actually a fairly standard method to test CPU gaming performance.
> 
> Turn off Vsync. Set the game at low resolution. Use the fastest GPU possible.
> 
> It's been known for decades now.


Right, but among the "companies" that review stuff, you'd be hard pressed to see a 720p bench. Perhaps there is a deeper reason for this.


----------



## redone13

Quote:


> Originally Posted by *Oubadah*
> 
> No, for the reasons I've stated a billion times. And at the end of the day, I ask this:
> 
> 1) GPU bound CPU benchmarks that tell you nothing.
> 2) CPU bound CPU benchmarks that may not give you real world results, but still give you some kind of result.
> 
> Which is more useful? If reviewers are going to do the former, they might as well not do the benchmarks at all.
> 
> I'm going to speed test 5 cars. Since no one can drive above the speed limit anyway, I'm going to make this a "real world" test by testing them all with a limiter, limiting them to exactly the speed limit. I test them all and, lo and behold, they're all exactly as fast as each other. Why did I even run the test at all?


I get it completely. The GPU will mask things.


----------



## Mad Pistol

Quote:


> Originally Posted by *redone13*
> 
> They were pretty consistent.


No they weren't. There were quite a few reviewers singing praises, constantly showing the R7 1800x outshining the i7 6900k. Then you had other reviewers that were comparing an R7 1700 vs an i7 7700k, and the R7 1700 was better in productivity tasks, but the i7 7700k walked all over it in gaming, so the reviewer declared the R7 1700 to be a failure.

Seriously, the difference of hardware from one reviewer to the next is probably what hurt AMD the most in all of this.


----------



## redone13

Quote:


> Originally Posted by *Mad Pistol*
> 
> No they weren't. There were quite a few reviewers singing praises, constantly showing the R7 1800x outshining the i7 6900k. Then you had other reviewers that were comparing an R7 1700 vs an i7 7700k, and the R7 1700 was better in productivity tasks, but the i7 7700k walked all over it in gaming, so the reviewer declared the R7 1700 to be a failure.
> 
> Seriously, the difference of hardware from one reviewer to the next is probably what hurt AMD the most in all of this.


You are right as far as consistency in what was being compared. It probably shouldn't have been compared for gaming purposes, but the parallel was drawn because the vast majority of people game rather than heavily multitask. It seemed like they all praised it as worthy competition for specialized tasks but were not impressed by the 1080p gaming numbers.


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> Right, but among the "companies" that review stuff, you'd be hard pressed to see a 720p bench. Perhaps there is a deeper reason for this.


Yeah, to become as CPU bound as possible. And, frankly, most people incorrectly think that a 1080 can't become GPU bound at 1080p. Newer titles down the road, or tons of AA, can do it. But a low resolution like 720p makes that almost nil and puts the bottleneck squarely on the CPU to feed the GPU hundreds of frames per second.

There isn't some "agenda" here other than to see which CPU is better at gaming, currently.

Say, in 4 years, with an R7 1700 and a 7700k, if no patches or optimizations take place, a new top-end GPU might very well show the same performance delta at 4K that they show now at 720p.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> Yeah to become as CPU bound as possible.
> 
> There isn't some "agenda" here other than to see which CPU is better at gaming, currently.
> 
> Say, in 4 years, with an R7 1700 and a 7700k, if no patches or optimizations take place, a new top-end GPU might very well show the same performance delta at 4K that they show now at 720p.


I understand. But I've read various CPU reviews over the years, including the recently released ones, and 720p is nowhere in sight. It doesn't seem like a standard, and I'm not saying it shouldn't be, but it isn't common methodology. Unless I'm just reading too much of the mainstream stuff.


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> I understand. But I've read various CPU reviews over the years, including the recently released ones, and 720p is nowhere in sight. It doesn't seem like a standard, and I'm not saying it shouldn't be, but it isn't.


It should be if they really want a thorough review.

But joker should have run the tests at ULTRA settings at 720p to load the engine with more physics and junk.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> It should be if they really want a thorough review.
> 
> But joker should have run the tests at ULTRA settings at 720p to load the engine with more physics and junk.


So the real question is, do we listen to those mainstream websites who may have not been feeding us accurate data or Jok3r? Or do we wait for more consistency before we totally get giddy?


----------



## umeng2002

Quote:


> Originally Posted by *redone13*
> 
> So the real question is, do we listen to those mainstream websites who may have not been feeding us accurate data or Jok3r? Or do we wait for more consistency before we totally get giddy?


The tests are accurate enough, but a better test would be Ultra at 720p.

Youtubers, for the most part, the ones that don't have more in-depth articles, aren't very good at reviewing... think Linus, even Jayz2cents, even joker...

But at least joker did the review.

Frankly, all the best reviews are coming out of Europe.


----------



## redone13

Quote:


> Originally Posted by *umeng2002*
> 
> The tests are accurate enough, but a better test would be Ultra at 720p.
> 
> Youtubers, for the most part, the ones that don't have more in-depth articles, aren't very good at reviewing... think Linus, even Jayz2cents, even joker...
> 
> But at least joker did the review.


Yea, Linus' review did not impress me at all. I think we need more information; however, there is enough to get excited about.


----------



## budgetgamer120

Quote:


> Originally Posted by *umeng2002*
> 
> That's not a synthetic test at all.
> 
> A game at 720p runs the same code as the game at 4k.
> 
> The lower resolution just makes the GPU's job easier to see if the CPU can feed it fast enough to be totally utilized.


His argument will always be the same: that there are more 720p gamers than 4K gamers. You see that argument there? Smh


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> His argument will always be the same. 720p gamers are more than 4k gamers. You see that argument there? Smh


Would you state otherwise?


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> Would you state otherwise?


What does that matter in the grand scheme of things? Does that mean that 720p is somehow better than 4K?
Is anyone buying high-end hardware to play at 720p, or are they buying it to play at 4K?

Got to be the worst arguments I see on a forum.

If you bothered to research before you posted, you would see 720p and 4K are equal in the stats. Not that it matters, but just FYI.
http://store.steampowered.com/hwsurvey


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> What does that matter in the grand scheme of things? Does that mean that 720p is somehow better than 4K?
> Is anyone buying high-end hardware to play at 720p, or are they buying it to play at 4K?
> 
> If you bothered to research before you posted, you would see 720p and 4K are equal in the stats. Not that it matters, but just FYI.
> http://store.steampowered.com/hwsurvey


No, it doesn't matter. I was discussing why 720p benches aren't used more frequently, so that things aren't GPU bound. You introduced what we call a strawman, misconstruing my argument to make it easier to attack. Actually, there was no argument; you just came to post "smh".


----------



## budgetgamer120

Quote:


> Originally Posted by *redone13*
> 
> No, it doesn't matter. I was discussing why 720p benches aren't used more frequently, so that things aren't GPU bound. You introduced what we call a strawman, misconstruing my argument to make it easier to attack. Actually, there was no argument; you just came to post "smh".


I came with facts to back up my post. Try it sometime instead of ifs and buts.


----------



## redone13

Quote:


> Originally Posted by *budgetgamer120*
> 
> I came with facts to back up my post. Try it sometime instead of ifs and buts.


Please enlighten me as to what you are being so factual about?


----------



## TopicClocker

Quote:


> Originally Posted by *DADDYDC650*
> 
> consistently higher minimum frames.


The 1700 at 3.9GHz dipped below 70 fps while the 7700K at 5GHz does not, the minimum is unfortunately not consistently higher.

I also feel that he went a bit too low on the settings with the GTX 1080, games like GTA 5 and Watch Dogs 2 have CPU intensive settings that hammer the CPU when raised, yet he appeared to have tested low settings across the board, unless he had things like the draw distance raised.

I guess he doesn't know much about testing CPU performance in games, especially since he initially released GPU limited benchmarks when benchmarking the Ryzen CPUs.

EDIT: Oh, were you talking about the performance of it in the review at higher settings? Yes the minimum is higher on the Ryzen CPU.


----------



## budgetgamer120

Quote:


> Originally Posted by *TopicClocker*
> 
> The 1700 at 3.9GHz dipped below 70 fps while the 7700K at 5GHz does not, the minimum is unfortunately not consistently higher.
> 
> I also feel that he went a bit too low on the settings with the GTX 1080, games like GTA 5 and Watch Dogs 2 have CPU intensive settings that hammer the CPU when raised, I guess he doesn't know much about testing CPU performance in games, especially since he initially released GPU limited benchmarks when benchmarking the Ryzen CPUs.


The general technique is to drop settings to the lowest and drop the resolution. Some do not know that settings affect physics that are controlled by the CPU. But he is trying to please his fans.


----------



## epic1337

do we have any reviews on various core configurations?

e.g. (2+2) vs (4+0) vs (2+4) vs (3+3) vs (4+2) vs (4+4)
numbers represents cores per CCX.

this could give us a glimpse of what a 4C/8T and 6C/12T Ryzen would perform like.

in particular, I sort of think both 2+4 and 3+3 would perform better than 4+2,
since, according to one of the reviews, a 2-core or 3-core CCX would still have full access to the 8MB L3 cache.

which means a 2+4 would have somewhat higher performance on cores 0~3 (including hyperthreads),
whereas a 3+3 would have somewhat higher performance across the board.
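A quick back-of-the-envelope on that point, under the assumption stated above that each CCX keeps its full 8 MB of L3 regardless of how many cores are fused off (real L3 behaviour is a shared victim cache and more complicated than this arithmetic):

```python
L3_PER_CCX_MB = 8  # assumed per the review mentioned above

def l3_per_core(config):
    """config = cores enabled in each CCX, e.g. (3, 3).
    Returns nominal MB of L3 per core in each populated CCX."""
    return tuple(L3_PER_CCX_MB / cores for cores in config if cores > 0)

for cfg in [(2, 2), (4, 0), (2, 4), (3, 3), (4, 2), (4, 4)]:
    print(cfg, l3_per_core(cfg))
# e.g. a 2+4 gives the 2-core CCX 4 MB/core vs 2 MB/core on the 4-core CCX,
# matching the guess that its cores 0~3 would see a cache advantage
```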


----------



## budgetgamer120

Quote:


> Originally Posted by *epic1337*
> 
> do we have any reviews on various core configurations?
> 
> e.g. (2+2) vs (4+0) vs (3+3) vs (4+2) vs (4+4)
> numbers represents cores per CCX.
> 
> this could give us a glimpse of what a 4C/8T and 6C/12T Ryzen would perform like.


Lol the same? I do not think there will be a difference. Maybe overclock a little higher.


----------



## phenom01

Don't argue with budget. He runs an AMD fan-based channel and cannot be told anything that does not put AMD in the God-mode category. Do not bother.


----------



## randomizer

No matter what settings reviewers use some people will claim GPU bottlenecks are compressing the performance differences and dismiss the results, or that nobody plays at the settings used and dismiss the results.


----------



## budgetgamer120

Quote:


> Originally Posted by *phenom01*
> 
> Don't argue with budget. He runs an AMD fan-based channel and cannot be told anything that does not put AMD in the God-mode category. Do not bother.


Check my posts

Good try though.


----------



## phenom01

Your posts on this forum don't lie about your other posts.


----------



## budgetgamer120

Quote:


> Originally Posted by *phenom01*
> 
> Your post on this forum dont lie about your other posts.


Good try; try a little harder. And post my other posts and my fanbot channel, please.


----------



## redone13

Quote:


> Originally Posted by *phenom01*
> 
> Don't argue with budget. He runs an AMD fan-based channel and cannot be told anything that does not put AMD in the God-mode category. Do not bother.


Yea, it's pretty evident when he attacked me for discussing why 720p benches aren't the norm to reduce GPU bottlenecks because he thought I was saying 720p is what the majority use. He can't form a logical argument.
Quote:


> Originally Posted by *budgetgamer120*
> 
> What does that matter in the grand scheme of things? Does that mean that 720p is somehow better than 4k?
> Is anyone buying high-end hardware to play at 720p, or are they buying it to play at 4k?


----------



## oxidized

Quote:


> Originally Posted by *phenom01*
> 
> Don't argue with budget. He runs an AMD fan-based channel and cannot be told anything that does not put AMD in the God-mode category. Do not bother.


Quote:


> Originally Posted by *redone13*
> 
> Yea, it's pretty evident when he attacked me for discussing why 720p benches aren't the norm to reduce GPU bottlenecks because he thought I was saying 720p is what the majority use. He can't form a logical argument.


I suggest you add him to your ignore list.


----------



## epic1337

Quote:


> Originally Posted by *budgetgamer120*
> 
> Lol the same? I do not think there will be a difference. Maybe overclock a little higher.


higher overclock is a given due to more thermal and power headroom.

but what i'm more interested in is L3 cache allocation. in one of the reviews they mentioned that disabling a few cores on the CCX doesn't disable the cache,
meaning a CCX with only 2 or 3 cores will still have the full 8MB cache for its use. this is an advantage when one considers how eDRAM benefited Broadwell chips.


----------



## budgetgamer120

Quote:


> Originally Posted by *epic1337*
> 
> higher overclock is a given due to more thermal and power headroom.
> 
> but what i'm more interested in is L3 cache allocation, in one of the reviews *they mentioned that disabling a few cores on the CCX doesn't disable the cache.*
> meaning a CCX with only 2 or 3 cores will still have the full 8MB cache for its use. this is an advantage when one considers how eDRAM benefited Broadwell chips.


Well that might be interesting. I wonder why no one has done this yet. Maybe they were instructed not to or everyone rushed their reviews out to get views.


----------



## budgetgamer120

Those who game on more than 4 cores might find this interesting. If you have a 5k-series or 6k-series Intel, there is no need for Ryzen. Things look about equal in these tests.


----------



## epic1337

Quote:


> Originally Posted by *budgetgamer120*
> 
> Well that might be interesting. I wonder why no one has done this yet. Maybe they were instructed not to or everyone rushed their reviews out to get views.


found the review i read it from.

http://www.legitreviews.com/amd-ryzen-7-1800x-1700x-and-1700-processor-review_191753/5
Quote:


> One of the neat things about a Ryzen platform is that you can disable cores and if you wanted to run just four cores you can select a 4-0 or 2-2 core setup on your CPU with a Ryzen 7 series processor. We thought some cache was attached per core, so disabling cores should change our benchmark results. Running all four cores on one half of the die (4-0) might have disabled the cache on one of the CCX's and we proved that to be the case here as there were no results in the 192 MB and 256 MB block sizes. The maximum block size is LLC * 16. So when you have an 8MB L3 cache (or 8MB per CCX), the maximum block size is 128MB that the test will run to. *Since we were getting results up to 256MB on the tests it looks like both blocks of 8MB cache per CCX were being accessed.* This is a good sign, but there is a significant performance hit after 8MB for some reason. It would be nice to benchmark the speed between the two CCX's as they are independent and connected with each other only through the SDF (Scalable Data Fabric). Could the interconnect between the two CCX's be bottlenecking performance? We aren't sure of a way to test that, but it sure looks like it.
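The "LLC * 16" rule in that quote is easy to check with a few lines. A minimal sketch, assuming the rule as the review states it; the `max_block_size` helper is made up for illustration, not the benchmark tool's actual API:

```python
# The review says the memory benchmark tests block sizes up to LLC * 16,
# so the largest block that returns results reveals how much last-level
# cache the tool can see.
MB = 1024 * 1024

def max_block_size(llc_bytes: int) -> int:
    """Largest block size the benchmark will run for a given LLC size."""
    return llc_bytes * 16

# One CCX exposes 8 MB of L3 -> blocks up to 128 MB.
print(max_block_size(8 * MB) // MB)    # 128

# Results showing up at 192 MB and 256 MB imply both CCXs' caches
# (2 x 8 MB = 16 MB of L3) were visible to the test.
print(max_block_size(16 * MB) // MB)   # 256
```

This matches the review's observation: a 4-0 core config that killed one CCX's cache produced nothing above 128 MB, while the full chip ran out to 256 MB.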


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *umeng2002*
> 
> The 720p tests are valid because it exposes the R7 weakness is some games better than a higher resolution test.
> 
> Think of it like a magnifying glass.
> 
> TODAY, with TODAY'S GPUS, show no issue with R7 at 4K.
> 
> R7 might not be able to feed TOMORROW'S GPUS fast enough compared to a 7700k.


Did you watch the video? What weaknesses did the R7 display at 720p? I'm confused...


----------



## Blameless

Quote:


> Originally Posted by *epic1337*
> 
> found the review i read it from.
> 
> http://www.legitreviews.com/amd-ryzen-7-1800x-1700x-and-1700-processor-review_191753/5


It was always obvious that there would be a significant performance hit from one CCX accessing the cache of the other. Several people, including myself, pointed this out as soon as the architectural details were released at Hot Chips.

OSes and applications will need to take this into account for best performance.

Embarrassingly parallel apps won't care because their threads don't need each other's data often. This is one of the reasons why rendering and transcoding are so damn fast on Ryzen...there is no bottleneck because the cores don't need to talk to each other to get stuff done.

However, it's also apparent that some apps have major issues because threads are not being scheduled wisely. Anything that needs to share data frequently needs to be scheduled to the same CCX.

Comments I made back in August have evidently been vindicated:
Quote:


> Originally Posted by *Blameless*
> 
> ~100GB/s interconnect between the two modules may sound fast at first glance, and is faster than past interconnects they have used, but it's nowhere near as fast as a unified LLC could be.
> 
> Having the parts with more than four cores be split into multiple modules is not a performance advantage, relative to the competition, but a weakness that must be overcome. Hopefully the interconnect they use is fast and low latency enough to mask any issues.


http://www.overclock.net/t/1579520/zen-hyper-thread/560#post_25456791
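The "schedule data-sharing threads to the same CCX" point can be sketched with CPU affinity. This is a minimal illustration under an assumed enumeration (physical cores 0-7 with SMT siblings at +8; the real mapping depends on the OS, check `/sys/devices/system/cpu/*/topology` on Linux), and `ccx_cpuset` is a made-up helper, not an OS API:

```python
import os

# Hypothetical logical-CPU layout for an 8C/16T Ryzen 7: cores 0-3 on CCX0,
# cores 4-7 on CCX1, SMT sibling threads enumerated at core_id + 8.
CORES_PER_CCX = 4
TOTAL_CORES = 8

def ccx_cpuset(ccx: int) -> set:
    """Logical CPUs (both SMT threads) belonging to one CCX under the layout above."""
    cores = range(ccx * CORES_PER_CCX, (ccx + 1) * CORES_PER_CCX)
    return {c for core in cores for c in (core, core + TOTAL_CORES)}

# Keep cooperating threads on CCX0 so they share the same 8MB L3
# instead of bouncing data across the inter-CCX fabric.
if hasattr(os, "sched_setaffinity") and (os.cpu_count() or 0) >= 16:  # Linux only
    os.sched_setaffinity(0, ccx_cpuset(0))  # pin current process to CCX0

print(sorted(ccx_cpuset(0)))  # [0, 1, 2, 3, 8, 9, 10, 11]
```

An OS scheduler doing this automatically is exactly the "take this into account" work the post describes; pinning by hand is the workaround until then.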


----------



## n4p0l3onic

Quote:


> Originally Posted by *umeng2002*
> 
> Ideally, the GPUs should be at 98%+ with vsync off.
> 
> That's a totally fed GPU.
> 
> Anything less, and it's an issue where the CPU can't feed the GPU fast enough... whether or not the CPU is at 40% or 85% utilization.


I disagree; not all games should tax the GPU.


----------



## umeng2002

Quote:


> Originally Posted by *n4p0l3onic*
> 
> I disagree, not all games should tax gpu.


k... but with vsync off, any game can get hundreds of fps... as fast as the CPU can feed the GPU frames...


----------



## Xuper

Computerbase : Ryzen is King in Applications, Prince in Games
Gamersnexus : Ryzen is An i5 in Gaming, i7 in Production

I like how computerbase is professional...


----------



## kzone75

I have no intention of playing games at 720p, so I think I'll grab a 1700 and the Aorus AX370 Gaming 5 or the MSI X370 Gaming Pro Carbon early next month.


----------



## th3illusiveman

My next CPU is going to be a RYZEN. Not entirely happy with AMD keeping its gaming performance hushed while hyping their synthetic benchmarks, but that's business I guess. Intel gets enough money, and if AMD can fix the issues they have with RYZEN in their gen 2 products then we all win. Besides, it seems even with its "gimped" 1080p performance it's still faster than Sandy Bridge at its worst and significantly better at its best. With DX12 coming up, I should be fine.


----------



## redone13

Quote:


> Originally Posted by *kzone75*
> 
> I have no intention of playing games at 720p, so I think I'll grab a 1700 and the Aorus AX370 Gaming 5 or the MSI X370 Gaming Pro Carbon early next month.


If you had read anything, the point of the 720p discussion was to find what resolution truly shows the potential of the CPU while benching. It's not about gaming at 720p.


----------



## dragneel

Quote:


> Originally Posted by *th3illusiveman*
> 
> My next CPU is going to be a RYZEN. Not entirely happy with AMD keeping its gaming performance hushed while hyping their synthetic benchmarks, but that's business I guess. Intel gets enough money, and if AMD can fix the issues they have with RYZEN in their gen 2 products then we all win. Besides, it seems even with its "gimped" 1080p performance it's still faster than Sandy Bridge at its worst and significantly better at its best. With DX12 coming up, I should be fine.


Think I might have been right a month or so ago, when I said AMD was going to pull in a lot of Sandybridge owners.


----------



## PsyM4n

Perfectly happy with the 3930k here. Not upgrading until there's something at least 2 times faster out. That's been my requirement for upgrading for two dozen years now, yeah, since 1993.


----------



## donrapello

Quote:


> Originally Posted by *PsyM4n*
> 
> Perfectly happy with the 3930k here. Not upgrading until there's something at least 2 times faster out. That's been my requirement for upgrading for two dozen years now, yeah, since 1993.


Yeah, I was going to replace my 3930 @ 4.7GHz with Ryzen but am having second thoughts now. I'm upgrading to a 1080 Ti when available, and if these still bottleneck it, why bother?
I wonder how these perform in Arma 3...


----------



## PsyM4n

Quote:


> Originally Posted by *donrapello*
> 
> Yeah, I was going to replace my 3930 @ 4.7ghz to Ryzen but having second thoughts now. Upgrading to 1080ti when available and if these still bottleneck it, why bother?
> I wonder how these perform in Arma3..


Don't get me wrong. If I was to get a PC to replace a 10 year old Phenom or Core 2 system, I would go for Ryzen. For people who only plan to get one computer to do everything, it's the best deal currently on the market, and it will be like that for at least 6 to 12 months.

The 3930k here has like double the bandwidth of the Ryzen, both in ram speed and PCI express. It's not even 50% slower when fully loaded in really heavy multi-threaded applications either. For gaming, the GPU is the bottleneck even if having a far newer processor. In practice, "upgrading" to anything different than an LGA2011-3 system with 40 PCIe lanes on the CPU (those CPUs cost like 600$) will be a downgrade on at least one department. So upgrading is so far out of the question.


----------



## ChronoBodi

Why does this one benchmark favor Ryzen compared to almost all other reviews?


----------



## Arturo.Zise

Looks like I'm the perfect candidate for a Ryzen R7 CPU. I game at 4K and I do x264 Video encoding. And thanks to Intel's price gouging, the money I save on not buying a 6900k will go towards a 1080ti or Vega GPU.

Ryzen isn't all doom and gloom, people.


----------



## Catscratch

Quote:


> Originally Posted by *ChronoBodi*
> 
> 
> 
> Why does this one benchmark favor Ryzen compared to almost all other reviews?


I found one other, but it doesn't show a Ryzen lead. However, it has SMT-on and SMT-off results and an 8350 score, which shows how much Ryzen jumped in performance.

http://www.hardware.fr/articles/956-17/jeux-3d-project-cars-f1-2016.html


----------



## ChronoBodi

Quote:


> Originally Posted by *Catscratch*
> 
> I found one other but it doesn't show a ryzen lead. However it has SMT and SMT off and 8350 score which shows how much ryzen jumped in performance.
> 
> http://www.hardware.fr/articles/956-17/jeux-3d-project-cars-f1-2016.html


Seems games get confused with AMD's SMT being on. With it off, it games better.

So, clearly every game has no idea what Ryzen is and is basically using a non-optimized codepath for a "mystery CPU".

And yet we have the choir of "OMG 7700k beats 1800x! AMD FAIL!" when we're talking like, minor fps differences that most people wouldn't even care to notice. Not everyone is a CSGO 240 hz freak.

It can game fine, it's probably like my 5960x in gaming, and the real comparison should be against 6900k, the $1000 competitor.


----------



## _Chimera

Are people really ruling out Ryzen already? It's brand new, people: no good drivers, no optimizations, most software is confused as hell and probably doesn't even try to use all the new goodies Ryzen packs.

Holy batman


----------



## hawker-gb

Quote:


> Originally Posted by *_Chimera*
> 
> Are people really ruling out Ryzen already? It's brand new, people: no good drivers, no optimizations, most software is confused as hell and probably doesn't even try to use all the new goodies Ryzen packs.
> 
> Holy batman


Only hardcore Intel fans dismiss it.


----------



## mesrine

Quote:


> Originally Posted by *ChronoBodi*
> 
> Seems games get confused with AMD's SMT being on. Off, it games better.
> 
> So, clearly every game has no idea what Ryzen is and is using non-optimized codepath for "mystery cpu" basically.
> 
> And yet we have the choir of "OMG 7700k beats 1800x! AMD FAIL!" when we're talking like, minor fps differences that most people wouldn't even care to notice. Not everyone is a CSGO 240 hz freak.
> 
> It can game fine, it's probably like my 5960x in gaming, and the real comparison should be against 6900k, the $1000 competitor.


So which one of the Ryzen processors should be compared against the 7700k in gaming performance? Just curious.


----------



## CULLEN

ITT:

6900K is a great CPU, keep it.

Ryzen is not for gaming.


----------



## Travieso

GamersNexus' headline is quite misleading and really confusing.

"An i5 in Gaming, i7 in Production"

When people see "i7 in production", they're going to think that Ryzen is about equal to the i7 7700K in productivity.

No, it eats the 7700K alive in productivity, it's not even comparable, and it loses in gaming.

They should have used better wording.


----------



## Tobiman

Quote:


> Originally Posted by *ChronoBodi*
> 
> 
> 
> Why does this one benchmark favor Ryzen compared to almost all other reviews?


He was using a 980 Ti, which is a bit slower than a 1080 if it's not overclocked.


----------



## Kuivamaa

Quote:


> Originally Posted by *ChronoBodi*
> 
> 
> 
> Why does this one benchmark favor Ryzen compared to almost all other reviews?


It isn't different from the others. In the Computerbase review, Ryzen also stomps the 7700k in this game.

https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/#diagramm-f1-2016-fps

In my book, Ryzen is a good upgrade from anything 3770k or earlier for gaming.


----------



## ChronoBodi

Quote:


> Originally Posted by *mesrine*
> 
> So which one of the Ryzen processors should be compared against 7700k in gaming performance. Just curious.


The Ryzen 3 series are 4c/8t Ryzens against the 7700Ks,
the Ryzen 5 series are 6c/12t Ryzens against the 6850Ks,
the Ryzen 7 series are 8c/16t Ryzens against the 6900K.


----------



## Wishmaker

Quote:


> Originally Posted by *ChronoBodi*
> 
> *the Ryzen 3 series are 4c/8t Ryzens against the 7700ks*,
> the ryzen 5 series are 6c/12t Ryzens against the 6850Ks.
> the ryzen 7 series are 8c/16t Ryzens against 6900k.


Well, AMD themselves said they are 10% behind KBL and will fix it with Zen+. The 7700 will be on top in gaming, that is for sure. They won't have the old chestnut that games don't yet support more than 4 cores; it will be on even footing with Intel in terms of cores and threads. Frequency and IPC will make the difference.


----------



## Shatun-Bear

No way I'd buy a 4-core/8-thread CPU in 2017. 4C/8T is the new 2C/4T.

Ryzen will get much faster in 1080p gaming, and at 1440p and up, or using cards below a 1080, it's indistinguishable from a 7700K.

The same people who declared the 480 a 'fail' at launch for its various issues and crowned the 1060 king, then disappeared as soon as the 480 closed that performance gap, are the ones reacting hyperbolically to Ryzen.


----------



## Shatun-Bear

Quote:


> Originally Posted by *Wishmaker*
> 
> Well, AMD themselves said they are 10% behind KBL and will fix it with Zen+. The 7700 will be on top in gaming, that is for sure. They won't have the old chestnut that games don't yet support more than 4 cores; it will be on even footing with Intel in terms of cores and threads. *Frequency and IPC will make the difference.*


No, price will. Ryzen 3 is going to be less than half the price of the overpriced 7700K.


----------



## Pro3ootector

Turning SMT off increases performance by ~10% in lightly threaded games.

http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/


----------



## Shatun-Bear

Quote:


> Originally Posted by *Pro3ootector*
> 
> 
> 
> Turning SMT off increases performance by ~10% in lightly threaded games.


source?


----------



## Wishmaker

Quote:


> Originally Posted by *Shatun-Bear*
> 
> No price will. Ryzen 3 is going to be less than half the price of the overpriced 7700K.


Then pray that INTEL does not slash the price on the 7700 because they can make profit on that even at 230 dollars per unit, according to the latest napkin math on other forums.


----------



## Travieso

Quote:


> Originally Posted by *Wishmaker*
> 
> Then pray that INTEL does not slash the price on the 7700 because they can make profit on that even at 230 dollars per unit, according to the latest napkin math on other forums.


Slashing prices twice in the span of two months isn't a good business decision.

They already slashed prices across their whole CPU lineup; they need to come up with another strategy.


----------



## Catscratch

Quote:


> Originally Posted by *ChronoBodi*
> 
> Seems games get confused with AMD's SMT being on. Off, it games better.
> 
> So, clearly every game has no idea what Ryzen is and is using non-optimized codepath for "mystery cpu" basically.
> 
> And yet we have the choir of "OMG 7700k beats 1800x! AMD FAIL!" when we're talking like, minor fps differences that most people wouldn't even care to notice. Not everyone is a CSGO 240 hz freak.
> 
> It can game fine, it's probably like my 5960x in gaming, and the real comparison should be against 6900k, the $1000 competitor.


It seems, not just windows nor games. Linux too.

http://git.kernel.org/cgit/linux/kernel/git/tip/tip.git/commit/?id=08b259631b5a1d912af4832847b5642f377d9101

"our SMT scheduling topology for Fam17h systems is broken, because the ThreadId is included in the ApicId when SMT is enabled."


----------



## poinguan

Quote:


> Originally Posted by *Travieso*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wishmaker*
> 
> Then pray that INTEL does not slash the price on the 7700 because they can make profit on that even at 230 dollars per unit, according to the latest napkin math on other forums.
> 
> 
> 
> slashing price twice in the span of 2 months isn't a good business decision.
> 
> they already slashed the price of their cpus for the whole lineup, they need to come up with another strategy.

Microcode to unlock overclocking like the K CPUs would be nice.


----------



## TheReciever

Quote:


> Originally Posted by *poinguan*
> 
> Microcode to unlock overclocking like the K CPUs would be nice.


Doubt it; they still issue takedown notices on Haswell unlocks.


----------



## Samuris

Quote:


> Originally Posted by *Samuris*
> 
> I can return my i7 7700K on Amazon and get an R7 1700, thanks to Amazon's return policy, even though I've delidded and relidded it, applied Thermal Grizzly Conductonaut, and lowered the VCCPLL overvoltage on my motherboard to 1.17V to get a 100% stable i7 7700K at 5GHz/1.3V that never exceeds 45° in Prime95. So I'm lost, because I got a good chip with this i7 and it seems to perform better in games, but Ryzen probably has a better future and performs better in applications. Any advice on what I should do?


Quote:


> Originally Posted by *dir_d*
> 
> Keep the i7


And what about returning it and buying this? After a 4.5GHz OC I won't notice the difference from my Kaby, and I keep $280 toward a Ryzen if I want a second config.


----------



## HeadlessKnight

Back in the early 00s Intel used to sell $999 EE P4s even when they were getting spanked by AMD. I am not very sure Intel will drop their prices easily, unless Ryzen really threatens them and starts stealing significant market share.

While Ryzen is competitive overall, it still lags behind the mainstream 7700k by a good chunk in single-threaded games, and this seems a turn-off for most mainstream users, as they don't care about productivity as much as gaming. Overclockers also don't like it because it has little to no overclocking headroom.

Intel has a very bad history when it comes to slashing prices, and I am not very confident Ryzen will make much difference.


----------



## JackCY

Intel can sell their stuff for whatever they want. I'm not recommending them anymore as long as there is an alternative from someone else that offers a better performance/price ratio, even if it's up to 20% slower but still more than enough not to notice any issues. Once the 4-6C Ryzen CPUs launch in a few months there will be pretty much zero need to buy Intel CPUs from my POV, unless you want more than one CPU in your machine, and even that may change when AMD launches its server line of CPUs. ECC also seems to be supported, from what reviewers say about Ryzen: the chip itself supports it according to AMD, but no one knows whether it is, or will be, enabled by the UEFI. We'll see how that pans out.

Right now my recommendation: low budget = avoid, save more, and wait for the Ryzen 4C/8T; mid = i5 4C/4T, pick your poison until the Ryzen 6C/12T arrives; high = Ryzen 1700 8C/16T, since anything higher is a waste of money for most people.
I wouldn't have recommended 4C i7s three years ago, I certainly won't today, and even the 6C+ parts are done now.


----------



## Imglidinhere

Games favor fewer higher clocked threads than several slower clocked ones. This has always and will continue to always be the case. To believe anything otherwise is silly.

The issue with this launch is that I think the hype train was so great that everyone thought AMD would suddenly release a product that would trounce everything Intel had. They certainly _have_ released a heavy upgrade from anything that the FX line could manage, but I feel like the platform is too new to judge it right now. I mean, they haven't even fixed the memory issues and updated the BIOS in most machines to compensate for any irregularities yet. By 'they' I mean the manufacturers and whatnot. One other thing that not many people seem to understand is that AMD had a MASSIVE performance gap to try and make up. Even though the IPC difference between Sandy and Kaby is only about 30%, you have to understand that even doubling per-core performance wasn't enough to fully close the gap up to Intel. Despite the lack of innovation from the blue team, AMD poured everything they could into this CPU and, realistically speaking, they made a chip that is comparable to what Intel has right now... but it still wasn't quiiite enough. Their flagship CPUs have never been what people buy anyway, people always settle on the midrange parts.

I guess what I'm trying to say is... don't take this as another Pentium 4 or Bulldozer-style launch. These CPUs have **not** failed in any way. The 1800X is a VERY powerful chip and an IPC increase of >50% is nothing short of astounding... so keep in mind that the end-game, finalized revisions aren't here yet. This did spook Intel enough to warrant some applause to AMD's launch. _(trying to sound as neutral here as possible btw)_ This is just the start of this architecture. They're miles ahead of where they were and they DID close the gap to within about 10-15% instead of being literally 45-50% of what the i7/i5 sit at. That's nothing short of incredible and I honestly hope, I really really hope that people start to see exactly what this means. It means you can buy AMD without the worry that you're getting screwed out of potential performance in games and other applications. Gaming on an 8-core has always been gimmicky because it's such a goofy thing to try and do. No sane dev will ever use more than 8 threads at most since most games simply just don't NEED anything more than that, what with how powerful these CPUs are nowadays.

I feel like everyone was honestly expecting AMD to match Intel stride for stride, clock for clock, when the IPC speculation threads from over a year ago actually showed the exact scores we're seeing RIGHT NOW, which places these CPUs right at Haswell level in all fairness.

It's LITERALLY exactly where they need to be in order to be competitive. As is, AMD is losing the frequency battle for clocks... but that's how Phenom II was with the initial C2 stepping. The C3 stepping showed 4GHz overclocks across the board with minimal tweaks, and even further optimization over the following years showed vast improvements to what they could manage within the same TDP. Hell, how many gamers do you think are dissatisfied with their Thuban CPUs running 4GHz? Consider that. _(bit of a moot point but whatevs <3)_

My hope, as I can only actually HOPE for stuff now, is that we'll see revisions of these CPUs and they'll bring greater clockspeeds once the silicon matures and the process grows easier. The one thing I'm curious about is exactly how far off is AMD though... I've noticed one massive thing when it comes to ALL of these reviews... and it's that the 7700K CPUs are ALL at 5GHz under these benches, and the Ryzen chips are at 4GHz or slightly under... Like... a 1GHz lead makes a massive difference. *I really want to see what the performance is when you compare these CPUs at the same clock speed.* I say that simply because _AMD seems to be getting some odd heat for having lackluster performance, so compare them at the same clocks... that way we can really and truly tell where these CPUs are lacking and how much we have to compensate for that (higher frequencies and overclocks) before it levels out._ If we can see that differential, then we know where AMD stands and if it's worth the purchase. The IPC performance IS there. We can see it clear as day... we just have to identify how to maximize the CPU's strengths now, and to do that, we need to see what kind of performance delta is there between Intel and AMD, even if it IS unrealistic.


----------



## JackCY

Comparisons at equal clocks and even core/thread counts are available, just not in the popular reviews; you've got to search a bit and hunt down the less popular ones that put more work in and made decent comparisons.
Ryzen is where AMD advertised it: at HW/DC/BW levels of performance at equal clocks and cores/threads, which is right where Intel is now except for the 4C parts. Intel has optimized its arch forever with tiny gains, managed to milk it and clock it high, and it also has its own manufacturing and more options with more $$$ to burn.
AMD reaching this level of performance, on their first 14nm product, without owning the fab, with so much less $$$, is awesome.


----------



## Shatun-Bear

https://www.youtube.com/watch?v=V5RP1CPpFVE&t=338s Joker Productions

There is still no explanation why his benchmarks at 1080p are very, very close to the 7700K. Anyone? Surely it wasn't because all the other reviewers were using borked settings, or was it? He backs up his benches with side-by-side shots as well.


----------



## SoloCamo

Quote:


> Originally Posted by *Imglidinhere*
> 
> Games favor fewer higher clocked threads than several slower clocked ones. This has always and will continue to always be the case. To believe anything otherwise is silly.


I stopped reading there. This same argument was made when we went to dual cores, then again when quads came out, and the same is being said now that octo-cores are hitting the mainstream. Consoles have 8 cores and I highly doubt any future iterations will use fewer. This translates into games utilizing more cores, especially when the consoles are working with 8 extremely slow cores compared to what we have available on the PC end. If they want decent performance they have to use all the threads they can.

It's not silly, it's just progress. There are already plenty of titles/engines that use more threads. Using the blanket term "games" is not fair to potential buyers. I'd certainly take a Ryzen over my 4790k (or 7700k) in Battlefield 1, which is the main title I currently play.

Moving to a higher core count takes time, but it is happening and has happened. New engines are being designed to take advantage of it, and it gets old to see people say "well see, X game on an engine built many years ago only uses two threads, more cores are pointless for gaming".


----------



## ducegt

Quote:


> Originally Posted by *Shatun-Bear*
> 
> 
> 
> 
> 
> There is still no explanation why his benchmarks at 1080p are very, very close to the 7700K. Anyone? Surely it wasn't because all the other reviewers were using borked settings, or was it? He proves his benches by the side by side shots as well.


Maybe the 7700K settings were borked...


----------



## Tobiman

In those benches, I think the GPU was more of the bottleneck.


----------



## comagnum

Anyone wanna buy my skylake setup so I can rejoin team red?


----------



## IRobot23

AMD should make a $500 CPU for gamers:
4C/8T ($750 for 6C/12T) Ryzen 2 (+15% IPC), 4.5GHz at 115W TDP, with 16GB of HBM2, no need for DDR4.
Insane speed.


----------



## SkiesOfAzel

Quote:


> Originally Posted by *Travieso*
> 
> GamersNexus' headline is quite misleading and really confusing.
> 
> "An i5 in Gaming, i7 in Production"
> 
> When people see "i7 in production", they're going to think that Ryzen is about equal to the i7 7700K in productivity.
> 
> No, it eats the 7700K alive in productivity, it's not even comparable, and it loses in gaming.
> 
> They should have used better wording.


The GamersNexus headline is the least of this review's problems. The whole premise of the review is suspect. Ryzen 7 is a high-core-count, low-frequency part that is meant to go against Broadwell-E, but Steve instead chose to center his review almost exclusively around gaming, which even an idiot would guess isn't the R7's primary purpose. He could of course have waited for Ryzen 5 and 3, which are a better fit for gaming tasks, but then he wouldn't have been able to put out an almost comically negative review of Ryzen on launch day.

I don't know why Steve did this, but it's so overdone that it becomes transparent. The guy tried to spin everything in a negative light. Even when he was forced to concede a victory to Ryzen, it was always coupled with a but. Ryzen is (only) good for content creation, but you can GPU-accelerate those tasks (a gross over-generalization that borders on lying). He then used overclocking as an argument to claim that the 6900k is faster, even though he knows his motherboard is not in any way, shape, or form production-ready and thus not representative of the 1800x's OC potential. To make matters worse, he put his 3.9GHz 1800x OC against a 4.4GHz 6900k OC, which is in no way average for a 6900k. He even complained about the Blender benchmark settings even though the results it produced were representative of performance. Funniest of all, instead of talking about the price difference between the 6900k and the 1800x, he kept repeating how much cheaper the 7700k is. Pure comedy.
Quote:


> Originally Posted by *Wishmaker*
> 
> Well, AMD themselves said they are 10% behind KBL and will fix it with Zen+. The 7700 will be on top in gaming, that is for sure. They won't have the old chestnut that games don't yet support more than 4 cores; it will be on even footing with Intel in terms of cores and threads. Frequency and IPC will make the difference.


AMD said they are 6% behind KBL in IPC, not 10%. Zen+ IPC target gains are around 10% though.
Quote:


> Originally Posted by *Wishmaker*
> 
> Then pray that INTEL does not slash the price on the 7700 because they can make profit on that even at 230 dollars per unit, according to the latest napkin math on other forums.


Why would anyone pray for that? The whole point of competition is that it drives prices down and innovation up.


----------



## Imglidinhere

Quote:


> Originally Posted by *SoloCamo*
> 
> I stopped reading there.


Then you're an idiot. If you read the actual post then you'll understand that I'm not saying a dual core at 12GHz is better than a quad core at 3Ghz, It's already proven it doesn't work that way. I was comparing the i7 7700K which was run at 5Ghz and the R7 1700 was run at 3.9GHz. Sorry but there's a massive clock differential there, and we don't know how heavily the games were influenced by the higher clockspeeds. More games RIGHT NOW use at MOST 8 threads to their fullest extent than they do with 16 threads. That's an irrefutable FACT. It has been for the past five years at least and will stay like that for a while yet. There are only a pair of games that can use upwards of 16 threads and of which one of them is objectively good, the other is objectively 'meh' in my opinion (BF4 and BF1 respectively).

Several games will use any i7 to its fullest right now, that's not being disputed, but almost NO games will use a Ryzen CPU to its fullest, simply because we cannot test that. There are no games out there that will push a CPU with over 12 threads to its knees and then some. No game NEEDS that much CPU horsepower right now. Again, that's another irrefutable _fact_.


----------



## Kuivamaa

Quote:


> Originally Posted by *Shatun-Bear*
> 
> https://www.youtube.com/watch?v=V5RP1CPpFVE&t=338s Joker Productions
> 
> There is still no explanation why his benchmarks at 1080p are very, very close to the 7700K. Anyone? Surely it wasn't because all the other reviewers were using borked settings, or was it? He proves his benches by the side by side shots as well.


All others? Computerbase.de, which is one of the most reliable sites, benched the 1800x stock 1-2% lower than the 7700k, in other words agreeing with Joker.

https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/#abschnitt_benchmarks_mit_fps_und_frametimes


----------



## Brutuz

Quote:


> Originally Posted by *umeng2002*
> 
> Completely wrong.
> 
> For a CPU comparison, you need to be CPU bound.
> 
> Today 4K is GPU bound. In 2 or 3 years when Ryzen owners upgrade their GPUs, 4K gaming might not be GPU bound with headroom to spare. This is where Ryzen might lag behind Intel if BIOS patches and game optimizations don't take place.
> 
> The only reason why 1440p and up show little difference is that the GPU is the overriding factor, today.
> 
> Would I have done 720p on low? No. I would have done a test at 720p with Ultra settings.


It's unlikely that Ryzen will age faster than Kaby Lake; in 2-3 years DX12 and Vulkan will have much greater marketshare and allow Ryzen to use its extra cores for greater speed. BIOS patches and cheaper upgrades, because AMD tends to stick with a socket (e.g. get a 1700 and X370 now, buy a new GPU and a Zen 3/4-based CPU later, as plenty of people did with AM2+ Athlon 64s and Phenom IIs), mean it'll be the better option for longevity, as AMD tends to be when they're this close to Intel.

It's funny how, the more time goes on since Ryzen's launch, the more this feels like the E8400's launch... People debating whether a few extra FPS now is worth the chip with fewer cores, while everything is slowly tipping toward using more. Personally, I'm going to side with history... Hell, even Intel is increasing core counts on their mainstream socket with Coffee Lake, so they must see some reason to do that.
Quote:


> Originally Posted by *Mad Pistol*
> 
> Independent of the software, there are 2 major components in game testing nowadays
> 
> 1. *GPU* - Affects visual settings
> 2. *CPU* - Affects output of game data and response
> 
> -so-
> 
> If you want to cause a GPU bottleneck, turn up the in-game visual settings
> If you want to cause a CPU bottleneck, turn down the in-game visual settings and resolution
> 
> Now... there is a point where a CPU or GPU can be slow enough that it can negatively affect gameplay. What Joker's video proves is that a Ryzen 8-core will not slow down a high-end graphics card to any level that should be detrimental to the gaming experience. Couple this fact with the fact that AMD's R7 CPUs have more cores and more threads than the i7 7700k, and the R7 1700 is not only the more future-proof solution, but it also represents a massive value proposition (The R7 1700 is $330. The i7 4790k is $350).
> 
> If I still had my FX 8320 from a couple of years ago, you can bet your butt that I would be getting a Ryzen setup without hesitation.


This. I'm sure you remember the Q6600 vs E8400 debates that used to go on here; it's exactly the same... If you plan on doing a major system upgrade in 2-3 years, go for the 7700k. If you want a build that'll happily be gaming at medium-high in 5 years with minimal upgrades, go for a 1700.


----------



## Pro3ootector

Ryzen is my next notebook CPU.


----------



## Mahigan

Quote:


> Originally Posted by *Kuivamaa*
> 
> All others? Computerbase.de, which is one of the most reliable sites, benched the 1800x stock 1-2% lower than the 7700k, in other words agreeing with Joker.
> 
> https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/#abschnitt_benchmarks_mit_fps_und_frametimes


Something is not right. Joker is no AMD fan (he's actually pretty anti-AMD usually), and neither is Computerbase.de. Seems to me that some shady things may be afoot. Doesn't surprise me one bit. Some of these "reviewers" love the Intel money sent their way.

Some reviewers even ran at 480p because Intel told them to. At that res the Intel CPUs are faster but...

1. who games at 480p?
2. The future is more cores, so as games become more complex Ryzen, not Kaby Lake, will pull ahead.
3. Ryzen is faster at nearly everything else.

Why buy Intel? Well... maybe if you overclock, but I just saw two 5.6+ GHz overclocks on Ryzen (new world records too). So it does seem that while Ryzen is perhaps harder to overclock, once overclocked it is a superior architecture.


----------



## -Sweeper_

AMD marketing (Blender demo, render time in seconds):

Intel Broadwell-E
Core i7-6900K
8C / 16T
3 GHz all-core: 49.05 s

AMD ZEN
Engineering Sample
8C / 16T
3 GHz all-core: 48.07 s
(-0.98 s, 1.998%)

Tom's review:



Did they sabotage Intel in their demonstration?
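For reference, the slide's delta is at least internally consistent; a quick check of the figures above (a sketch, nothing more):

```python
# Render times from AMD's demo slide (seconds; lower is better).
intel_6900k = 49.05
amd_zen_es = 48.07

delta = intel_6900k - amd_zen_es        # 0.98 s in Zen's favor
pct = delta / intel_6900k * 100         # Zen's advantage over the 6900K

print(f"{delta:.2f} s, {pct:.3f}%")     # matches the quoted -0.98 s / 1.998%
```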


----------



## dir_d

Quote:


> Originally Posted by *Samuris*
> 
> And what about returning it and buying this? After a 4.5GHz OC I won't notice the difference from my Kaby, and I keep $280 for Ryzen if I want a double config.


I don't think it's worth the effort, you have a solid machine that should last you a long time. Just use it and be happy.


----------



## cssorkinman

Quote:


> Originally Posted by *Kuivamaa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Shatun-Bear*
> 
> https://www.youtube.com/watch?v=V5RP1CPpFVE&t=338s Joker Productions
> 
> There is still no explanation why his benchmarks at 1080p are very, very close to the 7700K. Anyone? Surely it wasn't because all the other reviewers were using borked settings, or was it? He proves his benches by the side by side shots as well.
> 
> 
> 
> All others? Computerbase.de, which is one of the most reliable sites, benched the 1800x stock 1-2% lower than the 7700k, in other words agreeing with Joker.
> 
> https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/#abschnitt_benchmarks_mit_fps_und_frametimes

I think some of it is out-and-out bias, but I'm also sure that many reviewers are just not used to playing with anything other than Intel.

On the FX platform, getting the right RAM and then tuning it is key to good performance - too bad that seems to be where the new platform needs the most work.

I really can't believe anyone would call Ryzen bad at gaming - for goodness' sake, the FX could be made to maintain minimum frame rates above the refresh rate of 90% of the monitors people own in all but a few games (and in most of those games where it couldn't, neither could Intel chips), and look how much of an improvement Ryzen is being shown to be.

I guess what we should look at on those sites showing Ryzen and the 7700k to be very close in gaming is the numbers the Intel is getting - are they as good as at the other sites?


----------



## SkiesOfAzel

Quote:


> Originally Posted by *Mahigan*
> 
> Something is not right. Joker is no AMD fan (he's actually pretty anti-AMD usually), and neither is Computerbase.de. Seems to me that some shady things may be afoot. Doesn't surprise me one bit. Some of these "reviewers" love the Intel money sent their way.
> 
> Some reviewers even ran at 480p because Intel told them to. At that res the Intel CPUs are faster but...
> 
> 1. who games at 480p?
> 2. The future is more cores, so as games become more complex Ryzen, not Kaby Lake, will pull ahead.
> 3. Ryzen is faster at nearly everything else.
> 
> Why buy Intel? Well... maybe if you overclock, but I just saw two 5.6+ GHz overclocks on Ryzen (new world records too). So it does seem that while Ryzen is perhaps harder to overclock, once overclocked it is a superior architecture.


There is no widespread conspiracy, but there was an ill-prepared launch: motherboard issues, scheduler issues, power management issues, etc. Ryzen was expected to be slower in games that don't really scale past 8 threads (which is most of them), but it was expected to trail by ~15% due to IPC and frequency differences at stock. The fact that this 15% has become 30% in some reviewed cases is mostly a product of those launch issues.

What does not make sense is slamming the Ryzen 7 series based on gaming benchmarks. It's like complaining that the dude that won the marathon is bad at sprinting.

As for overclocking, 4GHz is already a feat with the process these are made on. The overclocks you see are on LN2; you won't easily see Ryzen go past 4GHz on safe voltages. I don't even expect the 4-core Ryzen 3 to easily go past 4.2GHz, but I will be glad to be proven wrong.


----------



## cssorkinman

Quote:


> Originally Posted by *-Sweeper_*
> 
> AMD marketing:
> 
> Intel Broadwell-E
> Core i7-6900K
> 8C / 16T
> 3 GHz all-core 49.05
> 
> AMD ZEN
> Engineering Sample
> 8C / 16T
> 3 GHz all-core 48.07
> (-0.98 sec, 1.998%)
> 
> Tom's review:
> 
> 
> 
> Did they sabotage Intel in their demonstration?


No, but I suspect they did have the Ryzen running better than Tom's did.

EDIT: That's some mighty good scaling on the 6900k @ 3.8GHz - 600MHz over base, probably 300 over all-core turbo - and they get that kind of improvement? Something is VERY fishy.


----------



## Kuivamaa

Quote:


> Originally Posted by *cssorkinman*
> 
> No, but I suspect they did have the Ryzen running better than Tom's did.
> 
> EDIT: That's some mighty good scaling on the 6900k @ 3.8GHz - 600MHz over base, probably 300 over all-core turbo - and they get that kind of improvement? Something is VERY fishy.


The AMD demo used 150 samples, not 200 like Tom's table. I would test Ryzen myself to verify AMD's numbers, but my order is still waiting on the mobo. That said, users have confirmed AMD's numbers on their Intel rigs, so I don't know what Tom's is doing, but its Intel numbers seem inflated for 200 samples, even at 3.8.


----------



## cssorkinman

Quote:


> Originally Posted by *Kuivamaa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> No, but I suspect they did have the Ryzen running better than Tom's did.
> 
> EDIT: That's some mighty good scaling on the 6900k @ 3.8GHz - 600MHz over base, probably 300 over all-core turbo - and they get that kind of improvement? Something is VERY fishy.
> 
> 
> 
> The AMD demo used 150 samples, not 200 like Tom's table. I would test Ryzen myself to verify AMD's numbers, but my order is still waiting on the mobo. That said, users have confirmed AMD's numbers on their Intel rigs, so I don't know what Tom's is doing, but its Intel numbers seem very inflated for 200 samples.

Lol check their scaling on the 6900k at 3.8 vs stock.... they dun goofed - wonder what their excuse will be?


----------



## Blameless

Quote:


> Originally Posted by *cssorkinman*
> 
> Lol check their scaling on the 6900k at 3.8 vs stock.... they dun goofed


Yep, that's far better than linear scaling, which is impossible.


----------



## AmericanLoco

Quote:


> Originally Posted by *cssorkinman*
> 
> Lol check their scaling on the 6900k at 3.8 vs stock.... they dun goofed - wonder what their excuse will be?


Apparently with Intel, a 15% increase in clocks yields a 35% performance increase.


----------



## sugalumps

Every single AMD launch here is met with the same two defenses: there is a grand conspiracy out to get AMD and keep them down, and constant goalpost-moving. The same thing happened with the 480 launch. Wait for this, wait for that: "AIB partners will improve it", "teething issues, motherboards are holding it back", "wait for the 4 core", etc. etc.

It's a great chip, but it falls a little short of people's expectations from the last few weeks here, which is not the end of the world - but people are moving goalposts for sure.


----------



## cssorkinman

Quote:


> Originally Posted by *Blameless*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> Lol check their scaling on the 6900k at 3.8 vs stock.... they dun goofed
> 
> 
> 
> Yep, that's far better than linear scaling, which is impossible.

That's certainly been my reality with this hobby

Quote:


> Originally Posted by *AmericanLoco*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> Lol check their scaling on the 6900k at 3.8 vs stock.... they dun goofed - wonder what their excuse will be?
> 
> 
> 
> Apparently with Intel, a 15% increase in clocks yields a 35% performance increase

Thanks for doing the math.


So if it was 600MHz higher than stock (best case for the 6900k), what would perfect scaling be expected to yield?
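As a rough answer, assuming performance scales at best linearly with frequency, and taking the 6900K's 3.2GHz base and ~3.5GHz all-core turbo as illustrative figures (check your own chip's clocks):

```python
# Perfect (linear) scaling: performance ~ frequency, so best-case gain
# from a clock bump is just the frequency ratio minus one.
# Clock figures below are assumptions for illustration: i7-6900K at
# 3.2 GHz base / ~3.5 GHz all-core turbo, overclocked to 3.8 GHz.
def expected_speedup(base_ghz, oc_ghz):
    """Best-case performance gain from a clock bump, as a percentage."""
    return (oc_ghz / base_ghz - 1) * 100

print(f"vs 3.2 GHz base:     {expected_speedup(3.2, 3.8):.2f}%")  # 18.75%
print(f"vs 3.5 GHz all-core: {expected_speedup(3.5, 3.8):.2f}%")  # 8.57%
```

Even measured against base clock, perfect scaling caps the gain under 19%, so a ~35% jump in a benchmark table is physically implausible.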


----------



## JackCY

Quote:


> Originally Posted by *Shatun-Bear*
> 
> https://www.youtube.com/watch?v=V5RP1CPpFVE&t=338s Joker Productions
> 
> There is still no explanation why his benchmarks at 1080p are very, very close to the 7700K. Anyone? Surely it wasn't because all the other reviewers were using borked settings, or was it? He proves his benches by the side by side shots as well.


I've said it before, I'll say it again: he has one of the more ready boards, a Gigabyte. Most reviews that used ASUS or MSI have even more unresolved issues plaguing their systems - possibly even non-working boost, and definitely old microcode.

Gotta wait a month at least until more decent UEFI is released for all boards. Then the benches need to be done again once the platform is fully supported. That may happen with the launch of the much-wanted 6C/12T.
Quote:


> Originally Posted by *-Sweeper_*
> 
> AMD marketing:
> 
> Intel Broadwell-E
> Core i7-6900K
> 8C / 16T
> 3 GHz all-core 49.05
> 
> AMD ZEN
> Engineering Sample
> 8C / 16T
> 3 GHz all-core 48.07
> (-0.98 sec, 1.998%)
> 
> Tom's review:
> 
> 
> 
> Did they sabotage Intel in their demonstration?


Depends how you test it. Let me run the same test on my machine and let's compare, shall we? I won't even close my hundreds of Firefox tabs or anything else. Here we go...

*4690K @ 4.7/4.7/4.6/4.6GHz, RAM 2.4GHz CL11:*

*47.37s*

Oh wait that beat the stock 6900K, and left the 7700K for dead.
Don't believe me? Well pictures then:





 4.7GHz for 1-2 core usage, 4.6GHz for 3-4 core usage
 v2.78.4
 150 samples official Ryzen bench



You see, it very much depends on which precise version you use. Blender is a piece of crap renderer, and there are CPU-optimized versions that run much faster. For me the SIMD-optimized one runs fastest; the AVX one etc. ran slower. Stilt has shared these versions on OCN before, and anyone who renders in Blender probably has them; otherwise you're looking at double the render time. Even between official Blender versions there can be a noticeable difference. You cannot compare across versions, nor across different compilations, as they use different code; you need the precise version and build to be able to compare with results posted online.

Ok, let me re-run that with 200 samples, since Tom's doesn't even know it should run at 150, not 200.

*62.73s* at 200 samples. Still beating the 7700K.
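A quick sanity check on the two runs above: Cycles render time should scale roughly linearly with sample count, so the 150- and 200-sample times should sit close to the sample ratio (a sketch using the figures above):

```python
# Measured render times from the runs above (seconds).
t_150 = 47.37   # 150 samples (the official Ryzen bench setting)
t_200 = 62.73   # 200 samples (what Tom's apparently used)

measured = t_200 / t_150    # how much longer the 200-sample run took
expected = 200 / 150        # ~1.33x if time scales with sample count

print(f"measured {measured:.2f}x vs expected {expected:.2f}x")
```

Which is also why a 200-sample time can't be compared against 150-sample numbers without scaling it first.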




----------



## TheReciever

Quote:


> Originally Posted by *Imglidinhere*
> 
> Games favor fewer, higher-clocked threads over several slower-clocked ones. This has always been, and will continue to be, the case. To believe anything otherwise is silly.
> 
> The issue with this launch is that I think the hype train was so great that everyone thought AMD would suddenly release a product that would trounce everything Intel had.


Where the hell are people getting this from?

People were anticipating Sandy Bridge-level performance, but with 16 threads of power, and even then most were still looking forward to that level of performance. It's like the hype came in overnight, or I must be missing something.


----------



## CriticalOne

Joker's and Computerbase.de's results line up exactly with where I thought Ryzen would be.

Based on synthetic benchmarks, the 1800x is faster than the 6900K in both single-threaded and multithreaded workloads, and in theory should have been right up there with the 6900k in game benchmarks. When I saw it losing to the 6900k by as much as 20%, I knew something was wrong with the test systems. I think this is where most of the shock and the "disappointment" is coming from.


----------



## JackCY

Quote:


> Originally Posted by *TheReciever*
> 
> Where the hell are people getting this from?
> 
> People were anticipating Sandy Bridge-level performance, but with 16 threads of power, and even then most were still looking forward to that level of performance. It's like the hype came in overnight, or I must be missing something.


They deluded themselves into thinking that double the CPU cores = double the FPS at 1080p ultra details. Hell, even GPUs don't scale that way.

True CriticalOne.


----------



## Game256

If you overclock a Ryzen 7 1700 up to the stock clocks of the Ryzen 1700X, do you basically get a 1700X, or are there still some internal differences that may, for example, wear the 1700 out faster at such high clocks?


----------



## Shatun-Bear

'GamersNexus' should be ashamed. What a poor and completely unforgiving review.


----------



## JackCY

Quote:


> Originally Posted by *Game256*
> 
> If you overclock a Ryzen 7 1700 up to the stock clocks of the Ryzen 1700X, do you basically get a 1700X, or are there still some internal differences that may, for example, wear the 1700 out faster at such high clocks?


It's the same chip, just sold with different clocks out of the box. Same as any other CPU these days really. The 4C and 6C will be the same chip again, just cut down.

Shatun-Bear: it's the hair, it gets in the way, makes him grumpy and overlook things.


----------



## CriticalOne

Quote:


> Originally Posted by *Shatun-Bear*
> 
> 'GamersNexus' should be ashamed. What a poor and completely unforgiving review.


I don't see what's so poor about it. The website and the channel name should have given you some idea what the Ryzen processor was going to be judged on. Steve has already said many times that it's a great CPU for productivity, but for gaming it falls short of the cheaper Core i7 processors.

It's not GamersNexus' job to give AMD an A for effort.


----------



## Game256

Quote:


> Originally Posted by *JackCY*
> 
> It's the same chip, just sold with different clocks out of the box. Same as any other CPU these days really. The 4C and 6C will be the same chip again, just cut down.


Yeah, I know. But I wonder whether it will be absolutely the same when you overclock a 1700 up to the clocks of a 1700X, or whether the 1700 is not meant to be used at such high clocks and it could affect the CPU in a bad way. Maybe there are some specific limitations? The declared TDP, for example, is different.


----------



## JackCY

Quote:


> Originally Posted by *Game256*
> 
> Yeah, I know. But I wonder whether it will be absolutely the same when you overclock a 1700 up to the clocks of a 1700X, or whether the 1700 is not meant to be used at such high clocks and it could affect the CPU in a bad way.


Overclocking is, as always, at your own risk. It's reasonable to expect these chips won't behave much differently from previous node sizes and competing 14nm chips when it comes to longevity. As far as I've seen, AMD recommends 1.35V max, or so they say. There is an LDO, just like in SL/KL, that drops the voltages further; no FIVR like HW/DC. I don't remember what BW and BW-E use - BW must still use a FIVR since it shares boards with HW/DC, no idea about BW-E though. Even for Intel, no one knows how much voltage will cause how much degradation. Of course, the higher you go the worse it gets, and it gets worse faster; usually your cooling is unable to keep up before you find a high-volts 24/7-stable system.
Normally 10% over stock volts is a reasonable safe limit. Say stock is 1.2V, then up to 1.32V is safe. At least that's how I do it if I don't have any recommendations from the manufacturer.
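The rule of thumb at the end can be written down directly; the 1.2V stock figure is just the post's example, not a spec:

```python
# "10% over stock volts" rule of thumb, for when the manufacturer
# publishes no explicit voltage recommendation. Illustrative numbers only.
def vcore_ceiling(stock_v, headroom=0.10):
    """Upper bound for a 24/7 overclocking voltage under the 10% rule."""
    return round(stock_v * (1 + headroom), 3)

print(vcore_ceiling(1.2))   # 1.32, the example from the post
```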


----------



## X-Nine

Quote:


> Originally Posted by *Shatun-Bear*
> 
> 'GamersNexus' should be ashamed. What a poor and completely unforgiving review.


Why, for telling the truth? Steve posted what is quite likely the most in-depth, most tested (and retested) review of any site out there. It's not his fault Ryzen failed to live up to the hype when it comes to gaming, which is exactly what he pointed out. He said it shines in productivity but lacks in the gaming department. What's so wrong with telling the facts?

Some people are taking this so personally, it's ridiculous. AMD has put out a chip that's going to be stellar for work environments, especially at the cost. That's a HUGE step in the right direction. Can you still game on them? Yes, but they won't perform to the level of the cheaper i7 in many instances, and thus, his review (and others who have reported the very same) are only being honest. There's literally no reason to be upset with these reviewers.

It's also a new platform for them, with lots of new and immature technologies that can (and will) get better with age and development.


----------



## Xuper

Quote:


> Originally Posted by *JasonCL*
> 
> Why, for telling the truth? Steve posted what is quite likely the most in-depth, most tested (and retested) review of any site out there. It's not his fault Ryzen failed to live up to the hype when it comes to gaming, which is exactly what he pointed out. He said it shines in productivity but lacks in the gaming department. What's so wrong with telling the facts?
> 
> Some people are taking this so personally, it's ridiculous. AMD has put out a chip that's going to be stellar for work environments, especially at the cost. That's a HUGE step in the right direction. Can you still game on them? Yes, but they won't perform to the level of the cheaper i7 in many instances, and thus, his review (and others who have reported the very same) are only being honest. There's literally no reason to be upset with these reviewers.
> 
> It's also a new platform for them, with lots of new and immature technologies that can (and will) get better with age and development.


How about the 6900K?

Quote:


> It shines in productivity but lacks in the gaming department


Do you agree? Here:

The 6900K losing to the 7700K in gaming perf = OK!

The R7 1800X losing to the 7700K in gaming perf = epic fail.

The big difference between GamersNexus & Computerbase is language. GamersNexus:

Quote:


> Ryzen is An i5 in Gaming, i7 in Production


While computerbase :

Quote:


> Ryzen is King in Applications, Prince in Games


So GamersNexus simply used harsher words.


----------



## rexolaboy

Quote:


> Originally Posted by *JasonCL*
> 
> Why, for telling the truth? Steve posted what is quite likely the most in-depth, most tested (and retested) review of any site out there. *It's not his fault Ryzen failed to live up to the hype when it comes to gaming*, which is exactly what he pointed out. He said it shines in productivity but *lacks in the gaming department*. What's so wrong with telling the facts?
> 
> Some people are taking this so personally, it's ridiculous. AMD has put out a chip that's going to be stellar for work environments, especially at the cost. That's a HUGE step in the right direction. Can you still game on them? Yes, but they won't perform to the level of the cheaper i7 in many instances, and thus, his review (and others who have reported the very same) are only being honest. There's literally no reason to be upset with these reviewers.
> 
> It's also a new platform for them, with lots of new and immature technologies that can (and will) get better with age and development.


I think people were taken aback by Steve's review because generally people trust him to put out unbiased stuff, but his numbers were so low and his critiques so harsh that it seemed weird. To most people, seeing a CPU absolutely nail Intel in professional workloads and then have such amazing minimum FPS in gaming but low max FPS screams hardware issues; it seems people see there is more in Ryzen's tank than Steve wants to address. Retesting, 30 pages of typing, phone calls, videos - and he is still botching his review because his ASUS motherboard is junk and he is basing his recommendation on a poor BIOS. That's why people are angry at him. It's like watching someone try to fly a kite with a huge hole in it, due to poor handling at the store where it was bought, and saying the kite is bad at flying without trying to fix the hole.


----------



## TopicClocker

Quote:


> Originally Posted by *Shatun-Bear*
> 
> 'GamersNexus' should be ashamed. What a poor and completely unforgiving review.


It's poor and unforgiving for showcasing CPU limitations instead of GPU limitations? GPU-limited CPU benchmarks are useless!

I strongly believe that this isn't all Ryzen has to give, there's a bunch of software issues and things just aren't working properly.
Hopefully we see a retest whenever this stuff is sorted out.


----------



## Artikbot

Quote:


> Originally Posted by *JackCY*
> 
> There is an LDO, just like in SL/KL, that drops the voltages further; no FIVR like HW/DC. I don't remember what BW and BW-E use - BW must still use a FIVR since it shares boards with HW/DC, no idea about BW-E though.


There must be a meaning of LDO I'm not aware of, because Ryzen sure as hell doesn't use linear regulators for the CPU power section.


----------



## LancerVI

Quote:


> Originally Posted by *JasonCL*
> 
> Why, for telling the truth? Steve posted what is quite likely the most in-depth, most tested (and retested) review of any site out there. It's not his fault Ryzen failed to live up to the hype when it comes to gaming, which is exactly what he pointed out. He said it shines in productivity but lacks in the gaming department. What's so wrong with telling the facts?
> 
> Some people are taking this so personally, it's ridiculous. AMD has put out a chip that's going to be stellar for work environments, especially at the cost. That's a HUGE step in the right direction. Can you still game on them? Yes, but they won't perform to the level of the cheaper i7 in many instances, and thus, his review (and others who have reported the very same) are only being honest. There's literally no reason to be upset with these reviewers.
> 
> It's also a new platform for them, with lots of new and immature technologies that can (and will) get better with age and development.


I can agree that there was probably no malice aforethought, and, being a "gamers" site to begin with, obviously he's going to focus on that. But even your own commentary suggests that Ryzen can't game, or "if you're a gamer, look elsewhere." That's a gross mis-characterization of what's going on here. Furthermore, your "cheaper" argument ignores the 1700 non-X: a good gaming build with mobo and whatever video card is going to be significantly cheaper than pretty much any i7 build.

Again, if I knew nothing and read comments like yours, I'd believe that Ryzen is horrible at gaming, which is far from the truth.


----------



## DarkRadeon7000

Quote:


> Originally Posted by *Xuper*
> 
> How about the 6900K?
> Do you agree? Here:
> 
> The 6900K losing to the 7700K in gaming perf = OK!
> The R7 1800X losing to the 7700K in gaming perf = epic fail.
> 
> The big difference between GamersNexus & Computerbase is language. GamersNexus:
> While Computerbase:
> So GamersNexus simply used harsher words.


Didn't it also lose to the 6900k in gaming? If so, the Ryzen quad will also lose to the 7700k. And if that's the case, Intel is going to wipe the floor with them once Coffee Lake drops, just like NVIDIA just dropped the 1080 Ti bomb on AMD.

And I don't find his choice of words harsh at all. Even a 3770k from 4 years ago can compete with it in gaming if OC'd. No one's gonna pay $40 less for the AMD when there's a huge performance disparity in games.


----------



## X-Nine

Quote:


> Originally Posted by *Xuper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JasonCL*
> 
> Why, for telling the truth? Steve posted what is quite likely the most in-depth, most tested (and retested) review of any site out there. It's not his fault Ryzen failed to live up to the hype when it comes to gaming, which is exactly what he pointed out. He said it shines in productivity but lacks in the gaming department. What's so wrong with telling the facts?
> 
> Some people are taking this so personally, it's ridiculous. AMD has put out a chip that's going to be stellar for work environments, especially at the cost. That's a HUGE step in the right direction. Can you still game on them? Yes, but they won't perform to the level of the cheaper i7 in many instances, and thus, his review (and others who have reported the very same) are only being honest. There's literally no reason to be upset with these reviewers.
> 
> It's also a new platform for them, with lots of new and immature technologies that can (and will) get better with age and development.
> 
> 
> 
> How is 6900K ?
> Quote:
> 
> 
> 
> It shines in productivity but lacks in the gaming department
> 
> 
> Do you agree? Here
> 
> 6900K is losing to 7700K in Game's perf = Ok!
> RX 1800 is losing to 7700K Game's perf = Epic Fail
> 
> Big different between GamersNexus & Computerbase Is language.GamersNexus :
> Quote:
> 
> 
> 
> Ryzen is An i5 in Gaming, i7 in Production
> 
> 
> While computerbase :
> Quote:
> 
> 
> 
> Ryzen is King in Applications, Prince in Games
> 
> 
> therefore GamersNexus used a more harsh words.

I don't think Ryzen is an epic fail at all, not when you take into consideration just how well it does at the cost. Could Steve have used different wording? Sure, but again, people are being far too sensitive on the matter.

Quote:


> Originally Posted by *rexolaboy*
> 
> I think people were taken back by Steves review because generally people trust him to put out unbiased stuff, but his numbers were so low and his critiques were so harsh it seemed weird. To most people, seeing a CPU absolutely nail Intel in professional workloads and then have such amazing minimum FPS in gaming but low max FPS screams hardware issues, it seems that people see there is more in Ryzens tank than Steve wants to address. Retesting, 30 pages of typing, phone calls, videos, and he is still botching his review because his ASUS motherboard is junk and he is basing his recommendation of a poor bios. That's why people are angry at him. Its like watching someone trying to fly a kite with a huge hole in it due to poor handling at the store it was bought at and saying the kite was bad at flying without trying to fix the hole.


Perhaps. He did go to great lengths to find a solution to the issue. Some boards may have had more development time on them. I think it's strange that so many different kits were sent out. It would have been much better if they had sent 2 or 3 different boards to each reviewer, in my opinion. That way, if they suspected something was up, they had another (or 2) board to test against to see if it's a board issue, or a platform issue.

When it comes down to it, I think people need to be realistic here. AMD has done a hell of a job improving their platform. Neither AMD nor Intel are perfect, both have pros and cons. I have no allegiance to either, but will use whatever makes sense for the build I'm working with.


----------



## momonz

I still lol at people saying the R7 1800X is not good for gaming (facepalm)


----------



## DarkRadeon7000

Quote:


> Originally Posted by *LancerVI*
> 
> I can agree that I don't think there was any malice aforethought and being a "gamers" site to begin with, obviously, he's going to focus in on that. But even your own commentary is suggestive that the Ryzen can't game or "if you're a gamer, look elsewhere." That's a gross mis-characterization of what's going on here. Furthermore, your cheaper argument ignores the 1700 non-X. The system build for a good gaming machine with mobo and whatever video card is going to be significantly cheaper than pretty much any i7 build.
> 
> Again, if I knew nothing and read comments like yours, I'd believe that Ryzen is horrible at gaming, which is far from the truth.


In my country Ryzen 1700X costs as much as a 7700k. How do you justify paying for the AMD solely for gaming?


----------



## PsyM4n

Quote:


> Originally Posted by *JasonCL*
> 
> Why, for telling the truth? Steve posted what is quite likely the most in-depth, most tested (and retested) review of any site out there. It's not his fault Ryzen failed to live up to the hype when it comes to gaming, which is exactly what he pointed out. He said it shines in productivity but lacks in the gaming department. What's so wrong with telling the facts?
> 
> Some people are taking this so personally, it's ridiculous. AMD has put out a chip that's going to be stellar for work environments, especially at the cost. That's a HUGE step in the right direction. Can you still game on them? Yes, but they won't perform to the level of the cheaper i7 in many instances, and thus, his review (and others who have reported the very same) are only being honest. There's literally no reason to be upset with these reviewers.
> 
> It's also a new platform for them, with lots of new and immature technologies that can (and will) get better with age and development.


That.
...and to add some more things to the above,

When new CPU micro-architectures are released, there are always cases where existing code doesn't work all that well on them and runs relatively slowly. Taking another approach might make things better, might make no difference, or might make things worse. This happens on both Intel and AMD processors. Remember when the Pentium 4 was first released: it performed far worse than the Pentium III unless its newly developed multi-threading technology was properly used.

Then you have bugs and errata. Lots of related past issues for both Intel and AMD again.

New hardware always has issues, some can be fixed through software, others with newer hardware steppings/revisions, while some are not fixed until different hardware comes out.


----------



## LancerVI

Quote:


> Originally Posted by *DarkRadeon7000*
> 
> In my country Ryzen 1700X costs as much as a 7700k. How do you justify paying for the AMD solely for gaming?


First, I said non-X.

Second, build out the entire system. What's the total system cost?

Third, I live in the US, so I can only speak to that and wouldn't presume to speak to your prices/economy.


----------



## Slomo4shO

Quote:


> Originally Posted by *Wishmaker*
> 
> Well, AMD themselves said they are 10% behind KBL and will fix it with Zen+. The 7700K will be on top in gaming, that is for sure. They won't have the chestnut that games don't yet support more than 4 cores; it will be on even footing with Intel in terms of cores and threads. Frequency and IPC will make the difference.


Only thing is, the R3 is $150 to $200 less than the 7700K...

Needless to say, AMD is its own worst enemy. Another butchered launch of a stellar product.


----------



## ryan92084

Quote:


> Originally Posted by *ChronoBodi*
> 
> The Ryzen 3 series are 4c/8t Ryzens against the 7700K;
> the Ryzen 5 series are 6c/12t Ryzens against the 6850K;
> the Ryzen 7 series are 8c/16t Ryzens against the 6900K.


Just an FYI, that Ryzen lineup is incorrect atm. AMD said on video that the Ryzen 5 lineup consists of the 6c/12t 1600X and the 4c/8t 1500X.


----------



## CULLEN

Quote:


> Originally Posted by *Pro3ootector*
> 
> Ryzen is my next notebook CPU.


I've made enough jokes about Intel fanboys for now. So my question is, why?

Why is Ryzen going to be in your next notebook? Unless I missed something, we really don't know a whole lot about how Ryzen mobile will perform. We think it will be great, but we really have nothing to show for it.

I can't remember ever having an AMD processor in any of my laptops, because Intel has mostly dominated that market. I really want to upgrade my T440s to a Dell XPS 13 or T470s, but neither has AMD, so obviously I'd have to pick Intel.

When it comes to laptops, what matters (for me) is the keyboard, battery, and monitor. Whatever hardware gives me the best battery life and solid performance, I'll take it.


----------



## kd5151

I'm just going to play devil's advocate. If Intel was making big strides, we wouldn't even be talking about AMD right now. The next few years are going to be interesting. If Intel keeps going down the same path, AMD's investment in multithreaded gaming will pay off. I've been a fan of AMD for a long time. For the last ten years they have been selling more cores at half the price Intel does. A while back I supported AMD because I knew they would bring 8 cores for under $400, and they did. AMD sells great products but never gets credit where credit is due. The market is poisoned. Make AMD great again, or better yet, make CPU/GPU buying great again! Do I want a fast quad core or a 16-thread workhorse? Both have pros and cons.


----------



## IRobot23

Quote:


> Originally Posted by *ryan92084*
> 
> Just an FYI, that Ryzen lineup is incorrect atm. AMD said on video that the Ryzen 5 lineup consists of the 6c/12t 1600X and the 4c/8t 1500X.


1600X 6C/12T
1500 6C/12T
1400X 4C/8T
1300 4C/8T


----------



## PsYcHo29388

Quote:


> Originally Posted by *CULLEN*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Pro3ootector*
> 
> Ryzen is my next notebook CPU.
> 
> 
> 
> I've made enough jokes about Intel fanboys for now. So my question is, why?
> 
> Why is Ryzen going to be in your next notebook? Unless I missed something, we really don't know a whole lot about how Ryzen mobile will perform. We think it will be great, but we really have nothing to show for it.
> 
> I can't remember ever having an AMD processor in any of my laptops, because Intel has mostly dominated that market. I really want to upgrade my T440s to a Dell XPS 13 or T470s, but neither has AMD, so obviously I'd have to pick Intel.
> 
> When it comes to laptops, what matters (for me) is the keyboard, battery, and monitor. Whatever hardware gives me the best battery life and solid performance, I'll take it.

Can't speak for him, but my dad is still using an A10-based laptop and is waiting for Ryzen-based laptops because "I'm not really all that impressed with Intel". This was after buying a $1,500 laptop and a $2,300 laptop, both Intel+Nvidia based, and both were returned.

Doesn't make any sense to me, since the most expensive one handled GTA V pretty well: High @ 4K while managing 60 FPS. How this isn't considered good to him baffles me, but most AMD fans tend to be illogical like so.


----------



## PiOfPie

Out of curiosity, do any of these reviews have HPET turned off? Lisa and technical marketing were saying on Reddit yesterday that aside from the penalty from SMT, it may be another thing that's hampering performance a bit.


----------



## Shatun-Bear

Quote:


> Originally Posted by *JasonCL*
> 
> Why, for telling the truth? Steve posted what is quite likely the most in-depth, most tested (and retested) review of any site out there. It's not his fault Ryzen failed to live up to the hype when it comes to gaming, which is exactly what he pointed out. He said it shines in productivity but lacks in the gaming department. What's so wrong with telling the facts?
> 
> Some people are taking this so personally, it's ridiculous. AMD has put out a chip that's going to be stellar for work environments, especially at the cost. That's a HUGE step in the right direction. Can you still game on them? Yes, but they won't perform to the level of the cheaper i7 in many instances, and thus, his review (and others who have reported the very same) are only being honest. There's literally no reason to be upset with these reviewers.
> 
> It's also a new platform for them, with lots of new and immature technologies that can (and will) get better with age and development.


'Steve' is the one taking it personally, actually. If they ever want to be taken seriously as a hardware review site, he needs to immediately stop committing the cardinal sin of this type of work: coming across as upset and annoyed in his writing, apparently caused by his contact with AMD for the review, which he alludes to twice in it. He doesn't even need to write that in there. As soon as I read that, the tone was set. This is unprofessional and immature. Never in a million years would this type of thing be seen in an Anandtech review, or even a Guru3D review.

The specific problem with the review is that you cannot completely extract the gaming-performance value out of a product that is meant for much more than gaming and then slap a really harsh conclusion on the product as a whole for it. Yes, he does say that Ryzen is good for other stuff, but then he keeps adding qualifiers for why these are still negatives (like GPUs being better for other production workloads). So the conclusion paints the picture that Ryzen is a big disappointment, when it's just about the most exciting CPU release in over 5 years.

He also says in the conclusion: 'We also believe firmly that our benchmarks are a better representation of the real world.' How on earth can he say that when his supposedly real-world gaming tests were carried out with a GTX 1080, mostly at 1080p? How many people own a card on the level of a 1080? And of those who do, how many game at 1080p? This is hardly 'real-world'.

Lastly, it is in no way the 'most in-depth, most-tested' review out there. That would be ComputerBase's and Anandtech's.


----------



## oxidized

Quote:


> Originally Posted by *IRobot23*
> 
> 1600X 6C/12T
> 1500 6C/12T
> 1400X 4C/8T
> 1300 4C/8T


No... AMD already showed 1500X is a 4/8


----------



## CULLEN

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Can't speak for him, but my dad is still using an A10-based laptop and is waiting for Ryzen-based laptops because "I'm not really all that impressed with Intel". This was after buying a $1,500 laptop and a $2,300 laptop, both Intel+Nvidia based, and both were returned.
> 
> Doesn't make any sense to me, since the most expensive one handled GTA V pretty well: High @ 4K while managing 60 FPS. How this isn't considered good to him baffles me, *but most AMD fans tend to be illogical like so*.


Ehm... fanboys of either side tend to be illogical.

But yeah, Intel currently has the most efficient solution for the mobile market, and for someone who doesn't play games, Intel is the no-brainer when it comes to laptops, even though I feel like I'm doing something morally wrong.

And please don't call me a "fanboy", because I'm not the biggest fan of Intel. I've used Intel parts almost exclusively for the better part of a decade, but Intel hurt me (the consumer) a lot by bribing distributors to use only their solutions. The case was settled 7 years ago, but the only competition they had is still recovering.

Healthy competition is good for us all.


----------



## Shatun-Bear

You have Steve of GamersNexus' conclusion that Ryzen is an epic fail in general and in gaming; then you have a more trusted site called ComputerBase, whose conclusion is that the 1800X is only 4% behind the 7700K at 1080p. *4 PERCENT*. CB also benched more than double the number of games that Nexus did. So who do you believe?



https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/


----------



## amstech

Quote:


> Originally Posted by *Shatun-Bear*
> 
> 'Steve' is the one taking it personally, actually. If they ever want to be taken seriously as a hardware review site, he needs to immediately stop committing the cardinal sin of this type of work: coming across as upset and annoyed in his writing, apparently caused by his contact with AMD for the review, which he alludes to twice in it. He doesn't even need to write that in there. As soon as I read that, the tone was set. This is unprofessional and immature. Never in a million years would this type of thing be seen in an Anandtech review, or even a Guru3D review.
> 
> The specific problem with the review is that you cannot completely extract the gaming-performance value out of a product that is meant for much more than gaming and then slap a really harsh conclusion on the product as a whole for it. Yes, he does say that Ryzen is good for other stuff, but then he keeps adding qualifiers for why these are still negatives (like GPUs being better for other production workloads). So the conclusion paints the picture that Ryzen is a big disappointment, when it's just about the most exciting CPU release in over 5 years.
> 
> He also says in the conclusion: 'We also believe firmly that our benchmarks are a better representation of the real world.' How on earth can he say that when his supposedly real-world gaming tests were carried out with a GTX 1080, mostly at 1080p? How many people own a card on the level of a 1080? And of those who do, how many game at 1080p? This is hardly 'real-world'.
> 
> Lastly, it is in no way the 'most in-depth, most-tested' review out there. That would be ComputerBase's and Anandtech's.


You're completely out of line here; nothing you said is true, and absolutely nothing in your reply to JasonCL counters anything he said.
All the review sites show the same thing.
Quote:


> Originally Posted by *Shatun-Bear*
> 
> 
> 
> https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/


That same chart shows an i5 keeping pace with Ryzen.
And besides, it's ONE chart.
Gaming engines are outdated; only certain newer games like Overwatch and Gears 4 really use a CPU for all it's worth.

-Edited


----------



## oxidized

Quote:


> Originally Posted by *Shatun-Bear*
> 
> You have Steve of GamersNexus' conclusion that Ryzen is an epic fail in general and in gaming; then you have a more trusted site called ComputerBase, whose conclusion is that the 1800X is only 4% behind the 7700K at 1080p. *4 PERCENT*. CB also benched more than double the number of games that Nexus did. So who do you believe?
> 
> 
> 
> https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/


I actually don't believe ComputerBase, not even that piece that came out some weeks ago about 4/8 vs. 6/12 and 8/16 performance in gaming, where most of the games could supposedly take advantage of more cores. It's not actually like that; you can see in 95% of benchmarks that the 7700K is the fastest, in every situation.


----------



## Slomo4shO

Call with AMD at 6:25






IPC comparison:
Quote:


> 0 to 1% ahead of Broadwell-E
> 6.8% back of Kaby Lake


Kaby Lake has a 12% clock advantage in addition to the 6.8% IPC advantage.
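Back of the envelope, and assuming the two advantages simply compound multiplicatively (a simplification that ignores memory speed, cache behavior, and per-game scaling), the quoted figures work out like this:

```python
# Rough combined single-thread advantage of Kaby Lake over Ryzen,
# using the figures quoted above. Assumes clock and IPC advantages
# multiply, which is a simplification.
clock_advantage = 0.12   # ~12% higher clocks
ipc_advantage = 0.068    # ~6.8% higher IPC

combined = (1 + clock_advantage) * (1 + ipc_advantage) - 1
print(f"Combined single-thread advantage: {combined:.1%}")  # 19.6%
```

So taken together the two figures suggest roughly a 20% single-thread gap, slightly more than just adding the two percentages.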


----------



## Redwoodz

Quote:


> Originally Posted by *amstech*
> 
> You're completely out of line here; nothing you said is true, and absolutely nothing in your reply to JasonCL counters anything he said.
> All the review sites show the same thing.
> That same chart shows an i5 keeping pace with Ryzen.
> And besides, it's ONE chart.
> Gaming engines are outdated; only certain newer games like Overwatch and Gears 4 really use a CPU for all it's worth.
> 
> -Edited


Quote:


> Originally Posted by *oxidized*
> 
> I actually don't believe ComputerBase, not even that piece that came out some weeks ago about 4/8 vs. 6/12 and 8/16 performance in gaming, where most of the games could supposedly take advantage of more cores. It's not actually like that; you can see in 95% of benchmarks that the 7700K is the fastest, in every situation.











So what, an i5 can keep pace with an i7 then?

Everyone wants to compare AMD's 8-core to Intel's 4-core. Intel's 4-core beats their own 8-core in those games. Try comparing apples to apples.


----------



## CallsignVega

I'm glad Ryzen is doing really good on the price/performance realm. I am also glad that my 6950X is still the overall king and didn't halve in value overnight.


----------



## JackCY

Quote:


> Originally Posted by *CallsignVega*
> 
> I'm glad Ryzen is doing really good on the price/performance realm. I am also glad that my 6950X is still the overall king and didn't halve in value overnight.


It did so the moment you purchased it.


----------



## Shatun-Bear

Quote:


> Originally Posted by *oxidized*
> 
> I actually don't believe ComputerBase, not even that piece that came out some weeks ago about 4/8 vs. 6/12 and 8/16 performance in gaming, where most of the games could supposedly take advantage of more cores. It's not actually like that; you can see in 95% of benchmarks that the 7700K is the fastest, in every situation.


It's not so hard to believe when this is the same (I think) set of games they benched for that core count vs. frequency article, where the Intel 8-cores came out ahead of the 7700K on average, using games that generally scale well across more than 4C/8T.

So it's not surprising that even though the games benefit from more cores, the 7700K is still ahead of the 1800X in this test, if only slightly. I wouldn't be so quick to say 'I don't believe them', which is just a bizarre reaction. At least explain why.


----------



## Xuper

Quote:


> Originally Posted by *CallsignVega*
> 
> I'm glad Ryzen is doing really good on the price/performance realm. I am also glad that my 6950X is still the overall king and didn't halve in value overnight.


Really? Were you able to overclock it to 4.5GHz?


----------



## Undervolter

I don't read game reviews, but I know about this because it was posted in the main AMD subforum. This is how I first heard of "Gamers Nexus" and formed my opinion of their reliability and "expertise".

Do you see anything...odd in this graph?



You have a 3.2GHz FX scoring a 31 FPS minimum, a 3.3GHz FX scoring 33 FPS, and a 4.7GHz FX scoring... 15 FPS. Oops! That sounds a bit strange! But fear not, because Steve has the rational explanation!
Quote:


> The FX-9590 technically plays 1440p/ultra, but we faced severe frame drops and stuttering with the CPU, *a trait characteristic of the 9590's* high TDP
> 
> http://www.gamersnexus.net/game-bench/2182-fallout-4-cpu-benchmark-huge-performance-difference#!/ccomment-page=2


Basically, he says this is a "characteristic trait of the 9590". Meaning that AMD actually released a CPU that, at 4.7GHz, performs WORSE than the same chip at 3.2GHz. OK, some will say "human error". No! Several OCN members in the comments tried to point it out and offer possible explanations. Did it achieve anything? No! Because Steve "knew better". Ironically, Bethesda had put that CPU, which according to Steve gets 15 FPS, on its recommended hardware list.

This was Steve's final conclusion: "Folks. The 9590 was provided -- by AMD, no less -- installed in the motherboard we tested with. We have two 9590-ready motherboards and two 9590s. The ASRock Extreme9 and Fatal1ty 990FX Professional are both in our hands.

Why would we use an incompatible board? Come on.

Just because we keep getting *uninformed*, offensively-phrased comments about the test results, I took the two hours to retest both boards and both CPUs for parity. We are seeing the issues on both. If your test methodology, location, or frame measurement differs from ours, your results will differ."

I have nothing else to add. I haven't read his Ryzen review; there are many others I haven't read either, and I leave the ones I trust less for last. In science, what he did is called "systematic error": he repeats the same error again and again, and in the end attributes everything to the CPU having a "characteristic trait", no matter how unreasonable the result.


----------



## CallsignVega

Quote:


> Originally Posted by *JackCY*
> 
> It did so the moment you purchased it.


You must really fail at research if you think used 6950X's are selling for half price a year later, let alone right after purchase.

http://www.ebay.com/itm/intel-core-i7-6950x-extreme-edition-/262857318761?hash=item3d33846169:g:TFoAAOSwax5Ynhco

http://www.ebay.com/itm/Intel-Core-i7-Extreme-Edition-i7-6950X-Deca-core-10-Core-3-GHz-/122373898786?hash=item1c7e0d8222:g:zUoAAOSwdGFYtEgj
Quote:


> Originally Posted by *Xuper*
> 
> Really were you able to overclock to 4.5ghz?


Yes, I got a silicon lottery chip.


----------



## amstech

Quote:


> Originally Posted by *Shatun-Bear*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So what, an i5 can keep pace with an i7 then?
> 
> Everyone wants to compare AMD's 8-core to Intel's 4-core. Intel's 4-core beats their own 8-core in those games. Try comparing apples to apples.


If an i5 beats an i7 in games, the game only uses 4 cores and the i5 is clocked higher.
At the same clock speeds the i7 is always faster; they have been the kings of gaming CPUs since Bloomfield released, and that's not changing.


----------



## Shatun-Bear

Quote:


> Originally Posted by *Undervolter*
> 
> I don't read game reviews, but I know about this because it was posted in the main AMD subforum. This is how I first heard of "Gamers Nexus" and formed my opinion of their reliability and "expertise".
> 
> Do you see anything...odd in this graph?
> 
> 
> 
> You have a 3.2GHz FX scoring a 31 FPS minimum, a 3.3GHz FX scoring 33 FPS, and a 4.7GHz FX scoring... 15 FPS. Oops! That sounds a bit strange! But fear not, because Steve has the rational explanation!
> Basically, he says this is a "characteristic trait of the 9590". Meaning that AMD actually released a CPU that, at 4.7GHz, performs WORSE than the same chip at 3.2GHz. OK, some will say "human error". No! Several OCN members in the comments tried to point it out and offer possible explanations. Did it achieve anything? No! Because Steve "knew better". Ironically, Bethesda had put that CPU, which according to Steve gets 15 FPS, on its recommended hardware list.
> 
> This was Steve's final conclusion: "Folks. The 9590 was provided -- by AMD, no less -- installed in the motherboard we tested with. We have two 9590-ready motherboards and two 9590s. The ASRock Extreme9 and Fatal1ty 990FX Professional are both in our hands.
> 
> Why would we use an incompatible board? Come on.
> 
> Just because we keep getting *uninformed*, offensively-phrased comments about the test results, I took the two hours to retest both boards and both CPUs for parity. We are seeing the issues on both. If your test methodology, location, or frame measurement differs from ours, your results will differ."
> 
> I have nothing else to add. I haven't read his Ryzen review; there are many others I haven't read either, and I leave the ones I trust less for last. In science, what he did is called "systematic error": he repeats the same error again and again, and in the end attributes everything to the CPU having a "characteristic trait", no matter how unreasonable the result.


Unbelievable, and it smacks of him being amateur and childish, which was my complaint with the tone of his review. He's obviously got history with AMD and criticism from their 'fans' which he hasn't put behind him. Never again will I read anything from the guy or his site.


----------



## Slomo4shO

Quote:


> Originally Posted by *Redwoodz*
> 
> Everyone wants to compare AMD's 8-core to Intel's 4-core. Intel's 4-core beats their own 8-core in those games. Try comparing apples to apples.


AMD clearly states that Ryzen is inferior to Kaby Lake in both IPC and frequency. Not sure why people continue to argue about this fact.

That said, Ryzen matches Haswell, and a 4-core Ryzen will cost about half the price of a 7700K. Would you rather buy a 4770K for $175 or a 7700K at $349? It doesn't take a genius to determine which would be the better option for a budget-oriented gamer.


----------



## budgetgamer120

Can we stop arguing with people who were never interested in the CPU, please?









Here is an overclocked review which has a mixture of gaming and workstation benchmarks.

http://www.legitreviews.com/amd-ryzen-7-1700-overclocking-best-ryzen-processor_192191


----------



## CriticalOne

Quote:


> Originally Posted by *Redwoodz*
> 
> Everyone wants to compare AMD's 8-core to Intel's 4-core. Intel's 4-core beats their own 8-core in those games. Try comparing apples to apples.


The R7 1700 and the I7-7700K are similarly priced. Why _wouldn't_ you compare them?


----------



## SoloCamo

Quote:


> Originally Posted by *amstech*
> 
> You're completely out of line here; nothing you said is true, and absolutely nothing in your reply to JasonCL counters anything he said.
> All the review sites show the same thing.
> That same chart shows an i5 keeping pace with Ryzen.
> And besides, it's ONE chart.
> Gaming engines are outdated; only certain newer games like Overwatch and Gears 4 really use a CPU for all it's worth.
> 
> -Edited


Wait, so BF1's engine (64-player multiplayer listed, no less), Watch Dogs 2, etc. are all outdated and not using a CPU for all it's worth? Ryzen beats my 4790K in BF1 multiplayer, but OK.






Quote:


> Originally Posted by *Undervolter*
> 
> I don't read game reviews, but I know about this because it was posted in the main AMD subforum. This is how I first heard of "Gamers Nexus" and formed my opinion of their reliability and "expertise".
> 
> Do you see anything...odd in this graph?
> 
> 
> 
> You have a 3.2GHz FX scoring a 31 FPS minimum, a 3.3GHz FX scoring 33 FPS, and a 4.7GHz FX scoring... 15 FPS. Oops! That sounds a bit strange! But fear not, because Steve has the rational explanation!
> Basically, he says this is a "characteristic trait of the 9590". Meaning that AMD actually released a CPU that, at 4.7GHz, performs WORSE than the same chip at 3.2GHz. OK, some will say "human error". No! Several OCN members in the comments tried to point it out and offer possible explanations. Did it achieve anything? No! Because Steve "knew better". Ironically, Bethesda had put that CPU, which according to Steve gets 15 FPS, on its recommended hardware list.
> 
> This was Steve's final conclusion: "Folks. The 9590 was provided -- by AMD, no less -- installed in the motherboard we tested with. We have two 9590-ready motherboards and two 9590s. The ASRock Extreme9 and Fatal1ty 990FX Professional are both in our hands.
> 
> Why would we use an incompatible board? Come on.
> 
> Just because we keep getting *uninformed*, offensively-phrased comments about the test results, I took the two hours to retest both boards and both CPUs for parity. We are seeing the issues on both. If your test methodology, location, or frame measurement differs from ours, your results will differ."
> 
> I have nothing else to add. I haven't read his Ryzen review; there are many others I haven't read either, and I leave the ones I trust less for last. In science, what he did is called "systematic error": he repeats the same error again and again, and in the end attributes everything to the CPU having a "characteristic trait", no matter how unreasonable the result.






Completely forgot about that nonsense. As someone who had an 8120, an 8350, and a 9590, I can assure you the 9590 was considerably faster. Stock watercooling kit and an Asus Crosshair V Formula-Z, paired with my current build's 16GB 2400MHz CAS10. No throttling issues at all.


----------



## hawker-gb

Quote:


> Originally Posted by *CriticalOne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Redwoodz*
> 
> Everyone wants to compare AMD's 8-core to Intel's 4-core. Intel's 4-core beats their own 8-core in those games. Try comparing apples to apples.
> 
> 
> 
> The R7 1700 and the I7-7700K are similarly priced. Why _wouldn't_ you compare them?

Because 7700K is overpriced.


----------



## amstech

Quote:


> Originally Posted by *SoloCamo*
> 
> Wait, so BF1's engine (64 player multiplayer listed no less), Watch Dogs 2, etc. are all outdated and not using a cpu for all it's worth? Ryzen beats my 4790k in BF1 mutliplayer but ok.


I am very impressed by Ryzen; I think it's doing great.
I'm just seeing the same results everyone else is, not trying to be argumentative.


----------



## CriticalOne

Quote:


> Originally Posted by *hawker-gb*
> 
> Because 7700K is overpriced.


OK, but they are still similarly priced.

The ad hoc rescue of "it's not meant for games!" is getting extremely tiring. No CPU is designed _just_ for gaming.


----------



## SoloCamo

I think these reviewers need to do a performance update in like 90 days to give everyone a better picture. Hopefully by then all of the confusion will be cleared up. There is too much inconsistency.


----------



## budgetgamer120

Quote:


> Originally Posted by *CriticalOne*
> 
> OK, but they are still similarly priced.
> 
> The ad hoc rescue of "it's not meant for games!" is getting extremely tiring. *No CPU is designed just for gaming.*


But Ryzen gets beaten up for not being the best at gaming.


----------



## CriticalOne

Keep beating up those strawmen.


----------



## Slomo4shO

Quote:


> Originally Posted by *CriticalOne*
> 
> The R7 1700 and the I7-7700K are similarly priced. Why _wouldn't_ you compare them?


Because the comparable 4/8 models that are launching in a couple of months will be half the price of a 7700K?


----------



## budgetgamer120

I think this is quite telling of the strides AMD has made in efficiency. Can't wait to see mobile. I do know now that power consumption only mattered during the Piledriver era.


http://www.legitreviews.com/amd-ryzen-7-1700-overclocking-best-ryzen-processor_192191/9


----------



## ducegt

Quote:


> Originally Posted by *hawker-gb*
> 
> Because 7700K is overpriced.


You get more for the extra you pay. It's like this with nVidia as well.

What hasn't been discussed much here is the optimization of the 7700Ks reviewed. Most reviews run them at standard RAM speed when they are capable of much more, and the reviews with OCed 7700Ks have the memory gimped as well. We have seen the best Ryzen has to offer, but not Kaby Lake... and Kaby Lake is still on top for gaming, which, let's be real, is what most people here and on similar venues care about.


----------



## ryan92084

Quote:


> Originally Posted by *IRobot23*
> 
> 1600X 6C/12T
> 1500 6C/12T
> 1400X 4C/8T
> 1300 4C/8T


Not according to AMD. Remember, those same rumors had the 1600X at a very different clock speed, so the product stack has changed. Watch starting at 24:00 or so: https://youtu.be/GFoOG00DseE


----------



## kd5151

Legit Reviews and TweakTown have very nice reviews of the 1700.


----------



## Slomo4shO

On a side note:


----------



## CULLEN

Quote:


> Originally Posted by *budgetgamer120*
> 
> Can we stop arguing with people who were never interested in the CPU, please?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is an overclocked review which has a mixture of gaming and workstation benchmarks.
> 
> http://www.legitreviews.com/amd-ryzen-7-1700-overclocking-best-ryzen-processor_192191


I really haven't been paying too much attention to Legit Reviews, but that's an excellent, unbiased, and very detailed summary altogether.

They even pointed out that they had updated to the most recent BIOS version on the motherboard, as some reviewers noticed an incredible performance increase.

Very impressed by LegitReviews right now.
Quote:


> Originally Posted by *SoloCamo*
> 
> I think these reviewers need to do a performance update in like 90 days to give everyone a better picture. Hopefully by then all of the confusion will be cleared up. *There is too much inconsistency*.


Absolutely.

But keep in mind that a BIOS update (MSI's 113 > 117) one day after the launch showed up to 26% improved performance (17% on average). Reviewers might need to redo the test after each update (be it BIOS or chipset) while Ryzen is still in its infancy.


----------



## SoloCamo

Quote:


> Originally Posted by *ducegt*
> 
> You get more for the extra you pay. It's like this with nVidia as well.
> 
> What hasn't been discussed much here is the optimization of the 7700Ks reviewed. Most reviews run them at standard RAM speed when they are capable of much more, and the reviews with OCed 7700Ks have the memory gimped as well. We have seen the best Ryzen has to offer, but not Kaby Lake... and Kaby Lake is still on top for gaming, which, let's be real, is what most people here and on similar venues care about.


We have not seen all that Kaby Lake can offer, but we've seen the best that Ryzen offers? What is this nonsense? How much are they paying you? Seriously, this is the crap that makes this site look unreliable.

Ryzen is a lot closer in gaming performance to the 7700K than the 7700K is to Ryzen in multithreaded workloads. As far as I'm concerned, I'm getting far more for my money. And the fact that you said Nvidia gives you more for your money is a crock too. Price/performance is literally AMD's game on both fronts. Nvidia gave me more for my money when I picked up my 290X over the Titan at double the price, too, right?


----------



## ducegt

Quote:


> Originally Posted by *CriticalOne*
> 
> Keep beating up those strawmans.


That's all he and many others can do. I've shared a few simple truths that were countered by a dozen different arguments, none of which addressed my points. All I got was tangents and personal attacks, and that's how I know I articulated my perspective thoroughly, because there are many very respectable and knowledgeable folks here.


----------



## ducegt

Oops. Meant to edit prior post. Just look at the one above for a prime example of what I said.


----------



## Hueristic

Quote:


> Originally Posted by *Travieso*
> 
> GamersNexus' headline is quite misleading and really confusing.
> 
> "An i5 in Gaming, i7 in Production"
> 
> When people see "i7 in production", they're going to think that Ryzen is about equal to the i7 7700K in productivity.
> 
> No, it eats the 7700K alive in productivity, it's not even comparable, and it loses in gaming.
> 
> They should have used better wording.


Nexus has zero credibility. After watching that video with Joker (at least as far as I could with that Nexus fool babbling), it is obvious he had no clue how to handle newly released hardware. The moron wasn't even man enough to just admit he made a mistake, and instead tried to cry about how hard it is to OC. Anyone who knows anything knows to check for firmware updates on newly released hardware, and you cannot judge a CPU by tests on one motherboard. He probably thinks testing is just swapping parts out, getting a driver to work, and that is that. That may work for GPU testing, but it shows a complete lack of training. The boy is ill-equipped for the task and more than likely doesn't know a bit from a byte.

Also, AMD should never have shipped those kits with a bugged-BIOS board; that was another moronic call and shows incompetence once again in AMD's marketing dept. Any competent engineer would have had the release combo fully vetted before shipping it to reviewers.


----------



## budgetgamer120

Quote:


> Originally Posted by *CULLEN*
> 
> I really haven't been paying much attention to LegitReviews, but that's an excellent, unbiased, and very detailed summary altogether.
> 
> They even pointed out that they had updated the motherboard to the most recent BIOS version, as some reviewers had noticed an incredible performance increase from it.
> 
> Very impressed by LegitReviews right now.
> Absolutely.
> 
> But keep in mind that a BIOS update (MSI's 113 > 117) one day after the launch showed up to 26% improved performance (17% on average). Reviewers might need to redo the test after each update (be it BIOS or chipset) while Ryzen is still in its infancy.


AnandTech's is not bad and is more geared toward workstation use. I expected AnandTech to produce a GamersNexus-style review.
http://www.anandtech.com/show/11170/the-amd-zen-and-ryzen-7-review-a-deep-dive-on-1800x-1700x-and-1700/23
Quote:


> The CPU has a traditional uArch and does well, especially compared to last generation AMD, and a new high-perf core will be a feather in their cap. We see a lot of benchmark results where AMD is clearly equal to or above Intel's HEDT parts in both ST and MT. However, there are a few edge cases where AMD is still lagging behind 10-20%, even compared to Broadwell. These edge cases are difficult to anticipate, and can stem from unoptimized code. One of the benefits of Intel's big R&D juggernaut is the ability to process those edge cases, through prefetch, memory algorithms, and extensive testing. So despite the best will, there's still a large element to having a substantial budget to hire 300+ more engineers to cater for that, which is something AMD wasn't able to do for Zen or Ryzen.


----------



## CULLEN

Quote:


> Originally Posted by *kd5151*
> 
> LegitReviews and TweakTown have very nice reviews for the 1700.


I haven't seen TweakTown's review, but yeah, LegitReviews' review of the 1700 is very nice: completely unbiased and showing virtually every single detail you could ask for.
Quote:


> Originally Posted by *Slomo4shO*
> 
> On a side note:


The 1700X is nothing short of amazing in the Linux server test that was published the other day (source), and knowing that it will work with ECC memory makes it even more notable.


----------



## ducegt

Quote:


> Originally Posted by *SoloCamo*
> 
> We have not seen all that Kaby Lake can offer, but we've seen the best that Ryzen offers? What is this nonsense? How much are they paying you? Seriously, this is the crap that makes this site look unreliable.
> 
> Ryzen is a lot closer in gaming performance to the 7700K than the 7700K is to Ryzen in multithreaded workloads. As far as I'm concerned, I'm getting far more for my money. And the fact that you said Nvidia gives you more for your money is a crock too. Price/performance is literally AMD's game on both fronts. Nvidia gave me more for my money when I picked up my 290X over the Titan at double the price, too, right?


I get paid millions. Actually, I'm in a strange situation that has given me too much time on my hands. You really didn't say anything meaningful regarding what I said. Was something I said inaccurate? Most gamers should not care about the multithreaded performance, because by the time it is properly utilized, they will be better off having saved cash now and upgrading their hardware when software is ready for it. The same thing happened with 64-bit support and quad cores at launch.

I haven't bought nVidia since the 5850. I prefer the value of AMDs video cards. I tend to keep my CPUs longer and I want the highest minimum frames possible, so I've been with Intel since Conroe. I'll actually purchase AMD over nVidia even if it's in all aspects inferior. I do value the innovation they drive, but I generally buy what is best for my needs. Freesync is awesome.


----------



## AlphaC

https://www.pugetsystems.com/labs/articles/Premiere-Pro-CC-2017-AMD-Ryzen-7-1700X-1800X-Performance-909/

"Ryzen 7 1800X results are currently only placeholders! Other hardware reviews have shown the 1800X to be consistently ~5% faster than the 1700X, but we were not able to have a CPU available for testing prior to AMD's launch. We have a CPU on the way and will update this article with actual 1800X results in the coming weeks."







At least they're brutally honest.

The motherboard was the Asus PRIME X370-Pro, the same one OC3D used, so it's a good baseline motherboard as it's reasonably priced.

edit: more from Pugetsystems




https://www.pugetsystems.com/labs/articles/SOLIDWORKS-2017-AMD-Ryzen-7-1700X-1800X-Performance-908/


https://www.pugetsystems.com/labs/articles/Adobe-Photoshop-CC-2017-AMD-Ryzen-7-1700X-1800X-Performance-907/


https://www.pugetsystems.com/labs/articles/Adobe-Lightroom-CC-2015-8-AMD-Ryzen-7-1700X-1800X-Performance-910/


----------



## kd5151

I'm sorry, TweakTown tested the 1800X, not the 1700... but they did a lot of 4GHz clock-for-clock stuff. My bad.


----------



## budgetgamer120

Ryzen 1700 stock + 1070 @ 1080p


----------



## Slomo4shO

Quote:


> Originally Posted by *CULLEN*
> 
> The 1700X is nothing short of amazing in the Linux server test that was published the other day (source), and knowing that it will work with ECC memory makes it even more notable.


The 1700 is the clear winner over the 1700X.

An R3 1100 would make for an interesting budget server.


----------



## Mad Pistol

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ryzen 1700 stock + 1070 @ 1080p


GPU @ 97-98%.

This matches almost exactly what I get with a single 1070.

... and people said Ryzen isn't good for gaming.


----------



## SoloCamo

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ryzen 1700 stock + 1070 @ 1080p


See I couldn't hope to record 1080p and hold over 100fps on my 4790k with a 64 player map like that.

Regardless of recording, the Ryzen CPUs are great for this engine and game.

Quote:


> Originally Posted by *ducegt*
> 
> I get paid millions. Actually, I'm in a strange situation that has given me too much time on my hands. You really didn't say anything meaningful regarding what I said. Was something I said inaccurate? Most gamers should not care about the multithreaded performance, because by the time it is properly utilized, they will be better off having saved cash now and upgrading their hardware when software is ready for it. The same thing happened with 64-bit support and quad cores at launch.
> 
> I haven't bought nVidia since the 5850. I prefer the value of AMDs video cards. I tend to keep my CPUs longer and I want the highest minimum frames possible, so I've been with Intel since Conroe. I'll actually purchase AMD over nVidia even if it's in all aspects inferior. I do value the innovation they drive, but I generally buy what is best for my needs. Freesync is awesome.


The problem with that blanket statement is that it entirely depends on the game. The only intensive games I typically play do make great use of the extra threads. Ryzen holds a higher min fps than my 4790K.

If you tend to keep your CPUs a long time, I think Ryzen is clearly the better choice moving forward as long as you are in the 4GHz range.


----------



## budgetgamer120

For those with an i5 3570K: this user did not run the benchmarks himself, but asked a friend who upgraded to Ryzen to do it. So I'm not sure how you all will take it.

But here is the i5 3570K @ 4GHz vs. a stock 1700.


----------



## oxidized

Quote:


> Originally Posted by *Redwoodz*
> 
> So what, an i5 can keep pace with an i7 then?
> 
> Everyone wants to compare AMD's 8-core to Intel's 4-core. Intel's 4-core beats their own 8-core in those games. Try comparing apples to apples.


AMD did in the first place, by comparing their 8/16 to the 6700K and 7700K. I'm just saying their slides and comparisons don't show what's true, as independent testers proved otherwise over these two days.
Quote:


> Originally Posted by *Shatun-Bear*
> 
> It's not so hard to believe when this is the same (I think) set of games they benched for that core count vs. frequency article, where the Intel 8-cores came out ahead of the 7700K on average using these games that generally scale well across more than 4C/8T.
> 
> So it's not surprising that even though the games benefit from more cores, the 7700K is still slightly ahead of the 1800X in this test, if only slightly. So I wouldn't be so quick to say 'I don't believe them', which is just a bizarre reaction. At least explain why.


Two things, basically.

1. They only picked games where 6/12 or 8/16 perform as well as, or slightly better than, 4/8.

or

2. Those tests are false, as in most of the games they tested the 7700K, and possibly the 6700K, perform better than all the 6/12 and 8/16 parts, but for some reason they showed otherwise.

So basically, in the first case they cherry-picked a bunch of games that actually do work as well as or better on 6/12 and 8/16. In the second, they just showed false info, because it's really weird that in all those games the 4/8 parts perform worse than 6/12 and 8/16.


----------



## budgetgamer120

Quote:


> Originally Posted by *Mad Pistol*
> 
> GPU @ 97-98%.
> 
> This matches almost exactly what I get with a single 1070.
> 
> ... and people said Ryzen isn't good for gaming.


I am over arguing with salty 7700K defenders and concern trolls. Just sharing info for whoever is interested, and trying to convince myself to prematurely ditch my X99 setup for a 1700.

Quote:


> Originally Posted by *SoloCamo*
> 
> See I couldn't hope to record 1080p and hold over 100fps on my 4790k with a 64 player map like that.
> 
> Regardless of recording, the Ryzen CPUs are great for this engine and game.
> 
> The problem with that blanket statement is that it entirely depends on the game. The only intensive games I typically play do make great use of the extra threads. Ryzen holds a higher min fps than my 4790K.
> 
> If you tend to keep your CPUs a long time, I think Ryzen is clearly the better choice moving forward as long as you are in the 4GHz range.


There are lots of things possible on Ryzen that aren't on other mainstream systems.


----------



## kd5151

Look at the CPU load. It never went over 40%? A 7700K would be at 70%+ while the 7600K would be at 80-90%+ in BF1.


----------



## Mad Pistol

Quote:


> Originally Posted by *budgetgamer120*
> 
> I am over arguing with salty 7700K defenders and concern trolls. Just sharing info for whoever is interested, and trying to convince myself to prematurely ditch my X99 setup for a 1700.


Honestly, if you're happy with your X99 setup, why would you ditch it? I am all for people supporting AMD, but getting an R7 1700 for the sake of getting an R7 1700 is not something I would recommend.
Quote:


> Originally Posted by *kd5151*
> 
> Look at the CPU load. It never went over 40%? A 7700K would be at 70%+ while the 7600K would be at 80-90%+ in BF1.


True. Depending on the map, my 4790k @ stock will reach around 70-80%. It never maxes out on BF1.


----------



## aberrero

Quote:


> Originally Posted by *budgetgamer120*
> 
> For those with an i5 3570K: this user did not run the benchmarks himself, but asked a friend who upgraded to Ryzen to do it. So I'm not sure how you all will take it.
> 
> But here is the i5 3570K @ 4GHz vs. a stock 1700.


This is crap. He put the most expensive DDR3 RAM up against slower DDR4, and overclocked the Intel chip against the slowest Ryzen in a motherboard that doesn't turbo.


----------



## CULLEN

Quote:


> Originally Posted by *SoloCamo*
> 
> We have not seen all that Kaby Lake can offer, but we've seen the best that Ryzen offers? What is this nonsense? How much are they paying you? Seriously, this is the crap that makes this site look unreliable.
> 
> Ryzen is a lot closer in gaming performance to the 7700K than the 7700K is to Ryzen in multithreaded workloads. As far as I'm concerned, I'm getting far more for my money. And the fact that you said Nvidia gives you more for your money is a crock too. Price/performance is literally AMD's game on both fronts. Nvidia gave me more for my money when I picked up my 290X over the Titan at double the price, too, right?


Bro, haven't you heard? The 2-day-old platform is at its pinnacle performance, but Kaby Lake is still in its infancy. Hail Intel!

And like, of course, nVidia is the king of price/performance, the Titan X is mad value, dog! Hail nVidia.
Quote:


> Originally Posted by *ducegt*
> 
> I haven't bought nVidia since the 5850. I prefer the value of AMDs video cards. I tend to keep my CPUs longer and I want the highest minimum frames possible, so I've been with Intel since Conroe. I'll actually purchase AMD over nVidia even if it's in all aspects inferior. I do value the innovation they drive, *but I generally buy what is best for my needs.* Freesync is awesome.


God bless your logical, unbiased heart.
Quote:


> Originally Posted by *Mad Pistol*
> 
> Honestly, if you're happy with your X99 setup, why would you ditch it? I am all for people supporting AMD, but getting an R7 1700 for the sake of getting an R7 1700 is not something I would recommend.
> True. Depending on the map, my 4790k @ stock will reach around 70-80%. It never maxes out on BF1.


Sure supporting AMD is good and all, but ditching X99 which is completely sufficient is not the smartest move.
Quote:


> Originally Posted by *Slomo4shO*
> 
> The 1700 is the clear winner over the 1700X.
> 
> A R3 1100 would make for an interesting budget serve.


I'm not sure the R3 will support ECC, if that's something you need; regardless, it will probably make a superb server chip.


----------



## budgetgamer120

Quote:


> Originally Posted by *Mad Pistol*
> 
> Honestly, if you're happy with your X99 setup, why would you ditch it? I am all for people supporting AMD, but getting an R7 1700 for the sake of getting an R7 1700 is not something I would recommend.
> True. Depending on the map, my 4790k @ stock will reach around 70-80%. It never maxes out on BF1.


Well the longer I keep it the more money I lose. Also I am using a lower clocked xeon. Ryzen would be faster straight across the board in everything.

I am happy with my setup though, but only 3 months old... I could have waited


----------



## Undervolter

Quote:


> Originally Posted by *kd5151*
> 
> look at the cpu load. never went over 40%? 7700k would be at 70%+ while the 7600k 80-90%+ in bf1.


Things can be worse. A 7700K will always be better, as long as the game doesn't completely saturate it. This picture shows that it's close to saturation, but not there yet:



The good thing about the 7700 is that probably few games in the next couple of years will reach the point of complete saturation. "Probably" being the key word here; it's a matter of predicting the future. Since there are still many i3 and i5 users out there, game developers have to think about them too. The more they think about them, the safer a 7700 is. At some point in the future, the 7700 will be saturated in some games. When that happens, it will start falling behind Ryzen, for the simple fact that Ryzen's total throughput is superior to the 7700's. Luckily for 7700 users, even then the game will still be very playable for them; Ryzen will simply have more fps, but in the end, if it's playable, why care? Ryzen is certainly more future-proof than the 7700. It may not be more future-proof than a new Intel CPU with much higher IPC and fewer cores, but it is more future-proof than the 7700K. Is this a problem for current 7700K users? No, they will probably upgrade their CPU before they encounter game-threateningly low FPS.


----------



## CULLEN

Quote:


> Originally Posted by *budgetgamer120*
> 
> Well the longer I keep it the more money I lose. Also I am using a lower clocked xeon. Ryzen would be faster straight across the board in everything.
> 
> I am happy with my setup though, but only 3 months old... I could have waited


Ah, actually I hadn't thought about it that way. Uhm, upgrading from X99 to Ryzen, in that case, could make sense, but how much are you saving/losing by going Red? If it's completely minimal then I'd stick with the X99 tbh.

And sure you could have waited but with that mentality, you'd never buy anything.


----------



## Kpjoslee

Quote:


> Originally Posted by *budgetgamer120*
> 
> Well the longer I keep it the more money I lose. Also I am using a lower clocked xeon. Ryzen would be faster straight across the board in everything.
> 
> I am happy with my setup though, but only 3 months old... I could have waited


I wonder what tempted you to build an X99 setup 3 months ago. By that time, it was already known that Ryzen was coming around now. You could have waited and seen.


----------



## Mad Pistol

Quote:


> Originally Posted by *budgetgamer120*
> 
> Well the longer I keep it the more money I lose. Also I am using a lower clocked xeon. Ryzen would be faster straight across the board in everything.
> 
> I am happy with my setup though, but only 3 months old... I could have waited


It's your call. The good news about AM4 is that AMD has promised to keep the platform for at least 4 years, so at least you know Ryzen will have great support.


----------



## CULLEN

Quote:


> Originally Posted by *Kpjoslee*
> 
> I wonder what tempted you to build an X99 setup 3 months ago. By that time, it was already known that Ryzen was coming around now. You could have waited and seen.


Honestly, AMD's reputation had been a bit bruised; I bet most people expected just another Bulldozer, and understandably so.


----------



## oxidized

Quote:


> Originally Posted by *Undervolter*
> 
> Things can be worse. A 7700K will always be better, as long as the game doesn't completely saturate it. This picture shows that it's close to saturation, but not there yet:
> 
> 
> 
> The good thing about the 7700 is that probably few games in the next couple of years will reach the point of complete saturation. "Probably" being the key word here; it's a matter of predicting the future. Since there are still many i3 and i5 users out there, game developers have to think about them too. The more they think about them, the safer a 7700 is. At some point in the future, the 7700 will be saturated in some games. When that happens, it will start falling behind Ryzen, for the simple fact that Ryzen's total throughput is superior to the 7700's. Luckily for 7700 users, even then the game will still be very playable for them; Ryzen will simply have more fps, but in the end, if it's playable, why care? Ryzen is certainly more future-proof than the 7700. It may not be more future-proof than a new Intel CPU with much higher IPC and fewer cores, but it is more future-proof than the 7700K. Is this a problem for current 7700K users? No, they will probably upgrade their CPU before they encounter game-threateningly low FPS.


Exactly, but what if 4/8 still holds up for years? And what if it becomes obsolete within the current year? And at that point, why does AMD have a few 4/8 parts in their "gaming" Ryzen 5 lineup? From what we know, they could have made only the 1600X a 6/12 and the rest 4/8 and 4/4.


----------



## Quantum Reality

Quote:


> Originally Posted by *budgetgamer120*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mad Pistol*
> 
> Honestly, if you're happy with your X99 setup, why would you ditch it? I am all for people supporting AMD, but getting an R7 1700 for the sake of getting an R7 1700 is not something I would recommend.
> True. Depending on the map, my 4790k @ stock will reach around 70-80%. It never maxes out on BF1.
> 
> 
> 
> Well the longer I keep it the more money I lose. Also I am using a lower clocked xeon. Ryzen would be faster straight across the board in everything.
> 
> I am happy with my setup though, but only 3 months old... I could have waited

Such a high-end setup is pretty solid for the next two years or so, I would think. Plus, in that time, Zen will get improvements and when you do finally make the upgrade you'll have the benefit of AMD's intensive research and upgrade work on AM4.


----------



## ducegt

Quote:


> Originally Posted by *CULLEN*
> 
> Bro, haven't you heard? The 2-day-old platform is at its pinnacle performance, but Kaby Lake is still in its infancy. Hail Intel!


You seemingly didn't comprehend the argument about what memory was used. Also, is that the funniest thing you could come up with? Calling me bro, hail Intel? ...sigh.

Titan X provided more performance at launch for the extra cash. I've never been one so short sighted to buy one, but if someone wants the best and has the cash, more power to them. What doesn't make sense is when someone wants to set time trial laps around a circuit by purchasing a Lamborghini when a cheaper Corvette will offer more performance. The hardware market generally doesn't have such discrepancies.


----------



## budgetgamer120

The Division, Ryzen 1700X


----------



## Mad Pistol

Quote:


> Originally Posted by *budgetgamer120*
> 
> The Division, Ryzen 1700X


And again, we see a GTX 1080 @ 98-99% usage throughout the test.

A Ryzen R7 8 core is a very good processor for gaming.

The more I see these videos, the more I realize that "Ryzen is not for games" is simply not true.


----------



## budgetgamer120

Quote:


> Originally Posted by *CULLEN*
> 
> Ah, actually I hadn't thought about it that way. Uhm, upgrading from X99 to Ryzen, in that case, could make sense, but how much are you saving/losing by going Red? If it's completely minimal then I'd stick with the X99 tbh.
> 
> And sure you could have waited but with that mentality, you'd never buy anything.


Quote:


> Originally Posted by *Kpjoslee*
> 
> I wonder what tempted you to build x99 setup 3 months ago. During that time, it has been already known that Ryzen is coming around this time. You could have waited and see.


Quote:


> Originally Posted by *Mad Pistol*
> 
> It's your call. The good news about AM4 is that AMD has promised to keep the platform for at least 4 years, so at least you know Ryzen will have great support.


I use my pc heavily for development and I needed more cores and did not think Ryzen would deliver this much performance.

I am pleased with my rig, but the thought that I could have gotten more if I had waited kinda stings.

Plus I am getting trolled by my friend who waited









----------



## Mad Pistol

Quote:


> Originally Posted by *budgetgamer120*
> 
> I use my pc heavily for development and I needed more cores and did not think Ryzen would deliver this much performance.
> 
> I am pleased with my rig, but the thought that I could have gotten more if I had waited kinda stings.
> 
> *Plus I am getting trolled by my friend who waited*
> 


There's the real reason.









Go for it dude!!!


----------



## Undervolter

Quote:


> Originally Posted by *oxidized*
> 
> Exactly, but what if 4/8 still holds up for years? And what if it becomes obsolete within the current year? And at that point, why does AMD have a few 4/8 parts in their "gaming" Ryzen 5 lineup? From what we know, they could have made only the 1600X a 6/12 and the rest 4/8 and 4/4.


This is why I say that this is about predicting the future. I don't follow games closely; I know what I see posted in the AMD forum. I don't believe the 7700 will become problematic in the next few years, because if a game becomes problematic on an i7, it means it will be at the limits of playability on several i5s (let's not forget that i5s go back several generations) and unplayable on several i3s. I don't think developers are going to push that hard in most games. A few games might be the exception after, say, 3 years. What I am sure about is that if someone were to keep the 7700 and the Ryzen 7 indefinitely and see which one falls to its knees first, it would be the 7700. But by then it will probably have exceeded its useful life by gamers' standards.

EDIT: Bottom line is this: developers see PC users as potential wallets. When they see many potential wallets running i3s and i5s, they won't push requirements so hard overnight as to render them obsolete, because those users might not buy their game. So predicting how fast a 7700 will become obsolete is the same as predicting how fast an i3, i5, and finally an i7 will become "entry level". Unless a big mass of users moves to more cores quickly, I don't think an i7 will become obsolete in a short time.


----------



## Digitalwolf

Quote:


> Originally Posted by *budgetgamer120*
> 
> Well the longer I keep it the more money I lose. Also I am using a lower clocked xeon. Ryzen would be faster straight across the board in everything.
> 
> I am happy with my setup though, but only 3 months old... I could have waited


The way I look at this is... let's say you waited the 3 months. Then you got a Ryzen, and the various comments started to be made. Maybe you'd get a bad board, or one that needs a BIOS update... or you'd simply end up thinking it's not quite what it should be. Then you'd look back 3 months and wonder why you didn't build the X99.

On the other hand, if you happened to have the money and got a Ryzen setup and were happy with it, there are enough negative things being said about Ryzen (justified or not) that you can still turn around and sell your X99 parts easily enough. I sold a 5820K a few months back and a 6800K recently. I haven't put the X99 motherboard up for sale yet because... well, I guess I'm lazy with that. Plus I might have to mail it, and the box is bigger than the USPS boxes I normally use.

On the plus side, I have a lot more space in the apartment due to Ryzen. I sold a lot of stuff I wasn't using anymore... got my CPU and mobo and ordered my new waterblock. Recycling at its best.


----------



## Kpjoslee

Quote:


> Originally Posted by *budgetgamer120*
> 
> I use my pc heavily for development and I needed more cores and did not think Ryzen would deliver this much performance.
> 
> I am pleased with my rig, but the thought that I could have gotten more if I had waited kinda stings.
> 
> Plus I am getting trolled by my friend who waited
> 


In that case, you can just ride what you have and wait for Ryzen+. There are still quite a lot of issues that have to be ironed out with the Ryzen platform, and they should all be fixed up by then lol.


----------



## Game256

Are there any reviews with overclocking on B350 boards, the Asus B350 in particular?


----------



## budgetgamer120

Quote:


> Originally Posted by *Mad Pistol*
> 
> There's the real reason.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Go for it dude!!!


Lol

Or I will source a Xeon with a higher thread count than what Ryzen offers.

Quote:


> Originally Posted by *Digitalwolf*
> 
> The way I look at this is... let's say you waited the 3 months. Then you got a Ryzen, and the various comments started to be made. Maybe you'd get a bad board, or one that needs a BIOS update... or you'd simply end up thinking it's not quite what it should be. Then you'd look back 3 months and wonder why you didn't build the X99.
> 
> On the other hand, if you happened to have the money and got a Ryzen setup and were happy with it, there are enough negative things being said about Ryzen (justified or not) that you can still turn around and sell your X99 parts easily enough. I sold a 5820K a few months back and a 6800K recently. I haven't put the X99 motherboard up for sale yet because... well, I guess I'm lazy with that. Plus I might have to mail it, and the box is bigger than the USPS boxes I normally use.
> 
> On the plus side, I have a lot more space in the apartment due to Ryzen. I sold a lot of stuff I wasn't using anymore... got my CPU and mobo and ordered my new waterblock. Recycling at its best.


Yeah, the X99 boards are a beauty with all the bells and whistles. Not sure if Intel's next-gen prosumer CPUs will require a new socket either.
Quote:


> Originally Posted by *Kpjoslee*
> 
> In that case, you can just ride what you have and wait for Ryzen+. There are still quite a lot of issues that have to be ironed out with the Ryzen platform, and they should all be fixed up by then lol.


Not sure how long Ryzen+ will take.


----------



## Xuper

Which B350 mobo has the best VRMs?


----------



## budgetgamer120

Quote:


> Originally Posted by *Mad Pistol*
> 
> And again, we see a GTX 1080 @ 98-99% usage throughout the test.
> 
> A Ryzen R7 8 core is a very good processor for gaming.
> 
> The more I see these videos, the more I realize that "Ryzen is not for games" is simply not true.


Yes, Overwatch looks good too. This is an 1800X.


----------



## Hueristic

Quote:


> Originally Posted by *budgetgamer120*
> 
> I use my pc heavily for development and I needed more cores and did not think Ryzen would deliver this much performance.
> 
> I am pleased with my rig, but the thought that I could have gotten more if I had waited kinda stings.
> 
> Plus I am getting trolled by my friend who waited
> 


Since the "Zen is garbage for gaming" masses will keep your CPU/mobo prices up for a while, I would (not really, I don't troll) wait, get the best mobo (I doubt it will be the one he got), and then troll him for not waiting long enough.


----------



## kaosstar

Quote:


> Originally Posted by *Hueristic*
> 
> Nexus has zero credibility. After watching that video with Joker (at least as far as I could with that Nexus fool babbling), it is obvious he had no clue how to handle newly released hardware. The moron wasn't even man enough to just admit he made a mistake, and instead tried to cry about how hard it is to OC. Anyone who knows anything knows to check for firmware updates on newly released hardware, and you cannot judge a CPU by tests on one motherboard. He probably thinks testing is just swapping parts out, getting a driver to work, and that is that. That may work for GPU testing, but it shows a complete lack of training. The boy is ill-equipped for the task and more than likely doesn't know a bit from a byte.


The Ryzen reviews have definitely showed me who the elite reviewers are and who the trash reviewers are. Nexus was actually becoming one of my favorites, but now they're essentially blacklisted from my future browsing.


----------



## TopicClocker

Quote:


> Originally Posted by *Mad Pistol*
> 
> GPU @ 97-98%.
> 
> This matches almost exactly what I get with a single 1070.
> 
> ... and people said Ryzen isn't good for gaming.


Were people really trying to say Ryzen isn't good for gaming? I thought they were concerned with *how* good it is?
If they're claiming that it isn't good they're absolute fools.

I'd like to see how it handles Battlefield 1 64 Player Conquest at 120 fps, it would be really cool to see if it can maintain an average of 120+ fps consistently with minimums close to 120 fps.
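Consistency like that is usually judged from a frame-time log rather than the FPS counter, by reporting the 1% low alongside the average. A minimal sketch of the arithmetic (the frame times below are invented, not from any review):

```python
# Compute average FPS and 1% low FPS from a list of per-frame times (ms).
# The frame times here are invented sample data, not real benchmark output.

def fps_stats(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # 1% low: mean FPS over the slowest 1% of frames
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    one_pct_low = 1000.0 * n / sum(worst[:n])
    return avg_fps, one_pct_low

# 297 smooth frames at 7.5 ms (~133 FPS) plus three 12 ms hitches
frames = [7.5] * 297 + [12.0] * 3
avg, low = fps_stats(frames)
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")
```

A high average can hide exactly the hitching that ruins a 120 Hz target, which is why the 1% low is the number to watch.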


----------



## kzone75

I just watched TTL for an hour. Ryzen can't be all bad if he's in a good mood. lol I was also wondering about Elric and his rant in the latest video. Can one be certain to get an honest review from him now, when even the name Ryzen (according to him) sucks..?


----------



## Rmerwede

Man, if I were PR at AMD, I would have only given this to select outlets and made sure the systems were updated and set up properly.

Anyone who can plug a chip into a socket is a "reviewer". Now their stock is down almost 2 bucks. This is the equivalent of Yelpers being real food critics. AMD has the worst PR dept. ever.

Let the wannabes get them off the shelves later and say what they want.

In 3/4/5K gaming you won't even notice a difference. You may even get higher min frame rates.


----------



## razielfury

explain this

https://www.youtube.com/watch?v=z7s-9RmCA8E


----------



## Digitalwolf

Quote:


> Originally Posted by *Rmerwede*
> 
> Man, if I were PR at AMD, I would have only given this to select outlets and made sure the systems were updated and set up properly.
> 
> Anyone who can plug a chip into a socket is a "reviewer". Now their stock is down almost 2 bucks. This is the equivalent of Yelpers being real food critics. AMD has the worst PR dept. ever.
> 
> Let the wannabes get them off the shelves later and say what they want.
> 
> In 3/4/5K gaming you won't even notice a difference. You may even get higher min frame rates.


It kind of makes me wonder, even in some of the super "positive" reviews I've seen, when I finally see the reviewer in a "here is a video of me changing hardware or setting up the system" clip...

1) I watch a guy install a video card who seems to have never done it before (yet they have a review site and a review kit... yes, it's for a CPU, but they should know how to install a GPU).

2) And this is nitpicking, but... on the AM4 boards the RAM slots are A1, A2, B1, B2, and if you are using two sticks of RAM you use A2/B2 (per the manual, etc.). It's been this way for a long time... so of course the reviewer is using A1/B1. It might not really make a difference, but I've never seen a good argument for not following the manual on that.

Those are just the things I remember... I'm sure I could easily spot more. Now I don't claim to be an expert, but I'm not running a review site either. Then again, since I have to pay for my hardware, I at least do enough research to install things properly, update things, etc.

So ya... there should be some requirements to get a review kit... no doubt.


----------



## sumitlian

So all the speculations before launch were right.








Every single one of them.

AMD said 40% IPC at first.
Then a week before launch, AMD surprised us with claims of an IPC gain of 52%+. COOL.
The average single-thread IPC improvement in the real-world tests done by the respected The Stilt is >54%, with up to 140% in 7-Zip. Nothing COOLER could ever happen for us consumers.
And then Joker on YT showed how well a 3.9 GHz AMD CPU is doing against *latest* Intel 5.0 GHz technology in many AAA games. *I don't have words to say*

AMD has already given most of us more performance with Zen than they ever claimed, and that is something every user criticizing Ryzen in this thread should acknowledge, whether you own an i5 2500 @ 5.2 GHz or an i7 7700K @ 5.0 GHz, and whether or not you want to replace it with Ryzen. This 8-core CPU is not for every gamer who is already satisfied with an i5/i7, because it is not _just_ a gaming CPU. It has EIGHT cores. It might not give you a constant 120 FPS in some games, or even many games, but the market of gamers who need 120 FPS is insignificant compared to the professional and server markets, and Ryzen has been designed to do many things even a max-overclocked quad-core i7 can't touch. Ryzen 7 already has 2-4 cores' worth of resources left unused when running any AAA game.
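For anyone wanting to sanity-check those uplift claims: single-thread performance scales roughly as IPC × clock, so the claimed +52% IPC can be traded against Ryzen's lower clocks. A back-of-envelope sketch (the clock figures are illustrative, and it glosses over the fact that AMD's claim was measured against Excavator rather than the Piledriver-based FX-9590):

```python
# Back-of-envelope: single-thread performance scales roughly as IPC x clock.
# The +52% IPC figure is AMD's claim quoted above; clocks are illustrative.
old_ipc = 1.00                # normalize old FX-class IPC to 1.0
zen_ipc = old_ipc * 1.52      # AMD's claimed +52% IPC gain

fx_clock_ghz = 4.7            # e.g. an FX-9590 at stock turbo
zen_clock_ghz = 3.9           # typical Ryzen 7 all-core overclock

perf_ratio = (zen_ipc * zen_clock_ghz) / (old_ipc * fx_clock_ghz)
print(f"~{perf_ratio:.2f}x single-thread perf despite the lower clock")
```

Even with ~800 MHz less clock, the IPC gain leaves Zen comfortably ahead per thread in this rough model.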

And for now, all this is happening with not-so-optimized BIOSes and some alleged struggles with AMD's implementation of SMT in Windows.
If this is fixed, we may see further improvements in performance.

P.S.







Make Intel Great Again. Not that it wasn't great already.............but now?...not so much!


----------



## Slomo4shO

Quote:


> Originally Posted by *CULLEN*
> 
> I'm not sure the R3 will support ECC if that's something you need, regardless of that, will probably make a superb server chip.


The AM4 platform has ECC support. It will include the R3 and R5 series as well. As the reddit AMA suggests, the board manufacturers will dictate its availability.


----------



## JackCY

Quote:


> Originally Posted by *Slomo4shO*
> 
> The AM4 platform has ECC support. It will include the R3 and R5 series as well. As the reddit AMA suggests, the board manufacturers will dictate its availability.


Yes, ECC depends on the motherboard; the CPUs support it.
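For anyone who wants to verify ECC is actually active rather than just advertised: on Linux, the kernel's EDAC subsystem exposes per-memory-controller error counters under sysfs. A rough sketch (the paths exist only when an EDAC driver is loaded for your chipset; on Windows you'd have to query WMI instead):

```python
# Rough Linux-only check for active ECC via the kernel's EDAC interface.
# If no EDAC driver is loaded (or you're not on Linux), the directory
# simply won't exist -- which by itself suggests ECC isn't in use.
from pathlib import Path

def ecc_status():
    mc_root = Path("/sys/devices/system/edac/mc")
    controllers = sorted(mc_root.glob("mc*")) if mc_root.exists() else []
    if not controllers:
        return "no EDAC memory controller found (ECC likely inactive)"
    lines = []
    for mc in controllers:
        ce = (mc / "ce_count").read_text().strip()  # corrected error count
        ue = (mc / "ue_count").read_text().strip()  # uncorrected error count
        lines.append(f"{mc.name}: {ce} corrected, {ue} uncorrected")
    return "; ".join(lines)

print(ecc_status())
```

A nonzero corrected count with zero uncorrected is ECC doing its job; no controller at all usually means the board never enabled it.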


----------



## kd5151

Quote:


> Originally Posted by *razielfury*
> 
> explain this
> 
> https://www.youtube.com/watch?v=z7s-9RmCA8E


is that raja? lol ???


----------



## LesPaulLover

Quote:


> Originally Posted by *Undervolter*
> 
> This is why i say that this is about predicting the future. I don't follow games closely, i know what i see posted in the AMD forum. I don't believe 7700 will become problematic in the next few years, because if a game becomes problematic on i7, it means it will be at the limits of playability for several i5 (let's not forget that i5s go back several generations) and unplayable to several i3s. I don't think developers are going to push that in most games. A few games might be the exception after say 3 years. What i am sure about, is that if someone was to keep indefinitely the 7700 and the Ryzen 7 and see which one will first fall on its knees, it will be the 7700. But it will have probably exceeded its useful life by gamers' standards.
> 
> EDIT: Bottom line is this: Developers see PC users as potential wallets. When they see many potential wallets running i3s and i5s, they won't push overnight so hard in requirements, as to render them obsolete, because the users might not buy their game. So to predict how fast a 7700 will become obsolete, is the same with predicting, how fast an i3, i5 and finally i7 will become "entry level". Unless a big mass of users moves to more cores quickly, i don't think an i7 will become obsolete in short time.


Here's my personal experience over the past few years:

In 2014 I built a new rig (my current rig) running an i5 4670k (4c/4t). At that point in time the general consensus was "the i5 is the go-to price/performance gaming CPU" and "i7 multi-threading is useless in games." Fast forward two and a half years to the present, and I really wish I'd spent the extra $100 for the i7 4770k (4c/8t). In the past year or so games have FINALLY started fully saturating the capabilities of my 4670k - games like Overwatch, Civ 6, BF1, The Division... even purely single-player games like DOOM 2016 have started consistently pushing my 4670k @ 85%+ usage on all four cores.

It seems that 4c/8t CPUs are really starting to become the "gold standard" gaming CPUs.

Problem is, Intel never seems to reduce the price on their previous-generation CPUs, as you'd expect. This has long been a part of their marketing structure. You either upgrade your entire platform (motherboard+CPU) or pay the initial, full-launch price for a CPU Upgrade. A new 4790k to this day will still cost you around $320.

It's hard for me to justify paying $320 for a nearly three-year-old CPU when for roughly twice that amount I can move to a brand new AM4 platform that would include not only an 8c/16t CPU but ALSO a move to DDR4 DRAM.

ALSO -- People seem to be forgetting the "silicon lottery" aspect of overclocking. Just because you buy an Intel chip doesn't mean you're gonna be achieving 1.0GHz+ overclocks! For example my 4670k ABSOLUTELY MAXES OUT @ 4.1GHz. It does not matter how many volts I push through the chip. It'll do 3.8GHz @ only 1.125 Vcore. A stable 4.0GHz requires a move up to 1.215 Vcore. And moving up just 100MHz more, to 4.1GHz, requires going to 1.250 Vcore. It will NOT do 4.2GHz even if I push the max 1.300 Vcore through the chip.

Remember this is a 4670k....base clock of 3.4GHz and only 4cores/4threads! Roughly a 600-700MHz overclock is the best it will do.

I think it's actually quite impressive that ALL the Ryzen chips seem to be doing 3.9-4.0GHz; especially considering they're 8core/16thread chips! (OBVIOUSLY this makes the R7 1700 the true go-to Ryzen chip -- with the R7 1800X only existing for server/workstation use and end users who aren't interested in any manual overclocking)


----------



## Offender_Mullet

Quote:


> Originally Posted by *razielfury*
> 
> explain this
> 
> https://www.youtube.com/watch?v=z7s-9RmCA8E


At least link to the original content creator's vid.


----------



## Catscratch

Gotta admit, it's their first SMT chip. And no one's even talking about ECC support. Yeah, no more Xeons if the mobo supports it. We are weird people. No one actually expected Ryzen to be on par with the 6900K, let alone the 7700K, until the leaks.


----------



## Shatun-Bear

I never knew of so many people using their PC purely for gaming _AND NOTHING ELSE AT ALL_ until this past week. It seems to me, from reading the comments on forums up and down the vast webosphere, that in actual fact the percentage of the PC market who do not do anything but GAME on their rig is about *95% of the total market*. I have been led to believe that people like me, who use Photoshop, ZBrush, browse the internet, stream stuff and watch movies on my PC, are a tiny minority. Funny that.

Or have I been deceived, and people are not telling the whole truth (spoiler: to suit their argument)?


----------



## Sheyster

Quote:


> Originally Posted by *SoloCamo*
> 
> I'll take a 4.1ghz 8c16t over a 5ghz 4c8t...


Ummm... NFW. I'd be on board with 4.5, but certainly not 4.1.
Quote:


> Originally Posted by *gigafloppy*
> 
> Conclusion:
> - If all you do is gaming, don't buy it.
> - If all you do is rendering, BUY IT, NOW!!!


This...


----------



## SoloCamo

Quote:


> Originally Posted by *Sheyster*
> 
> Ummm... NFW. I'd be on board with 4.5, but certainly not 4.1.
> This...


Do you really think 400mhz is going to be that big of a difference in reality? It's not going to make or break anything.

For a well-rounded CPU, the extra cores/threads and cache at similar enough IPC more than outweigh any cons.

On a side note, absolutely no one has given AMD any credit for how power efficient these are. When AMD was on the side of poor power consumption it was the biggest deal ever... now that they are more than competitive it doesn't seem to matter at all?


----------



## Quantum Reality

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sheyster*
> 
> Ummm... NFW. I'd be on board with 4.5, but certainly not 4.1.
> This...
> 
> 
> 
> Do you really think 400mhz is going to be that big of a difference in reality? It's not going to make or break anything.
> 
> For a well rounded cpu, the extra cores/threads and cache at similar enough IPC more then outweigh any cons.
> 
> On a side note, absolutely no one has given AMD any credit in regards to how power efficient these are? When AMD was on the side of having poor power consumption it was the biggest deal ever... now that they are more then competitive it doesn't seem to matter at all?

mmhmm. I think the days of 140W+ TDP CPUs from AMD are pretty much gone for good. Not having to deal with as much heat means components will last longer which is all to the good in the long run.


----------



## IRobot23

http://www.phoronix.com/scan.php?page=article&item=amd-ryzen-cores&num=2

CCX L3 (separated) is not a bottleneck!


----------



## cssorkinman

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sheyster*
> 
> Ummm... NFW. I'd be on board with 4.5, but certainly not 4.1.
> This...
> 
> 
> 
> Do you really think 400mhz is going to be that big of a difference in reality? It's not going to make or break anything.
> 
> For a well rounded cpu, the extra cores/threads and cache at similar enough IPC more then outweigh any cons.
> 
> On a side note, absolutely no one has given AMD any credit in regards to how power efficient these are? When AMD was on the side of having poor power consumption it was the biggest deal ever... now that they are more then competitive it doesn't seem to matter at all?

To be honest, power consumption is my only real complaint with the FXs, and Ryzen can match an FX's Cinebench score at 30 watts versus the 240 watts the FX would need to produce it - amazing.


----------



## SoloCamo

Quote:


> Originally Posted by *cssorkinman*
> 
> To be honest, power consumption is my only real complaint with the FX's and Ryzen can match it for cinebench scores at 30 watts vs what an FX would need 240 watts to produce - amazing.


Link to that article?

Really looking forward to what AMD can do in the ULV / 15w tdp range. Would love to have a Ryzen APU replacement for my puma based quad core laptop.


----------



## iLeakStuff

Quote:


> Originally Posted by *Undervolter*
> 
> I don't read game reviews, but i know about this, because it was posted in the main AMD subforum. This is how i first heard of "Gamers Nexus" and what i think of their reliability and "expertise".
> 
> Do you see anything...odd in this graph?
> 
> 
> 
> You have a 3.2Ghz FX scoring 31FPS min, a 3.3Ghz FX scoring 33 FPS and a 4.7Ghz scoring... 15 FPS. Ooops! That sounds a bit strange! But fear not, because Steve has the rational explanation!
> Basically, he says that this is a "characteristic trait of the 9590". Meaning that AMD actually released a CPU that at 4.7GHz performs WORSE than the same chip at 3.2GHz. Ok, some will say "human error". No! Several OCN members in the comments tried to point it out and offer possible explanations. Did it achieve anything? No! Because Steve "knew better". Ironically, Bethesda had put that CPU, which according to Steve makes 15 FPS, on its recommended hardware list.
> 
> This was Steve's final conclusion: "Folks. The 9590 was provided -- by AMD, no less -- installed in the motherboard we tested with. We have two 9590-ready motherboards and two 9590s. The ASRock Extreme9 and Fatal1ty 990FX Professional are both in our hands.
> 
> Why would we use an incompatible board? Come on.
> 
> Just because we keep getting *uninformed*, offensively-phrased comments about the test results, I took the two hours to retest both boards and both CPUs for parity. We are seeing the issues on both. If your test methodology, location, or frame measurement differs from ours, your results will differ."
> 
> I have nothing else to add. I haven't read his Ryzen review. Cause there are many others i haven't either and i try to leave the ones i less trust for last. In science, what he did, is called "systematic error". He repeats the same error again and again and at the end, he attributes everything to the CPU having a "characteristic trait", no matter if the result goes against any reason.


GamersNexus for you:
Quote:


> Test Methodology
> Each game was tested for *30 seconds* in an identical scenario, then repeated three times for parity


Take their sloppy reviews with a LOT of salt...


----------



## Quantum Reality

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> To be honest, power consumption is my only real complaint with the FX's and Ryzen can match it for cinebench scores at 30 watts vs what an FX would need 240 watts to produce - amazing.
> 
> 
> 
> Link to that article?
> 
> Really looking forward to what AMD can do in the ULV / 15w tdp range. Would love to have a Ryzen APU replacement for my puma based quad core laptop.

That is also what I'm looking forward to later this year as well. Ryzen performance laptops will be a nice competitor to an increasingly expensive Intel laptop market.


----------



## moonbogg

I have a [email protected] I already bought an 1800X. Will the 1800X game better than a [email protected] you think? I honestly don't know if it will or not. Sticking with the 1800X for the fun of it, but wanted your opinions. I have an EK supremacy EVO that will cool the 1800X with lots of RAD and I expect between 4.0-4.2 max on the 1800X.


----------



## Undervolter

Quote:


> Originally Posted by *iLeakStuff*
> 
> GamersNexus for you:
> Take their sloppy reviews with a LOT of salt...


Even if he were to test for 5 mins, he would have still made a blunder. His max FPS are wrong too. If you read the comments, you will understand.

http://www.gamersnexus.net/game-bench/2182-fallout-4-cpu-benchmark-huge-performance-difference#!/ccomment-page=1

His problem was in the motherboards. He didn't want to admit it; he preferred the easy answer that this is a "characteristic trait of the 9590". Compare the max FPS with another website: http://www.techspot.com/review/1089-fallout-4-benchmarks/page5.html

Everything was laid on a plate for him. It didn't matter, because when your ego is bigger than your capacity for logical thinking...

In Gamers Nexus, going from 3.2 to 4.7GHz gave +5 max FPS. In techspot, it gave +23. Of course, min FPS at techspot was 63, not 15.


----------



## IRobot23

Quote:


> Originally Posted by *moonbogg*
> 
> I have a [email protected] I already bought an 1800X. Will the 1800X game better than a [email protected] you think? I honestly don't know if it will or not. Sticking with the 1800X for the fun of it, but wanted your opinions. I have an EK supremacy EVO that will cool the 1800X with lots of RAD and I expect between 4.0-4.2 max on the 1800X.


Making sure that you are using fast RAM is always important. If I were you, I would disable SMT for gaming. Try to get it to 3.9-4GHz (but stable under 1.365V fixed).

Gaming:
Ryzen has really fast cores, and it's been proven by all benchmarks. Maybe with optimized code they could do better, or be as fast as Intel. Intel has had their SMT design for quite some time; their budgets are not comparable.

The R7 1700, even if you look at it as 8C/8T (Phenom fashion), is still a deal that you cannot overlook.
At 3.9GHz with 3200MHz dual-channel DDR4, it should be faster than Sandy at 4.6GHz.


----------



## moonbogg

Quote:


> Originally Posted by *IRobot23*
> 
> Make sure that you are using fats ram is always important. If I were you I would disable SMT for gaming.


What is fats? You mean the rads? Alphacool UT60 360's (3 of them). Currently using two 980ti's but will switch to a single 1080ti. Game at [email protected] Concerned the 1800X will actually cost me some FPS vs the [email protected]

1800X will get [email protected] CAS 16.


----------



## IRobot23

Quote:


> Originally Posted by *moonbogg*
> 
> What is fats? You mean the rads? Alphacool UT60 360's (3 of them). Currently using two 980ti's but will switch to a single 1080ti. Game at [email protected] Concerned the 1800X will actually cost me some FPS vs the [email protected]
> 
> 1800X will get [email protected] CAS 16.


You don't need to get the 1800X; you can go with the 1700.
I would even go with faster RAM.
G.Skill.


----------



## cssorkinman

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> To be honest, power consumption is my only real complaint with the FX's and Ryzen can match it for cinebench scores at 30 watts vs what an FX would need 240 watts to produce - amazing.
> 
> 
> 
> Link to that article?
> 
> Really looking forward to what AMD can do in the ULV / 15w tdp range. Would love to have a Ryzen APU replacement for my puma based quad core laptop.

That was a comment The Stilt made in a post about voltage characteristics of Ryzen - I'll see if i can retrieve it. - https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/


----------



## Quantum Reality

Quote:


> Unlike with Intel's more recent CPUs, there is no asynchronous mode (straps / gears) available, which would allow stepping down the PCIe frequency at certain intervals.


This is absurd.

Since the LGA775 days people have been able to lock the PCI-E frequency independently of the internal clock speed of the CPU, so for AMD to neglect this a decade later is a kind of anklebiter move in an otherwise impressive CPU rollout.


----------



## budgetgamer120

Quote:


> Originally Posted by *moonbogg*
> 
> What is fats? You mean the rads? Alphacool UT60 360's (3 of them). Currently using two 980ti's but will switch to a single 1080ti. Game at [email protected] Concerned the 1800X will actually cost me some FPS vs the [email protected]
> 
> 1800X will get [email protected] CAS 16.


I think he meant *fast* ram


----------



## Hequaqua

Quote:


> Originally Posted by *cssorkinman*
> 
> That was a comment The Stilt made in a post about voltage characteristics of Ryzen - I'll see if i can retrieve it. - https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/


He also makes mention that some of the issues may be related to Windows 10.

I've been following that thread as well.


----------



## JackCY

Quote:


> Originally Posted by *IRobot23*
> 
> http://www.phoronix.com/scan.php?page=article&item=amd-ryzen-cores&num=2
> 
> CCX L3 (separated) is not bottleneck!


Surprise surprise.
Quote:


> Originally Posted by *iLeakStuff*
> 
> GamersNexus for you:
> Take their sloppy reviews with a LOT of salt...


30s benches? Are they kiddin' me? Not even air coolers will heat up in 30s of gaming. I guess if the game had a 5s built-in benchmark they would use that instead... because why bother making a 10-minute-long play, etc.
Many of the big review sites have a one-click bench of sorts: boot a specified drive with a preinstalled OS and benches, click start, go away for the rest of the day, done. Not ideal either, but if they are doing huge volumes of reviews it helps a lot and keeps consistency.
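The warm-up point is the crux: a credible harness discards an initial window so the measured window reflects steady-state temperatures, then repeats the run for consistency. A minimal sketch (durations are shortened for the demo; real runs would be minutes, not fractions of a second):

```python
# Minimal benchmark-harness sketch illustrating the warm-up point:
# run the workload unrecorded for a warm-up window, then count
# iterations in a measured window, repeated over several runs.
import time

def bench(workload, warmup_s=2.0, measure_s=5.0, runs=3):
    results = []
    for _ in range(runs):
        end = time.perf_counter() + warmup_s
        while time.perf_counter() < end:      # warm-up: run but don't record
            workload()
        iters, end = 0, time.perf_counter() + measure_s
        while time.perf_counter() < end:      # measured, steady-state window
            workload()
            iters += 1
        results.append(iters / measure_s)
    return results  # iterations per second, one value per run

rates = bench(lambda: sum(range(1000)), warmup_s=0.1, measure_s=0.2, runs=3)
print([f"{r:.0f}/s" for r in rates])
```

The spread between the repeated runs is itself useful: a wide spread means the scene or thermals aren't stable enough for the numbers to mean much.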


----------



## cssorkinman

Quote:


> Originally Posted by *Hequaqua*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> That was a comment The Stilt made in a post about voltage characteristics of Ryzen - I'll see if i can retrieve it. - https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/
> 
> 
> 
> He also make mention that some of the issues may be related to Windows 10.
> 
> I've been following that thread as well.

I plan on using Windows 7 and 10 with mine; I don't know if I'll go dual boot or just swap drives when I want to test one or the other.


----------



## BobiBolivia

Quote:


> Originally Posted by *Hequaqua*
> 
> He also make mention that some of the issues may be related to Windows 10.
> I've been following that thread as well.


Pardon me for this probably unrelated question, but:

If MS is supposed to fix the thread scheduler in Windows, do we know if MS will fix it in all "modern" versions of Windows (10/8.1/7) or only 10?
I don't mean to start any MS hate, just curious what actions MS can take...


----------



## Hequaqua

Quote:


> Originally Posted by *BobiBolivia*
> 
> Pardon me for this probably unrelated questions but:
> 
> If MS is supposed to fix that thread scheduler in Windows, do we know if MS will fix it in all "modern" versions of Windows (10/8.1/7) or only 10 ?
> I don't mean to start any MS-hate, just curious what actions MS can take...


I can't answer that... That's above my pay grade.









I'm sure if there are issues, which it appears there are in Windows, they will be addressed. To what extent, or in which versions, I haven't a clue. Same with some of the issues related to the motherboard BIOSes.

I'm just playing along and trying to learn as much as I can from *everyone*!


----------



## IRobot23

All AMD needs to do is Ryzen consoles...


----------



## 7850K

Quote:


> Originally Posted by *IRobot23*
> 
> All that AMD need to do is ryzen consoles...


is it confirmed yet whether Xbox scorpio will have ryzen cores or jaguar cores?


----------



## AliNT77

Quote:


> Originally Posted by *7850K*
> 
> is it confirmed yet whether Xbox scorpio will have ryzen cores or jaguar cores?


Nope, not yet, but Ryzen is highly unlikely.


----------



## budgetgamer120

Quote:


> Originally Posted by *JackCY*
> 
> Surprise surprise.
> 30s benches? Are they kiddin' me? Not even aircoolers will heat up in 30s gaming. I guess if the game had a 5s built in benchmark they would use that instead... because why bother making a 10min long play etc.
> Many of the big review sites have a one click bench kinda, boot a specified drive with preinstalled OS and benches, click start, go away for the rest of the day, done. Not ideal either but if they are doing huge volumes of reviews it helps a lot and keeps consistency.


I am disappointed that I liked their benchmarks... Did not know they were 30 secs.


----------



## Quantum Reality

Quote:


> Originally Posted by *cssorkinman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hequaqua*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> That was a comment The Stilt made in a post about voltage characteristics of Ryzen - I'll see if i can retrieve it. - https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/
> 
> 
> 
> He also make mention that some of the issues may be related to Windows 10.
> 
> I've been following that thread as well.
> 
> 
> I plan on using windows 7 and 10 with mine don't know if I'll go dual boot or just install different drives when I want to test one or the other.

I have a thread here where I've been asking about Win7's usability with Ryzen and the initial results are encouraging.

http://www.overclock.net/t/1624699/windows-7-and-ryzen/0_100


----------



## IRobot23

http://oc.jagatreview.com/2017/03/hands-on-amd-ryzen-7-1700-oc-3-9-ghz-ddr4-3200-di-msi-b350-tomahawk-2/

A $120 motherboard, 3.9GHz, and 3200MHz dual-channel DDR4? ...


----------



## Hueristic

Quote:


> Originally Posted by *IRobot23*
> 
> http://oc.jagatreview.com/2017/03/hands-on-amd-ryzen-7-1700-oc-3-9-ghz-ddr4-3200-di-msi-b350-tomahawk-2/
> 
> 120$ MB, 3.9GHz and 3200MHz dual channel DDR4? ...


Please link English. Thx


----------



## Quantum Reality

The pictures do a pretty decent job of explaining what they're doing, though.


----------



## budgetgamer120

Ok guys... Decided to skip Ryzen for now and turn my attention to the Nintendo Switch


----------



## CriticalOne

Quote:


> Originally Posted by *Slomo4shO*
> 
> Because the comparable 4/8 models that are launching in a couple of months will be half the price of a 7700K?


I can't buy a 4C/8T Ryzen CPU yet.

I don't get what is so controversial with comparing two similarly priced CPUs.


----------



## zGunBLADEz

Quote:


> Originally Posted by *umeng2002*
> 
> The 720p tests are valid because they expose the R7's weakness in some games better than a higher resolution test.
> 
> Think of it like a magnifying glass.
> 
> TODAY, with TODAY'S GPUS, show no issue with R7 at 4K.
> 
> R7 might not be able to feed TOMORROW'S GPUS fast enough compared to a 7700k.










Like I want my CPU to be running pegged at 99% XD

As a PC gamer myself, that's a no-no on any setup I build, and as soon as I start seeing it, it's time to upgrade.

Anyway, I'm going to play games at 720p on my GTX 1080..

These CPU bottleneck tests are useless, again and again and again. There's no 5 feet in that FCAT XD


----------



## Blameless

Quote:


> Originally Posted by *Quantum Reality*
> 
> This is absurd.
> 
> Since the LGA775 days people have been able to lock the PCI-E frequency independently of the internal clock speed of the CPU, so for AMD to neglect this a decade later is a kind of anklebiter move in an otherwise impressive CPU rollout.


Many _modern_ Intel platforms can't lock the PCI-E frequency independently of the reference clock.


----------



## Ultracarpet

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ok guys... Decided to skip Ryzen for now and turn my attention to the Nintendo Switch


Pffff enjoy having less fps than a 7700k @ 6ghz


----------



## Malinkadink

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ok guys... Decided to skip Ryzen for now and turn my attention to the Nintendo Switch


Enjoy the buggy controller with connection issues then







Zelda is the only saving grace the system currently has, and I'd hardly say it's worth $360 just for the privilege of playing it. I would rather buy a refurb Wii U from Nintendo for $200 + new Zelda + other Wii U exclusives, imho.

Anyways, on topic, the debate is still going on between 7700k and R7?


----------



## 12Cores

OP please add this overclocking and optimization guide:

https://www.youtube.com/watch?v=IKGJshXgOwU&t=911s


----------



## umeng2002

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 
> 
> 
> 
> 
> 
> 
> Like i want my cpu to be running at 99% pegged XD
> 
> 
> 
> 
> 
> 
> 
> 
> 
> as a pc gamer myself thats a no no on any setup i build and as soon i start seeing that is time to upgrade
> 
> Anyway im going to play games a 720P on my 1080 GTX..
> 
> This cpu bottleneck tests are useless again and again and again there no 5 feet in that FCAT XD


No, you don't understand... the *GPU* should be pegged in the high nineties all the time if the CPU can feed it well enough.

So a fast CPU can feed a 99% utilized GPU at a low CPU utilization.

A slower CPU might get a GPU to 99% utilization but at a higher CPU utilization.

This isn't a hard concept to understand.
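The point reduces to one line of arithmetic: per-frame cost is bounded by whichever of CPU prep or GPU render is slower, so a faster CPU only buys FPS while the CPU is the limiting side. A toy model, not real profiling:

```python
# Toy model: each frame costs max(cpu_ms, gpu_ms), so CPU speed only
# matters while the CPU side is slower than the GPU side.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 10.0                  # GPU needs 10 ms/frame -> 100 FPS ceiling
fast_cpu, slow_cpu = 6.0, 9.0  # both feed the GPU in under 10 ms

print(fps(fast_cpu, gpu_ms), fps(slow_cpu, gpu_ms))  # both GPU-bound
print(fps(12.0, gpu_ms))                             # CPU-bound: FPS drops
```

This is also why low-resolution tests shrink `gpu_ms` on purpose: it forces the game to the CPU-bound side of the `max()` so CPU differences become visible.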


----------



## sugarhell

Quote:


> Originally Posted by *12Cores*
> 
> OP please add this overclocking and optimization guide:
> 
> https://www.youtube.com/watch?v=IKGJshXgOwU&t=911s


90C. Someone tell him that this is not an Intel CPU


----------



## zGunBLADEz

Quote:


> Originally Posted by *umeng2002*
> 
> No, you don't understand... the
> 
> GPU
> 
> should be pegged in the high nineties all the time if the CPU can feed it well enough.
> 
> So a fast CPU can feed a 99% utilized GPU at a low CPU utilization.
> 
> A Slower CPU might get a GPU to 99% utilization but at a higher CPU utilization.
> 
> This isn't a hard concept to understand.


Yeah, like I want micro stutters or hard drops on the roller coaster ride... with that background process waiting anxiously to ask the CPU for some cycles XD


----------



## budgetgamer120

Quote:


> Originally Posted by *sugarhell*
> 
> 90C. Someone tell him that this is not an Intel CPU


He said to disregard the temps.


----------



## 12Cores

Quote:


> Originally Posted by *sugarhell*
> 
> 90C. Someone tell him that this is not an Intel CPU


Ignore the temps; what the video shows is that if a BIOS update can get these things running at 4.5GHz, look out.


----------



## sugarhell

Quote:


> Originally Posted by *12Cores*
> 
> Ignore the temps; what the video shows is that if a BIOS update can get these things running at 4.5GHz, look out.


Almost impossible without LN2.


----------



## LancerVI

Quote:


> Originally Posted by *12Cores*
> 
> OP please add this overclocking and optimization guide:
> 
> https://www.youtube.com/watch?v=IKGJshXgOwU&t=911s


The quick takeaway for me is that the 1700 is just a bit shy of the 1700X/1800X, and at significantly reduced temps: 83-90°C vs 53°C.

I know he said he wasn't really monitoring that and the test methodology is a bit loose, but still. If that's anywhere near how it actually is, the 1700 non-X seems like a no-brainer to me.
Quote:


> Originally Posted by *sugarhell*
> 
> Almost impossible without ln2


I think the 1700 non-X may have a shot (silicon lottery depending) just based on thermal headroom.

EDIT: I know it wouldn't be much of an upgrade, but I'm seriously thinking of retiring my 5820K and getting a 1700. One, it's hard to resist the "upgrade" bug, and two, I can build a server out of my 5820K, which has always been a horrible OC'er anyway.


----------



## ryan92084

Quote:


> Originally Posted by *12Cores*
> 
> OP please add this overclocking and optimization guide:
> 
> https://www.youtube.com/watch?v=IKGJshXgOwU&t=911s


Thanks. Added this and several other links throughout the day.


----------



## lamminium

Quote:


> Originally Posted by *sumitlian*
> 
> So all the speculations before launch were right.
> 
> Every single one of them.
> 
> AMD said 40% IPC at first.
> Then a week before launch, we had AMD surprising us with claims of IPC gains of 52%+. COOL.
> The average single-thread IPC improvement from the real-world tests done by the respected The Stilt is >54%, with up to 140% in 7-Zip. Nothing COOLER can ever happen than this for us consumers.
> And then Joker on YT showed how well a 3.9 GHz AMD CPU is doing against the *latest* Intel 5.0 GHz technology in many AAA games. *I don't have words to say*
> 
> AMD has already given most of us more performance benefits with Zen than they ever claimed, and this is what every single user criticizing RyZen in this thread must acknowledge and wrap their head around forever, no matter whether you own an i5 2500 @ 5.2 GHz or an i7 7700K @ 5.0 GHz, or whether you want to replace it with RyZen or not. Because it is the ultimate truth. Indeed, this 8-core CPU is not for every type of gamer already satisfied by their i5s/i7s, because this RyZen 8-core CPU is not _just_ a gaming CPU. It has EIGHT cores. It might not give you a constant 120 FPS in some games, or even many games, but the market of gamers needing 120 fps is of no significance at all compared to the professional or server markets, and RyZen has been designed to do many things even a max-overclocked quad-core i7 won't cut it in. RyZen 7 already has the full resources of 2-4 cores left unused when running any of the AAA games.
> 
> And for now, all this is happening with not-so-optimized BIOSes and some alleged struggle with AMD's implementation of SMT in Windows.
> If this is fixed, we may see further improvements in performance.
> 
> P.S.
> 
> Make Intel Great Again. Not that it wasn't great already.............but now?...not so much!


I really don't know what the people on this forum were thinking. AMD made conservative promises and exceeded expectations.

Many people here said they would be happy with SB or IB performance a while back, but are now upset that Ryzen falls a _tiny bit_ short in gaming compared with Skylake (?!) while still holding respectable minimum framerates, without any optimisations *yet*. And then there's *power consumption*, which was a major deal-breaker during the FX era but is now somehow a non-issue, even though the first benchmarks showed the *R7 used less energy than the 7700K* under load, notwithstanding the fact that even _more BIOS updates and OS code improvements and whatnot_ will arrive in the coming months. And yeah, like someone previously mused, barely anyone on OCN uses PCs for work and productivity anymore, apparently.

This thread somewhat gives the illusion that OCN is principally an Intel fanclub.


----------



## ryan92084

Quote:


> Originally Posted by *lamminium*
> 
> I really don't know what the people on this forum were thinking. AMD made conservative promises and exceeded expectations.
> 
> Many people here said they would be happy with SB or IB performance a while back, but are now upset that Ryzen falls a _tiny bit_ short in gaming compared with Skylake (?!) while still holding respectable minimum framerates, without any optimisations *yet*. And then there's *power consumption*, which was a major deal-breaker during the FX era but is now somehow a non-issue, even though the first benchmarks showed the *R7 used less energy than the 7700K* under load, notwithstanding the fact that even _more BIOS updates and OS code improvements and whatnot_ will arrive in the coming months. And yeah, like someone previously mused, barely anyone on OCN uses PCs for work and productivity anymore, apparently.
> 
> This thread somewhat gives the illusion that OCN is principally an Intel fanclub.


Two things come to mind besides general baseless overhyping.
1) The closer to launch we got, the more the IPC rumor train came off the rails, with several people trumpeting close to Kaby Lake levels rather than Broadwell-E.
2) Lots of people assumed the advertised boost clock was an all-core boost and therefore expected the 1800X to OC 400MHz on top of that. Instead, the all-core boost is 3.7GHz, so a 400MHz overclock only gets you to 4.1GHz, which people were assuming was the baseline performance from the get-go.
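As a quick sanity check of that clock math (the 4.0GHz figure below is the 1800X's advertised single-core boost spec; the rest are the numbers from the post above):

```python
# Clock math from the discussion above (MHz). The 4.0GHz advertised boost
# is the 1800X's single-core spec; treating it as an all-core clock was
# the rumor mill's mistake.
advertised_boost = 4000  # what people (wrongly) read as the all-core boost
all_core_boost = 3700    # the actual all-core boost
oc_headroom = 400        # the overclock people expected on top

expected = advertised_boost + oc_headroom  # the rumor-mill hope: 4400
actual = all_core_boost + oc_headroom      # what a +400MHz OC gives: 4100

print(expected - actual)  # 300MHz gap between hype and reality
```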


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *SoloCamo*
> 
> I think these reviewers need to do a performance update in like 90 days to give everyone a better picture. Hopefully by then all of the confusion will be cleared up. There is too much inconsistency.


They will be doing several followup reviews of Ryzen whether they like it or not, once the 6-core and 4-core variants are released. I wouldn't be surprised if that fact factored into AMD's decision to release the "worst" Ryzen gaming chips (the 8-cores) first. Don't take that to mean I think the R7s are bad gaming chips (quite the opposite, actually), but the 6-core and especially the 4-core parts should perform marginally better in games due to slightly higher clock speeds (and, in the case of the 4-cores, because they won't have to deal with the interconnect issues the 6- and 8-cores have to).


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ducegt*
> 
> You get more for what you pay for. It's like this with nVidia as well.
> 
> What hasn't been discussed much here is the optimization of the 7700Ks reviewed. Most reviews run them at the standard RAM speed when they are capable of much more. The reviews with OCed 7700Ks have the memory gimp as well. *We have seen the best Ryzen has to offer, but not Kaby Lake...*and Kaby Lake is still on top for gaming which lets be real, is what most people here and on similar venues care about.


Are you actually serious with your posts or are you just trolling at this point? We've seen the BEST Ryzen has to offer??? Most of these supposed reviews were done with un-updated mobos, with 2400MHz DDR4, and across a wide range of hardware options. We have FAR from seen the best Ryzen can do, and I'd argue that most of the launch reviews were critically flawed at their core and should be completely redone in a few weeks. Mind, I'm not necessarily blaming all of the reviewers (AMD bears responsibility for not making sure all issues were ironed out before providing review samples), but the fact remains that the launch reviews absolutely do not give us an accurate representation of what these chips are capable of, especially in games...


----------



## umeng2002

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Yeah like i want micro stutters or hard drops on the roller coaster ride... Waiting for that process on the background waiting anxiously to ask cpu for some cyles XD


I don't think you know how computers work...


----------



## budgetgamer120

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Are you actually serious with your posts or are you just trolling at this point? We've seen the BEST Ryzen has to offer??? Most of these supposed reviews were done with un-updated mobo's, with 2400MHz DDR4, and across a wide range of hardware options. We have FAR from seen the BEST Ryzen can do, and I'd argue that most of the launch reviews were critically flawed at their core and should be completely redone in a few weeks. Mind, I'm not necessarily blaming all of the reviewers (AMD bears responsibility for not making sure all issues weren't ironed out before providing review samples) but the fact remains that the laucnh reviews absolutely do not give us an accurate representation of what these chips are capable of, especially in games...


Man, when I saw it I laughed.

Just when I thought I had seen it all on a forum.

I haven't seen this much defending since the World Cup.


----------



## Hueristic

Funny, I called the 1700 non-X from the beginning, and people are acting like it's some sort of revelation.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *moonbogg*
> 
> I have a [email protected] I already bought an 1800X. Will the 1800X game better than a [email protected] you think? I honestly don't know if it will or not. Sticking with the 1800X for the fun of it, but wanted your opinions. I have an EK supremacy EVO that will cool the 1800X with lots of RAD and I expect between 4.0-4.2 max on the 1800X.


I would guess the 1800X at 4.1GHz would be a little faster at 1080p, but it wouldn't be noticeable. Unless you are gaming at 1080p or less, all of these recent CPUs are pretty GPU-limited in games at 1440p or more, unless you have a Titan XP or something.


----------



## umeng2002

No sense in trying to OC the piss out of it for gaming... just wait for patches to games and Windows... and BIOS updates for better memory speeds.


----------



## AmericanLoco

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Are you actually serious with your posts or are you just trolling at this point? We've seen the BEST Ryzen has to offer??? Most of these supposed reviews were done with un-updated mobo's, with 2400MHz DDR4, and across a wide range of hardware options. We have FAR from seen the BEST Ryzen can do, and I'd argue that most of the launch reviews were critically flawed at their core and should be completely redone in a few weeks. Mind, I'm not necessarily blaming all of the reviewers (AMD bears responsibility for not making sure all issues weren't ironed out before providing review samples) but the fact remains that the laucnh reviews absolutely do not give us an accurate representation of what these chips are capable of, especially in games...


He's made over 100 posts in 3 days in multiple Ryzen threads. He's just a troll, ignore him.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ducegt*
> 
> Smartphones can do all those other things. So can $200 Walmart PCs. Your argument is weak.
> So Ryzen can run higher mem clocks with higher timings. Intel can run much higher clocks at much lower timings; but that's not how they were tested. What is so difficult about this to understand? Seeing as this is beyond you, it's exceedingly hard for your shallow and dim-witted personal attacks to get under my skin. So I must ask you... How much is AMD paying you? Are you on drugs? Are you competing for number 1 AMD fanboy? Are you trolling? Or are you simply not able to comprehend what I've said, and thus arguing against something I didn't say? When they redo Ryzen reviews, they ought to properly show a 7700K setup. Not one with all-core turbo disabled and slow standard RAM... and not clocked at stock to level the playing field with XFR.


Most reviews are stock, and I expect the next round of reviews to be the same. I guess Nvidia is OK to add extra boost clocks, but AMD is "cheating" somehow by utilizing XFR? XFR is nothing but an extended boost clock and is thus STOCK. At any rate, I never said the 7700K wasn't the fastest gaming CPU; it is. There is no question that no Ryzen chip is going to topple a 5GHz 7700K in a game, ever. All I'm saying is that, all things considered, the Ryzen architecture is a great gaming platform as well as providing MT performance far beyond anything the 4C/8T Intel quads will be able to manage in most applications. The R7s are direct competitors for Broadwell-E, after all, and not KL.

By the way, for such a huge AMD fanboy, I have never owned an AMD processor: when I got into PC building the 2600K had just come out, and that 2600K, my first CPU, is in my backup rig right now. Intel CPUs have been the very best for a loooooooong time and that is a fact. I'm just really excited to finally see a viable alternative to even the latest Intel processors available from AMD, and with Ryzen that is exactly what we have.


----------



## Majin SSJ Eric

It's OK, he's free to say whatever he wants. He hates Ryzen and thinks nobody should buy anything other than a 7700K. He's entitled to his opinion...


----------



## Majin SSJ Eric

Why is it so hard to give credit where it is due, on both fronts? I will say it till I'm blue in the face: the i7 7700K is an absolute beast of a processor and the first true leap forward for Intel quads since the 2600K. What's not to like? It clocks like crazy, its architecture provides unbelievable IPC, and it is extremely polished and feature-rich. If you have a 7700K then great for you; it's the best gaming CPU you can buy today!

*ALSO*

The Ryzen R7 octo-cores are absolute beasts and a quantum leap forward from BD and Phenom II. Ryzen is the Phenom successor we should have gotten in 2011 (new tech notwithstanding), and the pricing is just the icing on the cake. IPC is at Broadwell-E levels, SMT appears to be more efficient than HT, 8C/16T will absolutely be relevant in most day-to-day applications that are not gaming, and this is the first processor that can shift the mainstream quad-core paradigm by providing twice the cores and threads for the same $$$ that Intel charges for their i7 quads. Its one glaring flaw is obviously that OC headroom is near nonexistent, but that's at least tolerable considering the ~4GHz speeds we are seeing still put them within spitting distance of OC'd Intel processors in gaming, and actually faster in other MT-intensive tasks.

See, it's not impossible to praise both products. The real winners in all of this chaos are us, the PC enthusiasts. Never before have we had the kind of options to choose from that we now have in the $300-$400 segment. You can get a blazingly fast gaming processor for the highest possible FPS on your 144Hz monitor, or you can get an 8-core beast that will equal and even exceed Intel's fastest 8-core processors for literally 1/3 the price.


----------



## Blameless

Thus far most _sensible_ pre-launch predictions have panned out. OCing is a bit of a disappointment and there are platform teething issues, but it's a solid architecture.

Not going to build one right away, but come summer time, I hope to have a decent ITX gaming/streaming setup built around a Ryzen 1700 and either Vega or a 1080 Ti.


----------



## kfxsti

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Why is it so hard to give credit where it is due, on both fronts? I will say it til I'm blue in the face: The i7 7700K is an absolute beast of a processor and the first true leap forward for Intel quads since the 2600K. What's not to like? It clocks like crazy, its architecture provides unbelievable IPC, and it is extremely polished and many-featured. If you haqve a 7700K then great for you; its the best gaming CPU you can buy today!
> 
> *ALSO*
> 
> The Ryzen R7 octo-cores are absolute beasts and are a quantum leap forward from BD and Phenom II. Ryzen is the Phenom successor that we should have gotten in 2011 (new tech not withstanding) and the pricing is just the icing on the cake. IPC is at Broadwell-E levels, SMT appears to be more efficient than HT, the 8C / 16T will absolutely be relevant in most day-to-day applications that are not gaming, and this is the first processor that can shift the mainstream quad core paradigm by providing twice the cores and threads for the same $$$ that Intel charges for their i7 quads. Its one glaring flaw is obviously the fact that OCing is near nonexistent but that's at least tolerable considering the 4GHz speeds we are seeing still allow them to be within spitting distance of OC'd Intel processors in gaming and actually faster in other MT-intensive tasks.
> 
> See, its not impossible to praise both products. The real winner in all of this chaos are us, the PC enthusiasts. Never before have we had the kind of options to choose from that we now have in the $300-$400 segment. You can get a blazingly fast gaming processor for the highest possible FPS on your 144Hz monitors or you can get an 8 core beast that will equal and even exceed Intel's fastest 8-core processors for literally 1/3 the price.


One of the best posts in this thread so far. +rep good sir


----------



## SoloCamo

Quote:


> Originally Posted by *Blameless*
> 
> Thus far most _sensible_ pre-launch predictions have panned out. OCing is a bit of a disappointment and there are platform teething issues, but it's a solid architecture.
> 
> Not going to build one right away, but come summer time, I hope to have a decent ITX gaming/streaming setup built around a Ryzen 1700 and either Vega or a 1080 Ti.


Yup, it exceeded my expectations in some areas and the only thing I could say it fell a bit short in was the oc'ing results.


----------



## budgetgamer120

Can't wait to get to 1core 4 thread CPU cores


----------



## WhiteCrane

Apart from bias, I'm having a hard time seeing why any gamer would go Ryzen over the 7700K. And I do not even like Kaby Lake.


----------



## SoloCamo

Quote:


> Originally Posted by *WhiteCrane*
> 
> Part from bias, I'm having a hard time seeing why any gamer would go Ryzen over 7700K. And I do not even like Kaby Lake.


This gets so old. Just because I am a gamer doesn't mean I don't do other tasks with my PC that would absolutely benefit from the extra cores/threads.

Since when does playing games automatically mean you do nothing else?


----------



## jprovido

Quote:


> Originally Posted by *WhiteCrane*
> 
> Part from bias, I'm having a hard time seeing why any gamer would go Ryzen over 7700K. And I do not even like Kaby Lake.


Streaming while gaming. This is how my 7700K looks when playing at 1440p @ 144Hz while streaming on Twitch: https://www.twitch.tv/videos/126205955 It was supposed to be 1080p @ 30fps, but as you can see, skipped-frames city.


----------



## bfedorov11

I wonder if the next batch could OC higher... not likely, but you never know with a gen-1 product. Does anyone have the F4 ones in hand yet?

Quote:


> Originally Posted by *WhiteCrane*
> 
> Part from bias, I'm having a hard time seeing why any gamer would go Ryzen over 7700K. And I do not even like Kaby Lake.


After some fixes, 1080p gaming performance should increase. Value: the platform will be good for at least 4 years, future-proofing. A $300 8-core CPU that competes with Intel's $1000 CPUs..


----------



## WhiteCrane

I see. So going with Ryzen gives you a great CPU for gaming, but with extra cores as "gravy" I guess.


----------



## budgetgamer120

More benchmarks, guys: Ryzen vs Vishera.

More at the source http://wccftech.com/ryzen-fx-performance-gains-vishera/

Ryzen From FX: Performance Gains Over Vishera


----------



## SoloCamo

Quote:


> Originally Posted by *WhiteCrane*
> 
> I see. So going with Ryzen gives you a great CPU for gaming, but with extra cores as "gravy" I guess.


Ryzen offers single-core performance that is plenty for any game out there, with the benefit of ridiculously good multi-threaded performance at this price point. Unless you absolutely need 144fps in every title (something the 7700K is not going to accomplish either), the trade-off of losing that many cores/threads is just not worth it to me.

For those who game at 1440p+ (or, like me, at 4K), you are going to need GPUs that don't exist yet for Ryzen to ever remotely become any sort of meaningful bottleneck.


----------



## kaosstar

This is a good graphic that I saw posted on Anandtech.


----------



## TheReciever

Quote:


> Originally Posted by *SoloCamo*
> 
> This gets so old. Just because I am a gamer doesn't mean I don't do any other tasks with my pc that would absolutely benefit from the extra cores/threads..
> 
> Since when does playing games automatically mean you do nothing else?


On NBR the argument being made against me is that you are no longer a "consumer" if you do more than just one game at a time on your system; after that you become a prosumer, and therefore you can't argue against it because you are no longer a part of that demographic.

This is after moving the goal posts twice.


----------



## Pyrotagonist

Quote:


> Originally Posted by *kaosstar*
> 
> This is a good graphic that I saw posted on Anandtech.


Here's the original review: http://drmola.com/pc_column/141286

These are the gaming tests:


----------



## AuraNova

Quote:


> Originally Posted by *WhiteCrane*
> 
> Part from bias, I'm having a hard time seeing why any gamer would go Ryzen over 7700K. And I do not even like Kaby Lake.


As said by many people before, the general consensus is that for SOLELY gaming, Ryzen right now is not the best buy. Many people multitask, including streamers. Ryzen seems to be at least a perfect balance of gaming and productivity. Once it gets optimized in the coming months, we'll see Ryzen truly shine.


----------



## kaosstar

It seems that part of the problem with gaming could be specific to Windows 10: https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/page-8#post-38775716


----------



## flippin_waffles

Does anyone have a link for 'optimizing for Ryzen' session at GDC?


----------



## renx

The 1700 is the big-time 7700K killer. Sadly, most people can't see it yet.
But two months from now the evaluation will change (maybe even sooner).


----------



## flippin_waffles

Quote:


> Originally Posted by *WhiteCrane*
> 
> Part from bias, I'm having a hard time seeing why any gamer would go Ryzen over 7700K. And I do not even like Kaby Lake.


Why? Ryzen is a very capable gaming chip that has only begun to be exploited; it's only two days old. Ryzen is designed differently from past chips, so it's only natural that optimizations need to be made. The best part is I will see free performance boosts along the way. It's ridiculous how low the cost of the platform is.

The chip is very cool, actually.


----------



## flippin_waffles

Quote:


> Originally Posted by *renx*
> 
> The 1700 is the 7700K killer big time. Sadly, most people couldn't see it, yet.
> But 2 months from now evaluation will change. (maybe even sooner)


They'll start to see it soon; it shouldn't take long.

Many already do; there's huge demand for Ryzen.


----------



## ducegt

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Its OK, he's free to say whatever he wants. He hates Ryzen and thinks nobody should buy anything other than a 7700K. He's entitled to his opinion...


I don't hate Ryzen. It's just not the best for what most people want: gaming.
Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> You said we have seen the best Ryzen (a two day old processor) has to offer. That is the definition of inaccurate to the point of being ridiculous. I'm actually embarrassed for you.


With respect to memory. You have a serious reading comprehension problem. A 2-day-old chip that's been in development for 10 years. Anything inaccurate about how the 7700Ks were gimped by memory in the reviews? Maybe the third time is the charm for your tunnel vision.

Ad nauseam arguments from the others. I don't have to ignore them, because I've never seen them contribute anything, and it's evident now that they lack the capacity.


----------



## Robin Nio

Quote:


> Originally Posted by *ducegt*
> 
> Don't hate Ryzen. It's just not the best for what most people want; gaming.


In my opinion, not that many people buy top-of-the-line CPUs to play games; they get something like an i5 for 1080p or 1440p gaming, which is probably the majority of the market since 4K is still very expensive. These CPUs are for editing and work that requires a lot of power. The 1600X and below is probably where gaming will land, at least for now, until 8 cores become a standard in gaming.


----------



## Arturo.Zise

If someone had told me that AMD was going to release a CPU that could compete with the i7 6900K at HALF the price, I would have laughed in their face and told them they were dreaming. Call me impressed.

Wonder how much $$$ Intel's next 8-core CPUs will launch at?


----------



## Brutuz

Quote:


> Originally Posted by *ducegt*
> 
> With respect to memory. You have a serious reading comprehension problem. A 2 day old chip that's been in development for 10 years. Anything inaccurate about how 7700Ks were gimped by memory in the reviews? Maybe the third time is a charm for your tunnel vision.
> 
> Ad nauseam arguments from others. I don't have to ignore them because I've never seen them contribute anything and it's evident now they lack such a capacity.


With respect to history, the Q6600 ended up being a better option than the E8400, even though the E8400 was faster at first, because games were starting to use more than 2 cores; the benefit wasn't immediately obvious, but it quickly became so. I've said it a few times now and I'll say it again: Upgrade often? 7700K. Upgrade rarely? 1700, used HEDT, or the 8700K when that comes out, if the 6c rumours are true.

And the difference in gaming is still tiny, GPU-limited, or above what most people use for a refresh rate regardless. It's irrelevant for all intents and purposes, because increased threading in games makes any real predictions using single/dual-threaded engines useless.


----------



## sumitlian

Quote:


> Originally Posted by *Brutuz*
> 
> And the difference in gaming is still tiny...


No it is not. i7 7700K users with a GTX 1080 can always, in a second, set the in-game resolution of their 2K monitor to 720p, just to show you how their real-world 720p gaming experience is significantly better than any RyZen model's.

i7 for 720p, anytime, anywhere...FTW


----------



## IRobot23

Quote:


> Originally Posted by *sumitlian*
> 
> No it is not. i7 7700K users with a GTX 1080 can always, in a second, set the in-game resolution of their 2K monitor to 720p, just to show you how their real-world 720p gaming experience is significantly better than any RyZen model's.
> 
> i7 for 720p, anytime, anywhere...FTW


Nobody tested "real life" situations in games.
Again: SP vs MP, low vs high settings.

- Lets say 720p LOW SETTING in SP


Spoiler: Warning: Spoiler!



For example :
i3 vs i5 vs i7 (same clock):

- i3 = 110FPS
- i5 = 120 FPS
- i7 = 120 FPS
- i7X = 120FPS



- Lets say 720p HIGH SETTING in SP


Spoiler: Warning: Spoiler!



For example :
i3 vs i5 vs i7 (same clock):

- i3 = 80FPS
- i5 = 105 FPS
- i7 = 110 FPS
- i7X = 110FPS



- Lets say 720p HIGH SETTING in MP


Spoiler: Warning: Spoiler!



For example :
i3 vs i5 vs i7 (same clock):

- i3 = 40FPS
- i5 = 60 FPS
- i7 = 80 FPS
- i7X = 100 FPS



Why is there a difference between the i7 6900K and R7 1800X in gaming? Because of dual vs quad channel memory.
- Simply put, you would need to compare 3866MHz DDR4 in dual channel vs 2133MHz DDR4 in quad channel.
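That dual- vs quad-channel comparison is just bandwidth arithmetic; here is a quick sketch using the poster's hypothetical speeds (the function name is mine; DDR4 moves 8 bytes per channel per transfer):

```python
# Back-of-the-envelope peak memory bandwidth. DDR4 has a 64-bit (8-byte)
# bus per channel, so: MT/s x channels x 8 bytes -> MB/s, /1000 -> GB/s.
# The speeds below are the hypothetical ones from the post, not review data.

def peak_bandwidth_gbs(mt_per_s: int, channels: int) -> float:
    """Theoretical peak bandwidth in GB/s for DDR4 memory."""
    bytes_per_transfer = 8  # 64-bit channel
    return mt_per_s * channels * bytes_per_transfer / 1000

dual_3866 = peak_bandwidth_gbs(3866, channels=2)  # fast dual channel
quad_2133 = peak_bandwidth_gbs(2133, channels=4)  # slow quad channel

print(f"DDR4-3866 x2: {dual_3866:.1f} GB/s")  # 61.9 GB/s
print(f"DDR4-2133 x4: {quad_2133:.1f} GB/s")  # 68.3 GB/s
```

Even with those mismatched speeds, the quad-channel setup only comes out about 10% ahead on paper.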


----------



## Kuivamaa

Quote:


> Originally Posted by *WhiteCrane*
> 
> Part from bias, I'm having a hard time seeing why any gamer would go Ryzen over 7700K. And I do not even like Kaby Lake.


I was uninterested in quads. I had a 6800K and an Asus X99 board on my shopping list, but the Canard leaks last December reignited my interest in Zen. I can see a solid arch and three great CPUs, and yes, this is 95% for gaming. Everything is brand new, so there are teething issues, but at the same time the socket has a future. 2011 is dead, and 1151 may or may not see another CPU (if it does, it's just Coffee Lake, which will be the same core as SL, it seems). Kaby has the IPC, but it is still a quad; a 1700 is almost twice the CPU for a similar price.


----------



## sumitlian

Quote:


> Originally Posted by *IRobot23*
> 
> Nobody tested "real life" situations in games.
> Again SP vs MP - LOW VS HIGH SETTINGS.
> 
> - Lets say 720p LOW SETTING in SP
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> For example :
> i3 vs i5 vs i7 (same clock):
> 
> - i3 = 110FPS
> - i5 = 120 FPS
> - i7 = 120 FPS
> - i7X = 120FPS
> 
> 
> 
> - Lets say 720p HIGH SETTING in SP
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> For example :
> i3 vs i5 vs i7 (same clock):
> 
> - i3 = 80FPS
> - i5 = 105 FPS
> - i7 = 110 FPS
> - i7X = 110FPS
> 
> 
> 
> - Lets say 720p HIGH SETTING in MP
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> For example :
> i3 vs i5 vs i7 (same clock):
> 
> - i3 = 40FPS
> - i5 = 60 FPS
> - i7 = 80 FPS
> - i7X = 100 FPS
> 
> 
> 
> Why is there difference between i7 6900K and R7 1800X in gaming - because of DUAL vs QUAD CH.
> - Simply you would need to compare 3866MHz DDR4 CH-2 vs 2133MHz DDR4 CH-4.


_Let's say_ quad channel is specifically what you want; you can go pay >100% more $$$ for it, simple. Besides that, RyZen with dual channel is already groundbreaking against any i7, even against the deca-core one, since the perf/$ of the i7 6950X, even with two extra cores, is already garbage.


----------



## Digitalwolf

Quote:


> Originally Posted by *sumitlian*
> 
> No it is not. i7 7700K users with a GTX 1080 can always, in a second, set the in-game resolution of their 2K monitor to 720p, just to show you how their real-world 720p gaming experience is significantly better than any RyZen model's.
> 
> i7 for 720p, anytime, anywhere...FTW


Ya... it's kind of bothersome to keep seeing this mentioned by "some people". I had managed to not comment on this trend until... now... When I see a review that does this... I either move on, let my eyes glaze over and click to the next part or simply black list the reviewer or review site.

Why? Well because I'm never going to use the product in that manner. So it shows me absolutely nothing relative to my intended use.. how is that helpful? I get this whole "logic" people bring up about finding a way to show a difference in cpu's... but the point is when I actually use my cpu that "difference" won't be there. However... when I add another intensive task into the mix.... suddenly the system I built is still doing everything I need without dropping frames etc... where as the one that was better at 720p... I have to turn something down.... why? because that's my actual "real world" performance metric based on how I actually use my machine. You won't see that in a review (or at least not most reviews)... because that's not the intended use or audience the review goes for. Yet that 720p gaming crowd is pretty large...

I definitely get the 1080p discussion because it's super relevant to a lot of people. However, adding in high quality live streaming even at 1080p changes things... Intended use means a lot in my opinion. At least the 1080p benches make sense to the mass of gamers still there.

Even on the 1080p gaming.... If you can choose between a 349 cpu and a 329 cpu... and the 349 was a LOT faster at 720p... but at 1080p there was zero difference beyond you could save $20. Again... how is the lower ress useful? Oh I get it that the performance difference dropped to 0 due to something else (not the cpu) but it's still there.... unless you drop your play ress. Which brings up the most valid point for the average person... price versus actual performance you will see regardless of the reasons behind that performance. (which yes even means depending on an individuals use an I5 or even I3 may still be the most relevant chip for them... but these low ress benchmarks are still not going to show them how their game (or whatever) performs at the actual settings they will use).

Oh and for reference I've been at 1440 for what seems like a long time. So that's my reference point, but I am well aware that many people are still at 1080.

*edits for clarity and typos... yay*


----------



## Tobiman

Quote:


> Originally Posted by *-Sweeper_*
> 
> AMD marketing:
> 
> Intel Broadwell-E
> Core i7-6900K
> 8C / 16T
> 3 GHz all-core 49.05
> 
> AMD ZEN
> Engineering Sample
> 8C / 16T
> 3 GHz all-core 48.07
> (-0.98 sec, 1.998%)
> 
> Tom's review:
> 
> 
> 
> Did they sabotage Intel in their demonstration?


The new Blender update uses AVX2 which is why Intel gets a noticeable bump.


----------



## zGunBLADEz

Quote:


> Originally Posted by *umeng2002*
> 
> No, you don't understand... the
> 
> GPU
> 
> should be pegged in the high nineties all the time if the CPU can feed it well enough.
> 
> So a fast CPU can feed a 99% utilized GPU at a low CPU utilization.
> 
> A Slower CPU might get a GPU to 99% utilization but at a higher CPU utilization.
> 
> This isn't a hard concept to understand.


It's not hard to understand, but then you look at the 1700 being almost on par with the 7700k on frames in that particular Watch Dogs 2 test, with less usage... But when I look at benches where the card runs 1440p at 60+ no problem, that's when I see no point in these benches.

That's not indicative of a CPU bottleneck either; if it were, why is the 4790k still on par with the 7700k in most of these CPU bottleneck tests?

That's not too hard to understand either, right? So the test is pretty much useless if you create a GPU bottleneck, which is the case 99% of the time. I'm not buying a high-end GPU to play at 720/1080 resolution. Using these tests as an argument is futile.

Plus, a lead of 20-40 fps on cards of that caliber at 720-1080p with every setting on low is next to nothing.


----------



## sumitlian

Quote:


> Originally Posted by *Tobiman*
> 
> The new Blender update uses AVX2 which is why Intel gets a noticeable bump.


AVX2 is where one can't defend Ryzen on performance, since AMD themselves haven't provided optimal support for 256-bit SIMD operations compared to Intel.
But regardless, overall Ryzen is still very good value in perf/$, both in single thread and multithread, imo.

One thing I am still wondering is whether this AVX2 Blender build shows the ultimate performance number for the Ryzen architecture. I mean, what if you optimized Blender for AVX-128 instructions; would that make Ryzen any faster than the AVX2 build? Can Blender's workloads be adjusted to operate on 2 x 64-bit data sets? Are there relevant instructions to do the same work on narrower data sets? AMD had demoed Blender with SSE optimization, if I am not wrong. And technically AMD does support AVX as 128-bit operations. Has AMD added AVX-128 support just for the sake of supporting _new feature sets / new instructions_, or does AMD's 128-bit implementation add _throughput_ improvements over SSE4 as well? Because if AVX-128 does provide higher _throughput_ than SSE4, then it is highly likely we might see Ryzen in an AVX-128-optimized build doing better than in this AVX2 build. Not that it would be very close to Intel's AVX2 numbers, but Ryzen should still perform better than it is doing now.
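The width question above can be sketched with a toy lane-counting model (everything here is made up for illustration: the function, the unit counts, and the cycle accounting are not real Zen or Broadwell pipeline behavior). The idea: a core that cracks 256-bit AVX2 instructions into two 128-bit micro-ops gains nothing from the wider encoding, so an AVX-128 build shouldn't cost it anything, while a core with native 256-bit units doubles its per-instruction work.

```python
# Toy model: 32-bit float lanes retired per cycle for a given vector width
# on a core with `units` SIMD execution units of `native_bits` width each.
# Purely illustrative; ignores issue width, latency, memory, everything real.
def floats_per_cycle(vector_bits, native_bits, units=2):
    # A wider-than-native instruction is split into multiple micro-ops.
    micro_ops = max(1, vector_bits // native_bits)
    lanes = vector_bits // 32                      # 32-bit lanes per instruction
    return lanes * units / micro_ops

print(floats_per_cycle(256, native_bits=256))  # native 256-bit units: 16.0
print(floats_per_cycle(256, native_bits=128))  # 256-bit split in two: 8.0
print(floats_per_cycle(128, native_bits=128))  # AVX-128 build, same core: 8.0
```

In this sketch the last two lines come out equal, which is the post's hunch: on a 128-bit-wide core, an AVX-128 build loses no throughput versus AVX2, while the natively 256-bit core pulls ahead.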


----------



## Kuivamaa

Quote:


> Originally Posted by *Tobiman*
> 
> The new Blender update uses AVX2 which is why Intel gets a noticeable bump.


If that were the case, Intel would have won stock vs. stock too. The problem is the insane scaling of the 6900K between stock and 3.8GHz, not the numbers themselves. And I do not see how AVX2 can be the reason for that. Most likely a botched test.


----------



## Tobiman

Quote:


> Originally Posted by *Kuivamaa*
> 
> If that were the case, Intel would have won stock vs. stock too. The problem is the insane scaling of the 6900K between stock and 3.8GHz, not the numbers themselves. And I do not see how AVX2 can be the reason for that. Most likely a botched test.


The difference is that 3.8GHz on the 6900K is an all-core clock (with maybe a 4.0GHz single-core turbo), compared to the 3.2GHz all-core speed of a stock 6900K with its 4.0GHz single-core turbo.


----------



## CriticalOne

CPU benchmarks have always been done in a manner where the tester uses a low resolution in order to remove the GPU bottleneck. I don't know if some of you are new to PC gaming, but this is how proper CPU benchmarking with games has always been done. Don't get upset at the reviewer because you refuse to educate yourself or to let reviewers explain their methodology.
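The methodology being argued over in this thread boils down to a simple frame-time model, which can be sketched like this (all numbers are hypothetical, and the real pipeline overlaps CPU and GPU work in messier ways): each frame takes roughly as long as the slower of the CPU's and GPU's work for it, so a CPU gap that is obvious at low resolution vanishes once the GPU becomes the limiter.

```python
# Toy bottleneck model: a frame is ready when the slower of the two
# processors finishes its share of the work for that frame.
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

fast_cpu, slow_cpu = 4.0, 6.0    # hypothetical ms of CPU work per frame
gpu_720p, gpu_1440p = 3.0, 12.0  # hypothetical ms of GPU work per frame

# At 720p the GPU is quick, so the CPU gap shows (250 vs ~167 fps).
print(fps(fast_cpu, gpu_720p), fps(slow_cpu, gpu_720p))
# At 1440p the GPU dominates, so both CPUs land on the same ~83 fps.
print(fps(fast_cpu, gpu_1440p), fps(slow_cpu, gpu_1440p))
```

Both camps in the thread are reading the same model: reviewers shrink `gpu_ms_per_frame` to expose the CPU term, while the objection is that at the resolution you actually play, the `max()` hides it again.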


----------



## CULLEN

Quote:


> Originally Posted by *WhiteCrane*
> 
> Part from bias, I'm having a hard time seeing why any gamer would go Ryzen over 7700K. And I do not even like Kaby Lake.


Well, gaming is where the i7 7700K aces it, but Ryzen takes virtually everything else.

And not only that: Ryzen is very, very close to the 7700K when it comes to gaming, while the 7700K is not so close to Ryzen in anything else.

For most games the GPU is the bottleneck, and in those cases the 7700K and Ryzen are nearly identical.

*The Ryzen platform is 3 days old*: unpolished drivers, various RAM settings currently bugged, and even something as simple as a BIOS update has improved gaming performance for Ryzen by up to 26%.


----------



## SoloCamo

Ryzen w/ SMT disabled benchmark will be out soonish from Joker... sorry for the poor quality picture



GPU usage slightly down but showing 10fps diff in Metro.. at least from what he has shown


----------



## Mahigan

Quote:


> Originally Posted by *SoloCamo*
> 
> Ryzen w/ SMT disabled benchmark will be out soonish from Joker... sorry for the poor quality picture
> 
> 
> 
> GPU usage slightly down but showing 10fps diff in Metro.. at least from what he has shown
> 7700k user? Check.
> 
> You act as if Intel has not had issues with newly launched platforms before... or issues at all. Issues as recently as Skylake's memory problems come to mind. And how mature are those cpu's compared to Ryzen? Considering the lineage of said cpu, it's not exactly new.


Yep... I'll be posting my numbers soon... except I only have an FX 8350 and i7 3930K to compare it with.


----------



## looniam

Quote:


> Originally Posted by *Brutuz*
> 
> *With respect to history, the Q6600 ended up being a better option than the E8400 even though the E8400 was faster at first because games were starting to use more than 2 cores so while the benefit wasn't immediately obvious it quickly became obvious.* I've said it a few times now and I'll say it again: Upgrade often? 7700k. Upgrade rarely? 1700, used HEDT or 8700k when that comes out if the 6c rumours are true.
> 
> And the difference in gaming is still tiny, GPU limited or above what most people use for a refresh rate regardless. It's irrelevant for all intents and purposes because of increased threading in games making any real predictions using single/dual threaded engines useless.


you're constantly forgetting a few details in your remembrance:
the Q6600 was substantially more expensive (65%) and it needed an OC to 3GHz, which took a huge cooler since it was a hot chip







(i know i had one)


Spoiler: Warning: Spoiler!






^that is both @3GHz (E8400 stock); that ~10% difference would diminish when also OCing the E8400 (i had no issue hitting 3.6GHz on the stock cooler.)

to be fair though yes, a few games would make a larger difference:


Spoiler: Warning: Spoiler!







but that's more than 4 years after the chip hit the market. *so spend ~60% more to wait years?*

no thanks. btw, yeah _i had both chips until sandy was released_.


----------



## SoloCamo

Hardware Canucks will also be doing an update.

They got 4.1ghz and 3200mhz stable.

I really wish we could get an official rebench when everyone learns their way around the new platform and after firmware/bios/windows updates have been made.


----------



## zGunBLADEz

Quote:


> Originally Posted by *CriticalOne*
> 
> CPU benchmarks have always been done in a manner where the tester uses a low resolution in order to remove the GPU bottleneck. I don't know if some of you are new to PC gaming, but this is how proper CPU benchmarking with games has always been done. Don't get upset at the reviewer because you refuse to educate yourself or to let reviewers explain their methodology.


Still, that's not the point. Who plays at 720p with everything on low? That doesn't say much and indicates nothing for the end user, especially one sporting a 1080. I mean, come on lol.

Then you look at 1440p+ benches and every other arithmetic test and what happens? You can still achieve frame rates equal or close to 7700k performance with the right equipment, but there's nothing that's going to save a 7700k in the multitasking tests.


----------



## SoloCamo

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Still that's not the point. Who plays at 720p with everything at low? That's not saying too much neither indicates nothing for the end user specially sporting a 1080 I mean come on lol.
> 
> Then you see 1440p+ benchs and every other arithmetic tests and look what happen? You can still achieve equal frame rates or close to 7700k perf with the right equipment but there's nothing else that's going to save a 7700k on the multi tasking tests.


In all fairness I'm glad they do low resolution cpu bound benchmarks because when you see results like the one posted a few up...
Quote:


> Originally Posted by *looniam*


it would be VERY misleading to people that don't know better.

An FX-4170, let alone a Q6600, is within 2fps of a 3770k due to a GPU bottleneck. Most people don't put two and two together; they say "see, the FX-4170 is fine for gaming" and then you have regrets posted everywhere.

Now obviously people take this too far and think that because a 5ghz 7700k may get 500fps and a Ryzen chip 350fps suddenly the Ryzen is crap but that's obviously not the case.

I just wish people would use more logic and actually turn their brain on to do proper research.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mahigan*
> 
> Yep... I'll be posting my numbers soon... except I only have an FX 8350 and i7 3930K to compare it with.


Joker just got called out hard. He has been faking his benchmarks.

https://www.youtube.com/watch?v=VWarC_Nygew&t=0s


----------



## SoloCamo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Joker just got called out hard. He has been faking his benchmarks.
> 
> https://www.youtube.com/watch?v=VWarC_Nygew&t=0s


Haven't watched it but please keep in mind there was a personal/work falling out between these two so take whatever negative things he says (Joker has been getting a lot of hits lately) with a large grain of salt.


----------



## Alwrath

Quote:


> Originally Posted by *sumitlian*
> 
> i7 for 720p, anytime, anywhere...FTW


I play at 720p too. On my portable Nintendo Switch. When real PC enthusiast gamers play on a geforce 1080, they are at 1440p and 4K. Anything less and you might as well sell your geforce 1080 on ebay and game on the Nintendo Switch if you like 720p.


----------



## ZealotKi11er

Quote:


> Originally Posted by *SoloCamo*
> 
> Haven't watched it but please keep in mind there was a personal/work falling out between these two so take whatever negative things he says (Joker has been getting a lot of hits lately) with a large grain of salt.


Yes, but Joker's review is one of the few with positive results. For a relatively small channel, he did get all 3 CPUs to test.


----------



## mcg75

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Joker just got called out hard. He has been faking his benchmarks.
> 
> https://www.youtube.com/watch?v=VWarC_Nygew&t=0s


What's interesting is that what is being claimed is not happening on the Ryzen side but on the Intel side.

He needs to respond simply because he can't claim any Ryzen issues were the cause of the results.

Something happened to the Intel side of his tests.


----------



## SoloCamo

Quote:


> Originally Posted by *mcg75*
> 
> What's interesting is that what is being claimed is not happening on the Ryzen side but on the Intel side.
> 
> He needs to respond simply because he can't claim any Ryzen issues were the cause of the results.
> 
> Something happened to the Intel side of his tests.


If true, this is really disappointing. I don't know which review to trust... they've all got their failings at this point and the inconsistency is ridiculous.

Wish we had some reliable sources here on OCN who own both and could do the testing... that seems to be the only reliable source we get every time.


----------



## lombardsoup

Quote:


> Originally Posted by *mcg75*
> 
> What's interesting is that what is being claimed is not happening on the Ryzen side but on the Intel side.
> 
> He needs to respond simply because he can't claim any Ryzen issues were the cause of the results.
> 
> Something happened to the Intel side of his tests.


Taking a look at that video and the trolls vs trolls drama in the comment section for it, I'm not sure who to trust. I'm amazed people can get this nasty over benchmarks.


----------



## BobiBolivia

I proposed it in another thread:

How about the OCN community making its own "Reviews Corner"?
It would be from actual end-users, who could put HW through ridiculous tests, and it would be community-powered, which means many people would be able to verify results on their own.

I can imagine it growing into a respectable source of information - we are OCN.net, right?








And as for the actual benefit - being able to verify results against other review sites, and maybe (just maybe) much less bias in reviews?









_Edit: Sometimes I am inconsistent with my own thoughts..._


----------



## looniam

Quote:


> Originally Posted by *SoloCamo*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> it would be VERY misleading to people that don't know better.
> 
> A FX-4170, let alone q6600 is within 2fps of a 3770k due to a gpu bottleneck. Most people don't put two and two together and say "see, the FX-4170 is fine for gaming" and then you have regrets posted everywhere.
> 
> Now obviously people take this too far and think that because a 5ghz 7700k may get 500fps and a Ryzen chip 350fps suddenly the Ryzen is crap but that's obviously not the case.
> 
> I just wish people would use more logic and actually turn their brain on to do proper research.

i think you hit the nail on the head; people try to extrapolate too much from results data and i'll add, esp. when the testing conditions weren't set to allow such data to be extrapolated.

it was pounded in my academic head several decades ago: don't read too much into data, just what it's meant to show. but if you do think you extrapolated more, _then it must be verified with further testing._


----------



## sumitlian

Oh man, now my faith in YouTube videos seems to be over.








Can't believe anyone: no internet reviews, no videos, no nothing. Die, you liars, you mass-deceiving human youtubers.

Now this is why we need to replace these evil humans with a full-fledged A.I.-based reviewer LOL.


----------



## budgetgamer120

Yes sure jokers results are fake... Are Guru's fake too?


----------



## mcg75

Quote:


> Originally Posted by *BobiBolivia*
> 
> I proposed it in another thread:
> 
> How about OCN community making its own "Reviews Corner" ?
> It would be from actual end-users, who could put HW through ridiculous tests, and it would be community-powered, which means many people able to verify results on their own.
> 
> I can imagine that it will grow to some respectable source of information - we are OCN.net, right ?
> 
> 
> 
> 
> 
> 
> 
> 
> And as for actual benefit - not waiting for other review sites, and maybe (just maybe) a much less bias in reviews ?


We've been doing this here for years. Of course, it always happens after release.

It won't help. Some people look at reviews expecting it to say what they personally want it to say. If it doesn't, the review is biased or paid off or whatever else they can come up with to make themselves feel better about it.


----------



## daffy.duck




----------



## Digitalwolf

Quote:


> Originally Posted by *CriticalOne*
> 
> CPU benchmarks have always been done in a manner where the tester uses a low resolution in order to remove the GPU bottleneck. I don't know if some of you are new to PC gaming, but this is how proper CPU benchmarking with games has always been done. Don't get upset at the reviewer because you refuse to educate yourself or to let reviewers explain their methodology.


This isn't about the people who disagree with it educating themselves... it's about how a person uses their PC, and there has been a trend there. So while some education may be required, it's not for the people who know low-res testing tells them nothing useful.

If I post a low-res review of gaming that takes the GPU out of the equation, you see CPU A is better than CPU B. What I don't show you is what happens in the "real world", regardless of GPU used, when you try to stream that game at a "real world" resolution. In our example this same resolution takes the CPU somewhat out of the picture for "gaming"... so from the point of view of just "gaming", using the old method, even though the game now runs about the same on both CPUs... CPU A is still considered better. Then you start your stream...

Now you end up on the other end of the spectrum... suddenly the CPU which was great based on the old methodology you are defending is complete crap. You are dropping frames and have to drop quality compared to another streamer using CPU B... why? Simply because under the real-world conditions that are quite common these days, the age-old standard isn't working anymore. Why would it? It's not what it was designed to test for.

So the point is: for people who are only going to do what the old standard tests for, it still works.

When streaming is at the point where more and more development is building streaming into the product, you have to add a new method, so that people who are going to "do that" don't end up with a CPU that doesn't perform the way they need for their intended purpose. Every game at Amazon Game Studios currently in development (as an example) will have full Twitch integration built in. Most of the games I beta test now have notices for people who stream... it's free advertising, and even when games are still under NDA they often allow people to contact them for an OK to stream live while beta testing...

My point of view isn't that the old methodology you're defending is wrong. It's that it doesn't do anything for someone like me... it adds zero useful information for me.

We already see the same thing between the 7700K and the X99 line. Simply for gaming there isn't a compelling reason to go X99... but for a streamer it's usually the better choice. So this isn't even an AMD vs. Intel thing... it's an intended-use thing.
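The gaming-plus-streaming argument above can be put into a toy scheduler model (the function and every number in it are hypothetical; a real OS time-slices rather than dedicating whole cores, and encoder load varies): a software x264 stream claims some cores, and the game gets what's left.

```python
# Toy model: game fps when an encoder occupies `encode_cores` of the chip.
# `speed` is relative per-core performance; `game_work` is the total CPU
# work per frame in arbitrary ms-equivalents, split across game threads.
def fps_while_streaming(cores, speed, encode_cores=0,
                        game_threads=4, game_work=40.0):
    free = max(1, cores - encode_cores)        # cores left for the game
    parallel = min(free, game_threads)         # threads the game can run at once
    frame_ms = game_work / (speed * parallel)
    return 1000.0 / frame_ms

# Hypothetical chips: a faster 4-core vs a slower 8-core.
print(fps_while_streaming(cores=4, speed=1.2))                  # game only: 120 fps
print(fps_while_streaming(cores=8, speed=1.0))                  # game only: 100 fps
print(fps_while_streaming(cores=4, speed=1.2, encode_cores=3))  # + stream: 30 fps
print(fps_while_streaming(cores=8, speed=1.0, encode_cores=3))  # + stream: 100 fps
```

In this sketch the faster 4-core wins the game-only benchmark but collapses once the encoder takes its cores, while the 8-core doesn't notice, which is exactly the "intended use" point: the game-only test was never designed to predict the streaming case.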


----------



## Charcharo

Quote:


> Originally Posted by *Alwrath*
> 
> I play at 720p too. On my portable Nintendo Switch. When real PC enthusiast gamers play on a geforce 1080, they are at 1440p and 4K. Anything less and you might as well sell your geforce 1080 on ebay and game on the Nintendo Switch if you like 720p.


Just to correct you here*

PC Gaming is cheaper, that is why poor Eastern Europe is on PC Gaming.

PC Gaming has mods and emulation and backwards compatibility. It also has Strategy games.

All the reasons I listed>>> muh 300 fps >>>>>>>>> silly graphics. Really









As for Ryzen... I am very pumped to see the 1600X after the inevitable Windows update, Kernel and BIOS updates and better RAM.


----------



## zGunBLADEz

Quote:


> Originally Posted by *SoloCamo*
> 
> In all fairness I'm glad they do low resolution cpu bound benchmarks because when you see results like the one posted a few up...
> it would be VERY misleading to people that don't know better.
> 
> A FX-4170, let alone q6600 is within 2fps of a 3770k due to a gpu bottleneck. Most people don't put two and two together and say "see, the FX-4170 is fine for gaming" and then you have regrets posted everywhere.
> 
> Now obviously people take this too far and think that because a 5ghz 7700k may get 500fps and a Ryzen chip 350fps suddenly the Ryzen is crap but that's obviously not the case.
> 
> I just wish people would use more logic and actually turn their brain on to do proper research.


More logic would mean looking at the resolution you're actually playing at. I have a 1080 and a 4790K at 5GHz; you think I would play at 720p with everything on low? But Ryzen is giving me the multitasking my 4790k could only dream of.


----------



## SoloCamo

Quote:


> Originally Posted by *zGunBLADEz*
> 
> More logic would mean. Look at the resolution you are playing from. I have a 1080 and a 4790K 5GHz chip you think I would play on 720 with everything on low? But ryzen is giving me the multitasking my 4790k would dream from.


So I should have never upgraded from my 9590, since my 290x is the bottleneck at 4k? I should have just kept my prior 8350, or even my 8120, then too. Look at both sides of it. A GPU bottleneck just says it's "good enough". I don't purchase parts based on "good enough" when we pay as much as we do - I try to give myself some headroom and spot potential issues with a part down the line.

If I had gone by that logic back when people were telling me an i5 is plenty and 8GB is already overkill, I'd have needed to upgrade my PC by now. Meanwhile I'm still here using my 4790k and 16GB of 2400MHz RAM, and look at that: it's (unsurprisingly) pretty much become the norm now for a decent gaming experience.


----------



## Master__Shake

"he didn't show us the final results, he never showed us the settings used"

such hurt feelings.

i guess if he had some curiosity he would have clicked over to his website and read something.
Quote:


> For the graphics card, we're using the GTX 1080 overclocked 200Mhz on the core and 300MHz on the memory and all of our games were tested with the Ultra preset at 1080p & 1440p.


http://www.toptengamer.com/amd-ryzen-7-1700-vs-intel-i7-7700k-1800x/

instead he took to youtube to insult another reviewer.

journalism at its finest.


----------



## Mad Pistol

Quote:


> Originally Posted by *daffy.duck*


It looks like SMT on/off makes very little difference in gaming.

However, what I was noticing was some seriously high framerates across the board. I'm still confused as to how people can say that Ryzen is not a good gaming CPU.


----------



## flippin_waffles

Quote:


> Originally Posted by *Mad Pistol*
> 
> It looks like SMT on/off makes very little difference in gaming.
> 
> However, what I was noticing was some seriously high framerates across the board. I'm still confused as to how people can say that Ryzen is not a good gaming CPU.


People can say anything...doesn't mean they are right however.


----------



## JackCY

Quote:


> Originally Posted by *Mad Pistol*
> 
> It looks like SMT on/off makes very little difference in gaming.
> 
> However, what I was noticing was some seriously high framerates across the board. I'm still confused as to how people can say that Ryzen is not a good gaming CPU.


Because they expected an 8 core $329 to be better than both $350 4 core and $1000 8 core in EVERYTHING AND ALWAYS.


----------



## Mad Pistol

Quote:


> Originally Posted by *JackCY*
> 
> Because they expected an 8 core $329 to be better than both $350 4 core and $1000 8 core in EVERYTHING AND ALWAYS.


You can't please everyone. They were expecting a $329 CPU to beat a dedicated, high clocked 4c/8t CPU and a $1000 8c/16t CPU... sorry, but if AMD was able to pull that one off, these CPUs would be more expensive.

Of course, all things considered, and with AMD's limited R&D budget, AMD pulled off a miracle with Ryzen.

If I were in the market for an 8 core system, the 1700 would be my go-to chip. The value it provides is simply bonkers. Until the 6 and 4 core Ryzen parts launch, the R7 1700 just became my recommended CPU for nearly all scenarios.


----------



## Shatun-Bear

PC hardware journalism from some of the *smaller* sites and YouTube personalities is in the gutter.

Just look at Steve from Gamers Nexus and how he's airing the dirty laundry in public about his correspondence with AMD and the angry Ryzen review. I could never imagine such behaviour from Anandtech or other respected sites.


----------



## budgetgamer120

Quote:


> Originally Posted by *Mad Pistol*
> 
> It looks like SMT on/off makes very little difference in gaming.
> 
> However, what I was noticing was some seriously high framerates across the board. I'm still confused as to how people can say that Ryzen is not a good gaming CPU.


Even if it did... I still wouldn't turn SMT off.


----------



## Hequaqua

Quote:


> Originally Posted by *Shatun-Bear*
> 
> PC hardware journalism from some of the *smaller* sites and YouTube personalities is in the gutter.
> 
> Just look at Steve from Gamers Nexus and how he's airing the dirty laundry in public about his correspondence with AMD and the angry Ryzen review. I could never imagine such behaviour from Anandtech or other respected sites.


^^I agree with that.

I saw no sense in posting recorded phone calls, etc. They asked him to run higher res.....not to hide the 1080p results. He made it sound like AMD was asking him to skew the results...which wasn't the case. I'm not saying AMD was in the right completely, but still, there was no reason to air all of that.

I'm not loyal to any one brand....CPU or GPU. After reading hundreds of comments, it seems like a lot of people keep moving the goalposts on AMD. I guess they are happy with just one mfg dominating each market(Intel/nVidia).

I try to stay neutral about what someone buys. They buy it for their own reasons; who am I to put their choice down? It just leads to a bunch of needless arguments. We can all find reasons to justify our purchases, but we shouldn't have to.

Competition is good for all of us, no matter what side of the fence you might be on.


----------



## CULLEN

Quote:


> Originally Posted by *Hequaqua*
> 
> I try to stay neutral about what someone buys. They buy it for their own reasons; who am I to put their choice down? It just leads to a bunch of needless arguments. We can all find reasons to justify our purchases, but we shouldn't have to.
> 
> *Competition is good for all of us, no matter what side of the fence you might be on*.


This guy gets it.


----------



## haszek

Quote:


> Originally Posted by *Shatun-Bear*
> 
> PC hardware journalism from some of the *smaller* sites and YouTube personalities is in the gutter.
> 
> Just look at Steve from Gamers Nexus and how he's airing the dirty laundry in public about his correspondence with AMD and the angry Ryzen review. I could never imagine such behaviour from Anandtech or other respected sites.


Yeah, he's a really sad guy...

It's fine to test CPUs using games, but this is not how the CPU will act in actual gaming, so those are not gaming tests (due to the different settings). Yet most reviews try to push this to the public.

Especially this Steve guy, who definitely has some kind of grudge against AMD, and like someone already mentioned, there are his 30-second 'gaming' tests.

The other thing is, even if we move away from high resolutions: do you think that at 1080p there are more players with cards like a 480/1050/1060, or with a 1080? With those $200 cards it's like playing at high res since the GPU is bottlenecking, so that's the more accurate situation for the average player.


----------



## budgetgamer120

Quote:


> Originally Posted by *Hequaqua*
> 
> ^^I agree with that.
> 
> I saw no sense in posting recorded phone calls, etc. They asked him to run higher res.....not to hide the 1080p results. He made it sound like AMD was asking him to skew the results...which wasn't the case. I'm not saying AMD was in the right completely, but still, there was no reason to air all of that.
> 
> I'm not loyal to any one brand....CPU or GPU. After reading hundreds of comments, it seems like a lot of people keep moving the goalposts on AMD. I guess they are happy with just one mfg dominating each market(Intel/nVidia).
> 
> I try to stay neutral with what someone buys. They buy it for their own reasons. Who am I to put their choice down. It just leads to a bunch of needless arguments. We can all find reasons to justify our purchases, but we shouldn't have too.
> 
> Competition is good for all of us, no matter what side of the fence you might be on.


I normally go with the best bang for buck, which has almost always been AMD until recently. But now they are back with even better bang for buck.


----------



## Kuivamaa

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Joker just got called out hard. He has been faking his benchmarks.
> 
> https://www.youtube.com/watch?v=VWarC_Nygew&t=0s


Perhaps an oc.net member with a 7700k and a 1080 could run these benches as well?


----------



## stargate125645

Quote:


> Originally Posted by *JackCY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mad Pistol*
> 
> It looks like SMT on/off makes very little difference in gaming.
> 
> However, what I was noticing was some seriously high framerates across the board. I'm still confused as to how people can say that Ryzen is not a good gaming CPU.
> 
> 
> 
> Because they expected an 8 core $329 to be better than both $350 4 core and $1000 8 core in EVERYTHING AND ALWAYS.

OK, that's also faulty logic. Straw man aside, do you think AMD would price things that low if they beat Intel in everything? It's possible, but unlikely. A lot of people have issues forming sound arguments here, on both sides.


----------



## Uns33n

Seems like locking the R7 series at a 4.0-4.1GHz OC is giving the biggest performance improvement so far, rather than leaving it stock and letting it boost to 4GHz.

The difference is pretty substantial.


----------



## Slomo4shO

I'll leave this here as well, since 75%+ of the benchmarked titles in any review are action/adventure/shooter games, which only account for 16.3% of overall unit sales.














Anyone willing to test the SIMS on 7700K and 1700?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kuivamaa*
> 
> Perhaps an oc.net member with a 7700k and a 1080 could run these benches as well?


Yeah, I do not know why we even bother with review websites and YouTube videos. We know better than those wannabes. We are the ultimate form of PC enthusiast site.


----------



## CULLEN

Quote:


> Originally Posted by *stargate125645*
> 
> OK, that's also faulty logic. Straw man aside, do you think AMD would price things that low if they beat Intel in everything? It's possible, but unlikely. A lot of people have issues forming sound arguments here, on both sides.


Well, it's been well known that Intel has been kind of ripping off customers because there was no real competition, and AMD is desperate to get back in the game. Extremely competitive pricing is very attractive.


----------



## IRobot23

Quote:


> Originally Posted by *Tobiman*
> 
> The new Blender update uses AVX2 which is why Intel gets a noticeable bump.


Look again: i7 6900K vs. i7 6900K @ 3.8GHz.


----------



## budgetgamer120

Quote:


> Originally Posted by *stargate125645*
> 
> OK, that's also faulty logic. Straw man aside, do you think AMD would price things that low if they beat Intel in everything? It's possible, but unlikely. A lot of people have issues forming sound arguments here, on both sides.


Yes. Shaking up the industry and providing high performance to the masses was their goal, and the only way to do that was with great pricing.

After all, we know a $1000 CPU is not worth $1000, and doesn't even cost $1000 to make. I am sure AMD is still making a killing off Ryzen at its current prices.


----------



## lombardsoup

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah I do not know why we even bother with review websites and youtube videos. We know better than those wannabes. We are the ultimate form of PC enthusiast sites.


Most post those videos for the simple reason that it's faster than spending hours or days running the same benchmarks on their own systems. Unfortunately, relying on said videos also opens the door to interest groups on both sides. I've found OCN to be far less biased.


----------



## Charcharo

Quote:


> Originally Posted by *Slomo4shO*
> 
> I'll leave this here as well since 75%+ of the benchmarked titles in any review are action/adventure/shooter games which only account for 16.3% of overall unit sales.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone willing to test the SIMS on 7700K and 1700?


The Sims and simulation/tycoon games are generally VERY CPU demanding.

Of course, if a game uses all cores well, and once some Ryzen tweaks on the OS and game side are done, it will run better on Ryzen than on a 7700K.


----------



## IRobot23

Quote:


> Originally Posted by *Shatun-Bear*
> 
> PC hardware journalism from some of the *smaller* sites and YouTube personalities is in the gutter.
> 
> Just look at Steve from Gamers Nexus and how he's airing the dirty laundry in public about his correspondence with AMD and the angry Ryzen review. I could never imagine such behaviour from Anandtech or other respected sites.


You are correct. I don't know why reviewers didn't try comparing the i7-7700K and a 4C/8T Ryzen at the same DDR4 memory speed in gaming to see how much difference there is. Since the R5 1400X will be below $200, you might reconsider.


----------



## zGunBLADEz

Quote:


> Originally Posted by *SoloCamo*
> 
> So I should have never upgraded from my 9590 since my 290x is the bottleneck at 4k? I should have just kept my prior 8350 or even my 8120 then, too. Look at both sides to it. A gpu bottleneck just say it's "good enough". I don't purchase parts based on "good enough" when paying as much as we do - I try to give myself some headroom and see potential issues with a part down the line.
> 
> If I went by this logic when people were telling me an i5 is plenty and 8gb is already overkill I would have needed to upgrade my pc. Meanwhile I'm still here using my 4790k and 16gb of 2400mhz ram, and look at that, it's (unsurprisingly) pretty much become the norm now for a decent gaming experience.


That's why I'm saying: for gaming at the resolutions I play, with current GPUs, the 1700 is in roughly the same performance ballpark as the competition.

But the 4790K, or the 7700K in this case, can't beat the 1700 in heavy multitasking applications no matter how you look at it.


----------



## 98uk

Quote:


> Originally Posted by *BobiBolivia*
> 
> I proposed it in another thread:
> 
> How about OCN community making its own "Reviews Corner" ?
> It would be from actual end-users, who could put HW through ridiculous tests, and it would be community-powered, which means many people able to verify results on their own.
> 
> I can imagine that it will grow to some respectable source of information - we are OCN.net, right ?
> 
> 
> 
> 
> 
> 
> 
> 
> And as for actual benefit - being able to verify results against other review sites, and maybe (just maybe) a much less bias in reviews ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _Edit: Sometimes I am inconsistent with my own thoughts..._


I suspect the results would be biased or of poor quality. You only need to check user reviews on OCN to see how crap they are.

Also, most review sites have a relatively neutral stance, whereas people on OCN tend to be a bit sad and have brand favourites and call each other "fanboys".

Outside of a few members, the level of professionalism and maturity on OCN is extremely low and you can see this even in this thread!


----------



## incog

I want to see benchmarks in real world CPU loads. The time it takes to encode a certain video, overclocking capabilities, playing CPU bound games such as SC2. I want to see CPUs being used in CPU bound scenarios and see what happens.

So far, just from reading this thread, it seems the benchmarks out there aren't to be trusted. Furthermore, if they use Intel-biased benchmarks (i.e. benchmarks which simply run better on Intel CPUs; yes, that's a thing), then it needs to be said.


----------



## SoloCamo

This is actually one of the better, more straightforward reviews I've found so far, with lots of productivity benches.






4.3GHz 6900K vs 4.1GHz 1800X, plus a stock vs stock comparison, with 3GHz RAM on both.

Results are in line with what you'd expect considering everything we know specifications wise.


----------



## jeffdamann

Quote:


> Originally Posted by *98uk*
> 
> I suspect the results would be biased or of poor quality. You only need to check user reviews on OCN to see how crap they are.
> 
> Also, most review sites have a relatively neutral stance, whereas people on OCN tend to be a bit sad and have brand favourites and call each other "fanboys".
> 
> Outside of a few members, the level of professionalism and maturity on OCN is extremely low and you can see this even in this thread!


So you don't think the hardware enthusiasts who work for said review sites have a natural bias towards a certain brand? Even if they don't come right out and say so, they may hint at it.


----------



## budgetgamer120

Quote:


> Originally Posted by *incog*
> 
> I want to see benchmarks in real world CPU loads. The time it takes to encode a certain video, overclocking capabilities, playing CPU bound games such as SC2. I want to see CPUs being used in CPU bound scenarios and see what happens.
> 
> So far, just from reading this thread, it seems that benchmarks out there aren't meant to be trusted. Furthermore, if they use intel-biased benchmarks (e.g. benchmarks which simply run better on Intel CPUs, YES that's a thing) then it needs to be said.


Doesn't Starcraft 2 only use 1-2 cores?


----------



## 98uk

Quote:


> Originally Posted by *jeffdamann*
> 
> So you don't think the hardware enthusiasts that work for said review sites have a natural bias towards a certain brand? Even if they don't cone right out and say so, but instead may hint at such a thing?


In general, not really. A lot of the reviews I've seen seem neutral and come to the same conclusion.

In no way do I believe crowdsourcing reviews would improve anything.

People on OCN can't even talk about two brands without slapping each other with hand bags. It's quite pathetic at times...


----------



## ChronoBodi

No, I can help with benches. For things like in-game benchmarks and Cinebench, I can provide neutral data.


----------



## Hequaqua

Quote:


> Originally Posted by *98uk*
> 
> In general, not really. A lot of the reviews ive seen seem to be neutral and come to the same conclusion.
> 
> In no way do I believe crowdsourcing reviews would improve anything.
> 
> People on OCN can't even talk about two brands without slapping each other with hand bags. It's quite pathetic at times...


This^^^^^

This carries over to everywhere....

It's the: I'm right...You're wrong...no matter what!

Sad really.


----------



## Blameless

Testing gaming performance in almost entirely GPU bound scenarios will tell you a lot about your GPU and almost nothing about your CPU. If you are buying a CPU on the basis of what happens in GPU bound situations, you may as well be flipping coins.

It's completely true that a Ryzen system will do fine in most actual gaming scenarios...but this is not a selling point for a CPU because most games aren't very CPU limited.

If you want to know which CPU has the most gaming performance _potential_, you need to test CPU bound scenarios. These are almost intrinsically not real-world with high-end CPUs, unless you also have an extremely high-end GPU setup, or are running one of the few titles that is abnormally CPU dependent.

Long story short, unless you have very specific gaming needs, you can simply stop looking at gaming benchmarks when comparing Ryzen and the Intel alternatives. How much the system costs, and what sort of performance you need in actual CPU-limited scenarios, should be the deciding factors for most people, at least once AM4 matures slightly as a platform.


----------



## AlphaC

Quote:


> Originally Posted by *SoloCamo*
> 
> This is actually one of the better, straight forward reviews I've found so far. Lot of productivity benches.
> 
> 
> 
> 
> 
> 
> 4.3ghz 6900k vs 4.1ghz 1800x and stock vs stock comparison. Also running 3ghz ram for both.
> 
> Results are in line with what you'd expect considering everything we know specifications wise.


Finally, some sense. Professional benchmarks make sense for an 8-core, 16-thread CPU.

All the people expecting Ryzen 7 to beat a 5GHz i7-7700K with 8 threads in any DX11/DX12 game, or in any program relying on viewports (i.e. Photoshop / CAD), are deluding themselves though. Maybe if you were to hammer the i7-7700K by having it stream, run voice comms, and so on while running a game that uses 4 cores with low idle time, it _might_ start to show some strain. That's like expecting a Haswell-E at 3.4-3.6GHz to beat a 5GHz i7-7700K. _It's not happening, sorry. Not when the DX11 API uses 6 logical cores at best and DX12 scaling past 6 logical cores is poor._ Not to mention these games weren't designed with any Ryzen optimizations, nor for its finicky SMT. Assuming a game can only use up to 8 logical cores, the Kaby Lake i7 at 5GHz has about a 5-10% IPC advantage on top of a 1GHz (so 25%+) clock advantage.
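To put rough numbers on that claim (the 5.0 GHz / 4.0 GHz clocks and the 5-10% IPC edge are assumed figures from the argument above, not measurements), the two advantages compound multiplicatively:

```python
# Sketch: compound clock and IPC advantages multiplicatively.
# Assumed figures: 5.0 GHz Kaby Lake vs 4.0 GHz Ryzen, 5-10% IPC edge.
clock_ratio = 5.0 / 4.0  # +25% clock advantage

for ipc_edge in (1.05, 1.10):
    total = clock_ratio * ipc_edge
    print(f"IPC x{ipc_edge:.2f} -> ~{(total - 1) * 100:.1f}% faster single-thread")
```

So "25% plus 5-10%" actually understates it slightly: the combined single-thread advantage works out to roughly 31-38%, since the ratios multiply rather than add.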
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Joker just got called out hard. He has been faking his benchmarks.
> 
> https://www.youtube.com/watch?v=VWarC_Nygew&t=0s


Not surprised that there was some discrepancy. If he had written that he was comparing an underclocked i7-7700K to an overclocked Ryzen 7 clock-for-clock, he'd be laughed at, since the Ryzen has twice the threads. There's no way anyone would run an i7-7700K at 4GHz unless it was broken.


----------



## BinaryDemon

Quote:


> Originally Posted by *budgetgamer120*
> 
> Even if it did.. I still wouldnt turn SMT off.


It's a shame SMT can't be toggled in real time (without a reboot). That would be a nice feature for both AMD and Intel platforms. I guess the Windows scheduler can't handle it.

I know there's some debate over whether it's all the Windows scheduler's fault or a shared-resources issue. If it's just the scheduler, there are utilities that make it easy to create profiles that automatically assign cores to a program.

I was wondering whether someone could make a simple utility to fool the Windows scheduler by assigning the HT cores to the utility itself, which then simply idles. But I have a feeling the scheduler would detect that the HT cores are idle and still assign other processes to them.
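The affinity-profile approach can be approximated without any third-party utility: pin a process to one logical CPU per physical core, so the SMT sibling threads simply never get work. A minimal sketch, assuming SMT siblings are adjacent pairs (0,1), (2,3), and so on; that layout is common but worth verifying (on Linux via `/sys/devices/system/cpu/cpuN/topology/thread_siblings_list`) before relying on it:

```python
# Sketch: emulate "SMT off" for a single process by pinning it to one
# logical CPU per physical core. Assumes siblings are adjacent pairs.
import os

def one_sibling_per_core(n_logical: int) -> set[int]:
    """Pick the even-numbered logical CPUs: one per physical core."""
    return set(range(0, n_logical, 2))

if __name__ == "__main__":
    mask = one_sibling_per_core(os.cpu_count() or 2)
    # os.sched_setaffinity is Linux-only; on Windows you'd use the
    # SetProcessAffinityMask API (or a tool like Process Lasso) instead.
    if hasattr(os, "sched_setaffinity"):
        os.sched_setaffinity(0, mask)  # 0 = the current process
    print(sorted(mask))
```

The scheduler still sees the sibling cores, but with affinity set this way the pinned program's threads never land on them, which is most of what disabling SMT in the BIOS achieves for that one process.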


----------



## zGunBLADEz

Quote:


> Originally Posted by *BinaryDemon*
> 
> It's a shame SMT can't be toggled in realtime (without reboot). This would be a nice feature for AMD and Intel platforms. I guess the Windows Scheduler cant handle it.
> 
> I know there's some debate if it's all Windows Scheduler's fault or if it's a share resources issue. If it's just Windows Scheduler, I know there are utilities that make it easy to create profiles that automatically assign cores to a program.
> 
> I was wondering if someone couldnt make a simple utility to fool Windows Scheduler, and assign the HT cores to a the utility and then it simple idles but I have a feeling Windows Scheduler would detect that HT cores are idle and still attempt to assign other processes to them.


Process Lasso can do that


----------



## ducegt

Quote:


> Originally Posted by *looniam*
> 
> you're constantly forgetting a few detail with your remembrance:
> the Q6600 was substantially more expensive (65%) and it would need OCed to 3Ghz which took a huge cooler since it has a hot chip
> 
> 
> 
> 
> 
> 
> 
> (i know i had one)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> ^that is both @3ghz (E8400 stock) that ~10% difference would diminish when also OCing the E8400 ( i had no issue hitting 3.6Ghz on a stock cooler.)
> 
> to be fair though yes, a few games would make a larger difference:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> but that's more than 4 years after the chip hit the market. *so spend ~60% to wait years?*


Exactly. And since my 775 board was only about 60 dollars and overclocked to 3.6, I jumped to an i3 530 with another 60-dollar DDR3 board for a few years until games started using all cores. Then I dropped an i7 860 into the same board at 4GHz for the next 5 years, which smoked the Q6600, especially since the used chip was 80 dollars. The Q6600 still had its place for anyone who refused to upgrade no matter what, and Ryzen might prove to be the same. That's not what enthusiasts do. Ryzen: good for grandpa, not gamers.
Quote:


> Originally Posted by *zGunBLADEz*
> 
> .
> 
> Still that's not the point. Who plays at 720p with everything at low? That's not saying too much neither indicates nothing for the end user specially sporting a 1080 I mean come on lol.
> 
> Then you see 1440p+ benchs and every other arithmetic tests and look what happen? You can still achieve equal frame rates or close to 7700k perf with the right equipment but there's nothing else that's going to save a 7700k on the multi tasking tests.


The testing methodology is beyond your grasp, and many users have already tried to spoon-feed it to you.
Quote:


> Originally Posted by *IRobot23*
> 
> You are correct. Dont know why reviewers dint try to compare i7 7700K and 4C/8T ryzen with same memory speed DDR4 in gaming to see how much difference. Since R5 1400X will be below 200$ you might reconsider.


If Ryzen could only do DDR4-1066 because of its limitations, should the 7700K be tested at the same speed, even though the 7700K can go above 4200? We aren't only comparing a CPU, but the entire platform it runs on.


----------



## zGunBLADEz

Quote:


> Originally Posted by *ducegt*
> 
> The testing methodology is beyond your grasps and many users have ready tried to spoon feed it to you.


Like I said, a useless test...
I understand very well what it is for... The problem is: when will we see a GPU that's powerful enough?

In the meantime, this is what we do XD


Spoiler: Warning: Spoiler!







Edit: btw, I find it very funny that 4790K vs 7700K stock in that chart is a mere 5% increase in "CPU bound" performance three generations later XD.
Even applying the same long-term argument, Intel vs Intel XD... but you know, nobody mentions it XD


----------



## kariverson

Personally, I don't get the "not as good as Intel at gaming" argument. I mean, in what game are you ever going to max out a CPU? Even a Sandy Bridge 2600K at stock is still adequate for modern games! I'm speaking from experience: I had a 2600K with a 980 Ti!


----------



## BinaryDemon

Quote:


> Originally Posted by *SoloCamo*
> 
> This is actually one of the better, straight forward reviews I've found so far. Lot of productivity benches.
> 
> 
> 
> 
> 
> 
> 4.3ghz 6900k vs 4.1ghz 1800x and stock vs stock comparison. Also running 3ghz ram for both.
> 
> Results are in line with what you'd expect considering everything we know specifications wise.


Lol, I guess Passmark is useless, or his systems need a little more tweaking. My 5960X @ 4.23GHz is beating the i7-6900K @ 4.3GHz.


----------



## IRobot23

Quote:


> Originally Posted by *ducegt*
> 
> If Ryzen could only do DDR4 1066. Because of its limitations, then 7700K should be tested at Same speed? Even though 7700K go can above 4200? We aren't only comparing a CPU, but the entire platform it runs on.


Well, some did run 3000MHz dual channel, like Joker did. Some boards did support it.
I just want a clear comparison between the cores.

Gaming


Spoiler: Warning: Spoiler!



The i7-7700K will still bottleneck a lot! That's the truth. What gamers need is a decent game engine with DX12-only support and great core scaling. Why would you need an i7-7700K at 5GHz if a 65W TDP 3GHz 8-core could do better?


----------



## stargate125645

Quote:


> Originally Posted by *CULLEN*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stargate125645*
> 
> OK, that's also faulty logic. Straw man aside, do you think AMD would price things that low if they beat Intel in everything? It's possible, but unlikely. A lot of people have issues forming sound arguments here, on both sides.
> 
> 
> 
> Well, it's been well known that Intel has been kind of ripping off customers because there was no real competition, and AMD is desperate to be back in the game. Extremely competitive price is very attracting.

OK? That doesn't detract from either of my points. People claim someone is using poor logic, then proceed to use horrible logic to back up their own claims.

Quote:


> Originally Posted by *budgetgamer120*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stargate125645*
> 
> OK, that's also faulty logic. Straw man aside, do you think AMD would price things that low if they beat Intel in everything? It's possible, but unlikely. A lot of people have issues forming sound arguments here, on both sides.
> 
> 
> 
> Yes.. Shaking up the industry and providing high performance to the masses was their goal and the only way to do that was with great pricing.
> 
> After all we know a $1000 cpu is not worth $1000 or cost even $1000 to make. I am sure AMD is making a killing off Ryzen at its current price.

Again, you're just making suppositions without looking at all the information. That was my point.


----------



## Blameless

Quote:


> Originally Posted by *BinaryDemon*
> 
> It's a shame SMT can't be toggled in realtime (without reboot). This would be a nice feature for AMD and Intel platforms. I guess the Windows Scheduler cant handle it.


SMT doesn't do anything if no threads are scheduled to the extra logical cores.

95% of the time simple affinity changes will do exactly what disabling SMT would, and the remainder can be covered by core parking.


----------



## looniam

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ducegt*
> 
> The testing methodology is beyond your grasps and many users have ready tried to spoon feed it to you.
> 
> 
> 
> Like i said a useless test...
> I understand very well what it is for... The problem is when we will see a gpu that powerful enough?
> 
> in the meantime this is what we do XD
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: btw i find very funny that a 4790K vs 7700k stock in that chart is a mere 5%+ increase on "cpu" bound 3 gen later XD
> even using the same argument on the long run term.. intel vs intel XD.. But you know nobody mentions it XD

and you don't realize that is exactly what people have been complaining about for years now(?)

so thanks for pointing that out . . . i guess.


----------



## zGunBLADEz

Quote:


> Originally Posted by *looniam*
> 
> and you don't realize that is exactly what people have been complaining about for years now(?)
> 
> so thanks for pointing that out . . . i guess.


Well, the expectation for Ryzen before release was performance close to Haswell, maybe even 10-15% less, but with 8 cores and at that price point... yet all these people are comparing it to a 7700K, mind you at 5GHz, lol...










The 1700 only costs $330, giving you multitasking similar to a 6900K for a third of the price...
and you can game on it too...

If that's the case, let's compare how badly the 6950X does vs the 7700K in gaming XD, and then throw its $1600 price tag on top for kicks and giggles...


----------



## looniam

ok, if you say so.


----------



## Artikbot

Quote:


> Originally Posted by *Slomo4shO*
> 
> I'll leave this here as well since 75%+ of the benchmarked titles in any review are action/adventure/shooter games which only account for 16.3% of overall unit sales.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone willing to test the SIMS on 7700K and 1700?


That is a VERY good point. Back in the day reviewers would use popular games people played for their benchmarks. I remember sifting through sites to find which reviewers used the games I played.

That cannot be said of current reviews. I find myself looking at synthetic benchmarks the most, since gaming benchmarks are all but irrelevant for my use case.


----------



## Hueristic

This is what I care about as far as games go. Does anyone have other benchies of turn time? Apparently this guy was having memory issues as well.


----------



## Artikbot

I'm more than happy to run a few tests once mine gets here. Perhaps someone should start a community benchmark thread in the AMD section?


----------



## EightDee8D

One thing I learned from this launch is how biased some people and tech media are. When a GPU launches, nobody talks about GPU-limited 4K benchmarks, because *reasons*, even though 4K removes the CPU bottleneck. But on a CPU launch, 720p/1080p-low benchmarks are suddenly the most important, because they remove the GPU bottleneck. Both cases are pretty much the opposite of the real world, since hardly anyone buys a $400 CPU and a $700 GPU to game at 720p or 1080p low settings. What kind of BS and hypocrisy is this?


----------



## Quantum Reality

Quote:


> Originally Posted by *Hueristic*
> 
> This is what I care about AFA games go. Does anyone have other bechies of turn time? Apparently this guy was have mem issues as well.


As performance increases go, this kind of thing is exactly what I've been saying: within the AMD ecosystem, Ryzen easily beats Bulldozer and shows a marked improvement over that architecture. Bulldozer can, and probably will, be relegated to low-power general-purpose Internet usage, or to edge cases where the native speed of the CPU can still hold its own when absolute maximum performance is not the issue.

The fact that it matches, or at worst produces 80% of, the performance of Intel's current offerings is a bonus.

(EDIT to better reword my comparison)


----------



## Catscratch

Quote:


> Originally Posted by *daffy.duck*


I have a proposition. Let's stop believing non-built-in benchmark videos.


----------



## Quantum Reality

Quote:


> Originally Posted by *daffy.duck*


Interesting that disabling SMT barely lowers the frame rates.

I would definitely be interested in an 8C/8T Ryzen model from AMD - Perhaps a "Ryzen 7 1650"?


----------



## Uns33n

Quote:


> Originally Posted by *Quantum Reality*
> 
> Interesting that disabling SMT barely lowers the frame rates.
> 
> I would definitely be interested in an 8C/8T Ryzen model from AMD - Perhaps a "Ryzen 7 1650"?


It increases in most games


----------



## oxidized

Quote:


> Originally Posted by *Uns33n*
> 
> It increases in most games


It doesn't really, and when it does, it's completely random and only in a few games.


----------



## jprovido

Quote:


> Originally Posted by *oxidized*
> 
> It doesn't really, and when it does it's completely random and in few games


Several reviewers have confirmed there's indeed a performance hit in some games when SMT is enabled; even AMD acknowledged it. This isn't new: the same thing happened with my i7 930 back in the day. Since I only game on my PC, I opted to disable HT; I got an extra 200MHz of overclock and lower temps too. Intel fixed the HT issues with Sandy Bridge, though: no more performance hit or increased temps.


----------



## CULLEN

Quote:


> Originally Posted by *stargate125645*
> 
> OK? That doesn't dissuade from either of my points. People are claiming someone is using poor logic, then proceeds to use horrible logic to back up their own claim.
> Again, you're just making suppositions without looking at all the information. That was my point.


What, I..? I was just giving my thoughts on why AMD priced Ryzen so well, that was all. I think AMD priced them well to gain some market share.

If you're denying that Intel nearly bankrupted AMD with illegal business tactics, which gave them an almost complete monopoly and therefore the ability to charge whatever they felt like for their chips, then I don't know what to say.

But if the settlement was just a big misunderstanding and you have some inside information, please feel free to share it.


----------



## oxidized

Quote:


> Originally Posted by *jprovido*
> 
> It's been tested by several reviewers that there's indeed a performance hit on some games when SMT is enabled even AMD acknowledged it. This is not new this happened to my i7 930 back in the day. Since I only game on my pc I opted to disable HT I was able to get extra 200mhz oc and temps were lower too. Intel fixed ht issues with sandy bridge tho. No more performance hit and increase in temps


In every benchmark we've seen so far, gaming performance isn't that good, and disabling SMT only helped slightly in very few titles. It's really not worth it; I'd rather keep SMT enabled if disabling it only improves performance marginally in a few games I might never even play.


----------



## iRUSH

What's everyone's thoughts on the pricing? The 1700 is priced pretty well, but I find the other two hard to justify. In my eyes, the star of the show is clear.

I understand the X models seem to clock slightly better. But there's no way you can convince me the price premium on the X models is justified, right? Or can you...


----------



## ducegt

Quote:


> Originally Posted by *IRobot23*
> 
> Well some did run 3000MHz Dual channel like Joker did. Some board did have support for it.
> I just want a clear comparison between the cores.


You still don't understand. The 7700Ks were benched with 2400MHz memory when they can accept speeds above 4200. Intel at 3200 can also use lower timings than Ryzen. You don't give the impression that you understand how memory performance works. You need a clear understanding of these things first, and then to re-evaluate your curiosity.


----------



## wstanci3

I cannot see the benefit of disabling SMT for a few frames when you have to restart the system just to enable it again for productivity purposes, i.e. where this chip shines.


----------



## TopicClocker

Quote:


> Originally Posted by *EightDee8D*
> 
> One thing i learned after this launch is how biased some people/Tech-media are. when GPU launches nobody talks about GPU limited 4k res benchmarks because *Reasons* even though it removes CPU bottleneck. but On a cpu launch, 720p/1080p low benchmarks are the most important because it removes GPU bottleneck. both cases are kinda opposite of real world because hardly anyone buys 400$+700$ CPU+GPU to game on 720p/1080p low settings. what kind of BS and hypocrisy is this ?


I don't know what silly benchmarks you've seen, but there is definitely legitimacy to testing CPU limitations in games. Too many reviewers test GPU bound scenarios in CPU reviews; these are useless because you're limited by the GPU's capabilities, not the CPU's, so what you're really benchmarking is the GPU's performance.

What good is a GPU review when you're CPU bound? You wouldn't test a GPU under CPU bound scenarios and use that as a representation of its performance. Likewise, you shouldn't do a CPU review under GPU bound scenarios, because you're not testing the CPU's capabilities.

Testing at low settings is no good, and even misleading in some cases, because some titles have settings that put more load on the CPU. Joker's testing of GTA V was poor because he used low settings; he clearly doesn't know how to test CPU performance in games, since the Draw Distance settings in that game put more load on the CPU, especially the Extended Draw Distance setting.

Reducing the resolution to 720p would have been adequate if necessary, but since he's using a GTX 1080 he probably could have just run it at 1080p, bumped up the draw distance settings, and hit the city; this would have been a useful CPU benchmark. If you're targeting 120-144 fps, you'll start to see CPU limitations come into play in CPU intensive games like GTA V.

See the Watch Dogs 2 video below? See the GPU usage? Sub-90 percent: it's CPU limited, and below 80 fps throughout most of the video.

No matter how much GPU power you throw at it, you won't be able to get higher frame rates, because you're CPU limited.

Now, this isn't a final representation of Ryzen's performance, as there are obviously many software kinks to be worked out and reviewers should retest once they are; I'm just showing you an example of a real-world CPU limitation.


----------



## budgetgamer120

Quote:


> Originally Posted by *oxidized*
> 
> On every benchmark that we could see in these days, gaming performance are not that good, and SMT disabled only slightly worked for very few of them, it's nothing really worth it, i'd rather keep it enabled if it only improves next to nothing performance on few games, i might even never play in my life.


This gaming not good talk again


----------



## budgetgamer120

Quote:


> Originally Posted by *TopicClocker*
> 
> I don't know what silly benchmarks you've seen but there is definitely legitimacy to testing CPU limitations in games, too many reviewers test GPU bound scenarios in CPU reviews, these are useless as you're limited by the GPU's capabilities, not the CPUs, so what you're really benchmarking is the GPU's performance.
> 
> What good is a GPU review when you're CPU bound? You wouldn't test a GPU under CPU bound scenarios and use that as a representation of it's performance. You also shouldn't do a CPU review under GPU bound scenarios as you're not testing the CPU's capabilities.
> 
> Testing low settings is no good and even misleading in some cases as some titles have settings that put more load on the CPU. Joker's testing of GTA V was poor because he used low settings, he clearly doesn't know how to test CPU performance in Games at all as the Draw Distance settings in this game puts more load on the CPU, especially the Extended Draw Distance settings.
> 
> Reducing the resolution down to 720p would have been adequate if necessary but he's using a GTX 1080 so he probably could have just run it at 1080p and bumped up the draw distance settings and hit the city, this would have been a useful CPU benchmark. If you're targeting 120-144 fps you'll start to see the CPU limitations come into play in CPU intensive games like GTA V.
> 
> See this Watch Dogs 2 video below? See the GPU usage? Sub 90 percent, it's CPU limited and below 80 fps throughout most of the video.
> 
> No matter what GPU power you throw at it you wont be able get higher frame-rates as you're CPU limited.
> 
> Now this isn't final representation of Ryzen's performance as there are obviously many software kinks to be worked out and reviewers should retest when they are, I'm just showing you an example of a real world CPU limitation.


So what's wrong with the performance in the video again? Also, look at the CPU usage, for the people who aren't interested in a PC "console".


----------



## TopicClocker

Quote:


> Originally Posted by *budgetgamer120*
> 
> So whats wrong with the performance in the video again? Also look at CPU usage for peaople who arent interested in a PC "console".


CPU usage above 70% is pretty normal for this game on 4C/8T and 6C/12T processors; this game just eats CPUs.

I believe that video is fake, or something is wrong with their PC, because I've seen many other videos with 4-core i7s which have no stuttering; Digital Foundry also tested the same CPU and found no performance issues with it.
There's even a 38-minute video with an i7 4790K and a GTX 1080 on YouTube. I would link it, but it has some... stuff in it which probably shouldn't be posted on this forum.

*Watch Dogs 2 i7 6850K Titan X Pascal SLI*






*i7 5820K and GTX 980 Ti*






*i7 4790K GTX 1080*


----------



## jprovido

Before Ryzen, the 7700K was fine; now it's bottlenecking and maxing out in BF1 and Watch Dogs, smh. Ryzen is good; we don't need to mess with the 7700K to make it look better.


----------



## budgetgamer120

Quote:


> Originally Posted by *TopicClocker*
> 
> CPU usage above 70% is pretty normal for this game on 4C/8T and 6C/12T processors; this game just eats CPUs.
> 
> I believe that video is fake, or something is wrong with their PC, because I've seen many other videos with 4-core i7s that have no stuttering; Digital Foundry also tested the same CPU and found no performance issues with it.


Yeah sure it is fake


----------



## TopicClocker

Quote:


> Originally Posted by *budgetgamer120*
> 
> Yeah sure it is fake


Just go and look at the other videos; if the game was running like that on recent 4C i7s, people would be bawling left and right about it.


----------



## budgetgamer120

Quote:


> Originally Posted by *TopicClocker*
> 
> Just go and look at the other videos; if the game was running like that on i7s, people would be bawling left and right about it.


It all depends on what area of the map you are playing in. And no, I won't agree with you that Ryzen is "bad" at gaming.


----------



## TopicClocker

Quote:


> Originally Posted by *budgetgamer120*
> 
> It all depends on what area of the map you are playing in. And no, I won't agree with you that Ryzen is "bad" at gaming.


Where did I say Ryzen was "bad" at gaming?


----------



## Slomo4shO

Quote:


> Originally Posted by *Artikbot*
> 
> That is a VERY good point. Back in the day, reviewers would use the popular games people played for their benchmarks. I remember sifting through sites to find which reviewers used the games I played.
> 
> That cannot be said of current reviews. I find myself looking at synthetic benchmarks the most, since gaming benchmarks are all but irrelevant for my use case.


What boggles my mind is why popular console games, even if they are not popular on PC, get the most attention in these so-called reviews...

BF1 is a prime example of this...



Not even 1/3 of the player base compared to individual consoles, and barely 15% of all sales when comparing all consoles...


----------



## jprovido

Quote:


> Originally Posted by *Slomo4shO*
> 
> What boggles my mind is why popular console games, even if they are not popular on PC, get the most attention in these so-called reviews...
> 
> BF1 is a prime example of this...
> 
> 
> 
> Not even 1/3 of the player base compared to individual consoles, and barely 15% of all sales when comparing all consoles...


The Battlefield 1 multiplayer player count has never been an issue on PC. In every game I've played (granted, I don't play it much), the 64-player Conquest maps are always full, and sometimes you're in a queue waiting for a free slot. So yeah, it should be a focus too; a lot of people are playing it on PC.


----------



## EightDee8D

Quote:


> Originally Posted by *TopicClocker*
> 
> I don't know what silly benchmarks you've seen, but there is definitely legitimacy to testing CPU limitations in games; too many reviewers test GPU-bound scenarios in CPU reviews. These are useless because you're limited by the GPU's capabilities, not the CPU's, so what you're really benchmarking is the GPU's performance.
> 
> What good is a GPU review when you're CPU-bound? You wouldn't test a GPU under CPU-bound scenarios and use that as a representation of its performance. You also shouldn't do a CPU review under GPU-bound scenarios, as you're not testing the CPU's capabilities.
> 
> Testing at low settings is no good and even misleading in some cases, as some titles have settings that put more load on the CPU. Joker's testing of GTA V was poor because he used low settings; he clearly doesn't know how to test CPU performance in games, as the draw distance settings in this game put more load on the CPU, especially the Extended Draw Distance setting.
> 
> Reducing the resolution to 720p would have been adequate if necessary, but since he's using a GTX 1080 he probably could have just run it at 1080p, bumped up the draw distance settings, and hit the city; that would have been a useful CPU benchmark. If you're targeting 120-144 fps, you'll start to see CPU limitations come into play in CPU-intensive games like GTA V.
> 
> See this Watch Dogs 2 video below? See the GPU usage? Sub-90 percent: it's CPU-limited and below 80 fps throughout most of the video.
> 
> No matter how much GPU power you throw at it, you won't be able to get higher frame rates because you're CPU-limited.
> 
> Now, this isn't a final representation of Ryzen's performance, as there are obviously many software kinks to be worked out and reviewers should retest when they are; I'm just showing you an example of a real-world CPU limitation.


Thanks for completely missing my point.


----------



## Slomo4shO

Quote:


> Originally Posted by *jprovido*
> 
> Battlefield 1 players on multiplayer has never been an issue on pc. Every game I played (granted I don't play it much) 64 player conquest maps are always full and sometimes you're on queue waiting for a free slot so yea it should be a focus too a lot of people are playing it on pc


Let's go by units sold then...
Quote:


> "Breaking down the sales by platform, the game sold best on the PlayStation 4 with 2,158,901 units sold (62%), compared to 1,164,294 units sold on the Xbox One (34%). The game also sold *141,584 units on Windows PC (4%)*."
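
The percentages in that quote follow directly from the unit counts; a quick arithmetic check (figures copied from the quote above) reproduces them:

```python
# Launch-window units sold per platform, from the quoted sales figures.
sales = {"PS4": 2_158_901, "Xbox One": 1_164_294, "PC": 141_584}

total = sum(sales.values())
shares = {platform: round(units / total * 100) for platform, units in sales.items()}

print(shares)  # PC works out to roughly 4% of the combined total
```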


----------



## budgetgamer120

Quote:


> Originally Posted by *Slomo4shO*
> 
> Let's go by units sold then...


Ouch, my decision to move away from PC gaming is well justified, lol.


----------



## RedM00N

Quote:


> Originally Posted by *Slomo4shO*
> 
> What boggles my mind is why popular console games, even if they are not popular on PC, get the most attention in these so-called reviews...
> 
> BF1 is a prime example of this...
> 
> 
> 
> Not even 1/3 of the player base compared to individual consoles, and barely 15% of all sales when comparing all consoles...


I feel it's just because they are popular AAA titles (or, in some cases, games well known to be CPU-heavy/unoptimized).

I'd love to see benches from the most-played standalone games on PC (Source engine, Dota/LoL, for example), but we don't normally get those in reviews. Plus, people would probably just pull the "a potato can run it" card for these games, since they aren't generally resource-heavy.


----------



## iRUSH

Quote:


> Originally Posted by *iRUSH*
> 
> What's everyone's thoughts on the price? The 1700 is priced pretty well, but I find the other two pretty unjustified. In my eyes, the star of the show is clear.
> 
> I understand the X models seem to clock slightly better, but there's no way you can convince me that the price premium of the X models is justified, right? Or can you...


Am I not cool enough to post in this thread and get a response?


----------



## TopicClocker

Quote:


> Originally Posted by *Slomo4shO*
> 
> Let's go by units sold then...


Retail? Digital took over; those numbers are useless.


----------



## Slomo4shO

Quote:


> Originally Posted by *RedM00N*
> 
> I feel it's just because they are popular AAA titles (or, in some cases, games well known to be CPU-heavy/unoptimized).


They are not optimized because, historically, they do not sell nearly as well on PC as on console.

Quote:


> Originally Posted by *iRUSH*
> 
> Am I not cool enough to post in this thread and get a response?


You asked a question which has had at least a dozen responses already. The preliminary data puts the 1700 as the best value. There have been no discernible advantages to paying more for a 1700X or 1800X.


----------



## Kuivamaa

Quote:


> Originally Posted by *Slomo4shO*
> 
> Let's go by units sold then...


That does not include digital downloads, obviously, which is the prime way games are sold on PC these days. Even by the bf1stats you posted (which also don't quite tell the whole truth, since Origin profiles set to invisible won't show, for example), today, 5 months down the road, we have more daily active users on PC than BF1's combined hard-copy sales on PC from the first weeks. There are millions of monthly active players on PC.


----------



## Cyph3r

I feel like if the Ryzen CPUs could overclock to 4.4-4.6GHz they'd be topping the charts in gaming. It's a huge shame they cap out at 4GHz for most people.


----------



## cssorkinman

Quote:


> Originally Posted by *iRUSH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iRUSH*
> 
> What's everyone's thoughts on the price? The 1700 is priced pretty well, but I find the other two pretty unjustified. In my eyes, the star of the show is clear.
> 
> I understand the X models seem to clock slightly better, but there's no way you can convince me that the price premium of the X models is justified, right? Or can you...
> 
> 
> 
> Am I not cool enough to post in this thread and get a response?
Click to expand...

Eventually, with enough samples out there, we will be able to see the differences and judge whether they are justified in pure performance or performance/cost; the premium on the 1800X obviously excludes it from performance-per-dollar justification vs. the other two chips out there now.


----------



## blue1512

Quote:


> Originally Posted by *TopicClocker*
> 
> Reducing the resolution to 720p would have been adequate if necessary, but since he's using a GTX 1080 he probably could have just run it at 1080p, bumped up the draw distance settings, and hit the city; that would have been a useful CPU benchmark. If you're targeting 120-144 fps, you'll start to see CPU limitations come into play in CPU-intensive games like GTA V.
> 
> See this Watch Dogs 2 video below? See the GPU usage? Sub-90 percent: it's CPU-limited and below 80 fps throughout most of the video.
> 
> No matter how much GPU power you throw at it, you won't be able to get higher frame rates because you're CPU-limited.
> 
> Now, this isn't a final representation of Ryzen's performance, as there are obviously many software kinks to be worked out and reviewers should retest when they are; I'm just showing you an example of a real-world CPU limitation.


I don't think your video shows that the CPU is the bottleneck. If the CPU hits 100% and the GPU is not at 100%, as in the 7700K's case, the CPU has done its best and still bottlenecks the GPU. In this case, the 1700 has around ~60% utilization, so the software is not yet optimized and there is still potential for higher fps.
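
The rule of thumb the last few posts rely on (GPU well below 100% while frame rate stalls points at a CPU-side limit) can be sketched as a tiny classifier. Thresholds and sample numbers here are hypothetical, not authoritative, and real tools sample per-core rather than aggregate usage:

```python
def classify_bottleneck(cpu_pct: float, gpu_pct: float) -> str:
    """Classify one utilization sample as GPU- or CPU-side limited."""
    if gpu_pct >= 95:
        return "GPU-bound"               # GPU fully busy: a faster GPU means more fps
    if cpu_pct >= 95:
        return "CPU-bound (saturated)"   # CPU maxed out and starving the GPU
    # GPU starved but CPU not fully used: typical of a game that cannot
    # spread its work across all threads (the Ryzen-at-~60% case discussed).
    return "CPU/software-bound (underutilized)"

# Hypothetical (cpu %, gpu %) samples, as one might log per second in-game.
samples = [(60, 85), (98, 70), (50, 99)]
print([classify_bottleneck(c, g) for c, g in samples])
```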


----------



## Mad Pistol

Quote:


> Originally Posted by *Cyph3r*
> 
> I feel like if the Ryzen CPUs could overclock to 4.4-4.6GHz they'd be topping the charts in gaming. It's a huge shame they cap out at 4GHz for most people.


The architecture is pushed to the absolute limit at the moment. If I had to guess, there are probably 2 or 3 components of the chip that were designed with Haswell-like IPC in mind. When AMD realized that the IPC was actually higher (Broadwell-E level), it was too late to go back and redesign those parts for more bandwidth.

My guess is that those few parts will be tweaked and re-engineered in the next iteration of Ryzen, and we will see higher clock speeds as a result.

I can guarantee Nvidia did the same thing with their Maxwell architecture... thus, we now have Pascal, which clocks to 2GHz+ on overclocks.


----------



## dmasteR

Quote:


> Originally Posted by *Slomo4shO*
> 
> What boggles my mind is why popular console games, even if they are not popular on PC, get the most attention in these so-called reviews...
> 
> BF1 is a prime example of this...
> 
> 
> 
> Not even 1/3 of the player base compared to individual consoles, and barely 15% of all sales when comparing all consoles...


What games would you like them to test? BF1 is one of the most popular games on PC...


----------



## paulerxx

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ouch my decision to move away from PC gaming is well justified lol.


Are you serious? Those units sold do not include digital copies sold over Origin, Steam, etc.


----------



## Malinkadink

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ouch my decision to move away from PC gaming is well justified lol.


The only reason I would want to move away from PC gaming is to get away from cheaters, since on consoles there are virtually none, but it's not worth trading away the benefits of PC like high fps, better graphics, and mouse/keyboard.


----------



## budgetgamer120

Quote:


> Originally Posted by *paulerxx*
> 
> Are you serious? Those units sold do not include digital copies sold over Origin, Steam, etc.


Steam does not sell BF1...

But you are right otherwise. Willing to bet the total still is not half of the console numbers, though.


----------



## TopicClocker

Quote:


> Originally Posted by *blue1512*
> 
> I don't think your video shows that the CPU is the bottleneck. If the CPU hits 100% and the GPU is not at 100%, as in the 7700K's case, the CPU has done its best and still bottlenecks the GPU. In this case, the 1700 has around ~60% utilization, so the software is not yet optimized and there is still potential for higher fps.


You can still be CPU-limited without 100% CPU usage; it comes down to how the software uses the hardware: it's not taking full advantage of it.

This of course isn't Ryzen's full potential, and when the software kinks get worked out it will perform better, especially in well-parallelised games that are willing to use almost every ounce of CPU power available.
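
The point about being CPU-limited below 100% usage is easy to show with a toy calculation (all numbers hypothetical): on a 16-thread CPU, one fully saturated game thread barely moves the aggregate usage figure.

```python
# Toy numbers: a poorly parallelized game on an 8C/16T CPU.
logical_cores = 16

# Per-thread load in percent: one main thread pegged, a few light helpers.
thread_loads = [100, 40, 30, 20, 10, 10]

# Task-manager-style "overall" usage averages across all logical cores.
aggregate = sum(thread_loads) / logical_cores
print(f"aggregate CPU usage: {aggregate:.1f}%")  # ~13% overall

# Yet the frame rate is still capped by the one saturated thread.
main_thread_saturated = max(thread_loads) >= 100
print("CPU-limited despite low overall usage:", main_thread_saturated)
```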


----------



## Slomo4shO

Quote:


> Originally Posted by *dmasteR*
> 
> What games would you like them to test? BF1 is one of the most popular games on PC...


ROFL? One of the most popular on PC? The data suggests otherwise.

Quote:


> Originally Posted by *Kuivamaa*
> 
> Does not include digital downloads, obviously.


Digital sales were still twice that of PC....
Quote:


> Originally Posted by *Kuivamaa*
> 
> Which is the prime way games are sold on PC these days. Even by the bf1stats you posted (which also don't quite tell the whole truth, since Origin profiles set to invisible won't show, for example), today, 5 months down the road, we have more daily active users on PC than BF1's combined hard-copy sales on PC from the first weeks.


More copies being sold after initial launch. Who would have imagined








Quote:


> Originally Posted by *Kuivamaa*
> 
> There are millions monthly active players on the PC.


Are you deliberately ignoring the stats that are easily available online?

Quote:


> Originally Posted by *paulerxx*
> 
> Are you serious? Those units sold do not include digital copies sold over Origin, Steam, etc.


In which consoles still surpassed PC...


----------



## iRUSH

Quote:


> Originally Posted by *Slomo4shO*
> 
> They are not optimized because, historically, they do not sell nearly as well on PC as on console.
> You asked a question which has had at least a dozen responses already. The preliminary data puts the 1700 as the best value. There have been no discernible advantages to paying more for a 1700X or 1800X.


This thread has moved pretty quickly, sorry.


----------



## looniam

Quote:


> Originally Posted by *iRUSH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iRUSH*
> 
> What's everyone's thoughts on the price? The 1700 is priced pretty well, but I find the other two pretty unjustified. In my eyes, the star of the show is clear.
> 
> I understand the X models seem to clock slightly better, but there's no way you can convince me that the price premium of the X models is justified, right? Or can you...
> 
> 
> 
> Am I not cool enough to post in this thread and get a response?
Click to expand...

i guess not, but you did get a few replies since.

it is mind-boggling why AMD put such a price premium on a generally better out-of-box experience.

unless there is some magic that a proper bios will uncover later . . .


----------



## Slomo4shO

Quote:


> Originally Posted by *looniam*
> 
> it is mind boggling why AMD put such a price premium for a generally better out of box experience.


If Nvidia can charge a $100 tax for a label and inferior performance...


----------



## TopicClocker

Quote:


> Originally Posted by *Slomo4shO*
> 
> ROFL? One of the most popular on PC? The data suggests otherwise.
> Digital sales were still twice that of PC....
> More copies being sold after initial launch. Who would have imagined
> 
> 
> 
> 
> 
> 
> 
> 
> Are you deliberately ignoring the stats that are easily available online?
> In which consoles still surpassed PC...


There are more players on console, okay. Why does it matter? The game is still populated on PC. If it were on Steam, it would probably be doing better.


----------



## Slomo4shO

Quote:


> Originally Posted by *TopicClocker*
> 
> There's more players on console, okay. Why does it matter? The game is still populated on PC. If it was on Steam it would probably be doing better.


There are plenty of other games more popular on PC that never enter the limelight of reviews... If one is going to make the argument that a CPU can't game, shouldn't they at least demonstrate the capacity of said CPU on the majority of the games played on PC, and not on the highly publicized console ports that barely anyone plays?



Shouldn't you demand objectivity and validity from reviews? I suppose you can continue to argue for a piece of the spectrum of PC gaming that occupies a very small niche of the overall PC gaming market...


----------



## dmasteR

Quote:


> Originally Posted by *Slomo4shO*
> 
> ROFL? One of the most popular on PC? The data suggests otherwise.
> Digital sales were still twice that of PC....
> More copies being sold after initial launch. Who would have imagined
> 
> 
> 
> 
> 
> 
> 
> 
> Are you deliberately ignoring the stats that are easily available online?
> In which consoles still surpassed PC...


What games are more popular?

You have CS:GO, DOTA2, League of Legends, GTA V, Overwatch, TF2, WoW, Path of Exile, Rust, Rocket League, H1Z1, ARK. BF1 matches up with a few of those games in terms of concurrent players.


----------



## Kuivamaa

Quote:


> Originally Posted by *Slomo4shO*
> 
> Digital sales were still twice that of PC....
> More copies being sold after initial launch. Who would have imagined
> 
> 
> 
> 
> 
> 
> 
> 
> Are you deliberately ignoring the stats that are easily available online?


First of all, my post was a direct answer to your notion that PC amounts to 4% of BF1 sales. I am glad we now agree it has a far greater market percentage than that. As for the "stats that I ignore," I think you do not understand how metrics work. What bf1stats shows as "currently online" are concurrent players, and the 24h peak is the highest concurrent count for the day, not how many logged in. Daily active users will be orders of magnitude larger than concurrent users, as the count includes all the unique players that logged in and played in a span of 24h. And they are in the hundreds of thousands. Monthly unique players are in the millions, yes. And again, these trackers are limited to what they can actually see, especially on PC. The actual player count is higher (still, the PS4 has the most players; that is not up for debate and never was).


----------



## TopicClocker

Quote:


> Originally Posted by *Slomo4shO*
> 
> There are plenty of other games more popular on PC that never enter the limelight of reviews... If one is going to make the argument that a CPU can't game, shouldn't they at least demonstrate the capacity of said CPU on the majority of the games played on PC, and not on the highly publicized console ports that barely anyone plays?


The games tested should be diverse and not limited to a certain type. I see no issue with testing Battlefield 1, but many other games should be tested as well, such as Dota 2, CS:GO, GTA V, Overwatch, MMOs, and a couple of new and older releases, including notably popular titles on the platform. This provides a diverse picture of gaming performance to the readers/watchers of the review.


----------



## blue1512

Quote:


> Originally Posted by *TopicClocker*
> 
> You can still be CPU limited and not have 100% CPU usage, its due to how the software uses the hardware, it's not taking full advantage of it.
> 
> This of-course isn't Ryzen's full potential and when the software kinks get worked out it will perform better, especially in well parallelised games that are willing to use almost every ounce of CPU power available.


While you are right about the CPU limit, you can't pass a final verdict on Ryzen yet. For example, The Stilt boosted Ryzen's gaming fps by ~10% just by using Win 7 instead of Win 10 (yes, with a modded driver). The general consensus now is that Ryzen gets murdered by the Win 10 scheduler in games, and an update is much needed before a re-bench.


----------



## Slomo4shO

Quote:


> Originally Posted by *dmasteR*
> 
> You have CS:GO, DOTA2, League of Legends, GTA V, Overwatch, TF2, WoW, Path of Exile, Rust, Rocket League, H1Z1, ARK. *BF1 matches up with a few of those games in terms of concurrent players*.


I would love to see the data for this.

Quote:


> Originally Posted by *Kuivamaa*
> 
> First of all, my post was a direct answer to your notion that PC amounts to 4% of BF1 sales. I am glad we now agree it has a far greater market percentage than that. As for the "stats that I ignore," I think you do not understand how metrics work.


Fairly certain that I said the PC segment accounted for roughly 15% in my initial claim...
Quote:


> Originally Posted by *Slomo4shO*
> 
> barely 15% of all sales when comparing all consoles...


Oh look... I did...


----------



## blue1512

Quote:


> Originally Posted by *TopicClocker*
> 
> The games tested should be diverse and not limited to a certain type. I see no issue with testing Battlefield 1, but many other games should be tested as well, such as Dota 2, CS:GO, GTA V, Overwatch, MMOs, and a couple of new and older releases, including notably popular titles on the platform. This provides a diverse picture of gaming performance to the readers/watchers of the review.


The benches should be run on demanding games, not popular garbage like LoL, which can run on a potato.


----------



## Slomo4shO

Quote:


> Originally Posted by *TopicClocker*
> 
> The games tested should be diverse and not limited to a certain type. I see no issue with testing Battlefield 1, but many other games should be tested as well, such as Dota 2, CS:GO, GTA V, Overwatch, MMOs, and a couple of new and older releases, including notably popular titles on the platform. This provides a diverse picture of gaming performance to the readers/watchers of the review.


This is a great stance and one I agree with. Diversity is key. But this isn't what the typical gaming benchmark suite covers...
Quote:


> Originally Posted by *blue1512*
> 
> The benches should be run on demanding games, not popular garbage like LoL, which can run on a potato.


Shouldn't it have both? If more players play LoL than all shooter games on PC combined, shouldn't they also have some understanding of how the hardware matches their needs?


----------



## blue1512

Quote:


> Originally Posted by *Slomo4shO*
> 
> This is a great stance and one I agree with. Diversity is key. But this isn't what the typical gaming benchmark suite covers...
> Shouldn't it have both? If more players play LoL than all shooter games on PC combined, shouldn't they also have some understanding of how the hardware matches their needs?


If LoL can be played on a potato at a stable 60 fps, why would its players care about a benchmark showing >100 fps? The fact is, they just don't care. Why bother benching the game anyway?


----------



## Kuivamaa

Quote:


> Originally Posted by *dmasteR*
> 
> What games are more popular?
> 
> You have CS:GO, DOTA2, League of Legends, GTA V, Overwatch, TF2, WoW, Path of Exile, Rust, Rocket League, H1Z1, ARK. BF1 matches up with a few of those games in terms of concurrent players.


Steam data are very accurate. Third-party trackers, not so much: invisible Origin profiles, database downtime, broken queries, bugs; lots of things get lost in translation. So yes, Origin and, I suppose, battle.net player counts alike appear skewed.


----------



## mcg75

Quote:


> Originally Posted by *blue1512*
> 
> While you are right about the CPU limit, you can't pass a final verdict on Ryzen yet. For example, The Stilt boosted Ryzen's gaming fps by ~10% just by using Win 7 instead of Win 10 (yes, with a modded driver). The general consensus now is that Ryzen gets murdered by the Win 10 scheduler in games, and an update is much needed before a re-bench.


He's not trying to give a final verdict. Every reasonable person here, including him, has stated that there are bugs to be worked out and things can get better.


----------



## jprovido

Quote:


> Originally Posted by *dmasteR*
> 
> What games are more popular?
> 
> You have CS:GO, DOTA2, League of Legends, GTA V, Overwatch, TF2, WoW, Path of Exile, Rust, Rocket League, H1Z1, ARK. BF1 matches up with a few of those games in terms of concurrent players.


Dota 2 is HARD to run ever since the 7.00 update. Yes, you heard it here: my 5820K @ 4.7GHz gets fps drops at 1440p/144Hz to as low as 90 fps in big team fights. It's crazy.


----------



## blue1512

Quote:


> Originally Posted by *mcg75*
> 
> He's not trying to give a final verdict. Every reasonable person here, including him, has stated that there are bugs to be worked out and things can get better.


Sure, mate. I didn't say that he was trying to give a final verdict.


----------



## budgetgamer120

Quote:


> Originally Posted by *mcg75*
> 
> He's not trying to give a final verdict. Every reasonable person here, including him, has stated that there are bugs to be worked out and things can get better.


And we reasonable people are saying things are not bad, or even close to bad.


----------



## Slomo4shO

Let's take a look at the diversity of games in review benchmarks, bearing in mind that shooters occupy only 6.3% of all PC game sales and action/adventure titles cover another 10%:

Techspot:
Watch Dogs 2, Overwatch, BF1, Gears of War 4

Legitreviews:
Thief, GTA 5, Deus Ex

Guru3d:
RE7, BF1, Watch Dogs 2, Dishonored 2, Deus Ex, The Division, Far Cry Primal, Hitman, Rise of the Tomb Raider

HardwareCanucks:
BF1, COD: IW, Deus Ex, Doom, GTA 5, Overwatch

TomsHardware:
Ashes of Singularity, BF4, Hitman, Metro:LL, Project Cars

PCPer:
Civ 5, Rise of the Tomb Raider, Hitman

Hexus:
Deus Ex, Hitman, Total War: Warhammer

GamersNexus:
Watch Dogs 2, BF1, Ashes of Singularity, GTA5, Metro: LL, Total War: Warhammer

TechReport:
Deus Ex, Doom, GTA5, Crysis 3, Watch Dogs 2

OverclockersClub:
Hitman, The Division

Quote:


> Originally Posted by *blue1512*
> 
> If lol can be played on a potato with stable 60 fps, why do their players care about a benchmark with >100fps? The fact is, they just don't care. Why bother benching the game anyway?


By that same logic, why bother benching a game that less than 1% of the consumer base will be playing?


----------



## sugarhell

Quote:


> Originally Posted by *Mad Pistol*
> 
> The architecture is pushed to the absolute limit at the moment. If I had to guess, there is probably 2 or 3 components of the chip that were designed with Haswell-like IPC in mind. When AMD realized that the IPC is actually higher (Broadwell-E), it was too late to go back and redesign those parts for more bandwidth.
> 
> My guess is that those few parts will be tweaked and re-engineered with the next iteration of Ryzen, and we will see higher clock speeds as a result.
> 
> I can guarantee Nvidia did the same thing with their Maxwell architecture... thus, we now have Pascal which clocks to 2Ghz+ on overclocks.


It doesn't work that way...

The moment they chose 14nm LPP, they knew they would not hit the clocks of Skylake/Kaby Lake.

The L3 cache on Ryzen is a victim cache, so a core first checks the L2 caches within its CCX. Then, if needed, it communicates over the data fabric to the other CCX's L3 cache. The bandwidth looks fine for this kind of work, but the workflow is _different_ from Intel's approach. It also seems that the bandwidth of the data fabric is tied to the memory speed, with a 1:2 ratio (thanks, The Stilt).

The only problems I see are the weird/weak IMC, software not working correctly with Ryzen, and the motherboard stability issues.
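
The 1:2 memory-speed relationship mentioned above can be turned into a rough back-of-the-envelope number. This sketch assumes the fabric clock runs at half the DDR4 effective transfer rate, and the 32 bytes/cycle link width is a placeholder assumption, not a published spec:

```python
BYTES_PER_CYCLE = 32  # hypothetical link width, for illustration only

def fabric_bandwidth_gbs(ddr_effective_mt_s: float) -> float:
    """Rough cross-CCX bandwidth in GB/s for a given DDR4 speed in MT/s."""
    fabric_clock_mhz = ddr_effective_mt_s / 2  # the 1:2 ratio with memory speed
    return fabric_clock_mhz * 1e6 * BYTES_PER_CYCLE / 1e9

# On this model, faster RAM directly raises the inter-CCX link bandwidth.
for speed in (2133, 2666, 3200):
    print(f"DDR4-{speed}: ~{fabric_bandwidth_gbs(speed):.1f} GB/s")
```

Whatever the true link width, the takeaway matches the post: memory overclocking on Ryzen speeds up cross-CCX traffic, not just the memory itself.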


----------



## JackCY

Can someone with Ryzen test simulators? Say Kerbal Space Program, Assetto Corsa, some flight sims, whatever you've got. Traditional reviews don't really bother to test games/sims that use the CPU a lot and mostly stick to arcade FPS titles. :/
I would even welcome Arma 3 being tested, as a war sim, even though its engine runs horribly on any PC.


----------



## blue1512

Quote:


> Originally Posted by *sugarhell*
> 
> It doesn't work that way...
> 
> The moment they chose 14nm LPP, they knew they would not hit the clocks of Skylake/Kaby Lake.
> 
> The L3 cache on Ryzen is a victim cache, so a core first checks the L2 caches within its CCX. Then, if needed, it communicates over the data fabric to the other CCX's L3 cache. The bandwidth looks fine for this kind of work, but the workflow is _different_ from Intel's approach. It also seems that the bandwidth of the data fabric is tied to the memory speed, with a 1:2 ratio (thanks, The Stilt).
> 
> The only problems I see are the weird/weak IMC, software not working correctly with Ryzen, and the motherboard stability issues.


This.

I highly recommend The Stilt's thread on AnandTech for a clear picture of Ryzen.


----------



## Xuper

While you're all talking, I'm still stuck on my Phenom II 925, /cry! Waiting two months to get a Ryzen 1600X... Intel six-cores are too damn expensive (around $540).


----------



## Buris

Quote:


> Originally Posted by *sugarhell*
> 
> It doesn't work that way...
> 
> The moment they chose 14nm LPP, they knew they would not hit the clocks of Skylake/Kaby Lake.
> 
> The L3 cache on Ryzen is a victim cache, so a core first checks the L2 caches within its CCX. Then, if needed, it communicates over the data fabric to the other CCX's L3 cache. The bandwidth looks fine for this kind of work, but the workflow is _different_ from Intel's approach. It also seems that the bandwidth of the data fabric is tied to the memory speed, with a 1:2 ratio (thanks, The Stilt).
> 
> The only problems I see are the weird/weak IMC, software not working correctly with Ryzen, and the motherboard stability issues.


Which is insane to think about, because a 16nm CPU could potentially have given Ryzen clock speeds in excess of 5GHz on air.

However, Ryzen is by no means "bad" for gaming, just like the 6900 and 6800 aren't "bad" for gaming. They're just better at other tasks.


----------



## Master__Shake

Quote:


> Originally Posted by *Catscratch*
> 
> I have a proposition. Let's stop believing non-built-in benchmark videos.


you can't account for every single variable if you do not use canned benchmarks.


----------



## Master__Shake

Quote:


> Originally Posted by *Slomo4shO*
> 
> What boggles my mind is why popular console games, even if they are not popular on PC, get the most attention in these so-called reviews...
> 
> BF1 is a prime example of this...
> 
> 
> 
> Not even 1/3 of the player base compared to individual consoles, and barely 15% of all sales when comparing all consoles...


Quote:


> Originally Posted by *Slomo4shO*
> 
> Let's go by units sold then...












i blame origin i guess...if i had to

too funny though.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Slomo4shO*
> 
> This is a great stance and one I agree with. Diversity is key. But this isn't what the typical gaming benchmark suite covers...
> Shouldn't it have both? If more players play LoL than all shooter games on PC combined, shouldn't they also have some understanding of how the hardware matched their needs?


I think the point he is making is that these more popular PC titles you speak of are no problem for even old hardware to run, much less anything currently available. In my experience the game bench suites are usually based around what games push the hardware the hardest rather than what games are the most popular. That way you get a worst case scenario when comparing FPS between hardware rather than a situation where EVERYTHING runs at 200+ FPS and there's no differentiation between the hardware being compared.


----------



## kaosstar

Quote:


> Originally Posted by *blue1512*
> 
> This.
> 
> I highly recommend The Stilt's thread on Anandtech for a clear picture of Ryzen.


Good thread. Kind of funny how "some forum guy" can do a better review, in many respects, than the "professionals".


----------



## Scotty99

Quote:


> Originally Posted by *kaosstar*
> 
> Good thread. Kind of funny how "some forum guy" can do a better review, in many respects, than the "professionals".


Well, they wouldn't have gotten big in the first place if they did product reviews like that guy does lol.


----------



## Slomo4shO

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think the point he is making is that these more popular PC titles you speak of are no problem for even old hardware to run, much less anything currently available.


Considering that strategy and simulation games are more taxing on the CPU than your run-of-the-mill shooter that uses 2-4 cores, and that these two genres cover over 50% of the PC gamer market, how exactly is propagating a myth based on a niche market useful for the general consumer?


----------



## blue1512

Quote:


> Originally Posted by *kaosstar*
> 
> Good thread. Kind of funny how "some forum guy" can do a better review, in many respects, than the "professionals".


The Stilt is not just "some forum guy", though. He has been one of the most reliable people when it comes to AMD. Meanwhile the "professionals" mostly say what the sponsors or the audience want; they just don't delve into the chip at the enthusiast level.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Slomo4shO*
> 
> Considering that strategy and simulation games are more taxing on the CPU than your run-of-the-mill shooter that uses 2-4 cores, and that these two genres cover over 50% of the PC gamer market, how exactly is propagating a myth based on a niche market useful for the general consumer?


Sigh, whatever man. Obviously the suite of games the reviewers test must be what people want to see, regardless of how you feel about the selection. I'm sure if people were really clamoring for a breakdown of the 7700K's performance in The Sims, they would be doing that...


----------



## randomizer

If it's important to you that reviewers test The Sims (or whatever popular game that they're not testing) then why not ask them to test it?


----------



## Liranan

Quote:


> Originally Posted by *blue1512*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> This doesn't work this way...
> 
> The moment they chose 14nm LPP, they knew they would not hit Skylake/Kaby Lake clocks.
> 
> The L3 cache on Ryzen is a victim cache, so it first needs to check the L2 caches of the CCX. Then, if needed, it communicates over the data fabric to the other CCX's L3 cache. The bandwidth looks fine for this kind of work, but the work itself is different from Intel's approach. It also seems that the data fabric's bandwidth is tied to the memory speed at a 1:2 ratio (thanks, The Stilt).
> 
> The only problems I see are the weird/weak IMC, software not working correctly with Ryzen, and motherboard stability issues.
> 
> 
> 
> This.
> 
> I highly recommend The Stilt's thread on Anandtech for a clear picture of Ryzen.

Excellent thread, showing both Zen's strengths and weaknesses.

Zen mobile chips should be amazingly efficient and powerful compared with Intel's offerings, so the next revision of the Xbox might actually use a high-clocked Zen chip with a very powerful GPU.


----------



## mohiuddin

Quote:


> Originally Posted by *Buris*
> 
> Which is insane to think about, because a 16nm CPU could have potentially given Ryzen clockspeeds in excess of 5ghz on air.


Yes, same question: why didn't they choose TSMC's 16nm node in the first place?
Architecturally, Ryzen is just awesome; I think it surpasses Intel's latest offerings. But GloFo's 14nm LPP is the main culprit behind Ryzen's ~4GHz overclocking cap.


----------



## Travieso

Quote:


> Originally Posted by *Slomo4shO*
> 
> What boggles my mind is why popular console games, even if they are not popular on pc, get the most attention in these so called reviews...
> 
> BF1 is prime example of this...
> 
> 
> 
> Not even 1/3 of the player base compared to individual consoles, and barely 15% of all sales when comparing all consoles...


Because we PC gamers are still stuck with games built on ancient engines, like DotA 2, LoL, and CS:GO. All of these games run perfectly fine even on a microwave oven.

There's no point reviewing games that can run on grandma's Bay Trail tablet or a Windows stick PC. Reviewers need to benchmark games that are less popular but have much more advanced graphics to make the results more relevant.


----------



## DaaQ

Quote:


> Originally Posted by *kaosstar*
> 
> Good thread. Kind of funny how "some forum guy" can do a better review, in many respects, than the "professionals".


http://www.overclockers.com/amd-ryzen-7-1800x-cpu-review/
Here's another good review from "some forum guy" that is much better than these "professional" journalists.


----------



## blue1512

Quote:


> Originally Posted by *mohiuddin*
> 
> Yes, same question: why didn't they choose TSMC's 16nm node in the first place?
> Architecturally, Ryzen is just awesome; I think it surpasses Intel's latest offerings. But GloFo's 14nm LPP is the main culprit behind Ryzen's ~4GHz overclocking cap.


LPP seems to be extremely good for servers though.


----------



## sumitlian

Quote:


> Originally Posted by *kaosstar*
> 
> Good thread. Kind of funny how "some forum guy" can do a better review, in many respects, than the "professionals".


Many of these so-called professionals are technically illiterate. The Stilt is not "some forum guy": he has in-depth knowledge of low-level computer system architecture, and he is most likely a great low-level programmer as well. If I remember correctly, he wrote a Windows program that enables x87 support (officially disabled in the BIOS) for the Bulldozer family of AMD CPUs.


----------



## mohiuddin

A very good review for games>>


----------



## kariverson

Quote:


> Originally Posted by *DaaQ*
> 
> http://www.overclockers.com/amd-ryzen-7-1800x-cpu-review/
> Here's another good review from "some forum guy" that is much better than these "professional" journalists.


Wait a minute, do I see right? It goes toe to toe with the 6950x?!?


----------



## DaaQ

Quote:


> Originally Posted by *kariverson*
> 
> Wait a minute, do I see right? It goes toe to toe with the 6950x?!?


All chips were set at 4GHz.


----------



## kariverson

Quote:


> Originally Posted by *DaaQ*
> 
> All chips set at 4ghz.


Doesn't that i7 only turbo to 3.50GHz? So was it OCed for the test?


----------



## Majin SSJ Eric

It's not like the 6950X goes that much farther beyond 4GHz though...


----------



## pez

Quote:


> Originally Posted by *Slomo4shO*
> 
> What boggles my mind is why popular console games, even if they are not popular on pc, get the most attention in these so called reviews...
> 
> BF1 is prime example of this...
> 
> 
> 
> Not even 1/3 of the player base compared to individual consoles, and barely 15% of all sales when comparing all consoles...


BF1 just had a free weekend on both consoles, I believe. That would at least account for some of that peak I imagine. BF1 is a game that's been great on all platforms. It probably says more that console gamers are taking to better games instead of the rehashed CoD crap.


----------



## budgetgamer120

Quote:


> Originally Posted by *pez*
> 
> BF1 just had a free weekend on both consoles, I believe. That would at least account for some of that peak I imagine. BF1 is a game that's been great on all platforms. It probably says more that console gamers are taking to better games instead of the rehashed CoD crap.


It was also free on PC.

https://www.battlefield.com/news/article/free-trial-battlefield-1-this-weekend


----------



## pez

Quote:


> Originally Posted by *budgetgamer120*
> 
> It was also free on PC.
> 
> https://www.battlefield.com/news/article/free-trial-battlefield-1-this-weekend


Yeah, I'll blame Origin, then. I haven't played BF1 as much as I thought I would, either. I paid full price even. That'll teach me... for the hundredth time in a row....


----------



## Kuivamaa

Quote:


> Originally Posted by *pez*
> 
> BF1 just had a free weekend on both consoles, I believe. That would at least account for some of that peak I imagine. BF1 is a game that's been great on all platforms. It probably says more that console gamers are taking to better games instead of the rehashed CoD crap.


Check both tracking sites' numbers; they are never equal. That's because their numbers are estimates, not 100% reliable data. On PC especially, they are way off.


----------



## Majin SSJ Eric

I haven't gotten BF1 yet because it's still $60! Need a sale on this game already!


----------



## Slink3Slyde

Not to beat the gaming horse too much, but Techspot has a gaming test article up where they've disabled SMT for all tests and used 3000MHz RAM on a Gigabyte X370 Gaming 5.

http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/



Disabling SMT doesn't seem to make much difference in most of their tests. Another review for the OP anyway.


----------



## 1216

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I haven't gotten BF1 yet because its still $60! Need a sale on this game already!


50% off right now for the standard edition


----------



## Wishmaker

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Not to beat the gaming horse too much, but Techspot has a gaming test article up where they've disabled SMT for all tests and used 3000MHz RAM on a Gigabyte X370 Gaming 5.
> 
> http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/
> 
> Disabling SMT doesn't seem to make much difference in most of their tests. Another review for the OP anyway.


That is not what we were expecting, is it?


----------



## Slink3Slyde

Quote:


> Originally Posted by *Wishmaker*
> 
> That is not what we were expecting, were we?


Honestly, I didn't know what to expect. Total War: Warhammer shows a bit of a gain from SMT off, but in BF1 the minimums are actually far worse with it off.

From my perspective, I was looking to upgrade to Zen if it gave me a bit of a bump in ST performance over my 3570K (which is a poor clocker), a platform I could keep for a few years and upgrade the CPU on if I needed to, and something that would let me get some decent scores in 3DMark when I want to play with it occasionally. Unfortunately, it seems I'd likely be worse off in some games and better in others. Not really something I'd like to invest in for my uses.

Now I'm looking at 7700Ks, because I really only game. I don't stream or do anything else that needs more threads, and looking at overclocking I don't believe the R5 chips are going to match the 7700K. They might affect pricing, and I'm sure they'll be great value chips, but I'm tired of waiting, and by the time summer comes here I won't be gaming half as much anyway.

If a 7700K starts to lag behind after the new chips release, and if games start to show some benefit, I'm quite happy to sell up again and build something new next winter.


----------



## prznar1

Is there a review of msi b350 tomahawk board?


----------



## Nickyvida

Seems like AMD chose the wrong process node for Ryzen. They all top out around 4GHz because it is fabbed on 14nm LPP, the same node used for smartphones IIRC. I suspect the 4- and 6-core parts will clock higher than the 8-cores,
but not by much; probably only 4.4-4.5GHz if lucky.

The chip design is solid, but AMD shot itself in the foot by choosing the wrong node. Why not 14nm HP or 16nm HP?

With a better process node it could probably have hit clock speeds of 5GHz on air, at least for the 4-core parts.


----------



## Kuivamaa

Quote:


> Originally Posted by *Nickyvida*
> 
> Seems like AMD chose the wrong process node for Ryzen. They all top out around 4GHz because it is fabbed on 14nm LPP, the same node used for smartphones IIRC. I suspect the 4- and 6-core parts will clock higher than the 8-cores,
> but not by much; probably only 4.4-4.5GHz if lucky.
> 
> The chip design is solid, but AMD shot itself in the foot by choosing the wrong node. Why not 14nm HP or 16nm HP?
> 
> With a better process node it could probably have hit clock speeds of 5GHz on air, at least for the 4-core parts.


It is not just the node, it seems; Ryzen is a high-density design too. Even if AMD had the option to go with TSMC (they do not), clocks wouldn't have been much different. On the other hand, Ryzen is very, very power efficient at stock or lower clocks. Server and laptop chips should be insanely good.


----------



## JackCY

They "have to" use GF, which uses a Samsung node. They probably wanted low power, which is more beneficial than a node that allows higher frequencies on a design made for it. Look how long it took Intel to get to 5GHz: they still do not sell a CPU with those clocks stock, it's about the max OC on their newest chips and on some older, simpler chips, and they own the fab and have far more money to burn on optimizations.

IMHO Ryzen runs best up to about 3.3GHz, 3.5GHz max; beyond that it starts to consume more and more power, as it needs larger bumps in voltage to raise clocks. At those efficient clocks it's a beast. Sure, at 4.0GHz it's eating a lot of power, just like Intel's 8-cores.


----------



## huzzug

Quote:


> Originally Posted by *Nickyvida*
> 
> Seems like AMD chose the wrong process node for Ryzen. They all top out around 4GHz because it is fabbed on 14nm LPP, the same node used for smartphones IIRC. I suspect the 4- and 6-core parts will clock higher than the 8-cores,
> but not by much; probably only 4.4-4.5GHz if lucky.
> 
> The chip design is solid, but AMD shot itself in the foot by choosing the wrong node. Why not 14nm HP or 16nm HP?
> 
> With a better process node it could probably have hit clock speeds of 5GHz on air, at least for the 4-core parts.


That could have been a possibility, but AMD must also have been planning to put these chips in mobile (laptops, notebooks & ultrabooks), where a low power envelope is everything for a given level of performance. Going with 16HP would have forced AMD to design a separate uarch for mobile, which even Intel does not do.


----------



## Nickyvida

Quote:


> Originally Posted by *Kuivamaa*
> 
> It is not just the node, it seems; Ryzen is a high-density design too. Even if AMD had the option to go with TSMC (they do not), clocks wouldn't have been much different. On the other hand, Ryzen is very, very power efficient at stock or lower clocks. Server and laptop chips should be insanely good.


Well, look at Pascal; it can clock really well on 16nm. I'm not hating, just saying it was a missed opportunity, because if it weren't node-limited it would probably have given the 7700K a real fight.


----------



## SuperZan

Quote:


> Originally Posted by *Nickyvida*
> 
> Well, look at pascal, it can clock really well on 16nm. Im not hating. Just saying it was a missed opportunity because if it wasnt node limited it would have given 7700k a real fight probably


It already does if your focus isn't 1080p 144Hz gaming. A Ryzen quad that can hit 4.4 or better would be an absurd price/performance proposition, especially with Zen+ compatibility and the ability to freely drop in a hex or octa if your workload/interests/the meta change. If I'm a 60Hz 1080p gamer, it's an easy choice, IMO.


----------



## Wishmaker

Quote:


> Originally Posted by *SuperZan*
> 
> It already does if your focus isn't 1080p 144Hz gaming. A Ryzen quad that can hit *4.4* or better would be an absurd price/performance proposition, especially with Zen+ compatibility and the ability to freely drop in a hex or octa if your workload/interests/the meta change. If I'm a 60Hz 1080p gamer, it's an easy choice, IMO.


4.4 GHz??


----------



## SuperZan

Yep, I think 4.4 is a safe bet with what we know of 14nm LPP. I've been gaming happily on a 3.9GHz 1700X. I even turned off one 480 and pulled out the 1080p 60Hz display to morph into Average Jill gamer. It passed the taste test, so a higher-clocking quad Ryzen will basically offer everyone who buys i3/i5 chips an i7 for the price they're used to. Not bad for a basic gamer. It'll be a tremendous value play, IMO, and if the GTX 970 taught us anything, it's that gamers love a value play.


----------



## huzzug

Quote:


> Originally Posted by *SuperZan*
> 
> and if the GTX 970 taught us anything it's that gamers love a value play.


Well, it was a well played value offering.

OT: Have you tried disabling cores to make it behave like what might be the 6- and 4-core variants? Does disabling cores have any impact on the ability to clock high, or is 4.2 a wall?


----------



## Olivon

Quote:


> Originally Posted by *Wishmaker*
> 
> 4.4 GHz??


We can't stop people from dreaming


----------



## SuperZan

Quote:


> Originally Posted by *huzzug*
> 
> Well, it was a well played value offering.
> 
> OT: Have you tried disabling cores to make it behave as, what might be the 6 & 4 core variants ? Does disabling cores have any impact on ability to clock high or is 4.2 a wall ?


I haven't played with cores just yet, but it's on my to-do list. The wall is a finicky thing because of needed UEFI and HPET fixes; those will probably help us fine-tune voltage a bit better, which may lead some lucky folks to find another 100MHz or so. Once the fixes roll in, I'll try disabling some cores to see how she goes.


----------



## Kuivamaa

I doubt 4.4 is doable. I expect the process to eventually allow an easy 4.2 within reasonable voltage, but the Ryzen design appears to be limited as far as frequencies go.


----------



## SuperZan

We'll see. 4.2GHz is where the octa chips tap out, so even with 14nm LPP limitations I still think 4.4 is a plausible max ambient OC for quads.


----------



## prznar1

The good old stepping hunt might come back, aka Intel CPU generations.

Anyone remember the OC difference between the i7 920 C0 and D0 steppings?


----------



## Heimdallr

Does anyone know of a website or YouTube channel that has tested Ryzen performance in WoW: Legion?


----------



## Kuivamaa

Quote:


> Originally Posted by *Heimdallr*
> 
> Does anyone know of a website or YouTube channel that has tested Ryzen performance in WoW: Legion?


Proper WoW testing is too cumbersome, although it would have been very worthwhile. Basically you need to multibox: use a lead account, set the characters run by the CPUs under test to "follow", then take a spin in, say, Alterac Valley. Only then will you get relevant data in big player-count conditions.


----------



## ryan92084

Quote:


> Originally Posted by *BobiBolivia*
> 
> I proposed it in another thread:
> 
> How about OCN community making its own "Reviews Corner" ?
> It would be from actual end-users, who could put HW through ridiculous tests, and it would be community-powered, which means many people able to verify results on their own.
> 
> I can imagine that it will grow to some respectable source of information - we are OCN.net, right ?
> 
> 
> 
> 
> 
> 
> 
> 
> And as for actual benefit - being able to verify results against other review sites, and maybe (just maybe) a much less bias in reviews ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _Edit: Sometimes I am inconsistent with my own thoughts..._


I'll add an OCN user review section to the OP if/when we get some.
Quote:


> Originally Posted by *SoloCamo*
> 
> Hardware Canucks will also be doing an update.
> 
> They got 4.1ghz and 3200mhz stable.
> 
> I really wish we could get an official rebench when everyone learns their way around the new platform and after firmware/bios/windows updates have been made.


If we get enough rebenches with significantly different results, I'll move the day 1 reviews into a separate post. I think a lot of the reviewers will just wait until the 1600X to bother, though.


----------



## Shatun-Bear

Quote:


> Originally Posted by *JackCY*
> 
> They "have to" use GF which uses Samsung node. They probably wanted low power which is more beneficial than having a node that allows higher frequencies on a design that is made for them. Look how long it took Intel to get to 5GHz and they still do not sell a CPU that has those clocks stock, it's about a max for OC on their newest chips and on some old simpler chips, they own the fab and have way more money to burn on optimizations.
> 
> IMHO *Ryzen runs best to about 3.3GHz, 3.5GHz max, beyond that it starts to consume more and more power as it needs larger bumps in volts to raise clocks. At these efficient clocks it's a beast.* Sure at max 4.0GHz it's eating a lot of power just like Intel 8 cores.


You are right. It's a beast at 3.3GHz, according to The Stilt's tests:
Quote:


> The ideal frequency range for the process or the design (as a whole) appears to be 2.1 - 3.3GHz (25mV per 100MHz). Above this region (>= 3.3GHz) the voltage scaling gradually deteriorates to 40 - 100mV+ per 100MHz.


Quote:


> *850 points in Cinebench 15 at 30W* is quite telling. Or not telling, but absolutely massive. Zeppelin can reach absolutely monstrous and unseen levels of efficiency, as long as it operates within its ideal frequency range.


https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/

Zeppelin is the code name for Zen's die design.
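The Stilt's voltage-scaling numbers quoted above can be sketched as a piecewise estimate. Only the slopes (25mV per 100MHz below 3.3GHz, 40-100mV per 100MHz above) come from the quote; the 0.85V @ 2.1GHz baseline and the choice of the low-end 40mV slope above the knee are hypothetical placeholders, not measured values.

```python
# Piecewise Vcore-vs-frequency sketch of The Stilt's reported scaling.
# Slopes are from the quote; the baseline point is a hypothetical assumption.
def est_vcore(freq_ghz, base_v=0.85, base_ghz=2.1,
              knee_ghz=3.3, slope_lo=0.025, slope_hi=0.040):
    # slopes are in V per 100 MHz; multiply by 10 to get V per GHz
    if freq_ghz <= knee_ghz:
        return base_v + (freq_ghz - base_ghz) * slope_lo * 10
    v_knee = base_v + (knee_ghz - base_ghz) * slope_lo * 10
    return v_knee + (freq_ghz - knee_ghz) * slope_hi * 10

for f in (3.0, 3.3, 3.7, 4.0):
    print(f"{f:.1f} GHz -> ~{est_vcore(f):.2f} V (estimate)")
```

The kink at 3.3GHz is the whole story: the last 700MHz costs roughly as much voltage as the previous 1.2GHz, which is why reviewers land around 1.4V for a 4.0GHz overclock.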


----------



## Heimdallr

Quote:


> Originally Posted by *Kuivamaa*
> 
> Proper WoW testing is too cumbersome, although it would have been very worthwhile. Basically you need to multibox. Use a lead account and put the characters run by the CPUs you test to "follow" then take a spin in ,say, Alterac Valley. Only then you will get relevant data in big player count conditions.


It makes sense; it's a pity that MMOs almost never get benchmarked in reviews.

Unfortunately, I'm playing WoW a lot these days.


----------



## pengs

Adored's take on the reviews


----------



## mAs81

EDIT:
Oops, wrong thread. Disregard.

Though I really do believe that Ryzen has a lot of room for improvement and will improve. I just don't know if anyone who has already bought an AM4 board will see it; most probably they'll need a newer board revision.


----------



## Charcharo

Adored forgot to talk about the RAM/BIOS/kernel/CCX/scheduling/L3 issues that are solvable and may help its performance. Still, he DOES have a point.

I remember seeing the 8370 above the 2500K and thinking, "Wait a second... computerbase.de... hmmm, is this true?" but didn't make much of a fuss at the time. It does make sense, though.


----------



## Hueristic

Quote:


> Originally Posted by *DaaQ*
> 
> http://www.overclockers.com/amd-ryzen-7-1800x-cpu-review/
> Here's another good review from "some forum guy" that is much better than these "professional" journalists.


What a great resource, I had no clue about the X300 chipset.

This will give the ITX crowd something to talk about! Their very own chip!
Quote:


> A Note on X300
> 
> The X300 is a one-of-a-kind in the PC industry, as *it has been purpose-built by AMD to cultivate ITX* solutions for the AMD Ryzen family of processors. X300 accomplishes this by enabling the SOC characteristics of the AMD Ryzen processor, which provides sufficient integrated I/O to fully enable the backplane and connectivity options of a premium ITX motherboard. Correspondingly, the X300 has no I/O functionality of its own, and exists instead to enable the remaining functionality of a chipset: secure boot, trusted platform module (TPM), and other security-related functionality. These hardware security capabilities fit into a chip the size of the fingernail on a human pinky finger.
> 
> The diminutive size and non-I/O capabilities of the X300 chipset permit motherboard manufacturers to connect this unique chipset to the AMD Ryzen processor with a dedicated SPI link, rather than the PCIe x4 link required for I/O chipsets. This dedicated link opens an additional four PCI Express Gen 3 lanes within the AMD Ryzen processor for general use, including additional NVMe devices, GigE, WLAN, or other quality-of-life companion cards and controllers common to the ITX form factor.
> 
> Due to the simplicity of the X300 chipset, it has an incredibly low power draw of only 1 W. This further assists motherboard manufacturers in designing ITX motherboards for the AMD Ryzen family of processors.
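The lane-accounting point in the X300 note above can be shown with a deliberately tiny sketch: a conventional chipset consumes the CPU's general-purpose PCIe x4 uplink, while X300's SPI link leaves those lanes free. The function name and the "4 general-purpose lanes" figure are taken from the quote; the rest is an illustrative assumption.

```python
# Toy lane accounting for the X300 note (illustrative, not a spec).
# Assumption: Ryzen exposes 4 general-purpose PCIe Gen3 lanes that are
# normally consumed by an I/O chipset's PCIe x4 uplink; X300 talks over
# SPI instead, so those lanes stay free for NVMe, GigE, WLAN, etc.
def free_gp_lanes(chipset_uses_pcie_uplink, gp_lanes=4, uplink_width=4):
    used = uplink_width if chipset_uses_pcie_uplink else 0
    return gp_lanes - used

print("X370 board:", free_gp_lanes(True), "free general-purpose lanes")
print("X300 board:", free_gp_lanes(False), "free general-purpose lanes")
```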


----------



## Undervolter

Quote:


> Originally Posted by *Charcharo*
> 
> Adored forgot to talk about the RAM/BIOS/Kernel/CCX/Scheduling/L3 issues that are solvable but may help its performance. Still, he DOES have a point.
> 
> I remember seeing the 8370 above the 2500K and was like " Wait a second... computerbase.de ... hmmm is this true?" but didnt make much of a fuss at the time. But it does make sense.


This is because FX was bashed by gamers for years as "trash". The reality is that it was a workstation CPU, designed for running many heavy threads at much lower IPC, which games didn't have. I remember when I first joined the forum, people saying "games will never run more than 4 threads, because of the nature of the programming; it's impossible to code for many threads." The result was that a game would saturate an i3 100% and an FX 50%, so the i3 was in front. Years later, games can actually spawn 8 threads and almost saturate an FX. Every time a game saturates an FX, it runs better than a 2500, for the simple reason that the total performance of the FX is higher (overclocking aside, where I don't know what happens). The same will come true with the 7700 vs Ryzen: as soon as a game saturates the 7700, Ryzen will pass ahead, because it has untapped potential.

Visual representation of the phenomenon:

http://www.overclock.net/t/1623305/what-board-to-buy/20#post_25851760

If I were a gamer buying today, I would certainly not buy a 7700 over Ryzen. Ryzen has massive untapped potential. Future CPUs may have higher IPC, and games may not saturate quads yet, but when they do, Ryzen will be there, getting ahead of them. And contrary to FX vs Sandy, Ryzen isn't massively behind the 7700 in IPC, and contrary to 2012, games with 8 threads already EXIST. When such games completely saturate the 7700, it will have a CPU bottleneck; Ryzen won't.

CPU saturation (100% load) = the CPU runs at full potential. For years this was the case for i3s and i5s while FX underperformed because games couldn't saturate it. Now things are starting to change: FX is getting saturated too, and it turns the tables.
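The saturation argument above can be illustrated with a deliberately crude toy model: throughput is capped by whichever is smaller, the thread count the game spawns or the thread count the CPU offers, times per-thread performance. All numbers are made up for illustration, and the model ignores that SMT threads aren't full cores.

```python
# Toy saturation model for the "narrow fast CPU vs wide slow CPU" argument.
# All figures are hypothetical illustrations, not benchmark results.
def game_throughput(game_threads, cpu_threads, per_thread_perf):
    # A game can't use more threads than it spawns; a CPU can't run more
    # simultaneously than it has. Total throughput is the lesser of the two
    # scaled by how fast each thread runs.
    usable = min(game_threads, cpu_threads)
    return usable * per_thread_perf

quad = dict(cpu_threads=8,  per_thread_perf=1.10)  # e.g. 4c/8t, higher IPC*clock
octa = dict(cpu_threads=16, per_thread_perf=1.00)  # e.g. 8c/16t, slightly slower threads

for threads in (4, 8, 12, 16):
    q = game_throughput(threads, **quad)
    o = game_throughput(threads, **octa)
    print(f"{threads:2d}-thread game: quad={q:.1f}  octa={o:.1f}")
```

With these made-up numbers, the narrow chip wins at 4-8 game threads on per-thread speed, and the wide chip pulls ahead once the game spawns more threads than the narrow chip can run, which is exactly the FX-vs-i3 and Ryzen-vs-7700K pattern being described.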


----------



## Hueristic

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Not to beat the gaming horse too much but Techspot have a gaming test article up where theyve disabled SMT for all tests and used 3000MHZ RAM on a Gigabyte X370 gaming 5.
> 
> http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/
> 
> 
> 
> Disabling SMT doesnt seem make to much difference in most their tests. Another review for the OP anyway.


Did they disable HPET? Or use Win7? Win10 is bugged, and the scheduler conflicts with HPET.

If instructions don't get scheduled correctly, the number of clock cycles lost can be massive.


----------



## AlphaC

https://www.kitguru.net/components/cpu/luke-hill/amd-ryzen-7-1700x-cpu-review/2/

DDR4-3200MHz 14-14-14-34 @ 1.35V.
"Our Ryzen 7 1700X chip at 4.0GHz with a load voltage between 1.395-1.417V"

http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/
^ DX 11 limits the threads to about 6

http://www.hardwarezone.com.sg/review-amd-ryzen-7-1800x-vs-intel-core-i7-7700k-next-gen-flagship-cpu-matchup/gaming-benchmarks-overclocking
"We managed to hit 4.05GHz at 1.4V with our Noctua cooler, which is a bit disappointing given that the XFR boost actually goes up to 4.1GHz."


----------



## Charcharo

This annoys me a lot. I myself am primarily a gamer, but it is as if most gamers have no long-term thinking and simply cannot understand CPUs (not that I am that much better).

The most important markets are pre-built PCs, the gaming DIY market, and servers/productivity/science. The last one has the best margins and is possibly the most important, but it is gamers who set the mindshare and "fame or infamy" of a product, which also affects pre-built PCs. That adds up over time. It does matter. And all the "hurr durr Ryzen 7 is bad for games lel, let's badmouth all of Ryzen" is probably going to be damaging.

Ryzen is above Broadwell-E in productivity and science.
It has the same or better IPC.
It uses a bit less power.
AMD achieved this on a MUCH worse process (Intel's 14nm is the BEST in the world).
The Ryzen core is much smaller and overall cheaper.

And people call that... bad? A good deal of its gaming deficiencies will be fixed in a few months, perhaps 3-4. Even now it is quite OK in games. And this is a fail?

*I want to fail this hard in real life; it seems like a pleasant, good thing.


----------



## prznar1

Quote:


> Originally Posted by *Undervolter*
> 
> This is because, FX has been bashed by gamers for years, for being "trash". The reality is, that it was a workstation CPU, designed for running many, heavy threads at much lower IPC. Which the games didn't have. I remember when i first joined the forum, people saying "games will never run more than 4 threads, because of the nature of the programming. It's impossible to code for many threads". The result was that a game, would saturate 100% an i3 and 50% an FX. The i3 was infront. Years later, games can actually spawn 8 threads and almost saturate an FX. Every time that a game saturates an FX, it will run better than a 2500. For the simple reason that the total performance of the FX is higher (aside overclocking, where i don't know what happens). The same will come true with 7700 vs Ryzen. As soon as 7700 will be saturated by a game in the future, Ryzen will pass ahead, because it has untapped potential.
> 
> Visual representation of the phenomenon:
> 
> http://www.overclock.net/t/1623305/what-board-to-buy/20#post_25851760
> 
> Me, if i were a gamer and wanted to buy today, i would certainly not buy 7700 over Ryzen. Ryzen has massive untapped potential. Future CPUs may have higher IPC and games may not saturate quads yet. But when they do, Ryzen will be there, getting ahead of them. And contrary to FX vs Sandy, Ryzen vs 7700 isn't massively behind in IPC and contrary to 2012, games with 8 threads already EXIST. When such games saturate completely the 7700, it will have CPU bottleneck. Ryzen won't.


By the time games can use an 8c/16t CPU's potential, the regular guy will have changed his CPU twice.

Personally, I would still get the R7 1700, because synthetic benches have proven that Ryzen is a very capable CPU that just needs patches in games and Windows. Plus it's 8 cores for a lower price than 4 cores from Intel, with WAY lower power usage (2x the cores powered by 2/3 of the power required by a 7700K, a crazy win for servers and workstations) and AMD's CPU-socket forward compatibility.


----------



## Liranan

Quote:


> Originally Posted by *SuperZan*
> 
> We'll see. 4.2GHz is where the octa chips tap out, so even with 14nm LPP limitations I still think 4.4 is a plausible max ambient OC for quads.


According to The Stilt, there will be little if any difference between chips of different core counts; they will all hit the same clock ceiling. Hopefully Zen+/Zen 2 will allow for higher clocks and thus easily match the 7700K in gaming performance; after all, the 7700K runs at a higher clock frequency than Zen.


----------



## Undervolter

Quote:


> Originally Posted by *prznar1*
> 
> By the time games can use the potential of an 8c/16t CPU, the regular guy will have changed his CPU twice.
> 
> Personally, I would still get the R7 1700, because synthetic benchmarks have proven that Ryzen is a very capable CPU that just needs patches in games and Windows. On top of that, it's 8 cores for a lower price than Intel's 4, with WAY lower power usage (twice the cores powered by 2/3 of the power required by a 7700K, which is a crazy win for servers and workstations) and AMD's forward socket compatibility.


There is a misunderstanding. Earlier in this thread I wrote that I don't think the 7700K will hamper gaming in the next few years. However, there is a difference between "getting ahead" and "using the potential". Ryzen doesn't need a game to use 16 threads to go ahead. It needs a game that can saturate a 7700K. A 7700K will be saturated way before Ryzen is, and as soon as that happens, Ryzen will be ahead. You don't need a game to saturate Ryzen for that; you just need it to saturate the 7700. For sure, 10 threads at 100% load on Ryzen will mean it is ahead of the 7700K. I am not sure about 8 threads, because I haven't looked at core-vs-core comparisons and the 7700 has a small IPC advantage. I posted this before:



This game exists NOW. The 7700 is 5 fps ahead because it's not YET fully saturated. *It's the IPC advantage it still has that allows it to be ahead.* If a new game pushes the 7700 to 100% load, the 5 fps advantage will vanish, and the heavier the game, the bigger Ryzen's advantage from there on, provided that the game spawns at least 9 threads. There is something like 5% untapped power in the above scenario for the 7700 and 40% for Ryzen. Which one is more futureproof? That's all I am saying.

EDIT: To put it more simply:

It's a matter of the total throughput required by a game. Let's say a game requires 800 points of total performance. The game doesn't care whether those 800 points come from 4 cores or from 8, as long as it's programmed to use enough threads; it just wants 800 points. As long as the 7700 can deliver those 800 points at 100% load, it's fine. As soon as a game requires 820 points, exceeding the total throughput of the 7700K, and the game is coded to use additional threads, Ryzen will pass ahead.
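To put the analogy in toy numbers (every figure below is invented for illustration, not a benchmark result):

```python
# Toy model of total CPU throughput vs. a game's demand.
# "Points" are arbitrary units: per-thread score x usable threads.
def total_throughput(per_thread_score, threads):
    return per_thread_score * threads

# Hypothetical figures: the quad has the higher per-thread score (IPC x clock),
# the octa has twice the threads at a slightly lower per-thread score.
quad = total_throughput(per_thread_score=105, threads=8)    # 4c/8t  -> 840
octa = total_throughput(per_thread_score=95, threads=16)    # 8c/16t -> 1520

def is_bottlenecked(cpu_points, game_demand):
    # The game just wants its total; it doesn't care how many cores supply it.
    return game_demand > cpu_points

for demand in (800, 900, 1400):
    # 800: neither is bottlenecked; 900: the quad is; 1400: the octa still isn't.
    print(demand, is_bottlenecked(quad, demand), is_bottlenecked(octa, demand))
```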


----------



## Liranan

Quote:


> Originally Posted by *Undervolter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *prznar1*
> 
> When games will be capable of using 8c16t cpu potential, regular guy will change his cpu - twice.
> 
> Personally, I would still get r7 1700 because synt benches have proven that ryzen is well capable cpu, just needs patches in games and windows + its 8c for lowe price than 4c from intel, WAY LOWER power usage (2x cores powered by 2/3rd of power required by 7700k - its a crazy win for servers and workstations) and AMDs cpu socket forward compatibility.
> 
> 
> 
> There is a misunderstanding. Earlier in this thread I wrote that I don't think the 7700K will hamper gaming in the next few years. However, there is a difference between "getting ahead" and "using the potential". Ryzen doesn't need a game to use 16 threads to go ahead. It needs a game that can saturate a 7700K. A 7700K will be saturated way before Ryzen is, and as soon as that happens, Ryzen will be ahead. You don't need a game to saturate Ryzen for that; you just need it to saturate the 7700. For sure, 10 threads at 100% load on Ryzen will mean it is ahead of the 7700K. I am not sure about 8 threads, because I haven't looked at core-vs-core comparisons and the 7700 has a small IPC advantage. I posted this before:
> 
> 
> 
> This game exists NOW. The 7700 is 5 fps ahead because it's not YET fully saturated. If a new game pushes the 7700 to 100%, the 5 fps advantage will vanish, and the heavier the game, the bigger Ryzen's advantage from there on. There is something like 5% untapped power in the above scenario for the 7700 and 40% for Ryzen. Which one is more futureproof? That's all I am saying.
> 
> EDIT: To put it more simply:
> 
> It's a matter of the total throughput required by a game. Let's say a game requires 800 points of total performance. The game doesn't care whether those 800 points come from 4 cores or from 8, as long as it's programmed to use enough threads; it just wants 800 points. As long as the 7700 can deliver those 800 points at 100% load, it's fine. As soon as a game requires 820 points, exceeding the total throughput of the 7700K, and the game is coded to use additional threads, Ryzen will pass ahead.
Click to expand...

This is another reason why I will get Zen(+) over an Intel offering.


----------



## Undervolter

Quote:


> Originally Posted by *Liranan*
> 
> This is another reason why I will get Zen(+) over an Intel offering.


Even if I were a gamer, I would still get Ryzen, simply because the gap in current games is very limited in favour of the i7, while the potential gap in future games is massive in favour of Ryzen, with the added bonus that it murders the i7 in productivity. Since the IPC gap isn't big, since the 1700 costs about the same and overclocks the same as the 1800X, since some new games already manage to spawn 8 threads on FX and the above game shows 16 threads being spawned on Ryzen, I would certainly go Ryzen if I wanted to keep the CPU longer. If you don't mind changing CPU again in 2 years to get a newer Intel with much higher IPC, by all means get the 7700 to gain 5 fps in current games. As if you are going to notice...

EDIT: The thing is this. For years, games used a "fixed" number of threads. It didn't matter whether you had 2, 8, or 10 cores; the game would spawn a fixed number of threads. Now it seems that game engines are becoming "thread-aware", which is how FX finally sees 8 threads and the Ryzen screenshot above shows 16. And using more cores will be obligatory, UNLESS Intel makes some MAJOR IPC breakthrough, because game developers always want more power. If they can't get it from major IPC improvements, there is only one other way to find it: more cores. Some years ago, i3s performed just the same as i5s and i7s, because games couldn't saturate them. Once they were saturated, the i5s and i7s started moving ahead. In cases where i5s were saturated, i7s or even 6- and 8-cores moved ahead. That's the trend, unless Intel finds a way to give big IPC boosts to game developers. But Intel seems to be going for "moar cores" too, faster than "moar IPC". So... if you are a game dev and want more power, where will you look for it? You get either 5% more IPC or 2 more cores to use. Which gives more power more easily?
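In code terms, the shift from "fixed" to "thread-aware" is just the difference between a hard-coded worker pool and one sized from the machine the program finds itself on. A minimal sketch, not taken from any real engine:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def fixed_pool():
    # Old-style engine: always four worker threads, whatever the CPU.
    return ThreadPoolExecutor(max_workers=4)

def thread_aware_pool():
    # Thread-aware engine: size the pool from the hardware it detects,
    # keeping one thread free for the main/render loop.
    workers = max(1, (os.cpu_count() or 4) - 1)
    return ThreadPoolExecutor(max_workers=workers)

with thread_aware_pool() as pool:
    # e.g. fan out per-subsystem jobs (AI, physics, audio) across cores.
    results = list(pool.map(lambda n: n * n, range(8)))
print(results)
```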


----------



## Xyxox

I think I'll go with an Intel 7700K for my next gaming rig build and with a Ryzen 1800X for my next workstation build, since I use a workstation for compute-intensive applications and the Ryzen 1800X will be a cost-effective way to throw cores at compute. It would be nice if there were a way to put two 1800X processors in a rig.


----------



## prznar1

Naples?


----------



## bossie2000

Naples = bigger socket.


----------



## CULLEN

Quote:


> Originally Posted by *Liranan*
> 
> This is another reason why I will get Zen(+) over an Intel offering.


I've been meaning to ask for some time, what is Zen+?


----------



## Undervolter

Quote:


> Originally Posted by *CULLEN*
> 
> I've been meaning to ask for some time, what is Zen+?


Zen+ has probably been renamed to Zen 2. Lisa Su said they plan to release Zen 2 and Zen 3. In early roadmaps, Zen+ (probably Zen 2 now) was shown for 2019, and the entire Zen architecture was said to last them up to 2021. So if they stick to that plan, Zen 2 and Zen 3 should be out between 2019 and 2021.


----------



## bossie2000

Quote:


> I've been meaning to ask for some time, what is Zen+?


Supposed to be Zen's successor.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Hueristic*
> 
> Did they disable HPET? Or use Win7? Win10 is bugged and the scheduler conflicts with HPET.
> 
> If instructions don't get scheduled correctly then the amount of clk cycles lost can be massive.


Honestly, I don't know about HPET; I haven't kept up as well as some. They do state at the start:
Quote:


> AMD Ryzen processors made a strong impression last week; however, a number of technical difficulties and time constraints resulted in more questions than answers when it came to the four games we managed to benchmark in time for launch. As promised, we're back to follow up on our initial 1080p testing with a more in-depth look at Ryzen's gaming performance across 16 titles played at 1080p and 1440p resolutions.
> 
> In addition to including more games, we're also adding results for the 1800X and 1700X with SMT disabled as Anandtech forum-goers have discovered a problem with the Windows 10 scheduler that can cause Ryzen to perform worse in lightly-threaded applications with SMT enabled. Apparently Windows 10 treats all Ryzen threads the same (not identifying SMT from physical cores) and thus the operating system thinks all threads have access to their own L2 and L3 cache when in fact they don't.


I'd guess they're thinking that the thread scheduling problem only comes from SMT threads; I don't know much more than that.
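The scheduler problem described above can be simulated in miniature. With 8 cores and 2 SMT threads each, logical CPUs (0,1), (2,3), ... are siblings sharing one physical core; a placement that ignores that packs workers onto siblings, while a topology-aware one spreads them across physical cores first. This is an illustrative toy model, not the actual Windows scheduler:

```python
# 8 physical cores, 2 SMT threads each: logical CPU n sits on core n // 2.
SMT = 2

def physical_core(logical_cpu):
    return logical_cpu // SMT

def naive_placement(n_workers):
    # Treats every logical CPU as independent: fills 0, 1, 2, 3, ...
    return list(range(n_workers))

def smt_aware_placement(n_workers, n_cores=8):
    # Fill one thread per physical core before doubling up on siblings.
    order = [core * SMT + smt for smt in range(SMT) for core in range(n_cores)]
    return order[:n_workers]

def cores_used(placement):
    return len({physical_core(cpu) for cpu in placement})

# Four busy game threads:
print(cores_used(naive_placement(4)))      # 2 physical cores, siblings contend
print(cores_used(smt_aware_placement(4)))  # 4 physical cores, no contention
```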

Techspot is not generally known for stellar methodology, but they're making some kind of effort here at least.

I've heard something about the Windows 7 thing but I haven't looked into it, TBH. If Ryzen really gets 20% more performance in games on Windows 7, that would put it ahead of Kaby Lake in single-thread performance. I struggle to believe that's the case, but I'd be happy to be proved wrong.

All in all, if they'd waited a couple of weeks and sorted all this out, it would have worked out better for them, particularly if they were leaving 20% gaming performance on the table that could have made the day-one reviews absolutely shine in every metric. The fact that they didn't wait suggests to me that they don't think there's that much left to gain.

Don't get me wrong: fantastic CPU, fantastic value, just not the best you can get for gaming right now, although it's definitely capable.


----------



## azanimefan

I got a chance to help overclock 2 Ryzen CPUs last night; one was a 1700X, the other a 1700. Combined with what I've seen in the YouTube overclocking videos and what various overclockers have written, I'm convinced Ryzen has a hard cap on its clock speeds of around 4.3GHz with non-LN2 methods. I also believe every single review chip I've seen someone get to 4.0 or 3.9 has another 100-300MHz left under the hood, untapped due to not understanding how to overclock with the current BIOS setups, not understanding what makes these chips tick, and perhaps BIOS/microcode issues.

I suspect 4.2 will be the "standard" overclock for these once the issues are ironed out. They're pretty optimized as it is, but when you hit 4.0 at 1.38V and peak temps are 60C, there SHOULD be a bit more under the hood, especially since these chips should max out around 1.45V and 95C.

And yes, both chips last night got to 4.0 without too much trouble. Getting higher proved impossible, and I suspect that will have to wait for BIOS updates, possibly different RAM, or someone figuring out how to properly overclock these chips with their BIOS features.


----------



## JedixJarf

Quote:


> Originally Posted by *azanimefan*
> 
> I got a chance to help overclock 2 Ryzen CPUs last night; one was a 1700X, the other a 1700. Combined with what I've seen in the YouTube overclocking videos and what various overclockers have written, I'm convinced Ryzen has a hard cap on its clock speeds of around 4.3GHz with non-LN2 methods. I also believe every single review chip I've seen someone get to 4.0 or 3.9 has another 100-300MHz left under the hood, untapped due to not understanding how to overclock with the current BIOS setups, not understanding what makes these chips tick, and perhaps BIOS/microcode issues.
> 
> I suspect 4.2 will be the "standard" overclock for these once the issues are ironed out. They're pretty optimized as it is, but when you hit 4.0 at 1.38V and peak temps are 60C, there SHOULD be a bit more under the hood, especially since these chips should max out around 1.45V and 95C.
> 
> And yes, both chips last night got to 4.0 without too much trouble. Getting higher proved impossible, and I suspect that will have to wait for BIOS updates, possibly different RAM, or someone figuring out how to properly overclock these chips with their BIOS features.


Throttling kicks in @ 75C; TjMax won't be 95C.

source : http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks


----------



## MadRabbit

Since when is "open a PDF" a benchmark anyway? A serious question, actually. I just looked at the 7700K review; there's nothing of the sort on the same *cough* Anand *cough* page.

So now we are coming up with BS tests to show how "bad" a CPU is? Oh wait, it's Cutress.


----------



## tygeezy

Quote:


> Originally Posted by *jprovido*
> 
> If Ryzen clocked a little higher, maybe around 4.4GHz with 3200MHz memory, I think it would be possible to play at a perfect 144fps AND stream without skipped frames. Zen+ can't get here soon enough.


Can't you put a capture card in your other PC and use your main 7700K rig for gaming? I don't know anything about streaming, but wouldn't that alleviate the skipped frames while also maintaining 100% performance on the 7700K?


----------



## SoloCamo

Quote:


> Originally Posted by *tygeezy*
> 
> Can't you put a capture card in your other PC and use your main 7700K rig for gaming? I don't know anything about streaming, but wouldn't that alleviate the skipped frames while also maintaining 100% performance on the 7700K?


Between the cost of a decent capture card and a whole other pc to run it I'd rather just have a single powerful pc.


----------



## Undervolter

Quote:


> Originally Posted by *Slink3Slyde*
> 
> I've heard something about the Windows 7 thing but I haven't looked into it, TBH. If Ryzen really gets 20% more performance in games on Windows 7, that would put it ahead of Kaby Lake in single-thread performance. I struggle to believe that's the case, but I'd be happy to be proved wrong.


About Win7, there have been observations favouring Win7 by "The Stilt":
Quote:


> - Ryzen is 17% faster in Win7 than in Win10 in draw calls.
> - I've used Win 7 about 95% of the time on Ryzen.
> Only the actual performance evaluation was done with Win 10, as it wasn't up to me.
> The performance differences are minor, but very consistent regardless (in favor of Win 7).
> 
> - I did some 3D testing, and even though there is not nearly enough data to confirm it, I'd say the SMT regression is in fact a Windows 10 related issue.
> In the 3D testing I did recently on Windows 10, the title which illustrated the biggest SMT regression was Total War: Warhammer.
> 
> All of these were recorded at 3.5GHz, 2133MHz MEMCLK with R9 Nano:
> 
> Windows 10 - 1080 Ultra DX11:
> 
> 8C/16T - 49.39fps (Min), 72.36fps (Avg)
> 8C/8T - 57.16fps (Min), 72.46fps (Avg)
> 
> Windows 7 - 1080 Ultra DX11:
> 
> 8C/16T - 62.33fps (Min), 78.18fps (Avg)
> 8C/8T - 62.00fps (Min), 73.22fps (Avg)
> 
> At the moment this is just pure speculation as there were variables, which could not be isolated.
> Windows 10 figures were recorded using PresentMon (OCAT), however with Windows 7 it was necessary to use Fraps.
> 
> https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/page-5#
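
Working The Stilt's quoted figures through gives a sense of the size of the effect (same data as above, just turned into percentages):

```python
# The Stilt's Total War: Warhammer figures (3.5GHz, 2133MHz MEMCLK, R9 Nano).
results = {
    ("Win10", "16T"): (49.39, 72.36),  # (min fps, avg fps)
    ("Win10", "8T"):  (57.16, 72.46),
    ("Win7",  "16T"): (62.33, 78.18),
    ("Win7",  "8T"):  (62.00, 73.22),
}

def pct_gain(new, old):
    return round((new / old - 1) * 100, 1)

# SMT-off gain on Win10 minimums: the "SMT regression".
smt_regression = pct_gain(results[("Win10", "8T")][0], results[("Win10", "16T")][0])
# Win7 vs Win10 with SMT on, minimums.
win7_gain_min = pct_gain(results[("Win7", "16T")][0], results[("Win10", "16T")][0])
print(smt_regression, win7_gain_min)  # -> 15.7 26.2
```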


----------



## GingerJohn

Quote:


> Originally Posted by *MadRabbit*
> 
> Since when is "open a PDF" a benchmark anyway? A serious question, actually.


Not going to comment on the use of this benchmark to compare CPUs, but productivity-wise it is actually a reasonable one.

I'm an engineer, and a lot of our drawings are sent out to clients or vendors in PDF format. Personally, I will often open and print a batch of 30-50 PDFs, or open and sign/protect them, or open a batch of 100-500 AutoCAD drawings (piping isometrics) and PDF them. This generally takes up a lot of CPU time and can render my computer more or less unusable for 5-20 minutes, depending on the number and size of the DWGs/PDFs.

So it is not as stupid a benchmark as it first seems, although I certainly agree that it is not useful for comparing gaming performance.


----------



## mouacyk

Quote:


> Originally Posted by *Charcharo*
> 
> This annoys me a lot. I myself am primarily a gamer, but it is as if most gamers have no long term thinking and simply can not understand CPUs (not that I am that much better).
> 
> The most important markets are the pre-build PCs, the Gaming DIY market and Servers/productivity/science. The last one has the best margins, and possibly is the most important, but it is gamers that set the mind share and " fame or infamy" of a product, which also affects pre-built PCs. That adds up over time. It does matter. And all the " hurr durr Ryzen 7 is bad for games lel, lets badmouth all of Ryzen" is probably going to be damaging.
> 
> Ryzen is above Broadwell-E in productivity and science.
> It has the same or better IPC
> It uses a bit less power.
> AMD achieved this on a MUCH worse process (Intel 14nm is the BEST in the world)
> Their Ryzen core is much smaller and is overall cheaper.
> 
> And people call... that bad? A good deal of its gaming deficiencies will be fixed in a few months, perhaps 3-4. But even so it is quite OK in games. And this is a fail?
> 
> *I want to fail this hard in real life, seems like a pleasant, good thing.


Bell-curve much? If anything, AMD has only achieved great pricing on 8-core and thread-heavy stuff. In everything else, it performs like a $300 i7.


----------



## Charcharo

Quote:


> Originally Posted by *mouacyk*
> 
> Bell-curve much? If anything, AMD has only achieved great pricing on 8-core and thread-heavy stuff. In everything else, it performs like a $300 i7.


So let us all ignore all the engineering behind something, the important perf/watt metric for servers, all the use cases, production... everything.

Let's focus only on buggy video game performance. Only that matters.

*Sigh* The life of an Industrial Engineer that awaits me is a cruel and thankless one. At least I will be rich...


----------



## sumitlian

Episode 2:
Joker's Response to Tech City Accusations


----------



## mouacyk

Quote:


> Originally Posted by *Charcharo*
> 
> So let us all ignore all the engineering behind something, the important perf/watt metric for servers, all the use cases, production... everything.
> 
> Let's focus only on buggy video game performance. Only that matters.
> 
> *Sigh* The life of an Industrial Engineer that awaits me is a cruel and thankless one. At least I will be rich...


I'm just waiting for answers to the RAM limitations, then I will be investing in one for a Gentoo compiler box for sure, because a $300 i7 will definitely be left in the dust there. My reasons are my own.


----------



## Charcharo

Quote:


> Originally Posted by *mouacyk*
> 
> I'm just waiting for answers to the RAM limitations, then I will be investing in one for a Gentoo compiler box for sure, because a $300 i7 will definitely be left in the dust there. My reasons are my own.


They are. But the engineering, cost, wattage, core size, and process size... are what impress me.

As I said, AMD equaled Intel in things more important than games (and they may well equal them there too in a month or two) with what is effectively engineering voodoo magic.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Undervolter*
> 
> About Win7, there have been observations favouring Win7 by "The Stilt":


Quote:


> In 3D testing I did recently on Windows 10, the title which illustrated the biggest SMT regression was Total War: Warhammer


Interesting. Total War: Warhammer looks to be a game where Ryzen will show some decent improvement with updates and optimization for Windows 10, at least on GCN-based cards in DX11. If Ryzen is 17% faster making draw calls in Windows 7, then in his tests that equates to more than 20% better minimums and a bit less than 10% better averages between 7 and 10, although he does say there are some variables involved.

Techspot showed a pretty decent benefit from disabling SMT in TW: Warhammer; for the other games they tested it varies from no difference in some cases to around a 20% better average in Deus Ex. It doesn't look to me like a universal improvement, but rather something that helps in some games and not in others. Or have I read this wrong? There's a bit of crystal ball involved here.









Would this Windows 10 issue also affect any of the productivity benchmarks?


----------



## budgetgamer120

Quote:


> Originally Posted by *Nickyvida*
> 
> Well, look at Pascal; it can clock really well on 16nm. I'm not hating, just saying it was a missed opportunity, because if it weren't node-limited it would probably have given the 7700K a real fight.


The 7700K is no competition for the Ryzen 8-core, at least for me.


----------



## rage fuury

*Ryzen - The Tech Press Loses The Plot (using low-resolution benchmarks)*
https://www.youtube.com/watch?v=ylvdSnEbL50


----------



## DaaQ

Two things really jump out at me in the Techspot review.
Quote:


> *ARMA 3 isn't a game I really like to test but in fear that you guys might burn this thing to the ground I have included it.* Despite poor utilization, the 1800X actually looks good in relation to the 6900K, though unsurprisingly the higher-clocked Kaby Lake and Skylake chips offer much better performance in this title.


Fear of what/whom?
Quote:


> As noted in our full Ryzen review, at this point we're waiting for software optimizations and AMD promises they're coming. *Of course, the company also said that about Bulldozer,* but Ryzen appears to be in a much better situation.


Relate it to Bulldozer. Nice writing there. The subtlety of it all.









There's more but I don't have time to pick it apart right now.


----------



## Slink3Slyde

Quote:


> Originally Posted by *rage fuury*
> 
> *Ryzen - The Tech Press Loses The Plot (using low-resolution benchmarks)*
> https://www.youtube.com/watch?v=ylvdSnEbL50


A little bit hyperbolic with the "losing the plot" stuff, and the fact that he mentions he 'had a tear in his eye' when he read the TPU review doesn't do much for his objectivity, but he does make an interesting point, as far as I watched. I agree that in 5 years' time the Ryzen 8-cores are probably going to look better against current-gen quad-core Intel chips, especially i5s.

I do wonder how the quad i7s are going to age. The consoles are using incredibly weak (by PC standards) 8-core CPUs. AFAIK they reserve two cores for system usage, leaving 6 open for use in games, hence why we are now finally seeing games use more cores/threads.

If the consoles are stuck on 6 weak cores for at least 3 more years, I'm guessing an i7 with 4 much more powerful cores and 4 extra logical ones will still be pretty good in most games at least up to that point, and probably beyond. I can't see the 7700K falling behind the Ryzen 8-cores apart from in some edge cases. Could be wrong; no one knows the future.









For myself, a bird in the hand is worth two in the bush, and I'll cross that bridge when I come to it.


----------



## Hueristic

Quote:


> Originally Posted by *tygeezy*
> 
> Can't you put a capture card in your other PC and use your main 7700K rig for gaming? I don't know anything about streaming, but wouldn't that alleviate the skipped frames while also maintaining 100% performance on the 7700K?


You can also build 3 systems to use 3 different apps and monitors, but why?

Quote:


> Originally Posted by *rage fuury*
> 
> *Ryzen - The Tech Press Loses The Plot (using low-resolution benchmarks)*
> https://www.youtube.com/watch?v=ylvdSnEbL50


Great video.


----------



## JackCY

Quote:


> Originally Posted by *sumitlian*
> 
> Episode 2:
> Joker's Response to Tech City Accusations


They are getting paid now? Can we expect a whole series?








I knew someone would hire some of these comedians, though no Linus in the series yet?









---

While I don't agree with everything AdoredTV says, as some of his claims are a bit wild at times, this last video about CPU testing was quite spot-on, honest, and funny.








Doesn't pull any punches and stands by his claims.

It's simple really: if someone doesn't like the R7 1700, then go buy half the CPU with higher clocks in the i7 7700K. But then don't come crying later on when the 7700K is maxed out and can't keep up.


----------



## Kuivamaa

Quote:


> Originally Posted by *sumitlian*
> 
> Episode 2:
> Joker's Response to Tech City Accusations


So apparently there is some personal issue between them, or?


----------



## looniam

Quote:


> Originally Posted by *sumitlian*
> 
> Episode 2:
> Joker's Response to Tech City Accusations


i don't know. i think if joker wants a meaningful response, then instead of showing the vega pro timeline and claiming that a file name is proof of when it was recorded, joker should just rerun the benchmark using aida via rtss to show the clock speed of the 7700K.

and btw:
am i the only one thinking TOO MUCH emphasis is given to frame rate/time? or cpu core/thread utilization?

pretty sure gpu utilization is more of an indicator of a cpu bottleneck. sure, most of the time frame rate/time also goes up, but not necessarily.
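as a rough sketch, the kind of log-reading heuristic i mean (the 95% threshold is an arbitrary illustration, and high gpu utilization alone is suggestive rather than proof):

```python
def classify_bottleneck(gpu_util_pct, fps, fps_target):
    """Rough first-pass read of one benchmark sample (illustrative only)."""
    if gpu_util_pct >= 95:
        # GPU is the visible limiter, though a slow CPU can still cap
        # what the GPU is being fed, so treat this as a hint, not proof.
        return "gpu-bound"
    if fps < fps_target:
        # GPU has headroom but frames fall short: look at the CPU.
        return "cpu-bound"
    return "hitting target"

print(classify_bottleneck(99, 60, 144))   # gpu-bound
print(classify_bottleneck(70, 90, 144))   # cpu-bound
print(classify_bottleneck(80, 150, 144))  # hitting target
```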


----------



## Undervolter

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Interesting. Total War: Warhammer looks to be a game where Ryzen will show some decent improvement with updates and optimization for Windows 10, at least on GCN-based cards in DX11. If Ryzen is 17% faster making draw calls in Windows 7, then in his tests that equates to more than 20% better minimums and a bit less than 10% better averages between 7 and 10, although he does say there are some variables involved.
> 
> Techspot showed a pretty decent benefit from disabling SMT in TW: Warhammer; for the other games they tested it varies from no difference in some cases to around a 20% better average in Deus Ex. It doesn't look to me like a universal improvement, but rather something that helps in some games and not in others. Or have I read this wrong? There's a bit of crystal ball involved here.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Would this Windows 10 issue also affect any of the productivity benchmarks?


I have no idea. There are theories around, but AFAIK nobody from AMD has explained what's happening. Also, aside from The Stilt, nobody has actually bothered to compare the situation with Win7. Some games seem to have the problem, some not, which leads one to think that part of the problem lies in how the game is coded. If we believe The Stilt, another part is in the Win10 scheduler, or whatever it is that controls how SMT works. But I have as much knowledge as you do on the subject.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Undervolter*
> 
> I have no idea. There are theories around, but AFAIK nobody from AMD has explained what's happening. Also, aside from The Stilt, nobody has actually bothered to compare the situation with Win7. Some games seem to have the problem, some not, which leads one to think that part of the problem lies in how the game is coded. If we believe The Stilt, another part is in the Win10 scheduler, or whatever it is that controls how SMT works. But I have as much knowledge as you do on the subject.


I just have to go back to my logic: if there was really a large amount of performance left on the table, I can't understand why they would release knowing that. And if they didn't know there was 17% or more going begging in games, someone dun goofed.

I won't come crying if/when my shiny new 7700K gets smashed in games by Ryzen in a couple of years; I'll just buy a new setup.


----------



## Kuivamaa

Quote:


> Originally Posted by *looniam*
> 
> am i the only one thinking TOO MUCH emphasis is given to frame rate/time? or cpu core/thread utilization?
> 
> pretty sure gpu utilization is more of an indicator fo a cpu bottleneck. sure most of the time frame rate/time also goes up but not necessarily.


The FX chips will, most of the time, push even the fastest video cards to 99% usage in modern titles, especially at 1440p ultra, but still deliver clearly fewer fps than, say, a 7700K. So GPU utilization is not the be-all and end-all factor.


----------



## budgetgamer120

Quote:


> Originally Posted by *looniam*
> 
> i don't know. i think if joker wants a meaningful response, then instead of showing the vega pro timeline and claiming that a file name is proof of when it was recorded, joker should just rerun the benchmark using aida via rtss to show the clock speed of the 7700K.
> 
> and btw:
> am i the only one thinking TOO MUCH emphasis is given to frame rate/time? or cpu core/thread utilization?
> 
> pretty sure gpu utilization is more of an indicator of a cpu bottleneck. sure, most of the time frame rate/time also goes up, but not necessarily.


That's not true. My FX pushed my GPU to 99%, but I still got higher FPS when I switched to Intel.


----------



## tygeezy

Quote:


> Originally Posted by *Hueristic*
> 
> You can also build 3 systems to use 3 different apps and monitors, but why?
> Great video.


I know there are some serious streamers out there who have computers dedicated to streaming and video capture. He has a spare computer lying around, so if he ever became a serious streamer he could have his cake and eat it too, so to speak, even if it isn't the most efficient setup in the world.


----------



## DADDYDC650

Great video linked below. For all the people who claim that even an i5 is faster than Ryzen... what a joke. In a few years, MS and Sony consoles will probably have Ryzen chips in them. GL with an i5.

https://www.youtube.com/watch?v=ylvdSnEbL50&t=2s


----------



## budgetgamer120

AMD stock is going down... Hopefully it goes low enough that I can buy some.


----------



## 7850K

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Zeppelin is the code name for Zen's die design


I wonder if *Ze*ppeli*n* gave them the idea for the buzzword *Zen*.


----------



## mouacyk

I wonder why AMD didn't beta-test their L3 cache design with production software in the real world. It almost sounded like: "Meh, we're over R&D budget and L3 Cache is still faster than 2133MHz DDR4. Let's launch it!"


----------



## budgetgamer120

Quote:


> Originally Posted by *mouacyk*
> 
> I wonder why AMD didn't beta-test their L3 cache design with production software in the real world. It almost sounded like: "Meh, we're over R&D budget and L3 Cache is still faster than 2133MHz DDR4. Let's launch it!"


What makes you think they did not test it?


----------



## ihatelolcats

Quote:


> Originally Posted by *mouacyk*
> 
> I wonder why AMD didn't beta-test their L3 cache design with production software in the real world. It almost sounded like: "Meh, we're over R&D budget and L3 Cache is still faster than 2133MHz DDR4. Let's launch it!"


yeah i'm sure that's exactly what happened


----------



## flippin_waffles

Quote:


> Originally Posted by *mouacyk*
> 
> I wonder why AMD didn't beta-test their L3 cache design with production software in the real world. It almost sounded like: "Meh, we're over R&D budget and L3 Cache is still faster than 2133MHz DDR4. Let's launch it!"


I wonder why intel didn't beta-test their L2 cache design. Zen has much higher bandwidth.


----------



## AmericanLoco

Quote:


> Originally Posted by *mouacyk*
> 
> I wonder why AMD didn't beta-test their L3 cache design with production software in the real world. It almost sounded like: "Meh, we're over R&D budget and L3 Cache is still faster than 2133MHz DDR4. Let's launch it!"


Because they had a small budget and a narrow time frame. This design allows them to basically stack as many CCXs together as they want with "minimal" work. It'd be trivial to make a 16, 24, or even 32 core processor with the Zen design.

The Intel 8/10/12/16 core design requires a bespoke design for each variation, since the L3 cache is all connected.
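The modular scaling argument above can be sketched with some quick arithmetic. This is just an illustration of the replication idea, not AMD's actual design data; the counts come straight from the post:

```python
# Zen's building block is a fixed 4-core CCX; bigger parts just replicate it,
# so no new core-complex design is needed per SKU (per the post above).
CORES_PER_CCX = 4

def cores(ccx_count: int) -> int:
    """Total cores from stacking identical CCX blocks."""
    return ccx_count * CORES_PER_CCX

for n in (2, 4, 6, 8):
    print(f"{n} CCXs -> {cores(n)} cores")  # 8, 16, 24, 32 cores
```

The 16/24/32-core parts mentioned above fall out of 4, 6, and 8 CCXs with no per-variant redesign, which is the claimed contrast with Intel's monolithic ring/mesh L3.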


----------



## comagnum

Quote:


> Originally Posted by *budgetgamer120*
> 
> Thats not true. My FX pushed my GPu 99% but still got higher FPS when I switched to intel.


I had the same experience. My 9370 pushed my dual 480s pretty decently, but the switch to Intel definitely pushed the frames higher.


----------



## Undervolter

Quote:


> Originally Posted by *Slink3Slyde*
> 
> I just have to go back to my logic that if there was really a large amount of performance left on the table I cant understand why they would release knowing that. If they didnt know there was 17% or more going begging in games, someone dun goofed.
> 
> I wont come crying if/when my shiny new 7700K gets smashed in games by Ryzen in a couple of years, I'll just buy a new set up.


For me, this was a paper launch and a marketing debacle, because the platform as a whole clearly wasn't ready. Your question is right. It's also fair to ask why they didn't know beforehand that Win10 wasn't ready, that ASUS's and MSI's BIOSes carried something like a 20% performance penalty (as I read), that motherboards wouldn't be available (here there is no X370 board to be found from any brand, and only the ASUS B350 is available), that current memory kits were having trouble, etc. They had already postponed the launch and didn't want to do it again, and this is what happens when you force things. Ryzen is a very good chip, but the entire platform is now treating end users as beta testers. That's the truth.


----------



## Kuivamaa

Quote:


> Originally Posted by *Undervolter*
> 
> For me, this was a paper launch and a marketing debacle, because the platform as a whole wasn't clearly ready. Your question is right. It's also right to ask, why didn't they know beforehand, that Win10 wasn't ready, that ASUS and MSI's Bioses were like 20% (as i read) penalized in performance, that motherboards wouldn't be available (here there is no x370 to find from any brand and only ASUS B350 available), that current memory kits were having trouble etc. They had already postponed the launch and didn't want to do it again and this is what happens when you do things forcefully. Ryzen is a very good chip, but the entire platform is now treating the end users as beta testers. That's the truth.


It was definitely not a paper launch. The CPUs were widely and readily available straight away. As for BIOS bugs etc., these are common to all new platforms. The one part where I agree with you is motherboard availability. How could I disagree when I'm still waiting on mine?


----------



## sugarhell

Quote:


> Originally Posted by *looniam*
> 
> i don't know. i think if joker does want to have a meaningful response then, instead of showing the vega pro timeline and claiming that a file name is proof of when it was recorded, joker should just rerun the benchmark using aida via rtss to show the clock speed of the 7700K.
> 
> and btw:
> am i the only one thinking TOO MUCH emphasis is given to frame rate/time? or cpu core/thread utilization?
> 
> pretty sure gpu utilization is more of an indicator fo a cpu bottleneck. sure most of the time frame rate/time also goes up but not necessarily.


In general, utilization metrics are a bit meh. Shader utilization, geometry utilization, front-end utilization?

A GPU can never actually be at 100% usage; that's why all the utilization APIs use algorithms that are way off most of the time.

I only care about minimum FPS and input lag.
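To make "minimum FPS" concrete: reviewers usually report it as a "1% low", the average over the worst 1% of frame times rather than the single slowest frame. A minimal sketch with made-up frame times (all numbers hypothetical):

```python
# Hypothetical frame-time trace in milliseconds: average FPS looks great
# while the 1% low exposes the stutter sugarhell cares about.
frametimes_ms = [7.0] * 990 + [40.0] * 10  # mostly ~143 FPS, 10 hitchy frames

# Average FPS over the whole run.
avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)

# "1% low": average FPS computed over only the worst 1% of frames.
worst = sorted(frametimes_ms, reverse=True)
worst_1pct = worst[: max(1, len(worst) // 100)]
low_1pct_fps = 1000 * len(worst_1pct) / sum(worst_1pct)

print(f"average FPS: {avg_fps:.0f}")   # ~136
print(f"1% low FPS:  {low_1pct_fps:.0f}")  # 25
```

Two runs with identical averages can have very different 1% lows, which is why frame-time percentiles say more about perceived smoothness than a utilization counter does.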


----------



## Undervolter

Quote:


> Originally Posted by *Kuivamaa*
> 
> It was definitely not a paper launch. The CPUs were widely and readily available straight away. As for BIOS ,bugs etc , these are common for all new platforms. The one part I agree with you is motherboard availability. How could I disagree when I am still waiting on mine


In Finland, maybe. Here, Amazon, which was listed as the official launch shop on AMD's website, didn't have chips to ship for preorders. The same thing happened on the French Amazon. And even the lucky ones who got one had to wait for a motherboard, which is much harder to find.

EDIT: This is the situation right now here, from the best price engine available. Results for any X370 motherboard: one shop has 4 motherboard models, another has 1, and that's it. No ASRocks or Biostars at all.



This is from the local Amazon. The ASUS CH6 isn't available directly from Amazon, and there is only one vendor, from Germany, that sells it, at the nice price of 630 EUR:



The MSI Titanium is still not available:



The ASRock X370 Taichi doesn't even exist as a name on Amazon.

You can't imagine the joy of some local users who managed to get a CPU and don't have a motherboard to put it in.


----------



## dmasteR

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I haven't gotten BF1 yet because its still $60! Need a sale on this game already!


There's been multiple sales for BF1.

In fact BF1 is on sale right now for $30

https://www.origin.com/usa/en-us/store/battlefield/battlefield-1/standard-edition


----------



## AlphaC

Quote:


> *PCIE*
> Ryzen has a bit of an advantage in maximum sequential read performance. Intel has a bit of advantage when writing sequential data. Intel has a huge advantage in small-file performance. Comparing performance at a 4KB transfer size illustrates this fact.
> 
> Intel has a massive advantage in every measured category except 4K QD1 read. That said, AMD delivers very respectable performance. As you will notice throughout, Intel has superior random performance.
> 
> With CDM, there is very little difference in sequential performance between the two platforms; Intel still holds a slight edge. Intel holds a distinct advantage at 4K QD1 write, where it is 50% faster than AMD.
> 
> *SATA*
> 
> Ryzen delivers better sequential write speeds than Intel at transfers of 64KB and higher. Sequential read performance with larger file transfers is about equal. However, Intel delivers where it matters most: small sequential files. For some unknown reason, read transfers at 16KB took a nose dive on the Ryzen platform. We ran ATTO another time to see if it would happen again, and ATTO would not even run.
> 
> AMD delivers very respectable performance; however, Ryzen is crushed by Intel's vastly superior random performance. As you will notice throughout, Intel has superior random performance. A good portion of the disparity between the two platforms is due to Intel's RST driver and superior chipset. Ryzen has no SATA driver, and thus suffers the consequences.
> 
> *CONCLUSIONS*
> We are pleased with the PCIe storage performance that Ryzen delivers. It is within striking distance of Intel and we believe that for the most part user experience between the two platforms is comparable. However, SATA performance between the two platforms is another matter. Ryzen is at a distinct disadvantage compared with Intel if you are running a SATA SSD.


http://www.tweaktown.com/articles/8073/amd-ryzen-ssd-storage-performance-preview/index4.html

TL;DR : More driver issues related to storage optimization
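For readers wondering what "4K QD1 random" actually measures: it's many small reads at scattered offsets, one at a time. A rough sketch of the access pattern (hypothetical toy benchmark, not TweakTown's methodology; it hits the OS page cache, so real numbers need O_DIRECT and a raw device, and `os.pread` is POSIX-only):

```python
import os
import random
import tempfile
import time

BLOCK = 4096   # 4 KiB, the transfer size the article compares
COUNT = 256    # number of random reads

# Build a scratch file to read from.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(BLOCK * COUNT))
    path = f.name

fd = os.open(path, os.O_RDONLY)
# Random 4 KiB-aligned offsets: the pattern SATA/NVMe reviews call "4K random".
offsets = [random.randrange(COUNT) * BLOCK for _ in range(COUNT)]

start = time.perf_counter()
total = sum(len(os.pread(fd, BLOCK, off)) for off in offsets)
elapsed = time.perf_counter() - start

os.close(fd)
os.unlink(path)

print(f"read {total // 1024} KiB in {elapsed * 1e3:.2f} ms")
```

Because each read is tiny and dependent on seek/latency rather than raw bandwidth, this is exactly the workload where driver and chipset differences show up most.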


----------



## NYU87

Quote:


> Originally Posted by *AlphaC*
> 
> http://www.tweaktown.com/articles/8073/amd-ryzen-ssd-storage-performance-preview/index4.html
> 
> TL;DR : More driver issues related to storage optimization


It's well known that Intel's chipset and its SATA implementation are superior to AMD's. Hopefully with this new chipset AMD can at least reach parity with Intel.


----------



## looniam

Quote:


> Originally Posted by *Kuivamaa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> am i the only one thinking TOO MUCH emphasis is given to frame rate/time? or cpu core/thread utilization?
> 
> pretty sure gpu utilization is more of an indicator fo a cpu bottleneck. sure most of the time frame rate/time also goes up but not necessarily.
> 
> 
> 
> The FX chips most of the time push even the fastest videocards at 99% usage on modern titles, especially on 1440p ultra but still push clearly fewer fps than, say a 7700k. So GPU utilization is not the be all end all factor.

Quote:


> Originally Posted by *budgetgamer120*
> 
> Thats not true. My FX pushed my GPu 99% but still got higher FPS when I switched to intel.


Quote:


> Originally Posted by *sugarhell*
> 
> In general utilization metrics are a bit meh. Shader utilization, Geometry utilization, Front end Utilization?
> 
> A gpu can never actually be on 100% usage that's why all the APIs for utilization use algorithms that most of the times are way off.
> 
> I only care about minimum FPS and input lag.


ok, thanks for the correction/clarification, and +rep.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> Dota 2 is HARD to run ever since the 7.00 update. yes you heard it here. my 5820k @ 4.7ghz gets fps drops @ 1440p at 144hz to as low as 90fps with big team fights. it's crazy


Did you ever try a Steam stream to see if it was smoother compared to Twitch?


----------



## daffy.duck




----------



## budgetgamer120

Quote:


> Originally Posted by *daffy.duck*


Still a ******ed review. Comparing i5 cost etc


----------



## JackCY

Quote:


> Originally Posted by *budgetgamer120*
> 
> Still a ******ed review. Comparing i5 cost etc


Still hasn't shaved off his hair.


----------



## CriticalOne

Quote:


> Originally Posted by *budgetgamer120*
> 
> Still a ******ed review. Comparing i5 cost etc


Did you actually read the article and watch the video?


----------



## OutlawII

Quote:


> Originally Posted by *CriticalOne*
> 
> Did you actually read the article and watch the video?


No, they're just mad because he tells it like it is.


----------



## Brutuz

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Little bit hyperbolic with the losing the plot stuff, and the fact he mentions he 'had a tear in his eye' when he read the TPU review doesnt do much to speak for his objectivity but he does make an interesting point so far as I watched it. I agree that in 5 years time the Ryzen 8 cores are probably going to look better vs current gen quad core Intel ones especially I5's.
> 
> I do wonder how the quad I7's are going to age. The consoles are using incredibly weak (by PC standards) 8 core CPU's. AFAIK they reserve two cores for system usage leaving 6 open for use in games. Hence why we are now finally seeing games use more cores/threads.
> 
> If the consoles are stuck on 6 weak cores for at least 3 more years, I'm guessing an I7 with 4 much more powerful cores and 4 extra logical ones will still be pretty good in most games at least up to that point in gaming and probably beyond. I cant see the 7700K falling behind to the Ryzen 8 cores apart from in some edge cases. Could be wrong, no one knows the future
> 
> For myself, a bird in the hand is worth two in the bush, and I'll cross that bridge when I come to it


Compare CPU results in benchmarks with reasonable multi-thread usage; that's pretty much where CPUs will land once games fully utilize their threads. Ryzen will certainly end up faster than Kaby Lake, just like Bulldozer will certainly end up matching the 3770K; it's just a matter of when, and of whether the CPU will still be useful by then. Unlike BD, which was simply too weak before its threads were utilized, I think the answers here may be "soon" and "yes".
Quote:


> Originally Posted by *Slink3Slyde*
> 
> I just have to go back to my logic that if there was really a large amount of performance left on the table I cant understand why they would release knowing that. If they didnt know there was 17% or more going begging in games, someone dun goofed.
> 
> I wont come crying if/when my shiny new 7700K gets smashed in games by Ryzen in a couple of years, I'll just buy a new set up.


They would if they just wanted to get a high-performance product out right now. This isn't like Intel, where they could afford a delay much more readily; AMD hasn't had a competitive CPU for nearly 7 years now, I believe. They did the same with BD and then fixed it with PD; hopefully Zen's fixes can be done via microcode and Windows updates.

And that's exactly it: a 7700K still makes sense if you mainly game and plan on upgrading relatively soon compared to a Ryzen build. I don't get why it isn't obvious to everyone... Buy Ryzen if you want longevity, multi-tasking, or CPU-intensive work in general; buy Kaby Lake if you plan on upgrading soon, mainly game, or can get a used one very cheap when you already have something like an i5 6600K; and finally, buy HEDT if you simply have money to burn and want the best of the best. Ryzen can match or even beat HEDT in some situations, but HEDT has a much broader range where it performs well, while Ryzen seems to have concentrated on the largest potential markets with an extremely high-value chip.
Quote:


> Originally Posted by *DADDYDC650*
> 
> Great video linked below. For all the people that claim that even an i5 is faster than Ryzen... what a joke. In a few years, MS and Sony will probably have Ryzen chips in them. GL with an i5.
> 
> https://www.youtube.com/watch?v=ylvdSnEbL50&t=2s


I would put money on Project Scorpio including Zen cores; the timing and the reported performance increase make sense for a Zen + Polaris APU. Personally I'd hope for HBM either now or with the next upgrade, along with a few other improvements.
Quote:


> Originally Posted by *AlphaC*
> 
> http://www.tweaktown.com/articles/8073/amd-ryzen-ssd-storage-performance-preview/index4.html
> 
> TL;DR : More driver issues related to storage optimization


I'd put only a little of that down to driver issues. Intel's support chips have been great for decades at this point... There's a reason even high-end Ryzen boards tout Intel Ethernet as a feature.


----------



## CULLEN

ITT: people still complaining that Ryzen isn't impressive.

And no, Intel is not paying anyone to write a "bad" summary of Ryzen (although tons of inactive accounts have suddenly become notoriously active, only defending Intel). However, as with religion, there are fanatical followers of corporations who will stand up for them at all costs, no matter how illogical it sounds.

I've literally seen people here claiming that they'd rather buy i7 7700K even though they already have i7 4790K just to prove a point. Like, what?!

Suddenly what's been at the top of gamers' minds, getting higher minimum frames, doesn't matter at all. That point is now moot because Intel has higher average frames.

If Ryzen is above i7 7700K it's "just over" but when Intel is on top it's DESTROYING Ryzen.

*ANYWAY*

In the end, we're all just chasing cars; nothing we say will affect sales of either chip. I'll be getting Ryzen and I don't care at all what anyone else here gets. The only game I plan on playing is BF1, where Ryzen is brilliant, but everything else will be anything but gaming. For me it's a great investment. For anyone else: it's your money, so if you fancy buying something, do it; nobody cares.


----------



## budgetgamer120

Quote:


> Originally Posted by *CriticalOne*
> 
> Did you actually read the article and watch the video?


Yes.


----------



## looniam

that was a bit of a skanky move to use the polaris marketing slide to justify benching with a 1080. we have all argued here for days over how relevant the steam hardware surveys are. it is what it is: marketing for a mid-tier graphics card. _to a point_ GN or any review site doesn't really need to justify which benchmarks they run; if they're relevant they are, if not then i just ignore them.

maybe people need to ignore more things.

what am i saying?!?!?!

what fun is there in that?


----------



## blue1512

Quote:


> Originally Posted by *CULLEN*
> 
> ITT. People still complaining Ryzen isn't impressive.
> 
> And no, Intel is not paying anyone to write "bad" summary of Ryzen (although tons of inactive accounts suddenly became notoriously active, only defending Intel), however, like with religion, there are fanatic followers of corporations as well which will stand up for them at all cost, no matter low unlogical it will sound.
> 
> I've literally seen people here claiming that *they'd rather buy i7 7700K even though they already have i7 4790K just to prove a point.* Like, what?!
> 
> Suddenly what's been on top of gamers mind, getting higher minimum frames, doesn't matter, at all. That point is now moot and doesn't matter because Intel has higher average frames.
> 
> If Ryzen is above i7 7700K it's "just over" but when Intel is on top it's DESTROYING Ryzen.
> 
> *ANYWAY*
> 
> In the end, we're all just chasing cars, nothing we'll say will affect sales of either chip, I'll be getting Ryzen and I don't care, at all, what anyone else here will be getting. The only game I plan on playing is BF1 where Ryzen is brilliant, but everything else will be anything but gaming. For me it's a great investment. For anyone else, it's your money so if you fancy buying something, do it, nobody cares.


This.

Ironically, the 7700K is the smallest generational improvement in Core's history: basically a 6700K with a slightly better overclock and some Netflix exclusive...


----------



## Master__Shake

Quote:


> Originally Posted by *Charcharo*
> 
> This annoys me a lot. I myself am primarily a gamer, but it is as if most gamers have no long term thinking and simply can not understand CPUs (not that I am that much better).
> 
> The most important markets are the pre-build PCs, the Gaming DIY market and Servers/productivity/science. The last one has the best margins, and possibly is the most important, but it is gamers that set the mind share and " fame or infamy" of a product, which also affects pre-built PCs. That adds up over time. It does matter. And all the " hurr durr Ryzen 7 is bad for games lel, lets badmouth all of Ryzen" is probably going to be damaging.
> 
> Ryzen is above Broadwell-E in productivity and science.
> It has the same or better IPC
> It uses a bit less power.
> AMD achieved this on a MUCH worse process (Intel 14nm is the BEST in the world)
> Their Ryzen core is much smaller and is overall cheaper.
> 
> And people call... that bad? A good deal of its gaming deficiencies will be fixed in a few months, perhaps 3-4. But even so it is quite OK in games. And this is a fail?
> 
> *I want to fail this hard in real life, seems like a pleasant, good thing.


no matter how good amd does in gpu or cpu there will always be people that grab one thing and shout it as loud as they can.

power consumption or temperatures or 1080p gaming or whatever and they will just plug their ears and scream it over and over again.

it's a shame, they wouldn't have bought it anyway. they just want intel or nvidia to lower their price.

so they scream about competition but in actuality it's all for self serving reasons.


----------



## CULLEN

Surely this has been posted here already, but just to raise awareness, here it is again. No matter what side you're on, this is a must-watch.

*Highlights from the video:*
- i7 7700K is on average 4% ahead of Ryzen in 1080P gaming. *4%.*
- i7 7700K @ 4.9GHz is at its limit in BF1 and will likely not gain more performance, but Ryzen barely breaks a sweat and it's less than a week old platform.
- Ryzen is extremely good in single thread performance.
- In Tomb Raider, for example, Intel destroys AMD at low settings (510 fps vs. 378 fps), but at Ultra settings they are identical (around 176 fps). The same goes for Tom Clancy.
- Bulldozer went from being 10% behind the 2500K to 10% ahead because of better thread utilization.
- Many reviews you've seen in the past were benching at 480P/720P where 2500K was more than 17% faster than Vishera but is now almost 10% slower.
- A couple of game studios have confirmed that they'll update their game engines to utilize Ryzen better.
- If history repeats itself, Xbox and PS will be using AMD hardware in the next-gen, game developers will write code to utilize the processors to the max.
- i7 7700K might be a regretful decision in the weeks or months to come.

For anyone capable of rational thinking, Ryzen should probably be the obvious choice by now.

*But again*! Your money! Do what *you want*! If it's all a conspiracy against Intel and you already have your tin foil hat on, *go for Intel!* Today the i7 7700K is still a great processor, just terrible value compared to Ryzen.

I've seen some users with X99 considering upgrading because the value of their rig might fall very fast very soon. All I can say is, if you feel like going through the trouble of selling your computer, buying a new one, and installing everything again just to possibly save a few bucks, sure. But I'd personally hold on to it.


----------



## budgetgamer120

Quote:


> Originally Posted by *CULLEN*
> 
> Surely this has been posted here, but just to raise an awareness, here it is again. No matter what side you're on, this is a must watch.
> 
> 
> 
> 
> 
> 
> *Highlights from the video:*
> - i7 7700K is on average 4% ahead of Ryzen in 1080P gaming. *4%.*
> - i7 7700K @ 4.9GHz is at its limit in BF1 and will likely not gain more performance, but Ryzen barely breaks a sweat and it's less than a week old platform.
> - Ryzen is extremely good in single thread performance.
> - Tomb Raider and for eg. Intel destroys AMD at low settings (510 fps vs. 378 fps) but at Ultra settings they are identical (around 176 fps). Same goes for Tom Clancy.
> - Bulldozer went from being +10% behind 2500K to 10% ahead because of better thread utilization.
> - Many reviews you've seen in the past were benching at 480P/720P where 2500K was more than 17% faster than Vishera but is now almost 10% slower.
> - A couple of game studios have confirmed that they'll update their game engines to utilize Ryzen better.
> - If history repeats itself, Xbox and PS will be using AMD hardware in the next-gen, game developers will write code to utilize the processors to the max.
> - i7 7700K might be a regretful decision in the weeks or months to come.
> 
> For anyone capable of rational thinking, Ryzen should probably be the obvious choice by now.
> 
> *But again*! Your money! Do what *you want*! If it's all a conspiracy against Intel and you already have your tin foil hat on, *go for Intel!* Today i7 7700K is still a great processor, just at a terrible value compared to Ryzen.
> 
> I've seen some users with X99 considering upgrading because the value of their rig might fall very fast very soon. All I can say is if you feel like going through the trouble of selling your computer, buying a new one, installing everything again just to possibly save few bucks, sure. But I'd personally hold on to it.


How would a 7700K gain performance? It's already tapped out by one game. The processors that stand to gain performance are the ones not being fully used yet.

I know you aren't saying that, but I've seen someone say it before: that the 7700K will only improve... Trix are for kids, bro.


----------



## AuraNova

While all of the bickering and tension between the reviewers and the internet (as well as each other) was going on, the main thing on my mind was "What does AdoredTV think about this???" I have a lot of respect for this man. I understand many people don't like his way of thinking, but I swear, compared to others, he tends to think out of the box. Way out of the box. He's rarely wrong, either. Much of what he says is speculation, but he is well informed, well educated, and does his research on many of the things he covers. AdoredTV seems to have much better business acumen than many of the other reviewers on YouTube.

With that said, he makes a lot of sense. While I'm not well versed in how a CPU or GPU works and the processes involved, I have learned a lot from his videos. All the summary points in CULLEN's post above me were pretty much speculated on by AdoredTV, who goes into great detail.

(Besides, I love that accent too.)


----------



## Quantum Reality

I'm a little concerned that AMD hasn't issued a SATA component to their chipset driver. I wonder how much of an effect that has in real-world cases like boot times.


----------



## CriticalOne

AdoredTV is usually nothing but pure rubbish.


----------



## CULLEN

Quote:


> Originally Posted by *AuraNova*
> 
> (Besides, I love that accent too.)


I'm on a YT binge on his channel right now for that reason (and really good videos).

Quote:


> Originally Posted by *Quantum Reality*
> 
> I'm a little concerned that AMD hasn't issued a SATA component to their chipset driver. I wonder how much of an effect that has in real-world cases like boot times.


Good point, I'd like to know that as well.

Quote:


> Originally Posted by *CriticalOne*
> 
> AdoredTV is usually nothing but pure rubbish.


How come?


----------



## Ding Chavez

Wait until the 6c/12t Ryzen CPUs come out. They will come in at even lower prices, hopefully with some of the problems worked out by then.

The 4c/8t and 6c/12t Ryzens will offer a price/performance ratio that's hard to beat.

The 7700K is very good, but more cores and threads for less money might be a good thing...


----------



## ToTheSun!

Quote:


> Originally Posted by *CULLEN*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CriticalOne*
> 
> AdoredTV is usually nothing but pure rubbish.
> 
> 
> 
> How come?

His unhinged love for AMD has made him publicly say a few wrong things. And his unhealthy adoration for the brand is very cringeworthy, IMO.


----------



## JackCY

This last video isn't bad, though, at showing the problem with review testing: wanting strong single-core performance while sacrificing total performance.
The only reason not to buy a Ryzen right now is that it's a new platform with problematic, unfinished motherboard UEFIs, and Windows is out of date. So you have to be OK with resolving issues, updating the UEFI, trying OS tweaks as workarounds, and so on.
Other than that, the 1700 is way better than even the 7700K for common use, except in the rare cases that need very strong 1T performance and don't care about nT, such as old games and old apps that don't use nT at all or are stuck at 2-4 threads. It's obvious from all the gaming benchmarks in reviews which games are terribly optimized for threading, when a 4-core, or sometimes even a 2-core, beats 6-10 core CPUs simply because those run slightly lower clocks due to the higher core count.


----------



## Master__Shake

Quote:


> Originally Posted by *ToTheSun!*
> 
> His unhinged love for AMD has made him publicly say a few wrong things. And his unhealthy adoration for the brand is very cringeworthy, IMO.


but he used cold hard facts when comparing the 8350 to the 2500k.

he couldn't fake those.


----------



## cssorkinman

Quote:


> Originally Posted by *Master__Shake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ToTheSun!*
> 
> His unhinged love for AMD has made him publicly say a few wrong things. And his unhealthy adoration for the brand is very cringeworthy, IMO.
> 
> 
> 
> but he used cold hard facts when comparing the 8350 to the 2500k.
> 
> he couldn't fake those.

Unpopular truths are .... unpopular


----------



## sumitlian

Quote:


> Originally Posted by *JackCY*
> 
> This last video is not bad though in showing the issue with review testing and wanting strong single core performance while sacrificing total performance.
> The only reason why not to buy a Ryzen right now is that it's a new platform with problematic unfinished mobo UEFIs and Windows being out of date. So one has to be OK with resolving issues, updating UEFI, trying OS tweaks as workarounds, ...
> Other than that 1700 is way better than even the 7700K for common use except the rare cases that need very strong 1T performance and don't care about nT such as old games and old apps that don't use nT at all or are stuck at 2-4 threads etc. It's so obvious from all the gaming benchmarks in reviews what games are terribly optimized to use more threads when a 4 core sometimes even 2 core beats 6-10 core CPUs because those run a little lower clocks due to the higher core count.


I think the i7-7700K has only slightly higher _IPC_ than RyZen; the very strong 1T performance is mostly due to higher _IPS_, since the i7-7700K can overclock much higher than the RyZen 8-core CPUs.


----------



## umeng2002

Quote:


> Originally Posted by *sumitlian*
> 
> I think i7-7700k has only a little higher _IPC_ than RyZen, and the very strong 1T part is mostly due to higher _IPS_ with i7-7700k since it can overclock much higher than RyZen 8 core CPUs.


AMD has admitted Ryzen's IPC is about 4.5% lower than the 7700K's, and the 7700K clocks about 10% higher... so that's also what's driving the gap in some games.
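Those two figures compound, since single-thread throughput scales roughly as IPC x clock. A back-of-the-envelope check using the rough numbers from the post (claims, not measurements):

```python
# Rough figures from the post above: ~4.5% IPC deficit, ~10% clock deficit.
ryzen_ipc_deficit = 0.045   # Ryzen IPC ~4.5% below Kaby Lake (claimed)
clock_advantage = 0.10      # 7700K clocks ~10% higher (claimed)

# Single-thread throughput ~ IPC x clock, so the ratios multiply.
relative_ryzen_1t = (1 - ryzen_ipc_deficit) / (1 + clock_advantage)
print(f"Ryzen 1T ~= {relative_ryzen_1t:.0%} of a 7700K")  # ~87%
```

A roughly 13% single-thread deficit is consistent with the gaps seen in the lightly threaded titles, which is the point being made here.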


----------



## VegetarianEater

Quote:


> Originally Posted by *umeng2002*
> 
> AMD has admitted Ryzen IPC is about 4.5% lower than the 7700k and it clocks about 10% faster... so that's also what is driving the gap in some games.


yeah but it also loses to the 6900k in games... which is similar in both clocks and IPC...


----------



## Buris

Quote:


> Originally Posted by *umeng2002*
> 
> AMD has admitted Ryzen IPC is about 4.5% lower than the 7700k and it clocks about 10% faster... so that's also what is driving the gap in some games.


Without a doubt AMD will have some trouble when Skylake or Kaby Lake-E comes out. I don't think Cannon Lake will accomplish very much, and Coffee Lake has been pulled up into Q4 2017 (albeit still 14nm), which means Intel will potentially have released three generations of processors in one year. That's crazy.

Zen+ might be able to improve IPC by 10% or so, but depending on how nice 7nm ends up being, they could end up clocking these into 4.5GHz territory. It doesn't necessarily need to match the IPC, as long as the performance difference versus the competition is imperceptible.

Everyone knew the Core 2 series was better (IPC-wise) than the Phenoms and Phenom IIs, but AMD sold well by undercutting Intel by just a little bit.

The issue with AMD in the early 2010s was that Sandy Bridge absolutely destroyed it in IPC while Bulldozer was a step back compared to the Phenom IIs, making the FX lineup look like a complete turd (and it was). The 8350 came out and was around Nehalem IPC, but the damage to the FX and AMD brand had already taken its toll.


----------



## umeng2002

Quote:


> Originally Posted by *VegetarianEater*
> 
> yeah but it also loses to the 6900k in games... which is similar in both clocks and IPC...


With SMT off?

Ryzen has a trifecta of issues to resolve in the next few months:

- a more stable UEFI, to improve memory overclocking
- Windows patches so SMT and basic scheduling work better
- devs patching and/or optimizing future games for Ryzen's SMT implementation


----------



## looniam

Quote:


> Originally Posted by *Master__Shake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ToTheSun!*
> 
> His unhinged love for AMD has made him publicly say a few wrong things. And his unhealthy adoration for the brand is very cringeworthy, IMO.
> 
> 
> 
> but he used cold hard facts when comparing the 8350 to the 2500k.
> 
> he couldn't fake those.

but what he did do was compare completely different sets of gaming benchmarks throughout his comparison. each time the graphics card changed, so did the games. he even mentioned it later in his comparison when discussing games using more cores _but completely ignored it up to that point._

if he really wanted to debunk the myth, then the ONLY variable would have been the graphics card. his exercise simply pointed out what's been said for years: games will start using more cores. it didn't show any significant evidence that the press is wrong to use low-res benchmarks.
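The low-res rationale behind those press benchmarks can be put in a toy model: each frame costs whichever is slower, fixed CPU work or GPU work that scales with resolution, so dropping the resolution strips the GPU cap off and exposes the CPU gap. All numbers below are illustrative, not data from any review:

```python
# Toy frame-time model for the low-res benchmarking debate.
# Illustrative numbers only -- not measurements from any review.

def fps(cpu_ms: float, gpu_ms_per_mpix: float, megapixels: float) -> float:
    """Each frame costs whichever is slower: fixed CPU work, or
    GPU work that scales with the rendered resolution."""
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * megapixels)
    return 1000.0 / frame_ms

GPU_COST = 4.0                  # hypothetical card: ms per megapixel
FAST_CPU, SLOW_CPU = 4.0, 6.0   # ms of CPU work per frame

# 720p (~0.92 MP): the GPU is fast enough that the CPUs set the pace,
# so the 50% CPU-time difference shows up directly in the fps ratio.
low_res_gap = fps(FAST_CPU, GPU_COST, 0.92) / fps(SLOW_CPU, GPU_COST, 0.92)

# 1080p (~2.07 MP): the GPU becomes the bottleneck, both CPUs produce
# identical fps, and the gap the low-res test exposed is hidden.
high_res_gap = fps(FAST_CPU, GPU_COST, 2.07) / fps(SLOW_CPU, GPU_COST, 2.07)
```

The model also shows the reviewers' caveat: a faster future GPU lowers `GPU_COST`, which brings the low-res gap back at high resolutions.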


----------



## umeng2002

He did bring up some valid points, but the 2500K has at most 4 threads, while games treat the 8350 as having 8 threads. I don't see mainstream CPUs having more than 8 cores for some years, if ever.

The main "good" thing for AMD is that Ryzen is within spitting distance of Intel in gaming... not to mention trouncing it in production work.

People will not avoid AMD now... sales go up, etc.


----------



## The-Beast

Quote:


> Originally Posted by *umeng2002*
> 
> He did bring up some valid points, but the 2500K has at most 4 threads, while games treat the 8350 as having 8 threads. I don't see mainstream CPUs having more than 8 cores for some years, if ever.


You live in a very small world. The R5 line will bring mainstream CPUs with more than 8 threads, even if you don't think it's coming in the next year. The sheer fact that 22-core chips exist should tell you that core counts in consumer products are artificially deflated.


----------



## prznar1

Quote:


> Originally Posted by *looniam*
> 
> but what he did do was compare completely different sets of gaming benchmarks throughout his comparison. each time the graphics card changed, so did the games. he even mentioned it later in his comparison when discussing games using more cores _but completely ignored it up to that point._
> 
> if he really wanted to debunk the myth, then the ONLY variable would have been the graphics card. his exercise simply pointed out what's been said for years: games will start using more cores. it didn't show any significant evidence that the press is wrong to use low-res benchmarks.


No, no, no. The key was to show that AMD lacked proper support in games. Modern games are being designed for more cores, since the consoles use the 8-core Jaguar chip. Since then, games have been utilizing higher core counts much better than they used to. And I thought that was still just a dream for us, for games to finally utilize higher core counts.


----------



## looniam

Quote:


> Originally Posted by *umeng2002*
> 
> He did bring up some valid points, but the 2500K has at most 4 threads, while games treat the 8350 as having 8 threads. I don't see mainstream CPUs having more than 8 cores for some years, if ever.
> 
> The main "good" thing for AMD is that Ryzen is within spitting distance of Intel in gaming... not to mention trouncing it in production work.
> 
> People will not avoid AMD now... sales go up, etc.


tbh, i really don't want to admit he had any good points, since it irks me to no end when someone tries to use someone else's data to prove their own hypothesis. _but hey, that's common enough in academia, eh?_ if you want to prove something, then do *YOUR OWN* damned testing! this is why most youtubers ought to be avoided like the plague; all they have is a microphone and an opinion.

but yeah, now that my rant is over.

i'm looking forward to seeing the teething issues worked out and grabbing a 1700 and some 3600 ram in a month to game on after i go back to work. got a call earlier today about maybe doing some video work on the side later this year.

win - win


----------



## CriticalOne

If you're at work: yes, the 1800x is great for gaming.


----------




## 12Cores

OP please add this 1800X/sli test:

https://www.youtube.com/watch?v=8-mMBbWHrwM


----------



## prznar1

As much as I laughed at Jay's thermal-imaging "test", he is also saying that Ryzen is still not a mature platform, like most people who know their stuff. He didn't trash the CPU; he trashed the games for not supporting it correctly, and said that if you're not in a hurry, you should wait before buying Ryzen.

However, if you want to see some really crazy results, watch this.

And observe the MINIMUM fps.


----------



## Oubadah

..


----------



## looniam

Quote:


> Originally Posted by *prznar1*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> but what he did do was compare completely different sets of gaming benchmarks throughout his comparison. each time the graphics card changed, so did the games. he even mentioned it later in his comparison when discussing games using more cores _but completely ignored it up to that point._
> 
> if he really wanted to debunk the myth, then the ONLY variable would have been the graphics card. his exercise simply pointed out what's been said for years: games will start using more cores. it didn't show any significant evidence that the press is wrong to use low-res benchmarks.
> 
> 
> 
> 
> 
> 
> No, no, no. The key was to show that AMD lacked proper support in games. Modern games are being designed for more cores, since the consoles use the 8-core Jaguar chip. Since then, games have been utilizing higher core counts much better than they used to. And I thought that was still just a dream for us, for games to finally utilize higher core counts.

though he eventually did, that was not what he was trying to show. this is exactly what he disputed:
https://www.computerbase.de/2012-10/test-amd-fx-8350-vishera/7/
Quote:


> Although low-resolution benchmarks may not make any sense to the layman, they are an elementary part of the process. This is where the true performance of the processor shows, because the limitation by the graphics card, which already appears at 1,920 × 1,080 pixels, can be almost eliminated. *From this it can be deduced that the low-resolution results shown here today could come true with faster graphics cards at high resolutions, depending on how big the performance jump of the next graphics card generation is.* Accordingly, these values are important because they allow better planning for the future when choosing a processor.


that is talking about upgrading the graphics card, not the games. in the meantime he swaps out games like batman AC and mass effect 3, which a potato can run let alone a 680, and replaces them with crysis 3 and metro LL, which would put a heavier load on the titan even @ 1080p. my point is that if he really did want to debunk that theory, then the same games ought to have been used.
Quote:


> Originally Posted by *Oubadah*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Master__Shake*
> 
> but he used cold hard facts when comparing the 8350 to the 2500k.
> 
> he couldn't fake those.
> 
> 
> 
> 
> 
> 
> I didn't find the video particularly convincing. It's certainly _not_ a compelling argument for generating utterly useless GPU bound CPU benchmarks or not doing low res CPU benchmarks, even though he seems to think it is. However, the following did dawn on me:
> 
> The answer to the question of the longevity of the 8350 vs 2500K (the 3570K is more it's contemporary) depends on your priorities, and some of the disagreement on here might have something to do with that.
> 
> Is your primary objective:
> 
> a) Improving your future experience with *existing* games, or:
> b) Improving your future experience with *future* games
> 
> Mine is absolutely the former, so that's the angle I've been coming at this from. On the other hand, that video is assuming b.

you just might have said it better than me.


----------



## Master__Shake

Quote:


> Originally Posted by *Oubadah*
> 
> *I didn't find the video particularly convincing. It's certainly NOT an argument for generating utterly useless GPU bound CPU benchmarks, even though he seems to think it is. However, the following did dawn on me:*
> 
> The answer to the question of the longevity of the 8350 vs 2500K (the 3570K is more it's contemporary) depends on your priorities, and some of the disagreement on here might have something to do with that.
> 
> Is your primary objective:
> 
> a) Improving your future experience with *existing* games, or:
> b) Improving your future experience with *future* games
> 
> Mine is absolutely the former, so that's the angle I've been coming at this from. On the other hand, that video is assuming b.


so now the games on a titan x (pascal), titan (kepler), or 680 are gpu bound at 1080p.

why?


----------



## DaaQ

Quote:


> Originally Posted by *CULLEN*
> 
> Surely this has been posted here, but just to raise an awareness, here it is again. No matter what side you're on, this is a must watch.
> 
> 
> 
> 
> 
> 
> *Highlights from the video:*
> - i7 7700K is on average 4% ahead of Ryzen in 1080P gaming. *4%.*
> - i7 7700K @ 4.9GHz is at its limit in BF1 and will likely not gain more performance, but Ryzen barely breaks a sweat and it's less than a week old platform.
> - Ryzen is extremely good in single thread performance.
> - In Tomb Raider, for example, Intel destroys AMD at low settings (510 fps vs. 378 fps), but at Ultra settings they are identical (around 176 fps). The same goes for the Tom Clancy title.
> - Bulldozer went from being 10% behind the 2500K to 10% ahead because of better thread utilization.
> - Many reviews you've seen in the past were benching at 480P/720P where 2500K was more than 17% faster than Vishera but is now almost 10% slower.
> - A couple of game studios have confirmed that they'll update their game engines to utilize Ryzen better.
> - If history repeats itself, Xbox and PS will be using AMD hardware in the next-gen, game developers will write code to utilize the processors to the max.
> - i7 7700K might be a regretful decision in the weeks or months to come.
> 
> For anyone capable of rational thinking, Ryzen should probably be the obvious choice by now.
> 
> *But again*! Your money! Do what *you want*! If it's all a conspiracy against Intel and you already have your tin foil hat on, *go for Intel!* Today i7 7700K is still a great processor, just at a terrible value compared to Ryzen.
> 
> I've seen some users with X99 considering upgrading because the value of their rig might fall very fast very soon. All I can say is if you feel like going through the trouble of selling your computer, buying a new one, installing everything again just to possibly save few bucks, sure. But I'd personally hold on to it.


Quote:


> Originally Posted by *looniam*
> 
> but what he did do was compare completely different sets of gaming benchmarks throughout his comparison. each time the graphics card changed, so did the games. he even mentioned it later in his comparison when discussing games using more cores _but completely ignored it up to that point._
> 
> if he really wanted to debunk the myth, then the ONLY variable would have been the graphics card. his exercise simply pointed out what's been said for years: games will start using more cores. it didn't show any significant evidence that the press is wrong to use low-res benchmarks.


What you both missed, and I'm not sure this is fact on Windows 10, is the statement that Intel defaults to performance mode and AMD defaults to balanced mode in Windows.
So if that's true (it's easily confirmable), then all the reviews are pretty much borked, wouldn't y'all say?


----------



## Oubadah

..


----------



## looniam

Quote:


> Originally Posted by *DaaQ*
> 
> What you both missed, and I'm not sure this is fact on Windows 10, is the statement that Intel defaults to performance mode and AMD defaults to balanced mode in Windows.
> So if that's true (it's easily confirmable), then all the reviews are pretty much borked, wouldn't y'all say?


not sure how that is relevant to 5-year-old benchmarks. but fwiw, every time i've installed windows since W7, i've had to change the power plan from balanced to what i like (custom).


----------



## sumitlian

Quote:


> Originally Posted by *DaaQ*
> 
> What you both missed, and I'm not sure this is fact on Windows 10, is the statement that Intel defaults to performance mode and AMD defaults to balanced mode in Windows.
> So if that's true (it's easily confirmable), then all the reviews are pretty much borked, wouldn't y'all say?


I don't think Windows' default power profiles have any noticeable impact on performance.
Yes, core parking might affect performance; I remember getting a significant WinRAR, 7-Zip, and Skyrim boost with core parking disabled when I was using an 8350. And it's not just the 8350: imo, all multicore CPUs will see some improvement.

What I am more interested in is that The Stilt discovered significant fps improvements on Windows 7 + Ryzen. If MS provides the necessary fixes, then yes, all the current review data will be obsolete.


----------



## sumitlian

Quote:


> Originally Posted by *looniam*
> 
> not sure how that is relevant to 5-year-old benchmarks. but fwiw, every time i've installed windows since W7, i've had to change the power plan from balanced to what i like (custom).


Custom is the best. Both min and max processor state at 100% + core parking disabled = best performance.
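For what it's worth, those knobs can also be set from an elevated command prompt with `powercfg`. The setting aliases below (PROCTHROTTLEMIN/PROCTHROTTLEMAX for processor state, CPMINCORES for the core-parking floor) are the commonly documented ones, so treat this as a sketch and verify against `powercfg /query` on your own machine:

```shell
:: Pin minimum and maximum processor state to 100% on the active plan
powercfg /setacvalueindex scheme_current sub_processor PROCTHROTTLEMIN 100
powercfg /setacvalueindex scheme_current sub_processor PROCTHROTTLEMAX 100

:: Require 100% of cores to stay unparked (i.e. disable core parking)
powercfg /setacvalueindex scheme_current sub_processor CPMINCORES 100

:: Re-apply the scheme so the new values take effect
powercfg /setactive scheme_current
```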


----------



## epic1337

setting them to 99% instead of 100% reduces stutters.


----------



## Oubadah

..


----------



## sumitlian

Quote:


> Originally Posted by *Oubadah*
> 
> If you have an Intel quad without HT/HT disabled, no core parking happens. With HT, Windows parks the fake cores. Why is Windows parking 8350 cores?
> 
> Does Windows park any cores on an Intel hex+ proc if HT is disabled? I should have tested that when I had my broadwell E build.


Maybe because Windows also recognizes them as so-called 'fake' cores.

You can easily see it here:
http://www.tomshardware.com/answers/id-2367717/amd-parked-cores.html

Imo, it has nothing to do with hyper-threading. Windows will park every core/logical core it can find that is not doing much work, except core #0.


----------



## BinaryDemon

Quote:


> Originally Posted by *Oubadah*
> 
> If you have an Intel quad without HT/HT disabled, no core parking happens. With HT, Windows parks the fake cores. Why is Windows parking 8350 cores?


I think it works like this:

On Bulldozer/Piledriver the CPU is divided into 4 modules of 2 cores each, and each module shares some resources between its two cores. If you're using fewer than 4 threads, you want to avoid that resource sharing, so the Windows scheduler loads 1 core per module first and parks the other ones.
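That placement policy can be sketched as a preference order, assuming the usual layout where logical cores 2k and 2k+1 share module k. This is an illustration of the idea, not the actual Windows scheduler logic:

```python
# Sketch of the one-core-per-module-first placement described above,
# assuming logical cores 2k and 2k+1 share module k (Bulldozer-style).
# Illustration only -- not the real Windows scheduler.

def preferred_core_order(modules: int = 4) -> list[int]:
    """Fill one core per module first, then the module-sharing siblings."""
    primaries = [2 * m for m in range(modules)]      # one per module
    siblings = [2 * m + 1 for m in range(modules)]   # shared partners
    return primaries + siblings

order = preferred_core_order()
# With fewer than 4 busy threads, the siblings at the tail of the
# list stay idle, and those are the cores Windows can park.
```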


----------



## DaaQ

Quote:


> Originally Posted by *Oubadah*
> 
> I'm 99.99% sure that's BS. Windows 10 defaults to balanced on Intel hardware just like Windows 7 does, unless this is some Kaby Lake specific thing. I can test what 10 does on Skylake in a day or two.


Agree, I don't know if it's accurate or not; it's been quite a while since I've done a Windows install. But the day after release, the recommendation to disable HPET in BIOS and set Windows to high performance was everywhere, wasn't it? So what were the reviews of both done on? I still think the 7700K will end up faster regardless, due to clock speed alone, but that's not the point.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *dmasteR*
> 
> There's been multiple sales for BF1.
> 
> In fact BF1 is on sale right now for $30
> 
> https://www.origin.com/usa/en-us/store/battlefield/battlefield-1/standard-edition


Thanks for that! +Rep!


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *blue1512*
> 
> This.
> 
> Ironically 7700k is the worst improvement ever in Core's history, basically a 6700k with slightly better overclock and some Netflix exclusive....


I personally think the 7700K is the first Intel quad since the 2600K to truly impress me. You say it "only" has better OCs, but that's the biggest thing to me. You basically get Skylake's impressive IPC gains while getting even better OCing than the original daddy of OCing, Sandy Bridge...


----------



## Derp

Quote:


> Originally Posted by *epic1337*
> 
> setting them to 99% instead of 100% reduces stutters.


Got any proof?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *AuraNova*
> 
> While all of the bickering and tension between the reviewers and the internet (as well as each other) was going on, the main thing in my mind was "What does AdoredTV think about this???" I have a lot of respect for this man. I understand many people don't like his way of thinking, but I swear, compared to others, he tends to think out of the box. Way out of the box. He's rarely wrong either. Many of the things he says are speculation, but he is well informed, well educated, and does his research about many of the things he covers. AdoredTV seems to have much better business acumen than many of the other reviewers on YouTube.
> 
> With that said, he makes a lot of sense. While I am not well versed in how a CPU or GPU runs and the processes involved, I have learned a lot from these videos. All the summary points stated in CULLEN's post above me are pretty much what AdoredTV speculated about and goes into great detail on.
> 
> (Besides, I love that accent too.)


I love Adored's channel as well, even if he goes off the deep end at times when speculating about upcoming AMD products. I think it's that lovely Scottish brogue! In all seriousness, his evaluation of Ryzen and the supposed efficacy of low-resolution testing methodology is brilliant and is backed up by sources. I was actually very surprised to see that BD is now faster than the 2500K, as I thought that would never happen. Just goes to show that the idea that low-res benches will predict future performance is a crock.

IMO, low-res benches are basically the only way to separate the performance of CPUs in gaming, but that's not really super relevant, because (as I've been saying all along) gaming is a very poor way to compare CPUs in the first place. We are in a time now where the CPU is about as inconsequential in gaming as it's ever been. As long as you have enough cores not to cause a serious bottleneck, any of the current Intel CPUs will be more than enough for any gaming scenario, and the R7s (having Broadwell-E IPC and 8 cores) are in the same boat.

If you absolutely must crank out the highest possible FPS for your 144Hz monitor (and have a very beastly video card to boot), then sure, the 7700K will probably get you those extra 5 or so FPS you're looking for. But the R7s are very close in gaming to any of the X99 chips, so if your contention is that Ryzen "sucks" for gaming, then you are also saying that the 6900K sucks for gaming too. And that is ridiculous.


----------



## epic1337

Quote:


> Originally Posted by *Derp*
> 
> Got any proof?


none, just based on personal experience.

the point is that the cpu is less likely to get pegged at 100% load.


----------



## budgetgamer120

Quote:


> Originally Posted by *CULLEN*
> 
> Surely this has been posted here, but just to raise an awareness, here it is again. No matter what side you're on, this is a must watch.
> 
> 
> 
> 
> 
> 
> *Highlights from the video:*
> - i7 7700K is on average 4% ahead of Ryzen in 1080P gaming. *4%.*
> - i7 7700K @ 4.9GHz is at its limit in BF1 and will likely not gain more performance, but Ryzen barely breaks a sweat and it's less than a week old platform.
> - Ryzen is extremely good in single thread performance.
> - In Tomb Raider, for example, Intel destroys AMD at low settings (510 fps vs. 378 fps), but at Ultra settings they are identical (around 176 fps). The same goes for the Tom Clancy title.
> - Bulldozer went from being 10% behind the 2500K to 10% ahead because of better thread utilization.
> - Many reviews you've seen in the past were benching at 480P/720P where 2500K was more than 17% faster than Vishera but is now almost 10% slower.
> - A couple of game studios have confirmed that they'll update their game engines to utilize Ryzen better.
> - If history repeats itself, Xbox and PS will be using AMD hardware in the next-gen, game developers will write code to utilize the processors to the max.
> - i7 7700K might be a regretful decision in the weeks or months to come.
> 
> For anyone capable of rational thinking, Ryzen should probably be the obvious choice by now.
> 
> *But again*! Your money! Do what *you want*! If it's all a conspiracy against Intel and you already have your tin foil hat on, *go for Intel!* Today i7 7700K is still a great processor, just at a terrible value compared to Ryzen.
> 
> I've seen some users with X99 considering upgrading because the value of their rig might fall very fast very soon. All I can say is if you feel like going through the trouble of selling your computer, buying a new one, installing everything again just to possibly save few bucks, sure. But I'd personally hold on to it.


Really good video.


----------



## tpi2007

Quote:


> Originally Posted by *AlphaC*
> 
> Quote:
> 
> 
> 
> *PCIE*
> 
> Ryzen has a bit of an advantage in maximum sequential read performance. Intel has a bit of advantage when writing sequential data. Intel has a huge advantage in small-file performance. Comparing performance at a 4KB transfer size illustrates this fact.
> 
> Intel has a massive advantage in every measured category except 4K QD1 read. That said, AMD delivers very respectable performance. As you will notice throughout, Intel has superior random performance.
> 
> With CDM, there is very little difference in sequential performance between the two platforms; Intel still holds a slight edge. Intel holds a distinct advantage at 4K QD1 write, where it is 50% faster than AMD.
> 
> *SATA*
> 
> Ryzen delivers better sequential write speeds than Intel at transfers of 64KB and higher. Sequential read performance with larger file transfers is about equal. However, Intel delivers where it matters most: small sequential files. For some unknown reason, read transfers at 16KB took a nosedive on the Ryzen platform. We ran ATTO another time to see if it would happen again, and ATTO would not even run.
> 
> AMD delivers very respectable performance; however, Ryzen is crushed by Intel's vastly superior random performance. As you will notice throughout, Intel has superior random performance. A good portion of the disparity between the two platforms is due to Intel's RST driver and superior chipset. Ryzen has no SATA driver, and thus suffers the consequences.
> 
> *CONCLUSIONS*
> 
> We are pleased with the PCIe storage performance that Ryzen delivers. It is within striking distance of Intel and we believe that for the most part user experience between the two platforms is comparable. However, SATA performance between the two platforms is another matter. Ryzen is at a distinct disadvantage compared with Intel if you are running a SATA SSD.
> 
> 
> 
> http://www.tweaktown.com/articles/8073/amd-ryzen-ssd-storage-performance-preview/index4.html
> 
> TL;DR : More driver issues related to storage optimization

Good to know about that part of the deal for now. It was something I wanted to read about. Rep+

Quote:


> Originally Posted by *Quantum Reality*
> 
> I'm a little concerned that AMD hasn't issued a SATA component to their chipset driver. I wonder how much of an effect that has in real-world cases like boot times.


I hope that they release a specific driver, but let them do it properly. Back in 2012, when I got on the X79 platform - and I waited around 5 months after the platform's release before making the purchase, I was getting BSODs. After investigating the matter, the culprit was none other than Intel's RSTe driver (back then the normal RST wasn't officially available for the X79 platform as it is now). I was using the latest RSTe driver version that was recommended on the Asus site nonetheless (which they later removed). Intel did correct the problem and I haven't had a single storage related BSOD ever since (although others have with other versions, see the thread linked below), using either RSTe or RST for the X79 platform, but it goes to show that even Intel, in their enterprise grade software, had teething issues. But that wasn't all.

Even after correcting the bug that was causing the BSODs, it still took Intel a few more months to get the RSTe driver to let S.M.A.R.T. attributes through and allow the SSD Optimizer from their own SSD Toolbox software to work. Essentially, if you loaded Intel's own SSD software (or any other with similar functionality), you'd know nothing about the drive's health, nor could you manually run TRIM. So you could have better performance with RSTe (and I noticed the difference in games with lots of streaming, like driving around in GTA IV), or you could have lower performance but see the SMART attributes and run TRIM by using the standard Windows driver. Also, TRIM support wasn't available in RAID mode on the X79 platform for a long time.

In case you're interested, here is the thread about it I made back in 2012.

And if you're wondering how long it took for them to fix the S.M.A.R.T. pass through and Intel SSD Toolbox problems, here is the thread of when RSTe 3.2.0.1135 was released - August of 2012 - by that time the platform was nine months old.


----------



## budgetgamer120

AdoredTV ftw

Said it the best... The nerve of people thinking Ryzen is bad.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Brutuz*
> 
> Compare CPU results in benchmarks that have reasonable multi-thread usage, that's pretty much where CPUs will end up coming in when the threads are fully utilized by a game. Ryzen will certainly end up faster than Kaby Lake, just like Bulldozer will certainly end up matching the 3770k, it's just a matter of when and if the CPU will still be useful...Unlike BD which was simply too weak before its threads were utilized, I think the answer to those may be "Soon" and "Yes".
> They would if they just want to get a high-performance product out right now. This isn't like Intel where they could afford a delay much more readily, they haven't had a competitive CPU for nearly 7 years now I believe...They did the same with BD then fixed it with PD, however hopefully Zen's fixes can be done via microcode and Windows updates.
> 
> And that's exactly it, a 7700k does still make sense if you mainly game and plan on upgrading relatively soon compared to a Ryzen build. I don't get why it isn't obvious to everyone....Buy Ryzen if you want longevity, multi-tasking or do CPU intensive tasks in general, buy Kaby Lake if you plan on upgrading soon, mainly game or can get a used one very cheap and have an i5 6600k and finally, buy HEDT if you simply have money to burn and want the best of the best. Ryzen can match or even beat it in some situations, but it has a much broader range where it performs well while Ryzen seems to have concentrated on the largest potential markets with an extremely high value chip.
> I would put money on Project Scorpio including Zen cores, the timing and reported performance increase makes sense for a Zen + Polaris APU, personally I'd hope either with HBM now or the next upgrade including that along with a few other improvements.
> I'd only put a little bit of that down to driver issues. Intel's support chips have been great for decades at this point...There's a reason why even high-end Ryzen boards tout Intel Ethernet as a feature.


I get what you're saying about multithreaded performance matching gaming after a time. Ryzen is _already_ faster than an i7 7700K in all metrics apart from gaming and maybe a single-threaded benchmark here and there. Actually I agree with most of what you're saying, except that Broadwell HEDT has pretty much been killed in my eyes at its current pricing.

My concern is that I've seen benchmarks and let's plays for games that I play _now_ not doing as well on Ryzen as on Kaby Lake, in some cases barely matching or slightly losing to my current 3570K. Even if those are bunk and Ryzen ends up matching Broadwell IPC in games as well, it just doesn't clock high enough to make up the difference with Kaby for my needs. I'm not going to spend a lot of money for potentially slightly worse performance in some cases today in the hope that it will get better in the future. Hell, I haven't upgraded my chip for years because I didn't think 5% increments were worth it. There's an Intel chip now that boosts at stock to as far as my current chip overclocks, with 20-30% better IPC, that will likely OC 10% or more further, with 4 more threads as a bonus. That sounds like a reasonable point to upgrade. I waited for Ryzen, and it isn't for me at the moment.

That is not to say it's not for anyone else, or that it's not an amazing chip, or that, some launch issues aside, AMD's engineers haven't done a fantastic job. I have to keep repeating myself, because it seems to be a crime to say anything other than that Ryzen is king in everything and Intel sucks. Gamers Nexus got some kids telling them they hoped their families died, simply for reporting their findings and drawing conclusions from them.

The guy from AdoredTV makes no bones about absolutely bleeding red. Imagine if another YouTuber came out and said he had a 'tear in his eye' in a video because of the amazing performance of the Titan XP. Would anyone seriously think him credible and rely on his opinion? Computerbase.de is the only place, other than some debunked YouTuber, showing Ryzen outperforming all Intel chips across all games, and suddenly they are the only credible source, flying in the face of every other site out there?


----------



## umeng2002

Quote:


> Originally Posted by *The-Beast*
> 
> You live in a very small world. The R5 will have a mainstream cpu that will have more than 8 threads. Even if you don't think it's coming in the next year. The sheer fact that 22core chips exist should point you to the fact that the core count in consumer products is artificially deflated.


I said 8 cores.

Not 8 Threads.

With SMT, that means I implied 16 threads.


----------



## umeng2002

Quote:


> Originally Posted by *CriticalOne*
> 
> If you're at work: yes, the 1800x is great for gaming.


Just like 3D Pinball Space Cadet


----------



## Brutuz

Quote:


> Originally Posted by *Slink3Slyde*
> 
> I get what you're saying about multithreaded performance matching gaming after a time. Ryzen is _already_ faster than an i7 7700K in all metrics apart from gaming and maybe a single-threaded benchmark here and there. Actually I agree with most of what you're saying, except that Broadwell HEDT has pretty much been killed in my eyes at its current pricing.
> 
> My concern is that I've seen benchmarks and let's plays for games that I play _now_ not doing as well on Ryzen as on Kaby Lake, in some cases barely matching or slightly losing to my current 3570K. Even if those are bunk and Ryzen ends up matching Broadwell IPC in games as well, it just doesn't clock high enough to make up the difference with Kaby for my needs. I'm not going to spend a lot of money for potentially slightly worse performance in some cases today in the hope that it will get better in the future. Hell, I haven't upgraded my chip for years because I didn't think 5% increments were worth it. There's an Intel chip now that boosts at stock to as far as my current chip overclocks, with 20-30% better IPC, that will likely OC 10% or more further, with 4 more threads as a bonus. That sounds like a reasonable point to upgrade. I waited for Ryzen, and it isn't for me at the moment.
> 
> That is not to say it's not for anyone else, or that it's not an amazing chip, or that, some launch issues aside, AMD's engineers haven't done a fantastic job. I have to keep repeating myself, because it seems to be a crime to say anything other than that Ryzen is king in everything and Intel sucks. Gamers Nexus got some kids telling them they hoped their families died, simply for reporting their findings and drawing conclusions from them.
> 
> The guy from AdoredTV makes no bones about absolutely bleeding red. Imagine if another YouTuber came out and said he had a 'tear in his eye' in a video because of the amazing performance of the Titan XP. Would anyone seriously think him credible and rely on his opinion? Computerbase.de is the only place, other than some debunked YouTuber, showing Ryzen outperforming all Intel chips across all games, and suddenly they are the only credible source, flying in the face of every other site out there?


The funny thing is that I have a 3570k, it's almost never at 100% at 1080p and I still manage to pull at least 40fps average with a large GPU limitation. Ryzen is what I'll be going to (or Zen 2) simply because even though it's not as fast in today's games, it's still more than fast enough...and I'd rather that than Kaby being the same experience now (60Hz screen, so I'm not going to see Ryzen struggle to reach that even at ultrawide 1080p) and a lesser experience later. I guess that's the point: it's not like BD, where you'd get much lower performance now and the same performance later; instead you get slightly lower to unnoticeable performance drops now (e.g. 75fps vs 68fps) and much greater performance later.


----------



## Eagle1337

Quote:


> Originally Posted by *Brutuz*
> 
> The funny thing is that I have a 3570k, it's almost never at 100% at 1080p and I still manage to pull at least 40fps average with a large GPU limitation. Ryzen is what I'll be going to (or Zen 2) simply because even though it's not as fast in today's games, it's still more than fast enough...and I'd rather that than Kaby being the same experience now (60Hz screen, so I'm not going to see Ryzen struggle to reach that even at ultrawide 1080p) and a lesser experience later. I guess that's the point: it's not like BD, where you'd get much lower performance now and the same performance later; instead you get slightly lower to unnoticeable performance drops now (e.g. 75fps vs 68fps) and much greater performance later.


Unless you game at 144hz.


----------



## umeng2002

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I love Adored's channel as well, even if he goes off the deep end at times when speculating about upcoming AMD products. I think it's that lovely Scottish brogue! In all seriousness, his evaluation of Ryzen and the supposed efficacy of low-resolution testing methodology is brilliant and is backed up by sources. I was actually very surprised to see that BD is now faster than the 2500K, as I thought that would never happen. Just goes to show that this idea that low-res benches will predict future performance is a crock. IMO, low-res benches are basically the only way to separate the performance of CPUs in gaming, but that's not really super relevant because (as I've been saying all along) gaming is a very poor way to compare CPUs in the first place. We are in a time now where the CPU is about as inconsequential in gaming as it's ever been. As long as you have enough cores not to cause a serious bottleneck, any of the current Intel CPUs will be more than enough for any gaming scenario, and the R7's (having Broadwell-E IPC and 8 cores) are in the same boat. If you absolutely must crank out the highest possible FPS for your 144Hz monitor (and have a very beastly video card to boot) then sure, the 7700K will probably get you those extra 5 or so FPS you're looking for. But the R7's are very close in gaming to any of the X99 chips, so if your contention is that Ryzen "sucks" for gaming then you are also saying that the 6900K sucks for gaming too. And that is ridiculous.


It's not a crock because the 2500K is at most 4 threads.

The 8350 is 8 threads.

As I pointed out before, I don't see anything beyond 8c16t becoming mainstream, or even high end, except for edge cases.

So, I doubt there will ever be games going wider than 16 threads.


----------



## Liranan

Quote:


> Originally Posted by *looniam*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Master__Shake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ToTheSun!*
> 
> His unhinged love for AMD has made him publicly say a few wrong things. And his unhealthy adoration for the brand is very cringeworthy, IMO.
> 
> 
> 
> but he used cold hard facts when comparing the 8350 to the 2500k.
> 
> he couldn't fake those.
> 
> 
> but what he did do was compare completely different sets of gaming benchmarks throughout his comparison. Each time the graphics card changed, so did the games. He even mentioned it later in his comparison when discussing games using more cores, but completely ignored it up to that point.
> 
> If he had truly debunked the myth, the ONLY variable would have been the graphics card. His exercise simply pointed out what's been said for years: games will start using more cores. But it didn't show any significant evidence that the press is wrong to use low-res benchmarks.

So you expect him to use only benchmarks from 2012 to prove whether anything has changed? You're seriously grasping at straws, but that isn't hard to understand. We have been inundated with propaganda for years that Intel is better, and now that it isn't, many can't accept it.


----------



## Charcharo

Quote:


> Originally Posted by *ToTheSun!*
> 
> His unhinged love for AMD has made him publicly say a few wrong things. And his unhealthy adoration for the brand is very cringeworthy, IMO.


A love of engineering and forward thinking does that to people.

Normal users and gamers won't understand, of course.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Brutuz*
> 
> The funny thing is that I have a 3570k, it's almost never at 100% at 1080p and I still manage to pull at least 40fps average with a large GPU limitation. Ryzen is what I'll be going to (or Zen 2) simply because even though it's not as fast in today's games, it's still more than fast enough...and I'd rather that than Kaby being the same experience now (60Hz screen, so I'm not going to see Ryzen struggle to reach that even at ultrawide 1080p) and a lesser experience later. I guess that's the point: it's not like BD, where you'd get much lower performance now and the same performance later; instead you get slightly lower to unnoticeable performance drops now (e.g. 75fps vs 68fps) and much greater performance later.


Total War: Warhammer drives my 3570K below 20 FPS at times in large battles, and I also play stuff like Hearts of Iron IV a lot at the moment, where the time ticker starts going pretty slowly late game with a large number of battles going on worldwide; I have a single-thread bottleneck currently. I'm not your typical console game player on PC. Although I do play some action games, and you're right that in those Ryzen would be more than enough with plenty left in the tank for later, at the moment my 3570K would still be just fine for those games, and for the foreseeable future too.

There are many other reasons I've picked up a 7700K just now: my mobo is getting hinky with booting sometimes, USB ports not working on occasion, the availability of decent compatible RAM for Ryzen, being able to get a deal on the 7700K that works out a couple of hundred cheaper than Ryzen with a decent motherboard, needing it now since in a few weeks I won't have as much time to game anyway, etc.

I won't hesitate to recommend a Ryzen CPU, but the 7700K still has a place in gaming at the moment.


----------



## umeng2002

Any $250+ CPU is going to be great in gaming.

If I absolutely wanted a new gaming PC now, I'd probably go Intel too.

But with the R5 yet to launch, and various software bugs for Ryzen, I'd wait until summer or fall for a new gaming PC.


----------



## pez

So...serious question time.

I'm not really considering Ryzen for my main rig, as from a few reviews it looks like Ryzen has consistently put out lower minimums/99th percentiles than something like the 7700K. In the Ars Technica review, Kaby Lake pretty consistently performs what I'd consider significantly better in gaming when CPU-bound. However, I'm seeing arguments here of people saying Ryzen has dethroned Intel in gaming...what gives?

I love what Ryzen is offering, and am even considering it for an HTPC, but if I'm going to spend the money, I'm thinking I might as well repurpose my Haswell chip for that since I'm looking at a similar performance delta.


----------



## Charcharo

Quote:


> Originally Posted by *pez*
> 
> So...serious question time.
> 
> I'm not really considering Ryzen for my main rig, as from a few reviews it looks like Ryzen has consistently put out lower minimums/99th percentiles than something like the 7700K. In the Ars Technica review, Kaby Lake pretty consistently performs what I'd consider significantly better in gaming when CPU-bound. However, I'm seeing arguments here of people saying Ryzen has dethroned Intel in gaming...what gives?
> 
> I love what Ryzen is offering, and am even considering it for an HTPC, but if I'm going to spend the money, I'm thinking I might as well repurpose my Haswell chip for that since I'm looking at a similar performance delta.


The idea is that Ryzen itself will outlast the i7 7700K in gaming, long term. That also includes the 6 core 12 thread Ryzen 5 1600X, which might have more cache per core and/or higher clocks to boot.

Some people also raise the point that Z270 is pretty much it, the end of that platform. At most (and it's unlikely) one more refresh will use it and then it's game over, whilst AM4 runs until 2020 (which means Ryzen 2 and Ryzen 3).

Others also say that Ryzen has better performance for people that do many things, e.g. playing a game with 100 tabs open and doing 10 things at once. A valid point; my i5 4460 is not enough for me due to my computing habits.

Others still believe Ryzen's gaming performance will improve with OS updates, kernel/BIOS maturity, RAM issues being fixed, etc.

All are technically correct. I am 100% certain the Ryzen 7 and maybe Ryzen 5 CPUs will outlast a 7700K long term, and that AM4 is a more future-proof platform... but as for the rest?
It's a wait-and-see or subjective point.


----------



## pez

Quote:


> Originally Posted by *Charcharo*
> 
> The idea is that Ryzen itself will outlast the i7 7700K in gaming, long term. That also includes the 6 core 12 thread Ryzen 5 1600X, which might have more cache per core and/or higher clocks to boot.
> 
> Some people also raise the point that Z270 is pretty much it, the end of that platform. At most (and it's unlikely) one more refresh will use it and then it's game over, whilst AM4 runs until 2020 (which means Ryzen 2 and Ryzen 3).
> 
> Others also say that Ryzen has better performance for people that do many things, e.g. playing a game with 100 tabs open and doing 10 things at once. A valid point; my i5 4460 is not enough for me due to my computing habits.
> 
> Others still believe Ryzen's gaming performance will improve with OS updates, kernel/BIOS maturity, RAM issues being fixed, etc.
> 
> All are technically correct. I am 100% certain the Ryzen 7 and maybe Ryzen 5 CPUs will outlast a 7700K long term, and that AM4 is a more future-proof platform... but as for the rest?
> It's a wait-and-see or subjective point.


Good points. That's kinda what I gathered from my reading, but I see a few people that seem to pull info from thin air.

In all fairness, I'm more interested in doing a 4C/8T Ryzen build for my HTPC, as the cost will be at least $100+ cheaper than the equivalent Intel build. I'd like to see those minimums improve a bit with some fine tuning as well, since 1080p will be a focus for an HTPC for quite some time. Nonetheless, I'm happy to see the competition, and if it knocks Intel's prices down on their mainstream i7 and i5 CPUs I'll be happy, and in an even bigger dilemma than I am now.


----------



## Liranan

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Charcharo*
> 
> The idea is that Ryzen itself will outlast the i7 7700K in gaming, long term. That also includes the 6 core 12 thread Ryzen 5 1600X, which might have more cache per core and/or higher clocks to boot.
> 
> Some people also raise the point that Z270 is pretty much it, the end of that platform. At most (and it's unlikely) one more refresh will use it and then it's game over, whilst AM4 runs until 2020 (which means Ryzen 2 and Ryzen 3).
> 
> Others also say that Ryzen has better performance for people that do many things, e.g. playing a game with 100 tabs open and doing 10 things at once. A valid point; my i5 4460 is not enough for me due to my computing habits.
> 
> Others still believe Ryzen's gaming performance will improve with OS updates, kernel/BIOS maturity, RAM issues being fixed, etc.
> 
> All are technically correct. I am 100% certain the Ryzen 7 and maybe Ryzen 5 CPUs will outlast a 7700K long term, and that AM4 is a more future-proof platform... but as for the rest?
> It's a wait-and-see or subjective point.
> 
> 
> 
> Good points. That's kinda what I gathered from my reading, but I see a few people that seem to pull info from thin air.
> 
> In all fairness, I'm more interested in doing a 4C/8T Ryzen build for my HTPC, as the cost will be at least $100+ cheaper than the equivalent Intel build. I'd like to see those minimums improve a bit with some fine tuning as well, since 1080p will be a focus for an HTPC for quite some time. Nonetheless, I'm happy to see the competition, and if it knocks Intel's prices down on their mainstream i7 and i5 CPUs I'll be happy, and in an even bigger dilemma than I am now.

A low end to mid range Zen APU will be amazing, low power and great performance. I can't wait for Zen mobile chips.


----------



## ToTheSun!

Quote:


> Originally Posted by *Charcharo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ToTheSun!*
> 
> His unhinged love for AMD has made him publicly say a few wrong things. And his unhealthy adoration for the brand is very cringeworthy, IMO.
> 
> 
> 
> A love of engineering and forward thinking does that to people.
> 
> Normal users and gamers won't understand, of course.

That's the weirdest excuse for his content I've ever seen. There's a reason many people don't take him seriously; a "love of engineering and forward thinking" is not it.


----------



## Charcharo

Quote:


> Originally Posted by *ToTheSun!*
> 
> That's the weirdest excuse for his content I've ever seen. There's a reason many people don't take him seriously; a "love of engineering and forward thinking" is not it.


I am explaining the "love for a brand" part... which is not super logical in itself but can be explained with those words. AdoredTV has stated before that he likes AMD for those things, so I am taking his word for it. Obviously, I don't know whether it is the truth or a lie.

Guys, you do understand that no matter what happens, the scientists and engineers at AMD, Nvidia, Intel... will *always* have a job and probably make more money than any of us? A brand is just marketing; the human element is important, but here it is effectively equal.
Quote:


> Originally Posted by *pez*
> 
> Good points. That's kinda what I gathered from my reading, but I see a few people that seem to pull info from thin air.
> 
> In all fairness, I'm more interested in doing a 4C/8T Ryzen build for my HTPC, as the cost will be at least $100+ cheaper than the equivalent Intel build. I'd like to see those minimums improve a bit with some fine tuning as well, since 1080p will be a focus for an HTPC for quite some time. Nonetheless, I'm happy to see the competition, and if it knocks Intel's prices down on their mainstream i7 and i5 CPUs I'll be happy, and in an even bigger dilemma than I am now.


I myself am waiting for AM4+ with PCIE4.0 before I buy Ryzen or Ryzen 2.


----------



## budgetgamer120

Quote:


> Originally Posted by *pez*
> 
> So...serious question time.
> 
> I'm not really considering Ryzen for my main rig, as from a few reviews it looks like Ryzen has consistently put out lower minimums/99th percentiles than something like the 7700K. In the Ars Technica review, Kaby Lake pretty consistently performs what I'd consider significantly better in gaming when CPU-bound. However, I'm seeing arguments here of people saying Ryzen has dethroned Intel in gaming...what gives?
> 
> I love what Ryzen is offering, and am even considering it for an HTPC, but if I'm going to spend the money, I'm thinking I might as well repurpose my Haswell chip for that since I'm looking at a similar performance delta.


Why would anyone put a 16-thread CPU in an HTPC?


----------



## pez

Quote:


> Originally Posted by *Liranan*
> 
> A low end to mid range Zen APU will be amazing, low power and great performance. I can't wait for Zen mobile chips.


Yeah, I'd never thought about that. Maybe we'll even see some snazzy GTX 1060 mobile platforms with this performance to have a decent gaming laptop for around the $800 range...in my dreams, I guess.
Quote:


> Originally Posted by *Charcharo*
> 
> I am explaining the "love for a brand" part... which is not super logical in itself but can be explained with those words. AdoredTV has stated before that he likes AMD for those things, so I am taking his word for it. Obviously, I don't know whether it is the truth or a lie.
> 
> Guys, you do understand that no matter what happens, the scientists and engineers at AMD, Nvidia, Intel... will *always* have a job and probably make more money than any of us? A brand is just marketing, the human element is important, but here it is effectively equal.
> I myself am waiting for AM4+ with PCIE4.0 before I buy Ryzen or Ryzen 2.


Indeed. It'll be nice to see the second and third iteration of the platform. I hope to see great maturity, like I believe Intel achieved going from Z170 to Z270.
Quote:


> Originally Posted by *budgetgamer120*
> 
> Why would anyone put a 16-thread CPU in an HTPC?


Seems you didn't read the whole conversation. Besides, HTPCs are used for gaming sometimes.


----------



## Oubadah

..


----------



## pez

Quote:


> Originally Posted by *Oubadah*
> 
> No, Intel has just been better for years, no "propaganda" was required. Now that AMD is back in the game, it seems that the hard core AMD fans are not content with having several great new AMD processors, but feel the need to play "my inauguration crowd is bigger than your inauguration crowd" over the fact that the competition still has a small advantage in a select few areas.
> 
> I can't believe you guys.
> ...and as servers. Streaming, transcoding, file serving etc.


I'm sure it'll trigger him even more that my soon-to-be 1080Ti or TXP was considered as a possibility to end up in it as well.


----------



## ToTheSun!

Quote:


> Originally Posted by *Charcharo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ToTheSun!*
> 
> That's the weirdest excuse for his content I've ever seen. There's a reason many people don't take him seriously; a "love of engineering and forward thinking" is not it.
> 
> 
> 
> I am explaining the "love for a brand" part... which is not super logical in itself but can be explained with those words. AdoredTV has stated before that he likes AMD for those things, so I am taking his word for it. Obviously, I don't know whether it is the truth or a lie.

Oh, alright, then. That's fair.


----------



## Charcharo

Quote:


> Originally Posted by *ToTheSun!*
> 
> Oh, alright, then. That's fair.


Badass Boast: http://tvtropes.org/pmwiki/pmwiki.php/Main/BadassBoast

I am always fair.


----------



## mcg75

Quote:


> Originally Posted by *Slink3Slyde*
> 
> That is not to say it's not for anyone else, that it's not an amazing chip, or that, some launch issues aside, AMD's engineers haven't done a fantastic job. I have to keep repeating myself because it seems to be a crime to say anything other than that Ryzen is king in everything and Intel sucks. Gamers Nexus got some kids telling them they hoped their families died, simply for reporting their findings and drawing conclusions from them.


Pathetic, isn't it? There are very, very few people who actually think Ryzen is bad in any way, yet anything constructive posted is automatically interpreted as an attack. It can be seen online, and especially in this thread.
Quote:


> Originally Posted by *Slink3Slyde*
> 
> The guy from Adored TV makes no bones about absolutely bleeding red. Imagine if another YouTuber said in a video that he had a 'tear in his eye' because of the amazing performance of the Titan XP. Would anyone seriously think him credible and rely on his opinion? Computerbase.de is the only place, other than some debunked YouTuber, showing Ryzen outperforming all Intel chips across all games, and suddenly they're the only credible source flying in the face of every other site out there?


I've looked at the computerbase.de results and I'm not seeing what you refer to. They have Ryzen at 13.6% slower in 1080p gaming.

I don't think that's an unreasonable figure. With AMD getting developers on board, they should be able to narrow or even eliminate the gap.

It's an excellent start and things should get better with time.


----------



## ryan92084

Quote:


> Originally Posted by *12Cores*
> 
> OP please add this 1800X/sli test:
> 
> https://www.youtube.com/watch?v=8-mMBbWHrwM


Added, thanks. Also put a few more links in last night.


----------



## NightAntilli

Testing at low settings at 720p and even 1080p is still bogus though... Think about it... It's the equivalent of testing all GPUs at only 4K max settings, and leaving 1440p and 1080p results out 'because then we have a guarantee of removing a CPU bottleneck!'. If that was the standard, AMD's GPUs would be touted as the best a lot more often in the past, even though they might not be for certain other uses with specific setups.

"But but but, not all GPUs are meant for 4K, something like a GTX 1050 can only game at 1080P!" Yeah... I doubt a Ryzen 7, an 8C/16T CPU, was meant to game at 720p low settings...

There is no reason not to test these CPUs at multiple resolutions and multiple settings, rather than low resolution low settings. It is not an indication of future performance at all anymore. Besides, things are not linear anymore like they used to be when things ran only on one core. Back then this test might have made sense. Right now, with more threads being used, and more focus on smoothness of gameplay, frame times and minimum framerates are more important than amount of frames pushed at low settings.

It's quite funny though... The 'future' argument is only allowed to be used in favor of Intel. Apparently creating a pure CPU bottleneck will show you which one will perform better in the future, which is definitely Intel. Double the cores and threads? Pfft. Forget it, what matters is that right now the Intel CPU has the best performance, also at 720. AMD's more cores are useless and what matters is how it performs right now.


----------



## pez

Quote:


> Originally Posted by *NightAntilli*
> 
> Testing at low settings at 720p and even 1080p is still bogus though... Think about it... It's the equivalent of testing all GPUs at only 4K max settings, and leaving 1440p and 1080p results out 'because then we have a guarantee of removing a CPU bottleneck!'. If that was the standard, AMD's GPUs would be touted as the best a lot more often in the past, even though they might not be for certain other uses with specific setups.
> 
> "But but but, not all GPUs are meant for 4K, something like a GTX 1050 can only game at 1080P!" Yeah... I doubt a Ryzen 7, an 8C/16T CPU, was meant to game at 720p low settings...
> 
> There is no reason not to test these CPUs at multiple resolutions and multiple settings, rather than low resolution low settings. It is not an indication of future performance at all anymore. Besides, things are not linear anymore like they used to be when things ran only on one core. Back then this test might have made sense. Right now, with more threads being used, and more focus on smoothness of gameplay, frame times and minimum framerates are more important than amount of frames pushed at low settings.
> 
> It's quite funny though... The 'future' argument is only allowed to be used in favor of Intel. Apparently creating a pure CPU bottleneck will show you which one will perform better in the future, which is definitely Intel. Double the cores and threads? Pfft. Forget it, what matters is that right now the Intel CPU has the best performance, also at 720. AMD's more cores are useless and what matters is how it performs right now.


It's testing a CPU for a 'CPU'-bound situation. I'm not saying I wouldn't like to see more resolutions tested, but it's pretty clear why they all test at 1080p. Also, while this is the 8C/16T Ryzen we're seeing here, it might be an early indication of how 4C/8T Ryzen will perform in single-threaded and CPU-heavy applications.


----------



## umeng2002

I still say the biggest reason for the discrepancy in 1080p performance is Windows 10.

Where are the Linux gaming tests?


----------



## DADDYDC650

Seems as if these chips are binned. Info provided by Siliconlottery.com.

Ryzen 7 1700
93% reach 3.8GHz @ 1.376V
70% reach 3.9GHz @ 1.408V
20% reach 4.0GHz @ 1.440V
Ryzen 7 1700X
100% reach 3.8GHz @ 1.360V
77% reach 3.9GHz @ 1.392V
33% reach 4.0GHz @ 1.424V
Ryzen 7 1800X
100% reach 3.8GHz (assumed)
97% reach 3.9GHz @ 1.376V
67% reach 4.0GHz @ 1.408V
20% reach 4.1GHz @ 1.440V
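Those figures can be dropped into a quick sketch to compare the odds of hitting a given all-core clock across SKUs (numbers transcribed from the Silicon Lottery list above; treat them as a snapshot of their tested stock, not a guarantee):

```python
# Silicon Lottery's reported bin rates: fraction of tested chips reaching
# each all-core clock (transcribed from the list above).
bins = {
    "1700":  {3.8: 0.93, 3.9: 0.70, 4.0: 0.20},
    "1700X": {3.8: 1.00, 3.9: 0.77, 4.0: 0.33},
    "1800X": {3.8: 1.00, 3.9: 0.97, 4.0: 0.67, 4.1: 0.20},
}

def hit_rate(sku: str, target_ghz: float) -> float:
    """Fraction of tested chips of this SKU that reached target_ghz (0.0 if none listed)."""
    return bins[sku].get(target_ghz, 0.0)

for sku in bins:
    print(f"Ryzen 7 {sku}: {hit_rate(sku, 4.0):.0%} of tested chips reached 4.0 GHz")
```

Roughly one in five 1700s made 4.0 GHz versus two in three 1800Xs, which is the gap the binning argument rests on.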


----------



## umeng2002

Quote:


> Originally Posted by *DADDYDC650*
> 
> Seems as if these chips are binned. Info provided by Siliconlottery.com.
> 
> Ryzen 7 1700
> 93% reach 3.8GHz @ 1.376V
> 70% reach 3.9GHz @ 1.408V
> 20% reach 4.0GHz @ 1.440V
> Ryzen 7 1700X
> 100% reach 3.8GHz @ 1.360V
> 77% reach 3.9GHz @ 1.392V
> 33% reach 4.0GHz @ 1.424V
> Ryzen 7 1800X
> 100% reach 3.8GHz (assumed)
> 97% reach 3.9GHz @ 1.376V
> 67% reach 4.0GHz @ 1.408V
> 20% reach 4.1GHz @ 1.440V


Of course they're binned, being clocked so close to the wall from the factory, no matter what AMD says.


----------



## Slink3Slyde

Quote:


> Originally Posted by *mcg75*
> 
> Pathetic, isn't it? There are very, very few people who actually think Ryzen is bad in any way, yet anything constructive posted is automatically interpreted as an attack. It can be seen online, and especially in this thread.
> I've looked at the computerbase.de results and I'm not seeing what you refer to. They have Ryzen at 13.6% slower in 1080p gaming.
> 
> I don't think that's an unreasonable figure. With AMD getting developers on board, they should be able to narrow or even eliminate the gap.
> 
> It's an excellent start and things should get better with time.


It is an excellent start and welcome competition, and I'm sure gaming will improve with time. I was wondering earlier why, if they knew of the draw call issues with Windows 10 and the HPET and power plan problems people have referred to, they didn't wait a week or two to release. If there really is that much performance left to gain in games, and they have marketed Ryzen as a gaming chip, surely they must realize that day-one reviews are crucial. If it had been shown matching or beating BW-E in gaming as well, it would have been an absolute slam dunk. Someone suggested that perhaps as a smaller company (relative to Intel, I guess) they couldn't afford to wait; I'm not sure I can see the logic in that unless they were about to go bankrupt in the next month.

I was actually referring specifically to the 7700K, which they have Ryzen down as beating overall, whereas the places I generally go to all showed it behind the 7700K in the vast majority of games they tested, and quite significantly in some cases where Computerbase puts Ryzen ahead.

https://arstechnica.com/gadgets/2017/03/amd-ryzen-review/

http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/page5.html

http://techreport.com/review/31366/amd-ryzen-7-1800x-ryzen-7-1700x-and-ryzen-7-1700-cpus-reviewed/10

http://www.tweaktown.com/reviews/8081/amd-ryzen-7-1700-1700x-cpu-review/index6.html

Perhaps Computerbase.de have found the most CPU-intensive parts of the most multi-threaded new games and are benching there specifically; it's hard to tell, as I don't read German. I just see their results as an outlier compared to everyone else's. I've asked about the why of this before in another thread, IIRC; I never got an answer though.


----------



## BinaryDemon

Regarding Siliconlottery.com's testing:

I wonder why they used the different voltage set points for the 1700x.


----------



## NightAntilli

Quote:


> Originally Posted by *pez*
> 
> It's testing a CPU for a 'CPU'-bound situation. I'm not saying I wouldn't like to see more resolutions tested, but it's pretty clear why they all test at 1080p. Also, while this is the 8C/16T Ryzen we're seeing here, it might be an early indication of how 4C/8T Ryzen will perform in single-threaded and CPU-heavy applications.


If you want an indication of how 4C/8T performs, you disable 4 cores on the CPU. That will also give a better indication of how 4 cores would overclock, etc.

The low-resolution, low-settings benchmarks are still useless. How do you know you're testing CPU limitations rather than, say, memory controller limitations? Memory speed/bandwidth limitations? Motherboard limitations?
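For what it's worth, disabling cores in the BIOS is the faithful way to do that, since SMT pairing and cache allocation stay representative. As a rough software-side approximation on Linux, you can at least confine a benchmark process to 8 logical CPUs with processor affinity (a sketch only; which logical CPUs share a physical core varies by system, so check the topology under /sys before trusting it):

```python
import os

# Rough approximation only: restrict this process to the first 8 logical CPUs,
# loosely mimicking a 4C/8T configuration on an 8C/16T chip. BIOS core-disabling
# is more faithful, since logical-CPU numbering does not always map SMT siblings
# the way you expect.
available = sorted(os.sched_getaffinity(0))       # logical CPUs we may run on
target = set(available[:min(8, len(available))])  # keep at most 8 of them
os.sched_setaffinity(0, target)                   # pin the current process

print("running on logical CPUs:", sorted(os.sched_getaffinity(0)))
```

Launching the game or benchmark from this (or via `taskset`) inherits the restricted affinity, but it still shares the full L3 and memory controller, which is exactly the kind of non-core limitation the post above is asking about.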


----------



## unkletom

http://www.pcworld.com/article/3176100/computers/amd-ryzen-7-1700-vs-a-5-year-old-gaming-pc-or-why-you-should-never-preorder.html?page=2


----------



## NightAntilli

You know what I also find interesting...? Why did no one test with AMD GPUs? Everyone is always complaining about AMD having high CPU overhead on their cards, particularly GCN 1.0 and GCN 1.1. If you really want to create a CPU bottleneck, why not test with a 390X, or a Fury (X) card?


----------



## SoloCamo

Quote:


> Originally Posted by *unkletom*
> 
> http://www.pcworld.com/article/3176100/computers/amd-ryzen-7-1700-vs-a-5-year-old-gaming-pc-or-why-you-should-never-preorder.html?page=2


Don't waste your time, anyone... a 4.2GHz 3570K vs a stock 1700. The i5 is the article writer's personal rig; no bias there, of course. When your results are this far off from everyone else's, you might just stop and think something is up, right?

To quote this nonsense:
Quote:


> [The i5] smoked Ryzen in two out of the three games tested - *and I mean smoked* - and the CPUs traded blows in the third.
> 
> Wow.
> 
> First up: The Division. Testing was conducted using the Ultra graphics preset, with V-Sync and any GPU vendor-specific technologies disabled, as usual. The results aren't pretty for the Ryzen 7 1700. Heck, at 1080p *it's a damned massacre.*


----------



## Wishmaker

The Ryzen 1800X has a base clock of 3.6 GHz.
The Intel 6900K has a base clock of 3.2 GHz.

One tops out at 4.1 GHz if you are extremely lucky and the other at 4.4 GHz. I will let you guys figure out which is which.

If 4 GHz is the sweet spot and the AMD 1800X @ 4.1 GHz is the baseline ($180 over MSRP from Silicon Lottery), then we cannot make statements of the sort that the Ryzen 1800X is half the price Intel asks for its 6900K. If you want a chip that clocks to 4.1 GHz, of course.


----------



## SoloCamo

Quote:


> Originally Posted by *Wishmaker*
> 
> The Ryzen 1800X has a base clock of 3.6 GHz.
> The Intel 6900K has a base clock of 3.2 GHz.
> 
> One tops out at 4.1 GHz if you are extremely lucky and the other at 4.4 GHz. I will let you guys figure out which is which.
> 
> If 4 GHz is the sweet spot and the AMD 1800X @ 4.1 GHz is the baseline ($180 over MSRP from Silicon Lottery), then we cannot make statements of the sort that the Ryzen 1800X is half the price Intel asks for its 6900K. If you want a chip that clocks to 4.1 GHz, of course.


This logic hurts my brain. If you want to add in the cost of binning then you would need to add that to the 6900k oc as well. Secondly, 99% of people are likely just going to buy a 1700 and overclock it themselves. You are right, it's not half the price of a 6900k, it's 1/3rd of it.

4.1ghz on Ryzen is just as likely as 4.4ghz on the 6900k from what I've seen.


----------



## mcg75

Quote:


> Originally Posted by *SoloCamo*
> 
> Don't waste your time anyone...4.2ghz 3570k vs stock 1700. i5 is article writers personal rig, no bias there of course. When your results are this off from everyone elses you might just stop and think something is up right?
> 
> To quote this nonsense:


Here's more.
Quote:


> Don't get me wrong: There's a whole lot to like about Ryzen. Like I said, it's competitive at higher gaming resolutions, or if you pair it with a top-end graphics card like the GTX 1080. As evidenced by the Jekyll and Hyde Ashes of the Singularity results, Ryzen shines in games that actually make use of abundant cores, which is encouraging indeed if DirectX 12 and Vulkan games start gaining traction. Ryzen's incredibly, impressively power-efficient. The platform supports every modern amenity you could ask for. And, as you'll discover in PCWorld's Ryzen review, AMD's CPUs actually *whomp* Intel's chips in multithreaded productivity tasks-and for a fraction of the price of comparable 8-core Core processors. The Ryzen 7 1700 is damned disruptive, and not a dud whatsoever.


This is exactly what I'm talking about. Why do we vilify someone for the parts we don't like and ignore the parts where they praise?

He says Ryzen "whomps" Intel, as it certainly does in productivity tasks rather than gaming, but nobody gets upset about that. It's true.


----------



## Pro3ootector

Imagine what Ryzen 3 will do to Core i5.


----------



## JackCY

Quote:


> Originally Posted by *Wishmaker*
> 
> Ryzen 1800X has a base clock of 3.6 GHz
> INTEL 6900K has a base clock of 3.2 GHz.
> 
> One tops out at 4.1 GHz if you are extremely lucky and the other at 4.4 GHz. I will let you guys figure out which is which
> 
> 
> 
> 
> 
> 
> 
> .
> 
> If 4 GHz is the sweet spot and AMD 1800X @ 4.1 GHz is the baseline (180 dollars over MSRP from Silicon Lottery) then we cannot make statements of the sort that Ryzen 1800x is half the price INTEL asks for their 6900K. If you want a chip that clocks to 4.1 GHz of course


Then include 6900K binned to 4.4GHz.
Plus the 1700 at 3.8-3.9 GHz is easy to do and costs a third of an unbinned 6900K; the value is unbeatable.
Quote:


> Originally Posted by *umeng2002*
> 
> I still say the biggest reasons for the discrepancy in 1080p performance is Windows 10.
> 
> Where are some linux gaming tests?


Windows has another bug related to sleep mode, something Windows has struggled with for ages. When you put a Ryzen PC to sleep, then wake it and bench, you may see readings of 4.2 GHz or even 4.8 GHz.







The timer is completely messed up after sleep mode, and you can get some ridiculous fake OC and bench results.








2000+ Cinebench R15 anyone? Sleep, wake, tada!









Linux gaming is pretty much dead in the water but there are some reviews on Linux including gaming.

---

Windows:

- broken scheduler: moves threads across CCXs, and moves threads in general when it should let them be
- broken cache mapping
- broken sleep mode
- core parking enabled by default?
- possible mismanaging of SMT?
- ...


----------



## SoloCamo

Quote:


> Originally Posted by *mcg75*
> 
> Here's more.
> This is exactly what I'm talking about. Why do we vilify someone for the parts we don't like and ignore the parts where they praise?
> 
> He says Ryzen "whomps" Intel as it certainly does in tasks not gaming but nobody gets upset about that. It's true.


Followed by:
Quote:


> "No, it's not a dud-unless you're looking to replace a 5-year-old, quad-core Intel Core i5 chip for mainstream gaming at the most popular display resolution. There, the Ryzen 7 1700 can stumble, and stumble hard."


Nobody gets upset that it "whomps" Intel in multithreaded applications, because that's actually true. Ryzen is not a dud for gaming, nor is it being murdered, smoked, massacred, etc., as he plastered all over the performance benchmarks.


----------



## looniam

Quote:


> Originally Posted by *Liranan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> but what he did do was compare completely different sets of gaming benchmarks throughout his comparison. each time the graphics card changed so did the games. he even mentioned it later in his comparison when discussing games using more cores _but completely ignored it up to that point._
> 
> if he had truly debunked the myth then the ONLY variable would have been the graphics card. his exercise simply pointed out what's been said for years; games will start using more cores. but it didn't show any significant evidence that the press is wrong to use low-res benchmarks.
> 
> 
> 
> So what you expect him to do is only use benchmarks from 2012 to prove whether anything has changed? You are seriously grasping at straws, but that isn't hard to understand. We have been inundated with propaganda for years that Intel is better, but now that they aren't, many can't accept it.

proper testing of a hypothesis has nothing to do with being inundated w/propaganda, nor does calling someone out who changed their hypothesis midstream.

and i guess intel's propaganda failed w/me as i am looking for reasons to start doing production work (again) and feel more justified buying a ryzen later this year.


----------



## Wishmaker

Quote:


> Originally Posted by *SoloCamo*
> 
> This logic hurts my brain. If you want to add in the cost of binning then you would need to add that to the 6900k oc as well. Secondly, 99% of people are likely just going to buy a 1700 and overclock it themselves. You are right, it's not half the price of a 6900k, it's 1/3rd of it.
> 
> 4.1ghz on Ryzen is just as likely as 4.4ghz on the 6900k from what I've seen.


I got a Nurofen here on my desk. Should help with the pain.

Granted, the 4.4 GHz 6900K is a fabled creature, mythical even for 24/7 usage; the 4.1 GHz clock, which is unicorn territory for Ryzen, is not. So you do not need a 6900K from Silicon Lottery to hit 4.1 GHz, but it seems you do need a Ryzen from Silicon Lottery for that frequency.

If we factor in this and not the 4.4 GHz, you pay 1000 dollars for the Intel chip, which easily hits 4.1 GHz, and 680 dollars for the AMD chip, and the difference is not the 1/3 you mention, or even half. So in terms of overclocking, the AMD chip is bad.

For all intents and purposes, the discussion here is about the 1800X and not the 1700.


----------



## SoloCamo

Quote:


> Originally Posted by *Wishmaker*
> 
> I got a Nurofen here on my desk. Should help with the pain.
> 
> Granted the 4.4 GHz 6900K is a fable creature and even mythical for 24/7 usage, however, the 4.1 GHz clock, which is unicorn territory for Ryzen is not. So having said that, you do not need a 6900K from silicon lottery to hit 4.1 GHz but it seems that you need a Ryzen from Silicon lottery for that frequency.
> 
> If we factor in this and not the 4.4 GHz, you pay 1000 dollars for the INTEL chip which easily hits 4.1 GHz and you pay 680 dollars for the AMD chip and the difference does not make 1/3 as you mention.
> 
> For all intents and purposes, the discussion here is about the 1800 and not 1700.


Even with your ridiculous comparison, that still means you are saving $320. A 4.1GHz 6900K is not worth $320 more than a 4.1GHz Ryzen. Now if you actually look at the reality of the situation, it's a $671 price increase from a 1700.

Your argument is like using the 9590 and ignoring the 8320 though they both hit the same clocks. For all intents and purposes, your point is well, pointless.


----------



## tpi2007

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *unkletom*
> 
> http://www.pcworld.com/article/3176100/computers/amd-ryzen-7-1700-vs-a-5-year-old-gaming-pc-or-why-you-should-never-preorder.html?page=2
> 
> 
> 
> Don't waste your time anyone...4.2ghz 3570k vs stock 1700. i5 is article writers personal rig, no bias there of course. When your results are this off from everyone elses you might just stop and think something is up right?
> 
> To quote this nonsense:
> Quote:
> 
> 
> 
> Ryzen in two out of the three games tested-*and I mean smoked*-and the CPUs traded blows in the third.
> 
> Wow.
> 
> First up: The Division. Testing was conducted using the Ultra graphics preset, with V-Sync and any GPU vendor-specific technologies disabled, as usual. The results aren't pretty for the Ryzen 7 1700. Heck, at 1080p *it's a damned massacre.*
> 

The conclusion of the article is quite convoluted. He says that 1080p gaming is not for him because he's got a GTX 1080 and plays at 4K or 1440p, and he is planning on streaming and creating videos for work, so Ryzen may be what he is looking for. He even lists the rival proposition from the other side, which costs more than $1k.

The whole premise needs polishing. Who exactly buys a $300+ CPU and what for? What are the rest of the system specs of such a buyer?

And also, if he is to use a 3570K at 4.2 Ghz in the comparison, he better overclock the 1700 a bit too, since it's also unlocked and is thus also part of the value proposition. This paragraph is quite alarming in that regard:

Quote:


> The Core i5-3570K was long ago overclocked from its stock 3.5GHz to 4.2GHz to put more pep in its step. Because gamers with similar rigs are likely to have done the same, I left that in place. (Remember: Real-world comparison between two systems! Should I upgrade?)


Guess what people buying the Ryzen 7 1700 are going to do? Overclock it too. Yet none of that was talked about. This alone denotes quite the bias of the reviewer and disqualifies his results.

The 1080p argument is also frequently presented on the basis that many people on Steam still use that resolution. Since when has that been a factor for people buying CPUs that cost more than $300? It's one thing to say that 1080p lets you test the CPU's reserves for the future - but being coherent would then require reviewers to balance the extra cores in the equation, which is not what many are doing. If they are going to cite 1080p usage on Steam, they had better correlate it with the CPUs and GPUs being used. Almost as many people there have dual-core CPUs as quad-core CPUs (46.19% vs 47.74%, per the latest survey). The majority game on 1 GB VRAM GPUs (33.35%), followed by 2 GB VRAM GPUs (2047 + 2048 MB entries = 24.78%). 4 GB GPUs come in a distant third at 11.21%, and 6 GB and 8 GB GPUs combined are at 7.28%.

Quote:


> Originally Posted by *Wishmaker*
> 
> Ryzen 1800X has a base clock of 3.6 GHz
> INTEL 6900K has a base clock of 3.2 GHz.
> 
> One tops out at 4.1 GHz if you are extremely lucky and the other at 4.4 GHz. I will let you guys figure out which is which
> 
> 
> 
> 
> 
> 
> 
> .
> 
> If 4 GHz is the sweet spot and AMD 1800X @ 4.1 GHz is the baseline (180 dollars over MSRP from Silicon Lottery) then we cannot make statements of the sort that Ryzen 1800x is half the price INTEL asks for their 6900K. If you want a chip that clocks to 4.1 GHz of course


Actually, we can. How much is the premium on a golden 4.4 GHz Broadwell-E?

Not to mention that Broadwell-E's efficiency also goes out the door once you go over 4 GHz, so keep some extra cash in that equation for beefier cooling.


----------



## Pro3ootector

I will just leave this here.



http://pclab.pl/art73043-4.html

Edit: It's before / after OC.


----------



## NightAntilli

Looking at those benches, my FX-8 CPU looks like crap. And then I remind myself that I only have a FreeSync monitor with a range of 40 Hz to 75 Hz... FX-4300 would still be good enough for me for this game... I can wait for Ryzen+/Ryzen2 without worries.


----------



## Samuris

Quote:


> Originally Posted by *Pro3ootector*
> 
> I will just leave this here.
> 
> 
> 
> http://pclab.pl/art73043-4.html
> 
> Edit: It's before / after OC.


Looks like the minimum framerate is much better on the Ryzen CPU? Or am I mistaken?


----------



## huzzug

Quote:


> Originally Posted by *Samuris*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Pro3ootector*
> 
> I will just leave this here.
> http://pclab.pl/art73043-4.html
> 
> Edit: It's before / after OC.
> 
> 
> 
> Looks like the minimum framerate is much better on the Ryzen CPU? Or am I mistaken?

You're judging it the wrong way. The Intels have much higher max frames.


----------



## The-Beast

Quote:


> Originally Posted by *Wishmaker*
> 
> Ryzen 1800X has a base clock of 3.6 GHz
> INTEL 6900K has a base clock of 3.2 GHz.
> 
> One tops out at 4.1 GHz if you are extremely lucky and the other at 4.4 GHz. I will let you guys figure out which is which
> 
> 
> 
> 
> 
> 
> 
> .
> 
> If 4 GHz is the sweet spot and AMD 1800X @ 4.1 GHz is the baseline (180 dollars over MSRP from Silicon Lottery) then we cannot make statements of the sort that Ryzen 1800x is half the price INTEL asks for their 6900K. If you want a chip that clocks to 4.1 GHz of course


You seem to know everything about the chip. Could you explain this? https://youtu.be/S93asfXAu9M?t=777

Making absolute statements less than a week after the initial release is only going to make you look foolish.


----------



## pez

Quote:


> Originally Posted by *NightAntilli*
> 
> If you want an indication of how 4C/8T performs, you disable 4 cores on the CPU. This will also give a better indication of how 4 cores would overclock etc.
> 
> The low resolution low setting benchmarks are still useless. How do you know you're testing "CPU" limitations rather than say, memory controller limitations? Memory speed/bandwidth limitations? Motherboard limitations?


Disabling 4 cores on a 4 core CPU? I'm assuming you mean threads. And I guess that would be a cool test, but it would only be useful for synthetics. I don't think it's likely that people are going to disable threads on a CPU for gaming.

Also, keeping constant hardware and changing only the CPU, then testing the games is the way you're going to find out what game is limited by the CPU. Lower resolutions like 1080p and below are more CPU-limited than 1440p and above. This is pretty common knowledge.

I'm not saying that every game they tested was CPU bound, and maybe there were better choices as far as the game selection goes, but saying it's useless is just incorrect. You can take the data, analyze it, and come up with more than a few games that will essentially give you the conclusion that 'game X performs better on CPU 4 because CPU 4 does better at CPU-bound tasks'.


----------



## SkiesOfAzel

This is really interesting. If those results are true, then the Ryzen 7 gaming penalties are caused almost exclusively by cache misses due to threads switching between CCXs. It also means a fix should raise Ryzen 5/7 performance in most multithreaded tasks (in effect it raises IPC for those tasks).


----------



## Artikbot

Quote:


> Originally Posted by *Wishmaker*
> 
> Granted the 4.4 GHz 6900K is a fable creature and even mythical for 24/7 usage, however, the 4.1 GHz clock, which is unicorn territory for Ryzen is not. So having said that, you do not need a 6900K from silicon lottery to hit 4.1 GHz but it seems that you need a Ryzen from Silicon lottery for that frequency.


Seems to be less and less the case with newer BIOSes.

More people hitting 4.1 regularly are cropping up this week. Together with most vendors already supporting DDR4-3200 and even 3600 out of the box... This calls for a re-benching because we're looking at substantial performance differences just three working days into the life of Ryzen.

Hoping my stuff gets here by this weekend so I can test it for myself.


----------



## dragneel

Quote:


> Originally Posted by *huzzug*
> 
> You're judging it the wrong way. The Intels have much higher max frames.










i cant tell if you're joking or not


----------



## NightAntilli

Quote:


> Originally Posted by *huzzug*
> 
> You're judging it the wrong way. The Intels have much higher max frames.


Not sure if serious, or sarcastic.
Quote:


> Originally Posted by *pez*
> 
> Disabling 4 cores on a 4 core CPU? I'm assuming you mean threads. And I guess that would be a cool test, but it would only be useful for synthetics. I don't think it's likely that people are going to disable threads on a CPU for gaming.


Ryzen is an 8C/16T CPU. If you want to know how the Ryzen equivalent of 4C/8T performs, you disable 4 cores on it.
Quote:


> Originally Posted by *pez*
> 
> Also, keeping constant hardware and changing only the CPU, then testing the games is the way you're going to find out what game is limited by the CPU. Lower resolutions like 1080p and below are more CPU-limited than 1440p and above. This is pretty common knowledge.
> 
> I'm not saying that every game they tested was CPU bound, and maybe there were better choices as far the game selections used, but saying it's useless is just incorrect. You can take the data and analyze and come up with more than a few games that will essentially give you the conclusion that 'game X performs better on CPU 4 because CPU 4 does better at CPU-bound tasks'.


It's useless because this is no longer the 1C/1T era. At that time you could guarantee that the CPU was the limit. With more threads available than current software can manage, you can never isolate the limit on the CPU itself and judge performance on it. Especially with all the issues going on. Might be windows, might be the game code, might be other things. But whatever.


----------



## Kuivamaa

Quote:


> Originally Posted by *Samuris*
> 
> Looks like the minimum framerate is much better on the Ryzen CPU? Or am I mistaken?


It is pclab. You can safely ignore their graphs.


----------



## dragneel

Quote:


> Originally Posted by *Kuivamaa*
> 
> It is pclab. You can safely ignore their graphs.


I honestly have no idea what a pclab is. Mind explaining?


----------



## umeng2002

I wonder what the odds are of the Ryzen 5 6c/12t being a new stepping that clocks higher?

How long has AMD been stockpiling these dies?


----------



## pez

Quote:


> Originally Posted by *NightAntilli*
> 
> Not sure if serious, or sarcastic.
> Ryzen is an 8C/16T CPU. If you want to know how the Ryzen equivalent of 4C/8T performs, you disable 4 cores on it.
> It's useless because this is no longer the 1C/1T era. At that time you could guarantee that the CPU was the limit. With more threads available than current software can manage, you can never isolate the limit on the CPU itself and judge performance on it. But whatever.


Ok, let me put this scenario into play for you. I'm going to reference the numbers in the Guru3D review for this scenario (http://www.guru3d.com/articles_pages/amd_ryzen_7_1800x_processor_review,17.html). Specifically that link as I'm going to target Rise of the Tomb Raider results from that review.

A friend comes to you and says, 'I want to play Rise of the Tomb Raider on my 1080p 120Hz monitor, but I want to maintain 120fps because of the monitor I'm using. What CPU do you recommend?'.

Ignoring that there are no minimums included in the review, and ignoring the fact that AMD optimizations aren't 100% there yet - with that information, the 1800X is not going to get him what he wants: one, because it's not meeting the target, and two, because there's obviously a difference here being caused by the CPU. It's not a GPU bottleneck we're looking at, because the results all use the same GPU. You're looking at either a CPU bottleneck or a GPU that's being bottlenecked by the rest of the system.


----------



## mcg75

Quote:


> Originally Posted by *SoloCamo*
> 
> Followed by:
> Nobody gets upset that it "whomps" intel in multithreaded applications because it's actually true. Ryzen is not a dud for gaming, or being murdered, or being smoked, massacred, etc. which he plastered all over the performance benchmarks.


The guy likes dramatic words. That doesn't make his data completely useless.

He was one of the only ones I know of to use an AMD card during testing, an RX 480. And he did 1440p and 1080p at Ultra.

The setup he used would be what an entry level content creator and part time gamer might build.

My only issue is no overclock on the 1700. But given that 4.0 seems to be the max, it's not going to change results by that much.

But I suppose because people don't like his results, he's biased and or cheating on the benches somehow.


----------



## Wishmaker

The best summing-up of the 1800X chip:
Quote:


> Overclocking was perhaps the biggest disappointment - considering the 1800X is meant to be the top-binned CPU, the fact we only got to 4GHz on all cores when pitched against some typically demanding multi-threaded workloads with pretty decent cooling did seem lacklustre compared to the 4.2-4.4 GHz everyday overclock you'd likely be able to get from an Intel six or eight-core CPU, which suggests that AMD is really having to run these chips close to their natural limits.


----------



## looniam

Quote:


> Originally Posted by *dragneel*
> 
> I honestly have no idea what a pclab is. Mind explaining?


those crazy polish. go check their reviews yourself, i personally like them.

hell they even "found" an award to give ryzen:


----------



## unkletom

Building a 120/144Hz gaming setup with Ryzen would absolutely be suicide. You can't have your CPU bottleneck your gaming experience in this kind of setup, and preferably you want high-speed RAM like 3866/4000 MHz.

To my surprise I've seen a couple Ryzen builds with [email protected] But hey it's not my money.


----------



## Kuivamaa

Quote:


> Originally Posted by *dragneel*
> 
> I honestly have no idea what a pclab is. Mind explaining?


A Polish bench site, notorious for posting results wildly different from everyone else's.


----------



## umeng2002

I would imagine turning down a few sliders in a game will get you 120 or 144 Hz at 1080p no matter what CPU you have.


----------



## Charcharo

The way the wind is blowing with CPUs and games, it sounds like a fine 144hz long term build.

As for my friend that wants a 144 FPS experience in ROTTR... I will honestly school him on how naive it is to build a PC for just ONE game and scenario, followed by me scolding him to at least want to play a good game and not ROTTR









If he sees the light... or even if he doesn't... I will help him, of course. Either an OCed 7700K or a 1700 (if he wants long term).


----------



## dragneel

Quote:


> Originally Posted by *looniam*
> 
> those crazy polish. go check their reviews yourself, i personally like them.
> 
> hell they even "found" an award to give ryzen:


"Captivating." Fantastic.








Quote:


> Originally Posted by *Kuivamaa*
> 
> A Polish bench site, notorious for posting results wildly different from everyone else's.


I see. Fair enough.


----------



## Shatun-Bear

Quote:


> Originally Posted by *umeng2002*
> 
> I wonder what the odds of the Ryzen 5 6c/12t being a new stepping, that clocks higher?
> 
> How long has AMD been stockpiling these dies?


The thing that gets me more excited for the R5s is the 1600X. It's got a turbo frequency of 4 GHz, and because it has two fewer cores, it could overclock slightly better than the 4.1 GHz ceiling of the R7s. 4.2 GHz is a possibility.

Intel's newest 6-cores only reach 4.4 GHz on average, so the Ryzens could potentially be giving up only 200 MHz, which is tiny.

The crux of all this is that the 6800K is over $400. The top R5, the 1600X, is going to be way under that price, probably as low as $250. Even more compelling would be the 1600: cheaper still, but likely just as overclockable a 6-core as the X version if the R7s are anything to go by. This chip will be a market killer as long as the press and 4-core owners don't try to kill it at birth.


----------



## SoloCamo

Quote:


> Originally Posted by *mcg75*
> 
> The guy likes dramatic words. That doesn't make his data completely useless.
> 
> He was one of the only ones that I know of to use an AMD card during testing. Rx480. And he did 1440p and 1080p at Ultra.
> 
> The setup he used would be what an entry level content creator and part time gamer might build.
> 
> My only issue is no overclock on the 1700. But given that 4.0 seems to be the max, it's not going to change results by that much.
> 
> But I suppose because people don't like his results, he's biased and or cheating on the benches somehow.


Using your own personal overclocked rig as "real world" against a chip that can OC nearly 1 GHz from its base but isn't being OCed, on top of the reviewer being overly dramatic, is not what I like when I'm reading a review, but that's beside the point. This is where I see the problem:

http://www.guru3d.com/articles_pages/amd_ryzen_7_1800x_processor_review,19.html

Interesting: an Athlon X4 845 can push a GTX 1080 to 79fps (the 7700K getting 119fps), but I am supposed to believe an R7 1700 can't push an RX 480 past 60fps? This is my problem with that review.

or perhaps this will help...


----------



## bigjdubb

I got an email from microcenter showing a price drop for the 1700x to $350, that's not bad.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Kuivamaa*
> 
> A Polish bench site, notorious for posting results wildly different from everyone else's.


So, as a guy who likes ComputerBase.de, could you perhaps explain their methodology, or why they seem to get much better results with CPUs with much higher thread counts? It's a genuine question, not trying to bait an argument; I'm really curious why they are getting the results they are vs other sites.


----------



## AlphaC

http://www.microcenter.com/product/476004/Ryzen_7_1700X_34_GHz_8_Core_AM4_Boxed_Processor
AMD Ryzen 7 1700X 3.4 GHz 8 Core AM4 Boxed Processor
$399.99 SAVE $50.00 = $349.99

So it begins on the "X" CPUs: price drops less than a week in.

The Ryzen 7 1700X is a tough sell: if you overclock, it hits more or less the same clockspeed wall as the Ryzen 7 1700, and if you don't, the Ryzen 7 1800X boosts higher. The only thing stopping people who don't overclock from jumping to the Ryzen 7 1800X is the price differential.









The clockspeeds for non-overclockers looking for ~$300 CPUs with more than 8 threads seem alright.


----------



## sumitlian

Quote:


> Originally Posted by *Wishmaker*
> 
> I got a Nurofen here on my desk. Should help with the pain.
> 
> Granted the 4.4 GHz 6900K is a fable creature and even mythical for 24/7 usage, however, the 4.1 GHz clock, which is unicorn territory for Ryzen is not. So having said that, you do not need a 6900K from silicon lottery to hit 4.1 GHz but it seems that you need a Ryzen from Silicon lottery for that frequency.
> 
> If we factor in this and not the 4.4 GHz, you pay 1000 dollars for the INTEL chip which easily hits 4.1 GHz and you pay 680 dollars for the AMD chip and the difference does not make 1/3 as you mention or even half. So in terms of overclocking, the AMD chip is bad.
> 
> For all intents and purposes, the discussion here is about the 1800 and not 1700.


$1000 i7 6900K + $350 board = $1350 for LGA 2011-v3, a socket declared dead in 2017; zero future regarding new CPU support.
$330 R7 1700 + $350 board = $680 for Socket AM4, which is going to support at least one more series of further-optimized CPUs (Zen+); 100% guaranteed future support.

Even with the R7 at 3.9 GHz and the i7 at 4.2 GHz, the i7 6900K has *no doubt* become obsolete in every aspect, unless all you are going to do is run AVX2-based Blender or other AVX2-based software. You will also need to spend *2x* more money on two extra DDR4 sticks for quad channel on the i7 platform; Ryzen matches it IPC-wise with only two (very efficient) memory channels, so you save a little money on RAM with Ryzen. The i7 will also require an aftermarket CPU cooler.

Nobody is going to dump >130% more money (additional RAM + CPU cooler) for a mere 5-15% more performance (with overclock). Also, most productivity professionals do not overclock at all, for the sake of guaranteed stability; of course we have many users here on OCN who will overclock every CPU they touch, but that is <1% of total users worldwide.

If you want 120-144Hz+ gaming, buy the i7 7700K.
If you want extreme multithread performance, buy any of the Ryzen 8-core CPUs.

Edit: some typos were fixed.


----------



## Artikbot

Quote:


> Originally Posted by *sumitlian*
> 
> $1000 i7 6900k + $350 board = $1350 for a LGA-2011 V3 socket that has been declared dead in 2017, Zero future regarding new CPU support.
> $330 R7 1700 + $350 board = $680 for Socket AM4 that is going to support at least one new series of further optimized CPUs (Zen+), 100% guaranteed future support.
> 
> *Even with R7 at 3.9 GHz and i7 at 4.2 GHz, i7 6900k has no doubt become obsolete in every aspect, except if all you are going to do is run AVX2 based blender or other AVX2 based software.*
> Also you will need to spend *2x* more money for 2 extra DDR4 RAMs for Quad Channel for i7 platform. RyZen is equal to IPC wise with only two channel (extremely highly memory efficient) so you are saving a little amount of money for RAM in RyZen. i7 will also require aftermarket CPU cooler.
> 
> Nobody is going to dump >$130% more money (additional RAMs + CPU cooler) for mare 5-15% more performance(with overclock).
> Also most productivity professionals do not do any type of overclocking for the sake of guaranteed stability, of course we have many users here in OCN that will overclock every CPU they touch but that would be <1% of total users worldwide.
> 
> If you want 120-144Hz+ gaming, buy i7 7700k.
> If you want extreme multithread performance, buy any of RyZen 8 core CPUs.


And even if you're going to run AVX2-aware programs, they need to make use of 512-bit SIMD (or multiple 256-bit ops, IIRC) to gain the advantage Intel has in that respect, because Ryzen can run 256-bit vectors in one clock too.


----------



## Shatun-Bear

Quote:


> Originally Posted by *sumitlian*
> 
> *$1000 i7 6900k + $350 board = $1350 for a LGA-2011 V3 socket that has been declared dead in 2017, Zero future regarding new CPU support.
> $330 R7 1700 + $350 board = $680 for Socket AM4 that is going to support at least one new series of further optimized CPUs (Zen+), 100% guaranteed future support.
> *
> Even with R7 at 3.9 GHz and i7 at 4.2 GHz, i7 6900k has *no doubt* become obsolete in every aspect, except if all you are going to do is run AVX2 based blender or other AVX2 based software.
> Also you will need to spend *2x* more money for 2 extra DDR4 RAMs for Quad Channel for i7 platform. RyZen is equal to IPC wise with only two channel (extremely highly efficient) so you are saving a little amount of money for RAM in RyZen. i7 will also require aftermarket CPU cooler.
> 
> *Nobody is going to dump >$130% more money (additional RAMs + CPU cooler) for mere 5-15% more performance(with overclock).*
> Also most productivity professionals do not do any type of overclocking for the sake of guaranteed stability, of course we have many users here in OCN that will overclock every CPU they touch but that would be <1% of total users worldwide.
> 
> If you want 120-144Hz+ gaming, buy i7 7700k.
> If you want extreme multithread performance, buy any of RyZen 8 core CPUs.
> 
> Edit: some typos were fixed.


Spot on.


----------



## sumitlian

Quote:


> Originally Posted by *Artikbot*
> 
> And even if you're to run AVX2-aware programs, they need to make use of 512bit SIMDs (or multiple 256bit ones IIRC too?) to gather that advantage Intel has in that respect, because Ryzen can run 256bit vectors in one clock, too.


Ryzen can do a 256-bit load per cycle, but stores are limited to 128 bits per cycle (if I've not read that wrong), while Intel has not had that limitation since Sandy Bridge. On Ryzen, if the 256-bit SIMD operations in an application are mostly memory reads rather than writes, I think there should not be any significant difference between Intel and AMD regarding AVX-256; if it's the other way around, AMD will be slower.

Can't say about 512-bit AVX, since I don't think any problem domain remains in gaming applications for 512-bit SIMD, though my knowledge here is very limited. All I've read is that both major consoles use AVX-128 SIMD (or possibly 256-bit SIMD).

I think the 6900K/7700K can only improve gaming performance in the future if more games start to use AVX2. But even then, I believe only PC-exclusive titles would see such significant gains (from superior AVX-256 IPC, not what we are seeing now with non-AVX games), because both the PS4 and XB1 use an AMD Jaguar-based APU, and Jaguar does not do 256-bit vector (AVX-256) processing in a single cycle; it was designed to do one 128-bit vector operation per cycle. To be honest, I don't know enough about porting to say whether developers would see the potential for a performance increase in using wider instructions when porting games to PC (since the PC has the much better vector processor in Intel), or whether they would skip it because it only costs time and extra overhead. Also, all CPUs that support AVX fully support 128-bit AVX instructions, which means developers should not need to worry about backward compatibility between different AVX-capable architectures. The problem comes if they decide to use AVX2-256 integer/double instructions (_mm256_xxx_epi32/pd, 8 x 32-bit or 4 x 64-bit packed; on these instructions AMD has 50% of Intel's throughput), but it is highly unlikely they would do so for now, because I don't think AVX2-capable processors yet outnumber non-AVX2 processors worldwide. Finally, my speculation: if Ryzen's 128-bit throughput is at least equal to Jaguar's, and console games are obviously optimized for the Jaguar architecture, it is very likely that the IPC-wise performance gap between the i7 6900K / i7 7700K and Ryzen will not increase. Also, most people still don't realize that the higher frame rates of the 7700K have more to do with instructions per second (i.e. clock speed) than with IPC, since the 7700K overclocks much better.

Edit: typos fixed; pardon me if any remain, and ignore the grammar lol.
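The read-versus-write asymmetry described above can be put into a rough cycle-count model. This is only a toy sketch of the claim in the post, not measured data; the per-cycle widths are the figures as stated there (Zen: 256-bit loads, 128-bit stores; Intel: 256-bit for both), and the function name is my own.

```python
# Toy model: cycles to stream N 256-bit loads and M 256-bit stores,
# given a core's load and store width in bits per cycle.
def stream_cycles(n_loads, n_stores, load_bits, store_bits):
    # A 256-bit access wider than the per-cycle port is split into
    # multiple cycles (e.g. a 256-bit store on a 128-bit port takes 2).
    load_cycles = n_loads * max(1, 256 // load_bits)
    store_cycles = n_stores * max(1, 256 // store_bits)
    return load_cycles + store_cycles

# Widths as claimed in the post (bits per cycle).
zen = dict(load_bits=256, store_bits=128)
intel = dict(load_bits=256, store_bits=256)

# Read-heavy mix (90 loads, 10 stores): Zen is barely behind.
print(stream_cycles(90, 10, **zen), stream_cycles(90, 10, **intel))  # 110 100
# Write-heavy mix (10 loads, 90 stores): Zen falls well behind.
print(stream_cycles(10, 90, **zen), stream_cycles(10, 90, **intel))  # 190 100
```

Under this model a read-heavy AVX-256 workload costs Zen only 10% extra, while a write-heavy one nearly doubles its memory cycles, which is the point being made.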


----------



## Oubadah

..


----------



## sumitlian

Also, if anybody does need extreme AVX performance and their application supports up to 512-bit AVX (I don't know whether any real-world use for that exists yet), why don't they buy one of the Skylake-based Xeon models that support AVX-512? It would be much better for them to spend on a Skylake Xeon rather than Broadwell-E, imo.


----------



## PsyM4n

512-bit AVX is not supported on retail CPUs yet. Skylake-EP is meant to support it.


----------



## sumitlian

Quote:


> Originally Posted by *PsyM4n*
> 
> 512 bit avx is not supported on retail CPUs yet. Skylake EP is meant to support it.


Yes agreed!


----------



## Slink3Slyde

Quote:


> Originally Posted by *Oubadah*
> 
> You mean the phenomenon of last gen HEDT processors giving better scores on <5 thread games vs newer i7 quads with higher IPC and clocks?


More the fact that other review sites tested the same games and found the 7700K better. The only place I've seen BW-E processors and Ryzen ahead is ComputerBase. I'm not denying it's possible; I'm actually wondering if their testing is better than most, and I can't read their testing methods the way I can the others' to tell for myself.


----------



## NightAntilli

Quote:


> Originally Posted by *pez*
> 
> Ok, let me put this scenario into play for you. I'm going to reference the numbers in the Guru3D review for this scenario (http://www.guru3d.com/articles_pages/amd_ryzen_7_1800x_processor_review,17.html). Specifically that link as I'm going to target Rise of the Tomb Raider results from that review.
> 
> A friend comes to you and says, 'I want to play Rise of the Tomb Raider on my 1080p 120Hz monitor, but I want to maintain 120fps because of the monitor I'm using. What CPU do you recommend?'.
> 
> Ignoring that there's no minimums included on the review, and ignoring the fact that AMD optimizations aren't 100% yet--with that information given, the 1800X is not going to get him what he wants. One because it's not meeting the target, and two because there's obviously a difference here being caused by the CPU. It's not a GPU bottleneck that we're looking at because the results all include the same GPU. You're looking at either a CPU bottleneck or a GPU that's being bottlenecked by the rest of the system.


I'd recommend using DX11 instead.









Quoted from one of my other posts;
http://www.overclock.net/t/1624618/tpu-amd-responds-to-ryzens-lower-than-expected-1080p-performance/260_20#post_25901155



Everyone was saying that DX11 BF1 was the way to test for GPUs, because DX12 is inferior in most ways... For some reason, everyone decided to test DX12 for the Ryzen review... We all know that DX12 is closer to the metal, meaning, the programming needs a bit more handholding... And the CPU instructions need to be adequate. Considering the difference between DX11 and DX12 in BF1 for Ryzen, we know that it's not the CPU that is the issue in BF1. It's the programming that is not optimized for Ryzen...



The same would apply for any DX12 game at the moment.


----------



## Artikbot

Quote:


> Originally Posted by *sumitlian*
> 
> RyZen can only do 256 bit load / cycle. For store, it is limited to 128 bit / cycle if I've not read that wrong, while Intel do not have that limitation since Sandy Bridge.In RyZen, if 256 bit SIMD operations in an application are mostly about more memory reads than writes then I think there should not be any significant difference between Intel and AMD regarding AVX-256, but if it is vice versa then AMD will be slower.
> Can't say about 512 bit AVX since I don't think there exist any problem domain remaining for gaming applications for 512 bit SIMD, well my knowledge is very limited in this. All I've read is both major consoles do use AVX-128 SIMD (or even 256 bit SIMD).
> 
> I think 6900k/7700k can only improve gaming performance in future if more games will start to use AVX2. But even if that is the case then I believe only PC exclusive titles may see such significant gains (due to superior AVX-256 IPC, not what we are seeing now with non AVX based games), because we know that both PS4 and XB1 has AMD Jaguar based A/CPU, Jaguar does not provide 256-bit vector (AVX-256) processing in a single cycle, it had been designed to do 128 bit vector operation per cycle. Tbh I am technically illiterate when it comes to porting if developers see the potential for performance increase in utilizing wider instructions while porting games to PC since PC has much better vector processor in Intel or they do not because it would only cost time and extra overhead !? Also all CPUs that support AVX also fully support 128 bit AVX instructions that means developers should not need to worry about backward compatibility between different AVX based architecture. The problem comes when they decided to use AVX2-256 (Integer/double instructions, _mm256_xxx_epi32/pd (8 x 32 bit / 4 x64 bit packed, in these instructions AMD has 50% throughput of what Intel can do ) ), but it is highly unlikely for now they would do so because I think market of AVX2 based processors globally still doesn't surpass non AVX2 based processor, at least for now. Finally my speculation is, if RyZen in terms of 128-bit throughput is actually equal to Jaguar and since console games are obviously optimized for Jaguar architecture, it is very likely that performance gap between i76900k / i7 7700k and RyZen, IPC wise, will not increase. Also most still don't realize that higher frame rates with 7700k has more do to with IPS than the IPC, since 7700k overclocks much better.
> 
> Edit: types fixed, pardon me if there is any,also ignore the grammar lol.


Crikey, that's a dense wall of words, haha.

Anyway. Yeah, it seems Zen does 256-bit by fusing two 128-bit registers, with a performance penalty (it still uses only one of the two FMACs). Although it's nowhere near as bad as doing 512-bit, where Intel fuses two AVX2 registers while AMD has to use both FMACs (adding two sets of two 128-bit registers, if I'm not mistaken?), which is what takes a substantial performance hit.

On the other hand, it looks like most Intel chips don't have AVX2 registers, and some don't even support the instruction set at all (Pentium only does SSE4.1/4.2), so this is a very immediate stick in the wheels of mainstream 256- and 512-bit AVX2 adoption.

It doesn't seem to me like the lack of 256-bit AVX2 registers is a threat to Zen at all, bar the very odd case (such as the particular revision of Blender that Tom's happened to test).
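The FMAC arithmetic above reduces to simple peak-throughput figures. A back-of-envelope sketch under the execution-unit counts discussed in the thread (Zen: 2 x 128-bit FMAC pipes; Broadwell/Kaby Lake: 2 x 256-bit FMA ports); the function name is mine and this ignores everything except raw FMA width.

```python
# Peak single-precision FLOPs per cycle per core:
# fma_units x lanes x 2 (an FMA counts as a multiply plus an add).
def peak_sp_flops_per_cycle(fma_units, vector_bits):
    lanes = vector_bits // 32  # 32-bit floats per vector register
    return fma_units * lanes * 2

zen_core   = peak_sp_flops_per_cycle(fma_units=2, vector_bits=128)  # 16
intel_core = peak_sp_flops_per_cycle(fma_units=2, vector_bits=256)  # 32
print(zen_core, intel_core)
```

That factor of two per core in 256-bit FMA code is exactly the "50% throughput" gap mentioned earlier in the thread, and it only appears when code actually issues 256-bit FMAs.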


----------



## DaaQ

Quote:


> Originally Posted by *NightAntilli*
> 
> If you want an indication of how 4C/8T performs, you disable 4 cores on the CPU. This will also give a better indication of how 4 cores would overclock etc.
> 
> The low resolution low setting benchmarks are still useless. How do you know you're testing "CPU" limitations rather than say, memory controller limitations? Memory speed/bandwidth limitations? Motherboard limitations?


Maybe PCIe controller?


----------



## looniam

still waiting on the MC's AM4 proc/mobo bundles before that 90 minute drive up to detroit:

http://www.microcenter.com/site/products/amd_bundles.aspx


----------



## sumitlian

Quote:


> Originally Posted by *Artikbot*
> 
> It doesn't seem to me like the lack of 256bit AVX2 registers is a threat to Zen at all, bar for the very odd case (such as this particular revision of Blender Toms happened to test).


Indeed, the installed base of AVX2-capable processors in gaming environments is extremely small. Many thanks to Intel, who launched the Pentium G with HT in 2017 with no AVX at all. Unless Intel or developers want to suddenly kill the new Pentium G's gaming performance (sub-30 fps, lol), nobody is going to require AVX2 in gaming, for the sake of full backward compatibility for all CPU owners.


----------



## DarkRadeon7000

Is the stock cooler decent enough to cool the Ryzen 1700 without being noisy if I don't plan on doing much OCing? Or should I get the Corsair H80i v2?


----------



## JackCY

Source.




I can't help but think the GTA V stats look like CCX thread migration by the scheduler causing a 1.4 ms penalty to render time, or some other thread hackery. IMHO those graphs shouldn't have two spikes like that; something is obviously very wrong. As long as it's truly the same test scene, there shouldn't be two peaks.

There's a hint of two peaks in Crysis 3 as well, but not as pronounced.
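One way to make the "two peaks" observation concrete is to count local maxima in a frametime histogram: a clean run should be unimodal, while CCX migration would show a second mode offset by the migration penalty. A self-contained sketch on made-up bin counts (nothing here is measured; the helper name and numbers are mine).

```python
# Made-up per-bin frame counts from a frametime histogram:
# a main mode and a second, smaller mode several bins later.
counts = [2, 14, 120, 260, 110, 12, 1, 0, 0, 1, 9, 60, 130, 55, 8, 1]

def local_peaks(counts, min_height=50):
    """Indices of bins strictly higher than both neighbours."""
    return [i for i in range(1, len(counts) - 1)
            if counts[i] >= min_height
            and counts[i] > counts[i - 1]
            and counts[i] > counts[i + 1]]

peaks = local_peaks(counts)
print(peaks)  # [3, 12]: a bimodal distribution, as in the GTA V graphs
bin_ms = 0.2  # assumed bin width
print((peaks[1] - peaks[0]) * bin_ms)  # mode separation in ms for this toy data
```

On real data you would feed per-frame render times through a binning step first; the point is just that two distinct modes in the same test scene are detectable and abnormal.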


----------



## Kuivamaa

Quote:


> Originally Posted by *Slink3Slyde*
> 
> So, you're a guy who likes Computerbase.de could you perhaps explain their methodology or why they seem to get much better results with CPU's with much higher thread counts? Its a genuine question, not trying to bait an argument I'm really curious as to why they are getting the results they are vs other sites.


First of all, ComputerBase isn't particularly standing out. If you want to see a site that really "loves" their multithreaded results, look no further than gamegpu.ru. Even today, their stock 5960X leads most CPU benchmarks for whatever reason, even in titles that do not appear well threaded, and even against Skylake CPUs. What sets CB (and Digital Foundry, and a few others like hardware.fr) apart from the rest is the critical approach they take towards the products they test and towards their own results. They will try various angles, revisit old data, question their findings if they see anomalies, and so on. Unlike, say, GamersNexus or gamegpu.ru or PCLab, which will publish whatever they have before even double-checking whether their test board is throttling, for example. PCLab in particular had the habit of publishing tons and tons of benches, even multiplayer ones, from more than a dozen CPUs, in games that had been out only a handful of hours. Naturally you need days for such tests. Unless you simply bench an empty area for 15 seconds and call it a day.


----------



## rage fuury

*DDR4 Memory Scaling on AMD AM4 Platform - The Best Memory Kit For AMD Ryzen CPUs*

http://www.legitreviews.com/ddr4-memory-scaling-amd-am4-platform-best-memory-kit-amd-ryzen-cpus_192259

I think this is relevant for the topic (may be for the OP?!)

Also, the Memory speed for gaming performance @1080p seems to be of capital importance:



Anything under 3000MHz will bottleneck the CPU...
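For context on why those memory clocks matter: theoretical DDR4 bandwidth is the transfer rate times 8 bytes per 64-bit channel, times the channel count. A quick back-of-envelope sketch (the function name is mine; this is peak theoretical bandwidth only, and ignores Ryzen's Infinity Fabric, which also scales with memory clock).

```python
# Peak theoretical DDR4 bandwidth: each transfer moves 8 bytes
# (64-bit bus) per channel; "speed" ratings are mega-transfers/s.
def ddr4_bandwidth_gbs(mt_per_s, channels=2, bus_bytes=8):
    return mt_per_s * bus_bytes * channels / 1000  # GB/s

print(ddr4_bandwidth_gbs(2133))  # 34.128 GB/s dual channel
print(ddr4_bandwidth_gbs(3200))  # 51.2 GB/s dual channel
```

So moving from the DDR4-2133 many launch reviews used up to DDR4-3200 is a ~50% jump in peak bandwidth, which is why the scaling article's gains are plausible.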


----------



## sumitlian

Quote:


> Originally Posted by *rage fuury*
> 
> *DDR4 Memory Scaling on AMD AM4 Platform - The Best Memory Kit For AMD Ryzen CPUs*
> 
> http://www.legitreviews.com/ddr4-memory-scaling-amd-am4-platform-best-memory-kit-amd-ryzen-cpus_192259
> 
> I think this is relevant for the topic (may be for the OP?!)
> 
> Also, the Memory speed for gaming performance @1080p seems to be of capital importance:


>15% more fps!!!
I don't want to start the next hype train, but you mean to say AMD already has a CPU ready for its next launch (Zen+, if it can do 3600+ MHz DDR4) that will directly compete with Coffee Lake, or whatever Intel throws out next with *cough* 5% more IPC *cough*?


----------



## bigjdubb

So that 3466MHz G.Skill Ryzen memory might turn out to be a real winner?


----------



## Slink3Slyde

Quote:


> Originally Posted by *Kuivamaa*
> 
> First of all ,computerbase isn't particularly standing out. If you want to see a site that really "loves" their multithreaded results, look no further than gamegpu.ru.Even today,their stock 5960X leads on most CPU benchmarks for whatever reason,even in titles that do not appear perfectly threaded,even against skylake CPUs. What sets CB (and digital foundry,and a few others like hardware.fr) apart from the rest is the critical approach they take towards products they test and their own results alike. They will try various angles, revisit old data, question their findings if they see anomalies etc etc. Unlike ,say, gamersnexus or gamegpu,ru or pclab which will publish whatever before double checking if their test board is throttling first, for example. Pclab in particular had the habit of publishing tons and tons of benches ,even multiplayer ones from more than a dozen CPUs ,in games that have been out a handful of hours. Naturally you need days for such tests. Unless you simply bench an empty area for 15seconds and call it a day.


That's interesting, I just wish I spoke German well enough to understand the site.

I have to say, though, they are standing out to me







I don't really trust GameGPU either, to be fair, and I've rarely visited the other sites you mentioned. I just checked hardware.fr, and their Ryzen review game results seem more consistent with everywhere except ComputerBase. Unfortunately AnandTech haven't done any gaming benches yet, otherwise I'd be looking at them too, but I usually go to TweakTown, Ars Technica and TechSpot because they at least used to do quite a lot of performance analysis of new games, as well as Digital Foundry and TechPowerUp.

I mean, this could all still be down to the coming optimizations, and all the other stuff that went wrong at launch affecting everyone's Ryzen results for the worse. And even if it is, Ryzen is still definitely a good gaming CPU for almost everyone as it stands, and it will obviously get better as time goes on. But the question remains, to me at least: why is ComputerBase not getting the same results as others?

I'm really interested to see what Digital Foundry comes up with as well; they and TechPowerUp are maybe waiting until the dust settles and everything is optimal.


----------



## moustang

Quote:


> Originally Posted by *NightAntilli*
> 
> 
> 
> Everyone was saying that DX11 BF1 was the way to test for GPUs, because DX12 is inferior in most ways... For some reason, everyone decided to test DX12 for the Ryzen review... We all know that DX12 is closer to the metal, meaning, the programming needs a bit more handholding... And the CPU instructions need to be adequate. Considering the difference between DX11 and DX12 in BF1 for Ryzen, we know that it's not the CPU that is the issue in BF1. It's the programming that is not optimized for Ryzen...
> 
> 
> 
> The same would apply for any DX12 game at the moment.


I wouldn't limit your comments to Ryzen in that example. There's some strange differences between the Intel CPUs as well. Looking at the 720p scores, the 7700k is 11fps faster under DX12, but the 6900k is 21fps slower. The 6850k is 2fps faster in DX12 but the 6950x is 9fps slower. Even among very similar Intel chips the performance is inconsistent.

Basically BF1 is just unreliable for benchmarking. The performance gain/loss is inconsistent, even on chips of the same basic architecture or same generation.


----------



## Majin SSJ Eric

BF1 is unreliable for benchmarking, but GTAV and ROTTR are great for benchmarking!


----------



## Artikbot

Quote:


> Originally Posted by *sumitlian*
> 
> Indeed market of AVX2 based processor for gaming environment is extremely low.
> Many thanks to Intel who launched Pentium G with HT in 2017 with no AVX. Unless Intel or developers want to kill performance of new Pentium G all of a sudden in gaming (sub 30 fps lol), nobody is going to use AVX2 in gaming for the sake of full backward compatibility for all CPU holders.


To make matters even more difficult: only Haswell and above support AVX2. Before that, back to Sandy Bridge, they only supported AVX.

But as you said, it doesn't matter anyway, because Pentiums support neither, lol.


----------



## doritos93

Incredible, these "reviewers" actually managed to drive down AMD's share price.

AMD can't catch a break, really; back from the dead with an extremely competitive product, only to get burned by incompetent reviewers and their rushed, half-baked reviews...


----------



## rage fuury

Quote:


> Originally Posted by *bigjdubb*
> 
> So that 3466mhz GSkill Ryzen memory might turn out to be a real winner?


Quote:


> Originally Posted by *sumitlian*
> 
> >15% more fps !!!
> I don't want to start the next hype train. But you mean to say AMD already has a CPU ready in advance for next launch (Zen+, if it would do 3600+ MHz DDR4) that will directly compete with Coffee Lake or whatever Intel is going to throw next with *cough* 5% more IPC * cough* ?


More like 6-7 fps, lol. They must have taken 2666MHz as the baseline... that was the frequency most reviewers used to benchmark Ryzen.
What can I say, ninja-speed memory kits help... if the motherboard supports them, of course.


----------



## Undervolter

Quote:


> Originally Posted by *sumitlian*
> 
> RyZen can only do 256 bit load / cycle. For store, it is limited to 128 bit / cycle if I've not read that wrong, while Intel do not have that limitation since Sandy Bridge.In RyZen, if 256 bit SIMD operations in an application are mostly about more memory reads than writes then I think there should not be any significant difference between Intel and AMD regarding AVX-256, but if it is vice versa then AMD will be slower.
> Can't say about 512 bit AVX since I don't think there exist any problem domain remaining for gaming applications for 512 bit SIMD, well my knowledge is very limited in this. All I've read is both major consoles do use AVX-128 SIMD (or even 256 bit SIMD).
> 
> I think 6900k/7700k can only improve gaming performance in future if more games will start to use AVX2. But even if that is the case then I believe only PC exclusive titles may see such significant gains (due to superior AVX-256 IPC, not what we are seeing now with non AVX based games), because we know that both PS4 and XB1 has AMD Jaguar based A/CPU, Jaguar does not provide 256-bit vector (AVX-256) processing in a single cycle, it had been designed to do 128 bit vector operation per cycle. Tbh I am technically illiterate when it comes to porting if developers see the potential for performance increase in utilizing wider instructions while porting games to PC since PC has much better vector processor in Intel or they do not because it would only cost time and extra overhead !? Also all CPUs that support AVX also fully support 128 bit AVX instructions that means developers should not need to worry about backward compatibility between different AVX based architecture. The problem comes when they decided to use AVX2-256 (Integer/double instructions, _mm256_xxx_epi32/pd (8 x 32 bit / 4 x64 bit packed, in these instructions AMD has 50% throughput of what Intel can do ) ), but it is highly unlikely for now they would do so because I think market of AVX2 based processors globally still doesn't surpass non AVX2 based processor, at least for now. Finally my speculation is, if RyZen in terms of 128-bit throughput is actually equal to Jaguar and since console games are obviously optimized for Jaguar architecture, it is very likely that performance gap between i76900k / i7 7700k and RyZen, IPC wise, will not increase. Also most still don't realize that higher frame rates with 7700k has more do to with IPS than the IPC, since 7700k overclocks much better.
> 
> Edit: types fixed, pardon me if there is any,also ignore the grammar lol.


About AVX in general: both x264 and x265 use a bit of AVX and more AVX2. And not just that: they are also mostly integer, with a smaller percentage of FPU work, so they are mixed loads. The results vary, but the 7700K isn't saved:

http://www.hardware.fr/articles/956-14/encodage-video-x264-x265.html

http://www.anandtech.com/show/11170/the-amd-zen-and-ryzen-7-review-a-deep-dive-on-1800x-1700x-and-1700/20

Also, here is a free translation I made of a post from a local forum by a user called "*bjt2*", who is also on the AnandTech forums. If you can't understand it well, it's not entirely my fault, as I don't completely understand the terms in the original language either, so don't expect a better job of translating into English:
Quote:


> Dump of 1800X:
> http://users.atw.hu/instlatx64/AuthenticAMD0800F11_K17_Zen_InstLatX64.txt
> 
> Dump of Kabylake:
> http://users.atw.hu/instlatx64/GenuineIntel00906E9_Kabylake_InstLatX64.txt
> 
> To summarize:
> 
> - Zen is faster in rare instructions, mmx/x87 switching, and division, both integer and x87 FP (Zen has double the speed, though not in SSE and AVX divisions, where it's the same); up to 3 times faster in x87 transcendentals and square root; up to 3 times faster in packed SSE(2). Kabylake recovers in memory string instructions (given its L1 bandwidth).
> - Zen is faster in LEA and MOV 14 and 64 bit.
> - Logical AVX vector instructions: 3 per cycle for Kabylake, 4 per cycle for Zen. So all 4 pipelines in Zen can do vecint SSE/AVX (128-bit) instructions.
> - Some packed SSE(2)/AVX(2) instructions have a throughput of 1 on Kabylake and 2 on Zen (e.g. from 1378 to 1407).
> - Notably, from 1610 onward, the FMA3 instructions introduced by Intel have throughput 2 on Kabylake (since it has 2 FP pipelines) and 4 on Zen (since it has 4 FP pipelines).
> - Obviously the situation is reversed for 256-bit FMA3: always 2 per cycle on Intel and 1 per cycle on Zen.
> - RDRAND (random numbers in hardware): Kabylake, 3 times faster.
> - MOV AVX/SSE instructions: more throughput for Zen, obviously, since it has more FP pipelines.
> - BMI(2) bit manipulation: more throughput on Intel.
> - AVX2 at 256 bit: obviously faster on Intel.
> - SHA instructions for cryptography: not present in Kabylake???
> - Other legacy SSE(2) instructions: comparable or faster on Zen.


Bottom line: Zen only has to worry about 256-bit AVX2 and 256-bit FMA3. In AVX it actually appears faster. So, given that many CPUs out there don't even support AVX, let alone AVX2, I don't see this becoming a problem soon, and even when such programs appear, they must make massive use of AVX2 to hinder Zen. Because, like I said, x264 and x265 do make use of it, and Zen, according to who you want to believe, either still outperforms Intel or is very close, but in any case still beats the 7700K. The 7700K will likely still win in cases where it's not yet saturated. Once it's saturated, and if the program uses at least 9 threads on Zen, then Zen will either close the gap or pull ahead, depending on how 256-bit-AVX2-heavy the program (or game) is, since 128-bit doesn't seem to be an issue. Still, if a game saturates the 7700K and spawns more threads on Ryzen, 256-bit AVX2 won't save the 7700K for long. It will save the 6900K, IMHO, because it will be able to use more threads, just like Ryzen, with the difference that it can also use its 256-bit AVX2 advantage.

And in the end, what are you going to do? Write an entire game exclusively to run 100% 256-bit AVX2?


----------



## looniam

Quote:


> Originally Posted by *doritos93*
> 
> Incredible, these "reviewers" actually managed to drive down AMDs share price
> 
> AMD can't catch a break really, back from the dead with an extremely competitive product only to get burned by incompetant reviewers and their rushed, half baked reviews ...


STAHP

To be fair, AMD didn't get anyone a sample until just days before the NDA lifted; I saw tweets from reviewers who still hadn't gotten one in time! You can fault journalistic integrity all you want, but the fact of the matter is that every day a site doesn't have a review up is a day it loses ad revenue.


----------



## Slomo4shO

ROFL


----------



## sumitlian

I wonder why nobody tested more DX12 games, like Forza Motorsport 6, Forza Horizon 3, or Quantum Break in DX12.
Quote:


> Originally Posted by *Artikbot*
> 
> To make matters even more difficult - only Haswell and above support AVX2. Before that and until Sandy Bridge they only supported AVX.
> 
> But as you said it doesn't matter anyway because any Pentiums support neither, lol.


Good for everyone.


----------



## nycgtr

Quote:


> Originally Posted by *AlphaC*
> 
> http://www.microcenter.com/product/476004/Ryzen_7_1700X_34_GHz_8_Core_AM4_Boxed_Processor
> AMD Ryzen 7 1700X 3.4 GHz 8 Core AM4 Boxed Processor
> $399.99 SAVE $50.00 = $349.99
> 
> *So it begins on the "X" CPUs , price drops in less than 1 week.*
> 
> The Ryzen 7 1700X is a tough sell: if overclocking they have more or less the same clockspeed wall as the Ryzen 7 1700. If not manual OCing, the Ryzen 7 1800X boosts higher. The only thing stopping people that don't overclock from jumping to the Ryzen 7 1800X is the price differential.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The clockspeeds for non overclockers looking for ~ $300 CPUs with more than 8 threads seems alright


This is not fair to say. I have bought maybe 30 CPUs from Microcenter in total, if not more. In my history of shopping there, Microcenter *HAS ALWAYS* charged close to cost rather than MSRP. When I saw their Ryzen prices I was shocked to see they matched everywhere else. I expected the 1800X and 1700X to be $50 less, and maybe $30 less on the 1700. This "price drop" is nothing more than a return to their original pricing method. For example, I paid $318 or $328 and $599 for a 5820K and 5930K at X99 launch, and $199 for a 6600K at launch. This is just how they price things.


----------



## madweazl

If the 1700 drops below $300 I'm gonna be forced to go pick one up


----------



## kaseki

There was a question about Linux benchmarks and a comment disparaging Linux games a few pages back that I didn't see answered. For Linux benchmarks, including of games, visit Phoronix.com.


----------



## NightAntilli

Quote:


> Originally Posted by *moustang*
> 
> I wouldn't limit your comments to Ryzen in that example. There's some strange differences between the Intel CPUs as well. Looking at the 720p scores, the 7700k is 11fps faster under DX12, but the 6900k is 21fps slower. The 6850k is 2fps faster in DX12 but the 6950x is 9fps slower. Even among very similar Intel chips the performance is inconsistent.
> 
> Basically BF1 is just unreliable for benchmarking. The performance gain/loss is inconsistent, even on chips of the same basic architecture or same generation.


Let's analyze the data shall we...

720p framerate BF1 DX11/DX12
7700k (4/8) 116.4/127.6; 11.2 more (+9.6%)
6850k (6/12) 120.9/122; 1.1 more (+0.9%)
6900k (8/16) 143.8/122.4; 21.4 less (-14.9%)
6950x (10/20) 129.6/120.9; 8.7 less (-6.7%)

1080p framerate BF1 DX11/DX12
7700k 116.2/120.4; 4.2 more (+3.6%)
6850k 120.5/121.5; 1 more (+0.8%)
6900k 136.5/117.3; 19.2 less (-14.1%)
6950x 127.1/109.7; 17.4 less (-13.7%)

I see nothing really inconsistent in the above... The 4C/8T CPU gains performance when jumping to DX12, indicating it is likely already bottlenecking in DX11. The 6C/12T CPU is practically the same between the APIs... For any more cores/threads, performance decreases... The exact reason? Don't know. Likely too many threads are causing higher latency, or there are issues with fences, decreasing framerate... In any case, the loss is seen at both resolutions. Why it's bigger for the 6950X at 1080p compared to 720p, I don't know. Likely an optimization issue. Aside from this, at 720p the gain is greater for the 7700K than at 1080p, as expected... But then, look at the Ryzen benchmarks...

Ryzen (8/16):
720p framerate BF1 DX11/DX12
122.4/90.7 31.7 less (-25.9%)
1080p framerate BF1 DX11/DX12
121.8/89.2 32.6 less (-26.8%)

Ryzen loses at least 26%-ish simply by jumping between DX11 and DX12. You cannot honestly tell me that this does not indicate a lack of optimization. Especially considering that at 1080p, where the impact should be less since we're less CPU bound, the loss is actually slightly greater than at 720p... None of the bunch of DX12 benchmarks we have seen are at all representative of CPU performance for Ryzen. That is a fact.

Now... What is the following indicating...?

From 720p to 1080p DX11
7700k 116.4/116.2; 0.2 less (-0.2%)
6850k 120.9/120.5; 0.4 less (-0.3%)
6900k 143.8/136.5; 7.3 less (-5.1%)
6950x 129.6/127.1; 2.5 less (-1.9%)
Ryzen 122.4/121.8; 0.6 less (-0.5%)

It indicates that under DX11, all except the 6900k and the 6950x are already CPU limited at 1080p, considering a lower resolution does not increase framerate significantly. Where is the limitation? It can't be single threaded performance of the CPUs, since the 7700k has the highest of all of them, yet has the lowest framerate of all of them. You tell me. What is causing it?

And here...;

From 720p to 1080p DX12
7700k 127.6/120.4; 7.2 less (-5.6%)
6850k 122/121.5; 0.5 less (-0.4%)
6900k 122.4/117.3; 5.1 less (-4.2%)
6950x 120.9/109.7; 11.2 less (-9.3%)
Ryzen 90.7/89.2; 1.5 less (-1.7%)

Under DX12, the 7700k is no longer the bottleneck at 1080p. Same CPU, same game, same OS, different API. And now, the 7700k under DX12 is faster than Ryzen under DX11, at 720p. But at 1080p, suddenly Ryzen is faster under DX11, than the 7700k at DX12. What gives? Care to explain?

Has it become clearer now, why using low settings & low resolution no longer is a good indication of CPU performance? There are too many other variables to consider right now. It is no longer as simple as in the past.
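The deltas in the tables above are straightforward to reproduce from the quoted DX11/DX12 averages. A small sketch (the numbers are copied from the post; the helper name is mine), which recovers the same percentages, e.g. +9.6% for the 7700K and -25.9% for Ryzen at 720p.

```python
# 720p BF1 averages quoted above: (DX11 fps, DX12 fps).
fps_720p = {
    "7700K": (116.4, 127.6),
    "6850K": (120.9, 122.0),
    "6900K": (143.8, 122.4),
    "6950X": (129.6, 120.9),
    "Ryzen": (122.4, 90.7),
}

def dx12_delta_pct(dx11, dx12):
    # Positive means DX12 is faster than DX11 on the same CPU.
    return round((dx12 - dx11) / dx11 * 100, 1)

for cpu, (dx11, dx12) in fps_720p.items():
    print(cpu, dx12_delta_pct(dx11, dx12))
```

Running it gives 9.6, 0.9, -14.9, -6.7, and -25.9, matching the figures in the post.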


----------



## AlphaC

http://www.gamersnexus.net/hwreviews/2827-amd-r7-1700-review-amd-competes-with-its-1800x

Gamers Nexus suggesting R7 1700 over Ryzen 7 1800X
Quote:


> This further reinforces our stance on the 1800X: You are far better off buying an R7 1700 for $330, applying a 5-minute overclock and a half-decent cooler, and netting a chip that outperforms a stock 1800X, or performs equally to an overclocked version. There is no reason to purchase an 1800X if you are OK with the idea of applying an overclock. Now, we'd normally assume that most folks aren't overclocking - see: "just want the best" people who buy a 7700K - but it's a few minutes of work and grants performance that minimally equals the R7 1800X.
> 
> The one point of hesitation here is on the future binning of these CPUs. We're not sure if the early 1700 models will OC higher, or if they're modified 1800X parts that retain the same headroom. Our sample overclocks to 3.9-4.0GHz, depending on workload, and can push memory to 2933 in those overclocked cases. We know some other folks were able to achieve similar overclocks on their review sample R7 1700 CPUs.
> 
> We're still waiting on our purchased 1700 to come in so that we can review the stock cooler, but overall, thermals are significantly reduced over the 1800X. The performance at the price point, when considering a light overclock, is a far better argument in terms of price than the 1800X. Even for production workloads, the overclocked 1700 produces equal performance to the overclocked 1800X.
> 
> So again, we strongly advise against the 1800X as a CPU for a gaming machine. If you're doing zero production, you're not doing any content creation, then you're still generally getting a better deal with an i5 or i7 CPU. For folks who are combining content creation (similar to what we do) with gaming, or may be considering streaming, the R7 1700 is actually a viable chip - and far more so than the 1800X.
> 
> AMD is its own best competition at the price point, when considering those use cases.
> 
> We still have to look at the 1700X, but based on initial results, the R7 1700 looks like the hero of AMD's initial lineup. It is far easier to argue this CPU, just know that you're still limited in terms of gaming performance at the cost. Rendering workloads are far boosted over equivalently priced Intel CPUs. We can stand behind the R7 1700 under the right usage conditions - just figure out if you fit into those conditions.


Not surprised.

MORE IMPORTANTLY:
Quote:


> Originally Posted by *https://siliconlottery.com/collections/frontpage/products/1700a39g*
> As of 3/6/17, the top 70% of 1700s were able to hit 3.9GHz or greater.
> 
> Passed the ROG RealBench stress test for one hour with these settings:
> 
> 39x CPU Multiplier
> 1.408V CPU VCORE (Or less)
> LLC Level 3 (Asus Crosshair VI Hero)


----------



## DaaQ

Quote:


> Originally Posted by *AlphaC*
> 
> 
> 
> 
> http://www.gamersnexus.net/hwreviews/2827-amd-r7-1700-review-amd-competes-with-its-1800x
> 
> Gamers Nexus suggesting R7 1700 over Ryzen 7 1800X
> Not surprised.
> 
> MORE IMPORTANTLY:


Which was always the case with overclocking until the creation of the K series. (For the younger crowd who may not have experienced the time before the K series.)


----------



## Undervolter

Quote:


> Originally Posted by *AlphaC*
> 
> 
> 
> For folks who are combining content creation (similar to what we do) with gaming, or may be considering streaming, the R7 1700 *is actually a viable chip* - and far more so than the 1800X.


I wonder. If this were a $330 Intel chip with the same capabilities, would he say the same? Or would it be more like "the R7 1700 is an amazing chip, which practically gives you 6900K capabilities for 1/3 of the price, and you just can't beat that"?


----------



## DADDYDC650

$309 for the Ryzen 1700... Ordered one. I get 10 percent cash back, which brings the total down to $279. Wow!









Check your email for the eBay 10 percent cash back offer. It expires on March 9th.

http://www.ebay.com/itm/New-AMD-Ryzen-7-1700-8-Core-3-0GHz-Desktop-Processor-AM4-65W-YD1700BBAEBOX-/351996487464


----------



## sumitlian

Quote:


> Originally Posted by *Undervolter*
> 
> About AVX in general. Both x264 and x265 use a bit of AVX and more of AVX2. But not just that. They use also mostly integer, but in lesser percentage also FPU. So they are kind of mixed loads. The results, vary, but 7700K isn't saved:
> 
> http://www.hardware.fr/articles/956-14/encodage-video-x264-x265.html
> 
> http://www.anandtech.com/show/11170/the-amd-zen-and-ryzen-7-review-a-deep-dive-on-1800x-1700x-and-1700/20
> 
> Also, this is free translation i made, from a post from a local forum, from a guy called "*bjt2*", who is also in anandtech forum: if you can't understand it well, it's not entirely my fault, as i don't completely understand the terms in the native language, so don't expect me to do a better job translating in english:
> Bottom line: Zen only has to worry about 256-bit AVX2 and 256-bit FMA3. In AVX it actually appears faster. So, given that many CPUs out there don't even support AVX, let alone AVX2, I don't see this becoming a problem soon, and even if such cases appear, a program must make massive use of AVX2 to hinder Zen. Because like I said, x264 and x265 do make use of it, but Zen, according to whom you want to believe, either still outperforms Intel or is very close, and in any case still beats the 7700K. The 7700K will likely still win in cases where it's not yet saturated. Once it is saturated, and if the program uses at least 9 threads on Zen, then Zen will either close the gap or pull ahead, depending on how heavy the program (or game) is on 256-bit AVX2. Because 128-bit doesn't seem to be an issue. Still, if a game saturates the 7700K and spawns more threads for Ryzen, 256-bit AVX won't save the 7700K for long. It will save the 6900K, IMHO, because it will be able to use more threads as well, just like Ryzen, with the difference that it will also be able to use its 256-bit AVX2 advantage.
> 
> And at the end, what are you going to do? Write an entire game exclusively to run 100% AVX2 256bit?


+1 !
Holy bell !







Zen is actually much, much faster than even Kaby Lake in most instructions on xmm registers. SSE2 is almost 2x faster on Zen in nearly every instruction, as are many SSSE3 and SSE4.1 instructions. And damn, that x87 performance. (I hope Bethesda is making one more x87-based game, lol. Unlikely in the future... but wait a sec, why has nobody tested Skyrim yet???) No doubt AMD is king of performance in 128-bit registers (both memory-to-register and vice versa). Also, I saw that in many scalar instructions, 64-bit transfer is faster too.

Yes, you are right: x264/x265 use 256-bit AVX. But I've read somewhere in here that x265 is the one that gains significant benefit from AVX2-exclusive instructions. Though Intel also provides a dedicated hardware encoder/decoder for H.265 in Kaby Lake, if AMD comes with such support in their upcoming APUs this shouldn't be that big a problem. Until then, Polaris and Maxwell are doing great, I think.

One thing I am greatly wondering about: if Toms used an AVX-256-optimized Blender build where Ryzen is noticeably losing, then shouldn't there be a possibility that Ryzen could still do much better if Blender's source code were modified to work on xmm registers with similar instructions that are available on ymm? Of course they might have to change the current algorithm for that, since they would have to work on 2 x DWORD. But there is a chance, isn't there?
And if that is the case, almost every workload should have a chance of running competitively on Ryzen as well if it is recompiled with Ryzen in mind, unless the program has been specifically written with AVX2 instructions, which is another case where no one can do anything. lol, I don't know why I am thinking so much. But tbh I am literally fascinated by the performance of most SSE/AVX-128 instructions on Ryzen. Also, memory-dependent instructions are likely to run even faster with higher-speed RAM; AMD literally has the chance to make another great CPU (probably Zen+) if they can somehow provide support for 4000+ MHz DDR4 RAM.









P.S. Thanks to computers my basic assembly programming knowledge helped me in here.

Edit: *But seriously







why has nobody shown us the Skyrim performance of RyZen ?*
[ Conspiracy theory mode activated ]
Did Intel.hmmm do.......nothing ? Anyways forget it !


----------



## HeliXpc

Ok, here is my status on the 1700X I own, and some insight into this whole Ryzen thing after two full days with it. Do I like it? Yes and no.

I'm sitting at 4.1GHz with 1.450V, which does great in the games I have tested. I am only using a GTX 1060 6GB for the moment; I will be picking up a 1080 Ti or Vega soon.

Memory and temps are the biggest issues with this platform. RAM for me is not stable at anything above 2133MHz, even at stock clocks on the CPU; just adjusting the memory frequency crashes the motherboard, and I have to do a BIOS reset to bring it back to life. My board (*Asus Crosshair VI*) also will not set the timings properly; it chooses its own timings no matter what I input into the memory timing settings. Anyone else experience this with the Crosshair VI? I tried two different kits and both have the same issue: anything above 2133 is not stable, and timings set in the BIOS will not stick after a reboot. I think this is the main reason my RAM is not stable.

Temps: expect idle temps in the 50s and load temps in the 70s and 80s. I am using the Swiftech H320 Prestige, a good AIO liquid cooler. I am on BIOS 5704; I will try the new unofficial beta 5803 tonight and see what happens.


----------



## NightAntilli

Quote:


> Originally Posted by *HeliXpc*
> 
> Ok, here is my status on the 1700x I own and some insight into this whole Ryzen thing after 2 full days with it. Do I like it? Yes and No.
> 
> Im sitting at 4.1ghz with 1.450v which does great in the games I have tested, I am only using a gtx 1060 6gb for the moment, will be picking up the 1080ti or Vega soon.
> 
> Memory, and temps are the biggest issue with this platform. Ram for me is not stable anything above 2133mhz, even at stock clocks on the cpu, only adjusting memory frequency crashes the motherboard and I have to do a bios reset to bring it back to life, my board (*asus crosshair vi*) also will not set the timings properly, it chooses its own timings no matter what I input into the memory timing setting, anyone else experience this with the crosshair 6? tried 2 different kits and both of them have the same issue, anything above 2133 is not stable and timings in the bios will not stick after reboot. I think this is the main reason why my ram is not stable.
> 
> Temps, expect Idle temps in the 50s and load in the 70s and 80s, I am using the Swiftech H320 prestige, a good AIO liquid cooler. I am using bios 5704, will try the new unofficial beta 5803 tonight and see what happens.


Asus is known to have problems... Wait for BIOS updates in the following month or two.


----------



## Slomo4shO

Quote:


> Originally Posted by *AlphaC*
> 
> MORE IMPORTANTLY:


The chips definitely are binned, and the higher-end chips do seem to require less voltage for equivalent frequencies; the question is whether it is worthwhile to pay extra for a 100-200MHz frequency boost.
Quote:


> 97% of 1700s were able to hit 3.8GHz or greater.
> 1.376V CPU VCORE (Or less)
> 
> 70% of 1700s were able to hit 3.9GHz or greater.
> 1.408V CPU VCORE (Or less)
> 
> 23% of 1700s were able to hit 4.0GHz or greater.
> 1.44V CPU VCORE (Or less)


Compared to:
Quote:


> 100% of 1700Xs were able to hit 3.8GHz or greater.
> 1.36V CPU VCORE (Or less)
> 
> 77% of 1700Xs were able to hit 3.9GHz or greater.
> 1.392V CPU VCORE (Or less)
> 
> 33% of 1700Xs were able to hit 4.0GHz or greater.
> 1.424V CPU VCORE (Or less)
> 
> 97% of 1800Xs were able to hit 3.9GHz or greater.
> 1.376V CPU VCORE (Or less)
> 
> 67% of 1800Xs were able to hit 4.0GHz or greater.
> 1.408V CPU VCORE (Or less)
> 
> 20% of 1800Xs were able to hit 4.1GHz or greater.
> 1.44V CPU VCORE (Or less)
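Silicon Lottery's bins above double as a rough cost-of-frequency table; a quick sketch (numbers copied from the quotes, with the mV step simply being the difference between the Vcore ceilings of adjacent bins):

```python
# Silicon Lottery R7 1700 bins quoted above: (GHz, max Vcore, % of chips reaching it).
bins_1700 = [(3.8, 1.376, 97), (3.9, 1.408, 70), (4.0, 1.440, 23)]

for (f0, v0, p0), (f1, v1, p1) in zip(bins_1700, bins_1700[1:]):
    step_mv = (v1 - v0) * 1000
    # Each extra 100MHz costs ~32mV of Vcore headroom and cuts the yield sharply.
    print(f"{f0} -> {f1} GHz: +{step_mv:.0f} mV Vcore, yield {p0}% -> {p1}%")
```

The same ~32mV-per-100MHz step shows up in the 1700X and 1800X bins, which is what you'd expect if all three SKUs are cut from the same silicon.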


----------



## sumitlian

Quote:


> Originally Posted by *HeliXpc*
> 
> Ok, here is my status on the 1700x I own and some insight into this whole Ryzen thing after 2 full days with it. Do I like it? Yes and No.
> 
> Im sitting at 4.1ghz with 1.450v which does great in the games I have tested, I am only using a gtx 1060 6gb for the moment, will be picking up the 1080ti or Vega soon.
> 
> Memory, and temps are the biggest issue with this platform. Ram for me is not stable anything above 2133mhz, even at stock clocks on the cpu, only adjusting memory frequency crashes the motherboard and I have to do a bios reset to bring it back to life, my board (*asus crosshair vi*) also will not set the timings properly, it chooses its own timings no matter what I input into the memory timing setting, anyone else experience this with the crosshair 6? tried 2 different kits and both of them have the same issue, anything above 2133 is not stable and timings in the bios will not stick after reboot. I think this is the main reason why my ram is not stable.
> 
> Temps, expect Idle temps in the 50s and load in the 70s and 80s, I am using the Swiftech H320 prestige, a good AIO liquid cooler. I am using bios 5704, will try the new unofficial beta 5803 tonight and see what happens.


Thanks for the update. I hope everything gets normal in couple of months.

Also, keep in touch in here if you haven't done so.
https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/


----------



## bigjdubb

Quote:


> Originally Posted by *HeliXpc*
> 
> Ok, here is my status on the 1700x I own and some insight into this whole Ryzen thing after 2 full days with it. Do I like it? Yes and No.
> 
> Im sitting at 4.1ghz with 1.450v which does great in the games I have tested, I am only using a gtx 1060 6gb for the moment, will be picking up the 1080ti or Vega soon.
> 
> Memory, and temps are the biggest issue with this platform. Ram for me is not stable anything above 2133mhz, even at stock clocks on the cpu, only adjusting memory frequency crashes the motherboard and I have to do a bios reset to bring it back to life, my board (*asus crosshair vi*) also will not set the timings properly, it chooses its own timings no matter what I input into the memory timing setting, anyone else experience this with the crosshair 6? tried 2 different kits and both of them have the same issue, anything above 2133 is not stable and timings in the bios will not stick after reboot. I think this is the main reason why my ram is not stable.
> 
> *Temps, expect Idle temps in the 50s and load in the 70s and 80s, I am using the Swiftech H320 prestige, a good AIO liquid cooler.* I am using bios 5704, will try the new unofficial beta 5803 tonight and see what happens.


Wow, so even a proper radiator and block has trouble keeping the temps in the 60s :/ I was hoping the AIOs the review sites have been using were just not quite enough to get the job done.


----------



## amstech

The Ryzen launch is good for us all; the 6800K is now going for $380 at some places, and I hear people can hit 5.0GHz with it.
Thank you AMD, I've been wanting that chip forever.


----------



## sumitlian

Quote:


> Originally Posted by *amstech*
> 
> The Ryzen launch is good for us all, the 6800K is now going for $380 at some places and I here people can hit 5.0GHz with it.
> Thank you AMD.


Good for everyone.










Spoiler: Warning: Spoiler!



Meanwhile Intel Engineers are probably having diarrhea, since almost no one is going to buy another $500+ CPU from them again.


----------



## SoloCamo

Quote:


> Originally Posted by *amstech*
> 
> The Ryzen launch is good for us all, the 6800K is now going for $380 at some places and I here people can hit 5.0GHz with it.
> Thank you AMD, I've been wanting that chip forever.


Sarcasm? The 6800k is not exactly a good overclocker in comparison to Kaby at all. Think most tap out at 4.3-4.4.


----------



## DADDYDC650

Quote:


> Originally Posted by *amstech*
> 
> The Ryzen launch is good for us all, the 6800K is now going for $380 at some places and I here people can hit 5.0GHz with it.
> Thank you AMD, I've been wanting that chip forever.


Can't get close to 5Ghz with a 6800k.


----------



## HeliXpc

Quote:


> Originally Posted by *bigjdubb*
> 
> Wow, so even a proper radiator and block has trouble keeping the temps in the 60's :/ I was hoping the AIO's the review sites have been using were just not quite enough to get the job done.


Yup, and the coolant has been replaced with some high-quality coolant. It does keep the CPU stable at 4.1GHz, so I'm happy about that, but the reported idle temps seem very high. I can touch the back of the motherboard and it feels cool, not 60C; even under load the radiator does not spit out hot air, unlike other AIO coolers, which warm up their radiators quickly under load. This one is still giving me cool air, so I'm not sure what's up with the temps.


----------



## budgetgamer120

Quote:


> Originally Posted by *amstech*
> 
> The Ryzen launch is good for us all, the 6800K is now going for $380 at some places and I here people can hit 5.0GHz with it.
> Thank you AMD, I've been wanting that chip forever.


Was there another 6800k released I know nothing about?


----------



## bfromcolo

Quote:


> Originally Posted by *HeliXpc*
> 
> Yup, the coolant has also been replaced with some high quality coolant. It does keep the cpu stable at 4.1ghz, so Im happy about that but the reported temps for idle seem very high, i can touch the back of motherboard and seems cool, not 60c, even under load the radiator does not spit out hot air, unlike the other AIO cooler which warm up the radiators quickly under load. This is still giving me cool air so Im not sure whats up with the temps.


I have a similar issue with my 5820K: temps in the 60s when stress testing, but the radiator never feels warm.


----------



## aDyerSituation

Quote:


> Originally Posted by *SoloCamo*
> 
> Sarcasm? The 6800k is not exactly a good overclocker in comparison to Kaby at all. Think most tap out at 4.3-4.4.


I see them hit 4.6 more often nowadays, but around 4.4 is more the norm.


----------



## daviejams

Quote:


> Originally Posted by *HeliXpc*
> 
> Yup, the coolant has also been replaced with some high quality coolant. It does keep the cpu stable at 4.1ghz, so Im happy about that but the reported temps for idle seem very high, i can touch the back of motherboard and seems cool, not 60c, even under load the radiator does not spit out hot air, unlike the other AIO cooler which warm up the radiators quickly under load. This is still giving me cool air so Im not sure whats up with the temps.


I bet it's not reading proper temps and an update will fix it


----------



## sumitlian

Quote:


> Originally Posted by *budgetgamer120*
> 
> Was there another 6800k released I know nothing about?


Hexa core.

It is probably meh for you.


----------



## amstech

Quote:


> Originally Posted by *DADDYDC650*
> 
> Can't get close to 5Ghz with a 6800k.


Damnit no!
I thought it was possible, swear I saw a few users with a Hydro H50 getting 4.6-5.0.


----------



## budgetgamer120

Quote:


> Originally Posted by *sumitlian*
> 
> Hexa core.
> 
> It is probably meh for you.


I am using a hexacore xeon.


----------



## sumitlian

Quote:


> Originally Posted by *budgetgamer120*
> 
> I am using a hexacore xeon.


This is why I said that.


----------



## HeliXpc

Quote:


> Originally Posted by *daviejams*
> 
> I bet it's not reading proper temps and an update will fix it


This is a good product by AMD and a decent start. BIOS updates and some patches will smooth things out in the upcoming months, and by summer or fall everything should be ironed out. Imagine what Zen 2 will be like in a couple of years. Either way I am happy, and we the consumers win here: prices will come down for Intel, and it will push them to make better products. AMD is once again pushing development and technology forward, just like they did with Mantle, which pushed MS to release DX12, and Vulkan as well; both borrow from Mantle. People who talk bad about AMD are clueless.







Gotta give it up for AMD here: good products at good prices, pushing technology forward. A HUGE thumbs up for the people at AMD.


----------



## DADDYDC650

Quote:


> Originally Posted by *amstech*
> 
> Damnit no!
> I thought it was possible, swear I saw a few users with a Hydro H50 getting 4.6-5.0.


4.5 fully stable with air/water is a gold chip.


----------



## budgetgamer120

Quote:


> Originally Posted by *sumitlian*
> 
> This is why I said that.


Oh









Well I will be joining the 8 or more core club one of these days


----------



## criminal

Quote:


> Originally Posted by *amstech*
> 
> The Ryzen launch is good for us all, the 6800K is now going for $380 at some places and I here people can hit 5.0GHz with it.
> Thank you AMD, I've been wanting that chip forever.


5GHz.... lol. Please show me!
Quote:


> Originally Posted by *SoloCamo*
> 
> Sarcasm? The 6800k is not exactly a good overclocker in comparison to Kaby at all. Think most tap out at 4.3-4.4.


Exactly.


----------



## pez

Quote:


> Originally Posted by *unkletom*
> 
> Building a 120/144Hz gaming setup with Ryzen would absolutely be suicide. You can't have your CPU bottleneck your gaming experience in this setup, and preferably you want high-speed RAM like 3866/4000MHz.
> 
> To my surprise I've seen a couple Ryzen builds with [email protected] But hey it's not my money.


Quote:


> Originally Posted by *umeng2002*
> 
> I would imagine turn down a few sliders in a game will get you 120 or 144 Hz at 1080p no matter what CPU you have.


Quote:


> Originally Posted by *Charcharo*
> 
> The way the wind is blowing with CPUs and games, it sounds like a fine 144hz long term build.
> 
> As for my friend that wants a 144 FPS experience in ROTTR... I will honestly school him on how naive it is to build a PC for just ONE game and scenario, followed by me scolding him to at least want to play a good game and not ROTTR
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If he sees the light... or even if he doesn't... I will help him of course. Either an OCed 7700K or a 1700 (if he wants long term).


It was a really specific example to limit confusion and prove a point, but I see that the confusion still ensues for the targeted individual.

Also, building a PC for 120/144Hz for a single game is more common than some of you may think (e.g. Overwatch and CS:GO).


----------



## JoeChamberlain

Nice one AMD, it's about time.


----------



## doritos93

Quote:


> Originally Posted by *looniam*
> 
> STAHP
> 
> to be fair AMD didn't get anyone a sample until just days before the NDA lifted. i saw tweets of reviewers who still didn't get one in time! you can fault journalist integrity all you want but the fact of the matter is every day the site doesn't have a review is a day they lose ad revenue.


Haha very funny !

So what's the bigger problem: the fact that they supposedly didn't have time, or that they're incompetent and motivated solely by money? Can you fix incompetence with more time, maybe?

Proper reviewers are few and far between these days; Ryzen's launch has made that abundantly clear. I too thought GN were reputable, but it just seems like they're too big for their britches; their "review explained" video and review were cringe-worthy.


----------



## Artikbot

Quote:


> Originally Posted by *Slomo4shO*
> 
> Chips definitely are binned and the higher end chips do seem to require less voltage for equivalent frequencies, the question is whether it is worthwhile to pay for the extra for 100-200MHz frequency boost.
> Compared to:


The gist of it is, you pay nearly 40% more for a guarantee that you'll hit 5% higher clock speeds.

The way I see it, it's pretty poor value if you want to overclock.

Now if you don't want to... the 1700X provides ~12% more performance than the 1700, plus XFR, at 20% more money, which is far more reasonable.
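For what it's worth, the value math can be sketched against launch MSRPs ($329 / $399 / $499 for the 1700 / 1700X / 1800X — assumed figures, street prices vary) and stock base clocks:

```python
# Launch MSRPs and stock base clocks (assumed; adjust for your market and for boost/XFR).
chips = {"1700": (329, 3.0), "1700X": (399, 3.4), "1800X": (499, 3.6)}

ref_price, ref_clock = chips["1700"]
for name, (price, clock) in chips.items():
    price_up = (price / ref_price - 1) * 100
    clock_up = (clock / ref_clock - 1) * 100
    print(f"{name}: +{price_up:.0f}% price for +{clock_up:.0f}% base clock over the 1700")
```

Once all three SKUs overclock into the same 3.9-4.0GHz band, those price premiums buy almost nothing, which is the point being made above.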


----------



## bigjdubb

Quote:


> Originally Posted by *doritos93*
> 
> Haha very funny !
> 
> so what's a bigger problem, the fact they supposedly didn't have time, or that they're incompetant and motivated solely by money? Can you fix incompetence with more time maybe?
> 
> Proper reviewers are few and far between these days, Ryzens launch has made that abundantly clear. I too thought GN were reputable, but it just seems like their too big for their britches, cringe worthy, their "review explained" video and review was


What was done that makes you believe the reviewers are incompetent? This thread has moved too fast for me to be able to keep up with each thread of conversation.


----------



## Jpmboy

Here's one issue with many benchmarks on the ryzen platform:

http://hwbot.org/newsflash/4335_ryzen_platform_affected_by_rtc_bias_w88.110_not_allowed_on_select_benchmarks

>5GHz LN2 tho... http://hwbot.org/submission/3473862_elmor_cinebench___r15_ryzen_7_1800x_2454_cb


----------



## looniam

Quote:


> Originally Posted by *doritos93*
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> STAHP
> 
> to be fair AMD didn't get anyone a sample until just days before the NDA lifted. i saw tweets of reviewers who still didn't get one in time! you can fault journalist integrity all you want but the fact of the matter is every day the site doesn't have a review is a day they lose ad revenue.
> 
> 
> 
> Haha very funny !
> 
> so what's a bigger problem, the fact they supposedly didn't have time, or that they're incompetant and motivated solely by money? Can you fix incompetence with more time maybe?
> 
> Proper reviewers are few and far between these days, Ryzens launch has made that abundantly clear. *I too thought GN were reputable*, but it just seems like their too big for their britches, cringe worthy, their "review explained" video and review was
Click to expand...

well there is your first problem, thinking GN was reputable.









i don't know about any reputation but they, or steve to be specific, is a very immature reviewer. to publicly disclose private conversations/emails from a manufacturer will not endear him to them and any other manufacturer in the future, *esp when the only reason was to throw AMD under the bus to "save his reputation."* and to add that such conversations are par for the course:


Spoiler: Warning: Spoiler!







but wait there's MORE!

then follow up using a marketing slide from RTG to validate his 1080 benchmarks.
http://www.gamersnexus.net/hwreviews/2827-amd-r7-1700-review-amd-competes-with-its-1800x


Spoiler: Warning: Spoiler!







(E: that pretty much says it all.)

there is no supposedly about reviewers getting late samples:


Spoiler: Warning: Spoiler!









*i certainly hope for everyone's sake that AMD is a lot more expedient getting samples to the several hundred developers to fix issues.
*
but if you want to call someone incompetent, how about looking at the mobo vendors? was it their incompetence that none had a "fully functioning" bios upon release?
no?
really?

bottom line is, _if anyone was incompetent_; there is enough to go around and blame EVERYONE; reviewers, mobo manufacturers and AMD.


----------



## LesPaulLover

An R7 1800x running as 4c/8t (one CCX disabled) @ 4.0GHz vs an Intel 7700k clocked @ 4.0GHz.......

http://www.zolkorn.com/reviews/amd-ryzen-7-1800x-vs-intel-core-i7-7700k-mhz-by-mhz-core-by-core/


----------



## BinaryDemon

Honestly, with how many issues the Ryzen platform is having right now, it shouldn't be surprising that the results and conclusions vary widely.

- The Windows scheduler needs to be updated to utilize the cores correctly.
- The Windows RTC sleep bug.
- Motherboard manufacturers' hit-or-miss DDR4 support for anything above 2133MHz.
- Many benchmark/system utilities were initially reading temps or memory latency incorrectly.

I like to imagine how solid a platform Ryzen will be once these issues are addressed.


----------



## budgetgamer120

Quote:


> Originally Posted by *LesPaulLover*
> 
> An R7 1800x running as 4c/8t (one CCX disabled) @ 4.0GHz vs an Intel 7700k clocked @ 4.0GHz.......
> 
> http://www.zolkorn.com/reviews/amd-ryzen-7-1800x-vs-intel-core-i7-7700k-mhz-by-mhz-core-by-core/


Interesting


----------



## JackCY

Quote:


> Originally Posted by *BinaryDemon*
> 
> Honestly with how many issues the Ryzen platform is having right now, it shouldnt be surprising that the results and conclusions vary widely.
> 
> - Windows Scheduler needs to be updated to utilize cores correctly.
> - Windows RTC sleep bug.
> - Motherboard manufacturers hit or miss DDR4 support for anything > 2133 mhz
> - Many benchmark / system utilities were reading temps or memory latency incorrect initially.
> 
> I like to imagine how solid a platform Ryzen will be once these issues are addressed.


It's not the Ryzen platform having most of these issues; they come down to Microsoft's incompetence when it comes to updating their OS. I bet other OSes work fine, and as far as I know the Ryzen updates have already been pushed into those.
M$ also delayed or canceled the February update, no? As such, the needed update for Ryzen may not have been ready, or they just want to stall; it was about the first delayed update since they started that schedule. But then, it is also the first major AMD release in a long, long time. On the other hand, if M$ had written the scheduler and other things right and weren't treating users like sheep, it should have been adjustable on the fly by users; no need for some crazy mega-update to fix a scheduler, etc. Lots of the plumbing is already in Windows: one can make two 4-core CPUs out of the 8-core Ryzen, but that's subpar. Maybe they really didn't add enough flexibility to the CPU mapping and scheduler.








Broken RTC and sleep has been a constant issue with Windows 8 and newer.
The mobos are the fault of the mobo makers, although AMD may not have given them much time. But still, how much time do you need, and what specs? The socket was known forever, but the software part seems to have been delayed, which sucks. No idea why they are so late with UEFIs, and the ones released are so basic it makes one cry: you can't even adjust RAM properly, it's all locked to automatic beyond the major timings, and it doesn't always seem to work right.

Temperature... that one falls on AMD for not disclosing the formula needed to convert the sensor value.
Latency... AIDA64 is out of date, and they didn't get a sample before launch.

Most of it is really Windows issues and a lack of well-working mobos.
Sometimes it's not much different with new Intel platforms either.
At least AM4 should stay a while unlike Intel's yearly harvest.
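On the temperature point: AMD did later publish the missing piece. The X-SKU Ryzen 7 parts report tCtl with a +20°C offset over the real die temperature (tDie), while the plain 1700 reports with no offset. A minimal sketch of the correction:

```python
# tCtl offset per SKU (AMD-disclosed: +20C on the 1700X/1800X, none on the 1700).
TCTL_OFFSET_C = {"1700": 0, "1700X": 20, "1800X": 20}

def tdie_from_tctl(tctl_c: float, sku: str) -> float:
    """Convert the reported tCtl value to the actual die temperature."""
    return tctl_c - TCTL_OFFSET_C[sku]

# An idle tCtl reading of 55C on a 1700X is really a 35C die temperature.
print(tdie_from_tctl(55, "1700X"))  # -> 35.0
```

That offset would explain the cool-to-the-touch radiator next to 50°C+ idle readings reported earlier in the thread.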


----------



## doritos93

Quote:


> Originally Posted by *looniam*
> 
> well there is your first problem, thinking GN was reputable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i don't know about any reputation but they, or steve to be specific, is a very immature reviewer. to publicly disclose private conversations/emails from a manufacturer will not endear him to them and any other manufacturer in the future, *esp when the only reason was to throw AMD under the bus to "save his reputation."* and to add that such conversations are par for the course:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> but wait there's MORE!
> 
> then follow up using a marketing slide from RTG to validate his 1080 benchmarks.
> http://www.gamersnexus.net/hwreviews/2827-amd-r7-1700-review-amd-competes-with-its-1800x
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> (E: that pretty much says it all.)
> 
> there is no supposedly about reviewers getting late samples:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *i certainly hope for everyone's sake that AMD is a lot more expedient getting samples to the several hundred developers to fix issues.
> *
> but if you want to call someone incompetent, so how about looking at the mobo vendors? was it their incompetence none had a "fully functioning" bios upon release?
> no?
> really?
> 
> bottom line is, _if anyone was incompetent_; there is enough to go around and blame EVERYONE; reviewers, mobo manufacturers and AMD.


_AMD delivered a chip that packs twice the price/perf of Intel's 1000$ offerings, a total success_
That should be the headline to every review. 5 stars, 10/10, editor's choice, everything, from every site.
Instead we saw a lot of nit picking for 5% fps differences in older single threaded games... "Critical mess" ... Cmon ...

The conclusion most reviewers pushed through was "don't get it for gaming" when in actual fact it's perfectly fine for gaming ON TOP of being a great workstation chip

This is where I found incompetence: the bigger picture was not properly communicated


----------



## looniam

Quote:


> Originally Posted by *doritos93*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> well there is your first problem, thinking GN was reputable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i don't know about any reputation but they, or steve to be specific, is a very immature reviewer. to publicly disclose private conversations/emails from a manufacturer will not endear him to them and any other manufacturer in the future, *esp when the only reason was to throw AMD under the bus to "save his reputation."* and to add that such conversations are par for the course:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but wait there's MORE!
> 
> then follow up using a marketing slide from RTG to validate his 1080 benchmarks.
> http://www.gamersnexus.net/hwreviews/2827-amd-r7-1700-review-amd-competes-with-its-1800x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (E: that pretty much says it all.)
> 
> there is no supposedly about reviewers getting late samples:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *i certainly hope for everyone's sake that AMD is a lot more expedient getting samples to the several hundred developers to fix issues.
> *
> but if you want to call someone incompetent, so how about looking at the mobo vendors? was it their incompetence none had a "fully functioning" bios upon release?
> no?
> really?
> 
> bottom line is, _if anyone was incompetent_; there is enough to go around and blame EVERYONE; reviewers, mobo manufacturers and AMD.
> 
> 
> 
> 
> 
> 
> 
> _AMD delivered a chip that packs twice the price/perf of Intel's 1000$ offerings, a total success_
> That should be the headline to every review. 5 stars, 10/10, editor's choice, everything, from every site.
> Instead we saw a lot of nit picking for 5% fps differences in older single threaded games... "Critical mess" ... Cmon ...
> 
> The conclusion most reviewers pushed through was "don't get it for gaming" when in actual fact it's perfectly fine for gaming ON TOP of being a great workstation chip
> 
> This is where I found incompetence, the bigger picture was not properly communicated

well that's YOUR opinion.

what I SAW was every decent site characterize ryzen's strengths in production and weaknesses in gaming. what you are pissed at is that no one is shilling AMD's product, and i can't say i am sorry about that.


----------



## wanako

I don't know why, but I'm getting this itch to go from my 4790K @ 4.7 to a 1800X, which would make it my first AMD system. Not sure if it will be faster than the 4790K at that speed, but damned if I don't want the latest goodies! Also, since it's in its infancy, Ryzen has LOTS of room to grow. My 4790K has already hit the top of what it can achieve.

Edit: reading through this thread gave me cancer. The idiocy on display is amazing.


----------



## CULLEN

Quote:


> Originally Posted by *looniam*
> 
> bottom line is, _if anyone was incompetent_; there is enough to go around and blame EVERYONE; reviewers, mobo manufacturers and AMD.


Ye do know that the AM4 platform is less than a week old, right? Like, there has already been one BIOS update which improved gaming performance by up to 26%. We're still missing a Windows update, ironed-out chipset drivers, better utilization, etc.

It's so weird seeing a few (thankfully only a few) of the respected members of this community not seeing the big picture: *Ryzen is an incredibly successful* chip no matter what your opinion is (because it's completely irrelevant), and it just got released a few days ago. It's a platform barely 100 *hours* old, and y'all are complaining over this and that? Not to mention that its main rival is $1,000.

I think it's healthy to be a little bit skeptical about everything, but many of you are far beyond a reasonable doubt.

And don't forget, the winners at the end of the day are we, the consumers.


----------



## sugarhell

Quote:


> Originally Posted by *looniam*
> 
> well that's YOUR opinion.
> 
> what I SAW was any decent site characterize ryzen's strengths in production and weaknesses in gaming. what you are pissed at is no one is shilling AMD's product and i can't say i am sorry about that.


I mostly agree with you, but i think reviewers focused way too much on gaming on an 8-core. And i am still waiting for cf/sli benchmarks.

In general my opinion on the Ryzen reviews: rushed, low-quality reviews with bad methodology, most of the time comparing the wrong cpus and the wrong target groups.

Yes, i know that an 8-core is a "bad" choice for gaming, because the target group of an 8-core is not gaming.

If they wanted to focus so much on gaming they should have run a rendering job or a vm server while playing. Much more useful, at least to me.

And no, productivity tests are not the cinebench test or a 720p render in blender (what's wrong with a high-resolution render?).
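The "bench under background load" idea above is easy to prototype. This is only a hypothetical sketch (the workloads and names are made up, not anything a review site actually runs): spin up busy-loop workers standing in for a render or VM, then time a foreground task with and without them.

```python
# Sketch: measure a foreground workload while CPU-heavy background jobs run,
# mimicking "gaming while rendering/streaming". All workloads are stand-ins.
import multiprocessing as mp
import time

def background_load(stop):
    # Busy-loop until told to stop, simulating a render eating one core.
    while not stop.is_set():
        sum(i * i for i in range(10_000))

def foreground_task():
    # Stand-in for the game loop being measured.
    return sum(i * i for i in range(2_000_000))

if __name__ == "__main__":
    stop = mp.Event()
    workers = [mp.Process(target=background_load, args=(stop,)) for _ in range(2)]
    for w in workers:
        w.start()
    t0 = time.perf_counter()
    foreground_task()
    loaded = time.perf_counter() - t0
    stop.set()
    for w in workers:
        w.join()
    t0 = time.perf_counter()
    foreground_task()
    idle = time.perf_counter() - t0
    print(f"idle: {idle:.3f}s, under background load: {loaded:.3f}s")
```

A real version would swap the stand-ins for an actual game benchmark and a Blender render or VM, but the measurement structure is the same.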


----------



## LesPaulLover

Quote:


> Originally Posted by *CULLEN*
> 
> Ye do know that the AM4 platform is less than a week old, right? Like, there has been one BIOS update which improved gaming performance by up to 26%. We're still missing a Windows update, ironed chipset drivers, better utilization etc.
> 
> It's so weird seeing few (thankfully only a few) of the respected members of this community not seeing the big picture, *Ryzen is incredibly successful* chip no matter your opinion is (because it's completely irrelevant), and it just, a few days ago got released. It's just over 100-*hour* old platform, and y'all complaining over this and that? Not to mention that its main rival is $1,000.
> 
> I think it's healthy to be a little bit skeptical about everything, but many of you are far beyond a reasonable doubt.
> 
> And don't forget, the winners at the end of the day are we, the consumers.


Why are you changing the words of my post around into something TOTALLY DIFFERENT from what I said, then calling me out on what you made up?


----------



## SuperZan

Quote:


> Originally Posted by *sugarhell*
> 
> I mostly agree with you but i think reviewers focused way too much on gaming on an 8core. And i am still waiting for cf/sli benchmarks.
> 
> In general my opinion on Ryzen reviews : rushed low quality reviews with bad methodology. Comparing most of the times wrong cpus and wrong target groups.
> 
> Yes i know that an 8core is a "bad" choice for gaming because the target group of an 8core is not the gaming.
> 
> If they wanted to focus so much on gaming they should run a rendering or a vm server while playing. Much more useful at least on me.
> 
> And no, productivity tests are not the cinebench test or 720p rendering on blender(What's wrong with a high resolution rendering?).


Agreed. And even re: the gaming tests, I'll happily acknowledge a deficit in 1080p medium-detail performance, but it's such an outlier that I'm keen to see what basic improvements to the platform will accomplish. Besides that, most people buying a $300+ processor are probably gaming at 1080p ultra at the very least. When the GPU becomes the focus, the Ryzen chips perform as well as Intel, which is something the construction cores couldn't always say.


----------



## LesPaulLover

Quote:


> Originally Posted by *wanako*
> 
> I don't know why, but I'm getting this itch to go from my 4790K @ 4.7 to a 1800X, which would make it my first AMD system. Not sure if it will be faster than the 4790K at that speed, but damned if I don't want the latest goodies! Also, since it's in its infancy, Ryzen has LOTS of room to grow. My 4790K has already hit the top of what it can achieve.
> 
> Edit: reading through this thread gave me cancer. The idiocy on display is amazing.


Man, 4790k @ 4.7GHz eh? I so wish I'd waited for Haswell-E instead of buying my 4670k. I can't get the damned thing past 4.1GHz regardless of how many volts I feed it.

From what I understand, based on the many varied reviews I've watched and/or read, your 4790k @ 4.7GHz almost definitely outperforms a Ryzen 7 @ 4.0GHz in most if not all gaming scenarios.....RIGHT NOW. That COULD change in the coming months (and, I think, WILL change in the coming years) though. The other thing to consider about Ryzen is that the AM4 platform will be in use til at least 2021, so there's likely a 2-4 CPU upgrade path on that single platform.

Case in point:

The only reason I'm considering upgrading right now is cuz I'm on an i5, not an i7. When I bought my 4670k 3 years ago everyone was saying "buy the i5, dont waste your money on an i7 for gaming!" Whaddya know, 2 years on and my 4c/4t CPU is hitting 95%+ on all cores in modern games. ESPECIALLY online multiplayer games - BF4, BF1, Overwatch. Hell, even DOTA2 pushed my 4670k really hard.

IF I had a 4790k, let alone one clocked @ 4.7GHz, I'd personally wait til june/july/august before committing to an upgrade path.


----------



## LesPaulLover

Quote:


> Originally Posted by *SuperZan*
> 
> Agree. And even re: the gaming tests, I'll happily acknowledge a deficit of 1080p medium-details performance but it's such an outlier that I'm keen to see what basic improvements to the platform will accomplish. Besides that, most people buying a $300+ processor are probably gaming 1080p ultra at the very least. When the GPU becomes the focus the Ryzen chips perform as well as Intel, which is something that the construction cores couldn't always say.


It IS worth noting that just last summer AMD was explicitly pointing out the fact that the VASSSSTTTT MAJORITY of PC gamers are still playing games @ 1080p:



So for them to be requesting that Ryzen reviewers benchmark games @ 1440p/2160p is at least a BIT duplicitous. Please understand I fully support AMD and Ryzen and want nothing more than for them to succeed; my past 3 rigs have been Phenom II X6 1100T, FX 8350, and i5 4670k platforms.


----------



## aDyerSituation

I7 4790k at 4.7 to a 1800x?

Nah









not in its current state at least


----------



## SuperZan

Quote:


> Originally Posted by *LesPaulLover*
> 
> It IS worth noting that just last summer AMD was explicitly pointing out the fact that the VASSSSTTTT MAJORITY of PC gamers are still playing games @ 1080p:


And nobody is disputing that, but context is everything. The 'VASSSSTTTT MAJORITY' of PC gamers at 1080p are gaming at 60Hz. Ryzen has no problems there. The 'VASSSSTTTT MAJORITY' of people in the market for octacores probably play AT LEAST at ultra 1080p. Ryzen is fine there. The 'VASSSSTTTT MAJORITY' of PC gamers will be much more interested in R3 and R5 chips reviewed on a more mature platform and offering absurd price/performance. Just because Steve from GN threw that picture out in a rage-panic doesn't make using a slide about graphics cards apply well to this particular range of processors.

Again, I don't have any problem with being realistic about Ryzen 7, but that cuts both ways. It is not the right CPU for people who push serious single-threaded games at 144Hz and beyond. That is not the same thing as it being a poor choice for gaming. It's only a poor choice for gamers if they're using their CPU solely for that single-thread FPS hunt, or if they literally never turn on that PC for anything beyond web browsing or competitive gaming.


----------



## geriatricpollywog

Quote:


> Originally Posted by *LesPaulLover*
> 
> Man 4790k @ 4.7GHz eh? I so wish I'd waited for Haswell-E instead of buying my 4670k. I can't get the damned thing past 4.1GHz regardless of how many volts I feed it.
> 
> From what I understand based on the many varied reviews I've watched and/or read your 4790k @ 4.7GHz almost definitely outperforms a Ryzen 7 @ 4.0GHz in most if not all gaming scenarios.....RIGHT NOW. That COULD change in the coming months (and, I think WILL CHANGE in the coming years) though. The other thing to consider about Ryzen is that the AM4 platform will be in use til at least 2021; so it's a likely 2-4 CPU upgrade path possible on that single platform.
> 
> Case in point, for example:
> 
> Only reason I'm considering upgrading right now is cuz I'm on an i5 not an i7. When I bought my 4670k 3 years ago everyone was saying "buy the i5. dont waste your money on an i7 for gaming!" Whaddya know, 2 years on and my 4c/4t CPU is hitting 95%+ on all CPU cores in modern games. ESPECIALLY online multiplayer games - BF4, BF1, OverWatch. Hell even DOTA2 pushed my 4670k really hard.
> 
> IF I had a 4790k, let alone one clocked @ 4.7GHz, I'd personally waitt til june/july/august before committing to an upgrade path.


PCIe x16 will likely be outdated within 4 years. Even if the socket stays the same, you will need a new chipset within 4 years if you want the latest graphics card.


----------



## AlphaC

Quote:


> Originally Posted by *LesPaulLover*
> 
> An R7 1800x running as 4c/8t (one CCX disabled) @ 4.0GHz vs an Intel 7700k clocked @ 4.0GHz.......
> 
> http://www.zolkorn.com/reviews/amd-ryzen-7-1800x-vs-intel-core-i7-7700k-mhz-by-mhz-core-by-core/








Those graphs suggest it is a game issue (i.e. an i7-7700K should be ~25% faster at 5GHz compared to a 4GHz Ryzen 7 4c/8t, but there's less than a 13% difference at its 4.5GHz stock boost). If it were a DirectX issue it would manifest in some form in the Fire Strike and Time Spy results, I'd think.

Nice find, I believe pcgameshardware ran a 4+0 core (so only one CCX active) benchmark as well.

http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/
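The back-of-envelope reasoning above is just clock-ratio arithmetic, so here it is written out explicitly, assuming a perfectly clock-bound workload (a simplification; real games never scale this cleanly):

```python
# If a game were purely CPU-clock-bound, the frame-rate gap between two
# chips of similar IPC would track the ratio of their clocks.
def clock_scaling_gap(clock_a_ghz, clock_b_ghz):
    """Expected % advantage of chip A over chip B from clocks alone."""
    return (clock_a_ghz / clock_b_ghz - 1) * 100

# i7-7700K overclocked to 5.0 GHz vs Ryzen 7 (one CCX) at 4.0 GHz
print(clock_scaling_gap(5.0, 4.0))  # 25.0
# i7-7700K at its 4.5 GHz stock boost vs the same 4.0 GHz Ryzen
print(clock_scaling_gap(4.5, 4.0))  # 12.5
```

Seeing less than the clock-predicted gap in a given title is what points at the game (or its engine) rather than the API.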


----------



## looniam

Quote:


> Originally Posted by *sugarhell*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> well that's YOUR opinion.
> 
> what I SAW was any decent site characterize ryzen's strengths in production and weaknesses in gaming. what you are pissed at is no one is shilling AMD's product and i can't say i am sorry about that.
> 
> 
> 
> 
> 
> 
> I mostly agree with you but i think reviewers focused way too much on gaming on an 8core. And i am still waiting for cf/sli benchmarks.
> 
> In general my opinion on Ryzen reviews : rushed low quality reviews with bad methodology. Comparing most of the times wrong cpus and wrong target groups.
> 
> Yes i know that an 8core is a "bad" choice for gaming because the target group of an 8core is not the gaming.
> 
> If they wanted to focus so much on gaming they should run a rendering or a vm server while playing. Much more useful at least on me.
> 
> And no, productivity tests are not the cinebench test or 720p rendering on blender(What's wrong with a high resolution rendering?).

i think it is difficult to place ryzen. there are choices between $350-$500 and the other options in that price range are the i7-7700K, i7-6800K and i7-6850K. now i admit that i don't have my finger on the market, but i think the majority of those that buy intel's offerings (4 and 6 core) are high-end enthusiast gamers.

i know that label of buyer then flies in the face of the validity of low-res benchmarks for gaming, since they would also have a high-end GPU or 2 for it. but i really don't care if anyone does use low-res benching, it's not my concern and i would ignore it along with any conclusion drawn from it.

but your post gave me a thought and i should share it before it dies of loneliness.









review sites just might need to consider running some benches while streaming to twitch. gaming/streaming has become much more common in the last few years, and i don't know if it would be all that difficult to run. yeah, it would require an increased time frame, but what(?) a few more hours depending on how many benches.

but as i started to say, ryzen is . . . new . . . and a bit different. i think reviewers need to consider changing a few things about how they benchmark.


----------



## tpi2007

Quote:


> Originally Posted by *mcg75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SoloCamo*
> 
> Followed by:
> Nobody gets upset that it "whomps" intel in multithreaded applications because it's actually true. Ryzen is not a dud for gaming, or being murdered, or being smoked, massacred, etc. which he plastered all over the performance benchmarks.
> 
> 
> 
> The guy likes dramatic words. That doesn't make his data completely useless.
> 
> He was one of the only ones that I know of to use an AMD card during testing. Rx480. And he did 1440p and 1080p at Ultra.
> 
> The setup he used would be what an entry level content creator and part time gamer might build.
> 
> *My only issue is no overclock on the 1700*. But given that 4.0 seems to be the max, it's not going to change results by that much.
> 
> But I suppose because people don't like his results, he's biased and or cheating on the benches somehow.

The sample size, at three games, is too small to validate such strong language, and even he admits that the sample is small. Add the fact that he didn't overclock the Ryzen as he did his own Ivy Bridge CPU, and his results are skewed in a way that serves to legitimise his downplaying of the Ryzen 7 1700's performance with the mentioned expressions.

He probably wouldn't have been able to set the stage for the disappointment of not even getting parity in two of the three titles and trading blows in the third if he had overclocked the R7 1700. We won't know when the bottleneck in a given title's game engine is lifted with each configuration, especially with a mid-range card like the RX 480, unless he tests overclocking. If he acknowledges that people usually overclocked their 3570K's by at least 600 MHz (over the all-core turbo of 3.6 GHz), he has to acknowledge that people will eventually try to do the same with a Ryzen 7 1700, incidentally by the same amount, since it seems most can do 600 MHz over the all-core turbo of 3.3 GHz, without Ivy Bridge's TIM temperature problems.
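For what it's worth, the like-for-like headroom comparison works out slightly in Ryzen's favour in relative terms; this is just the arithmetic on the all-core turbos and +600 MHz overclocks quoted above:

```python
# Relative overclocking headroom: the same +600 MHz is a bigger percentage
# jump on the chip with the lower all-core turbo.
def headroom_pct(all_core_turbo_ghz, oc_ghz):
    """Percentage gain of the overclock over the all-core turbo."""
    return (oc_ghz / all_core_turbo_ghz - 1) * 100

print(f"i5-3570K, 3.6 -> 4.2 GHz: +{headroom_pct(3.6, 4.2):.1f}%")
print(f"R7 1700,  3.3 -> 3.9 GHz: +{headroom_pct(3.3, 3.9):.1f}%")
```

So an overclocked-vs-overclocked comparison shifts both chips by roughly the same ~17-18%, which is exactly why omitting the Ryzen overclock skews the picture.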


----------



## warpuck

I am waiting for an X370 board to be what the Asus Sabertooth 990FX was to the FX CPUs. I am grateful to the present pioneers who are exploring with Ryzen. I believe there is a lot of potential to be explored as the motherboards mature.


----------



## Liranan

Quote:


> Originally Posted by *warpuck*
> 
> I am waiting for a 370 board to be what the Asus Sabertooth 990FX was to the FX CPU. I am grateful for the present pioneers that are exploring with the Ryzen. I believe there is lot of potential to be explored as the motherboards mature.


Crosshair is already out in the form of the CH VI Hero but I would also like to see a Kitty as well.

Sadly Zen+/Zen 2 is set to be released in two years (2019) and not next year, I really would have liked to see a refresh with higher clocks, higher IPC and better thermals.


----------



## Majin SSJ Eric

I wonder when the fire-sale of 1800X's is going to occur as people begin to realize that the only R7 to get is actually the 1700. I'd imagine we might start to see panicked 1800X sellers as the platform matures and performance goes up across all three SKUs similarly. I know that the only logical buy is the 1700, considering that at 4GHz clock speeds it is exactly as good as the 1800X, but there's just something about having that 1800X in the sig rather than a 1700 that blinds me to logic! If I could find an 1800X several months down the line at less than $400 I think I'd actually spend the premium for it, for basically no good reason! What can I say, I'm stupid!


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Liranan*
> 
> Crosshair is already out in the form of the CH VI Hero but I would also like to see a Kitty as well.
> 
> Sadly Zen+/Zen 2 is set to be released in two years (2019) and not next year, I really would have liked to see a refresh with higher clocks, higher IPC and better thermals.


We may see a new stepping in the form of an 1850X or some such thing before Zen 2 that improves clock speeds as yields improve. The naming scheme they've gone with leaves the door open for such a sku down the line. Not something I'm predicting WILL happen, but it could.


----------



## looniam

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I wonder when the fire-sale of 1800X's is going to occur, as people begin to realize that the only R7 to get is actually the 1700? I'd imagine we might start to see panicked 1800X sellers as the platform matures and performance goes up across all three sku's similarly. I know that the only logical buy is the 1700 considering that with 4GHz clock speeds it is exactly as good as the 1800X but there's just something about having that 1800X in the sig rather than a 1700 that blinds me to logic! If I could find an 1800X several months down the line at less than $400 I think I'd actually spend the premium for it, for basically no good reason! What can I say, I'm stupid!


_nothing personal about you_, just saying: this is where enthusiasts are no longer those that take a lesser product and use skill and knowledge to increase its potential, but those that spend more money.


----------



## Nickyvida

The only thing holding back Ryzen is its low clocks. There doesn't even seem to be any binning at all; every SKU caps out at 4.0 or 4.1. So much for overclocking.

Should have gone for 14nm HP imo.

A higher-clocked Ryzen would have better chances against Kaby Lake. At this point I'm semi-regretting my 1800X purchase.


----------



## budgetgamer120

Quote:


> Originally Posted by *Nickyvida*
> 
> Only thing holding back Ryzen is its low clocks. There doesn't even seem to be binning at all, all are capped at 4, or 4.1 regardless of SKUs. So much for overclocking.
> 
> Should have gone for 14nm HP imo.
> 
> A higher clocked Ryzen would have better chances with KB-Lake. At this point im semi regretting my 1800X purchase.


They had no time to bin. As time goes on, binning will take place. The same thing happens with graphics: if I buy a GPU at launch, no matter what the price, it is always high quality. Buy a few months later, after they've had a chance to bin and spin off higher-end SKUs like the Sapphire Toxic etc., and if I buy reference then, it is terrible.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *looniam*
> 
> _nothing personal about you_ just saying, and this is where enthusiasts are no longer those that get a lesser product to use skills and knowledge to increase it's potential but those that spend more money.


Yes, but at least I own up to it! If I didn't have crippling child support payments to make, I'd have a 6950X!


----------



## SuperZan

Quote:


> Originally Posted by *warpuck*
> 
> I am waiting for a 370 board to be what the Asus Sabertooth 990FX was to the FX CPU. I am grateful for the present pioneers that are exploring with the Ryzen. I believe there is lot of potential to be explored as the motherboards mature.


I'm truly enjoying the guinea-pig life. Doubly so, as I'm on the Biostar X370 GT7 which has turned out to have one of the better VRM setups of the available boards. It's also done quite well with memory relative to other boards.

Quote:


> Originally Posted by *Nickyvida*
> 
> Only thing holding back Ryzen is its low clocks. There doesn't even seem to be binning at all, all are capped at 4, or 4.1 regardless of SKUs. So much for overclocking.
> 
> Should have gone for 14nm HP imo.
> 
> A higher clocked Ryzen would have better chances with KB-Lake. At this point im semi regretting my 1800X purchase.


I'm having the opposite experience with my 1700X. An easy Y-Cruncher-stable 3.85GHz all-core OC at 1.32v is running my games and productivity tasks exceptionally well. I'm quite pleased with performance already, and that's on a raw UEFI and a generally green platform that still lacks the Windows support needed for basic functionality at an architectural level. After a UEFI revision and a Windows update, I'll revisit 4.0GHz, which should be doable for me at around 1.385v or so, based on my testing.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Nickyvida*
> 
> Only thing holding back Ryzen is its low clocks. There doesn't even seem to be binning at all, all are capped at 4, or 4.1 regardless of SKUs. So much for overclocking.
> 
> Should have gone for 14nm HP imo.
> 
> A higher clocked Ryzen would have better chances with KB-Lake. At this point im semi regretting my 1800X purchase.


Nobody should ever have been expecting Ryzen to match Kaby Lake in single-core performance. The fact that they even got to Haswell-level IPC right off the jump, after the wasteland that was BD, is incredible. The only regret you should have about your 1800X is that the 1700 is much cheaper and mostly just as good.


----------



## looniam

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> _nothing personal about you_ just saying, and this is where enthusiasts are no longer those that get a lesser product to use skills and knowledge to increase it's potential but those that spend more money.
> 
> 
> 
> 
> 
> 
> 
> Yes, but at least I own up to it! If I didn't have crippling child support payments to make, I'd have a 6950X!

naw i was just doing a mini-rant. but more off topic - i never did say how adorable your daughter looked when you posted in the "show yourself" thread.


----------



## Majin SSJ Eric

Lol, thanks buddy! She recently lost her two front baby teeth so she looks pretty silly these days!


----------



## huzzug

Quote:


> Originally Posted by *looniam*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Majin SSJ Eric*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> _nothing personal about you_ just saying, and this is where enthusiasts are no longer those that get a lesser product to use skills and knowledge to increase it's potential but those that spend more money.
> 
> 
> 
> 
> 
> 
> 
> Yes, but at least I own up to it! If I didn't have crippling child support payments to make, I'd have a 6950X!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> naw i was just doing a mini-rant. but more off topic - i never did say how adorable your daughter looked when you posted in the "show yourself" thread.

You're not fun at parties, are you?
say it now darnit!!!


----------



## hollowtek

Has the hype train derailed yet? Is it safe to come out now? I didn't even grab the 7700k for under $300 with recent deals. Looks like this 2500k will just have to keep chuggin'


----------



## flippin_waffles

If by derailed you mean high demand then i guess you could be right.







i'll be getting one saturday.


----------



## Nickyvida

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Nobody should have ever been expecting Ryzen to match Kaby Lake in single core performance. The fact that they even got to Haswell level IPC after the wasteland that was BD right off the jump is incredible. Only regret you should have about your 1800X is that the 1700 is much cheaper and mostly just as good.


Of course no one expected that, but it could have been doable on a high-power process, given that poor OC headroom is what's limiting Ryzen's potential in the first place, along with the yet-unresolved problems that come with it, like the Windows scheduler, which will improve once resolved, since it is already so close to KB-L right now.

Having it on an LPP hurt clocks, on top of working out new process kinks etc. Having all 3 SKUs clock to a maximum of 4.1, even the binned 1800X, is just... meh. I'm not asking for it to clock to the moon, given that it's an 8-core, but 4.3 to 4.5 would have been amazing. The price difference between the 1700 and 1800X for virtually no difference (same CPU), well, meh. For almost $200 more, I hoped it could clock slightly higher than its lower-priced counterparts to justify the higher price, not end up at the same max clock of 4.1. The 1800X is no different from the 1700 at this point in time except for the XFR and whatnot.


----------



## azanimefan

actually, from the numbers the stilt was putting out, ryzen might just be one of the best mobile chips out there. it probably will destroy intel in the mobile market, getting as good performance as intel's chips at significant reductions in power draw and heat.


----------



## epic1337

Quote:


> Originally Posted by *azanimefan*
> 
> actually from the numbers the stilt was shoving out ryzen might just be one of the best mobile chips out there. it probably will destroy intel in the mobile market getting as good performance as intel chips at significant reductions in power draw and heat.


only the APUs; the ones without integrated graphics would have the less ideal setup of requiring a dGPU, meaning those idiot manufacturers would very likely pair them with trash dGPUs.


----------



## somethingname

These reviews are sketchy to say the least. You don't know who to trust when you start seeing such mixed results even with the same RAM speeds.

One will show them neck and neck with a 5fps difference, while most other reviews show it getting rekt by Intel by 15-20fps in the same games.


----------



## SighTurtle

hardocp.com/article/2017/03/08/amd_ryzen_1700_cpu_vs_1700x_review


----------



## daviejams

Quote:


> Originally Posted by *SighTurtle*
> 
> hardocp.com/article/2017/03/08/amd_ryzen_1700_cpu_vs_1700x_review


What was the point in using these games for the gaming benchmark - Lost Planet, Bioshock Infinite and Metro Last Light? They must really tax the CPU


----------



## CULLEN

Quote:


> Originally Posted by *LesPaulLover*
> 
> Why are you changing the words of my post around into something TOTALLY DIFFERENT from what I said, then calling me out on what you made up?


Haha mate, totally my fault! The quote I referred to was made by looniam, and I was also going to quote you on the benchmarks you posted because I found it very interesting.

Well, it looks like what came out of that was looniam's quote, with your name on it, and none of your post. I'll fix it.


----------



## aberrero

Quote:


> Originally Posted by *Nickyvida*
> 
> Having it on a LPP hurt clocks, among exploring new process kinks etc Having all 3 SKUs clock to a maximum of 4.1, even the binned 1800x is just.. meh. I'm not asking for it to clock to the moon, given that it is a 8 core, but 4.3 to 4.5 would have been amazing. The price difference between the 1700 and 1800x for virtually no difference(same CPU), well, meh. For almost $200 more, i hoped it could clock slightly higher than its lower priced counter parts to justify the higher prices, not all ending up with a max clock of 4.1. The 1800x is no different from the 1700 at this point in time except for the XFR and what not.


The real benefit of the 1800x is that it gives you the performance of an OC'd 1700 without having to overclock. In summer, for example, I want my CPU to run as cool as possible and so the 1800X will be worth it then.


----------



## looniam

Quote:


> Originally Posted by *CULLEN*
> 
> Haha mate, totally my fault! The quote I referred to was made by looniam, and I was also going to quote you on the benchmarks you posted because I found it very interesting.
> 
> Well, it looks like what came out of that was looniam's quote, with your name on it, and none of your post. I'll fix it.


well, replying to me is also a bit of a









i don't understand how saying "_if anyone was incompetent_" reads as a judgement on the ryzen release. nor was it meant as one, i assure you.


----------



## SoloCamo

Quote:


> Originally Posted by *azanimefan*
> 
> actually from the numbers the stilt was shoving out ryzen might just be one of the best mobile chips out there. it probably will destroy intel in the mobile market getting as good performance as intel chips at significant reductions in power draw and heat.


I am ridiculously excited for their mobile lineup. Bring forth the potential 4c8t APU for under $500.


----------



## CULLEN

Quote:


> Originally Posted by *looniam*
> 
> well, replying to me is also a bit of a
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i don't understand how saying "_if anyone was incompetent_" reads as a judgement on the ryzen release. nor was it meant as one, i assure you.


I should probably stop posting at 1 AM without my glasses; I can't even remember the context I was replying to. It probably wasn't meant for you, because I also agreed with you, and it certainly wasn't aimed at LesPaulLover either.


----------



## epic1337

any updates on the reviews?

like DDR4 max OC and typical OC.


----------



## looniam

Quote:


> Originally Posted by *CULLEN*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *looniam*
> 
> well, replying to me is also a bit of a
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i don't understand how saying "_if anyone was incompetent_" reads as a judgement on the ryzen release. nor was it meant as one, i assure you.
> 
> 
> 
> 
> 
> 
> I should probably stop posting at 1 AM without my glasses; I can't even remember the context I was replying to. It probably wasn't meant for you, because I also agreed with you, and it certainly wasn't aimed at LesPaulLover either.
Click to expand...

don't worry, even not needing glasses i also should limit my late night posting.


----------



## Nickyvida

Quote:


> Originally Posted by *aberrero*
> 
> The real benefit of the 1800x is that it gives you the performance of an OC'd 1700 without having to overclock. In summer, for example, I want my CPU to run as cool as possible and so the 1800X will be worth it then.


Well, that's no benefit, or an almost intangible one anyway. Had I known, I would've saved $200 and gone for the 1700. The $200 premium is hard to justify, imo.

Disappointing OC headroom if you ask me, when the overclocked best can only match the max OC of the cheapest chip.

Hopefully the situation improves soonish.


----------



## JackCY

Quote:


> Originally Posted by *doritos93*
> 
> _AMD delivered a chip that packs twice the price/perf of Intel's 1000$ offerings, a total success_
> That should be the headline to every review. 5 stars, 10/10, editor's choice, everything, from every site.
> Instead we saw a lot of nit picking for 5% fps differences in older single threaded games... "Critical mess" ... Cmon ...
> 
> The conclusion most reviewers pushed through was "don't get it for gaming" when in actual fact it's perfectly fine for gaming ON TOP of being a great workstation chip
> 
> This is where I found incompetence, the bigger picture was not properly communicated


Some did, some didn't. Some simply got it wrong and compared the 4-core 7700K against the 8-core 1800X in gaming, for gaming, because they are gamers who make reviews; as such you have to read their reviews from a purely gaming perspective, not as overall well-rounded reviews.

Anyone with common sense can see that for $329 you can get a 3.9GHz 8-core that is on par, for most workloads, with the $1000 5960X and 6900K.

I checked some 5960X reviews yesterday and guess what: most games they ran sit in the 50-150fps range. They didn't bother benching the CPU the way they do now with Ryzen, pushing into a nonsense 150-500fps range that mostly shows the single-thread performance you need to run the over-15-year-old Counter-Strike.

I think the inability to run a stable and up-to-date system also has its impact on reviewers, but that isn't a fault of the CPU. The ecosystem simply isn't ready yet.

The CPUs are insane value, build-wise, especially once mobos get sorted out and prices settle where they should be: $400-450 for an 8-core with an OC mobo, versus $1250+ for an Intel 8-core with an OC mobo. You can literally build two systems for the price of one comparable Intel one.
If AMD goes this aggressive in the server market Intel will see quite a drop in new sales.

---

Yeah, it's a shame that XFR disables itself when OCing; it makes the 1800X even less attractive. It would have been nice to be able to set at least 2/4/6/8-core speeds, keep XFR active for those two cores even with an OC, and be able to set its offset.


----------



## mcg75

Quote:


> Originally Posted by *sugarhell*
> 
> I mostly agree with you but i think reviewers focused way too much on gaming on an 8core. And i am still waiting for cf/sli benchmarks.


I just can't agree with this.

They brought an eight-core CPU down to the price of a four-core.

Who in their right mind wouldn't cross-shop the two if it meets their needs and is in their price range?


----------



## TheReciever

Quote:


> Originally Posted by *mcg75*
> 
> I just can't agree with this.
> 
> They brought an eight core cpu down to the price of a four core.
> 
> Who in their right mind wouldn't cross shop the two if it was going to meet their needs and in their price range?


Just comes down to two things.

The focus of the discussion (cores vs price segment) and what you see yourself potentially doing in the future.


----------



## ryan92084

Quote:


> Originally Posted by *epic1337*
> 
> any updates on the reviews?
> 
> like DDR4 max OC and typical OC.


For typical overclocks we have silicon lottery https://www.reddit.com/r/Amd/comments/5xybp7/silicon_lottery_ryzen_overclock_statistics/ (i haven't added this to the OP because there is no source or sample size cited)
And this 10x 1700 test https://www.reddit.com/r/Amd/comments/5xvv16/r7_1700_binning_data_from_10_cpus_indonesian/

For RAM I think Elmor has a new Asus hero BIOS to support 3466 over in that thread and I've seen a few user reports of 3600 cl16.

Edit: or did you mean full reviews with those clocks and RAM? In that case not any new ones that I've seen.


----------



## JackCY

Quote:


> Originally Posted by *mcg75*
> 
> I just can't agree with this.
> 
> They brought an eight core cpu down to the price of a four core.
> 
> Who in their right mind wouldn't cross shop the two if it was going to meet their needs and in their price range?


Quote:


> Originally Posted by *TheReciever*
> 
> Just comes down to two things.
> 
> The focus of the discussion (cores vs price segment) and what you see yourself potentially doing in the future.


I propose a new gaming benchmark.
1 CPU, 2 GPUs, 2 OSes running from a single machine, 2 gamers playing on one PC at the same time. Good luck, 7700K








unRAID can do this.
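For anyone actually wiring up that kind of two-gamers-one-CPU box: each VM is typically pinned to one CCX so the two players don't fight over the same L3 cache. A toy sketch of computing those pin sets (the function name is mine, and it assumes the common, but not guaranteed, Linux enumeration where logical CPU i + 8 is the SMT sibling of core i; verify with `lscpu -e` on real hardware):

```python
def ccx_cpusets(physical_cores=8, smt=True):
    # Split an 8-core Ryzen into two CCX-aligned cpusets for VM pinning.
    # Assumes cores 0-3 sit on CCX0 and cores 4-7 on CCX1, with logical
    # CPU i + physical_cores being the SMT sibling of core i.
    half = physical_cores // 2
    ccx0 = list(range(0, half))
    ccx1 = list(range(half, physical_cores))
    if smt:
        ccx0 += [c + physical_cores for c in ccx0]
        ccx1 += [c + physical_cores for c in ccx1]
    return ccx0, ccx1
```

Each list would then feed a `vcpupin`-style setting per VM in whatever hypervisor is used (unRAID/KVM expose this per guest).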


----------



## SkiesOfAzel

Quote:


> Originally Posted by *mcg75*
> 
> I just can't agree with this.
> 
> They brought an eight core cpu down to the price of a four core.
> 
> Who in their right mind wouldn't cross shop the two if it was going to meet their needs and in their price range?


We are talking about a high-core-count, low-frequency part. Furthermore, AMD themselves made their IPC goals public years ago. It's simply a matter of reason: this is not, and could never be, a CPU targeted primarily at gamers.

That doesn't mean it can't be used for gaming, or that its gaming performance shouldn't be reviewed and measured. What it means though is that it's idiotic to almost exclusively concentrate on its game performance when reviewing it, especially when the reviewer knows that his motherboard has performance issues.


----------



## Oubadah

..


----------



## TheReciever

Quote:


> Originally Posted by *JackCY*
> 
> I propose a new gaming benchmark.
> 1 CPU, 2 GPUs, 2 OS running from a single machine, 2 gamers playing on a single PC at the same time. Good luck 7700K
> 
> 
> 
> 
> 
> 
> 
> 
> unRAID can do this.


Playing 2 games at once is something I wanted to do on a desktop, but it can't happen, I guess.

So I have 2 laptops now for this.


----------



## JackCY

Quote:


> Originally Posted by *TheReciever*
> 
> Playing 2 games at once is something I wanted to do on a desktop, but it can't happen, I guess.
> 
> So I have 2 laptops now for this.


Just gotta ask LTT somehow to redo this with Ryzen 1700 and compare to Intel similarly priced CPU/system.






They've also done it with a 28-core monster split into seven 4-core gaming machines.


----------



## Scotty99

I've actually wanted to play two MMOs at once; it's impossible with a 2500k and GTX 760. Building my Ryzen PC soon, it will have a 1700 and GTX 1060 6GB, gonna give it another go lol.


----------



## jclafi

Exactly !
Quote:


> Originally Posted by *GHADthc*
> 
> This guy sums up what I am seeing right now:
> 
> Quote from RussianSensation
> Elite Member Anandtechforum
> 
> "X10000!
> 
> Seriously, most professional review media is completely clueless nowadays (or are in bed with Intel for product samples and marketing dollars). Who the hell buys a $330-500 8-core 16T CPU that they want to be a well-rounded processor for encoding, rendering, well-threaded office applications, streaming, and still be great at games, but then pairs it with a GTX1070/1080/1080Ti (or in reviews $1200 Titan X Pascal) and then uses a POS $90-200 1080p 60Hz monitor? Give me a break! I will continue ripping into 1080p 60Hz gaming as we are now in 2017, and 1070, a 1440P card, is only $349. Not only is 1080p outdated for PC enthusiasts, but most 1080p monitors are small in real estate (24" or less) and tend to be budget in terms of IQ quality (a lot of them are mediocre IPS or TN panels).
> 
> Joker is one of the few true PC gamers on YouTube who did a Real World Ryzen vs. Intel PC gaming review.
> 
> The idea of testing the CPU at low resolutions or low GPU settings to "test CPUs" is marketing drivel that Intel has used for a decade, despite it not making any sense in a decade. If 95% of games are GPU limited at 4xMSAA 1080p/1440p/3440x1440 or 4K, those are the real world results us gamers actually should care about since those are the scenarios we use our PCs in.
> 
> I did not buy my 6700K and 6800K CPUs and GTX1070s to play games at 1080p low-medium settings. To exacerbate matters, these so-called professional reviewers used a Titan XP for gaming benchmarks. In fact, if there is ANY extra gaming performance on the table, I, and I am sure many of you, will increase all IQ settings to the max, and then if more performance is available, we would raise MSAA to 2-4X or even enable SSAA. The fact is a 4GHz R7 1700 thrashes my 6700K in so many other applications outside of games that it's a MUCH better well-rounded processor. Outside of professional gamers who need 200-300 fps, who wants to play games with tearing?
> 
> The comparisons of 7700K vs. R7 1800X are so stupid, it's beyond any reasonable logic. That's like claiming i7 6900K is a failed CPU because it loses to the 7700K in gaming. It's OK to claim that 7700K is a better CPU for gaming, but that doesn't suddenly make R7 Ryzen a bad CPU. I would hope so that someone buying a $330-500 CPU does something other than gaming with it; and especially not gaming at the peasant 1080p 60Hz resolution. Otherwise, you do not need to spend that much on a processor in the first place. I am sure you can pick up a used i7 4770K/4790K and that would be more than adequate for gaming.
> 
> There also seems to be a lot of criticism coming from i7 4770K/6700K owners that Ryzen didn't really change the landscape for them. Let's look at Steam survey and see just how many Steam users have a CPU as powerful as the i7 4770K/6700K? For someone who has an i7 920/860, i5 2500K/2600K, R7 1700 @ 4Ghz would be an excellent upgrade.
> 
> It's amazing how the pro-Intel media never criticized 6-8 core Intel CPUs starting with i7-990X and 3930K eras, but the minute AMD's Ryzen makes 5820K/6800K/6850K and 6900K irrelevant and frankly flat out horrible in value, all of a sudden ALL the focus shifts to 1080p and lower PC gaming benchmarks? What a joke our tech-review industry has become. How come the media is barely discussing that other than Thunderbolt and SLI/CF support, the $90-100 B350 boards are significantly cheaper than the X99 board despite offering all the latest modern features from PCIe 3.0 x4 M.2 support, to USB 3.1 Type-C to Ryzen overclocking support? The total platform cost rises even more with X99 for those who want a multi-threaded powerhouse of a CPU.
> 
> With Intel, we basically get amazing mainstream gaming CPUs that are slow for multi-threaded tasks OR extremely overpriced X99 parts. With Ryzen, we actually get the most well-rounded processor to keep for the next 4-5 years. With B350 chipset, a mobo can be purchased for $90-100 and Ryzen 2.0/3.0 7nm parts will likely be backwards compatible with AM4 socket come 2019-2020. OTOH, LGA1151 Z170/270 or X99 platforms are completely dead. It's interesting how most "professional" reviewers have no clue as to the forward looking advantage of the AM4 platform as well. It means a builder can pick up a 4C or a 6C Ryzen and just get the 2019-2020 8C Ryzen when he/she needs more performance. With Intel, not only are you forced to pay more for X99 chipset board, but these boards have no upgrade path at all.
> 
> If I had to build a new PC system now, I would purchase an R7 1700 over the 6700K/7700K. I'd rather have a CPU that's 95% as good for games but is 30-60% faster in anything else I want to throw at it today or in the future. I would also get a guaranteed upgrade path for faster Ryzen models. Win-Win.
> 
> Techno-Kitchen already showed that a 5Ghz 7700K is barely faster in games than a 4Ghz 7700K. Most games today are GPU-limited once we start using Real World gaming settings and anti-aliasing settings!"
> 
> Couldn't have expressed it better myself...
> 
> All of these reviews are fishy...no 4K resolution testing? No SLI/CFX testing? Very odd choice of games I am seeing in some review suites...Looks like I am going to have to look at some real-world performance figures from users here on OCN.


----------



## Artikbot

Quote:


> Originally Posted by *mcg75*
> 
> I just can't agree with this.
> 
> They brought an eight core cpu down to the price of a four core.
> 
> Who in their right mind wouldn't cross shop the two if it was going to meet their needs and in their price range?


It's not the comparison I have a problem with personally; it's that they compare the two, give the most weight only to what the 7700K excels at, and don't point out that Intel's own HEDT has for the most part the same issue, but at much higher price tags.


----------



## CULLEN

I wish more reviewers would do real-life benchmarks.

We've seen that the i7 7700 gets completely maxed out in Battlefield 1, but these benchmarks are run under best-case conditions, with no program open but the game. That makes sense for benchmarking, but it doesn't replicate real-life scenarios.

When I'm working on my computer, with two virtual machines active, 20 tabs in Chrome, Spotify, Atom (with way too many packages) and various tools, I want to be able to just fire up Battlefield and be called a noob for 45 minutes without having to close everything before I play. And then finally reopen everything again when I've had enough of cold hard facts about my mother.

I just want my computer to be capable without sacrifice. For me, the i7 7700 isn't even an option; it's not sufficient for what I intend to use my computer for.
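A "loaded system" benchmark like that is easy to sketch: time the foreground task while a configurable amount of background busy-work runs. A minimal, hypothetical example (names are mine; threads stand in for the VMs and browser tabs, and a real harness would use separate processes and a real game):

```python
import threading
import time

def background_load(stop_evt):
    # Stand-in for always-on background work (VMs, browser tabs, etc.).
    # Threads are used here for portability; a real load generator
    # would run separate processes.
    while not stop_evt.is_set():
        sum(i * i for i in range(10_000))

def timed_workload(n=200_000):
    # Stand-in for the foreground task (the game); returns (result, seconds).
    start = time.perf_counter()
    total = sum(i % 7 for i in range(n))
    return total, time.perf_counter() - start

def loaded_benchmark(n_background=4):
    # Time the foreground task while n_background busy workers run.
    stop_evt = threading.Event()
    workers = [threading.Thread(target=background_load, args=(stop_evt,))
               for _ in range(n_background)]
    for w in workers:
        w.start()
    try:
        result, elapsed = timed_workload()
    finally:
        stop_evt.set()
        for w in workers:
            w.join()
    return result, elapsed
```

Comparing `loaded_benchmark(0)` against `loaded_benchmark(8)` would give exactly the kind of loaded-versus-clean delta this post is asking reviewers to measure.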

*edit*

The i7 7700K is still an incredible chip, and for my friends who only play games, it's not hard for me to recommend it.


----------



## epic1337

Quote:


> Originally Posted by *ryan92084*
> 
> For typical overclocks we have silicon lottery https://www.reddit.com/r/Amd/comments/5xybp7/silicon_lottery_ryzen_overclock_statistics/ (i haven't added this to the OP because there is no source or sample size cited)
> And this 10x 1700 test https://www.reddit.com/r/Amd/comments/5xvv16/r7_1700_binning_data_from_10_cpus_indonesian/
> 
> For RAM I think Elmor has a new Asus hero BIOS to support 3466 over in that thread and I've seen a few user reports of 3600 cl16.
> 
> Edit: or did you mean full reviews with those clocks and RAM? In that case not any new ones that I've seen.


Yes, the RAM clock speed. There's currently the controversy over a RAM clock speed bug, and I was wondering if someone had tested 4000MHz RAM via OC.
If they could manage to pull it off, a good benchmark of performance scaling at that RAM clock would be useful info.
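Such a scaling test boils down to running the same memory-bound probe at each RAM clock and comparing. A crude, illustrative sketch (the function name is my own; it measures copy bandwidth only, not latency, so it's half the picture at best):

```python
import time

def copy_bandwidth_gbs(size_mb=256, repeats=5):
    # Crude main-memory bandwidth probe: time full copies of a buffer
    # large enough to spill out of the CPU caches; keep the best run.
    buf = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        copied = bytes(buf)  # one full read pass + one full write pass
        best = min(best, time.perf_counter() - start)
    assert len(copied) == len(buf)
    # ~2 bytes moved per buffer byte (read + write), reported in GB/s
    return (2 * len(buf)) / best / 1e9
```

Running this on the same system at, say, 2400 vs 3600 vs a hoped-for 4000MHz would show the scaling curve directly; pairing it with a latency probe would complete the picture.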


----------



## TheReciever

Quote:


> Originally Posted by *JackCY*
> 
> Just gotta ask LTT somehow to redo this with Ryzen 1700 and compare to Intel similarly priced CPU/system.
> 
> 
> 
> 
> 
> 
> They've also done it with a 28 core monster for 7x4 core machine.


Without VMs and only 1 OS


----------



## SkiesOfAzel

Quote:


> Originally Posted by *AlphaC*
> 
> 
> 
> 
> 
> 
> 
> Those graphs suggest it is a game issue (i.e. a i7-7700k should be ~25% faster at 5GHz compared to a 4GHz Ryzen 7 4c/8t but less than 13% difference at 4.5GHz stock boost). If it was a DirectX issue then it would manifest in some form within the Fire strike and Time Spy result I'd think.
> 
> Nice find , I believe pcgameshardware ran a 4+0 core (so only one CCX) benchmark as well.
> 
> http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/


I posted a video about those benchmarks a day ago. The results are actually weird. Sure, gaming seems a lot better, which points to cache misses from threads hopping between CCXs in Ryzen 7's case. But then, fewer cache misses mean higher IPC, which is not what we see in non-gaming benchmarks, where the 1700 @ 4GHz is 10-15% behind the 7700K @ 4GHz. The IPC difference should be a lot smaller, especially in non-AVX2 tasks.


----------



## JackCY

Quote:


> Originally Posted by *TheReciever*
> 
> Without VM's and only 1 OS


One copy of Windows cannot support 2 interactive users simultaneously; maybe Linux can if you tweak it hard, especially when it comes to simultaneous exclusive fullscreen 3D and separate peripherals.
You need unRAID or similar to run 2 OSes for 2 users; it's just how it's done.


----------



## Liranan

Quote:


> Originally Posted by *CULLEN*
> 
> I wish more reviewers would do real-life benchmarks.
> 
> We've seen that i7 7700 is completely capped in Battlefield 1, but these benchmarks are done at completely best case scenarios, no program open but the game. It makes sense for benchmarking but it doesn't replicate real-life scenarios.
> 
> When I'm working on my computer, with two virtual machines active, 20 tabs in Chrome, Spotify, Atom (with way too many packages) and various tools, I want to be able to just fire up Battlefield and be called a noob for 45 minutes without having to close everything before I play. And then finally reopen everything again when I've had enough of cold hard facts about my mother.
> 
> I just want my computer to be capable without sacrifice. For me, the i7 7700 isn't even an option, it's not sufficient for what I intend using my computer for.
> 
> *edit*
> 
> The i7 7700K is still an incredible chip, and for my friends who only play games, it's not hard for me to recommend it.


LOL, only 20 tabs? Amateur. I have over 150 tabs across three browsers, two VMs, lots of programs, and then sometimes two games running at the same time









My next CPU shall also be Ryzen.

Edit: I'm thinking of getting 32GB RAM for my next system as 16 is sometimes just not enough.


----------



## TheReciever

Quote:


> Originally Posted by *JackCY*
> 
> 1 copy of Windows cannot support 2 users simultaneously, maybe Linux can if you tweak it hard, especially when it comes to simultaneous exclusive fullscreen 3D and peripherals.
> You need unRAID or similar to run 2 OS for 2 users, it's just how it's done.


You're missing the point here.

Why the heck would I want 2 users? I said earlier I want to play 2 games, but nowhere did I say it was to accommodate someone else.

I want to FPS while I MMO.
Quote:


> Originally Posted by *Scotty99*
> 
> I've actually wanted to play two MMOs at once; it's impossible with a 2500k and GTX 760. Building my Ryzen PC soon, it will have a 1700 and GTX 1060 6GB, gonna give it another go lol.


Let me know how that works out.

I'm thinking I might need 2 GPUs though, one for each display it would need to drive, but who knows if that is supported.


----------



## Carniflex

I managed at last to read through this thread. Lost many brain cells.

Are there any B350 mATX reviews yet?


----------



## Newwt

What ever happened to tweaktown? They used to be reputable if I remember correctly...


----------



## JackCY

Quote:


> Originally Posted by *TheReciever*
> 
> You're missing the point here.
> 
> Why the heck would I want 2 users? I said earlier I want to play 2 games, but nowhere did I say it was to accommodate someone else.
> 
> I want to FPS while I MMO.


You can do that, sure, but it's not what I had in mind. For your use case, where you actively use one app at a time, any decent CPU is really fine, as everything else will sit in the background, mostly waiting.


----------



## Alex132

Quote:


> Originally Posted by *JackCY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheReciever*
> 
> Without VM's and only 1 OS
> 
> 
> 
> 1 copy of Windows cannot support 2 users simultaneously, maybe Linux can if you tweak it hard, especially when it comes to simultaneous exclusive fullscreen 3D and peripherals.
> You need unRAID or similar to run 2 OS for 2 users, it's just how it's done.
Click to expand...

Uh.....................?


----------



## sumitlian

NComputing virtual desktop? I don't know much about it, but I've heard it is reliable.


----------



## Andrew LB

Quote:


> Originally Posted by *jclafi*
> 
> Exactly !

From the reviews I've read, the vast majority have been clearly pro-AMD, with reviewers going as far as making excuses for shortcomings. You never see that in Intel reviews.


----------



## warpuck

false
Quote:


> Originally Posted by *Nickyvida*
> 
> Well that is no benefit, or almost intangible anyway. Had i known i wouldve saved $200 and gone for the 1700. The $200 premium is hard to justify imo
> 
> Disappointing OC if you ask me if the overclocked best can only match the max oc of the worst.
> 
> Hopefully the situation will be improved soonish.


Maybe... we may see 1800Xs in workplaces where 8C/16T is needed to get more out of the $75-an-hour human operating it. It also helps not to be putting extra load on the AC system.
Even when AMD was the box to beat, corporations were buying more expensive and slower Intels because of the A/C factor.
Most places let the box collect dust until it fails. Something to think about if you want to build for other people to use.
There are lots of autos with only 2 pedals and a 4-position gear selector, plus oil changes when the car tells you to.
I don't expect a recent IM grad in charge to expect anything else, let alone be able to assemble and network their own workstation.

From my experience the 9590 did not deliver enough, for its cost over an 8350, given the few extra MHz. But the 9590 was pretty much plug and play: you get the same thing without having to fuss with mobo settings to get the 8350 to the same spot. The 5GHz turbo did not kick in with everything running in the background anyway.


----------



## Nickyvida

Quote:


> Originally Posted by *warpuck*
> 
> false
> Maybe... we may see 1800Xs in workplaces where 8C/16T is needed to get more out of the $75-an-hour human operating it. It also helps not to be putting extra load on the AC system.
> Even when AMD was the box to beat, corporations were buying more expensive and slower Intels because of the A/C factor.
> Most places let the box collect dust until it fails. Something to think about if you want to build for other people to use.
> There are lots of autos with only 2 pedals and a 4-position gear selector, plus oil changes when the car tells you to.
> I don't expect a recent IM grad in charge to expect anything else, let alone be able to assemble and network their own workstation.
> 
> From my experience the 9590 did not deliver enough, for its cost over an 8350, given the few extra MHz. But the 9590 was pretty much plug and play: you get the same thing without having to fuss with mobo settings to get the 8350 to the same spot. The 5GHz turbo did not kick in with everything running in the background anyway.


Meh, but in Ryzen's case it's as if an 8350 hit the same limit a 9590 would. It's just not acceptable for those who really want to overclock. What's the distinction between an 1800X and a 1700 if the cheaper chip can do what the 1800X does, both reaching the same capped limit, even though the 1800X is a good $200 more expensive?


----------



## bigjdubb

Quote:


> Originally Posted by *Nickyvida*
> 
> Meh, but in Ryzen's case it's as if an 8350 hit the same limit a 9590 would. It's just not acceptable for those who really want to overclock. What's the distinction between an 1800X and a 1700 if the cheaper chip can do what the 1800X does, both reaching the same capped limit, even though the 1800X is a good $200 more expensive?


It's like the old days. If you don't overclock, the 1800X is for you. If you don't mind (maybe even enjoy) overclocking, then you save yourself $200 and get the 1700. This is how it always was; it wasn't until the "K" nonsense that overclockers were effectively forced into buying the most expensive chip.


----------



## nycgtr

Quote:


> Originally Posted by *bigjdubb*
> 
> It's like the old days. If you don't overclock, the 1800x is for you. If you don't mind (maybe even enjoy) overclocking then you save yourself $200 and get the 1700. This is how it always was, it wasn't until the "K" nonsense that overclockers started (forced) buying the most expensive chip.


I don't know about all that, tbh. I have every Ryzen chip now. If you get a 1700 that does 3.9 without ridiculous voltage, then sure, you got a great deal. Otherwise you're spending time and effort to get something that $70 more (or $25 more at the one-day sale price of the 1700X) would have gotten you. I feel the 1700X and 1800X are much closer together and share a similar clock/volt wall. Granted, I don't have a ton of chips on hand, but I've gotten similar results on 2 1800Xs and 2 1700Xs.


----------



## Nickyvida

Quote:


> Originally Posted by *bigjdubb*
> 
> It's like the old days. If you don't overclock, the 1800x is for you. If you don't mind (maybe even enjoy) overclocking then you save yourself $200 and get the 1700. This is how it always was, it wasn't until the "K" nonsense that overclockers started (forced) buying the most expensive chip.


Well, I buy to chase down every last clock and all the performance I can get.
But what is the distinction between a 1800X and a 1700 if both are ultimately capped at the same speed as the slower chip? The returns aren't justified for a good $200 more, imo.


----------



## Quantum Reality

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *azanimefan*
> 
> actually from the numbers the stilt was shoving out ryzen might just be one of the best mobile chips out there. it probably will destroy intel in the mobile market getting as good performance as intel chips at significant reductions in power draw and heat.
> 
> 
> 
> I am ridiculously excited for their mobile line up. Bring forth the potential 4c8t apu for under $500
Click to expand...

I am definitely looking for AMD to shake up the overpriced Intel-dominated laptop market as well!


----------



## warpuck

Quote:


> Originally Posted by *Alex132*
> 
> Uh.....................?


Back in the Win98, Win2K, XP days there was an add-in card + software that would do that. It was mostly to cut the cost of buying and maintaining workstations and to save space. You've got to remember that Coppermine P3s, P4s, registered RAM and mobos were even more costly then, so it was not very common to do this. Hint if you don't remember: 8MB of 60ns RAM, $800+ for an Intel P3T Supermicro board; 4MB, $400+ for a P4.


----------



## bigjdubb

Quote:


> Originally Posted by *nycgtr*
> 
> I don't know about all that, tbh. I have every Ryzen chip now. If you get a 1700 that does 3.9 without ridiculous voltage, then sure, you got a great deal. Otherwise you're spending time and effort to get something that $70 more (or $25 more at the one-day sale price of the 1700X) would have gotten you. I feel the 1700X and 1800X are much closer together and share a similar clock/volt wall. Granted, I don't have a ton of chips on hand, but I've gotten similar results on 2 1800Xs and 2 1700Xs.


That may be the case for now; time will tell if it remains so. It is still too early for me to make a judgement: the test pool is way too small, and it seems likely that software updates will improve things.
Quote:


> Originally Posted by *Nickyvida*
> 
> Well i buy to chase every last clock down and performance.
> But what is the distinction between a 1800x and 1700 if both are ultimately capped at the same speed of the slower chip? The returns arent justified for a good $200 more imo.


The return isn't good for your case. You would be better served with a 1700 or 1700x.

IMO the real value winners are going to be the six core models, especially if they can clock better.


----------



## CULLEN

Quote:


> Originally Posted by *Liranan*
> 
> LOL, only 20 tabs? Amateur. I have over 150 tabs across three browsers, two VMs, lots of programs, and then sometimes two games running at the same time
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My next CPU shall also be Ryzen.
> 
> Edit: I'm thinking of getting 32GB RAM for my next system as 16 is sometimes just not enough.


I think it's only around 20 tabs..

_starts counting_

Ah, it's 5 instances of Chrome, each of which is running 17 tabs on average. But yeah, that's what I'm talking about: real-life benchmarks without sacrifices.

If I were to play a game right now with the i7 7700K, I'd have to shut down the two VMs I've got running and probably a couple of other programs, play one map or so, and then open everything again. There is just no way I'd do that, and therefore all of these benchmarks we're seeing today are almost completely irrelevant to me, since they don't simulate my usage.

Looks like my next CPU will be Ryzen as well. I truly pity those who pick brand over functionality.


----------



## Nickyvida

Quote:


> Originally Posted by *bigjdubb*
> 
> That may be the case for now, time will tell if this remains the same. It is still to early for me to make a judgement, the test pool is still way too small and it seems likely that some software updates will improve things.
> The return isn't good for your case. You would be better served with a 1700 or 1700x.
> 
> IMO the real value winners are going to be the six core models, especially if they can clock better.


It isn't good for 1800X owners in general. Don't get me wrong: what AMD did was nothing short of astounding, and they got the pricing spot on. But for $200 more than the value chip, the 1800X clocks and performs exactly the same as the base chip in some cases; it's essentially a 1700 with XFR and whatnot. It would have been better if there were more headroom to distinguish the segments and justify the step up, perhaps 4.3-ish. Sure, there has been some binning involved, but what's the point if everything basically ends up the same once you overclock?


----------



## KarathKasun

Quote:


> Originally Posted by *Nickyvida*
> 
> It isn't good for 1800X owners in general. Don't get me wrong: what AMD did was nothing short of astounding, and they got the pricing spot on. But for $200 more than the value chip, the 1800X clocks and performs exactly the same as the base chip in some cases; it's essentially a 1700 with XFR and whatnot. It would have been better if there were more headroom to distinguish the segments and justify the step up, perhaps 4.3-ish. Sure, there has been some binning involved, but what's the point if everything basically ends up the same once you overclock?


Not sure what you are getting at; Intel does the same thing with price vs stock speeds. The number of people who choose CPUs based on overclocking capability is tiny compared to the number who just want to drop the CPU in and get the performance they paid for.


----------



## bigjdubb

Quote:


> Originally Posted by *Nickyvida*
> 
> It isn't good for all 1800X owners. Don't get me wrong, what AMD did was nothing short of astounding and they got the pricing spot on, but for $200 more than the value chip, the 1800X clocks and performs exactly the same as the base chip in some cases, which is essentially a 1700 with XFR and whatnot. It would have been better if there were more headroom to distinguish the segments and justify the step up, perhaps 4.3ish. Sure, there has been some binning involved, but what's the point if everything ends up basically the same when overclocking?


Well, I'm sure the chips would clock much higher if it were as simple as just choosing to have them clock higher. I highly doubt the 4.0-4.1GHz limitation was a choice someone made; it is what it is, and hopefully new steppings come along that allow higher clocks.


----------



## Nickyvida

Quote:


> Originally Posted by *KarathKasun*
> 
> Not sure what you are getting at; Intel does the same thing with price vs. stock speeds. The share of people who make CPU decisions based on overclocking capability is tiny compared to those who just want to drop the CPU in and get the performance they paid for.


Jeez, what a revelation. If you make your CPU overclockable, there will be people who want to wring every last drop of performance out of it, not pay more only to hit a brick wall at the same capped speed as the base chip.









In this case, it's lacking performance that could've been there with higher clock speeds.


----------



## Phixit

Quote:


> Originally Posted by *CULLEN*
> 
> I think it's only around 20 tabs..
> 
> _starts counting_
> 
> Ah, it's 5 instances of Chrome, of which each and everyone is running on average 17 tabs. But yeah, that's what I'm talking about, real-life benchmarks without sacrifices.
> 
> If I were to play a game right now with the i7 7700K, I'd have to shut down the two VM's I've got running and probably couple of other programs. Play one map or so and then open everything again. There is just no way I'd do that and therefore all of these benchmarks we're seeing today are almost completely irrelevant to me since they don't simulate my usage.
> 
> Looks like my next CPU will be Ryzen as well. I truly pity those who pick brand over functionality.


Well, I think Ryzen is the perfect CPU for your usage.

.. but I prefer to run my VMs on a dedicated host and use my personal computer as a gaming one!


----------



## KarathKasun

Quote:


> Originally Posted by *Nickyvida*
> 
> Jeez, what a revelation. If you make your CPU overclockable, there will be people who want to wring every last drop of performance out of it, not pay more only to hit a brick wall at the same capped speed as the base chip.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In this case, it's lacking performance that could've been there with higher clock speeds.


Yep, Intel or AMD has totally not done that before. You pay more for a better stock speed, which automatically means you get less OC headroom. The Athlon 64 X2 had some models that were right at the edge of available headroom; the P3 and 1st-gen Athlon top SKUs were the same, and some P4s were too.

Almost all silicon in the same series is going to OC about the same as well; this is why overclockers traditionally went for the low-end full-chip SKUs unless they were specifically looking for WR chips.

The R7 1800X still has OC room for anything more demanding than 1c/2c loads, and will likely get somewhat better clocks across all cores simply because of binning.


----------



## bigjdubb

Quote:


> Originally Posted by *Nickyvida*
> 
> Jeez, what a revelation. If you make your CPU overclockable, there will be people who want to wring every last drop of performance out of it, not pay more only to hit a brick wall at the same capped speed as the base chip.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *In this case, it's lacking performance that could've been there with higher clock speeds.*


In every case the CPU is lacking performance that could've been there with higher clock speeds. The i7 7700K would perform a lot better if it could clock to 8.0GHz. The chip clocks where it clocks; if that clock is not to your liking then maybe you should look into a different chip.


----------



## JackCY

Quote:


> Originally Posted by *bigjdubb*
> 
> In every case the CPU is lacking performance that could've been there with higher clock speeds. The i7 7700k would perform a lot better if it could clock to 8.0ghz. The chip clocks where it clocks, if that clock is not to your liking then maybe you should look into a different chip.


Only 8GHz? I'm disappointed, I wanted 10GHz.


----------



## BobiBolivia

Quote:


> Originally Posted by *bigjdubb*
> 
> In every case the CPU is lacking performance that could've been there with higher clock speeds. The i7 7700k would perform a lot better if it could clock to 8.0ghz. The chip clocks where it clocks, if that clock is not to your liking then maybe you should look into a different chip.


Quote:


> Originally Posted by *JackCY*
> 
> Only 8GHz? I'm disappointed, I wanted 10GHz.


Sums up OCN quite nicely.


----------



## bigjdubb

Right.

It would be faster if it was faster!


----------



## Majin SSJ Eric

I wanna know when these scumbag processor manufacturers are gonna stop sandbagging and give us TERA-hertz!!! Sick of this giga-crap!


----------



## spyshagg

Hi

Posted?
https://www.youtube.com/watch?v=40h4skxDkh4

This was known from looking at the diagrams, but the video explains how Ryzen's two quad-core modules on one chip need to be handled by Windows as NUMA instead of UMA (which appears to be how it is treated now).


----------



## JackCY

Quote:


> Originally Posted by *spyshagg*
> 
> Hi
> 
> Posted?
> https://www.youtube.com/watch?v=40h4skxDkh4
> 
> This was known from looking at the diagrams, but the video explains how ryzen two quad modules on one chip needs to be handled by windows as NUMA instead of UMA (as it appears to be now)


If I see this video linked once more, I will freakin' blacklist it in my firewall.

Windows needs a new mode which will not treat Ryzen as 2 CPUs but properly as 1 CPU. You can set it up as 2 right now, but that is bad because 1 process will then run on 1 "CPU", so only on 1 CCX.

As far as I know, this is OK for dual-CPU systems but bad for Ryzen. M$ needs to add more flexibility to the mapping and scheduling to suit the model used in Ryzen. No idea why AMD made it 4+4; I think Intel makes it 10+0, dunno about the higher core count Xeons.
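Until the scheduler understands CCXs, the manual workaround people discuss is pinning a process to the logical CPUs of a single CCX. A minimal Python sketch, with assumptions: 4 cores per CCX with SMT siblings enumerated adjacently (the real mapping is platform-specific), and `os.sched_setaffinity` is Linux-only (on Windows the equivalent would be `SetProcessAffinityMask` or `start /affinity`).

```python
import os

def ccx_logical_cpus(ccx_index, cores_per_ccx=4, smt=True):
    """Logical-CPU IDs belonging to one CCX.

    Assumes the common enumeration where SMT siblings are adjacent
    (logical 0,1 = core 0; 2,3 = core 1; ...). The real mapping is
    platform-specific -- verify with /proc/cpuinfo or Coreinfo first.
    """
    threads_per_core = 2 if smt else 1
    start = ccx_index * cores_per_ccx * threads_per_core
    return set(range(start, start + cores_per_ccx * threads_per_core))

if __name__ == "__main__":
    # Pin this process to CCX0 so its threads never pay the cross-CCX
    # fabric latency. Linux-only API, hence the hasattr guard.
    if hasattr(os, "sched_setaffinity"):
        ccx0 = ccx_logical_cpus(0) & os.sched_getaffinity(0)
        if ccx0:
            os.sched_setaffinity(0, ccx0)
```

The trade-off: the process keeps all its traffic inside one CCX's L3 slice, but it also gives up the other four cores entirely, which is exactly why a scheduler-level fix is preferable.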


----------



## spyshagg

What's wrong with the video?


----------



## madweazl

Quote:


> Originally Posted by *Nickyvida*
> 
> Jeez, What a revelation , if you make your CPU overclockable there will be people who want to overclock it to the last drop for performance, not end up at a brick wall paying for more but ending up with a capped speed as the base chip.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In this case, it's lacking performance that could've been there with higher clock speeds.


Welcome to less than 1% of the CPU market...

It lacks no performance; it does exactly what it was designed to do. You lacked the foresight to select a CPU that would provide the results you wanted. When people purchased the 7700K, they knew it wasn't going to pick up 1500MHz the way the 6700K could occasionally pull off, since the architecture hadn't changed. 5GHz was still right up near the wall; occasionally one overclocks exceptionally well and goes up to 5.5GHz. That will likely be true of the 1800X as well (not 5+GHz, but a few hundred over what most are reporting right now in the 4.1 range). Overclocking typically gave the savvy, budget-minded consumer the capability to squeeze out most of the performance lost by skipping the high-dollar component. Additionally, thinking an 8c processor would have the headroom of a 4c processor was equally delusional.


----------



## Ha-Nocri

RyZen review done right:




People are forgetting how good Ryzen really is... at idle it consumes the same power as the 7700K, a 4-core chip.

They also OCed it to 4.2GHz.


----------



## IRobot23

Quote:


> Originally Posted by *Ha-Nocri*
> 
> RyZen review done right:
> 
> 
> 
> 
> Ppl are forgeting how good RyZen really is... at idle it is consuming same power as 7700k, a 4 core chip.
> 
> They also OCed it to 4.2GHz


Well, most reviews are basically c**p.

Why? First of all, I thought that AMD Ryzen would be great in DX12 games that are ported from consoles.

1. Most of them said that it's probably IPC.
2. Some of them said that there are problems with memory, Win10 SMT, cache allocation, etc...

Why did I think that AMD *Ryzen* *would be great for gaming* with 2x CCX? *If you look at consoles you get the picture that I got.*
Nobody mentioned that CONSOLES also have a 2x4C layout with shared L2 per module. What is wrong with these guys? All they see is what they want to see...





----------



## Kuivamaa

Quote:


> Originally Posted by *Nickyvida*
> 
> It isn't good for all 1800X owners. Don't get me wrong, what AMD did was nothing short of astounding and they got the pricing spot on, but for $200 more than the value chip, the 1800X clocks and performs exactly the same as the base chip in some cases, which is essentially a 1700 with XFR and whatnot. It would have been better if there were more headroom to distinguish the segments and justify the step up, perhaps 4.3ish. Sure, there has been some binning involved, but what's the point if everything ends up basically the same when overclocking?


Most people do not overclock, even the customers of such chips; we are a vocal minority. I also got an 1800X hoping it would get me 300-400 extra MHz over a 1700. What I am getting is 200MHz at best, it seems, but when it comes to stock clocks there is a big difference. It's just not all that relevant to people like us in here.


----------



## tacobob89

I went with the 1700 because I plan on replacing this chip with a 2nd-gen chip or something that will OC better when they come out. Might as well save close to $200 now if I plan on replacing it in the next 1 to 2 years.


----------



## Kuivamaa

Quote:


> Originally Posted by *IRobot23*
> 
> Well, most reviews are basically c**p.
> 
> Why? First of all, I thought that AMD Ryzen would be great in DX12 games that are ported from consoles.
> 
> 1. Most of them said that it's probably IPC.
> 2. Some of them said that there are problems with memory, Win10 SMT, cache allocation, etc...
> 
> Why did I think that AMD *Ryzen* *would be great for gaming* with 2x CCX? *If you look at consoles you get the picture that I got.*
> *Nobody mentioned that CONSOLES also have a 2x4C layout with shared L2 per module. What is wrong with these guys? All they see is what they want to see...*


Consoles do not run windows. Ryzen CCX thread allocation is a windows issue.


----------



## IRobot23

Quote:


> Originally Posted by *Kuivamaa*
> 
> Consoles do not run windows. Ryzen CCX thread allocation is a windows issue.


Why didn't MS or Sony build it with 8 cores sharing one L2? Why did they go 2x4C, each sharing an L2?


----------



## TheReciever

Quote:


> Originally Posted by *JackCY*
> 
> You can do that sure but it's not what I had in mind. For your task where you use 1 app at a time any decent CPU is fine really as anything else will be in background and waiting pretty much most of the time.


I'm aware, I game on a 6-year-old mobile i7 just fine.

It's basically the opposite of what I desire though...


----------



## Kuivamaa

Quote:


> Originally Posted by *IRobot23*
> 
> Why didn't MS or Sony build it with 8 cores sharing one L2? Why did they go 2x4C, each sharing an L2?


Ask them? My guess is this template (the Jaguar quad design) was what AMD offered them, tried and tested, so they went with it for economies of scale.


----------



## IRobot23

Quote:


> Originally Posted by *Kuivamaa*
> 
> Ask them? My guess is this template (jaguar quad design) was what AMD offered them, tried and tested, so they went with that for economies of scale.


The platform is meant only for gaming...

But okay, you are probably correct; yet none of the reviewers mentioned it.

Sorry for being aggressive, apologies.

If anyone knows much more about architectures, could you explain the difference between the PS4/Xbox and Ryzen (CPU-wise)?


----------



## sumitlian

Quote:


> Originally Posted by *IRobot23*
> 
> Why didn't MS or Sony build it with 8 cores sharing one L2? Why did they go 2x4C, each sharing an L2?


I think it may have something to do with Heterogeneous System Architecture (HSA). IMO, this is why we cannot directly compare the CPU/OS architecture of consoles and Windows, despite both types of machines having x86-based CPUs: they still differ significantly in the lower-level kernel and the core microarchitecture. Both consoles are based on the HSA model. L2 is directly connected to unified RAM; that RAM is called unified in this context because CPU cores can directly access, via pointers, data in RAM that has been stored by GPU cores, and vice versa, I think. And before all of this you would have asked why MS or Sony did not provide separate memory for the GPU and CPU? The answer has been told.

Again, let's try to simulate something here according to the HSA specification: say the GPU cores write some data into unified RAM.
Now what would the best way be to make that data available to all CPU cores?








One of these methods has to work:
1) Dedicated L2 per CPU core: the GPU cores put the same data into the L2 caches of all cores. GPU/CPU actually write SIMD-type aligned data into RAM/cache to be used by the CPU/GPU cores. Result: a large amount of L2 cache is wasted, since the same data is copied multiple times.
or
2) Shared L2 for some/all CPU cores: the CPU cores can all load from the shared L2. Result: no need to write the data into multiple L2 caches. Each CPU core still has its private L1 instruction and data caches, hence all cores can work on Single Instruction, Multiple Data simultaneously without a problem.

Also, consoles have been designed to do graphics processing most of the time. This is why a console CPU will probably not face the kind of CISC-type difficulties/workloads a desktop has to face. Maybe this is why this design suited consoles best.

Edit: The Xbox 360's CPU, named XCPU (Xenon), seems to use a similar cores -> shared L2 design too.
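The duplication cost in method 1 is easy to put numbers on. A toy Python sketch; the sizes below are illustrative only, not actual console cache figures:

```python
def l2_bytes_consumed(working_set_bytes, n_cores, shared):
    """Total L2 capacity consumed when n_cores all need the same data.

    shared=True:  one copy lives in a cache every core can read.
    shared=False: each core's private L2 holds its own duplicate.
    """
    return working_set_bytes if shared else working_set_bytes * n_cores

KB = 1024
# Hypothetical: 512 KB of GPU-produced data needed by 4 cores.
duplicated = l2_bytes_consumed(512 * KB, 4, shared=False)   # 2048 KB consumed
single_copy = l2_bytes_consumed(512 * KB, 4, shared=True)   # 512 KB consumed
```

In this 4-core example the shared layout holds the same working set in a quarter of the capacity, which is the wastage argument in method 1.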


----------



## ryan92084

Quote:


> Originally Posted by *epic1337*
> 
> yes with the RAM clock speed, theres currently the controversy of a RAM clock speed bug and i was wondering if someone had tested trying 4000Mhz RAM via OC.
> if they could manage to pull it off, a good benchmark on performance scaling with that high RAM clock would be a good info.


Sorry, I haven't seen a full review doing anything over 3200MHz, and even that was just a YouTube review.


----------



## sugarhell

Quote:


> Originally Posted by *mcg75*
> 
> I just can't agree with this.
> 
> They brought an eight core cpu down to the price of a four core.
> 
> Who in their right mind wouldn't cross shop the two if it was going to meet their needs and in their price range?


Sometimes a bike and a motorcycle can cost the same, but you simply do not compare them. If I want a bike, I will get a bike.

It's okay to measure gaming performance, but they completely missed the target group of this CPU. And it's not only now; for example, the same happens with the 6900K and 7700K.

TL;DR: the two products have only the price in common. Their target groups are quite opposite in their needs.


----------



## epic1337

Ohh yeah, this reminds me: have any of the reviews tried overclocking with cores disabled?
e.g. 4+0, 2+2 or 4+2 configurations, to see whether it'll OC better than the full die.

From what I've seen of the full-die OC, it's not thermally bottlenecked, but the voltage is too high.
To point out: reaching 4GHz requires a vcore of around 1.4V, and reaching 4.2GHz requires around 1.5V.
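Those two data points line up with the usual first-order CMOS rule that dynamic power scales roughly with f·V². A quick sketch of what the extra 0.1V costs; the rule ignores leakage and assumes constant switched capacitance, so treat it as a rough estimate:

```python
def relative_dynamic_power(f2, v2, f1, v1):
    """First-order CMOS dynamic-power ratio: P scales roughly with f * V^2.

    Ignores leakage and assumes switched capacitance stays constant,
    so the result is only a ballpark figure.
    """
    return (f2 / f1) * (v2 / v1) ** 2

# The vcore figures quoted above: 4.0GHz @ 1.4V vs 4.2GHz @ 1.5V.
# Roughly 5% more clock for ~20% more dynamic power.
ratio = relative_dynamic_power(4.2, 1.5, 4.0, 1.4)
```

That lopsided trade is why the last few hundred MHz on an 8-core die get expensive in heat even before any thermal limit is reached.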


----------



## Nickyvida

Quote:


> Originally Posted by *bigjdubb*
> 
> In every case the CPU is lacking performance that could've been there with higher clock speeds. The i7 7700k would perform a lot better if it could clock to 8.0ghz. The chip clocks where it clocks, if that clock is not to your liking then maybe you should look into a different chip.


I didn't say it should clock that much. Just a little bit more to differentiate it from the 1700, around 4.2-4.3ish, since it is an eight-core. Right now, even with binning, it still tops out at the same limit as the 1700. If there's absolutely no difference between the two of them, why the $200 increase in pricing? There's nothing to justify that.


----------



## Oubadah

..


----------



## SuperZan

Quote:


> Originally Posted by *Nickyvida*
> 
> I didn't say it should clock that much. Just a little bit more to differentiate it from the 1700, around 4.2-4.3ish since it is an eight core. Right now even if there's binning, it still tops out at the same limit of the 1700. If there's absolutely no difference between the both of them, why the $200 increase in pricing? There's nothing to justify that.


It's been explained to you. The 1800X offers more stock performance for the 99% of people that will never even look at a BIOS, let alone push their hardware to the limits. Even on OCN, OVERCLOCK.net, we have many people around here who do not overclock. Instead of trying to call AMD out for creating SKU's which offer more performance for the majority, why not credit them for being the literal opposite of Intel and giving a gift (the 1700) to overclockers instead of gating an unlocked multiplier behind a fiscal barrier?


----------



## epic1337

Quote:


> Originally Posted by *Oubadah*
> 
> Bicycles that cost the same as motorcycles sounds like a pretty good metaphor for some of Intel's CPUs.


I've actually seen one, a bike that costs over $10K.
From what I remember, it was entirely made out of titanium alloy.


----------



## ryan92084

Quote:


> Originally Posted by *epic1337*
> 
> ohh yeah this reminds me, has the reviews tried overclocking with cores disabled?
> e.g. 4+0, 2+2 or 4+2 configurations, whether it'll OC better than the full die.
> 
> from what i've seen on the full die OC, its not thermally bottlenecked but the voltage is too high.
> to point out, reaching 4Ghz requires vcore at 1.4v, and reaching 4.2Ghz requires around 1.5v.


4c/8t @ 4GHz versus a 7700K at 4GHz (not English) http://www.zolkorn.com/reviews/amd-ryzen-7-1800x-vs-intel-core-i7-7700k-mhz-by-mhz-core-by-core/
I think computerbase might have done one too but I don't have the link handy.


----------



## epic1337

Quote:


> Originally Posted by *ryan92084*
> 
> 4c/8t @ 4ghz versus 7700k at 4ghz (not English) http://www.zolkorn.com/reviews/amd-ryzen-7-1800x-vs-intel-core-i7-7700k-mhz-by-mhz-core-by-core/
> I think computerbase might have done one too but I don't have the link handy.


This doesn't look good for OC potential, huh, if it's still stuck at 4GHz even in a 4C/8T configuration.


----------



## budgetgamer120

Quote:


> Originally Posted by *epic1337*
> 
> this doesn't look good with OC potential huh, if its still stuck at 4Ghz even at 4C/8T configuration.


Not sure why you would expect higher OC. It is the same silicon.


----------



## Nickyvida

Quote:


> Originally Posted by *SuperZan*
> 
> It's been explained to you. The 1800X offers more stock performance for the 99% of people that will never even look at a BIOS, let alone push their hardware to the limits. Even on OCN, OVERCLOCK.net, we have many people around here who do not overclock. Instead of trying to call AMD out for creating SKU's which offer more performance for the majority, why not credit them for being the literal opposite of Intel and giving a gift (the 1700) to overclockers instead of gating an unlocked multiplier behind a fiscal barrier?


Yes, so let's ignore the 1% entirely. Precisely: this is OVERCLOCK.net. So why am I being called out for wanting to chase down every last drop of performance?

The reason I bought it was to have a higher base clock and potentially higher overclocking headroom due to binning; to chase down every last drop of performance for the extra $200 premium asked. Not to find out that it is stuck at the limit of the base chip. You don't find supercars limited to the top speed of bread-and-butter cars, do you?

They did well with the 1700; the pricing and unlocked overclocking were spot on, I can't deny that.

But for the 1800X, I can't justify the $200 increase compared to the base chip. All in all, just a little 0.2GHz increase in max clocks would have been great, to differentiate the high-end chip from the low end.


----------



## ryan92084

I just found this amusing


----------



## budgetgamer120

Quote:


> Originally Posted by *ryan92084*
> 
> 
> 
> 
> 
> I just found this amusing


Lol nice


----------



## SuperZan

Quote:


> Originally Posted by *Nickyvida*
> 
> Yes so let's ignore the 1% entirely. Precisely, this is OVERCLOCK.net. So why am i being called out for wanting to chase down every last drop of performance?
> 
> The reason why i bought it is to have a higher base clock and potentially higher overclocking potential due to binning. To chase down every last drop of performance for the extra $200 premium asked for. Not to find out that it is stuck at the limit of the base chip. You don't find supercars limited to the top speed of bread and butter cars, do you?
> 
> They did good with the 1700, that i can't deny, the pricing and unlocked overclocking were spot on, i can't deny that.
> 
> But the 1800x, i can't justify the $200 increase as compared to the base chip. All in all, just a little 0.2 increase in max clocks would have been great, to differentiate between the high end chip and the low end.


They didn't ignore the 1%. All of the SKU's are unlocked and the 1700 catches up to just about 1700X/1800X levels. Why I and a few others have responded to you is because what you're saying is incongruous with reality. The 1800X, all things being equal, will give you that last drop of performance, even after overclocks. Whether that's worth the price premium or not is up to the individual, but it's not AMD's fault that you assumed something which wasn't stated.

The supercar/bread and butter car is not analogous to this discussion. Supercars have different engines/drive-trains/etc. don't they? The 1800X shares an engine, as it were, with the 1700X and 1700. If the 1800X could clock better, it would clock better, but so would the 1700X and 1700. AMD doesn't have the money to make a different design with different characteristics and (in all likelihood) a different production process. The 1800X and 1700X are primarily there for people who won't overclock, so they can get better base performance within the performance 'bubble' that Ryzen provides.

The design could've clocked higher on 16nm, sure, but as a strapped company AMD has geared their entire production process towards success in all key markets with as much modularity as possible. That's why we have 14nm LPP... and even with these clocks, Ryzen 7 performs very well. What you're saying boils down to you wishing that Ryzen were a little bit faster. We all wish that, just like we wish all processors were a little bit faster, and all GPU's were a little bit stronger. If wishes were horses, beggars would ride.


----------



## JackCY

Quote:


> Originally Posted by *Nickyvida*
> 
> I didn't say it should clock that much. Just a little bit more to differentiate it from the 1700, around 4.2-4.3ish since it is an eight core. Right now even if there's binning, it still tops out at the same limit of the 1700. If there's absolutely no difference between the both of them, why the $200 increase in pricing? There's nothing to justify that.


The differences are about 0.1GHz of average max clocks between the different AMD bins.
Quote:


> Originally Posted by *Nickyvida*
> 
> Yes so let's ignore the 1% entirely. Precisely, this is OVERCLOCK.net. So why am i being called out for wanting to chase down every last drop of performance?
> 
> The reason why i bought it is to have a higher base clock and potentially higher overclocking potential due to binning. To chase down every last drop of performance for the extra $200 premium asked for. Not to find out that it is stuck at the limit of the base chip. You don't find supercars limited to the top speed of bread and butter cars, do you?
> 
> They did good with the 1700, that i can't deny, the pricing and unlocked overclocking were spot on, i can't deny that.
> 
> But the 1800x, i can't justify the $200 increase as compared to the base chip. All in all, just a little 0.2 increase in max clocks would have been great, to differentiate between the high end chip and the low end.


Aaand this is a problem? If you don't want to pay $499 for the 1800X, buy a $329 1700. I've been saying this pre-launch and post-launch: there is nothing wrong with AMD's pricing, they are offering an incredible value of a CPU.
Overclocking is always limited, especially with large core-count chips. AMD is known for maxing out their products in volts and clocks and leaving little to no overclocking headroom, especially on the top-binned chips.

I guess it comes down to what I've said before: there will be people with an 1800X crying that their chip doesn't OC as much as a 1700, or that it cost them +$170 and they are getting the same OC. It's called the silicon lottery; you pay more for a possibly better chip that runs better clocks at stock, but any OC headroom is never guaranteed and can only be guessed after user binning.

Cars are a perfect analogy. Let's buy two identical cars, except: #1 with a 200km/h speed limiter but no intercooler (gotta supply your own, buddy) for $49.9k, and #2 with a 130km/h speed limiter including a standard small intercooler for $32.9k. Same car, different speed limits, intercoolers and price; seems fair, no?
Oh wait, now we take both to an aftermarket shop and have the limiters removed







Install a better intercooler on #2 and voila, our $32.9k "crap car" now performs almost the same as the originally more expensive #1.

---

LOL ryan.


----------



## epic1337

Quote:


> Originally Posted by *budgetgamer120*
> 
> Not sure why you would expect higher OC. It is the same silicon.


Less vdroop? Since adding more cores adds a parallel load to the supply line, the vdroop would be higher, which means it'll need a higher vcore to remain stable.
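The point can be put in numbers with the standard loadline relation, V_die = V_set - I_load * R_loadline. A small sketch; the 1 milliohm loadline and the current figures are made-up illustrative values, not measured Ryzen numbers:

```python
def loaded_voltage(v_set, i_load_amps, r_loadline_ohms):
    """Voltage the die actually sees after loadline (vdroop) losses:
    V_die = V_set - I_load * R_loadline."""
    return v_set - i_load_amps * r_loadline_ohms

# Hypothetical numbers: a 1 milliohm loadline, with an 8-core load
# drawing roughly twice the current of a 4-core load at the same vcore.
v_die_8c = loaded_voltage(1.40, 120, 0.001)  # 1.40 - 0.12 = 1.28 V
v_die_4c = loaded_voltage(1.40, 60, 0.001)   # 1.40 - 0.06 = 1.34 V
```

Doubling the current doubles the droop, so the full die needs a higher set-point (or more loadline calibration) to keep the same effective voltage at the cores.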


----------



## madweazl

Quote:


> Originally Posted by *Nickyvida*
> 
> Yes so let's ignore the 1% entirely. Precisely, this is OVERCLOCK.net. So why am i being called out for wanting to chase down every last drop of performance?
> 
> The reason why i bought it is to have a higher base clock and potentially higher overclocking potential due to binning. To chase down every last drop of performance for the extra $200 premium asked for. Not to find out that it is stuck at the limit of the base chip. You don't find supercars limited to the top speed of bread and butter cars, do you?
> 
> They did good with the 1700, that i can't deny, the pricing and unlocked overclocking were spot on, i can't deny that.
> 
> But the 1800x, i can't justify the $200 increase as compared to the base chip. All in all, just a little 0.2 increase in max clocks would have been great, to differentiate between the high end chip and the low end.


You can continue to ask the same question over and over if you want, but the answer will continue to be the same. AMD specifically said it would boost to a thermal limit; you chose not to believe their claims, but they were right on the money. Once again, the top-tier chips usually don't have a lot of headroom regardless of platform. Go ahead, ask your same silly question again tomorrow; maybe we'll have a different answer for you...


----------



## umeng2002

If Ryzen 5 doesn't overclock a little better than Ryzen 7, AMD needs a new stepping for next year.

Then Ryzen 2 in 2019.


----------



## epic1337

Quote:


> Originally Posted by *umeng2002*
> 
> If Ryzen 5 doesn't overclock a little better than Ryzen 7, AMD needs a new stepping for next year.
> 
> Then Ryzen 2 in 2019.


They just might; atm all of Ryzen (7, 5 and 3) will be on the same GloFo 14nm LPP process.
And on that note, GloFo's LPP supposedly isn't meant to target much above 4GHz; it was intended to sit between 2GHz and 4GHz.

Now looking at AMD's contracts, didn't they sign up for Samsung's 14nm FinFET process?


----------



## budgetgamer120

Quote:


> Originally Posted by *epic1337*
> 
> less vdroop? since adding more cores adds a parallel load to the supply line.
> the vdroop would then be higher, which means it'll need a higher vcore to remain stable.


Hmmm ok. Hopefully the quad-core has lots of cache to help.


----------



## blue1512

Quote:


> Originally Posted by *epic1337*
> 
> they just might, atm all of Ryzen ( 7, 5 and 3 ) will be on the same GloFo 14nm LPP process.
> and on that note, GloFo's LPP supposedly isn't meant to target >4Ghz much, they were intended to sit between 2Ghz~4Ghz.
> 
> now looking at AMD's contracts, didn't they sign up for Samsung's 14nm FinFet process?


Samsung's is LPP too; GloFo's process is basically Samsung's, licensed.

I think they could get a higher-clocking Ryzen if they tried TSMC's 16nm, just pure speculation though.


----------



## DaaQ

Quote:


> Originally Posted by *Nickyvida*
> 
> Yes so let's ignore the 1% entirely. Precisely, this is OVERCLOCK.net. So why am i being called out for wanting to chase down every last drop of performance?
> 
> The reason why i bought it is to have a higher base clock and potentially higher overclocking potential due to binning. To chase down every last drop of performance for the extra $200 premium asked for. Not to find out that it is stuck at the limit of the base chip. You don't find supercars limited to the top speed of bread and butter cars, do you?
> 
> They did good with the 1700, that i can't deny, the pricing and unlocked overclocking were spot on, i can't deny that.
> 
> But the 1800x, i can't justify the $200 increase as compared to the base chip. All in all, just a little 0.2 increase in max clocks would have been great, to differentiate between the high end chip and the low end.


How long have you been building computers? Long enough to remember the P4 Northwood 2.4GHz chip? The top SKU was what, iirc, 3.4 or 3.6? Or do you remember when the K series was actually released? How about a short time before that?
You are stuck in the Intel sales mindset, unfortunately.


----------



## Motley01

Quote:


> Originally Posted by *ryan92084*
> 
> 
> 
> 
> 
> I just found this amusing


HOLY CRAP! That is what you call "multi-tasking" LOL


----------



## umeng2002

Are we guessing that GloFo/Samsung 14nm LPP isn't good for 4+ GHz? Are we just assuming that based on Ryzen OC'ing?


----------



## Blameless

Quote:


> Originally Posted by *umeng2002*
> 
> Are we guessing that GloFo/Samsung 14nm LPP isn't good for 4+ GHz? Are we just assuming that based on Ryzen OC'ing?


Low power (leakage) and high clocks are competing and often mutually exclusive goals. Ryzen's clock limitations may well be partially due to the process, but the density and complexity of the architecture certainly play a role as well.

Would a different process have yielded higher clocks? Probably. Would a performance- rather than density-optimized library have yielded higher clocks? Probably.

However, either or both of these things would have come at a cost...namely higher power/heat and larger die size. AMD's decision was likely the best option for building a viable, profitable, part.


----------



## DaaQ

Quote:


> Originally Posted by *umeng2002*
> 
> Are we guessing that GloFo/Samsung 14nm LPP isn't good for 4+ GHz? Are we just assuming that based on Ryzen OC'ing?


Pretty much guessing. About six months ago it wasn't even supposed to do much more than 3 GHz; I think the going guesses were around 2.5-2.8 GHz.


----------



## sumitlian

[Youtube] RoadtoRyzen: AMD Ryzen overclock sleep bug
Read the full description.

If this is true, then all the low overclocking might only have to do with the buggy BIOS and nothing else.

Edit: Wait a sec, this probably has to do with the faulty timer, which has also been revealed by HWBot.
Edit2: Sad, I was wrong about that.








But did AMD sell us an artificial wormhole?


----------



## randomizer

Quote:


> Originally Posted by *Motley01*
> 
> HOLY CRAP! That is what you call "multi-tasking" LOL


Ryzen finally delivers the megatasking performance that AMD promised with Quad FX.


----------



## mAs81

Quote:


> Originally Posted by *sumitlian*
> 
> [Youtube] RoadtoRyzen: AMD Ryzen overclock sleep bug
> Read the full description.


There was a follow-up video about this, proving once again the faulty timer bug:
RoadtoRyzen: AMD Ryzen overclock sleep bug fully tested

After updating the BIOS on his motherboard, at least the temps are showing the correct numbers.

Ryzen is a new platform being launched in an all-Intel environment, regarding optimization in the OS, games, and various other scenarios.

But I, for one, even seeing the current performance in its un-optimized form, am feeling very optimistic about its future.


----------



## epic1337

Quote:


> Originally Posted by *Blameless*
> 
> Low power (leakage) and high clocks are competing and often mutually exclusive goals. Ryzen's clock limitations may well be partially due to the process, but the density and complexity of the architecture certainly play a role as well.
> 
> Would a different process have yielded higher clocks? Probably. Would a performance-oriented rather than density-oriented library have yielded higher clocks? Probably.
> 
> However, either or both of these things would have come at a cost, namely higher power/heat and a larger die size. AMD's decision was likely the best option for building a viable, profitable part.


Well, if they could push IPC a bit more in the next generation, then staying in the same 3-4 GHz clock range would be fine.
Since Intel is stuck at 5 GHz, and their IPC growth isn't stellar either, AMD can play "catch-up" by just increasing IPC a bit faster than Intel does.

Technically AMD's uarch is still fresh, so there should still be room to improve IPC; the CCX intercommunication, for example, is seriously slow.


----------



## M4c4br3

In Sweden, the 1700 is only €15 cheaper than the 7700K, while the 7700K is a much better overclocker.
Sigh, the same thing happened with the 480 vs the 1060: the 480 was actually more expensive but worse.
However, the 1800X is almost half the price of the 5960X, so there's that...


----------



## ToTheSun!

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mcg75*
> 
> I just can't agree with this.
> 
> They brought an eight core cpu down to the price of a four core.
> 
> Who in their right mind wouldn't cross shop the two if it was going to meet their needs and in their price range?
> 
> 
> 
> Sometimes a bike and a motorcycle can cost the same but you simply do not compare them. If i want a bike i will get a bike.
> 
> It's okay to measure gaming performance, but they completely missed the target group of this CPU. And it's not only now; for example, the same happens with the 6900K and 7700K.
> 
> TLDR: the two products have only the price in common. Their target groups are quite opposite in their needs.

Then AMD shouldn't have made gaming such a huge focus of their Ryzen marketing material. They weren't even modest about it, either.


----------



## epic1337

Quote:


> Originally Posted by *ToTheSun!*
> 
> Then AMD shouldn't have made gaming such a huge focus of their Ryzen marketing material. They weren't even modest about it, either.


They kept targeting Intel's 8-core and forgot that Intel's mainstream isn't the i7-6900K or i7-5960X.
Now that reviewers are pitting Ryzen against a 5 GHz i7-7700K, it looks mediocre all of a sudden.


----------



## Shatun-Bear

Quote:


> Originally Posted by *M4c4br3*
> 
> In Sweden, the 1700 is only €15 cheaper than the 7700K, while the 7700K is a much better overclocker.
> Sigh, the same thing happened with the 480 vs the 1060: the 480 was actually more expensive but worse.
> However, the 1800X is almost half the price of the 5960X, so there's that...


What are you talking about? The 1700 has double the cores and double the threads of the 7700K. It's remarkable it's so close in price to Intel's 4 core.


----------



## daviejams

Quote:


> Originally Posted by *Shatun-Bear*
> 
> What are you talking about? The 1700 has double the cores and double the threads of the 7700K. It's remarkable it's so close in price to Intel's 4 core.


Yeah, but the 7700K can run GTA V at 300 fps at 480p, whilst the 1700 can only run it at 250 fps at 480p.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Shatun-Bear*
> 
> What are you talking about? The 1700 has double the cores and double the threads of the 7700K. It's remarkable it's so close in price to Intel's 4 core.


Shhhhh, the only CPU that matters is whatever current Intel quad-core gaming CPU is available. Anything that is just 10% slower in gaming but can do literally everything else better is crap, including, it would seem, Intel's own HEDT platform. Also, as with Nvidia, whatever Intel does better than AMD is the most important aspect of a CPU. The second AMD offers comparable or better performance in that metric, it immediately becomes completely unimportant and the goalposts shift to something else that Intel does better.


----------



## ChronoBodi

So many Primes at Microcenter; every other X370 board is sold out. I wonder why.


----------



## Oubadah

..


----------



## ChronoBodi

Quote:


> Originally Posted by *Oubadah*
> 
> The fact that they were always comparing it to Intel HEDT parts should have been more than enough to temper everyone's expectations. I don't think it was unfair of AMD to promote Ryzen as a gaming part, because it _is_ great for gaming. It's just not as specialized as Kaby Lake is for certain gaming applications. Now there is a good choice of affordable gaming parts to suit everyone's requirements, which Intel refused to offer on its own.
> This is the same kind of wantonly inflammatory drivel you polluted every Windows 10 related thread with. How many people are actually calling Ryzen "crap"? A tiny minority.


Dude, Majin is being sarcastic.


----------



## Arturo.Zise

Is it too much to say that R7 Ryzen basically made Intel X99 obsolete overnight? If so, does that say more about AMD's progress or Intel's laziness? lol


----------



## ChronoBodi

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Is it too much to say that R7 Ryzen basically made Intel X99 obsolete overnight? If so, does that say more about AMD's progress or Intel's laziness? lol


Intel is dropping prices on their processors, but not that sharply; they're still quads at near R7 1700 prices.

To be fair, they had no competition for six years, and Intel's CEO actually didn't count on AMD making a comeback at all.


----------



## Oubadah

..


----------



## epic1337

Quote:


> Originally Posted by *ChronoBodi*
> 
> Intel is dropping prices on their processors, but not that sharply; they're still quads at near R7 1700 prices.
> 
> To be fair, they had no competition for six years, and Intel's CEO actually didn't count on AMD making a comeback at all.


They don't really have to drop prices; they still have the edge in single-thread performance due to higher IPC + higher clocks.
Furthermore, it's no misconception that the 7700K and 6700K are still the best choices for current-generation games.

It's even more so for Intel's HEDT: quad-channel memory and more PCIe lanes are still advantages that warrant a premium.


----------



## sumitlian

Quote:


> Originally Posted by *mAs81*
> 
> There was a follow-up video about this, proving once again the faulty timer bug:
> RoadtoRyzen: AMD Ryzen overclock sleep bug fully tested
> 
> After updating the BIOS on his motherboard, at least the temps are showing the correct numbers.
> 
> Ryzen is a new platform being launched in an all-Intel environment, regarding optimization in the OS, games, and various other scenarios.
> 
> But I, for one, even seeing the current performance in its un-optimized form, am feeling very optimistic about its future.


Agreed!

Quote:


> Originally Posted by *ToTheSun!*
> 
> Then AMD shouldn't have made gaming such a huge focus of their Ryzen marketing material. They weren't even modest about it, either.


^This, I can't keep myself from agreeing!
This is probably the only mistake AMD made with the Ryzen launch. (But again, that line of mine is just a response to an emotional state of mind, and many are still thinking like that.)

Why? Try to understand this: AMD hadn't shown us Ryzen doing _just_ gaming. They had shown Ryzen doing great while heavily multitasking (gaming + streaming, high settings, both multithreaded tasks). I don't know why many reviewers seemed to ignore that and compared the i7 7700K to the R7 in pure gaming only; that is the main cause of the problem, imo.
You may call this my perception of what happened after the launch. But in the real world, you can assign four dedicated cores or eight threads to file compression, video transcoding, streaming, or virtual machines, or run many background tasks with fewer threads assigned to each, and still retain more than playable fps in a AAA game. If you try out that side of Ryzen, the results are groundbreaking in a way that even a 5.0 GHz 7700K doesn't match.
Hence, technically AMD is still 100% right about Ryzen. It is the reviewers and some gamers who are so focused on gaming alone that they are unable to fathom Ryzen's true potential.
In the end, reviewers should have explicitly concluded that the i7 7700K is the best CPU if you are running a game and nothing major in the background, and Ryzen is the best CPU for real-time multitasking + gaming.
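That kind of core partitioning can be sketched concretely. The snippet below is a rough illustration only, assuming a Linux host (it relies on `sched_setaffinity`, which is Linux-specific); the ffmpeg command in the usage comment is hypothetical:

```python
import os
import subprocess

# Sketch: pin a background job to a subset of cores so a foreground game
# keeps the rest of the chip to itself. Linux-only (sched_setaffinity).

def run_pinned(cmd, cores):
    """Launch cmd restricted to the given set of CPU core indices."""
    def limit():
        # 0 means "this process"; runs in the child just before exec.
        os.sched_setaffinity(0, cores)
    return subprocess.Popen(cmd, preexec_fn=limit)

# Hypothetical usage: transcode on cores 4-7 while a game uses 0-3.
# proc = run_pinned(["ffmpeg", "-i", "in.mkv", "out.mp4"], {4, 5, 6, 7})
```

Windows offers the same idea via `start /affinity` or Task Manager's affinity dialog.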

[offtopic]: AMD stock is RyZing again.


----------



## pez

Quote:


> Originally Posted by *ChronoBodi*
> 
> Intel is dropping prices on their processors, but not that sharply; they're still quads at near R7 1700 prices.
> 
> To be fair, they had no competition for six years, and Intel's CEO actually didn't count on AMD making a comeback at all.


According to what? The prices have yet to drop (and I'm not talking about the singular instances where Microcenter was doing this).


----------



## Arturo.Zise

Yes, X99 has quad-channel memory and more PCIe lanes, but when an R7 1700 + GTX 1080 Ti can be had for the same price as a 6900K CPU alone, it's hard to justify the X99 platform as a productivity/gaming combo.

How far away is Skylake-E? Would be funny if the new 8/10-core CPUs launched at half their current lineup pricing.


----------



## prznar1

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Yes, X99 has quad-channel memory and more PCIe lanes, but when an R7 1700 + GTX 1080 Ti can be had for the same price as a 6900K CPU alone, it's hard to justify the X99 platform as a productivity/gaming combo.
> 
> How far away is Skylake-E? Would be funny if the new 8/10-core CPUs launched at half their current lineup pricing.


It won't be that cheap, but I would assume the price of the 1800X, or maybe even between the 1800X and the 1700X.


----------



## Kuivamaa

Ryzen is still a pretty damn good gaming chip and a powerhouse in general, lest we forget.


----------



## TheReciever

I'm wondering if I can drive games according to which monitor is connected? Like Monitor 1 > GPU0 and Monitor 2 > GPU1.

Then have the games be driven by separate GPUs on their respective monitors? I think you used to be able to do something like that in the old days, but I am not hopeful right now, since Windows isn't really concerned with anything outside the norm.

I won't be buying Ryzen anytime soon, at least not for now.

Just interested in that possibility.


----------



## Xuper

AMD Ryzen 1700 in VM!






6 VMs gaming at once!!


----------



## TheReciever

It was posted earlier


----------



## Oubadah

..


----------



## mcg75

Quote:


> Originally Posted by *sugarhell*
> 
> Sometimes a bike and a motorcycle can cost the same but you simply do not compare them. If i want a bike i will get a bike.
> 
> It's okay to measure gaming performance, but they completely missed the target group of this CPU. And it's not only now; for example, the same happens with the 6900K and 7700K.
> 
> TLDR: the two products have only the price in common. Their target groups are quite opposite in their needs.


That logic only applies in one direction though.

If I need an 8-core for content creation etc., then no, I can't cross-shop.

But if I've saved $350 for a good quad processor, I'd only be hurting myself not to consider Ryzen when actually buying.

This is not about buying what I need, it's about buying the most I can for the money.


----------



## JackCY

Quote:


> Originally Posted by *M4c4br3*
> 
> In Sweden, the 1700 is only €15 cheaper than the 7700K, while the 7700K is a much better overclocker.
> Sigh, the same thing happened with the 480 vs the 1060: the 480 was actually more expensive but worse.
> However, the 1800X is almost half the price of the 5960X, so there's that...


Wanna brag about a high OC? Get an Intel dual core. Want a lot of performance for a low price? Get Ryzen.
Quote:


> Originally Posted by *Arturo.Zise*
> 
> Yes, X99 has quad-channel memory and more PCIe lanes, but when an R7 1700 + GTX 1080 Ti can be had for the same price as a 6900K CPU alone, it's hard to justify the X99 platform as a productivity/gaming combo.
> 
> How far away is Skylake-E? Would be funny if the new 8/10-core CPUs launched at half their current lineup pricing.


Unlikely.
Quote:


> Originally Posted by *prznar1*
> 
> It won't be that cheap, but I would assume the price of the 1800X, or maybe even between the 1800X and the 1700X.


That's more than the suggested half.

If they drop more than 25%, say $1000 to $750, I will be surprised.
What they might want to do is offer more cores to keep their "exclusivity": a minimum of 8 cores, up to whatever overpriced Xeon you wanna buy.
Of course the performance may be a step ahead due to higher clocks and everything already being optimized for it, but value-wise it won't come close to Ryzen.

Right now you can build a whole 8-core Ryzen PC for the price of an 8-core Intel CPU alone; that's how big of a rip-off Intel HEDT is.


----------



## Carniflex

I can't find any B350 motherboard reviews? Surely there must be something out there; as far as I understand, many people have had to order a B350 instead of an X370 because of the current motherboard shortage.

It's a bit silly, to the point that I have to make my own spreadsheet to even compare the cheaper B350 mobos to each other.

It has already been a week since the launch! Surely somewhere in the tech press there is someone who could have filled this information gap? A whole week!


----------



## JackCY

Quote:


> Originally Posted by *Carniflex*
> 
> I can't find any B350 motherboard reviews? Surely there must be something out there; as far as I understand, many people have had to order a B350 instead of an X370 because of the current motherboard shortage.
> 
> It's a bit silly, to the point that I have to make my own spreadsheet to even compare the cheaper B350 mobos to each other.
> 
> It has already been a week since the launch! Surely somewhere in the tech press there is someone who could have filled this information gap? A whole week!


I bet they don't have them either, or after the massive backlash over poor review quality they are waiting for better UEFIs and more boards to be available. I would give it a couple of months, then look at mobos and buy Ryzen; not now.


----------



## pez

Quote:


> Originally Posted by *mcg75*
> 
> That logic only applies in one direction though.
> 
> If I need an 8-core for content creation etc., then no, I can't cross-shop.
> 
> But if I've saved $350 for a good quad processor, I'd only be hurting myself not to consider Ryzen when actually buying.
> 
> This is not about buying what I need, it's about buying the most I can for the money.


I don't understand how he's not understanding this point







.


----------



## Shiftstealth

Not sure why you guys are still feeding the trolls in this thread.


----------



## NoDestiny

Count me in on waiting on some mATX reviews.

I've been trying to follow this thread the best I can. I have a question that I don't know has been answered or not...

Do the B350 chipsets max out Ryzen just as well as the X370? I never did have a desire for the X370's extra cost for the features, if I can avoid it. Wish ASRock would hurry up with that basic B350M already...


----------



## KarathKasun

Quote:


> Originally Posted by *NoDestiny*
> 
> Count me in on waiting on some mATX reviews.
> 
> I've been trying to follow this thread the best I can. I have a question that I don't know has been answered or not...
> 
> Do the B350 chipsets max out Ryzen just as well as the X370? I never did have a desire for the X370's extra cost for the features, if I can avoid it. Wish ASRock would hurry up with that basic B350M already...


As long as the chipset allows overclocking, it does not matter. Only VRM and UEFI bugs matter.


----------



## ryan92084

That's a fancy RAM frequency there and an even fancier score.
https://m.facebook.com/story.php?story_fbid=1141173012695602&substory_index=0&id=346079665538278


----------



## sumitlian

Quote:


> Originally Posted by *ryan92084*
> 
> 
> That's a fancy RAM frequency there and an even fancier score.
> https://m.facebook.com/story.php?story_fbid=1141173012695602&substory_index=0&id=346079665538278


Are we sure this is not due to the timer bug?


----------



## budgetgamer120

Quote:


> Originally Posted by *M4c4br3*
> 
> In Sweden, the 1700 is only €15 cheaper than the 7700K, while the 7700K is a much better overclocker.
> Sigh, the same thing happened with the 480 vs the 1060: the 480 was actually more expensive but worse.
> However, the 1800X is almost half the price of the 5960X, so there's that...


The much better CPU is €15 cheaper. What's the problem?


----------



## rage fuury

*Surprises, surprises...*

"When we started this testing, we had no idea what the final results would be. Our best guesses were that Ryzen was going to sit between the 7700K and 5820K in most tests or suffer worse performance at resolutions above 1080P due to the DDR4 memory latency issues we saw when testing the CPU and motherboard last week. Of course, in real-world testing like this, it didn't seem to make much of a difference, at least regarding what you would notice as a consumer. In almost all tests (things didn't go perfectly in Far Cry Primal) the Ryzen 1800X gave the best frame rates at all resolutions, and even more so when pushed to 1440P and 2160P, where the 8-core 16-thread design of the CPU was able to relieve the GTX 1080 Ti of any bottlenecks in performance."

http://www.eteknix.com/nvidia-gtx-1080-ti-cpu-showdown-i7-7700k-vs-ryzen-r7-1800x-vs-i7-5820k/

Of course, the review must be biased and the results useless; they didn't test at 480p like everyone else... Ryzen sucks in gaming, it is known...


----------



## sumitlian

Quote:


> Originally Posted by *budgetgamer120*
> 
> The much better CPU is €15 cheaper. What's the problem?


He wants better 720p performance.


----------



## BinaryDemon

Quote:


> Originally Posted by *ryan92084*
> 
> That's a fancy RAM frequency there and an even fancier score.
> https://m.facebook.com/story.php?story_fbid=1141173012695602&substory_index=0&id=346079665538278


Impressive!
Quote:


> Originally Posted by *sumitlian*
> 
> Are we sure this is not due to the timer bug?


While it would obviously be an easy thing to fake, the score doesn't seem that crazy. Also, I don't think CPU-Z is fooled by the timer bug.


----------



## sugarhell

Quote:


> Originally Posted by *mcg75*
> 
> That logic only applies in one direction though.
> 
> If I need an 8-core for content creation etc., then no, I can't cross-shop.
> 
> But if I've saved $350 for a good quad processor, I'd only be hurting myself not to consider Ryzen when actually buying.
> 
> This is not about buying what I need, it's about buying the most I can for the money.


Yes, but you don't understand: by doing that in reviews, they missed the main target group of this CPU.

It's like when they bench low-end GPUs at 1080p full ultra @ 15fps.

I am saying they need to do both, and they failed miserably at doing that. Even in gaming reviews we still don't have multi-GPU results.

And in the end it's about what you need and what your pocket can buy. If I want to game I will get a 7700K, or if I need a more balanced CPU I will get a 1700.

Tell me this: I want this CPU for development. Except for some Linux-only sites, none actually did benchmarks for development. Unless you think Cinebench is one of them.

Actually, AnandTech did something close, especially with the PDF-opening benchmark. But that's it. In the Blender test they don't mention the settings or the complexity of the scene, which is beyond useless.

I understand your points and I agree with them. But from my point of view I don't have a good reference for this CPU on development or rendering.
Quote:


> Originally Posted by *pez*
> 
> I don't understand how he's not understanding this point
> 
> 
> 
> 
> 
> 
> 
> .


See above and please understand that.

TLDR: Ryzen is a workstation CPU that can do gaming just fine. But reviews focused way too much on gaming.


----------



## sumitlian

Sugarhell,

Here are some Linux benches; see if you can find something useful for your workloads.

https://www.servethehome.com/amd-ryzen-7-1700x-linux-benchmarks/
Quote:


> Final Words
> 
> From a pure performance perspective, the AMD Ryzen is more than just competitive


http://www.phoronix.com/scan.php?page=article&item=ryzen-1800x-linux&num=4
http://www.phoronix.com/scan.php?page=article&item=amd-ryzen-1700&num=1
https://phoronix.com/scan.php?page=news_item&px=AMD-Ryzen-Newer-Kernel


----------



## airfathaaaaa

Koreans know how to do a benchmark.


----------



## sugarhell

Quote:


> Originally Posted by *sumitlian*
> 
> Sugarhell,
> 
> Here are some Linux benches; see if you can find something useful for your workloads.
> 
> https://www.servethehome.com/amd-ryzen-7-1700x-linux-benchmarks/
> http://www.phoronix.com/scan.php?page=article&item=ryzen-1800x-linux&num=4
> http://www.phoronix.com/scan.php?page=article&item=amd-ryzen-1700&num=1
> https://phoronix.com/scan.php?page=news_item&px=AMD-Ryzen-Newer-Kernel


Quote:


> Except for some Linux-only sites


I think I was clear?


----------



## Kuivamaa

Ryzen is a powerhouse. The surprise is when its gaming performance is not near the top, not when it is. There are still lots of issues to be ironed out, but the raw horsepower is there.


----------



## sumitlian

Quote:


> Originally Posted by *sugarhell*
> 
> I think i was clear?


Oh, my bad, I misinterpreted your post. I thought you were talking about the R7 and Linux.








while it was the 7700K.


----------



## sugarhell

Quote:


> Originally Posted by *sumitlian*
> 
> Oh, my bad, I thought you were talking about the R7 and Linux.


Oh, OK. Thank you either way.


----------



## mcg75

Quote:


> Originally Posted by *sugarhell*
> 
> Yes, but you don't understand: by doing that in reviews, they missed the main target group of this CPU.
> 
> It's like when they bench low-end GPUs at 1080p full ultra @ 15fps.
> 
> I am saying they need to do both, and they failed miserably at doing that. Even in gaming reviews we still don't have multi-GPU results.
> 
> And in the end it's about what you need and what your pocket can buy. If I want to game I will get a 7700K, or if I need a more balanced CPU I will get a 1700.
> 
> Tell me this: I want this CPU for development. Except for some Linux-only sites, none actually did benchmarks for development. Unless you think Cinebench is one of them.
> 
> Actually, AnandTech did something close, especially with the PDF-opening benchmark. But that's it. In the Blender test they don't mention the settings or the complexity of the scene, which is beyond useless.
> 
> I understand your points and I agree with them. But from my point of view I don't have a good reference for this CPU on development or rendering.


Sorry Sugarhell, I misunderstood what you were getting at. You meant reviews not the actual consumer.


----------



## borandi

Quote:


> Originally Posted by *MadRabbit*
> 
> Since when is "open a PDF" a benchmark anyway? A serious question, actually. I just looked at the 7700K review; nothing of the sort on the same *cough* Anand *cough* page.
> 
> So now we are coming up with BS tests to show how "bad" a CPU is? Oh, wait, it's Cutress.


Is there a problem? As explained in the review, it's a new benchmark suite, so it won't be in the 7700K review.
The usefulness of the Open PDF test is spelled out in the review:
Quote:


> First up is a self-penned test using a monstrous PDF we once received in advance of attending an event. While the PDF was only a single page, it had so many high-quality layers embedded that it was taking north of 15 seconds to open and to gain control on the mid-range notebook I was using at the time. This made it a great candidate for our 'let's open an obnoxious PDF' test. Here we use Adobe Reader DC, and disable all the update functionality within. The benchmark sets the screen to 1080p, opens the PDF in fit-to-screen mode, and measures the time from sending the command to open the PDF until it is fully displayed and the user can take control of the software again. The test is repeated ten times, and the average time is taken. Results are in milliseconds.
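The ten-run, average-in-milliseconds harness described in that quote can be sketched generically. This is a rough stand-in, not the AnandTech tool: it times until the launched process exits, whereas the real benchmark waits until the document is rendered and responsive, and the Reader command in the usage comment is hypothetical:

```python
import statistics
import subprocess
import time

def avg_open_ms(cmd, runs=10):
    """Run cmd `runs` times and return the mean wall-clock time in ms."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True)  # crude: waits for process exit
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.mean(samples)

# Hypothetical usage:
# print(avg_open_ms(["AcroRd32.exe", "obnoxious.pdf"]))
```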


What particularly bugs you about Ian Cutress?


----------



## Kuivamaa

Opening an Excel sheet or a PDF several gigabytes in size, measuring compile times, etc. are all valid tests useful in software development.


----------



## Carniflex

Quote:


> Originally Posted by *TheReciever*
> 
> I'm wondering if I can drive games according to which monitor is connected? Like Monitor 1 > GPU0 and Monitor 2 > GPU1.
> 
> Then have the games be driven by separate GPUs on their respective monitors? I think you used to be able to do something like that in the old days, but I am not hopeful right now, since Windows isn't really concerned with anything outside the norm.
> 
> I won't be buying Ryzen anytime soon, at least not for now.
> 
> Just interested in that possibility.


Depends on the game. Some games, like EVE Online for example, let you set the GPU they use in their settings, so for those it's possible. I regularly have a couple of EVE clients running on secondary monitors and the second graphics card while also playing, for example, PlanetSide 2. The majority of games just go for whatever graphics card is driving your primary screen in Windows, even if the game window is located on some other display.


----------



## pez

Quote:


> Originally Posted by *sugarhell*
> 
> Yes, but you don't understand: by doing that in reviews, they missed the main target group of this CPU.
> 
> It's like when they bench low-end GPUs at 1080p full ultra @ 15fps.
> 
> I am saying they need to do both, and they failed miserably at doing that. Even in gaming reviews we still don't have multi-GPU results.
> 
> And in the end it's about what you need and what your pocket can buy. If I want to game I will get a 7700K, or if I need a more balanced CPU I will get a 1700.
> 
> Tell me this: I want this CPU for development. Except for some Linux-only sites, none actually did benchmarks for development. Unless you think Cinebench is one of them.
> 
> Actually, AnandTech did something close, especially with the PDF-opening benchmark. But that's it. In the Blender test they don't mention the settings or the complexity of the scene, which is beyond useless.
> 
> I understand your points and I agree with them. But from my point of view I don't have a good reference for this CPU on development or rendering.
> See above and please understand that.
> 
> TLDR: Ryzen is a workstation CPU that can do gaming just fine. But reviews focused way too much on gaming.


It is, but as mentioned, AMD marketed it way too much as a gaming chip and also priced the 1700 so that it just so happens to sit between the 7600K and the 7700K. So while I get your point, AMD did this to themselves a bit.


----------



## budgetgamer120

Quote:


> Originally Posted by *pez*
> 
> It is, but as mentioned, AMD marketed it way too much as a gaming chip and also priced the 1700 so that it just so happens to sit between the 7600K and the 7700K. So while I get your point, AMD did this to themselves a bit.


I don't recall AMD marketing Ryzen as a gaming chip. Maybe in your dreams?

They marketed it as a jack-of-all-trades / content-creator chip. And that's what Ryzen is.


----------



## mAs81

There's talk over at AnandTech about a new platform from AMD that's supposed to come out before Intel's X299, the codenamed 'eel' X399, that supposedly will look like this:

16C/32T
8CH DDR4
200W TDP
X399

..What?









Link


----------



## budgetgamer120

Quote:


> Originally Posted by *mAs81*
> 
> There's talk over at AnandTech about a new platform from AMD that's supposed to come out before Intel's X299, the codenamed 'eel' X399, that supposedly will look like this:
> 
> 16C/32T
> 8CH DDR4
> 200W TDP
> X399
> 
> ..What?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Link


Oh this is the $1000 platform then.


----------



## mAs81

Quote:


> Originally Posted by *budgetgamer120*
> 
> Oh this is the $1000 platform then.


Yeah, I'm guessing that it'll be near 6900K pricing, at least, but..... 200W??


----------



## pez

Quote:


> Originally Posted by *budgetgamer120*
> 
> I don't recall AMD marketing Ryzen as a gaming chip. Maybe in your dreams?
> 
> They marketed it as a jack-of-all-trades / content-creator chip. And that's what Ryzen is.


Maybe not so much marketing, but consumer hype. The same hype we see for Vega since the TXP released.


----------



## budgetgamer120

Quote:


> Originally Posted by *mAs81*
> 
> Yeah, I'm guessing that it'll be near 6900K pricing, at least, but..... 200W??


Well, the 5 GHz FX was 220W.


----------



## mAs81

Good point. Well, until some more trustworthy leaks occur we won't know for sure, but I'm very optimistic about AMD's future.


----------



## Carniflex

Quote:


> Originally Posted by *mAs81*
> 
> There's talk over at AnandTech about a new platform from AMD that's supposed to come out before Intel's X299, the codenamed 'eel' X399, that supposedly will look like this:
> 
> 16C/32T
> 8CH DDR4
> 200W TDP
> X399
> 
> ..What?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Link


I believe this to be just wild speculation based on the announcement of the "Naples" server platform, which has something similar: 32 cores / 64 threads (1.4 GHz, with boost up to 2.9 GHz) in a 220W TDP, 8-channel DDR4, and 128 PCIe lanes. A few less than 4000 pins in an LGA socket. The speculated price of these CPUs is north of $8000 per piece.

Previously, AMD has not released its server sockets for enthusiasts (the C32 and G34 socket Opterons), while Intel has used its LGA2011 also for the "enthusiast" line, with a different chipset from the server space.


----------



## The-Beast

This makes me wonder how modular the design of Naples is, since this rumor essentially points to it being cut in half.


----------



## Carniflex

Quote:


> Originally Posted by *The-Beast*
> 
> This makes me wonder how modular the design of Naples is, since this rumor essentially points to it being cut in half.


It seems AMD is able to use these Ryzen 4-core/8-thread building blocks at least up to a 32-core/64-thread design, based on their recent announcement of the "Naples" server platform. Such a chip was said to be only 220W TDP, which is damn impressive. It is not the AM4 socket, though; that one is AMD's new server socket, which has a few less than 4000 pins in LGA form and is HUGE. Also, 8-channel RAM and 128 PCIe 3.0 lanes.


----------



## epic1337

Quote:


> Originally Posted by *Carniflex*
> 
> I believe this is just wild speculation based on the announcement of the "Naples" server platform, which has something similar: 32 cores / 64 threads (1.4 GHz base, boost up to 2.9 GHz) in a 220W TDP, 8-channel DDR4, and 128 PCIe lanes, in an LGA socket with a few less than 4000 pins. The speculated price of these CPUs is north of $8000 apiece.
> 
> Previously AMD has not released its server sockets for enthusiasts (the C32 and G34 socket Opterons), while Intel has used its LGA2011 also for an "enthusiast" line with a different chipset from the server space.


It is exactly Naples; Naples will also have a 16C/32T SKU cut down from the full die.


----------



## ChronoBodi

Is there a reason why AMD used PGA for consumer but LGA for server?

Also, why is there no AMD consumer variant of the server parts, like Intel does with X99?

Or, short of that, why not AMD LGA on consumer as well?


----------



## epic1337

Quote:


> Originally Posted by *ChronoBodi*
> 
> is there a reason why AMD used PGA for consumer but LGA for server?


4000 pins is too many for PGA.


----------



## ChronoBodi

Quote:


> Originally Posted by *epic1337*
> 
> 4000 pins is too many for PGA.


Makes sense. But if they were doing a new socket anyway, why did they go with PGA for consumer?
If I had to guess, it's basic compatibility with AM3 clip-only coolers that don't rely on backplates to fit?

Just curious as to why not LGA for consumer when Intel does it already. Ehhhh, I'm just wondering.


----------



## epic1337

Quote:


> Originally Posted by *ChronoBodi*
> 
> makes sense. But if they were doing a new socket, why did they go with PGA for consumer?
> If i had to guess, it's basic compatibility with AM3 clip-only coolers that doesn't rely on backplates to fit?


PGA vs. LGA has nothing to do with cooler compatibility; they could fully support the previous coolers just by reusing the same mounting holes.

Quote:


> Originally Posted by *ChronoBodi*
> 
> Just curious as to why not LGA for consumer when Intel does it already. Ehhhh i'm just wondering.


the question is, what for?


----------



## Carniflex

Quote:


> Originally Posted by *ChronoBodi*
> 
> is there a reason why AMD used PGA for consumer but LGA for server?
> 
> Also, why no AMD consumer variant of server parts, like intel does with x99?
> 
> Or not even that, why not AMD LGA on consumer as well?


Yes, I'm pretty sure that's the answer.

Of course, _what_ the reason is remains up for speculation, and there are probably very few people present on this forum who were involved in that decision process.

As for an X99 equivalent, it has been speculated that AMD did not see a significant enough niche to justify it. Provided they can spare the funding for R&D, though, it might be nice for e-peen to release something like the SR-2 for enthusiasts, or even a single-socket Naples part with an unlocked multiplier, even if they do not make a special new chipset for it and instead go with a slightly modified server chipset. Probably not many people would go for it, as the cost of the mobo + CPU would probably be north of $3000.


----------



## epic1337

Quote:


> Originally Posted by *Carniflex*
> 
> For x99 equivalent - it has been speculated that AMD have not seen there to be a significant enough niche to justify that. Provided they can spare the funding for R&D though, it might be nice for e-peen to release something like SR-2 for enthusiasts. Or even a single socket Naples part with unlocked multiplier even if they do not do a special new chip-set for it but instead go with slightly modified server chip-set. Probably would not have many people going for it as the cost of the mobo+cpu would be probably north of 3000$.


Exactly. They've already bridged the gap between mainstream and HEDT using Ryzen 7 (8C/16T) and Ryzen 5 (6C/12T), so there's hardly any point in pushing for a higher core count.
Maybe in the future AMD will release a 4-CCX 16C/32T desktop chip, but that's for the future; for the current market, 8C/16T is already plenty to last until the next generation comes.


----------



## Carniflex

Quote:


> Originally Posted by *epic1337*
> 
> exactly, they've already bridged the gap between mainstream and HEDT using Ryzen 7 (8C/16T) and Ryzen 5 (6C/12T) so theres hardly any point in pushing for a higher core count.
> maybe in the future AMD will release a 4CCX 16C/32T desktop chip but thats for the future, for the current market 8C/16T is already plenty enough to last until the next generation comes.


I agree. They do have the TDP headroom to do up to a 16C/32T part at, say, 125W even with the current AM4 socket, but I honestly do not believe they will. That would probably cannibalize a bit too much of their server business plans, and it would get rather harsh reviews, because at, say, a 2.5 GHz base and 3.3 GHz boost it would not do well at all in 480p and 720p benchmarks of single-threaded games. And then you get headlines like "OMG this 1000$+ 16C/32T CPU can not game at all, omg its crap!!! Buy i7-7700k instead!".


----------



## Kuivamaa

Quote:


> Originally Posted by *ChronoBodi*
> 
> makes sense. But if they were doing a new socket, why did they go with PGA for consumer?
> If i had to guess, it's basic compatibility with AM3 clip-only coolers that doesn't rely on backplates to fit?
> 
> Just curious as to why not LGA for consumer when Intel does it already. Ehhhh i'm just wondering.


Well, LGA moves the pin cost to the board makers. Perhaps AMD is playing politics by staying with PGA.


----------



## epic1337

Quote:


> Originally Posted by *Carniflex*
> 
> I agree. They do have TDP headroom to do up to 16C/32T part that would be at, say, 125W even with the current AM4 socket but I honestly do not believe they will. That would probably cannibalize a bit too much their server business plans and would get rather harsh reviews because if it would be, say, 2.5GHz base and 3.3 GHz boost it would not do well at all benchmarking 480p and 720p resolutions in single threaded games. And then you get headlines like "OMG this 1000$+ 16C/32T CPU can not game at all, omg its crap!!! Buy i7-7700k instead!".


I'd be surprised if they'd even price a 16C/32T chip at $1000, since their 10C/20T, 12C/24T, and 14C/28T chips would have to sit between $499 and $1000; I highly doubt it.

And yes, knowing how the public compares stock-vs-stock performance, they'd without a doubt make a 16C/32T chip look like a waste of money.
Never mind the consumers: ~90% of reviewers themselves don't even know how to showcase the worth of the Ryzen 8C/16T.
What would they do, run straight gaming benchmarks with a couple of synthetics and conclude it's a mediocre chip?


----------



## ChronoBodi

Ugh, tell me about it. I mean, it's cool to have 16c/32t processors, but there is no way to avoid the idiotic audiences, you know, the "OMG IT SUX IN GAMeZ!" crowd, never mind that the higher-core-count processors beat the 4-core parts in everything else.

So really, for now, 8c/16t overclocked to 4 GHz is the best compromise: decent single-thread speed with much better multithreading than 4 cores.

On the other hand, the law of diminishing returns strikes. Who's to say we won't stay at 8 cores for mainstream even longer than we stayed at quad cores?

I mean, we have to go from 8 to 16, then to 32, then to 64... so yeah.


----------



## Carniflex

Quote:


> Originally Posted by *epic1337*
> 
> i'd be surprised if they'd even price a 16C/32T chip at $1000, since their 10C/20T, 12C/24T and 14C/28T chips would sit between $499 and $1000, i highly doubt this.


I'd be surprised if AMD did 12C/24T and 16C/32T in the AM4 socket at all. If they did, I would speculate that they would get away with that kind of price tag just fine, especially if people could still OC these as far as 3.9 to 4.0 GHz. The comparable Intel 10C/20T part currently sits at approximately $1500.

Purely speculating, but say a 12C/24T for ~$850 to $1000, and a 16C/32T at 3.0 GHz base with up to 3.6 GHz boost (and an unlocked multi, of course) for $1500, from AMD in the AM4 socket, would sound quite competitive to my ears. The Ryzen core is amazingly efficient at lower clocks, and there are workloads for which people would not even blink twice at buying such CPUs, for those specific tasks where all the extra cores/threads matter. But those are the same kind of people AMD would rather sell their Naples CPUs to, at twice or thrice the price.


----------



## epic1337

Quote:


> Originally Posted by *ChronoBodi*
> 
> Ugh, tell me about it. I mean, cool to have 16c/32t processors, but there is no way to avoid the idiotic audiences, you know the "OMG IT SUX IN GAMeZ!" crowd, nevermind that the higher-cored processors beat out the 4-core parts in everything else.
> 
> so, really for now, 8c/16t, OC 4 ghz was the best compromise of decent single-thread speed while having much better multithreading than 4 cores.
> 
> On the other hand, the law of diminishing returns strikes. Who's to say we may be at 8 cores for mainstream longer than the quad cores did?
> 
> i mean, we have to go from 8 to 16, then to 32, then to 64..... so yea.


8C/16T and 6C/12T will probably become mainstream soon, but not in the immediate months to come; we're probably looking at a couple of years.

4C/8T is still mainstream right now, since from what we can see they're on top of most "gaming benchmarks"... as if that's the only thing a desktop would ever do.
With that in mind, game devs will probably still aim for 8-thread optimization, with superficial support for 12- and 16-thread counts.


----------



## sumitlian

Quote:


> Originally Posted by *ChronoBodi*
> 
> Also, why no AMD consumer variant of server parts, like intel does with x99?


I don't remember the exact time, but I think it was near the Piledriver launch: AMD did provide the 65W Opteron 3280 (4M/8C) for AM3+. It was on the CPU support list of my Gigabyte 990FXA-UD5.
I think we might see Zen-based Opterons too somewhere in the upcoming months, or maybe not.


----------



## Carniflex

Quote:


> Originally Posted by *epic1337*
> 
> 8C/16T and 6C/12T will probably become mainstream soon, but not in the immediate months to come, we're probably looking at a couple of years.
> 
> 4C/8T is still mainstream right now since from what we can see they're on top of most "gaming benchmarks"... like its the only thing a desktop would ever do.
> with that in mind, game devs would probably still aim for 8thread optimization with superficial support for 12thread and 16thread counts.


I would not go as far as to say that 4C/8T is mainstream right now. Perhaps a quad core could be considered standard, as by now only 48% of the systems with Steam installed (i.e., "gaming" in some capacity) have just 1 or 2 cores/threads at their disposal, meaning that 52% of systems have three or more cores/threads available. See http://store.steampowered.com/hwsurvey/

Enthusiasts are such a small niche, considering all the people apparently playing at 1366x768 using the iGPU on a dual-core laptop.

I believe it might be a bit far-fetched even to get game devs to support 8 threads, and I believe it won't really happen until AMD manages to release their own compiler that is at least as good as the Intel-optimized ones. There was an announcement about something like that a couple of years ago, about AMD dedicating significant resources to a better compiler for their CPUs, including being HSA-aware and such. No idea where that project stands.

All that said, I'm quite interested in Ryzen and am keeping a keen eye on it. To me it is a damn impressive feat that AMD has pulled off here. Also, some aspects of their marketing slides regarding gaming have not gotten enough scrutiny, I believe. What raised my interest in gaming on Ryzen was specifically the claim that they have some tech to ensure a "smooth" experience, and also some talk about frame times. I do have a 1440p 144 Hz display, and while it would be great to hit 144 Hz on it, I'm content to settle for, say, 120 Hz if my minimum stays over 100 Hz, as opposed to a situation where I get over 144 Hz most of the time but dip as far as 80 Hz when a lot of stuff suddenly happens on screen.

So far it seems it would be pretty much a side-grade for me single-thread-wise (i7-3820 @ 4.3 GHz) while beating the crap out of my current rig multi-thread-wise. For serious multi-threaded work I can already remote-desktop to a proper server at work. But if, for example, in Planetside 2 in Hossin, when there is a 96+ vs. 96+ fight going on and my fps drops to the mid 40s, Ryzen would give me, say, high 60s, that would be a noticeable improvement. Or if I could turn shadows on without the fps heading to the dumpster as a result.


----------



## epic1337

Quote:


> Originally Posted by *Carniflex*
> 
> I believe it might be a bit far fetched even to get game devs to support 8 threads and I believe it wont really happen until AMD manages to release their own compiler that is at least as good as the Intel optimized ones. There were a bit announcement about something like that couple of years ago - about AMD dedicating significant resources to do a better compiler for their CPU's including of being HSA aware and stuff. No idea where they are with that project.


Hmmm, did you forget that the latest consoles use 8 cores, and AMD 8-core chips at that?


----------



## Carniflex

Quote:


> Originally Posted by *epic1337*
> 
> hmmm, did you forget that the latest consoles are using 8cores, and AMD's 8core chips at that?


But those are not PCs, strictly speaking. And while some ports have started using more cores as a result, the vast majority of PC games really do not use that many cores. Although, if one counted consoles as "PCs" as well, then of course one could argue that 8 cores is indeed mainstream. There are approximately 120 million current-generation consoles built worldwide, and Steam is at approximately 125 million users, a similar ballpark, which would push the share of people with 8+ cores/threads slightly over 50%, as currently about 3% of systems on Steam have 8+ cores.
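For what it's worth, that "slightly over 50%" figure checks out as back-of-envelope math. A minimal sketch, using the rough numbers from the post above (ballpark assumptions, not exact survey data):

```python
# Rough share of "gaming systems" with 8+ cores/threads, if current-gen
# consoles (all 8-core APUs) are counted alongside Steam systems.
# All figures are the post's ballpark estimates.
consoles = 120e6          # ~120 million current-gen consoles
steam_systems = 125e6     # ~125 million Steam users
steam_8plus_share = 0.03  # ~3% of Steam systems report 8+ cores

eight_plus = consoles + steam_8plus_share * steam_systems
total = consoles + steam_systems
print(f"{eight_plus / total:.1%}")  # prints 50.5%
```

So counting consoles roughly doubles the 8-core "install base" share, which is the whole argument.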


----------



## KarathKasun

Quote:


> Originally Posted by *TheReciever*
> 
> Im wondering if I can drive games according to which monitor is connected to it? Like Monitor 1 > GPU0 and Monitor 2 > GPU1
> 
> Then have the games be driven by separate GPU's on their respective monitors? I think you used to be able to do something like that in the old days but I am not hopeful about things right now since Windows isnt really concerned with anything outside the norm.
> 
> I wont be buying Ryzen anytime soon though, at least not for now.
> 
> Just interested in that possibility


No, Windows does not do that anymore. The last Windows OS that operated like that was XP.

I figured this out during my multiboxing days. Vista and later display the 2nd/3rd monitors by copying the framebuffer from the primary GPU, so all applications are rendered by a single GPU. AFAIK you would have to use virtual machines and PCIe VM passthrough to make that work in a modern Windows environment. Another example of a feature that has gone away in MS software.


----------



## budgetgamer120

Quote:


> Originally Posted by *ChronoBodi*
> 
> makes sense. But if they were doing a new socket, why did they go with PGA for consumer?
> If i had to guess, it's basic compatibility with AM3 clip-only coolers that doesn't rely on backplates to fit?
> 
> Just curious as to why not LGA for consumer when Intel does it already. Ehhhh i'm just wondering.


Quote:


> Originally Posted by *mAs81*
> 
> Yeah I'm guessing that it'll be near the 6900K pricing , at least , but ..... 200W ??


No thanks I hate LGA.


----------



## JackCY

Quote:


> Originally Posted by *KarathKasun*
> 
> No, windows does not do that anymore. The last Windows OS that operated like that was XP.
> 
> I figured this out during my multiboxing days. Vista and later display the 2nd/3rd monitors by framebuffer copying from the primary GPU, so all applications are rendered by a single GPU. You would have to use virtual machines and PCIe VM passthrough to make that work in a modern Windows environment AFAIK. Another example of features that have gone away in MS software.


That's nice, but some applications, such as madVR, still crash when moved between monitors. Go figure, really, with the multi-monitor stuff: in use for so many years, but never 100% stable and supported.


----------



## Liranan

Quote:


> Originally Posted by *sumitlian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ChronoBodi*
> 
> Also, why no AMD consumer variant of server parts, like intel does with x99?
> 
> 
> 
> I don't remember the exact time but I think it was near Piledriver launch, AMD did provide 65w Opteron 3280 (4M/8C) for AM3+. It was in the CPU support list of my Gigabyte 990FXA-UD5.
> I think we might see Zen based Opteron too somewhere in upcoming months or may be not.

ECC support is how Intel differentiates consumer from server-grade chips, and all AMD chips since the Athlon II have supported ECC RAM, so why would you need special Opteron chips for AM4?


----------



## daviejams

Quote:


> Originally Posted by *budgetgamer120*
> 
> No thanks I hate LGA.


LGA is much easier and safer if you have giant hands like me.

I've killed an FX-8350 before with these clumsy fingers, PGA, and an AMD stock cooler.


----------



## CULLEN

Quote:


> Originally Posted by *daviejams*
> 
> LGA is much easier and safer if you have giant hands like me
> 
> I've killed a FX8350 before with these clumsy fingers , PGA and an AMD stock cooler


If you have hands like Andre the Giant, installing a CPU is the least of your worries. Being clumsy is on you.


----------



## sumitlian

Quote:


> Originally Posted by *Liranan*
> 
> ECC support is how Intel differentiate consumer from server grade chips and all AMD chips since Athlon II have supported ECC RAM so why would you need special Opteron chips for AM4?


Very good question.
That is what I was thinking too. Guess I don't know the difference.

Somebody teach me.
Maybe they are even higher-quality chips in terms of longevity and power efficiency, or maybe they have support for some workstation-load-specific ISA we don't hear much about, or maybe a better IOMMU, I think.

Edit: Okay, that was me being stupid... IOMMU has only to do with the chipset, not the CPU.


----------



## daviejams

Quote:


> Originally Posted by *CULLEN*
> 
> If you have hands like Andre the Giant, installing a CPU is the least of your worries. Being clumsy is on you.


I have the most trouble with the power-switch jumpers, actually. Aftermarket CPU coolers can be a bit tricky for me too, but yeah, the power-switch jumper is the hard one to get in.


----------



## tpi2007

Regarding the 16C/32T and X399 platform rumour, I'd say it's a far-fetched idea. Maybe they can harvest CPUs with 4 CCXes and a core disabled here and there, or with only three CCXes for a 12C/24T CPU for enthusiasts and workstations, but more than that seems unlikely for the foreseeable future. Intel is maxing out at 10C/20T right now, and I don't see them going further than 12C/24T for Skylake-E either. You need a certain baseline clock speed, a decent power envelope, and good yields. I don't see an affordable 16C/32T CPU in this segment for a long time on 14nm.

Meanwhile, in the 8C/16T consumer realm, for Zen 2 there are IPC gains to be had, yields to perfect, and clock speeds to raise; right now there are BIOS and OS optimizations to do, and then general application and game-engine optimizations will follow suit.


----------



## ZealotKi11er

Quote:


> Originally Posted by *ChronoBodi*
> 
> is there a reason why AMD used PGA for consumer but LGA for server?
> 
> Also, why no AMD consumer variant of server parts, like intel does with x99?
> 
> Or not even that, why not AMD LGA on consumer as well?


Well, PGA means a cheaper socket and fewer RMA problems for motherboard manufacturers. That said, the cost savings are not really reflected on high-end parts.


----------



## Oubadah

I too loathe LGA. What's the advantage supposed to be?


----------



## SuperZan

Quote:


> Originally Posted by *Oubadah*
> 
> I too loathe LGA. What's the advantage supposed to be?


Gets the pulse up and the blood pumping any time you need to remove the CPU for any reason?


----------



## Kyube

To me, a person who upgrades every 5-6 years and wants the most framerate (low resolution, 144+ fps, mostly gaming) and the most future-proof build, the Ryzen R7 or R5 lineup will be the best choice. The fact that the platform will stay for 4-5 years is what keeps me on the Ryzen hype train, despite all these ridiculous reviews. If I see an insane IPC improvement and better overclocks in Zen 2 or Zen 3, I can just upgrade the CPU without replacing the motherboard. Once the launch bugs get squashed, I'm expecting a >5% improvement in ST and MT performance.

Oh, and yeah, I'm surprised nobody did a proper Win7 vs. Win10 test with some tweaks to both OSes (disabling bloatware, high-performance power plan, core unparking, etc.).


----------



## aberrero

Quote:


> Originally Posted by *Kyube*
> 
> To me, a person who upgrades every 5-6 years and wants the most framerate (low resolution, +144fps, mostly gaming) and the most future proof build, the Ryzen R7 or R5 lineup will be the best choice. The fact that the platform will stay for 4-5 years is what keeps me on the Ryzen hype train, despite all these ridiculous reviews. If I see a insane IPC improvement & better overclock in Zen2 or Zen3, I can just upgrade the CPU without replacing the motherboard. Once the the launch bugs get squashed, i'm expecting >5% improvement in ST & MT performance.
> 
> oh and yeah, i'm surprised nobody did a proper win7 vs win10 test with some tweaks to both OS (disabling bloatware, high-performance power plan,core unparking etc.)


RUZEN OFFICIALLY DOESNT SUPPORT WINDOWS 7. Neither does Kant lake.

Edit: holy autocorrect Batman. I'm leaving the whole comment as is.


----------



## iRUSH

Quote:


> Originally Posted by *aberrero*
> 
> RUZEN OFFICIALLY DOESNT SUPPORT WINDOWS 7. Neither does Kant lake.
> 
> Edit: holy autocorrect Batman. I'm leaving the whole comment as is.


I really like "Kant lake"


----------



## philhalo66

Quote:


> Originally Posted by *aberrero*
> 
> RUZEN OFFICIALLY DOESNT SUPPORT WINDOWS 7. Neither does Kant lake.
> 
> Edit: holy autocorrect Batman. I'm leaving the whole comment as is.


But both work perfectly on Windows 7.


----------



## sumitlian

Quote:


> Originally Posted by *CULLEN*
> 
> If you have hands like Andre the Giant, installing a CPU is the least of your worries. Being clumsy is on you.


This.
At least on PGA you don't have to buy a new board or wait on warranty if a pin (or pins) on the CPU is irreparable. You can buy any of the
Quote:


> Originally Posted by *iRUSH*
> 
> I really like "Kant lake"


----------



## Majin SSJ Eric

I dunno, I prefer LGA. It's a lot easier to damage pins on a CPU than on a mobo. I've dropped a couple of Intel CPUs while working around my systems, but luckily they had no pins on them. It's not quite as easy to drop an entire mobo.


----------



## Slomo4shO

Quote:


> Originally Posted by *mAs81*
> 
> There's talk over at anandtech about a new chipset from AMD that's supposed to come out before intel's X299 , the codenamed 'eel' X399 that supposedly will look like this:
> 
> 16C32T
> 8CH DDR4
> 200W TDP
> X399
> 
> ..What?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Link


If it is going to be a Naples derivative, couldn't there also be a 32C/64T variant?


----------



## Schottky

Quote:


> Originally Posted by *Kyube*
> 
> oh and yeah, i'm surprised nobody did a proper win7 vs win10 test with some tweaks to both OS (disabling bloatware, high-performance power plan,core unparking etc.)


I've seen one reviewer, in a review that came up today, test the 7700K vs. the 1700X on Win7 and Win10:

http://www.overclock.net/t/1625274/mindblank-tech-youtube-ryzen-1700x-review-4-days-worth-of-testing-ft-i7-7700k

With most games he tested, the W7 and W10 results were about the same. One game ran better on W7 and one game ran better on W10.

The W7 vs W10 starts at the 15:30 mark in the video.


----------



## SuperZan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I duuno, I prefer LGA. Its a lot easier to damage pins on a CPU than on a mobo. I've dropped a couple of Intel CPU's while working around my systems but luckily they had no pins on them. Its not quite as easy to drop an entire mobo.


I've had the opposite experience. I don't exactly have surgeon's hands so an accidental muscle twitch or dropped tool or losing grip on the processor can be disastrous with LGA. I don't really walk around holding processors or motherboards, so dropping one has never been the concern. In that sense, there is much more (for me) that can go wrong with LGA than PGA, as the processor is either in the motherboard or in a case and the motherboard is either in a box or in the case. I just don't have an in-between point where I could be dropping a processor from a height greater than a few inches, but a few inches into the LGA socket is a bad day with that board.


----------



## BinaryDemon

I'm surprised Intel or AMD hasn't designed a better way; no need to risk the CPU or motherboard with bent pins. I guess cost-wise it's not worth it.


----------



## mAs81

Quote:


> Originally Posted by *Slomo4shO*
> 
> If it is going to be a Naples derivative, Couldn't there also be a 32C64T variant?


It could, but everything is just speculation atm. The way I see it, Naples is supposed to cross swords with the Xeons as a high-performance server CPU, whereas this 'eel' is more likely to be an enthusiast X299 competitor.

Your guess is as good as mine, the way things stand now. But I'm currently mildly enthusiastic about what AMD will have to offer this year


----------



## epic1337

Quote:


> Originally Posted by *Slomo4shO*
> 
> If it is going to be a Naples derivative, Couldn't there also be a 32C64T variant?


Yes, just go with a Naples chip.
It's not like there's anything different about going with a server chip instead of a consumer chip, aside from cost.
Quote:


> Originally Posted by *BinaryDemon*
> 
> I'm surprised Intel or AMD hasnt designed a better way. No need risk the cpu or motherboard with bent pins. I guess costwise it's not worth it.


Probably cost. LGA's only problem is that its leaf-spring contacts are too fragile;
they could use a different type of leaf spring that's resistant to external damage.


----------



## yesitsmario

What are the chances that 6 core and 4 core zen overclocks higher?


----------



## SuperZan

There are process limitations with 14nm LPP, so don't expect night and day, but I'm convinced that 4.2/4.3 will be more normal for the quads, at least. Hard to say on the hexa-cores without knowing how they're going to handle production. Quads should be very consistent, because a quad is just one of the octa-core's two CCXs.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *SuperZan*
> 
> I've had the opposite experience. I don't exactly have surgeon's hands so an accidental muscle twitch or dropped tool or losing grip on the processor can be disastrous with LGA. I don't really walk around holding processors or motherboards, so dropping one has never been the concern. In that sense, there is much more (for me) that can go wrong with LGA than PGA, as the processor is either in the motherboard or in a case and the motherboard is either in a box or in the case. I just don't have an in-between point where I could be dropping a processor from a height greater than a few inches, but a few inches into the LGA socket is a bad day with that board.


Fair enough. When I was building my very first PC back in 2011 I pulled the 2600K out of the box and I was so nervous handling it my sweaty fingers slipped and it dropped all the way to the floor. Had it been PGA I have no doubt the pins would've been bent or broken. I suppose either platform has its own anecdotal horror stories and I've seen more people around here that agree with you than with me, but I still prefer LGA!


----------



## Liranan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SuperZan*
> 
> I've had the opposite experience. I don't exactly have surgeon's hands so an accidental muscle twitch or dropped tool or losing grip on the processor can be disastrous with LGA. I don't really walk around holding processors or motherboards, so dropping one has never been the concern. In that sense, there is much more (for me) that can go wrong with LGA than PGA, as the processor is either in the motherboard or in a case and the motherboard is either in a box or in the case. I just don't have an in-between point where I could be dropping a processor from a height greater than a few inches, but a few inches into the LGA socket is a bad day with that board.
> 
> 
> 
> Fair enough. When I was building my very first PC back in 2011 I pulled the 2600K out of the box and I was so nervous handling it my sweaty fingers slipped and it dropped all the way to the floor. Had it been PGA I have no doubt the pins would've been bent or broken. I suppose either platform has its own anecdotal horror stories and I've seen more people around here that agree with you than with me, but I still prefer LGA!

I have only broken the pins on one CPU since I started working with PCs, and that was a Pentium 133. However, I've had to replace quite a few s775 boards because of bad springs (bent or broken), so I definitely dislike LGA.


----------



## Slomo4shO

Quote:


> Originally Posted by *yesitsmario*
> 
> What are the chances that 6 core and 4 core zen overclocks higher?


Nowhere near as high as people want them to clock. 14nm LPP isn't designed for high clock rates; it is designed for efficiency.


----------



## SuperZan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Fair enough. When I was building my very first PC back in 2011 I pulled the 2600K out of the box and I was so nervous handling it my sweaty fingers slipped and it dropped all the way to the floor. Had it been PGA I have no doubt the pins would've been bent or broken. I suppose either platform has its own anecdotal horror stories and I've seen more people around here that agree with you than with me, but I still prefer LGA!


I can see why.

I suppose most people have had a bad day with one or the other, and that informs the subjective experience more than anything else.


----------



## Scotty99

From everything I've read, [email protected] is the sweet spot (at least for the 1700). The stock cooler should be able to handle this as well; what a bargain


----------



## yesitsmario

Quote:


> Originally Posted by *SuperZan*
> 
> There are process limitations with 14nm LPP so don't expect night and day, but I'm convinced that 4.2 / 4.3 will be more normal for the quads, at least. Hard to say on the hexas without knowing how they're going to handle the production. Quads should be very consistent because it's just one of the octa's two CCX's.


Quote:


> Originally Posted by *Slomo4shO*
> 
> No where near as high as what people want them to clock at. 14nm LPP isn't designed for high clock rates, it is designed for efficiency.


Aww, I was hoping for 4.5GHz on the quads.


----------



## Majin SSJ Eric

I'm getting a 1700X and I will absolutely get it to 4GHz with 3200MHz memory. I'll throw it in a CHVI with a couple of 1080s and call it a day. It will crush any game I play at 1440p (my resolution) and probably embarrass my 4930K in other areas like rendering, transcoding, and streaming. Can't wait to put it all together!


----------



## SuperZan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I'm getting a 1700X and I will absolutely get it to 4GHz with 3200MHz memory. I'll throw it in a CHVI with a couple of 1080's and call it a day. It will crush any game I play at 1440p (my resolution) and probably embarrass my 4930K in other areas like rendering, transcoding, and streaming. Can't wait to put it all together!


Mine is Y-Cruncher stable over 20 stress test runs at 3.85GHz, 1.32v and I'm able to use adaptive with moderate LLC. I can do 4.0 at around 1.38-1.4v but I'm waiting on a few updates before I mess with it much more. You should be able to get 4GHz fairly easily when you pick yours up, as many of the roadblocks will have been removed.


----------



## jprovido

Just bought an R7 1700X and an Asus Prime mATX motherboard for my VR build. I'll be getting the 1700X tomorrow and the motherboard on the 17th, which is the last part, so I should have the Ryzen build finished by then. Can't wait to try it out. I'll judge it without bias: if it's better than my 7700K @ 5.1GHz, I'll demote the 7700K, buy a CHVI, and put the 1700X in my main rig.


----------



## iRUSH

Quote:


> Originally Posted by *jprovido*
> 
> just bought an r7 1700x and asus prime matx motherboard for my VR Build. will be getting the 1700x tomorrow and the motherboard on the 17th which is the last part I will have finished the ryzen build by them. can't wait to try it out. I will judge it without bias if it's better than my 7700k @ 5.1ghz I will demote it, buy a CHVI and will put the 1700x on my main rig


Fun! What will you be doing to compare the two?

A 7700K is no slouch, let alone one at 5.1GHz. That's tough to beat.

I look forward to reading about your experience.


----------



## budgetgamer120

Quote:


> Originally Posted by *iRUSH*
> 
> Fun! What will you be doing to compare the two?
> 
> A 7700k is no slouch let alone one at 5.1ghz. That's tough to beat.
> 
> I look forward to reading about your experience.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *jprovido*
> 
> just bought an r7 1700x and asus prime matx motherboard for my VR Build. will be getting the 1700x tomorrow and the motherboard on the 17th which is the last part I will have finished the ryzen build by them. can't wait to try it out. I will judge it without bias if it's better than my 7700k @ 5.1ghz I will demote it, buy a CHVI and will put the 1700x on my main rig


It's not going to be better than a 5.1GHz 7700K in gaming, no question about that. The real question is whether the 15% difference in games at 1080p is really enough to outweigh the massive disparity in all other CPU workloads. If so, then the 7700K is the better choice for you. My contention is that for the vast majority of OCN members, the 1700X offers more than enough of an advantage in all other workloads over the 7700K to make up for its minor deficiencies in 1080p gaming performance. Hell, I've been running 1440p since 2012, when I ditched my S27A950D in favor of Korean IPS panels, and haven't looked back at 1080p since. Sure, 1080p/144Hz (or higher) monitors are a thing, but even on those it's not exactly like the 1700X falls on its face. If you have something like GTX 1080s, I'm sure a 1700X will get you over 144 FPS at 1080p just as well as a 7700K will...


----------



## cssorkinman

Pushing a Fury to 100% usage on low graphics settings at 1080p, DX11.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Its not going to be better than a 5.1GHz 7700K in gaming, no question about that. The real question is is the 15% difference in games at 1080p really enough to justify the massive disparity in performance in all other CPU workloads? If not then the 7700K is the better choice for you. My contention is that for the vast majority of OCN members the 1700X offers more than enough of an advantage in all other workloads over the 7700K to make up for its minor deficiencies in 1080p gaming performance. Hell, I've been running 1440p since 2012 when I ditched my S27A950D in favor of Korean IPS panels and haven't looked back to 1080p since. Sure 1080p / 144Hz (or more) monitors are a thing but even with those monitors its not exactly like the 1700X falls on its face or anything. If you have something like GTX 1080's I'm sure a 1700X will get you over 144 FPS at 1080p just as well as a 7700K will...


Hey, 15% in gaming is three generations of Intel CPUs. Are you looking down on people who got Haswell and Skylake? 15% is four years of engineering prowess from Intel.


----------



## ZealotKi11er

Quote:


> Originally Posted by *cssorkinman*
> 
> Pushing a Fury to 100% usage on low graphics settings at 1080 p DX 11


I would like to see some FPS and CPU testing in BF1 done on 64-player MP maps.


----------



## Majin SSJ Eric

I think every i7 since the 2600K has been a fantastic processor, and still more than enough for gaming even in 2017. What I find interesting is that since Ryzen came out, Haswell and Broadwell are somehow crappy gaming CPUs now. That's what you are saying if you say that Ryzen sucks for gaming.


----------



## cssorkinman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> Pushing a Fury to 100% usage on low graphics settings at 1080 p DX 11
> 
> 
> 
> 
> 
> I would like to see some fps and CPU test in BF1 is done in MP 64 maps.

On the 64-player Giant's Shadow map I averaged 173 fps, with a 128 fps minimum and a 200 fps maximum (capped). I believe the minimum was registered at the first polling point.

Graphed


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think every i7 since the 2600K have been fantastic processors and still more than enough for gaming even in 2017. What I find interesting is that since Ryzen came out Haswell and Broadwell are somehow crappy gaming CPU's now. That's what you are saying if you say that Ryzen sucks for gaming.


Since the 2600K, upgrading for gaming has been pointless for 95% of people. The only people who "benefit" are high-Hz users, and even then most of the games people play, like CS:GO and Overwatch, can run 144+ fps on a 2600K. Yes, the 7700K is faster at gaming than the 2600K if tested accordingly. Yes, Zen is slower in gaming right now. All I am saying is that the Zen 8-core is not for gamers. The 6-core will cost less and will show its strengths in gaming much sooner, within 1-2 years, while the 8-core will be too slow, a "Q6600", by the time games use 16 threads. I just don't see many people here needing 8-core CPUs. I haven't needed an i7 all these years.


----------



## ZealotKi11er

Quote:


> Originally Posted by *cssorkinman*
> 
> 64 person giant's shadow map I averaged 173 fps 128 min 200 max ( capped) I believe the minimum was registered at the first polling point.
> 
> Graphed


What is the CPU usage like? Does it go over 50%?


----------



## cssorkinman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> 64 person giant's shadow map I averaged 173 fps 128 min 200 max ( capped) I believe the minimum was registered at the first polling point.
> 
> Graphed
> 
> 
> 
> 
> 
> What is the CPU usage like? Do it go over 50%?

I can't seem to make it go over 50% CPU usage with the Fury (at least not with a 200 fps cap). I'd like to test a 1080 Ti with it and take the cap off.

During one run, that single-player tank mission averaged 198 fps with a 200 fps cap. I've forgotten what my CPU speed was on that one, but I have managed to play 64-player games at 4125 MHz.


----------



## Twotenths

I am going to hold off on buying a Ryzen chip and motherboard until next year.

The main reason is that at the end of 2017 PCIe will be adopting the Gen 4.0 standard. It doubles PCIe lane speeds, so in effect board makers can either provide more effective Gen 3.0 bandwidth or let you set 4.0 slots to 3.0 in the BIOS, allowing your extra slot to run at a full x16 instead of the x4 offered now.

Unfortunately they can't deliver it later via a BIOS update. It requires a separate baked-in SoC (system-on-chip) processor on the motherboard to handle the extra bandwidth, taking that load off the CPU. The CPU does it right now: PCIe lanes are built into the processor, and Gen 4.0 will get its own on-board processor to double the speeds. AMD could not wait another year to launch; they had to release now or lose the advantage of what they have accomplished to Intel's probable price cuts. It must be noted that Ryzen is much more a workstation platform than a gaming one. The numbers are in and the proof is there. With time AMD may be able to tweak the BIOS and improve gaming performance; it is a brand-new platform, after all, so it takes time to get things just right.

For me, I'm waiting a year for the platform to mature a little and for the introduction of PCIe Gen 4.0. It will be introduced at the end of 2017, and new boards will be coming out to support it. I think the processor won't change but the motherboard will. (Hopefully they won't need new processors to do it; I doubt it as long as socket AM4 remains unchanged.)


----------



## Scotty99

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Since 2600K upgrading for gaming 95% of people has been pointless. Only people that "benefit" are high Hz users and even then most games people play like CSGO, Overwatch can run 144fps + with 2600K. Yes 7700K is faster at gaming than 2600K if tested accordingly. Yes Zen is slower in gaming right now. All I am saying is Zen 8 Core is not for gamers. 6 Core will cost less and will show it strengths in gaming much faster in 1-2 years while 8 Core will be too slow "Q6600" when game use 16T. I just do not see many people here having a need for 8 Core CPUs. I have not had a need for i7 for all these years.


Actually, you are thinking about this the wrong way.

As games go forward we will undoubtedly see more core usage. Sure, maybe not all 8 cores used all the time, but that headroom lets other programs run in tandem, like streaming software, or lets you have more stuff plugged into USB, etc.

Originally, when the Ryzen lineup/prices were leaked, the 1600X/1500 were the ones I was interested in, but honestly the upcharge for 8 cores isn't that bad (compared to what Intel does, at least), so I decided not to wait and build now. We're not going to be seeing 4.5GHz 6-cores, btw; my guess is they will overclock identically to the 8-cores and maybe get 100MHz more out of them.


----------



## Quantum Reality

I swear some people are not going to ever be happy about Ryzen.

This upgrade is *a no-brainer for anybody in the AMD ecosystem*. Period. With something like a 50% increase in performance over the "Construction" cores, Ryzen clearly blows the doors off anything AMD has made previously, and that alone should be reason to applaud this CPU's release.

Y'all are sweating minor details when the broad brushstroke is this: AMD has a CPU that is worth upgrading to if you have any older AMD system. *The fact that it now pulls within a few percentage points of Intel's mainstream offerings is a welcome extra.*


----------



## Brutuz

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I think every i7 since the 2600K have been fantastic processors and still more than enough for gaming even in 2017. What I find interesting is that since Ryzen came out Haswell and Broadwell are somehow crappy gaming CPU's now. That's what you are saying if you say that Ryzen sucks for gaming.


Meanwhile, I'm sitting here on an IvB i5 happily maxing out TW3 with a few graphics mods at 40-50fps average...

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Since 2600K upgrading for gaming 95% of people has been pointless. Only people that "benefit" are high Hz users and even then most games people play like CSGO, Overwatch can run 144fps + with 2600K. Yes 7700K is faster at gaming than 2600K if tested accordingly. Yes Zen is slower in gaming right now. All I am saying is Zen 8 Core is not for gamers. 6 Core will cost less and will show it strengths in gaming much faster in 1-2 years while 8 Core will be too slow "Q6600" when game use 16T. I just do not see many people here having a need for 8 Core CPUs. I have not had a need for i7 for all these years.


Except the Q6600 ended up being faster than its competitors and priced very cheaply before its EoL. Most people who had one kept it right up past SB or even IB because it still ran everything fine, whereas most Core 2 owners either went Nehalem or specifically waited for SB despite CPU bottlenecks occurring in games. The main ways games can keep expanding are now much more easily multi-threaded, and Ryzen has plenty of potential left compared to Kaby Lake. I also severely doubt it'll be too slow to be a gaming CPU by the time games use those threads... it'll already have an advantage if games merely use the 7 threads available to them on the two x86 consoles, since it can give them all real cores rather than HyperThreads.

Plus, plenty of us like to game while the PC is doing other tasks, and Ryzen completely obliterates Kaby Lake in that arena.
Quote:


> Originally Posted by *Quantum Reality*
> 
> I swear some people are not going to ever be happy about Ryzen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This upgrade is *a no-brainer for anybody in the AMD ecosystem*. Period. With something like a 50% increase in performance over "Constructor" cores, Ryzen clearly blows the doors off anything AMD has made previously and this alone should be reason to applaud the release of this CPU.
> 
> Y'all are sweating minor details when the broad brushstroke is this: AMD has a CPU that is worth upgrading to if you have any older AMD system. *The fact that it now pulls within a few percentage points of Intel's mainstream offerings is a welcome extra.*


Hell, it's worth upgrading to even if you have an older Intel system. The only reason I can see Kaby Lake being the better choice in that price range is if you upgrade CPUs often and just want the best right now, or if you do high-refresh-rate gaming; otherwise Ryzen is either faster, or still fast enough, at everything.


----------



## Charcharo

Quote:


> Originally Posted by *Twotenths*
> 
> I am going to hold off on buying a Ryzen chip and motherboard until next year.
> 
> The main reason is because at the end of 2017 PCIe will be adopting the Gen 4.0 standard. It doubles the PCIe lane speeds so in effect they can either put in more PCIe Gen 3.0 lanes instead or make it possible to set 4.0 to 3.0 in the bios allowing your extra slot to be a full x16 instead of x4 that they offer now.
> 
> Unfortunately they can't do a bios update to give it to you later. It requires a separate SoC (System on Chip) baked in processor on the motherboard to handle the extra bandwidth. It will take that load off of the CPU. The CPU does it right now. PCIe lanes are built into the processor. Gen 4.0 will get it's own baked on processor to be able to double up the speeds. AMD could not wait another year to launch. They had to release it now or lose the advantage of what they have accomplished by Intel's probable down pricing. It must be noted that the Ryzen is much more a workstation system than a gaming one. The numbers are in and the proof is there. With time AMD may be able to tweak the bios and improve their gaming performance. It is a brand new platform after all so it takes time to get things just right.
> 
> For me I'm waiting a year for the build to mature a little and for the introduction of PCIe Gen 4.0. It will be introduced at the end of 2017 and new boards will be coming out to support it. I think the processor won't change but the motherboard will. (Hopefully they won't need new processors to do it. I doubt it as long as socket AM4 remains unchanged).


So you think AM4+ (which will support Ryzen 1, 2, and 3, just like AM4) will come out in Q1/Q2 2018 or Q4 2017?

I am in the same boat. The Ryzen 1700 already looks like a great upgrade over my locked i5 4460 (EVEN in gaming, but I do more than game), but I am waiting for PCIe 4.0. Is there more info about it?

Also, are there any DDR5 leaks or rumors?


----------



## Liranan

Quote:


> Originally Posted by *Charcharo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Twotenths*
> 
> I am going to hold off on buying a Ryzen chip and motherboard until next year.
> 
> The main reason is because at the end of 2017 PCIe will be adopting the Gen 4.0 standard. It doubles the PCIe lane speeds so in effect they can either put in more PCIe Gen 3.0 lanes instead or make it possible to set 4.0 to 3.0 in the bios allowing your extra slot to be a full x16 instead of x4 that they offer now.
> 
> Unfortunately they can't do a bios update to give it to you later. It requires a separate SoC (System on Chip) baked in processor on the motherboard to handle the extra bandwidth. It will take that load off of the CPU. The CPU does it right now. PCIe lanes are built into the processor. Gen 4.0 will get it's own baked on processor to be able to double up the speeds. AMD could not wait another year to launch. They had to release it now or lose the advantage of what they have accomplished by Intel's probable down pricing. It must be noted that the Ryzen is much more a workstation system than a gaming one. The numbers are in and the proof is there. With time AMD may be able to tweak the bios and improve their gaming performance. It is a brand new platform after all so it takes time to get things just right.
> 
> For me I'm waiting a year for the build to mature a little and for the introduction of PCIe Gen 4.0. It will be introduced at the end of 2017 and new boards will be coming out to support it. I think the processor won't change but the motherboard will. (Hopefully they won't need new processors to do it. I doubt it as long as socket AM4 remains unchanged).
> 
> 
> 
> So you think that AM4+ (which will support Ryzen 1, 2 and 3 just like AM4) will come out in Q1/Q2 2018 or Q4 2017?
> 
> I am in the same boat. Ryzen 1700 is already looking like a great upgrade over my locked i5 4460 (EVEN in gaming, but I do more than game) but am waiting for PCIE 4.0. Is there more info about it?
> 
> Also are there any DDR5 leeks or rumors?
Click to expand...

AMD have brought the release of Zen 2 forward by a year; it will be released next year already. This year's APUs will ship with some of Zen 2's improvements already built in, which is why they are being released after the highest-end chips.

Whether AM4+ will be released next year is another question, but PCI-E 4 will bring no advantage to regular consumers, so it's irrelevant unless you just want the higher number.

If you do more than gaming with that 4460, then Zen is perfect for you. If you don't need the power of an 8-core, 16-thread monster, then wait for the 6-core, 12-thread parts. They will still outclass i7s in multithreading while being a lot cheaper.

DDR4 was released last year and you already want DDR5? DDR5 will be released in 3 or 4 years if roadmaps don't change.


----------



## Charcharo

Quote:


> Originally Posted by *Liranan*
> 
> AMD have brought the release of Zen2 forward by a year, it will be released next year already. APU's this year will be released with some of Zen2's improvements already built in, which is why they are being released after the highest end chips.
> 
> Whether AMD4+ will be released next year is another question but PCI-E 4 will bring no advantage to regular consumers so it's irrelevant, unless you just want that higher number.
> 
> If you do more than gaming with that 4460 then Zen is perfect for you. If you don't need the power of an 8 core, 16 thread monster then wait for the six core 12 threads. They will still outclass i7's in multithreading while being a lot cheaper.
> 
> DDR4 was released last year and you already want DDR5? DDR5 will be released in 3 or 4 years if roadmaps don't change.


DDR4 was released in September 2014, not last year.

I hear rumors that PCIe 4 will be a big deal and that some GPUs made for it won't work on 3.0. I don't have the money yet (I will in 2-3 months, I guess), but if I ever buy a good, expensive motherboard, I want it to last a long time for sure.


----------



## budgetgamer120

Quote:


> Originally Posted by *Charcharo*
> 
> DDR4 was released in September 2014. Not last year
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hear rumors that PCIE 4 will be a big deal and that some GPUs from it wont work on 3.0 . I do not have the money yet (will do in 2-3 months I guess) but if I ever buy a good, expensive motherboard, I want it to last long for sure.


PCIe is always backwards compatible... I have a server with PCIe Gen 1 running a PCIe Gen 3 card.


----------



## Cherryblue

Quote:


> Originally Posted by *Brutuz*
> 
> Meanwhile, I'm sitting here on an IvB i5 happily maxing out TW3 with a few graphics mods at 40-50fps average...


Hate to break it to you, but it's already a two-year-old game... sadly. (#GeraltFanNumberOne)


----------



## Charcharo

Quote:


> Originally Posted by *budgetgamer120*
> 
> Pcie is always backwards compatible... Have a server with PCIe gen one with PCIe gen 3 card in there.


True... but I have never bought a mobo over 70 dollars for myself.

So AMD has got me to the point where I'm thinking about an expensive mobo. That is what I would do for AM4+.


----------



## Rmerwede

I don't understand all the hubbub, or why we are even comparing R7 to Kaby Lake. It is meant to compete with *BROADWELL-E* at a much lower price point. And it does.

R5 and R3 will compete with Kaby with similar benches in games, and be priced appropriately.


----------



## Particle

Quote:


> Originally Posted by *Rmerwede*
> 
> I don't understand all the hub bub, or why we are even comparing R7 to Kaby Lake. it is meant to compete with *BROADWELL-E* at a much lower price point. And, it does.
> 
> R5 and R3 will compete with Kaby with similar benches in games, and be priced appropriately.


Like politics or religion, people feel invested and will move the goal posts until their world view lines up with what they want.


----------



## mcg75

Quote:


> Originally Posted by *Rmerwede*
> 
> I don't understand all the hub bub, or why we are even comparing R7 to Kaby Lake. it is meant to compete with *BROADWELL-E* at a much lower price point. And, it does.
> 
> R5 and R3 will compete with Kaby with similar benches in games, and be priced appropriately.


If you had money saved for Kaby Lake but got a chance to buy a Broadwell-E for the same price, would you turn it down?

That's basically what the current Ryzen chips are. That's why the comparison is being made.

If it wasn't for the gaming results, there would be little to no reason to buy Intel mainstream at this point.


----------



## Liranan

Quote:


> Originally Posted by *Charcharo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Liranan*
> 
> AMD have brought the release of Zen2 forward by a year, it will be released next year already. APU's this year will be released with some of Zen2's improvements already built in, which is why they are being released after the highest end chips.
> 
> Whether AMD4+ will be released next year is another question but PCI-E 4 will bring no advantage to regular consumers so it's irrelevant, unless you just want that higher number.
> 
> If you do more than gaming with that 4460 then Zen is perfect for you. If you don't need the power of an 8 core, 16 thread monster then wait for the six core 12 threads. They will still outclass i7's in multithreading while being a lot cheaper.
> 
> DDR4 was released last year and you already want DDR5? DDR5 will be released in 3 or 4 years if roadmaps don't change.
> 
> 
> 
> DDR4 was released in September 2014. Not last year
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hear rumors that PCIE 4 will be a big deal and that some GPUs from it wont work on 3.0 . I do not have the money yet (will do in 2-3 months I guess) but if I ever buy a good, expensive motherboard, I want it to last long for sure.

I didn't realise DDR4 had been out that long already. It matters not; DDR5 won't be released for another few years, so waiting for it is useless.


----------



## Charcharo

Quote:


> Originally Posted by *Liranan*
> 
> I didn't realise DDR4 has already been out for about a year and a half already. It matters not, DDR5 won't be released for another few years to come so waiting for it is useless.


Good, then AM4+ with PCIe 4.0 it is! I will have the moneyz by then, I hope.

Now let's hope Vega isn't too expensive...


----------



## Sodalink

Quote:


> Originally Posted by *DADDYDC650*
> 
> $309 for Ryzen 1700.... Ordered one. I get 10 percent cash back which brings the total down to $279. Wow!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Check your email for ebay 10 percent cash back offer. Expires on March 9th.
> 
> http://www.ebay.com/itm/New-AMD-Ryzen-7-1700-8-Core-3-0GHz-Desktop-Processor-AM4-65W-YD1700BBAEBOX-/351996487464


I ordered one and hope it ships. I got a message from them saying they were out of stock, but they said they'll restock this week and I should get one if I'm willing to wait.

It was a pretty good deal; I'll be saving about $77 ($50 plus $27 in taxes), which I needed because I sold my i7 5820K kind of cheap. I really don't regret it, though. I do want to save some energy going from a 140W TDP CPU to a 65W TDP part with 2 extra cores. I haven't seen benchmarks comparing the two, but I would assume the 1700 will do as well as the i7 5820K or better.


----------



## aberrero

Quote:


> Originally Posted by *mcg75*
> 
> If you had money saved for Kaby Lake but got a chance to buy a Broadwell W for the same price would you turn it down?
> 
> That's what current Ryzen basically are. That's why it's being compared.
> 
> If it wasn't for the gaming results, there would be little to no reason to buy Intel mainstream at this point.


When R5 and R3 come out, Intel's midrange i5 and i3 CPUs instantly become obsolete. The 7700K will continue to be the single threaded king.


----------



## iRUSH

Quote:


> Originally Posted by *aberrero*
> 
> When R5 and R3 come out, Intel's midrange i5 and i3 CPUs instantly become obsolete. The 7700K will continue to be the single threaded king.


I can't wait!

I thought I was in for a 1700 but I'm reconsidering for the 1600X while also waiting for a great mATX board.


----------



## sugarhell

https://community.amd.com/community/gaming/blog/2017/03/13/amd-ryzen-community-update?sf62109582=1


----------



## KarathKasun

Quote:


> Originally Posted by *Charcharo*
> 
> DDR4 was released in September 2014. Not last year
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hear rumors that PCIE 4 will be a big deal and that some GPUs from it wont work on 3.0 . I do not have the money yet (will do in 2-3 months I guess) but if I ever buy a good, expensive motherboard, I want it to last long for sure.


PCIe 3.0 x16 is not bottlenecking any GPU around today. It can be argued that PCIe 3.0 x8 is enough bandwidth even for the highest-end cards in mGPU scenarios.

Going from x8 to x16 gives 5% or less extra performance.
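The bandwidth numbers behind that claim can be sketched with a quick back-of-the-envelope calculation (not from this thread; per-lane transfer rates and encodings are from the published PCIe specs, and the function name is mine):

```python
# Approximate usable PCIe bandwidth per lane, per direction.
# Gen 1/2 use 8b/10b line encoding; Gen 3/4 use 128b/130b.
GEN_RATES_GT_S = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}

def lane_bandwidth_gb_s(gen: int) -> float:
    """Usable GB/s per lane after encoding overhead."""
    efficiency = 8 / 10 if gen <= 2 else 128 / 130
    return GEN_RATES_GT_S[gen] * efficiency / 8  # bits -> bytes

for gen in (3, 4):
    for lanes in (8, 16):
        total = lane_bandwidth_gb_s(gen) * lanes
        print(f"Gen {gen} x{lanes}: ~{total:.1f} GB/s")
```

Gen 3 x16 works out to roughly 15.8 GB/s per direction, and a Gen 4 x8 link carries the same, which is why an x8-to-x16 delta of 5% or less on Gen 3 suggests Gen 4 won't matter much for a single GPU.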


----------



## budgetgamer120

Quote:


> Originally Posted by *mcg75*
> 
> If you had money saved for Kaby Lake but got a chance to buy a Broadwell W for the same price would you turn it down?
> 
> That's what current Ryzen basically are. That's why it's being compared.
> 
> If it wasn't for the gaming results, there would be little to no reason to buy Intel mainstream at this point.


Did someone hack your account?


----------



## Arturo.Zise

Quote:


> Originally Posted by *sugarhell*
> 
> https://community.amd.com/community/gaming/blog/2017/03/13/amd-ryzen-community-update?sf62109582=1


So AMD are saying that the thread scheduler and SMT problems don't actually exist, and everything is working as intended?


----------



## Twotenths

Quote:


> Originally Posted by *Charcharo*
> 
> So you think that AM4+ (which will support Ryzen 1, 2 and 3 just like AM4) will come out in Q1/Q2 2018 or Q4 2017?
> 
> I am in the same boat. Ryzen 1700 is already looking like a great upgrade over my locked i5 4460 (EVEN in gaming, but I do more than game) but am waiting for PCIE 4.0. Is there more info about it?
> 
> Also are there any DDR5 leeks or rumors?


It may just be that they will need an AM4+ socket, but so far it's too early to say. In mid-summer last year they nailed down revision 0.7 of the PCIe 4.0 standard; that was good enough for board manufacturers to start designing new boards. I believe the standard was fully completed and tested in late January. Ryzen was already committed to PCIe Gen 3.0; it was too late to change specs and delay the launch by months. AMD has to start making money before they can modify their next-generation boards, and maybe even do a processor refresh in a year. From what I understand, AMD has spare pins on the CPU that are not in use right now. If that is true, the change may not be as drastic as we might believe. AMD knew about PCIe 4.0; the consortium was already well into it back in 2015. I can't see them launching without a bare-minimum contingency plan for early next year.

Intel isn't in a hurry for a new chipset change or modification, but their enthusiast X99 boards are due for a new chipset. They only support two processor generations per chipset: X79 had Sandy Bridge-E and Ivy Bridge-E, and X99 has likewise seen its two with Haswell-E and Broadwell-E. It would be perfect timing to come out with another new chipset that supports PCIe Gen 4.0 for Skylake-E and Kaby Lake-E. I'm sure they are already working on it, but nothing has leaked so far.

I have heard nothing about DDR5. I doubt that will happen for quite some time; DDR3 lasted for many years before DDR4 came out. They can already make single 32GB sticks of memory, and even bigger if you go down the server road. Windows 10 and DDR4 can both support 512GB of installed RAM. Who would need that? It would be server-based.


----------



## kyrie74

Quote:


> Originally Posted by *Twotenths*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Charcharo*
> 
> So you think that AM4+ (which will support Ryzen 1, 2 and 3 just like AM4) will come out in Q1/Q2 2018 or Q4 2017?
> 
> I am in the same boat. Ryzen 1700 is already looking like a great upgrade over my locked i5 4460 (EVEN in gaming, but I do more than game) but am waiting for PCIE 4.0. Is there more info about it?
> 
> Also are there any DDR5 leeks or rumors?
> 
> 
> 
> It may just be that they will need an AM4+ socket, but so far it's too early. In mid-summer last year they nailed down revision 0.7 of the PCIe 4.0 standard. That was good enough for board manufacturers to start designing new boards already. I believe the standard was fully completed and fully tested in late January. Ryzen was already committed to PCIe Gen 3.0; it was too late to change specs and delay the launch by months. AMD has to start making money before they can modify their next generation of boards, and maybe even do a processor refresh in a year. From what I understand, AMD has spare pins on the CPU that are not in use right now. If that is true, then the change may not be as drastic as we might believe. AMD knew about PCIe 4.0; the PCI-SIG consortium was already well into it back in 2015. I can't see them launching early next year without at least a bare-minimum contingency plan.
> 
> Intel isn't in a hurry for a new chipset change or modification, but their enthusiast X99 boards are due for a new chipset. They only support two processor generations per chipset: X79 had Sandy Bridge-E and Ivy Bridge-E, and X99 has likewise seen its two with Haswell-E and Broadwell-E. It would be perfect timing to come out with another new chipset that supports PCIe Gen 4.0 for Skylake-E and Kaby Lake-E. I'm sure they are already working on it, but nothing has leaked so far.
> 
> I have heard nothing about DDR5. I doubt that will happen for quite some time; DDR3 lasted for many years before DDR4 came out. They can already make 32GB single sticks of memory, and even bigger if you go down the server road. Windows 10 and DDR4 can both support 512GB of RAM installed. Who would need that? It would be server based.

Last I read about DDR5 the thinking was 2020 at the earliest.


----------



## doza

http://www.guru3d.com/news-story/amd-ryzen-7-have-a-temperature-20-degree-c-reporting-offset.html


----------



## aberrero

Quote:


> Originally Posted by *doza*
> 
> http://www.guru3d.com/news-story/amd-ryzen-7-have-a-temperature-20-degree-c-reporting-offset.html


I've literally spent hours readjusting my cooler trying to get my cpu to idle below 70C.


----------



## ChronoBodi

Quote:


> Originally Posted by *aberrero*
> 
> I've literally spent hours readjusting my cooler trying to get my cpu to idle below 70C.


Um, if you have an H100i, then, uh, something is off.

Let's assume the temps we knew before the "+20C offset" was known. Even then, my weaker H60 can do 53-55C idle, 75C load.

So that's pretty much 30C idle, 55C load after taking off the +20C offset.

So something is off with your H100i.
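ChronoBodi's arithmetic (reported temperature minus the offset) can be sketched as a small helper. This is a hypothetical function of my own, assuming, per the guru3d story linked above, that only the X-series parts (1700X/1800X) report tCtl with a +20C offset:

```python
# Hypothetical helper for the +20C tCtl reporting offset described in
# the linked guru3d story. Assumption: X-series parts (1700X/1800X)
# report tCtl = tDie + 20C; non-X parts (e.g. the 1700) report tDie directly.

X_SERIES_OFFSET_C = 20.0

def tdie_from_tctl(tctl_c: float, x_series: bool = True) -> float:
    """Approximate die temperature from the reported tCtl value."""
    return tctl_c - X_SERIES_OFFSET_C if x_series else tctl_c

# ChronoBodi's H60 numbers: ~53-55C idle and 75C load as reported,
# i.e. roughly 30C idle / 55C load once the offset is removed.
print(tdie_from_tctl(55.0))  # -> 35.0
print(tdie_from_tctl(75.0))  # -> 55.0
```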


----------



## sumitlian

Quote:


> Originally Posted by *Arturo.Zise*
> 
> So AMD are saying that the thread scheduler and SMT problems don't actually exist, and everything is working as intended?


Yes.


----------



## Kuivamaa

SMT issues are well documented by now, though.


----------



## Twotenths

Quote:


> Originally Posted by *kyrie74*
> 
> Last I read about DDR5 the thinking was 2020 at the earliest.


DDR3 came out in 2007. DDR4 was introduced in 2014. That's a 7-year run. It would not surprise me if DDR4 also lasted another 7 years.


----------



## sumitlian

Quote:


> Originally Posted by *Kuivamaa*
> 
> SMT issues are well documented by now, though.


Yeah, that's what I am seeing too. Maybe it has something to do with the current BIOS code, or these CPUs were simply made to work like this.
Aside from the weird Task Manager behavior and synthetic latency tests, I don't think we've heard of any impact on perceptible performance yet in any real-world application.
On the contrary, at least in gaming (BF1, I think), some users have claimed smoother gameplay with Ryzen; MindBlank Tech (a YouTuber) has reported a similar experience.








Well, I am surprised that even after the Zen launch, some mysteries still exist.

Edit: Also, if you look at Linux benchmarks, Ryzen is beating even the latest Intel Xeon counterparts in performance per dollar, multi-threaded performance and performance per watt.
Imo, if something were wrong with the latency to the extent that it could shatter the performance of current general/special-purpose apps, then somebody would have noticed in at least one real-world app.

One more thing on the current latency results. I am not an expert on any of this, nor do I have any deep understanding of how to make a cache benchmark (though I have a very good understanding of how pointers work in C, and I have done some assembly in MASM - I can create a function and set up some loops, nothing major), and I haven't tried to make one yet. But I am speculating anyway, since I'm interested to learn:
Doesn't the latency depend on what instruction/opcode we are using to measure it?
I mean, there are hundreds of opcodes in the x86-64 arch, and similar data can be transferred between the CPU and memory (or vice versa) in different manners. Many of those opcodes also differ significantly in throughput and internal latency.

I mean, can we literally use any specific instruction to measure the latency and be under the impression that the result is going to represent the latencies of all other instructions?
Also, why do we automatically assume that a latency tester released at a certain time is going to work optimally on different CPU architectures that arrive later?

Shouldn't different cache sizes, cache line sizes in bytes, and different CPU<->Cache<->Memory associativity, pipelines and uOp buffers across so many varying CPU architectures affect the outcome when using a fixed (unoptimized) set of problems? I think even if these benchmarks are technically 'backward compatible' with all x86-64 CPUs, they still should not represent the latency of real-world, ever-changing dynamic workloads, imo.
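As a rough illustration of the dependent-load structure such latency testers use (each access's address is the previously loaded value, so accesses cannot overlap), here is a toy sketch in Python. Interpreter overhead dwarfs the actual cache latency, so the absolute numbers mean little; real benchmarks do this in C or assembly. All names here are mine:

```python
import random
import time

def make_cycle(n: int, seed: int = 0) -> list[int]:
    """Build a random single-cycle permutation: following perm[i]
    repeatedly visits every slot once before returning to the start.
    The random order is what defeats hardware prefetchers in a real test."""
    order = list(range(n))
    random.Random(seed).shuffle(order)
    perm = [0] * n
    for a, b in zip(order, order[1:] + order[:1]):
        perm[a] = b
    return perm

def chase_ns(perm: list[int], iters: int = 200_000) -> float:
    """Average time per dependent access, in nanoseconds."""
    idx = 0
    start = time.perf_counter()
    for _ in range(iters):
        idx = perm[idx]  # the next address depends on this load: no overlap
    return (time.perf_counter() - start) / iters * 1e9

# Small table (cache-resident) vs. a much larger one; in a C version the
# larger table's per-access time would climb toward DRAM latency.
small = chase_ns(make_cycle(1 << 12))
large = chase_ns(make_cycle(1 << 18))
```

In a compiled version the choice of load instruction, stride and table size all change the result, which is exactly the objection raised above.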


----------



## mcg75

Quote:


> Originally Posted by *budgetgamer120*
> 
> Did someone hack your account?


Yes, this is Roy Taylor. I hacked Mcg's account.

Seriously though, if you've really been paying attention, this is not the first time I've spoken positively about Ryzen.

It's a great product. It's not without its faults, but we may see some of those faults reduced or eliminated.

Problem is some want to defend it like it's perfect and others want to knock it like it's terrible.

Reasonable people fall somewhere between the two.


----------



## aberrero

Quote:


> Originally Posted by *ChronoBodi*
> 
> um if you have h100i, then uh, something is off.
> 
> Lets assume the temps we know before the "offset +20C" was known. Even there, my weaker h60 can do 53-55C idle, 75C load.
> 
> So, that's pretty much 30c idle, 55c load when taking off the +20c offset.
> 
> So something is off with your H100i.


I have a Hyper 212 attached with zip ties... I haven't received my H100 bracket yet.


----------



## poii

Quote:


> Originally Posted by *doza*
> 
> http://www.guru3d.com/news-story/amd-ryzen-7-have-a-temperature-20-degree-c-reporting-offset.html


Just read it. This is hilarious. Glad I waited a bit to buy Ryzen.
I will buy a 1700 in a month unless more weird news appears.


----------



## Sin0822

On some motherboards you can actually set the temp offset, but that 20C thing makes sense.


----------



## Brutuz

Quote:


> Originally Posted by *ChronoBodi*
> 
> um if you have h100i, then uh, something is off.
> 
> Lets assume the temps we know before the "offset +20C" was known. Even there, my weaker h60 can do 53-55C idle, 75C load.
> 
> So, that's pretty much 30c idle, 55c load when taking off the +20c offset.
> 
> So something is off with your H100i.


Or he lives in a different climate to you. I can get my 3570k with stock TIM and IHS down to 60C load under an NH-D14 in winter if I leave my window open; conversely, in summer I have to lose a few hundred MHz and some volts to keep temps below 90C.


----------



## budgetgamer120

Quote:


> Originally Posted by *mcg75*
> 
> Yes, this is Roy Taylor. I hacked Mcg's account.
> 
> Seriously though, if you've really been paying attention, this is not the first time I've spoken positively about Ryzen.
> 
> It's a great product. It's not without it's faults but we may see some of those faults reduced or eliminated.
> 
> Problem is some want to defend it like it's perfect and others want to knock it like it's terrible.
> 
> Reasonable people fall somewhere between the two.


Other than the Windows issues with SMT, I do not see any of the other "issues" being "fixed" until Zen+.


----------



## IRobot23

Quote:


> Originally Posted by *budgetgamer120*
> 
> Other than Windows Issues with SMT. I do not see any of the other "issues" being "fixed". Until Zen+


You think AMD will make a better data fabric, or 16MB of L3$? I think they will stick with the data fabric (which is the key). Maybe for gamers some specialization: a 6C with 12MB L3$ (no data fabric hop). For streaming + gaming, Ryzen is actually pretty much the best choice.


----------



## epic1337

Quote:


> Originally Posted by *IRobot23*
> 
> You think AMD will make better data fabric or 16MB of L3$? I think they will stick with data fabric (which is the key). Maybe for gamers some specialization on 6C with 12MB L3$ (no data fabric). Streaming + gaming actually Ryzen is pretty much best choice.


They'll definitely keep the current data fabric; they'll probably use a different clock, though.
Currently it's in sync with the DRAM speed, e.g. DDR4-3000 = 1500MHz data fabric.

If they increase the ratio, or use a different clock altogether like Intel's uncore, then they would increase the data fabric throughput without touching anything else.
This would effectively fix two sets of issues on AMD's Zen architecture: one is CCX-to-CCX communication speed, and the other is overall DRAM communication speed.

AMD, by the way, calls this "Infinity Fabric", an updated version of HyperTransport.
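The ratio epic1337 describes is simple arithmetic: the fabric runs at the memory clock, which is half the DDR transfer rate. A quick sketch (the `ratio` parameter is hypothetical, only to illustrate the "decoupled clock like Intel's uncore" idea, not a real knob on shipping boards):

```python
def fabric_clock_mhz(ddr_rate: float, ratio: float = 1.0) -> float:
    """Infinity Fabric clock given a DDR4 transfer rate in MT/s.
    MEMCLK is half the DDR rate (DDR = double data rate); on shipping
    Ryzen the fabric runs 1:1 with MEMCLK. `ratio` is a hypothetical
    multiplier sketching a decoupled fabric clock."""
    memclk = ddr_rate / 2.0
    return memclk * ratio

print(fabric_clock_mhz(3000))       # DDR4-3000 -> 1500.0 MHz fabric
print(fabric_clock_mhz(2133))       # DDR4-2133 -> 1066.5 MHz fabric
print(fabric_clock_mhz(2400, 1.5))  # same DRAM, hypothetical 3:2 ratio -> 1800.0
```

This is why faster DDR4 measurably improves CCX-to-CCX latency on Ryzen: the interconnect clock rises with it.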


----------



## cssorkinman

Quote:


> Originally Posted by *epic1337*
> 
> Quote:
> 
> 
> 
> Originally Posted by *IRobot23*
> 
> You think AMD will make better data fabric or 16MB of L3$? I think they will stick with data fabric (which is the key). Maybe for gamers some specialization on 6C with 12MB L3$ (no data fabric). Streaming + gaming actually Ryzen is pretty much best choice.
> 
> 
> 
> they'll definitely keep the current data fabric, they'll probably use a different clock though.
> currently its in sync with the DRAM speed, e.g. 3000Mhz DDR4 = 1500Mhz data fabric.
> 
> if they increase the ratio, or a different clock altogether like intel's uncore, then they would increase the data fabric throughput without touching anything else.
> this would effectively fix two sets of issues on AMD's Zen architecture, one is CCX-to-CCX communication speed, and the other is overall DRAM communication speed.
> 
> AMD by the way calls this "Infinity Fabric", an updated version of HyperTransport.

Octo-channel RAM for Naples - up to 2TB. That will test the fabric.


----------



## sumitlian

Quote:


> Originally Posted by *cssorkinman*
> 
> Octochannel ram for Naples - up to 2 TB's , that will test the fabric.


Naples will probably be loaded with mostly NUMA-aware applications, so this CCX issue we are facing with Windows and niche-level gaming will be non-existent for workstation stuff, imo.


----------



## ChronoBodi

Quote:


> Originally Posted by *aberrero*
> 
> I have a Hyper 212 attached with zip ties... I havent received my H100 bracket yet.


Quote:


> Originally Posted by *Brutuz*
> 
> Or he lives somewhere in a different climate to you. I can get my 3570k with stock TIM and IHS down to 60c load under a NH-D14 in Winter if I leave my window open, and conversely in summer I have to lose a few hundred Mhz and some volts to keep temps below 90c.


Well, actually, different climate is true.
But then he's actually on a zip-tied CM Hyper 212 Evo, which explains why his temps are higher than what I expected.


----------



## epic1337

Quote:


> Originally Posted by *cssorkinman*
> 
> Octochannel ram for Naples - up to 2 TB's , that will test the fabric.


Naples is using a 64-lane bus as its backbone for the Infinity Fabric.
In total, Naples has 128 lanes per SoC package, half of which are for I/O and half for Infinity Fabric.

The problem is, it's still clocked much lower than Intel's uncore.
Clock speed is directly related to latency, so even if it has the bandwidth, the latency would still be too high.


----------



## Majin SSJ Eric

I'm still wondering if there's any possibility we might see a new stepping for Ryzen before Zen+ that could be of any benefit at all to the clock speeds. I read that Ryzen's naming scheme was set up to allow stepping SKUs to be released down the line, utilizing the last two numerals. Since we are mostly in agreement that the GF 14nm process is what is hampering clock speeds with Ryzen, I don't really know if we can expect a new stepping to do much for speed, but it's an interesting thought. Then again, if AMD really is planning to release Zen+ as soon as one year after Ryzen's release, then it's probably unlikely that they would drop any new original Zen SKUs in the meantime...


----------



## Carniflex

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I'm still wondering if there's any possibility we might see a new stepping for Ryzen before Zen+ that could be of any benefit at all to the clock speeds? I read that Ryzen's naming scheme was set up to allow for stepping sku's to be released down the line utilizing the last two numerals. Since we are mostly in agreement that the GF 14nm process is what is hampering clock speeds with Ryzen I don't really know if we can expect a new stepping to do much for speed but its an interesting thought. Then again, if AMD really are planning to release a Zen+ as soon as one year after Ryzen's release then its probably unlikely that they would drop any new original Zen sku's in the meantime...


I believe it is unlikely. What is more likely is that they will do some optimizations, and Zen chips made 6+ months into the cycle might have some minor improvements. It is probably not going to go like it went with the 1055T, where at first it was a 125W TDP chip which was revised to a 95W TDP chip a bit later in the cycle, even if AMD manages to reduce the actual power consumption enough to theoretically fit the 8-core "X" chips into the 65W bracket as well. That is, assuming they release the revision as early as next year, as the most optimistic rumors claim. If Zen+ in reality comes in 2019, then it might make sense for them to do a different stepping revision. We will see, of course. It is just pure speculation.


----------



## Twotenths

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I'm still wondering if there's any possibility we might see a new stepping for Ryzen before Zen+ that could be of any benefit at all to the clock speeds? I read that Ryzen's naming scheme was set up to allow for stepping sku's to be released down the line utilizing the last two numerals. Since we are mostly in agreement that the GF 14nm process is what is hampering clock speeds with Ryzen I don't really know if we can expect a new stepping to do much for speed but its an interesting thought. Then again, if AMD really are planning to release a Zen+ as soon as one year after Ryzen's release then its probably unlikely that they would drop any new original Zen sku's in the meantime...


I am sure that AMD already has plans for a new motherboard. It will support PCIe Gen 4.0. If you read anything about the specifications for Gen 4.0, you would know that they need to add an SoC to double the speeds to the 4.0 standard. I have heard that the Ryzen processors have unused pins in the socket. Next year's introduction probably won't affect the AM4 socket, nor the processor either. It will require a motherboard change that has the new SoC onboard along with the newly upgraded PCIe 4.0 slots.

They can do a lot with double the bandwidth. They don't need to use PCIe 4.0 everywhere, because most hardware besides the highest-end graphics cards wouldn't need more than Rev 3.0. That will allow them to add more PCIe 3.0 slots onto the board and set only slot one to Gen 4.0. It could be something you could change in the BIOS, letting you decide what kind of setup you want or need. It will be a big advantage for them, seeing as their lack of slots really narrows down the options of what you can install right now.

If you want dual GPUs, then the last slot is only an x4 slot. I use a RAID controller card which needs a PCIe 3.0 x8 slot to run. If I installed dual graphics cards, then I could not install my RAID card on existing boards, and I would also not be able to put that second video card into the x4 slot either. I'm pretty sure that AMD purposely built their chipsets and chips with fewer lanes because they know that the next upgrade of the board will basically eliminate the lack of lane support they have right now. I don't think the new Gen 4.0 boards will cost much more than this year's prices for the selection of motherboards used for the launch.

It wouldn't surprise me to find out that the reinforced slots are a precursor of the new Gen 4.0 hardware; they are just testing them out in preparation for Gen 4.0. It's a quite logical way to test them, to me. Think about it: Gen 3.0 x16 is at its limits with the very latest video cards. The new graphics cards using Gen 4.0 might actually need three slots for everything they need to fit on the card to better utilize what the new slots can handle, hence the suddenly introduced reinforced slots.
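The doubling described above is easy to put numbers on: PCIe 3.0 runs 8 GT/s per lane and 4.0 runs 16 GT/s, both with 128b/130b encoding. A quick sketch of approximate one-direction usable bandwidth:

```python
# Per-lane raw rate (GT/s) and line-encoding efficiency by PCIe generation.
PCIE_GENS = {
    3: (8.0, 128 / 130),   # Gen 3: 8 GT/s, 128b/130b encoding
    4: (16.0, 128 / 130),  # Gen 4: 16 GT/s, same encoding
}

def usable_gb_s(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s, one direction."""
    rate, eff = PCIE_GENS[gen]
    return rate * eff / 8.0 * lanes  # bits -> bytes, times lane count

print(round(usable_gb_s(3, 16), 2))  # Gen 3 x16: ~15.75 GB/s
print(round(usable_gb_s(4, 16), 2))  # Gen 4 x16: ~31.51 GB/s
print(round(usable_gb_s(3, 4), 2))   # the x4 slot discussed above: ~3.94 GB/s
```

So a Gen 4.0 x8 slot would carry what a Gen 3.0 x16 slot carries today, which is the lane-saving argument in the post above.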


----------



## jprovido

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Its not going to be better than a 5.1GHz 7700K in gaming, no question about that. The real question is is the 15% difference in games at 1080p really enough to justify the massive disparity in performance in all other CPU workloads? If not then the 7700K is the better choice for you. My contention is that for the vast majority of OCN members the 1700X offers more than enough of an advantage in all other workloads over the 7700K to make up for its minor deficiencies in 1080p gaming performance. Hell, I've been running 1440p since 2012 when I ditched my S27A950D in favor of Korean IPS panels and haven't looked back to 1080p since. Sure 1080p / 144Hz (or more) monitors are a thing but even with those monitors its not exactly like the 1700X falls on its face or anything. If you have something like GTX 1080's I'm sure a 1700X will get you over 144 FPS at 1080p just as well as a 7700K will...


Tbh I don't expect it to beat my 7700k @ 5.1GHz at all. What I decided is: as long as Dota 2 and Overwatch can maintain 144fps @ 1440p without any drops (like my 7700k does), I don't care about the other games. I hope it performs; my 5820k @ 4.7GHz wasn't able to do it, so I'm a bit skeptical, but if Ryzen surprises me (I hope it does, it's more expensive than my 7700k) then my 7700k will be demoted to the VR rig lol.

Seems like I'll be getting the CPU sooner: the motherboard will be delivered today, the 1700X tomorrow.



I've started putting together the parts that I have received already. Just waiting on my motherboard today so I can get everything ready for tomorrow.


----------



## tashcz

Any chance Cooler Master's Nepton will get AM4 brackets, anyone know?


----------



## jon666

I have everything but the motherboard...might swap out the set of RAM though. No ETA on that MSi Carbon mobo from NCIX.


----------



## SoloCamo

Quote:


> Originally Posted by *jprovido*
> 
> tbh I don't expect it to beat my 7700k @ 5.1ghz at all. what I decided is as long as Dota 2 and Overwatch can maintain 144fps @ 1440p without any drops (like my 7700k) I don't care about the other games. I hope it does perform my 5820k @ 4.7ghz wasn't able to do it so I'm a bit skeptical but If ryzen will surprise me ( I hope it does it's more expensive than my 7700k) then my 7700k will be demoted to the vr rig lol


I'm really confused by what you are doing. Dota 2 is a very lightly threaded, CPU-bound game. If a 4.7GHz 5820k isn't doing it, neither is a 4GHz Ryzen CPU. Neither would touch a 7700k @ 5.1GHz in this game...

You obviously must know this?


----------



## Shatun-Bear

Quote:


> Originally Posted by *jprovido*
> 
> tbh I don't expect it to beat *my 7700k @ 5.1ghz* at all. what I decided is as long as Dota 2 and Overwatch can maintain 144fps @ 1440p without any drops (like my 7700k) I don't care about the other games. I hope it does perform my *5820k @ 4.7ghz* wasn't able to do it so I'm a bit skeptical but If ryzen will surprise me ( I hope it does it's more expensive than my 7700k) then my 7700k will be demoted to the vr rig lol
> 
> seems like I'll be getting the cpu sooner. the motherboard will be delivered today the 1700x tomorrow.
> 
> 
> 
> :


You've got/had a 7700K @ 5.1GHz and, more surprising, a 5820K at 4.7GHz? You must have got a pair of golden chips there, especially the 5820K.

Are you running the 5820K at 4.7GHz 24/7? If so, what core voltage?


----------



## nycgtr

4.7 on a new 5820k wasn't that golden imo. Mine did so @ 1.3v for about 6 months then needed more lol. My replacement one was the same.


----------



## Shatun-Bear

Quote:


> Originally Posted by *nycgtr*
> 
> 4.7 on a new 5820k wasn't that golden imo. Mine did so @ 1.3v for about 6 months then needed more lol. My replacement one was the same.


1.3V to get to 4.7Ghz on a 5820K would be a pretty golden chip imo, a lot hit a wall at 4.5Ghz and need a load more juice to get to 4.6Ghz (mine was one of them). 4.7 would just be a pipe dream with only 1.3v.


----------



## nycgtr

Quote:


> Originally Posted by *Shatun-Bear*
> 
> 1.3V to get to 4.7Ghz on a 5820K would be a pretty golden chip imo, a lot hit a wall at 4.5Ghz and need a load more juice to get to 4.6Ghz (mine was one of them). 4.7 would just be a pipe dream with only 1.3v.


Odd, then. I remember hitting 4.8 with my 5820k @ 1.3V, but it wasn't completely stable without more volts and I didn't wanna go past 1.3. I got the chip at the same time as 3 friends; we kinda did a Microcenter raid. All of us hit 4.7. The first 5820k I had, which is now doing HTPC duty, couldn't hit 4.7 after about 6 months without more voltage.


----------



## ZealotKi11er

Quote:


> Originally Posted by *nycgtr*
> 
> Odd then. I remember hitting 4.8 with my 5820k @ 1.3 but it wasnt completely stable without more volts and I didn't wanna go past 1.3. I got the chip the same time as 3 friends we kinda did a microcenter raid. All of us hit 4.7. The first 5820k I had which is now doing htpc duty couldn't hit 4.7 after about 6 months without more voltage.


That's very golden, to get 4.7 on Haswell-E. 4.5 was the limit for something like 70% of those CPUs.


----------



## budgetgamer120

New Ryzen Review. 7700k Stuttering while streaming.


----------



## aberrero

Quote:


> Originally Posted by *ChronoBodi*
> 
> well actually, different climate is true.
> But then he's actually on a zip-tied CM Hyper 212 Evo, which explains why his temps are higher than what i expected.


I zip tied the Corsair H100 instead now and everything is fine (it holds much better than the tower).


----------



## Malinkadink

Quote:


> Originally Posted by *aberrero*
> 
> I zip tied the Corsair H100 instead now and everything is fine (it holds much better than the tower).


I didn't know people zip-tied coolers.

It's like those drifters who mangle their bumpers and zip-tie them back on.


----------



## aberrero

Quote:


> Originally Posted by *Malinkadink*
> 
> I didn't know people ziptied coolers
> 
> 
> 
> 
> 
> 
> 
> its like those drifters who mangle their bumpers and ziptie them back on


Overclocking my HD4830:


----------



## SoloCamo

Quote:


> Originally Posted by *Malinkadink*
> 
> I didn't know people ziptied coolers
> 
> 
> 
> 
> 
> 
> 
> its like those drifters who mangle their bumpers and ziptie them back on


Yeah, I've had my fair share of zip-tie use for coolers, all the way back to the P4 days. I'm too picky these days and need it to look clean, so I won't risk it though.


----------



## Shatun-Bear

Quote:


> Originally Posted by *budgetgamer120*
> 
> New Ryzen Review. 7700k Stuttering while streaming.


I think this guy, NerdTechgasm, is a lot more technical and less of a fanboy than AdoredTV. He deserves more subscribers:


----------



## jclafi

Excellent review!









Quote:


> Originally Posted by *budgetgamer120*
> 
> New Ryzen Review. 7700k Stuttering while streaming.


----------



## budgetgamer120

Quote:


> Originally Posted by *Shatun-Bear*
> 
> I think this guy, NerdTechgasm, is a lot more technical and less of a fanboy than AdoredTV. He deserves more subscribers:


The title alone makes it a failed review. Didn't bother watching.


----------



## Scotty99

Quote:


> Originally Posted by *kfxsti*
> 
> But what about us that don't stream all the time ? Not to mention the tossing of the word afanboy in this thread is hilarious.
> I got my 7700k before Ryzens release and have yet to see anything that would make me switch from it as of yet. As stated before by me and others !! Expectations were set a bit to high for Ryzen. It's an awesome chip for what it does and I am shickled titless that AMD is back in the game and looking forward to the 4c 8ts they have coming amongst other CPUs they release. But videos like this and comments like those just make most people look at them like - Damn someone is just grasping at straws to make Ryzen better looking than the 7700k.
> That's like saying now that gaming while streaming with 65 other programs running while downloading half the internet and balancing In your PC chair on your head while balancing two mice on each buttcheek is the new benchmark for CPUs..


Ryzen was made for people running older i5's like myself; anyone on a 3770k or above really has no reason to look at Ryzen unless you do a lot of video encoding or something along those lines. For me it was an upgrade all around, plus they included a sweet cooler for 330 bucks.

Ryzen 1700 is the new 2500k. Feels good to be smart and patient and not spend money on incremental upgrades : )


----------



## Scotty99

Quote:


> Originally Posted by *budgetgamer120*
> 
> Nah the 1600 will be the new 2500k


But no RGB cooler lol.


----------



## bossie2000

Ok, let's see... a 7700k running at 4.5 to 5 gig beating Ryzen running at 4 gig. Wow, no math needed here!

Now let's see again: an R7 running at 3.0-4.0 gig beating (no, sorry) destroying the 7700k at threaded tasks. We need some math here, please!


----------



## kfxsti

Quote:


> Originally Posted by *budgetgamer120*
> 
> Ryzen is not for anyone with a 7700k. No one needs to grasp at straws; Ryzen 7 is way more powerful than a 7700k. I'm waiting to see the 6-core also stomp the 7700k for $219 and murder the 7600. Can't wait for April 11.
> 
> For your use a quad core is enough, so Ryzen 7 is not for you.


Ryzen is for me if I want it; hence it being MY choice lol. But then again, earlier in this thread weren't you kinda hating on Ryzen? And making pointless posts about some Xeon you own lol. If you're catching a vibe that's leaning your way, then it's probably true.
And if you're trying to tell me what's for me and what's not for me, you are already wrong.
And yes, straws are being grasped; hence AMD coming out and stating that it's not an issue with the OS causing problems. But it was something for you and others to grasp and gawk at, to make it relevant to beating anything from Intel.
You failed to cover my other topic: why are streaming-while-gaming reviews just now popping up? It wasn't an issue until something was needed to help push Ryzen a little bit. See where I'm going with this?
Like I said, I'm tickled for them. But some of you guys buffing AMD's pole right now is hilarious.


----------



## kfxsti

Quote:


> Originally Posted by *Scotty99*
> 
> Ryzen was made for people running older i5's like myself, anyone on a 3770k or above really has no reason to look at ryzen unless you do a lot of video encoding or something along those lines. For me it was an upgrade all around, plus they included a sweet cooler for 330 bucks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ryzen 1700 is the new 2500k, feels good to be smart and patient and not spend money on incremental upgrades : )


I came from the first Phenoms, to Phenom IIs, to a 2600k, to an 8320, then a 6600k that my wife now uses, to my 7700k.


----------



## budgetgamer120

Quote:


> Originally Posted by *kfxsti*
> 
> Ryzen is for me if I want it. Hence it being MY choice lol. But then again earlier in this thread weren't you kinda hating Ryzen ? And pointless posts about some Xeon you own lol . If your catching the vibe that's Leaning your way then It's probably true.
> And if your trying to tell me what's for me and what's not for me. You are already wrong.
> And yes straws are being grasped . Hence AMD coming to light and stating that it's not an issue with the OS causing problems.. But it was something for you and others to grasp and gawk at to make it relevant to beating anything from Intel.
> You failed to cover my other topic as why are steaming while gaming reviews just now poping up? Wasn't an issue until something was needed to help push Ryzen a little bit. See where I'm going with this ?
> Like I said I'm tickled for them. But some of you guys buffing AMDs pole right now is hilarious.


I'm not telling you what's for you, so please take another road with your argument. I stated the target audience. No 7700k owner needs to be salty.

Streaming and gaming has been a thing for years. You were ignorant of it; that doesn't mean it just became a thing. Streamers have been using 2 systems for smooth streaming, or spending $1000 on a 6900k... Until now. Now they only need to spend $500.

For people that do not stream, it means superior multitasking and content creation, which Ryzen owners here are pleased about.

Besides, Ryzen 7 competes with the 6900k, a CPU way out of the 7700k's league.


----------



## bossie2000

Thank you AMD. You took long, but you delivered!!


----------



## mouacyk

Quote:


> Originally Posted by *bossie2000*
> 
> Thankyou AMD. You have take long but you delivered!!


Let's hold that judgment until we see R3 and R5 do 4.5GHz+. Otherwise, all AMD has managed to do is redeliver IPC we already had in 2011. Let's set expectations high across the board, since we are in pursuit of performance.


----------



## Scotty99

Quote:


> Originally Posted by *mouacyk*
> 
> Let's hold that judgement until we see R3 and R5 do 4.5GHz+. Otherwise, all AMD managed to do is redeliver IPC we already had in 2011. Let's set the expectations high all across the board, since we are in pursuit of performance.


What an unbelievably short-sighted comment.

I would bet money that in 3 years the 1700 will be a superior gaming CPU to the 7700k. Going forward, cores are king, and AMD is changing the game with Ryzen.

Also, don't expect the R3s and R5s to clock any higher than the R7s do; maybe 100MHz here and there, but we won't be seeing 4.5GHz as a common number on any of them.


----------



## Scotty99

To be fair, almost no one uses a 2-PC setup. The reason some people do this in the first place is to raise the CPU encoding preset, which can give higher quality than GPU encoding. This is what AMD showed at the Ryzen launch event, but most people use the "veryfast" CPU preset on OBS, which i7's (and even i5's) can handle fine.

The upside of Ryzen is this: once more games start using more cores and DX12 adoption grows, i7 owners will have fewer CPU resources left over for other tasks. I don't see i7's becoming obsolete anytime soon, but I could definitely see that being the case for i5's in a couple of years.


----------



## kfxsti

Quote:


> Originally Posted by *Scotty99*
> 
> To be fair, almost no one uses 2 PC setups. The reason some people do this in the first place is to up the CPU encoding preset, which can be of a higher quality than GPU encoding. This is what AMD showed at the ryzen launch event, but most people use the "veryfast" cpu preset on OBS which i7's (and i5's even) can handle fine.
> 
> The upside of ryzen is this, once more games start using more cores and dx12 adoption grows that means i7 owners will have less CPU resources leftover to do other tasks. I dont see i7's being obsolete anytime soon, but i could definitely see that being the case with i5's in a couple years.


You, sir, have some rep as well.
But then again, I'm an idiot because I'm not a YouTube superstar who had to use two rigs.. lol


----------



## bossie2000

Multi-threading is the future. More and more games, apps, browsers and of course OSes are going to utilize it.
Single-threaded performance - well, it has its place, but it will slowly lose it over time....


----------



## kariverson

Guys, come on. Objectively Ryzen has the raw power of the 6950X; it's just horribly optimized right now. Optimization is something that will get better, but the Intel chips will not. You cannot compare it with the 7700K, and you shouldn't, because it's leagues above it. And at $500 it is easily the best choice for everyone right now: gaming performance that is more than sufficient and will only get better, plus content-creation and multi-threaded performance that goes head to head with CPUs priced well above the $1k mark. But that PGA socket is ugly as hell. A 20-year-old design.


----------



## Alwrath

Quote:


> Originally Posted by *Scotty99*
> 
> Ryzen 1700 is the new 2500k, feels good to be smart and patient and not spend money on incremental upgrades : )


Tell me about it. Paid $200 for my Core i5 760 back in 2010, and there was no way I was gonna shell out $400 - $600 for the six-core generation like my friends did. Skipped over that overpriced Intel era and behold: I am now on the cutting edge with an 8-core Ryzen that I can upgrade when Zen 2 and 3 come out. Why would anyone get a quad core in 2017? It's dated technology that's been out since 2007. It's time to move on; the quad core had a good run ✌


----------



## Undervolter

For those who thought that Gamers Nexus was tough on Ryzen. There is worse!









New review (March 16) by the Italian edition of Tom's Hardware

https://translate.google.com/translate?sl=it&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=https%3A%2F%2Fwww.tomshw.it%2Fryzen-7-vs-core-i7-gaming-situazione-attuale-84125-p4&edit-text=

Conclusion: "The Ryzen processors are being sold at prices definitely lower than the Intel Broadwell-E solutions, and this gives them big value if one works with workstation-class software. That value does not apply to games, though, where much cheaper Intel Kaby Lake CPUs are as fast as or faster than Ryzen 7. The 7600K at 240 EUR beats all three Ryzen CPUs in various titles, and the 7700K at 370/380 EUR is even faster." "...AMD has also announced that it will fix the issue with Windows power profiles, even if it probably should have done so before launch. It is not clear whether the modified profile will fix Ryzen's problems at the expense of something else, like power consumption, temperatures or noise."
"We are happy that Ryzen is seriously challenging Broadwell-E in the hearts and minds of content creators and other professionals, but our view as far as games are concerned is that Ryzen 7 is not currently the family of processors to beat. The Intel Kaby Lake CPUs are cheaper and usually faster. Maybe a future patch can change that."


----------



## budgetgamer120

For the Linux Server crowd.

*AMD Ryzen 7 1800X Linux Benchmarks - Paying for speed*


More at source https://www.servethehome.com/amd-ryzen-7-1800x-linux-benchmarks-paying-for-speed/

Wonder when the server motherboards will be out


----------



## sage101

Quote:


> Originally Posted by *SoloCamo*
> 
> The irony of this is amazing.
> We had Haswell/Broadwell IPC in 2011? News to me.. we must all also have forgotten that we apparently had 8c/16t CPUs for $300 in 2016, let alone 2011 - with Broadwell IPC on top of it.


I don't think he can differentiate between IPC and IPS.
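For anyone genuinely mixing the two up: IPC is work done per clock cycle, while IPS also folds in clock speed. A trivial sketch - the numbers below are made up purely for illustration, not figures for any real CPU:

```python
# Illustrative numbers only - not measured figures for any real chip.

def instructions_per_second(ipc, clock_ghz):
    """IPS = IPC (instructions per cycle) x clock rate (cycles per second)."""
    return ipc * clock_ghz * 1e9

# A chip with lower IPC can still post higher IPS purely through clock speed,
# which is why the two terms can't be used interchangeably:
high_clock = instructions_per_second(ipc=1.0, clock_ghz=5.0)
high_ipc = instructions_per_second(ipc=1.2, clock_ghz=3.6)
print(high_clock > high_ipc)  # True - the clock made up for the IPC deficit
```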


----------



## mouacyk

Quote:


> Originally Posted by *SoloCamo*
> 
> The irony of this is amazing. Zealot, you seemed to be one of the few around here that were actually neutral and a true enthusiast regarding the topic but since Ryzen hit man, you are just completely negative towards it... there is one thing to be objective and another to make statements like in the quote.
> 
> In what realm is an 8c16t cpu with the ipc of haswell/broadwell for $300 not a good cpu? Intel has absolutely nothing remotely in the price range that touches it as an overall good processor.
> We had Haswell/Broadwell IPC in 2011? News to me.. we must all also have forgotten that we apparently had 8c/16t CPUs for $300 in 2016, let alone 2011 - with Broadwell IPC on top of it.


Quote:


> Originally Posted by *sage101*
> 
> I don't think he can differentiate between IPC and IPS.


Let's catch some people up to the objective comparisons that are happening, and not just the bell curve that wants to be enforced:

Please observe the SC performance: http://cpu.userbenchmark.com/Compare/Intel-Core-i5-2500K-vs-AMD-Ryzen-7-1700/619vs3915


----------



## SoloCamo

Quote:


> Originally Posted by *mouacyk*
> 
> Let's catch some people up to the objective comparisons that are happening, and not just the bell curve that wants to be enforced:
> 
> Please observe the SC performance: http://cpu.userbenchmark.com/Compare/Intel-Core-i5-2500K-vs-AMD-Ryzen-7-1700/619vs3915


Ah, when all else fails, whip out SC2. You know.... _the game from 2010 that you can get a major fps increase by spoofing it into thinking your AMD processor is Intel based._


----------



## mouacyk

Quote:


> Originally Posted by *SoloCamo*
> 
> Ah, when all else fails, whip out SC2. You know.... _the game from 2010 that you can get a major fps increase by spoofing it into thinking your AMD processor is Intel based._


yeah... SC


----------



## Praetorr

Don't you know? All the kids are playing SC2 these days! SC2 performance is *very* important in 2017!








Quote:


> Originally Posted by *SoloCamo*
> 
> Ah, when all else fails, whip out SC2. You know.... _the game from 2010 that you can get a major fps increase by spoofing it into thinking your AMD processor is Intel based._


----------



## SoloCamo

Or let's look at another popular game, if you want to cherry-pick...

http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/page3.html

Battlefield 1...

The 7600K, which I think we can all agree is quite a bit faster than a stock 2500K, gets a min of 98fps and an average of 156fps at 1080p. A 1700X gets a min of 125fps and an avg of 144fps. Which sounds like the better gaming processor for a modern title?


----------



## budgetgamer120

Quote:


> Originally Posted by *SoloCamo*
> 
> Or lets look at another popular game if you want to cherry pick...
> 
> http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/page3.html
> 
> Battlefield 1...
> 
> 7600k which I think we can all agree is quite a bit faster than a stock 2500k get's a min fps of 98fps and an average of 156fps at 1080p. A 1700x get's a min of 125fps and avg of 144fps. Which sounds like the better gaming processor here for a modern title?


Interesting... Turning off SMT increases performance.


----------



## Charcharo

Quote:


> Originally Posted by *mouacyk*
> 
> Let's catch some people up to the objective comparisons that are happening, and not just the bell curve that wants to be enforced:
> 
> Please observe the SC performance: http://cpu.userbenchmark.com/Compare/Intel-Core-i5-2500K-vs-AMD-Ryzen-7-1700/619vs3915


Uhm...
>User Benchmark
>Single Threaded IPC
Choose one. UserBenchmark is not a good comparison at all. This is like me using DOOM alone to showcase how good AMD GPUs are, or using only Cinebench.

Ryzen is at Broadwell-E IPC. Sometimes it is better, sometimes it is worse. That is the average, since IPC is not a single simple figure.


----------



## criminal

Quote:


> Originally Posted by *SoloCamo*
> 
> Or lets look at another popular game if you want to cherry pick...
> 
> http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/page3.html
> 
> Battlefield 1...
> 
> 7600k which I think we can all agree is quite a bit faster than a stock 2500k get's a min fps of 98fps and an average of 156fps at 1080p. A 1700x get's a min of 125fps and avg of 144fps. Which sounds like the better gaming processor here for a modern title?


That 1700x for sure.

I would buy a 1700 today if I didn't already have a 6800K.


----------



## CriticalOne

Quote:


> Originally Posted by *SoloCamo*
> 
> Or lets look at another popular game if you want to cherry pick...
> 
> http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/page3.html
> 
> Battlefield 1...
> 
> 7600k which I think we can all agree is quite a bit faster than a stock 2500k get's a min fps of 98fps and an average of 156fps at 1080p. A 1700x get's a min of 125fps and avg of 144fps. Which sounds like the better gaming processor here for a modern title?


The 7600k is much cheaper than that 1700x, however. It becomes a question of how much you are willing to spend.


----------



## criminal

Quote:


> Originally Posted by *CriticalOne*
> 
> The 7600k is much cheaper than that 1700x, however. It becomes a question of how much you are willing to spend.


R5 1600x will be close to the 7600k in price and will have similar results to the 1700x in a lot of games I bet.


----------



## madweazl

Quote:


> Originally Posted by *criminal*
> 
> R5 1600x will be close to the 7600k in price and will have similar results to the 1700x in a lot of games I bet.


Skip the 1600x and go with the 1600 for $219.


----------



## ZealotKi11er

Quote:


> Originally Posted by *SoloCamo*
> 
> The irony of this is amazing. Zealot, you seemed to be one of the few around here that were actually neutral and a true enthusiast regarding the topic but since Ryzen hit man, you are just completely negative towards it... there is one thing to be objective and another to make statements like in the quote.
> 
> In what realm is an 8c16t cpu with the ipc of haswell/broadwell for $300 not a good cpu? Intel has absolutely nothing remotely in the price range that touches it as an overall good processor.
> We had Haswell/Broadwell IPC in 2011? News to me.. we must all also have forgotten that we apparently had 8c/16t CPUs for $300 in 2016, let alone 2011 - with Broadwell IPC on top of it.


I never said anything about Zen; I was talking about before Zen. The problem I have is that with AMD there are always compromises, which in the case of their CPUs can't be overcome for my needs. I get salty when my plans get broken. My comments reflect my situation alone and nothing else. If I were starting from scratch and building a new PC right now, it would have a 1700. If I were building a PC for someone else, it would have a Zen CPU. I just do not want people to ignore the fact that, as it stands, Zen has raised the bar in productivity, and will most likely raise the bar with R5 for gaming at $200 and under, but has not raised the bar for high-fps gaming. We have no clue if things will change. We have only just started to see games use more than 4 cores, and we do not know when 8C+ will take that spot.


----------



## SoloCamo

Quote:


> Originally Posted by *criminal*
> 
> R5 1600x will be close to the 7600k in price and will have similar results to the 1700x in a lot of games I bet.


Even the $169 R5 1400 should have a higher minimum framerate in this game.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> I never said anything about Zen; I was talking about before Zen. The problem I have is that with AMD there are always compromises, which in the case of their CPUs can't be overcome for my needs. I get salty when my plans get broken. My comments reflect my situation alone and nothing else. If I were starting from scratch and building a new PC right now, it would have a 1700. If I were building a PC for someone else, it would have a Zen CPU. I just do not want people to ignore the fact that, as it stands, Zen has raised the bar in productivity, and will most likely raise the bar with R5 for gaming at $200 and under, but has not raised the bar for high-fps gaming. We have no clue if things will change. We have only just started to see games use more than 4 cores, and we do not know when 8C+ will take that spot.


Which is a fair point. My confusion comes from the fact that many (not just you), judging from the posts I'm reading, were expecting it to beat Kaby Lake in gaming, which is the only thing that would have raised the bar for ultra-high-fps play in these fairly lightly threaded games. Most of us were predicting Ivy Bridge level at best, and here we got Haswell/Broadwell IPC.


----------



## CriticalOne

I will just wait for benchmarks. Based on synthetic benchmarks the 1800X should be right up there with the 6900K in games, but due to the CCX design this is not the case. I would like to think the hexa-core Ryzen processors will beat the i5s handily, but I am just not sure at this point.


----------



## madweazl

Quote:


> Originally Posted by *CriticalOne*
> 
> I will just wait for benchmarks. Based on synthetic benchmarks the 1800X should be right up there with the 6900K in games, but due to the CCX design this is not the case. I would like to think the hexa-core Ryzen processors will beat the i5s handily, but I am just not sure at this point.


The six-core won't operate any differently than the eight-core.


----------



## SuperZan

It seems they're all going to feature the dual CCX design, which is a good thing. It means that as new software/games are released with Ryzen optimisation as part and parcel to their development, benefits will be achieved on all Ryzen processors.


----------



## epic1337

Quote:


> Originally Posted by *madweazl*
> 
> The six-core won't operate any differently than the eight-core.


depends on the workload - an 8-core would beat a 6-core in 7-Zip and Handbrake, for example.

Quote:


> Originally Posted by *SuperZan*
> 
> It seems they're all going to feature the dual CCX design, which is a good thing. It means that as new software/games are released with Ryzen optimisation as part and parcel to their development, benefits will be achieved on all Ryzen processors.


supposedly yes, but we all know what happened to the Core 2 Quads: they weren't able to get much better due to their 2+2 design.


----------



## ZealotKi11er

Could Zen+ have an 8-core CCX?


----------



## Arturo.Zise

Ryzen R7 has already proven Intel's X99 platform to be overpriced for what it is, and Ryzen R3/R5 is about to make those Intel i3s and i5s a difficult purchase prospect.

That leaves the 7700K to carry the flag for Intel until Coffee Lake.


----------



## cssorkinman

Quote:


> Originally Posted by *SoloCamo*
> 
> Or lets look at another popular game if you want to cherry pick...
> 
> http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/page3.html
> 
> Battlefield 1...
> 
> 7600k which I think we can all agree is quite a bit faster than a stock 2500k get's a min fps of 98fps and an average of 156fps at 1080p. A 1700x get's a min of 125fps and avg of 144fps. Which sounds like the better gaming processor here for a modern title?


Stock 1800X and Fury in BF 1 - I need more GPU


----------



## blue1512

Quote:


> Originally Posted by *epic1337*
> 
> depends on the workload, 8core would beat 6core on 7zip and handbrake for example.
> supposedly yes, but we all know what happened to *Core2Quads*, they weren't able to get much better due to their 2+2 design.


Comparing Ryzen with the C2Q is weird. Note that the 1800X beats a true 8-core 6900K, while the 2+2 C2Qs bit the dust against the true 4-core C2Qs.

The way Infinity Fabric connects the CCXs in Ryzen is far more sophisticated, and it actually delivers more multithreaded performance than those ancient approaches from Intel.









----------



## kfxsti

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Ryzen R7 has already proven Intel X99 to be overpriced for what it is. Ryzen R3/R5 is about to make those Intel i3/i5's a difficult purchase prospect.
> 
> That leaves the 7700K to carry the flag for Intel until Coffee Lake.


BF1 made the i5s and i3s a difficult purchase too, lol. I'm hoping their 4c/8t parts overclock well and give me a new toy to play with in the coming months.


----------



## epic1337

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Could Zen plus have 8 Core CCX?


most likely not; I doubt AMD would go that route, as scaling it down would become harder.
Think of their APUs - they wouldn't make 8-core APUs.

Strictly speaking, the difference between AMD's design and Intel's comes down to two points:
one is that AMD's L3 cache is attached directly to the cores; the other is that the Infinity Fabric (ring bus) runs in sync with the DRAM clock.

Intel's L3 cache is instead pooled outside the cores, in slices (one per core) connected to each other and to the cores by the ring bus.
Intel's ring bus in turn runs in sync with the L3 cache clock, called the uncore clock ( https://en.wikipedia.org/wiki/Uncore ), while the DRAM clock has its own ratio multiplier.
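The clock relationships described above boil down to simple arithmetic. A rough sketch - the DDR speeds and uncore multiplier below are illustrative examples, not vendor specs:

```python
# Rough sketch of the clock relationships described above.
# All numbers here are illustrative, not vendor specifications.

def ryzen_fabric_mhz(dram_mt_s):
    """Ryzen's data fabric runs in sync with the memory clock, which is
    half the DDR transfer rate (DDR = two transfers per clock)."""
    return dram_mt_s / 2

def intel_uncore_mhz(uncore_multiplier, bclk_mhz=100):
    """Intel's L3/ring ('uncore') clock comes from its own multiplier on
    the base clock; the DRAM ratio is set separately."""
    return uncore_multiplier * bclk_mhz

print(ryzen_fabric_mhz(2400))  # 1200.0 - slow DDR4 drags the fabric down with it
print(ryzen_fabric_mhz(3200))  # 1600.0 - faster RAM also speeds up the fabric
print(intel_uncore_mhz(40))    # 4000 - unaffected by memory speed
```

This is also why faster RAM helps Ryzen beyond plain memory bandwidth: it raises the cross-CCX link speed at the same time.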


----------



## blue1512

Quote:


> Originally Posted by *epic1337*
> 
> most likely not, i doubt AMD would go for that route as scaling it down would become harder.
> think of their APUs, they wouldn't make 8core APUs.
> 
> strictly speaking, the difference with AMD's design compared to intel is in two points.
> one is AMD's L3 cache is attached to the CPU, the other is the infinity fabric (ring bus) is in sync with DRAM.
> 
> intel's L3 cache is in fact pooled outside of the CPU packages in slices for each cores, they're connected to each other and to the CPU by the ring bus.
> intel's ring bus on the other hand is in sync with the L3 cache's clock called the uncore clock, the DRAM clock has it's own ratio multiplier.


Ryzen's approach is good for AMD's small R&D budget, and it's easier to scale, as we can see with Naples. The bandwidth of the fabric is fine, but its low clock leads to high latency and some bottlenecking between the two CCXs in heavily threaded applications like gaming.

Hopefully Zen+ will have a better memory controller and a new fabric that can run at a 1:1 memclock ratio (which would mean double the clock of the current fabric!).


----------



## epic1337

Quote:


> Originally Posted by *blue1512*
> 
> Ryzen approach is good for AMD's small R&D budget. It's easier to scale as we can see with Naples.
> 
> Hopefully Zen plus will have a better Memory Control and a new Fabric that can use 1:1 Memclock ratio (which mean double the clock of the current Fabric!).


no, it isn't a "good approach"; they could in fact have used the same L3 layout as Intel, that is, L3 not joined to the cores and piggybacked on the ring bus.

They don't need a better memory controller either; they just need to stop syncing the clocks - the L3 cache and Infinity Fabric should have their own clock.


----------



## SuperZan

Quote:


> Originally Posted by *epic1337*
> 
> supposedly yes, but we all know what happened to Core2Quads, they weren't able to get much better due to their 2+2 design.


True, but differences in architecture, cross-fabric transfer implementation, and general advances in software development over the past decade mean that the Core 2 instance isn't a solid precedent for Ryzen's performance. We'll have to wait and see obviously, but I'm not feeling pessimistic just yet.


----------



## epic1337

Quote:


> Originally Posted by *SuperZan*
> 
> True, but differences in architecture, cross-fabric transfer implementation, and general advances in software development over the past decade mean that the Core 2 instance isn't a solid precedent for Ryzen's performance. We'll have to wait and see obviously, but I'm not feeling pessimistic just yet.


well yes, but while the C2Q is old and uses ancient technology, what I'm pointing at is the difficulties on the software side.
Even now, a C2Q at its initial launch vs. a C2Q today shows barely any difference in performance.

The similarity with AMD's design is that while all of the components are on the same die, they run as if they weren't.
Simply put, Infinity Fabric is designed as a multi-socket interconnect; while fast in its own right, it's not fast enough as an on-die interconnect.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *kfxsti*
> 
> But what about us that don't stream all the time ? Not to mention the tossing of the word afanboy in this thread is hilarious.
> I got my 7700k before Ryzens release and have yet to see anything that would make me switch from it as of yet. As stated before by me and others !! Expectations were set a bit to high for Ryzen. It's an awesome chip for what it does and I am shickled titless that AMD is back in the game and looking forward to the 4c 8ts they have coming amongst other CPUs they release. But videos like this and comments like those just make most people look at them like - Damn someone is just grasping at straws to make Ryzen better looking than the 7700k.
> That's like saying now that gaming while streaming with 65 other programs running while downloading half the internet and balancing In your PC chair on your head while balancing two mice on each buttcheek is the new benchmark for CPUs..


And as I've said countless times already, nobody who already has a 7700K should be shopping for Ryzen (or any other CPU, for that matter) for a long time, unless their usage is expected to need more than a quad core. Nobody ever realistically expected Ryzen 8-cores to be faster than Kaby Lake in pure gaming. I hoped for IPC around Ivy Bridge level a year ago, and AMD delivered Haswell/Broadwell level, which, in my book, is an astonishing achievement. Nothing more, nothing less. Kaby Lake is still the fastest-IPC architecture there is (combined with really good OC headroom), and nobody is denying that. But that fact has nothing to do with how good or bad Ryzen is. It is very good, in fact...


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There are no Intel fanboys. There are people that buy good CPUs and people that buy AMD CPUs.


Ironic that this post denying the existence of Intel fanboys is about the most Intel-fanboyish post in this entire thread...
Quote:


> Originally Posted by *mouacyk*
> 
> Let's hold that judgement until we see R3 and R5 do 4.5GHz+. Otherwise, all AMD managed to do is redeliver IPC we already had in 2011. Let's set the expectations high all across the board, since we are in pursuit of performance.


I didn't realize Intel released BW-E in 2011?


----------



## epic1337

well, technically AMD's IPC is as high as Haswell's, but it's not as consistent as some would say.
Sometimes it's as slow as Sandy Bridge, often it's faster than Ivy Bridge, and on occasion it's even faster than Broadwell.

The issue stems mostly from the differences between the architectures, but one of the most obvious causes is the CCX intercommunication.
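That inconsistency is also baked into how reviewers estimate IPC in the first place: they normalize a benchmark score to clock speed, and the score side varies by workload. A minimal sketch with invented scores (not real measurements):

```python
# Invented scores for illustration only - real reviews run many workloads
# at a fixed clock, and the ranking genuinely shifts between them.

def points_per_ghz(score, clock_ghz):
    """Crude IPC proxy: normalize a benchmark score to clock speed."""
    return score / clock_ghz

# Same two hypothetical chips, two hypothetical workloads, opposite verdicts:
print(points_per_ghz(160, 4.0) > points_per_ghz(150, 4.0))  # True  - chip A wins workload 1
print(points_per_ghz(140, 4.0) > points_per_ghz(150, 4.0))  # False - chip B wins workload 2
```

So "Ryzen has X IPC" is only ever an average over whichever workloads a reviewer happened to pick.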


----------



## Brutuz

Quote:


> Originally Posted by *epic1337*
> 
> well yes, but while C2Q is old and uses ancient technology, what i'm implying is the difficulties on the software side.
> even now C2Q initial launch vs C2Q of this date has barely any difference in performance.
> 
> AMD's similarity in their design, while all of the components are on the same die, they're running as if they aren't.
> simply put, Infinity Fabric is designed as a multi-socket interconnect, while fast by it's own right, its not fast enough as on-die interconnect.


Like hell it didn't. In gaming it may not have gained much speed, but it maintained what speed it had for much longer than the dual-core Core 2s did.


----------



## epic1337

Quote:


> Originally Posted by *Brutuz*
> 
> Like hell it didn't. In gaming it may not have gained much speed but it maintained what speed it had for a much longer time than the dual core Core 2s did.


what are you talking about?

I'm talking about C2Q-vs-C2Q driver and software improvements, not C2D-vs-C2Q performance in general.
e.g. a Q9000-series chip in 2008 benchmarks vs. the same chip in 2016 benchmarks.
It's like comparing an AMD GPU's early-release benchmarks against later ones - mainly driver improvements.

Simply put, Core 2 Quads didn't get faster even with multiple software improvements targeting multi-threading support.


----------



## espn

Quote:


> Originally Posted by *bossie2000*
> 
> Multi threading is the future.More and more games,apps,browsers and ofcourse OS is going to utilize it.
> Single thread-well it got it's place but will slowly loose it over time....


Well, the multi-core theme has been discussed for something like a decade, but the majority of software can still only use 2 to 4 cores.


----------



## budgetgamer120




----------



## Liranan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bossie2000*
> 
> If this video does not open "Intel fanboy eyes" then nothing will !
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There are no Intel fanboys. There are people that buy good CPUs and people that buy AMD CPUs.

Thank you for proving his point.


----------



## budgetgamer120

Quote:


> Originally Posted by *Liranan*
> 
> Thank you for proving his point.


----------



## blue1512

Quote:


> Originally Posted by *epic1337*
> 
> well yes, but while C2Q is old and uses ancient technology, what i'm implying is the difficulties on the software side.
> even now C2Q initial launch vs C2Q of this date has barely any difference in performance.
> 
> AMD's similarity in their design, while all of the components are on the same die, they're running as if they aren't.
> simply put, Infinity Fabric is designed as a multi-socket interconnect, while fast by it's own right, its not fast enough as on-die interconnect.


It seems that you skimmed one of my posts...

Calling the fabric a multi-socket interconnect is wrong by any measure. That's why I gave the comparison: the 2+2 C2Q 6000s bit the dust against the true 4-core C2Q 9000s, yet the 2xCCX 1800X beats the true 8-core 6900K. You can see the difference, right?

The only problem with the fabric right now is its latency, which is higher than Intel's uncore. But by no means can you compare it with the snail's pace of a multi-socket interconnect....


----------



## Carniflex

Quote:


> Originally Posted by *mouacyk*
> 
> Let's hold that judgement until we see R3 and R5 do 4.5GHz+. Otherwise, all AMD managed to do is redeliver IPC we already had in 2011. Let's set the expectations high all across the board, since we are in pursuit of performance.


IPC is only half of the story, though. The other half is that the AMD offering has twice the cores/threads/cache of the corresponding part from the other side of the fence, in a very competitive power envelope.


----------



## Carniflex

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Never said anything about Zen. I am just saying before Zen it. The problem I have is that with AMD there are always compromises which in case of their CPUs cant be overcome for my needs. I get salty when my plans get broken. My comments only reflect my situation alone and nothing else. If i was to start from scratch and build a new CPU right now It would have 1700. If I was to build a PC some someone else It would have Zen CPU. I just do not want people to ignore the fact that right now Zen as it stands has raised the bar in productivity, most likely raised the bar with R5 in gaming for $200 and under but has not raised the bar for high fps gaming. We have no clue if things will change. We just started to see games uses more than 4 core but we do not know when 8C+ will take that spot.


Fair points.

For high-fps gaming, the picture is not yet crystal clear to me. AMD made a fairly big fuss in their marketing slides about _minimum_ frame rates. While it seems unlikely, considering the single-core performance gap between an overclocked Kaby Lake and an overclocked Ryzen, it is not totally impossible that once the BIOS issues on practically all the motherboards get sorted, Ryzen might be competitive - if there are specific optimizations in there somewhere that actually ensure consistent minimum frame rates and good frame pacing, and it's not just marketing talk.

I have a 144Hz FreeSync 1440p monitor in combination with a [email protected] GHz (http://www.userbenchmark.com/UserRun/1774430 ), which is practically the same single-core performance as Ryzen at 4GHz, so it seems likely I won't be sitting at 144Hz all the time (at least I'm not hitting that with my current i7). However, if my drops hit, say, 80fps instead of as low as 60-ish, that would be an upgrade. I should probably add that for me personally, FreeSync seems a lot more significant than being able to hit 144Hz; it makes a much bigger difference than I expected. On my 4K screen (no FreeSync) it is quite noticeable when I drop below 60Hz, while with FreeSync I have dipped as far as 40fps before noticing it felt substantially choppier than I'm used to.

Of course, nothing to hold one's breath over. In real life miracles rarely happen, and it is always possible that the frame-smoothness slide was just marketing, without anything concrete in silicon to back it up.


----------



## epic1337

Quote:


> Originally Posted by *blue1512*
> 
> Calling Fabric as a multi-socket interconnect is WRONG in any means.




Quote:


> Originally Posted by *Carniflex*
> 
> IPC is only half of the story tho. Another half is that the AMD offering has twice the cores/threads/cache of the corresponding part from the other side of the fence in the very competitive power envelope.


this is both true and not quite true.

The same could be said of Bulldozer and Piledriver: IPC is what they lacked, but they offered twice as many cores at a correspondingly low price.
In this case IPC matters a lot more than simple core count, and Skylake's and Kaby Lake's supremacy over Ryzen in games attests to this.

If Ryzen could clock as high as Intel's chips and had no issues with its internal communication, then it would beat Intel's mainstream chips on all grounds besides purely single-thread-reliant workloads.
And yes, that would include the majority of games that scale well past 8 threads.

Ryzen is, to put it simply, a pace changer: something to give the lot of us _the new_ sort of deal we've sorely been held back on, not that everyone will appreciate it.
With that in mind, we now have two or three choices. One is to take what's on the table for that cost-effective _more-than-acceptable performance_; another is to wait for the next thing AMD has to offer.
The third is to go with Intel's high-performance CPUs at a _less-than-desirable price_ that still leaves a few things to be desired - definitely not cost-effective, but the performance is there.


----------



## iRUSH

Quote:


> Originally Posted by *cssorkinman*
> 
> Stock 1800X and Fury in BF 1 - I need more GPU


Single player? I'm sure an i3 would do the same thing (slight sarcasm). How's the frame rate scaling in a 64-player Conquest game?


----------



## sumitlian

Quote:


> Originally Posted by *SoloCamo*
> 
> Even the $169 R5 1400 should have a higher minimum framerate in this game


*cough*cough*...it is not when you play at 720p. Nevermind.


----------



## cssorkinman

Quote:


> Originally Posted by *iRUSH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> Stock 1800X and Fury in BF 1 - I need more GPU
> 
> 
> 
> 
> 
> 
> 
> 
> Single player? I'm sure an i3 would do the dame thing (slight sarcasm). How's the frame-rate scaling in a 64p conquest game?


----------



## iRUSH

Quote:


> Originally Posted by *cssorkinman*


Looks like it might drop to 125 at its worst. I think that's pretty good and speaks well to me.

Thank you


----------



## cssorkinman

Quote:


> Originally Posted by *iRUSH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> 
> 
> 
> 
> Looks like it might drop to 125 at its worst. I think that's pretty good and speaks well to me.
> 
> Thank you

128 min, 173 avg, 200 max.


----------



## SoloCamo

Quote:


> Originally Posted by *cssorkinman*


Just curious as I want to run my own tests too but what software are you using for fps recording and what software for the graph?


----------



## cssorkinman

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> 
> 
> 
> 
> Just curious as I want to run my own tests too but what software are you using for fps recording and what software for the graph?

FRAPS to count frames, then load that data into OpenOffice and insert a graph into the spreadsheet.
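The same summary numbers quoted in this thread can be reproduced without a spreadsheet. A minimal sketch in Python, assuming a benchmark log already parsed into one FPS reading per second (the `summarize` helper and the sample readings are mine, not FRAPS output):

```python
# Minimal sketch: summarize per-second FPS readings from a benchmark run
# and print the min / avg / max figures like those quoted in the thread.
from statistics import mean

def summarize(fps_values):
    """Return (min, avg, max) for a sequence of per-second FPS readings."""
    fps = list(fps_values)
    return min(fps), mean(fps), max(fps)

# Made-up readings standing in for a real benchmark log:
lo, avg, hi = summarize([128, 150, 173, 200, 160])
print(f"Min {lo}  Avg {avg:.1f}  Max {hi}")
```

FRAPS writes its benchmark logs as CSV, so in practice you'd read the FPS column from that file and feed it to `summarize` before graphing.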


----------



## budgetgamer120

Quote:


> Originally Posted by *SoloCamo*
> 
> Just curious as I want to run my own tests too but what software are you using for fps recording and what software for the graph?


Seems like Excel.


----------



## SoloCamo

Quote:


> Originally Posted by *cssorkinman*
> 
> fraps to count frames then load that data into open office. Then insert graph into spreadsheet.


Thanks, they always look more complicated than just importing the data. You'd think with all my years behind a desk in offices I'd remember. I'll do a run at 720p/low with my 290X, as that *should* be about a fair comparison vs your Fury X at 1080p.


----------



## cssorkinman

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> fraps to count frames then load that data into open office. Then insert graph into spreadsheet.
> 
> 
> 
> Thanks, they always look more complicated then just importing it. You'd think with all my years behind a desk in offices I'd remember. Will do a run with 720p/low with my 290x as that *should* be about a fair comparison vs your Fury X at 1080p.

At 1080p the Fury and 290X Lightning are very close. 1440p would be different.


----------



## mouacyk

Quote:


> Originally Posted by *Carniflex*
> 
> IPC is only half of the story tho. Another half is that the AMD offering has twice the cores/threads/cache of the corresponding part from the other side of the fence in the very competitive power envelope.


The extra cores/threads/cache are half measures in between Intel's mainstream and their HEDT, given AMD's decision to go with a non-monolithic core design. Yes, they saved on R&D and produced a cheaper product, but is it competitive? Not in the sense that it has made decision-making clearer or easier -- and that is really a missed opportunity on a brand-new product launch designed by a renowned engineer.

And "competitive power envelope"? Sure, but performance compromises were made to make that possible. Yes, it may take less cooling to keep Ryzen within operating temperatures, but reviews have shown that consumption is nearly identical to its Intel counterpart. Not really seeing any advantage here either.

src: https://www.pcper.com/reviews/Processors/AMD-Ryzen-7-1800X-Review-Now-and-Zen/Power-Consumption-and-Conclusions


Spoiler: Warning: Spoiler!



In fact, despite the 95 watt TDP, the Ryzen CPU uses about the same power as the 140 watt Broadwell-E processors.


----------



## SoloCamo

Just for reference: unfortunately no one was playing with 64 players on the Giant's Shadow map, so I used quickmatch and ended up with 64-player Empire's Edge.



Min 120
Max 199
Avg 159.323

4790K locked at 4.4GHz w/ 2400MHz CAS 10 memory; 290X at 1070MHz core, stock memory at 1250MHz. 1080p, low settings, 64 players.

An 8c/16t Ryzen+ will likely be my next CPU.


----------



## cssorkinman

Quote:


> Originally Posted by *SoloCamo*
> 
> Just for reference. Unfortunately no one was playing with 64 players on the giants shadow map so I used quickmatch and ended up with 64 player Empire's Edge.
> 
> 
> 
> Min 120
> Max 199
> Avg 159.323
> 
> 4790k locked at 4.4ghz w/ 2400mhz cas 10 memory, 290x at 1070core stock mem of 1250. 1080p Low settings 64 player.
> 
> An 8c16t Ryzen+ will likely be my next cpu.


I'd like to plot FPS, GPU and CPU usage and stack the data one behind the other in one spreadsheet.

I played a 64-player round of St. Quentin Scar last night at all stock clocks and low graphics settings. I did spend some time in the 130s when the behemoth was on screen and I was between B and C; other than that it was 150+ most of the time.

I recorded a 24 vs 24 map a while back and was planning on uploading it, but a 768 kb/s upload is pretty limiting.


----------



## tacobob89

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Ironic that this post denying the existence of Intel fanboys is about the most Intel-fanboyish post in this entire thread...
> I didn't realize Intel released BW-E in 2011?


Lmao so true.


----------



## Scotty99

I've come to the realization that Intel will only have one chip in their entire lineup I can wholeheartedly recommend after R5 drops: the Pentium G4560.

Think about it, how could you recommend anything else with the pricing and cores AMD is offering?


----------



## Shatun-Bear

Possibly one of the worst Ryzen reviews out there. Typical from TPU and W1zzard:

https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/

Not only does he not bench or compare the 1800X against a 6900K, a 6950X, or even a 5960X, but he ONLY compares the 1800X against a... 7700K in all his tests. ***! They're not even in the same price category. It's a total apples-to-oranges comparison, and he based his score on that!! And TPU aren't even a 'gaming' site but a hardware one.

Then the vast majority of the tests favour frequency and do not scale well with more cores. Then look at the score. This is the guy who gives anything Nvidia 9.9, 9.8 scores.


----------



## budgetgamer120

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Possibly one of the worst Ryzen reviews out there. Typical from TPU and W1zzard:
> 
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/
> 
> Not only does he not bench or compare the 1800X against a 6900K, a 6950X, or even a 5960X, but he ONLY compares the 1800X against a... 7700K in all his tests. ***! They're not even in the same price category. It's a total apples-to-oranges comparison, and he based his score on that!! And TPU aren't even a 'gaming' site but a hardware one.
> 
> Then the vast majority of the tests favour frequency and do not scale well with more cores. Then look at the score. This is the guy who gives anything Nvidia 9.9, 9.8 scores.


I really liked TechPowerUp...
I'm going to have to add them to the list where Tom's Hardware is.


----------



## Shatun-Bear

Quote:


> Originally Posted by *budgetgamer120*
> 
> I really liked TechPowerUp...
> I'm going to have to add them to the list where Tom's Hardware is.


TPU, or W1zzard at least, have a recent history of being very pro-Nvidia and giving negative reviews/lower scores to AMD. You would do well to avoid them.


----------



## sugarhell

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Possibly one of the worst Ryzen reviews out there. Typical from TPU and W1zzard:
> 
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/
> 
> Not only does he not bench or compare the 1800X against a 6900K, a 6950X, or even a 5960X, but he ONLY compares the 1800X against a... 7700K in all his tests. ***! They're not even in the same price category. It's a total apples-to-oranges comparison, and he based his score on that!! And TPU aren't even a 'gaming' site but a hardware one.
> 
> Then the vast majority of the tests favour frequency and do not scale well with more cores. Then look at the score. This is the guy who gives anything Nvidia 9.9, 9.8 scores.


1800X: *[score screenshot missing]*

5960X: *[score screenshot missing]*

This is not about being pro-Intel, AMD, or Nvidia. This guy can't remember how he reviewed previous 8-core CPUs, and in general he can't do a proper review.


----------



## Scotty99

Almost all of the cons are just down to an early release, lol.

Yes, it was released too soon, everyone agrees with this, but what a ridiculous notion to take marks off the product for that.


----------



## Shatun-Bear

Quote:


> Originally Posted by *sugarhell*
> 
> 1800X: *[score screenshot missing]*
> 
> 5960X: *[score screenshot missing]*
> 
> This is not about being pro-Intel, AMD, or Nvidia. This guy can't remember how he reviewed previous 8-core CPUs, and in general he can't do a proper review.


Shocking. He does this often, marking down AMD for things he doesn't mark down NV/Intel for.

It really isn't a conspiracy, W1zzard does seem to have some kind of bias against AMD for whatever reason. The scores of their products alone show this.


----------



## sumitlian

I have never liked TPU, right from the start. The only thing TPU has been good for, in my case, is downloading GPU BIOSes.


----------



## tpi2007

He still gave it a "Highly Recommended" award at the end.

Anyway, perhaps more important today is this piece of news:

https://www.techpowerup.com/231585/amd-ryzen-infinity-fabric-ticks-at-memory-speed


----------



## sugarhell

Quote:


> Originally Posted by *tpi2007*
> 
> He still gave it a "Highly Recommended" award at the end.
> 
> Anyway, perhaps more important today is this piece of news:
> 
> https://www.techpowerup.com/231585/amd-ryzen-infinity-fabric-ticks-at-memory-speed


Almost everything is highly recommended nowadays.

Also we already knew that from The Stilt.


----------



## CriticalOne

Quote:


> Originally Posted by *Scotty99*
> 
> Almost all of the cons is just down to an early release lol.
> 
> Yes it was released too soon everyone agrees with this, but what a ridiculous notion to take marks off the product for that.


The product is reviewed as-is.


----------



## tpi2007

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> He still gave it a "Highly Recommended" award at the end.
> 
> Anyway, perhaps more important today is this piece of news:
> 
> https://www.techpowerup.com/231585/amd-ryzen-infinity-fabric-ticks-at-memory-speed
> 
> 
> 
> Almost everything is highly recommended nowdays.
> 
> Also we already knew that from The Stilt.

Ah, sorry, the last time I checked that thread was over thirty pages long and I'm not following it that closely.


----------



## sugarhell

Quote:


> Originally Posted by *tpi2007*
> 
> Ah, sorry, the last time I checked that thread was over thirty pages long and I'm not following it that closely.


Now he has a lot of good info in the first post :

https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/


----------



## tpi2007

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> Ah, sorry, the last time I checked that thread was over thirty pages long and I'm not following it that closely.
> 
> 
> 
> Now he has a lot of good info in the first post :
> 
> https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/

Thanks! Rep+


----------



## sumitlian

Quote:


> Originally Posted by *tpi2007*
> 
> Anyway, perhaps more important today is this piece of news:
> 
> https://www.techpowerup.com/231585/amd-ryzen-infinity-fabric-ticks-at-memory-speed


That is good info.
+Rep.


----------



## sumitlian

Quote:


> Originally Posted by *tpi2007*
> 
> He still gave it a "Highly Recommended" award at the end.


It is probably sarcasm.


----------



## Rocozaur

I believe that bashing the conclusions of the TechPowerUp! review is holding you back from seeing a very important detail:

*[power consumption chart missing]*

Low power consumption in games while also delivering less performance. But wait, what is this?

Benchmarks and tests that make full use of all the cores/threads Ryzen has to offer result in high performance and, of course, high power draw. Games deliver lower performance than Intel, but the power consumption stays low even with an OC applied! If that's not a dead giveaway that on average most games aren't properly using the Ryzen CPU, then I don't know what is.


----------



## budgetgamer120

Quote:


> Originally Posted by *Rocozaur*
> 
> I believe that bashing the conclusions of the TechPowerUp! review is holding you back from seeing a very important detail:
> 
> *[power consumption chart missing]*
> 
> Low power consumption in games while also delivering less performance. But wait, what is this?
> 
> Benchmarks and tests that make full use of all the cores/threads Ryzen has to offer result in high performance and, of course, high power draw. Games deliver lower performance than Intel, but the power consumption stays low even with an OC applied! If that's not a dead giveaway that on average most games aren't properly using the Ryzen CPU, then I don't know what is.


Prime is not a benchmark, and it does not equal a real-world scenario of an application that uses Ryzen to its full potential. That's like using FurMark to test power consumption.


----------



## AlphaC

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Possibly one of the worst Ryzen reviews out there. Typical from TPU and W1zzard:
> 
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/
> 
> Not only does he not bench or compare the 1800X against a 6900K, a 6950X, or even a 5960X, but he ONLY compares the 1800K against a....7700K in all his tests. ***! They're not even in the same price category. It's a total Apples to Oranges comparison and he based his score off that!! And TPU are note even a 'gaming' site but a hardware one.
> 
> Then the vast majority of the tests favour frequency and do not scale well with more cores. Then look at the score. This is the guy that gives anything Nvidia 9.9, 9.8 scores.


Yes, look at the memory setup. It's a scathing review.

Ryzen 7 has 2666MHz CL16 while the i7-7700K & i7-6700K are running 3000MHz CL15. If that is not biased, I don't know what is.

I have no clue why the i7-4770K was running DDR3-1600 at 9-10-9-27. Most kits I've seen are 9-9-9-24, while the i5-2500K was running DDR3-1333 at 9-10-9-27.


Spoiler: Warning: Spoiler!

*[memory configuration screenshots missing]*

JEDEC standard:
DDR3-1600H = 9-9-9
DDR3-2133L = 12-12-12
DDR4-2133N = 14-14-14
DDR4-2400P = 15-15-15
DDR4-2400R = 16-16-16

A realistic test would be:
Ryzen 7 with DDR4-2666 CL15 or lower (the official spec), plus one of the highest supported kits such as the Trident Z 3200MHz CL14
i7-4790K with 1600MHz CL9 and with 2133MHz (the highest non-overclocked DDR3)
i7-6700K with 2133MHz and 2400MHz DDR4
i7-7700K with both DDR4-2400 CL15 or CL16 (the official spec) and 3000MHz CL15


----------



## cssorkinman

Quote:


> Originally Posted by *AlphaC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Shatun-Bear*
> 
> Possibly one of the worst Ryzen reviews out there. Typical from TPU and W1zzard:
> 
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/
> 
> Not only does he not bench or compare the 1800X against a 6900K, a 6950X, or even a 5960X, but he ONLY compares the 1800X against a... 7700K in all his tests. ***! They're not even in the same price category. It's a total apples-to-oranges comparison, and he based his score on that!! And TPU aren't even a 'gaming' site but a hardware one.
> 
> Then the vast majority of the tests favour frequency and do not scale well with more cores. Then look at the score. This is the guy who gives anything Nvidia 9.9, 9.8 scores.
> 
> 
> 
> Yes, look at the memory setup. It's a scathing review.
> 
> Ryzen 7 has 2666MHz CL16 and the i7-7700k & i7-6700k are running 3000MHz CL15. If that is not biased I don't know what is.
> 
> I have no clue why the i7-4770k was running DDR3 1600 MHz 9-10-9-27. Most kits I've seen are 9-9-9-9-24 while the i5-2500k was running DDR3-1333Mhz 9-10-9-27.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> *[screenshots missing]*
> 
> A realistic test would be: Ryzen 7 with DDR4 2666Mhz RAM CL15 or lower (official spec) + highest supported kits such as the Trident Z 3200Mhz CL14, i7-4790k with 1600MHz CL9 + 2133MHz (highest DDR3 non overclocked) , i7-6700k with 2133Mhz and 2400Mhz DDR4 & i7-7700k both with DDR4 2400MHz (the official spec) & 3000Mhz CL15.
Click to expand...

Good point about the RAM speeds - seems like another hatchet job to me.

I'm quite happy with the RAM speeds I'm getting, and it was simple: punch in the rated voltage, select AXMP, CL14 3200... engage!


----------



## mcg75

Quote:


> Originally Posted by *AlphaC*
> 
> Yes, look at the memory setup. It's a scathing review.
> 
> Ryzen 7 has 2666MHz CL16 and the i7-7700k & i7-6700k are running 3000MHz CL15. If that is not biased I don't know what is.
> 
> I have no clue why the i7-4770k was running DDR3 1600 MHz 9-10-9-27. Most kits I've seen are 9-9-9-9-24 while the i5-2500k was running DDR3-1333Mhz 9-10-9-27.
> 
> JEDEC standard:
> DDR3-1600H = 9-9-9
> DDR3-2133L = 12-12-12
> DDR4-2133N = 14-14-14
> DDR4-2400P = 15-15-15
> DDR4-2400R = 16-16-16
> 
> A realistic test would be: Ryzen 7 with DDR4 2666Mhz RAM CL15 or lower (official spec) + highest supported kits such as the Trident Z 3200Mhz CL14, i7-4790k with 1600MHz CL9 + 2133MHz (highest DDR3 non overclocked) , i7-6700k with 2133Mhz and 2400Mhz DDR4 & i7-7700k both with DDR4 2400MHz CL15 or CL16 (the official spec) & 3000Mhz CL15.


It is not a scathing review. The points he made, both positive and negative, were actually pretty consistent with the findings of most other reviewers. In other words, Ryzen dominated anything where multi-threading could be used, and the 7700K won the gaming tests.

You call the memory settings biased? Did you bother to look into why they were set like they were? The 3000MHz kit AMD sent him with the board wouldn't work at that frequency. In fact, he had to set all the memory timings manually to get it to run at 2666MHz.

My opinion is that it's a motherboard issue given the problems he had but seeing as how he tried multiple memory kits and took the time to get it as high a frequency as he could, calling it biased is just a little bit uncalled for.

And the 3000MHz kit that was used with the Intel system is from his main rig, which is used for testing all the GPUs from AMD and Nvidia. I'm quite sure that when the motherboards get fixed and he can run 3000MHz with the 1800X, results will improve, but it's not going to make enough difference to change the order of the results.


----------



## Final8ty

AdoredTV


----------



## AlphaC

Quote:


> Originally Posted by *mcg75*
> 
> It is not a scathing review. The points he made both positive and negative were actually pretty consistent with the findings of most other reviewers. In other words, Ryzen dominated anything where multi-threading could be used and the 7700k won the gaming tests.
> 
> You call the memory settings biased? Did you bother to look into why they were set like they were? The 3000MHz kit AMD send him with the board wouldn't work at that frequency. In fact, he had to set all the memory timings manually to get it to run at 2666MHz.
> 
> My opinion is that it's a motherboard issue given the problems he had but seeing as how he tried multiple memory kits and took the time to get it as high a frequency as he could, calling it biased is just a little bit uncalled for.
> 
> And the 3000MHz kit that was used with the Intel, that's his main rig that is used for testing all the gpu's from AMD and Nvidia. I'm quite sure when the motherboards get fixed and he can run 3000MHz with the 1800x that results will improve but it's not going to make enough difference to change the order of the results.


Well, then he should have gotten memory that works, IMO. It's not like the review was published at launch.

That's why I called it biased. If it had been at launch, when nobody knew of the memory scaling and cross-CCX issues, it would be another story.

It's also biased against the i7-4770K + i5-2500K. We had people running the i7-4770K with >2400MHz DDR3.

http://www.corsair.com/en-us/blog/2014/march/haswellrealworld
"Haswell Real World Performance: DDR3-1600 is Not Enough"

edit:
for example
Z97X SOC FORCE: Support for DDR3 3300(O.C.) / 3200(O.C.) / 3100(O.C.) / 3000(O.C.) / 2933(O.C.) / 2800(O.C.) / 2666(O.C.) / 2600(O.C.) / 2500(O.C.) / 2400(O.C.) / 2200(O.C.) / 2133(O.C.) / 2000(O.C.) / 1866(O.C.) / 1800(O.C.) / 1600 / 1333 MHz memory modules
Z97 OC Formula: Supports DDR3/DDR3L 3400+(OC)/2933(OC)/2800(OC)/2400(OC)/2133(OC)/1866(OC)/1600/1333/1066 non-ECC
Z97 Maximus VII Formula: 4 x DIMM, Max. 32GB, DDR3 3300(O.C.)/3200(O.C.)/3100(O.C.)/3000(O.C.)/2933(O.C.)/2800(O.C.)/2666(O.C.)/2400(O.C.)/2133(O.C.)/2000(O.C.)/1866(O.C.)/1600/1333 MHz Non-ECC, Un-buffered Memory
Z97 Xpower: Support four DDR3 DIMMs 1066/1333/1600/1866*/2000*/2133*/2200*/2400*/2600*/2666*/2800*/3000*/3100*/3200*/3300*(OC)DRAM (32GB Max)
GA-Z87X-OC: Support for DDR3 3000(O.C.) / 2933(O.C.) / 2800(O.C.) / 2666(O.C.) / 2600(O.C.) / 2500(O.C.) / 2400(O.C.) / 2200(O.C.) / 2133(O.C.) / 2000(O.C.) / 1866(O.C.) / 1800(O.C.) / 1600 / 1333 MHz memory modules
Z87 OC Formula: Supports DDR3/DDR3L 3000+(OC)/2933(OC)/2800(OC)/2400(OC)/2133(OC)/1866(OC)/1600/1333/1066 non-ECC, un-buffered memory
Maximus VI formula: 4 x DIMM, Max. 32GB, DDR3 3100(O.C)/3000(O.C.)/2933(O.C.)/2800(O.C.)/2666(O.C.)/2600(O.C.)/2500(O.C.)/2400(O.C.)/2200(O.C.)/2133(O.C.)/2000(O.C.)/1866(O.C.)/1800(O.C.)/1600/1333 MHz Non-ECC, Un-buffered Memory
Z77 OC Formula: DDR3 3000+(OC)/2800(OC)/2666(OC)/2400(OC)/2133(OC)/1866(OC)/1600/1333/1066
Z77 Maximus V Formula: 4 x DIMM, Max. 32GB, DDR3 2800(O.C.)/2666(O.C.)/2600(O.C.)/2400(O.C.)/2200(O.C.)/2133(O.C.)/1866(O.C.)/1600/1333/1066 MHz
GA-Z77X-UP5 TH : Support for DDR3 2800(OC)/1600/1333/1066 MHz memory modules


----------



## mcg75

Quote:


> Originally Posted by *AlphaC*
> 
> Well then he should have gotten memory that works IMO. It's not like the review was published at launch.
> 
> That's why I called it biased. If it was at launch and nobody knew of the memory scaling and cross-CCX issues then it would be another story.


Seriously? The kit he was sent by AMD themselves wouldn't do 3000MHz, so he bought more himself and they didn't work either. What's he supposed to do, keep buying more kits out of his own pocket? A little unrealistic, don't you think?

This post from VSG sums up his review pretty well.
Quote:


> I'll mention a few things here since w1z probably won't. I have had the benefit of seeing this review from scratch to the publication stage, and this took him weeks to go through. He received the press kit late since he was not present at AMD press day (AMD could not sponsor travel for many press based out of USA), and it arrived moments before he had to leave to cover the Nvidia press event on the GTX 1080 Ti (Nvidia did sponsor travel for everyone) where he got the GPU for review. So of course he was able to cover the GPU first, and the CPU later which he took a lot of time to understand the platform as best as possible before setting forth to figure out the bugs and how to deal with them.
> 
> The AMD press kit had a mix of motherboards, and most received their kits before the final microcode even went out to the motherboard companies. In this instance, the GA-AX370-Gaming 5 arrived with the BIOS F3 which was released Feb 13, and was extremely unstable. Gigabyte then released newer BIOS over 2 weeks later, albeit beta BIOS did get out which may have helped but can't be used for reviews in my opinion anyway.
> 
> As far as why no Intel HEDT CPUs? I asked him the same thing, and it was pretty simple- he had none to compare with. He bought some mainstream enthusiast level CPUs himself for comparison since he never was a CPU reviewer and had no samples lying around. I thought it was fair myself.
> 
> Granted I do not agree with a few things myself such as the title and language on some news/social media posts, but I continue to post reviews and more here because I still think TPU is among the very best when it comes to detailed reviews and this is no exception. If not, I would have just posted everything on my own website.


----------



## budgetgamer120

Quote:


> Originally Posted by *mcg75*
> 
> Seriously? The kit he was sent by AMD themselves wouldn't do the 3000MHz so he bought more himself and they didn't work either. What's he supposed to do keep buying more kits out of his own pocket? A little unrealistic don't you think?
> 
> This post from VSG sums up his review pretty well.


Seems like a bunch of excuses as to why no HEDT.


----------



## mcg75

Quote:


> Originally Posted by *budgetgamer120*
> 
> Seems like a bunch of excuses as to why no HEDT.


Yet he somehow manages to come to the correct conclusion despite not having any to test...

Quote:


> Overall, this makes the Ryzen 7 1800X an excellent alternative to processors from the Intel HEDT lineup (eg: Core i7-6900K) - at much more attractive pricing.
> 
> If you compare Ryzen 7 1800X pricing to Intel's highest-end offerings, AMD has a clear winner on their hands once the platform's issues are ironed out. The 1800X comes at much better pricing with very similar performance.


----------



## budgetgamer120

Quote:


> Originally Posted by *mcg75*
> 
> Yet he somehow manages to come to the correct conclusion despite not having any to test......


You know people only look at graphs, right?


----------



## VSG

Quote:


> Originally Posted by *budgetgamer120*
> 
> Seems like a bunch of excuses as to why no HEDT.


If you suddenly got a press kit from AMD, would you spend your own money buying every HEDT CPU or a bunch of less expensive, more mainstream CPUs instead? He isn't a CPU reviewer yet, as I mentioned. But he's going to buy some more mainstream CPUs to prep for more CPU reviews including Ryzen 5 and Ryzen 3.


----------



## budgetgamer120

Quote:


> Originally Posted by *geggeg*
> 
> If you suddenly got a press kit from AMD, would you spend your own money buying every HEDT CPU or a bunch of less expensive, more mainstream CPUs instead? He isn't a CPU reviewer yet, as I mentioned. But he's going to buy some more mainstream CPUs to prep for more CPU reviews including Ryzen 5 and Ryzen 3.


"He isn't a CPU reviewer"? Is that really an excuse?

Even one HEDT chip would have been good.


----------



## Majin SSJ Eric

I don't really have much of an issue with TPU's review, to be honest. We all already know the 7700K is a better pure gaming chip than the R7s (common sense tells us that no octo-core is a pure gaming chip to begin with). It's disappointing he didn't include any Intel HEDT chips in the review, but given that he didn't have any on hand to test, that's understandable. If anything, it goes to show just how badly Intel's HEDT chips are priced when a very popular reviewer flat out couldn't afford to buy the chips and platforms to test, and really, at $1000 for a 6900K, can any of us blame him? His conclusions were spot on: he stated the R7s are a no-brainer buy versus the Intel X99 chips, even if he couldn't actually test them. Also, looking at the gaming graphs, I think the R7s perform very well in this review. Only a couple of outlier games show a massive difference between the OC'd 1800X and the 7700K, and we all know that with 3200MHz DDR4 that gap would be almost nonexistent. As I said, the 7700K is the faster gaming CPU; that is not in question. The question is whether the R7s are BAD at gaming (like BD was), and the unequivocal answer is of course not.


----------



## Liranan

Quote:


> Originally Posted by *budgetgamer120*
> 
> Quote:
> 
> 
> 
> Originally Posted by *geggeg*
> 
> If you suddenly got a press kit from AMD, would you spend your own money buying every HEDT CPU or a bunch of less expensive, more mainstream CPUs instead? He isn't a CPU reviewer yet, as I mentioned. But he's going to buy some more mainstream CPUs to prep for more CPU reviews including Ryzen 5 and Ryzen 3.
> 
> 
> 
> He is a cpu reviewer? Is that really an excuse?
> 
> Even one HEDT would have been good.
Click to expand...

He can't afford to spend $1500+ on an Intel setup, whereas AMD were kind enough to send him a kit to review, and he didn't say anything we don't already know. If you do nothing but gaming, the Intel 7700K is the best CPU, but if you do other things besides gaming, Zen is amazing value, value that you have to be blind to ignore.

That is unless you're a certain poster in this thread who, a few pages back, proved that there are, indeed, some willfully ignorant people around.


----------



## mAs81

What I'm really hoping to see is Intel's monopoly pricing just stop.

I mean, the 7700K is 11 euros cheaper than my 4790K, which today costs (at least) 30 euros more than it did back in August '14 when I got it, and that is utter nonsense IMO.

AMD is back in my book, and I really believe Ryzen 7 will only get better by the time Ryzen 5 and 3 come out.

It all comes down to what people want. Of course Intel will get higher FPS counts, because everything has been developed around their CPUs to maximize hardware/game performance, which is understandable: AMD didn't have a serious contender in the CPU market, but now that it does, we will only see performance rise with time (insert "fine wine" analogy here, lol).

What happens will be directed by the market. People can rant all day about poor FPS vs. moar cores or whatnot, but in the end each and every one of us buys these things according to the personal performance-to-price ratio he'd like.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Rocozaur*
> 
> I believe that bashing the conclusions of the TechpowerUP! review is holding you back from seeing a very important detail:
> 
> Low power consumption in Games while also delivering less performance. But wait, what is this?
> 
> Benchmarks and tests that make full use of all Ryzen has to offer as cores/threads result in high performance and of course high power draw. Games deliver lower performance than Intel but the power consumption stays low even with OC applied! If that's not a dead giveaway that on average most games aren't properly using the Ryzen CPU, then I don't know what is


It's more an indication that an 8-core/16-thread CPU still isn't being fully utilized in gaming, unless you're desperate for more FPS in Civ.

If we're going to pick the review apart, why did they overclock the 7700K in the productivity benches and not in the gaming ones? They overclocked the R7 in both.

I think Ryzen 5 is going to be the more interesting gaming chip; it will probably perform almost the same as a Ryzen 7 while costing the same as or less than an i5.
Quote:


> Originally Posted by *AlphaC*
> 
> Yes, look at the memory setup. It's a scathing review.
> 
> Ryzen 7 has 2666MHz CL16 and the i7-7700k & i7-6700k are running 3000MHz CL15. If that is not biased I don't know what is.
> 
> I have no clue why the i7-4770k was running DDR3 1600 MHz 9-10-9-27. Most kits I've seen are 9-9-9-9-24 while the i5-2500k was running DDR3-1333Mhz 9-10-9-27.
> 
> JEDEC standard:
> DDR3-1600H = 9-9-9
> DDR3-2133L = 12-12-12
> DDR4-2133N = 14-14-14
> DDR4-2400P = 15-15-15
> DDR4-2400R = 16-16-16
> 
> A realistic test would be: Ryzen 7 with DDR4 2666Mhz RAM CL15 or lower (official spec) + highest supported kits such as the Trident Z 3200Mhz CL14, i7-4790k with 1600MHz CL9 + 2133MHz (highest DDR3 non overclocked) , i7-6700k with 2133Mhz and 2400Mhz DDR4 & i7-7700k both with DDR4 2400MHz CL15 or CL16 (the official spec) & 3000Mhz CL15.


I believe it's quite possible with Kaby Lake on a top-end motherboard with a good kit to get the RAM up to 4000+. If you wanted to be fair, it would be Ryzen at 3200 C14 vs Kaby at 4000 C18/19.


----------



## renx

So the fabric scales with RAM frequency.
That means Ryzen @2133 versus a 7700K @2133 gives the 7700K more of an advantage than Ryzen @3200 versus a 7700K @3200.
And that may explain why Ryzen does better in some reviews.
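The relationship the linked article describes can be sketched in a couple of lines: the fabric runs at the actual memory clock, which is half the DDR transfer rate. A quick illustration (the function name is mine, not AMD's terminology):

```python
# Sketch: Ryzen's data fabric ticks at the memory clock,
# which is half the DDR4 transfer rate (MT/s).
def fabric_clock_mhz(ddr_rate_mt_s: float) -> float:
    """Return the memory/fabric clock in MHz for a given DDR rate."""
    return ddr_rate_mt_s / 2

print(fabric_clock_mhz(2133))  # 1066.5
print(fabric_clock_mhz(3200))  # 1600.0
```

So going from DDR4-2133 to DDR4-3200 raises the fabric clock from roughly 1066MHz to 1600MHz, which is why faster RAM helps Ryzen more than it helps a 7700K.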


----------



## Nizzen

Quote:


> Originally Posted by *renx*
> 
> So the fabric scales with ram frequency.
> That means Ryzen @2133 versus 7700K @2133, gives the 7700K more advantage than Ryzen @3200 versus 7700K @3200.
> And that may explain why Ryzen is doing better in some reviews.


Waiting for a review of Ryzen with 3200MHz memory vs a 7700K with 4133/4266MHz memory.


----------



## Carniflex

Quote:


> Originally Posted by *mouacyk*
> 
> The extra cores/threads/cache are half measures in between Intel's mainstream and their HEDT, given AMD's decision to go with a non-monolithic core design. Yes, they saved on R&D and produced a cheaper product, but is it competitive? Not in the sense that it has made decision-making clearer or easier -- and that is really a missed opportunity on a brand-new product launch designed by a renowned engineer.
> 
> And "competitive power envelope"? Sure, but performance compromises were made to make that possible. Yes, it may take less cooling to keep Ryzen within operating temperatures, but reviews have shown that consumption is nearly identical to its Intel counterpart. Not really seeing any advantage here either.
> 
> src: https://www.pcper.com/reviews/Processors/AMD-Ryzen-7-1800X-Review-Now-and-Zen/Power-Consumption-and-Conclusions
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> In fact, despite the 95 watt TDP, the Ryzen CPU uses about the same power as the 140 watt Broadwell-E processors.


"Competitive" does not need to mean beating the snot out of the opposition. For me it means it's roughly the same, as you have noted: it's in the same ballpark. Which is entirely logical; both are using a 14nm process, and volts and clocks are in the same-ish ballpark at stock settings as well.


----------



## IRobot23

Quote:


> Originally Posted by *Nizzen*
> 
> Waiting for a review of Ryzen with 3200 MHz memory vs a 7700k with 4133/4233 MHz memory


Why that expensive RAM? You can get X99 cheaper, with better RAM bandwidth. Maybe SP gaming won't run at 144Hz, but it will definitely kill the 7700K in a well-optimized 64-player MP game.


----------



## Slink3Slyde

The point is that, relatively speaking, in the TPU review both CPUs are not using the fastest RAM they can handle. Therefore, saying that Ryzen is being handicapped unfairly is nonsense.


----------



## IRobot23

Quote:


> Originally Posted by *Slink3Slyde*
> 
> The point is that relatively speaking in the TPU review, both cpus are not using the fastest RAM they can handle. Therefore to say that Ryzen is being handicapped unfairly is nonsense.


That's not the point. By that logic you could say games are not well optimized for Ryzen, or that a few things were running in the background, etc.

Like I said a few months ago, quad cores were supposedly no longer enough for the latest games (for real 144Hz gamers), but it looks like the i7 7700K is still just enough for everything if you compare it to Ryzen.

You should watch this: at some moments his CPU hits almost 80% usage. There is no way an overclocked 7700K can keep up with that.
https://www.youtube.com/watch?v=M1VsX58qURs


----------



## Blameless

Quote:


> Originally Posted by *Slink3Slyde*
> 
> The point is that relatively speaking in the TPU review, both cpus are not using the fastest RAM they can handle. Therefore to say that Ryzen is being handicapped unfairly is nonsense.


Memory performance isn't the handicap. Ryzen's data fabric, the interconnect attaching the CCXes to each other and to all I/O, is tied to the memory clock.

Relative to readily achievable ideals, you likely handicap Ryzen more with DDR4-2666 vs 3200 than you do an Intel setup with DDR4-2133 vs 4000.

Anyway, they should all be using the same memory: not whatever was on hand, and not necessarily the fastest memory they can handle (DDR4-4000+ is a complete rip-off and won't offer either platform major advantages over slower memory), but the exact same DIMMs, run at the best 24/7-stable settings those DIMMs can manage on each platform.
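The fabric point above is easy to put in numbers. On Ryzen the data fabric runs at the memory's real clock (half the DDR transfer rate), so a RAM upgrade also speeds up all cross-CCX traffic. A small sketch:

```python
# On Ryzen, the Infinity Fabric ticks at the memory's real clock, which is
# half the DDR transfer rate, so faster RAM also means a faster interconnect.
def fabric_clock_mhz(ddr_mt_s):
    return ddr_mt_s / 2

base = fabric_clock_mhz(2133)
for kit in (2133, 2666, 3200):
    mclk = fabric_clock_mhz(kit)
    print(f"DDR4-{kit}: fabric at {mclk:.0f} MHz ({mclk / base - 1:+.0%} vs 2133)")
```

Going from DDR4-2133 to 3200 raises the fabric clock by roughly 50%, which is why memory choice matters more here than it does on an Intel ring-bus part.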


----------



## kd5151




----------



## anker020

Thanks for the info,i will read some of them.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kd5151*


Been waiting for these reviews. Clearly they are the leaders when it comes to testing games.


----------



## Slink3Slyde

Quote:


> Originally Posted by *IRobot23*
> 
> Thats not the point. Then you could say that games are not well optimized for ryzen. Or run few in background etc.
> 
> Like I said few month ago was like quad cores are not enough for latest games (real 144Hz gamers), but looks like i7 7700K is just enough everything if you compare it to ryzen.
> 
> You should watch this is some moments his CPU will almost 80% usage. There is no way that overclocked 7700K can go against it.
> https://www.youtube.com/watch?v=M1VsX58qURs


I disagree. You can cite BF1 MP, but the fact is that in a whole myriad of other games a fast-clocked i7 quad core is faster; even an i5 7600K matches or beats Ryzen in many games today, I'd even say most. I don't really want to have that argument though, it's been done so many times. Ryzen is good for gaming and will be more future-proof than an i5 going forward. I think, though, that by the time Ryzen is beating a 7700K significantly in the majority of games, both CPUs will have been superseded by better models. If I'm wrong I'm wrong, and oh dear, I'll just have to buy a whole new platform to play with








Quote:


> Originally Posted by *Blameless*
> 
> Memory performance isn't the handicap. Ryzen's data fabric, the interconnect attaching the CCXes to each other and all I/O is tied to memory clock.
> 
> Relative to readily achievable ideals, you likely handicap a Ryzen more with DDR4-2666 vs 3200 than you do an Intel setup with DDR4-2133 vs. 4000.
> 
> Anyway, they should all be using the same memory, not whatever was on hand, not necessarily the fastest memory they can handle (DDR4-4000+ is a complete rip off and won't offer either platform major advantages over lower memory), but the same exact DIMMs, run at the best 24/7 stable settings that can be managed by those DIMMs on each platform.


You and @AlphaC will have to forgive me for having gaming blinkers on. I know that Intel chips do show improved performance in games from faster RAM and tighter timings, some games more than others. I looked briefly at the TPU gaming tests, which showed Ryzen gaining something like 5% on average going from 2133 to 2666; assuming the scaling continued at the same rate up to 3200 RAM, some napkin maths says it's still mostly behind the 7700K, which would itself show gains from being overclocked and having faster RAM. That's where I was coming from.
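The napkin maths above can be written out explicitly. All numbers here are illustrative: the ~5% figure is the poster's reading of the TPU charts, and linear scaling with transfer rate is a naive assumption, not a measured result:

```python
# Napkin math: TPU showed roughly a 5% average gaming gain on Ryzen going
# from DDR4-2133 to 2666. Assuming (naively) the gain is linear in transfer
# rate, extrapolate to DDR4-3200. Illustrative only, not measured data.
gain_2133_to_2666 = 0.05
gain_per_mt = gain_2133_to_2666 / (2666 - 2133)
est_gain_3200 = gain_per_mt * (3200 - 2133)
print(f"estimated gaming gain, DDR4-3200 vs 2133: {est_gain_3200:.1%}")
```

That works out to roughly a 10% uplift, which is the order of magnitude the post's "still behind the 7700K" conclusion rests on.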

As far as productivity and office tasks go, I think Ryzen is a big winner, even if it is being handicapped in the reviews that I've seen.


----------



## AlphaC

It's not just the test setup but also the methodology. We know which applications were tested, but I don't see what methodology was used. For example, the Excel rating seems off to me; see http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page3.html If it were written as "Monte Carlo simulation" or "Black-Scholes" it would be better. For all we know it could be "opening a file in Excel" or "computing a sum".
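For contrast, here is the kind of precisely named, repeatable workload the complaint is asking for, e.g. "Monte Carlo simulation". This is a hypothetical illustration: nothing below reflects what TechSpot actually ran, and the sample count and seed are made up:

```python
# Hypothetical, well-defined benchmark workload: estimate pi by Monte Carlo
# sampling and time it. A fixed seed makes the run repeatable across systems.
import random
import time

def monte_carlo_pi(n, seed=42):
    rng = random.Random(seed)  # fixed seed: identical work on every run
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n

start = time.perf_counter()
estimate = monte_carlo_pi(200_000)
print(f"pi ~ {estimate:.4f}, computed in {time.perf_counter() - start:.3f} s")
```

A review that names the workload this concretely lets readers judge whether the result is compute-bound, memory-bound, or just file-I/O, which is exactly what the unlabeled "Excel" score hides.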

We also know Ryzen relies heavily on the thermal solution as far as XFR goes; the thermal solution here appears to be some tower cooler, but it isn't listed. We know people with custom loops and top-tier coolers are getting past the 75°C mark.

It seems sloppy overall compared to the Kaby Lake review, in which "Cryorig R1 Universal 2x 140 mm fan" is clearly listed. https://www.techpowerup.com/reviews/Intel/Core_i7-7700K_vs_6700K_Game_Performance/2.html

The end summary reads:
+ AMD processors are competitive again ---> not just yet; we need $200-250 CPUs from Ryzen 5
+ Outstanding performance in heavy multi-threaded apps ---> only in apps that can be parallelized past 8 threads, such as Visual Studio compiling, Handbrake, Cinema 4D, VirtualBox/VMware, V-Ray, or Mental Ray
+ Cheaper than Intel HEDT processors ---> met pricing expectations
+ Single-threaded performance improved ---> it beat expectations
+ Low power draw and excellent power efficiency ---> beat power expectations
+ Platform updated to include latest features (PCIe 3.0, USB 3.1, NVMe) ---> OK, but USB 3.1 Gen 2 is what matters, not USB 3.1
- Horrible motherboards / BIOS, feels not ready for market ---> prerelease motherboard BIOSes; this is mostly on the motherboard makers' focus on Z270
- Limited game performance ---> it's not really gaming oriented
- Memory frequency options and memory compatibility limited ---> a large nuisance due to Intel-oriented memory kits
- Setup complicated (memory, HPET, CCX, SMT, and power profile)
- Overclocking barely worth it ---> it's the Ryzen 7 1800X SKU, and made on a low-power process
- Requires optimized apps, of which there are not many ---> mostly Windows 10 plus parallelization limitations past 4 threads
- Lacks integrated graphics ---> irrelevant to most

I actually disagree with TPU's 8.6 score: the Ryzen 7 1800X is not the CPU to buy at this time for gaming, for anything relying on DX11, or for anything relying on viewports (Photoshop, media work except encoding that can be parallelized, CAD/3D modeling). It's a computational CPU at heart. The Ryzen 7 1800X deserves a 6 or 6.5 at best, unless newer steppings or batches magically XFR boost to 4.3GHz on 2-4 cores or turbo to 4.1 on 8 cores; the Ryzen 7 1700 deserves a 7.5 or 8.0 due to its pricing, included cooler, overclock-vs-stock headroom, and stock TDP. Until Ryzen 5 launches, motherboards are ironed out, and AMD-optimized memory comes around, we won't see the adoption AMD is desperate for. Most users won't have the patience to wait for a motherboard, wait for an AM4 CPU cooler bracket, go through lists of memory kits actually designed for Intel, and mess with Windows settings / core affinity / process monitors just to get it working properly. I expect Ryzen 5 hexacores, if implemented properly, to be a solid 8.

I'm still waiting for a review that streams audio through a VoIP application such as Discord/Mumble/Vent/TS/Skype while doing something else at the same time, whether that's gaming, number crunching, etc. Everyone and their dog is doing "gaming" averages _without min FPS and frametimes_ on clean systems, and synthetic benchmarks on unrealistic Windows installs.
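The "averages without min FPS and frametimes" complaint has a standard remedy: report 1% low FPS from the raw frame-time capture instead of just the mean. A minimal sketch, with made-up sample data:

```python
# Average FPS hides stutter. 1% low FPS summarizes the slowest 1% of frames
# from a raw frame-time capture, so spikes show up in the headline number.
def one_percent_low_fps(frametimes_ms):
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)          # the slowest 1% of frames
    return 1000.0 / (sum(worst[:n]) / n)   # mean of that tail, as FPS

frames = [7.0] * 990 + [25.0] * 10         # mostly ~143 fps, ten 25 ms spikes
print(f"average FPS: {1000 * len(frames) / sum(frames):.0f}")
print(f"1% low FPS:  {one_percent_low_fps(frames):.0f}")
```

With this sample data the average stays near 140 fps while the 1% low drops to 40 fps, which is precisely the stutter an average-only chart would never show.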

Also, memory affects Ryzen *much* more than Kaby Lake due to the Infinity Fabric's relation to memory speed. You could argue until you're blue about DDR4-4000+ on Kaby Lake, but it doesn't change the fact that Kaby Lake is portrayed as better than it should be because Ryzen *and the older i7s* are being gimped. (read http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page2.html)

It's kind of hilarious that TPU themselves reported on this very issue: https://www.techpowerup.com/231268/amds-ryzen-cache-analyzed-improvements-improveable-ccx-compromises , https://www.techpowerup.com/231585/amd-ryzen-infinity-fabric-ticks-at-memory-speed


----------



## ZealotKi11er

Quote:


> Originally Posted by *AlphaC*
> 
> It's not just the test setup but also the methodology. We just know some applications were tested but I don't see what methodology was used. For example the Excel rating seems off to me, see http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page3.html If it was written "Monte Carlo simulation" or "Black scholes" it would be better. For all we know it could "opening a file in Excel" or "making a sum".
> 
> We also know Ryzen has a heavy reliance on the thermal solution as far as XFR ; the thermal solution appears to be some tower cooler but it isn't listed. We know people with custom loops and top tier coolers are getting over the 75°C mark.
> 
> It seems sloppy overall compared to Kaby Lake's review in which "Cryorig R1 Universal 2x 140 mm fan" is clearly listed. https://www.techpowerup.com/reviews/Intel/Core_i7-7700K_vs_6700K_Game_Performance/2.html
> 
> The end summary reads:
> + AMD processors are competitive again ---> not just yet , we need $200-250 CPUs from Ryzen 5
> + Outstanding performance in heavy multi-threaded apps ---> only in apps that can be parallelized past 8 threads , such as Visual studio compiling, handbrake, Cinema4d , Virtual box / VMware , Vray,or Mentalray
> + Cheaper than Intel HEDT processors ---> met pricing expectations
> + Single-threaded performance improved ---> it beat expectations
> + Low power draw and excellent power efficiency ---> beat power expectations
> + Platform updated to include latest features (PCIe 3.0, USB 3.1, NVMe) --> OK ,but USB 3.1 Gen 2 is what matters not USB 3.1
> - Horrible motherboards / BIOS, feels not ready for market ---> motherboard prerelease BIOS , this is mostly on the motherboard makers' focus on Z270
> - Limited game performance ---> it's not really gaming oriented
> - Memory frequency options and memory compatibility limited ---> a large nuisance due to Intel oriented memory kits
> - Setup complicated (memory, HPET, CCX, SMT, and power profile)
> - Overclocking barely worth it ---> it's the Ryzen 7 1800X SKU, and made on a low-power process
> - Requires optimized apps of which there are not many ---> mostly Windows 10 + parallelization limitations of more than 4 threads
> - Lacks integrated graphics ---> irrelevant to most
> 
> I actually disagree with the TPU assessment on 8.6 score: Ryzen 7 1800X is not the CPU to buy for gaming , anything relying on DX11, or anything relying on viewports (Photoshop, media except encoding that can be parallelized, CAD/ 3d modeling) at this time. It's a computational CPU at its heart. The Ryzen 7 1800X deserves a 6 score or 6.5 at best unless newer steppings or batches magically XFR boost to 4.3GHz on 2-4 cores or 8 cores turbo to 4.1; the Ryzen 7 1700 deserves a 7.5 or 8.0 due to pricing / inclusion of cooler / overclock vs stock / TDP when stock. Until Ryzen 5 launches, motherboards are ironed out, and AMD optimized memory comes around we won't see the adoption AMD is desperate for. Most users won't have the patience to wait for a motherboard, wait for AM4 CPU cooler bracket, go through lists of memory kits actually designed for Intel, mess with Windows settings / core affinity / process monitors just to get it working properly. I expect ryzen 5 hexcores if implemented properly to be a solid 8 score.
> 
> I'm still waiting for a review that streams audio through VoIP application such as Discord/mumble/Vent/TS/Skype while doing something else at the same time whether it is gaming / number crunching / etc. Everyone and their dog is doing "gaming" average _without min FPS and frametimes_ on clean systems & synthetic benchmarks on unrealistic Windows installs.
> 
> Also the memory affects Ryzen *much* more than Kaby Lake due to the Infinity fabric's relation to memory speed. You could argue until you're blue about DDR4 4000+ on Kaby Lake but it doesn't change the fact that Kaby Lake is portrayed as better than it should be due to the Ryzen *& older i7s* being gimped. (read http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page2.html)
> 
> It's kind of hilarious that TPU themselves reported on the very issue: https://www.techpowerup.com/231268/amds-ryzen-cache-analyzed-improvements-improveable-ccx-compromises , https://www.techpowerup.com/231585/amd-ryzen-infinity-fabric-ticks-at-memory-speed


The only problem with DDR4-4000 is price. You have to take that into account. Up to 3200, the pricing is reasonable.


----------



## Slink3Slyde

Quote:


> Originally Posted by *AlphaC*
> 
> It's not just the test setup but also the methodology. We just know some applications were tested but I don't see what methodology was used. For example the Excel rating seems off to me, see http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page3.html If it was written "Monte Carlo simulation" or "Black scholes" it would be better. For all we know it could "opening a file in Excel" or "making a sum".
> 
> We also know Ryzen has a heavy reliance on the thermal solution as far as XFR ; the thermal solution appears to be some tower cooler but it isn't listed. We know people with custom loops and top tier coolers are getting over the 75°C mark.
> 
> It seems sloppy overall compared to Kaby Lake's review in which "Cryorig R1 Universal 2x 140 mm fan" is clearly listed. https://www.techpowerup.com/reviews/Intel/Core_i7-7700K_vs_6700K_Game_Performance/2.html
> 
> The end summary reads:
> + AMD processors are competitive again ---> not just yet , we need $200-250 CPUs from Ryzen 5
> + Outstanding performance in heavy multi-threaded apps ---> only in apps that can be parallelized past 8 threads , such as Visual studio compiling, handbrake, Cinema4d , Virtual box / VMware , Vray,or Mentalray
> + Cheaper than Intel HEDT processors ---> met pricing expectations
> + Single-threaded performance improved ---> it beat expectations
> + Low power draw and excellent power efficiency ---> beat power expectations
> + Platform updated to include latest features (PCIe 3.0, USB 3.1, NVMe) --> OK ,but USB 3.1 Gen 2 is what matters not USB 3.1
> - Horrible motherboards / BIOS, feels not ready for market ---> motherboard prerelease BIOS , this is mostly on the motherboard makers' focus on Z270
> - Limited game performance ---> it's not really gaming oriented
> - Memory frequency options and memory compatibility limited ---> a large nuisance due to Intel oriented memory kits
> - Setup complicated (memory, HPET, CCX, SMT, and power profile)
> - Overclocking barely worth it ---> it's the Ryzen 7 1800X SKU, and made on a low-power process
> - Requires optimized apps of which there are not many ---> mostly Windows 10 + parallelization limitations of more than 4 threads
> - Lacks integrated graphics ---> irrelevant to most
> 
> I actually disagree with the TPU assessment on 8.6 score: Ryzen 7 1800X is not the CPU to buy for gaming , anything relying on DX11, or anything relying on viewports (Photoshop, media except encoding that can be parallelized, CAD/ 3d modeling) at this time. It's a computational CPU at its heart. The Ryzen 7 1800X deserves a 6 score or 6.5 at best unless newer steppings or batches magically XFR boost to 4.3GHz on 2-4 cores or 8 cores turbo to 4.1; the Ryzen 7 1700 deserves a 7.5 or 8.0 due to pricing / inclusion of cooler / overclock vs stock / TDP when stock. Until Ryzen 5 launches, motherboards are ironed out, and AMD optimized memory comes around we won't see the adoption AMD is desperate for. *Most users won't have the patience to wait for a motherboard, wait for AM4 CPU cooler bracket, go through lists of memory kits actually designed for Intel*, mess with Windows settings / core affinity / process monitors just to get it working properly. I expect ryzen 5 hexcores if implemented properly to be a solid 8 score.
> 
> I'm still waiting for a review that streams audio through VoIP application such as Discord/mumble/Vent/TS/Skype while doing something else at the same time whether it is gaming / number crunching / etc. Everyone and their dog is doing "gaming" average _without min FPS and frametimes_ on clean systems & synthetic benchmarks on unrealistic Windows installs.
> 
> Also the memory affects Ryzen *much* more than Kaby Lake due to the Infinity fabric's relation to memory speed. You could argue until you're blue about DDR4 4000+ on Kaby Lake but it doesn't change the fact that Kaby Lake is portrayed as better than it should be due to the Ryzen *& older i7s* being gimped. (read http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page2.html)
> 
> It's kind of hilarious that TPU themselves reported on the very issue: https://www.techpowerup.com/231268/amds-ryzen-cache-analyzed-improvements-improveable-ccx-compromises , https://www.techpowerup.com/231585/amd-ryzen-infinity-fabric-ticks-at-memory-speed


Don't have time to argue until I'm blue, I'm afraid; there's decent whiskey to be drunk. Just to say that, going by the other reviews I've read, Ryzen performs very well in everything. I admit I don't look very closely at the productivity charts, but considering its price it seems alright to me, especially the 1700. The bolded part of your post is half the reason I don't have a Ryzen system today, the other parts being the decent but not mind-blowing gaming results, and the fact that I picked up a deal on a mobo/RAM/CPU combo that worked out about 150 Euros cheaper here than a Ryzen rig with a decent mobo. But I digress.

You've probably read the TechReport review, but here it is anyway. Frame-time analysis on a Ryzen system with 2933 MT/s RAM at good timings shows decent results. Minimum frame times have been charted by Techspot in an article AMD themselves cited in their statement the other day. Minimum frame times with SMT disabled are, on average, around those of an i5 7600K, or if you'd prefer, the 5960X, so there's that.

I think I'm the only guy in the world who just plays games when I play, judging from what I read in these threads. I don't have my system streaming anything when I play, I find the habit of having 700 browser tabs open weird (that's what bookmarks are for), and I don't often need to do much video encoding, etc. If I want to look something up while I'm playing, alt-tab is fine for me, and I also have a tablet and a phone for checking emails or whatever. No such reviews were done before Ryzen was released to demonstrate the power of X79/X99 or the 980X over a quad-core Intel setup; no one even questioned it before.

DDR4-4000? Meh, it's far too expensive, but when you're weighing the relative performance of two platforms, is it fair to have one running at the absolute limit of what it's capable of and the other somewhere in the middle? I understand that Ryzen benefits far more from faster RAM than an Intel setup does, and the cost is obscene for the top speeds. I guess as a reviewer you have to make some choices about what you're going to do, and there will always be people to second-guess whatever choice you make. As for the older CPUs being gimped with RAM, it's possible it was done with what was on hand at the time, I guess.

I don't want to get caught up as if I'm defending TPU; I usually like their reviews, but I think you do make some good points.

Anyway, I've been rambling. Whiskey.


----------



## JackCY

Quote:


> Originally Posted by *AlphaC*
> 
> It's not just the test setup but also the methodology. We just know some applications were tested but I don't see what methodology was used. For example the Excel rating seems off to me, see http://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page3.html If it was written "Monte Carlo simulation" or "Black scholes" it would be better. For all we know it could "opening a file in Excel" or "making a sum".
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> We also know Ryzen has a heavy reliance on the thermal solution as far as XFR ; the thermal solution appears to be some tower cooler but it isn't listed. We know people with custom loops and top tier coolers are getting over the 75°C mark.
> 
> It seems sloppy overall compared to Kaby Lake's review in which "Cryorig R1 Universal 2x 140 mm fan" is clearly listed. https://www.techpowerup.com/reviews/Intel/Core_i7-7700K_vs_6700K_Game_Performance/2.html
> 
> The end summary reads:
> + AMD processors are competitive again ---> not just yet , we need $200-250 CPUs from Ryzen 5
> + Outstanding performance in heavy multi-threaded apps ---> only in apps that can be parallelized past 8 threads , such as Visual studio compiling, handbrake, Cinema4d , Virtual box / VMware , Vray,or Mentalray
> + Cheaper than Intel HEDT processors ---> met pricing expectations
> + Single-threaded performance improved ---> it beat expectations
> + Low power draw and excellent power efficiency ---> beat power expectations
> + Platform updated to include latest features (PCIe 3.0, USB 3.1, NVMe) --> OK ,but USB 3.1 Gen 2 is what matters not USB 3.1
> - Horrible motherboards / BIOS, feels not ready for market ---> motherboard prerelease BIOS , this is mostly on the motherboard makers' focus on Z270
> - Limited game performance ---> it's not really gaming oriented
> - Memory frequency options and memory compatibility limited ---> a large nuisance due to Intel oriented memory kits
> - Setup complicated (memory, HPET, CCX, SMT, and power profile)
> - Overclocking barely worth it ---> it's the Ryzen 7 1800X SKU, and made on a low-power process
> - Requires optimized apps of which there are not many ---> mostly Windows 10 + parallelization limitations of more than 4 threads
> - Lacks integrated graphics ---> irrelevant to most
> 
> I actually disagree with the TPU assessment on 8.6 score: Ryzen 7 1800X is not the CPU to buy for gaming , anything relying on DX11, or anything relying on viewports (Photoshop, media except encoding that can be parallelized, CAD/ 3d modeling) at this time. It's a computational CPU at its heart. The Ryzen 7 1800X deserves a 6 score or 6.5 at best unless newer steppings or batches magically XFR boost to 4.3GHz on 2-4 cores or 8 cores turbo to 4.1; the Ryzen 7 1700 deserves a 7.5 or 8.0 due to pricing / inclusion of cooler / overclock vs stock / TDP when stock. Until Ryzen 5 launches, motherboards are ironed out, and AMD optimized memory comes around we won't see the adoption AMD is desperate for. Most users won't have the patience to wait for a motherboard, wait for AM4 CPU cooler bracket, go through lists of memory kits actually designed for Intel, mess with Windows settings / core affinity / process monitors just to get it working properly. I expect ryzen 5 hexcores if implemented properly to be a solid 8 score.
> 
> I'm still waiting for a review that streams audio through VoIP application such as Discord/mumble/Vent/TS/Skype while doing something else at the same time whether it is gaming / number crunching / etc. Everyone and their dog is doing "gaming" average _without min FPS and frametimes_ on clean systems & synthetic benchmarks on unrealistic Windows installs.
> 
> 
> 
> Also the memory affects Ryzen *much* more than Kaby Lake due to the Infinity fabric's relation to memory speed. You could argue until you're blue about DDR4 4000+ on Kaby Lake but it doesn't change the fact that Kaby Lake is portrayed as better than it should be due to the Ryzen *& older i7s* being gimped. (read http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page2.html)
> 
> It's kind of hilarious that TPU themselves reported on the very issue: https://www.techpowerup.com/231268/amds-ryzen-cache-analyzed-improvements-improveable-ccx-compromises , https://www.techpowerup.com/231585/amd-ryzen-infinity-fabric-ticks-at-memory-speed


People's wallets don't always agree with what wannabe reviewers think is the best product.
As long as Intel keeps overcharging like they have so far, with their platform limited to 4 cores max, or pay triple to get 8 cores, ECC, ...
It makes little to no sense to stick with Intel anymore, except for specialized use cases.

The RAM and fabric speed isn't an issue. RAM affects Ryzen CPUs the same way it affects Intel CPUs, not some crazy 1.5x or 2.0x more than Intel.

DDR4 overall is still kind of stuck at DDR3 speeds; it just runs at lower volts and power.


----------



## Kuivamaa

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Been waiting for these reviews. Clearly they are the leaders when it comes to testing games.


Decent review, even if I disagree with the "bench at low res to evaluate longevity for CPUs" mantra. I won't discuss that here, but something else: they wanted to touch on the CCX topic, but they benched maybe the absolute worst games they could have picked. Both TW3 and Crysis 3 are fully multithreaded, to the point that they nearly mimic MT synthetics. CCX-interconnect-related fps drops are unlikely to happen in these.


----------



## cssorkinman

Quote:


> Originally Posted by *Kuivamaa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Been waiting for these reviews. Clearly they are the leaders when it comes to testing games.
> 
> 
> 
> Decent review, even if I disagree with the "bench at low res to evaluate longevity for CPUs" mantra. I won't discuss that here, but something else: they wanted to touch on the CCX topic, but they benched maybe the absolute worst games they could have picked. Both TW3 and Crysis 3 are fully multithreaded, to the point that they nearly mimic MT synthetics. CCX-interconnect-related fps drops are unlikely to happen in these.

It's pretty amazing in C3


Spoiler: Warning: Spoiler!


----------



## iRUSH

Quote:


> Originally Posted by *cssorkinman*
> 
> It's pretty amazing in C3
> 
> 
> Spoiler: Warning: Spoiler!


No doubt! The thread optimization in C3 is unreal!

I remember testing the legendary 2500k against a stock $89 FX 6300 and scratching my head at the outstanding results from the 6-core FX.

That was 4 years ago! I would have expected this in every AAA title that followed.


----------



## ZealotKi11er

Quote:


> Originally Posted by *iRUSH*
> 
> No doubt! The thread optimization in C3 is unreal!
> 
> I remember testing the legendary 2500k against a stock $89 FX 6300 and scratching my head at the outstanding results from the 6 core FX.
> 
> This was 4 years ago! I would have expected this across every AAA title to follow.


For Crysis 2 you have to go to the grass level to see the CPU take a huge hit. That was the first game where i7s beat i5s.


----------



## Oubadah

..


----------



## jprovido

Got my Ryzen mATX rig up and running. Currently I'm at 3.9GHz with 2400MHz CL15 RAM (a 3200MHz CL16 kit).

Multithreaded performance is off the chain. Even my previous X99 system with an i7 5820k @ 4.7GHz and 3200MHz RAM doesn't come close; it got destroyed in Cinebench. My 7700k @ 5.1GHz scored only 1100+.

Gaming performance is a bit disappointing. Dota 2 sometimes gets as low as 80fps (110fps average) during big team fights in the later parts of a game. Still waiting for BIOS updates; I hope 3200MHz RAM speed will improve the gaming performance. No complaints with VR though, it worked great, as expected.


----------



## Scotty99

Quote:


> Originally Posted by *jprovido*
> 
> Got my Ryzen mATX rig up and running. Currently I'm at 3.9GHz with 2400MHz CL15 RAM (a 3200MHz CL16 kit).
> 
> Multithreaded performance is off the chain. Even my previous X99 system with an i7 5820k @ 4.7GHz and 3200MHz RAM doesn't come close; it got destroyed in Cinebench. My 7700k @ 5.1GHz scored only 1100+.
> 
> Gaming performance is a bit disappointing. Dota 2 sometimes gets as low as 80fps (110fps average) during big team fights in the later parts of a game. Still waiting for BIOS updates; I hope 3200MHz RAM speed will improve the gaming performance. No complaints with VR though, it worked great, as expected.


Oh god no 80 fps, heaven forbid.


----------



## jprovido

Quote:


> Originally Posted by *Scotty99*
> 
> Oh god no 80 fps, heaven forbid.


It looked horrible on my 1440p 144Hz monitor; it was a stuttery mess. (No G-Sync btw, I have a FreeSync monitor but two 1080s in SLI.)


----------



## Scotty99

Quote:


> Originally Posted by *jprovido*
> 
> It looked horrible on my 1440p 144Hz monitor; it was a stuttery mess. (No G-Sync btw, I have a FreeSync monitor but two 1080s in SLI.)


Did you watch a single ryzen review before you purchased?


----------



## jprovido

Quote:


> Originally Posted by *Scotty99*
> 
> Did you watch a single ryzen review before you purchased?


Yep, it was pretty much expected, but then again I'm running at 2400MHz right now (I can't get it any higher than that at the moment with my motherboard). Hopefully I can get the minimums up into the 100s when a BIOS update rolls out or when I buy a 3200MHz Ryzen kit that works out of the box.

Overwatch was a pleasant surprise. It ran locked at 144fps (my 5820k @ 4.7GHz with 3200MHz RAM couldn't do it; I was getting drops to around 120fps).


----------



## Scotty99

Quote:


> Originally Posted by *jprovido*
> 
> Yep, it was pretty much expected, but then again I'm running at 2400MHz right now. Hopefully I can get the minimums up into the 100s when a BIOS update rolls out or when I buy a 3200MHz kit that works out of the box.


Does Dota even benefit from a 144Hz panel? With a 60Hz 1440p monitor and vsync it would be flawless gameplay.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> Got my Ryzen mATX rig up and running. Currently I'm at 3.9GHz with 2400MHz CL15 RAM (a 3200MHz CL16 kit).
> 
> Multithreaded performance is off the chain. Even my previous X99 system with an i7 5820k @ 4.7GHz and 3200MHz RAM doesn't come close; it got destroyed in Cinebench. My 7700k @ 5.1GHz scored only 1100+.
> 
> Gaming performance is a bit disappointing. Dota 2 sometimes gets as low as 80fps (110fps average) during big team fights in the later parts of a game. Still waiting for BIOS updates; I hope 3200MHz RAM speed will improve the gaming performance. No complaints with VR though, it worked great, as expected.


Sounds like your experience is mostly positive.

How is 80fps stuttery?


----------



## jprovido

Quote:


> Originally Posted by *Scotty99*
> 
> Does dota even benefit from a 144hz panel? With a 60hz vsync 1440p monitor it would be flawless gameplay.


The game looks beautiful at a locked 144fps, buttery smooth. Overwatch had no problems whatsoever, and I tried streaming on Twitch; it was hella smooth, no skipped frames at all. Still hoping for optimizations with Dota 2. If I can get that to run at a locked 144fps like my 7700K, I don't care about other games; I'll switch out the systems and put the 1700X in my main rig. It's fun playing with Ryzen though, I've been having a blast so far.
Quote:


> Originally Posted by *budgetgamer120*
> 
> Sounds like your experience is mostly positive.
> 
> How is 80fps stuttery


When a game runs below the refresh rate (without adaptive sync) it will look stuttery, especially when panning the map.

yep it's a beast







what's more impressive is it's a matx build. so small yet so bauss
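The stutter jprovido describes (~80fps on a fixed 144Hz panel with no FreeSync/G-Sync engaged) can be sketched with a toy vsync model. This is purely illustrative; the function name is mine, not from any profiling tool:

```python
import math

def display_intervals(fps, refresh_hz, n_frames=9):
    """For each rendered frame, count how many refresh cycles it stays on screen.

    With vsync on a fixed-refresh display, a frame is held for a whole number
    of 1/refresh_hz intervals, so any fps below the refresh rate produces an
    uneven mix of hold times -- the visible judder when panning the map.
    """
    counts = []
    for i in range(n_frames):
        ready = i / fps                                # frame i finishes rendering
        next_ready = (i + 1) / fps                     # next frame finishes
        first_scan = math.ceil(ready * refresh_hz)     # vsync that first shows frame i
        next_scan = math.ceil(next_ready * refresh_hz) # vsync that replaces it
        counts.append(next_scan - first_scan)
    return counts

print(display_intervals(80, 144))   # uneven mix of 1- and 2-refresh holds -> judder
print(display_intervals(144, 144))  # every frame held exactly 1 refresh -> smooth
```

Adaptive sync (FreeSync/G-Sync) removes the problem by letting the panel wait for each frame instead of forcing whole-refresh holds.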


----------



## Oubadah

..


----------



## Scotty99

Quote:


> Originally Posted by *jprovido*
> 
> the game looks beautiful at locked 144fps buttery smooth. Overwatch had no problems whatsoever and tried streaming on twitch it was hella smooth no skipped frames at all. still hoping for optimizations with Dota 2. if I can get that to run at locked 144fps like my 7700k I don't care about other games I will switch out the systems and put the 1700x on my main rig. it's fun playing with ryzen though I've been having a blast so far
> when a game runs at below the refresh rate(without adaptive refresh rate) it will look stuttery esp when panning the map.


Definitely something wrong with Dota then. The 7700K is a superior gaming CPU of course, but that wouldn't explain a locked 144fps vs dips into the 80s on Ryzen.

Surprising too, 'cause I play older single-threaded games as well, like WoW, and it plays great.


----------



## jprovido

Quote:


> Originally Posted by *Scotty99*
> 
> Definitely something wrong with DOTA then, 7700k is a superior gaming cpu of course but it wouldnt explain a locked 144fps vs dips into the 80's on ryzen.
> 
> Surprising too cause i play older single threaded games as well like WoW, and it plays great.


My 5820K @ 4.7GHz was the same (after the 7.00 update): I was getting 144fps early game but got drops to 110fps+ during the later part of the game.

My laptop with an i5 5200U and GTX 950M used to play this game at 60fps on max settings. After the 7.00 update it's averaging 40fps. It's really hard to get a constant 60fps now; even turning down the graphics settings doesn't help. I'm guessing it's more CPU-intensive now, unlike before.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> yep it was pretty much expected but then again I'm running at 2400Mhz right now(I can't get it any higher than that atm with my motherboard). hopefully I can get the minimums up to the 100's when a bios update rolls out or when I purchase a 3200mhz Ryzen Kit that works out of the box.
> 
> Overwatch was a pleasant surprise. It ran locked at 144fps (my 5820k @ 4.7ghz 3200mhz couldn't do it I was getting drops at around 120fps)


I do not think anything much will get fixed though
http://www.overclock.net/t/1625913/amd-ryzen-community-update

Quote:


> Originally Posted by *Oubadah*
> 
> Technically, any unsynchronised framerate is going to result in stutter.


I assumed he was using adaptive sync.


----------



## Scotty99

I will agree on Overwatch though, I have a GTX 1060 and I was surprised how many fps I was getting at ultra 1080p (consistently 140+, sometimes over 180).

Blizzard is the king of optimizing games, smartest developers/coders in existence.


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> I do not think anything much will get fixed though
> http://www.overclock.net/t/1625913/amd-ryzen-community-update
> I assumed he was using adaptive sync.


Still not the end of the road for me; still hoping faster RAM will make it better.
Quote:


> Originally Posted by *Scotty99*
> 
> I will agree on overwatch tho, i have a gtx 1060 and i was surprised how many FPS i was getting at ultra 1080p (consistently 140+, sometimes over 180).
> 
> Blizzard is the king of optimizing games, smartest devlopers/coders in existence.


Yep, what I can't explain is that my 5820K @ 4.7GHz can't get a locked 144fps. It should be performing better with its faster IPC and faster RAM (it's quad channel too). I'm not complaining though, Ryzen magic baby


----------



## budgetgamer120

Quote:


> Originally Posted by *Scotty99*
> 
> I will agree on overwatch tho, i have a gtx 1060 and i was surprised how many FPS i was getting at ultra 1080p (consistently 140+, sometimes over 180).
> 
> Blizzard is the king of optimizing games, smartest devlopers/coders in existence.


Maybe SenseMI is at work


----------



## Scotty99

3200MHz RAM should get you 10% give or take, depending on the game. Just remember though, Ryzen is only about 12% faster than Sandy Bridge; it's not the best choice for someone seeking the highest framerate possible in twitch-based games.

What it is is incredible value for the money, and it is going to age far better than anything Intel is offering at the price point.
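One reason faster RAM matters so much on these chips, as a rough sketch: on Zen the Infinity Fabric linking the two CCXes runs at the memory clock, i.e. half the DDR4 transfer rate, so a 2400 vs 3200 kit also changes the speed of the inter-CCX link. A quick illustration (the helper name is mine):

```python
# On Zen (Ryzen 1000 series), Infinity Fabric clock tracks the memory clock:
# DDR4-XXXX transfers twice per clock, so fabric MHz = DDR rate / 2.
def fabric_clock_mhz(ddr_rate):
    """DDR4 transfer rate (e.g. 3200) -> Infinity Fabric clock in MHz."""
    return ddr_rate / 2

for kit in (2133, 2400, 2933, 3200):
    print(f"DDR4-{kit}: Infinity Fabric @ {fabric_clock_mhz(kit):.0f} MHz")
```

That is why games that bounce data between CCXes tend to gain more from memory overclocks than the raw bandwidth difference alone would suggest.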


----------



## jprovido

My experience is pretty much in line with most reviewers: i5 performance in gaming and a BEAST in multithreaded applications.








Quote:


> Originally Posted by *Scotty99*
> 
> 3200 mhz ram should get you 10% give or take, depending on the game. Just remember tho ryzen is only about 12% faster than sandy bridge, not the best choice for someone seeking the highest framerate possible on twitch based games.
> 
> What it is is incredible value for money, and is going to age far better than anything intel is offering for the price point.


I'd kill for that extra 10%. I hope Asus sorts out the BIOS so I don't have to buy another kit to get 3200MHz.


----------



## Scotty99

I'm in the same boat, 2400 CAS 15 is as high as she'll go for now.


----------



## jprovido

Quote:


> Originally Posted by *Scotty99*
> 
> Im in same boat, 2400 cas 15 is as high as she'll go for now.


Do you have the same kit? Ripjaws V 3200MHz CL16? (Checked out your sig rig.)

My Asus Prime mATX board and Corsair H60 aren't doing so bad, huh? I'm getting 3.9GHz and 2400 CAS 15. It's impressive given the price, and the VRMs don't even have heatsinks.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> My experience is pretty much in line with most reviewers. i5 performance in gaming and BEAST in multithreadded applications.
> 
> 
> 
> 
> 
> 
> 
> 
> I'd kill for that extra 10%. I hope asus sorts out the bios so I don't have to buy another kit to get 3200mhz


Forgot to ask, how is the gaming while streaming now?


----------



## Majin SSJ Eric

I just love that here we are three years later and Crysis 3 is still amongst the very toughest games out there to run. It just goes to confirm my opinion all along that Crysis 3 is one of the best-looking video games ever made and proudly carries the Crysis tradition of bringing even the most powerful systems to their knees. I know that being tough on hardware doesn't necessarily equate to looking great (BF1 also looks very good and is nowhere near as tough on hardware), but Crysis 3 just has the look of a first-rate and graphically demanding game. I know not everybody is a fan, but I find the game amazing in almost every way, even to this day.


----------



## blue1512

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I just love that here we are three years later and Crysis 3 is still amongst the very toughest games out there to run. Just goes to confirm my opinion all along that Crysis 3 is one of the best looking video games ever made and carries the Crysis tradition proudly of bringing even the most powerful systems to their knees. I know that being tough on hardware doesn't necessarily equate to looking great (BF1 also looks very good and is no where near as tough on hardware) but Crysis 3 just has the look of a first rate and highly stressful game graphically. I know not everybody is a fan but I find the game to be amazing in almost every way even to this day.


How about the first Crysis? You'll never forget the moment "but can it run Crysis" was born


----------



## Majin SSJ Eric

Of course! The original Crysis is still one of my all-time favorite games and I still play it regularly. I have to admit I'm a bit of a Crysis fanboy though, so maybe not the most impartial, but I have loved every single one of them (yes, even Crysis 2, which everybody loves to hate on, but to me it had the best storyline of them all)...


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> Forgot to ask, how is the gaming while streaming now?


https://www.twitch.tv/videos/129664768

see for yourself







Stream looks good but in-game fps is not that good (I have Fraps running). Dota 2 @ 1080p, 1700X @ 3.9GHz, 2400MHz CL15.

edit:

BTW, that's the early part of the game. During late game and big team fights you see big dips, 70-80fps at times.


----------



## SuperZan

The stream quality is looking very consistent, that's nice to see.


----------



## dmasteR

Quote:


> Originally Posted by *jprovido*
> 
> https://www.twitch.tv/videos/129664768
> 
> see for yourself
> 
> 
> 
> 
> 
> 
> 
> stream looks good but in game fps is not that good. (i have fraps running) dota 2 @ 1080p. 1700x @ 3.9Ghz 2400MHz CL15
> 
> edit:
> 
> btw that's early part of the game. during late game and big team fights you see big tips 70-80fps at times


Should probably lower the bitrate. Way too many frames lost. Twitch doesn't like anything above 3500 unless you're partnered.


----------



## jprovido

Quote:


> Originally Posted by *dmasteR*
> 
> Should probably lower the bitrate. Way too many frames lost. Twitch doesn't like anything above 3500 unless you're partnered.


I'll try 720p @ 60fps with a lower bitrate, brb


----------



## Malinkadink

Quote:


> Originally Posted by *jprovido*
> 
> ill try 720p @ 60fps with lower bitrate brb


720p, 60fps, 3500Kb/s are the optimal stream settings. The encoder should be x264 with a preset of veryfast, faster, or fast. Faster and fast will give you slightly better quality at the expense of more CPU usage, but since you have 16 threads you can afford to use one of those. The other presets aren't worth it: the faster ones give awful quality, and the quality improvements from the slower ones are hardly noticeable.
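For reference, those settings map onto x264's options roughly as below. This is a hedged sketch built around ffmpeg's CLI; the synthetic test-source input and the STREAM_KEY placeholder are stand-ins, not a working capture setup:

```python
import shlex

def twitch_x264_cmd(preset="veryfast", bitrate_k=3500, fps=60, height=720):
    """Build an illustrative ffmpeg command matching the settings discussed above."""
    return (
        "ffmpeg -f lavfi -i testsrc "              # synthetic input; stands in for game capture
        f"-vf scale=-2:{height} -r {fps} "         # 720p at 60fps
        f"-c:v libx264 -preset {preset} "          # x264, veryfast/faster/fast tradeoff
        f"-b:v {bitrate_k}k -maxrate {bitrate_k}k -bufsize {2 * bitrate_k}k "
        "-g 120 "                                  # keyframe every 2 s at 60fps, per Twitch
        "-f flv rtmp://live.twitch.tv/app/STREAM_KEY"
    )

cmd = twitch_x264_cmd()
print(shlex.split(cmd)[:6])   # inspect the tokenized command head
```

Slower presets trade CPU time for compression efficiency; at a capped 3500Kb/s that extra efficiency is what shows up as quality, which is why spare threads make fast/faster worthwhile.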


----------



## jprovido

Quote:


> Originally Posted by *Malinkadink*
> 
> 720p 60fps 3500Kb/s are optimal stream settings. Encoder should be x264 with a preset of very fast, faster, or fast. faster and fast will give you a slightly better quality at the expense of more cpu usage but since you have 16 threads you can afford to use one of those. All other presets aren't worth it, they're either awful quality the higher the preset, or the quality improvements for the lower presets are hardly noticeable.


https://www.twitch.tv/videos/129702499

I'm just using GeForce Experience to stream (not really a Twitch streamer lol). Still got skipped frames, but believe me, it's a lot better than how my 7700K does. 720p @ 60 with a 3Mbps bitrate. The quality has gone down by a lot, but I still get these 20-skipped-frame intervals. I should probably try using OBS for both my 7700K and 1700X; GeForce Experience seems like it's not that good.

And BTW, I found out that reverting back to the default terrain in Dota gave me a healthy 10fps+ boost. I never really noticed that it caused such a performance hit; I always got a locked 144fps with Kaby Lake.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> https://www.twitch.tv/videos/129702499
> 
> I'm just using geforce experience to stream (not really a twitch streamer lol) still got skipped frames but believe me it's a lot better than how my 7700k does. 720p @60 3mbps bit rate. the quality have gone down by a lot but still get these 20 skipped frame intervals. I should probably try using OBS for both my 7700k and 1700x. geforce experience seems like it's not that good
> 
> and BTW I found out by reverting back to the default terrain on dota gave me a healthy 10fps+ boost. never really noticed that it gave such a performance hit. always got 144fps locked with kabylake


GeForce Experience uses your CPU or GPU? -_-


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> GeForce experience uses your cpu or gpu? -_-


Both, I believe. The Ryzen stream was still a hundred times better than my 7700K test streams.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> both I believe. the ryzen stream was still a hundred times better than my 7700k test streams


Good to know


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> Good to know


I take it back, I think Dota 2 is a bad example. My 7700K test stream from about a month ago looked fine:
https://www.twitch.tv/videos/124043382

Will try with Overwatch; my 7700K struggled pretty badly with that game while streaming:
https://www.twitch.tv/videos/126205955


----------



## Malinkadink

Quote:


> Originally Posted by *jprovido*
> 
> I take it back. I think dota 2 is a bad example. my 7700k test stream I did about a month ago looked fine
> https://www.twitch.tv/videos/124043382
> 
> will try with overwatch. my 7700k struggled pretty bad with that game while streaming
> https://www.twitch.tv/videos/126205955


Overwatch I didn't have many issues with on a 4.8GHz 7700K when capped to 235fps @ 1080p or 160fps @ 1440p on a GTX 1080, while using the veryfast encoder preset in OBS. CPU usage was definitely up there, in the 80s or so, which is to be expected when you're running a game at such a high fps and want to stream. 4C/8T is barely enough in instances like those; throw in something like Watch Dogs 2 or BF1 that can really put 8 threads to work and you'll have a really bad stream unless you cap fps in-game to reduce CPU load, or worse, use Quick Sync.


----------



## IRobot23

Quote:


> Originally Posted by *Slink3Slyde*
> 
> I disagree. You can cite BF1 MP but the fact is in a whole myriad of other games a fast clocked I7 quad core is faster, even an I5 7600K matches or beats Ryzen in many games today, I'd even say most. I dont really want to have that argument though its been done so many times. Ryzen is good for gaming and will be more future proof then an I5 going forwards. I think though by the time Ryzen is beating a 7700K significantly in the majority of games, both CPU's will have been superceded by better models. If I'm wrong I'm wrong, and oh dear I'll just have to buy a whole new platform to play with
> 
> 
> 
> 
> 
> 
> 
> .


But they test in SP, not MP. Have you ever played BF1 MP with 64 players? How many people are complaining about the i7 not being enough? Have you ever read the forums?

i5 6600K
https://www.youtube.com/watch?v=DJ6vQFRE-0U
A great example of why low settings are not great for testing a CPU. As you can see, at ultra the CPU can hardly produce 90fps, while at low it easily pushes 110-120fps.
Why can't they test ultra at 720p (GTX 1080 Ti) for 5 minutes?


----------



## Scotty99

Just an FYI, the main benefit of using a multicore chip like this for Twitch is using CPU encoding and upping the preset. You should be able to do fast or good, I forget what they are called now, but ya, that is what you should try next time.


----------



## Slink3Slyde

Quote:


> Originally Posted by *IRobot23*
> 
> But they are test in SP not MP. Did you every played BF1 MP 64 players? How many people are complaining about i7 not being enough? Did you ever read forum?


Nope never played it, the last BF I played significantly was BFBC2.

I don't deny the possibility of an 8-core being useful in BF MP. I dispute that it's the most important factor when there's a whole host of games that work better on a quad-core setup with higher IPC.

The opposite argument to yours would be to say that Ryzen is not as good in SC2 or Arma 3, therefore no one needs more than a dual core. To suggest that a CPU is universally better for gaming than another based on one very specific scenario would be wrong.


----------



## IRobot23

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Nope never played it, the last BF I played significantly was BFBC2.
> 
> I dont deny the possibility of an 8 core being useful in BF mp. I dispute the fact that its the most important factor when theres a whole host of games that work better on a quad core with higher IPS setup.
> 
> The opposite argument to yours would be to say that Ryzen is not as good in SC 2 or Arma 3 therefore no one needs more then a dual core. To suggest that a cpu is universally better in gaming than another based on one very specific scenario would be wrong.


Then do not tell stories that are incorrect.


----------



## Slink3Slyde

Quote:


> Originally Posted by *IRobot23*
> 
> Then do not talk stories that are incorrect.


You'd better clear that one up for me?


----------



## IRobot23

Quote:


> Originally Posted by *Slink3Slyde*
> 
> You'd better clear that one up for me?


So the only game that has MP is BF1?
What about GTA V? Watch Dogs 2?


----------



## Slink3Slyde

Quote:


> Originally Posted by *IRobot23*
> 
> So only game that has MP is BF1?


Are you suggesting that all games that are multiplayer will use 8 cores and 16 threads?


----------



## IRobot23

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Are you suggesting that all games that are multiplayer will use 8 cores and 16 threads?


I am not, but games will definitely use more threads. I am suggesting that sites that run SP benchmarks (low settings, 720p) do not have a clue.
Just an example:
i3 vs i7
Low 720p = 150 vs 180fps (i3 is great)
Ultra 720p = 100 vs 140fps (i3 is really good P/P)
MP ultra 720p = 50 vs 110fps (i3 is really bad)


----------



## Malinkadink

Quote:


> Originally Posted by *Scotty99*
> 
> Just an fyi the main benefit to using a multicore chip like this on twitch is using the CPU encoding and upping the preset. You should be able to do fast or good i forget what they are called now, but ya that is what you should try next time.


The slower and slow presets in OBS caused my 1700, when I had it, to actually skip/drop frames. Medium and faster were okay, but the quality improvement over something like faster/fast isn't very noticeable at all, so I would advise anyone to just use veryfast, faster, or fast.


----------



## Slink3Slyde

Quote:


> Originally Posted by *IRobot23*
> 
> I am not, but definitely game will use more threads. I am suggesting that sites that makes SP benchmarks (low 720P) do not have a clue.
> Just an example:
> i3 vs i7
> Low 720P = 150 vs 180 FPS
> ULTRA 720P = 100 vs 140fps
> MP ULTRA 720P = 50 vs 110FPS


O.K. RNG.
Quote:


> Originally Posted by *Slink3Slyde*
> 
> I disagree. You can cite BF1 MP but the fact is in a whole myriad of other games a fast clocked I7 quad core is faster, even an I5 7600K matches or beats Ryzen in many games today, I'd even say most. I dont really want to have that argument though its been done so many times. Ryzen is good for gaming and will be more future proof then an I5 going forwards. I think though by the time Ryzen is beating a 7700K significantly in the majority of games, both CPU's will have been superceded by better models. If I'm wrong I'm wrong, and oh dear I'll just have to buy a whole new platform to play with


On top of that I'll again add that I think the Ryzen R5 is going to be the best value gaming processor released in a long time, I'd rather suggest people wait a couple of weeks for that if they're only gaming.


----------



## IRobot23

Quote:


> Originally Posted by *Slink3Slyde*
> 
> O.K. RNG.
> On top of that I'll again add that I think the Ryzen R5 is going to be the best value gaming processor released in a long time, I'd rather suggest people wait a couple of weeks for that if they're only gaming.


Okay, I apologize; I'm reading too many comments, sorry.


----------



## Slink3Slyde

Quote:


> Originally Posted by *IRobot23*
> 
> Okay I apology, reading to many comments, sorry.


No worries


----------



## iLeakStuff

TechPowerUp tested 20 games with the R7 1800X against the i7 7700K, so it's probably the best test out there.
A measly 10% below the 7700K at 1080p and 5% below at 1440p.

They used a Gigabyte Aorus board instead of the broken Asus Crosshair crap all the other review sites used, which is probably why it does so much better.
Efficiency is also a lot better than the 7700K's.


----------



## SoloCamo

Quote:


> Originally Posted by *IRobot23*
> 
> But they are test in SP not MP. Did you every played BF1 MP 64 players? How many people are complaining about i7 not being enough? Did you ever read forum?
> 
> i5 6600K
> https://www.youtube.com/watch?v=DJ6vQFRE-0U
> great example why low settings are not great to test for CPU. As you can see at ULTRA CPU is hardly producing 90FPS with low he easily push 110-120fps.
> Why cant they test ULTRA 720P (GTX 1080TI) 5 min?


Actually, this brings up a good point. Setting Mesh Quality to anything lower than Ultra in the Frostbite engine greatly affects CPU performance. Mesh Quality is essentially draw distance for characters, and weaker CPUs struggle with it on Ultra. At the same time, you really need it on Ultra to have a proper experience; otherwise you have people shooting you that you can't even see in the distance.

Quote:


> Originally Posted by *cssorkinman*
> 
> I'd like to plot fps, gpu and cpu usage and stack the data one behind the other on one spreadsheet.
> 
> I played a 64 player round of St quentin's scar last night at all stock clockings and low graphics settings . I did spend some time in the 130's when the behemoth was on screen and I was between B and C other than that it was 150 + most of the time.
> 
> I recorded a 24 vs 24 map a while back and was planning on uploading it, but 768 kb upload is pretty limiting


Do you mind doing another 1080p 64 player map run with all low settings except mesh quality at Ultra and at 1080p? Going to try and do a run quick on the same setup as my previous post.

Edit: The chart below was mistakenly run at post-process Ultra and mesh Low...

Min Max Avg
115 199 148.676


----------



## espn

For gaming performance, I think the real "problem" is that DirectX 12 cannot really take advantage of 8 cores compared to 4 cores + 4 threads. Very likely the design of DirectX 12 will not change for at least 3 years, or even 4 to 5 years, so 4 cores + 4 threads will still be ahead of 8 cores, unless AMD can find a way to beat Intel in terms of single-core speed instead of core count.


----------



## cssorkinman

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *IRobot23*
> 
> But they are test in SP not MP. Did you every played BF1 MP 64 players? How many people are complaining about i7 not being enough? Did you ever read forum?
> 
> i5 6600K
> https://www.youtube.com/watch?v=DJ6vQFRE-0U
> great example why low settings are not great to test for CPU. As you can see at ULTRA CPU is hardly producing 90FPS with low he easily push 110-120fps.
> Why cant they test ULTRA 720P (GTX 1080TI) 5 min?
> 
> 
> 
> Actually this brings up a good point. Setting Mesh Quality to anything lower than ultra in the Frostbite engine greatly affects cpu performance. Mesh quality is essentially draw distance for characters and weaker cpu's struggle with it on ultra settings. At the same time, you really need this to be on ultra to have a proper experience otherwise you have people shooting you that you can't even see in the distance.
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> I'd like to plot fps, gpu and cpu usage and stack the data one behind the other on one spreadsheet.
> 
> I played a 64 player round of St quentin's scar last night at all stock clockings and low graphics settings . I did spend some time in the 130's when the behemoth was on screen and I was between B and C other than that it was 150 + most of the time.
> 
> I recorded a 24 vs 24 map a while back and was planning on uploading it, but 768 kb upload is pretty limiting
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you mind doing another 1080p 64 player map run with all low settings except mesh quality at Ultra and at 1080p? Going to try and do a run quick on the same setup as my previous post.
> 
> Here is Fao Fortress 64 player with all low, 1080p except mesh quality at ultra.
> 
> Min Max Avg
> 115 199 148.676

I'll give it a go when I get a chance - interesting to see your results - thank you.


----------



## SoloCamo

Quote:


> Originally Posted by *espn*
> 
> For gaming performance, I think the real "problem" is that DirectX 12 cannot really use take advantage of 8 cores comparing to 4 cores + 4 threads. Very likely the design of DirectX 12 would not change for at least 3 years or even 4 to 5 years, then the speed of 4 cores + 4 threads is still ahead comparing to 8 cores, except AMD can really find a way to beat intel in term of single core speed instead of number of cores.


Where are you pulling this from?

I can assure you, with the same architecture and clock speeds anything that can use 8 threads would favor 8 cores vs 4c8t.


----------



## Kuivamaa

Well, I played some WoW dungeons and raids. Windows 10 often likes to load mostly cores 0 and 15 to run it.







Scheduler or not, something must be done for sure.


----------



## espn

Quote:


> Originally Posted by *SoloCamo*
> 
> Where are you pulling this from?
> 
> I can assure you, with the same architecture and clock speeds anything that can use 8 threads would favor 8 cores vs 4c8t.


You can easily tell from any review result.


----------



## III-Method-III

Hi folks

I'm soon to take the plunge and buy a Ryzen 1700. Below is my reasoning, followed by a few questions if anyone has time to answer:

*Reasoning for 1700 (over 1700x or 1800x):*
1) I get 8c16t for 1/3 of the price of an Intel 8c16t part.
2) I know I am 'early adopting'; as such, I want to do so with AMD's cheapest part.
3) I can OC the 1700 to 4GHz if I'm lucky, which is the same ceiling enjoyed by the 1700X and 1800X... so why buy a more expensive part?
4) I spend 70% of my time on the PC doing productivity for my YT channel (Premiere Pro, Photoshop, After Effects) and 30% gaming (mainly BF1), so the threads help the productivity.

*Questions:*
1) I intend to stream. Does anyone here have a Ryzen and actually stream using OBS, and know what they are talking about (i.e. use the x264 codec on the CPU to get the most out of 3500kb/s)? If so, how are you finding your gaming and streaming frames? Additionally, has anyone used processor affinity to dedicate, say, 2c4t to the streaming compression and leave 6c12t to the game?
2) Do you think waiting for gen-2 AM4 motherboards is a wise idea, given the problems? I've been doing a lot of reading about the link between the Infinity Fabric clock speed and the memory clock speed, so it seems logical that waiting for 3200MHz or higher-clocking RAM on AM4 will help matters greatly, since games (which are constantly asking for data from the other CCX) are such an Infinity Fabric-heavy workload?

Meth
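On question 1 above, the affinity split can be sketched as a simple partition of logical CPUs. This is a toy illustration assuming one possible SMT numbering (siblings of core N at N and N+8 on an 8c/16t part); actual layouts vary by OS and must be checked before pinning anything:

```python
def partition_smt(n_cores=8, encoder_cores=2):
    """Split logical CPUs into (game, encoder) sets, keeping SMT siblings together.

    Assumes Linux-style numbering where core N's SMT sibling is N + n_cores;
    Windows commonly interleaves siblings instead, so verify your topology first.
    """
    enc = set()
    for core in range(n_cores - encoder_cores, n_cores):
        enc.update({core, core + n_cores})       # physical core + its SMT sibling
    game = set(range(2 * n_cores)) - enc
    return sorted(game), sorted(enc)

game_cpus, enc_cpus = partition_smt()
print("game:", game_cpus)      # 12 logical CPUs left for the game (6c12t)
print("encoder:", enc_cpus)    # 4 logical CPUs reserved for x264 (2c4t)
# On Linux you could then apply this with os.sched_setaffinity(pid, enc_cpus);
# on Windows, via Task Manager's affinity dialog or SetProcessAffinityMask.
```

Whether hard pinning actually beats letting the scheduler balance things is workload-dependent, so it's worth benchmarking both ways.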


----------



## budgetgamer120

Quote:


> Originally Posted by *iLeakStuff*
> 
> TechPowerUp tested 20 games with R7 1800X against i7 7700K so its probably the best test out there.
> Measely 10% below 7700K in 1080p and 5% below in 1440p.
> 
> They used a Gigabyte Aorus board instead of the broken Asus Crosshair crap all the other review sites used which is probably why it does so much better.
> Efficiency is also a lot better than 7700K.


Their conclusion and what they want the public to think is different though.


----------



## cssorkinman

@solocamo

Just managed to catch the last bit of that map at the settings you suggested



Min 128, Max 200, Avg 159.8 (4.1GHz 1800X, CL14 1T)


----------



## SoloCamo

Quote:


> Originally Posted by *cssorkinman*
> 
> @solocamo
> 
> Just managed to catch the last bit of that map at the settings you suggested
> 
> 
> 
> Min - 128 Max 200 Ave 159.8 4.1 ghz 1800X CL 14 1T


Thanks.

Secondly, please disregard my prior chart - I am removing it now. I mistakenly had post process at ultra and mesh at low instead of the other way around as I wanted. Did another run below.

As expected, the 8c16t seems to be much smoother overall - especially at 4.1ghz.

Here is another run with Empires Edge with all low and mesh at ultra - 1080p.

Min Max Avg
108 201 155.83


----------



## espn

Quote:


> Originally Posted by *SoloCamo*
> 
> Where are you pulling this from?
> 
> I can assure you, with the same architecture and clock speeds anything that can use 8 threads would favor 8 cores vs 4c8t.


When paired with a fast graphics card, and running at the standard resolution of 1080p, the framerate will be high, which results in a high CPU load. At some point, the CPU will be running as fast as it can, which will cap the framerate. This is worsened by games that don't properly scale across multiple cores, or only run at up to four cores, for example. While this is of course the game developer's "fault," it is a reality of today's game market that isn't going to change soon.
https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/16.html


----------



## SoloCamo

Quote:


> Originally Posted by *espn*
> 
> When paired with a fast graphics card, and running at the standard resolution of 1080p, the framerate will be high, which results in a high CPU load. At some point, the CPU will be running as fast as it can, which will cap the framerate. This is worsened by games that don't properly scale across multiple cores, or only run at up to four cores, for example. While this is of course the game developer's "fault," it is a reality of today's game market that isn't going to change soon.
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/16.html


This is not what I was talking about though.

All things equal, including architecture, clock speed, cache, etc., an 8-core part will never be beaten in any task by a 4-core/8-thread part. Of course, applications or games that utilize fewer threads will favor the quicker architecture, as long as said CPU has enough threads for that particular game engine or application.
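SoloCamo's point can be put in Amdahl's-law terms: at equal architecture and clocks, the gain from extra cores depends on how much of the game's per-frame work is parallel. A toy model; the parallel fractions and the ~1.25x SMT factor below are illustrative assumptions, not measurements:

```python
def amdahl_speedup(parallel_fraction, n):
    """Amdahl's law: speedup over a single core when using n cores."""
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / n)

# An SMT thread is not a full core; a rough rule of thumb treats a 2-thread
# core as ~1.25 cores, so 4c/8t is approximated here as 5 "effective cores".
for p in (0.5, 0.8, 0.95):
    s4c8t = amdahl_speedup(p, 4 * 1.25)
    s8c = amdahl_speedup(p, 8)
    print(f"parallel fraction {p:.2f}: 4c8t ~{s4c8t:.2f}x, 8c ~{s8c:.2f}x")
```

The gap between the two configurations only opens up as the parallel fraction rises, which matches the observation that lightly threaded games favor the faster quad while well-threaded ones (BF1 MP, streaming workloads) favor the 8-core.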


----------



## espn

Quote:


> Originally Posted by *SoloCamo*
> 
> This is not what I was talking about though.
> 
> All things equal, including architecture, clock speed, cache, etc. an 8 core part will never be beaten in any task by a 4 core 8 thread part. Of course applications or games that utilize less threads will favor the quicker architecture as long as said cpu has enough threads for that particular game engine or application.


I don't think the architecture is the same, and very likely DirectX 12 was developed with Intel, so Intel knows how to maximize the speed with fewer cores.


----------



## budgetgamer120

Quote:


> Originally Posted by *espn*
> 
> When paired with a fast graphics card, and running at the standard resolution of 1080p, the framerate will be high, which results in a high CPU load. At some point, the CPU will be running as fast as it can, which will cap the framerate. This is worsened by games that don't properly scale across multiple cores, or only run at up to four cores, for example. While this is of course the game developer's "fault," it is a reality of today's game market that isn't going to change soon.
> https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/16.html


Unless the load is 100%, the CPU is not running as fast as it can, though.


----------



## cssorkinman

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> @solocamo
> 
> Just managed to catch the last bit of that map at the settings you suggested
> 
> 
> 
> Min 128 / Max 200 / Avg 159.8 (4.1GHz 1800X, CL14 1T)
> 
> 
> 
> Thanks.
> 
> Secondly, please disregard my prior chart - I am removing it now. I mistakenly had post process at ultra and mesh at low instead of the other way around as I wanted. Did another run below.
> 
> As expected, the 8c16t seems to be much smoother overall - especially at 4.1ghz.
> 
> Here is another run with Empires Edge with all low and mesh at ultra - 1080p.
> 
> Min Max Avg
> 108 201 155.83

Just counted frames for an entire round of St. Quentin's Scar, 64 players, all low with ultra mesh: 1,642 sample points, about 1,100 of them above 144 fps and around 1,500 above 120 fps; the low was 103 fps. I tried to spend most of my time between B and C, as that is about the most difficult place I've found on the map for fps. The average was 160... that number is remarkably consistent.
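The frame-counting arithmetic above can be sketched like this. The sample list here is made-up placeholder data, since the actual capture log isn't reproduced in the thread:

```python
def summarize(fps_samples, thresholds=(120, 144)):
    """Min/max/average plus counts above each fps threshold."""
    stats = {
        "min": min(fps_samples),
        "max": max(fps_samples),
        "avg": sum(fps_samples) / len(fps_samples),
    }
    for t in thresholds:
        stats[f"above_{t}"] = sum(1 for f in fps_samples if f > t)
    return stats

samples = [103, 130, 150, 160, 170, 200]  # placeholder, not the real run
s = summarize(samples)
print(s["min"], s["max"], round(s["avg"], 1), s["above_144"])
```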


----------



## SoloCamo

Quote:


> Originally Posted by *cssorkinman*
> 
> Just counted frames for an entire round of St. Quentin's Scar, 64 players, all low with ultra mesh: 1,642 sample points, about 1,100 of them above 144 fps and around 1,500 above 120 fps; the low was 103 fps. I tried to spend most of my time between B and C, as that is about the most difficult place I've found on the map for fps. The average was 160... that number is remarkably consistent.


Thanks for taking the time to do this. I've got someone running around this board with a quote from me (one that, by itself, is easily taken out of context to try to make me look bad) suggesting that this type of consistency is the benefit of having extra cores/threads, regardless of the peak/average fps that gets focused on too often.

All my runs were with the bare minimum of background tasks. If I have so much as a 1080p YouTube video up on my second monitor while I game, the fps is even less consistent. I don't know how people with i5s can call 64-player maps completely smooth, to be honest, but I guess some people don't notice ridiculous fluctuations.


----------



## jprovido

Quote:


> Originally Posted by *Malinkadink*
> 
> Overwatch i didn't have many issues on a 4.8ghz 7700k when capped to 235 fps @ 1080p or 160fps @ 1440p on a gtx 1080. This was while using the very fast encoder preset in obs. CPU usage was definitely up there in the 80s or so which is to be expected when you're running a game at such a high fps and want to stream. 4C/8T is barely enough in instances like those, and then throw something like watch dogs 2 or bf1 that can really put 8 threads to work and you'll have a really bad stream unless you cap fps ingame to reduce cpu load, or worse, use quicksync.


I only use GeForce Experience to stream (I have an Nvidia GPU); I'm not really a Twitch streamer. My 5820K did a lot better than my 7700K at streaming, not that it's that important to me. I value gaming performance more; that's why I "sidegraded" from the 5820K to the 7700K.









I'm happy to know that the 7700K is good for streaming too, as long as I use OBS.


----------



## cssorkinman

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> Just counted frames for an entire round of St. Quentin's Scar, 64 players, all low with ultra mesh: 1,642 sample points, about 1,100 of them above 144 fps and around 1,500 above 120 fps; the low was 103 fps. I tried to spend most of my time between B and C, as that is about the most difficult place I've found on the map for fps. The average was 160... that number is remarkably consistent.
> 
> 
> 
> Thanks for taking the time to do this. I've got someone running around this board with a quote from me (one that, by itself, is easily taken out of context to try to make me look bad) suggesting that this type of consistency is the benefit of having extra cores/threads, regardless of the peak/average fps that gets focused on too often.
> 
> All my runs were with the bare minimum of background tasks. If I have so much as a 1080p YouTube video up on my second monitor while I game, the fps is even less consistent. I don't know how people with i5s can call 64-player maps completely smooth, to be honest, but I guess some people don't notice ridiculous fluctuations.

CPU usage only goes over 40% during load screens; in gameplay it's typically in the 30s in BF1. Plenty left in the tank at that rate.

Here is the spreadsheet from the St. Quentin's Scar map run I did earlier, if you want to play with the data.

bf12017-03-1914-07-11-61fps.csv 8k .csv file







Crysis 3 maxing out a Fury at the low system-spec preset; check the CPU usage





----------



## Kuivamaa

The experience is also silky smooth. After spending about 120 hours playing BF1 on an [email protected] and a [email protected], I was starting to believe that dips were to be expected with this iteration of the engine and that player count. A stock 1800X offers pretty much flat draw-graph lines. Simply amazing. At this point I am severely limited by my Nano; I will be picking up either a Vega or a 1080 Ti come May.


----------



## SoloCamo

Quote:


> Originally Posted by *cssorkinman*
> 
> CPU usage only goes over 40% during load screens; in gameplay it's typically in the 30s in BF1. Plenty left in the tank at that rate.
> 
> Here is the spreadsheet from the St. Quentin's Scar map run I did earlier, if you want to play with the data.
> 
> bf12017-03-1914-07-11-61fps.csv 8k .csv file


Thanks again. Just out of curiosity, I did the St. Quentin's Scar map with 64 players at my typical play settings.

3840x2160 at 75% res scale (2880x1620)
All ultra settings except post processing set to medium
HBAO
TAA

4790K at 4.4GHz, 16GB 2400 CAS 10 memory, 290X at the same 1070 core / stock 1250 memory (reference cooler)

Min Max Avg
54 102 65.532

This 290x keeps on giving



Game looks great and runs great at these settings.

So, to get somewhat back on topic... Ryzen does very well in this game. I'll have to replace this 290X before my 4790K, of course, since I'm trying to push 4K, but I have a feeling a Ryzen+ system is definitely going to be my next upgrade.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Slink3Slyde*
> 
> O.K. RNG.
> On top of that I'll again add that I think the Ryzen R5 is going to be the best value gaming processor released in a long time, I'd rather suggest people wait a couple of weeks for that if they're only gaming.


The 7700K will still be a better pure gaming CPU than the R5s (at a price premium, of course), but I agree, the 1600 looks to be a fantastic value at around $220, less than the KL i5s. Max fps still requires Intel, but Ryzen is only 10-15% behind and affords you workstation-like MT performance that even a 7700K can only dream of. It just depends on what you personally want out of your machine.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> The 7700K will still be a better pure gaming CPU than the R5s (at a price premium, of course), but I agree, the 1600 looks to be a fantastic value at around $220, less than the KL i5s. Max fps still requires Intel, but Ryzen is only 10-15% behind and affords you workstation-like MT performance that even a 7700K can only dream of. It just depends on what you personally want out of your machine.


With the 1600 you can at least wait out games using 6 cores, unlike with the 8-core Zens.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ZealotKi11er*
> 
> With the 1600 you can at least wait out games using 6 cores, unlike with the 8-core Zens.


Not sure what you are trying to say there?







When I think about the 1600, I'm thinking of a 2500K but for modern workloads: roughly the same price as when the 2500K launched, but with 6 cores and 12 threads. This will be a very disruptive SKU in the CPU market, even more so than the R7s, which are already very disruptive to the HEDT market.


----------



## Malinkadink

I've been wondering: what made AMD go with the CCX design and put a pitifully slow "Infinity Fabric" in between? I can understand that it would help save money if yields aren't the best, since chips with non-functioning or underperforming CCXs let you disable one half and use the good half for a cheaper SKU, but it doesn't seem like that's exactly what they'll be doing, as they'll be going with 3+3 and 2+2 setups instead in order to keep parity with the 4+4 R7s.

I'm not a computer engineer, so I wouldn't know, but what would stop them from just having 8 cores, 4 on each side, separated only by the cache between them?


----------



## Majin SSJ Eric

I don't see how the infinity fabric is "pitifully slow" in any way?


----------



## Malinkadink

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I don't see how the infinity fabric is "pitifully slow" in any way?


It's limited to 22GB/s, versus the 170GB/s of bandwidth within a CCX when accessing the L3 cache. To give an idea of how slow that is, Nvidia's GTX 970 drops from 196GB/s to 28GB/s when accessing the slower 512MB portion of its VRAM.
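The back-of-envelope math behind that comparison, using only the bandwidth figures quoted above; the 1 MiB payload size is purely illustrative:

```python
GB = 1e9  # decimal gigabytes, matching how the quoted figures are stated

def transfer_us(bytes_moved, bandwidth_gb_s):
    """Microseconds to move a payload at a given bandwidth."""
    return bytes_moved / (bandwidth_gb_s * GB) * 1e6

payload = 1 * 1024 * 1024                # 1 MiB working-set shuffle (illustrative)
within_ccx = transfer_us(payload, 170)   # L3 bandwidth inside a CCX
cross_ccx = transfer_us(payload, 22)     # across the Infinity Fabric
print(round(cross_ccx / within_ccx, 1))  # ~7.7x slower for the same data
```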


----------



## Majin SSJ Eric

The tests I've seen that isolate CCXs haven't really proven to make things any faster, though. I don't think the Infinity Fabric is all that limiting. Ryzen is performing very well all around as it is; there's nothing "pitifully slow" about it...


----------



## Malinkadink

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> The tests I've seen that isolate CCXs haven't really proven to make things any faster, though. I don't think the Infinity Fabric is all that limiting. Ryzen is performing very well all around as it is; there's nothing "pitifully slow" about it...




Maybe not "pitifully slow," but it's a limiting factor nonetheless and just not very optimal.


----------



## Majin SSJ Eric

I'm sure various cost-effectiveness considerations were taken into account when they came up with the multi-CCX design. AMD is, after all, not in the best of financial shape, so they had to maximize profitability while maintaining competitiveness; that's a pretty tall order for a company with a fraction of Intel's R&D budget. With this design they can provide very competitive CPUs across a wide range of SKUs and one-up Intel in raw core/thread count while maintaining incredible value pricing. The performance is definitely there, so I don't see the issue. If the R7s were only moderately faster than the construction cores you might have a point, but in most scenarios the R7s are neck and neck with Haswell/Broadwell-E in IPC.


----------



## kalelovil

Quote:


> Originally Posted by *Malinkadink*
> 
> I'm not a computer engineer, so I wouldn't know, but what would stop them from just having 8 cores, 4 on each side, separated only by the cache between them?


They could have gone with a ring bus connecting all the cores, like Intel's Xeon/high-end desktop platform, but that brings its own trade-offs.

The Radeon HD 2xxx/3xxx used a ring bus, but it was then dropped in the highly successful HD 4xxx series; this article explains some of the reasoning:
http://www.anandtech.com/show/2556/9


----------



## budgetgamer120

Quote:


> Originally Posted by *Malinkadink*
> 
> 
> 
> Maybe not "pitifully slow," but it's a limiting factor nonetheless and just not very optimal.


Ping does not represent speed: I can ping a server and get high pings but still get high download speeds from that same server.

Like you said, we aren't engineers and do not fully understand it.

Furthermore, I don't think ping-ponging data between cores represents how programs actually use cores, as Ryzen is a killer in multitasking/multithreading.


----------



## ChronoBodi

Quote:


> Originally Posted by *Malinkadink*
> 
> 
> 
> Maybe not "pitifully slow," but it's a limiting factor nonetheless and just not very optimal.


Then I could argue that the 5960X/6900K has worse latency than Ryzen within a CCX, but better latency than Ryzen across the Infinity Fabric?

We are not engineers, but there are always trade-offs in going with either a monolithic die design like the 5960X/6900K or, well... a Core 2 Quad-ish design (a gross oversimplification, but that's what Ryzen's two CCXs remind me of).

But AMD's trade-off is a good one: they'll take the minor hit in gaming performance in order to offer more cores than Intel, at lower prices, for the server market. They will make a killing there.

Heck, come to think of it, the Ryzen 1500X is the most modern "Core 2 Quad"-style CPU design now.

Edit: actually, no. Ryzen is not two CPU dies on a package like a Core 2 Quad; it's one die entirely, but internally it's somewhat similar.


----------



## blue1512

Quote:


> Originally Posted by *ChronoBodi*
> 
> Then I could argue that the 5960X/6900K has worse latency than Ryzen within a CCX, but better latency than Ryzen across the Infinity Fabric?
> 
> We are not engineers, but there are always trade-offs in going with either a monolithic die design like the 5960X/6900K or, well... a Core 2 Quad-ish design (a gross oversimplification, but that's what Ryzen's two CCXs remind me of).
> 
> But AMD's trade-off is a good one: they'll take the minor hit in gaming performance in order to offer more cores than Intel, at lower prices, for the server market. They will make a killing there.
> 
> Heck, come to think of it, the Ryzen 1500X is the most modern "Core 2 Quad"-style CPU design now.
> 
> Edit: actually, no. Ryzen is not two CPU dies on a package like a Core 2 Quad; it's one die entirely, but internally it's somewhat similar.


Because Infinity Fabric is an on-die connection, its latency is much smaller than with the C2Q approach. Still, it's not comparable to a true monolithic 8-core die with a ring bus, but it's enough to beat something like a 6900K if the threads are well distributed.

AMD is targeting prosumers and workstations with Ryzen, and I think it's a good move for them; that is where the money comes from. The PC market is stagnant and even Sandy Bridge is somehow enough for gaming, so it's not worth competing there.


----------



## ChronoBodi

Quote:


> Originally Posted by *blue1512*
> 
> Because Infinity Fabric is an on-die connection, its latency is much smaller than with the C2Q approach. Still, it's not comparable to a true monolithic 8-core die with a ring bus, but it's enough to beat something like a 6900K if the threads are well distributed.
> 
> AMD is targeting prosumers and workstations with Ryzen, and I think it's a good move for them; that is where the money comes from. The PC market is stagnant and even Sandy Bridge is somehow enough for gaming, so it's not worth competing there.


I figured... kind of like a Core 2 Quad, except on one die, with a much lower-latency connection between the two CCXs.

While a monolithic approach would be "better" in performance, it's not worth the yield concerns given the scaling AMD wants, especially since they're fabless and have to design around GloFo's licensed Samsung 14nm node, which is geared more toward efficiency for smartphone SoCs than toward high-clocking CPUs.

However, by doing it this way they get better perf/watt, which is killer in the server market, and that's where AMD will get the most dosh.

That said, the reaction to Ryzen's release is a little too one-sided toward the "gamerz" perspective, failing to realize that Ryzen 7 was meant to compete with the likes of the 6900K in servers/workstations.

AMD already has its Ryzen 5 series to deal with the "gamer" i5s and i7s on LGA 1151, focusing on superior price/perf: 85% of the gaming performance for 50% of the price, pretty much.


----------



## Carniflex

Quote:


> Originally Posted by *Malinkadink*
> 
> I've been wondering, what made AMD go with the CCX design and put a pitifully slow "infinity fabric" in between? I can understand that it would help save money if yields aren't the best and you get chips with non functioning or under performing CCXs allowing you to disable the one half and use the good half for a cheaper SKU, but it doesn't seem like they'll be doing that exactly as they'll be going for 3+3, 2+2 setups instead in order to keep parity with the 4+4 R7s.
> 
> I'm not a computer engineer so i wouldn't know, but what would stop them from just having a 8 cores 4 on each side separated only by the cache between them?


I would speculate that Ryzen is, in essence, a workstation CPU jury-rigged into the desktop form factor. They have these CCX modules as building blocks and have to put something in between. Their Naples parts have a massive 64 PCIe lanes on the fabric, but for the desktop... it was probably something along the lines of "4 lanes will be good enough for typical desktop usage, and it will save us several dollars per chip to have x4 instead of, say, x6 or x8 lanes for the fabric." It might have been the same thinking behind strapping the fabric to the memory clock instead of giving it its own clock generator, or using BCLK and dealing with synchronization with memory some other way.

In my opinion, Ryzen's core latency actually appears quite impressive. The first cache tier was a bit slower than Intel's while the higher tier was a bit better (42 ns vs 70 ns), but there was the issue of a hop to the other CCX adding an additional ~100 ns of latency, meaning ~140 ns vs ~70 ns. At this moment it is not entirely clear how memory clocks really affect this cache latency, other than that higher clocks are supposed to substantially benefit the fabric in Ryzen CPUs.
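The practical cost of that cross-CCX hop depends on how often a thread's partner core lands on the other CCX. A toy model, using the ~42 ns / ~140 ns figures from above and a deliberately naive uniform-scheduling assumption (both the assumption and the function are illustrative only):

```python
SAME_CCX_NS = 42    # approx. latency quoted for a hop within a CCX
CROSS_CCX_NS = 140  # approx. latency quoted when crossing to the other CCX

def expected_latency_ns(cores_per_ccx=4, ccx_count=2):
    """Average core-to-core latency if the partner core is chosen
    uniformly at random among the other cores (naive scheduler)."""
    total = cores_per_ccx * ccx_count
    same = cores_per_ccx - 1       # other cores on the same CCX
    cross = total - cores_per_ccx  # cores on the other CCX
    return (same * SAME_CCX_NS + cross * CROSS_CCX_NS) / (total - 1)

print(round(expected_latency_ns()))  # naive scheduling averages ~98 ns
```

A CCX-aware scheduler that keeps communicating threads on one CCX pins this back near the 42 ns case, which is why scheduler and game patches mattered so much in these discussions.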


----------



## Malinkadink

Quote:


> Originally Posted by *Carniflex*
> 
> I would speculate that Ryzen is, in essence, a workstation CPU jury-rigged into the desktop form factor. They have these CCX modules as building blocks and have to put something in between. Their Naples parts have a massive 64 PCIe lanes on the fabric, but for the desktop... it was probably something along the lines of "4 lanes will be good enough for typical desktop usage, and it will save us several dollars per chip to have x4 instead of, say, x6 or x8 lanes for the fabric." It might have been the same thinking behind strapping the fabric to the memory clock instead of giving it its own clock generator, or using BCLK and dealing with synchronization with memory some other way.
> 
> In my opinion, Ryzen's core latency actually appears quite impressive. The first cache tier was a bit slower than Intel's while the higher tier was a bit better (42 ns vs 70 ns), but there was the issue of a hop to the other CCX adding an additional ~100 ns of latency, meaning ~140 ns vs ~70 ns. At this moment it is not entirely clear how memory clocks really affect this cache latency, other than that higher clocks are supposed to substantially benefit the fabric in Ryzen CPUs.


It's not an end-all-be-all thing for Ryzen. After all, 1,000 ns = 1 microsecond and 1,000 microseconds = 1 ms, and we're dealing with <200 nanoseconds, less than 1/5 of a microsecond, so it's not as if it will hinder your day-to-day usage. I'm just asking questions, trying to get a better understanding. With the resources AMD has available, they have a huge success on their hands with Ryzen. I can only imagine what sort of surprises Zen 2 and later Zen 3 will bring. I would expect at least 10% more IPC from Zen 2 plus more OC headroom, then another 10%+ IPC from Zen 3 and 5GHz potential; one can dream.


----------



## Scotty99

Whoops wrong thread.


----------



## Brutuz

Quote:


> Originally Posted by *Malinkadink*
> 
> It's not an end-all-be-all thing for Ryzen. After all, 1,000 ns = 1 microsecond and 1,000 microseconds = 1 ms, and we're dealing with <200 nanoseconds, less than 1/5 of a microsecond, so it's not as if it will hinder your day-to-day usage. I'm just asking questions, trying to get a better understanding. With the resources AMD has available, they have a huge success on their hands with Ryzen. I can only imagine what sort of surprises Zen 2 and later Zen 3 will bring. I would expect at least 10% more IPC from Zen 2 plus more OC headroom, then another 10%+ IPC from Zen 3 and 5GHz potential; one can dream.


Sure, one bit of added latency from CCX-to-CCX communication is only 70ns extra, but the CPU is potentially doing millions if not billions of those transfers every second, and those extra nanoseconds, as imperceptible as they are to us, are much more noticeable to a CPU. Hell, it could lead to usability problems: added latency from threads jumping around too much would mean even a CPU core as fast as a 5GHz Kaby Lake wouldn't feel half as fast. Think of it like Rambus: while it was theoretically faster at what it did, you had additional time waiting for data to come in with nothing much to do.
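The "millions of those per second" point reduces to simple arithmetic. The migration rate below is a made-up illustration, not a measurement; only the 70 ns figure comes from the post:

```python
EXTRA_NS = 70                   # added cross-CCX hop latency cited above
migrations_per_sec = 2_000_000  # hypothetical cross-CCX accesses per second

# Total stall time accumulated per wall-clock second, in milliseconds.
overhead_ms_per_sec = migrations_per_sec * EXTRA_NS / 1e6
print(overhead_ms_per_sec)  # 140.0 ms of every second spent waiting
```

Each individual hop is invisible, but at this (hypothetical) rate the hops would add up to 14% of every second, which is the scale at which it becomes measurable in frame times.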


----------



## huzzug

You mean like a factory producing candies with 100 million workers? A few dozing off or goofing around during peak hours doesn't make much difference to production, but if they do so consistently and others join in on the goofing around, things go downhill.


----------



## KarathKasun

Quote:


> Originally Posted by *huzzug*
> 
> You mean like a factory producing candies with 100 million workers? A few dozing off or goofing around during peak hours doesn't make much difference to production, but if they do so consistently and others join in on the goofing around, things go downhill.


It's more like having two faster factories with a bit more delay in getting product back and forth between them, rather than one factory with double the capacity. If you have two totally independent tasks, the former can be faster than the latter. If you have one task that takes the capacity of both factories, the situation can reverse if the task does not account for the transit time between the two locations.


----------






## budgetgamer120

Quote:


> Originally Posted by *iRUSH*
> 
> That's the way I see it too. On top of that, all 3 Ryzen 7 CPUs are exactly the same minus clock speed, yet their price difference is massive.
> 
> I guarantee the prices of these will drop, just like with most new AMD products. That makes them look bad.
> 
> There only needs to be one Ryzen 7, priced at $300 tops.
> 
> I get that it does amazing things for extremely niche situations.
> 
> I just couldn't recommend the R7 9 times out of 10 at its current price point.
> 
> The Ryzen 5 4- and 6-core CPUs will hopefully change my mind.


It will be $300 tops when a 6800K or 6900K is priced similarly.


----------



## Scotty99

Quote:


> Originally Posted by *iRUSH*
> 
> That's the way I see it too. On top of that, all 3 Ryzen 7 CPUs are exactly the same minus clock speed, yet their price difference is massive.
> 
> I guarantee the prices of these will drop, just like with most new AMD products. That makes them look bad.
> 
> There only needs to be one Ryzen 7, priced at $300 tops.
> 
> I get that it does amazing things for extremely niche situations.
> 
> I just couldn't recommend the R7 9 times out of 10 at its current price point.
> 
> The Ryzen 5 4- and 6-core CPUs will hopefully change my mind.


Eh what?

The 1700 is 330 bucks and includes a stock cooler that provides enough dissipation for moderate all-core overclocks. I cannot remember a better bargain in recent times. Heck, a couple of years back people were calling the Xeons you could put in mainstream boards deals, because you got a 4c/8t chip for 100 dollars off i7 prices, but AMD makes something with double the performance of that Xeon and it's overpriced, lol?

Just nutty logic man, nutty.


----------



## KarathKasun

Quote:


> Originally Posted by *iRUSH*
> 
> That's the way I see it too. On top of that, all 3 Ryzen 7 CPUs are exactly the same minus clock speed, yet their price difference is massive.
> 
> I guarantee the prices of these will drop, just like with most new AMD products. That makes them look bad.
> 
> There only needs to be one Ryzen 7, priced at $300 tops.
> 
> I get that it does amazing things for extremely niche situations.
> 
> I just couldn't recommend the R7 9 times out of 10 at its current price point.
> 
> The Ryzen 5 4- and 6-core CPUs will hopefully change my mind.


You obviously aren't an OEM. If workstations can be built with Ryzen that achieve parity with Intel-based systems for the cost of the Intel CPU alone... you would be cranking up a Ryzen workstation production line as fast as possible. The same goes for servers.

Here is the thing: your gaming use case is EXTREMELY niche. As in, it is a niche within a niche.


----------



## iRUSH

Quote:


> Originally Posted by *Scotty99*
> 
> Eh what?
> 
> The 1700 is 330 bucks and includes a stock cooler that provides enough dissipation for moderate all-core overclocks. I cannot remember a better bargain in recent times. Heck, a couple of years back people were calling the Xeons you could put in mainstream boards deals, because you got a 4c/8t chip for 100 dollars off i7 prices, but AMD makes something with double the performance of that Xeon and it's overpriced, lol?
> 
> Just nutty logic man, nutty.


So you don't think it's ridiculous that there's an 1800X for $500 that is essentially the same as the other two chips priced below it? That's nutty logic. IMO $330 is pushing it and should be the 1800X's price; just get rid of the others.

Their pricing is clearly odd and will be changed, just like always. I don't like to see this, since it makes them look bad.

Even huge AMD fans who worship the ground they walk on should see this. I'm not accusing you of being that person.


----------



## Scotty99

Quote:


> Originally Posted by *iRUSH*
> 
> So you don't think it's ridiculous that there's an 1800X for $500 that is essentially the same as the other two chips priced below it? That's nutty logic. IMO $330 is pushing it and should be the 1800X's price; just get rid of the others.
> 
> Their pricing is clearly odd and will be changed, just like always. I don't like to see this, since it makes them look bad.
> 
> Even huge AMD fans who worship the ground they walk on should see this. I'm not accusing you of being that person.


That's why you buy the 1700...

The X chips are for people that don't know what a BIOS is. I said this the same day the reviews hit: they should have ONLY released the 1700, lol. Forget about the X chips and you will see this differently.

It's just odd that you are so upset about this but are cool with 6800K vs 6850K prices...


----------



## SoloCamo

Quote:


> Originally Posted by *iRUSH*
> 
> So you don't think it's ridiculous that there's an 1800X for $500 that is essentially the same as the other two chips priced below it? That's nutty logic. IMO $330 is pushing it and should be the 1800X's price; just get rid of the others.
> 
> Their pricing is clearly odd and will be changed, just like always. I don't like to see this, since it makes them look bad.
> 
> Even huge AMD fans who worship the ground they walk on should see this. I'm not accusing you of being that person.


It's like some of you haven't been around the CPU scene long at all...

Different clock speeds at different price points have typically always been the name of the game. Some people do not want to overclock and just want the fastest model. *This is how the overclocking scene was born: to get a faster model for cheaper, not just to push the fastest models even faster.* At $500 an 1800X is still considerably cheaper than a HEDT chip from Intel and performs pretty much at parity. For those of us who want cheaper prices and can OC, pick up the 1700.

What looks bad is artificially locking overclocking to "K" chips, yet that seems to be all fine and dandy around here.


----------



## KarathKasun

Quote:


> Originally Posted by *iRUSH*
> 
> So you don't think it's ridiculous that there's an 1800X for $500 that is essentially the same as the other two chips priced below it? That's nutty logic. IMO $330 is pushing it and should be the 1800X's price; just get rid of the others.
> 
> Their pricing is clearly odd and will be changed, just like always. I don't like to see this, since it makes them look bad.
> 
> Even huge AMD fans who worship the ground they walk on should see this. I'm not accusing you of being that person.


Jesus, you do know that for many models the only difference to OEMs is clock speed, right? This is especially true of laptop CPUs. The fact that you can OC the lower-end part to the higher-end speeds is just a bone thrown to the OC market. The 1800X is for those who don't want to OC, and they pay extra for it.


----------



## budgetgamer120

Quote:


> Originally Posted by *Scotty99*
> 
> That's why you buy the 1700...
> 
> The X chips are for people that don't know what a BIOS is. I said this the same day the reviews hit: they should have ONLY released the 1700, lol. Forget about the X chips and you will see this differently.
> 
> It's just odd that you are so upset about this but are cool with 6800K vs 6850K prices...


Or the 1800X is for people wanting 4.1GHz.

Who cares what the 1800X costs? We have the choice to buy the cheaper one.


----------



## cssorkinman

Quote:


> Originally Posted by *iRUSH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scotty99*
> 
> Eh what?
> 
> The 1700 is 330 bucks and includes a stock cooler that provides enough dissipation for moderate all-core overclocks. I cannot remember a better bargain in recent times. Heck, a couple of years back people were calling the Xeons you could put in mainstream boards deals, because you got a 4c/8t chip for 100 dollars off i7 prices, but AMD makes something with double the performance of that Xeon and it's overpriced, lol?
> 
> Just nutty logic man, nutty.
> 
> 
> 
> So you don't think it's ridiculous that there's an 1800X for $500 that is essentially the same as the other two chips priced below it? That's nutty logic. IMO $330 is pushing it and should be the 1800X's price; just get rid of the others.
> 
> Their pricing is clearly odd and will be changed, just like always. I don't like to see this, since it makes them look bad.
> 
> Even huge AMD fans who worship the ground they walk on should see this. I'm not accusing you of being that person.

You could probably turn that around, given that it's competing with the 5960X and 6900K: the 1800X is underpriced at $499 and should be $599, the 1700X $549, and the 1700 $499. Given the performance, efficiency and price compared to what they are actually designed to compete against, they would still be incredible bargains at those price points.



----------



## Scotty99

Quote:


> Originally Posted by *budgetgamer120*
> 
> Or the 1800x is for people wanting 4.1ghz.
> 
> Who cares what the 1800x cost? We have a choice to buy the cheaper one


I've seen plenty of 1700s at 4.1.


----------



## budgetgamer120

Quote:


> Originally Posted by *Scotty99*
> 
> I've seen plenty of 1700s at 4.1.


The success rate is pretty low, though. If I were getting into Ryzen and wanted 4.1GHz, the 1700 would not be on my list, because it is the least likely to hit 4.1GHz.


----------



## Scotty99

Quote:


> Originally Posted by *budgetgamer120*
> 
> The success rate is pretty low, though. If I were getting into Ryzen and wanted 4.1GHz, the 1700 would not be on my list, because it is the least likely to hit 4.1GHz.


You are kinda missing the point though, mah dude, lol. These chips are terrible overclockers overall; are you going to pay 170 dollars for maybe 100MHz?


----------



## budgetgamer120

Quote:


> Originally Posted by *Scotty99*
> 
> You are kinda missing the point though, mah dude, lol. These chips are terrible overclockers overall; are you going to pay 170 dollars for maybe 100MHz?


Some people do lol. Even on this forum.


----------



## iLeakStuff

https://mobile.twitter.com/BitsAndChipsEng/status/843864982320267265

https://mobile.twitter.com/BitsAndChipsEng/status/843868465702363141


----------



## jon666

This thread is entertaining, but I will have to find someplace else for information once I get my X370 mobo.


----------



## ChronoBodi

Apparently, the recent Windows update did this for Ryzen FPS increases.



But we need more real information and more sources to see whether this is real or just an anomaly in one old game.


----------



## iLeakStuff

Quote:


> Originally Posted by *ChronoBodi*
> 
> Apparently, the recent Windows update did this for Ryzen FPS increases.
> 
> 
> 
> But, need to know more real information and more sources to see if this is for real or just an anomaly to one old game.


Try reading that tweet again.
I know you can do it


----------



## Malinkadink

Quote:


> Originally Posted by *ChronoBodi*
> 
> Apparently, the recent Windows update did this for Ryzen FPS increases.
> 
> 
> 
> But, need to know more real information and more sources to see if this is for real or just an anomaly to one old game.


It says it's getting a similar performance boost in Total War Warhammer, Cinebench R15, Handbrake, and Batman Arkham Asylum, so it's not just an anomaly. Will take it with a grain of salt though, as I'd like more sources to test and confirm.


----------



## Scotty99

Quote:


> Originally Posted by *Malinkadink*
> 
> It says its getting similar performance boost in Total War Warhammer, Cinebench R15, Handbrake, and Batman Arkham Asylum so its not just an anomaly. Will take it with salt though as i'd like more sources to test and confirm.


I read that differently than you: he's saying those other games got the same results as before; UT was the only one he noticed a boost in.


----------



## cssorkinman

Quote:


> Originally Posted by *Malinkadink*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ChronoBodi*
> 
> Apparently, the recent Windows update did this for Ryzen FPS increases.
> 
> 
> 
> But, need to know more real information and more sources to see if this is for real or just an anomaly to one old game.
> 
> 
> 
> It says its getting similar performance boost in Total War Warhammer, Cinebench R15, Handbrake, and Batman Arkham Asylum so its not just an anomaly. Will take it with salt though as i'd like more sources to test and confirm.

I think you may be misinterpreting what they were saying: Total War Warhammer, Cinebench R15, Handbrake, Batman, etc. were the same as before, but the DX10 UT3 bench was that much better.


----------



## Malinkadink

Quote:


> Originally Posted by *Scotty99*
> 
> I read that differently than you, he is saying those other games are the same results as before, UT was the only one he noticed a boost on.


Quote:


> Originally Posted by *cssorkinman*
> 
> I think you may be mis- interpreting what they were saying Total war warhammer, cinebench r15 , handbrake and batman etc were the same as before , but the DX 10 UT 3 bench was that much better.


Ahhh got it, damn those Italians! lol


----------



## Majin SSJ Eric

What I guess would've made some of these people happy is if AMD had pulled an Intel with Ryzen and offered the 1700 for its $329 price but locked out OCing (and cut SMT). Then there would be more justification in the "X" prices but stupid AMD just gave us basically the same chip but for $200 less. What a bunch of morons over at AMD not screwing over their customers!


----------



## Malinkadink

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> What I guess would've made some of these people happy is if AMD had pulled an Intel with Ryzen and offered the 1700 for its $329 price but locked out OCing (and cut SMT). Then there would be more justification in the "X" prices but stupid AMD just gave us basically the same chip but for $200 less. What a bunch of morons over at AMD not screwing over their customers!


You're paying $80 more for a much better chance at hitting 4.0GHz on the 1700X at reasonable voltages. You're paying $180 more for a 1800X to possibly get 4.1GHz, or just to have 4.0GHz out of the box. To some that may be worth it. Personally I don't see the point, when a 3.9GHz (or 4GHz if you're lucky) 1700 for less money will be just as good, and you won't notice a difference outside of synthetics.

My problem right now is the damn memory issues. I just want a board that can take my 32GB 3200MHz CL14 kit without a fuss and I'll be fine with that.


----------



## Majin SSJ Eric

I'm getting a 1700X personally just because the X looks better in the sig!


----------



## Shatun-Bear

Quote:


> Originally Posted by *Malinkadink*
> 
> You're paying $80 more to get a much better chance at hitting 4.0ghz on the 1700X at reasonable voltages. You're paying $180 more to get a 1800X to get possibly 4.1ghz or just have the 4.0ghz out of the box. To some that may be worth it. Personally i don't see the point where a 3.9ghz or 4ghz if you're lucky 1700 for less $ will be just as good and you wont notice a difference outside of synthetics.
> 
> My problem right now is the damn memory issues. I just want a board that can take my 32GB 3200Mhz CL14 kit without a fuss and i'll be fine with that.


All you have to do (easier said than done, maybe) is wait until May, when they release the memory BIOS update; then you'll be able to stick 3600MHz memory in there if you want.


----------



## Malinkadink

Quote:


> Originally Posted by *Shatun-Bear*
> 
> All you have to do (easier said than done maybe) is wait until May when they release the memory BIOS update, then you'll be able to stick 3600Mhz memory in there if you want.


Sounds good to me; no rush, I can manage with the i5 till then.


----------



## III-Method-III

^this.

Waiting until May, though, is agony.

I'm curious to see how much of a diff this Windows update makes once other sites re-test.


----------



## budgetgamer120

Quote:


> Originally Posted by *Shatun-Bear*
> 
> All you have to do (easier said than done maybe) is wait until May when they release the memory BIOS update, then you'll be able to stick 3600Mhz memory in there if you want.


I have a feeling that is not coming... Did a board manufacturer or AMD confirm that?


----------



## blue1512

Quote:


> Originally Posted by *III-Method-III*
> 
> ^this.
> 
> Waiting until may though is agony.
> 
> I am curious to know how much of a diff this Win update is making when other sites re-test?


AMD also just released new microcode, so you should wait until the new BIOS becomes available in the next few weeks. But I can say with confidence that performance will only get better.


----------



## iRUSH

Quote:


> Originally Posted by *SoloCamo*
> 
> It's like some of you haven't been around the cpu scene long at all...
> 
> Different clock speed vs price point has typically always been the name of the game.. Some people do not want to overclock and just want the fastest model. *This is how the overclocking scene was born... to get a faster model for cheaper - not to just push the fastest models even faster.*. At $500 an 1800x is still considerably cheaper than a HEDT from Intel and performs pretty much in parity. For those of who want cheaper prices and can OC, pick up the 1700.
> 
> What looks bad is artificially locking overclocking to "K" chips yet that seems to be all fine and dandy around here.


But at least the K chips justify their premium when compared to the exact same non-k chip. You pay a small amount more for the unlocked cpu. Ryzen 7 chips look silly since they all overclock to 3.9-4.1 but the price difference is quite large.


----------



## blue1512

Quote:


> Originally Posted by *iRUSH*
> 
> But at least the K chips justify their premium when compared to the exact same non-k chip. You pay a small amount more for the unlocked cpu. Ryzen 7 chips look silly since they all overclock to 3.9-4.1 but the price difference is quite large.


Now we're accusing AMD of being generous









I think the price is fine as it is. You are free to choose from the 1700, 1700X, and 1800X. If you want an absolute bargain, just go for the 1700. If you want plug-and-play, go for an X. And if you want to support AMD and pursue as many MHz as possible, just slap money in their face with the 1800X.

There is buyer's remorse among the pre-order guys, sure. But at the end of the day most of them want to support AMD and will be fine with it.


----------



## AuraNova

Quote:


> Originally Posted by *iRUSH*
> 
> But at least the K chips justify their premium when compared to the exact same non-k chip. You pay a small amount more for the unlocked cpu. Ryzen 7 chips look silly since they all overclock to 3.9-4.1 but the price difference is quite large.


If anything, the 1700X seems like more of a waste. I could justify the 1800X being in the line up next to the 1700. Maybe not at the price point it is. Maybe $450 would have worked.


----------



## iRUSH

Quote:


> Originally Posted by *Scotty99*
> 
> Thats why you buy the 1700....
> 
> x chips are for people that dont know what a bios is. I said this the same day the reviews hit, they should have ONLY released the 1700 lol. Forget about the x chips and you will see this differently.
> 
> Its just odd you are so upset about this but are cool with 6800k vs 6850k prices....


I didn't say I was cool with the price of the comparably performing Intel chip. Although I can get a 6800K for $359, I'd probably still pick the 1700 over it.

Don't think that just because I have thoughts on this that don't align with others that rant and rave over Ryzen means I think the CPU is utter trash.

With that said, the article that I agreed with still holds true to not only me, but a ton of people beyond OCN.

Ryzen 7 is a niche product, unless someone can show me businesses ditching their Xeons en masse in favor of a Ryzen 7 chip.

There's no such thing as a bad CPU, just a bad price. The 1700X and 1800X are way off base. If they're not, then their price ratio compared to the other models within Ryzen 7 will never change either... but I have a hunch it will, just wait.

By that I mean the 1800X price will fall quite a bit and the whole R7 lineup will tighten. This is one of my biggest complaints about AMD: they price high, then drop over time. With Intel and Nvidia you'll see far less of this. In other words, it devalues AMD products. Anyone remember the 9000-series FX chip prices at release? Yes, I'm aware that's an extreme example.


----------



## cssorkinman

Kinda fun slapping 5GHz 7700Ks around in 3D benches (at stock clocks on the 1800X)




----------



## oxidized

The latest Dota 2 patch included a Ryzen improvement.

http://store.steampowered.com/news/28296/

- Improved threading configuration for AMD Ryzen processors.


----------



## mushroomboy

Quote:


> Originally Posted by *Malinkadink*
> 
> You're paying $80 more to get a much better chance at hitting 4.0ghz on the 1700X at reasonable voltages. You're paying $180 more to get a 1800X to get possibly 4.1ghz or just have the 4.0ghz out of the box. To some that may be worth it. Personally i don't see the point where a 3.9ghz or 4ghz if you're lucky 1700 for less $ will be just as good and you wont notice a difference outside of synthetics.
> 
> My problem right now is the damn memory issues. I just want a board that can take my 32GB 3200Mhz CL14 kit without a fuss and i'll be fine with that.


Well, when you OC with water and have two pumps fail, you start to look at the top-tier chips and go "if only I had spent $100 more". Then you get that OC clock at stock; win?

[to others]

Some people like having top-tier performance without the effort; say, those who have kids and a life and don't have the time to ask more savvy friends what to do.


----------



## Malinkadink

Quote:


> Originally Posted by *oxidized*
> 
> Latest DotA 2 patch included a ryzen improvement.
> 
> http://store.steampowered.com/news/28296/
> 
> - Improved threading configuration for AMD Ryzen processors.


nice guy valve
Quote:


> Originally Posted by *mushroomboy*
> 
> well when you OC with water, and have 2 pumps fail, you start to look at the top tier chips and go "if only I would have spent 100 more". Then you get that OC clock at stock, win?
> 
> [to others]
> 
> Some people like having the top tier performance without the effort, idk those who have kids and a life that don't have the time to ask more savvy friends what to do because they don't have the time for it.
> 
> No, that's too long of a sentence to be true or right. English embarrassment.


Pumps failing has less to do with the OC and more to do with the quality of those pumps.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Not sure what you are trying to say there?
> 
> 
> 
> 
> 
> 
> 
> When I think about the 1600 I'm thinking of a 2500K but for modern workloads. Roughly the same price as when the 2500K launched but with 6 cores and 12 threads. This will be a very disruptive sku in the CPU market, even more so than the R7's which are already very disruptive to the HEDT market.


What I'm trying to say is that games will make use of a 6-core CPU sooner than an 8-core one, so the 6-core pays off earlier if you plan to keep the CPU for a long time. Right now the sweet spot is 4C/8T. In 2-3 years it will be 6C/12T. 8C/16T will take much longer than that.


----------



## Alwrath

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What I am trying to say is that with 6 Core CPU you will get the performance before then a 8 Core CPU in games if you plan to keep the CPU for long time. Right now the sweet spot is 4C/8T. In 2-3 years it will be 6C/12T. 8C/16T will take much longer then that.


The sweet spot right now is 6-core/12-thread; a decent number of games are already using more cores.


----------



## espn

Quote:


> Originally Posted by *Alwrath*
> 
> The sweet spot right now is 6 core/12 thread, decent amount of games are already using more cores.


The fastest in game testing is still a 4-core for almost all games.


----------



## budgetgamer120

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What I am trying to say is that with 6 Core CPU you will get the performance before then a 8 Core CPU in games if you plan to keep the CPU for long time. Right now the sweet spot is 4C/8T. In 2-3 years it will be 6C/12T. 8C/16T will take much longer then that.


Wouldn't the 8-core perform the same, seeing as it has those 6 cores plus 2 more?


----------



## budgetgamer120

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> What I guess would've made some of these people happy is if AMD had pulled an Intel with Ryzen and offered the 1700 for its $329 price but locked out OCing (and cut SMT). Then there would be more justification in the "X" prices but stupid AMD just gave us basically the same chip but for $200 less. What a bunch of morons over at AMD not screwing over their customers!


They all wanted $1000 CPUs. I remember some here were hoping for $700 and saying AMD is not a charity lol


----------



## Ultracarpet

Quote:


> Originally Posted by *budgetgamer120*
> 
> They all wanted $1000 CPUs. I remember some here were hoping for $700 and saying AMD is not a charity lol


Definitely a strange few around here. Dunno why anyone would be angry over getting a better deal than they wanted.


----------



## SuperZan

Quote:


> Originally Posted by *iRUSH*
> 
> But at least the K chips justify their premium when compared to the exact same non-k chip. You pay a small amount more for the unlocked cpu. Ryzen 7 chips look silly since they all overclock to 3.9-4.1 but the price difference is quite large.


In a world where the vast, vast majority of people don't overclock, using a price hierarchy determined by tiers of base/boost clocks makes perfect sense. In a world where the vast, vast majority of professional workstation users are not overclocking, releasing an 8c/16t chip which uses a price hierarchy determined by tiers of base/boost clocks makes perfect sense. That world is our world. The 1700 is a straight-up gift to price/performance buyers and overclockers, because AMD knows that these buyers and their goodwill towards AMD have effectively kept the sinking ship afloat long enough to put out competitive ranges in the money (read: server and workstation) spaces.

You're so conditioned to Intel's dominance that you're completely willing to excuse the near-monopoly power play whilst criticising the logical positioning of a stack that also happens to benefit us, the extreme minority of overclocking enthusiasts. The only reason K chips (arguably) 'justify' their price premium is because of standards that Intel itself has created regarding what is and isn't for overclockers.

And before any blue crusaders try to lay into me with 'all corporations matter' I'm not saying that any other company would provably do things differently with the same sort of dominance that Intel has had, nor am I calling them 'evil' for what they've done re: K vs non-K. That said, as a consumer I don't have to like it and I'm honestly not sure why any consumer would.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *iRUSH*
> 
> But at least the K chips justify their premium when compared to the exact same non-k chip. You pay a small amount more for the unlocked cpu. Ryzen 7 chips look silly since they all overclock to 3.9-4.1 but the price difference is quite large.


^^^^^^^^^^^^^ Oh look, exactly the kind of guy I was referencing when I said this:
Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> What I guess would've made some of these people happy is if AMD had pulled an Intel with Ryzen and offered the 1700 for its $329 price but locked out OCing (and cut SMT). Then there would be more justification in the "X" prices but stupid AMD just gave us basically the same chip but for $200 less. What a bunch of morons over at AMD not screwing over their customers!


----------



## jprovido

I bought the 1700X because it was on sale at GeekDeal for $369.99, with no tax either (I'm from California), and they didn't have any R7 1700 in stock. If I bought an R7 1700 at Newegg or Amazon, the tax would make it cost the same as the 1700X on GeekDeal. I just had to wait more than a week because I believe it shipped from Washington.







Ryzen X master race. r7 1700 peasants


----------



## Majin SSJ Eric

I'll definitely be getting a 1700X myself. It doesn't really cost much more than the 1700 and I get to join the Ryzen X master race! Can't really justify the price gap to the 1800X personally, though if it were just a bit cheaper I'd probably spring for it just to get the top of the line Ryzen (but $499 is just a bit too steep of a jump from the 1700 for little real benefit).


----------



## jprovido

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I'll definitely be getting a 1700X myself. It doesn't really cost much more than the 1700 and I get to join the Ryzen X master race! Can't really justify the price gap to the 1800X personally, though if it were just a bit cheaper I'd probably spring for it just to get the top of the line Ryzen (but $499 is just a bit too steep of a jump from the 1700 for little real benefit).


I'm really happy with my 1700X. I was able to get a 3.9GHz overclock on a cheap B350 mATX board







Hopefully the memory speeds get better soon. 2400MHz kinda sucks when I have a 3200MHz kit.


----------



## Majin SSJ Eric

I gotta get that Crosshair VI Hero personally! I know they've had issues but by the time I'm ready for my build they should be mostly sorted. I'm a big ROG fanboy at this point since my RIVE. Will throw some 3200MHz CL14 memory in there and call it a day.


----------



## budgetgamer120

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> ^^^^^^^^^^^^^ Oh look, exactly the kind of guy I was referencing when I said this:


----------



## Alwrath

Quote:


> Originally Posted by *espn*
> 
> The fastest one in game testing is still 4 cores for almost all games


Unless you're at 1440p or 4K resolution; then things change quite a bit in favor of more cores.


----------



## espn

Quote:


> Originally Posted by *Alwrath*
> 
> Unless your on 1440p and 4K resolution, then things change quite a bit in favor of more cores.


For 4K gaming the load falls mainly on the graphics card, so the CPU gap looks smaller.
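The reasoning here can be sketched as a toy bottleneck model (all the fps figures below are made up for illustration, not measurements): the delivered frame rate is roughly the minimum of what the CPU and the GPU can each sustain, so once the GPU cap at 4K drops below every CPU's cap, the CPU gap vanishes.

```python
# Toy model: delivered fps is capped by the slower of the CPU and GPU.
def delivered_fps(cpu_cap, gpu_cap):
    return min(cpu_cap, gpu_cap)

cpu_caps = {"4C/8T": 160, "8C/16T": 140}           # hypothetical CPU-limited fps
for res, gpu_cap in [("1080p", 300), ("4K", 70)]:  # hypothetical GPU-limited fps
    print(res, {name: delivered_fps(cap, gpu_cap) for name, cap in cpu_caps.items()})
```

At 1080p the model shows the full CPU gap (160 vs 140); at 4K both CPUs land on the GPU's 70, which matches what reviewers see when the graphics card is the bottleneck.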


----------



## Slink3Slyde

Quote:


> Originally Posted by *Alwrath*
> 
> Unless your on 1440p and 4K resolution, then things change quite a bit in favor of more cores.


Where did you get that idea?


----------



## iRUSH

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I'll definitely be getting a 1700X myself. It doesn't really cost much more than the 1700 and I get to join the Ryzen X master race! Can't really justify the price gap to the 1800X personally, though if it were just a bit cheaper I'd probably spring for it just to get the top of the line Ryzen (but $499 is just a bit too steep of a jump from the 1700 for little real benefit).


Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> ^^^^^^^^^^^^^ Oh look, exactly the kind of guy I was referencing when I said this:


Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I'm getting a 1700X personally just because the X looks better in the sig!


Reference all you want Eric. After all, you're clearly a fan... I'm sure we can "reference" your preference for "X" later.


----------



## Alwrath

Quote:


> Originally Posted by *jprovido*
> 
> I'm really happy with my 1700x. I was able to get a 3.9ghz overclock on a cheap b350 matx board
> 
> 
> 
> 
> 
> 
> 
> hopefully the memory speeds gets better soon. 2400MHz kinda sucks and I have a 3200mhz kit


I'm gonna leave my 3200 G.Skill Flare X kit at 2400MHz till I see a major BIOS update/fix on Ryzen. I'm scared to OC right now due to reports of users ending up with dead RAM sticks.


----------



## espn

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I gotta get that Crosshair VI Hero personally! I know they've had issues but by the time I'm ready for my build they should be mostly sorted. I'm a big ROG fanboy at this point since my RIVE. Will throw some 3200MHz CL14 memory in there and call it a day.


You're a Samsung lover as well; you really spend a lot of money on new tech stuff.


----------



## Alwrath

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Where did you get that idea?


Actually you're right, that doesn't make sense. More cores are always better no matter the resolution.


----------



## Alwrath

Quote:


> Originally Posted by *espn*
> 
> For 4k gaming, the load becomes mainly graphic card and look like cpu gap is smaller.


That is what I meant to say.


----------



## ChronoBodi

Quote:


> Originally Posted by *Alwrath*
> 
> Im gonna leave my 3200 gskill flare x kit at 2400 mhz till I see a major bios update/fix on ryzen. Im scared oc right now due to reports of users having problems with dead ram sticks.


Do 2666MHz at 16-18-18-38. Looser timings, I know, but more likely to work.
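As an aside on the looser-timings trade-off being weighed here: first-word latency is CAS cycles divided by the actual DRAM clock (half the DDR transfer rate), so a slightly higher CL at a higher speed can come out even or ahead. A minimal sketch, using only the kits and timings mentioned in this thread:

```python
# First-word latency in ns: CL cycles at the DRAM clock (half the DDR transfer rate).
def latency_ns(transfer_rate_mts, cas_latency):
    clock_mhz = transfer_rate_mts / 2      # DDR makes two transfers per clock
    return cas_latency / clock_mhz * 1000  # cycles / MHz -> nanoseconds

for rate, cl in [(2400, 15), (2666, 16), (3200, 14)]:
    print(f"DDR4-{rate} CL{cl}: {latency_ns(rate, cl):.2f} ns")
```

By this yardstick, 2666 CL16 (about 12.0 ns) is a wash against 2400 CL15 (12.5 ns), while 3200 CL14 (8.75 ns) is clearly ahead, and the higher speeds bring more bandwidth regardless.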


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *iRUSH*
> 
> Reference all you want Eric. After all, you're clearly a fan... I'm sure we can "reference" your preference for "X" later.


Feel free. I've never denied my illogical fanboyism at times. I bought a 3960X when it came out because I wanted that X even though the 3930K was the obvious smarter choice! Ah, the days when I actually had disposable cash...


----------



## Slink3Slyde

Quote:


> Originally Posted by *Alwrath*
> 
> Actually your right, that doesnt make sense. More cores is always better no matter the resolution.


And there was me interested to see a reference to your claim.


----------



## Alwrath

Quote:


> Originally Posted by *ChronoBodi*
> 
> do 2666 mhz at 16-18-18-38. Looser timing i know but more likely to work.


I may try 2666 at 14-14-14-36 and just increase the voltage to maybe 1.28V, if I'm feeling brave.


----------



## ChronoBodi

Quote:


> Originally Posted by *Alwrath*
> 
> I may try 2666 at 14-14-14-36, just increase voltage to 1.28 maybe if im feeling brave.


Ehhhh..... just do the looser timings, it's still gonna be better than boring 2133MHz lol.


----------



## Alwrath

Quote:


> Originally Posted by *ChronoBodi*
> 
> ehhhh..... just do the looser timing, its still gonna be better than boring 2133 mhz lol.


It's at 2400MHz 14-14-14-36 @ 1.25V atm; haven't even tried lowering the voltage to see what's the lowest stable yet.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ChronoBodi*
> 
> ehhhh..... just do the looser timing, its still gonna be better than boring 2133 mhz lol.


Dude your rigs are filthy! Seriously nice set ups!


----------



## Alwrath

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I'll definitely be getting a 1700X myself. It doesn't really cost much more than the 1700 and I get to join the Ryzen X master race! Can't really justify the price gap to the 1800X personally, though if it were just a bit cheaper I'd probably spring for it just to get the top of the line Ryzen (but $499 is just a bit too steep of a jump from the 1700 for little real benefit).


I'm fine with the 1700. It's only temporary till Zen 2 comes out anyway and wipes the floor with anything we've got today. That's how I look at it. We'll get to keep our motherboard and RAM too! The 1700 will be on eBay someday LOL


----------



## ChronoBodi

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Dude your rigs are filthy! Seriously nice set ups!


Uhhhh.... yeah. My X99 is filthy with a lot of storage, lol. Excessively so.

And my Ryzen rig is, like, my "excuse" to have an AMD "5960X" on the cheap for the TV room, lol.


----------



## Alwrath

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I gotta get that Crosshair VI Hero personally! I know they've had issues but by the time I'm ready for my build they should be mostly sorted. I'm a big ROG fanboy at this point since my RIVE. Will throw some 3200MHz CL14 memory in there and call it a day.


Yeah, I like Asus boards too. Thankfully I think I made the right choice grabbing the K7 when I did; the Asus CH6 seems littered with troubles atm. This is my first Gigabyte board and so far I'm very impressed. Once you learn the BIOS, it's actually very easy to use for a manual old-school overclocker like me.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ChronoBodi*
> 
> Uhhhh.... yea. My x99 is filthy of a lot of storage, lol. excessively so.
> 
> And my ryzen rig is like, my "excuse" to have an AMD 5960x on the cheap for the TV room. lol.


Yeah, your storage is mental! I wanted a 5960X so damn bad when they released, but I couldn't swing the upgrade to X99, so I'm stuck with this ancient 4930K for a while longer. Will replace it with a 1700X and CHVI and then take a look at Skylake-E when it comes out. Maybe Intel will relax their pricing a bit considering how well Ryzen does.


----------



## ChronoBodi

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yeah your storage is mental! I wanted a 5960X so damn bad when they released but i couldn't swing the upgrade to X99 so I'm stuck with this ancient 4930K for a while longer. Will replace it with a 1700X and CHVI and then take a look at Skylake-E when it comes out. Maybe Intel will relax their pricing a bit considering how well Ryzen does.


I am still surprised Intel hasn't dropped prices at all on their 6900K.

What's taking them so long? And no, Micro Center doesn't count; I mean actual Intel-mandated price drops.

Oh, I want to see what happens when the Ryzen 5 series comes out; by then we'll have the issues ironed out.

And even then, there are already rumors of an AMD HEDT 16c/32t socket for prosumers. So if AMD can disrupt the status quo of the 6900K, who's to say the same can't be done to the 16-core Xeons as well?

Also, for some reason, Windows Update did this to Ryzen for a 9-year-old game. LOL.


----------



## Alwrath

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yeah your storage is mental! I wanted a 5960X so damn bad when they released but i couldn't swing the upgrade to X99 so I'm stuck with this ancient 4930K for a while longer. Will replace it with a 1700X and CHVI and then take a look at Skylake-E when it comes out. Maybe Intel will relax their pricing a bit considering how well Ryzen does.


If I had your 4930K I wouldn't even think about upgrading, unless I just wanted to support AMD that badly right now. The 4930K is still a beast, but then again all I do is game at 4K @ 60Hz.


----------



## Majin SSJ Eric

I'm not getting rid of the X79 system. It will move down to be my studio PC and I'll finally retire my 2600K / P67 Sabertooth to the shelf. Then I'll have the 4930K and the 1700X at my disposal and, more importantly, will be off quad cores for good...


----------



## Alwrath

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I'm not getting rid of the X79 system. it will move down to my studio PC and I'll finally retire my 2600K / P67 Sabertooth to my shelf. Then I'll have the 4930K and the 1700X at my disposal and more importantly will be off of quad cores for good...


Quad Cores are so 2007


----------



## Scotty99

Quote:


> Originally Posted by *ChronoBodi*
> 
> do 2666 mhz at 16-18-18-38. Looser timing i know but more likely to work.


Those are the exact timings for my 3200 kit; it won't boot at any speed I set in the BIOS, which is why I'm thinking something is wrong with my system.

Literally the only two things I can boot at are BIOS defaults, or 2400 15-15-15-36.


----------



## ChronoBodi

Quote:


> Originally Posted by *Scotty99*
> 
> Those are the exact timings for my 3200 kit, it wont boot at any speed i set in bios, thats why im thinking something is wrong with my system.
> 
> Literally the only two things i can boot at are bios defaults, or 2400 15 15 15 36.


Stupid idea, but did you try 2666MHz 16-18-18-38? Or 2666MHz but 15-15-15-36?

Thing is, my RAM works on my Giga mobo; no idea how RAM behaves on the ASRock side of things.

You could also try timings of 14; I heard Ryzen likes even numbers at lower MHz for some reason.


----------



## Scotty99

Quote:


> Originally Posted by *ChronoBodi*
> 
> stupid idea, but did you try 2666 mhz 16-18-18-38? or 2666 mhz, but 15 15 15 36?
> 
> thing is my ram works on my Giga mobo, no idea on Asrock side of things for RAM workings.
> 
> you could try timings of 14, i heard Ryzen likes even numbers at lower mhz for some reason.


Yep, I've tried every speed at XMP timings, as well as 15-15-15-36; only 2400 boots at CAS 15. Never gave CAS 14 a shot, though.

The only reason I even got 2400 to work is that there are, like, "suggested" timings to the left of where you manually enter them, and that was the 15-15-15-36 number. I don't know why they're even there, but I tried them and that's the only thing I can boot at lol.


----------



## ChronoBodi

Quote:


> Originally Posted by *Scotty99*
> 
> Yep ive tried every speed at xmp timings, as well as 15 15 15 36 only 2400 boots at the cas 15. Never gave 14 cas a shot tho.
> 
> The only reason i even got 2400 to work is there are like, "suggested" timings to the left of where you manually enter them, and that was the 15 15 15 36 number. I dont know why they are even there but i tried them and thats the only thing i can boot at lol.


Welp, I'm lost. My RAM and yours were supposed to be 3200MHz, but bah, not till May for that stable BIOS update.

These BIOSes feel like the minimum spec required to get Ryzen out the door quickly for launch.


----------



## Shatun-Bear

Quote:


> Originally Posted by *espn*
> 
> The fastest one in game testing is still 4 cores for almost all games


Yes, this 4-core gives you maybe 160fps instead of the 140fps of the 8-core Ryzens when you game with a Titan X at 1080p, so sure, it is 'faster', but let's be real here.









When a game needs more than the 4C/8T of the 7700K, frequency doesn't matter; the limited cores and threads of that CPU are going to be left behind by even 6-core CPUs. That's the point: the 7700K may be 'faster' in scenarios that most of us never use, will never experience, or would never _even notice_ (how can anyone tell the difference between 90fps and 105fps?!?), but you're limiting yourself for the games today that scale better with more cores, and the games of tomorrow that will almost certainly prefer more cores/threads.


----------



## amlett

Waiting for a Noctua D15 for ocing more


----------



## Carniflex

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Yes this 4-core gives you maybe 160fps instead of the 140fps with the 8-core Ryzens when you game with a Titan X in 1080p resolution, so sure, it is 'faster' but let's be real here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When the game needs more than the 4C/8T of the 7700K, it doesn't matter the frequency, the limited cores and threads of that CPU are going to be left behind by even 6-core CPUs. That's the point - the 7700K may be 'faster' in scenarios that most of us never use or will never experience or _even notice_ (how can anyone tell the difference between 90fps and 105fps?!?) but it is limiting yourself for the games today that scale better with more cores and those games of tomorrow that will almost certainly prefer more cores/threads.


I would not go as far as "not notice" - it really depends on the specific game and circumstances. In an FPS game, for example, in the absence of FreeSync/G-Sync, dropping under the refresh rate of the display has an immediate and obvious effect, as it causes tearing. In a game that is strongly dependent on single-threaded performance, even in the absence of graphics entirely - like, for example, Dwarf Fortress - the difference can be noticeable as well. It is noticeable in the late game on a larger map whether you are running at 25 fps or 20 fps. To be honest, though, I have not yet seen much discussion of Dwarf Fortress in the context of Ryzen - Dwarf Fortress is not only hungry for the best single-threaded performance, it is also very memory heavy, in particular relative to cache sizes, where the extra cache in Ryzen might give it enough of a leg up to compete with the Intel i5/i7 offerings.

There are scenarios where Intel still has an advantage, and, what is new, nowadays there are also scenarios where the new AMD offerings have a clear advantage. That had not been the case outside of ultra-low-end budget builds for the last several years.

You are of course correct that the majority of users would probably not notice the difference. Both CPUs are good enough to play Candy Crush in a web browser, and as far as I understand, that is what the majority of users are doing in the real world. Then again, a 30 EUR Android stick is also good enough for that.


----------



## Scotty99

See, I dislike that train of thought. No one, and I mean absolutely NO ONE, should use vsync on a 144Hz panel, so no, you are not going to notice a dip below 144.

The only spot where Ryzen is a bad choice is competitive gamers who think they can see 240Hz.


----------



## pez

Quote:


> Originally Posted by *Scotty99*
> 
> Yep ive tried every speed at xmp timings, as well as 15 15 15 36 only 2400 boots at the cas 15. Never gave 14 cas a shot tho.
> 
> The only reason i even got 2400 to work is there are like, "suggested" timings to the left of where you manually enter them, and that was the 15 15 15 36 number. I dont know why they are even there but i tried them and thats the only thing i can boot at lol.


I've known Gigabyte boards to notoriously set DRAM voltage below what it should be. I'm sure you've checked this, but it's worth a try, eh?


----------



## czin125

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Yes this 4-core gives you maybe 160fps instead of the 140fps with the 8-core Ryzens when you game with a Titan X in 1080p resolution, so sure, it is 'faster' but let's be real here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When the game needs more than the 4C/8T of the 7700K, it doesn't matter the frequency, the limited cores and threads of that CPU are going to be left behind by even 6-core CPUs. That's the point - the 7700K may be 'faster' in scenarios that most of us never use or will never experience or _even notice_ (how can anyone tell the difference between 90fps and 105fps?!?) but it is limiting yourself for the games today that scale better with more cores and those games of tomorrow that will almost certainly prefer more cores/threads.


Why would frequency not matter? A 4C/8T could match a 6C/12T in x265 if you give it faster RAM. x265 gains at least 22%-28% going from 3000MHz to 4000-4266MHz RAM.
6% higher IPC × 22% from RAM × 15% higher clocks = about the same as 1.5x the cores. And while it could equalize in x265, the 4C/8T would be faster in games.

http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page2.html

6850K at 4.6 × 1.15 = 5.3GHz (a 7700K can reach this on water).
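The estimate above is multiplicative. A quick sketch, assuming the quoted gains (6% IPC, 22% from RAM, 15% clocks) stack as independent multipliers, which is an assumption rather than a measured result:

```python
# Rough scaling estimate from the post: treat each quoted gain as an
# independent multiplier (an assumption, not a measurement).
ipc_gain = 1.06     # ~6% higher IPC
ram_gain = 1.22     # ~22% from faster RAM (4000+ MHz vs 3000 MHz)
clock_gain = 1.15   # ~15% higher clocks

combined = ipc_gain * ram_gain * clock_gain
print(f"combined speedup: {combined:.2f}x")  # ~1.49x, close to the 1.5x-cores claim
```

So the stacked gains do land very close to the post's "same as giving 1.5x cores" figure.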


----------



## espn

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Yes this 4-core gives you maybe 160fps instead of the 140fps with the 8-core Ryzens when you game with a Titan X in 1080p resolution, so sure, it is 'faster' but let's be real here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When the game needs more than the 4C/8T of the 7700K, it doesn't matter the frequency, the limited cores and threads of that CPU are going to be left behind by even 6-core CPUs. That's the point - the 7700K may be 'faster' in scenarios that most of us never use or will never experience or _even notice_ (how can anyone tell the difference between 90fps and 105fps?!?) but it is limiting yourself for the games today that scale better with more cores and those games of tomorrow that will almost certainly prefer more cores/threads.


"Games of tomorrow" would be at least 3 to 4 years out, since DirectX 12 doesn't look like it can truly use more than 4 cores.


----------



## Oubadah

..


----------



## Scotty99

Quote:


> Originally Posted by *Oubadah*
> 
> Why not? The higher the refresh rate, the less impact V-Sync has on latency. At 144Hz it's negligible. If you could sustain 144fps, you'd be mad not to use V-Sync at that refresh rate.


Because 144Hz displays don't tear, at least none that I've used.


----------



## IRobot23

Quote:


> Originally Posted by *Scotty99*
> 
> Because 144hz displays dont tear, at least none that ive used.


They do. Less, but they do.


----------



## Oubadah

..


----------



## Scotty99

I've personally never witnessed tearing on a 144Hz panel.

While it technically "exists", it's nowhere near as noticeable; that is why I say no one should be using vsync on them. It's also part of why I feel G-Sync is a major rip-off: G-Sync makes sense at lower framerates, but not in games where you are already getting over 100.


----------



## Derp

Quote:


> Originally Posted by *Scotty99*
> 
> Ive personally never witnessed tearing on a 144hz panel.
> 
> While it technically "exists" its not near as noticeable, that is why i say no one should be using vsync on them. Its also part of why i feel gsync is a major rip off, gsync makes sense at lower framerates but not on games where you are already getting over 100.


It's very noticeable even on high refresh rate monitors, and so is the input lag from vsync. G-Sync/FreeSync solves both of these problems, so I don't understand why you would consider this kind of tech a rip-off.


----------



## Oubadah

..


----------



## Scotty99

You both drank the G-Sync Kool-Aid.

It is 100% unequivocally a rip-off.

It should carry at most a 50 dollar premium over standard 144Hz monitors, yet there are no 300 dollar G-Sync monitors on the market. You can buy a good FreeSync monitor for 200 dollars; you have to spend DOUBLE that for G-Sync.


----------



## Oubadah

..


----------



## Scotty99

There are two things here:

1. VRR CAN be beneficial, I don't disagree with that, but we are specifically talking about situations where you are already at crazy high refresh rates. Here I don't feel VRR plays NEARLY as big of a role as in, say, a 4K game where you would be dipping below 60 on a 4K 60Hz G-Sync monitor.
2. You can't disagree with me on G-Sync prices; they're just ridiculous.


----------



## SoloCamo

Quote:


> Originally Posted by *iRUSH*
> 
> I didn't say I was cool with the comparable performing intel priced chip. Although I can get a 6800k for $359, i'd probably still pick the 1700 over it.
> 
> Don't think that just because I have thoughts on this that don't align with others that rant and rave over Ryzen means I think the CPU is utter trash.
> 
> With that said, the article that I agreed with still holds true to not only me, but a ton of people beyond OCN.
> 
> Ryzen 7 is a niche product. Unless someone can show me mass businesses ditching their Xeons in favor of a Ryzen 7 chip?
> 
> There's no such thing as a bad CPU, just a bad price. The 1700x and 1800x are way off base. If it's not then it's price ratio compared to the other models within Ryzen 7 will never change either....but I have a hunch it will, just wait.
> 
> By that I mean the 1800x price will fall quite a bit and the whole R7 lineup with tighten. This is one of my biggest complaints with AMD. They price it high then drop over time. Intel and Nvidia you'll see far less of this. In other words it devalues AMD products. Anyone remember the 9K series FX chip prices at release? Yes I am aware that's an extreme example.


CPUs are not antique cars. Losing value is of little, well, value to someone who is buying a CPU to put to use. There is no way to spin AMD's pricing in a bad light this time around. They offer a more expensive product for those who do not want to overclock or who just want the fastest chip out of the box (and it also comes with a better cooler for those who don't replace stock). They also offer, at a cheaper price point, a lower-clocked CPU that might be able to hit the same speeds as the 1800x if you are willing to take the gamble. The silicon lottery does exist, even if so far most 1700 users have been getting good results.

Ryzen is hardly niche. Productivity is just as common as gaming, regardless of what most forums seem to believe.

So far, especially with the R5 pricing, Ryzen is quite a bit cheaper than many of us expected as is.

The 6900k is $1049

The 1800x is $499
The 1700x is $399
The 1700 is $329

Even at worst, the highest-priced AMD CPU is less than half the cost of Intel's 8C/16T part. I can assure you, the 6900K is not over 100% faster, let alone anywhere near that.

That's all we need to know regarding the pricing.
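For reference, the price gap works out like this (list prices as quoted above; a trivial check, not new data):

```python
# List prices in USD, as quoted in the post above.
prices = {"6900K": 1049, "1800X": 499, "1700X": 399, "1700": 329}

# Ratio of Intel's 8C/16T part to AMD's top 8C/16T part.
ratio = prices["6900K"] / prices["1800X"]
print(f"6900K / 1800X price ratio: {ratio:.2f}")  # ~2.10, i.e. more than double
```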


----------



## Oubadah

..


----------



## Scotty99

Quote:


> Originally Posted by *Oubadah*
> 
> Well _I_ feel that it makes a significant difference at refresh rates across the board.
> I can, and I did. Titan pricing is ridiculous. G-Sync pricing is reasonable, for the reason I already gave. Fortunately no one is forced to buy G-Sync, and if they feel the price premiums for G-Sync monitors are too high they can vote with their wallets and buy a FreeSync monitor instead.


Only on forums like this one do I ever hear absurd opinions like yours.

What exactly do you feel G-Sync offers over FreeSync that warrants double the price?


----------



## Carniflex

Quote:


> Originally Posted by *Derp*
> 
> It's very noticeable even on high refresh rate monitors and so is the input lag from Vsync. Gsync/Freesync solves both of these problems so I don't understand why you would consider this kind of tech a rip off.


I would go a step further and say that, in my opinion, FreeSync is a bigger deal in a display than the ability to run at 144Hz - that said, the majority of displays that can do 120+ Hz nowadays seem to come with either FreeSync or G-Sync anyway. But if forced to pick between, say, a 75Hz FreeSync display and a 144Hz one without FreeSync, I would pick the one with FreeSync myself.

Before buying a 144Hz FreeSync display (1440p) I thought of it as a nice-to-have semi-gimmick and bought the display for its high refresh rate, but having FreeSync makes such a large difference in the perceived smoothness of the action if your frame rate drops under the display's native rate. And it does drop under 144Hz often enough even with [email protected], which is about the same single-thread performance as Ryzen at around 4-ish GHz based on the benches I have seen so far. But with FreeSync it really does not matter all that much; it can go as low as around 40-ish before it gets significantly noticeable. Whereas when using 4K 60Hz without FreeSync, it is immediately noticeable when I dip below 60. The game in question is Planetside 2 with a mix of potato and ultra settings using a handcrafted useroptions.ini. It is an MMOFPS, and when you get a 100-vs-100 fight, sometimes the frame rate tanks regardless of the hardware.

So in this sense I would say that Ryzen is also a damn good gaming CPU because FreeSync/G-Sync exist. And if the user does not have a display that supports G-Sync or FreeSync, I would say it would probably be better to spend the budget on upgrading their display before going for that one CPU in the Intel lineup that offers a bit better max fps than Ryzen does in some titles. Then again, in my opinion any CPU from the past 5 years in the price bracket above $150 is probably a damn good gaming CPU to this day, and you can find a few gems under that price bracket which I would also still consider relevant for gaming.


----------



## Oubadah

..


----------



## Scotty99

Quote:


> Originally Posted by *Carniflex*
> 
> I would go a step further even and would like to say that in my opinion freesync is kind of bigger deal in a display than ability to run at 144 hz - that said majority of displays that can do 120+ hz nowadays seem to come with either free- or g-sync anyway. But if forced to pick between, say 75 hz freesync display and 144hz one without freesync I would pick the one with freesync myself.
> 
> Before buying a 144 hz freesync display (1440p) I was just thinking about it as nice to have semi-gimmic and bought the display for its high refresh rate, but having freesync makes such a large difference if your frame rate drops under the display native rate in perceived smoothness of the action. And it does drop under 144 hz often enough even with [email protected] which is about the same single thread performance wise as Ryzen at around 4'ish GHz based on benches I have seen so far. But with freesync - it really does not matter all that much, with freesync it can go as low as around 40'ish before it gets significantly noticeable. While when using 4k60hz without freesync it is immediately noticeable when I dip below 60 hz. Game in question is Planetside 2 with mix of potato and ultra settings using handcrafted useroptions.ini. It is MMOFPS and when you get 100vs100 fight sometimes the frame rate tanks regardless of the hardware.
> 
> So in this sense I would say that Ryzen is also damn good gaming CPU because free/g-sync exist. And if the user does not have such a display which supports g or freesync I would say it would be probably better to spend the budget on upggrading his/her display before going for that one CPU in the Intel lineup that offers a bit better max fps than ryzen does in some titles. Then again in my opinion any CPU form the past 5 years in the price bracket above 150$ is probably damn good gaming CPU to this day and you can find also few gems under that price bracket which I would also still consider relevant for gaming.


Ryzen really isn't a good gaming CPU, though, as of today in 2017. Time will tell how it fares in the future, but if you are objective and watch all the reviews, it's farther behind Intel than it should be. The fact that VRR monitors exist does not make Ryzen a better gaming chip; it just allows you to make an argument for VRR monitors.


----------



## Oubadah

..


----------



## SoloCamo

Quote:


> Originally Posted by *Scotty99*
> 
> Ryzen really isnt a good gaming cpu tho, as of today in 2017. Time will tell how it fairs in the future but if you are objective and watch all the reviews, its farther behind intel than it should be. Because VRR monitors exist does not make ryzen a better gaming chip, it just allows you to make an argument for VRR monitors.


Guess I have to throw out my 4790k... clearly it's not a good gaming cpu either if Ryzen isn't considered one.

Edit:
Quote:


> Originally Posted by *Oubadah*
> 
> That's like saying that the GTX 1080 isn't a good gaming GPU because of the 1080 Ti. Ryzen is a fine gaming CPU.


Exactly, especially if you game at 4K. A Ryzen would push a 1080 Ti fine... and probably the next few flagships at that res, too.


----------



## Carniflex

Quote:


> Originally Posted by *Scotty99*
> 
> Ryzen really isnt a good gaming cpu tho, as of today in 2017. Time will tell how it fairs in the future but if you are objective and watch all the reviews, its farther behind intel than it should be. Because VRR monitors exist does not make ryzen a better gaming chip, it just allows you to make an argument for VRR monitors.


I disagree.

Can't say I have watched _all_ the reviews; I have watched a fair bit of them. But the absolute minimum resolution at which I'm playing is 1440p, often higher than that at either 4K or even a bit higher at 5400x1920 (5-screen Eyefinity with 1080p screens). I just cannot understand why people play at 1080p when they can afford the one CPU in the Intel lineup that is faster than Ryzen. A good enough 27'' 1440p screen can be had for less than that CPU and is a much more noticeable upgrade when coming from 1080p.

Even when playing some shooter competitively, the higher resolution is beneficial to some degree, as it gives you more pixels to aim at.


----------



## Scotty99

The 4790K obliterates Ryzen in most titles; not sure what you are on about.

I purchased Ryzen because it is incredible value, and I don't NEED 180 fps; I'm OK with 130. That does not excuse the fact that they are behind by so much in so many titles. I loaded up WoW yesterday and got worse FPS than my 4.2GHz 2500K, which was a bit of a surprise, but I put that one mostly on the old engine.

@Carniflex, I've watched ALL the reviews, lol. It is behind, I'd say, on average 25-40% in the CPU-bound titles, which is more than you would expect looking at synthetic benchmarks.

Again, some of that could be just launch-day stuff, but as of right NOW Ryzen is behind farther than its IPC would indicate.


----------



## Oubadah

..


----------



## Derp

Quote:


> Originally Posted by *Scotty99*
> 
> What exactly do you feel gsync offers over freesync that warrants double the pricing?


I agree G-Sync monitors are more expensive than their FreeSync siblings at 144Hz, but at 240Hz the difference is $500 for FreeSync vs $550 for G-Sync. That's not that big of a premium, certainly not double.


----------



## Carniflex

Quote:


> Originally Posted by *Oubadah*
> 
> That's an overly simplistic way of looking at it. Not all 1080P displays run at 60Hz, and not all games are GPU bottlenecked at 1440P+ (especially older ones).


That is true, of course.

The same can be said about the statement that this or that CPU is or is not good for gaming.

But the reason I say I do not understand why people cling to 1080p is that 1440p has nearly twice the pixels of 1080p. Entry-level 1440p screens are about 180 EUR/$, and that many more pixels is such a significant boost in image quality and productivity, and even beneficial, to some degree, in gaming. Obviously people have budget constraints, but if their budget constraints are significant enough to leave no room for 1440p, I'm also pretty sure they are not fretting about the performance difference between Ryzen and top-of-the-line i7s.

Refresh rate is another axis, and it carries a premium regardless of the resolution considered.


----------



## Blameless

Quote:


> Originally Posted by *JackCY*
> 
> The RAM and Fabric speed isn't an issue. RAM affects Ryzen CPUs same way Intel CPUs are


Far from convinced of this.

Very little testing has been done to isolate data fabric from memory performance. Understandable, since the clocks are linked, but still possible to do and still necessary for a complete picture.
Quote:


> Originally Posted by *Oubadah*
> 
> Technically, any unsynchronised framerate is going to result in stutter.


Tearing isn't stutter.
Quote:


> Originally Posted by *Kuivamaa*
> 
> Well played some WoW dungeons and raids. Windows 10 often likes to load mostly cores 0 and 15 to run it
> 
> 
> 
> 
> 
> 
> 
> Scheduler or not,something must be done for sure.


Are those both render threads?

Do you see any performance change when manually setting affinity to one CCX?
Quote:


> Originally Posted by *Malinkadink*
> 
> I've been wondering, what made AMD go with the CCX design


Scalability.

AMD can use the same CCX design on everything from the lowest end APUs to 16-32 core enterprise parts.
Quote:


> Originally Posted by *budgetgamer120*
> 
> Furthermore I dont think ponging cores represents how programs use cores. As Ryzen is a killer in multitasking/multithreading.


Depends on the task in question.

Non real-time rendering, archival, encoding...all of these sorts of tasks are split into small, barely associated jobs. They scale equally well almost irrespective of interconnect.

Not all tasks are like this. For some things, doubling latency during coherency work could easily cut performance in half.
Quote:


> Originally Posted by *iRUSH*
> 
> So you don't think it's ridiculous that there's an 1800x for $500 that is essentially the same as the other two chips prices below it? That's nutty logic. IMO $330 is pushing it and should be the 1800x price. Just get rid of the others.
> 
> Their pricing is clearly odd and will be changed just like always. I don't like to see this since it makes them look bad.
> 
> Even huge AMD fans that worship the ground they walk on should see this. I'm not accusing you of that person.


The 1700X and 1800X are selling. Eliminating them would not be profitable and would make AMD look far worse than the minor binning differences that separate the eight-core models.
Quote:


> Originally Posted by *Scotty99*
> 
> You are kinda missing the point tho mah dude lol. These chips are overall terrible overclockers, you are gonna pay 170 dollars for maybe 100mhz?


Plenty of people do.

The last few percent of performance always commands a huge premium.
Quote:


> Originally Posted by *ChronoBodi*
> 
> I am still surprised Intel hasn't dropped prices yet at all on their 6900k.
> 
> What's taking them so long?


Intel likely figures that their current supply will sell out at existing prices and won't cut prices until they see what they can get out of their next generation of parts.

There is quite a bit of inertia to opinion and demand. Last time AMD had a truly compelling product, it took years for people to get it out of their heads that Intel was always going to be the best purchase decision...then the same thing repeated itself once the balance shifted again.
Quote:


> Originally Posted by *Scotty99*
> 
> The only spot ryzen is a bad choice is competitive gamers who think they can see 240 hz.


The shortest duration of visual stimuli that will register to most people is far faster than 1/240th of a second.

I'm pretty content gaming on 60Hz displays, but I could probably see some benefit out into the multi-thousand-Hz range... and so could most people, given the proper scene and some training.


----------



## Oubadah

..


----------



## Blameless

Quote:


> Originally Posted by *Oubadah*
> 
> Unsynchronized frames = tearing and stutter.


Tearing, yes. Stutter, no, not necessarily.

Indeed, frames that aren't rendered at precise intervals but are displayed at precise intervals will be _more_ stuttery than unsynchronized frames.


----------



## Oubadah

..


----------



## Blameless

It's perfectly possible to get stutter with vsync enabled and a frame rate that never dips below the refresh rate. Just because frames are displayed at a fixed interval does not mean they were originally rendered/drawn at such a perfect interval.

And tearing still doesn't imply stutter.
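A tiny simulation illustrates the point, with made-up frame times (the jitter distribution and refresh rate are assumptions, not measurements): the average frame rate stays above 144fps, yet uneven pacing still leaves some refreshes with no new frame to show:

```python
import random

random.seed(0)

REFRESH_HZ = 144
TICK = 1.0 / REFRESH_HZ          # display scan-out interval, in seconds

# Simulate render completion times that average well above the refresh
# rate but arrive at uneven intervals (the jitter is the point).
t, renders = 0.0, []
for _ in range(2000):
    t += random.uniform(0.2 * TICK, 1.6 * TICK)  # avg ~0.9*TICK -> ~160 fps
    renders.append(t)

# With vsync, each refresh shows the newest frame finished before it.
# A refresh with no newly finished frame repeats the old one: visible stutter.
repeats = 0
tick_time, i = TICK, 0
while tick_time < renders[-1]:
    new_frames = 0
    while i < len(renders) and renders[i] <= tick_time:
        i += 1
        new_frames += 1
    if new_frames == 0:
        repeats += 1
    tick_time += TICK

print(f"repeated (stuttered) refreshes: {repeats}")
```

The repeated refreshes are the judder a viewer perceives even though the frame counter reads well above 144fps.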


----------



## Kuivamaa

Quote:


> Originally Posted by *Blameless*
> 
> Are those both render threads?
> 
> Do you see any performance change when manually setting affinity to one CCX?


I had the same question when I was witnessing this, and I need to look further into it. I tried several tests with various core setups, with SMT on and off, but I was short on time and got nothing conclusive. Network jobs do occupy their own core, but as for rendering I am not sure.


----------



## budgetgamer120

Quote:


> Originally Posted by *Oubadah*
> 
> To me a synced, sustained [email protected] is a _vast_ improvement over jittery, tearing, unsynced frames, even if the latter isn't dropping below 144fps. Just because you don't notice it doesn't mean it's not an issue. Everyone has different tolerances when it comes to this stuff.
> 
> G-Sync is not a rip-off. VRR is the best thing to happen to PC gaming in a decade. It's a huge boon at low _and_ high framerates.
> I guess I'm committing the same fallacy as Scotty99 in saying that V-Sync latency is negligible at 144Hz. It is more than halved vs 60 though.


I have a FreeSync monitor and I don't see what's so special about VRR. I wouldn't advise anyone to invest.


----------



## Quantum Reality

Quote:


> Originally Posted by *Scotty99*
> 
> Ryzen really isnt a good gaming cpu tho, as of today in 2017. Time will tell how it fairs in the future but if you are objective and watch all the reviews, its farther behind intel than it should be. Because VRR monitors exist does not make ryzen a better gaming chip, it just allows you to make an argument for VRR monitors.


Considering it achieves at least 80% of the performance of Intel's flagship offerings, I'd say Ryzen is doing well. Also, we're already seeing people in the AMD ecosystem moving up from Phenom II and "Construction" CPUs and reporting noticeably improved results. By no means is Ryzen a failure or a bad gaming CPU.


----------



## Majin SSJ Eric

VRR is just for those l33t gamerz that get physically ill if their precious eyes ever experience a dip below 500MHz...


----------



## xlink

Quote:


> Originally Posted by *Oubadah*
> 
> If you're trying to run V-Sync with the framerate below the refresh, then of course you're going to have problems. Likely even a coarser stutter than with V-Sync off (assuming a triple buffered V-Sync here). But with V-Sync off you're still getting stutter.


It's probably ideal just to run with vsync off at a high refresh rate.

As your refresh rate and/or FPS approach infinity, synchronization issues get smaller and smaller.
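That limit argument can be made concrete: the visible offset at a tear is roughly the on-screen motion per rendered frame, and a torn frame only persists for one refresh, so both shrink as the rates climb. A sketch with hypothetical numbers (an object crossing a 1920px-wide screen in one second):

```python
# How far a moving object shifts between the two halves of a torn frame,
# and how long the torn frame stays on screen (one scan-out, worst case).
# Hypothetical motion: object crossing a 1920px-wide screen in 1 second.
speed_px_per_s = 1920

for fps, refresh_hz in [(60, 60), (144, 144), (240, 240)]:
    tear_offset_px = speed_px_per_s / fps   # displacement between frames
    visible_s = 1.0 / refresh_hz            # duration of one refresh
    print(f"{fps:>3} fps @ {refresh_hz} Hz: "
          f"~{tear_offset_px:.0f}px offset, visible ~{visible_s * 1000:.1f} ms")
```

At 60Hz the tear is a 32px jump shown for ~17ms; at 240Hz it is an 8px jump shown for ~4ms, which is why tearing fades from view as rates rise.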


----------



## dmasteR

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> VRR is just for those l33t gamerz that get physically ill if their precious eyes ever experience a dip below 500MHz...


What are you even talking about? VRR is typically for people who are playing at around 60fps, as that's where you'll really notice screen tearing.

The higher the framerate, the less noticeable the screen tearing, and thus the less need for VRR.


----------



## tpi2007

Might be of interest for some:


----------



## ZealotKi11er

Quote:


> Originally Posted by *tpi2007*
> 
> Might be of interest for some:


So his 5960X is faster than the 1800X, which is odd. This puts the 1800X at Ivy Bridge level.


----------



## budgetgamer120

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So his 5960X is faster than 1800X which is odd. This put 1800X on IVY level.


Did you watch the video?

The Intel system is using faster video cards.

What is more telling is that of all the systems he mentioned he was building, zero will have a 7700K...


----------



## NYU87

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So his 5960X is faster than 1800X which is odd. This put 1800X on IVY level.


Haswell-E has a higher overclocking ceiling than Ryzen, too.

Most Ryzen CPUs are maxing out at 3.9-4GHz, which sucks. I wouldn't be surprised if, at the same clock speeds (Jay set both at the same clocks), Haswell-E is faster than Ryzen. Remember that the 5960X's base clock is 3GHz while the 1800X's is 3.6GHz, so any reviews will reflect that difference.
Quote:


> Originally Posted by *budgetgamer120*
> 
> Did you watch the video?
> 
> The Intel System is using faster video cards.
> 
> What is more telling is that out of all the systems he mentioned he was building, zero will have a 7700k...


Did you watch the video? He is encoding and shows the load on the CPUs. You can see that there is absolutely no GPU usage so it has no performance impact.


----------



## Malinkadink

Can't watch it right now, but what's the RAM speed for the 1800X? AMD released new AGESA microcode that's supposed to improve RAM compatibility once BIOSes get updated by vendors; it should make getting 3000/3200 much easier. There's talk of potentially getting 4000MHz running. I recall seeing somewhere that someone got 3700-ish working, not sure with what timings or on what board. The point being, as we all know, Ryzen scales very well with memory speed.


----------



## BinaryDemon

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So his 5960X is faster than 1800X which is odd. This put 1800X on IVY level.


Saying a stock 1800X is faster than a stock 5960X is a generalization that's true most of the time, but it doesn't mean it wins in every scenario using different instruction sets. It's also important to note that X99 has quad-channel memory. My crappy DDR4-2133 overclocked to DDR4-2292 beats nearly every Ryzen result I've seen in memory read/write/copy and latency. So if the application is particularly memory sensitive, I wouldn't be surprised to see Haswell-E edge out Ryzen.
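The quad-channel point is easy to quantify: theoretical peak bandwidth is channels × 8 bytes per transfer × the transfer rate. A rough sketch (theoretical peaks only, not the measured read/write numbers the post refers to):

```python
# Theoretical peak DDR4 bandwidth: channels * 8 bytes/transfer * MT/s.
def peak_gb_s(channels: int, mt_per_s: int) -> float:
    return channels * 8 * mt_per_s / 1000  # GB/s (decimal)

x99_quad = peak_gb_s(4, 2292)   # the quoted overclocked DDR4-2292, quad channel
am4_dual = peak_gb_s(2, 3200)   # a fast dual-channel Ryzen setup, for comparison

print(f"X99 quad-channel DDR4-2292: {x99_quad:.1f} GB/s")   # ~73.3 GB/s
print(f"AM4 dual-channel DDR4-3200: {am4_dual:.1f} GB/s")   # ~51.2 GB/s
```

So even slow quad-channel memory out-muscles fast dual-channel on raw bandwidth, which is consistent with the read/write/copy results mentioned above (latency is a separate matter).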


----------



## sugarhell

He has different gpus...

Welp


----------



## Oubadah

..


----------



## budgetgamer120

Quote:


> Originally Posted by *NYU87*
> 
> Haswell-E has a higher overclocking ceiling than Ryzen also.
> 
> Most Ryzen CPUs are maxing out at 3.9-4GHz which sucks. I wouldn't be surprised at the same clock speeds (Jay set both at the same clocks), Haswell-E is faster than Ryzen. Remember that 5960x's base clock speed is 3GHz while 1800x is 3.6GHz so any reviews will reflect that difference.
> Did you watch the video? He is encoding and shows the load on the CPUs. You can see that there is absolutely no GPU usage so it has no performance impact.


GPU usage was consistently at 38%.

The reviewer even mentions it.

Maybe you should check your eyes.


----------



## SoloCamo

Quote:


> Originally Posted by *budgetgamer120*
> 
> GPU usage was consistently at 38%.
> 
> Reviewer even mentions it.
> 
> Maybe you should check your eyes


Was about to post this... it's mentioned quite clearly in the video for this exact reason. Unfortunately it still fell on a few deaf ears.


----------



## mohiuddin

Quote:


> Originally Posted by *Malinkadink*
> 
> Can't watch it right now but whats the ram speed for the 1800X? AMD released new AGESA microcode thats supposed to improve ram compatibility when BIOS get updated from vendors. Should make getting 3000/3200 much easier. Talks of potentially getting 4000Mhz running. I recall seeing somewhere someone got 3700ish working, not sure what timings or what board. Point being as we all know Ryzen scales very well with memory speed.


^ This.
Can you point me to a thread discussing this exclusively?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *dmasteR*
> 
> What are you even talking about? VRR is typically for people who are playing around 60fps as that's where you'll really notice screen tearing on monitors.
> 
> Higher the frames, the less noticeable the screen tearing is, thus less need for VRR.


It was a joke about the earlier argument that was going on in here. Sometimes people just get a little too hyperbolic when discussing the "terrible" tearing and stuttering they supposedly experience at 200Hz without VRR...


----------



## ZealotKi11er

Quote:


> Originally Posted by *BinaryDemon*
> 
> Saying a stock 1800x is faster than a stock 5960x is a generalization that's true most of the time but it doesnt mean that it wins in every scenario using different instruction sets. Also it's important to note that x99 has quad channel memory. My crappy DDR4-2133 overclocked to DDR4-2292 speeds beats nearly every Ryzen result I've seen in memory read/write/copy and latency. So if the application is particularly memory sensitive then I wouldnt be surprised to see Haswell-E edge out Ryzen.


Yeah, but the 6900K is still not that much faster than the 1800X even when it does win, despite being clocked higher with better IPC. Not many places have tested HW-E, BW-E and Zen all at 4.0GHz to see how they compare. I do believe his result though, because I still think Zen has been overhyped even as an 8-core CPU in terms of performance. Price is a different story, because a 1700 can be had for a third the cost of a 5960X.


----------



## budgetgamer120

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah, but the 6900K is still not that much faster than the 1800X even when it does win, despite being clocked higher with better IPC. Not many places have tested HW-E, BW-E and Zen all at 4.0GHz to see how they compare. I do believe his result though, because *I still think Zen has been overhyped even as an 8-core CPU in terms of performance*. Price is a different story, because a 1700 can be had for a third the cost of a 5960X.


That's your problem... "Delusionalism"
Encoding
http://www.guru3d.com/articles_pages/amd_ryzen_7_1800x_processor_review,11.html


----------



## ZealotKi11er

Quote:


> Originally Posted by *budgetgamer120*
> 
> Thats your problem... "Delusionalism"
> Encoding
> http://www.guru3d.com/articles_pages/amd_ryzen_7_1800x_processor_review,11.html


Any data for software content creators actually use?


----------



## Blameless

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Any data for software content creators actually use?


x264 (Handbrake is just an x264 front-end) is probably the most widely used H.264 encoder there is. It's far from a complete performance picture in and of itself, but it's certainly relevant.


----------



## budgetgamer120

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Any data for software content creators actually use?


That is the point of the handbrake bench.

Oh look, the 5960X is faster than the 6900K in Handbrake. The 6900K arch must be slower, then.


----------



## BinaryDemon

Quote:


> Originally Posted by *Blameless*
> 
> x264 (Handbrake is just an x264 front-end) is probably the most widely used h.264 encoder there is. It's far from complete performance picture, in and of itself, but it's certainly relevant.


How does x264 compare to x265? Looking through the HWBOT.org rankings for those benchmarks, they seem to favor Haswell-E/Broadwell-E over Ryzen as well.


----------



## budgetgamer120

Quote:


> Originally Posted by *BinaryDemon*
> 
> How does x264 compare to x265? Because looking thru the HWBOT.org ranking for those benchmarks, they seem to favor Haswell-E/Broadwell-E over Ryzen as well.


I think you are right....
http://www.eurogamer.net/articles/digitalfoundry-2017-amd-ryzen-7-1800x-review


----------



## Blameless

Quote:


> Originally Posted by *BinaryDemon*
> 
> How does x264 compare to x265? Because looking thru the HWBOT.org ranking for those benchmarks, they seem to favor Haswell-E/Broadwell-E over Ryzen as well.


x265 is more demanding in general and can probably make better use of AVX instructions (something which x264 has, but doesn't seem to get much use out of).


----------



## Malinkadink

Quote:


> Originally Posted by *mohiuddin*
> 
> ^this..
> Can you point me to a thread talking about this exclusively?


https://www.overclock3d.net/news/cpu_mainboard/amd_has_reportedly_released_new_agesa_microcode_for_ryzen/1

A bit late sorry


----------



## Kuivamaa

Quote:


> Originally Posted by *sugarhell*
> 
> He has different gpus...
> 
> Welp


Is he using Premiere with GPU acceleration? I haven't really used that; does it take advantage of both CPU and GPU at the same time? If it does, I fail to see what he is trying to show here, since 1080s are clearly faster cards than Titan XMs.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kuivamaa*
> 
> Is he using Premiere with GPU acceleration? I haven't really used that; does it take advantage of both CPU and GPU at the same time? If it does, I fail to see what he is trying to show here, since 1080s are clearly faster cards than Titan XMs.


He is arguing in the video as if both cards perform similarly, lol. The GTX 1080 is faster, unless he left them at stock and OCed the Titan XM.


----------



## 0razor1

I just saw this review on YouTube: https://www.youtube.com/watch?v=TId-OrXWuOE and it sums everything up neatly. It'd really be nice if one could set OS affinity to just, say, two cores, find the stronger cluster of 4 cores, and set per-core clocks on a 1600x (or handicapped 1800x). IDK what I'm thinking, but will customizable turbo boost (custom ratios, like on Intel) be set up for AM4?
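On the affinity half of that wish: Windows can already pin a process to chosen logical CPUs via `start /affinity <hexmask>`, where bit N of the mask enables logical CPU N. A minimal sketch of building that mask (the `affinity_mask` helper name is mine, and which logical CPUs map to the stronger CCX varies per chip):

```python
def affinity_mask(cores):
    """Bitmask for `start /affinity`: bit N set = logical CPU N allowed."""
    mask = 0
    for c in cores:
        mask |= 1 << c
    return mask

# Two physical cores with SMT usually appear as logical CPUs 0-3:
print(hex(affinity_mask([0, 1, 2, 3])))  # -> 0xf
```

So `start /affinity F game.exe` from cmd would restrict the game to those four logical CPUs; per-core clock ratios, though, would still need BIOS/vendor-tool support.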


----------



## Alwrath

Ryzen build up and running. Same settings as my brother's Ryzen setup; we have the same CPU, MB, and RAM combo. Gigabyte K7, 1700 @ 3.5GHz 1.24V, G.Skill Flare X @ 2400MHz 14-14-14-36 1.25V (it's a 3200MHz kit, but waiting for a BIOS update; might try 2933 tomorrow), EVGA GeForce 1080 Ti at stock voltage @ 1950-2000MHz (think I got a decent Ti here), 1TB SanDisk Ultra II SSD. Getting 110-120 FPS in Overwatch at 4K @ 60Hz. Best setup I have ever experienced. Blows my Intel Core i5 setup out of the water. I notice a certain smoothness to my gaming that I didn't notice before on my quad core; even at the same FPS, the game feels smoother. To everyone who thinks Ryzen isn't for gaming... heh... well... I bought it just for gaming and couldn't be happier. Also, my Zen 2 socket upgrade next year will blow away any Intel CPU released this year, so if you want my 1700, one day it will be on eBay. I'm not worried about Intel ATM, even with Skylake-E coming out. I'm gaming now with 8 cores and it's a blast.

Will update with more overclocking results when my Corsair bracket comes in for my H100i v2.


----------



## budgetgamer120

Quote:


> Originally Posted by *Alwrath*
> 
> 
> 
> Ryzen build up and running. Same settings as my brother's Ryzen setup; we have the same CPU, MB, and RAM combo. Gigabyte K7, 1700 @ 3.5GHz 1.24V, G.Skill Flare X @ 2400MHz 14-14-14-36 1.25V (it's a 3200MHz kit, but waiting for a BIOS update; might try 2933 tomorrow), EVGA GeForce 1080 Ti at stock voltage @ 1950-2000MHz (think I got a decent Ti here), 1TB SanDisk Ultra II SSD. Getting 110-120 FPS in Overwatch at 4K @ 60Hz. Best setup I have ever experienced. Blows my Intel Core i5 setup out of the water. I notice a certain smoothness to my gaming that I didn't notice before on my quad core; even at the same FPS, the game feels smoother. To everyone who thinks Ryzen isn't for gaming... heh... well... I bought it just for gaming and couldn't be happier. Also, my Zen 2 socket upgrade next year will blow away any Intel CPU released this year, so if you want my 1700, one day it will be on eBay. I'm not worried about Intel ATM, even with Skylake-E coming out. I'm gaming now with 8 cores and it's a blast.
> 
> Will update with more overclocking results when my Corsair bracket comes in for my H100i v2.


Good to know. When I get a 1700 I won't be overclocking much either.


----------



## FoamyV

Soo, any reviews comparing any of the new Ryzen CPUs to the 4790K? I'm interested in a change of platform, but I'd like to know how much of a hit I'll take in gaming. Thanks


----------



## Malinkadink

Quote:


> Originally Posted by *FoamyV*
> 
> Soo, any reviews comparing any of the new ryzen cpu's to 4790k? I'm interested in a change of platform but i'd like to know how much of a hit i'll take in gaming. Thanks


Haswell to Skylake/Kaby is about 6% better IPC.

Ryzen is sitting around Broadwell levels, which is about 3% slower than Kaby/Sky. If you OC that 4790K you'll get better results in gaming; not quite 7700K levels, but better than Ryzen nonetheless. By how much? Just look at some reputable 7700K vs 1800X comparisons, noting what clocks each CPU is running and what RAM is being used on the Zen chip; the 4790K would fit in between somewhere. If you need the extra cores/threads it's a good upgrade; if you're just gaming, there's little reason to upgrade from a 4790K.
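To put rough numbers on "by how much": single-thread performance scales roughly as IPC × clock, so the gap can be sketched with the ballpark IPC figures from the post (Skylake/Kaby = 1.00; these are illustrative assumptions, not measurements):

```python
def rel_perf(ipc_a, ghz_a, ipc_b, ghz_b):
    """Rough single-thread ratio of chip A to chip B (perf ~ IPC * clock)."""
    return (ipc_a * ghz_a) / (ipc_b * ghz_b)

# 4790K (Haswell, ~6% behind Kaby) OCed to 4.6GHz
# vs a 1800X (~Broadwell, ~3% behind Kaby) boosting to 4.0GHz:
print(f"{rel_perf(0.94, 4.6, 0.97, 4.0):.2f}x")  # -> 1.11x
```

That ~10% ballpark in lightly threaded games matches the "fits in between" estimate above, and of course says nothing about workloads that can load all 16 threads.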


----------



## Brutuz

Quote:


> Originally Posted by *Malinkadink*
> 
> Haswell to Skylake/Kaby is about 6% better IPC,
> 
> Ryzen is sitting around broadwell levels which is 3% slower than Kaby/Sky. If you OC that 4790k you'll get better results in gaming, not quite 7700k levels but it will be better than Ryzen nonetheless, by how much? Well just look at some reputable 7700k vs 1800X comparisons making note of what clocks each CPU is running as well as paying attention to the ram being used on the Zen chip. The 4790k would fit in between somewhere. If you need the extra cores/threads its a good upgrade, if you're just gaming theres little reason to upgrade from a 4790k.


You're underestimating the difference in IPC: Broadwell and Zen are both around 7% lower IPC than Skylake/Kaby Lake, and Haswell is a tad below that again. The 4790K will be faster from the increased clocks, and it'll also be around a ~15% difference core for core. But Ryzen benefits from being able to keep games on 4 threads (even if they aren't fully loaded) and allocate background tasks to the other 4 cores before resorting to SMT. That would obviously show a larger benefit if the cores were loaded, but it means game code never has to wait for background code to clear a core first. I'd pin the difference at around 10% overall, and likely not really noticeable, but with Zen pulling way ahead when a game makes use of more than 4 threads, since its extra cores let overall throughput go far beyond even Kaby Lake when its threads are filled. (Contrary to popular belief, IPC as a measurement includes all cores when you're talking about a program that uses all cores, just as it includes only one core for single-threaded applications.)

Going from anything after Sandy Bridge to Ryzen purely for gaming isn't a smart idea right now; it'd be better to save the money until games are actually using more threads.


----------



## Arturo.Zise

The only thing I paid attention to in that Jayz2C video is the mention of 10% slower at 50% of the cost


----------



## jon666

Worst case, I was expecting Ivy IPC, which is what I will be replacing. Not sure if my Ivy CPU is giving me issues or the motherboard; I'm leaning towards the motherboard. Random audio drops, freezes, etc. Most likely overclocking hurt something important. Might take it easy with overclocking on this build. I'll try to, anyway. Here's to hoping I learn not to want too much.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Malinkadink*
> 
> Haswell to Skylake/Kaby is about 6% better IPC,
> 
> Ryzen is sitting around broadwell levels which is 3% slower than Kaby/Sky. If you OC that 4790k you'll get better results in gaming, not quite 7700k levels but it will be better than Ryzen nonetheless, by how much? Well just look at some reputable 7700k vs 1800X comparisons making note of what clocks each CPU is running as well as paying attention to the ram being used on the Zen chip. The 4790k would fit in between somewhere. If you need the extra cores/threads its a good upgrade, if you're just gaming theres little reason to upgrade from a 4790k.


IPC has nothing to do with clock speeds. From what I see, Zen is between Ivy Bridge and Haswell in terms of IPC. Skylake is more than 6% faster than Haswell, and the DDR4 alone will make the per-clock difference bigger than that.


----------



## hawker-gb

Ryzen is 6% slower in IPC than KL.


----------



## ZealotKi11er

Quote:


> Originally Posted by *hawker-gb*
> 
> Ryzen is 6% slower in IPC than KL.


In what world do you see 6% difference?


----------



## Scotty99

Quote:


> Originally Posted by *hawker-gb*
> 
> Ryzen is 6% slower in IPC than KL.


In most single-threaded tasks, Ryzen is only barely faster than my old 2500K.

It's probably more like 25-35% behind Kaby.


----------



## budgetgamer120

Quote:


> Originally Posted by *Scotty99*
> 
> Most single threaded tasks ryzen is only barely faster than my old 2500k.
> 
> Its probably more like 25-35% behind kaby.


He said IPC... same clocks.


----------



## renx

I've seen Ryzen IPC very close to Kaby Lake on most reviews.
Single thread performance is another animal, because of the clock differences.


----------



## Scotty99

Quote:


> Originally Posted by *budgetgamer120*
> 
> He said ipc... Same clocks


I know; the 2500K boosts to 3.7GHz out of the box, same as the 1700, and the 1700 beats the 2500K in Cinebench single-core by like 4 points, lol.


----------



## ZealotKi11er

Quote:


> Originally Posted by *renx*
> 
> I've seen Ryzen IPC very close to Kaby Lake on most reviews.
> Single thread performance is another animal, because of the clock differences.


Nope, they have never tested IPC. The 1800X @ 4.0GHz scores 161 in R15 single-core; my 3770K, for reference, scores 167 at 4.6GHz. And Cinebench is actually one of the benchmarks Ryzen does very well in.
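Dividing the two scores quoted above by their clocks gives a crude per-clock comparison (the arithmetic is mine; it ignores memory speed, boost behavior, and everything else that differs between the two systems):

```python
# R15 single-core scores and clocks from the post above.
scores = {"1800X": (161, 4.0), "3770K": (167, 4.6)}
for chip, (pts, ghz) in scores.items():
    print(f"{chip}: {pts / ghz:.1f} pts/GHz")
```

By that crude measure the 1800X actually comes out slightly ahead per clock here, which fits the observation that Cinebench is a benchmark Ryzen does well in.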


----------



## IRobot23

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope. They have never tested IPC. 1800X @ 4.0GHz scores 161 in R15. My 3770K for reference scores 167 at 4.6GHz. Cinebench is actually one of those benchmarks Ryzen does very good.


Basically it does well in many scenarios.
The only problem is gaming; many expected that it would be faster than an OCed i7 7700K.


----------



## espn

The lowest-priced Ryzen 3 with an overclock should put up a very good fight against the i5.


----------



## 7850K

Quote:


> Originally Posted by *Scotty99*
> 
> I know, 2500k boosts to 3.7ghz out of the box same as 1700. 1700 beats 2500k in cinebench single by like 4 points lol.


My 1700 almost never boosts to 3.7, though. Opening programs, playing DX9 games, etc., it will sometimes blip to 3.7 for a split second. This is in the Balanced power plan, where it's supposed to work best, too; in High Performance, boost is disabled entirely.


----------



## ZealotKi11er

Quote:


> Originally Posted by *7850K*
> 
> My 1700 almost never boosts to 3.7 though. Opening programs, playing DX9 games, ect it will sometimes blip for a split second to 3.7. This is in balanced power preference where it's supposed to work best too. in high performance boost is disabled entirely.


In CB it should boost to 3.75GHz for single-core.


----------



## cssorkinman

Quote:


> Originally Posted by *7850K*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scotty99*
> 
> I know, 2500k boosts to 3.7ghz out of the box same as 1700. 1700 beats 2500k in cinebench single by like 4 points lol.
> 
> 
> 
> My 1700 almost never boosts to 3.7 though. Opening programs, playing DX9 games, ect it will sometimes blip for a split second to 3.7. This is in balanced power preference where it's supposed to work best too. in high performance boost is disabled entirely.
Click to expand...

I ran Cinebench R15 in HP mode with all clock-related BIOS settings at auto, and it boosted to 4.1GHz, yielding a 162 single-core score on the 1800X.

Have you tried the HP plan?


----------



## IRobot23

Quote:


> Originally Posted by *renx*
> 
> I've seen Ryzen IPC very close to Kaby Lake on most reviews.
> Single thread performance is another animal, because of the clock differences.


Ryzen does better in some benchmarks (CPU-Z). What's really interesting is SMT:
Ryzen loses against Kaby Lake at 1C/1T, but wins at 1C/2T.


----------



## budgetgamer120

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope. They have never tested IPC. 1800X @ 4.0GHz scores 161 in R15. My 3770K for reference scores 167 at 4.6GHz. Cinebench is actually one of those benchmarks Ryzen does very good.


So Cinebench doesn't count now?


----------



## 7850K

Quote:


> Originally Posted by *cssorkinman*
> 
> I ran cinebech R15 in hp mode with all clock related bios settings at auto and it boosted to 4.1 ghz , yielding 162 single core score on the 1800x,
> 
> Have you tried the hp plan?


Yessir. I had it in HP until I read that Balanced worked better with turbo and default settings, which I'm still at; I noticed better game performance immediately in Balanced. Maybe it's a board/BIOS thing. I have some other 'quirks' on this system no one is talking about or able to help with *shrug* just got to wait a bit, I guess.


----------



## cssorkinman

Quote:


> Originally Posted by *7850K*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> I ran cinebech R15 in hp mode with all clock related bios settings at auto and it boosted to 4.1 ghz , yielding 162 single core score on the 1800x,
> 
> Have you tried the hp plan?
> 
> 
> 
> Yessir, I had it in HP until I read that balanced worked better with turbo and default settings which I'm still at. I noticed better game perofrmance immediately in balanced. maybe it's board/BIOS thing. I have some other 'quirks' on this system no one is talking about or able to help with *shrug* just got to wait a bit I guess
Click to expand...

I'm betting you are correct with the board/BIOS diagnosis.

Lots of things to test and tinker with, and many of those experiments end with more questions than answers at the moment. Very encouraged by the potential, at any rate.


----------



## Kuivamaa

Quote:


> Originally Posted by *cssorkinman*
> 
> I ran cinebech R15 in hp mode with all clock related bios settings at auto and it boosted to 4.1 ghz , yielding 162 single core score on the 1800x,
> 
> Have you tried the hp plan?


HP took my score down to 158, as if XFR wasn't always on. Balanced let me score 162, but it causes other issues with games.


----------



## hawker-gb

So the blue boyz don't count Cinebench anymore.

Back in the day they preached about Cinebench like it was something divine. They probably sacrificed a few lambs at the Cinebench altar before coming to the forums.

Now the new divinity is 480p benches.


----------



## budgetgamer120

Quote:


> Originally Posted by *hawker-gb*
> 
> So blue boyz don't count cinebench anymore.
> 
> Back in days they preach about cinebench like it is some kind of divine. Before they came to forums they sacrifice few lambs in cinebench altar probably.
> 
> Now new divine is 480p benches.


It is all about 720p gaming now, man. 720p is the new 4K.


----------



## SoloCamo

Quote:


> Originally Posted by *IRobot23*
> 
> Basically it does good in many scenarios.
> They only problem is gaming, *many expected that it will be faster than OCed i7 7700K*.


Who are these people? That's not what was said around here at all... in fact, quite the opposite.


----------



## IRobot23

Quote:


> Originally Posted by *budgetgamer120*
> 
> it is all about 720p gaming now man. 720 is the new 4k.


And cherry-picked games. Suddenly Far Cry Primal is used for benchmarking...


----------



## Kuivamaa

Quote:


> Originally Posted by *IRobot23*
> 
> Any cherry picked games. Suddenly Far cry Primal is used for benchmarking.....


It is graphically heavy, for sure. As a matter of fact I will download it right now and check its threading behavior.


----------



## IRobot23

Quote:


> Originally Posted by *Kuivamaa*
> 
> It is graphically heavy, for sure. As a matter of fact I will download it right now and check its threading behavior.


It's basically 1 thread at 100%...


----------



## Malinkadink

Quote:


> Originally Posted by *IRobot23*
> 
> Its basically 1 thread at 100%...


Is it really? I guess that's Ubisoft for you; they'll have something like The Division where it's fine, then Far Cry where it's not even close. Then again, Ubisoft has a bunch of studios all over the place, and a quick Google search shows each game had completely different ones working on it.


----------



## daviejams

Quote:


> Originally Posted by *Kuivamaa*
> 
> It is graphically heavy, for sure. As a matter of fact I will download it right now and check its threading behavior.


I just tried the benchmark: one core at 100%, the other three between 50-70%, mostly around 50%. That's on a Skylake i5 6600K at 4.2GHz, 720p low settings. If I stick it on ultra at 1440p, that one thread is still significantly higher than the rest.

Somebody with an 8-core, test it to see if it scales beyond that.
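For anyone who'd rather log this than eyeball Task Manager, per-core load can be derived from two snapshots of Linux's /proc/stat (a Linux-only sketch; the `per_core_busy` helper name is mine, field layout per proc(5)):

```python
def per_core_busy(snap1, snap2):
    """Percent busy per core between two /proc/stat snapshot strings."""
    def parse(text):
        cores = {}
        for line in text.splitlines():
            # Per-core lines look like "cpu0 user nice system idle iowait ..."
            if line.startswith("cpu") and line[3:4].isdigit():
                name, *vals = line.split()
                vals = [int(v) for v in vals]
                cores[name] = (sum(vals), vals[3] + vals[4])  # total, idle+iowait
        return cores
    a, b = parse(snap1), parse(snap2)
    busy = {}
    for core in a:
        total = b[core][0] - a[core][0]
        idle = b[core][1] - a[core][1]
        busy[core] = 100.0 * (total - idle) / total if total else 0.0
    return busy
```

In practice you'd read /proc/stat, sleep a second, read it again, and pass both strings in; a game thread pinning one core at 100% stands out immediately in the output.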


----------



## ZealotKi11er

Quote:


> Originally Posted by *hawker-gb*
> 
> So blue boyz don't count cinebench anymore.
> 
> Back in days they preach about cinebench like it is some kind of divine. Before they came to forums they sacrifice few lambs in cinebench altar probably.
> 
> Now new divine is 480p benches.


Nope, they do. In single-threaded it's not even a contest; an OCed 2500K will beat an OCed Zen. Yes, Zen tops out with 16T. CB is a best case, and Zen can't beat my 4.6GHz Ivy for single-core. What about the not-so-good cases? Yeah, a lot of people stream, encode, and create content, but I bet more people right now can leverage 4C/8T than 8C/16T. With a 6C/12T chip you sacrifice a bit in some games, gain a lot in multi-threaded workloads, and it's a lot cheaper. I just don't see a reason to get 8C Zen right now unless you actually use the 8 cores. All I'm trying to say is: don't buy into Zen's core count in hopes for the future.


----------



## budgetgamer120

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope they do. In Single Threaded its not even a contentions. A OCed 2500K will beat Zen OC. Yes Zen stops with 16T. CB is best case and Zen cant beat my 4.6GHz IVY for SC. What about the not so good cases? Yeah a lot of people stream and encode and are content creators but I bet you more people right now can leverage 4C/8T than 8C/16T. In case in 6T/12T thread you sacrifice a bit in some games, gain a lot in multi-threaded workload, and is a lot cheaper. I just do not see a reason for getting 8C Zen right now unless you use the 8C. All I am trying to say is do not buy into Zen core count in hopes for the future.


Who cares what more people can leverage... Intel made the 6900K for the same reason AMD made Ryzen 7.

The quad core era is finally ending, and I likes it

The people I know with quads have been decreasing daily since Ryzen, lol. It's like most people have been waiting for the day they can afford more cores.


----------



## ZealotKi11er

Quote:


> Originally Posted by *budgetgamer120*
> 
> Who cares what more people can leverage... Intel made the 6900K for the same reason AMD made Ryzen 7.
> 
> The quad core era is finally ending, and I likes it
> 
> The people I know with quads have been decreasing daily since Ryzen, lol. It's like most people have been waiting for the day they can afford more cores.


How many people buy or need a 6900K? I never told myself I want, let alone need, a 6900K. That time comes only when I see the need for an 8C CPU.


----------



## budgetgamer120

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How many people buy or need 6900K. Never told myself I want let alone need 6900K. That times comes only when I see the need for a 8C CPU.


The question is, who can afford $1000?

And not everyone has the same outlook as you.


----------



## Scotty99

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope they do. In Single Threaded its not even a contentions. A OCed 2500K will beat Zen OC. Yes Zen stops with 16T. CB is best case and Zen cant beat my 4.6GHz IVY for SC. What about the not so good cases? Yeah a lot of people stream and encode and are content creators but I bet you more people right now can leverage 4C/8T than 8C/16T. In case in 6T/12T thread you sacrifice a bit in some games, gain a lot in multi-threaded workload, and is a lot cheaper. I just do not see a reason for getting 8C Zen right now unless you use the 8C. All I am trying to say is do not buy into Zen core count in hopes for the future.


You should put the X chips out of your head when talking about Zen; the X chips only exist... actually, to be honest, I can't come up with a reason for them existing, lol. The 1700 is the chip people on here should be buying; it's only 100 bucks more than the 1600 will be, and that gets you an RGB cooler and 2C/4T more.

It's not a ridiculous price gouge like Intel does on X99 chips when going up in core count; that is why I decided to go with it instead of waiting for the 6-core.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Scotty99*
> 
> You should put the X chips out of your head when talking about Zen; the X chips only exist... actually, to be honest, I can't come up with a reason for them existing, lol. The 1700 is the chip people on here should be buying; it's only 100 bucks more than the 1600 will be, and that gets you an RGB cooler and 2C/4T more.
> 
> It's not a ridiculous price gouge like Intel does on X99 chips when going up in core count; that is why I decided to go with it instead of waiting for the 6-core.


Yeah, but the 1800X is still seen as the representative Zen chip, and so is its price. I too believe the only Zen chips that really matter are the 1400, 1600, and 1700.
Quote:


> Originally Posted by *budgetgamer120*
> 
> The question is, who can afford $1000?
> 
> And not everyone has the same outlook as you.


You don't understand the gaming industry, do you? If the 6900K were like 30% faster in gaming, it would sell a lot more to gamers even at $1000, like the Titan does for GPUs.


----------



## budgetgamer120

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah, but the 1800X is still seen as the representative Zen chip, and so is its price. I too believe the only Zen chips that really matter are the 1400, 1600, and 1700.
> You don't understand the gaming industry, do you? If the 6900K were like 30% faster in gaming, it would sell a lot more to gamers even at $1000, like the Titan does for GPUs.


And you clearly don't understand the PC industry or how processors work; you think everyone only games, and that a processor is less powerful because it gets lower FPS in a game.


----------



## SuperZan

Quote:


> Originally Posted by *Scotty99*
> 
> You should put the x chips out of your head when talking about zen, the x chips only exist .....actually to be honest i cant come up with a reason for them existing lol. The 1700 is the chip people on here should be buying, its only 100 bucks more than the 1600 will be and that gets you a RGB cooler and 2c 4t.
> 
> Its not a ridiculous price gouge like intel does on x99 chips when going up core counts, that is why i decided to go with it instead of waiting and getting the 6 core.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah but 1800X is still sought out as the Zen representation and so its is price. I too bealive rely the only Zen chips are 1400,1600, 1700.
> You do not understand gaming industry do you? If 6900K was like 30% faster in gaming it will sell a lot more to gamers even at $1000 like Titan does for GPU.


It's fairly obvious that the X chips exist because most consumers don't overclock and in the workstation market, where these chips offer mental price/performance, the number of consumers that overclock is even smaller. A hierarchy based on base/boost clocks at different price tiers is thus the only sensible way to offer choices on the price <--> performance continuum.

The non-X chips are gifts to overclockers rather than gating overclocking behind a 'K' paywall.


----------



## DNMock

Quote:


> Originally Posted by *Scotty99*
> 
> You should put the x chips out of your head when talking about zen, the x chips only exist .....actually to be honest i cant come up with a reason for them existing lol. The 1700 is the chip people on here should be buying, its only 100 bucks more than the 1600 will be and that gets you a RGB cooler and 2c 4t.
> 
> Its not a ridiculous price gouge like intel does on x99 chips when going up core counts, that is why i decided to go with it instead of waiting and getting the 6 core.


PCIe lanes and the prosumer market. Easier and cheaper to just take a Xeon chip and tune it for consumer-level usage than to add 20+ PCIe lanes to Core i7 CPUs.


----------



## KarathKasun

Quote:


> Originally Posted by *SuperZan*
> 
> It's fairly obvious that the X chips exist because most consumers don't overclock and in the workstation market, where these chips offer mental price/performance, the number of consumers that overclock is even smaller. A hierarchy based on base/boost clocks at different price tiers is thus the only sensible way to offer choices on the price <--> performance continuum.
> 
> The non-X chips are gifts to overclockers rather than gating overclocking behind a 'K' paywall.


This.

The X SKUs exist for those who don't want to overclock and need the speed, simplicity, and reliability. Read that as: most of the market.

If you are an avid OCer, you should be getting the non-X chips, since the X chips' extra functionality isn't even enabled while overclocking.


----------



## Ceadderman

X is unlocked. Non-X chips are tied down to boost clocks. Check the latest OC records on LN2.

It would be nice if people actually held back before posting about a new architecture; the mainboard companies have had to play catch-up since Ryzen 7 launched.

Oh, and FWIW, I own an 1800X.

~Ceadder


----------



## Ha-Nocri

Ryzen overclocking revisited three weeks later, with new BIOSes and microcode updates: from 4.0 to 4.1GHz, memory stable @ 3200MHz. Good signs.


----------



## 12Cores

1700X with 3600MHz RAM; excellent results below. I really wish these chips could hit 4.5GHz.

https://www.youtube.com/watch?v=RZS2XHcQdqA


----------



## Malinkadink

Quote:


> Originally Posted by *12Cores*
> 
> 1700X with 3600mhz ram, excellent results below. I really wish these chip could hit 4.5ghz.
> 
> https://www.youtube.com/watch?v=RZS2XHcQdqA


Zen+ will get to 4.5GHz, methinks.


----------



## Scotty99

Quote:


> Originally Posted by *12Cores*
> 
> 1700X with 3600mhz ram, excellent results below. I really wish these chip could hit 4.5ghz.
> 
> https://www.youtube.com/watch?v=RZS2XHcQdqA


His results are not correct. Go watch any random Ryzen review and look at the GTA 5 results; even at 2133MHz RAM, his 1700X is way too close to 7700K performance.

It's cool that it's scaling well with memory, but this video is not an accurate portrayal of Ryzen performance in GTA 5.


----------



## Tobiman

Mindblank is using a GTX 1070 and running the games the way people actually do, not the so-called CPU benchmarks.


----------



## Scotty99

Quote:


> Originally Posted by *Tobiman*
> 
> Mindblank is using a GTX 1070 and running the games the way people actually do; not the so called CPU benchmarks.


I trust this guy fully:
https://www.youtube.com/watch?v=yFfcv6H5yxY&t=510s#t=43m41s

1440p with a GTX 1080 (even less of a CPU bottleneck than 1080p); Intel is obliterating Ryzen here, and this falls in line with every review I have seen of GTA 5.

I don't know who "Mindblank" is, but I cannot trust his results when you take into account every other review of GTA 5 on the internet.

Again, I don't doubt Ryzen is scaling well with fast memory; the point I am trying to make here is that his GTA numbers are WAY off.


----------



## Tobiman

Quote:


> Originally Posted by *Scotty99*
> 
> I trust this guy fully:
> https://www.youtube.com/watch?v=yFfcv6H5yxY&t=510s#t=43m41s
> 
> 1440p with a gtx 1080 (even less of a CPU bottleneck than 1080p), intel is obliterating ryzen here and this falls in line with every review i have seen of GTA 5.
> 
> I dont know who "mindblank" is, but i cannot trust his results when you take into account every other review on the internet on GTA 5.
> 
> Again i dont doubt ryzen is scaling well with fast memory, the point i am trying to make here is his GTA numbers are WAY off.


A stock R7 1700 with 2400MHz RAM compared to a 4.7GHz i7 paired with 3200MHz RAM. LOL


----------



## Scotty99

Quote:


> Originally Posted by *Tobiman*
> 
> Stock R7 1700 with 2400mhz ram compared to a 4.7ghz i7 paired with 3200mhz ram. LOL


It's a 3.7GHz all-core overclock with 2400 RAM; that is ages behind the Intel chip.

Now go look at this Mindblank fellow's video: he has 2133 results in there, and it's only 14 FPS behind the 7700K at 1080p. You're telling me 200MHz of additional CPU frequency made up for the 25 FPS average deficit Tech Deals found at 1440p?

You clearly have no idea how to interpret benchmarks, or you would have seen this right away.

And in Mindblank's video it's a 5GHz 7700K... come on now, dude, something is fishy here.

Edit: The reason this bothers me is that I don't want people to think this is accurate GTA 5 performance; whatever this guy did in this video was different from every review of this game I have seen on the net. He needs to redo his GTA 5 testing because it's simply not accurate: he has a 3.9GHz 1700X with 3200 RAM BEATING a 5GHz 7700K with 3200 RAM. I would bet my bank account these results are wrong. I don't know who this person is or what his agenda is, but I highly suggest he run his tests again.


----------



## Tojara

Quote:


> Originally Posted by *Scotty99*
> 
> I trust this guy fully:
> https://www.youtube.com/watch?v=yFfcv6H5yxY&t=510s#t=43m41s
> 
> 1440p with a GTX 1080 (even less of a CPU bottleneck than at 1080p): Intel is obliterating Ryzen here, and this falls in line with every review I have seen of GTA 5.
> 
> I don't know who "Mindblank" is, but I cannot trust his results when you take into account every other review of GTA 5 on the internet.
> 
> Again, I don't doubt Ryzen is scaling well with fast memory; the point I am trying to make here is that his GTA numbers are WAY off.


I wouldn't discount a new Windows patch or GTA patch doing something. GTA is quite well threaded, and it was without a doubt running worse on Ryzen than it should have. If anything, the result of the 1700 @ 3.7 running barely faster than an 8300 @ 4.2 seems odd: the first one has a substantial advantage in core throughput, latency, and thread count, and it's on a better platform.


----------



## Scotty99

Quote:


> Originally Posted by *Tojara*
> 
> I wouldn't discount a new Windows patch or GTA patch doing something. GTA is quite well threaded, and it was without a doubt running worse on Ryzen than it should have. If anything, the result of the 1700 @ 3.7 running barely faster than an 8300 @ 4.2 seems odd: the first one has a substantial advantage in core throughput, latency, and thread count, and it's on a better platform.


Please, I have watched every Ryzen review top to bottom; this guy's GTA results are WAY out of line.

Overclocked Ryzen vs. overclocked 7700k, no review has it even CLOSE, yet this guy has a 3.9GHz Ryzen BEATING a 5GHz 7700k with the same 3200MHz RAM speed?

That alone should tell you guys how wrong his results are.


----------



## Tobiman

The result is definitely an outlier. That much is obvious, but I haven't seen anyone else running 3200MHz CL14 RAM, so I'm willing to give him the benefit of the doubt. Only a very few people have given detailed info on how RAM speed and latency affect gaming performance.


----------



## Scotty99

Don't get me wrong here, I'm excited to see that faster RAM may bring some decent gaming gains with Ryzen, but at the same time you really have to question the whole video given his GTA 5 results.


----------



## Malinkadink

Quote:


> Originally Posted by *Scotty99*
> 
> Don't get me wrong here, I'm excited to see that faster RAM may bring some decent gaming gains with Ryzen, but at the same time you really have to question the whole video given his GTA 5 results.


There was a Windows update, and there was a microcode update that I'm sure a few motherboards have shipped with. Not sure about GTA V, as I haven't booted the game up or bothered to check the patch notes in ages, but I can believe it. I'd still like to see some others test with his configuration/settings.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Scotty99*
> 
> I trust this guy fully:
> https://www.youtube.com/watch?v=yFfcv6H5yxY&t=510s#t=43m41s
> 
> 1440p with a GTX 1080 (even less of a CPU bottleneck than at 1080p): Intel is obliterating Ryzen here, and this falls in line with every review I have seen of GTA 5.
> 
> I don't know who "Mindblank" is, but I cannot trust his results when you take into account every other review of GTA 5 on the internet.
> 
> Again, I don't doubt Ryzen is scaling well with fast memory; the point I am trying to make here is that his GTA numbers are WAY off.


Ryzen does not get "obliterated" by even the 7700K under any circumstances. At most it is around 30% slower (in a few select situations) than the very fastest gaming CPU on the market. In most cases it's around 15% slower (at stock) and much closer to the performance of its intended X79/X99 competitors. All the way around, Ryzen is a very good performer in gaming. Just because it's notably slower against the very fastest CPU there is (which it isn't even an actual competitor to) does not mean it's bad for gaming or getting obliterated in any way.


----------



## randomizer

Quote:


> Originally Posted by *Scotty99*
> 
> Please, I have watched every Ryzen review top to bottom; this guy's GTA results are WAY out of line.
> 
> Overclocked Ryzen vs. overclocked 7700k, no review has it even CLOSE, yet this guy has a 3.9GHz Ryzen BEATING a 5GHz 7700k with the same 3200MHz RAM speed?
> 
> That alone should tell you guys how wrong his results are.


That the results are different does not invalidate them. Maybe the rest are all wrong. You can't draw either conclusion.


----------



## Scotty99

Quote:


> Originally Posted by *randomizer*
> 
> That the results are different does not invalidate them. Maybe the rest are all wrong. You can't draw either conclusion.


I will eat a bucket of poop on livestream if you find me another review of a 3.9ghz ryzen beating a 5ghz 7700k in GTA 5. I will keep this offer open indefinitely.


----------



## FLCLimax

Closing the gap.


----------



## Majin SSJ Eric

I agree it's practically impossible that a 1700X system would actually outperform a 7700K at 5GHz in GTA, but nobody should be expecting it to in the first place. What it WILL do is perform nearly as well as, if not better than, most pre-Skylake Intel CPUs and, more importantly, those on their vastly more expensive X99 platform. The goal for AMD (and all of us PC fans) was simply for AMD to shake off the horrid reputation they've had since Bulldozer and provide us with a competitive platform and CPU that doesn't break the bank. Ryzen has been a massive success in that respect. Before Ryzen, nobody outside of fanboys would claim that the 7700K was the end-all, be-all of CPU performance, but since Ryzen it is somehow the ONLY CPU that matters anymore.


----------



## Scotty99

Hell, a 5GHz Ryzen wouldn't beat a 5GHz 7700k in GTA 5.

What was that kid thinking publishing that video...


----------



## randomizer

Quote:


> Originally Posted by *Scotty99*
> 
> I will eat a bucket of poop on livestream if you find me another review of a 3.9ghz ryzen beating a 5ghz 7700k in GTA 5. I will keep this offer open indefinitely.


Why would I bother to do that? Not only is that a pointless exercise but I also have no interest in watching you munch on faeces.


----------



## Majin SSJ Eric

I am legitimately curious how he achieved those outlier results with his Ryzen. I don't believe he is lying or being shady, but something is going on here. My guess is that something was messed up with his 7700K setup, rather than his Ryzen numbers being off.


----------



## Scotty99

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I am legitimately curious how he achieved those outlier results with his Ryzen. I don't believe he is lying or being shady, but something is going on here. My guess is that something was messed up with his 7700K setup, rather than his Ryzen numbers being off.


That's my hope as well, as he is getting some nice gains from the memory speed increases.

Maybe his 7700k was running at 1.5GHz? lol


----------



## Majin SSJ Eric

Probably a stability issue or a wonky setup. Most importantly, his Ryzen setup is looking really good, and we are starting to get much better performance with improved RAM speed support across the spectrum. Remember that Ryzen as a product is just three weeks old, and people are already seeing much better system behavior overall. Jay had little issue with 4.1GHz and 3200MHz DDR4, and the gains were impressive. I strongly suspect the trend will continue as things mature, and that by the time the R3 launches the platform will have developed significantly. It will still not match Kaby Lake in ultimate single-threaded performance, but the gap will continue to narrow.


----------



## darealist

Mindblank Tech also tested Ryzen at launch: https://www.youtube.com/watch?v=Nfv5aF_GfWg

His results then were similar to many websites', with the 7700k slaughtering Ryzen, which suggests his methodology is legit.

So his new findings should be legit too. Ryzen was clearly bottlenecked by slow RAM speed.


----------



## Scotty99

Quote:


> Originally Posted by *darealist*
> 
> Mindblank Tech also tested Ryzen at launch: https://www.youtube.com/watch?v=Nfv5aF_GfWg
> 
> His results then were similar to many websites', with the 7700k slaughtering Ryzen, which suggests his methodology is legit.
> 
> So his new findings should be legit too. Ryzen was clearly bottlenecked by slow RAM speed.


Umm, are you watching the same video I am? In the one you linked there is only a 9 FPS difference between his Ryzen and the 7700k with both overclocked.

Go watch a bunch of launch videos; the vast majority of them have the 7700k 30 or even 40% ahead of Ryzen in GTA 5.

Listen, GTA 5 is one of the worst-case games for Ryzen. I'm not trying to make this about Ryzen being a bad gaming CPU; I am just questioning this guy's results/methods. Also, if you can't tell from my sig, I am typing this on a Ryzen PC...


----------



## Majin SSJ Eric

Well, it's certainly likely that Ryzen's overall performance is already much better than it was in all of the launch reviews (which themselves were in no way bad), and performance should only get better as the platform matures. It's still only been three weeks...


----------



## Scotty99

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Well, it's certainly likely that Ryzen's overall performance is already much better than it was in all of the launch reviews (which themselves were in no way bad), and performance should only get better as the platform matures. It's still only been three weeks...


Very possible, but the guy above was trying to make the point that this guy also had the 7700k "slaughtering" Ryzen, when in fact he didn't find that at all, lol. He had a 9 FPS difference between Ryzen and the 7700k with both overclocked, a much, much tighter gap than in all the launch videos I watched.


----------



## Brutuz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope, they do. In single-threaded it's not even a contest: an OCed 2500K will beat an OCed Zen. Yes, Zen tops out at 16T. CB is the best case, and Zen can't beat my 4.6GHz Ivy for single-core. What about the not-so-good cases? Yeah, a lot of people stream and encode and are content creators, but I bet you more people right now can leverage 4C/8T than 8C/16T. With a 6C/12T chip you sacrifice a bit in some games, gain a lot in multi-threaded workloads, and it's a lot cheaper. I just do not see a reason for getting an 8C Zen right now unless you actually use the 8 cores. All I am trying to say is: do not buy into Zen's core count in hopes for the future.


*Because they outclock it significantly, and Intel hasn't really increased IPC much over the years.* Zen is at Broadwell levels of IPC, or ~6.8% below Sky/Kaby Lake; the reason your 4.6GHz 3770k wins in single-threaded benchmarks is that its 15% clock advantage over a 4GHz Ryzen makes up for Ryzen's ~13% IPC advantage in CB. The results are exactly where they theoretically should be if Ryzen is just behind Kaby Lake IPC-wise.
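Just to put numbers on that: a quick back-of-the-envelope check of the clock-vs-IPC argument above, using the post's own ~15% clock and ~13% IPC figures (the post's claims, not measured data):

```python
# Sanity check of the clock-vs-IPC trade-off described in the post above.
# The 15%/13% inputs are the post's own claims, not measurements.
ivy_clock = 4.6        # GHz, the overclocked 3770k
ryzen_clock = 4.0      # GHz, the Ryzen it is compared against
ryzen_ipc_lead = 0.13  # claimed Cinebench IPC advantage of Ryzen over Ivy Bridge

clock_advantage = ivy_clock / ryzen_clock - 1
# Net single-thread ratio: Ivy's clock lead divided by Ryzen's IPC lead.
net_ratio = (1 + clock_advantage) / (1 + ryzen_ipc_lead)

print(f"clock advantage: {clock_advantage:.1%}")
print(f"net Ivy/Ryzen single-thread ratio: {net_ratio:.3f}")
```

With those inputs the higher-clocked Ivy chip ends up with roughly a 2% net edge, which is consistent with the "results are exactly where they theoretically should be" reading.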
Quote:


> Originally Posted by *Scotty99*
> 
> Listen GTA 5 is one of the worst case games for ryzen im not trying to make this about ryzen being bad gaming CPU, i am just questioning this guys results/methods. Also if you cant tell from my sig i am typing this on a ryzen pc...


Yet I distinctly remember seeing GTA V reviews where Ryzen had a lower average and max FPS but stayed above 30 FPS at all times, whereas Kaby Lake went down to 24 or so. Most people look at averages and would think Kaby is better, yet while it has the higher average FPS, Ryzen wouldn't have that noticeable frame drop when it dips below 30.


----------



## renx

Quote:


> Originally Posted by *Scotty99*
> 
> Very possible, but the guy above was trying to make the point that this guy also had the 7700k "slaughtering" Ryzen, when in fact he didn't find that at all, lol. He had a 9 FPS difference between Ryzen and the 7700k with both overclocked, a much, much tighter gap than in all the launch videos I watched.


Well maybe he's the one Intel forgot to send the envelope to?


----------



## 364901

Quote:


> Originally Posted by *Scotty99*
> 
> Umm, are you watching the same video I am? In the one you linked there is only a 9 FPS difference between his Ryzen and the 7700k with both overclocked.
> 
> Go watch a bunch of launch videos; the vast majority of them have the 7700k 30 or even 40% ahead of Ryzen in GTA 5.
> 
> Listen, GTA 5 is one of the worst-case games for Ryzen. I'm not trying to make this about Ryzen being a bad gaming CPU; I am just questioning this guy's results/methods. Also, if you can't tell from my sig, I am typing this on a Ryzen PC...


Since he's running a GTX 1070, despite the fact that it's overclocked, he's running up against a GPU bottleneck. A Titan XP or GTX 1080 Ti would show the differences in processor performance more clearly, though I find it interesting how much the minimums and 0.1% FPS numbers jumped with memory clock. DDR4-3200 is certainly a sweet spot in this case for both platforms.


----------



## Shau76434

Quote:


> Originally Posted by *CataclysmZA*
> 
> Since he's running a GTX 1070, despite the fact that it's overclocked, he's running up against a GPU bottleneck. A Titan XP or GTX 1080 Ti would show the differences in processor performance more clearly, though I find it interesting how much the minimums and 0.1% FPS numbers jumped with memory clock. DDR4-3200 is certainly a sweet spot in this case for both platforms.


Wouldn't higher RAM speeds with higher CAS latency be better for a Ryzen platform? I mean, a 3200 CL14 kit costs around the same as a 3600 CL16 kit, and according to that review the CL of a RAM kit doesn't affect Ryzen that much.


----------



## somethingname

Things are looking up if the RAM boost is legit. I will probably grab one if they produce gaming results similar to a stock 7700k. We need more reviews with high-clocked RAM. Giving up 5-10 FPS is a good compromise for a great 8-core multitasking chip. In a few years they will probably refine the architecture and it will clock better.


----------



## gerikoh

Did you guys see this video? https://www.youtube.com/watch?v=RZS2XHcQdqA


----------



## Ceadderman

Against a chip that costs twice as much as the 1800x? I think that's rather p!$$-poor value for the 7700k, tbh. But whatever. I guess when you spend a grand you have to have something to be excited about; otherwise those who own a 7700k would cry like babies.


~Ceadder


----------



## epic1337

Well yeah, LGA2011 still has some things to offer that justify the price gap, namely the extra PCI-E lanes and quad-channel memory with up to 8 DIMMs.
Intel's mainstream LGA1151, on the other hand, has hardly anything better to offer than AMD's AM4, although there the price gap isn't $500+ but more like $100-$200.


----------



## Ceadderman

Ahhh, my bad, I was thinking of the other high-priced i7. I started posting that last night on my mobile when half asleep and decided to finish it off when I woke up.


~Ceadder


----------



## bossie2000

AMD is the only company in the world that can bring out a new CPU and GPU in one month!


----------



## Ceadderman

Unless something happened overnight Vega doesn't launch this month.

~Ceadder


----------



## bossie2000

No man, in April! The R5s and the RX 500 series...


----------



## AcesAndDueces

Lots of people are reporting ridiculous gains with memory speed. Look, most reviewers only tested with either 2133 or 2400 memory. With Ryzen, going from 2400 to 3200 gains you 10-25%, which puts it within 5% of the 7700k in almost all cases. I am not surprised to see it doing really well with 3600 memory. The funny thing is that the Intel trolls keep focusing so much on clock speed that they have completely missed the memory scaling. The guy saying he read every review must have missed the ones that specifically say how close the race is now that people are using faster memory.
http://www.overclock.net/t/1625187/the-ryzen-gaming-performance-gap-is-mostly-gone
https://youtu.be/6xIU2h8YMKc


----------



## Ceadderman

Quote:


> Originally Posted by *bossie2000*
> 
> No man in April ! R5 and R 500...


Unless I miss my guess we're still not in the same month.


~Ceadder


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Ceadderman*
> 
> Unless I miss my guess we're still not in the same month.
> 
> ~Ceadder


Come on man, you know what he means. He's saying the R5s and Vega should release in the same month, not the R7s.


----------



## Ceadderman

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> > Originally Posted by *Ceadderman*
> > 
> > Unless I miss my guess we're still not in the same month.
> > 
> > ~Ceadder
> 
> Come on man, you know what he means. He's saying the R5s and Vega should release in the same month, not the R7s.

Actually, I had no idea the R5 was going to drop next month. I honestly thought he was referring to the R7.


~Ceadder


----------



## Majin SSJ Eric

It's all good brotha, just busting your balls! But really, even if he were talking about the R7, we should see Vega drop pretty close in relation. The point was that AMD is the only company out there that can make both a competitive flagship CPU and GPU for the PC (neither Intel nor Nvidia can claim the same).


----------



## shhek0

Guys, is it confirmed that Zen+ will again use AM4? I mean, I am fairly sure that it will, but I want to confirm. The more important question, though: will the current X370 boards work with Zen+ (the way Z170 and Z270 do, for example)? Thanks!


----------



## Scotty99

Ya, it's confirmed, and Zen 2 will likely be on AM4 as well.


----------



## Majin SSJ Eric

AM4 is not going anywhere for a long time. Zen+, or whatever they call Ryzen's successor, will absolutely be a drop-in replacement for AM4 boards. That said, there are rumors of a pro-level series of SKUs that will be LGA-based, built on the Naples server platform. Obviously those new chips would not be AM4 (if they actually exist).


----------



## shhek0

Thank you. Going to splash some cash on the mobo in this case. I just need to figure out whether I like the C6H or the Taichi more (as well as the colour scheme, etc.). I cannot wait for some more $ in my pocket... going from socket 775 to the R7 1700.


----------



## Scotty99

Quote:


> Originally Posted by *shhek0*
> 
> Thank you. Going to splash some cash on the mobo in this case. I just need to figure out whether I like the C6H or the Taichi more (as well as the colour scheme, etc.). I cannot wait for some more $ in my pocket... going from socket 775 to the R7 1700.


That's gonna be such an upgrade, lol. I went from a 2500k to a 1700; it's weird never having to worry about... well, doing anything on your PC, lol. The other day, because I was bored, I opened up 3 MMOs at once, and CPU usage was only at twenty-something percent.

The best value-for-money motherboard is the ASRock Fatal1ty Gaming K4, hands down; I would have bought it, but it was red. I went for the Killer SLI and I'm really happy with it. It has a lesser audio chipset and a few other missing features like a debug LED, but it does have WiFi, so that's a plus for me (Bluetooth on a desktop comes in handy at times).


----------



## shhek0

Quote:


> Originally Posted by *Scotty99*
> 
> That's gonna be such an upgrade, lol. I went from a 2500k to a 1700; it's weird never having to worry about... well, doing anything on your PC, lol. The other day, because I was bored, I opened up 3 MMOs at once, and CPU usage was only at twenty-something percent.
> 
> The best value-for-money motherboard is the ASRock Fatal1ty Gaming K4, hands down; I would have bought it, but it was red. I went for the Killer SLI and I'm really happy with it. It has a lesser audio chipset and a few other missing features like a debug LED, but it does have WiFi, so that's a plus for me (Bluetooth on a desktop comes in handy at times).


Yeah, that is why the Taichi is great in my eyes: a little less money, WiFi and Bluetooth onboard (I mean, I know that is not expensive stuff, but still), good OC potential, etc. More importantly, both the C6H and the Taichi are not RGB-crazy like some Gigabyte models are... wow, I hope you can switch off some parts of the mobo LEDs :X

A bit off topic, but I am just excited


----------



## budgetgamer120

Quote:


> Originally Posted by *AcesAndDueces*
> 
> Lots of people are reporting ridiculous gains with memory speed. Look, most reviewers only tested with either 2133 or 2400 memory. With Ryzen, going from 2400 to 3200 gains you 10-25%, which puts it within 5% of the 7700k in almost all cases. I am not surprised to see it doing really well with 3600 memory. The funny thing is that the Intel trolls keep focusing so much on clock speed that they have completely missed the memory scaling. The guy saying he read every review must have missed the ones that specifically say how close the race is now that people are using faster memory.
> http://www.overclock.net/t/1625187/the-ryzen-gaming-performance-gap-is-mostly-gone
> https://youtu.be/6xIU2h8YMKc


Please be specific...

Within 5% in games. In everything else the 7700k gets blown away.


----------



## teh-yeti

Quote:


> Originally Posted by *shhek0*
> 
> Yeah, that is why the Taichi is great in my eyes: a little less money, WiFi and Bluetooth onboard (I mean, I know that is not expensive stuff, but still), good OC potential, etc. More importantly, both the C6H and the Taichi are not RGB-crazy like some Gigabyte models are... wow, I hope you can switch off some parts of the mobo LEDs :X
> 
> A bit off topic, but I am just excited


I totally see where you're coming from. I've been waiting on the Taichi to come back in stock on Newegg. As far as the CH6 goes, I don't really see a compelling reason to go for it over the Taichi. Anyone know a good reason to? The Taichi has pretty robust power delivery and is all around a really nice-looking board. The UEFI is the only thing I can think of, and I'm not too impressed by the CH6's from what I've been able to see. Any first-hand experience here?


----------



## AlphaC

Quote:


> Originally Posted by *teh-yeti*
> 
> I totally see where you're coming from. I've been waiting on the Taichi to come back in stock on Newegg. As far as the CH6 goes, I don't really see a compelling reason to go for it over the Taichi. Anyone know a good reason to? The Taichi has pretty robust power delivery and is all around a really nice-looking board. The UEFI is the only thing I can think of, and I'm not too impressed by the CH6's from what I've been able to see. Any first-hand experience here?


There are many users in the CH VI Hero thread over in the AMD motherboard section.

One huge initial advantage the CH VI Hero has is AM3 mounting holes for coolers, until AM4 brackets ship out. Another advantage is the voltage check points & LN2 mode.

It also has more USB ports.


----------



## Ceadderman

Quote:


> Originally Posted by *teh-yeti*
> 
> > Originally Posted by *shhek0*
> > 
> > Yeah, that is why the Taichi is great in my eyes: a little less money, WiFi and Bluetooth onboard (I mean, I know that is not expensive stuff, but still), good OC potential, etc. More importantly, both the C6H and the Taichi are not RGB-crazy like some Gigabyte models are... wow, I hope you can switch off some parts of the mobo LEDs :X
> > 
> > A bit off topic, but I am just excited
> 
> I totally see where you're coming from. I've been waiting on the Taichi to come back in stock on Newegg. As far as the CH6 goes, I don't really see a compelling reason to go for it over the Taichi. Anyone know a good reason to? The Taichi has pretty robust power delivery and is all around a really nice-looking board. The UEFI is the only thing I can think of, and I'm not too impressed by the CH6's from what I've been able to see. Any first-hand experience here?

I went from an 1100T to a Dell XPS 8700 (i7 4790) that I dropped my old 5770 into (inherited that over the summer), and then to the R7 1800x. I skipped FX entirely after building an 8325 system for a client, and my brother got an 8125.

I'm an ASUS guy going way back to the K series, so I am a bit biased, but I say get the C6H. Pro-grade caps and MOSFETs always come with the RoG series. As far as OC'ing goes, they're generally at or near the top with BIOS updates to get me where I need to go for OC'ing potential, and their RAM compatibility updates generally arrive at the push of a button if my RAM is not on their QVL. The only thing holding us back atm is that no board from any manufacturer is running 3200MHz sticks across all 4 slots. No biggie so far as I am concerned, because we didn't get stability over 4 slots in the past when it came to OC'ing either. It should even out eventually because, unlike with Phenom II, it's not the CPU holding us back; rather, the manufacturers haven't caught up with this new architecture. That *will* change. With higher-speed RAM, the potential for higher clocks should be there, according to Guru3D.

So if you're holding off on a high-end board because of a perception of being tied to boost, don't sweat the small stuff. Get what you can afford, to be certain. But I would rather have an unlocked X chip and spend more on the MB now than get a non-X chip and be tied to the boost clock on a cheaper board. In the end it saves me money, because I won't be kicking myself later having to buy the C6H; I will have it already and won't be tempted to pull the trigger a second time on a new MB. And all Ryzen CPUs will drop into AM4. Gotta like that no matter if you pick up an R3 or R5 chip.


~Ceadder


----------



## AcesAndDueces

Quote:


> Originally Posted by *budgetgamer120*
> 
> Please be specific...
> 
> Within 5% in games. In everything else the 7700k gets blown away.


Yes, 5% in most games is what I was speaking of.


----------



## DeathMade

Another pack of numbers from DF.

AotS: 7700k: 42FPS; 1800X: 35.3FPS
Division: 7700k: 133.4FPS; 1800X: 129.4FPS
TW3: 7700k: 139.6FPS; 1800X: 118.4FPS
RoTR: 7700k: 126.4FPS; 1800X: 84.8FPS
FCP: 7700k: 137.9FPS; 1800X: 91.1FPS
ACU: 7700k: 132.5FPS; 1800X: 119.7FPS
Crysis 3: 7700k: 138.0FPS; 1800X: 137.4FPS

Average FPS across all games: 7700k 121.4FPS; 1800X 102.3FPS, which makes the 7700k 18% faster on average than the 1800X. 5/7 games are completely fine, with the 1800X more or less within range of the 7700k, and then there are FCP and RoTR, which have ridiculously low performance and drag this score down greatly; they are most likely going to get patched, like Dota 2, which recently got a Ryzen patch to improve threading performance.

And even with these 2 games affecting the score, the 7700k is "only" 18% faster, while having 12.5% higher clock speeds. Not the 40%-faster nonsense people like to repeat. I don't really understand why everyone is saying Ryzen is terrible in gaming when it's just fine, and with patches and optimisations from devs it's going to get even better.
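For anyone who wants to check the arithmetic, the averages can be recomputed from the per-game numbers (values transcribed from this post):

```python
# Recomputing the DF averages from the per-game numbers quoted above.
# Tuples are (7700k FPS, 1800X FPS).
fps = {
    "AotS":     (42.0, 35.3),
    "Division": (133.4, 129.4),
    "TW3":      (139.6, 118.4),
    "RoTR":     (126.4, 84.8),
    "FCP":      (137.9, 91.1),
    "ACU":      (132.5, 119.7),
    "Crysis 3": (138.0, 137.4),
}

avg_7700k = sum(a for a, _ in fps.values()) / len(fps)
avg_1800x = sum(b for _, b in fps.values()) / len(fps)
lead = avg_7700k / avg_1800x - 1  # the 7700k's average-FPS lead

print(f"7700k: {avg_7700k:.1f} FPS, 1800X: {avg_1800x:.1f} FPS, lead: {lead:.1%}")
```

The lead works out to roughly 18.7%, in line with the 18% figure above.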

Though I am not really that surprised, to be honest, when people make videos like this: https://www.youtube.com/watch?v=L4K7eIEAJx0&t=2s


----------



## ZealotKi11er

Quote:


> Originally Posted by *DeathMade*
> 
> Another pack of numbers from DF.
> 
> AotS: 7700k: 42FPS; 1800X: 35.3FPS
> Division: 7700k: 133.4FPS; 1800X: 129.4FPS
> TW3: 7700k: 139.6FPS; 1800X: 118.4FPS
> RoTR: 7700k: 126.4FPS; 1800X: 84.8FPS
> FCP: 7700k: 137.9FPS; 1800X: 91.1FPS
> ACU: 7700k: 132.5FPS; 1800X: 119.7FPS
> Crysis 3: 7700k: 138.0FPS; 1800X: 137.4FPS
> 
> Average FPS across all games: 7700k 121.4FPS; 1800X 102.3FPS, which makes the 7700k 18% faster on average than the 1800X. 5/7 games are completely fine, with the 1800X more or less within range of the 7700k, and then there are FCP and RoTR, which have ridiculously low performance and drag this score down greatly; they are most likely going to get patched, like Dota 2, which recently got a Ryzen patch to improve threading performance.
> 
> And even with these 2 games affecting the score, the 7700k is "only" 18% faster, while having 12.5% higher clock speeds. Not the 40%-faster nonsense people like to repeat. I don't really understand why everyone is saying Ryzen is terrible in gaming when it's just fine, and with patches and optimisations from devs it's going to get even better.
> 
> Though I am not really that surprised, to be honest, when people make videos like this: https://www.youtube.com/watch?v=L4K7eIEAJx0&t=2s


Just to point out: that 7700K is running slower RAM and can get much more FPS with faster RAM. Also, 18% you say? Don't you know 18% is the difference between this and SB? 18% is 2-3 generations of Intel refinement.


----------



## rexolaboy

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just to point out: that 7700K is running slower RAM and can get much more FPS with faster RAM. Also, 18% you say? Don't you know 18% is the difference between this and SB? 18% is 2-3 generations of Intel refinement.


The 18% figure includes the clock disparity. Clock for clock, Kaby Lake is about 7.5% faster than Ryzen. When people say Kaby is about 20% faster than Sandy Bridge, that's a clock-for-clock IPC gain.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rexolaboy*
> 
> The 18% figure includes the clock disparity. Clock for clock, Kaby Lake is about 7.5% faster than Ryzen. When people say Kaby is about 20% faster than Sandy Bridge, that's a clock-for-clock IPC gain.


It does not matter what clock speed the 7700K runs at; it's all relative. Can Ryzen do 5GHz? 5GHz is a feature.


----------



## Mad Pistol

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It does not matter what clock speed the 7700K runs at; it's all relative. Can Ryzen do 5GHz? 5GHz is a feature.


Not every 7700k can hit 5GHz, so it isn't a feature. It's the silicon lottery.


----------



## Blameless

Quote:


> Originally Posted by *Mad Pistol*
> 
> Not every 7700k can hit 5GHz, so it isn't a feature. It's the silicon lottery.


A ~20% peak stable clock speed advantage is one of Kaby's features vs. Ryzen.


----------



## epic1337

Well, to begin with, The Stilt's analysis puts Kaby Lake at 12.3% faster than Ryzen clock for clock.

https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/

With Kaby Lake also able to overclock higher than Ryzen, the overall IPS (instructions per second) is much higher.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> A ~20% peak stable clock speed advantage is one of Kaby's features vs. Ryzen.


Just like it is for GPUs. I still can't believe people do not get it. We can speculate about whether Ryzen will hit 5GHz with Zen+, but right now it clocks ~1GHz lower than Kaby Lake.


----------



## Mad Pistol

Quote:


> Originally Posted by *epic1337*
> 
> Well, to begin with, The Stilt's analysis puts Kaby Lake at 12.3% faster than Ryzen clock for clock.
> 
> With Kaby Lake also able to overclock higher than Ryzen, the overall IPS (instructions per second) is much higher.


This is fine, but based on an earlier video in this thread, the gap can be minimized through high-clocking memory.

The catch, though, is that getting memory to scale up to 3600MHz on Ryzen is, again, a lottery. If Zen 2 has an improved IMC, we could see parity with Kaby Lake on IPC in many instances.


----------



## epic1337

Quote:


> Originally Posted by *Mad Pistol*
> 
> This is fine, but based on an earlier video in this post, this can be minimized through high-clocked memory.
> 
> The catch, though, is that getting memory to scale up to 3600MHz on Ryzen is, again, the lottery. If Zen 2 has an improved IMC, we could see parity with Kaby Lake on IPC in many instances.


That, and a customizable ratio for the fabric interconnect; having it locked 1:1 with the IMC clock limits its overall effectiveness.
If we could set it to run at 3:2 or even 4:3, we'd be able to increase Ryzen's throughput while still using the same 3600MHz RAM.
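As a sketch of that what-if: the fabric clock is tied to the memory controller clock (half the DDR transfer rate), so the only lever today is faster RAM. The non-1:1 ratios below are the hypothetical settings proposed above, not options any shipping BIOS exposes:

```python
# Ryzen's data fabric clock is locked 1:1 to the memory controller
# clock (MCLK), which is half the DDR transfer rate. Ratios other
# than (1, 1) here are purely hypothetical what-ifs.
def fclk_mhz(ddr_rate, ratio=(1, 1)):
    mclk = ddr_rate / 2          # e.g. DDR4-3600 -> MCLK = 1800 MHz
    num, den = ratio
    return mclk * num / den

print(fclk_mhz(3600))            # 1800.0 MHz at the stock 1:1 ratio
print(fclk_mhz(3600, (3, 2)))    # 2700.0 MHz at a hypothetical 3:2
print(fclk_mhz(3600, (4, 3)))    # 2400.0 MHz at a hypothetical 4:3
```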


----------



## DeathMade

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just to point out that the 7700K is running slower RAM and can get much more fps with faster RAM. Also, 18% you say? Don't you know 18% is the difference between this and SB? 18% is 2-3 generations of Intel refinement.


As I stated in the post, there is a clock difference between the CPUs.

If we compared CPUs the way you suggest, we could say that Intel made literally no progress between SB, IB and HW, since you could clock Sandy Bridge really, really high.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> It does not matter what clock speed the 7700K runs at. It's all relative. Can Ryzen do 5GHz? 5GHz is a feature.


I bet that not even 50% of 7700Ks can do 5GHz. If they all could hit 5GHz, Intel would not set the boost clock 500MHz lower.


----------



## ZealotKi11er

Quote:


> Originally Posted by *DeathMade*
> 
> As I stated in the post, there is a clock difference between the CPUs.
> 
> If we compared CPUs the way you suggest, we could say that Intel made literally no progress between SB, IB and HW, since you could clock Sandy Bridge really, really high.
> I bet that not even 50% of 7700Ks can do 5GHz. If they all could hit 5GHz, Intel would not set the boost clock 500MHz lower.


Wth? Why then was the Core i5 2500K set to 3.7GHz stock when all of those CPUs could do 4.5GHz? Why did Intel not boost it higher? The 7700K does 4.9-5.1GHz just like Zen does 3.9-4.1GHz.


----------



## Blameless

Quote:


> Originally Posted by *Mad Pistol*
> 
> This is fine, but based on an earlier video in this post, this can be minimized through high-clocked memory.
> 
> The catch, though, is that getting memory to scale up to 3600MHz on Ryzen is, again, the lottery. If Zen 2 has an improved IMC, we could see parity with Kaby Lake on IPC in many instances.


The tests used for that chart were all single threaded and thus _extremely_ unlikely to be affected by either data fabric or memory performance.


----------



## epic1337

Quote:


> Originally Posted by *DeathMade*
> 
> If we compared CPUs like you say we could have said that intel made literally no progress between SB, IB, HW since you could clock sandy bridge really, really high.


To be fair, that's what users here are already saying: they're pretty much saying that Intel made _no_ progress.


----------



## Blameless

Quote:


> Originally Posted by *epic1337*
> 
> To be fair, that's what users here are already saying: they're pretty much saying that Intel made _no_ progress.


They are either looking at a narrow spectrum of tasks, or are wrong.

Intel's gains have definitely been evolutionary rather than revolutionary since Nehalem, but there has been steady improvement.


----------



## epic1337

Quote:


> Originally Posted by *Blameless*
> 
> They are either looking at a narrow spectrum of tasks, or are wrong.
> 
> Intel's gains have definitely been evolutionary rather than revolutionary since Nehalem, but there has been steady improvement.


That, and they're also ignoring IGP improvements.

It's not like CPUs matter a lot these days; GPGPU has pretty much taken over most heavy tasks, so improving CPUs is hardly cost effective.
The only easy, tangible benefit you can give CPUs that doesn't cost a lot to implement is new instruction sets; AVX, for example, is revolutionary.

So by Intel's logic, slow bit-by-bit improvement of the CPU while introducing revolutionary instruction sets makes more sense.

AMD used to have a similar target, making a heterogeneous APU that can offload CPU tasks to the IGP. I wonder what happened to that.


----------



## JackCY

Nah, Intel APUs are improving, but just the GPU part; the CPU, not much at all.

Overpriced quad cores, anyone? Want SMT (named HT to create a brand name)? OK, pay +50%, lol. Overclocking? No, no, no, you would burn the CPU, RMA it, and cause us a lot of trouble, so all CPUs are locked. Except the most expensive ones, where we use crappy ToothInterfaceMaterial as well. And dare you find a way around it, we will release updated ucode to remove the feature from all newer UEFIs.
Oh, you want more than 4 cores? OK, shell out +100% on CPU, mobo and RAM.
That's Intel in a nutshell.

Lack of competition is killing the industry.


----------



## DeathMade

Quote:


> Originally Posted by *ZealotKi11er*
> 
> ***? Why then was the Core i5 2500K set to 3.7GHz stock when all of those CPUs could do 4.5GHz? Why did Intel not boost it higher? The 7700K does 4.9-5.1GHz just like Zen does 3.9-4.1GHz.


Because not all of them could do it, which is exactly my point.

I don't really get where you are getting your statistics from, since in most reviews the average OC for the 7700K is 4.8GHz, which corresponds to what people I personally know have achieved with their 7700Ks.

This reminds me of when people compared the 980 Ti to other GPUs and claimed that most 980 Tis hit 1600MHz.


----------



## Mad Pistol

Quote:


> Originally Posted by *epic1337*
> 
> To be fair, that's what users here are already saying: they're pretty much saying that Intel made _no_ progress.


Which is definitely not true. Even my stock i7 4790K beats a highly overclocked i7 2600K.


----------



## DeathMade

Quote:


> Originally Posted by *epic1337*
> 
> To be fair, that's what users here are already saying: they're pretty much saying that Intel made _no_ progress.


I haven't seen that personally. I think Intel made about a 5-10% improvement per gen, didn't they? Even though it's quite disappointing IMO, it's still progress, sort of.

And to be fair, you can often say, "Oh, why would I buy this when I can buy that and OC it?"


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mad Pistol*
> 
> Which is definitely not true. Even my stock i7 4790K beats a highly overclocked i7 2600K.


But your 4790K is highly clocked already.


----------



## Mad Pistol

Quote:


> Originally Posted by *ZealotKi11er*
> 
> But your 4790K is highly clocked already.


What does your 3770K get @ 4.6GHz in CB R15?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mad Pistol*
> 
> What does your 3770k get @ 4.6Ghz on CB R15?


167


----------



## Mad Pistol

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 167


and Multicore?


----------



## Shatun-Bear

Optimizing for Ryzen PDF

Stacked with lots of good info in there.
Quote:


> REASONS TO UPGRADE
> COMPILER
> 
> Year Visual Studio Changes AMD Products
> 
> *2017* Improved code generation of loops: Support for automatic vectorization of division of
> constant integers, better identification of memset patterns. Added Cmake support. Added
> faster database engine. Improved STL & .NET optimizations.
> *"Zen"/"Summit Ridge"*
> 
> *2015* Improved autovectorization & scalar optimizations. Faster build times with
> /LTCG:incremental. Added assembly optimized memset & memcpy using ERMS & SSE2.
> *"Bulldozer"/"Kaveri"*
> 
> *2013* Improved inline. Improved auto-vectorization. Improved ISO C99 language and library. *"Bulldozer"/"Trinity"*
> 
> *2012* Added autovectorization. Optimized container memory sizes. *"Bulldozer"/"Orochi"*
> 
> *2010* Added nullptr keyword. Replaced VCBuild with MSBuild.
> 
> *2008* Tuned for Intel Core microarchitecture. Improved cpuidex & intrinsics. Added
> /Qfast_transcendentals & STL/CLR library. Faster build times with /MP & Managed
> incremental builds. *"Greyhound"*
> 
> *2005* Added x64 native compiler. "*K8"*


----------



## Majin SSJ Eric

Funny how now that Ryzen is out, minimum FPS and 1% lows etc are unimportant and all that matters are avg FPS. Wonder why that would be?


----------



## Mad Pistol

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Funny how now that Ryzen is out, minimum FPS and 1% lows etc are unimportant and all that matters are avg FPS. Wonder why that would be?


I noticed that as well... It is human nature to minimize items that don't necessarily line up with your internal agenda. In this case, Ryzen usually has higher minimums, so people who want Intel to be successful (or want to justify their purchase) naturally focus only on the averages, which are generally higher on the Intel platform.

Averages are important, but low minimums and 1% lows can indicate an issue with stuttering. While experience is largely subjective, I would wager that the gaming experience on Ryzen 8-core systems is generally more consistent, and to some extent... smoother.

Hmmmmmm... where have we heard that one before?


----------



## Nizzen

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Funny how now that Ryzen is out, minimum FPS and 1% lows etc are unimportant and all that matters are avg FPS. Wonder why that would be?


http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7

I can confirm that fps goes up and down more, and min fps is lower, than on my X99 computers.

I have a 1800X @ 4100MHz / 3200 CL14, a 6900K @ 4.4GHz / 3400MHz CL14, and a 6850K / 3200 CL14.


----------



## rage fuury

Sorry if already posted...

Using faster RAM (3200+MHz), the performance gain for Ryzen is rather substantial:

https://www.ht4u.net/reviews/2017/amd_ryzen_7_1800x_im_test/index36.php?dummy=&advancedFilter=false&prod%5B%5D=AMD+Ryzen+7+1800X+%5B8C%2F16T%403%2C6-4%2C1+GHz%5D&prod%5B%5D=AMD+Ryzen+7+1800X+%5B8C%2F16T%40DDR4-3200%5D

+10-15% in games really closes the gap between Ryzen and Kaby Lake, I dare say. And with faster RAM this seems definitely possible!


----------



## Slink3Slyde

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Funny how now that Ryzen is out, minimum FPS and 1% lows etc are unimportant and all that matters are avg FPS. Wonder why that would be?


Quote:


> Originally Posted by *Mad Pistol*
> 
> I noticed that as well... It is in human nature to minimize items that don't necessarily lineup with your internal agenda. In this case, Ryzen usually has higher minimums, so people that want intel to be successful (or to justify their purchase) naturally only focus on the averages, which are generally higher on the intel platform.
> 
> Averages are important, but low minimums and 1% lows can indicate an issue with stuttering. While experience is largely subjective, I would wager that the gaming experience on Ryzen 8 core systems is generally more consistent, and to some extent... smoother.
> 
> Hmmmmmm... where have we heard that one before?


It's the conflicting reports. TechReport showed that Ryzen was actually worse in minimums and 10% lows (although not bad) compared to the 7700K. That was before the whole disabling-SMT-for-select-titles and High Performance power plan thing came about, though.

We have the Mindblank video showing the opposite with 3600 RAM. But we've also had Digital Foundry saying that 3200 RAM showed similar scaling to what happens with Intel chips as far as faster RAM goes, and Techspot testing 16 games showing Ryzen having similar minimum frame times with 3000 RAM as the i5 7600K. We have Eteknix on the other hand showing Ryzen doing better, although to be honest I've never really been a fan of their site.

Some sites are showing Ryzen suffering in certain games, and others are showing those same games not far behind the 7700K, or winning. I discount all the reviews that haven't used at least 2933 RAM since the memory speed issue became apparent. I also have a feeling that it really depends on what games are selected for the test suite, and on what sections within those games are benchmarked.

I do know that using a game's canned benchmark can be misleading as far as CPU benchmarks go; Digital Foundry, I believe, go out of their way to find the most CPU-demanding sections of titles in order to stress the CPU more. This is what makes me trust them currently over some others. The Mindblank video was very encouraging for people looking to game on Ryzen; I'm just still not convinced it's actually faster than the 7700K on average in games. Very interested to see what Digital Foundry make of the R5s, and whether running 3200+ RAM becomes easier for these chips as the platform matures.

Whether Ryzen is faster than a 7700K or not, a 7700K is a lot faster than my old CPU in every situation, so no buyer's remorse here.


----------



## Blameless

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Funny how now that Ryzen is out, minimum FPS and 1% lows etc are unimportant and all that matters are avg FPS. Wonder why that would be?


Absolute minimum FPS has always been of dubious utility, but ~1% lows are probably the most relevant figure there is, when they are available.
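For reference, this is roughly how reviewers derive both figures from a capture of per-frame render times (a minimal sketch; the two sample runs are invented to show why averages alone can hide stutter):

```python
# Average fps vs the "1% low": the fps computed over only the slowest
# 1% of frames. Frame times are in milliseconds.
def fps_metrics(frametimes_ms):
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_fps = 1000 * len(worst_1pct) / sum(worst_1pct)
    return avg_fps, low_fps

# Two invented runs with near-identical averages but different consistency:
smooth = [10.0] * 99 + [12.0]   # minor spike only
spiky = [9.8] * 99 + [30.0]     # one hard stutter
for label, run in (("smooth", smooth), ("spiky", spiky)):
    avg, low = fps_metrics(run)
    print(f"{label}: avg {avg:.0f} fps, 1% low {low:.0f} fps")
```

Both runs average ~100 fps, but the 1% low exposes the stutter that the single worst frame would exaggerate and the average would hide entirely.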


----------



## budgetgamer120

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Funny how now that Ryzen is out, minimum FPS and 1% lows etc are unimportant and all that matters are avg FPS. Wonder why that would be?


Lots of things don't matter anymore now that AMD is back.


----------



## Majin SSJ Eric

Like Cinebench...


----------



## Mad Pistol

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Like Cinebench...


Yeah... it seems to be a pretty sweet selling point that Ryzen's 8-core CPUs are beating the 6900K pretty consistently in CB R15.


----------



## IRobot23

Quote:


> Originally Posted by *epic1337*
> 
> Well, to begin with, The Stilt's analysis puts Kaby Lake at 12.3% faster than Ryzen clock for clock.
> 
> https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/
> 
> 
> With Kaby Lake also able to overclock higher than Ryzen, the overall IPS (instructions per second) is much higher.


It depends on the tasks you are benchmarking. The Stilt uses his own method, and I have to say some of the benchmarks are pretty arbitrary. If I ran my own arbitrary benchmarks, I could show that Ryzen is faster.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mad Pistol*
> 
> and Multicore?


837


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 837


My 2600K at 4.5GHz (1600MHz memory) got 794.


----------



## Mad Pistol

Quote:


> Originally Posted by *Mad Pistol*
> 
> What does your 3770k get @ 4.6Ghz on CB R15?


Quote:


> Originally Posted by *ZealotKi11er*
> 
> 167


Quote:


> Originally Posted by *Mad Pistol*
> 
> and Multicore?


Quote:


> Originally Posted by *ZealotKi11er*
> 
> 837


So a quick run @ stock on my i7 4790K (no optimization) gives me 166 single-core and 865 multi-core in Cinebench R15. Yes, it is clocked higher than a stock 3770K, but it is still clocked lower than your overclocked 3770K.

That means there was an improvement from Ivy to Haswell, and Skylake/Kaby Lake have even better IPC.

Not sure how people can say there was NO improvement over the last several generations of Intel CPUs... and part of that improvement has been higher stock-clocked parts.
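Dividing those scores by clock gives a crude per-clock comparison (the 4790K's 4.4GHz single-core turbo is my assumption for the stock run, and Cinebench is only one workload, so treat this as a rough proxy for IPC):

```python
# Single-core Cinebench R15 score per GHz as a rough IPC proxy.
# Scores are the ones posted in this thread; clocks as noted.
results = {
    "i7-3770K @ 4.6GHz (OC)": (167, 4.6),
    "i7-4790K @ 4.4GHz (stock turbo, assumed)": (166, 4.4),
}
for name, (score, ghz) in results.items():
    print(f"{name}: {score / ghz:.1f} pts/GHz")
# Ivy -> Haswell works out to roughly a 4% per-clock gain in this test.
```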


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mad Pistol*
> 
> So a quick run @ stock on my i7 4790k (no optimization) gives me 166 on single and 865 in multi in Cinebench R15.
> 
> Considering that my i7 4790k is actually clocked lower than your 4.6Ghz 3770k, that means there was an improvement from Ivy to Haswell. Sky/Kaby Lake has even better IPC.
> 
> Not sure how people can say there was NO improvement over the last several generations of Intel CPUs.


There is for sure. It is just not that much from SB to Ivy to Haswell, unless it's AVX. Skylake's and Kaby Lake's main boost comes from DDR4.


----------



## STEvil

So are the reviews using dual-rank or single-rank memory?

Dual-rank 1T vs. single-rank 2T, or vice versa, may be producing many of the differences.


----------



## AcesAndDueces

I don't necessarily think the argument is that Intel has made no improvement. It's that they purposely made less improvement than they could have, to milk the market. Would/do other companies do that given a lack of competition? Sure. But it's frowned upon by consumers for sure.


----------



## Mad Pistol

Quote:


> Originally Posted by *AcesAndDueces*
> 
> I don't necessarily think the argument is that Intel has made no improvement. It's that they purposely made less than they could have to milk the market. Would/do other companies do that with lack of competition? Sure. But frowned upon by consumers for sure.


If that's truly the case, we will begin seeing bigger jumps in performance over the coming years, especially since AMD now has a competitive product.

Honestly, I would be surprised if Intel actually has a more efficient architecture on the horizon. If Intel actually had something, they would have anticipated AMD and launched something far more competitive than Kaby Lake.

My gut tells me that Kaby Lake, at the moment, is the best that Intel has.


----------



## budgetgamer120

Quote:


> Originally Posted by *AcesAndDueces*
> 
> I don't necessarily think the argument is that Intel has made no improvement. It's that they purposely made less than they could have to milk the market. Would/do other companies do that with lack of competition? Sure. But frowned upon by consumers for sure.


Correct.


----------



## Kuivamaa

Quote:


> Originally Posted by *epic1337*
> 
> Well, to begin with, The Stilt's analysis puts Kaby Lake at 12.3% faster than Ryzen clock for clock.
> 
> https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/
> 
> 
> With Kaby Lake also able to overclock higher than Ryzen, the overall IPS (instructions per second) is much higher.


For these particular workloads, yes. Add something like encryption into the mix and the tables turn. BW and SR will trade blows, but in the end they are very comparable in IPC.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mad Pistol*
> 
> If that's truly the case, we will begin seeing bigger jumps in performance over the coming years, especially since AMD now has a competitive product.
> 
> Honestly, I would be surprised if Intel actually has a more efficient architecture on the horizon. If Intel actually had something, they would have anticipated AMD and launched something far more competitive than Kaby Lake.
> 
> My gut tells me that Kaby Lake, at the moment, is the best that Intel has.


I think so too. Intel could use a more efficient and faster CPU for mobile, but they can't make it.


----------



## AcesAndDueces

It has less to do with them "having something" already and more to do with them slacking on R&D, content to spend as little as possible to get a minor increase in order to justify it as an "upgrade". Intel is a super dirty company on many levels, from stealing AMD's tech in the '90s to beat them, to bribing companies not to use AMD, to suing Nvidia over the nForce motherboards. Yup, for those who haven't been around for years like me: Nvidia used to make the best motherboards, until Intel sued them claiming a patent on the LGA sockets. Sadly Intel's mobos sucked at the time and the industry went backwards a bit. AMD had no problem with nForce mobos using their sockets, but sadly nForce only lasted a couple more years on AMD alone. Both of these claims had lawsuits attached to them, so you can research them easily.


----------



## powerincarnate

Quote:


> Originally Posted by *Mad Pistol*
> 
> If that's truly the case, we will begin seeing bigger jumps in performance over the coming years, especially since AMD now has a competitive product.
> 
> Honestly, I would be surprised if Intel actually has a more efficient architecture on the horizon. If Intel actually had something, they would have anticipated AMD and launched something far more competitive than Kaby Lake.
> 
> My gut tells me that Kaby Lake, at the moment, is the best that Intel has.


As someone else mentioned, it's not that Intel has something way better than Kaby Lake; it's that they never had to push themselves to really innovate, since they held a wide lead over the competition. This had negative effects in two ways: one, it never forced them to innovate, as I mentioned; and two, it meant that processors from as far back as the 2500K were still quite competitive in gaming with the top-of-the-line Kaby Lake 7700K, so people didn't have to upgrade their processor. This depressed the desktop PC marketplace big time and made it even more important for these companies to focus on mobile stuff like iPads/tablets, laptops, and phone SoCs. If you noticed, that is the sector where all of the innovation happened in the past 5-7 years.

AMD, however, never really got into the mobile space. Recently their GPUs lagged behind Nvidia, and for some 5-8 years their CPUs lagged big time behind Intel. Both of these forced AMD to sink more money into R&D, even rehiring old AMD engineers, and the result: Vega is supposedly going to compete well against the 1080 Ti (I couldn't wait, so I got myself a 1080 Ti), and Ryzen is now more than competitive with Intel's HEDT market. And reportedly a 16-core version that will cost $1000 is coming in the next few months, which, if true, will likely compete very well with, if not surpass, the 6950X at a price point that is $700 cheaper.

Competition is very good for the marketplace. If Intel is not ready to blow us away and needs more time for R&D to develop something truly magnificent, then we will at least get a price war, lest they risk losing significant market share. Because quite frankly, with Ryzen 7 and the reported 16-core Ryzen, Broadwell is truly obsolete. I hope Skylake-X does some amazing things.


----------



## Ceadderman

IIRC the increase in IPC from one Intel release to the next is a relatively small gain, something like a 6-12% increase in performance, whereas Ryzen 7's performance increase over FX was a whopping 50%.

Some games scale better with R7 and some worse, depending on how many cores the game takes advantage of. 1 or 2 cores won't see much of an fps increase; with more than 2, the fps will fare better.

At least this is my take from everything I have read.

~Ceadder


----------



## powerincarnate

The "50%" jump of Ryzen vs. FX was simply because FX was gimped and essentially didn't improve much since the original Bulldozer. So essentially you are looking at a 2017 PC vs. a PC from five or so years ago.

If you tally up Intel's incremental upgrades from, say, the Core i7 2600K or 2700K from 2011 and put one up against the 6950X or the 7700K at stock, no overclock, it too will get creamed. The only reasons some of these processors are still competitive in games are, A, that you overclock the 2600K to 4.2GHz or so, and B, that game companies still aren't designing games for many-core processors. Many of these companies had to scale back their games for the PS3 and 360 days, and even with the new systems they are still scaling things back big time instead of pushing the envelope with code that takes advantage of 6-8 core, 12-16 thread CPUs.

It's great that AMD is back. AMD on the GPU front stumbled a few times against Nvidia, but unlike their CPU division they have been competitive on multiple occasions in the past 5-6 years, and it meant the GPU market has consistently improved and innovated over those years. So I'm very happy AMD is back in the CPU market.


----------



## ZealotKi11er

Quote:


> Originally Posted by *powerincarnate*
> 
> the "50%" jump of ryzen vs the FX was simply because the FX was gimped and essentially didn't improve much since the original bulldozer. So essentially you are looking at a 2017 PC vs a PC from 5 or so years ago.
> 
> If you tally up the incremental upgrades from Intel from say core I7 2600 or 2700K from 2011 and put it up against the 6950x or the 7700K, at stock, no overclock it too will get creamed. The only reason some of these processors are still competitive in games is because A. You overclock the 2600K to 4.2 ghz or so, AND B. because game companies still aren't designing games for multicore processors. many of these companies had to scale back their game for the PS3 and 360 days before, and now even with the new system, is still scaling things back big time instead of trying to push the envelope with codes that will take advantage of 6-8 core 12-16 thread CPUs.
> 
> It's great that AMD is back. Now AMD on the GPU front stumbled a few times against nVidia, but unlike their CPU division, they have been competitive against their competition on multiple occasions in the past 5-6 years, and it meant that the GPU market have consistently improved and innovated in these past 6 years. So I'm very Happy AMD is back in the CPU market.


Well, Bulldozer's IPC is even less than PII's, which probably puts it at PI IPC level. That's less IPC than Core 2 (2006), so 10 years of IPC improvement.


----------



## KarathKasun

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Well, Bulldozer's IPC is even less than PII's, which probably puts it at PI IPC level. That's less IPC than Core 2 (2006), so 10 years of IPC improvement.


BD is almost EXACTLY at C2D IPC. It's around the same as PII for most things. I have done testing to confirm this.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KarathKasun*
> 
> BD is almost EXACTLY at C2D IPC. Its around the same as PII for most things. I have done testing to confirm this.


Which is still valid, as C2D is ~2006 IPC level.


----------



## tpi2007

Quote:


> Originally Posted by *Mad Pistol*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AcesAndDueces*
> 
> I don't necessarily think the argument is that Intel has made no improvement. It's that they purposely made less than they could have to milk the market. Would/do other companies do that with lack of competition? Sure. But frowned upon by consumers for sure.
> 
> 
> 
> If that's truly the case, we will begin seeing bigger jumps in performance over the coming years, especially since AMD now has a competitive product.
> 
> Honestly, I would be surprised if Intel actually has a more efficient architecture on the horizon. If Intel actually had something, they would have anticipated AMD and launched something far more competitive than Kaby Lake.
> 
> My gut tells me that Kaby Lake, at the moment, is the best that Intel has.

And it's not bad, it's just priced too high. Well, they are planning on releasing a Skylake Xeon Gold with 1 MB of L2 cache per core, and from what we've seen with Broadwell's 128 MB L4 cache, their arch certainly has some performance and smoothness juice left that they can explore if they feel the need.

For now things seem calm, but come next month, I don't know; it seems that a big part of Intel's lineup stops making sense at the current pricing. Their only 3 CPUs that make sense are the following:

1. Pentium G4560 - AMD isn't going there for now, and for $64 it offers compelling performance, even though it will be outdated pretty fast, like in a year. Still, for the price, it's cheap and cheerful all the way. Plus, you can upgrade to the more powerful (and most probably way cheaper by then) i7-7700K;

2. Core i7-7700K - it may be priced too high, but it still has a relevant number of threads, which, combined with its high clockspeed potential, puts it in a unique spot;

3. Core i7-6950X - because there is nothing like it. Huge premium, but no match, so it makes sense if you want the glory or need the multithreaded performance (and still get the glory as bonus).

The rest is diminishing returns and dubious propositions, and the current value will depend on your workload. For example, a 6900K will make more sense if your usage scenarios favour AVX2, high memory bandwidth and lots of PCIe lanes, but the huge price difference may make the math more complicated, especially if you can't take advantage of everything the Intel CPU and platform have to offer. The 6800K is a dubious proposition; the line-up would have more prestige as the premium platform if Intel simply discontinued it and put the faster-clocked, full-PCIe-lane 6850K in its place.

And then in the mainstream segment, with AMD releasing the hexacores and quad cores next month, things will get complicated there fast for Intel. The Ryzen 5 1600X costs the same as the 7600K and the 1600 (non X) even less, so it's arguably a very strong choice, even for a strict gaming rig, I would say. 4 threads will show limitations that won't be overcome with overclocking and it will only get worse as time goes by, whereas the AMD hexacores will be able to overcome the IPC deficit with more cores and threads and will have a lot of headroom for the future. The i5 7600K is tomorrow's i3 and should cost at most $179 (incidentally the price of the i3-7350K).

And then we have the whole Core i3 line-up. None of it makes sense at $117 - $179 with the Pentium G4560 around at $64, and certainly the i3-7350K has its days numbered at its ridiculous price when you compare it with the Ryzen 5 1500X. There is no argument here. 2C/4T at 5GHz won't be able to beat a Ryzen 5 4C/8T at 4GHz. The smoothness and the higher minimums will be on the Ryzen side. The i3 would be a worthwhile and fun chip to buy and overclock, but Intel sucked all the fun out of it by pricing it at a level where, in order to not feel completely bad about the purchase, it only makes strict financial sense if you will absolutely try to get 5GHz out of it on day one. It's borderline trying to make ends meet to begin to justify the price. Not very compelling nor fun if you're practically obliged to do it. Intel would have done better by delivering a 2C/4T unlocked Pentium without AVX and all the extra goodies that the i3's have, but at least for under $100 it would have been a compelling buy.

And then, of course, the locked Core i5's will have the desirability of a low $100's i3. Those will probably be the biggest victims in all of this. If you can buy a Ryzen 5 1500X, a 4C/8T CPU for $179, why would anybody buy a $182+ 4C/4T locked CPU?

AMD's lineup seems very compelling. They may not be moving the absolute performance bar up, but they are redefining what you should expect at each price point.


----------



## powerincarnate

Quote:


> Originally Posted by *tpi2007*
> 
> And it's not bad, its just priced too high. Well, they are planning on releasing a Skylake Xeon Gold with 1 MB of L2 cache per core, and from what we've seen with the Broadwell's 128 MB L4 cache, their arch certainly has some performance and smoothness juice left that they can explore if they feel the need.
> 
> For now things seem calm, but coming next month, I don't know, but it seems that a big part of Intel's lineup stops making sense at the current pricing. Their only 3 CPUs that make sense are the
> 
> 1. Pentium G4560 - AMD isn't going there for now, and for $64 it offers compelling performance, even though it will be outdated pretty fast, like in a year. Still, for the price, it's cheap and cheerful all the way. Plus, you can upgrade to the more powerful (and most probably way cheaper by then) i7-7700K;
> 
> 2. Core i7-7700K - it may be priced too high, but it still has a relevant number of threads, which, combined with its high clockspeed potential, puts it in a unique spot;
> 
> 3. Core i7-6950X - because there is nothing like it. Huge premium, but no match, so it makes sense if you want the glory or need the multithreaded performance (and still get the glory as bonus).
> 
> The rest is diminishing returns and dubious propositions and the current value will depend on your workload. For example, a 6900K will make more sense if your usage scenarios favour AVX2, high memory bandwidth and lots of PCIe lanes, but the huge price difference may make the math more complicated, especially if you can't take advantage of everything that the Intel CPU and platform have to offer. The 6800K is a dubious proposition, with the line-up having more prestige as the premium platform if Intel simply discontinued it and put the fully PCIe lanes featured and faster clocked 6850K in its place.
> 
> And then in the mainstream segment, with AMD releasing the hexacores and quad cores next month, things will get complicated there fast for Intel. The Ryzen 5 1600X costs the same as the 7600K and the 1600 (non X) even less, so it's arguably a very strong choice, even for a strict gaming rig, I would say. 4 threads will show limitations that won't be overcome with overclocking and it will only get worse as time goes by, whereas the AMD hexacores will be able to overcome the IPC deficit with more cores and threads and will have a lot of headroom for the future. The i5 7600K is tomorrow's i3 and should cost at most $179 (incidentally the price of the i3-7350K).
> 
> And then we have the whole Core i3 line-up. None of it makes sense at $117 - $179 with the Pentium G4560 around at $64, and certainly the i3-7350K has its days numbered at its ridiculous price when you compare it with the Ryzen 5 1500X. There is no argument here: 2C/4T at 5 GHz won't be able to beat a Ryzen 5 4C/8T at 4 GHz. The smoothness and the higher minimums will be on the Ryzen side. The i3 would be a worthwhile and fun chip to buy and overclock, but Intel sucked all the fun out of it by pricing it at a level where, in order to not feel completely bad about the purchase, it only makes strict financial sense if you will absolutely try to get 5 GHz out of it on day one. It's borderline trying to make ends meet to begin to justify the price. Not very compelling nor fun if you're practically obliged to do it. Intel would have done better by delivering a 2C/4T unlocked Pentium without AVX and all the extra goodies that the i3's have; at least for under $100 it would have been a compelling buy.
> 
> And then, of course, the locked Core i5's will have the desirability of a low $100's i3. Those will probably be the biggest victims in all of this. If you can buy a Ryzen 5 1500X, a 4C/8T CPU for $179, why would anybody buy a $182+ 4C/4T locked CPU?
> 
> AMD's lineup seems very compelling. They may not be moving the absolute performance bar up, but they are redefining what you should expect at each price point.


The 7700K is not too expensive. It's $299 at Microcenter: $299 for the fastest-clocked, most advanced non-HEDT processor there is. In the old days the top-of-the-line stuff would be much more expensive than this. It is, quite literally, bar none, the best gaming CPU.

As the other poster said, the rest of the lineup is fine, just priced too high. The 6950X should be like $1200, the 6900K $700, the 6850K $450, the 6800K $300. With the 6900K at $700, Intel would be able to say: hey, we beat the 1800X in most benchmarks, we are better than they are at gaming, and our motherboard and memory compatibility is much more extensive, feature-packed, and stable. This alone would at least be a reason to spend the extra $200. But even that may be stretching it. At this point, I'm waiting to see what the 16-core chip can do and what Skylake-X is all about. Until then, I'll stick to my 3820K.


----------



## Ceadderman

MicroCenter can only be found at a few locations. Newegg has a larger online presence than they do. Then there is Amazon and PCPartPicker too.









~Ceadder


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Mad Pistol*
> 
> So a quick run @ stock on my i7 4790k (no optimization) gives me 166 on single and 865 in multi in Cinebench R15. Yes, it is clocked higher than a stock 3770k, but it is still lower clocked than your OC on your 3770k.
> 
> That means there was an improvement from Ivy to Haswell. Sky/Kaby Lake has even better IPC.
> 
> Not sure how people can say there was NO improvement over the last several generations of Intel CPUs... and part of that improvement has been higher stock-clocked parts.


Well, my 6-year-old 2600K gets 794 in CB at a modest 4.5 GHz, so it's not like it was that much of an increase to the 4790K really. I don't think anybody has said that there was no improvement over the years from Intel's mainstream processors, but the increases have been very minimal (at least until SL). Of course the absolute failure of the construction cores from AMD had everything to do with that (and the fact that SB was so good on release that Intel saw little reason to provide massive improvements from that base).


----------



## tpi2007

Quote:


> Originally Posted by *powerincarnate*
> 
> The 7700K is not too expensive. It's $299 at Microcenter: $299 for the fastest-clocked, most advanced non-HEDT processor there is. In the old days the top-of-the-line stuff would be much more expensive than this. It is, quite literally, bar none, the best gaming CPU.
> 
> As the other poster said, the rest of the lineup is fine, just priced too high. The 6950X should be like $1200, the 6900K $700, the 6850K $450, the 6800K $300. With the 6900K at $700, Intel would be able to say: hey, we beat the 1800X in most benchmarks, we are better than they are at gaming, and our motherboard and memory compatibility is much more extensive, feature-packed, and stable. This alone would at least be a reason to spend the extra $200. But even that may be stretching it. At this point, I'm waiting to see what the 16-core chip can do and what Skylake-X is all about. Until then, I'll stick to my 3820K.


As Ceadderman said above, that is an in-store pickup price, and it's not official nor in effect in other places. Intel's recommended price range for the i7-7700K is $339-$350, as you can read on their Ark site.

And even at $299, when the 1600X and 1600 are released, it will arguably still be too expensive to be the clear choice. Until now you could still make the argument of the Microcenter price vs the more expensive Ryzen 7s. But how will that pan out when, in gaming, the $229 Ryzen 5 1600 overclocked to 4 GHz does as well as the 1800X, 1700X and 1700? That's a very compelling CPU for a very compelling price. How much of a premium can Intel charge? Some, for sure, but for one, that $299 price tag should become official everywhere first, at least.

Just a small correction at the end, you mean 3820. Or 4820K.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *powerincarnate*
> 
> The 7700K is not too expensive. It's $299 at Microcenter: $299 for the fastest-clocked, most advanced non-HEDT processor there is. In the old days the top-of-the-line stuff would be much more expensive than this. It is, quite literally, bar none, the best gaming CPU.
> 
> As the other poster said, the rest of the lineup is fine, just priced too high. The 6950X should be like $1200, the 6900K $700, the 6850K $450, the 6800K $300. With the 6900K at $700, Intel would be able to say: hey, we beat the 1800X in most benchmarks, we are better than they are at gaming, and our motherboard and memory compatibility is much more extensive, feature-packed, and stable. *This alone would at least be a reason to spend the extra $200.* But even that may be stretching it. At this point, I'm waiting to see what the 16-core chip can do and what Skylake-X is all about. Until then, I'll stick to my 3820K.


Actually, you can get a 1700 that will go blow for blow with the 6900K for only $339. That is the reason Ryzen is so disruptive in the HEDT market. All of the focused comparisons to the 7700K are dubious to me because the 8C/16T Ryzen chips were quite obviously designed to tackle X99 workloads, not quad cores, and the comparisons between Ryzen and X99 chips are very close indeed.

I just find it funny that a year or two ago X79/X99 chips were lauded as remarkable products but now that Ryzen has come out and is close to parity with those same chips, somehow the 7700K is the only thing anybody can talk about.


----------



## Arturo.Zise

Quote:


> Originally Posted by *tpi2007*
> 
> AMD's lineup seems very compelling. They may not be moving the absolute performance bar up, but they are redefining what you should expect at each price point.


Exactly. Outside of the 7700k and the 6950x, it's hard to recommend any Intel CPU right now vs Ryzen. The value for money just isn't there.


----------



## Majin SSJ Eric

Well there is still epeen. Even I'll admit that having a 6900K in the sig is sexier than a 1700 but that's not worth $700 to me anymore. If the 6900K were to be dropped to $499 or $599 I'd probably go with it but Ryzen is just too good a deal to pass on with Intel's current pricing.


----------



## SuperZan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Well there is still epeen. Even I'll admit that having a 6900K in the sig is sexier than a 1700 but that's not worth $700 to me anymore. If the 6900K were to be dropped to $499 or $599 I'd probably go with it but Ryzen is just too good a deal to pass on with Intel's current pricing.


1800*X* + *X*370 + (R*X* or GT*X*) = sexiest sig confirmed.


----------



## teh-yeti

Quote:


> Originally Posted by *tpi2007*
> 
> As Ceadderman said above, that is an in-store pickup price, and it's not official nor in effect in other places. Intel's recommended price range for the i7-7700K is $339-$350, as you can read on their Ark site.
> 
> And even at $299, when the 1600X and 1600 are released, it will arguably still be too expensive to be the clear choice. Until now you could still make the argument of the Microcenter price vs the more expensive Ryzen 7s. But how will that pan out when, in gaming, the $229 Ryzen 5 1600 overclocked to 4 GHz does as well as the 1800X, 1700X and 1700? That's a very compelling CPU for a very compelling price. How much of a premium can Intel charge? Some, for sure, but for one, that $299 price tag should become official everywhere first, at least.
> 
> Just a small correction at the end, you mean 3820. Or 4820K.


Once again, agreed. Not everyone is lucky enough to live even within 300 miles of a Microcenter. Even then, for most of my PC gamer friends, and even graphic design friends in uni who want to do gaming on the side, $340 is far too steep a price to pay. As mentioned earlier, the price point most of them are willing to look at is the locked i5 line, with even a 7600K looking too expensive, especially since it requires a more expensive Z-series motherboard to get "all of the functionality" with overclocking. AMD is trying to redefine price points, and I for one hope that they do revolutionize it.


----------



## AlphaC

Quote:


> Originally Posted by *powerincarnate*
> 
> The 7700K is not too expensive. It's $299 at Microcenter: $299 for the fastest-clocked, most advanced non-HEDT processor there is. In the old days the top-of-the-line stuff would be much more expensive than this. It is, quite literally, bar none, the best gaming CPU.
> 
> As the other poster said, the rest of the lineup is fine, just priced too high. The 6950X should be like $1200, the 6900K $700, the 6850K $450, the 6800K $300. With the 6900K at $700, Intel would be able to say: hey, we beat the 1800X in most benchmarks, we are better than they are at gaming, and our motherboard and memory compatibility is much more extensive, feature-packed, and stable. This alone would at least be a reason to spend the extra $200. But even that may be stretching it. At this point, I'm waiting to see what the 16-core chip can do and what Skylake-X is all about. Until then, I'll stick to my 3820K.


That's only true *today*: the i7-7700K is the best gaming CPU overall. Keep in mind LGA1151 is a dead socket and Z170/Z270 is required for overclocking.

Ryzen 7 1700X has been $350 at Microcenter before, with a $50 discount on motherboards. Ryzen 7 1700 has been $280-310 before.

When there's only one compelling CPU on the entire socket (unless everything you run uses two cores or fewer), there is going to be a huge problem for Intel's lineup. Every locked i5 is in jeopardy, and all the i3s.

Also, quad-channel memory and the extra PCIe lanes separate Intel HEDT from AMD's Ryzen 7 and the Ryzen 5 hexacores. With Ryzen 7 out for less than a month, I think it is unfair to say X99 has better memory support when kits are labeled specifically for X99 / Z170. When Corsair/Kingston/Crucial/Team/Patriot/etc. get their act together maybe this will improve, but for now the only AMD-labeled kits are the G.Skill Flare X & Fortis.

AMD has a chance in mainstream PCs this time, much like the Phenom II X6 had a chance before it was butchered for Bulldozer. I don't think a future Ryzen 3 can be a compelling purchase, though, unless it is an APU with one CCX.


----------



## Slink3Slyde

Quote:


> Originally Posted by *STEvil*
> 
> So are the reviews using dual-rank or single rank memory?
> 
> dual-rank 1t vs single rank 2t or visa versa may be producing many of the differences.


Interesting thought. As I understand it, it's the speed of the Infinity Fabric connecting the two Ryzen CCXs, which is tied directly to the RAM clock speed (rather than the RAM's bandwidth or latency itself), that is limiting Ryzen in games. So whenever two threads that are not on the same CCX need to communicate, there's a latency penalty.
This mainly seems to affect gaming, as the load shifts around a lot more between cores, as opposed to many productivity tasks, which stay more in place.
Quote:


> Originally Posted by *tpi2007*
> 
> And it's not bad, it's just priced too high. Well, they are planning on releasing a Skylake Xeon Gold with 1 MB of L2 cache per core, and from what we've seen with the Broadwell's 128 MB L4 cache, their arch certainly has some performance and smoothness juice left that they can explore if they feel the need.
> 
> For now things seem calm, but coming next month, I don't know, but it seems that a big part of Intel's lineup stops making sense at the current pricing. Their only 3 CPUs that make sense are the
> 
> 1. Pentium G4560 - AMD isn't going there for now, and for $64 it offers compelling performance, even though it will be outdated pretty fast, like in a year. Still, for the price, it's cheap and cheerful all the way. Plus, you can upgrade to the more powerful (and most probably way cheaper by then) i7-7700K;
> 
> 2. Core i7-7700K - it may be priced too high, but it still has a relevant number of threads, which, combined with its high clockspeed potential, puts it in a unique spot;
> 
> 3. Core i7-6950X - because there is nothing like it. Huge premium, but no match, so it makes sense if you want the glory or need the multithreaded performance (and still get the glory as bonus).
> 
> The rest is diminishing returns and dubious propositions, and the current value will depend on your workload. For example, a 6900K will make more sense if your usage scenarios favour AVX2, high memory bandwidth and lots of PCIe lanes, but the huge price difference may make the math more complicated, especially if you can't take advantage of everything that the Intel CPU and platform have to offer. The 6800K is a dubious proposition; the line-up would have more prestige as the premium platform if Intel simply discontinued it and put the faster-clocked 6850K, with its full complement of PCIe lanes, in its place.
> 
> And then in the mainstream segment, with AMD releasing the hexacores and quad cores next month, things will get complicated there fast for Intel. The Ryzen 5 1600X costs the same as the 7600K and the 1600 (non X) even less, so it's arguably a very strong choice, even for a strict gaming rig, I would say. 4 threads will show limitations that won't be overcome with overclocking and it will only get worse as time goes by, whereas the AMD hexacores will be able to overcome the IPC deficit with more cores and threads and will have a lot of headroom for the future. The i5 7600K is tomorrow's i3 and should cost at most $179 (incidentally the price of the i3-7350K).
> 
> And then we have the whole Core i3 line-up. None of it makes sense at $117 - $179 with the Pentium G4560 around at $64, and certainly the i3-7350K has its days numbered at its ridiculous price when you compare it with the Ryzen 5 1500X. There is no argument here: 2C/4T at 5 GHz won't be able to beat a Ryzen 5 4C/8T at 4 GHz. The smoothness and the higher minimums will be on the Ryzen side. The i3 would be a worthwhile and fun chip to buy and overclock, but Intel sucked all the fun out of it by pricing it at a level where, in order to not feel completely bad about the purchase, it only makes strict financial sense if you will absolutely try to get 5 GHz out of it on day one. It's borderline trying to make ends meet to begin to justify the price. Not very compelling nor fun if you're practically obliged to do it. Intel would have done better by delivering a 2C/4T unlocked Pentium without AVX and all the extra goodies that the i3's have; at least for under $100 it would have been a compelling buy.
> 
> And then, of course, the locked Core i5's will have the desirability of a low $100's i3. Those will probably be the biggest victims in all of this. If you can buy a Ryzen 5 1500X, a 4C/8T CPU for $179, why would anybody buy a $182+ 4C/4T locked CPU?
> 
> AMD's lineup seems very compelling. They may not be moving the absolute performance bar up, but they are redefining what you should expect at each price point.


Good summary. Agreed.
Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Actually, you can get a 1700 that will go blow for blow with the 6900K for only $339. That is the reason Ryzen is so disruptive in the HEDT market. All of the focused comparisons to the 7700K are dubious to me because the 8C/16T Ryzen chips were quite obviously designed to tackle X99 workloads, not quad cores, and the comparisons between Ryzen and X99 chips are very close indeed.
> 
> I just find it funny that a year or two ago X79/X99 chips were lauded as remarkable products but now that Ryzen has come out and is close to parity with those same chips, somehow the 7700K is the only thing anybody can talk about.


In my case, because I'm a gamer, I want to know definitively what is the best gaming CPU I can recommend, and reviews are conflicting on this point. I've stated quite a few times that as a value proposition X99 has been killed stone dead, unless you have money to burn and need the extra PCI-E lanes or quad-channel memory for whatever you do. If you read the article in my sig you can see that I'm not suggesting average frame rates are more important at all; it's that absolute minimum reports are misleading and that 1% and 10% lows are more important. Mindblank appears to show Ryzen performing far better than other sources, which is why I'm interested to see the reviews of Ryzen 5, particularly from DF and TechReport, which will have frame-time analysis with the new Windows updates and some of the other wrinkles ironed out as well.


----------



## KarathKasun

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Which is still valid as C2D ~ 2006 IPC level.


Overall performance is still higher because of much higher clocks, which seems to be what the goal was. It just was not enough of a delta to make up for the C2D -> Nehalem jump, outside of having more cores. Right tech at the wrong time; all of the delays made it DOA for most use cases.

If AMD could have pulled an Intel P4-like clockspeed increase (over 2.3x over its lifespan) things may have been different, but it was already known that we were at a switching-speed barrier of ~4 GHz for consumer-targeted SKUs at 45nm (cooling and power-consumption limits). AMD was overly optimistic about switching-speed increases for 32nm and beyond, and they paid for that in a big way.


----------



## Oubadah

..


----------



## epic1337

Quote:


> Originally Posted by *Slink3Slyde*
> 
> In my case because Im a gamer, and I want to know definitively what is the best gaming CPU I can recommend, and reviews are conflicting on this point. Ive stated quite a few times that as a value proposition X99 has been killed stone dead. Unless you have money to burn and need the extra PCI-E lanes or quad channel memory for whatever you do. If you read the article in my sig you can see that Im not suggesting that average frame rates are more important at all, its that absolute minimum reports are misleading and that 1% and 10% lows are more important. Mindblank appears to show Ryzen performing far better then other sources, thats why Im interested to see the reviews of Ryzen 5, particularly from DF and Techreport which will have frame time analysis with the new Windows updates and some of the other wrinkles ironed out as well.


Well, for the most part the 1% and 10% lows are heavily biased towards the slowest points of the processor;
that is to say, every dip in processing speed can affect the 1% and 10% results drastically.
So Ryzen's issues with fabric latency, lower clock speed and the like put it at a slight disadvantage.

On the other hand, the 1% and 10% lows only tell one part of the "smoothness" of the framerate;
it's the "swing" between minimum and maximum framerate that causes stutters and the like.

So we need a new test that doesn't just involve the 1% and 10% lows, but also the delta from a frame-to-frame latency test.
That is to say, if one frame takes 8ms to render while the next frame takes 20ms, then the delta latency is 12ms.
Obviously, the bigger the delta latency, the more obvious the stutter; it'd show "smoothness" better than just the 1% and 10% tests.
So we need peak-delta and average-delta latency from a frame-to-frame latency test.
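As a rough sketch of the metrics described above (assuming the input is a list of per-frame render times in milliseconds, e.g. from a FRAPS/PresentMon-style log; `frame_metrics` is just an illustrative name, not any benchmark tool's actual API):

```python
# A minimal sketch: 1%/10% low FPS plus peak and average frame-to-frame
# delta latency, computed from a list of frame times in milliseconds.

def frame_metrics(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 / (sum(frame_times_ms) / n)

    # "1% low" / "10% low": average FPS over the slowest fraction of
    # frames, i.e. the frames with the largest render times.
    worst_first = sorted(frame_times_ms, reverse=True)

    def low_fps(fraction):
        k = max(1, int(n * fraction))
        return 1000.0 / (sum(worst_first[:k]) / k)

    # Frame-to-frame delta latency: |t[i+1] - t[i]| between consecutive
    # frames; an 8 ms frame followed by a 20 ms frame is a 12 ms delta.
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

    return {
        "avg_fps": avg_fps,
        "low_1pct_fps": low_fps(0.01),
        "low_10pct_fps": low_fps(0.10),
        "peak_delta_ms": max(deltas),
        "avg_delta_ms": sum(deltas) / len(deltas),
    }
```

For example, a run that is mostly 10 ms frames with a couple of spikes will still show a healthy average FPS but a noticeable peak delta, which is exactly the stutter the averages hide.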

Still, with how little the 1600X and 1600 will cost, they're pretty much going to take the perf/$ flag out of Intel's hands.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *SuperZan*
> 
> 1800*X* + *X*370 + (R*X* or GT*X*) = sexiest sig confirmed.


Gimme all the X's!!! Hell, I need to get my 3960X back, what was I thinking!?!?


----------



## Liranan

Quote:


> Originally Posted by *tpi2007*
> 
> And it's not bad, it's just priced too high. Well, they are planning on releasing a Skylake Xeon Gold with 1 MB of L2 cache per core, and from what we've seen with the Broadwell's 128 MB L4 cache, their arch certainly has some performance and smoothness juice left that they can explore if they feel the need.
> 
> For now things seem calm, but coming next month, I don't know, but it seems that a big part of Intel's lineup stops making sense at the current pricing. Their only 3 CPUs that make sense are the
> 
> 1. Pentium G4560 - AMD isn't going there for now, and for $64 it offers compelling performance, even though it will be outdated pretty fast, like in a year. Still, for the price, it's cheap and cheerful all the way. Plus, you can upgrade to the more powerful (and most probably way cheaper by then) i7-7700K;
> 
> 2. Core i7-7700K - it may be priced too high, but it still has a relevant number of threads, which, combined with its high clockspeed potential, puts it in a unique spot;
> 
> 3. Core i7-6950X - because there is nothing like it. Huge premium, but no match, so it makes sense if you want the glory or need the multithreaded performance (and still get the glory as bonus).
> 
> The rest is diminishing returns and dubious propositions, and the current value will depend on your workload. For example, a 6900K will make more sense if your usage scenarios favour AVX2, high memory bandwidth and lots of PCIe lanes, but the huge price difference may make the math more complicated, especially if you can't take advantage of everything that the Intel CPU and platform have to offer. The 6800K is a dubious proposition; the line-up would have more prestige as the premium platform if Intel simply discontinued it and put the faster-clocked 6850K, with its full complement of PCIe lanes, in its place.
> 
> And then in the mainstream segment, with AMD releasing the hexacores and quad cores next month, things will get complicated there fast for Intel. The Ryzen 5 1600X costs the same as the 7600K and the 1600 (non X) even less, so it's arguably a very strong choice, even for a strict gaming rig, I would say. 4 threads will show limitations that won't be overcome with overclocking and it will only get worse as time goes by, whereas the AMD hexacores will be able to overcome the IPC deficit with more cores and threads and will have a lot of headroom for the future. The i5 7600K is tomorrow's i3 and should cost at most $179 (incidentally the price of the i3-7350K).
> 
> And then we have the whole Core i3 line-up. None of it makes sense at $117 - $179 with the Pentium G4560 around at $64, and certainly the i3-7350K has its days numbered at its ridiculous price when you compare it with the Ryzen 5 1500X. There is no argument here: 2C/4T at 5 GHz won't be able to beat a Ryzen 5 4C/8T at 4 GHz. The smoothness and the higher minimums will be on the Ryzen side. The i3 would be a worthwhile and fun chip to buy and overclock, but Intel sucked all the fun out of it by pricing it at a level where, in order to not feel completely bad about the purchase, it only makes strict financial sense if you will absolutely try to get 5 GHz out of it on day one. It's borderline trying to make ends meet to begin to justify the price. Not very compelling nor fun if you're practically obliged to do it. Intel would have done better by delivering a 2C/4T unlocked Pentium without AVX and all the extra goodies that the i3's have; at least for under $100 it would have been a compelling buy.
> 
> And then, of course, the locked Core i5's will have the desirability of a low $100's i3. Those will probably be the biggest victims in all of this. If you can buy a Ryzen 5 1500X, a 4C/8T CPU for $179, why would anybody buy a $182+ 4C/4T locked CPU?
> 
> AMD's lineup seems very compelling. They may not be moving the absolute performance bar up, but they are redefining what you should expect at each price point.


Excellent summary. What has me excited is Zen 2 next year. If it clocks at 4.5 to 5 GHz and has 10% higher IPC it will be an amazing chip. Intel are having a really hard time competing with Zen now; next year will be even harder unless they make another leap like they did from NetBurst to Core 2, but I don't see that happening.


----------



## randomizer

Quote:


> Originally Posted by *Liranan*
> 
> Excellent summary. What has me excited is Zen 2 next year. If it clocks at 4.5 to 5 GHz and has 10% higher IPC it will be an amazing chip. Intel are having a really hard time competing with Zen now; next year will be even harder unless they make another leap like they did from NetBurst to Core 2, but I don't see that happening.


I don't think Intel are going to have a hard time competing against AMD in anything but charts.


----------



## Slink3Slyde

Quote:


> Originally Posted by *epic1337*
> 
> Well, for the most part the 1% and 10% lows are heavily biased towards the slowest points of the processor;
> that is to say, every dip in processing speed can affect the 1% and 10% results drastically.
> *So Ryzen's issues with fabric latency, lower clock speed and the like put it at a slight disadvantage.*
> 
> And for the most part, the 1% and 10% lows only tell one part of the "smoothness" of the framerate;
> it's the "swing" between minimum and maximum framerate that causes stutters and the like.
> 
> Still, with how little the 1600X and 1600 will cost, they're pretty much going to take the perf/$ flag.


So when a CPU has a one-off dip to a very low frame rate it could be a blip, yet it would still look terrible in a graph, whereas the lowest % of overall minimum frames gives a much better idea of the overall smoothness of the experience. I'm not convinced a swing from 144 FPS to 100 FPS (if, for example, there was a big explosion suddenly causing lots of physics) is going to be too noticeable unless you are watching a frame counter, but a big dip for a few seconds to half the frame rate would be more of a problem in an action game. Personally I game at 60 FPS, so what concerns me most are dips in CPU-heavy gaming situations that really aren't often benchmarked. For example, in TW: Warhammer in a huge battle on my old i5 you could be tanking down to 20 FPS in a worst-case scenario; there's no CPU out today that can bring those situations up to over 60 FPS, but if you're talking a possible difference between, say, 20 and 30, it helps.

The bolded part is what is in question to my mind, because now we have an interesting, seemingly legit review from Mindblank on YouTube showing Ryzen with 3600 RAM (and possibly the latest Windows update helping) outperforming the 7700K in the games he tested, contrary to other reviews updated after release day using higher-speed RAM. This could be fantastic news! But forgive me for not swallowing it whole, as it hasn't been confirmed anywhere else yet.

My other question would be, is 3600 a reasonable speed for Ryzen to achieve on average, or is it going to be a unicorn, akin to getting the ridiculously hot 7700K to >5 GHz without delidding it?

Just seen your edit
Quote:


> so we need a new test that doesn't just involve 1% and 10%, but also the delta from a frame-to-frame latency test.
> that is to say, if one frame takes 8ms to render while the next frame takes 20ms, then the delta latency is 12ms.
> obviously, the bigger the delta latency the more obvious the stutter would be, it'd show "smoothness" better than just the 1% and 10% tests.
> so we need to have peak-delta and average-delta latency from a frame-to-frame latency test.


Interesting thought, but I think if something going on on-screen causes a huge dip, any CPU is going to dip to some degree. So, running a high-refresh monitor, if CPU A is dropping to 50 FPS and CPU B to 60, I think we can see from the 10% and 1% lows which is performing better in that case. I wouldn't say that if CPU A was running at 80 FPS and dropped to 50, and CPU B was at 144 FPS and dropped to 60, CPU A is providing the better overall experience, as a chart like you put forward might suggest.


----------



## epic1337

Quote:


> Originally Posted by *Slink3Slyde*
> 
> So when a CPU has a one-off dip to a very low frame rate it could be a blip, yet it would still look terrible in a graph, whereas the lowest % of overall minimum frames gives a much better idea of the overall smoothness of the experience. I'm not convinced a swing from 144 FPS to 100 FPS (if, for example, there was a big explosion suddenly causing lots of physics) is going to be too noticeable unless you are watching a frame counter, but a big dip for a few seconds to half the frame rate would be more of a problem in an action game. Personally I game at 60 FPS, so what concerns me most are dips in CPU-heavy gaming situations that really aren't often benchmarked. For example, in TW: Warhammer in a huge battle on my old i5 you could be tanking down to 20 FPS in a worst-case scenario; there's no CPU out today that can bring those situations up to over 60 FPS, but if you're talking a possible difference between, say, 20 and 30, it helps.
> 
> The bolded part is what is in question to my mind, because now we have an interesting, seemingly legit review from Mindblank on YouTube showing Ryzen with 3600 RAM (and possibly the latest Windows update helping) outperforming the 7700K in the games he tested, contrary to other reviews updated after release day using higher-speed RAM. This could be fantastic news!
> 
> My other question would be, is 3600 a reasonable speed for Ryzen to achieve on average, or is it going to be a unicorn, akin to getting the ridiculously hot 7700K to >5 GHz without delidding it?


It might not be too good, based on the Ryzen RAM OC reviews so far; the best assumption is around 3400 MHz, and 3600 MHz might be the upper end of the limit.

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Just seen your edit
> Interesting thought, but I think if something is going on on screen that's causes a huge dip, any CPU is going to cause a dip of some sort. So running a high refresh monitor if CPU A is dropping to 50 FPS and CPU B to 60 I think we can see from the 10% and 1% lows which is performing better in that case. I wouldn't say if CPU A was running at 80 FPS and dropped to 50 and CPU B was at 144 FPS and dropped to 60 that CPU A is providing the better overall experience as a chart like you put forward might suggest.


It might; the delta latency between frames can result in an actual frame-skip on the screen.

Edit: I found Ryan's explanation at AnandTech to be an accurate description of what I was implying.
http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/6
Quote:


> In this metric, which for the moment we're calling Delta Percentages, we're collecting the deltas (differences) between frametimes, averaging that out, and then running the delta average against the average frametime of the entire run. The end result of this process is that we can measure whether sequential frames are rendering in roughly the same amount of time, while controlling for performance differences by looking at the data relative to the average frametime (rather than as absolute time).
> 
> In general, a properly behaving single-GPU card should have a delta average of under 3%, with the specific value depending in part on how variable the workload is throughout any given game benchmark. 3% may sound small, but since we're talking about an average it means it's weighed against the entire run. The higher the percentage the more unevenly frames are arriving, and exceeding 3% is about where we expect players with good eyes to start noticing a difference. Alternatively in a perfectly frame metered situation, such as v-sync enabled with a setup that can always hit 60fps, then this would be a flat 0%, representing the pinnacle of smoothness.
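Ryan's Delta Percentage metric quoted above is straightforward to reproduce from any frametime log; a minimal sketch in Python (assuming a plain list of frametimes in milliseconds):

```python
def delta_percentage(frametimes_ms):
    """Average frame-to-frame delta as a percentage of the mean frametime.

    Per the description above, under ~3% sequential frames are arriving
    fairly evenly; higher values mean increasingly uneven frame pacing.
    """
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    avg_delta = sum(deltas) / len(deltas)
    avg_frametime = sum(frametimes_ms) / len(frametimes_ms)
    return 100.0 * avg_delta / avg_frametime

# A perfectly metered 60 fps run (every frame 16.67 ms) scores 0%
print(delta_percentage([16.67] * 10))  # 0.0
```

Dividing by the average frametime is what makes the number comparable across fast and slow cards: it measures unevenness relative to each run's own pace rather than in absolute milliseconds.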


----------



## Slink3Slyde

Quote:


> Originally Posted by *epic1337*
> 
> It might not be too good based on the Ryzen RAM OC reviews so far; the best assumption is around 3400 MHz, with 3600 MHz at the upper end of the limit.


Then we also have to take into account (I often forget as well) that for gaming most people want to set and forget, and not too many outside of sites like these will overclock. So if, say, 3200 XMP is as far as most chips go without adding voltage or messing with timings, and the i7 7700K's price competitor, the Ryzen 1700, only clocks to 3.5 across all cores at stock, then that is also something that needs to be considered.

Also still in doubt IMO is how much faster RAM is affecting gaming performance for Ryzen. It clearly helps, but I'm not convinced yet that it's on the scale Mindblank is showing until I see more, especially considering what I've seen from other sites. If lots of kits at XMP 3600 or even more start to turn up on motherboard QVLs then it's a big plus anyway, and for overclockers, if it turns out that the fastest RAM really brings Ryzen's effective gaming performance even closer to, or beyond, a faster-clocked quad i7, it's great news for Ryzen in games.
Quote:


> Originally Posted by *epic1337*
> 
> It might; the delta latency between frames can result in an actual frame-skip on the screen.
> For example, a sequence of a 6.94 ms (144 FPS) frame followed by a 16.67 ms (60 FPS) frame would cause two frame-skips on a 120 Hz monitor; the delta is 9.73 ms.
> Another example: a sequence of a 16.67 ms (60 FPS) frame followed by a 25 ms (40 FPS) frame would cause one frame-skip on a 120 Hz monitor; the delta is 8.33 ms.
> 
> Whereas FreeSync or G-Sync make the frame-skips less obvious; or rather, there won't be frame skips but actual frame-latency gaps instead.


OK, then that's something I didn't know. I'd imagine though that the CPU showing lower 1% minimums would show bigger swings for the most part; it would be an interesting test to see. Edit: AnandTech's lack of a gaming review for Ryzen was strange to me. I've been keeping an eye out for a future review from them; maybe they'll do it with the Ryzen 5.

I also don't discount that Ryzen's extra cores may come in useful for games in the future, but I'm not sure it will be a huge issue versus a similarly priced quad i7 within the (IMO reasonable) 3-5 year lifespan of the processor.
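The frame-skip arithmetic epic1337 quotes can be approximated by counting how many refresh intervals a frame spans on a fixed-refresh monitor. A rough sketch, assuming frames align with the refresh cycle (real skip counts also depend on where in the cycle a frame lands, which is why the figures below differ slightly from the quoted examples):

```python
import math

def skipped_refreshes(frametime_ms, refresh_hz):
    """Count refreshes that redisplay the old frame while a new one renders.

    A frame spanning N refresh intervals forces the monitor to repeat
    the previous image N-1 times; each repeat is one 'frame-skip'.
    """
    intervals_spanned = math.ceil(frametime_ms * refresh_hz / 1000.0)
    return max(0, intervals_spanned - 1)

# A 25 ms (40 FPS) frame on a 120 Hz panel (8.33 ms per refresh)
print(skipped_refreshes(25.0, 120))  # 2 repeats under this simple model
print(skipped_refreshes(8.0, 120))   # 0 -- faster than one refresh interval
```

FreeSync and G-Sync sidestep this entirely by delaying the refresh until the frame is ready, which is why they trade visible skips for variable frame latency as described above.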


----------



## epic1337

The relevance of 8C/16T is too far off into the future; 6C/12T would be your best bet.

4C/8T on the other hand is still superb as of this time, but is starting to show its age with recent titles supporting more threads.
That is to say, despite the 7700K being a mere 4C/8T processor, it can somewhat keep up with or even exceed 6C/12T and 8C/16T processors from both Intel and AMD.


----------



## Brutuz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just to point out that the 7700K is running slower RAM and can get much more fps with faster RAM. Also, 18% you say? Don't you know 18% is the difference between this and SB? 18% is 2-3 generations of Intel refinement.


After AMD failed to compete.

Before AMD failed to compete, that's what Intel would increase speeds by every 6 months, with a combo of IPC gains and slowly increasing MHz. If you look at the raw numbers, only one game is below 60 fps on Ryzen, and it's also below 60 on the 7700K. 18% isn't really all that much, hence why people who have ~18% lower performance than a 7700K (or even more than that) fail to notice it every single day. That said, I want to see a review of the two chips running at max OC with the fastest RAM they can take, showing minimum, average and peak fps across a bunch of games from 1080p to 4K and in between. That's the easiest way to see which is best for the games you play, because it covers most usage situations and shows any hidden differences, like Ryzen having higher minimum fps in some reviews.
Quote:


> Originally Posted by *rexolaboy*
> 
> 18% faster includes clock disparity. Clock for clock, Kaby Lake is about 7.5% faster than Ryzen. When people say Kaby is about 20% faster than Sandy Bridge, that's clock-for-clock IPC gain.


But SB and Kaby have similar OCing headroom, so it all ends up being the same thing in the end, even if it's a combo of IPC and clocks that makes the difference for Ryzen, but pure IPC for SB.


----------



## Slink3Slyde

Quote:


> Originally Posted by *epic1337*
> 
> The relevance of 8C/16T is too far off into the future; 6C/12T would be your best bet.
> 
> 4C/8T on the other hand is still superb as of this time, but is starting to show its age with recent titles supporting more threads.
> That is to say, despite the 7700K being a mere 4C/8T processor, it can somewhat keep up with or even exceed 6C/12T and 8C/16T processors from both Intel and AMD.


The Ryzen 5 is going to be the best value gaming CPU for sure; it balances good IPC with a couple of extra cores for cheaper than either the R7s or i7s. Everything else is for the crystal ball right now; if it was only about having more cores, us gamers could just buy FX-8350s or low-clocked second-hand Xeons and be content forever.


----------



## ZealotKi11er

So apparently CCX latency is not the factor in why Zen's gaming performance is not up to par with Intel's.

https://www.youtube.com/watch?v=Rhj6CvBnwNk&t=0s


----------



## Ha-Nocri

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So apparently CCX latency is not the factor on why Zen gaming performance is not up to par with Intel.
> 
> https://www.youtube.com/watch?v=Rhj6CvBnwNk&t=0s


Saw that. It must be something that Ryzen is not as good at as Intel, and some games really use that feature (or features) - Far Cry Primal, Tomb Raider, ...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Saw that. It must be something that Ryzen is not as good at as Intel, and some games really use that feature (or features) - Far Cry Primal, Tomb Raider, ...


I have a feeling it's probably something addressable. Those games are in the past. I am sure games that are in development now will account for Zen and not experience those problems.


----------



## Damn_Smooth

Maybe it is simply because games are optimized for Intel processors just like AMD said. Who would've thought? Guess we'll have to wait to see some engine changes to see if they help.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Damn_Smooth*
> 
> Maybe it is simply because games are optimized for Intel processors just like AMD said. Who would've thought? Guess we'll have to wait to see some engine changes to see if they help.


Only problem with his benchmark is the BF1 testing. They just can't seem to learn that BF1 CPU testing is all about multiplayer with 64 players. Looking at this CPU usage graph, the 4C Zen is not getting more than 70% utilized, yet is pulling less fps than the 7600K. I know from my testing in BF1 MP with a 3770K that I can get the CPU to 95-99% on all cores at 1080p. This is where I want to see what Zen has to offer.


----------



## 7850K

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Only problem with his benchmark is the BF1 testing. They just can't seem to learn that BF1 CPU testing is all about multiplayer with 64 players. Looking at this CPU usage graph, the 4C Zen is not getting more than 70% utilized, yet is pulling less fps than the 7600K. I know from my testing in BF1 MP with a 3770K that I can get the CPU to 95-99% on all cores at 1080p. This is where I want to see what Zen has to offer.


http://www.overclock.net/t/1625300/zolkorn-r7-vs-i7-core-for-core-clock-for-clock/320#post_25957543


----------



## cssorkinman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Damn_Smooth*
> 
> Maybe it is simply because games are optimized for Intel processors just like AMD said. Who would've thought? Guess we'll have to wait to see some engine changes to see if they help.
> 
> 
> 
> Only problem with his benchmark is the BF1 testing. They just can't seem to learn that BF1 CPU testing is all about multiplayer with 64 players. Looking at this CPU usage graph, the 4C Zen is not getting more than 70% utilized, yet is pulling less fps than the 7600K. I know from my testing in BF1 MP with a 3770K that I can get the CPU to 95-99% on all cores at 1080p. This is where I want to see what Zen has to offer.
Click to expand...

I should have my son jump into the same server as me in BF1 - 64-player map - him on the 4790K / 290X and me on the 1800X / Fury at low settings, and graph the fps for both.


----------



## Slink3Slyde

Isn't Frostbite one of those engines that scales like crazy with RAM on Intel in MP as well? I seem to remember back when Linus was telling everyone that RAM speed didn't matter at all, people around here were doing some testing showing that there were definite benefits to faster RAM, in at least a couple of threads. Edit: Specifically in BF4, I mean.


----------



## ZealotKi11er

Quote:


> Originally Posted by *cssorkinman*
> 
> I should have my son jump into the same server as me in BF1 - 64-player map - him on the 4790K / 290X and me on the 1800X / Fury at low settings, and graph the fps for both.


Yeah try that. My 3770K was CPU limited at 1080p with both 290X and GTX1080 so different GPU should not be a problem.


----------



## Mahigan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Only problem with his benchmark is the BF1 testing. They just can't seem to learn that BF1 CPU testing is all about multiplayer with 64 players. Looking at this CPU usage graph, the 4C Zen is not getting more than 70% utilized, yet is pulling less fps than the 7600K. I know from my testing in BF1 MP with a 3770K that I can get the CPU to 95-99% on all cores at 1080p. This is where I want to see what Zen has to offer.


Well, if it helps any... performance is pretty much the same for me going from a 3930K @ 4.5 GHz to a Ryzen 1800X at stock speeds. Of course, I'm GPU bottlenecked. I run BF1 under DX12 because I get better performance that way (go figure) when playing 64-player MP. I play at 1600p on a single R9 290X (the other one doesn't work under DX12, and yes, I get better single-GPU DX12 performance than Crossfire DX11 performance).

I'm sure there is a slight performance difference but the FPS are in the same range. I also play on Ultra and I disabled GPU memory restriction.

I think I'm in need of a GPU upgrade. That is coming when Vega is released. At that point I'll choose between Vega and Pascal. Point being though... I don't think my GPU can test for a CPU bottleneck... because I'm way too GPU bottlenecked at this point.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mahigan*
> 
> Well, if it helps any... performance is pretty much the same for me going from a 3930K @ 4.5 GHz to a Ryzen 1800X at stock speeds. Of course, I'm GPU bottlenecked. I run BF1 under DX12 because I get better performance that way (go figure) when playing 64-player MP. I play at 1600p on a single R9 290X (the other one doesn't work under DX12, and yes, I get better single-GPU DX12 performance than Crossfire DX11 performance).
> 
> I'm sure there is a slight performance difference but the FPS are in the same range. I also play on Ultra and I disabled GPU memory restriction.
> 
> I think I'm in need of a GPU upgrade. That is coming when Vega is released. At that point I'll choose between Vega and Pascal. Point being though... I don't think my GPU can test for a CPU bottleneck... because I'm way too GPU bottlenecked at this point.


Yeah, at 1440p you are GPU bound. You can try 1080p at all-low settings and check the fps and CPU usage. BF1 should scale at least 8 cores to 100%; at least it does with the 3770K. That being the case, I do not see why Zen with 8 real cores is not faster than even a 5 GHz 7700K. People that actually play BF1 have reported the 6800K is faster than the 7700K in this game.


----------



## epic1337

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have a feeling is probably something addressable. Those games are in the past. I am sure games that are in development now will account for Zen and not experience those problems.


There are quite a lot of possibilities; the hardware alone has some notable differences.
Quote:


> Originally Posted by *Damn_Smooth*
> 
> Maybe it is simply because games are optimized for Intel processors just like AMD said. Who would've thought? Guess we'll have to wait to see some engine changes to see if they help.


there are hardware differences, like these for example.
http://www.anandtech.com/show/11170/the-amd-zen-and-ryzen-7-review-a-deep-dive-on-1800x-1700x-and-1700/12

some of the interesting points are these:

Ryzen L3 cache = Victim (acts as a dump for L1 and L2, this speeds up L1 and L2, but compromises L3 performance)
Skylake L3 cache = Write-back

Ryzen Decode = 4 uops/cycle
Skylake Decode = 5 uops/cycle

Ryzen AGU = 2 (two combined load/store units; each can issue a load or a store per cycle, not both)
Skylake AGU = 2+2 (two combined load/store units + two store units)
note : AGU = address generation unit

Ryzen FMAC = 2x128bit
Skylake FMAC = 2x256bit
note : FMAC = fused multiply-accumulate (floating point)
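As a back-of-the-envelope illustration of the FMAC line above: the vector width difference translates directly into peak FP32 throughput per core per cycle (assuming full FMA issue every cycle, which real code rarely sustains):

```python
def peak_fp32_per_cycle(fma_units, vector_bits):
    """Peak single-precision FLOPs per core per cycle: each FMA counts as
    two operations (multiply + add) on vector_bits / 32 FP32 lanes."""
    lanes = vector_bits // 32
    return fma_units * lanes * 2

print(peak_fp32_per_cycle(2, 128))  # Ryzen:   2 x 128-bit FMAC -> 16 FLOPs/cycle
print(peak_fp32_per_cycle(2, 256))  # Skylake: 2 x 256-bit FMAC -> 32 FLOPs/cycle
```

This is why Skylake pulls away in heavily vectorised AVX2 code while the gap is much smaller in scalar or SSE workloads.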


----------



## AlphaC

http://www.tomshardware.com/reviews/amd-ryzen-7-1700x-review,4987-7.html


Quote:


> Originally Posted by *https://community.spiceworks.com/topic/1978441-ryzen-in-the-office-it-pros-meditate-on-amd-s-new-zen-architecture*
> It's not just the analysts who have kind words for AMD's new Ryzen line of processors: In a recent Spiceworks poll, 61% of IT pros said they were more likely to consider AMD following the launch of its latest processors.


----------



## aapppaa

Interesting...


----------



## budgetgamer120

People still read Tom's?


----------



## Ha-Nocri

Strange results. From what I've seen before, Ryzen was losing badly in Deus Ex DX12.

Was there an update for this game recently?


----------



## randomizer

Quote:


> Originally Posted by *budgetgamer120*
> 
> People still read Toms?


It's certainly preferable to YouTube videos.


----------



## Laserlight

I still can't get a clear picture of Ryzen's gaming performance. Almost all reviewers conclude that Ryzen falls behind Kaby, and then a few of them come out magically showing way different results. Some reviewers are quite sketchy.
I will need to upgrade to an i5 or a Ryzen 5 this year and I can't figure out which way to go.
I care only about gaming performance and nothing else.
I hope I will have a clear picture before I make the purchase.


----------



## cssorkinman

Quote:


> Originally Posted by *Laserlight*
> 
> I still can't get a clear picture of Ryzen's gaming performance. Almost all reviewers conclude that Ryzen falls behind Kaby, and then a few of them come out magically showing way different results. Some reviewers are quite sketchy.
> I will need to upgrade to an i5 or a Ryzen 5 this year and I can't figure out which way to go.
> I care only about gaming performance and nothing else.
> I hope I will have a clear picture before I make the purchase.


What games? What monitor will you be using -resolution - setting- refresh rate? What video card do you plan on using?


----------



## Shatun-Bear

Quote:


> Originally Posted by *Laserlight*
> 
> I still can't get a clear picture of Ryzen's gaming performance. Almost all reviewers conclude that Ryzen falls behind Kaby, and then a few of them come out magically showing way different results. Some reviewers are quite sketchy.
> I will need to upgrade to an i5 or a Ryzen 5 this year and I can't figure out which way to go.
> I care only about gaming performance and nothing else.
> I hope I will have a clear picture before I make the purchase.


I don't think there is any contest between a 1600 or 1600X and the best i5 - Ryzen all the way. If you are still considering an i5 at this point, you must have missed what the Ryzen 5s are going to bring to the table, and their price.


----------



## Laserlight

Quote:


> Originally Posted by *cssorkinman*
> 
> What games? What monitor will you be using -resolution - setting- refresh rate? What video card do you plan on using?


ATM I'm gaming at 1080p 60 Hz with a GTX 970. I will upgrade to a higher-res monitor sometime, and also plan to upgrade to a GTX 1070.
Right now I'm on an OC'd i5 2500K. I do not upgrade mobo+CPU very often, so I need something that will last well, just like my Sandy.
I was interested in the Ryzen 5 1600 and the i5 7600K.
The i5 7600K seems to pull ahead and even match or surpass a Ryzen in many games.
In other games it seems to fare pretty well, and this is why I'm so confused by the reviews and benchmarks.
Also, I do not know how well Ryzen will mature in the future in comparison with a 4-core i5.


----------



## budgetgamer120

Quote:


> Originally Posted by *randomizer*
> 
> It's certainly preferable to YouTube videos.


Maybe for you. Tom's benchmarks are far from reality...


----------



## cssorkinman

Quote:


> Originally Posted by *Laserlight*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> What games? What monitor will you be using -resolution - setting- refresh rate? What video card do you plan on using?
> 
> 
> 
> ATM I'm gaming at 1080p 60 Hz with a GTX 970. I will upgrade to a higher-res monitor sometime, and also plan to upgrade to a GTX 1070.
> Right now I'm on an OC'd i5 2500K. I do not upgrade mobo+CPU very often, so I need something that will last well, just like my Sandy.
> I was interested in the Ryzen 5 1600 and the i5 7600K.
> The i5 7600K seems to pull ahead and even match or surpass a Ryzen in many games.
> In other games it seems to fare pretty well, and this is why I'm so confused by the reviews and benchmarks.
> Also, I do not know how well Ryzen will mature in the future in comparison with a 4-core i5.
Click to expand...

Sounds like you'd do well to study things for a while before making the leap considering your upgrade cycle.
If there were something specific you'd like to see I would be happy to demonstrate Ryzen's performance if it's possible for me to do so.


----------



## bigbadbrad

Glad I got the 7700K... Why is this thing 500 bucks? Throwing cores and threads at something is useless; very few programs even utilize them. 4 cores and high frequency seem to be the sweet spot, so why did AMD not focus on this? Heck, for most work it seems like the 7700K performs better anyway, on top of being cheaper.


----------



## budgetgamer120

Quote:


> Originally Posted by *bigbadbrad*
> 
> Glad I got the 7700K... Why is this thing 500 bucks? Throwing cores and threads at something is useless; very few programs even utilize them. 4 cores and high frequency seem to be the sweet spot, so why did AMD not focus on this? Heck, for most work it seems like the 7700K performs better anyway, on top of being cheaper.



I am sure you are glad.


----------



## bigbadbrad

Yeah, I almost wasted 500 bucks on a preorder. Luckily it was out of stock and I had time to cancel once the performance numbers came out. My second 7700K came in the mail Monday, and my new Maximus IX Extreme came in today. Pretty excited to put this new rig together, and eagerly awaiting aftermarket 1080 Tis. I liked AMD for a long time. I've owned the following processors: 64 X2 4200, 64 X2 4400, Phenom II X4 945, FX-8350 Black Edition. I loved AMD, but it is obvious now that, as a gamer and enthusiast who would prefer a nimbler processor, AMD doesn't care what I want.


----------



## Ceadderman

Quote:


> Originally Posted by *Laserlight*
> 
> I still can't get a clear picture of Ryzen's gaming performance. Almost all reviewers conclude that Ryzen falls behind Kaby, and then a few of them come out magically showing way different results. Some reviewers are quite sketchy.
> I will need to upgrade to an i5 or a Ryzen 5 this year and I can't figure out which way to go.
> I care only about gaming performance and nothing else.
> I hope I will have a clear picture before I make the purchase.


If you aren't running Windows 10 and DX12 (which is glitchy by all accounts) then you can get either with confidence. DX11 works just fine.

If you do have Windows 10 OS... I'm sorry.


~Ceaddwr


----------



## Laserlight

Quote:


> Originally Posted by *Ceadderman*
> 
> If you aren't running Windows 10 and DX12 (which is glitchy by all accounts) then you can get either with confidence. DX11 works just fine.
> 
> If you do have Windows 10 OS... I'm sorry.
> 
> ~Ceaddwr


What do DX12 and Windows 10 have to do with which CPU to buy?
Doesn't Windows 10 run DX11 games?
Do we even have true DX12 games to begin with? (I don't mean games patched over to add DX12 "support".)


----------



## Damn_Smooth

Quote:


> Originally Posted by *bigbadbrad*
> 
> AMD doesn't care what I want.


I'm with them.


----------



## randomizer

Quote:


> Originally Posted by *budgetgamer120*
> 
> Maybe for you. Toms benchmarks are far from reality...


Perhaps your preferred reality.


----------



## bigbadbrad

Guess my 10 years of AMD loyalty mean nothing. I say AMD is doing it wrong and the blind loyalists come to its defense. Can't you see this is the exact same marketing strategy they have used through 10 years of cascading failure? Step 1: make a moderately comparable CPU late in the Intel cycle. Step 2: build your CPU for the exact same crowd that has been dwindling for 10 years. Step 3: divide your CPU line into 5 flavors that taste exactly the same but have different costs.


----------



## budgetgamer120

Exactly how I use my system


----------



## Damn_Smooth

Quote:


> Originally Posted by *bigbadbrad*
> 
> Guess my 10 years of AMD loyalty mean nothing. I say AMD is doing it wrong and the blind loyalists come to its defense. Can't you see this is the exact same marketing strategy they have used through 10 years of cascading failure? Step 1: make a moderately comparable CPU late in the Intel cycle. Step 2: build your CPU for the exact same crowd that has been dwindling for 10 years. Step 3: divide your CPU line into 5 flavors that taste exactly the same but have different costs.


He thinks I'm defending AMD by pointing out that nobody cares what he wants. You can buy whatever suits your needs from now until the end of time. Regardless of what that is, nobody is going to care. Welcome to Earth.


----------



## SuperZan

Quote:


> Originally Posted by *bigbadbrad*
> 
> Guess my 10 years of AMD loyalty mean nothing. I say AMD is doing it wrong and the blind loyalists come to its defense. Can't you see this is the exact same marketing strategy they have used through 10 years of cascading failure? Step 1: make a moderately comparable CPU late in the Intel cycle. Step 2: build your CPU for the exact same crowd that has been dwindling for 10 years. Step 3: divide your CPU line into 5 flavors that taste exactly the same but have different costs.


You're doing it wrong because you're completely missing the plot. Ryzen is more than 'moderately comparable' with its competitors.

1. It absolutely rinses Broadwell-E in price/performance, and it's not a photo finish. Given that Ryzen performs as well as Broadwell-E does in most common productivity tasks for which more cores and threads are useful, I'd say that it's fairly well lapping Broadwell-E when it comes to price/performance.

2. The workstation crowd is not a dwindling niche. That's an asinine contention.

3. I'm not entirely sure to which 'five flavours' you are referring, but octacore, hexcore, and quadcore Ryzen are all directed towards different market segments and use-cases. That is patently obvious.

What you say means nothing because what you say has no merit. If all you had said was that a quadcore processor at 5.0 GHz is the absolute best gaming option at 1080p if all you do is game, then you'd have a great and largely indisputable point. Instead, you threw out some meaningless claptrap and are confused as to why you're being called out on it. Now you know.

Context and specifics. Everything else is low-quality bait.


----------



## AuraNova

I've noticed a lot of people taking things superficially, and not reading between the lines, when it comes to Ryzen.

Do all the research you can, and just go with whatever suits your needs. If you want to take to any computer forum to state your findings as to why Intel's offerings are better, then go right ahead. Just make sure you have some proof to back your findings up. Don't just say things that only look good on the surface.


----------



## bigbadbrad

I dunno, I thought Ryzen 7 being a better workstation processor than the 7700K was also debatable. This is mostly due to the fact that high thread counts are largely only useful for rendering, and nobody optimizes for them. Even in the workstation benchmarks I find, the 7700K beats out mostly everything. It wins vs everything: in Photoshop, SolidWorks, 3D CAD. For me this means engineers, designers, and content creators largely benefit from the lower cost and the additional speed of the 7700K. I would consider Ryzen if I was in maybe animation, or simulation. I don't know what I'm missing, but it seems like these high core- and thread-count processors are a waste of money even for workstations. Can you post a comprehensive list of workstation benchmarks to potentially match the best processor to each profession?

The 5 flavors refer to the fact that none of their processors have focus; they are good sometimes but never the best.


----------



## Scotty99

Quote:


> Originally Posted by *bigbadbrad*
> 
> I dunno, I thought Ryzen 7 being a better workstation processor than the 7700K was also debatable. This is mostly due to the fact that high thread counts are largely only useful for rendering, and nobody optimizes for them. Even in the workstation benchmarks I find, the 7700K beats out mostly everything. It wins vs everything: in Photoshop, SolidWorks, 3D CAD. For me this means engineers, designers, and content creators largely benefit from the lower cost and the additional speed of the 7700K. I would consider Ryzen if I was in maybe animation, or simulation. I don't know what I'm missing, but it seems like these high core- and thread-count processors are a waste of money even for workstations. Can you post a comprehensive list of workstation benchmarks to potentially match the best processor to each profession?
> 
> The 5 flavors refer to the fact that none of their processors have focus; they are good sometimes but never the best.


I went with Ryzen personally because it offered better value for the money. I can't really comment on benchmarks, and you may be right for a lot of them; I just was not comfortable with the notion of having a quad-core CPU in 2022 (I kept my 2500K system over 5 years).

What I think people need to keep in mind is how early it is for the platform; we only learned last week that Ryzen scales quite well with memory due to its architecture. Put some years on Ryzen and it would not surprise me if it achieves parity with the 7700K in games.

As for right now, it's not like it's a bad gaming chip; I'll get 130 fps vs 150 and that's fine by me.


----------



## KarathKasun

Quote:


> Originally Posted by *bigbadbrad*
> 
> I dunno, I thought Ryzen 7 being a better workstation processor than the 7700K was also debatable. This is mostly due to the fact that high thread counts are largely only useful for rendering, and nobody optimizes for them. Even in the workstation benchmarks I find, the 7700K beats out mostly everything. It wins vs everything: in Photoshop, SolidWorks, 3D CAD. For me this means engineers, designers, and content creators largely benefit from the lower cost and the additional speed of the 7700K. I would consider Ryzen if I was in maybe animation, or simulation. I don't know what I'm missing, but it seems like these high core- and thread-count processors are a waste of money even for workstations. Can you post a comprehensive list of workstation benchmarks to potentially match the best processor to each profession?
> 
> The 5 flavors refer to the fact that none of their processors have focus; they are good sometimes but never the best.


One could make the same argument for many of the markets that Intel is in, especially mobile and very low-power use cases. You are better off going with ARM in those cases; Intel only exists in those markets because of x86.

For lightly threaded workstation workloads you don't buy an 8C/16T CPU. If you can get 90% of the performance for half the price with similar or lower power consumption (R5 quads), you have a pretty tough decision to make. If a few more threads mean you can increase user productivity by allowing more tasks to be done simultaneously with little performance loss, for the same price (R5 hex), then that is what you use.

AMD has not even been an option in recent history; now they very much are.


----------



## SuperZan

Quote:


> Originally Posted by *bigbadbrad*
> 
> I dunno, I thought Ryzen 7 being a better workstation processor than the 7700K was also debatable. This is mostly due to the fact that high thread counts are largely only useful for rendering, and nobody optimizes for them. Even in the workstation benchmarks I find, the 7700K beats out mostly everything. It wins vs everything: in Photoshop, SolidWorks, 3D CAD. For me this means engineers, designers, and content creators largely benefit from the lower cost and the additional speed of the 7700K. I would consider Ryzen if I was in maybe animation, or simulation. I don't know what I'm missing, but it seems like these high core- and thread-count processors are a waste of money even for workstations. Can you post a comprehensive list of workstation benchmarks to potentially match the best processor to each profession?
> 
> The 5 flavors refer to the fact that none of their processors have focus; they are good sometimes but never the best.


For one, in my profession it has long been established that cores and threads are extremely valuable. I bring home light work that doesn't require multi-CPU workstation or GPU-bank processing power. I also game on the same machine. Ryzen has been superior in every respect to the 3930k @ 4.6 GHz which I used previously for the same purposes. Tom's benched a number of scientific and engineering related workloads as well. I think you can draw the obvious conclusion regarding cores and threads. It's important to note that when possible, I/we are processing multiple workloads as quickly as possible. Cores and threads are obviously at a premium in that scenario.

AutoCAD was never going to be a great reason to buy Ryzen. Other workstation applications are highly dependent on use-case, but when rendering or other similar workloads require parallelised tasking in addition to tasks optimised for serial performance (which is very common) there is no extreme disadvantage to Ryzen in the serial workloads and it is exceptionally good during parallelised workloads. Workload balance and use-case would determine which would suit a particular workstation better.

AMD doesn't have the resources of Intel. They can't afford to make 'the best' processor for every single conceivable use-case. What they can do is make extremely competitive processors for many use-cases at extremely competitive prices. If you don't see the value, then maybe it's not for you. As somebody who does bioinformatics work and games at 1440p, Ryzen is exceptional value and excellent performance.


----------



## tpi2007

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So apparently CCX latency is not the factor on why Zen gaming performance is not up to par with Intel.
> 
> https://www.youtube.com/watch?v=Rhj6CvBnwNk&t=0s


DDR4-3600 MHz, which also clocks the CCX link higher, seems to be the target for realising the performance.

Quote:


> Originally Posted by *Laserlight*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> What games? What monitor will you be using -resolution - setting- refresh rate? What video card do you plan on using?
> 
> 
> 
> ATM I'm gaming at 1080p 60hz with a GTX970. I will upgrade to a higher res monitor sometime and also plan to upgrade to a GTX1070.
> Right now I'm with an OC'ed i5 2500k. I do not upgrade mobo+cpu very often so I need something that will last well just like my Sandy.
> I was interested in Ryzen 5 1600 and i5 7600k.
> The i5 7600K seems to pull ahead and even match or surpass a Ryzen in many games.
> In other games it seems to fare pretty well, and this is why I'm so confused by the reviews and benchmarks.
> Also I do not know how well Ryzen will mature in future in comparison with a 4 core i5.

If you don't upgrade that often, why don't you wait and see? If you're considering a 4C/4T 7600K, why don't you buy a second-hand 4C/8T 3770K instead and try to clock it around 4.5 GHz? You'll end up in the same place and it's a better deal than having to buy into a new platform + RAM. That should give you some time to see how things go. I'm all for making the most of a platform.

Quote:


> Originally Posted by *bigbadbrad*
> 
> I dunno, thought ryzen 7 being a better workstation processor than 7700k was also debatable. This is mostly due to the fact that high thread counts are largely more useful for rendering and nobody optimizes for them. Even the workstation benchmarks I find have the 7700k beating out mostly everyone. It wins vs everything: in photoshop, solidworks, 3dCAD. For me this means engineers, designers, and content creators largely benefit from the lower cost and the additional speed of the 7700k. I would consider ryzen if I was in maybe animation, or simulation. I don't know what I'm missing but it seems like these high core and thread count processors are a waste of money even for workstations. *Can you post a comprehensive list of workstation benchmarks to potentially divide best processor to profession?
> *
> The 5 flavors refer to the fact that none of their processors have focus, they are good sometimes but never the best.


Tom's Hardware's review of the 1800X has a lot of non-gaming tests: Desktop & Office, Workstation, and Scientific & Engineering Computations and HPC.

Starting on this page:

http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951-8.html

More tests here, this time with the 1700X:

https://www.servethehome.com/amd-ryzen-7-1700x-linux-benchmarks/


----------



## blue1512

Quote:


> Originally Posted by *bigbadbrad*
> 
> Glad I got 7700k... Why is this thing 500 bux? Throwing cores and threads at something is useless, very few programs even utilize this. 4 cores and high frequency seem to be the sweet spot, why did AMD not focus on this? Heck for most work it seems like the 7700k is performing better anyways on top of being cheaper.


AMD simply can't beat Intel's IPC with their resources, so they focused on multithreading and succeeded.

Once again, the $500 price tag of the 1800X is just for people who are pro-AMD and want the best. Compared to the $330 1700, the 7700K is not that much cheaper (it had the same price before the cut). And the 7700K eats dust in well-threaded applications, which are popular with the people who want to buy an 8-core CPU. For gaming only, which is your main concern, you can get around 85% of the 7700K's performance at half that price with a quad-core R5.

Now you can see that AMD's lineup makes a lot of sense. All in all, they offer competitive performance at a much lower price.


----------



## Mad Pistol

Quote:


> Originally Posted by *budgetgamer120*
> 
> Exactly how I use my system


This shows the versatility of a high core-count system. Considering that an R7 1700 costs $330 and will usually overclock to 3.9-4.0 GHz, it just hits home the point of this CPU... you can do ANYTHING you want without it breaking a sweat.

I will maintain that the R7 1700 is the hero of AMD's 8-core lineup.


----------



## Laserlight

@tpi2007
"If you don't upgrade that often, why don't you wait and see? If you're considering a 4C/4T 7600K, why don't you buy a second-hand 4C/8T 3770K instead and try to clock it around 4.5 GHz? You'll end up in the same place and it's a better deal than having to buy into a new platform + RAM. That should give you some time to see how things go. I'm all for making the most of a platform"

Simply because my motherboard is dying and I need to upgrade RAM as well. The system is too old to waste any more money on at this point.
I will have to wait and see if I will jump to Ryzen. The whole gaming performance situation is too confusing and foggy right now to make a decision.


----------



## Ceadderman

Quote:


> Originally Posted by *Laserlight*
> 
> I still can't figure out a clear picture on Ryzen's gaming performance. We have almost all reviewers concluding on the result that Ryzen falls behind Kaby and then few of them come out magically showing off way different results. Some reviewers are quite sketchy.
> I will need to upgrade to an i5 or a Ryzen 5 this year and I can't figure out which way to go.
> I care only about gaming performance and nothing else.
> I hope I will have a clear picture before I make the purchase.


Well, for one thing, the reviews that have come out are pretty much Win7-based on average.

Win7 doesn't have DX12. Win10 does.

If you have Win7 on an SSD you should be able to run it on Ryzen, except for security updates post-release. So essentially Ryzen owners are stuck with Win10 unless we can trick Micro$haft into serving security patches to early hardware.

In any case, DX12 has limited adoption by developers atm. It's not the be-all and end-all of how well Ryzen scales. From what I can tell it has limited the R7 to some extent.

What's more, Ryzen is x86 architecture, which is what FX is.


----------



## tpi2007

Quote:


> Originally Posted by *Ceadderman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Laserlight*
> 
> I still can't figure out a clear picture on Ryzen's gaming performance. We have almost all reviewers concluding on the result that Ryzen falls behind Kaby and then few of them come out magically showing off way different results. Some reviewers are quite sketchy.
> I will need to upgrade to an i5 or a Ryzen 5 this year and I can't figure out which way to go.
> I care only about gaming performance and nothing else.
> I hope I will have a clear picture before I make the purchase.
> 
> 
> 
> Well, for one thing, the reviews that have come out are pretty much Win7-based on average.
> 
> Win7 doesn't have DX12. Win10 does.
> 
> If you have Win7 on an SSD you should be able to run it on Ryzen, except for security updates post-release. So essentially Ryzen owners are stuck with Win10 unless we can trick Micro$haft into serving security patches to early hardware.
> 
> In any case, DX12 has limited adoption by developers atm. It's not the be-all and end-all of how well Ryzen scales. From what I can tell it has limited the R7 to some extent.
> 
> What's more, Ryzen is x86 architecture, which is what FX is.

We still don't know about that part, Ryzen and Windows 7 not having security updates post-release. You are correct about automatic updates; pretty much all future rollups will include the unsupported-CPU check, so that's a no-go, but downloading and installing the standalone security-only updates (OS, IE and .NET Framework) from the Update Catalog site should work.


----------



## Alwrath

Quote:


> Originally Posted by *bigbadbrad*
> 
> Glad I got 7700k... Why is this thing 500 bux? Throwing cores and threads at something is useless, very few programs even utilize this. 4 cores and high frequency seem to be the sweet spot, why did AMD not focus on this? Heck for most work it seems like the 7700k is performing better anyways on top of being cheaper.


I had an i5 760 quad-core for 7 years. It's time to upgrade to 8 cores. Heck, I know a lot of gamers who have been gaming on 5820Ks for years now, because they know the quad core is dying and they want to be prepared for games that use more cores. We're already seeing many titles use more than 4 cores / 8 threads, and in 4 years, if you're on a quad core, you will most likely see a performance drop.

You're living in the past, man. You're talkin' about a clown from the '60s, man!!!!!!!!!!!!! (random Seinfeld quote)










The quad core had its day. It's so 2007. It's time to move on! Embrace change! Enjoy your $350 4-core and we'll enjoy our $330 8-cores that are superior to your little quad core in every way, especially for people like me who game at 4K.


----------



## renx

Now that PC Perspective's Infinity Fabric + CCX thing has been debunked, I have regained my faith in Ryzen improving over time.
The platform and CPU will improve a lot in the next few months.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *bigbadbrad*
> 
> I dunno, thought ryzen 7 being a better workstation processor than 7700k was also debatable. This is mostly due to the fact that high thread counts are largely more useful for rendering and nobody optimizes for them. Even the workstation benchmarks I find have the 7700k beating out mostly everyone. It wins vs everything: in photoshop, solidworks, 3dCAD. For me this means engineers, designers, and content creators largely benefit from the lower cost and the additional speed of the 7700k. I would consider ryzen if I was in maybe animation, or simulation. I don't know what I'm missing but it seems like these high core and thread count processors are a waste of money even for workstations. Can you post a comprehensive list of workstation benchmarks to potentially divide best processor to profession?
> 
> The 5 flavors refer to the fact that none of their processors have focus, they are good sometimes but never the best.


Dude, you don't have to try so hard to justify your 7700K purchase. If it best fits your needs then bully for you, enjoy it. Not sure why you are trolling a Ryzen thread over a product that is so clearly inferior to you? Fact is, in any application that takes advantage of the cores and threads, the R7s will generally embarrass a 7700K (just as Intel's own HEDT chips will), and those are the workloads the R7 was designed for. The 7700K is a great chip, really Intel's first true successor to the 2600K way back in 2011, so go enjoy your chip and stop dumping all over another product that clearly wasn't designed for you.


----------



## ChronoBodi

A 7700K isn't going to do this and still give me room to play a game on the side as well. Quad cores are dead. Dead dead DEAD.


----------



## AlphaC

As far as gaming goes, Ryzen is a beast that just hasn't had the backing of a proper DX12 outlet and gaming optimizations.
The i7-7700K is more or less faster in anything that relies on DX11, which only scales well to 6 threads, provided you do nothing else and disable your antivirus, voice chat, etc.

For computation / encoding / etc., the threads make a difference.

Also, DDR4-3200 has gains vs DDR4-2133 or DDR4-2400 on Ryzen, but it depends on the application.

HT4U measured +18.5% from DDR4-3200 in Rise of the Tomb Raider and +15% in Witcher 3 / Deus Ex: Mankind Divided / Batman: Arkham Knight:

https://www.ht4u.net/reviews/2017/amd_ryzen_7_1800x_im_test/index36.php?dummy=&advancedFilter=false&prod%5B%5D=AMD+Ryzen+7+1800X+%5B8C%2F16T%403%2C6-4%2C1+GHz%5D&prod%5B%5D=AMD+Ryzen+7+1800X+%5B8C%2F16T%40DDR4-3200%5D

This is not a small gain. If motherboards improve their memory clock support and more memory manufacturers release AMD-ready DDR4 kits similar to the Flare X DDR4-3200 CL14, it will only get better.

HT4U also has Ryzen 5 1600X results in their charts:
https://www.ht4u.net/reviews/2017/amd_ryzen_7_1800x_im_test/index20.php


----------



## Malinkadink

A user on the AMD subreddit is reporting getting 32GB (2x16GB DIMMs) working @ 3200MHz with 17-17-17-36 timings on the Taichi. WHERE IS YOUR TAICHI STOCK, MICROCENTER?!

EDIT: Sticks used were Trident Z 3200 CL14

I have a pair of 3200 CL14 Ripjaws, should be the same difference.


----------



## LunaTiC123

I was looking forward to upgrading to Ryzen, then I saw the RAM prices... jesus christ, what happened in the past 5-10 months? I bought my 16GB DDR3-2400MHz G.Skill kit for like 79 euros; now I checked and it's like 140~ euros...







I remember 16GB DDR4-3200MHz G.Skill kits being like 100-110 euros; now they are like 140-150~. The ones with good timings are ridiculous: CL14 kits are 200~ euros and the CL15s are around 180~ or so, with decent 16-16-16 being 150-160~. I couldn't have picked a worse time to upgrade.








Guess I might as well just postpone for another year or so and wait for Zen+, but then again, who knows if RAM prices will go back down in 2018.


----------



## epic1337

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Yep, ever since Ryzen launched the only thing that matters in the PC world according to most Intel owners is gaming at 1080p. PC's don't have any use outside that, except maybe porn


Yeah... I'm quite bothered by this as well; according to most people, PCs are glorified gaming consoles and nothing else.


----------



## Kuivamaa

Quote:


> Originally Posted by *SuperZan*
> 
> For one, in my profession it has long been established that cores and threads are extremely valuable. I bring home light work that doesn't require multi-CPU workstation or GPU-bank processing power. I also game on the same machine. Ryzen has been superior in every respect to the 3930k @ 4.6 GHz which I used previously for the same purposes. Tom's benched a number of scientific and engineering related workloads as well. I think you can draw the obvious conclusion regarding cores and threads. It's important to note that when possible, I/we are processing multiple workloads as quickly as possible. Cores and threads are obviously at a premium in that scenario.
> 
> AutoCAD was never going to be a great reason to buy Ryzen. Other workstation applications are highly dependent on use-case, but when rendering or other similar workloads require parallelised tasking _in addition_ to tasks optimised for serial performance (which is very common) there is no extreme disadvantage to Ryzen in the serial workloads and it is exceptionally good during parallelised workloads. Workload balance and use-case would determine which would suit a particular workstation better.
> 
> AMD doesn't have the resources of Intel. They can't afford to make 'the best' processor for every single conceivable use-case. What they can do is make extremely competitive processors for many use-cases at _extremely_ competitive prices. If you don't see the value, then maybe it's not for you. As somebody who does bioinformatics work and games at 1440p, Ryzen is exceptional value and excellent performance.


Even some serial workloads benefit from cores if you have lots of them. Running multiple instances of a single-threaded workload is possible on multicore systems.
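That point can be sketched in a few lines of Python (the `serial_job` workload is made up; the pool size and inputs are arbitrary): no single job is multithreaded, but a process pool lets an 8C/16T chip chew through many of them at once.

```python
from multiprocessing import Pool

def serial_job(n):
    # A purely single-threaded task: sum of squares below n.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Four workers, each running one serial instance at a time;
    # throughput scales with cores even though each job does not.
    with Pool(processes=4) as pool:
        results = pool.map(serial_job, [10, 100, 1000, 10000])
    print(results[0])  # 285
```

Batch encoding jobs, compile farms, and brute-force parameter sweeps all fit this pattern, which is exactly where extra cores pay off even for "serial" software.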


----------



## epic1337

Quote:


> Originally Posted by *Kuivamaa*
> 
> Even some serial workloads benefit from cores If you have lots of them. Running multiple instances of a single thread workload is possible on multicore systems.


if only the OS thread scheduler didn't stuff everything onto core0~


----------



## Kuivamaa

Quote:


> Originally Posted by *epic1337*
> 
> if only the OS thread scheduler doesn't stuff everything in core0~


You can fix that with affinity.


----------



## epic1337

Quote:


> Originally Posted by *Kuivamaa*
> 
> You can fix that with affinity.


Well yes, though it's inconvenient; if they had made it round-robin or load-balanced scheduling to begin with, we wouldn't even need to do this.
Microsoft made it stack everything on core0 unless other cores are requested, so that idle cores can be parked to save power.
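On Linux, the affinity workaround can even be scripted. A minimal sketch in Python (`pin_to_cores` is a made-up helper name, and this assumes a Linux kernel, since `os.sched_setaffinity` is not available on Windows, where `start /affinity` or Task Manager do the same job):

```python
import os

def pin_to_cores(pid, cores):
    # Restrict a process (pid 0 = the calling process) to the given
    # cores, so the scheduler can't stack all of its threads on core0.
    os.sched_setaffinity(pid, set(cores))
    return os.sched_getaffinity(pid)  # the affinity mask now in effect

if __name__ == "__main__":
    # Keep the current process off core0 (assumes core 1 exists).
    print(pin_to_cores(0, [1]))
```

Child processes inherit the mask, so pinning a launcher once keeps a whole tree of single-threaded jobs spread where you put them.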


----------



## Ceadderman

I got 1800x so no flamage...

But quads aren't dead, otherwise AMD wouldn't be following up with quad-core and 6-core CPUs at all.

I don't do 3D CAD. I don't do video editing or rendering. I got the 1800X for basically one reason: because I could, and I wanted their best CPU. If I want to render or use it for 3D modeling, I can. People can say whatever they want, but I like having the overhead available to do things that I haven't done before, for half the price of Intel's best CPU and for 2/3 of their top-end board's price. RAM is still a costly factor, but c'est la vie. My brother got a 1700X for the same reason. He doesn't OC, so I will likely swap his out to my C6H board and play with it a bit just to show him what his CPU is capable of.









For us the chips will likely be mostly for gaming. But we will have the room for doing other things far into the future. And you just never know; with AMD throwing out a 16-thread CPU, developers may respond with multithreaded games.

Why would we do such things? Because we are Enthusiasts. We got into building computers back in the 90s. It's fun.









~Ceadder


----------



## Slink3Slyde

Quote:


> Originally Posted by *Ceadderman*
> 
> I got 1800x so no flamage...
> 
> But quads aren't dead, otherwise AMD wouldn't be following up with a quad and 6 core CPU at all.
> 
> I don't do 3D CAD. I don't do video editing or rendering. I got the 1800X for basically one reason: because I could, and I wanted their best CPU. If I want to render or use it for 3D modeling, I can. People can say whatever they want, but I like having the overhead available to do things that I haven't done before, for half the price of Intel's best CPU and for 2/3 of their top-end board's price. RAM is still a costly factor, but c'est la vie. My brother got a 1700X for the same reason. He doesn't OC, so I will likely swap his out to my C6H board and play with it a bit just to show him what his CPU is capable of.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For us the chips will likely be mostly for gaming. But we will have the room for doing other things far into the future. And you just never know, with AMD throwing out a 16thread CPU, developers may respond with multithreaded games.
> 
> Why would we do such things? Because we are Enthusiasts. We got into building computers back in the 90s. It's fun.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Well said, and fair enough.

I can almost guarantee that for the next 5 years there won't be a game released that won't run well enough for most people on a fast quad core, even without hyperthreading, unless the developers really don't care about selling to the majority of the market.

I'd still take an R5 over an i5 when they're released, though, before I get jumped on for being an i5 fanboy or something.


----------



## JoJo1337

Do you guys think an R7 1700 paired with an R9 Fury Nitro will be good enough to play games such as BF1, cs go, GTA V... since I can get a 1080p 144hz screen for 200 euros and I think it'd be a good upgrade over my 60hz screen.
I'm aware that the i7 6700K/7700K are better for high refresh rate gaming but I want to support AMD and I also believe six/eight cores CPUs will become more and more useful.
Should I wait for the R5 1600 ?


----------



## cssorkinman

Quote:


> Originally Posted by *JoJo1337*
> 
> Do you guys think an R7 1700 paired with an R9 Fury Nitro will be good enough to play games such as BF1, cs go, GTA V... since I can get a 1080p 144hz screen for 200 euros and I think it'd be a good upgrade over my 60hz screen.
> I'm aware that the i7 6700K/7700K are better for high refresh rate gaming but I want to support AMD and I also believe six/eight cores CPUs will become more and more useful.
> Should I wait for the R5 1600 ?


Great combo for BF1 - Example of a similar rig https://youtu.be/YfhL08s4IMw

Fairly GPU bottlenecked at anything over low settings.

I have graphed fps for several 64-player maps for the entirety of the game and had minimums around 130 and averages in the 160 to 170 range (4 GHz on the CPU).


----------



## Ceadderman

Quote:


> Originally Posted by *JoJo1337*
> 
> Do you guys think an R7 1700 paired with an R9 Fury Nitro will be good enough to play games such as BF1, cs go, GTA V... since I can get a 1080p 144hz screen for 200 euros and I think it'd be a good upgrade over my 60hz screen.
> I'm aware that the i7 6700K/7700K are better for high refresh rate gaming but I want to support AMD and I also believe six/eight cores CPUs will become more and more useful.
> Should I wait for the R5 1600 ?


First off, this system you are looking to build is your baby. In the end our input doesn't really matter so long as you're satisfied with the end result. Now if you were a client of mine I would suggest getting what you can afford and take the savings and put it to a higher end GPU/RAM/PSU, depending on the difference in price from one to the other. But I always ask, "what do you do with your computer and what would you like to do in the future" to determine what CPU and MB I should set them up with. All that being relative of course.









That said, if you want an R7 for the epeen, then you could do worse than the 1700, since you plan to pair your Fury with it. $300 is a pretty reasonable price point, unless you would rather spend less and put the savings into the places mentioned above. You could certainly do worse than an R5 even. At this point in time, 12 threads are even a bit much, just like 16 threads. But if you want something that you can sit on for 3-5 years, both will work for you. You should be the judge in the end, because you're the end user and you have to be happy with your decision. Apologies if this doesn't give a cut-and-dried answer on my part, but computers are like cars: can you be happy with a Camaro when you can afford a Porsche?









~Ceadder :drink;


----------



## JoJo1337

Quote:


> Originally Posted by *cssorkinman*
> 
> Great combo for BF1 - Example of a similar rig https://youtu.be/YfhL08s4IMw
> 
> Fairly GPU bottlenecked at anything over low settings.
> 
> I have graphed fps for several 64 player maps for the entirety of the game and had minimums around 130 and averages in the 160 to
> 170 range ( 4 ghz on the Cpu).


Indeed, those are good results !
My i7 950 has minimum fps around the 40s in multiplayer, so it's quite annoying (1080p, low settings).

Thanks for posting your results, I really appreciate it.

Quote:


> Originally Posted by *Ceadderman*
> 
> First off, this system you are looking to build is your baby. In the end our input doesn't really matter so long as you're satisfied with the end result. Now if you were a client of mine I would suggest getting what you can afford and take the savings and put it to a higher end GPU/RAM/PSU, depending on the difference in price from one to the other. But I always ask, "what do you do with your computer and what would you like to do in the future" to determine what CPU and MB I should set them up with. All that being relative of course.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That said, if you want an R7 for the epeen, then you could do worse than the 1700, since you plan to pair your Fury with it. $300 is a pretty reasonable price point, unless you would rather spend less and put the savings into the places mentioned above. You could certainly do worse than an R5 even. At this point in time, 12 threads are even a bit much, just like 16 threads. But if you want something that you can sit on for 3-5 years, both will work for you. You should be the judge in the end, because you're the end user and you have to be happy with your decision. Apologies if this doesn't give a cut-and-dried answer on my part, but computers are like cars: can you be happy with a Camaro when you can afford a Porsche?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder :drink;


You're right, I'll be the one making the final decision in the end.
I want a CPU that will last as long as my i7 950; I bought it in 2010 and it's definitely time to replace it.
AM4 should last longer than any Intel socket, so that's another reason I want to buy an AMD CPU: I won't have to buy another MB once Zen 2 is released.
No need to apologize, you actually helped me a lot by asking those 2 questions, so I'm sure an R7 1700 will fit my needs








By the way, I just ordered the Acer XF240 monitor so I'll be enjoying 144hz gaming from now on


----------



## Ultracarpet

Quote:


> Originally Posted by *JoJo1337*
> 
> Indeed, those are good results !
> My i7 950 has minimums fps around 40s in multiplayer so it's quite annoying. (1080p low settings)
> 
> Thanks for posting your results, I really appreciate it.
> You're right, I'll be the one making the final decision in the end.
> I want a CPU that will last as long as my i7 950, I bought it in 2010 and it's definitely time to replace it .
> AM4 should last longer than any Intel's socket so it's another reason I want to buy an AMD CPU, I won't have to buy another MB once Zen 2 will be released.
> No need to apologize, you actually helped me a lot by asking these 2 questions therefore I'm sure an R7 1700 will fit my needs
> 
> 
> 
> 
> 
> 
> 
> 
> By the way, I just ordered the Acer XF240 monitor so I'll be enjoying 144hz gaming from now on


I have your rig pretty much, lol, except I have a 1440p 144Hz. I think you will be satisfied, especially coming from an i7-950. The only thing I will say is that you could possibly save a bit of money by going with one of the R5s when they come out, if all you do is game. Other than that, the R7 1700 is a fantastic processor.


----------



## cssorkinman

Quote:


> Originally Posted by *JoJo1337*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> Great combo for BF1 - Example of a similar rig https://youtu.be/YfhL08s4IMw
> 
> Fairly GPU bottlenecked at anything over low settings.
> 
> I have graphed fps for several 64 player maps for the entirety of the game and had minimums around 130 and averages in the 160 to
> 170 range ( 4 ghz on the Cpu).
> 
> 
> 
> Indeed, those are good results !
> My i7 950 has minimums fps around 40s in multiplayer so it's quite annoying. (1080p low settings)
> 
> Thanks for posting your results, I really appreciate it.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> First off, this system you are looking to build is your baby. In the end our input doesn't really matter so long as you're satisfied with the end result. Now if you were a client of mine I would suggest getting what you can afford and take the savings and put it to a higher end GPU/RAM/PSU, depending on the difference in price from one to the other. But I always ask, "what do you do with your computer and what would you like to do in the future" to determine what CPU and MB I should set them up with. All that being relative of course.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That said, if you want an R7 for the epeen, then you could do worse than the 1700, since you plan to pair your Fury with it. $300 is a pretty reasonable price point, unless you would rather spend less and put the savings into the places mentioned above. You could certainly do worse than an R5 even. At this point in time, 12 threads are even a bit much, just like 16 threads. But if you want something that you can sit on for 3-5 years, both will work for you. You should be the judge in the end, because you're the end user and you have to be happy with your decision. Apologies if this doesn't give a cut-and-dried answer on my part, but computers are like cars: can you be happy with a Camaro when you can afford a Porsche?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder :drink;
> 
> 
> You're right, I'll be the one making the final decision in the end.
> I want a CPU that will last as long as my i7 950, I bought it in 2010 and it's definitely time to replace it .
> AM4 should last longer than any Intel's socket so it's another reason I want to buy an AMD CPU, I won't have to buy another MB once Zen 2 will be released.
> No need to apologize, you actually helped me a lot by asking these 2 questions therefore I'm sure an R7 1700 will fit my needs
> 
> 
> 
> 
> 
> 
> 
> 
> By the way, I just ordered the Acer XF240 monitor so I'll be enjoying 144hz gaming from now on

An example of the fps during a recent BF1 match


----------



## Kuivamaa

Quote:


> Originally Posted by *JoJo1337*
> 
> Do you guys think an R7 1700 paired with an R9 Fury Nitro will be good enough to play games such as BF1, cs go, GTA V... since I can get a 1080p 144hz screen for 200 euros and I think it'd be a good upgrade over my 60hz screen.
> I'm aware that the i7 6700K/7700K are better for high refresh rate gaming but I want to support AMD and I also believe six/eight cores CPUs will become more and more useful.
> Should I wait for the R5 1600 ?


I have an 1800X and a Nano, which is a combo very, very similar to what you are looking at, plus a 144Hz monitor (1440p). I own all three games you mentioned, so let me know if you need some numbers. BF1 gets roughly 80-100 fps in multiplayer on the new DLC maps at 1440p with a mix of ultra/high/medium settings. The Nano is actually the bottleneck. You can't go wrong with the 1700, although if you are low on cash the 1600 will be very capable as well.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *JoJo1337*
> 
> Do you guys think an R7 1700 paired with an R9 Fury Nitro will be good enough to play games such as BF1, cs go, GTA V... since I can get a 1080p 144hz screen for 200 euros and I think it'd be a good upgrade over my 60hz screen.
> I'm aware that the i7 6700K/7700K are better for high refresh rate gaming but I want to support AMD and I also believe six/eight cores CPUs will become more and more useful.
> Should I wait for the R5 1600 ?


Of course it would be good enough. It would be a heckuva lot better than good enough, in fact.


----------



## Kuivamaa

Quote:


> Originally Posted by *JoJo1337*
> 
> Indeed, those are good results !
> My i7 950 has minimums fps around 40s in multiplayer so it's quite annoying. (1080p low settings)
> 
> Thanks for posting your results, I really appreciate it.
> You're right, I'll be the one making the final decision in the end.
> I want a CPU that will last as long as my i7 950, I bought it in 2010 and it's definitely time to replace it .
> AM4 should last longer than any Intel's socket so it's another reason I want to buy an AMD CPU, I won't have to buy another MB once Zen 2 will be released.
> No need to apologize, you actually helped me a lot by asking these 2 questions therefore I'm sure an R7 1700 will fit my needs
> 
> 
> 
> 
> 
> 
> 
> 
> By the way, I just ordered the Acer XF240 monitor so I'll be enjoying 144hz gaming from now on


It is funny, but the other day I was thinking that the 1700 is the new i7 920. It will be around and relevant for many years.


----------



## Mad Pistol

Quote:


> Originally Posted by *Kuivamaa*
> 
> It is funny, but the other day I was thinking that the 1700 is the new i7 920. It will be around and relevant for many years.


Yea, the R7 1700 really is kinda like the new i7 920 or the 2500k. It's a beast of a product in stock form, and it's easily overclockable.


----------



## teh-yeti

Quote:


> Originally Posted by *Kuivamaa*
> 
> It is funny, but the other day I was thinking that the 1700 is the new i7 920. It will be around and relevant for many years.


Just got my new 1700 rig put together today (finally). Nothing would make me happier than to overclock this and keep it relevant for another 4-5 years.


----------



## Lipos

AotS update for Ryzen:



https://www.pcper.com/reviews/Processors/Ashes-Singularity-Gets-Ryzen-Performance-Update

Not bad.


----------



## renx

Not sure if it was already posted somewhere else, but here's the new Adored video.
He's upset.


----------



## AuraNova

Just when you thought we were done with the attacks. Everyone seems to be still on edge.

I feel that with the release of Ryzen, benchmarking in general doesn't seem as viable as it used to be. Call it justification of the lack of gaming performance Ryzen seems to have at the moment, but I've watched many of these benchmarking videos, and none of them seem to say exactly the same thing. Everyone has different results, and it's gotten to the point where reviewers are calling each other out.

As I said before, AdoredTV has looked outside, and way beyond the box when it comes to benchmarking. Not just in Ryzen, but as a whole.


----------



## SuperZan

Quote:


> Originally Posted by *renx*
> 
> Not sure if it was already posted somewhere else, but here's the new Adored video.
> He's upset.


He really covered my gripes very well. I never bought into the 'testing Low 720p shows future performance' because it's such an academic argument that doesn't take into account the realities of the gaming market. As a structural argument, sure, testing at a low resolution with low settings to enforce a CPU benchmark should theoretically tell you how that CPU will perform when GPU's have conquered all graphical hurdles in modern titles. Those of us who have chased the 4k dragon know better than anybody that this mythical zenith just isn't something we're going to experience in real life. When even 1080p Ultra-maxed games can stress GPU's as powerful as the 1080, and when two Titan XP's or 1080 TI's are needed to truly 'dominate' 4k Ultra-maxed content, I think it becomes clear that in pushing the graphical envelope, developers are ensuring that the great majority of PC gamers playing modern games are going to be GPU bound.

Yes, edge-cases exist, but my contention is that many of those edge-cases are irrelevancies which only enthusiasts care for arguing over. Is maximum 5.0GHz Kaby Lake performance desirable for 1080p Dragon Age Origins? Sure, but I can't imagine that other modern processors are delivering painful experiences there.

1080p 60Hz. That's the market. Until people can be convinced to start buying up 1440p/4k monitors to replace those 1080p displays, we're likely going to remain in a place where pursuing increased graphical fidelity at 60Hz is the general direction of development. And part of getting people to buy those nice improved-resolution displays is making that as affordable an experience as 1080p / 60Hz. 1440p is slowly getting there, but it's nothing imminent in terms of a shift in the 1440p/1080p demographics. Until developers have squeezed all of the blood from that 1080p/60Hz stone, we're not going to see mass adoption of 1440p/2160p and every game that manages to push 1080p graphics that much further delays the process. It took ten years to move from 1024 x 768 to 1366 x 768. Unless people are buying processors to use for a decade and a half, the low-resolution enforced bottleneck testing is great for showing us relative processor strengths but is rather meaningless as a method of demonstrating real-world gaming differences... and assumes that you won't move towards a better resolution which starts the whole 'overcoming all graphical hurdles' process over again.


----------



## renx

Have you guys seen this?

https://www.pcper.com/reviews/Processors/Ashes-Singularity-Gets-Ryzen-Performance-Update

I'm sorry again if it was already posted, but the amount of threads and updates about Ryzen is getting a little bit overwhelming to me.


----------



## AuraNova

Quote:


> Originally Posted by *renx*
> 
> I'm sorry again if it was already posted, but the amount of threads and updates about Ryzen is getting a little bit overwhelming to me.


And it will continue to be this way, even when the Ryzen 3 chips get released. We're in for the long haul here.

Many people were expecting Ryzen 7 to be an Intel killer in every aspect. Since it's not, people feel either slighted or angered by the fact that most of the launch benchmarks didn't make it so, so discussion is inevitable. Not to mention the separate RAM and motherboard issues, which magnify this whole thing even more.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *SuperZan*
> 
> He really covered my gripes very well. I never bought into the 'testing Low 720p shows future performance' because it's such an academic argument that doesn't take into account the realities of the gaming market. As a structural argument, sure, testing at a low resolution with low settings to enforce a CPU benchmark should theoretically tell you how that CPU will perform when GPU's have conquered all graphical hurdles in modern titles. Those of us who have chased the 4k dragon know better than anybody that this mythical zenith just isn't something we're going to experience in real life. When even 1080p Ultra-maxed games can stress GPU's as powerful as the 1080, and when two Titan XP's or 1080 TI's are needed to truly 'dominate' 4k Ultra-maxed content, I think it becomes clear that in pushing the graphical envelope, developers are ensuring that the great majority of PC gamers playing modern games are going to be GPU bound.
> 
> Yes, edge-cases exist, but my contention is that many of those edge-cases are irrelevancies which only enthusiasts care for arguing over. Is maximum 5.0GHz Kaby Lake performance desirable for 1080p Dragon Age Origins? Sure, but I can't imagine that other modern processors are delivering painful experiences there.
> 
> 1080p 60Hz. That's the market. Until people can be convinced to start buying up 1440p/4k monitors to replace those 1080p displays, we're likely going to remain in a place where pursuing increased graphical fidelity at 60Hz is the general direction of development. And part of getting people to buy those nice improved-resolution displays is making that as affordable an experience as 1080p / 60Hz. 1440p is slowly getting there, but it's nothing imminent in terms of a shift in the 1440p/1080p demographics. Until developers have squeezed all of the blood from that 1080p/60Hz stone, we're not going to see mass adoption of 1440p/2160p and every game that manages to push 1080p graphics that much further delays the process. It took ten years to move from 1024 x 768 to 1366 x 768. Unless people are buying processors to use for a decade and a half, the low-resolution enforced bottleneck testing is great for showing us relative processor strengths but is rather meaningless as a method of demonstrating real-world gaming differences... and assumes that you won't move towards a better resolution which starts the whole 'overcoming all graphical hurdles' process over again.


I agree he made a fantastic point about the industry and why they develop games the way they do. No matter how fast video cards get in the future, their excess power will be used to provide better-looking games rather than more FPS. That's the way it's always been and probably always will be. As he said, the real reason the tech sites test at low res and low settings is so there's an actual story to tell when doing a CPU review. Nobody is going to care about benches where all the CPUs perform the same, but with Ryzen out now, that's what you're going to see in all benchmark results if you test at anything like relevant settings and resolutions. All modern quad cores and up are just that good at this point, and there are no more "Bulldozer" CPUs that are going to be obviously inferior anymore. Well, except for those that don't have enough cores/threads (which is kind of the whole point of what AMD is doing with Ryzen in the first place)....


----------



## randomizer

The arguments for benching at low and high resolutions are both valid in my opinion. Both approaches provide different information and people tend to prefer the one that shows them what they want to see, so every review will be criticised by someone. You need both to get the full picture.


----------



## Ultracarpet

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I agree he made a fantastic point about the industry and why they develop games the way they do. No matter how fast video cards get in the future, their excess power will be used to provide better-looking games rather than more FPS. That's the way it's always been and probably always will be. As he said, the real reason the tech sites test at low res and low settings is so there's an actual story to tell when doing a CPU review. Nobody is going to care about benches where all the CPUs perform the same, but with Ryzen out now, that's what you're going to see in all benchmark results if you test at anything like relevant settings and resolutions. All modern quad cores and up are just that good at this point, and there are no more "Bulldozer" CPUs that are going to be obviously inferior anymore. Well, except for those that don't have enough cores/threads (which is kind of the whole point of what AMD is doing with Ryzen in the first place)....


The only issue I have is that high-refresh-rate displays have just as much viability as 4K does in the quest for better visuals. All other things being equal, if one screen were 4K 60Hz and the other 2K 144Hz, I would take the 2K screen every time. Perhaps I place too much weight on the refresh rate, but in all honesty, I have seen gorgeous videos of real life on 1080p screens and terrible game graphics on 4K screens... resolution means very little in the grand scheme of things. The higher refresh rate makes things feel more alive to me... hard to explain, but the first time I loaded up Overwatch on this 144Hz screen, I was looking at some spinning lights, and their movement looked oddly life-like.

Also, I'm fairly certain that if AMD and Intel had been going toe to toe for the last decade instead of the mess the market became, we wouldn't have consoles still outputting 30Hz... hell, maybe 60Hz wouldn't have such a foothold on the PC market.

IMO, if someone wanted to upgrade their screen for gaming, I would probably recommend going high refresh rate, and that would require a CPU capable of pushing it. With that being said, I think Ryzen is a very capable CPU for the job, especially when G-Sync and FreeSync are thrown into the mix. The $170 and $190 Ryzen R5 chips are going to be absolutely amazing bang for the buck in this regard (as long as they perform as expected), allowing people on a budget to consider getting a higher-refresh display.


----------



## Majin SSJ Eric

High refresh rate monitors are still much more dependent on the GPU than the CPU in modern games though. As AdoredTV pointed out, the games themselves are designed for 1080p / 60Hz so the emphasis is on more textures, polygons, tessellation, etc rather than ensuring games run at 300 FPS. If you're doing it right you should always be GPU limited in gaming, at least in my opinion.


----------



## SuperZan

Exactly. I'm running 1440p 144Hz at the moment, but I still push for visuals first rather than limiting things to try to reach that 144Hz peak. If I can hit it, great. It makes a huge difference in games (multiplayer especially) that support it. At 1440p, though, you've got to work to be CPU limited.

1080p 120-144Hz is still safe Ryzen territory, IMO, as things shape up with RAM speed and microcode updates and newer games take Ryzen's design into account. 1080p 240Hz is firmly 7700k territory, but even as a 144Hz user I can't understand the necessity of 240Hz. I'm not shouting it down or anything, but I think it's quite a niche.


----------



## Ultracarpet

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> High refresh rate monitors are still much more dependent on the GPU than the CPU in modern games though. As AdoredTV pointed out, the games themselves are designed for 1080p / 60Hz so the emphasis is on more textures, polygons, tessellation, etc rather than ensuring games run at 300 FPS. If you're doing it right you should always be GPU limited in gaming, at least in my opinion.


I think what I'm trying to say is that Ryzen is as capable of high-refresh-rate gaming as Piledriver was of 60Hz gaming. 99% of the time you will find yourself GPU bound. To me, Ryzen really opens the door to high refresh for everyone that was on old Intel CPUs or any previous AMD CPU.

I agree with you, but I also think testing north of 60Hz is important... to a point. Like 180fps vs 150fps, no one is going to be able to tell the difference. Where the rub comes in, and I understand this is pretty much the whole point of the video, is when noobs are building computers and fork over twice as much as they need to for a 7700K to game at 60Hz, because the 720p reviews showed it 20% faster than Ryzen.


----------



## mohit9206

Quote:


> Originally Posted by *Alwrath*
> 
> I had an i5 760 quad-core for 7 years. It's time to upgrade to 8 cores. Heck, I know a lot of gamers who have been gaming on 5820Ks for years now, because they know the quad-core is dying and they want to be prepared for games that use more cores. We're already seeing many titles use more than 4 cores / 8 threads, and in 4 years, if you're on a quad-core, you will most likely see a performance drop.
> 
> You're living in the past, man. You're talkin' about a clown from the 60's, man!!!!!!!!!!!!! ( random Seinfeld quote )
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The quad-core had its day. It's so 2007. It's time to move on! Embrace change! Enjoy your $350 4-core and we'll enjoy our $330 8-cores that are superior to your little quad-core in every way, especially for people like me who game at 4K.


I'm still on a Pentium 2C/2T.


----------



## artemis2307

Yes, quad-core days are coming to an end, but not soon, especially for a 4.8GHz 4690K quad-core.


----------



## fleetfeather

Quote:


> Originally Posted by *artemis2307*
> 
> Yes, quad-core days are coming to an end, but not soon, especially for a 4.8GHz 4690K quad-core.


Well, you're wrong


----------



## comagnum

I have a 4.8GHz quad and I'm sure I can get a couple good years out of it. People point to BF1 and say that an i5 isn't enough, but if you tailor the game to your specs, there aren't issues. I play at 144Hz and I rarely, if ever, drop below that with tweaked settings in BF1. Sure, I'd rather have 8+ threads, but to say a highly clocked quad can't cut it is wrong.


----------



## epic1337

Quote:


> Originally Posted by *mohit9206*
> 
> I'm still on a Pentium 2C/2T.


Pretty much still 100% usable; hell, you can even somewhat game on it if you aren't using _their_ standard.

People in general simply forget that PCs aren't glorified gaming consoles.


----------



## Mad Pistol

I'm glad that the update for Ashes of the Singularity showed such an improvement for Ryzen. Hopefully that will show consumers that Ryzen is a safe bet.


----------



## daviejams

Quote:


> Originally Posted by *comagnum*
> 
> I have a 4.8GHz quad and I'm sure I can get a couple good years out of it. People point to BF1 and say that an i5 isn't enough, but if you tailor the game to your specs, there aren't issues. I play at 144Hz and I rarely, if ever, drop below that with tweaked settings in BF1. Sure, I'd rather have 8+ threads, but to say a highly clocked quad can't cut it is wrong.


My highly clocked Skylake i5 can't cut it in games like Deus Ex: Mankind Divided, Mafia 3, and hell, even No Man's Sky: 95-100% CPU usage at points in those games.


----------



## mouacyk

Quote:


> Originally Posted by *SuperZan*
> 
> Exactly. I'm running 1440p 144Hz at the moment, but I still push for visuals first rather than limiting things to try to reach that 144Hz peak. If I can hit it, great. It makes a huge difference in games (multiplayer especially) that support it. At 1440p, though, you've got to work to be CPU limited.
> 
> 1080p 120-144Hz is still safe Ryzen territory, IMO, as things shape up with RAM speed and microcode updates and newer games take Ryzen's design into account. 1080p 240Hz is firmly 7700k territory, but even as a 144Hz user I can't understand the necessity of 240Hz. I'm not shouting it down or anything, but I think it's quite a niche.


There is a very specific technology that requires high refresh (and therefore high fps) to work on LCD panels without causing seizures. OLED can't come soon enough, so we can junk this light-strobing crap. However, to sustain 120Hz, visual details will need to be sacrificed no matter what, and a powerful CPU is needed to keep that fps as the minimum as often as possible.


----------



## 7850K

this is interesting
https://www.reddit.com/r/Amd/comments/62cfi2/quake_2_ryzen_vs_core_i7_in_software_rendering/


----------



## Shatun-Bear

Quote:


> Originally Posted by *Mad Pistol*
> 
> I'm glad that the update for Ashes of the Singularity showed such an improvement for Ryzen. Hopefully that will show consumers that Ryzen is a safe bet.


Yep, it isn't just a small perf increase, it's a huge 20-30% as well, basically making Ryzen indistinguishable from Intel in this game and with more improvements to come.

I keep seeing people say 'but it's only one game and not all devs will optimize for their games', but this doesn't stand up, because the only games that get benched these days are the AAA titles from the biggest developers. If an outfit as small as Oxide needs just 400 man-hours to optimise for Ryzen (which is hardly anything), the big devs will easily have the manpower.

On top of that, the smaller titles from the smaller dev houses tend to be very undemanding, so they'll already run at hundreds of fps anyway; it really doesn't matter if they aren't updated for Ryzen. I imagine once AMD helps the likes of DICE, id, etc. (the small number of big players in the market) optimise for the Zeppelin arch, the improvements will carry over to successive titles from said developers. In short, it's not the uphill task some are trying to make it seem to get Ryzen running optimally on the games that matter, which is Battlefield, Battlefront, RDR, Deus Ex, Doom, Overwatch, etc.


----------



## sumitlian

Quote:


> Originally Posted by *7850K*
> 
> this is interesting
> https://www.reddit.com/r/Amd/comments/62cfi2/quake_2_ryzen_vs_core_i7_in_software_rendering/


No surprise. If you look at Ryzen's instruction tables, you'll see that Ryzen is about 100% or more faster than even Kaby Lake, IPC-wise, in many of the SSE and AVX-128 based instructions.


----------



## Blameless

Quote:


> Originally Posted by *sumitlian*
> 
> No surprise. If you look at Ryzen's instruction tables, you'll see that Ryzen is about 100% or more faster than even Kaby Lake, IPC-wise, in many of the SSE and AVX-128 based instructions.


I'd be willing to bet the larger caches have more to do with Quake II software rendering performance than any instruction execution costs.

Would be interesting to see how it does on other processors.

Also, Quake II predates, and does not utilize, SSE.


----------



## STEvil

Quote:


> Originally Posted by *Blameless*
> 
> I'd be willing to bet the larger caches have more to do with Quake II software rendering performance than any instruction execution costs.
> 
> Would be interesting to see how it does on other processors.
> 
> Also, Quake II predates, and does not utilize, SSE.


His custom OpenGL software renderer does.


----------



## Ceadderman

Is it selfish for me to want sales to tank enough for me to build my own super computer on the cheap? Imagine 6 to 12 1800x systems Folding away. My PPD would go up exponentially.









~Ceadder


----------



## ChronoBodi

Quote:


> Originally Posted by *Ceadderman*
> 
> Is it selfish for me to want sales to tank enough for me to build my own super computer on the cheap? Imagine 6 to 12 1800x systems Folding away. My PPD would go up exponentially.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Eh, cheap 1070s will do for that purpose... in 2020.


----------



## Pro3ootector

http://www.techspot.com/news/68740-ashes-singularity-dota-2-optimized-amd-ryzen-cpus.html

Ryzen seems to get better and better.


----------



## sugarhell




----------



## Delphi

This is an interesting watch about Nvidia's DX12 Driver vs AMD's

https://www.youtube.com/watch?v=0tfTZjugDeg


----------



## Shatun-Bear

Quote:


> Originally Posted by *sugarhell*


This is great. Shows you how misleading some of the tech press benchmarks can be.

In summary, he finds that Ryzen + Nvidia graphics card benches are slower than Ryzen + AMD graphics cards; his CrossFire 480s are well ahead of an overclocked Titan X, it seems, which is crazy.

And he also says how sublimely smooth CrossFire 480s are in this game, but he doesn't know whether that is because of Ryzen, FreeSync, or the Polaris cards themselves.


----------



## IRobot23

Quote:


> Originally Posted by *Delphi*
> 
> This is an interesting watch about Nvidia's DX12 Driver vs AMD's
> 
> https://www.youtube.com/watch?v=0tfTZjugDeg
> 
> Edit: Was posted just above me >.<


Even with the DX11 NVIDIA driver, AMD vs. Intel CPU is very interesting.

*AMD GPU DX11*
R7 1800X = *50FPS*
i7 7700K = *55FPS*

_10% difference._

*NVIDIA GPU DX11*
R7 1800X = *60FPS*
i7 7700K = *80FPS*

_33% difference._

What is going on?
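For reference, the percentage gaps above are just the FPS delta divided by the slower result; a quick sketch using the figures as posted:

```python
# Relative FPS gap between the R7 1800X and i7 7700K results quoted above,
# computed as (faster - slower) / slower * 100.
def gap_pct(slower: float, faster: float) -> float:
    return (faster - slower) / slower * 100

print(f"AMD GPU, DX11:    {gap_pct(50, 55):.0f}%")  # ~10%
print(f"NVIDIA GPU, DX11: {gap_pct(60, 80):.0f}%")  # ~33%
```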


----------



## renx

This is what caught me off guard today. Totally unexpected, and I believe something is wrong here:

https://www.techpowerup.com/reviews/AMD/Ryzen_Memory_Analysis/

This has to be debunked. I've seen enough benchmarks, and I know this has to be flawed.


----------



## cssorkinman

Looks like someone needs to do an Intel/Nvidia SLI vs. all-AMD CrossFire benchmark comparison.

Might have to pick up another Fury


----------



## iRUSH

Quote:


> Originally Posted by *IRobot23*
> 
> Even with the DX11 NVIDIA driver, AMD vs. Intel CPU is very interesting.
> 
> *AMD GPU DX11*
> R7 1800X = *50FPS*
> i7 7700K = *55FPS*
> 
> _10% difference._
> 
> *NVIDIA GPU DX11*
> R7 1800X = *60FPS*
> i7 7700K = *80FPS*
> 
> _33% difference._
> 
> What is going on


Woah! I'd love to see more of this testing to verify how accurate this discrepancy is.


----------



## kd5151

Ryzen + Vega =


----------



## AcesAndDueces

Quote:


> Originally Posted by *renx*
> 
> This is what caught me off guard today. Totally unexpected, and I believe something is wrong here:
> 
> https://www.techpowerup.com/reviews/AMD/Ryzen_Memory_Analysis/
> 
> This has to be debunked. I've seen enough benchmarks, and I know this has to be flawed.


Not sure what you mean. What's wrong with that review?


----------



## mouacyk

Quote:


> Originally Posted by *AcesAndDueces*
> 
> Not sure what you mean. What's wrong with that review?


I have to agree. The results are in line with expectations. At lower resolutions, there is a near-linear scaling with memory increases. This scaling stops almost completely at 4K where the GPU becomes entirely the bottleneck. Some elaboration on what's wrong with the review would be appreciated.


----------



## IRobot23

Quote:


> Originally Posted by *mouacyk*
> 
> I have to agree. The results are in line with expectations. At lower resolutions, there is a near-linear scaling with memory increases. This scaling stops almost completely at 4K where the GPU becomes entirely the bottleneck. Some elaboration on what's wrong with the review would be appreciated.


GPU bottleneck. Also, it's not near-linear scaling, not even close.
It doesn't match with DF...


----------



## renx

Quote:


> Originally Posted by *mouacyk*
> 
> I have to agree. The results are in line with expectations. At lower resolutions, there is a near-linear scaling with memory increases. This scaling stops almost completely at 4K where the GPU becomes entirely the bottleneck. Some elaboration on what's wrong with the review would be appreciated.


A 3-5% average gain from 2133 to 3200MHz?
Even TPU states that they're surprised by those results, which do not align with anything I've seen so far.
Ryzen scales better than that with faster RAM.


----------



## Taylor121

Quote:


> Originally Posted by *renx*
> 
> This is what caught me off guard today. Totally unexpected, and I believe something is wrong here:
> 
> https://www.techpowerup.com/reviews/AMD/Ryzen_Memory_Analysis/
> 
> This has to be debunked. I've seen enough benchmarks, and I know this has to be flawed.


For some games, such as Fallout 4, faster memory makes a very big difference; in other games, it doesn't. This review seems to make a case for higher-speed memory for certain games. It also did not include minimum 1% / 0.1% frame rates, where faster memory may make a larger difference.


----------



## mouacyk

Quote:


> Originally Posted by *IRobot23*
> 
> GPU bottleneck. Also, it's not near-linear scaling, not even close.
> It doesn't match with DF...


Can I get a link, please? The one I found doesn't have as many memory speeds being benchmarked. If TPU's 1080p results in general aren't linear, then describe them otherwise. I'm not interested in mere interjections.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Shatun-Bear*
> 
> This is great. Shows you how misleading some of the tech press benchmarks can be.
> 
> In summary, he finds that Ryzen + Nvidia graphics card benches are slower than Ryzen + AMD graphics cards; his CrossFire 480s are well ahead of an overclocked Titan X, it seems, which is crazy.
> 
> And he also says how sublimely smooth CrossFire 480s are in this game, but he doesn't know whether that is because of Ryzen, FreeSync, or the Polaris cards themselves.


It is in DX12 that Ryzen suffers with Nvidia GPUs, because NV has poor DX12 drivers. They really can't make much better DX12 drivers, as their hardware is poor at DX12 and can't take advantage of more CPU cores, even if the game engine allows it.


----------



## sugarhell

Quote:


> Originally Posted by *Ha-Nocri*
> 
> It is in DX12 that Ryzen suffers with Nvidia GPUs, because NV has poor DX12 drivers. They really can't make much better DX12 drivers, as their hardware is poor at DX12 and can't take advantage of more CPU cores, even if the game engine allows it.


Let's not go there.

Ryzen is a new architecture. It will take time until software/ drivers and APIs will be ready for Ryzen.


----------



## Ha-Nocri

Quote:


> Originally Posted by *sugarhell*
> 
> Let's not go there.
> 
> Ryzen is a new architecture. It will take time until software/ drivers and APIs will be ready for Ryzen.


Did you watch the video, especially the second part, where he introduces two 480s in CrossFire? Why do you think there is such an uplift in performance using an AMD CPU and GPUs vs. an AMD CPU and an NV GPU?

Pascal's architecture is not parallel; parallelism is simulated, as we have known since Pascal's launch.


----------



## sugarhell

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Did you watch the video, especially the second part, where he introduces two 480s in CrossFire? Why do you think there is such an uplift in performance using an AMD CPU and GPUs vs. an AMD CPU and an NV GPU?


The 'poor DX12 drivers' comment, on a game where we know DX12 in general is hit or miss...

Probably the DX12 drivers are unoptimized for Ryzen in this game.


----------



## AcesAndDueces

Quote:


> Originally Posted by *Taylor121*
> 
> For some games, such as Fallout 4, faster memory makes a very big difference; in other games, it doesn't. This review seems to make a case for higher-speed memory for certain games. It also did not include minimum 1% / 0.1% frame rates, where faster memory may make a larger difference.


I see, and after looking further, you are correct. I actually did my own test last night: I went from 2400 to 3200 and recorded the difference. Will post results in here. I can tell you this: I have a GTX 970 and ran close to real-life settings, so my results shouldn't be as dramatic due to the GPU. But they were actually more dramatic.


----------



## mouacyk

Quote:


> Originally Posted by *renx*
> 
> A 3-5% average gain from 2133 to 3200MHz?
> Even TPU states that they're surprised by those results, which do not align with anything I've seen so far.
> *Ryzen scales better than that with faster RAM.*


This review doesn't really help to prove that point, because it seemed like the GTX 1080 was limiting all benchmarks. We have to assume medium or high graphics settings, since this wasn't mentioned. When you're trying to test RAM+CPU, it's never a good idea to run the GPU anywhere near full utilization, which is unclear in this case because all resolutions were 1080p+ with assumed high settings. They were probably going for "realistic" benching, as Earthdog defended continuously in the discussion.

However, given the unknowns around this new CPU, I would have preferred pure RAM+CPU testing, removing all GPU bottlenecks -- 720p on a Titan X Pascal. Only this way can we reveal the true potential of this new series of CPUs. This point was made by a TPU poster: https://www.techpowerup.com/forums/threads/amd-ryzen-memory-analysis-20-apps-17-games-up-to-4k.231924/#post-3630312
Quote:


> Testing only at 1080p and up, it's being hidden by GPU limiting, which can kick in and out as different scenes are rendered, so you don't really know how fast it is.


----------



## AcesAndDueces

Got my new memory; sharing my results. These are mostly my real-life settings in games.

My setup is a Ryzen 1700 at 3.7GHz on all cores (waiting on a bracket, then I'll go higher) -- Giga Aorus X370 G5 -- GTX 970 OC.

Two memory configs tested: Trident @ 2400C16 (the fastest the Hynix sticks would go) vs. Trident @ 3200C16 (the Aorus G5 currently only has a strap for up to 3200 and no BCLK yet; hopefully I'll get 3600 with a new BIOS). Also can probably get [email protected] but haven't messed with it yet.

| Benchmark | Settings | 2400C16 (Min / Avg) | 3200C16 (Min / Avg) |
|---|---|---|---|
| War Thunder | 3840x2160, max, no blur, FXAA only | 72.8 / 83.7 | 74.7 / 86.6 |
| For Honor | 1080p, mixed med/high | 90.91 / 117.2 | 97.55 / 126.9 |
| For Honor | 4K | 29.1 / 37.2 | 30.22 / 37.8 |
| BF1 | 1080p, mostly high/ultra | 57.8 / 79.2 | 63.9 / 89.4 |

3DMark Fire Strike: 2400C16 10,450 / Graphics 12,001 / Physics 16,532 / Combined 4,145 vs. 3200C16 10,845 / Graphics 12,025 / Physics 18,883 (nice jump) / Combined 4,569.

Blender file: 2400C16 25.26s vs. 3200C16 25.1s.

I think only having a GTX 970 and actually running real-life settings had me more GPU bound, but I'm really happy with the improvement. BF1 looks like I can crank up another setting. I am happy as long as I am holding 60FPS 95% of the time; the occasional drop just tells me I am getting the most eye candy I can out of my gear.

This rig will end up with a Vega or a GTX 1080; just waiting to see numbers before I decide.
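To put those numbers in perspective, the relative gains from 2400C16 to 3200C16 can be sketched quickly (using only the averages and scores as posted above):

```python
# Percent gain going from 2400C16 to 3200C16, from the posted results.
def gain(before: float, after: float) -> float:
    return (after - before) / before * 100

results = {
    "War Thunder 4K avg":  (83.7, 86.6),
    "For Honor 1080p avg": (117.2, 126.9),
    "BF1 1080p avg":       (79.2, 89.4),
    "Fire Strike Physics": (16532, 18883),
}
for name, (before, after) in results.items():
    print(f"{name}: +{gain(before, after):.1f}%")
```

The BF1 average, for example, works out to roughly a 13% improvement, and the Fire Strike Physics score to about 14%, which is well above the 3-5% figure being debated earlier in the thread.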


----------



## ZealotKi11er

Quote:


> Originally Posted by *sugarhell*
> 
> The 'poor DX12 drivers' comment, on a game where we know DX12 in general is hit or miss...
> 
> Probably the DX12 drivers are unoptimized for Ryzen in this game.


The 6900K works fine in this game, so it's Ryzen only.
Also, could it be that Nvidia is indirectly hindering AMD CPUs so that AMD makes less money and has less to spend on GPUs?


----------



## teh-yeti

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The 6900K works fine in this game, so it's Ryzen only.
> Also, could it be that Nvidia is indirectly hindering AMD CPUs so that AMD makes less money and has less money for GPUs?


Though that technically is not outside the realm of possibility, I highly doubt it. An increase in people building new Ryzen rigs means more people buying nVidia GPUs. I have a feeling that nVidia is just as excited for Ryzen as anyone else.


----------



## ZealotKi11er

Quote:


> Originally Posted by *teh-yeti*
> 
> Though that technically is not outside the realm of possibility I highly doubt it. An increase in people building new Ryzen rigs means more people buying nVidia GPUs. I have a feeling that nVidia is just as excited for Ryzen as anyone else.


I am sure Ryzen buyers are likely to go Vega once it's out. The funny thing is that Nvidia released a DX12 driver recently.


----------



## teh-yeti

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am sure Ryzen buyers are likely to go Vega once it's out. The funny thing is that Nvidia released a DX12 driver recently.


But is there evidence of that? Sure there are lots of people on this forum that will probably do that, but many of those people were already going to get Vega.


----------



## ZealotKi11er

Quote:


> Originally Posted by *teh-yeti*
> 
> But is there evidence of that? Sure there are lots of people on this forum that will probably do that, but many of those people were already going to get Vega.


I like to believe that people who buy Intel CPUs buy them by default, and people who buy AMD CPUs buy them after research. Same thing with Nvidia vs AMD GPUs.


----------



## Max78

Some extremely interesting information here: AMD video cards look to be able to use Ryzen more efficiently than Nvidia cards.

https://www.youtube.com/watch?v=0tfTZjugDeg

Lol, guess I'm a bit late to the party...


----------



## kaseki

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am sure Ryzen buyers are likely to go Vega once it's out. The funny thing is that Nvidia released a DX12 driver recently.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> I like to believe that people who buy Intel CPUs buy them by default, and people who buy AMD CPUs buy them after research. Same thing with Nvidia vs AMD GPUs.


Some generalizations are best avoided. Some AMD CPU buyers do so because they don't like to buy from Intel. Some nVidia GPU buyers do so because with proprietary drivers nVidia performs better in Linux (although AMD is catching up).


----------



## ryboto

Quote:


> Originally Posted by *teh-yeti*
> 
> But is there evidence of that? Sure there are lots of people on this forum that will probably do that, but many of those people were already going to get Vega.


Well, I'm evidence of one person waiting... Once Vega and more mITX AM4 boards are released, I'll build a Ryzen 7 system.


----------



## Slink3Slyde

For the last 5 years, anyone who did any research would clearly have bought an FX 8320.


----------



## kd5151

Quote:


> Originally Posted by *ryboto*
> 
> Well, I'm evidence of 1 person waiting...Once Vega and more mITX AM4 boards are released I'll build a Ryzen 7 system..


Me 2. Just not on the mitx part.


----------



## teh-yeti

Quote:


> Originally Posted by *ryboto*
> 
> Well, I'm evidence of 1 person waiting...Once Vega and more mITX AM4 boards are released I'll build a Ryzen 7 system..


I mean, it's anecdotal evidence, but I see your point. The point I'm trying to make is that a for-profit company would not intentionally provide a worse experience for its customers just because they wanted an AMD CPU. Though a lot of people will buy Vega, there will still be enough people pairing nvidia cards with Ryzen that it would be counterproductive to give inferior support on that platform on purpose.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Slink3Slyde*
> 
> For the last 5 years, anyone who did any research would clearly have bought an FX 8320.


Nope. If there was any chance of anyone buying an AMD CPU or GPU, it's because they knew what they were buying.


----------



## Slink3Slyde

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope. If there was any chance of anyone buying an AMD CPU or GPU, it's because they knew what they were buying.


I'm pretty sure they sold a fair few FX chips, at least in dodgy pre-builts, because 8 cores x 4GHz = 32GHz.

Meh.


----------



## kd5151

Joker Productions has Ryzen 5. Reviews incoming.


----------



## FLCLimax

Cool build video and benchmarks...but look at those graphs. Blue bar = 144fps and red bar = 142fps


----------



## Shau76434

Quote:


> Originally Posted by *FLCLimax*
> 
> Cool build video and benchmarks...but look at those graphs. Blue bar = 144fps and red bar = 142fps


Apparently he said in the comments that he made those graphs that way so that they would be "easier" to read. He deleted the video, which isn't surprising.


----------



## Majin SSJ Eric

Those graphs are ridiculous. You'd think the 7700K is more than twice as fast as the 1700! And the scale isn't even the same in the graphs either. Sweet build though.


----------



## Mad Pistol

Just watched AdoredTV's Ryzen of the Tomb Raider here: https://www.youtube.com/watch?v=0tfTZjugDeg

It's looking like Ryzen + Vega will be the ticket. Seriously... AMD is on to something, and if they keep this up... god help Intel and Nvidia.


----------



## SuperZan

Quote:


> Originally Posted by *Mad Pistol*
> 
> Just watched AdoredTV's Ryzen of the Tomb Raider here: https://www.youtube.com/watch?v=0tfTZjugDeg
> 
> It's looking like Ryzen + Vega will be the ticket. Seriously... AMD is on to something, and if they keep this up... god help Intel and Nvidia.


Thou shalt have no other gods before the Lord thy Huang.

Very impressive stuff, though, I agree.


----------



## PureBlackFire

Quote:


> Originally Posted by *FLCLimax*
> 
> Cool build video and benchmarks...but look at those graphs. Blue bar = 144fps and red bar = 142fps


wth is this from? lol.


----------



## teh-yeti

Quote:


> Originally Posted by *PureBlackFire*
> 
> wth is this from? lol.


Now that the video has been taken down, who was it that put it up? Out of curiosity.


----------



## Shau76434

Quote:


> Originally Posted by *teh-yeti*
> 
> Now that the video has been taken down, who was it that put it up? Out of curiosity.


TotallySilencedTech

He was having a fit on Twitter as well, saying the people who were calling his video out were fanboys, etc. He deleted those tweets as well.


----------



## budgetgamer120

Quote:


> Originally Posted by *Barca130*
> 
> TotallySilencedTech
> 
> He was having a fit on Twitter as well, saying the people who were calling his video out were fanboys, etc. He deleted those tweets as well.


Why were people calling him out?


----------



## Shau76434

Quote:


> Originally Posted by *budgetgamer120*
> 
> Why were people calling him out?


The way he scaled those graphs.


----------



## BobiBolivia

Quote:


> Originally Posted by *budgetgamer120*
> 
> Why were people calling him out?


Because he made a 2 FPS difference look like a 3x advantage...

Quite stupid, actually...


----------



## budgetgamer120

Quote:


> Originally Posted by *Barca130*
> 
> The way he scaled those graphs.


Quote:


> Originally Posted by *BobiBolivia*
> 
> Because he made 2 FPS look like 3x advantage...
> 
> Quite stupid actually...


Lol I see


----------



## AcesAndDueces

I was aware from day 1 of how well Ryzen scales with memory speed. It felt so crappy to watch most of these reviewers just leave memory at 2133. The kits they were sent would get to 2666 or 2933; they just flat-out didn't take the time to do it.


----------



## iRUSH

Quote:


> Originally Posted by *AcesAndDueces*
> 
> I was aware from day 1 of how well Ryzen scales with memory speed. It felt so crappy to watch most of these reviewers just leave memory at 2133. The kits they were sent would get to 2666 or 2933; they just flat-out didn't take the time to do it.


It's an odd situation. It shouldn't take much time compared to Intel's plug and play. On top of that, everyone is rushing to get their Ryzen info out for clicks/$.

Ultimately, getting anything above standard DDR4 speed is still a sketchy effort. It's not consistent enough yet. It'll be 3 more months before this is behind us.

Coming from someone whose favorite part of a PC is the CPU and its RAM, I had no doubt that Ryzen, or any other CPU since Haswell, scales nicely with fast RAM in CPU-intensive programs.

Yes, previous CPUs scale well too, but Haswell started to do it well first.

Ultimately I agree that Ryzen no doubt scales nicely with fast DDR. It's the unfortunate growing pains that paint AMD in a poor light half the time.


----------



## ZealotKi11er

Quote:


> Originally Posted by *iRUSH*
> 
> It's an odd situation. It shouldn't take much time compared to Intel's plug and play. On top of that, everyone is rushing to get their Ryzen info out for clicks/$.
> 
> Ultimately, getting anything above standard DDR4 speed is still a sketchy effort. It's not consistent enough yet. It'll be 3 more months before this is behind us.
> 
> Coming from someone whose favorite part of a PC is the CPU and its RAM, I had no doubt that Ryzen, or any other CPU since Haswell, scales nicely with fast RAM in CPU-intensive programs.
> 
> Yes, previous CPUs scale well too, but Haswell started to do it well first.
> 
> Ultimately I agree that Ryzen no doubt scales nicely with fast DDR. It's the unfortunate growing pains that paint AMD in a poor light half the time.


Nope. RAM speed has always made a difference; you just had to test properly. My Ivy Bridge CPU benefits greatly from DDR3-2400.


----------



## randomizer

Quote:


> Originally Posted by *BobiBolivia*
> 
> Because he made 2 FPS look like 3x advantage...
> 
> Quite stupid actually...


Do people not read the numbers any more? Maybe tabular results are the best option.


----------



## FLCLimax

So he re-uploaded the video with fixed graphs...but he is showcasing a budget build with a RX 480 and showing benchmark graphs done with a GTX 1080. Lol.


----------



## ZealotKi11er

Quote:


> Originally Posted by *FLCLimax*
> 
> So he re-uploaded the video with fixed graphs...but he is showcasing a budget build with a RX 480 and showing benchmark graphs done with a GTX 1080. Lol.


Who is this Youtuber?


----------



## budgetgamer120

Quote:


> Originally Posted by *FLCLimax*
> 
> So he re-uploaded the video with fixed graphs...but he is showcasing a budget build with a RX 480 and showing benchmark graphs done with a GTX 1080. Lol.


Link?


----------



## FLCLimax




----------



## budgetgamer120

Quote:


> Originally Posted by *FLCLimax*


I would like to get free parts


----------



## SuperZan

I'm enjoying his commentary. "I'm sorry if I triggered anyone". It's not about triggering, you clownshoe, it's about the fact that graphs are usually proportionally relative for a reason.


----------



## budgetgamer120

Quote:


> Originally Posted by *SuperZan*
> 
> I'm enjoying his commentary. "I'm sorry if I triggered anyone". It's not about triggering, you clownshoe, it's about the fact that graphs are usually proportionally relative for a reason.


His conclusion for the video is really bad


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope. RAM speed has always made a difference; you just had to test properly. My Ivy Bridge CPU benefits greatly from DDR3-2400.


Agreed. In synthetics at least my 4930K scales very well with 2400 DDR3 vs 1866 (my RAM's stock speed).


----------



## Scotty99

lol clownshoe


----------



## Majin SSJ Eric

The new graphs are fair enough; they should've been the original graphs all along. That said, I think his (limited) results speak very well for the 1700, considering the 7700K was clocked to 5GHz (a 1.2GHz, or 32%, clock-speed advantage). Ryzen is a very powerful gaming platform, just not quite as good as Kaby Lake when clock speed is factored in.


----------



## iRUSH

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope. RAM speed has always made a difference; you just had to test properly. My Ivy Bridge CPU benefits greatly from DDR3-2400.


I did say that previous CPUs scale well too, lol.

Your Ivy Bridge getting to 2400 was either lucky, or you fiddled with it beyond what any normal person would do.

It was known among enthusiasts that one of Haswell's best features was its IMC and its ability to handle fast RAM.


----------



## BobiBolivia

Quote:


> Originally Posted by *randomizer*
> 
> Do people not read the numbers any more? Maybe tabular results are the best option.


I read the numbers too. But take this: there will be some group of users who will look at the graph and say:

- Oooh, Intel is still 3x faster than the newest AMD? AMD sucks; here Intel, take my money now!

Graphs can be powerful if you have enough users who don't care about understanding/reading them.


----------



## randomizer

Quote:


> Originally Posted by *BobiBolivia*
> 
> I read the numbers too. But take this: there will be some group of users who will look at the graph and say:
> 
> - Oooh, Intel is still 3x faster than the newest AMD? AMD sucks; here Intel, take my money now!
> 
> Graphs can be powerful if you have enough users who don't care about understanding/reading them.


I'm aware of the effect of graphs with truncated axes. It's very useful in some situations. This is just not one of them and I don't endorse it in this case. Having said that, if you don't take the time to read the graph properly then I don't think you are capable of making an informed purchasing decision anyway.


----------



## BobiBolivia

Quote:


> Originally Posted by *randomizer*
> 
> I'm aware of the effect of graphs with truncated axes. *It's very useful in some situations.* This is just not one of them and I don't endorse it in this case. Having said that, *if you don't take the time to read the graph properly then I don't think you are capable of making an informed purchasing decision anyway.*


1) Can you show me such a situation? I generally hate graphs whose scale starts in a weird way, like "not from 0" and so on.
2) I agree.


----------



## kaseki

Back in the day when graphs were drawn by hand, it was expected that the origin was zero and that when an axis scale was abnormal a "squiggle" '~' was put in the axis to show disruption of the value progression. The plot lines would also be squiggled to further emphasize the gap in the data.

I am not aware of any capability in Excel or its freeware counterparts to do that, unless this capability was added very recently. Perhaps the long-lost DISSPLA plotting software (used with Fortran) or counterpart had that feature, I don't recall. In today's software plotting world dominated by Excel, one would have to Photoshop two different graph axis presentations together to get the desired effect.

Adding an obvious tag notice about the axis would seem to be the minimum requirement to claim that the presentation was not biased.


----------



## Ceadderman

It's been a while, but I believe you can add the break, and it is still a requirement in graphs and charts today. If you cannot do it in the newest Excel, any dunderhead with an Internet connection certainly understands slash slash //. So when setting up your chart/graph, go one over and one under, use that row for your separator, and add the slashes to the over and under at their correct locations so you get this =//= effect. Not at all perfect, but yeah.









~Ceadder


----------



## randomizer

Quote:


> Originally Posted by *BobiBolivia*
> 
> 1) Can you show me such a situation?


I don't know if I can _show_ you, but I can tell you. When relatively small variances in data points are material it is useful to emphasise them. It is even more useful when you have more than two data points as you can see relative magnitude more clearly (eg. point A is halfway between points B and C, even though the actual difference is a fraction of a percent).

For a chart showing framerates it is completely useless because small variances are immaterial.
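
A small numeric sketch of the truncation effect being discussed (the 144 vs 142 fps figures come from the video earlier in the thread; the 141 fps baseline is a made-up example of an axis that doesn't start at zero):

```python
# How a truncated y-axis exaggerates a small gap: the apparent bar-height
# ratio is measured from the axis baseline, not from zero.
def apparent_ratio(a, b, baseline=0.0):
    """Visual ratio of two bars plotted on an axis that starts at `baseline`."""
    return (a - baseline) / (b - baseline)

# 144 vs 142 fps on an honest zero-based axis: a ~1.4% difference.
print(round(apparent_ratio(144, 142), 3))
# The same gap on an axis that starts at 141 fps looks like a 3x lead.
print(round(apparent_ratio(144, 142, baseline=141), 3))
```

Which is exactly how a 2 FPS difference can be made to look like a 3x advantage.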


----------



## Charcharo

It seems AMD lost market share in March. At least if Steam is to be believed.


----------



## Blameless

Quote:


> Originally Posted by *Charcharo*
> 
> It seems AMD lost market share in March. At least if Steam is to be believed.


Perfectly plausible.

There is always a lag, and for market share to really be reflective of product quality will likely take years of success for outmoded misconceptions to be largely overcome.

If you look at old CPU market share charts, the closest AMD ever came to matching Intel was less than six months before five years of general AMD superiority _ended_.


----------



## kaseki

Quote:


> Originally Posted by *randomizer*
> 
> I don't know if I can _show_ you, but I can tell you. When relatively small variances in data points are material it is useful to emphasise them. It is even more useful when you have more than two data points as you can see relative magnitude more clearly (eg. point A is halfway between points B and C, even though the actual difference is a fraction of a percent).
> 
> For a chart showing framerates it is completely useless because small variances are immaterial.


It has just occurred to me that one area where offset origins are common is in financial market and analysis data, whether published by a Federal Reserve Bank, one of its owning banks, or other market analysts. I suspect the technique is used because financial skimming is a fraction of a fraction process, where small variations in values are important, as noted by *randomizer* above.


----------



## AcesAndDueces

Quote:


> Originally Posted by *iRUSH*
> 
> It's an odd situation. It shouldn't take much time compared to Intel's plug and play. On top of that, everyone is rushing to get their Ryzen info out for clicks/$.
> 
> Ultimately, getting anything above standard DDR4 speed is still a sketchy effort. It's not consistent enough yet. It'll be 3 more months before this is behind us.
> 
> Coming from someone whose favorite part of a PC is the CPU and its RAM, I had no doubt that Ryzen, or any other CPU since Haswell, scales nicely with fast RAM in CPU-intensive programs.
> 
> Yes, previous CPUs scale well too, but Haswell started to do it well first.
> 
> Ultimately I agree that Ryzen no doubt scales nicely with fast DDR. It's the unfortunate growing pains that paint AMD in a poor light half the time.


None of those really scale like Ryzen, though. Yes, some CPU-intense workloads scale well. The difference is that faster DDR4 does little or nothing in most games (the only place Ryzen really needed a boost) on most Intel CPUs. I checked Haswell, since you mentioned it, and here you go; the numbers are the percentage increase going from DDR3-1333 to 3000 in discrete-GPU gaming: Dirt 3, less than 1% (104.5 > 104.75); BioShock Infinite, 1.7% (61.9 > 62.9); Tomb Raider, less than 1% (47 > 47.25). Ryzen often sees a 20% boost just from going from 2400 to 3200 (an 800MHz jump), while those Haswell results come from a 1667MHz increase. I've seen many reviews similar to this. I'm not denying that Intel CPUs scale with memory in some situations, just not nearly as well as Ryzen, especially in gaming. Remember, all you get when you OC Intel memory is more memory bandwidth. With Ryzen's Infinity Fabric you get that plus further improvement in cache and CCX speeds, as well as improved PCI-E performance; all of those are tied to memory speed on Ryzen.
I just have a strange feeling that if the shoe were on the other foot and Intel had launched a CPU that gained that much from memory speed, reviewers would have taken the time to get it right. Also, I don't think any reviewer expects a brand-new platform to be plug and play. It's been a while, because Intel has ridden the Core i train for years, but every time they launch a brand-new platform it has issues too, and reviewers still took the time to get the most out of it.

http://www.anandtech.com/show/7364/memory-scaling-on-haswell/7
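
One way to see why the scaling is outsized: on first-generation Ryzen the Infinity Fabric clock is coupled 1:1 to the real memory clock (half the DDR data rate), so faster RAM also speeds up CCX-to-CCX traffic. A minimal sketch of that relationship (the 1:1 coupling is my summary of Zen 1 behaviour, not a figure from this thread):

```python
# On Zen 1 the Infinity Fabric clock tracks the real memory clock,
# which is half the DDR4 data rate (DDR = double data rate).
def fabric_mhz(ddr_rate):
    """Approximate fabric clock (MHz) for a given DDR4 data rate (MT/s)."""
    return ddr_rate // 2

print(fabric_mhz(2400))  # 1200 MHz fabric
print(fabric_mhz(3200))  # 1600 MHz fabric, a ~33% fabric-clock uplift
```

So a 2400-to-3200 memory overclock isn't just bandwidth on Ryzen; it's also a sizeable cross-CCX interconnect overclock, which Intel's decoupled ring/mesh designs don't get.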


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Charcharo*
> 
> It seems AMD lost market share in March. At least if Steam is to be believed.


For God's sake, Ryzen has only been out a month, and there were supply issues with chips and (more crucially) boards, just as there always are with a new product. You can't look at market-share comparisons the very month a product is released; that's just daft. Let's see what that looks like a year from now; then we will be able to draw conclusions about how effective Ryzen has been for AMD.


----------



## Charcharo

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> For God's sake, Ryzen has only been out a month, and there were supply issues with chips and (more crucially) boards, just as there always are with a new product. You can't look at market-share comparisons the very month a product is released; that's just daft. Let's see what that looks like a year from now; then we will be able to draw conclusions about how effective Ryzen has been for AMD.


I hope you are right.

AM4+ with PCIe 4, release already!


----------



## 7850K

Quote:


> Originally Posted by *Charcharo*
> 
> It seems AMD lost market share in March. *At least if Steam is to be believed.*


Can someone break down Valve's hardware polling policy?
Every time it's brought up, people talk about how long it's been since they were polled. I can't remember the last time I was; more than 5 years, probably. It seems like a useless metric.


----------



## budgetgamer120

Quote:


> Originally Posted by *7850K*
> 
> Can someone break down Valve's hardware polling policy?
> Every time it's brought up, people talk about how long it's been since they were polled. I can't remember the last time I was; more than 5 years, probably. It seems like a useless metric.


I've only been asked once in my entire time using a PC to take the survey, and I declined it.


----------



## Ceadderman

Cannot remember the last time I was polled. But I will say that their system keeps track of my hardware, because I am always magically up to date on my drivers.









~Ceadder


----------



## mAs81

I've been polled quite a few times, if my memory doesn't fail me... Didn't think it'd hurt, I guess.


----------



## randomizer

Quote:


> Originally Posted by *7850K*
> 
> Can someone break down Valve's hardware polling policy?
> Every time it's brought up, people talk about how long it's been since they were polled. I can't remember the last time I was; more than 5 years, probably. It seems like a useless metric.


I think it has value for long term trends, but it's useless for making judgements from month to month.


----------



## Blameless

Quote:


> Originally Posted by *7850K*
> 
> Can someone break down Valve's hardware polling policy?
> Every time it's brought up, people talk about how long it's been since they were polled. I can't remember the last time I was; more than 5 years, probably. It seems like a useless metric.


You only need to poll a small, randomly selected portion of the whole to get statistically accurate figures.

~1% of the Steam user base would be plenty.


----------



## Ceadderman

Quote:


> Originally Posted by *Blameless*
> 
> Quote:
> 
> 
> 
> Originally Posted by *7850K*
> 
> can someone break down valve's hardware polling policy?
> every time it's brought up people talk about how long it's been since they've been polled. I can't remember the last time I was, greater than 5 years probably. It seems like a useless metric.
> 
> 
> 
> You only need to poll a small, randomly selected portion of the whole to get statistically accurate figures.
> 
> ~1% of the Steam user base would be plenty.

The average Steam user is what, < 25 years of age? 1% of all Steam users is not satisfactory imho.









~Ceadder


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> You only need to poll a small, randomly selected portion of the whole to get statistically accurate figures.
> 
> ~1% of the Steam user base would be plenty.


Not really, because 95% of Steam users play non-demanding games.


----------



## AmericanLoco

Quote:


> Originally Posted by *Ceadderman*
> 
> The average Steam user is what, < 25 years of age? 1% of all Steam users is not satisfactory imho.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Your opinion is not how statistics work.

With a userbase of 100 million people, Valve would only have to poll 1% of its users for the survey to be accurate within .1% with a 99% confidence level.
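
That sample-size claim can be sanity-checked with the standard margin-of-error formula for a proportion (a rough sketch assuming simple random sampling, the normal approximation, and worst-case p = 0.5; z = 2.576 corresponds to 99% confidence):

```python
import math

# Margin of error for an estimated proportion at a given confidence level,
# using the normal approximation with worst-case variance (p = 0.5).
def margin_of_error(n, z=2.576, p=0.5):
    """Half-width of the confidence interval for a sampled proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# 1% of a 100-million user base:
n = 1_000_000
print(f"{margin_of_error(n) * 100:.2f}%")  # ≈ 0.13%
```

So a 1% sample of 100 million users gives a margin on the order of ±0.1%, in line with the figure above.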


----------



## Capt

It seems that Ryzen performs better the higher the resolution is. It seems to outperform the 6900K at 1440p but lose at 1080p, probably because there's less stress on the CPU at 1440p.

http://www.relaxedtech.com/reviews/amd/ryzen-7-1700/7


----------



## Ceadderman

Quote:


> Originally Posted by *AmericanLoco*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> The average Steam user is what, < 25 years of age? 1% of all Steam users is not satisfactory imho.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your opinion is not how statistics work.
> 
> With a userbase of 100 million people, Valve would only have to poll 1% of its users for the survey to be accurate within .1% with a 99% confidence level.

How many users of that 1% are gainfully employed, what, 1/4 of them? That would be a conservative estimate. So how is my opinion invalid? That was the point I was making.









~Ceadder


----------



## AmericanLoco

Quote:


> Originally Posted by *Ceadderman*
> 
> How many users of that 1% are gainfully employed, what, 1/4 of them? That would be a conservative estimate. So how is my opinion invalid? That was the point I was making.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


The 1% sample accurately represents the population. Once the sample is large enough, sampling more people doesn't change the results appreciably. That is how statistics works.

I see your point, but the question the Steam Hardware Survey answers is not the question you are asking. The Steam Hardware Survey asks "What hardware do our users have?" The question you are asking is "What hardware do employed users over the age of 25 have?" That question would be almost impossible to answer, and it is honestly not really relevant.


----------



## Arturo.Zise

Quote:


> Originally Posted by *Capt*
> 
> It seems that Ryzen performs better the higher the resolution is. It seems to outperform the 6900k in 1440p but loses in 1080p, probably because there's less stress on the cpu at 1440p.
> 
> http://www.relaxedtech.com/reviews/amd/ryzen-7-1700/7


So the Ryzen R7 would make for a nice productivity chip with some 4K gaming thrown in as well. Hopefully the motherboard makers will have some nice BIOS updates around the time Vega launches; then I might have my next AMD build planned out.


----------



## Blameless

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not really because 95% of Steam users play non demanding games.


Still representative of the broader gaming populace.
Quote:


> Originally Posted by *AmericanLoco*
> 
> Your opinion is not how statistics work.
> 
> With a userbase of 100 million people, Valve would only have to poll 1% of its users for the survey to be accurate within .1% with a 99% confidence level.


Yep.
Quote:


> Originally Posted by *Ceadderman*
> 
> So how is my opinion invalid? That was the point I am making.


Your opinion that the sample size is too small is invalid because a larger sample size almost certainly would not increase the accuracy of the results by any appreciable degree.


----------



## Majin SSJ Eric

I admit up front I know nothing about statistics, but I can't wrap my brain around how a 1% poll rate is in any way accurate. How do you know that 1% wasn't just a select outlier group in the population? What if all respondents were Nvidia employees, as a clearly silly extreme? How can demographics not come into play whatsoever when taking your samples into account? Oh, and I'd hazard a guess that polling for something like major political elections is conducted a bit more thoroughly and accurately than the Steam survey, and yet they manage to get those polls wrong all the time. I've just never placed much faith in statistics in general, but as I said, I don't know anything about it.


----------



## fleetfeather

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I admit up front I know nothing about statistics but i can't wrap my brain around how 1% poll rate is in any way accurate at all? How do you know that 1% wasn't just a select outlier group in the population? What if all respondents were Nvidia employees as a clearly silly extreme? How can demographics not come into play whatsoever when taking your samples into account? Oh, and I'd hazard a guess that polling for something like major political elections are just a bit more thoroughly conducted and accurate than the Steam survey and yet they manage to get those polls wrong all the time. Just never have placed much faith in statistics in general personally but as I said I don't know anything about it.


You're correct, and what you're referring to is "sampling error."


----------



## AmericanLoco

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I admit up front I know nothing about statistics but i can't wrap my brain around how 1% poll rate is in any way accurate at all? How do you know that 1% wasn't just a select outlier group in the population? What if all respondents were Nvidia employees as a clearly silly extreme? How can demographics not come into play whatsoever when taking your samples into account? Oh, and I'd hazard a guess that polling for something like major political elections are just a bit more thoroughly conducted and accurate than the Steam survey and yet they manage to get those polls wrong all the time. Just never have placed much faith in statistics in general personally but as I said I don't know anything about it.


That's where the "random sample" comes in. A truly random sample should be completely representative of the population. Obviously, even with a perfectly random sample, there is the chance you could get all Nvidia employees or something, but that's why Valve takes a sample every month. This is why I get a chuckle when people complain about the Steam survey not asking them month after month what their hardware is, then question its accuracy. If Valve sampled the same people every month, the survey would stop being random, and the collected data would be useless for informing Valve about the state of their entire user base.

As far as demographics coming into play: you need to look at how that fits into the question Valve is asking. Their question is this: "What hardware are our users running?" What Valve cares about ultimately is the hardware distribution of its user base. For this particular question, they don't care where you live, how much money you have, what the color of your skin is, etc.

The biggest source of sampling bias I see with the Steam Hardware Survey is the fact that it's opt-in. But Valve is a smart company, and I'm sure they've accounted for that.
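
The "a random sample tracks the population" point is easy to demonstrate with a toy simulation (the 70/30 vendor split and 100,000-user population below are made-up illustration numbers, not Steam data):

```python
import random

# Give each of 100,000 simulated users a CPU vendor, then poll a random 1%
# of them and compare the sampled share to the true share.
random.seed(42)  # fixed seed so the sketch is reproducible
population = ["Intel"] * 70_000 + ["AMD"] * 30_000

sample = random.sample(population, 1_000)  # a 1% random sample
amd_share = sample.count("AMD") / len(sample)

print(f"true AMD share: 0.300, sampled share: {amd_share:.3f}")
```

Even at 1%, the sampled share lands within a couple of percentage points of the true 30%, which is the whole argument for small random samples.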


----------



## spikeSP

Quote:


> Originally Posted by *Capt*
> 
> It seems that Ryzen performs better the higher the resolution is. It seems to outperform the 6900k in 1440p but loses in 1080p, probably because there's less stress on the cpu at 1440p.
> 
> http://www.relaxedtech.com/reviews/amd/ryzen-7-1700/7


That's interesting. Why does it end up outperforming when there's less strain on the CPU?


----------



## Ceadderman

Quote:


> Originally Posted by *AmericanLoco*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Majin SSJ Eric*
> 
> I admit up front I know nothing about statistics but i can't wrap my brain around how 1% poll rate is in any way accurate at all? How do you know that 1% wasn't just a select outlier group in the population? What if all respondents were Nvidia employees as a clearly silly extreme? How can demographics not come into play whatsoever when taking your samples into account? Oh, and I'd hazard a guess that polling for something like major political elections are just a bit more thoroughly conducted and accurate than the Steam survey and yet they manage to get those polls wrong all the time. Just never have placed much faith in statistics in general personally but as I said I don't know anything about it.
> 
> 
> 
> That's where the "Random Sample" comes in. A truly random sample should be completely representative of the population. Obviously even with a perfectly random sample, there is the chance you could get all nvidia employees or something - but that's why Valve takes a sample every month. This is why I get a chuckle when people complain about the Steam survey not asking them month after month what their hardware is then questioning the accuracy. If Valve sampled the same people every month, the survey would stop being random, and the collected data would be useless for informing Valve about the state of their entire user base.
> 
> As far as demographics coming into play - you need to look how that fits into the question Valve is asking. Their question is this: "What hardware are our users running". What Valve cares about ultimately is the hardware distribution of their user base. For this particular question, they don't care where you live, how much money you have, what the color of your skin is, etc...
> 
> The biggest source of sampling bias I see with the Steam Hardware Survey, is the fact that it's opt-in. But Valve is a smart company, and I'm sure they've accounted for that.

When it comes to hardware, I think you're not understanding that Steam's polling is compiled based on driver updates. That's a 100% sample rate. It includes Apple.

However, what I was also referring to was market-share polling. That cannot be a perfect sampling because, as I pointed out, many Steam users are under the age bracket that actively invests in the hardware market. A 1% sample is a poor sampling model for the business world.

~Ceadder


----------



## AmericanLoco

The Steam Hardware Survey is not based on driver updates. It's based on a random, opt-in survey box that appears and collects basic system info. You're also making some bold assumptions about Steam's user base that you cannot possibly know, and drawing inferences (i.e. that if you're young you don't have the latest hardware) that are really unfounded.

I really don't care if you _think_ a 1% sampling rate of a 100-million-strong population is weak, because I know you're wrong, and any intro high school stats course would tell you as much. A sample size of 1 million is huge. You could survey just 400 people and get a margin of error of about 5%.
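The "400 people, within 5%" bit is just the standard margin-of-error formula turned around; a quick sketch (assuming the usual 95% confidence level and the worst-case 50/50 split, which is where the margin is largest):

```python
import math

def sample_size(margin, z=1.96, p=0.5):
    """Smallest n whose margin of error at confidence z stays within `margin`.

    Uses the textbook formula n = (z/E)^2 * p*(1-p); p=0.5 is the worst case.
    """
    return math.ceil((z / margin) ** 2 * p * (1 - p))

print(sample_size(0.05))  # -> 385 respondents for +/-5%
print(sample_size(0.01))  # -> 9604 respondents for +/-1%
```

Note the population size never appears in the formula: ~385 random respondents give roughly the same +/-5% whether the user base is 100 thousand or 100 million, which is why a 1% slice of Steam is overkill.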


----------



## Kuivamaa

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I admit up front I know nothing about statistics but i can't wrap my brain around how 1% poll rate is in any way accurate at all? How do you know that 1% wasn't just a select outlier group in the population? What if all respondents were Nvidia employees as a clearly silly extreme? How can demographics not come into play whatsoever when taking your samples into account? Oh, and I'd hazard a guess that polling for something like major political elections are just a bit more thoroughly conducted and accurate than the Steam survey and yet they manage to get those polls wrong all the time. Just never have placed much faith in statistics in general personally but as I said I don't know anything about it.


You can't. If that 1% is all North American users, for example, and for whatever reason they buy a certain brand of processor way more than the rest of the globe does, you've just got yourself skewed results. Sampling matters, and it has to be done really carefully. The more you know about your base, the better the sampling can get.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I admit up front I know nothing about statistics but i can't wrap my brain around how 1% poll rate is in any way accurate at all? How do you know that 1% wasn't just a select outlier group in the population? What if all respondents were Nvidia employees as a clearly silly extreme? How can demographics not come into play whatsoever when taking your samples into account? Oh, and I'd hazard a guess that polling for something like major political elections are just a bit more thoroughly conducted and accurate than the Steam survey and yet they manage to get those polls wrong all the time. Just never have placed much faith in statistics in general personally but as I said I don't know anything about it.


I'll second your admission, but I'll say one thing I do know:
When they're polling for political surveys, there is a human element to the survey. Say I was asking people what their favourite song was; different people on a different day might have different answers. It's the same with political polling. People might be embarrassed to admit that they were really going to vote for the Nazi or Communist candidate, say, so they might not be honest, and the poll is then inaccurate. Or they might change their minds between polling and the vote.

The Steam survey reads your hardware directly from your system; there's no exaggeration or outside influence because the specs are just read straight from your machine. As to your other point about outliers, I believe that's why they take the samples randomly, so as to be representative, and as others can explain far better than me, there's surely been a lot of testing as to the sample size considered large enough to be representative of the whole.

I guess it may be skewed if there are more Steam members from North America, for example, as Kuivamaa said above, but if there are more users from NA then I'd think that just means the survey is only accurate within Steam's own user base. I assume Steam is the biggest online games storefront, so I'd say it's most representative of what gamers are using worldwide.


----------



## Kuivamaa

Quote:


> Originally Posted by *Slink3Slyde*
> 
> I'll second your admission, but Ill say one thing I do know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When they are polling for political surveys, there is a human element to the survey. Say I was asking people what there favourite song was, now different people on a different day might have different answers. Its the same with political polling. People might be embarrassed to admit that they were really going to vote for the Nazi or Communist candidate say, so they might not be honest and the poll is then inaccurate. Or they might change their minds between polling and the vote.
> 
> The Steam survey reads your hardware directly from your system, there's no exaggeration or outside influence because the specs are just read straight from your machine. As to your other point about outliers, I believe that's why they take the samples randomly so as to be representative, and as others can explain far better then me there's surely been a lot of testing as to the size of a sample that is considered large enough to be representative of the whole.
> 
> I guess it may be skewed if there are more Steam members from North America for example as Kuivamaa said above, but if theres more users from NA then I would think that just means that the survey is only accurate within Steams own user base. I assume that Steam is the biggest online games front so I would say its most representative of what gamers are using worldwide.


There are a lot of variables to consider. For example, if Steam only runs surveys in the afternoons and Intel users are more active than AMD ones at night, then there will be an error in favor of AMD (just a hypothetical example). Just tossing a survey at 1% of users can hide many traps.


----------



## Slink3Slyde

Quote:


> Originally Posted by *Kuivamaa*
> 
> There are a lot of variables to consider. For example if Steam only does surveys in the afternoons and intel users are more active than AMD ones at nights, then there will be an error in favor of AMD ( just a hypothetical example). Just tossing a survey to 1% of users can hide many traps.


I'm really not an expert on statistics and surveys so I won't argue. Just saying it how I see it from what little I do know.


----------



## Ultracarpet

Yeaaaa, I don't put too much weight in the Steam surveys... tons of potential issues with the sample. If the spec submission were mandatory it would alleviate most of the issues, but as is, it pretty much completely depends on who actually allows Steam to take that data.

Users are a terrible sample because they're unreliable in their reporting. Figures that come from upstream are usually much more reliable, as companies are accountable for reporting accurate information. Dunno if there's anything for CPUs similar to what Jon Peddie Research does for GPU shipments, but that tends to be a good source for market share in the GPU space.


----------



## Brutuz

Quote:


> Originally Posted by *Blameless*
> 
> You only need to poll small, randomly selected, portion of the whole, to get statistically accurate figures.
> 
> ~1% of the Steam user base would be plenty.


It depends on whether those users are the kind who actually matter to that data. AAA devs may think they can't push GPUs because Intel has such a high Steam market share, when in reality it's just people who have Steam for one game getting the survey. That, and some platforms seem to show the survey more often than others; I've seen it once on native Linux, but plenty of times under Windows.

I guess my point is that it includes people who won't figure in the decision-making of a AAA company, because they literally own all of the Train Simulator games, or just AoE2 HD, or something, because that's literally all they play. Heck, a significant portion of Civ gamers are 40-60 year old men who decidedly aren't gamers in any normal sense of the word; they just really like Civ. They'd be included in this, even if they've simply got a system built for Civ (i.e. the GPU doesn't matter too much, worry about CPU performance for those turn times) or a workstation or similar.
Quote:


> Originally Posted by *fleetfeather*
> 
> You're correct. And what you're referring to is "sampling error"


And there's a lot of (admittedly anecdotal) evidence to suggest that Steam shows the survey popup more often on Windows boxes than on Linux. Probably due to a bug or coding difference of some description, because Valve do support Linux fairly well.
Quote:


> Originally Posted by *Ultracarpet*
> 
> Yeaaaa, I don't put too much weight into the steam survey's... tons of potential issues with the sample. If the spec submission was mandatory it would alleviate most of the issues, but as is, it pretty much completely depends on who actually allows steam to take that data.


Even if the opt-in popup appeared for every user once a month, it'd massively increase the reliability of the data. You can be sure that while some users would opt-out, most would opt-in and Valve could simply put a header on the page saying "Results collected from x Steam accounts out of a total of y Steam accounts." You'd still have the other issue I mentioned above, however.


----------



## doritos93

Steam should just collect every user's data, with consent provided by the EULA.

Hey, Google does it!


----------



## KarathKasun

Quote:


> Originally Posted by *Brutuz*
> 
> It depends if those users are the kind to actually matter to that data. AAA devs may think that they can't push GPUs because Intel has such a high Steam marketshare, when in reality its just people who have Steam for one game getting the survey. That and different platforms seem to show the survey more often than not. I've seen it once on native Linux, but plenty of times under Windows.
> 
> I guess that my point is it includes people who won't be in the decision making for a AAA company as they literally own all of the Train Simulator games or just AoE2HD or something because that's literally all that they play. Heck, a significant portion of Civ gamers are 40-60 year old men who decidedly aren't gamers in any normal sense of the word, they just really like Civ. They'd be included in this, even if they've simply got a system built for Civ (ie. GPU doesn't matter too much, worry about CPU performance for those turn times) or a workstation or similar.
> And there is a lot of (admittedly, anecdotal) evidence to suggest that Steam shows the survey popup more often on Windows boxes than Linux. Probably due to a bug or coding difference of some description because Valve do support Linux fairly well..
> Even if the opt-in popup appeared for every user once a month, it'd massively increase the reliability of the data. You can be sure that while some users would opt-out, most would opt-in and Valve could simply put a header on the page saying "Results collected from x Steam accounts out of a total of y Steam accounts." You'd still have the other issue I mentioned above, however.


I have set up a few Debian-based gaming systems recently. Every single one of them has been hit by the Steam survey, but only in the WINE clients. I haven't seen the SHS hit a native Linux client in a long time. Seems kinda fishy.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Slink3Slyde*
> 
> I'll second your admission, but Ill say one thing I do know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When they are polling for political surveys, there is a human element to the survey. Say I was asking people what there favourite song was, now different people on a different day might have different answers. Its the same with political polling. People might be embarrassed to admit that they were really going to vote for the Nazi or Communist candidate say, so they might not be honest and the poll is then inaccurate. Or they might change their minds between polling and the vote.
> 
> The Steam survey reads your hardware directly from your system, there's no exaggeration or outside influence because the specs are just read straight from your machine. As to your other point about outliers, I believe that's why they take the samples randomly so as to be representative, and as others can explain far better then me there's surely been a lot of testing as to the size of a sample that is considered large enough to be representative of the whole.
> 
> I guess it may be skewed if there are more Steam members from North America for example as Kuivamaa said above, but if theres more users from NA then I would think that just means that the survey is only accurate within Steams own user base. I assume that Steam is the biggest online games front so I would say its most representative of what gamers are using worldwide.


Good points made. I wonder why they can't just poll everybody using Steam? There's no reason the game client can't auto-detect the hardware in every machine running it and compile that data. Why a sampling percentage at all? Why not just post the actual hardware config of all users?


----------



## huzzug

Compute


----------



## Blameless

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Good points made. I wonder why they can't just poll everybody using Steam? I mean, there's no reason that the game client can't auto-detect the hardware in every machine using it and then compile that data. Why a sampling percentage at all? Why not just post the actual hardware config of all users?


Significantly more data to handle and significantly more work for no practical benefit.


----------



## Carniflex

About statistics and measurements, to make it as simple as possible:

As a very rough rule of thumb, your "error" is approximately 1/sqrt(n), where n is the sample size.
n=10 -> approx 30% "error"
n=100 -> approx 10% "error"
n=1000 -> approx 3% "error" (the sample size normally used in political polling)
n=10,000 -> approx 1% "error"
n=100k -> approx 0.3% "error"
n=1M -> approx 0.1% "error", and so on.

There are some assumptions in there: a normal (Gaussian) distribution, for a start, which is not always the case in nature, and that all the events/measurements are independent. I'm not really getting into confidence intervals here, or what "sigma" means.

But as a rough rule of thumb it's a good estimate for the majority of things in life and nature. What it means, to give a very simple example, is that if you see something happen 1000 times, you can predict that when the same thing happens for the 1001st time, the outcome will be the same as in the previous 1000 cases with roughly 97% probability.

The real stuff behind this very rough rule of thumb is of course more complex, and it's not my main field.

In practice, approx n=30 is "good enough" for many purposes; for example, for the average to start settling down when throwing a 6-sided die or flipping a coin for heads/tails.
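The numbers in the table above are literally just 1/sqrt(n); a tiny sketch to reproduce them (same caveats as above):

```python
import math

def rough_error(n):
    """Rule of thumb: relative sampling "error" ~ 1/sqrt(n)."""
    return 1 / math.sqrt(n)

for n in (10, 100, 1_000, 10_000, 100_000, 1_000_000):
    print(f'n={n:>9,}: ~{rough_error(n):.1%} "error"')
```

n=1,000 comes out to ~3.2%, matching the "approx 3%" line for political polling.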


----------



## budgetgamer120

Another victory for Ryzen.


----------



## iRUSH

Quote:


> Originally Posted by *budgetgamer120*
> 
> Another victory for Ryzen.


I watched this earlier today! Very nice!


----------



## Scotty99

I'm surprised they only went down to "faster"; I have mine set to "fast" (which is more intensive and looks better). I had 0 dropped frames or any issues at 4500 bitrate, and the stream looks crisp AF.

They kinda nerfed Ryzen by only selecting faster.


----------



## sumitlian

Quote:


> Originally Posted by *Blameless*
> 
> Significantly more data to handle and significantly more work for no practical benefit.


I haven't been asked to participate in the poll yet, so I don't know much about it, but it looks like Steam works as a hybrid app (desktop app + web app). And web APIs aren't designed to directly detect the client's hardware info unless the host also provides a separate plugin for it and users install it by consent!?


----------



## Xuper

Quote:


> Originally Posted by *Scotty99*
> 
> I am surprised why they only went down to "faster", i have mine set to "fast" (which is more intensive and looks better). I had 0 dropped frames or any issues with 4500 bitrate, and stream looks crisp AF.
> 
> They kinda nerfed ryzen by only selecting faster.


Encoding load goes faster > veryfast > superfast.

So no, they used a heavier preset on Ryzen than on the others. Linus used the worse scenario for Ryzen, and eased up that much on the 7700K.

Look at this: https://youtu.be/jludqTnPpnU?t=341


----------



## azanimefan

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I admit up front I know nothing about statistics but i can't wrap my brain around how 1% poll rate is in any way accurate at all? How do you know that 1% wasn't just a select outlier group in the population? What if all respondents were Nvidia employees as a clearly silly extreme? How can demographics not come into play whatsoever when taking your samples into account? Oh, and I'd hazard a guess that polling for something like major political elections are just a bit more thoroughly conducted and accurate than the Steam survey and yet they manage to get those polls wrong all the time. Just never have placed much faith in statistics in general personally but as I said I don't know anything about it.


Your concerns about the polling population ARE justified. While, in general, assuming a completely random sample of the population, the following post is 100% mathematically correct...





Quote:


> Originally Posted by *Carniflex*
> 
> About statistics and measurements. To make it as simple as possible.
> 
> As a very rough rule of thumb, your "error" is approximately 1/sqrt where n is the sample size.
> n=10 -> approx 30% "error"
> n=100 -> approx 10% "error"
> n=1000 -> approx 3% "error" (that is the sample size normally used in political polling)
> n=10 000 -> approx 1% "error"
> n=100k -> approx 0.3% "error"
> n=1M -> approx 0.1% "error" and so on.
> 
> There is some assumptions in there. Normal (Gauss) distribution, for a start, which is not always in the case in nature. Not talking about confidence intervals either really or what is the meaning of "sigma". Assuming all the events/measurement are independent.
> 
> But as rough rule of thumb its a good estimate for majority of things in life and nature. The meaning of this thing is following, to give a very simple example. If you see something happening 1000 times then you can predict that when the same thing happens for the 1001'th time the outcome will be the same with 97% probability as in the previous 1000 cases.
> 
> The real stuff behind this very rough "rule of thumb" is ofc more complex and its not my main field.
> 
> In practice approx n=30 is "good enough". For example, for the average to start to settle down when throwing 6 sided dice or throwing coin for heads/tails thingy.






...this is only true if the group you're questioning is representative of the population you want to poll.

For example, in political polling (via telephone) it's long been known that the sample will lean more conservative/Republican if the poll is taken from 12pm-3pm on a weekday, as that's when you're most likely to reach stay-at-home mothers or retired Americans, who are a more Republican and socially conservative population than the population as a whole. Meanwhile, polls taken Friday night through Sunday morning will lean way more liberal than the population, because conservatives usually spend that time with family and don't answer the telephone (don't ask me to explain how this is true; I just recall these facts from a political statistics and polling class I took in college, and who knows if it's still valid). Furthermore, the VOTING public as a whole is more conservative than the non-voting public, meaning GENERAL and RANDOM polls of American citizens will lean further left than what you'd see at the ballot box (which is why political polls should always be of REGISTERED VOTERS).

Furthermore, how polls are worded greatly influences the answers. For example, if you ask a group of people whether someone should be allowed to smoke in church, you get a generally negative response; if you instead ask whether churches should allow someone to smoke, you get a generally positive response from a similar sample. Two vastly different results for what is essentially and logically the same question. Why? Because of the psychology of it. In the first version people think about "themselves" and whether they want the person sitting next to them smoking, while in the second people think of the "principle" of freedom and (at least in American polling samples) tend to favor more rights and fewer rules. It's an exercise in group psychology.

Which is why you need to be cautious about the sources, methods, and even the wording of polls, regardless of sample size. That's why internet polls are so untrustworthy: you have ZERO control over the people answering the questions, and worse, they're not a random group of people. They're a pre-selected group who found your poll, which makes them a unique population, not a random one. Would someone not interested in computers or games find this post on this web page? Nope. So even just polling the people who read this post will NOT resemble the population of the USA in any way whatsoever, even if we can find 10,000 people to read it and answer.
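The pre-selection trap is easy to show with numbers; a toy simulation with an invented two-region population (the regions, shares, and counts are all made up for illustration):

```python
import random

rng = random.Random(1)

# Invented population of 100,000 users in two regions with different tastes:
# "NA" (30% of users): 80% prefer vendor A.  "rest" (70%): 40% prefer A.
population = (
    [("NA", "A")] * 24_000 + [("NA", "B")] * 6_000 +
    [("rest", "A")] * 28_000 + [("rest", "B")] * 42_000
)
true_share = sum(1 for _, v in population if v == "A") / len(population)

def share_a(sample):
    """Fraction of vendor-A users in a sample."""
    return sum(1 for _, v in sample if v == "A") / len(sample)

# A properly random 1% sample lands near the true share...
random_sample = rng.sample(population, 1_000)
# ...but a same-sized sample drawn only from NA users (self-selected,
# or popup-biased, say) badly overstates vendor A.
na_only = [u for u in population if u[0] == "NA"]
biased_sample = rng.sample(na_only, 1_000)

print(f"true A share:   {true_share:.2f}")
print(f"random sample:  {share_a(random_sample):.2f}")
print(f"NA-only sample: {share_a(biased_sample):.2f}")
```

Sample size does nothing to fix this: the biased estimate stays near 0.80 no matter how many NA-only users you poll, which is the whole problem with pre-selected populations.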


----------



## Malinkadink

Quote:


> Originally Posted by *Xuper*
> 
> Encoding load goes faster > veryfast > superfast.
> 
> So no, they used a heavier preset on Ryzen than on the others. Linus used the worse scenario for Ryzen, and eased up that much on the 7700K.
> 
> Look at this : https://youtu.be/jludqTnPpnU?t=341


I don't understand why they used superfast on the 6900K for 30 and 60 fps. A better comparison for 7700K vs 1800X would be to take either 720p 30 fps or 720p 60 fps as a baseline and test them at every quality level, from ultrafast all the way down to slower, and see which one starts dropping frames and stuttering first in each game. It's obvious the 7700K will lose each time, and it'll lose sooner the heavier the game is on the CPU. Still, it would give people some perspective on how each chip performs with various games at each preset.

Veryfast is the optimal preset for 99.99% of cases and will blow the doors off the hardware encoders. Going to faster, fast, and everything below that starts to put a heavy load on the CPU for minimal quality gains.


----------



## Scotty99

Quote:


> Originally Posted by *Malinkadink*
> 
> I don't understand why they used superfast on the 6900k for 30 and 60 fps. Really a better comparison if testing 7700k vs 1800X is to take either 720p 30 fps or 720p 60 fps as a baseline and test them at all quality levels from ultrafast all the way down to slower and see which one starts dropping frames and stuttering first in each game. It's obvious that the 7700k will lose each time and it'll lose sooner the heavier the game being tested is on the cpu. Still, it would give people some perspective how each chip performs with various games at each preset.
> 
> Veryfast is the optimal preset for 99.99% of cases and will blow the doors off the hardware encoders. Going to faster, fast, and everything below that starts to put a heavy load on the cpu with minimal quality gains.


I have mine set to fast; I can't even tell I'm streaming lol. If I tried that on my 2500K, I'd be sub-10 FPS in every game I play.


----------



## Malinkadink

Quote:


> Originally Posted by *Scotty99*
> 
> I have mine set to fast, i cant even tell im streaming lol. If i tried that on my 2500k, i would be sub 10 FPS on every game i play.


An i5 will not run fast well; hell, it won't even run veryfast well, depending on the game. On an i5 you're stuck using Quick Sync, which is pretty shoddy quality.

I did a test stream tonight of Overwatch at 1440p capped to 160 fps + x264 veryfast, and CPU usage peaked at 88% with Chrome open as well. Basically, an i7 is barely cutting it for gaming + streaming.


----------



## jprovido

bought an orange gtx 1080

yea I know the cooler sucks but I don't care

After a few BIOS updates I still can't get my Ripjaws V 3200MHz CL16 kit to run above 2400MHz. I've tightened the timings to CAS 13, but that doesn't seem to help (it seems like the Infinity Fabric wants faster RAM). Is it too soon to give up? I'm thinking of just buying a new kit, but last time I checked they're not cheap.


----------



## Scotty99

Quote:


> Originally Posted by *jprovido*
> 
> bought an orange gtx 1080
> 
> 
> 
> 
> 
> 
> 
> yea I know the cooler sucks but I don't care
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> after a few bios updates still can't get my ripjaws V 3200mhz CL16 to run above 2400mhz. I have tightened the timings to Cas13 but it doesn't seem like it helps (seems like the infinity fabric wants faster ram) is it too soon to give up? I'm thinking of just buying a new kit but last time I checked they're not cheap


I have the same RAM as you, also still at 2400. I'm just gonna wait it out 'cause I got these for a good price; it'll work eventually.


----------



## Malinkadink

Quote:


> Originally Posted by *jprovido*
> 
> bought an orange gtx 1080
> 
> 
> 
> 
> 
> 
> 
> yea I know the cooler sucks but I don't care
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> after a few bios updates still can't get my ripjaws V 3200mhz CL16 to run above 2400mhz. I have tightened the timings to Cas13 but it doesn't seem like it helps (seems like the infinity fabric wants faster ram) is it too soon to give up? I'm thinking of just buying a new kit but last time I checked they're not cheap


You don't tighten the timings; if you want to increase memory speed, you need to loosen them. Try 2667 with 18-16-16-38 timings.


----------



## renx

Quote:


> Originally Posted by *Malinkadink*
> 
> You don't tighten the timings if you want to increase the memory speed you need to loosen them. Try 2667 with 18-16-16-38 timings.


I take it that he tightened the timings after giving up on the frequency overclock.


----------



## ZealotKi11er

Quote:


> Originally Posted by *jprovido*
> 
> bought an orange gtx 1080
> 
> 
> 
> 
> 
> 
> 
> yea I know the cooler sucks but I don't care
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> after a few bios updates still can't get my ripjaws V 3200mhz CL16 to run above 2400mhz. I have tightened the timings to Cas13 but it doesn't seem like it helps (seems like the infinity fabric wants faster ram) is it too soon to give up? I'm thinking of just buying a new kit but last time I checked they're not cheap


It's your MB.


----------



## budgetgamer120

New Ryzen review https://elchapuzasinformatico.com/2017/04/amd-ryzen-5-1400-review/5/


----------



## teh-yeti

Quote:


> Originally Posted by *jprovido*
> 
> bought an orange gtx 1080
> 
> 
> 
> 
> 
> 
> 
> yea I know the cooler sucks but I don't care
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> after a few bios updates still can't get my ripjaws V 3200mhz CL16 to run above 2400mhz. I have tightened the timings to Cas13 but it doesn't seem like it helps (seems like the infinity fabric wants faster ram) is it too soon to give up? I'm thinking of just buying a new kit but last time I checked they're not cheap


ASUS B350 Prime mATX? Could be the problem. X370 seems to be getting more attention as far as RAM speed goes. Hopefully once Ryzen 5 launches and there's more focus from manufacturers promoting B350, there will be better support for faster RAM: at least 2667, if not 3000/2996 or whatever it is with Ryzen.


----------



## KarathKasun

Set ALL timings manually (SPD-reading software is a good start for getting all the subtiming values for your RAM) if you want faster memory/IF speeds. I've seen 2400 kits running at 3200 on Ryzen; you just have to set the latencies properly (in the 20-20-20-40 range if coming from 2400). There are calculators around the internet to generate the 3200 timings from 2400 ones.

Even with horrid timings, 3200 is faster than 2400 on Ryzen.

I've also seen the MSI B350 Tomahawk running 3600, so there is that. Max speed depends on luck of the draw with the CPU and whether or not your vendor is putting the X370 UEFI fixes into its B350 UEFIs.

AFAIK, on Asus boards you need to increase RAM speed incrementally because the built-in link training is pretty bad at this point. I.e., set 2400 at your 3200 timings and reboot, then change the speed to 2667 and reboot, and so forth until you reach your desired speed. You have to do this at every cold boot as well. Actually, the same thing happens on Asus AM1 platforms.
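Those calculators mostly just hold the absolute latency in nanoseconds constant while scaling the clock; a rough sketch of the arithmetic (illustrative only; real kits also need subtimings and voltage tuned by hand):

```python
def scale_timing(cycles, old_mts, new_mts):
    """Scale one timing (in clock cycles) from old_mts to new_mts transfer rate.

    Latency in ns is cycles * 2000 / MT/s (DDR clocks at half the transfer
    rate), so holding ns constant means new = ceil(cycles * new_mts / old_mts).
    Integer ceiling division avoids float rounding surprises.
    """
    return -(-cycles * new_mts // old_mts)

# e.g. taking a DDR4-2400 15-15-15-35 kit up to 3200 MT/s:
print([scale_timing(t, 2400, 3200) for t in (15, 15, 15, 35)])  # -> [20, 20, 20, 47]
```

That 20-20-20 result lands in the same ballpark as the range quoted above for a 2400 kit pushed to 3200.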


----------



## Damn_Smooth

This should really be required reading.

http://www.tomshardware.com/forum/id-3378010/join-tom-hardware-amd-thursday-april-6th.html


----------



## jprovido

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It's your MB.


I've seen videos on YouTube getting 2900+ to 3200MHz on this board, obviously with a different kit though.

One sample is this one: https://www.youtube.com/watch?v=RNu-QiVr95M&t=197s
Quote:


> Originally Posted by *renx*
> 
> I take it that he tightened the timings after giving up on frequency overclock.


Yes, I just gave up and focused on tightening the timings lol.
Quote:


> Originally Posted by *teh-yeti*
> 
> ASUS B350 prime mATX? Could be the problem. X370 seems to be getting more attention as far as ram speed goes. Hopefully once Ryzen 5 launches and there is more focus from manufacturers promoting B350 there will be better support for faster ram. At least 2667 if not 3000/2996 or whatever it is with Ryzen.


ASUS has been pretty good about supporting this B350 motherboard. I'm on my 4th BIOS update now and all the things that needed a little ironing out have been addressed (wrong BIOS temps, voltage not showing up correctly, etc.). I got this kit from my main PC, my Kaby Lake system. I had 4 sticks of 8 GB 3200 MHz in that PC and didn't even need them all, so I decided to take two sticks and put them in the Ryzen rig. It's not been working so well atm.
Quote:


> Originally Posted by *KarathKasun*
> 
> Set ALL timings manually (SPD-reading software is a good start for getting all the subtiming values for your RAM) if you want faster memory/IF speeds. I've seen 2400 kits running at 3200 on Ryzen; you just have to set the latencies properly (in the 20-20-20-40 range if coming from 2400). There are calculators around the internet to generate 3200 timings from 2400 ones.
> 
> Even with horrid timings, 3200 is faster than 2400 on Ryzen.
> 
> I've also seen the MSI B350 Tomahawk running 3600, so there is that. Max speed depends on the luck of the draw with the CPU and on whether your vendor is putting the X370 UEFI fixes into its B350 UEFIs.
> 
> AFAIK, on Asus boards you need to increase RAM speed incrementally because the built-in link training is pretty bad at this point. I.e., set 2400 at your 3200 timings and reboot, then change the speed to 2667 and reboot, and so forth until you reach your desired speed. You have to do this at every cold boot as well. The same thing happens on Asus AM1 platforms, actually.


Mine won't budge at all. I even tried CAS 22 at 2666 at one point just to see if it would boot (I have a 3200 MHz CL16 kit). I have adjusted every voltage I can and I just can't get it above 2400 MHz.


----------



## KarathKasun

Quote:


> Originally Posted by *jprovido*
> 
> Mine won't budge at all. I even tried CAS 22 at 2666 at one point just to see if it would boot (I have a 3200 MHz CL16 kit). I have adjusted every voltage I can and I just can't get it above 2400 MHz.


Does it have tRFC and the other advanced timing options in the UEFI? Is there a 'compatibility mode' or something like that under the advanced memory timings?


----------



## ZealotKi11er

Quote:


> Originally Posted by *jprovido*
> 
> I've seen videos on youtube getting 2900+ to 3200mhz on this board. obviously with a different kit though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> one sample is this one https://www.youtube.com/watch?v=RNu-QiVr95M&t=197s
> yes. just gave up and focused on the tightening the timings lol
> ASUS has been pretty good with supporting this b350 motherboard. I'm on my 4th bios update now and all the things that needed a little bit of ironing out as been addressed now (wrong bios temps, voltage not showing up correctly etc.). I got this kit from my main pc, my kaby lake system. I had 4 sticks of 8gb 3200mhz on that pc and I don't even need it so I decided to just take two sticks and put it on the ryzen rig. it's not been working so good atm
> mine won't budge at all. even tried cas22 2666 at one point just to see if it boots (I have a 3200mhz cl16 kit) I have adjusted every voltage that I can and I just can't get it to budge above 2400mhz.


From someone who tests a lot of different CPUs with the same MB and RAM: he said the CPU has nothing to do with it. It is always the RAM or the MB.


----------



## Scotty99

I am also stuck at 2400; whatever you do, it won't go higher.

ASRock Killer

And this RAM:
https://www.newegg.com/Product/Product.aspx?Item=N82E16820231941

Luckily I got mine for 98 bucks, so being stuck at 2400 for a while does not bother me too much lol.


----------



## AlphaC

sisoft review

http://www.sisoftware.eu/2017/04/05/amd-ryzen-review-and-benchmarks-cpu/
"What a return of fortune from AMD! Despite a hurried launch and inevitable issues which will be fixed in time (e.g. Windows scheduler), Ryzen puts a strong performance beating Intel's previous top-of-the-range Skylake 6700K and Haswell-E 6820K into dust in most tests at a much cheaper price.

Of course there are setbacks, highly vectorised AVX2/FMA code greatly favour Intel's SIMD units and here Ryzen falls behind a bit; streaming algorithms can overload the 2 memory channels but then again Intel's mainstream platform has only 2 also. "


----------



## jprovido

Quote:


> Originally Posted by *Scotty99*
> 
> I am also stuck at 2400, whatever you do it wont go higher.
> 
> Asrock killer
> 
> And this ram:
> https://www.newegg.com/Product/Product.aspx?Item=N82E16820231941
> 
> Luckily i got mine for 98 bucks so stuck at 2400 for a while does not bother me too much lol.


Yea, but our kit kinda sucks with Ryzen. I'm trying really hard to be patient.

I'm doing my 30-day Ryzen challenge and have switched my main PC and the Ryzen PC in the living room (I believe it's been 6 days now).

Playing Dota 2 is a little bit annoying; I'm certain at 3200 MHz those minimums will improve. I'm getting drops to 80-90 fps in big fights. Aside from that game, every game I play gets a locked 144 fps.


----------



## Ultracarpet

Quote:


> Originally Posted by *jprovido*
> 
> yea but our kit kinda sucks with ryzen. I'm trying really hard to be patient
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm doing my 30 day ryzen challenge and switched my main pc and ryzen pc in the living room (I believe it's been 6 days now)
> 
> Playing Dota 2 is a little bit annoying; I'm certain at 3200 MHz those minimums will improve. I'm getting drops to 80-90 fps in big fights. Aside from that game, every game I play gets a locked 144 fps.


I think the 30 day challenge with Ryzen is actually just to see if you can get your memory to its rated speed in that time frame lolll.


----------



## Capt

Quote:


> Originally Posted by *spikeSP*
> 
> That's interesting. Why does it end up outperforming when there's less strain on the CPU?


I think it's because there's less stress on the CPU the higher the resolution is. But the 1700X does better than the 1700 in most games, regardless of resolution.

http://www.relaxedtech.com/reviews/amd/ryzen-7-1700x/4


----------



## Kuivamaa

Quote:


> Originally Posted by *Capt*
> 
> I think because there's less stress on the CPU the higher the resolution is.


This is one of the biggest misconceptions when it comes to gaming benchmarks. A lower pixel count DOES NOT put extra strain on the CPU. The problem starts when people compare, e.g., 250 fps output at 720p vs 70 fps at 1440p. Of course when the CPU produces more frames the load will be higher, but that introduces a new variable. 60 fps at 720p vs 60 fps at 1440p will have almost identical CPU load and strain.
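The arithmetic behind this point can be put in a toy model (the 4 ms per-frame CPU cost is an illustrative number, not a measurement): per-frame CPU work is roughly resolution-independent, so CPU load tracks framerate, not pixel count.

```python
def cpu_load(fps, cpu_ms_per_frame=4.0):
    """Fraction of each second the CPU spends on game logic and draw calls,
    assuming a resolution-independent per-frame cost."""
    return fps * cpu_ms_per_frame / 1000.0

# Same framerate -> same CPU load, regardless of resolution:
print(cpu_load(60))    # 0.24 at 720p and at 1440p alike
# The usual 720p-vs-1440p comparisons differ in fps, not in per-frame cost:
print(cpu_load(250))   # 1.0: at 250 fps the CPU itself becomes the limiter
```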


----------



## Sheyster

Quote:


> Originally Posted by *Kuivamaa*
> 
> This is one of the biggest misconceptions when it comes to gaming benchmarks. Lower pixel count DOES NOT put extra strain on the CPU. The problem starts when people compare eg 250fps output on 720p vs 70fps 1440p. Ofc when the CPU produces more frames the load will be higher but we introduce a new variable like that. 60fps on 720p vs 60fps 1440p will have almost identical CPU load and strain.


I think most people get this: more FPS = more CPU load; it's not really fewer pixels per se. BUT the root cause of the higher FPS is having fewer pixels to drive, hence more CPU load.


----------



## Kuivamaa

Quote:


> Originally Posted by *Sheyster*
> 
> I think most people get this; more FPS = more CPU load, it's not really less pixels per se. BUT the root cause of more FPS is less pixels to drive, hence more CPU load.


Still, it is wrong. If it is the higher framerate that creates the load, it should be communicated as such. Let's call a spade a spade. Upgrading the video card can and will increase CPU load at a given resolution by virtue of raising the framerate.


----------



## Ultracarpet

Quote:


> Originally Posted by *Kuivamaa*
> 
> Still, it is wrong. If it is the higher framerate that creates the load, it should be communicated as such. Let's call a spade a spade. Upgrading the video card can and will increase CPU load at a given resolution by virtue of raising the framerate.


But isn't the point that reducing the pixel count will increase the frame rate?


----------



## Kuivamaa

Quote:


> Originally Posted by *Ultracarpet*
> 
> But isn't the point that reducing the pixel count will increase the frame rate?


Potentially not, if the player is using v-sync or a frame limiter of any kind. You can even drop the resolution yet decrease the CPU load by raising detail and effects, if you reduce the framerate in the process. The only linear relationship here is more frames = more CPU load.


----------



## Ultracarpet

Quote:


> Originally Posted by *Kuivamaa*
> 
> Potentially not, if the player is using v-sync or a frame limiter of any kind. You can even drop the resolution yet decrease the CPU load by raising detail and effects, if you reduce the framerate in the process. The only linear relationship here is more frames = more CPU load.


Ok well yes, point taken.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kuivamaa*
> 
> Potentially not, if the player is using v-sync or a frame limiter of any kind. You can even drop the resolution yet decrease the CPU load by raising detail and effects, if you reduce the framerate in the process. The only linear relationship here is more frames = more CPU load.


As long as the CPU does a 60 fps minimum, it's all good. Once the CPU hits a 60 fps minimum, then for the next upgrade just consider the GPU only.


----------



## Cherryblue

Quote:


> Originally Posted by *jprovido*
> 
> yea but our kit kinda sucks with ryzen. I'm trying really hard to be patient
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm doing my 30 day ryzen challenge and switched my main pc and ryzen pc in the living room (I believe it's been 6 days now)
> 
> Playing Dota 2 is a little bit annoying; I'm certain at 3200 MHz those minimums will improve. I'm getting drops to 80-90 fps in big fights. Aside from that game, every game I play gets a locked 144 fps.


Is that using Vulkan? OpenGL? DirectX11/12?

If I remember correctly, the D2 engine supports all of them. You may see different minimum fps depending on which one is used.


----------



## Frugal

Ryzen 5 is released today, isn't it?
Anyone know what hour reviews start to be posted?


----------



## mickeykool

Quote:


> Originally Posted by *Frugal*
> 
> Ryzen 5 is released today isn't it ?
> Anyone know what hour reviews start to be posted ?


I believe at 9am EST.


----------



## pez

Newegg just sent out an email that it's released.

http://promotions.newegg.com/NEemail/Apr-0-2017/Ryzen_re45okvb_11/index-landing.html


----------



## Frugal

Quote:


> Originally Posted by *mickeykool*
> 
> I believe at 9am EST.


Thanks.


----------



## Pro3ootector

Quote:


> Originally Posted by *Frugal*
> 
> Ryzen 5 is released today isn't it ?
> Anyone know what hour reviews start to be posted ?


http://www.overclockersclub.com/reviews/amd_ryzen_5_1600x__1500x_processor/10.htm


----------



## VSG

Might be worth making a new thread for Ryzen 5. In the interim:

https://www.techpowerup.com/reviews/AMD/Ryzen_5_1600X/

https://www.techpowerup.com/reviews/AMD/Ryzen_5_1500X/


----------



## budgetgamer120

Quote:


> Originally Posted by *Pro3ootector*
> 
> http://www.overclockersclub.com/reviews/amd_ryzen_5_1600x__1500x_processor/10.htm


Doesn't look too good in that review.


----------



## jprovido

Quote:


> Originally Posted by *Scotty99*
> 
> I have the same ram as you, also still at 2400. Im just gonna wait it out cause i got these for a good price, itll work eventually


DUDEEE! I just updated to the latest BIOS on my board, with the AGESA update.

For the first time I went above 2400 MHz! 2666 MHz on our kit! There's still hope!!!!

It failed at 2933 MHz and 3200 MHz, but I'd take a small bump over nothing all day.


----------



## Scotty99

Quote:


> Originally Posted by *jprovido*
> 
> DUDEEE! I just updated to the latest bios on my board with the AGESA update.
> 
> 
> 
> for the first time I went above 2400mhz! 2666mhz on our kit! there's still hope!!!!
> 
> 
> 
> 
> 
> 
> 
> failed on 2933mhz and 3200mhz but I'd take a small bump over nothing all day


Hah, grats.

I'm still at 2400 but ASRock hasn't released a BIOS since the 31st; in due time, I guess.


----------



## jprovido

Quote:


> Originally Posted by *Scotty99*
> 
> Hah grats
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im still at 2400 but asrock hasnt released a bios since the 31'st, in due time i guess.


I was starting to think our kit just isn't compatible with Ryzen. I've been updating with every single BIOS release (almost checking every day) and not a single one of them got it above 2400 MHz. I was so happy when it posted earlier lol. In due time I'm pretty sure we'll get 3200 MHz stable with this kit.

On paper it may not seem like a big upgrade, but "2400" was such an eyesore for me. Having it at 2666 MHz now makes me a lot happier.


----------



## budgetgamer120

Quote:


> Originally Posted by *jprovido*
> 
> DUDEEE! I just updated to the latest bios on my board with the AGESA update.
> 
> for the first time I went above 2400mhz! 2666mhz on our kit! there's still hope!!!!
> 
> 
> 
> 
> 
> 
> 
> failed on 2933mhz and 3200mhz but I'd take a small bump over nothing all day


I am buying the same board you have, and a Ryzen 1400 to mess around with.


----------



## Carniflex

I might have missed a review or two, so here's a question about something I have been wondering about for a little while now with these new Ryzen CPUs: can you push the 1-2 best cores to, say, 4.2 GHz and leave the rest at, say, 3.8-3.9 GHz? So far all I have seen is pushing the whole CPU to somewhere between 3.9 and 4.1 GHz.

In the pre-release hype train this was one of the points, among all the speculation and hype, that sparked my interest, since it seemed to make sense if possible: it would offer one or two cores for that little extra single-threaded performance where needed, while also keeping a large number of cores/threads available for tasks that scale well.


----------



## Scotty99

Nope, not yet. Hopefully it's something they can add down the road, because that would be a great feature.


----------



## jprovido

Quote:


> Originally Posted by *budgetgamer120*
> 
> I am buying the same board you have, and a Ryzen 1400 to mess around with.


Asus has been pretty good: almost weekly BIOS updates.

The motherboard I bought came with the day-1 BIOS, and pretty much everything was broken: overclocking barely worked, the voltage readings were wrong, the temps were wrong, RAM was stuck at 2400 MHz, timings wouldn't change even when set manually, etc.

Now all the issues I had are pretty much fixed: CPU-Z reports the correct voltage, temps in both the BIOS and Ryzen Master are correct with the 20-degree offset, the BIOS is a hundred times more stable, and I finally got 2666 MHz. If this had been the release BIOS, I wouldn't have had the same bad first impression. It's slowly getting better and better.


----------



## mAs81

Kinda how I feel about all the gaming benchmark rage as of late:


----------



## budgetgamer120

Quote:


> Originally Posted by *mAs81*
> 
> Kinda how I feel about all the gaming benchmark rage as of late :


So this is saying that even though Ryzen might get lower fps, it is smoother?


----------



## mAs81

I believe it's saying that after a certain amount of fps, you can't really tell the difference.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mAs81*
> 
> I believe that it says that after a certain amount of fps,you can't really tell the difference


PC 144Hz master race would like a word with you.


----------



## mAs81

Quote:


> Originally Posted by *ZealotKi11er*
> 
> PC 144Hz master race would like a word with you.


Now, now,I'm sure we can all be friends


----------



## Tojara

Quote:


> Originally Posted by *mAs81*
> 
> I believe that it says that after a certain amount of fps,you can't really tell the difference


It's like everything else: once it's good enough, you get significant diminishing returns. It's the same as with resolution and detail settings: the higher you go, the less of a difference there is. The reality of the matter is that it's practically impossible to notice a <10% difference outside of some edge cases.
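The diminishing returns can be made concrete with frame times (simple arithmetic, no benchmark data): each step up in refresh rate shaves fewer milliseconds off a frame than the last.

```python
rates = [60, 144, 240]  # common monitor refresh rates, Hz

# Frame time is 1000 / rate in milliseconds; compare consecutive steps.
for low, high in zip(rates, rates[1:]):
    saved = 1000 / low - 1000 / high  # ms shaved off each frame
    print(f"{low} -> {high} Hz saves {saved:.2f} ms per frame")
# 60 -> 144 Hz saves 9.72 ms; 144 -> 240 Hz saves only 2.78 ms
```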


----------



## ZealotKi11er

Quote:


> Originally Posted by *Tojara*
> 
> It's like everything else, once it's good enough, you get significant diminishing returns. It's the same thing as with resolution and details, the higher you go the less of a difference there is. The reality of the matter is that it's practically impossible for you to notice a <10% difference outside of some edge cases.


The funny thing is that the games where 144 Hz matters hit 144 Hz relatively easily on the CPU side. The ones that don't are the games you play for like 10 hours and then move on to the next game.


----------



## Ultracarpet

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The funny thing is that games that 144Hz matters get 144Hz relatively easy with CPU. Those that dont are those games you play for like 10 hours and move to the next game.


Haha, this.

No, but seriously, there isn't a whole lot of "oh, I'm right on the cusp of 140 fps" in games. Either it's a game that doesn't need to be at 140+ fps and no CPU+GPU combo is going to maintain a solid 140+ fps, or it's a game you can push upwards of 200+ fps given a powerful enough graphics card. Not a whole lot in between... at least that's my experience with my 144 Hz monitor.


----------



## Descadent

Quote:


> Originally Posted by *ZealotKi11er*
> 
> PC 144Hz master race would like a word with you.


Oh boy, not the 144 Hz master race. G-Sync, cough, G-Sync, cough... no need to hit 144 fps, cough. Except for the tryhard CS:GO boys.


----------



## blue1512

Quote:


> Originally Posted by *Descadent*
> 
> oh boy not 144hz master race. gsync cough gsync cough... no need to hit 144fps cough. except for the tryhard csgo boys.


Luckily neither CS:GO nor even Overwatch requires much CPU power to hit 144 fps consistently.


----------



## Tojara

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The funny thing is that games that 144Hz matters get 144Hz relatively easy with CPU. Those that dont are those games you play for like 10 hours and move to the next game.


That's a pretty ignorant view of things. Of the newer games in my library, I don't think you could hold 144 Hz steady even with a 7700K. Cities, KSP, DS3 and EU4 are all pretty much impossible to run anywhere near that, CPU-wise. Even looking at Steam's most-played games, ten out of 25 eat CPU power for breakfast. Personally, the games I tend to play seem to do the exact opposite of what you're saying.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Tojara*
> 
> That's a pretty ignorant view of things. Of the newer games in my library, I don't think you could hold 144 Hz steady even with a 7700K. Cities, KSP, DS3 and EU4 are all pretty much impossible to run anywhere near that, CPU-wise. Even looking at Steam's most-played games, ten out of 25 eat CPU power for breakfast. Personally, the games I tend to play seem to do the exact opposite of what you're saying.


Dota 2 and CS:GO alone make up more hours than the top 25 combined. I do not care what you play. I do not even want to try to figure out the games you did not give full names for. High fps matters to some people because they play certain MP games. And the biggest problem is you thinking you need to hold 144 fps steady.


----------



## shhek0

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Dota 2 and CS:GO alone make up more hours than the top 25 combined. I do not care what you play. I do not even want to try to figure out the games you did not give full names for. High fps matters to some people because they play certain MP games. *And the biggest problem is you thinking you need to hold 144 fps steady.*


This will change, just like 60 fps and 60 Hz did. Right now 60 fps is, let's say, the golden standard for 95% of "gamers" because a good 1440p 144 Hz monitor and the hardware to drive it are still rather expensive for most people. Sooner or later that will be the new standard... or maybe the masses will go to 4K, meaning resolution is everything.

I mean, everybody tries to push 4K nowadays in their ads.

You do not have to hold 144 fps, but you sure want and hope to; I mean, you have put a lot of money into your hardware and would like to get as much out of it as you can.


----------



## ZealotKi11er

Quote:


> Originally Posted by *shhek0*
> 
> This will change, just like 60 fps and 60 Hz did. Right now 60 fps is, let's say, the golden standard for 95% of "gamers" because a good 1440p 144 Hz monitor and the hardware to drive it are still rather expensive for most people. Sooner or later that will be the new standard.. or maybe the masses will go to 4K, meaning resolution is everything
> 
> 
> 
> 
> 
> 
> 
> I mean everybody tries to push 4k nowadays making their ads..
> 
> You do not have to hold them, but you for sure you want and hope so- I mean you have put a lot of money in your hardware and would like to get as much as you can


For monitors, 60 Hz will remain here until we can't tell the difference from a resolution increase. 4K 60 fps is just getting here, and we probably need 3-5 years before $200 cards can get 60 fps at 4K in the latest games. As long as I get over 60 fps I am happy, personally. In most games I am fine as low as 30 fps, but drop below that and I can't handle it.


----------



## Malinkadink

60 fps is a borderline slideshow after being used to 165 Hz, and I make sure all FPS games will hold 165 fps or more 100% of the time; if they don't, I tank the graphics settings until they do. Any game that doesn't require a high degree of mechanical skill, where frames and input lag are paramount, I can enjoy at 60-100 fps, especially with G-Sync.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Malinkadink*
> 
> 60 fps is borderline slideshow after being used to 165hz and making sure all FPS games at least will hold 165fps or more 100% of the time in which case i will tank the graphics settings until they do. Any game that doesn't require a high degree of mechanical skill to play where frames and input lag are paramount i can enjoy at 60-100 fps especially with gsync.


What FPS game? My theory is 2x the console fps: if a game runs at 30 fps on console, run it at 60 fps on PC; if 60 fps on console, run it at 120 fps on PC. Personally I am not mechanically fast enough to benefit from a 120 Hz screen; I am still the bottleneck. When I become a pro by age 30 I will buy me a 240 Hz screen and pwn all you 144 Hz nubs.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *mAs81*
> 
> Kinda how I feel about all the gaming benchmark rage as of late :


Curious conclusions there. They all said they had "really bad hiccups and glitching" in Fallout on system 3 and noticeably inferior performance in GTA5 on system 1, yet in the conclusion they all said there was no difference in any of the games on any of the systems. Of course, system 3 was the 7700K and system 1 was the 5960X, so we certainly can't cast any aspersions on those two, can we...


----------



## Charcharo

As games become more optimized for Ryzen and multiple cores, newer titles will have less trouble hitting high FPS. So it is not a worry for newer or future titles (especially with Ryzen 2nd gen coming).

Only old CPU-limited games, emulators (which Ryzen 2 might solve) and some current titles are an issue. But most of those run at over 100 fps... so yeah.

Except STALKER. We need an 8 GHz dual core for that...


----------



## Carniflex

Quote:


> Originally Posted by *ZealotKi11er*
> 
> PC 144Hz master race would like a word with you.


I do have a 1440p 144 Hz display and, to be frank, I really can't tell the difference between, say, around 140 fps and around 90 fps. Granted, that is with FreeSync, so I don't have tearing to tell me I've dropped under the display's native refresh rate, as would be the case with a fixed-refresh-rate screen. I once dropped as far as around 40 fps before noticing "gee, this is a bit choppy" and looking at the fps counter. Although at the time I was busy trying to shoot the other bastard before he shot me in Planetside 2.


----------



## Carniflex

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The funny thing is that games that 144Hz matters get 144Hz relatively easy with CPU. Those that dont are those games you play for like 10 hours and move to the next game.


Depends, really. Some of these games are not very well optimized in this regard. Planetside 2, for example, has a really hard time doing fps above 130-ish with an i7-3820 @ 4.3 GHz, and that is with shadows off and some other tweaks to ease the CPU load a bit. Then again, Planetside 2 is an MMOFPS, and MMOs are normally more CPU-heavy than your average lobby shooter with its lower number of actors running around. Also, the Planetside 2 engine is rather... aged. We are talking about DX9 and a single thread, plus a handful of auxiliary threads jury-rigged in at some later date for sound and... something.


----------



## Carniflex

Quote:


> Originally Posted by *shhek0*
> 
> This will change, just like 60 fps and 60 Hz did. Right now 60 fps is, let's say, the golden standard for 95% of "gamers" because a good 1440p 144 Hz monitor and the hardware to drive it are still rather expensive for most people. Sooner or later that will be the new standard.. or maybe the masses will go to 4K, meaning resolution is everything
> 
> 
> 
> 
> 
> 
> 
> I mean everybody tries to push 4k nowadays making their ads..
> 
> You do not have to hold them, but you for sure you want and hope so- I mean you have put a lot of money in your hardware and would like to get as much as you can


I would like to chime in on this.

In my opinion, the fuss about having to hold a consistent 60 fps exists because, without variable refresh rate, dropping below the native refresh rate causes tearing, which is normally a rather noticeable effect. Without tearing, i.e. on a G-Sync or FreeSync display with a matching GFX card, dropping below 60 fps is barely noticeable most of the time. When buying my 144 Hz display I considered FreeSync just an additional nice-to-have, but I would go as far as to say that having FreeSync in that display makes a larger difference for me at this point than it being 144 Hz, even though I do play competitive shooters.

Edit: I should probably add that I also have a 4K display without FreeSync. After getting the 1440p FreeSync one I play most of the time on it, although there are a few games where I like the 4K better. Both are about the same physical size: 28'' 4K and 27'' 1440p.


----------



## Malinkadink

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What FPS game? My theory is 2x Console fps. If game runs at 30 fps then run it 60 fps in PC. If 60 fps in console then run it at 120 fps in PC. Personally I am not mechanically fast enough to benefit from 120Hz screen. I am still the bottleneck. When I become a pro by age 30 I will buy me a 240 Hz screen an pwn all you 144Hz nubs.


Well, those were some interesting PHP errors.

Anyways, I'm mainly playing Overwatch. I've been on 144 Hz+ for at least 4 years now, starting with the VG248QE, and I won't ever go back to anything less. It is immediately noticeable how much slower 60 Hz looks and feels. That's not to say I couldn't get used to it and be a good player, but why would I want to? There is no denying that a higher refresh rate, and the higher fps to go along with it, will give you some advantage. Whether you're physically capable of taking advantage of that is a different story, but I'm 23 and am playing aim-sensitive characters at a high level of play (currently rated 4300).

I've considered going for a 240 Hz monitor, but even I understand that there are diminishing returns past 144 Hz. I don't think I'll play any better at 240 Hz vs 144 Hz; I don't see myself being hardware-limited. I also sure as hell don't want to go back to 1080p. 1440p 165 Hz is the place to be right now. When 1440p 240 Hz happens I'll be sure to upgrade, especially if it's IPS. TN is getting very sore to look at.


----------



## Arturo.Zise

I feel sorry for all you 144 Hz gamers with those massive low-res pixels stabbing holes in your eyeballs, ha ha.


----------



## Scotty99

Quote:


> Originally Posted by *Arturo.Zise*
> 
> I feel sorry for all you 144hz gamer's with those massive low res pixels stabbing holes in your eyeballs ha ha


Says the guy running a 32" 1440p screen....

My 90-dollar monitor from 2011 has higher PPI than yours does...


----------



## cssorkinman

Fun thing to do: hook up a 60-inch plasma to your gaming rig and set the resolution to 480p. BOOM, every game you play is now the LEGO version.


----------



## Arturo.Zise

Quote:


> Originally Posted by *Scotty99*
> 
> Says the guy running a 32" 1440p screen....
> 
> My 90-dollar monitor from 2011 has higher PPI than yours does...


You missed the 50" 4K TV I have hooked up as well


----------



## iRUSH

Quote:


> Originally Posted by *cssorkinman*
> 
> Fun thing to do: hook up a 60-inch plasma to your gaming rig and set the resolution to 480p. BOOM, every game you play is now the LEGO version.


Lol! I have to try this


----------



## Ultracarpet

Quote:


> Originally Posted by *Arturo.Zise*
> 
> You missed the 50" 4K TV I have hooked up as well


Your 50" 4K TV has less PPI than a 24" 1080p screen, and a 32" 1440p screen has about the same PPI as a 24" 1080p screen. So essentially, that 4K TV of yours has the most noticeable pixels of all three lol.
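Those PPI claims check out with the standard formula, PPI = sqrt(w² + h²) / diagonal, using the panel sizes given in the posts:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a panel from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'50" 4K TV: {ppi(3840, 2160, 50):.1f} PPI')  # ~88.1
print(f'24" 1080p: {ppi(1920, 1080, 24):.1f} PPI')  # ~91.8
print(f'32" 1440p: {ppi(2560, 1440, 32):.1f} PPI')  # ~91.8
```

So the 24" 1080p and 32" 1440p panels are effectively tied, and the 50" 4K TV does come in last.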


----------



## rage fuury

Interesting statistics:

https://www.3dcenter.org/news/amd-ryzen-5-launchreviews-die-testresultate-zur-anwendungs-performance-im-ueberblick

Ryzen is more than competitive with Intel's offerings, no doubt!


----------



## kaseki

Quote:


> Originally Posted by *Malinkadink*
> 
> Well those were some interesting PHP errors
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyways, I'm mainly playing Overwatch. I've been on 144hz + for at least 4 years now starting with the VG248QE and i won't ever go back to anything less. It is immediately noticeable how much slower 60hz looks and feels. That's not to say that i couldn't get used to it and be a good player but why would i want to? There is no denying that higher refresh rate and higher fps to go along with that will give you some advantage. Whether you're physically capable of taking advantage of that is a different story, but i'm 23 and am playing aim sensitive characters at a high level of play (currently 4300 rated).
> 
> I've considered going for a 240hz monitor, but even i understand that there are diminishing returns when going past 144hz. I don't think i'll play any better on 240hz vs 144hz, i dont see myself being hardware limited. I also sure as hell dont want to go back to 1080p. 1440p 165hz is where to be right now. When 1440p 240hz happens i'll be sure to upgrade especially if its IPS. TN is getting very sore to look at


I think it is time for me to jump into this nest of fanatical gamers and ask a question that has been nagging me for a while: why 144 Hz? Mains frequencies or round digital values would seem more likely choices. That is, one might imagine 100 Hz, 120 Hz, 128 Hz, 150 Hz, 180 Hz, etc., but a dozen dozen hertz?

Was there a study done somewhere relating that frame rate to some human characteristic, like foveal "pixels" swept per second given the angular rate of nystagmus motion? Or was this the best a dominant twisted-nematic display maker could do at some point, and it became the goal for everyone else?


----------



## Malinkadink

Quote:


> Originally Posted by *kaseki*
> 
> I think it is time for me to jump into this nest of fanatical gamers and ask a question that has been nagging me for a while: Why 144 Hz? Mains frequencies or digital values would seem more likely to be chosen. That is, one might imagine 100 Hz, or 120 Hz, or 128 Hz, or 150 Hz, or 180 Hz, etc., but a dozen dozen hertz?
> 
> Was there a study done somewhere relating that frame rate to some human characteristic, like eye foveal "pixels" swept per second given nystagmus motion angular rate? *Or was this the best a dominant twisted nematic display maker could do at some point and it became the goal for everyone else?*


I can't say for certain, but if I were to guess it'd be the latter, which I highlighted.


----------



## jprovido

Quote:


> Originally Posted by *mAs81*
> 
> I believe that it says that after a certain amount of fps,you can't really tell the difference


I have a 75Hz monitor. Don't get me wrong, I have a Ryzen 1700X and it's a beast, but for 144Hz gaming it's lacking. Dota 2 still drops to 90-100 fps, though that's already a lot better than before; before the BIOS/Windows updates I saw it go as low as 60-70 fps at one point.


----------



## Arturo.Zise

Quote:


> Originally Posted by *Ultracarpet*
> 
> ur 50" 4k tv has less PPI than a 24" 1080p screen, and a 32" 1440p screen has the same PPI as a 24" 1080p screen. So essentially, that 4k tv of urs will have the most noticeable pixels out of all 3 lol.


I consider anything smaller than 32" a monitor for ants. You need to sit 6 inches from a 24-27" screen, otherwise it looks tiny. Small monitors can't deliver an immersive gaming experience versus a big screen. Switching from my 50" back to 32" just feels like I'm gaming on my tablet lol.

Anyway, Ryzen R7/R5 are perfect gaming CPUs for 1440p and up. 1080p still belongs to the 7700K for now. Congrats, Intel, on having one CPU worth buying right now.
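Ultracarpet's PPI comparison from a few posts back is easy to verify; this quick sketch (sizes and resolutions taken from the posts) computes pixels per inch from resolution and diagonal:

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution in pixels over diagonal size in inches."""
    return math.hypot(h_px, v_px) / diagonal_in

print(f'24" 1080p: {ppi(1920, 1080, 24):.1f} PPI')  # ~91.8
print(f'32" 1440p: {ppi(2560, 1440, 32):.1f} PPI')  # ~91.8
print(f'50" 4K TV: {ppi(3840, 2160, 50):.1f} PPI')  # ~88.1
```

So the 50" 4K TV really does have the biggest pixels of the three; that's all the PPI comparison says, and it implies nothing about panel quality or viewing distance.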


----------



## Scotty99

Quote:


> Originally Posted by *jprovido*
> 
> I have a 75Hz monitor. Don't get me wrong, I have a Ryzen 1700X and it's a beast, but for 144Hz gaming it's lacking. Dota 2 still drops to 90-100 fps, though that's already a lot better than before; before the BIOS/Windows updates I saw it go as low as 60-70 fps at one point.


Never played Dota, but in Overwatch I can get 140+ fps at 1440p; guess it depends on the game.


----------



## budgetgamer120

Quote:


> Originally Posted by *Fyrwulf*
> 
> You're not getting the 8 core part for $499, I'm so confident of that I'd be willing to buy you one if I'm wrong. That said, I can imagine the 4 and 6 core parts will be below that. The nice thing about the AM4 platform is that you can always get that 8 core part later on if it becomes necessary to upgrade.


Cannot say I did not tell you so

Three 8 core parts all under $500.

https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100006676%20100007671%20601294612&IsNodeId=1&Description=ryzen&name=CPUs%20%2f%20Processors&Order=BESTMATCH

Hope you keep your end of the deal and buy one


----------



## teh-yeti

Quote:


> Originally Posted by *budgetgamer120*
> 
> Cannot say I did not tell you so
> 
> Three 8 core parts all under $500.
> 
> https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100006676%20100007671%20601294612&IsNodeId=1&Description=ryzen&name=CPUs%20%2f%20Processors&Order=BESTMATCH
> 
> Hope you keep your end of the deal and buy one


I remember thinking much the same as Fyrwulf back then as far as pricing goes. Guess I'll need to go pick up a hat so I can eat it... honestly, I have no clue what to anticipate with AMD's pricing anymore.


----------



## budgetgamer120

Quote:


> Originally Posted by *teh-yeti*
> 
> I remember thinking much the same as Fyrwulf back then as far as pricing goes. Guess I'll need to go pick up a hat so I can eat it... honestly, I have no clue what to anticipate with AMD's pricing anymore.


I almost forgot I saved that post. @Fyrwulf where are you man?

It is easy to guess AMD pricing: they are pricing stuff to gain market share. We know it does not cost $100, or anything close, to make a CPU.

Even at $329 for the 1700 they are still making a profit.


----------



## Brutuz

Quote:


> Originally Posted by *Arturo.Zise*
> 
> You missed the 50" 4K TV I have hooked up as well


Which has a lower PPI than my 21.5" 1080p screen from 2004. Your pixels are bigger than mine.
Quote:


> Originally Posted by *Arturo.Zise*
> 
> I consider anything smaller than 32" a monitor for ants. You need to sit 6 inches from a 24-27" screen, otherwise it looks tiny. Small monitors can't deliver an immersive gaming experience versus a big screen. Switching from my 50" back to 32" just feels like I'm gaming on my tablet lol.
> 
> Anyway, Ryzen R7/R5 are perfect gaming CPUs for 1440p and up. 1080p still belongs to the 7700K for now. Congrats, Intel, on having one CPU worth buying right now.


I've got 3x21.5" screens set up to appear as a single screen; it's plenty immersive. And I find that sitting about 0.5m from my screens does me perfectly fine with a smaller-than-normal text size. (And I'm pretty short-sighted too; literally 10cm of blur-free vision when I haven't got my glasses on.)
I personally prefer not having to sit on the opposite side of the room from my screen so that everything doesn't look like a blurry mess.

I'd also honestly say Ryzen is still a perfect gaming CPU for 1080p... but only if you either plan on upping the res soon or do tasks that Ryzen is faster in alongside gaming. It's not like it's slow at gaming; it's just that the Intel chip is faster.


----------



## Glottis

Quote:


> Originally Posted by *Brutuz*
> 
> Which has a lower PPI than my 21.5" 1080p screen from 2004. Your pixels are bigger than mine.


Is this a joke/troll post? You can't possibly be implying that your crappy old (maybe TN?) screen from 2004 is better than his shiny new 4K TV just because your PPI is higher


----------



## Sheyster

Quote:


> Originally Posted by *Brutuz*
> 
> Which has a lower PPI than my 21.5" 1080p screen from 2004. Your pixels are bigger than mine.


I hope you're not serious.

Any monitors I've had in the house that were < 24" have long since been given away to family members. My wife doesn't even use anything that small. Just sayin'..


----------



## jprovido

Quote:


> Originally Posted by *Scotty99*
> 
> Never played Dota, but in Overwatch I can get 140+ fps at 1440p; guess it depends on the game.


I get 180+ fps in Overwatch, which is great (over 240+ on my 7700K); it's a 144Hz monitor so it doesn't really matter. But again, it's terrible with Dota. It's improving, I tell you that; hopefully 3200MHz RAM can help with the minimums some more. Currently at 2666MHz, and the improvement has been noticeable over 2400.
Quote:


> Originally Posted by *Sheyster*
> 
> I hope you're not serious.
> 
> Any monitors I've had in the house that were < 24" have long since been given away to family members. My wife doesn't even use anything that small. Just sayin'..


Even my dad's PC has a 27" IPS monitor, and lately he's been asking me if he can get a bigger one! lmao, my parents are so well trained by me with tech that when my aunts/uncles come visit they're amazed at how good my parents are with the Nvidia Shield TV, the Chromecasts around the house, etc.


----------



## Brutuz

Quote:


> Originally Posted by *Glottis*
> 
> Is this a joke/troll post? You can't possibly be implying that your crappy old (maybe TN?) screen from 2004 is better than his shiny new 4K TV just because your PPI is higher


He specifically mentioned issues that come from a high PPI. My monitor setup certainly isn't as good in colour reproduction and the like, and I'd appreciate it if you didn't imply I said it was.
Quote:


> Originally Posted by *Sheyster*
> 
> I hope you're not serious.
> 
> Any monitors I've had in the house that were < 24" have long since been given away to family members. My wife doesn't even use anything that small. Just sayin'..


The size is fine? I am going to get a bigger one later on, but considering how close most of us sit to our screens, even 30" is pretty big... I'd rather have higher PPI. He specifically said "massive low res pixels", when even though my screen has a lower res, it has much smaller pixels, regardless of screen size, colour reproduction, panel type, etc.


----------



## ZealotKi11er

Quote:


> Originally Posted by *jprovido*
> 
> I get 180+ fps in Overwatch, which is great (over 240+ on my 7700K); it's a 144Hz monitor so it doesn't really matter. But again, it's terrible with Dota. It's improving, I tell you that; hopefully 3200MHz RAM can help with the minimums some more. Currently at 2666MHz, and the improvement has been noticeable over 2400.
> Even my dad's PC has a 27" IPS monitor, and lately he's been asking me if he can get a bigger one! lmao, my parents are so well trained by me with tech that when my aunts/uncles come visit they're amazed at how good my parents are with the Nvidia Shield TV, the Chromecasts around the house, etc.


Same here. My parents love to use the Chromecast, and my dad comes and uses the 40" 4K monitor on my PC because he loves the size.


----------



## chuy409

Found out the Phenom slaughters the Ryzen 6-core and 4-core in L1 cache read and write, while copy is not too far behind.

guru3d results of ryzen:
http://www.guru3d.com/articles-pages/amd-ryzen-5-1500x-and-1600x-review,16.html

My phenom x6 results:
http://imgur.com/a/T3wD7

Can someone explain whats going on here?


----------



## Majin SSJ Eric

I remember the first time I saw a 27" monitor in the store back in 2011 I was absolutely blown away by it (and I actually still have it, a Samsung P2770FH). It just looked so massive back then. Now my 27" IPS monitors just look like normal sized monitors to me. I can't imagine ever using anything smaller than these again.


----------



## sticks435

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I remember the first time I saw a 27" monitor in the store back in 2011 I was absolutely blown away by it (and I actually still have it, a Samsung P2770FH). It just looked so massive back then. Now my 27" IPS monitors just look like normal sized monitors to me. I can't imagine ever using anything smaller than these again.


I have a 25.5 inch 16:10 monitor at home and have since early 2011. At work the biggest monitor they give out is 16:9 22 inches. Every time I come home it still looks giant to me.


----------



## Majin SSJ Eric

I know what you mean but I'm just so used to this size now. In fact, my next monitor is almost certain to be at least 40" and 4K.


----------



## Scotty99

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Same here. My parents love to use chromecast and my Dad comes and uses my 40" 4K monitor in my PC because he loved the size.


It's a shame most don't know how to use Chromecasts; the new Vizio TVs have 4K Chromecasts built in, and they are getting a bad rep because people don't understand how superior it is to on-screen interfaces lol.


----------



## Ceadderman

Quote:


> Originally Posted by *chuy409*
> 
> Found out the Phenom slaughters the Ryzen 6-core and 4-core in L1 cache read and write, while copy is not too far behind.
> 
> guru3d results of ryzen:
> http://www.guru3d.com/articles-pages/amd-ryzen-5-1500x-and-1600x-review,16.html
> 
> My phenom x6 results:
> http://imgur.com/a/T3wD7
> 
> Can someone explain whats going on here?


Sorry, but I just don't see that, considering DDR4 memory is vastly faster than DDR3.

I have both an 1100T and an R7 1800X. Maybe when I finally cobble together the board and RAM, I will look into this. But logic tells me that's an inaccurate assessment.

~Ceadder


----------



## cssorkinman

Quote:


> Originally Posted by *chuy409*
> 
> Found out the Phenom slaughters the Ryzen 6-core and 4-core in L1 cache read and write, while copy is not too far behind.
> 
> guru3d results of ryzen:
> http://www.guru3d.com/articles-pages/amd-ryzen-5-1500x-and-1600x-review,16.html
> 
> My phenom x6 results:
> http://imgur.com/a/T3wD7
> 
> Can someone explain whats going on here?


Different versions of AIDA are not comparable scoring-wise.


----------



## chuy409

Quote:


> Originally Posted by *cssorkinman*
> 
> Different versions of AIDA are not comparable scoring-wise.


here you go:
http://imgur.com/a/pgNZZ


----------



## Tojara

Quote:


> Originally Posted by *chuy409*
> 
> here you go:
> http://imgur.com/a/pgNZZ


Doubled and quadrupled associativity along with area and power savings, I'm guessing. Essentially the cache functions as a larger one in Zen because of that, and it being slightly slower is a non-issue since the L2 and L3 are much larger and faster. Looking at it the other way around, the L2 on Zen has almost as much bandwidth as the L1 on your PII.


----------



## sumitlian

Quote:


> Originally Posted by *chuy409*
> 
> My phenom x6 results:
> http://imgur.com/a/T3wD7
> 
> Can someone explain whats going on here?


Strange! Phenom II's cache/memory faster than even Intel Haswell? It can't be. It definitely has to do with the varying sizes of the data blocks used to test the read/write/copy speed of cache and memory across the various versions of AIDA64.
Though you are running v5.8, here is mine from Feb 2012,


Spoiler: Phenom II X5 1055T 4.0 GHz AIDA64 Cache and Mem benchmark


----------



## sumitlian

Quote:


> Originally Posted by *cssorkinman*
> 
> Different versions of AIDA are not comparable scoring-wise.


This.


----------



## chuy409

Quote:


> Originally Posted by *sumitlian*
> 
> Strange! Phenom II's cache/memory faster than even Intel Haswell? It can't be. It definitely has to do with the varying sizes of the data blocks used to test the read/write/copy speed of cache and memory across the various versions of AIDA64.
> Though you are running v5.8, here is mine from Feb 2012,
> 
> 
> Spoiler: Phenom II X5 1055T 4.0 GHz AIDA64 Cache and Mem benchmark


Yea, it's pretty odd. Pretty sweet if true. I'm still wondering if there is a benchmark out there that stresses the L1 cache a lot, to test further.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I know what you mean but I'm just so used to this size now. In fact, my next monitor is almost certain to be at least 40" and 4K.


I've had 27" 1440p since 2011, a Dell U2711. Got my 4K 40" about a year ago and love it. I tried a 34" UltraWide 1440p for two weeks and it was not for me; I really hated that most of the games I played did not support the resolution. With 4K 16:9 everything will work. I think the only downside is that they are not good for competitive gaming because the screen is massive. I just can't go back to 1440p anymore. Loading up Witcher 3 on this monitor brings the world close to you. This is where 4K makes a difference. On a TV 6+ feet away you do not appreciate 4K.


----------



## sumitlian

Quote:


> Originally Posted by *chuy409*
> 
> Yea, it's pretty odd. Pretty sweet if true. I'm still wondering if there is a benchmark out there that stresses the L1 cache a lot, to test further.


There is nothing surprising about it. Different versions of AIDA64 used different configurations for testing cache/memory performance, hence the different results. It is analogous to HDD read/write tests, where different-sized data blocks give different throughput or IOPS figures.
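That block-size sensitivity is easy to demonstrate with a minimal sketch. This is Python, so interpreter overhead dominates the small sizes and the absolute GB/s figures are not meaningful; only the trend of fast-when-cached versus slow-when-spilled matters:

```python
import time

def copy_bandwidth_gbs(size_bytes: int, iters: int = 20) -> float:
    """Time repeated full copies of a buffer and return apparent GB/s.
    A working set that fits in L1/L2/L3 copies much faster than one that
    spills to RAM, so the 'bandwidth' a benchmark reports depends directly
    on the block size it chooses."""
    src = bytearray(size_bytes)
    t0 = time.perf_counter()
    for _ in range(iters):
        dst = src[:]  # one full buffer copy per iteration
    elapsed = time.perf_counter() - t0
    return size_bytes * iters / elapsed / 1e9

# Sizes chosen to land inside typical L1, L2, L3 and main memory.
for size in (16 * 1024, 256 * 1024, 8 * 1024 * 1024, 64 * 1024 * 1024):
    print(f"{size >> 10:>6} KiB working set: {copy_bandwidth_gbs(size):6.1f} GB/s")
```

Two benchmarks that pick different sizes from that list will report very different "cache bandwidth" for the same CPU, which is exactly the AIDA64 version-to-version problem being discussed.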


----------



## chuy409

Quote:


> Originally Posted by *sumitlian*
> 
> There is nothing surprising about it. Different versions of AIDA64 used different configurations for testing cache/memory performance, hence the different results. It is analogous to HDD read/write tests, where different-sized data blocks give different throughput or IOPS figures.


Yea, but we are comparing brand new technology vs 7+ year old technology. I expect it to beat 7-year-old tech by default on any test, any day of the week. And since AIDA got updated to support Ryzen correctly, I don't know why they would use an "unoptimized" or "legacy" mode of testing if what you said is true.


----------



## Artikbot

Since I actually have current data, may as well:

Hope that clears up the 'slowness' of Ryzen's L1.


----------



## chuy409

Quote:


> Originally Posted by *Artikbot*
> 
> Since I actually have current data, may as well:
> 
> Hope that clears up the 'slowness' of Ryzen's L1.


It's between the Phenom X6 and the Ryzen 6- and 4-core, not the 8-core.


----------



## sumitlian

Quote:


> Originally Posted by *chuy409*
> 
> Yea, but we are comparing brand new technology vs 7+ year old technology. I expect it to beat 7-year-old tech by default on any test, any day of the week. And since AIDA got updated to support Ryzen correctly, I don't know why they would use an "unoptimized" or "legacy" mode of testing if what you said is true.


I get what you are saying. The conflicting scores between those versions are proof that we must never draw final conclusions about a CPU's performance from any one version of AIDA64. Whatever data size they use in AIDA64 v5.80 simply fits (cache hits) Phenom II's cache architecture much better than it did in older versions. There are countless methods, with countless data sizes, you can use to test cache/memory bandwidth, and countless varying results. It doesn't mean Ryzen's and PII's L1 cache performance is equal under all conditions; it just means they show similar L1 performance for one certain type of calculation. You can be 99% sure that none of these AIDA64 tests represents the exact performance of any real world scenario.

Also, if you compare your test to mine, you are seeing almost 6x the performance with the current version of AIDA on the PII. Since L1 is a private cache available to each core, it is possible that the current version of AIDA tests multithreaded cache performance and shows the aggregate score.

But again, since we don't know the methods of AIDA64's cache benchmark (what formula are they using? which instruction set does the test use? is that instruction set's latency/throughput the same across CPUs, and if not, by how much?), we can't say what exactly is causing this. But imo it must be one of the situations I have described.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Artikbot*
> 
> Since I actually have current data, may as well:
> 
> Hope that clears up the 'slowness' of Ryzen's L1.


That difference is massive, especially for the L3 on the PII, which is super slow.


----------



## sumitlian

@chuy409
Also, I have a 2012 record of Sandra (probably a 2011-2012 version) on the PII.
It is in line with what you showed (at least for the L1/Integrated one), but again, that old Sandra shows much higher L2 and L3 cache performance compared to what you and Artikbot showed with the current AIDA tests.
Phenom II X6 1055T @ 4.0 GHz (286 x 14, NB = 286 x 10)


P.S. I don't know what I was thinking back then; I only cropped the main score area from the whole window and kept all the data in a folder named bench. I have all the PII Sandra data in a similar form.


----------



## sumitlian

The moral is that these tests have almost nothing to do with real world performance. I could literally make two cache benchmarks and they would show very different scores on every CPU. It all depends on what calculations you do and what optimizations you choose.

For me, all cache benchmarks are 100% worthless, as you can never guess or confirm real world application performance based on those cache tests, whether from AIDA or Sandra.


----------



## ZealotKi11er

Quote:


> Originally Posted by *sumitlian*
> 
> The moral is that these tests have almost nothing to do with real world performance. I could literally make two cache benchmarks and they would show very different scores on every CPU. It all depends on what calculations you do and what optimizations you choose.
> 
> For me, all cache benchmarks are 100% worthless, as you can never guess or confirm real world application performance based on those cache tests, whether from AIDA or Sandra.


Cache is one of those things you can speed up without changing the architecture much. It's the main reason you see some gains with Intel CPUs: from 1st gen to 7th gen there is a big difference in cache speeds. The same could be said for memory speeds.


----------



## chuy409

Quote:


> Originally Posted by *sumitlian*
> 
> The moral is that these tests have almost nothing to do with real world performance. I could literally make two cache benchmarks and they would show very different scores on every CPU. It all depends on what calculations you do and what optimizations you choose.
> 
> For me, all cache benchmarks are 100% worthless, as you can never guess or confirm real world application performance based on those cache tests, whether from AIDA or Sandra.


I guess you're right. Is there any other software that benches cache speeds besides AIDA and Sandra?


----------



## sumitlian

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Cache is one of those things you can speed up without changing the architecture much. It's the main reason you see some gains with Intel CPUs: from 1st gen to 7th gen there is a big difference in cache speeds. The same could be said for memory speeds.


Not all caches, only the L3. Afaik L1 and L2 speed is fixed, or maybe tied to core speed by some multiplier; I don't know, because we still don't know the exact performance capability of L1/L2, since neither Intel nor AMD publishes their internal clocks (I think you can find the bus width of L1/L2 on the Internet). All you can do is raise the core speed as far as it will go; since the core's instructions per second increase, you can then run the tests and see whether L1 cache bandwidth changes. Control of the L3, which basically runs with the IMC, is given to the end user, and this is why you see massive changes in L3 speed when changing the NB/uncore multiplier.

Not saying 1st-7th gen hasn't improved cache speed; it definitely has. But there are possible variables/factors across different versions of the tests, which is why we can't say exactly how much performance improved. To know the exact figures, set all CPUs' core speed and RAM speed/latency identical and test them on the same version of AIDA.


----------



## sumitlian

Quote:


> Originally Posted by *chuy409*
> 
> I guess you're right. Is there any other software that benches cache speeds besides AIDA and Sandra?


MaxxMem or MaxxMEM2_preview was very famous in 2010-2012 I think.
http://www.softpedia.com/get/System/Benchmarks/MaxxMEM2.shtml or google for more versions.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I've had 27" 1440p since 2011, a Dell U2711. Got my 4K 40" about a year ago and love it. I tried a 34" UltraWide 1440p for two weeks and it was not for me; I really hated that most of the games I played did not support the resolution. With 4K 16:9 everything will work. I think the only downside is that they are not good for competitive gaming because the screen is massive. I just can't go back to 1440p anymore. Loading up Witcher 3 on this monitor brings the world close to you. This is where 4K makes a difference. On a TV 6+ feet away you do not appreciate 4K.


Agreed, and it's the way I'm leaning for my next monitor. I will disagree that 1440p is now unacceptable, as I still find my 1440p monitors certainly adequate, but 4K would be a significant upgrade, no doubt about it (though only at a significantly larger size than my current displays). I have no interest whatsoever in "competitive" gaming; I simply want my games to look as good and crisp as possible, and for that 4K is hard to beat. The only issue is the amount of GPU horsepower you need to push the resolution at max quality (which is even more crucial than at lower resolutions, since any compromises in graphical quality are magnified at 4K).


----------



## Ceadderman

I've got a 32" semi-smart (no web) Samsung LED TV that I will be gaming/browsing on. Picking up an 8GB 480 should be pretty nice with my 1800X platform.

~Ceadder


----------



## Majin SSJ Eric

What resolution?


----------



## Ceadderman

I got it two Summers ago. A Magnavox. Sadly I just found out that it's 1080p.

So looks like I will be waiting for a Samsung 1440-capable set to come on sale.

But I can hang in there with 1080p.

~Ceadder


----------



## IRobot23

Which MB for this ram?
https://www.gskill.com/en/product/f4-3200c14d-16gtzr
was looking to get X370 gaming 5
R5 1600X

https://www.gskill.com/en/product/f4-3200c16d-16gtzr
was looking to get X370 gaming 3 or ASUS X370 PRIME
R5 1600


----------



## Scotty99

Your ram costs as much as your cpu...


----------



## IRobot23

Quote:


> Originally Posted by *Scotty99*
> 
> Your ram costs as much as your cpu...


Not really, but I am not going to pay 115€ for 2133/2400MHz... anything faster is like 130€+.

PS: It's not your problem how much I pay for anything... I am just asking for help; if you do not know, then... have some respect.


----------



## Ceadderman

Honestly? I would get the CVIH (Crosshair VI Hero) over the Prime. Yeah, it's $260ish, but trust me, you will have a better experience.

Also, for the uninitiated with ASUS AM4 boards (not sure about other manufacturers): they have started mounting the CPU backplate to the board with an adhesive material. As a long-time ASUS customer, that makes absolutely *zero* sense on boards that will likely see the unlocked chips as well as the locked ones. No R7 comes with a cooler, so that means aftermarket cooling solutions. Hopefully this is not just ASUS. If it is, it's time to send a dirty letter to the company and possibly hand in my resignation as a customer, because the adhesive is very strong in a spot where flexing the board could be very problematic.

~Ceadder


----------



## IRobot23

Quote:


> Originally Posted by *Ceadderman*
> 
> Honestly? I would get the CVIH (Crosshair VI Hero) over the Prime. Yeah, it's $260ish, but trust me, you will have a better experience.
> 
> Also, for the uninitiated with ASUS AM4 boards (not sure about other manufacturers): they have started mounting the CPU backplate to the board with an adhesive material. As a long-time ASUS customer, that makes absolutely *zero* sense on boards that will likely see the unlocked chips as well as the locked ones. No R7 comes with a cooler, so that means aftermarket cooling solutions. Hopefully this is not just ASUS. If it is, it's time to send a dirty letter to the company and possibly hand in my resignation as a customer, because the adhesive is very strong in a spot where flexing the board could be very problematic.
> 
> ~Ceadder


Thanks for the help.
His budget is limited

X370 Gaming 5 (for ME).
R5 1600X
DDR4 3200CL14
(mostly gaming new games and work)

EU prices (for friend)
R5 1600 - 233€
DDR4 [email protected] - 155€
MB max - 170€
560-570€ max.

Any help is appreciated. We are not in a hurry.


----------



## shhek0

If we are talking about gaming on [email protected], then it is really whatever you throw at it.


----------



## Quantum Reality

Quote:


> Originally Posted by *chuy409*
> 
> Found out the Phenom slaughters the Ryzen 6-core and 4-core in L1 cache read and write, while copy is not too far behind.
> 
> guru3d results of ryzen:
> http://www.guru3d.com/articles-pages/amd-ryzen-5-1500x-and-1600x-review,16.html
> 
> My phenom x6 results:
> http://imgur.com/a/T3wD7
> 
> Can someone explain whats going on here?


That should simply not be possible. Every Ryzen benchmark has shown ~50% improvement over Bulldozer/Phenom.


----------



## chuy409

Quote:


> Originally Posted by *Quantum Reality*
> 
> That should simply not be possible. Every Ryzen benchmark has shown ~50% improvement over Bulldozer/Phenom.


Guess not in L1 cache bandwidth, but it needs further testing to make sure. The Phenoms also beat all FX CPUs in L1 cache, so that's probably more proof it's not fake, I guess. I think this is a legitimate question to ask an AMD engineer.


----------



## Brutuz

Phenom II was a very interesting architecture. Despite having really good L3 cache bandwidth already, increasing the CPU/NB clock speed (L3 cache, IMC) gives quite a large performance increase, roughly comparable to what you get from overclocking the CPU cores. That'll be one reason why Phenom II has more bandwidth than Ryzen, at least: on Phenom II the L3 comes clocked at 2GHz stock, with overclocks often reaching 2.8GHz, while even with DDR4-3600 the L3 on Ryzen is still only at 1800MHz afaik.

Can someone with Ryzen test L3 cache speeds with fast RAM?


----------



## amlett

Quote:


> Originally Posted by *Brutuz*
> 
> Phenom II was a very interesting architecture. Despite having really good L3 cache bandwidth already, increasing the CPU/NB clock speed (L3 cache, IMC) gives quite a large performance increase, roughly comparable to what you get from overclocking the CPU cores. That'll be one reason why Phenom II has more bandwidth than Ryzen, at least: on Phenom II the L3 comes clocked at 2GHz stock, with overclocks often reaching 2.8GHz, while even with DDR4-3600 the L3 on Ryzen is still only at 1800MHz afaik.
> 
> Can someone with Ryzen test L3 cache speeds with fast RAM?


----------



## budgetgamer120

Those Ryzen APUs need to be released now.

Check this a12


----------



## teh-yeti

Quote:


> Originally Posted by *budgetgamer120*
> 
> Those Ryzen APUs need to be released now.
> 
> Check this a12


Video taken down. Want to summarize?


----------



## 7850K

Quote:


> Originally Posted by *teh-yeti*
> 
> Video taken down. Want to summarize?


I thought he might have linked techepiphany's newest but it's not down


----------



## budgetgamer120

Quote:


> Originally Posted by *teh-yeti*
> 
> Video taken down. Want to summarize?


Just wanted to share the performance of that AM4 APU running 2400MHz DDR4.

I hope there is an 8 core APU


----------



## Tojara

Quote:


> Originally Posted by *budgetgamer120*
> 
> Just wanted to share the performance of that AM4 APU running 2400MHz DDR4.
> 
> I hope there is an 8 core APU


Now replace the GPU cores with Vega and the CPU cores with Zen cores, increase the number of CUs from 8 to 11, increase the clock speed by a few hundred MHz, swap the memory controller for the one in Summit Ridge, and run the RAM at proper DDR4 speeds (3GHz+). Both CPU and GPU performance should be up by 50%+.
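The rough arithmetic behind that GPU estimate can be sketched out; note the clock figures here are placeholder assumptions for illustration, not announced specs:

```python
# Back-of-the-envelope GPU scaling for the "50%+" estimate above.
# The clock values are hypothetical placeholders, not known specs.
old_cus, new_cus = 8, 11
old_mhz, new_mhz = 1100, 1300  # assumed GPU clocks, "a few hundred MHz" apart

scale = (new_cus / old_cus) * (new_mhz / old_mhz)
print(f"Raw GPU throughput scale: {scale:.3f}x")  # ~1.6x before any Vega IPC or memory-bandwidth gains
```

Even ignoring the Vega architecture and faster RAM entirely, the CU count and clock bump alone clear the 50% bar in this naive throughput model.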


----------



## Jayjr1105

Do R3's have an eta yet?


----------



## scorch062

Quote:


> Originally Posted by *Jayjr1105*
> 
> Do R3's have an eta yet?


Not much info on them yet.

I would suspect they will launch right after the Raven Ridge APUs. I expect the R3s to be those APUs with the iGPU disabled, just like the 860K was. But I could be entirely wrong.


----------



## CasualCat

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I've had 27" 1440p since 2011, a Dell U2711. Got my 4K 40" about a year ago and love it. I tried a 34" UltraWide 1440p for two weeks and it was not for me; I really hated that most of the games I played did not support the resolution. With 4K 16:9 everything will work. I think the only downside is that they are not good for competitive gaming because the screen is massive. I just can't go back to 1440p anymore. Loading up Witcher 3 on this monitor brings the world close to you. This is where 4K makes a difference. On a TV 6+ feet away you do not appreciate 4K.


Or get a bigger 4k TV (ie 65"-70"+)

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Agreed and its the way I'm leaning for my next monitor. I will disagree that 1440p is now unacceptable as I still find my 1440p monitors certainly adequate but 4K would be a significant upgrade, no doubt about it (but only at a significantly larger size than my current displays). I have no interest whatsoever in "competitive" gaming and simply want my games to look as good and crisp as possible and for that 4K is hard to beat. Only issue is the amount of GPU horsepower you need to push the resolution at max quality (which is even more crucial than at lower resolutions since any compromises in graphical quality are magnified at 4K).


Same on 1440p. When I got mine the majority of 4K PC monitors were either 27" or ridiculously expensive for 32" and GPU power to push it was also questionable. Really happy with my 32" 1440p for my desk as it was the same ppi as the 24" I was upgrading from but a significantly larger screen. Adding only 3" diagonal to me is very underwhelming even coupled with a higher resolution.


----------



## teh-yeti

Quote:


> Originally Posted by *scorch062*
> 
> Not much info on them yet.
> 
> I would suspect they will launch right after Raven Ridge APUs. I expect R3s to be those APU with disabled iGPU, just like 860k was. But i could be entirely wrong.


Since the R3 line is supposed to be quad cores with no SMT, do you think that the APUs will feature SMT? Or an L3 cache like the rest of the Ryzen line up?


----------



## kalelovil

Quote:


> Originally Posted by *teh-yeti*
> 
> Since the R3 line is supposed to be quad cores with no SMT, do you think that the APUs will feature SMT? Or an L3 cache like the rest of the Ryzen line up?


I suspect the APUs will still have an L3 cache, perhaps halved to 4MB though.

The loss of L3 in previous APUs (Construction and Stars cores) wasn't as painful because that L3 was both slow and sat alongside a relatively large L2 cache. Zen's L3 (within a CCX) is lower latency, and its L2 cache is much smaller.

Despite many things being disabled in Intel's budget (Core-based) Pentium line, they still leave a reasonable 3MB of L3 cache for 2 cores.
I suspect that performance on both Core and Zen would drop substantially if the L3 were completely disabled, although I don't recall this ever being simulated.
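One way to see why a missing L3 hurts is a toy average-memory-access-time (AMAT) model; every latency and miss rate below is a made-up round number for illustration, not a measured Zen figure:

```python
def amat(levels, mem_latency):
    """AMAT in cycles: each level is (access_latency, miss_rate);
    misses fall through to the next level, then to main memory."""
    total, reach = 0.0, 1.0
    for access_latency, miss_rate in levels:
        total += reach * access_latency   # cost paid by accesses reaching this level
        reach *= miss_rate                # fraction falling through to the next level
    return total + reach * mem_latency

# Hypothetical hierarchy: L1 (4 cyc, 10% miss), L2 (17 cyc, 50% miss),
# L3 (40 cyc, 30% miss), DRAM at 200 cycles.
with_l3 = amat([(4, 0.10), (17, 0.50), (40, 0.30)], 200)   # 10.7 cycles
without_l3 = amat([(4, 0.10), (17, 0.50)], 200)            # 15.7 cycles
print(with_l3, without_l3)
```

Even with these generous made-up numbers, dropping the L3 raises the average access cost by almost 50%, which is why a halved 4MB L3 seems more plausible than none at all.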


----------



## Majin SSJ Eric

There's no doubt that the Zen APUs are going to be monstrous little AIO gaming chips!


----------



## Sempre

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> There's no doubt that the Zen APU's are going to be monstrous little AIO gaming chips!


They'd also be great in HTPCs.


----------



## oile

Hello guys,
I am a competitive esports player, currently getting ready for BF1.
I could buy a used 5820K for 220€ and a Gigabyte X99 Gaming 5P with a single non-working RAM slot for 70€.
Considering the Ryzen 1600 is at 210€ where I live and an Asus Prime X370 is 150€ (somewhat better VRMs than the B350s),
what would you advise me to buy?
I've considered OCability, the dead socket, RAM compatibility, Ryzen's cross-CCX L3/Infinity Fabric issues, and future optimizations, but I cannot make a decision.

I come from a 2600K @ 5.0GHz (bought in years in which everyone was saying to go for i5s) and 2133MHz RAM on Z68.

Could you help me?


----------



## epic1337

If you weigh them against each other, the i7-5820K performs better than the R5-1600 out of the box and when both are overclocked; if they're priced the same, then there's no reason to purchase the R5-1600.

It would be an entirely different matter if we were talking about the R7-1700, though; that one has one major advantage over the i7-5820K, and that is its two extra cores.
As such, I recommend saving up a bit more and going straight for an R7-1700 instead; it'll be a much more solid choice as a whole.


----------



## FLCLimax

1600 over 5820K all day. The 5820K is already on a dead socket, and a faster CPU will run you a whole new system... you can drop a Ryzen+ right into the same AM4 board. No contest at all, even at the same price up front. Intel sockets are one-and-done every year, so unless you are buying Kaby Lake or the upcoming ones that will be relevant for the next 4 years, do not bother with Intel.


----------



## daviejams

I am torn between the 1700 and 1600 , although I am waiting on Vega as I need to upgrade my GPU more than CPU. Will upgrade to Ryzen at some point this year

Not sure if I should get 8c or 6c


----------



## epic1337

Quote:


> Originally Posted by *FLCLimax*
> 
> 1600 over 5820K all day. 5820K is already a dead socket and a faster CPU will run you a new system...


False; the 6C/12T Ryzen is slower than the 5820K at the same clock, let alone when the 5820K is clocked at 4.4GHz or higher.

Furthermore, a dead socket doesn't matter nowadays; you can simply swap the entire rig for a much newer one.
Especially so when newer features get introduced, e.g. PCI-E 4.0, DDR5 or HMC.
PCI-E 4.0 in particular would allow more NVMe SSDs to populate a given motherboard.
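A rough sketch of the lane math behind that last point, using approximate post-encoding per-lane throughput figures and a hypothetical ~3.5GB/s NVMe drive:

```python
import math

# Approximate usable bandwidth per lane in GB/s (after 128b/130b encoding).
PCIE3_PER_LANE = 0.985
PCIE4_PER_LANE = 1.969  # roughly 2x PCIe 3.0

drive_gbps = 3.5  # a hypothetical fast NVMe SSD

lanes_gen3 = math.ceil(drive_gbps / PCIE3_PER_LANE)
lanes_gen4 = math.ceil(drive_gbps / PCIE4_PER_LANE)
print(lanes_gen3, lanes_gen4)  # 4 2
```

With the same drive needing half the lanes, a fixed CPU/chipset lane budget fits roughly twice as many SSDs.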


----------



## Lass3

Quote:


> Originally Posted by *epic1337*
> 
> false, 6C/12T Ryzen is slower than 5820K at the same clock, let alone when 5820K is clocked at 4.4Ghz or higher.
> 
> furthermore, dead-socket or not doesn't matter now a days, you can simply swap the entire rig for a much newer one.
> specially so when newer features becomes introduced, e.g. PCI-E 4.0 and DDR5.


Yeah... the DDR5 JEDEC spec finishes in 2018-2019, and DDR5 goes mainstream in 2019-2020.

Tbh I don't see the point in keeping an old motherboard with a dated chipset/features when buying a new CPU.

And yes, an OC'ed 5820K will beat an OC'ed Ryzen 1600 in most workloads, and especially in games.


----------



## FLCLimax

Better to buy a processor alone than a new system, but I'm not going to press on and upset anyone, lmao.

Hey, we claimed we wanted to see AMD make a competitive chip, but don't buy it, buy this old Intel chip, because Intel!


----------



## budgetgamer120

Quote:


> Originally Posted by *oile*
> 
> Hello guys,
> I am a competitive esports player, now on BF1 getting ready for it.
> I could buy a used 5820K for 220€ and a Gigabyte X99 Gaming 5P with a single non working ram slot for 70€.
> Considering Ryzen 1600 is at 210€ where I live and an Asus prime X370 is 150€ (somewhat better VRMs than B350s)
> what would you advise me to buy?
> I've considered OCability, dead socket, Ram compatibility, L3 ryzen cache HyperFabric problems , future optimizations but I cannot make a decision.
> 
> I come from a 2600K @ 5.0Ghz (bought in years in wich everyone was saying to go for i5s) and 2133Mhz ram on Z68.
> 
> Could you help me?


Why would anyone choose a 5820K? Does it achieve higher clocks?

A broken motherboard? No.


----------



## FLCLimax

Think about people's feelings before you post; it's upsetting to recommend AMD around here.


----------



## epic1337

Quote:


> Originally Posted by *FLCLimax*
> 
> Better to buy a processor alone than a new system but i'm not going to press on and upset anyone, lmao.
> 
> hey, we claim we wanted to see AMD to make a competitive chip but don't buy it, buy this old intel chip because intel!


Do take note that the current AM4 boards have issues; you either have to wait for that mythical BIOS "fix", or just swap to a newer AM4 board with more up-to-date features.

Quote:


> Originally Posted by *Lass3*
> 
> Yeah.. DDR5 JEDEC spec finishes 2018-2019 and DDR5, goes mainstream 2019-2020.
> 
> Tbh I don't see the point in keeping an old motherboard with dated chipset/features when buying a new CPU.
> 
> And yes. OC'ed 5820K will beat OC'ed Ryzen 1600 in most workloads and especially games.


I'd give DDR5 at least 3 years after release to become reasonably priced; similar to DDR4, it'd start out ridiculously expensive.
Though on that note, we'd be looking at 4 or 5 years from today, which sits at the perfect time to upgrade a long-term rig.

Quite so; nowadays new chipsets keep popping out like they're disposable. I'm looking at you, Intel.

Exactly; they underestimate the performance of the 5820K just because it's "OLD" while Ryzen is "NEW".


----------



## Pro3ootector

http://hexus.net/tech/reviews/ram/105241-gskill-flare-x-16gb-ddr4-3200-f4-3200c14d-16gfx/?page=2


----------



## CriticalOne

The choice between a 1600 and a 5820K isn't as straightforward as you may think.

The 1600 is much cheaper and it uses less energy. The advantages end there.

The 5820K is a marginally faster processor clock for clock and can overclock higher. X99 can support more I/O than X370, especially when it comes to PCIe lanes and quad-channel memory support.

Ryzen CPUs are seriously impressive, but you have to remember that they're mainly beating Intel on price. If you had a lot of money to spend, then the 5820K is the better choice. If you are like me and don't have a lot of money, then the 1600 is the better choice due to its value.


----------



## epic1337

Quote:


> Originally Posted by *CriticalOne*
> 
> The choice between a 1600 and a 5820k isn't as straightfoward as you may think.
> 
> The 1600 is much cheaper and it uses less energy. The advantages end there.
> 
> The 5820k is a marginally faster processor clock for clock and can overclock higher. X99 can support more I/O than X370, especially when it comes to PCIE lanes and quad channel support.
> 
> Ryzen are some seriously impressive CPUs but you have to remember that its mainly beating Intel over prices. If you had a lot of money to spend, then the 5820k is the better choice. If you are like me who doesn't have a lot of money, then the 1600 is the better choice due to its value.


But the choice between the two puts the 5820K at a lower price due to it being second-hand, so it's not really a price advantage anymore but more a case of "the R5-1600 is replaceable with a higher-tier CPU".

Normally the comparison ends up with the 5820K being pitted against the R7-1700 at an identical price, so AMD becomes the favorable choice due to having more cores and its more up-to-date platform.


----------



## dieanotherday

Intel's 8-core is still >$1k... they simply don't care about us...


----------



## KarathKasun

Quote:


> Originally Posted by *CriticalOne*
> 
> The choice between a 1600 and a 5820k isn't as straightfoward as you may think.
> 
> The 1600 is much cheaper and it uses less energy. The advantages end there.
> 
> The 5820k is a marginally faster processor clock for clock and can overclock higher. X99 can support more I/O than X370, especially when it comes to PCIE lanes and quad channel support.
> 
> Ryzen are some seriously impressive CPUs but you have to remember that its mainly beating Intel over prices. If you had a lot of money to spend, then the 5820k is the better choice. If you are like me who doesn't have a lot of money, then the 1600 is the better choice due to its value.


Most 5820K chips I've run across overclock horribly. There are some that do clock well, but it seems that most do not.
The I/O does not matter for your average user 9 times out of 10 either.

You can save money with older used setups; the problem lies in replacing parts should something break. I wouldn't waste time with a board that has a dead memory slot unless it's free; it's likely socket damage causing the issue. I would rather look into the off-lease resale market for high-end workstations. You can get systems from that market for dirt cheap.


----------



## SuperZan

Quote:


> Originally Posted by *epic1337*
> 
> *false, 6C/12T Ryzen is slower than 5820K at the same clock*, let alone when 5820K is clocked at 4.4Ghz or higher.
> 
> furthermore, dead-socket or not doesn't matter now a days, you can simply swap the entire rig for a much newer one.
> specially so when newer features becomes introduced, e.g. PCI-E 4.0, DDR5 or HMC.
> PCI-E 4.0 in particular would allow more NVMe SSDs to populate a given motherboard.


That is not true when claimed as an absolute.

http://www.overclockers.com/amd-ryzen-5-1500x-1600x-cpu-review/

Clock-for-clock comparisons, 1600X vs. 5820K, have them 'trading blows'. Looking at the performance, your claim is patently false. The 5820K is faster in some tasks, slower in some, and effectively the same in others. We're at a point where use-case and platform longevity are the determining factors, as the CPUs are each stronger in respective areas and functionally the same in many others (including most 'common' home desktop uses).


----------



## epic1337

Quote:


> Originally Posted by *SuperZan*
> 
> That is not true when claimed as an absolute.
> 
> http://www.overclockers.com/amd-ryzen-5-1500x-1600x-cpu-review/
> 
> Clock for clock comparisons, 1600x vs. 5820k, have it 'trading blows'. Looking at the performance, your claim is patently false. The 5820k is faster in some tasks, slower in some tasks, and effectively the same in others. We're at a point where use-case and platform longevity are the determining factors, as the CPU's are each stronger in respective areas and functionally the same in many others (including most 'common' home desktop uses).


Your point is effectively invalid, because BF1 prefers Intel's platform.
Quote:


> Originally Posted by *oile*
> 
> Hello guys,
> *I am a competitive esports player, now on BF1 getting ready for it.*
> I could buy a used 5820K for 220€ and a Gigabyte X99 Gaming 5P with a single non working ram slot for 70€.
> Considering Ryzen 1600 is at 210€ where I live and an Asus prime X370 is 150€ (somewhat better VRMs than B350s)
> what would you advise me to buy?
> I've considered OCability, dead socket, Ram compatibility, L3 ryzen cache HyperFabric problems , future optimizations but I cannot make a decision.
> 
> I come from a 2600K @ 5.0Ghz (bought in years in wich everyone was saying to go for i5s) and 2133Mhz ram on Z68.
> 
> Could you help me?


----------



## SuperZan

Quote:


> Originally Posted by *epic1337*
> 
> your point is effectively invalid.


Not really. You weren't specific in your assertion, which turned it into a blanket statement by virtue of its ambiguity, one I even helpfully qualified in my response.

If we're speaking solely in the context of the question, I'm assuming we're dealing with 1080p for competitive reasons. I play BF1 multiplayer quite a bit and have done so on a 3930k, 4790k, and 6700k, each at 4.6GHz, as well as a 1700x at 4.0GHz. At 1080p, I'd be hard-pressed to pick one out of a blind test and label it correctly. It scales very well with cores and threads, though, and the only processor that's going to push close to 240 frames for the ultra-high refresh rate monitors is a 6950x with a solid overclock at 720p.

http://www.pcgameshardware.de/Battlefield-1-2016-Spiel-54981/Specials/Battlefield-1-They-Shall-not-Pass-Benchmarks-1223170/

That said, the 1700 (and the other R7s, obviously) is marginally slower than the 6900K at about the same clocks. I'd thus expect an R5 at similar clocks to perform at or above Haswell level in multiplayer content. For his purposes, if he can reliably count on a 4.5GHz OC out of that 5820K and has no inkling to upgrade beyond the capabilities of Broadwell-E any time in the next few years, then the 5820K is a fine choice. However, a 1600X would provide a very similar practical experience and the option of a cheaper eight-core Zen or Zen+ part should more performance become necessary. Given how well the game scales with threads in multiplayer, I personally think that's a nice option to have for a very minimal (and probably imperceptible) trade-off.


----------



## epic1337

Quote:


> Originally Posted by *SuperZan*
> 
> Not really. You weren't specific in your assertion which turned it into a blanket statement by virtue of its ambiguity which I even helpfully qualified in my response.


My mistake on that part; I thought you were also paying attention to what he wants and were asserting what would fit him best.

That's why I mentioned that the 5820K would without a doubt be the better deal between the 5820K and the R5-1600, especially when the 5820K is practically cheaper.
Furthermore, I also mentioned that if it were compared to an R7-1700 instead, then the R7-1700 would have some advantages that simply make the 5820K pale in comparison.

Quote:


> Originally Posted by *SuperZan*
> 
> If we're speaking solely in the context of the question, I'm assuming we're dealing with 1080p for competitive reasons. I play BF1 multiplayer quite a bit and have done so on a 3930k, 4790k, and 6700k, each at 4.6GHz, as well as a 1700x at 4.0GHz. At 1080p, I'd be hard-pressed to pick one out of a blind test and label it correctly. It scales very well with cores and threads, though, and the only processor that's going to push close to 240 frames for the ultra-high refresh rate monitors is a 6950x with a solid overclock at 720p.


Don't put Intel's 4C/8T and 6C/12T on the same tier; the 5820K performs better in BF1 MP than Intel's 4C/8T processors.
Especially so when you clock them the same. Take note that reviewers have been comparing them stock vs. stock, which puts the 5820K nearly 1GHz behind.


----------



## Lass3

Quote:


> Originally Posted by *KarathKasun*
> 
> Most 5820k chips Ive run across overclock horribly. There are some that do clock well, but it seems that most do not.
> The I/O does not matter for your average user 9/10 times either.
> 
> You can save money with older setups that are used, the problem lies in replacing parts should something break. I wouldn't waste time with a board that has a dead memory slot unless its free. Its likely socket damage causing the issue. I would rather look into the 'off lease' resale market for high end workstations. You can get systems from that market for dirt cheap.


"Most 5820k chips Ive run across overclock horribly."

Yet Ryzen overclocks worse


----------



## Cherryblue

Quote:


> Originally Posted by *Lass3*
> 
> "Most 5820k chips Ive run across overclock horribly."
> 
> Yet Ryzen overclocks worse


I wouldn't call +800MHz a horrible overclock.

| Processor | Base | Boost | XFR |
|-----------|------|-------|-----|
| R5 1600 | **3.2** | 3.6 | 3.7 |

But then again, if you buy the one already overclocked at a premium, then yes, it has poor overclocking room...


----------



## epic1337

4.0GHz is actually quite hard to attain; based on what I've heard so far, the common ceiling is 3.8-3.9GHz on air and 3.9-4.0GHz on water.


----------



## KarathKasun

Quote:


> Originally Posted by *Lass3*
> 
> "Most 5820k chips Ive run across overclock horribly."
> 
> Yet Ryzen overclocks worse


It's about the same TBQH. The 5820K and Ryzen 1700 work off the same principle: buy slow and clock as high as you can. On average they both end up about the same, except the R7 has more cores.
Quote:


> Originally Posted by *epic1337*
> 
> 4.0Ghz is actually quite hard to attain, based on what i've heard so far the common ceiling is 3.8Ghz~3.9Ghz on air, while its 3.9Ghz~4.0Ghz on water.


I see many 5820Ks stuck around the same speeds unless they are under water, so it's a pretty meh point.


----------



## epic1337

Quote:


> Originally Posted by *KarathKasun*
> 
> Its about the same TBQH. 5820k and Ryzen 1700 work off the same principal, buy slow & clock as high as you can. On average they both end up about the same, except R7 has more cores.


But we're comparing it against an R5-1600, AMD's cheapest 6C/12T processor.


----------



## KarathKasun

Quote:


> Originally Posted by *epic1337*
> 
> but we're comparing it against an R5-1600, AMD's cheapest 6C/12T processor.


Then you end up at around the same performance instead of being slightly ahead with AMD. BUT the CPU and MB are new and of much better-known quality.

Have a problem with either part and you can RMA it for next to nothing.


----------



## Lass3

Quote:


> Originally Posted by *KarathKasun*
> 
> Its about the same TBQH. 5820k and Ryzen 1700 work off the same principal, buy slow & clock as high as you can. On average they both end up about the same, except R7 has more cores.
> I see many 5820's stuck around the same speeds unless they are under water, so its a pretty meh point.


Not in my experience... I hit 4.4 on a friend's 5820K using a Noctua NH-D14.

http://hwbot.org/hardware/processor/core_i7_5820k/

In comparison, the Ryzen 5 1600 I tried maxed out at 3950MHz using a Corsair H100 V2.

I've seen many people who can't even reach 4GHz on Ryzen. I'm talking rock solid here, not bench-stable like in most reviews.


----------



## KarathKasun

Quote:


> Originally Posted by *Lass3*
> 
> Not in my experience.. I hit 4.4 on a friends 5820K using a Noctua NH-D14
> 
> http://hwbot.org/hardware/processor/core_i7_5820k/
> 
> In comparison, the Ryzen 5 1600 I tried maxed out at 3950 using a Corsair H100 V2


I've seen a few that struggle mightily to get over ~4.2 with Phanteks or Noctua coolers without getting up into the 90°C range. The chips seem to have a wide window of quality. HWBot averages are a really rough thing to go off of, too, especially without looking at how broad the peak is in the statistics.


----------



## STEvil

Quote:


> Originally Posted by *KarathKasun*
> 
> Ive seen a few that struggle mightily to get over ~4.2 with Phanteks or Noctuas without getting up into the 90c range. The chips seem to have a wide window of quality. HW-Bot averages are a really rough thing to go off of too. Especially without looking at how broad the average peak is in the statistics.


The motherboard can also play a huge role in where you end up.

I would go with the Ryzen just for the upgrade path at the price point being given.


----------



## epic1337

I would skip the 1600 and go straight for a 1700, though; the resale value of 6C/12T CPUs will decline sharply as AMD releases more cheap 8C/16T CPUs.


----------



## Brutuz

Quote:


> Originally Posted by *Lass3*
> 
> Not in my experience.. I hit 4.4 on a friends 5820K using a Noctua NH-D14
> 
> http://hwbot.org/hardware/processor/core_i7_5820k/
> 
> In comparison, the Ryzen 5 1600 I tried maxed out at 3950 using a Corsair H100 V2
> 
> I've seen many people that can't even reach 4 GHz on Ryzen. I'm talking rock solid here. Not bench stable like in most reviews.


How were both systems OCed? What voltages did you change, and how? Not saying you don't know; I'm just aware that the ease of OCing multiplier-unlocked chips has led a lot of people to overlook the extra ways to get a few more MHz or some extra performance out of a chip. (E.g. I can get my 3570K mostly stable at 4.8GHz without delidding/custom TIM/anything other than an NH-D14 and a cheap ASRock board by adjusting PLL voltages among other things, even if my 24/7 OC stays the same; and when I had a C2D I gained an extra 200MHz on my stable 24/7 OC from mucking around with the PLL and NB voltages to improve FSB stability.)

Also, it's a new architecture with documented immaturity in UEFI and other low-level software. It took a couple of years for CPU/NB overclocking on Phenom II to become a well-known way to get a lot of extra performance easily, and an 800MHz CPU/NB OC was roughly equal to a 600MHz base CPU OC IIRC, so it wasn't a small performance increase for the cost of some extra power consumption and time spent tweaking.


----------



## Cherryblue

Quote:


> Originally Posted by *epic1337*
> 
> 4.0Ghz is actually quite hard to attain, based on what i've heard so far the common ceiling is 3.8Ghz~3.9Ghz on air, while its 3.9Ghz~4.0Ghz on water.


So be it. Not the point, but OK. Going by the R5 owners' topic on this forum, we have one user in eight who made it to 4GHz. The platform is still new, so it should get easier with time.

The point was, 700 or 800MHz over stock is still a nice gain, and should not be dismissed as "horrible overclocking".

Quote:


> Originally Posted by *epic1337*
> 
> i would skip the 1600 and go straight for a 1700 though, the resell value of 6C/12T CPUs would sharply decline as AMD releases more cheap 8C/16T CPUs.


To add more information to the question:

The R5 1600 just hit 195€ (with shipping) in France.

The R7 1700*X* in the same country was, at its lowest, 350€.

Mainly using a chip for gaming and multitasking, I'd personally be more inclined to buy the 6-core and keep the money left over (in this example, ~150€) for Ryzen v2, rather than pay 150€ more (77% of the 1600's cost) for 2 more cores (and 4 more threads, yes, useful for heavyweight software, but for games...).

*EDIT: Error on my side, it's the 1700X and not the 1700. I edited my post.*

The 1700 was at its lowest at 315€. That's still 115 to 120€ more, meaning ~59% of the R5 1600's price.


----------

