# AMD FX 8350 vs i5 3570K (Delidded): Single GPU and CrossFire GPU



## Stoffie

Hi All

This thread has been created by myself and Powermonkey500, as we have very similar hardware apart from the processor and motherboard.

My initial idea was to show the performance of the FX 8350 with a single GPU and in CrossFire. Powermonkey500 offered his services as a comparison, so A BIG THANK YOU TO HIM FOR HIS TIME AND EFFORT.

Both rigs are running 16 GB of Corsair Vengeance RAM at 1600 MHz and CrossFire HD 7970 GHz Editions at stock (1050/1500); please see my sig rig for further details. The main differences are:

AMD FX 8350 @ 4.8 GHz on an ASUS Crosshair V Formula-Z

Intel i5 3570K @ 4.8 GHz on an ASRock Extreme4

The games used so far are Dirt Showdown, Sleeping Dogs and Battlefield 3; we hope to add Hitman Absolution and Far Cry 3 soon.

I will start by adding the AMD benchies, followed by Intel. All games are run at 1080p on the highest possible (Ultra) settings, except in Showdown, where the option to turn on global illumination was not available on the Intel rig, so we left it off on both.

Battlefield 3 is a difficult benchmark, so we chose Tom's Hardware's preferred balanced benchmark, using Fraps on the following part of the game: http://www.youtube.com/watch?feature=player_embedded&v=wAA-NJj0Z0k
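Since Fraps only records raw per-second data, the min/max/avg figures quoted in threads like this have to be computed from its log afterwards. A minimal sketch of that step, assuming a Fraps-style per-second FPS CSV (the column layout and sample values here are illustrative, not taken from either rig's actual runs):

```python
# Compute min / max / average FPS from a Fraps-style per-second FPS log.
# The CSV layout and sample values below are illustrative, not real run data.
import csv
import io

SAMPLE_LOG = """\
Time (sec),FPS
1,63
2,71
3,68
4,90
5,74
"""

def fps_stats(csv_text):
    """Return (min, max, mean) FPS over all logged seconds."""
    rows = csv.DictReader(io.StringIO(csv_text))
    fps = [float(r["FPS"]) for r in rows]
    return min(fps), max(fps), sum(fps) / len(fps)

lo, hi, avg = fps_stats(SAMPLE_LOG)
print(f"min {lo:.0f}  max {hi:.0f}  avg {avg:.2f}")
```

For a real run you would point `csv.DictReader` at the file Fraps writes out instead of the embedded sample.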

AMD Validations:

CPUZ: http://valid.canardpc.com/2600069
GPUZ: http://www.techpowerup.com/gpuz/9v8uz/

Intel Validations:

GPUZ: http://www.techpowerup.com/gpuz/h35b6/
CPUZ: http://valid.canardpc.com/2599064

Sleeping Dogs

Single card AMD



Single Card Intel



Crossfire AMD



CrossFire Intel



CPU/GPU Usage FX8350



Showdown

Single Card AMD



Single Card Intel



Crossfire AMD



Crossfire Intel



CPU/GPU Usage FX 8350



Battlefield 3

Single Card AMD



Single Card Intel



Crossfire AMD



Crossfire Intel



CPU/GPU Usage FX8350



3DMark 11

Single Card AMD

http://www.3dmark.com/3dm11/5085392

Single Card Intel

http://www.3dmark.com/3dm11/5078798

Crossfire AMD

http://www.3dmark.com/3dm11/5085322

Crossfire Intel

http://www.3dmark.com/3dm11/5078760

Heaven

Single Card AMD



Single Card Intel



Crossfire AMD



Crossfire Intel



CINEBENCH

AMD



INTEL



MAX PAYNE 3

AMD Single



INTEL SINGLE



AMD Crossfire



INTEL Crossfire



AMD USAGE GRAPH



Formula 1 2012

AMD Single



INTEL SINGLE



AMD Crossfire



Intel Crossfire



Hitman Absolution

AMD Single



Intel Single to follow

AMD Crossfire



Intel Crossfire to follow

AMD Usage graph



Metro 2033

AMD Single

Min 25, Max 54, Avg 38.55

Intel Single to come

AMD Crossfire

Min 63, Max 90, Avg 78.667

Intel Crossfire To follow

AMD Usage



Before anyone asks: yes, on both rigs F1 2012 ran slower with CrossFire enabled than in single-GPU mode, hence I did not do a usage graph...

My conclusion is that they are very similar in performance at these speeds: the i5 wins some, the FX 8350 wins some. Out of 6 game benchmarks (3 single-GPU and 3 CrossFire), the FX 8350 won 3 and the i5 won 3; the i5 wins at 3DMark 11, but the 8350 was even with it in Heaven.

The intention of this thread is to show honest performance differences. I would again like to thank Powermonkey500 for his input, and I hope this is useful information for everyone.


----------



## sugarhell

Rep. So you have 99% utilization in BF3? Interesting. A bit difficult to do the same bench in multiplayer BF3 :/


----------



## Stoffie

Quote:


> Originally Posted by *sugarhell*
> 
> Rep. So you have 99% utilization in BF3? Interesting. A bit difficult to do the same bench in multiplayer BF3 :/


Exactly. I tried, but I averaged 150 FPS by looking up at the sky slightly, 130 in normal gameplay, and then 110 if I sniped and saw a lot of ground; we could never be equal. Thanks for the rep; please rep Powermonkey as well, it would not have been possible without his input!


----------



## cssorkinman

Very interesting stuff, thanks for your efforts!


----------



## Powermonkey500

Enjoy!


----------



## tsm106

Nice work. You should put the results back to back so one doesn't have to scroll far down to compare the results. As it is now, it's a pain to navigate.


----------



## Powermonkey500

Quote:


> Originally Posted by *tsm106*
> 
> Nice work. You should put the results back to back so one doesn't have to scroll far down to compare the results. As it is now, it's a pain to navigate.


I agree with this. Put them side by side, please.


----------



## Stoffie

Will do, it's 23:30 here in the UK, I will modify it tomorrow morning.


----------



## Powermonkey500

Quote:


> Originally Posted by *Stoffie*
> 
> Will do, it's 23:30 here in the UK, I will modify it tomorrow morning.


Is there any way to hand control to me?


----------



## AbdullahG

Nice job. Looking forward to more results. My only concern is that the benchmarks were done @ 1080p. A pair of HD 7970s would seem more practical in high-resolution gaming, not @ 1080p where a single high-end GPU is enough. It would be nice to see how the results compare beyond 1080p. +Rep nonetheless.


----------



## pwnzilla61

Nice work. A few more games would be nice, but this just proves that you really cannot go wrong with a Piledriver. Any DX11 game will run pretty much the same on either.


----------



## Stoffie

Quote:


> Originally Posted by *Powermonkey500*
> 
> Is there any way to hand control to me?


Doesn't look like it. The kids will be awake within 6 hours; I will get everything side by side then!


----------



## Stoffie

Quote:


> Originally Posted by *AbdullahG*
> 
> Nice job. Looking forward to more results. My only concern is that the benchmarks were done @ 1080p. A pair of HD 7970s would seem more practical in high-resolution gaming, not @ 1080p where a single high-end GPU is enough. It would be nice to see how the results compare beyond 1080p. +Rep nonetheless.


You're going to get a shock when I load the Hitman Absolution results; I can barely get 60 FPS in CrossFire with 100% utilisation on both cards.


----------



## AbdullahG

Quote:


> Originally Posted by *Stoffie*
> 
> You're going to get a shock when I load the Hitman Absolution results; I can barely get 60 FPS in CrossFire with 100% utilisation on both cards.


Is that because of poor CrossFire support, or is the game that difficult to run?


----------



## Stoffie

Quote:


> Originally Posted by *AbdullahG*
> 
> Is that because of poor CrossFire support, or is the game that difficult to run?


I am not sure yet, but in single GPU you are looking at less than 40 FPS on the highest settings; going to fiddle with it tomorrow.

Powermonkey, have a look, I have updated the layout. Got to bed late, and the wife was feeding the baby!!


----------



## MacLeod

Well done and very enlightening. Thanks to you both and looking forward to more results.


----------



## roudabout6

Great job

But what about Eyefinity and 1440p? I still don't regret switching to Intel after these results, though. Although my FPS may only be 5-15 frames more, I have had a much better experience with it.


----------



## TKFlight

These are my favorite types of benchmarks; it seems the FX-8350 is a true contender. When it comes to multiple-GPU setups, though, it seems AMD has some work to do. Awesome work! I'd also like to add that in BF3 the extra cores from the FX-8350 really come in handy.


----------



## jamaican voodoo

Thanks for this, guys. I was considering going AMD this coming year, but I wasn't sure about the performance. Now that I see some results, I'll certainly pick up an FX 8350. Thanks and +rep.


----------



## Stoffie

I would like to do Eyefinity, but I only have one monitor. However, I think this is relevant for anyone considering 120 Hz 1080p. It's also relevant if you live in a country (like I do) where the 3570K is quite a bit more expensive.

What I didn't note in the opening post, and I'm not sure if anyone thinks it is relevant, is that the 8350 still has more overclocking headroom; it is currently maxing at 48 degrees. The 3570K needed to be delidded to get to 4.8 GHz; previously its highest was 4.6 GHz, and it would potentially have needed to be backed down a bit because of the heat being generated.

I plan to add Far Cry 3 and Hitman Absolution tonight for the AMD rig.


----------



## Hokies83

Quote:


> Originally Posted by *Stoffie*
> 
> I would like to do Eyefinity, but I only have one monitor. However, I think this is relevant for anyone considering 120 Hz 1080p. It's also relevant if you live in a country (like I do) where the 3570K is quite a bit more expensive.
> What I didn't note in the opening post, and I'm not sure if anyone thinks it is relevant, is that the 8350 still has more overclocking headroom; it is currently maxing at 48 degrees. The 3570K needed to be delidded to get to 4.8 GHz; previously its highest was 4.6 GHz, and it would potentially have needed to be backed down a bit because of the heat being generated.
> I plan to add Far Cry 3 and Hitman Absolution tonight for the AMD rig.


Stateside, the Intel build is going to be about $100 cheaper.

PCPartPicker part list / Price breakdown by merchant / Benchmarks

*CPU:* Intel Core i5-3570K 3.4GHz Quad-Core Processor ($214.99 @ Newegg)
*Motherboard:* ASRock Z77 Extreme4 ATX LGA1155 Motherboard ($119.98 @ Outlet PC)
*Total:* *$334.97*
_(Prices include shipping, taxes, and discounts when available.)_
_(Generated by PCPartPicker 2012-12-01 05:21 EST-0500)_

PCPartPicker part list / Price breakdown by merchant / Benchmarks

*CPU:* AMD FX-8350 4.0GHz 8-Core Processor ($199.99 @ Newegg)
*Motherboard:* Asus Crosshair V Formula-Z ATX AM3+ Motherboard ($229.99 @ Newegg)
*Total:* *$429.98*
_(Prices include shipping, taxes, and discounts when available.)_
_(Generated by PCPartPicker 2012-12-01 05:23 EST-0500)_

Could you add Max Payne 3,
Borderlands 2 and
Batman: Arkham City
to the test?

And add CPU/GPU usage graphs for the 3570K too?

Because for some odd reason your results are much lower than my sig rig results, and 2 7970s should walk away from my 680s pretty easily.
Unless these games are also starting to take advantage of HT? Anyway, thanks for the post, +rep.


----------



## ProVisionOman

The problem is that the FX needs ALL of its 8 cores used to match the quad-core i5. Few games do that, but thank god Windows distributes the load across all the cores. In Skyrim, for example, the 8350 suffers badly because Skyrim only uses 2 threads effectively.
Edit: Thanks also for the thread. Really helpful. +Rep to you both









----------



## rdr09

Thank you Stoffie and Powermonkey. +rep to you guys.


----------



## itomic

Very nice thread. Could you throw in F1 2012? It's very CPU intensive and very multi-core aware.


----------



## Hokies83

Why are there no graphs for the 3570K and its GPU use / CPU load?


----------



## fetzher

Planetside 2 benchmark, please


----------



## Stoffie

Quote:


> Originally Posted by *Hokies83*
> 
> Stateside, the Intel build is going to be about $100 cheaper.
> PCPartPicker part list / Price breakdown by merchant / Benchmarks
> *CPU:* Intel Core i5-3570K 3.4GHz Quad-Core Processor ($214.99 @ Newegg)
> *Motherboard:* ASRock Z77 Extreme4 ATX LGA1155 Motherboard ($119.98 @ Outlet PC)
> *Total:* *$334.97*
> _(Prices include shipping, taxes, and discounts when available.)_
> _(Generated by PCPartPicker 2012-12-01 05:21 EST-0500)_
> PCPartPicker part list / Price breakdown by merchant / Benchmarks
> *CPU:* AMD FX-8350 4.0GHz 8-Core Processor ($199.99 @ Newegg)
> *Motherboard:* Asus Crosshair V Formula-Z ATX AM3+ Motherboard ($229.99 @ Newegg)
> *Total:* *$429.98*
> _(Prices include shipping, taxes, and discounts when available.)_
> _(Generated by PCPartPicker 2012-12-01 05:23 EST-0500)_
> Could you add Max Payne 3,
> Borderlands 2 and
> Batman: Arkham City
> to the test?
> And add CPU/GPU usage graphs for the 3570K too?
> Because for some odd reason your results are much lower than my sig rig results, and 2 7970s should walk away from my 680s pretty easily.
> Unless these games are also starting to take advantage of HT? Anyway, thanks for the post, +rep.


I can do Max Payne 3. I don't have Batman: Arkham City or Borderlands; if they are cheap on Steam I'll snap them up and add them, though. Also, don't forget the two 7970s are at stock 1050/1500; they can run at 1200/1600, and at that point I am sure they will run away, because they scale very well.

Also, I agree with you: in the US, if you are a gamer, the 3570K is better, but in the UK for the same money you would not get a 3570K, you would get an i5 34** non-K model.

Quote:


> Originally Posted by *itomic*
> 
> Very nice thread. Could you throw in F1 2012? It's very CPU intensive and very multi-core aware.


Absolutely, I will add this. It is very easy, though CrossFire is rubbish in F1 2012; I checked about 2 weeks ago and I got better FPS on a single card than I got on 2.


----------



## Powermonkey500

I'm actually downloading Hitman Absolution and Far Cry 3 at the moment. When they are finished we may have some extra benchmarks for you guys.
Unfortunately, my internet is horribly slow.


----------



## zooterboy

+rep to both of you, nicely done.


----------



## Mattb2e

+Rep

Great job compiling a list of benchmarks at equal speeds on two often-recommended gaming builds. It's nice to see the comparison and contrast of what each build will do in a gaming scenario. What is interesting is that you also chose to run the same benchmarks on the same systems with a CrossFire setup; most professional reviews leave out detail like that, which is partly why I like this thread so much.









----------



## mingqi53

Both processors are good for gaming.. /endthread haha


----------



## Hokies83

The main one I'm interested in is Borderlands 2, because that one bottlenecks my CPU.

I would also like to see a graph of the 3570K's core use/GPU use during the games, like the 8350 has, for a more in-depth comparison.


----------



## Stoffie

Quote:


> Originally Posted by *fetzher*
> 
> Planetside 2 benchmark pls


Can you confirm whether it has a built-in benchmark?

Quote:


> Originally Posted by *Hokies83*
> 
> The main one I'm interested in is Borderlands 2, because that one bottlenecks my CPU.
> I would also like to see a graph of the 3570K's core use/GPU use during the games, like the 8350 has, for a more in-depth comparison.


I agree with you, the idea behind this thread was initially to show that a 8350 does not bottleneck crossfire as I have seen claimed by people who have not actually even got one...

The only reason that a graph has been done for 8350 is because there is a software that AMD has that can give this information but it does not recognise intel chips, so if you know what Powermonkey500 can use I am sure he will be happy to oblige.

unfortunately borderlands is £30 on steam, I can't justify paying that for a game that I won't play just to benchmark, when it drops in price I will be happy to do it though.


----------



## Alatar

Please keep the thread clean, no attacks on other users or anything.


----------



## 12Cores

Great thread guys, keep the benches coming.


----------



## Demonkev666

Can you match single-core speed in Cinebench 11.5 so both are at equal scores?

I'd like to see the benches with the clock difference too.


----------



## MacLeod

Stoffie, how do you have your FX overclocked? Multiplier only or FSB with some HT Link and CPU/NB bumpage?


----------



## Powermonkey500

Quote:


> Originally Posted by *MacLeod*
> 
> Stoffie, how do you have your FX overclocked? Multiplier only or FSB with some HT Link and CPU/NB bumpage?


Check out his validation http://valid.canardpc.com/2600069


----------



## Stoffie

Quote:


> Originally Posted by *Demonkev666*
> 
> Can you match single-core speed in Cinebench 11.5 so both are at equal scores?
> I'd like to see the benches with the clock difference too.


Yes, I can do that; I'll ask Powermonkey500 too.

Quote:


> Originally Posted by *MacLeod*
> 
> Stoffie, how do you have your FX overclocked? Multiplier only or FSB with some HT Link and CPU/NB bumpage?


I overclock the whole thing; my FX reacts much better to FSB and multi. I have found that I get higher FPS if I get the HT link above 2800 and the CPU-NB at about 2400, but it is much harder to achieve an overclock because they have dead zones on the FSB. My validation in the opening post will give you the settings. HT link at 2600 definitely slows CrossFire down a bit.
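For anyone new to the FSB-plus-multiplier approach, the resulting clocks are just products of the reference clock ("FSB") and the various multipliers. A toy calculation with hypothetical values in the same ballpark as the figures above (these are not Stoffie's actual BIOS settings; his CPU-Z validation has those):

```python
# Effective AM3+ clocks are reference clock (the "FSB") times a multiplier.
# All values below are hypothetical examples, not anyone's real BIOS settings.
def mhz(ref_clock, multiplier):
    return ref_clock * multiplier

ref = 240.0                   # reference clock in MHz (stock is 200)
core = mhz(ref, 20.0)         # CPU multiplier of 20  -> 4800 MHz (4.8 GHz)
cpu_nb = mhz(ref, 10.0)       # CPU-NB multiplier of 10 -> 2400 MHz
ht_link = mhz(ref, 12.0)      # HT link multiplier of 12 -> 2880 MHz

print(core, cpu_nb, ht_link)
```

Raising the reference clock scales the core, CPU-NB and HT link together, which is why FSB overclocks need all three multipliers rebalanced.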


----------



## anubis44

Yes, I really appreciate it, too. This is something that I can't find anywhere: crossfire benchmarks with Vishera!

Thank you!


----------



## Powermonkey500

Anyway, I just sent him over Max Payne 3, F1 2012, and Cinebench. Should be appearing at some point in the near future, when he gets his benchmarks done and up.


----------



## Stoffie

F1 2012 is done; just downloading Max Payne, then they will be up. It's downloading at 6 Mbps off Steam, so it shouldn't be too long.


----------



## Stoffie

Personally, I would have thought that Hyper-Threading will make a difference in games that use multithreading; in a roundabout way, that is what the FX 8350 does. Ultimately it is a 4-module (4-core), 8-thread chip, allowing the operating system to schedule two threads or processes simultaneously. If Windows is sharing the resources between the threads and the games are using more threads, it should give you an advantage.

Please note this is my opinion; it is based on zero facts and has no evidence or source to back it up.


----------



## Powermonkey500

Quote:


> Originally Posted by *Stoffie*
> 
> Personally, I would have thought that Hyper-Threading will make a difference in games that use multithreading; in a roundabout way, that is what the FX 8350 does. Ultimately it is a 4-module (4-core), 8-thread chip, allowing the operating system to schedule two threads or processes simultaneously. If Windows is sharing the resources between the threads and the games are using more threads, it should give you an advantage.
> Please note this is my opinion; it is based on zero facts and has no evidence or source to back it up.


The FX 8350 is 8 physical cores, I believe.


----------



## rquinn19

Thanks...rep'd

With similar systems, I wish you guys could hook up a Kill A Watt and see the results as well.


----------



## Hokies83

Quote:


> Originally Posted by *sena*
> 
> Only game I have seen to use more than 4 threads (effectively) is BF3.


BF3, Planetside 2 and Sleeping Dogs use them.

From the OP, it looks to me like, in games that take advantage of more than 4 threads, an 8350 is equal to the 3570K clock for clock in gaming.

Looking at the other perks of both chips:

8350: better in multithreaded apps and equal to the 3570K in multithreaded games; uses 25% more power.

3570K: faster in single-threaded apps/games but a little bit slower in multithreaded apps; uses 25% less power.


----------



## ZealotKi11er

I don't think the 8350 closed the gap the 8150 had with the 2500K, which was crazy with 2 GPUs. For one thing, SD is a GPU game, so it's pointless to compare that. Dirt is, also. If anything, they are cherry-picked games.


----------



## Stoffie

Quote:


> Originally Posted by *Powermonkey500*
> 
> The FX8350 is 8 physical cores I believe


It's debatable. Cinebench recognises it as a 4-core, 8-thread processor; when I post the link you will see what I mean. It has 4 modules, each split into 2 parts. On my mobo I can only run the processor as a 4-, 6- or 8-core processor; if they were full cores, they should have the option to be switched off individually like the old Phenom IIs. I'm not saying I am right, because AMD claim it is an 8-core processor, and who am I to tell them it's not. It's just interesting how it works.
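For what it's worth, you can see how an OS enumerates a chip from its reported topology; on Linux, for instance, /proc/cpuinfo lists `siblings` (hardware threads per package) alongside `cpu cores`. A quick sketch parsing an illustrative excerpt — the field values below are made up for demonstration, not captured from an FX-8350:

```python
# Parse a /proc/cpuinfo-style excerpt to compare logical threads ("siblings")
# with physical cores ("cpu cores"). The excerpt below is illustrative only.
SAMPLE_CPUINFO = """\
processor\t: 0
model name\t: Example 8-thread CPU
siblings\t: 8
cpu cores\t: 4
"""

def topology(text):
    """Return (hardware threads, physical cores) from cpuinfo-style text."""
    fields = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
    return int(fields["siblings"]), int(fields["cpu cores"])

threads, cores = topology(SAMPLE_CPUINFO)
print(f"{cores} cores exposing {threads} hardware threads")
```

On a real Linux box you would read the text from `/proc/cpuinfo` instead of the embedded sample; how a given kernel counts Bulldozer modules versus cores has varied, which is part of why the "4 vs 8 cores" debate exists at all.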


----------



## Hokies83

Quote:


> Originally Posted by *Stoffie*
> 
> It's debatable. Cinebench recognises it as a 4-core, 8-thread processor; when I post the link you will see what I mean. It has 4 modules, each split into 2 parts. On my mobo I can only run the processor as a 4-, 6- or 8-core processor; if they were full cores, they should have the option to be switched off individually like the old Phenom IIs. I'm not saying I am right, because AMD claim it is an 8-core processor, and who am I to tell them it's not. It's just interesting how it works.


Yeah, I've heard that too... Have not checked into it much...

An engineer friend of mine just got an 8350 to test out and see what's going on with it; when he has some results I can share them with you.

But one would think that if a game takes advantage of the 8350's 8 threads, it would also take advantage of the 2600K/3770K's 8 threads?


----------



## Desert Rat

Looks to me like AMD has a decent CPU for the money. Did you guys notice any bottlenecks when running CrossFire? I want to build an AMD system, and so far this CPU looks like a winner to me. Thanks for the info, guys.


----------



## Stoffie

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I don't think the 8350 closed the gap the 8150 had with the 2500K, which was crazy with 2 GPUs. For one thing, SD is a GPU game, so it's pointless to compare that. Dirt is, also. If anything, they are cherry-picked games.


The only cherry-picking that went on was "here is a print screen of my Steam account... what games do you also have from that?" So we cherry-picked games to bench that we both have; otherwise there would be no comparison.

Quote:


> Originally Posted by *Hokies83*
> 
> Yeah, I've heard that too... Have not checked into it much...
> An engineer friend of mine just got an 8350 to test out and see what's going on with it; when he has some results I can share them with you.
> But one would think that if a game takes advantage of the 8350's 8 threads, it would also take advantage of the 2600K/3770K's 8 threads?


I would be really keen to hear what he has to say!


----------



## itomic

F1 2012 utilises all 8 cores on my chip!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stoffie*
> 
> The only cherry-picking that went on was "here is a print screen of my Steam account... what games do you also have from that?" So we cherry-picked games to bench that we both have; otherwise there would be no comparison.
> I would be really keen to hear what he has to say!


What does this have to say?

http://www.hardocp.com/article/2011/11/03/amd_fx8150_multigpu_gameplay_performance_review/1


----------



## Hokies83

I trust the OP's findings to be legit.

They almost 100% match the results of this 3rd-party reviewer:

Sleeping dogs
OP findings


Reviewer findings



BF3

OP findings



Reviewer findings


----------



## Dimaggio1103

Very nice thread. +rep guys

So what are all the "i5 will stomp AMD" fanboys gonna say now?

Turns out in gaming they look to be dead even. Hmmm.......


----------



## Hokies83

I have not tested Max Payne 3; let's see if it uses more than 4 threads. Will do that now.

I know Crysis did not on my system when I played it; BF3 does.

I know the new Transformers does not, and Darksiders II does not.

Any reports on AC3?


----------



## Dimaggio1103

This should be bookmarked to help fellow OCN'ers when they are looking for a CPU for their specific situation.

I know this collaboration between an Intel guy and an AMD guy will still not be enough for the trolls, but anyone with any intelligence will be able to see what's up.

Much appreciated, guys; you squashed the rumor that a big-boy GPU in CrossFire will bottleneck an 8350.


----------



## Stoffie

Quote:


> Originally Posted by *Hokies83*
> 
> Hmm, I have not tested Max Payne 3; let's see if it uses more than 4 threads. Will do that now.
> I know Crysis did not on my system when I played it; BF3 does.
> I know the new Transformers does not, and Darksiders II does not.
> Any reports on AC3?


No need, I have included a usage graph.

Right, Cinebench, F1 2012 and Max Payne 3 are up.


----------



## Hokies83

Quote:


> Originally Posted by *Stoffie*
> 
> No need, I have included a usage graph.
> Right, Cinebench, F1 2012 and Max Payne 3 are up.


Well, here is my Max Payne 3 usage.

Seems this game got an update; it used to run like crap.

It used your CPU a massive amount more than mine; I had it capped at 120 FPS.


----------



## 12Cores

This is a great thread, and I now know that next year, when I inevitably add another 7970, my CPU will not bottleneck my rig @ 1080p. Rep plus +1 to both of you. Please add RE5 and 3DMark 11 since they are free.


----------



## Stoffie

Quote:


> Originally Posted by *Hokies83*
> 
> Well, here is my Max Payne 3 usage.
> Seems this game got an update; it used to run like crap.
> It used your CPU a massive amount more than mine; I had it capped at 120 FPS.


Sorry Hokies, as I uploaded it I realised the wrong graph was up; I have since corrected it.


----------



## Hokies83

Quote:


> Originally Posted by *Stoffie*
> 
> Sorry Hokies, as I uploaded it I realised the wrong graph was up; I have since corrected it.


Where did you get the usage spikes? I was going all over the area trying to find a spike.


----------



## MisterMalv

Given that it's upgrade time in the new year, this thread has got me scratching my head. I was all over an Intel CPU & chipset, but given their price in the UK, I'm no longer sure. Decisions, decisions, for an avid but part-time gamer.
+rep for the OPs, I have a lot to think about.


----------



## Bitemarks and bloodstains

Thread cleaned.

Please keep all posts on topic.

I, the rest of the staff and the members do not care if you think the results are biased, if low-res benches are better, or how many threads games use.


----------



## ZealotKi11er

Quote:


> Originally Posted by *12Cores*
> 
> This is a great thread, and I now know that next year, when I inevitably add another 7970, my CPU will not bottleneck my rig @ 1080p. Rep plus +1 to both of you. Please add RE5 and 3DMark 11 since they are free.


It will, don't bother. Even my 3570K @ 4.6 GHz holds 2 x HD 7970 back @ 2560 x 1440. You need a lot more games to draw a conclusion. This is like the saying: don't judge a book by its cover. AMD's CPUs did not catch up to Intel's 4-generation lead overnight.


----------



## Electroneng

Great comparison! My 8320 games like a dream with a 7870. I have not tried multiple cards with it, as I just do not have the need.

If I go multiple cards in a system, the 3930K is my choice.


----------



## Electroneng

Quote:


> Originally Posted by *Bitemarks and bloodstains*
> 
> Thread cleaned.
> Please keep all posts on topic.
> I, the rest of the staff and the members do not care if you think the results are biased, if low-res benches are better, or how many threads games use.


Exactly!!!


----------



## 12Cores

Quote:


> Originally Posted by *Electroneng*
> 
> Great comparison! My 8320 games like a dream with a 7870. I have not tried multiple cards with it, as I just do not have the need.
> If I go multiple cards in a system, the 3930K is my choice.


I guess I will find out, but I think I will be fine. Thanks for the feedback guys.


----------



## Demonkev666

Quote:


> Originally Posted by *Stoffie*
> 
> Yes, I can do that; I'll ask Powermonkey500 too.
> I overclock the whole thing; my FX reacts much better to FSB and multi. I have found that I get higher FPS if I get the HT link above 2800 and the CPU-NB at about 2400, but it is much harder to achieve an overclock because they have dead zones on the FSB. My validation in the opening post will give you the settings. HT link at 2600 definitely slows CrossFire down a bit.


I see the Cinebench, but I said match thread speeds on the single-core score, not clocks, lol.

Like, can you both get 1.20 on there, and show the FPS with both chips at 1.20 in the single-core score?
Quote:


> Originally Posted by *Powermonkey500*
> 
> The FX8350 is 8 physical cores I believe


It will work both as a quad core and an 8 core. It's just that the 8 cores don't get the full 4-way decoder; Steamroller should fix this, however there are still other bottlenecks.
More of a hybrid chip, IMO.
Quote:


> Originally Posted by *Stoffie*
> 
> It's debatable. Cinebench recognises it as a 4-core, 8-thread processor; when I post the link you will see what I mean. It has 4 modules, each split into 2 parts. On my mobo I can only run the processor as a 4-, 6- or 8-core processor; if they were full cores, they should have the option to be switched off individually like the old Phenom IIs. I'm not saying I am right, because AMD claim it is an 8-core processor, and who am I to tell them it's not. It's just interesting how it works.


The biggest bottleneck so far for this architecture is its "single fetch for two cores"; a true dual core can have two fetches, one for each core.


----------



## Powermonkey500

Quote:


> Originally Posted by *Demonkev666*
> 
> I see the Cinebench, but I said match thread speeds on the single-core score, not clocks, lol.
>
> Like, can you both get 1.20 on there, and show the FPS with both chips at 1.20 in the single-core score?


That would be a pain in the ass.


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It will, don't bother. Even my 3570K @ 4.6 GHz holds 2 x HD 7970 back @ 2560 x 1440. You need a lot more games to draw a conclusion. This is like the saying: don't judge a book by its cover. AMD's CPUs did not catch up to Intel's 4-generation lead overnight.


Don't bother? It's funny: even with two people of different brands showing proof that neither is bottlenecking performance, you and others still say no.

There is no way an 8350 is gonna hold any single or dual GPU back, unless of course you're talking CPU-heavy games or tri/quad-fire; then you might have a case.

AMD has done more than just catch up to Intel's quad-core CPUs. There is this thread, plus multiple reviews, showing the 8350 outperforming the 3570K in multithreaded apps, which is most apps nowadays.

Single thread? Yeah, Intel is king, no question. Heck, I just came from an Intel rig, and I plan on going back depending on how Haswell turns out. However, I am kind of annoyed how almost every Intel guy says an FX CPU will bottleneck higher-end GPUs, specifically in CrossFire, without ever showing a shred of proof.

These two guys have shown such proof and are still in the process of doing so, yet proof seems to not be good enough.


----------



## TKFlight

I really don't think Haswell will be all that much better than Ivy; I've seen in other threads here that Haswell will mainly bring iGPU performance increases. AMD seems to be heading in the right direction; really good improvement with the FX-8320/8350. When Steamroller is released, maybe we are in for a big surprise?


----------



## Dimaggio1103

Quote:


> Originally Posted by *TKFlight*
> 
> I really don't think Haswell will be all that much better than Ivy; I've seen in other threads here that Haswell will mainly bring iGPU performance increases. AMD seems to be heading in the right direction; really good improvement with the FX-8320/8350. When Steamroller is released, maybe we are in for a big surprise?


Doubt it, although it would be a welcome surprise. I bounce around a lot between companies and would love to stay with AMD a while this time. If neither Haswell nor Steamroller offers a significant jump, I'll just rock this 6300 for a while; it does just fine.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Don't bother? It's funny: even with two people of different brands showing proof that neither is bottlenecking performance, you and others still say no.
> There is no way an 8350 is gonna hold any single or dual GPU back, unless of course you're talking CPU-heavy games or tri/quad-fire; then you might have a case.
> AMD has done more than just catch up to Intel's quad-core CPUs. There is this thread, plus multiple reviews, showing the 8350 outperforming the 3570K in multithreaded apps, which is most apps nowadays.
> Single thread? Yeah, Intel is king, no question. Heck, I just came from an Intel rig, and I plan on going back depending on how Haswell turns out. However, I am kind of annoyed how almost every Intel guy says an FX CPU will bottleneck higher-end GPUs, specifically in CrossFire, without ever showing a shred of proof.
> These two guys have shown such proof and are still in the process of doing so, yet proof seems to not be good enough.


Like I said, don't look at just three games. What happens when a game does not use 8 cores? The FX should perform much worse. Look at the 3770K review posted two pages back; the difference is huge with 2 GPUs. While the FX 8-core will beat the Core i5 in multitasking, gaming is a different thing. On top of that, it uses a lot more power.
Dirt and Sleeping Dogs are not CPU-dependent; they show the power of the GPUs, not the CPU. I leave my HD 7970s @ stock even under water because the 3570K does not always push them. You're telling me the FX will do better?


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Like I said, don't look at just three games. What happens when a game does not use 8 cores? The FX should perform much worse. Look at the 3770K review posted two pages back; the difference is huge with 2 GPUs. While the FX 8-core will beat the Core i5 in multitasking, gaming is a different thing. On top of that, it uses a lot more power.
> Dirt and Sleeping Dogs are not CPU-dependent; they show the power of the GPUs, not the CPU. I leave my HD 7970s @ stock even under water because the 3570K does not always push them. You're telling me the FX will do better?


All I am saying is that they are showing the systems handling modern games just fine. If a game is made single-threaded, that is the fault of the software company, not AMD's fault.

People seem to get that fact confused: AMD beats the 3570K in multithreaded apps, and games are becoming multithreaded at a quick pace. Black Ops 2, which runs on a dated engine, uses six cores; I say that's a sign of things to come. Heavy hitters like Crytek and DICE are already making their games multithreaded up to 8 cores.

Also, my main point is to contradict the people saying that an 8350 will bottleneck and a 3570K will not. Fanboys say that all the time and it's simply not true. Why should people be lied to? The FX-8350 is on par with the 3570K with the exception of single/dual-threaded apps, and we all know there aren't many more of those being made.

Most of the people claiming AMD sucks have not owned a Vishera chip. I have owned both an Ivy and a mid-range Vishera, so I actually speak from experience. I used to be just like you guys, dogging on AMD, until I experienced their new offerings for myself.


----------



## JunkoXan

Personally, it looks like if one has a 60Hz monitor, either one would do and you probably won't notice any difference, I think.







hmmm...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dimaggio1103*
> 
> All I am saying is they are showing the systems handling modern games just fine. If a game is made single threaded that is the fault of the software company, and not AMD's fault.
> People seem to get that fact confused, AMD beats the 3570K in multithreaded apps, and games are becoming multithreaded at a quick pace. Black ops 2 which runs on a dated engine uses six cores, I say that's a sign of things to come. Heavy hitters like crytek and dice are already making their games multithreaded up to 8 cores.
> Also, my main ppoint is to contradict the people saying that a 8350 will bottleneck and a 3570K will not. Fanboys say that all the time and its simply not true. Why should people be lied to? the FX-8350 is on par with the 3570K with the exception of single/dual threaded apps, and we all know there aint much more of those being made.
> Most of the people claiming AMD sucks have not owned a vishera chip. I have owned both a Ivy and mid range vishera so I actually speak from experience. I use to be just like you guys dogging on AMD until I experienced their new offerings for myself.


You're looking at it from a different angle. I judge a CPU by whether it can push the GPUs to their max; you judge it by whether it handles the game. Games don't use more than four cores to any real effect. The FX is simply keeping up because at 4.8GHz, in GPU-demanding games, it is able to feed the GPUs well enough. The problem is when the CPU is not enough; that's when the FX looks bad. That's why some suggested 800 x 600.


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You're looking at it from a different angle. I judge a CPU by whether it can push the GPUs to their max; you judge it by whether it handles the game. Games don't use more than four cores to any real effect. The FX is simply keeping up because at 4.8GHz, in GPU-demanding games, it is able to feed the GPUs well enough. The problem is when the CPU is not enough; that's when the FX looks bad. That's why some suggested 800 x 600.


800 x 600 is stupidly irrelevant. That's not an applicable benchmark for the real world at all.

If anyone ever posts a help thread asking for the best CPU to game at 800 x 600 I will be sure to recommend Intel.


----------



## Stefy

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Don't bother? It's funny that even with two people on different brands showing proof that neither is bottlenecking performance, you and others still say no.
> There is no way an 8350 is gonna hold any single or dual GPU back, unless of course you're talking CPU-heavy games or tri/quad-fire; then you might have a case.
> AMD has done more than just catch up to Intel's quad-core CPUs. There is this thread, plus multiple reviews showing the 8350 outperforming the 3570K in multithreaded apps. Which is most apps nowadays.
> Single thread? Yeah, Intel is king, no question. Heck, I just came from an Intel rig, and I plan on going back depending on how Haswell turns out. However, I am kind of annoyed that almost every Intel guy says an FX CPU will bottleneck higher-end GPUs, specifically in crossfire, without ever showing a shred of proof.
> These two guys have shown such proof and are still in the process of doing so, yet proof seems to not be good enough.


Oh yes, all FX chips will bottleneck dual GPUs at some point; there's no way around that. Besides, this thread is about 2 games with questionable results (imo).

The reviews speak for themselves though; we all know which is the better chip.


----------



## amd955be5670

Nice benches you've got there. I know the 500 series is outdated tech, but if someone gave those cards some love, it would be great, and fun for analysis.
Quote:


> Originally Posted by *Dimaggio1103*
> 
> 800 x 600 is stupidly irrelevant. That's not an applicable benchmark for the real world at all.
> 
> If anyone ever posts a help thread asking for the best CPU to game at 800 x 600 I will be sure to recommend Intel.


Buy an i7-3960X, OC it to 5.2GHz, buy four 6GB 7970s, and play games at 640x360. It's obviously better than an FX-8350 @ 2160p.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Stefy*
> 
> Oh yes, all FX chips will bottleneck dual GPUs at some point; there's no way around that. Besides, this thread is about 2 games with questionable results (imo).
> The reviews speak for themselves though; we all know which is the better chip.


Last I counted, it was five games they had benched so far.....









Also, you're right, reviews do speak for themselves. They show the 8350 beating the 3570K hands down in multithreaded performance. As I stated, not single-threaded.
http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/2


----------



## Stefy

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Last I counted it was five games so far they benched on.....
> 
> 
> 
> 
> 
> 
> 
> 
> Also you right reviews do speak for themselves. They show the 8350 beating the 3570K hands down in multithreaded performance. As I stated not single threaded.
> http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/2


If by "beats" you mean barely, then yeah. I would expect an 8-core chip to beat a four-core one. However, when it comes to gaming in general and dual-GPU setups, Intel is better.

Also, take a look at the gaming benches in your own link.


----------



## ZealotKi11er

Quote:


> Originally Posted by *amd955be5670*
> 
> Nice benches you got out there. I know 500 series is outdated tech, but if someone gave some love to those cards, it would be great, and fun for analysis.
> Buy an i7-3960x, OC it to 5.2Ghz, buy 7970 6Gb x 4, play games at 640x360. Its obviously better than FX-8350 @ 2160p.


You test at 800x600 just to see how far the CPU will get you. For example, in BF3 multiplayer there's only so much the GPU can do; past a point it's mostly CPU. Playing at 800x600 shows how many fps the CPU can push. If you get 80fps there, then no matter what GPUs you put in that system, you will not break 80fps.


----------



## amd955be5670

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You test at 800x600 just to see how far the CPU will get you. For example, in BF3 multiplayer there's only so much the GPU can do; past a point it's mostly CPU. Playing at 800x600 shows how many fps the CPU can push. If you get 80fps there, then no matter what GPUs you put in that system, you will not break 80fps.


Explain these two images to me and I'll give you a chocolate


Spoiler: Warning: Spoiler!



1080p

2160p

*Note:* Same gfx settings used.


----------



## Stoffie

Quote:


> Originally Posted by *Stefy*
> 
> Oh yes, all FX chips will bottleneck dual GPUs at some point, there''s no way around that. Besides, this thread is about 2 games with questionable results (imo).
> The reviews speaks for themselves though, we all know which is the better chip.


As one of the forum moderators has already said, no one cares if you think the results are biased... they are not. And there are 5 games; when we are done there will be about 10.

Quote:


> Originally Posted by *amd955be5670*
> 
> Explain these two images to me and I'll give you a chocolate
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1080p
> 
> 2160p
> 
> *Note:* Same gfx settings used.


Any chance of a usage graph?

I am thinking about doing all of my benches again with only 6 cores, to replicate an FX-6300 and see how it fares. What do you guys think?


----------



## amd955be5670

Quote:


> Originally Posted by *Stoffie*
> 
> any chance of a usage graph?


I can make one if you REALLY want it


----------



## Stoffie

Quote:


> Originally Posted by *amd955be5670*
> 
> I can make one if you REALLY want it


I'm just interested to know whether the game recognises that you are using a higher resolution and utilises an extra thread or core.


----------



## amd955be5670

Do you want CPU & GPU utilization, or only CPU?
I thought you were referring to the GPU.

Any software you recommend that stores the values in CSV?
I don't think you'll be satisfied with Task Manager and Afterburner graph screenshots.


----------



## Stoffie

Yep, as you are on AMD you can use AMD System Monitor. It only works with AMD though; just Google it and you will see it.


----------



## BinaryDemon

Did Powermonkey500 physically pull the 2nd card from his system when performing the single-GPU tests? Just wondering, because I noticed his GPU-Z validation shows PCIe 8x, which he can't help during the crossfire tests, but it would be a shame to deny himself the extra bandwidth in the single-GPU tests.


----------



## amd955be5670

I'll PM you the results whenever I'm able to work at it.


----------



## Powermonkey500

Quote:


> Originally Posted by *BinaryDemon*
> 
> Did Powermonkey500 physically pull the 2nd card from his system when performing the single gpu tests? Just wondering because I noticed his gpuz validation shows PCI-e 8x, which he can't help during the crossfire tests but it would be a shame to deny himself the extra bandwidth in the single gpu tests.


No. I would have to drain my water cooling and replace some parts to do that. Not worth it.


----------



## Hokies83

Quote:


> Originally Posted by *Powermonkey500*
> 
> No. I would have to drain my water cooling and replace some parts to do that. Not worth it.


Snagged Sleeping Dogs last night; it seems it does take advantage of HT?

If games are really heading towards 6/8 threads, does that mean the end of the i5? Making the i7 the new i5, and Intel bringing a 6-core to the mainstream socket?

Anywho..


----------



## rdr09

Quote:


> Originally Posted by *Hokies83*
> 
> Snagged sleeping dogs last night it seems it does take advantage of HT?
> If games are really heading towards 6/8 threads does that mean the end of the i5? making the i7 the new i5... and intel bringing a 6 core to the main stream socket?
> Anywho..


both single gpu and sli or just in sli?


----------



## TKFlight

It's not the end of the i5; it's just that Sleeping Dogs can take advantage of more than 4 cores. The i5 will be around for a while. Seeing this thread makes me wish 120Hz monitors were cheaper so I could CF, since I now know the i5 wouldn't bottleneck.


----------



## Powermonkey500

When I play BF3 multiplayer my CPU is pegged at 95-100% the entire time.


----------



## Hokies83

Quote:


> Originally Posted by *rdr09*
> 
> both single gpu and sli or just in sli?


In single and in SLI.

This thread kind of shone a whole new light on things for me... I like learning something new.

Anywho, how about we get a list of games that use more than 4 threads? Or is there a list already out there somewhere?

As I remember, a few years ago it was pretty much unheard of.

Here is another example: Max Payne 3 using 8 threads.. but very lightly.


----------



## BinaryDemon

Quote:


> Originally Posted by *Powermonkey500*
> 
> No. I would have to drain my water cooling and replace some parts to do that. Not worth it.


Heh, I understand. I was just looking at the subtle differences between the rigs. I was also trying to gauge the impact of Windows 8 vs Windows 7, but after looking at a bunch of Windows 8 performance reviews, that seemed pretty insignificant too. I couldn't find any patterns showing that any of these games favored one OS over the other. Both you and Stoffie did some nice work here.


----------



## rdr09

Hokies, this may sound taboo for a number of us, but are you saying that in some games quads can become a bottleneck? Even if they are OC'ed, like yours, at 5GHz?


----------



## Hokies83

Quote:


> Originally Posted by *rdr09*
> 
> hokies, this may sound taboo for a number of us but are you saying that in some games quads can become a bottleneck? even if they are oc'ed, like yours, at 5GHz?


Mine has 8 threads.

It depends on the person; I consider a CPU not being able to provide 120fps a bottleneck....

And so far the only game that can use 8 threads very heavily is BF3..

Also, there are games like Borderlands 2 that use 2 threads but bottleneck a CPU like mine.. so it is kind of give and take.

We need to find some kind of statement from the major game makers about their goals for future games, to see whether they are aiming at multithreaded or strong single-threaded titles.

It also kind of points at the i7 as the premium gaming CPU, able to match the strong single-threaded performance of the i5 but also to support multithreaded games, like the 8350 does.


----------



## rdr09

Quote:


> Originally Posted by *Hokies83*
> 
> Mine has 8 threads.
> It depends on the person i consider a cpu not being able to provide 120fps a bottleneck....
> And so far the only game that can use 8 threads very heavy is Bf3..
> Also there are games like Borderlands 2 that use 2 threads but bottleneck a Cpu like mine.. so right it is kind of a give or take.


yah, could be the monitor res and the 120Hz you are using. wow.


----------



## Dimaggio1103

It's a good idea for a lot of us to collaborate and put together a list of games that use six or more cores. It might be helpful to a future buyer. I don't think it's the end of an era for the quad core or the i5, but I know six cores is now a minimum for me. If I end up going back to Intel later on, it's a minimum of an i7; if I stay AMD, a minimum of a 6xxx. Hopefully Steamroller's big cores plus the new chipset add some flavor to things.

OP, I would love to see you shut down one module to simulate a 6300, as they are identical in cache; I wanna know if it will hold its own if I go crossfire GPUs.


----------



## Hokies83

Quote:


> Originally Posted by *rdr09*
> 
> yah, could be the monitor res and the 120Hz you are using. wow.


The one on the left is 2560x1440 60Hz; the one on the right is a 27-inch 1080i 120Hz. I have found that with heavy AA I prefer the 120Hz panel... and use the 2560x1440 to monitor things etc. more than for gaming.

I'ma test a few other games today and see how they're doing.

I had surgery Wednesday and have not been quite myself, with the use of only one arm and some pretty tough painkillers.

But I feel a bit better today and would like to get some more results.


----------



## Malo

What's the point of 800x600?? Does anybody know anyone who still plays at 800x600? Most play at 1080p or moar, lol.


----------



## sugarhell

Planetside 2 may support 6 cores.


----------



## Hokies83

Quote:


> Originally Posted by *sugarhell*
> 
> Planetside 2 maybe support 6 cores.


Yeah, I installed it, played it for 10 mins, and deleted it.. the game was not for me, lol.


----------



## Stoffie

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Its a good idea for alot of tus to collaborate and get a list of games that use six or more cores. Might be helpful to a future buyer. I dont think its the end of an era for the quad core or i5, but I know six cores is now a minimum for me. If I end up going back to Intel later on its a minimum of an i7, if I stay AMD minimum of a 6xxx. Hopefully steamroller's big cores plus the new chipset adds some flavor to things.
> OP, I woul love to see you shut down one module to simulate a 6300 as they are identical in cache, wanna know if it will hold its own if I go crossfire GPU's.


Will do it tomorrow; I'm going to spend the evening with the wife tonight. Do you just want crossfire benches, or both?


----------



## Ashtyr

In the Sleeping Dogs pics the AMD rig has Vsync ON; in the Intel pics it is off.

You get the same Cinebench result as a Thuban at 4.1GHz; this means an X6 should not bottleneck two high-end GPUs, right?

I know that my 670 isn't bottlenecked, but I am afraid to buy another one for my rig and end up with a big bottleneck.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Malo*
> 
> whats the point of 800x600?? does anybody know anyone who still plays at 800x600? most play at 1080p or moar lol


800x600 is where you see the max fps the CPU can output. From there you know what GPU to get and what settings to use. For example, if the CPU gives you 70 fps and the GPU gives you 90 fps, there is no point in lowering game settings to get over 70 fps, because you will not get over 70 no matter what. It's also an indication of whether you should OC the CPU or the GPU.
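Put another way, your frame rate is the smaller of two ceilings, and only raising the binding one helps. A toy sketch of that reasoning (the 70/90 figures are just the hypothetical numbers from the post above):

```python
def effective_fps(cpu_ceiling, gpu_ceiling):
    """Frame rate is capped by whichever side finishes its frames slower."""
    return min(cpu_ceiling, gpu_ceiling)

# CPU can prepare ~70 frames/s (what an 800x600 run reveals); GPU can render ~90.
print(effective_fps(70, 90))  # 70 -- lowering GPU settings won't raise this
# Here only overclocking the CPU moves the ceiling; OCing the GPU does nothing.
```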


----------



## Bruennis

Awesome information. In my opinion, these two chips are great but all things considered they are hardly equal. The 3570K is the more well rounded chip. Nonetheless, this is a step in the right direction for AMD and bodes well for Steamroller.


----------



## ZealotKi11er

OP, can you please run Sleeping Dogs with AA @ High, not Extreme? It shows a CPU bottleneck much better.


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 800x600 is where you see the max fps the CPU can output. From there you know what GPU to get and what settings to use. For example, if the CPU gives you 70 fps and the GPU gives you 90 fps, there is no point in lowering game settings to get over 70 fps, because you will not get over 70 no matter what. It's also an indication of whether you should OC the CPU or the GPU.


Firstly, low-res benchmarks are something made up by that website, and since then they have been thrown around like they are the final word. They have no real-world application, no matter how you spin it. Instead of creating weird, off-the-wall benchmarks to see how a system will do in games, I have an idea: how about we actually just play those games and see how the system fares?

The best gaming benchmark is to just play the game, end of story. That's exactly what these guys are doing, so as a moderator already said, there's no need to talk about 800 x 600 or any other useless benchmark, as no one cares.
Quote:


> Originally Posted by *Stoffie*
> 
> Will do it tomorrow, im going to spend the evening with the wife tonight, you just want crossfire bench's or both?


Both please. My 660 Ti is a higher-end mid-tier GPU on the same level as a GTX 580, so if your simulated 6300 can handle 7970s, then I can trust that 660 Tis would not be a concern, as I want to add a second one. If that makes sense at all, lol. Sorry, I just woke up a bit ago.

Quote:


> Originally Posted by *Bitemarks and bloodstains*
> 
> Thread cleaned.
> Please keep all posts on topic.
> I, the rest of staff and the members do not care if you think the results are bias, if low res benches are better or how many threads games use.


Just to remind you.


----------



## Malo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 800*600 is you do see the MAX fps CPU can output. From there you know what GPU do get and what setting yo use in GPU. For example if CPU give you 70 fps and GPU give you 90fps then there is no point on lowering game setting if you want to get over 70fps cause u will not get over 70 no matter what. Also it a indication if you should OC the CPU or the GPU.


OK, well, my point is: I want to know what the performance is at the resolution I play at.... if the CPU bottlenecks at 800x600 but doesn't even come close at 1080p, then I don't really care. See my point?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Malo*
> 
> ok well my point is, I want to know what the performance is at the resolution I play at is.... if the cpu bottlenecks at 800x600 and it doesnt even come close at 1080p then I dont really care, see my point?


These days the CPU is only seen as a bottleneck when it comes to 120Hz. To get 120Hz you need not only two GPUs but a very fast overclocked CPU. If the CPU can't deliver more than 80 fps in a certain game, then you save money by not blowing it on multiple GPUs.
Let's say you have one card and you're getting 60-70 fps. You expect 100-120 fps once you add a second card. The best way to find out is to lower the settings for the game you play. If you're not getting those high fps, only something like 80-90 fps, then the second GPU will not help.
Most people eliminate the CPU bottleneck by cranking up AA; doing so, the GPUs hit the wall first. Comparing CPUs this way is not a very good method. Let's take two examples.

CPU A
CPU B
Both with the same GPU.

Game X @ Max settings: both get ~70 fps.
So CPU A = CPU B, end of story.

Game X @ Medium settings: CPU A gets 90 fps, CPU B gets 80 fps.
CPU A is faster.

People here are saying that because they play at those settings, CPU B is as fast as CPU A, end of story.
Also, all the games tested work very well with CFX and are all AMD titles. You need 20-30 games to conclude that 8 cores = 4 faster cores.
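The second-card question in the post above can be put into the same arithmetic. A rough sketch, assuming a typical ~90% CrossFire scaling factor (all fps figures are hypothetical):

```python
def fps_with_second_gpu(cpu_ceiling, single_gpu_fps, scaling=0.9):
    """Adding a card roughly doubles GPU throughput (times a scaling factor),
    but the CPU ceiling, found by lowering settings, stays where it was."""
    two_gpu_fps = single_gpu_fps * (1 + scaling)
    return min(cpu_ceiling, two_gpu_fps)

# One card gives 65 fps; a low-settings run showed the CPU tops out near 85 fps.
print(fps_with_second_gpu(85, 65))  # 85 -- far short of the hoped-for 120+
```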


----------



## Malo

Why play on medium when you can play on max? Isn't that the point of buying all this expensive hardware?


----------



## JunkoXan

Quote:


> Originally Posted by *Malo*
> 
> why play on medium when you can play on max? isnt that the point of buying all this expencive hardware?


I think that is the point, imo...

Also, most people have 60Hz monitors, not 120Hz.

The tests did provide great results for both sides, showing great playability in single and crossfire.


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> These days the CPU is only seen as a bottleneck when it comes to 120Hz. To get 120Hz you need not only two GPUs but a very fast overclocked CPU. If the CPU can't deliver more than 80 fps in a certain game, then you save money by not blowing it on multiple GPUs.
> Let's say you have one card and you're getting 60-70 fps. You expect 100-120 fps once you add a second card. The best way to find out is to lower the settings for the game you play. If you're not getting those high fps, only something like 80-90 fps, then the second GPU will not help.
> Most people eliminate the CPU bottleneck by cranking up AA; doing so, the GPUs hit the wall first. Comparing CPUs this way is not a very good method. Let's take two examples.
> CPU A
> CPU B
> Both with the same GPU.
> Game X @ Max settings: both get ~70 fps.
> So CPU A = CPU B, end of story.
> Game X @ Medium settings: CPU A gets 90 fps, CPU B gets 80 fps.
> CPU A is faster.
> People here are saying that because they play at those settings, CPU B is as fast as CPU A, end of story.
> Also, all the games tested work very well with CFX and are all AMD titles. You need 20-30 games to conclude that 8 cores = 4 faster cores.


I'm sorry, but you are making absolutely no sense here.

Your argument seems to be:

1. On a 120Hz monitor there is a difference. Except not many people have or want a 120Hz monitor, and as blind tests have shown, only seasoned gamers can tell the difference.

2. Games at a low resolution like 800 x 600 fare better on Intel. Who gives a crap; no one with high-end hardware will ever play at that.

3. We should all play at medium settings so as to give Intel the proper advantage, instead of the max settings our rigs can handle.

I find it truly amazing that in the weeks and months prior, almost all the Intel guys said "Intel will destroy AMD in gaming" and "AMD CPUs bottleneck higher-end GPUs, specifically in crossfire/SLI".

Yet these guys are showing proof that that is a bunch of B.S. Is AMD 100% superior to Intel? No, not by any stretch, and nobody here has said that. They are just disproving some of the propaganda from the Intel camp.

All these guys are showing is that an AMD 8350 will game and run high-end GPUs perfectly fine.

Also, you guys seem to be getting the definition of bottleneck confused.

bottleneck, n.
1.
a. A narrow or obstructed section, as of a highway or a pipeline.
b. A point or an area of traffic congestion.

Meaning: if the GPU usage is at anything other than 98-99%, then you have a case. If the GPUs on both rigs are being maxed, yet Intel pulls more FPS, that is not a bottleneck; it is just an architecture advantage. There is a huge difference. Intel possesses the better architecture, no question, but AMD is not bottlenecking.
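By that definition the check is mechanical: log GPU usage while actually playing and see whether the card stays fed. A rough sketch (the ~98% threshold and the sample logs are made-up illustrations, not measured data):

```python
def diagnose(gpu_usage_samples, threshold=0.98):
    """Near-100% average GPU usage means the GPU is the limit;
    sustained dips below the threshold mean the CPU can't keep it fed."""
    avg = sum(gpu_usage_samples) / len(gpu_usage_samples)
    return "GPU-bound (no CPU bottleneck)" if avg >= threshold else "CPU bottleneck"

print(diagnose([0.99, 0.98, 1.00, 0.99]))  # GPU-bound (no CPU bottleneck)
print(diagnose([0.60, 0.70, 0.65, 0.72]))  # CPU bottleneck
```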


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Im sorry but you are making absolutely no sense here.
> Your argument seems to say.
> 1, 120Hz monitor there is a diff. Except not many people have or want a 120Hz monitor. And as blind tests have shown only seasoned gamers can tell the diff.
> 2. Games at low resolution 800 x 600 fair better on Intel. Who gives a crap no one with high end hardware will ever play on that.
> 3. We should all play at medium settings as to give Intel the proper advantage, instead of max settings like our rigs can handle.
> I find it truley amazing that in weeks and months prior almost all Intel guys said "Intel will destroy AMD in gaming" and "AMD cpu's bottleneck higherend GPUs, specifically in crossfire/SLI"
> Yet these guys are showing proof that is a bunch of B.S. Is AMD 100% superior to Intel? no not by any stretch, and nobody here has said that. They are just disprooving some of the propaganda from the Intel camp.
> All these guys are showing is that a AMD 8350 will game and run high end GPU's perfectly fine.
> Also, you guys seem to be getting the definition of bottleneck confused.
> bot·tle·neck (btl-nk)
> n.
> 1.
> a. A narrow or obstructed section, as of a highway or a pipeline.
> b. A point or an area of traffic congestion.
> Meaning if the GPU usage is at anything other than 98-99% usage than you have a case. IF the GPU's on both rigs are being maxed yet Intel pulls more FPS that is not a bottleneck, it is just a architecture advantage. There is a huge difference. Intel possess the better architecture, no question, but AMD is not bottlenecking.


You clearly have no idea what a CPU bottleneck is. When I buy 2 x 7970s, I want to get every single fps they provide, end of story. You don't buy 2 x HD 7970 just to get over 60 fps.

Also, if you get 99% GPU usage with either an AMD or an Intel CPU, you will get the same FPS.


----------



## paulerxx

So AMD really isn't that far off in gaming... why do I keep hearing they're miles behind in this area?


----------



## Powermonkey500

Quote:


> Originally Posted by *paulerxx*
> 
> So AMD really isn't that far off in gaming..Why do I keep hearing they're miles behind in this area?


Well behind the scenes I am having a million problems with Eyefinity and Crossfire combined. I am actually extremely disappointed.
http://www.overclock.net/t/1333065/eyefinity-crossfire-problems-with-vsync-video-of-problem-attached/0_20#post_18732191

Basically I have irreversible stutter and broken vsync. These games look like garbage.


----------



## Malo

Quote:


> Originally Posted by *paulerxx*
> 
> So AMD really isn't that far off in gaming..Why do I keep hearing they're miles behind in this area?


lol, fanboys, that's why... Intel can play old games with a higher min fps...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Malo*
> 
> lol fan boi's thats why... intel can play old games with a higher min fps...


What does that even mean?

After getting 2 x HD 7970, I know my next upgrade will be the CPU. I want to get all the fps I can from the GPUs I have before I spend $1K on another GPU and end up with low GPU usage.


----------



## sage101

And all this time we were made to believe that AMD was miles behind Intel in terms of gaming, especially with Xfire/SLI setups.


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You clearly have no idea what a CPU bottleneck is. When I buy 2 x 7970s, I want to get every single fps they provide, end of story. You don't buy 2 x HD 7970 just to get over 60 fps.
> *Also, if you get 99% GPU usage with either an AMD or an Intel CPU, you will get the same FPS.*


Incorrect. Given two platforms getting the exact same GPU usage, one can still have slightly higher FPS if its CPU handles CPU-intensive tasks better. The GPU is not the only thing that plays the game; the CPU is just as important.

Clearly you have no idea how a game works. Things like textures, lighting, shaders etc. get offloaded to the GPU, while things like physics get sent to the CPU (physics, not PhysX). So if both GPUs are being fed fast enough, it comes down to CPU core architecture and MHz. Hence why games that are optimized for 4+ cores run better on quad cores than games only optimized for 1-2 cores.


----------



## Malo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What does that even mean?
> After getting 2 x HD 7970 i know my next upgrade will be CPU upgrade. I want to get all the fps i can from the GPU i have before i spend 1K on other GPU and get low gpu usage.


You keep contradicting yourself all over the place.


----------



## mohit9206

I just wanna ask: what CPU should someone get to drive a single 7970 or 680 at 1080p, an 83xx or an i5-3xxx?
And which one for a crossfire or SLI configuration at 1440p and higher?
I guess the winner and still champion is the Intel i5-3xxx, but what's your final word?


----------



## cssorkinman

Hwbot just finished an appropriately named competition between the two CPUs being compared in this thread. There were three stages with different rules. A 3570K won two of the stages and an 8350 won the other. In the overall score, 3570Ks took 1st and 2nd place, with an 8350 in 3rd as the top AMD in the competition. It may be of interest to anyone following this thread. http://hwbot.org/competition/ocfanboy_duel_1/


----------



## Dimaggio1103

On topic: you say both CPUs bottleneck. OK, where? They are posting results from actual gaming at actual resolutions people use, yet I see no bottleneck. Could you point me to some proof, rather than just saying things without backing them up? Thanks.

On a side note, Zealot is a respected member, so I would not lump him in with some of the other members posting nonsense. It's a healthy debate; let's just keep it civil.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dimaggio1103*
> 
> On topic: you say both CPUs bottleneck. OK, where? They are posting results from actual gaming at actual resolutions people use, yet I see no bottleneck. Could you point me to some proof, rather than just saying things without backing them up? Thanks.


I play @ 2560 x 1440, which is more demanding on the GPU. When I play BF3 MP, my GPU usage drops to 60%. This shows my 3570K can't keep up. Yeah, I am still getting 60+ fps, but I should be getting 100 fps at that point in the game.
You're mistaking a CPU bottleneck for the CPU holding you back. In the games tested by the OP with those settings, both CPUs will be fine. That being said, it does not tell you whether the CPUs were holding the GPUs back or the other way around.


----------



## Hokies83

*Transformers: Fall of Cybertron* is a new game that I enjoy a lot.





This game seems to use 99% of one thread, barely anything on 3 others, and 0% of the rest.
It also uses 50% of one 680 and about 7% of the other.

*Dirt Showdown*




Showdown seems to really, really like lots of threads.

*Darksiders 2*




Darksiders 2 seems to use a lot of one thread and an average amount of 4 others, with minimal use of the 2 remaining threads.


----------



## Catscratch

I'm suggesting a punishment for the hot-heads in this thread. Find a board and chalk up:

"It all depends on the game"
"It all depends on the game"
"It all depends on the game"
"It all depends on the game"
.
.

100 times









Chill out.

On topic: It all depends on the game







There's a review http://vr-zone.com/articles/amd-fx-8350-vs-intel-core-i7-3770k--4.8ghz--multi-gpu-gaming-performance/17494.html that shows Vishera can't keep up with the 3770K at the same frequency.


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I play @ 2560 x 1440, which is more demanding on the GPU. When I play BF3 MP, my GPU usage drops to 60%. This shows my 3570K can't keep up. Yeah, I am still getting 60+ fps, but I should be getting 100 fps at that point in the game.
> You're mistaking a CPU bottleneck for the CPU holding you back. In the games tested by the OP with those settings, both CPUs will be fine. That being said, it does not tell you whether the CPUs were holding the GPUs back or the other way around.


So I see your point and it makes complete sense to me. I had a 1055T and a 2500K, and both bottlenecked my dual 6970s. However, when I clocked my 2500K higher, the bottleneck started going away; once I got to around 4.5GHz it was gone. So, considering 7970s are much more powerful, it makes sense that to see a bottleneck go away on either CPU you would have to push even further.

The problem is that GPU performance is almost doubling each generation, yet CPU performance is not. So I would say it's the fault of both the games and the GPUs, and not specifically AMD's crappy CPU.

My point is that AMD does not have the horrid performance some fanboys say they do. You said it yourself: both will bottleneck. It makes sense, though; you are only clocked at 4.5GHz while they were both clocked at 4.8GHz, so I am willing to bet that if you push to 4.8 your bottleneck would start to disappear, if not be gone completely.


----------



## Stoffie

Quote:


> Originally Posted by *mohit9206*
> 
> I just wanna ask: what CPU should someone get if a single 7970 or 680 is to be used at 1080p, an 83xx or an i5-3xxx?
> And which one for a CrossFire or SLI configuration at 1440p and higher?
> I guess the winner and still champion is the Intel i5-3xxx, but what's your final word?


No, it's not the i5-3xxx, it's the i5 3570K. There is no way an i5 33xx, 34xx, 3550 or non-K 3570 would keep up with the 8350, because of the lack of overclocking headroom. That being said, no matter how you look at it, clock for clock the i5 3570K is faster. Albeit by a small margin, even losing BF3 by 4 fps while winning by 10-15% in others, it is still the faster all-round gaming processor right here, right now.

To the guys saying the 3570K will win in single-threaded games: of course it will. You only have to look at the single-thread score in Cinebench, where the AMD chip loses by 35%. But then again, this thread is about whether these processors can handle CrossFire, and I'd bet any single-threaded game doesn't require CrossFire anyway; it is going to be CPU-bound.

The 8350 is a good chip if you like AMD and want to support them, if you already have a 990FX mobo and want a cheap upgrade, if gaming is important but you also do a lot of heavy multithreaded work, or if where you live it is 15% or more cheaper than the Intel. If the i5 is cheaper where you are, then it is a no-brainer for gaming.


----------



## Hokies83

Quote:


> Originally Posted by *Stoffie*
> 
> No, it's not the i5-3xxx, it's the i5 3570K. There is no way an i5 33xx, 34xx, 3550 or non-K 3570 would keep up with the 8350, because of the lack of overclocking headroom. That being said, no matter how you look at it, clock for clock the i5 3570K is faster. Albeit by a small margin, even losing BF3 by 4 fps while winning by 10-15% in others, it is still the faster all-round gaming processor right here, right now.
> To the guys saying the 3570K will win in single-threaded games: of course it will. You only have to look at the single-thread score in Cinebench, where the AMD chip loses by 35%. But then again, this thread is about whether these processors can handle CrossFire, and I'd bet any single-threaded game doesn't require CrossFire anyway; it is going to be CPU-bound.
> The 8350 is a good chip if you like AMD and want to support them, if you already have a 990FX mobo and want a cheap upgrade, if gaming is important but you also do a lot of heavy multithreaded work, or if where you live it is 15% or more cheaper than the Intel. If the i5 is cheaper where you are, then it is a no-brainer for gaming.


All but one of the games I seem to play use more than 4 threads.

And this has taught me something new.

1. Moar coars does have a place in the near future of gaming.

2. I was thinking about getting an i5 Haswell; well, no longer after this. Sticking with the extra threads seems to be the way to go to tackle everything the future will throw at us.


----------



## Ashtyr

Another game that uses at least six cores is Far Cry 3.



Damn, I need another 670 for this game; I accept donations.
I think that as games start using six cores, my Phenom X6 could have a second youth.


----------



## Hokies83

Quote:


> Originally Posted by *Ashtyr*
> 
> Another game that uses at least six cores is Far Cry 3.
> 
> Damn, I need another 670 for this game; I accept donations.
> I think that as games start using six cores, my Phenom X6 could have a second youth.


Yeah, I'll snag that, plus AC3 and Black Ops 2, when I catch them for $20 in a forum sale.

Snagged Sleeping Dogs for $10 last night.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ashtyr*
> 
> Another game that uses at least six cores is Far Cry 3.
> 
> Damn, I need another 670 for this game; I accept donations.
> I think that as games start using six cores, my Phenom X6 could have a second youth.


----------



## Ashtyr

Quote:


> Originally Posted by *ZealotKi11er*


I'm not sure what you want to say with that.

If it is to show me that my X6 could be the limiting factor: maybe, maybe at stock, but I don't think so at 4.1GHz with a 3.2GHz NB.

So I'm finding this game needs more GPU than CPU, at least in my case.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ashtyr*
> 
> I'm not sure what you want to say with that.
> If it is to show me that my X6 could be the limiting factor: maybe, maybe at stock, but I don't think so at 4.1GHz with a 3.2GHz NB.
> So I'm finding this game needs more GPU than CPU, at least in my case.


More that it shows the game does not benefit from 6 cores. So just telling you to watch out. What kind of fps are you getting?


----------



## Hokies83

Well, it does seem most people are happy with 60 FPS, and just about any unlocked CPU since 2009 should be up to that task with overclocking.

Where you need the high-end Intel chips is when you're shooting for the golden 120fps mark like I do. And that seems to be a small fraction of people.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Hokies83*
> 
> Well, it does seem most people are happy with 60 FPS, and just about any unlocked CPU since 2009 should be up to that task with overclocking.
> Where you need the high-end Intel chips is when you're shooting for the golden 120fps mark like I do. And that seems to be a small fraction of people.


More fps is better in general for MP games. In BF3, if you are averaging 60-70 fps, that means you are hitting the 40s in some places. Averaging higher means better minimum fps.
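The average-vs-minimum distinction discussed throughout the thread can be made concrete with two invented fps traces (a sketch; the numbers are made up, not from any benchmark here): two runs can share an average while having very different minimums, which is what you feel in MP.

```python
# Sketch: why average fps alone hides the dips that matter in multiplayer.
# Both sample sequences below are invented for illustration.

def avg_fps(fps_samples):
    """Mean fps over a run of per-second samples."""
    return sum(fps_samples) / len(fps_samples)

steady = [62, 65, 60, 63, 61, 64]   # flat ~62 fps the whole time
spiky  = [90, 88, 45, 92, 42, 18]   # same average, ugly stutters

print(avg_fps(steady), min(steady))  # 62.5 avg, 60 min
print(avg_fps(spiky),  min(spiky))   # 62.5 avg, 18 min
```

Identical averages, but only the first run never dips below 60; this is why the i5's better CrossFire minimums noted later in the thread are worth looking at alongside averages.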


----------



## Ashtyr

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Showing more that it does not benefit from 6 Cores. So just telling you to watch out. What kind of fps are you getting?


I'm at the beginning, but for now it moves between 50 and 60 fps on average.

I'm playing Ultra with HBAO and 2xMSAA.

An issue I am having is that sometimes the game locks to 30fps; whatever you do, 30 fps. For example, I'll be at 60 fps, looking around without moving, and suddenly it drops to 30 fps and stays there. I think it could be Vsync.

Edit: when frames drop to 30, GPU usage drops to 40-50%, while CPU usage stays about the same at 50%.

I think it uses six cores. You can see the fps, the individual core usage, the GPU usage, the GPU memory and the CPU temperature.

Some pics


----------



## jeffro37

I noticed that the i5, when CrossFired, had better min fps in most of the games. Do you look at that or average fps more often to choose what setup you would use? I'm asking because I always just went by average rather than min fps. How easy is it to get the 8350 up to 4.8? My brother is in need of an upgrade, and the offerings from AMD look very good this time.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ashtyr*
> 
> I'm at the beginning, but for now it moves between 50 and 60 fps on average.
> I'm playing Ultra with HBAO and 2xMSAA.
> An issue I am having is that sometimes the game locks to 30fps; whatever you do, 30 fps. For example, I'll be at 60 fps, looking around without moving, and suddenly it drops to 30 fps and stays there. I think it could be Vsync.
> Edit: when frames drop to 30, GPU usage drops to 40-50%, while CPU usage stays about the same at 50%.


Yeah, in some games, if you drop under a certain fps, Vsync will fall from 60 to 30. Your fps looks about right, assuming your CPU is OCed? To be 100% sure, check GPU usage, ignoring that 30 fps drop. If it's at 90% most of the time, you will probably benefit from a second card. You will not get more than 60-70 fps, though.
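The 60-to-30 lock described above is the classic double-buffered Vsync behavior: a finished frame must wait for the next screen refresh, so a frame that takes even slightly longer than one refresh interval occupies two of them. A small sketch of that arithmetic (assuming a 60Hz display; the frame times are illustrative):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh at 60Hz

def vsync_fps(frame_ms: float) -> float:
    """With double-buffered Vsync, each frame occupies a whole number
    of refresh intervals, so fps snaps to 60, 30, 20, ..."""
    intervals = math.ceil(frame_ms / REFRESH_MS)
    return REFRESH_HZ / intervals

print(vsync_fps(15.0))  # 60.0 - rendering faster than the refresh
print(vsync_fps(18.0))  # 30.0 - just misses a refresh, locks to half rate
```

This also explains the GPU usage dropping to 40-50% during the lock: the GPU finishes early and then sits idle waiting for the next refresh.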
Quote:


> Originally Posted by *jeffro37*
> 
> I noticed that the i5, when CrossFired, had better min fps in most of the games. Do you look at that or average fps more often to choose what setup you would use? I'm asking because I always just went by average rather than min fps. How easy is it to get the 8350 up to 4.8? My brother is in need of an upgrade, and the offerings from AMD look very good this time.


I would say about as hard as getting 4.8GHz with a 3570K. It's also the luck of the draw.


----------



## Stefy

Very funny thread. I'd take my 930 over any FX chip.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stefy*
> 
> Very funny thread. I'd take my 930 over any FX chip.


I don't know why we compare Ivy to FX. Why don't we compare the Core i7 920 with FX?


----------



## 2advanced

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I don't know why we compare Ivy to FX.


Why shouldn't we?


----------



## Stefy

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I don't know why we compare Ivy to FX. Why don't we compare the Core i7 920 with FX?


Indeed. It would be a fairer comparison for the FX.


----------



## ZealotKi11er

Quote:


> Originally Posted by *2advanced*
> 
> Why shouldn't we?


It's better to test against the old king. A Core i7 920 @ 4.2GHz is a tough CPU. If the FX can beat that, then it is a truly impressive CPU.


----------



## newblood

Can you do Crysis 2 please?


----------



## Electroneng

Quote:


> Originally Posted by *Stefy*
> 
> Very funny thread. I'd take my 930 over any FX chip.


The FX 8320 and 8350 are better performers than the excellent first-gen i7 chips. Preference is left to the owner, as one cannot go wrong either way.

I am not a fan of Ivy Bridge due to the problems with trying to control the thermal conditions on such a small die area. I know, as I have pushed these chips all the way to -70C. I will take the Sandy chips any day over Ivy.

Give me LN2 or cascade phase-change cooling, though, and Ivy is the place to be!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Electroneng*
> 
> The FX 8320 and 8350 are better performers than the excellent first-gen i7 chips. Preference is left to the owner, as one cannot go wrong either way.
> I am not a fan of Ivy Bridge due to the problems with trying to control the thermal conditions on such a small die area. I know, as I have pushed these chips all the way to -70C. I will take the Sandy chips any day over Ivy.
> Give me LN2 or cascade phase-change cooling, though, and Ivy is the place to be!


The difference with the Core series was not that they were that much faster; they had much better CF scaling than AMD. Did something change with the new FX chips? Really interested to see a comparison.


----------



## Electroneng

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The difference with the Core series was not that they were that much faster; they had much better CF scaling than AMD. Did something change with the new FX chips? Really interested to see a comparison.


Please provide your own experience and ownership of the Piledriver chips and we can go further. I do agree with your statement on CF, as well as SLI scaling, with first-gen Core products.


----------



## 2advanced

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It's better to test against the old king. A Core i7 920 @ 4.2GHz is a tough CPU. If the FX can beat that, then it is a truly impressive CPU.


The old king is dead. FX beats it in everything.

http://www.anandtech.com/bench/Product/697?vs=47

And they both have similar overclocking headroom as well.


----------



## ZealotKi11er

Quote:


> Originally Posted by *2advanced*
> 
> The old king is dead. FX beats it in everything.
> http://www.anandtech.com/bench/Product/697?vs=47
> And they both have similar overclocking headroom as well.


Lol, really? 2.66GHz vs 4.0GHz.

A CPU that can do 4.4GHz vs one that can do 5GHz: one has almost a 2GHz OC, the other 1GHz.


----------



## Electroneng

Quote:


> Originally Posted by *2advanced*
> 
> The old king is dead. FX beats it in everything.
> http://www.anandtech.com/bench/Product/697?vs=47
> And they both have similar overclocking headroom as well.


First-gen i7 chips are still top notch, but I do prefer the 8320 and 8350 to them.

No offense to anyone in this thread, but I cannot argue about a CPU and platform I have not owned and tested.


----------



## Madclock

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Lol, really? 2.66GHz vs 4.0GHz.
> A CPU that can do 4.4GHz vs one that can do 5GHz: one has almost a 2GHz OC, the other 1GHz.


Stock vs stock! Is that not the preferred testing method?

Maybe you would like to make one up! LOL


----------



## ZealotKi11er

Quote:


> Originally Posted by *Electroneng*
> 
> First-gen i7 chips are still top notch, but I do prefer the 8320 and 8350 to them.
> No offense to anyone in this thread, but I cannot argue about a CPU and platform I have not owned and tested.


Unless you test them at the same time, side by side, nobody can really argue. I only tested as far as the 1055T.


----------



## pwnzilla61

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Lol, really? 2.66GHz vs 4.0GHz.
> A CPU that can do 4.4GHz vs one that can do 5GHz: one has almost a 2GHz OC, the other 1GHz.


Not very many 920s can do 4.4, let alone even 4.2-4.3. 4.8-5+ is much easier on the FX chip.


----------



## ZealotKi11er

Quote:


> Originally Posted by *pwnzilla61*
> 
> Not very many 920s can do 4.4, let alone even 4.2-4.3. 4.8-5+ is much easier on the FX chip.


A lot could if you bought one at the right time. It was also a matter of cooling.


----------



## 2advanced

Quote:


> Originally Posted by *Electroneng*
> 
> First-gen i7 chips are still top notch, but I do prefer the 8320 and 8350 to them.
> No offense to anyone in this thread, but I cannot argue about a CPU and platform I have not owned and tested.


I completely agree with everything you just said.


----------



## cssorkinman

Quote:


> Originally Posted by *jeffro37*
> 
> I noticed that the i5, when CrossFired, had better min fps in most of the games. Do you look at that or average fps more often to choose what setup you would use? I'm asking because I always just went by average rather than min fps. How easy is it to get the 8350 up to 4.8? My brother is in need of an upgrade, and the offerings from AMD look very good this time.


In my case it was simple to get to 4.8GHz stable (1.48 volts) on the 8350, but bear in mind it needs a great "supporting cast" to get there. Motherboard, cooling and power supply will all be tested by this chip if you push it. It's been my experience that for everyday use it will run over 5GHz without problems; however, running stability-testing programs on all 8 cores over 4.9GHz requires a pretty good bump in voltage and an H100 watercooling setup at minimum. You also need a really stout board to get that kind of overclock out of the 8350; personally, I wouldn't attempt that big an overclock on any board with less than an 8+2 power-phase setup. I'm still working on a maximum benching-stable speed, but I have been able to run SuperPi at 5.256GHz as my best so far (need more time).
Take a look here to see how people are getting along overclocking the Vishera chips: http://www.overclock.net/t/1331219/fx-83xx-data-collection-thread Seems the early batches are doing pretty well.
A lot depends on how experienced you are with overclocking and how much voltage you are willing to send through your chip, too.
At any rate, good luck with whatever you choose.


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Unless you test them at the same time, side by side, nobody can really argue. I only tested as far as the 1055T.


Um, no? You seem to make an awful lot of statements about what FX can and cannot do for someone who has never even owned one......


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Um, no? You seem to make an awful lot of statements about what FX can and cannot do for someone who has never even owned one......


Have you ever owned a 3570K?


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Have you ever owned a 3570K?


Yup, sure have.

Killed it trying to delid it, because the temps were so horrible I couldn't get past 4.5GHz with decent cooling.

http://www.overclock.net/t/1309867/short-and-direct-delid-guide-ivy-bridge/130#post_18298593


----------



## 2advanced

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Yup, sure have.
> Killed it trying to delid it, because the temps were so horrible I couldn't get past 4.5GHz with decent cooling.
> http://www.overclock.net/t/1309867/short-and-direct-delid-guide-ivy-bridge/130#post_18298593


That sucks! Do you live in the Mojave?


----------



## anubis44

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Single thread? Yeah, Intel is king, no question. Heck, I just came from an Intel rig, and I plan on going back depending on how Haswell turns out. However, I am kind of annoyed that almost every Intel guy says an FX CPU will bottleneck higher-end GPUs, specifically in CrossFire, without ever showing a shred of proof.
> These two guys have shown such proof and are still in the process of doing so, yet the proof seems to not be good enough.


Well, it's a funny thing about proof: it doesn't really work on religious people.









It's too bad, really, that actual straight-up, naked evidence from two guys who own similar rigs and games (identical except for the mobo/CPU, clocked similarly) isn't enough evidence for some people. I guess they just have too much invested in their beliefs. They really just WANT to believe that Intel is better, and they're determined to do so no matter what the evidence shows.


----------



## anubis44

Quote:


> Originally Posted by *Ashtyr*
> 
> Another game that uses at least six cores is Far Cry 3.
> 
> Damn, I need another 670 for this game; I accept donations.
> I think that as games start using six cores, my Phenom X6 could have a second youth.


Yes, it will. Actually, it's funny how well AMD CPUs from the K8 (Athlon 64) onward have aged compared to their Intel counterparts. We have an old dual-socket, dual-core AMD Opteron HP workstation from 2005 that we use as a ghost server, among other things, at work, and we decided we wanted to put a 64-bit OS on it. No problem: the CPU supports Windows 7 x64, we just tossed it on, and it runs great. The Pentium 4s from around that time period, however, are only 32-bit and pretty much junk at this point. Even the Bulldozer chips, which clearly had some issues, are starting to look better in a number of benchmarks that use as many cores/threads as you can throw at them. I honestly think Piledriver is the turning point in this latest down cycle for AMD, and with Jim Keller not only back at AMD but actually heading the CPU division(!), the next AMD CPUs will positively surprise many people. Apparently Mr. Keller started making changes to the next-generation CPU designs on his very first day back at work. That's a VERY good sign.


----------



## Hokies83

Intel is better....

This does show, however, that in a lot of games using multiple threads, Vishera is able to run head to head, clock for clock, with a 3570K, which is a great result for AMD imo.

Games using 4 cores or fewer would give a much different result...

But it does show that to beat AMD down the line, clock for clock, Intel users will have to have i7s: the 2600K, 2700K and 3770K.


----------



## ZealotKi11er

Quote:


> Originally Posted by *anubis44*
> 
> Yes, it will. Actually, it's funny how well AMD CPUs from the K8 (Athlon 64) onward have aged compared to their Intel counterparts. We have an old dual-socket, dual-core AMD Opteron HP workstation from 2005 that we use as a ghost server, among other things, at work, and we decided we wanted to put a 64-bit OS on it. No problem: the CPU supports Windows 7 x64, we just tossed it on, and it runs great. The Pentium 4s from around that time period, however, are only 32-bit and pretty much junk at this point. Even the Bulldozer chips, which clearly had some issues, are starting to look better in a number of benchmarks that use as many cores/threads as you can throw at them. I honestly think Piledriver is the turning point in this latest down cycle for AMD, and with Jim Keller not only back at AMD but actually heading the CPU division(!), the next AMD CPUs will positively surprise many people. Apparently Mr. Keller started making changes to the next-generation CPU designs on his very first day back at work. That's a VERY good sign.


You are comparing AMD's golden age to Intel's P4 era. The Opteron 165 was an expensive CPU. I can say the same thing about the Q6600, which even now is an amazing CPU. What did AMD have back then? Phenom I. At least the second-gen FX is not a step back like the first FX was.


----------



## anubis44

Quote:


> Originally Posted by *Hokies83*
> 
> All but one of the games I seem to play use more than 4 threads.
> And this has taught me something new.
> 1. Moar coars does have a place in the near future of gaming.
> 2. I was thinking about getting an i5 Haswell; well, no longer after this. Sticking with the extra threads seems to be the way to go to tackle everything the future will throw at us.


My goodness, Hokies. I daresay you're starting to sound like a reasonable person!

Seriously, it's nice to see somebody in here mention it when they feel they've learned something or changed their mind about something. It's not any kind of weakness. On the contrary, it's a sign of strength when you can say you feel you have learned something. In that sense, you are a strong person. There are a few others on this forum who could learn from your example.


----------



## anubis44

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You are comparing AMD's golden age to Intel's P4 era. The Opteron 165 was an expensive CPU. I can say the same thing about the Q6600, which even now is an amazing CPU. What did AMD have back then? Phenom I. At least the second-gen FX is not a step back like the first FX was.


Well, my old Athlon 64 3200+ wasn't an expensive CPU. As I recall, it was about $245 at around the same time this Opteron was selling, and it also runs 64-bit operating systems.


----------



## 2advanced

Quote:


> Originally Posted by *Hokies83*
> 
> Intel is better....


Quote:


> Originally Posted by *anubis44*
> 
> My goodness, Hokies. I daresay you're starting to sound like a reasonable person!
> Seriously, it's nice to see somebody in here mention it when they feel they've learned something or changed their mind about something. It's not any kind of weakness. On the contrary, it's a sign of strength when you can say that you feel you have learned something. In that sense, you are a strong person. There are a few others on this forum who could learn something from your example.


He's ALMOST there!


----------



## almighty15

Run Crysis 1....


----------



## Stoffie

Quote:


> Originally Posted by *jeffro37*
> 
> I noticed that the i5, when CrossFired, had better min fps in most of the games. Do you look at that or average fps more often to choose what setup you would use? I'm asking because I always just went by average rather than min fps. How easy is it to get the 8350 up to 4.8? My brother is in need of an upgrade, and the offerings from AMD look very good this time.


Yes, as long as you have an H100 or better and a UD3 or better, you should hit 4.8 pretty easily. I had an 8320 as well and it hit 4.8 no problem.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah, in some games, if you drop under a certain fps, Vsync will fall from 60 to 30. Your fps looks about right, assuming your CPU is OCed? To be 100% sure, check GPU usage, ignoring that 30 fps drop. If it's at 90% most of the time, you will probably benefit from a second card. You will not get more than 60-70 fps, though.
> *I would say about as hard as getting 4.8GHz with a 3570K. It's also the luck of the draw.*


No, an i5 3570K that hits 4.8 is going to be delidded in most cases; it's fair to say 4.6 is going to be your max with an H100. Again, if you go to the official 8350/8320 owners' club, the majority of guys under water are hitting 4.8, some as high as 5.2.

Quote:


> Originally Posted by *Hokies83*
> 
> Intel is better....
> This does show, however, that in a lot of games using multiple threads, Vishera is able to run head to head, clock for clock, with a 3570K, which is a great result for AMD imo.
> Games using 4 cores or fewer would give a much different result...
> But it does show that to beat AMD down the line, clock for clock, Intel users will have to have i7s: the 2600K, 2700K and 3770K.


I wish AMD would come out with a 12-core FX on Steamroller or Excavator with a new socket/chipset. I am resigning myself to the fact that Hitman: Absolution is never gonna hit 120Hz on any current PC.


----------



## Hokies83

Quote:


> Originally Posted by *2advanced*
> 
> He's ALMOST there!


Um, dude... Intel CPUs are better than AMD CPUs; if you think any different, you need a reality check.

I said it competes with Intel's mainstream *i5* in multithreaded games, and from the results of this thread that is true.

But there is still the 2600K, 2700K, 3820, 3770K, 3930K, 3960X and 3970X.


----------



## Stefy

Quote:


> Originally Posted by *Hokies83*
> 
> Um, dude... Intel CPUs are better than AMD CPUs; if you think any different, you need a reality check.


Yep.


----------



## Stoffie

Quote:


> Originally Posted by *newblood*
> 
> Can you do Crysis 2 please?


Does it have a built-in bench?


----------



## Hokies83

Quote:


> Originally Posted by *Stoffie*
> 
> Does it have a built-in bench?


I can do that one too.

But Nvidia vs AMD would not give fair results, unless you wanted to bench only the CPU at 800x600...

It's gonna be a while; gotta reinstall it.

Been playing Clash of Clans on the iPad XD


----------



## Stoffie

Quote:


> Originally Posted by *Hokies83*
> 
> I can do that one too.
> But Nvidia vs AMD would not give fair results, unless you wanted to bench only the CPU at 800x600...
> It's gonna be a while; gotta reinstall it.
> Been playing Clash of Clans on the iPad XD


Why, is it an Nvidia or AMD game? We can do stuff like that, just not add it to the opening post, as it takes the relevance off the title. I have no problem with either party not performing as well. All I want to do is offer people good, honest advice with the evidence to back it up; too often you see comments on OCN that one company is bad, without any relevant evidence to back up the statement.


----------



## Hokies83

Quote:


> Originally Posted by *Stoffie*
> 
> Why, is it an Nvidia or AMD game? We can do stuff like that, just not add it to the opening post, as it takes the relevance off the title. I have no problem with either party not performing as well. All I want to do is offer people good, honest advice with the evidence to back it up; too often you see comments on OCN that one company is bad, without any relevant evidence to back up the statement.


Your AMD cards are faster than my Nvidia cards XD lol. I sold my 1400MHz-clocking 680s for two 4GB 680s; they're pretty bad clockers (1215MHz core / 3700 mem), but I needed the VRAM, heh, I kept running out =/

Have you tried Torchlight 2, Batman: Arkham City, The Witcher 2, etc.?

You should also give Darksiders 2 a go. It is a decently demanding game; it uses about 80% of both my 680s to run 120 fps.


----------



## Stoffie

Quote:


> Originally Posted by *Hokies83*
> 
> Your AMD cards are faster than my Nvidia cards XD lol. I sold my 1400MHz-clocking 680s for two 4GB 680s; they're pretty bad clockers (1215MHz core / 3700 mem), but I needed the VRAM, heh, I kept running out =/
> Have you tried Torchlight 2, Batman: Arkham City, The Witcher 2, etc.?
> You should also give Darksiders 2 a go. It is a decently demanding game; it uses about 80% of both my 680s to run 120 fps.


I will have a look at them and see if I can get them cheap anywhere. We will try to get as many games on here as is financially viable...


----------



## Hokies83

Hard forum for-sale section, cough.


----------



## Stoffie

http://www.ebuyer.com/399244-powercolor-hd-7990-6gb-gddr5-dual-dvi-hdmi-dual-mini-displayport-pci-e-ax7990-6gbd5-a2dhj

Should I pull the trigger? See if we bottleneck TriFire?


----------



## Hokies83

Quote:


> Originally Posted by *Stoffie*
> 
> http://www.ebuyer.com/399244-powercolor-hd-7990-6gb-gddr5-dual-dvi-hdmi-dual-mini-displayport-pci-e-ax7990-6gbd5-a2dhj
> Should I pull the trigger? See if we bottleneck tri fire?


No, I heard bad things about that GPU from RS on AnandTech.


----------



## Stoffie

Quote:


> Originally Posted by *Hokies83*
> 
> No, I heard bad things about that GPU from RS on AnandTech.


Really? I heard bad, bad things about the Vortex II 7970 from people on OCN. It was a great price and unlocked, the company offered a 14-day satisfaction guarantee, and I like backing the underdog (hence why I have an AMD rig). I have to say my Vortex II is better than my Vapor-X; it is cooler and overclocks higher. Both are better than the Lightning 7970 I had (DL-DVI issue), and the worst card I had was a Gigabyte Windforce, because it was locked. If it hadn't been locked, I think it would have been the best, because the fans at 100% are inaudible, so it never went over 30 degrees, and I am sure clocked to 1200/1600 it would have stayed under 55.


----------



## Hokies83

Quote:


> Originally Posted by *Stoffie*
> 
> Really? I heard bad, bad things about the Vortex II 7970 from people on OCN. It was a great price and unlocked, the company offered a 14-day satisfaction guarantee, and I like backing the underdog (hence why I have an AMD rig). I have to say my Vortex II is better than my Vapor-X: it runs cooler and overclocks higher. Both are better than the Lightning 7970 I had (DL-DVI issue), and the worst card I had was a Gigabyte Windforce, because it was locked. If it hadn't been locked I think it would have been the best, because the fans at 100% are inaudible, so it never went over 30 degrees, and I am sure clocked to 1200/1600 it would have stayed under 55.


The Devil 13 overheats and has power issues.

I buy whatever is best; I would have all AMD GPUs this go-around, but I use 3D Vision.

I got one of those XD


----------



## Powermonkey500

I'm still not sure why Nvidia decided to go with only 2GB and a 256-bit memory bus...


----------



## Catscratch

Quote:


> Originally Posted by *Ashtyr*
> 
> I'm at the beginning, but for now it moves between 50 and 60 fps average.
> I'm playing Ultra with HBAO and 2xMSAA.
> An issue I am having is that sometimes the game locks to 30 fps no matter what; for example, I'll be at 60 fps, looking around without moving, and it suddenly drops to 30 fps and stays there. I think it could be the Vsync.
> Edit: when frames drop to 30, GPU usage drops to 40-50% while CPU usage stays about the same, around 50%.
> I think it uses six cores; you can see the fps, the individual core usage, the GPU usage, GPU memory, and the CPU temperature.
> Some pics


I always catch this misconception: just because you see some percentage on every core doesn't mean the game can use 6 cores.







The middle picture shows the highest usage, and it's only the equivalent of 3 fully loaded cores (49% total usage).









On a 6-core CPU:
~20% total = 1 core,
~35% total = 2 cores,
~50% total = 3 cores,
~65% total = 4 cores,
~75% total = 5 cores,
~100% total = 6 cores
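That rule of thumb can be sketched in a few lines of Python. This is only an illustration of the table, not anything from the thread's tools; rounding to the nearest whole core is my own assumption, chosen so the output matches the figures above.

```python
# Minimal sketch of the rule of thumb above: convert a total CPU-usage
# percentage into "how many fully loaded cores is that?" on an n-core
# CPU. Rounding to the nearest whole core is an assumption chosen to
# match the table (e.g. ~49% total on 6 cores -> 3 cores).
def cores_in_use(total_usage_pct, n_cores=6):
    estimate = total_usage_pct * n_cores / 100  # core-equivalents of work
    return min(n_cores, int(estimate + 0.5))    # round half up, cap at n

print(cores_in_use(49))   # 3 -- the "middle picture" case above
print(cores_in_use(75))   # 5
```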


----------



## totallynotshooped

Sweet, FX can actually compete with Intel in some games. Good job AMD.


----------



## SmokinWaffle

Thread cleaned. Keep it civil.


----------



## JunkoXan

What about GPU overclock results, keeping the same GPU OC in single-card and Crossfire runs on both platforms?

See how much impact there is between the two platforms.


----------



## Hokies83

Crysis 2 maxed out with 4K textures.

It uses one thread heavily, with minimal use on three other threads and little to none on the rest.
The spike you see happened when I Alt+Tabbed out.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Hokies83*
> 
> Crysis 2 maxed out with 4K textures.
> It uses one thread heavily, with minimal use on three other threads and little to none on the rest.
> The spike you see happened when I Alt+Tabbed out.


So it's like 2-3 cores' worth of usage spread across 8 cores. The problem is that one core is going to cause the bottleneck the closer it gets to 90%.


----------



## Dimaggio1103

Here is the usage on my machine; it seems to use more than 2-3 for me. Sorry for the yellow tinge, I don't know why it keeps doing this...


----------



## itomic

And GPU usage is north of 97%, I presume?


----------



## ZealotKi11er

I was getting 50-55% CPU usage and GPU usage was 80-99%.


----------



## Hokies83

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Here is the usage on my machine, seems to use more than 2-3 for me. Sorry for the yellow tinge dont know why it keeps doing this....


Hmm, dunno.

It was only really using one thread for me. But we are also comparing a 6-thread Vishera at 4.1GHz to an 8-thread Ivy at 5.2GHz.

BF3: hmm, seems a 5GHz+ 3770K does not bottleneck BF3 multiplayer? At least with two overclocked 680s it does not.


----------



## almighty15

Quote:


> Originally Posted by *Hokies83*
> 
> BF3: hmm, seems a 5GHz+ 3770K does not bottleneck BF3 multiplayer? At least with two overclocked 680s it does not.


I swear people on this forum do not know what CPU-bottlenecked means. I highly doubt that in a 64-player server your CPU can keep both of your GPUs constantly pegged at 99% without any drops.

If GPU usage drops below 99%, you're CPU limited.
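That rule of thumb can be written down as code. This is a hypothetical sketch only, not a rigorous test (later posts rightly point out that menus, Vsync, and engine quirks also cause dips); the sample log format and the 5% dip tolerance are my own assumptions.

```python
# Hypothetical sketch of the "below 99% = CPU limited" rule of thumb.
# gpu_samples is an assumed log format: one GPU-usage percentage per poll.
def looks_cpu_limited(gpu_samples, pegged=99, dip_tolerance=0.05):
    """True if the GPU spends more than dip_tolerance of its time
    below the pegged threshold -- the thread's heuristic only."""
    if not gpu_samples:
        return False
    dips = sum(1 for u in gpu_samples if u < pegged)
    return dips / len(gpu_samples) > dip_tolerance

# 70-85% usage, like the CF 7970 BF3-multiplayer reports in this thread:
print(looks_cpu_limited([85, 80, 75, 99, 82, 70, 78]))  # True
```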


----------



## Hokies83

Quote:


> Originally Posted by *almighty15*
> 
> I swear people on this forum do not know what CPU-bottlenecked means. I highly doubt that in a 64-player server your CPU can keep both of your GPUs constantly pegged at 99% without any drops.
> If GPU usage drops below 99%, you're CPU limited.


All games but BL2 are pegged at 99%, always.

Unless I limit fps to 120.


----------



## paulerxx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Lol, really? 2.66GHz vs 4.0GHz.
> A CPU that can do 4.4GHz vs one that can do 5GHz. One has almost a 2GHz OC, the other 1GHz.


Yeah, let's compare lucky draws vs actual stock performance... You're right in that sense, though: when overclocked, the i7 is likely better.


----------



## Hokies83

Assassin's Creed 3

This is the toughest one yet, wow @_@


----------



## Stoffie

Quote:


> Originally Posted by *Hokies83*
> 
> Assassin's Creed 3
> This is the toughest one yet, wow @_@


I wonder if this is related to the game actually using the threads or just Windows spreading the load across cores. I just ran a single-thread bench in Cinebench and ended up with activity on all the cores; if I add all the core usage up it comes to 102% divided between the 8 cores, an average of 12.75% per core. I assume the extra 2% is Windows etc.
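Stoffie's sanity check (summing per-core usage so the scheduler bouncing work between cores doesn't mislead you) can be sketched like this; the per-core numbers are the ones from the post, used purely for illustration.

```python
# Sum per-core usage to estimate how many threads' worth of work is
# really running, regardless of which cores Windows bounces it across.
def effective_threads(per_core_usage_pct):
    return sum(per_core_usage_pct) / 100.0

# The single-thread Cinebench run described above: 8 cores at ~12.75%.
cores = [12.75] * 8
print(effective_threads(cores))  # 1.02 -> really just ~1 busy thread
```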


----------



## Dimaggio1103

Quote:


> Originally Posted by *Hokies83*
> 
> All games but BL2 are pegged at 99%, always.
> Unless I limit fps to 120.


When you play Crysis 2, are you pegged at 99% as well?


----------



## Catscratch

Let's also not forget: just because your GPU drops below 99% doesn't mean you are CPU bottlenecked.

Why do you assume these games are perfectly coded? Is there any proof that these graphics engines are 100% efficient? There are also memory and Windows quirks, let alone graphics-driver quirks. So you shouldn't assume a CPU bottleneck just because the GPU drops below 99% on some occasions.


----------



## almighty15

Quote:


> Originally Posted by *Catscratch*
> 
> *Let's also not forget: just because your GPU drops below 99% doesn't mean you are CPU bottlenecked.*
> Why do you assume these games are perfectly coded? Is there any proof that these graphics engines are 100% efficient? There are also memory and Windows quirks, let alone graphics-driver quirks. So you shouldn't assume a CPU bottleneck just because the GPU drops below 99% on some occasions.


99% of the time it does, because of how the API handles CPU-GPU communication.


----------



## Hokies83

Assassin's Creed 3, Boston. See why people with quad cores are having trouble in Boston.


----------



## almighty15

Really wish you guys would post MSI Afterburner GPU usage charts to go with your CPU charts.


----------



## Hokies83

Quote:


> Originally Posted by *almighty15*
> 
> Really wish you guys would post MSI After Burner GPU usage charts to go with your CPU charts


I'm doing this because at 5.2GHz I should have just about the 8 fastest threads on OCN.

And I have Kepler cards, so I do not use Afterburner.


----------



## almighty15

Quote:


> Originally Posted by *Hokies83*
> 
> I'm doing this because at 5.2GHz I should have just about the 8 fastest threads on OCN.
> And I have Kepler cards, so I do not use Afterburner.


What CPU are you running?


----------



## Dimaggio1103

Quote:


> Originally Posted by *Hokies83*
> 
> I'm doing this because at 5.2GHz I should have just about the 8 fastest threads on OCN.
> And I have Kepler cards, so I do not use Afterburner.


You could use EVGA Precision and have it on the on-screen display, then take a screenshot.


----------



## Hokies83

Quote:


> Originally Posted by *almighty15*
> 
> What CPU you running?


3770k


----------



## almighty15

Quote:


> Originally Posted by *Hokies83*
> 
> 3770k


Nice....


----------



## ZealotKi11er

There is no way a 3770K gets 99% GPU utilization in BF3 MP. You've got to try every map before you come to that conclusion; I hit as low as 50% on some maps. Otherwise that would be saying to me that HT is more powerful than real cores and the games use 8 cores effectively.


----------



## Hokies83

I have found no place in BF3 where my two 680s are not pegged at 99% with Vsync off.


----------



## Malo

This is my GPU usage with my two 6870s OC'd to 940/1150 on my [email protected] GHz... I would not count this as a bottleneck...


----------



## almighty15

Quote:


> Originally Posted by *Malo*
> 
> 
> this is my GPU usage with my 2 6870's OC'd to 940/1150 on my [email protected] Ghz.... I would not count this as a bottle neck...


Well, you're not flatlined at 99% on both GPUs, so you are bottlenecked, and that's only with two 6870s!

Toss a couple of 7950s in and it'll become an even bigger bottleneck.

And the edit button is the little pencil in the bottom left corner of your post.


----------



## Malo

Quote:


> Originally Posted by *almighty15*
> 
> Well, you're not flatlined at 99% on both GPUs, so you are bottlenecked, and that's only with two 6870s!
> Toss a couple of 7950s in and it'll become an even bigger bottleneck.
> And the edit button is the little pencil in the bottom left corner of your post.


The dips are when I died and it went back to the menu to select a class... I'll post another screenshot when I get home with my single 7970 (OC'd to max in CCC).


----------



## almighty15

Quote:


> Originally Posted by *Malo*
> 
> The dips are when I died and it went back to the menu to select a class... I'll post another screenshot when I get home with my single 7970 (OC'd to max in CCC).


Died or not, dude, it should still be at 99%.


----------



## Malo

You're telling me menus will utilize 99% of your GPUs? With nothing graphical going on, just text and pictures?


----------



## Hokies83

Yeah, I'm not a big fan of BF3...

I think I've played multiplayer 5-10 times; the best I ever did was go like 2 and 15 @_@. My eyes are too bad to compete in games like those anymore.


----------



## jprovido

I've experienced almost zero difference in gaming between my systems except for StarCraft II and PlanetSide 2 (the FX did pretty decently there too, just not as well as an i5 3570K). They are pretty decent CPUs and a huge upgrade from my i3 2120. I'm not really planning on going dual-GPU with the Vishera rig, so I'm contented with it.

Edit:

My H100i died, so I'm gaming at stock settings. Bias aside, I think a stock i5 3570K is still better in gaming than an FX-8320 at 4.7GHz; I still had better performance in StarCraft II and PlanetSide 2 with the i5 at stock. It doesn't matter in other games because their GPUs are just fully loaded with zero bottlenecks.


----------



## 2advanced

Quote:


> Originally Posted by *Hokies83*
> 
> I have found no place in BF3 where my two 680s are not pegged at 99% with Vsync off.


Quote:


> Originally Posted by *almighty15*
> 
> Died or not, dude, it should still be at 99%.


Can either of you guys post images of Afterburner showing said flat line during a large-map, 64-player MP round? Even my old 2500K would spike down while on the spawn page.


----------



## Powermonkey500

My 7970s hover around 70-85% in BF3 multiplayer with a decent amount of players, CPU pegged at 95%.


----------



## 2advanced

Quote:


> Originally Posted by *Powermonkey500*
> 
> My 7970s hover around 70-85% in BF3 multiplayer with a decent amount of players, CPU pegged at 95%.


Holy cow! What do your temps look like under that particular BF3 scenario?


----------



## Powermonkey500

Max 50C on the 7970s; I didn't look at CPU temps, though.
I could get better temps, but I don't have enough radiator in my water-cooling loop. It's focused more on silence than extreme cooling, and it can hold my clocks well within good temperatures, so I'm happy with it. If I had to guess, the CPU is probably 75C max, maybe 70C.


----------



## 2advanced

Good stuff! I honestly didn't think that at those speeds there'd still be a slight bottleneck on CF 7970s. That's a hell of a system you've got there. Thanks again for working with Stoffie to put all this together.


----------



## almighty15

Quote:


> Originally Posted by *Malo*
> 
> your telling me menu's will utilize 99% of your gpu's? with nothing graphical going on? just text and pictures?


If your CPU can feed them enough frames to render, then yes...


----------



## Stoffie

Quote:


> Originally Posted by *almighty15*
> 
> If your CPU can feed them enough frames to render, then yes...


Are you sure? Because when I die in BF3 my processor usage drops to about 20%. If it can drive my GPUs at 100% on average in single player and 70% in 64-player multiplayer, I don't think it will bottleneck on a menu.


----------



## almighty15

Quote:


> Originally Posted by *Stoffie*
> 
> Are your sure? Cause when i die in bf3 my processor usage drops to about 20-%? If it can average drive my gpus at 100% in single player and 70% in multiplayer 64 i dont think it will bottle neck a menu.


I'm sure. Load up Crysis, Counter-Strike, or STALKER as a few examples: in the menus, GPU usage goes to 99% and the frame rate goes into the thousands.


----------



## Powermonkey500

I've heard that 3770ks are higher binned. I've been toying with the idea of getting one in hopes that it will clock higher.
Anyway, mine isn't terribly above average. 4.7GHz is usually reachable if you have proper cooling. Water cooling works wonders. Unfortunately, to get "proper cooling" a delid is necessary. (Boo hiss Intel)


----------



## jprovido

Quote:


> Originally Posted by *Powermonkey500*
> 
> I've heard that 3770ks are higher binned. I've been toying with the idea of getting one in hopes that it will clock higher.
> Anyway, mine isn't terribly above average. 4.7GHz is usually reachable if you have proper cooling. Water cooling works wonders. Unfortunately, to get "proper cooling" a delid is necessary. (Boo hiss Intel)


My temps never go above 70 degrees at 4.9GHz, but I can't get 5GHz stable no matter how much voltage I try. I can boot into 5.2GHz but it's not even benchable.


----------



## Powermonkey500

With high end CPUs, you get what you pay for. Unless you're paying for extreme editions. Then you're just an idiot or have too much money that you should give to me instead.


----------



## itomic

The FX 8350 isn't the bottleneck, the software is! I don't see CPU usage at 99%, so we can't call it a bottleneck; it's just that games are not coded well for multithreading. A CPU bottleneck is when the CPU is 99% utilised and the GPU isn't, and even in that kind of situation other factors can interfere and cause the GPU not to perform at its best. We see the FX 8350 utilised at about 60%, so how can it be the bottleneck?


----------



## Powermonkey500

Quote:


> Originally Posted by *jprovido*
> 
> my temps never goes above 70 degrees at 4.9ghz but I can't get 5ghz stable no matter how much voltage I try. I can boot into 5.2ghz but it's not even benchable


Are you delidded?


----------



## Powermonkey500

Quote:


> Originally Posted by *itomic*
> 
> The FX 8350 isn't the bottleneck, the software is! I don't see CPU usage at 99%, so we can't call it a bottleneck; it's just that games are not coded well for multithreading. A CPU bottleneck is when the CPU is 99% utilised and the GPU isn't, and even in that kind of situation other factors can interfere and cause the GPU not to perform at its best. We see the FX 8350 utilised at about 60%, so how can it be the bottleneck?


Well, current game coding is what makes it a bottleneck. There's no way around that. Not until game designers go back out and re-code all these games, which just isn't going to happen. Future games will slowly improve and introduce more threads. But in this day and age, you want fewer and beefier cores. 4-6 is the sweet spot.


----------



## jprovido

Quote:


> Originally Posted by *Powermonkey500*
> 
> Are you delidded?


Yup. I was pissed with my 3570K before: temps were so high, but when I touched the heatsink on my NH-D14 at the time, it wasn't even getting warm. Got pissed, delidded the rascal, and voila, I've been very happy since.
Quote:


> The FX 8350 isn't the bottleneck, the software is! I don't see CPU usage at 99%, so we can't call it a bottleneck; it's just that games are not coded well for multithreading. A CPU bottleneck is when the CPU is 99% utilised and the GPU isn't, and even in that kind of situation other factors can interfere and cause the GPU not to perform at its best. We see the FX 8350 utilised at about 60%, so how can it be the bottleneck?


The reality is that it's hard to program games. Only a handful of games even support 4 threads, let alone the 8 threads that AMD CPUs have. 99% of games today are still 2-threaded, and I don't see that changing anytime soon, tbh.


----------



## Powermonkey500

Quote:


> Originally Posted by *jprovido*
> 
> Yup. I was pissed with my 3570K before: temps were so high, but when I touched the heatsink on my NH-D14 at the time, it wasn't even getting warm. Got pissed, delidded the rascal, and voila, I've been very happy since.


Yeah, I was hitting high 90s Celsius at 4.6GHz... When I delidded, I lost almost 30 Celsius. It was a great day.

Well, actually, when I first delidded, I was pissed because I didn't lose much temperature at all. I took it back out and thoroughly cleaned all the black glue off, and then the temperature dropped. It wasn't the thermal paste Intel used; it was the gap between the die and the IHS caused by too much glue.


----------



## jprovido

So it still wasn't fixed with Windows 8? I was going to upgrade my Vishera rig to Windows 8 just to see the difference. I guess I'm going to stick with Windows 7 for now.


----------



## almighty15

Quote:


> Originally Posted by *jprovido*
> 
> So it still wasn't fixed with Windows 8? I was going to upgrade my Vishera rig to Windows 8 just to see the difference. I guess I'm going to stick with Windows 7 for now.


See for yourself

http://www.tomshardware.com/reviews/windows-8-bulldozer-performance,3289.html


----------



## Hokies83

Quote:


> Originally Posted by *Powermonkey500*
> 
> I've heard that 3770ks are higher binned. I've been toying with the idea of getting one in hopes that it will clock higher.
> Anyway, mine isn't terribly above average. 4.7GHz is usually reachable if you have proper cooling. Water cooling works wonders. Unfortunately, to get "proper cooling" a delid is necessary. (Boo hiss Intel)


Yeah, with a 3770K you get the extra threads.


----------



## Powermonkey500

Quote:


> Originally Posted by *Hokies83*
> 
> Yeah, with a 3770K you get the extra threads.


Are those threads even useful for gaming?


----------



## jprovido

Quote:


> Originally Posted by *almighty15*
> 
> See for yourself
> http://www.tomshardware.com/reviews/windows-8-bulldozer-performance,3289.html


Is it so hard to make a proper scheduler work? 8-threaded workloads would yield zero performance increase, as expected, but 2-4 threaded apps should've had an increase. Maybe it's already fixed by the patch on Windows 7.
Quote:


> Originally Posted by *Powermonkey500*
> 
> Are those threads even useful for gaming?


Nope. Even with my i7 950 a few years ago, I disabled HT and settled for higher overclocks, because it doesn't help gaming at all and sometimes becomes a performance hit in some games. With that said, I'm thinking of getting an i7 3770K. I miss my 8 threads (even though I have zero use for them, lol).


----------



## Powermonkey500

Quote:


> Originally Posted by *jprovido*
> 
> Is it so hard to make a proper scheduler work? 8-threaded workloads would yield zero performance increase, as expected, but 2-4 threaded apps should've had an increase. Maybe it's already fixed by the patch on Windows 7.
> Nope. Even with my i7 950 a few years ago, I disabled HT and settled for higher overclocks, because it doesn't help gaming at all and sometimes becomes a performance hit in some games. With that said, I'm thinking of getting an i7 3770K. I miss my 8 threads (even though I have zero use for them, lol).


One thing I'm unclear about with hyperthreading...
Say threads 0 and 1 are running on physical core 0. If the game is not using thread 1, will thread 0 have the same performance as core 0?
What I'm saying is, is there a "load-balancing" effect? If one thread on a core is being used but not the other, will it have the same performance as one physical core with HT off? Does the power from the unused thread transfer over to the used thread, or are they stuck at 50-50 no matter the load on either?

Sorry, it was difficult to word that question properly


----------



## jprovido

Quote:


> Originally Posted by *Powermonkey500*
> 
> One thing I'm unclear about with hyperthreading...
> Say threads 0 and 1 are running on physical core 0. If the game is not using thread 1, will thread 0 have the same performance as core 0?
> What I'm saying is, is there a "load-balancing" effect? If one thread on a core is being used but not the other, will it have the same performance as one physical core with HT off? Does the power from the unused thread transfer over to the used thread, or are they stuck at 50-50 no matter the load on either?
> Sorry, it was difficult to word that question properly


Hyperthreading is rather complex. AFAIK it effectively steps aside in lightly threaded workloads to avoid a performance hit; that's why, even with two-threaded apps like games, you still get the performance two real cores would give, not just the performance of the two virtual cores that hyperthreading presents.


----------



## Powermonkey500

Quote:


> Originally Posted by *jprovido*
> 
> Hyperthreading is rather complex. AFAIK it effectively steps aside in lightly threaded workloads to avoid a performance hit; that's why, even with two-threaded apps like games, you still get the performance two real cores would give, not just the performance of the two virtual cores that hyperthreading presents.


Interesting. Well, I guess that a 3770k could be useful in games that can use 6 threads (can't Far Cry 3 do that?) and for the higher binning. I can't decide if it's worth my money at the moment. Probably not.... maybe I should just wait for Haswell.


----------



## jprovido

Quote:


> Originally Posted by *Powermonkey500*
> 
> Interesting. Well, I guess that a 3770k could be useful in games that can use 6 threads (can't Far Cry 3 do that?) and for the higher binning. I can't decide if it's worth my money at the moment. Probably not.... maybe I should just wait for Haswell.


I think I worded that horribly, but I'm not really far off, imo. Hyperthreading isn't perfect, and in some games it really does give you a performance hit rather than an increase. If it's a dedicated gaming PC, an i5 is the best choice. That's not to say the i5 is a slouch in heavily threaded workloads; it's still a beast.



----------



## frozne

Quote:


> Originally Posted by *Powermonkey500*
> 
> Interesting. Well, I guess that a 3770k could be useful in games that can use 6 threads (can't Far Cry 3 do that?) and for the higher binning. I can't decide if it's worth my money at the moment. Probably not.... maybe I should just wait for Haswell.


It is useful for more than 4 threads, such as BF3 multiplayer. All hyperthreading does is jam more requests down the pipeline for the cores to process. It is around a 30% boost in performance compared to 4 cores if you can utilize the 8 threads. It has been around long enough that there is rarely a case where it hinders you (improper software tuning has mostly been ironed out). In anything that only uses 4 threads, a 3770K will be roughly equal to a 3570K; across 8 threads it will be about 30% better.

It is also one of those things that shows how weak the AMD cores are. The AMD modular design scales much better than hyperthreading: you get about 80% of 8 full cores, where with hyperthreading you only get about 30% over the physical cores. Even with that 50% scaling difference in its favour, it still trails in a lot of ways.
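As a back-of-envelope check, those rough figures can be plugged into a tiny throughput model. The 30% SMT gain and 80% CMT scaling are the post's estimates, not measured values, and per-thread IPC (where Intel leads) is deliberately left out.

```python
# Core-equivalents of throughput with all threads busy, using the
# rough scaling figures from the post above (assumptions, not data).
def smt_throughput(cores=4, smt_gain=0.30):
    return cores * (1 + smt_gain)      # 4 cores + hyperthreading

def cmt_throughput(cores=8, scaling=0.80):
    return cores * scaling             # 8 CMT cores, shared front-end/FPU

print(smt_throughput())  # 5.2 -- 3770K-style quad with HT
print(cmt_throughput())  # 6.4 -- FX-8350-style module design
```

By this crude count the FX has more core-equivalents, which is the post's point: it still trails in many workloads because each of those core-equivalents does less work per clock.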


----------



## Hokies83

Quote:


> Originally Posted by *Powermonkey500*
> 
> Are those threads even useful for gaming?


Think I proved they were: if a game uses the 8350's extra threads, it will also use the 3770K's extra threads.

You've seen your clock-for-clock bench vs the 8350; now see a 3770K in BF3, clock for clock, vs the 8350.



I also have my own tests.


Any game that uses extra threads gets more out of a 3770K.


----------



## Powermonkey500

Might be time to upgrade.


----------



## ZealotKi11er

I still don't see any benchmarks where HT helps in games. A 3570K vs a 3770K is ~33% faster when HT is fully utilized, and in games that's never the case. That's 33% faster from having 100% more threads, so each extra thread gives roughly an 8% increase in performance.
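The per-thread arithmetic works out as stated: going from 4 threads to 8 adds 4 threads, and a 33% total gain spread over them averages about 8% each.

```python
# ZealotKi11er's per-thread arithmetic: a ~33% gain from 4 extra
# SMT threads averages out to roughly 8% per added thread.
total_gain_pct = 33
extra_threads = 8 - 4
print(total_gain_pct / extra_threads)  # 8.25
```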


----------



## rdr09

Quote:


> Originally Posted by *Powermonkey500*
> 
> Yeah, I was hitting high 90s Celsius at 4.6GHz... When I delidded, I lost almost 30 Celsius. It was a great day.
> Well, actually, when I first delidded, I was pissed because I didn't lost much temperature at all. I took it back out and thoroughly cleaned all the black glue off, then the temperature dropped. It wasn't the thermal paste Intel used, it was the gap between the die and the IHS caused by too much glue.


So, your chip is delidded. Hmmmm. If you had not delidded and this test had been done in your 'normal' scenario, would the 8350 look better?

I know not all 8300-series chips hit 4.8 either, but some do 5GHz.

I think the title should be changed, with the word 'delidded' added.


----------



## Powermonkey500

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I still don't see any benchmarks where HT helps in games. A 3570K vs a 3770K is ~33% faster when HT is fully utilized, and in games that's never the case. That's 33% faster from having 100% more threads, so each extra thread gives roughly an 8% increase in performance.


I can't decide if I want to wait for Haswell or get a 3770k. My bottleneck in BF3 is pretty bad.


----------



## Hokies83

Quote:


> Originally Posted by *Powermonkey500*
> 
> I can't decide if I want to wait for Haswell or get a 3770k. My bottleneck in BF3 is pretty bad.


A 3770K OC'd to 4.8GHz will never drop below 120fps.

If you can get it for $229 at MC, you can just resell the 1155 rig when Haswell comes out. That is what I'm going to do.


----------



## Powermonkey500

Quote:


> Originally Posted by *Hokies83*
> 
> A 3770K OC'd to 4.8GHz will never drop below 120fps.


Eyefinity is a different story


----------



## Hokies83

Quote:


> Originally Posted by *Powermonkey500*
> 
> Eyefinity is a different story


Multi-monitor setups are more of a GPU bottleneck than a CPU one.

You could always delid it and run at 5GHz...

Have you tried Lucid MVP yet?


----------



## Powermonkey500

Quote:


> Originally Posted by *Hokies83*
> 
> Multi-monitor setups are more of a GPU bottleneck than a CPU one.
> You could always delid it and run at 5GHz...
> Have you tried Lucid MVP yet?


Doesn't work with Crossfire. I wish it did.
I am delidded already, 4.8GHz is my max with safe voltages.
It's definitely a CPU bottleneck in BF3. 95% CPU all the time, 70-85% GPU usage.


----------



## Hokies83

Quote:


> Originally Posted by *Powermonkey500*
> 
> Doesn't work with Crossfire. I wish it did.
> I am delidded already, 4.8GHz is my max with safe voltages.
> It's definitely a CPU bottleneck in BF3. 95% CPU all the time, 70-85% GPU usage.


I mean delid the 3770K, lol.


----------



## Powermonkey500

Quote:


> Originally Posted by *Hokies83*
> 
> I mean de lid the 3770k lol.


Ah. Yeah I just put a Classified up to try and trade my 3570k.


----------



## almighty15

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I still dont see any benchmarks where HT helps in games. 3570K vs 3770K ~ 33% faster when HT is fully utilized. In games that never the case. Thats 33% faster by having 100% more Threads. So each thread gives ~ 8% increase in performance.


HT actually hurts games that use 4 threads or fewer.


----------



## Hokies83

Quote:


> Originally Posted by *almighty15*
> 
> HT actually hurts games that use 4 threads or less..


I'll take 200fps instead of 202fps in trade for 30-40% faster multithreaded game performance.

And, on average, a higher-overclocking chip.


----------



## almighty15

Quote:


> Originally Posted by *Hokies83*
> 
> I'll take 200fps instead of 202fps in trade for 30-40% faster multithreaded game performance.
> And, on average, a higher-overclocking chip.


It's actually more than that. I know that HT on the first-generation Core i7s caused a 16% hit on single-threaded performance, but I have no idea what the performance hit is today.


----------



## Hokies83

Quote:


> Originally Posted by *almighty15*
> 
> It's actually more than that. I know that HT on the first-generation Core i7s caused a 16% hit on single-threaded performance, but I have no idea what the performance hit is today.


That was first gen; a lot has changed.

Now it is hardly anything, to nothing.


----------



## almighty15

Quote:


> Originally Posted by *Hokies83*
> 
> That was first gen; a lot has changed.
> Now it is hardly anything, to nothing.


You got charts for it? I can't imagine Intel could get HT's overhead down to next to nothing, even on newer CPUs.


----------



## Hokies83

Quote:


> Originally Posted by *almighty15*
> 
> You got charts for it? I can't imagine Intel could gain HT for next to nothing, even on newer CPU's..


It is pretty common knowledge among Intel people.

Use this thread's results with some of these:

http://vr-zone.com/articles/amd-fx-8350-vs-intel-core-i7-3770k--4.8ghz--multi-gpu-gaming-performance/17494.html

Or just Google it...

I do not keep any charts of the 3770K vs 3570K, lol.


----------



## almighty15

Quote:


> Originally Posted by *Hokies83*
> 
> It is pretty common knowledge among Intel ppl.
> Use this threads results with some of these.
> http://vr-zone.com/articles/amd-fx-8350-vs-intel-core-i7-3770k--4.8ghz--multi-gpu-gaming-performance/17494.html
> Or just google it..
> I do not keep any charts of the 3770k Vs 3570k lol.


That shows absolutely nothing..... It's Intel vs Intel and not HT on vs HT off


----------



## Hokies83

Quote:


> Originally Posted by *almighty15*
> 
> That shows absolutely nothing..... It's Intel vs Intel and not HT on vs HT off


I'm not Google-searching for graphs that you can find yourself, lol.
There is no argument anymore among Intel users that HT hurts performance.

It is pretty common knowledge among Intel users that HT hurts very little, if at all; this is the 3rd-generation i7, not the first generation.


----------



## Clairvoyant129

Quote:


> Originally Posted by *almighty15*
> 
> That shows absolutely nothing..... It's Intel vs Intel and not HT on vs HT off


I can do tests on my rig, and I can assure you you're 100% wrong.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Powermonkey500*
> 
> Ah. Yeah I just put a Classified up to try and trade my 3570k.


I would just wait for the 4770K. No point in wasting money now on a 3770K.


----------



## Powermonkey500

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I would just wait for the 4770K. No point in wasting money now on a 3770K.


You're probably right.


----------



## Blameless

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So each thread gives ~ 8% increase in performance.


An oversimplification.
Quote:


> Originally Posted by *almighty15*
> 
> HT actually hurts games that use 4 threads or less..


Rarely, and even more rarely to any significant degree.
Quote:


> Originally Posted by *almighty15*
> 
> It's actually more then that, I know that HT on the first generation core i7's caused a 16% hit on single threaded performance but I have no idea what the performance hit is today.


I don't have any games that perform measurably different on first gen i7s regardless of HT being on or off.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> I would just wait for the 4770K. No point in wasting money now on a 3770K.


Unless you need a CPU now.


----------



## Hokies83

Quote:


> Originally Posted by *Powermonkey500*
> 
> You're probably right.


If you've got a Microcenter nearby, or somebody trades you.. it is worth it..


----------



## Powermonkey500

Quote:


> Originally Posted by *Hokies83*
> 
> If you've got a Microcenter nearby, or somebody trades you.. it is worth it..


If someone traded, they would likely want some cash on the side.


----------



## Hokies83

Quote:


> Originally Posted by *Powermonkey500*
> 
> If someone traded, they would likely want some cash on the side.


You'd be shocked sometimes...


----------



## Faster_is_better

They are quite close with a single GPU, with Intel pulling a bit ahead in multi-GPU. I would say AMD has a solid contender, though.

Would be nice if the results could be put into single images, both AMD and Intel in one image for easier comparison. Also, the BF3 graphs don't have exact FPS numbers; they just look similar.

Nice work to both of you though


----------



## Trev0r269

Cool benches. Nice to see that AMD did improve with PD over BD. I was really kind of bummed about the IPC and gaming performance of the BD series. The power consumption is a "pro" for the Intel stuff, right? I ask because on paper @ stock it looks like it is, but I've heard conflicting reports about the voltage that gets eaten when OC'ing an IB K-series.

Regardless, I have a 2500k that runs @ 4.6 and an 8120 (undervolted, 4.2 when OC'd and overvolted) that serves as a home server. The 2500k handles 2x 6970s better IME, so I'd have to sell off: A. the 2500k + its mobo (I have mATX and ATX), B. the 8120, and C. an Antec case. While I'm at it, I'd probably do a side/downgrade from the 6970s to a kick-ass 7970 or something. PD could be worth it for me assuming a nice OC can be obtained; I do do video rendering and GIS/remote-sensing work at home sometimes. That wasn't shameless advertising, but it fits in my story about these 2 newer GPUs. /rant

More on topic, what I'm saying is that most of the upgrade options I have, considering the equipment I own, seem to be sidegrades, slight upgrades, and maybe downgrades. Correct me if I am wrong. I'm asking legit questions, and I trust OCN members more than most websites with questionable methods.


----------



## Malo

Just did this quickly b4 work: no bottleneck at all on my OC'd 7970, and this isn't even with Vishera... it's my 8150

*edit*
Just realized my GTS 450 (PhysX) is on the main screen; my 7970 is fully OC'd to max in CCC


----------



## ZealotKi11er

Quote:


> Originally Posted by *Malo*
> 
> 
> Just did this quickly b4 work: no bottleneck at all on my OC'd 7970, and this isn't even with Vishera... it's my 8150
> *edit*
> Just realized my GTS 450 (PhysX) is on the main screen; my 7970 is fully OC'd to max in CCC


B4? 1 GPU is easy to push for most CPUs. Also, in most games just add 4x MSAA and any CPU bottleneck will be gone. The problem is when you have 2 cards and you don't want to use MSAA, like in Far Cry 3 for example.


----------



## Malo

Why wouldn't you want to use MSAA?


----------



## jprovido

Quote:


> Originally Posted by *Malo*
> 
> Why wouldn't you want to use MSAA?


Eyefinity/multi-monitor. MSAA will kill his FPS.


----------



## sugarhell

Quote:


> Originally Posted by *Malo*
> 
> Why wouldn't you want to use MSAA?


At such a big res you don't even need to apply AA.


----------



## almighty15

I'm still waiting for ARMA 2, Crysis 1 and other games to be tested....

Why is it that the only games AMD fans benchmark are multi-threaded games? And on top of that, it's always the same games....


----------



## Stoffie

Uh.... I benchmark games that 1. I have and 2. I play. How is either Powermonkey or I a fan either way? Both of us have said, based on the results provided, that the Intel chip is better; granted, the Intel chip needed to be delidded and so is probably no longer under warranty, but I have never had a chip fail, so I am not bothered about that.

I have Far Cry 3 plus the following:



If you want any of them I am happy to do it. I have ordered Crysis 2 off Amazon because it was relatively cheap. If you are looking for us to use biased settings towards either team then you are on the wrong thread; we are interested in giving fair information on games that we play. It is 2012, and personally I haven't played any game this year that isn't a current title.


----------



## almighty15

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> Then explain why my 3820, which is an 8-thread monster, combined with a 7870 did not "eat the game right up". Switching from a Phenom II X6 to a Core i5 or Core i7 isn't going to magically make GTA IV run bundles better. FYI - your Phenom II X6 has 6 threads
> 
> And no, not "everything" is a port. All of the games he tested are not ports. Far Cry 3 isn't a port. You don't see credible benchmarking websites using large console ports to benchmark a GPU/CPU. Why? Because most ports don't make for a very enjoyable game or utilize the PC components properly. Name 5 relevant ports that everyone plays without issue and I'll believe you. Don't say "All the Assassin's Creed games"


Skyrim is a port, the Call of Duty games are ports, BioShock's a port, Alan Wake is a port... Silent Hill is a port... There are loads of straight ports that run just fine......

Intel systems have no problems at all running GTA4.... AMD, on the other hand, have gone backwards with the FX series...... High IPC and decent multi-threaded performance are what GTA4 needs to shine; AMD have the multi-threaded performance but don't have the IPC.



Just look at how much faster the lower-clocked i7 is compared to the Core 2 Quad. If you want to play GTA4 then I strongly suggest you ditch your FX CPU.


----------



## Stoffie

Quote:


> Originally Posted by *almighty15*
> 
> AMD fan boys cherry pick much much worse.... every argument out come the BF3 and Crysis 2 benchmarks and all other benchmarks are biased towards Intel


Our testing showed that the AMD won in BF3 in both single GPU and Crossfire; how is that biased towards Intel? Funny thing is, if I turn my processor down to 6 cores it matches the Intel in F1. Sleeping Dogs is obviously a game where the Intel chews up the AMD; it won by 9fps, but the AMD was still more than playable and its frame rates doubled by running in Crossfire. That to me is still acceptable performance...

I think a lot of people are missing the point of this thread: it is based on what each CPU can do with Crossfire as opposed to a single GPU. I can't cherry-pick Crysis 1 for you because I doubt Crossfire will make any difference to the game. Like F1 2012, the only relevance of that test was to show you that having Crossfire will actually slow it down...


----------



## M3T4LM4N222

Quote:


> Originally Posted by *almighty15*
> 
> Skyrim is a port, the Call of Duty games are ports, BioShock's a port, Alan Wake is a port... Silent Hill is a port... There are loads of straight ports that run just fine......
> Intel systems have no problems at all running GTA4.... AMD, on the other hand, have gone backwards with the FX series...... High IPC and decent multi-threaded performance are what GTA4 needs to shine; AMD have the multi-threaded performance but don't have the IPC.
> 
> Just look at how much faster the lower-clocked i7 is compared to the Core 2 Quad. If you want to play GTA4 then I strongly suggest you ditch your FX CPU.


That benchmark is so far from relevant it's not even funny. It's seriously testing the Core i7 920 (released in 2008), the ORIGINAL AM2+ Phenom CPUs (2007) and the Core 2 Quad (2007). How does that reflect FX-8350 vs Core i7 3770k/Core i5 3570k performance AT ALL? Just look how much faster the 920 is than the Core 2 Quad! LOL!!!!!! Of course it's much faster - it was a much newer, much more complex architecture and literally the best you could get back then.

Talk about cherry picking and biases....


----------



## almighty15

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> That benchmark is so far from relevant it's not even funny. It's seriously testing the Core i7 920 (Released in 2008) the ORIGINAL AM2+ Phenom CPU's (2007) and the Core 2 Quad (2007). How does that reflect the FX-8350 vs Core i7 3770k/Core i5 3570k performance AT ALL? Just look how much faster the 920 is than the Core 2 Quad! LOL!!!!!! Of course it's much faster - it was a much newer architecture and literally the best you could get back then.


The Core 2 Quad still has higher IPC than Bulldozer and all the Phenom II CPUs.....

The performance gap is still going to be the same on newer CPUs.... maybe even more so with Ivy and Sandy, as they overclock much higher than the 920 does...

You seriously need to do some research......


----------



## M3T4LM4N222

Quote:


> Originally Posted by *almighty15*
> 
> The Core 2 Quad still has higher IPC than Bulldozer and all the Phenom II CPUs.....
> The performance gap is still going to be the same on newer CPUs.... maybe even more so with Ivy and Sandy, as they overclock much higher than the 920 does...
> You seriously need to do some research......


My point is - suggest a relevant PC game to test. A REAL, dignified PC game. Not a game that's obviously going to run MUCH better on an Intel platform, like you've stated. I mean seriously... talk about cherry picking. You have a serious bias for Intel. You should really work that out of your system because it won't get you far in this economy. Want to see some relevant benchmarks? Simply go to PAGE 1 of this thread and scroll down. Then go to the news section and look at the Far Cry 3 benchmarks posted. Those are relevant. Irrelevant benchmarks for an irrelevant console port are an irrelevant point.

I'm going to sleep. It's 4AM here. I'd be more than happy to continue this discussion tomorrow but I need some sleep.


----------



## Stoffie

Yeah, I don't get why you have to be biased towards one side. I like AMD and I will always have an AMD rig, BUT because of these results, and for my own interest, I am going to get a CaseLabs Magnum TH10 and 2 8990s (if they surface) along with a Haswell or Broadwell, just to see what the other guys are offering. I'll also upgrade to Steamroller when it comes out and do this test again, though it will take some integrity away not having an honest associate like Powermonkey.


----------



## almighty15

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> Quote:
> 
> 
> 
> Originally Posted by *almighty15*
> 
> The Core 2 Quad still has higher IPC than Bulldozer and all the Phenom II CPUs.....
> The performance gap is still going to be the same on newer CPUs.... maybe even more so with Ivy and Sandy, as they overclock much higher than the 920 does...
> You seriously need to do some research......
> 
> 
> 
> My point is - suggest a relevant PC game to test. A REAL, dignified PC game. Not a game that's obviously going to run better on an Intel platform, like you've stated. I mean seriously... talk about cherry picking. You have a serious bias for Intel. You should really work that out of your system because it won't get you far in this economy. Want to see some relevant benchmarks? Simply go to PAGE 1 of this thread and scroll down. Then go to the news section and look at the Far Cry 3 benchmarks posted. Those are relevant. Irrelevant benchmarks for an irrelevant console port are an irrelevant point.
> 
> I'm going to sleep. It's 4AM here. I'd be more than happy to continue this discussion tomorrow but I need some sleep.

Not a game that's obviously going to run better on Intel? They all run better on Intel....









Sent from my ZTE-BLADE using Tapatalk 2


----------



## BinaryDemon

I think people need to quit complaining about the choice of games tested here. Even professional reviews don't test 1000 games; they have to pick a few to represent certain situations. Obviously, older games are far more likely to favor Intel because they are less likely to be aggressively multithreaded. I wouldn't draw any conclusions about which processor is better based on this thread, but IMO it shows that the FX-8350 has 'game'. It can hang. It doesn't deserve the disrespect given to its older Bulldozer siblings.


----------



## Catscratch

There's no real definition of a bottleneck. Percentages don't mean anything by themselves.

For me, a CPU or GPU bottleneck is whenever your system CAN NOT push a MINIMUM FPS higher than your MONITOR REFRESH RATE.

*Min FPS > Monitor Refresh Rate.* Everything else is trash talk = e-peen.


----------



## JunkoXan

Quote:


> Originally Posted by *Catscratch*
> 
> There's no real definition of a bottleneck. Percentages don't mean anything by themselves.
> 
> For me, a CPU or GPU bottleneck is whenever your system CAN NOT push a MINIMUM FPS higher than your MONITOR REFRESH RATE.
> 
> *Min FPS > Monitor Refresh Rate.* Everything else is trash talk = e-peen.


If it goes below my monitor's refresh rate of 60, the min better be 45 or more.







I'm pretty lenient about minimum FPS; as long as it doesn't dip too far below 60, it's fine.


----------



## Demonkev666

Quote:


> Originally Posted by *Stoffie*
> 
> CINEBENCH
> AMD
> 
> INTEL


How can you show a 56% difference in single-core and not see that it's far lower on the GPU side? In OpenGL it's only 18%.


----------



## Stoffie

Quote:


> Originally Posted by *Demonkev666*
> 
> How can you show a 56% difference in single-core and not see that it's far lower on the GPU side? In OpenGL it's only 18%.


Sorry forgive me for being stupid but I don't understand the question?


----------



## almighty15

Quote:


> Originally Posted by *Demonkev666*
> 
> How can you show a 56% difference in single-core and not see that it's far lower on the GPU side? In OpenGL it's only 18%.


Because the GPU is maxed out on the Intel system and not on the AMD.... If the GPU had 56% more power, then it would show 56% higher performance..
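That point can be shown with a toy model; the numbers below are illustrative assumptions, not measurements from this thread. The observed frame rate is capped by whichever side is slower, so a large CPU gap shrinks once the GPU becomes the limit:

```python
# Toy model: the frame rate you see is capped by the slower of CPU and GPU.
def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# CPU A is 56% faster than CPU B in a pure CPU-side test...
cpu_a, cpu_b = 156.0, 100.0
gpu_cap = 118.0  # ...but the GPU can only render this many frames per second

gap = observed_fps(cpu_a, gpu_cap) / observed_fps(cpu_b, gpu_cap) - 1
print(f"{gap:.0%}")  # prints 18% - the visible gap collapses toward the GPU cap
```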


----------



## Stoffie

Quote:


> Originally Posted by *almighty15*
> 
> Because the GPU is maxed out on the Intel system and not on the AMD.... If the GPU had 56% more power then it would show 56% higher performance..


It's a pointless test, I think. I just did a CPU/GPU usage run: the CPU averaged 15% and the GPU 40%?!


----------



## Dimaggio1103

Quote:


> Originally Posted by *almighty15*
> 
> I'm still waiting for ARMA 2, Crysis 1 and other games to be tested....
> Why is it that the only games AMD fans benchmark is multi-threaded games? And on top of that it's always the same games....


Why should they test old, crappy single-threaded games? Like it or not, the future is multi-threaded. Who cares how old, inefficient games run? If you wanna only play older games, buy yourself a Core 2 Duo and be happy.

Almost every single AAA title being released now is multithreaded, so what's the problem? Funny how two people, one from each CPU side, take their time to post benchmarks for everyone, and people like you just whine and complain the whole time, demanding they do this or that, just to justify paying extra for a CPU. I paid 130 bucks and can play all the games you can. U mad?


----------



## JunkoXan

I may consider getting an 8320 or 8350 rather than a 3570k, as I find it more suitable for games and streaming and whatnot. I only game at 60fps (60Hz monitors) and plan on getting a 7950, which is less powerful than a 7970, but not by much.


----------



## M3T4LM4N222

Stoffie probably has some Metro 2033 benchmarks on the way. Potentially some Saints Row 3 as well.


----------



## Hokies83

Metro 2033 is like the most GPU-demanding game out there; the results between the two are going to be nearly the same.. heck, they'd probably be the same with an old Core 2 Quad as well...
My brother plays Metro 2033 on a stock i5 750 with GTX 560 SLI and he holds 60fps constant.

Need some CPU-demanding games like Borderlands 2 if we want to see any difference.
Quote:


> Originally Posted by *JunkoXan*
> 
> I may consider getting an 8320 or 8350 rather than a 3570k, as I find it more suitable for games and streaming and whatnot. I only game at 60fps (60Hz monitors) and plan on getting a 7950, which is less powerful than a 7970, but not by much.


Well, you have to put this into it as well: the AMD CPU is going to use 25% more power than the 3570k (about the same as a 7970 idling) in your case.. going to cost you more money in the long run. "Power is very very expensive where I'm at."

Gaming results with these multi-threaded games and GPU-heavy games are going to be nearly the same between the two for the low-end 60Hz gamer.

The only time you're going to see a huge difference between the two is when you have a game using 4 threads or fewer.
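For what the power-draw point is worth, here is a rough running-cost sketch. The wattage delta, gaming hours, and electricity price are all illustrative assumptions, not figures from this thread:

```python
# Back-of-envelope extra running cost of a higher-draw CPU.
# Every number below is an illustrative assumption.
def extra_annual_cost(delta_watts: float, hours_per_day: float,
                      price_per_kwh: float) -> float:
    """Yearly cost of drawing delta_watts more, for hours_per_day each day."""
    extra_kwh = delta_watts / 1000.0 * hours_per_day * 365
    return extra_kwh * price_per_kwh

# e.g. ~50 W more under load, 4 h of gaming a day, $0.15 per kWh
print(f"${extra_annual_cost(50, 4, 0.15):.2f} per year")  # $10.95 per year
```

So at typical rates the gap is real but small; it only stings where, as Hokies83 says, power is expensive.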


----------



## Stoffie

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> Stoffie probably has some Metro 2033 benchmarks on the way. Potentially some Saints Row 3 as well.


Yep, I am doing it as you write. By the end of today I hope to have added Metro 2033, Far Cry 3, Hitman Absolution and Saints Row 3. I am not sure if Powermonkey will be able to add anything for a while; he is battling an Eyefinity issue...

@Hokies: If you want to donate me a copy of Borderlands then I'll do any test you want, but I can't blow £30 on a game that I have no interest in just to do a benchmark and then leave it alone... Isn't Planetside 2 supposed to be CPU-intensive? That's free; I can do that?

I donated $10 to the Humble Bundle already, so I have all of the THQ stuff if anything there appeals?


----------



## Hokies83

Quote:


> Originally Posted by *Stoffie*
> 
> Yep, I am doing it as you write. By the end of today I hope to have added Metro 2033, Far Cry 3, Hitman Absolution and Saints Row 3. I am not sure if Powermonkey will be able to add anything for a while; he is battling an Eyefinity issue...
> @Hokies: If you want to donate me a copy of Borderlands then I'll do any test you want, but I can't blow £30 on a game to do a benchmark and then leave it alone... Isn't Planetside 2 supposed to be CPU-intensive? That's free; I can do that?


I had PS2, hated it, and deleted it. I think it is multi-threaded.


----------



## JunkoXan

Quote:


> Originally Posted by *Hokies83*
> 
> Metro 2033 is like the most GPU-demanding game out there; the results between the two are going to be nearly the same.. heck, they'd probably be the same with an old Core 2 Quad as well...
> My brother plays Metro 2033 on a stock i5 750 with GTX 560 SLI and he holds 60fps constant.
> 
> Need some CPU-demanding games like Borderlands 2 if we want to see any difference.
> Quote:
> 
> 
> 
> Originally Posted by *JunkoXan*
> 
> I may consider getting an 8320 or 8350 rather than a 3570k, as I find it more suitable for games and streaming and whatnot. I only game at 60fps (60Hz monitors) and plan on getting a 7950, which is less powerful than a 7970, but not by much.
> 
> Well, you have to put this into it as well: the AMD CPU is going to use 25% more power than the 3570k (about the same as a 7970 idling) in your case.. going to cost you more money in the long run. "Power is very very expensive where I'm at."
> 
> Gaming results with these multi-threaded games and GPU-heavy games are going to be nearly the same between the two for the low-end 60Hz gamer.
> 
> The only time you're going to see a huge difference between the two is when you have a game using 4 threads or fewer.

Yeah, I know the power consumption will be a bit higher, and my electricity is very cheap _(160 a month)_. Well, I'm going to be playing BF3, Crysis 2/3, Planetside 2, etc... Then I can factor in streaming, which can put a sizable load on the CPU. Then, outside games, I do 3D art and such, which can benefit from the FX 8300 series. Also, I'll be gaming in Eyefinity _(3840x1024)_.


----------



## kzone75

Quote:


> Originally Posted by *Hokies83*
> 
> I had PS2, hated it, and deleted it. I think it is multi-threaded.


It is multi-threaded, except only one of the cores is heavily utilized and the rest are only sort of utilized.


----------



## Stoffie

Hey guys, Hitman Absolution and Metro 2033 are up for AMD; Intel to follow when Powermonkey has a spare minute. I'm not sure if Powermonkey has Metro 2033, so we might not get an Intel run for that! We are currently pondering a decent benchmark for Far Cry 3...


----------



## Powermonkey500

Quote:


> Originally Posted by *Stoffie*
> 
> Hey guys, Hitman Absolution and Metro 2033 are up for AMD; Intel to follow when Powermonkey has a spare minute. I'm not sure if Powermonkey has Metro 2033, so we might not get an Intel run for that! We are currently pondering a decent benchmark for Far Cry 3...



I'm getting really bad stuttering in Far Cry 3 -.-
Trying to fix it.

As usual I can't just play.


----------



## Hokies83

Quote:


> Originally Posted by *Powermonkey500*
> 
> I'm getting really bad stuttering in Far Cry 3 -.-
> Trying to fix it.
> As usual I can't just play.


Yeah, I heard that game is really buggy. I did not like Far Cry 1 or 2, so unless it is free I have no interest in 3. I may snag the new Hitman if it has a 50%-off sale.


----------



## itomic

Quote:


> Originally Posted by *Stoffie*
> 
> Hey guys, Hitman Absolution and Metro 2033 are up for AMD; Intel to follow when Powermonkey has a spare minute. I'm not sure if Powermonkey has Metro 2033, so we might not get an Intel run for that! We are currently pondering a decent benchmark for Far Cry 3...


The GPU usage graphs are just for CF! Is GPU usage better with a single HD 7970?


----------



## Hokies83

Witcher 2
Transformers fall of cybertron
BatMan Arkham City
Borderlands 2
Assassins Creed 3
Black ops 2
Arma 2 (I do not care what fanbois say; it is the #10 best seller on Steam, so test it)
Xcom
Skyrim
Alan Wake
Dishonored
Portal 2
Torchlight 2


----------



## JunkoXan

I'd rather have recent and future games tested as they come out than have old games _(3 years or older)_ tested. You have to be moving forward, not backwards, in testing like this, and that's what those 2 are doing: moving forward, testing games that came out recently and last year. To me these benchmarks are very good results, head to head and not biased, with proof to back up each individual experience, and when compared to one another there is very little difference between them in both single- and dual-GPU setups.









The 8350 is becoming more and more appealing as I look over the results.


----------



## kzone75

I can do Klondike, if anyone's interested.. But I only have a 6950..


----------



## AsanteSoul

The numbers are looking good.. I just want to know whether, if I run 3 660 Tis or 2 670s or 680s, the FX 8350 can keep up...


----------



## Stefy

Key words here guys: post-purchase rationalization.

Now, why would anyone go ahead and buy AMD CPUs when they want to game? Could be fanboyism, could be lack of awareness or it could be that you wanted to show off an 8-core processor to your buddies, who knows? I'm pretty sure most people with AMD chips in this thread knew very well what they were getting into when they decided to buy an FX-chip and now they are trying to justify their purchase by posting ridiculous benchmarks to make themselves feel better.


----------



## dizzy4

Wow, these results are great. It looks like for single-GPU setups you would not need to pay the higher cost of Intel to get the same performance, but for multi-GPU solutions Intel really is the best choice. AMD isn't even that far behind.


----------



## Powermonkey500

I'm going to ask a mod to lock or clean this thread up.


----------



## cssorkinman

Thanks, Stoffie and Powermonkey; a very helpful thread for anyone considering the purchase of either of these processors. You guys spent a lot of your time (and some of your money) to help demonstrate the differences between the 2 chips. I appreciate your efforts.


----------



## SmokinWaffle

Quote:


> Originally Posted by *Powermonkey500*
> 
> I'm going to ask a mod to lock or clean this thread up.


Done.

I just had to delete 98 posts from this thread. When I came in here the first time, I had to delete ~20. A senior moderator cleaned it up before then, too. That's well over *100 posts of arguing, bickering and off-topic.*

If you can't discuss things in a *civil manner,* don't do it at all. I'm not going to spend all of my time moderating _one_ thread because one game gets 4FPS more than the other with a different CPU. You had a chance (2 in fact), and you continued to argue/flame.

*Locked.*


----------

