# [gamegpu] Far Cry 4 benchmarks



## raghu78

http://gamegpu.ru/action-/-fps-/-tps/far-cry-4-test-gpu.html











This review looks bad for Nvidia's Kepler cards: the R9 285 beats the GTX 770, the R9 280X beats the GTX 780, and the R9 290 beats the GTX 780 Ti across all resolutions. If these benchmarks are indicative of actual game performance, my question is: is Nvidia holding back Kepler performance (in drivers) to sell more Maxwell cards? This is quite embarrassing for Kepler cards like the GTX 780 Ti, GTX 780 and GTX 770, especially in a TWIMTBP title. Btw, SLI works very well. CF does not work at all; AMD is working with Ubisoft to fix this.

The Nvidia setting cuts fps by 50%. People who have played the game can post their opinion if that setting makes a huge difference to image quality.

I would wait for other sites to post their benchmarks too. Forum users can post links to other performance reviews too.









Edit: gamegpu charts updated with MSAA 4x (Ultra+). The GTX 980 is dominant at 1080p with MSAA, beating the GTX 780 Ti by close to 30% and the GTX 970 by almost 20%. At 1080p, R9 290 and R9 290X performance is not good. R9 280X performance is much better, beating out the GTX 770. The gap between the R9 280X and R9 290X is less than 20%. AMD needs to improve the R9 290 series' MSAA performance.


----------



## jmcosta

That's weird. I was just reading this: http://www.gamersnexus.net/game-bench/1701-far-cry-4-gpu-benchmark-amd-is-broken-again


----------



## boot318

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

I think they are using the new beta driver that AMD released yesterday. It's supposed to give "up to" a 50% increase in FC4.

I'm surprised the game runs well on AMD. CF is broken but I don't use it.


----------



## rdr09

Quote:


> Originally Posted by *jmcosta*
> 
> That's weird. I was just reading this: http://www.gamersnexus.net/game-bench/1701-far-cry-4-gpu-benchmark-amd-is-broken-again


Difference in the driver used on Nvidia. Maybe one reviewer used a more stable one?


----------



## Grindhouse

Interesting benchmark !

I'm running a 4930K @ 4.4GHz + GTX 980 SLI. I hope I'll be able to get that 176 min FPS and 195 average!


----------



## Adglu

I played through the prologue at [email protected]@max+smaa on the old 14.9 beta drivers and it was highly playable and smooth, definitely an improvement over FC3. Got to check the new 14.11.2 beta soon.


----------



## axizor

Quote:


> Originally Posted by *Grindhouse*
> 
> Interesting benchmark !
> 
> I'm running a 4930K @ 4.4GHz + GTX 980 SLI. I hope I'll be able to get that 176 min FPS and 195 average!


980 SLI w/ a 1080p monitor?


----------



## raghu78

We need more reviews to get a better idea of performance. Also the reviews should have all the latest patches installed.


----------



## Grindhouse

Quote:


> Originally Posted by *axizor*
> 
> 980 SLI w/ a 1080p monitor?


Indeed, I am running a 1080p (120Hz) monitor with a GTX 980 SLI setup.

I don't believe in overkill. I believe in never dropping under 120fps under any circumstances.


----------



## PureBlackFire

Well, it's either temporarily broken for AMD or temporarily broken for 90% of high-end Nvidia users. Oh Ubisoft, what would PC gaming do without you?


----------



## Clockster

Quote:


> Originally Posted by *Grindhouse*
> 
> Indeed, I am running a 1080p (120Hz) monitor with a GTX 980 SLI setup.
> 
> I don't believe in overkill. I believe in never dropping under 120fps under any circumstances.


Witcher 3 would like to have a word with you when it launches


----------



## deafboy

*fingers crossed*


----------



## MK3Steve

Quote:


> Originally Posted by *raghu78*
> 
> We need more reviews to get a better idea of performance. Also the reviews should have all the latest patches installed.


I've tested the new vs. the old driver on my 980. The new driver doesn't give me a single frame more. I'll upload a short comparison video between both drivers in the next hour, I think. Also, the random stutter is still there with the newest drivers.


----------



## axizor

Quote:


> Originally Posted by *Grindhouse*
> 
> Indeed, I am running a 1080p (120Hz) monitor with a GTX 980 SLI setup.
> 
> I don't believe in overkill. I believe in never dropping under 120fps _under any circumstances._


Does that mean a third 980 on the way the second an fps drop under 120 occurs?

120hz is nice, but those cards deserve more res lol.


----------



## DrBrogbo

From what I've noticed of the Nvidia setting, it definitely makes a visual difference. The fur is a little glitchy (some artifacting around the edges), but it looks fantastic, almost like CGI fur.
Quote:


> Originally Posted by *MK3Steve*
> 
> I've tested the new vs. the old driver on my 980. The new driver doesn't give me a single frame more. I'll upload a short comparison video between both drivers in the next hour, I think. Also, the random stutter is still there with the newest drivers.


Seconded. New driver does not help my 980 in any noticeable way either. Stuttering is still there, but triple buffering helps quite a bit. I've heard there are other tricks (mostly ini file tweaks), but I haven't been able to test them yet.


----------



## ladcrooks

It will get better. It's always a mess when some games are released.









Just goes to show the argument about 1080p that comes up often here on OCN, and what cards you need: 1080p is still a resolution to be reckoned with.


----------



## jameschisholm

Is this with the latest Nvidia driver 344.75?


----------



## paulerxx

50+ fps at Ultra on my HD 7870? Nice! Nice to see some games optimized well for PC...


----------



## jameschisholm

Funny how when the chart says Nvidia, the frames get halved? Is that because of the fancy extras?


----------



## DrBrogbo

Quote:


> Originally Posted by *jameschisholm*
> 
> Funny how when the chart says Nvidia, the frames get halved? Is that because of the fancy extras?


Yeah, "Nvidia" is a graphics setting beyond ultra in the game that enables all the fancy Nvidia-only features (hairworks and the like).


----------



## ThePath

It seems that a mid-range GPU like the GTX 760 can run this game better than the PS4.


----------



## KaffieneKing

Very intrigued. If the first set of numbers is true, then I could play at 4K with my 290X







(albeit at 30-ish FPS, but that's still pretty good!)

I suspect the first set of numbers used the new version of AMD's drivers released today.


----------



## AndroidVageta

Heh, I'm running the game on a single 290 right now (since XFire is broken) in Eyefinity with everything at max settings and SMAA, and I'm still getting ~30FPS. Not mad smooth, mind you, but perfectly playable. So with XFire fixed soon, that should about double.

You know, they might be power hungry and run hot (well, mine are water cooled, but they'll still heat a room up!), but these little 290s are freaking wonderful. Who would have thought that for $400 used for both I'd get so much GPU grunt? Plus the 4GB of RAM is awesome.

Ahhh, just love my little 290s! There's just NOTHING they can't do!


----------



## rcfc89

Quote:


> Originally Posted by *DrBrogbo*
> 
> Yeah, "Nvidia" is a graphics setting beyond ultra in the game that enables all the fancy Nvidia-only features (hairworks and the like).


In other words, if you can run 4K with Nvidia mode enabled and hold 60 fps, your machine is a bad mofo.


----------



## BradleyW

Quote:


> Originally Posted by *boot318*
> 
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> 
> I think they are using the new beta driver that AMD released yesterday. It is supposed to be "up-to" 50% increase in FC4.
> 
> I'm surprised the game runs well on AMD. CF is broken but I don't use it.


FC3 was a Gaming Evolved title and ran well on AMD, so I figured AMD would also concentrate on doing a good job with FC4 support. In fact, AMD has been on the ball with new releases since the days of BioShock Infinite and Tomb Raider.


----------



## omari79

Quote:


> Originally Posted by *DrBrogbo*
> 
> Yeah, "Nvidia" is a graphics setting beyond ultra in the game that enables all the fancy Nvidia-only features (hairworks and the like).


There's fur or some fancy hair effect in FC4??


----------



## jameschisholm

Quote:


> Originally Posted by *omari79*
> 
> There is fur or some fancy hair effect in FC4??


https://www.youtube.com/watch?v=-sC8XIsqjSc


----------



## omari79

Quote:


> Originally Posted by *jameschisholm*
> 
> https://www.youtube.com/watch?v=-sC8XIsqjSc


Cheers mate and +rep

I'd heard before that PCSS is a bit taxing, but it looks like the real heavy hitter is this damn fine fur effect... the sharp fps nosedive makes sense now.


----------



## DrBrogbo

Quote:


> Originally Posted by *rcfc89*
> 
> In other words, if you can run 4K with Nvidia mode enabled and hold 60 fps, your machine is a bad mofo.


Lol yeah, and you spent over $2k on GPUs.


----------



## h2spartan

Quote:


> Originally Posted by *jameschisholm*
> 
> https://www.youtube.com/watch?v=-sC8XIsqjSc


Haha, the buffalo has an unnaturally silky-smooth coat. Must take good care of it. Wonder what shampoo it uses...


----------



## amstech

Quote:


> Originally Posted by *DrBrogbo*
> 
> Lol yeah, and you spent over $2k on GPUs.


If I spent all that money on a 4K monitor, a badass supporting rig and SLI GPUs and got this type of performance, I would be pissed.
So many demanding games make today's top-dog SLI setups look very unimpressive.

Going to wait for the next gen of GPUs, which should offer a better performance boost; this gen seems more about power/heat savings.


----------



## jameschisholm

Quote:


> Originally Posted by *h2spartan*
> 
> Haha, the buffalo has an unnaturally silky smooth coat. Must take good care of it? Wonder what shampoo it uses....


Pantene 2in1, of course.


----------



## omari79

Quote:


> Originally Posted by *jameschisholm*
> 
> Pantene 2in1, of course.


TRESemme..nuff said


----------



## rcfc89

Quote:


> Originally Posted by *amstech*
> 
> If I spent all that money on a 4K monitor, a badass supporting rig and SLI GPUs and got this type of performance, I would be pissed.
> So many demanding games make today's top-dog SLI setups look very unimpressive.
> 
> Going to wait for the next gen of GPUs, which should offer a better performance boost; this gen seems more about power/heat savings.


Anything currently out in SLI/CrossFire (2-way only) is going to come up short when it comes to 4K. I believe the next GPU releases within the next few months (390X, 980 Ti, Titan 2) will be able to get it done in 4K with only two GPUs.


----------



## MR KROGOTH

Wonder how my GTX 295 SLI setup would hold up.
Can I get the benchmark separately from the game?


----------



## szeged

Glad to see AMD isn't falling behind like in FC3; playing that on my 7970s was a mess.


----------



## ZealotKi11er

If Kepler performance does not improve, I am not buying an Nvidia GPU anytime soon.


----------



## Alatar

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If Kepler performance does not improve, I am not buying an Nvidia GPU anytime soon.


Eh, depends on the review you look at (and whether you want to use the PC-specific NV features such as HBAO, HairWorks, god rays, etc.). GameGPU decided to leave those features out for some reason; other sites have them. Especially for NV vs. NV performance I'd rather include all the features and run the game maxed out; why GameGPU didn't, I don't know.

But anyway:





http://www.pcgameshardware.de/Far-Cry-4-PC-256888/Specials/Technik-Test-Benchmark-1143026/

Those are with maxed-out settings (aside from the ones that don't work on AMD), as far as I understand, and also the latest drivers.

But yeah, it might just be me, but I think benches should be run at max settings. Usually I like GameGPU, but sometimes they pull nonsense like this.


----------



## digiadventures

Quote:


> Originally Posted by *jameschisholm*
> 
> Funny how when the chart says Nvidia, the frames get halved? Is that because of the fancy extras?


Quote:


> Originally Posted by *omari79*
> 
> Cheers mate and +rep
> 
> I heard before that PCSS is a bit taxing but looks like the real heavy hitter is these damn fine fur effect..the sharp fps nose dive makes sense now.


Nope, the problem with Nvidia mode is the performance-heavy TXAA or some x4 AA mode. Just turn that off and you get similar performance to Ultra with all those nice Nvidia effects still on.


----------



## Alatar

Quote:


> Originally Posted by *digiadventures*
> 
> Nope, the problem with Nvidia mode is the performance-heavy TXAA or some x4 AA mode. Just turn that off and you get similar performance to Ultra with all those nice Nvidia effects still on.


Which is how the game should be benched, not on a preset, because presets don't make any sense.

Just go custom, max out all the settings you can, and then tweak AA levels based on performance.


----------



## AndroidVageta

Quote:


> Originally Posted by *Alatar*
> 
> *SNIP*


I don't know how much I buy those results. My STOCK 290 gets a solid 30FPS consistently (haven't seen it drop below that) with everything maxed (including Nvidia effects) and SMAA (MSAA is a killer for some reason, same as in Unity) at 5760x1080 Eyefinity.

No joke, I'm playing right now: newest AMD drivers, with Far Cry 4 patched up to the latest version. Not having a single hiccup.

I mean, don't get me wrong, 30FPS isn't buttery smooth or anything, but the game hasn't given me any issues other than XFire not working, which seems to be an AMD thing right now.

Other than XFire, though, the game runs fine and I don't have any performance problems at all. I'm getting the same performance in Eyefinity that they're getting at 1440p.

Still think something is off, though; there's no reason under any circumstances why a GTX 770 should be running any game faster than a 290/290X. Even the 780 should be about neck and neck.


----------



## djriful

40-50 FPS average with the Nvidia preset maxed out, without AA, at 1440p. Single modded GTX TITAN.


----------



## criminal

Kepler's performance is odd in this game.


----------



## daviejams

Has nobody posted the CPU benchmarks from that site?



The i5 2500K in 4th... greatest CPU ever.


----------



## Seid Dark

I hope Nvidia doesn't abandon Kepler yet. My card isn't even a year old and in theory is on par with the 980. If there are no driver optimizations for new games, the 780 Ti will be useless soon.


----------



## DrBrogbo

Quote:


> Originally Posted by *daviejams*
> 
> Has nobody posted the CPU benchmarks from that site?
> 
> 
> 
> The i5 2500K in 4th... greatest CPU ever.


No wonder I get such low performance. My QX9650 isn't up to snuff anymore.









Wait, the 2500K outperforming the 2600K? Aren't they the exact same series, except the 2600K is 0.1GHz faster and has HT?


----------



## daviejams

Quote:


> Originally Posted by *DrBrogbo*
> 
> No wonder I get such low performance. My QX9650 isn't up to snuff anymore.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wait, the 2500K outperforming the 2600K? Aren't they the exact same series, except the 2600K is 0.1GHz faster and has HT?


The 2600K is the Sandy Bridge i7, so it's the same chip but with Hyper-Threading.

It's probably a mistake; that site always gets its benchmarks out really quickly.

You should probably think about upgrading.


----------



## DrBrogbo

Quote:


> Originally Posted by *daviejams*
> 
> The 2600K is the Sandy Bridge i7, so it's the same chip but with Hyper-Threading.
> 
> It's probably a mistake; that site always gets its benchmarks out really quickly.
> 
> You should probably think about upgrading.


Yeah, I've got my eye on a 4790k, just being patient.


----------



## SoloCamo

Quote:


> Originally Posted by *daviejams*
> 
> Nobody post the CPU benchmarks from that site ?
> 
> 
> 
> i5 2500k in 4th ... greatest CPU ever


Seems it doesn't like HT?

4670 > 4770

2500 > 2600

etc

Random question, but why does no one ever seem to include Ivy Bridge in the benches these days? I mean, I get it, the difference from Sandy to Haswell is small, let alone Ivy to Haswell, but still.
Quote:


> Originally Posted by *DrBrogbo*
> 
> No wonder I get such low performance. My QX9650 isn't up to snuff anymore.


I'd imagine you are getting better frame rates than that FX-6100, right?


----------



## DrBrogbo

Quote:


> Originally Posted by *SoloCamo*
> 
> I'd imagine you are getting better frame rates than that FX-6100, right?


Pretty comparable, actually. The QX9650 came out in '07, so I'm surprised it has lasted me even half this long. It matched well with my old 580, but I upgraded to a 980, so it is without a doubt holding my entire system back at every turn now.


----------



## djriful

Quote:


> Originally Posted by *Seid Dark*
> 
> I hope Nvidia doesn't abandon Kepler yet. My card isn't even year old and in theory is on par with 980. If there's no driver optimizations for new games 780 Ti will be useless soon.


Don't you worry. If they do, I'm sure a large portion of 600- and 700-series owners would be raging. The 700 series was the last gen before the 900 series, not two generations old; there was no 800 series on desktop.


----------



## MR KROGOTH

Quote:


> Originally Posted by *DrBrogbo*
> 
> Pretty comparable, actually. The QX9650 came out in 07, so I'm surprised it has lasted me even half this long. It matched well with my old 580, but I upgraded to a 980, so it is without a doubt holding my entire system back at every turn now.


Well if you decide to jump ship on 775, let me know. Need another board and a Bloodrage would be sick.

Again, can the benchmark be had outside the game?


----------



## DrBrogbo

Quote:


> Originally Posted by *MR KROGOTH*
> 
> Well if you decide to jump ship on 775, let me know. Need another board and a Bloodrage would be sick.
> 
> Again, can the benchmark be had outside the game?


Will do. Mine isn't a Bloodrage, though. It's the X48 Blackops.

I don't think there's even an official benchmark in-game, so I doubt there's an external one.


----------



## MR KROGOTH

Quote:


> Originally Posted by *DrBrogbo*
> 
> Will do. Mine isn't a Bloodrage, though. It's the X48 Blackops.
> 
> I don't think there's even an official benchmark in-game, so I doubt there's an external one.


Misread that... the Blackops is just as good.

So people are just walking around in-game and recording FPS?
That's... a little questionable.


----------



## DrBrogbo

Quote:


> Originally Posted by *MR KROGOTH*
> 
> Misread that...blackops is just as good.
> 
> So people are just walking around in game and recording FPS?
> Thats... A little questionable.


Agreed. There are some heavily scripted sequences, but it will never be the same as a pre-programmed benchmark scene. I suppose as long as they do 3-5 runs in the same area and average out the results, it's as good as it's going to get, eh?
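For what it's worth, that kind of manual run can still be reduced to comparable numbers. Here is a minimal sketch in Python of the average-of-several-runs idea, using made-up frame times (none of these figures come from the reviews):

```python
# Reduce a few manual benchmark runs of the same area to an average
# and a 1%-low FPS figure. All frame times below are invented examples.

def fps_stats(frame_times_ms):
    """Average FPS and 1%-low FPS for one run, from per-frame times in ms."""
    fps = sorted(1000.0 / t for t in frame_times_ms)
    worst = fps[: max(1, len(fps) // 100)]  # slowest 1% of frames
    return sum(fps) / len(fps), sum(worst) / len(worst)

def average_runs(runs):
    """Mean of (average, 1%-low) across repeated runs of the same area."""
    stats = [fps_stats(run) for run in runs]
    n = len(stats)
    return (sum(s[0] for s in stats) / n, sum(s[1] for s in stats) / n)

# Three hypothetical runs: mostly ~60 fps with one slow frame each.
runs = [
    [16.7] * 99 + [33.3],
    [16.7] * 99 + [40.0],
    [16.7] * 99 + [50.0],
]
avg_fps, low_fps = average_runs(runs)
```

Averaging smooths out run-to-run noise, but a single stutter still shows up clearly in the 1%-low number, which is why per-frame metrics tell you more than a plain average.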


----------



## MR KROGOTH

Quote:


> Originally Posted by *DrBrogbo*
> 
> Agreed. There are some heavily scripted sequences, but it will never be the same as a pre-programmed benchmark scene. I suppose as long as they do 3-5 runs in the same area and average out the results, it's as good as it's going to get though, eh?


I'd hope they are that honest...


----------



## ZealotKi11er

How come their 8-core Haswell is amazing on this site in every damn game? Makes no damn sense. Don't tell me it's the architecture: the 2500K beats a higher-clocked 6-core SB-E, and the 8-core is clocked lower than the 4770K and still wins.


----------



## SoloCamo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How come their 8-core Haswell is amazing on this site in every damn game? Makes no damn sense. Don't tell me it's the architecture: the 2500K beats a higher-clocked 6-core SB-E, and the 8-core is clocked lower than the 4770K and still wins.


I've questioned that lately as well. On a few titles that were clearly decently threaded I let it slide, but something is off with it, especially given the clock-speed deficit and the considerably higher fps over the next-lowest chip.


----------



## pwnzilla61

Quote:


> Originally Posted by *SoloCamo*
> 
> I've questioned that lately as well.. on a few titles that were clearly decently threaded I let it slide, but something is off with it. Especially with the speed deficit and the generally considerably higher fps somehow over the next lowest.


I don't think GameGPU has been accurate for a while; my performance is always way better than what they post.


----------



## Chargeit

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How come their 8-core Haswell is amazing on this site in every damn game? Makes no damn sense. Don't tell me it's the architecture: the 2500K beats a higher-clocked 6-core SB-E, and the 8-core is clocked lower than the 4770K and still wins.


I like how they exclude the 4790k.


----------



## Master__Shake

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How come their 8-core Haswell is amazing on this site in every damn game? Makes no damn sense. Don't tell me it's the architecture: the 2500K beats a higher-clocked 6-core SB-E, and the 8-core is clocked lower than the 4770K and still wins.


Looks to me like Hyper-Threading is breaking the game?

The i5 Haswell beats the i7 Haswell, and the i5 SB beats the extreme SB-E.

Clocks aren't even affecting the results.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Master__Shake*
> 
> Looks to me like Hyper-Threading is breaking the game?
> 
> The i5 Haswell beats the i7 Haswell, and the i5 SB beats the extreme SB-E.
> 
> Clocks aren't even affecting the results.


The 8-core doesn't get affected, and HT on vs. HT off isn't much of a difference.


----------



## Master__Shake

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The 8-core doesn't get affected, and HT on vs. HT off isn't much of a difference.


strange results....


----------



## djriful

Quote:


> Originally Posted by *pwnzilla61*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SoloCamo*
> 
> I've questioned that lately as well.. on a few titles that were clearly decently threaded I let it slide, but something is off with it. Especially with the speed deficit and the generally considerably higher fps somehow over the next lowest.
> 
> 
> 
> I don't think games gpu has been accurate for a while, my performance is always way better then what they post.

Because most stats are at stock or a mild OC only. We always push our hardware to the max.


----------



## Luciferxy

Quote:


> Originally Posted by *h2spartan*
> 
> Haha, the buffalo has an unnaturally silky smooth coat. Must take good care of it? Wonder what shampoo it uses....


The same shampoo that Lara used on her TressFX hair









Btw, does this game have a third-person camera view?


----------



## Jaren1

I'm playing with a 3570K @ 4.2 and my GTX 980 at stock speeds, using the Nvidia preset with MSAA x8, I believe. Latest driver from today, and I'm typically around 70-80 fps, but I dip down into the 40s sometimes when going through volumetric fog.

Very pleased. Game looks awesome.


----------



## Olivon

Quote:


> Originally Posted by *Alatar*
> 
> But anyway:
> 
> 
> http://www.pcgameshardware.de/Far-Cry-4-PC-256888/Specials/Technik-Test-Benchmark-1143026/


Big difference from GameGPU.
A GTX 770 beating a 290X behemoth!


----------



## daviejams

Quote:


> Originally Posted by *Olivon*
> 
> Big difference from GameGPU.
> A GTX 770 beating a 290X behemoth!


They've probably tested it with all the Nvidia crap on


----------



## Clockster

Quote:


> Originally Posted by *daviejams*
> 
> They've probably tested it with all the Nvidia crap on


That's exactly what they did.
The game runs like rubbish; I can see it needing 3-4 patches before being playable.


----------



## Olivon

Quote:


> Originally Posted by *daviejams*
> 
> They've probably tested it with all the Nvidia crap on


Too bad, AMD crap (Mantle) doesn't work in this game


----------



## daviejams

Quote:


> Originally Posted by *Olivon*
> 
> Too bad, AMD crap (Mantle) doesn't work in this game


Not sure why you're in a huff; I explained why a midrange Nvidia card could possibly beat a top AMD card.

Because of the Nvidia enhancements. This is a GameWorks title after all, so it should work better on Nvidia anyway.


----------



## amstech

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The 8-core doesn't get affected, and HT on vs. HT off isn't much of a difference.


HT has been beneficial for gaming for years now.
I have several reviews and quotes from Tom's Hardware and AnandTech to support my claim.
Newer games take advantage of threads however they can get them; not sure if that exceeds 8 yet, but it will.

http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-4.html
Quote:


> Originally Posted by *Toms*
> We have seen a small handful of titles benefit from Hyper-Threaded Core i7 processors, though. Because we believe this is a trend that will continue as developers optimize their software, we're including the Core i7-4770K as an honorable mention, now selling for $340. In a vast majority of games, the Core i7 won't demonstrate much advantage over the Core i5. But if you're a serious enthusiast who wants some future-proofing and values highly-threaded application performance, this processor may be worth the extra money.


----------



## farmdve

These benches are meaningless with that overclocked processor. That CPU is beastly even without the overclock, but it boosts performance by itself, so you can't really tell how the cards actually perform.


----------



## winterkid09

Quote:


> Originally Posted by *farmdve*
> 
> These benches are meaningless with that overclocked processor. That CPU is beastly even without it, but it boosts performance by itself and you can't really tell how the cards actually perform.


Personally, I think that for the purpose of bringing out performance differences between GPUs, the most powerful CPU should be used, to eliminate any performance restriction in any department other than the GPU itself. That lets each GPU under test run at 100% utilization and gives a more accurate picture of what each individual GPU can sustain in terms of FPS.

In the scenario where the CPU is not powerful enough and GPU utilization drops to, say, 80% while the CPU catches up, the results would be skewed, as this would only negatively affect the higher-end GPUs' scores.

Nonetheless, it's best to have many different test environments, as not everybody has the most kick-ass CPU!
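That reasoning can be put into a toy model (my own illustration, with hypothetical numbers): the observed frame rate is roughly the lower of the CPU-limited and GPU-limited rates, and GPU utilization is the ratio between the two.

```python
# Toy CPU/GPU bottleneck model: whichever side is slower caps the frame
# rate, and the faster side sits partly idle. Numbers are hypothetical.

def observed(cpu_fps, gpu_fps):
    """Frame rate and GPU utilization when CPU and GPU limits interact."""
    fps = min(cpu_fps, gpu_fps)      # the slower component sets the pace
    gpu_util = fps / gpu_fps         # fraction of the GPU actually used
    return fps, gpu_util

# A fast GPU behind a slow CPU gets throttled and under-utilized...
fps_slow_cpu, util_slow_cpu = observed(cpu_fps=80, gpu_fps=100)

# ...while a faster CPU lets the same GPU run flat out.
fps_fast_cpu, util_fast_cpu = observed(cpu_fps=150, gpu_fps=100)
```

In the first case the benchmark is really measuring the CPU (80 fps at 80% GPU utilization), which is exactly the skew described above.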


----------



## kakik09

Wait, does the Nvidia setting also max out anti-aliasing? Because I can live without AA as long as I maintain 60 fps.


----------



## DrBrogbo

Quote:


> Originally Posted by *kakik09*
> 
> Wait, does the Nvidia setting also max out Anti-Aliasing? Because I can live without AA as long as I maintain 60 fps..


I think it puts TXAA on. I turned it down to SMAA 2x, since I don't really care too much about AA. It looks nice, but it's the first thing I turn off.


----------



## kakik09

Quote:


> Originally Posted by *DrBrogbo*
> 
> I think it puts TXAA on. I turned it down to SMAA 2x, since I don't really care too much about AA. It looks nice, but is the first thing I turn off.


Did TXAA look blurry like it did with Crysis 3?


----------



## djriful

Quote:


> Originally Posted by *Olivon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *daviejams*
> 
> They've probably tested it with all the Nvidia crap on
> 
> 
> 
> Too bad, AMD crap (Mantle) doesn't work in this game

You can say that again after DX12 released.


----------



## routek

Looks like the game performs really well; everything seems broadly in line with what I was expecting.


----------



## raghu78

gamegpu charts updated with MSAA 4x (Ultra+). The GTX 980 is dominant at 1080p with MSAA, beating the GTX 780 Ti by close to 30% and the GTX 970 by almost 20%. At 1080p, R9 290 and R9 290X performance is not good. R9 280X performance is much better, beating out the GTX 770. The gap between the R9 280X and R9 290X is less than 20%. AMD needs to improve the R9 290 series' MSAA performance.


----------



## thebski

Quote:


> Originally Posted by *Seid Dark*
> 
> I hope Nvidia doesn't abandon Kepler yet. My card isn't even year old and in theory is on par with 980. If there's no driver optimizations for new games 780 Ti will be useless soon.


Wouldn't be surprised if they do. They released pathetic cards for "next gen" to milk people for all they're worth, so they have to make them look good over "last gen" somehow.


----------



## djriful

Is it me, or is the water utterly flat-looking, with no physics or anything? Seems downgraded from FC3.


----------



## MK3Steve

Quote:


> Originally Posted by *djriful*
> 
> Is it me, or is the water utterly flat-looking, with no physics or anything? Seems downgraded from FC3.


The game looks really great, but yeah, the water looks pretty cheap; I found the water in FC3 better too. What is nice, though, is that when a boat drives through the water it makes those little waves that look really cool, but the water itself looks a bit low-quality.


----------



## cryp

Something is wrong here.

How does the HD 7850 beat the GTX 660???
Maybe this game has problems with Kepler GPUs.
Does anybody know a fix?


----------



## ZealotKi11er

Quote:


> Originally Posted by *cryp*
> 
> Something is wrong here.
> 
> How does the HD 7850 beat the GTX 660???
> Maybe this game has problems with Kepler GPUs.
> Anybody knows a fix?


The HD 7870 beats the GTX 660 in many games, so the HD 7850 is not that far off.


----------



## cryp

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The HD 7870 beats the GTX 660 in many games, so the HD 7850 is not that far off.


If you read the reviews, you'd see that this is not the case.
The 7870 is at about the same level as the 660.


----------



## TopicClocker

From those benchmarks, the performance in this game is weird.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How come their 8-core Haswell is amazing on this site in every damn game? Makes no damn sense. Don't tell me it's the architecture: the 2500K beats a higher-clocked 6-core SB-E, and the 8-core is clocked lower than the 4770K and still wins.


This always baffles me.









Quote:


> Originally Posted by *h2spartan*
> 
> Haha, the buffalo has an unnaturally silky smooth coat. Must take good care of it? Wonder what shampoo it uses....


OMG.









Quote:


> Originally Posted by *daviejams*
> 
> Has nobody posted the CPU benchmarks from that site?
> 
> 
> 
> The i5 2500K in 4th... greatest CPU ever.


Haha, I wouldn't be surprised if the 2500K is still going strong in 2016, especially overclocked.
The most future-proof CPU ever created (alongside the Nehalems, when OC'd).


----------



## raghu78

Two more reviews, from pclab and sweclockers.

http://pclab.pl/art57559-6.html
http://pclab.pl/art57559-7.html




Can't tell why R9 290X performance is so different from one review to another; it's much better here, especially at 1600p. Is it god rays, PCSS+, or MSAA? AMD had better fix the performance for those cases.

http://www.sweclockers.com/artikel/1...da-i-far-cry-4



AMD had better fix those min fps too.


----------



## Redeemer

GameWorks is terrible and Nvidia is purposely crippling Kepler... wow, just wow!


----------



## Master__Shake

Quote:


> Originally Posted by *Redeemer*
> 
> Gameworks is terrible and Nvidia is purposely crippling Kepler...wow just wow!


maxwell is just that good.

j/k

but look at that 980!


----------



## SoloCamo

The 290X has a better showing than I expected. Hopefully they redo the benches in a month's time, once most major issues are fixed.


----------



## raghu78

One more FC4 performance review is up, at Ultra with SMAA settings. AMD makes an excellent showing, with the R9 290X beating the GTX 970 and GTX 780 Ti. Nvidia has work to do, according to the reviewer; Ubisoft also has a lot of work to do to fix stability.

http://www.techspot.com/review/917-far-cry-4-benchmarks/





"Given that we found the Radeon R9 280X to be faster than the GeForce GTX 780 at every resolution, there's no doubt some controversy will follow these results.

We figured something was wrong from the get-go so we contacted Nvidia and they suggested using the newer 344.75 drivers, which were given to us prior to being made public -- initially we were using the 344.65 WHQL. The new driver was presumably tweaked for Far Cry 4, though the release notes don't make any performance claims. Alas, with the suggested release installed, we didn't receive a single extra frame. We should also point out we are testing with the latest version of Far Cry 4 (v1.3 as of posting).

For all Radeon cards we used the Catalyst 14.11.2 beta driver, which has also been updated for Far Cry 4. AMD claims up to a 50% performance increase over the Catalyst 14.11.1 beta in single-GPU scenarios with anti-aliasing enabled.

It's worth mentioning this latest beta driver doesn't support CrossFire. The CrossFire profile for Far Cry 4 is currently disabled while AMD works with Ubisoft to investigate an issue where CrossFire configurations are not performing as intended.

Getting back to Nvidia's poor performance... we can confirm that the 344.75 driver was used while Far Cry 4 has been patched to the latest version through Uplay. We asked Nvidia if the performance we saw was unusual or different to what they have seen and they have yet to reply.

As it stands, we believe AMD is getting the most out of its Radeon graphics cards in Far Cry 4 and don't expect to see many performance improvements in the future, with the exception of CrossFire setups. Nvidia on the other hand have some work ahead, which is hard to believe with Far Cry 4 being Nvidia-sponsored."

So AMD cards perform well with Ultra and SMAA, which means the gamegpu results with Ultra and SMAA were not an anomaly. AMD still has to fix MSAA and CrossFire performance. Nvidia cards do well with MSAA, but they could improve SMAA performance.


----------



## Lass3

Quote:


> Originally Posted by *amstech*
> 
> HT has been beneficial for gaming going on years.
> I have several reviews and quotes from Toms Hardware and Anandtech to support my claim.
> Newer games take advantage of threads however they can get them, not sure if that exceeds 8 now but it will.
> 
> http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-4.html


HT is definitely not worth it for gaming. The i5 often outperforms the i7 at the same clock speed.

http://www.anandtech.com/show/8227/devils-canyon-review-intel-core-i7-4790k-and-i5-4690k/5
_i5 wins overall at same clockspeed by ~5%._

When I had my i7-2600K I tried several AAA titles with HT on and off, and there was close to zero difference. In fact, HT off often resulted in a few more fps on average in some of the games. We're talking BF3, BF4, Titanfall etc. All multiplayer.

And this is why I went with an i5 this time. It's cheaper and just as fast as an i7 in games. The i5 also generally clocks higher because it runs cooler without HT, especially for people on air or a CLC.

The i7 will be superior in benchmarks, but I don't run benchmarks. I only use my PC for gaming, and I don't see the point in paying more for the same performance.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Lass3*
> 
> HT is def. not worth it for gaming. i5 often outperform i7 at same clockspeed.
> 
> http://www.anandtech.com/show/8227/devils-canyon-review-intel-core-i7-4790k-and-i5-4690k/5
> _i5 wins overall at same clockspeed by ~5%._
> 
> When I had my i7-2600K I tried several AAA titles with HT on and off, there was close to zero difference. In fact HT off, often resulted in a few more fps average in some of the games. We're talking BF3, BF4, Titanfall etc. All multiplayer.
> 
> And this is why I went with i5 this time. It's cheaper and just as fast as i7 in games. i5 generally clocks higher due to less heat because of no HT. Especially for ppl with air or CLC..
> 
> i7 will be superior in benchmarks, but I don't do benchmarks. I only use my PC for gaming and I don't see the point in paying more for the same performance..


HT makes no difference for gaming. Going from a 3570K to a 3770K I saw no difference, but I was still told there is one. If a game could utilize HT 100%, then HT on would theoretically give you about 33% better performance. The thing is, that is never the case. A 6-core or 8-core CPU will also show great gains over 4-core CPUs.
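That theoretical ceiling can be sanity-checked with a simple Amdahl-style model. This is just a sketch: the ~30% per-core SMT throughput factor is an assumption often quoted for Hyper-Threading, not a measurement from any game.

```python
# Assumed: an HT-enabled core delivers roughly 1.3x the throughput of
# the same core with HT off (the ~30% SMT figure; not measured here).
HT_THROUGHPUT_FACTOR = 1.3

def ht_speedup(parallel_fraction: float, factor: float = HT_THROUGHPUT_FACTOR) -> float:
    """Amdahl-style estimate of whole-game speedup from enabling HT.

    parallel_fraction: share of frame time that scales with extra
    hardware threads; the serial remainder sees no benefit from HT.
    """
    serial = 1.0 - parallel_fraction
    parallel = parallel_fraction / factor  # this part runs 'factor' times faster
    return 1.0 / (serial + parallel)

# A game that is 100% thread-limited gets the full ~30%...
print(round(ht_speedup(1.0), 2))   # 1.3
# ...but a game with half its frame time serial gets far less.
print(round(ht_speedup(0.5), 2))   # 1.13
```

This is why the "33%" never shows up in practice: most game loops have a large serial portion, so the measured gain collapses toward zero.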


----------



## MK3Steve

Quote:


> Originally Posted by *raghu78*
> 
> with the suggested release installed, we didn't receive a single extra frame.


Me neither :


----------



## jmcosta

don't forget about frame times, graphical glitches, multi-GPU...


----------



## MK3Steve

Quote:


> Originally Posted by *ZealotKi11er*
> 
> HT makes no difference for gaming. Going from a 3570K to a 3770K I saw no difference, but I was still told there is one. If a game could utilize HT 100%, then HT on would theoretically give you about 33% better performance. The thing is, that is never the case. A 6-core or 8-core CPU will also show great gains over 4-core CPUs.


HT in Far Cry 4 has no effect whatsoever. Whether it's on or off, I get exactly the same framerate.


----------



## amstech

Quote:


> Originally Posted by *Lass3*
> 
> HT is def. not worth it for gaming. i5 often outperform i7 at same clockspeed.
> .


i7s rule the gaming charts; they always have and always will.
Plus they are better suited for pushing multi-monitor setups.

A benchmark here or there where an i5 chip is faster means nothing; 99% of the time i7s are king.
Windows 7 was designed to use HT, and more and more games have been utilizing it for years now.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> HT makes no difference for gaming and going from 3570K to 3770K i sow no difference .


So one game shows no difference, and we pass judgment from that?
Games run on old engines, but newer games show a distinct advantage when they utilize threads however they can get them.
Some games show an advantage of several FPS, even compared to a higher-clocked i5.
I have several charts showing i7s beating higher-clocked i5s.

http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-4.html
Quote:


> Originally Posted by *Toms Hardware*
> We have seen a small handful of titles benefit from Hyper-Threaded Core i7 processors, though. Because we believe this is a trend that will continue as developers optimize their software, we're including the Core i7-4770K as an honorable mention, now selling for $340. In a vast majority of games, the Core i7 won't demonstrate much advantage over the Core i5. But if you're a serious enthusiast who wants some future-proofing and values highly-threaded application performance, this processor may be worth the extra money


And this is an older article; it's even more true now.


----------



## Lass3

Quote:


> Originally Posted by *amstech*
> 
> i7's rule the gaming charts, always have and always will.
> Plus they are better suited for pushing multi-mon setups.
> 
> A benchmark here or there for an i5 chip being faster means nothing, 99% of the time i7's are king.
> Windows 7 was designed to use HT and more and more games have been utilizing for years now.
> So one game makes no difference so lets pass judgement?
> Games run on old engines but newer games show a distinct advantage when they utilize threads however they can get them.
> Some games show a several FPS advantage, even when comparing to an i5 clocked higher.
> I have several charts showing i7's beating higher clocked i5's.
> 
> http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-4.html
> And this is an older article, its even more true now.


Stopped reading at this:

_"i7's rule the gaming charts, always have and always will.
Plus they are better suited for pushing multi-mon setups."_

My link proved this isn't true. Besides that, I trust my own results more than some random tests. People will do anything to justify their i7 purchase for gaming. There is zero difference. Anyone with an i7 can try disabling HT; they will see the exact same fps in games.

Games have never utilized HT. Consoles don't support HT, so games never will; games are written for consoles. DX12 and Mantle will make the CPU less important in the future, so the i7 will never be better for gaming, unless you choose the HEDT platform for more PCIe lanes with a multi-GPU setup, and then you get more real cores instead of just virtual ones.


----------



## Ha-Nocri

who cares? Got bored of the stupid game after 1 hour of play...


----------



## amstech

Quote:


> Originally Posted by *Lass3*
> 
> Stopped reading at this:
> 
> My link proved this isnt true. .


This game utilizes 4-8 threads.
Because of HT the i3 basically matches the i5.

The i7 is 7 FPS faster than the i5 but only has a 100 MHz clock speed advantage.
The i7-870, which is clocked much lower than the i5, is still just as fast.

http://s51.photobucket.com/user/topenlt1/media/35038.png.html
Source: Anandtech

If you look at this chart and still believe HT doesn't help you are in *COMPLETE DENIAL.*
Have a nice day!









-


----------



## daviejams

If a game uses two cores, then I don't think HT would help give you more frames.

More modern games are using more cores these days, so if I were building a new PC, I'd pay the extra for the i7 for sure.


----------



## Lass3

Quote:


> Originally Posted by *amstech*
> 
> This game utilizes 4-8 threads.
> Because of HT the i3 basically matches the i5.
> 
> The i7 is 7FPS faster then the i5 but only has a 100Mhz clockspeed advantage.
> The i7 870 that is clocked much lower then the i5 is still just as fast.
> 
> http://s51.photobucket.com/user/topenlt1/media/35038.png.html
> Source: Anandtech
> 
> If you look at this chart and still believe HT doesn't help you are in *COMPLETE DENIAL.*
> Have a nice day!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -


Sorry, I don't care about your old link from 200X. The i7's are even clocked higher...

I just gave you a link from 2014, with multi GPU results, in newer games and the i5 wins clock for clock overall. Nuff said.

Sounds like you bought a i7 solely for gaming lmao..


----------



## amstech

Quote:


> Originally Posted by *Lass3*
> 
> The i7's are even clocked higher...


You might need reading glasses.
The 870 was at 2.9 GHz while the 2500K was at 3.3 GHz, and the 2500K still lost.

i5s are great because most developers are not programming their games to take advantage of 12-16 threads, only 4-8.
I won't argue that i5s are great for the money and sometimes just as fast or faster in benchmarks, but that's not because they actually are faster; the i7 is being underused.
The i3 keeping pace with an i5 is the best example.

As for my chip, I had an i5 but it ran out of steam with Photoshop and video creation.
Windows 7 also runs smoother overall with HT on, but not by much.

You get what you pay for.

-


----------



## Lass3

Quote:


> Originally Posted by *amstech*
> 
> You might need reading glasses.
> The 870 was at 2.9GHz while the 2500k was at 3.3Ghz, and it lost.
> 
> i5's are great because most developers are not programming their games to take advantage of 12-16 threads, only 4-8.
> I won't argue that i5's are great for the money and sometimes just as fast or faster in benchmarks, but that's not because they actually are, the i7 is being under used.
> The i3 keeping pace with an i5 is the best example.
> 
> As far as my chip, I had an i5 but it ran out of power with Photoshop and Video creation.
> Windows 7 also runs smoother overall with HT on, but not by much.
> 
> You get what you pay for.
> 
> -


Rofl, why don't you stick to comparing the same architectures? You just proved you don't know what you are talking about.

The i5-750 is comparable to the i7-920, at the same clocks too, and the i5 wins. Not that I care about Fallout 3 results in 2014...

Have fun:













Yeah, seems like HT is really worth it in gaming LOL


----------



## HeadlessKnight

Most sites show the 780 and Titan performing between the 280 and 280X, or only very slightly faster. Way to cripple Kepler via drivers so Maxwell seems much better. GG Nvidia ...


----------



## amstech

Quote:


> Originally Posted by *Lass3*
> 
> Rofl, why don't you stick to compairing the same architectures? You just proved you don't know what you are talking about.


I meant the 870, not the 970; I stated its (the 870's) clock speed.
The AMD 970 is obviously clocked much higher and a different animal.

So eager to throw a pun? Little upset? Lol.
The 870 beats the 2500K but is clocked 400 MHz lower.
The i3 matches the i5.

That clear enough for you?








My work is done here, have a nice day.
Quote:


> Originally Posted by *Lass3*
> 
> Yeah, seems like HT is really worth it in gaming LOL


If you want the best.

Thanks for helping me prove my point.
Showing charts of overclocked i5's running games that utilize 4 threads proves nothing!

There's no reason to keep going back and forth; I have said my piece and backed it up with statements/facts from Anandtech/Tom's Hardware.
i7s are king, HT is utilized in gaming (arguing over a 2-8 FPS difference is another topic), and i5s are sweet because most games aren't programmed beyond their capabilities.

-


----------



## Lass3

Quote:


> Originally Posted by *amstech*
> 
> I meant the 870 not 970, I stated its (the 870) clock speed.
> The AMD 970 is obviously clocked much higher and a different animal.
> 
> So eager to throw a pun? Little upset? Lol.
> The 870 beats the 2500K but is clocked 400Mhz lower.
> The i3 matches the i5.
> 
> That clear enough for you?
> 
> 
> 
> 
> 
> 
> 
> 
> My work is done here, have a nice day.
> If you want the best.
> 
> Thanks for helping me prove my point.
> Showing charts of overclocked i5's running games that utilize 4 threads proves nothing!
> 
> -


hahah











The i5-3570K beats 12-threaded i7s in the newest AAA game.









The i5-3570K also delivers 1 fps less than the one-generation-newer i7-4770K that's clocked 100 MHz higher. Very impressive.


----------



## amstech

Quote:


> Originally Posted by *Lass3*
> 
> The i5-3570K also delivers 1 less fps than the one generation newer i7-4770K thats clocked 100 MHz higher. Very impressive.


Those games won't use the power an i7 has to offer.
These will.









Notice the i3 matching the i5, and the 6-core i7 dusting the 4-core i7 even though the 4-core is clocked 100 MHz faster.
The architectures are almost identical.

My 930 @ 4.0 GHz still basically matches a 4770K at 4.0 GHz (in gaming), so even if they were a generation or maybe two apart it wouldn't matter.

http://s51.photobucket.com/user/topenlt1/media/tribescpu.png.html

http://s51.photobucket.com/user/topenlt1/media/cpusky.jpg.html

Source: Techspot

-


----------



## jmcosta

Some games perform better with more cores/threads, others with higher core frequency.
i7s and Extreme chips exist for a reason;
an i5 can bottleneck multi-GPU in BF and a few other games.
These benchmarks (1-minute runs) don't show the real issue when it comes to a lack of CPU power, lol.


----------



## Kuivamaa

Review results are all over the place with this game.


----------



## Seid Dark

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Most sites show the 780 and Titan performing between 280 & 280X, or very slightly faster, way to cripple Kepler by drivers so Maxwell seem much better. GG Nvidia ...


Yes, it seems like Nvidia wants to force everyone to buy Maxwell. Sucks for those of us who still own high-end Kepler.


----------



## criminal

Quote:


> Originally Posted by *Seid Dark*
> 
> Yes, seems like Nvidia wants to force everyone to buy Maxwell. Suck for us who still own high-end Kepler


Good luck with that, Nvidia! I will limp by if it affects a game I actually plan on playing, or I will just skip any game that shows signs of Nvidia gimping performance.

Maxwell is a boring release, so crippling game performance will not sway me in the least to spend money on it.


----------



## thebski

Quote:


> Originally Posted by *Seid Dark*
> 
> Yes, seems like Nvidia wants to force everyone to buy Maxwell. Suck for us who still own high-end Kepler


It does. All they have to do is release a real GPU and I'll give them my money again for their precious Maxwell. I'm not showing up to the milking parlor, though. GM200 or bust. ~$600 for a "next gen" card that is marginally faster than last gen, and likely only due to drivers at that? Screw that, Nvidia.

The really sad thing is they're selling like hotcakes.


----------



## Lass3

Quote:


> Originally Posted by *amstech*
> 
> Those games won't use the power an i7 has to offer.
> These will.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Notice the i3 matching the i5, and 6 core i7 dusting the 4 core i7 even though the 4 core is clocked 100Mhz faster.
> The architecture's are almost identical.
> 
> My 930 @ 4.0Ghz still basically matches a 4770K at 4.0Ghz (gaming), so even if they were a generation or maybe two apart it wouldn't matter.
> 
> http://s51.photobucket.com/user/topenlt1/media/tribescpu.png.html
> 
> http://s51.photobucket.com/user/topenlt1/media/cpusky.jpg.html
> 
> Source: Techspot
> 
> -


Your i7-930 isn't even close to matching an i7-4770K in current games. You must be trolling...


----------



## amstech

Quote:


> Originally Posted by *Lass3*
> 
> Your i7-930 isnt even close to matching a i7-4770K in current games.. You must be trolling...


You need to do some research.
Their *gaming* performance is similar with both chips @ 4.0 GHz.
I had a great review of an i7-920 vs a 3770K, both @ 4.0 GHz, where the difference was 5-10 FPS at most.
Let me see if I can find it.

And BTW, PM me from now on; no use hijacking more of this gentleman's thread.


----------



## Olivon

Quote:


> Originally Posted by *thebski*
> 
> It does. All they have to do is release a real GPU and I'll give them my money again for their precious Maxwell. I'm not showing up to the milking parlor, though. GM200 or bust. ~$600 for a "next gen" card that is marginally faster than last gen and likely only due to drivers at that? ****, Nvidia.
> 
> The really sad thing is they're selling like hotcakes.


Not everyone has a 780/780 Ti, and the GTX 970 is an excellent card for the price. That's why they're selling like hotcakes; the success is really massive.


----------



## daviejams

Quote:


> Originally Posted by *amstech*
> 
> You need to do some research.
> Their *gaming* performance is similar with both chips @ 4.0Ghz.
> I had a great review of a i7 920 vs a 3770k with both @ 4.0Ghz, difference was 5-10 FPS at most.
> Let me see if I can find it
> 
> And BTW PM me from now on, no use axing more of this gentlemans thread.


hahaha nice attempt

Come on, the 3770K is about 20% faster.


----------



## criminal

Quote:


> Originally Posted by *Olivon*
> 
> Not everyone got a 780/780Ti and GTX 970 is an excellent card for the price. That's why they're selling it like hotcakes, success is really massive.


This is true, but Nvidia shouldn't be "gimping" performance of Kepler. (If they are in fact doing so.)


----------



## amstech

Quote:


> Originally Posted by *daviejams*
> 
> hahaha nice attempt
> 
> Come on 3770k is about 20% faster


You guys are making me look smarter than I deserve, but I appreciate it.

http://alienbabeltech.com/main/ivy-bridge-3770k-gaming-results-vs-core-i7-920-at-4-2ghz/5/

http://s51.photobucket.com/user/topenlt1/media/GTX680-games-1.jpg.html

-
http://s51.photobucket.com/user/topenlt1/media/GTX680-games-1a.jpg.html

This enough lessons for one day?


----------



## thebski

Quote:


> Originally Posted by *Olivon*
> 
> Not everyone got a 780/780Ti and GTX 970 is an excellent card for the price. That's why they're selling it like hotcakes, success is really massive.


I will not dispute that the 970 is a great buy. Excellent price/performance. But the 980 is a joke. Sans the driver gimps of Kepler, you can buy the same performance today for $600 that you could a year ago for $700. Yay.


----------



## daviejams

games from years ago = irrelevant


----------



## Lass3

Quote:


> Originally Posted by *amstech*
> 
> You need to do some research.
> Their *gaming* performance is similar with both chips @ 4.0Ghz.
> I had a great review of a i7 920 vs a 3770k with both @ 4.0Ghz, difference was 5-10 FPS at most.
> Let me see if I can find it
> 
> And BTW PM me from now on, no use axing more of this gentlemans thread.


Dude, you have no clue what you're talking about. It's hard to find reviews because your CPU is outdated. Any newer i5 will be faster in games. Besides that, an i5-2500K/3570K/4570K/4690K will OC to 4.5-5 GHz, where your 45nm Bloomfield maxes out around 4 GHz, and its clock-for-clock performance is much lower in the first place.

It's been years since I had an i7-920 at 4.2 GHz... probably 5 years ago... The chipset is very outdated too. I sold it because it had no native SATA 6Gb/s. I went to an i7-2600K OC'd to 5 GHz on custom water, and it was MUCH faster than the i7-920 at 4.2 in EVERYTHING. The 2600K is 3 generations old now, do the math...


----------



## amstech

Quote:


> Originally Posted by *Lass3*
> 
> It's been years since I've had an i7-920 at 4.2 GHz... Probably 5 years ago... Chipset is very outdated too.


But for gaming it's 90% as fast as a 4770K at the same clock speed, even today.
Go ahead and look up recent reviews, the results are the same everywhere.

Sandy and Ivy don't offer a performance boost worth upgrading for from a gaming standpoint.
X99 has been the first substantial improvement and it will be years before games utilize 8 cores/16 threads.

Most users who have upgraded and rant are just pissy their new build ain't all that.
I'll be going to an X99/8-core setup in a year or so when the next-gen GPUs release (this new gen ain't that impressive).

Hey, its just money.

-
Quote:


> Originally Posted by *thebski*
> 
> the 980 is a joke. Sans the driver gimps of Kepler, you can buy the same performance today for $600 that you could a year ago for $700. Yay.


Agreed.


----------



## Lass3

Quote:


> Originally Posted by *amstech*
> 
> But for gaming its 90% as fast at a 4770K at the same clock speed, even today.
> Go ahead and look up recent reviews, the results are the same everywhere.
> 
> Sandy and Ivy don't offer a performance boost worth upgrading for from a gaming standpoint.
> X99 has been the first substantial improvement and it will be years before games utilize 8 cores/16 threads.
> 
> Most users who have upgraded and rant are just pissy their new build ain't all that.
> I'll be going to an X99/8core setup in a year or so when next gen GPU's release. (this new gen ain't that impressive).
> 
> Hey, its just money.
> 
> -
> Agreed.


You are denying the fact that your CPU is outdated..








Sadly we can't look up recent reviews, because your CPU is nowhere to be found. It went EOL many years ago.


----------



## amstech

Quote:


> Originally Posted by *Lass3*
> 
> You are denying the fact that your CPU is outdated..
> 
> 
> 
> 
> 
> 
> 
> '
> .


Outdated from a usage standpoint? Very. I built this rig 5 years ago and I've got my money out of it.
I know my X58 is old.

Outdated from a performance standpoint?
Not even close! My Bloomfield still runs games like a champ and that's all I care about.
Games are easy to run.


----------



## Olivon

Quote:


> Originally Posted by *criminal*
> 
> This is true, but Nvidia shouldn't be "gimping" performance of Kepler. (*If they are in fact doing so*.)


Are they? If the problem is in just one newly released game, I won't call it a problem.
And Nvidia is usually quite responsive with its drivers.
Quote:


> Originally Posted by *amstech*
> 
> Outdated from a performance standpoint?
> Not even close! My Bloomfield still runs games like a champ and that's all I care about.
> Games are easy to run.


In essentially CPU-limited scenes, yes, Bloomfield is outdated (around FX-8350 performance).



http://www.hardware.fr/articles/924-19/indices-performance.html


----------



## incog

Is that guy for real?

In games that utilize hyper-threading and tasks that utilize hyper-threading, yes, the i7 is going to win. That much is obvious? Of course the i7 will do better in tasks that are well-threaded..

The only differences between an overclocked i5 and an overclocked i7 are the hyper-threading and the i7's slightly larger cache. That, and the 50% premium an i7 costs over an i5.

In games that utilize no more than 4 cores, the i5 is the winner because of its price and because it can clock a bit higher (since hyper-threading generates more heat, i5s have a little more thermal headroom ... ON AVERAGE. chip quality probably plays a bigger role here).

Comparing i7s and i5s at different clock-speeds is bad enough, comparing i7s and i5s of different architectures is absolutely terrible.

If you regularly do tasks that are well-threaded such as encoding and need that extra performance, then the i7 is a great chip to get, yes.

If you're looking for a CPU that is centered around gaming, the i5 is better, simply because not a lot of games utilize hyper-threading at all and the i7 is so much more expensive. The $100 or €100 premium you pay for an i7 is better put into getting a stronger graphics card or monitor. If you think that kind of money isn't important, then send me a $100 check and then I'll believe you.

Generally speaking, for gaming, the i5 is the chip to get. I can't name more than two games off the top of my head that use hyper-threading and I wouldn't pay $100 to get a few more frames out of either of them. There are probably more than just those two but you get my point, hopefully.
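The price/performance argument above can be put into numbers. The prices and fps figures below are hypothetical, purely to illustrate the trade-off, not benchmark data:

```python
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per average frame per second delivered."""
    return price_usd / avg_fps

# Hypothetical: an i5 at $240 and an i7 at $340 delivering nearly
# identical fps in a GPU-bound game (made-up numbers for illustration).
i5 = cost_per_frame(240.0, 80.0)
i7 = cost_per_frame(340.0, 82.0)
print(round(i5, 2))  # 3.0 $/fps
print(round(i7, 2))  # 4.15 $/fps
```

When the fps figures are that close, the $100 difference buys almost nothing per frame, which is exactly the case for putting it toward the GPU instead.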


----------



## Lass3

Quote:


> Originally Posted by *incog*
> 
> Is that guy for real?
> 
> In games that utilize hyper-threading and tasks that utilize hyper-threading, yes, the i7 is going to win. That much is obvious? Of course the i7 will do better in tasks that are well-threaded..
> 
> The only difference between an overclocked i5 and an overclocked i7 is that hyper-threading and the i7 has a little more cache. That and the 50% premium an i7 costs over an i5.
> 
> In games that utilize no more than 4 cores, the i5 is the winner because of its price and because it can clock a bit higher (since hyper-threading generates more heat, i5s have a little more thermal headroom ... ON AVERAGE. chip quality probably plays a bigger role here).
> 
> Comparing i7s and i5s at different clock-speeds is bad enough, comparing i7s and i5s of different architectures is absolutely terrible.
> 
> If you regularly do tasks that are well-threaded such as encoding and need that extra performance, then the i7 is a great chip to get, yes.
> 
> If you're looking for a CPU that is centered around gaming, the i5 is better, simply because not a lot of games utilize hyper-threading at all and the i7 is so much more expensive. The $100 or €100 premium you pay for an i7 is better put into getting a stronger graphics card or monitor. If you think that kind of money isn't important, then send me a $100 check and then I'll believe you.
> 
> Generally speaking, for gaming, the i5 is the chip to get. I can't name more than two games off the top of my head that use hyper-threading and I wouldn't pay $100 to get a few more frames out of either of them. There are probably more than just those two but you get my point, hopefully.


You are right, and my 1st link already proves the point:

http://www.anandtech.com/show/8227/devils-canyon-review-intel-core-i7-4790k-and-i5-4690k/5

I think he's trolling, he must be


----------



## amstech

[
Quote:


> Originally Posted by *incog*
> 
> Comparing i7s and i5s at different clock-speeds is bad enough, comparing i7s and i5s of different architectures is absolutely terrible.


I compared i7s and i5s of the same architecture and same clock speed to show that i5s are slower than i7s when the i7 is actually utilized.
I only compared a 930/3770K/4770K to show the minuscule generation-to-generation gaming performance difference at the same clock speed.

Quote:


> Originally Posted by *incog*
> 
> There are probably more than just those two but you get my point, hopefully.


Your points don't conflict or disprove anything I said, talk away







.
And there are about 12-15 games now, some of them major titles, that show a distinct advantage when they have more threads to use.


----------



## Lass3

Quote:


> Originally Posted by *amstech*
> 
> [
> I compared i7's and i5's of the same architecture and same clock speed to show i5's are slower then i7's if utilized.
> I only compared a 930/3770k/4770k to show the minuscule gaming performance difference from gen to gen with even clock speed.
> Your points don't conflict or disprove anything I said, talk away
> 
> 
> 
> 
> 
> 
> 
> .
> And there are about a 12-15 games now, some of them major titles, that show a distinct advantage when they have more threads to use.


Not really. Every single so called "next gen" game shows same performance for i5 and i7 clock for clock.

With Mantle and DirectX 12, CPU performance will even be less important than it is now.

I've got an i7 in my work PC because I can actually utilize HT there, but for a gaming machine it's a waste of money. It makes no difference at all. Why pay more for the same performance?


----------



## Zeion

Quote:


> Originally Posted by *Lass3*
> 
> Not really. Every single so called "next gen" game shows same performance for i5 and i7 clock for clock.
> 
> With Mantle and DirectX 12, CPU performance will even be less important than it is now.
> 
> I've got an i7 in my work PC, because I can actually utilize HT there, but for a gaming machine, it's waste of money. It makes no difference at all. Why pay more for the same performance.


Surprised no one has mentioned Crysis 3; the difference with HT on vs. off in that title is very significant.

http://www.youtube.com/watch?v=I5jYW4sYBOU

I noted the same fps boost with HT enabled on my 3770K vs. with it disabled, or compared to my old 3570K running the same 4.5 GHz overclock, and I'm only running a single GTX 780.

Another video showing the advantage with HT on.

http://www.youtube.com/watch?v=ZJ7eOUA-5VY


----------



## incog

Quote:


> Originally Posted by *Zeion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lass3*
> 
> Not really. Every single so called "next gen" game shows same performance for i5 and i7 clock for clock.
> 
> With Mantle and DirectX 12, CPU performance will even be less important than it is now.
> 
> I've got an i7 in my work PC, because I can actually utilize HT there, but for a gaming machine, it's waste of money. It makes no difference at all. Why pay more for the same performance.
> 
> 
> 
> Surprised no one has mentioned Crysis 3, the difference with HT on vs OFF in that title is very significant.
> 
> http://www.youtube.com/watch?v=I5jYW4sYBOU
> 
> I noted the same fps boost with HT enabled on my 3770K vs with it disabled or compared to my old 3570k running the same 4.5ghz overclock and im only running a single GTX 780.

I didn't explicitly mention Crysis 3 but it's one of the titles I had in mind when talking about games that utilize hyper-threading.

Our friend mentions 12 titles; I'd be curious to hear which ones they are. I know of Crysis 3 and Battlefield 4. I wouldn't mind knowing what the other games are though, for my own culture more than anything else.

An i7 is of course just as good as an i5 when it comes to gaming. It's just that one must always consider price / performance when looking at these things. You could argue that back before the release of Hawaii, Maxwell and the GTX 780 Ti, the GTX Titan was the "best" video-card to get. You would be right in saying so, however when considering the price, the GTX 780 was actually a much better card to get for gaming.

Of course, this is just a discussion about the merits of the i7 against the i5. If you have an i7, you still have an awesome processor which is better overall than the i5. Just for gaming though? The i5 is a stronger choice. Not a stronger processor, it's a stronger choice.


----------



## Zeion

Quote:


> Originally Posted by *incog*
> 
> I didn't explicitly mention Crysis 3 but it's one of the titles I had in mind when talking about games that utilize hyper-threading.
> 
> Our friend mentions 12 titles; I'd be curious to hear which ones they are. I know of Crysis 3 and Battlefield 4. I wouldn't mind knowing what the other games are though, for my own culture more than anything else.
> 
> An i7 is of course just as good as an i5 when it comes to gaming. It's just that one must always consider price / performance when looking at these things. You could argue that back before the release of Hawaii, Maxwell and the GTX 780 Ti, the GTX Titan was the "best" video-card to get. You would be right in saying so, however when considering the price, the GTX 780 was actually a much better card to get for gaming.
> 
> Of course, this is just a discussion about the merits of the i7 against the i5. If you have an i7, you still have an awesome processor which is better overall than the i5. Just for gaming though? The i5 is a stronger choice. Not a stronger processor, it's a stronger choice.


Nothing at all wrong with an i5; I gamed with one for a very long time till I got an awesome deal on my 3770K. Honestly, I only see a real difference with Crysis 3 and BF4 online. I have probably 100 games between Steam and Origin, so that should say quite a lot. Perhaps multithreading support will improve soon and make more of a difference in modern games.


----------



## ZealotKi11er

The GTX 980 is no joke. Unlike the 290 vs. 290X, which only differ in compute units (2560 vs. 2816), the GTX 980 has a larger gap over the GTX 970. Most GTX 970s at their best OC will match a stock GTX 980; a GTX 980 will go way beyond that. A 290 can even beat a 290X with an OC.


----------



## amstech

Thank you Zeion and Incog for the data!
I'm gonna go back to skipping down the corridor with a grin on my face.


----------



## Systemlord

Quote:


> Originally Posted by *Redeemer*
> 
> Gameworks is terrible and Nvidia is purposely crippling Kepler...wow just wow!


I wish accusations like this came with hard facts or proof before posting. Maxwell is very different from the GTX 780 in architecture, and the GTX 780 is just a refreshed GTX 680 (Kepler). Don't you remember the launch of the GTX 680? It was the next big leap in performance over the GTX 580 (Fermi) and put it above all other Nvidia GPUs; that's what's happening again with the GTX 980 (Maxwell). Now I hear members complaining that Nvidia just refreshed the 680 into the GTX 780 and accusing Nvidia of milking us with a refresh. Grow up and give me a break! What a bunch of whiners! Kepler is two years old.

*AnandTech*
Quote:


> As the two year GPU cycle continues in earnest, we've reached the point where NVIDIA is gearing up for their annual desktop *product line refresh*. With the GeForce 600 series proper having *launched over a year ago*, all the way back in March of 2012, most GeForce 600 series products are at or are *approaching a year old*, putting us roughly halfway through Kepler's expected 2 year lifecycle.


----------



## Lass3

Yeah, lets link random youtube videos now..









Look at the GPU usage for god sake.. lawl

http://www.techspot.com/review/642-crysis-3-performance/page6.html

Same performance clock for clock.

Any newer i5 can feed 980 SLI with 100% GPU usage. Especially with OC.
Even at stock my i5-4690K dosnt bottleneck my 970s.

Crysis 3 isnt very CPU demanding.


----------



## incog

Quote:


> Originally Posted by *amstech*
> 
> Thank you Zeion and Incog for the data!
> I'm gonna go back to skipping down the corridor with a grin on my face.


it's like you didn't read my posts at all


----------



## thebski

Quote:


> Originally Posted by *Systemlord*
> 
> the GTX 780 is just a refreshed GTX 680 (Kepler).


It wasn't refreshed. It was a completely different GPU. GTX 770 was the GTX 680 refresh. GTX 780 wasn't the flagship anyways. GTX 780 Ti was way, way faster than a 680. The performance difference between the 700's and 600's (the refresh) was far superior to the difference between the 700's and 900's (the ground breaking new architecture).

Quote:


> Originally Posted by *Systemlord*
> 
> Don't you remember the launch of the GTX 680, it was the next big leap in performance over the GTX 580 (Fermi) and put it above all other Nvidia GPUs, that's what's happening again but with the GTX 980 (Maxwell).


GTX 680 (GK104) was something like 40% faster than GTX 580 (GF110). GTX 980 (GM204) is marginally faster at best over the GTX 780 Ti (GK110). There's nothing spectacular about the GTX 980. GTX 970, on the other hand, is no joke at $330. In Maxwell's defense, it doesn't have the benefit of a new process node like Kepler did, otherwise we'd likely be seeing similar gains that we saw with Kepler over Fermi. That may very well be the reason why 980 is nothing spectacular, but none the less, it's nothing spectacular and doesn't bring a performance class we haven't seen before like the GTX 680 did.

The performance difference between 780 Ti and 980, or more generally Kepler and Maxwell is pretty inexplicable in FC4, something we certainly haven't seen in other titles to this point. That's why you're seeing people accuse Nvidia of hampering Kepler. There isn't that big of a gap between full blown GK110 and full blown GM204 on paper or in any other title.
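The generational comparisons above boil down to simple relative-uplift arithmetic. As a sketch (the FPS figures here are hypothetical, chosen only to land near the rough percentages cited above):

```python
def uplift_pct(new_fps: float, old_fps: float) -> float:
    """How much faster the new card is, as a percentage of the old one."""
    return round((new_fps / old_fps - 1) * 100, 1)

# Hypothetical averages illustrating the gaps described above:
print(uplift_pct(70, 50))  # GTX 680 vs GTX 580: prints 40.0
print(uplift_pct(64, 60))  # GTX 980 vs GTX 780 Ti: prints 6.7
```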


----------



## Zeion

Quote:


> Originally Posted by *Lass3*
> 
> Yeah, lets link random youtube videos now..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Look at the GPU usage for god sake.. lawl
> 
> http://www.techspot.com/review/642-crysis-3-performance/page6.html
> 
> Same performance clock for clock.
> 
> Any newer i5 can feed 980 SLI with 100% GPU usage. Especially with OC.
> Even at stock my i5-4690K dosnt bottleneck my 970s.
> 
> Crysis 3 isnt very CPU demanding.


What rock have you been living under for the past two years? Crysis 3 is very CPU dependent, and those two videos are only a small fraction of the information backing that up. Since when is it wrong to post a video showing direct comparisons with HT on vs. HT off instead of relying on still-shot numbers?

My gosh, you see gameplay video showing the actual fps differences and you still call foul?









I have the game myself and I am well aware of how much better it runs with HT on. Now, you can argue that most games don't take advantage of HT, and I won't argue against that because you're correct there, but you are flat out wrong about Crysis 3.


----------



## Zeion

Quote:


> Originally Posted by *amstech*
> 
> Thank you Zeion and Incog for the data!
> I'm gonna go back to skipping down the corridor with a grin on my face.


No problem, man, but remember: out of all the games that I own, Crysis 3 and BF4 multiplayer are the only two that show a real-world benefit that I have noticed. Crysis 3 in fact gains more from it than BF4.


----------



## Zeion

Let me clear something up, Crysis 3 is extremely GPU bound as well as CPU bound and will eat up however many threads you throw at it... CPU clock speed does not seem to be as important here as multithreading.


----------



## DrBrogbo

Today, I learned that "Farcry 4 benchmarks" really means "argue about i5 vs i7 performance for FOUR FREAKIN PAGES"









Can we please get back on topic? This thread is relevant to my interests, but I'm an inch away from unsubscribing.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Zeion*
> 
> Let me clear something up, Crysis 3 is extremely GPU bound as well as CPU bound and will eat up however many threads you throw at it... CPU clock speed does not seem to be as important here as multithreading.


Crysis 3 is CPU bound if you have 3 x Titans playing @ 1080p with 0AA.


----------



## SoloCamo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Crysis 3 is CPU bound if you have 3 x Titans playing @ 1080p with 0AA.


Not the case.

Going from my FX-9590 to a 4790K at 4GHz (turbo disabled) showed considerable frame rate improvements in quite a few areas on my 290X. Overall, yes, the game is GPU bound, but there are plenty of spots in this game that need CPU power.


----------



## ZealotKi11er

Quote:


> Originally Posted by *SoloCamo*
> 
> Not the case.
> 
> Going from my FX-9590 to a 4790k at 4ghz (disabled turbo) showed considerable frame rate improvements in quite a few areas on my 290x. Overall, yes the game is GPU bound but there are plenty of spots in this game that need CPU power


I am only saying that because my 3770K @ 4.6GHz keeps my GPUs at 99% usage almost all the time, so the GPUs are what's holding me back.


----------



## Systemlord

Quote:


> Originally Posted by *thebski*
> 
> It wasn't refreshed. It was a completely different GPU. GTX 770 was the GTX 680 refresh. GTX 780 wasn't the flagship anyways. GTX 780 Ti was way, way faster than a 680. The performance difference between the 700's and 600's (the refresh) was far superior to the difference between the 700's and 900's (the ground breaking new architecture).
> GTX 680 (GK104) was something like 40% faster than GTX 580 (GF110). GTX 980 (GM204) is marginally faster at best over the GTX 780 Ti (GK110). There's nothing spectacular about the GTX 980. GTX 970, on the other hand, is no joke at $330. In Maxwell's defense, it doesn't have the benefit of a new process node like Kepler did, otherwise we'd likely be seeing similar gains that we saw with Kepler over Fermi. That may very well be the reason why 980 is nothing spectacular, but none the less, it's nothing spectacular and doesn't bring a performance class we haven't seen before like the GTX 680 did.
> 
> The performance difference between 780 Ti and 980, or more generally Kepler and Maxwell is pretty inexplicable in FC4, something we certainly haven't seen in other titles to this point. That's why you're seeing people accuse Nvidia of hampering Kepler. There isn't that big of a gap between full blown GK110 and full blown GM204 on paper or in any other title.


Maxwell hasn't been out for long; I suggest we wait a few months so it can stretch its legs with driver optimization. However, I agree with you on the die shrink for the 680: it was the icing on the cake, and second-gen Maxwell didn't get a die shrink. I can't wait to see big Maxwell, though it's unknown whether it will get a die shrink.


----------



## MK3Steve

About Hyperthreading in Far Cry 4 :


----------



## Zeion

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Crysis 3 is CPU bound if you have 3 x Titans playing @ 1080p with 0AA.


Dude, I gained the same fps increase as you see in those two videos, except I am using one GTX 780. Crysis 3 benefits greatly from hyperthreading or an 8-core CPU.


----------



## MK3Steve

Quote:


> Originally Posted by *Zeion*
> 
> Dude, I gained the same fps increase as you see in those two videos except i am using 1 gtx 780. Crysis 3 benefits greatly from hyperthreading or an 8core CPU.


Same here; I had a good performance boost when enabling HT in Crysis 3 with a single 780 Ti back in the day.


----------



## SoloCamo

Quote:


> Originally Posted by *MK3Steve*
> 
> About Hyperthreading in Far Cry 4 :


Maybe I'm missing something here, but what's the point of a side by side test when you are standing still?


----------



## jmcosta

Quote:


> Originally Posted by *MK3Steve*
> 
> Same here , had a good performance boost when enabling HT in crysis 3 with a single 780 Ti back in the days .


+1
my old i5 had 99% usage in some levels
might still have the screenshot here somewhere...


----------



## thebski

Quote:


> Originally Posted by *Systemlord*
> 
> Maxwell hasn't been out for long, I suggest we wait a few months so it can stretch it's legs with driver optimization. However I agree with you on the die shrink of the 680, it was the icing on the cake and Maxwell 2Gen didn't get a die shrink. I can't wait to see big Maxwell it's unknown if big Maxwell will get a die shrink.


When I say the 780 and 780 Ti weren't refreshes it's because they use GK110 and we didn't see that in the 600s. I know you edited that out so you realized what I was talking about but I wanted to clarify anyways. GTX 770 used GK104 like the 600s so it was indeed a refreshed card that was just pushed further with higher power limits than we saw with the 600s.

I don't know if I really buy into driver optimizations. There is always fuss about driver optimizations but they never really boost performance very much. They really just keep support updated for new games that come out. Maxwell doesn't really stand a chance to be the large performance jump that we're used to seeing generation to generation if it doesn't get a die shrink at some point. It may never get that as it sounds like 20nm is being skipped and 16nm may not be ready until sometime in 2015. At that point NVidia may just decide to move on to Volta.


----------



## MK3Steve

Quote:


> Originally Posted by *SoloCamo*
> 
> Maybe I'm missing something here, but what's the point of a side by side test when you are standing still?


Taking a certain situation, standing still, and comparing HT on vs. off is fine. The world is still working away in the background; it's not like you need to move to put load on the CPU. Same as here:


----------



## ZealotKi11er

Quote:


> Originally Posted by *MK3Steve*
> 
> Taking a certain situation and standing still and compare both HT on & off is ok . The World is still working around in the background . Its not like you need to move to get load on the cpu . Same as here :


Going to check Crysis 3 right now. Maybe I haven't experienced much difference because I used max settings @ 1440p. There were also some BIOS settings that improved Crysis 3 CPU performance.


----------



## MK3Steve

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Going to check Crysis 3 right now. Maybe i have not experienced much difference because i used MAX settings @ 1440p. There was also some BIOS settings that improved Crysis 3 CPU performance.


Be sure to check the grass level when you want to try HT on vs. off. The grass level is where you see the most improvement; in the other levels the improvement isn't that big, if I remember correctly.


----------



## ZealotKi11er

Quote:


> Originally Posted by *MK3Steve*
> 
> Be sure to check the gras level when you wanna try HT on vs off . The grass level is where you see the most improvement . In other levels the improvement isent that big i believe to remember .


That was intense. GPU usage dropped and CPU usage went up to 90%. If every game were like this... Crysis 4, if it ever comes out, needs Mantle.


----------



## MK3Steve

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That was intense. GPU usage dropped and CPU goes up 90%. If every game was like this..... Crysis 4 if it ever comes out needs Mantle.


Hopefully another Crysis game will come out, yes.


----------



## ZealotKi11er

I have played Crysis 3 with 2 x 290 before. I am getting hangs every 10 seconds: a one-second stop, then it goes back to normal. Has anyone else had that problem?


----------



## MK3Steve

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have played Crysis 3 with 2 x 290 before. I am getting hangs every 10s. There is a 1s stop and then goes back to normal. Anyone has that problem.


This thread is about Far Cry 4, dude. I didn't have the problems you described, though...


----------



## ZealotKi11er

Quote:


> Originally Posted by *MK3Steve*
> 
> This thread is about far cry 4 dude . Dident had the problems you described tho ...


Forget FC4. Crysis 3 is 1000 times better looking.


----------



## Zeion

Quote:


> Originally Posted by *MK3Steve*
> 
> Be sure to check the gras level when you wanna try HT on vs off . The grass level is where you see the most improvement . In other levels the improvement isent that big i believe to remember .


That is correct.


----------



## Systemlord

Quote:


> Originally Posted by *thebski*
> 
> When I say the 780 and 780 Ti weren't refreshes it's because they use GK110 and we didn't see that in the 600s. I know you edited that out so you realized what I was talking about but I wanted to clarify anyways. GTX 770 used GK104 like the 600s so it was indeed a refreshed card that was just pushed further with higher power limits than we saw with the 600s.
> 
> I don't know if I really buy into driver optimizations. There is always fuss about driver optimizations but they never really boost performance very much. They really just keep support updated for new games that come out. Maxwell doesn't really stand a chance to be the large performance jump that we're used to seeing generation to generation if it doesn't get a die shrink at some point. It may never get that as it sounds like 20nm is being skipped and 16nm may not be ready until sometime in 2015. At that point NVidia may just decide to move on to Volta.


Yeah, as I typed I realized you were right; if only Nvidia had been able to pull off a die shrink for the GTX 980. I pray that big Maxwell is on a smaller node than 28nm, but I don't think that will happen unless Nvidia waits to release big Maxwell until Q4 2015!


----------



## MK3Steve

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Forget FC4. Crysis 3 is 1000 times better looking.


You can't compare Crysis 3 against Far Cry 4 graphics. Far Cry 4 is a large open world; you have to make compromises so the game looks good and runs well at the same time. If Far Cry 4, with its huge open world, were based on the Crysis 3 engine, no one could play it. I like both Far Cry 4's and Crysis 3's graphics, and to be perfectly honest, so far I've had more "this looks great" moments in Far Cry 4 than when I first played Crysis 3. But the water looks ridiculously bad in Far Cry 4. Am I wrong, or did it look better in Far Cry 3?


----------



## hanzy

Just picked this up.

I gotta say, I am really happy with the performance.
Any form of MSAA or TXAA is a serious FPS killer, though. Running the "Nvidia" preset with AA turned down to SMAA, I average around 70-80 FPS; MSAA 4x drops me to an average of 50-55, and TXAA is close to that.
IQ overall is pretty damn good.
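Taking the midpoints of those ranges (rough approximations, not measured data), the cost of 4x MSAA over SMAA works out to about a 30% frame-rate hit:

```python
def frame_cost_pct(base_fps: float, aa_fps: float) -> float:
    """Percentage of frame rate lost when switching to a heavier AA mode."""
    return round((base_fps - aa_fps) / base_fps * 100, 1)

smaa_avg = (70 + 80) / 2   # 75.0 FPS with SMAA
msaa4_avg = (50 + 55) / 2  # 52.5 FPS with 4x MSAA
print(frame_cost_pct(smaa_avg, msaa4_avg))  # prints 30.0
```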


----------



## xCamoLegend

Quote:


> Originally Posted by *MK3Steve*
> 
> You cant compare Crysis 3 against Far Cry 4 graphics . Far Cry 4 is a large open world , you have to make compromises to make the game look good and at the same tame running good . If Far Cry 4 with its huge open world would be based on crysis 3 engine noone could play it . I like em both far cry 4 and crysis 3 graphics whise and to be perfectly honest with you so far i had more "this looks great" moments on far cry 4 then when i first played crysis 3 . But the water looks ridicolously bad in far cry 4 . Am i wrong or did it even looked better in Far Cry 3 ?


Just because a game is open world doesn't make it more demanding on the GPU. It may be harder on RAM, but with level-of-detail scaling and streaming it's maybe only slightly more demanding than a level-based game; it's not like it's rendering everything on screen in full detail.

CPU side it's probably a lot more demanding though and looking at the threading in Far Cry 4, it's more demanding than it should be. 1 core pegged @ 100% and the others are barely working.


----------



## MK3Steve

Quote:


> Originally Posted by *xCamoLegend*
> 
> Just because a game is open world doesn't make it more demanding. Maybe on RAM but level of detail scaling and streaming make it maybe slightly more demanding than a level based game for the GPU. It's not like it's rendering the entire screen in full detail.
> 
> CPU side it's probably a lot more demanding though and looking at the threading in Far Cry 4, it's more demanding than it should be. 1 core pegged @ 100% and the others are barely working.


Of course creating a proper engine for an open-world title is harder and more demanding than creating an engine for corridor levels like Crysis 3's.


----------



## Systemlord

Quote:


> Originally Posted by *MK3Steve*
> 
> Of course creating a propper engine for an open world title is harder and more demanding then creating a engine for corridor levels like crysis 3 .


Anyone notice how the Crysis games got progressively lower scores as each title was released? Crysis scored 98%, Crysis 2 89%, and all I know is Crysis 3 was even lower. Console development pushed the Crysis engine more toward a corridor game; gone were the open worlds.


----------



## gtarmanrob

I got my copy today and started playing. Really impressed with it so far. I'm running everything on Ultra @ 2560x1140, at 144Hz, with TXAA 2x. I definitely notice a few slowdowns, but nothing game-breaking; if I back off the AA to FXAA or something, I should breeze through it.

Beautiful-looking game; I love the art direction they took with it. It feels very authentic. Also, I now officially hate eagles, so thanks, Ubisoft.


----------



## xCamoLegend

Quote:


> Originally Posted by *MK3Steve*
> 
> Of course creating a propper engine for an open world title is harder and more demanding then creating a engine for corridor levels like crysis 3 .


CryEngine can do open world just fine.

Dunia 2 is a heavily modified s-stutter-fest of a branch, nothing to be proud of. Crytek are probably laughing, even though their games have gone downhill too.


----------



## MK3Steve

Quote:


> Originally Posted by *Systemlord*
> 
> Anyone notice how Crysis games got progressively lower scores as each title was release? Crysis scored 98%, Crysis 2 89% and all I know is Crysis 3 was even lower, the consoles development took the Crysis engine more towards a corridor game. Gone was the open worlds.


Agreed. Crysis 2 and 3 were just more Call of Duty clones in corridors with a bunch of different features. Crysis 3 was better than Crysis 2, but both were far from Crysis 1. I don't know how many times I've played the Crysis 1 campaign, maybe 100 times? Crysis 2 I've played through once (I couldn't even finish it, because the game crashes in a certain level and I can't continue), and Crysis 3 I've played through once too... That's what consoles are doing to PC gaming, and Far Cry 4 is a new example of why games should be made first for PC, then for consoles.


----------



## Zeion

Quote:


> Originally Posted by *MK3Steve*
> 
> Agreed . Crysis 2+3 was just another call of duty clone in another closet with a bunch of different features . Crysis 3 was better then Crysis 2 but both Crysis 2 & 3 was far away from Crysis 1 . I dont know how much ive played crysis 1 campain , maybee 100 times
> 
> 
> 
> 
> 
> 
> 
> ? Crysis 2 ive play´d trough 1 time ( couldent even finish because the game crashes in a certain level and cant continue ) and Crysis 3 i played trough 1 time too .... Thats what consoles doing to pc gaming , and with far cry 4 with have a new example that games should be made first for pc then for consoles .


I played through Crysis 1 plus mods a few times, as well as Warhead. Played through Crysis 2 plus mods a few times as well, and Crysis 3 I have played through more than any of them; IMO that's the best one of the bunch.


----------



## amstech

Quote:


> Originally Posted by *MK3Steve*
> 
> Taking a certain situation and standing still and compare both HT on & off is ok . The World is still working around in the background . Its not like you need to move to get load on the cpu . Same as here :


Very interesting.
I've seen things like this with other games as well, usually with multiplayer battles and all the background things going on.

Also, I wonder if a little i5 can push triple/quad-GPU setups as well as i7s at Eyefinity-type resolutions.


----------



## raghu78

purepc and linustechtips reviews

http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,4
http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,5
http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,6
http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,7
http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,8

https://www.youtube.com/watch?v=dqiOX-4fjdw&list=UUXuqSBlHAE6Xw-yeJA0Tunw

AMD has to fix MSAA performance and Godrays (Gameworks); performance is excellent with SMAA. The GTX 980 is dominant with all settings maxed.


----------



## ZealotKi11er

Quote:


> Originally Posted by *raghu78*
> 
> purepc and linustechtips reviews
> 
> http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,4
> http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,5
> http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,6
> http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,7
> http://www.purepc.pl/karty_graficzne/wymagania_far_cry_4_niskie_test_kart_graficznych_i_procesorow?page=0,8
> 
> https://www.youtube.com/watch?v=dqiOX-4fjdw&list=UUXuqSBlHAE6Xw-yeJA0Tunw
> 
> AMD has to fix MSAA performance and Godrays (Gameworks), performance is excellent with SMAA . GTX 980 is dominant with all settings maxed.


Isn't SMAA better than MSAA?


----------



## daviejams

I prefer smaa over msaa


----------



## ZealotKi11er

Quote:


> Originally Posted by *daviejams*
> 
> I prefer smaa over msaa


Yeah, I noticed that with Crysis 3. SMAA 2x is really good.


----------



## MK3Steve

Hey guys. Many of you know about the mouse acceleration problem, right? The most-clicked thread isn't available anymore, and for now only the thread I started is left, so I'd ask you to take one minute of your time and post again.

Just do some quick post like this:

+1, got the same problem, game is unplayable for me (or post how you feel about this)

Ubisoft needs to know how many people aren't enjoying, or even playing, the game because of this, so we all should post in that thread.

Thanks for your help in case you do so, guys!

Thread :

http://forums.ubi.com/showthread.php/953307-Far-Cry-4-Mouse-mouse-acceleration-game-is-horrible-to-play


----------



## xutnubu

People are saying that the AA used on consoles is godly (HRAA): it basically deals with all the shimmering and, compared to the PC options, is much more budget-friendly.

I don't get why it wasn't implemented on PC.


----------



## incog

Quote:


> Originally Posted by *amstech*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MK3Steve*
> 
> Taking a certain situation and standing still and compare both HT on & off is ok . The World is still working around in the background . Its not like you need to move to get load on the cpu . Same as here :
> 
> 
> 
> 
> 
> 
> 
> 
> Very interesting.
> I've seen things like this with other games as well, usually with multiplayer battles and all the background things going on.
> 
> Also, I wonder if a little i5 can push triple/quad GPU setups as good as i7's at eyefinity type resolutions.
Click to expand...

your refusal to read the posts of the people talking to you and completely ignore what they're actually saying is absolutely astounding


----------



## Redeemer

Quote:


> Originally Posted by *Olivon*
> 
> Not everyone got a 780/780Ti and GTX 970 is an excellent card for the price. That's why they're selling it like hotcakes, success is really massive.


you have actual sales numbers?


----------



## somethingname

I stopped playing; the stuttering is absolutely game-breaking. Not sure how some of you got it running smooth, haha.


----------



## Razor 116

Game runs smooth for me. But you can have all the FPS in the world; with mouse accel that can't be disabled despite doing all the config tweaks, what's the point? I'll be waiting for a fix for this before I get really stuck into it.


----------



## amstech

Quote:


> Originally Posted by *incog*
> 
> your refusal to read the posts of the people talking to you and completely ignore what they're actually saying is absolutely astounding


I've read everyone's post in this thread.
I will politely respond however I want, what I have presented is *inarguable fact.*

-


----------



## strong island 1

Quote:


> Originally Posted by *Redeemer*
> 
> you have actual sales numbers?


I think I remember reading somewhere that the 970 shipments are record breaking.


----------



## raghu78

Quote:


> Originally Posted by *somethingname*
> 
> I stopped playing the stuttering is absolutely game breaking not sure how some of you got it playing smooth haha


The stuttering seems to be a problem on AMD cards. Nvidia cards perform smoothly, according to HardOCP's review.

http://www.hardocp.com/article/2014/11/21/far_cry_4_video_card_performance_iq_preview/7#.VHAcomes98E

*To go with HBAO+ or Enhanced Godrays*

There are two options we have found so far that do impact image quality, and do impact performance. We have not had time to look at the third one yet, which is the Soft Shadow support option for PCSS shadows. The default "Ultra" setting uses Ubisoft's in-house SBCC Ambient Occlusion method. This does look better than Far Cry 3, but you can go one higher with HBAO+. We have found HBAO+ looks really good in this game, especially on the grass.

This game also supports the best Godrays we've ever seen in a game to date. At "Ultra" settings Volumetric Godrays is enabled, this is "fake" Godrays. You can enable "Enhanced" Godrays and the "real thing" is turned on. This looks great in-game, it does dramatically change the environment.

In terms of visual quality, Enhanced Godrays makes a bigger difference in image quality than HBAO+. *In terms of performance, Enhanced Godrays has a bigger impact on performance on the R9 290X and 290. With the GTX 980 and 970 Enhanced Godrays doesn't take much performance hit at all.*

*If you want the best image quality on GTX 980 and 970 enable Enhanced Godrays and sacrifice HBAO+ for SBCC if you must. For the R9 290X and R9 290 if you need more performance make sure you don't use Enhanced Godrays. Now, if you have the performance for it, by all means turn both on, use HBAO+ and Enhnaced Godrays and this game will look amazing.*

*Overall Performance*

The overall performance of this game appears to be more optimized than Far Cry 3. It seems to run better and faster than FC3 at Ultra settings. Having SMAA as an AA option also helps a lot. We think the Dunia engine has been tremendously "tweaked" from FC3 to FC4 and is simply better now.

We did not experience any visual artifacts on any GPU yet. Only an odd VSYNC problem during the beginning intro of the game, which despite having VSYNC turned off locked the framerate at 60fps during the very very beginning part of this game. (Editor's Note: There is a "Sparse" VSYNC setting in this game that is not available in all modes that we need to do more research into and will for our full article.)

*We have found, based on this preview, that running at the "Ultra" preset will net you the same performance on GTX 980 or R9 290X, or GTX 970 and R9 290. It is not until you enable Enhanced Godrays that you will find the GTX 970 and GTX 980 to provide a large performance boost over R9 290X and R9 290. HBAO+ is about the same performance hit on both, but Enhanced Godrays most definitely runs better on NVIDIA GPUs.

In terms of smoothness, GTX 970/980 is a lot better. In terms of what settings are best, "Ultra" does look great. We do recommend turning on Enhanced Godrays if you can, this will make a big impact in the game, and you must manually enable it on AMD GPUs. For NVIDIA GPUs you can do the same, or select the NVIDIA preset, however the NVIDIA preset also enables TXAA4X and this may be too demanding so just use SMAA. We just suggest manually scrolling down and turning on Enhanced Godrays. If you have the performance for it, also turn on HBAO+ and Soft Shadows if you can. Far Cry 4 will look amazing if you have the GPU configuration to run it at maximum settings. Just to be clear, the "Ultra" preset does not give you the best image quality available.*


----------



## gtarmanrob

New drivers are awesome. Except I've got an issue: when it goes night time, I can't see anything, it's just pure black. Dunno what the go is. I have to make sure it's daytime when I do a mission.


----------



## rcfc89

Is anyone else getting weird shadows, almost like a faded imprint on the screen? Very odd. When I look up at the sky there's a very light image of a door that is permanently imprinted on the screen.


----------



## jameschisholm

If they can fix the frame drops and stuttering and give me better performance, I'll be happy. A GTX 780 Ti should destroy this game.


----------



## 352227

Is it possible to use a 2nd GPU like a 750 Ti for all these extra NVIDIA graphics options (godrays etc.)?

Also, the VRAM usage is huge at 1440p, coming out at about 3.9GB on my GTX 980 with the NVIDIA preset and TXAA x2.


----------



## MrWhiteRX7

Stutters on my 780ti and my 290s.


----------



## raghu78

Another Far Cry 4 performance review. AMD cards are doing really well at Ultra settings with SMAA in all reviews. The only setting that hurts AMD cards is Enhanced Godrays, which is a GameWorks feature.

http://www.notebookcheck.com/Far-Cry-4-Benchmarks.130346.0.html


----------



## Krysin

For anyone with a multi-monitor setup, I'm posting this since it might help you.
5760x1080 @ 60Hz
i7-2600K @ 4.4GHz
2x GTX 770 4GB, not OC'd

I can get 60 fps nearly always on stock medium settings. High settings have lots of frame dips; I haven't experimented yet to see which setting is the main cause, but I expect after tweaking it can run high at a 40fps average without issue.

Make sure you update your drivers, people. I forgot to initially and it was terrible! xD


----------



## BusterOddo

Quote:


> Originally Posted by *raghu78*
> 
> Another Far Cry 4 performance review. AMD cards are doing really well at Ultra settings with SMAA in all reviews. The only setting that hurts AMD cards is Enhanced Godrays, which is a GameWorks feature.
> 
> http://www.notebookcheck.com/Far-Cry-4-Benchmarks.130346.0.html


These numbers are spot on with what I have been seeing at the same settings; single OC'd 7970 with a 2600K. Gameplay is very smooth. Enhanced Godrays definitely gives a pretty large performance hit in actual fps, but it also induces a noticeable stutter for me. I have also been using HBAO+, which looks really nice in this game and doesn't carry nearly the same performance cost that Enhanced Godrays does.


----------



## Systemlord

Please stop this nonsense about the morality of killing people and tigers; this is way off topic, take it somewhere else! It seems some of us are stuck in-game, come back to reality.


----------



## AndroidVageta

Quote:


> Originally Posted by *Systemlord*
> 
> Please stop this nonsense about the morality of killing people and tigers; this is way off topic, take it somewhere else! It seems some of us are stuck in-game, come back to reality.


LOL! I feel so bad about turning all those 01100001 01101100 01101001 01110110 01100101 *to* 01100100 01100101 01100001 01100100 !!!


----------



## somethingname

The new patch 1.40 fixed the stuttering on my AMD card and gave me a huge boost in frame rate. I can lock it at 60fps, smooth as buttah!


----------



## azanimefan

Quote:


> Originally Posted by *daviejams*
> 
> Nobody post the CPU benchmarks from that site ?
> 
> 
> 
> i5 2500k in 4th ... greatest CPU ever


holy 2 threaded game batman!

geesh... ubi still can't make a game multithreaded can they?


----------



## Diablosbud

Quote:


> Originally Posted by *azanimefan*
> 
> holy 2 threaded game batman!
> 
> geesh... ubi still can't make a game multithreaded can they?


Yeah, really. An i3-4330 getting more FPS than an FX-9370 is ridiculous. I'm actually laughing at how terribly this game is threaded to produce those results.









----------



## sepiashimmer

Yet those idiots don't allow it to run on non-hyperthreaded dual cores.

There is an unofficial solution though.


----------



## nleksan

Really? You're going to complain about something that is actually not Oopsie-soft screwing up, but a reality for 99 percent of games?
i3s do better than AMD 8xxx/9xxx chips in a great many games; it's nothing new. CPU tasks are not things that can be easily, or in many cases at all, coded in parallel.
High IPC (and Intel has enough of an advantage to maintain superior IPC regardless of stock vs. OC) is, and will for some time yet be, the most important thing to have in a CPU.

On the other hand, my 5GHz 3930K is loading across all 12 threads... so it's possibly an architectural issue?
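The IPC-versus-core-count point above can be put on a back-of-the-envelope footing with Amdahl's law. This is a rough illustrative sketch, not measured Far Cry 4 data; the 70% serial fraction is an assumed number chosen only to show the shape of the tradeoff:

```python
# Amdahl's-law sketch of why per-core speed (IPC * clock) matters more than
# core count when a workload is mostly serial. The 70% serial fraction is an
# illustrative assumption, not a measured figure for any real game.

def speedup(serial_fraction, cores):
    """Amdahl's law: overall speedup when only the parallel portion
    of the workload is spread across `cores` cores."""
    parallel_fraction = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# A mostly-serial engine (assumed 70% serial) gains little from extra cores:
quad = speedup(0.70, 4)   # ~1.29x over one core
octo = speedup(0.70, 8)   # ~1.36x over one core
print(f"4 cores: {quad:.2f}x, 8 cores: {octo:.2f}x")

# A chip with 30% higher per-core throughput speeds up the WHOLE workload
# by 1.30x, which roughly matches doubling from 4 to 8 slower cores here.
print(f"30% faster cores: {1.30:.2f}x on everything")
```

Under this assumption, a faster dual/quad core beats a slower eight-core on a 2-threaded title, which is consistent with the i3-vs-FX results being discussed.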


----------



## AndroidVageta

Oh my God! When are we going to get a freakin' Crossfire patch or something for this game? How long's it been now?


----------



## Diablosbud

Quote:


> Originally Posted by *nleksan*
> 
> Really? You're going to complain about something that is actually not Oopsie-soft screwing up, but a reality for 99 percent of games?
> i3s do better than AMD 8xxx/9xxx chips in a great many games; it's nothing new. CPU tasks are not things that can be easily, or in many cases at all, coded in parallel.
> High IPC (and Intel has enough of an advantage to maintain superior IPC regardless of stock vs. OC) is, and will for some time yet be, the most important thing to have in a CPU.
> 
> On the other hand, my 5GHz 3930K is loading across all 12 threads... so it's possibly an architectural issue?


My point is that a large, wealthy gaming company could have hired programmers good enough to thread it properly. Games like Battlefield 4 and Crysis 3 have even shown benefits from i7s over i5s. If a company with that much money wants to thread a game properly, it can. This was a case of wanting to save money and neglecting performance. I don't really care; I just felt like pointing out that what they did was sloppy even compared to games with significantly smaller budgets.

I would hope it's an architectural issue, because in my opinion that's pretty poor CPU scaling even for a game.
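For context on what "threading a game properly" means in engine terms: modern engines split independent per-frame systems (AI, physics, audio, etc.) into jobs run on a worker pool, rather than one big serial update loop. A toy Python sketch of that idea, purely illustrative and in no way actual engine code; the system names and the 0.016s frame time are invented for the example:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy job-system sketch: independent per-frame systems are submitted as
# jobs to a worker pool and joined before rendering. Real engines use
# fine-grained task graphs, but the structure is the same in spirit.

def update_ai(dt):
    return f"ai updated ({dt}s)"

def update_physics(dt):
    return f"physics updated ({dt}s)"

def update_audio(dt):
    return f"audio updated ({dt}s)"

def run_frame(dt, pool):
    # Kick off all independent systems concurrently...
    jobs = [pool.submit(fn, dt) for fn in (update_ai, update_physics, update_audio)]
    # ...then join them all before the frame can be rendered.
    return [job.result() for job in jobs]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = run_frame(0.016, pool)
    print(results)
```

The catch, as the posts above note, is that this only helps where systems really are independent; anything with cross-system dependencies stays on the serial path.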


----------



## azanimefan

Well, they complain the consoles are tapped out... and those consoles are multi-core devices... so obviously they aren't coding for multi-core consoles yet if the game shows no variation in performance from Intel chip to Intel chip regardless of core count.

I'm not pointing it out because this is abnormal (single- or dual-threaded games); I'm pointing it out because the current-gen consoles demand multithreading, and Ubisoft is complaining that they can't get any more performance out of this gen of consoles, yet every game they release is a 1- or 2-threaded game (AC: Unity was the same as FC4).


----------



## sepiashimmer

Far Cry 4 is also available on the Xbox 360, which has only 3 cores with HT. That might be one of the reasons why the main process runs on a third thread. Or did they deliberately do that to exclude gamers with dual cores?!


----------



## azanimefan

Quote:


> Originally Posted by *sepiashimmer*
> 
> Far Cry 4 is also available on the Xbox 360, which has only 3 cores with HT. That might be one of the reasons why the main process runs on a third thread. Or did they deliberately do that to exclude gamers with dual cores?!


No, Ubisoft has been using the same old code for the better part of a decade and doesn't want to switch to a new game engine. It's simple laziness and penny-pinching. There are game engines out there that make use of multiple threads pretty effectively, but Ubi would rather go with their in-house stuff that's 10 years old than license someone else's stuff that would actually work on a modern console or quad-core CPU.

I have and will continue to insult EA every chance I get (Origin is trash), but at least they have forced their game makers to move onto plausibly multithreaded game engines. Ubi is still stuck in 2005.

Sidenote: there is nothing hyperthreaded about the XB1's core design. The XB1 is a true 6-core CPU, just as the PS4 is a true 8-core CPU. Claiming they're limited to 3-threaded designs due to console specs is wrong. (Furthermore, FC4 isn't 3-threaded; read those numbers again. This is a true 2-thread title that offloads some extra tasks onto other cores if they're available... just like FC3.)


----------



## sepiashimmer

I meant the last-gen Xbox 360, not the XB1. They've released Far Cry 4 on the Xbox 360 too.


----------



## ahnafakeef

Does the new patch give any performance boosts for Nvidia cards?


----------



## deafboy

Quote:


> Originally Posted by *AndroidVageta*
> 
> Oh my God! When are we going to get a freakin Crossfire patch or something for this game? How longs it been now?


So much this...


----------



## raghu78

HardOCP has posted their performance review of Far Cry 4:

http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review/6#.VK64c8kVJ8E

Maxwell performs best, followed by GCN. NVIDIA Kepler cards just cannot compete in this TWIMTBP and GameWorks title. That's quite an embarrassment.


----------



## Robenger

Quote:


> Originally Posted by *raghu78*
> 
> HardOCP has posted their performance review of Far Cry 4:
> 
> http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review/6#.VK64c8kVJ8E
> 
> Maxwell performs best, followed by GCN. NVIDIA Kepler cards just cannot compete in this TWIMTBP and GameWorks title. That's quite an embarrassment.


I found that pretty interesting, especially for the 780 Ti. It's just so much slower than the 970 that you would think it was several generations behind. Also, it was fun seeing how much the GameWorks visual add-ons penalized GCN cards.


----------



## vlps5122

I average 110-120 fps with 3x 780 Ti at 2560x1440, everything maxed, SMAA on. VRAM usage is around 2500MB, usually peaking at about 2800MB.


----------



## criminal

Quote:


> Originally Posted by *raghu78*
> 
> HardOCP has posted their performance review of Far Cry 4:
> 
> http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review/6#.VK64c8kVJ8E
> 
> Maxwell performs best, followed by GCN. NVIDIA Kepler cards just cannot compete in this TWIMTBP and GameWorks title. That's quite an embarrassment.


Kepler performance still seems very strange.


----------



## Neo_Morpheus

On the threading issue, I think it's going to take more time before 4-thread processors are old enough for everyone to move on. Proper threading for consoles and PC will probably arrive together, and that will still take many years. For now, PCs can keep up with a port through raw power alone.


----------

