# [DigitalFoundry] Is It Time To Upgrade Your Core i5 2500K?



## 8800GT

The 2500K might be the best CPU of all time. It falls behind a bit compared to the 6500, but when OC'd it pretty much matches it... not bad for a 5-year-old CPU.


----------



## ZealotKi11er

Been telling people for ages: RAM speed matters in CPU-limited games.


----------



## Cakewalk_S

Considering I'll be picking up a new laptop with a CPU as powerful as my 2500K, I think I'll just use my 2500K till it dies, then use the laptop... 4.5GHz gets me under 120 watts; a laptop 6700HQ will get me the same multithread scores or better at ~35 watts... Not really interested in dumping $400+ into a new chip, motherboard, and memory...


----------



## ZealotKi11er

They are comparing it to the Core i5 6500, which is the best Intel has to offer, and losing 15-20% is no big deal. Also, once again, AMD suffers at 1080p with driver overhead.


----------



## headd

Considering the i5 6500 boosts to 3.3GHz with all cores loaded, it looks like Skylake has ~39% better IPC than Sandy Bridge (a 4.6GHz Sandy matches the stock 3.3GHz i5 6500). If you wanted the same performance as the i5 6500 at 4.6GHz, you would need to overclock Sandy Bridge to about 6.4GHz.
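Sanity-checking that arithmetic, as a hypothetical back-of-the-envelope sketch (it assumes, as the post does, that performance scales linearly with clock speed):

```python
# Implied IPC advantage if a 4.6 GHz Sandy Bridge matches a 3.3 GHz Skylake,
# assuming performance scales linearly with clock speed.
sandy_clock = 4.6    # GHz, overclocked i5 2500K
skylake_clock = 3.3  # GHz, i5 6500 all-core boost

ipc_ratio = sandy_clock / skylake_clock
print(f"Implied Skylake IPC advantage: {ipc_ratio - 1:.0%}")

# Sandy Bridge clock needed to match that same Skylake chip at 4.6 GHz
required = 4.6 * ipc_ratio
print(f"Required Sandy Bridge clock: {required:.1f} GHz")
```

This reproduces the ~39% and ~6.4GHz figures above.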


----------



## xlastshotx

I just switched from a 2600k to a 6600k (soon to have a 6700k). CPU-wise it's not super noticeable; the 6600k is a bit slower at rendering, but that was expected (which is why I am upgrading to the 6700k soon). The 6600k is noticeably faster in games at the same clocks as my 2600k (and same GPU). The RAM is much faster though, and the M.2 slot is excellent. The 2600k/2500k are great, but DDR4 and all of the new features of the current gen finally made it worth it for me to switch. Prices are good too.


----------



## ZealotKi11er

Quote:


> Originally Posted by *xlastshotx*
> 
> I just switched from a 2600k to a 6600k (soon to have a 6700k). CPU-wise it's not super noticeable; the 6600k is a bit slower at rendering, but that was expected (which is why I am upgrading to the 6700k soon). The 6600k is noticeably faster in games at the same clocks as my 2600k (and same GPU). The RAM is much faster though, and the M.2 slot is excellent. The 2600k/2500k are great, but DDR4 and all of the new features of the current gen finally made it worth it for me to switch.


Yes, but some people have Ivy + 2400MHz, making the gap even smaller. The big thing is 1080p. I left 1080p when I had a Core 2 Quad; I've been at 1440p for 4 years and 4K now. A 2500K OC'd will feed even a Titan @ 4K. 4.6GHz was also an easy OC for SB - most did 4.8GHz and some 5GHz. If you're on AMD, on the other hand, and still play at 1080p, it's not good. AMD CPU overhead gets ignored because review sites use a 6700K and 5960X.


----------



## Darklyric

Why does he keep noting that the 3770K is a "drop-in replacement" for a 2500K? Also, I laughed a little inside when people talk about not being able to overclock a 390(X) to match GTX 970/980 levels, when in truth all it really takes is around 1150 core vs 1500 core - the CPU overhead of 1080p and all.


----------



## Scotty99

Quote:


> Originally Posted by *xlastshotx*
> 
> I just switched from a 2600k to a 6600k (soon to have a 6700k). CPU-wise it's not super noticeable; the 6600k is a bit slower at rendering, but that was expected (which is why I am upgrading to the 6700k soon). The 6600k is noticeably faster in games at the same clocks as my 2600k (and same GPU). The RAM is much faster though, and the M.2 slot is excellent. The 2600k/2500k are great, but DDR4 and all of the new features of the current gen finally made it worth it for me to switch. Prices are good too.


You are legit insane lol. You do realize you downgraded, right? OK, so you got 4 more fps in GTA 5 - was that worth the $400+ you spent on a CPU/MB/RAM? lol

This is why I love this forum tbh, people don't give AF lol.


----------



## Scotty99

Quote:


> Originally Posted by *ZealotKi11er*
> 
> He is getting a 6700K so how is that a downgrade?


/facepalm

You do realize he bought the i5 first... right?

So now he is in the position of spending at least $400 on CPU/mobo/RAM, and he's going to try to sell his i5 on Craigslist or something, just to bring his rendering times back down to what they were with his 5-year-old i7 (or maybe he just cares about Cinebench scores, who knows lol).

Not only is his current purchase a downgrade, but he is now in a position where he has to talk to weirdos (Craigslist) and spend another $250 to upgrade to an i7 (assuming he only loses $150 on his i5).


----------



## philhalo66

I wonder how the 3570K holds up. I can't say for modern GPUs, but I haven't had a bottleneck since I got rid of my AMD stuff.


----------



## Assirra

Wow, I am weirded out by that video.
When they showed the BIOS screen, it was exactly the same as mine, from CPU to RAM to the actual BIOS screen.
Even the outdated BIOS was the same.

Anyway, this was really cool and it showed me one thing:
I chose the wrong RAM when upgrading 2 weeks ago; should have gone for more than 1600MHz, it seems.


----------



## philhalo66

Quote:


> Originally Posted by *Assirra*
> 
> Wow, I am weirded out by that video.
> When they showed the BIOS screen, it was exactly the same as mine, from CPU to RAM to the actual BIOS screen.
> Even the outdated BIOS was the same.
> 
> Anyway, this was really cool and it showed me one thing:
> I chose the wrong RAM when upgrading 2 weeks ago; should have gone for more than 1600MHz, it seems.


eBay FTW


----------



## Scotty99

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why would he lose $150 for i5? You lose maybe $20-30 or he can just return it. Also that good is not trying? There are people that say i5 is good enough and those that back up HT.


Mistype, I meant $50. CPUs don't have a return policy, only replacements (clearly I didn't mean $150 when the CPU costs ~$230 new).


----------



## Assirra

Quote:


> Originally Posted by *philhalo66*
> 
> eBay FTW


Well, the store normally has a way to exchange it if it is still new.
Going to try and see; I just noticed that 2400MHz RAM is like 10 euro cheaper.
What a dumbass I was to not look better...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Assirra*
> 
> Well, the store normally has a way to exchange it if it is still new.
> Going to try and see; I just noticed that 2400MHz RAM is like 10 euro cheaper.
> What a dumbass I was to not look better...


The problem has been that people here have been saying 1600MHz is fine for gaming and it does not matter. The thing that puzzles me is that memory bandwidth does not seem to have much of an effect. I mean, X79 had quad channel and so does X99, so you don't need super fast RAM, but the speed of a single channel seems to make more of a difference.

I have used 2400MHz for about 3 years now. You've got nothing to lose if you get faster speeds and don't pay more compared to 1600MHz. Now I would love to test 2800MHz RAM with my system.


----------



## Assirra

Well, I asked for my exchange and should get an answer within 24 hours.
Hopefully they won't be annoying about it since it is already used (for less than 2 weeks, no less).


----------



## 8-Ball

I have an old 2500k and P67 board sitting around. Maybe I should hold onto it?


----------



## zealord

The video is in line with what I've noticed over the last year.

Having an i5 2500K + 290X + 1333MHz RAM, I am definitely hit hard by the bottleneck coming from low RAM speed and AMD CPU overhead.

It is definitely time for an upgrade. New cards and CPUs can't come soon enough.

I have to honestly say that the CPU overhead on AMD cards is really bothering me. Don't get me wrong, I love my 290X - it is actually a powerful card - but damn, I wish it wasn't held back by something software-related.


----------



## Cyro999

Quote:


> They are comparing it to the Core i5 6500, which is the best Intel has to offer


I thought that they were going to compare to the 6600K, as that's the direct replacement for the 2500K, but it seems like an OC'd 2500K (at 4.6GHz in the test, or anywhere near that value) is roughly competing with a stock 6500.

That says a lot, because 6500s run at 3.3GHz when all cores are loaded. An OC'd 6600K will run around 4.65GHz on average - 1.41x the clock speed.
Quote:


> I have to honestly say that the CPU overhead on AMD cards is really bothering me. Don't get me wrong, I love my 290X - it is actually a powerful card - but damn, I wish it wasn't held back by something software-related.


Yeah, we have a bit of an Nvidia monopoly on running some games out there because of this driver CPU-efficiency difference - literally 1.5x the FPS, or stuttering, as shown in the video with FCAT.

Those differences got really big almost 2 years ago and it has just been brushed under the rug - a "let's hope they don't notice until everything is Mantle/DX12/Vulkan" attitude.

Meanwhile, you have to write an essay with half a dozen sources when helping somebody buy the right hardware for those certain games, without appearing biased or being generally attacked on forums, because the 99% (maybe 95% at this point) of gamers who browse forums don't really know about this issue, or that they're getting way lower FPS than somebody with the same CPU and an Nvidia GPU would get.

They generally just look at high CPU load and low GPU load and blame the game engine/developers for their lack of performance; or, funnily enough, blame Nvidia for having much more CPU-efficient DX11 drivers, claiming that they're in bed with the devs and sabotaged the game for everyone else.

It's one of the biggest factors in my owning this GPU and the one before it; I've spent a lot of money and gone far out of my way to get a 15% performance boost in some of my games by upgrading from Haswell to Skylake. It would be a huge insult to buy a more powerful graphics card with a far worse driver, get 1.5x worse performance, and be back at FPS levels not seen since before I upgraded to a Haswell CPU.


----------



## Ithanul

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes, but some people have Ivy + 2400MHz, making the gap even smaller. The big thing is 1080p. I left 1080p when I had a Core 2 Quad; I've been at 1440p for 4 years and 4K now. A 2500K OC'd will feed even a Titan @ 4K. 4.6GHz was also an easy OC for SB - most did 4.8GHz and some 5GHz. If you're on AMD, on the other hand, and still play at 1080p, it's not good. AMD CPU overhead gets ignored because review sites use a 6700K and 5960X.


Yeah, I still kick myself for selling my 2500K. It was my first Intel chip and my first chip to OC. Loved that darn thing, even with my noob go at OCing it. I managed to get 4.8GHz out of that chip.

Oh well, I'm happy with my 4770K. Though I do have a 3930K Sandy Bridge-E I have yet to play around with.


----------



## Kand

When a Dual Core CPU matches a 2500k, I'd say that it's time to upgrade.

>See G3258 vs Q6600.


----------



## Cyro999

Funnily enough, due to bad coding, there are a handful of games that run on a stock Q6600 but will stutter horribly or refuse to start on a G3258 OC'd to the wall - even though it has superior single-threaded and multi-threaded performance and should be an all-around better CPU.


----------



## 45nm

Quote:


> Originally Posted by *8-Ball*
> 
> I have an old 2500k and P67 board sitting around. Maybe I should hold onto it?


You could always re-use it. It's not worth paying more, especially when a Skylake system will require DDR4 and offer minimal gains at best. I used to upgrade more often years ago, but now, with the minimal gains and the focus on power consumption and features, even Socket 775 hardware can still be viable in today's environment. I also have a 2700K and a Z68 motherboard that I am holding on to; it was a great system that I might end up reusing as well. Another problem is finding the 6700K in stock here at affordable prices, due to the USD. It's simply not worth paying for DDR4 and features which I won't use when I can always reuse my 2700K build.


----------



## Ironsight

Looks like I'm going to be running my 2500K into the ground and upgrading from my 8GB of 1600MHz memory. I was contemplating doubling up; good thing I saw this post first.


----------



## 45nm

Quote:


> Originally Posted by *Kand*
> 
> When a Dual Core CPU matches a 2500k, I'd say that it's time to upgrade.
> 
> >See G3258 vs Q6600.


Not really. When a dual-core or a quad-core can offer multi-threaded performance on the level of something like the 5960X, then perhaps it's time to consider an upgrade. I haven't seen that happen, and the 2500K and the 2600K/2700K are still great even when compared against their Haswell/Skylake counterparts.


----------



## 96xj

Man, this is killing me... my 2500K has kicked ass its entire life @ 4.6GHz, until recently.
A failing ASUS motherboard is forcing me to either acquire a comparable used Z77-chipset motherboard - which is relatively crazy expensive for a 4-5 year old board in good used condition - and take a chance that it will fail soon,
or upgrade to a minimum of an LGA1150 processor and Z97 motherboard (for not that much more $). Yeah, $ is a real problem right now.
Interesting how capable the old girl really is; I will really miss my 2500K.


----------



## Kand

Quote:


> Originally Posted by *45nm*
> 
> Not really. When a dual-core or a quad-core can offer multi-threaded performance on the level of something like the 5960X, then perhaps it's time to consider an upgrade. I haven't seen that happen, and the 2500K and the 2600K/2700K are still great even when compared against their Haswell/Skylake counterparts.






I wish someone made an in-depth comparison video!


----------



## Ascii Aficionado

Nope.

I'd need a new GPU long before a CPU (especially @ 1440p).

I've been running my 2500K at stock for 2 years now, since the OC was failing due to my chip degrading every 5 months and the Vcore was creeping closer to levels I wasn't comfortable with.

When that time comes, I'm contemplating keeping the mobo, RAM, CPU, and case, buying an Audigy or Titanium Creative card, using my current GPU (once I get a new one), and then making a super-powered XP machine solely for EAX games, as the Alchemy software is awful compared to actual OS support.


----------



## bobfig

Interesting. I still feel like I don't need to upgrade. I'm happy with that.


----------



## Ascii Aficionado

Quote:


> Originally Posted by *bobfig*
> 
> Interesting. I still feel like I don't need to upgrade. I'm happy with that.


You only need to upgrade if you deem it worth it. I just think many 2500K owners are thinking, "why the hell don't I need to upgrade yet?"


----------



## zealord

There is one important thing people need to realize. For most people it doesn't matter if you run at 50 fps in games; they still consider the CPU to be good. But for other people, like me, it is incredibly frustrating to have a bottleneck you can't circumvent by lowering settings. If you know your GPU can't handle 1080p all maxed out at a stable 60 fps, you lower a couple of settings. But if the limit comes from CPU overhead, then damn, that is annoying.

Some people don't seem to notice the difference between 50 and 60 fps, but for me and many others it is night and day. Back some 8 years ago, when I played CS 1.6 a lot, even the fps dropping from 100 to 97 was heavily noticeable for me and I considered it unplayable (in terms of having a big disadvantage).


----------



## xx9e02

I wonder how much longer I can hold out with my X5650 at 4ghz. Getting the itch to upgrade :[


----------



## Ironsight

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> Nope.
> I've been running my 2500K at stock for 2 years now, since the OC was failing due to my chip degrading every 5 months and the Vcore was creeping closer to levels I wasn't comfortable with.


What voltage and OC were you running that made your chip degrade?


----------



## Ascii Aficionado

Quote:


> Originally Posted by *Ironsight*
> 
> What voltage and OC were you running that made your chip degrade?


It's been so long, but I'm fairly sure it was 1.3v-something with a 4.5GHz OC. I didn't change any other voltages; it was my first and last time overclocking. I ran Prime95 for about 2 hours and it was fine, and I never had any issues other than a BSOD every few months that implied my Vcore wasn't high enough, which a simple nudge upward solved.

I always assumed I could have gone with a much lower Vcore if I had adjusted other settings.

It's been 100% stable at stock. I didn't break it or anything, I just degraded it.

Edit - I posted in a thread for my mobo years ago, I may have some old info there.


----------



## TheReciever

Quote:


> Originally Posted by *xx9e02*
> 
> I wonder how much longer I can hold out with my X5650 at 4ghz. Getting the itch to upgrade :[


The chip is capable of 5.0GHz - maybe time to revisit your machine?

Also, to the above: two hours of Prime95 is anything but proof of stability.


----------



## ASUSfreak

First of all: I'm ... (yesterday was a marriage party and today it's 1AM and I'm feeling ...)

Secondly:

NICE! I remember the days (like 5 years ago???) when there were tests/benchmarks between the 2500K and 2600K, and it turned out they were both "the same" if you only used them for playing games!

So that means if this video from yesterday showed such a nice result for the 2500K, my 2600K is still awesome!

Or does that not make any sense (because I'm drunk?)


----------



## r31ncarnat3d

Oh man it's been five years already? That was fast.

My 2600k is still chugging along just fine at 4.0 GHz. I sure don't miss the constant need to upgrade CPUs before Sandy Bridge came along.


----------



## GANDALFtheGREY

Quote:


> Originally Posted by *ASUSfreak*
> 
> First of all: I'm ... (yesterday was a marriage party and today it's 1AM and I'm feeling ...)
> 
> Secondly:
> 
> NICE! I remember the days (like 5 years ago???) when there were tests/benchmarks between the 2500K and 2600K, and it turned out they were both "the same" if you only used them for playing games!
> 
> So that means if this video from yesterday showed such a nice result for the 2500K, my 2600K is still awesome!
> 
> Or does that not make any sense (because I'm drunk?)


haha nope you're right on the money


----------



## smithydan

2600k vs 6700k


----------



## LuckyStarV

Quote:


> Originally Posted by *headd*
> 
> Considering the i5 6500 boosts to 3.3GHz with all cores loaded, it looks like Skylake has ~39% better IPC than Sandy Bridge (a 4.6GHz Sandy matches the stock 3.3GHz i5 6500). If you wanted the same performance as the i5 6500 at 4.6GHz, you would need to overclock Sandy Bridge to about 6.4GHz.


The biggest issue is that they *overclocked the i5 6500 to 4.5ghz*, which is only 100MHz behind the 4.6GHz they got out of the 2500K. The IPC gain is nowhere near that big; I'd say in the range of 20% max, best-case scenario.

Honestly, given the IPC gains we saw from Sandy to Skylake, I am not surprised the 6500 was doing so well. With Intel locking down OC, all new motherboards now shipping with the locked BIOS, and the less-than-perfect OC anyway (no temp readings + new BIOS fixes AVX), you need to get the 6600K to OC now, making it a much more expensive proposition.
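The two IPC figures floating around this thread both fall out of the same "4.6GHz 2500K roughly matched the 6500" result, depending on which clock the 6500 is assumed to run. A hypothetical sketch, again assuming performance scales linearly with clock:

```python
# Implied Skylake-vs-Sandy IPC gain under two assumptions about the
# i5 6500's actual clock, given that a 4.6 GHz 2500K roughly matched it.
sandy_clock = 4.6  # GHz, overclocked i5 2500K

# stock all-core boost vs. the 4.5 GHz overclock claimed above
for skylake_clock in (3.3, 4.5):
    ipc_gain = sandy_clock / skylake_clock - 1
    print(f"6500 at {skylake_clock} GHz -> implied IPC gain: {ipc_gain:.0%}")
```

At 3.3GHz, the match implies the ~39% figure quoted earlier; at 4.5GHz, it implies only ~2%, well under the ~20% real-world gain estimated above - which is the crux of the disagreement.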


----------



## Clocknut

Quote:


> Originally Posted by *zealord*
> 
> Having an i5 2500K + 290X + 1333MHz RAM, I am definitely hit hard by the bottleneck coming from low RAM speed and AMD CPU overhead.
> 
> It is definitely time for an upgrade. New cards and CPUs can't come soon enough.
> 
> I have to honestly say that the CPU overhead on AMD cards is really bothering me. Don't get me wrong, I love my 290X - it is actually a powerful card - but damn, I wish it wasn't held back by something software-related.


AMD GPUs' CPU overhead = a very, very strong reason not to buy an AMD GPU.

Once again it is the AMD driver team killing their GPU business. Until AMD fixes this to be on par with Nvidia, or DX12 is supported by 80% of games, I will not buy AMD anymore.

Another problem is for Socket 2011 users: they pay such a premium for the top-of-the-line CPU, yet they're not getting the best of everything. You either get more threads or better single-threaded performance; you can't have both.

It feels like Intel should have put the latest architecture on Socket 2011 first, before Socket 115x.


----------



## akromatic

Meh, Sandy i5... bring on LGA1366 and see how that competes with today's gaming rigs.


----------



## NuclearPeace

Quote:


> Originally Posted by *Clocknut*
> 
> AMD GPUs' CPU overhead = a very, very strong reason not to buy an AMD GPU.
> 
> Once again it is the AMD driver team killing their GPU business. Until AMD fixes this to be on par with Nvidia, or DX12 is supported by 80% of games, I will not buy AMD anymore.


I think I read somewhere (maybe in a post by @Mahigan) that part of the reason AMD won't fix the high DX11 overhead is how GCN is designed. I also recall reading that creating a multi-threaded DX11 driver requires a lot of work, something AMD probably can't spare given their low funds.


----------



## Tivan

Quote:


> Originally Posted by *Darklyric*
> 
> Why does he keep noting that the 3770k is a "drop in replacement' for a 2500k? Also I kinda laughed a little inside when people talk about not being able to overclock 390(x) to match gtx 970/980 levels when in truth all it really takes is around 1150 core vs 1500 core - the cpu overhead of 1080p and all.


Because they use the same socket.


----------



## zealord

Quote:


> Originally Posted by *Clocknut*
> 
> AMD GPUs' CPU overhead = a very, very strong reason not to buy an AMD GPU.
> 
> Once again it is the AMD driver team killing their GPU business. Until AMD fixes this to be on par with Nvidia, or DX12 is supported by 80% of games, I will not buy AMD anymore.


I got the 290X super cheap from a friend. It outclassed my previous GTX 680 big time, so I couldn't say no to that deal, especially since the 2GB of VRAM on the 680 was beginning to be noticeable.

I will definitely take a very close look at Pascal and Polaris this year. I really don't care whether it's AMD or Nvidia; I'll wait for both to come out and decide what is best then. I like the philosophy AMD has shown lately in terms of pushing open standards, compared to Nvidia, who seem more like Apple in terms of proprietary software. AMD also has really great support for older cards, and in the long run AMD cards always feel like a good choice.
On the other hand, Nvidia has optimized drivers the day a game releases because they partner up with the developers. You are also on the "safe side" with Nvidia when they play dirty with GameWorks and such.

I hope AMD steps up their game with Polaris and that CPU overhead for DX11 isn't an issue anymore, because then the choice will be somewhat easier. (Someone here on OCN said that AMD is going to decrease CPU overhead with Polaris, though. So I am excited!)


----------



## jsc1973

I've always thought it was pretty simple. If the CPU is performing satisfactorily in all you ask of it, then you don't need to upgrade. If not, then you need to upgrade.

There are some people who are still running a C2D and it's good enough for their needs. Others might not have their needs met with anything less than an X99 or Skylake i7. Either way, no one knows better than the user.


----------



## Cakewalk_S

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> Nope.
> 
> I'd need a new GPU long before a CPU (especially @ 1440p).
> 
> I've been running my 2500K at stock for 2 years now, since the OC was failing due to my chip degrading every 5 months and the Vcore was creeping closer to levels I wasn't comfortable with.
> 
> When that time comes, I'm contemplating keeping the mobo, RAM, CPU, and case, buying an Audigy or Titanium Creative card, using my current GPU (once I get a new one), and then making a super-powered XP machine solely for EAX games, as the Alchemy software is awful compared to actual OS support.


How's that even possible? I've been running my 2500K at 4.5GHz for 4 years... no sign of degradation.


----------



## jsc1973

Quote:


> Originally Posted by *Cakewalk_S*
> 
> How's that even possible? I've been running my 2500K at 4.5GHz for 4 years... no sign of degradation.


Different chips function differently, depending on the quality of the silicon wafer they came from.

That said, I have to wonder if it was degradation of the 2500K, or gradual degradation of the capacitors on the motherboard. I bet it was the latter, and if you put that chip on a healthy motherboard, it would work just like it did before.


----------



## Mad Pistol

If I had known that Intel was going to drag its feet so much after Sandy Bridge, I would have told people to get the i7 2600k instead of the 2500k.


----------



## Scotty99

Quote:


> Originally Posted by *Mad Pistol*
> 
> If I had known that Intel was going to drag its feet so much after Sandy Bridge, I would have told people to get the i7 2600k instead of the 2500k.


While I understand the logic, usage scenarios would still determine this in 2016.

I play MMOs most of the time; a 2600K wouldn't increase my FPS by even 1.


----------



## Ironsight

Quote:


> Originally Posted by *Mad Pistol*
> 
> If I had known that Intel was going to drag its feet so much after Sandy Bridge, I would have told people to get the i7 2600k instead of the 2500k.


QFT!


----------



## rdr09

Quote:


> Originally Posted by *zealord*
> 
> The video is in line with what I've noticed over the last year.
> 
> Having an i5 2500K + 290X + *1333MHz RAM*, I am definitely hit hard by the bottleneck coming from low RAM speed and AMD CPU overhead.
> 
> It is definitely time for an upgrade. New cards and CPUs can't come soon enough.
> 
> I have to honestly say that the CPU overhead on AMD cards is really bothering me. Don't get me wrong, I love my 290X - it is actually a powerful card - but damn, I wish it wasn't held back by something software-related.


You brought it upon yourself. Even if you had a GTX 980 you'd get bottlenecked. Never did I need to turn on HT on my i7 when I only had a single 290.

Even my tri-core AMD system uses 1600 RAM, lol.


----------



## Darklyric

Quote:


> Originally Posted by *Tivan*
> 
> Because they use the same socket.










****...


----------



## magnek

In the Crysis 3 test @ 5:30, where they compare the 2500K vs 3770K vs 6500, the results are fairly meaningless. After Crysis 3's multithreading patch, HT showed significant gains, so they should've used a 3570K for the test instead. And what was the point of running RAM at 3 different speeds (2133, 2400, 3200)?


----------



## ZealotKi11er

Quote:


> Originally Posted by *magnek*
> 
> In the Crysis 3 test @ 5:30, where they compare the 2500K vs 3770K vs 6500, the results are fairly meaningless. After Crysis 3's multithreading patch, HT showed significant gains, so they should've used a 3570K for the test instead. And what was the point of running RAM at 3 different speeds (2133, 2400, 3200)?


They were showing the best CPU you can get if you don't wish to upgrade your motherboard/RAM.


----------



## zealord

Quote:


> Originally Posted by *rdr09*
> 
> You put it upon yourself. even if you have a GTX 980 you'll get bottlenecked. Never did i have the need to turn on HT on my i7 when i only had a single 290.
> *
> Even my Tri-core AMD system uses 1600 RAM.lol*


Good for you.


----------



## dragneel

Right now I don't feel like it's time at all, but I will be buying a higher-end GPU when Pascal and Polaris arrive, so I'm not sure. Am I actually going to need PCI-E 3, etc.? How bad will the CPU bottleneck be? How will DX12 affect it, if at all? Oh well, new GPU and PSU first; then I'll decide on upgrading everything else later on.


----------



## rdr09

Quote:


> Originally Posted by *zealord*
> 
> Good for you.


http://www.overclock.net/t/1587616/i5-4690k-100-usage-gaming-temps-fine-windows-10

Good for him. He's got 2133 RAM.


----------



## Assirra

OK, now I know why I went for the 1600MHz.
I was looking into whether it made a difference and came upon a video from Linus where it was concluded that it doesn't matter.
Granted, that video was from 2013, but I thought it was still right.


----------



## Scotty99

Quote:


> Originally Posted by *Assirra*
> 
> OK, now I know why I went for the 1600MHz.
> I was looking into whether it made a difference and came upon a video from Linus where it was concluded that it doesn't matter.
> Granted, that video was from 2013, but I thought it was still right.


Don't take the video as gospel. Not all 1600 RAM is created equal.

In my signature you can see my RAM - the highest-rated CAS 8 RAM on Newegg's site (when I bought it in 2011 it had like 30 reviews). CAS 8 makes a difference (it indicates quality, and it's likely binned, as with CPUs). I doubt I'd see FPS increases if I went with a set of 2133 RAM (I mean, maybe 1-2, and even though RAM is silly cheap right now, the price/performance ratio still does not line up).


----------



## Ascii Aficionado

Quote:


> Originally Posted by *Assirra*
> 
> OK, now I know why I went for the 1600MHz.
> I was looking into whether it made a difference and came upon a video from Linus where it was concluded that it doesn't matter.
> Granted, that video was from 2013, but I thought it was still right.


Most people did, including me. I went with 1333


----------



## Kand

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> Most people did, including me. I went with 1333


http://www.bit-tech.net/hardware/memory/2011/01/11/the-best-memory-for-sandy-bridge/1

I had this at the time. I guess games weren't as demanding and graphics cards weren't powerful enough to cause noticeable CPU bottlenecking back then!


----------



## Alvarado

Great, I thought I would go with a full skylake build but now I don't know, thanks OP. +rep.


----------



## Kand

Quote:


> Originally Posted by *Alvarado*
> 
> Great, I thought i would go with a full skylake build but now I don't know, thanks OP. +rep.


Hold it until 2017.


----------



## hyp36rmax

I held onto my i5 2500K for my test bench. My main rig has a 5820K, my LAN rig a 4770K. I've been wanting to try Skylake for a concept air build I had in mind.


----------



## Alvarado

Quote:


> Originally Posted by *Kand*
> 
> Hold it until 2017.


I was gonna get SLI 980 Tis and figured I'd go along with a Skylake overhaul. Maybe I should wait.


----------



## Kand

Quote:


> Originally Posted by *Alvarado*
> 
> I was gonna get SLI 980 Tis and figured I'd go along with a Skylake overhaul. Maybe I should wait.


Pascal first. Cannonlake later.


----------



## Clocknut

Quote:


> Originally Posted by *Mad Pistol*
> 
> If I had known that Intel was going to drag its feet so much after Sandy Bridge, I would have told people to get the i7 2600k instead of the 2500k.


or 3930K

Last time, I got my 2500K to use as a temporary upgrade. I was expecting the 3770K to be significantly better... and it turned out to be meh, lol.


----------



## Alvarado

Quote:


> Originally Posted by *Kand*
> 
> Pascal first. Cannonlake later.


Off-topic, but I wish I could wait for Pascal; my monitor is dying on me (it randomly turns off, with a burnt smell at the top-right vent), so yeah... hence I'd grab one of those Acer Predator monitors.


----------



## 96xj

Quote:


> Originally Posted by *Alvarado*
> 
> Off-topic, but I wish I could wait for Pascal; my monitor is dying on me (it randomly turns off, with a burnt smell at the top-right vent), so yeah... hence I'd grab one of those Acer Predator monitors.


I love the smell of burnt capacitors in the morning.


----------



## Majin SSJ Eric

Yep, there's really no pressing need even today to upgrade an old, good-overclocking SB platform if you are just gaming (unless you are running 1333MHz memory or are using an AMD GPU). My old 2600K still games very well on a 270X at 1080p, and I imagine if I were to throw something like a 970 and a 1440p monitor at it, it would do nearly as well as the new platforms. SB will probably end up being remembered as one of the greatest CPU architectures ever made...


----------



## Scotty99

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yep, there's really no pressing need even today to upgrade an old, good-overclocking SB platform if you are just gaming (unless you are running 1333MHz memory or are using an AMD GPU). My old 2600K still games very well on a 270X at 1080p, and I imagine if I were to throw something like a 970 and a 1440p monitor at it, it would do nearly as well as the new platforms. *SB will probably end up being remembered as one of the greatest CPU architectures ever made*...


I hate boasting, but I said this when Ivy Bridge came out. I knew Sandy was the benchmark when Ivy went to crap on thermals. What enthusiast would take a 10% IPC gain over a CPU you could clock to the heavens and beyond without outrunning your cooling solution? It's just coincidence that I needed a PC in 2011, but I'm glad that was when I needed to build; Sandy Bridge is the peak of Intel's accomplishments (especially when you take thermals into account, which matter).


----------



## djriful

That is the most in-depth benchmark and detailed presentation I've ever seen...


----------



## djriful

*Is It Time To Upgrade Your Core i7 3930K?*


----------



## akromatic

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yep, there's really no pressing need even today to upgrade an old, good-overclocking SB platform if you are just gaming (unless you are running 1333MHz memory or are using an AMD GPU). My old 2600K still games very well on a 270X at 1080p, and I imagine if I were to throw something like a 970 and a 1440p monitor at it, it would do nearly as well as the new platforms. SB will probably end up being remembered as one of the greatest CPU architectures ever made...


O_O I'd say Nehalem is.

Good ol' BIOS without UEFI shenanigans, good ol' BCLK overclocking, available in hyperthreaded six-core flavors, and when overclocked it's still competitive with today's CPUs.


----------



## Clocknut

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yep, there's really no pressing need even today to upgrade an old, good-overclocking SB platform if you are just gaming (unless you are running 1333MHz memory or are using an AMD GPU). My old 2600K still games very well on a 270X at 1080p, and I imagine if I were to throw something like a 970 and a 1440p monitor at it, it would do nearly as well as the new platforms. SB will probably end up being remembered as one of the greatest CPU architectures ever made...


2500K/2600K will be the new Q9650.


----------



## TheReciever

Yeah, I remember when Sandy Bridge came out and everyone was saying that while it had its perks, it was a sidegrade from X58. Then later came the massive price drop on the X5650/L5639; unless your motherboard blows, you still don't need to upgrade.

X58 has stood the test of time, I would say, over 1155.


----------



## Kand

Quote:


> Originally Posted by *TheReciever*
> 
> Yeah, I remember when Sandy Bridge came out and everyone was saying that while it had its perks, it was a sidegrade from X58. Then later came the massive price drop on the X5650/L5639; unless your motherboard blows, you still don't need to upgrade.
> 
> X58 has stood the test of time, I would say, over 1155.


Sandy Bridge a sidegrade to Nehalem?

That's news to me!

http://www.anandtech.com/bench/product/47?vs=287


----------



## mechtech

Quote:


> Originally Posted by *TheReciever*
> 
> Yeah, I remember when Sandy Bridge came out and everyone was saying that while it had its perks, it was a sidegrade from X58. Then later came the massive price drop on the X5650/L5639; unless your motherboard blows, you still don't need to upgrade.
> 
> X58 has stood the test of time, I would say, over 1155.


Definitely. I clearly remember that when the 2500K came out, it was seen as just another incremental upgrade. Par for the course. The 920, a generation before it, was a huge step forward from what came before.

Sadly, the 2500k seems so impressive because of the slow rate of CPU performance gains in the last 5 years, not because of any inherent greatness of its architecture.


----------



## DiNet

Quote:


> Originally Posted by *TheReciever*
> 
> Yeah, I remember when Sandy Bridge came out and everyone was saying that while it had its perks, it was a sidegrade from X58. Then later came the massive price drop on the X5650/L5639; unless your motherboard blows, you still don't need to upgrade.
> 
> X58 has stood the test of time, I would say, over 1155.


Same here. From what I remember, Sandy was quoted as being 5% faster than Nehalem at best. And the next one was 5% over Sandy, and so on... with some added features, but clearly not recommended as a direct upgrade path.
I held out until now on a 950 and was fine, great even. It seemed appropriate to upgrade now in the wake of VR.


----------



## Peanuts4

I sold my 2500K for more than I paid for it. I love the i5 2500K. Thanks to consoles we're stuck in graphics purgatory; I mean, seriously, DX9 for over 10 years, and DX10 and DX11 were a gimmick to sell new OSes to gamers. One day we'll have great games again, although they'll probably be VR and hopefully have nothing to do with Microsoft.


----------



## robbo2

What made SB so great was the 4.5GHz+ clocks and low running temps. There was no need for a full loop to get the most from your chip. The chip itself was also a great price compared to what Skylake is selling for now.


----------



## Defoler

The 2500K isn't that legendary.
You can clearly see that compared to an overclocked 6500 with overclocked DDR4, you gain quite a few more FPS. That can be translated into higher graphics settings or a higher resolution.
The 2500K isn't bad. It can still hold its own, but sticking with it does limit you if you plan to upgrade your peripheral hardware, or if you run into a game you want better visuals from.


----------



## Alex132

2500k strongk









Glad I got one.


----------



## thegreatsquare

Hmmm... I should look into upgrading to 2133MHz RAM.

O/T: The i5 2500k was definitely a jackpot of a CPU.


----------



## LongRod

Good to see the 2500K still pulling its own weight even after 5 years. I don't think I would've upgraded until Zen came out had my 2500K's motherboard not finally bitten the dust.

It's just not cost-efficient to buy a decent Z77 board on eBay for $150 when my MSI Z97 board was $109... once I sell my 2500K, the CPU upgrade will really only have cost $80, but I am lucky enough to live near a Microcenter.
Quote:


> Originally Posted by *thegreatsquare*
> 
> Hmmm... I should look into upgrading to 2133MHz RAM.
> 
> O/T: The i5 2500k was definitely a jackpot of a CPU.


I should probably look into that too.


----------



## Imouto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Been telling people for ages. RAM speeds matters in CPU limited games.


How long? Because Sandy Bridge had an outstanding memory controller, the fastest card you could pair it with at the time was a frigging GTX 280, and games weren't that CPU-demanding back then.

For a proper comparison, while they're at it, maybe they should have given us the latency numbers instead of throwing some pretty graphs at us. I'm betting these guys didn't even bother lowering the latencies on that kit for the 1600 results.

Most of the time you're trading speed for latency, so a good 1600 kit shouldn't take such a performance hit.

And for the sake of saying it out loud, the kit DF tested was $230 at launch.
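For anyone who wants to sanity-check that speed-for-latency tradeoff, first-word latency is just CAS cycles divided by the memory clock (half the MT/s transfer rate). A quick back-of-the-envelope sketch; the kit timings are just illustrative examples, not the ones DF tested:

```python
def first_word_latency_ns(transfer_rate_mts, cas):
    """True latency in ns: CAS cycles divided by the I/O clock (half the MT/s rate)."""
    clock_mhz = transfer_rate_mts / 2
    return cas / clock_mhz * 1000  # cycles / MHz = microseconds; x1000 = ns

# A "slow" DDR3-1600 CL9 kit vs a "fast" DDR3-2133 CL11 kit:
print(first_word_latency_ns(1600, 9))   # 11.25 ns
print(first_word_latency_ns(2133, 11))  # ~10.3 ns
```

So a tight 1600 kit really does close most of the latency gap, which is the point above; raw bandwidth is where the faster kit keeps its lead.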


----------



## 364901

The discussion on 1600MHz with tighter timings reminded me of Anandtech's scaling tests with Haswell and DDR3 memory.



DDR3-2400 with a CAS latency of 9 is the best result Haswell can push out.


----------



## benbenkr

Lol.
So now people suddenly believe that RAM speed matters? Where are all the keyboard warriors who kept saying otherwise?

For years people insisted it was "debunked" that higher RAM speed = better performance in games. Love to see those naysayers now.


----------



## Kand

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Been telling people for ages. RAM speeds matters in CPU limited games.


RAM speed matters.

If your GPU can manage to make the CPU the bottleneck.


----------



## Alex132

Eh, I have a 2500K and 2133MHz RAM. I guess I could run some benchmarks if people are really interested (i.e., different RAM speeds and how they impact games).


----------



## Blameless

Even my original Nehalems hold up well. I have an i7 920 system (@ 4GHz) and a Xeon W3540 (@ 3.6GHz) that I just gave away, neither of which is wanting for CPU power in most games. The parts are from 2009.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> You got nothing to lose if you get faster speeds and did not pay more compared to 1600MHz. Now I would love to test 2800MHz RAM with my system.


I have a slew of older CPUs that still hold up well, but I'm not going to pair them with faster memory because that would require further expenditure on old platforms.

Most of my surviving LGA-1366 setups are running memory clocked between 1333 and 1600MT/s. The platform doesn't really handle faster stuff well, and I've sold off most of my faster DDR3 at this point.


----------



## Chobbit

You've brought back fond memories. I loved my 2500K; it was the biggest step up over a previous chip I've ever had (especially @ 4.8), and I've never had a jump like it since. It ended up in my wife's PC with a 7970 and it still held its own. Okay, it's not even close to my current setup, but CPU-wise it took a number of generations to feel like I'd had an upgrade. I still sold mine for decent money, but now I miss that legend.


----------



## r31ncarnat3d

Quote:


> Originally Posted by *Chobbit*
> 
> You've brought back loving memories, i loved my 2500k it was the biggest step up in terms of beating my previous chip (especially @ 4.8) and i've never had it since.


Haha no kidding. I jumped from a 1090t to a 2600k myself. Such an amazing jump.


----------



## Alex132

Quote:


> Originally Posted by *r31ncarnat3d*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chobbit*
> 
> You've brought back loving memories, i loved my 2500k it was the biggest step up in terms of beating my previous chip (especially @ 4.8) and i've never had it since.
> 
> 
> 
> Haha no kidding. I jumped from a 1090t to a 2600k myself. Such an amazing jump.
Click to expand...

I went from a 965BE to a 2500k, the jump in minimum game FPS was astounding









SC2 was suddenly a whole new game.


----------



## jologskyblues

I still remember how disappointed I was with the Bulldozer launch, after looking forward to it being the savior of AMD with all the hype around it following the failure of the Barcelona architecture against Conroe/Nehalem. So much so that, on impulse, I bought the 2500K to replace the Phenom II X4 945 in my system. My first Intel processor in 10 years, after being such a hardcore AMD CPU loyalist.

The 2500K was a revelation. So easy to OC and everything just got faster, snappier and more responsive with it. GTA4, which I was playing at the time got very noticeably smoother with about a 10-15fps increase in the CPU limited parts just by going with the OC'd 2500K where the previous CPU was bottlenecking the GTX 580.

After four years, the Z68 board I was using failed last year due to a faulty PSU so I took the opportunity to upgrade to a newer platform. I just wish it failed a few months later so I could have jumped on the Skylake bandwagon.


----------



## 364901

Quote:


> Originally Posted by *benbenkr*
> 
> It has been debunked for years that higher RAM speed = better performance in games. Love to see those naysayers now.


What was true in the past isn't necessarily true today. Faster RAM does have an effect on today's chips, and it benefits anyone concerned about minimum framerates on Skylake, or who simply wants a snappier system. DDR4-3000 at CAS 15 should be the starting sweet spot for enthusiasts building a Skylake system on a Z170 motherboard.

Edit: Also, none of these tests take into account performance differences in multiplayer games. I'm sure that with a better testing methodology for those titles, we'd be able to see the performance differences in these CPU-limited games more easily.


----------



## Scotty99

It's funny how many people in here have said "I miss my 2500K" or have fond memories of it... why did you upgrade, lol?

I'm still rocking mine; bought it in August 2011. Not a single Intel CPU release has impressed me since then, but the kicker is being able to clock it at 4.6GHz at 1.3V with a Hyper 212+ and never go over 50C in games, 60C in a stress test.

My rig is exactly the same as in 2011, except I upgraded to a GTX 760 about a year ago from my original 465.

(Yes, that is a Biostar motherboard that has lasted 5 years, lol; it was 20 dollars when bundled at Microcenter.)


----------



## thegreatsquare

Quote:


> Originally Posted by *Alex132*
> 
> (yes that is a biostar motherboard that has lasted 5 years lol, it was 20 dollars when bundled at microcenter)


...that's probably deserving of its own news story.


----------



## Scotty99

Quote:


> Originally Posted by *thegreatsquare*
> 
> ...that's probably deserving of its own news story.


LOL

I'm baffled as to how it's still going. What's crazier is that I've sadly been a smoker (indoors, with an air purifier) that entire time. It should probably be in a museum, lol.


----------



## Viscerous

Another CPU video with a major flaw: it's missing the games where the CPU is the main factor. When I upgraded my system, I didn't notice any major difference in FPS in The Witcher 3 over my old i7 920 with the same GTX 970. However, one of the games I was playing at the time was Heroes of the Storm, an RTS; I went from dropping below 30 at times to never dipping below 60. MMOs are another major genre that usually benefits from new CPUs. Guild Wars 2 is a major example where the CPU helps, though one like Final Fantasy XIV actually tends to use the GPU quite well. This video is basically the best-case scenario for showing a 2500K doing well.


----------



## Alex132

Quote:


> Originally Posted by *thegreatsquare*
> 
> ...that's probably deserving of its own news story.


You messed up your quotes lol


----------



## Murlocke

If I were on a 2500K, I would probably upgrade. New games do get some pretty decent gains over it in the min-FPS department, which is the most important thing.


----------



## Glottis

Quote:


> Originally Posted by *Scotty99*
> 
> Its funny how many people have said in here i miss my 2500k or just have good memories of it.....why did you upgrade lol.
> 
> Im still rocking mine, bought it august of 2011. Not a single intel CPU release has impressed me since then, but the kicker is being able to clock a CPU at 4.6ghz at 1.3v with a hyper 212+, and never go over 50c in games, 60c in a stress test.
> 
> My rig is the exact same as 2011 except i upgraded to a gtx 760 about a year ago from my original 465.
> 
> (yes that is a biostar motherboard that has lasted 5 years lol, it was 20 dollars when bundled at microcenter)


Good that you still find your 2500K good enough, but most people here upgraded because we run 980 Tis or Titan Xs and need appropriate CPU horsepower not to bottleneck these nice cards. Just because we are nostalgic about our old 2500Ks doesn't mean we can't appreciate the superiority of the modern X99 or Z170 platforms.


----------



## hollowtek

Nah. Upgrading the 2500K won't be cheap: Skylake + mobo + DDR4 = more than I have.


----------



## clerick

I think I'll finally upgrade once the Pascal GPUs are out, but this was still a nice video, and it's amazing how well the 2500K is still able to hold up in games!


----------



## Scotty99

Quote:


> Originally Posted by *Glottis*
> 
> good that you still find your 2500K good enough, but most people here upgraded because we run 980Tis or TitanXs and need appropriate CPU horsepower to not bottleneck these nice cards. just because we are nostalgic about our old 2500Ks doesn't mean we can't understand superiority of modern X99 or Z170 platforms.


I understand this forum is called overclock.net, but do people really use benchmarks and such to decide whether they need to upgrade or not?

No game has told me to upgrade with stutters or anything like that; that's why I never upgraded. "Bottleneck" is a weird word; it actually makes no sense in terms of hardware.


----------



## DiNet

Quote:


> Originally Posted by *Scotty99*
> 
> I understand this forum is called overclock.net, but do people really use benchmarks and stuff to decide whether they need to upgrade or not?
> 
> No game has told me to upgrade due to stutters or anything like that, thats why i never upgraded. Bottleneck is a weird word, it actually makes no sense in terms of hardware.


https://www.google.co.il/search?q=hardware+bottleneck&ie=utf-8&oe=utf-8&gws_rd=cr&ei=hLTJVs6SDsGIaZvAg6gH


----------



## Scotty99

I understand what the word means, lol.

Just saying, if you actually think about it and don't take others' word for it, you'll realize it means nothing in terms of PC hardware. Of course your system will be faster if you upgrade a part, but it's not like one component can magically make another component slower.

People need to think more and watch fewer videos.


----------



## zealord

Quote:


> Originally Posted by *Scotty99*
> 
> I understand this forum is called overclock.net, *but do people really use benchmarks and stuff to decide whether they need to upgrade or not?*
> 
> No game has told me to upgrade due to stutters or anything like that, thats why i never upgraded. Bottleneck is a weird word, it actually makes no sense in terms of hardware.


Nah, we use the monthly horoscope to decide whether to upgrade or not.

God, I love your post. I wish I could put that quote on a T-shirt and wear it to make people reading it angry.


----------



## Murlocke

Quote:


> Originally Posted by *Scotty99*
> 
> I understand this forum is called overclock.net, but do people really use benchmarks and stuff to decide whether they need to upgrade or not?
> 
> No game has told me to upgrade due to stutters or anything like that, thats why i never upgraded. Bottleneck is a weird word, it actually makes no sense in terms of hardware.


Get a 4K monitor and you'll be wanting to squeeze out every FPS you can get.

If you're on 1080p, sure, upgrading is pointless, because a 5-year-old computer can still max games at 60+ FPS at that resolution.


----------



## DiNet

Quote:


> Originally Posted by *Scotty99*
> 
> I understand what the word means lol.
> 
> Just saying, if you actually think about it and not takes others word on it you will realize it means nothing in terms of PC hardware. Of course your system will be faster if you upgrade a part, but its not like a component can magically makes another component slower.
> 
> People need to think more, watch less videos.


If you actually think about it... it is how it works.
No magic is involved.
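For what it's worth, the model people mean by "bottleneck" is easy to sketch: each frame needs some CPU work and some GPU work, and the frame rate is capped by whichever stage takes longer, so the faster part just ends up waiting. A toy illustration (the millisecond figures are made up for the example):

```python
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Simple pipelined model: frame rate is set by the slower stage."""
    return 1000 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# A GPU upgrade helps only until the CPU becomes the limiting stage:
print(fps(10, 16))  # GPU-bound: 62.5 FPS
print(fps(10, 8))   # now CPU-bound: 100 FPS; an even faster GPU changes nothing
```

Nothing gets "magically slower"; the slower component simply sets the ceiling for the whole system.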


----------



## Scotty99

Quote:


> Originally Posted by *Murlocke*
> 
> Get a 4K monitor and you'll be wanting to squeeze out every FPS you can get.
> 
> If your on 1080p, sure, upgrading at this point is pointless because a 5 year old computer can still max games at 60+ FPS that resolution.


I sit 2 feet from my display; 4K isn't necessary at that distance, and it's only relevant for TVs. Sorry to the people who have bought 4K monitors.


----------



## Scotty99

Quote:


> Originally Posted by *DiNet*
> 
> If you actually think about it... it is how it works.
> No magic is involved.


What lol?


----------



## Murlocke

Quote:


> Originally Posted by *Scotty99*
> 
> I sit 2 feet from my display, 4k isnt necessary at that distance and its only relevant for TV's. Sorry to the people who have bought 4k monitors.


lol, what?! Go see an eye doctor, because you are clearly not seeing 20/20 if you can say this.

You have a 760. You clearly don't push your games to the max; you aren't the target audience for this kind of high-end hardware. You're like someone going to a sports car dealership and asking why get a sports car when this $1,500 used car will get the job done.


----------



## Scotty99

Quote:


> Originally Posted by *Murlocke*
> 
> lol, what?!
> 
> Go see an eye doctor because you are clearly not seeing 20/20 if you can say this.


Come on, man, lol.

How many people own 4K monitors, like 8? (Discounting Apple customers, lol.)

Even the gaming monitors are not going for 4K; they are focused on 1080p, 2560x1440, high refresh rates, and G-Sync. 4K is absolutely the future for TVs, but not for PC monitors....


----------



## DiNet

Quote:


> Originally Posted by *Scotty99*
> 
> What lol?


No magic is involved.
Science is. Physics is. Not magic.

Better now?


----------



## zealord

I am pretty sure scotty99 left his computer logged in on OCN and a console gamer friend of his is currently using his account to post rubbish.


----------



## BigMack70

It's very interesting to see repeated tests showing the impact of memory speed on modern games... makes me wonder if I goofed in getting 2666MHz DDR4 for my 5930K. Very cool test by DF... OC'd Sandy users are still OK, but they're finally at a point where there will be a noticeable difference in performance.


----------



## Imouto

Quote:


> Originally Posted by *Scotty99*
> 
> I understand this forum is called overclock.net, but do people really use benchmarks and stuff to decide whether they need to upgrade or not?
> 
> No game has told me to upgrade due to stutters or anything like that, thats why i never upgraded. Bottleneck is a weird word, it actually makes no sense in terms of hardware.


OC.net seems to lack a lot in common sense and empathy. These forums will make you feel like a worm if you don't spend an elastic amount of money to barely improve your performance in 1% of your usual workload.


----------



## BigMack70

Quote:


> Originally Posted by *Imouto*
> 
> OC.net seems to lack a lot in common sense and empathy. These forums will make you feel like a worm if you don't spend an elastic amount of money to barely improve your performance in


Elastic money?


----------



## Scotty99

http://store.steampowered.com/hwsurvey

.07% of people on steam survey have a 4k monitor. 35+% have a 1080p display.

Come on guys lol.

Im not gonna reply to any of this nonsense anymore, lets keep to the topic.


----------



## Glottis

Quote:


> Originally Posted by *BigMack70*
> 
> It's very interesting to see repeated tests showing the impact of memory speed on modern games... makes me wonder if I goofed in getting 2666 MHz DDR4 for my 5930k. Very cool test by DF... OC'd Sandy users still OK but finally at a position where there will be a noticeable difference in performance.


2666MHz is fine, especially if you got a kit with good timings.
http://www.anandtech.com/show/8959/ddr4-haswell-e-scaling-review-2133-to-3200-with-gskill-corsair-adata-and-crucial/7
http://www.legitreviews.com/ddr4-memory-scaling-intel-z170-finding-the-best-ddr4-memory-kit-speed_170340/5


----------



## Murlocke

Quote:


> Originally Posted by *zealord*
> 
> I am pretty sure scotty99 left his computer logged in on OCN and a console gamer friend of his is currently using his account to post rubbish.


Yeah... Saying things like "4K is only relevant for TVs" or "like 8 people run 4K" is complete trolling.

How can someone with 1300 posts here say something like this?
Quote:


> Originally Posted by *Scotty99*
> 
> http://store.steampowered.com/hwsurvey
> 
> .07% of people on steam survey have a 4k monitor. 35+% have a 1080p display.
> 
> Come on guys lol.
> 
> Im not gonna reply to any of this nonsense anymore, lets keep to the topic.


Do you realize the difference between an enthusiast forum and a hardware survey that covers ALL PC gamers?

4K isn't very popular right now because:
- It's expensive.
- It's insanely hard to run.
- It's brand spanking new.
NOT because it isn't noticeable on monitors. It's a significant upgrade over 1080p or 1440p that many simply can't afford right now.

Enthusiasts are by definition the 1%. On a forum dedicated to hardware overclocking, you're going to meet a lot of that 1%. If you want to see forums full of "average gamers", go to the Steam forums and you'll fit right in with the "1080p and midrange cards are good enough" crowd.

Also, if .07% of the entire Steam community is "like 8 people", then I expect to see Valve going bankrupt very soon.


----------



## BigMack70

Quote:


> Originally Posted by *Scotty99*
> 
> http://store.steampowered.com/hwsurvey
> 
> .07% of people on steam survey have a 4k monitor. 35+% have a 1080p display.
> 
> Come on guys lol.
> 
> Im not gonna reply to any of this nonsense anymore, lets keep to the topic.


If you've been here 2+ years with 1000+ posts and think that OCN is representative of, or correlates well with, the general population (e.g. the Steam hardware survey), then I question your intelligence.


----------



## DiNet

How is the Steam survey relevant to real life? According to it, 27.38% are laptop users on 1366x768.
I mean, of course, with a laptop at that resolution there's nothing else to do except fill in the survey.


----------



## Scotty99

Quote:


> Originally Posted by *BigMack70*
> 
> If you've been here 2+ years with 1000+ posts and think that OCN is representative of or correlates well with the general population (e.g. the steam hwsurvey), then I question your intelligence.


I see you have a rig named "glorious 4k", so you are clearly biased.

I understand the criticism; I figured I'd get a post like this, so I have to reply. Yes, this is a forum for enthusiasts, which is very different from mainstream consumers... but that does not change the fact that 4K at 2 feet is absurd. There is a reason we aren't seeing PC monitors make the switch to 4K like TVs are: it's simply not necessary.


----------



## BigMack70

Quote:


> Originally Posted by *Scotty99*
> 
> I see you have a rig named "glorious 4k" so you are clearly bias.
> 
> I understand the criticism, i figured id have a post like this in regards to it so i have to reply. Yes, this is a forum for enthusiasts which is very different from a mainstream consumer......but that does not change the fact 4k at 2 feet is absurd. There is a reason we aren't seeing PC monitors making the switch to 4k like TV's are, its simply not necessary.


Now I definitely question your intelligence, since I obviously agree with you about 4K screens; I have a 4K TV, not a monitor, and you somehow missed that from the picture of a giant 4K TV in my sig...

Anyway, I bet the relative portion of OCN users at 4K and/or on a top-end GPU is 100x higher than in the general population. It wasn't too long ago that 1366x768 was the most popular resolution in the hardware survey, but it's probably been 10 years since that was the most popular resolution on OCN.


----------



## hollowtek

It's like this: your Scion FR-S is cool. It's got 200hp. Awesome, and there's nothing wrong with that. But it's been 5 years now. New cars have come out and they're faster than yours. Your pride is hurt. So what now? You go the forced-induction route. Your car is notably faster, but no matter what, it's still behind the rest of the new cars. And those guys started to upgrade their cars too, so now you're even slower, albeit only a little. You want to upgrade some more, but your car is at its limit. You decide at this point to just toss in a gigantic Greddy T88 turbocharger for more power. But it doesn't go any faster. Damn it, why! Why why why why why!

So you have a few choices now. Bite the bullet, sell your ride, and get the next best thing. Or accept that your FR-S is as good as it's going to get; even though it's got some miles on it, it's still a damn good car (it'll last another 100 years, considering it's made by Toyota), and at the end of the day it's only slightly slower than everything else available. Your choice.


----------



## zealord

Quote:


> Originally Posted by *Scotty99*
> 
> I see you have a rig named "glorious 4k" so you are clearly bias.
> 
> I understand the criticism, i figured id have a post like this in regards to it so i have to reply. Yes, this is a forum for enthusiasts which is very different from a mainstream consumer......but that does not change the fact 4k at 2 feet is absurd. There is a reason we aren't seeing PC monitors making the switch to 4k like TV's are, its simply not necessary.


I am pretty sure it is actually the other way around.

4K on a PC monitor at 2 feet is much more noticeable than on a TV at whatever distance you sit at. Not to mention that like 95% of TV content is still 1080p and the TV has to upscale it, unlike on PC, where games actually run at 4K.


----------



## Alex132

Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scotty99*
> 
> I see you have a rig named "glorious 4k" so you are clearly bias.
> 
> I understand the criticism, i figured id have a post like this in regards to it so i have to reply. Yes, this is a forum for enthusiasts which is very different from a mainstream consumer......but that does not change the fact 4k at 2 feet is absurd. There is a reason we aren't seeing PC monitors making the switch to 4k like TV's are, its simply not necessary.
> 
> 
> 
> I am pretty sure it is actually the other way around.
> 
> 4K on a PC monitor at 2 feet is much more noticeable than a TV at whatever distance you sit at. Let alone like 95% of content for TVs is still 1080p and the TV needs to upscale it unlike on PC where games actually run at 4K resolution.
Click to expand...

Nah brah, it's all about that 20x10 pixel monitor at 10cm distance away









Why does VR even need high resolution panels









Lol.......


----------



## Scotty99

Quote:


> Originally Posted by *zealord*
> 
> I am pretty sure it is actually the other way around.
> 
> 4K on a PC monitor at 2 feet is much more noticeable than a TV at whatever distance you sit at. Let alone like 95% of content for TVs is still 1080p and the TV needs to upscale it unlike on PC where games actually run at 4K resolution.


Nope, lol. It boggles the mind that you typed that out and pressed enter. While you're correct that most content is still 720p or 1080p, that has nothing to do with actual 4K content. Netflix has a ton of shows in 4K, and that will only increase as the years go by; literally EVERY TV will be a 4K TV. The farther you get from a display, the more you need to up the res; I don't understand how you don't get that. Up close I would bet you real money you couldn't differentiate a 720p phone from a 1440p phone; the human eye is simply not designed to see such small detail.


----------



## zealord

Quote:


> Originally Posted by *Scotty99*
> 
> Nope lol. It boggles the mind that you typed that out and pressed enter. While correct in assuming most content is 720 or 1080 still, that has nothing to do with actual 4k content. Netflix has a ton of shows in 4k, and that will only increase as the years go by and literally EVERY tv will be a 4k tv. *The farther you get from a display the more you need to up the res, i dont understand how you dont get that.* Up close i would bet you real money you couldnt differentiate a 720p phone from a 1440p phone, human eye is simply not designed to see such small detail.


No. Just no. It is the other way around. You will be ashamed of your words once you inform yourself. You are just plain wrong and spreading misinformation.
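The arithmetic here is about angular pixel density: the closer you sit, the larger each pixel appears, so high resolution matters more up close, not less. A rough sketch, taking ~60 pixels per degree as the commonly cited 20/20-vision threshold and using approximate screen widths (a 27" 16:9 monitor is about 23.5" wide, a 55" TV about 48" wide):

```python
import math

def pixels_per_degree(h_res, screen_width_in, distance_in):
    """Pixels packed into one degree of visual angle at the viewer's eye."""
    ppi = h_res / screen_width_in
    # width on the screen, in inches, subtended by one degree of visual angle
    inch_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inch_per_degree

monitor_1080 = pixels_per_degree(1920, 23.5, 24)   # 27" 1080p at 2 ft: ~34 ppd
monitor_4k   = pixels_per_degree(3840, 23.5, 24)   # 27" 4K at 2 ft:    ~68 ppd
tv_4k        = pixels_per_degree(3840, 48.0, 96)   # 55" 4K at 8 ft:    ~134 ppd
```

By this measure a 1080p monitor at 2 feet sits well below the acuity threshold (pixels clearly resolvable), 4K at the same distance lands just above it, and a 4K TV across the room is far past it; which is why the close-up monitor is where the extra resolution is most visible.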


----------



## Alex132

Quote:


> Originally Posted by *Scotty99*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zealord*
> 
> I am pretty sure it is actually the other way around.
> 
> 4K on a PC monitor at 2 feet is much more noticeable than a TV at whatever distance you sit at. Let alone like 95% of content for TVs is still 1080p and the TV needs to upscale it unlike on PC where games actually run at 4K resolution.
> 
> 
> 
> Nope lol. It boggles the mind that you typed that out and pressed enter. While correct in assuming most content is 720 or 1080 still, that has nothing to do with actual 4k content. Netflix has a ton of shows in 4k, and that will only increase as the years go by and literally EVERY tv will be a 4k tv. The farther you get from a display the more you need to up the res, i dont understand how you dont get that. Up close i would bet you real money you couldnt differentiate a 720p phone from a 1440p phone, human eye is simply not designed to see such small detail.
Click to expand...

You have to be trolling now.

You do know when you put things closer to your face, they get perceivably bigger? (hint: the pixels). I'm sure we all figured this out when we were only a few months old...


----------



## Murlocke

Quote:


> Originally Posted by *Scotty99*
> 
> but that does not change the fact 4k at 2 feet is absurd. There is a reason we aren't seeing PC monitors making the switch to 4k like TV's are, its simply not necessary.


Stop spreading nonsense. I have terrible eyesight (-6 in each eye), which means a pretty high prescription, and the difference between 1080p and 4K is still MASSIVE. I have a feeling you've never even seen a fully maxed-out game at 4K.

PC monitors aren't making a huge switch to 4K because it's expensive and almost impossible to run for the average gaming computer. That does NOT mean it's not a significant improvement.

Your average gamer claimed 1440p wasn't worth it back when it was new, and we all know how that turned out.








Quote:


> Originally Posted by *Scotty99*
> 
> Nope lol. It boggles the mind that you typed that out and pressed enter. While correct in assuming most content is 720 or 1080 still, that has nothing to do with actual 4k content. Netflix has a ton of shows in 4k, and that will only increase as the years go by and literally EVERY tv will be a 4k tv. The farther you get from a display the more you need to up the res, i dont understand how you dont get that. Up close i would bet you real money you couldnt differentiate a 720p phone from a 1440p phone, human eye is simply not designed to see such small detail.


The farther you go away from a display, the more you need to up the resolution? This confirms you have absolutely no clue what you are talking about when it comes to resolution. This is resolution/screen size 101, and you just stated the complete opposite of how it is. Very very common knowledge...

I like how there are like 5 people now telling you you're wrong and you're just doing this:









There is literally no point in keeping this discussion going. Here's hoping you are just trolling.


----------



## Scotty99

I'm not gonna be mean, I'm just gonna slide out the side exit.

Back on topic, I haven't decided if I'm gonna destroy my 2500K by running its max 5.2GHz clock at 1.5V or stick to my daily 4.6GHz OC. Who knows, maybe it can take 1.5V 24/7?


----------



## zealord

Quote:


> Originally Posted by *Scotty99*
> 
> *I'm not gonna be mean, im just gonna slide out the side exit.*
> 
> Back on topic, i havent decided if im gonna destroy my 2500k by running its max 5.2 clock with 1.5v or my daily 4.6ghz OC. Who knows, maybe it can take a 24/7 1.5v?


Figured as much. Once you googled it and realized you were wrong, you weaseled out instead of just admitting it.

The good news is, that's one more for my block list, because I know I would never even remotely value anything you say.


----------



## iLeakStuff

I have a maybe stupid question for you guys, since you are talking resolutions:

27" display, 1440p:
That's 11004 PPI

18" display, 1080p:
That's 14978 PPI

The 18" display has more pixels per inch than the 27" display. Why does the 27" look better?


----------



## 364901

Quote:


> Originally Posted by *Scotty99*
> 
> I sit 2 feet from my display, 4k isnt necessary at that distance and its only relevant for TV's. Sorry to the people who have bought 4k monitors.


4K is an interesting and necessary problem to solve. 4K on a 22-inch display looks really, really good. High-PPI displays require different approaches to drawing pixels on the display efficiently, and we're getting those technologies that end up giving us extra clarity and fidelity for almost no cost (hell, HEVC even saves us space!). Some anecdotal experiences from others also suggest that MSAA isn't completely necessary at these resolutions, which is partially true - aliasing becomes less noticeable, and 2x MSAA looks good, but it isn't such a drastic improvement as it is currently on 1440p and 1080p displays. Also, 4K TN displays look better than regular 1080p TN displays purely because of how close the pixels are to each other.

Quote:


> Originally Posted by *Scotty99*
> 
> There is a reason we aren't seeing PC monitors making the switch to 4k like TV's are, its simply not necessary.


The reason is actually price and economies of scale. Why do you think so many laptops still ship with 768p monitors? Because that's what the hardware can support, because there are no scaling issues, and the OEM wants to have the lowest BOM possible to extract the most profit out of it. We could have started the switch away from 1080p and 768p years ago, but the manufacturers saw no incentive in it because it wouldn't financially benefit them.


----------



## GoLDii3

Quote:


> Originally Posted by *iLeakStuff*
> 
> I have a maybe stupid question to you guys since you are talking resolutions:
> 
> 27" display 1440p:
> Thats 11004 PPI
> 
> 18" display 1080p
> Thats 14978 PPI
> 
> That 18" display got more pixels per inch than the 27" display. Why does the 27" look better?


PPI is also important, to a certain degree.

It's like 720p on a 40-50 inch TV vs 720p on a 5 inch smartphone screen.


----------



## Murlocke

Quote:


> Originally Posted by *iLeakStuff*
> 
> I have a maybe stupid question to you guys since you are talking resolutions:
> 
> 27" display 1440p:
> Thats 11004 PPI
> 
> 18" display 1080p
> Thats 14978 PPI
> 
> That 18" display got more pixels per inch than the 27" display. Why does the 27" look better?


Where'd you get those numbers from?









27" would be 108 PPI and 18" would be 122 PPI. The 18" is simply too small to see the extra detail that 122 PPI offers. Past a certain point, increased size does more for perceived quality than increased resolution. You'd need to be less than a foot away from that 18" to see the gains.

For example, I currently use a 65" 4K TV for gaming. At 5 feet, it shows more detail than my 3440x1440 monitor at 2 feet, even though its PPI is much lower: 67 vs 109. Distance is very important, and I'm betting you're sitting a bit further back from that 27" than you did from the 18".
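For reference, PPI is just the pixel count along the diagonal divided by the diagonal size in inches; a quick sketch of the math (the function name is mine):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixels along the diagonal divided by the diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The two panels from the question above -- the 18" 1080p one really is denser:
print(f"{ppi(2560, 1440, 27):.1f} PPI")  # ~108.8
print(f"{ppi(1920, 1080, 18):.1f} PPI")  # ~122.4
```

So the earlier "11004" and "14978" figures look like these same values with the decimal point dropped.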


----------



## carlhil2

There is nothing wrong with 4K; it's just that some who don't have it don't get it...


----------



## Socks keep you warm

Maybe if AMD actually started releasing good architectures and IPC improvements, Intel would feel the need to push their engineering department. For now they can sit on their money and laugh.


----------



## Artikbot

I love it because people actually think Intel are sat on their asses doing nothing all day because AMD isn't competitive.

So Broadwell-Y isn't an advancement? Skylake U battery life isn't an advancement? The HD Graphics 530 isn't an advancement? The sheer performance per watt and cycle of Skylake isn't an advancement?

Come on.


----------



## djsi38t

Quote:


> Originally Posted by *Socks keep you warm*
> 
> Maybe if AMD actually started releasing good architecture and IPC improvements Intel would feel the need to push there engineering department. For now they can sit on there money and laugh.


One thing they have to laugh about is people upgrading needlessly.

It's simple: if your 2500K is maxing out and not performing like you NEED it to, then it's time to upgrade.


----------



## charlesquik

My i7-2600K is still going strong too. It's a shame the CPU segment is moving so slowly... it's all about power saving, but we need raw power. Sure, develop power-efficient chips for laptops, but we also need monster CPUs for PC gamers; we don't mind paying $5 per month of electricity for the PC.

I think Sandy is the golden age of CPUs, and I still don't feel the need to upgrade. I'm playing all the games I have at 60+ fps without a problem.


----------



## Dyson Poindexter

My problem is that finding Z77 motherboards is a pain in the butt. I'm using a full ATX board in a Cooler Master N200 just because I didn't want to spend a bunch on a Z77 ITX board.


----------



## ZealotKi11er

The 2500K itself is not legendary. It's that it took Intel 5 years to make something meaningfully faster.


----------



## DiNet

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 2500K is no legendary. Is that it took Intel 5 years to get something faster.


2600k is faster... afaik both launched together...


----------



## Ironcobra

Great video. That said, I don't plan to upgrade until the end of this year, but I have been thinking of upgrading my RAM to hold me over, and this video definitely backs that up. I'm just shocked at how bad the DX11 performance is. I'm glad I've been holding out on a 4K upgrade until I redo my computer. It feels good, though, that I've gotten every penny out of my 2500K system. Definitely the best CPU for the money of all time.

Can someone please recommend a good 16GB kit for pairing with the 2500K? I haven't been keeping up with hardware as much since buying this system.


----------



## CSCoder4ever

Looks like I'll be using my 2700K 'til the day it dies... or the board does.


----------



## Sin0822

My issue with this whole thing is why people assume the CPU will make such a big difference in games. Sandy Bridge was very strong, and still is quite fast; tasks just haven't become grueling enough for many people to upgrade (not yet, at least).

The CPU is the director while the GPU is the cast and crew; replacing the director might do some good, but you cannot expect huge changes, since the director, while important, isn't the most important factor.

The CPU will really only matter more if more is asked of it, so higher resolutions, more GPUs, VR, very smart AI, etc. will increase the importance of the CPU in gaming.


----------



## overvolted

What about in regards to Big Pascal? If GP100 or whatever the hell it's called is the big jump in performance people are hoping it is, does that have any bearing on the 2500k being obsolete?


----------



## Cakewalk_S

Currently, as my rig stands, the only game that comes close to bottlenecking my CPU is Far Cry 4. And even then, in order to bottleneck the CPU I have to have my 970 overclocked heavily and pegged at 100%... the GPU is still the primary bottleneck in my rig. So the way I see it, I still need a more powerful GPU before it's cost effective to upgrade the CPU. I think if I had a 980 Ti or upcoming Pascal I'd feel differently, as the CPU would have more of an impact. So the way I see it, I can still upgrade my GPU another generation before it becomes worthwhile to upgrade the CPU. I've had the 2500K since my 560, and it's still pumping out great frames.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Sin0822*
> 
> My issue with this whole thing is why people assume the CPU will make such a big difference in games? Sandy Bridge was very strong, and still is quite fast, tasks just haven't become grueling enough for many people to upgrade (not yet at least).
> 
> The CPU is the director while the GPU is the cast and crew, replacing the director might do some good but you cannot expect huge changes as the director while important isn't the most important factor.
> 
> The CPU will really only matter more if more is asked of it, so higher resolutions, more GPUs, VR, a very smart AI, and etc. will increase the importance of the CPU in gaming


Well, higher resolutions task the GPU way more.


----------



## zealord

Quote:


> Originally Posted by *Sin0822*
> 
> My issue with this whole thing is why people assume the CPU will make such a big difference in games? Sandy Bridge was very strong, and still is quite fast, tasks just haven't become grueling enough for many people to upgrade (not yet at least).
> 
> The CPU is the director while the GPU is the cast and crew, replacing the director might do some good but you cannot expect huge changes as the director while important isn't the most important factor.
> 
> The CPU will really only matter more if more is asked of it, so higher resolutions, more GPUs, VR, a very smart AI, and etc. will increase the importance of the CPU in gaming


For me it is more like this:

A CPU has to be able to hold a 60 fps minimum in every single game, save those that are badly optimized. That isn't the case with the 2500K anymore.

Even if, hypothetically, a 2500K can manage 50 fps and a 6700K can manage 70 fps (both as minimum framerates) in game X, then the value of the 6700K, for me personally, is much higher than the raw difference in fps.

It may not be worth the money on paper, but it is definitely worth the money for your eyes and hands if you are one of those people who can feel the difference between 50 and 60 fps.


----------



## ZealotKi11er

Quote:


> Originally Posted by *zealord*
> 
> For me it is more like this :
> 
> A CPU has to be able to do 60fps minimum fps for every single game save those that are badly optimized. This isn't the case with the 2500K anymore.
> 
> Even if lets just hypothetically say a 2500K can manage 50fps and a 6700K can manage 70fps (both as minimum framerate) in game X then the value of the 6700K is much higher, for me personally, than the actual difference in fps.
> 
> It may not be worth the money on paper, but it is definitely worth the money for your eyes and hands if you are one of those people that can feel the difference between 50 and 60 fps.


Yes, but a 70 fps minimum is pathetic for a new CPU. That means within 1-2 years it will drop below 60 fps.


----------



## zealord

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes but 70 fps min is pathetic for new CPU. This would be in 1-2 years it will drop below 60 fps.


Depends on the game. Also, it was just an example, and there is still overclocking.


----------



## Sin0822

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Well higher resolution task the GPU way more.


True
Quote:


> Originally Posted by *zealord*
> 
> For me it is more like this :
> 
> A CPU has to be able to do 60fps minimum fps for every single game save those that are badly optimized. This isn't the case with the 2500K anymore.
> 
> Even if lets just hypothetically say a 2500K can manage 50fps and a 6700K can manage 70fps (both as minimum framerate) in game X then the value of the 6700K is much higher, for me personally, than the actual difference in fps.
> 
> It may not be worth the money on paper, but it is definitely worth the money for your eyes and hands if you are one of those people that can feel the difference between 50 and 60 fps.


To be honest, I totally agree, but going from 50 FPS to 70 FPS is a 40% increase, which is huge. Going from 50 FPS to 60 FPS is a 20% increase, still quite a jump. I think people are just looking for large absolute numbers and not noticing that some of these percentages are quite high.
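Those percentages are easy to check; relative gain is just (new - old) / old (a trivial sketch, function name mine):

```python
def pct_gain(old_fps, new_fps):
    """Relative framerate increase, in percent."""
    return (new_fps - old_fps) / old_fps * 100

print(pct_gain(50, 70))  # 40.0
print(pct_gain(50, 60))  # 20.0
```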


----------



## MonarchX

Whether or not to upgrade such a CPU depends on how overclocked it is. I would not replace a 2500K @ 5.0GHz, let alone the 5.5GHz that some people have achieved. The worst thing about Sandy Bridge is that it's not compatible with PCIe 3.0, which limits GPU throughput, especially with SLI/CFX.

I am still debating my 3770K @ 4.8GHz. Some say not to upgrade it, since 4.8GHz is a great OC (very stable) and PCIe 3.0 is supported. HyperThreading also helps in some cases. I wonder how a 3770K @ 4.8GHz compares against a 6700K @ 4.5GHz in games. I know that in synthetic benchmarks, compression, video decoding/encoding, etc., the 6700K @ 4.5GHz beats the 3770K @ 4.8GHz, but in games the FPS may not be that different. Any ideas?


----------



## djriful

Quote:


> Originally Posted by *r31ncarnat3d*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chobbit*
> 
> You've brought back loving memories, i loved my 2500k it was the biggest step up in terms of beating my previous chip (especially @ 4.8) and i've never had it since.
> 
> 
> 
> Haha no kidding. I jumped from a 1090t to a 2600k myself. Such an amazing jump.
Click to expand...

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *r31ncarnat3d*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chobbit*
> 
> You've brought back loving memories, i loved my 2500k it was the biggest step up in terms of beating my previous chip (especially @ 4.8) and i've never had it since.
> 
> 
> 
> Haha no kidding. I jumped from a 1090t to a 2600k myself. Such an amazing jump.
> 
> Click to expand...
> 
> I went from a 965BE to a 2500k, the jump in minimum game FPS was astounding
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SC2 was suddenly a whole new game.
Click to expand...

I went from *Phenom X4 975* to *Intel i7 3930k*... tell me about it. 3D Rendering... I froze myself.


----------



## TranquilTempest

Bottom line for ANY upgrade is considering two questions:

1. Does your current system have unacceptable performance in a game or program you play/use?
2. Can you afford to upgrade to a system with acceptable performance?

If both answers are yes, it's time to upgrade. If the second answer is no, start saving up.


----------



## djriful

Quote:


> Originally Posted by *TranquilTempest*
> 
> Bottom line for ANY upgrade is considering two questions:
> 
> Does your current system have unacceptable performance in a game or program you play/use?
> Can you afford to upgrade to a system with acceptable performance?
> 
> If both answers are yes, it's time to upgrade. If the second answer is no, start saving up.


Yep, saving since 2011.


----------



## Imouto

On an unrelated note, science! https://www.pugetsystems.com/labs/articles/Can-you-see-the-difference-with-a-4K-monitor-729/

If you want to be an early adopter, be my guest, but unless you're loaded you can't justify spending an unholy amount of money every year to keep up with every tech that comes out and is hailed as the second coming of Jesus.

So thank you, but I'll keep my 1080p monitor until it falls apart or I need more real estate.

And lastly, thank you for spending that much money so I can get my used/new gear so cheap.


----------



## hokk

Quote:


> Originally Posted by *hollowtek*
> 
> Nah. upgrading the 2500k won't be cheap. skylake+mobo+ddr4 = more than i have.


Pretty much this. Waiting to see Zen's price/performance.


----------



## mechtech

Quote:


> Originally Posted by *Imouto*
> 
> On an unrelated note, science! https://www.pugetsystems.com/labs/articles/Can-you-see-the-difference-with-a-4K-monitor-729/
> 
> If you want to be an early adopter be my guest but unless you're loaded you can't justify spending an unholy amount of money every year to be updated with every tech coming out and said to be the 2nd coming of Jesus.
> 
> So thank you but I'll keep my 1080p monitor until it falls apart or I need more real state.
> 
> And lastly, thank you for spending that much money so I can have my used/new gear so cheap.


Not sure if you were trying to bash or support 4K with that link, but the conclusion of that article is that if you're an average healthy adult sitting a normal distance from the monitor, any screen above 15 inches should ideally be 4K if you don't want to see pixelation.
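That conclusion is easy to sanity-check with the usual 1-arcminute (20/20 acuity) rule of thumb. The sketch below is my own, not from the article: it estimates the viewing distance beyond which individual pixels blend together.

```python
import math

ONE_ARCMIN = math.radians(1 / 60)  # ~0.00029 rad, rough 20/20 acuity limit

def pixel_blend_distance_in(width_px, height_px, diagonal_in):
    """Viewing distance (inches) beyond which a 20/20 eye can no longer
    resolve individual pixels, per the 1-arcminute rule of thumb."""
    pixel_pitch_in = diagonal_in / math.hypot(width_px, height_px)
    return pixel_pitch_in / math.tan(ONE_ARCMIN)

# A 24" 1080p monitor: pixels stay resolvable out to roughly 3 feet,
# which is farther than a typical desk viewing distance.
print(f"{pixel_blend_distance_in(1920, 1080, 24):.1f} in")  # ~37.5 in
```

In other words, at a normal ~2 ft desk distance a 24" 1080p panel is still inside pixel-visible range, which is consistent with the article's conclusion.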


----------



## CuriousNapper

Sold my 2500k when the 3770k came out. Regret it still.


----------



## TheReciever

Quote:


> Originally Posted by *Kand*
> 
> Sandy Bridge a sidegrade to Nehalem?
> 
> That's news to me!
> 
> http://www.anandtech.com/bench/product/47?vs=287


Please re-read: X58 vs. 1155, the sockets, not specific CPUs.

You're more than welcome to read the threads from that time.


----------



## Tippy

Quote:


> Originally Posted by *CuriousNapper*
> 
> Sold my 2500k when the 3770k came out. Regret it still.


Why? 3770k is by no means bad, and it should last you for quite a while to come.

One can basically never go wrong with Intel's K series chips.


----------



## Imouto

Quote:


> Originally Posted by *mechtech*
> 
> Not sure if you were trying to bash or support 4k with that link, but the conclusion of that article is that if you're an average healthy adult sitting form the monitor at a normal distance, any screen above 15 inches should ideally be 4k if you don't want to see pixelation *and you are willing to spend the asking price for the monitor and the GPU power to drive it.*


Not bashing it, just saying that it is early-adopter stuff. I have a 5" 1080p phone even though I don't need that many pixels, just because they're so frigging cheap. So I'll wait until 4K IPS monitors are about $200 and you don't have to sell both of your eyeballs to buy the GPU power needed to drive them.

Exactly the same with VR. I'm sorry, but I'll wait until the PPI is something like tenfold higher and the price is down to earth (and again the GPU price problem).

I guarantee you that I enjoy games as much as a 4K player, mainly because I like to play them, and staring at my screen looking for pixels doesn't amuse me.


----------



## mrteddy

For my quick 2 cents:

Still running a 2500K, and since work is busy now I sold my GPUs and am running Dota 2 on the iGPU... it ain't too bad, haha.

But one thing I've noticed over the years is that the chip seems to run hotter. It was OC'ed for maybe the first 2 years of its life but is currently at stock settings. Idle is about 35C with a custom loop (280mm x 60mm rad) and full load is about 50C... but I used to get cooler temperatures with my NH-D14, so maybe I screwed up my loop.

Anyway, I think if the price is right I'd drop a 3770K into my board.


----------



## xutnubu

Frame-rate chart.

2500K v. 6500:



3770K v. 2500K v. 6500:



3770K v. 2500K v. 6500(minimum frame-rates):



Also, the written article: http://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k


----------



## Peanuts4

Quote:


> Originally Posted by *Defoler*
> 
> The 2500K isn't that legendary.
> You can clearly see that compared to an OCed 6500 and OCed DDR4, you gain quite a few more FPS. This can be translated to higher graphics settings or higher resolution.
> The 2500K isn't bad. It can still hold its own, but sticking with it does limit you if you plan to upgrade your periphiral hardware, or you encounter a game that you do wish to get better visuals from.


Those 9 FPS definitely justify spending a couple hundred bucks. Makes perfect sense.


----------



## corky dorkelson

My Gigabyte UD4 did NOT like the 1866mhz ram, I simply could not get it to work no matter what. I'm reluctant to try 2133, but I am sticking with 1080p and SB, so if faster ram gets me a bit better performance without breaking the bank I am definitely curious.

I'm also getting a freesync monitor, so hopefully SB/R9 290 can get me though quite a while longer.


----------



## Tippy

Quote:


> Originally Posted by *xutnubu*
> 
> Frame-rate chart.
> 
> 2500K v. 6500:
> 
> ]


That framerate jump between i5 6500+2666mhz DDR4 and i5 6500+3400mhz DDR4 - what?? How??









Is Witcher 3 really weirdly RAM dependent or what?


----------



## Anarion

I'm waiting for Cannon Lake to make the switch. I know I'm losing some FPS right now, but it's not severe enough to make me spend that much money. Still, this CPU will have lasted 7 years (assuming I make the switch in 2017 when Cannon Lake is out).


----------



## djriful

I will wait till my 3930k dies and upgrade it but... wait I have one time lifetime warranty exchange with Intel Protection... I wonder how long I am going to keep my processor. $600 cpu for a lifespan 15 years? 5 years past...


----------



## Socks keep you warm

Quote:


> Originally Posted by *Artikbot*
> 
> I love it because people _actually_ think Intel are sat on their asses doing nothing all day because AMD isn't competitive.
> 
> So Broadwell-Y isn't an advancement? Skylake U battery life isn't an advancement? The HD Graphics 530 isn't an advancement? The sheer performance per watt and cycle of Skylake isn't an advancement?
> 
> Come on.


It's sarcasm.
But I tell you now, if AMD were still in the game we would be further along than we are.


----------



## magnek

Quote:


> Originally Posted by *xutnubu*
> 
> Frame-rate chart.
> 
> 2500K v. 6500:
> 
> 
> 
> 3770K v. 2500K v. 6500:
> 
> 
> 
> 3770K v. 2500K v. 6500(minimum frame-rates):
> 
> 
> 
> Also, the written article: http://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k


Good to see the 3770K paired with 2133 RAM still doing just fine in most titles (with the sole exception of GTA V, I guess). Means my 4930K @ 4.5 will still be relevant for quite some time. Since I have 2400 sticks but am only running them at 2133, maybe I really should get off my lazy ass and actually find a stable setting for 2400.









Quote:


> Originally Posted by *djriful*
> 
> I will wait till my 3930k dies and upgrade it but... wait I have one time lifetime warranty exchange with Intel Protection... I wonder how long I am going to keep my processor. $600 cpu for a lifespan 15 years? 5 years past...


You mean the PTPP? It's only good for 3 years, and is definitely not a lifetime warranty.
Quote:


> The Plan provides a one-time replacement: (i) only applicable to the replacement of Eligible Processors and (ii) only when the Plan is purchased within one (1) year of the purchase of the Eligible Processor. The Plan may only be purchased from the Plan website (www.intel.com/go/tuningplan) or an authorized reseller. The Performance Tuning Protection Plan does not affect the length of the standard 3 year warranty. *The Plan will cover the processor running out of specifications for the remainder of the standard 3 year warranty.*


----------



## Alvarado

Slightly off-topic but they just posted this.


----------



## Clocknut

These guys need to test the 3930K vs. the 6700K, to see how a modern quad-core holds up against the old six-core.


----------



## Cyro999

Quote:


> To be honest, I totally agree, but going from 50FPS to 70FPS is a 40% increase hahaha, that is a huge increase. Going from 50FPS to 60FPS is a 20% increase, still quite an increase. I think people are just looking for large numbers and not noticing that some of these percentages are quite high.


The tests in this video are against an i5 6500. If you use a 6600K at 4.65GHz (a reasonable overclock) then it's running ~41% higher clocks. That will show in full in some games, and in part in others (the ones that are not as heavily CPU bound).

It's not difficult to get numbers 30-40% higher than a 2500K by using a 6600K with 3200C16 RAM, especially if the 2500K is using slow RAM.
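The ~41% figure is just the clock ratio against the 6500's 3.3GHz all-core turbo mentioned earlier in the thread (my arithmetic, and it assumes equal IPC at both clocks, i.e. a purely CPU-bound workload):

```python
stock_all_core_ghz = 3.3  # i5 6500 all-core turbo, per headd's post
oc_ghz = 4.65             # the hypothetical 6600K overclock above

gain_pct = (oc_ghz / stock_all_core_ghz - 1) * 100
print(f"{gain_pct:.0f}%")  # ~41%
```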


----------



## chronicfx

Great testing done here! We already knew the result but he showed it better than I have seen before. Faster RAM = better minimum FPS.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Cyro999*
> 
> The tests in this video are against an i5 6500. If you use a 6600k at 4.65ghz (a reasonable overclock) then it's ~41% faster. That will show in full in some games, and in part for others (the ones that are not heavily CPU bound enough).
> 
> It's not difficult to get numbers that are 30-40% higher than a 2500k by using a 6600k +3200c16 RAM, especially if the 2500k is using slow RAM.


They used i5 6500 with OC too.


----------



## Assirra

Quote:


> Originally Posted by *Cyro999*
> 
> The tests in this video are against an i5 6500. If you use a 6600k at 4.65ghz (a reasonable overclock) then it's ~41% faster. That will show in full in some games, and in part for others (the ones that are not heavily CPU bound enough).
> 
> It's not difficult to get numbers that are 30-40% higher than a 2500k by using a 6600k +3200c16 RAM, especially if the 2500k is using slow RAM.


The tests are less about how it competes with the next generation and more about how it still performs.
The only reason they included the i5 6500 was that they needed something to compare against.


----------



## Liranan

Quote:


> Originally Posted by *Scotty99*
> 
> http://store.steampowered.com/hwsurvey
> 
> .07% of people on steam survey have a 4k monitor. 35+% have a 1080p display.
> 
> Come on guys lol.
> 
> Im not gonna reply to any of this nonsense anymore, lets keep to the topic.


Are you serious? According to that same survey there are people with DX8 and below capable video cards and I bet those people outnumber people with Titans.


----------



## jsc1973

Quote:


> Originally Posted by *Imouto*
> 
> OC.net seems to lack a lot in common sense and empathy. These forums will make you feel like a worm if you don't spend an elastic amount of money to barely improve your performance in 1% of your usual workload.


I think your issue is your own. There are people here still running Core 2 Quads because they can't just upgrade on a whim and asking questions, and no one makes them feel like a worm for asking.

There are people here running C2Q's and Athlon II X4's, stuff like that, and people running 5960X with Titan X dual-SLI. Everyone is welcome unless they're trolling. What I've got will match a stock 2500K on a good day, with a five-year-old GPU in tow, although I've built better rigs for others. In any case, I find that I still have a lot of knowledge to contribute.

Most people upgrade because their system starts lagging when they're playing a game or something else they like to do. Some just do it because they like to have the latest hardware. OCN is here to help people get the best out of whatever they have.


----------



## Alvarado

Quote:


> Originally Posted by *Liranan*
> 
> Are you serious? According to that same survey there are people with DX8 and below capable video cards and I bet those people outnumber people with Titans.


You'd be surprised. I was reading this thread about Blizzard raising the system requirements for World of Warcraft's Legion expansion. Should have seen the outcry about it. The thread.


----------



## zealord

Quote:


> Originally Posted by *xutnubu*
> 
> Frame-rate chart.
> 
> 2500K v. 6500:
> 
> 
> 
> 3770K v. 2500K v. 6500:
> 
> 
> 
> 3770K v. 2500K v. 6500(minimum frame-rates):
> 
> 
> 
> Also, the written article: http://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k


Very interesting. I never imagined the difference was that huge in those games. (I wish they would add some tests with games at low settings, though. I am surprised a single Titan X gets framerates north of 100 in those demanding games on Ultra.)

Let's see how expensive good DDR4 RAM is this year, when I want to upgrade around the time Polaris/Pascal/new CPUs come out.

The 2500K still gets a good framerate, but they are pairing it with 2133MHz RAM, something I do not have.

Looks like I will prioritize CPU + RAM this year over a good GPU.

Interesting times ahead. Interesting indeed, but expensive.


----------



## jsc1973

Quote:


> Originally Posted by *zealord*
> 
> The 2500K still has good a framerate, but they are pairing it with 2133mhz RAM. Something I do not have.


The stuff is dirt-cheap right now. Why not just get some 2133 and enjoy better framerates?

I don't particularly need 16 gigs of RAM, but when I saw a 2x4 kit for $44, I figured I might as well get it.


----------



## zealord

Quote:


> Originally Posted by *jsc1973*
> 
> The stuff is dirt-cheap right now. Why not just get some 2133 and enjoy better framerates?
> 
> I don't particularly need 16 gigs of RAM, but when I saw a 2x4 kit for $44, I figured I might as well get it.


I never put much thought into RAM and mainboards.

I have an ASRock Z68 Pro3 Gen3. Does it even support 2133MHz RAM?

You are right. Prices are actually pretty low for RAM.


----------



## DesertRat

Looks like I'll be happy w/ my OC'd 3770K and OC'd 290X for awhile.


----------



## jsc1973

Quote:


> Originally Posted by *zealord*
> 
> I never put much thought into RAM and mainboards.
> 
> I have an ASRock Z68 Pro3 Gen3. Does it even support 2133MHZ RAM?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You are right. Prices are actually pretty low for RAM.


The compatibility list for that board lists RAM with speeds as high as 2400: http://www.asrock.com/mb/Intel/Z68%20Pro3%20Gen3/?cat=Memory

I would suspect that any reputable-brand DDR3-2133 would work. Intel platforms aren't known for being fussy about RAM.

Newegg will sell you 8 gigs of 2133 CAS9 for $34.99 right now. I paid $44 for 8 gigs of the CAS10 version on Christmas Eve and thought I got a deal.


----------



## zealord

Quote:


> Originally Posted by *jsc1973*
> 
> The compatibility list for that board lists RAM with speeds as high as 2400: http://www.asrock.com/mb/Intel/Z68%20Pro3%20Gen3/?cat=Memory
> 
> I would suspect that any reputable-brand DDR3-2133 would work. Intel platforms aren't known for being fussy about RAM.
> 
> Newegg will sell you 8 gigs of 2133 CAS9 for $34.99 right now. I paid $44 for 8 gigs of the CAS10 version on Christmas Eve and thought I got a deal.


It is actually not a bad idea. The question, though, is how the results would look with an AMD card comparing 1333MHz and 2133MHz RAM. Would it be roughly the same as with Nvidia, even with AMD's DX11 CPU overhead?


----------



## DJ XtAzY

I think my i7 920 @ 4.0GHz will rock happily for another 3 years. I don't notice any sluggishness in the OS or browsing, compared to how four-year-old Core 2 Duo chips felt back around 2010. After 3 years, I might consider getting a new system.


----------



## ChampN252

I'm not gonna throw much 4K mud, but I knew it was legit when the Mantle drivers were right and I was able to play DAI at 60 fps with graphics maxed. A sight to behold.


----------



## ZealotKi11er

Quote:


> Originally Posted by *zealord*
> 
> It is actually not a bad idea. The question, though, is how the results would look with an AMD card comparing 1333MHz and 2133MHz RAM. Would it be roughly the same as with Nvidia, even with AMD's DX11 CPU overhead?


Even more of a difference, lol.


----------



## Ragsters

If you guys are looking for some 2133mhz ram and/or an i7 3770k check out my for sale thread.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *zealord*
> 
> For me it is more like this :
> 
> A CPU has to be able to do 60fps minimum fps for every single game save those that are badly optimized. This isn't the case with the 2500K anymore.
> 
> Even if lets just hypothetically say a 2500K can manage 50fps and a 6700K can manage 70fps (both as minimum framerate) in game X then the value of the 6700K is much higher, for me personally, than the actual difference in fps.
> 
> It may not be worth the money on paper, but it is definitely worth the money for your eyes and hands if you are one of those people that can feel the difference between 50 and 60 fps.


Lol, I am fine playing games at an AVERAGE of 30 FPS. I used to have a 120Hz monitor and, sure, gaming at high frame rates is nice and smooth but I certainly don't consider it a necessity. Of course, my setup has been capable of far more than 30 FPS in most games for a long time now so I may feel differently about it now...


----------



## Kana Chan

Would there be any performance gains if they used DDR4-3866 to DDR4-4266 RAM?

Or DDR3-2800 at CAS 12?


----------



## zealord

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Lol, I am fine playing games at an AVERAGE of 30 FPS. I used to have a 120Hz monitor and, sure, gaming at high frame rates is nice and smooth but I certainly don't consider it a necessity. Of course, my setup has been capable of far more than 30 FPS in most games for a long time now so I may feel differently about it now...


I wish I could play at an average of 30fps as well, but damn, I can't. An average of 30fps means you also get drops below 30 and spikes above it. That is absolutely horrible in my personal opinion, and I would consider any game that hovers between 25-35 fps completely and utterly unplayable on PC.

Even hovering from 50-70 feels much worse than a rock-solid 60.

With a controller it is not as big a problem as with mouse and keyboard, but I also played CS 1.6 for about 5 years at an amateur-professional level, and anything that wasn't a rock-solid locked 100fps was considered a huge disadvantage, in that you couldn't control the weapon recoil as well as at 100fps.

Maybe we just have a different definition of "playable". I get that a game is playable at 30fps. I have a PS4 and games run at 30fps on it, but a TV + wireless controller + controllers in general have a huge delay compared to M&KB anyway.

Yes, 30fps is playable, but 30fps is only as playable as microwave food is edible, in my opinion. It gets you through the day, but crushes your soul.


----------



## Alvarado

Quote:


> Originally Posted by *zealord*
> 
> Yes 30fps is playable, but 30fps is only as playable as microwave food is edible in my opinion. It gets you through the day, but crushes your soul.


Never thought I'd see someone compare frame rate to microwave food.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *zealord*
> 
> I wish I could play at an average of 30fps aswell, but damn I can't. average 30fps means that you also have drops to lower than 30fps and spikes above 30fps. That is absolutely horrible in my personal opinion and I would consider any game that hovers between like 25-35 fps completely and utterly unplayable on PC.
> 
> Even hovering from 50-70 feels much worse than rock solid 60.
> 
> With a controller it is not as big a problem as with Mouse and Keyboard, but I also played CS 1.6 for like 5 years on an amateur-professional level and everything that wasn't a rock solid locked 100fps was considered a huge disadvantage that made the game unplayable in a regard that you couldn't control the weapon recoil as well as with 100fps.
> 
> Maybe we just have a different definition of "playable". I get that a game is playable at 30fps. I have a PS4 and games run ot 30fps on it, but TV + wireless controller + controllers in general have a huge delay compared to M&KB anyways.
> 
> Yes 30fps is playable, but 30fps is only as playable as microwave food is edible in my opinion. It gets you through the day, but crushes your soul.


Eh, I am far from a competitive gamer. But yes, I can play a game like Crysis 3 at 1080p on my 2600k/270X backup rig at ~30 FPS and have an enjoyable time. Certainly higher frame rates are preferable, but it's nothing I can't deal with.

And yes, to me the term "unplayable" gets thrown around waaaaaaay too often around here (it's one of my pet peeves). Unplayable, going by the actual definition of the word, would be a frame rate so low that you literally could not control the game properly (think below 15 FPS or so). Sure, we all WANT higher frame rates, but being elitist enough to claim you flat-out cannot play a game unless it's at 100+ FPS is the kind of thing that grinds my gears, tbh...


----------



## zealord

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Eh, I am far from a competitive gamer. But yes, I can play a game like Crysis 3 at 1080p on my 2600k/270X backup rig at ~30 FPS and have an enjoyable time. Certainly higher frame rates are preferable but its nothing I can't deal with...


That is good for you and I envy you for that. I wish I wasn't as sensitive to stuff like that.

I get physically sick from things like 3D, low FoV, low framerate, head bobbing, and blur effects.

Sadly there's nothing I can do about it, and actually more people than you might think suffer from this when playing games.


----------



## randomizer

Still running an i7 920 with 1066MHz DDR3. It runs most things OK, although I wouldn't want to be doing renders or video encoding with it regularly. My GTX 970 is probably still holding me back, given that I run at 2560x1440. When it runs out of juice I might even overclock it.


----------



## magnek

Well after 30 minutes of tweaking I'm happy to say my sticks are indeed good for 2400 @ 10-12-12-31 1T. Best part is I didn't even have to increase either ram voltage (1.5V) or IMC voltage (1.1V). So much for XMP recommending 1.65V ram and 1.2V IMC lol. Granted I only tested Prime95 for 2 hours but I know that's more than enough based on my own usage patterns.


----------



## Sin0822

I'm just going to put this out there, but DDR4 timings make a huge difference. The gains from higher speeds can easily be undone by higher latencies. I wish I knew what they used for their tests.
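To illustrate the speed-vs-latency trade-off, first-word latency in nanoseconds is the CAS count divided by the actual I/O clock (half the MT/s transfer rate). The speed grades below are generic JEDEC-style examples, not the timings DF actually used:

```python
def first_word_latency_ns(cas: int, transfer_rate_mts: int) -> float:
    """First-word latency in nanoseconds.

    DDR transfers twice per clock, so the I/O clock in MHz is half the
    MT/s rating; latency = CAS cycles / clock = cas * 2000 / MT/s.
    """
    return cas * 2000 / transfer_rate_mts

# A faster speed grade can win on real latency despite a looser CAS:
print(first_word_latency_ns(15, 2133))  # DDR4-2133 CL15 -> ~14.07 ns
print(first_word_latency_ns(16, 3000))  # DDR4-3000 CL16 -> ~10.67 ns
# ...but a big enough CAS bump erases the gain from the higher clock:
print(first_word_latency_ns(22, 3000))  # DDR4-3000 CL22 -> ~14.67 ns
```

So a high-MT/s kit with sloppy timings can end up no quicker to first word than a slower, tighter kit, which is exactly the point about review test configurations mattering.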


----------



## Derp

Quote:


> Originally Posted by *magnek*
> 
> Well after 30 minutes of tweaking I'm happy to say my sticks are indeed good for 2400 @ 10-12-12-31 1T. Best part is I didn't even have to increase either ram voltage (1.5V) or IMC voltage (1.1V). So much for XMP recommending 1.65V ram and 1.2V IMC lol. Granted I only tested Prime95 for 2 hours but I know that's more than enough based on my own usage patterns.


You might want to run some more memory specific stability tests. Ram that isn't stable will quickly destroy your install. 2400 at 1.5v is probably asking too much, especially when the XMP profile asks for 1.65v.


----------



## BeerPowered

I just upgraded from an i7 2600K to an i7 5930K, and I didn't do it just for speed but for the newer technologies. The 2500K/2600K are stuck on PCIe 2.0, for example, and in SLI you would start bottlenecking high-end cards. Now I have 40 PCIe 3.0 lanes, a UEFI mobo, and an M.2 Ultra SSD.

My old rig (2600K/7970GHz/16GB 2133MHz RAM) will probably get converted to a Storage Spaces server.

EDIT: The 950 Pro NVMe Ultra M.2 SSD is SOOOO worth it. It crushes my 840 Pro.


----------



## defhed

Quote:


> Originally Posted by *Chargeit*
> 
> Well, likely because those of us that warn against AMD driver overhead end up trolled out of a conversation. I stopped giving the warnings myself. I also no longer send people towards this site for help with PC builds.
> 
> I'd return the 1600 ram, and get some 2133.


Will Microcenter accept the return after I've already opened and been using it?
Quote:


> Originally Posted by *zealord*
> 
> Maybe you should launch the "AMD GPU Warning Initiative" and hand out leaflets and make signs that say "God hates AMD. Go Nvidia!" to promote Nvidia sales. Would also go great with Green glasses and stickers that say "Buy Nvidia. Make Nvidia great again!".
> And then you build a wall around the AMD headquarters and say "Keep AMDigrants out".
> 
> I am just joking of course. Maybe.


I mean, I've never preferred one or the other... I like the competition, and I want the competition. I'll buy whichever card is the better deal for what I'm trying to get. But I think this CPU driver overhead thing is complete bullsh*t; AMD should fix it, and I should've bought Nvidia.


----------



## magnek

Quote:


> Originally Posted by *Derp*
> 
> You might want to run some more memory specific stability tests. Ram that isn't stable will quickly destroy your install. 2400 at 1.5v is probably asking too much, especially when the XMP profile asks for 1.65v.


Well, the XMP profile also asks for Vccsa to be 1.2V, while my 4930K ran just fine with 1.1V. I think XMP is more of a "worst case scenario" guideline intended to cover 99% of all cases rather than something set in stone. But yeah, I'll look into either AIDA64 or Memtest at some point. Worst case, I just up the RAM voltage to 1.65V and call it a day.


----------



## HITTI

I've got the 3770K delidded and watercooled. I love it; it's fast, strong and powerful. It's not the 2500K, but it's in the same category.

Atm I don't see a reason to upgrade. I don't play the newer games; I play the CoD series up to CoD:WaW, CS:GO, etc., and my processor and graphics card handle my games perfectly, matched with a pair of 840 Pros in RAID 0.


----------



## Olivon

Quote:


> Originally Posted by *Blameless*
> 
> Even my original Nehalems hold up well. I have an i7 920 system (@ 4GHz) and a Xeon W3540 (@ 3.6GHz) that I just gave away, neither of which are wanting for CPU power in most games. The parts are from 2009.


It really depends on games and scenes.
In a CPU-bound scenario I've seen about a 5fps difference myself between an overclocked 2600K and an overclocked 4790K (4.7GHz for both) in Crysis 3's CPU-bound grass stage, at 1080p, 2x AA, maxed details.
Between a 920 at 4GHz and a 6700K at 4.7GHz, the difference must be really significant in some cases.

Around a ~7 fps difference between a 4770K at 3.5GHz and at 4.2GHz, HT on.



http://www.hardware.fr/focus/101/perfs-avec-2-4-6-8-coeurs-4-jeux-loupe.html

*Translated*


----------



## OmegaRED.

Amazing video. Thank you, Digital Foundry! I went straight to NCIX and ordered 16GB of G.Skill Ripjaws X 2133.

I have 8GB of Corsair Vengeance 1600 powering SLI GTX 980s on a [email protected], so this will be a perfect upgrade. The last squeezing of performance possible.

I had read years ago here that RAM speed didn't matter with Sandy Bridge and held out.


----------



## rdr09

Quote:


> Originally Posted by *defhed*
> 
> Will Microcenter accept the return after I've already opened and been using it?
> I mean.. I've never preferred one of the other... I like the competition.. and I want the competition... I'll buy which ever card is the better deal for what I'm trying to get... But I think this CPU driver overhead thing is complete bullshi and AMD should fix it and I should've bought Nvidia.


Your OC'ed i5 is struggling to keep up with the 380X because of AMD's CPU overhead? How so? Like I said earlier in the thread, I had a single 290, I always keep HT off, and my i7 is only at 4.5GHz. My CPU never bottlenecked my 290, and I only have 1600 RAM handed down from an X58 system.

Maybe if you have a Phenom II quad.


----------



## Cyro999

Quote:


> My cpu never bottlenecked my 290


You're either not playing any games that are CPU heavy at any points or you don't know how to look at performance correctly - if you were, that statement wouldn't be true.


----------



## rdr09

Quote:


> Originally Posted by *Cyro999*
> 
> You're either not playing any games that are CPU heavy at any points or you don't know how to look at performance correctly - if you were, that statement wouldn't be true.


Tell that to the guy who has 2 980s with his i5.


----------



## Clocknut

Quote:


> Originally Posted by *Chargeit*
> 
> Well, likely because those of us that warn against AMD driver overhead end up trolled out of a conversation. I stopped giving the warnings myself. I also no longer send people towards this site for help with PC builds.
> .


Well... until AMD fixes this problem and is on par with Nvidia, they will be on my no-buy list.


----------



## Dom-inator

Just upgraded from an i5 760 to an i5 6600K OC'd to 4.7GHz... Now that's a difference.


----------



## N3G4T1v3

Time to upgrade my 2500k? NEVER!
My 2500K has served me well clocked at 4.7GHz, and it's been running like that for 4 years. I'm still keen to try to get it stable at 5GHz, but my cooler was already borderline on temps during the Prime95 burn-in run. Not that I've ever reached those temps under normal operation.

What I like about this article is that my CPU was running ahead of the game for 4 years; it's taken Intel this long to catch up. This makes me happy, even more so because my CPU will still be relevant for the next 2 years, if the mobo doesn't conk out on me beforehand.

It's going to be a hard call to find a replacement that will fill these shoes.


----------



## essanbee

I'm waiting for Intel to release the 4 core Sandy Bridge add on kit. That ought to hold me for another 5 years.


----------



## jdstock76

It was time to upgrade with IB.


----------



## jmcosta

What about 4.8 or 5GHz? Sandy Bridge was a chip capable of overclocking that high on mid-range cooling, some even hitting 5.1GHz at 1.47-1.5v, and it doesn't have heat output as high as Haswell's.


----------



## Chobbit

Quote:


> Originally Posted by *Scotty99*
> 
> Its funny how many people have said in here i miss my 2500k or just have good memories of it.....why did you upgrade lol.


I was given an i7 3930K and just had to buy the board 2 years ago (wasn't impressed), then had the chance just over a year ago to get an i7 5930K (OC'd now to 4.9), a SABERTOOTH X99, 32GB of DDR4 RAM, an AX1200i and 2 x 980s for £750 (from a friend who needed to sell up quick and move away). By my calculations, that means I paid for one and a half of the GPUs and the rest came free, which just about makes up for the loss of my 2500K, I think.

lol


----------



## Assirra

Quote:


> Originally Posted by *jmcosta*
> 
> what about 4.8 or 5ghz? sandy bridge was a chip capable of overclocking that high with a mid end cooling, some even 5.1 with 1.47-1.5v and it doesn't have heat output that high like haswell


They probably just used an average aircooled overclock without much work.
Yes there are crazy people that nearly blow up their CPU but those are in general pretty rare.


----------



## twitchyzero

They added a new video and threw the 3770K into the mix.

I wanna see them add a 4K test... the gains of Skylake are probably even less relevant at higher resolutions, since you're GPU-bound?


----------



## Assirra

Quote:


> Originally Posted by *Olivon*
> 
> It really depends on games and scenes.
> In CPU bound scenario, I seen myself ~5fps difference between 2600K OC and 4790K OC (4.7GHz for both) on Crysis 3 CPU bound scene (grass stage), in 1080p AA 2X maxed details.
> Between a 920 4GHz and a 6700K 4.7GHz, difference must be really important in some case.
> 
> 
> 
> 
> Around ~7 fps ratio, between 4770K 3.5GHz and 4.2GHz, HT on.
> 
> 
> 
> http://www.hardware.fr/focus/101/perfs-avec-2-4-6-8-coeurs-4-jeux-loupe.html
> 
> *Translated*


Hmm, is there an easy way to check if HT is on or not in Windows?
Like a console command or something?


----------



## Kokin

I had a 2550K (no-iGPU) that ran at 5GHz and switched to a 3570K that has ran 4.7GHz all of its life. None of the newer Intel offerings really appeal to me and I will probably run my 3570K until there's another "Sandy Bridge" CPU with a huge jump in performance.


----------



## corky dorkelson

I have a question. Do you think faster ram would help with lower-spec CPUs like the i5 2400 for example? Would faster ram help me squeak out a tiny bit more performance?


----------



## magnek

Quote:


> Originally Posted by *Assirra*
> 
> Hmm, is there an easy way to check if HP is on or not in windows?
> Like a console command or something?


Open task manager (ctrl+shift+esc), go to the performance tab, and count the number of threads available.
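The same logical-vs-physical comparison can also be scripted. This is only a sketch: it uses Python, and the physical-core lookup is Linux-specific (it parses `/proc/cpuinfo`, ignoring multi-socket systems); on Windows, the Task Manager route above is the simpler answer:

```python
import os

def physical_cores_per_socket():
    """Linux-only sketch: read the "cpu cores" field from /proc/cpuinfo.

    Returns None when the file or field is unavailable (e.g. non-Linux),
    so callers can fall back gracefully.
    """
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("cpu cores"):
                    return int(line.split(":")[1])
    except OSError:
        pass
    return None

logical = os.cpu_count()          # logical CPUs, i.e. hardware threads
physical = physical_cores_per_socket()
if physical:
    # SMT/Hyper-Threading exposes more logical CPUs than physical cores.
    print("SMT/Hyper-Threading:", "ON" if logical > physical else "OFF")
else:
    print(f"{logical} logical CPUs (physical count unavailable)")
```

On an HT-enabled quad core this reports 8 logical CPUs against 4 physical cores, the same comparison you would make by counting graphs in Task Manager.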


----------



## PCSarge

Quote:


> Originally Posted by *Cakewalk_S*
> 
> Hows that even possible? I've been running my 2500k at 4.5ghz for 4 years... No sign of degregation.


Yeah, mine's still chugging along at 4.8GHz... on an ITX mobo. Mind you, this was the top-end enthusiast ITX mobo of the time.


----------



## Majin SSJ Eric

My 2600k used to be able to do 5GHz when I got it new back in April, 2011 but since then I've had to lower it down to 4.7GHz as it became unstable. Still, a 4.7GHz 2600k is plenty fast and I haven't had to lower the OC at all in the past few years. My old 3960X used to do 5GHz as well but one day it crapped the bed and would barely OC at all anymore. Turned it in under my IPP and got a new one, which I sold to buy my 4930k, which has been a crap overclocker since new (will only do 4.6GHz stable). Has not degraded any from there though since November, 2013...


----------



## chronicfx

I saw a big difference with my 3570K. It was clocked at 5GHz, and I was originally using the 1600MHz RAM I purchased with it (I liked the look of the Corsair LPX, and Microcenter didn't have many choices on the wall that day). Then I was reading a thread on OCN where a guy was preaching all about RAM and minimum framerates. So I purchased a 4x4GB set of Ripjaws Z at CAS 10-12-12-24 @ 2400MHz and truly and honestly noticed the difference just on boot-up; I had never seen all of my icons hit the screen at once like that. In games it made a huge difference in keeping the minimums and average frames tighter. I swore by this RAM on my Ivy system; I even repurchased the exact same set for my Haswell system. If you are still on DDR3, I feel that CAS 10 at 2400MHz is the sweet spot for gaming.

http://www.newegg.com/Product/Product.aspx?Item=N82E16820231586

Good solid ram


----------



## clerick

Quote:


> Originally Posted by *chronicfx*
> 
> I saw a big difference with my 3570k, it was clocked at 5 GHz and I was originally using the 1600Mhz ram I purchased with it (I liked the look of the corsair LPX and microcenter kinda did not have many choices on the wall that day). But then I was reading on OCN in a thread and the guy was preaching all about ram a minimum framerate. So I purchased a 4x4gb set of Ripjaw-Z with cas10-12-12-24 @2400MHz and truly and honestly noticed the difference just on the boot up.. I had never seen all of my icons hit the screen at once like that. In games it made a huge difference in keeping the minimums and the average frames tighter. I swore by this ram on my ivy system, I even repurchased the exact same set for my haswell system. If you are still on DDR3 I feel that cas 10 2400mhz is the sweetspot for gaming.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16820231586
> 
> Good solid ram


Tempting, but I wonder if my aging Z77 would support this RAM.


----------



## corky dorkelson

Quote:


> Originally Posted by *clerick*
> 
> Tempting, but i wonder if my aging z77 would support this ram.


My Z68 didn't even like 1866 RAM, but I just bought a 2133 kit to try out. I also have a Z77 board, so I will be trying it there as well.


----------



## clerick

Quote:


> Originally Posted by *corky dorkelson*
> 
> My Z68 didn't even like 1866 ram. But I just bought a 2133 kit just to try out. I also have a z77 board, so I will be trying it there also.


Hoping for the best; I'll be waiting on your results!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> My 2600k used to be able to do 5GHz when I got it new back in April, 2011 but since then I've had to lower it down to 4.7GHz as it became unstable. Still, a 4.7GHz 2600k is plenty fast and I haven't had to lower the OC at all in the past few years. My old 3960X used to do 5GHz as well but one day it crapped the bed and would barely OC at all anymore. Turned it in under my IPP and got a new one, which I sold to buy my 4930k, which has been a crap overclocker since new (will only do 4.6GHz stable). Has not degraded any from there though since November, 2013...


I ran my 2500K @ 4.8GHz with 1.45v for a year with no problems. Got a 3570K, then a 3770K, and could not get over 4.6GHz even though the 3770K was the better CPU. I even delidded the 3770K, but no matter the voltage, 4.7 or 4.8 was not attainable. Still, at 4.6GHz I have had a single crash since day one: a super stable OC with +165mV. I am hoping whatever CPU I get next can at least OC better so I get better performance. At 4K, sadly, I don't really care about CPU performance anymore; in most games CPU usage is like 10-20% while the GPU is at 99%. Games run so much better when the CPU is not stressed, since you get more stable frame times, even though they are slower because fps is lower at 4K.


----------



## Punisher64

Article says 5 year old CPU is still good...makes me happy to still be on x58 with my trusty x5670


----------



## ZealotKi11er

Quote:


> Originally Posted by *Punisher64*
> 
> Article says 5 year old CPU is still good...makes me happy to still be on x58 with my trusty x5670


Yeah you are still good but if you are looking to get the most fps 6700K is a good upgrade.


----------



## Scotty99

It was just such a huge upgrade over the previous stuff: easy to OC, didn't require serious cooling, and almost all chips could surpass 4.5GHz. Of course the new stuff is faster, but that really isn't the point. Who wants to buy a 4790K where you could maybe get 300MHz over stock and have to buy $100+ cooling to get there (and some chips you had to take apart, voiding the warranty!)? lol

Sandy was just perfect, and I won't upgrade until I see a 3x improvement in real-world tasks. Guess I'll have this thing another 10 years : )


----------



## ZealotKi11er

Quote:


> Originally Posted by *Scotty99*
> 
> It was just such a huge upgrade over the previous stuff, easy to OC, didnt require any cooling whatsoever, and almost all chips could surpass 4.5ghz. Of course new stuff is faster, but that really inst the point. Who wants to buy a 4790k where you could maybe get 300mhz over stock and have to buy (and some chips you had to take apart voiding the warranty!) 100+$ cooling to get there? lol
> 
> Sandy was just perfect, and i wont upgrade until i see a 3x improvement in real world tasks. Guess ill have this thing another 10 years : )


3x performance per watt, maybe; 3x performance will be hard. Even the 6700K right now is probably not 3x over a Q6600 in everything. Looking at 3DMark CPU scores, my Phenom II @ 4.2GHz gets ~6.2K vs ~12.2K for a 3770K @ 4.6GHz. A 6700K gets ~14-14.5K with an OC, and a 4790K ~13K. You would need to wait a lot more than 10 years for 3x the performance, assuming 4 cores + HT.
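Plugging in the rough 3DMark CPU figures quoted above (user-reported ballpark numbers, not independently benchmarked) makes the point concrete:

```python
# Approximate 3DMark CPU scores as quoted in the post above.
scores = {
    "Phenom II @ 4.2GHz": 6200,
    "3770K @ 4.6GHz": 12200,
    "4790K": 13000,
    "6700K OC": 14250,
}

# Express everything relative to the oldest chip in the list.
baseline = scores["Phenom II @ 4.2GHz"]
for cpu, score in scores.items():
    print(f"{cpu}: {score / baseline:.2f}x")
```

Even the overclocked 6700K lands around 2.3x the Phenom II here, so a 3x jump over a same-core-count quad really would take more than one more generation.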


----------



## Chargeit

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I ran my 2500K @ 4.8GHz with 1.45v for 1 year with no problems. Got a 3570K then 3770K and could not get over 4.6GHz even though 3770K was a better CPU. I even dellided 3770K but no matter the voltages 4.7 or 4.8 was not attainable. Still 4.6GHz I have had a single crash since day 1. Super stable OC with +165mV. I am hoping what ever CPU I get next can at least OC better so I get better performance. Sadly now 4K I dont really care about CPU performance. Most games CPU is like 10-20% usage while GPU is 99%.
> 
> 
> Games run so much better when the CPU is not stressed, since you get more stable frame times, even though they are slower because fps is lower at 4K.


Games really do run smoothly at 4k. Even if you're getting low fps. Removing the CPU as a bottleneck makes a huge difference in smoothness.


----------



## Punisher64

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah you are still good but if you are looking to get the most fps 6700K is a good upgrade.


Lol bro I play CS:GO and a bit of DotA...I'm good


----------



## WolfssFang

Very interesting video, now is it worth upgrading from my 1600 or just wait to upgrade the cpu, decisions decisions.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Punisher64*
> 
> Lol bro I play CS:GO and a bit of DotA...I'm good


I play Dota 2 and @ 4K I am GPU limited for the first time. Getting ~ 80 fps average.


----------



## Scotty99

I just gotta post this lol, :

https://twitter.com/AraxxisTwitch/status/699310043020156929

This is my tweet to luke from linus tech tips 5 days BEFORE digital foundrys video was posted, asking him to do a sandy vs skylake test cause people wanna know.

Sigh luke lol, imagine the views.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Scotty99*
> 
> I just gotta post this lol, :
> 
> https://twitter.com/AraxxisTwitch/status/699310043020156929
> 
> This is my tweet to luke from linus tech tips 5 days BEFORE digital foundrys video was posted, asking him to do a sandy vs skylake test cause people wanna know.
> 
> Sigh luke lol, imagine the views.


Sorry, but big YouTubers like Linus Tech Tips do not give a damn. There is nothing professional about them; they can never reach the same quality of game testing as DF.


----------



## Scotty99

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Sorry but big Youtubers like Linus Tech do not give a dam. There is nothing professional about them. They can never reach same quality of game testing as DF.


Well, in actuality Linus started out as a little guy as well, and Luke now basically has his own channel that pulls suggestions from Twitter and his own forums. He really did mess up not taking my suggestion, as evidenced by how much airplay the DF video has garnered.


----------



## Punisher64

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I play Dota 2 and @ 4K I am GPU limited for the first time. Getting ~ 80 fps average.


I play at 1080p with SLI 770's and my x5670....I'm all gucci haha


----------



## Scotty99

But maybe now luke will pay attention to my comments on his twitch page lol!


----------



## PCGamer4Ever

Found this interesting, but nothing new to me. I have a system at my dad's for when I visit and want to game: an i7 870 (stock speeds) paired with a 7870 for video. I can still run most games at 1080p in high detail and the gameplay is smooth. It's nowhere near the performance of my home gaming rig, and I can tell the difference, but it still has more than enough kick to let me game and have fun doing it.


----------



## Majin SSJ Eric

Agreed, Lynnfield is still a pretty capable CPU in its own right. When I built my first rig back in 2011 I even considered 1156 for my CPU, mostly because of the SB mobo snafu going on at the time (you couldn't get 1155 mobos), but I ended up going ahead with a 2600K and a recalled board I found on the shelf at CompUSA until I could get my real mobo, the P67 Sabertooth, which I still have to this day.


----------



## Olivon

Sandy Bridge was really a great arch.
Like I always do, I gave my old rig to my best friend, but before handing it over I checked the platform with several load tests in order to tweak 4 BIOS profiles.
The default mode I chose is 4.5GHz, with a ridiculous offset of +0.005 and LLC on a medium setting: a +30% overclock with just ~1.26v under heavy load.
During summer 2013 I tested underclocking/undervolting with this 2600K, and I was able to run 4GHz fanless with a Prolimatech Megahalem (1.157v under heavy load).

A terrific processor; loved it.


----------



## meowth2

I still use a first-gen i7-980X, which is even older than the second-gen 2500K, and I still don't have problems running games.


----------



## 8800GT

Quote:


> Originally Posted by *meowth2*
> 
> i still use 1st gen i7-980x, which is even older than this 2nd gen 2500k. i still don't find problems running games


Nehalem was definitely a beast, the 980X being _the_ absolute beast, no doubt. But I think what is most impressive about the 2500K is that it was, and is, a mid-range CPU that launched for around $200 and is still a very capable processor to this day. I don't want to sound presumptuous, but we may never see a CPU like it again. I actually had mine from the first week it launched, running at around 4.8-5GHz its entire lifetime; it went through at least 3 GPU swaps and is still soldiering on as my friend's CPU.


----------



## chronicfx

Quote:


> Originally Posted by *8800GT*
> 
> Nehalem was definitely a beast, the 980x being _the_ absolute beast no doubt. But I think what is most impressive about the 2500k is that it was/is a mid-range cpu that launched for around 200$ and is still a very capable processor even to this day. I don't want to sound presumptuous but we may never see a CPU like it again. I actually had mine from the first week it was launched running at around 4.8-5ghz it's entire lifetime, went through at least 3 gpu swaps and it's still soldiering on as my friends CPU.


Your screenname needs a GPU upgrade as well, I see


----------



## The-Beast

Quote:


> Originally Posted by *8800GT*
> 
> But I think what is most impressive about the 2500k is that it was/is a mid-range CPU that launched for around $200 and is still a very capable processor even to this day.


I really hate this line of thinking. It's not impressive. The only thing impressive about the 2500k is that Intel managed to milk a bunch of idiots for 5-6% performance on 4 generations of successors. They've stalled progress so blatantly in the consumer market that 10 years after quad cores were introduced they still haven't gone up from that count. This despite the fact that we've had 4 node shrinks since Kentsfield.

It's not impressive for developers to not implement decent multithreading 15 years after x64 is released.
It's not impressive that consumers have received nominal benefits from hundreds of billions in sales.
People want to praise Intel for scraps? You're part of the problem.


----------



## randomizer

Quote:


> Originally Posted by *The-Beast*
> 
> It's not impressive for developers to not implement decent multithreading 15 years after x64 is released.


I agree that Intel hasn't pushed the performance boundaries very hard of late (low power consumption is in vogue), but this statement is odd. What is "decent" multithreading, and what do multithreading and x64 have to do with each other?


----------



## Cakewalk_S

I'm really hoping the multithreading situation isn't as bad as I think. Yes, I've had a 4-core 2500k for as long as I can remember, but I'm now looking at getting a Skylake laptop with either the i5-6300HQ or i7-6700HQ, and the performance gap between the two is quite decent, not only in single-core performance because of core clock; the multithreaded performance beats my overclocked 2500k. I know some more modern games have done a good job of multithreading and spreading out to at least 8 threads, which would be good for me, but overall I hope most programs do a good job of utilizing 8 threads...


----------



## BigMack70

Quote:


> Originally Posted by *The-Beast*
> 
> I really hate this line of thinking. It's not impressive. The only thing impressive about the 2500k is that Intel managed to milk a bunch of idiots for 5-6% performance on 4 generations of successors. They've stalled progress so blatantly in the consumer market that 10 years after quad cores were introduced they still haven't gone up from that count. This despite the fact that we've had 4 node shrinks since Kentsfield.
> 
> It's not impressive for developers to not implement decent multithreading 15 years after x64 is released.
> It's not impressive that consumers have received nominal benefits from hundreds of billions in sales.
> People want to praise Intel for scraps? You're part of the problem.


lolwut? People are not the problem here; they're behaving exactly as you'd expect -- they buy and praise the best product available.

The problem is that Intel has no competition, and is behaving exactly as a company with no competition behaves - they don't bother with progress (in the areas where nobody can compete with them) and just milk their consumer base. If you want to point the finger somewhere, the finger deserves to be firmly pointed at AMD and their faildozer architecture. The only way out of this mess is if Zen is actually good, not if people stop "praising Intel for scraps".

People stop praising Intel for scraps --> Intel doesn't care, keeps going as normal and making their $$$
AMD develops a competitive CPU for the first time in nearly a decade --> Intel will respond by improving their product and/or dropping prices


----------



## The-Beast

Quote:


> Originally Posted by *randomizer*
> 
> What is "decent" multithreading, and what do multithreading and x64 have to do with each other?


Decent multithreading refers to the fact that many games still typically utilize a single core. So 1 core will get 90% utilization while the rest perform non sequitur tasks. x64 is used as a timeframe, the beginning of multicore consumer systems. In other words how long developers have failed to keep up with progress.
Quote:


> Originally Posted by *BigMack70*
> 
> lolwut? People are not the problem here; they're behaving exactly as you'd expect -- they buy and praise the best product available.


People can behave exactly as expected and still be part of the problem. They aren't mutually exclusive positions.


----------



## Andrew LB

Quote:


> Originally Posted by *The-Beast*
> 
> Decent multithreading refers to the fact that many games still typically utilize a single core. So 1 core will get 90% utilization while the rest perform non sequitur tasks. x64 is used as a timeframe, the beginning of multicore consumer systems. In other words how long developers have failed to keep up with progress.
> People can behave exactly as expected and still be part of the problem. They aren't mutually exclusive positions.


I've checked Task Manager in almost every game I've played over the past few years, and not a single one of them loads one core while the others have minimal activity. You probably have CPU core parking still enabled on 3 of your cores. After I upgraded to Windows 10, I went to run Ark: Survival Evolved and the game ran like hell. It turns out the upgrade re-enabled core parking, and using the utility found at the following link, the problem was solved.

http://forums.guru3d.com/showthread.php?t=392839
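For what it's worth, the same unparking can also be done with the stock Windows tools instead of a third-party utility. This is a sketch from memory, so verify the setting names on your build before relying on it; `CPMINCORES` is the documented alias for the "Processor performance core parking min cores" setting:

```shell
:: Keep 100% of cores unparked on the active power plan (run as admin),
:: then re-apply the plan so the change takes effect.
powercfg -setacvalueindex scheme_current sub_processor CPMINCORES 100
powercfg -setactive scheme_current
```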


----------



## LAKEINTEL

Because of how Windows reports scheduling, programs that track per-core usage are pretty inaccurate.

Also, of course games use more than one core; the issue is that so many of them still put a disproportionate load on a core or two. Multithreading on PC is harder than multithreading on console, so years ago lots of games didn't bother a whole lot. Now... they have to.


----------



## randomizer

Quote:


> Originally Posted by *The-Beast*
> 
> Decent multithreading refers to the fact that many games still typically utilize a single core. So 1 core will get 90% utilization while the rest perform non sequitur tasks.


Blame the OS thread scheduler. It's trying not to waste time and energy pointlessly switching the thread between cores. More even load means more cores in higher power states.

I don't know why people want high CPU utilisation all the time. If there is not enough work for a thread to do then it isn't going to use much CPU time.
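You can actually watch the scheduler do this. Here's a rough sketch (the helper name is mine, and it's Linux-only since it reads /proc/stat directly) that samples per-core busy fractions; a single CPU-bound thread tends to show up as moderate load smeared across several cores rather than one core pinned at 100%:

```python
# Sketch (Linux-only): sample per-core busy fractions from /proc/stat.
import time

def core_busy_fractions(interval=0.5):
    """Per-core busy fraction over `interval` seconds, from two snapshots."""
    def snapshot():
        stats = {}
        with open("/proc/stat") as f:
            for line in f:
                parts = line.split()
                # Per-core lines look like "cpu0 ..."; skip the aggregate "cpu" line.
                if parts[0].startswith("cpu") and parts[0] != "cpu":
                    fields = [int(x) for x in parts[1:]]
                    idle = fields[3] + fields[4]  # idle + iowait ticks
                    stats[parts[0]] = (sum(fields), idle)
        return stats

    before = snapshot()
    time.sleep(interval)
    after = snapshot()
    busy = {}
    for cpu in before:
        total = after[cpu][0] - before[cpu][0]
        idle = after[cpu][1] - before[cpu][1]
        busy[cpu] = 1.0 - idle / total if total else 0.0
    return busy

if __name__ == "__main__":
    print(core_busy_fractions())
```

Run it while a game (or any single-threaded load) is going and compare the per-core numbers against what Task Manager shows.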


----------



## Carniflex

Meanwhile in reality... it really depends on what one is doing. If the main purpose of the rig is running benchmarks, then yes, definitely, although to be accurate, in that case the logical upgrade path would be the LGA2011 platform.

For gaming... to be frank, I do not think so. If one is running a 5-year-old platform, the upgrade money would be better spent on a stronger graphics card than on a CPU upgrade. Only when everything else has been modernized, and there is still a pile of money burning a hole in one's pocket, would it be sensible to upgrade a 2500K in a gaming rig. In my experience, while the difference can show up in benchmarks, in actual gameplay one cannot tell the difference as long as the CPU is "strong enough", and the i5 2500K still is. I went a little while ago from an AMD 1055T @ 3.8GHz to an i7-3820 @ 4.3GHz, which is a somewhat larger jump in benchmark scores than going from a 2500K to a 6700K, and I did not feel any difference in actual gaming or desktop experience. The display was still running at 60 fps, and perhaps my reaction times are not good enough to tell the difference between ~65 fps and ~95 fps on a 60Hz display.


----------



## wooshna

Quote:


> Originally Posted by *Carniflex*
> 
> Meanwhile in reality... it really depends on what one is doing. If the main purpose of the rig is running benchmarks, then yes, definitely, although to be accurate, in that case the logical upgrade path would be the LGA2011 platform.
> 
> For gaming... to be frank, I do not think so. If one is running a 5-year-old platform, the upgrade money would be better spent on a stronger graphics card than on a CPU upgrade. Only when everything else has been modernized, and there is still a pile of money burning a hole in one's pocket, would it be sensible to upgrade a 2500K in a gaming rig. In my experience, while the difference can show up in benchmarks, in actual gameplay one cannot tell the difference as long as the CPU is "strong enough", and the i5 2500K still is. I went a little while ago from an AMD 1055T @ 3.8GHz to an i7-3820 @ 4.3GHz, which is a somewhat larger jump in benchmark scores than going from a 2500K to a 6700K, and I did not feel any difference in actual gaming or desktop experience. The display was still running at 60 fps, and perhaps my reaction times are not good enough to tell the difference between ~65 fps and ~95 fps on a 60Hz display.


This!! Lol

195 fps on a 1080p 60Hz monitor.

If you're one of the few who can afford 3x 4K 32-inch monitors, then yeah, newest CPU and quad GPUs, though the scaling sucks


----------



## guitarmageddon88

Quote:


> Originally Posted by *Carniflex*
> 
> Meanwhile in reality... it really depends on what one is doing. If the main purpose of the rig is running benchmarks, then yes, definitely, although to be accurate, in that case the logical upgrade path would be the LGA2011 platform.
> 
> For gaming... to be frank, I do not think so. If one is running a 5-year-old platform, the upgrade money would be better spent on a stronger graphics card than on a CPU upgrade. Only when everything else has been modernized, and there is still a pile of money burning a hole in one's pocket, would it be sensible to upgrade a 2500K in a gaming rig. In my experience, while the difference can show up in benchmarks, in actual gameplay one cannot tell the difference as long as the CPU is "strong enough", and the i5 2500K still is. I went a little while ago from an AMD 1055T @ 3.8GHz to an i7-3820 @ 4.3GHz, which is a somewhat larger jump in benchmark scores than going from a 2500K to a 6700K, and I did not feel any difference in actual gaming or desktop experience. The display was still running at 60 fps, and perhaps my reaction times are not good enough to tell the difference between ~65 fps and ~95 fps on a 60Hz display.


I agree with this. Although it's been burning a hole in my pocket, my sig rig has been just fine and hasn't truly needed an upgrade. I have kind of been wanting to go to one single modern card vs SLI though. I went SLI with 580s when you kind of needed two cards to get the absolute best, but now I feel these new cards do great with just one. I'm assuming an i7 2600k would be practically the same story as that article is about?


----------



## Mampus

Still rocking a 2500K with a 1440 x 900 monitor (nope, I won't go higher res unless there's a legit reason for my usage to do so). Guess I won't be upgrading for a loooong arse time










Not OC'd yet; maybe I'll buy a Cryorig H7 (the so-called 212 killer). 2133MHz RAM is so cheap indeed, and in the near future I should have either a 950 or 960


----------



## Chargeit

Quote:


> Originally Posted by *Mampus*
> 
> Still rocking a 2500K with a 1440 x 900 monitor (nope, I won't go higher res unless there's a legit reason for my usage to do so). Guess I won't be upgrading for a loooong arse time
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not OC'd yet; maybe I'll buy a Cryorig H7 (the so-called 212 killer). 2133MHz RAM is so cheap indeed, and in the near future I should have either a 950 or 960


Desktop space alone is enough reason to upgrade to a higher-res monitor. 27" 1440p monitors are a pretty good sweet spot between desktop space, clarity, and general usage. Ain't no way nobody's saying they can't use the extra desktop space 1440p provides. 4K is another beast and gets to the point that you're dealing with a little too much desktop space, without even going above 4K.

Gaming-wise you can pull 1440p on a 950 or 960 with lots of compromises and lower fps, though I'd look into an AMD option in that case. Something with 4GB of VRAM.

Too bad you don't live around me. I've got 3 1080p monitors in my closet taking up a lot of space. I could live with having one less.


----------



## Cyclonic

Quote:


> Originally Posted by *Mampus*
> 
> Still rocking a 2500K with a 1440 x 900 monitor (nope, I won't go higher res unless there's a legit reason for my usage to do so). Guess I won't be upgrading for a loooong arse time
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not OC'd yet; maybe I'll buy a Cryorig H7 (the so-called 212 killer). 2133MHz RAM is so cheap indeed, and in the near future I should have either a 950 or 960


Try a 34-inch 3440x1440 21:9 monitor and you will wonder why you ever used that old crap rez


----------



## dragneel

Quote:


> Originally Posted by *Chargeit*
> 
> Too bad you don't live around me. I've got 3 1080p monitors in my closet taking up a lot of space. I could live with having one less.


I'll pay for the shipping to send one my way, I'd like to have a 2nd screen again









I'm only kidding, shipping it from the US would cost me as much as a new one :')


----------



## Chargeit

Quote:


> Originally Posted by *dragneel*
> 
> I'll pay for the shipping to send one my way, I'd like to have a 2nd screen again
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm only kidding, shipping it from the US would cost me as much as a new one :')


The monitors in my closet are becoming a heavy burden. lol.


----------



## Mampus

Quote:


> Originally Posted by *Chargeit*
> 
> Desktop space alone is enough reason to upgrade to a higher-res monitor. 27" 1440p monitors are a pretty good sweet spot between desktop space, clarity, and general usage. Ain't no way nobody's saying they can't use the extra desktop space 1440p provides. 4K is another beast and gets to the point that you're dealing with a little too much desktop space, without even going above 4K.
> 
> Gaming-wise you can pull 1440p on a 950 or 960 with lots of compromises and lower fps, though I'd look into an AMD option in that case. Something with 4GB of VRAM.
> 
> Too bad you don't live around me. I've got 3 1080p monitors in my closet taking up a lot of space. I could live with having one less.


Regarding desktop space, I use a 60cm x 120cm desk, enough for a 27-inch screen, but there's not enough 'urge' to switch









1440p with a 950/960 is quite taxing though. I love framerates and visuals, so to get those, either I must have a strong system or use a lower res. I chose the latter because it's cheaper. Sure, you can use a 4K monitor and then lower it to, say, 1080p when gaming, but I like native res. I've been using a 1080p, 21.5-inch display (it belongs to my beloved uncle), and when I set it to 768p it looks awful lol. Dunno about 4K down to 1080p though









Quote:


> Originally Posted by *Cyclonic*
> 
> Try a 34-inch 3440x1440 21:9 monitor and you will wonder why you ever used that old crap rez


Man, that monitor costs about the same as my whole system lol. Sure, 3440 x 1440 even at 34 inches will look bloody sharp as hell, but I sit about 140cm (~4.5 feet) away from my monitor, and even further when I play fighting games with my arcade joystick, so I pretty much don't get the 'blockiness' associated with a low-res monitor


----------



## Chargeit

When I say desktop space I mean space for windows. A higher-res screen offers a lot more room in windows.

Check out this screenshot. To the left is my 4K screen; to the right is my 1080p screen. See how much more space the high res offers. 1440p sits in between and is a good compromise between the two. I'd even argue 1440p offers more usable space for what you get.



Yea, 1440p is a heavy res to try and push with a GTX 960 if a steady 60 fps is your goal at high settings. Still, it's a really great desktop experience compared to lower res.

I have some benchmarks of my backup rig pushing 1440p in my sig with a GTX 950. Even that can handle 1440p, though at low fps and compromised settings. No. I do not suggest a GTX950 for 1440p. It can still push it at Console fps or better though.


----------



## Partol

Are there any good game benchmarks which would show a significant difference by changing the ram clock/timings?

I just tested the Mafia 2 in-game benchmark: Once with my normal ram settings and once again with ram timings doubled (but same ram frequency).
The difference was less than 1 fps.

People say that fallout 4 is very ram dependent but is there a benchmark for it?

Instead of a benchmark, I suppose I could create a save game at a starting point, and then just move forward and record it. And retest from the same starting point.
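That manual approach works well as long as each run is summarized the same way. A minimal sketch for that step (the list-of-milliseconds input and this particular "1% low" definition, the mean of the slowest 1% of frames, are my assumptions, since frametime tools export in various shapes):

```python
# Sketch: summarize a frametime capture into average fps and "1% low" fps.

def fps_stats(frametimes_ms):
    """Return (average fps, 1% low fps) from per-frame durations in ms."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # Slowest 1% of frames (at least one frame), longest durations first.
    slowest = sorted(frametimes_ms, reverse=True)[: max(1, n // 100)]
    low_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_fps

# Example: mostly 16.7 ms frames with two 40 ms hitches. The average looks
# fine, but the 1% low exposes the stutter.
avg, low = fps_stats([16.7] * 98 + [40.0, 40.0])
print(f"avg {avg:.1f} fps, 1% low {low:.1f} fps")  # avg 58.3 fps, 1% low 25.0 fps
```

The same two numbers computed for each RAM setting make the runs directly comparable, which a raw fps counter on screen doesn't.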


----------



## Chargeit

Quote:


> Originally Posted by *Partol*
> 
> Are there any good game benchmarks which would show a significant difference by changing the ram clock/timings?
> 
> I just tested the Mafia 2 in-game benchmark: Once with my normal ram settings and once again with ram timings doubled (but same ram frequency).
> The difference was less than 1 fps.
> 
> People say that fallout 4 is very ram dependent but is there a benchmark for it?
> 
> Instead of a benchmark, I suppose I could create a save game at a starting point, and then just move forward and record it. And retest from the same starting point.


You're going to have to test that the hard way.

Also remember, if your GPU is the bottleneck then faster ram might not show as much of a performance gain. You want to remove your GPU as the bottleneck as much as possible when testing various ram/CPU speeds.


----------



## 12Cores

I find it hard to believe that a 2500k clocked at 5GHz paired with one card will bottleneck any current GPU in any significant way; this CPU is legendary.


----------



## ZealotKi11er

Quote:


> Originally Posted by *12Cores*
> 
> I find it hard to believe that a 2500k clocked at 5GHz paired with one card will bottleneck any current GPU in any significant way; this CPU is legendary.


The best GPUs when the 2500K came out were the GTX 580 and HD 6970. Current GPUs are 3-4X faster. Yes, there is a difference. Memory alone, moving to a better controller and DDR4, makes a difference at 1080p. What most people do not understand is that the 2500K is still fine for gaming, but there are faster CPUs out there.


----------



## Majin SSJ Eric

Even a Titan X won't be held back much by a 2500k at something like 4k resolution. Sure, at 1080p you will likely find some CPU bottlenecking (depending on what GPU(s) you are running) but really, if you are willing to spend $1k on a single video card you should probably not be running a 5 year old platform to begin with...


----------



## magnek

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Even a Titan X won't be held back much by a 2500k at something like 4k resolution. Sure, at 1080p you will likely find some CPU bottlenecking (depending on what GPU(s) you are running) but really, *if you are willing to spend $1k on a single video card you should probably not be running a 5 year old platform to begin with...*


psssst I think BigMack70 would beg to differ


----------



## Cyro999

Quote:


> I find it hard to believe that a 2500k clocked at 5GHz paired with one card will bottleneck any current GPU in any significant way


You're literally saying that you find it hard to believe that a 2500k at 5GHz is slower than any other CPU for running any games. That's obviously not true.

The perf difference might not be huge, but it's there and it's solid.


----------



## BoredErica

Quote:


> Originally Posted by *Partol*
> 
> Are there any good game benchmarks which would show a significant difference by changing the ram clock/timings?
> 
> I just tested the Mafia 2 in-game benchmark: Once with my normal ram settings and once again with ram timings doubled (but same ram frequency).
> The difference was less than 1 fps.
> 
> People say that fallout 4 is very ram dependent but is there a benchmark for it?
> 
> Instead of a benchmark, I suppose I could create a save game at a starting point, and then just move forward and record it. And retest from the same starting point.


I think Fallout 4 is CPU dependent. With my 6600k and 980 Ti, the GPU is usually at or near 100% usage with the frame rate uncapped. The times when it falls below 60 fps, however, are usually due to GPU usage dropping. And the times when it falls below 45 are basically exclusively the GPU sitting at low usage while the CPU goes crazy. Some of what I have seen suggests that when it's not GPU dependent, it's really often CPU + RAM dependent. I have not tested this myself. I guess I could, though.

So when we look at Fallout 4 with an underpowered GPU with god rays turned up, or we just benchmark a typical area in Fallout 4, we only get part of the picture. The areas in downtown Boston where frame rates take a hit are not the majority of the game, but when the fps does drop it can be a little jarring. And so, this is why I think the CPU and GPU testing with Fallout 4 is incomplete. It doesn't account for the problem areas of the game.

Although, I haven't retested with the new patches.


----------



## BigMack70

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Even a Titan X won't be held back much by a 2500k at something like 4k resolution. Sure, at 1080p you will likely find some CPU bottlenecking (depending on what GPU(s) you are running) but really, if you are willing to spend $1k on a single video card you should probably not be running a 5 year old platform to begin with...


Can't comment on a 2500k, but with a 2600k, the only games where it held back Titan X SLI at 4K resolution were GTA V and, a little bit, Crysis 3. Crysis 3 was still able to get ~60fps, but that bumped up about 15-20% with a 5930K. GTA V didn't see a framerate improvement but did have some stuttering when maxed out on the 2600k that was gone on the 5930k.

No change in my other games. Granted, that's a 2600k and it was at 4.8GHz, but at high resolution you're so much more GPU-bound than CPU-bound that the CPU is really not a very important piece of the puzzle most of the time.


----------



## Partol

Thanks for the suggestions/encouragement. The built-in Mafia 2 in-game benchmark seems worthless for benching RAM. Maybe it pre-loads everything into RAM.

Fortunately, in Mafia 2 - chapter 2, there is a scene in which the car drives itself through the city. Excellent scene for doing comparative benchmarks. The results were more than I expected.

http://www.overclock.net/t/1487162/an-independent-study-does-the-speed-of-ram-directly-affect-fps-during-high-cpu-overhead-scenarios/190#post_24941318


----------



## santi2104

I have been using the Sandy Bridge platform since 2012 and I can't find an excuse to upgrade. For gaming it still kicks ass: 4.4GHz at stock vcore, rock solid for years. Long live Sandy Bridge.


----------



## madpossum

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Even a Titan X won't be held back much by a 2500k at something like 4k resolution. Sure, at 1080p you will likely find some CPU bottlenecking (depending on what GPU(s) you are running) but really, if you are willing to spend $1k on a single video card you should probably not be running a 5 year old platform to begin with...


You're going to see a lot more difference upgrading your GPU than your CPU in almost all cases. If they don't upgrade very often and then find themselves with around $1000 to spend, I could see them putting all of it into the GPU if they are still happy with their CPU, instead of upgrading both and ending up with a slightly better CPU and a 970 or something. Though I personally would have worked the numbers or saved a little more and upgraded the CPU and gotten a 980 Ti (which is what I did).


----------



## Smanci

I just swapped my 3570K for a 3770 for basically no cost. It does help in some newer games and also allows a 41x multiplier + BCLK bump if I ever feel the need to OC.


----------



## Cyb3r

Honestly, the problem with Fallout 4 and Skyrim is that they feature areas where, no matter what hardware you throw at them, you'll still see low fps (the waterfall in the first town in Skyrim, for example). While I did notice a difference between my 680 and my 980 Ti with the exact same settings, and made sure I didn't overrun the VRAM on the 680 (which is rather easy to do with mods, since it only has 2GB of VRAM), the difference boiled down to about 3-10 fps in that area, since it's a known problem with the actual engine


----------



## ASUSfreak

Quote:


> Originally Posted by *Darkwizzie*
> 
> With my 6600k and 980 Ti, the GPU is usually at or near 100% usage with the frame rate uncapped. The times when it falls below 60 fps, however, are usually due to GPU usage dropping. And the times when it falls below 45 are basically exclusively the GPU sitting at low usage while the CPU goes crazy. Some of what I have seen suggests that when it's not GPU dependent, it's really often CPU + RAM dependent. I have not tested this myself. I guess I could, though.


Wow, I posted something on page 5, now it's like 28 or so









I had a Core 2 Quad Q9450 once with SLI GTX 470s.

When I swapped it for my 2600K, the same 470s got HUGE headroom and my fps went UP!

Of course my Q9450 was bottlenecking my two 470s. The cards produced much more data per second than my proc + chipset + RAM could keep up with







so that was a bottleneck.

After upgrading to P67 and the 2600K, the fps doubled or something









But when I bought a 1440p monitor they could not cope anymore, so I bought 780 Tis


----------



## TMatzelle60

I believe the 2500K is to Intel what the 460 was to Nvidia. It's a well-made beast; I think a lot of people will have it for 3-4 more years.


----------



## jetplane48

Maybe it's time for a lil change


----------



## LuckyStarV

Quote:


> Originally Posted by *TMatzelle60*
> 
> I believe the 2500K is to Intel what the 460 was to Nvidia. It's a well-made beast; I think a lot of people will have it for 3-4 more years.


Or the 7950/7970, still using my 7950 (well, its RMA replacement) after all these years


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Cyro999*
> 
> You're literally saying that you find it hard to believe that a 2500k at 5GHz is slower than any other CPU for running any games. That's obviously not true.
> 
> The perf difference might not be huge, but it's there and it's solid.


We really need to have a discussion about what is and is not an actual bottleneck. Too many people are considering any component that is not as fast as another with otherwise identical hardware as a "bottleneck" and I just don't think that is the standard that we should be using for the term. To me (and granted this is just my opinion), a bottleneck is when a single component of your system significantly hinders the standard usage and operation of that system in whatever you are trying to do. It is NOT a term to use when simply comparing the differing performances of pieces of hardware. For example, a 2500k at 5GHz will have a performance deficit in gaming when compared to, say, a 6600k at 4.6GHz when using the same GPU (though this is an imperfect comparison because the entire platform is different so its tough to call any gains entirely "CPU" dependent). That is a given. I do not believe, however, that this should be interpreted as the 2500k "bottlenecking" the system unless the overall performance in gaming is actually inadequate, which I don't think it really ever is at this point. Now, if you try to run that same modern GPU on, say, a C2D for example, that would create what I would call an actual bottleneck of the system in which case the performance of the other components are actively being held back to a detrimental degree by the CPU.

As I noted earlier, by the lazy premise that any CPU performing worse than another with the same GPU installed is a bottleneck, literally EVERY processor in use today would be a "bottleneck" in gaming other than the top-rated Skylake processors. That's just ridiculous IMO...


----------



## BoredErica

Quote:


> Originally Posted by *Cyb3r*
> 
> Honestly, *the problem with Fallout 4 and Skyrim is that they feature areas where no matter what hardware you throw at them you'll still see low fps* (the waterfall in the first town in Skyrim, for example). While I did notice a difference between my 680 and my 980 Ti with the exact same settings, and made sure I didn't overrun the VRAM on the 680 (which is rather easy to do with mods, since it only has 2GB of VRAM), the difference boiled down to about 3-10 fps in that area, since it's a known problem with the actual engine


That's simply not true. One can easily hit 60fps in ANY part of vanilla Skyrim with today's hardware. If you don't even know the name of the 'first town in Skyrim', chances are you are not into Skyrim all that much, and therefore do not understand the situation with Skyrim today. Something is wrong with your Skyrim setup.

Now, for Fallout 4 on max settings, I would probably agree with you. I don't know for sure that your statement is correct for Fallout 4, but I think it probably is. My guess is that in those cases for Fallout 4, decreasing shadow distance will allow people to go back to 60fps.


----------



## Chargeit

Quote:


> Originally Posted by *Darkwizzie*
> 
> That's simply not true. One can easily hit 60fps in ANY part of vanilla Skyrim with today's hardware. If you don't even know the name of the 'first town in Skyrim', chances are you are not into Skyrim all that much, and therefore do not understand the situation with Skyrim today. Something is wrong with your Skyrim setup.
> 
> Now, for Fallout 4 on max settings, I would probably agree with you. I don't know for sure that your statement is correct for Fallout 4, but I think it probably is. My guess is that in those cases for Fallout 4, decreasing shadow distance will allow people to go back to 60fps.


I was thinking the same thing. Once I moved off of my FX system I had no issues holding 60 fps in unmodded Skyrim.

For a steady 60+ fps in Fallout 4 I needed to drop GodRays down one notch.

Once you toss in mods though, there's no telling what could be the cause of poor performance.


----------



## BoredErica

Quote:


> Originally Posted by *Chargeit*
> 
> I was thinking the same thing. Once I moved off of my FX system I had no issues holding 60 fps in unmodded Skyrim.
> 
> For a steady 60+ fps in Fallout 4 I needed to drop GodRays down one notch.
> 
> Once you toss in mods though, there's no telling what could be the cause of poor performance.


In some areas of downtown Boston, I think the combination of Godrays + highest shadow distance is too much. I have godrays to low, shadows set to max in the launcher. I have not touched shadows in the ini. I'm getting 33fps from one of my save files. (I made the save file because it was the lowest FPS I got in the entire playthrough of the game for me, looking at just the right angle to get the lowest fps). I believe this is a CPU bottleneck due to shadows. But then again, that's a specific case in Fallout 4, and not the rule.

Skyrim modding takes a lot of trial and error to learn which mods are bad, how to set up Skyrim, etc. Today we have Skylake, where the average OC is around 4.7GHz. I just tested the area I think the guy was talking about. On vanilla Skyrim with all DLCs, including the high-res texture pack, I got a minimum of 77 fps by the waterfall, standing on the steps leading to Dragonsreach, looking over the entire hold. That's a very specific case too; elsewhere I get 120 fps and up. 8x MSAA, 16x AF, 1440p. By alerting the entire city at once with Storm Call, I managed to get a minimum of 74 fps. That's with the entire town trying to get me or running away, on the most intensive part of Whiterun, at the right angle, looking down at the entire city just so. It's lower than I anticipated, but still well above 60 and definitely 30+ fps.


----------



## Partol

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I do not believe, however, that this should be interpreted as the 2500k "bottlenecking" the system unless the overall performance in gaming is actually inadequate, which I don't think it really ever is at this point. Now, if you try to run that same modern GPU on, say, a C2D for example, that would create what I would call an actual bottleneck of the system in which case the performance of the other components are actively being held back to a detrimental degree by the CPU....


The problem with your definition of the word "bottleneck" is that it is very subjective.
Your words: "the overall performance in gaming is actually inadequate" and "detrimental degree".

For some people, 15-20fps in games may be adequate. Other people expect a minimum of 30fps or 60fps or 120fps or even more.
At least we can agree that we disagree on the meaning of "bottleneck".
I would not be opposed to simply banning the word "bottleneck" from this forum.


----------



## dVeLoPe

I have an old i5-760 quad core @ 4GHz, running since Dec of 2010 and still kicking. Although I've always had the latest GPU, I think it's the high overclock that lets this chip still hold its own.

Although I do have a Rampage V Extreme and an i7-5820K still NIB, waiting on a few more parts to finalize my 2016 build, and this PC will get put in the closet as a backup lol


----------



## magnek

Quote:


> Originally Posted by *Partol*
> 
> The problem with your definition of the word "bottleneck" is that it is very subjective.
> your words: "the overall performance in gaming is actually inadequate" and "detrimental degree"
> 
> For some people, 15-20fps in games may be adequate. Other people expect minimum 30fps or 60fps or 120fps or even more.
> At least, we can agree that we disagree on the meaning of "bottleneck".
> I would not be opposed to simply ban the word "bottleneck" from this forum.


The problem with strict adherence to the definition of "bottleneck" is, anytime you have less than absolute top of the line hardware in your system, you're technically being "bottlenecked", which is completely absurd.

_Don't have a Titan X?_ GPU bottleneck!
_Don't have a 5960X overclocked to the moon?_ CPU bottleneck!
_Don't have DDR4 running at 4266MHz?_ Ram bottleneck!
_Don't have NVME SSD?_ Storage bottleneck!
_Don't have Gigabit internet?_ Network bottleneck!

Bottlenecks galore!


----------



## BoredErica

I can't let this thread pass by without posting this:


----------



## grss1982

Quote:


> Originally Posted by *45nm*
> 
> You could always re-use it. It's not worth paying more especially when a Skylake system will require DDR4 and offer minimal gains at best. *I used to do upgrades more often years ago but now with the minimal gains and focus on power consumption and features even Socket 775 equipment and hardware can still be viable in today's environment.* I also have a 2700K and a Z68 motherboard that I am holding on to and it was a great system that I might end up reusing as well. Also another problem is finding the 6700K in stock here and at affordable prices due to the USD. It's simply not worth paying for DDR4 and features which I won't use when I can always reuse my 2700K build.


I find myself in such a situation also. Currently rocking a non OC-ed i5 3470 and from my perspective it's "good enough" for the games that I play and some other computer stuff that I do. I also have a 775-based system (C2D E8500 IIRC) lying around in pieces that I plan to bring back to life because my parents need it for FB. I don't foresee them ever needing something beefier unless of course dad suddenly wants to play Crysis or something.


----------



## BoredErica

Quote:


> Originally Posted by *grss1982*
> 
> I find myself in such a situation also. Currently rocking a non OC-ed i5 3470 and from my perspective it's "good enough" for the games that I play and some other computer stuff that I do. I also have a 775-based system (C2D E8500 IIRC) lying around in pieces that I plan to bring back to life because my parents need it for FB. I don't foresee them ever needing something beefier unless of course dad suddenly wants to play Crysis or something.


I'm the opposite. At the current rate chips are increasing in performance, I estimated it would take another 9-10 years for me to get 60fps in Oblivion. CCleaner takes like 10 seconds to fully initialize with CCEnhancer. Unpacking and packing mods for Skyrim can take a bit. I've taken care to try to minimize CPU thrashing on Skyrim, but faster chips would always be appreciated. MUGEN level loading is CPU bottlenecked. Yeah, CPU bottlenecked. The lowest fps troughs in FO4 are CPU limited. And just other random odd things I happen to be doing... Can't exactly SLI CPUs, so...

From the perspective of chess, more cores is better, but that's not my top priority right now.


----------



## Partol

Quote:


> Originally Posted by *magnek*
> 
> The problem with strict adherence to the definition of "bottleneck" is, anytime you have less than absolute top of the line hardware in your system, you're technically being "bottlenecked", which is completely absurd.


If you believe the definition of bottleneck is absurd, then don't use that definition, or simply use a different word. I see nothing absurd about the definition of bottleneck. Normally, it's not possible to have the very best hardware, because hardware companies are sometimes/often in possession of newer, unreleased, higher performance hardware.

http://www.thefreedictionary.com/bottleneck
A point or an area of traffic congestion.
To slow down or impede by creating an obstruction.
something that holds up progress, esp of a manufacturing process
a narrow stretch of road or a junction at which traffic is or may be held up
a stage at which progress is impeded.


----------



## randomizer

Quote:


> Originally Posted by *magnek*
> 
> The problem with strict adherence to the definition of "bottleneck" is, anytime you have less than absolute top of the line hardware in your system, you're technically being "bottlenecked", which is completely absurd.


You arguably still have a bottleneck even if you have the absolute top of the line hardware. You simply have no way to remove it. I don't believe that applying the definition this strictly is in the spirit of the metaphor though. Just by looking at the neck of a bottle you can see that the intended meaning is a material impact on throughput. Even the definitions quoted above lean away from merely "the slowest part of the system."


----------



## Chargeit

Something is always the limiting factor. Could be your CPU/Ram/storage/connection/GPU/Game engine. Whatever. Something will always stop some other part from performing to its highest possible level. Key is to remove that as much as possible on your part by matching comparable hardware with hopefully competent software.

Bottlenecking is also relevant to the res/refresh/settings being used. Once again, something is always holding something else back.

Personally, I only start thinking of it as bottlenecking when I feel like my experience suffers because of it.


----------



## Cyro999

Quote:


> As I noted earlier, using the lazy premise that any CPU that performs worse than another with the same GPU installed would mean that literally EVERY processor in use today is a "bottleneck" in gaming other than the top rated Skylake processors. That's just ridiculous IMO...


It's not a lazy premise. You'll always have a bottleneck somewhere. It might be very small (10% performance gain if eliminated) or very large (500%+ performance gain if eliminated) but even within the same game and engine, demands change depending on what you're doing.

If you take Starcraft 2 for example, which is heavily CPU bottlenecked, you can run a GTX 960 and a 980 side by side and they'll have the same framerate, even though the 980 is literally twice as fast.

You can take Sandy Bridge vs Skylake and get 40% more FPS on the Skylake system, because it's CPU bottlenecked at that time.

--

If you look at another game that's CPU-light, especially run at a high resolution, you can see no FPS difference between Sandy Bridge and Skylake in a certain scene. You can then swap from a 960 to a 980 and see FPS double.

If everything else in the system can run 100fps but the GPU can only handle 40, then doubling GPU perf can allow the game to run at 80fps because the GPU is slowing performance to 40% of what the system can achieve.

It's a bottleneck because one part of the system in particular is holding back performance at that point. You could improve everything else in the system aside from that part and see either no performance change or a highly limited performance change; and once again, it can be a small or a very large bottleneck depending on the parts and exactly what they're doing at the time.

Quote:


> *You arguably still have a bottleneck even if you have the absolute top of the line hardware. You simply have no way to remove it.*


Yes, this is 100% true. There are a lot of CPU-heavy games out there that are not challenging to the latest GPUs. To use Starcraft 2 again as an example, that game was CPU bottlenecked on a GTX 460 whenever a lot of units came out. In a big battle, you'll see a 980 Ti at 10% graphics load. If we had a CPU 3x faster than a 6600k, the game would run 3x faster - there's literally nothing preventing that from happening aside from CPU performance. Meanwhile, swapping from a 950 to a 980 Ti would see no performance change.


----------



## magnek

Quote:


> Originally Posted by *Partol*
> 
> If you believe the definition of bottleneck is absurd, then don't use that definition, or simply use a different word. I see nothing absurd about the definition of bottleneck. Normally, it's not possible to have the very best hardware, because hardware companies are sometimes/often in possession of newer, unreleased, higher performance hardware.
> 
> http://www.thefreedictionary.com/bottleneck
> A point or an area of traffic congestion.
> To slow down or impede by creating an obstruction.
> something that holds up progress, esp of a manufacturing process
> a narrow stretch of road or a junction at which traffic is or may be held up
> a stage at which progress is impeded.


Quote:


> Originally Posted by *randomizer*
> 
> You arguably still have a bottleneck even if you have the absolute top of the line hardware. You simply have no way to remove it. *I don't believe that applying the definition this strictly is in the spirit of the metaphor though.* Just by looking at the neck of a bottle you can see that the intended meaning is a material impact on throughput. Even the definitions quoted above lean away from merely "the slowest part of the system."


And that's precisely the problem. "The spirit of the metaphor" is ambiguous and poorly defined, as is "material impact". As pointed out previously, at what point does diminished performance constitute a "CPU bottleneck"? Say switching from a 2500K to a 6600K, both at the same clocks, gave a 10% boost in performance - would that be a bottleneck? How about 20%? 30%? Then you have to consider that each person has different tolerances and perceptive abilities, and it becomes even less fruitful to throw around the term "bottleneck" blindly.
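To make the arbitrariness concrete, here's a trivial sketch (the fps numbers and cutoffs are hypothetical, not measurements) showing that whether a given gain "counts" as a bottleneck depends entirely on where you draw the line:

```python
# Relative FPS gain from a CPU swap at fixed GPU/settings. Whether that gain
# qualifies as a "bottleneck" is purely a function of the cutoff you pick.

def uplift(fps_old: float, fps_new: float) -> float:
    """Percent FPS gained by the upgrade."""
    return (fps_new - fps_old) / fps_old * 100

def is_bottleneck(fps_old: float, fps_new: float, cutoff_pct: float) -> bool:
    """Label the old CPU a 'bottleneck' if the gain clears the cutoff."""
    return uplift(fps_old, fps_new) >= cutoff_pct

# Hypothetical 2500K -> 6600K result: 72 -> 86 fps (~19% gain)
print(round(uplift(72, 86), 1))              # 19.4
print(is_bottleneck(72, 86, cutoff_pct=10))  # True at a 10% cutoff
print(is_bottleneck(72, 86, cutoff_pct=30))  # False at a 30% cutoff
```

The same measured result flips between "bottlenecked" and "fine" just by moving the cutoff, which is exactly the subjectivity being argued about here.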


----------



## Cyro999

If you have two systems that are otherwise identical, and a 980 is twice as fast as a 960 (this is approximately true; it's nearly a straight double):

System 1 has a gtx960, runs at 50fps. 62% CPU load - 100% GPU load

System 2 has a gtx980, runs at 80fps. 100% CPU load - 80% GPU load

System 1 is bottlenecked by the 960. It could achieve 80fps but can only reach 50 because of the 960.

System 2 is bottlenecked by the CPU. It could achieve 100fps but can only reach 80 because of the CPU.

It shifts around a LOT depending on the game, settings, scene, any particular demands, etc., but that's the general idea. *You usually have a situation where one part is partially idle and another is maxed out and defining your performance*, which is why it's important to look at which part that is, where, and why - if you were to upgrade the part that was idle and waiting, then your FPS wouldn't go up at that particular time.

The part at full load is usually the graphics card, unless you're looking at games older than a few years (graphics cards advance far faster than CPUs) or games in the RTS/MMO/physics genres.
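The idea above can be sketched as a toy model: each part caps the frame rate it could sustain on its own, and the system runs at the minimum of those caps. The fps caps below are hypothetical, chosen to roughly reproduce the numbers in this post (960 = half a 980):

```python
# Toy bottleneck model: system fps = min of per-component fps caps, and each
# component's load = achieved fps / that component's cap (the limiting part
# sits at 100%).

def system_fps(cpu_cap: float, gpu_cap: float):
    """Return (fps, cpu_load, gpu_load) for a CPU/GPU pair.

    cpu_cap / gpu_cap: max fps each part could sustain on its own.
    """
    fps = min(cpu_cap, gpu_cap)
    return fps, fps / cpu_cap, fps / gpu_cap

# System 1: CPU could do 80 fps, GTX 960 only 50 -> GPU-bound
fps1, cpu_load1, gpu_load1 = system_fps(cpu_cap=80, gpu_cap=50)

# System 2: same CPU, GTX 980 (twice the 960) -> now CPU-bound
fps2, cpu_load2, gpu_load2 = system_fps(cpu_cap=80, gpu_cap=100)

print(fps1, round(cpu_load1, 2), gpu_load1)  # 50 fps, ~62% CPU, 100% GPU
print(fps2, cpu_load2, round(gpu_load2, 2))  # 80 fps, 100% CPU, 80% GPU
```

Upgrading the part that isn't at 100% moves neither number, which is the whole point of identifying the bottleneck before spending money.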


----------



## Cyb3r

@Cyro for SC2 that's not entirely true - you do get higher performance with the 980 vs the 960 (granted that you set both to high performance in the Nvidia control panel), but it's still a poorly coded engine that suffers more from the CPU in use than the GPU.

Most games that are actually CPU limited have horrible coding behind them - SC2 / FO3/NV/4 / Skyrim / Oblivion, to name a few.

Hell, even with my 5960X, Oblivion has CPU issues no matter the OC I throw at it, because the engine is a huge turd.

@darkwizzie I simply haven't played Skyrim in about 3 months, ever since I rebuilt my system (I had to wait a full month on getting RMAs done), and I'm simply bad with names - it has nothing to do with me playing or not playing Skyrim a lot (hint: I played quite a bit of Skyrim). What I meant is that you get performance dips in that area due to bad coding; this can easily be checked with Skyrim performance monitor when you scale past 4k.


----------



## Cyro999

Quote:


> you do get higher performance with the 980 vs the 960


Maybe if you set max graphics and get 500fps on one vs 300 on the other at the start of the game. When performance actually matters, it's the same on both.

You can claim that every game that maxes out a thread on a decent CPU is poorly coded, a terrible engine, etc., but that doesn't change anything, even if it were entirely true (it's not).


----------



## Cyb3r

@Cyro I meant FPS in a max vs max battle - you do notice the difference when there's a lot of spellcasting going on between 2 armies (skyzerg vs mech, for example). It's been better since LotV, but still not what it should be.


----------



## Cyro999

GPU load in those situations is very minimal


----------



## kifinas

Quote:


> Originally Posted by *Ironcobra*
> 
> Great video, that being said i dont plan to upgrade until the end of this year, however i have been thinking of upgrading my ram to hold me over. This video definitely backs this up. Im just shocked at how bad the dx11 performance. Im glad ive been holding out on a 4k upgrade until i do my computer again. It feels good tho that ive gotten every penny out of my 2500k system. Definitely best cpu for money of all time.
> 
> Can someone please recommend a good 16gb kit for pairing with the 2500k, I havnet been keeping up with hardware as much since buying this system.


I ordered this for my i5 2500K:

SKU: F3-2133C9D-8GXL

2133 with the lowest latency I could find. It has not arrived yet. I ordered just one 8GB kit - that's enough for me. I will do a platform upgrade before 8GB becomes an issue.


----------



## BoredErica

Quote:


> Originally Posted by *Cyb3r*
> 
> @ cyro for sc2 that's not entirely true you do get higher performance with the 980 vs the 960 granted that you set both to high perf in nvidia controll panel but it's still a ****ty coded engine that suffers more from the cpu in use than the gpu
> 
> most games that are actually cpu limited have horrible coding behind them SC2 / FO3/NV/4 / skyrim / oblivion to name a few
> 
> hell even with my 5960x oblivion has cpu issues no matter the oc i throw at it cause the engine is a huge turd
> 
> @ darkwizzie i simply haven't played skyrim in about 3months ever since i rebuilded my system since i had to wait a full month on getting rma's done and i'm simply bad with names has nothing todo with me playing or not playing skyrim alot (hint i played quitte a bit off skyrim) and what i meant is that you get performance dips in that area due to bad coding this can easily be checked with skyrim perf monitor when you scale past 4k


I just ran Skyrim @ 5120 x 2880 with x4 msaa and x16 as and got 63 fps minimum (50% CPU usage, 90% GPU usage). If you want to scale past even 5k, you should've noted this in your original post.

You're not being reasonable. When Skyrim came out, 1440p was high end stuff. Right now, even with the terrible single-threaded performance improvement of Intel chips past Sandy, it's possible to run double that resolution with x4 msaa and have minimums over 60.


----------



## Majin SSJ Eric

Circling back to the topic of this thread, suffice it to say that the vast majority of AAA games will not be hindered by a 2500k to this day and I think that was the original point.


----------



## Cyro999

Quote:


> Circling back to the topic of this thread, suffice it to say that the vast majority of AAA games will not be hindered by a 2500k to this day and I think that was the original point.


It really depends on the game - many of them (even in the OP's video you see some pretty GPU-heavy recent games that are CPU limited under those testing conditions) will scale significantly.

There are more than enough partially-to-heavily CPU-limited games to justify either a 2500k-to-6600k upgrade for a lot of people, or a fresh build with money put into the CPU (up to a 6600k) rather than a huge focus on the GPU.

Especially for those who play MMO/RTS games and/or those targeting a framerate well above 60fps. Those games are typically heavier on a modern CPU than a modern GPU, and that gets worse for every month of lifespan after development starts, since GPUs are advancing far faster than CPUs at the moment.


----------



## EinZerstorer

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> It's been so long, but I'm fairly sure it was 1.3v something with a 4.5 OC, I didn't change any other voltages either, it was my first and last time overclocking. I ran Prime95 for about 2 hours and it was fine, and I never had any issues other than maybe every few months I'd get a BSOD that implied my vcore wasn't high enough and a simple increase/nudge solved it.
> 
> I always assumed I could have gone with a much lower vcore if I had increased other things.
> 
> It's been 100% stable at stock, I didn't break it or anything, I just degraded it.
> 
> Edit - I posted in a thread for my mobo years ago, I may have some old info there.


LOL.

You didn't degrade anything, you just never had enough vcore to actually support your oc.

Ran multiple 2500k's at 1.4 / 1.418v for YEARS with no degradation at 4.8-5 GHz.


----------



## Partol

Here's a cool (new) Sandy Bridge i7-2600k build video. Yeah, I am excited about Doom (2016) too.

https://www.youtube.com/watch?v=dWDnN93f9I4

This is why single-threaded performance is important in gaming.

The campaign map in Total War: Medieval II, Empire, Shogun 2, etc. is single threaded - tested and confirmed by me.
Morrowind is single threaded, I believe. Pretty much every game made before 2000 is single threaded because, back then, there were no dual-core, tri-core, quad-core, etc. CPUs.
Oblivion, Skyrim, Fallout 3 and New Vegas and many other games are optimized for dual core.

You may ask yourself, why should I care about old games? because some of the best mods and fan missions ever made are for old games.

Did you enjoy Morrowind? There are many mods for morrowind.
http://www.moddb.com/search?q=morrowind

Did you enjoy Oblivion? If you enjoyed Oblivion, then try Nehrim. It's so much better than Oblivion. In Nehrim, the focus is on the main quest and it's a long, satisfying main quest.
Trust me, Nehrim is a much much better RPG than Oblivion.
http://www.moddb.com/mods/nehrim-at-fates-edge/images

Did you enjoy Skyrim? Then, try Enderal (coming 2016): sequel to Nehrim. If Enderal is only half-as-good as Nehrim, then it will still be as good as or better than Skyrim, in my opinion.
http://www.moddb.com/mods/enderal

You may ask yourself, why should I care about those old Total War games if I only play the latest games?
This is why you should care.
http://www.moddb.com/mods/third-age-total-war

The Third Age (Lord of the Rings) mod is so popular that people started to make mods for the original mod.
http://www.moddb.com/mods/divide-and-conquer

Do you play mmo games? That's a whole 'nother discussion.


----------



## pengs

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Circling back to the topic of this thread, suffice it to say that the vast majority of AAA games will not be hindered by a 2500k to this day and I think that was the original point.


Yeah, and if they are, it's most likely a draw call or API limitation.

If we look at the enhanced multi-threading and draw call performance that DX12 and Vulkan offer, it could extend the life of Sandy and Ivy (even Nehalem and Bloomfield, for that matter) even further, which is really amazing. Combine that with asynchronous shading to remove more processing from the CPU side, and supplement the 2500K with some binned memory. If all of this is taken advantage of, the 2500K will live on for years to come - no doubts - with the 3770K being a threaded fail-safe of sorts.

It's not so much a testament to the socket as it is to the pivotal moves the industry is making at the moment. A lot of the longevity is due to the movement towards efficiency over tackling limitations like a linebacker, and to the fact that DirectX has reached its limit as it currently functions/is structured.
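A rough sketch of why lower draw-call overhead helps a CPU-limited frame (all numbers below are illustrative assumptions, not measurements): frame time is roughly the slower of the CPU's submission work and the GPU's render work, so cutting per-call driver cost can flip a frame from CPU-bound to GPU-bound:

```python
# Toy model of API/driver draw-call overhead on an older CPU. Frame time is
# approximated as max(CPU submission time, GPU render time).

def frame_ms(draw_calls: int, cost_per_call_us: float,
             other_cpu_ms: float, gpu_ms: float) -> float:
    """Frame time in ms: the slower of CPU-side work and GPU-side work."""
    cpu_ms = other_cpu_ms + draw_calls * cost_per_call_us / 1000
    return max(cpu_ms, gpu_ms)

# Hypothetical high-overhead (DX11-style) path: ~25 us of CPU time per call
dx11 = frame_ms(draw_calls=1500, cost_per_call_us=25, other_cpu_ms=6, gpu_ms=14)
# Hypothetical low-overhead (DX12/Vulkan-style) path: ~5 us per call, same GPU work
dx12 = frame_ms(draw_calls=1500, cost_per_call_us=5, other_cpu_ms=6, gpu_ms=14)

print(dx11)  # 43.5 -> CPU-bound, ~23 fps
print(dx12)  # 14.0 -> GPU-bound, ~71 fps
```

With the same GPU work per frame, the old CPU goes from being the limit to sitting below the GPU's render time, which is the mechanism being described above.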


----------



## Cyb3r

@darkwizzie I gave it merely as an example - if you take most of the rigs from that time, they had issues at Riverwood (I think that's what the town is called). Sure, these days, unless you throw 4k textures at it, Skyrim will run past 60fps.

And Fallout 4 is running on a modded Skyrim engine, which is why it's having similar performance issues.

@Partol even then, some older engines from the 2000s that had dual-core support patched in at some point in the engine's lifecycle don't all run too well. Oblivion is a pretty good example: even though it should run on dual cores, the dual-core optimization is selective. The same goes for Skyrim and quad cores, though that engine is more advanced and, via forcing, can run more stuff on quad cores without breaking the engine too much.

@pengs DX12 is also finally a proper rewrite of a lot of functions - even to this day, DX9 draw calls run smoother in DX11 than they did in DX9, for many reasons including semi-bad optimization. The only problem right now is that Nvidia is kind of forcing async compute through their drivers instead of in hardware, but we'll have to wait on Pascal to see if they go for full hardware async - I think they probably will.

And while upgrading from a 2500k to any newer architecture will have some benefits in fps in games, it won't be as big as upgrading from a previous GPU architecture, and DX12 will widen that gap even further.


----------



## cdoublejj

I'm still running a Q9550 with 12gb of ram.


----------



## chronicfx

Quote:


> Originally Posted by *EinZerstorer*
> 
> LOL.
> 
> You didn't degrade anything, you just never had enough vcore to actually support your oc.
> 
> Ran multiple 2500k's at 1.4 / 1.418v for YEARS with no degredation at 4.8-5 ghz.


I agree with this. My personal preference is to do all of my stress testing and find my max overclock with whatever "stress test" is the flavor of the chip, and then, once I've completed a robust stress run, I still go into the BIOS, click the vcore up two extra notches, and leave it. I have never had a problem: my Q9550 is still at 4.0GHz, my 3570K at 5GHz, my 4790K at 4.8GHz, and my current 6700K at 4.9GHz. Again, I've never personally had a problem with feeding a little extra vcore to be sure.


----------



## BoredErica

Quote:


> Originally Posted by *Cyb3r*
> 
> @ darkwizzie i gave it merely as an example if you take most off the rigs from that time they had issues @ riven i think is what the town is called sure these days unless you throw 4k textures at it skyrim will run past 60fps
> 
> and fallout 4 is running on a modded skyrim engine which is why it's having similar issues with performance


You said no matter what hardware. As in, there doesn't exist hardware that can do what you claimed. You went from '*they feature areas where no matter what hardware you throw at them you'll still see low fps' *to 'most of the rigs from 2011 couldn't run Skyrim downscaled from 4k or higher'. You keep changing what you're saying. Doesn't matter if what you said was 'just an example'. It was wrong, which I think you'll never admit at this point.

Why would you expect a rig from 2011 to run Skyrim downscaled from 4k??? Why are we talking about what most hardware did at 2011? You're going to be GPU limited if you downscale like that in 2011, which is not relevant to this thread.

Today, you'd probably have to run Skyrim on Windows 7 with ENBoost to avoid problems with 4k textures everywhere. Problem is stutter, and it would not be GPU bound.


----------



## Cyb3r

I was merely referring to the fact that, no matter what hardware, you'll see lower fps dips at certain points due to coding issues with the engine to this day. It isn't as visible on current-day hardware with Skyrim, since it's easy to just brute-force the engine, but then there are other issues, like the physics of the whole game being tied to fps - going over the 60fps threshold with the Skyrim engine will break a lot of animations and physics in one way or another.


----------



## BoredErica

I don't want to lose my temper, so I will leave you with this:

-There is no way to dip below 60fps in Vanilla Skyrim short of going above 5k resolution with today's best hardware. When FPS dips it is indeed generally CPU bottlenecked.
- In fact, even @ 10k, the lowest drops in FPS are CPU bound, which is kind of amazing.
-There is no waterfall in Riverwood. Quite a ways south of Riverwood there is a minor waterfall, closer to the standing stones than to Riverwood. That place is nowhere near as performance-intensive an area as the one I thought you were referencing (Whiterun, on the steps of Dragonsreach, looking down on the entire city). So if I were to test in your location, the minimum FPS @ 5k w/ x4 MSAA would be above 63 fps.
-What was your original point? You said that there doesn't exist hardware today that can run Skyrim without suffering from 'low fps'.
-When shown that on 1440p and x8 MSAA on a very CPU intensive part of the game overlooking the entire city caused FPS far above 60, you said you mean downscaling from 4k.
-When shown that downscaling from 5k on my hardware with x4 MSAA in the same situation led to minimum FPS above 60, you said you actually meant to talk about hardware at the time of Skyrim's release, which could get problems with performance if you downscaled from 4k.
-When shown that you deviated from what you said originally when you started talking about hardware from 2011 instead of 2016, you started to talk about FPS dipping and the 60 fps cap. It is not reasonable to expect 2011 parts to run Skyrim downscaled from 4k or above and maintain above 60 fps at all times. The performance problem would not be from the CPU, since Sandy wasn't that many times slower than Skylake. GTX 580s, on the other hand, were slow, hot, and had very little VRAM. I don't see how 2011 rigs have anything to do with this thread, except for the CPU (because it's in the title and we were here to compare CPUs).
-I agree with you (although this is an obvious truth) that dips in FPS are due to coding. Dips in FPS could be dips above 60 fps, in which case those dips have no practical detriment to the player. (Maybe one could nitpick about something, but there are bigger fish to fry.) Any game today will be limited by its coding 10 years from now, as technology improves. Vanilla Skyrim could have done better with its code, but it's not a complete abomination, considering what many people end up doing to their games compared to vanilla Skyrim.
-The game should generally be capped at 60fps, but that does not demonstrate your original point to be true (or really have anything to do with it at all).
-I agree that brute force can cause poorly coded/inadequately coded programs to perform better. I don't think anybody disagrees with you here, but I think we all knew we agreed with that before you even said that. I don't see how that has anything to do with my original reply to you, though. Then again, the same can be said for the two points above this one and under this one.
-Running 4k textures would probably mean using Windows 7 instead of 10. Performance problems would be stuttering if there were problems. It's technically correct to say stuttering is lower FPS, but I would differentiate the two.
-I agree with you that Fallout 4 is a modded Skyrim engine, and shares some similar performance problems.
-Making an incorrect/unreasonable first post, and following it up with a couple of points that are correct but unrelated to the first point, doesn't make the original post correct. If I give you the most charitable interpretation of what you did, you need to work on your communication. Going from 'any hardware today leads to low fps' to '2011 hardware led to low FPS when downscaling from 4k or above' is a big difference. And if the difference is mostly a GPU difference, remember that we're here to talk about the CPU.
-It's unfair of me to give my final word and just disappear, so you can say your piece too, and then we can agree to disagree.

Finally, on an unrelated note:
-Starting Skyrim, especially with Mod Organizer, leads to a CPU bound situation. Right now I can't really say my loading times when I see loading screens are limited by CPU instead of SSD, but I think it's a possibility when I'm running outside in Skyrim and maps get loaded on the fly. So yeah, everywhere I go, I am being hit by CPU limits.


----------



## Unkzilla

Gave my old 2500k to my son and moved up to a 6600k. That 2500k has been running 1.35v @ 4.7ghz since 2011 and is still going strong, it's a beast

I am only running a modest OC on the 6600k - 4.2ghz @1.21v but it has been a worthy upgrade - if not for the cpu performance then for the 3200mhz DDR4 and my M2 drive, the system is super snappy

The only game I remember hitting a CPU bottleneck with on the 2500k was Crysis 3... I remember all cores being pegged @ 100% and FPS being around 38-39 in some areas of the game. I've gone back now and I seem to get 50 in the same area, but I'm not sure if the game was patched or if the new OS has helped with this, etc.


----------



## ZealotKi11er

The GPU is the only thing that is not a bottleneck in a system. I see the CPU as the enabler. That means if the CPU pushes the GPU to 99%, it's doing its job; otherwise, it's a bottleneck.


----------



## dragneel

I had planned on upgrading to a 6600k and 16GB of 3000MHz RAM shortly, then waiting for Polaris/Pascal, but now I'm not sure. I'm wondering if maybe I'd be better off getting something like a 390X (and a new PSU) for now, then upgrading to a 6700k in a couple of months instead, and perhaps seeing how I feel about Polaris/Pascal later on. It seems to me that for gaming I'm probably going to notice a bigger difference going from a 280 (non-X) to a 390X than from a 2500k to a 6600k. I just don't know.

Essentially what I'm asking is:
GPU now, i7 Skylake in 2-3 months, maybe still Polaris/Pascal later (I'm assuming this would also have the advantage of newer motherboard offerings by the time I get the CPU)
or
i5 Skylake now, maybe GPU in 2-3 months, and still wait for Polaris/Pascal


----------



## rdr09

Quote:


> Originally Posted by *dragneel*
> 
> I had planned on upgrading to a 6600k and 16GB 3000mhz RAM shortly then waiting for Polaris/Pascal but now I'm not sure, I'm wondering if maybe I'd be better off getting like a 390X(and new PSU) for now and and being able to upgrade to a 6700k in a couple months instead and then perhaps seeing how I feel about polaris/pascal later on. Seems to me for gaming, I'm probably gonna notice a bigger difference going from a 280(non-x) to 390X than 2500k > 6600k. I just don't know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Essentially what I'm asking is:
> GPU now, i7 Skylake in 2-3 months, maybe still Polaris/Pascal later (I'm assuming this would also have the advantage of newer motherboard offerings by the time I get the CPU)
> or
> i5 Skylake now, maybe GPU in 2-3 months, and still wait for Polaris/Pascal


Single 390X... your i5 with an OC will be just fine. Now, if you ever go with two, then an i7 3770K is all you need.


----------



## Cyb3r

@Darkwizzie apparently we have a different understanding of certain words and how they're used, which is fine.

And Whiterun overlooking the city is indeed a performance hog for the most part, except the top stairs, which somehow get less FPS due to a minor hardcoded bug in Skyrim.

Sorry, I mainly meant to give Skyrim as an example since it clearly shows poor coding in major areas of the game - things that can't be fully fixed even with the unofficial patches. And yes, while vanilla Skyrim doesn't load as much as, say, a fully modded 4K Skyrim with a fully loaded ENB and a ton of mods, the main engine code does have some inherent flaws. But it's Bethesda, so I didn't hold my breath after seeing Oblivion, where a coding bug gave me issues on two systems that could probably have been fixed by a minor patch (uncertain, since it's a bit hard to pinpoint a problem in such a complex engine).

And I don't see positive dips as a detriment to the player; I do see frame-loading issues as a problem (where certain areas just suddenly halve GPU load, even after trying multiple drivers and taking the same point as a reference).


----------



## EinZerstorer

are you guys really using skyrim as a basis for cpu performance in gaming?

lolwut.


----------



## SuperZan

Quote:


> Originally Posted by *EinZerstorer*
> 
> are you guys really using skyrim as a basis for cpu performance in gaming?
> 
> lolwut.


There's an arrow to the knee / single-thread performance crack in there somewhere.


----------



## BoredErica

Quote:


> Originally Posted by *EinZerstorer*
> 
> are you guys really using skyrim as a basis for cpu performance in gaming?
> 
> lolwut.


lolwhynot?

It's a measurement of CPU performance in gaming. More relevant if you play the game, less so if you don't.


----------



## Boyboyd

Batch L041C124 ftw!

I do desperately need to upgrade my GPU though.


----------



## Cyro999

Yeah - don't use Skyrim, WoW, SC2, Guild Wars 2, Total War, Civilization, AOTS, League/Dota/Heroes, Kerbal Space Program, Far Cry, Planetside or any of the other popular CPU-intensive games when looking at how CPUs hold back the gaming experience. That would be silly.


----------



## Cyb3r

@EinZerstorer it's a good indication of CPU performance even if some of those are badly coded engines, since they force the CPU to do the brunt of the work instead of offloading a lot of things to the GPU.

And let's not forget XCOM 2, which runs like an absolute turd even when you throw 8 cores and 16 threads at it XD


----------



## Dyson Poindexter

It's incredible how bad things have gotten. 5 years prior to the 2500k we had the P4 631. A single core, 3GHz NetBurst CPU. From 2006 to 2011 single-threaded performance increased somewhere in the hundreds of percents, not to mention the quadrupling of CPU cores. In the 5 years past that, we still have ~$250, ~3.5GHz quad core CPUs with ~90W TDPs. Anyone who upgrades from a 4.5 GHz 2500k to a 4.5GHz 6600k is just letting Intel take them for a ride. It's shameful and embarrassing that you can overclock a 5 year old CPU and match any stock desktop CPU that Intel can come up with in 2016.

http://ark.intel.com/compare/27479,52210,88191
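Those scaling claims can be sanity-checked against the clock-for-clock figures quoted earlier in the thread (a 4.6GHz Sandy Bridge roughly matching a stock 3.3GHz i5 6500). A rough sketch, under the simplifying assumption that performance scales as IPC × clock:

```python
# Rough performance-scaling arithmetic (assumes performance ~ IPC * clock).
sandy_clock = 4.6    # GHz - overclocked 2500K that matches the stock 6500
skylake_clock = 3.3  # GHz - i5 6500 all-core turbo

# Skylake's clock-for-clock (IPC) advantage implied by that match:
ipc_gain = sandy_clock / skylake_clock - 1
print(f"Skylake IPC advantage over Sandy Bridge: ~{ipc_gain:.0%}")

# Clock a Sandy Bridge chip would need to match a 4.6GHz Skylake part:
print(f"Sandy clock to match 4.6GHz Skylake: ~{4.6 * (1 + ipc_gain):.1f}GHz")
```

That works out to roughly 39% IPC in five years - real, but a far cry from the 2006-2011 jump the post describes.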


----------



## Cyb3r

@Dyson true, and while some performance has been gained in all aspects thanks to newer instructions in the CPU, Intel has been riding the green train for a while now. The i5 2500K base was also darn good - so good I don't think we'll see such a performance gain from a CPU in a long time to come.


----------



## Dyson Poindexter

While it's true that there have been some impressive gains in power efficiency, particularly on mobile, desktop CPUs are still pretty much the same once loaded. I hate to sound like a whiny entitled person, but what the heck has Intel been doing for desktops for the last 5 years? The server chips have gone from 8 cores to 18 cores - that is an improvement. But the underlying IPC is still stagnant.


----------



## czin125

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> While it's true that there have been some impressive gains in power efficiency, particularly on mobile, desktop CPUs are still pretty much the same once loaded. I hate to sound like a whiny entitled person, but what the heck has Intel been doing for desktops for the last 5 years? The server chips have gone from 8 cores to 18 cores - that is an improvement. But the underlying IPC is still stagnant.


That depends on whether the software was optimized for the newer CPUs.


----------



## _LDC_

I'm still holding my 2600k with love :3 maybe I will upgrade the RAM after seeing the video...


----------



## BoredErica

Quote:


> Originally Posted by *_LDC_*
> 
> I'm still holding my 2600k with love :3 maybe I will upgrade the RAM after seeing the video...


I considered that, but man... Having to sell my old ram is a pain. I dislike selling online, and I dislike the post office.


----------



## Mampus

I'm afraid that someday DDR3 will be more expensive than DDR4. Right now they're about the same in terms of pricing...

Gotta grab dat 2133 MHz really quick lol


----------



## _LDC_

Quote:


> Originally Posted by *Darkwizzie*
> 
> I considered that, but man... Having to sell my old ram is a pain. I dislike selling online, and I dislike the post office.


I feel you, really - I can't be bothered with those procedures. Luckily I have many friends who could be interested, so hopefully I can simply go that route...


----------



## _LDC_

Quote:


> Originally Posted by *Mampus*
> 
> I'm afraid that someday DDR3 is more expensive than DDR4. Right now they're about the same in terms of pricing...
> 
> Gotta grab dat 2133 MHz really quick lol


Now I'm just wondering: how much does timing impact the effective performance of the RAM, when all that seems to matter is frequency?


----------



## Partol

Quote:


> Originally Posted by *_LDC_*
> 
> Now I'm just wondering: how much does timing impact the effective performance of the RAM, when all that seems to matter is frequency?


If you want to know how ram timings affect performance, then simply change the ram timings and see what happens.

With my Pentium G3258, I tested bad RAM timings vs good RAM timings:
DDR3 10-14-10-36 @ 1400MHz vs 5-7-5-18 @ 1400MHz

Is there a difference? Not in all games, but in open-world games with a large map, there is a major difference. Don't use benchmarks. Just play the game normally and record it, such as this:

https://www.youtube.com/watch?v=4cL_2rELqng

https://www.youtube.com/watch?v=-Yphw9wwBII
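For anyone wondering why those timings matter at the same clock: first-word latency in nanoseconds is just the CAS cycle count divided by the memory clock (half the DDR transfer rate). A minimal sketch of that standard arithmetic, using the figures from the post above (not taken from the videos):

```python
# First-word latency: CAS cycles divided by the memory clock, in nanoseconds.
# The memory (I/O) clock is half the DDR transfer rate in MT/s.
def first_word_latency_ns(cas, transfer_mt_s):
    return cas / (transfer_mt_s / 2) * 1000

# The two configs tested above, both at DDR3-1400:
print(f"10-14-10-36: {first_word_latency_ns(10, 1400):.1f} ns")  # ~14.3 ns
print(f" 5- 7- 5-18: {first_word_latency_ns(5, 1400):.1f} ns")   # ~7.1 ns

# The frequency-vs-timings tradeoff: a faster kit with looser timings
# can still have lower absolute latency than a slower, tighter one.
print(f"1600 CL9 : {first_word_latency_ns(9, 1600):.2f} ns")
print(f"2133 CL11: {first_word_latency_ns(11, 2133):.2f} ns")
```

Halving CAS at the same clock halves the absolute latency, which lines up with the big difference seen in open-world games.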


----------



## serothis

Quote:


> Originally Posted by *Darkwizzie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *_LDC_*
> 
> I'm still holding my 2600k with love :3 maybe I will upgrade the RAM after seeing the video...
> 
> 
> 
> I considered that, but man... Having to sell my old ram is a pain. I dislike selling online, and I dislike the post office.

I'm the same.

I have a box of old hardware that I never got around to selling. It makes for fun frankenstein projects down the road.


----------



## Ironcobra

Quote:


> Originally Posted by *_LDC_*
> 
> I'm still holding my 2600k with love :3 maybe I will upgrade the RAM after seeing the video...


http://www.newegg.com/Product/Product.aspx?Item=N82E16820231617

That's what I'm doing on my 2500k, except crappy Newegg has taken over 12 days to deliver my RAM. I have asked for a refund. I didn't realize how much faster RAM I could pair with my Sandy - hopefully it boosts my minimums. Also, guys, for what it's worth: I was having a lot of stuttering in the last year and didn't realize I hadn't changed my thermal paste since 2012.







Reapplied some Arctic Silver and dropped my temps by 40! Now all my games, including Witcher 3, have been stutter-free. Stupid of me not to check my temps once in a while - my paste was rainbow colored. Do yourself a favor and stay away from Newegg though.


----------



## Cyb3r

Agreed with that Newegg statement - their RMA is atrocious; it took them a month to replace my RAM.


----------



## Scotty99

I really can't believe RAM will make that much difference. I have a set of CAS 8 1600 RAM that is really high quality; I can't justify spending even 40 bucks on a new set lol.


----------



## Ironcobra

Quote:


> Originally Posted by *Scotty99*
> 
> I really cant believe ram will make that much difference. I have a set of cas 8 1600 ram that is really high quality, i cant justify spending even 40 bucks on a new set lol.


The numbers don't lie in that video - it's a pretty significant difference in minimums.


----------



## Cakewalk_S

Keeping this 2500k till it dies! Pretty darn good chip, and with some fresh CLU this thing doesn't even hit 60C @ 4.5GHz... Don't need much more than that.


----------



## OmegaRED.

My 3DMark score went up a few hundred points after switching from 1600 to 2133 ram but I forgot to do extensive game benching beforehand. I'll just assume I squeezed out the last bit of performance possible.

All thanks to this article


----------



## LuckyStarV

Makes me wonder if I should have shelled out for 2133 instead of settling for 1866


----------



## rdr09

Quote:


> Originally Posted by *LuckyStarV*
> 
> Makes me wonder if I should have shelled out for 2133 instead of settling for 1866


You are fine - those are fast enough, so long as they are not 1333. I have two 290s and my RAM is only 1600; it can keep up.


----------



## MonarchX

Damn.... i7 3770K @ 4.4Ghz is losing to i5 6500K @ 4.5Ghz and stock 6700K. I wonder how my i7 3770K @ 4.8Ghz compares against 6700K @ 4.5-4.6Ghz in recent games.


----------



## MonarchX

Yeah, RAM speed is very important for minimum FPS - low minimums are what cause stuttering. I'd get 2400MHz, of course. It especially helps with SLI and CFX systems.


----------



## ahnafakeef

Quote:


> Originally Posted by *MonarchX*
> 
> Yeah, RAM speed is very important to minimum FPS, which is what causes stuttering. I'd get 2400Mhz of course. It especially helps with SLI and CFX systems.


That makes me want to look into the overclocking potential of my RAM.

Is there any possibility that my 1600MHz DDR3 RAM is overclockable to a higher speed? If yes, could you please give me some pointers as to how I can overclock it?

Thank you.


----------



## ZealotKi11er

Quote:


> Originally Posted by *ahnafakeef*
> 
> That makes me want to look into the overclocking potential of my RAM.
> 
> Is there any possibility that my 1600MHz DDR3 RAM is overclockable to a higher speed? If yes, could you please give me some pointers as to how I can overclock it?
> 
> Thank you.


From experience, I could never get 1600MHz RAM to OC. You can try increasing the voltage to 1.65v and loosening the timings while raising the memory speed. Try to look up similar RAM, see what timings/speeds those kits have, and try to emulate them.


----------



## Derp

Quote:


> Originally Posted by *ahnafakeef*
> 
> Is there any possibility that my 1600MHz DDR3 RAM is overclockable to a higher speed? If yes, could you please give me some pointers as to how I can overclock it?
> .


Last year I did exactly what ZealotKi11er mentions on one of my systems. I just took the timings and voltage from the same product line and tested it out. I used THIS 1600 kit and applied the timings and voltages of THIS 2400 set. Not only was it stable, I was able to tighten some of the timings. When I say stable, I mean it was able to pass an overnight memtest, one hour of Google stressapp, and 1000% HCI MemTest coverage.


----------



## MonarchX

Quote:


> Originally Posted by *ahnafakeef*
> 
> That makes me want to look into the overclocking potential of my RAM.
> 
> Is there any possibility that my 1600MHz DDR3 RAM is overclockable to a higher speed? If yes, could you please give me some pointers as to how I can overclock it?
> 
> Thank you.


You have some of the best RAM out there. Just increase the frequency to 2133MHz (without changing timings), run the Prime95 RAM torture test (Blend or Large FFTs), and if that passes at least an hour, then test games - or vice versa, but the Prime95 torture test will usually reveal an unstable OC faster. If it crashes, increase the voltage to 1.65v and try again. If it still crashes, decrease the frequency or loosen the RAM timings. You can also start by simply tightening timings at 1866MHz, which is similar to increasing clocks. Ultimately it's trial and error until you reach your highest RAM speed OC at the lowest timings possible. Make sure to also run game benchmarks to see what makes the most difference - higher clocks or lower timings.

You should also OC your 3770K. I have the same cooler, and yet I managed to get my baby to run at 4.8GHz 100% stable, although temps get as high as 88C, but only during the Prime95 torture test. Use the same principles for CPU OC as for RAM OC. It's not hard and I am sure you can at least pull 4.2-4.4GHz out of your 3770K. I wouldn't go past 1.31v. PM me if you need more specific instructions, because you'll likely also have to adjust Load-Line Calibration and other more advanced settings to reach a high and stable OC. A CPU OC will benefit your framerate even more than a RAM OC. CPU OC + RAM OC and you'll have a considerable improvement.


----------



## SuperZan

Quote:


> Originally Posted by *MonarchX*
> 
> Damn.... i7 3770K @ 4.4Ghz is losing to i5 6500K @ 4.5Ghz and stock 6700K. I wonder how my i7 3770K @ 4.8Ghz compares against 6700K @ 4.5-4.6Ghz in recent games.


Blahh, I should have compiled some in-game FPS graphs before I sold my 4.7 3770k. Not having done that, I can only offer that anecdotally my 4.6 6700k is granting me 3-5 extra minimum FPS in newer GPU-intensive games with good CrossFire support. In older games with more single-threading and less demanding graphics I'm seeing an 8-10 minimum FPS improvement. The frametime experience feels a bit smoother, but nothing drastic. This is at 4K with Fiji CrossFire, so mileage is bound to vary. In short, I figured that I was upgrading mostly out of vanity and a desire to move to a new platform, and experience has borne that out. Gaming is slightly improved, but with my configuration I wouldn't say that it's new-motherboard+CPU+RAM improved in terms of £/$/€ to performance. For a single mid-high to high-end GPU at 1080/1440 the benefit would be a bit more obvious.


----------



## ahnafakeef

Quote:


> Originally Posted by *ZealotKi11er*
> 
> From experience, I could never get 1600MHz RAM to OC. You can try increasing the voltage to 1.65v and loosening the timings while raising the memory speed. Try to look up similar RAM, see what timings/speeds those kits have, and try to emulate them.


Quote:


> Originally Posted by *Derp*
> 
> Last year I did exactly what ZealotKi11er mentions on one of my systems. I just took the timings and voltage from the same product line and tested it out. I used THIS 1600 kit and applied the timings and voltages of THIS 2400 set. Not only was it stable, I was able to tighten some of the timings. When I say stable, I mean it was able to pass an overnight memtest, one hour of Google stressapp, and 1000% HCI MemTest coverage.


Since both of you provided a similar solution, I'll answer you together.

My RAM is 1600MHz @1.5V with timings of 9-9-9-24. The same kit with 2133MHz runs at 1.65V and timings of 9-11-11-31, as per Corsair's website.

I tried out the latter set of values, but I'm getting errors in memtest86. What should my next course of action be? Do I increase the voltage, or modify the timings? If it's the latter, I'm going to need detailed instructions on how to determine stable timings.

My apologies if this is too much trouble for you guys. But I've never overclocked system memory before, so I'm going to need all the help I can get.

Thank you.
Quote:


> Originally Posted by *MonarchX*
> 
> You have some of the best RAM out there. Just increase the frequency to 2133MHz (without changing timings), run the Prime95 RAM torture test (Blend or Large FFTs), and if that passes at least an hour, then test games - or vice versa, but the Prime95 torture test will usually reveal an unstable OC faster. If it crashes, increase the voltage to 1.65v and try again. If it still crashes, decrease the frequency or loosen the RAM timings. You can also start by simply tightening timings at 1866MHz, which is similar to increasing clocks. Ultimately it's trial and error until you reach your highest RAM speed OC at the lowest timings possible. Make sure to also run game benchmarks to see what makes the most difference - higher clocks or lower timings.
> 
> You should also OC your 3770K. I have the same cooler, and yet I managed to get my baby to run at 4.8GHz 100% stable, although temps get as high as 88C, but only during the Prime95 torture test. Use the same principles for CPU OC as for RAM OC. It's not hard and I am sure you can at least pull 4.2-4.4GHz out of your 3770K. I wouldn't go past 1.31v. PM me if you need more specific instructions, because you'll likely also have to adjust Load-Line Calibration and other more advanced settings to reach a high and stable OC. A CPU OC will benefit your framerate even more than a RAM OC. CPU OC + RAM OC and you'll have a considerable improvement.


Tried 2133MHz without changing timings. I could barely get the system to start, and the one time I did, it ended up resulting in a WHEA error on the desktop.

But as already mentioned, I'm trying out looser timings and that is still resulting in memtest86 errors. Any advice on how to proceed would be appreciated.

Also, my CPU is already overclocked to 4.4GHz @1.22v. I can get it to 4.5GHz @~1.27v. But I can't stabilize 4.6GHz even at 1.32v. Any advice on if and how I can stabilize higher clocks at lower voltages would be appreciated.

Thank you.


----------



## MonarchX

Quote:


> Originally Posted by *ahnafakeef*
> 
> Since both of you provided a similar solution, I'll answer you together.
> 
> My RAM is 1600MHz @1.5V with timings of 9-9-9-24. The same kit with 2133MHz runs at 1.65V and timings of 9-11-11-31, as per Corsair's website.
> 
> I tried out the latter set of values but I'm getting error in memtest86. What should be my next course of action? Do I increase the voltage, or modify timings? If it's the latter, I'm going to need detailed instructions on how to determine stable timings.
> 
> My apologies if this is too much trouble for you guys. But I've never overclocked system memory before, so I'm going to need all the help I can get.
> 
> Thank you.
> Tried 2133MHz without changing timings. Could barely get the system to start. And the time I did ended up resulting in a WHEA error on the desktop.
> 
> But as already mentioned, I'm trying out looser timings but that is still resulting in memtest86 errors. Any advice on how to proceed would be appreciated.
> 
> Also, my CPU is already overclocked to 4.4GHz @1.22v. I can get it to 4.5GHz @~1.27v. But I can't stabilize 4.6GHz even at 1.32v. Any advice on if and how I can stabilize higher clocks at lower voltages would be appreciated.
> 
> Thank you.


CPU: Have you adjusted the Load-Line Calibration (LLC) settings already? Have you disabled all Sleep C-states? Enabled PLL Overvoltage?
RAM: My RAM also struggles big time. It's 2133MHz @ 11-11-11-30 1T @ 1.50v, and I only managed to push it to 2200MHz @ 10-11-10-30 1T @ 1.65v. Try to get yours to 2000MHz. I think it's best to make the CPU and RAM work at the same 1:1 frequency ratio.


----------



## ZealotKi11er

Try 1.36v for 4.6GHz - that is how much I need. Increase the timings to 10-11-11-31 and see if that helps. Do not add more voltage. If you can't do 2133, try to find an 1866MHz kit's specs and replicate those.


----------



## MonarchX

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Try 1.36v for 4.6GHz - that is how much I need. Increase the timings to 10-11-11-31 and see if that helps. Do not add more voltage. If you can't do 2133, try to find an 1866MHz kit's specs and replicate those.


Look at his cooler - it's a CM Hyper 212! 1.36v is CRAZY-HIGH. 1.31v is the max I would ever advice with that cooler. Besides, voltage alone won't help much if Load-Line/Line-Load Calibration is not properly adjusted.


----------



## ahnafakeef

Quote:


> Originally Posted by *MonarchX*
> 
> CPU: Have you adjust Line-Load Calibration (or maybe its called Load-Line Calibration) settings already? Have you disable all Sleep C states? Enabled PLL Over-voltage?
> RAM: My RAM also struggles big time. Its 2133Mhz @ 11-11-11-30 1T @ 1.50v and I only managed to push it to 2200Mhz @ 10-11-10-30 1T @ 1.65v. Try to get it to 2000Mhz. I think its best to make both CPU and RAM work at the same 1:1 frequency ratio.


LLC: Ultra High
PLL OV: Enabled
CPU PLL Voltage: 1.70000
(If by Sleep C states you mean the following)
CPU C1E: Enabled
CPU C3 Report: Disabled
CPU C6 Report: Disabled
Package C State Support: Disabled

As for the RAM, I'll try increased timings or decreased speeds.

Thank you.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Try 1.36v for 4.6GHz - that is how much I need. Increase the timings to 10-11-11-31 and see if that helps. Do not add more voltage. If you can't do 2133, try to find an 1866MHz kit's specs and replicate those.


Do you have liquid cooling? Don't you think 1.36v is too high for an air-cooled CPU?

I'll try out 10-11-11-31 and check with memtest86.

Thank you.

UPDATE: Got errors with both 10-11-11-31 and 11-13-13-31 (Corsair's timings for its 2400MHz RAM) at [email protected] Will await further instructions.

Also, I'm running 6 instances of memtest86, 2000MB each, simultaneously. Let me know if this procedure is appropriate.


----------



## magnek

Vdroop is there for a reason. I would not use anything beyond "medium" for LLC for 24/7 use, much less "ultra high". There comes a point when you just have to accept your CPU may not hit your target OC without you going nuts on the voltage, so just back off 100MHz and try again.


----------



## rdr09

Quote:


> Originally Posted by *MonarchX*
> 
> Yeah, RAM speed is very important to minimum FPS, which is what causes stuttering. I'd get 2400Mhz of course. It especially helps with SLI and CFX systems.


You want to raise your minimums? Add another 980. lol


----------



## MonarchX

Quote:


> Originally Posted by *rdr09*
> 
> you want to raise your minimum? add another 980. lol


Do you even realize we're not discussing GPUs here? You want to buy me another 980 or another card - go ahead. Until then, people here are discussing CPUs, their effect on min FPS, and how to OC their already-purchased RAM to get better min FPS. I hate it when a dumb smart-ass says something very obvious with a "LOL", as if the rest of us are just idiots, even though he/she is the one to laugh at.


----------



## rdr09

Quote:


> Originally Posted by *MonarchX*
> 
> Do you even realize we're not discussing GPU's here? You want buy me another 980 or another card - go ahead. Until then people here are discussing CPU's, their effect on min. FPS and how to OC their already-purchased RAM to get better min FPS. I hate it when a dumb smart-ass says something very obvious with a "LOL", as if the rest of us are just idiots, even though he/she is the one to laugh at.


Do you realize we have nubs reading the stuff we write here? If I had two 980 Tis or Fury Xs, yeah, I'd go to the next platform with faster DDR4. But we have members here contemplating upgrading their DDR3 this late in the game 'cause of one review. lol


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> Do you realize we have nubs reading stuff we write here? if i have two 980Tis or FuryXs, yah, i'll go to the next platform with faster DDR4. We have members here contemplating on upgrading their DDR3s this late in the game 'cause of one review. lol


It's a cheap upgrade. I already have DDR3-2400, but if I wanted to get a 6700K I would have to pay $530 CAD for it. I paid $270 CAD for my 3770K almost 3 years ago.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Its a cheap upgrade. I already have DDR3-2400 but If I wanted to get 6700K I would have to pay $530 CAD for it. I paid $270 CAD for 3770K almost 3 years ago.


That's fine. But look, we have some here with a GTX 760 and 1800MHz RAM wishing to go faster on the latter.


----------



## 45nm

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Its a cheap upgrade. I already have DDR3-2400 but If I wanted to get 6700K I would have to pay $530 CAD for it. I paid $270 CAD for 3770K almost 3 years ago.


That's not even mentioning the general lack of 6700K or 6600K stock in Canada, and the horrendous pricing and exchange rates. Besides, Skylake is mostly DDR4, so all that DDR3 memory you bought can't be reused.


----------



## ZealotKi11er

Quote:


> Originally Posted by *45nm*
> 
> That's not even mentioning the lack of 6700K or 6600K in stock in Canada in general and the horrendous pricing and exchange rates. Besides Skylake is DDR4 mostly so all that DDR3 memory you bought can't be reused.


It really sucks, man. I'm looking to go ITX but can't find anything for Z77.


----------



## JoshuaB123

IMO everyone should upgrade to 5960X


----------



## 45nm

Quote:


> Originally Posted by *JoshuaB123*
> 
> IMO everyone should upgrade to 5960X


I don't think your comment is serious, but I would much rather not jump on X99 and be a beta-tester/early-adopter for Intel. Besides, with the prices of DDR4 and the chipset issues and errata, it seems it was a premature launch even back in 2014. To put this into perspective, I haven't heard of that many issues with HEDT on X58 or X79 compared to X99. Also, wait some more time and there will be another HEDT platform, and Haswell-E will be sold cheap used.


----------



## ZealotKi11er

Quote:


> Originally Posted by *45nm*
> 
> I don't think your comment is serious but I would much rather not jump on X99 and still be a beta-tester/early-adopter for Intel. Besides with the prices of DDR4 and the chipset issues and errata it seems it was an early/premature launch even when it was launched in 2014. To put this into perspective I haven't heard that many issues with HEDT using X58 or X79 compared to X99. Besides wait some more time and there will be another HEDT platform and Haswell-E will be sold for cheap used.


Damn, you're still on X58? I had the same MB, and for me it was the best MB I have ever had.


----------



## 45nm

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Dam you still on X58. I had the same MB and for me it was the best MB I have ever had.


Yep, and I love my X58 platform; this motherboard is great, especially considering the support it has for my X5690 Xeon. I intend to use X58 for a very long time, although I wouldn't mind playing with X79 once I can get an E5 2670 for my X79 motherboard. X99 is not justified, in my opinion.

By the way, whatever happened to that Gigabyte motherboard and the X58 build you had?


----------



## TheReciever

If you're planning to play The Division then you will likely want a 2600k/3770k.

I'm fully tasking my 2500k @ 4.8GHz; though it still plays well enough, it could be better.


----------



## ZealotKi11er

Quote:


> Originally Posted by *45nm*
> 
> Yep and I love my X58 platform and this motherboard is great especially considering the support it has for my X5690 Xeon. I intend to use X58 for a very long time although I won't mind playing with X79 once I can get a E5 2670 for my X79 motherboard. X99 is not justified in my opinion.
> 
> By the way what ever happened to your Gigabyte motherboard and your X58 build you had?


Sold it for cheap when SB came out, lol. Did not know the demand would increase.


----------



## BoredErica

I tested 2800 c16 vs 3000 c16 and found a ~1.74% fps improvement with the latter in very CPU bound sections of Skyrim and Oblivion. I didn't test roaming around outside though.


----------



## Scotty99

Quote:


> Originally Posted by *MonarchX*
> 
> Look at his cooler - its CM Hyper 212! 1.36v is CRAZY-HIGH. 1.31v is the max I would ever advice with that cooler. Besides, voltage alone won't help much of Load-Line/Line-Load Calibration is not properly adjusted.


I can push 1.5v to my 2500k and never go over 75C lol (I have the 212+ also).

This is just one of the reasons Sandy was the best generation evur


----------



## ahnafakeef

Quote:


> Originally Posted by *MonarchX*
> 
> Look at his cooler - its CM Hyper 212! 1.36v is CRAZY-HIGH. 1.31v is the max I would ever advice with that cooler. Besides, voltage alone won't help much of Load-Line/Line-Load Calibration is not properly adjusted.


Did not notice this post when I last posted. The insufficiency of my cooler was pointed out to me in the Ivy overclocking thread when I first overclocked, which is why I never tested anything past 1.32v.

My LLC is set to Ultra High - that's what I was advised in the Ivy overclocking thread. Would a different setting help achieve a higher overclock?

Thank you.
Quote:


> Originally Posted by *magnek*
> 
> Vdroop is there for a reason. I would not use anything beyond "medium" for LLC for 24/7 use, much less "ultra high". There comes a point when you just have to accept your CPU may not hit your target OC without you going nuts on the voltage, so just back off 100MHz and try again.


From what you said, is it safe to assume that Ultra High LLC helps achieve a higher overclock?

Also, why is Ultra High LLC not advisable for 24/7 use?

Thank you.


----------



## TranquilTempest

Quote:


> Originally Posted by *ahnafakeef*
> 
> Did not notice this post when I last posted. The insufficiency of my cooler was pointed out to me in the Ivy overclocking thread when I first overclocked it. Which is why I never tested at anything past 1.32v.
> 
> My LLC is set to Ultra High. That's what I was advised on the Ivy overclocking thread. Would a different setting help achieve a higher overclock?
> 
> Thank you.
> From what you said, is it safe to assume that Ultra High LLC helps achieve a higher overclock?
> 
> Also, why is Ultra High LLC not advisable for 24/7 use?
> 
> Thank you.


LLC increases voltage to offset the drop caused by resistance when the chip is drawing a lot of current. The problems are the same as those caused by just setting the voltage higher in the first place, heat and degradation.

Say you have a CPU core stable at 1.0v regardless of load. If you set your supply voltage to 1.01v it will be stable at idle, but due to the resistance of the traces that voltage will fall below 1.0v under load and become unstable. If the voltage falls to 0.95v under load, the ideal LLC setting would bump the supply voltage up to, say, 1.07v under load, so the voltage at the actual core stays at 1.01v. Now, it's hard to tell exactly what "ultra high LLC" means, but I would personally assume it overshoots, so the final voltage is more than actually needed.
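That droop behaviour can be sketched as simple Ohm's-law arithmetic. A toy model - the resistance and current figures below are made up for illustration, chosen so the outputs match the example voltages in the post:

```python
# Toy vdroop model: v_core = v_set + llc_boost - I_load * R_path.
R_PATH = 0.001  # ohms - assumed effective resistance of socket/traces

def core_voltage(v_set, i_load_amps, llc_boost=0.0):
    """Voltage seen at the core: set voltage minus droop, plus any LLC boost."""
    return v_set + llc_boost - i_load_amps * R_PATH

idle = core_voltage(1.01, 5)        # light load: barely any droop
loaded = core_voltage(1.01, 60)     # heavy load: droops to 0.95v -> unstable
with_llc = core_voltage(1.01, 60, llc_boost=0.06)  # LLC lifts it back to 1.01v
print(f"idle {idle:.3f}v, loaded {loaded:.3f}v, with LLC {with_llc:.3f}v")
```

An "overshooting" LLC setting is just an `llc_boost` larger than the actual droop, which is why aggressive presets can push the core above the voltage you set in the BIOS.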


----------



## magnek

Quote:


> Originally Posted by *ahnafakeef*
> 
> Did not notice this post when I last posted. The insufficiency of my cooler was pointed out to me in the Ivy overclocking thread when I first overclocked it. Which is why I never tested at anything past 1.32v.
> 
> My LLC is set to Ultra High. That's what I was advised on the Ivy overclocking thread. Would a different setting help achieve a higher overclock?
> 
> Thank you.
> From what you said, is it safe to assume that Ultra High LLC helps achieve a higher overclock?
> 
> Also, why is Ultra High LLC not advisable for 24/7 use?
> 
> Thank you.




(from an LTT thread)

Because setting LLC to ultra high causes overshoot of your set target Vcore in the BIOS, running the unnecessary risk of frying your chip if you don't know what you're doing.


----------



## Max78

So I was on Newegg looking at some very cheap memory (man, it's sooo cheap now!) and decided to play around with my current sticks to see if I could overclock them a little or something. . .

Well, looking at the memory settings in the BIOS, I wasn't even running the XMP profile, so my timings were 11-11-11-32 at 1600MHz. . . .doh!!! Changed the profile to XMP and that brought everything down to 9-9-9-24. Seems to run smoother; now I want to see how far I can take these sticks.
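For a rough sense of what dropping from CL11 to CL9 buys, first-word latency can be estimated from the CAS latency and the DDR transfer rate. This is the standard rule of thumb (CAS cycles divided by the memory clock, which is half the DDR rate), not anything specific to these sticks:

```python
def cas_latency_ns(cl, ddr_rate_mt_s):
    """Approximate first-word latency in nanoseconds.

    cl            -- CAS latency in memory-clock cycles
    ddr_rate_mt_s -- effective DDR rate in MT/s (e.g. 1600 for DDR3-1600);
                     the actual memory clock is half that.
    """
    clock_mhz = ddr_rate_mt_s / 2.0     # DDR transfers twice per clock
    return cl / clock_mhz * 1000.0      # cycles / MHz -> ns

print(cas_latency_ns(11, 1600))  # 13.75 ns at the loose 11-11-11-32 timings
print(cas_latency_ns(9, 1600))   # 11.25 ns with the XMP 9-9-9-24 profile
```

So at the same 1600MHz, the XMP profile shaves roughly 2.5 ns off every CAS access, which is where the "smoother" feel plausibly comes from.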


----------



## Aggrotech

Can't stream with an i5-2500k, so yes, it is TIME.


----------



## Scotty99

Quote:


> Originally Posted by *Aggrotech*
> 
> cant stream with i5-2500k, so yes, it is TIME.


You're doing something wrong, broski. Use game capture; I can't even notice I'm streaming with OBS and my system.


----------



## EinZerstorer

Quote:


> Originally Posted by *ahnafakeef*
> 
> Did not notice this post when I last posted. The insufficiency of my cooler was pointed out to me in the Ivy overclocking thread when I first overclocked it. Which is why I never tested at anything past 1.32v.
> 
> My LLC is set to Ultra High. That's what I was advised on the Ivy overclocking thread. Would a different setting help achieve a higher overclock?
> 
> Thank you.
> From what you said, is it safe to assume that Ultra High LLC helps achieve a higher overclock?
> 
> Also, why is Ultra High LLC not advisable for 24/7 use?
> 
> Thank you.


* advise.

Also, only listening to some goofballs in a thread probably isn't the best idea. Tons of people have proven that as long as thermals are in check you can push voltage on Sandy regardless of the type of cooler.


----------



## blackbishop

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The GPU is the only thing that is not a Bottleneck in a System. I see the CPU as the enabler. That means if *CPU pushes CPU* to 99% its doing it job, else bottleneck.


I'm interested in the definition of bottleneck you wrote there, but about the bold part: is the use of both "CPU" words correct, or should one of them be "GPU"?

I was wondering what benefits upgrading from an i5-2500K to an i7-3770K would bring. I see it has 8 threads, so I guess it handles more processes at once. An acquaintance is offering it to me very cheap because he's upgrading his rig, and since I don't really have the money/need to buy a new motherboard + DDR4 RAM yet (not for the next 4 years, I guess), I was wondering whether it might be fine to accept the deal.


----------



## SuperZan

Quote:


> Originally Posted by *blackbishop*
> 
> I'm interested in your definition of bottleneck you wrote there, but about the bold part, is the use of both of the CPU words correct or one of them should be GPU?
> 
> I was wondering about what benefits would have upgrading an I5 2500K with an I7 3770K. I see it has 8 threads, so it does mean it handles processes better I guess. One acquaintance is offering it to me very cheap because he will upgrade his rig and since I don't really have that much money/need to purchase also a new motherboard + ddr4 ram yet(until the next 4 years I guess) I was thinking if it might be fine to accept the deal or not.


Seriously though, I'm 100% sure the bit you quoted was with regards to the CPU pushing the GPU. With the 3770k it all comes down to price and use-case, IMO. If you haven't got a pressing need for eight threads, there still aren't many games leveraging the MT advantage of the i7. But if you do any video encoding, streaming, or anything of that sort, you'll notice a significant improvement with the 3770k. Of course, were I you, if the offer were something like £50 I'd take it on principle to hold me over until the next big re-build.


----------



## blackbishop

Thanks, and sorry about necroing the topic. I didn't see the date of the answers; found this topic on Google, read until page 39, created an account and went back to page 33 to quote the text.


----------



## SuperZan

Quote:


> Originally Posted by *blackbishop*
> 
> Thanks, and sorry about necroing the topic, I didn't see the date of the answers, found this topic in google, read until page 39, created an account and went back to page 33 to quote the text.


We all do it now and again!

I'd just consider whether you use any highly-threaded applications that can make use of the i7, and then consider the price. For reference, I sold mine for £220 or thereabouts, so something like $275 USD at the time. If it's a reasonable price and you can use it, or it's just a great price for a used 3770k, snap it up.


----------



## blackbishop

Ok, I'll keep your advice in mind. And just for the record, the price I'm being offered is half the one you mentioned in USD (275 / 2 = 137.50), a bit more with shipping though (my friend doesn't live in the same state as me, but somewhat near). He used it for about a year, overclocked at 4.0GHz the whole time, on a water cooling system (Swiftech H220x).

EDIT: He tested it at up to 4.4GHz before offering it, and he said it was stable.


----------



## Snakecharmed

I sold off a couple of i7-3770Ks for around $235 each in the last two months, so you're getting a great deal and a cheap upgrade from your friend, especially if you turn around and sell your i5-2500K for about $100.


----------



## SuperZan

^ Agreed. Especially as it sounds as though it was cared for and cooled properly. I had the H220x before I picked up the x2 and it was more than capable of handling a 4.6GHz 3770k at very comfortable temps. At 4.0GHz that chip shouldn't have been unduly stressed by heat or voltage and should be in great shape.


----------



## blackbishop

Quote:


> Originally Posted by *Snakecharmed*
> 
> I sold off a couple of i7-3770Ks for around $235 each in the last two months, so you're getting a great deal and a cheap upgrade from your friend, especially if you turn around and sell your i5-2500K for about $100.


Quote:


> Originally Posted by *SuperZan*
> 
> ^ Agreed. Especially as it sounds as though it was cared for and cooled properly. I had the H220x before I picked up the x2 and it was more than capable of handling a 4.6GHz 3770k at very comfortable temps. At 4.0GHz that chip shouldn't have been unduly stressed by heat or voltage and should be in great shape.


Thanks guys! Your input is really helpful!

Then I'll proceed to inform him that I accept the deal!


----------



## Max78

It really depends on what you do with your system.

If you do nothing but game, it's going to be a bit of a side grade; I doubt you will notice a difference in performance.
If you multitask, like live streaming, or you render video and audio, then you will probably notice a performance increase.

If you currently have a 2500K with a decent motherboard, I would suggest you try overclocking if you haven't already. You can run the 2500K at 4.7GHz on the Cooler Master Hyper 212, which you can get for around $30 (cheaper on sale), and that puts you at stock 6500 levels, which isn't bad for an almost 6 year old chip. . .

Here is a great video that might help you make a decision.

https://www.youtube.com/watch?v=2ZxZiksWtRQ

Edit: Didn't see the last page, DOH!


----------



## Klocek001

I'd recommend upgrading the 2500K for

1) 980Ti or higher in nvidia rigs
2) R9 290 or higher in amd rigs


----------



## xx9e02

Quote:


> Originally Posted by *Klocek001*
> 
> I'd recommend upgrading the 2500K for
> 
> 1) 980Ti or higher in nvidia rigs
> 2) R9 290 or higher in amd rigs


I think it's time for me to upgrade; still slumming it on sata2/usb2 lol, trying to hold out till zen / kaby


----------



## Chaython

Please stop necroing an old article, as it's placed under news.
also
Quote:


> Originally Posted by *xx9e02*
> 
> I think it's time for me to upgrade; still slumming it on sata2/usb2 lol trying to hold out till zen / kaby


I couldn't stand not having at least one SATA 3 port these days, but USB 3+, although nice to have, I rarely use if ever. (My phone syncs over wifi; since most phones still aren't USB 3, wifi is faster, and I share files with all my other devices over the network as well. I hate external USB-based drives, except flash drives, which you'd rarely have use for anyway. IDK why people care so much about USB.) I only buy motherboards with the best IO, so I may be considered a baller (also because I resell all my systems, and people like extra USB ports they never use).


----------



## Disturbed117

I feel that this thread ran its course long ago.

Locked.


----------

