# Why isn't PS3 more powerful than PS4?



## sepiashimmer

I've been wondering this a lot lately. Doesn't the PS3 have the Cell with 7 cores running at 3.2 GHz, which is a lot more than Jaguar? Why does Sony claim the PS4 will have 10 times the processing power of the PS3?


----------



## CynicalUnicorn

Ignoring the CPU architecture, the GPU is exponentially more powerful and it has 16 times as much RAM (8GB shared vs 256MB system + 256MB graphics). If somebody can find Cell vs. Jaguar comparisons, that would be nice, but according to some forum posts, the Jaguar chip is four times as powerful as the Cell overall. It's been seven years since the PS3 was launched, after all.


----------



## sepiashimmer

Quote:


> Originally Posted by *CynicalUnicorn*
> 
> Ignoring the CPU architecture, the GPU is exponentially more powerful and it has 16 times as much RAM (8GB shared vs 256MB system + 256MB graphics). If somebody can find Cell vs. Jaguar comparisons, that would be nice, but according to some forum posts, the Jaguar chip is four times as powerful as the Cell overall. It's been seven years since the PS3 was launched, after all.


How is Jaguar four times more powerful with a lower clock?


----------



## xxicrimsonixx

GPU > CPU when it comes to gaming performance, and the GPU in the PS4 is significantly faster than the GPU in the PS3.


----------



## black7hought

The PS3 is using CPU and GPU technology that is at least seven years old. We are past the point where clock speed and the number of cores determine performance. It is the standard progression of technology becoming more efficient and powerful. Smartphones are a great example of this.

tl;dr

Jaguar: Seven years newer and therefore faster and more powerful.

CELL: Old and busted.


----------



## kill

Quote:


> Originally Posted by *sepiashimmer*
> 
> How is Jaguar four times more powerful with a lower clock?


Different architecture... just like how Intel is faster at the same clock speed as its AMD counterpart.


----------



## Shrak

Quote:


> Originally Posted by *sepiashimmer*
> 
> How is Jaguar four times more powerful with a lower clock?


Clock speed doesn't mean everything.


----------



## TomiKazi

To be fair, clock speed has never been a reliable indicator. Not even in the Z80, 6502, and 8086 days.


----------



## briddell

Quote:


> Originally Posted by *sepiashimmer*
> 
> How is Jaguar four times more powerful with a lower clock?


How is a 4770k @ 3.6GHz more powerful than a Celeron D @ 4GHz? Per-core performance, architecture improvements, and power efficiency.
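In code, the idea looks like this: sustained throughput is roughly IPC × clock, so a chip with a lower clock but much higher IPC wins. A toy sketch with illustrative (not measured) numbers:

```python
def perf(ipc, clock_ghz):
    # Billions of instructions retired per second for one core:
    # instructions-per-cycle times cycles-per-second.
    return ipc * clock_ghz

modern = perf(2.0, 3.6)  # hypothetical modern core: high IPC at 3.6GHz
old = perf(0.5, 4.0)     # hypothetical old core: low IPC despite 4GHz
print(modern / old)      # the lower-clocked chip comes out ~3.6x faster
```

The IPC figures here are made up for illustration; real IPC varies per workload.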


----------



## MrAlex

Quote:


> Originally Posted by *sepiashimmer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CynicalUnicorn*
> 
> Ignoring the CPU architecture, the GPU is exponentially more powerful and it has 16 times as much RAM (8GB shared vs 256MB system + 256MB graphics). If somebody can find Cell vs. Jaguar comparisons, that would be nice, but according to some forum posts, the Jaguar chip is four times as powerful as the Cell overall. It's been seven years since the PS3 was launched, after all.
> 
> 
> 
> How is Jaguar four times more powerful with a lower clock?

Architecture plays a major role in things like this. For example, an Intel Pentium Extreme Edition 965 (with a clock of 3.73GHz) will perform about the same as a 1.86GHz Intel E6300, even though you're looking at half the clock speed. Also, the Cell processor is a bit different from the traditional CPU you're used to. Read up on it here.


----------



## AndroidVageta

The Cell processor MIGHT actually be more powerful if properly used. I know that in the PC space, the Jaguar-based CPU in the new consoles is comparable to a ~3GHz dual-core Intel i3, based on benchmarks of the quad-core Jaguar APU (those numbers multiplied by two, of course).

So to say the CPU is more powerful might not be entirely accurate.

I think the PS4 is more powerful than the PS3 due to its GPU. To put it in perspective, the 360's GPU is more powerful than the PS3's, and it has 48 stream processors clocked at 500MHz. The PS4 has 1,152 stream processors clocked at 800MHz. So yeah... the GPU is a LOT faster. THAT's where the difference REALLY is... that and the 8GB of RAM vs. the PS3's 512MB.
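The PS4 numbers above line up with the usual headline figure. A back-of-the-envelope sketch (peak single-precision throughput, assuming 2 FLOPs per stream processor per cycle for a fused multiply-add):

```python
def peak_gflops(stream_processors, clock_mhz):
    # shaders x clock x 2 FLOPs/cycle (one multiply-add), in GFLOPS
    return stream_processors * clock_mhz * 2 / 1000.0

print(peak_gflops(1152, 800))  # 1843.2 GFLOPS, i.e. the ~1.84 TFLOPS usually quoted for the PS4
```

This is a theoretical peak only; real game workloads never sustain it.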


----------



## DaveLT

Quote:


> Originally Posted by *AndroidVageta*
> 
> The Cell processor MIGHT actually be more powerful if properly used. I know that in the PC space the Jaguar powered CPU in the new consoles is comparable to a ~3GHz dual core Intel i3 based on benchmarks from the quad-core Jaguar based APU (those numbers multiplied by two of course).
> 
> So to say the CPU is more powerful might not be entirely accurate.
> 
> I think the PS4 is more powerful than the PS3 due to it's GPU. To put it in perspective the 360's GPU is more powerful than the PS3's and it has 48 stream processors clocked at 500MHz. The PS4 has 1,152 stream processors clocked at 800MHz. So yeah...the GPU is a LOT faster. THAT's where the difference REALLY is...that and the 8GB of RAM vs. the PS3's 512MB.


If you want the real calculations, I'll give them to you.
Jaguar cores are what's in Kabini. Higher IPC per core than even Kaveri, so you could roughly say it's equal to a ... 4GHz Haswell i5, actually. If those 8 cores were used properly to their full potential, of course. It's not freaking BD/PD.

And as we all know, it's not a 7-core proc. It's 1 PPE for everything other than the FPU, and an SPE is a very simple core with no branch prediction, which is why its code needs to be awfully specific. It will only perform fantastically then, and in practice it never did. Compare the actual throughput with even an i7 965 and you will see how weak the Cell actually is, even with the right code.

But still, the GPU is a hell of a lot faster. OK, it's only about as fast as a 7850, but it is light years ahead of the Nvidia GPU in the PS3. But bruh, stream processors aren't a good comparison.
Although people think so ... why are the 7850 and the 650 Ti Boost still $100+ when a PS4 costs $400-500? It's simple. They cut double-precision (FP64) compute way down on the console SoCs compared to the desktop 7850s, so they don't need to bin as well. Unfortunately, on desktop we actually make use of FP64.
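The caveat about stream-processor counts can be made concrete. The 360's Xenos exposed 48 ALUs, but each was roughly 5 lanes wide (vec4 + scalar), while GCN-era marketing counts every lane as a "stream processor". A rough sketch (the lane widths and multiply-add assumption are the commonly cited figures, so treat all numbers as approximate):

```python
def peak_gflops(alus, lanes_per_alu, clock_mhz):
    # ALUs x lanes x 2 FLOPs/cycle (multiply-add) x clock, in GFLOPS
    return alus * lanes_per_alu * 2 * clock_mhz / 1000.0

naive_xenos = peak_gflops(48, 1, 500)    # 48 GFLOPS if you wrongly treat each ALU as one lane
xenos       = peak_gflops(48, 5, 500)    # 240 GFLOPS with the vec4+scalar layout
ps4         = peak_gflops(1152, 1, 800)  # 1843.2 GFLOPS: GCN already counts single lanes

print(ps4 / xenos)  # ~7.7x, not the ~38x a raw shaders-times-clock comparison would suggest
```

The point is only that shader counts from different architectures aren't directly comparable.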


----------



## Blameless

Quote:


> Originally Posted by *AndroidVageta*
> 
> The Cell processor MIGHT actually be more powerful if properly used. I know that in the PC space the Jaguar powered CPU in the new consoles is comparable to a ~3GHz dual core Intel i3 based on benchmarks from the quad-core Jaguar based APU (those numbers multiplied by two of course).
> 
> So to say the CPU is more powerful might not be entirely accurate.


You'd only be able to extract more performance from the Cell BE in the PS3 than the Jaguar in PS4 in very specific circumstances. The Cell isn't much of a general purpose CPU, and almost anything it is really good at, a GPU is much better at.

Comparing the Cell BE to Jaguar is like comparing an old APU to a recent CPU.

What's faster, a Celeron G430 or a 4930k?

The Celeron G530 has one of the weakest versions of Intel HD Graphics (6 EUs, 650MHz) and a single slow CPU core (1.6GHz Sandy, no HT). The 4930k has six very powerful CPU cores, but no EUs at all.

Most people would consider the 4930k more powerful even though there are a small number of tasks the Celeron's EUs are much more suited for. In this way the Celeron is like the Cell, and the 4930k like the Jaguar (and before anyone misinterprets me, this is just an example of the different purposes these parts are geared toward; I'm not making any direct comparisons between the Intel and IBM/AMD parts).


----------



## DaveLT

Quote:


> Originally Posted by *Blameless*
> 
> You'd only be able to extract more performance from the Cell BE in the PS3 than the Jaguar in PS4 in very specific circumstances. The Cell isn't much of a general purpose CPU, and almost anything it is really good at, a GPU is much better at.
> 
> Comparing the Cell BE to Jaguar is like comparing an old APU to a recent CPU.
> 
> What's faster a Celeron G430 or a 4930k?
> 
> The Celeron G530 has one of the weakest versions of Intel HD Graphics (6 EUs, 650MHz) and a single slow CPU core (1.6GHz Sandy, no HT). The 4930k has six very powerful CPU cores, but no EUs at all.
> 
> Most people would consider the 4930k more powerful even though there are a small number of tasks the Celeron's EUs are much more suited for. In this way the Celeron is like the Cell, and the 4930k like the Jaguar (and before anyone misinterprets me, this is just an example of the different purposes these parts are geared toward; I'm not making any direct comparisons between the Intel and IBM/AMD parts).


Just nitpicking but LOL, it's a 2.4GHz dual-core Sandy. The G1101 is far weaker than even a G530, though. Also, I think you're better off comparing the A4-4000 with any CPU, because Intel "APUs" aren't APUs the way AMD APUs are.


----------



## Blameless

Quote:


> Originally Posted by *DaveLT*
> 
> More IPC per core compared to even Kaveri so you can roughly say it's equal to a ... 4GHz Haswell i5 actually. If those 8 cores were to be used properly of course to their full potential. It's not freaking BD/PD


You are greatly exaggerating Jaguar's IPC and performance.

A 4GHz Haswell quad with no HT should soundly beat a ~2GHz eight-core Jaguar in almost everything, and often by a wide margin.

One look at benchmarks will lead to this conclusion. Even if you extrapolate A4-5000 CPU performance and assume perfect scaling with clock speed and cores (and the latter, especially, is probably overly generous, as Jaguar uses 4-core modules, and communication between modules is vastly slower than communication within them), you see that Jaguar performance falls far short of a Haswell i5.

http://www.anandtech.com/show/6974/amd-kabini-review/3

If we double the scores of the A4-5000 (which is only clocked 6.7% lower than the PS4 part) we get less than half the performance of a stock i5-4430. Even if we triple the scores of the A4 (which would represent performance well in excess of the highest rumored Jaguar clocks on either the PS4 or XBoxOne), we still fall short.
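Written out as a quick sketch (the benchmark score is a placeholder; 4 cores @ 1.5GHz is the A4-5000, and 8 cores @ 1.6GHz is the commonly reported PS4 configuration):

```python
def extrapolate(score, core_ratio, clock_ratio):
    # Optimistic ceiling: assumes perfect scaling with core count and clock,
    # which real multi-module Jaguar parts will not achieve.
    return score * core_ratio * clock_ratio

a4_score = 100.0  # hypothetical A4-5000 benchmark score (4 cores @ 1.5GHz)
ps4_ceiling = extrapolate(a4_score, 8 / 4, 1.6 / 1.5)
print(round(ps4_ceiling, 1))  # ~213.3: barely more than double, even with perfect scaling
```

Even this generous ceiling sits at roughly 2.1x the A4-5000, which is the "double the scores" step above.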

You say it's not BD/PD, but Jaguar's IPC is actually not far off Piledriver.
Quote:


> Originally Posted by *DaveLT*
> 
> Just nitpicking but LOL, it's a 2.4GHz dual sandy core. The G1101 is far weaker than even a G530 though. Also i think you're better off comparing A4-4000 with any CPU because Intel "APU" aren't a APU unlike AMD APUs are


I meant G430, which is a 1.6GHz single core. The G530 reference was a typo.

And an Intel APU is just as much an APU as an AMD APU.


----------



## DaveLT

Quote:


> Originally Posted by *Blameless*
> 
> You are greatly exaggerating Jaguar's IPC and performance.
> 
> A 4GHz Haswell quad with no HT should soundly beat a ~2GHz eight core Jaguar in almost everything, and often by a wide margin.
> 
> One look at benchmarks will lead to this conclusion. Even if you extrapolate A4-5000 CPU performance and assume perfect performance scaling with clock speed and cores (and the later, especially, is probably overly generous, as Jaguar has 4-core modules, and communication between modules is vastly slower than communication within them). You see that Jaguar performance will fall far short of a Haswell i5.
> 
> http://www.anandtech.com/show/6974/amd-kabini-review/3
> 
> If we double the scores of the A4-5000 (which is only clocked 6.7% lower than the PS4 part) we get less than half the performance of a stock i5-4430. Even if we triple the scores of the A4 (which would represent performance well in excess of the highest rumored Jaguar clocks on either the PS4 or XBoxOne), we still fall short.
> 
> You say it's not BD/PD, but Jaguar's IPC is actually not far off Piledriver.
> I meant G430, which is a 1.6GHz single core. The G530 reference was a typo.
> 
> And an Intel APU is just as much an APU as an AMD APU.


But the PS4/Xbone has far more cache than the A4-5000. Seriously, look at it: 2MB L2?! That's a big hit to performance.

No. Intel CPUs are just a CPU+GPU. A real APU has a single power controller for both the CPU and GPU (split internally, of course) and is ABLE to actually use both the GPU and CPU to compute, unlike Intel's implementation, which has separate CPU and GPU power pins on the socket.


----------



## Blameless

Quote:


> Originally Posted by *DaveLT*
> 
> But the PS4/Xbone has far more cache than the A4-5000. Seriously look at it, 2MB L2?! That causes great hits on the performance


It's the same 2MB of L2 cache per module in the Jaguar parts on the PS4 and XBox One, and 2MB of L2 isn't likely to be a serious weakness.
Quote:


> Originally Posted by *DaveLT*
> 
> No. Intel CPUs are just CPU+GPU. A real APU has the same power controller (but split for CPU and GPU of course) for both CPU and GPU and is ABLE to actually use both GPU and CPU to calculate unlike Intel's implementation which has 1 part CPU power pin and GPU power pin on the socket


An APU is a CPU and a GPU or other accelerator on the same package or die. Differences in power planes have no bearing on whether something is an APU or not. Virtually every processor has a variety of power inputs (which include different pins for different voltages) for various parts of the chip.

Intel's HD Graphics can be used for many of the same GPGPU tasks as AMD's integrated Radeon GPUs on AMD APUs. They both accelerate various standard graphics APIs, of course, both also accelerate video encoding/decoding, and both can be used for GPGPU tasks with APIs like OpenCL.

AMD and Intel's parts have slightly different implementations and differences in performance, but they largely do the same things, and by any remotely reasonable definition, are both APUs.

AMD's most recent HSA push isn't supported by Intel, but APUs predate HSA.


----------



## DaveLT

Quote:


> Originally Posted by *Blameless*
> 
> It's the same 2MB L2 cache per module in the Jaguar parts on the PS4 and XBox One and 2MB L2 isn't likely to be a serious weakness.
> An APU is a CPU and a GPU or other accelerator on the same package or die. Differences in power planes have no bearing on whether something is an APU or not. Virtually every processor has a variety of power inputs (which include different pins for different voltages) for various parts of the chip.
> 
> Intel's HD Graphics can be used for many of the same GPGPU tasks as AMD's integrated Radeon GPUs on AMD APUs. They both accelerate various standard graphics APIs, of course, both also accelerate video encoding/decoding, and both can be used for GPGPU tasks with APIs like OpenCL.
> 
> AMD and Intel's parts have slightly different implementations and differences in performance, but they largely do the same things, and by any remotely reasonable definition, are both APUs.
> 
> AMD's most recent HSA push isn't supported by Intel, but APUs predate HSA.


2MB per module for the PS4 ... the A4-5000's is only 2MB shared by the whole darn thing. If cache isn't useful, then what is it for, huh?

Not in the same way.

Only AMD APUs are APUs. Intel CPUs are NOT APUs.


----------



## DuckieHo

Quote:


> Originally Posted by *DaveLT*
> 
> No. Intel CPUs are just CPU+GPU. A real APU has the same power controller (but split for CPU and GPU of course) for both CPU and GPU and is ABLE to actually use both GPU and CPU to calculate unlike Intel's implementation which has 1 part CPU power pin and GPU power pin on the socket


No, they are absolutely not. The CPU and GPU have shared the same ring bus and cache since Sandy Bridge. That alone makes them integrated. Furthermore, Intel's QuickSync demonstrates an APU task. Intel's Clarkdale actually beat AMD to APUs, even though they didn't call it that.

The power controller has nothing to do with something being an APU or not. Separate power controllers are just a technique to improve power consumption and have nothing to do with the APU microarchitecture.


----------



## lacrossewacker

Quote:


> Originally Posted by *DaveLT*
> 
> 2MB per module for PS4 ... A4-5000 is only 2MB shared by the whole darn thing. If cache isn't useful then what is cache for then huh?
> 
> Not in the same way.
> 
> Only AMD APUs are APUs. Intel CPUs are NOT APUs


You realize "APU" is just a marketing term, right?

It relates back to ATI/AMD's earlier "Fusion" project from 2006: putting a CPU and GPU on the same die.

That's it.

Whether or not Intel chooses that "APU" moniker is up to them (or AMD if it's trademarked). Personally, I don't think Intel would want to associate their products with the marketing term "APU" just because they want to keep their own separate "superior" image.


----------



## DuckieHo

Quote:


> Originally Posted by *lacrossewacker*
> 
> Whether or not Intel chooses that "APU" moniker is up to them (or AMD if it's trademarked). Personally, I don't think Intel would want to associate their products with the marketing term "APU" just because they want to keep their own separate "superior" image.


Naw, the main reason Intel would not use the APU moniker is that *"they are an x86 CPU company."* All their businesses (SSD, networking, WiFi, mobile, etc.) are there to support x86 CPUs.


----------



## Blameless

Quote:


> Originally Posted by *DaveLT*
> 
> 2MB per module for PS4 ... A4-5000 is only 2MB shared by the whole darn thing.


Jaguar is four cores per module.

The A4-5000 only has one module. The PS4 and XBone have two modules.
Quote:


> Originally Posted by *DaveLT*
> 
> Intel CPUs are NOT APUs


Yes they are.
Quote:


> Originally Posted by *lacrossewacker*
> 
> You realize "APU" is just a marketing term right?
> 
> It relates back to the previous ATI/AMD "Fusion" project in 2006 -> Putting a CPU and GPU on the same die.
> 
> That's it.
> 
> Whether or not Intel chooses that "APU" moniker is up to them (or AMD if it's trademarked). Personally, I don't think Intel would want to associate their products with the marketing term "APU" just because they want to keep their own separate "superior" image.


APU is like GPU, a term coined by one company that has become a generic.

3D graphics accelerators with T&L hardware predate the NVIDIA Geforce 256, but NVIDIA released the first consumer level product that hit an arbitrary performance level and called it a GPU. ATI soon followed with their VPUs. Now they are both called GPUs.

APUs have a similar history. SoCs and CPUs with integrated coprocessors have been around for a very long time, but AMD took the idea to a new level and started calling their products APUs. Intel isn't calling any of their products APUs yet, and may never, but that doesn't change the fact that a modern LGA-1155 or 1150 CPU has all the core capabilities of AMDs original FM1 APUs. Thus, the term APU is justly applied to such parts by numerous individuals.


----------



## Arthur Hucksake

The Cell never hit its potential in the end, gimped by a lack of RAM and a relatively poor GPU.

Everything about the PS4 takes it to the cleaners.


----------



## tuffy12345

Quote:


> Originally Posted by *sepiashimmer*
> 
> How is Jaguar four times more powerful with a lower clock?


This has to be a troll, right?


----------



## sepiashimmer

Quote:


> Originally Posted by *tuffy12345*
> 
> This has to be a troll, right?


No.


----------



## deepor

Quote:


> Originally Posted by *sepiashimmer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tuffy12345*
> 
> This has to be a troll, right?
> 
> 
> 
> No.

Wikipedia might help you understand why what you're wondering about can actually happen. This is the article about the Jaguar architecture used for the PS4's CPU: http://en.wikipedia.org/wiki/AMD_Jaguar

It mentions "out-of-order" and "speculative" execution as features. These are the articles about that:

http://en.wikipedia.org/wiki/Out-of-order_execution
http://en.wikipedia.org/wiki/Speculative_execution

To understand what's happening, reading about pipelines in the CPU might help: http://en.wikipedia.org/wiki/CPU_pipeline

What's happening is, there are advanced features that take a lot of room and transistors to implement in a CPU core. They can make a core a lot faster than a simpler core that runs at the same clock but has fewer features and far fewer transistors.

I can't find good articles discussing and explaining CPU architectures at the moment so I just linked to those Wikipedia articles. I think I remember interesting, long articles on arstechnica.com from a long time ago, but I can't find them.

Reading about things like this was a lot more fun a decade ago, when the PowerPC, Alpha, SPARC, and MIPS architectures still had real CPUs competing with x86. That's when articles about architectures and their features were written a lot. The Cell in the PS3 was related to PowerPC and used one full-featured core, but most of its transistors were invested in a lot of very simple cores.


----------



## sepiashimmer

Quote:


> Originally Posted by *deepor*
> 
> Wikipedia might help understand why what you are wondering about can actually happen. This is the article about the Jaguar architecture used for the PS4 CPUs: http://en.wikipedia.org/wiki/AMD_Jaguar ...
> 
> ...Reading about things like this was a lot more fun a decade ago, when there was PowerPC and Alpha and Sparc and Mips architectures and real CPUs using those still competing with x86. That's when articles about architectures and their features were written a lot. The Cell in the PS3 was related to PowerPC and used some full-featured cores, but most of the transistors of that Cell CPU were invested in a lot of very simple cores.


Thanks.


----------



## Mwarren

The PS4 should have been named the PS3.5. From what I've seen, it basically provides the same graphical capabilities as the PS3; instead of being (mostly) limited to 720p, you can now play at 1080p.


----------



## HiTechPixel

Quote:


> Originally Posted by *Mwarren*
> 
> PS4 should have been named PS3.5 from what I've seen it basically provides the same graphical capability's of the PS3 instead of being (mostly) limited to 720P you can now play at 1080P.


Are you trolling? Or are you being dumb?


----------



## Mwarren

Quote:


> Originally Posted by *HiTechPixel*
> 
> Are you trolling? Or are you being dumb?


Like I said, the differences are minor. That is not a big resolution or graphical jump.

The jump from PS2 to PS3 was huge; PS3 to PS4 is small.

I have a PS4 and am not biased, but the truth of the matter is that the PS4 is not a significant upgrade over the PS3 compared to the previous upgrades (e.g., PS1 to PS2 and PS2 to PS3).


----------



## alawadhi3000

Quote:


> Originally Posted by *Mwarren*
> 
> Like I said, the differences are minor. That is not a big resolution nor graphical jump.
> 
> The jump from PS2 to PS3 was huge.......PS3 to PS4 is small.
> 
> I have a PS4 and am not biased but the truth of the matter is that the PS4 is not a significant upgrade over the PS3 when compared to the previous upgrades (In example PS1 to PS2 and PS2 to PS3.)


Early PS3 games like Resistance didn't look good either; give it time and you'll notice the difference between the PS3 and the PS4.


----------



## HiTechPixel

Quote:


> Originally Posted by *Mwarren*
> 
> Like I said, the differences are minor. That is not a big resolution nor graphical jump.
> 
> The jump from PS2 to PS3 was huge.......PS3 to PS4 is small.
> 
> I have a PS4 and am not biased but the truth of the matter is that the PS4 is not a significant upgrade over the PS3 when compared to the previous upgrades (In example PS1 to PS2 and PS2 to PS3.)


Then you don't know anything about what graphics are.


----------



## mothrpe

A more accurate comparison would be launch PS3 games vs. launch PS4 games, not end-of-cycle PS3 games vs. PS4 launch games. It looks like a pretty good jump to me.


----------



## Cyberdot

PS4's GPU is much better than the one in PS3, and the GPU is the main factor in delivering performance in games.


----------

