# [pcper] PS4 and Xbox One Already Hitting a Performance Wall



## Master__Shake

next gen...

just squint your eyes they look really good.

hello 720p goodbye 1080p


----------



## Imglidinhere

So... devs are finally saying what the tech nerds were saying BEFORE the console was released? Huh... go figure...


----------



## Remij

I knew this was coming, but you can't really blame the console manufacturers; they need to profit from these consoles early on to recoup their losses over the course of the gen and actually make money.

That said, these consoles haven't come close to hitting their stride. All it will take is one out-of-the-box developer to show the rest they're wrong, and new advancements will be made.


----------



## sugarhell

Just one word. Ubisoft.


----------



## axizor

Quote:


> Originally Posted by *Master__Shake*
> 
> next gen...
> 
> just squint your eyes they look really good.
> 
> hello 720p goodbye 1080p


Who plays console games for the visuals?


----------



## Clocknut

What I fail to understand is: why make the PS4 cost ~$370 to build instead of $470 (and sell at the usual $499)?

Putting that extra $100 into the APU alone could still make a significant difference in performance.


----------



## JonHarris

Lies.


----------



## Azuredragon1

Quote:


> Originally Posted by *sugarhell*
> 
> Just one word. Ubisoft.


^^^what he said.


----------



## bucdan

So Ubisoft is claiming that they fully utilized all of the weak Jaguar cores and the GPU? I highly doubt it. I wonder if Valve will try to prove them wrong in their upcoming games, whenever that may be.


----------



## perfectblade

this is just silly. some games released for ps4 have been 1080p vs 720p on xb1. not to mention that it is too early to be making broad statements about optimization. just because they failed to optimize doesn't mean others can't/won't. the differences in already-released games between ps4 and xb1 speak for themselves. even if zero progress occurs in optimization, the ps4 is clearly substantially more powerful, way beyond 1-2 fps. looks like an ms hit piece (anonymous dev paid by ms).

actually, it's probably just ubisoft making amends to ms for their earlier honesty on the parity clause


----------



## Kuivamaa

Quote:


> Originally Posted by *sugarhell*
> 
> Just one word. Ubisoft.


Haha yeah, can't wait to revisit this newspost in 2 years when much more graphically advanced console games will be out.


----------



## StreekG

Quote:


> Originally Posted by *Clocknut*
> 
> What I fail to understand is: why make the PS4 cost ~$370 to build instead of $470 (and sell at the usual $499)?
> 
> Putting that extra $100 into the APU alone could still make a significant difference in performance.


I agree. If this started at PS3 prices with a bump up in the hardware specs, it would still sell very well and we would actually have 1080p games.
I was playing The Evil Within on PS4 the other day and it dips below 30fps in some open areas. Not a good experience at all; I'm scared to see how it performs on the Xbone.


----------



## SlyFox

As a former console gamer, I'm really happy I went with PC this gen. I'll re-evaluate when the Xbox Two is released.


----------



## KingGreasy

At launch there were already Ryse and Killzone Shadow Fall. There were Forza 5 and NBA 2K14, then Infamous Second Son. Those launch/early games looked significantly better than previous-gen games. The launch games back with the 360/PS3 did not impress me nearly as much as the PS4 and Xbox One have. There probably will be improvements, but the bar was set high right out of the gate compared to last gen. People like to show comparisons like Perfect Dark Zero versus Halo 4. I'm not so confident the growth from Killzone Shadow Fall will be as dramatic.

It's whatever though. I have a PC. I'll buy a new $250+ graphics card next winter to be my graphics meal. Maybe even buy an Oculus Rift if the consumer version is out. I'm not going to pin my desire to be wowed on console graphics. Give me Gran Turismo, those Naughty Dog adventure games, and some JRPGs and we're good.


----------



## Pip Boy

Quote:


> Originally Posted by *SlyFox*
> 
> As a former console gamer, I'm really happy I went with PC this gen. I'll re-evaluate again when the Xbox Two is released


What, the entirely cloud-computing-based Xbox 2 that can be streamed to your smartphone or TV?

At least it will always be online... because it has to be.


----------



## Lanlan

Man, I sure can't wait to play my Smash Bros on Wii U in 1080p and at 60fps.


----------



## Pip Boy

Quote:


> Originally Posted by *KingGreasy*
> 
> At launch there were already Ryse and Killzone Shadow Fall. There were Forza 5 and NBA 2K14, then Infamous Second Son. Those launch/early games looked significantly better than previous-gen games. The launch games back with the 360/PS3 did not impress me nearly as much as the PS4 and Xbox One have. There probably will be improvements, but the bar was set high right out of the gate compared to last gen. People like to show comparisons like Perfect Dark Zero versus Halo 4. I'm not so confident the growth from Killzone Shadow Fall will be as dramatic.


True. People should switch their PS3 back on and have a look at the huge leap to PC-level graphics, albeit PC-level graphics at 900p/30fps. Previously they were firmly rooted in console land, mostly plagued by hideous textures, pop-in, and no AA at 680p, 27-30fps.

Did they just peak too soon? Or did they leave it too long between generations, while mobile phones started to look crisper with their high PPI and PC gamers had been showing off much better graphics for years before this gen released? I.e., had people's expectations out of the gate been as high or higher than what this new console generation can actually deliver?

By the middle to end of next year, 4K gaming is going to start being more plausible on PC, and UWQHD (1440p 21:9) is already the new PC standard.


----------



## TFL Replica

Perhaps the XB1 and PS4 performed similarly because their game is heavily CPU-bound? The PS4's main advantage is in its GPU.
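This point can be illustrated with a toy bottleneck model: if the CPU and GPU work on a frame largely in parallel, frame time is roughly the max of the two, so a CPU-bound game shows almost no benefit from a stronger GPU. A minimal sketch with made-up numbers (purely illustrative, not real console measurements):

```python
# Toy frame-time model: CPU and GPU work largely in parallel, so a
# frame takes roughly max(cpu_ms, gpu_ms). Numbers below are made up
# purely for illustration; they are not real console measurements.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second under the max(CPU, GPU) bottleneck model."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical CPU-bound game: both consoles have the same weak CPU,
# so a stronger GPU barely changes the frame rate.
print(round(fps(cpu_ms=33.0, gpu_ms=25.0), 1))  # XB1-like → 30.3
print(round(fps(cpu_ms=33.0, gpu_ms=18.0), 1))  # PS4-like → 30.3
```

Under this model, the PS4's GPU advantage only shows up once the workload is GPU-bound (e.g., at higher resolutions).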


----------



## FattysGoneWild

It takes years to develop a new console; I think Sony said it took 5 years for the PS4 this time. Final specs are locked way before release; they cannot go back and forth all the time. People forget they have to build the whole thing for a certain price. The article mentions a GTX 760 for smooth 1080p gameplay. Sure, that is nice and a benefit for PC, but consoles cannot do that. You can't upgrade them, and they are not going to put in a $200 GPU alone. You are dreaming; they have the rest of the machine to build. The leap is just not there this time. Xbox to 360, or PS2 to PS3, now those were leaps in specs alone. These new consoles are between last gen and next; they are far from next gen imo. It seems like they are just now getting to a point where they can push all games in actual HD: 720p, 900p, 1080p. The console specs were leaked, and we knew what was going to happen. I don't see a long span this time. 3-4 years max, maybe?


----------



## GameBoy

Yeah, and Epic said Gears of War 1 maxed out the 360 back in 2006. Like *every single previous gen ever*, the developers will learn how to optimize better and games will look better as the years go by.


----------



## AcEsSalvation

Quote:


> we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with


You mean the hardware they have been making terrible ports to in the past?
Quote:


> Remember, there is almost no "porting" going on here: the Xbox One and Playstation 4 share the same architecture as the PC now.


And now they need to recode and rebuild the engine... but why would they? They are going to get a small fortune in sales anyway.
EDIT:
Quote:


> Originally Posted by *GameBoy*
> 
> Yeah, and Epic said Gears of War 1 maxed out the 360 back in 2006. Like *every single previous gen ever*, the developers will learn how to optimize better and games will look better as the years go by.


I'll believe the consoles are maxed out when other developers come out and say it. For now, I'm going to remember the difference between Oblivion and Skyrim. I'm with you there.


----------



## Dyson Poindexter

I still fail to see where all this "untapped potential" is going to come from. There's no getting away from the Xbone having a slow 7770 for the GPU and certainly less CPU performance than a 2500k.


----------



## FattysGoneWild

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> I still fail to see where all this "untapped potential" is going to come from. There's no getting away from the Xbone having a slow 7770 for the GPU and certainly less CPU performance than a 2500k.


It's insulting to even mention a 2500k. The consoles are not on the same level or even remotely close; the 2500k destroys them in every single way.


----------



## sugarhell

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> I still fail to see where all this "untapped potential" is going to come from. There's no getting away from the Xbone having a slow 7770 for the GPU and certainly less CPU performance than a 2500k.


If you read the Infamous dev notes about the PS4, they said they hit a wall because they were inexperienced, and they also didn't use all the features because they didn't want to take risks during development.


----------



## PostalTwinkie

Quote:


> Originally Posted by *GameBoy*
> 
> Yeah, and Epic said Gears of War 1 maxed out the 360 back in 2006. Like *every single previous gen ever*, the developers will learn how to optimize better and games will look better as the years go by.


The problem with your statement is that previously, developers had to learn new custom architectures for the various consoles. There was a very steep, long learning curve to those custom environments, and it took years to understand and master them.

That isn't how things are anymore. The new consoles are x86 environments, and the learning curve isn't as steep or long. Frankly, the new consoles are just small-form-factor PCs with a custom OS; they are going to be almost as bound as a PC using the same hardware. The only advantage a console has is slightly better optimization, as the OS itself can be tailored to one very specific hardware set.

Anyone who doesn't believe the new consoles are very limited just hasn't been paying attention. We called this problem well ahead of the launch, when the specs were just rumors.


----------



## Clocknut

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> I still fail to see where all this "untapped potential" is going to come from. There's no getting away from the Xbone having a slow 7770 for the GPU and certainly less CPU performance than a 2500k.


Exactly. We are on x86 + GCN; these architectures are more similar to PC than ever. Last gen's PowerPC architecture was something many developers were not familiar with; that's why we saw huge optimization gains once those developers learned how to optimize for it over the years.


----------



## AgentHydra

I think they can still get more out of the consoles. I mean, look at the difference between an early 360 game and a more recent 360 game, like Mass Effect compared to Mass Effect 3. The difference is mind-boggling.

We probably won't see a massive graphical improvement like that anymore, but with time I think they will still be able to make improvements. I have my doubts about leaving behind the 30FPS standard though, as the majority of console gamers probably couldn't care less.

edit: and while these are x86 and a more familiar environment than the 360 was, there's still (hopefully) some untapped potential that Mantle and DX12 will enable. I think it's too soon to write them off.


----------



## PostalTwinkie

Quote:


> Originally Posted by *AgentHydra*
> 
> I think they can still get more out of the consoles, I mean look at the difference between an early 360 game and a more recent 360 game, Mass Effect compared to Mass Effect 3. The difference is mind boggling.
> 
> We probably won't see a massive graphical improvement like that anymore, but with time I think they will still be able to make improvements. I have my doubts about leaving behind the 30FPS standard though, as the majority of console gamers probably couldn't care less.


See the reply directly above yours, and mine a few above that.


----------



## baalbelphegor

Quote:


> Originally Posted by *Master__Shake*
> 
> next gen...
> 
> just squint your eyes they look really good.
> 
> hello 720p goodbye 1080p


I laughed so hard. I really hope they coded Unity well for PC, but I get the feeling they didn't.


----------



## perfectblade

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The problem with your statement is that before the developer had to learn new custom architectures for the various consoles. There was a very steep and long learning curve to those very custom environments, that took years to understand and master.
> 
> That isn't how things are anymore, the new consoles are x86 environments, and the learning curve isn't as steep or long. Frankly, the new consoles are just small form factor PCs with a custom OS; they are going to be almost as bound as a PC using the same hardware. The only advantage the console has is around slightly better optimization, as the OS itself can be tailored to one very specific hardware set.
> 
> Anyone who doesn't believe the new consoles are very limited just hasn't been paying attention. We called this problem well ahead of the launch, when the specs were just rumors.


if x86 is so simple, why are most pc games so poorly optimized or full of major issues at release? granted, you are working with fixed hardware on consoles, but still. if anything i hope these consoles lead to better pc games as well


----------



## Pnanasnoic

So now that the devs realize that they have a finite amount of horsepower to work with they will be forced to be more creative in developing games, not just squat out Randumb Copycat 3 with newer/ish graphics over and over and over and...


----------



## PostalTwinkie

Quote:


> Originally Posted by *perfectblade*
> 
> if x86 is so simple, *why are most pc games so poorly optimized or just have major issues at release?* granted you are working with limited hardware with consoles, but still. if anything i hope these consoles lead to better pc games as well


Pretty broad brush you are making that stroke with.

While we do have some horribly optimized games, not all games are. However, the real answer to your question has two parts.

One part is how vastly different computers are from one person to the next, even when the hardware is identical, due to software differences. The other part is simply a matter of money.

It costs money to optimize a game; the longer they spend optimizing, the more it costs them. They want the game out to make money; they are businesses. The longer they keep the game in development, the less they are making on it. And with the proliferation of data connections, it is now possible for them to release a product now and fix it later.


----------



## mtcn77

"The PS4 couldn't do 1080p 30fps for our game".
Rather, it should be: "*our game* couldn't do 1080p 30fps for _The PS4_"


----------



## GameBoy

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The problem with your statement is that before the developer had to learn new custom architectures for the various consoles. There was a very steep and long learning curve to those very custom environments, that took years to understand and master.
> 
> That isn't how things are anymore, the new consoles are x86 environments, and the learning curve isn't as steep or long. Frankly, the new consoles are just small form factor PCs with a custom OS; they are going to be almost as bound as a PC using the same hardware. The only advantage the console has is around slightly better optimization, as the OS itself can be tailored to one very specific hardware set.
> 
> Anyone who doesn't believe the new consoles are very limited just hasn't been paying attention. We called this problem well ahead of the launch, when the specs were just rumors.


The original Xbox was x86 with a PC GPU, and you can see the same progression in visuals. The PS3/360 both used custom PC GPUs, like these consoles do. What's your point?


----------



## mtcn77

Ubifella, you don't put an SoC in a console to integrate features based on the best case of their additive performance; these aren't the discrete components people make them out to be. You develop from the ground up to leverage the interactivity it brings to the CPU/GPU mutualism; the advantage is in settings that wouldn't make either of them the sole limiting step.


----------



## MaadOCwanted

too soon....

I'm done being a console fanboy.

witcher 3 for PC!


----------



## Paladin Goo

I'll reserve judgment on how "utilized" these consoles can be until Rockstar develops a full-fledged game on them. Look at the tour de force they pulled off with the last-gen releases of GTA V! Even if you didn't like the game, you have to admit it was a technical marvel for the old consoles.


----------



## Boomer1990

I will wait to see what Naughty Dog can do before I believe Ubisoft, who seem to constantly lie to both the console side and the PC side.


----------



## BizzareRide

Quote:


> Originally Posted by *sugarhell*
> 
> Just one word. Ubisoft.


I came here to post this. This is damage control from the Unity backlash...

If the consoles have peaked, why do better-looking games exist for both? Ryse and Quantum Break come to mind for X1; Infamous, The Order, and Uncharted 4 come to mind for PS4.

Their 'everything in the pipeline' approach seems to hinder performance. They should have streamed everything.


----------



## ZealotKi11er

They are approaching these consoles like PCs, but they are not. They are getting numbers within a single-digit percentage of equivalent PC hardware, considering how much they can optimize them. Fully utilizing and fully optimizing are totally different things. Nintendo optimizes their games; Ubisoft does nothing.


----------



## aroc91

Another thing that may be a factor in comparing the progression of the previous gen and this gen is that the Cell and the 360's Xenon were ahead of their time when they released. When the 360 came out with its 3.2GHz triple core, Intel's best offering was the 3.2GHz dual core Pentium D 840. This generation of consoles didn't start a step ahead of current PC like the last one did.


----------



## mtcn77

Quote:


> Originally Posted by *aroc91*
> 
> Another thing that may be a factor in comparing the progression of the previous gen and this gen is that the Cell and the 360's Xenon were ahead of their time when they released. When the 360 came out with its 3.2GHz triple core, Intel's best offering was the 3.2GHz dual core Pentium D 840. This generation of consoles didn't start a step ahead of current PC like the last one did.


Eight CPU cores that can turbo up to 2.7 GHz, with a 7870-class GPU and 8 GB of GDDR5, all under 175 watts. Considering it is AMD's proposition, it certainly is a statement at that power budget, imo.


----------



## IRO-Bot

Quote:


> Originally Posted by *axizor*
> 
> Who plays console games for the visuals?


So WiiU wins!!!!


----------



## Booty Warrior

Quote:


> Originally Posted by *GameBoy*
> 
> The original XBOX was x86/PC GPU and you can see the same progression in visuals.


And the original Xbox launched almost 13 years ago. The PC gaming market expanded by leaps and bounds in that time, and big-name developers became intimately familiar with coding for x86 platforms. Aside from their unified memory architectures, these consoles bring little new to the table, unlike last gen with the PowerPC-based Cell and Xenon CPUs, which took years to learn how to use properly.


----------



## PostalTwinkie

Quote:


> Originally Posted by *GameBoy*
> 
> The original XBOX was x86/PC GPU and you can see the same progression in visuals. The PS3/360 both used custom PC GPU's like these consoles do. What's your point?


Your comparison to the original Xbox isn't really valid, and the points it does bring up actually affirm what I have said.

The thing about the Xbox is that it used a custom Pentium III, more specifically a processor very similar to the mobile Celerons. More importantly, it used the newer Coppermine-based Pentium III, which was significantly different from the processors before it.

A huge difference was its L2 cache. While previous Pentium chips had L2 cache, that cache ran slower than the processor clock speed. With Coppermine that changed: the L2 cache began operating at the same frequency as the processor itself, resulting in an increase in performance. Developers who were previously used to being held back by the slower L2 had to adjust to the newer L2, translating into better games later as they became familiar with the hardware.

The other hugely important difference is that the original Xbox used DirectX 8.0, which brought with it Direct3D 8.0 and an entirely new feature set and way of programming. Essentially, it was a fresh start for a lot of things when it came to programming, with a learning curve of its own. DirectX/Direct3D 8 was a major change for the API, not just an incremental upgrade.

In other words, it was a new environment for developers!

That isn't the case with the newest consoles. They are x86, like the original Xbox, but they are using widely distributed hardware that developers have previous experience with. The DirectX/Direct3D they are using isn't new to them either, so there isn't a steep learning curve. There aren't a ton of hidden tricks and ways of doing things they have to learn, because the hardware is old hardware and the API has been around for years as well.


----------



## Jedi Mind Trick

Idk why people keep ragging on the consoles. This is Ubisoft; they can't even make properly optimized PC games (well, can't say much for the AC games, but WD was terrible!).


----------



## NrGx

I think game manufacturers will come up with interesting ways to reduce the load on consoles. Just reducing draw distance can have a huge effect on resources required. Think of games like Bayonetta 2 which have environments that are small and instanced, but feel a part of a bigger world.
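The draw-distance point boils down to simple distance culling: anything beyond a cutoff is skipped entirely, so the renderer never pays for it. A minimal sketch (hypothetical scene data and field names, not from any real engine):

```python
import math

# Minimal draw-distance culling sketch: objects beyond max_draw_dist
# are skipped entirely, one of the cheapest ways to cut per-frame
# work on fixed hardware. The scene data here is hypothetical.

def visible(objects, camera, max_draw_dist):
    """Return only the objects within max_draw_dist of the camera."""
    return [
        o for o in objects
        if math.dist(camera, o["pos"]) <= max_draw_dist
    ]

scene = [
    {"name": "tree",  "pos": (10.0, 0.0, 5.0)},
    {"name": "tower", "pos": (400.0, 0.0, 90.0)},
]
print([o["name"] for o in visible(scene, (0.0, 0.0, 0.0), 100.0)])
# → ['tree']
```

Instanced, small environments like the Bayonetta 2 example take the same idea further: the world outside the current arena simply doesn't exist, so there is nothing to cull in the first place.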


----------



## ZealotKi11er

I don't know why people think the PS3 and Xbox 360 were powerful.

The PS3 had a 256MB GPU and 256MB of RAM. A PC at that time had an 8800 GTX with 768MB and 2GB of RAM.

With the Core 2 Duo out, PC was already 2-3x faster than last gen before it even started.

The Xbox 360 was a fireball until two years later, once they fixed the RROD.

Ubisoft is stupid. There is no other word. Any company can push graphics. They are trying so hard to differentiate their games because they release the same thing every 6 months.
Quote:


> Originally Posted by *PostalTwinkie*
> 
> Your comparison to the original XBox isn't really valid, and what points it does bring up actually affirm what I have said.
> 
> The thing about the XBox is it used a custom Pentium III, more specifically a process very similar to the mobile Celerons. More importantly, it used the newer Coppermine based Pentium III, which was significantly different than the processors before it.
> 
> A huge difference was its L2 cache. While previous Pentium chips had L2 cache, that cache ran slower than the processor clock speed. With Coppermine that changed: the L2 cache began operating at the same frequency as the processor itself, resulting in an increase in performance. Developers who were previously used to being held back by the slower L2 had to adjust to the newer L2, translating into better games later as they became familiar with the hardware.
> 
> The other hugely important difference is that the original XBox used DirectX8.0, which brought with it Direct3D 8.0 - and an entirely new feature set and way of programming. Essentially, it was a fresh start for a lot of things when it came to programming, having a learning curve to it. DirectX/Direct3D 8 was a major change for the API, and not just an incremental upgrade.
> 
> In other words, it was a new environment for developers!
> 
> That isn't the case with the newest consoles. They are x86, like the original XBox, but they are using well distributed hardware that they have previous experience with. The DirectX/Direct3D they are using isn't new to them either, and therefore they don't have a steep learning curve. There aren't a ton of hidden tricks and ways of doing things they have to learn. Because the hardware they have is old hardware and the API they are using has been around for years as well.


Where did Ubisoft learn? They never bother to optimize PC games. The APIs are quite different even though they use similar architectures. Where do you see Ubisoft optimizing for GCN and OpenGL?


----------



## PostalTwinkie

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I don't know why people think the PS3 and Xbox 360 were powerful.
> Where did Ubisoft learn? They never bother to optimize PC games. The APIs are quite different even though they use similar architectures. Where do you see Ubisoft optimizing for GCN and OpenGL?


I know it is fun to poke at Ubisoft, and just about every major company, over optimization issues, but let's be honest here. The engineers behind their products are perfectly capable people, and are as qualified as anyone to optimize a game and discuss it. Just because the talking heads of the company do things that make us mad doesn't mean the guys slamming on the keyboards don't know what they are doing.

Not optimizing at all != Optimizing within the time frame given to you (which is what the guy on the floor working is doing).


----------



## perfectblade

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Not optimizing at all != Optimizing within the time frame given to you (which is what the guy on the floor working is doing).


the thing is, where's the concrete evidence that optimizing for x86 has been developed and is well understood?

i remember an article a while back saying x86 optimization could be 10x better with better support for OpenGL. so where are the improvements?


----------



## ZealotKi11er


Quote:


> Originally Posted by *PostalTwinkie*
> 
> I know it is fun to poke at Ubisoft, and just about every major company, on optimization issues, but let's be honest here. The engineers behind their products are perfectly capable people, and are just as qualified as anyone to optimize a game and discuss it as well. Just because the talking heads of the company do things that make us mad, doesn't mean the guys slamming on the keyboard making it don't know what they are doing.
> 
> Not optimizing at all != Optimizing within the time frame given to you (which is what the guy on the floor working is doing).


Ubisoft's time window is too small. Throwing more workers at it will not improve game quality; you've got to have talent and patience. They release way too many games on too many platforms. EA is the same way, but they are more focused on PC because BF is mostly PC. When I played GTA V I was blown away by how good the game looked on such old hardware. That is what I call optimization and full utilization. I am sorry, but considering how poorly Ubisoft games run on PC, I blame them. Current gen might not be as strong as PC, but you are forgetting that high-end PCs have been targeting 1440p+ resolutions for years now. All consoles have to achieve is 1080p. They are so much more powerful than last gen, yet they have not made much progress.


----------



## PostalTwinkie

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Ubisoft's time window is too small. Throwing more workers at it will not improve game quality; you've got to have talent and patience. They release way too many games on too many platforms. EA is the same way, but they are more focused on PC because BF is mostly PC. When I played GTA V I was blown away by how good the game looked on such old hardware. That is what I call optimization and full utilization. I am sorry, but considering how poorly Ubisoft games run on PC, I blame them. Current gen might not be as strong as PC, but you are forgetting that high-end PCs have been targeting 1440p+ resolutions for years now. All consoles have to achieve is 1080p. They are so much more powerful than last gen, yet they have not made much progress.


I agree 1000% with what you are saying here.

Ubi does release games too damn fast, and so do all the major studios. Way back in the 80s and 90s, we would wait a few years between releases of games in the same franchise, sometimes only getting one game per console in a franchise.

Here we are now, and ACU is like the 15th release in the series? That's just insane.

EDIT: The point I am getting at is that the new consoles will not have the legs of the older consoles. There isn't some level of proficiency these guys are going to obtain with these consoles after a few years, because they effectively have had them for a few years.

I will be shocked as hell if they stretch these consoles past 4 years.


----------



## djriful

Even if they manage to pull off 60fps again, resolution is still limited. I do not see much changing even with the DX12 and OpenGL Next updates on those consoles. On PC we are moving to 4K, and I am still stuck at UHD.


----------



## Twinnuke

I can handle about 300 AI agents fighting, each with up to 8 cannons with AI of their own, in the game I'm making on my i7. I think the XB1 and PS4 are slower than my i7 @ 3.6GHz.


----------



## L D4WG

I find this hard to believe, the amount of gains they squeezed out of the 360 and the PS3 in the last 24 months, compare The Last of Us and Halo 4 to the games released during the launch year of last gen.

I think these guys are talking out their rear ends.

Get some decent exclusives developed only for a single current-gen console, with no backwards compatibility, and you will see improvements. These guys are lazy.


----------



## mtcn77

This was in 1985.

Colours were uncompromised back then.

When you saw the sea, the forest; even a palette of 64 colours at 256*192 pixels resolution that fit a row of maybe 25 sprite blocks held more elegance confined in its aesthetics than the disposables today.


----------



## aberrero

If they are cpu limited on AI then why doesn't the PS4 have a higher resolution than the XO, and better AA, and whatever else can be offloaded to the GPU with almost no CPU overhead?


----------



## DarkPhoenix

SOOoooooooo......
What they are saying is that they can't program an efficient AI, or even fall back to a pseudo-random AI tree that is convincing?


----------



## BinaryDemon

I think the developers' statements really only apply to an advanced open-world game using the AnvilNext engine. We've already seen other developers create games with high visual fidelity and hit the 1080p/60fps standard we all desperately want.

Obviously there will always be compromises but I think they should continue to strive for 1080p and/or 60fps and not just set the bar at 720p/30fps and crank out quick un-optimized crap.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I agree 1000% with what you are saying here.
> 
> Ubi does release games too damn fast, so do all the major studios. Way back in the 80s and 90s we would wait a few years between releases of games of the same franchise, sometimes only getting one game per console in a franchise.
> 
> Here we are now, and ACU is like the 15th release in the series? Something just insane.
> 
> EDIT: The point I am getting at is that the new consoles will not have the legs of the older consoles. There isn't some level of proficiency these guys are going to obtain with these consoles after a few years, because they effectively have had them for a few years.
> 
> I will be shocked as hell if they stretch these consoles past 4 years.


A lot of people say that they tried to save money but 360 core was $299 and the other version $399.

I don't think graphics are what matters here. If these developers have a vision for a game that requires much more powerful hardware, then they can make it for PC. It's not about how much better they can make the games for PS4/One; it's what they can do beyond what they did for last gen. I think that's the main differentiation. They are just throwing numbers at these games.


----------



## BigMack70

720p is going to become the final destination for console resolution


----------



## ]\/[EGADET]-[

Dirty console peasants...


----------



## ILoveHighDPI

Quote:


> we are less than a year into the life cycle of hardware that was planned for an 8-10 year window


I get frustrated when people say things like this.

Console lifecycles should be five years, no more. We used to say "we need a longer console lifecycle because it takes nearly five years just to figure out the hardware", and now they're saying it takes about a year to figure out the hardware, but we still want an eight year lifecycle?

No, if anything we can shorten the lifecycle to four years instead of five, because apparently that still gives three years of running your console at peak performance.

Or game companies could just stop being dumb (letting marketers make technical and gameplay decisions) and stop wasting resources on SSAO and other ugly effects, and just give us a clean looking game at 1080p and 60fps (seriously, they're wasting half the CPU power on lighting? Good grief).


----------



## huzzug

I see 2 possibilities with this statement:
1. Console gamers get unoptimized games from now on, as both versions are built down to the One. Or
2. Ubisoft continues to put out unoptimized games while blaming the consoles for its own behavior.

We as PC gamers have nothing to fear, as we would still get the same treatment that we have become so accustomed to in the past from UnoptimizedSoft.


----------



## th3illusiveman

Microsoft are gonna regret skimping out on those 384 cores lol. I have no idea why on earth they thought that was a good idea when the PS4 GPU is just barely adequate.


----------



## jmcosta

I find it funny, these two consoles trying to be PCs and preferring graphics over frame rate... (not talking about this one, but most recent games)
lol
At least the Wii U doesn't "downgrade" the gaming quality, and it's a decent console to play with family or friends.


----------



## MacG32

So it's time for a crowd funded game console that's actually a recent specs PC that devs can code for and not waste countless time "optimizing" for already outdated AMD systems? Will real consoles be released next year? Will PCs just overtake consoles by leaps and bounds this holiday season? Will Steam celebrate this with the best holiday sale to date? I guess we'll see...lol









Edit: I loved my cartridge based gaming systems of the past. What fun and excitement.


----------



## HarrisLam

Judging from what game devs achieved with the last generation of consoles compared to the available tech on the PC side, I can't help thinking that this is another nonsense whine from lazy devs. They've been squeezing juice out of a desert for years, and now complain when an entire lake of water was handed to them just months ago. Get the programming together, seriously.


----------



## Cyro999

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> I still fail to see where all this "untapped potential" is going to come from. There's no getting away from the Xbone having a slow 7770 for the GPU and certainly less CPU performance than a 2500k.


Less performance than a 2500k? That implies it's on the same planet or even in the same solar system as a stock intel mainstream desktop cpu from almost four years ago


----------



## kennyparker1337

Quote:


> Originally Posted by *]\/[EGADET]-[*
> 
> Dirty console peasants...






Quote:


> Originally Posted by *BigMack70*
> 
> 720p is going to become the final destination for console resolution


Considering that was the final destination for a near decade-old console, I seriously doubt 720p is going to be the staple of current-gen consoles.
I'm heavily betting most games - and I mean throughout the next 10 years - are going to be 1080p while the industry at large moves to 4K.
4K is a much larger, more contrasting jump, but that seems to be the current trend - moving to 4K from 1080p instead of something smaller like 1440p.
Quote:


> Originally Posted by *HarrisLam*
> 
> Judging from what game devs achieved with the last generation of consoles compared to the available tech on the PC side, I can't help thinking that this is another nonsense whine from lazy devs. They've been squeezing juice out of a desert for years, and now complain when an entire lake of water was handed to them just months ago. Get the programming together, seriously.


I agree.
Quote:


> Originally Posted by *MacG32*
> 
> So it's time for a crowd funded game console that's actually a recent specs PC that devs can code for and not waste countless time "optimizing" for already outdated AMD systems? Will real consoles be released next year? Will PCs just overtake consoles by leaps and bounds this holiday season? Will Steam celebrate this with the best holiday sale to date? I guess we'll see...lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: I loved my cartridge based gaming systems of the past. What fun and excitement.


PC will never overtake consoles and consoles will never overtake PC.
They are not in the same category. They aren't competing.

Cost: Many people's GPUs, which they replace every 6 months, cost more than a whole console.
Development: Coders can code directly to the hardware on consoles because the hardware doesn't change.
Usability: A console game just "works". You don't have to mess with settings or properties.
Average User: Most people want to buy 1 item and have it work. They don't want to mess with putting together several parts or researching anything.

There are pros and cons to each.
A PC would not make a console user happy.
Just like a console would not make a PC user happy.
Quote:


> Originally Posted by *th3illusiveman*
> 
> Microsoft are gonna regret skimping out on those 384 cores lol. I have no idea why on earth they thought that was a good idea when the PS4 GPU is just barely adequate.


*Unless I'm mistaken* the 360 had worse hardware than the PS3 and most games looked worse on the PS3.
Quote:


> Originally Posted by *jmcosta*
> 
> i find it funny these two consoles trying to be pc, they prefer graphics over frame rate...(not talking about this one but most recent games)
> lol
> at least wii u doesn't "downgrade" the gaming quality and its a decent console to play with family or friends.


In the console market it _is_ all about graphics and not FPS.
This is because everyone has the same hardware and gets the same FPS.
People won't be complaining someone can play a better game than them.
They _will_ be complaining if the game looks like crap.


----------



## kingduqc

No wonder, they got an AMD tablet cpu clocked at pentium 3 clock speed...


----------



## Clocknut

Quote:


> Originally Posted by *jmcosta*
> 
> i find it funny these two consoles trying to be pc, they prefer graphics over frame rate...(not talking about this one but most recent games)
> lol
> at least wii u doesn't "downgrade" the gaming quality and its a decent console to play with family or friends.


I am actually happy that they go for graphics over resolution/fps, because when a game is ported over to PC, that's where we get its full potential.

Imagine what kind of textures/graphics they would have to sacrifice in order to run 1080p at 60fps.


----------



## raidmaxGuy

I honestly get the feeling that Valve is lying in wait to crush M$ and Sony. Haven't heard a lot of word on the Steam machine lately though....

Unrelated to the point:

Why in the heck is a minimum of 500GB hard drive needed for SteamOS? Are the partitions predefined? The recommended drive size is 1TB.


----------



## jmcosta

Quote:


> Originally Posted by *Clocknut*
> 
> I am actually happy that they go for graphic over resolution/fps. because when ported over to PC thats where we get its full potential.
> 
> Imaging what kind of texture/graphic they gonna sacrifice in order to run 1080p 60fps.


Well, they already downgrade the graphics "optimization": the view distance, low textures, no AA, uncompressed audio (because of the CPU), limited world size...
OK, you are right, it would look like the PS3.


----------



## jsc1973

Quote:


> Originally Posted by *mtcn77*
> 
> This was in 1985.
> 
> Colours were uncompromised back then.
> 
> When you saw the sea, the forest; even a palette of 64 colours at 256*192 pixels resolution that fit a row of maybe 25 sprite blocks held more elegance confined in its aesthetics than the disposables today.


Those were the days. I wonder what percentage of people on here have been around long enough to have played consoles in 1985? I can remember playing on my uncle's Atari VCS (later called the 2600) in 1979. Today, I can play those same games on an emulator, using a PC that must be several orders of magnitude more powerful.
Quote:


> Originally Posted by *th3illusiveman*
> 
> Microsoft are gonna regret skimping out on those 384 cores lol. I have no idea why on earth they thought that was a good idea when the PS4 GPU is just barely adequate.


Because they figured that devs would just develop games to the lowest common denominator, which seems to be what is happening in most cases.
Quote:


> Originally Posted by *kingduqc*
> 
> No wonder, they got an AMD tablet cpu clocked at pentium 3 clock speed...


Actually more like an Athlon 64 X8, and way above anything a P3 could do, but not a powerhouse even if you max out all of the cores. With that said, AMD gave them what they asked for; they could have designed a custom APU any number of different ways. It could have been made with Stars cores if Sony or MS had been willing to put more into cooling it, or even 15h cores, although they'd have been nuts to ask for that.


----------



## th3illusiveman

Quote:


> Originally Posted by *kennyparker1337*
> 
> 
> 
> 
> 
> Considering that was the final destination for a near decade old console I seriously doubt 720p is going to be the staple of current gen consoles.
> I'm heavily betting most games - and I mean through out the next 10 years - are going to be 1080p when the industry has largely moved to 4K.
> 4K is much larger jump and more contrasting but that seems to be the current trend - moving to 4K from 1080p instead of something smaller like 1440p.
> I agree.
> PC will never overtake consoles and consoles will never overtake PC.
> They are not in the same category. They aren't competing.
> 
> Cost: Most peoples GPU they have to replace every 6 months cost more than one console.
> Development: Coders can code directly to hardware on consoles because the hardware doesn't change.
> Usability: A console game just "works". You don't have to mess with settings or properties.
> Average User: Most people wan't to buy 1 item and have it work. They don't want to mess with trying to put together several parts or researching anything.
> 
> Their pros and cons for each item.
> A PC would not make a console user happy.
> Just like a console would not make a PC user happy.
> *Unless I'm mistaken* the 360 had worse hardware than the PS3 and most games looked worse on the PS3.
> In the console market it _is_ all about graphics and not FPS.
> This is because everyone has the same hardware and gets the same FPS.
> People won't be complaining someone can play a better game than them.
> They _will_ be complaining if the game looks like crap.


360 games looked better because its arch was closer to PC than the PS3's, and devs could program for it more easily. Now they are pretty much on an even playing field, and M$ is at a serious performance deficit.


----------



## PostalTwinkie

Quote:


> Originally Posted by *jsc1973*
> 
> Those were the days. I wonder what percentage of people on here have been around long enough to have played consoles in 1985? I can remember playing on my uncle's Atari VCS (later called the 2600) in 1979. Today, I can play those same games on an emulator, using a PC that must be several orders of magnitude more powerful.


You are only a handful of years ahead of me. I remember the good old days; the first game I legitimately raged at was Pitfall on the 2600. God, how that game drove me up the wall.


----------



## Serandur

What did people expect, serialized code like AI to be easily parallelized through magic? Jaguar is right at the bottom of the barrel in terms of modern x86 CPU architectures, it's very weak and not at all suited to the demands of any high-performance platform (gaming being perhaps the most demanding consumer software out there).

I know people have the gut reaction to shout "Ubisoft", "lazy", and "learn to optimize", but that's ill-informed. "Optimized" and "lazy" have become buzzwords that let people who don't know what they're talking about skirt around the fact that processors have very real physical limitations, and no, it's not at all easy to split CPU workloads across multiple threads. Even when it is done, it is still a compromise that does not automagically mitigate the very strict limitations these CPUs impose on single-threaded performance, which is and always will be absolutely vital given the inherently serialized nature of CPU work.
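The "serial parts still dominate" point above is just Amdahl's law. A minimal sketch, assuming an illustrative split where only part of a frame's CPU work parallelizes (the 60% figure is a made-up example, not a measured number):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Upper bound on speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Even if 60% of a frame's CPU work parallelizes perfectly across the
# 6 cores a game can use, the serial 40% (much of AI/game logic) caps it:
print(round(amdahl_speedup(0.6, 6), 2))     # 2.0x on 6 cores
print(round(amdahl_speedup(0.6, 1000), 2))  # ~2.5x even with 1000 cores
```

This is why eight slow cores are not interchangeable with two fast ones: no amount of "optimization" raises the ceiling set by the serial fraction.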


----------



## zantetheo

The life cycle of the PS4 and Xbox One was planned for an 8-10 year window.

Now imagine how many of us will have 4K TVs or monitors 3-4 years from now. Imagine 3-4 generations of GPUs after the GTX 980.

Does anyone really believe the current consoles will survive past 5 years? 4K TVs will be mainstream and 1080p ancient technology by then.


----------



## bossie2000

There will be newer revisions coming of the existing PS4 and XB1. Newer APUs will replace older ones, more memory can be added, and so on!


----------



## thegreatsquare

Quote:


> Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose


...and yet that's exactly what they do. It was even worse on my 720QM: something like ~70% load on one core, ~20% on another, and 5-10% on the rest.


----------



## XAslanX

Quote:


> Originally Posted by *zantetheo*
> 
> Life cycle of PS4 and Xbox1 was planned for an 8-10 year window.
> 
> Now imagine 3-4 years from now how many of us will have a 4K TVs or monitors. Imagine 3-4 generations of Gpus after GTX 980.
> 
> Is really anyone who believes that current consoles will survive after 5 years? 4K TVs will be mainstream and 1080p ancient technology by then.


Exactly this. The new consoles were Dated On Arrival, and I just don't see them lasting that long with their limited hardware. It's 2014; 1080p at 60 FPS should be the minimum these new consoles can do with every single game. Forbes article on this here: http://www.forbes.com/sites/johnarcher/2014/09/12/the-ps4-and-xbox-one-are-already-out-of-date/


----------



## MURDoctrine

Quote:


> Originally Posted by *bossie2000*
> 
> There will be newer revisions coming of the existing PS4 and XB1. Newer APUs will replace older ones, more memory can be added, and so on!


The consoles will get better-optimized games and we will see improvements. The only thing is, we will see devs start cutting corners to hit these resolutions and frame rates. Just think about the texture/model pop-in we see. Someone mentioned the wonders Rockstar did with GTA V. I'm sorry, but GTA V on 360/PS3 was ugly. I just couldn't get past half the city not loading for 4-5 seconds after I arrived at a location. Once we get it on PC we MIGHT see it how they intended, but that remains to be seen.


----------



## Carniflex

Quote:


> Originally Posted by *sugarhell*
> 
> Just one word. Ubisoft.


Given the Ubisoft track record, I'm taking what they (or their anonymous devs) say with an unhealthy dose of salt. There is obviously some truth in there when they say this CPU is on the weaker side, but getting only a "1-2 fps difference for a very optimized game" between the PS4 and XBOne does not sound "very optimized" to me. It sounds like aiming at the lowest common denominator. Besides, the XBOne was supposed to have all this fancy cloud stuff, so if they tailored the game optimally for a specific platform, they could offload a lot of the AI work to that MS cloud that was supposed to make all the difference. AI is usually not very latency-sensitive, so if they had the will, THAT part should not be a problem. It should feel quite natural for AI entities to have some kind of "reaction time", as long as it's around 500 ms or lower (the rough ballpark of human reaction time when conscious decision-making is involved).


----------



## Newbie2009

More excuses from Ubisoft. We all know the consoles are underpowered, but I wouldn't believe a word from them. Any excuse to justify 900p to keep M$ happy.


----------



## Profiled

Since when has Assassin's Creed been the next-gen Crysis? How about Crytek? CryEngine has amazing GPU optimizations.

Just my 99.99 cents.


----------



## Carniflex

Quote:


> Originally Posted by *MacG32*
> 
> So it's time for a crowd funded game console that's actually a recent specs PC that devs can code for and not waste countless time "optimizing" for already outdated AMD systems? Will real consoles be released next year? Will PCs just overtake consoles by leaps and bounds this holiday season? Will Steam celebrate this with the best holiday sale to date? I guess we'll see...lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: I loved my cartridge based gaming systems of the past. What fun and excitement.


Steambox? Although ofc it's not crowd-funded


----------



## Hukkel

Quote:


> Originally Posted by *Kuivamaa*
> 
> Haha yeah, can't wait to revisit this newspost in 2 years when much more graphically advanced console games will be out.


This so much.

But hey this is OCN, where users are never shy of some good ol' consolebashing.


----------



## PlugSeven

So what are we to make of it if and when the PC version does not look any better than the console version?


----------



## EliasAlucard

As much as I appreciate AMD, Sony should have persuaded AMD to produce RISC/PowerPC CPUs for the PS4. That way, we'd have backwards compatibility with the PS3 and also lower wattage. Either way I enjoy my PS4, but CISC/x86 is stone age technology by now.


----------



## Clocknut

there are still a way to save this console thing or at least keeping it relevant.

Simply release consoles on a faster cycle, much like Apple does with the iPhone. Every 2-3 years a new console; the old one gets a die shrink and a price drop to $199-$299, while maintaining full backward compatibility.


----------



## DoomDash

Ubisoft kinda sucks though.


----------



## omari79

Wasn't it always the case that a console was superior or equal to a high-end gaming PC at launch, and then would get surpassed not long after?

Wasn't this the case with the PS2..PS3..XBOX..360?

What went wrong with the PS4/ONE?


----------



## BigMack70

Quote:


> Originally Posted by *omari79*
> 
> What went wrong with the PS4/ONE?


Sony and Microsoft got cheap and designed garbage systems so that they could make a few extra bucks from hardware sales initially.


----------



## omari79

Quote:


> Originally Posted by *BigMack70*
> 
> Sony and Microsoft got cheap and designed garbage systems so that they could make a few extra bucks from hardware sales initially.


It seems so, but shouldn't we also discredit Ubi a bit for their lazy attitude towards optimization?


----------



## lacrossewacker

Guys... he's saying the consoles are terrible CPU-wise -_- That's a given: 8 crappy cores, a low clock speed, and only 62% available to the game itself.


----------



## Dyson Poindexter

The console shills in this thread are really making me









These "next-gen" consoles aren't sponges soaked in mystery that devs can wring performance from over several years. It's x86 and GCN. We know what it is, and we know what it's capable of. If my Pentium D and Radeon X1900XT haven't become "optimized" enough to max out _Crysis_, why should you expect the new consoles to suddenly become amazing? Sure, small gains will be made but the days of revolutionary performance increases are over.

Honestly, the level of "just you wait and see", with no numbers to back it up, from these console people reeks of Stockholm syndrome. Y'all need to lay off the MS Kool-Aid; "the cloud" and "optimizations" aren't going to rescue your consoles' performance.


----------



## Schoat333

Obviously, there is a wall, and they are going to hit it with current engines. It is going to take some innovation to get beyond that just like it did on the last generation.

This is Ubisoft, so I don't expect them to do anything revolutionary.


----------



## maarten12100

Quote:


> Originally Posted by *Imglidinhere*
> 
> So... devs are finally saying what the tech nerds were saying BEFORE the console was released? Huh... go figure...


All you do is spread negative comments; move back over to the laptop section, please...
Clearly the consoles can do 1080p, just not with the settings devs want. And resolution isn't everything for image quality: 720p at medium settings may look far better than 1080p at low.

These "news" items are sensationalist garbage that doesn't belong on a tech site, I think.
Quote:


> Originally Posted by *BigMack70*
> 
> Sony and Microsoft got cheap and designed garbage systems so that they could make a few extra bucks from hardware sales initially.


Bottom line, they are just narrowly avoiding a loss, so your argument is very much wrong. They did, however, shave off a few bucks so that they didn't have to sell at a loss.
Quote:


> Originally Posted by *lacrossewacker*
> 
> guys.....he's saying the console's are terrible CPU wise -_- That's a given. 8 crappy cores, low clock speed, only 62% available for the game itself.


It shouldn't be that bad; at 2GHz my quad Jaguar-based proc already does surprisingly well. Also, people are pairing the 750 Ti with those Kabini platforms and it is pretty good, and that is just 4 cores at low clocks.
Quote:


> Originally Posted by *EliasAlucard*
> 
> As much as I appreciate AMD, Sony should have persuaded AMD to produce RISC/PowerPC CPUs for the PS4. That way, we'd have backwards compatibility with the PS3 and also lower wattage. Either way I enjoy my PS4, but CISC/x86 is stone age technology by now.


It would be nice if it were that way, but PowerPC is IBM's brainchild, so it would be a dual-vendor console like the previous one. And note that the fat PS3 models were able to run PS2 games only because they had the PS2 processors on board.


----------



## BigMack70

Quote:


> Originally Posted by *maarten12100*
> 
> bottom line they just narrowly not making a loss so your argument is very much wrong. *They however did shave off a few bucks so they didn't have to sell at a loss.*


If I'm very much wrong, how come you are making the exact same point that I am?









It's just the facts: they cheaped out on hardware costs this console generation compared to previous ones, in order not to sell at a loss as is customary. Result: worse hardware than usual for new consoles relative to the market.


----------



## maarten12100

Quote:


> Originally Posted by *kingduqc*
> 
> No wonder, they got an AMD tablet cpu clocked at pentium 3 clock speed...


The Pentium 3 topped out at 1.4GHz, so not really. Also, we are talking 8 cores at roughly half the IPC of Haswell; with linear scaling that gets you an i5 in raw performance, and even if scaling isn't linear, it should be plenty.

So let's blame it on Ubisoft for its inability to make an optimized game and call it a day.


----------



## maarten12100

Quote:


> Originally Posted by *BigMack70*
> 
> If I'm very much wrong, how come you are making the exact same point that I am?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's just the facts - they cheapened out this console generation compared to previous ones on hardware costs in order to not sell at a loss as is customary. Result: worse hardware than usual for new consoles relative to the market.


Because you state that they did it "so that they could make a few extra bucks from hardware sales initially."
They aren't making real money off the hardware sales and are on the edge of making a loss. (MS is making a loss; Sony makes a tiny amount, but is very likely making a loss if you include shipping and handling.)

But they did indeed go with the cheap solution that best fitted their market strategy. The consoles are cheap so I don't really see the problem. PS3 was crazy expensive at launch and wouldn't last longer than 2 years for most people.


----------



## Clukos

Yeah, let's trust what the most incompetent company in the universe, when it comes to software optimization, says. Why not?









Seriously Ubisoft, fire EVERYONE in your programming teams and get people that actually know how to utilize the hardware available, both PC and Consoles. It's not like you don't have the money!


----------



## Clocknut

Quote:


> Originally Posted by *maarten12100*
> 
> These "news" items are sensationalist garbage that doesn't belong on a tech site I think.
> Bottom line, they are just narrowly avoiding a loss, so your argument is very much wrong. They did, however, shave off a few bucks so they didn't have to sell at a loss.
> It shouldn't be that bad @2GHz my quad jaguar based proc already does surprisingly well. Also people are pairing the 750 ti with those Kabini platforms and it is pretty good and that is just 4 cores at low clocks.


If they had used the old $499 selling price without the Kinect thing, it wouldn't be that bad. Putting $100 extra into the APU alone could have made a difference compared to what it is now.

Right now it is $399, or $499 with Kinect. It is not cheap enough and it is not good performance; these consoles are good at neither. People have no problem paying $600-$700 for smartphones, so I don't think a $500 console is too much to ask for, if it delivers a good all-in-one package. The market feedback on the Wii U already shows that people want a high-performance console, not a budget console. Sony & Microsoft are so wrong to cheap out on their consoles thinking people would buy an average budget console.


----------



## BigMack70

Quote:


> Originally Posted by *maarten12100*
> 
> Because you state that "so that they could make a few extra bucks from hardware sales initially."
> They aren't making real money of the hardware sales and are on the edge of making a loss. (MS is making a loss Sony makes a tiny amount but is very likely to make a loss if you include shipping and handling)
> 
> But they did indeed go with the cheap solution that best fitted their market strategy. The consoles are cheap so I don't really see the problem. PS3 was crazy expensive at launch and wouldn't last longer than 2 years for most people.


Sooooooooo... they're not losing money but in fact are making it from hardware sales, and yet you disagree with me that their hardware designs were financially motivated?

I'm sorry, but...


----------



## maarten12100

Quote:


> Originally Posted by *BigMack70*
> 
> Sooooooooo... they're not losing money but in fact are making it from hardware sales, and yet you disagree with me that their hardware designs were financially motivated?
> 
> I'm sorry, but...


They are making money from software sales, as they did with the PS3 and Xbox 360; the only difference is the new consoles either make a small loss or roughly break even.

If you want to argue the semantics that 1 dollar of profit on a PS4 sale including handling is a big deal, then go ahead, I'm out. They break even on console cost versus selling price and make their profit off games.
Quote:


> Originally Posted by *Clocknut*
> 
> If they use the old $499 selling price without the kinect thing, it wouldnt be that bad. putting $100 extra on APU alone could make a diff than what is it now.
> 
> right now it is $399, or $499 with kinect. It is not cheap enough & it is not good performance. These consoles are good at neither side. People are no problem paying $600-$700 on smartphones, so I dont think a $500 consoles are too much to ask for, if they is delivering a good all-in-one package. The market feedback on Wii U already show that they want a high performance console, not a budget console. Sony & Microsoft are soo wrong to cheap out their console thinking people would buy their average budget console.


Nah, MS and Sony will take direct profits instead of having a 2-year ROI cycle again. This way every extra penny of markup on games is direct profit to them. They will make money and the average consumer gets a cheaper console. Most console players don't even care that much; they want a cheap console that plays games. Most don't even know the difference between HD (720p) and FHD (1080p).


----------



## BigMack70

My point was that they gimped the consoles for financial reasons, something which I don't see you disagreeing with, and I cannot figure out why you are trying to pick a fight.

Oh wait I forgot... this is the internet. Carry on.


----------



## EmL

And this is coming from Ubisoft.








We'll see what Ubi has to say when the next Uncharted drops.
Inb4 movie game


----------



## PureBlackFire

why hasn't Ubisoft's CEO banned the developers from talking yet?


----------



## BinaryDemon

Quote:


> Originally Posted by *maarten12100*
> 
> Also we are talking 8 cores the strength of about half the IPC of Haswell linear scaling gets you an i5 in raw performance even with that not being the case it should be plenty.


Except that both the XB1 and PS4 reserve two of those cores for the OS. So the usable CPU is more like an FX-6300 clocked @ 1.6-2.0GHz with its L3 cache removed. I'm not sure what ancient mobile i5 that is equal to, but it's not that impressive.


----------



## Serandur

Quote:


> Originally Posted by *maarten12100*
> 
> Pentium 3 topped out at 1.4GHz so not really. Also we are talking 8 cores the strength of about half the IPC of Haswell *linear scaling* gets you an i5 in raw performance even with that not being the case it should be plenty.
> 
> So let's blame it on Ubisoft for it's inability to make an optimized game and call it a day.


Not even the best-multithreaded games out there (which can never be as well multithreaded as video editing) would let such a thing be possible. You won't get linear scaling with CPUs in games, ever. An i5-2500K (let alone Haswell, where the comparatively small IPC gains are actually rather large in comparison to Jaguar) has roughly 2x the IPC of any Jaguar CPU coupled with roughly 2x the clock speed even at stock (taking turbo boost into account). A stock 2500K core is about 4x as powerful as said Jaguar core. Even in the best multi-threaded scenarios, which games simply will never attain (given the nature of CPUs as inherently serial processors and most consumer software being inherently limited in how it can scale and effectively divide threads across CPU cores), the 8-core Jaguar in this case could only ever match 2 stock 2500K cores.

Overclock the 2500K, and not even that. And this is only a what-if scenario that couldn't even occur in games, especially given the 2 reserved cores. Fewer, but more powerful, cores are inherently easier to work with and a more efficient way of doing things with CPUs. It's in their very nature, in the nature of the computations they are designed to execute. You cannot split everything into multiple threads across different cores; it's simply impossible. A lot of AI putting a ton of stress on one core does not equal poor "optimization"; it could just mean slow hardware. You cannot squeeze blood from a stone.

If there's someone to blame for console CPUs having severe difficulty and restricting the scope of certain game systems and features, it's the console manufacturers for using cores weaker than Intel Atoms. If "optimizing" means limiting the scope of CPU-intensive programs even further across the board so phone-level CPUs don't choke, screw that. All these false comparisons to games extremely limited in the scope of the technology they implement (i.e., console exclusives), and baseless claims of needing to "optimize" better made without understanding what "optimizing" means in this case or any programmer-level understanding of AI threading, especially in a manner used to vitriolically attack the competence of Ubisoft's programmers in knee-jerk reactions to some PR wording, are pathetic.

I'm not stating Ubisoft have a clean record or anything definitive in either direction, but we've seen what a lot of AI can do to CPUs across many games, not just Ubisoft ones, and the results are pretty much consistently demanding (predominantly on one core, no less) if the AI has any level of scale, autonomy, or response to stimuli.
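The rough per-core arithmetic above (roughly 2x IPC times roughly 2x clock) can be sketched as a quick back-of-the-envelope calculation. The IPC and clock ratios below are illustrative assumptions taken from the post, not benchmark results:

```python
# Back-of-the-envelope throughput comparison: throughput ~ IPC x clock.
# Relative IPC values are illustrative assumptions, not measurements.
JAGUAR_CORE = {"rel_ipc": 1.0, "clock_ghz": 1.6}
I5_2500K_CORE = {"rel_ipc": 2.0, "clock_ghz": 3.3}  # ~2x Jaguar IPC at stock

def core_throughput(core):
    """Relative single-core throughput: IPC times clock speed."""
    return core["rel_ipc"] * core["clock_ghz"]

ratio = core_throughput(I5_2500K_CORE) / core_throughput(JAGUAR_CORE)
print(f"One 2500K core ~ {ratio:.1f}x one Jaguar core")

# Even with perfect scaling across all 8 Jaguar cores (which games never get),
# the whole chip adds up to only about two 2500K cores' worth of throughput:
print(f"8 Jaguar cores ~ {8 / ratio:.1f} 2500K cores")
```

With these assumed ratios the per-core gap comes out at roughly 4x, matching the claim in the post.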


----------



## routek

yeah, it's 6 cores for games on console, not 8; 2 are locked off.

From benches it would seem the console CPU part is like an Athlon 630 or something in performance numbers, perhaps worse. It's good, however, in terms of power consumption for a laptop or for the APU solution they chose.

I would've liked to see them wait for Maxwell and look at Intel for a 2014 launch instead of 2013.

Thing is, though, this is a tough business and they're selling well, using around 120W in total in a simple APU solution. Buyers seem to be happy with 2007-2010 PC-level graphics.


----------



## ZealotKi11er

So if we were to make a new console right now, what kind of components would be ideal?

Also, you guys are forgetting that consoles, unlike PCs, can be coded down to the metal, and the CPU can handle a lot more draw calls than on PC.


----------



## Serandur

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So if we where to make a new console right now. What can of components would be ideal?
> 
> Also you guys are forgetting that consoles unlike in PC can be coded down to metal and a CPU has a lot more draw calls then in PC.


Ideal? Laptop Haswell quad-core and Maxwell GPU (970M level); TDP isn't too high, power is nice, but price would be the issue... for the manufacturers. That's some very efficient hardware and they could play with the clocks how they wish. That would be a nice base-line for multiplatform development for some time.

They can code to the imaginary metal as much as they want, Jaguar is still a tablet/phone-level CPU and DX11's inefficiency is severely exaggerated if the results are to be anywhere near comparable to Jaguar with any decent Intel CPU from the past few years. Draw calls are an issue, yes, but there's Mantle and DirectX12 to the rescue for those.


----------



## kennyparker1337

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> The console shills in this thread are really making me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> These "next-gen" consoles aren't sponges soaked in mystery that devs can wring performance from over several years. It's x86 and GCN. We know what it is, and we know what it's capable of. If my Pentium D and Radeon X1900XT haven't become "optimized" enough to max out _Crysis_, why should you expect the new consoles to suddenly become amazing? Sure, small gains will be made but the days of revolutionary performance increases are over.
> 
> Honestly, the level of "just you wait and see" with no numbers to back it up with these console people reeks of Stockholm syndrome. Y'all need to lay off the MS Koolaid, "the cloud" and "optimizations" aren't going to rescue your consoles' performance.


"MS Koolaid"









The idea of developers being able to code directly to hardware, instead of having to code through a high-level language that goes through multiple layers of drivers, is not relevant to just Microsoft but to the entire idea of a console.

Sure, the gains over time aren't going to compare to upgrading a PC, but they're absolutely *FREE* to everyone involved.

If you had the decency to look at first-year games vs last-year games on the last-gen consoles, you would realize that games can clearly be vastly improved over time by getting your code to near-100% efficiency at the hardware level.

Call of Duty 2 (launch title) vs Battlefield 3 (finishing title)


----------



## benbenkr

Wait, hold on.. Ubisoft?

Wasn't it just a few months ago that they were boasting about the next-gen consoles' power being able to bring Unity to "LIFE"? About how powerful they were, and how that let them do things they couldn't in AC4?

Then... wait what? Now they're not getting enough power?
This is why everyone in Ubisoft deserves a kick in the face.


----------



## BinaryDemon

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Also you guys are forgetting that consoles unlike in PC can be coded down to metal and a CPU has a lot more draw calls then in PC.


I don't think that's the case so much anymore. I doubt developers get truly direct access to the hardware, because the OS has to execute complex background tasks. Sure, it's a more direct path than PCs have, but it's not like the old days either.

Unless the OS in our new consoles suspended all other tasks and dedicated the hardware to the game running in the foreground.


----------



## dieanotherday

... xbox two time?


----------



## BinaryDemon

Quote:


> Originally Posted by *dieanotherday*
> 
> ... xbox two time?


I don't know why MS or Sony couldn't release an upgraded console. By default it would match the current CPU clock speed and only utilize the same number of currently available GCN cores, but developers could then add support for the upgraded model.

Yes, it would be more work for developers, and yes, all the current console owners would feel shafted.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Serandur*
> 
> Ideal? Laptop Haswell quad-core and Maxwell GPU (970M level); TDP isn't too high, power is nice, but price would be the issue... for the manufacturers. That's some very efficient hardware and they could play with the clocks how they wish. That would be a nice base-line for multiplatform development for some time.
> 
> They can code to the imaginary metal as much as they want, Jaguar is still a tablet/phone-level CPU and DX11's inefficiency is severely exaggerated if the results are to be anywhere near comparable to Jaguar with any decent Intel CPU from the past few years. Draw calls are an issue, yes, but there's Mantle and DirectX12 to the rescue for those.


The CPU alone costs $300. Not going to happen. 970M? That's 20-30% faster than an HD 7870.


----------



## KyadCK

Quote:


> Originally Posted by *BinaryDemon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *maarten12100*
> 
> Also we are talking 8 cores the strength of about half the IPC of Haswell linear scaling gets you an i5 in raw performance even with that not being the case it should be plenty.
> 
> 
> 
> Except that both the XB1 and PS4 reserve two of those cores for the OS. So the usable CPU is more like an FX-6300 clocked @ 1.6-2.0ghz with it's L3 cache removed. I'm not sure what ancient mobile i5 that is equal too, but it's not that impressive.
Click to expand...

Jaguar has an IPC closer to Baytrail/Sandy than to Piledriver. It also has full cores, not modules.

So... no. Not a 2GHz 6300. It has about as much multi-threaded compute performance as a Haswell Pentium with around half the TDP. Single-thread hurts, but it's about time game devs learned to write multi-threaded code anyway.
Quote:


> Originally Posted by *BinaryDemon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Also you guys are forgetting that consoles unlike in PC can be coded down to metal and a CPU has a lot more draw calls then in PC.
> 
> 
> 
> I don't think that's the case so much anymore. I doubt developers get direct access to hardware because the OS has to execute complex background tasks. Sure it's a more direct path than PC's have, but it's not like the old days either.
> 
> Unless the OS in our new console suspended all other tasks and dedicated the hardware to the game running in the foreground.
Click to expand...

You -just- got done explaining that two of the cores were for the OS, and now you're going to say they have to stop OS tasks to run the game?

Basic CPU affinity... the game gets 6, everything else shares the other two. Adding more threads to a system makes it easier to multitask, not harder. It certainly doesn't make it harder to write to bare metal compared to the older generations.
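That affinity split can be sketched in a few lines. This is a hypothetical illustration using Linux's scheduler API (`os.sched_setaffinity`), not how the console OSes actually expose it; the core IDs are made up for the example:

```python
import os

ALL_CORES = set(range(8))           # 8 Jaguar cores
OS_CORES = {6, 7}                   # two cores reserved for OS/background tasks
GAME_CORES = ALL_CORES - OS_CORES   # the game gets the other six

def pin_current_process(cores):
    """Pin the calling process to the given cores (Linux-only API)."""
    if not hasattr(os, "sched_setaffinity"):
        return False  # affinity API not available on this platform
    allowed = os.sched_getaffinity(0)  # cores this process may actually use
    target = cores & allowed           # only pin to cores that really exist
    if target:
        os.sched_setaffinity(0, target)
    return bool(target)

pin_current_process(GAME_CORES)
```

The OS never has to "suspend" anything; the scheduler simply keeps game threads and background threads on disjoint core sets.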

They're always going to say they hit a performance wall. Every game there will be a performance wall. Every year there will be a new performance wall.

And right up until they stop programing for it, smart people will find clever tricks to get around those walls. Just like every other console we were told had a "performance wall".


----------



## ad hoc

Quote:


> Originally Posted by *BinaryDemon*
> 
> I don't know why MS or Sony couldnt release an upgraded console. By default it would match current cpu clockspeed and only utilize the same number of currently available GNC cores, but then developers could add support for the upgraded model.
> 
> Yes it would be more work for developers, and Yes all the current console owners would feel shafted.


They wouldn't just feel shafted. They'd be enraged, and frankly, I would be too if I had just spent $500 on an outdated console. In the PC world we expect that kind of nonsense as a matter of course, and maybe even see it as advantageous because it keeps hardware from stagnating. Console owners expect the exact opposite. They expect their console to play everything at max "console" settings because they shelled out a lot of money.

Besides, imagine a world where each company released different tiers of consoles. We'd have another iPhone situation on our hands. If you could replace hardware in consoles, it would be a different story, but that kind of defeats their original purpose.


----------



## dieanotherday

Quote:


> Originally Posted by *ad hoc*
> 
> They wouldn't just feel shafted. They'd be enraged, and frankly, I would be too if I had just spent $500 on an outdated console. In the PC world we expect that kind of nonsense as a matter of fact, and maybe even as advantageous because it keeps hardware from stagnating. Console owners expect the exact opposite. They expect their console to play everything at Max "console" settings because they shelled out a lot of money.
> 
> Besides, imagine a world where each company released different tiers of consoles. We'd have another iPhone situation on our hands. If you could replace hardware in consoles, it would be a different story, but that kind of defeats their original purpose.


That's what you get for buying any type of prebuilt.


----------



## maarten12100

Quote:


> Originally Posted by *BinaryDemon*
> 
> Except that both the XB1 and PS4 reserve two of those cores for the OS. So the usable CPU is more like an FX-6300 clocked @ 1.6-2.0ghz with it's L3 cache removed. I'm not sure what ancient mobile i5 that is equal too, but it's not that impressive.


Yeah, that seems almost correct. At 2-2.4GHz my 6410 punches out about a 2 in Cinebench; with double the cores a 3.5 is easily doable. And 2 threads going to the OS doesn't mean they won't be used; they are obviously handling tasks too, and background tasks on a PC slow it down as well.

Sandy i5 performance @2.5GHz seems plenty for the GPU it is paired with, considering I use an R9 290 with my i5 and I am not bottlenecked.

Ubisoft devs just can't program, or are cutting costs.
Quote:


> Originally Posted by *Serandur*
> 
> Not even the best-multithreaded games (which are impossible to be as well multithreaded as video editing) out there would let such a thing be possible. You won't get linear scaling with CPUs in games, ever. An i5-2500K (let alone Haswell, where the comparitively small IPC gains are actually rather large in comparison to Jaguar) has roughly 2x the IPC of any Jaguar CPU coupled with roughly 2x the clock speed even at stock (taking turbo boost into account). A stock 2500K core is about 4x as powerful as said Jaguar core. Even in the best of the best multi-threaded scenarios games simply will never attain (given the nature of CPUs as inherently serial processors and most consumer software being inherently limited in how it can scale and effectively divide threads across CPU cores ), the 8-core Jaguar in this case could only ever match 2 stock 2500K cores.
> 
> Overclock the 2500K, and not even that. And this is only a what-if scenario that couldn't even occur in games, especially so given the 2 reserved cores. Less, but more powerful cores are inherently easier to work with and a more efficient way of doing things with CPUs. It's in their very nature, in the nature of computations they are designed to execute. You cannot split everything into multiple threads across different cores, it's simply impossible. A lot of AI putting a ton of stress on one core does not equal poor "optimization", it could just mean slow hardware. You cannot squeeze blood from a stone.
> 
> If there's someone to blame for console CPUs having severe difficulty and restricting the scope of certain game systems and features, it's the console manufacturers for using cores weaker than Intel Atoms. If "optimizing" means limiting the scope of CPU-intensive programs even further across the board so phone-level CPUs don't choke, screw that. All these false comparisons to games extremely limited in the scope of the technology they implement (ie, console exclusives) and baseless claims of needing to "optimize" better without even understanding what "optimizing" means in this case nor without any programmer-level understanding of AI threading, especially in a manner being used to vitriolically attack the competence of Ubisoft's programmers in knee-jerk reactions to some PR wording, is pathetic.
> 
> I'm not stating Ubisoft have a clean record nor anything definitive in either direction, but we've seen what a lot of AI can do to CPUs across many games, not just Ubisoft ones, and the results are pretty much consistently demanding (predominantly on one core, no less) if the AI has any level of scale, autonomy, or response to stimuli.


Quote:


> even with that not being the case it should be plenty.


Yes, I agree, as you can see above, that linear scaling won't happen, but one can come really close on as few as 8 cores.
Jaguar cores punch out pretty good scores even at the 1.6GHz range, so we have a CPU stronger than pretty much all standard-clocked dual-core chips on the market in the 3GHz range, which do fine in games; thus Ubisoft stinks.


----------



## BinaryDemon

Quote:


> Originally Posted by *KyadCK*
> 
> You -just- got done explaining two of the cores were for the OS and you're going to now say that they have to stop OS tasks to run the game?


Well, you combined my responses to two different subjects, but:

Yes, I'm basically saying two cores is an incredible waste for the 'lean' OS the XB1 and PS4 currently run. This OS bloat exists because the consoles are trying to be more than gaming systems: they have to be able to answer incoming voice-chat requests, download in the background, etc., etc. If I were designing a game console and had to deal with the restrictions of using <$500 hardware, I would certainly want to make sure 99% of those resources were utilized by the foreground task. I guess I'm old school in that I want my console to play GAMES.


----------



## KenjiS

Quote:


> Originally Posted by *Master__Shake*
> 
> next gen...
> 
> just squint your eyes they look really good.
> 
> hello 720p goodbye 1080p


It's more cinematic; studies say gamers don't care about 1080p and 60fps after all, they say it looks unnatural...

New games will replicate the "hand-cranked" variable frame rate of 16-18 fps and remove audio, because audio is a distraction. Also no more color, because studies say consumers dislike colorful things.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KyadCK*
> 
> Jaguar has an IPC closer to Baytrail/Sandy than Piledriver. It also has full cores, not modules.
> 
> So... no. Not a 2Ghz 6300. It has about as much multi-thread compute performance as a Haswell Pentium with around half the TDP. Single thread hurts, but it's about time game devs learned to write multi-threaded code anyway.
> You -just- got done explaining two of the cores were for the OS and you're going to now say that they have to stop OS tasks to run the game?
> 
> Basic CPU Affinity... Game gets 6, everything else shares the other two. Adding more threads to a system makes it easier to multitask, not harder. It certainly doesn't make it harder to write to bare metal compared to the older generations.
> 
> They're always going to say they hit a performance wall. Every game there will be a performance wall. Every year there will be a new performance wall.
> 
> And right up until they stop programing for it, smart people will find clever tricks to get around those walls. Just like every other console we were told had a "performance wall".


It's also a way to hide their bad work, by saying consoles are the limiting factor.


----------



## cutty1998

So if these new consoles are only able to really pull off 720p, does that mean the PS3 & 360 were really running at a lower resolution than 720p?


----------



## prescotter

Quote:


> Originally Posted by *cutty1998*
> 
> So if these new consoles are only able to really pull off 720P, does that mean the PS3 & 360 were really running at a lower resolution that 720P


Yeah, they do; take a look here: http://forum.beyond3d.com/showthread.php?t=46241

Example of CoD resolutions:
Call of Duty: Black Ops = 960x544 (2xAA)
Call of Duty: Black Ops 2 = 880x720 (dynamic? Instances of 832x624..., post-AA)
Call of Duty: Ghosts = 928x600 ? (no AA? broken?)
Call of Duty: Modern Warfare = 1024x600 (2xAA)
Call of Duty: Modern Warfare 2 = 1024x600 (2xAA)
Call of Duty: Modern Warfare 3 = 1024x600 (2xAA)


----------



## ZealotKi11er

Quote:


> Originally Posted by *cutty1998*
> 
> So if these new consoles are only able to really pull off 720P, does that mean the PS3 & 360 were really running at a lower resolution that 720P


Yeah. Most Xbox 360 and PS3 games were below 720p, and at much lower quality too.


----------



## Serandur

Quote:


> Originally Posted by *kennyparker1337*
> 
> "MS Koolaid"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The idea of developers being able to code directly to hardware instead of having to code through high level language that has to go through multiple levels of drivers is not relevant to just Microsoft but the entire idea of a console.
> 
> Sure the gains over time isn't going to compare to upgrading a PC but it's absolutely *FREE* to everyone involved.
> 
> If you had the decency to look at first year games vs last year games on last gen consoles you would realize that clearly games can be vastly improved upon over time by getting your code to near 100% proficiency at the hardware level.
> 
> Call of Duty 2 (launch title) vs Battlefield 3 (finishing title)


You missed the point. The last-generation consoles had some very different hardware (360's GPU had the very first unified shaders, the PS3 used that weird Cell architecture) and the industry as a collective whole was actually making the transition to programmable unified shaders and multi-core CPUs. It wasn't just consoles that improved with the same hardware, as people rocking 8800s could attest to. There was nothing console-specific about the massive improvement in software development tools we saw and hardware optimizations stemmed from esoteric architectures and adapting to them. Neither of these things apply this time around.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The CPU alone cost $300. Not going to happen. 970M? Thats 20-30% faster then HD 7870.


First off, the 7850 is a better representation of the PS4's GPU due to the reduced shader count and low clock speed. Second, the 970M is much faster than that. Third, a downclocked/low-binned i5 bought in bulk should certainly not be $300. I'm not talking about i7s or the markup mobile OEMs tend to charge.
Quote:


> Originally Posted by *maarten12100*
> 
> Yeah that seems almost correct. At 2-2.4GHz my 6410 punches out about a 2 in Cinebench with double the cores a 3,5 is easily doable and 2 threads going to the OS doesn't mean they won't be used they are obviously handling tasks too background tasks on a pc slow it down too.
> 
> sandy i5 performance @2.5GHz it seems plenty for the gpu it is using considering I use a R9 290 with my i5 and I am not bottlenecked.
> 
> Ubisoft devs just can't program or are cutting cost
> 
> Yes I agree as you can see above that linear scaling won't happen but one can come really close on as little as 8 cores.
> Jaguar cores punch out pretty good scores even at the 1.6GHz range so we have a cpu stronger than pretty much all standard clocked dual core chips on the market 3GHz range. Which do fine in games thus Ubisoft stinks.


I think you completely skipped the part about how CPUs function. They're sequential processors by nature, not parallel ones. There are many, many, many tasks that cannot be split across cores at all, let alone in a perfectly equal manner that maximizes each core without the whole CPU being held back by any particular core struggling with a given thread/workload that is itself incapable of being split. Weaker single-threaded performance actually makes the situation worse, because the CPU is that much more likely to be held back by a single core choking on a task too great for it. This is not a trivial detail; I don't think a lot of people have the faintest idea how CPUs function. In games, an 8-core (6 in this case, effectively) Jaguar will never even match a dual-core Haswell, even if it "theoretically" should be able to with all cores added up, because that is not how CPUs function in the vast majority of software. This is an inherent limitation; ask any programmer, or Google "Amdahl's law". Video editing and similar benchmarks like Cinebench are among a very select few exceptions.
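Amdahl's law makes the point concrete. A minimal sketch, where the parallel fraction p = 0.7 is an assumed, deliberately generous figure for a game workload:

```python
def amdahl_speedup(p, n):
    """Amdahl's law: speedup on n cores when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a generously threaded game (p = 0.7) on the 6 usable Jaguar cores:
print(f"{amdahl_speedup(0.7, 6):.2f}x")  # well short of the 6x "linear scaling" would give
```

The serial 30% dominates: six slow cores deliver only about 2.4x one slow core, which is why weak single-threaded performance cannot be papered over by core count.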


----------



## ZealotKi11er

Quote:


> Originally Posted by *Serandur*
> 
> You missed the point. The last-generation consoles had some very different hardware (360's GPU had the very first unified shaders, the PS3 used that weird Cell architecture) and the industry as a collective whole was actually making the transition to programmable unified shaders and multi-core CPUs. It wasn't just consoles that improved with the same hardware, as people rocking 8800s could attest to. There was nothing console-specific about the massive improvement in software development tools we saw and hardware optimizations stemmed from esoteric architectures and adapting to them. Neither of these things apply this time around.
> First off, the 7850 is a better representation of the PS4's GPU due to the reduced shader count and low clock speed. Second, the 970M is much faster than that. Third, a downclocked/low-binning i5 bought in bulk should certainly not be $300. Not talking about i7s or the markup mobile OEMs tend to charge.
> I think you completely skipped the part about how CPUs function. They're sequential processors by nature, not parallel ones. There are many, many, many tasks that cannot be split across cores at all, let alone in a perfectly equal manner that maximizes each core without the whole CPU being held back by any particular core struggling with a given thread/workload that is itself incapable of being split. Weaker single-threaded performance only makes the situation worse, in actuality, because the CPU is that much more likely to be held back by that single core choking on a task too great for it. This is not a trivial detail, I don't think a lot of people the faintest idea how CPUs function. In games, an 8-core (6 in this case, effectively) Jaguar will never even match a dual-core Haswell even if it "theoretically" should be able to with all cores added up because that is not how CPUs function in the vast majority of software. This is an inherent limitation, ask any programmer and google "Amdahl's law". Video editing and similar benchmarks like Cinebench are part of a very select few of exceptions.


You are looking at a mobile GPU that just came out. The consoles came out a year ago, and they started building them before that. If they had to pick an equivalent GPU from AMD right now, it would be something like the M295X found in the 5K iMac.


----------



## Dyson Poindexter

Quote:


> Originally Posted by *kennyparker1337*
> 
> "MS Koolaid"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The idea of developers being able to code directly to hardware instead of having to code through high level language that has to go through multiple levels of drivers is not relevant to just Microsoft but the entire idea of a console.
> 
> Sure the gains over time isn't going to compare to upgrading a PC but it's absolutely FREE to everyone involved.
> 
> *If you had the decency* to look at first year games vs last year games on last gen consoles you would realize that clearly games can be vastly improved upon over time by getting your code to near 100% proficiency at the hardware level.


If you had the decency to read my whole post you'd see why that assumption doesn't make sense anymore.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> If you had the decency to read my whole post you'd see why that assumption doesn't make sense anymore.


Then they'd better find a way, or make games better in other ways besides graphics. This is what differentiates talented game studios from the boring ones.


----------



## Serandur

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You are looking at a Mobile GPU that just came out. Consoles came out 1 year ago but they started building before that. If they had to pick the equivalent GPU from AMD right now it would be similar to M295X found on the iMac 5K.


But you said this "So if we where to make a new console right now. What can of components would be ideal?", I picked up on the "right now" part.


----------



## Slomo4shO

Ubisoft and EA get no business from me... neither has been competitive at developing quality games for years now, yet ignorant fanboys continue to support these companies, which lie to and misinform their consumers on a daily basis. I am sure Ubisoft will miraculously find more performance in these consoles once a competitor produces a hit product that does a better job of delivering more eye candy on the existing hardware...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Serandur*
> 
> But you said this "So if we where to make a new console right now. What can of components would be ideal?", I picked up on the "right now" part.


The 970M is ideal in performance but not in price.


----------



## KSIMP88

short gap between consoles this time.... watch


----------



## raghu78

I am reminded of this proverb when it comes to Ubisoft

http://www.oxforddictionaries.com/definition/english/a-bad-workman-always-blames-his-tools

"a bad workman always blames his tools"

proverb : A person who has done something badly will seek to lay the blame on their equipment rather than admit their own lack of skill.

oh btw the awesome GTA V comes at 1080p on PS4

http://www.ibtimes.co.uk/gta-5-runs-1080p-resolution-ps4-no-word-xbox-one-performance-1471882


----------



## Serandur

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 970M ideal in performance but not in price.


I did say:

"but price would be the issue... for the manufacturers."

In any case, I think the console manufacturers should have taken the hit on stronger hardware regardless. They're charging for online access and aren't subsidizing anything; the consoles as they are are just so lacking as a result. $500 retail with a $50-100 loss on initial units isn't much to ask for what are supposed to be the development targets for the next however many years. I care most about their CPUs. Those are especially bad, and limiting to core aspects of game design that do not scale up the way graphical assets and resolution do.


----------



## routek

The recent posts seem to have gone a bit GPU-focused. Of course, but Ubisoft wants a better CPU.

Here's another quote:

"*We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realised it was going to be pretty hard. It's not the number of polygons that affect the framerate. We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second"*

The point is, games probably won't advance much in systems and AI; we're left with the same old games with better visuals.

You can offload to the GPU; the PS4 seems to be good for that.


----------



## Schoat333

Crazy to think we are already a year into the new generation of consoles. That went by fast, and it has been pretty unexciting to be honest.


----------



## Crouch

Quote:


> Originally Posted by *Remij*
> 
> I knew this was coming, but you can't really blame the console manufacturers, they really need to profit from these consoles early on to recoup their losses throughout the gen and actually make money.
> 
> That said, these consoles haven't come close to hitting their strides. *All it will take is some out of the box developer to show them they are wrong*, and new advancements will be made.


NAUGHTY DOG


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *StreekG*
> 
> I agree. If this started at PS3 prices with a bump up in the hardware specs, it would still sell very well and we would actually have 1080P games.
> I was playing The Evil Within on PS4 the other day and it dips below 30fps in some open areas. Not a good experience at all, scared to see how it performs on the Xbone


Everyone hated the PS3 at launch because its price was too high. Sony was smart in keeping the price of the PS4 low. A $500 PS4 is a big difference from a $400 PS4. Just ask the average console buyer.


----------



## geoxile

Quote:


> Originally Posted by *routek*
> 
> The recent posts seem to have drifted toward the GPU, but of course Ubisoft wants a better CPU.
> 
> Here's another quote:
> 
> "*We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realised it was going to be pretty hard. It's not the number of polygons that affect the framerate. We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second"*
> 
> The point is that games probably won't advance much in systems and AI; we're left with the same old games with better visuals.
> 
> You can offload to the GPU; the PS4 seems to be good for that.


They said they're spending 50% of CPU power on prepackaged rendering systems. Seems to me they should see whether they can come up with a more efficient solution for that.


----------



## SpeedyVT

I'd like to see the claims from a legit company, not a backwashing engine user.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Serandur*
> 
> I did say:
> 
> "but price would be the issue... for the manufacturers."
> 
> In any case, I think the console manufacturers should have taken the hit on stronger hardware regardless. They're charging for online access and aren't subsidizing anything; the consoles as they are are just so lacking as a result. A $500 retail price with a $50-100 loss on initial units isn't much to ask for what are supposed to be the development targets for the next however many years. I care most about their CPUs; those are especially bad and limit core aspects of game design that don't scale up the way graphical assets and resolution do.


For them to lose money would mean they'd have to push the hardware for 5+ years to make up the loss. They don't make as much money as you think. $50 for online? That is nothing.


----------



## Dyson Poindexter

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Everyone hated the PS3 at launch because its price was too high. Sony was smart in keeping the price of the PS4 low. A $500 PS4 is a big difference from a $400 PS4. Just ask the average console buyer.


It was also a BD player when standalone BD players cost $1000.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *XAslanX*
> 
> Exactly this, the new consoles were Dated On Arrival and I just don't see them lasting that long with their limited hardware. It's 2014, 1080p 60 FPS should be the minimum these new consoles can do with every single game. Forbes article on this here http://www.forbes.com/sites/johnarcher/2014/09/12/the-ps4-and-xbox-one-are-already-out-of-date/


A GTX 980 can't even do 1080p 60fps in every single game, and it costs close to $600. How do we expect consoles to do this?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> It was also a BD player when standalone BD players cost $1000.


Who buys Blu-rays now? Physical media is dead. Back then Blu-ray was just something fancy with no real need for it.


----------



## Pendulum

Quote:


> Originally Posted by *SpeedyVT*
> 
> I'd like to see the claims from a legit company, not a backwashing engine user.


I agree. Destiny and TLOU on my PS4 have literally never had a single performance issue. I've yet to see a frame rate drop on either after putting in 100+ hours.
Coming from Ubisoft I think I'll take this with a truck load of salt.


----------



## KenjiS

Quote:


> Originally Posted by *cutty1998*
> 
> So if these new consoles are only able to really pull off 720P, does that mean the PS3 & 360 were really running at a lower resolution that 720P


Correct, they were at about 540p in most cases... A few games might have gone to 720p.


----------



## y2kcamaross

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Who buys Blu-rays now? Physical media is dead. Back then Blu-ray was just something fancy with no real need for it.


people who appreciate uncompressed audio and video


----------



## criznit

Quote:


> Originally Posted by *Pendulum*
> 
> I agree. Destiny and TLOU on my PS4 have literally never had a single performance issue. I've yet to see a frame rate drop on either after putting in 100+ hours.
> Coming from Ubisoft I think I'll take this with a truck load of salt.


Throw in AI, large player counts, draw distance, etc. and that changes. I'm enjoying my PS4, but it's easy to tell what had to be cut out of these games to get them to run at the detail they're at (Infamous, Killzone, etc.). I was once again fooled by the hype of the new consoles bringing about better games, but once again progression is held back by the weak APU. I like graphics, but I want Far Cry/Crysis-type AI and environmental detail, and huge populations in lively city locales. /Dream


----------



## Pendulum

Quote:


> Originally Posted by *criznit*
> 
> Throw in AI, large player counts, draw distance, etc. and that changes. I'm enjoying my PS4, but it's easy to tell what had to be cut out of these games to get them to run at the detail they're at (Infamous, Killzone, etc.). I was once again fooled by the hype of the new consoles bringing about better games, but once again progression is held back by the weak APU. I like graphics, but I want Far Cry/Crysis-type AI and environmental detail, and huge populations in lively city locales. /Dream


True, I've been thinking FC4 might have some issues with the type of environment it uses. Very large areas filled with dense foliage and AI being spread out could cause some performance issues.
At least they're doing better with AI than Forza 5 on X1...2D crowd.


----------



## Dyson Poindexter

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Who buys Blu-rays now? Physical media is dead. Back then Blu-ray was just something fancy with no real need for it.


Ok? BD players were still $1000 back then. Until you build a time machine that point is still relevant.


----------



## alcal

I think this is an issue largely specific to Assassin's Creed. Few games have this mixture of NPC quantity and NPC AI. If anybody remembers ACIII, Boston was a complete slideshow even on decent/overclocked desktop processors. The launch games showed that good-looking games can exist on the new consoles, but it seems the level of features and detail the Unity devs are going for is too tall an order (open world, lots of AI, high-res textures, complicated pathing mechanics, and probably a whole host of new graphical technologies). Either way, we should probably just put down the pitchforks until the game is released; I'm sure it will play just fine on PC.


----------



## cutty1998

Quote:


> Originally Posted by *prescotter*
> 
> Yea they do, take a look here: http://forum.beyond3d.com/showthread.php?t=46241
> Example of CoD resolutions:
> Call of Duty: Black Ops = 960x544 (2xAA)
> Call of Duty: Black Ops 2 = 880x720 (dynamic? Instances of 832x624..., post-AA)
> Call of Duty: Ghosts = 928x600 ? (no AA? broken?)
> Call of Duty: Modern Warfare = 1024x600 (2xAA)
> Call of Duty: Modern Warfare 2 = 1024x600 (2xAA)
> Call of Duty: Modern Warfare 3 = 1024x600 (2xAA)


Wow, and I remember how incredible I thought the PS3 and 360 looked 7-8 years ago! If you looked on the back of a lot of 360 games, they said 1080p on them. Talk about false advertising!
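For a sense of how far below 720p those render targets sit, the pixel math is quick to check (a throwaway sketch using two of the resolutions quoted above):

```python
# Pixel counts of the quoted render targets vs. a native 1280x720 frame.
base = 1280 * 720  # 921,600 pixels at native 720p
targets = {
    "Black Ops (960x544)": 960 * 544,
    "Modern Warfare 2 (1024x600)": 1024 * 600,
}
for name, px in targets.items():
    # Black Ops renders only ~57% of a 720p frame's pixels;
    # the Modern Warfare titles sit at ~67%.
    print(f"{name}: {px:,} px, {px / base:.0%} of 720p")
```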


----------



## SpeedyVT

Quote:


> Originally Posted by *Pendulum*
> 
> True, I've been thinking FC4 might have some issues with the type of environment it uses. Very large areas filled with dense foliage and AI being spread out could cause some performance issues.
> At least they're doing better with AI than Forza 5 on X1...2D crowd.


Honestly AI could be handled by HSA because of the GCN cores. Like how games were programmed with consoles originally... OH WAIT UBISOFT STILL THINKS THIS IS A PC!


----------



## cutty1998

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> Ok? BD players were still $1000 back then. Until you build a time machine that point is still relevant.


You're right. Blu-ray never really caught on; DVDs still outsell it. I remember buying the PS3 thinking it was a deal at, what was it, $600 for the 60GB and $500 for the 20GB, as standalone Blu-ray players were $1000 at that time! Meanwhile I have bought maybe four Blu-rays. I know "The Dark Knight" is one I bought; can't remember the others.


----------



## maarten12100

Quote:


> Originally Posted by *cutty1998*
> 
> Wow, and I remember how incredible I thought the PS3 and 360 looked 7-8 years ago! If you looked on the back of a lot of 360 games, they said 1080p on them. Talk about false advertising!


The console did output 1080p, but the image was upscaled before output, so it wasn't really rendered at 1080p; it just ended up with the same number of pixels as 1080p.


----------



## Schoat333

Quote:


> Originally Posted by *maarten12100*
> 
> The console did output 1080p, but the image was upscaled before output, so it wasn't really rendered at 1080p; it just ended up with the same number of pixels as 1080p.


Exactly. Almost all games on the last gen were rendered pretty low and then upscaled to 720p or 1080p.

One thing people seem to forget is that even though these new consoles are not on par with PCs, they are pretty far ahead of the last gen. We are talking about rendering at 900p instead of 1080p. When you play on a TV, that really isn't that serious.
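The 900p-vs-1080p gap is easy to put a number on (a quick sketch, assuming 900p means a 1600x900 render target):

```python
# How many pixels does a 900p frame actually drop vs. 1080p?
p900 = 1600 * 900    # 1,440,000 pixels
p1080 = 1920 * 1080  # 2,073,600 pixels
deficit = 1 - p900 / p1080
print(f"900p pushes {deficit:.0%} fewer pixels than 1080p")  # ~31% fewer
```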


----------



## Master__Shake

Quote:


> Originally Posted by *omari79*
> 
> Wasn't it always the case that a console was superior or equal to a high-end gaming PC at launch, and then it would get surpassed not long after?
> 
> Wasn't this the case with the PS2..PS3..XBOX..360?
> 
> What went wrong with the PS4/ONE?


The technological improvements in PC hardware over the last gen's life span made PC gaming FAR superior to anything you could buy in some 500-dollar box that was cobbled together.

I am positive you could build a 500-dollar computer using the free preview of Windows 10 and get superior performance to some rickety old console.


----------



## PostalTwinkie

Quote:


> Originally Posted by *maarten12100*
> 
> Pentium 3 topped out at 1.4GHz, so not really. Also, we are talking 8 cores with about half the IPC of Haswell; linear scaling gets you an i5 in raw performance, and even with that not being the case it should be plenty.
> 
> So let's blame it on Ubisoft for its inability to make an optimized game and call it a day.


Technically the Pentium III topped out at 1.0 GHz, and Intel announced a 1.13 GHz variant that never shipped in volume.


----------



## Dyson Poindexter

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Technically the Pentium III topped out at 1.0 GHz, and Intel announced a 1.13 GHz variant that never shipped in volume.


http://www.ebay.com/itm/381012642309


----------



## Pip Boy

Quote:


> Originally Posted by *Schoat333*
> 
> Exactly. Almost all games on the last gen were rendered pretty low and then upscaled to 720p or 1080p.
> 
> One thing people seem to forget is that even though these new consoles are not on par with PCs, they are pretty far ahead of the last gen. We are talking about rendering at 900p instead of 1080p. When you play on a TV, that really isn't that serious.


It is, though. Spending £450-£500, or $500, whatever your currency is, on a console that can manage the PC equivalent of 1600x900 graphics at 30fps isn't good. It isn't good because that cap is the cap we have to bear on PC, because consoles hold PCs back.

The new consoles could easily run 4K @ 30fps; it's just that it would most likely be 4K Frogger.


----------



## ZealotKi11er

If the PS3/Xbox 360 were so amazing, how come Crysis 1 was not developed for them?


----------



## PostalTwinkie

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> http://www.ebay.com/itm/381012642309


That was a limited release for servers, and a company had to buy something around 1,000 of them to get any. They weren't mass-produced as far as I am aware.

Although their existence does mean I was technically wrong, so I shall fall back on the "mass" part of it!


----------



## Mygaffer

I don't believe anything Ubisoft says. A lot of developers have had a lot more than a 1-2 fps difference between the PS4 and the Xbone. Of course I just game on PC, so I can ignore these underpowered consoles that Microsoft and Sony released. The only thing I do buy from what this guy is selling is that these consoles are already obsolete.


----------



## PostalTwinkie

I am just curious.

For all the people who think developers are going to pull more performance out of this hardware (the Jaguar cores and a GCN-based GPU, products that have been in the wild for a long time):

Where exactly do you expect them to find this massive hidden power? Where do you expect them to squeeze out extra performance? How do you see them doing it?

Everyone is acting like the developers are working with some new exotic hardware they have to learn, like this is some shocking new environment they have to navigate. It isn't, so stop acting like it is! This is lower-end PC hardware using DirectX/Direct3D 11 in an x86 environment. This isn't new for them!

Oh, and before someone points to the original XBox, I already handled that several pages back - no need to reference it again, as it only supports what is being said by myself and others.

EDIT:

Additionally, you are all forgetting something. Major developers get their hands on new consoles WELL before the consumer. In some cases we are talking a year or more ahead of the consumer, so they can become familiar with it. Usually this happens shortly after the final specifications are decided; they get a dev console.

So here we are, a year after release, and two to three years since Devs first had hands on. Our console life cycle starts much later than that of the developers themselves.


----------



## Dyson Poindexter

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I am just curious.
> 
> Where exactly do you expect them to find this massive hidden power? Where do you expect them to squeeze out extra performance? How do you see them doing it?


The cloud, duh!

In all seriousness, it's just lame excuses from people who think an AMD CPU is as cryptic as the Cell processor in a PS3.


----------



## DizzlePro

Why not drop the resolution to 720p and cap the fps to 24? It looks more cinematic that way.


----------



## KenjiS

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> It was also a BD player when standalone BD players cost $1000.


(First off, LONG post up ahead, Ye be warned..)

And Sony lost something like $300 or so on each one sold. Keep in mind a Blu-ray drive BY ITSELF was over $500 at that time... Sony wanted the PS3 to push Blu-ray adoption the way the PS2 pushed DVDs... This fact actually plays into a point I'm going to make.

Anyway, my take on this: I'm not a programmer or anything, but I do genuinely feel this console generation has been a bit wimpy compared to the last... and there's one simple reason for it. Sony and Microsoft took YEARS to get the 360 and PS3 hardware somewhat profitable, and game sales were barely keeping the books afloat. This time around, Sony/MS went with the hardware decisions they made for one simple reason: cost.

Both the Xbox One and PS4 cost less to make than they sell for at retail, meaning they don't need to rely as heavily on game sales to recoup the cost; at worst, it's a wash (not including R&D). Something in the neighborhood of $380 to build a PS4 and $475 to build an Xbox One (including $75 for the Kinect). So yeah, not really costing either company much, especially considering you need PS Plus or Xbox Live for many features (which make both companies a lot of money).

Regardless, I'm going to keep things simple and do some comparisons with the last generation, specifically the 360.

The Xbox 360 used a GPU called "Xenos," which was roughly based on the X1800 GPUs. However, some aspects of its design were WAY ahead of that: it had parts of the TeraScale microarchitecture that was later brought to the PC on the 2000-series GPUs. It had a unified shader architecture, something the X1800 lacked and AMD didn't offer on the desktop until the HD 2900 in early *2007*. So in many ways, the 360 was ahead of the curve in graphics features to start with.

Now, even if it was "merely" an X1800-esque chip, that chip was only brought to market in late 2005... or *around exactly the same time the Xbox 360 was launched*, and it was a flagship product, top of the range; this was a $600 GPU back then, too. This was a card that handled Far Cry at 1600x1200 and could still squeak out a 60fps framerate. *In other words, this was a serious GPU for its time.* The CPU was a triple-core derivative of the PowerPC 970MP used in the Power Mac G5, which launched, you guessed it, at the end of 2005. So seriously, the 360 was using some top-shelf components in its day.

The Xbox 360 lasted so long because it kept the resolution to, at most, 720p in most cases, and this was fine! This was back when HDTVs were new, most of them were 1080i or 720p at best, and we didn't have much HD content around to compare to. The quality of the early HDTVs pales next to a decent modern LCD or plasma in many ways. Seriously. It took years before the 360's limits became apparent because of this... and because it used crème de la crème hardware, it could deliver a graphically satisfying gaming experience until the end, even with its limitations.

Now let's compare. The Xbox One uses something in the neighborhood of a 7790; to be more specific, it appears to basically be a downclocked R7 260 (which is based on the 7790 architecture, matches the One's core count, and is effectively the same card as the 7790 from a performance perspective). So if we go by architecture release date, we're dealing with a card released in early 2013. OK, fair enough, but we're not dealing with a top-end architecture here; we're dealing with an entry-level card that retails for $120... Big difference...

Now let me pull up some benchmarks real quick... OK, all last-generation games, but it should come as no surprise to anyone here that an R7 260 barely hits 30fps at 1080p High in Metro: Last Light. Crysis 3? 47fps at 1080p and Medium settings... How about Battlefield 4? 39fps at High quality settings. Yes, on the Xbox One you have the advantage of direct access to hardware, but you're also down a good 150 or so MHz on the core clock, which WILL negatively impact performance! Not to mention the desktop PC part is using GDDR5 and the Xbox One isn't... There's only so much "direct access to hardware" can accomplish here, folks...

A quick little side note: the PS4 is using a more powerful 7850, but it's not really a HUGE improvement: 36fps in Last Light, 53 in Crysis 3, and BF4 gets you 53fps... So roughly 20-25% across the board, but look at the numbers. You're still not getting a good STABLE 60fps out of these puppies...

(Another side note: I wanted numbers for Tomb Raider, but I could only find them for the 260X and the 7870, which are both faster. Still, you're looking at 45fps and 61fps respectively at 1080p, which explains what I heard about the Definitive Edition of Tomb Raider on the PS4 dropping frames in certain scenes, and why the One can't do 1080p/60 on it... To give even more perspective, the 260X can't even handle unmodded Skyrim at a solid 1080p and 60fps, and Bioshock Infinite at Ultra gets 34. Seriously, folks... that's a GPU faster than what's in the Xbox One, and it's not managing THAT much.)
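Worth noting that the per-game gap isn't actually uniform; plugging the framerates quoted above into a quick sketch shows the spread:

```python
# 7850 (PS4-class) vs. R7 260 (Xbox One-class) framerates from the post above.
pairs = {
    "Metro: Last Light": (30, 36),
    "Crysis 3": (47, 53),
    "Battlefield 4": (39, 53),
}
for game, (r260, hd7850) in pairs.items():
    # Metro +20%, Crysis 3 +13%, BF4 +36%: not a flat 20-25%.
    print(f"{game}: +{hd7850 / r260 - 1:.0%}")
```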

Regardless, let's talk about the CPU in the One for a bit. I do not know AMD CPUs very well, but as far as I can tell it's essentially two Athlon 5150s running side by side... OK, so let's look up the Athlon 5150, remove its integrated GPU from the equation, and check some benchmarks against other CPUs in gaming situations, picking a CPU-intensive game. Hmm... I know, how about Company of Heroes 2, running at 1080p with a GTX 770?

The 5150 gets 12fps. 12. Seriously. For comparison, an A10-7850K gets 37, and an i3-4330 gets 43fps. Even the 5350 (same architecture clocked at 2GHz) only manages 16fps. I picked this game because it has complicated physics effects and AI, i.e., the two things Ubi claims are bottlenecking performance.

But that's one game, right... So let's go for some others. Battlefield 4, for instance: 28fps for the 5150, while the i3-4330 is at 59fps... Sleeping Dogs, 23fps vs 56fps... Bioshock Infinite is 47 vs 91... It's not just one game; it's across the board, folks.

And while yes, that's just ONE 5150, and "theoretically" two is twice as fast, in reality there's ALWAYS a reason that's not true.

To pull some synthetics out and throw them on the table: in the Cinebench R10 multithread test the 5150 scores 5,508, while a *Core 2 Quad Q9650 included in the test scored 12,983*. Even if you consider the BEST CASE scenario of the One's processor DOUBLING the 5150, you're looking at a score of 11,016, a shortfall of 1,967. *And this is against a CPU released 7 years ago.* For comparison, my i7-2600K in the same test scores a whopping 22,875.
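That best-case doubling is trivial to check (and real multi-chip setups never actually reach perfect linear scaling anyway):

```python
# Cinebench R10 multithread scores quoted above.
athlon_5150 = 5508
q9650 = 12983
best_case = athlon_5150 * 2   # perfect linear scaling, never reached in practice
print(best_case)              # 11016
print(q9650 - best_case)      # 1967: still short of a 2008-era quad
```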

No matter how you slice it, the Athlon 5150 isn't that powerful a CPU; in most of the synthetics it tends to get its rear kicked by the Atom...

So why the 5150? Simply put: power and heat. Microsoft didn't want a repeat of the 360 RROD episodes. Also, I think AMD offered a very good price on the package. I believe the EU also has some bug crammed up there about power consumption levels or something, which may ALSO have driven this decision...

*So how does this all add up? It's simple, really: we're expecting a CPU that's not that powerful, mated to a GPU that's not that powerful, to push graphics forward, when in all reality the combo can barely handle last gen's best-looking games. Seriously, look at the benchmarks; I'm not making stuff up here.*

Optimization can only go so far, folks. You can strip an Eldorado to its bare bones, but it's still not going to win a race against a 458 Italia... I'm not saying it won't help, but it's also not MAGICALLY going to double your framerates! Keep in mind most of the parts I mentioned above are also clocked higher and running GDDR5, which the Xbox One is NOT.

And also, the PS4 is simply not that much faster than the One, not in real-world practical terms anyway. What do I mean by this? I mean that in the same benchmarks it's not QUITE fast enough to achieve a solid 60fps average, and thus in the real world it is NOT any faster than the One, because most developers are going to VSync-lock the framerate for smoothness... The extra power afforded by the extra cores and GDDR5 can, however, be used to make games look better on the PS4, albeit slightly.

Now look, folks, Ubi has said a lot of bunk over the years, but in this case I think they might *genuinely* be telling the truth. By all metrics this "next generation" of consoles cut too many corners in the performance department to be truly next-gen. IMHO we needed to see something like a 7970-level GPU in them and a beefier AMD CPU (someone who knows AMD better, help me out with a suggestion here?). It might have cost MS or Sony $600-650 per console to do this, but they could have gotten up on stage and shown us some truly solid next-gen performance from the start.

In the interest of full disclosure on where I pulled my numbers and data: I used AnandTech and Tom's Hardware. Here are all my article links:

X1800:

http://www.tomshardware.com/reviews/ati-enters-x1000-promised-land,1140-16.html

R7 260:

http://www.anandtech.com/show/7754/the-amd-radeon-r7-265-r7-260-review-feat-sapphire-asus

R7 260X and 7870:

http://www.tomshardware.com/reviews/radeon-r9-280x-r9-270x-r7-260x,3635.html

Athlon 5150:
http://www.anandtech.com/show/8067/amd-am1-kabini-part-2-athlon-53505150-and-sempron-38502650-tested


----------



## lacrossewacker

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I am just curious.
> 
> For all the people that think developers are going to pull performance out of this hardware, the Jaguar core and GCN based GPU, products that have been in the wild for a long time.
> 
> Where exactly do you expect them to find this massive hidden power? Where do you expect them to squeeze out extra performance? How do you see them doing it?
> 
> Everyone is acting like the developers are working with some new exotic hardware that they have to learn, that this is some new shocking environment they have to navigate. It isn't, stop acting like it is! This is lower end PC hardware using DirectX/Direct3D 11, in an x86 environment. This isn't new for them!
> 
> Oh, and before someone points to the original XBox, I already handled that several pages back - no need to reference it again, as it only supports what is being said by myself and others.
> 
> EDIT:
> 
> Additionally, you are all forgetting something. Major developers get their hands on new consoles WELL before the consumer. In some cases we are talking a year or more ahead of the consumer, so they can become familiar with it. Usually this happens shortly after the final specifications are decided; they get a dev console.
> 
> So here we are, a year after release, and two to three years since Devs first had hands on. Our console life cycle starts much later than that of the developers themselves.


Even though the hardware has been out in the PC market, when has it ever been tailored to? Software/game developers don't test their software on each CPU or GPU; they code to the API. EA's highly threaded BF4 is able to spread out across 8 cores, but it hasn't been coded to take DIRECT ADVANTAGE OF AN FX-8350 and ONLY an FX-8350.

It's not like each GPU and CPU on the PC market is ever truly explored - not to the extent of consoles.

That's how. x86 is familiar, but it hasn't been exploited. Just the amount of control given to the developers will benefit any architecture right off the bat.

Also, these developers have not had the access you're talking about to these consoles for very long. The launch titles... those were just built in approximate environments. That's closer to current PC game development than it is to console development.

So once they take advantage of x86 and GCN (as well as the many other integrated features; the sound processor comes to mind) we'll continue to see the usual "breakthroughs" in rendering. Killzone's MP, for example, wasn't 1080p, but for the longest time people thought it was! The scaling technique was so impressive... it was nearly indistinguishable from native 1080p, and it took months for people to find out. KZ's multiplayer is actually running at *HALF* the resolution of 1080p.
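The half-resolution figure checks out; the widely reported multiplayer render target was 960x1080, i.e. half the horizontal samples of a full 1080p frame (a quick sketch):

```python
# Killzone Shadow Fall MP: 960x1080 rendered per frame vs. native 1920x1080.
half = 960 * 1080    # 1,036,800 pixels actually rendered each frame
full = 1920 * 1080   # 2,073,600 pixels in a native 1080p frame
print(half / full)   # 0.5: exactly half the pixels per frame
```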

Just a thought.


----------



## Serandur

Quote:


> Originally Posted by *ZealotKi11er*
> 
> For them to lose money would mean they'd have to push the hardware for 5+ years to make up the loss. They don't make as much money as you think. $50 for online? That is nothing.


"Losing $50 a console means long life to make up cost, can't be done"

And

"$50 a year for a service Steam do better for free and online access each year is nothing"

In the same post? Contradiction? If all that isn't enough for them, maybe the console model really is as good as dead because it's just flat-out pay more for less if they can't even take a small hardware cost hit initially while getting to charge for online, yet expect to be the center and focus of game development for years.

5+ years, huh? I bet they do that anyway (we'll see within only ~3 years), and regardless of whether they do or not, I'd rather have those 4-6 or however many years not involve development targets which include outdated netbook cores. Multiplatform AI and core design improvements would be nice, these consoles aren't cutting it. Frankly, if they included modern performance CPUs of any level, they'd be golden for years at the rate Intel are going. But they didn't.


----------



## KenjiS

Quote:


> Originally Posted by *lacrossewacker*
> 
> Also, these developers have not had the access you're talking about to these consoles for very long. The launch titles...those were just built in approximate environments. That's closer to the current PC gaming development than it is console development.


Not new; the early 360 dev kits were Power Mac G5s. Seriously.


----------



## Jedi Mind Trick

Quote:


> Originally Posted by *KenjiS*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dyson Poindexter*
> 
> It was also a BD player when standalone BD players cost $1000.
> 
> 
> 
> (First off, LONG post up ahead, Ye be warned..)
> 
> Snip
Click to expand...

I see what you are saying, but there is a huge difference between the 4 and the One. Sure, the CPU may be holding the 4 back more than the One, but to say that a DDR3 7790 is close to a GDDR5 7850 is crazy!!!

The difference between a DDR3 and a GDDR5 7750 (a card worse than both of these) is pretty big. I'm on my phone or I would post some benchies.


----------



## PostalTwinkie

Quote:


> Originally Posted by *lacrossewacker*
> 
> Even though the hardware has been out in the PC market, when have they ever been tailored to? Software/Game developers don't test out their software on each CPU or GPU. They code to the API. EA's highly threaded BF4 is able to spread out across 8 cores, but it hasn't been coded to take DIRECT ADVANTAGE OF A FX 8350 and ONLY A FX 8350.
> 
> It's not like each GPU and CPU on the PC market is ever truly explored - not to the extent of consoles.
> 
> That's how. x86 is familiar, but it hasn't be exploited. Just the amount of control given to the developers will be benefit any architecture right off the bat.
> 
> Also, these developers have not had the access you're talking about to these consoles for very long. The launch titles...those were just built in approximate environments. That's closer to the current PC gaming development than it is console development.
> 
> So once they take advantage of x86 and GCN (as well as the many other integrated features - Sound processor comes to mind) we'll continue to see the usual "breakthroughs" in rendering. Killzone's MP for example wasn't 1080p, but for the longest time people thought it was! It was just the scaling technique was soo impressive....it was nearly indistinguishable from native 1080p and took months for people to find out. KZ's multiplayer is actually running at *HALF* the resolution of 1080p
> 
> Just a thought.


The new consoles are a custom OS running on top of DirectX 11 and an OpenGL-"like" API (GNM and GNMX), environments that have been in the wild for years now; again, this isn't something new.

It is older hardware on older APIs, a scenario we haven't actually seen previously in consoles. The original XBox used the very new, and very different, DirectX 8, with a very new and very different hardware set in the Pentium III *Coppermine*. So developers did indeed have a learning curve there, and that showed towards the end of the console. The 360 used very custom hardware on a well known API, so they had a learning curve there in just hardware capabilities.

The new consoles? Again, nope! Older hardware architecture on an older API that these guys have a lot of time with. There isn't much to be learned here, maybe some optimization for the one hardware set they are working with, but that is it. It will not be a generational leap in performance like we have seen in the past, the hardware isn't capable of it.

Go dig through this very forum to about 4 months prior to the console release. Many of us, and I mean many, called this very scenario back then.


----------



## sugarhell

^OpenGL? Haha. No, just no. The PS4 uses GNM. Completely different


----------



## PostalTwinkie

Quote:


> Originally Posted by *sugarhell*
> 
> ^OpenGL? Haha. No, just no. The PS4 uses GNM. Completely different


You are correct, Sony pushed it as "Like" OpenGL - I wouldn't be surprised if it were a very custom flavor. I corrected my original statement.


----------



## KenjiS

Quote:


> Originally Posted by *PostalTwinkie*
> 
> You are correct, Sony pushed it as "Like" OpenGL - I wouldn't be surprised if it were a very custom flavor. I corrected my original statement.


Well, OpenGL is open, right? It makes some sense for Sony to use parts of it, as they don't have an existing API to leverage (i.e., MS has DirectX)


----------



## bucdan

I think it's time for Sony and MS to up the clocks on the cpu and gpu and sacrifice some heat for it. I read somewhere that the cpus were clocked at 1.6ghz for the ps4 and 1.76 for the xbone? I think they can pull off 2.2ghz stable without too much heat (considering the Athlon 5350). And maybe they can even unlock another core for gaming and extra ram for gaming, so it would be 7 cores and 7 gigs of ram free for the games and 1 core and 1 gig of ram for the background tasks.

I think Sony could pull it off, since they don't run any background tasks, unlike Microsoft with their in-window TV watching or Skype, etc. Oh, can't forget a couple of bumps up in clocks for the GPU too. Ideas?

If this ever does happen in the near future, and past games allow it, maybe we could be getting 45 fps or so if the fps is unlockable like Infamous Second Son.
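The back-of-envelope math on that clock bump can be sketched with an Amdahl-style split between CPU-bound and GPU-bound frame time. The function and figures below are illustrative assumptions, not measurements:

```python
# Hypothetical estimate of the fps gain from a CPU clock bump.
# Only the CPU-bound fraction of each frame scales with clock speed;
# the rest (GPU work, memory stalls) is assumed unaffected.
def bumped_fps(base_fps, base_clock_ghz, new_clock_ghz, cpu_bound_fraction=1.0):
    speedup = new_clock_ghz / base_clock_ghz
    frame_time = 1.0 / base_fps
    new_frame_time = frame_time * ((1.0 - cpu_bound_fraction)
                                   + cpu_bound_fraction / speedup)
    return 1.0 / new_frame_time

# 30 fps at 1.6 GHz bumped to 2.2 GHz, fully CPU-bound (best case): ~41 fps
print(bumped_fps(30, 1.6, 2.2))
# Same bump if only half the frame time is CPU work: ~35 fps
print(bumped_fps(30, 1.6, 2.2, cpu_bound_fraction=0.5))
```

Even in the fully CPU-bound best case, a 1.6 to 2.2 GHz bump turns a 30 fps baseline into roughly 41 fps, so hitting 45 fps would need GPU-side headroom as well.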


----------



## PostalTwinkie

Quote:


> Originally Posted by *bucdan*
> 
> I think it's time for Sony and MS to up the clocks on the cpu and gpu and sacrifice some heat for it. I read somewhere that the cpus were clocked at 1.6ghz for the ps4 and 1.76 for the xbone? I think they can pull off 2.2ghz stable without too much heat (considering the Athlon 5350). And maybe they can even unlock another core for gaming and extra ram for gaming, so it would be 7 cores and 7 gigs of ram free for the games and 1 core and 1 gig of ram for the background tasks.
> 
> I think sony could pull it off since they don't run any background tasks unlike Microsoft with their in-window tv watching or Skype, etc. Oh, cant forget a couple bumps up in clocks for the gpu too. Ideas?
> 
> If this ever does happen in the near future, and past games allow it, maybe we could be getting 45 fps or so if the fps is unlockable like Infamous Second Son.


I actually expect Sony and Microsoft to bump clocks on their hardware just a little, and to trim down the OS to free up more resources for the games themselves. We've effectively seen this sort of move once from Microsoft already, with the Kinect change.


----------



## maarten12100

Quote:


> Originally Posted by *KenjiS*
> 
> Not new, the early 360 devkits were Power Mac G5s. Seriously


Wait, you mean I can use my G5 as an Xbox 360? Is this distro available to the general public?


----------



## ZealotKi11er

Quote:


> Originally Posted by *bucdan*
> 
> I think it's time for Sony and MS to up the clocks on the cpu and gpu and sacrifice some heat for it. I read somewhere that the cpus were clocked at 1.6ghz for the ps4 and 1.76 for the xbone? I think they can pull off 2.2ghz stable without too much heat (considering the Athlon 5350). And maybe they can even unlock another core for gaming and extra ram for gaming, so it would be 7 cores and 7 gigs of ram free for the games and 1 core and 1 gig of ram for the background tasks.
> 
> I think sony could pull it off since they don't run any background tasks unlike Microsoft with their in-window tv watching or Skype, etc. Oh, cant forget a couple bumps up in clocks for the gpu too. Ideas?
> 
> If this ever does happen in the near future, and past games allow it, maybe we could be getting 45 fps or so if the fps is unlockable like Infamous Second Son.


The Xbox can probably do it. The PS4, not so much, because of the size of its cooling.


----------



## KenjiS

Quote:


> Originally Posted by *maarten12100*
> 
> Wait you mean I can use my G5 as Xbox 360 is this distro available to the general public?


http://www.engadget.com/2005/04/21/is-this-mac-g5-the-xbox-360-dev-environment-box/

(Scroll down on this one)

http://www.anandtech.com/show/1686/5

As for actually getting that working? Probably not - they were running a custom Windows NT kernel, to my knowledge.

Basically the point is the "PC simulating the console" thing isn't really "new" lol


----------



## mtcn77

Quote:


> Originally Posted by *DizzlePro*
> 
> why not drop the resolution to 720p & cap the fps to 24? it looks more cinematic that way


+1.
I hate seeing bots that can change directions on the base of their heel when I'm trying to strafe and avoid them. Just spoils the illusion.


----------



## PostalTwinkie

Quote:


> Originally Posted by *DizzlePro*
> 
> why not drop the resolution to 720p & cap the fps to 24? it looks more cinematic that way


Quote:


> Originally Posted by *mtcn77*
> 
> +1.
> I hate seeing bots that can change directions on the base of their heel when I'm trying to strafe and avoid them. Just spoils the illusion.


You are also forgetting the gallons of Motion Blur we need to add as well, to make it more cinematic.

Oh, maybe we can get Michael Bay in on it, have some fancy explosions, and lens flares?


----------



## Dyson Poindexter

Quote:


> Originally Posted by *bucdan*
> 
> I think it's time for Sony and MS to up the clocks on the cpu and gpu and sacrifice some heat for it. I read somewhere that the cpus were clocked at 1.6ghz for the ps4 and 1.76 for the xbone? I think they can pull off 2.2ghz stable without too much heat (considering the Athlon 5350). And maybe they can even unlock another core for gaming and extra ram for gaming, so it would be 7 cores and 7 gigs of ram free for the games and 1 core and 1 gig of ram for the background tasks.


And what happens to the (likely) thousands of consoles that can't handle the higher clocks/heat? Do they just brick and be done with it?

Are we seriously stooping this low? It's like an automaker promising a 5-second 0-60 on a new turbocharged car, then finding out the cars are struggling and can only do it in 8 seconds. So do they remotely crank up the boost and just replace everyone's engines when the pistons melt? Would it not have made better sense to just build a proper machine in the first place?

If the APUs in the consoles could easily do 2.2 GHz, don't you think MS/Sony would have just set them there to begin with and enjoyed the better performance? They might even be able to reach a whole 950p!


----------



## omari79

Quote:


> Originally Posted by *Master__Shake*
> 
> the technological improvements of pc hardware in the last gens life span made pc gaming FAR superior to anything you could buy in some 500 dollar box that was cobbled together.
> 
> i am positive you could build a 500 dollar computer using the free preview of windows 10 and get superior performance than some rickety old console.


You are probably right, but I should've paid more attention to the article... this is Ubi talking, so I'd take anything they say with a grain of salt.


----------



## rustySteel

To any of you who think that developers won't be able to make better, higher resolution games that run at higher frame-rates than currently available on either the Xbone or PS4, forgive me for saying this but you really need to apply some critical thinking skills before you comment on this subject.

Let's do a quick thought experiment shall we?

Take a group of fifteen developers (who are already as familiar as you guys claim they can be with x86, GCN, HSA, unified memory, etc.) whose only goal is to figure out how to code their way to the above metrics, and let them have at it for five years, trying various new tricks, models, and resource-combining methods in order to achieve their goal. Basically, whatever they can come up with, they try. And then we measure their performance once per year for the duration of the experiment.

Objectively, does anyone in this thread believe that their first year efforts would be no better than their second, third, fourth, and fifth year efforts? Please put on your critical thinking caps before you answer.

This isn't about hitting a wall; this is about the fact that there might very well not be one. You can research an architecture for 1,000 years getting no new results, and in year 1,001 you might get a breakthrough.

This is about the power of abstraction and new ways of thinking about old problems. Or just plain new ways of thinking.


----------



## Dyson Poindexter

Quote:


> Originally Posted by *rustySteel*
> 
> To any of you who think that developers won't be able to make better, higher resolution games that run at higher frame-rates than currently available on either the Xbone or PS4, forgive me for saying this but you really need to apply some critical thinking skills before you comment on this subject.
> 
> Let's do a quick thought experiment shall we?
> 
> Take a group of fifteen developers (who are already as familiar as you guys claim they can be with x86, GCN, HSA, unified memory, etc.) whose only goal is to figure out how to code their way to the above metrics and let them *have at it for five years*, trying various new tricks, models, and resource-combining methods in order to achieve their goal. Basically, whatever they can come up with, they try. And then we measure their performance once per year for the duration of the experiment.
> 
> Objectively, *does anyone in this thread believe that their first year efforts would be no better than their second, third, fourth, and fifth year efforts*? Please put on your critical thinking caps before you answer.
> 
> This isn't about hitting a wall; this is about the fact that there might very well not be one. You can research an architecture for 1,000 years getting no new results, and in year 1,001 you might get a breakthrough.
> 
> This is about the power of abstraction and new ways of thinking about old problems. Or just plain new ways of thinking.


x86 has been around for like 20 years. If someone pulls a miracle out of their butt after 20 years of optimizing for the arch., I'm going to be impressed.

Besides, in those 5 years spent figuring out how to make a low-end APU provide decent performance, real PCs will wipe the floor with consoles so hard it won't even be funny.


----------



## lacrossewacker

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The new consoles are a custom OS running on top of DirectX 11 on one side and the OpenGL-"like" GNM and GNMX APIs on the other - APIs that have been in the wild for years now. Again, this isn't something new.
> 
> It is older hardware on older APIs, a scenario we haven't actually seen previously in consoles. The original XBox used the very new, and very different, DirectX 8, with a very new and very different hardware set in the Pentium III *Coppermine*. So developers did indeed have a learning curve there, and that showed towards the end of the console. The 360 used very custom hardware on a well known API, so they had a learning curve there in just hardware capabilities.
> 
> The new consoles? Again, nope! Older hardware architecture on an older API that these guys have a lot of time with. There isn't much to be learned here, maybe some optimization for the one hardware set they are working with, but that is it. It will not be a generational leap in performance like we have seen in the past, the hardware isn't capable of it.
> 
> Go dig through this very forum to about 4 months prior to the console release. Many of us, and I mean many, called this very scenario back then.


Well, let's wait and see DX11 completely utilized then (and DX12), since PC gamers have continuously complained about how MS never really pushed their DX


----------



## jameschisholm

They could bump the clocks and send out an upgrade kit which includes console deconstruction instructions and a guide on how to attach an AIO watercooler for the CPU. Keep 'em frosty!


----------



## Pip Boy

Quote:


> Originally Posted by *KenjiS*
> 
> (First off, LONG post up ahead, Ye be warned..)
> 
> And Sony lost something like $300 or so on each one sold, Keep in mind a Blu-Ray drive BY ITSELF was over $500 at that time... Sony wanted the PS3 to push Blu-Ray adoption numbers like the PS2 pushed DVDs... This fact actually plays into a point im going to make..
> 
> Anyways my take on this.. I'm not a programmer or anything but I do genuinely feel this console generation has been a bit wimpy compared to the last... and theres one simple reason for this, Sony and Microsoft took YEARS to get the 360 and PS3 hardware somewhat profitable, Game sales were barely keeping the books afloat, This time around Sony/MS went with the hardware decisions they made for one simple reason: Cost
> 
> Both the Xbox One and PS4 cost less to make than they sell for at retail, Meaning they dont need to rely as heavily on game sales to recoup the cost, at worst, its a wash (Not including R&D), Something in the neighborhood of $380 to build a PS4 and $475 to build an Xbox One (Including the $75 for the Kinect).. So yeah.. not really costing either company much especially considering you need PS Plus or Xbox Live for many features (Which make both companies a lot of money)
> 
> Regardless, I'm going to keep things simple and do some comparisons with the last generation, Specifically the 360..
> 
> The Xbox 360 used a GPU called "Xenos" which was roughly based on the X1800 GPUs, However, some aspects of its design were WAY ahead of that, namely it had parts of the Terascale microarchitecture which was later brought into the PC world on the 2000-series GPUs... It had the capability for unified shader architecture, something that the X1800 lacked and AMD didn't offer until the HD2900 in early *2007* So in many ways, the 360 was a bit ahead of its curve in terms of graphics features to start with.
> 
> Now, even if it was "merely" an X1800-esque chip, that chip was only brought to market in late 2005... Or, *around exactly the same time the Xbox 360 was launched*, and it was a flagship product, top of the range; this was a $600 GPU back then too.. This was a card that handled Far Cry at 1600x1200 and could still squeak out a 60fps framerate. *In other words, this was a serious GPU for its time*. The CPU was a triple-core derivative of the PowerPC 970MP used in the PowerMac G5 that launched, you guessed it, at the end of 2005. So seriously, the 360 was using some top-shelf components in its day.
> 
> The Xbox 360 lasted so long because they kept the resolution to, at most, 720p in most cases, And this was fine! This was back at a time when HDTVs were new, Most of them were 1080i or 720p at best, and we didnt have much HD content around to compare to. the quality of the early HDTVs pales to a modern decent LCD or Plasma in many ways, Seriously. It took years before the 360's limits were getting apparent because of this... and because it used creme de la creme hardware it could push and deliver a graphically satisfying gaming experience until the end, even with its limitations.
> 
> Now lets compare, The Xbox One uses something in the neighborhood of a 7790, or to be more specific, it appears to basically be an R7 260 thats been downclocked(which is based on the 7790 architechture, but matches the One's core count and is effectively the same card as the 7790 from a performance perspective) So if we go by architechture release date, we're dealing with a card released early 2013, Ok fair enough, but we're not dealing with a top end architecture here, We're dealing with an entry level card that retails for $120... Big difference...
> 
> Now let me pull up some benchmarks real quick.. Ok all last generation games but it should come as no surprise to anyone here that an R7 260 barely hits 30fps at 1080p High in Metro: Last Light.. Crysis 3? 47 fps at 1080p and Medium settings.... How about Battlefield 4? 39 at High quality settings. Yes, you have the advantage on the Xbox One of direct access to hardware, but you're also down a good 150 or so mhz of clockspeed on the core which WILL negatively impact performance! Not to mention the desktop PC part is using GDDR5, the Xbox One isnt.... Theres only so much "Direct access to hardware" can accomplish here folks...
> 
> Quick little side note, the PS4 is using a more powerful 7850 but its not really a HUGE improvement 36fps in Last Light, 53 in Crysis 3, and BF4 gets you 53fps... So roughly 20-25% across the board, but look at the numbers, You're still not getting a good STABLE 60fps out of these puppies...
> 
> (Another side note, I wanted numbers for Tomb Raider but I could only find them for the 260X and the 7870, Which are both faster, Still you're looking at 45fps and 61fps respectively at 1080p, Explains what I heard about the Definitive Edition of Tomb Raider on the PS4 having some framerate dropping in certain scenes and why the One cant do 1080p/60 on it... To give even more perspective, the 260X cant even handle unmodded Skyrim at a solid 1080p and 60fps. Bioshock Infinite at Ultra gets 34... Seriously folks...Thats a GPU faster than whats in the Xbox One and its not managing THAT much)
> 
> Regardless, Lets talk the CPU in the One for a bit, I do not know AMD CPUs very well but as far as I can tell its essentially Two Athlon 5150s running side by side... Ok, So lets look up the Athlon 5150 and remove its dedicated GPU from the equation, lets check some benchmarks vs other CPUs in gaming situations and pick a CPU-intensive game, Hmn.. I know, How about Company of Heroes 2, running at 1080p with a GTX 770?
> 
> The 5150 gets 12fps. 12. Seriously. For comparison a A10-7850k gets 37, and an i3-4330 gets 43fps. Even the 5350 (Same architechture clocked at 2ghz) is managing 16fps. I picked this game because it has complicated physics effects and AI, IE, two things Ubi is claiming is bottlenecking performance.
> 
> But thats one game right... So lets go for some others.. Battlefield 4 for instance... 28fps for the 5150, i3-4330 is at 59fps... Sleeping Dogs, 23fps vs 56fps... Bioshock Infinite is 47 vs 91... Its not just one game, its across the board folks.
> 
> And while yes, thats just ONE 5150 and "theoretically" two is twice as fast in reality theres ALWAYS a reason thats not true.
> 
> To pull some Synthetics out and throw them on the table, Cinebench R10 Multithread test scores the 5150 at 5508, a *Core 2 Quad Q9650 included in the test scored 12,983* Even if you consider BEST CASE scenario on the One's processor DOUBLING the 5150, You're looking at a score of 11,016; or a difference of 1,967 *And this is against a CPU released 7 Years Ago* For comparison my i7-2600k in the same test scores a whopping 22,875
> 
> No matter how you slice it, the Athlon 5150 isnt that powerful of a CPU, in most of the synthetics it tends to get its rear kicked by the Atom...
> 
> So why the 5150? Because simply put, power and heat consumption, Microsoft didn't want a repeat of the 360 RROD episodes. Also I think AMD offered a very good price on the package. I believe the EU also has some bug crammed up there on power consumption levels or something which may ALSO have driven this decision...
> 
> *So how does this all add up? Its simple really, We're expecting a CPU thats not that powerful mated to a GPU thats not that powerful to push graphics forward when the combo in all reality can barely handle last gen's best looking games. Seriously, Look at the benchmarks, I'm not making stuff up here*
> 
> Optimization can only go so far folks, You can strip an Eldorado to its bare bones but its still not going to win in a race against a 458 Italia... I'm not saying it wont help but its also not MAGICALLY going to double your framerates! Keep in mind most of the parts I mentioned above are also clocked higher and running GDDR5, which the Xbox One is NOT.
> 
> And also the PS4 is just simply put, not that much faster than the One, Not in real world practical terms anyways, What do I mean by this? I mean in the same benchmarks achieving solid average 60fps framerates, its not QUITE fast enough to do that and thus in the real world it is NOT any faster than the One because most developers are going to VSync lock the framerate for smoothness... The extra power afforded by the extra cores and using GDDR5 can however be used to make games look better on the PS4, albeit slightly
> 
> Now look folks, Ubi has said a lot of bunk over the years, but in this case I think they might *genuinely* be telling the truth, By all metrics this "next generation" of consoles cut too many corners in the performance department to be truly next gen, IMHO we needed to see a 7970 or so level GPU in them and something like a beefier AMD CPU (someone who knows AMD better help me out with a suggestion here?) It might have cost MS or Sony $600-650 per console to do this but they could have gotten up on stage and showed us some true solid next gen performance from the start.
> 
> In interest of full disclosure of where I pulled my numbers and data I used Anandtech and Toms Hardware, Heres all my article links for you guys:
> 
> X1800:
> 
> http://www.tomshardware.com/reviews/ati-enters-x1000-promised-land,1140-16.html
> 
> R7 260:
> 
> http://www.anandtech.com/show/7754/the-amd-radeon-r7-265-r7-260-review-feat-sapphire-asus
> 
> R7 260X and 7870:
> 
> http://www.tomshardware.com/reviews/radeon-r9-280x-r9-270x-r7-260x,3635.html
> 
> Athlon 5150:
> http://www.anandtech.com/show/8067/amd-am1-kabini-part-2-athlon-53505150-and-sempron-38502650-tested


thanks for taking the time to write this. The next-gen consoles aren't weak compared to the last generation, but they are weak compared to current-generation technology. That said, is it all about graphics? Some games on the Wii U look fantastic and have a great aesthetic - good enough for most casuals, and also good enough to convey the artist's intention.

The PC will always be much stronger. Even now there is a UDK video demonstrating a kind of ray-tracing technique, we are on the brink of 4K, and we run 144 FPS games natively. Soon VR will be here, and it will be a much better experience on PC - with higher-resolution 1440p/4K screens, better frame rates, and actually decent graphics, without the extreme dumbing-down console VR will have to accept due to needing 60 fps per eye at 720p.

said video
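As a quick sanity check, the doubling arithmetic in the Cinebench comparison quoted above works out (scores as quoted; perfect two-chip scaling is assumed as the best case):

```python
# Cinebench R10 multithread scores as quoted above.
athlon_5150 = 5508    # single Athlon 5150
q9650 = 12983         # Core 2 Quad Q9650, the 7-year-old comparison point
i7_2600k = 22875      # quoted desktop i7 score

# Best case: assume two 5150s scale perfectly (real scaling would be worse).
doubled = athlon_5150 * 2
print(doubled)             # 11016
print(q9650 - doubled)     # 1967 -- still behind the old quad core
print(i7_2600k / doubled)  # a desktop i7 is roughly double even the best case
```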


----------



## mtcn77

Quote:


> Originally Posted by *PostalTwinkie*
> 
> You are also forgetting the gallons of Motion Blur we need to add as well, to make it more cinematic.
> 
> Oh, maybe we can get Michael Bay in on it, have some fancy explosions, and lens flares?


I actually turn off anisotropic filtering. You are still utilising the same z-depth/stencil sampling resources through antialiasing; it is just my conclusion that at higher sample complexities, the "blur" is not worth the colour depth that has to be present alongside it for it to be fruitful.
I'd rather have sparkly colours that resemble the sprites of old games than a perfect dissolution of grey shades. It is quite ironic, in my opinion, how that is the ultimate Achilles' heel of LCDs, and how CRT monitors would be better suited to that black & white visual experience. The LCD experience is all about brightness & bright colours, imo.


----------



## PostalTwinkie

Quote:


> Originally Posted by *rustySteel*
> 
> To any of you who think that developers won't be able to make better, higher resolution games that run at higher frame-rates than currently available on either the Xbone or PS4, forgive me for saying this but you really need to apply some critical thinking skills before you comment on this subject.
> 
> Let's do a quick thought experiment shall we?
> 
> Take a group of fifteen developers (who are already as familiar as you guys claim they can be with x86, GCN, HSA, unified memory, etc.) whose only goal is to figure out how to code their way to the above metrics, and let them have at it for five years, trying various new tricks, models, and resource-combining methods in order to achieve their goal. Basically, whatever they can come up with, they try. And then we measure their performance once per year for the duration of the experiment.
> 
> Objectively, does anyone in this thread believe that their first year efforts would be no better than their second, third, fourth, and fifth year efforts? Please put on your critical thinking caps before you answer.
> 
> This isn't about hitting a wall; this is about the fact that there might very well not be one. You can research an architecture for 1,000 years getting no new results, and in year 1,001 you might get a breakthrough.
> 
> This is about the power of abstraction and new ways of thinking about old problems. Or just plain new ways of thinking.


First post, tries to be a big boy.....

Every point you have attempted to make here has already been put out to pasture in this thread. There isn't some magical level of performance that GCN is capable of that is suddenly going to be discovered.

Any further improvements in performance are going to come from changes in clock speeds, pushed out via a system update, and some manipulation of known development techniques. These gains will not be massive like we have seen with previous generations where the developer had to learn entirely new hardware and software.
Quote:


> Originally Posted by *lacrossewacker*
> 
> Well let's wait and see DX11 completely utilized then (and DX12) since PC gamers have continuously complained about how MS never really pushed their DX


The hope is that DX 12 will bring significant performance increases, especially to the XB1.

Supposedly there is future support for OpenGL - pure OpenGL - coming to the PS4.


----------



## twitchyzero

what a surprise... new console CPUs are weak, and Ubi's trying to put hundreds of NPCs on screen.

This claim that we've tapped the consoles to their full extent was already made by many devs early on in the PS3/360 era... then you see the work done by Naughty Dog and the ICE team, and you'll laugh at all those previous attempts at squeezing every last drop out of the consoles' limited hardware.

This is what gets me so excited for Uncharted 4 and future ND titles... if they can achieve this PhysX-like quality on the PS3's silly hardware, I can't wait to see what they have in store for us on the PS4.

and I'm willing to bet they'll be able to hit 60 fps on most titles... if not, at the very least they'll keep it at 1080p/30, which is plenty for the third-person action-adventure games they make.


----------



## KenjiS

Quote:


> Originally Posted by *rustySteel*
> 
> To any of you who think that developers won't be able to make better, higher resolution games that run at higher frame-rates than currently available on either the Xbone or PS4, forgive me for saying this but you really need to apply some critical thinking skills before you comment on this subject.
> 
> -snip-
> .


Because everything has limits. In general, a new generation of game hardware should not need this level of optimization simply to surpass the last generation - unless it is handicapped from the start. Which it is, due to Microsoft and Sony having to sell the cost per unit to their respective boards of directors.

Saying "oh, in 4-5 years it will be great" does not change the fact that RIGHT NOW the hardware is barely able to give us what we got with the last generation. Of course, if we let them at it and let them explore, eventually we might get something better.

Eventually. Maybe. Someday. If X meets Y, the stars align, and enough programmers are sacrificed to Windows ME. Those are key phrases, and also not set in stone.

The issue is simply time - how much longer the console market will put up with this obfuscation and deception from large developers who are trying to cover MS's and Sony's six o'clocks by making up excuses about things being "more cinematic", "taking an artistic direction", or "giving it a more film-like feel" instead of coming out and saying it bluntly: the new consoles are underpowered. Simple as that.

Again, I direct you to the post I just made comparing relative hardware strengths. At launch the Xbox 360 was essentially a high-end, top-of-the-line gaming PC, with performance and visuals to match. It could simply brute-force things and deliver a gaming experience that rivaled a high-end gaming PC of that period; it used a GPU that retailed for almost the same price as the entire console. From the gate there was no question: this was more powerful than the Xbox by a large margin.

The Xbox One, on the other hand, is nothing of the sort. It's a massaged $300 computer, nothing more, using outdated parts that were incredibly low-end to begin with. The PS4 is only a bit better off. Both consoles suffer from a lack of power, and while yes, there's optimization to be had, I believe most of us doubt whether optimization is enough to compensate for the hardware deficiencies of the platform.

In my opinion, it's not. The best we'll see in a year or two is likely 1080p 30 fps out of both consoles, some slightly better visuals, and likely fewer frame drops - but probably not much extra in terms of texture quality or anything else. There's only ONE possible ace up the sleeve that MS has potentially built into the Xbox One, but that's a theory so far.

The Order: 1886 looks amazing graphically, but it's also running 2.35:1 letterboxed - so about 800p - and I seem to remember discussion of a 24 fps framerate on some game that's coming soon...

Optimization, programming tricks, etc. increase the longevity of the platform while technology advances, so things continue to look good down the line until the next console comes out. Out of the gate you shouldn't need to rely on these tricks to make a game look better than last gen. Eventually these tricks hit a limit; everything has a limit. The Xbox 360 and PS3 hit theirs years ago, for instance.

Today is absolute and tomorrow may never come. For all we know, next year both MS and Sony could axe both consoles in favor of a fresh start. Arguing what might or could happen doesn't change the fact that right now both consoles are choking; they cannot deliver the improvements in graphical fidelity most folks want.

As for increasing the clock speed... I don't know about that. If everything has been engineered to cost, there may not be much leeway in many of the systems for bumping the clocks. The last thing MS or Sony would want is one of them catching fire and a lawsuit on their hands... or bricking a bunch of them. Either would be a PR problem.


----------



## lacrossewacker

Quote:


> Originally Posted by *phill1978*
> 
> we also are on the brink of 4k and we also run 144FPS games natively.


lol

Maybe with 3x980Ti's, medium graphics, and a 5960X


----------



## KyadCK

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> x86 has been around for like 20 years. If someone pulls a miracle out of their butt after 20 years of optimizing for the arch., I'm going to be impressed.
> 
> Besides, in those 5 years spent figuring out how to make a low-end APU provide decent performance, real PCs will wipe the floor with consoles so hard it won't even be funny.


Write exclusively for a 486 and see how well it runs on Vishera or Haswell compared to something written directly for those archs, I dare you.

"x86" has been around for 20+ years, but that doesn't mean Jaguar has, let alone with a unified memory structure among other toys. x86 is a concept, like RISC, not a set-in-stone design. If it were, you would get non-existent performance improvements from each generation, not nearly non-existent ones.


----------



## KenjiS

Quote:


> Originally Posted by *phill1978*


Graphics are not everything, no. But if we start talking about other improvements, things like having a large number of NPCs on screen, well, that pushes the CPU harder.

More difficult AI? Again, going to push the CPU even harder

More realistic physics? CPU

Funny you mention Voxel Cone Ray Tracing, however. One thing I did not mention is there's a theory out there that the Xbox One *is capable of doing it*

How, if it's so weak? Well, simply put, it's the ESRAM:

http://www.ign.com/boards/threads/so-could-the-x1s-secret-sauce-be-voxel-cone-ray-tracing.453362563/

That article sums it up, but essentially the ESRAM design seems almost perfect for holding partially resident textures, which would remove one of the big hardware issues holding back Voxel Cone Ray Tracing (namely, the textures are MASSIVE). Using partially resident textures you could compress around 3 GB of textures to fit into 16 MB of RAM, or in the One's case its 32 MB of ESRAM... so you could theoretically cram 6 GB worth of textures in there...
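The partially resident texture idea can be sketched in miniature: keep only a small pool of tiles resident in fast memory and page the rest in on demand. Below is a toy illustration under assumed numbers (64 KB tiles, a 32 MB ESRAM-sized pool, LRU eviction); it is not any real console or graphics API:

```python
# Toy sketch of partially resident textures (PRT): only a small pool of
# tiles is resident in fast memory, a page table tracks which tiles those
# are, and sampling a non-resident tile streams it in, evicting the
# least-recently-used tile. Sizes are illustrative assumptions.
from collections import OrderedDict

TILE_BYTES = 64 * 1024          # assumed PRT tile size
POOL_BYTES = 32 * 1024 * 1024   # the Xbox One's 32 MB of ESRAM

class TilePool:
    def __init__(self, pool_bytes=POOL_BYTES):
        self.capacity = pool_bytes // TILE_BYTES  # resident tiles that fit
        self.resident = OrderedDict()             # tile id -> tile data
        self.misses = 0                           # streaming events

    def touch(self, tile_id):
        """Sample a tile, streaming it in (with LRU eviction) on a miss."""
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)    # mark most recently used
            return True                           # hit: no streaming needed
        self.misses += 1
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)     # evict least recently used
        self.resident[tile_id] = object()         # stand-in for tile data
        return False                              # miss: tile was streamed in

pool = TilePool()
print(pool.capacity)  # 512 tiles fit in a 32 MB pool
```

A multi-gigabyte virtual texture therefore only ever needs its currently visible tiles resident at once, which is the property the ESRAM theory leans on.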

Hey, is anyone else seeing a little trend relating to this in PC titles lately?









So to be clear, if this is correct then *yes, the Xbox One could get a huge jump in graphical fidelity in a year or two*. HOWEVER, it's prediction and speculation and might not even be true.

And to clarify my earlier stance, I'm not saying some slight improvements are not possible. But without some form of new tech, like Voxel Cone Ray Tracing, I don't feel the hardware has enough power to begin with to truly advance graphical fidelity to the next level, and my stance is more "the hardware shouldn't be counting on NEEDING such things to advance graphical fidelity in the short term," as betting on future promise is sometimes a bad decision.

I'd also say that if the above is true, MS and Sony are stupid for not coming out with some kind of demo to say "Look, this isn't ready yet, but here's what we will be able to do on these things in a year or two."


----------



## rustySteel

Quote:


> *PostalTwinkie:* and some manipulation of known development techniques.


I see you missed the point entirely.


----------



## PostalTwinkie

Quote:


> Originally Posted by *rustySteel*
> 
> I see you missed the point entirely.


No, you clearly missed it...

The techniques would be lowering the resolution, the number of entities on screen, additional effects, etc.

The current hardware in the newest consoles isn't going to be capable of rendering large numbers of NPCs along with things like advanced AI, or really any large number of CPU-intensive tasks. They might eventually reach something near 60 FPS at 1080p, but it is going to be at the cost of other wanted aspects of a title.

In other words, we didn't gain anything with these new consoles over the last. Merely traded a few things for others.


----------



## SpeedyVT

Consoles are not mainstream computers and are not constrained by the same restrictions, even if the hardware is physically the same. Because the hardware is all identical, developers can code specifically to it, which taps potential in a way a PC can't be tapped.


----------



## DiNet

So is anyone apart from Ubisoft saying this?


----------



## maarten12100

Quote:


> Originally Posted by *KenjiS*
> 
> http://www.engadget.com/2005/04/21/is-this-mac-g5-the-xbox-360-dev-environment-box/
> 
> (Scroll down on this one)
> 
> http://www.anandtech.com/show/1686/5
> 
> As for actually getting that working, probably not; they were running a custom Win NT kernel to my knowledge
> 
> Basically the point is the "PC Simulating the console" thing isnt really "new" lol


Too bad would've been a really good use for a machine I rarely use.


----------



## ZealotKi11er

Quote:


> Originally Posted by *SpeedyVT*
> 
> Consoles are not mainstream computers and not constrained to the same restrictions even if physically the same in restriction.


That's what I find shocking. How did they push these consoles to their limits so fast? The performance they are achieving with both these consoles is within like 10-15% of a PC with similar hardware, and that does not take into account down-to-the-metal optimization. I mean, you can play most PC games @ 1080p with 4GB of RAM and 2GB of vRAM; these consoles have 8GB. They were able to make games work on the PS3 with only 512MB of total memory, whereas the same game on PC required 2GB+.


----------



## rustySteel

Quote:


> Originally Posted by *PostalTwinkie*
> 
> No, you clearly missed it...
> 
> The techniques would be to lower resolution, and lower the entities on screen, additional effects, etc.
> 
> The current hardware in the newest consoles isn't going to be capable of rendering large numbers of NPC, along with things like advanced AI, or really any large number of CPU intensive tasks. They might eventually reach something near 60 FPS at 1080P, but it is going to be at the cost of other wanted aspects of a title.
> 
> In other words, we didn't gain anything with these new consoles over the last. Merely traded a few things for others.


Your inability to think outside the box is showing. As an example, read the following article:

http://www.eurogamer.net/articles/digitalfoundry-force-unleashed-60fps-tech-article

There will always be new ways to abstract and combine resources to create things you wouldn't have thought were possible from dusty tech; you just need a fresh approach and a clever head. The fact that you insist this type of thing is impossible is actually kind of sad. You need to dream a little bigger, darling.


----------



## PostalTwinkie

Quote:


> Originally Posted by *DiNet*
> 
> So is anyone apart from Ubisoft saying this?


Performance concerns have been expressed several times in the past by developers, this is far from new.

In fact.....

Linky

This entire article further shows how much of a problem the performance has been. The Kinect change to free up 10% more GPU was done for the developers, and requires the developers to make use of it. Very telling, especially in regard to this conversation....

Quote:


> *EuroGamer:* And will this mean more games will hit the 1080p 60fps benchmark that's all the rage these days, I asked?
> 
> *Microsoft:* "Xbox One games look beautiful and have rich gameplay and platform features. How developers choose to access the extra GPU performance for their games will be up to them. We have started working with a number of developers on how they can best take advantage of these changes. We will have more to share in the future."


Microsoft completely dodging the question, because they know they have hardware issues.
Quote:


> Originally Posted by *rustySteel*
> 
> Your inability to think outside the box is showing. As an example, read the following article:
> 
> http://www.eurogamer.net/articles/digitalfoundry-force-unleashed-60fps-tech-article
> 
> There will always be new ways to abstract and combine resources to create things you wouldn't have thought were possible from dusty tech, you just need a fresh approach and a clever head. That fact that you insist this type of thing is impossible is actually kind of sad. You need to dream a little bigger darling.












That was on the 360, which for its time had very high-end hardware in it. It is also a title that is 4 years old! Do you not see the problem with what you are saying? You are being forced to use a 4-year-old title on a 7-year-old system to try and prove your point about a current-generation console.

That is how bad it is!

Again, the 360 used very high-end hardware for its time, and that is why you saw it last so long. That isn't the case anymore; the newer consoles are using very low-end hardware.

Also, that was a tech demo and very controlled, and not the norm.


----------



## maarten12100

Quote:


> Originally Posted by *KyadCK*
> 
> Write exclusively for a 486 and see how well it runs on Vishera or Haswell compared to something written directly for those archs, I dare you.
> 
> "x86" has been around for 20+ years, that doesn't mean Jaguar has, let alone with a unified memory structure among other toys. x86 is a concept, like RISC, not a set-in-stone design. If it was, you would get non-existent performance improvements from each generation, not nearly non-existent ones.


Exactly this. x86 isn't really even x86 internally anymore; modern CPUs decode the x86 instructions into a faster internal micro-op format and execute that.

Also, the Xbox One and PS4 can do 1080p in pretty much all titles, since an HD 7850 with a weak CPU can do it too on an unoptimized platform. But 720p with medium settings may look better than 1080p at low. Resolution is not a measure of how good something is going to look.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thats what i find shocking. How did they push these consoles to their limit so fast? From the performance they are achieving with both these consoles in like 10-15% difference from PC with similar hardware. I mean you can play most PC games @ 1080p with 4GB of RAM and 2GB vRAM. These consoles have 8GB and that does not take into account dont to metal optimization. They where able to make games work in PS3 with only 512MB overall memory where as the same game in PC required 2GB +.


Ubisoft has horrible devs, and devs make a trade-off to get the best visual quality with the available grunt; thus the scenario where 720p at medium can be better than 1080p at low.


----------



## KenjiS

Quote:


> Originally Posted by *rustySteel*
> 
> Your inability to think outside the box is showing. As an example, read the following article:
> 
> http://www.eurogamer.net/articles/digitalfoundry-force-unleashed-60fps-tech-article
> 
> There will always be new ways to abstract and combine resources to create things you wouldn't have thought were possible from dusty tech, you just need a fresh approach and a clever head. That fact that you insist this type of thing is impossible is actually kind of sad. You need to dream a little bigger darling.


That's not running at 60fps, that's "60fps*". It's not using 60 discrete frames per second, thus it is not 60fps.

As stated in the article, it works like a TV's motion interpolation, meaning I'd imagine there'd be some weird artifacts if you changed direction or something, and it's not as smooth as proper 60fps...

Is it clever? Sure. But it's not really making the game run better. Think of it as one of those Fieros that's been turned into a replica 308... looks like the same thing, but it isn't.
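The interpolation trick being described, synthesizing in-between frames from motion vectors rather than rendering them, can be sketched like this. It is a deliberately tiny toy (frames as dicts of pixel coordinates, with a hypothetical `interpolate_frame` helper), not the actual implementation from the article:

```python
# Minimal sketch of motion-compensated frame interpolation: render at
# 30 fps, then synthesize the in-between frames by moving each pixel a
# fraction of the way along its motion vector, much like a TV's motion
# interpolation. Frames here are dicts of (x, y) -> color for clarity.

def interpolate_frame(prev_frame, motion_vectors, t=0.5):
    """Build an in-between frame by advancing each pixel a fraction t
    along its per-pixel motion vector (defaulting to no motion)."""
    mid = {}
    for (x, y), color in prev_frame.items():
        dx, dy = motion_vectors.get((x, y), (0, 0))
        # halfway (t=0.5) toward the pixel's position in the next frame
        mid[(round(x + dx * t), round(y + dy * t))] = color
    return mid

frame_a = {(0, 0): "red", (4, 0): "blue"}
motion  = {(0, 0): (2, 0)}            # red pixel moving right 2 px/frame
half = interpolate_frame(frame_a, motion)
print(half)  # red advances to (1, 0); blue stays at (4, 0)
```

The artifacts mentioned above show up exactly where the motion vectors are wrong (occlusions, sudden direction changes), because there is no real rendered data for the synthesized frame.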


----------



## PostalTwinkie

Quote:


> Originally Posted by *maarten12100*
> 
> Exactly this. x86 isn't even really what is is at anymore everything is translated to a faster internal standard and eventually converted back to x86 talk again.
> 
> Also the xbox 1 and PS4 can do 1080P in pretty much all titles as a HD7850 with a weak cpu can do it too on an unoptimized platform. But 720 with medium settings may be better looking that 1080P at low. Resolution is not a measurement for how good something is going to look.


You are correct, which is precisely the problem. The consoles were pushed from the get go as being able to bring PC gaming experiences to the living room. Being able to bring "cutting edge" visual effects, with "more" things going on.

They weren't sold as "Hey, we can do 1080/60, but need to give up this, this, and this.".

With previous consoles, as they matured in the wild, we saw better-looking games at more stable frame rates and even higher resolutions. Basically, the last generation of consoles brought gains to pretty much every aspect of the platform. That isn't the case with the new hardware; there are tradeoffs being made, and unlike previous generations, you have developers speaking out with concern.


----------



## ZealotKi11er

Quote:


> Originally Posted by *maarten12100*
> 
> Exactly this. x86 isn't even really what is is at anymore everything is translated to a faster internal standard and eventually converted back to x86 talk again.
> 
> Also the xbox 1 and PS4 can do 1080P in pretty much all titles as a HD7850 with a weak cpu can do it too on an unoptimized platform. But 720 with medium settings may be better looking that 1080P at low. Resolution is not a measurement for how good something is going to look.
> Ubisoft has horrible devs and devs make trade of to get the best visual quality with the available grunt thus 720P with medium can be better than 1080P at low scenario.


That's the thing. If a PC with a "weak" CPU and an HD 7850 can do 1080p, then why can't these consoles always do it, and do it better?

The way I see it, developers have to come up with clever ways to make their games look better and have unique ideas. Throwing more performance at it does nothing, I mean nothing. Look at Crysis 1, for example, and compare it to a game today. Today's games have got nothing on it, even though a PC from 2007 is 5-10x slower than a PC now.


----------



## rustySteel

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Are you so dense? That was on the 360, which for its time, had very high end hardware in it. It is also on a title that is 4 years old! Do you not see the problem with what you are saying? You are being forced to use a 4 year old title on a 7 year old system to try and prove your point about a current generation console.
> 
> That is how bad it is!
> 
> Again, the 360 used very high end hardware for its time, and that is why you seen it last so long. That isn't the case anymore, the newer consoles are using very low end hardware.


Uhhh, are you serious? What difference does it make how high or low end something is? Ten years ago the ps4 would have been considered high end, does that mean that only then - when it was still high end - you would have been able to come up with new programming methods to make higher quality games for it, but that now, since it's viewed as low end, you can't?

What are you talking about? Whether hardware is high or low end is subjective based on the date you're viewing it from. That subjectivity has no effect on emerging programming models and their efficacy.


----------



## Dromihetes

Of course they are hitting a performance wall; they are consoles.
Developers will need to decide whether PCs deserve better, or whether they want to keep PC gaming held back for years as they did with previous console generations.
Stop preordering games for PC.


----------



## PostalTwinkie

Quote:


> Originally Posted by *rustySteel*
> 
> Uhhh, are you serious? What difference does it make how high or low end something is? Ten years ago the ps4 would have been considered high end, does that mean that only then - when it was still high end - you would have been able to come up with new programming methods to make higher quality games for it, but that now, since it's viewed as low end, you can't?
> 
> What are you talking about? Whether hardware is high or low end is subjective based on the date you're viewing it from. That subjectivity has no effect on emerging programming models and their efficacy.


Let's make it really simple....

The performance potential of the old consoles was, naturally, much higher relative to their release than what we have now. They were simply higher-end machines, so there was room for developers to grow significantly, and not just due to the hardware aspect. That same level of hidden potential doesn't exist in the newest consoles. To obtain 1080/60 they are going to have to make exceptions in other areas.

Secondly: your link, which has been destroyed already, was a very limited tech demo of a way of simulating 60 FPS. It *was not* truly running at 60 FPS, which is clearly stated in the article you linked!

"Andreev and his colleagues have devised a system that gives an uncanny illusion of true 60FPS"


----------



## KenjiS

Quote:


> Originally Posted by *PostalTwinkie*
> 
> You are correct, which is precisely the problem. The consoles were pushed from the get go as being able to bring PC gaming experiences to the living room. Being able to bring "cutting edge" visual effects, with "more" things going on.
> 
> They weren't sold as "Hey, we can do 1080/60, but need to give up this, this, and this.".
> 
> With previous consoles we seen, as they were in the wild, better looking games at more stable frame rates and even higher resolutions, from the previous consoles. Basically the last generation of consoles brought gains to pretty much every aspect of the platform. That isn't the case with the new hardware, there are tradeoffs being made, and unlike previous generations, you have developers speaking out with concern.


One area where I'd say the x86 architecture offers a key, distinct advantage: if MS or Sony want to push out a new console, then as long as they base it around x86 and GCN, theoretically everything should port over easily, making a new console cheaper to develop. They could carry developer momentum forward onto more powerful hardware and even patch older titles to run at higher framerates on it. They could release consoles the way other companies release smartphones and tablets. But then again, at that point consumers might start wondering why not just buy a PC...

I'd ALSO say that in theory one could argue this gives MS/Sony the ability to push their services onto the PC and have Xbox and PlayStation become competition to Steam, with distinct game stores and features... The scariest part with Microsoft is that they could, theoretically, block Steam from being installed on a future version of Windows, *forcing* you to use Xbox instead. But that also seems like a good way of opening up a can of antitrust accusations...

Graphics should follow the old adage "show, don't tell." Companies are pushing the 1080p numbers and framerates to the forefront because the games' graphical improvements aren't large enough compared to the last generation to show off on their own...

Also, the recent accusations of MS and Sony getting developers to sabotage games on PC add weight to the idea of MS/Sony being concerned over the growth of the PC market.


----------



## Master__Shake

Quote:


> Originally Posted by *PostalTwinkie*
> 
> *You are also forgetting the gallons of Motion Blur we need to add as well, to make it more cinematic.*
> 
> Oh, maybe we can get Michael Bay in on it, have some fancy explosions, and lens flares?


blah!

motion blur should die in a fire.


----------



## rustySteel

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Let's make it really simple....
> 
> The performance potential of the old consoles was, naturally, much higher than what we have now relative to their release. They were simply higher end machines, so there was room for developers to grow significantly, and not just due to the hardware aspect. That same level of hidden potential doesn't exist in the newest consoles. To obtain 1080/60 they are going to have to make exceptions in other areas.
> 
> Secondly; your link, which as been destroyed already, was a very limited tech demo of a way of simulating 60 FPS. It *was not* truly running at 60 FPS, which is clearly stated in the article you linked!
> 
> Andreev and his colleagues have devised a system that gives an uncanny illusion of true 60FPS


I never said it was 60fps, the point of the link was to illustrate a new coding paradigm that hadn't been considered before in order to show you that there are other ways of thinking about achieving quality that you're not considering.

To obtain 1080/60 they are going to have to make exceptions in other areas? Do you have a crystal ball? (If you do, I'd like to know the lottery numbers for tonight's drawing if that's cool. If not, you should probably stop talking now because you're making a fool of yourself.)

Btw, how did you destroy that link? I'd really like to know.


----------



## DiNet

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Performance concerns have been expressed several times in the past by developers, this is far from new.
> 
> In fact.....
> 
> Linky
> 
> This entire article further show show much of a problem the performance has been. The Kinect change to free up 10% more GPU for developers was done for the developers, and requires the developers to use it. Very telling, especially in regard to this conversation....
> Microsoft completely dodging the question, because they know they have hardware issues.


Oh, didn't expect this so soon-ish. The Xbox doesn't seem like a good example; I did read they had tons of "issues" from the very beginning.
This does seem like an extremely major issue if these consoles are going for the same 7-year life span... it means we'll hear a lot of "we are limited by consoles, that's why you got what you got on PC."








Microsoft's latest products fail, so it's obvious they'll dodge all questions regarding bad performance from their products. If you look at the timeline, the only product after 2010-ish that is actually good is Office 2013.
The percentage also isn't reassuring. Up to 10%? How does that translate to performance? 10% better like 970 vs 980 performance? Like a 3fps difference?
Doesn't the PS4 outsell the Xbox by a large margin? That one I know.


----------



## KenjiS

Quote:


> Originally Posted by *DiNet*
> 
> The percentage also isn't reassuring. Up to 10%? How it translates to performance? 10% better like 970 vs 980 performance? Like 3fps difference?
> Doesn't PS4 outsell xbox by large? That one I know


And the PS4 is more like 20-35% faster depending... so even the 10% didn't buy much.

What I'd love to see is Nintendo say "OK guys, forget it, gloves off" and throw out a console powered by an i3 and Maxwell on steroids. Show off a next-gen Metroid Prime and a Star Fox game as launch titles. Give it a solid, proper controller; build a good online system. Put it out there at $500 and offer developers aggressive incentives to develop for it.

Why? Because keep in mind the gaming demographic is more diverse and aging. I started when I was 3, and I'm 26 now. I've been around; been doing the PC thing since 1996 or so. A lot of us, PC gamers or not, grew up with Nintendo, and I'd personally LOVE to see some proper, good Nintendo games come out that don't revolve around gimmicky accessories or flailing around in front of my TV. Right there is your market.


----------



## Artikbot

Quote:


> Originally Posted by *sugarhell*
> 
> Just one word. Ubisoft.


This man gets it.


----------



## PostalTwinkie

Quote:


> Originally Posted by *rustySteel*
> 
> Btw, how did you destroy that link? I'd really like to know.


I stated they would have to make exceptions to obtain the level of performance they claimed they could during the build up to the launch. Specifically.....

Quote:


> Originally Posted by *PostalTwinkie*
> 
> *
> The techniques would be to lower resolution, and lower the entities on screen, additional effects, etc.*
> 
> The current hardware in the newest consoles isn't going to be capable of rendering large numbers of NPC, along with things like advanced AI, or really *any large number of CPU intensive tasks.* They might eventually reach something near 60 FPS at 1080P, but it is going to be at the cost of other wanted aspects of a title.
> 
> In other words, we didn't gain anything with these new consoles over the last. Merely traded a few things for others.


You then came in with your 60 FPS link, which actually proved exactly what I said - developers are going to have to make concessions to obtain the level of performance these consoles were sold on. In the case of the article you linked, using motion blur techniques.

EDIT:

What was done with the 360 was impressive. The at-launch titles compared to the end-of-life titles were night and day; no one can ever take that away. However, those consoles had a lot of potential to be tapped, because there was a steep learning curve for the developers. Again, it is about potential here....

There is nothing about the latest consoles to indicate they are going to have the same depth as the previous generation, mainly because the hardware isn't anything new or exotic.

Will titles improve over the life of the console? Of course they will; no one has argued against that. The argument comes in with "how much?". Some of you believe some magical well of performance is hidden below the surface of these consoles, even though nothing indicates that at all.

Will future updates to the console software make a difference? I hope to hell so, mainly looking at the XB1 and the arrival of DX12. Then again, one could argue that DX11 on it is already to the metal, and DX12 won't bring that much.


----------



## Dyson Poindexter

Quote:


> Originally Posted by *rustySteel*
> 
> I never said it was 60fps, the point of the link was to illustrate a new coding paradigm that hadn't been considered before in order to show you that there are other ways of thinking about achieving quality that you're not considering.
> 
> To obtain 1080/60 they are going to have to make exceptions in other areas? Do you have a crystal ball? (If you do, I'd like to know the lottery numbers for tonight's drawing if that's cool. If not, you should probably stop talking now because you're making a fool of yourself.)
> 
> Btw, how did you destroy that link? I'd really like to know.


Troll account is troll. Joined today to spew console shill garbage.

Go ahead and explain some more how a 7770 with a 1.6 GHz CPU is suddenly going to become a monster performer, and how PC devs all these years didn't know how to optimize x86 and GCN.


----------



## aroc91

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> The cloud, duh!
> 
> In all seriousness, it's just lame excuses from people who think an AMD CPU is as cryptic as the Cell processor in a PS3.


Speaking of the cloud, whatever happened to that? I haven't heard anything since the launch.


----------



## DiNet

Quote:


> Originally Posted by *KenjiS*
> 
> And the PS4 is more like 20-35% depending... So even the 10% didnt buy much
> 
> What I'd love to see is Nintendo say "Ok guys, Forget it, Gloves off" and throw out a console powered by an i3 and Maxwell on steroids. Show off a next gen Metroid Prime and Starfox game as launch titles. Give a solid proper controller, Build a good online system. Put it out there at $500 and offer developers aggresive incentives to develop for it.
> 
> Why? Because keep in mind the gaming demographic is more diverse and aging, I started when I was 3, I'm 26 now. I've been around. Been doing the PC thing since 1996 or so. A lot of us, PC gamers or no grew up with Nintendo, I'd personally LOVE to see some proper good Nintendo games come out that dont revolve around gimmicky accessories or flailing around in front of my TV. Right there is your market.


I took apart a Mac mini today, an i7 with 4GB, late 2011 or something like that... it doesn't really look hard for any of those companies to build one around an i7; the Mac mini is slightly smaller than a Wii U. So it seems it's something they chose not to do because of the cost. I'd say a "modular" console, where you can add or upgrade your GPU, would be more beneficial; a CPU will last/outlast GPUs for many years. So an i3 build isn't really the answer to this.
It doesn't really matter what games or platforms you grew up on. My first game was on a cassette, played on a console whose name I don't even remember. With Doom and Quake on PCs I completely forgot about consoles, and if I'm remembering correctly the last console I owned was a Sega and/or maybe a PS1. I do remember playing a lot of Sonic, just don't remember if it was my console or borrowed.
PC master race since forever


----------



## BizzareRide

Quote:


> Originally Posted by *prescotter*
> 
> Yea they do, take a look here: http://forum.beyond3d.com/showthread.php?t=46241
> 
> Example of CoD resolutions:
> Call of Duty: Black Ops = 960x544 (2xAA)
> Call of Duty: Black Ops 2 = 880x720 (dynamic? Instances of 832x624..., post-AA)
> Call of Duty: Ghosts = 928x600 ? (no AA? broken?)
> Call of Duty: Modern Warfare = 1024x600 (2xAA)
> Call of Duty: Modern Warfare 2 = 1024x600 (2xAA)
> Call of Duty: Modern Warfare 3 = 1024x600 (2xAA)


Wow, bad post. There are over 3000 retail games for the Xbox 360 and the overwhelming majority were 720p... and all of those CoD titles were 60fps.

Quote:


> Originally Posted by *DiNet*
> 
> I took apart macmini today. i7 with 4gb, late 2011 or something like that... doesn't really look hard for any of those companies to build one around i7... mac mini is slightly smaller that wii u. So, it looks like it's not something they don't want to do, because of the cost. I'd say it would be more beneficial for "modular" console, where you can add or upgrade your GPU. cpu will last/outlast gpu's for many years. So i3 build isn't really an answer to this.
> It doesn't really matter on what "games" or platforms you grew. My first game was on a casette played on console that I don't even remember the name. With doom and quake on PC's I completely forgot about consoles, and if I'm remembering correctly last console I owned was sega and/or maybe ps1. I do remember playing a lot of Sonic, just don't remember if it was my console or borrowed.
> PC master race since forever


The tray cost of the cheapest i7 is $280... far too costly for a console, even when bought in bulk quantities.


----------



## mtcn77

You guys know that to utilize the full potential of the HD 5870's 2.72 teraflops you would need 2.7 TERABYTES/s of bandwidth, right? That is the best-case representation currently feasible until HBM, imo.
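The ratio implied here is arithmetic intensity (bytes moved per FLOP). A quick back-of-envelope check, assuming a worst case of 1 byte per FLOP (an assumption, since real workloads vary) and the HD 5870's roughly 154 GB/s of memory bandwidth:

```python
# Back-of-envelope check of the claim: feeding 2.72 TFLOPS at a worst-case
# arithmetic intensity of 1 byte per FLOP would need about 2.72 TB/s of
# bandwidth. The HD 5870 actually ships with ~0.154 TB/s, so kernels must
# reuse data heavily (many FLOPs per byte) to approach peak compute.

peak_tflops = 2.72          # HD 5870 single-precision peak
bytes_per_flop = 1.0        # assumed worst case: stream every operand
hd5870_bw_tbs = 0.154       # ~154 GB/s of GDDR5 bandwidth

needed_bw_tbs = peak_tflops * bytes_per_flop
reuse_needed = needed_bw_tbs / hd5870_bw_tbs  # FLOPs required per byte fetched

print(needed_bw_tbs)        # 2.72 TB/s needed in the no-reuse case
print(round(reuse_needed))  # ~18x data reuse to stay compute-bound
```

This is the same reasoning behind roofline-style analysis: whether a chip is compute-bound or bandwidth-bound depends on how many operations it performs per byte it fetches.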


----------



## rustySteel

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I stated they would have to make exceptions to obtain the level of performance they claimed they could during the build up to the launch. Specifically.....
> You then came in with your 60 FPS link, which actually proved exactly what I said - developers are going to have to make concessions to obtain the level of performance these consoles were sold on. In the case of the article you linked, using motion blur techniques.
> 
> EDIT:
> 
> What was done with the 360 was impressive. The at launch titles compared to the at end of life titles were night a day, no one can ever take that away. However, those consoles had a lot of potential to be tapped, because there was that steep learning curve for the developers. Again, it is about potential here....
> 
> There is nothing with the latest consoles to indicate they are going to have the same depth to them as the previous generation. Mainly from the fact that the hardware isn't something new or exotic.
> 
> Will titles improve over the life of the console? Of course they will, no one has argued against that. The argument comes in with "how much?". Some of you believe some magical well of performance is hidden below the surface of the consoles, even though nothing indicates that at all.
> 
> Will future updates to the software on the consoles make a difference? I hope to hell so, mainly looking at the XB1 and its arrival of DX 12. Then again, one would argue that the DX 11 on it is already to the metal, and DX 12 won't bring that much to it.


I think we've been talking past each other. The link I sent wasn't meant to be about fps or resolution or the Xbox; it was meant to point to new thinking in game development, and how new methods are constantly being found to achieve performance/quality goals on limited hardware. In the article the guy saw HDTV interpolation, and voila, a new method for emulating 60fps. This is the kind of zeitgeist progression I'm referring to.

My apologies for coming off as a jerk; I think we just misunderstood each other. At the very least, the next few years in the console space will be interesting to watch.

Regards


----------



## KyadCK

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rustySteel*
> 
> I never said it was 60fps, the point of the link was to illustrate a new coding paradigm that hadn't been considered before in order to show you that there are other ways of thinking about achieving quality that you're not considering.
> 
> To obtain 1080/60 they are going to have to make exceptions in other areas? Do you have a crystal ball? (If you do, I'd like to know the lottery numbers for tonight's drawing if that's cool. If not, you should probably stop talking now because you're making a fool of yourself.)
> 
> Btw, how did you destroy that link? I'd really like to know.
> 
> 
> 
> Troll account is troll. Joined today to spew console shill garbage.
> 
> Go ahead and explain some more how a 7770 with a 1.6 GHz CPU is suddenly going to become a monster performer, and how PC devs all these years didn't know how to optimize x86 and GCN.

You seem stuck on this "x86" thing. Let's try again, since you ignored me.
Quote:


> Originally Posted by *KyadCK*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dyson Poindexter*
> 
> x86 has been around for like 20 years. If someone pulls a miracle out of their butt after 20 years of optimizing for the arch., I'm going to be impressed.
> 
> Besides, in those 5 years spent figuring out how to make a low-end APU provide decent performance, real PCs will wipe the floor with consoles so hard it won't even be funny.
> 
> 
> 
> Write exclusively for a 486 and see how well it runs on Vishera or Haswell compared to something written directly for those archs, I dare you.
> 
> "x86" has been around for 20+ years; that doesn't mean Jaguar has, let alone with a unified memory structure among other toys. x86 is a concept, like RISC, not a set-in-stone design. If it were, you would get non-existent performance improvements from each generation, not nearly non-existent ones.


----------



## Dyson Poindexter

Quote:


> Originally Posted by *KyadCK*
> 
> You seem stuck on this "x86" thing. Let's try again, since you ignored me.


Ok, replace 2 decades with 2 years for "modern instruction set" CPUs. Still doesn't change the fact that you're dealing with netbook-level single threaded performance on the CPU.

I still fail to see where all this magical performance is going to come from.


----------



## BulletSponge

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> I still fail to see where all this magical performance is going to come from.


Why, pixie dust and uni........er....manicorns, of course.


----------



## prescotter

Quote:


> Originally Posted by *BizzareRide*
> 
> Wow bad post. There are over 3000 retail games for Xbox 360 and the overwhelming majority were 720p... And all of those COD were 60fps
> The tray cost of the cheapest i7 is $280.... far too costly for a console, even bought in bulk quantities


I bought and still own an X360, but I was amazed that Halo 3 wasn't even 720p, when that was supposed to be the 1080p generation back then.

And 1080p still isn't the norm in console land even today..


----------



## PostalTwinkie

Quote:


> Originally Posted by *rustySteel*
> 
> I think we've been talking past each other. The link I sent wasn't meant to be about FPS or resolution or the Xbox; it was meant to point to new thinking in game development, and how new methods are constantly being found to achieve your performance/quality goals on limited hardware. In the link the guy saw HDTV interpolation and, voila, a new method for emulating 60fps. This is the kind of zeitgeist progression I'm referring to.
> 
> My apologies for coming off as a jerk, I think we just misunderstood each other. At the very least, it will be interesting to watch what happens in the console space over the next few years.
> 
> Regards


Oh, I thought you were trumpeting the 60 FPS thing.....

Yea, I fully agree developers are going to have to come up with really creative ways to push these new consoles. I think it's fair to say any creative tricks they develop to get past the hardware limitations will even bleed over into the PC market.

EDIT:

I will also wager this......

If DX 12 isn't the big shift in the paradigm that we all want/need it to be, then the XB1 will not last ~8 years.

Also, digging around the PS4 side of things: since it doesn't use pure OpenGL, but an OpenGL-"like" API (per Sony), there may very well be hidden performance within it, as developers face a learning curve on it.

How much? Only time will tell, but this console generation wasn't the leap in performance we have seen from generation to generation in the past.


----------



## Pip Boy

Quote:


> Originally Posted by *lacrossewacker*
> 
> lol
> 
> Maybe with 3x980Ti's, medium graphics, and a 5960X


'Brink', i.e. near. 120Hz 4K is possible over HDMI, and soon we will see monitors with it.

And I didn't mean both exclusively. I meant 144Hz @ 1440p, which is still EONS ahead of the PS4/Xbox One; even 144Hz 21:9 will be out by next year. And not all games being played are the very newest, most unoptimized AAA titles; a 980 will happily play slightly older titles at 60fps 4K, and those titles will be crisper than a crisp thing on a shiny platform made of crisps.


----------



## Pip Boy

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Oh, I thought you were trumpeting the 60 FPS thing.....
> 
> Yea, I fully agree developers are going to have to come up with really creative ways to push these new consoles.


1FPS ?


----------



## Blameless

Quote:


> Originally Posted by *mtcn77*
> 
> You guys know to utilize the whole potential of hd5870's 2.72 teraflops that you would need 2.7 TERABYTES/s of bandwidth right? This is the best case representation currently feasible until hbm, imo.


Only if your GPU is doing nothing but routing information to the system memory and back without actually doing any processing.


----------



## PostalTwinkie

Quote:


> Originally Posted by *phill1978*
> 
> 1FPS ?


What?


----------



## mtcn77

Quote:


> Originally Posted by *Blameless*
> 
> Only if your GPU is doing nothing but routing information to the system memory and back without actually doing any processing.


Per read & write. I was actually going to quote a PDF of a research paper in which the authors demonstrated that you would need 20 TB/s of reads and writes to do real work on all 1600 arithmetic units, but my google-fu is pretty pathetic with two thumbs only.
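For what it's worth, the TB/s figure being argued about falls out of a simple arithmetic-intensity calculation, under the extreme assumption that every FLOP needs a fresh operand byte from memory with zero register/cache reuse (exactly the case Blameless is objecting to). A rough sketch, using the HD 5870's stock 153.6 GB/s bandwidth:

```python
# Back-of-envelope bandwidth requirement for an HD 5870, assuming the
# worst case implied above: every FLOP reads a fresh byte from memory
# (no register or cache reuse at all).
PEAK_FLOPS = 2.72e12     # HD 5870 peak single-precision FLOP/s
BYTES_PER_FLOP = 1.0     # assumed: zero data reuse

required_bw = PEAK_FLOPS * BYTES_PER_FLOP  # bytes/s needed under that assumption
actual_bw = 153.6e9                        # HD 5870 stock memory bandwidth, bytes/s

print(f"bandwidth needed: {required_bw / 1e12:.2f} TB/s")   # ~2.72 TB/s
print(f"shortfall factor: {required_bw / actual_bw:.0f}x")  # ~18x
```

Real shaders reuse data heavily through registers and local caches, which is why GPUs get useful work done at a fraction of that bandwidth.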


----------



## Shadow11377

Quote:


> Originally Posted by *Remij*
> 
> I knew this was coming, but you can't really blame the console manufacturers, they really need to profit from these consoles early on to recoup their losses throughout the gen and actually make money.
> 
> That said, these consoles haven't come close to hitting their strides. All it will take is some out of the box developer to show them they are wrong, and new advancements will be made.


If only a game like _Left 4 Dead 3_ came out & sold millions of copies while running at 1080p with some MSAA and 60 FPS. I believe that could convince people to stop blaming the consoles and instead shift the blame to badly optimized / over complicated games for the intended platform (AKA, bad developers).

Everyone appears to be attempting to push graphics and that always seems to end in *30 FPS/720p/FXAA&Jaggies.*
Seems a little different in this case though, it looks like they overcomplicated the AI and CPU based stuff. Totally understandable.


----------



## Hasty

Quote:


> Originally Posted by *Shadow11377*
> 
> If only a game like _Left 4 Dead 3_ came out & sold millions of copies while running at 1080p with some MSAA and 60 FPS. I believe that could convince people to stop blaming the consoles and instead shift the blame to badly optimized / over complicated games for the intended platform (AKA, bad developers).
> 
> Everyone appears to be attempting to push graphics and that always seems to end in *30 FPS/720p/FXAA&Jaggies.*
> Seems a little different in this case though, it looks like they overcomplicated the AI and CPU based stuff. Totally understandable.


You just summed up my opinion about this.

The issue is not the console hardware, it's the developers.

What do you guys think would happen if the next gen consoles were as powerful as a high end gaming PC?
They would probably up the resolution to 1920x1080 but that's it.
The AAA titles would still be 30fps locked.

It's that way because 60fps doesn't sell.
60fps doesn't show on youtube trailers.
60fps doesn't show on the screenshots in the video game reviews.
Video game journalists don't reduce a game score for running at 30fps.

What sells is polygons, textures and above all shiny over the top lighting effects.

This will be the case whatever the hardware inside these consoles is.


----------



## KenjiS

Quote:


> Originally Posted by *PostalTwinkie*
> 
> I will also wager this......
> 
> If DX 12 isn't the big shift in the paradigm that we all want/need it to be, then the XB1 will not last ~8 years.
> 
> Also, digging around the PS4 side of things: since it doesn't use pure OpenGL, but an OpenGL-"like" API (per Sony), there may very well be hidden performance within it, as developers face a learning curve on it.
> 
> How much? Only time will tell, but this console generation wasn't the leap in performance we have seen from generation to generation in the past.


I've pondered whether the Xbox One dropping to $350 already isn't an indication that this console's lifespan is going to be short. I don't remember how long it took the 360 to drop, but it was a LONG time; same for the PS3....

I'm wondering if game consoles aren't going to be like tablets/smartphones, with a new one every couple of years..

That would be one justification for moving to x86 and going with cheap hardware. It's very easy to, say, launch a new Xbox with improved hardware next year at $500 and put the current One at $300 or something, and offer patches for One games to make them look better on the new hardware. New games look better on the new hardware but will still play on the old hardware (they'll just look crappy).

There's zero reason this is technically infeasible or expensive to do, after all...


----------



## HowHardCanItBe

Quote:


> Originally Posted by *kennyparker1337*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dyson Poindexter*
> 
> The console shills in this thread are really making me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> These "next-gen" consoles aren't sponges soaked in mystery that devs can wring performance from over several years. It's x86 and GCN. We know what it is, and we know what it's capable of. If my Pentium D and Radeon X1900XT haven't become "optimized" enough to max out _Crysis_, why should you expect the new consoles to suddenly become amazing? Sure, small gains will be made but the days of revolutionary performance increases are over.
> 
> Honestly, the level of "just you wait and see" with no numbers to back it up with these console people reeks of Stockholm syndrome. Y'all need to lay off the MS Koolaid, "the cloud" and "optimizations" aren't going to rescue your consoles' performance.
> 
> 
> 
> "MS Koolaid"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The idea of developers being able to code directly to hardware instead of having to code through high level language that has to go through multiple levels of drivers is not relevant to just Microsoft but the entire idea of a console.
> 
> Sure, the gains over time aren't going to compare to upgrading a PC, but it's absolutely *FREE* to everyone involved.
> 
> If you had the decency to look at first year games vs last year games on last gen consoles you would realize that clearly games can be vastly improved upon over time by getting your code to near 100% proficiency at the hardware level.
> 
> Call of Duty 2 (launch title) vs Battlefield 3 (finishing title)

Except the current-generation consoles are an x86 architecture, so there isn't much left to "learn" in order to tap out more power than is already there. They're already struggling to keep games at 1080p. Most developers will try to keep them at 1080p by reducing the draw distance or the FOV and the texture quality, with very little AA.


----------



## Pip Boy

Quote:


> Originally Posted by *Hasty*
> 
> You just summed up my opinion about this.
> 
> The issue is not the console hardware, it's the developers.
> 
> What do you guys think would happen if the next gen consoles were as powerful as a high end gaming PC?
> They would probably up the resolution to 1920x1080 but that's it.
> The AAA titles would still be 30fps locked. *doubtful, devs dont intentionally gimp their own games*
> 
> It's that way because 60fps doesn't sell. *wrong, it does if its an FPS or Car game or Platform game*
> 60fps doesn't show on youtube trailers. *wrong*
> 60fps doesn't show on the screenshots in the video game reviews. *video doesnt happen in still shots shocker!*
> Video game journalists don't reduce a game score for running at 30fps. *video game journalists arent journalists and most are paid off*
> 
> What sells is polygons, textures and above all shiny over the top lighting effects. *correct !! but you need to clearly be able to see those effects. So have all of that stuff you mentioned at 640x480 = crap*
> 
> This will be the case whatever the hardware inside these consoles is. *no it wont*


----------



## Pip Boy

Quote:


> Originally Posted by *PostalTwinkie*
> 
> What?


exactly.


----------



## maarten12100

Quote:


> Originally Posted by *Hasty*
> 
> You just summed up my opinion about this.
> 
> The issue is not the console hardware, it's the developers.
> 
> What do you guys think would happen if the next gen consoles were as powerful as a high end gaming PC?
> They would probably up the resolution to 1920x1080 but that's it.
> The AAA titles would still be 30fps locked.
> 
> It's that way because 60fps doesn't sell.
> 60fps doesn't show on youtube trailers.
> 60fps doesn't show on the screenshots in the video game reviews.
> Video game journalists don't reduce a game score for running at 30fps.
> 
> What sells is polygons, textures and above all shiny over the top lighting effects.
> 
> This will be the case whatever the hardware inside these consoles is.


And it is fine that way on consoles. The controller makes your accuracy horrible; to make up for that, auto-aim and other gimmicks were put in place, which is why a PC game played with a mapped controller is so very hard. And because a controller has high latency compared to something like a mouse, 30fps isn't that noticeable even in rapid shooters.

Consoles and PCs, while two different worlds, both have their uses. If people just switched to PC they could save a lot of money on games, but on the other hand consoles just work, and there are many people to play with.


----------



## Pip Boy

pc games are cheaper though.


----------



## BizzareRide

Quote:


> Originally Posted by *prescotter*
> 
> I bought and still own an X360, but I was amazed that Halo 3 wasn't even 720p, when that was supposed to be the 1080p generation back then.
> 
> And 1080p still isn't the norm in console land even today..


There are many 1080p games on X1/PS4.

Halo 3 used two frame buffers, both rendering at 1152x640, with a composited image. That's why you didn't notice the "lower" res. Some assets were higher quality than they would otherwise have been at 720p. Couple this with the fact that it was one of the only games in existence to use HDR lighting and a 10mi draw distance, and simply citing resolution is narrow-minded in the grand scheme of things.
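The pixel math behind that claim is easy to check (buffer dimensions as stated above):

```python
# Pixel counts behind the Halo 3 framebuffer discussion: two 1152x640
# buffers composited for HDR lighting, versus a single native 720p buffer.
halo3_buffer = 1152 * 640          # pixels per buffer
halo3_total = 2 * halo3_buffer     # total pixels rendered per frame
native_720p = 1280 * 720           # pixels in one 720p buffer

print(halo3_buffer / native_720p)  # one buffer is 80% of 720p
print(halo3_total / native_720p)   # but total fill is 1.6x a single 720p buffer
```

So while the output resolution was below 720p, the total fill work per frame was actually well above a single 720p buffer.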
Quote:


> Originally Posted by *5entinel*
> 
> Except the current-generation consoles are an x86 architecture, so there isn't much left to "learn" in order to tap out more power than is already there. They're already struggling to keep games at 1080p. Most developers will try to keep them at 1080p by reducing the draw distance or the FOV and the texture quality, with very little AA.


This isn't true at all. X1 and PS4 both have unique challenges that developers must overcome, and it mainly comes down to one word: utilization.

1. How to keep the PS4's compute engines utilized
2. How to best allocate data across the Xbox's split memory pools

What's more, developers didn't have access to final SDKs and hardware until close to launch, so first-generation games were developed on alpha or beta hardware. Second- and third-generation games are where we see developers push the limits.


----------



## KenjiS

Quote:


> Originally Posted by *phill1978*
> 
> exactly.


To be fair, until very recently YouTube didn't support 60fps videos...

Now it does..

That said, I'd say YouTube is honestly difficult to judge things on, due to compression artifacting and other things mucking up the source.


----------



## Pip Boy

Quote:


> Originally Posted by *5entinel*
> 
> Call of Duty 2 (launch title) vs Battlefield 3 (finishing title)


Terrible example. That game was cobbled together from the PC version and was a hideous, rushed port for a launch title. Secondly, game engine technology has been through about four renaissances since then and has gone completely off the charts compared to that first example.

You're comparing the Frostbite(™) engine to the IW engine, derived from Quake III Arena. Are you absolutely insane? Some of that tech was from 1999!

Yes, optimizations occurred, but engine technology pushed the frontiers, not some clever code weaving that fit an extra bit of textures and sound into some previously unfound cache.

This gen is x86 all the way. There will be a few benefits to having the same consistent hardware specs, and perhaps some DX12 benefits also. But don't forget these things are *running a full-blown OS behind the scenes, running downloads and social media / game overlays at the ready* too.. so that's your bit of code space and optimizations gone.

*A $120 7790.
AMD CPU cores that would be outpaced by a budget Sandy Bridge i3 from two years ago.

THAT is the story: bargain-basement boxes for the masses. 10x as good as something that was 10 years out of date.*

the end..


----------



## Hasty

To illustrate my opinion:

GTA V is coming to Playstation 4.

What will they improve?: The frame rate or the resolution?

http://www.vg247.com/2014/10/27/gta-5-resolution-1080p-ps4-xbox-one-grand-theft-auto/


----------



## mtcn77

Quote:


> Originally Posted by *Blameless*
> 
> Only if your GPU is doing nothing but routing information to the system memory and back without actually doing any processing.



My original source compared the previous generation, but "1% of the memory bandwidth for true compute leverage" is still the trend. I really have high hopes for AMD's next-gen HBM: 1024-wide at 1 Gbps vs. 32-wide at 8 Gbps is going to be serious.
Desktop GPUs are in no way going to be more complex than consoles anyway. Those extra cores are only available to you because advancing the memory infrastructure is harder than raising the GPU workload to hide the latency interval.
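Those per-pin numbers translate into per-device bandwidth like so (a rough sketch using the figures as quoted above; actual shipping parts vary):

```python
# Bandwidth implied by "1024-wide at 1 Gbps vs. 32-wide at 8 Gbps":
# a first-gen HBM stack versus a single GDDR5 chip, figures as quoted.
hbm_bw = 1024 * 1e9 / 8    # 1024-bit interface at 1 Gbps/pin -> bytes/s
gddr5_bw = 32 * 8e9 / 8    # 32-bit chip at 8 Gbps/pin -> bytes/s

print(hbm_bw / 1e9)        # 128.0 GB/s per HBM stack
print(gddr5_bw / 1e9)      # 32.0 GB/s per GDDR5 chip
print(hbm_bw / gddr5_bw)   # 4.0x per device, at far lower clocks and power
```

The wide-and-slow interface is the whole point: the same or better bandwidth per device at a fraction of the signaling speed.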


----------



## MapRef41N93W

Quote:


> Originally Posted by *routek*
> 
> yeah, its 6 cores on console, not 8, 2 are locked off.
> 
> From benches it would seem the console CPU part is like an Athlon 630 or something in performance numbers, perhaps worse. Its good however in terms of power consumption for a laptop or an APU solution which they chose.
> 
> I would've liked to see them wait for Maxwell and looked at Intel for a 2014 launch instead of 2013.
> 
> Thing is though this is a tough business and they're selling well, use around 120W in total in a simple APU solution. Buyers seem to be happy with 2007-2010 PC level graphics


Didn't NVIDIA turn down the console offer? Also, I think Intel would probably be out of the ballpark price-wise for the PS4 and Xbone. Both companies clearly wanted to make a profit on the actual consoles this gen (after two gens of losses), especially Microsoft. Eventually, when your console losses have done their job (gaining you market share), the idea is to then sell profitable hardware. The problem is that the hardware choices themselves were questionable.

I wouldn't say it's out of the question to see a new revision of each console selling with an "upcharge" (like a premium model or something) that has better hardware in the near future.


----------



## KenjiS

Quote:


> Originally Posted by *BizzareRide*
> 
> There are many 1080p games on X1/PS4.
> 
> Halo 3 used two frame buffers, both rendering at 1152x640, with a composited image. That's why you didn't notice the "lower" res. Some assets were higher quality than they would otherwise have been at 720p. Couple this with the fact that it was one of the only games in existence to use HDR lighting and a 10mi draw distance, and simply citing resolution is narrow-minded in the grand scheme of things.


The fact that 1080p TVs were NOWHERE near as common as today, and that we didn't have 1080p HD content easily available, also changes perception. Most people had, if they were lucky, a 720p HDTV when the 360 came out; 1080p TVs at that point were a lot less common... HD in general was a new concept; even in 2005 it was still clawing and scratching its way into living rooms.

Before the 360, keep in mind, all TVs were 480i or 480p and that's it: 480 lines of resolution. Period.

Even during the 360's lifetime 720p was fine and dandy; by the time it mattered the console was long in the tooth and people were basically "Meh, it IS 6 or 7 years old".

Now? MS and Sony had a big heap on their plates. The 360 and PS3 were at the end of their lives and they needed to push out new consoles. They didn't want to spend years in the hole before getting back into the black, so they went cheap and cheerful with both consoles. The result is what we see today: two consoles that are grossly underpowered for truly delivering what the public is demanding. We expect higher resolutions and more advanced effects, but neither console can truly deliver on that expectation, and the public is frustrated because they don't really want to pay up for the meager graphical improvements.

Now I'll note, I own an Xbox One and love it, seriously. I enjoy it, and some games do genuinely look great (Forza 5 for instance.. Ryse), but it doesn't change the fact that the games that look how I feel all games should look are the exception, not the norm.


----------



## Jedi Mind Trick

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *routek*
> 
> yeah, its 6 cores on console, not 8, 2 are locked off.
> 
> From benches it would seem the console CPU part is like an Athlon 630 or something in performance numbers, perhaps worse. Its good however in terms of power consumption for a laptop or an APU solution which they chose.
> 
> I would've liked to see them wait for Maxwell and looked at Intel for a 2014 launch instead of 2013.
> 
> Thing is though this is a tough business and they're selling well, use around 120W in total in a simple APU solution. Buyers seem to be happy with 2007-2010 PC level graphics
> 
> 
> 
> Didn't NVIDIA turn down the console offer? Also, I think Intel would probably be out of the ballpark price-wise for the PS4 and Xbone. Both companies clearly wanted to make a profit on the actual consoles this gen (after two gens of losses), especially Microsoft. Eventually, when your console losses have done their job (gaining you market share), the idea is to then sell profitable hardware. The problem is that the hardware choices themselves were questionable.
> 
> I wouldn't say it's out of the question to see a new revision of each console selling with an "upcharge" (like a premium model or something) that has better hardware in the near future.

NV doesn't have x86 CPUs, so I doubt they were ever really considered. But if they were, NV would definitely not have turned it down (I believe that was said to throw shade at AMD); it's guaranteed money for ~5+ years.


----------



## lacrossewacker

my gosh just look at how restrictive these consoles are


----------



## MapRef41N93W

Quote:


> Originally Posted by *Jedi Mind Trick*
> 
> NV doesn't have X86 CPU's so I doubt they were ever really considered. But, if they were NV would definitely not turn it down (I believe that it was said to throw shade at AMD), it is guaranteed money for ~5+ years.


I was talking about the GPU end. I recall reading that NVIDIA claimed they turned down the console offer. Could have been mud-slinging, yes.


----------



## Master__Shake

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Didn't NVIDIA turn down the console offer? Also, I think Intel would probably be out of the ballpark price-wise for the PS4 and Xbone. Both companies clearly wanted to make a profit on the actual consoles this gen (after two gens of losses), especially Microsoft. Eventually, when your console losses have done their job (gaining you market share), the idea is to then sell profitable hardware. The problem is that the hardware choices themselves were questionable.
> 
> I wouldn't say it's out of the question to see a new revision of each console selling with an "upcharge" (like a premium model or something) that has better hardware in the near future.


Neither NVIDIA nor Intel had what the console makers wanted.

They wanted a CPU/GPU combo.

They should have opted for an i3 and a 650 Ti


looks like next gen to me


----------



## Hasty

Quote:


> Originally Posted by *lacrossewacker*
> 
> my gosh just look at how restrictive these consoles are


_Built around a dramatically reworked version of the engine that powered Insomniac's ill-fated Fuse, the technology in Sunset Overdrive is optimised specifically for Xbox One hardware, targeting 30fps instead of the slick 60fps update previously favoured by the developer in Ratchet and Clank._


----------



## Slomo4shO

Quote:


> Originally Posted by *KenjiS*
> 
> The fact that 1080p TVs were NOWHERE near as common as today, and that we didn't have 1080p HD content easily available, also changes perception. Most people had, if they were lucky, a 720p HDTV when the 360 came out; 1080p TVs at that point were a lot less common...


Someone that understands technology adoption rates... 4K (UHD) is expected to have only an 8% market penetration by the year 2017



PS was released in 1994
PS2 was released in 2000
PS3 was released in 2006
PS4 was released in 2013

Xbox was released in 2001
Xbox 360 was released in 2005
Xbox One was released in 2013

By the time UHD has a reasonable market share, there will be new consoles available ready to tackle the higher resolutions...
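The cadence those release years imply, for reference:

```python
# Console generation gaps implied by the release years listed above.
playstation = [1994, 2000, 2006, 2013]
xbox = [2001, 2005, 2013]

ps_gaps = [b - a for a, b in zip(playstation, playstation[1:])]
xb_gaps = [b - a for a, b in zip(xbox, xbox[1:])]
print(ps_gaps)  # [6, 6, 7]
print(xb_gaps)  # [4, 8]
```

At a 6-8 year cycle, a console launched in 2013 would plausibly be replaced right around the time UHD displays reach meaningful market share.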


----------



## HowHardCanItBe

Quote:


> Originally Posted by *BizzareRide*
> 
> This isnt true at all. X1 and PS4 both have unique challenges that developers must overcome and it mainly comes down to one word: utilization.
> 
> 1. How to keep PS4s compute engines utilized
> 2. How to best allocate data to best utilize Xbox's split memory pools
> 
> Whats more is that developers didnt have access to final SDKs and hardware until close to launch, so first generation games are developed with alpha o beta hardware. Second and third generation games are were we see developers push the limits.


Then why are so many games struggling to even hold 30fps? Hitting the performance wall would be the only explanation. Yeah, no doubt developers will have, and do have, tricks up their sleeves, but do you expect a huge performance boost?


----------



## Fateful_Ikkou

I seem to remember the same claims being made about the 360 and PS3 in their first year, and look what developers did with them. Software optimization will yield better hardware utilization in the long run; just you wait and see.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *Hasty*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Shadow11377*
> 
> If only a game like _Left 4 Dead 3_ came out & sold millions of copies while running at 1080p with some MSAA and 60 FPS. I believe that could convince people to stop blaming the consoles and instead shift the blame to badly optimized / over complicated games for the intended platform (AKA, bad developers).
> 
> Everyone appears to be attempting to push graphics and that always seems to end in *30 FPS/720p/FXAA&Jaggies.*
> Seems a little different in this case though, it looks like they overcomplicated the AI and CPU based stuff. Totally understandable.
> 
> 
> 
> You just summed up my opinion about this.
> 
> The issue is not the console hardware, it's the developers.
> 
> What do you guys think would happen if the next gen consoles were as powerful as a high end gaming PC?
> They would probably up the resolution to 1920x1080 but that's it.
> The AAA titles would still be 30fps locked.
> 
> It's that way because 60fps doesn't sell.
> 60fps doesn't show on youtube trailers.
> 60fps doesn't show on the screenshots in the video game reviews.
> Video game journalists don't reduce a game score for running at 30fps.
> 
> What sells is polygons, textures and above all shiny over the top lighting effects.
> 
> This will be the case whatever the hardware inside these consoles is.

I agree with almost everything here, except I blame management.

I know it was a ways back in the thread, but on the subject of physical media, 4K is really going to emphasize the cost of bandwidth again. If they put 100GB on a 4K Bluray, many people aren't even going to have the option of downloading files of that size.


----------



## Master__Shake

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> I agree with almost everything here, except I blame management.
> 
> I know it was a ways back in the thread, but on the subject of physical media, 4K is really going to emphasize the cost of bandwidth again. If they put 100GB on a 4K Bluray, many people aren't even going to have the option of downloading files of that size.


hey hey h.265 is doing really well in the compression department.


----------



## PostalTwinkie

Quote:


> Originally Posted by *phill1978*
> 
> exactly.


Alright, I will bite, you have my attention.

What is this little side conversation about?


----------



## Slomo4shO

Quote:


> Originally Posted by *Master__Shake*
> 
> neither nvidia or intel had what console makers wanted.
> 
> they wanted a cpu/gpu combo.
> 
> should have opted for an i3 and 650ti
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> looks like next gen to me


Speaking of which...


----------



## Master__Shake

Quote:


> Originally Posted by *Slomo4shO*
> 
> Speaking of which...


that's why i said looks like next gen


----------



## ILoveHighDPI

Quote:


> Originally Posted by *Master__Shake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ILoveHighDPI*
> 
> I agree with almost everything here, except I blame management.
> 
> I know it was a ways back in the thread, but on the subject of physical media, 4K is really going to emphasize the cost of bandwidth again. If they put 100GB on a 4K Bluray, many people aren't even going to have the option of downloading files of that size.
> 
> 
> 
> hey hey h.265 is doing really well in the compression department.

I'm not too up to speed on h.265, but the best googling I can manage indicates 4K files are about 44GB per hour with h.264, and should be about 22GB per hour with h.265, which is still a bit much for a regular blu-ray.
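Those file sizes convert to average bitrates like so (a rough sanity check using the quoted numbers, decimal gigabytes, and a 50 GB dual-layer disc):

```python
# Average bitrate implied by a given GB-per-hour figure, and how much
# h.265 4K video fits on a 50 GB dual-layer Blu-ray at the quoted rate.
def mbps(gb_per_hour):
    """Convert GB/hour (decimal GB) to average megabits per second."""
    return gb_per_hour * 1e9 * 8 / 3600 / 1e6

h264_gbh, h265_gbh = 44, 22  # GB per hour, figures as quoted above

print(round(mbps(h264_gbh)))    # ~98 Mbps average for h.264 4K
print(round(mbps(h265_gbh)))    # ~49 Mbps at the claimed ~2x h.265 gain
print(round(50 / h265_gbh, 2))  # ~2.27 hours of h.265 4K on a 50 GB disc
```

So even with h.265, a standard dual-layer disc holds barely a feature film at those rates, which is why higher-capacity discs were on the table for 4K.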


----------



## Master__Shake

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> I'm not too up to speed on h.265, but the best googling I can manage indicates 4K files are about 44GB per hour with h.264, and should be about 22GB per hour with h.265, which is still a bit much for a regular blu-ray.


still not bad if you have a decent connection.

they were selling gta v on the psn store i think it was pretty big.

physical media is very much still needed though.


----------



## Shadow11377

Quote:


> Originally Posted by *Slomo4shO*
> 
> Speaking of which...


8:30
Lmao. That is so horrible.


----------



## Slomo4shO

Quote:


> Originally Posted by *Master__Shake*
> 
> that's why i said looks like next gen


I take it you actually didn't watch the video

You do realize that is on a PC...


----------



## mtcn77

Every time Ubisoft complains, I think they play right into Crytek's hands. It is all the same to me.


----------



## Master__Shake

Quote:


> Originally Posted by *Slomo4shO*
> 
> I take it you actually didn't watch the video
> 
> You do realize that is on a PC...


i did watch it.

what is the point you are making?


----------



## Clocknut

Quote:


> Originally Posted by *maarten12100*
> 
> They are making money from software sales as did they with the PS3 and Xbox 360 only difference is the new consoles either only make a little loss or make zero loss but also about zero profit.
> 
> If you want to argue semantics that 1 dollar profit for selling a PS4 including handling is a big deal. Then go ahead I'm out. They tie on the console costs versus selling price and make the profit of games.
> Nah MS and Sony will get direct profits instead of having a 2 year to ROI cycle again. This way every extra penny of markup on games is direct profit to them. They will make money and the average consumer will have a cheaper console. Most console players don't even care that much they want a cheap console that plays games. Most don't even know the difference between HD (720P) and FHD (1080P).


since they are on x86 now, they will most likely stick with this architecture for good, for development simplicity. So releasing a new console every 2-3 years with full backward compatibility would not hurt their game sales. They should take Apple's iPhone approach: new console @ $499, old version gets a process die shrink (the iPhone didn't) and sells at an even lower price (e.g. $199-$299).

At least with a 2-3 year release cycle the console stays close to PC performance. It would also keep some console gamers from jumping to PC (one of the main reasons people jump to the PC platform is that the graphics quality difference is too large). They would also have 2 tiers (old + new console) targeting different price segments.

IMO, this is probably the only way for them to fix this problem, if they want to protect their investment. I can only imagine how bad these consoles will look in 3-5 years.


----------



## Slomo4shO

Quote:


> Originally Posted by *Master__Shake*
> 
> what is the point you are making?


You would use one of the worst ports to PC as your standard? Playing at FWXGA (1366x768), fixed at 30 FPS, at that?

NFSR plays at HD (1280x720) at a fixed 30 FPS on the PS3 and Xbox 360

NFSR plays at FHD (1920x1080) at 30 FPS on the PS4 and Xbox One

The video you showed displays only 14% more pixels than HD. The PS4 and Xbox One output 97% more pixels than FWXGA. I would definitely not call your supporting video "next gen" when it compares closer to the previous generation
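Those percentages are easy to verify from the resolutions alone (a quick sketch; the "97%" figure comes out to ~98% when rounded rather than truncated):

```python
# Pixel-count comparison for the resolutions discussed above.
def pixels(width: int, height: int) -> int:
    return width * height

hd    = pixels(1280, 720)    # last-gen NFS Rivals
fwxga = pixels(1366, 768)    # the PC video's resolution
fhd   = pixels(1920, 1080)   # PS4 / Xbox One NFS Rivals

print(f"FWXGA vs HD:  {fwxga / hd - 1:+.0%}")   # about +14%
print(f"FHD vs FWXGA: {fhd / fwxga - 1:+.0%}")  # about +98%
```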


----------



## Zaid

is there anything stopping M$ or sony from releasing an updated console with a more powerful cpu/gpu? they are just x86 cores and a GCN gpu.

if not, then i expect some HD or elite version of the xbone/ps4 if people care enough.


----------



## Master__Shake

Quote:


> Originally Posted by *Slomo4shO*
> 
> You would use one of the worst ports to PC as your standard? Playing at FWXGA (1366x768) fixed at 30 FPS at that?
> 
> NFSR plays at HD (1280x720) at 30 FPS fixed on the PS3 and Xbox 360
> 
> NFSR plays at FHD (1920x1080) at 30 FPS on the PS4 and Xbox One
> 
> The video you showed displays 14% more pixels than the FHD. The PS4 and Xbox One output 97% more pixels than FWXGA. I would definitely not call your supporting video "next gen" when it compares closer to the previous generation



i know it runs at 30 fps, the description on the video i posted said so. my argument was that an i3 and 650 Ti were as good as a console.

i think we are trying to say two different things.


----------



## Slomo4shO

Quote:


> Originally Posted by *Master__Shake*
> 
> i think we are trying to say two different things.


Your initial statement was "looks like next gen to me" which implies that the i3 and 650 Ti provided similar performance to the PS4 and Xbox One. This obviously isn't the case. If this isn't what you meant then please do elaborate on your initial remark.


----------



## Master__Shake

Quote:


> Originally Posted by *Slomo4shO*
> 
> Your initial statement was "looks like next gen to me" which implies that the i3 and 650 Ti provided similar performance to the PS4 and Xbox One. This obviously isn't the case. If this isn't what you meant then please do elaborate on your initial remark.


what do you mean it's not the case?


----------



## Slomo4shO

Quote:


> Originally Posted by *Master__Shake*
> 
> what do you mean it's not the case?












i3 + 650Ti ≈ PS3 and Xbox 360
PS3 and Xbox 360 ≠ next gen
Did my earlier analysis not provide enough clarity?


----------



## Master__Shake

Quote:


> Originally Posted by *Slomo4shO*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i3 + 650Ti ≈ PS3 and Xbox 360
> PS3 and Xbox 360 ≠ next gen
> Did my earlier analysis not provide enough clarity?


got me on a technicality... good for you.





better?

30 fps at 1080p

and it's an older i3


----------



## ILoveHighDPI

Quote:


> Originally Posted by *Clocknut*
> 
> since they are on x86 now, it is more likely they are pretty much stick on this architecture for good for development simplicity. So releasing a new console every 2-3years with full backward compatibility will not affect their game sales, they should take Apple Iphone's approach, new console @ $499, old version get process die shrink(iphone didnt) sell at even lower price(ex $199-$299).
> 
> At least with a 2-3years release cycle it keep the console close to PC performance. It will also keep some console gamer from jumping to PC.(one of the main reason some jump to PC platform is the graphic quality diff is too large). They also have 2 tier (old + new console) targeting diff price segment
> 
> IMO, this is probably the only way for them to fix this problem, if they want to protect their investment. I simply cannot see how bad these consoles will become in 3-5yrs.


If that is actually possible, it would be amazing. New consoles every three years, they should come free with the premium subscription.


----------



## Shadow11377

Quote:


> Originally Posted by *Zaid*
> 
> is there anything stopping M$ or sony from releasing an updated console with a more powerful cpu/gpu? they are just x86 cores and a GCN gpu.
> 
> if not, then i expect some HD or elite version of the xbone/ps4 if people care enough.


Probably the fact that the point of consoles is to have a single system that games are designed for.
If it works on the Xbox One, what would be the selling point of an Xbox One (Elite)? Higher FPS? As you can see from the NFS Rivals video, it clearly has a cap and removing it destroys the game.


----------



## Jedi Mind Trick

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Clocknut*
> 
> since they are on x86 now, it is more likely they are pretty much stick on this architecture for good for development simplicity. So releasing a new console every 2-3years with full backward compatibility will not affect their game sales, they should take Apple Iphone's approach, new console @ $499, old version get process die shrink(iphone didnt) sell at even lower price(ex $199-$299).
> 
> At least with a 2-3years release cycle it keep the console close to PC performance. It will also keep some console gamer from jumping to PC.(one of the main reason some jump to PC platform is the graphic quality diff is too large). They also have 2 tier (old + new console) targeting diff price segment
> 
> IMO, this is probably the only way for them to fix this problem, if they want to protect their investment. I simply cannot see how bad these consoles will become in 3-5yrs.
> 
> 
> 
> If that is actually possible, it would be amazing. New consoles every three years, they should come free with the premium subscription.

Lol, why not just get a PC at that point? Most people already have one, and it comes with free online.

Idk why people want upgradeable consoles, they literally just become 'bad' (more or less single-use) PCs.

It also defeats the value that consoles have. My PS3 (an OG 60GB) still runs games at the same settings as everyone else's. $600 for 8 years is a friggin great deal; $400 for the (hopefully) next 8 years would be even better [note, I do not care about res/graphics, I just want to play some fun games].


----------



## Slomo4shO

Quote:


> Originally Posted by *Master__Shake*
> 
> got me on a technicality... good for you.
> 
> 
> 
> 
> 
> better?
> 
> 30 fps at 1080p
> 
> and it's an older i3


650 Ti Boost ≠ 650 Ti

The 650 Ti Boost is about 35-40% faster than the 650 Ti. Ergo, not the same argument as your previous post.

The GPU in the PS4 is comparable to the HD 7850 which is comparable to the GTX 650 Ti Boost so in your above example the GPU performance would be nearly identical.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *Jedi Mind Trick*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ILoveHighDPI*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Clocknut*
> 
> since they are on x86 now, it is more likely they are pretty much stick on this architecture for good for development simplicity. So releasing a new console every 2-3years with full backward compatibility will not affect their game sales, they should take Apple Iphone's approach, new console @ $499, old version get process die shrink(iphone didnt) sell at even lower price(ex $199-$299).
> 
> At least with a 2-3years release cycle it keep the console close to PC performance. It will also keep some console gamer from jumping to PC.(one of the main reason some jump to PC platform is the graphic quality diff is too large). They also have 2 tier (old + new console) targeting diff price segment
> 
> IMO, this is probably the only way for them to fix this problem, if they want to protect their investment. I simply cannot see how bad these consoles will become in 3-5yrs.
> 
> 
> 
> If that is actually possible, it would be amazing. New consoles every three years, they should come free with the premium subscription.
> 
> 
> Lol, why not just get a PC at that point? Most people already have one and it comes with free online.
> 
> Idk why people want upgradeable consoles, they litterally just become 'bad' (more or less single use) PC's.
> 
> It also defeats the value that consoles have. My PS3 (an OG 60gb) still runs games at the same settings as everyone else, $600 for 8yrs is a friggin great deal, $400 for the (hopefully) next 8yrs would be even better [note, I do not care about res/graphics, I just want to play some fun games].

Consoles exist so that people can buy a box, plug it in, and play games; ideally a console is a maintenance-free PC. We may know that there isn't a lot of skill required to build a PC, but there are hundreds of millions of people out there who don't.
The purpose of a console is to be a simplified computer, not to give everyone a unified gaming experience. It ends up that way because of economics, but it's not required in the design of the system.

They could release a new console every year, and as long as new games supported old hardware, you'd have roughly the same thing as the current model. Apparently that model is now that you don't get progressively better graphics out of the same hardware as time goes by; you just get the same graphics for however long the console cycle lasts.

The benefit from a PC perspective is that games would be designed to take advantage of new hardware every year.


----------



## Master__Shake

Quote:


> Originally Posted by *Slomo4shO*
> 
> 650 Ti Boost ≠ 650 Ti
> 
> The 650 Ti Boost is about 35-40% faster than the 650 Ti. Ergo, not the same argument as your previous post.
> 
> The GPU in the PS4 is comparable to the HD 7850 which is comparable to the GTX 650 Ti Boost so in your above example the GPU performance would be nearly identical.


it's got a 100MHz clock increase over the 650 Ti and 2GB of VRAM, whoop de doo

c'mon man!

and the Boost is exactly 17.50 dollars more than a regular 650 Ti

http://pcpartpicker.com/parts/video-card/#c=118

actually it's less


----------



## Slomo4shO

Quote:


> Originally Posted by *Master__Shake*
> 
> its got a 100mhz clock increase over the 650ti and 2gb's of vram whoop de doo
> 
> c'mon man!
> 
> and the boost is exactly 17.50 dollars more than a regular 650ti
> 
> http://pcpartpicker.com/parts/video-card/#c=118
> 
> actually it's less


----------



## Master__Shake

Quote:


> Originally Posted by *Slomo4shO*


my original argument is flawed, fine.

but a card like the 650 Ti Boost, which is CHEAPER than the regular 650 Ti, is still a better experience than the consoles' crap-ass hardware.

and an i3 is more than capable of holding its own against the APU the consoles have in them.


----------



## Zaid

Quote:


> Originally Posted by *Shadow11377*
> 
> Probably the fact that the point of consoles is to have a single system that games are designed for.
> If it works on the Xbox One, what would be the selling point of an Xbox One (Elite)? Higher FPS? As you can see from the NFS Rivals video, it clearly has a cap and removing it destroys the game.


well, seeing as how some games are already getting downgraded to sub-1080p resolutions, and neither console has even been out a full year, i think it's a safe bet that unless the advancement of graphics/physics in video games comes to a complete halt, neither the xbone nor the ps4 will be able to handle anything near 1080p/60fps in the future.

and since the xbone and ps4 both use an x86 CPU and a GCN GPU, they shouldn't have much problem upgrading to more powerful chips without having to recode entire games.

that's where the elite/HD editions come in: you want full HD awesome graphics? well, cough up some dough for the upgraded console.


----------



## MonarchX

Great news! This means developers will shift to PCs, which are already far ahead of consoles in every way, except maybe VRAM, but 4GB of VRAM should be plenty for a game like Shadow of Mordor, filled with 4K textures, at least @ 1080p!


----------



## Shadow11377

Quote:


> Originally Posted by *Zaid*
> 
> well seeing as how some games are already getting downgraded to <1080p resolution and it hasn't even been an entire year that either console was out. i think its a safe bet to say that unless advancement of graphics/physics in video games comes to a complete halt, neither xbone or ps4 will be able to handle anything near 1080p/60 fps in the future.
> 
> and since the xbone and ps4 are both using x86 CPU and a GCN GPU, they shouldn't have much problem upgrading to more powerful chips without having to recode entire games.
> 
> that's where the elite/HD editions come in, you want full HD awesome graphics? well cough up some dough for the upgraded consoles.


Games are set at developer chosen resolutions, a better console won't exactly enable the higher resolutions. It would require a patch on a per-game basis to make use of the Elite version, and part of me feels like the devs would instead throw in some FXAA option or something stupid for the upgraded version instead of 1080p.


----------



## DRT-Maverick

Maybe for the first time ever game developers will have the incentive to blow off consoles and start developing for PCs again, considering consoles can't handle their latest gaming engines...


----------



## kennyparker1337

Quote:


> Originally Posted by *5entinel*
> 
> Except with the current generation consoles, they are a x86 architecture, so there isn't much to "learn" to try and tap out more power than there is. It's already struggling to keep games at 1080p. Most of them will try to keep them at 1080p by reducing the draw distance or the FOV and the texture quality with very little AA.


Then why hasn't every single game so far just been coded for the PC and ported to consoles?
Wouldn't that mean a lot more revenue, since the architecture is the same and almost no work would have to go into porting?

Reducing draw distance and AA was already heavily done on the Xbox 360 / PS3 as well.
What's the difference there?

Certainly launch titles for last gen were struggling to run at subpar resolutions as well, otherwise why wouldn't they have simply made it look better?
Quote:


> Originally Posted by *phill1978*
> 
> Quote:
> 
> 
> 
> Originally Posted by *5entinel*
> 
> Call of Duty 2 (launch title) vs Battlefield 3 (finishing title)
> http://www.overclock.net/content/type/61/id/2228385/width/640/height/360
> http://www.overclock.net/content/type/61/id/2228387/width/640/height/360
> 
> 
> 
> terrible example. that game was cobbled together from the PC version and was a hideous rushed port for a launch title. Secondly game engine technology has been through about 4 renaissances since then and has completely gone off the charts compared to that first example
> 
> your comparing the Frostbite(™) Engine to The IW engine based from Quake III arena are you absolutely insane ? Some of that tech was from 1999 !
> 
> Yes optimizations occurred but engine technology pushed the frontiers not some cleaver code weaving fitting an extra bit of textures and sound into some previously unfound cache.
> 
> This gen is X86 all the way, there will be a few benefits to having the same consistent hardware specs and perhaps some DX12 benefits also. But don't forget these things are *running a full blown OS behind running downloads and social media / game overlays at the ready* too .. so that's your bit of code space and optimizations gone.
> 
> $120 7790.
> AMD CPU cores that would be outpaced by a budget i3 sandy bridge from 2 years ago.
> 
> THAT is the story. bargain basement boxes for the masses. 10x as good as something that was 10 years out of date.
> 
> the end..

It was the same exact story for last gen regardless of architecture.
Subpar hardware that couldn't change.

Both last-gen and current-gen consoles run a "full blown" OS.
That is, if by "full blown" you mean a heavily neutered version of a PC OS.

I realize it may have been two different engines but it could also be two different engines this time around.
My point was that games looked much better in 2012 vs 2006 for the same exact hardware.

Call of Duty 2 was the most sold launch title for last gen consoles. That's why I picked it.

I guess I'm just saying I don't believe development for current gen games has gotten anywhere near to being perfected.


----------



## Sisaroth

I was always sceptical about this whole "games will now be properly multithreaded" thing. I still hope they manage it, because it would also benefit PCs.


----------



## Zaid

Quote:


> Originally Posted by *Shadow11377*
> 
> Games are set at developer chosen resolutions, a better console won't exactly enable the higher resolutions. It would require a patch on a per-game basis to make use of the Elite version, and part of me feels like the devs would instead throw in some FXAA option or something stupid for the upgraded version instead of 1080p.


i may be ignorant to how consoles work, but isn't resolution just an option (on the devs' side)? i don't see why they can't just have the game detect whether you have a regular console or an elite version and change the graphics settings accordingly. PCs have been doing this for decades.

Every pc game that has come out, even the horrible ports, allowed resolution changes. it's not like consoles run some insanely different architecture; it's a modified DX11/OpenGL.

I have a gamecube emulator that can increase the resolution/rendering area of games. i checked out metal gear 1 with the resolution upped, and it looked insane (in comparison to before). if people making emulators without the original source code can do things like that, and real developers can't, then something is seriously wrong.


----------



## Clocknut

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> Consoles exist so that people can buy a box, plug it in, and play games, ideally it's a maintenance free PC. We may know that there isn't a lot of skill required to build a PC, but there's hundreds of millions of people out there that don't.
> The purpose of a console is to have a simplified computer, not to give everyone a unified gaming experience. It ends up that way because of economics, but it's not required in the design of the system.
> 
> They could release a new console every year and as long as new games support old hardware you'll have roughly the same thing as the current model, which apparently is now that you don't get progressively better graphics with the same hardware as time goes by, you just get the same graphics for however long the console cycle lasts.
> 
> The benefit from a PC perspective is that games would be designed to take advantage of new hardware every year.


exactly... with just a few tiers of hardware, it allows developers to run different settings for each console. Optimizing for 3 tiers of consoles is still much easier than porting to PC (with thousands of different hardware combinations).

for example: PS4 at 720p 30fps low settings, PS5 at 1080p 60fps high settings, maybe PS6 at 4K 60fps (priced @ $149, $299, $499).

If one of them (Sony/Microsoft/Nintendo) started doing this kind of setup, they could easily maintain a very good competitive advantage.


----------



## Shadow11377

Quote:


> Originally Posted by *DRT-Maverick*
> 
> Maybe for the first time ever game developers will have the incentive to blow off consoles and start developing for PCs again, considering consoles can't handle their latest gaming engines...


I'd really rather the devs who are incompetent don't all abandon consoles and jump on the PC bandwagon, because that would not fix the problem. It would just lead to even more terribly optimized PC games and the death of Consoles, which are an awesome idea btw in case anyone had forgotten.

As much as I love my PC and PC gaming, I hope console gaming gets better.
It's actually a nice experience being able to go over to a friend's house and simply play their games on their console without having to install software like Steam and Origin on their PC to play games if you didn't bring your own machine.

One of my most played games is Killing Floor, and the graphics in that are quite miserable by today's standards but I think it looks good enough especially with 8xMSAA + 16x AF. A lot better than some of the upscaled stuff on the console IMO. If I could just buy any Xbox One game off of a shelf in a Gamestop and have it run 60 FPS at 1080p without FXAA (AA isn't even a requirement.. Just no blurring Post-Process AA and I'd be happy.) and _good enough_ graphics I'd be interested in buying one and plenty of games for it. Well, maybe the PS4 instead since I don't like the Kinect but still. Consoles really don't have to suck so much.


----------



## MapRef41N93W

Quote:


> Originally Posted by *kennyparker1337*
> 
> Then why hasn't every single game so far just been coded for the PC and ported to consoles.
> Wouldn't it be a lot more revenue because the architecture is the same and almost no work would have to go into porting it?
> 
> Reducing draw distance and AA was already heavily done on the Xbox 360 / PS3 as well.
> What's the difference there?
> 
> Certainly launch titles for last gen were struggling to run at subpar resolutions as well, otherwise why wouldn't they have simply made it look better?
> It was the same exact story for last gen regardless of architecture.
> Subpar hardware that couldn't change.
> 
> Both last gen and current gen consoles have "full blown" OS.
> Of course if by "full blown" you mean a heavily neutered version of a PC OS.
> 
> I realize it may have been two different engines but it could also be two different engines this time around.
> My point was that games looked much better in 2012 vs 2006 for the same exact hardware.
> 
> Call of Duty 2 was the most sold launch title for last gen consoles. That's why I picked it.
> 
> I guess I'm just saying I don't believe development for current gen games has gotten anywhere near to being perfected.


The difference is that last gen didn't release with sub-par hardware. The hardware in the 360 and PS3 was on par with or better than the top PC components on the market.


----------



## Carniflex

Quote:


> Originally Posted by *SpeedyVT*
> 
> Honestly AI could be handled by HSA because of the GCN cores. Like how games were programmed with consoles originally... OH WAIT UBISOFT STILL THINKS THIS IS A PC!


AI is usually integer related stuff. HSA is more suitable for floating point heavy loads that do specific stuff that is efficient on the GPU side.


----------



## Takla

cpu bottleneck? what about mantle or dx12? ubisoft is just incompetent, that's all.


----------



## Shadow11377

Quote:


> Originally Posted by *Takla*
> 
> cpu bottleneck? what about mantle or dx12? ubisoft is just incompetent, thats all.


It's easy to have a huge amount of zombie AIs with limited CPU power, but when you start adding complexity the CPU cost goes up dramatically for each unit.
Perhaps Ubi just went overboard with their expectations and tried to create AI that was too complex to process in the amount they wished to have. Not quite incompetent in my opinion, unless it goes unfixed.
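To illustrate that scaling point with a toy model (the numbers and functions here are purely illustrative, not from any real engine): simple "zombie" agents cost a fixed amount per tick, while agents that react to every other agent grow quadratically.

```python
# Toy model of per-tick AI cost. "ops" is an abstract unit of work.
# simple_ai_ops: each agent makes one fixed-cost decision (O(n)).
# crowd_ai_ops: each agent inspects every other agent, e.g. for
# avoidance or group behavior (O(n^2)).

def simple_ai_ops(n_agents: int) -> int:
    return n_agents * 10

def crowd_ai_ops(n_agents: int) -> int:
    return n_agents * (n_agents - 1) * 10

for n in (100, 1000, 10000):
    print(f"{n:>6} agents: simple={simple_ai_ops(n):>10}  crowd={crowd_ai_ops(n):>12}")
```

Doubling the agent count roughly quadruples the crowd-aware cost while only doubling the simple cost, which is exactly the kind of wall a fixed CPU budget runs into as AI complexity rises.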


----------



## maarten12100

Quote:


> Originally Posted by *Clocknut*
> 
> since they are on x86 now, it is more likely they are pretty much stick on this architecture for good for development simplicity. So releasing a new console every 2-3years with full backward compatibility will not affect their game sales, they should take Apple Iphone's approach, new console @ $499, old version get process die shrink(iphone didnt) sell at even lower price(ex $199-$299).
> 
> At least with a 2-3years release cycle it keep the console close to PC performance. It will also keep some console gamer from jumping to PC.(one of the main reason some jump to PC platform is the graphic quality diff is too large). They also have 2 tier (old + new console) targeting diff price segment
> 
> IMO, this is probably the only way for them to fix this problem, if they want to protect their investment. I simply cannot see how bad these consoles will become in 3-5yrs.


That would be very good. They could, for example, release a new one every 2 years, fully backward compatible, and either drop older consoles after 4 years or keep them supported but at lower settings.

If they incorporated a cheap game store like Steam, still taking a cut the way Steam does, and built these consoles cheaply, it could be lucrative.


----------



## criznit

For all we know, Unity could be the next "Crysis 1" in terms of pushing tech forward. Complex AI > graphics, since everyone is past the ooh-aah phase of this "next-gen" lol


----------



## Dyson Poindexter

Quote:


> Originally Posted by *Fateful_Ikkou*
> 
> software optimization will see better hardware utilization in the long run, just you wait and see.


OCN doesn't work on "just you wait and see." If that were the case, we'd all have 20nm GPUs connected to AMD Bulldozer chips right now, playing HL3.


----------



## Carniflex

Quote:


> Originally Posted by *lacrossewacker*
> 
> lol
> 
> Maybe with 3x980Ti's, medium graphics, and a 5960X


I can hit that fps at 5400x1920 (that's 10 megapixels, or 25% more than 4K) with every slider as far as it can go (including AA) on a single mildly overclocked 7950. The "game" of course is the Lost Coast benchmark on the Source engine, which is a bit... dated. That out of the way, you can get acceptable frame rates at the same reso with manual settings (that, btw, do not look like crap) in titles like Crysis, Crysis 2, and Metro 2033. The main things to tune down are the AA level (which you do not need above 2x at that kind of reso) and a few specific effects that incur a significant performance penalty at high resolutions.


----------



## Carniflex

Quote:


> Originally Posted by *KenjiS*
> 
> I'd ALSO say that in theory one could argue this gives MS/Sony the ability to push their Services onto the PC and have Xbox and Playstation become competition to Steam with distinct game stores and abilities... The scariest part of this with Microsoft is Microsoft could, theoretically, block Steam from being installed on a future version of windows and *forcing* you to use Xbox instead. But that also seems like a good way of opening up a can of antitrust accusations...


That might be one of the reasons for Valve to develop SteamOS. History has shown that MS is willing to take the antitrust slap on the wrist if it kills off the competition (the Netscape vs IE thing). However, in the current world this might end up pretty costly for them, as some tail ends of past antitrust cases are still in effect against MS in the EU, which could mean very significant fines in the ballpark of tens of billions of euros. It would probably be cheaper for MS to convince Steam's owners to sell out than to try to kill Steam this way. If someone puts, let's say, an arbitrarily large number, say 20 bil EUR, on the table for your company, which in the market's opinion is worth some smaller number, like 3-4 bil EUR, you will probably think about it.


----------



## Pip Boy

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Alright, I will bite, you have my attention.
> 
> What is this little side conversation about?


Quote:


> Originally Posted by *PostalTwinkie*
> 
> Oh, I thought you were trumpeting the 60 FPS thing....
> 
> *Yea, I fully agree developers are going to have to come up with really creative ways to push these new consoles.* I think it is fair to say any creative tricks they develop to get over the hardware limitations will also bleed into the PC market even.


it was a lame joke, they can push the graphics fidelity by lowering the frame rate to 1 FPS.


----------



## Pip Boy

Quote:


> Originally Posted by *Carniflex*
> 
> I can hit that fps at 5400x1920 (thats 10 megapixels, or 25% higher reso than 4K) with every slider as far as it can go (including AA) with a single mildly overclocked 7950 card. The "game" ofc is Lost coast benchmark on source engine which is a bit .. dated. That out of the way, you can get acceptable frame rates at the same reso by doing manual settings (that btw do not look like crap) in titles like Crysis, Crysis 2, Metro 2033. Main things to tune down are the AA level (which you do not need above x2 at that kind of reso) and few specific effects that incur significant performance penalty at high resolutions.


there are a lot of games on PC that can run at 4k with the sort of quality settings that are _STILL_ higher overall than a console manages.

For example, you mention Crysis & Crysis 2 Well i used to play this on a single 5850 1GB card @ 7megapixesls. That card by todays standards is now archaic, The settings were a mixture of mediums and a high or two but the FPS was still 35 - 50fps and although i knew the card was being overstretched for what it was it would of worked in crossfire had i of had two cards at 60fps and mostly high to ultra settings. By comparison a 980 and soon to be 980Ti are many times faster let alone in SLi.

Things like Shadow settings are the first thing you drop from ultra to medium, thats something that on a high resolution setup you don't tend to notice much in the way of a fidelity difference but it usually saves you a good 30 FPS ! Then you don't really need x16 anisotropic filtering at higher resolutions ( forgetting LCD technology and how it works meaning Ansi can look odd) x8 is enough but you could saving a few 2 - 5 fps, then world distances and some of fog effects can be dropped again giving easily another 20fps back.. it does depend on the game but there is a point where you can make a game look almost indistinguishable from ultra with a mix of high-medium + the best textures.

The key point is that right now the only people with a decent top-end card that can't run 4K are the same people who don't know how to tweak a game and just set every slider to full even when it makes the game look worse (FXAA, motion blur, ultra post-processing sun glare and OTT bloom, I'm looking at you).


----------



## PostalTwinkie

Quote:


> Originally Posted by *phill1978*
> 
> it was a lame joke, they can push the graphics fidelity by lowering the frame rate to 1 FPS.


Ooohhh!!! Ahahaha, I got ya!

That actually wasn't too lame, and not too far from what is really going to have to happen at this moment. They will need to lower frame rate to allow for higher fidelity, because they aren't getting both right now on these new consoles.


----------



## aberrero

I think there is room for optimization. Most game engines, even for PC games, are designed to target last-gen consoles. So while these consoles will never exceed PC graphics, I think both PC and console graphics will get much better on the same hardware in the next few years as new engines come out.


----------



## mtcn77

Quote:


> Star Wars: The Force Unleashed II is shaping up to be a state-of-the-art console title using a number of rendering techniques *only viable on a 30FPS game*. But the frame-rate upscaler tech works well in making it *look as smooth as a 60FPS one*, and is actually less taxing on system resources than the motion blur code it replaces in the demo.


----------



## superj1977

Zero surprise here, moving on.


----------



## AaronD01

What I don't understand is why they didn't just spend another $100 to put 8GB of DDR3 RAM in. Imagine if the consoles had the whole 8GB of GDDR5 memory for graphics alone. They probably could have done it for $50, and it would have freed up a load of resources.


----------



## ZealotKi11er

Quote:


> Originally Posted by *AaronD01*
> 
> What I don't understand is why they didn't just spend another $100 to put 8GBs of DDR3 Ram in. Imagine if the consoles had the whole 8GBs of GDDR5 memory for graphics alone. They probably could have done it for $50, and it would have freed up a load of resources.


You have to think about the number of sales. $50 over 10M consoles is $500M. That's a lot of money; every dollar saved counts.
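The back-of-the-envelope math is easy to sketch (the $50 per unit and 10M unit figures are the hypothetical numbers from this post, not real BOM data):

```python
# Back-of-the-envelope BOM savings, using the hypothetical figures
# from the post ($50 saved per unit, 10M units sold).
cost_saved_per_unit = 50        # dollars shaved off each console's BOM
units_sold = 10_000_000         # assumed early-generation install base

total_saved = cost_saved_per_unit * units_sold
print(f"${total_saved:,}")      # $500,000,000
```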


----------



## DRT-Maverick

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You have to think number of sales. $50 over 10M consoles is 500M. Thats a lot of money. Each $ saved is good.


Yeah, we can't be taking away from those execs' pocket lining now, can we.


----------



## Jedi Mind Trick

What resources would more RAM free up? These don't seem to be RAM-limited, especially with like 10x what the consoles had last time. More RAM =/= faster CPU.


----------



## Chimeracaust

Quote:


> Originally Posted by *Raven Dizzle*
> 
> I opt to see how "utilized" these consoles can be until ROCKSTAR develops a full-fledged game on them. Look at the tour-de-force they pulled off with the last gen releases of GTAV! Even if you didn't like the game, you have to admit it was a technical marvel for the old consoles.


Yeah that epic blurry nightmare that couldn't even manage 24 fps consistently. I don't know why Rockstar doesn't get more crap for their abysmal frame rates.


----------



## Jedi Mind Trick

Because their games are pretty fun, and worth the annoyance of sub par frame rates (at least to me).


----------



## mtcn77

With all the cooperative spirit between these large corporations, what if AMD took a hint or two from the Samsung 850's retro-lithography strategy and just mixed in, say, a 5870 with TSVs? It would be the current king of compact GPUs, imo.


----------



## KenjiS

Quote:


> Originally Posted by *Jedi Mind Trick*
> 
> What resources would more ram free up? These dont seem to be ram limited, especially with like 10x what consoles had last time. More ram =/= faster CPU.


You can free up CPU cycles by simply not compressing textures or audio.

This is why we're seeing the monumental amounts of RAM/VRAM usage in newer games: the textures CANNOT be compressed on the consoles because of the CPU overhead, and I'm guessing there's also overhead in streaming textures from the HDD, or some other bottleneck, that prevents that from being an efficient solution on the new architecture, which is why the programmers aren't doing it.

This is supported by most new releases using tons of RAM and VRAM. I don't think developers are necessarily being lazy so much as they're trying to build engines that deliver improvements in graphics quality while still running acceptably well on the consoles, and doing that means designing the game engine around the one strength these consoles have: a unified memory architecture...

RAM was a limitation on the old consoles in some ways. Games like Skyrim had a really hard time on the PS3 because it had split memory (256MB of system RAM and 256MB of VRAM). Eventually developers realized they had to essentially develop the PS3 version first and then port it to the 360 for it to be well optimized on the PS3; the 360's unified memory was easier to work with.

Of course PCs had no issue, because even when the 360 came out, 512MB of unified RAM was a pretty small amount... The X1800 GPU used in the 360 was available with 512MB of RAM, after all, and common cards easily had 256MB. Plus I believe games at first used a fairly even split: 256MB system and 256MB video.

This time around there was a very big jump in the amount of RAM: the consoles moved to 8GB in a unified architecture, so you can easily use 2GB as "system" and 4GB as "video". In comparison, there just weren't that many 4GB+ cards out there on PC. (To be fair, the only common 4GB cards were the AMD Radeons; a few OEMs produced 770s with 4GB, and I know there are a few 670 and 680 variants with 4GB, but almost all 780s and 780 Tis were 3GB, with VERY few 6GB 780s released before Nvidia pushed out the 970/980. Sure, the Titans have 6GB, but that's a tiny percentage of users.)
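To put rough numbers on the unified-vs-split point, a quick sketch (all figures are illustrative round numbers in MB, not official specs; the function names are made up for the example):

```python
# Illustrative contrast between a split memory layout (PS3-style) and a
# unified pool (PS4/Xbox One-style). Hypothetical round numbers in MB.

def video_budget_split(system_mb: int, video_mb: int, game_logic_mb: int) -> int:
    """With split pools, spare system RAM can't be repurposed for textures."""
    if game_logic_mb > system_mb:
        raise ValueError("game logic doesn't fit in system RAM")
    return video_mb

def video_budget_unified(total_mb: int, os_reserved_mb: int, game_logic_mb: int) -> int:
    """With a unified pool, whatever game logic doesn't use goes to the renderer."""
    return total_mb - os_reserved_mb - game_logic_mb

# PS3-style: 256MB + 256MB; even if game logic only needs 200MB,
# video is still capped at its own 256MB pool.
print(video_budget_split(256, 256, 200))        # 256

# Unified 8GB pool, ~2GB OS reserve, 2GB game logic -> 4GB left for video,
# matching the 2GB "system" / 4GB "video" split described above.
print(video_budget_unified(8192, 2048, 2048))   # 4096
```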


----------



## XKaan

Someone set a reminder on their calendars to bump this thread in 2 years so we can see where we're at...

Aw hell, I'll do it.

Optimization must be the most overused word in gaming these days, aside from "cinematic".

The console manufacturers got LAZY this time around. Sony blew their wad on the PS3 and played it safe this time, as did Microsoft. These new consoles are SFF PCs, and despite what everyone thinks there is no hidden 780 Ti in there just waiting to be unleashed with a little "optimization" or a DX12 update.

If you like specific console exclusives or sitting on a couch with your pals and playing whatever cinematic game Ubi is selling at that point then great, I'm seriously happy for you. HAVE FUN


----------



## Carniflex

Quote:


> Originally Posted by *XKaan*
> 
> ... snip ...
> If you like specific console exclusives or sitting on a couch with your pals and playing whatever cinematic game Ubi is selling at that point then great, I'm seriously happy for you. HAVE FUN


Exclusives is a fair point; however, as far as sitting on the couch with pals goes, PC does it better.
http://www.eurogamer.net/articles/digitalfoundry-2014-the-next-gen-digital-foundry-pc <- that was at launch. Half a year has passed and the situation is even worse for the "current gen" consoles. What this means, in a nutshell, is that you can build a PC at the console price that is better at being a console than the console. Throw in the price difference between games on PC and console, the subscription plan you need to play online, etc., and the PC is the better option unless, as you pointed out, you need/want a specific game that's only available on that specific console. Almost every discrete GFX card has an HDMI port and it's trivial to hook it up to the TV. If you already have a gaming PC, just build a ~$150-200 HTPC and use Steam's streaming function to play games from your gaming PC (which is elsewhere in the house) on your TV and on your couch.


----------



## XKaan

Quote:


> Originally Posted by *Carniflex*
> 
> Exclusives is a fair point, however, as far as sitting on the couch with pals goes, PC does it better.
> http://www.eurogamer.net/articles/digitalfoundry-2014-the-next-gen-digital-foundry-pc <- that was at the launch. Half a year has passed and the situation is even worse for the "current gen" consoles. What this means that you can, in a nutshell, do a PC at the console price that is better at it than a console. Throw in the price difference between games on PC and console, the subscription plan you need to be online, etc and the PC is better option unless, as you pointed out, need/want that specific game thats available on that specific console. Almost every discrete GFX card has an HDMI port and it's trivial to hook it up to the TV. If you already happen to have a gaming PC just do a ~150.. 200$ HTPC and use Steam streaming function to play on your gaming PC (which is elsewhere in the house) on your TV and on your couch.


Oh, you are preaching to the choir here, my friend. I'm PC through and through. All I was trying to illustrate is that no matter what BS comes out of a dev's mouth, there are some people who are deluded about these consoles. Long story short, all I was saying is if you MUST play exclusives, then own a console. If you couldn't care less, then stop abusing yourself and go the PC route.

Me personally, I don't look at gaming as an in-person social type of thing. I'm around people all freaking day at work, so when I get home I want to put my headset on and get lost in a game. I never really saw the appeal of passing a controller around a living room. Again, that's just me.

What's different this generation is the PC crowd got tired of waiting for a studio to push PC boundaries, so now we just fund our own games, Elite Dangerous and Star Citizen being the most notable examples.


----------



## Sisaroth

Quote:


> Originally Posted by *XKaan*
> 
> Someone set a reminder on their calenders to bump this thread in 2 years so we can see where we are at...
> 
> Aw hell, I'll do it.
> 
> Optimization must be the most overused word in gaming these days, aside from "cinematic".
> 
> The console manufacturers got LAZY this time around. Sony blew their wad on the PS3 and played it safe this time around, as did Microsoft. These new consoles are SFF PC's, and despite what everyone thinks there is no hidden 780Ti in there just waiting to be unleashed with a little "optimization" or a DX12 update.
> 
> If you like specific console exclusives or sitting on a couch with your pals and playing whatever cinematic game Ubi is selling at that point then great, I'm seriously happy for you. HAVE FUN


What's wrong with optimization? It can make quite a big difference. Compare a heavily modded Skyrim to Shadow of Mordor with ultra textures: say both use around 4GB of VRAM; Skyrim probably looks a lot better than Shadow of Mordor. And Skyrim isn't even that great performance-wise.


----------



## BigMack70

Quote:


> Originally Posted by *Carniflex*
> 
> as far as sitting on the couch with pals goes, PC does it better.


This is not true. Quick - name me the PC games in which you can play 4-player splitscreen from your couch.

Oh wait...


----------



## Carniflex

Quote:


> Originally Posted by *Sisaroth*
> 
> Whats wrong with optimization? it can make quite big differences. If you compare a heavily moded skyrim to shadow of mordor with ultra textures. Say that both use around 4 GB VRAM, then skyrim probably looks a lot better than shadow of mordor. And skyrim performance wise isn't even that great either.


There is nothing wrong with optimization. However, optimization is not some kind of black magic that draws performance out of thin air.

My personal suspicion is that many devs lamenting the lack of performance are using outdated paradigms, approaching the new consoles like any other x86 box and trying to make their heavily single-thread-dependent existing engines "work" somehow. I mean, sure, the hardware in there is not something that will knock your socks off; on the other hand it's not particularly wimpy either, assuming you make use of the extensions available, like AVX. The supposed HSA support is a wildcard; no idea how that might or might not work in practice. On paper it's supposed to be pretty awesome.

Based on the numbers on paper, these consoles should have no major problem hitting 1080p at around 60 fps. Similar hardware (although admittedly with a much stronger CPU) can do resolutions above 4K at 30-60 fps in my PC without castrating the settings into oblivion. I'm talking specifically about the 7870 Eyefinity 6 model in my sig rig. Surely the consoles must be stronger than the venerable 6770 Eyefinity 5 card, also in my sig rig, and even that is capable of dragging itself up to 4K (but not really beyond) with only 1GB of VRAM and a reasonably consistent 30 fps in older engines, for example Unreal 3, with some manual tweaking of the settings, like not using any AA and being careful with shadows.


----------



## Carniflex

Quote:


> Originally Posted by *BigMack70*
> 
> This is not true. Quick - name me the PC games in which you can play 4-player splitscreen from your couch.
> 
> Oh wait...


https://steamcommunity.com/app/49520/discussions/0/864977025916708574/
Quote:


> Supported Games:
> - Borderlands 2
> - Borderlands: The Pre Sequel
> - Left 4 Dead 2
> - Resident Evil 5


The joys of PC gaming. You can do stuff ...
Edit: You are not limited to offline, of course. Granted, the games that support this are few and far between, but it's not a particularly common feature on the current-gen consoles either, as far as I'm aware.


----------



## BigMack70

Quote:


> Originally Posted by *Carniflex*
> 
> https://steamcommunity.com/app/49520/discussions/0/864977025916708574/
> The joys of PC gaming. You can do stuff ...


So according to you, the ability to play those four games split screen on your PC with superior graphics means that *as a platform*, PC offers a superior in-room social experience to consoles? ............ Are you really making this argument?

That's completely ridiculous. If PCs ever get native split-screen support for at least 2 players in most of the games that have it on console, then you'll have a point. Until then, your argument that PC offers a superior social experience for you and your friends on the couch is 100% absurd.

If you want to have your buddies over routinely to play video games, you need to own a console of some sort. I would love to see that change and have exclusives be the only reason consoles have to purchase them, but I don't see it happening anytime soon.

I guess, that is, unless the only games you and your buddies want to play are Borderlands, Left4Dead, and RE5.


----------



## Carniflex

Quote:


> Originally Posted by *BigMack70*
> 
> So according to you, the ability to play those four games split screen on your PC with superior graphics means that *as a platform*, PC offers a superior in-room social experience to consoles? ............ Are you really making this argument?
> 
> That's completely ridiculous. If PCs ever get native split screen support for at least 2 people for most of the games that have it on console, then you have a point. Until then, your argument is completely absurd that PC offers a superior social experience for you and your friends on the couch.
> 
> If you want to have your buddies over and play video games, you need to own a console of some sort. I would love to see that change and have exclusives be the only reason consoles have to purchase them, but I don't see it happening anytime soon.
> 
> I guess, that is, unless the only games you and your buddies want to play are Borderlands, Left4Dead, and RE5.


If you are trying to argue that split screen is somehow required for a "better social experience", then I'm sorry, I'm not quite following you. You asked for PC games that can do 4-way split screen; I showed some games that can, although admittedly with some limitations. The social argument is an entirely separate issue. If you and your friends have PCs, they bring their PCs along and you have a blast. If you and they have consoles, you can have the same social experience using consoles. The hardware used is irrelevant in that regard.

I'm not familiar with console gaming, but as far as I'm aware the 4 way splitscreen feature is not a common one either on the current gen console titles.

As far as me and my buddies go, they all have PCs and no consoles. When they come over they bring their PCs/laptops and we have a blast that way. We don't use split-screen or controllers for that, though. PC gaming is at its best when you have the screen all to yourself. We don't play stuff that would benefit greatly from split-screen anyway.

The point I was making earlier, however, is that you can build a PC at the same cost as a console that plays the games available on PC with better GFX settings or higher/smoother frame rates, all without having to pay for a subscription plan to play online, and the same games are often cheaper on PC.


----------



## BigMack70

Here's a VERY out-of-date example of local multiplayer on the Xbox 360: 68 games available with 1-4 local players, just a couple of years into that one console's life.

Not all games on console support 4-way local multiplayer, but there are at least two orders of magnitude more console games you can play local multiplayer on than on the PC currently. Also, many of the most popular titles DO support local splitscreen for at least 2 players.

The requirements in terms of hassle/difficulty/hardware are FAR higher for playing local multiplayer together on the couch on PC. (Your buddies come over... do you line up 4+ monitors/TVs side by side on your entertainment center/wall/desk so you can all sit on the couch together? That's not what most LANs I've been to look like...) Not only that, but you have extremely limited options for playing with people who don't own gaming PCs or the same games you do.

The requirements to play local multiplayer on a console are virtually nil: 1) someone among you and your friends needs the console and the game you want to play, and 2) you go to that person's house with the necessary controllers. My buddies who don't own an Xbox can come over and play Halo with zero effort. I can go to my brother's house and play Smash Bros with zero effort. This is a VASTLY superior experience to what you get with the PC.
Quote:


> The point I was making, however, earlier is that you can do a PC at the same cost as a console that playes the games that are available on PC with better GFX settings or higher/smoother frame rates. all that without having to pay for a subscription plan to be online and the same games are cheaper, often, on PC.


This is correct, but your statement that:
Quote:


> as far as sitting on the couch with pals goes, PC does it better.


neither follows from the above nor is it in any way correct or defensible.

As far as playing by yourself or multiplayer over the internet goes, the PC does it better. As far as sitting on the couch with pals goes, consoles do it better. The difference in the local social experience is as superior on console as 4K 60fps on PC is graphically superior to 720p 30fps on console.


----------



## maarten12100

Quote:


> Originally Posted by *Sisaroth*
> 
> Whats wrong with optimization? it can make quite big differences. If you compare a heavily moded skyrim to shadow of mordor with ultra textures. Say that both use around 4 GB VRAM, then skyrim probably looks a lot better than shadow of mordor. And skyrim performance wise isn't even that great either.


Using Skyrim as an example of optimization... oh man, I didn't think that would ever happen.









Skyrim uses x87 instructions and has one of the most horrible engines ever created in terms of scaling, and it's DX9 on top of that...
VRAM allocated doesn't mean VRAM in use, and all the Shadow of Mordor claims were only there because Nvidia bribed them to list it as such. (Not sure if you have the game, but it says "Nvidia: the way it's meant to be played" at the very beginning.)

(AMD bribes too, under the Gaming Evolved name; that's why some titles had 4GB requirements.)


----------



## Arthedes

Quote:


> Originally Posted by *bucdan*
> 
> So ubisoft is claiming that they fully utilized all of the weak jaguar cores and the gpu? I highly doubt it. I wonder if valve will try something to prove them wrong in their new upcoming games, whenever that may be.


Valve's new game will be HL3 of course.


----------



## Robin Nio

Quote:


> Originally Posted by *Arthedes*
> 
> Valve's new game will be HL3 of course.


I hope for the Source 2 engine or Portal 3, because HL3 is probably overhyped.


----------



## BulletSponge

Quote:


> Originally Posted by *BigMack70*
> 
> So according to you, the ability to play those four games split screen on your PC with superior graphics means that *as a platform*, PC offers a superior in-room social experience to consoles? ............ Are you really making this argument?
> 
> That's completely ridiculous. If PCs ever get native split screen support for at least 2 people for most of the games that have it on console, then you have a point. Until then, your argument is 100% absurd that PC offers a superior social experience for you and your friends on the couch.
> 
> If you want to have your buddies over routinely to play video games, you need to own a console of some sort. I would love to see that change and have exclusives be the only reason consoles have to purchase them, but I don't see it happening anytime soon.
> 
> I guess, that is, unless the only games you and your buddies want to play are Borderlands, Left4Dead, and RE5.


Couch notwithstanding, THIS is the superior social gaming experience.



----------



## BizzareRide

Quote:


> Originally Posted by *BulletSponge*
> 
> Couch notwithstanding, THIS is the superior social gaming experience.
> 


You can do that with consoles as well. And it's easier, since they're more portable.


----------



## Dyson Poindexter

Quote:


> Originally Posted by *BizzareRide*
> 
> You do that with consoles as well. And its easier since they're more portable.


What console game allows multiplayer with hundreds of people?


----------



## DarkBlade6

What were they expecting?! They went all-out AMD to keep costs down, but now they are CPU-bottlenecked. 8 Jaguar cores...







lol, reminds me of the failed Atari Jaguar console


----------



## Tojara

Quote:


> Originally Posted by *maarten12100*
> 
> Using Skyrim as example of optimization OH man I didn't think that this would ever happen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Skyrim uses x87 instructions and has one of the most horrible engines every created in terms of scaling even worse DX9...
> VRAM allocated doesn't mean in use and all the Shadow of Mordor claims were just there because Nvidia bribed them to list it as such. (not sure if you have the game but it says "Nvidia the way it's meant to be played" at the very begining)
> 
> (AMD bribes too under the gaming evolved name that is why some titles had 4GB requirements)


Skyrim has actually been rather decent ever since that one guy got Bethesda to add SSE support, which made the game run decently. My II X4 620 and 5770 come pretty close to maxing it out at 60fps at 1680x1050, which is pretty nice.


----------



## Pip Boy

Quote:


> Originally Posted by *Carniflex*
> 
> I'm not familiar with console gaming, but as far as I'm aware the 4 way splitscreen feature is not a common one either on the current gen console titles.
> .


The only difference is that if you game on a big 4K screen with your PC over HDMI, each player gets a 1080p slice with no reduction in graphics quality per screen.









What is 900p divided by 4?


----------



## Shadow11377

Quote:


> Originally Posted by *phill1978*
> 
> The only difference is that if you game on a big 4k Screen with your PC + HDMi each player gets a 1080p slice with no reductions in graphics quality per screen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> what is 900p divided by 4


400x225? That's what you get if you divide both dimensions by 4 (a sixteenth of the screen), smaller than 360p each.

Nope, sorry. Quartered, it's:

800x450 | 800x450
800x450 | 800x450
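The tile arithmetic both posts are doing can be sketched quickly ("900p" taken as 1600x900, 4K as 3840x2160, and a 2x2 grid assumed; the function name is made up for the example):

```python
# Per-player tile size for split-screen on a grid x grid layout.
def splitscreen_tile(width: int, height: int, grid: int = 2) -> tuple:
    return width // grid, height // grid

# A 4K panel quartered gives each player a full 1080p slice...
print(splitscreen_tile(3840, 2160))   # (1920, 1080)

# ...while quartered 900p (1600x900) drops each player below 480p.
print(splitscreen_tile(1600, 900))    # (800, 450)
```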


----------



## BulletSponge

Quote:


> Originally Posted by *BizzareRide*
> 
> You do that with consoles as well. And its easier since they're more portable.


Portable?


Spoiler: Warning: Spoiler!







Checkmate.


----------



## BigMack70

Quote:


> Originally Posted by *BizzareRide*
> 
> Couch notwithstanding, THIS is the superior social gaming experience


Yeah... just... no. That's not even comparable to relaxing on the couch with friends; it's apples and oranges.


----------



## AaronD01

Quote:


> Originally Posted by *Jedi Mind Trick*
> 
> What resources would more ram free up? These dont seem to be ram limited, especially with like 10x what consoles had last time. More ram =/= faster CPU.


The consoles have 8GB of GDDR5 memory each, and that would be GREAT, except that it's split between the game, the operating system, and whatever secondary application is open. Put the OS and other applications on DDR3 RAM and you could not only run more at once, but all of a sudden you'd have a full 8GB of GDDR5 to use. A second upgrade would obviously be the CPU, and an optional upgrade to an SSD.

The PlayStation 3 sold for $599 at launch, and even then Sony was losing $200 on every system. Why, you ask? Because they knew that down the road they would recoup those losses and make a profit once hardware costs came down. When you can promise a manufacturer massive orders of the same components for 10 years, you have a lot of room to negotiate a better price. It doesn't seem that Sony or Microsoft did that this generation.

Another thing to be aware of is that the real money doesn't actually come from the sale of the console itself, but rather from software sales and, of course, monthly subscription fees. Put in $700 worth of components (which would normally retail for even more), sell the console at $400 or $500, and you could have a seriously beastly console... for at least another 4 years minimum.

When the Xbox 360 and Playstation 3 first came out, they were ahead of even high-end PCs. It's a shame that this generation has gone so badly.

How much do you think that a 4790K cpu will sell for in 10 years? Take a loss now, only to make bank later.


----------



## VeerK

I'm not sure how to feel about this situation, other than to laugh at the people who kept defending the consoles, claiming the lack of optimization meant their "true power was untapped". On the other hand, I do own a PS4 for some exclusives which are near and dear to my heart and won't be on PC. Either this means a short console cycle, which may or may not mean PC stops being held back in game development, or it means people will be outraged when their "next-gen consoles" turn out to be a mediocre stop-gap. I don't want to feel like I wasted my money on a PS4 with a PS5 on the horizon, especially when I could have poured that money into my PC rig. I will say this: the only multiplat game I'm planning to buy for my PS4 is CoD:AW, because it just plain epitomizes couch gaming after a hard day at work for me. For superior everything else, I have my baby.


----------



## maarten12100

Quote:


> Originally Posted by *Tojara*
> 
> Skyrim has actually been rather decent ever since that one guy got Bethesda to add SSE support to make the game run decently. My II X4 620 and 5770 get pretty close to maxing it out at 60fps at 1680x1050, which is pretty nice.


I didn't know they did that; good to hear. But it's still a DX9 title which, although fun, will never be more than lousy in terms of engine efficiency.








Quote:


> Originally Posted by *Dyson Poindexter*
> 
> What console game allows multiplayer with hundreds of people?


I don't know, but I don't think you'd want hundreds of people in a single game; it can get crowded with 32 on a server, so unless maps are huge you shouldn't have more.
Quote:


> Originally Posted by *BulletSponge*
> 
> Portable?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Checkmate.


Good luck playing competitively on a floppy laptop keyboard.


----------



## Carniflex

Quote:


> Originally Posted by *maarten12100*
> 
> Good luck playing competitive on a laptop floppy keyboard.


Hehe. To be fair, I don't believe many people play competitively using the laptop touchpad either. A good small separate keyboard, a proper mouse, and a laptop can be a decent enough portable gaming rig. I carry my keyboard and mouse with me daily, as I get irritated if the keyboard or mouse isn't exactly what I'm used to. Then again, I also carry my desktop between work and home, because I customize my stuff (and need a lot of HDD space for data) and I get mildly irritated if the environment I'm working in isn't exactly what I'm used to; I'd have to start thinking about what I'm doing instead of focusing on the content.


----------



## Dyson Poindexter

Quote:


> Originally Posted by *BigMack70*
> 
> Yeah.... just..... no. That's not even comparable to relaxing on the couch with friends, and is just apples and oranges.


Who needs 1080p 60 FPS when you have a nice couch? Checkmate, PC!


----------



## Kedas

Well, this is Ubisoft saying that, and we all saw what their games have become these last few months.


----------



## lacrossewacker

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> Who needs 1080p 60 FPS when you have a nice couch? Checkmate, PC!


I'll be enjoying 1080p 60fps Halo, on the couch









I'd enjoy just as much if it was 900p/30fps too....

Just as I enjoyed Crysis 720p at 25fps in 2007.....

I still think you guys give too much credit to PC gaming. I can't imagine a 10-year-old saying "hey, I'll just buy a PC real quick; let me convince my parents to buy all these parts from some e-tailer, using components I don't even understand." Versus: hey, I'll just get one of the 3 choices and call it a day! Any of my friends can come over and pick up any game without any "training".


----------



## t00sl0w

just to be clear, the console GPUs are equal to something like the 7790 NOT the 7870...right?
I keep seeing 7870 tossed around and that seems way too high.


----------



## Pip Boy

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> Who needs 1080p 60 FPS when you have a nice couch? Checkmate, PC!


who needs gaming when you have a nice girl on the couch.

checkmate all of you.


----------



## Pip Boy

Quote:


> Originally Posted by *t00sl0w*
> 
> just to be clear, the console GPUs are equal to something like the 7790 NOT the 7870...right?
> I keep seeing 7870 tossed around and that seems way too high.


I know. Some say 7870, some say 7850; it's more like a 7790, and if I'm not mistaken it has a slightly slower core clock, so it's more an "HD 7780" if you will. Couple that with a multi-core netbook-level CPU and I can't imagine state-of-the-art 1080p 60fps graphics are easy to pull off without dipping into the 40s on something that's only 30-35% faster than an HD 5770 from 2009.

They have chosen 30fps for consistency of performance.


----------



## BulletSponge

Quote:


> Originally Posted by *phill1978*
> 
> who needs gaming when you have a nice girl on the couch.
> 
> checkmate all of you.


Try 2 on the couch in SLI; get on my level.









Edit-I sense a disturbance in the thread. Post removed in 3.....2.....


----------



## azanimefan

Quote:


> Originally Posted by *bucdan*
> 
> So ubisoft is claiming that they fully utilized all of the weak jaguar cores and the gpu? I highly doubt it. I wonder if valve will try something to prove them wrong in their new upcoming games, whenever that may be.


This is Ubisoft; I wouldn't trust a thing they said, ever... not after what they did to the Silent Hunter series. Furthermore, this company LOATHES coding multithreaded and is so wedded to Nvidia that they're using Nvidia dev tools to code games for an AMD GPU. (Gee, I wonder if that has anything to do with the dog performance. Many other devs are getting 60fps with games that look as good as or better than the dogs Ubisoft releases.) It sounds like whining to me.


----------



## KenjiS

Quote:


> Originally Posted by *t00sl0w*
> 
> just to be clear, the console GPUs are equal to something like the 7790 NOT the 7870...right?
> I keep seeing 7870 tossed around and that seems way too high.


The Xbox One is a 7790 or R7 260 (same silicon; the 7790 differs in core count, and the R7 260 is actually "closer" on paper to what's in the One, but both chips are almost equal in terms of performance)

PS4 is a 7850

I used the 7870 as an example in one of my posts (in the game I was referencing, the 7870 was barely powerful enough to handle a 60fps average), noting that it's more powerful than what's in either console and IT struggles.

Consoles needed 7950-level hardware. Pure and simple.


----------



## ILoveHighDPI

Quote:


> Originally Posted by *lacrossewacker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dyson Poindexter*
> 
> Who needs 1080p 60 FPS when you have a nice couch? Checkmate, PC!
> 
> 
> 
> I'll be enjoying 1080p 60fps Halo, on the couch
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd enjoy just as much if it was 900p/30fps too....
> 
> Just as I enjoyed Crysis 720p at 25fps in 2007.....
> 
> I still think you guys give too much credit to PC gaming. I can't imagine a 10-year-old saying "hey, I'll just buy a PC real quick, let me convince my parents to buy all these parts from some etailer, using components I don't even understand." Versus: hey, I'll just get one of the 3 choices and call it a day! Any of my friends can come over and pick up any game without any "training"

Unless your mother has been building PCs ever since they existed and expects you to do the same.

I guess some people are a little more fortunate than others.


----------



## criznit

1080p/60 fps on the couch is actually really nice! Hooking my PC up to the TV with the PS4 controller is great! Now if only these 4K TVs would go down in price...


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thats the thing. If a PC with a "weak" CPU and a HD 7850 can do 1080p then why cant these consoles always do it and do it better?
> 
> The way i see it developers have to come with clever way to make their games look better and have unique ideas. Throwing more performance does nothing i mean nothing. Look at Crysis 1 for example and compare it to a game today. Todays games got nothing on it even though a PC from 2007 is 5-10x slower then a PC now.


A PC with a weak CPU and a 7850 can do 1080p just fine... with lowered settings... and it still wouldn't average 60 fps in all games. Some of you guys are making it seem like your $500 cards in your $1300 rigs can run all games at 1080p maxed out averaging 60 fps. Even the GTX 980 can't run every game maxed out at 1080p at 60 fps.

These consoles are doing fine for what's in them. A $400 PS4 or $350 Xbox one is guaranteed to run every single game made for it in the next 8 or so years. Our PCs can make no such guarantee.


----------



## Synthesize

Quote:


> Originally Posted by *Lanlan*
> 
> Man, I sure can't wait to play my Smash Bros on Wii U in 1080p and at 60fps.


Everyone take a moment to look at the rare unicorn of the tech forum.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *BulletSponge*
> 
> Portable?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Checkmate.


I'm sure that thing costs about as much as, or more than, the Xbox One, PS4, and Wii U combined.


----------



## Shadow11377

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> A PC with a weak CPU and a 7850 can do 1080p just fine............with lowered settings.....and it still wouldn't average 60 fps on all games. Some of you guys are making it seem like your $500 cards in your $1300 rigs can run all games at 1080p maxed out averaging 60 fps. *The gtx980 can't even run all games maxed out at 1080p at 60 fps.*
> 
> These consoles are doing fine for what's in them. A $400 PS4 or $350 Xbox one is guaranteed to run every single game made for it in the next 8 or so years. Our PCs can make no such guarantee.


No, but what a single 980 *does* do, in a good PC, is guarantee the ability to reach 1080p 60 FPS+ in almost every game if you configure settings the right way. This will almost never result in lower quality than the consoles unless it is a terrible (and I mean absolutely terrible) port.

Low settings with below 60 fps average (not just dipping) is not what I would call a fine experience either. It would have to be called simply playable at that point.

The consoles do guarantee compatibility with games released for them, but unfortunately they offer no guarantee concerning the quality and/or performance of said games so it's not worth much of anything. Kind of like PC.. just run the latest Windows with a decent PC, and you're pretty much (Unofficially) guaranteed to be able to _run_ the games. Worried about crashing? Game saves corrupting? Well that can happen on consoles too.


----------



## Serandur

Quote:


> Originally Posted by *Shadow11377*
> 
> No, but what a single 980 *does* do, in a good PC, is guarantee the ability to reach *1080p 1440p* 60 FPS+ in almost every game if you configure settings the right way. This will almost never result in lower quality than the consoles unless it is a terrible (and I mean absolutely terrible) port.
> 
> Low settings with below 60 fps average (not just dipping) is not what I would call a fine experience either. It would have to be called simply playable at that point.
> 
> The consoles do guarantee compatibility with games released for them, but unfortunately they offer no guarantee concerning the quality and/or performance of said games so it's not worth much of anything. Kind of like PC.. just run the latest Windows with a decent PC, and you're pretty much (Unofficially) guaranteed to be able to _run_ the games. Worried about crashing? Game saves corrupting? Well that can happen on consoles too.


Fixed

Source: me and my 970 (previously a 780) on a 1440p screen; the 980 has the extra push necessary for 60 in everything without stupid settings. 1920x1080 with a 980 is frankly lolworthy; it's too low a resolution for that level of performance unless someone absolutely must tick the "max" box on everything over some actually beneficial IQ improvements. In my opinion, a 980 is so overkill that 1920x1080 at 60 FPS is a sure thing without question for years, with the exception of crazy levels of AA and "max" settings that provide little benefit and crush performance for no good reason other than because they can.


----------



## Master__Shake

Quote:


> Originally Posted by *phill1978*
> 
> The only difference is that if you game on a big 4k Screen with your PC + HDMi each player gets a 1080p slice with no reductions in graphics quality per screen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *what is 900p divided by 4
> 
> 
> 
> 
> 
> 
> 
> *


pretty sure the answer is terribad


----------



## Hasty

Quote:


> Originally Posted by *Serandur*
> 
> Fixed
> 
> Source - me and my 970 (previously 780) on a 1440p screen; the 980 has the extra push necessary for 60 in everything without stupid settings. 1920x1080 with a 980 is frankly lolworthy, it's too low of a resolution for such levels of performance unless someone absolutely must tick the "max" box in everything over some actually beneficial IQ improvements, in my opinion; a 980 is so overkill that 1920x1080 and 60 FPS is a sure thing without question for years, with the exception of crazy levels of AA and "max" whatever that provides little benefit and crushes performance for no good reason other than because it can.


How do you define 60fps though?

Not sure about you, but I consider a game to be running at 60fps only if the frame time almost never exceeds 16.6 milliseconds.
That often means an average frame rate of 100fps or higher is needed.

This is important otherwise you will get V-sync induced micro-stuttering.

And the hell with 60fps. I'll take 144fps, thank you.
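Hasty's 16.6 ms criterion can be checked mechanically from a frame-time log. A minimal Python sketch (the helper name and the sample trace are illustrative, not from any real capture tool):

```python
# Check a frame-time trace against the strict definition of "60 fps":
# the frame time should (almost) never exceed the 16.6 ms budget.
BUDGET_MS = 1000.0 / 60.0  # ~16.67 ms per frame at 60 Hz

def is_locked_60(frame_times_ms, allowed_overruns=0):
    """True if at most `allowed_overruns` frames blow the 16.6 ms budget."""
    overruns = sum(1 for t in frame_times_ms if t > BUDGET_MS)
    return overruns <= allowed_overruns

# A 100 fps *average* can still fail: one 30 ms spike breaks the lock.
trace = [10.0] * 99 + [30.0]   # mean frame time 10.2 ms (~98 fps average)
print(is_locked_60(trace))     # False: the single spike exceeds the budget
```

This is why an average of 100fps or more is often needed for a stutter-free 60: the average hides the spikes that v-sync turns into visible judder.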


----------



## Omega X

Nice to see Ubisoft claiming that they can't optimize a game again. Brought to you in part by "we can't fit women characters on the disc".


----------



## Serandur

Quote:


> Originally Posted by *Hasty*
> 
> How do you define 60fps though?
> 
> Not sure for you. But I consider a game is running at 60fps only if the frame time almost never goes over 16.6 millisecond.
> That often means an average frame-rate of 100fps or higher is needed.
> 
> This is important otherwise you will get V-sync induced micro-stuttering.
> 
> And the hell with 60fps. I'll take 144fps, thank you.


I define it the same way you do outside of very rare and brief exceptions (an out of the ordinary explosion for instance), and my 970 is just short of being able to maintain a constant 60 at 1440p with highish settings only in a small handful of games where it dips into 50s and upper 40s territory like Watch Dogs and Crysis 3. A single, overclocked 980 would just barely mitigate the exceptions I've encountered. It's borderline and not a sure thing for the future, but the post I quoted and edited did say for almost every game and a 980 fits the bill for 60 fps and 1440p in almost every game with proper configuring.

I know from experience that I want more than my 970 for demanding titles in 1440p, but even if my 970 was at 1920x1080 vs a 980, it is complete overkill for a constant 60 FPS and frankly a squandering of its capabilities with such mediocre visual clarity unless of course you are channeling it towards 120/144 FPS, but that's not what I was addressing. The point was, 1920x1080 and 60 FPS is trivial for today's high-end gear, completely (2560x1440 and/or 120/144 Hz displays are far more fitting with proper settings). I'd have far, far more faith in a single 980's longevity (especially given those relatively low monitor standards) than any console, they're on completely different levels. Which is to say I agree with the post I quoted, I just think it was still understated.


----------



## KenjiS

Quote:


> Originally Posted by *ILoveHighDPI*
> 
> Unless your mother has been building PC's ever since they existed and expects you to do the same.
> 
> I guess some people are a little more fortunate than others.


Like my parents?









Seriously I've only had.. 3 store bought Desktop PCs in my lifetime.. Not going to count the Macs for obvious reasons..

A Compaq Presario AMD K6 200MHz, which was an educational experience. I think I was like 8, and my parents took me to all the stores and basically let me pick my own computer; they told me whatever I decided was fine with them (within reason), gave me advice, and answered anything I asked. It was between this and an HP with a Pentium 166 and a Voodoo 2... I picked the Compaq because it had built-in JBL speakers... and because it had Q from Star Trek. Yes, I had the option of having my dad custom-build me one and I decided against it. Hey, I was 8; it was an educational experience.







Wasn't a bad computer, mind you, especially after I tossed a Rage 128 into it!

A Dell Pentium 4 that never did anything but overheat. I opened it up to see if dust was clogging the CPU heatsink, and the next thing I knew the entire heatsink with the processor attached came off and hit the floor... the processor was toast after that.

An Alienware from right around when Dell bought them. I was like 16 at the time; literally, they were purchased a day or two after I bought it. It was a nightmare and a very long story. The short version is that it had the wrong GPU in it and was DOA (would not even POST), and Alienware refused to do anything about it; it even had a sticker on it saying it had not passed testing! They were going to charge me a restocking fee on a defective computer if I returned it, they told me they wouldn't get me the right GPU, and they told me if I wanted to exchange it they'd have to charge me for an entire second computer and then hit me with a 25% restocking fee. Please also note that at the time, the only place this restocking fee was mentioned was -in the box- which, conveniently, meant you then owed them the restocking fee. There's a lot more to the story, but it basically ended with my mom giving them two options. One: they take the defective computer back and send us a new one, with the GPU I ordered, no restocking fee. Or two: they get the computer back in a box, and she contacts the District Attorney and the BBB, tells everyone on Satelliteguys.us about the experience (which is a pretty large forum), and contacts the credit card company about this experience to get a full refund. This was after about 10 phone calls and 3 weeks of dicking around with Alienware, including them insisting on a tech coming out to replace parts and then telling us we had to ship it off to god knows where. A computer that never booted into Windows a single time. Seriously.

It ended up being option 2. And yes, she carried out every word of her threat. I will say that's probably the only time I've ever seen my mom legitimately lose her temper over this sort of thing, but your temper would be fried too after a literal 6 hours of arguing on the phone with them, escalating and trying to get things taken care of.

To be fair, I've dealt with Alienware since then (my M17x) and they're VASTLY better than they used to be. Dell is a much nicer company these days and has never really given me any issues... I think it was partly them getting sold to Dell and the CSRs basically not giving a toss because they probably had all just gotten termination notices.

As for other parents: they used to think my family was "weird" because I built my own computers, and I know I have some younger folks in my WoT clan who wanted to build their own computer and whose parents shot it down really fast because they think their kid will get electrocuted.


----------



## rcfc89

I just sold my PS4 yesterday. I tried really hard to enjoy it. In the end the graphics are just garbage. And the low fps just give me an instant headache after 30 minutes of game play.


----------



## iSlayer

Quote:


> Originally Posted by *Master__Shake*
> 
> pretty sure the answer is terribad


225p

Hideous
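For the record, a four-player split is a 2x2 grid, so it halves each axis rather than cutting the line count by four: each player of a 900p screen gets 450 lines (a quarter of the pixels, but not "225p"). A quick sketch of the arithmetic:

```python
def quadrant_resolution(width, height):
    """Each player in a 2x2 split screen gets half the width and half the height."""
    return width // 2, height // 2

print(quadrant_resolution(1600, 900))   # (800, 450) -> each player sees "450p"
```

Still hideous, just slightly less hideous than advertised.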


----------



## Pip Boy

Quote:


> Originally Posted by *BulletSponge*
> 
> Try 2 on the couch in sli, get on my level.


I prefer CrossFire.


----------



## Pip Boy

Quote:


> Originally Posted by *rcfc89*
> 
> I just sold my PS4 yesterday. I tried really hard to enjoy it. In the end the graphics are just garbage. And the low fps just give me an instant headache after 30 minutes of game play.


and here we have a real life experience from someone giving us a straight answer.


----------



## maarten12100

Quote:


> Originally Posted by *phill1978*
> 
> i prefer cross fire


Is that where you have 2 redhead girls lying all over you?

In that case I'd like some AGP(brown) low profile card. Small brown haired girls








Quote:


> Originally Posted by *phill1978*
> 
> and here we have a real life experience from someone giving us a straight answer.


Just one opinion, really, not an answer.


----------



## Pip Boy

Quote:


> Originally Posted by *maarten12100*
> 
> Is that where you have 2 redhead girls lying all over you?
> 
> In that case I'd like some AGP(brown) low profile card. Small brown haired girls
> 
> 
> 
> 
> 
> 
> 
> 
> .


I just want one that can take an 8 pin.


----------



## nleksan

There is a big difference between NATIVE 1080p, and upscaled to 1080p, with most console games being the latter.


----------



## maarten12100

The only time fps on consoles gets really outrageously low is when you try split-screening with 4 players on a single Xbox 360. It was fine until we started blasting in COD: Black Ops zombies. It's funny; we'd been doing that for years when we were younger, split screen on an Xbox on a 12-inch CRT. It was hard to really see anything.


----------



## lacrossewacker

If "already hitting a performance wall" looks like this...then sign me up!




Same thing as usual. Last COD on X1 was 720p, mediocre graphics. This new COD is a mix of 1080p and 1360x1080p (mostly the latter) at a rock solid 60fps and pretty slick graphics...

Oh and guess what....BF5 will look better than BF4, and Halo 5 will look better than Halo MCC, and Naughty Dog will probably blow us all away.

Like others have said.....let's revisit this thread in 2 years. Can't believe we're having these discussions this early. We haven't even seen the onslaught of Unreal 4 engine games


----------



## perfectblade

this thread=people are mad about unoptimized pc ports


----------



## routek

Quote:


> Originally Posted by *KenjiS*
> 
> Xbox One is a 7790 or R7 260 (same silicon, the 7790 differs in core count, the R7 260 is actually "closer" on paper to whats in the One, but both chips are almost equal in terms of performance)
> 
> PS4 is a 7850
> 
> I used the 7870 as an example in one of my posts as an example (the game I was referencing the 7870 was barely powerful enough to handle 60fps average) noting that its more powerful than whats in either console and IT struggles
> 
> Consoles needed 7950-level hardware. Pure and simple


Pretty much this.

The PS4 GPU is 1.84 TFLOPS. The 7850 is 1.76 TFLOPS, while the 7870 is quite a jump at 2.5 TFLOPS, so think of it as a slightly better 7850.
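Those figures fall out of the standard peak-throughput formula for GCN parts: each shader can issue one fused multiply-add (2 FLOPs) per clock, so peak FLOPS = shaders x clock x 2. A quick sketch using the commonly cited shader counts and reference clocks (800 MHz for the PS4's GPU, 860 MHz for the 7850, 1000 MHz for the 7870):

```python
def peak_tflops(shaders, clock_mhz, ops_per_clock=2):
    """Theoretical peak: shaders x clock x 2 (one FMA = 2 FLOPs per clock)."""
    return shaders * clock_mhz * 1e6 * ops_per_clock / 1e12

print(peak_tflops(1152, 800))    # PS4:  1.8432
print(peak_tflops(1024, 860))    # 7850: 1.76128
print(peak_tflops(1280, 1000))   # 7870: 2.56
```

Peak numbers only, of course; real-world throughput depends on bandwidth, occupancy, and the rest of the pipeline.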


----------



## lacrossewacker

Quote:


> Originally Posted by *routek*
> 
> you missed the point entirely, its AMD parts to AMD parts from the same line, nothing to do with coding directly to hardware and PC. Knew that quote was coming though, I nearly put in a disclaimer to avoid silly quote like this.
> 
> to go back to your point, there's already a whole bunch of games where the desktop equivalent of a console performs around the same and in some cases a little better. *Coding to the metal hasn't helped the console much on evidence so far.* there's a lot of myths about low level coding but don't really want to go into it.


has it ever in the first year









If it's not "coding to the metal," the consoles will still have engines tweaked specifically to the exact specifications of the hardware inside, something PC games will never have.


----------



## DRT-Maverick

Quote:


> Originally Posted by *lacrossewacker*
> 
> has it ever in the first year
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If it's not "coding to the metal," the consoles will still have engines tweaked to specifically to the exact specifications of the hardware inside - something PC games will never have.


It's not like optimization for a PC is necessary if the game has been optimized for lesser hardware. It's like saying you have to optimize an F1 car for freeway driving.


----------



## DRT-Maverick

Routek, are you on drugs? I'm just curious, because "A PC would need twenty GTX 980s" is, uhm, really reaching there... By that I mean you don't seem to know what you're talking about. You seem to have this delusion that PCs and consoles are where they were in the early 2000s, when the original Xbox was just coming out with the same titles and quality as PC games. But alas, that was over 10 years ago, and the gaming industry has switched gears; in fact game developers switched gears in the early 2000s (listen to some Tim Sweeney and Cliff Bleszinski interviews from back then; they were the first to jump onto the console-exclusive bandwagon, but I digress, Epic has turned to crap anyway).

Due to that gear switch, PC hardware has exponentially surpassed the hardware found in consoles; we can see that by today's standards. If you're trying to argue otherwise, maybe you should get off your console and replace your 12-year-old PC with something new.


----------



## routek

Quote:


> Originally Posted by *lacrossewacker*
> 
> If "already hitting a performance wall" looks like this...then sign me up!
> 
> 
> 
> 
> Same thing as usual. Last COD on X1 was 720p, mediocre graphics. This new COD is a mix of 1080p and 1360x1080p (mostly the latter) at a rock solid 60fps and pretty slick graphics...
> 
> Oh and guess what....BF5 will look better than BF4, and Halo 5 will look better than Halo MCC, and Naughty Dog will probably blow us all away.
> 
> Like others have said.....let's revisit this thread in 2 years. Can't believe we're having these discussions this early. We haven't even seen the onslaught of Unreal 4 engine games


To be fair, Ubisoft are on about the CPU for AI and multiple game systems, not game looks.

Anyway, I'm looking forward to see what Naughty Dog can do, they're super talented.
Quote:


> Originally Posted by *DRT-Maverick*
> 
> Routek, are you on drugs? I'm just curious because "A PC would need twenty GTX 980's" is uhm, really reaching there... By that I mean you don't seem to know what you're talking about. You seem to have this delusion that PCs and consoles are where they were in the early 2000's, when the XBox (original) was just coming out with the same titles and quality as PC games. But alas, that was over 10 years ago, and the gaming industry has switched gears, in fact game developers switched gears in the early 2000's (listen to some Tim Sweeney and Cliff Blezinski interviews from the early 2000's, they were the first to jump onto the Console Exclusive bandwagon, but I digress, Epic has turned to crap anyway).
> 
> Due to the gear switch, PC hardware has exponentially surpassed the hardware found in consoles. We can see that by today's standard. If you're trying to argue that maybe you should get off your console and replace your 12 year old PC with something new.


I'm sorry, but you've missed the part where it was a joke towards ZackisDead, who is a troll whose posts got deleted by the looks of it. I thought with "twenty 980s" even the most naïve person would've seen it was a joke, but I guess not, judging by your post. Honestly, really? Yikes.









----------



## umeng2002

Everyone into GPU tech, PC gamers mostly, already knew this when the specs were announced.


----------



## mtcn77

Children actually feel accepted when they are given rules to follow. There is always a limit; the question is what we can do with it, how far we can reach within the confinement until the mold is shed.
It is hypocritical, imho, to voice the limit without mentioning what can be done to improve on it.
PCs aren't serially optimized anyway; good luck with cache-stall neurosis.


----------



## SpeedyVT

Quote:


> Originally Posted by *routek*
> 
> Pretty much this.
> 
> PS4 GPU is 1.84tflops. 7850 is 1.76tflops. 7870 is quite a jump to 2.5tflops so a slightly better 7850


It's not just a 7850; it's a 7850 that can use HSA.


----------



## DRT-Maverick

My bad Routek! I suck at interpreting whether someone's serious or sarcastic or making a point about something else hehe.


----------



## SpeedyVT

Quote:


> Originally Posted by *routek*
> 
> To be fair, Ubisoft are on about the CPU for AI and multiple game systems, not game looks.
> 
> Anyway, I'm looking forward to see what Naughty Dog can do, they're super talented.
> I'm sorry but you've missed the part where it was a joke towards ZackisDead, who is a troll that got deleted by the looks of it. I thought saying twenty 980s even the most naïve person would've seen it was a joke but I guess not by your post. Honestly really? yikes
> 
> 
> 
> 
> 
> 
> 
> :


AI isn't a problem if they utilized the HW properly as GPUs are significantly better for AI not CPUs. Anyone who decides to write AI to a CPU is an idiot. Gaming community already knows that.


----------



## KenjiS

Quote:


> Originally Posted by *SpeedyVT*
> 
> AI isn't a problem if they utilized the HW properly as GPUs are significantly better for AI not CPUs. Anyone who decides to write AI to a CPU is an idiot. Gaming community already knows that.


Except the GPUs are already strained as it is.. So I don't think they have the overhead to do it

Ideally, perhaps, the better layout would have been a 4-core CPU and two GPU units: one devoted to physics/AI and the other to graphics processing. The physics/AI chip could be simpler than the graphics one...


----------



## lacrossewacker

Quote:


> Originally Posted by *SpeedyVT*
> 
> AI isn't a problem if they utilized the HW properly as GPUs are significantly better for AI not CPUs. Anyone who decides to write AI to a CPU is an idiot. Gaming community already knows that.


So they'd have to run the AI on the GPU on the consoles and run it on the CPU on the PC?

Yeah, that's not happening. Not on a multiplat.


----------



## Carniflex

Quote:


> Originally Posted by *SpeedyVT*
> 
> AI isn't a problem if they utilized the HW properly as GPUs are significantly better for AI not CPUs. Anyone who decides to write AI to a CPU is an idiot. Gaming community already knows that.


My impression is that AI is mostly integer load, which is not particularly suitable for a GPU. I guess it might perhaps be possible to write AI that operates on single-precision floats (it's, after all, just a matter of mapping into the right interval), but then it's a question of whether the stuff you do with the floats is optimal to do on a GPU. Then again, I'm not a gaming-AI specialist, so I might be quite far off track with my impression.

Where GPU should really excel at are some types of physics simulations.

Edit: As far as having to rewrite it for the PC port: it really depends how they do it. In principle PC CPUs should be strong enough to do the floating-point stuff on the CPU, especially if the code is compiled for PC with the modern extensions enabled. If it's written for the GPU it would already be massively parallel, so in principle it should scale linearly with the number of CPU cores.


----------



## Kuivamaa

Quote:


> Originally Posted by *Serandur*
> 
> Ideal? Laptop Haswell quad-core and Maxwell GPU (970M level); TDP isn't too high, power is nice, but price would be the issue... for the manufacturers. That's some very efficient hardware and they could play with the clocks how they wish. That would be a nice base-line for multiplatform development for some time.
> 
> They can code to the imaginary metal as much as they want, Jaguar is still a tablet/phone-level CPU and DX11's inefficiency is severely exaggerated if the results are to be anywhere near comparable to Jaguar with any decent Intel CPU from the past few years. Draw calls are an issue, yes, but there's Mantle and DirectX12 to the rescue for those.


Quote:


> Originally Posted by *SpeedyVT*
> 
> It's not a 7850 it's a 7850 that can use HSA


It is neither. (SP/TU/ROP)

7850 has 1024/64/32
PS4 has 1152/72/32
7870 has 1280/80/32

It is in between, that is. But really, claiming that some developer has already not only explored but fully exploited the new perks these consoles bring is ludicrous. A unified memory structure (GDDR5 on one hand; ESRAM, however tiny, on the other; is the industry suddenly full of masters of these?) and APIs many times more efficient than piss-poor DX11 (better not talk about DX9) mean that talented devs can work wonders with what the consoles sport. Yes, decent PCs already outmuscle the consoles (which are, after all, shader-limited; an R9 280 is very cheap already), and even low-end ones will match them once Mantle and DX12 become commonplace, but that is a different discussion.


----------



## routek

The PS4 has a custom GPU; when people say it's around a 7850, it's not to be taken literally. It's in that area, and not a 7870.
Quote:


> Originally Posted by *SpeedyVT*
> 
> AI isn't a problem if they utilized the HW properly as GPUs are significantly better for AI not CPUs. Anyone who decides to write AI to a CPU is an idiot. Gaming community already knows that.


I posted before in this thread how you can offload that to the PS4 GPU. I was outlining what Ubisoft said, you and the gaming community should get in touch with Ubisoft.


----------



## warr10r

Yes, the new console generation is a pile of crap.

Yes, Ubisoft is a pile of crap as well.

Me.Surprised = 0


----------



## routek

PS4 sitting where you'd expect, around the 7850-7870 range. PS4 is 50-60fps according to a test on EG, with drops to 40.

Not sure about COD: AW, but in other games the console version is sometimes lacking AF, has some poor FXAA, or is missing some environment detail here and there, or has lower-quality shadows.


----------



## SpeedyVT

Quote:


> Originally Posted by *Carniflex*
> 
> My impression is that AI is mostly integer load, which is not particularly suitable fro GPU. I guess it might be perhaps be possible to write AI that operates on single precision floats (it's after-all just a matter of mapping it into the right interval) but then its a question if the stuff you do with the floats is optimal to be done on GPU. Then again I'm not an gaming AI specialist so I might be quite far from the right track with my impression.
> 
> Where GPU should really excel at are some types of physics simulations.
> 
> Edit: As far as having to rewrite if fot the PC port. Depends really how they do it, in principle the PC CPU's should be strong enough to do the floating point stuff on CPU, especially if they support the modern extensions when compiling the code for PC. If it's done for GPU it would be already massively parallel in principle so should scale with the number of CPU cores linearly.


Abstract AI excels on GPUs where standard cycled AI fails. Abstract is much cooler because it can learn or change radically.


----------



## BinaryDemon

Isn't this when the cloud-servers are supposed to assist consoles with CPU intensive tasks?


----------



## Zaid

Quote:


> Originally Posted by *BinaryDemon*
> 
> Isn't this when the cloud-servers are supposed to assist consoles with CPU intensive tasks?


Of course, and it's only $9.99 a month. Who can't afford $10 a month? Except if you want to use the cloud for EA/Activision games, it's an extra $4.99 per game per month...


----------



## Pip Boy

Won't the CPU be posing a limitation?

In raw numbers, what does a maxed-out 8-core AMD Jaguar achieve compared to, say, a budget gaming PC running an AMD Piledriver 8-core 8320E Black Edition overclocked to a reasonable 4.5GHz over the 4.0GHz boost clock?


----------



## 222Panther222

The developers were screaming their lungs out for an "easier to develop for" console; now that they have it, I feel like they're just laying back and doing easy development instead of trying to optimize for the console.

Also, the tools and techniques seem to develop really slowly (AC: Unity), probably because the big PS3/360 install base means their revenue is assured whether they put the effort in or not.

Plus, game prices are bumping up for no reason at all, since IIRC Sony plans to make the PS4 last as long as the PS3, so games should stay $59.

Still, I'm waiting for a worthwhile exclusive.


----------



## Mattbag

Quote:


> Originally Posted by *222Panther222*
> 
> Still I'm waiting for worthwhile exclusive.


There aren't really any worthwhile exclusives, I agree. I don't like Infamous, and Killzone sucked on PS4. The Last of Us came out remastered on PS4, and that was a last-gen game! On Xbox One I'd love to get that Master Chief Collection, though, so that's honestly what's making me want to buy one soon.

Even still, I'm happy with my PS4; it serves its purpose and plays games not available on the PC. My good old PC feels like it will last me a few more years, and if I ever need the upgrade I think another 7970 will give me worlds of power.


----------



## SpeedyVT

The benefit I see with these consoles is that they could dramatically improve the way games operate on desktops, by forcing developers not to take the easy way out in their code.


----------



## CryphicKing

Another article written by a self-proclaimed visual R&D scientist with zero hours of training in QA/data mining/memory profiling. This person's only experience in rendering/coding/optimization is understanding how to put PC components together, yet he's trying to give a deep-down coding/rendering analysis of an SoC architecture and non-DirectX environment with his PC hardware as a reference. Hilarious.
Quote:


> Originally Posted by *routek*
> 
> 
> 
> PS4 sitting where you'd expect. Around the 7850-7870 range. PS4 is 50-60fps according to a test on EG and drops to 40
> 
> not sure about COD AW but in other games console is sometimes lacking AF, has some poor FXAA, or missing some environment detail here and there or lower quality shadows


Yeah, but that 7870 is running on a $1500 X99 system that just launched last month. Playing the PC/PS4 versions right now, I don't spot any FXAA, missing environment detail, or whatnot you talked about, nor do I find them in other multiplats other than BF4.


----------



## ChronoBodi

Quote:


> Originally Posted by *phill1978*
> 
> Won't the CPU be posing a limitation?
> 
> In raw numbers, what does a maxed-out 8-core AMD Jaguar achieve compared to, say, a budget gaming PC running an 8-core AMD Piledriver FX-8320E Black Edition overclocked to a reasonable 4.5GHz over the 4.0GHz boost clock?


More like: a dual-core Ivy Bridge i3 has about the same overall power as eight Jaguar cores.

Take from that what you will.


----------



## maarten12100

Quote:


> Originally Posted by *ChronoBodi*
> 
> more like an dual core i3 from Ivy Bridge has the same overall power equal to eight Jaguar cores.
> 
> Take from that what you will.


Nah, it is more powerful. Unless you're talking about an i3 at 4GHz; then your comparison works out.


----------



## SIDWULF

Consoles...seriously holding back PC games and limiting creative energy.


----------



## kennyparker1337

Quote:


> Originally Posted by *SIDWULF*
> 
> Consoles...seriously holding back PC games and limiting creative energy.


How?

You can make the same exact games for both PC and consoles.
You just have to turn down the graphics knob on the console version.


----------



## SIDWULF

Quote:


> Originally Posted by *kennyparker1337*
> 
> How?
> 
> You can make the same exact games for both PC and consoles.
> You just have to turn down the graphics knob on the console version.


Most games are made for consoles and then ported to PC later. Other games are made with all platforms in mind but inherit the performance limitations of consoles. Developers don't simply turn down the graphics knob; they have to turn it up for the PC, often yielding weak graphical upgrades on a console-based game...


----------



## Azuredragon1

Quote:


> Originally Posted by *SIDWULF*
> 
> Most games are made for consoles and then ported to PC later. Other games are made with all platforms in mind but inherit the performance limitations of consoles. They don't simply turn down the graphics knob they have to turn up the graphics knob for the PC often with weak graphical upgrades on a console based game...


Nothing wrong with a port as long it's well done.


----------



## Zaid

Quote:


> Originally Posted by *kennyparker1337*
> 
> How?
> 
> You can make the same exact games for both PC and consoles.
> You just have to turn down the graphics knob on the console version.


Except they're gimping the PC versions so console users don't have something to complain about. Some games are getting FPS caps, some are even getting resolution caps, and the PS4/XBone have not even been out that long. Consoles usually last 4-8 years before the next gen; in a year or two, PC hardware is going to be light years ahead of the PS4/XBone.

But we won't see games taking advantage of that hardware, because consoles are more important.


----------



## Carniflex

Quote:


> Originally Posted by *SIDWULF*
> 
> Most games are made for consoles and then ported to PC later. Other games are made with all platforms in mind but inherit the performance limitations of consoles. They don't simply turn down the graphics knob they have to turn up the graphics knob for the PC often with weak graphical upgrades on a console based game...


That is a rather narrow view of things. I assume you mean "major publisher AAA titles" with that "most games" statement, because when you look at online distribution platforms, including all the indie and smallish studio titles, it seems to me _most games_ are either mobile or PC exclusives. Hell, even the list of Steam games that support Linux is atm longer (about 500 titles according to wiki) than the entire library available on the current-gen PS4 and XBOne. I am ignoring everything browser-based, since in essence those can be counted as present on every platform that can run a browser, which at this age basically includes even the most recent fridges and other kitchen apparatus.

Some links:
http://en.wikipedia.org/wiki/List_of_PlayStation_4_games (There are currently 388 games on this list (14 of which have been confirmed as free-to-play).)
http://en.wikipedia.org/wiki/List_of_PlayStation_3_games (There are currently 796 games (multi-platform: 636; exclusive: 150; console exclusive: 10) and 15 canceled games on this list as of August 18, 2013.)
http://en.wikipedia.org/wiki/List_of_Xbox_One_games (Exclusive = 30, Microsoft exclusive = 17, Console exclusive or timed = 40/4, Multiplatform/TBA = 201/4, Playable = 112 available to purchase (14 of these are exclusive to Xbox One), Total = 301)
http://en.wikipedia.org/wiki/List_of_Xbox_360_games (There are currently 1,126 games (multiplatform: 926; exclusive: 187; console exclusive: 72) on this list as of November 3, 2014.)
And to be fair, I'm also using Wiki links for Steam:
http://en.wikipedia.org/wiki/List_of_games_using_Steam_authentication
http://en.wikipedia.org/wiki/Steam_(software) (As of September 2014, over 3,700 games are available through Steam, which has 100 million active users.)
http://en.wikipedia.org/wiki/GOG.com (As of October 28, 2014, there are 842 games available on GOG.)
As other digital platforms overlap (UPlay, Origin, XFire, Humble Store etc) I'm not counting them and also disregarding atm games released independently outside of distribution platforms (like most MMOs).

Not to mention entire genres of games that are either not present at all on consoles (like roguelikes) or have a negligible presence (MMOs, RTS, 4X, MOBA).


----------



## Carniflex

Speaking of consoles: does anyone know what is going on with the Steambox? I mean, it was supposed to be right around the corner when the PS4 and XBOne launched, but after that things went kind of quiet. There was some talk about redoing the controller and... then silence. The project, I think, is still going on, as Valve added the streaming function into the client a while ago and such, but it's been a while since I have seen any news on the subject. So, does anyone know anything about the "launch date", if you can say such a thing about something coming from Valve?


----------



## Juub

Quote:


> Originally Posted by *Carniflex*
> 
> Speaking of consoles - does anyone know what is going with the Steambox? I mean it was supposed to be right around the corner when the PS4 and XBOne launched but after that things went kinda quiet. There was some talk about redoing the controller and .. then silence. The project, I think, is still going on as Valve added, for example, the streaming function into the client a while ago and such but been a while since I have seen any news on the subject so anyone knows anything about the "launch date" if you can say such a thing involving something coming from Valve?


It's coming in 2015, Valve said. There was an Alienware prototype shown a couple of months ago that looked nice enough and had an MSRP of $600, if I recall correctly. Not much else since then. They'd better start doing some serious marketing if they want to build a solid customer base.


----------



## DIYDeath

Quote:


> Originally Posted by *Juub*
> 
> It's coming in 2015 Valve said. There was an Alienware prototype showed a couple of months ago that looked nice enough and had an MSRP of 600$ if I recall correctly. Not much else since then. They better start doing some serious marketing if they want to get a solid customer base.


Star Citizen release on pc/steambox as a release title.

Just you wait.


----------



## Carniflex

Quote:


> Originally Posted by *DIYDeath*
> 
> Star Citizen release on pc/steambox as a release title.
> 
> Just you wait.


Valve has claimed that it will not do "SteamOS exclusive" releases on its platform. Although I have to assume they mean they will not prevent devs from putting their games on other platforms as well if they so desire, as there are already several games on Steam which you cannot play without it, like, for example, Valve's own titles. Another popular rumor is ofc Half Life 3


----------



## nleksan

I do not, and will not, own any of this gen's consoles, but as far as improving felt performance.... how expensive could it really be to add something like a 32-64GB M.2 SSD and make a "Slim" version?
I would think that the decrease in loading time, huge increase in fluidity of the OS, and much better texture streaming from memory could improve games regardless of whether they were made with this in mind or not...


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *222Panther222*
> 
> The developers where screaming their lungs out about an "easier to develop for" console, now that they have it i feel like their just laying back and doing easy development instead of trying to optimize for the console.
> 
> Also the tools and techniques seems to develops really slowly(AC:Unity), probably due to the big ps3/360 install base so they revenue is assured no matter if they put the effort in or not.
> 
> Plus the game prince bumping up for no reason at all since iirc sony plan to make the ps4 last as long as the ps3 so games should stay 59$.
> 
> Still I'm waiting for worthwhile exclusive.


New games for the PS4 and Xbox One are both $60, which is exactly the same as last generation. We really shouldn't be complaining about game prices anyway, because I still remember the days of Genesis and SNES when there was no limit on game prices. MKIII at Toys R Us was $70, and many other games hovered in the $60 to $75 range. Mind you, this was in 1993 to 1995, so translated into today's money that would be around $110. Imagine paying over $100 for a game.
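For what it's worth, the "$70 then is ~$110 now" figure roughly checks out. A throwaway sketch (the ~2.4% average annual US inflation rate is my assumption, not a quoted figure):

```python
# Rough inflation check: what a $70 game from 1993 costs in 2014 dollars.
# ASSUMPTION: average US CPI inflation of roughly 2.4%/year over the period.
price_1993 = 70.0
years = 2014 - 1993              # 21 years
avg_inflation = 0.024            # assumed average annual rate
price_2014 = price_1993 * (1 + avg_inflation) ** years
print(round(price_2014))         # lands in the low-to-mid $110s
```

The exact answer depends on which CPI series you use, but any reasonable rate puts it north of $100.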


----------



## 222Panther222

In Quebec, the majority of PS4 games are $69, while PS3 and 360 games were, and still are, $59.
Quote:


> We really shouldn't be complaining about game prices anyways because I still remember the days of Genesis and SNES


It's like saying, "We shouldn't complain about poor grip on new tires because it was like this in the past."

Sony said they were planning the same lifespan for the PS4 as the PS3, except this time they are already making a profit on hardware from day one, so why sell games for $10 more? Plus, today they make you pay for DLC that is out at release and should have been on the disc already.

http://www.digitalspy.ca/gaming/news/a489548/playstation-4-to-have-ten-year-lifespan-says-sony.html
http://www.dualshockers.com/2014/05/22/kaz-hirai-the-ps4-is-already-profitable-with-hardware-alone/

I just hope that the extra $10 will go into bigger development budgets and research into new ideas instead of into the PR/publisher pocket.


----------



## Zastugueen

Quote:


> Originally Posted by *222Panther222*
> 
> In Quebec, the majority of PS4 games are 69$ While ps3 and 360 where and still is 59$
> It's like saying, "We shouldn't complain about poor grip on new tires because it was like this in the past."
> 
> Sony said they were planning the same lifespan for the ps4 as the ps3, *except this time they already making profits on hardware from day one, why selling games 10$?* Plus today they make you pay for DLC that is out at release and should have been on the disc already.
> 
> http://www.digitalspy.ca/gaming/news/a489548/playstation-4-to-have-ten-year-lifespan-says-sony.html
> http://www.dualshockers.com/2014/05/22/kaz-hirai-the-ps4-is-already-profitable-with-hardware-alone/
> 
> I just hope that the 10$ more will go into bigger development and research for new idea instead of going into the PR/publisher pocket.


Sony is still in economic shambles.


----------



## Pip Boy

Quote:


> Originally Posted by *Zaid*
> 
> in a year or 2 pc hardware is going to be light years ahead of ps4/xbone.
> 
> but we wont see games taking advantage of this hardware because consoles are more important.


To an extent, but even a game designed for 16:9 900p 30fps scaled to 1080p is going to look great running native at 1440p 144fps 21:9 with AA and even a slight improvement in lighting, textures, and shadows. Beyond that, there are many PC-based games that already make consoles look weak.

But yeah, in two years 4K at 30-40fps will be achievable with a mid-range $250 GPU, which is incredible to think about, and those rocking top-end SLI might even be able to downsample from 6K or even 8K at 60+ fps, which is approaching the limits of usable resolution. PC will be the only platform to get usably decent VR at 1440p or 4K, as VR requires a solid 75fps minimum to remain realistic and keep the 3D working properly.


----------



## PolyMorphist

You guys can't be blaming Ubisoft for this. They are one of the seven major game development studios in the world that use their own graphics API to render their games and implement the mechanics. This essentially means that all of the '1s and 0s' you hear about are at one of the lowest possible levels. Other in-house game engines are built in C++ and compiled for a Windows environment, whereas all Ubisoft games are built at a much lower level to optimize the game and increase performance. If there is a problem with performance, it's not because Ubisoft is incompetent at optimizing their games; it's because they don't have the processing power to fully realize their vision.


----------



## umeng2002

I think you're giving them too much credit


----------



## iTurn

Quote:


> Originally Posted by *Carniflex*
> 
> The point I was making earlier, however, is that you can do a PC at the same cost as a console that plays the games available on PC with better GFX settings or higher/smoother frame rates, all without having to pay for a subscription plan to be online, and the same games are often cheaper on PC.


Prove it!! Your budget is $350-$400. That's brand new with input devices and an OS.


----------



## bluewr

Ubi being Ubi again


----------



## Pendulum

Quote:


> Originally Posted by *iTurn*
> 
> Prove it!! Your budget is $350-$400. That's brand new with input devices and an OS.


...That is also already built for the end user, comes with a full warranty, and an OS that can play any game you throw at it. Don't forget blu-ray, an optical port, and built in wi-fi either.

PC is more expensive; that's just how it is. Your console is the same for a decade, while your PC will be upgraded 3-4 times or more in between for most of us.
Used games are a toss-up: friends, craigslist, and free PSN games vs. Steam and other vendor sales; new games are generally the same price on all platforms.

GTA V is 1080p on both consoles, looks better than AC:U, and has a similar amount of AI. I don't hear R* complaining.


----------



## Carniflex

Quote:


> Originally Posted by *iTurn*
> 
> Prove it!! Your budget is $350-$400. That's brand new with input devices and an OS.


That was already true at the launch: http://www.eurogamer.net/articles/digitalfoundry-2014-the-next-gen-digital-foundry-pc Now it's a bit more than half a year later.

The cheapest PS4 here is 380 EUR; I'll pick that one as the cheaper and stronger of the two consoles. I'm also going to use local prices (everything including 20% VAT). What I threw together quickly is the following:
gfx XFX RADEON R9 270X 2GB GDDR5 115
cpu Intel Core i3 2100 3.10 GHz 3M Socket LGA1155 32nm 65W Box 54
mobo ASRock Socket 1155 H61M-DG3/USB3 30
ram 2x Patriot DDR3 Signature 1333MHz 4GB Module, CAS 9, RETAIL 62
hdd Toshiba MQ01ABF050 500GB 38
psu generic 350W PSU 13
case generic mATX 13
mouse generic 800dpi mouse 2
keyboard generic kb 4
op sys Microsoft Windows Home Server 2011 46
Total: 377 EUR

Prices are from: http://www.hinnavaatlus.ee/

Although for a fair comparison you would want to throw in a couple of games as well. Another note: strictly speaking, one does not need Windows for a gaming PC, depending 100% on what one wants to play. SteamOS already has more games supported (approx 500) than have been released on the current-gen consoles (which sit atm at approx 390 titles). Although ofc having Windows still expands the options for gaming immensely. If one drops Windows and opts for SteamOS, it would make sense to get a stronger GFX card; the difference between a 120 EUR GFX card and a 160 EUR one is immense.
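The component arithmetic above is easy to sanity-check. A trivial sketch, with the prices copied straight from the list (EUR, VAT included):

```python
# Sum the parts list from the post above and compare against the PS4 price.
parts = {
    "XFX R9 270X 2GB":            115,
    "Intel Core i3 2100":          54,
    "ASRock H61M-DG3/USB3":        30,
    "2x 4GB Patriot DDR3-1333":    62,
    "Toshiba 500GB HDD":           38,
    "generic 350W PSU":            13,
    "generic mATX case":           13,
    "generic mouse":                2,
    "generic keyboard":             4,
    "Windows Home Server 2011":    46,
}
total = sum(parts.values())
print(total)                      # 377 EUR, vs. 380 EUR for the cheapest local PS4
```

So the build does come in 3 EUR under the quoted PS4 price, before games and peripherally debatable extras.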


----------



## iTurn

Quote:


> Originally Posted by *Carniflex*
> 
> That was already true at the launch: http://www.eurogamer.net/articles/digitalfoundry-2014-the-next-gen-digital-foundry-pc Now it's a bit more than half a year later.
> 
> The cheapest PS4 in here is 380 EUR, I'll pick that one as the cheaper and stronger of the two. I'm going to use also local prices (everything including 20% VAT) What I threw quickly together is following:
> gfx XFX RADEON R9 270X 2GB GDDR5 115
> cpu Intel Core i3 2100 3.10 GHz 3M Socket LGA1155 32nm 65W Box 54
> mobo ASRock Socket 1155 H61M-DG3/USB3 30
> ram 2x Patriot DDR3 Signature 1333MHz 4GB Module, CAS 9, RETAIL 62
> hdd Toshiba MQ01ABF050 500GB 38
> psu generix 350W PSU 13
> case generic mATX 13
> mouse generic 800dpi mouse 2
> keyboard generic kb 4
> op sys Microsoft Windows Home Server 2011 46
> Total: 377 EUR
> 
> Prices are from: http://www.hinnavaatlus.ee/
> 
> Although for a fair comparison you would want to throw in couple of games as well. Another note is that strictly speaking one does not need Windows for gaming PC 100% depending on what he wants to play. SteamOS has already more games supported (approx 500) than has been released on the current gen consoles (which sit atm at approx 390 titles). Although ofc having windows expands options for gaming immensely still. If one drops windows and opts for SteamOS it would make sense to get stronger GFX card. The difference between a 120 EUR GFX card and 160 EUR one is immense.


You're 50 euro over budget (I guess Windows Server is an OS), you have a weaker CPU and less RAM (can't imagine using 2GB in 2014), and it's silly to gimp my PC usage (SteamOS) just to play games... that's the beauty of PCs: a multi-function super device.

Did you read the article? OCN loves bringing it up, but Eurogamer couldn't do it either; they had to go over budget too: "Here's an outline of the costs in building your own Digital Foundry PC. All that's missing is an operating system. OEM versions of Windows 7 Professional are available on eBay for under £50, *meaning that an entire working PC can be built to a budget of £500.* We used prices here from Amazon.co.uk or Ebuyer.com, both of whom offer free delivery options.

CPU: AMD FX-6300 3.5GHz processor with fan - £80.99
Graphics Card: Nvidia GeForce GTX 760 - £176.90
RAM: 8GB Kingston HyperX 1600MHz - £61.42
Motherboard: Gigabyte GA-78LMT-USB3 mATX - £37.06
Storage: Seagate 500GB internal hard drive - £37.70
Case/Power Supply/Keyboard/Mouse: Gigabyte GZ-MA02 4-in-1 - £55.63
*Total: £449.70*
Our design omits an optical drive owing to the rise of digital delivery on the PC platform. However, numerous models are available costing just £13 if you want to add that functionality to your PC. For the current sweet-spot in price vs performance, GPU-wise, we recommend the Radeon R9 270, if you can find one on sale in the £120 area. *By shopping around you should be able to get a legit copy of Windows 7 for around £50, giving us a £500 total*."

No console costs 500 euros, much less £500.


----------



## mtcn77

I think the storage solution is the final bottleneck, but up until that point the XB One can load its active VRAM cache (ESRAM) the fastest, at 94 GB/s via its move engines and DDR3 RAM. That is almost fivefold what's available on PCs.


----------



## Carniflex

Quote:


> Originally Posted by *iTurn*
> 
> You're 50 euro over budget (I guess Windows Server is an OS), you have a weaker CPU and less RAM (can't imagine using 2GB in 2014), and it's silly to gimp my PC usage (SteamOS) just to play games... that's the beauty of PCs: a multi-function super device.
> 
> Did you read the article? OCN loves bringing it up, but Eurogamer couldn't do it either; they had to go over budget too: "Here's an outline of the costs in building your own Digital Foundry PC.
> 
> ...


Where are you getting this 50 EUR over budget? That setup costs 377 EUR including the OS; the cheapest PS4 was 380 EUR around here. The last numbers in the list are the prices in EUR as shown on the price comparison site I used. The RAM price is for two sticks.

The CPU is actually marginally stronger than the one in the PS4, although a direct comparison is ofc not possible, as the usual benchmarks do not run on a PS4; both architectures have their IPC reasonably well documented, though, so some kind of comparison can be made. Less RAM, really? The setup has 8 GB DDR3 + 2 GB of GDDR5, so if you must put everything into the same pot, that's 2 GB more than the PS4 (which btw does not allow any game to use the full 8 GB it has, as some of it is reserved for the underlying OS). As far as 2 GB of GFX RAM goes, it is actually good enough for 2014; you can stretch it as far as 4K resolution in most games by just turning down AA a little (which consoles do not have at all or have at minimal levels). And I'm not putting that statement out from hearsay; it's personal experience from running a 5-screen Eyefinity setup for years, including a while on a single 7870 Eyefinity 6 card with 2 GB vRAM.

As I stated, the Eurogamer article is half a year old; time has moved on, and as I demonstrated, it is nowadays possible within the budget. Sure, it will not knock your socks off, but the setup I posted will do better at 1080p than a "current gen" console, which, as we have seen, most of the time does not do 1080p. There is no way around it: building a gaming PC on such a limited budget demands significant compromises.

And yeah, it's a Windows OS. There are some limitations, as it allows only up to 8 GB of RAM, but it is in a nutshell a 64-bit Windows 7.

Edit: About the CPU comparison
Quote:


> Playstation 4 CPU's test substitute
> Before we start - remember, that the E-350 Bobcat only contains 2 cores, not the 4 of the AMD Jaguar - or indeed the 8 of the PS4. However, it's not very difficult to perform the simple maths involved. We can easily factor in the 15 - 20 percent performance increase on the IPC / general other improvements anyways. We'll use PassMark as our first example. The AMD E-350 obtains 772. For the purposes of these tests, let's assume that ALL of the AMD cores are used fully, and so we'll be 'generous' and simply times the number by 4. 4×2 = 8. 772*4=3088. Let's add in the extra performance and let's call the whole thing 3600. Intel i3-2100 (amazon link)
> @ 3.10GHz obtains a score of 3,600 (benchmark). The Intel Core I3-2100 is a slow CPU by PC standards.


From http://www.redgamingtech.com/amd-jaguar-ps4s-cpu-vs-pc-desktop/
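The quoted article's estimate is simple enough to reproduce. A sketch of its arithmetic (all numbers are the article's, not independent benchmarks; the exact IPC multiplier within the quoted 15-20% range is my choice):

```python
# Reproduce the article's scaling estimate: extrapolate an 8-core Jaguar
# PassMark score from a 2-core Bobcat (E-350) result, then apply the quoted
# 15-20% Jaguar IPC uplift. Crude, since PassMark doesn't scale perfectly
# with core count, but it's the article's method.
e350_score = 772                 # 2-core Bobcat PassMark (per the article)
scaled = e350_score * 4          # naive 2-core -> 8-core scaling
estimate = scaled * 1.17         # assumed ~17% IPC uplift (midpoint of 15-20%)
print(round(estimate))           # lands near the i3-2100's ~3600 score
```

Which is why "eight Jaguar cores ≈ one i3-2100" keeps coming up in these threads.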


----------



## Pip Boy

Quote:


> Originally Posted by *Carniflex*
> 
> Another note is that strictly speaking one does not need Windows for gaming PC 100% depending on what he wants to play. SteamOS has already more games supported (approx 500) than has been released on the current gen consoles (which sit atm at approx 390 titles)..


Nice build!

SteamOS/Linux is actually at about 770 playable games, and rapidly rising with every release, and that doesn't take into account open-source free games, GOG games, or Desura!

Furthermore, even though Pendulum makes a good point, it's all about economies of scale, so of course the PS4 will be $150 cheaper than a decent PC. But what's $150 in the grand scheme of things when you take into account the price of console games and PSN accounts? And let's face it, apart from the skint teenager who has to beg, borrow, and steal from his parents, an extra $150 isn't the biggest of deals, and it would take your build's GPU into a different league.

Also, it's a bit disingenuous to compare picking up old PS4 games on craigslist/ebay with Steam/GOG/HumbleBundle/Desura and all the rest that run flash sales where you don't have to lift a finger: no beat-up boxes with greasy, finger-marked discs inside.

Finally, is it all about gaming? If it were, then why do the consoles feature browsers, music software, and Blu-ray, and why is the Xbox One essentially a DVR with a GPU? Using a browser on a console is a tardy experience, the YouTube apps are terribly limited, and there is no support for ad blocking, cookie stomping, etc.

A PC can be used for content creation as well as consumption; can you put a price on that?


----------



## lugal

Quote:


> Originally Posted by *Carniflex*
> 
> ...


So what we have here is a "gaming" PC barely (if at all) matching the PS4 in raw performance, without close-to-metal optimizations, with an outdated OS, without a free game, and 50€ more expensive than a PS4. And on top of that, it's a bigger form factor using a "generic" PC case, a "generic" (I take it wired) mouse and keyboard, and lacking a Blu-ray drive completely (that is like 100€ more for BD and software).

GG. I bet millions of people can't wait to pay more money to put some hideous PC, that they need to build themselves and install everything on themselves, that needs to be controlled with a wired kb+m, and that provides worse gaming performance, into their living room.


----------



## Pip Boy

Quote:


> Originally Posted by *lugal*
> 
> GG, I bet millions of people cant wait to play @ 900p 30fps , that they need to switch on themselves, install everything five times over and update constantly themselves, that needs to be controlled with a wireless joypad thats inferior to kb+m and that provides worse gaming performance, into their living room.


----------



## Bryst

Quote:


> Originally Posted by *L D4WG*
> 
> I find this hard to believe, the amount of gains they squeezed out of the 360 and the PS3 in the last 24 months, compare The Last of Us and Halo 4 to the games released during the launch year of last gen.
> 
> I think these guys are talking out their rear ends.
> 
> Get some decent exclusives developed only for a single current gen console, no backwards gen compatibility and you will see improvements. These guys are lazy


I wouldn't be so quick to flaunt The Last of Us... Don't forget a lot of people's PS3s literally died playing it. It was too much for a lot of people's systems.


----------



## Carniflex

Quote:


> Originally Posted by *lugal*
> 
> So what we have here is "gaming" pc barely (if at all) matching ps4 in raw performance, without close to metal optimizations, with outdated os, without free game and 50€ more expensive than ps4. And on top of that its bigger form factor using "generic" pc case, "generic" (I take its wired) mouse and keyboard and lacking bluray drive completely (that is like 100€ more for bd and software).
> 
> GG, I bet millions of people cant wait to pay more money to put some hideous pc, that they need to build themselves, install everything themselves, that needs to be controled with wired kb+m and that provides worse gaming performance, into their living room.


Check the link with benchmarks a couple of posts back. This
Quote:


> "gaming" pc barely (if at all)


is actually offering better performance than the PS4 (even counting those hypothetical close-to-metal optimizations). As far as the "free game" goes, it depends on where you get that GFX card; both nVidia and AMD have their game bundle programmes.

Also, where are you guys getting this 50 extra EUR of cost? The PS4 I used for comparison (the cheapest version, with a 500 GB HDD) is locally 380 EUR. The "gaming PC" was 377 EUR including the Windows OS; unless someone has changed the fundamentals of math behind my back, that is still 3 EUR cheaper, not "50 EUR more expensive". Yeah, 3 EUR is a negligible difference, so for all practical purposes it's the same cost.

Now, the building-it-yourself part is an entirely valid remark, and also one of the key points here. People who have the interest and knowledge to do this are probably already PC gamers, and your typical console user is not likely to go down that path. However, the initial discussion started from my claim that "you can do a gaming PC that exceeds the console's performance at the same price". That is why I threw this example build together within this particular budget. If one wants to game, this build would in fact run, for example, BF4 with better framerate/graphical quality than the PS4 or XBOne, as was already shown in the link to the Digital Foundry article from the time the consoles were released. At that time the consoles had a slight price edge, which has evaporated by now, as my example build demonstrates.

Sure, we can go back and forth here: adding in the fees to be online with the console, debating whether what I picked is "real Windows" or whether Windows 7 is "outdated", then expanding to the whole picture by picking, say, a dozen games, adding in their prices on console and PC, and debating whether that is a representative set; arguing about Blu-ray drives and surround sound headsets, dragging Oculus Rift and Morpheus into the discussion, etc. However, the point still stands: as things are today, you can build a PC for the price of a console that plays games better.

I expect this to hold for a while, until console prices drop to ~250 EUR, at which point the compromises needed to put together something that plays games decently at that price point would probably be far too severe. However, by that time, a few years down the road, a ~400 EUR gaming PC would probably offer magnitudes better gaming performance all things considered. For example, the 380 EUR build I posted should be able to do ~30 fps at 4K without AA in the majority of current titles capable of running at that resolution (I can do it with an overclocked 7870 and a much stronger CPU; at high resolutions things get GPU bound, though, so it should be possible). That is four times the pixel count of the 1080p the consoles currently top out at (and the 4K @ 30 Hz support in current consoles is strictly reserved for "media consumption", not gaming).


----------



## caswow

Quote:


> Originally Posted by *Bryst*
> 
> I wouldn't be so quick to flaunt The Last of Us... Don't forget alot of peoples PS3s literally died playing it. It was to much for alot of peoples systems.


Got any proof of PS3s mass-dying because of this game?


----------



## lugal

Quote:


> Originally Posted by *Carniflex*
> 
> Chek the link with benchmarks couple posts back. This
> is actually offering better performance than the PS4 (with these hypotetical close to metal optimizations).


I must have missed you posting any game benchmarks of your "console beating" gaming pc, feel free to show them after you adjust the specs, form factor and cost to actual console. Only benchmarks Ive seen were some worthless synthetic benchmarks of different cpu in different configuration that tell nothing about gaming performance and some benchmarks of completely different pc that was also 150 pounds more expensive, in similar "lolgeneric" case without wireless peripherals like you posted and lacking things like bluray drive and software capable of playing it.

I get "those 50€" from the price of the PS4, which is currently 330€ locally. For most people in the EU/US it will be the same or even cheaper. Not counting the build cost (not to mention possible troubleshooting) is imo wrong, but hey, if you value your free time at 0€/h, then good for you.

So the fact is, you didn't prove you can build a PC that matches (or even surpasses) a console for a console price. You pretty much showed the exact opposite, and all these "I don't want wireless peripherals, Blu-ray capability or a decent-looking SFF case anyway" excuses just reinforce it even more.


----------



## Jedi Mind Trick

Quote:


> Originally Posted by *nleksan*
> 
> I do not, and will not, own any of this gen's consoles, but as far as improving felt performance.... How expensive could it really be to add something like a 32-64GB M.2 SSD and make a "Slim" version?
> I would think that the decrease in loading time, huge increase in fluidity of the OS, and much better texture streaming from memory could improve games regardless of whether they were made with this in mind or not...


There were benchmarks of the new Xbox and the PS4 with SSDs, SSHDs, and normal HDDs. They all had roughly the same loading times; also, after installing one or two games, that SSD would be full.


----------



## Carniflex

Quote:


> Originally Posted by *lugal*
> 
> I must have missed you posting any game benchmarks of your "console beating" gaming pc, ...


The link was http://www.eurogamer.net/articles/digitalfoundry-2014-the-next-gen-digital-foundry-pc - the main point being the comparison of an NVIDIA GTX 760 and an R9 270/270X against the PS4 and XBOne in three games that exist on all platforms. The article is from console launch time, so it's somewhat aged by now, but other than prices everything should still be more or less valid. The i3 I used in my example build should offer roughly similar gaming performance to the FX-6300 they used at 1080p and above, as things tend to get more GPU-bound the higher you go.

As for the price difference: it seems I'm not in "most of Europe". Good for you if you can get it for 330 EUR. I used local prices for both the components and the PS4, as that's what I'm familiar with. For me a new PS4 would cost 380 EUR if I wanted to buy it today, and that is the price I used. I guess that if one dropped Windows and used SteamOS instead (which still has more games than the new-gen consoles, even in Linux mode) it might be possible to squeeze it into the 330 EUR envelope with a little luck, as the Windows I picked was somewhere in the 50 EUR ballpark if I remember correctly.

As far as "lolgeneric" cases go, the one I used in my example build was the INTER-TECH JY-250, which is a relatively generic mATX case. It's not knocking your socks off, but it has the essentials covered and is not exactly poking your eye out with ugliness either. Obviously, if you absolutely need it as small as possible, the hardware you can afford would suffer on such a limited budget. The "lolgeneric" keyboard and mouse were rather regular offerings as well. They get the job done. Not wireless, ofc.

Like the whole example system: it gets the job done, although it is in essence a reasonably configured entry-level gaming PC.

TBH, if I needed to make a reasonably cheap gaming PC, I would rather aim somewhere around the 500 EUR ballpark. The compromises you have to make to fit that budget are still significant, but there you can still get quite noticeable gains for a relatively modest increase in budget. For the living room I would just leave the gaming PC in some other room and use Steam's streaming function. You can get an x86 Intel Atom or even AMD AM1 platform SoC system for as low as 60-100 EUR, which would be ideal for just streaming at 1080p. For example: http://www.overclock.net/t/1519453/techreport-this-mini-bay-trail-pc-is-the-size-of-a-thumb-drive/0_50, whose price was reported to be around $60.


----------



## Prophet4NO1

This thread has devolved into pointlessness.


----------



## mtcn77

I think resolution is not the core issue for the consoles, since there is no limit to it until the bottleneck becomes the interface and the hardware. As long as the rendering is done well (no LOD anisotropy error, no gamma/hue error, no artifacting textures), the resolution will hold up pretty well when scaled. Geometry (SMAA), animation and lighting are definitely more important, imo. Chasing resolution is really a substitute for the real problem, which is artifacting due to the faulty composition of the filters at our disposal. In my opinion it takes a thorough grasp of the technicalities to distinguish the benefits of each possible option, which is my point: not everybody is as technically obsessed about their equipment as Senna was, and that's unnecessary, too. Most likely, the best output is not delivered at default settings.
LCDs are quite precision-bound as well. AF might be OK for CRTs with their superior greyscale and gradient-switch performance, but 6-to-8-bit colour gradient precision and slow gradient changes are restrictive enough that LCD monitors start to impair the output animation.
I actually am quite OK with sprites popping in and out. Retrospectively, I can recall the days when sprites were quite sought after.


----------



## VeerK

I'd be happier with great textures and 60 fps at 720p than lesser textures and 30 fps at 1080p.


----------



## jameschisholm

Quote:


> Originally Posted by *Prophet4NO1*
> 
> This thread has devolved into pointlessness.


You're not wrong there...

Why is there a constant need to build a PC that matches a console? I thought the whole idea was to build a high-spec PC to enjoy the benefits it provides. Why would you intentionally build a low-spec PC to meet console standards?

This is madness...


----------



## kpzero

Since you guys seem to be rehashing the topic of building a gaming pc for the price of the consoles:

My $450 budget gaming build from 2012 is slightly more powerful than the PS4, with a mildly overclocked FX-6300 at 4.2 GHz and a GPU at 2.15 TFLOPS.

Honestly, I think matching the price is the wrong way to go about it, but here is some food for thought:

PS4 bundle with 1 game: $449 + 2 years of online $100 = $549 (1.84 tflops)

PC including controller and 3 games with unlimited online = $477 (3.5+ tflops easily)

CPU: AMD FX-6300 3.5GHz 6-Core Processor
Motherboard: Gigabyte GA-78LMT-USB3 Micro ATX AM3+ Motherboard
Memory: G.Skill Ares Series 8GB DDR3-1600 Memory
Storage: Seagate Barracuda ES 1TB 3.5" 7200RPM Internal Hard Drive
Video Card: Sapphire Radeon R9 280 3GB Dual-X Video Card
Case: Fractal Design Core 1000 USB 3.0 MicroATX Mid Tower Case
Power Supply: Corsair Builder 500W 80+ Bronze Certified ATX Power Supply
Keyboard: Rosewill RK-201 Wired Standard Keyboard
Mouse: Rosewill RM-P2P Wired Optical Mouse
Controller: Microsoft Xbox 360 Wireless Controller for Windows
Total: $476.55
If you absolutely must have Windows and Blu-ray, it is about $100 more, but neither is necessary.

Obviously some won't have access to the same deals, but the point stands.

Having said that, I will likely own all 3 current gen consoles in the next few years.
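For what it's worth, the TFLOPS figures being thrown around here can be reproduced from the commonly quoted shader counts and clocks (single-precision throughput is shaders x 2 ops per FMA x clock; the specific specs below are the widely cited ones, and the build's "3.5+" presumably assumes an overclock):

```python
# Rough single-precision GPU throughput: shaders x 2 ops (FMA) x clock.
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

ps4  = tflops(1152, 800)   # PS4 GPU: 1152 shaders @ 800 MHz
r280 = tflops(1792, 933)   # R9 280: 1792 shaders @ 933 MHz boost

print(round(ps4, 2))   # 1.84
print(round(r280, 2))  # 3.34 at stock; ~3.5 needs a modest overclock
```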


----------



## Shadow11377

Quote:


> Originally Posted by *kpzero*
> 
> Since you guys seem to be rehashing the topic of building a gaming pc for the price of the consoles:
> 
> My $450 budget gaming build from 2012 is slightly more powerful than the ps4 with a mild 4.2 oc fx-6300 and gpu at 2.15 tflops.
> 
> Honestly I think matching the price is the wrong way to go about it but here is something for thought:
> 
> PS4 bundle with 1 game: $449 + 2 years of online $100 = $549 (1.84 tflops)
> 
> PC including controller and 3 games with unlimited online = $477 (3.5+ tflops easily)
> 
> CPU: AMD FX-6300 3.5GHz 6-Core Processor
> Motherboard: Gigabyte GA-78LMT-USB3 Micro ATX AM3+ Motherboard
> Memory: G.Skill Ares Series 8GB DDR3-1600 Memory
> Storage: Seagate Barracuda ES 1TB 3.5" 7200RPM Internal Hard Drive
> Video Card: Sapphire Radeon R9 280 3GB Dual-X Video Card
> Case: Fractal Design Core 1000 USB 3.0 MicroATX Mid Tower Case
> Power Supply: Corsair Builder 500W 80+ Bronze Certified ATX Power Supply
> Keyboard: Rosewill RK-201 Wired Standard Keyboard
> Mouse: Rosewill RM-P2P Wired Optical Mouse
> Controller: Microsoft Xbox 360 Wireless Controller for Windows
> Total: $476.55
> *If you absolutely must have windows and bluray, it is about 100 more but neither are necessary.*
> 
> Obviously some wont have access to the same deals but the point stands.
> 
> Having said that, I will likely own all 3 current gen consoles in the next few years.


For gaming I wouldn't say that. Sure, Blu-ray isn't needed for much of anything, but a Windows OS is pretty much a requirement - quite necessary if you want to be able to play most games.


----------



## Pip Boy

Quote:


> Originally Posted by *kpzero*
> 
> Since you guys seem to be rehashing the topic of building a gaming pc for the price of the consoles:
> 
> My $450 budget gaming build from 2012 is slightly more powerful than the ps4 with a mild 4.2 oc fx-6300 and gpu at 2.15 tflops.
> 
> Honestly I think matching the price is the wrong way to go about it but here is something for thought:
> 
> PS4 bundle with 1 game: $449 + 2 years of online $100 = $549 (1.84 tflops)
> 
> PC including controller and 3 games with unlimited online = $477 (3.5+ tflops easily)
> 
> CPU: AMD FX-6300 3.5GHz 6-Core Processor
> Motherboard: Gigabyte GA-78LMT-USB3 Micro ATX AM3+ Motherboard
> Memory: G.Skill Ares Series 8GB DDR3-1600 Memory
> Storage: Seagate Barracuda ES 1TB 3.5" 7200RPM Internal Hard Drive
> Video Card: Sapphire Radeon R9 280 3GB Dual-X Video Card
> Case: Fractal Design Core 1000 USB 3.0 MicroATX Mid Tower Case
> Power Supply: Corsair Builder 500W 80+ Bronze Certified ATX Power Supply
> Keyboard: Rosewill RK-201 Wired Standard Keyboard
> Mouse: Rosewill RM-P2P Wired Optical Mouse
> Controller: Microsoft Xbox 360 Wireless Controller for Windows
> Total: $476.55
> If you absolutely must have windows and bluray, it is about 100 more but neither are necessary.
> 
> Obviously some wont have access to the same deals but the point stands.
> 
> Having said that, I will likely own all 3 current gen consoles in the next few years.


Going for the full thread derail... eh, why not.

The Fractal is a nicer case, but for $10 less you can get this Aerocool console-sized case to stand vertically or sit under a TV shelf, and shove a half-height 750 Ti in there, saving a few more $$. Of course that would only be a 2 GB card, but it's not far off a 7850 in benchmarks, and faster in some.


----------



## paulerxx

I have a fairly cheap build currently... and imo my PC > anything I've seen on the next-gen consoles, in every aspect.


----------



## Bryst

I can't help but notice all these builds don't have a monitor... What are you going to do, sit in front of your TV with a keyboard and mouse on a board? Might as well buy a console then. At least you get a comfortable controller.


----------



## Pip Boy

Quote:


> Originally Posted by *Bryst*
> 
> I cant help but notice all these builds dont have a monitor.... what are you going to sit infront of your tv with a keyboard and mouse on a board? Might as well buy a console then. At least you get a comfortable controller then.


I can't help but notice the flaw in this argument every time; see if you can guess what it is...


----------



## Bryst

Quote:


> Originally Posted by *phill1978*
> 
> I cant help but notice the flaw in this argument every time, see if you can guess what it is...


I'm going to assume you're going to claim you need to buy a TV to use your console. But a TV has been basically a staple item in a household since the 70s. If you don't have a TV to play a console on, then you most likely aren't the type of person to own a console.


----------



## kpzero

Quote:


> Originally Posted by *Bryst*
> 
> I cant help but notice all these builds dont have a monitor.... what are you going to sit infront of your tv with a keyboard and mouse on a board? Might as well buy a console then. At least you get a comfortable controller then.


The build I listed included the comfortable controller, and is intended to hook up to a TV for the console experience with Steam Big Picture.


----------



## Bryst

Quote:


> Originally Posted by *kpzero*
> 
> The build I listed included the comfortable controller and intended to hook up to TV for console experience with Steam Big Picture.


But no Blu-ray player. I just think this whole argument is stupid. I own a PS4, a gaming PC, and a low-end gaming laptop, and I'll tell you, lately I've gotten more enjoyment from the PS4, though I still play games on my PC. Let people buy what the hell they want to.

OH noes! I have to pay a subscription that's less than half the cost of Netflix a month to play games online!!!! Looks like I'll have to skip a trip to McDonald's every month. RABBLE RABBLE RABBLE.


----------



## Pip Boy

Quote:


> Originally Posted by *Bryst*
> 
> Im going to assume youre going to claim you need to buy a TV to use your console. But a TV has been basically a staple item in a household since the 70s. If you dont have a TV to play a console on, then you most likely arent the type of person to own a console.


Exactly.

1x HDMI lead + Xbox controller = couch gaming PC. Also, most of the time I see a console/games room, there are actually a lot of people using a large 1080p monitor/TV for both duties. Not every game on PC demands a keyboard and mouse; in fact most don't. Wait until Valve's controller arrives, and then there is a hybrid of both.

Quote:


> Originally Posted by *kpzero*
> 
> The build I listed included the comfortable controller and intended to hook up to TV for console experience with Steam Big Picture.


which also means a free operating system with a growing feature set for music + web browsing, and I'm sure Netflix amongst others before long.

This turned into console vs PC. If we are not allowed to use the PC's endless versatility for content creation and modification as an added-value argument, then it should be a PC set up in the same scenario, which means Big Picture and a controller.

But I agree: just buy the thing you like and accept it and all its limitations.


----------



## paulerxx

Quote:


> Originally Posted by *Bryst*
> 
> I cant help but notice all these builds dont have a monitor.... what are you going to sit infront of your tv with a keyboard and mouse on a board? Might as well buy a console then. At least you get a comfortable controller then.


PS4 and Xbox One come with TVs?


----------



## Bryst

Quote:


> Originally Posted by *phill1978*
> 
> Exactly.
> 
> 1x hdmi lead + xbox controller = couch gaming PC. Also most of the time i see a console / games room there are actually a lot of people using a large 1080p Monitor/TV for both duties. Not every game on PC demands a keyboard and mouse, in fact most don't. Wait until valves controller arrives and then there is a hybrid of both.
> which also means a free operating system with a growing feature set for music + web browsing and im sure Netflix amongst others before long.
> 
> This turned into console vs PC if we are not allowed to use the PC's endless versatility for content creation and modification as an added value statement, then it should be a PC set in the same scenario which means Big Picture and a controller
> 
> but i agree, just buy the thing you like and accept it and all its limitations.


Just remember, game graphics development normally follows console performance. So if the current-gen consoles really are that limited, that $500 GPU might not be as useful as people think. I personally would rather game at 900p/30fps with a better story and game mechanics than at 1440p/120fps with a terrible story and gameplay.

I personally think the demand for amazing-looking games has left other, much more important aspects of games very lacking. Destiny is a prime example of that.


----------



## Bryst

Quote:


> Originally Posted by *paulerxx*
> 
> PS4 and Xbox One come with TVs?


I already responded to someone else basically saying the same thing.


----------



## Pip Boy

Quote:


> Originally Posted by *Bryst*
> 
> Just remember, game graphic development normally follows console performance. So if the current gen consoles really are that limited, that $500 gpu might not be as useful as people think. I personally would rather game at 900p 30fps and have better story and game mechanics then 1440p 120fps and have terrible story and gameplay.
> 
> I personally think that the demand for such amazing looking games have left other much more important aspects of games very lacking. And Destiny is a prime example of that.


I play mostly older FPS / modded or indie games and crowd-funded alpha/beta WIPs. The last AAA title I bought was BL-TPS, and it isn't exactly any more fun than BL2.

While I agree with the mechanics bit, for me the pure definition of that is in having accurate (or silly) but detailed physics with destructibility. For me, the mechanics that make a game fun are being able to interact with the environment and making the world come alive. That always requires massive compute power, and it's the first thing to get reduced to the most basic level in pretty much every title (we seem to be going backwards here; even the original Source engine does more physics than most AAA titles).

I guess the point with PC is that, in theory, you don't have to trade away anything.


----------



## Bryst

Quote:


> Originally Posted by *phill1978*
> 
> I play mostly older FPS / modded or indie games and crowd funded alpha/beat WIP's. The last AAA title I bought was BL-TPS and it isn't exactly any more fun than BL2.
> 
> While I agree with the mechanics bit for me the pure definition of that is in having accurate ( or silly) but detailed physics with destructibility, for me that's the mechanics that make it fun being able to interact with the environment or making the world come alive. This always requires massive compute power and its the first thing to get reduced to the most basic level in pretty much every title ( we seem to be going backwards here even the original source engine does physics more than most AAA titles)
> 
> I guess the point with PC is in theory you don't have to trade on anything.


If what you like is engaging with the environment, and you said yourself that normally requires a lot of computing power, then arguing that you can build a PC for the price of a console is pretty moot. Even if you're getting better CPU performance, it's not going to be enough on a $400 budget.


----------



## StreekG

May sound silly, but the Xbone and PS4 both use an APU, i.e. an integrated graphics processor. Do any of the OS software updates come with graphics driver updates, seeing as these are basically PCs now? Can't they get the same kind of graphics driver updates we do? I'm not too schooled up on APUs.
OR does it just not work that way with consoles?


----------



## mtcn77

Quote:


> Originally Posted by *StreekG*
> 
> May sound silly, but Xbone and PS4 both use APU, so inbuilt graphics processor, do any of the software updates for the OS come with graphics driver updates, seeing as these are basically PCs now. Can't they be subject to the same graphics driver updates we are? I'm not too schooled up on APU.
> OR does it just not work that way with consoles?


PC hardware is bound to have high throughput and high latency. That is why it needs all those extra shaders and resources: it has to bridge the latency gap through heavy leverage. When you think about it, even a top-end PC motherboard can only supply ~16 GB/s to a video card through a PCIe 3.0 x16 aperture. Comparatively, on the PS4 & Xbox One they can avoid the copy in the first place, since all AMD APUs have the potential to share memory addressing between CPU & GPU (now don't anybody start the "No, it hasn't got hUMA" routine; consoles are fully programmable, it should be conceivable). They can simultaneously write to frame-buffer RAM via their CPU (20-26 GB/s each) and GPU (176 and 170 GB/s respectively).
Consoles work totally differently. They have lower shader counts because it doesn't actually matter as long as the work is dispatched properly. They can serialize & parallelize all they want, since latency is not a problem: they know the limits. They can buffer workloads through compute and inject tessellation and all sorts of stuff to make the lower ROP bounds & resolution grid just disappear. I have actually wondered for quite some time why there was a minimal triangle-size limit for injecting tessellation, because it seemed like the wisest trick to eliminate sample artifacting. Turns out the invaluable 2x2 "quad shading" causes shader overhead right where finer texture definition is needed most.
I just watched Second Son. It is better than Crysis, I must say. I disliked Prototype, but cannot find any fault in the real execution of the concept here.
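The PCIe-vs-unified-memory point above can be put in concrete terms with a back-of-envelope estimate (the 256 MB buffer is purely illustrative; the bandwidth figures are the commonly quoted peak numbers, and real transfers come with additional overhead):

```python
# How long does moving a buffer over PCIe 3.0 x16 take, versus not moving
# it at all (unified memory, as on the consoles' APUs)?
PCIE3_X16_GBS = 15.75   # ~16 GB/s effective, one direction

def copy_ms(megabytes, gb_per_s):
    # Transfer time in milliseconds for a buffer of the given size.
    return megabytes / 1024 / gb_per_s * 1000

buf_mb = 256  # illustrative chunk of texture/vertex data
print(round(copy_ms(buf_mb, PCIE3_X16_GBS), 1))  # ~15.9 ms
```

At 60 fps a frame is only 16.7 ms, so a copy like that eats an entire frame budget; that is why engines avoid per-frame PCIe traffic, and why shared CPU/GPU addressing sidesteps the problem entirely.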


----------



## Mr iggy

Well, you'd be surprised what optimizing can do for consoles. Look at games from 2007 versus games in 2013 - it's night and day.


----------



## Serandur

Quote:


> Originally Posted by *Mr iggy*
> 
> Well you be surprise what optimizing can do for for consoles look at games from 2007 and games in 2013 it's night and Day.


What people selectively omit from their recollection is that said "optimization" was largely due to advances in shader and engine utilization across the board... meaning PCs too. It's not some console-exclusive phenomenon. See: 8800s running Battlefield 4 well above what a PS3/360 can do and above what software 8800s had to render in 2006. In regards to 2007 games, also see Crysis.


----------



## Mr iggy

Quote:


> Originally Posted by *Serandur*
> 
> What people selectively omit from their recollection is that said "optimization" was largely due to advances in shader and engine utilization across the board... meaning PCs too. It's not some console-exclusive phenomenon. See: 8800s running Battlefield 4 well above what a PS3/360 can do and above what software 8800s had to render in 2006. In regards to 2007 games, also see Crysis.


To be honest, Crysis was way too much of a resource hog and it was unoptimized. And I fully agree with you.


----------



## Oubadah

..


----------



## Carniflex

The point of aiming to build a "gaming PC" on the same budget as console hardware is to demonstrate that the consoles have relatively weak hardware. It's just a thought experiment; in reality it would make sense to go for roughly twice the price of a console for a good all-around PC that you also want to game on. Ofc, in reality gaming is only one aspect of consoles. For example, some Apple offerings can be considered crap bang-for-buck if you focus on the hardware you get for your money, yet people buy them and seem quite happy, because Apple is not selling hardware specs; it's selling you "experience" and "ecosystem" etc. The problem for consoles is that their core function (traditionally) is playing games, and as has been demonstrated, an equally priced PC can do that better. If one bought a console for playing Blu-ray movies, then great, it can do that - and you can build an equally priced PC that does it too, but within the limited budget it would force compromises significant enough to neuter the gaming experience. As stated in the Digital Foundry article, everything included, consoles are reasonable value, but jack of all trades, master of none, as the saying goes. At least without jacking up the price.

Also, at the end of the day, a "typical console user" is highly unlikely to go out, buy his own PC parts and assemble a build on such a limited budget that is balanced well enough to offer a better gaming experience than a console. He is usually just not interested in that kind of stuff; he wants a box he can buy, plug in, and have "just work". Compared to OEM PCs in a similar price class, consoles offer a substantially better gaming experience, I'm pretty sure.

For a "fair" price comparison I believe it would make more sense to compare a console + a dozen games + online fees + power cost for a year vs a PC with the same things included. But where's the challenge in that! I mean, if you can already do more than a console at just the hardware level, then it's just beating something that is already lying down, as the price difference in games would let one widen the performance gap further. Or add a Blu-ray drive or two, if it's so essential.









Eventually, however, what matters is whether the game you want to play is on your chosen platform. If the game you must play is only on console, it does not matter what a PC can do. Granted, the list of exclusives is a sort of anemic one (considering there are entire genres that are PC-exclusive), but I'm pretty sure both Sony and MS are working hard to get something juicy exclusive for their consoles, so the list will surely grow over time. If it's a cross-platform title, then going with the PC offers the better experience - unless, ofc, the port is horribly botched, but then it would probably also play like crap on consoles.


----------



## Oubadah

..


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *paulerxx*
> 
> PS4 and Xbox One come with TVs?


Pretty much everyone owns a TV. Very few people own monitors.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Serandur*
> 
> What people selectively omit from their recollection is that said "optimization" was largely due to advances in shader and engine utilization across the board... meaning PCs too. It's not some console-exclusive phenomenon. See: 8800s running Battlefield 4 well above what a PS3/360 can do and above what software 8800s had to render in 2006. In regards to 2007 games, also see Crysis.


Those 8800s also cost as much as the entire 360/PS3 themselves and were released a whole year or more later than the 360. So it'd be a darn shame if they couldn't outmatch the last gen consoles.


----------



## Serandur

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Those 8800s also cost as much as the entire 360/PS3 themselves and were released a whole year or more later than the 360. So it'd be a darn shame if they couldn't outmatch the last gen consoles.




It's not about outmatching them; it's about clearly reaping the benefits of progress in game software development just as the consoles do, and therefore also benefiting from the "optimization" of games over time to look better while using the same hardware resources. That's as opposed to the insinuation people sometimes make that you "need" to upgrade to match console optimization, as if PC software isn't also "optimized" and refined to run more efficiently - which is patently false. Stronger hardware is stronger hardware; that is what's demonstrated by the advantage these stronger cards not only had at release, but maintained over the entire span of the consoles' lifetime.


----------



## Carniflex

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Pretty much everyone owns a TV. Very few people own monitors.


A TV can be used as a PC display if that is what you are after, which most people seem to want if we are talking about the living room.

Consoles can be used with a monitor as well, if a monitor needs to be included. They are just somewhat limited in what they can reasonably use, because they lack a sufficiently high-bandwidth connector to make use of some of the higher-end options available for PCs (not to mention the limitations built into console games, like frame-rate caps for a "cinematic experience" and such). Then again, it's not exactly a console selling point anyway; they are clearly aimed at a 1080p display, with the theoretical possibility of attaching a 4K @ 30 Hz max display strictly for media consumption.
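The "4K @ 30 Hz max" ceiling follows straight from link-bandwidth arithmetic (a rough sketch: it counts active pixels only and ignores blanking intervals, so real requirements are somewhat higher; 8.16 Gbps is the usable video rate of an HDMI 1.4 link after 8b/10b encoding):

```python
# Uncompressed video bandwidth versus what an HDMI 1.4 output can carry.
def video_gbps(w, h, bits_per_pixel, fps):
    return w * h * bits_per_pixel * fps / 1e9

HDMI_1_4_GBPS = 8.16  # usable video payload after 8b/10b encoding

print(round(video_gbps(3840, 2160, 24, 30), 2))  # 5.97  - fits HDMI 1.4
print(round(video_gbps(3840, 2160, 24, 60), 2))  # 11.94 - does not fit
```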


----------



## VeerK

Quote:


> Originally Posted by *Carniflex*
> 
> TV can be used as a PC display if that is what you are after. Which most people seem to be if we are talking about living room.
> 
> Consoles can be used with a monitor as well, if monitor need to be included. They are just somewhat limited in what they can reasonably use because of lacking sufficiently high bandwidth connector for making use of some of the higher end options available for PC's (not to mention the limitations implemented on console games like frame rate caps for "cinematic experience" and such). Then again it's not exactly a console point anyway, they are clearly aimed at 1080p display with the theoretical possibility of attaching a 4K @ 30 Hz max display strictly for media consumption.


I use a 55-inch LG TV to game on my second PC in the office/living room. I don't see the point of distinguishing console-on-TV vs PC-on-monitor to validate either side.


----------



## SpeedyVT

Consoles are a nice controlled variable that makes way for breakthroughs in optimized code for desktop titles. I do believe we'll experience revolutions in the game industry with these new systems. While the developer attempts to fit as much as possible into the confined space of a console, when the desktop consumer expands beyond that space it suddenly opens a world of more possibilities. Unless you're Ubisoft.


----------



## Pip Boy

Quote:


> Originally Posted by *Serandur*
> 
> What people selectively omit from their recollection is that said "optimization" was largely due to advances in shader and engine utilization across the board... meaning PCs too. It's not some console-exclusive phenomenon. See: 8800s running Battlefield 4 well above what a PS3/360 can do and above what software 8800s had to render in 2006. In regards to 2007 games, also see Crysis.


Exactly. What people don't even get is that the engines used in 2007, which were typically from 2004, were eons behind the developments made in newer engines by 2013. At almost a decade on, any console running an old engine will look worse than a shiny new engine, even on the same hardware. Some PS3 games ran on UT2.5 or UT2.0, whereas now we have UT4.0, and it's literally night and day.

The engines have had to catch up with the hardware as much as the other way around.


----------



## Arturo.Zise

I have my PC connected to a 27" monitor and a Sony 46" TV and I prefer gaming on my TV with my Xbox controller. Only time I use my monitor is for games that require KB+M.


----------



## Pip Boy

Quote:


> Originally Posted by *Arturo.Zise*
> 
> I have my PC connected to a 27" monitor and a Sony 46" TV and I prefer gaming on my TV with my Xbox controller. Only time I use my monitor is for games that require KB+M.


Lots of people do this. There are perfectly decent 1080p 127" monitors with HDMI that can do both also.


----------



## SpeedyVT

Quote:


> Originally Posted by *phill1978*
> 
> lots of people do this. There are perfectly decent 1080p 127" monitors with HDMi that can do both also.


Using an LED projector is a great way to get beyond the pixelation of an LED monitor. Plus it costs less than a MASSIVE TV SCREEN! Since it's projection you won't even see the pixelation as badly. Something I've always wanted to do: make a movie-theater basement! No basement in my current home! HA HA!







Later life project.


----------



## CryphicKing

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Those 8800s also cost as much as the entire 360/PS3 themselves and were released a whole year or more later than the 360. So it'd be a darn shame if they couldn't outmatch the last gen consoles.


Most importantly, most people seem to forget (or intentionally ignore) the fact that those 8800GTs are running on a 2012 CPU, mobo and RAM.


----------



## Pip Boy

Quote:


> Originally Posted by *SpeedyVT*
> 
> Using an LED Projector is a great way to get beyond the pixelation of an LED monitor. Plus costs less than a MASSIVE TV SCREEN! Since it'll be projection you won't even see the pixelation as bad. Something I've always wanted to do. Make a movie theater basement! No basement in my current home! HA HA!
> 
> 
> 
> 
> 
> 
> 
> Later life project.


I have a projector but I want a proper 2.35:1 screen next @ 120", and then a 5K projector, or 4K with an anamorphic lens.

I'm thinking, though, given what I've seen, that the home cinema might get transferred to VR in the future if it becomes high enough resolution, so I might wait it out for another 3-5 years.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Serandur*
> 
> 
> 
> It's not about outmatching them, it's about clearly reaping the benefits of progression in game software development just as consoles do and therefore also benefiting from "optimization" of games over time to look better while using the same hardware resources... as opposed to the insinuation people sometimes make that you "need" to upgrade to match console optimization as if PC software isn't "optimized" and refined to run more efficiently as well when that's patently false. Stronger hardware is stronger hardware, that's what's being demonstrated by the advantage these stronger cards not only had at release, but maintained over the entire span of the consoles' lifetime.


PCs are indeed optimized better after launch, just like consoles; they just never reach console levels of optimization. Nobody is arguing that consoles are "stronger" than PCs. They just get more benefit out of optimization than PCs do.


----------



## DRT-Maverick

Yeah, I think they skip optimizing for PCs since a PC should be able to process the data, optimized or not. Basically, the extra headroom on the PC is acceptable and won't interfere with the flow of things, therefore optimization for PCs isn't as necessary.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *CryphicKing*
> 
> Most importantly, most people seem to forgot(or intentionally ignore) the fact that those 8800GTs are running on a 2012 CPU, Mobo and rams.


Exactly! A PC built around Nov 2005 (launch date of xbox 360) at around $500 would truly struggle playing modern titles such as Bioshock Infinite, The Witcher 2 and 3, Crysis 3, GTA 5, etc. Back in Nov 05, Intel hadn't even launched their Core 2 duos yet, 2 GB of system RAM was considered a lot, SSDs did not exist, and the fastest video cards available were the 7950gx2 ($500ish) and x1950xtx ($400ish).

A PC built in '05 with $500 or even $600 would give you a dual-core Athlon or Pentium 4/D, 2 GB of DDR2 RAM, and possibly a 7900gs, which was $200 at launch when I purchased it. I may be wrong, but I sincerely doubt a system like this would run any of the modern games I mentioned. The PS3 and 360, meanwhile, have no problems running these games. The notion that PCs are overall cheaper than consoles is just insane. I love PC gaming, but I recognize and accept the fact that it's more expensive.


----------



## jameschisholm

Quote:


> Originally Posted by *Bryst*
> 
> Just remember, game graphic development normally follows console performance. So if the current gen consoles really are that limited, that $500 gpu might not be as useful as people think. I personally would rather game at 900p 30fps and have better story and game mechanics then 1440p 120fps and have terrible story and gameplay.
> 
> I personally think that the demand for such amazing looking games have left other much more important aspects of games very lacking. And Destiny is a prime example of that.


I must ask then: what if you had the *option* to play this "game" at 1440p 60/120 fps? Would you still choose the former? Incidentally, I don't think it's a fair comment to say that a game at 1440p 120 fps will have a poor story and mechanics. It's not necessarily always an either/or scenario.


----------



## jameschisholm

What are the steps for console optimization, what do console devs actually do?

From the last gen, what I saw was: lower the internal resolution the engine is running at; use blurry anti-aliasing or no anti-aliasing; add extra lighting, which costs less performance while hiding some things; lower texture quality in places that aren't so noticeable. When they show game trailers, the cutscenes look amazingly nice but the gameplay looks a bit different; not majorly, but a bit. I think by lowering some areas they gained in others, which gave the devs room to add newer effects that kept things looking fresh.
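That internal-resolution trick is easy to put numbers on: per-frame pixel work scales with width × height, so rendering below native and upscaling saves a big chunk of GPU fill. A minimal back-of-envelope sketch (the resolutions here are just illustrative, not tied to any specific game):

```python
# Rough illustration of GPU pixel-work savings from lowering the
# internal render resolution before upscaling to a 1080p display.
def pixel_fraction(render, native=(1920, 1080)):
    """Fraction of native-resolution pixel work done at a given render resolution."""
    rw, rh = render
    nw, nh = native
    return (rw * rh) / (nw * nh)

for res in [(1920, 1080), (1600, 900), (1280, 720)]:
    print(f"{res[0]}x{res[1]}: {pixel_fraction(res):.0%} of 1080p pixel work")
```

So 900p is roughly 69% of the pixel work of 1080p, and 720p only about 44%, which is the headroom devs spend on the extra effects mentioned above.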

I found this image online, is this accurate?


----------



## Serandur

Quote:


> Originally Posted by *CryphicKing*
> 
> Most importantly, most people seem to forgot(or intentionally ignore) the fact that those 8800GTs are running on a 2012 CPU, Mobo and rams.


Except, you know, the ones that aren't... which you're either intentionally ignoring or you're making a very laughable and easily-proven-false generalization. You also shouldn't use the term "fact" so liberally, we are talking about open platforms, there is absolutely nothing that dictates people run their old GPUs on 2012 platforms (I certainly didn't). Example:





 - Core 2 Duo, Tomb Raider





 Core 2 Duo, Skyrim

In any case, I was talking about advances in shader utilization, shaders being a GPU thing, and no, there are no console-exclusive optimization benefits for GPUs. The only area where they hold any actual advantage is in CPU execution and draw calls, and these days they're close to not even having that (while having CPUs that are pitiful compared to 5-year-old PC ones). "Optimization" = paring back settings as much as possible in the least perceptible way; no processing magic, just paring down, combined with, in the case of more esoteric architectures like the PS3's Cell, growing expertise in adjusting to said hardware. Current consoles certainly don't fit that bill. As for why the 8800s are the earliest cards we still commonly see kicking around? Similar reason why the PS3 had horrendous performance deficits without advanced knowledge and exploitation of its SPEs: unified shaders (the PS3 didn't have them, the PC didn't get them until the 8800s, the 360 had them first). The 8800s were a significant leap forward not just in raw processing power, but in setting new feature standards for the PC space with DX10, CUDA, and unified shaders.

That's why they're the common entry point: previous GPUs lacked certain features the 360's GPU had (well ahead of its time, I might add), and the 8800s have maintained the same-sized advantage all these years, as they theoretically should have, just as current PC GPUs will continue to perform as they should relative to the PS4/Xbox One GPUs. Hardware resources are finite, and modern professional APIs (driven, along with high-power GPUs, almost single-handedly by the PC platform for professional, commercial, and scientific purposes) are very efficient, with the last bit of notably wasted cycles set to be wrung out by DX12 in the near future through more efficient CPU and draw-call handling.
Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Exactly! A PC built around Nov 2005 (launch date of xbox 360) at around $500 would truly struggle playing modern titles such as Bioshock Infinite, The Witcher 2 and 3, Crysis 3, GTA 5, etc. Back in Nov 05, Intel hadn't even launched their Core 2 duos yet, 2 GB of system RAM was considered a lot, SSDs did not exist, and the fastest video cards available were the 7950gx2 ($500ish) and x1950xtx ($400ish).
> 
> A PC built in 05 with $500 or even $600 would give you a dual core Athlon or Pentium 4/D, 2 GB DDR2 Ram, and possibly a 7900gs which was $200 at launch when I purchased it. I may be wrong but I sincerely doubt a system like this would run any of the modern games I mentioned. The PS3 and 360 meanwhile will have no problems running these games. The notion that PCs are overall cheaper than consoles is just insane. I love PC gaming but I recognize and accept the fact that it's more expensive.


Do I need to repost the point graph?









Just read my above reply, and then note that the feature/architecture disparity doesn't exist this time around. I wasn't getting at the notion of what's cheaper, but actually, yeah, *this time around* a PC can be cheaper. There are no subsidized hardware costs (Sony and Microsoft are making a small profit per unit, if anything, now), nor are there any fancy hardware tricks/features this time around (the 360/PS3 CPUs were interesting to say the least; the PS3's GPU not so great, but the 360's introduced unified shaders and some rudimentary DX10 features). I built a PC for my family for under $700 last year with a 3570K and a GTX 660; it beats the PS4 in every situation, there isn't a $50-a-year online fee, and the games are much cheaper. Not to mention the warranties on the parts aren't some paltry 1 year, but 3 years standard. That machine will be perfectly capable of edging out a PS4 for its entire lifespan, and from digital distribution sales alone it is cheaper. Plus it's a PC; it does stuff. The situation is absolutely different this time around: PC APIs have only become that much more mature, the hardware has a much larger advantage right out of the gate, and we're not going through some revolution of engine design with new, exciting features like unified shaders last time around. That's why people aren't hanging on to those 2005 cards so much today; we had a pretty big shift in architectural and then engine features right after those.


----------



## Systemlord

Sony and Microsoft spent so much on R&D for their consoles that it took years to start making a profit; for every console (PS2, PS3, Xbox, Xbox One) made, they were losing money on every single unit sold. Both were giving away hardware and making zero profit, taking a couple-hundred-dollar hit per unit. Things seem a bit different now that Nvidia/AMD supply the GPUs, and neither company is spending the R&D that they used to. I'd like to see the consoles merge into gaming PCs, since that's basically what we have here. Of course, there's always the option to use a control pad or keyboard and mouse, living room or computer setup.

I just spent close to $2000 on a BodyBuilt ErgoGenesis 24/7 Executive, from the same company that manufactures chairs for 911 operators.


----------



## punker

Sony should have gone with a Core i5 chip with HT.


----------



## CryphicKing

Quote:


> Originally Posted by *Serandur*
> 
> 
> 
> It's not about outmatching them, it's about clearly reaping the benefits of progression in game software development just as consoles do and therefore also benefiting from "optimization" of games over time to look better while using the same hardware resources... as opposed to the insinuation people sometimes make that you "need" to upgrade to match console optimization as if PC software isn't "optimized" and refined to run more efficiently as well when that's patently false. Stronger hardware is stronger hardware, that's what's being demonstrated by the advantage these stronger cards not only had at release, but maintained over the entire span of the consoles' lifetime.


No, you are the one missing the point. Comparing PC hardware to a console's SoC hardware is like saying a sports car can carry more cargo than a freight truck because of horsepower; they are two entirely different architectures. One is built for developers, with the freedom to create a more efficient workflow; the other is only good for speed and navigation.

A thin API layer, fewer draw-call limits, and fully accessible programmability are the real power developers are looking for. On PC, you either can run it or you can't; you have the freedom to keep your old hardware until you fail to meet the system requirements, which shouldn't sound strange to any PC gamer. On console, as developers learn more about the architecture, they are able to build better CPU/GPU compilers and achieve a graphics leap. GTA 4 on PS3 = 680p/15-25 fps; GTA 5 = 720p/25-30 fps with three times the polygon count, a far more advanced lighting engine, more crews, better physics, etc., on the same hardware. On PC, you would need 2-3 upgrades to achieve such a leap.

The funny thing is, developers did give their honest answers to the public from time to time; most of the time, the PC community just shot them down, took their words out of context, or rewrote them into their own conspiracy theories.


----------



## CryphicKing

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Exactly! A PC built around Nov 2005 (launch date of xbox 360) at around $500 would truly struggle playing modern titles such as Bioshock Infinite, The Witcher 2 and 3, Crysis 3, GTA 5, etc. Back in Nov 05, Intel hadn't even launched their Core 2 duos yet, 2 GB of system RAM was considered a lot, SSDs did not exist, and the fastest video cards available were the 7950gx2 ($500ish) and x1950xtx ($400ish).
> 
> A PC built in 05 with $500 or even $600 would give you a dual core Athlon or Pentium 4/D, 2 GB DDR2 Ram, and possibly a 7900gs which was $200 at launch when I purchased it. I may be wrong but I sincerely doubt a system like this would run any of the modern games I mentioned. The PS3 and 360 meanwhile will have no problems running these games. The notion that PCs are overall cheaper than consoles is just insane. I love PC gaming but I recognize and accept the fact that it's more expensive.


The majority of PC gamers back in 2005 were still using a Pentium 4 and less than 1 GB of system RAM. An 8800GTS can't even get 50% of games since 2013 to boot, let alone talk about graphics settings. Gaming PCs from 2005 have had to go through at least 2-3 upgrades just to get modern games past the POST screen.


----------



## CryphicKing

Quote:


> Originally Posted by *Serandur*
> 
> Except, you know, the ones that aren't... which you're either intentionally ignoring or you're making a very laughable and easily-proven-false generalization. You also shouldn't use the term "fact" so liberally, we are talking about open platforms, there is absolutely nothing that dictates people run their old GPUs on 2012 platforms (I certainly didn't). Example:
> 
> 
> 
> 
> 
> - Core 2 Duo, Tomb Raider
> 
> 
> 
> 
> 
> Core 2 Duo, Skyrim
> 
> In any case, I was talking about advances in shader utilization... shaders being a GPU thing and no, there is no console-exclusive optimization benefits for GPUs. The only area they hold any actual advantages are in CPU executions and draw calls. These days, they're close to not even having that (while having CPUs that are pitiful in comparison to 5-year old PC ones). "Optimization" = paring back settings as much as possible in the least perceptible way possible, no processing magic, just paring-down with, in the case of more esoteric architectures like the PS3's Cell, growing expertise in adjusting to said hardware; current consoles certainly don't fit that bill. As for why the 8800s being the earliest we see commonly still kicking around? Similar reason why the PS3 had horrendous performance deficits without advanced knowledge and exploitation of its SPEs; unified shaders (PS3 didn't have them, PC didn't get them until 8800s, 360 had them first). The 8800s were a significant leap forward not in just raw processing power, but setting new feature standards for the PC space with DX10, CUDA, and unified shaders.
> 
> That's why they're the common entry point, previous GPUs lacked certain features the 360's GPU did (well ahead of its time I might add) and the 8800s have maintained the same-sized advantage all these years as they theoretically should have, just as current PC GPUs are and will continue to perform similarly as they should in reference to PS4/Xbox One GPUs. Hardware resources are finite and modern, professional APIs (driven single-handedly along with high-power GPUs by the PC platform for professional, commercial, and scientific purposes) are very efficient with the last bit of notably wasted cycles being hesitatingly wrung out by DX12 in the near future through more efficient CPU and draw call handling.
> Do I need to repost the point graph?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just read my above reply and then note that feature/architecture disparity doesn't exist this time around. I wasn't getting at the notion of what's cheaper, but actually yeah, *this time around*, a PC can be cheaper. There are no subsidized hardware costs (Sony and Microsoft are making a small profit per unit if anything, now) nor are there any fancy hardware tricks/features this time around (360/PS3 CPUs were interesting to say the least, PS3's GPU not so great but the 360's introduced unified shaders and some rudimentary DX10 features) in these consoles. I built a PC for my family for under $700 last year with a 3570K and a GTX 660; under no situation does it not beat the PS4, there isn't a $50 a year online fee and the games are much cheaper. Not to mention the warranties on the parts aren't some paltry 1 year, but rather 3 years standard. That machine will be perfectly capable of edging out a PS4 for its entire lifespan and just from digital distribution sales alone, it is cheaper. Plus it's a PC, it does stuff. The situation is absolutely different this time around, PC APIs have only become that much more mature, hardware has a much larger advantage right out the gate, and we're not going through some revolution of engine design this time around with new, exciting features like unified shaders last time around. That's why people aren't hanging on to those 2005 cards so much today, we had a pretty big shift in architectural and then engine features right after those.


Why are you showing me another two games that have nothing to do with the games we are talking about? Try Crysis 3, Watch Dogs, AC Black Flag, etc. Consoles have their own first-party supported SDKs and a unique API that is not bound by any DX version; 10 years after release they are still alive and kicking, squeezing out supposedly DX11-class graphics at reasonable visuals and performance. What advantage does the 8800GTS actually have when 1. the GPU alone cost more than an entire console back then, 2. it fails to support every DX11 title these days while the 10-year-old consoles still do, and 3. it needs 2-3 other component upgrades just to get the GPU to work?

picking a PC to "beat" console is your own misconception and mistake, as they don't match spec for spec, some earlier titles, you may able to have slight higher frame rates than PS4/X1, others games you will have hard time struggling to maintain a playable frame rate or to the point just have to to low down the graphic setting, personally, I won't use a GT660 for AC unity, watchdogs, or division.


----------



## Serandur

Quote:


> Originally Posted by *CryphicKing*
> 
> No, you are the one missing the point, compare PC hardware to consoles' SoC hardware is like saying sports car can carry more cargo than freight trucks because of horse power, they are entire 2 different architectures, one is built for developers, the large freedom to create more efficient workflow, the other is only good for speed and navigation.
> 
> thin API layer, less draw call limits, full accessible programmable are the real power developers are looking for. on PC, you either able to run it or you aren't, you have the freedom to choose to keep the old hardware till you failed to met system requirement, this shouldn't sound strange to any PC gamers. while on console, as developers learned more about console architectures, they are able to build better CPU/GPU compilers to achieve graphic leap. GTA4 on PS3 = 680p/15-25 fps, GTA5 = 720p/25-30fps, 3 times the polygon counts, far advanced lighting engine, more crews, better phyiscs etc. on the same hardware. On PC, you will need 2 -3 upgrades to achieve such leap.
> 
> funny thing is, developers did give their honest answers to the public from time to time, most of time PC community just shot it down, took their words out of contest or just rewrite them into their own conspiracy theories .


No, you're missing the point, which was that such a claim is patently false; we have evidence: 8800s running modern games with the same performance gap they've always had over the old consoles, demonstrating the same gains from improved engine efficiency over the years, for the reasons I stated in my above post. There's no evidence for the opposite. What you're saying is nonsense; SoC hardware doesn't provide such benefits, and it's not even a "console architecture" (whatever that is), it's a PC and mobile architecture chosen for these new consoles for cost and efficiency reasons. An 8800 in a PC would run GTA V better than that, as it does every single other game available on both PC and PS3/360 that isn't DX11-exclusive on PC.

What developers prefer is a unified, efficient, across-the-board high-level API and a unified architecture type (which is x86 now), for ease of compatibility across multiple platforms and cost-efficiency of development. When you can provide evidence for your claims, please let me know; as of now it's pure conjecture, simply repeated ad nauseam by people with no intimate knowledge of hardware or graphics rendering, and it doesn't hold up. You're wrong; the evidence I've already posted shows it, and the information above explaining what happened in shader land explains it.
Quote:


> Originally Posted by *CryphicKing*
> 
> why are you showing me another 2 games have nothing to do with the game we are talking about? try crisis3, watchdog and battlefield 4, AC black flags etc. consoles have their own first party supported SDK, unique API that is not bonded by DX version, 10 years after they are still alive and kicking and squeeze out reasonable graphic and performance.
> 
> picking a PC to "beat" console is your own misconception and mistake, as they don't match spec for spec, some earlier titles, you may able to have slight higher frame rates than PS4/X1, others games you will have hard time struggling to maintain a playable frame rate or to the point just have to to low down the graphic setting, personally, I won't use a GT660 for AC unity, watchdogs, or division.


It's not to beat a console, it's to prove a point: consistently improving quality of the software they're rendering while maintaining the same performance gap they've always had over the years. First off, I posted Battlefield 4 earlier in the thread; the 8800 ran it with the same performance advantage it's always had in software, and BF4 is a different game on PC to boot because the standard baselines rose after 8 years. The same applies to the games that don't support below DX11 on PC. If by "alive and kicking" you mean the consoles are all but dead and twitching, running severely gimped and poorly-running versions of those games, then sure. The standards on PCs are higher and have been for a longer time; that's not what I'm addressing. The only games you're able to name that raise any doubt are the ones that don't support the older DX version, have moved the baseline rendering loads significantly after years, or are not yet on PC, and we can't bloody well test those, now can we? Your claims about "optimization" are completely unfounded; you have no evidence of what you're claiming with regards to increased rendering loads at all, and the games that can be compared between the two platforms only support the reality I'm trying to tell you.

You won't use a 660, but you'll use a console running a resolution lower than the PC standard of the past however many years, with sub-30 FPS? That's YOUR misconception and preemptively false assertion of its performance. I haven't tried Unity on it, but it did run Watch Dogs better than the PS4 did. What a lot of people don't realize is what the various settings are, what they do, and how they might match up. You don't have to use a 660 for those games, but it runs them better whether you wish to acknowledge it or not. I have firsthand, personal experience with the hardware we're talking about, as well as extensive experience and knowledge of computer hardware and rendering software in general. Like it or not, whatever it is you're implying or flat-out stating about these things is wrong. I know it to be wrong.


----------



## CryphicKing

Quote:


> Originally Posted by *Serandur*
> 
> No, you're missing the point which was that such a claim is patently false, we have evidence... 8800s running modern games with the same performance gap as they always had over the old consoles demonstrating the same gains from improved engine efficiency over the years for the reasons I stated in my above post. There's no evidence for the opposite. What you're saying is nonsense, SoC hardware doesn't provide such benefits and it's not even a console architecture (whatever that is), it's a PC and mobile architecture design chosen in these new consoles because of cost and efficiency reasons. An 8800 in a PC would run GTA V better than that, it does every single other game available on both a PC and the PS3/360 which isn't DX11 exclusive on PC.
> 
> What developers prefer is a unified, efficient, across-the-board high-level API and unified architecture type (which is x86 now) for ease of compatibility across multiple platforms and cost-efficiency of development. When you can provide evidence for your claims, please let me know, but as of now it's pure conjecture that's simply repeated ad nauseum by people with no intimate knowledge of hardware or graphics rendering that doesn't have any evidence that holds up. You're wrong, the evidence I've already posted shows it, the information I've posted above explaining what happened in shader land up above explains it.


Nope, wrong on all counts. I don't see any evidence shown so far whatsoever. Show me how an 8800GTS can even get DX11-only games such as Crysis 3, Shadow of Mordor, Watch Dogs, etc. to boot on PC as they did on 360 and PS3; prove it. Also, without CPU, mobo, and other component upgrades, you can't even get games to 360/PS3 performance.

It doesn't sound like you have any idea what it takes to develop a video game. The high-level API is a PC-only obstacle hated by all developers: MS Visual Studio, .NET Framework, sound driver, video card driver, etc. With all these layers in place, all your glorious "POWAH" gets chopped to pieces before it can be of any use. The way PC hardware is designed is just to provide the end user a certain amount of "power" to handle whatever developers throw at it. If you have power exceeding the recommended spec, then more power to you; you can customize to a higher res/FPS, like current PCs vs. last-gen consoles. But when you fail to meet the requirements, you're out of luck: buy yourself a new GPU. In terms of a graphics leap, there's no workaround on PC hardware. Plus, how about showing me a single PC exclusive that took advantage of that "POWAH" and looks half as good as the multiplats which you cried need "better optimization"?

Quote:


> Originally Posted by *Serandur*
> 
> It's not to beat a console, it's to prove a point of consistently improving quality of the software they're rendering while maintaining the same performance gap they've always had over the years. First off, I posted Battlefield 4 earlier in the thread, the 8800 ran it with the same performance advantage it's always had in software, and BF4 is a different game on PCs to boot because the standard baselines rose after 8 years. The same applies to the games that don't support below DX11 on PCs. If by alive and kicking, you mean the consoles are all-but dead and twitching running severely gimped and poorly-running versions of those games, then sure. The standards on PCs are higher and have been for a longer time, that's not what I'm addressing. The only games you're able to name that raise any doubt are the ones that don't support the older DX version, have moved the baseline rendering loads significantly after years, or are not yet on PCs and we can't bloody well test those, now can we? Your claims about "optimization" are completely unfounded, you have no evidence of what you're claiming with regards to increased rendering loads at all, but the games that can compare between the two platforms only support the reality I'm trying to tell you.
> 
> You won't use a 660 but you'll use a console running a resolution lower than the PC standard of the past how many years now with sub-30 FPS? That's YOUR misconception and preemptively false assertion of its performance. I haven't tried Unity on it, but it did run Watch Dogs better than the PS4 did. What a lot of people don't realize what various settings are, what they do, and how they might match up. You don't have to use a 660 for those games, but it runs them better whether you wish to acknowledge it or not. I have firsthand, personal experience with the hardware we're talking about as well extensive experience and knowledge of computer hardware and rendering software in general as well. Like it or not, whatever it is you're implying or flat-out stating about these things are wrong. I know them to be wrong.


And those DX9 "maxed out" PC version missed even more singature rendering features than last gen consoles, and games released after 2013 completely left 8800GTS out of the picture (as well as every high end 2XX GPUs from 2008) . on GT660, Looks like you only picked one game that fit your fanboyism fantasy while completely disregard the same GPU you wish to "outperform" PS4/X1 completely failed at 9 other games. what res/fps developers trying to implant on console is depend on the amount of technology implement, Trine 2 is 4k ready on both ps4/x1 if all devleopers thinkg visual R&D development should stop there and just worry about 1080P/60fps. btw, locked 30fps under stress test is nothing like swing around 21-60 unstable FPS with stretched out frame time

I think you meant to say you "think" you are getting better performance in Watch Dogs than the PS4 on your GTX 660, while reality shows the opposite. You will need this to keep you in reality check. Still, personally, I'm not looking forward to using a GTX 660 on Assassin's Creed Unity.


----------



## Serandur

Quote:


> Originally Posted by *CryphicKing*
> 
> snip


1. The PS3/360 do not render DX11 features, they are incapable of doing so. The versions they get are not utilizing any DX11 features, they're not running at anything above 1280x720, they're not even maintaining 30 FPS. You're failing to understand, they're not running those features, they're not running settings that are necessarily even an option for PCs because of how far the baseline moves over the course of nearly a decade; they're getting a customized and lower-tier version of the game without them. Worse, you're arguing with someone who actually does understand and who does have actual experience with these things without either.

2. You're not grasping the argument here of the differing standards regarding PCs and consoles and subsequently different baselines nor of the reality of how well the 8800s hold up and benefit from real software "optimization" just as all hardware, PC and otherwise, does. The PC is the source of modern high-performance processor and development software. Companies and institutes bank some serious financial, scientific, professional, and development eggs into that basket. It's not nearly as inefficient as you think it is, it's an insult to the software engineers designing processing technology and software.

3. You clearly lack experience or understanding of the complexity and reality of modern hardware and software development, let alone the versatility (to lock a framerate if you so choose) and settings (of which benchmarks only indicate one combination) that allow the end user to configure their games how they see fit; copy-paste benchmarks of one specific combination do not give you a clear picture at all of what's going on. There's a false equivalence between console settings and those benchmark PC settings that you're not grasping. I've actually used a GTX 660 machine, and no matter what BS you spew, it factually runs Watch Dogs at a higher resolution (1920x1080) than the PS4 with a healthy mix of settings while always staying above 30 FPS on my brother's machine. I've used an 8800 and Core 2 machine as well, and I own a PS3; there's no single instance where the PS3 ever came within range on any software they could run in common. You'll find many examples online if you're so inclined to actually care. I've been into rendering software and hardware a very long time; one of the basic things any person experienced with either would be aware of is rendering options/settings.

In any case, I think I'm done here. You can keep screaming at the clouds and ironically accusing knowledgeable posters of not having a clue all you like. If you like your console gaming, that's fine, but it's blatantly obvious you have no knowledge of PC gaming or how it stacks up, and you're not going to change facts with willful ignorance. There's nothing to be gained by further discussion here.


----------



## moccor

Lol, seems like someone believed Ubisoft when they announced that the console versions of Far Cry 4 would run the same as maxed PC settings.


----------



## lacrossewacker

Quote:


> Originally Posted by *jameschisholm*
> 
> What are the steps for console optimization, what do console devs actually do?
> 
> From the last gen, what I saw was: lower the internal resolution the engine runs at, use blurring anti-aliasing or none at all, add extra lighting that impacts performance less by hiding some things, and lower texture quality in places where it isn't so noticeable. In game trailers the cutscenes look amazingly nice, but the gameplay looks a bit different, not majorly but a bit. I think by lowering some areas they gained in others, which gave the devs room to add newer effects that kept things looking fresh.
> 
> I found this image online, is this accurate?


Halo 3 to Halo Reach to Halo 4
1138x640 to 1152x720 to 1280x720

from


to


that's a start
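For reference, here's a quick sketch of the raw pixel counts behind those resolutions; it's nothing more than arithmetic on the figures quoted above:

```python
# Pixel counts for the Halo resolutions quoted above, plus full HD
# for comparison. Pure arithmetic, no performance claims.
def pixels(w, h):
    return w * h

halo3 = pixels(1138, 640)    # 728,320 px
reach = pixels(1152, 720)    # 829,440 px
halo4 = pixels(1280, 720)    # 921,600 px
full_hd = pixels(1920, 1080) # 2,073,600 px

print(halo4 / halo3)    # ~1.27x more pixels than Halo 3
print(full_hd / halo4)  # 2.25x more pixels than Halo 4
```

So each step is a fairly modest bump, and even Halo 4's 1280x720 pushes well under half the pixels of native 1080p.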


----------



## BinaryDemon

I'd like to say I think the devs for the PS4 and XB1 are doing a great job optimizing for consoles, and it should be fairly obvious that they would hit a performance wall this early.

When the Xbox 360 and PS3 launched in 2005/2006, their hardware was arguably equal to or better than the best consumer PC hardware at the time. Sure, PCs quickly surpassed them, but they were designed to be powerful machines in the hope they would last 10 years. This generation, neither MS nor Sony took that approach. When both launched in 2013, it was obvious that even my previous system, an aging i5-2500K + GTX 580, offered a better gaming experience. Both MS and Sony tried to save money and opted for what they believed would be the hardware required for 1080p gaming. I believe both underestimated this, MS obviously worse than Sony, and that's why both are having trouble with 1080p/60fps.

It's going to be a long ~8 years before the next generation of consoles is ready. I bet MS and Sony are hoping the cost of hardware for 4K gaming @ 60fps drops significantly in that time.


----------



## jameschisholm

Quote:


> Originally Posted by *lacrossewacker*
> 
> Halo 3 to Halo Reach to Halo 4
> 1138x640 to 1152x720 to 1280x720
> 
> from
> 
> 
> to
> 
> 
> that's a start


Do you have one where the game environment is the same? Like both in a jungle/green area? Or both in a space type scenario?

I think when I said they reduced the res, maybe they didn't do it for every game, but for games like CoD they did. Plus, 720p/30fps is hardly impressive.

Also, I noticed the first image has literally no foliage; everything looks bare. The second image is a clean sci-fi space environment with a bright sun and brown rocks. It doesn't seem hard to achieve, but it does look like a significant improvement over the first image, so they've done something right to make it better.


----------



## iSlayer

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Exactly! A PC built around Nov 2005 (launch date of the Xbox 360) for around $500 would truly struggle playing modern titles such as BioShock Infinite, The Witcher 2 and 3, Crysis 3, GTA 5, etc. Back in Nov '05, Intel hadn't even launched the Core 2 Duo yet, 2 GB of system RAM was considered a lot, SSDs did not exist, and the fastest video cards available were the 7950 GX2 ($500ish) and X1950 XTX ($400ish).
> 
> A PC built in '05 with $500 or even $600 would give you a dual-core Athlon or Pentium 4/D, 2 GB of DDR2 RAM, and possibly a 7900 GS, which was $200 at launch when I purchased it. I may be wrong, but I sincerely doubt a system like this would run any of the modern games I mentioned. The PS3 and 360, meanwhile, have no problem running these games. The notion that PCs are overall cheaper than consoles is just insane. I love PC gaming, but I recognize and accept the fact that it's more expensive.


Probably because you're directly comparing PC hardware to console hardware while ignoring a lot of other factors that make consoles hard to compare to PCs: the difference in game pricing, the capabilities of PCs, the fact that many console owners also own PCs, and that the console and PC could be replaced with one single, better PC...


----------



## mtcn77

A comparison between the PC and consoles parallels comparing x86 to ARM. x86 is definitely faster, but its pipeline stages are longer and thus more liable to stalls during a flush; therefore the control logic has to be stronger and take a bigger portion of the die, affecting both power characteristics and throughput efficiency (the higher-latency, higher-throughput predicament).
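To make the flush-penalty point concrete, here's a toy model of the trade-off. Every number below (base CPI, branch frequency, misprediction rate, pipeline depths) is a purely illustrative assumption, not a measurement of any real x86 or ARM core:

```python
# Toy model: a deeper pipeline pays a larger flush penalty on a branch
# misprediction, so its effective cycles-per-instruction (CPI) rises more.
def effective_cpi(base_cpi, branch_freq, mispredict_rate, pipeline_depth):
    # Approximate a flush as costing the number of stages to refill.
    penalty = pipeline_depth
    return base_cpi + branch_freq * mispredict_rate * penalty

shallow = effective_cpi(1.0, 0.20, 0.05, 8)   # shorter pipeline
deep    = effective_cpi(1.0, 0.20, 0.05, 16)  # longer pipeline

print(shallow)  # ~1.08
print(deep)     # ~1.16
```

Same workload, same predictor accuracy; the deeper pipeline simply loses more cycles per flush, which is the control-overhead cost the post is describing.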


----------



## paulerxx

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Pretty much everyone owns a TV. Very few people own monitors.


You can't plug a computer into a TV?


----------



## Pip Boy

Quote:


> Originally Posted by *lacrossewacker*
> 
> Halo 3 to Halo Reach to Halo 4
> 1138x640 to 1152x720 to 1280x720
> 
> from
> 
> 
> to
> 
> 
> that's a start


Posting screenshots doesn't mean much; it doesn't show physics, lighting, animation, shaders, reflections, occlusion, or anisotropy in motion.

Engine differences account for the majority of visual improvements. These carry optimisations, but the real difference is almost always the progression of game engine technologies.


----------



## Master__Shake

Quote:


> Originally Posted by *BinaryDemon*
> 
> I'd like to say I think the devs for the PS4 and XB1 are doing a great job optimizing for consoles, and it should be fairly obvious that they would hit a performance wall this early.
> 
> When the Xbox 360 and PS3 launched in 2005/2006, their hardware was arguably equal to or better than the best consumer PC hardware at the time. Sure, PCs quickly surpassed them, but they were designed to be powerful machines in the hope they would last 10 years. This generation, neither MS nor Sony took that approach. When both launched in 2013, it was obvious that even my previous system, an aging i5-2500K + GTX 580, offered a better gaming experience. Both MS and Sony tried to save money and opted for what they believed would be the hardware required for 1080p gaming. I believe both underestimated this, MS obviously worse than Sony, and that's why both are having trouble with 1080p/60fps.
> 
> *It's going to be a long ~8 years before the next generation of consoles is ready.* I bet MS and Sony are hoping the cost of hardware for 4K gaming @ 60fps drops significantly in that time.


i seriously doubt it'll be 8 years before the next iteration of the PlayStation or Xbox.

5 is a stretch.

4k in every home in 5 years is probably not going to happen either.

cable tv moves too slow.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Master__Shake*
> 
> i seriously doubt it'll be 8 years before the next iteration of the PlayStation or Xbox.
> 
> 5 is a stretch.


Yup, I would say we start seeing semi-official information on them in about 3 years.


----------



## Master__Shake

i hope that ms and sony start releasing them in 3

then people will just give up on consoles. too many too soon then they move to pc.


----------



## Cybertox

Consoles will always, sooner or later, hit performance walls because they are limited in hardware and cannot be upgraded. That is just the design of consoles; you can't do much about it unless you bring an upgradable console to the market.


----------



## iTurn

Quote:


> Originally Posted by *Serandur*
> 
> 3. You clearly lack experience or understanding of the complexity and reality of modern hardware and software development let alone the versatility (to lock a framerate if you so choose) and settings (which benchmarks are only an indicator of one combination of) that allow the end-user to configure their games how they see fit and how those combinations make copy-paste benchmarks of that one specific combination do not give you a clear picture at all of what's going on. There's a false equivalence between console settings and those benchmark PC settings you're not grasping. I've actually used a GTX 660 machine and no matter what BS you spew, it factually runs Watch Dogs at a higher resolution (1920x1080) than the PS4 with a healthy mix of settings while always staying above 30 FPS on my brother's machine. I've used an 8800 and Core 2 machine as well and I own a PS3; there's no single instance where the PS3 ever came within range on any software they could run in common, you'll find many examples online if you're so inclined to actually care. I've been into rendering software and hardware a very long time, one of the basic things any person experienced with either would be aware of are rendering options/settings.


Not disagreeing with what you said; I'm only saying Ubisoft is a horrible company to use for benchmarks in the PC vs. consoles department. Their optimization track record on the PS4 has left users wanting...


----------



## mtcn77

Quote:


> Originally Posted by *Cybertox*
> 
> Consoles will always, sooner or later, hit performance walls because they are limited in hardware and *cannot be upgraded*. That is just the design of consoles; you can't do much about it unless you bring an upgradable console to the market.


Last time I checked, other than filing an RMA ticket, it wasn't possible to upgrade your previous hardware investment. They should reclassify the PC hardware operating scheme from "race to idle" to "race to stall". 16 GB/s of PCIe 3.0 bandwidth versus 68 + 25.6 = 93.6 GB/s can make or break empires.
Much obliged to Mr. Ofer Rosenberg and Mr. Kayvon Fatahalian for the presentation I frequently reference.
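A quick sanity check of that arithmetic. Note the 68 + 25.6 split is taken from the post as-is; which memory buses those figures refer to is the poster's claim, and the PCIe number is the commonly cited approximate one-direction figure for a 3.0 x16 link:

```python
# Bandwidth figures in GB/s, as quoted in the post above.
pcie3_x16 = 16.0          # PCIe 3.0 x16, one direction, approximate
console_total = 68 + 25.6 # the post's 93.6 GB/s on-console total

ratio = console_total / pcie3_x16
print(console_total, ratio)  # the quoted on-console figure is ~5.85x the PCIe link
```

Whatever one makes of the comparison, the arithmetic itself checks out: the quoted local memory bandwidth dwarfs the interconnect a discrete PC GPU would cross.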


----------



## Kane2207

Meh - I don't really care.

I bought a PS4 for whatever Naughty Dog puts out and play PC for multi-plats, and if they've already hit a wall, there's no rush to buy new PC hardware. Which keeps the wife and wallet happy.


----------



## CryphicKing

Quote:


> Originally Posted by *Serandur*
> 
> 1. The PS3/360 do not render DX11 features, they are incapable of doing so. The versions they get are not utilizing any DX11 features, they're not running at anything above 1280x720, they're not even maintaining 30 FPS. You're failing to understand, they're not running those features, they're not running settings that are necessarily even an option for PCs because of how far the baseline moves over the course of nearly a decade; they're getting a customized and lower-tier version of the game without them. Worse, you're arguing with someone who actually does understand and who does have actual experience with these things without either.
> 
> 2. You're not grasping the argument here of the differing standards regarding PCs and consoles and subsequently different baselines nor of the reality of how well the 8800s hold up and benefit from real software "optimization" just as all hardware, PC and otherwise, does. The PC is the source of modern high-performance processor and development software. Companies and institutes bank some serious financial, scientific, professional, and development eggs into that basket. It's not nearly as inefficient as you think it is, it's an insult to the software engineers designing processing technology and software.
> 
> 3. You clearly lack experience or understanding of the complexity and reality of modern hardware and software development let alone the versatility (to lock a framerate if you so choose) and settings (which benchmarks are only an indicator of one combination of) that allow the end-user to configure their games how they see fit and how those combinations make copy-paste benchmarks of that one specific combination do not give you a clear picture at all of what's going on. There's a false equivalence between console settings and those benchmark PC settings you're not grasping. I've actually used a GTX 660 machine and no matter what BS you spew, it factually runs Watch Dogs at a higher resolution (1920x1080) than the PS4 with a healthy mix of settings while always staying above 30 FPS on my brother's machine. I've used an 8800 and Core 2 machine as well and I own a PS3; there's no single instance where the PS3 ever came within range on any software they could run in common, you'll find many examples online if you're so inclined to actually care. I've been into rendering software and hardware a very long time, one of the basic things any person experienced with either would be aware of are rendering options/settings.
> 
> In any case, I think I'm done here. You can keep screaming at the clouds and ironically accusing knowledgeable posters of not having a clue all you like. If you like your console gaming, that's fine, but it's blatantly obvious you have no knowledge of PC gaming or how it stacks up and you're not going to change facts with willful ignorance. In any case, there's nothing to be had by further discussion here.


1. DX11 is not a rendering enhancement in itself; it's a Windows-only programming API regulated by Microsoft, which requires GPU makers to manufacture hardware matching the programming pipeline. It can do more rendering features than DX9 because it makes use of more system resources on the same PC. Consoles never had that issue: every console has unique APIs and an SDK of its own, so DirectX versions never meant anything to low-level APIs/consoles. Some clever marketing phrases got into your head; features such as tessellation made more than a handful of appearances on the 360/PS3 -->

http://www.eurogamer.net/articles/digitalfoundry-2014-vs-gran-turismo-6

BTW, there are a handful of 360/PS3 games running at 1080p; looks like you didn't keep yourself informed back in the day, when 1080p/60fps didn't make headline stories.

http://wikibin.org/articles/list-of-full-hd-1080p-ps3-games.html

2. I find it laughable when someone who doesn't know what DirectX is also claims a high-level API is more efficient than a low-level API while claiming any kind of "engineering experience" in software. Show me your résumé and how many years of experience you have in quality assurance, data mining, and memory profiling in different OS environments before I can take your "engineering experience" seriously. Sorry to inform you that most of these glorious standards from your imagination don't click with reality. Visual leaps in video games depend on visual R&D, and which hardware runs the latest algorithms better depends on programming efficiency on that hardware; the one with a low-level API and an SoC always has the advantage in such efficiency, and that's what real R&D programmers rely on to build their engines and middleware. A PC GPU just provides a certain amount of "power" to handle whatever is thrown at it. You're struggling with this simple fact, and your obsession with "powah" is like a gym-trained bodybuilder telling an MMA fighter that he can't get better cardio, better martial arts, and better body coordination without getting bigger biceps.

3. As a C++/Java veteran working in 3D architecture production, I've had plenty of hands-on experience with middleware such as UDK3/4 and CryEngine 2/3, and I spent months studying their source code documentation as well as consoles' API manuals (GameCube, Xbox 360). I recommend you Google some basics on the different types of APIs, and on what SoC hardware does compared to general-purpose hardware and its coding efficiency compared to a desktop PC, before attempting to talk about engineering effort in OS environments you've never worked with.

Aside from that, I'm still waiting for a single piece of evidence from you to back up a single claim you've made. I already showed you Watch Dogs benchmarks in my previous post; look up where your GTX 660 sits, and feel free to check the GTX 660's performance in Dead Rising 3, Ryse, and Black Flag. Any of those games swallows a GTX 660 alive, and its performance in ACU will be even more hilarious.

Lastly, I have to apologize: I don't find you knowledgeable in any subject you claim to understand, other than having experience putting PC components together or choosing which parts to buy. Plus, you should stop imagining every non-PC-only gamer as your console nemesis. Most of the time I prefer PC gaming, if you're wondering.


----------



## CryphicKing

Quote:


> Originally Posted by *jameschisholm*
> 
> Do you have one where the game environment is the same? Like both in a jungle/green area? Or both in a space type scenario?
> 
> I think when I said they reduced the res maybe they didnt for every game but games like cod they did. Plus 720p 30fps is hardly impressive.
> 
> Also noticed the first image there's literally no foliage, everything looks bare. 2nd image is a clean Sci fi space environment with a bright sun and brown rocks. It doesn't seem hard to achieve, however it looks to be a significant improvement over the first image so they've done something right to make it better.


720p was considered the high-end gaming standard on any platform 9 years ago, and Halo 4 rendered at 720p compared to Halo: Reach's 1152x720 and Halo 3's 640p, while featuring numerous technical improvements. The point is, console hardware is built for a programming efficiency that is not possible in a PC environment; every console achieves a 3-4x graphical leap over 5-9 years without a hardware upgrade. A PC can still outpace consoles by increasing raw power year after year, until developers run out of ideas for creating a better workflow on the console's architecture, typically after 3-5 years.

The leap from Halo 3 to Halo: Reach to Halo 4 is not only resolution; the engine improvements are the real highlight: three times the polygon budget, a revamped lighting system, larger landscapes, more enemies on screen, etc. There's a documentary on the technical improvements from Halo 3 to Halo: Reach you can watch. The best example of a visual leap on the 360/PS3, imo, is GTA 4 to Red Dead to GTA 5.

For a visual leap on this generation of consoles, compare AC: Black Flag to AC: Unity on the Xbox One. Both run at 900p/30fps, but Unity is a totally different beast, with an overhauled engine and numerous improvements in every respect. Now check the PC spec requirements from Black Flag to Unity.


----------



## Serandur

Quote:


> Originally Posted by *CryphicKing*


You make far too many assumptions based on limited data:

8800 -> 2012 mobo/CPU - false

Assuming I don't know what DirectX is because I state the old consoles don't have hardware supporting DX11, and for stating that the reason for dropped pre-DX11 GPU support is a raised baseline standard on PCs - false

Assuming 660 levels of performance based on a benchmark running ONE combination of rendering settings in a manner MEANT to push GPU loads for the sake of comparative testing (well above what a PS4 runs) -> 660 performance at settings equivalent to the console's - false

Assuming "efficiency" in the context I used it meant optimal performance, when I clearly stated it in terms of a standard API that supports a much broader range of hardware (in DirectX's case, it makes very effective use of GPUs with few wasted cycles; the overhead is primarily an issue for draw calls) -> false

In that link you provided, assuming "upscaled 1080p" is actual 1920x1080 -> false

Your knowledge and experience are limited, both in regards to settings (seriously, performance setting options; modern PC games are extremely scalable) and in regards to the performance of hardware you don't even have access to. Google the 8800's performance in newer games yourself; you've ignored my examples. As for settings, nobody runs PC game benchmarks with settings equivalent to consoles' (especially when console resolution is below 1920x1080). Consider the differences.
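On the draw-call overhead point, here's a rough sketch of why call count dominates CPU-side cost under a high-level API: at a fixed per-call CPU cost, the number of calls alone can blow the frame budget. The per-call cost below is a hypothetical placeholder, not a measured DirectX figure:

```python
# Estimate CPU time per frame spent issuing draw calls, given a
# hypothetical fixed per-call cost in microseconds.
def cpu_ms_spent_on_draw_calls(calls_per_frame, cost_us_per_call):
    return calls_per_frame * cost_us_per_call / 1000.0

frame_budget_ms = 1000.0 / 60  # ~16.7 ms per frame at 60 FPS

light = cpu_ms_spent_on_draw_calls(1_000, 5.0)   # 5.0 ms: fits the budget
heavy = cpu_ms_spent_on_draw_calls(10_000, 5.0)  # 50.0 ms: far over budget

print(light, heavy, frame_budget_ms)
```

Scaling the call count 10x with the same per-call overhead takes the same renderer from comfortably within a 60 FPS budget to several frames' worth of pure API cost, which is why draw-call batching matters so much on high-level APIs.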


----------



## lacrossewacker

Quote:


> Originally Posted by *phill1978*
> 
> Posting screenshots doesn't mean much; it doesn't show physics, lighting, animation, shaders, reflections, occlusion, or anisotropy in motion.
> 
> Engine differences account for the majority of visual improvements. These carry optimizations, but the real difference is almost always the progression of game engine technologies.


The engine is better in every way.

I suggest you read Digital Foundry's overview of Halo 4's engine *here*.

Old tech interview; good insight into how the tech and art need to complement each other




Real time cutscene goodness





Enjoy....



Vs.





Vs.





Vs.





Vs.



I'll wait for a more credible source like 343 Industries to say when the X1 has been completely "maxed", just as Naughty Dog will eventually do with the PS4. It'll be at least 2 or 3 games in, though.


----------



## Slomo4shO

Quote:


> Originally Posted by *Master__Shake*
> 
> My $450 budget gaming build from 2012 is slightly more powerful than the PS4, with a mildly overclocked 4.2 GHz FX-6300 and a GPU at 2.15 TFLOPS.
> 
> Video Card: Sapphire Radeon R9 280 3GB Dual-X Video Card


The R9 280 was out in 2012?


----------



## Master__Shake

Quote:


> Originally Posted by *Slomo4shO*
> 
> The R9 280 was out in 2012?


i never said that and yes it was called the 7950.

or 7950 boost or whatever you'd like.


----------



## SpeedyVT

Most of you complaining about hitting the wall keep forgetting that it's these self-imposed limitations that improve performance on our desktop PCs, through better optimization methods. Sony and Microsoft will milk the performance out of their systems with exclusives.


----------



## jameschisholm

I suppose if they are hitting a wall now, then pc raw power should send fps through the roof.


----------



## Systemlord

Quote:


> Originally Posted by *paulerxx*
> 
> You can't plug a computer into a TV?


The differences between HDTVs and LED monitors are pretty darn small, and both are heading towards 4K. You can plug a computer into an LED HDTV.


----------



## Carniflex

Quote:


> Originally Posted by *Cybertox*
> 
> Consoles will always, sooner or later, hit performance walls because they are limited in hardware and cannot be upgraded. That is just the design of consoles; you can't do much about it unless you bring an upgradable console to the market.


Hehe. That is supposed to be a SteamBox. Although I'd imagine that if Sony and MS decide to pursue the traditional console model (of which I'm not convinced), they might throw in an expansion slot or two. In my personal opinion, they're more likely to try some kind of software-based trick, which in essence they're already doing with this gen's consoles: x86-based PCs in a nutshell, with a very customized software layer on top. If they manage to truly stretch the current gen consoles out for 10 years (which I kind of doubt), then by the end of their life cycle I expect them to be beaten in raw performance even by top-end mobile phones and tablets (although probably at a much higher price point at that time).

However, while I'm highly critical of the current gen console hardware when compared to current gen PCs, the hardware is not really the point of a console.







The reason consoles sell is the exclusive games you can get only on that platform. If what you want to play does not exist at all on the PC, it does not matter how much more powerful a machine you can build on the same budget. This generation, both MS and Sony have also thrown in additional services (the online subscription we love to take pokes at is not only a tax to be online; it includes some "stuff", like Netflix for US residents and such).


----------



## caswow

Quote:


> Originally Posted by *Carniflex*
> 
> Hehe. That is supposed to be a SteamBox. .


there is no steambox. its a pc. PC:thumb:


----------



## Carniflex

Quote:


> Originally Posted by *caswow*
> 
> there is no steambox. its a pc. PC:thumb:


Sure there is! It's that magic controller that turns a regular PC of arbitrary strength into a SteamBox... I suppose (assuming it comes with a sticker you can plaster over the brand name of your case).

On a more serious note, in my opinion small form factor PCs and the current gen consoles are already competing for the spot in the living room. Traditional consoles currently have an edge there because that's what people know. The small form factor PC market is fragmented, and many of these machines aren't aimed at gaming, so your layman isn't aware that this kind of option even exists, or if he is aware, he may not know that PC gaming is no longer like it was 10-15 years ago. The official release of the Steam Controller won't change much, other than perhaps getting some modestly overpriced boxes with a Steam sticker into brick-and-mortar stores, and I don't see Valve dropping hundreds of millions into a marketing campaign to sell these. So I expect living room PC gaming to remain a relatively niche activity unless something unexpected happens and people start buying these "SteamBoxes" en masse to play some kind of exclusive title, like, dunno... Dota 2? Half-Life 3?

Most PC gamers probably won't get a real "Steam Console", opting instead to use Steam streaming with the help of a ~$100 HTPC in whatever format they prefer. With a little technical knowledge and a bit of will to tinker, a 5+ year old Dell can be turned into a reasonably capable gaming machine by dropping a $150-200 graphics card into it.


----------



## CryphicKing

Quote:


> Originally Posted by *Serandur*
> 
> You make far too many assumptions based on limited data:
> 
> 8800 -> 2012 mobo/CPU - false


Funny, I'm still waiting on your evidence: a single benchmark score or video with recorded FPS for BF4, Tomb Raider, or Skyrim on a Core 2 Duo with 1GB of RAM and an 8800. In case you have a short memory, 1GB of system RAM was the top-of-the-line standard when the 8800 GTS was available; even Crysis from 2007 asked for 1.5-2GB of RAM to max out, and 4GB was still considered a high-end requirement before Crysis 3 and BF4.
Looks like you have our roles reversed: I only provide fact checks; you are the one countering facts with fantasies and assumptions all along.
Quote:


> Originally Posted by *Serandur*
> 
> In that link you provided, assuming "upscaled 1080p" is actual 1920x1080 -> false


You should stop assuming and get yourself informed; the resolutions/frame rates of these PS3 games were tested and proven, some even running at 60fps, not upscaled but fully native 1080p --->>
http://www.eurogamer.net/articles/digitalfoundry-rr7-the-1080p-dream-blog-entry

The "false" in your dreams and fantasies = correct in the real world.

Quote:


> Originally Posted by *Serandur*
> 
> Assuming I don't know what DirectX is because I state the old consoles don't have hardware supporting DX11 and for stating the reason for lack of pre DX11 GPU support being a raised baseline standard on PCs - false
> 
> Assuming "efficiency" in the context I used it meant optimal performance when I clearly stated it in terms of a standard API that supports a much broader range of hardware (in DirectX's case, it makes very effective use of GPUs with few wasted cycles, the overhead is primarily an issue for draw calls) -> false


Again, your assumptions vs. facts.
It didn't take me a second to figure out you have no clue what DX11 is, judging from all your false claims; you went as far as to create your own theory of how it works. I only gave you some insight into how DirectX works on PC and how the low-level API environment gives consoles the freedom to bypass DirectX obstacles and produce supposedly "DX11-only" multiplats even after 9 years. On PC, when a game tells you it doesn't support DX11, you're out of luck and buy yourself a new OS and GPU, simple as that. Now which part do you still not understand?

Quote:


> Originally Posted by *Serandur*
> 
> Assuming 660 levels of performance based on a benchmark running ONE combination of rendering settings in a manner MEANT to push GPU loads for the sake of comparitive testing (well above what a PS4 runs) -> 660 performance running equivalent to console settings - false


Once again, you're the only one doing the assuming. Where did you get your idea of what "settings" the PS4 runs at, and what PC settings compare to them? (You suggested it in the first place, remember?) Some games even had an edge on PS4 vs. the PC's benchmarked settings. Mind you, all the scores I posted were benchmarked by professionals from different websites; not only does the 660 fail to keep up in all these games, by next year this GPU will become completely irrelevant in gaming. Look up some Digital Foundry analysis before asking someone here to believe your fantasies.
Your claim = false (again)

Quote:


> Originally Posted by *Serandur*
> 
> Google the 8800 for newer games yourself, you've ignored my examples. As for settings, nobody runs PC game benchmarks with equivalent settings to consoles (especially considering when resolution is below 1920x1080). Consider the differences..


How about you accept the fact that the 8800 can't run DX11-only games these days and stop making up your own facts? I didn't ignore your examples; I don't recall skipping any. Every claim you made about the 8800 GTS is completely false; whichever way you cut it (cost, performance, lifetime, game compatibility, gaming experience, etc.), the 8800 can't keep up.
Quote:


> Originally Posted by *Serandur*
> 
> Your knowledge and experience is limited, in regards to settings (seriously, performance setting options; modern PC games are extremely scalable) and performance of hardware you don't even have access to. Google the 8800 for newer games yourself, you've ignored my examples. As for settings, nobody runs PC game benchmarks with equivalent settings to consoles (especially considering when resolution is below 1920x1080). Consider the differences.


I don't have access to what? I have no idea why you imagine me (or any non-PC-only gamer) as your console nemesis when I'm primarily a PC gamer. I customize my PC games as much as possible and have participated in more Crysis 1/Crysis 2 modding projects than you ever have, and it's safe to say I have a better chance of gaming at higher settings than you on PC. Sorry to disappoint you, but I refuse to stay ignorant and choose not to be a single-platform gamer like you. Still, I'm not that console nemesis from your nightmares.


----------



## Serandur

Quote:


> Originally Posted by *CryphicKing*
> 
> Funny, I'm still waiting on your evidence: a single benchmark score or video with recorded FPS for BF4, Tomb Raider, or Skyrim on a Core 2 Duo with 1GB of RAM and an 8800. In case you have a short memory, 1GB of system RAM was the top-of-the-line standard when the 8800 GTS was available; even Crysis from 2007 asked for 1.5-2GB of RAM to max out, and 4GB was still considered a high-end requirement before Crysis 3 and BF4.
> Looks like you have our roles reversed: I only provide fact checks; you are the one countering facts with fantasies and assumptions all along.
> Again, your assumptions vs. facts.
> It didn't take me a second to figure out you have no clue what DX11 is, judging from all your false claims; you went as far as to create your own theory of how it works. I only gave you some insight into how DirectX works on PC and how the low-level API environment gives consoles the freedom to bypass DirectX obstacles and produce supposedly "DX11-only" multiplats even after 9 years. On PC, when a game tells you it doesn't support DX11, you're out of luck and buy yourself a new OS and GPU, simple as that. Now which part do you still not understand?
> Once again, you're the only one doing the assuming. Where did you get your idea of what "settings" the PS4 runs at, and how do you justify the PC settings you compare to them? (You're the one who suggested it in the first place, remember?) Some games even had an edge on PS4 vs. the PC's benchmarked settings. Mind you, all the scores I posted were benchmarked by professionals from different websites; not only does the 660 fail to keep up in all these games, by next year this GPU will become completely irrelevant in gaming. Look up some Digital Foundry analysis before asking someone here to believe your fantasies.
> Your claim = false (again)
> You should stop assuming and get yourself informed; the resolutions/frame rates of these PS3 games were tested and proven, some even running at 60fps, not upscaled but fully native 1080p --->>
> http://www.eurogamer.net/articles/digitalfoundry-rr7-the-1080p-dream-blog-entry
> 
> The "false" in your dreams and fantasies = correct in the real world.
> How about you accept the fact that the 8800 can't run DX11-only games these days and stop making up your own facts? I didn't ignore your examples; I don't recall skipping any. Every claim you made about the 8800 GTS is completely false; whichever way you cut it (cost, performance, lifetime, game compatibility, gaming experience, etc.), the 8800 can't keep up.
> I don't have access to what? I have no idea why you imagine me (or any non-PC-only gamer) as your console nemesis when I'm primarily a PC gamer. I customize my PC games as much as possible and have participated in more Crysis 1/Crysis 2 modding projects than you ever have, and it's safe to say I have a better chance of gaming at higher settings than you on PC. Sorry to disappoint you, but I refuse to stay ignorant and choose not to be a single-platform gamer like you. Still, I'm not that console nemesis from your nightmares.


Let's calm down. There is clearly a misunderstanding of my position as well as yours going on here. I am not stating the 8800 can run those DX11-only games; it clearly cannot. I did not state any fantasy regarding DX11; as you've said, APIs still need compliant hardware from GPU manufacturers. As we've both said, consoles do not run DirectX or APIs as in a PC environment. I am not making any claims regarding differences in RAM requirements because last generation, PCs had full OSs to manage whereas consoles had very simplistic ones. I am also not stating CPU overhead on a high-level API isn't an issue, nor am I denying that a fixed platform can have renderers, streaming solutions, APIs, etc. suited specifically to it. DirectX these days, however, is very efficient even with PC GPUs. We see them stack up to consoles just as we would expect them to were they PC GPUs. I will list evidence below.

My original claim began regarding the strong advancement of PC GPU performance over the years, as well as consoles', because of advancements in hardware-level shaders and their utilization, from which the 8800 series also significantly benefitted. We also saw the advent of multi-core CPUs into mainstream use and adaptation to those as well. The few games that require DX11 as a baseline obviously do not run on 8800s, hence why I am excluding them from comparison *purely for the sake of seeing how 8800s have also held up and benefitted from rendering software advancements*. I'm not sure what argument you thought I was making regarding DirectX, but I was simply stating we cannot compare games that obviously don't run on one of the two groups of hardware in question. This is the same reason why exclusives cannot be used as a measure of how general performance has increased on both platforms. I did not state a fantasy simplifying DirectX in rendering features like tessellation; you assumed that. But yes, the PS3/360 do lack hardware support for a lot of the features DirectX 11 introduced. How could they not, given they're using modified PC GPU architectures predating DX11? With a few exceptions like Gran Turismo, yes, some of the features associated with DX11 can be approximated to a light degree or in very specific circumstances. Those circumstances don't generally apply to multiplatform games, but that was not my point. I know what DirectX is very well, and I'd appreciate it if both of us could stop with the assumptions about the other's intentions and realize this.

Regarding resolution, I'm sorry, but you are wrong here. That link you posted full of "1080p" games is in fact mostly 1280x720 with a few very rare exceptions in either direction, above or below. Unfortunately, the old beyond3d link isn't functioning right now (beyond3D in general doesn't seem to be as of the time of writing), but part of it has been chronicled here: http://www.ps3hax.net/showthread.php?t=4806

Digital Foundry also have many different analyses out there. Even later in their lives, the PS3/360 generally did not render games beyond 1280x720 (_especially_ multiplatform games); this is an established fact. Obviously there is a very small handful of exceptions, but very few of them are in the list you originally posted. In any case, regarding GPU performance and PC resolutions: any platform differences from the 2006/2007 Core 2s have no bearing on GPU throughput with regard to resolution, and as GPUs were indeed my focus (see my shader post), here are some examples:

1280x704, clearly lower-tier graphical settings, reduced gameplay design with multiplayer player limits, post-process AA, 30 FPS cap

and






Also:

1152 x 720 on 360, 1024x720 on PS3, lowered settings obviously, 20-30 FPS

and






I'm not listing any more of those examples simply because it's time-consuming to do so, but the point is that 8800s also significantly benefitted from more advanced engines. Of course we're not going to find many official benchmarks on cards that old, but the card still delivered a definitively superior experience over the years. Of course a lot of PC gamers upgraded since then; faster stuff is shiny and enticing. It doesn't negate this.

My point also had nothing to do with matching last-gen consoles from the same timeframe, because of the many hardware differences between what happened last time and what's happening this time around. I'm purely demonstrating GPUs holding up. The settings mostly aren't desirable by modern PC gamer standards, but that's not the point. The point is that those cards still benefitted and held on for a while at settings reasonable for their capabilities, like the consoles. DX11 titles are of course the exception; I'm excluding them only because we cannot compare how processing performance held up with those, as they aren't going to run on 8800s, or any pre-Fermi GPUs for that matter.

Furthermore, regarding your claims of this "nemesis" thing: it's another assumption, and a harmful, vitriolic one at that. It has nothing to do with my choice of platform. I already stated I have a PS3, have a 3DS too, had an Xbox, had Gameboys, had PS2s, the PS1, the SNES, and am waiting on Kingdom Hearts III and FF XV to decide if I wish to get a PS4 as well. There is no ignorance in noticing how PC GPUs actually don't fall behind console ones.

The hardware I stated you have no access to is the GTX 660 I mentioned multiple times, which you're so sure cannot keep up with what I presume to mean a PS4 or Xbox One's GPU. If you meant in regards to other PC GPUs, of course I'm not arguing that. But it is holding up better than the Xbone and PS4's GPUs; that's a fact, and one I've tested personally with my brother's 660. I'm not saying anything about your access to PC hardware; I'm talking about this one piece of hardware that theoretically and realistically holds up pretty well to the new consoles.

Furthermore, you misunderstood what I said about the benchmark. It has nothing to do with being professional or not; I wasn't criticizing any aspect of the benchmark, only stating that, as we PC gamers very well know, benchmarks at one (usually higher-tier) setting, meant to stress and illustrate the differences between GPUs free from any other bottlenecks, are obviously not representative of an irrelevant console-tier metric or of the tested hardware's capability to run the game well at different settings. Regarding the PS4's settings in Watch Dogs, it's a fact that it runs at 1600x900 with roughly high settings, capped at 30 FPS with some very minor dips, as per Digital Foundry's own account. The same account states that the Xbox One version renders at "792p" and that, and I quote, "dropping down to the GTX 750 Ti gave us the same experience at the high quality level - effectively a match for PS4, but with a full 1080p resolution." A 750 Ti is actually a very slightly weaker card than the 660, and here it is besting the PS4 with similar settings and a locked 30 FPS, but at almost 50% higher resolution. Then there's my own testimony from testing the 660 against these observations. The 660 holds up well.
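For anyone who wants the raw numbers, the pixel math behind that "almost 50%" is simple. A quick sketch (the 1408 width for "792p" is my assumption based on a 16:9-style frame; the other figures are as quoted):

```python
# Rough pixel math for the resolutions discussed above. The "792p" width
# (1408) is an assumption; the 1080p and 900p figures are as quoted.
resolutions = {
    "1080p (750 Ti, per DF)": (1920, 1080),
    "900p (PS4)": (1600, 900),
    "792p (Xbox One)": (1408, 792),
}

ps4_pixels = 1600 * 900
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / ps4_pixels:.0%} of the PS4's)")
```

1080p works out to 2,073,600 pixels against 900p's 1,440,000 - a 44% increase, hence "almost 50% higher resolution".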

There are many other examples out there. Search for Battlefield 4 (1600x900, confirmed mix of medium and high on PS4), AC Unity now (1600x900 with 20-25 FPS on PS4), and plenty of others. In all said cases, especially given the detailed setting analyses we get these days, the PS4's GPU performs right where it theoretically should: around 7850 level, if not lower in situations where its CPU limits it (as per recent game benchmarks, even i3s are holding up very well against the console CPUs this time around even with DX11 overhead, and the gap grows where Mantle is utilized). Again, the point is not whether the 660 holds up to newer PC GPUs, but that we are seeing no clear advantage from console-specific optimization with regard to GPU performance, and the 660 can hold up to at least those consoles. Now that both consoles and PCs are on a level playing field in feature and instruction sets, it has become even more straightforward to really compare them, and that's what's relevant in the context of this thread.

I do not mean you any harm and I apologize for any confrontational tone I gave off; I simply wanted to clarify my position. Some of the points I was making hinged on my assumption that knowledge of where the consoles stand in these games was widespread and common ground. I should have established those points before giving an interpretation of them; I apologize.


----------



## SpeedyVT

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Serandur*
> 
> Let's calm down. There is clearly a misunderstanding of my position as well as yours going on here. I am not stating the 8800 can run those DX11-only games, it clearly cannot. I did not state any fantasy regarding DX11; As you've said, APIs still need the compliant hardware from GPU manufacturers. As we've both said, consoles do not run DirectX or APIs as in a PC environment. I am not making any claims regarding differences in RAM requirements because last generation, PCs had full OSs to manage whereas consoles had very simplistic ones. I am also not stating CPU overhead on a high-level API isn't an issue nor am I denying that a fixed platform can have renderers, streaming solutions, APIs, etc. suited specifically to them. DirectX these days, however, is very highly efficient with even PC GPUs. We see them stack up to consoles just as we would expect them too were they PC GPUs. I will list evidence below.
> 
> My original claim began regarding the strong advancement of PC GPU performance over the years as well as consoles because of advancements in hardware-level shaders and utilization of them, from which the 8800 series also significantly benfitted. We also saw the advent of multi-core CPUs into mainstream uses and adaptation to those as well. The few games that require DX11 as a baseline obviously do not run on 8800s, hence why I am excluding them from comparison *purely for the sake of seeing how 8800s have also held up and benefitted from rendering software advancements*. I'm not sure what argument you thought I was making regarding DirectX, but I was simply stating we cannot compare games that obviously don't run on one of the two groups of hardware in question. This is same reason why exclusives cannot be used as a measure of how general performance has increased with both platforms. I did not state a fantasy simplifying DirectX in rendering features like tessellation, you assumed that. But yes, the PS3/360 do lack hardware support for a lot of the features DirectX 11 introduced. How couldn't they, they're using modified PC GPU architectures predating DX11? With a few exceptions like Gran Turismo, yes some of the features associated with DX11 can be approximated to a light degree or in very specific circumstances. Those circumstances don't generally apply to multiplatform games, but that was not my point. I know what DirectX is very well and I'd appreciate it if both of us could stop with the assumptions on the other's intention and realize this.
> 
> Regarding resolution, I'm sorry, but you are wrong here. That link you posted full of "1080p" games is in fact mostly 1280x720 with a few very rare exceptions in either direction, above or below. Unfortunately, the old beyond3d link isn't functioning right now (beyond3D in general doesn't seem to be as of the time of writing), but part of it has been chronicled here: http://www.ps3hax.net/showthread.php?t=4806
> 
> Digital Foundry also have many different analyses out there. Even later in its life, the PS3/360 generally did not render games beyond 1280x720 (_especially_ multiplatform games), this is an established fact. Obviously there are a very small handful of exceptions, but very few of those in the list you originally posted. In any case regarding GPU performance and PC resolutions, any platform that is different from 2006/2007 Core 2s has no bearing on GPU throughput with regards to resolution and as GPUs were indeed my focus (see my shader post), here are some examples:
> 
> 1280x704, clearly lower-tier graphical settings, reduced gameplay design with multiplayer player limits, post-process AA, 30 FPS cap
> 
> and
> 
> 
> 
> 
> 
> 
> Also:
> 
> 1152 x 720 on 360, 1024x720 on PS3, lowered settings obviously, 20-30 FPS
> 
> and
> 
> 
> 
> 
> 
> 
> I'm not listing anymore of those examples simply because it's time-consuming to do so, but the point is that 8800s also significantly benefitted from more advanced engines. Of course we're not going to find many official benchmarks on cards that old, but it still holds up to a definitely superior experience over the years. Of course a lot of PC gamers upgraded since then, faster stuff is shiny and enticing. It doesn't negate this.
> 
> My point also had nothing to do with matching last-gen consoles from the same timeframe because of the many differences of what happened last time hardware-wise from what's happening this time around. I'm purely showing a demonstration of GPUs holding up. The settings aren't desirable by modern PC gamer standards, mostly, but that's not the point. The point is that they still benefitted and held on for a while with reasonable settings for their capabilities like the consoles. DX11 titles are of course the exception, I'm excluding them only because we cannot compare how processing performance held up with those as they aren't going to run on 8800s or any pre-Fermi GPUs for that matter.
> 
> Furthermore, regarding your claims of this "nemesis" thing. It's another assumption and a harmful, vitriolic one at that. It's nothing to do with my choice of platform. I already stated I have a PS3, have a 3DS too, had an Xbox, had Gameboys, has PS2s, the PS1, SNES, and am waiting on Kingdom Hearts III and FF XV to decide if I wish to get a PS4 as well. There is no ignorance in noticing how PC GPUs actually don't fall behind consoles ones.
> 
> The hardware I stated you have no access to is the GTX 660 I mentioned multiple times that you're so sure cannot keep up with what I presume to mean a PS4 or Xbox One's GPU. If you meant in regards to other PC GPUs, of course I'm not arguing that. But it is holding up better than the Xbone and PS4's GPUs, that's a fact and one I've tested personally with my brother's 660. I'm not saying anything regarding your access to PC hardware, I'm talking about this one piece of hardware that theoretically and realistically holds up pretty well to the new consoles.
> 
> Furthermore, you misunderstood what I said about the benchmark. It's nothing to do with being professional or not, I wasn't criticizing any aspect of the benchmark; only stating that, as we as PC gamers very well know, benchmarks at one (usually higher-tier) setting meant to stress and illustrate the differences between GPUs free from any other bottlenecks are obviously not representative of an irrelevant console-tier metric or the capability of the tested hardware to run the game well with some different settings. Regarding the PS4's settings of Watch Dogs, it's a fact that it's running at 1600x900 with roughly high settings and capped at 30 FPS with some very minor dips as per Digital Foundry's own account. The same account states that the Xbox One version renders at "792p" and that, and I quote, "dropping down to the GTX 750 Ti gave us the same experience at the high quality level - effectively a match for PS4, but with a full 1080p resolution." A 750Ti is actually a very slightly weaker card than the 660 and here it is besting the PS4 with similar settings and a locked 30 FPS, but almost 50% higher resolution. Then there's my own testimony based on my testing of the 660 relative to these observations. The 660 holds up well.
> 
> There are many other examples out there. Search for Battlefield 4 (1600x900, confirmed mix of medium and high on PS4), AC Unity now (1600x900 with 20-25 FPS on PS4), and plenty others. In all said cases, especially given the detail of setting analysis we get these days, the PS4's GPU performs right where it theoretically would, around 7850 level if not lower in situations where its CPU limit it (as per recent game benchmarks, even i3s are holding up very well above/against the console CPUs this time around even with DX11 overhead, even moreso of a gap in situation where Mantle is utilized). Again, the point is not whether the 660 holds up to newer PC GPUs, but that we are seeing no clear advantages of console-specific optimization with regards to GPU performance and it can hold up to at least those. Now that both consoles and PCs are on a level playing field in feature and instruction sets, it's become even more straightforward to really compare and that's what's relevant in the context of this thread.
> 
> I do not mean you any harm and I apologize for any confrontational tones I gave off, I simply wanted to clarify my position and some of the points I was making hinged on my assumption that knowledge of where the consoles stood in these games was widespread and common ground. I should have established those points before giving an interpretation of them, I apologize.






The fact is that we've not utilized this hardware to the fullest extent yet. When we do, the benefit comes to us PC consumers as well, helping developers optimize code and design on those platforms. And when I say we've not fully utilized it, mind you that even our desktops are poorly utilized. Microsoft was embarrassed by Mantle, but that still leaves other unoptimized areas. The fact is the PS4 is 10 times more powerful than the PS3, yet the games are not significantly different. The PS3 dedicated most of its cores to graphics, not AI processing and so on. When a company claims the PS4 can't handle it, I laugh, because what they can't handle is doing better programming.
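The "10 times" figure roughly checks out against commonly cited theoretical peak shader numbers. A back-of-the-envelope sketch (these are paper GFLOPS estimates, not measured performance, and the RSX figure in particular is a frequently quoted approximation):

```python
# Commonly cited theoretical peak shader throughput (GFLOPS) -- rough estimates only.
ps4_gpu = 1152 * 0.800 * 2   # 1152 shader ALUs x 800 MHz x 2 ops/cycle (FMA) ~= 1843 GFLOPS
ps3_rsx = 192.0              # frequently quoted programmable-shader estimate for the RSX
print(f"PS4 GPU is ~{ps4_gpu / ps3_rsx:.1f}x the RSX on paper")
```

On paper only, of course: the PS3's Cell SPEs also did a lot of graphics-adjacent work, which narrows the real-world gap.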


----------



## lacrossewacker

Quote:


> Originally Posted by *SpeedyVT*
> 
> Facts are that we've not fully utilized this hardware yet to the fullest extent. When we do, the benefit comes to us PC Consumers as well. Helping developers optimize code and design on such platforms. When I say we've not fully utilized mind you that even our desktops are poorly utilized. Microsoft was embrassed about Mantle, but that still leaves other unoptimized areas. Fact is PS4 is 10 times more powerful than the PS3, the games do not be significantly different. PS3 dedicated most of it's cores to graphics and not AI processing so on. When a company makes the claim the PS4 can't handle it. *I laugh because they can't handle to do better programming*.


Exactly.

That's why I say I'll wait for a more credible source like 343 Industries to declare when the X1 has been completely "maxed", just as Naughty Dog will eventually do with the PS4. It'll be at least 2 or 3 games in, though.

Ubisoft just doesn't invest the time necessary to truly max out each nook and cranny of either console - if they did, we wouldn't be seeing the current bug fest that is AC: Unity.


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *phill1978*
> 
> posting screen shots doesn't mean much, it doesn't show physics,lighting,animation, shaders, reflections,occlusions,of anisotropy in motion.
> 
> engine differences account for the majority of visual improvements. These carry optimizations but the real difference is most always the progression of game engine technologies.


The engine is better in every way.

I suggest you read DigitalFoundrys overview of Halo 4's engine *here*.

Old tech interview - good insight into how the tech and art need to complement each other




Real time cutscene goodness





Enjoy....



Vs.





Vs.





Vs.





Vs.



I'll wait for a more credible source like 343 Industries to say when the X1 has been completely "maxed", just as Naughty Dog will eventually do with the PS4. It'll be at least 2 or 3 games in, though.


----------



## Carniflex

In my opinion a more interesting thought experiment is what would go into a pretty decent small form factor gaming PC, what the price would be, and whether it's "worth it" over a console. After thinking about it for a little bit, this is what I threw together on paper.


Not everything in there is mandatory, and obviously one can go much higher. All prices include 20% VAT and are current prices in Estonia. About 1050 EUR for the whole system is quite a reasonable price considering its reasonably small form factor; it's pretty powerful for single-screen gaming.

The question is whether it is "worth it" over a console in a similar situation (i.e., the living room). That is probably pretty subjective. In my opinion it would be, but I am ofc biased


----------



## Kane2207

Quote:


> Originally Posted by *Carniflex*
> 
> In my opinion more interesting thought experiment is what would be in pretty decent small from factor gaming PC. And what would be the price and is it "worth it" over a console. After thinking about it for a little bit that is what I threw together on paper.
> 
> 
> Not everything is mandatory in there and obviously one can go much higher. All prices include 20% VAT and are the current prices in Estonia. About 1050 EUR for the whole system is quite reasonable price considering it's reasonably small from factor and pretty powerful for single screen gaming.
> 
> The question is if it is "worth it" over the console in similar situation (i.e, living room). And that is probably pretty subjective. In my opinion it would be, but I am ofc biased


I see this argument quite a lot, and whilst it is valid, until I can play Naughty Dog games on my PC (read - NEVER) I'll continue to buy consoles regardless of how powerful they are compared to my desktop.


----------



## Carniflex

Quote:


> Originally Posted by *Kane2207*
> 
> I see this argument quite a lot, and whilst it is valid, until I can play Naughty Dog games on my PC (read - NEVER) I'll continue to buy consoles regardless of how powerful they are compared to my desktop


"Never" is a very, very long time







Although you are ofc correct that for the time being this is not the case. Besides, buying a console for a game that exists only on that console is one of the very few arguments I can't really counter.


----------



## Kane2207

Quote:


> Originally Posted by *Carniflex*
> 
> "Never" is very very long time
> 
> 
> 
> 
> 
> 
> 
> Although you are ofc correct that for the time being this is not the case. Besides, buying a console for a game that exists only on that console is one of the very few reasons against what I'm not able to argue somehow.


Most of the exclusives on the PS4 are Sony owned studios (Naughty Dog, SCE Santa Monica, Polyphony, Suckerpunch) so the chances of them coming to something other than a Sony branded console are really slim to none.

I purchase all cross-platform games for the PC primarily, and would switch to PC full time if I could, but I do enjoy lying back on the couch playing Uncharted/TLoU/GT/GoW/Infamous. There are very few games on PC that come close to those (3rd-person primarily); the Arkham games are good, and I enjoyed Shadow of Mordor.


----------



## iSlayer

Quote:


> Originally Posted by *lacrossewacker*
> 
> The engine is better in every way.
> 
> I suggest you read DigitalFoundrys overview of Halo 4's engine *here*.
> 
> Old tech interview - good insight into how the tech and art need to compliment each other
> 
> 
> 
> 
> Real time cutscene goodness
> 
> 
> 
> 
> 
> Enjoy....
> 
> 
> 
> Vs.
> 
> 
> 
> 
> 
> Vs.
> 
> 
> 
> 
> 
> Vs.
> 
> 
> 
> 
> 
> Vs.
> 
> 
> 
> I'll wait for a more credible source like 343 industries to say when the X1's been completely "maxed" just as naughty dog will eventually do with the PS4. It'll be at least 2 or 3 games in though.


"Credible source" "343 industries"

Bungie's surprisingly less competent replacement sure is a good source.

Anyways, the canon is past me at this point, but Captain Freaky Face makes the ACU glitch faces seem normal.


----------



## Pip Boy

Quote:


> Originally Posted by *lacrossewacker*
> 
> The engine is better in every way.
> 
> I suggest you read DigitalFoundrys overview of Halo 4's engine *here*.
> .


I suggest you read before you post. I was agreeing with you


----------



## jameschisholm

I think people expect too much from the consoles.


----------



## lacrossewacker

Quote:


> Originally Posted by *phill1978*
> 
> i suggest you read before you post. I was agreeing with you


oh









Yeah I actually spent like 30 minutes making the comment on a tablet, it wasn't necessarily just to you.
Quote:


> Originally Posted by *iSlayer*
> 
> "Credible source" "343 industries"
> 
> *Bungie's surprisingly less competent replacement sure is a good source.*
> 
> Anyways the canon is past me at this point but Captain Freaky Face makes the ACU glitchfaces seem normal.


not sure what you're getting at?

Halo 4's engine puts it at probably the second best looking game behind The Last of Us - however....try comparing the scale of those two games ^_^

Look at Halo 2 Anniversary running at 60fps compared to Bungie's Destiny. Considering Halo 2 A has to run two engines at once, it's pretty darn impressive looking.

Enjoy at 60fps if you want


Spoiler: Warning: Spoiler!


----------



## PostalTwinkie

Quote:


> Originally Posted by *jameschisholm*
> 
> I think people expect too much from the consoles.


Nope, people are just pissed the expectations set by the manufacturers and developers are far from being seen.

During all the press and buildup to the release, the term "Next Gen" was being tossed around, along with "1080/60", and here we are struggling with 900/30.
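In raw pixel throughput, the gap between the promise and the delivery is stark. A quick sketch, assuming standard 16:9 frames for both figures:

```python
# Pixels per second: the promised "1080/60" vs. the commonly delivered "900/30".
promised = 1920 * 1080 * 60    # 124,416,000 px/s
delivered = 1600 * 900 * 30    #  43,200,000 px/s
print(f"Promised ~{promised / delivered:.2f}x the throughput actually shipped")
```

Nearly a 3x shortfall on paper, before even considering settings or frame pacing.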


----------



## iSlayer

Quote:


> Originally Posted by *lacrossewacker*
> 
> oh
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah I actually spent like 30 minutes making the comment on a tablet, it wasn't necessarily just to you.
> not sure what you're getting at?
> 
> Halo 4's engine puts it at probably the second best looking game behind The Last of Us - however....try comparing the scale of those two games ^_^


Being the best looking console game is like being the hottest person ever bludgeoned with a brick.
Quote:


> Look at Halo 2 Anniversary running at 60fps compared to Bungie's destiny. Considering Halo 2 A has to run two engines at once, it's pretty darn impressive looking.
> 
> Enjoy at 60fps if you want
> 
> 
> Spoiler: Warning: Spoiler!


I've been watching it. For all the multiplayer problems, I find the graphics a bit unimportant.

And two engines at once? The stutter to reload the other engine is jarring, to say the least. It's not particularly impressive given that H2 ran on a 733 MHz P3 with a lower-midrange GPU from 2002 and a shocking 64 MB of DDR.

It'd be like me bragging about running Minecraft and CSGO at the same time.

But let's get to the heart of the matter: H2A/HCEA being a murky, blackish mess. I can't decipher anything of what's going on, and I've been playing the franchise since H1. The graphics are truly awful. Sure, they may look pretty, but functionally they're god awful. The lighting is at best unreal, embarrassed by Doom 3 if not the original Doom.

God knows how 343 botched this so hard, but man, they've done their best to screw it all up.


----------



## bucdan

Quote:


> Originally Posted by *iSlayer*
> 
> Being the best looking console game is like being the hottest person ever bludgeoned with a brick.
> I've been watching it. For all the multi problems I find the graphics a bit unimportant.
> 
> And two engines at once? The stutter to reload the other engine is jarring to say the least. Its not exactly particularly impressive given that H2 ran on a 733 MHz P3 with a lower midrange GPU from 2002 with a shocking 64MBs of DDR.
> 
> It'd be like me bragging about running Minecraft and CSGO at the same time.
> 
> But let's get to the heart of the matter, H2A/HCEA being a murky, blackish mess. I can't decipher anything of what's going on and I've been playing the franchise since H1. The graphics are truly awful. Sure, they may look pretty but functionally they're god awful. The lighting is at best unreal, being embarrassed by Doom 3 if not the original Doom.
> 
> God knows how 343 botched this so hard but man have they come their best to screw it all up.


I feel the same way about it. The graphics and scenes are just dark... instead of using lighting to hide poor textures, they use darkness lol.


----------



## w0rmk00n

I seriously can't believe these new consoles couldn't reach 1080p at 60 FPS minimum on all games. That is just sad. Come on, it's 2014.


----------



## Systemlord

Quote:


> Originally Posted by *w0rmk00n*
> 
> I seriously can't believe these new consoles couldn't reach a 1080p at 60 FPS minimum on all games. That is just sad. Like come on it's 2014.


Perhaps by 2020 consoles will be able to run at 1080p/60 fps, but by then 4K will be the norm. Consoles will always be years behind current tech; 1080p HDTV has been around for many years.


----------



## lacrossewacker

Quote:


> Originally Posted by *Systemlord*
> 
> Perhaps in 2020 consoles will be able to run at 1080/60 fps, but by that time 4K will be the norm. Consoles will always be *years behind current tech*, HDTV 1080 has been around for many years.


Depends what "current tech" is. For our demographic, you're right. To your average Joe who has an old laptop from college and whose biggest tech upgrade was an iPad, consoles are the best all-in-one devices. Not much to compare them to.

Maybe it's just me, but I only know one PC gamer, and he has a GTX 570. Pretty good compared to the usual person I overhear talking about Diablo/WoW.


----------



## Oubadah

..


----------



## iSlayer

Inferior, restricted gaming PC doesn't quite have the same ring to it.


----------



## FattysGoneWild

Quote:


> Originally Posted by *jameschisholm*
> 
> I think people expect too much from the consoles.


No. People just expect what MS and Sony said these consoles could do, even though the PC community knows better about that part. Just hyperbole with 50% actual delivery.


----------



## Shadow11377

Quote:


> Originally Posted by *Systemlord*
> 
> Perhaps in 2020 consoles will be able to run at 1080/60 fps, but by that time 4K will be the norm. Consoles will always be years behind current tech, HDTV 1080 has been around for many years.


The hardware will get better, sure. But will the devs ever learn? That is debatable. Based on the simple fact that there have been zero improvements compared to the previous generation in regards to running games at standard resolutions, I have no hope that next gen will be any better.

The people who are making games run at *792p* today will probably release games at 1584p or something equally dumb next time around. Also keep in mind some of these guys actually say they believe 30 FPS is more "cinematic" and that they designed around it, which is both hilarious and disappointing at the same time.
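The arithmetic behind the 1584p jab: 1584 is just 792 doubled, and at the same aspect ratio that means 4x the pixels (assuming a 16:9-style 1408x792 frame for "792p"):

```python
# 1584 = 792 x 2; doubling both axes quadruples the pixel count.
w, h = 1408, 792                       # assumed 16:9-style "792p" frame
print(2 * h)                           # 1584
print((2 * w) * (2 * h) // (w * h))    # 4 -- four times the pixels
```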


----------



## lacrossewacker

Quote:


> Originally Posted by *Shadow11377*
> 
> The hardware will get better, sure. But will the devs ever learn? That is debatable. Based on the simple fact that there have been zero improvements compared to the previous generation in regards to running games at standard resolutions, I have no hope that next gen will be any better.
> 
> The people who are making games run at *792p* today will probably release games at 1584p or something else equally dumb next time around. Also keep in mind some of these guys *actually say they believe 30 FPS is more "cinematic" and they designed around that,* which is both very hilarious and also disappointing at the same time.


You actually believe that?

It's PR. Let the hordes believe what they say, and those who know better realize they're obligated to maintain a positive spin on their products.


----------



## Shadow11377

Quote:


> Originally Posted by *lacrossewacker*
> 
> You actually believe that?
> 
> It's PR. Let the hordes believe what they say, and those who know better realize they're obligated to maintain a positive spin on their products.


I believe some of them actually believe what they say, yeah. Lying is totally a possibility, but it's not impossible that some of the people who say this stuff actually believe it.

Like this guy.
_"At Ubisoft for a long time we wanted to push 60 fps. I don't think it was a good idea because you don't gain that much from 60 fps and it doesn't look like the real thing. It's a bit like The Hobbit movie, it looked really weird."_
Source

The ones talking about it not being "worth it" may be reasonable, and I have a little hope for them, but this guy straight up said that 60FPS felt weird and that it was worse. (lol)


----------



## EliasAlucard

Quote:


> Originally Posted by *DoomDash*
> 
> Ubisoft kinda sucks though.


Agreed.
Quote:


> Originally Posted by *maarten12100*
> 
> It would be nice if it was to be that way but PowerPC is IBM's brainchild so it would be a dual vendor console like the previous one was. However the PS3 fat models were able to run PS2 games because they had the PS2 processors on board.


I'm sure a licensing deal could've been worked out. Especially considering that Sony, Toshiba and IBM co-developed Cell. RISC/PowerPC is superior to CISC/x86, and heck, I'd rather see an ARM processor inside the PS4, at least ARM is RISC.

Besides, after IBM lost its CPU monopoly on both consoles (now IBM's PowerPC is only inside the Wii U), I read somewhere about how IBM was considering licensing out its PowerPC architecture to other hardware manufacturers. I hope IBM does that, because it's really about damn time to bury the inferior and outdated CISC/x86 and move on to superior RISC architectures such as PowerPC and ARM. Fortunately, thanks to AMD, we seem to be headed that way now with AMD licensing ARM. I can't wait until the day I'll be able to buy a badass ARM desktop CPU from AMD.

There were PS3 models without the Emotion Engine, btw; I have one, and they played most PS2 games fine by emulating the Emotion Engine in combination with the PS2's GPU (which those PS3 models did include). I'm sure PS2 games could be emulated on the PS3 without any PS2 hardware; Sony just isn't willing to spend the money on it, but it was done on the Xbox 360 with original Xbox games.


----------



## Systemlord

Quote:


> Originally Posted by *Oubadah*
> 
> I keep seeing phrases like "jack of all trades" and "all in one" used to describe consoles, but I don't think those are accurate at all. The console's repertoire is actually very narrow. For example, their format support is severely limited, so they're not much good as media centers.
> 
> "All in one" is an accurate descriptor for the PC, even a basic one.


Yeah, right, I know. I use my computer to pay all my bills and to buy every piece of furniture and household appliance in my home, plus social media, communication with family and friends, and PC gaming, and even then I haven't covered everything!


----------



## lacrossewacker

Quote:


> Originally Posted by *Shadow11377*
> 
> I believe some of them actually believe what they say yeah. Lying is totally a possibility but it's not impossible for some of the ones who say this stuff to actually believe it.
> 
> Like this guy.
> _"At Ubisoft for a long time we wanted to push 60 fps. I don't think it was a good idea because you don't gain that much from 60 fps and it doesn't look like the real thing. It's a bit like The Hobbit movie, it looked really weird."_
> Source
> 
> The ones talking about it not being "worth it" may be reasonable, and I have a little hope for them, but this guy straight up said that 60FPS felt weird and that it was worse. (lol)


Do you think they'd want to hype up 60fps before releasing a game that would struggle to even run at 30fps?

There's a difference between saying "we're targeting 30fps for that _cinematic_ feel" and "we had to settle for 30fps because our game actually struggles to hit 20fps in some parts."

If they were asked "why didn't you target 60fps?", they can't answer in a way that puts down their own games. Saying "we know 60fps is so much better, sorry our console versions only run at 30fps" is how you upset a large part of your consumer base.

Think of the X1 and PS4 launches: the X1 was blunt about all of its features and the reasoning behind its Xbox Live requirement (the fan base brought out the pitchforks), while the PS4 covered up its lack of features by going the "we're gaming focused" route. That was just PR spin for "we were unable to meet our goals for launch, please wait 8-12 months."

Not trying to ruffle any feathers with that last bit, just a relevant example that I can think of.


----------



## hojnikb

Quote:


> Originally Posted by *EliasAlucard*
> 
> Agreed.
> I'm sure a licensing deal could've been worked out. Especially considering that Sony, Toshiba and IBM co-developed Cell. RISC/PowerPC is superior to CISC/x86, and heck, I'd rather see an ARM processor inside the PS4, at least ARM is RISC.
> 
> Besides, after IBM lost its CPU monopoly on both consoles (now IBM's PowerPC is only inside the Wii U), I read somewhere about how IBM was considering licensing out its PowerPC architecture to other hardware manufacturers. I hope IBM does that, because it's really about damn time to bury the inferior and outdated CISC/x86 and move on to superior RISC architectures such as PowerPC and ARM. Fortunately, thanks to AMD, we seem to be headed that way now with AMD licensing ARM. I can't wait until the day I'll be able to buy a badass ARM desktop CPU from AMD.
> 
> There were PS3 models without the Emotion Engine btw; I have one, and they played most PS2 games fine by emulating the Emotion Engine in combination with the PS2's GPU (which was included in those PS3 models). *I'm sure PS2 games could be emulated on the PS3 without any PS2 hardware; Sony just isn't willing to spend the money on it for that, but it was done with the Xbox 360 and Xbox games.*


Actually, that's already doable with hacked consoles. By using the internal PS2 emulator for the PSN PS2 releases, you can run lots of PS2 games that way. Not all of them, though, since there are issues.


----------



## SpeedyVT

Quote:


> Originally Posted by *lacrossewacker*
> 
> Do you think theyd want to hype up 60fps before releasing a game that would struggle to even run at 30fps?
> 
> Saying "we're targeting 30fps for that _cinematic_ feel" vs "we had to settle with 30fps because our game is actually struggling to run 20fps at some parts of the game"
> 
> If they were asked "why didn't you target 60fps" they can't answer in a way that puts down their own games. Saying, "we know 60fps is so much better, sorry our console versions only run at 30fps" is how you upset a large part of your consumer base.
> 
> Think of it as the X1 and PS4 release, X1 was blunt about all of their features and XbxLive requirement reasonings (fan base brought out the pitchforks) , while the PS4 covered up its lack of features by going the "we're gaming focused" route. That was just PR spin for, "we're unable to meet our goals for launch, just wait 8-12 months" please.
> 
> Not trying to ruffle any feathers with that last piece, just a relevant example that i can think of.


These games are poorly designed. We've got to hold on and wait for actually good games designed around the console and not the PC.


----------



## PostalTwinkie

Quote:


> Originally Posted by *SpeedyVT*
> 
> These games are poorly designed. We've got to hold on and wait for actual good games designed around the console and not computer.


That won't magically change the hardware that is in the consoles............

I, I just.....


----------



## Redwoodz

Seems ridiculous. There are games that hit a performance wall on a six-core Haswell-E and a GTX 980 under certain conditions. A console was never meant to match a high-dollar gaming rig. If you design a game that overtaxes the hardware, then obviously you fail. Grow up and stop blaming your ineptitude on the hardware.


----------



## lacrossewacker

Quote:


> Originally Posted by *PostalTwinkie*
> 
> That won't magically change the hardware that is in the consoles............
> 
> I, I just.....


To be fair, Assassin's Creed is an ambitious game. Technical issues aside, and regardless of all the graphical-hiccup memes, it is a gorgeous game at times. We won't see a revolutionary change if they keep aiming for these extremely demanding scenarios; having hundreds of NPCs on screen is going to be rough in any game.

However, we're already seeing the benefits of time with these consoles.

Exhibit A


Exhibit B


Sure enough, we can bet BF5 will look better than BF4, Halo 5 will look better than Halo 2 Anniversary, Gears of War will most likely usher in a new level of graphical fidelity for the X1, and Naughty Dog will likely do the same for the PS4.

I don't know how people can jump to these conclusions SO SOON with these consoles. Should we still judge the X360 on Perfect Dark Zero's fidelity and the PS3 on Genji?

Just in case you didn't see earlier



Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *lacrossewacker*
> 
> The engine is better in every way.
> 
> I suggest you read DigitalFoundrys overview of Halo 4's engine *here*.
> 
> Old tech interview - good insight into how the tech and art need to complement each other
> 
> Real time cutscene goodness
> 
> Enjoy....
> 
> Vs.
> 
> Vs.
> 
> Vs.
> 
> Vs.
> 
> I'll wait for a more credible source like 343 industries to say when the X1's been completely "maxed" just as naughty dog will eventually do with the PS4. It'll be at least 2 or 3 games in though.


----------



## iSlayer

I made games using MS paint tools for graphics that look better than MCC lol. I'd be shocked if they somehow managed to make the Halo series literally and metaphorically worse. That'd be an achievement in awfulness.


----------



## lacrossewacker

Quote:


> Originally Posted by *iSlayer*
> 
> I made games using MS paint tools for graphics that look better than MCC lol. I'd be shocked if they somehow managed to make the Halo series literally and metaphorically worse. That'd be an achievement in awfulness.


Tell me what was so bad about it, because I think it looks pretty good. Shrine, the Halo 2 Anniversary multiplayer map, is gorgeous IMO.


----------



## umeng2002

No one is forcing devs to make a game a certain way. If their title chugs in spots, it's their fault. It's that simple. The reason Nintendo games on the Wii U run well is that Nintendo knows the limits of its system and doesn't try to shoehorn in effects and designs that won't run well.

There will be optimizations, so I don't think they've "hit a wall." They've hit a wall using current programming techniques. These consoles have only been out for a year. Look at Perfect Dark Zero versus The Last of Us.


----------



## VeerK

Quote:


> Originally Posted by *umeng2002*
> 
> No one is forcing devs to make a game a certain way. If their title chugs in spots, etc. It's all their fault. It's that simple. The reason why Nintendo games on the Wii U run well is that they know the limits of their system and don't try to shoehorn in effects and designs that won't run well.
> *
> There will be optimizations, so I don't think they've "hit a wall." They've hit a wall using current programming techniques. These consoles have only been out for a year. Look at Perfect Dark Zero and The Last of Us.*


Yes, but can we really expect such dramatic jumps from "optimization"? I mean, the PS3 used the Cell architecture, whereas the PS4 uses x86, and x86 isn't some isolated, console-specific instruction set to squeeze hidden gains out of.


----------



## Neo_Morpheus

I remember back in the day when the Amiga 500 came out and the classroom split into different groups. I was in the "the PC will eventually win over your console" camp.
Even now, looking at the latest consoles, they are even more noob-oriented: no keyboard, no mouse. How the heck are you supposed to play the latest MMO or an advanced game with just a few buttons on a controller? No, people, we need to see these consoles for what they are: simple shooters to keep lazy couch potatoes amused.


----------



## aberrero

Quote:


> Originally Posted by *VeerK*
> 
> Yes, but can we really expect to see such dramatic jumps due to "optimization"? I mean, PS3 used cell architecture whereas PS4 uses x86, the difference being x86 isn't an isolated coding set.


No, we can't.

1. The X360 and PS3 had GPUs that rivaled the highest-end gaming PCs, so nobody had tried to harness that much power before. Since these consoles are more midrange, games targeting high-end PCs can carry over pretty easily.
2. As you said, they both had unique architectures that nobody had optimized for before, including PowerPC chips in both and the Cell processor in the PS3.
3. A lot of the optimization work that can be done has already been done. Game developers come up with more efficient rendering techniques all the time, and the X1/PS4 can take advantage of all the optimizations made for the prior generation. My guess is that making games run more efficiently will keep getting harder over time.


----------



## Gnomepatrol

Quote:


> Originally Posted by *Master__Shake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BinaryDemon*
> 
> I'd like to say I think the dev's for the PS4 and XB1 are doing a great job optimizing for consoles, and it should be sort of obvious that they would hit a performance wall this early.
> 
> When the Xbox 360 and PS3 launched in 2005/2006 their hardware was arguably equal or better than the best consumer PC hardware at the time. Sure PC's quickly surpassed them, but they were designed to be powerful machines with the hopes they would last 10 years. This generation neither MS or Sony took that approach. When both launched in 2013, it was obvious that even my previous system an aging i5-2500k + GTX580 offered a better gaming experience. Both MS and Sony tried to save money and opted for what they believed would be the hardware required for 1080p gaming. I believe both under-estimated this, obviously MS worse than Sony, and that's why both are having trouble with 1080p/60fps.
> 
> *It's going to be a long ~8 years, before the next version consoles are ready.* I bet MS and Sony are hoping that the cost of hardware for 4K Gaming @ 60fps drops significantly in that time.
> 
> i seriously doubt it'll be 8 years before the next iteration of the PlayStation or Xbox.
> 
> 5 is a stretch.
> 
> 4k in every home in 5 years is probably not going to happen either.
> 
> cable tv moves too slow.

You are assuming that cable TV will still be around in 5 years as we have it today.


----------



## iSlayer

Quote:


> Originally Posted by *lacrossewacker*
> 
> Tell me what was so bad about it - because I think it looks pretty good. Shrine, the Halo 2 Anniversary multiplayer map is gorgeous IMO


I've said something similar before, but the issue is that whatever work the artists put in seems to have been undone by the lighting, which is just that bad. Everything exists in a blackish miasma, and what isn't blurred into incomprehensible noise is so unreally lit it's like Master Chief started taking LSD instead of steroids.

Now, I'm not above truly bad graphics; see the years I spent playing Q3 with a stripped-down config. Graphics have to be truly awful and functionally broken for me to start caring, and in Halo's case they now are. H1-3 have realistic colors with fictional but not disruptive coloring. I can stay immersed in H1-3, but with the new graphics I'm constantly nagged by the question of just what is three feet away from me, which makes combat functionally worse. Then a fight will end, I'll be walking down a hall, everything will have a blue filter over it, and it all just feels wrong, like I'm playing a bastardized version of one of my favorite franchises... oh wait.


----------



## GraveDigger7878

For the dude I'm too lazy to find and quote: when you hear "all-in-one" thrown around for a console, it suggests the unit can accomplish all of those tasks. "Jack of all trades," however, is not a compliment: jack of all trades, master of none. Mediocre at everything.

At any rate, and I'm hoping some of you agree with me on this, for the last 5 years I have been preaching that Microsoft and Sony need to release $999.99 consoles. They need to pack in every last cent of current tech so that down the road they can lower prices and build on performance through optimization. Most of my friends are PC master race, so we kind of threw together ideas of how the consoles should be made to benefit gaming as a whole. That's what we thought.


----------



## moccor

Quote:


> Originally Posted by *GraveDigger7878*
> 
> For the dude who I am to lazy to find and make a quote; When you hear "all-in-one" thrown around for the console it suggests that the unit can accomplish all of said tasks. "Jack of all trades", however, is not a compliment. Jack of all trades, master of none. Mediocre at everything.
> 
> At any rate, I am hoping some of you agree with me on this, for the last 5 years I have been preaching that Microsoft and Sony need to release $999.99 consoles. They need to pack out every last cent with current tech so that down the road they can lower prices and build on performance through optimization. Most of my friends are PC master race so we kind of threw together ideas of how the consoles should be made to benefit gaming as an entirety. That is what we thought.


Yeah, if it were say $800 for a PS4 instead of $400 and it included an i5/i7 and a discrete GPU (not a shtty APU), they could literally skip a whole console generation and have one that isn't a sad excuse for a console.


----------



## Hawk777th

My only comment on this thread would be: who cares? I am a long-time gamer and collector, I love pretty graphics, and I built my rig to enjoy PC games, but I have to ask the question:

Since when did graphics make games fun? Ever? Isn't fun the point of games?

I have built rigs to play graphics-intensive games and benchmarks, but the novelty of the visuals wears off after you realize how crap 90% of the AAA games market is now. I would much rather devs worry about making fun games than about RES FPS OMG SHADERS ETC., because at the end of the day none of those things makes mainstream games more fun to play.

All the graphics in the world didn't make Crysis 3 a fun or good game, IMO. People remember their favorite games for a lifetime, but if a game is just a tech demo, they forget it as soon as the next shiny system-builder game comes out; it's old news in the graphics department. It seems to me people remember fun experiences more than SSAA.

I would also add: compare Halo 3's graphics to Halo 4's. They don't even look like they're on the same system. We have a ways to go in optimization.

I will admit it's fun to see what tech can do today compared to the old days, but that wears off quickly in a 4-hour session with a game that just isn't fun to play.

Also, consoles have always been built to a price point and will always be behind the tech curve, even with direct-to-hardware coding. I own both a PS4 and an XONE, but I don't play on them thinking they're the best graphics experience possible. I play games to have fun, not to have a spec war.

Just my .02c


----------



## moccor

Quote:


> Originally Posted by *Hawk777th*
> 
> I have built rigs to play graphics intensive games and benchmarks. But the novelty of the visuals wears off after you realized how crap 90% of the AAA games market is now. I would much rather worry that devs are making fun games than worrying about RES FPS OMG SHADERS ETC.


The main problem is that PCs are fully capable of enhanced settings in all those categories with almost zero effort; publishers just lie to consumers and act like the PS4, Xbone, and PC versions are comparable. Two tools that come to mind and fix a lot of screwups are GeDoSaTo and SweetFX. Many issues are easily fixed with these tools, which don't actually do anything difficult programming-wise.


----------



## ivymaxwell

I have a good PC, but I still own a PS4. I like being able to plop down and play on the couch with the nice PS4 controller: no changing settings, no clicking around, no sitting in a computer chair or switching the HDMI out to the big monitor, etc. The PS4 is also a complete system for $400, with Blu-ray, Naughty Dog games, and GTA V. Sports games are best on a console, sitting in your bed or chair with a comfy controller in front of your giant-screen TV, relaxing. My PC is good, but I'm enjoying my PS4.

Oh yeah, and one series that you must play: the METAL GEAR SOLID games.

MGS1, MGS2, MGS3, MGS4, the MGS5 prologue, and MGS5: The Phantom Pain.

Also Red Dead Redemption.


----------



## Mattbag

Quote:


> Originally Posted by *ivymaxwell*
> 
> i have a good pc. but i still own a ps4. i like the pop and play on your couch with nice ps4 controller, no changing settings, clinking around or sitting in a computer chair or changing hdmi out to big monitor etc. also ps4 is a complete system for 400$ and blu ray and playing naughty dog games, gtav. sports games are best on a console sitting in your bed or chair and playing with a comfy remote controller on your giant screen tv relaxing. my pc is good but im enjoying my ps4.
> 
> oh yeah and one series that you must play. METAL GEAR SOLID games.
> 
> mgs1 mgs2 mgs3 mgs4 mgs5prelogue mgs5phantompain.
> 
> red dead redemption.


I agree. My PC isn't bad, but I save it for MMOs, RTS games, and online blockbusters like Borderlands, Crysis, and Dark Souls. For other games, like the cheesy triple-A titles and action games, I prefer to sit back on the couch; not to mention that with GameFly I just rent a game and send it back.


----------



## hellojustinr

LOL at this article and every other PC master race troll saying the Xbox One and PS4 are hitting a "performance wall" already.

The Xbox 360 and PS3 were hitting a performance wall way back in 2008 with the release of Grand Theft Auto IV, and they are still playing modern games like Far Cry 4 today, albeit at low settings in HD. Heck, GTA V was optimized well enough to release on them last year with the same frame rate and resolution as GTA IV. "Performance wall" means nothing to consoles; all consoles hit the performance wall early on.

Xbox 360 and PS3 equivalents in terms of raw PC specs:
Triple-core first-gen AMD Phenom
NVIDIA GeForce 7950 GT and ATI Radeon X1800 (not even the higher-end 7950 GTX or X1950 XTX versions)
256MB/512MB of RAM

Yet I'd like to see a PC with similar specs handle the same games those consoles are running now.

Games like MGS4 don't seem possible whatsoever on a GPU like the 7950 GTX when I think about it from the PC hardware side of things, yet there it is on the console.
___________________________________________________________________________________
That's not to mention that our GPUs have multiplied in power twentyfold (exaggerating) in the past seven years.

Example

NVIDIA point of view (easier to follow):
late 2006
7950 GTX --> 8800 GTX G80 TESLA (the latter exceeded the performance of even two 7950 GTXs in SLI (the 7950 GX2); improvement = 2.5x, literally)

early 2008
8800 GTX G80 TESLA --> 9800 GTX+ G92 TESLA (a really minor performance improvement, more for power savings, if any at all; improvement = 1.1x)

mid 2008
9800 GTX G92 TESLA --> GTX 280 MATURE TESLA (easily two times faster than the G92 chip, a beast for its time; improvement = 2.2x)

2010
GTX 280 MATURE TESLA --> GTX 480 FERMI PROTOTYPE (ran extremely hot because it was early Fermi, but it was able to compete with the GTX 295, which was essentially two GTX 280s; improvement = 2x)

late 2010
GTX 480 FERMI PROTOTYPE --> GTX 580 MATURE FERMI (the mature version of Fermi, and faster as a result; improvement = 1.4x)

2012
GTX 580 MATURE FERMI --> GTX 680 KEPLER (in one particular tech demo, a single 680 was able to outpace three 580s in SLI, which showed the huge generational leap, plus the power-efficiency gains it made (e.g. less heat); improvement = nearly 2.5x-3x)

2013
GTX 680 KEPLER --> GTX 780 TI IMPROVED KEPLER (just a more mature Kepler in my opinion; improvement = 1.5x)

2014
GTX 780 TI IMPROVED KEPLER --> GTX 980 MAXWELL (focused on power savings, which shows NVIDIA is refocused on mobile; improvement = 1.3x????)
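Multiplying out the per-generation factors listed above gives the cumulative jump from the 7950 GTX. A quick sketch using the post's own ballpark numbers (not measured benchmarks; the "2.5x-3x" Kepler entry is taken as 2.75x here):

```python
# Cumulative GPU speedup implied by the rough per-generation factors
# above (the post's own ballpark estimates, not measured benchmarks;
# the "2.5x-3x" Kepler entry is taken as 2.75x here).
factors = [
    ("8800 GTX",   2.5),
    ("9800 GTX+",  1.1),
    ("GTX 280",    2.2),
    ("GTX 480",    2.0),
    ("GTX 580",    1.4),
    ("GTX 680",    2.75),
    ("GTX 780 Ti", 1.5),
    ("GTX 980",    1.3),
]

cumulative = 1.0
for card, factor in factors:
    cumulative *= factor
    print(f"{card:11s} x{factor:<4}  cumulative vs. 7950 GTX: {cumulative:.1f}x")
```

By these (generous) factors the 980 lands around 90x a 7950 GTX, so the "times 20" figure above is, if anything, an understatement of the post's own arithmetic.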

Conclusion:
With all these multiplied increases in performance, it feels as if games are not looking that much better, yet they have much higher requirements than ever before. AC Unity apparently requires a GTX 680 MINIMUM just to run, which is just BS when the consoles are effectively using near-GTX 650-level GPUs.

30FPS at 900p seems reasonable for a badly optimized game on a console with a GTX 650-level GPU, but 30FPS on a GTX 680 is just ridiculous (this is based on personal experience playing with an extra 680 I have lying around).

Going by PC nomenclature, stepping up from a GTX 650 to a 660 is already a big increase, from a 660 to a 660 Ti a much larger one, and from a 660 Ti to a 670 bigger still. So why is the 680 the minimum?

I feel we're being denied proper optimization, and many people ignore it because GPUs have gotten so cheap that you can get a pretty decent used GPU for $130 (an HD 7950 or GTX 670 goes for a solid $150 used). Heck, with the release of the 970, effectively 780 Ti-level performance is down to the $350 price point, which effectively pulls down all lower-level GPUs as well.
_________________________________________________________________________________
Games with proper optimization: Dishonored (runs consistently regardless of hardware and scales properly with raw hardware performance), COD Black Ops II, COD Ghosts.

Games with horrible optimization: Battlefield 4 (to some degree), GTA IV, AC Unity, Metro 2033.

In PC people's minds, the harder a game is to run the more intensive it is, but that is not always the case. A game like Metro: Last Light does not look anywhere near as stunning as Crysis 3, yet it demands the same raw performance.

Consoles and PCs are NOT comparable whatsoever. Time and time again it has always been like that, and the transition to x86-64 architecture still won't change it. The only thing we have going for us is that, with the consoles featuring AMD x86-64 architecture, many of the optimizations devs make on the consoles will eventually carry over to us when games are ported to PC, since the consoles are essentially GCN-architecture PCs.

The time developers will have to develop specifically for the current consoles' silicon will be far longer than the pathetic life cycles of our GPU architectures (Fermi, anyone? That was just three years ago). Flagships from just two or three years ago are being quickly forgotten, and those GPUs alone cost $400 when they first came out: the price of one flagship next-gen console. PC enthusiasts just tell people to upgrade and upgrade and upgrade, and for what reason? I feel we should have reached the point where we don't need to upgrade so frequently anymore.

Just some food for thought (a really long one at that). Don't take the jabs too seriously, but I think we all know some part of this rant is true in some respect. I wish I could get my message across more clearly, but it's hard to organize all my thoughts on this subject. Hope you get the idea.

I personally think the PC community as a whole should start demanding better optimization rather than just telling each other to upgrade, time and time again

(e.g. Ubisoft tsk tsk, Watch_Dogs and now AC Unity)


----------



## umeng2002

Quote:


> Originally Posted by *VeerK*
> 
> Yes, but can we really expect to see such dramatic jumps due to "optimization"? I mean, PS3 used cell architecture whereas PS4 uses x86, the difference being x86 isn't an isolated coding set.


Yes, but maybe not to that extent. Most AAA games were tailored to the PS360's specs up until only the most recently released titles. Most console devs hadn't done graphics beyond DX9-type effects until this past year, and they haven't had GPU compute to really use until the last year. In short, most devs are used to the old systems; their PC ports were usually handled by "someone else," so it's only recently that most devs have wrapped their heads around DX11 graphics, GPU compute, etc.


----------



## Carniflex

Quote:


> Originally Posted by *umeng2002*
> 
> Yes, but maybe not to that extent. Most AAA games were tailored to the PS360's specs up to well only the most currently released games. Most console devs haven't don graphics beyond dx9 type effects until this past year. They haven't had GPU compute to really use until the last year. In short most devs are use to the old systems - usually their PC ports were handled by "someone else," so its only been recently that most devs have wrapped their head around dx11 graphics, GPU compute, etc.


Perhaps, although as far as I understood it, code development even for last-gen consoles still happened on x86-based platforms. x86 is reasonably well documented, with established development practices, etc. Granted, the latter might be part of the issue with the first releases on the new consoles, which give games six somewhat anemic cores. If you approach them with "standard x86 practice," you'll probably end up with something that would be optimal on a single powerful 4-5 GHz core plus a fairly powerful discrete GPU, as opposed to six anemic cores and relatively large floating-point compute resources sharing the same memory space. This is a somewhat "new" paradigm, although devs who worked on last-gen consoles should be more familiar with optimizing for multiple cores than traditionally PC-centered devs are.

The lamentation would probably be a lot quieter if the consoles had, say, a Pentium G3258 at 4.4 GHz and something like a mildly underclocked 7950 or 760 Ti.
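The one-fast-core vs. six-slow-cores tradeoff described above can be put in rough numbers with Amdahl's law; the clock speeds below are illustrative assumptions, not the consoles' actual specs:

```python
# Amdahl's-law sketch: relative throughput of six slow cores vs. one
# fast core as the parallelizable fraction of the workload varies.
# Clock speeds (1.6 GHz vs. 4.4 GHz) are illustrative assumptions.

def throughput(clock_ghz: float, cores: int, parallel: float) -> float:
    """Throughput relative to a 1 GHz single core, per Amdahl's law."""
    serial = 1.0 - parallel
    return clock_ghz / (serial + parallel / cores)

for p in (0.0, 0.5, 0.9, 0.99):
    six_slow = throughput(1.6, 6, p)   # six anemic console-style cores
    one_fast = throughput(4.4, 1, p)   # one fast desktop-style core
    print(f"parallel fraction {p:4.2f}: "
          f"6x1.6GHz = {six_slow:5.2f}, 1x4.4GHz = {one_fast:5.2f}")
```

With these assumed clocks, the six slow cores only pull ahead once roughly three quarters of the work parallelizes, which is the "new paradigm" the post is pointing at.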


----------



## Pip Boy

Quote:


> Originally Posted by *hellojustinr*
> 
> Consoles and PCs are NOT comparable whatsoever. Time and time again it has always been like that and with their transition to x86-x64 architecture, it still will not change. The only thing we have going for us is, with consoles featuring x86-x64 AMD architecture, many of the optimizations devs make on the consoles we'll eventually transition to us when they are ported over to PC since the consoles are essentialy GCN architecture PCs.


You're wrong here.

It SHOULD, but it's not happening that way. Instead, devs are being pushed to ship games faster and faster, taking that wonderful new cross-platform x86 architecture and applying it wholesale to every platform without any real deep optimisation.

It's now much more financially savvy to take the PC version (all console games start on some sort of PC) and literally plonk it on the console on low settings with everything uncompressed. If anything, the opposite of what you're claiming is happening. For example, raw audio files are left in place instead of being compressed, through sheer laziness, the greed of rushing a title, or simply because the extra RAM allows the waste, so titles that used to be 12, 15, or 18 GB are reaching up to 50 GB to download, all so they can load quicker without a CPU decompression hit. Raw textures get the same treatment so the console can load quicker. All this means PC gamers need 50 GB per game to download, which is _utterly ridiculous_ given what most games offer, and 6 GB VRAM cards to play at 'ultra'.
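
The raw-audio bloat is easy to sanity-check with some back-of-envelope arithmetic. The 20 hours and the 10:1 lossy ratio below are made-up illustrative figures, not numbers from any actual game:

```python
# Illustrative arithmetic: size of raw PCM audio vs a lossy-compressed
# version. Duration and compression ratio are assumptions, not game data.

def pcm_size_gb(hours, sample_rate=48_000, bit_depth=16, channels=2):
    """Raw PCM size in GB (decimal) for the given duration."""
    total_bytes = hours * 3600 * sample_rate * (bit_depth // 8) * channels
    return total_bytes / 1e9

raw_gb = pcm_size_gb(hours=20)      # ~20 h of dialogue, music, effects
lossy_gb = raw_gb / 10              # lossy codecs reach ~10:1 with ease

print(f"raw PCM: {raw_gb:.1f} GB, lossy: {lossy_gb:.1f} GB")
# 20 h of stereo 48 kHz / 16-bit PCM is ~13.8 GB on its own -- a big
# chunk of a 50 GB download that compression would mostly erase.
```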

This shared architecture could have been a wonderful thing for PC gamers, and in some cases it will be. But in the wider picture that's emerging, this is not happening at all. It has been done to keep costs down from the very inception of the consoles' life.

Microsoft and Sony KNEW devs could make more games and shovel them out quicker with shared architecture; they also KNEW the hardware would be quicker and cheaper to manufacture.

It's nothing to do with gaming; it's to do with business, and it's a race to the bottom for maximum profit.


----------



## nleksan

Hey, some of us are more appreciative of finally having uncompressed audio than of uncompressed textures, and I am pretty confident the former is not as hard on space as the latter. (I record, mix/edit, master, and produce music, which I started after becoming fed up with 99.99% of recordings being victims of the Loudness War; their lack of dynamic range makes them sound like trash whether you have the MP3 at 128 kbps or 320 kbps, FLAC, or the original studio master, so I record with the purpose of retaining all the subtleties that don't exist in mass-consumer music.)

I am not a texture artist, but I do SOME photography and editing, and I just don't think the minute decrease in visual quality from compressed textures (I have displays MORE than capable of showing the difference) can pass ABX testing, not by myself or anyone else, provided the implementation is not complete junk.

On the other hand, there are those of us with trained ears who use real, high-end audio equipment (there exists no "gaming" audio product, i.e. speakers or headphones, whose sound quality even begins to approach "not painful"). It's an admittedly small group, and one I've personally found to exclude most self-proclaimed "audiophiles," consisting instead mostly of people who work with audio professionally. (If you believe the power cord, speaker wire, or anything like that makes a difference AT ALL in sound, provided you aren't comparing 24 AWG junk with 8 AWG pure silver while pushing 500 W+ to EACH speaker, then you fall into the "audiophile" group.)

Point being, if devs spent even twice as much time on audio as they do on other useless ego-stroking who-cares features, it would DRASTICALLY improve the immersion, the sense of presence, and so on... 0.3% less blurry textures, however, are useless.


----------



## SpeedyVT

Quote:


> Originally Posted by *VeerK*
> 
> Yes, but can we really expect to see such dramatic jumps due to "optimization"? I mean, PS3 used cell architecture whereas PS4 uses x86, the difference being x86 isn't an isolated coding set.


You'd be surprised. x86 isn't truly used to its fullest potential; that's constantly masked by faster and faster processors. The consoles using x86 is a blessing in disguise for us PC users. If developers learn to exploit x86, that could come back around and benefit us in full swing, giving us maybe multitudes more performance.


----------



## Redwoodz

Devs have ditched developing gameplay for eye candy, and it is no wonder they hit a wall; all they do is try to make the best-looking visuals. Watch this video on the history of graphics.




This just in: they have discovered the cause of the low framerates in AC Unity on PC. Not the hardware; the instruction queue is the culprit.
http://www.kitguru.net/gaming/matthew-wilson/ubisoft-has-discovered-acu-low-frame-rate-cause/


----------



## Dyson Poindexter

Quote:


> Originally Posted by *Redwoodz*
> 
> This just in: they have discovered the cause of the low framerates in AC Unity on PC. Not the hardware; the instruction queue is the culprit.
> http://www.kitguru.net/gaming/matthew-wilson/ubisoft-has-discovered-acu-low-frame-rate-cause/


Good to see that they are getting close to finishing their game!


----------



## lacrossewacker

Quote:


> Originally Posted by *Dyson Poindexter*
> 
> Good to see that they are getting close to finishing their game!


lololol


----------



## jubjub532

Quote:


> Originally Posted by *nleksan*
> 
> Point being, if devs spent even twice as much time on audio as they do on other useless ego-stroking who-cares features, it would DRASTICALLY improve the immersion, the sense of presence, and so on... 0.3% less blurry textures, however, are useless.


Most people don't have an audio setup coming anywhere near close enough to tell the difference, and unfortunately a lot of people don't care anyway. When I got my DT990 Pros as my first decent headphones last year, I was pretty disappointed at how few games have high-quality audio.

Consoles always hit a performance wall soon after release, but they will just keep releasing games anyway. What I hate about it is that games are then often made worse so that they can be played on consoles, and the PC version always suffers.


----------



## hellojustinr

Quote:


> Originally Posted by *SpeedyVT*
> 
> You'd be surprised. x86 isn't truly used to its fullest potential; that's constantly masked by faster and faster processors. The consoles using x86 is a blessing in disguise for us PC users. If developers learn to exploit x86, that could come back around and benefit us in full swing, giving us maybe multitudes more performance.


^^ THIS

"X86 isn't truly used to it's fullest potential constantly masked by faster and faster processors."

Speaking from experience, I had to go through like 10 different upgrade cycles throughout the Xbox 360's lifetime (2005 to 2013) to keep up with the games being released for the 7th generation (albeit at higher texture resolutions).


----------



## Ramzinho

I am seriously laughing so hard.


----------



## CleanSweep

The hardware on both systems is truly anemic, I'm skipping this generation of consoles and waiting on the PS5. Y'all know the next gen console CPUs are basically AMD E-series but with 8 cores, right?


----------



## nleksan

Nobody "knows" what the next consoles will use.

I have a feeling that is a rumor from a console forum. And while people are going to fight about PC vs. consoles until the day we become extinct, the one undisputed, uncontested, fully agreed-upon thing between both "sides" is that PC has an incredible online community with an absolutely insane amount of knowledge and experience, which we happily share with those who need it; console forums, well, there is no console forum that is not situated in the internet's anatomical equivalent of the taint.

The next consoles, I predict, will be hugely fragmented, as the money, and consequently the power, has been shifting toward developers/publishers and away from MS/Sony for some time. I may be a cynical misanthrope, but people really are stupid, and stupid people with money are easily relieved of it by companies that defy all logical business principles by turning profits on DAY ONE of the release of a barely recycled game. The series started because the first game (intended as a standalone) sold well, so we clearly want 18 sequels, 3 reboots, 4 "re-released and remastered in true full HD" versions of the originals, and two franchise spin-offs for a total of another 13 games, all of them mind-bendingly horrible. And the new game, despite being a fresh coat of paint on an existing entity, is released in a state so bad that the majority of people can't play it; a massive day-one patch means they were fully aware how inexcusable the state of the game was, but it will never be fully resolved, since patches =/= profits.
The developers will be steering the ship if things don't change...

And that's the thing. Things WILL NOT CHANGE; no one will fix things; no one cares. The problem with people is that we all expect someone else to fix things, and rationalize to ourselves however we can why we sit around complaining, aka doing exactly nothing that is of benefit to exactly no one.

We can maybe cause temporary disruption, enough to change things, but the overwhelming majority of people who organized just enough to have an impact will depart the second they feel they've actually done something. That's why companies sued by individuals can offer a paltry amount as a settlement: even when the plaintiff has an absolutely unquestionable, bulletproof case worth $90 million, and an offer to walk away RIGHT NOW with a check for $8 million before taxes, THE OVERWHELMING MAJORITY take the 8 mil (more like 3 after getting bent by the bank/govt), and when asked later they all regret it immensely and "don't know what they were thinking." The companies/lawyers know that you immediately start thinking of problems you have currently that money will "fix," and you become, literally, physiologically incapable of logically thinking and processing more than a couple of weeks, or at the very most months, ahead. The offer will have an "expiration date" that conveniently prevents you from consulting multiple people and/or sleeping on it for two nights, because both things help (the former is immediate, but only if the person consulted is a "renowned expert in the field," or occasionally a loved one, a spouse, or a truly admired close family member knowledgeable about the issue, and it's only effective in keeping the person from stupidly taking the "piss-off money" about 10% of the time; 72 hours or 3 nights' sleep, however, is more like 90%).

Basically, everyone wants a better world, almost everyone thinks they are doing everything they can, and fewer than one hundredth of one percent actually are.

We don't want to forego the potential for fun to make a point, because we all know that no one else cares enough to make a difference.

Whoever solves this, a fundamental part of human psychology and social psychology, well, forget the little Nobel Foundation, or the President, or any lines on a map... they will have indelibly altered the future of the species in a uniquely, purely positive way, and will become legend.


----------



## EliasAlucard

Quote:


> Originally Posted by *SpeedyVT*
> 
> You'd be surprised. X86 isn't truly used to it's fullest potential constantly masked by faster and faster processors. The consoles using x86 is a blessing in disguise for us PC users. If the developers learn to exploit x86, that could come back around benefiting us full swing. Give us maybe multitudes more performance.


Look, x86 is crap and it doesn't have much left before it's fully maxed out in hardware performance. Sure, software developers may not have squeezed out every drop of x86's capabilities, but RISC-based processors, such as the PowerPC-based Cell CPU in the PS3 and ARM-based mobile devices, are inherently superior because the RISC instruction set doesn't require as much wattage and heat at the same performance level. Imagine a RISC processor running at 220 watts. In the case of the AMD FX 9590, that gets you 5 GHz, which is a lot, but a RISC-based PowerPC or ARM drawing 220 watts would easily give us a 16-core CPU at 10 GHz per core, if not more, or something like that. Look at modern mobile devices such as the Snapdragon 800, which in spite of being passively cooled and running at very low wattage, packs half the processing power of an Intel i5:

http://cpuboss.com/cpus/Qualcomm-Snapdragon-800-vs-Intel-Core-i5-3550S
http://www.anandtech.com/show/7082/snapdragon-800-msm8974-performance-preview-qualcomm-mobile-development-tablet/6

Facts are, CISC/x86 is ancient technology and seriously needs a retirement.

That's why the PS4 and XB1 both using the x86-64 instruction set was an inferior solution. I'm glad Sony went with AMD, because the Intel monopoly must be opposed, but what Sony really should have done is continue with the superior PowerPC-based Cell.


----------



## Tojara

Quote:


> Originally Posted by *EliasAlucard*
> 
> Look, x86 is crap and it doesn't have much left before it's fully maxed out in hardware performance. Sure, software developers may not have squeezed out every drop of x86's capabilities, but RISC based processors such as the PowerPC based Cell CPU in the PS3, and ARM based mobile devices, are inherently superior because the RISC instruction set doesn't require as much wattage and heat at the same performance level. Imagine a RISC processor running at 220 watt? In the case of the AMD FX 9590, you'll get 5GHz, which is a lot, but a RISC based PowerPC or ARM requiring 220 watt would easily give us a 16 core 10GHz per core CPU, if not more, or something like that. Look at modern mobile devices such as the Snapdragon 800, which in spite of being passively cooled and running at very low wattage, packs half the processing power of an Intel i5:
> 
> http://cpuboss.com/cpus/Qualcomm-Snapdragon-800-vs-Intel-Core-i5-3550S
> http://www.anandtech.com/show/7082/snapdragon-800-msm8974-performance-preview-qualcomm-mobile-development-tablet/6
> 
> Facts are, RISC/x86 is ancient technology and seriously needs a retirement.
> 
> That's why the PS4 and XB1 both using x86-64 instruction set was an inferior solution. I'm glad Sony went with AMD because the Intel monopoly must be opposed, but what Sony really should have done is continued with the superior PowerPC based Cell.


You're just blabbering nonsense; the fact is that we won't know how a 220W ARM CPU would perform before someone makes one. Doubling the operating frequency quite easily quadruples the power consumption, even if you have a pipeline long enough to allow it in the first place. If you don't, you have to change a hundred variables, which makes estimating performance an impossibility. It's also been shown that increasing the core count isn't usually the way to go with (consumer) CPUs; software is just way too slow to adapt to the changes. Future-proofing doesn't make much sense when we usually figure out a better way to do things by the time the new features would actually be useful.


----------



## EliasAlucard

Quote:


> Originally Posted by *Tojara*
> 
> You're just blabbering nonsense, the fact is that we won't know how an 220W ARM CPU would perform before someone makes one. Doubling the operating frequency quite easily quadruples the power consumption even if you have a pipeline long enough to do that in the first place. If you don't, you have to change a hundred variables that make estimating performance an impossibility. It's also been proven that increasing the core count isn't usually the way to do things with (consumer) CPUs, software is just way too slow to adapt to the changes. Future-proofing doesn't make much sense when we usually figure out a better way to do things by the time your new features would actually be useful.


No, the nonsense speaker here is you. Let's have a look at the facts:

The power consumption of the initial PlayStation 3 units based on 90 nm Cell CPU ranges from 170-200 W during normal use, despite having a 380 W power supply.[32] The power consumption of newer 40 GB PlayStation 3 (65 nm process Cell/90 nm RSX) units ranges from 120-140 W during normal use.[33] The latest 80 GB units use both 65 nm Cell and 65 nm RSX, and have further lowered power consumption to between 90-120 W. *The PS3 Slim reduces this power consumption by another 34% with the use of a 45 nm Cell, to around 76 W*.
https://en.wikipedia.org/wiki/PlayStation_3_technical_specifications#Form_and_power_consumption

^^ That's 76 W in total, not 76 W for the Cell processor alone, but also including the RSX GPU, hard drive and so on. And that's with a 45 nm Cell processor. My 45 nm AMD Phenom II X6 1090T, clocked at 3.2 GHz with six cores (the PS3 has like 8 cores @ 3.2 GHz, of which only 6 are used for gaming), requires 125 W (source). And that's for the processor alone. Compare the Phenom II X6 1090T running at 125 W to the entire PS3 slim running at 76 W.

Now imagine a PS4 using the Cell processor at 32 nm; the wattage would easily be, what, 10-20 watts for a 3.2 GHz PS4 processor? I don't know, but I'm pretty damn sure we'd have PS4 consoles with much cooler and way more powerful CPUs at a seriously reduced electricity bill. Win-win, but apparently you Intel fanboys don't want win-win, with your outdated belief in x86 as the be-all, end-all solution to technology.


----------



## Lex Luger

Another Intel hater speaking pure nonsense based on nothing. IBM is slowly dying, and their uncompetitive PowerPC processors are a huge part of it. I wouldn't be surprised if IBM stopped manufacturing PowerPC in 5 years. I wouldn't be surprised if IBM is dead in 10 years. Same goes for AMD.


----------



## Blindsay

Quote:


> Originally Posted by *EliasAlucard*
> 
> ^^ That's 76 W in total, not 76 W for the Cell processor alone, but also including the RSX GPU, hard drive and so on. And that's with a 45 nm Cell processor. My 45 nm AMD Phenom II X6 1090T, clocked at 3.2 GHz with six cores (the PS3 has like 8 cores @ 3.2 GHz, of which only 6 are used for gaming), requires 125 W (source). And that's for the processor alone. Compare the Phenom II X6 1090T running at 125 W to the entire PS3 slim running at 76 W.
> 
> Now imagine a PS4 using the Cell processor at 32 nm, the wattage would easily be like what, 10-20 watt for a 3.2 GHz PS4 processor? I don't know, but I'm pretty damn sure we'd have PS4 consoles with much cooler and way more powerful CPUs at a seriously reduced electricity bill. Win-win, but apparently you Intel fanboys don't want win-win, in your outdated beliefs on x86 as the be all, end all solution to technology.


You can't arbitrarily compare core count and clock speed across different architectures like that. Using AMD as an example of x86 efficiency is fail.

Also, TDP and power consumption are not the same thing...

Comparing ARM to x86 is silly anyway; it is like comparing an apple to an orange. Those types of devices are designed with efficiency in mind first and performance second, and x86 is almost the opposite.

Sure, ARM uses less power, but it also can't even begin to compare to something like an i7 in performance.


----------



## muselmane

I don't quite understand why people are surprised by this, to be honest. The decision to switch from RISC to x86 was, AFAIK, mostly a cost-reduction measure, especially seeing how they went with weak hardware (at the point of release), claiming that the APU nature of the chip would leave huge room for improvement later on, if used correctly (which seems to be the caveat here). Nevertheless, I believe the hardware they chose was the right hardware, looking at what it can potentially handle and at what price.
It's a compromise, and one that is acceptable IMO for both consumers and developers. The big problem is that publishers expect another installment of whatever game every year, or even every half year, with multiple games in the pipeline at the same time. And every single one of those installments has to be bigger, with more content and more of everything. No wonder quality suffers. And it's easy to blame it on the hardware.

For me this is just another reason to boycott big publishers and stick to supporting Kickstarter, Greenlight, and similar projects (while maintaining the necessary caution, of course).

/rant over


----------



## Orangey

There is a 32-core, 10-billion-transistor, 2.6 GHz ARM chip taped out at TSMC 16FF. It seems to scale up, but actual performance in games and general software, to the point where it could replace x86, is another matter entirely.


----------



## Carniflex

Quote:


> Originally Posted by *Orangey*
> 
> There is a 32core 10B transistor 2.6GHz ARM taped out @ TSMC 16FF. It seems to scale up but actual performance in games & general software to the point where it could replace X86 is another matter entirely.


When AMD released their Jaguar cores in the AM1 socket, there was also some speculation about what a 32-core chip built from them would look like. Based on die shots it actually sounded feasible within about a 100 W TDP envelope, with probably somewhat fewer than 10 billion transistors.







Anyway - x86 cores can scale as well.

Besides, modern "x86" is already more like RISC internally, with all these extensions, than what it started as. It just hasn't traditionally been power-efficiency focused, but as far as I understand, in theory it can be dragged down into the same ballpark as ARM if that is the goal, the same way ARM can be scaled up to the performance ballpark of x86 at the expense of power consumption.

At the end of the day, the end user probably doesn't care much what the architecture is, as long as the programs they want or need run on it with acceptable performance.


----------



## WhiteCrane

Correct me if I'm wrong, but the days of RISC being inherently superior to CISC processors are long over, and today's Intel CPUs are a hybrid of RISC and CISC architecture. The Cell was just a single-core RISC chip with some stream processors. Nothing fancy.


----------



## SpeedyVT

Quote:


> Originally Posted by *EliasAlucard*
> 
> Look, x86 is crap and it doesn't have much left before it's fully maxed out in hardware performance. Sure, software developers may not have squeezed out every drop of x86's capabilities, but RISC based processors such as the PowerPC based Cell CPU in the PS3, and ARM based mobile devices, are inherently superior because the RISC instruction set doesn't require as much wattage and heat at the same performance level. Imagine a RISC processor running at 220 watt? In the case of the AMD FX 9590, you'll get 5GHz, which is a lot, but a RISC based PowerPC or ARM requiring 220 watt would easily give us a 16 core 10GHz per core CPU, if not more, or something like that. Look at modern mobile devices such as the Snapdragon 800, which in spite of being passively cooled and running at very low wattage, packs half the processing power of an Intel i5:
> 
> http://cpuboss.com/cpus/Qualcomm-Snapdragon-800-vs-Intel-Core-i5-3550S
> http://www.anandtech.com/show/7082/snapdragon-800-msm8974-performance-preview-qualcomm-mobile-development-tablet/6
> 
> Facts are, RISC/x86 is ancient technology and seriously needs a retirement.
> 
> That's why the PS4 and XB1 both using x86-64 instruction set was an inferior solution. I'm glad Sony went with AMD because the Intel monopoly must be opposed, but what Sony really should have done is continued with the superior PowerPC based Cell.


x86 and ARM are instruction sets (machine code), which by themselves have little to do with performance; what matters is the architecture. You can't get 10 GHz on AMD because of its architecture, not its machine code. Machine code can affect how you build an architecture, but doesn't entirely determine how it's made. If you want to see better, faster processors, pray for graphene or carbon nanotubes.


----------



## Pip Boy

I feel like this thread is slightly deviating from the point that the PS4 (and Xbox One) are performance-limited for next-generation consoles, and 900p/30fps really is their target. Sure, with optimisations they can look fantastic (we're not talking about 7-year-old laptop GPUs), but early next year, with G-Sync / FreeSync / adaptive sync, 40 fps is going to look nicer than ever before on PC, and the new revisions from Nvidia/AMD will be out that run 40-60 fps at 4K no problem, whereas a console losing too many frames from 60 fps is going to tear or drop to 30 fps.

The GPU and Display technology are leaping ahead within 1 year of the consoles release. Consoles cant even run VR properly either.

They're just boxes for the masses, not the masters.


----------



## SpeedyVT

Quote:


> Originally Posted by *phill1978*
> 
> I feel like this thread is slightly deviating from the point that the PS4 ( and xboxone ) are performance limited for next generation consoles and 900p/30fps really is their target. Sure with optimisations they can look fantastic ( were not talking about 7yr old laptop gpu's) but this early next year with g-sync / freesync / a-sync 40fps is going to look nicer than ever befores on PC and the new revisions of Nvidia/AMD are out that will run 40 - 60 at 4k no problems, where as a console losing too many frames from 60fps is going to tear or drop to 30fps.
> 
> by next year 4k PC gaming will be runable on a laptop, which is amazing.


Optimizations don't work like that. For games on consoles it's more about programming to the hardware directly, which improves things dramatically. I don't expect 4K gaming on anything affordable in laptops next year. We are stuck in a constant-porting generation, which is flawed and lacks system optimization and potential.


----------



## EliasAlucard

Quote:


> Originally Posted by *Blindsay*
> 
> You cant arbitrarily compare core count and clock speed of different architectures like that.


Why not? I'm talking about performance per watt anyway. Perhaps it's a new concept to you, but you should look it up:

https://en.wikipedia.org/wiki/Performance_per_watt

There's absolutely no question that RISC is the better performer at the same wattage. And there's a damn good reason Folding@home was used on the PS3 and was the top-performing distributed computing project in the world; think we'll see Sony putting out any *@home project on the PS4 any time soon? I'm not holding my breath. The PS4's CPU is too weak for charity.
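
In case the concept really is new to anyone: performance-per-watt is just sustained throughput divided by power draw. A quick sketch with hypothetical placeholder numbers (not measurements of any real chip):

```python
# Hypothetical numbers only: perf-per-watt vs absolute performance.

def perf_per_watt(gflops, watts):
    return gflops / watts

mobile = {"gflops": 20.0, "watts": 4.0}     # assumed mobile-class SoC
desktop = {"gflops": 200.0, "watts": 90.0}  # assumed desktop-class CPU

print(f"mobile:  {perf_per_watt(**mobile):.2f} GFLOPS/W")
print(f"desktop: {perf_per_watt(**desktop):.2f} GFLOPS/W")
# The mobile part wins on GFLOPS/W (5.0 vs ~2.2) while losing 10x on
# absolute throughput -- 'better' depends entirely on the metric.
```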

Quote:


> Originally Posted by *Blindsay*
> 
> Using AMD as an example for x86 efficiency is fail.


It's not like x86 is energy-efficient in the first place, and that's true regardless of whether it's AMD or Intel. How much wattage does an average modern ARM-based Android device require? They're quite powerful these days in terms of CPU performance, yet they run at much lower power than the average x86 desktop.

Quote:


> Originally Posted by *Blindsay*
> 
> Also TDP and power consumption are not the same thing...


Whatever man; yeah, there's a small difference, but even if we recalculate the TDP to actual wattage, x86 is still far behind ARM/PowerPC in energy efficiency.

Quote:


> Originally Posted by *Blindsay*
> 
> Comparing ARM to x86 is silly anyways, it is like comparing an apple to an orange. Those types of devices are designed with efficiency in mind first and then performance and x86 is almost the opposite.


Which is exactly how it's supposed to be done. The energy consumption of x86 is _irresponsibly_ high, for environmental reasons if anything.

Quote:


> Originally Posted by *Blindsay*
> 
> Sure ARM uses less power, but it also cant even begin to compare to something like an i7 in performance


But the PS3's Cell definitely can, in spite of being a decade old. And I'm sure ARM-based CPUs definitely could too if they were designed for the desktop, which they will be once AMD begins dishing out ARM processors.

Quote:


> Originally Posted by *WhiteCrane*
> 
> Correct me if I'm wrong but the days of RISC being inherently superior to CISC processors are long over and today's intel CPU's are a hybrid of RISC and CISC architecture. The cell was just a single core RISC chip with some stream processors. Nothing fancy.


You're wrong, and I'm going to correct you.

It's true that Intel has added some RISC-like functionality over the years, but pretty much all x86-64 CPUs (even the low-powered ones) are still more or less purely CISC, because unlike RISC, they're still anything but energy-efficient. In fact, x86-64 is the only real CISC player in town nowadays. Everywhere else, the importance of RISC is understood: most mobile devices use RISC (except the Intel-based junk no one is buying, thank God), and so do most video game consoles (except for the original Xbox and now, sadly, the current generation of PS4/XB1).

It's mostly on the Windows desktop/laptop that we're still burdened by this CISC crap, and that's the result of several decades of proprietary vendor lock-in by the Wintel duopoly. Microsoft and Intel could end the Wintel nonsense tomorrow if they really wanted to, but they're not inclined to, because they both know that without Wintel, both Microsoft and Intel would be out of the game completely. Look at how difficult it has been for both Microsoft and Intel to establish a presence in mobile devices as a case in point, with ARM-based Windows RT/Phone and Intel-based Android devices seeing no success whatsoever. That's how it works when there's real market competition and no anti-competitive corporations dominating the ignorant masses with their vendor lock-in strategies (in this case, a monopoly on software through the x86 Windows ISA/OS).

You x86 Wintel fanboys obviously haven't done your homework, and you're not capable of connecting the dots.

Quote:


> Originally Posted by *SpeedyVT*
> 
> x86 and ARM are machine codes which has little to do with performance, it's their instruction sets. You can't get 10ghz on AMD because it's architecture not machine code. Machine Code can effect how you build an architecture, but doesn't entirely effect how it's made. If you want to see better faster processors pray for Graphine or Carbon Nano Tubes.


Yeah, graphene will kick ass once it's ready and out on the market, if that ever happens; last I checked they were still working on graphene's ability (or lack thereof) to actually switch transistors off, or something like that anyway.


----------



## EliasAlucard

Quote:


> Originally Posted by *Systemlord*
> 
> Perhaps you could use paragraphs because your speed induced rant is making you forget to unclump your paragraphs. You know what I think it's time you go somewhere else to rant, quit this little fanboy complex your bring into an otherwise great thread that doesn't need you in it! I'm tired of seeing you come in here with your superior attitude! Try living in the world we live in, not some decade long ran about WIntelt! Geez grow up and schedule an appointment with your nearest psychologist to vent your frustration!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please no more caffeine for you. You on speed or something?


So ad hominems aside, do you have an actual technically based counter-argument, or are you going to focus on my persona?

I'm the one who's anti-fanboyism here, by the way, because I'm critical of the fact that AMD (the CPU brand I prefer) uses x86. Fanboys swallow everything from their favourite brand without critical thinking.


----------



## Systemlord

Quote:


> Originally Posted by *EliasAlucard*
> 
> So ad hominems aside, do you have an actual technically based counter-argument, or are you going to focus on my persona?
> 
> I'm the one who's anti-fanboyism here, because I'm critical of the fact that AMD (the x86 brand I prefer) uses x86. Fanboys swallow everything from their favourite brand without critical thinking.


Well I would like to get back to the thread topic instead of a counter-argument, "PS4 and Xbox One Already Hitting a Performance Wall", not rants about Wintel. Your way off topic counter-arguments is getting old, we aren't here to talk about mobile devices, PC's, Windows RT. If you so anti-fanboyism why do you have AMD stapled to your avatar/forehead? For somebody who says they are anti-fanboyism you have a funny way of showing it. You seem as though your on the attack when somebody that's wrong or has an incorrect understanding of how things work, I highly doubt your making any friends here by being critical and argumentative.

Seriously though, this thread was more enjoyable before you started being overcritical with your counter-arguments; most don't care for them and you're killing this thread.

Unsubscribed for now.


----------



## SpeedyVT

Quote:


> Originally Posted by *Systemlord*
> 
> Well I would like to get back to the thread topic instead of a counter-argument, "PS4 and Xbox One Already Hitting a Performance Wall", not rants about Wintel. Your way off topic counter-arguments is getting old, we aren't here to talk about mobile devices, PC's, Windows RT. If you so anti-fanboyism why do you have AMD stapled to your avatar/forehead? For somebody who says they are anti-fanboyism you have a funny way of showing it. You seem as though your on the attack when somebody that's wrong or has an incorrect understanding of how things work, I highly doubt your making any friends here by being critical and argumentative.
> 
> Seriously though this thread was more enjoyable before you started being overcritical and you with your counter-arguments, most don't care for this and your killing this thread.
> 
> Unsubscribed for now.


I'm just going to say, then, that I'm positive we've not delved into the depths of the PS4's potential, as we've yet to see enough non-ported games written directly for the hardware.


----------



## Blameless

Quote:


> Originally Posted by *EliasAlucard*
> 
> Look, x86 is crap and it doesn't have much left before it's fully maxed out in hardware performance.


People have been saying nonsense like this for thirty years.

x86 isn't an architecture. It's an instruction set that exists now largely as a compatibility layer, with fairly minimal overhead in die-area requirements and performance. The more time goes on, the _less_ x86 is holding processors back. One need only compare the performance (and performance per watt) of x86 chips to non-x86 solutions over the years to demonstrate that this is the case.
Quote:


> Originally Posted by *EliasAlucard*
> 
> And there's a damn good reason why Folding@home was used on the PS3 and it was the top performing distributed computing project in the world; think we'll see Sony putting out any *@home project on the PS4 any time soon? I'm not holding my breath. The PS4's CPU is too weak for charity.


The PS3's CPU was essentially an APU, and did reasonably well in Folding@home because it had seven vector processors alongside its general-purpose POWER core.
Quote:


> Originally Posted by *EliasAlucard*
> 
> It's not like x86 is energy efficient in the first place, and that's true regardless of if it's AMD or Intel. How much wattage does an average modern ARM based Android device require? They're quite powerful these days in terms of CPU performance, yet they run at a much lower energy than the average x86 desktop.


Quote:


> Originally Posted by *EliasAlucard*
> 
> The energy consumption of x86 is _irresponsibly_ high, for environmental reasons if anything.


Quote:


> Originally Posted by *EliasAlucard*
> 
> x86 is still far behind ARM/PowerPC in energy efficiency.


ISA has little to do with power efficiency in modern processors, especially in the 1w+ range of most everything more powerful than a phone.

It becomes more of an issue for super-low power apps, but at laptop or desktop power envelopes, it evaporates completely.
Quote:


> Originally Posted by *EliasAlucard*
> 
> But PS3's Cell definitely can in spite of being a decade old.


No it can't, not for the same general purpose tasks that CPUs are designed for. Any massively threaded vector task is better done on an array of processors like those inside GPUs. A low-end APU (from either AMD or Intel) is all-round faster and more power efficient than the Cell BE.

The general-purpose PPC core in the Cell BE was relatively weak, even for its time, and it was RISC.


----------



## Orangey

Quote:


> Originally Posted by *EliasAlucard*
> 
> Yeah graphene will kick ass once it's ready and out on the market, if that ever happens, because they were still working on graphene's ability (or lack thereof) to turn off the CPUs last I checked, or something like that anyway.


Why do people keep parroting this old chestnut?

http://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/breakthrough-in-creating-a-band-gap-for-graphene-promises-huge-potential-for-electronic-applications

2010 ffs.


----------



## jordanecmusic

Quote:


> Originally Posted by *SpeedyVT*
> 
> These games are poorly designed. We've got to hold on and wait for actual good games designed around the console and not computer.


The Xbox One and PS4 are essentially low-end gaming computers. With their x86-64 architecture there's no fundamentally different way to code a game for them than for a PC now. If you want a game designed purely around a console, go back to playing your Xbox 360s and PS3s.


----------



## SpeedyVT

Quote:


> Originally Posted by *jordanecmusic*
> 
> Xbox one and ps4 are essentially low end gaming computers. There is no other way to code a game for it unlike a computer now because of its 64 bit architecture. If you want a game designed for a console go back to playing your xbox 360s and ps3s.


That's so wrong it's not even funny.


----------



## jordanecmusic

If I'm wrong then correct me. It's the closest they've gotten to being a PC. It makes PC games easier to port properly too.


----------



## GoldenTiger

Quote:


> Originally Posted by *omari79*
> 
> Wasn't it always the case that a console was superior or equal to a high-end gaming PC at launch and then it would get surpassed not long after?
> 
> Wasn't this the case with the PS2..PS3..XBOX..360?
> 
> What went wrong with the PS4/ONE?


This. You'd think that it would be obvious, but it seems people are blind.

The delusional fanboys thinking hugely outdated tech at launch will make for a great console graphically in 5+ years are a riot!

Bonus points for them blathering on cluelessly about "not all cores are used 100 percent!!!!" as meaning "bad coders".


----------



## GoldenTiger

Quote:


> Originally Posted by *jordanecmusic*
> 
> Xbox one and ps4 are essentially low end gaming computers. There is no other way to code a game for it unlike a computer now because of its 64 bit architecture. If you want a game designed for a console go back to playing your xbox 360s and ps3s.


Not too far off from the truth, actually.


----------



## lacrossewacker

Quote:


> Originally Posted by *jordanecmusic*
> 
> If im wrong then correct me. its the closest they've gotten to being a pc. It makes pc games easier to port properly too.


The Xbox (2001) was even closer


Intel Celeron, Nvidia Geforce 3 variant, Samsung/Micron DDR ram, Seagate HDD (IDE)

Heck, it was even called *X*box because it was a Direct*X* console


----------



## iTurn

Quote:


> Originally Posted by *GoldenTiger*
> 
> This. You'd think that it would be obvious but seems people are blind.
> 
> The delusional fanboys thinking hugely outdated tech at launch will make for a great console graphically in 5+ years are a riot!
> 
> Bonus points for them blathering on cluelessly about "not all cores are used 100 percent!!!!" as meaning "bad coders".


But they AREN'T being used 100%... I wouldn't call all devs bad coders for not using 100% of the system though.
Quote:


> Originally Posted by *GoldenTiger*
> 
> Not too far off from the truth, actually.


Very far from the truth...

It's amazing people are still parroting Ubi, who made the statement and then came back and said that hardware limitations aren't the reason for the low performance but that the coding was to blame. Quite odd, but this is OCN after all.


----------



## SpeedyVT

Quote:


> Originally Posted by *iTurn*
> 
> But they AREN'T being used 100%... I wouldn't call all devs bad coders for not using 100% of the system though.
> Very far from the truth...
> 
> It's amazing people are still parroting Ubi who made the statement who then came back and said that hardware limitations aren't the reason for the low performance but the coding was to blame. Quite odd but this is OCN after all.


Agreed. It's consoles that push gaming forward. While I'm a "master race" PC user, I still understand the importance of the success of consoles.


----------



## iTurn

Quote:


> Originally Posted by *SpeedyVT*
> 
> Agreed. It's consoles that pushing gaming further in the future. While I'm a MASTERRACE, PC User, I still understand the importance of the success of consoles.


Agreed, I don't get OCN's beef with the consoles and the refusal to see how they have newer tech just because they don't spec up to high end. Consoles used to spec up to PC high end when they housed discrete GPUs/CPUs; they now house a SoC, and there is no APU/iGPU near as powerful in the PC world (correct me if I'm wrong though).

A SoC around the power of an i3 CPU-wise and a little more powerful than an HD 7850/GTX 660 GPU-wise, with GDDR5 as system/VRAM, says more advanced tech than what's available to desktops imo, even if it's weaker.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *hellojustinr*
> 
> LOL at this article and every other PC master troll enthusiast saying Xbox One and PS4 are hitting a "performance wall" already.
> 
> Xbox 360 and PS3 were hitting performance wall way back in 2008 with the release of Grand Theft Auto IV and they are still playing the likes of modern games like Far Cry 4 today albeit at Low settings on HD. Heck they were able to optimize GTA V to release last year with the same frame rate and resolution as GTA IV. Performance wall means nothing to consoles. All consoles hit the performance wall early on.
> 
> Xbox 360 and PS3 equivalent in terms of PC architecture/specs raw power
> Triple-core AMD Phenom first-gen
> NVIDIA GEFORCE 7950GT and ATI RADEON X1800 (not even the higher end, 7950 GTX or X1950 XTX versions)
> 256MB/512MB of RAM
> 
> Yet I'd like to see a PC with those similar specs handle the same games those consoles are doing now.
> 
> Games like MGS4 don't seem possible on a GPU like the 7950 GTX whasoever in my mind when I think about it if I thought about it on the PC hardware side of things but it is on the console.
> ___________________________________________________________________________________
> That's not to mention our GPUs have multiplied in power times 20 (exaggggg) in the past seven years.
> 
> Example
> 
> NVIDIA point of view (easier to follow)
> 2007
> From 7950 GTX -- > 8800 GTX G80 TESLA (with latter exceeding the performance of even two 7950GTXs in SLI (7950 GX2). improvement = 2.5x literally)
> 
> early 2008
> minor 8800 GTX G80 TESLA -- > 9800 GTX+ G92 TESLA (really minor performance improvement, more for power savings, if any at all. improvement = 1.1x
> 
> late 2008
> 9800GTX G92 TESLA --> GTX 280 MATURE TESLA (easily two times faster than the G92 chip, a beast for its time. improvement = 2.2x)
> 
> 2009
> GTX 280 MATURE TESLA --> GTX 480 FERMI PROTOTYPE (ran extremely hot but early Fermi archi thats why, was able to compete with the GTX 295 which itself was two GTX 280s in essence. improvement = 2x)
> 
> 2010
> GTX 480 FERMI PROTOTYPE --> GTX 580 MATURE FERMI (mature version of Fermi, ran faster as well as a result. improvement = 1.4x)
> 
> 2012
> GTX 580 MATURE FERMI --> GTX 680 KEPLER (in a particular tech demo, one 680 was able to outpace three 580s in SLI, just showed the huge generational leap it made, plus the power efficiency gains it made (e.g. less heat as well), 2012 was that year. improvement = nearly 2.5x-3x)
> 
> 2013
> GTX 680 KEPLER --> GTX 780 TI IMPROVED KEPLER (just a more mature Kepler in my opinion. improvement = 1.5x)
> 
> 2014
> GTX 780 TI IMPROVED KEPLER --> GTX 980 MAXWELL (focused on power savings, shows NVIDIA is refocused on mobile. improvement=1.3x????)
> 
> Conclusion:
> With all these tenfold increases in performance it feels as if games are not looking that much better yet have much higher requirements than ever before. AC Unity requires a GTX 680 MINIMUM just to run apparently which is just BS when consoles are effectively using near GTX 650 level GPUs.
> 
> 30FPS on consoles at 900p seem reasonable for a badly optimized game on a console with GTX 650 level GPU but 30FPS on a GTX 680 is just ridiculous (this is based on personal experience playing with a extra 680 I have laying around).
> 
> If I were to base it on PC nomenclature just going up from a GTX 650 to a 660 is a big increase already, and from a 660 to a 660 Ti a much larger performance increase, then from 660 Ti to 670 is an even bigger differential. Why is the 680 the minimum?
> 
> I feel we're being neglected of proper optimizations and many people are ignoring it because GPUs have gotten so cheap to the point you can get a pretty decent GPU for $130 used (HD 7950 and GTX 670 goes for $150 solid used). Heck with the release of the 970, effectively 780 Ti level performance is down to the $350 price point and that effectively affects all lower level GPUs as well.
> _________________________________________________________________________________
> Game with proper optimizations = Dishonored (runs linearly the same regardless of hardware and scales properly in accordance to hardware raw performance), COD Black Ops II, COD Ghosts
> 
> Games with horrible optimization = Battlefield 4 (in some degree), GTA IV, AC Unity, Metro 2033
> 
> In PC people's mind's, the harder it is to run the game the more intensive it is, but that is not always the case. Games like Metro 2033 Last Light do not look anywhere near as stunning yet require the raw performance power of the likes of a game like Crysis 3.
> 
> Consoles and PCs are NOT comparable whatsoever. Time and time again it has always been like that and with their transition to x86-x64 architecture, it still will not change. The only thing we have going for us is, with consoles featuring x86-x64 AMD architecture, many of the optimizations devs make on the consoles we'll eventually transition to us when they are ported over to PC since the consoles are essentialy GCN architecture PCs.
> 
> The time developers will have to develop specifically for the next-gen console's silicon will be a lot more larger in span compared to the pathetic life cycles of our GPU architectures (Fermi anyone? that's quite recent, just three years ago). Flagships just two-three years ago are being forgotten quickly and those GPUs alone cost $400 when they first came out, the price of ONE flagship next-gen console alone. PC enthusiast just tell people to upgrade and upgrade and upgrade, and for what reason? I feel we should have reached that point where we shouldn't be upgrading as frequently anymore.
> 
> Just some food for thought (a really long one at that). Don't take the insults too seriously but I think we all know some part of this rant to be true in some aspect. Wish I could get my message across a lot more clearer but it's hard to organize all my thoughts into this certain subject. Hope you get the idea
> 
> I personally think we should start demanding for better optimizations in the PC community, as a whole rather than keep telling each other to just upgrade and upgrade and upgrade time and time again
> 
> 
> 
> 
> 
> 
> 
> (e.g. Ubisoft tsk tsk, Watch_Dogs and now AC Unity)


I want to rep this a million times. An Intel Pentium 4/D or dual-core Athlon with 2 GB of RAM and a 7900 GS or X1800 cannot possibly hope to ever run any modern games. They wouldn't even boot. The system I just described, though, would have cost more than a 360 or PS3 at the time in 2005/2006. The PS3 and 360 can still run Far Cry 4, Witcher 3, GTA5, Crysis 3.....


----------



## SpeedyVT

Quote:


> Originally Posted by *iTurn*
> 
> Agreed, I don't get OCNs beef with the consoles and not seeing how it has newer tech just because it doesn't spec up to high end. Consoles used to spec up to PC high end when they housed discrete GPUs/CPUs they now house a SoC and there is no APU/iGPU near as power in the PC world (correct me if I'm wrong though).
> 
> SoC around the power of an i3 CPU wise and a little more powerful than a 7850HD/GTX660 GPU wise, with GDDR5 as system/VRAM says more advanced tech than whats available to desktops imo, even if it's weaker.


I wouldn't even compare it to a PC. There is so much that doesn't make it a PC that it's literally not comparable, such as the independent operative ability of the GPU, the fact that the GPU can thread like a processor if coded for it, and the 256-bit memory access at the CPU level. While its cores are based on x86, it's not a PC. It's a Frankenstein.


----------



## tajoh111

Quote:


> Originally Posted by *iTurn*
> 
> Agreed, I don't get OCNs beef with the consoles and not seeing how it has newer tech just because it doesn't spec up to high end. Consoles used to spec up to PC high end when they housed discrete GPUs/CPUs they now house a SoC and there is no APU/iGPU near as power in the PC world (correct me if I'm wrong though).
> 
> SoC around the power of an i3 CPU wise and a little more powerful than a 7850HD/GTX660 GPU wise, with GDDR5 as system/VRAM says more advanced tech than whats available to desktops imo, even if it's weaker.


It's way less powerful than an i3. i3s nowadays are capable of outperforming FX-8350s in most gaming tasks, let alone the six 1.8 GHz Jaguar cores in the PS4.

If you look at the original Xbox, it had something a bit faster than a GeForce3 Ti 500. Price of a GeForce3 Ti 500 at the time of the Xbox's release? 350 dollars, which, taking inflation into account, is 460 dollars.

PS3 power was equivalent to a 7900 GT to 7900 GTX, so a card between 300-500 dollars at the time. And the PS3 was super delayed; it should have come out during the GeForce 7800 era. So with inflation, between 350 and 500+.

Xbox 360 power was a bit more than an X1950 XTX. Cost of this card was around 450 dollars; with inflation, 531 dollars.

Today's consoles come with something around a 7850, which was a 150-dollar card at the time of the PS4 and Xbox One release. That's a drastic drop compared to what we saw before. We should really be getting GTX 780 to GTX 780 Ti, or R9 290 to R9 290X, levels of performance if it were like the consoles of the past. Those chips are 3x the power of what's in the Xbox One and PS4.

We might not be hitting the wall now, but by next year there is a good chance we will.

Stuff like HSA doesn't matter nearly as much in the consoles, because console games are coded so close to the metal that they already use their resources efficiently. It's why in the past the consoles often got away with using a CPU significantly less powerful than the GPU. Much of HSA is about offloading work from the CPU and letting the GPU do it instead. Console gaming has far less to gain from HSA. 100% utilization is something you see in consoles because they truly extract everything out of the hardware. Consoles underperforming is hardly a result of lazy coding; it's the ports to PC that have bad performance due to lazy coding.

It's not AMD's fault at all, however; it's the console makers' fault for putting such a cheap chip in there in the first place. They should have taken a loss like in the past instead of selling cheaply specced consoles that make a profit from day 1.
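
The inflation arithmetic above is simple to sketch; the cumulative CPI factors below are rough assumptions for illustration, not official statistics:

```python
# Rough sketch of the launch-price comparisons above. The CPI
# factors are assumed round numbers, not official figures.
def adjust_for_inflation(price_usd, cpi_factor):
    """Scale a historical USD price by a cumulative inflation factor."""
    return price_usd * cpi_factor

# GeForce3 Ti 500, ~$350 in 2001, at an assumed ~1.31x factor to 2013:
geforce3 = adjust_for_inflation(350, 1.31)   # roughly $460
# X1950 XTX, ~$450 in 2006, at an assumed ~1.18x factor to 2013:
x1950 = adjust_for_inflation(450, 1.18)      # roughly $530
```

Either way the point stands: past console GPUs mapped to ~$450+ cards in today's money, while a 7850-class part was ~$150 at launch.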


----------



## SpeedyVT

Quote:


> Originally Posted by *tajoh111*
> 
> Its way less power than an i3. i3's nowadays are capable of out performing fx8350's in most gaming tasks, let alone the 6 1.8 ghz jaguar cores in the ps4.
> 
> If you look at the original xbox. It had something a bit faster than a geforce 3 500ti. Price of geforce 3 500ti at time of xbox 1 release? 350 dollars which taking into account inflation, is 460 dollars.
> 
> PS3 power was equivalent to a 7900gt to 7900gtx. so a card between 300-500 dollars at the time. And the PS3 was super delayed. It should of came out during the gtx 7800 era. So with inflation between 350-500+
> 
> xbox 360 power was a bit more than a hd1950. Cost of this card was around 450 dollars. With inflation, 531 dollars.
> 
> Todays consoles, come with something around a 7850 which were 150 dollar cards at the time of the time of the ps4 and xbox 1 release. That's a drastic drop compared to what we saw before. We should really be getting gtx 780 to gtx 780 ti to r9 290 to r9 290x levels of performance if it was like the consoles of the past. These chips are 3x the power of what are in the xbox one and ps4.
> 
> We might not be hitting the wall now, but by next year, there is a good chance we will hit the wall.
> 
> Stuff like HSA doesn't matter nearly as much as in the consoles, because consoles are coded so close to the metal that they are really using their resources efficiently. It's why in the past, the consoles got away often with using a CPU significantly less powerful than the GPU. Much of HSA is about tasking stuff to the CPU and letting the GPU do that work. Console and gaming have far less to gain from HSA. 100 utilization is something that you see in consoles because they truly extract everything out of it. Consoles underperforming is hardly a result of lazy coding. Its the ports to PC that have bad performance due to lazy coding.
> 
> It's not AMD fault at all however, its console makers for putting such a cheap chip in the first place. They should have took a loss like the past instead of underpriced consoles that make a profit from day 1.


You mean i3s in games? For everything else an i3 is pretty futile.


----------



## tajoh111

Quote:


> Originally Posted by *SpeedyVT*
> 
> You mean i3's in games? Everything else an i3 is very futile.


I did mention games.


----------



## zantetheo

For now the new consoles are pretty decent... but 1-2 years from now?

4K, new GPUs, and the GTX 980 at about $200 (it would be like the way we see the GTX 680 nowadays compared to the latest GPUs). Such a big gap.

I wonder how long the Xbox One and PS4 will last. I give them 3 more years max.


----------



## Systemlord

In the past, consoles at release were very close to a high-end PC in specs; I remember the buzz about the Xbox 360 and the PS3 with its folding performance. Sony and MS really went for lower specs this time around, which I believe will max out pretty quickly. It's because they didn't want to lose as much money per console; they're losing less money per console sold. Just look at the price differences: the PS3 at launch ran $599-$699. My brother got one for free buying a high-end DLP display, and at that time the PS3 had been out for some time and still cost $600. Compare those prices to what they are now, the Xbox One at $349 and the PS4 at $399. That's quite a big difference from the PS3's launch price, even a year later.


----------



## NFL

Quote:


> Originally Posted by *tajoh111*
> 
> Its way less power than an i3. i3's nowadays are capable of out performing fx8350's in most gaming tasks, let alone the 6 1.8 ghz jaguar cores in the ps4.
> 
> If you look at the original xbox. It had something a bit faster than a geforce 3 500ti. Price of geforce 3 500ti at time of xbox 1 release? 350 dollars which taking into account inflation, is 460 dollars.
> 
> PS3 power was equivalent to a 7900gt to 7900gtx. so a card between 300-500 dollars at the time. And the PS3 was super delayed. It should of came out during the gtx 7800 era. So with inflation between 350-500+
> 
> xbox 360 power was a bit more than a hd1950 xtx. Cost of this card was around 450 dollars. With inflation, 531 dollars.
> 
> Todays consoles, come with something around a 7850 which were 150 dollar cards at the time of the time of the ps4 and xbox 1 release. That's a drastic drop compared to what we saw before. We should really be getting gtx 780 to gtx 780 ti to r9 290 to r9 290x levels of performance if it was like the consoles of the past. These chips are 3x the power of what are in the xbox one and ps4.
> 
> We might not be hitting the wall now, but by next year, there is a good chance we will hit the wall.
> 
> Stuff like HSA doesn't matter nearly as much as in the consoles, because consoles are coded so close to the metal that they are really using their resources efficiently. It's why in the past, the consoles got away often with using a CPU significantly less powerful than the GPU. Much of HSA is about tasking stuff to the CPU and letting the GPU do that work. Console and gaming have far less to gain from HSA. 100 utilization is something that you see in consoles because they truly extract everything out of it. Consoles underperforming is hardly a result of lazy coding. Its the ports to PC that have bad performance due to lazy coding.
> 
> It's not AMD fault at all however, its console makers for putting such a cheap chip in the first place. *They should have took a loss like the past instead of underpriced consoles that make a profit from day 1.*


Yes because clearly Sony isn't currently losing enough money as is, they need to lose more money!


----------



## Serandur

The science-fiction/fantasy and conjecture show has been entertaining, but I think it's time to inject some facts:
Quote:


> Originally Posted by *EliasAlucard*
> 
> snip


ARM replacing x86... uh. Yeah, yeah do that and enjoy killing off 30 years of legacy software support and the versatility provided by the broader and additional instruction sets for no good reason. Current iterations of Core are a very powerful and highly evolved microarchitecture that's slowed down in high-end performance gains simply because there's nothing on the market that can touch it as is. It's not even limited by power consumption or die size these days (outside of mobile), only Intel's complacency in the HEDT space because of a lack of any threat in that market. Maybe AMD's own new Zen microarchitecture (hint: it's x86) might change things up a bit.

What are referred to as RISC architectures (ARM) have whatever efficiency advantage they do because they use minimal instruction sets (plus the much lower performance, clock speed, and voltage targets), which is undesirable on any serious home computing platform. And of course, that advantage is brought into question in the many areas where x86 has a significant advantage with instruction sets that ARM does not possess.

It's in the name, RISC - reduced instruction set computing. For devices where every ounce of power savings is valuable and not much serious heavy computing is done, yes, tailored and efficient ARM devices are fantastic. For anything not so anemic in power constraints, performance and broad support/compatibility are top-dog and ARM's advantages diminish significantly. Hence why no one's made a high-end ARM device yet in conjunction with it actually being far more complex and difficult to design any microarchitecture than people give credit for. ARM is irrelevant for anything but mobile and other very low-power devices. Beyond that, even with regards to power efficiency, Atom to Core M all the way up through HEDT Extreme i7s say hi. This is to say nothing of my opinions on Intel and their current monopoly on the performance market. I think they're being too complacent up there and wasting resources scaling Core down efficiently to ARM power-envelopes, but over my dead body will I ditch x86 in any of my serious performance platforms, much less my desktop PC. The most powerful architecture at the moment on any consumer device is x86, bar none, and I very much like my legacy support.

People have been ranting about RISC vs CISC for decades, Intel/AMD have largely incorporated the advantages of RISC since then, x86 is more relevant than ever, and the distinctions that once gave RISC its advantages have only shrunk over the years. Here's a research paper (you know, the *science* part of computer science). Delusions and ignorance have no place here:

http://research.cs.wisc.edu/vertical/papers/2013/hpca13-isa-power-struggles.pdf

_"We find that ARM and x86 processors are simply engineering design points optimized for different levels of performance, and there is nothing fundamentally more energy efficient in one ISA class or the other. The ISA being RISC or CISC seems irrelevant."_
Quote:


> Originally Posted by *iTurn*
> 
> Agreed, I don't get OCNs beef with the consoles and not seeing how it has newer tech just because it doesn't spec up to high end. Consoles used to spec up to PC high end when they housed discrete GPUs/CPUs they now house a SoC and there is no APU/iGPU near as power in the PC world (correct me if I'm wrong though).
> 
> SoC around the power of an i3 CPU wise and a little more powerful than a 7850HD/GTX660 GPU wise, with GDDR5 as system/VRAM says more advanced tech than whats available to desktops imo, even if it's weaker.


For one, it's because Jaguar is a barrel-bottom architecture. Weak is an understatement; it's one of the (if not the) weakest PC CPU architectures currently out there. It doesn't touch an i3; its single-threaded performance is absolutely abysmal to the point where ancient Core 2s curbstomp it in that category. Any remotely modern Sandy Bridge (almost 4 years old already), Ivy Bridge, or Haswell desktop i3 not only has more or less twice the IPC, but at least twice the clock speed as well. You're then looking at 4-5x the single-threaded performance of Jaguar on an i3. People will also reference more cores, which is extremely faulty because CPUs are inherently for the sake of sequential processing. It's a simplified distinction between CPUs and GPUs: CPUs primarily deal with serial tasks that are contingent on previous tasks having been completed in a timely manner and on the same core, while GPUs are a vast array of much simpler cores for the purpose of processing what are called embarrassingly parallel workloads (i.e., rasterization, pixel shading, lighting, etc.). Jaguar is not criticized because it doesn't match up to high-end PC CPUs; it's because it doesn't even match up to what are colloquially considered low-end ones and gets its ass handed to it in performance capabilities by even ancient CPUs like Core 2 and gen 1 Core i.

Ergo, anyone "criticizing" (and I use this loosely because bashing from a point of ignorance is not really criticism) programmers for not being able to make their software magically defy conventions of computer science and scale linearly across multiple cores is simply demonstrating a gross misunderstanding of processing, programming, threading, etc. As for GDDR5, it's not meant to be system RAM. If it brought forth any true benefits as such, it would have happened already. It's effectively a bandwidth-enhanced version of DDR3 designed specifically and exclusively for GPUs, which rely far more on the bandwidth of their on-board memory than CPUs do. The tradeoff is latency. CPUs benefit more from lower latency in memory timings, ergo they use DDR3. GPUs benefit more from the bandwidth, ergo GDDR5 exists. They're different variants of the same type of technology specialized for two different things. Using it as system RAM is not more advanced; it's just unorthodox and another poor byproduct of relying on a SoC for what is supposed to be a high-performance platform. In any case, the PS4's CPU is limited to 20 GB/s of bandwidth regardless of GDDR5 being used, as it has its own memory bus.
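
The latency-versus-bandwidth tradeoff described above can be illustrated with a toy model; the latency and bandwidth numbers below are illustrative assumptions, not measured specs for any real part:

```python
# Toy model: total access time = fixed latency + bytes / bandwidth.
# (1 GB/s moves ~1 byte per nanosecond, so the units line up.)
def access_time_ns(bytes_moved, latency_ns, bandwidth_gb_s):
    """Time to service one memory request, in nanoseconds."""
    return latency_ns + bytes_moved / bandwidth_gb_s

# Assumed figures: "DDR3-like" lower latency / lower bandwidth,
# "GDDR5-like" higher latency / higher bandwidth.
DDR3 = dict(latency_ns=50, bandwidth_gb_s=25)
GDDR5 = dict(latency_ns=80, bandwidth_gb_s=176)

cache_line = 64       # a CPU-style small access: latency-dominated
texture = 1 << 20     # a GPU-style 1 MiB stream: bandwidth-dominated

print(access_time_ns(cache_line, **DDR3) < access_time_ns(cache_line, **GDDR5))  # True
print(access_time_ns(texture, **DDR3) > access_time_ns(texture, **GDDR5))        # True
```

Small, dependent CPU accesses pay mostly latency, so DDR3-style timings win; large streaming GPU reads amortize that latency and are won by raw bandwidth, which is exactly why the two variants exist.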
Quote:


> Originally Posted by *SpeedyVT*
> 
> You mean i3's in games? Everything else an i3 is very futile.


If an i3 is futile, what is Jaguar? Worthless? SpeedyVT, my response here to you is considering many of the things you've said over the past few pages, not just this one quote on i3s.

Regarding multithreading, please recall that the only reason multi-core CPUs exist is because CPU development hit a wall since increased frequencies began demanding unsustainable increases in voltage, power consumption, and heat. The traditional increases in clock speed previously used to increase performance became unsustainable circa the Pentium 4 and Athlon era and therefore dual-cores came into being as a _compromise_. As I stated above, CPUs are inherently relevant to sequential operations. Many operations _*simply cannot be parallelized*_. A single thread cannot be divided across multiple cores and the benefits of multi-core CPUs are therefore inherently limited. The additional cores are beneficial for running multiple programs simultaneously and in some niche cases such as servers, video rendering, and other professional contexts, but interactive and consumer software is much more limited. This is a well-established phenomenon in computer science, see: Amdahl's law.
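
Amdahl's law makes that ceiling concrete: if a fraction p of a program parallelizes and the rest stays serial, N cores can never deliver more than 1 / ((1 - p) + p/N) speedup. A minimal sketch (the 75% figure is an arbitrary example, not a measured game workload):

```python
def amdahl_speedup(p, n_cores):
    """Best-case speedup when a fraction p of the work parallelizes
    perfectly and the remaining (1 - p) stays serial (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n_cores)

# With 75%-parallel code, eight slow cores give well under 3x,
# not 8x -- the serial 25% dominates.
for n in (2, 4, 8):
    print(n, round(amdahl_speedup(0.75, n), 2))  # prints 1.6, 2.29, 2.91
```

This is why "use all eight Jaguar cores" cannot rescue weak single-threaded performance: the serial portion of a game loop runs on one slow core no matter how many siblings idle beside it.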



Many very intelligent and highly experienced software engineers, programmers, and developers for many different applications bigger than those relevant to a mainstream game-playing toy, of all things, have wrestled with the difficulties of this and still come away with results I'm sure you'd find lacking. It's not for lack of effort, expertise, or motivation, it's just a fact. Many operations CPUs are responsible for cannot be split across multiple cores. Weaker single-threaded performance certainly doesn't make the issue easier. Finally, not only are two cores on the consoles locked for the OS, but the eight cores aren't even truly together, they're separated into what is essentially two separate quad-core Jaguar CPUs taped together. There is likely a performance penalty for communication between them, as if the pitiful single-threaded performance wasn't enough insult. Similarly, code that is difficult/impossible to parallelize certainly isn't going to be run on a GPU either.

The CPU in these consoles would be, in any truly performance-directed system, a textbook example for microprocessor engineers of what not to do. That's not to say Sony and Microsoft didn't know what they were doing, however. The types of games/exclusives that wow people that care are the extremely linear and "cinematic" types that don't do much interesting with CPUs anyway (i.e. Uncharted, The Order, God of War, etc.), with a focus on linearity and "optimization" in that selective sense in order to wrest the greatest perceivable visuals they can. Sony and Microsoft are not in the console market for the bettering of video games nor the advancement of game development software; they're strictly in it for the financial aspect, and their latest consoles strongly reflect that lack of quality in favor of reduced manufacturing costs. Everything from the PS4 controller and build quality complaints to the unthinkably weak hardware they've packed into the things is evidence of that. This is not a fanboy assertion, nor an inflammatory one towards anyone who enjoys playing on consoles; it's simply the evident reality. As for extracting further performance from modern x86 CPUs, support for the latest instruction sets is a bit lacking I suppose, but there's no magic there and Jaguar doesn't even employ said instruction sets (i.e. AVX2, FMA3, TSX, etc.). "Optimization" won't make for additional performance without cutting something. Considering Jaguar's weakness, the only reason it's even able to run Planetside 2, for example, at a stable 30 FPS, if it can, is probably the lower-level console API. It's already doing its job; MS and Sony simply utilized it as an excuse to use some of the weakest processors on the market, with clearly no intent to have the machines be capable of sustaining 60 FPS, or even 30, in any remotely CPU-demanding game, nor to actually advance AI or scope or other things that make some difference in game design.


----------



## Systemlord

Quote:


> Originally Posted by *Serandur*
> 
> The science-fiction/fantasy and conjecture show has been entertaining, but I think it's time to inject some facts:
> ARM replacing x86... uh. Yeah, yeah do that and enjoy killing off 30 years of legacy software support and the versatility provided by the broader and additional instruction sets for no good reason. Current iterations of Core are a very powerful and highly evolved microarchitecture that's slowed down in high-end performance gains simply because there's nothing on the market that can touch it as is. It's not even limited by power consumption or die size these days (outside of mobile), only Intel's complacency in the HEDT space because of a lack of any threat in that market. Maybe AMD's own new Zen microarchitecture (hint: it's x86) might change things up a bit.
> 
> What are being referred to as RISC architectures (ARM) have an efficiency advantage of any kind because they use minimal instruction sets (plus the much lower performance, clock speed, and voltage targets), which is undesirable on any serious home computing platform. And of course, that advantage is brought into question in the many areas where x86 has a significant advantage with instruction sets that ARM does not possess.
> 
> It's in the name, RISC - reduced instruction set computing. For devices where every ounce of power savings is valuable and not much serious heavy computing is done, yes, tailored and efficient ARM devices are fantastic. For anything not so anemic in power constraints, performance and broad support/compatibility are top-dog and ARM's advantages diminish significantly. Hence why no one's made a high-end ARM device yet in conjunction with it actually being far more complex and difficult to design any microarchitecture than people give credit for. ARM is irrelevant for anything but mobile and other very low-power devices. Beyond that, even with regards to power efficiency, Atom to Core M all the way up through HEDT Extreme i7s say hi. This is to say nothing of my opinions on Intel and their current monopoly on the performance market. I think they're being too complacent up there and wasting resources scaling Core down efficiently to ARM power-envelopes, but over my dead body will I ditch x86 in any of my serious performance platforms, much less my desktop PC. The most powerful architecture at the moment on any consumer device is x86, bar none, and I very much like my legacy support.
> 
> People have been ranting about RISC vs CISC for decades, Intel has largely incorporated the advantages of RISC since then, x86 is more relevant than ever, and the distinctions that once gave RISC its advantages have only shrunk over the years. Here's a research paper (you know, the *science* part of computer science). Delusions and ignorance have no place here:
> 
> http://research.cs.wisc.edu/vertical/papers/2013/hpca13-isa-power-struggles.pdf
> 
> _"We find that ARM and x86 processors are simply engineering design points optimized for different levels of performance, and there is nothing fundamentally more energy efficient in one ISA class or the other. The ISA being RISC or CISC seems irrelevant."
> _
> For one, it's because Jaguar is a barrel-bottom architecture. Weak is an understatement; one of the (if not the) weakest PC CPU architectures currently out there is better. It doesn't touch an i3; its single-threaded performance is absolutely abysmal, to the point where ancient Core 2s curbstomp it in that category. Any remotely modern Sandy Bridge (almost 4 years old already), Ivy Bridge, or Haswell desktop i3 not only has more or less twice the IPC, but at least twice the clock speed as well. You're then looking at 4-5x the single-threaded performance of Jaguar on an i3. People will also reference more cores, which is extremely faulty because CPUs are inherently for the sake of sequential processing. It's a simplified distinction between CPUs and GPUs: CPUs primarily deal with serial tasks that are contingent on previous tasks having been completed in a timely manner and on the same core; GPUs are a vast array of much simpler cores for the purpose of processing what are called embarrassingly parallel workloads (i.e. rasterization, pixel shading, lighting, etc.). Jaguar is not criticized because it doesn't match up to high-end PC CPUs; it's criticized because it doesn't even match up to what are colloquially considered low-end ones and gets its ass handed to it in performance capabilities by even ancient CPUs like Core 2 and gen 1 Core i.
> 
> Ergo, anyone "criticizing" (and I use this loosely because bashing from a point of ignorance is not really criticism) programmers for not being able to make their software magically defy conventions of computer science and scale linearly across multiple cores is simply demonstrating a gross misunderstanding of processing, programming, threading, etc. As per GDDR5, it's not meant to be system RAM. If it brought forth any true benefits as such, it would have happened already. It's effectively a bandwidth-enhanced version of DDR3 designed specifically and exclusively for GPUs, which rely far more on the bandwidth of their on-board memory than CPUs. The tradeoff is latency. CPUs benefit more from lower latency in memory timings, ergo they use DDR3. GPUs benefit more from the bandwidth, ergo GDDR5 exists. They're different variants of the same type of technology specialized for two different things. Using it as a system RAM is not more advanced, it's just unorthodox and another poor byproduct of relying on a SoC for what is supposed to be a high-performance platform. In any case, for the PS4's CPU, it is limited to 20 GB/s of bandwidth regardless of GDDR5 being used as it has its own memory bus.
> If an i3 is futile, what is Jaguar? Worthless? SpeedyVT, my response here to you is considering many of the things you've said over the past few pages, not just this one quote on i3s.
> 
> Regarding multithreading, please recall that the only reason multi-core CPUs exist is because CPU development hit a wall since increased frequencies began demanding unsustainable increases in voltage, power consumption, and heat. The traditional increases in clock speed previously used to increase performance became unsustainable circa the Pentium 4 and Athlon era and therefore dual-cores came into being as a _compromise_. As I stated above, CPUs are inherently relevant to sequential operations. Many operations _*simply cannot be parallelized*_. A single thread cannot be divided across multiple cores and the benefits of multi-core CPUs are therefore inherently limited. The additional cores are beneficial for running multiple programs simultaneously and in some niche cases such as servers, video rendering, and other professional contexts, but interactive and consumer software is much more limited. This is a well-established phenomenon in computer science, see: Amdahl's law.
> 
> 
> 
> Many very intelligent and highly experienced software engineers, programmers, and developers for many different applications bigger than those relevant to a mainstream game-playing toy, of all things, have wrestled with the difficulties of this and still come away with results I'm sure you'd find lacking. It's not for lack of effort, expertise, or motivation, it's just a fact. Many operations CPUs are responsible for cannot be split across multiple cores. Weaker single-threaded performance certainly doesn't make the issue easier. Finally, not only are two cores on the consoles locked for the OS, but the eight cores aren't even truly together, they're separated into what is essentially two separate quad-core Jaguar CPUs taped together. There is likely a performance penalty for communication between them, as if the pitiful single-threaded performance wasn't enough insult. Similarly, code that is difficult/impossible to parallelize certainly isn't going to be run on a GPU either.
> 
> The CPU in these consoles would be, in any truly performance-directed system, a textbook of example for microprocessor engineers of what not to do. That's not to say Sony and Microsoft didn't know what they were doing, however. The types of games/exclusives that wow people that care are the extremely linear and "cinematic" types that don't do much interesting with CPUs anyway (ie. Uncharted, The Order, God of War, etc.) with a focus on linearity and "optimization" in the form of that selective view in order to wrest the greatest perceivable visuals they can. Sony and Microsoft are not in the console market for the bettering of video games nor the advancement of game development software, they're strictly in it for the financial aspect and their latest consoles strongly reflect that lack of quality in favor of reduced manufacturing costs. Everything from the PS4 controller and build quality complaints to the unthinkably weak hardware they've packed into the things are evidence of that. This is not a fanboy assertion, it's not an inflammatory one towards anyone who enjoys playing on consoles, it's simply the evident reality. As for extracting further performance from modern x86 CPUs, support for the latest instruction sets is a bit lacking I suppose, but there's no magic there and Jaguar doesn't even employ said instruction sets (ie. AVX2, FMA3, TSX, etc.). "Optimization" won't make for additional performance without cutting something. Considering Jaguar's weakness, the only reason it's even able to run Planetside 2, for example, at 30 FPS stable, if it can, is probably the lower-level console API. It's already doing its job; MS and Sony simply utilized it as an excuse to use some of the weakest processors on the market with clearly no intent to have the machines be capable of sustaining 60 FPS or even 30 in any remotely CPU-demanding game nor to actually advance AI or scope or other things that make some difference in game design.


These are some cold hard facts to swallow, and they don't bode well for gaming or for the new, very cheaply built consoles. It makes me wonder how these consoles will be able to produce polished AAA games that make owning one worthwhile. The first Halo game sold me on Xbox, but the Xbox 360's poor backwards compatibility is the reason I moved on to gaming PCs.

The Xbox 360's incompatibility with original Xbox games was the single defining reason I gave up on consoles. There isn't a single game for the Xbox One that makes me even desire the new consoles. Recent events don't give me much faith in either console; now they can't even launch reliable software without alienating everyone and ruining the experience with horribly unstable games.

Your post has educated me with a treasure trove of valuable info and I look forward to the next!


----------



## Carniflex

Quote:


> Originally Posted by *Serandur*
> 
> As per GDDR5, it's not meant to be system RAM. If it brought forth any true benefits as such, it would have happened already. It's effectively a bandwidth-enhanced version of DDR3 designed specifically and exclusively for GPUs, which rely far more on the bandwidth of their on-board memory than CPUs. *The tradeoff is latency.* CPUs benefit more from lower latency in memory timings, ergo they use DDR3. GPUs benefit more from the bandwidth, ergo GDDR5 exists. They're different variants of the same type of technology specialized for two different things. Using it as a system RAM is not more advanced, it's just unorthodox and another poor byproduct of relying on a SoC for what is supposed to be a high-performance platform. In any case, for the PS4's CPU, it is limited to 20 GB/s of bandwidth regardless of GDDR5 being used as it has its own memory bus.
> If an i3 is futile, what is Jaguar? Worthless? SpeedyVT, my response here to you is considering many of the things you've said over the past few pages, not just this one quote on i3s.


A minor correction: GDDR5 and DDR3 latency is essentially the same. What gives rise to the common misconception that GDDR5 has higher latency, as far as I understand, is that it sits behind the PCIe bus in today's PC architecture. And THAT is indeed a relatively laggy interface for accessing memory.
Quote:


> It's worth remembering that if you do a little bit of Googling around, you can easily come across the specs of various GDDR5 memory. Hynix for example place PDF's of their products freely available on the internet.
> The latency on GDDR5 and DDR3 memory does vary some, but the typical CAS latency of most sticks of DDR3 is between 7 and 9 (see the earlier Tomshardware link for more information). GDDR5 RAM meanwhile is a typical latency of about 15. Once again, all of this depends on the make and models of the RAM.
> So - there's the answer right? Let's say 8 CAS vs 15 CAS? No, it's not. We have to remember that the speeds are for all intents and purposes - relative. If you take the CAS of both, and then multiply it by the clock speed - you get the ns of delay. CAS of 15/1.35 = 11ns.


From: http://www.redgamingtech.com/ps4-vs-xbox-one-gddr5-vs-ddr3-latency/

I believe the main reason GDDR5 is not used for system memory is the lack of a suitable memory controller in the CPU, plus its relatively higher price and power consumption. It would just be a rather inconvenient thing to do, although I personally would see some benefit in it for, say, AMD APUs, which are chronically bandwidth starved.
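The arithmetic behind the quoted comparison is simple: absolute CAS latency in nanoseconds is the CAS cycle count divided by the command clock in GHz. A quick sketch (illustrative Python; the CL and clock figures are the ones cited above, not universal values):

```python
def cas_latency_ns(cas_cycles, clock_ghz):
    # absolute latency = cycles to wait / clock rate those cycles tick at
    return cas_cycles / clock_ghz

ddr3_ns = cas_latency_ns(9, 0.8)     # DDR3-1600 CL9, 800 MHz command clock
gddr5_ns = cas_latency_ns(15, 1.35)  # GDDR5 CL15 at 1.35 GHz, per the quote
# both land in the same ~11 ns ballpark despite the very different CL numbers
```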


----------



## iTurn

Quote:


> Originally Posted by *tajoh111*
> 
> It's way less powerful than an i3. i3s nowadays are capable of outperforming FX-8350s in most gaming tasks, let alone the six 1.8 GHz Jaguar cores in the PS4.
> 
> If you look at the original Xbox, it had something a bit faster than a GeForce3 Ti 500. The price of a GeForce3 Ti 500 at the time of the original Xbox's release? 350 dollars, which, taking into account inflation, is 460 dollars.
> 
> The PS3's power was equivalent to a 7900 GT to 7900 GTX, so a card between 300-500 dollars at the time. And the PS3 was super delayed; it should have come out during the GeForce 7800 era. So with inflation, between 350-500+.
> 
> The Xbox 360's power was a bit more than an X1950 XTX. The cost of that card was around 450 dollars. With inflation, 531 dollars.
> 
> Today's consoles come with something around a 7850, which was a 150 dollar card at the time of the PS4 and Xbox One release. That's a drastic drop compared to what we saw before. We should really be getting GTX 780 to GTX 780 Ti to R9 290 to R9 290X levels of performance if it were like the consoles of the past. Those chips are 3x the power of what's in the Xbox One and PS4.
> 
> We might not be hitting the wall now, but by next year, there is a good chance we will.
> 
> Stuff like HSA doesn't matter nearly as much in the consoles, because consoles are coded so close to the metal that they already use their resources efficiently. It's why, in the past, consoles often got away with using a CPU significantly less powerful than the GPU. Much of HSA is about taking work the CPU would do and letting the GPU do it instead. Consoles and gaming have far less to gain from HSA. 100% utilization is something you see in consoles because they truly extract everything out of them. Consoles underperforming is hardly a result of lazy coding; it's the ports to PC that have bad performance due to lazy coding.
> 
> It's not AMD's fault at all, however; it's the console makers' for putting such a cheap chip in in the first place. They should have taken a loss like in the past instead of shipping cheap consoles that make a profit from day 1.


Way less powerful? I should have stopped reading there... do remember that console devs will use all of the available cores, and when all available cores are used it is on par with, if not more powerful than, an i3. In single-threaded use, of course, the i3 is more powerful.

HSA doesn't matter? You have no idea what HSA is, do you? http://developer.amd.com/resources/heterogeneous-computing/what-is-heterogeneous-system-architecture-hsa/ — read up on HSA and see that "coding to the metal" doesn't prevent it from providing performance boosts, as it's not Mantle...

If Ubi admits the coding is at fault, who are you to say otherwise? Turning your network offline solved the FPS issue on the PS4; if that's not a coding issue, I don't know what is.
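The inflation adjustments in the quoted post are easy to check (illustrative Python; the CPI-U annual averages are approximate figures I'm assuming, not taken from the post):

```python
def adjust_for_inflation(price, cpi_then, cpi_now):
    # scale a historical price by the ratio of consumer price indices
    return price * cpi_now / cpi_then

# GeForce3 Ti 500: ~$350 at the original Xbox's 2001 launch, in ~2013 dollars
# (assumed CPI-U annual averages: 2001 ~177.1, 2013 ~233.0)
ti500_in_2013 = adjust_for_inflation(350, 177.1, 233.0)  # ~$460, as quoted
```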


----------



## iSlayer

Quote:


> Originally Posted by *SpeedyVT*
> 
> I wouldn't even compare it to a PC. There is so much that doesn't even make it a PC that it's literally not comparable, such as the independent operative ability of the GPU, the fact that the GPU can thread like a processor if coded for it, and the 256-bit access to memory at the CPU level. While its cores are based on x86, it's not a PC. It's a Frankenstein.


In comparing a console I would ask myself what its purpose is.

As a gaming platform, it's inherently limited compared to PCs and boasts anemic hardware this generation, with a CPU that's thoroughly outclassed by even an FX-6300, I imagine, and GPUs that are rapidly becoming outdated. When AMD drops the 3xx series, the consoles will only get more embarrassing more quickly, as PC gaming pushes for 4K and SSAA to mimic 4K.

Normally a console isn't such a hard sell, but this time it's really just for the exclusives and not much else. And even those have been a bit MCC I mean poor.

As a (gaming) PC? Well, they are being advertised for more than just their potential for gaming, as Netflix and whatever boxes. So, how do they compare? Well they're HTPCs but limited in that capacity. No MadVR beauties, just the kind of stuff that you can play through any SmartTV or Tablet or other.

A bit of a far cry from the PS1 with its surprisingly high end CD player, or the PS3 doubling as a very cheap Bluray player.

So, however you want to look at it the consoles have some clear drawbacks.

Also, consoles as PCs. That's not something historically foreign. The PS2 did that (they even sold kits to make it one!) and the PS3 could run Linux.
Quote:


> Originally Posted by *Carniflex*
> 
> A minor correction. GDDR5 and DDR3 latency is essentialy the same. What gives the rise for common misconception, as far as I understand, that GDDR5 has higher latency is that it sits behind the PCIe bus in todays PC architecture. And THAT is indeed relatively lagging interface for accessing memory.
> From: http://www.redgamingtech.com/ps4-vs-xbox-one-gddr5-vs-ddr3-latency/
> 
> I believe main reason why GDDR5 is not used for system meory is the lack of suitable memory controller in the CPU and the relatively higher price and power consumption of it. It would be just rather inconvenient thing to do although I personally would see some benefit of it for, say. AMD APU's which are chronically bandwidth starved.


Aren't all iGPUs chronically bandwidth starved? CPUs not so much.
Quote:


> Originally Posted by *iTurn*
> 
> Way less powerful? I should have stopped reading there... do remember that console devs will use all of the available cores and when all available cores are used it is on par if not more powerful than an i3. Single threaded use of course the i3 is more powerful.


Yes, and that's nice, but you should never forget that some problems are inherently serial, or that multithreading a game is really hard.
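The "inherently serial" point is worth making concrete: when each step depends on the previous step's result, extra cores are useless for that chain of work. A toy sketch (illustrative Python, nothing console-specific):

```python
def iterate_lcg(x0, steps):
    # loop-carried dependency: step n+1 needs step n's output,
    # so this chain cannot be split across cores at all
    x = x0
    for _ in range(steps):
        x = (x * 1103515245 + 12345) % 2**31  # toy LCG update
    return x
```

No scheduler can compute step 1000 before step 999; game logic is full of chains like this (input → physics → AI → render state).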
Quote:


> HSA doesn't matter? You have no Idea what HSA is do you? http://developer.amd.com/resources/heterogeneous-computing/what-is-heterogeneous-system-architecture-hsa/ read up on HSA and see that "coding to the metal" doesn't prevent it from having performance boosts as it's not mantle...
> 
> If Ubi admits the coding is at fault who are you to say otherwise? turning your network offline solved the FPS issue on the PS4 if thats not a coding issue I don't know what is.


I believe it was said earlier that the consoles can't employ HSA.


----------



## Jaydev16

So reality caught up with the bug boxes?


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Systemlord*
> 
> In the past, consoles at release were very close to a high-end PC in specs; I remember the buzz about the Xbox 360 and the PS3 with its folding performance. Sony and MS really went for lower specs this time around, which I believe will max out pretty quickly. It's because they didn't want to lose as much money per console; they're losing less money per console sold. Just look at the price differences: the PS3 at launch ran $599-$699. My brother got one for free buying a high-end DLP display; at that time the PS3 had been out for some time and cost $600. Compare those prices to now: the Xbox One is $349 and the PS4 is $399. That's quite a big difference from the PS3's launch price, even a year later.


PS3 was never $700 (except on ebay). I believe the two launch versions were $500 and $600.


----------



## iTurn

Quote:


> Originally Posted by *iSlayer*
> 
> In comparing a console I would ask myself what its purpose is.
> 
> As a gaming platform, its inherently limited compared to PCs and boasts anemic hardware this generation, with a CPU that's thoroughly outclassed by even an FX 6300 I imagine and GPUs that are very rapidly outdating, when AMD drops the 3xx the consoles will only get more embarrassing more quickly, as PC gaming pushes for 4k and SSAA to mimic 4k.
> 
> Normally a console isn't such a hard sell but this time its really just for the exclusives and not much else. And even those have been a bit MCC I mean poor.
> 
> As a (gaming) PC? Well, they are being advertised for more than just their potential for gaming, as Netflix and whatever boxes. So, how do they compare? Well they're HTPCs but limited in that capacity. No MadVR beauties, just the kind of stuff that you can play through any SmartTV or Tablet or other.
> 
> A bit of a far cry from the PS1 with its surprisingly high end CD player, or the PS3 doubling as a very cheap Bluray player.
> 
> So, however you want to look at it the consoles have some clear drawbacks.
> 
> Also, consoles as PCs. That's not something historically foreign. The PS2 did that (they even sold kits to make it one!) and the PS3 could run Linux.
> Aren't all iGPUs chronically bandwidth starved? CPUs not so much.
> Yes and that's nice but you should never forget some problems are inherently serial or that multithreading a game is really hard.
> I believe it was said earlier that the consoles can't employ HSA.


Embarrassing for who? Sony and MS, who are selling millions of them? Console-only gamers that don't care about PCs? PC gamers that own consoles and are the real Master Race compared to PC-only folks?

I haven't forgotten, but being built from the ground up with a task in mind helps a lot.

The PS4 can; it's speculated that the XB1 can't.


----------



## iSlayer

Quote:


> Originally Posted by *iTurn*
> 
> Embarrassing for who? Sony or MS that's selling millions of them? Console only gamers that don't care about PCs? PC gamers that own consoles and are the real Master Race compared to PC only having folks?
> 
> I haven't forgotten but being built from the ground up with a task in mind helps alot.
> 
> The PS4 can, it's speculated that the XB1 can't.


More so referring to games on them not even being 1080p or 60 fps.

It's nice MS and Sony are making so much $, but I don't really care about them unless their wellbeing threatens the consumer.


----------



## iTurn

Quote:


> Originally Posted by *iSlayer*
> 
> More so referring to games on them not even being 1080p or 60 fps.
> 
> Its nice MS and Sony are making so much $ but I don't really care about them unless their wellbeing threatens the consumer.


I only wish for 60fps in certain games, racers/shooters/ARPG (diablo 3)/fast action games (DMC). I don't really 'need' 60fps otherwise. PS4 only has a few non-1080p games.

Yes, I wish all were 1080p, but I'm not embarrassed by it.

I do get your complaint, and yes, I wish they were more powerful, but I don't see anyone that bought the systems complaining about the lack of graphical quality.


----------



## Orangey

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> PS3 was never $700 (except on ebay). I believe the two launch versions were $500 and $600.


Surely everyone must remember *599*

It was a popular meme for a couple of years.


----------



## SpeedyVT

I try to limit my console game purchases to exclusives or titles my friends are playing at the time. Consoles are great, and should benefit my PC games once developers learn to better exploit the hardware.


----------



## srenia

"The GPU contains AMD graphics technology supporting a customized version of Microsoft DirectX graphics features. Hardware and software customizations provide more direct access to hardware resources than standard DirectX. They reduce CPU overhead to manage graphics activity and combined CPU
and GPU processing"

The whole "HSA isn't in the consoles" claim is false when talking about the Xbox One SoC. The quote above is from the IEEE journal; it's part of an article that goes in depth on the Xbox One SoC and Kinect. It's an industry journal that is highly accurate and peer reviewed. MS spent billions in research on that system. That doesn't mean PCs won't catch up to the One, but the One is the first to use the tech.

Definitely suggest anyone who wants to research it more go to the journal's website. It's under a paywall.


----------



## Blameless

I have no idea to what degree HSA is supported on Xbox One, but that quote is extremely ambiguous and may not be making any references to HSA at all.


----------



## srenia

"System software and hardware keep page tables synchronized so that CPU, GPU, and other processors can share memory, pass pointers rather than copying data, and a linear data structure in a GPU or CPU virtual space can have physical pages scattered in DRAM and SRAM. The unified memory system frees applications from the mechanics of where data is located, but GPU-intensive applications can specify which data should be in SRAM for best performance."

There are other quotes in the article as well. It's HSA: hardware and software working together. In the previous post, did you notice the custom hardware controlling the CPU and GPU?

Outside of the HSA talk, it does 64 workloads versus the eight the PS4 can do.

MS's C++ AMP is being used for HSA, and DirectX 12 is a hardware change as well. Both software packages are going to be used in games this next year and used further as developers get used to them. It surprises me that anyone could think MS didn't include these features in its own Xbox One.


----------



## umeng2002

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> PS3 was never $700 (except on ebay). I believe the two launch versions were $500 and $600.


Yeah, but it cost them nearly $1000 each to make. So this generation, they were pretty much breaking even because, well, Sony is in no position to take losses on anything. Imagine what could have been if they could take the losses and sell a $1000 PS4 for $500 or $600. It sure as **** wouldn't have an APU.


----------



## WhiteCrane

Quote:


> Originally Posted by *EliasAlucard*
> 
> You're wrong, and I'm going to correct you.
> 
> It's true that Intel has added some RISC-like functionality over the years, but pretty much all x86-64 CPUs (even the low powered ones) are still more or less purely CISC, because unlike RISC, they're still everything but energy efficient. In fact, x86-64 is still the only real CISC player in town nowadays. In everything else, they've understood the importance of RISC. Most mobile devices use RISC (except the piece of junk with Intel inside no one is buying, thank God) and so do most video game consoles (except for the original Xbox and now this current generation of PS4/XB1, sadly). It's mostly on the Windows desktop/laptop, we're still burdened by this CISC crap, and that's the result of several decades now of proprietary vendor lock-in by the Wintel duopoly. Microsoft and Intel could end the Wintel nonsense tomorrow if they really wanted to, but they're not inclined to do so because they both know that without Wintel, both Microsoft and Intel would be out of the game completely. Look at how difficult it has been for both Microsoft and Intel to establish their presence in the mobile devices scene as a case in point, with both ARM based Windows RT/Phone and Intel based Android devices seeing no success whatsoever. That's how it works when there's real market competition and no anti-competitive corporations dominating the iGnorant masses with their vendor lock-in strategies (in this case, a monopoly on software through the x86 Windows ISA/OS).
> 
> You x86 Wintel fanboys obviously haven't done your homework, and you're not capable of connecting the dots.


Slow down. I know all about SPARC and PowerPC, but their chips just aren't competitive with Intel's and AMD's offerings. Why did Apple drop PowerPC CPUs if they were superior to the x86 standard? And you're right, ARM beats it on performance per watt, but watts aren't the only thing limiting performance. Wouldn't you rather have an i7 than an ARM Samsung Exynos or Snapdragon? There's no comparison for raw power. Also, you're assuming that an ARM CPU would show leaps in performance if they simply designed one that runs at desktop wattages. Is there an example of this yet?


----------



## Darkpriest667

Quote:


> Originally Posted by *WhiteCrane*
> 
> Why did Apple drop the _Power_PC CPU's if they were superior to x86 standard?


One word: margins. PowerPC CPUs WERE (and in some cases still are, depending on the use) superior to AMD and Intel offerings. The problem? IBM wanted a premium. It's one of the reasons Apple PCs were so expensive compared to their non-Apple counterparts.

This is the reason the Xbox 360 was so expensive at its initial launch. It was running a triple-core PowerPC CPU that had hyperthreading (before Intel, mind you) and could kick the crap out of anything we (PC gamers) had.

If you ever want to know why Apple does anything, just remember the word: margins.


----------



## Orangey

Even if consoles do manage to get hold of some new chips with features not available to contemporary PCs, by the time developers can actually get to grips with the arch, PCs will have outstripped them. Not to mention the laws of physics and economics: form factor indirectly limits FLOPS/MIPS/insert-metric-of-your-choosing to such an extent that PCs will always be much more powerful and have better-looking games, while a console with a BOM comparable to a high-end gaming rig will sit on shelves gathering dust. The Neo Geo was great by all accounts, but nobody could afford one.


----------



## WhiteCrane

Quote:


> Originally Posted by *Darkpriest667*
> 
> One word: Margins. Power PC CPUs WERE (and in some cases are depending on the use) superior to AMD and Intel offerings. The Problem? IBM wanted a premium. It's one of the reasons that APPLE PCs were so expensive compared to their non apple counterparts.
> 
> This is the reason the XBOX was so expensive during its initial launch. It was running a triple core Power PC CPU that had hyperthreading (before intel mind you) and could kick the crap out of anything we (PC gamers) had.
> 
> If you ever want to know why APPL does anything just remember the word Margins.


The Intel Macs were considerably faster than the G5 Macs. Also, in a general _Power_PC versus x86 discussion, I wanted to point out that the PS3 and Xbox 360 both lagged behind the performance of cheap x86 CPUs available when they launched.

This links to an Anandtech article that was controversially pulled (through MSFT's influence, allegedly) minutes after it was posted back in 2005, for any of you who remember. Again, the developer being interviewed didn't have much good to say about the RISC CPUs in the PS3/360.

http://hardware.slashdot.org/comments.pl?sid=154333&cid=12947910


----------



## Blameless

Quote:


> Originally Posted by *Darkpriest667*
> 
> It was running a triple core Power PC CPU that had hyperthreading (before intel mind you) and could kick the crap out of anything we (PC gamers) had.


Intel's SMT implementation (Hyperthreading) predates the Xbox 360's CPU by more than three years.

Also, raw Xenon performance was not particularly impressive (it's a small, relatively cheap-to-make, low-power part). It's an extremely narrow design with little in the way of ILP, and it would likely have been outperformed by x86 parts of the time by a wide margin. Find a way to run the same benchmark suite on both parts, and a 2GHz dual-core, two-thread Athlon 64 X2 is almost certainly going to beat the snot out of the 3.2GHz tri-core, six-thread Xenon CPU in the Xbox 360.

Neither the XBox 360 nor PS3 CPUs were terribly impressive, from a performance standpoint, as CPUs.
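To make the shape of that argument concrete: aggregate throughput is roughly cores × clock × IPC, and narrow in-order cores like Xenon's have low IPC. A toy sketch; the IPC figures are illustrative assumptions, not measurements of either chip:

```python
# Toy throughput model: aggregate instructions/sec ~ cores * clock * IPC.
# The IPC values are illustrative guesses for a narrow in-order core vs. a
# wide out-of-order core -- not measured numbers for either CPU.
def aggregate_gips(cores, clock_ghz, ipc):
    """Rough aggregate throughput in billions of instructions per second."""
    return cores * clock_ghz * ipc

xenon = aggregate_gips(cores=3, clock_ghz=3.2, ipc=0.2)   # narrow, in-order
athlon = aggregate_gips(cores=2, clock_ghz=2.0, ipc=0.9)  # wide, out-of-order

print(athlon > xenon)  # True: core count and clock don't save a low-IPC design
```

Under these assumed IPC values, the lower-clocked dual-core wins on raw throughput, which is the point being made above.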
Quote:


> Originally Posted by *WhiteCrane*
> 
> The Intel Macs were considerable faster than the G5 Macs were. Also, in a general _Power_PC versus x86 discussion, I wanted to point out PS3 and Xbox 360 both lagged behind the performance of cheap x86 CPU's available when they launched.


Indeed. The last of the Apple PowerPCs could only be compared favorably to the Pentium 4s, and the console CPUs were power- and cost-reduced derivatives.


----------



## Darkpriest667

EDIT: I take it back; I was mistaken.


----------



## delboy67

The consoles are not, and are not supposed to be, HEDT gaming PCs. They're (relatively) tiny boxes that plug into your TV, designed for people who want to play games with friends and don't care about CPUs/GPUs/FLOPS, etc. The way they're spec'd is aimed mostly at form factor and power consumption, hence the APU. I think they were supposed to use even fewer watts than they do currently, but the node/fab was not ready in time; they should be sub-80W and even smaller in a year or two. That's all that matters in console land, and that's why they don't compare.
The only reason we're even talking about fps and resolution on consoles is clever marketing by Sony, because let's face it, even though its PS is more powerful, the yearly CoD/FIFA/etc. will just look the same on both, just like last gen. The games being enjoyable and looking nice used to be all that mattered on consoles; now Sony (has anyone else in the UK noticed their page in the Argos catalogue mentioning 'teraflops'? ***? Never seen this before) has started using PC-style benchmarketing.
Hey, hope it doesn't backfire, Sony! So far this year I alone have used your *exact* marketing, which otherwise ignorant people are now starting to understand, to get at least 5 PS diehard friends to just build gaming PCs instead of buying PS4s. Once more sheep learn about these 'frames' and 'resolutions', their path to PC heaven will be clear.


----------



## Carniflex

Quote:


> Originally Posted by *delboy67*
> 
> ...
> Hey hope it doesnt backfire sony! so far this year I alone have used your *exact* marketing, that otherwise ignorant people are now starting understand, to get at least 5 ps diehard friends to just build gaming pcs instead of buying ps4s, once more sheep learn about these 'frames' and 'resolutions' thier path to PC heaven will be clear.


It will probably be a few years until 4K is common enough, but when it gets "close enough" to 1080p prices it will probably be one of the factors luring people over to the PC side. Already today one can get a reasonably good 60 Hz 4K screen for less than 500 with a little patience (Asus 28''), with the cheapest offers around 400 here (Dell 28'', though it's said to have horrific input lag because of legacy connectors, even when using DisplayPort).


----------



## srenia

http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&tp=&arnumber=6756701&queryText%3Dxbox+one

I'd suggest anyone serious about what the Xbox One has read this. Basically, full HSA and DirectX 12 are part of the One. It's a peer-reviewed article backed by MS. It's also coming to PCs soon. Read it, then read up on the DirectX 12 and HSA demos. The One has some tech that is yet to be fully tapped. Soon PCs as well.


----------



## Orangey

I can see a future where NV get a console contract and go about blaming AMD for the failure of this gen. They'd say things like "it was irresponsible of AMD to accept the margins they did" blah blah blah. Now gib 800USD for cut-down GM204.


----------



## srenia

Quote:


> Originally Posted by *srenia*
> 
> http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&tp=&arnumber=6756701&queryText%3Dxbox+one
> 
> I'd suggest anyone serious about what the Xbox One has read this. Basically, full HSA and DirectX 12 are part of the One. It's a peer-reviewed article backed by MS. It's also coming to PCs soon. Read it, then read up on the DirectX 12 and HSA demos. The One has some tech that is yet to be fully tapped. Soon PCs as well.


Quote:


> Originally Posted by *Orangey*
> 
> I can see a future where NV get a console contract and go about blaming AMD for the failure of this gen. They'd say things like "it was irresponsible of AMD to accept the margins they did" blah blah blah. Now gib 800USD for cut-down GM204.


Well, with this info, you realize that it's the Xbox One that is driving adoption of HSA and DirectX 12 programming? The chicken-and-egg scenario is eliminated, so the consumer benefits, which in turn benefits MS. The biggest revolution in programming and hardware is being led by the demonized MS. MS certainly deserves a round of applause for this achievement they are in the middle of.


----------



## amstech

Lol.
People get a console so it's not running all the junk a PC has to run.
Now, it looks like people will get a PC to avoid all the junk a console runs!

Seriously so glad I didn't bite and get a PS4/XB1, still enjoying the PS3/360S.


----------



## WhiteCrane

Again, just throwing this out there... Original XBOX did circles around PS2 with a Pentium III processor.

I don't see the RISC advantage.


----------



## Orangey

Gamecube was RISC and was teh best.


----------



## iSlayer

Quote:


> Originally Posted by *WhiteCrane*
> 
> Again, just throwing this out there... Original XBOX did circles around PS2 with a Pentium III processor.
> 
> I don't see the RISC advantage.


733MHz P3 versus a 300MHz MIPS processor. Not a tough guess that the P3 will probably be faster.


----------



## PostalTwinkie

Quote:


> Originally Posted by *WhiteCrane*
> 
> Again, just throwing this out there... Original XBOX did circles around PS2 with a Pentium III processor.
> 
> I don't see the RISC advantage.


Yea, but the Xbox didn't use just any Pentium III; it used the latest Coppermine-based Pentium III, which had significant improvements over the earlier Pentium IIIs. It also ran at more than twice the clock speed of the processor in the PS2.

Now, I can't comment on how much of a difference the Xbox being x86 and the PS2 using MIPS makes. Something tells me, though, that MIPS can't make up for less than half the clock speed. Then again, even the Xbox GPU had more raw throughput potential than the PS2's.

At the time of assembly, the Xbox used higher-tier computer hardware relative to computers of the same time frame. They sure weren't shipping with four-year-old performance levels like the current generation.


----------



## SpeedyVT

Quote:


> Originally Posted by *iSlayer*
> 
> 733MHz P3 versus a 300MHz MIPS processor. Not a tough guess that the P3 will probably be faster.


MIPS, to my knowledge, was faster than a P3. It was a more refined architecture that gave better IPC.


----------



## Pip Boy

Quote:


> Originally Posted by *Orangey*
> 
> Gamecube was RISC and was teh best.


and it's a good board game too


----------



## PostalTwinkie

Quote:


> Originally Posted by *SpeedyVT*
> 
> MIPS by my knowledge was faster than a P3. It was a more defined architecture that gave better IPC.


I am going to wager that the IPC of MIPS doesn't make up for having less than half the clock speed, or for the lower overall throughput of the GPU. More efficient than the Pentium III? Sure, I imagine it is. Raw power? Don't think it had it, could be wrong though.
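That wager is just the classic throughput identity, performance ≈ IPC × clock. A quick sketch; both IPC numbers are made-up placeholders, and even granting the MIPS core a 50% IPC edge, the clock gap dominates:

```python
# Single-core throughput ~ IPC * clock. The IPC values are placeholders,
# not measured figures for either processor.
def instr_per_sec(ipc, clock_hz):
    return ipc * clock_hz

p3 = instr_per_sec(ipc=1.0, clock_hz=733e6)   # Coppermine Pentium III, 733 MHz
ee = instr_per_sec(ipc=1.5, clock_hz=295e6)   # PS2 Emotion Engine core, ~295 MHz

print(p3 > ee)  # True: a 50% IPC advantage doesn't cover a ~2.5x clock deficit
```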


----------



## iSlayer

Quote:


> Originally Posted by *SpeedyVT*
> 
> MIPS by my knowledge was faster than a P3. It was a more defined architecture that gave better IPC.


Quote:


> Originally Posted by *PostalTwinkie*
> 
> Yea, but the XBox didn't use just any Pentium III, but the latest Coppermine based Pentium III. Which had significant differences over the earlier Pentium IIIs that increased performance. It also ran at twice the clock speed as the processor in the PS2.
> 
> Now I can't comment on how the Xbox being x86 and the PS2 using MIPS makes a difference, or rather how much. Something tells me though that MIPS can't make up for half the clock speed. Then again, even the Xbox GPU had more raw throughput potential than the PS2.
> 
> At the time of assembly, the Xbox used higher tier computer hardware when scaled to computers of the same time frame. They sure weren't shipping with 4 year old performance levels like the current generation.


Isn't it known that the Xbox had the best hardware of that generation? Even the GameCube was more powerful than the PS2.


----------



## hellojustinr

Hardware schmardware. Let's just be happy the Xbox One and PS4 came out, so consoles moved on to higher-res textures, which in turn means more mainstream games will focus not just on better-looking graphics (since that will be the standard now) but on better gameplay as well.

I can see the difference easily moving from the Xbox 360 copy of GTA V to the Xbox One copy of GTA V.

Point blank: I bought a next-gen console, and most of the games I play on PC look nearly as good on my Xbox One in regard to textures.

NBA 2K15 anyone
GTA V anyone
Destiny anyone
COD Advanced Warfare anyone
Far Cry 4 anyone
AC Unity anyone (ok, maybe not AC Unity; it is running horribly on consoles at the moment, but so are the PC versions)
Crysis 3 anyone

Granted, GTA V on PC is still going to look amazing at 1080p and 60FPS HAHA, but I'm happy for the industry as a whole that it is FINALLY moving to higher-res textures as standard. It feels like the whole industry was at a standstill, promoting games that looked soooo good, but when it came down to consumer hands the only mainstream platforms available were the 2005/2006 Xbox 360 and PS3, running those games at low texture resolution and 720p.

At least now there's an outlet for higher-res-texture games, and with all due respect to the PC community, the Xbox One and PS4 games look amazing. Those exclusives are gonna look amazing, and it's a win-win for the gaming community as a whole.


----------



## PostalTwinkie

Quote:


> Originally Posted by *hellojustinr*
> 
> Hardware schmardware lets just be happy Xbox ONE and PS4 came out so consoles moved onto higher res textures, which in turn means more mainstream games will have a focus on not just better looking graphics (since it will be the standard now) but better gameplay as well.
> 
> I can see the difference easily moving from an Xbox 360 copy of GTA V to the Xbox ONE copy of GTA V.
> 
> Point blank, I bought a next gen console and most of the games I play on PC look nearly as good on my Xbox ONE in regards to textures.
> 
> NBA 2K15 anyone
> GTA V anyone
> Destiny anyone
> COD Advanced Warfare anyone
> Far Cry 4 anyone
> AC Unity anyone (ok maybe not AC Unity, it is running horribly on consoles at the moment but so are PC versions)
> Crysis 3 anyone
> 
> Granted GTA V on PC is still going to look amazing though at 1080p playing at 60FPS HAHA but I'm happy for the industry as a whole that it is FINALLY moving onto higher res textures as standard. It feels like the whole industry as a whole was at a standstill just promoting games that looked soooo good but when it came down to consumer hands the only mainstream platforms available were the 2005/2006 Xbox 360 and PS3 running those games at texture resolution: low and 720p graphics
> 
> At least now they have a outlet for higher res texture games and with all due respect to the PC community, the Xbox One and PS4 games look amazing. Those exclusives are gonna look amazing and its a win-win for the gaming community as a whole


You are running on about hi-res textures on the console when they aren't even able to pull off 1920x1080, while PC gaming is pushing 4K.

You are a bit silly to say your consoles look as good as your PC gaming, considering your sig rigs. Then again, about every game you listed was a console-to-PC port, so that is expected.


----------



## WhiteCrane

I personally thought the GameCube was the most powerful of the three in that generation, with its 485MHz PowerPC CPU. RISC CPUs do perform better clock for clock, or they used to, anyway.

I don't think the differences are as pronounced as they used to be.


----------



## iSlayer

I've gotten used to qualifying "console games look great!" with a "relatively speaking." It doesn't help that mid-range PC hardware can push better graphics and framerates.

As we move further into the generation, the performance gap will only grow.


----------



## hellojustinr

Quote:


> Originally Posted by *PostalTwinkie*
> 
> You are running off about Hi-res textures on the console, when they aren't even able to pull off 1920x1080 and PC gaming is pushing 4K.
> You are a bit silly to say your consoles look as good as your pc gaming, considering your sig rigs. Then again about every game you listed was a Console > PC game, so that is expected.


Because they are

Being realistic here, I don't know why people are bashing on consoles

And of course I will list those games. I'm not gonna list a PC exclusive, since it wouldn't be comparable (what am I going to compare it to?)


----------



## hellojustinr

Quote:


> Originally Posted by *iSlayer*
> 
> I've gotten used to qualifying "console games look great!" with a relatively speaking. It doesn't help that mid-range hardware can push better graphics and framerates.
> 
> As we move farther away the performance gap will only grow.


The performance gap will grow for us PC gamers, not the console gamers.

They are just gonna sit pretty regardless of all this talk, just like last generation, when they said the 8800GTX was gonna kill the PS3 and Xbox 360 and it didn't. phew


----------



## Orangey

Pretty sure the problem with the consoles is not the GPU but the CPU; it is a cores-over-IPC design. Strangely, Sony made this mistake with the PS3, and now both of them have bungled it again this gen.

PS3 development matured around year three or so; we should wait until 2016 to see what developers can really wrangle out of Jaguar as a baseline going forward.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Orangey*
> 
> Pretty sure the problem with the consoles is not GPU but CPU, it is a cores over IPC design. Strangely Sony made this mistake with PS3 and now both of them have bungled it again this gen.
> 
> PS3 development matured in year 3 or so, we should be waiting until 2016 to see what developers can really wrangle out of Jaguar as a baseline going forward.


This argument was thoroughly dismantled earlier in this thread, toward the start.

We will not see the same level of maturation out of this generation as we have from previous generations. There are several reasons for that, a few of which go a bit deeper than surface specs and environments, and have to do with the timing of new technologies in x86.


----------



## aberrero

It will be really interesting to see if MS or Sony are aggressive about refreshing hardware this time around. They can easily incorporate a new AMD chip a couple of years from now and ship a new console that is completely backwards compatible but runs games at 60fps instead of 30fps, or whatever.


----------



## aberrero

Quote:


> Originally Posted by *PostalTwinkie*
> 
> This argument has been completely dismantled before, in this thread towards the start.
> 
> We will not see the same level of maturation out of this generation as we have previous generations. There are several reasons for it, a few of which are a bit deeper than just surface specs and environments, and have to do with timing of new technologies in x86.


I just got LittleBigPlanet 3 and, frankly, it looks pretty bad. Since it is designed to run on the PS3 as well as the PS4, it has low-res textures and other issues. It's possible to make a better-looking game today, but once developers no longer have to worry about supporting last-gen hardware, I think we will see a pretty big uptick in visuals.

Of course, even with the ugly looks, LBP3 sometimes lags and has tons of incredibly long loading screens, so who knows?


----------



## jsc1973

Quote:


> Originally Posted by *hellojustinr*
> 
> Being realistic here I don't know why people are bashing on consoles


Because you're on OCN, where a lot of the membership is made up of people who game on PC hardware and who value the higher performance that PC hardware typically offers over consoles. I haven't owned a console since before a lot of the people posting here were born.

If someone comes here and extols the virtues of console gaming, they're likely to get shouted down. Would you go into the Haswell-E owners' thread here and tell them they'd all be better off running AMD APUs instead of X99?


----------



## hellojustinr

Quote:


> Originally Posted by *PostalTwinkie*
> 
> This argument has been completely dismantled before, in this thread towards the start.
> 
> We will not see the same level of maturation out of this generation as we have previous generations. There are several reasons for it, a few of which are a bit deeper than just surface specs and environments, and have to do with timing of new technologies in x86.


Where is this argument dismantled (earlier in the thread?)? That is a very concise summary of this generation.

EDIT: nvm, re-read it. gonna read from the start


----------



## hellojustinr

Quote:


> Originally Posted by *jsc1973*
> 
> Because you're on OCN, where a lot of the membership is made up of people who game on PC hardware, and who value the higher performance that PC hardware typically offers over consoles. I haven't owned a console since before a lot of the people posting here were born.
> 
> If someone comes here and extols the virtues of console gaming, they're likely to get shouted down. Would you go on the Haswell-E owners' thread here and tell them they'd all be better off if they ran AMD APU's instead of X99?


What I am trying to get across is that consoles getting better also benefits us PC gamers, so there is no reason to bash whatsoever. The only reason people bash is to feel superficially superior one way or another over our "console" brethren, which is sooooooooooooo stupid.

And yes, if the situation called for an AMD APU, I would not rule out recommending one (say, on a very limited budget).

I get what you are getting at nonetheless, and you have a valid point.


----------



## Stormscion

AMD disappoints again?


----------



## Blameless

Quote:


> Originally Posted by *SpeedyVT*
> 
> MIPS by my knowledge was faster than a P3. It was a more defined architecture that gave better IPC.


The Emotion Engine's MIPS R5900 CPU core is pretty weak, but it also had two VPUs (vector processing units) alongside it.
Quote:


> Originally Posted by *PostalTwinkie*
> 
> I am going to wager that the IPC of MIPS doesn't make up for having less than half the clock speed on the processor, and less overall throughput of the GPU as well. More efficient than the Pentium III? Sure, I imagine it is. Raw power? Don't think it did, could be wrong though.


As a general purpose processor, the Pentium III was probably faster, but the VPUs would have made the EE quite a bit better at some FP intensive tasks.
Quote:


> Originally Posted by *WhiteCrane*
> 
> RISC CPU's do perform better clock for clock, or they used to anyway.


RISC doesn't mean much of anything for performance, in and of itself. It implies a simpler chip with potentially a bit less overhead, which can hint at more performance per transistor in an ideal scenario, but RISC is ultimately just a vague design philosophy. There's huge variety in the IPC of different designs and implementations, regardless of RISC vs. CISC.
Quote:


> Originally Posted by *hellojustinr*
> 
> Being realistic here I don't know why people are bashing on consoles


I'm not bashing on consoles; they do what they do and they do it pretty well.

It's just silly to expect processors designed for low-heat/low-power applications and/or limited-function roles to magically be more powerful general-purpose designs than contemporaneous general-purpose processors with higher transistor counts and fewer operating constraints.


----------



## jsc1973

Quote:


> Originally Posted by *hellojustinr*
> 
> What I am trying to get across is that consoles getting better also benefits us PC gamers so there is no reason to bash whatsoever. The only reason people bash is to feel superficially superior one way or another over our "console" brethrens which is sooooooooooooo stupid.
> 
> And yes I would if the situation needed an AMD APU I would not rule out the possibility recommending an AMD APU (like per say a very limited budget?)
> 
> . I get what you are getting across to me nonetheless and you have a valid point


There are some people who are dogging console users out of some weird sense of "superiority," but I've also seen console gamers post here telling PC gamers how dumb they are because consoles are cheaper and have exclusive games and whatnot. I don't have much of a dog in the fight; I haven't personally owned consoles since high school (which I graduated from in 1990), and I rarely even play anything on a PC anymore, but I keep up with it because I build and recommend for people who do. I can build someone a PC that will offer a better gaming experience and do more than a PS4 or XB1 for very little additional money, so that's the advice I give.

Anyone who's not a pathological Intel fanboy would recommend an AMD APU in the right situation. I'm running one right now, albeit as if it were a 760K. But I wouldn't go on a thread full of 5930K users and tell them they were dumb because my CPU is much cheaper and can play games too. (They'd probably ignore me as an idiot/troll, which I would be if I did that.) That's what I've seen console gamers do on threads like this, and they deserve to get bashed if they do it. People who have rational discussions on the issue don't deserve to be bashed, but I don't see much of that around here.


----------



## DRT-Maverick

Console vs. computer ultimately comes down to preference and personal choice. I like gaming on a PC, as I prefer using a mouse and keyboard for most games (except for racing games and running, for which I prefer a controller). A console would be fun to have, maybe someday when I get a decent TV; however, the money I invest in my PCs could have afforded me a nice TV and console without a problem. Again, it comes down to preference and use.

Consoles are useful. They're somewhat on the affordable side and can be plugged into just about any modern TV out there, no special software or anything required; they're much more convenient to set up, plus a console also ends up being an entertainment system and a way to play YouTube videos on the TV and such. There are definite bonuses to consoles, and gaming on them isn't bad.

I personally prefer gaming on a PC, though, so I opt for a PC. This won't stop me from picking up a controller at a friend's house and playing on a console. Whether I feel it necessary to purchase a console of my own is a different story: when you've already got a PC that can outperform a console tenfold, and you don't already own a console, getting one becomes a little less important to you.

This doesn't mean I wouldn't buy a console. PC games are hard to play multiplayer with friends in the same room; consoles are great for video game socialization, whereas PCs are pretty solitary.


----------



## zGunBLADEz

Fuuny ubisoft claim this crap when they cannot optimize a game on PC which is broader and more powerful..

Cough Assasins Creed, Far cry, or lets talk about watch dogs..


----------



## hellojustinr

Quote:


> Originally Posted by *DRT-Maverick*
> 
> Console vs computer ultimately comes down to preference and personal choice. I like gaming on a PC as I prefer using a mouse and keyboard for most games (except for racing games and running I prefer a controller). A console would be fun to have, maybe someday when I get a decent TV, however the money I invest in my PCs could have afforded me a nice TV and console without a problem, again it comes down to preference and use.
> 
> Consoles are useful, they're somewhat on the affordable side and can be plugged into just about any modern TV out there, no special software or anything required, it's much more convenient to setup, plus it also ends up being an entertainment system, and a link for playing youtube videos on TV and stuff. There are definite bonuses to consoles, and gaming on them isn't bad.
> 
> I personally prefer gaming on a PC though, so I opt to using a PC. This won't stop me from picking up a controller at a friend's house and playing on a console. Whether I feel it necessary to purchase a console of my own is a different story- when you've already got a PC that can outperform a console tenfold, and you don't already own a console, then getting a console is a little bit less important to you.
> 
> This doesn't mean I wouldn't buy a console- PC games are hard to play multiplayer with friends over, consoles are great for videogame socialization, whereas PCs are pretty solitary.


I like having both; they can and do co-exist. I don't get the people who bash the consoles or their processing power. I was not expecting near-R9 290 performance, or even GTX 680 performance, inside those little boxes, even after waiting nine years (7th gen to 8th gen). I am perfectly happy with the current state of the Xbox One (and PS4).

Granted, I can see why people are mad: "next-gen" in the PC community might translate to 4K gaming, while in the console crowd it means better, higher-resolution textures at the same resolution or higher (720p-1080p). Looking at both sides of the argument, I would be mad too; it seems we were led to believe the consoles could do 4K, which in reality they most likely cannot. Considering 4K TVs are becoming mainstream now, we may not have a console natively outputting a 4K feed to those TVs, which decreases the appeal of 4K as a whole. But I think 1080p upscaled to 4K will look just fine with the better textures, so I'm not too mad about it (can't be lol, I don't own a 4K TV yet).

The 7th-gen games looked, at best, like improved 6th-gen games upscaled to HD in comparison to the current next-gen games, which I think utilize the HD resolution's potential to the max.


----------



## Zyro71

You know, for the trolling scenarios, the best way to settle the quality-of-games debate would be to either:
A. Place graphical settings options within games on the Xbox One and PlayStation 4, giving the ability to run at 60fps or at higher-quality 30fps, etc. (it would be amazing to see 3D on the consoles).
Or B. On PC, have a setting that says "Choose this for console-quality settings," which would change everything to match the consoles.


----------



## Zyro71

Quote:


> Originally Posted by *Stormscion*
> 
> amd disappoints again ?


Well, think about that for a moment. Sony and Microsoft went the budget route to actually get a console done.

If they had gone for something more powerful, who knows what they would have gotten... and the price would most likely have been higher too.

Say they went with an i3 or an Intel Atom: they couldn't lean on Intel HD graphics (or they could with the Iris Pro), but in the end the price would still be expensive.

They could also pair it with AMD or Nvidia graphics; however, again, the overall price would be higher in the long run, and since these companies want a profit, they don't want to be selling the console at a loss.

In the end, on Microsoft's side, they will be revamping the console soon, so they could easily take a loss until new parts let them drop the console price further.


----------



## Skrillex

I just can't trust this article.

A 1-2 FPS difference between the consoles, yet the Xbox One has 768 shader units and the PS4 has 1152; that's 50% more.

I'm sorry, it just doesn't wash: either they are coding improperly for the PS4 or telling huge lies.


----------



## lacrossewacker

Quote:


> Originally Posted by *Skrillex*
> 
> I just can't trust this article.
> 
> A 1-2 FPS difference between the consoles, yet the Xbox One has 768 shader units and the PS4 has 1152; that's 50% more.
> 
> I'm sorry, it just doesn't wash: either they are coding improperly for the PS4 or telling huge lies.


well, that's what a CPU bottleneck does. The PS4 could have a GeForce Titan in there; it still wouldn't help it produce more frames.
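In other words, a frame only ships when both the CPU and GPU work for it is done, so frame rate is capped by the slower of the two. A minimal sketch with illustrative frame times (not profiled numbers):

```python
# Frame rate is limited by whichever processor takes longer per frame.
# The millisecond figures below are illustrative, not profiled numbers.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound at ~33 ms/frame: doubling GPU speed changes nothing.
print(round(fps(cpu_ms=33.0, gpu_ms=20.0), 1))  # 30.3
print(round(fps(cpu_ms=33.0, gpu_ms=10.0), 1))  # 30.3
```

That's why a big shader-count advantage can shrink to a 1-2 FPS difference when the CPU is the limiting factor.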


----------



## WhiteCrane

Quote:


> Originally Posted by *lacrossewacker*
> 
> Well, that's what a CPU bottleneck does. The PS4 could have a GeForce Titan in there; it still wouldn't help it produce more frames.


Why would Sony spend the money on the extra shaders if they weren't usable? Their engineers don't know what a bottleneck is?


----------



## lacrossewacker

Quote:


> Originally Posted by *WhiteCrane*
> 
> Why would Sony spend the money on the extra shaders if they weren't usable? Their engineers don't know what a bottleneck is?


Assassin's Creed's goal of pumping as many NPCs into the game world as possible was either too ambitious or poorly implemented. I'm inclined to assume it was too ambitious; there's no other candidate implementation to compare theirs to.

Ubisoft would have had the same effect if they had dialed that _feature_ back by half.

On the other hand, games like Tomb Raider can run on pretty much any CPU and still maintain at least 60fps (GPU dependent), which is where a PS4 would excel relative to the X1.


----------



## Carniflex

Quote:


> Originally Posted by *WhiteCrane*
> 
> Why would Sony spend the money on the extra shaders if they weren't usable? Their engineers don't know what a bottleneck is?


It's more of an issue at the game engine level. If the engine is the one the company has been using for the last decade, copy-pasting stuff around it and relying hugely on single-threaded performance, then there is not a lot Sony can do about it. It's a game developer problem in this case: using a screwdriver to drive a nail into the wall.


----------



## WhiteCrane

Quote:


> Originally Posted by *lacrossewacker*
> 
> Assassin's Creed's goal of pumping as many NPCs into the game world as possible was either too ambitious or poorly implemented. I'm inclined to assume it was too ambitious; there's no other candidate implementation to compare theirs to.
> 
> Ubisoft would have had the same effect if they had dialed that _feature_ back by half.
> 
> On the other hand, games like Tomb Raider can run on pretty much any CPU and still maintain at least 60fps (GPU dependent), which is where a PS4 would excel relative to the X1.


So you're saying excessive use of the CPU caused the bottleneck. Too many AIs. The CPU was doing too many things instead of just focusing on keeping up with the GPU.


----------



## lacrossewacker

Quote:


> Originally Posted by *WhiteCrane*
> 
> So you're saying excessive use of the CPU caused the bottleneck. Too many AIs. The CPU was doing too many things instead of just focusing on keeping up with the GPU.


Yep. It's when the crowds are present (even if they aren't drawn on screen) that the game drops to a consistently lower fps. This is also where the X1 commands a lead, thanks to its marginally faster CPU.

The reverse is true in GPU heavy scenes.

Nothing unusual.


----------



## azanimefan

No, this is Ubi coding the game to use 2.5 cores on consoles that give games six of their eight cores; that is what causes this problem. Of course it's CPU bound when you use a decade-old game engine heavily modded to look new.

Ubi needs to build a new engine that makes use of the MOAR CORZ! in the new game consoles. Until they do, their games will run like crap on both of them.


----------



## WhiteCrane

Ok. It sounds to me like Ubisoft created a bottleneck where there wasn't one. Do consoles struggle with crowds?


----------



## PostalTwinkie

Quote:


> Originally Posted by *WhiteCrane*
> 
> Why would Sony spend the money on the extra shaders if they weren't usable? Their engineers don't know what a bottleneck is?


Marketing, really.

It cost them a whole whopping few cents to a buck more per console to have that extra hardware in it, and they get to advertise and scream about how superior the PS4 is to the XBO. There is a reason the PS4 is stomping the XBO right now, it isn't because it is really that much better, but they can just advertise it as far superior.

It is pretty brilliant to be honest.


----------



## Neo Zuko

This generation of consoles was not about graphics. It was about having the RAM to do more side features while you play: things like cross-game chat on PSN, instant videos uploaded to YouTube, Twitch, etc. Features like the Share button.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Neo Zuko*
> 
> This generation of consoles was not about graphics. It was about having the RAM to do more side features while you play: things like cross-game chat on PSN, instant videos uploaded to YouTube, Twitch, etc. Features like the Share button.


Which would be fine, if doing so didn't compromise the ability to perform its main task - to play video games.


----------



## Systemlord

Quote:


> Originally Posted by *Zyro71*
> 
> Well think about that for a moment. Sony and Microsoft went the budget route to actually get a console done.
> 
> If they went for something more powerful who knows what they would of gotten..and the price also would be more expensive too most likely.
> 
> Like if they went with an i3 or an intel atom or something they could not use intel HD graphics,, or they could with the iris pro, but in the end the price would still be expensive.
> 
> They would also be able to use AMD or nvidia graphics, however, again, the overall price would be more expensive in the long run, and being that these companies want a profit, they would not want to be selling the console at a loss.
> 
> In the end on Microsoft side. they will be revamping the console, soon, so they could easily take a loss until products are made to drop the console price more.


R&D is expensive when designing new consoles and then getting everything to work together; this is why MS and Sony went the cheap route this time. There's no way these consoles will live as long as the last generation.
Quote:


> Originally Posted by *Carniflex*
> 
> It's more of an issue at the game engine level. If the engine is the one the company has been using for the last decade, copy-pasting stuff around it and relying hugely on single-threaded performance, then there is not a lot Sony can do about it. It's a game developer problem in this case: using a screwdriver to drive a nail into the wall.


Call of Duty is the first thing that comes to mind when reading your post about copy-and-paste games. It's time to break some new ground with updated engines instead of recycling the old ones!


----------



## WhiteCrane

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Marketing, really.
> 
> It cost them a whole whopping few cents to a buck more per console to have that extra hardware in it, and they get to advertise and scream about how superior the PS4 is to the XBO. There is a reason the PS4 is stomping the XBO right now, it isn't because it is really that much better, but they can just advertise it as far superior.
> 
> It is pretty brilliant to be honest.


I disagree, as the average user does not understand the difference between hardware architectures, and marketing can just as easily spin the facts about an openly less powerful system to make it sound better by simply highlighting its strengths, e.g. the Sega Saturn having two CPUs, or the Cell in the PS3 having a theoretical 2 TFLOPS against the Xbox 360's Xenon at only 1 TFLOP.

Marketing nonsense doesn't require actual investment.


----------



## Neo Zuko

I don't see why the GPU can't be upgradable at this point with consoles. They are basically mini computers, and the manufacturers want 10-plus-year life cycles. They could make it so that you have a wave-one GPU and games, then wave two in 3-5 years, then a final wave three.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Marketing, really.
> 
> It cost them a whole whopping few cents to a buck more per console to have that extra hardware in it, and they get to advertise and scream about how superior the PS4 is to the XBO. There is a reason the PS4 is stomping the XBO right now, it isn't because it is really that much better, but they can just advertise it as far superior.
> 
> It is pretty brilliant to be honest.


I'm more inclined to think the biggest reasons for the lead were the $100 difference in price and Microsoft's decisions prior to the console's release. They caught a lot of flak for those, and many people switched to PlayStation.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Neo Zuko*
> 
> I don't see why the GPU can't be upgradable at this point with consoles. They are basically mini computers and they want 10 year plus life cycles. They could make it so that you could have a wave one GPU and games, then in 3-5 years wave two, then a final wave three.


Just ask Sega what happens when you continuously release new upgraded consoles every 2 to 3 years. These are consoles, not PCs. Why would these companies want to take away the main advantage a console has over a PC?


----------



## Neo Zuko

Now it's different, because consoles are much more expensive, with hard drives and RAM, and generations are lasting much longer than four to five years.


----------



## umeng2002

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Marketing, really.
> 
> It cost them a whole whopping few cents to a buck more per console to have that extra hardware in it, and they get to advertise and scream about how superior the PS4 is to the XBO. There is a reason the PS4 is stomping the XBO right now, it isn't because it is really that much better, but they can just advertise it as far superior.
> 
> It is pretty brilliant to be honest.


Adding more GCN _cores_ isn't some BS marketing move. That is what separates a GTX 970 from a GTX 980. The PS4 has 50% more than the X1. That is substantial, and most games on the consoles have shown it.

Blame Ubi for being too ambitious, or not technically proficient enough, to make Unity run well. Unless they open up the code, we'll never know.

Everyone into PC gaming during the PS360 era KNEW that AAA third parties were, and still are, not technically proficient with anything beyond the 2005-level tech in the PS360. During that time the PC had more cores, better GPUs, etc., while the big third parties were happily releasing games that lived on a highly clocked dual core.
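For a rough sense of how much 50% more shaders should matter: peak throughput scales as shaders x 2 FLOPs per clock x clock speed. This sketch uses the widely reported GPU clocks (800 MHz PS4, 853 MHz Xbox One) and deliberately ignores memory bandwidth and everything else that shapes real performance:

```python
def peak_tflops(shaders, clock_ghz, flops_per_clock=2):
    """Peak single-precision throughput in TFLOPS for a GCN-style GPU."""
    return shaders * flops_per_clock * clock_ghz / 1000.0

ps4 = peak_tflops(1152, 0.800)  # ~1.84 TFLOPS
xb1 = peak_tflops(768, 0.853)   # ~1.31 TFLOPS
print(ps4 / xb1)                # ~1.4x on paper, before any real-world factors
```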


----------



## perfectblade

Quote:


> Originally Posted by *iSlayer*
> 
> Isn't it known that the Xbox had the best hardware of that generation? Even the GameCube was more powerful than the PS2.


It depends on how you define best. I bought one GameCube, one PS2, and two Xboxes. Guess which ones still work today.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Neo Zuko*
> 
> Now it's different because they are much more expensive with hard drives and ram and generations are lasting much longer than four to five years.


I don't think the PS4 and Xbox One are "much more" expensive than previous consoles. Console prices have been relatively stable over the past two decades outside of a few outliers. The PS1 launched in 1994 (1995 in the US) for $299, which is about $470 in 2013 money. The SNES was $199 back in 1991, which is $342, and the Genesis was about the same. Then you have the crazy ones: the Sega Saturn at $619 and the 3DO at $1,143 in adjusted dollars...
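The adjustment above is just launch price times a CPI multiplier. A quick sketch, where the multipliers are back-of-the-envelope approximations of average US CPI rather than official figures:

```python
# Approximate CPI multipliers to 2013 dollars (assumed, not official data).
CPI_TO_2013 = {1991: 1.72, 1993: 1.635, 1994: 1.572, 1995: 1.551}

def in_2013_dollars(price, launch_year):
    """Convert a launch price to approximate 2013 dollars."""
    return round(price * CPI_TO_2013[launch_year])

print(in_2013_dollars(299, 1994))  # PS1 -> 470
print(in_2013_dollars(199, 1991))  # SNES -> 342
print(in_2013_dollars(699, 1993))  # 3DO ($699 at launch) -> 1143
```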


----------



## Orangey

Don't forget the Neo Geo... it was good, though, unlike the 3DO. The Saturn had some classics.

It seems like if you go back and look at the chip suppliers for all these old consoles, there's so much variety. A buyer's market. Nowadays it's just AMD, IBM, and maybe NVIDIA, if you want to accept their Apple-like terms.


----------



## hellojustinr

Quote:


> Originally Posted by *Neo Zuko*
> 
> I don't see why the GPU can't be upgradable at this point with consoles. They are basically mini computers and they want 10 year plus life cycles. They could make it so that you could have a wave one GPU and games, then in 3-5 years wave two, then a final wave three.


Compatibility issues, and it would divide the consoles even more. Not something I believe anyone would be interested in.

Imagine reading "This game only works on consoles with the wave 2 GPU hardware upgrade." That excludes everyone who didn't buy the upgrade. It adds to the requirements and makes certain games exclusive not just to consoles but to specific hardware waves.

I would be pretty mad if GTA V on Xbox 360 ran on the Slim but ran worse, or not at all, on the original model. That would create a pretty large divide among Xbox 360 owners. Thankfully it is not like that: the Xbox 360 Slim I bought in 2011 is functionally the same (except for WiFi) as the original Xbox 360 I bought in 2007, and I can cross-play on both consoles.


----------



## hellojustinr

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> I don't think the PS4 and Xbox one are "much more" expensive than previous consoles. Console prices seem to be relatively stable over the past 2 decades outside of a few outliers. The PS1 launched in the US in 1994 for $299 which is $470 in 2013 money. The SNES was $199 back in 1991 which is $342 and the Genesis was about the same. Then you have the crazy Sega Saturn $619 and 3DO $1143....


True, and with discounts it's almost reaching the $299 mark (the Xbox One is going for around $329 these days, and the PS4 as well). That's a great feat, and I look forward to a lot more people being able to afford a next-gen console so I can play with more people.


----------



## delboy67

Maybe AMD has a secret plan: win both console contracts, gimp both consoles, make their own low-level API, then release their own AMD super console with four Piledriver modules and a 290X under the hood.


----------



## hellojustinr

Quote:


> Originally Posted by *umeng2002*
> 
> Adding more GCN _cores_ isn't some BS marketing move. That is what separates a GTX 970 and GTX 980. The PS4 has 50% more than the X1. That is substantial and most games on the consoles have shown this.
> 
> Blame Ubi for being too ambitious or not technically proficient enough to make Unity run well. Unless they open of the code, we'll never know.
> 
> Every one into PC gaming during the PS360 era KNEW that AAA third parties were and still are not technically procieint with anything other than the 2005 level tech in the PS360. During that time, the PC has had more cores, better GPUs, etc. while the big third parties were releasing games happy living on a highly clocked dual core.


True.

With next-gen consoles, games going forward will be built around at least the internal components of the Xbox One and PS4, which is great for us PC gamers as well!

One game I've seen benefit from this recently is NBA 2K15! It looks amazing with the "next-gen" graphics. I don't doubt it was possible on PC hardware a year or two ago, but it certainly would not have happened without next-gen hardware pushing the devs toward better graphics.


----------



## Neo Zuko

You guys are all right. I was just saying it's a shame: we were all used to upgrading entire consoles every 4-6 years, and that turned into 8-12 years with the PS3/360/PS4/XB1. Buying a better GPU or more RAM after 5 years would be like buying an extension on that old system to double its life. By that point we are through the normal 5-year console lifecycle, and the user base is presumably already saturated (or big enough), so we are not dividing it early on with an add-on. Sure, add-ons historically don't sell, but that can change. For example, we didn't buy phones every two years until the smartphone revolution; now we are well trained to want a new phone every year.

So which would you rather do next time? Keep the PS4 long past its prime for a double life cycle like the PS3's (while it struggles to keep up with the latest features and graphics), get a PS5 in 5 years, or upgrade the GPU and RAM in 6 years and keep backward compatibility going? It's no different from a PC, really, since these are no longer special chips the way the "Emotion Engine" was. But I get that it could never work with current user-base models and console growth patterns.

So what I end up doing with these PS3-like double life spans is favoring the consoles for the first 5 years, then switching to PC for the next 5. The jump is just so big by then that it takes a huge toll on game performance. I don't want Mass Effect 3 on PS3 or 360, for example; the PC version runs much better.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Neo Zuko*
> 
> You guys are all right, I was just saying that it's a shame as we were all used to upgrading entire consoles every 4-6 years and that turned into 8-12 years with the PS3/360/PS4/XB1. So buying a better GPU or more RAM after 5 years would be like buying into an extension on that old system to double the life better. Because at that point we are threw the normal 5 year console lifecycle, the user base is presumably already saturated (or big enough) so we are not dividing it early on with an add on. Sure Add-Ons don't sell historically but that can change. For example we didn't buy phones every two years until the smart phone revolution. Now we are well trained to want a new phone every year.
> 
> So which would you rather do next time? Keep the PS4 long past its prime for a double life cycle like the PS3 did (while struggling to keep up with the latest features and graphics), get a PS5 in 5 years, or upgrade the GPU and RAM in 6 years and keep BC going? No different than a PC really as these are not special chips anymore the "emotion engine" was. But I get that it could never work based on current user base models and console growth patterns.
> 
> So what I end up doing with these PS3-like double life spans is favoring the consoles for the first 5 years then switching to PC for the next 5 years. The jump is just so big by then it's taking a huge toll on game performance. I don't want Mass Effect 3 on PS3 or 360 for example. The PC version runs much better.


Nobody knows how long this generation is going to last. Sony and MS say they're shooting for 10 years, but nobody here is Miss Cleo. Let's not state things as facts until they actually are facts.


----------



## Neo Zuko

Well it's conjecture, but educated conjecture!!


----------



## Ganf

Folks... the primary reason PC gamers rag on consoles isn't superiority; it's that consoles hold back the industry. Most people just don't communicate this well, and some people don't understand it and just jump on one bandwagon or the other.

For the last 4 years consoles were a ball and chain on the leg of every PC gamer, holding them back from experiencing what their hardware was truly capable of. Historically, the release of a new generation of consoles meant PC games got a breath of fresh air: it would no longer be the games holding the hardware back for the next 4 or 5 years, but the hardware holding the games back. With this generation we get no such swing of the pendulum. Consoles only received an upgrade relative to other consoles, meaning PC gamers started this hardware generation with a ball and chain twice the size of the one they just shook off, instead of having a few years where they could run free and do as they please.

The next 5 years are going to be rough on gaming in general, but I think they will be great for PC gaming hardware, as these companies will see their audience grow dramatically while consumers tire of the ever-widening gap between consoles and PCs.

4K will very soon be the new standard for PC gaming, and consoles can't even manage 1080p reliably. The fact that every electronics company in existence is pushing 4K TVs and monitors, as if everyone's survival through any future apocalypse depends on owning one, will ensure that the PC market grows while consoles wane. If the leaked numbers on the 390 and 390X are anything to go by, AMD has already benefited from all dem console bux and is looking to push things forward in a hurry. Consumers are already edging over to PC; game devs will follow, and hardware will flourish.


----------



## caswow

Quote:


> Originally Posted by *Ganf*


I don't know why some of you don't understand that people don't want €1000 consoles, not even €600 consoles; €400 is the sweet spot for a ready-to-play console. If, if, if. Yeah, if developers got more money, maybe they would spend more on PC games. But no, PC gamers wait for Steam sales and 70%-off prices so they can say, "Look, console games are much more expensive..."


----------



## umeng2002

Quote:


> Originally Posted by *caswow*
> 
> I don't know why some of you don't understand that people don't want €1000 consoles, not even €600 consoles; €400 is the sweet spot for a ready-to-play console. If, if, if. Yeah, if developers got more money, maybe they would spend more on PC games. But no, PC gamers wait for Steam sales and 70%-off prices so they can say, "Look, console games are much more expensive..."


Well, the point is valid (consoles have been holding PC titles back). After 2008, and given Sony's current state of affairs, they didn't want to take a hit on hardware sales. But they also wanted a 10-year life span. They can't have it both ways.

These things are powered by $30 APUs...

BD drives are dirt cheap. 500 GB hard drives are dirt cheap. Those were two big costs in the PS360 era, yet they managed to cram state-of-the-art GPUs and multicore CPUs in there in 2005/2006 for $400 to $600... while taking a loss.

IDK... the door is wide open for Nintendo to release a cutting-edge console in a few years.


----------



## kpzero

Quote:


> Originally Posted by *Neo Zuko*
> 
> You guys are all right, I was just saying that it's a shame as we were all used to upgrading entire consoles every 4-6 years and that turned into 8-12 years with the PS3/360/PS4/XB1. So buying a better GPU or more RAM after 5 years would be like buying into an extension on that old system to double the life better. Because at that point we are threw the normal 5 year console lifecycle, the user base is presumably already saturated (or big enough) so we are not dividing it early on with an add on. Sure Add-Ons don't sell historically but that can change. For example we didn't buy phones every two years until the smart phone revolution. Now we are well trained to want a new phone every year.


This generation probably would have been the ideal scenario for an expansion accessory: throw in a CrossFire companion GPU around the 4th year to get a few more years out of the hardware. I imagine they could sell what is basically a 7870 cheaply in 2016 or 2017 and get a decent boost. Unfortunately, they would have needed a stronger CPU for that to be viable.
Quote:


> Originally Posted by *Ganf*
> 
> 4k will very soon be the new standard for PC gaming and consoles can't even manage 1080p reliably. Just the fact that every electronic company in existence is pushing 4k TV's and monitors like everyone's survival through any future apocalypse is going to depend on whether they own one or not will ensure that the PC market grows and consoles wane. If the leaked numbers on the 390 and 390x are anything to go by it looks like AMD has already benefitted from all dem console bux and are looking to push things forward in a hurry. Consumers are already edging over to PC and game devs will follow, and hardware will flourish.


4K will not be the standard for PC gaming "very soon". Hardware is nowhere near ready for that. For it to be the standard, mainstream GPUs (the equivalent of the R9 270/760, and maybe even the R7 260X/750 Ti) would need to surpass 30fps in games that come out around their respective launches. So probably two generations at a minimum.
Quote:


> Originally Posted by *umeng2002*
> 
> IDK... the door is wide open for Nintendo to release a cutting edge console in a few years.


Unfortunately, the recent history of the Wii, 3DS, and Wii U says Nintendo will underwhelm us with their hardware choices. I really hope they buck that trend, but I am not hopeful at the moment.


----------



## umeng2002

Quote:


> Originally Posted by *kpzero*
> 
> Unfortunately, recent history of the Wii, 3DS, and Wii U says that Nintendo will underwhelm us with their hardware choice. I really hope they buck that trend but I am not hopeful at the current time.


I hope their recent sales convince them to.

Also, there aren't any expansion ports on the X1 or PS4 that would allow for a CrossFire upgrade.

The biggest performance "boost" you'll see from the X1 and PS4 will come from the image scalers, once devs have to drop below 720p in a few years to push image quality higher.


----------



## ILoveHighDPI

I still like the idea of adopting the current cell-phone upgrade schedule: keep the same architecture and give people yearly upgrades, all compatible with the same software. Just make sure old games are supported for at least five years, and people can decide for themselves when their upgrade cycles start and end.
That strategy doesn't allow much flexibility to adjust the fundamentals of console gaming, but in the last ten years pretty much all we've learned is that changing the fundamentals is a bad idea.

Of course, if mobile gaming gets any more capable, consoles will just become redundant.
Why doesn't Apple have an iOS gamepad yet? You could have awesome LAN parties with a bunch of iPads.


----------



## SIDWULF

Skyrim could have been a far better game if it had been made for PC first. The developers wouldn't have been limited by console performance problems; we would have more interactivity, better graphics, a richer world, etc. The mods had to pick up the slack, and even then they are limited in scope by the "console" engine Skyrim uses.

Pathetic.

Why waste your time making games for a system that uses a CONTROLLER to aim?


----------



## Neo Zuko

Now that consoles have "PC" insides, they are shooting themselves in the foot if they don't make the next generation backward compatible. There is absolutely no reason to go back to complicated custom CPUs or GPUs, and no reason to do it differently next time: just roll your game collection into the next upgraded box with the same architecture. It works better this way. That's one of the reasons I went all digital this round; the next round will no doubt be mostly digital.


----------



## Ganf

Quote:


> Originally Posted by *caswow*
> 
> I don't know why some of you don't understand that people don't want €1000 consoles, not even €600 consoles; €400 is the sweet spot for a ready-to-play console. If, if, if. Yeah, if developers got more money, maybe they would spend more on PC games. But no, PC gamers wait for Steam sales and 70%-off prices so they can say, "Look, console games are much more expensive..."


Hardware manufacturers were the focus of my post, and PC games, even with steep sales factored in, are still more profitable sale for sale than console games. I'd quote the people who said so, but that was said a long time ago and nobody has tried to refute it, so it hasn't been repeated, and I'm too lazy to google that hard for something all but a scant few know for fact.
Quote:


> Originally Posted by *kpzero*
> 
> 4k will not be the standard for pc gaming "very soon". Hardware is nowhere near ready for that. For it to be the standard, you would need the mainstream gpus (equivalent of R9 270/760 and maybe even R7 260x/750ti) to be able to surpass 30fps on games that come out around their respective launches. So probably 2 generations at a minimum.


This assumes you can't already get 30-40fps in a new game on a 270/760 at 4K on custom settings between low and medium, which is undeniably better looking than anything the consoles can produce.

Y'know... turn the SMAA off or something...


----------



## GoldenTiger

Quote:


> Originally Posted by *Ganf*
> 
> Folks.... The primary reason that PC gamers rag on consoles isn't about superiority, it's about consoles holding back the industry. Most people just don't communicate this well, and some people don't understand it and just jump on one bandwagon or the other.
> 
> For the last 4 years consoles were a ball and chain on the leg of every PC gamer holding them back from experiencing what their hardware was truly capable of. Historically a release of a new generation of consoles meant that PC games would get a breath of fresh air, and that it would no longer be the games holding the hardware back for the next 4 or 5 years, but the hardware holding the games back. With this generation we have no such swing of the pendulum. Consoles only received an upgrade in the perspective of other consoles, meaning that PC gamers started this generation of hardware with a ball and chain twice the size of the one they just shook off instead of having a few years where they could run free and do as they please.
> 
> The next 5 years are going to be rough on gaming in general but I think it's going to be great for PC gaming hardware, as these companies are going to see their audience grow dramatically as consumers get tired of seeing the ever-widening gap between consoles and PC's.
> 
> 4k will very soon be the new standard for PC gaming and consoles can't even manage 1080p reliably. Just the fact that every electronic company in existence is pushing 4k TV's and monitors like everyone's survival through any future apocalypse is going to depend on whether they own one or not will ensure that the PC market grows and consoles wane. If the leaked numbers on the 390 and 390x are anything to go by it looks like AMD has already benefitted from all dem console bux and are looking to push things forward in a hurry. Consumers are already edging over to PC and game devs will follow, and hardware will flourish.


Hear, hear. And I agree the transition to 4K will be faster than most expect. PC gaming is looking toward a bright future, in my opinion.


----------



## skawster

Quote:


> Originally Posted by *Ganf*
> 
> Folks.... The primary reason that PC gamers rag on consoles isn't about superiority, it's about consoles holding back the industry. Most people just don't communicate this well, and some people don't understand it and just jump on one bandwagon or the other.
> 
> For the last 4 years consoles were a ball and chain on the leg of every PC gamer holding them back from experiencing what their hardware was truly capable of. Historically a release of a new generation of consoles meant that PC games would get a breath of fresh air, and that it would no longer be the games holding the hardware back for the next 4 or 5 years, but the hardware holding the games back. With this generation we have no such swing of the pendulum. Consoles only received an upgrade in the perspective of other consoles, meaning that PC gamers started this generation of hardware with a ball and chain twice the size of the one they just shook off instead of having a few years where they could run free and do as they please.
> 
> The next 5 years are going to be rough on gaming in general but I think it's going to be great for PC gaming hardware, as these companies are going to see their audience grow dramatically as consumers get tired of seeing the ever-widening gap between consoles and PC's.
> 
> 4k will very soon be the new standard for PC gaming and consoles can't even manage 1080p reliably. Just the fact that every electronic company in existence is pushing 4k TV's and monitors like everyone's survival through any future apocalypse is going to depend on whether they own one or not will ensure that the PC market grows and consoles wane. If the leaked numbers on the 390 and 390x are anything to go by it looks like AMD has already benefitted from all dem console bux and are looking to push things forward in a hurry. Consumers are already edging over to PC and game devs will follow, and hardware will flourish.


Too bad that in the meantime most games were also reduced to bland storylines, shallow characters, cheesy gameplay and uninventive control/interaction layouts...


----------



## Systemlord

Quote:


> Originally Posted by *Ganf*
> 
> Folks.... *The primary reason that PC gamers rag on consoles isn't about superiority, it's about consoles holding back the industry.* Most people just don't communicate this well, and some people don't understand it and just jump on one bandwagon or the other.
> 
> For the last 4 years consoles were a ball and chain on the leg of every PC gamer holding them back from experiencing what their hardware was truly capable of. Historically a release of a new generation of consoles meant that PC games would get a breath of fresh air, and that it would no longer be the games holding the hardware back for the next 4 or 5 years, but the hardware holding the games back. With this generation we have no such swing of the pendulum. Consoles only received an upgrade in the perspective of other consoles, meaning that PC gamers started this generation of hardware with a ball and chain twice the size of the one they just shook off instead of having a few years where they could run free and do as they please.
> 
> The next 5 years are going to be rough on gaming in general but I think it's going to be great for PC gaming hardware, as these companies are going to see their audience grow dramatically as consumers get tired of seeing the ever-widening gap between consoles and PC's.
> 
> 4k will very soon be the new standard for PC gaming and consoles can't even manage 1080p reliably. Just the fact that every electronic company in existence is pushing 4k TV's and monitors like everyone's survival through any future apocalypse is going to depend on whether they own one or not will ensure that the PC market grows and consoles wane. If the leaked numbers on the 390 and 390x are anything to go by it looks like AMD has already benefitted from all dem console bux and are looking to push things forward in a hurry. Consumers are already edging over to PC and game devs will follow, and hardware will flourish.


Bingo!

Quote:


> Originally Posted by *SIDWULF*
> 
> Skyrim could of been a way better game if it was made for PC first. The developers wouldn't be limited by console performance problems. We would have more interactivity, graphics, richer world, ect. All the mods had to pick that up and even then they are limited in scope by the "console" engine Skryim uses.
> 
> Pathetic.
> 
> Why waste your time making games for a system that uses a CONTROLLER to aim.


I started playing Skyrim around the middle of last year and noticed right away that it looked like a low-texture console game. I immediately began adding mods that brought the textures way up, and it looked nothing like the vanilla game anymore; night and day difference. Which brings me to my point: I think developers should include something extra in the PC version when a game launches multi-platform, like the high-res textures for Crysis 2. Although when that game first released, it was a few months before the high-res content was available.


----------



## Carniflex

Quote:


> Originally Posted by *umeng2002*
> 
> IDK... the door is wide open for Nintendo to release a cutting edge console in a few years.


Valve is supposed to "release" their SteamBox in 2015, which is basically a PC and doesn't even really pretend to be anything else. When it (and the controller) releases, I expect the majority of tech and review sites to put together some sort of comparison between the available Steamboxes and the current-gen consoles.

Even some of the current offerings are sort of reasonable. Although for it to take off properly, someone (some of the OEMs, probably) would have to run a marketing campaign showing the typical console target audience that this "new console" is "better".


----------



## Carniflex

Quote:


> Originally Posted by *kpzero*
> 
> 4k will not be the standard for pc gaming "very soon". Hardware is nowhere near ready for that. For it to be the standard, you would need the mainstream gpus (equivalent of R9 270/760 and maybe even R7 260x/750ti) to be able to surpass 30fps on games that come out around their respective launches. So probably 2 generations at a minimum.


In my experience they already can. "My experience" being a 7870 running 5x1080p Eyefinity. The 30 FPS mark is perfectly doable in the vast majority of titles at 10 megapixels (25% more than 4K) on a 7870 with a mix of high/medium settings and no AA. The 7950 has a little more vRAM headroom and can pull off the same with 2x AA, or more or less straight high settings. Hell, I have managed to get 30 fps even on a single 6770 at that resolution with a mix of low/medium settings and an overclock, although 1 GB of vRAM is really the minimum for 4K, as it is possible to hit the vRAM wall pretty reliably with 1 GB at 4K.
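For anyone checking the arithmetic here, the megapixel claim holds up; a quick sketch, assuming "4K" means UHD (3840x2160) and five landscape 1080p panels side by side:

```python
# Pixel-count comparison: 5x1080p Eyefinity vs. 4K UHD.

def pixels(width, height):
    """Total pixels on a width x height display."""
    return width * height

eyefinity_5x1 = 5 * pixels(1920, 1080)  # five 1080p panels side by side
uhd_4k = pixels(3840, 2160)             # 4K UHD

print(eyefinity_5x1)           # 10368000 (~10.4 MP)
print(uhd_4k)                  # 8294400 (~8.3 MP)
print(eyefinity_5x1 / uhd_4k)  # 1.25, i.e. exactly 25% more pixels than 4K
```

So a card holding 30 fps across that Eyefinity setup is pushing a strictly heavier pixel load than native 4K.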


----------



## kpzero

Quote:


> Originally Posted by *Carniflex*
> 
> In my experience they can already. "My experience" being 7870 running 5x1080p eyefinity. The 30 FPS mark is perfectly doable in vast majority of titles at 10 megapixels (25% higher than 4K) on 7870 with the mix of high/medium settings and no AA. The 7950 has a little more vRAM headroom and can pull off the same with x2 AA or more or less straight high settings. Hell I have managed to get 30 fps even on a single 6770 on that resolution with mix of low/medium settings and overclock, although 1 GB of vRAM is really minimum for 4K as it is possible to hit the vRAM wall pretty reliably with 1 GB at 4k.


I was referring to 8th-gen exclusive ports, which will soon be the reality of PC gaming. Many of those ports are likely to mean low/medium settings for a 7870 or 7950 at 1080p. There haven't been many non-cross-gen titles yet, but so far the evidence doesn't support a good time at 4K.

Further, in my opinion, for something to be a "standard" it shouldn't require low or medium settings. Maybe you define standard differently than me, but I can't imagine it being considered a standard before at least 20-30% of gamers use the resolution.


----------



## Carniflex

Quote:


> Originally Posted by *kpzero*
> 
> I was referring to 8th gen exclusive ports that will soon be the reality of pc gaming. Many of those ports are likely to have low/medium settings for 7870 and 7950 at 1080p. There havent been many non crossgens yet but so far the evidence doesnt support a good time at 4k.
> 
> Further, in my opinion for something to be a "standard", it shouldnt require low or medium settings. Maybe you define standard differently than me but I cant imagine it being considered a standard before atleast 20-30% of gamers use the resolution.


None of these games are out yet, so I can't say anything fact-based about them. Based on the ports so far, it is fair to expect that a PC with a $200 graphics card and an i3 or overclocked G3258 will outperform the consoles by a fair margin (running at either 1080p or 60 Hz, or even both for particularly lazy ports using a single-threaded engine). But at this point there is really no sense in speculating about these theoretical "awesome multi-threaded, truly console-optimized engine" fairies.

My comment was saying that the vast majority of games out today can do 4K at 30 fps with manual settings. BTW, I also have to point out that in your average dark tunnel shooter you probably can't tell the difference between low/medium or medium/high at a glance. The "low" settings are not particularly crappy-looking nowadays, you know. Let me throw out a couple of screenshots I did for another thread about a year ago for comparison. Clicking on them will open them in full 10-megapixel resolution.



Metro 2033. If I remember correctly: DX9 "low", DX10 "medium" and DX11 "high".

By the time these "true 8th gen" ports emerge, another couple of years will have passed and another generation or two of graphics cards will be on the market, meaning the consoles will no longer even be equivalent to a low/medium-range gaming PC. A budget gaming PC built then (or any current gaming PC with a ~$200 graphics card upgrade) should have no problem whatsoever doing 4K @ 60 Hz with visual settings equal to the console (i.e., preset "low" and no AA).


----------



## Thingamajig

Quote:


> Originally Posted by *Systemlord*
> 
> Bingo!
> The middle of last year is when I started playing Skyrim and notice right away it looked like a low textured console game. I immediately began to add mods that brought the textures way up and it looked nothing like the vanilla game anymore, night and day difference, as for making my point I think developers should add something for the PC ports when games are launched on multi-platform. That is include high res textures like in Crysis 2, when it first was released it was a few months before the high res content was available.


Oblivion and the Fallout series were very much the same. Fortunately modding brought them up to spec, but you were still limited by what was otherwise an engine designed for console use. This severely limited its capacity (and stability) when running mods. If this was going on back when Oblivion was released, you can imagine how far ahead the PC platform is now.

Skyrim was an improvement, but I've already hit the same walls previous Bethesda games suffered from.

I remember back at Skyrim's release finding out it was a 32-bit game. I knew immediately the community would suffer under a memory limit, and I was frustrated that, for such a large game (and a 2011 release), it was still living in the past.

But yes, a heavily modified Skyrim is truly a night/day experience over vanilla; some images people manage are breathtaking, and that doesn't even scratch the surface of the gameplay enhancements being done. My own personal collection I upload to Steam: http://steamcommunity.com/id/Thingamajig/screenshots


----------



## iTurn

Quote:


> Originally Posted by *Ganf*
> 
> This assumes that you can't already get 30-40fps out of a new game on a 270/760 on custom *settings between low and medium at 4k, which is undeniably better looking than anything the consoles can produce.*
> 
> Yaknow... Turn that SMAA off or something...


Yeah, that's very much incorrect; you must've been in a cave. PS4/XB1 games have been consistently in the "high" settings region, with only PC's ultra surpassing them. Of course there are a few outliers, but there have been no low-medium-settings PS4/XB1 games... we get it, you don't like consoles; you don't have to make up lies about them.


----------



## Newbie2009

Quote:


> Originally Posted by *iTurn*
> 
> Yea thats very much incorrect, you must've been in a cave PS4/XB1 games have been consistently in "high" settings region with only PCs Ultra surpassing them of course they're a few outliers but there have been no low-medium settings PS4/XB1 games... we get it you don't like consoles you don't have to make up lies about them.


I would guess that by 2016-17 we will have single cards running 4K at 60 fps in the majority of games, with slightly better than console graphics (as you mentioned, the consoles are similar to high on PC). Not talking about CrossFire or SLI.


----------



## scaz

Why do you think Steam has been pushing to get PCs connected to our home TVs, with options making it easy to use controllers instead of the traditional mouse and keyboard? I also don't want to upgrade, because I bought a few online-only games and those won't transfer to the Xbox One, so I will stick with the 360.


----------



## Neo_Morpheus

Hi everybody, chat time. I'm trying to understand where the companies are coming from, so let's have a look at the possible upgrade pathways. Due to hardware limitations at the price the public is willing to spend, this gen is sub-1080p at 30-60 fps. Next gen, in roughly 8 years, even more GPU/CPU power will be needed, and it will most likely land (again limited by what the public will pay) at 1080p 60 fps with an SSD (most people will still only have 1080p TVs). After that it will most likely get kicked up a notch in the resolution department. The companies have done their homework and picked the slow, painful pathway to full 1080p over the next 10 years.


----------



## GoldenTiger

Quote:


> Originally Posted by *iTurn*
> 
> Yea thats very much incorrect, you must've been in a cave PS4/XB1 games have been consistently in "high" settings region with only PCs Ultra surpassing them of course they're a few outliers but there have been no low-medium settings PS4/XB1 games... we get it you don't like consoles you don't have to make up lies about them.


High at 1366x768, blurrily upscaled to 1080p and running 20 to 30 fps, looks horrific compared to medium at native 3840x2160 4K running 30 to 40 fps, which is what was being compared. You're just plain wrong; the consoles can't even come close to putting out what a midrange PC card already can, and that gap will widen continually and hugely, very quickly. The consoles are just plain underpowered from launch this time around; it's actually very sad compared to previous generations.
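The gap being argued over here is easy to quantify; a rough sketch, assuming a 1366x768 internal render target upscaled to a 1080p display versus native 4K UHD on PC:

```python
# How much picture an upscaler has to invent vs. native rendering.

def pixels(width, height):
    """Total pixels on a width x height frame."""
    return width * height

rendered_768p = pixels(1366, 768)   # the internal render resolution
display_1080p = pixels(1920, 1080)  # the screen it gets upscaled to
native_4k = pixels(3840, 2160)      # native 4K UHD

# Ratios of actually rendered pixels:
print(display_1080p / rendered_768p)  # ~1.98: upscaling must roughly double the pixel count
print(native_4k / rendered_768p)      # ~7.91: native 4K renders about 8x the pixels
```

None of the detail added by the upscaler actually exists in the rendered frame; it is interpolated, which is where the "blurry" look comes from.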


----------



## GoldenTiger

Quote:


> Originally Posted by *Carniflex*
> 
> None of these games are out yet so cant say anything based on facts about those. Based on the ports so far it is fair to expect that PC with 200$ GFX card and i3 or overclocked G3258 will outperform the consoles by a fair margin (either running it at 1080p or 60 Hz or even both for particularly lazy ports using single threaded engine). But at this point there is really no point on speculating about these theoretical "awesome multi-threaded engine truly console optimized" fairies.
> 
> My comment was saying that vast majority of games out today can do 4K at 30 fps with manual settings. BTW I will have to also point out that in your average dark tunnel shooter you probably cant tell the difference between low/medium or medium/high at a glance. The "Low" settings are not particularly crappy looking nowadays you know. Let me thow out couple of screenshots I did for another thread about a year ago for comparison. Clicking on them will open them in full 10 megapixel resolution.
> 
> 
> 
> Metro 2033. If I remember correct then DX9 "low, DX 10 "medium" and DX 11 "high".
> 
> Once these "true 8th gen" ports emerge another couple of years have passed and and another generation or two GFX cards are on market meaning the consoles are no longer even equivalent to the medium/low range gaming PC, meaning that lowish/medium budget gaming gaming PC built then (or any current gaming PC with ~200$ gfx card upgrade) should have no problems whatsoever doing 4k @ 60 Hz with visual settings equal to the console (i.e., preset "low" and no AA).


I remember that post and set of screenshots. You can tell only a small change between the middle medium and the bottom high shots, and in motion it is negligible. Low-end gaming PCs already provide far better visuals than consoles, and better framerates; that chasm is only going to become ridiculously bigger in short order as resolutions eclipse the paltry 720p to 900p of consoles like the Xbox One and PS4.

Forum warriors are, as I keep saying, far too obsessed with the ERMAHGERRRRRRD MAX EVERY SLIDARRRRRR mentality, and cripple their performance for negligible gains in visual fidelity. I see people claiming you can't run games at 4K at great settings with top-end SLI setups like paired GTX 970s or 980s, and I laugh hysterically every time. I'm loving every minute of my 4K IPS 32-inch 60 Hz game time. I lower one or two sliders down a notch and can't even find the changes in screenshots, but the game then runs flawlessly. As is usually the case with people having computer issues, PEBKAC: user error.


----------



## iTurn

Quote:


> Originally Posted by *GoldenTiger*
> 
> High at 1366x768 blurry upscaled to 1080 and running 20 to 30fps looks horrific compared to medium at native 3840x2160 4k resolution running 30 to 40fps as was compared. You're just plain wrong, the consoles indeed can't even come close to putting out what a midrange pc card can already, and that gap will continually and hugely widen very quickly. The consoles are just plain outright underpowered from launch this time around, it's actually very sad compared to previous generations.


1366x768? What?... Research some facts and then come back; it's OK to prefer something, but not to make up stupid facts. High is high is high is high... I have a computer, I do have these games to compare against, but humor me and post some proof.
What you're essentially saying is that Shadow of Mordor @ 4K on medium will look better than what's available on the current-gen consoles.








Quote:


> Originally Posted by *Newbie2009*
> 
> I would guess we will have 4k cards running 60fps in the majority of games with slightly better than console graphics (as you mentioned consoles similar to high PC) in 2016-17. Not talking about xfire or sli.


Just for clarity's sake: PC will always be better than consoles in the performance arena, and I am aware of this, but saying the consoles run low-medium settings like last gen is plain false.


----------



## Prophet4NO1

Quote:


> Originally Posted by *iTurn*
> 
> 1366x768? What?... research some facts and then come back, it's ok to prefer something but to make up stupid facts. High is High is High is High... I have a computer I do have these games to compare against but humor me and post some proof
> What you're essentially saying is Shadow of mordor @ 4k on Medium will look better than whats available on the current gen consoles.
> 
> 
> 
> 
> 
> 
> 
> 
> Just for clarity sake, PC will always be better than consoles in the performance arena, I am aware of this, but saying the consoles run low-medium settings like last gen is plain false.


Go watch some side-by-side vids on YouTube. 720p with lower settings scaled up to 1080p looks a lot different from PC at native 1080p at high-ultra settings. BF4 is a good comparison, but pick your multi-platform poison and see for yourself.


----------



## GoldenTiger

Quote:


> Originally Posted by *iTurn*
> 
> 1366x768? What?... research some facts and then come back, it's ok to prefer something but to make up stupid facts. High is High is High is High... I have a computer I do have these games to compare against but humor me and post some proof
> What you're essentially saying is Shadow of mordor @ 4k on Medium will look better than whats available on the current gen consoles.
> 
> 
> 
> 
> 
> 
> 
> 
> Just for clarity sake, PC will always be better than consoles in the performance arena, I am aware of this, but saying the consoles run low-medium settings like last gen is plain false.


If you can't grasp that a barely improved 1366x768 or 1440x900 image like a console gives you at 25 fps (yep, that's how ridiculously low-res they run games at) is a far worse picture than a slightly-lower-settings machine running native 3840x2160 at 30 to 40 fps, then it is clear there's no point in continuing this discussion, as it is unfortunately far above your knowledge level of the technology.


----------



## Carniflex

Quote:


> Originally Posted by *Prophet4NO1*
> 
> Go watch some side by side vids on youtube. 720p res with lower settings scaled up to 1080p compared to PC at native 1080p at high-ultra settings looks a lot different. BF4 is a good comparison. But pick your multi platform poison and see for your self.


The comparison is a bit on the old side, as it was done when the consoles came out (this affects mostly prices), but speaking of BF4, http://www.eurogamer.net/articles/digitalfoundry-2014-the-next-gen-digital-foundry-pc has a comparison. At the time of the comparison the "test" PC exceeded the price of a console; as a bit more than half a year has already passed, the setup in there can now be had for about the price of a console or even slightly cheaper, depending on how ugly a case one picks.



Although I have to point out that this purely hardware-based comparison might not be the entirely "correct" way of doing it, as your typical console user is highly unlikely to go out and assemble his own PC in a way that makes sense and isn't crippled on such a tight budget.


----------



## iTurn

Quote:


> Originally Posted by *Prophet4NO1*
> 
> *Go watch some side by side vids on youtube*. 720p res with lower settings scaled up to 1080p compared to PC at native 1080p at high-ultra settings looks a lot different. BF4 is a good comparison. But pick your multi platform poison and see for your self.


I don't even...








Quote:


> Originally Posted by *GoldenTiger*
> 
> If you can't grasp that a barely improved 1366x768 or 1440x900 image like a console gives you at 25 fps (yep that's how ridiculously low res they run games at) a far worse picture than a slightly lower settings machine running native 3840x2160 at 30 to 40fps, it is clear there's no point with continuing this discussion as it is far above your knowledge level of technology unfortunately.


I can't grasp you using 1366x768/1440x900 as the standard... PS4 has what, 5 games out of 130+ that aren't 1080p native?
You're too stupid to pull your head out of your rectum due to your bias... You're so intelligent you're using outliers; how many games on the PS4 are 25 fps?


----------



## GoldenTiger

Quote:


> Originally Posted by *iTurn*
> 
> I don't even...
> 
> 
> 
> 
> 
> 
> 
> 
> I can't grasp you using 1366x768/1440x900 as a standard.... PS4 has what 5 games out of 130+ that aren't 1080p native?
> You're too stupid to pull your head out of your rectum due to your bias... You're so intelligent you're using outliers how many games on the PS4 are 25fps?
> You're as dumb as the dude quoted above relying on youtube for picture quality references


ROFL. As I said: the PS4 games running such undemanding graphics would be runnable maxed out at 4K on the example PC we were talking about. Most of the demanding PS4 games run at only 720p to 900p, which are the resolution numbers I quoted. Are you unable to follow along or something? Because it sure seems like it.


----------



## Tempest2000

Get your DC update today for dynamic weather.

Low quality but you get the point (found these on another site; I take no credit)


----------



## SIDWULF

Buttfield 4 has console engine written all over it. Consoles are holding back everything! The graphics are all smoke and mirrors, tricks and hacks.

(I still love BF4)


----------



## Tempest2000

Quote:


> Originally Posted by *SIDWULF*
> 
> Buttfield 4 has console engine written all over it. Console holding back everything! The graphics are all smoke and mirrors, tricks and hacks.
> 
> (I still love BF4)


Not sure if serious, but BF4 can run on a very low-spec PC, far less powerful than the current gen consoles (see minimum system requirements).

The lowest common denominator for most new games is a PC. The best part about PC gaming (hardware variety) is ultimately a hindrance to PC development. This is something that's next to impossible for a PC elitist to accept.


----------



## CrazyHeaven

The same thing was said about the PS3, and yet we still had better games coming out for it. Of course they'll run into a performance wall with poorly written software, but to say the consoles don't have more in them isn't true. A poorly written program can easily hit the wall on any PC on OCN.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> I'm more inclined to think the biggest reason for the lead was the $100 difference in price and Microsoft's decisions prior to the release of the console. They caught a lot of flack for that and many people switched to playstation.


So....marketing?

People. Marketing isn't just about ads on TV and Billboards, there is an entire science and process behind it.

If spending an extra couple of pennies per console allows them to justify asking $100 more at the point of sale, that is a marketing win. If having that extra hardware allows them to legally advertise it as superior hardware compared to the Xbox, thus moving more units, that is a marketing win.

Had Sony actually given a crap about real performance, they would have gone an even more aggressive route with the hardware. However, they didn't; they wanted the marketing material from it, to move all the units they wanted, and to charge more. Those of us here know the hardware difference really won't translate into anything meaningful to the end user.

Marketing.

_"I want to sell you something you don't need that cost more than the other guy, because I can."_


----------



## Tempest2000

Quote:


> Originally Posted by *PostalTwinkie*
> 
> So....marketing?
> 
> People. Marketing isn't just about ads on TV and Billboards, there is an entire science and process behind it.
> 
> If spending an extra couple pennies per console allows them to justify asking for a $100 more at point of sale, that is a marketing win. If having that extra hardware allows them to legally advertise it as superior hardware compared to the XBox, thus moving more units, that is a marketing win.
> 
> Had Sony actually given a crap about real performance, they would have gone an even more aggressive route with the hardware. However, they don't, they wanted the marketing material from it to make all the games they wanted and to charge more. As those of us here know that the hardware difference really won't translate into anything meaningful to the end user.
> 
> Marketing.
> 
> _"I want to sell you something you don't need that cost more than the other guy, because I can."_


I don't think you realize that PS4 launched for $100 less than X1 (in the US) AND is overall more powerful. I'm not really sure what you're trying to say but it sounds like you've got it backwards.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Tempest2000*
> 
> Get your DC update today for dynamic weather.
> 
> Low quality but you get the point (found these on another site; I take no credit)


Dude, what game and system is this? Friggin jaw dropping!


----------



## PostalTwinkie

Quote:


> Originally Posted by *Tempest2000*
> 
> I don't think you realize that PS4 launched for $100 less than X1 (in the US) AND is overall more powerful. I'm not really sure what you're trying to say but it sounds like you've got it backwards.


The Xbox One with Kinect was more expensive than the base PS4. The Xbox One without Kinect was cheaper, and currently the PS4 is retailing for more than the Xbox One. Unit for unit, the PS4 has sold for more.

What kind of "package" deal you get is another factor.

EDIT:

Oh, and that doesn't even consider the buzz generated by the "better" hardware that allowed PS4s to fetch insane pricing on eBay and Craigslist. Around launch I was laughing as people flooded Craigslist with _"New unopened Xbox One + $150 cash for PS4!"_


----------



## Tempest2000

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Dude, what game and system is this? Friggin jaw dropping!


Driveclub PS4


----------



## Tempest2000

Quote:


> Originally Posted by *PostalTwinkie*
> 
> The Xbox One with Kinect was more expensive than the base PS4. The XBox One without the Kinect was cheaper, and currently the PS4 is retailing more than the Xbox One. Unit to Unit the PS4 has sold for more.
> 
> Depending on what kind of "package" deal you get is another factor.
> 
> EDIT:
> 
> Oh, and that doesn't even consider the buzz generated by the "better" hardware that allowed PS4s to fetch insane pricing on eBay and Craigslist. Around launch I was laughing as people flooded Craigslist with _"New unopened Xbox One + $150 cash for PS4!"_


Actually, the X1 without Kinect originally retailed for $399, the same price as the PS4, not less.

Regardless, what you just said in no way helps prove your point about marketing... You claimed that Sony made marginally better hardware so they could sell it for more than the X1, which obviously isn't true considering the launch prices.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Tempest2000*
> 
> Actually, X1 without Kinect originally retailed for $399, the same price as PS4, not less.
> 
> Regardless, what you just said in no way helps prove your point about marketing... You claimed that Sony made marginally better hardware so they can sell it for more than X1, which obviously isn't true considering the launch prices.


Thought the Xbox without Kinect sold for $349; pretty sure it did.

Either way, my point is actually highlighted by your statements, as Sony will have an easier time selling you "better" hardware for the same price. After all, why buy the car with 200 HP when you can have 215 HP for the same price?


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Thought the Xbox without Kintect sold for $349, pretty sure it did.
> 
> Either way, my point is actually highlighted by your statements. As Sony will have an easier time selling you "better" hardware for the same price. After all, why buy the car with 200 HP when you can have 215 HP for the same price?


The current $350 price tag for the Kinect-free Xbox One is only a temporary price cut for the holidays. The Kinect-free Xbox One was always $399, and it might go back up to that price once the holidays are over.


----------



## PostalTwinkie

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> The current $350 price tag for the kinect free xbox one is only a temporary price cut for the holidays. The kinect free xbox one was always $399 and it might go back up to that price once the holidays are over.


Well, no freaking wonder the PS4 is just slapping the XBO around in sales.

Microsoft would be nuts to increase the price again.


----------



## lacrossewacker

Quote:


> Originally Posted by *LuckyDate56*
> 
> Poor of the guy who wrote this article.. I mean after what we saw with uncharted 4, a game that slaps AC in the face in terms of graphics...


Granted, AC Unity is no poster child for the best implementation of the tech, but it's apples to oranges as far as Uncharted goes.


----------



## PostalTwinkie

Quote:


> Originally Posted by *LuckyDate56*
> 
> What bothers me the most is that graphically speaking there's not a big difference between AC IV: Black Flag and AC Unity. I really want to play AC Unity but man, I can't stand playing it at 20fps on my PS4. Untill Ubisoft doesn't fix it, I won't buy it.
> 
> It's not a limitation of the PS4's CPU or overall hardware; it's Ubisoft that it sucks at making video games (*for a long time*).


FTFY


----------



## aberrero

Quote:


> Originally Posted by *PostalTwinkie*
> 
> Well, no freaking wonder the PS4 is just slapping the XBO around in sales.
> 
> Microsoft would be nuts to increase the price again.


Trying to sell the XBO for $500 was the dumbest thing MS has ever done. As if anybody cares about Kinect.


----------



## perfectblade

Quote:


> Originally Posted by *aberrero*
> 
> Trying to sell the XBO for $500 was the dumbest thing MS has ever done. As if anybody cares about Kinect.


ms is like that doddering old grandpa who figures out what's hip with his grandkids 5 years late. that goes for touchscreens and kinect.

actually, to be fair, the previous ceo did bear a strong resemblance to santa claus. an angry santa


----------



## jameschisholm

Quote:


> Originally Posted by *LuckyDate56*
> 
> I mean come on, is there a BIG difference between BF and Unity?
> 
> 
> 
> 
> Both pictures are on PC maxed out. Like for real, there's no big difference!


There's something about the clothes in ACU that looks weird to me. Is it just me?


----------



## ToxicAdam

Quote:


> Originally Posted by *aberrero*
> 
> Trying to sell the XBO for $500 was the dumbest thing MS has ever done. *As if anybody cares about Kinect.*


This was Don Mattrick dream of the future and is why he's now working for Zynga.


----------



## Silent Scone

They said the same thing about the Xbox 360 around a year in, and devs ended up making that heap do some pretty awesome things. No one developer will tell you the same thing twice.


----------



## Orangey

Actually it went more like this:

xbots: TWICE AS MUCH RAM! IT'S OVER SONY ARE FINISHED (and PS3 henceforth enjoyed awful multiplats)
sdf: POWAR OF CELERY! (became the biggest flop since Itanic)


----------



## Arturo.Zise

So I have 4 friends who have just bought or are getting PS4s for Xmas, and they're hounding me to grab one for some online gaming. I have resisted so far, as I'm about to upgrade my PC to X99 and already have over 170 games in my library to chew through. It does have its benefits, but I see no point in having a console and a PC that play the same stuff anyway.

Thoughts?


----------



## perfectblade

Quote:


> Originally Posted by *Arturo.Zise*
> 
> So I have 4 friends who have just bought or are getting PS4s for Xmas, and they're hounding me to grab one for some online gaming. I have resisted so far, as I'm about to upgrade my PC to X99 and already have over 170 games in my library to chew through. It does have its benefits, but I see no point in having a console and a PC that play the same stuff anyway.
> 
> Thoughts?


bloodborne, bloodborne.


----------



## Arturo.Zise

Quote:


> Originally Posted by *perfectblade*
> 
> bloodborne, bloodborne.


I already own Dark Souls


----------



## Boomer1990

Quote:


> Originally Posted by *Arturo.Zise*
> 
> I already own Dark Souls


Bloodborne, Uncharted, MLB The Show, Until Dawn, Deep Down, JRPGs, GOW, w/e Media Molecule is making, the RPG that Guerrilla Games is making, and w/e Quantic Dream is making.


----------



## KenjiS

My issue with getting a PS4 is still a very simple one: to my knowledge, and from everything I've looked up, it still does not correctly handle the HD audio codecs on Blu-rays (i.e., bitstream them so my receiver can do the decoding). I'm quite leery of letting the console decode, since I suspect the audio is being tampered with like on the Xbox One, where the decoded LPCM stream from a Blu-ray is noticeably worse than doing the same on my PS3 or simply having my receiver decode Dolby TrueHD/DTS-HD MA. And yes, it's ONLY Blu-ray audio that sounds god-awful on the One to me.

Coupled with the fact that there's no Blu-ray remote available, I'm very hesitant to drop $400 on a console that can't actually replace my PS3 for media use. Which is a shame, as my PS3 is now basically a glorified Blu-ray player...


----------



## aberrero

The PS4 looks like it can never replace a PS3. Keep both.


----------



## Orangey

If I ever become an audiophile shoot me.


----------



## t00sl0w

Quote:


> Originally Posted by *KenjiS*
> 
> My issue with getting a PS4 is still a very simple one: to my knowledge, and from everything I've looked up, it still does not correctly handle the HD audio codecs on Blu-rays (i.e., bitstream them so my receiver can do the decoding). I'm quite leery of letting the console decode, since I suspect the audio is being tampered with like on the Xbox One, where the decoded LPCM stream from a Blu-ray is noticeably worse than doing the same on my PS3 or simply having my receiver decode Dolby TrueHD/DTS-HD MA. And yes, it's ONLY Blu-ray audio that sounds god-awful on the One to me.
> 
> Coupled with the fact that there's no Blu-ray remote available, I'm very hesitant to drop $400 on a console that can't actually replace my PS3 for media use. Which is a shame, as my PS3 is now basically a glorified Blu-ray player...


does it at least keep the DTS core on DTS-MA tracks or is it dropping everything down to shoddy DD 640?


----------



## iTurn

Quote:


> Originally Posted by *Orangey*
> 
> If I ever become an audiophile shoot me.


I actually agree with Kenji; the sound difference is noticeable, and while I do like a nice audio setup, I am far from an audiophile.


----------



## KenjiS

Quote:


> Originally Posted by *t00sl0w*
> 
> does it at least keep the DTS core on DTS-MA tracks or is it dropping everything down to shoddy DD 640?


On the PS4 I'm honestly not sure exactly what it's doing, as I don't own one. According to some people on a forum, bitstream might work correctly depending on your specific PS4/receiver combo; it likes some receivers and dislikes others for some reason. One of the folks having issues has a Pioneer Elite SC-25, however, which is a sister model to my SC-27. People having issues claim it drops lossless DTS-HD MA down to standard DTS. Basically, it's a mess. To be fair, most of the posts I've seen on the subject are older, and it could be fixed by now for all I know, but I'm not going to spend $400 to answer a question I want answered before I drop $400. My attempts at queries have ended with people either not understanding what I'm asking or arguing that LPCM is the same, without understanding why I'm hesitant to drop $400 on another "next gen" console that fails to do what my PS3, or the other pile of Blu-ray players I own, does fine.

Oh, and there's no remote, and I don't touch HDMI-CEC with a 20-foot pole.

As for the Xbox One, it drops the lossless tracks to either DTS or Dolby Digital, whichever you choose, which is not what I want. The only way to get the actual lossless codecs to play is to let the Xbox decode them and send LPCM, and the problem there is that the dynamic range is massively messed up on my setup when it does this for a Blu-ray. Technically and theoretically it should be exactly the same, but it unfortunately isn't, because the Xbox is mucking around with the audio when it decodes it, even with DRC and the like disabled.

The PS3 and my other sources have no issues whatsoever. Put the same disc in the PS3 and everything's fine and dandy; it bitstreams the HD audio codecs correctly. So yeah, it's definitely an audio-processing issue on the Xbox One, and I'm not the only one angry about it. Apparently nothing can be done about it either, as it's related to the "Snap" feature, which I would gladly give up while watching a Blu-ray to have proper bloody audio.

As for why not both: I don't have any more space for another console on my stand, so the PS4 would need to legitimately replace the PS3. I've played with the idea of getting a good standalone Blu-ray player for that task, but that doesn't change the fact that I simply don't have room for another unit (unless I got a player that could sit under the PS4 or something), and then I'm looking at buying two new, expensive pieces of electronics, and I honestly just can't justify that.


----------



## CasualCat

Quote:


> Originally Posted by *Orangey*
> 
> If I ever become an audiophile shoot me.


You could probably use a Hsu, SVS, or Outlaw


----------



## y2kcamaross

Quote:


> Originally Posted by *CasualCat*
> 
> You could probably use a Hsu, SVS, or Outlaw


Don't forget Epik or Seaton!


----------



## CasualCat

Quote:


> Originally Posted by *y2kcamaross*
> 
> Don't forget Epik or Seaton!


Epik closed shop.


----------



## y2kcamaross

Quote:


> Originally Posted by *CasualCat*
> 
> Epik closed shop.


Yes, but you can still find some pretty good deals on their used subs. I bought Elemental Designs' A5-350 right before they went belly up, unfortunately, but got a hell of a deal on a second one last year.


----------



## iSlayer

Quote:


> Originally Posted by *Orangey*
> 
> Actually it went more like this:
> 
> xbots: TWICE AS MUCH RAM! IT'S OVER SONY ARE FINISHED (and PS3 henceforth enjoyed awful multiplats)
> sdf: POWAR OF CELERY! (became the biggest flop since Itanic)


Your posts are always something interesting, I'll give you that.
Quote:


> Originally Posted by *Boomer1990*
> 
> Bloodborne, Uncharted, MLB the Show, Until Dawn, Deep Down, jrpg's, GOW, w/e media molecule is making, the rpg that Guerrilla games is making, and w/e Quantic Dream is making.


Lolquanticdream

There's certainly some variety there; shame it's mostly prison food.
Quote:


> Originally Posted by *Orangey*
> 
> If I ever become an audiophile shoot me.


Yup, they're definitely a thing of some sort.


----------



## Orangey

I liked Indigo Prophecy.

At least, I remember enjoying it... what the hell was it about anyway?


----------



## iSlayer

Quote:


> Originally Posted by *Orangey*
> 
> I liked Indigo Prophecy.
> 
> At least, I remember enjoying it... what the hell was it about anyway?


Some things happen and then the plot does a big poo in its pants and takes off the pants and goes to inspect the poo which falls out onto the plot's face.


----------



## Boomer1990

Quote:


> Originally Posted by *iSlayer*
> 
> Your posts are always something interesting I'll give you that.
> Lolquanticdream
> 
> There's certainly some variety there, shame its mostly prison food.
> Yup, they're definitely a thing of some sort.


Out of curiosity, can you explain why most of those games are prison food?


----------



## KenjiS

Quote:


> Originally Posted by *KenjiS*
> 
> -snipped for length-
> 
> To be fair, I've dealt with Alienware since then (my M17x) and they're VASTLY better than they used to be. Dell is a much nicer company these days and has never really given me any issues... I think it was partly them getting sold to Dell, and the CSRs basically not giving a toss because they probably all just got termination notices.


OK, I had to jinx myself by repeating this entire thing. Not long after I posted this, the keyboard on my mom's three-year-old Dell XPS 17 went. It's still under warranty, so she called Dell, and they gave her no hassle about sending a tech out to replace it. Thought I'd share this here as a follow-up to that story.

After that point, however, everything went to you-know-where in a basket.

The technician they sent was incompetent. He didn't have the correct tools or the skill to disassemble a laptop: he took a screwdriver to it very aggressively, gouged the crap out of the casing (which was immaculate and very well cared for), smacked it around (yes, literally smacked it around, slammed it on the table, etc.), lost screws, and in the end didn't get it back together correctly (I keep finding screws from it all over the place). On top of that, the laptop now overheats from the moment you press power (my guess is that in smacking it around he dislodged the heatsink assembly or cracked the motherboard), and it isn't running correctly at all, which it was, save the keyboard, until he touched it. He also broke parts off of it, and no, it was not back together correctly (the keyboard wasn't even seated, the case had a 1/4" gap on one side, etc.). It was three hours of my mother and me watching in astonished pain, because we honestly had zero clue what to do with the guy; we both wanted to do something but feared him flipping out and throwing the thing across the room.

We contacted the service company, got passed to Dell, and Dell insisted we send it in. We did, with the long laundry list of problems the system now has. Dell returned it having essentially just reassembled the computer, fixing neither the physical damage the tech caused nor the overheating, and the system still isn't running anything close to right: it takes ages to do anything and is constantly crashing. It's basically unusable in its current state. (And before anyone interjects: yes, we know what we're doing. The system is seriously borked, and it was perfectly fine, save a dead key on the keyboard, before all of this.)

She's currently in the midst of arguing with Dell over the situation, and she is not happy. She's also discovered that, unfortunately, the 17" desktop-replacement multimedia notebook like the one she had is now mostly a dead class. She's actually considering an ASUS ROG, as that's basically the only thing that really replaces what she had (a 17" 1920x1080 full-fat i7 with dual 500 GB 7200 RPM drives and dedicated graphics; these systems are very rare now, and most 17" machines are either cheap college specials she doesn't want or gaming notebooks, so guess what she's looking at, lol). Dell is offering a replacement, but she wants to know, very clearly, what Dell is going to send her as a "comparable" replacement.

Slightly OT, I know, but it's a follow-up to an earlier post I made about the computers I've owned.


----------



## iSlayer

Can't believe I actually wrote up this post. Prison food would probably be a compliment for some of these games...
Quote:


> Bloodborne


Potential, basically more Dark Souls
Quote:


> Uncharted


Racism simulator 2014, thankfully this marks the end of that series and Naughty Dog has to do something new. Also, oh look, a sequel.

Anyways, glad they're finally done sequelizing it.
Quote:


> MLB the Show


2K just dropped their MLB game, so the franchise that ticks up each year like EA's fare can enjoy even less reason to innovate. Hooray though, another number tacked onto the box, just like CoD.
Quote:


> Until Dawn


Horror games generally live or die by the plot and this has a plot only slightly more original than the missionary position.

Also, mainstream doing a good horror game? As likely as my fecal matter containing solid diamonds. Alien: Isolation was pretty special in that regard. What was the last non-indie horror game to pull that off before it, Condemned 2, I think?
Quote:


> Deep Down


Capcom, so it won't be exclusive for long. They're also trying to be Dark Souls while being generally more crap: it's a dungeon crawl, for one. For two, part of what makes Dark Souls is the plot, and Capcom has writers that are proud graduates of kindergarten and nothing else. If I were a professional "writer" for Capcom, I'd be as embarrassed as it gets and hide that fact whenever possible. Anyone who found it out, I'd use as an experiment in my scheme to take over the world.

Oops, I spoiled Resident Evil. You didn't say which Resident Evil, though, iSlayer.









Quote:


> jrpg's


Lol
Quote:


> GOW


Another gears of war?
Quote:


> w/e media molecule is making, the rpg that Guerrilla games is making


Speculation.mp4
Quote:


> and w/e Quantic Dream is making.


----------



## Boomer1990

Quote:


> Originally Posted by *iSlayer*
> 
> Potential, basically more Dark Souls
> Racism simulator 2014, thankfully this marks the end of that series and Naughty Dog has to do something new.
> 2k just dropped their MLB game, so the franchise that ticks up each year like EA's fair can enjoy even less reason to innovate. Hooray though, another number tacked onto the box, just like CoD.
> Horror games generally live or die by the plot and this has a plot only slightly more original than the missionary position.
> 
> Also, mainstream doing a good horror game? As likely as my fecal matter containing solid diamonds. Aliens: Isolation was pretty special in that regard. What was the last non-indie horror game to pull that lightning off, Condemned 2 I think?
> Capcom, so it won't be exclusive for long. Also, they're trying to be Dark Souls by being generally more crap. Dungeon crawl for one. For two, part of what makes Dark Souls is the plot, and Capcom has writers that are proud graduates of kindergarten and nothing else.
> Lol
> Another gears of war?
> Speculation.mp4


GOW on PS = God of War. Please explain how Uncharted is a racism simulator? MLB The Show has been one of the best sports games for a long time now, and 2K hasn't come close to touching it. MM and GG are both confirmed to be working on new IPs atm, so I listed them among the PS4's exclusives. What do you have against JRPGs? Heavy Rain was a good game.


----------



## John-117

Quote:


> Originally Posted by *KenjiS*
> 
> As for the Xbox One it drops the lossless to either DTS or Dolby whichever you choose. Which is not what I desire, the only way to get the "actual" lossless codecs to work is to let the Xbox decode them and send it over LPCM, the issue with this is the Dynamic Range is massively messed up *on my setup* when it does this for a Blu Ray. *Technically, and theoretically, it should be -exactly- the same but its unfortunately not because the Xbox is mucking around when it decodes the audio, even when you disable DRC and etc..*


It's either a problem with your setup/receiver, or you (like many others) don't know what you're talking about, no offense.
It's like saying FLAC sounds better than WAV, which is absurd. It doesn't matter who does the decoding (the Xbox or the AV receiver); the result will be 100% the same.
The only way, I guess, that people may be able to "hear" any difference is if they use old AV receivers that don't support Audyssey EQ (or equivalent) with LPCM.
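A quick editorial aside: the lossless claim above can be sanity-checked with any lossless scheme. The sketch below uses Python's `zlib` purely as a stand-in for an audio codec like FLAC or Dolby TrueHD (it is not an audio codec; the PCM bytes are made up for illustration):

```python
import zlib

# Stand-in for a lossless audio codec: by definition, decoding must
# return the original PCM samples bit-for-bit, no matter who decodes.
pcm = bytes(range(256)) * 64            # 16 KiB of stand-in "PCM" sample data

encoded = zlib.compress(pcm)            # "encode" losslessly (the disc side)
decoded = zlib.decompress(encoded)      # "decode" -- console or AV receiver

assert decoded == pcm                   # identical samples, by definition of lossless
```

Any audible difference therefore has to come from processing applied after or alongside the decode (mixing in system sounds, dynamic-range compression, resampling), not from the decode step itself.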


----------



## iSlayer

Heavy rain was a good game.

Good game.

Gameeeeee

Lol

Sorry, couldn't resist.

I said speculation.mp4 because I have nothing to say about basically nothing. Quantic Dream, we know, don't make good games, and David Cage is a very special writer, much like a sword made of penguins is very special.

God of War is still being sequelized? What more is there to do? What new intrigue can be brought about? Kratos has already killed god-dang Zeus, I thought; he essentially has no equal. Ask yourself: is this game an original idea that needs to see the light of day, or just publishers lining their wallets with more of basically the same game, stamped out by committee?


----------



## KenjiS

Quote:


> Originally Posted by *John-117*
> 
> It's either a problem with your setup/receiver or you (like many others) don't know what you are talking about, no offense.
> It's like saying FLAC sounds better than WAVE, which is absolutely ******ed. It doesn't matter who does the decoding (xbox or av receiver), the result will be 100% the same.
> The only way I guess, people may be able to "hear" any difference is if the use old av receivers which don't support Audyssey EQ(or equivalent) with LPCM.


Pretty sure it's not. I am NOT the only one complaining about it, and the Xbox One DOES muck with the audio because of the Snap functionality (it has to inject the system sounds for the Xbox/Snap/etc.). It IS applying processing of some kind that is essentially crushing and destroying all the detail in the audio. The audio is muffled; it sounds awful and is incredibly poor quality.

You know why I know this? Because I can take the same Blu-ray, put it in the PS3, and play it in DTS-HD MA/Dolby TrueHD, and it's fine. I can switch the PS3 to do the decoding and send LPCM to the receiver, and it's fine. I can play LPCM from my desktop: fine. I can watch Blu-rays encoded in LPCM: again, fine. It's the Xbox One's decoding software for Blu-ray discs (games and Netflix are fine; Blu-rays are not).

If everything else is completely fine except the Xbox One's Blu-ray playback module, I suspect it's the Xbox One and not the rest of my system. And again, I'm not the only person out there complaining that the Xbox One's handling of Blu-ray audio specifically is muddled and messed up.

THEORETICALLY, yes, a decoded LPCM stream should sound 100% exactly like the lossless codecs, and I'm not disagreeing with that, if the audio is simply decoded and sent to your receiver. But if it's decoding the lossless audio, processing it improperly, and injecting system audio into it (which the Xbox One is doing), rather than handing it off as-is, then it may not sound right.

Also, using the R word is not a good way to make your point. No need to reduce yourself to slurs.

Edit: That's also not getting into the fact that the Xbox puts the surround channels on the wrong speakers if you play a 5.1 source on a 7.1 system. Seriously. If the Xbox is the one piece of equipment that isn't playing nicely with my system, I'm not spending the time to make it work right; it should simply work like every other Blu-ray player I've owned. My PS3 works fine, my PC works fine, my smart TV works fine, and so do the FiOS box and the Roku. I have a laundry list of devices where the audio works 100% perfectly, and then there's the Xbox One, which works fine unless it's a Blu-ray, because its handling of Blu-ray audio is poor. Again, I'd be fine if it just handled the decoding correctly, but it doesn't.


----------



## Boomer1990

Quote:


> Originally Posted by *iSlayer*
> 
> Heavy rain was a good game.
> 
> Good game.
> 
> Gameeeeee
> 
> Lol
> 
> Sorry, couldn't be resisted.
> 
> I said speculation.mp4 because I have nothing to say about basically nothing. Quantic Dream we know don't make good games and David Cage is a very special writer. Much like a sword made of Penguins is very special.
> 
> God of War is still being sequelized? What more is there to do? What new intrigue can be brought about. Kratos has already killed god dang Zeus I thought. He has no equal essentially. Ask yourself, is this game for publishers to line their wallet with more of basically the same game being stamped out through design by committee or an original idea that needs to see the light of day.


Hmmm, about your statement that "WE" know Quantic Dream doesn't make good games... http://www.metacritic.com/game/playstation-3/heavy-rain
http://www.amazon.com/Heavy-Rain-Greatest-Hits-Playstation-3/product-reviews/B002CZ38KA/ref=cm_cr_dp_qt_see_all_top?ie=UTF8&showViewpoints=1&sortBy=byRankDescending

Fans have been wanting another God of War, so they're making another. So white dude killing foreign people = racism?

I guess that means almost every war game is a racism simulator as well, because usually we play as a white dude killing Russians/Chinese/North Koreans.


----------



## iSlayer

Quote:


> Originally Posted by *Boomer1990*
> 
> Hmmm, your statement on "WE" know Quantic Dream does not make good games... http://www.metacritic.com/game/playstation-3/heavy-rain
> http://www.amazon.com/Heavy-Rain-Greatest-Hits-Playstation-3/product-reviews/B002CZ38KA/ref=cm_cr_dp_qt_see_all_top?ie=UTF8&showViewpoints=1&sortBy=byRankDescending


The point was that they don't make games, since they don't understand that games are meant to be interactive.

I don't count metascores for much. MGS4 got 10/10s like it cured cancer; talk to some franchise fans about how good MGS4 is. They'll laugh at you.
Quote:


> Fans have been wanting another God of War so they are making another. So white dude killing foreign people = racism
> 
> 
> 
> 
> 
> 
> 
> I guess that means almost every war game is a racism simulator as well because usually we play as a white dude killing Russians/Chinese/North Koreans.


Not exactly a new complaint lol. Most on OCN haven't forgotten the BF4 controversy (I'd expect). Then again, its crappiness kinda overshadowed the controversy round these parts.

Fans wanted another. Again, does this mean the devs have an original idea, or is this just ticking up the number on the franchise?


----------



## Orangey

Metacritic is the most gamed metric ever. Publishers refer to it explicitly in contracts and dealings with journalism outlets.


----------



## t00sl0w

Quote:


> Originally Posted by *KenjiS*
> 
> -edit- Thats also not getting into the fact the Xbox throws the surround channels on the wrong speakers if you play a 5.1 source on a 7.1 system. Seriously. If the Xbox is the one piece of equipment that is not playing nicely with my system i am not spending the time to make it work right, it should simply work like every other Blu Ray player I've owned and oh right, my PS3, which works fine, and my PC, that works fine, and my Smart TV, THAT WORKS FINE. Oh and the FiOS box, the Roku... Yeah I have a laundry list of things where the audio works 100% perfectly fine and then the Xbox One where it works fine unless its a Blu-Ray, because its handling of Blu-Ray audio is poor, Again, I'd be fine if it just handled the decoding correctly, but it doesnt


going slightly OT for a second, but you say your audio works correctly on your Roku... does that mean it simply passes through what you send it appropriately,
OR
that things like DTS-HD MA, Dolby TrueHD, LPCM, etc. are passed through on the Roku?
I have a Roku 3, and all of my BR rips contain the highest-level HD audio track the Blu-ray had, but the Roku always defaults to the DTS core and moves Dolby TrueHD to DD 640; same with LPCM, it gets moved to DD 640.
the tracks do work with other devices; for instance, when my laptop feeds the receiver with passthrough, everything works.


----------



## Tempest2000

Holy crap, iSlayer must really hate videogames. I wonder why he's in this thread?


----------



## perfectblade

Quote:


> Originally Posted by *Tempest2000*
> 
> Holy crap, iSlayer must really hate videogames. I wonder why he's in this thread?


it's the sarkeesian strategy: pretend you at one point played games to make it seem less ridiculous when you demand they conform to your social justice dictates that would literally destroy the marketability of all games.


----------



## iSlayer

I'm well known for being about that social justice life. For example, Metal Gear Solid: I hate that sexist franchise so much that I subscribed to an awesome Twitch channel dedicated to the Metal Gear (Solid) games. I see claims of sexism against Kojima as a knee-jerk reaction to Quiet from people who ignore the sexualization of the protagonists. Snake, Raiden, Big Boss: all have firm, muscled buttocks covered by skin-tight suits that you, the player, spend a lot of time looking at. Raiden was even designed to be more popular with women, lol.

But hey, I'm just an industry troll.

I realize I wasn't direct enough: if Kojima is sexist, I'm a granny.
Quote:


> Originally Posted by *Tempest2000*
> 
> Holy crap, iSlayer must really hate videogames. I wonder why he's in this thread?


I spent the majority of my day playing Promod and TF2 "pugs", lol. The hatred of video games is pretty unreal, I tell ya what.
Quote:


> Originally Posted by *perfectblade*
> 
> it's the sarkeesian strategy. pretend you at one point played a game to make it seem less ridiculous when you demand they conform to your social justice dictates that would literally destroy the marketability of all games


I...

Guess I'm getting out the laptop.


I don't know what kind of Sarkeesian strategist owns 94 games on Steam, most of which SJWs do or would complain about. But uh, keep going with that, you almost had a point if you weren't so blatantly wrong.

Are there Sarkeesian strategists who keep up with the competitive CS scene (Pasza #1!)? Who have sunk over 5k hours into MMOs? That have a love of FPSs and competitive ones especially, particularly Promod and CPMA, and are actually pretty alright at them? Have they also spent thousands of hours as a console gamer, from Crash to 007, Armored Core to Tomb Raider to Red Faction, Halo to CoD, Jak to Midnight Club, Metroid to Pokemon to Phoenix Wright?

Now, i'm going to go pass out while watching uKnighted, I think it was ThreeDogg running Peace Walker, content with the knowledge that i'm a fake gamer grill. PerfectBlade, you deserve a round of applause for figuring out that i'm a fraud.


----------



## Orangey

>owns 94 games
>on a Mac


----------



## Ganf

^ Had to code 87 of them by himself.

^ Most of them are revisions of Solitaire.


----------



## jameschisholm

Do you guys think they will ever stop sequelizing Assassin's Creed and CoD, and that those devs will release a new IP? It's getting a bit repetitive, don't you agree?


----------



## SpeedyVT

Quote:


> Originally Posted by *jameschisholm*
> 
> Do you guys think they will ever stop sequelizing Assassin's Creed and CoD, and that those devs will release a new IP? It's getting a bit repetitive, don't you agree?


Not until it's a dead horse.


----------



## iSlayer

Quote:


> Originally Posted by *SpeedyVT*
> 
> Not until it's a dead horse.


What you meant was...

Not until it once was a dead horse but has now been eaten by people who later died, became grass, and were then eaten by another horse who has since died.


----------



## KenjiS

Quote:


> Originally Posted by *Orangey*
> 
> >owns 94 games
> >on a Mac


Hey, I think that's impressive! It must be hard to get the ENTIRE library of Mac games, after all!

...And that was a light-hearted jab, not serious. I feel like I shouldn't even have to say that.


----------



## iSlayer

Quote:


> Originally Posted by *KenjiS*
> 
> Hey I think thats impressive! Must be hard to get the ENTIRE library of Mac games after all!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...And that was a light hearted jab, Not serious. I feel like I shouldnt even have to say that


That joke once was a dead horse but has now been eaten by people who later died, became grass, and were then eaten by another horse who has since died.

You're like two days late.


----------



## KenjiS

Quote:


> Originally Posted by *iSlayer*
> 
> That joke once was a dead horse but has now been eaten by people who later died, became grass, and were then eaten by another horse who has since died.
> 
> You're like two days late.


I can't tell if you're actually angry about this or just screwing with me.


----------



## Carniflex

Quote:


> Originally Posted by *KenjiS*
> 
> Hey I think thats impressive! Must be hard to get the ENTIRE library of Mac games after all!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...And that was a light hearted jab, Not serious. I feel like I shouldnt even have to say that










The Steam winter sale is currently on, and there are 2,058 games on sale for Mac OS X.


----------



## KenjiS

Quote:


> Originally Posted by *Carniflex*
> 
> 
> 
> 
> 
> 
> 
> 
> There is currently Steam winter sale. There is 2058 games on sale for Mac OS X


That's a lot more than when I had a Mac.

Seriously though, I know the Mac has a bunch of games; I was just teasing.


----------



## 420swag

I'll believe this in like 4 years.


----------



## GetToTheChopaa

Quote:


> Originally Posted by *Ganf*
> 
> ^ Had to code 87 of them by himself.
> 
> ^ Most of them are revisions of Solitaire.


Almost made me spill my coffee!

Rep+!


----------



## degenn

Quote:


> Originally Posted by *Orangey*
> 
> If I ever become an audiophile shoot me.


I'd like to volunteer to take the first shot. Barrett .50 cal, of course.


----------



## KenjiS

Quote:


> Originally Posted by *degenn*
> 
> I'd like to volunteer to take the first shot. Barrett .50 cal, of course.


Why use a .50 cal when a .338 Lapua will do...


----------



## iSlayer

That's how we ended up with these consoles. And .50 cal? Tank shell or bust.


----------



## Orangey

I have built up an immunity to bullets by shooting myself with 9mm and working up to .700 Nitro Express.


----------

