# [ipon] Shadow of Mordor: 6GB of VRAM for ultra textures



## Alatar

Quote:


> In addition, the game clearly shows how much VRAM you will need: the Ultra texture package requires at least 6GB, while the High package is satisfied with 3GB.


Source
Source (translated)

Couldn't find an English article on this, but I'll update the news post if one pops up. It will be interesting to see whether benchmarks actually back up the need for this much VRAM.


----------



## ZealotKi11er

Cool but what card did they use to develop? GTX Titan?


----------



## kingduqc

That's two games confirming the need for over 3GB, yet not so long ago, if you said this would happen, every 780 Ti owner would take pleasure in explaining how "wrong" you were.

Kinda sad that even high-end cards like the 980 won't meet the requirement. I just wonder how much it will actually impact how the textures look.


----------



## Clocknut

As I said before: new console port = large VRAM requirement.

It has begun...

GTX 950/960 4GB > 780 Ti 3GB


----------



## Biorganic

Quote:


> Originally Posted by *kingduqc*
> 
> That's two games confirming the need for over 3GB, yet not so long ago, if you said this would happen, every 780 Ti owner would take pleasure in explaining how "wrong" you were.
> 
> Kinda sad that even high-end cards like the 980 won't meet the requirement. I just wonder how much it will actually impact how the textures look.


Sounds to me like a laziness issue more than a quality-texture situation. If developers know they have 3-5GB of VRAM to play with, they probably won't compress textures or optimize things as much...

I'm not upgrading my 7950s for at least another year. Then I'll really see how my "lack" of VRAM affects my gaming experience...


----------



## Ramzinho

I believe this is absurd. The most advanced GPUs now have 4GB (yes, there are 6GB cards, but those are special editions). Either these ultra settings look super impressive with tons of detail, or the game is terribly optimized, and I tend to believe the latter. Devs are, as usual, lazy with port optimization...


----------



## Junkboy

Quote:


> Originally Posted by *kingduqc*
> 
> That's two games confirming the need for over 3GB, yet not so long ago, if you said this would happen, every 780 Ti owner would take pleasure in explaining how "wrong" you were.
> 
> Kinda sad that even high-end cards like the 980 won't meet the requirement. I just wonder how much it will actually impact how the textures look.


This game looks like it might be able to have awesome visuals with massive textures. But The Evil Within, a game developed by a JP studio, is primarily a PS3/360 title being ported to PC/PS4/Xbone. I'll give Mordor the benefit of the doubt, but The Evil Within's list of "recommended" specs is just a joke.

Though I guess The Evil Within could be the new GTA IV and a new benchmark for atrocious ports! 3GB will still be fine for the conceivable future, and more won't be truly needed for another year or two, save for maybe a handful of titles.


----------



## curly haired boy

really hoping big maxwell gives us at least 8 gigs of VRAM....


----------



## Neilthran

How many video cards at the consumer level have 6GB of vram right now?


----------



## routek

I said at launch that a $700 card with only 3GB of VRAM was rubbish, and that's the 780 Ti. Yes, performance was good, but you could get 3GB of VRAM on a 7950 for $200. The GTX 690's 2GB of VRAM was awful even then. Some said games didn't require it, but I looked at what I wanted to do with settings, mods, and value, and thought Nvidia was awfully tight with their VRAM. Do not want.

Also, more importantly, 780 Ti SLI has so much GPU power and such a tiny amount of VRAM that it's shocking to me anyone would bother. It's a waste of GPU power, like the 690 was not long after launch. Instead of supporting that, you should've gotten two 680 4GB cards for SLI instead of a 690.

A $700 GPU released 10 months ago should come with 6GB of VRAM, I don't care what you say. It's poor value, and it's about time Nvidia started upping their game. The 970 is not bad.


----------



## tpi2007

This doesn't surprise me at all. I've been saying since 2012 that with the new consoles coming, the 2 GB of VRAM on the GTX 600 and 700 series (the ones based on GK104) wasn't going to be enough, which is why I wouldn't buy one of those cards.

11/7/12:
Quote:


> Originally Posted by *tpi2007*
> 
> I usually buy Nvidia cards, and although AMD seems to be doing a good job on both architecture and now drivers, I'm still favoring Nvidia. However, having a GTX 480, I just can't justify buying a Kepler-based card. Not even a GTX 670, which I think is overpriced for what it offers. *Next year, with the new consoles coming out, PC games will start using more resources, and with them more VRAM.* Right now, the GTX 680 and GTX 670 only come with 2 GB of VRAM, with 4 GB versions more expensive, often by €50, while AMD 7900 series cards have 3 GB of VRAM. My bet is that the top-of-the-line Kepler cards will drop a lot in price in the first half of next year. But even then, I still wouldn't buy one; these Keplers always seemed like an interim solution. To be honest, not even at €200 would I buy one. I will wait for a proper high-end card, with a 384-bit memory bus and 3 GB of VRAM.


----------



## Ramzinho

Quote:


> Originally Posted by *Neilthran*
> 
> How many video cards at the consumer level have 6GB of vram right now?


Titans, the 295X2, and some special-edition 7970s.


----------



## B!0HaZard

Quote:


> Originally Posted by *kingduqc*
> 
> That's two games confirming the need for over 3GB, yet not so long ago, if you said this would happen, every 780 Ti owner would take pleasure in explaining how "wrong" you were.
> 
> Kinda sad that even high-end cards like the 980 won't meet the requirement. I just wonder how much it will actually impact how the textures look.


I think there's a significant difference between Shadow of Mordor asking for 6 GB for ultra textures yet scaling down to 1 GB on low, and The Evil Within not even being guaranteed to run properly on less than 4 GB. Even if 6 GB is unrealistic now, it will increase the longevity of the game, because it will take longer before the textures look outdated. Crysis could not realistically run at max settings on a single-GPU card when it released, but that also made it live much, much longer than it otherwise would have. This ultra texture option will also largely eliminate the need for the "HD texture mods" that are so prevalent in many other games.

I'm sure The Evil Within will have a lower texture-quality option, since they insist that 4 GB is the recommended requirement to play the game "as the developers intended", but that doesn't really say anything about the actual requirement. It could need just 512 MB to run at high texture settings, and the developer could still claim the 4 GB ultra setting is needed for the "optimal playing experience" (completely ignoring that playing the game with low-resolution textures is better than not playing it at all). It is a huge mistake by Bethesda/Tango not to give us minimum requirements.


----------



## Alatar

Quote:


> Originally Posted by *Neilthran*
> 
> How many video cards at the consumer level have 6GB of vram right now?


Titan
Titan Black
Titan Z
Custom 780s (very few models)
Custom 7970s (very few models)
Custom 290Xs (the 8GB Sapphire model; only around 250 made, sold only in the UK and Germany)

+ some GK104 based mobile GPUs.


----------



## farmdve

Ridiculous. Although I won't play this game for other reasons, I see this only as a gimmick to get people to buy more and better cards.

But to think that 2GB is no longer enough...


----------



## DADDYDC650

Some folks said the 6GB on my card was worthless....


----------



## HiTechPixel

I commend developers for upping recommended hardware requirements to push PC hardware along, but I doubt this game requires 6GB+ of VRAM for its ultra textures.

1.) It's a fairly small game, and bigger games don't consume anywhere close to 6GB of VRAM.

2.) Many "next-gen" games with absurd recommended hardware requirements turned out to be nothing of the sort. Just look at Watch Dogs.


----------



## Ramzinho

Quote:


> Originally Posted by *Alatar*
> 
> Titan
> Titan Black
> Titan Z
> Custom 780s (very few models)
> Custom 7970s (very few models)
> Custom 290Xs (the 8GB Sapphire model; only around 250 made, sold only in the UK and Germany)
> 
> + some GK104 based mobile GPUs.


Doesn't the 295X2 carry 8GB of VRAM?


----------



## Alatar

Quote:


> Originally Posted by *Ramzinho*
> 
> doesn't the 295X carry 8GB Vram?


Yes, but it's mirrored (so only 4GB is usable) because the 295X2 is just CrossFire on a stick.
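The "mirrored" point above can be sketched in a few lines: under alternate-frame rendering (the mode CrossFire and SLI normally use), each GPU renders complete frames, so every GPU holds its own full copy of the assets. A minimal, hypothetical helper (the card names in the comments are just the examples from this thread):

```python
def effective_vram_gb(per_gpu_gb):
    """Usable VRAM for an AFR (alternate-frame rendering) multi-GPU setup.

    Assets are duplicated on every GPU, so the usable pool is the
    smallest single GPU's memory, not the sum across GPUs.
    """
    return min(per_gpu_gb)

print(effective_vram_gb([4, 4]))  # R9 295X2: "8 GB" on the box, 4 GB usable
print(effective_vram_gb([3, 3]))  # 780 Ti SLI: still a 3 GB working set
```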


----------



## Kuivamaa

I don't think it is a gimmick, just a paradigm shift. Video cards will be made with more VRAM from now on, that's about it. And yes recent/expensive models can go obsolete in the process. 15 years ago even potent processors could be junk after 18 months.


----------



## Neilthran

Quote:


> Originally Posted by *Ramzinho*
> 
> Titans, the 295X2, and some special-edition 7970s.


Quote:


> Originally Posted by *Alatar*
> 
> Titan
> Titan Black
> Titan Z
> Custom 780s (very few models)
> Custom 7970s (very few models)
> Custom 290Xs (the 8GB Sapphire model; only around 250 made, sold only in the UK and Germany)
> 
> + some GK104 based mobile GPUs.


So high-end and top-end cards. Most people don't have those. Developers don't care to optimize games much anymore; I suppose it's cheaper for them. If the game looked incredible, I could understand, but that's not the case: the graphics are nothing special.


----------



## Ramzinho

Quote:


> Originally Posted by *Alatar*
> 
> Yes but it's mirrored (and due to this only has 4GB usable) because the 295X2 is just CFX on a stick.


Pfff. I thought being on one PCB meant the full 8GB was usable. My bad, then.


----------



## tpi2007

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alatar*
> 
> Yes, but it's mirrored (so only 4GB is usable) because the 295X2 is just CrossFire on a stick.
> 
> 
> 
> Pfff. I thought being on one PCB meant the full 8GB was usable. My bad, then.

It's the way both CrossFire and SLI work (it doesn't matter whether the GPUs are on the same PCB or not), and it's why someone with 780 Tis in SLI is going to be feeling awkward very soon: lots of GPU power and only 3 GB of VRAM. There are no 780 Tis with 6 GB of VRAM because Nvidia wanted that feature to further separate the Titan Black from the 780 Ti.


----------



## GoldenTiger

Quote:


> Originally Posted by *tpi2007*
> 
> This doesn't surprise me at all. I've been saying since 2012 that with the new consoles coming, the 2 GB of VRAM on the GTX 600 and 700 series (the ones based on GK104) wasn't going to be enough, which is why I wouldn't buy one of those cards.
> 
> 11/7/12:


You did, and I remember people saying that, but the truth is that it's almost two years later, and most people will be upgrading their cards (if they haven't already) if they run into a game where VRAM becomes an issue. A few shoddy console ports (yet to be shown to actually need the VRAM capacity) wouldn't worry me much if I still owned a 2GB card (the last one I owned was a GTX 670, a loooong way back).


----------



## Promisedpain

Knowing this, why the hell does the 980 only have 4GB of VRAM? Anyway, I won't be upgrading my 780 Ti until the 980 Ti arrives, or I might even wait for the next gen.


----------



## sugarhell

6GB of VRAM for high-res textures? I hope it's over-8K textures; otherwise it's a poor job from Ubisoft...


----------



## GoldenTiger

Quote:


> Originally Posted by *sugarhell*
> 
> 6GB of VRAM for high-res textures? I hope it's over-8K textures; otherwise it's a poor job from Ubisoft...


8k uncompressed textures on every pebble on-screen!


----------



## tpi2007

Quote:


> Originally Posted by *GoldenTiger*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tpi2007*
> 
> This doesn't surprise me at all. I've been saying since 2012 that with the new consoles coming the GTX 600 and 700 (the ones based on GK104) series' 2 GB of VRAM wasn't going to be enough and why I wouldn't buy one of those cards.
> 
> 11/7/12:
> 
> 
> 
> You did, I remember people saying that, but the truth is that it's almost 2 years later and most people will be upgrading their cards (if they hadn't already since then) if they run into a game that VRAM becomes an issue in. A few shoddy console ports (yet to be shown to actually need the VRAM capacity) wouldn't be worrying me much if I still owned a 2GB card (last one I owned was a GTX 670 a loooong ways back).

That's true, but it's also true that a GTX 680 SLI setup (or a GTX 690) should still be able to handle these titles at max details... at 1080p.

Also, tell that to people who bought a GTX 770, which came out a year later. That card is the worst offender, in my opinion.

In any case, Nvidia's Kepler lineup was mind-boggling: they gave 2 GB of VRAM all the way down to the GTX 650 Ti Boost. You can see how weird it is that the top-of-the-line GTX 680 also came with 2 GB of VRAM. Something didn't add up, even back then.

But you don't even have to look at eventual shoddy console ports: PC games with modded textures were already using lots of VRAM while still giving playable framerates. That was a look into the future right there.


----------



## GoldenTiger

Quote:


> Originally Posted by *Promisedpain*
> 
> Knowing this, why the hell does the 980 only have 4GB of VRAM? Anyway, I won't be upgrading my 780 Ti until the 980 Ti arrives, or I might even wait for the next gen.


Because a ridiculously spec'd console port or two doesn't indicate the broader market trend, and 8GB 970/980 models are coming soon anyway (in around a month). It's extremely difficult to actually hit the VRAM wall at 4GB, even at 4K resolution (I've tried!), to where a game even allocates 4GB, let alone sees a performance impact (unless you slap on something ludicrous like 8x MSAA, which at 4K is beyond pointless), even in games like Battlefield 4, ArcheAge, Elder Scrolls Online, and Tomb Raider at maxed (or just shy of maxed) settings. By the time 4GB becomes any kind of real limiter, even at 4K, most people will be eyeing an upgrade regardless.

(Yes, I know you can do it with a literal handful of modded singleplayer games by adding 8192x8192 textures to every table and pebble on the ground with horridly optimized fan texture packs.)


----------



## tpi2007

Quote:


> Originally Posted by *GoldenTiger*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Promisedpain*
> 
> Knowing this, why the hell does the 980 only have 4GB of VRAM? Anyway, I won't be upgrading my 780 Ti until the 980 Ti arrives, or I might even wait for the next gen.
> 
> 
> 
> Because a ridiculously spec'd console port or two doesn't indicate the broader market trend, and 8GB 970/980 models are coming soon anyway (in around a month). It's extremely difficult to actually hit the VRAM wall at 4GB, even at 4K resolution (I've tried!), to where a game even allocates 4GB, let alone sees a performance impact (unless you slap on something ludicrous like 8x MSAA, which at 4K is beyond pointless), even in games like Battlefield 4, ArcheAge, Elder Scrolls Online, and Tomb Raider at maxed (or just shy of maxed) settings. By the time 4GB becomes any kind of real limiter, even at 4K, most people will be eyeing an upgrade regardless.
> 
> (Yes, I know you can do it with a literal handful of modded singleplayer games by adding 8192x8192 textures to every table and pebble on the ground with horridly optimized fan texture packs.)

But that is the point: you're talking about games designed with the last generation of consoles in mind; they have nothing to do with this paradigm shift. The new consoles have 8 GB of shared RAM/VRAM, so expect new games to take advantage of at least 4 GB so they can start distancing themselves from the last generation and give people a reason to buy the new consoles. The market needs a "Wow!" moment, where you can see the differences quite clearly.


----------



## GoldenTiger

Quote:


> Originally Posted by *tpi2007*
> 
> That's true, but it's also true that a GTX 680 SLI setup (or a GTX 690) should still be able to handle these titles at max details... at 1080p.
> 
> But you don't even have to look at eventual shoddy console ports: PC games with modded textures were already using lots of VRAM while still giving playable framerates. That was a look into the future right there.


Point taken; however, I think they'll be just fine within the limits of their horsepower, at least. I'd be entirely unconcerned if I still owned them based on a leaked spec or two showing VRAM requirements for max settings that no general consumer card on the market meets (the Titan is a prosumer card, plus a very limited number of GTX 780 special editions). I'd say it'll be another year and a half or more before 2GB is a true limiter on what settings you can run with a 680-class card in SLI (be it SLI on a stick or not) that would otherwise run fine if not for the VRAM capacity.

I generally don't count mod packs, since most of them (and all are fan-made) are horrendously poorly optimized, applying extreme texture resolutions and normal maps to items that will never take up more than 1/100th of the screen space (my post about 8K texture-mapped pebbles wasn't really a joke). You can usually find a variant by the same author that takes up a fraction of the video memory while looking identical on-screen.
Quote:


> Originally Posted by *tpi2007*
> 
> But that is the point: you're talking about games designed with the last generation of consoles in mind; they have nothing to do with this paradigm shift. The new consoles have 8 GB of shared RAM/VRAM, so expect new games to take advantage of at least 4 GB so they can start distancing themselves from the last generation and give people a reason to buy the new consoles. The market needs a "Wow!" moment, where you can see the differences quite clearly.


8GB of RAM, with 3GB taken by system functions, leaves *5GB* shared between system and video functions in the case of the PS4, for example. Considering most modern games ask for at minimum 2-2.5GB of system memory, most console games will probably use between 1.5-2GB of memory for graphics.

Wasting 4K textures on objects isn't a "WOW" moment when the user's monitor is 1080p and literally cannot display that resolution. Among us devs it's just called poor optimization, the same as the only other real method you could use to push the requirement to 6GB for a 1080p-intended display: uncompressed textures and other absurdities.


----------



## pwnzilla61

I'm not going to pay another $500+ for a GPU to run a $60 console port. I shouldn't have to adhere to console requirements with a gaming PC.


----------



## Alatar

I wouldn't be so quick to call Shadow of Mordor a pure console port.

It's made by Monolith, and the game seems to have a reasonable list of PC-specific options too:

(settings screenshot)

And the ultra textures that carry the 6GB recommendation seem to be available only through a UHD texture pack that you download separately.

Also, on an unrelated note, I'm pretty excited as a LotR fan, especially since the game seems to be getting mostly glowing reviews.


----------



## GoldenTiger

Quote:


> Originally Posted by *Alatar*
> 
> I wouldn't be so quick to call Shadow of Mordor a pure console port.
> 
> It's made by Monolith, and the game seems to have a reasonable list of PC-specific options too:
> 
> (settings screenshot)
> 
> And the ultra textures that carry the 6GB recommendation seem to be available only through a UHD texture pack that you download separately.
> 
> Also, on an unrelated note, I'm pretty excited as a LotR fan, especially since the game seems to be getting mostly glowing reviews.


I was referring to it as a console port because I saw others in the thread doing so, but I would have made literally the same points minus the words "console port" otherwise.

Sounds like I'm going to want to look into buying this game, though, if it's LOTR-based.


----------



## Takla

Let's see if Shadow of Mordor and The Evil Within run like Dead Rising 3. The textures in Shadow of Mordor look good, but not 6GB-of-VRAM good. Crysis 3 at 1920x1080, with everything set to Very High and 8x MSAA, only takes 2.3GB, and it looks better than both of these.


----------



## RSharpe

Quote:


> Originally Posted by *routek*
> 
> I said at launch that a $700 card with only 3GB of VRAM was rubbish, and that's the 780 Ti. Yes, performance was good, but you could get 3GB of VRAM on a 7950 for $200. The GTX 690's 2GB of VRAM was awful even then. Some said games didn't require it, but I looked at what I wanted to do with settings, mods, and value, and thought Nvidia was awfully tight with their VRAM. Do not want.
> 
> Also, more importantly, 780 Ti SLI has so much GPU power and such a tiny amount of VRAM that it's shocking to me anyone would bother. It's a waste of GPU power, like the 690 was not long after launch. Instead of supporting that, you should've gotten two 680 4GB cards for SLI instead of a 690.
> 
> A $700 GPU released 10 months ago should come with 6GB of VRAM, I don't care what you say. It's poor value, and it's about time Nvidia started upping their game. The 970 is not bad.


This... I've been on 3GB GTX 580s for four years. It's a joke for Nvidia to force ultra-short upgrade cycles on consumers by releasing a top-end card with so little VRAM, especially knowing that lazy console ports will require tons of memory and that displays are moving to 1440p/1600p/4K.


----------



## RSharpe

Quote:


> Originally Posted by *Promisedpain*
> 
> Knowing this, why the hell does the 980 only have 4GB of VRAM? Anyway, I won't be upgrading my 780 Ti until the 980 Ti arrives, or I might even wait for the next gen.


Hopefully, because big boy Maxwell is just around the corner.


----------



## Kimir

"Assuming a 1080p rendering resolution"... sigh, that's so last year.


----------



## maarten12100

More like 6GB allocated, 2GB in use.
Nice try fooling people!


----------



## sugarhell

Quote:


> Originally Posted by *maarten12100*
> 
> More like 6GB allocated, 2GB in use.
> Nice try fooling people!


Even for allocation and prebuffering, this is too much.


----------



## GoldenTiger

Quote:


> Originally Posted by *maarten12100*
> 
> More like 6GB allocated, 2GB in use.
> Nice try fooling people!


Probably.

Also, those who are truly (overly, I'd say... but it's your money) concerned can buy 8GB GTX 970/980 cards in under a month, according to Gibbo (Overclockers UK).


----------



## xXUNLUCKYXx

Just wait for the performance results; everything will become clear.


----------



## friend'scatdied

Well, a common argument against cards with double the reference memory was that the underlying GPU was not fast enough to ever make use of the additional memory.

One might argue this is still the case with 6GB GK110s and 8GB GM204s, but is there any data to justify otherwise yet?

In any case an 18-month upgrade cycle should line me up for 8GB big-die Maxwell.


----------



## Promisedpain

Quote:


> Originally Posted by *GoldenTiger*
> 
> Probably.
> 
> Also, those who are truly (overly, I'd say... but it's your money) concerned can buy 8GB GTX 970/980 cards in under a month, according to Gibbo (Overclockers UK).


Not concerned. I'll be OK even if I have to use High (and I probably will, since the 780 Ti only has 3GB of VRAM). It's only one game, and I won't die if I have to lower the textures a little bit; even the most powerful PCs back in the day couldn't run Crysis maxed at 1080p/60.

Most recommended specs are misleading anyway; Watch_Dogs (especially with the new patch) works just fine with an i5 and a 2GB card, and the same goes for Wolfenstein.


----------



## iSlayer

Recommending an i7 over an i5 for gaming...

Yeah, I'm sure those 780 Ti owners are going to be really sad and find this game unplayable...

/s
Quote:


> Originally Posted by *friend'scatdied*
> 
> Well, a common argument against cards with double the reference memory was that the underlying GPU was not fast enough to ever make use of the additional memory.
> 
> One might argue this is still the case with 6GB GK110s and 8GB GM204, but is there any data to justify otherwise yet?
> 
> In any case an 18-month upgrade cycle should line me up for 8GB big-die Maxwell.


Thank you.


----------



## GoldenTiger

Quote:


> Originally Posted by *iSlayer*
> 
> Recommending an i7 over an i5 for gaming.
> 
> Yeah, I'm sure those 780 Ti owners are going to be really sad and find this game unplayable...
> 
> /s
> Thank you.










Time to throw them in the dumpster; they're not even worth selling after eBay fees now!

/s


----------



## Ramzinho

Quote:


> Originally Posted by *sugarhell*
> 
> 6GB of VRAM for high-res textures? I hope it's over-8K textures; otherwise it's a poor job from Ubisoft...


Ohhhh, Ubisoft, you said...


----------



## GoldenTiger

Quote:


> Originally Posted by *Ramzinho*
> 
> Ohhhh, Ubisoft, you said...


8K textures, which no consumer display could show at full resolution even on an object taking up the entire screen!

(meme voice) "Sounds legit."


----------



## Slaughtahouse

It probably uses quite a bit of VRAM, but I honestly doubt that 6GB is REQUIRED for Ultra @ 1080p. As others said, wait till the benchmarks/reviews come out.

After watching IGN's review, I really want to get this game now. I was hoping for a good LotR game; it's been FAAAR too long. Now we've finally got one.


----------



## Noufel

Phew, I'm relieved I can still sell my 290s, unlike those who have 3GB cards (780, 780 Ti...).


----------



## RagingCain

They are just lazily using the VRAM for storage.

As a programmer, this is just stupid.


----------



## Xeio

Admittedly, I did not expect my shiny new 980 to be obsolete in less than a week.

I have a feeling this is over-exaggerated like Watch Dogs was, but I suppose we'll see...


----------



## perfectblade

PC optimization is such a joke.


----------



## sugarhell

Quote:


> Originally Posted by *GoldenTiger*
> 
> 8K textures, which no consumer display could show at full resolution even on an object taking up the entire screen!
> 
> (meme voice) "Sounds legit."


Emm? Go look at Rage's 16K textures vs. the default ones.


----------



## John Shepard

Why hasn't Nvidia released a 12GB Titan already?
We need MOAR memory.


----------



## szeged

Rofl, 6GB @ 1080p... Yeahhhhhhhh, nope. I bet we'll be seeing 780s running Ultra just fine.


----------



## fleetfeather

If you developed around a 6GB VRAM requirement for Ultra settings, you didn't develop very well.

In fact, you under-developed.


----------



## Marc79

I knew it: 2-3GB of VRAM just became obsolete with these new "next-gen" ports.

It looks like the Maxwell refresh is the next upgrade for me.


----------



## CBZ323

What can I say? The game doesn't even look that good.

I doubt the texture pack will make it that much better.


----------



## Noufel

The VRAM wars commence!!!
So with my 290 CF I'll be stuck at High.


----------



## staryoshi

At 1080p? I'm not buying it. We'll see when the game releases. This sounds like a recommended power-supply wattage to me.

They probably just set the recommendation to "Titan," which happens to have 6GB of VRAM.


----------



## xXUNLUCKYXx

With the consoles using 8GB of unified memory, devs face more work splitting tasks between two pools of memory on the PC. Just more time and money for them, I suppose.


----------



## Aparition

6GB?

So they are providing the raw, non-compressed dev texture files?

There's probably no difference between High and Ultra save for compression. With downsampling now a mainstream consumer feature thanks to Nvidia, I doubt anyone needs Ultra.


----------



## paulerxx

Is this a joke? Does anyone have a screenshot of these 6GB VRAM textures?

Let's list the crappy ports we've seen from the next generation of consoles so far: COD Ghosts, Watch Dogs, Dead Rising 3... I know there are more; someone add to the list.


----------



## pcguru000

What a confusing world we live in... one minute we're yelling for better hardware and less compression, and the next we're yelling that the compression is fine and the hardware requirements are too steep.


----------



## maarten12100

Quote:


> Originally Posted by *Marc79*
> 
> I knew it: 2-3GB of VRAM just became obsolete with these new "next-gen" ports.
> 
> It looks like the Maxwell refresh is the next upgrade for me.


Quote:


> Originally Posted by *Noufel*
> 
> The VRAM wars commence!!!
> So with my 290 CF I'll be stuck at High.


Are you guys seriously buying into this?


----------



## BusterOddo

This is strange. Should somebody tell them that you can't actually download moar VRAM? They can't possibly expect everyone to go out and buy 6GB+ cards. Another watchmestutterdogs incoming!


----------



## ZealotKi11er

The 6GB part is fine until you see "1080p," lol. So at 4K we'd need 20GB of VRAM? Also, don't hold your breath for the 8GB GTX 980; it won't utilize the memory, the same way the 680 never really used its 4GB.


----------



## GoldenTiger

Quote:


> Originally Posted by *sugarhell*
> 
> Emm? Go look at Rage's 16K textures vs. the default ones.


Rage used megatexture tech, which is not a per-object texturing system like 99.99999999999% of game titles use. That's not the same thing at all. When people refer to texture sizes, it is always in the context of per-object size; Rage's 16K chunk size for its texture atlases covered many objects.

Example of a texture atlas covering multiple objects, as Rage used on a larger scale:

(image)

Example of a single-object texture, as referred to when someone says a "4K" or "1024" texture:

(image)

The latter is what game developers mean when speaking of texture sizes, and of course what I meant when saying that 8K textures for single objects would be useless, since neither a 1080p nor a 4K monitor can display that resolution even with the single object taking up the entire screen.
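For scale, here is a quick back-of-the-envelope on per-object texture memory, assuming uncompressed RGBA8 at 4 bytes per pixel (real games use block compression, which cuts this by 4-8x):

```python
def texture_mib(size_px, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM footprint of one square texture in MiB.

    A full mip chain adds about one third on top of the base level
    (1 + 1/4 + 1/16 + ... = 4/3).
    """
    base = size_px * size_px * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 * 1024)

print(texture_mib(2048, mipmaps=False))  # 16.0  MiB for one 2K base level
print(texture_mib(8192, mipmaps=False))  # 256.0 MiB for one 8K base level
# With DXT1/BC1 block compression (0.5 bytes/pixel), that 8K base
# level drops from 256 MiB to 32 MiB.
```

This is why per-object 8K textures balloon VRAM use so quickly: a single uncompressed 8K texture costs as much memory as sixteen 2K ones.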


----------



## kiwiis

Are there any Maxwells even on the horizon with 6GB+ VRAM?


----------



## Mhill2029

Well, there are 8GB variants of the 900 series coming at some point, although I expect that to arrive on full-blown Maxwell GPUs. But this is ridiculous in the extreme, especially as they're saying this is at 1080p!


----------



## GoldenTiger

Quote:


> Originally Posted by *Mhill2029*
> 
> Well, there are 8GB variants of the 900 series coming at some point, although I expect that to arrive on full-blown Maxwell GPUs. But this is ridiculous in the extreme, especially as they're saying this is at 1080p!


Gibbo of Overclockers UK (the premier hardware vendor in Great Britain) has said we will see 8GB variants of the GTX 970/980 sometime in November.


----------



## kiwiis

True, it's probably excessive in this case, but it can't hurt to think down the line. I'm still using a GTX 680 with 2GB of VRAM, and I found out ages ago that 2GB was no longer sufficient. When I get a 900-series card, I'd like to be certain VRAM will never be an issue within the card's lifetime.


----------



## tpi2007

Quote:


> Originally Posted by *GoldenTiger*
> 
> Point taken, however I think they'll be just fine within the limits of their horsepower at least, anyway
> 
> 
> 
> 
> 
> 
> 
> . I'd be entirely unconcerned if I owned them still off of a leaked spec or two showing VRAM capacity requirements for max settings that no general consumer card on the market has (Titan which is a prosumer card, and a very limited numbers of gtx 780 special editions). I'd say it'll be another year and a half or more yet before 2GB is a true limiter as to what settings you can run with a 680-class card in SLI (be it SLI on a stick or not) that would otherwise run fine if not for the VRAM capacity.
> 
> I don't count mod packs, generally, at least since most of them (and all are fan-made) are horrendously poorly done on the optimization front, applying extreme texture resolutions and normal maps to items that will never take up more than 1/100th of the screen space (my post wasn't really a joke talking about 8K texture-mapped pebbles...). You can usually find a variant of them by the same author in most cases that takes up a fraction of the video memory capacity while looking objectively identical on-screen.
> 
> 8GB of RAM, 3gb taken by system functions with *5gb* available between system and video functions in the case of the PS4 for example. Considering most modern games ask for (at minimum) 2-2.5gb or so of system memory, that means most console games will probably use between 1.5-2gb of memory for graphics.
> 
> Wasting 4K textures on objects on the screen isn't a "WOW" moment when the user's monitor is 1080p and literally cannot display that resolution. It's just called poor optimization amongst us devs, same as the only other real method you could use to push to asking for 6GB for a 1080p-intended display (uncompressed textures and other absurdities).


Well, they are advertising the 6 GB as the Ultra High setting, so why not get an ultra high-end card to meet the requirements?

You can make the same argument people did back when Crysis required a lot of horsepower for Very High and didn't look that much better than High, which people complained about. After all, people with an 8800 GT could run it at High settings. Why expect the very best from mid-range? If people want to have their cake and eat it, then you might as well get a console, because there you only have one denominator.

As to the PS4's VRAM / RAM usable by games, I don't think Sony ever commented on the exact amount that games can access, except to clarify that part of it will be flexible memory that can take advantage of FreeBSD's virtual memory functionality; the rest seems like rumours. Besides, I don't doubt that Sony has been making improvements to the OS's memory usage to reduce its footprint, thus giving developers more to work with.

In any case, 2-2.5 GB for system memory for the game, leaves 3-2.5 GB for textures, not 1.5-2.

http://www.vg247.com/2013/07/26/ps4-has-up-to-5-5gb-of-ram-for-developers-4-5gb-guaranteed-1gb-of-flexible-memory/
Quote:


> "We would like to clear up a misunderstanding regarding our "direct" and "flexible" memory systems," the firm said to Eurogamer in a statement. "The article states that "flexible" memory is borrowed from the OS, and must be returned when requested - that's not actually the case.
> 
> "The actual true distinction is that:
> 
> "Direct Memory" is memory allocated under the traditional video game model, so the game controls all aspects of its allocation -
> 
> "Flexible Memory" is memory managed by the PS4 OS on the game's behalf, and allows games to use some very nice FreeBSD virtual memory functionality. However this memory is 100% the game's memory, and is never used by the OS, and as it is the game's memory it should be easy for every developer to use it.
> 
> *"We have no comment to make on the amount of memory reserved by the system or what it is used for."*


Notwithstanding that, always expect the PC versions to have higher-resolution textures, requiring more VRAM. Back when the PS3 launched with a total of 512 MB of RAM + VRAM (256+256), Nvidia was launching a GPU with 768 MB of VRAM, to which you have to add a PC with 1 GB of RAM.

Now, a year later, Crysis was released, making full use of that VRAM. We are now almost a year after the new consoles' release. And here comes the trend again.


----------



## Mhill2029

Quote:


> Originally Posted by *GoldenTiger*
> 
> Gibbo of Overclockers UK (the premiere hardware vendor for Great Britain) has said we will see 8GB variants of the GTX 970/980 sometime in mid-October through early November.


Well, I think that's a bit optimistic to be coming that soon. Although, wasn't Maxwell supposed to share VRAM on SLI systems?


----------



## GoldenTiger

Quote:


> Originally Posted by *kiwiis*
> 
> True that it's probably excessive in this case but it can't hurt to be thinking down the line. Still using a GTX680 with 2 gigs of vram and I found out ages ago that 2 gigs was no longer sufficient enough. When I get a 900 card, I'd like to be certain VRAM will never be an issue within the card's lifetime.


Wait a month, and get the 8GB versions then:
Quote:


> Originally Posted by *GoldenTiger*
> 
> Gibbo of Overclockers UK (the premiere hardware vendor for Great Britain) has said we will see 8GB variants of the GTX 970/980 sometime in November.


Source link: http://videocardz.com/52122/overclockersuk-reference-geforce-gtx-980970-4gb-8gb-models-also-planned (he edited the post out later on, before the cards launched, due to the NDA heat it resulted in, but the videocardz article quotes it).


----------



## waylo88

6GB is idiotic. Optimize your damn textures, please.


----------



## GoldenTiger

Quote:


> Originally Posted by *tpi2007*
> 
> Well, they are advertising the 6 GB as the Ultra High, so why not get an Ultra High-end card to meet the requirements ?
> 
> *As to the PS4's VRAM / RAM usable by games, I don't think Sony ever commented on the exact amount that games can access*
> 
> Now, a year later, Crysis was released, making full use of that VRAM. We are now almost a year after the new consoles' release. And here comes the trend again.


Because no consumer card other than specialty 780 models provides it? (Titan is a prosumer card; you could argue that does.) I don't disagree that higher VRAM capacities will be desirable and become required more and more as time goes on, but I think the time of actually needing it for high settings is not here just yet.

Oops... Guess I should edit that out then, not that it would do much good since it was quoted, I suppose.

Again, I'm not arguing the trend isn't coming, just that it isn't here yet.

EDIT: Whew, guess I'm safe anyway, it has been publicly stated: http://www.dualshockers.com/2014/03/11/naughty-dog-explains-ps4s-cpu-memory-and-more-in-detail-and-how-they-can-make-them-run-really-fast/ Naughty Dog studios dev regarding RAM availability: "Even in the PlayStation 4 you have 5 gigs, which seems like a lot but you'll be amazed by how quickly it fills up."


----------



## kiwiis

Quote:


> Originally Posted by *GoldenTiger*
> 
> Wait a month, and get the 8GB versions then


Planned on it. At least by then they'll have figured out how to flash modded (skyn3t) BIOSes. No rush just yet


----------



## waylo88

Quote:


> Originally Posted by *pcguru000*
> 
> What a confusing world we live in... one minute we're yelling for better hardware, less compression and the next we're yelling that the compression is okay and the hardware requirements are too steep.


People are saying the requirements are too steep because 6GB of VRAM required for ultra textures at 1080p is completely ******ed. Unless, as pointed out, they're going to just throw 8K uncompressed textures on every tiny pebble and chair in the game, which is still totally ******ed.

People want devs to actually push limits, not artificially inflate requirements.
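For a rough sense of why uncompressed high-resolution textures blow up VRAM budgets, here's a back-of-envelope calculator. This is a sketch: the per-pixel sizes are the standard RGBA8 and BC1/DXT1 figures, not anything taken from this game.

```python
# Rough VRAM cost of a single square texture, with and without block
# compression. Figures are illustrative, not from the game itself.

def texture_bytes(side, bytes_per_pixel, mipmapped=True):
    """Memory for one square texture; a full mip chain adds ~1/3."""
    base = side * side * bytes_per_pixel
    return base * 4 // 3 if mipmapped else base

MIB = 1024 * 1024
for side in (2048, 4096, 8192):
    uncompressed = texture_bytes(side, 4)   # RGBA8: 4 bytes per pixel
    bc1 = texture_bytes(side, 0.5)          # BC1/DXT1: 0.5 bytes per pixel
    print(f"{side}x{side}: RGBA8 ~{uncompressed / MIB:.0f} MiB, "
          f"BC1 ~{bc1 / MIB:.0f} MiB")
```

A single uncompressed 8K RGBA8 texture with mips works out to ~341 MiB on its own, while the same texture in BC1 is ~43 MiB, which is why "just compress your textures" keeps coming up in this thread.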


----------



## Zipperly

Here we go.... more demanding games are going to start coming out due to the new consoles, which is what everyone has been screaming for over the past 2 years, and now suddenly those same people are going to start crying "poor optimization" after finally getting their wishes granted. Lmbo.........


----------



## waylo88

Quote:


> Originally Posted by *Zipperly*
> 
> Here we go.... more demanding games are going to start coming out due to the new consoles which is what everyone has been screaming for over the past 2yrs and now suddenly those same people are going to start crying "poor optimization" after finally getting their wishes granted. Lmbo.........


You're honestly telling me 6GB of VRAM required for ultra textures at 1080p is acceptable? How does that not reek of poor optimization?


----------



## tpi2007

Quote:


> Originally Posted by *GoldenTiger*
> 
> Because no consumer card other than specialty 780 models provides it? (Titan is a prosumer card; you could argue that does.) I don't disagree that higher VRAM capacities will be desirable and become required more and more as time goes on, but I think the time of actually needing it for high settings is not here just yet.
> 
> Oops... Guess I should edit that out then, not that it would do much good since it was quoted, I suppose.
> 
> Again, I'm not arguing the trend isn't coming, just that it isn't here yet.
> 
> EDIT: Whew, guess I'm safe anyway, it has been publicly stated: http://www.dualshockers.com/2014/03/11/naughty-dog-explains-ps4s-cpu-memory-and-more-in-detail-and-how-they-can-make-them-run-really-fast/ Naughty Dog studios dev regarding RAM availability: "Even in the PlayStation 4 you have 5 gigs, which seems like a lot but you'll be amazed by how quickly it fills up."


It's less than half a victory. It still means, according to your own math (but not your conclusion) that they will likely use 3-2.5 GB for textures, and that always translates to more on the PC side.

Not to mention that the source you quote admits that it's easy to fill up rather quickly, so they have to optimize for things to fit.

Crytek also said that 8 GB on the consoles may not be enough:

http://www.neowin.net/news/crytek-developer-says-8gb-of-ram-in-current-consoles-may-not-be-enough
Quote:


> Sean Tracy, the Engine Business Development Manager at Crytek USA, states that "I would have to agree with the viewpoint that 8 gigs can easily be filled up" when asked if there would be a shortage of memory in this generation. He explained that "developers don't necessarily even have access to all 8 gigs of it.


Quote:


> Tracy said that "we will soon find that the computational requirements of games will quickly hit the ceiling of a few gigs of ram." He concluded by saying that "it's not the raw power alone that will allow for photo-realistic graphics, but technology that intelligently scales and utilizes all that the hardware has to offer."


----------



## lugal

"Hurr durr, consoles are slowing progress of pc gaming, they are evil and pc master race doesnt need close to metal optimizations anyway..."
- new consoles come out, devs start to use their hardware and increase requirements for pc ports accordingly -
"Omgz, you want me to buy top of the line graphics cards to be able to use highest possible graphics settings? Impossibru, wheres ze optimizations..."


----------



## Zipperly

Quote:


> Originally Posted by *lugal*
> 
> "Hurr durr, consoles are slowing progress of pc gaming, they are evil and pc master race doesnt need close to metal optimizations anyway..."
> - new consoles come out, devs start to use their hardware and increase requirements for pc ports accordingly -
> "Omgz, you want me to buy top of the line graphics cards to be able to use highest possible graphics settings? Impossibru, wheres ze optimizations..."


+1000!


----------



## paulerxx

Quote:


> Originally Posted by *Zipperly*
> 
> Here we go.... more demanding games are going to start coming out due to the new consoles which is what everyone has been screaming for over the past 2yrs and now suddenly those same people are going to start crying "poor optimization" after finally getting their wishes granted. Lmbo.........


See, you're confused... We wanted better-looking games that actually used our hardware to its potential, and efficiently threaded games. Not games that are just more "demanding" (i.e. unoptimized, in these current cases).


----------



## JoHnYBLaZe

I've never seen a benchmark where a 780 Ti chokes in a game because of VRAM, even when put up against the Titan Black.

This is at 1440p, mind you..... LoL at 1080p.

Some people here are ready to buy this hook, line and sinker. Well, to you I say this: these games had better blow Crysis 3 out of the atmosphere....

Or else I'm calling caching, and that garbage where a game reads your system and locks certain settings.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *paulerxx*
> 
> See you're confused..We wanted better looking games that actually used our hardware to it's potential and efficiently threaded games...Not more just "demanding". (unoptimized in these current cases)


Welcome to PC gaming.

Most day 1 releases are poorly optimized, some take months to fix, and some don't ever get a fix.


----------



## Zipperly

Quote:


> Originally Posted by *paulerxx*
> 
> See you're confused..We wanted better looking games that actually used our hardware to it's potential and efficiently threaded games...Not more just "demanding". (unoptimized in these current cases)


I'm not confused at all; better-looking games/textures require more VRAM... I don't know why this is such a shock to so many of you, as this has been the case for a very long time. Otherwise we would all still be fine with 512MB video cards.


----------



## dir_d

A 4gb 980 should do fine with its delta compression.


----------



## Zipperly

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> I've never seen a benchmark where a 780ti chokes a game because of VRAM, even when put up against the Titan black
> 
> This is at 1440p mind you.....LoL at 1080
> 
> Some people here are ready to buy this hook, line and sinker, well to you I say this: These games had better blow crysis 3 out of the atmosphere....
> 
> Or else I'm calling caching and that garbage where a game reads your system and locks certain settings


Fully agreed man.


----------



## GoldenTiger

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> I've never seen a benchmark where a 780ti chokes a game because of VRAM, even when put up against the Titan black
> 
> This is at 1440p mind you.....LoL at 1080
> 
> Some people here are ready to buy this hook, line and sinker, well to you I say this: These games had better blow crysis 3 out of the atmosphere....
> 
> Or else I'm calling caching and that garbage where a game reads your system and locks certain settings


This is correct, at least with current games. The trend is coming but it is folly to argue it is here.
Quote:


> Originally Posted by *tpi2007*
> 
> It's less than half a victory. It still means, according to your own math (but not your conclusion) that they will likely use 3-2.5 GB for textures, and that always translates to more on the PC side.
> 
> Not to mention that the source you quote admits that it's easy to fill up rather quickly, so they have to optimize for things to fit.
> 
> Crytek also said that 8 GB on the consoles may not be enough:
> 
> http://www.neowin.net/news/crytek-developer-says-8gb-of-ram-in-current-consoles-may-not-be-enough


Goofed the math, blame sleep. Same point anyway.

I know it can be filled quickly, but optimization (as an umbrella term) takes dev time while also allowing us to do much more with existing hardware. It's simply an ROI issue still. The days of truly needing target systems with 6GB of VRAM are simply not here yet, and virtually no one has them, so they aren't being targeted yet. Generally speaking, I hear talk of 2-2.5GB as the general 1080p target for upcoming (1-year release ETA or so) games still. You're calling 4-6GB being useful for maxing settings in games a year or two early still on high-end gaming systems. 1440p and 4K users simply won't need more than 4GB for a few years yet, except when running extreme AA. Your argument is simply too early, not incorrect.


----------



## Swolern

Well, I guess I'm keeping my 1 1/2-year-old Titan.

Wonder when the 8GB 980s will release.


----------



## GoldenTiger

Quote:


> Originally Posted by *Swolern*
> 
> Well, I guess I'm keeping my 1 1/2-year-old Titan.
> 
> Wonder when the 8GB 980s will release.


Bus width and VRAM capacity are the modern GHz myth.


----------



## iARDAs

What about 4K Ultra?

10 GB of Vram?


----------



## Aparition

Quote:


> Originally Posted by *iARDAs*
> 
> What about 4K Ultra?
> 
> 10 GB of Vram?


I was just thinking that maybe they said 6GB at 1080p because they were including uncompressed textures running at 6x SSAA.


----------



## Deadboy90

I better be able to see ants crawling on the ground for 6GB of Vram.


----------



## seabiscuit68

Can someone remind me whether Crossfire / SLI is additive for VRAM? If you are using two cards with 3GB each, will you have 6GB available for use, or will the game still only use 3GB?


----------



## waylo88

Quote:


> Originally Posted by *iARDAs*
> 
> What about 4K Ultra?
> 
> 10 GB of Vram?


That's probably a conservative estimate. 12GB is probably more like it.

But at least they're pushing limits, right guys?

Ugh.


----------



## Zipperly

Quote:


> Originally Posted by *seabiscuit68*
> 
> Can someone remind me if Crossfire / SLI is additive for RAM. If you are using two cards with 3GB of RAM, will you have 6GB of RAM available for use, or will the game still only use 3GB?


The RAM does not stack; each card mirrors the same data, so you still have 3GB usable.


----------



## zealord

Nothing to worry about; maybe it's a mistake. Not that there will be any visible difference between High and Ultra anyway. I don't see that game being more graphically impressive than Crysis 3.


----------



## Sideways8LV

Is it not worth looking at this from a different perspective?

What if developers start adding in these Ultra/Ultimate settings designed for the high-end enthusiast crowd, to let them really unleash the power of their rigs?

Yes, not everyone is running a 6GB card, fewer still are running SLI/Crossfire with 6GB cards, and yes, few people are gaming at 4K. Yes, it sucks to have a game that you can't max out on your very capable, average gaming rig. All valid points.

However, if I were a game dev, it would be awesome to have this Ultra HD, ultimate quality setting for people who have the bank to utilize it, to enjoy and most likely show off the best possible depiction of all the developers' hard work. It must be very disheartening to create a title and then have to downgrade it to run smoothly on the hardware the majority own.

I say keep the ultimate setting, and have it in more games. Let the high-end gamers enjoy it and give us regular ones something to save up for/drool over/be jealous of. Give it a year or two and we'll have upgraded enough to run the ultra setting anyway, and we'll be moaning about a newer game demanding 12GB for 8K ultra textures.

I'm just waffling because I'm at work but I see no problem with having crazy high system specifications, providing the game is still accessible and playable at a smooth rate for lower end machines too.


----------



## RagingCain

Quote:


> Originally Posted by *Sideways8LV*
> 
> Is it not worth looking at this from a different perspective?
> 
> What if developers start adding in these Ultra/Ultimate settings designed for the high-end enthusiast crowd, to let them really unleash the power of their rigs?
> 
> Yes, not everyone is running a 6Gb card, fewer running SLI/Crossfire with 6Gb cards, Yes, few people are gaming at 4K. Yes, it sucks to have a game that you can't max out on your very capable, average gaming rig. All valid points.
> 
> However, if I was a game dev, it would be awesome to have this Ultra HD, Ultimate quality setting for people who have the bank to utilize it to enjoy and most likely show off the best possible depiction of all the developers hard work. It must be very disheartening to create a title and then have to downgrade it to run smooth on hardware the majority own.
> 
> I say keep the ultimate setting, have it in more games. Let the high-end gamers enjoy it and give us regular ones something to save up for/drool over/be jealous of. Give it a year or two and we'll have upgraded enough to run the Ultra setting and we'll be moaning about a newer game demanding 12Gb for 8K Ultra Textures.
> 
> I'm just waffling because I'm at work but I see no problem with having crazy high system specifications, providing the game is still accessible and playable at a smooth rate for lower end machines too.


Because the image quality is crap for the amount of resources used; it isn't making the images prettier, it's just storage.

They are designed around, at best, using 6GB of VRAM with a weak Radeon HD 7850/7870-class GPU, which couldn't hope to feed 6GB of VRAM for rendering.


----------



## Sideways8LV

Do we know that for sure? I haven't seen any gameplay on that setting.


----------



## RagingCain

Quote:


> Originally Posted by *Sideways8LV*
> 
> Do we know that for sure? I haven't seen any gameplay on that setting.


Don't need to.

Actual video buffer required:
1920x1080, 32-bit color (4 bytes), 16-bit Z buffer + 8-bit stencil (4 bytes total), double buffered, 4xAA:
265,420,800 bytes, or ~265 MB. Let's double that figure though, to make the post-processing effects extra purdy: ~530 MB of VRAM.

The rest is texture storage, caching, buffering: basically whatever the developer designs to do with it. If they aren't lazy, they properly utilize RAM, the pagefile (if the I/O is up to snuff), and then regular disk usage.

The only explanation is not creating a NUMA memory structure (i.e. a PC memory structure): programming for a console and not doing a proper port.

Let's say 2GB of VRAM (4GB double buffered) was the video buffer, the VIDEO BUFFER, for a 16:9 aspect ratio.
A 128-megapixel resolution at 16:9 is ~15,300x8,500.

----------



## Sideways8LV

Any other way to explain why it would state 6GB? Like that setting runs the game at a downsampled 4K resolution? Built-in DSR, as it were.

I don't know enough about game design to do anything other than spout uneducated gibberish, but I'm trying to understand it more.

Is this common with console games being ported?


----------



## Ferreal

Lol 6gb for ultra on 1080p. What about 1440p?

Titan Z sounds good right now if that's true.


----------



## Crouch

So my 970 that's coming soon won't be able to max this game out at 1080p


----------



## Sideways8LV

It says the ultra setting requires the Ultra HD texture pack to be installed, and Ultra HD is 4K, no? Not everyone has a rig to run 4K textures. It doesn't say the ultra setting is part of the original game, but with an additional pack it can be utilized.

But assuming RagingCain is correct, it does nothing for looks. I don't know what to think.


----------



## Ferreal

Quote:


> Originally Posted by *lugal*
> 
> "Hurr durr, consoles are slowing progress of pc gaming, they are evil and pc master race doesnt need close to metal optimizations anyway..."
> - new consoles come out, devs start to use their hardware and increase requirements for pc ports accordingly -
> "Omgz, you want me to buy top of the line graphics cards to be able to use highest possible graphics settings? Impossibru, wheres ze optimizations..."


LOL nice


----------



## RagingCain

Quote:


> Originally Posted by *Sideways8LV*
> 
> It says if the UltraHD texture pack is installed, UltraHD is 4K no? Not everyone has a rig to run 4K textures. Doesn't say the ultra setting is part of original game, but with an additional pack it can be utilized.
> 
> But assuming RagingCain is correct, it does nothing for looks. I don't know what to think.


Well, we can do 4K now if they give us that content; Ultra HD is not an official moniker.

It's obvious they have 6GB of VRAM working space on the consoles, so now we have to as well.


----------



## Chargeit

Yea, I wouldn't be too worried about this personally. Seems like they're just throwing the 6gb crowd a bone.

Isn't this a console port?


----------



## paulerxx

Quote:


> Originally Posted by *Zipperly*
> 
> Im not confused at all, better looking games/textures requires more vram... I dont know why this is such a shock to so many of you as this has been the case for a very long time otherwise we would all still be fine with 512mb video cards.


Except these games aren't graphically superior to games that only need 2-3GB of VRAM.....

http://i.ytimg.com/vi/fpRXiyIvvX4/maxresdefault.jpg < looks like crap
http://static5.gamespot.com/uploads/original/mig/7/2/0/5/2037205-711438_20130424_003.jpg < crap

vs

http://cdn2.gamefront.com/wp-content/uploads/2013/10/Crysis-3-4K.jpg
http://i.ytimg.com/vi/X5Y1zPIhFj8/maxresdefault.jpg
http://games.kitguru.net/wp-content/uploads/2012/03/3AlanWake-2012-03-01-14-27-24-28.jpg
http://assets.vg247.com/current//2014/06/Witcher_3_Wild_Hunt_e3_2014-27.jpg


----------



## RagingCain

Quote:


> Originally Posted by *paulerxx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Zipperly*
> 
> Im not confused at all, better looking games/textures requires more vram... I dont know why this is such a shock to so many of you as this has been the case for a very long time otherwise we would all still be fine with 512mb video cards.
> 
> 
> 
> Except these games aren't graphically superior to games that only need 2/3GBs of ram.....
> 
> http://i.ytimg.com/vi/fpRXiyIvvX4/maxresdefault.jpg < looks like crap
> http://static5.gamespot.com/uploads/original/mig/7/2/0/5/2037205-711438_20130424_003.jpg < crap

It looks like crap in the context of 6GB of VRAM being used.


----------



## Sideways8LV

Quote:


> Originally Posted by *RagingCain*
> 
> ...Ultra HD is not an official moniker.


Yeah that term gets thrown around a lot unfortunately.


----------



## Zipperly

Quote:


> Originally Posted by *paulerxx*
> 
> Except these games aren't graphically superior to games that only need 2/3GBs of ram.....
> 
> http://i.ytimg.com/vi/fpRXiyIvvX4/maxresdefault.jpg < looks like crap
> http://static5.gamespot.com/uploads/original/mig/7/2/0/5/2037205-711438_20130424_003.jpg < crap
> 
> vs
> 
> http://cdn2.gamefront.com/wp-content/uploads/2013/10/Crysis-3-4K.jpg
> http://i.ytimg.com/vi/X5Y1zPIhFj8/maxresdefault.jpg
> http://games.kitguru.net/wp-content/uploads/2012/03/3AlanWake-2012-03-01-14-27-24-28.jpg
> http://assets.vg247.com/current//2014/06/Witcher_3_Wild_Hunt_e3_2014-27.jpg


Not gonna argue with you about it; there is more to this than just flashy graphics. BTW, nice comparison there with the pre-alpha build. LMBO.

And Alan Wake? Ahahaha... the textures in that game are horrible.


----------



## tpi2007

Quote:


> Originally Posted by *GoldenTiger*
> 
> Goofed the math, blame sleep. Same point anyway.
> 
> I know it can be filled quickly, but optimization (as an umbrella term) takes dev time while also allowing us to do much more with existing hardware. It's simply an ROI issue still. The days of truly needing target systems with 6GB of VRAM are simply not here yet, and virtually no one has them, so they aren't being targeted yet. Generally speaking, I hear talk of 2-2.5GB as the general 1080p target for upcoming (1-year release ETA or so) games still. You're calling 4-6GB being useful for maxing settings in games a year or two early still on high-end gaming systems. 1440p and 4K users simply won't need more than 4GB for a few years yet, except when running extreme AA. Your argument is simply too early, not incorrect.


I hope you're right, but in the news reel there are already two games requiring 4 and 6 GB of VRAM at the highest settings.

As I said, we can make the case that if you want "Ultra High" settings you need an Ultra High-end card with lots of VRAM, with "High" settings being attainable with 3 GB VRAM cards, and that is achieved with lots of cards - HD 7950 / R9 280 / HD 7970 / HD 7970 Ghz Edition, GTX 770 4 GB and now the GTX 970 4 GB, which are all priced below the high-end segment.

You can make the case that Ultra High will not make much difference and thus many mid-range cards will be able to handle the game at high settings. So, if you want the best, you have to buy the best cards.

It happened in 2007 (when Crysis came out, around one year after the PS3), that is all I'm saying.


----------



## TopicClocker

What IGN had to say: http://uk.ign.com/articles/2014/09/26/middle-earth-shadow-of-mordor-review
Quote:


> *On the PC side Mordor also compares to the Batman games, in that it's of good quality. There are even some enhanced graphics settings, including an ultra-high texture setting that requires a full 6GB of video memory.* My only issue with it is some awkward menu controls, but most of those are customizable and those that aren't aren't too inconvenient to get used to.


These textures are likely PC-specific enhancements, hence the reason they are a part of an optional "Ultra HD Texture Pack"; by no means is this a requirement.

This could be seen as a treat for PC gamers with capable hardware and enough memory to run these textures; the High textures could be what the next-gen consoles are running.

The problem is that hardly any GPUs of today except Titans or custom cards have 6GB.


----------



## Sideways8LV

Looking forward to side by side comparisons as well as benchmarks with different spec'd rigs.


----------



## paulerxx

Quote:


> Originally Posted by *Zipperly*
> 
> Not gonna argue with you about it, there is more too this than just flashy graphics. BTW nice comparison there with the Pre-Alpha build. LMBO.
> 
> And Alan Wake? ahhahaha... the textures in that game are horrible.


And yet the textures STILL look better than games that need 4GB and 6GB of VRAM....

That game maxed out on my HD 5770 1GB back in the day. Both The Evil Within and Shadow of Mordor are unimpressive looking; even if these two games only required 2GB of VRAM, I wouldn't be impressed.


----------



## RagingCain

If my math was a little confusing, this is a basic example of how Memory Management is supposed to work in PC games.
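The image from that post doesn't survive here, but the tiered idea being described (a fixed VRAM budget for resident textures, evicting the least-recently-used ones and streaming replacements in from system RAM or disk) can be sketched in a few lines. All names and sizes here are hypothetical, not from any real engine:

```python
# Minimal sketch of tiered texture streaming: keep hot textures inside
# a fixed VRAM budget, evict the least-recently-used on overflow, and
# stream from the slower tier (RAM/disk) on a miss.
from collections import OrderedDict

class TextureStreamer:
    def __init__(self, vram_budget):
        self.vram_budget = vram_budget
        self.used = 0
        self.vram = OrderedDict()   # texture id -> size, in LRU order

    def request(self, tex_id, size, load_from_disk):
        if tex_id in self.vram:                 # hit: mark most recently used
            self.vram.move_to_end(tex_id)
            return
        while self.used + size > self.vram_budget and self.vram:
            _, evicted_size = self.vram.popitem(last=False)  # evict LRU
            self.used -= evicted_size
        load_from_disk(tex_id)                  # miss: stream from slower tier
        self.vram[tex_id] = size
        self.used += size
```

A real engine streams individual mip levels asynchronously and prefetches ahead of the camera, but the budget/evict/reload loop is the core of this kind of memory management.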


----------



## Zipperly

Quote:


> Originally Posted by *paulerxx*
> 
> and yet the textures STILL look better than games that need 4GBs and 6GBs of VRAM....


Uh, no they do not. I have Alan Wake on my rig and I'm well aware of how the textures look in that game. They are low-res, and it's very obvious that it was a 360 game at one point.


----------



## Sideways8LV

Quote:


> Originally Posted by *RagingCain*
> 
> If my math was a little confusing, this is a basic example of how Memory Management is supposed to work in PC games.


Thanks, that's very helpful. I find I learn best when pictures and primary colors are involved.

Is this pretty much standard for all engines? I have noticed an increase in games where low-res textures load first and then high-res textures fill in over the top, sometimes quickly, sometimes not so quickly, to produce the final look. This is typically seen in games like Rage or Borderlands 2, to name a couple.

Are they still all cached and pre-cached the same as in that diagram? Would it not be possible to have yet another layer of even higher-res textures cached somewhere in that 6GB of memory to load on top again, thus making it visually more detailed and gorgeous?

Again, my thoughts are based on my elementary knowledge of the topic, so it might not make much sense to anyone who does know more, so feel free to correct me where I'm way off with my thinking.


----------



## th3illusiveman

Poor coding. If a game like Metro: Last Light can run well with ultra textures, this should too.


----------



## GoldenTiger

Quote:


> Originally Posted by *tpi2007*
> 
> I hope you're right, but in the news reel there are two games already requiring 4 and 6 GB of VRAM at the highest settings.
> 
> As I said, we can make the case that if you want "Ultra High" settings you need an Ultra High-end card with lots of VRAM, with "High" settings being attainable with 3 GB VRAM cards, and that is achieved with lots of cards - HD 7950 / R9 280 / HD 7970 / HD 7970 Ghz Edition, GTX 770 4 GB and now the GTX 970 4 GB, which are all priced below the high-end segment.
> 
> You can make the case that Ultra High will not make much difference and thus many mid-range cards will be able to handle the game at high settings. So, if you want the best, you have to buy the best cards.
> 
> It happened in 2007 (when Crysis came out, around one year after the PS3), that is all I'm saying.


Yes, I actually advocated for and heavily backed Crytek at the Crysis 1 release, stating it wasn't unoptimized but was simply designed around future systems. However, while Crysis backed itself up with amazing graphics quality, these games do not. If anything, citing Crysis 1 backs my argument that these games are not well optimized, at least viewed from the outside. In 2007 Crysis had best-in-class graphics by far, yet it still looked good and ran well at lower-than-max settings compared to contemporary titles of the time.

Today these two games show no visual improvement for drastically higher requirements compared to modern titles. Using them as an example just shows that they are by far the exception and not well optimized, not that a broad trend is imminent. Trust me, if they showed a need for the quoted figures I would be storming the gates insisting they aren't poorly done. The fact of the matter is, they simply don't. This just shows two badly done graphics jobs, not an industry standard.

Google "GoldenTiger Crysis unoptimized hardforum" if you want to see for yourself.


----------



## paulerxx

Quote:


> Originally Posted by *Zipperly*
> 
> Uh, no they do not. I have Alan Wake on my rig and I'm well aware of how the textures look in that game. They are low-res, and it's very obvious that it was a 360 game at one point.


The Evil Within looks pretty bad, bro. It looks like a Wii U game or something.

http://static1.gamespot.com/uploads/original/1365/13658182/2443259-the+evil+within+screenshot_1383569070.png
http://www.nerdist.com/wp-content/uploads/2014/02/EvilWithinScary.jpg
http://static1.gamespot.com/uploads/original/280/2802776/2578939-tew1.jpg
http://static1.gamespot.com/uploads/original/1365/13658182/2443257-the+evil+within+screenshot+%283%29_1383569101.png


----------



## GoldenTiger

http://hardforum.com/showpost.php?p=1031581192&postcount=54

October 2007, I argued Crysis was well optimized and deserved its high requirements. These two games? Nope. They are just poorly done, not indicative of high requirements in the here and now.


----------



## Sideways8LV

Those screens look kinda Fallout 3-ish. I wouldn't go as far as to say Wii U; that's kinda insulting, lol. But it's not next-gen groundbreaking graphics either.


----------



## Ferreal

Those screens look pretty good to me.


----------



## GoldenTiger

Quote:


> Originally Posted by *Sideways8LV*
> 
> Those screens look kinda Fallout 3ish graphics. I wouldn't go as far as to say WiiU. Kinda insulting lol. But not next-gen groundbreaking graphics either.


Definitely not the Crysis of 2014, as some are trying to portray. At least from what I'm seeing so far... we'll see once it's out and we can check for ourselves on our own systems.


----------



## Sideways8LV

I must have missed those claims, haven't been paying that game too much attention. I watched the early gameplay and it kinda put me off. I have a thing about being chased lol.


----------



## Zipperly

Quote:


> Originally Posted by *paulerxx*
> 
> Evil Within looks pretty bad bro. It looks like a Wii U game or something.
> 
> http://static1.gamespot.com/uploads/original/1365/13658182/2443259-the+evil+within+screenshot_1383569070.png
> http://www.nerdist.com/wp-content/uploads/2014/02/EvilWithinScary.jpg
> http://static1.gamespot.com/uploads/original/280/2802776/2578939-tew1.jpg
> http://static1.gamespot.com/uploads/original/1365/13658182/2443257-the+evil+within+screenshot+%283%29_1383569101.png


I'm honestly not sure what you are smoking, but it must be good stuff. I just watched some actual early gameplay videos and it looked really good in motion, and there is nothing wrong with those screens either.


----------



## Zipperly

Quote:


> Originally Posted by *Ferreal*
> 
> Those screens look pretty good to me.


+1, and the game in motion looks a whole lot better; there's a lot going on in the scenes that still shots don't capture or do justice to.


----------



## Sideways8LV

Screenshots don't do games justice anyway. They are either super polished and not representative, or don't depict how great the game feels in motion. A still image can display a lot of flaws a moving game won't exhibit.


----------



## Sideways8LV

Gonna say 'Jinx' on that one Zip


----------



## Zipperly

Quote:


> Originally Posted by *Sideways8LV*
> 
> Screenshots don't do games justice anyway. They are either super polished and not representative, or don't depict how great the game feels in motion. A still image can display a lot of flaws a moving game won't exhibit.


+1 Rep.


----------



## Ferreal

I can't wait to play this game. Graphics look amazing even with the videos I saw on PS4.


----------



## Zipperly

Quote:


> Originally Posted by *Ferreal*
> 
> I can't wait to play this game. Graphics look amazing even with the videos I saw on PS4.


Yup, even the PS4 footage looks great. The shadows in some parts were really impressive as well.


----------



## paulerxx

Quote:


> Originally Posted by *Sideways8LV*
> 
> Screenshots don't do games justice anyway. They are either super polished and not representative, or don't depict how great the game feels in motion. A still image can display a lot of flaws a moving game won't exhibit.


Okay, this is true... but The Evil Within doesn't look better in motion. You can tell from the screenshot comparison earlier that this game shouldn't use more than 2 GB of VRAM. You complained about Alan Wake... a game from what, 2011? The fact that it's even comparable says a lot.
The animations look awkward and unrealistic, and the grain effect is a good way of saying "this game looks bad, so we need a cheesy effect to make up for it!" Don't get me wrong, the actual GAMEPLAY looks good. The game is going to be fun, but graphically it's nothing to defend when someone says it looks crappy. It doesn't look next-generation; it's no Crysis, to say the least.


----------



## Sideways8LV

First of all, I never mentioned Alan Wake; that was someone else.

Secondly, I wasn't defending the graphics so much as saying it was harsh to compare it to a Wii U game.

No, it's not next-gen looking, which I stated. I actually compared it to a title from 2008; in looks anyway, not in gameplay. Still looks good though.


----------



## paulerxx

Gameplay-wise, I think Shadow of Mordor and The Evil Within are going to do great. But requiring more than 3 GB of VRAM shows the developers didn't put much effort into optimizing the PC version.
I will probably buy and play both games; it's just upsetting that I'm not getting the visual quality I'm used to.


----------



## JSTe

Grain of salt. Teeny tiny.

Look at the textures on the character. 2 GB or more for those 1024x1024 textures from 2007?

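To put that skepticism in numbers: an uncompressed 1024x1024 RGBA8 texture is only 4 MiB, and a full mip chain adds roughly a third on top, so even 2 GB of VRAM holds hundreds of them. A quick back-of-the-envelope check (these figures assume uncompressed textures; real games use block compression like BC1/BC3, which shrinks them by roughly 4-8x, so this is the worst case):

```python
# Back-of-the-envelope VRAM math for uncompressed RGBA8 textures.
# Real engines use block compression (BCn/DXT), so actual footprints
# are several times smaller than these worst-case numbers.

def texture_bytes(side, bpp=4, with_mips=True):
    """Bytes for a square side x side texture; a full mip chain adds ~1/3."""
    if not with_mips:
        return side * side * bpp
    # Sum every mip level: side^2 + (side/2)^2 + ... + 1, times bytes per pixel.
    total, s = 0, side
    while s >= 1:
        total += s * s * bpp
        s //= 2
    return total

one_mib = 1024 * 1024
print(texture_bytes(1024, with_mips=False) / one_mib)  # 4.0 MiB base level
print(texture_bytes(1024) / one_mib)                   # ~5.33 MiB with full mip chain
print((2 * 1024 * one_mib) // texture_bytes(1024))     # -> 384 such textures fit in 2 GB
```

So a 6 GB requirement implies either truly enormous texture assets, lots of uncompressed render targets, or simply generous caching, which is part of why people in this thread suspect the figure is padded.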

----------



## Zipperly

Quote:


> Originally Posted by *paulerxx*
> 
> Okay this is true...But Evil Within doesn't look better in motion. You can just tell by the screenshot comparison from before that this game shouldn't use more than 2GBs of vram. You complained about Alan Wake...A game from what, 2011? The fact that it's even comparable says a lot.
> The animations look awkward and unrealistic, and the grain effect is a good way of saying "this game looks bad so we need a cheesy effect to make up for it!" Don't get me wrong, the actual GAMEPLAY looks good. The game is going to be fun, but graphically it's nothing to defend when someone says it looks crappy. It doesn't look next generation, it's no Crysis to say the least.


First of all, any game looks better in motion than in still shots, period. Second, it was you who decided to bring up Alan Wake as a comparison to The Evil Within, which is completely and utterly laughable; it is in no way comparable, especially from a texture standpoint, since Alan Wake uses extremely low-res textures.


----------



## Sideways8LV

Has anyone actually got a link to *PC* gameplay footage? All I can see is console.


----------



## Frankzro

Shenanigans.


----------



## Zero4549

As silly as this might be, it is just as silly that modern GPUs have so little VRAM. I remember 6 GB cards almost 5 years ago! Sure, they weren't reference cards, but is that really the point?

Sure, no games until now really needed so much VRAM but is that really a reason to skimp on a few dirt cheap chips for several generations in a row?

Besides, how many games now COULD have looked better, if higher VRAM cards were common, but had to be released with ugly highly compressed textures instead? I imagine quite a few.

Ultimately, someone needs to be willing to look a bit silly and push out of the norm for other things to ever progress. Just look at the original Crysis, and the era of great-looking games and amazingly powerful gaming hardware made to meet that demand, thanks to C1 raising the bar on what was considered "good", even if many people couldn't really appreciate the game itself at the time.


----------



## TopicClocker

Quote:


> Originally Posted by *paulerxx*
> 
> Game play wise...I think Shadow Of Mordor and Evil Within are going to do great. But requiring more than 3GBs of ram shows the developers didn't put too much effort into optimizing the PC version.
> I will probably play and buy both games, it's just upsetting I'm not getting the visual quality I'm use to.


Shadow of Mordor.

IGN: http://uk.ign.com/articles/2014/09/26/middle-earth-shadow-of-mordor-review
Quote:


> *On the PC side Mordor also compares to the Batman games, in that it's of good quality. There are even some enhanced graphics settings, including an ultra-high texture setting that requires a full 6GB of video memory.* My only issue with it is some awkward menu controls, but most of those are customizable and those that aren't aren't too inconvenient to get used to.


These textures are likely PC-specific enhancements, hence why they are part of an optional "Ultra HD Texture Pack", so by no means are they a requirement.

From what it sounds like, the Ultra textures are a bonus to PC Gamers.

People are complaining about something that could benefit PC gamers? Isn't this what was wanted?
It wasn't too long ago that PC gamers were complaining left, right, and center about consoles holding PCs back; if developers are now offering PC gamers enhancements and textures that are a step up over the next-gen versions, a lot of people are going to look silly.

Ultimately it comes down to how it looks and performs before anyone starts calling it poorly optimized. We don't know how it performs, and we don't know how the PC version looks compared to the next-gen consoles; the optional "Ultra HD Texture Pack" surely is not a requirement to run the game and could well be an enhanced feature for PCs.

If these textures are PC-exclusive and turn out to look better than the High textures and the next-gen console versions, that's a bonus, not a problem.


----------



## GoldenTiger

Quote:


> Originally Posted by *Zero4549*
> 
> As silly as this might be, it is just as silly that modern GPUs have so little VRAM.
> Sure, no games until now really needed so much VRAM but is that really a reason to skimp on a few dirt cheap chips for several generations in a row?
> 
> Besides, how many games now COULD have looked better, if higher VRAM cards were common, but had to be released with ugly highly compressed textures instead? I imagine quite a few.
> 
> Ultimately, someone needs to be willing to look a bit silly and push out of the norm for other things to ever progress. Just look at the original Crysis, and the era of great-looking games and amazingly powerful gaming hardware made to meet that demand, thanks to C1 raising the bar on what was considered "good", even if many people couldn't really appreciate the game itself at the time.


*cough*
Post made in October 2007 regarding Crysis 1: http://hardforum.com/showpost.php?p=1031581192&postcount=54 , and I have brought up the same sort of point you made here over the years. *Needless to say, I agree... it's a chicken-and-egg scenario, and ultimately one of the sides needs to cave and go first.*

*That said, there are very good reasons neither card manufacturers or game devs have done so yet: The problem has been, historically, that...*

-If you're a video card manufacturer, adding extra expense and design complexity to your cards in a market that is highly competitive on price and product segmentation (of which VRAM capacity can be a part) is harmful to your bottom line. Additionally, fully pushing the limits gives you less room for product spacing with future releases.

while...

-If you're a game dev, making games with settings beyond most current hardware capabilities has historically resulted in terrible press and PR about your title in the gaming media, while providing questionable benefit in the immediate term and requiring extra resources to create the additional technical complexity and graphical fidelity that would push the manufacturers to beef up their cards further.

Unfortunately there are definite disadvantages for both sides to be the guy that goes first, even though I think anyone would agree that in the long game it would benefit the industry on both sides of that fence.


----------



## Teh Bottleneck

Quote:


> Originally Posted by *GoldenTiger*
> 
> Yes, I actually advocated for and heavily backed Crytek at the Crysis 1 release, stating it wasn't unoptimized but was simply designed around future systems. However, while Crysis backed itself up with amazing graphics quality, these games do not. If anything, citing Crysis 1 backs my argument that these games are not well optimized, at least viewed from the outside. In 2007 Crysis had best-in-class graphics by far, yet it still looked good and ran well at lower-than-max settings compared to contemporary titles of the time.
> 
> Today these two games show no visual improvement for drastically higher requirements compared to modern titles. Using them as an example just shows that they are by far the exception and not well optimized, not that a broad trend is imminent. Trust me, if they showed a need for the quoted figures I would be storming the gates insisting they aren't poorly done. The fact of the matter is, they simply don't. This just shows two badly done graphics jobs, not an industry standard.
> 
> Google "GoldenTiger Crysis unoptimized hardforum" if you want to see for yourself.


Great post; these would be my exact thoughts on the topic as well. Comparing the Crysis situation from 2007 to this is laughable. The only thing I'd add: both of these games still aren't released to the public, so there's a very good chance we're only dealing with PR. Remember the CoD: Ghosts 6 GB of RAM debacle?


----------



## Leopard2lx

I guess we'll have to wait and see the difference between High and Ultra textures, because to me the real question is how much better the Ultra textures look. If they make the game look OUTSTANDING, then more power to those who have the VRAM to run them... the problem is that if they make little difference, then you know there is something wrong.
Watch Dogs looked only slightly better texture-wise on Ultra vs. High and it ran like crap; then again, it still ran kinda crappy even on High, even after the so-called patch released to fix it. Hopefully this isn't the same garbage job that Ubisoft pulled.


----------



## GoldenTiger

Quote:


> Originally Posted by *Leopard2lx*
> 
> I guess we'll have to wait and see the difference between High and Ultra textures, because to me the real question is how much better the Ultra textures look. If they make the game look OUTSTANDING, then more power to those who have the VRAM to run them... the problem is that if they make little difference, then you know there is something wrong.
> Watch Dogs looked only slightly better texture-wise on Ultra vs. High and it ran like crap; then again, it still ran kinda crappy even on High, even after the so-called patch released to fix it. Hopefully this isn't the same garbage job that Ubisoft pulled.


QFT.
Quote:


> Originally Posted by *Teh Bottleneck*
> 
> Great post, these would be my exact thoughts on the topic as well. Comparing the Crysis situation back from 2007 to this is laughable. Only thing I'd like to add, both of these games still aren't released to the public, so there's still a very good chance we're only dealing with PR. Remember CoD Ghosts 6 GBs of RAM debacle?


Thanks! That's my assumption as well: this isn't really a genuine recommendation so much as a PR stunt. But it does make good headlines and draws attention to their games... exactly as planned, if I were to hazard a guess, though I won't state that as fact since the games aren't out yet and I have no inside look at their thought process.


----------



## Teh Bottleneck

Quote:


> Originally Posted by *Sideways8LV*
> 
> Has anyone actually got a link to *PC* gameplay footage? All I can see is console.


Here's a good one, claims to use the Ultra settings. Put it at 1440p, and enjoy:
http://www.youtube.com/watch?v=DrzSBpkxbqw
The game looks pretty good, sure, but there's nothing in there which screams "we need 6 GB to pull this off!"


----------



## iSlayer

Quote:


> Originally Posted by *GoldenTiger*
> 
> Yes, I actually advocated for and heavily backed Crytek at the Crysis 1 release, stating it wasn't unoptimized but was simply designed around future systems. However, while Crysis backed itself up with amazing graphics quality, these games do not. If anything, citing Crysis 1 backs my argument that these games are not well optimized, at least viewed from the outside. In 2007 Crysis had best-in-class graphics by far, yet it still looked good and ran well at lower-than-max settings compared to contemporary titles of the time.
> 
> Today these two games show no visual improvement for drastically higher requirements compared to modern titles. Using them as an example just shows that they are by far the exception and not well optimized, not that a broad trend is imminent. Trust me, if they showed a need for the quoted figures I would be storming the gates insisting they aren't poorly done. The fact of the matter is, they simply don't. This just shows two badly done graphics jobs, not an industry standard.
> 
> Google "GoldenTiger Crysis unoptimized hardforum" if you want to see for yourself.


Crysis, relative to now, wasn't very well optimized at the ultra high end. Back then? It was.

That being said, yeah, its lower, more realistic settings were definitely optimized. People just needed to stop thinking their Voodoos running DX7 could max Crysis at 60 fps.


----------



## Sideways8LV

Thanks for that. Sadly, what I'm sitting at can barely play 1080p content without stuttering, heh, so I wouldn't see the benefits. I was just interested to see if any was out so people could make comparisons.


----------



## yunshin

Just another game that sounds like a steaming pile of unoptimized crap.


----------



## Zipperly

Quote:


> Originally Posted by *Leopard2lx*
> 
> I guess we'll have to wait and see the difference between High and Ultra textures, because to me the real question is how much better the Ultra textures look. If they make the game look OUTSTANDING, then more power to those who have the VRAM to run them... the problem is that if they make little difference, then you know there is something wrong.
> Watch Dogs looked only slightly better texture-wise on Ultra vs. High and it ran like crap; then again, it still ran kinda crappy even on High, even after the so-called patch released to fix it. Hopefully this isn't the same garbage job that Ubisoft pulled.


I disagree there; the difference from High to Ultra was quite noticeable if you know where to look.

http://international.download.nvidia.com/geforce-com/international/comparisons/watch-dogs/watch-dogs-textures-comparison-1-ultra-vs-high.html

Notice the upper left-hand side of the screen where the rusty-looking part is located.


----------



## Silent Scone

Odd that it jumps from 3 GB to 6 GB.

Sounds to me like NV are trying to clear inventory.


----------



## Zipperly

Quote:


> Originally Posted by *Teh Bottleneck*
> 
> Here's a good one, claims to use the Ultra settings. Put it at 1440p, and enjoy:
> http://www.youtube.com/watch?v=DrzSBpkxbqw
> The game looks pretty good, sure, but there's nothing in there which screams "we need 6 GB to pull this off!"


Thanks, that looks good as heck to me. Lots of detail on the ground too.


----------



## paulerxx

Quote:


> Originally Posted by *Zipperly*
> 
> Thanks, that looks good as heck to me. Lots of detail on the ground too.


It's better looking than I originally thought... but still not 3 GB+ worthy.


----------



## Zipperly

Quote:


> Originally Posted by *paulerxx*
> 
> It's better looking than I originally thought... but still not 3 GB+ worthy.


I can agree with that.


----------



## fcman

So what do I need to run ultra at 1600p?


----------



## DADDYDC650

Quote:


> Originally Posted by *paulerxx*
> 
> It's better looking than I originally thought... but still not 3 GB+ worthy.


Was the HD ultra texture pack installed in that youtube clip?


----------



## Ascii Aficionado

Right, just like how CoD: Ghosts recommended 8 GB and Watch Dogs recommended 4-6 GB for Ultra, and both turned out to use only a fraction of that, even at 1440p.

I will say that those textures seem very detailed, but I haven't seen them rendered, nor have I seen a close-up.

Will take this with a massive grain of salt.


----------



## TopicClocker

Welp, if this game requires 6 GB, I'm waiting for 6-8 GB cards.


----------



## xxroxx

Not sure if it has been posted... but here is some ultra gameplay:

http://www.pcgamer.com/2014/09/26/middle-earth-shadow-of-mordor-video-max-settings-at-2560x1440-on-lpc/

Not even tessellated rocks... it looks awful and cartoonish.

*edit*

Also, ugh, it feels like watching some modded Assassin's Creed. The animations are literally the same... and there are lots of them!

*edit 2*

Look at 2:08. That's the 6 GB "ultra" option.


----------



## omari79

That's Watch Dogs all over again... a poorly optimized port that no one should bother picking up.

But who am I kidding... the game will sell anyway, which is the reason we have to deal with games like these in the first place.


----------



## Zipperly

Quote:


> Originally Posted by *xxroxx*
> 
> Not sure if it has been posted... but here is some ultra gameplay:
> 
> http://www.pcgamer.com/2014/09/26/middle-earth-shadow-of-mordor-video-max-settings-at-2560x1440-on-lpc/
> 
> Not even tessellated rocks... it looks awful and cartoonish.
> 
> *edit*
> 
> Also, ugh, it feels like watching some modded Assassin's Creed. The animations are literally the same... and there are lots of them!
> 
> *edit 2*
> 
> Look at 2:08. That's the 6 GB "ultra" option.


That does not look awful and cartoonish.

6 GB worth? No... but not as bad as you're making it out to be either.


----------



## Ferreal

Only complain so far is that it looks just like the Batman games.... I really dislike the combat in those games.


----------



## Remij

I have a feeling this game will run fine on ultra on 3 and 4 gb cards.


----------



## guitarmageddon88

Wait what....?

I just watched gameplay footage because I was looking for some "ultra x1000 better graphics than crysis3" stuff and this is garbage.

More of the same with that whole fantasy genre type stuff.

Not my cup o tea anyways


----------



## RaleighStClair

Quote:


> Originally Posted by *Promisedpain*
> 
> Knowing this - Why the hell does 980 only have 4GB Vram? Anyway, I won't be upgrading my 780 ti until 980 ti arrives, or I might even wait for the next gen.


Same. This trend is likely to continue as well. Sucks for those who just bought the 4 GB model 900 series cards. And don't forget, these 4 GB and 6 GB figures are for 1080p resolutions. Looks like the real winners will be those with 6 GB+ cards this time next year.


----------



## TopicClocker

Quote:


> Originally Posted by *xxroxx*
> 
> Not sure if it has been posted... but here is some ultra gameplay:
> 
> http://www.pcgamer.com/2014/09/26/middle-earth-shadow-of-mordor-video-max-settings-at-2560x1440-on-lpc/
> 
> Not even tessellated rocks... it looks awful and cartoonish.
> 
> *edit*
> 
> Also, ugh, it feels like watching some modded Assassin's Creed. The animations are literally the same... and there are lots of them!
> 
> *edit 2*
> 
> Look at 2:08. That's the 6 GB "ultra" option.


Quote:


> Originally Posted by *omari79*
> 
> That's Watch Dogs all over again... a poorly optimized port that no one should bother picking up.
> 
> But who am I kidding... the game will sell anyway, which is the reason we have to deal with games like these in the first place.


For all we know, Ultra could be a PC-exclusive feature; from IGN's review, that's exactly what it sounds like.
Quote:


> *On the PC side Mordor also compares to the Batman games, in that it's of good quality. There are even some enhanced graphics settings, including an ultra-high texture setting that requires a full 6GB of video memory.*


On top of that, it's a YouTube video, and YouTube compression really hurts the quality; raw screens and footage would be a much better basis for judgment than YouTube videos.

At least wait for the game to come out before complaining (this isn't targeted at anyone specifically).
No one knows how the game performs or how the Ultra textures compare to High and below, so calling it poorly optimized before anyone has had a chance to play it or do comparisons is ridiculous.

I'm not trying to justify the 6 GB Ultra textures at all; that is an insane amount of VRAM that hardly any cards outside of Titans or custom cards have.
The ludicrous thing is that we don't even know what they really look like, and people are already calling the game poorly optimized over one setting!

As I said in a previous post.
Quote:


> Originally Posted by *TopicClocker*
> 
> *People are complaining about something that could benefit PC gamers? Isn't this what was wanted?
> It wasn't too long ago that PC gamers were complaining left, right, and center about consoles holding PCs back; if developers are now offering PC gamers enhancements and textures that are a step up over the next-gen versions, a lot of people are going to look silly.*
> 
> *Ultimately it comes down to how it looks and performs before anyone starts calling it poorly optimized. We don't know how it performs, and we don't know how the PC version looks compared to the next-gen consoles; the optional "Ultra HD Texture Pack" surely is not a requirement to run the game and could well be an enhanced feature for PCs.*


If it is a bonus for PC gamers and looks better than High, and you still have people complaining, then you cannot please anybody.
As for the 6 GB requirement? Whether the VRAM requirements are justifiable is another story, and everyone will soon find out; the game releases in only 4 days.


----------



## xxroxx

Quote:


> Originally Posted by *Zipperly*
> 
> That does not look awful and cartoonish.
> 
> 6 GB worth? No... but not as bad as you're making it out to be either.


Opinions, opinions... yeah, for what it is meant to be, it is far behind the competition. The characters look average; not bad, just ordinary, like most out there. The environments, though... seriously, they are awful. Look at them closely. For 6 GB? That's waaay too low-res and low-poly. The lighting, on the other hand, looks good and masks what I've pointed out.
No more than a closer look is needed to see that, by today's standards and expectations, this is bad.


----------



## Zipperly

Quote:


> Originally Posted by *xxroxx*
> 
> Opinions, opinions... yeah, for what it is meant to be, it is far behind the competition. The characters look average; not bad, just ordinary, like most out there. The environments, though... seriously, they are awful. Look at them closely. For 6 GB? That's waaay too low-res and low-poly. The lighting, on the other hand, looks good and masks what I've pointed out.
> No more than a closer look is needed to see that, by today's standards and expectations, this is bad.


Yes, opinions, opinions... and I am just as entitled to mine as the next person. As far as the video is concerned, it looks great TO ME; not to you, and that is fine. Also keep in mind the game will look tons better from the perspective of someone actually playing it rather than someone watching a YouTube video, which has been proven to be a poor gauge of how good a game will look in person.


----------



## The Source

There is something wrong when the latest generation of video cards that just launched cannot play a new release like this at the highest settings. Only two groups of people have 6 GB and can play this: Titan owners and 780 6GB owners. Why bother catering to such a ridiculously small percentage of the market? Seriously, it's on the order of less than 0.1% of PC gamers.


----------



## Nvidia Fanboy

Quote:


> Originally Posted by *Zipperly*
> 
> Here we go... more demanding games are going to start coming out due to the new consoles, which is what everyone has been screaming for over the past 2 years, and now suddenly those same people are going to start crying "poor optimization" after finally getting their wishes granted. Lmbo...


Haha, this is so true. Gamers can be so whiny at times.


----------



## RaleighStClair

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> Haha, this is so true. Gamers can be so whiny at times.


The only people crying about this are the ones who said "3 GB is more than enough for the PS4 generation", etc. Looking at the games coming out, it appears 6 GB of VRAM may really be needed just for 1080p. Of course, I'm sure not all games will require as much, but I assume most with the quality of Mordor and The Evil Within will.


----------



## CaliLife17

Why are people getting so bent out of shape about what the recommended specs are? Who has ever really gone by those? Have people forgotten the CoD: Ghosts RAM "requirements"? People remember Watch Dogs' "requirements", and that game ran like crap no matter the system.

Let's not all get bent out of shape because they listed 6 GB of VRAM. Nvidia or AMD probably paid them to say that. People with 980s, 780 Tis, and 290Xs, I'm sure, will be able to run it no problem. I'm really surprised people are taking developers at their word when it comes to specs; have we not learned our lesson?

If you REALLY want to argue with someone, wait till benchmarks come out, so you can actually point to tangible evidence.

EDIT: And please stop saying that because of consoles we will need more VRAM. One game's "recommended" specs does not equal EVERY GAME OUT THERE. Seriously, people, consoles will not suddenly make 2-4 GB GPUs obsolete.


----------



## RaleighStClair

Quote:


> Originally Posted by *CaliLife17*
> 
> Why are people getting so bent out of shape about the what the recommended specs are. Who really has ever gone by that. Have people forgotten COD: Ghosts Ram "Requirements". People remember Watch Dog's "requirements" and that game ran like crap no matter the system.
> 
> Lets not all get bent out of shape because they listed 6gb of VRAM. Nvidia or AMD probably paid them to say that. People with 980 and 780 Ti's and 290x I'm sure will be able to run it no problem. Im really surprised people are taking developers at their word when it comes to specs, have we not learned are lesson?
> 
> If you REALLY want to argue with the someone, wait till benchmarks come out, so you can actually point to tangible evidence.


Is this not a forum to talk about video game news? I must be in the wrong place.


----------



## Flames21891

Well, I've got two 680s with a half-decent OC. I guess we shall see just how much 2GB of VRAM can really do.


----------



## CaliLife17

Quote:


> Originally Posted by *RaleighStClair*
> 
> Is this not a forum to talk about video game news? I must be in the wrong place.


No, talk all you want; that's what's great about a forum. But when you start deducing things from one developer's "recommended specs" and acting like they're final and apply to every game ever to come out in the future, that's just being ridiculous.

We have no evidence that the game won't run on ultra with less than 6GB. Max Payne 3 did the same thing with a 3GB VRAM figure, and it ran fine on GPUs with less than 3GB.


----------



## hollowtek

lol, really? This is silly. The dev is like, "Nope, we're not optimizing for PC; don't like it, buy a console. They're probably going to pirate this game anyway."

Fewer than 100 people on these forums will be playing on ultra?


----------



## Zipperly

Quote:


> Originally Posted by *CaliLife17*
> 
> EDIT: And please stop saying that because of Consoles, we will need more VRAM. 1 game "recommended" specs does not equal EVERY GAME OUT THERE. Seriously people, no consoles will not suddenly make 2-4gb GPU's obsolete.


I hate to tell you this, but the new consoles having as much RAM available for textures as they do will absolutely make 2GB, and possibly 3GB, cards obsolete very soon. There are not even that many next-gen console ports available for the PC yet, but the ones on the horizon are proving to be VRAM hogs, with requirements of 4GB or more of video RAM: Ryse, The Evil Within, and Shadow of Mordor. It's only going to get higher as devs learn to take advantage of the new consoles.

For the life of me, I don't understand why so many are shocked by this; if this area never progressed, we would all still be gaming on 256-512MB cards.


----------



## Zipperly

Quote:


> Originally Posted by *CaliLife17*
> 
> We have no evidence that the game won't run on ultra with less than 6gb. Max Payne did the same thing with 3gb Vram limit, and it ran fine on GPU's under 3gb.


And Max Payne 3 was also a last-gen console port....


----------



## Zipperly

Quote:


> Originally Posted by *RaleighStClair*
> 
> Is this not a forum to talk about video game news? I must be in the wrong place.


Yes it is, so carry on; the folks downplaying the whole VRAM issue are in for a rude awakening soon.


----------



## zipper17

6GB of VRAM is such bull****. Let's see what the ultra textures can really offer; I bet the Crysis series still looks better.

2-3GB of VRAM is the standard for ultra textures at 1080p.


----------



## GoldenTiger

Quote:


> Originally Posted by *Zipperly*
> 
> Yes it is and carry on, these folks downplaying the whole vram issue are in for a rude awakening soon.


It won't be that sudden, for this reason:
Quote:


> Originally Posted by *GoldenTiger*
> 
> *cough*
> Post made in October 2007 regarding Crysis 1: http://hardforum.com/showpost.php?p=1031581192&postcount=54 , and I have brought up the same sort of point you made in your post here, over the years. *Needless to say, I agree... it's a "chicken before the egg" scenario and ultimately one of the sides needs to cave and do it.*
> 
> *That said, there are very good reasons neither card manufacturers or game devs have done so yet: The problem has been, historically, that...*
> 
> -If you're a video card mfg, adding extra expense to your cards and design complexity in a market that is highly competitive on price and product segmentation (of which VRAM capacity can be a part of) is harmful to your bottom line. Additionally, fully pushing the limits gives you less room for product spacing with future releases.
> 
> while...
> 
> -If you're a game dev, making games that have settings beyond most current hardware capabilities has resulted in terrible press and PR about your title and in the gaming media, while providing questionable benefit in the immediate term and requiring extra resources brought to bear to create the extra technical complexity and graphical fidelity that would help push the mfg's to beef up their cards further.
> 
> Unfortunately there are definite disadvantages for both sides to be the guy that goes first, even though I think anyone would agree that in the long game it would benefit the industry on both sides of that fence.


Yes, tech always marches on, but I think it'll be another two years before 4GB starts to become even a little long in the tooth for most setups at all, let alone for general gaming whatsoever. While this is a real trend, the pair of games hitting the news today is nothing new and isn't any reason to be any more concerned than yesterday.


----------



## MaxFTW

It's all a lie in general. They do this because they don't want to test on 930449 different systems, so they put up a silly requirement to avoid getting told off.


----------



## zipper17

So, after all this time, it's a true story.


----------



## Clocknut

If you take a look at the GTX 580, it has 1.5GB of RAM. If you have an SLI setup, VRAM is probably your biggest problem now.

I think the 2GB 680 is starting to see this problem. Soon it will be the 3GB 780 Ti.


----------



## TheBlindDeafMute

Nonsense. This is pure laziness. I have Skyrim with over 30 HD and high-res texture packs, from foliage and skin replacements to entire city remakes. It uses 2.5GB. I highly doubt the game will actually look good enough to justify 6GB of VRAM.


----------



## Flames21891

Quote:


> Originally Posted by *TheBlindDeafMute*
> 
> Nonsense. This is pure laziness. I have skyrim with over 30 HD and high rez texture packs, from foliage and skin replacements, to entire city remakes. Uses 2.5 gb. I highly doubt the game will actually look great enough to justify 6gb of vram


It very likely won't live up to the 6GB standard, much in the way Titanfall doesn't live up to the 3GB standard.

If the game isn't just blowing smoke on these requirements, it's very likely that the game simply uses a lot of uncompressed assets from the console versions. Which is depressing because that means the devs could make it work on more mainstream hardware, they simply don't have the initiative to do so.

Of course, judgment should be reserved for the actual release. When it runs like crap on 980 SLI rigs, then we can start the mudslinging.
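For a sense of scale on the uncompressed-assets theory: here's a rough back-of-the-envelope sketch using standard GPU texture formats (the per-pixel rates below are the usual RGBA8/BC7/BC1 figures, not anything confirmed about this game's assets):

```python
# Rough VRAM cost of a single 4096x4096 texture in common GPU formats.
# RGBA8 is uncompressed (4 bytes/pixel); BC7 packs each 4x4 block into
# 16 bytes (1 byte/pixel); BC1 packs it into 8 bytes (0.5 bytes/pixel).
def texture_mib(width, height, bytes_per_pixel, mipmaps=True):
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base  # a full mip chain adds ~1/3
    return total / 2**20

for name, bpp in [("RGBA8 uncompressed", 4.0), ("BC7 (4:1)", 1.0), ("BC1 (8:1)", 0.5)]:
    print(f"4096x4096 {name}: {texture_mib(4096, 4096, bpp):.1f} MiB")
```

Shipping uncompressed rather than block-compressed assets multiplies texture memory by 4-8x, which is one plausible way an optional texture pack balloons into multi-gigabyte VRAM territory.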


----------



## Assirra

Am I living under a rock or something? Where did this massive shift come from? I thought 4GB of VRAM was considered huge, and now we've got 6GB?
I've still got 2GB on my GTX 680, lol.


----------



## xxroxx

Quote:


> Originally Posted by *Zipperly*
> 
> Yes opinions opinions... which i am just as entitled to mine as the next person. As far as the video is concerned it looks great TO ME, not to you and that is fine. Also keep in mind the game will look tons better from the perspective of someone actually playing it rather than watching a youtube video of it which has been proven to not be the best gauge for how good a game will look in person.


Exactly my thoughts, mate. And yeah, you're right about YouTube: not only are we looking strictly at details, it also has some terrible compression at times. But either way, the environment looks awful IMO.


----------



## kingduqc

Quote:


> Originally Posted by *paulerxx*
> 
> Evil Within looks pretty bad bro. It looks like a Wii U game or something.
> 
> http://static1.gamespot.com/uploads/original/1365/13658182/2443259-the+evil+within+screenshot_1383569070.png
> http://www.nerdist.com/wp-content/uploads/2014/02/EvilWithinScary.jpg
> http://static1.gamespot.com/uploads/original/280/2802776/2578939-tew1.jpg
> http://static1.gamespot.com/uploads/original/1365/13658182/2443257-the+evil+within+screenshot+%283%29_1383569101.png


This is a console screenshot, right? With tons of jaggies all over the place, it must be.


----------



## SoloCamo

Quote:


> Originally Posted by *lugal*
> 
> "Hurr durr, consoles are slowing progress of pc gaming, they are evil and pc master race doesnt need close to metal optimizations anyway..."
> - new consoles come out, devs start to use their hardware and increase requirements for pc ports accordingly -
> "Omgz, you want me to buy top of the line graphics cards to be able to use highest possible graphics settings? Impossibru, wheres ze optimizations..."


The problem is, this isn't progress. I can honestly say that, looking at these two games, Crysis 1 looks better in many aspects... and that game is what, almost 8 years old now? That's the problem. These requirements are laughable at best. I'm not afraid to upgrade when my system is just plain old or incapable, but a 4-6GB requirement for that level of mediocrity is a joke.

Yeah, it may look good to some (truthfully, I'm not very impressed; not saying it looks bad, but I would not call this game a leader graphically), but that doesn't mean that now that new consoles are around, devs should just be sloppy because they have additional resources to spare.


----------



## Leopard2lx

Even though Shadow of Mordor is more or less an open-world game, it had better beat Crysis 3 in graphics with that kind of VRAM requirement for max settings... otherwise it means we just got another crap-ass console port.

If it does look fantastic, then I don't have a problem with the high system requirements, even if I can't personally run it.


----------



## iARDAs

For those of you hating on the game without playing it: it has very, very good reviews.


----------



## pwnzilla61

Quote:


> Originally Posted by *Leopard2lx*
> 
> Even though Shadow of Mordor is an open-world game more or less, it better beat Crysis 3 in graphics with that kind of VRAM requirement for max settings...otherwise it means we just got another crap ass console port.
> 
> If it does look fantastic, then I don't have a problem with the high system requirements, even if I can't personally run it.


SOM doesn't even come close to Crysis 3.


----------



## Baasha

I was skeptical about this game but after looking at a couple of videos - especially the combat - I'm sold.

If only the idiots at Ubisoft would get the combat of AC to look/feel as good as this game, AC would be so much better.

The flip-dodge, executions, and combos make for a very interesting game. The one-button mashing garbage that is AC combat needs to be relegated to the dustbins of gaming history.


----------



## Fan o' water

I will wait for actual performance reviews before hitting any VRAM panic buttons. The 780 Ti was supposed to be inferior to the 290X due to having less VRAM, but it wasn't. I think the faster processing power of modern GPUs will overcome any console-driven requirements.


----------



## awdrifter

The PS4 has 5.5GB of RAM available for games, so if this game really requires 6GB of VRAM, the textures should be much better than the PS4 version's. But I somehow doubt that. I'll have to hold off on buying a new GPU and just wait and see. I remember when the last-gen consoles first came out, PC ports suddenly required an 8800 GTX with 512MB of RAM. This might happen again.


----------



## Silent Scone

After playing Ethan Carter at 1440p with 4x MSAA and having it barely use over 1.5GB, it's safe to say there isn't any real need for certain titles to use half as much as they do.

Like I said before, 3GB to 6GB seems a mighty large jump.

It's an Nvidia title too, don't forget.

All that being said, this was always going to happen with developers having more memory to play with on the console counterparts, so I wouldn't whinge. This is the jump most people were hurrying along for five years.


----------



## Pip Boy

So it took us years to break past the 1.2GB of VRAM of the original Crysis (which still has awesome textures) on to higher resolutions, more AA, and newer games, and most struggled to get anywhere near 2GB of VRAM. In fact, even with texture packs, 1440p, and AA, people struggle to break 2.6GB, perhaps 3-3.5GB at a real push, even though the community was willing to see how far things could be pushed with mods, 4K, supersampling, etc.

So somewhere in that chaos, even though it's been shown that 4K so far hasn't added a huge extra strain on VRAM, and existing games often use 2K and sometimes 4K textures (remember, a texture is only one small slice out of many, so beyond 2K is rarely needed), we didn't just move to a 4GB VRAM requirement, which would already double or triple that of any game ever released; we went up to 6GB?

So the only conclusion is that this is just lazy dev-itis, and the texture files are probably just RAW files or a similarly large, uncompressed dump of bulk image files.

A pathetic, semi-trolling coin flick towards PC gamers...


----------



## Silent Scone

Quote:


> Originally Posted by *phill1978*
> 
> So it took us years to break past 1.2GB of VRAM with the original crysis (which still has awesome textures) onto higher resolutions and more AA and newer games and most struggled to break anywhere near 2gb vram in fact even with texture packs, 1440p and AA people struggle to break 2.6gb, perhaps at a real push 3 - 3.5gb VRAM. Even though there was a willing to see if things could be pushed that far by the community with mods / 4k / Supersampling etc..
> 
> So somewhere in that chaos *even though its been proven 4k so far hasn't added a huge extra strain on VRAM* and existing games textures often use 2k sometimes 4k textures (remember a texture is only 1 small slice out of many textures so beyond 2k is rarely needed) we didn't just move to 4GB of VRAM required which is an effective doubling or tripling any game ever released we went upto 6GB VRAM ?!
> 
> so the only conclusion is that this is just lazy dev'itus and the texture files are probably just RAW files or similarly large uncompressed dumping of bulk image files.


Depends.

If you're using multiple cards it can be. Which you should be at 4K.


----------



## Pip Boy

Quote:


> Originally Posted by *Silent Scone*
> 
> Depends.
> 
> If you're using multiple cards it can be. Which you should be at 4K.


I agree. But show me an example of 6GB of VRAM usage on an existing title at 4K, please (let's not mention how SOM doesn't look as good as titles with lesser requirements). Also, I'd be interested if someone could downsample from 8K (what's that, about 33 million pixels?) on a modded Skyrim with texture packs and see if it hits 6GB of VRAM at 4K... it might.

The point is, progress is good for all gamers when a dev properly utilises the hardware like in times of old. If this isn't progress but just lazy devs throwing around ridiculous requirements to get PC fanbois all moist, it's a cynical marketing ploy, and it ironically goes to show how much they actually _DON'T_ care about PC gamers; it's just trying to garner more sales. I don't doubt the textures are amazing; perhaps they even do use 6GB of VRAM, but do they _actually need_ to?

This is one game, but what if more devs start following this trend? 50GB of HDD space, 6GB of VRAM, 16GB of system RAM = the next COD game @ 1080p.

Optimisation leaves a game the headroom to be modded beyond its stated requirements. If games get incredibly unoptimised, then you can kiss goodbye to having the headroom to mod and do crazy things to your games.
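Since the post above throws around pixel counts, here's a quick sanity check (the resolutions are the standard figures; the 8 bytes/pixel is a deliberately simplified model assuming one RGBA8 color target plus a 32-bit depth buffer):

```python
# Pixel counts and a minimal render-target estimate per resolution:
# one RGBA8 color buffer (4 bytes/pixel) + one 32-bit depth buffer.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "8K": (7680, 4320)}

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    target_mib = pixels * 8 / 2**20  # 4B color + 4B depth per pixel
    print(f"{name}: {pixels / 1e6:.1f} MP, ~{target_mib:.0f} MiB of render targets")
```

Even at 8K (about 33 million pixels), the bare render targets in this model come to a few hundred MiB; multi-gigabyte VRAM totals come from textures, geometry, and intermediate buffers, not the framebuffer alone.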


----------



## Zipperly

Quote:


> Originally Posted by *kingduqc*
> 
> this is a console screen right? Tons of jaggies all over the place it must be.


Do you even know what aliasing looks like? Because there is none in any of those shots.


----------



## fashric

I can't wait to see these "Ultra" textures that will obviously be 3x as impressive as any textures in a game we've seen before. /s I actually can't believe people on a hardware forum are buying into this rubbish.


----------



## Zipperly

Quote:


> Originally Posted by *pwnzilla61*
> 
> SOM doesn't even come close to Crysis 3.


So you have played SOM already? Wow... Anyway, I agree that so far Crysis 3 is the better-looking game, but we can't really compare it to SOM until the game is actually released. They're also two entirely different types of games. How much freedom to roam is there in SOM? Things like that matter too, you know.


----------



## fashric

Compare it to Skyrim with modded 2K textures, then. Even at 1440p it doesn't need more than 3GB of VRAM. This is clearly a case of them not bothering to optimise for PC. The same thing happened with Watch_Dogs: the requirements do not match the quality of the graphics. You aren't getting anything more for the 6GB of VRAM than you would from a properly optimised game using 3GB.


----------



## batman900

Quote:


> Originally Posted by *fashric*
> 
> Compare it to Skyrim with modded 2k textures then. Even at 1440p it doesn't need more than 3gb vram. This is clearly a case of them not bothering to optimise for PC. The same thing happened with Watch_Dogs the requirements do not match the quality of the graphics. You aren't getting anything more for the 6gb vram than you would in a properly optimised game with 3gb vram.


Regardless, it is becoming a trend, and thus more VRAM will be needed if we want to conform. Otherwise companies will be just as happy letting us run on med/high while saving R&D costs on the PC minority.


----------



## Zipperly

Quote:


> Originally Posted by *fashric*
> 
> Compare it to Skyrim with modded 2k textures then. Even at 1440p it doesn't need more than 3gb vram. This is clearly a case of them not bothering to optimise for PC. The same thing happened with Watch_Dogs the requirements do not match the quality of the graphics. You aren't getting anything more for the 6gb vram than you would in a properly optimised game with 3gb vram.


I can't really speak for Watch Dogs because I have never played it, but as for modded Skyrim, I have had no problem whatsoever going past my 3GB limit at 1920x1080 with a modded .ini plus lots of high-res texture mods. Anyway, modded or not, I don't think it's an apples-to-apples comparison, since Skyrim was built around last-gen consoles.

We shall see how this all turns out, but I won't be shocked when newer games start needing 4GB+ of VRAM.


----------



## Pip Boy

Quote:


> Originally Posted by *batman900*
> 
> Regardless, it is becoming a trend and thus more vram will be needed *if we want to conform*. Otherwise companies will be just as happy letting us run on med/high while saving R&D costs for the pc minority.


We don't. Poor coding should not be accepted and encouraged.

We have all seen consoles with seemingly terrible system power match PC graphics in the past (OK, not the resolution, but on a TV they still look very nice). Real optimisation of something like a 970/980 GPU, and I mean the kind of optimisation you see in the demo scene or from people coding at the lowest level, could produce the sort of graphics we all kind of expect by now, but for some reason graphics quality is, generally speaking, remaining largely static.


----------



## Redeemer

Thanks for packing the 780 Ti with only 3GB and charging top dollar, Nvidia.


----------



## KenjiS

Quote:


> Originally Posted by *Redeemer*
> 
> Thanks for packing the 780TI with only 3GB and charging top dollar Nvidia


The 780 Ti is now dead; the 980 replaced it, technically.

As for the 6GB Ultra textures... either they're that good, or they are not compressed. From my reading, they're an optional download on top of the game.

Does anyone maybe want to cut them some slack and assume that maybe, JUST MAYBE, they actually are trying to please the people who invested in Titans, and that the ultra textures really will look quite good? Or the "suggestion" is just excessive, or assumes supersampling or something.

But it does reinforce why I'm ditching my 770 SLI setup. SLI is bloody useless if everything needs 4GB+ of VRAM. 2GB probably isn't sufficient anymore.

And it's nothing new either... The last time I remember seeing things like this was around when the 360 came out. Call of Duty 2 arrived, and my GPU at the time, which WAS pretty powerful, pretty much said "NOPE" and refused to run the game at anything above about 5fps...


----------



## Tonza

Quote:


> Ultra-Textures are not available yet, so the PC Gamer Vid only uses "High".
> 
> I doubt even the ultra textures will use 6 gigs at 1080p, however - High uses 2.7 gigs, but with downsampling from 2720 x 1700 @ 1920 x 1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that - so instead of 1080p I'll get some really weird ones like 1927 x 1187 or something like that (and I can't be arsed to fix it right now).
> 
> There is, however, an in-game supersampling option, so you can set the internal rendering to 200% and effectively get 4K res while still being at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be at around 3 gigs. So the recommendation seems plausible.
> 
> So with Ultra textures taking somewhat more VRAM and supersampling enabled, 4 gigs might just not be enough, making the jump to 6 gigs logical. - Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 gigs running this game with everything set to ultra, including textures, but with supersampling disabled.
> 
> It also runs very nicely. With everything set to ultra except textures and downsampling, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably be at least slightly higher than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970).
> I'll test that more in-depth at some point next week, so we'll see.
> 
> So much for not optimized.
> 
> So don't get your panties in a bunch Wink




I knew that the 6GB VRAM requirement was fishy.
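The supersampling claim in the quoted post checks out on paper; a small sketch (the 200% figure and resolutions come from the quote, the rest is plain arithmetic):

```python
# 200% internal supersampling scales each axis by 2, so a 1080p output
# renders exactly as many pixels as native 4K: 4x the work and roughly
# 4x the render-target memory.
def rendered_pixels(width, height, internal_scale=1.0):
    return int(width * internal_scale) * int(height * internal_scale)

print(rendered_pixels(1920, 1080))       # native 1080p
print(rendered_pixels(1920, 1080, 2.0))  # 1080p at 200% supersampling
print(rendered_pixels(3840, 2160))       # native 4K - the same count
```

So if the Ultra pack is sized with supersampling in mind, the 6GB recommendation is less mysterious: it is effectively budgeting for 4K.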


----------



## KenjiS

^- That was my suspicion...


----------



## perfectblade

Quote:


> Originally Posted by *phill1978*
> 
> we dont. poor coding should not be accepted and encouraged.
> 
> we have all seen consoles with seemingly terrible system power match PC graphics in the past ( ok not the resolution but on a TV they look very nice still) optimisation of something like a 970/980 GPU and i mean real optimisation like you see on the demo scene or people coding at the lowest level could produce the sort of graphics we all kind of expect now but for some reason generally speaking graphics quality is remaining largely static


Yeah, exactly. Nvidia said Titan = 4K, but there is no way that will be happening. Maybe if PCs got a tenth of the optimization consoles do... At least in the 90s it seemed like PC games were decently programmed, but then again they were more likely to be the lead platform.


----------



## ladcrooks

It's a ploy between the card and game manufacturers to make you update your GPUs more often.


----------



## Baasha

Quote:


> Originally Posted by *phill1978*
> 
> I agree. *But show me an example of 6GB Vram on an existing title* (lets not mention how SOM doesnt look as good as titles with lesser requirements) at 4k please. Also id be interested if someone could Downsample from 8K ( what's that, about 32million pixels?) on a modded skyrim with texture packs and see if it hits 6GB VRAM in 4k.. it might.
> 
> the point is progress is good for all gamers when a dev properly utilises the hardware like times of old, if this was progress and not just lazy devs throwing around ridiculous requirements to get PC fanboi's all moist.. its a cynical marketing ploy an ironically goes to show how much they actually _DON'T_ care about PC gamers. Its trying to garner more sales. I dont doubt the textures are amazing, perhaps even they do use 6GB of VRAM but do they _actually need_ too ?
> 
> This is one game, but what if more dev's start following this trend ? 50gb of HDD space? 6GB VRAM, 16GB system RAM = Next COD game @ 1080p .
> 
> optimisations allow for a game to be modded beyond their static requirement. If games get incredibly unoptimised then you can kiss good bye to having the headroom to mod and do crazy things to your games also.


Looks like you haven't seen this: *6GB VRAM USAGE*


----------



## MapRef41N93W

Just bought it for 25% off at GMG. Will test on SLI 970s + 4770k when it comes out and post my results.


----------



## CaliLife17

Quote:


> Originally Posted by *Baasha*
> 
> Looks like you haven't seen this: *6GB VRAM USAGE*


If I remember correctly, Crysis 3 will cache as much VRAM as you have available, but it's not actually USING that 6GB. So if you have 6GB of VRAM it will show 6GB in use, and if you have 4GB it will show 4GB, when in reality it is not using that much.


----------



## Mygaffer

Quote:


> Originally Posted by *Ramzinho*
> 
> i believe this is absurd. The most advanced GPUs now are 4GB "yes there are 6GB' but they are special editions.. either this ultra settings look super impressive with tons of details or it's terribly optimized and i tend to believe in the later. devs are as usual so lazy with imports optimization ....


It isn't an import, it is a port; it was developed for x86 platforms. I don't know why people find it so hard to believe that, just as has always happened, VRAM requirements for games continue to go up.

My first video card had 32MB of VRAM. Megabytes. I remember cards I was buying in the mid-to-late 2000s that had 512MB, then 1GB, and so on. You can't run a game like Crysis on my GeForce 2 with 32MB of VRAM, and it sounds like you can't run Shadow of Mordor with ultra textures on a 7970 or 780.

Most PC gamers I know celebrate this kind of stuff, because it usually means bigger, better-looking games are coming out.


----------



## SoloCamo

Quote:


> Originally Posted by *Mygaffer*
> 
> It isn't an import, it is a port, it was developed for x86 platforms, I don't know why people are finding it so hard to believe that just like has happened forever VRAM requirements continue to go up for games.
> 
> My first video card had 32MB of VRAM. Megabytes. I remember cards I was buying in the mid to late 2000's that had 512MB, then 1GB, and so on. You can't run a game like Crysis on my Geforce 2 with 32MB of VRAM, and it sounds like you can't run Shadow of Mordor with ultra textures on a 7970 or 780.
> 
> Most PC gamers I know celebrate this kind of stuff, because it usually means bigger, better looking games are coming out.


But no one is expecting a GeForce 2 to run Crysis, and the VRAM limit is not the first reason that comes to mind. The issue here is that this time the consoles launched with considerably weaker hardware, compared to what is available in the PC world on even mid-range GPUs, than they ever have before... so that argument is null.

I'm not excited to see artificially (and considerably) inflated requirements when the games don't even look much better...


----------



## Alatar

Fun fact: back when Crysis came out, the GeForce 2 was about as old as a GTX 280 is today. The gap between Crysis and the GeForce 2 was slightly bigger, but not by much.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Alatar*
> 
> Fun fact, back when Crysis came out GeForce 2 was about as old as a GTX 280 is today. Slightly bigger gap with crysis and geforce 2 but not much.


Fun fact: my GTX 280 can play almost any game @ 720p right now. I played BF4 no problem.


----------



## StrongForce

That's nice; I hope the game is gonna be epic, by the way... the comeback of Monolith, it better be badass!

And lol @ everyone who was saying 3GB is more than enough, even for higher res, not long ago.

Also, does that mean the 980 should have packed 6GB and not 4? Uh uh... Nvidia slowing progress down again, lol. I just hope the AMD models release with 6GB; that would be so sick.


----------



## omari79

Quote:


> Originally Posted by *TopicClocker*
> 
> For all we know Ultra could be a PC exclusive feature, from IGN's review that's exactly what it sounds like.
> Ontop of that, it's a YouTube video, YouTube videos are compressed so that really hurts the quality, raw screens and footage is what would be a lot better than using YouTube videos.
> 
> At-least wait for the game to come out before complaining. (This isn't targeted at anyone specifically).
> No one knows how the game performs or how Ultra textures compare to High and below, so calling it poorly optimized before anyone has had a chance to play it or do comparisons is ridiculous.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not trying to justify the 6GB Ultra textures, not at all as that is an insane amount of Vram which hardly any cards outside of Titans or custom cards have.
> The ludicrous thing is that we don't even know what they really look like and people are calling the game poorly optimized already, over one setting!
> 
> As I said in a previous post.
> If it is a bonus for PC Gamers and looks better than High, and you still have people complaining, then you cannot please anybody.
> What about the 6GB requirement? Well whether the VRam requirements are justifiable is another story and everyone will soon find out, only 4 days until the game's release.


Mate... I am willing to bet my lowly GTX 460 on this; that's how sure I am that the optimization will be garbage. It's a trend nowadays.

Look up Crysis 3 (on YouTube too), then tell me that the ultra-whatever settings on this game warrant 6GB of VRAM...


----------



## TopicClocker

Quote:


> Originally Posted by *omari79*
> 
> mate..i am willing to bet my lowly GTX460 on this..thats how sure i am that the optimization will be garbage..its a trend nowadays
> 
> look up Crysis 3 (on youtube too) then tell me that ultra whatever settings on this game warrants 6GB of VRM..


Based on what? A few recent games that don't perform too well, made by other developers?
Making assumptions and jumping to conclusions when we have hardly seen how the game looks or performs... to be honest, it's really unfair to prejudge a game when there's no evidence it would be poorly optimized.

Shadow of Mordor is an open-world game with massive levels; Crysis 3 is not, so its demands are going to be vastly different from those of an open-world, dynamic game.
They're not at all comparable on a technical level, but that's not to justify the alleged 6GB of VRAM for Ultra textures.

Straight from NeoGAF, from somebody who has actually played the game (I've been closely following the thread there).
Quote:


> Originally Posted by *blaidd*
> 
> Alright.
> 
> Ultra-Textures are not available yet, so the PC Gamer Vid only uses "High".
> 
> I doubt even the ultra textures will use 6 gigs at 1080p, however - High uses 2.7 gigs, but with downsampling from 2720 x 1700 @ 1920 x 1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that - so instead of 1080p I'll get some really weird ones like 1927 x 1187 or something like that (and I can't be arsed to fix it right now).
> 
> There is however an in-game supersampling option, so you can set the internal rendering to 200% and effectively get 4K res while still being at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be around 3 gigs. So the recommendation seems plausible.
> 
> *So with Ultra textures taking somewhat more VRAM and supersampling enabled, 4 gigs might be just not enough, making the jump to 6 gigs logical. - Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 gigs running this game with everything set to ultra, including textures, but with supersampling disabled.*
> 
> *It also runs very nicely. With everything set to ultra except textures, and with downsampling, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X.* This is an Nvidia-sponsored game, so GeForce users will probably be at least slightly higher than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970)
> I'll test that more in-depth at some point next week, so we'll see.
> 
> So much for not optimized.


They also posted this image.


No one knows if it truly requires 6GB, or how the Ultra textures look.
If this is a PC enhanced feature, which the signs point to yes, would it be better to cut it from the game entirely?
Suppose they're trying to throw PC Gamers a bone here? (Even though it has insanely high Vram requirements?).
No I am not justifying it, but the quote above paints a different picture about it being poorly optimized.

I myself am not fond of allegedly needing 6GB of VRAM for Ultra textures, but the truth is no one knows whether it really does, and no one knows how it performs. Calling an unreleased game poorly optimized, when hardly anyone outside of reviewers or early copies has had a chance to try it, is ridiculous - especially over an optional Ultra texture pack which could well exist especially *for* PC gamers!

Wait for the release, the benchmarks and the comparisons from Eurogamer. Gosh, if this single optional Ultra HD texture pack hadn't existed, no one would be picking up pitchforks; it's ridiculous.

What PC gamers should be picking up pitchforks about is the base game wanting 4GB of video memory.

If I'm wrong I'll gladly accept it, but right now it's all assumptions, ALL of it.
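The supersampling arithmetic in the quoted NeoGaf post is easy to sanity-check: a 200% internal render scale doubles each axis, quadrupling the pixel count, which is why "1080p with 200% supersampling" is effectively 4K. A minimal sketch (the function name is purely for illustration):

```python
# Effective internal render resolution for a given supersampling scale.
# A 200% scale doubles each axis, so the pixel count goes up 4x.
def effective_resolution(width, height, scale_percent):
    factor = scale_percent / 100
    return int(width * factor), int(height * factor)

w, h = effective_resolution(1920, 1080, 200)
print(w, h, (w * h) / (1920 * 1080))  # 3840 2160 4.0
```

This is the sense in which the quoted poster argues the 6GB figure is about effective 4K rendering rather than plain 1080p.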


----------



## fashric

I feel most posters are completely missing the point as to why some of us think this is a bad thing. They are increasing the VRAM requirements not because the textures are any better, but because the engine they use is geared towards the newer consoles' memory configuration, which works differently from a PC's. If it were because of a massive improvement in image and texture quality, then I don't think many would have an issue. If you want to throw your money away buying a 6GB card for no improvement, just because of studios/developers refusing to port properly to PC, then you are either very rich or not very sharp.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fashric*
> 
> I feel most posters are completely missing the point as to why some of us think this is a bad thing. They are increasing the VRAM requirements not because the textures are any better, but because the engine they use is geared towards the newer consoles' memory configuration, which works differently from a PC's. If it were because of a massive improvement in image and texture quality, then I don't think many would have an issue. If you want to throw your money away buying a 6GB card for no improvement, just because of studios/developers refusing to port properly to PC, then you are either very rich or not very sharp.


What else can we do? The thing is, 3 or 4GB will not be enough in 1 or 2 years. Consoles have that much but run at sub-1080p resolutions (900p/720p). We run games at 1440p and 4K. Even if the game used 2GB @ 900p, it's going to use well over 6GB @ 4K.
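A back-of-the-envelope for the resolution-scaling claim above: if (and it is a big if) VRAM use scaled purely with pixel count, the numbers come out as below. In practice texture memory largely does not scale with output resolution, so treat this as a rough bound on the render-target component only; the 2GB @ 900p figure is the poster's hypothetical.

```python
# Naive pixel-count scaling of a VRAM figure between two resolutions.
# Only render targets scale like this; textures mostly do not.
def scale_vram_gb(base_gb, base_res, target_res):
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return base_gb * target_px / base_px

gb_4k = scale_vram_gb(2.0, (1600, 900), (3840, 2160))
print(round(gb_4k, 2))  # 11.52 - if usage scaled purely with pixels
```

The 5.76x pixel ratio between 900p and 4K is what makes the poster's "well over 6GB" extrapolation at least arithmetically plausible, even if real games behave less linearly.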


----------



## BinaryDemon

Quote:


> Originally Posted by *TopicClocker*
> 
> If this is a PC enhanced feature, which the signs point to yes, would it be better to cut it from the game entirely?
> Suppose they're trying to throw PC Gamers a bone here? (Even though it has insanely high Vram requirements?).


I agree it's probably an extreme lack of optimization but...

Never cut anything for PC!

Just because something doesn't play well right now doesn't mean hardware won't have caught up in a year or two.


----------



## fashric

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What else can we do? The thing is, 3 or 4GB will not be enough in 1 or 2 years. Consoles have that much but run at sub-1080p resolutions (900p/720p). We run games at 1440p and 4K. Even if the game used 2GB @ 900p, it's going to use well over 6GB @ 4K.


Simple, only buy games from developers that properly support the PC.


----------



## KenjiS

Quote:


> Originally Posted by *TopicClocker*
> 
> Based on what? a few recent games which don't perform too great and made by other developers?
> -snipped for length-


Exactly. And furthermore, we do NOT know what the Ultra textures look like, as they are going to be a day-1 patch thing. So no, not even the reviewers are getting to see them. For all we know they could look absolutely stupendous and actually be WORTH a 6GB GPU. Or they might not actually need it. The issue is people are crying poor optimization when there's no evidence of that yet.

Poor optimization is Watch_Dogs, where folks like me had to drop to an Xbox 360 level of graphics quality to get it -somewhat- running, and it still stuttered like a you-know-what because it uses an absurd amount of VRAM. Shadow of Mordor, on the other hand, I can see using 3GB of VRAM for High settings (which is all the footage we've seen so far..). The stuff I've seen from the game looks very good, and it does appear to have large, expansive worlds that would require large textures.

We keep bringing up Crysis 3 over and over and over; that horse is getting a bit thin from all the beating. Perhaps we need to accept that not every game is Crysis 3. It's likewise not fair to compare Crysis 3, which is a tiny, fairly linear world, with an open-world game.

The thing I don't get is that I've seen this happen before, over and over again: suddenly there's a big jump in the specs needed to run the newest games and a chasm appears, and usually that chasm swallows even fairly recent hardware, because the latest games require, well.. insert something here - be it more shader units or VRAM or whatever. Then they plateau for a while. Then there's a big jump again. It's just how it is.

Folks are just mad because until about January of this year the opinion was "2GB is all the VRAM you'll ever need", and now that's simply not the case. For whatever reason, games want more than 2GB of VRAM. Yes, it's probably partially because the new consoles have a lot of shared RAM to use, but it could also be that, simply put, this is what pushing graphical fidelity higher requires. Visual gains are not linear in terms of performance requirements: the higher you push that bar, the more it costs to push it further.

Give it 6 months; the 960 Ti or whatever will probably be loaded with 4GB of VRAM for $250, and the 980 Ti will probably have 8GB (at $1000).


----------



## KenjiS

Quote:


> Originally Posted by *fashric*
> 
> Simple, only buy games from developers that properly support the PC.


Doesn't have good graphics = doesn't properly support PC

Has good graphics, including a special Ultra texture pack, but requires high-end systems that only 2% of PC gamers have = doesn't properly support PC

Executives see this = PC gamers are unpleasable

-edit- And no, I'm not ruling out that the game is possibly poorly optimized, but since none of us here have the bloody thing, it's a bit early to call that..


----------



## MonarchX

Another case of Watch Dogs? They up the requirements because they hit a wall in their engine development that keeps them from optimizing the game, to the point where even high-end PC hardware struggles to run it. If they raise the system specs, then they can always point at your system when you get FPS issues. I bet that even the 6GB cards will experience severe FPS issues. There is also this thing called compression: a well-compressed texture can look 99.9% like the original in quality and fidelity, yet take up 90% less VRAM.
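The compression point can be put in rough numbers using standard GPU block-compression formats: BC1 stores each 4x4-texel block in 8 bytes (0.5 B/texel), BC7 in 16 bytes (1 B/texel), versus 4 B/texel for uncompressed RGBA8, and a full mip chain adds roughly a third on top. The 4096x4096 texture below is just an illustrative size, not one taken from the game.

```python
# Size of one texture in MB for a given bytes-per-texel rate.
# A full mip chain is a geometric series of quarter-size levels,
# which converges to ~4/3 of the base level.
def texture_mb(width, height, bytes_per_texel, mips=True):
    size = width * height * bytes_per_texel
    if mips:
        size *= 4 / 3
    return size / (1024 ** 2)

for name, bpt in [("RGBA8", 4.0), ("BC7", 1.0), ("BC1", 0.5)]:
    print(name, round(texture_mb(4096, 4096, bpt), 1), "MB")
# RGBA8 85.3 MB, BC7 21.3 MB, BC1 10.7 MB
```

BC1's 8:1 ratio is close to the "90% less" figure in the post, though at noticeably lower quality than "99.9% like the original" suggests; BC7 at 4:1 is the usual quality/size compromise on DX11-class hardware.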


----------



## TopicClocker

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What else can we do? The thing is, 3 or 4GB will not be enough in 1 or 2 years. Consoles have that much but run at sub-1080p resolutions (900p/720p). We run games at 1440p and 4K. Even if the game used 2GB @ 900p, it's going to use well over 6GB @ 4K.


I think 4GB will be the bare minimum.
The PS4 has 4.5GB - 5.5GB available for games currently, so unless they start using 5GB for graphics memory...

Who thought the next gen consoles would start biting PC Gamers in the backside, because of their unified memory...


----------



## EniGma1987

Quote:


> Originally Posted by *fashric*
> 
> I feel most posters are completely missing the point as to why some of us think this is a bad thing. They are increasing the VRAM requirements not because the textures are any better, but because the engine they use is geared towards the newer consoles' memory configuration, which works differently from a PC's. If it were because of a massive improvement in image and texture quality, then I don't think many would have an issue. If you want to throw your money away buying a 6GB card for no improvement, just because of studios/developers refusing to port properly to PC, then you are either very rich or not very sharp.


I don't mind VRAM requirements rising like this, because in another generation or two we will have stacked memory on GPUs, and card VRAM sizes will be in the 12-16GB range before you know it. As we move to 4K and 8K resolutions over the next 5-10 years, the VRAM requirements for textures and such will of course increase, but they will eventually top out, and then we will have a bunch of space left over that might as well be used to cache other stuff so games run better.


----------



## KenjiS

Quote:


> Originally Posted by *TopicClocker*
> 
> I think 4GB will be the bare minimum.
> The PS4 has 4.5GB - 5.5GB available for games currently, so unless they start using 5GB for graphics memory...
> 
> Who thought the next gen consoles would start biting PC Gamers in the backside, because of their unified memory...


And if what you posted before is right, the 6GB number could include supersampling - i.e., rendering the bloody thing at 4K.

Which, yeah, REALISTICALLY might need some more VRAM, folks...

2GB is probably no longer enough, 3GB is probably the bare minimum, and 4GB is probably the new standard. There's probably a good reason Nvidia pushed out 4GB cards right before the holidays...


----------



## Pip Boy

Quote:


> Originally Posted by *Baasha*
> 
> Looks like you haven't seen this: *6GB VRAM USAGE*


lol seriously.
Quote:


> Playing Crysis 3 maxed out including 8x MSAA @ 5160x2560 (3 30" monitors in portrait mode)


i was referring to an out of the box requirement on a typical setup.

That setup above - 3x30" 1600p monitors on PC, let alone running 4x Titans - is probably used for hardcore gaming by fewer than 150-200 people in the world outside of production work; factor everything in and the number drops to what, exactly?

As you probably know, 4x MSAA effectively takes the edges of, say, a 1080p image and samples them four times over - meaning 4K-density edges (not the full scene, like supersampling, but still.. a lot of edges).

So 5160 x 2560 x 8 = 105,676,800 edge samples _per frame_









Maxed out, the best-looking game on PC, running on FOUR GTX Titan SCs @ $4,000, and he is seeing drops from 40 to 25 fps..

Totally typical usage scenario.
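As a sanity check on the sample arithmetic above, assuming a 5160 x 2560 surface with 8 coverage samples per pixel (the worst-case storage an 8x MSAA render target has to budget for):

```python
# Worst-case MSAA sample count for a render target:
# every pixel stores up to `samples` coverage samples.
width, height, samples = 5160, 2560, 8
total_samples = width * height * samples
print(f"{total_samples:,}")  # 105,676,800
```

This is why extreme multi-monitor setups blow past VRAM budgets that are perfectly comfortable at a single 1080p screen.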


----------



## Mygaffer

Quote:


> Originally Posted by *SoloCamo*
> 
> But no one is expecting a a Geforce 2 to run Crysis.. and the vram limit is not the first reason that comes to mind. The issue here is the consoles this time have considerably weaker hardware then they've ever launched at compared to what is available in the pc world on even medium end gpu's...so that argument is null.
> 
> I'm not excited to see artificially (and considerably) inflated requirements when the games don't even look much better...


I was simply making the point that history bears out that AAA games will use more VRAM over time. The new consoles may have relatively weak hardware, but they have large amounts of RAM; I believe both systems have 8GB of unified memory. Also, they are quoting this amount of VRAM for use with an HD texture pack, which I imagine is similar to the one Bethesda released for the PC version of Skyrim.
Thus it is not at all unreasonable to assume that the 6GB VRAM requirement is legitimate rather than the result of "poor programming". Do you expect 3GB frame buffers to stay relevant forever?


----------



## KenjiS

Quote:


> Originally Posted by *Mygaffer*
> 
> I was simply making the point that history bears out that AAA games will use more VRAM over time. The new consoles may have relatively weak hardware, but they have large amounts of RAM; I believe both systems have 8GB of unified memory. Also, they are quoting this amount of VRAM for use with an HD texture pack, which I imagine is similar to the one Bethesda released for the PC version of Skyrim.
> Thus it is not at all unreasonable to assume that the 6GB VRAM requirement is legitimate rather than the result of "poor programming". Do you expect 3GB frame buffers to stay relevant forever?


Precisely

And you are correct, both consoles have 8GB of unified memory; the PS4 actually uses GDDR5. Yes, 8GB of GDDR5.

Each has roughly 2GB reserved by the system that is unusable by applications, so figure 5.5-6GB available.

Really, the design makes a lot of sense. The only real reason PCs don't do this is that we don't use APUs, and the bus speed would bottleneck far too much to be worthwhile. And the consoles have to last a VERY long time - possibly 7 years. The PS3 found out the disadvantage of a fragmented memory system later in its life...

Though that is an entire other can of worms


----------



## Zipperly

Quote:


> Originally Posted by *TopicClocker*
> 
> Based on what? a few recent games which don't perform too great and made by other developers?
> Making assumptions and jumping to conclusions when we have hardly seen what the game looks or performs like, it's really unfair to be honest to prejudge a game when there's no evidence as to why it would be poorly optimized.
> 
> Shadow of Mordor is an open-world game with massive levels; Crysis 3 is not. Its demands are going to be vastly different from those of an open-world, dynamic game.
> They're not comparable on a technical level, but that's not to justify the alleged 6GB of VRAM for Ultra textures.
> 
> Straight from NeoGaf, from somebody who has actually played the game. (I've been closely following the thread there).
> They also posted this image.
> 
> 
> No one knows if it truly requires 6GB, or how the Ultra textures look.
> If this is a PC enhanced feature, which the signs point to yes, would it be better to cut it from the game entirely?
> Suppose they're trying to throw PC Gamers a bone here? (Even though it has insanely high Vram requirements?).
> No I am not justifying it, but the quote above paints a different picture about it being poorly optimized.
> 
> I myself am not fond of allegedly needing 6GB Vram for Ultra textures, but the truth is no one knows if it does or not, and no one knows how it performs so calling an unreleased game poorly optimized when hardly anyone outside of reviewers or early copies have had a chance to try is ridiculous, over an optional Ultra texture pack which could well be especially *for* PC Gamers!
> 
> Wait for the release, the benchmarks and the comparisons from EuroGamer, gosh if this single optional Ultra HD Texture pack hadn't existed no one would be picking up pitch forks, it's ridiculous.
> 
> What PC Gamers should be picking up pitch forks about is the game which wants 4GB video memory.
> 
> If I'm wrong I'll gladly accept it, but right now it's all assumptions, ALL of it.


Finally, someone who gets it.


----------



## kx11

So is this the COD: Ghosts 6GB RAM requirement saga all over again?


----------



## KenjiS

Quote:


> Originally Posted by *Zipperly*
> 
> +1 It's as if some of these people just recently joined PC gaming and have not seen, over the past decade, how VRAM requirements have consistently risen over time due to better-looking games/newer consoles.


I know, right? Sometimes I do wonder if a lot of the folks posting here are younger or something. I'm 26; I've been a PC gamer since I was... heck... 8? Whenever TIE Fighter and X-Wing were a thing.









It's like with Vista and people complaining how unoptimized it was and what a resource hog it was. The same things were said about XP! And 95 before it!


----------



## KenjiS

Quote:


> Originally Posted by *kx11*
> 
> so is this COD ghosts 6gb ram requirement saga all over again ?


That was just stupid in my book. Seriously, RAM is not expensive. Today's Newegg email had 8GB for $72 - that's $12 more than a new copy of Ghosts was. It's not like saying you need a $1000 graphics card.

People just don't understand that things change. PC gaming has always been about evolution and pushing forward. Sometimes that DOES mean stuff isn't very well optimized and you're paying a pound to get a penny in terms of performance, but that's true of EVERYTHING when you're pushing limits.


----------



## Zipperly

Quote:


> Originally Posted by *KenjiS*
> 
> I know right? Sometimes I do wonder if a lot of the folks that are posting here are younger or something. I'm 26, I've been a PC gamer since I was... heck... 8? When were TIE Fighter and X-Wing a thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's like with Vista and people complaining how unoptimized it was and what a resource hog it was. The same things were said about XP! And 95 before it!


Exactly my friend and im 41 so I have seen a lot of stuff change over the years.


----------



## NuclearPeace

Quote:


> Originally Posted by *KenjiS*
> 
> I know right? Sometimes I do wonder if a lot of the folks that are posting here are younger or something. I'm 26, I've been a PC gamer since I was... heck... 8? When were TIE Fighter and X-Wing a thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's like with Vista and people complaining how unoptimized it was and what a resource hog it was. The same things were said about XP! And 95 before it!


OT: I'm only 15.

I think that OSes are different than games. I don't see much reason why an OS should be designed to use up more resources than its predecessors.


----------



## KenjiS

Quote:


> Originally Posted by *NuclearPeace*
> 
> OT: I'm only 15.
> 
> I think that OSes are different than games. I don't see much reason why an OS should be designed to use up more resources than its predecessors.


Sometimes to have a UI that doesn't look like something a 5-year-old drew on my screen with crayon!


----------



## TopicClocker

Quote:


> Originally Posted by *NuclearPeace*
> 
> OT: I'm only 15.
> 
> I think that OSes are different than games. I don't see much reason why an OS should be designed to use up more resources than its predecessors.


Pssh, I'm barely older than you; I joined OCN at that age.
I started PC gaming properly, with my own gaming PC, when I was like 13.


----------



## fashric

My god, it's just going in one ear and straight out of the other. *THERE IS NOTHING WRONG WITH REQUIREMENTS INCREASING IF THE IMPROVEMENT IS WORTH IT, BUT THIS GAME CLEARLY WILL NOT BE.* They are saying 6GB is for 1080p. 1080p!!! FFS, not 4K, not 8K, but *1080p*. If you believe this has nothing to do with covering their asses for poor PC optimisation/support, then you have been blind to what has been happening since the "next gen" consoles were released, in relation to the game engines being used to make the "next gen" (god I hate that term) games. Also, I'm not sure what age has to do with how this is interpreted, but I've been gaming since the ZX Spectrum days, so yeah...


----------



## perfectblade

Quote:


> Originally Posted by *fashric*
> 
> My god, it's just going in one ear and straight out of the other. *THERE IS NOTHING WRONG WITH REQUIREMENTS INCREASING IF THE IMPROVEMENT IS WORTH IT, BUT THIS GAME CLEARLY WILL NOT BE.* They are saying 6GB is for 1080p. 1080p!!! FFS, not 4K, not 8K, but *1080p*. If you believe this has nothing to do with covering their asses for poor PC optimisation/support, then you have been blind to what has been happening since the "next gen" consoles were released, in relation to the game engines being used to make the "next gen" (god I hate that term) games. Also, I'm not sure what age has to do with how this is interpreted, but I've been gaming since the ZX Spectrum days, so yeah...


When you look at what the PS4 is capable of with the hardware it has, and compare it to the requirements for PCs to achieve the same thing, it's insane.

People keep saying consoles have 8GB of RAM... yes, that is unified RAM. Now compare the PS4's GPU to a PC GPU with 6GB of VRAM... devs hate PC gamers, that's the only conclusion I can reach.


----------



## Zipperly

Quote:


> Originally Posted by *fashric*
> 
> *THERE IS NOTHING WRONG WITH REQUIREMENTS INCREASING IF THE IMPROVEMENT IS WORTH IT BUT THIS GAME CLEARLY WILL NOT BE*


I was wondering if I could borrow your crystal ball?


----------



## GTR Mclaren

Sorry, Shadow of Mordor, but you are just a horrible port.

Not even Skyrim with modded ultra textures consumes more than 3GB at 1080p.


----------



## Zipperly

Quote:


> Originally Posted by *GTR Mclaren*
> 
> Not even Skyrim with modded ultra textures consumes more than 3GB at 1080p.


Actually, that's not true; I have gotten Skyrim to break the 3GB barrier at 1080p with a good number of mods plus ini tweaks. Anyway... Skyrim was built around last gen's consoles, so many of you are missing the boat on this one. It's not an apples-to-apples comparison with anything last gen.

As for SoM, it remains to be seen how the ultra textures will look and what the actual requirements will be.


----------



## Yungbenny911

The VRAM wars, haha. Stutter Dogs was just the beginning. All I'm waiting for is games that require 120GB of VRAM to run...


----------



## SoloCamo

Quote:


> Originally Posted by *Zipperly*
> 
> I was wondering if I could borrow your crystal ball?


If that video posted looks worthy of these specs to you, I'm just lost.. not even trying to be rude in any way - I just can't believe it could be justified as OK when even Crysis 3 / Metro beats it overall while using way less VRAM (and, mind you, they're much older).


----------



## TopicClocker

Quote:


> Originally Posted by *perfectblade*
> 
> when you look at what the ps4 is capable of with the hardware it has and compare it to the requirements for pcs to achieve the same thing it's insane.
> 
> people saying consoles have 8gb of ram...yes that is ram. now compare the ps4 gpu to a pc gpu with 6gb of vram...devs hate pc gamers that's the only conclusion i can reach


These Ultra textures appear to be PC exclusive, so the PS4 won't have them.
It'll probably be running something along the lines of High, so a 3GB GPU should be fine for that.

Unless, you know, they manage to pull off Ultra textures within the PS4's available 4.5GB - 5.5GB of memory.











Power isn't the problem, there's more than enough of it, the video memory is the big problem.
Quote:


> Originally Posted by *SoloCamo*
> 
> If that video posted looks worthy of these specs to you I'm just lost.. not even trying to be rude in any way - I just can't believe it could be justified as ok when even Crysis 3 / Metro kills it overall while using way less vram (and mind you, are much older).


The IGN video is a much better quality video.


----------



## perfectblade

Quote:


> Originally Posted by *TopicClocker*
> 
> These Ultra textures appear to be PC exclusive, so the PS4 wont have them.
> It'll probably be running something along the lines of High so thus a 3GB GPU should be fine in accomplishing that.
> 
> Unless, you know, they manage to pull off Ultra textures with 5GB of the PS4's available 4.5GB - 5.5GB available memory.


I know that; it's still insane though. That amount of power should be able to do 4K easily, but it will never happen with the laziness of devs.


----------



## TopicClocker

Quote:


> Originally Posted by *perfectblade*
> 
> i know that, it's still insane though. that amount of power should be able to do 4k easily, but it will never happen with the laziness of devs


Yeah it is insane.

The developers have been starved, having to work with 6-8 year old technology as the benchmark for their games; the Xbox 360 is almost a decade old, shocking really.

The next generation consoles have set a much higher and greatly needed benchmark.
Games like Infamous Second Son really show that.

EuroGamer: Digital Foundry vs. inFamous: Second Son
Quote:


> At last, we have an entire city at our disposal with more than enough detail to support any experience you could imagine. *The power to deliver visuals of this calibre has existed for years on the PC, but it's only when developers are freed from the shackles of last-generation consoles that we can begin to see what the DX11-class hardware is truly capable of.*


----------



## Zipperly

Quote:


> Originally Posted by *SoloCamo*
> 
> If that video posted looks worthy of these specs to you I'm just lost.. not even trying to be rude in any way - I just can't believe it could be justified as ok when even Crysis 3 / Metro kills it overall while using way less vram (and mind you, are much older).


A couple of things: the video posted was not running the Ultra texture setting...... it was running High, so no one even knows how it will look with Ultra textures enabled. Second, a game will never look as good in a YouTube video as it will when you actually play it on your own monitor. Third, neither Metro nor Crysis 3 is an open-world game. It never ceases to amaze me how many people overlook that one very important aspect when comparing games......... and yes, I am older.


----------



## Zipperly

Quote:


> Originally Posted by *SoloCamo*
> 
> If that video posted looks worthy of these specs to you I'm just lost.. not even trying to be rude in any way - I just can't believe it could be justified as ok when even Crysis 3 / Metro kills it overall while using way less vram (and mind you, are much older).


Sigh...... once again, do your homework. That video is not using the Ultra settings, and games always look better when you actually play them rather than watching them on YouTube. Again, you are comparing an open-world game (SoM) to Crysis 3 and Metro, which are not open-world games. If you are going to compare SoM to something else, then compare it to another open-world game so it's more of an apples-to-apples comparison.


----------



## TheBlindDeafMute

Doesn't matter how good it looks...

The Witcher 3 will look better


----------



## perfectblade

Quote:


> Originally Posted by *TopicClocker*
> 
> Yeah it is insane.
> 
> The developers have been starving, having to work around 6-8 year old technology as a benchmark for their games, the Xbox 360 is almost a decade old, shocking really.


I know that. I'm just saying that the gap between what PCs and consoles can achieve with given hardware is insane, keeps getting worse, and seems to be due to lazy devs. But keep not addressing that point.


----------



## Zipperly

Quote:


> Originally Posted by *TheBlindDeafMute*
> 
> Doesn't matter how good it looks...
> 
> The Witcher 3 will look better


+1 Don't doubt that at all, man; The Witcher 3 is looking great.


----------



## Zipperly

Quote:


> Originally Posted by *perfectblade*
> 
> I know that. I'm just saying that the gap between what PCs and consoles can achieve with given hardware is insane, keeps getting worse, and seems to be due to lazy devs. But keep not addressing that point.


The consoles also do not always run at 1080p and 60fps, never mind the fact that we usually get better-looking textures and can turn up settings such as AA, AF, ambient occlusion, etc. It's not exactly an apples-to-apples comparison in that department. Games/ports will be more demanding for us because we can run higher settings than the console counterpart; that has nothing to do with lazy devs.

I could take BF4, run it at the same settings and resolution the PS4 uses, and probably about double my framerates compared to the settings/resolution I am currently running on my PC.


----------



## KenjiS

Quote:


> Originally Posted by *fashric*
> 
> My god, it's just going in one ear and straight out of the other. *THERE IS NOTHING WRONG WITH REQUIREMENTS INCREASING IF THE IMPROVEMENT IS WORTH IT, BUT THIS GAME CLEARLY WILL NOT BE.* They are saying 6GB is for 1080p. 1080p!!! FFS, not 4K, not 8K, but *1080p*. If you believe this has nothing to do with covering their asses for poor PC optimisation/support, then you have been blind to what has been happening since the "next gen" consoles were released, in relation to the game engines being used to make the "next gen" (god I hate that term) games. Also, I'm not sure what age has to do with how this is interpreted, but I've been gaming since the ZX Spectrum days, so yeah...


Go back and read the post from the guy WHO HAS THE GAME.

*THE 6GB REQUIREMENT ASSUMES YOU ARE ON ULTRA SETTINGS, WHICH INCLUDE SUPERSAMPLING. THUS IT'S ACTUALLY RENDERING AT 4K AND DOWNSAMPLING TO 1080p.*

I think 6GB of VRAM for 4K is reasonable.


----------



## KenjiS

Quote:


> Originally Posted by *Zipperly*
> 
> Sigh...... once again do your homework. That video is not using the ultra settings and games always look better when you actually play them rather than watching them on youtube. Again, you are comparing an open world game "SOM" to Crysis 3 and Metro which are not open world games. If you are going to compare SOM to something else then compare it to another open world game so its more of an apples to apples comparison.


This. Ultra settings aren't even out yet. The High ones do appear to want about 3GB of VRAM; that's not unreasonable at this point.

That first-page image might not even be real, given that another screenshot of the settings menu lacks said warning...


----------



## TopicClocker

Quote:


> Originally Posted by *perfectblade*
> 
> i know that. i'm just saying that the gap between pcs and consoles can achieve with given hardware is insane and keeps getting worse and seems to be due to lazy devs. but keep not addressing that point


You're looking at other games and devs, but we have yet to see what Shadow of Mordor has to offer. That is the point I'm trying to make, we don't know what it looks like with Ultra Textures and we don't know how it performs.

In the end it could run fine with 3GB, but no one knows.

I agree with what you're saying to some degree, I can pick out at least 3 games this year which have questionable performance and visuals.

Titanfall needs 3GB of VRAM for "Insane" textures. Watch Dogs... I don't know what to say other than something went wrong, on all platforms; the same goes for Dead Rising 3.

Shadow of Mordor? No one knows yet.


----------



## KenjiS

Quote:


> Originally Posted by *TopicClocker*
> 
> Titan Fall needs 3GB vram for "Insane" Textures, Watch Dogs... I don't know what to say other than something went wrong, for all platforms, the same goes for Dead Rising 3.
> 
> Shadow of Mordor? No one knows yet.


Watch Dogs took two massively different engines, threw them in a blender, and tried to get something working out of it... and failed.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *KenjiS*
> 
> Exactly. And furthermore we do NOT know what the Ultra textures look like as they are going to be a day 1 patch thing, So no, Not even the reviewers are getting to see them. For all we know they could look absolutely stupendous and actually be WORTH a 6GB GPU. Or they might not actually need it. The issue is people are crying poor optimization when theres no evidence of that yet.
> 
> Poor optimization is Watch_Dogs, Where folks like me had to drop to an Xbox 360-level of graphics quality to get it -somewhat- running and it still stuttered like a you know what because its using an absurd amount of VRAM. Shadow of Mordor on the other hand I can see using 3gb of VRAM for High settings(Which is all the footage we've seen so far..), the stuff I've seen from the game looks very good, and it does appear to have large expansive worlds that would require large textures.
> 
> We keep bringing up Crysis 3 over and over and over, That horse is getting a bit thin from all the beating. Perhaps we need to accept that not every game is Crysis 3. Its also likewise not fair to compare Crysis 3 which is a tiny small fairly linear world with an open world game
> 
> The thing I dont get is that I've seen this happen before over and over again, Suddenly theres a big jump in specs needed to run the newest games and a chasm appears, and usually that chasm includes even fairly recent hardware because the latest games require well.. insert something here...Be it more shader units or VRAM or whatever.. Then they plateau for a while. Then theres a big jump again. Its just how it is.
> 
> Folks are just mad because until about January of this year the opinion was "2gb is all the VRAM you'll ever need" and now thats simply not the case, For whatever reason, Games want more than 2gb of VRAM, Yes its probubly partially because the new consoles have a lot of Shared RAM to use, but it could also be that simply put, to push graphical fidelity higher this is specifically what was needed. Visual gains are not always linear in terms of performance requirements. The higher you push that bar, the more you need to push it higher.
> 
> Give it 6 months, The 960 Ti or whatever will probubly be loaded with 4gb of VRAM for $250 and the 980 Ti will probubly have 8gb (At $1000)


Forget about poor optimization for a second here....

I've run AA at 4K, for Pete's sake, with 780 Ti SLI...

Can you show me one scientific, well-done benchmark example of a VRAM wall? Because I've never seen one... the VRAM limit is the PC equivalent of the Kennedy assassination.

I mean, 6GB for 1080p?? Does that sound anywhere near logical? What kind of advanced alien-technology game textures are we talking about here?

It's like if Microsoft came out and said: Windows 9 will require 32GB of RAM.


----------



## KenjiS

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Forget about poor optimization for a second here....
> 
> I've run AA, at 4k for pete's sake, with 780ti sli....
> 
> Can you show me 1 scientific, well done, benchmark example of a Vram wall? Because I've never seen one.....vram limit is the pc equivalent of the kennedy assassination
> 
> I mean 6gb for 1080p?? Does that sound anywhere near logical? What kind of advanced alien technology game textures are we talking about here?
> 
> It's like if microsoft came out and said: windows 9 will require 32gb of ram


Because you have a 780 Ti. You have 3GB of VRAM, so of course you're not hitting a VRAM wall.

Those of us with 2GB cards have found this year that it's VERY iffy at 1080p, and if you're at 1440p or 4K then forget it.

Yes, there indeed WERE benchmarks done of Watch_Dogs using identically clocked cards, one with 2GB and one with 4GB. 2GB stuttered like mad; 4GB ran completely fine. The FPS was broadly similar; the difference was that the 2GB card had to constantly pause to refresh the textures in VRAM. Looking at VRAM usage on the 4GB card, the tester noted the game was consuming 2.4GB at High and 2.7GB at Ultra, which is precisely why 2GB was completely inadequate.

Also, again, the 1080p 6GB figure assumes you're running supersampling, i.e., you're rendering the game at 4K. That's why 6GB.


----------



## SirWaWa

And to think... remember the guys who were saying "oh, 1GB of VRAM is enough, there is no need for more"?
Please stand up so you can be berated.


----------



## FattysGoneWild

That was a long time ago in PC technology terms. Don't even go there!


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *KenjiS*
> 
> Because you have a 780 Ti. You have 3gb of VRAM so of course you're not hitting a VRAM wall
> 
> Those of us with 2gb cards have so far found this year that its VERY iffy at 1080p, and if you're at 4k then forget it
> 
> Also, Again, the 1080p 6gb figure is assuming you're running supersampling, IE, you're running the game at 4k. Thats why 6gb.


"Iffy" is not cold hard facts, I'm afraid... so let's try this:

Whatever game you **believe** you're having VRAM issues with... compare benchmarks at 1080p between a 770/680 and a 280X/7970. Bet they're about the same.

I was under the impression it was 6GB at 1080p for the textures; I wasn't aware of the supersampling part. But even still, that's a hell of a lot of consumption for little old 1080p.


----------



## FattysGoneWild

This requirement is geared towards people with too much time and money, along with e-peen status to maintain. Just think: the Titan and 780/Ti are outdated that fast. Those are the kind who will gladly burn the money to upgrade once again, that fast. Suckers, I say. Even 980 owners: before the end of the year the Ti will be out and the 980 is old again. But hey, keep on buying those fancy cards to play unoptimized games that are completely overkill. It keeps the market going and GPU vendors love you for it. More money than common sense. Reminds me of PC gaming back in the old days: very expensive to play the latest and greatest in all of its glory. Which will just drive people once again to consoles, and then we have the age-old argument all over again.


----------



## JoHnYBLaZe

Do any of the self-appointed "experts" here on VRAM usage have any CLEAR-CUT, RESPECTABLE, SCIENTIFIC DATA on VRAM limitations...?

Anyone at all?

Because there's a lot of people planting their flag on the moon in this thread.....LoL


----------



## KenjiS

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> "Iffy" is not cold hard facts, I'm afraid.....so lets try this:
> 
> Whatever game you **believe** your having vram issues with.....compare benchmarks at 1080p between a 770/680 and a 280x/7970....bet they're about the same
> 
> I was under the impression it was 6gb at 1080p for the textures, I wasn't aware of the supersampling part, but even still...that's a hell of a consumption for little old 1080


Went and found the Watch Dogs comparison I spoke of:

http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,7.html

It consumes 2.3GB of VRAM at 1440p, minimum, with High textures.

I can also verify that on TWO systems with 2GB cards, Ultra + 1080p was unplayable, with VRAM pegged at over 2004MB. The only way to get it down below that was to drop to the most basic textures, etc. Then the problems suddenly went away. But moving one notch up pretty much murdered my VRAM usage, causing massive amounts of stutter.

If you want more evidence, here's another one I found with more comparisons, at 1080p now too.

It's in Russian but the images are in English; scroll down to the VRAM results. Again we see the game consuming a ton of VRAM.

http://gamegpu.ru/action-/-fps-/-tps/watch-dogs-test-gpu.html

At 1920x1080, High textures with no AA is over 2GB on both tested cards, one being a Radeon with 4GB and the other a 780 Ti with 3GB. Ultra textures with no AA jumps that up to 2.9GB...

And then we can have all the users chime in: people with 3GB and 4GB cards had fewer issues than people with 2GB cards, who found the game unplayable. Its status as a VRAM hog is noted at just about every PC gaming benchmark site.

Titanfall was similar, as was Wolfenstein; both required and made use of 3GB+ of VRAM.

-edit- And from my personal collection I'd vote that Total War: Rome II hits VRAM walls a lot too, though not as badly as the other examples.

http://gamegpu.ru/rts-/-strategii/total-war-rome-ii-patch-2-test-gpu.html

Less so at 1920x1080, more so at 2560x1600.

It's just that in the PC world 2GB has been enough for quite a while now, and 4GB was considered excessive. Graphics cards traded VRAM capacity for speed (see the 780 Ti) because games simply did not NEED more.

This is no longer the case. 2GB is barely enough at 1080p for the latest titles, and forget it at 1440p or 4K.
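For a rough idea of why "one notch up" on textures can murder a VRAM budget, here's some generic texture-size arithmetic (the block-compression ratio and mip overhead are standard textbook assumptions, not tied to any particular game's asset pipeline):

```python
# Illustrative texture memory math (generic DXT/BC-style assumptions,
# not any specific game's asset pipeline).
def texture_mb(size, bytes_per_texel=4, compression=1, mipmaps=True):
    """Memory in MB for one square texture of side `size` texels."""
    base = size * size * bytes_per_texel / compression
    # A full mip chain adds roughly 1/3 on top of the base level.
    return base * (4/3 if mipmaps else 1) / 2**20

print(texture_mb(2048))                  # ~21 MB uncompressed
print(texture_mb(4096))                  # ~85 MB -- 4x per resolution notch
print(texture_mb(4096, compression=4))   # ~21 MB with 4:1 block compression
```

Each resolution notch quadruples the per-texture cost, so a few hundred resident textures at the higher notch is already gigabytes; compression buys that factor back, which is why whether a game ships compressed ultra textures matters so much.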


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *KenjiS*
> 
> Went and found the Watch Dogs comparison i spoke of
> 
> http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,7.html
> 
> Consumes 2.3gb of VRAM at 1440p. Minimum. With High Textures
> 
> Can also verify on TWO systems with 2gb cards Ultra + 1080p was unplayable, VRAM pegged over at 2004MB. The only way to get it down below that was to drop to the most basic textures and etc. Then suddenly the problems went away. But moving one notch up pretty much murdered my VRAM usage, Causing massive amounts of stutter.
> 
> If you want more evidence, Heres another one I found with more comparisons, at 1080p now too.
> 
> In Russian but the images are in English, Scroll down to the VRAM results.. Again we see the game consuming a ton of VRAM
> 
> http://gamegpu.ru/action-/-fps-/-tps/watch-dogs-test-gpu.html
> 
> 1920x1080, High Textures No AA is over 2gb on both tested cards. One being a Radeon with 4gb and the other is a 780 Ti at 3gb. Ultra Textures with no AA jumps that up to 2.9gb...
> 
> And then we can have all the users chime in, People with 3 and 4gb cards had less issues than people with 2gb cards who found the game unplayable. Its status as a VRAM hog is noted at just about every single PC gaming benchmark site.
> 
> Titanfall was similar, As was Wolfenstein, Both required and made use of 3gb+ for VRAM.
> 
> -edit- and in my personal collection Id vote Rome Total War II hits VRAM walls a lot too. Not as bad as other examples.
> 
> http://gamegpu.ru/rts-/-strategii/total-war-rome-ii-patch-2-test-gpu.html
> 
> Less so at 1920x1080 more so at 2560x1600
> 
> Its just in the PC world 2gb has been enough for quite a while now, 4gb was considered excessive. Graphics cards traded VRAM capacity for speed, ie on the 780 Ti because the games did not NEED more than 2gb
> 
> This is now no longer the case. 2gb is barely enough at 1080p for the latest titles and forget it at 1440p or 4k.


Sorry, but none of this is measurable proof...

Watch Dogs stutters like crazy even on Titans. Hell, Watch Dogs is a complete mess no matter how much VRAM you throw at it...

Titanfall, the least "advanced"-looking game of the ones you mentioned, has been shown to cache extra VRAM, but you should be able to tell that just by looking at it...

Wolfenstein runs on the OLD Rage engine, which is terribly unoptimized; it doesn't even support SLI.

And there are old forum threads about Total War optimization problems and FPS fluctuations; even guys with 4GB 680s were still having problems.

We can't just go around blaming VRAM without any investigation or evidence...


----------



## KenjiS

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Sorry, but none of this is measurable proof....
> 
> Watchdogs stutters like crazy, even on titans, hell, watchdogs is a complete mess no matter how much Vram you throw at it.....
> 
> Titanfall the least "advanced" looking game of what you mentioned has been shown to cache extra Vram, but you should be able to tell that just by looking at it....
> 
> Wolfenstein runs on the OLD rage engine, which is terribly un-optimized, it doesn't even support SLi
> 
> And there are old forums concerning Total war optimization problems and FPS fluctuations, even guys with 4gb 680's were still having problems
> 
> We can't just go around blaming Vram without any investigation or evidence....


Then what proof do you want? I showed you tests and graphs demonstrating that Watch Dogs consumes X amount of VRAM on various cards, demonstrating that it does, indeed, consume more than 2GB and in some cases more than 3GB of VRAM.

If it consumes more VRAM than is available, it's going to introduce some form of performance degradation.

You asked for examples. I gave examples. I gave evidence to back my examples. You basically say "Nope! Not a fair argument! Those games are -insert something here- and that doesn't count!"

So basically you're saying that unless Gabe Newell lays Half-Life 3 on us and it needs 4GB of VRAM, it's poorly optimized crap. (This is meant to be a joke.)

The only synthetic sort of benchmark that could prove such a thing would need to have preset VRAM loading limits, i.e., select 2, 4, or 6GB of loading, then run it through your card and watch the corresponding output. This would demonstrate the relationship between VRAM loading and performance.
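A toy version of that benchmark idea can be sketched in a few lines: simulate an LRU texture cache, sweep the "VRAM" cap, and watch upload stalls spike once the per-frame working set stops fitting. This is a pure-CPU illustration with made-up numbers, not a real GPU benchmark:

```python
# Toy simulation of a preset-VRAM-cap benchmark: frame cost jumps once the
# frame's texture working set no longer fits. Purely illustrative numbers --
# a real test would allocate on an actual GPU.
from collections import OrderedDict
import random

def simulate(vram_mb, working_set, texture_mb=32, frames=200, seed=1):
    rng = random.Random(seed)
    cache = OrderedDict()               # texture id -> resident (LRU order)
    capacity = vram_mb // texture_mb    # how many textures fit on the card
    uploads = 0
    for _ in range(frames):
        # Each frame touches a random half of the level's textures.
        for tex in rng.sample(range(working_set), k=working_set // 2):
            if tex in cache:
                cache.move_to_end(tex)  # already resident, cheap hit
            else:
                uploads += 1            # stall: re-upload over the bus
                cache[tex] = True
                if len(cache) > capacity:
                    cache.popitem(last=False)  # evict least-recently-used
    return uploads / frames             # average stalls per frame

for cap in (2048, 4096, 6144):
    print(f"{cap} MB cap: {simulate(cap, working_set=100):.1f} uploads/frame")
```

A real test would replace the simulated cache with actual GPU allocations, but the expected shape is the same: performance falls off a cliff right at the capacity boundary rather than degrading smoothly, which is why average-FPS charts alone can hide a VRAM wall.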


----------



## fashric

Quote:


> Originally Posted by *KenjiS*
> 
> Go back and read the post with the guy WHO HAS THE GAME
> 
> *THE 6GB REQUIREMENT IS ASSUMING YOU ARE ON ULTRA SETTINGS WHICH INCLUDES SUPERSAMPLING. THUS ITS ACTUALLY RUNNING AT 4k AND DOWNSAMPLING TO 1080p*
> 
> I think 6gb of VRAM for 4k is reasonable.


I think you need to read the original post and the pic in it again... it doesn't say anything about supersampling. It even explicitly states a rendering resolution of 1080p. If it does turn out that this requirement is only for 4K resolutions, it still seems very excessive (considering 99% of current PC games don't even hit 3GB at 4K). They will have to be the absolute best textures ever put in a game to justify 6GB.


----------



## KenjiS

Quote:


> Originally Posted by *fashric*
> 
> I think you need to read the original post and the pic in it again....doesn't say anything about supersampling.


Quote:


> I doubt even the ultra-textures will use 6 Gigs on 1080p however - High uses 2.7 Gigs but with downsampling from 2720 x 1700 @ 1920 x 1200. Reason because I didn't try 1080 is because Mordor recognizes my downsampling-resolution as native and will only use percentages of that- so instead of 1080p I'll get some really weired ones like 1927 x 1187 or something like that (and I can't be arsed to fix it right now).
> 
> There is however a ingame-supersampling option so you can set the internal rendering to 200 % and will get effectivly 4K-Res while still being in 1080p. If I'd use that coming from 1080p instead of 1700p I'd wager you'd be around 3 Gigs. So the recommendation seems plausible.
> 
> So with Ultra-Textures taking somewhat more VRAM and Supersampling enabled, 4 Gigs might be just not enough, making the jump to 6 Gigs logical. - Remember: With Supersampling you're effectively running 4K not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 Gigs running this game with everything set to ultra, including textures but disabled supersamping.


I.e., the 6GB number assumes you are running supersampling. At least that's what I'm getting out of that. It could also assume you are using AA.

I'm thinking Ultra will use more like 3.2-3.5GB.


----------



## fashric

Quote:


> Originally Posted by *KenjiS*
> 
> Then what proof do you want. I show you tests and graphs demonstrating that Watch Dogs is consuming X amount of VRAM on various cards. Demonstrating that it does, indeed, consume more than 2gb and in some cases, more than 3gb of VRAM.
> 
> If it consumes more VRAM than is available, its going to introduce some form of performance degradation.
> 
> You asked for examples. I gave examples. I gave evidence to back my examples. You basically say "Nope! Not a fair arguement!, Those games are -insert something here- and that doesnt count!"
> 
> So basically you're saying unless Gabe Newell lays Half Life 3 on us and it needs 4gb of VRAM its poorly optimized crap (this is meant to be a joke)
> 
> The only Synthetic sort of benchmark that could prove such a thing would need to say, Have preset VRAM loading limits, IE, select 2, 4, or 6gb loading and then run it through your card and watch the corresponding output, This would demonstrate the relationship between VRAM loading and performance


Watch_Dogs performs poorly on cards with 3, 4, or 6GB of VRAM. Just pop along to the Ubisoft or GeForce forums. As already said, you are looking at games with major performance issues and simply blaming VRAM when a lack of VRAM isn't what's causing them.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *KenjiS*
> 
> Then what proof do you want. I show you tests and graphs demonstrating that Watch Dogs is consuming X amount of VRAM on various cards. Demonstrating that it does, indeed, consume more than 2gb and in some cases, more than 3gb of VRAM.
> 
> If it consumes more VRAM than is available, its going to introduce some form of performance degradation.
> 
> You asked for examples. I gave examples. I gave evidence to back my examples. You basically say "Nope! Not a fair arguement!, Those games are -insert something here- and that doesnt count!"
> 
> So basically you're saying unless Gabe Newell lays Half Life 3 on us and it needs 4gb of VRAM its poorly optimized crap (this is meant to be a joke)
> 
> The only Synthetic sort of benchmark that could prove such a thing would need to say, Have preset VRAM loading limits, IE, select 2, 4, or 6gb loading and then run it through your card and watch the corresponding output, This would demonstrate the relationship between VRAM loading and performance


I'm not downing you here, just asking you to be a little more skeptical of this situation...

Some games cache VRAM, meaning they're not actually using what's being reported.

Watch Dogs, as I said before, will stutter on Titans. Does that not raise a red flag for you?

I would blame any of the games you mentioned WAAAAAYYYY before I blamed my card's VRAM if I were you...

Take into account your own statement: "introduce some form of performance degradation". There's not even a clear-cut mathematical consequence, for crying out loud. Aren't you at least curious about this?


----------



## KenjiS

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> I'm not downing you here, just asking you to try being a little more skeptical of this situation....
> 
> Some games can cache vram....meaning they're not actually using whats being reported
> 
> Watch dogs, as I said before,....will stutter on titans....does that not raise a red flag for you?
> 
> I would blame any of the games you mentioned WAAAAAYYYY before I blamed Vram of my card if I was you.....
> 
> Take into account your own statement...."introduce some form of performance degradation"....there's not even a clear cut mathematical consequence for crying out loud....aren't you at least curious about this?


I dunno then, I really don't. What I do know is that developers are telling us 2GB isn't going to cut it, The Evil Within and Shadow of Mordor being two examples. The Watch_Dogs devs said similar, and Wolfenstein locked out graphics options with less than 3GB of VRAM.

There has to be a reason for this beyond the conspiracy theories.

And what I also know is that in some of the games I cited above, the nice new 970 I ordered is going to kick my 770 SLI to the curb or come very close to its performance, which simply doesn't add up unless there is some limitation somewhere between the two setups. In raw power two 770s should trump a single 970, but that's not necessarily the case. So what other differences are there? Well, one of the big improvements on the 970 IS the memory and the way it's handled... so there's one theory.

One way to test would be to pit, say, a 4GB GTX 770 against a 2GB one in a game and see the performance difference.

-edit- Aha, thought I had something like that:

http://www.hardocp.com/article/2014/06/11/asus_strix_gtx_780_oc_6gb_review_exclusive/3#.VCelL_ldUQE

Check out Apples to Apples: the 6GB Strix is definitely smoother and has a performance advantage over a standard 3GB 780 at 1440p in Watch Dogs...

The trend continues in Battlefield 4, Crysis 3, Tomb Raider and FC4.

Clock speeds are very close; 889MHz vs 863MHz should not be producing the 20-ish% differences in these tests... VRAM is likely the reason.


----------



## xxroxx

Quote:


> Originally Posted by *KenjiS*
> 
> I dunno. I really dont then. What I do know is developers are telling us that 2gb isnt going to cut it, The Evil Within, Shadow of Mordor being two examples. Watch_Dogs devs said similar. Wolfenstein locked out graphics options with less than 3gb of VRAM
> 
> There has to be a reason for this beyond the conspiracy theories as well
> 
> And what I also know is that in some games I cited above, the nice new 970 I ordered is going to beat my 770 SLI to the curb or come very close to its performance. Which simply doesnt add up unless there is some limitation somewhere between the two cards, In raw power two 770s should trump a single 970, But thats not necessarily the case. So what other differences? Well one of the big improvements on the 970 IS the memory and the way its handled... So theres one theory
> 
> One way to test would be to pit say, a 4gb GTX 770 vs a 2gb one in a game and see the performance difference.
> 
> -edit- Aha, Thought I had something like that
> 
> http://www.hardocp.com/article/2014/06/11/asus_strix_gtx_780_oc_6gb_review_exclusive/3#.VCelL_ldUQE
> 
> Check out Apples to Apples, the 6gb Strix is definitely smoother and has a performance advantage against a standard 3gb 780 at 1440p in Watch Dogs...
> 
> Trend continues in Battlefield 4, Crysis 3, Tomb Raider and FC4.
> 
> Clock speeds are very close, 889 vs 863 should not be producing the 20-ish% differences in tests... VRAM is likely the reason


Crysis 3 is ok-ish to be displaying a difference between 6 vs 3GB of VRAM, and maybe BF4 too. Tomb Raider, well, that I do not know; maybe caching or something, because the game's textures weren't anything great - but not bad either. Now Watch_Dogs... Watch_Dogs is a mess. Stop taking it as an example, for gods sake! If a game stutters on a 770, 780, 780 Ti, 290X AND a Titan, then the problem is in the game, not in the cards - except if the game is the new graphics benchmark for games, which is NOT the case with Watch_Dogs and its pre-baked shadows and window reflections.

The thing is that new games will cut the 360/PS3 from their list or have a version made in another engine for those platforms, so the ones made for the X1 and PS4 - both of which have around 8 gigs of unified memory usable as VRAM - will be the ones ending up ported to PC. Lazy devs and crappy publishers who just want a bit more money exploit PC gamers by selling us games that require greater hardware than consoles to run them at a slightly higher res and ~60 fps. That's wrong and should NOT be accepted.

I was super hyped about W_D. I did not know about Ubisoft's latest disagreements with PC and its crowd, but as details about the game started surfacing, I couldn't avoid the comments about Ubi's neglect of us. So I did torrent the game, beat it - with it stuttering like hell - and did NOT buy it, for the simple reason that it was an unfinished product - and still is. If they fix the game, I'll buy it in a heartbeat.

And that is what I believe should be done with all crappy console ports, instead of accepting them and buying more expensive hardware.


----------



## KenjiS

Quote:


> Originally Posted by *xxroxx*
> 
> Crysis 3 is ok-ish to be displaying a difference between 6 vs 3 GB VRAM. Maybe BF4 too. Tomb Raider, well, that I do not know. Maybe caching or something, because the game's textures weren't anything great - but not bad also. Now Watch_Dogs... Watch_Dogs is a mess. Stop taking it as a example, for gods sake! If a game stutters on a 770, 780, 780ti, 290X AND a Titan, than the problem is in the game, not in the cards - except if the game is the new graphic benchmark for games, which is NOT the case with Watch_Dogs and its pre-baked shadows and window reflections.
> 
> .


But just about every site is including it in their test suites as an example of a typical next-gen console game, so it DOES count. Plus the patch mostly fixed things on single-GPU rigs. It even ran a bit better on mine, but it still can't overcome the VRAM limitation, a limitation that even the game's developer pointed out.

I completely disagree with pirating the game, however. That is just feeding Ubisoft's "all PC gamers are pirates" mentality.

And so what? The Xbox One and PS4 offer the ability TO push graphics further.

I can also argue diminishing returns. It takes more and more power now to make a game look better. We're already at the point where any real advancement is going to require more power than just about anything on the market. People argued Crysis was poorly optimized too, you know, and Far Cry before it. This argument happens over and over again any time a game needs more power to run at its highest settings than the top 5% of people have.

If a game demands more power, it's poorly optimized and thus crap.

If a game runs fine and smooth but doesn't shake up the graphical fidelity, then it's a console port and thus crap.

Either way it makes people seem quite impossible to please, no?

And then at the end of it, we're arguing over a game that isn't out yet. A game that looks pretty darn good at High settings; there's no telling what Ultra might be like. A game where the one guy who's played it has commented on how much VRAM it's using, but also on how well it's playing...

And it's not like we're talking about Ubisoft, who has the WORST track record, here...


----------



## Defoler

Quote:


> Originally Posted by *KenjiS*
> 
> But just about every site is including it in their test suites as an example of a typical next gen console game. So it DOES count. Plus the patch mostly fixed things on single GPU rigs. It even ran a bit better on mine, but still cant overcome the VRAM limitation, a limitation that even the game developer pointed out.


The VRAM limitations were always a bit of a farce.

Game engines will load as many textures as they can onto a card, even if the current scene does not require them. As long as there is enough memory, textures do not get removed from the GPU's VRAM.
When a scene requires 2GB of VRAM to load all its textures (and please note that 2GB of textures is an insane amount, seriously insane), a GPU with 6GB of VRAM will not delete the 2GB of textures it used in the scenes before, or the 2GB from the scenes before that.

This can save time loading scenes, but in general, the impact of 2GB, 4GB or 6GB of memory is not as big as that of the GPU itself. We have seen plenty of cases of a 3GB card performing better than a 6GB card (look at the 780 Ti vs the original Titan).

As long as you have the horsepower, you don't need as much memory as is often claimed. We have seen games take almost the full 4GB of the 290X and still not perform much better than the 780 Ti with just 3GB.

Anyway, loading 6GB of data onto a card is a massive amount. That is like loading the whole game onto the GPU regardless of what you need or don't need, which to me looks more like an optimization problem and less like a "well, you wanted quality" answer.
I believe 4GB or even 3GB cards will be more than enough to run the game, as long as you have the power to push it.
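A minimal sketch of that caching behaviour (numbers invented for illustration): because evictions only start once the card is actually full, the "usage" a monitoring tool reports tends to track capacity, not need.

```python
# Sketch of the opportunistic-caching point above: an engine that keeps old
# scenes' textures resident until the card fills up will report VRAM "usage"
# near whatever the card has. All numbers are made up for illustration.
def reported_usage_mb(vram_mb, scene_mb, scenes_visited):
    # Each scene's textures stay resident until capacity is hit, so reported
    # usage is the lesser of "everything seen so far" and the card's VRAM.
    return min(scene_mb * scenes_visited, vram_mb)

for card in (2048, 4096, 6144):
    print(card, reported_usage_mb(card, scene_mb=800, scenes_visited=10))
# A monitoring tool would show ~2GB, ~4GB and ~6GB "used" respectively,
# while every card is really rendering the same 800MB scene.
```

This is why raw usage readings can't distinguish a genuine requirement from a cache that simply grew to fill the available space; only a performance drop on the smaller card would show a real wall.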


----------



## KenjiS

Quote:


> Originally Posted by *Defoler*
> 
> The VRAM limitations were always a bit of a farce.
> 
> Game engines will load as much textures as they can onto a card, even if current scene does not require them. As long as there is enough memory, textures do not get removed from the GPU VRAM.
> When a scene requires 2GB of VRAM to load all textures (and please note that 2GB of textures is an insane amount. Seriously, insane), if the GPU has 6GB of vram, it will not delete the 2GB of textures it used on the scenes before, or the 2GB of vram from the scenes before that.
> 
> This can save time loading scenes, but in general, 2GB, 4GB or 6GB of memory impart is not as big as the GPU itself. We have seen more than enough a 3GB card performing better than a 6GB card (look at the 780 ti vs the original titan).
> 
> As long as you have the horse power, you don't need as much memory as overly stated in many cases. We have seen games take almost the full 4GB memory of the 290x and still doesn't perform much better than the 780 ti with just 3GB.
> 
> Anyway, loading 6GB of memory onto a card is a massive amount. This is like loading the whole game onto the GPU regardless of what you need or don't need. Which to me looks like more of an optimisation problem and less of "well you wanted quality" answer.
> I believe a 4GB or even 3GB cards will be more than enough running the game as long as you have the power to push it.


Well, let me put it another way. One reason I'm dumping my SLI (beyond being disappointed overall in its performance in the things I regularly play, and the fact that a 970 handles several titles I enjoy just as well if not better than my SLI setup) is that everyone told me my limitation at this point is not raw power but the fact that games want more and more VRAM. This is backed up by every expert review site and everything I've read that says "2GB is just not enough for 1440p+ gaming".

One theory is that the games I am specifically talking about stress the memory subsystem more than the processing, so Maxwell's superior memory handling at these resolutions benefits more than the raw grunt of the 770 SLI setup.

I'm not disagreeing with you that it's possible for games to use less. But they're not, either because this way is easier, or because it happens to actually run better this way due to engine limitations.

Regardless, I don't think 2GB is enough anymore; 4GB is the new 2GB for the foreseeable future. I do not disagree that 6GB seems a bit much, and I agree with the earlier assessment that 3-4GB should handle Ultra fine...


----------



## SONICDK

Should I hold off on the purchase of a 970?


----------



## KenjiS

Quote:


> Originally Posted by *SONICDK*
> 
> should i hold of the puchase of a 970 ?


No. I highly doubt a 970 can't run this. If it really bugs you, wait till Tuesday... I'll have my 970 and will give a report on how it plays.


----------



## Flames21891

Just gonna say it: Watch_Dogs is a mess. I doubt most people having stuttering issues with it are hitting a VRAM wall. I ran it on a single 2GB 680 at 1080p and didn't get any stuttering.

So far I myself have yet to hit a VRAM wall on 2GB. The problem is that it's very hard to discern what a game's minimum VRAM requirement is, because simply looking at usage isn't very accurate due to caching. Will 2GB hold up indefinitely? Very unlikely. In fact, despite not having issues yet, I plan on retiring my 680s once big Maxwell arrives, mostly for the extra VRAM. For now, however, I doubt 3-4GB cards are in any danger.


----------



## xxroxx

Quote:


> Originally Posted by *KenjiS*
> 
> But just about every site is including it in their test suites as an example of a typical next gen console game. So it DOES count. Plus the patch mostly fixed things on single GPU rigs. It even ran a bit better on mine, but still cant overcome the VRAM limitation, a limitation that even the game developer pointed out.
> 
> I completely disagree with pirating the game however. That is just feeding Ubisoft's "All PC gamers are pirates" mentality.
> 
> And so what, the Xbox One and PS4 offer the ability TO push graphics further.
> 
> I can also argue diminishing returns. It takes more and more power now to make a game look better. We're already at the point where any real advancement is going to require more power than just about anything on the market. People argued Crysis was poorly optimized too you know, and Far Cry before it. This arguement happens over and over again anytime a game needs more power to run at its highest settings than the top 5% of people have.
> 
> If a game demands more power, its poorly optimized and thus crap
> 
> If a game runs fine and smooth but doesnt shake up the graphics fidelty, Then its a console port and thus crap.
> 
> Either way it makes people seem quite unpleasable no?
> 
> And then at the end of it, we're arguing over a game that isnt out yet. A game that looks pretty darn good at High settings, Theres no telling what Ultra might be like. A game that one guy whos played it has commented on how much VRAM its using, but also how well its playing...
> 
> And its not like we're talking Ubisoft who has the WORST track record here..


Yeah, unfortunately that is happening. And that's horrible, because it effectively brainwashes less informed people: it makes them think that Watch_Dogs is revolutionary and needs a monster GPU to run without stutter. Yet we are informed people on a specialized forum, so we all know that isn't true and that W_Ds should not be treated as a benchmark. It is crap, period. And even though patches have come, I still see a lot of people complaining that the stutter has not gone away, even on a single 290X/780 Ti.



Spoiler: About pirating and W_D



Now, when I first got into REAL PC gaming, I did pirate quite a few games; that's true, and I am not proud of it. However, out of the 7, 8 or maybe 9 games I've torrented, I bought all of the ones that were worth it, and honestly I do not feel bad for not buying the other ones. I am starting a Game Design course, and I believe that, as with anything else, if a product does not match what it advertises, it should not be bought and/or the customer should get his money back. In the same way, anything made properly must be paid for, or else it is robbery and you deserve jail for it.

Watch_Dogs delivers a nice - although short - story with normal graphics and nice animation and expression, giving a cinematic feel to the game, and it did make me feel inside of it. The downside? My computer, which probably has 2 or 3 times the processing power of a PS4, cannot freaking run it. The most fun part of the game, free roaming around in your car wrecking things up and escaping the cops - also one of the game's biggest selling points - stuttered like hell, and driving was so f*cked up because of this that I kept crashing into something or someone.

Now, to be fair and wrap things up about W_Ds, I do intend to buy it - but at a price it deserves. So as soon as I see it at a low price, I'll "waste" my money on it. Until then, yeah, I pirated it.

Bioshock 1 and Infinite (didn't play the 2nd) got all of my money. I paid full price for them. Why? Because they are great and I had no problems whatsoever with them. They deserved it. So did Diablo 3, which you can see in my signature along with CS:GO and the great BS:I. Those are just three of the games I've paid for, and I do not mind spending money on games. If they deserve it, I'll pay whatever it costs - well, within a limit, of course. If I can't pay, I won't play.



Also, contrary to current belief, graphical fidelity isn't what makes a game good or not; being good is. In a game whose environment is set in a realistic manner, of course we'll look for graphical fidelity, and with this theme being the most popular, graphics become a great deal of a game. So if a game is announced as a graphics revolution but comes to PC - the platform with the greatest processing capacity - with poor or console-level graphics, of course we will complain.

And the same thing applies when a game requires too much while offering the same as the rest. *BUT* news has surfaced and I learned that *1)* SOM's ultra textures have not yet been unveiled and *2)* the 6GB might be for 1080p + supersampling. So I'll shut my mouth about this game for now.

But the rest of the crappy console ports are crappy console ports, period. Look at what true PC-oriented games are doing: Star Citizen and KC: Deliverance are pushing visual fidelity on current hardware. Hell, Bioshock Infinite also looks great and doesn't require much. I averaged 150 FPS on ultra at 1080p - and at times that game is reaaaally packed, so it's not as if it's a light game.

Well, I've talked too much already. To sum up: disrespectful and lazy console ports should not be accepted as normal or praised as benchmarks. They deserve to be ignored or torrented, IMO, so lazy developers learn that we will not take their bull*****.


----------



## KenjiS

War Thunder is pretty good looking, I might add... and scales extremely well (friends with potato computers run it fine, and it runs GREAT on my rig - it even supports SLI correctly).

That said, I like my Assassin's Creed, and I'm very much anticipating Dragon Age: Inquisition, and even after all the crap I still want to play Watch_Dogs. In fairness, it's the first game I've had such issues with... My move to the 970 is more or less just me trying to ensure I won't have any issues if it takes a few months before SLI is supported properly in these titles. SLI just isn't for me. At the end of the day, whether it's the VRAM or not, a single 970 is a big upgrade from a single 770 at 1440p...

There is still part of me that maybe wants to leave my SLI installed and see if SOM will run with it, even after ordering the 970... -sigh- Just to see if I'm maybe wrong. But I don't think I am...


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *KenjiS*
> 
> Well let me put it another way, One reason im dumping my SLI (Beyond overall being disappointed in the performance of it in things i regularly play and the fact a 970 handles several titles I enjoy just as well if not better than my SLI setup) is that everyone told me my limitation at this point is not raw power but the fact that games want more and more VRAM, This is backed up by every expert review site and everything I've read that says "2gb is just not enough for 1440p+ gaming"
> 
> One theory is the games I am specifically talking about stress the memory subsystem more than the processing, So Maxwells superior memory handling at these resolutions benefits more than raw grunt on the 770 SLI setup
> 
> Im not disagreeing with you that its not -possible- for games to use less. But they're not, either because its easier, or maybe it just happens to actually run -better- this way for some reason due to the engine limitations.
> 
> Regardless I dont think 2gb is enough anymore. 4gb is the new 2gb for the forseeable future.. I do not disagree that 6gb seems a bit much and agree with the earlier assessment that 3-4gb should handle Ultra fine...


In what game do you expect better performance from a 970 compared to 770 SLi?

I'm going to be totally honest with you....unless the game can't use SLi, there's no way on god's green earth it will beat 770 SLi, and if I'm mistaken please prove it, I'd be interested to see that

This is the main reason why I hate this Vram crap....I only upgrade when I can see the BENEFITS IN ACTUAL NUMBERS


----------



## Yungbenny911

When I had my 2GB 770's at 1400MHz/1950MHz and my 4K monitor, I quickly found out that VRAM wasn't the issue; raw power was. At stock speeds most games were unplayable at 4K - they would lock up, slow down, and dip in fps every now and then - but bumping that to 1400MHz really improved gameplay. It was almost like I was playing on a different set of GPUs, and I can bet that only 1% of GK104 users ran their GPUs at such a high OC for daily gaming.

Either this game is going to be VERY demanding to run even at 1080p - and I mean to the point where people with 2GB cards would have to run at medium settings to maintain 60fps+ - or it's just a result of lazy developers. Either way, putting 6GB as the required VRAM for a game at 1080p is raw stupidity. What a way to limit your audience... most users still have GTX 570's in their PCs with 1.2GB of VRAM.


----------



## KenjiS

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> In what game do you expect better performance from a 970 compared to 770 SLi?
> 
> I'm going to be totally honest with you....unless the game can't use SLi, there's no way on god's green earth it will beat 770 SLi, and if I'm mistaken please prove it, I'd be interested to see that
> 
> This is the main reason why I hate this Vram crap....I only upgrade when I can see the BENEFITS IN ACTUAL NUMBERS


http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/16

Company of Heroes 2 doesn't use SLI at all - I can't even bloody play it without disabling SLI completely, which is a pain in the rear. Framerate at 1440p is MEH (26-32... a bit low).

Rome: Total War 2 is weird - roughly equal performance from a single card to my SLI, yes, seriously. Anandtech tested it: 1440p on a 970 = 55.6 (with a very mild OC, mind you, less than most folks are getting), and running the same settings I test out at 58.8 using the same benchmark. A single 770 is 30.5 for me; they got 31.4. So it's not as if the game is using SLI "poorly" (a ~90% improvement is pretty much ideal SLI scaling).

Metro: Last Light is another example in their benches: 69.3 vs. 770 SLI at 61.8...

SLI has not delivered what I wanted at all. I wanted to future-proof my rig a bit and beef it up since I bought a 1440p display, and it hasn't done that. So far this year just about everything I've played either doesn't support SLI or has something else that means I had to turn SLI off to make it work correctly. Anything where SLI is supported and gives a performance boost, I didn't care about actually playing (i.e. Crysis 3, benchmarks, Crysis 3, Battlefield 4, Crysis 3...). I've only had good times with SLI in Rome: Total War 2 and War Thunder... and that's it, really. Beyond that I either didn't need it (Sniper Elite 3, Bioshock Infinite) or something else meant it wasn't helpful (Company of Heroes 2, Watch_Dogs, Wolfenstein).

And if every game that's coming out, for whatever reason, wants/needs more than 2GB of VRAM, then guess what: SLI is useless, because the game is either going to lock me out of settings or it's going to run like absolute turd. So there's that.

Shadow of Mordor is a game I'm quite looking forward to, and the one guy who's actually played it points out it's using 2.7/4GB on his card. I highly doubt my 2GB is going to work out for playing it at High, much less Ultra - especially at 1440p. I'll be happy to keep my GTX 770 in there and try to run it, but I suspect I will not have much luck.

I still stand by this: if the game is designed to run with all of its assets loaded into VRAM rather than system RAM - which seems to be the case with upcoming titles - the resulting performance when you can't do so is going to be poor, because for better or worse that's how the game is designed. At the end of the day, all I really want to do is play the games coming out at the end of the year, at maxed-out settings, at 1440p, without problems or questions about whether my GPU setup is supported - you know, the experience I had before I went to SLI. I don't care about getting some absurd FPS numbers; I care about running or not running something. SLI has not increased my ability to run anything I couldn't run before - if anything, it hurt it. So going from 60fps to 90fps or something silly like that is just that: silly. I have a 60Hz monitor; I don't get a kick out of stupidly high numbers for their own sake. I wasted $350 on a second GPU that barely gets to run at 100% because almost nothing I care about playing actually utilizes it (except War Thunder and Rome: Total War 2, and even then, well... see above).

But please, if you see something else causing my disappointment, tell me.


----------



## Pip Boy

I think I read that the 6GB of VRAM is for the interior castle section, where they load in a 48-bit color, 30-million-DPI texture of half the Bayeux Tapestry.

The 12GB version will be out next year with the full article in bump-mapped 3D.


----------



## Zipperly

Quote:


> Originally Posted by *FattysGoneWild*
> 
> That was a long time ago in pc technology. Don't even go there!


Well, I see someone stood up mighty fast!

Also, who cares how long ago it was... increasing VRAM requirements have always been a part of this game.


----------



## Zipperly

Quote:


> Originally Posted by *FattysGoneWild*
> 
> This requirement is geared towards the people with to much time and money. Along with e-peen status to maintain. Just think. Titan and 780/Ti is outdated that fast. Those are the kind that will gladly burn the money to upgrade once again. That fast. Suckers I say. Even 980 owners. Before the end of the year the Ti will be out and 980 is old again. But, hey keep on buying those fancy cards to play unoptimized games that are completely overkill. It keeps the market going and GPU vendors love you for it. More money then common sense. Reminds of pc gaming going back to the old days. Very expensive to play the latest and greatest in all of its glory. Which will just drive people once again to consoles. Then we have the age old argument once again.


Wow, someone is bitter and disconnected from reality. First of all, a Titan has 6GB of VRAM, so please tell us why you included that particular GPU in your little rant when this discussion is based solely on VRAM requirements? If anything, Titan owners will be able to hang on a whole lot longer. As for the 3GB 780's and 780 Ti's, we still do not know whether they are going to suffer from VRAM limitations until some of these games are out, and even if they do, it is very easy to sell them for enough money to basically purchase a 4GB 970 without spending anything out of pocket. Heck, some 780 Ti's are still going for over $500 on eBay.

You act as if the new cards are $700-$800 and people must sell their current GPUs and still come up with the other half just to upgrade; that could not be further from the truth. Also, when the 980 Ti comes out, who gives a rat's butt what it does to the 980? Will the 980 suddenly become slow and obsolete? No, of course not. Your logic and reasoning skills (or lack thereof) are failing you greatly. Have a great day.


----------



## Imouto

The key here is prolly the enhanced geometry. Aside from the obvious 4K textures that everyone can see at first glance, you have a lot of other maps requiring more and more VRAM: bump maps, spec maps and other stuff. It may also be caused by more complex geometry and tessellation requirements. I work a lot with these kinds of maps, and they can be heavier than the regular diffuse texture most of the time.

Also, the thing is that you're all talking about textures all the time when the main problem is the shaders. Complex shaders + complex textures = VRAM usage skyrocketing.
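To put rough numbers on this - a back-of-the-envelope sketch with assumed sizes (uncompressed RGBA8 maps with full mip chains), not figures from the game:

```python
def texture_vram_bytes(width, height, bytes_per_texel=4, mipmaps=True):
    """Rough VRAM cost of one uncompressed RGBA8 texture.

    A full mip chain adds about 1/3 on top of the base level
    (1/4 + 1/16 + ... converges to 1/3).
    """
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

# One 4096x4096 map with mips is ~85 MiB...
per_map = texture_vram_bytes(4096, 4096)

# ...and a material carrying diffuse + normal + specular maps needs
# three of them, so roughly a dozen unique materials at this size
# already total about 3 GiB before any geometry or framebuffers.
per_material = 3 * per_map
```

Real engines compress and stream aggressively, so these are upper bounds, but they show how quickly "a few extra maps per material" adds up.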


----------



## Saberfang

I don't get how some people seem hyped about the fact that the requirements of this game, and a few others like The Evil Within, are raising the bar for specs. I'm really curious how these awesome *full HD* textures will actually look compared to the standard high ones, and to see a benchmark of how performance is affected, but aren't you forgetting that just last week we got a couple of new GPUs that can't even meet this insane request of 6GB of VRAM???

Yeah, I'm really happy to know that going to a shop right now and spending big bucks on a new rig doesn't mean I can run anything without issues, and that an x70 or x80 can't be called a high-end card because we got a gimmick called Titan in the last generation that needed to be given a use.
This is not progress. I don't care if it's due to poor optimization or there is real proof behind it; it's just silly to say that what you could get yesterday isn't good enough for what you may be playing today, when you should have full performance for at least a couple of years...

It's like saying that, from tomorrow, 8-core CPUs may be a minimum requirement and the only one on the market costs about $1000... Yeah, I'm really hyped about it.


----------



## perfectblade

People seriously need to stop associating specs with graphics. There is no positive correlation between terribly optimized games and quality graphics; if anything, it's the opposite.


----------



## Saberfang

Let me explain my idea better, because I think I may have been misunderstood. What I wanted to say is that we are well aware that a high-end 9xx/7xx - and I dare say a 6xx - has more than enough power for playing at full HD, and we know that we aren't using our hardware at its top potential due to software limitations (for GPUs, just look at Mantle and its performance increase), so I think having games come out now saying we need 6GB of VRAM for one feature is just silly.
Sure, not meeting this requirement doesn't mean you can't play the game, and maybe you will still be able to use the setting, but having the latest *high-end* GPU come out last week and not be able to hit top performance without issues is unacceptable. It may be due to bad optimization, or because some genius at Nvidia thought that by releasing high-VRAM versions later they could milk the market more, or send enthusiasts toward insanely priced $1000+ GPUs, but that still won't be a comfort if this is what more games and developers are going for...

I need to get myself a new rig in a month or two, and I like to keep my systems for years, so this kind of news disappoints me to no end. We already have power in our PCs and we use so little of it, especially when it comes to CPUs and GPUs, so when they try to force the market with these kinds of tricks, it just pisses me off. Optimize software first!


----------



## kx11

I wonder if Windows 8.1's shared memory functions will solve this 6GB VRAM requirement.


----------



## Yungbenny911

Quote:


> Originally Posted by *kx11*
> 
> i wonder if windows 8.1 shared memory functions will solve this 6gb vram


Shader Cache on Nvidia control panel. Maybe that would also help...


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *KenjiS*
> 
> http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/16
> 
> Company of Heroes 2, Doesnt use SLI at all, Cant even bloody play it without disabling SLI completely which is a pain in the rear. Framerate at 1440p is MEH (26-32.. A bit low)
> 
> Rome Total War 2 is weird, Roughly equal performance to my SLI from a single card, Yes seriously. Anandtech tested it, 1440p on a 970 = 55.6 (with a very mild OC mind you, Less than most folks are getting) and running the same settings I test out at 58.8 using the same benchmark. Single 770 is 30.5 for me, they got 31.4. So its not like its "poorly" using SLI (a 90% improvement is pretty much close to ideal from SLI..)
> 
> Metro:Last Light is another example on their Benches, 69.3 VS a 770 SLI at 61.8...
> 
> SLI has not delivered what I desire at all, I wanted to future proof my rig a bit and beef it up since I bought a 1440p display and its not done that, So far this year just about everything I've played either doesnt support SLI or has something else that means I had to turn SLI off to make it work correctly. Anything where SLI is supported and gives a performance boost I didnt care about actually playing(IE Crysis 3, Benchmarks, Crysis 3, Battlefield 4, Crysis 3...). I've only had good times with SLI in Rome Total War 2 and War Thunder..and thats it really... Beyond that I didnt need it (Sniper Elite 3, Bioshock Infinite) or something else meant it wasnt helpful (Company of Heroes 2, Watch_Dogs, Wolfenstein)
> 
> And if every game thats coming out, For whatever reason, Wants/Needs more than 2gb of VRAM, then guess what, SLI is useless because the game is either going to lock me out of settings or its going to run like absolute turd. So theres that.
> 
> Shadow of Mordor is a game I'm quite looking foreward to and the one guy whos actually played it points out its using 2.7/4gb on his card. I highly doubt my 2gb is going to work out to play it at High, much less Ultra. Especially at 1440p. I'll be happy to keep my GTX 770 in there and try to run it, but I am suspecting I will not be having much luck.
> 
> I still stand by that if the game is designed to run with all of its assets loaded into VRAM vs System RAM which seems to be the case with upcoming titles, the resulting performance if you cant do so is going to be poor. Because for good or worse, its how the game is designed, and at the end of the day all I really want to do is play the games coming out at the end of the year, at maxed out settings, at 1440p, Without problems or questions on if my GPU setup was supported, You know, the experience I had before I went to SLI. I dont care about getting some absurd FPS numbers, I care about running or not running something, SLI has not increased my ability to run anything I couldnt before, if anything it hurt it. So going from 60fps to 90fps or something silly like that is just that, silly, I have a 60hz monitor, I dont get a kick out of stupid high numbers for stupid high numbers sake. I wasted $350 on a second GPU that barely gets to run at 100% because almost nothing I care about playing actually utilizes it (Except War Thunder and Rome Total War 2, And even then well.. See above)
> 
> But please, if you see something else causing my disappointment, Please tell me


In the cases you just mentioned you are well justified in your decision....

I just find it ridiculous that all of a sudden a 970 supposedly can't run the textures in Shadow of Mordor because it's not 6GB...

Without any clear scientific data or the slightest reasonable explanation

I tell you...it all reeks of up-selling B.S.


----------



## Zipperly

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Without any clear scientific data or the slightest reasonable explanation


We do not yet know how cards with less than 6GB will perform with Ultra textures. What you said about there being no scientific data or reasonable explanation makes no sense either, because it's actually very simple: if textures are complex enough and high enough resolution, they are going to need more VRAM. However, no one knows whether the 6GB requirement for Ultra textures is justified until the verdict is actually in.
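As an illustration of the scaling involved (assumed figures, not data from the game): doubling a texture's edge length quadruples its footprint, and block compression shrinks the totals but not the growth rate.

```python
def bc7_bytes(width, height):
    """Footprint of a BC7 block-compressed texture: each 4x4 texel
    block is stored in 16 bytes, i.e. 1 byte per texel - a 4:1
    saving over raw 4-byte RGBA8."""
    return (width // 4) * (height // 4) * 16

# Doubling the edge length quadruples the footprint even when
# compressed: 2048x2048 -> 4 MiB, 4096x4096 -> 16 MiB.
small = bc7_bytes(2048, 2048)
large = bc7_bytes(4096, 4096)
```

So an "Ultra" pack that bumps every map up one resolution tier roughly quadruples the texture budget, which is the mechanism behind these jumps in stated requirements.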


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Saberfang*
> 
> Let me explain better my idea because I think I may have been misunderstood. What I wanted to say is that we are well aware that a high end 9xx/7xx and I dare to say a 6xx have more than enought power than we need to play in full hd and we know that we aren't using our hardware at its top potential due to softweare limitation (for VGA just look at Mantle and it's performance increase) so I think that having games coming out now that say that we need 6GB VRAM for a feature its just silly.
> Sure, not meeting this requirements doesn't mean you can't play the game and maybe you will still be able to use this setting, but still having the last *high end* GPU coming out last week and not being able to meet the top performance without issue its unacceptable. It may be due to bad optimization or just because some genius at Nvidia tough that by getting high VRAM version later they could have been able too milk the market more or send enthusiast to the insanely priced over 1000$ budget for a GPU but that will still not be a confort if that is what more games and developers are going for...
> 
> I need to get myself a new rig in a month or two and I like to keep my systems for years so this kind of news disappoints me to no end. We alredy have power in our PCs and we use it so little, expecially when it comes to CPUs and GPUs, so when they try to force the market with this kind of tricks, it just pisses me off. Optimize software first!


You just stated Nvidia's problem.....

Cards are absolutely BLITZING 1080p right now and the margin is only growing...

How then....to create more demand?

Have devs make games that CACHE a ton of VRAM.... BOOM... card obsolete.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Zipperly*
> 
> We do not yet know how cards with less than 6gb will perform with Ultra textures, the next thing you said without scientific data or reasonable explanation makes no sense either when its actually very simple that if textures are complex enough and extremely high resolution then it is going to need more vram. However no one knows yet if the ultra texture requirements for 6gb will be justified or not until the verdict is actually in.


Zipper....."its actually very simple that if textures are complex enough and extremely high resolution then it is going to need more vram"

When has this ever happened and been documented and explained?

Also, before you answer that...what is the clear and concise consequence of running out of Vram?

Is it not exactly the same consequence as the card not being fast enough? How do we know which is which?

How do we know what is cached and what is actually being used?

http://www.tomshardware.com/reviews/gigabyte-geforce-gtx-titan-black-ghz-edition,3821-8.html

^^^ 780 Ti vs. Titan Black, which has DOUBLE the VRAM, at 4K....

Tell me again what is...."actually very simple" about this?
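The cached-vs-used distinction is a fair one: engines tend to keep textures resident until something forces an eviction, so the "usage" a monitoring tool reports is an allocation total, not a working set. A toy model of that behavior (purely illustrative, no real engine works exactly like this):

```python
from collections import OrderedDict

class TextureCache:
    """Toy model: textures stay resident after use until memory
    pressure evicts them, so 'resident bytes' (what a monitoring
    tool reports) can far exceed the working set a frame needs."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.resident = OrderedDict()  # texture id -> size, LRU order

    def touch(self, tex_id, size):
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # mark recently used
            return
        # Evict least-recently-used textures until the new one fits.
        while self.resident and sum(self.resident.values()) + size > self.capacity:
            self.resident.popitem(last=False)
        self.resident[tex_id] = size

    def resident_bytes(self):
        return sum(self.resident.values())

cache = TextureCache(capacity=2048)
for tex in range(10):       # stream through many textures once
    cache.touch(tex, 256)

# A frame that only needs two textures...
cache.touch(8, 256)
cache.touch(9, 256)
frame_working_set = 2 * 256

# ...yet resident_bytes() reports the cache packed to capacity,
# because evictable-but-still-resident textures are counted too.
```

That is why "the game uses 4GB" screenshots don't prove a 4GB minimum: the number conflates what must be resident with what merely hasn't been evicted yet.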


----------



## Zipperly

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Zipper....."its actually very simple that if textures are complex enough and extremely high resolution then it is going to need more vram"
> 
> When has this ever happened and been documented and explained


Seriously? Where have you been for the past decade? Under a rock? This is the way things have always been; if not, we would all still be fine with 256-512MB of video RAM on our GPUs.

Quote:


> Is it not exactly the same consequence as the card not being fast enough? How do we know which is which?


Well, of course not - two totally different discussions. This discussion, however, is about VRAM, and with improved textures come increased VRAM requirements... I'm more shocked that you are shocked by this simple fact than by anything else you have said.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Zipperly*
> 
> Seriously? where have you been for the past decade? under a rock? This is the way things have always been, if not then we would all still be fine with 256-512mb's of video ram on our GPU's.
> Well of course not, two totally different discussions. This discussion is however about vram and with improved textures comes increased vram requirements........... Im more shocked that you are shocked by this simple fact more than anything else you have said.


You speak with absolute certainty about VRAM requirements and increases.

So what, then, are these requirements... in real numbers, please. Not caching, not what the devs are claiming - honest, scientific benchmarks of ACTUAL USAGE.

Better yet, what kind of performance AM I ACTUALLY PAYING FOR AND EXPECTING in relation to a higher amount of VRAM?

If you can't answer these things honestly and clearly, don't you find that just a little bit fishy?


----------



## Zipperly

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> You speak in absolute certainty about Vram requirements and increases
> 
> So what then, are these requirements....in real numbers, please....not caching....not what the devs are claiming....honest scientific benchmarks of ACTUAL USAGE


I tell you what: go try to play Crysis 3 at 1920x1080 with a 1GB video card. Raw processing power aside, you are going to find yourself hitching and stuttering due to an insufficient amount of video RAM. I understand where you are coming from about caching, though, as some games certainly do this, but it still does not negate the time-proven fact that more detailed textures require more VRAM. A simple test with vanilla Skyrim vs. a highly modded Skyrim will do just fine.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Zipperly*
> 
> I tell you what, go try to play Crysis 3 at 1920X1080 with a 1gb video card, raw processing power aside you are going to find yourself hitching and stuttering due to an insufficient amount of video ram. I under stand where you are coming from about caching though as some games certainly do this but it still does not negate the fact that "and this is time proven" that more detailed textures require more vram. A simple test with vanilla skyrim vs a highly modded skyrim will do just fine.


O.k., more detailed textures require more VRAM... fair enough.

What about this, though: any card that HAD 1GB that I try to play Crysis 3 on will run out of raw power before it hits that VRAM wall... and if I'm wrong and it actually DOES hit the VRAM wall, it will look just the same as if it ran out of power - a catch-22, if you will.

To make this situation even MORE UNCLEAR, here is a video of a GTX 690 (the highest-performing card with the lowest VRAM) vs. a Titan in different games and resolutions:
https://www.youtube.com/watch?v=RMGH3HEj_ew

This is why when I hear "vram requirements" I instantly think "ripoff"

Oh yeah... and Skyrim... love it... but when it's modded it's been said to leak memory and crash to desktop for no apparent reason; also, a modded game is probably not a good choice for reference purposes.


----------



## Zipperly

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> O.k. more detailed textures require more vram.....fair enough
> 
> What about this though.... any card that HAD 1gb that I try to play crysis 3 on will run out of raw power before it hits that vram wall...and if I'm wrong and it actually DOES hit the vram wall...its will look just the same as if it ran out of power....a catch 20/20 if you will


You will know when you run out of video RAM vs. just running low on raw power, and you absolutely will run out of VRAM with a 1GB card when playing Crysis 3 at 1920x1080, or maybe even lower res than that. Running low on raw power will just result in low fps; running out of VRAM, on the other hand, will leave you with a stuttering mess, which looks even worse than just being low on raw processing power.

If, for example, a 1GB GTX 780 existed, all the raw processing power in the world wouldn't do it any good in a situation such as this.
Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> To make this situation even MORE UNCLEAR here is a video of gtx 690 (the highest performing card with the lowest Vram) vs a Titan at different games and resolutions including crysis 3
> 
> https://www.youtube.com/watch?v=RMGH3HEj_ew
> 
> This is why when I hear "vram requirements" I instantly think "ripoff"


Alan Wake: very low-res textures in that one. VRAM requirements are not that high in some of those games even at 4K, so I'm not surprised the 690 didn't have any trouble. Many of the other games in that video are also pretty old and built around last-gen consoles. Heck, in some of the tests they don't even run any AA... BTW, Crysis 3 is not in that video.

So again, here we are comparing games built around last-gen consoles, so it's no wonder that a 2GB card holds up well there. How much RAM did the old consoles have anyway? 256 to 512MB, I believe, and now the new consoles have around 5GB to work with for gaming. That is a dramatic difference, and if you don't think that difference will carry over to PC gaming, you are sadly mistaken. As technology evolves, so does the need for better hardware.
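The stutter-vs-low-fps distinction above also has a rough quantitative basis: textures that no longer fit in VRAM have to be fetched over the PCIe bus, which is an order of magnitude slower than on-card memory. The bandwidth figures here are ballpark assumptions, not measurements of any specific card:

```python
PCIE_GBPS = 16.0   # ~PCIe 3.0 x16 link, ballpark
VRAM_GBPS = 224.0  # ~GTX 770 GDDR5 bandwidth, ballpark

def fetch_ms(megabytes, gbps):
    """Milliseconds to move `megabytes` of data at `gbps` GB/s."""
    return megabytes / 1024.0 / gbps * 1000.0

# Paging 100 MB of textures in over PCIe costs ~6 ms - more than a
# third of a 60 fps frame budget (16.7 ms) - versus well under 1 ms
# from VRAM. Because the cost only hits on the frames that miss,
# it shows up as intermittent hitching, not a uniform fps drop.
spill_penalty = fetch_ms(100, PCIE_GBPS)
```

Running out of raw shader power, by contrast, slows every frame by roughly the same amount, which is why it reads as low-but-steady fps instead.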


----------



## marcus556

I wonder, given how the new 900-series GPUs deal with memory, if this will still be the case... 2GB seems like a lot to make up for....


----------



## TopicClocker

Wow, how hard is it to wait for the game to release?


----------



## Zipperly

Quote:


> Originally Posted by *TopicClocker*
> 
> Wow, how hard is it to wait for the game to release?


I agree; I'm going to withdraw from these discussions for the time being. I just don't understand why people have a hard time with history repeating itself - it's been this way for as long as I can remember. Then you have the crowd that wants to compare this to games built around the previous-generation consoles, which only had a measly 256-512MB of RAM to work with.

Personally, if the devs are giving PC gamers an optional setting for higher-resolution textures, then more power to them, and if you don't have a video card that can cut it for VRAM, you can dial it back to the High texture setting, which is where they could have left us if they had decided to. I more than welcome PC-exclusive features and can appreciate any gaming company that keeps us PC gamers in mind like that instead of just doing a direct straight port with your typical AA/AF options.


----------



## BradleyW

I actually played this game last night at the EuroGamer Expo in London. The Ultra textures did not look that impressive. I believe this is a case of using large amounts of VRAM without justification.


----------



## TopicClocker

Quote:


> Originally Posted by *BradleyW*
> 
> I actually played this game last night at EuroGamer Expo London. The ultra textures did not look that impressive. I believe this is a case of using large VRAM amounts with unjustifiable intent.


Cool, did they say anything about the requirements, and was it really running Ultra textures?


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> I actually played this game last night at EuroGamer Expo London. The ultra textures did not look that impressive. I believe this is a case of using large VRAM amounts with unjustifiable intent.


What was the resolution of the screen?


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Zipperly*
> 
> You will know it when you run out of video ram vs just running low on raw power and you absolutely will run out of vram with a 1gb card when playing crysis 3 at 1920X1080 or maybe even lower res than that. Running low on raw power will just result in low fps, running out of vram on the other hand will leave you with a stuttering mess which will look even worse than just being low on raw processing power.
> 
> If for example a 1gb GTX 780 existed all that raw processing power in the world wouldnt do it any good in a situation such as this.
> Alan Wake, very low res textures in that one. Vram requirements are not that high in some of those games even at 4k res so im not surprised that the 690 didnt have any trouble. Many other games in that video are also pretty old and are based off of last gen consoles. Heck in some of the test they dont even run any AA..... BTW crysis 3 is not in that video.
> 
> So again, here we are comparing games built around last gen consoles so its no wonder that a 2gb card doesnt hold up well here. How much ram did the old consoles have anyway? 256 to 512mb I believe and now the new consoles have around 5gb's to work with for gaming. That is a dramatic difference and if you dont think that difference will carry over to pc gaming then you are sadly mistaken. As technology evolves so does the need for better hardware.


Firstly... where is your source on what the difference between running out of VRAM and running out of processing power looks like?

Second, can you explain more clearly the difference between "stuttering" and FPS fluctuations? Because they look and act remarkably the same.... Furthermore, can you prove there is a difference?

Let me make it clear, I'm not knocking what you're saying, only asking for clarity on this situation. You may feel that you know for sure, but facts are only facts because of proof, my friend.

There were games in that video running at stupid high resolutions with AA; OK, maybe they weren't the latest, SO....

Shall I post ANOTHER link of 780 Ti vs Titan Black to further illustrate my point, OR....

Can you post a benchmark of similarly powered cards that shows the benefits of having more VRAM?


----------



## Zipperly

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Firstly...where is your source saying what the difference between running out of Vram and running out of processing power will be?


I have experienced both phenomena first hand. Running out of video RAM results in very noticeable, nasty stuttering, which is a completely different experience from simply running low on processing power. I also typically play my games with programs that monitor VRAM usage, so it's not hard for me to confirm that I have gone over my VRAM limit when the stuttering begins. Honestly... running out of video RAM is very noticeable; you would be the first person I have ever come across to challenge this.
Quote:


> Second can you explain more clearly the difference between "stuttering" and FPS fluctuations, because they look and act remarkably the same....furthermore can you prove their is a difference


Actually, they do not. The stuttering that occurs from a lack of sufficient video RAM is extremely noticeable, to the point that it is worse than any typical FPS fluctuation.

Quote:


> Can you post a benchmark of similarly powered cards that show the benefits of having more Vram?


Based on last gen? Probably not. However, I had no trouble at all exceeding the 2GB limit with my old GTX 680 when playing BF4 at 1920x1080.

As I mentioned earlier, I will back out of this discussion; history will soon prove itself all over again. It's a rinse, wash and repeat cycle... always has been and always will be.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Zipperly*
> 
> I have experienced both phenomenons first hand, running out of video ram results in very noticeable nasty stuttering which is a complete different experience compared to when you are simply running low on processing power. Also I typically play my games with programs that monitor vram usage so its not that hard for me to confirm that I have gone over my vram limitations when the stuttering begins. Honestly.. running out of video ram is very noticeable, you would be the first person I have ever come across to challenge this.
> Actually they do not, the stuttering that occurs from a lack of sufficient video ram is extremely noticeable to the point that it is worse than any typical FPS fluctuations.
> Based off of last gen? Probably not. However I had no trouble at all exceeding the 2gb limitation with my old GTX 680 when playing BF4 at 1920X1080.
> 
> As I mentioned earlier I will back out of this discussion, history will soon prove itself all over again. Its a rinse, wash and repeat cycle... always has been and always will be.


O.K., so what happened when you "exceeded" the 2GB limit in BF4... your FPS dropped?? Fluctuated wildly??..... ALL SYMPTOMS OF LOW RAW POWER!

GTX 690....BF4
https://www.youtube.com/watch?v=t6qGqMhvrnA

Before you find a setting that's not to your standard in that video..... I bet you ANY AMOUNT OF MONEY a Titan Black would put up similar performance!

Why are people defending this mythical, magical VRAM wall when they should be looking for answers?

Unless, of course, burning money is the goal...


----------



## Pip Boy

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> O.K. so what happened when you "exceeded" the 2gb limitation in BF4....your FPS dropped?? fluctuated wildy??.....ALL SYMPTOMS OF LOW RAW POWER!
> 
> GTX 690....BF4
> https://www.youtube.com/watch?v=t6qGqMhvrnA
> 
> Before you find a setting thats not to your standard in that video.....Bet you ANY AMOUNT OF MONEY a titan black put up similar performance!
> 
> Why are people defending this mythical, magical Vram wall when they should be looking for answers
> 
> Unless of course to burn money is the goal...


Because they don't actually understand the technology?


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What was the resolution of the screen?


I can't disclose technical information. I wish I could.







lol.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *phill1978*
> 
> because they don't actually understand the technology ?


So Nvidia and the game makers owe it to us to explain.

Who here buys a GPU without having a clear image in their mind of the performance they can expect?

Why should VRAM be any different?


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Zipperly*
> 
> Omg...
> 
> 
> 
> 
> 
> 
> 
> No, it was not low raw power and the fps did not fluctuate wildly either. Instead it turned into an all out stutter fest, while monitoring the game with MSI Afterburner as long as I was not peaking beyond the 2gb vram limitation my framerates were perfectly fine and the game was silky smooth, if it was a LOW RAW POWER situation as you keep "incorrectly suggesting it was" then my frame rate would be utter and complete crap all of the time REGARDLESS of how much vram was being utilized. You really dont know what you are talking about man.......... not even close.
> 
> Btw- I have no trouble filling up the 3gb's of video ram in BF4 with my GTX 780 if I max out the resolution slider scale and apply enough AA. Its really not that difficult to do at all. But again do I have to play the game that way? no not at all but 1920X1080 on ultra settings with a single GTX 680 proved to be a bit too much for a 2gb card on many 64 player maps.
> Yes you must be right, the very fact that we no longer game with 256mb or 512mb cards is all due to a legendary myth. Im going to suggest to nvidia and ati that anything more than that is just a waste and we should all still be able to play any current game to date on ultra settings at 1080p with less than 1gb of video ram. Im not going to bother replying to anymore of your senseless dribble, you are a hopeless case as far as I am concerned and a complete and total waste of my time.


Buddy... if you feel that you're hitting the VRAM wall, don't get mad... enlighten us as to how and why.

I understand what you're saying makes perfect sense in your mind, but it ain't solid proof, I'm afraid... not by a long shot.

How come a 690 was able to handle BF4 in the vid?

And if you turn up the resolution scale and AA far enough, you can make A LOT of cards "stutter".

But if I failed to illustrate my pure intentions in our little debate... then good luck to you either way, friend =)


----------



## firebird1le

Hi all, I built a PC with an i7 4790K a few months ago and reused my EVGA 760 SC 2GB in anticipation of buying a 970 4GB when they came in stock. I only have a 1080p monitor. Should I hold off till 8GB cards come out? I would like to run the graphics maxed out. I am planning on playing Shadow of Mordor in addition to the upcoming The Crew, Batman, Far Cry 4, The Witcher 3 and Dragon Age: Inquisition. I would be more than willing to hold off till 8GB cards come out; I just don't want to buy a 4GB card and have it be obsolete in 6 months. I think Mordor is the only game so far that has released requirements, but I would assume the other games will be similar.


----------



## Zipperly

Quote:


> Originally Posted by *firebird1le*
> 
> Hi all, I built a PC with a i7 4790k a few months ago and reused my EVGA 760 SC 2gb in anticipation of buying a 970 4gb when they came instock. I only have a 1080 monitor. Should I hold off till 8gb cards come out? I would like to run the graphics maxed out. I am planing on playing Shadow of Mordor in addition to the upcoming Crew, Batman, Far Cry 4, Witcher 3 and Dragon Age Inquisition. I would be more then willing to hold off till 8gb cards come out, I just dont want to buy a 4gb card and be obsolete i 6 months. I think Mordor is the only game so far that has released requirements, but I would assume the other games would be similar.


I would expect 4GB cards to last a good while yet; you might get the odd game here and there requiring more, but for the most part you should be fine. Of course, if you want to wait it out a bit, I had heard that the 8GB 970s and 980s would be out mid-October; you could wait and see how much those cost before you make a decision. I personally would rather be well above the recommended VRAM specs, and seeing as I'm still OK with my 3GB 780, I will hang onto it a bit longer and then upgrade to a 6-8GB video card when I actually need it.

If The Evil Within actually does need more than 3GB of video RAM, then I might upgrade sooner rather than later, because I am looking forward to playing that game and I am the type that likes to play with maxed-out settings.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *firebird1le*
> 
> Hi all, I built a PC with a i7 4790k a few months ago and reused my EVGA 760 SC 2gb in anticipation of buying a 970 4gb when they came instock. I only have a 1080 monitor. Should I hold off till 8gb cards come out? I would like to run the graphics maxed out. I am planing on playing Shadow of Mordor in addition to the upcoming Crew, Batman, Far Cry 4, Witcher 3 and Dragon Age Inquisition. I would be more then willing to hold off till 8gb cards come out, I just dont want to buy a 4gb card and be obsolete i 6 months. I think Mordor is the only game so far that has released requirements, but I would assume the other games would be similar.


Mordor comes out in 2 days....

No doubt reviewers will be testing and dissecting the 6GB texture recommendation....

I wouldn't be the least bit surprised if the 970 performs on par with a 6GB Titan, assuming there are no funky optimization issues or setting locks....

If that is the case, you won't have to worry about upgrading that 970 until it can't put out decent FPS.

Like every other GPU that has ever been made, LoL


----------



## KenjiS

4GB is the new 2GB, simple as that lol

Dont worry, Tuesday is coming. I will see what I can do for a benchmark of some form... I have 2 rigs to test on after all..


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *KenjiS*
> 
> 4gb is the new 2gb, Simple as that lol
> 
> Dont worry, Tuesday is coming. I will see what I can do for a benchmark of some form... I have 2 rigs to test on after all..


WoW....

4GB is barely out and it's already a bare minimum.

Even though a Titan has never shown up a 690 in any benchmark...

Nor has a Titan Black shown up a 780 Ti.

Poor are the people reading this and becoming misinformed.


----------



## Nicnivian

And here I am with this GTX 780ti now. A card for PEASANTS!!!

Oh, wait. No... I'm not worried.


----------



## ZealotKi11er

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> WoW....
> 
> 4gb is barely out and its already a bare minimum
> 
> Even though a Titan has never shown up a 690 in any benchmark
> 
> Nor a Titan black shown up a 780ti
> 
> Poor are the people reading this and becoming mis-informed


We have had 2GB cards since the GTX 280 or 285, 3GB since the GTX 580, 4GB since the GTX 680 and 6GB since the GTX Titan.

If you think about it, the first real 2GB cards came with the HD 6970, 3GB with the HD 7970 and 4GB with the 290X.

The HD 7970 is 3 years old. I don't care if the GTX 780 Ti is 1 year old; that's Nvidia's problem. I expect 3GB to be the minimum right now.

Now, using VRAM just for the sake of using it is stupid.

Right now 2GB should be the minimum for 1080p, 3GB for 1440p and 4GB for 4K, soon to be 6GB+.

Next year you will see 3GB for 1080p, 4GB for 1440p, and 4K will need 6GB.


----------



## Zipperly

Quote:


> Originally Posted by *StrongForce*
> 
> Even the 780 is limited with it's 3gb VRAM, that card should have had 4gb ! and the new ones 6
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Then again like some people say we'll see how it turns out, does the ultra setting on this game really look so much better than the high settings ? ... maybe the ultra settings it just like pushing the resolution scale to 200% on BF4, which I'm sure would require ALOT of VRAM.


Yes, that's the problem: not every map uses the same amount of video RAM, so people who always stick to the same maps may never actually run into this situation. Either way, the 2GB 680 sits right on the cliff's edge of not having enough video RAM for all-Ultra BF4 at 1920x1080.

There actually are some 6GB GTX 780s out there, and up until now 3GB has been plenty for most users, especially at 1080p. Guess we will soon find out whether that requirement goes up or not. In any case, if Johny is still replying to me he is wasting his time; he made my block list a page or two back.


----------



## Jaycz

Wait for the game to come out and be reviewed and tested before getting angry.

The way I see it, one of a few things is going to happen:

The game is going to look amazing with the Ultra pack and be worthy of the 6GB VRAM usage.

Poor optimization.

Another CoD:G RAM thing, or some other inflated system spec requirement.


----------



## fashric

Quote:


> Originally Posted by *StrongForce*
> 
> Yes I had order a 770 because I kinda rushed the purchase without thinking further few months ago, and without thinking 2gb would be much of a bottleneck, came in BF4, maxed it all, ran into huge fps drops (to like 16) on one of the Naval strike maps, yes bf4 maxed does use more than 2 gb VRAM, and the people who can run this without having massive stutter, well I don't know what you're doing, maybe you're playing the maps that doesn't require much VRAM ?
> 
> Also GPU Z showed it used 2200mb of vram before on BF4.
> 
> now with my HD 7950 it havent drop under 40 and it's only in the parts where it's very intense (such as pearl market roofs..for instance), but the rest of the time the FPS are very stable between 40 and settings I can't remember what settings I'm using. mix of med and high I think @ 1900x1200
> 
> Even the 780 is limited with it's 3gb VRAM, that card should have had 4gb ! and the new ones 6
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Then again like some people say we'll see how it turns out, does the ultra setting on this game really look so much better than the high settings ? ... maybe the ultra settings it just like pushing the resolution scale to 200% on BF4, which I'm sure would require ALOT of VRAM.


I'm running BF4 all on Ultra with 4xMSAA at 1440p with 3GB 780s in SLI, and there is no issue with VRAM; I average around 80fps.


----------



## Zipperly

Quote:


> Originally Posted by *fashric*
> 
> I'm running BF4 all on Ultra with 4xMSAA at 1440p with 3gb 780's in SLI and there is no issue with vram I average around 80fps.


I run at 1920x1080, all Ultra, 2xMSAA, 80 FOV, *res scale 150%*. I sometimes see up to 2500MB of video RAM usage at these settings myself. If I go 4xMSAA and push the slider a bit further, I have no trouble at all running out of video RAM. I'm also usually getting 60fps with the settings I just mentioned on demanding 64-player maps.

1440p all Ultra with 4xMSAA shouldn't be that difficult; it's when you start to mess with resolution scale that VRAM starts to get eaten up.


----------



## StrongForce

Quote:


> Originally Posted by *fashric*
> 
> I'm running BF4 all on Ultra with 4xMSAA at 1440p with 3gb 780's in SLI and there is no issue with vram I average around 80fps.


Yeah, I'm talking 2GB for 1080p; good to know 3GB is still not an issue for 1440p.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *StrongForce*
> 
> Yes I had order a 770 because I kinda rushed the purchase without thinking further few months ago, and without thinking 2gb would be much of a bottleneck, came in BF4, maxed it all, ran into huge fps drops (to like 16) on one of the Naval strike maps, yes bf4 maxed does use more than 2 gb VRAM, and the people who can run this without having massive stutter, well I don't know what you're doing, maybe you're playing the maps that doesn't require much VRAM ?
> 
> Also GPU Z showed it used 2200mb of vram before on BF4.
> 
> now with my HD 7950 it havent drop under 40 and it's only in the parts where it's very intense (such as pearl market roofs..for instance), but the rest of the time the FPS are very stable between 40 and settings I can't remember what settings I'm using. mix of med and high I think @ 1900x1200
> 
> Even the 780 is limited with it's 3gb VRAM, that card should have had 4gb ! and the new ones 6
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Then again like some people say we'll see how it turns out, does the ultra setting on this game really look so much better than the high settings ? ... maybe the ultra settings it just like pushing the resolution scale to 200% on BF4, which I'm sure would require ALOT of VRAM.


When you push that resolution slider in BF4, you're downsampling.

It's not 1080p anymore... you're rendering at a higher resolution, then downscaling it to 1080p.

Neither a 770 nor a 780 is going to ace BF4 at resolutions higher than 1080p.
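To put rough numbers on that, here's a quick back-of-the-envelope sketch (the function names are mine, purely for illustration, not anything from the game) of what the slider does to the internal render resolution:

```python
# BF4's resolution scale renders internally at base_res * scale,
# then downsamples the result to the display resolution.

def render_resolution(base_w, base_h, scale_pct):
    """Internal render resolution for a given scale percentage."""
    scale = scale_pct / 100.0
    return int(base_w * scale), int(base_h * scale)

def pixel_ratio(base_w, base_h, scale_pct):
    """How many times more pixels the GPU shades vs. native."""
    w, h = render_resolution(base_w, base_h, scale_pct)
    return (w * h) / (base_w * base_h)

# "1080p at 200%" is really 4K as far as the GPU is concerned:
print(render_resolution(1920, 1080, 200))  # (3840, 2160)
print(pixel_ratio(1920, 1080, 200))        # 4.0 -- four times the work
```

So a card that "can't handle 1080p at 200% scale" is actually being asked to render native 4K.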


----------



## ZealotKi11er

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> When you push that resolution slider in BF4...your downscaling
> 
> It's not 1080p anymore....your rendering at a higher resolution...then compressing it for 1080
> 
> Neither a 770 nor a 780 is going to ace BF4 at resolutions higher than 1080


I run 150% @ 1440p no problem with 4GB of vRAM.


----------



## Caffinator

My laptop only has 4GB of RAM. I don't think it will play this game.


----------



## Zipperly

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I run 150% @ 1440p no problem with 4GB of vRAM.


And I run 150% res scale at 1080p, all Ultra, 2xMSAA, no problem with a single GTX 780 (mind you, I am overclocked). I am able to maintain 60fps the majority of the time on 64-player maps too.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I run 150% @ 1440p no problem with 4GB of vRAM.


Yeah, but you're on an R9 290, aren't you?

I've seen some benchmarks where a 280X beats a 780, so that's not really surprising....

BF4 likes AMD, and you have Mantle support as well.

The resolution slider is a downsampler; it renders the game at a higher resolution.

Then people say "wow, my card can't handle 1080p with the resolution slider at 200%, must be the VRAM...."

No, it's the higher res that your card is actually rendering, for Pete's sake.....


----------



## Flames21891

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Yea, but your on an r9 290 aren't you?
> 
> I've seen some benchmarks where a 280x beats a 780, so thats not really surprising....
> 
> BF4 likes amd and you have mantle support as well
> 
> The resolution slider is a downscaler, it renders the game at a higher resolution
> 
> Then saying "wow my card can't handle 1080p with 200% resolution slider, must be the Vram...."
> 
> No, its the higher res that your card is actually rendering for pete's sake.....


Well, it could be the VRAM.

The performance impact of not having enough raw power and of not having enough VRAM are quite different. When your GPU runs out of VRAM, it has to start using system memory. Not only is system memory much slower, the card has to go out of its way to even communicate with it, since that RAM isn't onboard.

So when you run out of VRAM, the game will play fine until that memory needs to be swapped. Say it takes exactly one second for your GPU to talk to system RAM and get what it needs: during that second your screen will freeze and your FPS will be zero. Once your GPU receives the resources it needs to continue, it goes back to running at whatever framerate it was at before, until it needs something stored in system memory again. This phenomenon is referred to as "stuttering" because of its stop-and-go nature. Not having enough raw GPU power simply yields low framerates, but those framerates are consistent.

Running out of VRAM is very possible. In fact, I'm practically positive that if I tried running Crysis 3 maxed out at 4K, I would surpass my 2GB limit and encounter stuttering.

I get the feeling that 2GB will be phased out very soon, especially given how much memory the consoles have to play with. 3-4GB, however, should be more than enough for 1080p for quite a bit longer, I would imagine. I doubt we'll see 6GB requirements becoming the norm so soon; VRAM requirements do inch upwards, but at a steady pace, and I doubt we'd go from 2-3GB being plenty to 6GB being the minimum overnight like this.

Again, I say let people play around with the game first before passing judgement. We'll know in just a couple of days whether this is an optimization issue or whether the textures live up to their requirements.
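The two failure modes described above can be sketched as a toy model (all numbers here are made up for illustration; real swap penalties depend on the bus, driver, and engine):

```python
# Toy contrast of the two failure modes:
# - low raw power: every frame is uniformly slow
# - VRAM overflow: most frames are fast, but periodic paging over the
#   PCIe bus inserts huge spikes -- the "stutter"

def frame_times_low_power(n_frames, frame_ms=40.0):
    # Consistently slow, e.g. a steady 25 fps.
    return [frame_ms] * n_frames

def frame_times_vram_overflow(n_frames, frame_ms=16.7,
                              swap_every=30, swap_penalty_ms=200.0):
    # Fast frames with a big stall whenever textures must page in
    # from system RAM. swap_every / swap_penalty_ms are invented values.
    times = []
    for i in range(n_frames):
        t = frame_ms
        if i % swap_every == swap_every - 1:
            t += swap_penalty_ms  # screen freezes during the transfer
        times.append(t)
    return times

low = frame_times_low_power(60)
stutter = frame_times_vram_overflow(60)
# Average frame time can look similar, but the worst frame is far worse
# when swapping, which is why it feels like freezing, not just low fps:
print(max(low), max(stutter))
```

An average-fps counter can hide the difference entirely; only worst-frame (minimum fps / frame-time) measurements expose it.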


----------



## HothTron

No thanks, i'll be playing on my PS4 instead.


----------



## Zipperly

Quote:


> Originally Posted by *Flames21891*
> 
> Well, it could be the VRAM.
> 
> The performance impact from not having enough raw power and not having enough VRAM are quite different. When your GPU runs out of VRAM, it then has to start using system memory. However, not only is system memory much slower, the card has to go out of its way to even communicate with it as that RAM isn't onboard.
> 
> So, when you run out of VRAM, the game will be playing fine until that memory needs to be swapped. Let's say it takes exactly one second for your GPU to communicate with system RAM and get what it needs. During that second your screen will freeze, and your FPS will be zero. Once your GPU receives the resources it needs to continue, it will go back to running at whatever framerate it was before until it needs something stored in system memory again. This phenomena is referred to as "stuttering" because of the stop and go nature of it. Not having enough raw GPU power simply yields low framerates, but those framerates are consistent.
> 
> Running out of VRAM is very possible. In fact, I'm practically positive that if I tried running Crysis 3 maxxed out at 4K, I would surpass my 2GB limit and encounter stuttering.
> 
> I get the feeling that 2GB will be phased out very soon, especially given how much memory the consoles have to play with. 3-4GB, however, should be more than enough for 1080p for quite a bit longer I would imagine. I doubt we'll see 6GB requirements becoming the norm so soon. VRAM requirements do inch upwards, but at a steady pace. I doubt we'd go from 2-3GB being plenty to 6GB being minimum overnight like this.
> 
> Again, I say let people play around with the game first before passing judgement. We'll know in just a couple of days whether this is an optimization issue, or if the textures live up to their requirements.


You are wasting your time; I went round and round in circles with this guy telling him exactly what you just said and got nowhere. Eventually I just blocked him; I really had no other choice. Also, since you quoted him, I am able to see where he claims a 280X is faster than a GTX 780 in BF4, and he is flat out wrong; what a ridiculous thing to say, especially when the benchmarks reflect the complete opposite. Not only that, but I came from a highly overclocked HD 7970 (a 280X equivalent) when BF4 came out, and I am well aware of how that card performs in BF4: it is nowhere near as fast as a GTX 780. Before I purchased this GTX 780 I owned an aftermarket-cooled, highly overclocked 290X, and again, even with Mantle, I found that it doesn't perform as well as my overclocked GTX 780, especially now that Nvidia is doing something similar to Mantle through optimized DX11 drivers.

One of the benefits I have as a PC gamer is that I have personally owned about every high-end GPU from both sides of the fence at one point or another for as long as I can remember. I do this because I am an enthusiast, and as such I like to try the high-end cards from both camps before deciding which one I will stick with.


----------



## perfectblade

Quote:


> Originally Posted by *HothTron*
> 
> No thanks, i'll be playing on my PS4 instead.


It's just getting silly at this point. We're supposed to spend $600 every 2 years to play games at 1080p... with (somewhat) higher textures than consoles? And people defend this?


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Flames21891*
> 
> Well, it could be the VRAM.
> 
> The performance impact from not having enough raw power and not having enough VRAM are quite different. When your GPU runs out of VRAM, it then has to start using system memory. However, not only is system memory much slower, the card has to go out of its way to even communicate with it as that RAM isn't onboard.
> 
> So, when you run out of VRAM, the game will be playing fine until that memory needs to be swapped. Let's say it takes exactly one second for your GPU to communicate with system RAM and get what it needs. During that second your screen will freeze, and your FPS will be zero. Once your GPU receives the resources it needs to continue, it will go back to running at whatever framerate it was before until it needs something stored in system memory again. This phenomena is referred to as "stuttering" because of the stop and go nature of it. Not having enough raw GPU power simply yields low framerates, but those framerates are consistent.
> 
> Running out of VRAM is very possible. In fact, I'm practically positive that if I tried running Crysis 3 maxxed out at 4K, I would surpass my 2GB limit and encounter stuttering.
> 
> I get the feeling that 2GB will be phased out very soon, especially given how much memory the consoles have to play with. 3-4GB, however, should be more than enough for 1080p for quite a bit longer I would imagine. I doubt we'll see 6GB requirements becoming the norm so soon. VRAM requirements do inch upwards, but at a steady pace. I doubt we'd go from 2-3GB being plenty to 6GB being minimum overnight like this.
> 
> Again, I say let people play around with the game first before passing judgement. We'll know in just a couple of days whether this is an optimization issue, or if the textures live up to their requirements.


This particular stuttering scenario you're speaking about... where can one find more information on it?

Before I had my 780 Tis I ran a 690 at 1440p with AA.

Never once saw that happen.

Never read anything about it anywhere.

Check minimum frame rates on a 690 vs a Titan anywhere.

People just blame VRAM without any research; I guess the 980 is not enough by that logic....


----------



## Zipperly

Quote:


> Originally Posted by *perfectblade*
> 
> it's just getting silly at this point. we're supposed to spend $600 every 2 years to play games at 1080p...with higher textures (somewhat) than consoles? and people defend this?


Exaggerate much?

Last time I checked, you didn't have to spend $600 on a GPU every 2 years to play the latest and greatest at the highest settings. Also, I guess you don't know about the newly released $330 GTX 970 either........


----------



## Caffinator

I usually stay a generation behind.


----------



## HothTron

Quote:


> Originally Posted by *perfectblade*
> 
> it's just getting silly at this point. we're supposed to spend $600 every 2 years to play games at 1080p...with higher textures (somewhat) than consoles? and people defend this?


Some games I just want to plug in and play in front of my big screen and surround sound after a long day of IT work, dealing with computer issues and users 10 hours a day. Using a DS4 controller works just great vs keyboard and mouse in many games, and Mordor is one. Other games I prefer, or can only play, on PC.

And no, I don't want to plug my computer into my TV; I keep my gaming systems separate.


----------



## Leopard2lx

I run BF4 on Ultra + 2xMSAA + 200% resolution scale (4K) at about 30-35 fps average, and the VRAM is maxed out at 3GB; however, it does not seem to affect performance, as I don't get any stutters, frame drops or freezes. It's a fairly consistent frame rate.

Personally, I'm OK with Shadow of Mordor on High textures and everything else on Ultra as long as it runs great, because Crap Dogs also supposedly required over 3GB of VRAM for Ultra textures, yet the game ran poorly even on High with 3GB of VRAM. That tells me it's not a VRAM issue but a game designed for unified memory (aka consoles) and not adapted properly for PC.

Regardless, as long as the game runs great on High textures, I'd be happy.


----------



## Zipperly

My GTX 780, for those who are curious, is more than capable of maxing out BF4 at 1920x1080 while keeping a consistent 60fps on all Ultra settings. I can even up the resolution scale to 150% with 2xMSAA and maintain 60fps the majority of the time on 64-man servers. Anyone can go back and read what I actually stated, which was that with the resolution slider at 200% and 4xMSAA the game exceeded my 3GB limit, which introduces massive stuttering. And rightfully so, since at the 200% mark I have in reality gone far beyond 1080p on a game that is already notorious for using large amounts of VRAM when maxed out.
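A rough sense of why scale and MSAA eat VRAM so fast: render-target memory grows with both. The sketch below is only a floor estimate under simple assumptions (one RGBA8 color buffer plus one 32-bit depth buffer; real engines allocate many more buffers for G-buffers and post-processing, so actual BF4 usage is far higher):

```python
# Minimum render-target memory vs. resolution scale and MSAA.
# Assumes one 4-byte color buffer (RGBA8) and one 4-byte depth buffer;
# illustrative only, not Frostbite's actual allocation.

BYTES_PER_PIXEL_COLOR = 4   # RGBA8
BYTES_PER_PIXEL_DEPTH = 4   # e.g. D24S8

def render_target_mb(base_w, base_h, scale_pct, msaa=1):
    w = int(base_w * scale_pct / 100)
    h = int(base_h * scale_pct / 100)
    per_pixel = (BYTES_PER_PIXEL_COLOR + BYTES_PER_PIXEL_DEPTH) * msaa
    return w * h * per_pixel / (1024 ** 2)

# Native 1080p vs. 1080p at 200% scale with 4xMSAA:
print(round(render_target_mb(1920, 1080, 100), 1))          # ~15.8 MB
print(round(render_target_mb(1920, 1080, 200, msaa=4), 1))  # ~253.1 MB
```

Even this bare-bones estimate grows 16x (4x from the pixels, 4x from the samples); multiply that across every buffer an engine keeps resident and it's easy to see how 200% + 4xMSAA pushes a 3GB card over the edge while 100% does not.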


----------



## Zipperly

Quote:


> Originally Posted by *Leopard2lx*
> 
> I run BF4 on Ultra + 2XMSAA + 200% Resolution Scale (4k) at about 30-35 fps average and the VRAM is maxed out at 3GB, however it does not seem to affect performance as I don't get any stutters, frame drops or freezes. It's fairly consistent frame rate.


That sounds about right with 2XMSAA, 4XMSAA combined with 200% res scale pushes it over the 3gb barrier from time to time depending on which map you are playing.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Zipperly*
> 
> That sounds about right with 2XMSAA, 4XMSAA combined with 200% res scale pushes it over the 3gb barrier from time to time depending on which map you are playing.


With my 290X I am hitting ~2.2GB at 1440p with 150% scaling and 0x MSAA. No point in using MSAA if you are already using SSAA.


----------



## perfectblade

Quote:


> Originally Posted by *Leopard2lx*
> 
> Ultra textures yet the game ran poorly even on High with 3 GB of VRAM so that tells me it's not a VRAM issue but a game designed for unified memory aka consoles and not adapted properly for PC.


Which I strongly suspect is also the case in this instance.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Leopard2lx*
> 
> I run BF4 on Ultra + 2XMSAA + 200% Resolution Scale (4k) at about 30-35 fps average and the VRAM is maxed out at 3GB, however it does not seem to affect performance as I don't get any stutters, frame drops or freezes. It's fairly consistent frame rate.
> 
> Personally, I'm OK with Shadow of Mordor on High textures and everything else on Ultra as long as it runs great, because Crap Dogs also supposedly required over 3 GB of VRAM for Ultra textures yet the game ran poorly even on High with 3 GB of VRAM so that tells me it's not a VRAM issue but a game designed for unified memory aka consoles and not adapted properly for PC.
> 
> Regardless, as long as the game runs great on High textures I'd be happy.


Quote:


> Originally Posted by *Zipperly*
> 
> That sounds about right with 2XMSAA, 4XMSAA combined with 200% res scale pushes it over the 3gb barrier from time to time depending on which map you are playing.


BF4 on Ultra + 2xMSAA + 200% resolution scale (4K) at about 30-35 fps average....

So then... 4xMSAA would yield... what... 10-15 fps average??

Then he's claiming on and on for 10 posts that it's a VRAM issue he's having??? WHAT?!?

Someone please explain to him that he's asking his 780 to render 200% **4K** resolution with 4xMSAA.

You can't run that kind of AA at 4K even with a Titan Black or a 100GB 780.


----------



## HothTron

Quote:


> Originally Posted by *perfectblade*
> 
> which i strongly suspect is also the case in this instance


Again, further reason I play games on console, as that's what the vast majority of games are developed and sold on. The PC market is an afterthought and has been for the last 10+ years, and you know I'm right. Yes, Steam exists, and it only does well due to its very low markup on games, but 10 years ago physical media was king and the console market began its domination of sales to the general consumer public.

That's business and the market. You don't have to like it, but welcome to 2014, not 1994.


----------



## Leopard2lx

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> BF4 on Ultra + 2XMSAA + 200% Resolution Scale (4k) at about 30-35 fps average....
> 
> So then...with 4XMSAA would yeild....what....10-15 fps average??
> 
> Then he's claiming on and on for 10 posts that its a Vram issue he's having??? WHAT?!?
> 
> Someone please explain to him that he's asking his 780 to render 200% **4k** resolution with 4xMSAA
> 
> You can't run that kind of AA at 4k even with a Titan black or 100gb 780


Not sure who your post is geared towards, but my point was that VRAM did NOT seem to be an issue, at least not in BF4, even though it was maxed out at 3GB. The game still ran at consistent frame rates with no stutters, freezes or sudden frame drops.
On the other hand, the frame rate went up in SLI even though the VRAM was still "maxed out" at 3GB, which indicates that BF4 benefits more from raw processing power than from additional VRAM. I guess we do have to keep in mind that BF4 is NOT an open-world game, so I don't know how much that would change things.

This could differ from game to game, but my guess is that if a game engine is optimized, you should be able to run the game at consistent fps even while hitting the VRAM wall, at the price of slightly lower average fps.


----------



## Flames21891

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> This particular stuttering scenario you're speaking about... where can one find more information about it?
> 
> Before I had my 780ti's I ran a 690 at 1440p with AA
> 
> Never once saw that happen
> 
> Never read anything about that anywhere
> 
> Check minimum frame rates on a 690 vs titan anywhere
> 
> People just blame VRAM without any research; I guess the 980 is not enough by that logic....


You probably won't be able to read much about it anywhere, as it's just kinda common knowledge. No one's ever really charted how much of a performance hit you take from running out of VRAM, because the short answer is that it sucks and makes the game practically unplayable.

I've had it happen before. Before my 680's, I had a 7970, but before _that_ I had two 512MB 4870's. The reason I ended up retiring them was actually because I upgraded from a 720p monitor to a 1080p one. Those cards still had enough raw power to run my games, however MANY were hitting a VRAM wall. Funnily enough, want to know one of the worst ones? Magicka. That game ran well over 60 FPS with my setup, but every seven seconds or so, my cards would have a leisurely chat with my system memory for about a second, resulting in the stuttering scenario I mentioned earlier.

So to those who know what VRAM limitations are like, it's very easy to discern if that's the issue.
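For what it's worth, the pattern described above (mostly fast frames punctuated by long paging stalls) is exactly what makes a VRAM wall easy to tell apart from a plain GPU bottleneck. A toy illustration, with made-up frame times and thresholds of my own choosing:

```python
# Toy illustration: a GPU-bound game gives uniformly slow frames, while
# running out of VRAM gives mostly fast frames punctuated by long paging
# stalls. Frame-time data and thresholds below are hypothetical.

def classify(frame_times_ms, stall_ms=100.0):
    """Crude heuristic: isolated long spikes over otherwise fast frames
    suggest paging/VRAM swapping rather than raw GPU load."""
    stalls = [t for t in frame_times_ms if t >= stall_ms]
    typical = sorted(frame_times_ms)[len(frame_times_ms) // 2]  # median
    if stalls and typical < 20.0:          # ~50+ FPS between stalls
        return "stutter: periodic stalls (VRAM-swap-like)"
    if typical >= 33.0:                    # under ~30 FPS throughout
        return "uniformly slow (GPU-bound-like)"
    return "smooth"

gpu_bound = [40.0] * 60                    # steady 25 FPS
vram_wall = [14.0] * 59 + [900.0]          # 70 FPS with a ~1 s stall
print(classify(gpu_bound))   # uniformly slow (GPU-bound-like)
print(classify(vram_wall))   # stutter: periodic stalls (VRAM-swap-like)
```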


----------



## perfectblade

Quote:


> Originally Posted by *HothTron*
> 
> Again, further reason I play games on console as thats what a vast majority of games are developed and sold on. PC market is an afterthought and has been the last 10+ years and you know I'm right. Yes, steam exists and only does well due to the VERY low gaming cost mark up costs but 10 years ago, physical media was king and the console market began its domination in regards to sales to the general consumer public.
> 
> Thats business and the market, don't have to like it but welcome to 2014, not 1994.


Yeah, this is pretty much my thought. I like PC gaming and all because I like the controls, but I don't care about the latest and greatest on PC. I'd rather just play that on console, because the upgrade cycle is insanity.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Flames21891*
> 
> You probably won't be able to read much about it anywhere as it's just kinda common knowledge. No one's ever really charted down how much of a performance hit you take from running out of VRAM, because the short answer is that it sucks and makes the game practically unplayable.
> 
> I've had it happen before. Before my 680's, I had a 7970, but before _that_ I had two 512MB 4870's. The reason I ended up retiring them was actually because I upgraded from a 720p monitor to a 1080p one. Those cards still had enough raw power to run my games, however MANY were hitting a VRAM wall. Funnily enough, want to know one of the worst ones? Magicka. That game ran well over 60 FPS with my setup, but every seven seconds or so, my cards would have a leisurely chat with my system memory for about a second, resulting in the stuttering scenario I mentioned earlier.
> 
> So to those who know what VRAM limitations are like, it's very easy to discern if that's the issue.


I appreciate your rational approach

But I respectfully disagree with you

It was once common knowledge the world was flat...

Like I've been saying all this time....put a 690 up against a Titan Black or 980....let's really test this....

If you're running some old sub-2GB card, you have bigger performance issues than VRAM....


----------



## Kittencake

so glad I have a 290x


----------



## MxPhenom 216

Quote:


> Originally Posted by *DADDYDC650*
> 
> Some folks said the 6GB on my card was worthless....


It is in single GPU configurations.


----------



## KenjiS

Quote:


> Originally Posted by *Zipperly*
> 
> For everyone who is confused by Johny: if you take careful notice of what he is saying, it is very clear-cut that he doesn't actually understand what VRAM does, much less what purpose it serves. He apparently thinks that more VRAM is supposed to automatically = *"higher frame rates"*. For example, I'm sure his mind is completely blown as to why a 3GB GTX 780 performs the same as a 6GB GTX 780, failing to take into account that both cards will indeed perform exactly the same until something pushes beyond the 3GB limitation of the 3GB model, at which point the 3GB model turns into a stuttering mess while the 6GB model continues to hum right along.
> 
> I experienced the same thing with my old 1.5GB GTX 580 in Skyrim + texture mods: as soon as I went over the 1.5GB VRAM limit, BOOM, stutter city. I resolved this at the time by purchasing a 3GB GTX 580 and Skyrim never stuttered again, but according to Johny this is all an impossibility, because after all VRAM is just a waste of time and shouldn't actually even be a part of the GPU.


Thank you! Wish you were here last night!


----------



## KenjiS

Quote:


> Originally Posted by *perfectblade*
> 
> It's just getting silly at this point. We're supposed to spend $600 every 2 years to play games at 1080p... with (somewhat) higher textures than consoles? And people defend this?


I spent $300 on my 570 and it lasted until last year, giving roughly three years of service. I never had to touch settings or anything; everything ran maxed out.

I upgraded to my 770 and really, I'd have been fine if I hadn't gone 1440p...


----------



## Saberfang

Let's try to think outside the box. A few months ago I was testing my system RAM by removing one stick to check the health of the remaining 2GB. It's not like my system wasn't working, but as soon as I got into some multitasking, performance was dreadful due to the RAM wall and the need to swap data for the active program, since there wasn't room for everything.
The VRAM situation is similar, but when it comes to textures there is much more room for prediction. You mostly know what you need to display next, as well as how far you are from the object you need to texture, which lets you choose a lower-resolution texture and update it once you get closer and need the detail.
As long as there is room, you have no issue, but when you hit the wall you start getting overhead proportional to how much memory you are short.
I think these two are similar scenarios, but a graphics engine should be able to manage its resources far better than conventional system RAM management can.

I don't know if SoM really hits a 6GB VRAM wall or why, but the really wrong thing here is this: if you know games in development are going to hit that wall, you don't release a high-end product that doesn't meet the requirement, no matter what!
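The distance-based prediction described above is essentially mip streaming. A minimal sketch of the idea (the distance thresholds and texture sizes here are invented for illustration, not taken from any real engine):

```python
import math

# Sketch of distance-based texture streaming: pick a mip level from
# camera distance, so a far object only needs a small texture resident.
# Thresholds and sizes are made up for illustration.

def mip_for_distance(distance_m, full_res_within_m=10.0):
    """Each doubling of distance beyond the near threshold drops one mip."""
    if distance_m <= full_res_within_m:
        return 0
    return int(math.log2(distance_m / full_res_within_m)) + 1

def mip_bytes(base_size, mip, bytes_per_texel=4):
    """Resident bytes for a single mip level of a square texture."""
    side = max(base_size >> mip, 1)
    return side * side * bytes_per_texel

base = 4096  # a 4K texture: 64 MiB at mip 0 uncompressed
for d in [5, 15, 40, 160]:
    m = mip_for_distance(d)
    print(f"{d:4} m -> mip {m}, {mip_bytes(base, m) / 2**20:.2f} MiB resident")
```

The overhead Saberfang mentions appears when the budget is too small to keep even the predicted mips resident, and the engine starts swapping mid-frame.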

Sent from my Galaxy Nexus using Tapatalk


----------



## Defoler

Quote:


> Originally Posted by *KenjiS*
> 
> Well, let me put it another way. One reason I'm dumping my SLI (beyond being disappointed overall in its performance in things I regularly play, and the fact that a 970 handles several titles I enjoy just as well if not better than my SLI setup) is that everyone told me my limitation at this point is not raw power but the fact that games want more and more VRAM. This is backed up by every expert review site and everything I've read that says "2GB is just not enough for 1440p+ gaming".


I have seen more than enough reviews comparing the 2GB and 4GB 770 to see that the performance gain from the extra 2GB at 1440p is not there, because the GPU is just not strong enough.
Comparing a 770 with 2GB to a 970 with 4GB is comparing apples to oranges. You can't claim the 970 performs better because of more memory while ignoring the extra GPU performance.
It's like claiming a Ferrari Enzo is faster than a Hyundai i30 because it has wider tires....

If memory alone was what made games better, we should have seen the Titan Black reign supreme over everything else at 4K, yet the 780 Ti could still perform better in some games, and the 3GB 780 Ti could win over the 4GB 290X. My 4GB 680s performed the same as the norm.

I have yet to see conclusive evidence showing, without a doubt, that extra memory = extra performance without GPU power also being in the mix and making it irrelevant.
Your decision to move to a 970 proves nothing here, as there is also a large GPU performance gain, which most likely far overshadows any memory gain.


----------



## Defoler

Quote:


> Originally Posted by *Saberfang*
> 
> Let's try to think outside the box. A few months ago I was testing my system RAM by removing one stick to check the health of the remaining 2GB. It's not like my system wasn't working, but as soon as I got into some multitasking, performance was dreadful due to the RAM wall and the need to swap data for the active program, since there wasn't room for everything.
> The VRAM situation is similar, but when it comes to textures there is much more room for prediction. You mostly know what you need to display next, as well as how far you are from the object you need to texture, which lets you choose a lower-resolution texture and update it once you get closer and need the detail.
> As long as there is room, you have no issue, but when you hit the wall you start getting overhead proportional to how much memory you are short.
> I think these two are similar scenarios, but a graphics engine should be able to manage its resources far better than conventional system RAM management can.
> 
> I don't know if SoM really hits a 6GB VRAM wall or why, but the really wrong thing here is this: if you know games in development are going to hit that wall, you don't release a high-end product that doesn't meet the requirement, no matter what!
> 
> Sent from my Galaxy Nexus using Tapatalk


So you are saying that going from 4GB dual-channel memory to 2GB single-channel memory is similar to 2GB vs 4GB of VRAM?
This is so different... it's like removing your arm and claiming, based on that experience, that you could walk fine if you removed a leg instead.


----------



## StrongForce

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> When you push that resolution slider in BF4, you're downsampling.
> 
> It's not 1080p anymore: you're rendering at a higher resolution, then compressing it down to 1080.
> 
> Neither a 770 nor a 780 is going to ace BF4 at resolutions higher than 1080.


When I say "maxing", that's without that slider.


----------



## Zipperly

Quote:


> Originally Posted by *ZealotKi11er*
> 
> With my 290X i am hitting ~ 2.2GB with 1440p 150% scaling. 0x MSAA. No point on using MSAA if you are using SSAA.


Using 2xMSAA in combination with the 150% slider looks better than 0xMSAA from an aliasing standpoint, and 4xMSAA is better still, but obviously pretty resource-heavy at that point.


----------



## RagingCain

I wish people in here would not talk about a subject they do not know about.

People who know better see this as the equivalent of Windows 8 saying it needs 4GB of system RAM minimum and Windows 9 saying it needs 12GB minimum *while doing the exact same thing.*

The reason we don't need 6GB of VRAM for 1080P is the simple fact we are a NON-UNIFIED MEMORY ARCHITECTURE. They are using the VRAM for plain storage because *they have to for consoles*. Anybody who doesn't understand that, needs to basically stop talking when we complain because we are trying to get them NOT to do this, since we aren't bloody consoles, and we, generally, have anywhere from 8GB to 32 GB of accessible RAM, pagefiles, ssds, and PCI-e SSD Cards etc.

The main system RAM is for secondary storage and caching, not VRAM. Period. End of discussion.

1920x1080 uses *256MB* of framebuffer with 4xAA and double buffering, or, to be fair, about 512MB with all post-processing included. The MAJORITY of what remains is up to the developer on how to use, and the rest is essentially up to the developer/drivers.

*Memory usage is not linear; it does not go up every freaking year.* It isn't time to need "2GB+" for 1080p. When the resolution stays the same, there is only so much that memory usage can increase without dumping EXTRA crap into VRAM. For comparison's sake, a 2GB frame buffer comes out to a 128-megapixel image per frame; at 16:9, producing that would take a resolution of roughly 15,100x8,500.

When publishers tell developers to do this for the PC, they:
1.) don't have to optimize for a NUMA architecture,
2.) don't have to prioritize assets, and
3.) don't have to write efficient rendering methods.

A 6GB 780 Ti or 6GB Titan is a waste of money; you are paying for extra storage of cached extras. The GPUs are incapable of rendering a frame buffer that even remotely fills that, let alone the crappy GPUs in these consoles.


----------



## Zipperly

Quote:


> Originally Posted by *Defoler*
> 
> I have seen more than enough reviews comparing 2GB and 4GB 770 to see that the performance gain in the extra 2GB for 1440p is not there because the GPU is just not strong enough.


You didn't see any difference because these "reviews" never went beyond the 2GB limitation; had they done so, you would have seen a dramatic difference, because the 2GB card would suddenly fall flat on its face. A lot of you guys are very confused... you think we are saying that the card with additional VRAM is supposed to be faster than its lower-VRAM counterpart in all scenarios regardless of VRAM utilization, and that is where you are wrong.

I explained this in an earlier post when I talked about my 1.5GB GTX 580 with Skyrim + texture mods. I found myself hitting the VRAM limit fairly easily: the game could be running along at 60fps and then, just like that, fall flat on its face when I exceeded the VRAM limitation. I resolved this by selling that card and picking up a 3GB GTX 580, and I never again had the issues in Skyrim that I had with the 1.5GB card; I was also able to add more mods on top of what I already had installed.

Quote:


> If memory alone was what made games better, we should have seen the titan black supreme than everything else in 4K, and still the 780 ti could perform better in some games.


I can play a lot of games at 4K with my 780 and not run out of VRAM, so that really proves nothing. However, there are a few modern games where I have no problem pushing past the 3GB limitation at 4K combined with MSAA.


----------



## Zipperly

Quote:


> Originally Posted by *RagingCain*
> 
> It isn't time to need "2GB+" for 1080P.


Uh huh, we have been hearing this age-old argument for decades and we see how that one turns out........ I had a 2GB GTX 680 and had no problem exceeding 2GB at 1080p with more than a couple of current games. With the new consoles out, it is utterly and completely foolish to believe that 2GB for 1080p will continue to cut it for much longer.


----------



## TopicClocker

Quote:


> Originally Posted by *RagingCain*
> 
> I wish people in here would not talk about a subject they do not know about.
> 
> People who know better see this as the equivalent of Windows 8 saying it needs 4GB of system RAM minimum and Windows 9 saying it needs 12GB minimum *while doing the exact same thing.*
> 
> The reason we don't need 6GB of VRAM for 1080P is the simple fact we are a NON-UNIFIED MEMORY ARCHITECTURE. They are using the VRAM for plain storage because *they have to for consoles*. Anybody who doesn't understand that, needs to basically stop talking when we complain because we are trying to get them NOT to do this, since we aren't bloody consoles, and we, generally, have anywhere from 8GB to 32 GB of accessible RAM, pagefiles, ssds, and PCI-e SSD Cards etc.
> 
> The main system RAM is for secondary storage and caching, not VRAM. Period. End of discussion.
> 
> 1920x1080P, uses *256MB* of framebuffer with 4xAA and being double buffered. The MAJORITY of what remains is up to the developer on how to use, or to be fair about 512MB with all Post Processing including. The rest of what actually remains is essentially up to developer/drivers.
> 
> *Memory usage is not linear; it does not go up every freaking year.* It isn't time to need "2GB+" for 1080p. When the resolution stays the same, there is only so much that memory usage can increase without dumping EXTRA crap into VRAM. For comparison's sake, a 2GB frame buffer comes out to a 128-megapixel image per frame; at 16:9, producing that would take a resolution of roughly 15,100x8,500.
> 
> When publishers tell developers to do this for the PC:
> 1.) Don't have to optimize for a NUMA architecture.
> 2.) Don't have to prioritize assets.
> 3.) Don't have to write efficient rendering methods.
> 
> A 780 Ti 6GB or Titan 6GB is a waste of money, you are paying for extra storage of cached extras. The GPUs are incapable of rendering a frame buffer to even remotely fill that. Let alone the crappy GPU in these consoles.


If unified memory really is the focus, then what can we do other than get the hardware to accommodate it?


----------



## RagingCain

Quote:


> Originally Posted by *Zipperly*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> It isn't time to need "2GB+" for 1080P.
> 
> 
> 
> Uhuh, we have have been hearing this age old argument for decades and we see how that one turns out........ I had a 2GB GTX 680 and had no problems exceeding 2gbs at 1080p with more than a couple of current games. With the new consoles out it is utterly and completely foolish to believe that 2gb's for 1080p will continue to cut it for much longer.

I am so glad you took the time to read nothing of what I wrote in any of my posts in this thread, continue to spew bull, and don't even know what the hell you are talking about.

A game doesn't work at 1080p because of mods? What a surprise: a developer can only optimize a game so much without knowing what the player is going to do to it.

Calculate the individual frame size for 1920x1080, 32-bit Color, 24-Bit Z Buffer, and an 8-bit stencil. Then tell me HOW much you actually need for 1920x1080P.
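Taking that calculation at face value, here is a sketch assuming double-buffered color plus a single depth-stencil surface, everything scaled by the MSAA sample count (actual driver memory layouts vary, so this is indicative only):

```python
# Per-frame buffer cost at 1920x1080 with 32-bit color and a combined
# 24-bit Z + 8-bit stencil buffer (4 bytes per sample each).

W, H = 1920, 1080
COLOR = 4          # 32-bit RGBA
DEPTH_STENCIL = 4  # 24-bit Z + 8-bit stencil

def framebuffer_mib(msaa=1, buffers=2):
    """Double-buffered color plus one depth-stencil surface, in MiB."""
    color = W * H * COLOR * msaa * buffers
    depth = W * H * DEPTH_STENCIL * msaa
    return (color + depth) / 2**20

print(f"no AA:  {framebuffer_mib(1):.1f} MiB")
print(f"4xMSAA: {framebuffer_mib(4):.1f} MiB")
```

Even with 4xMSAA this lands under 100 MiB, which is the point: raw framebuffer cost at 1080p is a small fraction of any modern card's VRAM, and the rest goes to textures, geometry, and caching.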


----------



## specopsFI

Okay, so the issue here is that we are at a discontinuity point when it comes to VRAM and memory management in general. I'm not an expert on game development, but here is how I as a gamer and hardware enthusiast have come to see it.

It is true that for the longest time, games haven't demonstrated the point of extra VRAM. The bottleneck has been somewhere else. Now, with the new consoles having way more memory than the previous gen, there are already a few cases where VRAM actually is becoming an issue to take note of. It is pointless to use history as proof here, the VRAM landscape is changing right now. That being said, it's also pointless to use VRAM usage as proof of VRAM requirements. Usage means nothing. VRAM only becomes an issue when the lack of it actually causes obvious problems. Yes, running out of VRAM is pretty obvious with all the stuttering or perhaps missing textures. But no, running out of VRAM is not the only possible explanation for stuttering or performance issues.

Not surprisingly, Watch_Dogs has been mentioned in this thread several times. It is a perfect example of both the change in VRAM usage and the difficulty of seeing whether or not VRAM is the actual issue. W_D does use a lot of VRAM. It uses much more VRAM on ultra textures than it does on high textures. Most people have experienced how going from ultra to high textures makes the stuttering less distracting. Most people have graphics cards with 2-3GB of VRAM. It's no surprise that people put these things together and conclude that W_D stutters because of lack of VRAM. Only when you actually go the extra mile and really put that theory to test you'll discover that W_D stutters regardless of VRAM usage, regardless of texture quality, regardless of graphics card. Even [H]ardOCP, who were very vocal about W_D being VRAM limited, have come around in their recent 780Ti review, stating that the 3GB 780Ti is actually capable of providing a superior gaming experience compared to the 4GB 290X even with ultra textures and @1600p. It took them a while and even now they still resort to their previous thinking in their GTX 980 article, but there is no going around it: the stuttering in W_D with ultra textures is not a VRAM issue, it's a more general game engine issue. From another point of view: the ultra textures are not the problem, they just make the problem more obvious.

The real challenge in porting PS4/XB1 games to PC is not the lesser VRAM amount of the majority of GPUs but the fundamentally different memory architecture. The new consoles have a shared memory architecture. A gaming PC has separate memory pools for system memory and VRAM. A lazy console port on PC will basically load everything to both system memory and VRAM. I suppose that could work... for the very small minority who have over 4GB of VRAM for 1080p and don't have the need for any extra eye candy above that of the console version. So in general, it's a rotten idea. That is why there needs to be some extra memory management for the PC version and this is where Ubisoft really went south with Watch_Dogs. You can run the game on PC without hitting your VRAM limit and still get stalling and stuttering very similar to running out of VRAM. The game engine just isn't able to keep its memory management in order, it seems to process textures in way too big of portions. You can have unallocated VRAM and still have the game pausing to do memory swaps. That is not a problem that can be solved by throwing more VRAM at it.

VRAM requirements are going up as we speak, no question about it. That is just what a new generation of main platforms does. But it's not just about the amount of VRAM, it's about the rather large shared memory pool. I had high hopes that the new consoles being x86 and GCN would make for easy and effective porting to PCs. Sadly, it seems to not have materialized because of the shared memory. PCs _still_ require a very distinctive approach compared to consoles and some game developers _still_ miss that mark. This is where it all comes down to the subject of this thread. Whether SoM is a lazy port or not remains to be seen. The VRAM requirements are IMHO pointing towards non-optimal texture streaming but it can also be about the actual quality of the textures. If the ultra textures are the best ever seen in a PC game, then kudos. If they are kind of average, then something is wrong with the game engine.

One last thing. I really don't appreciate nonsensical "PC only" options such as uncompressed textures. If you can get the exact same quality with lower requirements, adding an extra option with uncompressed textures is just silly and quite frankly flame bait and a marketing tool for GPU makers. Compression is everyone's friend until you start to lose quality because of it. I hope there is more to the ultra textures in SoM than just them being uncompressed.
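To put concrete numbers on the compression point: block-compressed GPU formats store each 4x4 texel block in 64 bits (BC1) or 128 bits (BC7), versus 4 bytes per texel uncompressed. A quick comparison for a hypothetical 4096x4096 texture with a full mip chain:

```python
# GPU-resident size of one texture under different formats. Bytes-per-texel
# figures come from the BC1/BC7 block sizes (64 and 128 bits per 4x4 block);
# the 4096x4096 texture itself is hypothetical.

FORMATS = {
    "RGBA8 (uncompressed)": 4.0,
    "BC7": 1.0,   # 128 bits / 16 texels
    "BC1": 0.5,   # 64 bits / 16 texels
}

def texture_mib(side, bytes_per_texel, with_mips=True):
    base = side * side * bytes_per_texel
    # A full mip chain adds roughly one third on top of the base level.
    return base * (4 / 3 if with_mips else 1) / 2**20

for name, bpt in FORMATS.items():
    print(f"4096x4096 {name}: {texture_mib(4096, bpt):.1f} MiB")
```

Same asset, an 8x smaller resident footprint under BC1 (lossy, but usually visually close), which is why shipping uncompressed textures as an "ultra" option mostly just inflates VRAM requirements.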


----------



## alexmaia_br

Quote:


> Originally Posted by *RagingCain*
> 
> The reason we don't need 6GB of VRAM for 1080P is the simple fact we are a NON-UNIFIED MEMORY ARCHITECTURE. They are using the VRAM for plain storage because *they have to for consoles*. Anybody who doesn't understand that, needs to basically stop talking when we complain because we are trying to get them NOT to do this, since we aren't bloody consoles, and we, generally, have anywhere from 8GB to 32 GB of accessible RAM, pagefiles, ssds, and PCI-e SSD Cards etc.
> 
> When publishers tell developers to do this for the PC:
> 1.) Don't have to optimize for a NUMA architecture.
> 2.) Don't have to prioritize assets.
> 3.) Don't have to write efficient rendering methods.


That is indeed a good point.
Problem is, as I see it:

It's very likely the games won't be optimized and publishers will tell developers to do this for PC.

Since I want to play the games, and not rage about how it should be, I will just get a new GPU with loads of RAM.

So yeah, I agree with you, but:

1) I want to play the games at max settings
2) The publishers will likely push for unoptimized games because of consoles
3) I like shiny, new stuff (I'm shallow that way)


----------



## RagingCain

Quote:


> Originally Posted by *alexmaia_br*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> The reason we don't need 6GB of VRAM for 1080P is the simple fact we are a NON-UNIFIED MEMORY ARCHITECTURE. They are using the VRAM for plain storage because *they have to for consoles*. Anybody who doesn't understand that, needs to basically stop talking when we complain because we are trying to get them NOT to do this, since we aren't bloody consoles, and we, generally, have anywhere from 8GB to 32 GB of accessible RAM, pagefiles, ssds, and PCI-e SSD Cards etc.
> 
> When publishers tell developers to do this for the PC:
> 1.) Don't have to optimize for a NUMA architecture.
> 2.) Don't have to prioritize assets.
> 3.) Don't have to write efficient rendering methods.
> 
> 
> 
> That is indeed a good point.
> Problem is, as I see it:
> 
> It's very likely the games won't be optimized and publishers will tell developers to do this for PC.
> 
> Since I want to play the games, and not rage about "how it should have to be", I will just get a new GPU with loads of RAM.
> 
> So yeah, I agree with you, but:
> 
> 1) I want to play the games at max settings
> 2) The publishers will likely push for unoptimized games because of consoles
> 3) I like shiny, new stuff (I'm shallow that way)

That is, of course, your prerogative. Completely understandable points; however, keep in mind that with identical GPUs carrying different amounts of memory (e.g. a 3GB model vs. a 6GB model), the larger-VRAM model is usually 1-10% slower than the smaller one due to looser memory timings, both in benchmarks and in heavy VRAM-utilization titles.

Also keep in mind:
1.) I don't believe they really are giving you max settings, I honestly think they are giving you the console experience.
2.) Poor ports should not be rewarded in my opinion.
3.) You should hate being lied to more (but that is my own opinion.)

You are honestly better off, just getting it for the console.

Let no one make a mistake: I am usually on the bleeding edge with technology. As a programmer, I am telling you all that they are lying out their behinds to your faces, and most of you are gobbling it up like Apple users.

I have 32GB of RAM in my system and don't even use it, save for 1% of the time, yet I deemed that a necessity. 6GB of VRAM for 1080p is a mathematical and technological abomination.
Quote:


> Originally Posted by *TopicClocker*
> 
> If unified memory really is the focus, then what can we do other than get the hardware to accommodate it?


Raise hell and awareness.


----------



## n780tivs980

Bet AMD will cash in on this. They won't have to lower the 290X's power usage or fix the awful heat/noise; just up the performance 10-20%, slap 6GB of VRAM on as standard, and people will eat it up as long as it's priced near the 980.


----------



## iSlayer

6GB of VRAM for ultra? Really? It'll be a shock if it isn't a console port, not if it is.


----------



## RagingCain

Quote:


> Originally Posted by *KenjiS*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RagingCain*
> 
> That is, of course, your prerogative. Completely understandable points, however, keep in mind: In identical GPUs with varying amounts of memory i.e. 3GB model vs. 6GB model, the larger VRAM model is usually 1~10% slower than the smaller VRAM model due to slower memory timings both in benchmarks and heavy VRAM utilization titles.
> 
> Also keep in mind:
> 1.) I don't believe they really are giving you max settings, I honestly think they are giving you the console experience.
> 2.) Poor ports should not be rewarded in my opinion.
> 3.) You should hate being lied to more (but that is my own opinion.)
> 
> You are honestly better off, just getting it for the console.
> 
> Let no one make a mistake: I am usually on the bleeding edge with technology. As a programmer, I am telling you all that they are lying out their behinds to your faces, and most of you are gobbling it up like Apple users.
> 
> 
> 
> The problem still is a practical one: do you want to play new games? Yes? Then you need more VRAM, because THAT is what the companies are doing. I don't know why; I'm supposing that, at the core, the engines are specifically being written to not stream assets from the HDD/system RAM as effectively, and that developers are heavily using VRAM in the place of system RAM. Very possibly because the new consoles are very weak in terms of HDD and streaming performance, from what I've seen.
> 
> Also, I DID post a benchmark from HardOCP earlier in the thread with a 6GB 780 vs. a 3GB 780, where the 6GB 780 DID perform better in several benchmarks by a decent margin. Something like 10-ish FPS in a few titles. So yeah. Not IMPOSSIBLE.
> 
> It doesn't matter if that's how it should be or whatever. Watch_Dogs is not the only example; plenty of games use more than 2GB, especially going above 1080p. Now granted, 6GB is excessive, but most of us here are thinking that even that requirement is likely WAY beyond the actual need, and 4GB folks should be fine. It all depends on how things are written.
> 
> Now, this happened before. The last time I saw sudden huge jumps in PC requirements, where a GPU I had just bought suddenly couldn't run anything, was back when the 360 came out. We're at a transition point where the engines and whatnot are brand new and optimized incredibly poorly. Heck, I'd argue this is the case on the consoles too, especially given they're struggling to hit 1080p on every game (The Order 1886 anyone? With its 2.35:1 aspect ratio and 24fps because of "cinematic feel"). It will all level out, and then we'll start seeing more improvement in graphical fidelity. At this point in the lifespan it's more about "making it run" than "making it look exceptionally good," folks...
> 
> Oh, and Shadow of Mordor is running variable resolution on the Xbox One, from 720-900p according to people with early copies, locked at 30fps. So there is that to keep in mind...
> 
> Now let's get back to Shadow of Mordor. At the end of the day, we have little evidence that it's "simply" a console port, because we have not played it yet. Nor do we know if it's a bad one, since we haven't played it yet. The PC Gamer footage looked rather good to me.
> 
> At the end of the day we'll know more. Now I'm off to reformat my system and everything.
Click to expand...

If you mean this one: http://www.hardocp.com/article/2014/06/11/asus_strix_gtx_780_oc_6gb_review_exclusive/4#.VClsJfldV8E

This is a review of an overclocked model, not an identical model of the same card. With identical cards at identical speeds, just one with x GB of VRAM vs. 2x GB of VRAM, the smaller card usually edges out. The only exception is when you are hitting that elusive VRAM wall or forcing the drivers to compensate.

The last consoles were released with top-of-the-line GPUs (custom built, too) in an era where dual core was emerging, and those consoles were semi-multithreaded.

Watch_Dogs runs terribly without being modified, regardless of VRAM amount.

You are also forgetting that the image quality is lower than that of yesteryear's games, which used 1/2 to 1/3 of the VRAM. Play the game on console if you "need" to play it. It's seemingly not designed for PCs anyway. Why brute-force the issue?


----------



## PhilWrir

Cleaned

Let's get this back on topic and back to being professional, guys.


----------



## Yungbenny911

This VRAM argument again... "My GPU gets 3.6 GB usage in BF4 @ 1440p on my 4GB 970, 2GB would choke" <--- That's a noob right there. People just need to educate themselves with hands-on experience.

Go buy a 3GB 780, a 6GB 780, and a 4K monitor, and do YOUR testing with factual results to prove it. All these walls of text won't help anyone, so just freaking GO BUY IT!!! When you're done, come back and talk about VRAM being an issue.

We'll see how well that 6gb would "boost" performance on the 780 compared to the 3gb version...


----------



## Zipperly

Quote:


> Originally Posted by *Yungbenny911*
> 
> This VRAM argument again... "My GPU gets 3.6 GB usage in BF4 @ 1440p on my 4GB 970, 2GB would choke" <--- That's a noob right there. People just need to educate themselves with hands-on experience.


Since you speak of hands-on experience (something I have plenty of) regarding VRAM usage, I can clearly tell you that BF4 at only 1920x1080 Ultra settings pushed my 2GB GTX 680 beyond its VRAM limitations. Some maps would play perfectly fine, while on others I would be humming along at a nice, smooth, consistent framerate only to "suddenly" run into large amounts of stutter, to the point that the frame rate would plummet into the single digits; then, of course, a few seconds later it might clear up again. The whole point is that I was teetering on a cliff's edge with VRAM under certain situations. Moving to a 4GB 680 resolved every single one of these issues for me using identical resolution and settings in BF4. The issue here was not a lack of raw power; had that been the case, my 4GB 680 would have suffered the same issues. The problem was a lack of VRAM... period.

It absolutely amazes me how some of you deny the possibility of going over a card's set VRAM amount. I had no trouble doing this with modded Skyrim on my 1.5GB GTX 580, and I also had no trouble resolving the issue by moving on to a 3GB GTX 580.
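For what it's worth, the stutter Zipperly describes is consistent with simple arithmetic: once textures spill out of VRAM, they have to be refetched over PCIe, which is an order of magnitude slower than local GDDR5. A rough back-of-the-envelope sketch (bandwidth figures are theoretical peaks, and the 500MB spill is a made-up example; real sustained rates are lower):

```python
def transfer_ms(megabytes, gb_per_s):
    # seconds = (MB / 1000) / (GB/s); times 1000 for milliseconds, so ms = MB / (GB/s)
    return megabytes / gb_per_s

PCIE3_X16 = 15.75   # GB/s, theoretical peak of a PCIe 3.0 x16 link
GDDR5_770 = 224.3   # GB/s, GTX 770 reference memory bandwidth

spill = 500  # hypothetical MB of textures that no longer fit in VRAM
print(f"over PCIe: {transfer_ms(spill, PCIE3_X16):.1f} ms")  # ~31.7 ms
print(f"from VRAM: {transfer_ms(spill, GDDR5_770):.1f} ms")  # ~2.2 ms
```

At 60fps you have a 16.7ms frame budget, so a 500MB spill costs roughly two whole frames even at peak PCIe rates: exactly the "smooth, then suddenly single digits" pattern he's describing.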


----------



## Paladin Goo

Quote:


> Originally Posted by *Zipperly*
> 
> Since you speak of hands on experience "something I have plenty of" regarding vram usage I can clearly tell you that BF4 at only 1920X1080p Ultra settings pushed my 2gb GTX 680 beyond its vram limitations. Some maps would play perfectly fine while on others I would be humming along at a nice smooth consistent framerate only to "suddenly" run into large amounts of stutter to the point that the frame rate would plummet down into the single digits then of course a few seconds later it might clear up again. The whole point is that I was teetering on a cliffs edge with vram under certain situations. Moving to a 4gb 680 resolved every single one of these issues for me using identical resolution + settings in BF4. The issue here was not a lack of raw power, had that been the case then my 4gb 680 would have suffered the same issues, the problem was a lack of vram... period.
> 
> Absolutely amazes me how some of you deny the possibility of going over a cards set vram amount, I had no trouble doing this with modded skyrim on my 1.5gb GTX 580 and I also had no trouble resolving the issue by moving on to a 3gb GTX 580.


I never had a single issue on my 680s.


----------



## StrongForce

Quote:


> Originally Posted by *RagingCain*
> 
> I wish people in here would not talk about a subject they do not know about.
> 
> People who know better see this as Windows 8 saying it needs 4GB of System RAM minimum, to Windows 9 saying it needs 12GB of RAM minimum. *Doing the exact same thing.*
> 
> The reason we don't need 6GB of VRAM for 1080P is the simple fact we are a NON-UNIFIED MEMORY ARCHITECTURE. They are using the VRAM for plain storage because *they have to for consoles*.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Anybody who doesn't understand that, needs to basically stop talking when we complain because we are trying to get them NOT to do this, since we aren't bloody consoles, and we, generally, have anywhere from 8GB to 32 GB of accessible RAM, pagefiles, ssds, and PCI-e SSD Cards etc.
> 
> The main system RAM is for secondary storage and caching, not VRAM. Period. End of discussion.
> 
> 1920x1080 uses *256MB* of framebuffer with 4xAA and double buffering. The MAJORITY of what remains is up to the developer to use; or, to be fair, about 512MB with all post-processing included. What actually remains is essentially up to the developer/drivers.
> 
> *Memory usage is not linear, it does not go up every freaking year.* It isn't time to need "2GB+" for 1080P. When the resolution stays the same, there is only so much more Memory usage can increase without dumping EXTRA crap into VRAM. For comparison's sake, a 2GB frame buffer comes out to produce a 128 Megapixel image per frame. The resolution needed to create that is 56,633x8300
> 
> When publishers tell developers to do this for the PC:
> 1.) Don't have to optimize for a NUMA architecture.
> 2.) Don't have to prioritize assets.
> 3.) Don't have to write efficient rendering methods.
> 
> 
> 
> A 780 Ti 6GB or Titan 6GB is a waste of money, you are paying for extra storage of cached extras. The GPUs are incapable of rendering a frame buffer to even remotely fill that. Let alone the crappy GPU in these consoles.


Umm, what you don't seem to get is, for instance, as someone mentioned with Skyrim: the mods take a lot of VRAM. Even if it's the developers' fault for not optimizing their game/engine properly, we don't care; we just know that we need more VRAM to run those mods. That's why some people get the 6GB versions. A friend asked me which graphics card I would get if I had the money; I said the 780 6GB.

Also, like someone said, we will see if the game actually has epic textures or if it's just a VRAM vampire for no reason. I also think things like view distance could take more VRAM, for instance, so maybe the game actually MAKES USE of all that VRAM.

Thanks for bringing info on the technical aspect though
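For anyone who wants to check RagingCain's buffer arithmetic, the raw render-target math is only a few lines. This is a rough sketch: it assumes 32-bit color and one matching depth/stencil target, and ignores G-buffers, shadow maps, and driver overhead, which is where the real totals come from:

```python
def buffer_mib(width, height, bytes_per_pixel=4, msaa=1, count=1):
    # raw size in MiB of `count` render targets at the given resolution
    return width * height * bytes_per_pixel * msaa * count / 2**20

# 1080p, 32-bit color, 4x MSAA, double buffered, plus one depth/stencil target
color = buffer_mib(1920, 1080, msaa=4, count=2)   # ~63.3 MiB
depth = buffer_mib(1920, 1080, msaa=4)            # ~31.6 MiB
print(round(color + depth, 1), "MiB")             # ~94.9 MiB
```

Whatever the exact figure, the point stands: the frame buffers themselves amount to tens or a few hundred MiB at 1080p. Everything beyond that is textures and cached assets, which is the developer's call.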




----------



## Zipperly

Quote:


> Originally Posted by *Raven Dizzle*
> 
> I never had a single issue on my 680s.


Well, that's nice, but I clearly know that at 1080p full Ultra, 2GB 680s will and do hit the VRAM wall. Same thing with BF3 and my 1.5GB 580... I moved on to the 3GB model and all stuttering issues due to low VRAM were eliminated.


----------



## Paladin Goo

Quote:


> Originally Posted by *Zipperly*
> 
> Well, that's nice, but I clearly know that at 1080p full Ultra, 2GB 680s will and do hit the VRAM wall. Same thing with BF3 and my 1.5GB 580... I moved on to the 3GB model and all stuttering issues due to low VRAM were eliminated.


I actually never hit that wall on my 680s with BF4, believe it or not. There were games I did hit it with, but BF4 wasn't one of them. Aside from when Nvidia broke SLI support for BF4 last December, I've not had a single issue. Usage was always around 1.7GB or so.


----------



## Yungbenny911

Quote:


> Originally Posted by *Zipperly*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Yungbenny911*
> 
> This V-ram argument again... "My GPU gets 3.6 MB usage in bf4 @ 1440p on my 4gb 970, 2gb would choke" <--- That's a Noob right there. People just need to educate themselves with hands on experience.
> 
> 
> 
> Since you speak of hands on experience "something I have plenty of" regarding vram usage *I can clearly tell you that BF4 at only 1920X1080p Ultra settings pushed my 2gb GTX 680 beyond its vram limitations.* Some maps would play perfectly fine while on others I would be humming along at a nice smooth consistent framerate only to "suddenly" run into large amounts of stutter to the point that the frame rate would plummet down into the single digits then of course a few seconds later it might clear up again. The whole point is that I was teetering on a cliffs edge with vram under certain situations. Moving to a 4gb 680 resolved every single one of these issues for me using identical resolution + settings in BF4. The issue here was not a lack of raw power, had that been the case then my 4gb 680 would have suffered the same issues, the problem was a lack of vram... period.
> 
> Absolutely amazes me how some of you deny the possibility of going over a cards set vram amount, I had no trouble doing this with modded skyrim on my 1.5gb GTX 580 and I also had no trouble resolving the issue by moving on to a 3gb GTX 580.
Click to expand...

Yet another wall of text without proof to back it up. Just stop, you clearly don't know what you're talking about. I've had hands-on experience WITH PROOF in BF4 @ 4K, and yes, you're completely spreading misinformation. You ran into a performance wall, end of story...

I was able to play at 4K res because I OC'ed my 770s in SLI to 1400MHz (core) / 1950MHz (mem); running at stock clocks resulted in 1-2 lockups and whatnot. At 1400MHz they played BF4 at 4K very well, much better than I ever imagined. I can bet a million you never tested a 770 or 680 at such a high OC in games; that's why you believe you hit a VRAM wall...

*BF4 @ 160%, and 200% 1080p.*


Spoiler: Warning: Spoiler!











*BF4 @ 5120x2160p Downsampled Resolution (OGSSAA)*


Spoiler: Warning: Spoiler!


----------



## TopicClocker

Quote:


> Originally Posted by *Zipperly*
> 
> Well, that's nice, but I clearly know that at 1080p full Ultra, 2GB 680s will and do hit the VRAM wall. Same thing with BF3 and my 1.5GB 580... I moved on to the 3GB model and all stuttering issues due to low VRAM were eliminated.


Quote:


> Originally Posted by *Raven Dizzle*
> 
> I actually never hit that wall on my 680s with BF4, believe it or not. There were games I did hit it with, but BF4 wasn't one of them. Aside from when Nvidia broke SLI support for BF4 last December, I've not had a single issue. It was always around 1.7GB or so.


I'm gonna have to admit I had no problem with my 2GB 760 either in Battlefield 4 on Ultra at 1080p.

Two games which I did have trouble with are Watch Dogs and Titanfall on their max textures. If I dropped the textures a notch, I had no problems.


----------



## TopicClocker

Quote:


> Originally Posted by *perfectblade*
> 
> it's just getting silly at this point. we're supposed to spend $600 every 2 years to play games at 1080p...with higher textures (somewhat) than consoles? and people defend this?


It's a shame PC gamers are now getting VRAM scares. PC gamers never had this problem before the PS4 and Xbox One released; it was purely GPU power that was the concern.
Upgrading was mostly for faster hardware. The "upgrade cycle" is for those who want the latest and greatest hardware; it's totally unnecessary unless you want to max every game that releases, which is pretty much impossible.


----------



## AK-47

So what's the requirements to run this game at 1440 or 4K on ultra?


----------



## Capt

Quote:


> Originally Posted by *AK-47*
> 
> So what's the requirements to run this game at 1440 or 4K on ultra?


Probably 12GB VRAM.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Defoler*
> 
> I have seen more than enough reviews comparing 2GB and 4GB 770s to see that the performance gain from the extra 2GB at 1440p is not there, because the GPU is just not strong enough.
> Comparing a 770 with 2GB to a 970 with 4GB is comparing apples to oranges. You can't claim the 970's performance is better because of more memory while ignoring the GPU's extra performance.
> It's like claiming the Ferrari Enzo is faster than a Hyundai i30 because it has wider tires....
> 
> If memory alone was what made games better, we should have seen the Titan Black reign supreme over everything else in 4K, and still the 780 Ti could perform better in some games. Or the 780 Ti with 3GB winning over the 290X with 4GB. My 680 4GBs performed the same as the norm.
> 
> I have yet to see conclusive evidence which shows, without a doubt, that extra memory = extra performance without GPU power also being in the mix and making it irrelevant.
> Your decision to move to a 970 is irrelevant, as there is also a large GPU performance gain which most likely far overshadows the memory gain.


This....

A 770 will run out of processing power before it has the ability to juggle more than 2GB of textures...

Every review that has ever put a 770 2GB against a 770 4GB, or a 680 2GB against a 680 4GB, has come to the same conclusion...

Every benchmark has shown the SAME performance...


----------



## Yungbenny911

When 770s in SLI @ 1400MHz (core) / 1950MHz (mem) are getting 28-65 FPS @ 200%, I wonder what he thinks his 680 should do. People are funny...


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *RagingCain*
> 
> I wish people in here would not talk about a subject they do not know about.
> 
> People who know better see this as Windows 8 saying it needs 4GB of System RAM minimum, to Windows 9 saying it needs 12GB of RAM minimum. *Doing the exact same thing.* The reason we don't need 6GB of VRAM for 1080P is the simple fact we are a NON-UNIFIED MEMORY ARCHITECTURE. They are using the VRAM for plain storage because *they have to for consoles*. Anybody who doesn't understand that, needs to basically stop talking when we complain because we are trying to get them NOT to do this, since we aren't bloody consoles, and we, generally, have anywhere from 8GB to 32 GB of accessible RAM, pagefiles, ssds, and PCI-e SSD Cards etc.
> 
> The reason we don't need 6GB of VRAM for 1080P is the simple fact we are a NON-UNIFIED MEMORY ARCHITECTURE. They are using the VRAM for plain storage because *they have to for consoles*. Anybody who doesn't understand that, needs to basically stop talking when we complain because we are trying to get them NOT to do this, since we aren't bloody consoles, and we, generally, have anywhere from 8GB to 32 GB of accessible RAM, pagefiles, ssds, and PCI-e SSD Cards etc.
> 
> The main system RAM is for secondary storage and caching, not VRAM. Period. End of discussion.
> 
> 1920x1080 uses *256MB* of framebuffer with 4xAA and double buffering. The MAJORITY of what remains is up to the developer to use; or, to be fair, about 512MB with all post-processing included. What actually remains is essentially up to the developer/drivers.
> 
> *Memory usage is not linear, it does not go up every freaking year.* It isn't time to need "2GB+" for 1080P. When the resolution stays the same, there is only so much more Memory usage can increase without dumping EXTRA crap into VRAM. For comparison's sake, a 2GB frame buffer comes out to produce a 128 Megapixel image per frame. The resolution needed to create that is 56,633x8300
> 
> When publishers tell developers to do this for the PC:
> 1.) Don't have to optimize for a NUMA architecture.
> 2.) Don't have to prioritize assets.
> 3.) Don't have to write efficient rendering methods.
> 
> A 780 Ti 6GB or Titan 6GB is a waste of money, you are paying for extra storage of cached extras. The GPUs are incapable of rendering a frame buffer to even remotely fill that. Let alone the crappy GPU in these consoles.


THANK YOU!!!


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Yungbenny911*
> 
> When 770's in SLI @ 1400Mhz (core)/ 1950Mhz (mem) are getting 28-65 FPS @ 200%. I wonder what he thinks his 680 should do
> 
> 
> 
> 
> 
> 
> 
> . People are funny...


I tried explaining this to the dude in reference to the 690.

I showed him benchmarks against a Titan; the 6GB buffer never helped the Titan beat the 690 IN ANY GAME across all the different resolutions and settings they used...

He started raging....


----------



## ZealotKi11er

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> I tried explaining this to the dude in reference to the 690
> 
> Showed him benchmarks against a Titan, the 6gb buffer never helped the titan beat 690 AT ANY GAME in all the different resolutions and settings they used...
> 
> He started raging....


The reason the GTX 770 can't use 4GB is not GPU power; it's bandwidth limited. That's why there is not much difference. The Titan is never going to use 6GB.
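The bandwidth point is easy to check from the memory spec: peak bandwidth is just the per-pin data rate times the bus width. A quick sketch using the reference 7.0 Gbps GDDR5 clocks:

```python
def bandwidth_gbs(gbps_per_pin, bus_width_bits):
    # peak memory bandwidth in GB/s = per-pin rate (Gbps) x bus width (bits) / 8
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gbs(7.0, 256))  # GTX 770:    224.0 GB/s
print(bandwidth_gbs(7.0, 384))  # GTX 780 Ti: 336.0 GB/s
```

Same data rate, 50% wider bus, so the 780 Ti has half again the bandwidth to actually move a large working set.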


----------



## pterois

This definitely sucks, and The Evil Within has similar VRAM requirements. I just got my GTX 970 and plan to get a second one next month for a 4K setup. I was looking forward to Shadow of Mordor, and I will be very disappointed not to be able to enjoy it at its best. I'm sure that Assassin's Creed Unity will have comparable VRAM requirements for ultra textures. It does raise the question: what is Nvidia doing? Working closely with most of these developers, optimizing the games to run best on Nvidia hardware, and then releasing both their new flagships with only 4GB of VRAM. The Maxwell Titan is still some months away, but that shouldn't be required to properly enjoy upcoming games, especially at lower resolutions.


----------



## TFL Replica

I wouldn't put too much stock in recommended specs. Wait for the game's release. Everything will become clear.


----------



## iSlayer

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> HE THINKS HE'S HITTING THE VRAM WALL BECAUSE HE'S PUSHING THE RESOLUTION SCALE TO 200%...
> 
> ESSENTIALLY DOWNSAMPLING, WHICH IS RENDERING AT 4K AND SCALING IT DOWN TO 1080P
> 
> ON TOP OF THIS HE'S APPLYING MSAA
> 
> AND TELLING EVERYONE HE'S HITTING THE VRAM WALL!
> 
> seriously.... -__-


No one said the VRAM myth obsessed aren't letting fear and panic cloud their judgment.
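For the record, the 200% resolution scale being argued about is not a 1080p workload at all. The slider scales each axis, so the pixel count (and every screen-sized buffer along with it) goes up with the square of the setting:

```python
def scaled_pixels(width, height, scale):
    # pixels actually rendered when the resolution-scale slider multiplies each axis
    return int(width * scale) * int(height * scale)

native = scaled_pixels(1920, 1080, 1.0)  # 2,073,600 (1080p)
scaled = scaled_pixels(1920, 1080, 2.0)  # 8,294,400 (3840x2160, i.e. 4K)
print(scaled / native)                   # 4.0
```

So 200% scale at "1080p" is really 4K rendering, quadruple the pixels, before MSAA multiplies the buffers again.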


----------



## Chargeit

Did anyone stop to think that the reason they're doing this "6gb" thing is to gain attention?

I didn't think about the game myself until reading this "6gb for ultra textures" stuff.

A good marketing move.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Chargeit*
> 
> Did anyone stop to think that the reason they're doing this "6gb" thing is to gain attention?
> 
> I didn't think about the game myself until reading this "6gb for ultra textures" stuff.
> 
> A good marketing move.


Unless, of course, they did something shady with the settings or optimization....

Then it's going to backfire.

All that aside, the gameplay seems good enough to be one of the best of the year.


----------



## Yungbenny911

Quote:


> Originally Posted by *Chargeit*
> 
> Did anyone stop to think that the reason they're doing this "6gb" thing is to gain attention?
> 
> I didn't think about the game myself until reading this "6gb for ultra textures" stuff.
> 
> A good marketing move.


Well... Negative attention sells if it's used properly, and in this case, it is lol. Now people will want to buy the game just to see if it really needs 6GB at 1080p. Just like how the Crysis series was used more for benchmarking and its visuals than for actual game completion (I did complete 2 & 3 though).


----------



## KenjiS

Game will unlock in 22 hours according to Steam

Weird. But still.. WE SHALL HAVE OUR ANSWER!

Yes I will try to do some form of testing with my SLI before ripping them out for the 970


----------



## Chargeit

Yea, I had zero interest in the game until I heard this.

I'm still not buying it, least not yet. However, whenever higher vram Maxwell come around, I'm getting one and this will be a game high on my to do list. Not because I want to play it, but because I want to compare.


----------



## criznit

This game was on my radar for a while and I will check it out. I can't wait to see how the sHD pack will make the game look and hopefully it will be night and day between the vanilla ultra textures and the OMG6GBCARDYARRRR pack.


----------



## TopicClocker

Quote:


> Originally Posted by *KenjiS*
> 
> Game will unlock in 22 hours according to Steam
> 
> Weird. But still.. WE SHALL HAVE OUR ANSWER!
> 
> Yes I will try to do some form of testing with my SLI before ripping them out for the 970


Ugh, if I still had my card I would have probably tested it already; I have my ways of getting early copies.
I don't feel like waiting for the 6-8GB models, as a few upcoming PS4 exclusives have really got my attention.

My savings for a high-end GPU won't survive another month because of them lol.


----------



## KenjiS

Quote:


> Originally Posted by *TopicClocker*
> 
> Ugh If I still had my card I would of probably tested it already, I have my way of getting early copies.
> I don't feel like waiting for 6-8GB models as a few upcoming PS4 exclusives have really got my attention.
> 
> My savings for a high-end GPU wont survive another month because of them lol.


I'm still getting stuff installed. Then I was going to run a fresh set of baseline benchmarks for comparison with the 970, since reformatting will probably increase some of my numbers, and I'd assume it would not be fair to compare otherwise.


----------



## Chargeit

I'm not even sure what this game is about.

It looks like Assassin's Creed meets Lord of the Rings, right? It just isn't something I look forward to. I played one Assassin's Creed game. It was cool, but not something I want to revisit year after year.

I'd take a good "Lord of the Rings" Skyrim-style RPG myself. Maybe even a "Neverwinter Nights / Dragon Age" style LotR. That would be cool.


----------



## Zipperly

Quote:


> Originally Posted by *Yungbenny911*
> 
> Yet another wall of text without proof to back it up. Just stop, you clearly don't know what you're talking about. I've had hands-on experience WITH PROOF in BF4 @ 4K, and yes, you're completely spreading misinformation. You ran into a performance wall, end of story...
> 
> I was able to play at 4k res because i OC'ed my 770's in SLI to 1400Mhz (core)/ 1950Mhz (mem), running at stock clocks resulted in 1-2 lockup's and what not. At 1400Mhz, they played bf4 at 4k very well, much better than i ever imagined.I can bet a million you never tested a 770 or 680 at such high OC on games, that's why you believe you hit a v-ram wall...
> 
> *BF4 @ 160%, and 200% 1080p.*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *BF4 @ 5120x2160p Downsampled Resolution (OGSSAA)*
> 
> 
> Spoiler: Warning: Spoiler!


I speak from personal experience, so go ahead and try to pretend that you know more than me on this subject, because I can guarantee that you do not! And speaking of a great wall of text ^ By the way, it's very apparent you don't know what the hell you are talking about; overclocking your GPU will not substitute for running out of VRAM. But if it makes you feel any better, my 680 was at 1380 on the core.


----------



## Kriant

Sooo... my quad R9 290X setup will not be enough, just because of the 4GB VRAM limitation X_X. OMG, I am sort of intrigued, but also a bit sad if that's true.


----------



## Jaycz

Quote:


> Originally Posted by *Chargeit*
> 
> I'd take a good "Lord of the Rings" Skyrim style RPG myself. Maybe even a "Neverwinter nights / Dragon age" style lotr. That would be cool.


That would be amazing.


----------



## kingduqc

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> I tried explaining this to the dude in reference to the 690
> 
> Showed him benchmarks against a Titan, the 6gb buffer never helped the titan beat 690 AT ANY GAME in all the different resolutions and settings they used...
> 
> He started raging....


There need to be more tests done on VRAM limitations and how they affect smoothness. I'm 100% sure you won't see it on an FPS bar graph, but you might feel it (stuttering). I was on a 1.5GB 580 and I did not notice any problems, but that might not be as extreme as needing 6GB on a 2GB card. Anyone have a 690 or dual/triple 680s/770s with 2GB of VRAM who wants to test it out and post results for this particular game?


----------



## Chargeit

Quote:


> Originally Posted by *Jaycz*
> 
> That would be amazing.


Wouldn't it be.

I mean, I'm sure this one will be fun enough, but I can only picture poor Tolkien turning in his grave over what they're doing with his work on this one. No respect, man.

I don't picture us ever getting a competent Skyrim-style "LotR"; too much work. However, the Neverwinter Nights / Dragon Age style could and should be done.

Still, if the reviews are right for this game, I'll get it sooner rather than later. If the reviews aren't great, then I might get it to test the 6GB VRAM, at a highly discounted price.


----------



## WhiteCrane

What if you are willing to play at 720p?


----------



## ElectroManiac

Here is a benchmark using a gtx 970 on 1080p


----------



## Flames21891

Quote:


> Originally Posted by *ElectroManiac*
> 
> Here is a benchmark using a gtx 970 on 1080p
> 
> 
> Spoiler: Warning: Spoiler!


Well, it certainly ran great on a single 970. Assuming those were actually Ultra textures (I believe it states you need to download an HD texture pack to actually use them), it seems the game was simply blowing smoke about the 6GB VRAM requirement.

Now to see how my 2GB cards hold out.


----------



## n780tivs980

Quote:


> Originally Posted by *Yungbenny911*
> 
> Well... Negative attention sells if it's used properly, and in this case, it is lol. Now people will want to buy the game just to see if it really needs 6GB at 1080p. Just like how the Crysis series was used more for benchmarking and its visuals than for actual game completion (I did complete 2 & 3 though).


Nah, all that will come out of this publicity is that a lot more people will pirate it first to see how the 6GB limitation works before deciding whether or not to buy it.

Quote:


> Originally Posted by *ElectroManiac*
> 
> Here is a benchmark using a gtx 970 on 1080p


Looks amazing tbh, and that is apparently without the 6GB HD textures. Can't wait till tomorrow. I wonder if the 780 Ti is going to fare better than the 980 here because of the heavy AA options.


----------



## Leopard2lx

What kind of AA was he running in the video benchmark? I didn't see an AA option in those menus.


----------



## Chargeit

Quote:


> Originally Posted by *WhiteCrane*
> 
> What if you are willing to play at 720p?


Then I'd say get off that Xbox One and load it up on your PC.

I think I'd lower other settings before going for a lower res.


----------



## fashric

Quote:


> Originally Posted by *n780tivs980*
> 
> Looks amazing tbh, and that is apparently without the 6GB HD textures. Can't wait till tomorrow. I wonder if the 780 Ti is going to fare better than the 980 here because of the heavy AA options.


Really? It looked distinctly average to me; hopefully it's just the YouTube compression.


----------



## Chargeit

Quote:


> Originally Posted by *fashric*
> 
> Really? Looked distinctly average too me, hopefully its just the youtube compression.


That's because your eyes aren't used to the goodness that comes with requiring 6GB of VRAM. Once the system shock of seeing all that eye candy wears off, you'll realize you just saw the best-looking damned textures known to man.

But yeah, it didn't look like much if you ask me. However, YouTube has a way of doing that. That wasn't the highest quality gaming video I've seen; he must have had to compress it down hard to upload it.


----------



## The Source

Quote:


> Originally Posted by *n780tivs980*
> 
> Nah all that will come out of this publicity is alot more people will pirate it first to see how the 6gb limitation works before deciding to buy it or not.
> Looks amazing tbh, and that is apparntly without the 6gb hd textures. Cant wait till tomorrow, I wonder if the 780ti is going to fare better over the 980 here because of the heavy aa options.


It showed Ultra texture quality was selected.


----------



## Chargeit

Quote:


> Originally Posted by *The Source*
> 
> It showed Ultra texture quality was selected.




The question is, was it installed.


----------



## hanzy

Sucks that it isn't releasing at midnight, but at 12 PM tomorrow afternoon...
What a weird time for a launch, eh?

Most people will be at work, or taking a lunch break, if you do that sort of thing...

I said I wasn't going to preorder after WD, but I did preorder this due to the good reviews.


----------



## KenjiS

Quote:


> Originally Posted by *The Source*
> 
> It showed Ultra texture quality was selected.


From my understanding, NO, the PC Gamer video was technically High textures, as the Ultra pack isn't available yet. It will be available tomorrow.


----------



## Pip Boy

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> THANK YOU!!!


This forum needs a reddit style upvote system so at least there is a chance that people get to read a sensible comment rather than have 40 pages of mostly speculation.


----------



## The Source

Quote:


> Originally Posted by *phill1978*
> 
> This forum needs a reddit style upvote system so at least there is a chance that people get to read a sensible comment rather than have 40 pages of mostly speculation.


That system is extremely flawed and open to manipulation. I get what you're saying, though.


----------



## 970Rules

Quote:


> Originally Posted by *Flames21891*
> 
> Well it certainly ran great on a single 970.


The GTX 970 is a monster, and if it didn't run every single thing maxed out at 1080p, I'd say that alone would prove this game is a half-assed port!
Anyone with a brain knows this 6GB thing is a sick joke. It's like when devs say you need an i7 for a game, yet i5s lay the smack down on those same games!
No game at 1080p needs 6GB to "run" a level; only consoles need to load everything into VRAM. The rest is just cache. Using video memory as a giant storage unit is exactly what a cache is.
4GB of VRAM will be fine for running around a level without waiting on system RAM to hand the card the textures it needs. KNOW THIS: the textures a level actually needs right now will FIT just fine in 4GB.
It only needs to load the area you're playing in; everything else held in memory is cache. That's what so many people don't get. I'd even say they refuse to understand it, trolling with "you need 6GB, the 970/980 is already outdated, lulz."

That 6GB of VRAM is smoke blown up people's collective dumb asses, and it's beyond ******ed to claim you need that much to max out anything at 1080p in a PC game in 2014.

The trolls saying otherwise need to learn never to take a dev's word for it when it's a ****** statement. I have never seen a properly coded PC game use that much ACTIVE VRAM!

I have seen a Titan's VRAM get filled by a larger cache, BUT THAT'S CACHE!

I swear to god, it would make so much more sense if the devs behind that 6GB "suggestion" were all using Titans and went "oh wow, it's nearly using all 6GB of VRAM, I'd better put a 6GB suggestion on max settings." It's cache, you blockheads!

Since this is a big open-world game, think of Skyrim: say you start at one end of the world and, over an hour, travel as far as you can to the other side.

Guess what: where you started an hour ago, now that you're on the other side of the world, is NOT in video memory anymore. You don't need it to be "ACTIVE."
If you fast travel back to where you started, it will load that area back in, and the place you just left is no longer active, making room for the new active parts.
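The cache-versus-working-set point above can be sketched as a toy LRU cache (the zone names and sizes below are made up for illustration, not taken from any game):

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture cache: VRAM beyond the active working set just holds
    recently used assets, and evicting them is cheap, so a high 'usage'
    reading overstates what the game actually needs resident."""
    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.cache = OrderedDict()  # zone name -> size in MB

    def request(self, name, size_mb):
        if name in self.cache:              # already resident: mark recently used
            self.cache.move_to_end(name)
            return "hit"
        while sum(self.cache.values()) + size_mb > self.capacity_mb:
            self.cache.popitem(last=False)  # evict least-recently-used zone
        self.cache[name] = size_mb
        return "miss"

vram = TextureCache(capacity_mb=4096)      # a 4GB card
vram.request("starting_zone", 1500)
vram.request("far_zone", 1500)             # travel across the map
vram.request("huge_zone", 2000)            # starting_zone is evicted to make room
print("starting_zone resident:", "starting_zone" in vram.cache)  # False
```

The old zone falling out of the cache is exactly the Skyrim scenario: nothing breaks, it just gets streamed back in on fast travel.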


----------



## Sequences

About time. Ultra is finally out of reach for most people.


----------



## fashric

Quote:


> Originally Posted by *phill1978*
> 
> This forum needs a reddit style upvote system so at least there is a chance that people get to read a sensible comment rather than have 40 pages of mostly speculation.


Haha you can tell you spend too much time on reddit when you see good posts and then look for the upvote arrow ^^
Quote:


> Originally Posted by *Chargeit*
> 
> That's because your eyes aren't used to the goodness that comes with requiring 6gb of Vram. Once the system shock of seeing all that eye candy wears off, you'll realize you just saw the best looking damned textures know to man.
> 
> But yea, it didn't look like much if you ask me. However, youtube has a way of doing that. That wasn't the highest quality gaming video I've seen. He must of had to cut it down good to upload it.


Ah maybe they will sell some super duper high ultra texture viewing glasses that I can purchase for only $499!!


----------



## n780tivs980

Quote:


> Originally Posted by *The Source*
> 
> It showed Ultra texture quality was selected.


Nah not yet, they need the update for it. Also I thought it was locked to actually having 6gb to be able to use it.


----------



## LaBestiaHumana

My modded Skyrim goes slightly above 3GB. I get about 90 FPS average, 120 max, 65 min.

My friend with dual 780 Tis gets frame dips all the way down to 5 FPS using the same mods.

Most of his games run perfectly fine at 1080p, but there are definitely cases where 3GB is not enough.

If this game is poorly optimized, it will definitely use more than 3GB and cause frame dips whenever it tries to use more than you have.

The game will still be playable; it will just take some tweaks instead of simply turning everything up to Ultra.

I'd say now is the time to cash in on the 3GB cards, before the 980 Ti and 390X arrive.


----------



## Arturo.Zise

I don't like the way these new games are shaping up for PC users. It seems like developers put consoles first and PC gets the leftovers. If new releases keep showing this trend, I might just downgrade my PC and buy a PS4. There's no point putting $$$ into a gaming rig when the games don't take full advantage of it.


----------



## Paladin Goo

Quote:


> Originally Posted by *Arturo.Zise*
> 
> I don't like the way these new games are shaping up for PC users. It seems like developers are putting console first and PC gets the leftovers.


Welcome to 2008.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Arturo.Zise*
> 
> I don't like the way these new games are shaping up for PC users. It seems like developers are putting console first and PC gets the leftovers. If all these new releases start to show a similar trend then I might just downgrade my PC and buy a PS4. No point in putting $$$ in to gaming rigs when they don't make the games take full advantage of them


That's one of the major cons of PC in the whole PC-vs-console debate. Sure, we have better hardware, but that doesn't translate into better-running games, especially at launch. I'm scared to buy any more PC games unless they've been out and patched for some time.


----------



## Ksireaper

Quote:


> Originally Posted by *fashric*
> 
> Really? Looked distinctly average too me, hopefully its just the youtube compression.


Same here. Looked pretty bland. I will pick it up on a steam sale and give it a shot down the road.


----------



## Arturo.Zise

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> That's one of the major cons of PC on the whole PC vs Console. Sure we have better hardware, but that doesnt translate into better running games. Especially at launch. I'm scared to buy anymore PC games unless they have been out and patched for some time.


Yeah. At least with PC there's generally a good modding community out there that can help with mods/fixes/tweaks to get the best out of games. I keep trying to explain to my friends that 900p/1080p at 30 FPS on "next-gen" consoles is nothing to rave about, but when PC gamers don't get the real benefits of much better hardware, it doesn't really prove much either.


----------



## Shadowarez

Yeah, it seems consoles are holding back PC game progression. The way it looks now, games will be created on consoles, for consoles, and the PC ports will just be rebuilt from console code with a few checkboxes for things consoles can't do.

Pretty soon PCs won't be gaming machines; as consoles progress, PCs will go back to being just for work. Unless some studios get enough money to say "screw consoles" and code and create on PC, for PC, instead of creating on consoles and then trying to make that code work on PC hardware.


----------



## Jaycz

Quote:


> Originally Posted by *Shadowarez*
> 
> Yeah it seems consoles are holding back Pc game progression way it looks now games will be created on consoles for consoles then Pc ports will just get rebuilt from console code with a few checkmarks that consoles can't do.
> 
> Pretty soon Pc's won't be gaming machines as consoles progress Pc's will return back to just for work. Unless some studios get enough money to say screw consoles let's code create on Pc for Pc not code create on consoles then try make that code work on Pc hardware.


Woah man, there's still plenty of games on PC (Steam) that consoles don't have. The console ports will be pretty bad, though, but haven't they always been?


----------



## omarh2o

Crysis 3 at 1440p, max settings, 8x MSAA: I get 110+ FPS and it never goes over 2.1GB of GPU memory usage. With less AA, or NVIDIA's new downsampling feature (which is supposed to be less demanding than the equivalent AA setting), memory usage should be significantly less than 2.1GB. So unless this game has revolutionary textures never seen before, I can't see how 6GB could be recommended for 1080p. It just doesn't add up: either it's very poorly optimized, or the textures blow Crysis 3 out of the water. It seems pretty ridiculous to me. What would it take to play this at 1440p?


----------



## Jaycz

Quote:


> Originally Posted by *omarh2o*
> 
> Crysis 3 1440p max settings 8xmsaa and I get 110+ fps and never goes over 2.1gb of gpu memory usage. So with less aa or using the new down sampling feature from NVidia, which is supposed to be less demanding then the equivalent aa setting, should be significantly less memory usage than 2.1. so unless this game has revolutionary textures never seen before, I cant see how 6gb minimum will be recommended for 1080P. it just doesn't add up, either very poorly optimized, or textures that blow crisis 3 out of this world. It just seems pretty ridiculous to me, what would it take to play this in 1440p?


From what I understand, it's rendering at 4K and then downscaling to 1080p.

I think I read that somewhere.
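For what it's worth, render-target memory alone can't explain a 6GB figure even at 4K; a rough back-of-the-envelope sketch (the bytes-per-pixel and buffer-count values are assumptions, and textures, the real cost, aren't counted):

```python
def rendertarget_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough render-target footprint (color/depth/intermediate buffers only)."""
    return width * height * bytes_per_pixel * buffers / 1024**2

for label, (w, h) in {"1080p": (1920, 1080),
                      "1440p": (2560, 1440),
                      "4K":    (3840, 2160)}.items():
    print(f"{label}: ~{rendertarget_mb(w, h):.0f} MB")
```

Even 4K comes out under 100 MB of render targets under these assumptions, so whatever pushes the requirement toward 6GB has to be the texture set, not the supersampled framebuffer itself.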


----------



## omarh2o

Quote:


> Originally Posted by *Jaycz*
> 
> From what i understand its rendering at 4k then downscaling to 1080p
> 
> i think i read that somewhere




If that's the case, then it makes more sense, although I'd still expect pretty good visuals from a 6GB recommendation, even with 4K downsampling.


----------



## Jaycz

Quote:


> Originally Posted by *omarh2o*
> 
> If that's the case then it it should make more sense, although I still expect pretty good visuals at 6gb recommended, even with 4k down sampling.


For sure, 6GB is a lot of vRAM, lol.

The way I see it, it's either badly optimized, amazing quality, or the devs threw out a large number for some reason.


----------



## KenjiS

THE MOMENT OF TRUTH

its unlocked now, about to try firing it up on my GTX 770 SLI


----------



## Jaycz

Quote:


> Originally Posted by *KenjiS*
> 
> THE MOMENT OF TRUTH
> 
> its unlocked now, about to try firing it up on my GTX 770 SLI


Give us the numbers buddy


----------



## KenjiS

Just finishing the install now. I'll try High first, since I don't know where to get the Ultra textures.


----------



## n780tivs980

Quote:


> Originally Posted by *KenjiS*
> 
> Just finishing install now. Will try High first since i dont know where to get the Ultra textures..


Bench it!!!!!!!!!!!!!!!!!!!


----------



## BizzareRide

Quote:


> Originally Posted by *KenjiS*
> 
> THE MOMENT OF TRUTH
> 
> its unlocked now, about to try firing it up on my GTX 770 SLI


Do you have the Ultra texture pack installed?

Edit: nvm


----------



## KenjiS

Quote:


> Originally Posted by *BizzareRide*
> 
> Do you have the Ultra texture pack installed?
> 
> Edit: nvm


point me to it and ill try it!

launching now


----------



## omarh2o

Quote:


> Originally Posted by *KenjiS*
> 
> point me to it and ill try it!
> 
> launching now


GO GO GO! Im at work right now I cant bench it


----------



## KenjiS

First run yielded positive results! There's a built-in benchmark, which is good news!

First off! My settings:

And my results from my first run:

51 average FPS. Not bad at all!

Yeah, the "6GB of VRAM" thing is a bit of hooey. The benchmark was smooth as butter on "High," which supposedly needs 3GB at 1080p, and I ran at 1440p with no supersampling.

The game does NOT appear to use SLI, however, so this is a SINGLE 770 giving these numbers. "Poorly optimized" isn't what I'd call it, either.


----------



## Jaycz

How much vRAM did it actually use?


----------



## The Source

No SLI?? What engine does this game use?


----------



## KenjiS

OK, the Ultra textures ARE automatically in the game, and there is a very large difference between the two. High, I'd say, is equal to Xbox One/PS4 quality; Ultra looks pretty darn good, definitely a large improvement.

Second run. This time I put it to the "Ultra" default...



A much worse showing: only a 16 FPS average.



Coming up I'm going to drop the resolution to 1080p and see if that improves anything.


----------



## KenjiS

Quote:


> Originally Posted by *The Source*
> 
> No SLI?? What engine does this game use?


Might be my settings. Let me do a 1080p run and I'll check my profile.

-edit- I can't say how much VRAM it's using; I only have 2GB cards, and it's pegged at 2GB as you can see. HOWEVER, it's butter smooth at High, unlike Watch_Dogs. So far I'd say the game seems well optimized, and the VRAM requirements are just a bit excessive and unnecessary.


----------



## Gir

Computer specs in my sig, I ran it on the "Apollo-Soyuz Test Project" rig

All settings at the highest at 1080p, 84fps


----------



## The Source

The game uses the LithTech engine. If you have Nvidia Inspector, you could try the SLI profile from one of these games, if there is one.

http://en.wikipedia.org/wiki/LithTech
Quote:


> F.E.A.R. by Monolith Productions (2005)
> Condemned: Criminal Origins by Monolith Productions (2005)
> F.E.A.R. Extraction Point by TimeGate Studios (2006)
> F.E.A.R. Perseus Mandate by TimeGate Studios (2007)
> Condemned 2: Bloodshot by Monolith Productions (2008)
> Terrorist Takedown 2: US Navy SEALs by City Interactive (2008)
> Mortyr: Operation Thunderstorm by City Interactive (2008)
> Code Of Honor 2: Conspiracy Island by City Interactive (2008)
> SAS: Secure Tomorrow by City Interactive (2008)
> Royal Marines: Commando by City Interactive (2008)
> Crossfire by SmileGate (2008)
> Combat Arms by Nexon Corporation (2008)
> F.E.A.R. 2: Project Origin by Monolith Productions (2009)
> Armed Forces Corp. by City Interactive (2009)
> Battlestrike: Shadow of Stalingrad aka. Battlestrike: Force of Resistance 2 by City Interactive (2009)
> Code of Honor 3: Desperate Measures by City Interactive (2009)
> Wolfschanze II by City Interactive (2009)
> Special Forces by City Interactive (2010)
> Terrorist Takedown 3 by City Interactive (2010)


----------



## NoDoz

Looking for a deal on this game...any out there?


----------



## KenjiS

OK, 1080p run. Or, to be more precise, "1792x1008": the in-game resolution setting is tied to your native resolution, so this is being rendered at 70% of my 1440p display.

Left everything else pegged to "Ultra"


As you can see my FPS improved Dramatically, This is almost playable in fact.


TLDR? 6gb VRAM "requirement" is *fake*. 2gb seems playable *Even with Ultra Textures* If that is indeed the limitation

Now to investigate SLI...


----------



## NoDoz

I need to know about SLI before I buy it. Not doing it unless it supports it.


----------



## Jaycz

Quote:


> Originally Posted by *KenjiS*
> 
> TLDR? 6gb VRAM "requirement" is *fake*. 2gb seems playable *Even with Ultra Textures* If that is indeed the limitation
> 
> Now to investigate SLI...


Okay, so just another CoD:G RAM thing, good to know


----------



## KenjiS

Quote:


> Originally Posted by *Jaycz*
> 
> Okay, so just another CoD:G RAM thing, good to know


Yep. Nothing stops you from selecting anything, though; it's basically "Hey, we think it's a bad idea, but it's your rig, just don't cry to us if it doesn't work out."

I think SOM is missing an SLI profile. I'm still tinkering.


----------



## Worldwin

Quote:


> Originally Posted by *KenjiS*
> 
> Yep. Nothing however stops you from selecting things or anything, its basically a "Hey we think its a bad idea but its your rig, just dont cry to us if it doesnt work out"
> 
> I think SOM is missing a SLI profile. Im still tinkering.


You had the texture pack installed, correct?


----------



## LaBestiaHumana

Seems like a good game, will buy it once the bugs have been worked out and SLI is confirmed to work.


----------



## Yungbenny911

Blah blah blah, 2GB of VRAM this and that. I wonder where the VRAM warriors are now?

After seeing that video up there, I want to play this game so bad, haha.


----------



## Leopard2lx

Quote:


> Originally Posted by *Worldwin*
> 
> You had the texture pack installed, correct?


Yes he does, and apparently there is nothing extra to install or download; they come with the game.

Also, as far as SLI goes, I believe it's NVIDIA that needs to release a profile, and they are usually pretty good about that, so we might get one in the next 24-48 hours. You can check via GeForce Experience.


----------



## KenjiS

Quote:


> Originally Posted by *Worldwin*
> 
> You had the texture pack installed, correct?


I'm assuming yes. I did see a rather large visual difference in the benchmark between "High" and "Ultra." High looks a step above an Xbox One or a PS4; Ultra looks VERY good but DOES appear to need a bit more than 2GB of VRAM, or much more power than a single 770 can muster at 1440p. At 1080p you might be fine; if you dial back the AO you should be perfectly playable, in fact.

I can answer the VRAM questions later today.

I will say the game has lots of jagged edges from the looks of it, so my theory is they're assuming you'll use supersampling if you can. Thus the absurd VRAM requirements.

But otherwise the game appears more optimized than Watch_Dogs by a huge mile. 1080p with mild supersampling and High settings should be easily manageable on a 2GB card, I'd say...


----------



## Jaycz

Quote:


> Originally Posted by *Yungbenny911*
> 
> Blah Blah Blah 2Gb V-ram this and that, i wonder where the V-ram warriors are now?
> 
> After seeing that video up there, I want to play this game so bad haha.


There was one guy swearing up and down that 6GB of vRAM was going to be necessary from now on because of the _*recommended specs*_ of two games, l0l


----------



## saeedkunna

Quote:


> Originally Posted by *NoDoz*
> 
> Looking for a deal on this game...any out there?


I got it from Green Man Gaming for $37.50, but I'm not sure if they still have that offer.


----------



## DrBrogbo

I can see Ultra requiring 6GB. Running the benchmark with everything maxed (textures on High, and sadly I couldn't find any AA options), I ended up with an 87.41 FPS average and 2.3GB of VRAM usage.

What is people's VRAM usage on ultra?


----------



## KenjiS

I will note, on the SLI thing, that I think NVIDIA will get SLI support out fast. I DID note in my testing that my second card was not "idling" but in fact running at 10-15%, so something's goofy there.

The game is not a train wreck of optimization, at least not in the benchmark. I would start playing it right now, but I'm going to wait for morning when my 970 gets here, and I'll do some more tests.


----------



## Gir

Quote:


> Originally Posted by *DrBrogbo*
> 
> I can see Ultra requiring 6GB. Running the benchmark with everything maxed (textures oh high, and couldn't find any AA options, sadly), I ended up with 87.41 fps average, and 2.3GB of VRAM usage.
> 
> What is people's VRAM usage on ultra?


GPU-Z shows 3.5gb used for me ultra at 1080p
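If you'd rather log that counter from a script than watch GPU-Z, `nvidia-smi --query-gpu=memory.used --format=csv,noheader` prints the same number; a minimal parsing sketch (the sample string below is made up, real output comes from the tool):

```python
def parse_mem_used_mib(line):
    """Parse one line of 'nvidia-smi --query-gpu=memory.used
    --format=csv,noheader' output, e.g. '3584 MiB' -> 3584."""
    return int(line.strip().split()[0])

sample = "3584 MiB\n"  # made-up sample line
print(parse_mem_used_mib(sample))  # 3584
```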


----------



## NoDoz

Quote:


> Originally Posted by *saeedkunna*
> 
> i got it from green man gaming for 37.5 but i am not sure if they stiil have that offer


How do I get it for that? Code or something?


----------



## DrBrogbo

Quote:


> Originally Posted by *Gir*
> 
> GPU-Z shows 3.5gb used for me ultra at 1080p


Yeah, I just tried it myself and it showed the same. I'm not sure I even noticed a difference between the two, and with only 4GB I'm not sure I want to risk crashes or catastrophic slowdowns because of it.


----------



## KenjiS

Quote:


> Originally Posted by *DrBrogbo*
> 
> I can see Ultra requiring 6GB. Running the benchmark with everything maxed (textures oh high, and couldn't find any AA options, sadly), I ended up with 87.41 fps average, and 2.3GB of VRAM usage.
> 
> What is people's VRAM usage on ultra?


I believe the game is intending you to use supersampling quite heavily, and honestly I believe that's where the VRAM requirement is coming from.

I can see running the game at 4K or 5K on Ultra taxing the VRAM, maybe even hitting 6GB.

But if you're talking about just playing the game and getting it playable, 1080p and Ultra should easily work on a 2GB GTX 770.

1440p, on the other hand... I think the VRAM limitation just sacked me in the gut.

TL;DR: their requirements are just cautiously excessive, from the looks of it.


----------



## Leopard2lx

I don't understand. So there is no option for AA? Wow! Welcome to 2005!

That is pretty ******ed.


----------



## Alatar

Quote:


> Originally Posted by *KenjiS*
> 
> TLDR? 6gb VRAM "requirement" is *fake*. 2gb seems playable *Even with Ultra Textures* If that is indeed the limitation
> 
> Now to investigate SLI...


But your minimum FPS is 5... Playable (30 FPS average), yeah, but hardly ideal.


----------



## omarh2o

This kind of sucks. So I'm assuming 1440p Ultra isn't even an option for a 980?


----------



## KenjiS

Quote:


> Originally Posted by *Leopard2lx*
> 
> I don't understand. So there is no option for AA? Wow! Welcome to 2005!
> 
> 
> 
> 
> 
> 
> 
> That is pretty ******ed.


If I may go out on a limb, it might not be that bad an idea: we're in an era where 4K gaming on PC is quickly becoming a reality. At 4K the need for AA is substantially reduced thanks to the much higher resolution, and this game appears to favor supersampling; it's built right into the game for you, everything up to *5K* in my case.

At that kind of SS, any aliasing is going to be nonexistent...


----------



## Alatar

Also, I just activated my key on Steam, and Steam is telling me it's a 35GB download. That's going to take a while, even on a faster connection.


----------



## KenjiS

Quote:


> Originally Posted by *Alatar*
> 
> But your min fps is 5.... Playable (avg 30fps) yeah but hardly ideal.


I'm doing things pretty quick and dirty, and I think it's tweakable. Maybe drop the AO a bit, for instance.

I'm going to completely disable SLI and do another run, just to rule out SLI *impairing* performance at all.


----------



## Leopard2lx

Quote:


> Originally Posted by *KenjiS*
> 
> If I may go out on a limb, might not be that bad an idea: We're in the era where on the PC 4k gaming is becoming a reality quickly. At 4k the need for AA is substantially reduced due to the substantially higher resolution. The game appears to favor supersampling, its built right into the game for you in fact, Everything up to *5k* in my case.
> 
> At that kind of SS any aliasing is going to be nonexistent...


Except that you are going to need a crazy system to run it with enough supersampling to remove a decent amount of jaggies... and like 6GB of VRAM, which 99% of people DON'T have.

Honestly, this just sounds like really lazy work from the devs. Was it that hard to add a few AA options?
My problem is that AA is one of the things I care about the most in terms of graphics, because jagged edges annoy me to no end!

So, another console port where PCs get sent back 10 years and get the leftovers.

...Well, at least it was only $37...


----------



## omarh2o

So anyone know the usage for ultra 1440p?


----------



## KenjiS

I went and disabled SLI completely and tested it.

1440p Ultra:



1080p Ultra:



Strange, however. I think there's a possibility of some limitation here: 1080p seems to show that SLI was indeed providing a performance boost. So it seems one of two things is happening:

1. Something is limiting SLI's performance, i.e. a lack of VRAM and the need to stream assets from the hard drive/system RAM, or

2. SLI support is broken.


----------



## NoDoz

What numbers are you getting in SLI?


----------



## DrBrogbo

For me, at 1200p maxed out (Ultra textures), the 30 minutes of gameplay I tried peaked at 3908MB of VRAM usage. The framerate on my 980 stayed above 60 at ALL times, though, so it's very well optimized. I was afraid my old QX9650 would severely limit me, but nope.









AWESOME game, by the way. Open world LOTR with combat like Batman AA.


----------



## wholeeo

I don't believe there's an SLI profile for the game yet; I don't see it in Nvidia Inspector.


----------



## KenjiS

SLI Disabled again

1440p, Very High preset (about 1940MB of VRAM used, under my 2GB)



1440p Very High Preset + Ultra Textures (IE, leaving everything from Very High EXCEPT the Textures)



1080p Very High Preset + Ultra Textures



My preliminary analysis is that the game DOES want VRAM, and that will be the limitation. Very High looks pretty nice, by the way, and should run great on most 2GB GPUs at 1440p (which is 1080p with a bit of supersampling to smooth out the edges!).

SLI seemed to help a bit in my 1080p benchmarks, but I feel SLI is not properly optimized yet and would say leave it off for now.

Beyond that, the game seems VERY well optimized; you just need a 3-4GB card for Ultra textures. And you know what? That's fine. Very High looks pretty good too, and it runs on 2GB. We can all agree on this, yes?


----------



## NoDoz

Im downloading it now...Ill do some testing as well


----------



## KenjiS

Quote:


> Originally Posted by *NoDoz*
> 
> Im downloading it now...Ill do some testing as well


Very curious, as you have 980 SLI; I'd like to see if it's just my 2GB of VRAM limiting SLI's performance or not.

I strongly feel the game needs 3GB minimum for Ultra textures. But 6GB? Yeah, that's them being cautious.

I'll fire it up tomorrow when my 970 gets here. My guess is the 970 will easily run 1440p Ultra, however...


----------



## Toomuch_

Quote:


> Originally Posted by *NoDoz*
> 
> How do I get it for that? Code or something?


I literally just purchased the game on the site and got the deal, so the code is still good, but probably not for long!

Promo code is SEPTEM-BEROFF-ER25XX

G'luck!


----------



## cstkl1

33.9GB. Damn, OK, that's 4 hours.

Is SLI working?

Collecting my two Swifts; I'll test later to see how the 780 Ti Matrix SLI compares vs. the Black SLI.
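Sanity-checking that 4-hour figure (the roughly 20 Mbit/s line speed here is an assumption for illustration):

```python
def hours_to_download(size_gb, rate_mbit_s):
    """Download time for size_gb gibibytes at rate_mbit_s megabits per second."""
    size_bits = size_gb * 1024**3 * 8        # GiB -> bits
    return size_bits / (rate_mbit_s * 1e6) / 3600

print(f"{hours_to_download(33.9, 20):.1f} h")  # ~4.0 h for 33.9GB at 20 Mbit/s
```

So the 4-hour estimate works out to a connection on the order of 20 Mbit/s; halve the line speed and the wait doubles.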


----------



## NoDoz

Quote:


> Originally Posted by *KenjiS*
> 
> Very curious as you have a 980 SLI, I'd like to see if its just my 2gb VRAM limiting SLIs performance or not
> 
> Strongly feel the game needs 3gb at minimum for Ultra textures. But 6gb? Yeah thats them being cautious
> 
> and I'll fire it up tomarrow when my 970 gets here. My guess is that my 970 will easily run 1440p Ultra however...


Yeah, I'll figure it out and post here. It's gonna be a while, but at least it's downloading at a high rate.


----------



## KenjiS

Quote:


> Originally Posted by *NoDoz*
> 
> Yeah Ill figure it out and post on here. Gonna be a while but at least its downloading at a high rate.


No rush. I'm going to wait for my 970 and then play, probably.


----------



## saeedkunna

Quote:


> Originally Posted by *NoDoz*
> 
> How do I get it for that? Code or something?


i used this code : SEPTEM-BEROFF-ER25XX


----------



## Chargeit

For $37 I have to pick this one up to test my 780's 3gb Vram. =D

Told you the 6gb thing is a great marketing tool.


----------



## saeedkunna

Downloaded 9.1/33.9GB. Steam is very slow today; my connection is 40Mb, but I'm only getting a 1.3MB download rate on Steam.

I wanna see how my two Titan Blacks will do @ 4K.


----------



## Chargeit

When do you get your key from this green man site?


----------



## MapRef41N93W

Well just 13 more hours to go. Why couldn't GMG have released the codes earlier? AT&T DSL monopoly FTL...


----------



## cstkl1

Err, I don't see much of a discount on that.

On Steam, the bundle with the full season pass is already USD 36.


----------



## Chargeit

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Well just 13 more hours to go. Why couldn't GMG have released the codes earlier? AT&T DSL monopoly FTL...


Ah, so GmG hasn't released the codes yet.

Damn, my net is slow. I top out at about 950KB/s... and that's running wide open. I have to cut it back during the day since it messes with my ol' lady's MMOing, and when you limit the DL speed in Steam, you're lucky to average 200KB/s. =/ I won't have this sucker downloaded by tomorrow.

Freaking business line.


----------



## DNMock

3840 x 2160 (4K) resolution,
a 4790K,
and a pair of barely overclocked 290Xs. Based on that image there, the RAM somehow does stack, since it says I have 8GB of VRAM with my two cards.

Maybe that's why they thought they needed 6GB: they assumed that when you CrossFire or SLI, the RAM stacks. Two 780 Tis = 6GB of VRAM in their book, just like two 290Xs = 8GB of VRAM in mine.
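That "stacking" readout is most likely just the tool summing memory across adapters. With alternate-frame rendering, each GPU keeps its own full copy of the assets, so the usable pool is the smallest card, not the sum; a quick sketch of the distinction:

```python
def reported_vram(cards_gb):
    """What a naive settings screen shows: the sum across all adapters."""
    return sum(cards_gb)

def usable_vram_afr(cards_gb):
    """With alternate-frame rendering, every GPU holds a full copy of the
    assets, so the effective pool is the smallest card's memory."""
    return min(cards_gb)

two_290x = [4, 4]
print(reported_vram(two_290x), "GB reported,",
      usable_vram_afr(two_290x), "GB usable")  # 8 GB reported, 4 GB usable
```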


----------



## MapRef41N93W

Quote:


> Originally Posted by *Chargeit*
> 
> Ah, so GmG hasn't released the codes yet.
> 
> Damn, my net is slow. I top out at about 950kps... That's running wide open. I have have to cut it back during the day since it messes with my ol'ladys MMO'ing. When you limit the dl speed with steam, you're lucky to pull in 200kps avg dl. =/ I won't have this sucker downloaded tomorrow.
> 
> Freaking Business line.


Nah they released the codes a few hours ago. I'm getting about 670kb/s.


----------



## Chargeit

Quote:


> Originally Posted by *cstkl1*
> 
> err dont see much discount on that
> 
> steam with the full season pass is already USD36


Says $67.49 with the pass.


----------



## Nyt Ryda

Quote:


> Originally Posted by *DNMock*
> 
> 4970k
> and a pair of barely overclocked 290x's. Which based on that image there, somehow the ram does stack since it says I have 8gb of Vram apparently with my two cards.


What resolution was that running at ?


----------



## Chargeit

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Nah they released the codes a few hours ago. I'm getting about 670kb/s.


It tells me "No keys available for this game" under my account. Do they run out of them or something? I've never used the site before.


----------



## DNMock

double post


----------



## DNMock

Quote:


> Originally Posted by *Nyt Ryda*
> 
> What resolution was that running at ?


4k


----------



## cstkl1

Quote:


> Originally Posted by *Chargeit*
> 
> Says $67.49 with the pass.




Eh, since Steam is now available in RM (Malaysian ringgit), it's RM116 for the season pass bundle pack.
The current USD-to-RM rate is 3.27, so that's USD 35.47.

Normal is RM95...

So... their pricing is region-based?


----------



## KenjiS

Nice!

Shame an NVIDIA title isn't working with SLI but is working with CrossFire...


----------



## Chargeit

Quote:


> Originally Posted by *cstkl1*
> 
> eh since steam now available is RM ( malaysian ringgit) its RM116 for the season pass bundle pack.
> Current USD to RM is 3.27 = 35.47.
> 
> Normal is RM95..
> 
> so .. their pricing is region based??


Yea, I guess so because it isn't that price here.


----------



## saeedkunna

Quote:


> Originally Posted by *Chargeit*
> 
> When do you get your key from this green man site?


You should get it immediately, since the game is already out.


----------



## Flames21891

So it looks like they weren't spouting BS with the texture requirements. I can't run it on High without plenty of stutters and hitches, so I believe Ultra is out of the question.

Also of note, however, is that SLI appears non-functional as of now. I noticed zero performance gain with SLI enabled over a single card, and in fact it seemed to run a little worse.

Lastly, the textures really don't look that great, so why they are eating so much VRAM is beyond me. In fact, the game looks pretty average all around minus the lighting, which almost looks out of place among the average textures and landscape.


----------



## Chargeit

Quote:


> Originally Posted by *saeedkunna*
> 
> you should get it immediately since the game is already out


I checked the forums and it seems like other people are still waiting.

It all shows up correctly under my account, but where it says keys, it says "No keys available for this game."

Kind of sucks. I was hoping to start that DL tonight. I'll sit up a little longer, but I have work in the morning and have to be up in a few hours. =/

It isn't a huge deal. Hell, I didn't even want to play this game until about 12 hours ago.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Chargeit*
> 
> It tells me "No keys available for this game" under my account. Do they run out of them or something? I've never used the site before.


You should have gotten your key via e-mail. The site probably takes a few hours to update.

It's possible you are also still in queue. I ordered the game a few days ago.


----------



## DNMock

Quote:


> Originally Posted by *Flames21891*
> 
> So it looks like they weren't spouting BS with the texture requirements. I can't run it on High without plenty of stutters and hitches, so I believe Ultra is out of the question.
> 
> Also of note, however, is that SLI appears non-functional as of now. I noticed zero performance gain with SLI enabled over a single card, and in fact it seemed to run a little worse.
> 
> Lastly, the textures really don't look that great, so why they are eating so much VRAM is beyond me. In fact, the game looks pretty average all around minus the lighting, which almost looks out of place among the average textures and landscape.


I'm crushing it with everything on Ultra at 4K resolution, around 50 FPS using two CrossFired 290Xs. Their system also believes the 290X's RAM stacks in CrossFire for some odd reason:


----------



## omarh2o

Quote:


> Originally Posted by *DNMock*
> 
> I'm crushing it with everything on Ultra at 4K resolution, around 50 FPS using two CrossFired 290Xs. Their system also believes the 290X's RAM stacks in CrossFire for some odd reason:


If this is true, then the whole 6GB thing is completely false and there is a huge optimization issue here. I've never heard of a memory stacking issue.


----------



## NoDoz

Sounds like we can expect some patches.


----------



## DNMock

Just rechecked; it still has me listed at 8GB of VRAM somehow. I haven't read the patch notes on the new AMD drivers that came out today, so maybe they did something amazing and mind-blowing! Or the game is just bugged like that. Although I am still getting 50 FPS at 4K with Ultra settings, so who knows.


----------



## Chargeit

Quote:


> Originally Posted by *DNMock*
> 
> Just rechecked, still has me listed at 8gb of vram somehow. I haven't read the patch notes on the new AMD drivers that came out today so maybe they did something amazing and mind blowing! Or this game is just bugged like that. Although I am still getting 50 fps at 4k with ultra settings so who knows.


This game truly is amazing: not only does it have the most hardcore textures known to man, it also upgrades your GPU's VRAM... Happy days.


----------



## The Source

http://www.neogaf.com/forum/showthread.php?t=903815&page=9

Quote:


> SLI seems to work fine using the F.E.A.R. 3 sli bits with nvidia inspector. I also added the game's exe to the profile but don't know if that was necessary.


I can't confirm if this works, or how well it works as I am still downloading the game on dial-up.


----------



## Chargeit

Ah well, no key for me tonight. I need to crash out.


----------



## SONICDK

http://www.cjs-cdkeys.com/products/Middle%252dearth%3A-Shadow-of-Mordor-CD-Key-For-Steam.html
Don't know if it's cheaper than GMG.


----------



## DNMock

Quote:


> Originally Posted by *Chargeit*
> 
> This game truly is amazing: not only does it have the most hardcore textures known to man, it also upgrades your GPU's VRAM... Happy days.


WTS 980 destroyers: R9 290X, upgraded to 8 gigs of GDDR5 by the design team at Shadow of Mordor!


----------



## TFL Replica

I would like to see some high versus ultra comparison screenshots.


----------



## The Source

Quote:


> Originally Posted by *The Source*
> 
> http://www.neogaf.com/forum/showthread.php?t=903815&page=9
> I can't confirm if this works, or how well it works as I am still downloading the game on dial-up.


I've seen some confirmations over on neogaf that this works.


----------



## firebird1le

Here's 1080 with the very high preset. EVGA 760 SC 2gb


----------



## KenjiS

Quote:


> Originally Posted by *Flames21891*
> 
> So it looks like they weren't spouting BS with the texture requirements. I can't run it on High without plenty of stutters and hitches, so I believe Ultra is out of the question.
> 
> Also of note, however, is that SLI appears non-functional as of now. I noticed zero performance gain with SLI enabled over a single card, and in fact it seemed to run a little worse.
> 
> Lastly, the textures really don't look that great, so why they are eating so much VRAM is beyond me. In fact, the game looks pretty average all around minus the lighting, which almost looks out of place among the average textures and landscape.


That's strange, my 770 is fine on Very High. Ultra is a no, but Very High is fine.


----------



## cix92

Quote:


> Originally Posted by *TFL Replica*
> 
> I would like to see some high versus ultra comparison screenshots.


http://www.pcgameshardware.de/Mittelerde-Mordors-Schatten-PC-258069/Specials/Hands-on-Test-Ultra-Texturen-1137689/

LMAO @ 6GB requirement

I'm glad some people here actually explained it; they were right. Devs stuck stuff into VRAM just because they can. I played games 4-5 years ago with better or similar textures on a 2GB card.


----------



## KenjiS

Quote:


> Originally Posted by *The Source*
> 
> I've seen some confirmations over on neogaf that this works.


I gave it a quick shot before bed using my 1440p Very High settings (Basically my best performing yet) using the FEAR 3 profile in nVidia inspector.

52.98 avg, 20 min

Seems to be having the same behavior of only loading my second card about 25% or so.

For now seems best to leave SLI off

Unless I'm not using Nvidia Inspector right... :/


----------



## kx11

maxed settings benchmark result @ 1440p ( DOF turned off because i hate it )


----------



## omarh2o

with a 980? not bad, what was the memory usage?


----------



## TFL Replica

Quote:


> Originally Posted by *cix92*
> 
> http://www.pcgameshardware.de/Mittelerde-Mordors-Schatten-PC-258069/Specials/Hands-on-Test-Ultra-Texturen-1137689/
> 
> LMAO @ 6GB requirement
> 
> I'm glad some people here explained it actually , they were right. Devs stuck stuff into VRAM just bec they can , i played games 4-5 years ago almost with better textures or similar with 2gb card.


Thanks for the link. Looks like the skeptics were right.


----------



## omarh2o

Quote:


> Originally Posted by *TFL Replica*
> 
> Thanks for the link. Looks like the skeptics were right.


What exactly does that link say? sorry I cant translate on my phone.


----------



## Promisedpain

I've been reading Steam and guys with 2GB 770s get 60 FPS running ultra textures.... BS requirements once again. My 780 Ti Classy will destroy this game.


----------



## TFL Replica

Quote:


> Originally Posted by *omarh2o*
> 
> What exactly does that link say? sorry I cant translate on my phone.


You can view the comparison screenshots comparing high and ultra (not taken from the exact same spot, but close enough), and 4k to 1080p downsampling versus regular 1080p.

The ultra textures are inconsistent and disappointing, and some of the textures are on par with Risen 1 (my opinion). Basically, this game is a mess.


----------



## Promisedpain

Quote:


> Originally Posted by *TFL Replica*
> 
> You can view the comparison screenshots comparing high and ultra (not taken from the exact same spot, but close enough), and 4k to 1080p downsampling versus regular 1080p.
> 
> The ultra textures are inconsistent and disappointing, and some of the textures are on par with Risen 1 (my opinion). Basically, this game is a mess.


http://www.pcgameshardware.de/commoncfm/comparison/?id=121926

I don't see much difference... and I'm not going to say how awful I think the textures look....


----------



## Silent Scone

why do you need the torrentz?


----------



## daviejams

I'll buy this after work tonight and see if I get 6GB of video RAM use with my 7970s in CrossFire!

Does it run OK on Ultra on 3GB cards? Wouldn't mind having it on my living room computer and using the big screen and controller. Seems like the kind of game that is more suited to that than a desk.


----------



## cstkl1

Quote:


> Originally Posted by *Silent Scone*
> 
> why do you need the torrentz?


Don't need it other than my daily dose of TV shows.
It just happened to come across my private tracker site since it hit the top torrent list.

Slowed down the internet for a bit to watch some TV shows.

BTW, what did you do with your TB??


----------



## daviejams

33.9gb ? I'll download that in less than an hour

I only ever torrent TV shows these days too


----------



## cstkl1

Quote:


> Originally Posted by *daviejams*
> 
> 33.9gb ? I'll download that in less than an hour
> 
> I only ever torrent TV shows these days too


Errr, where is this ultra HD pack?
It seems like it's pre-installed... still looking for it.


----------



## The Source

Quote:


> Originally Posted by *KenjiS*
> 
> I gave it a quick shot before bed using my 1440p Very High settings (Basically my best performing yet) using the FEAR 3 profile in nVidia inspector.
> 
> 52.98 avg, 20 min
> 
> Seems to be having the same behavior of only loading my second card about 25% or so.
> 
> For now seems best to leave SLI off
> 
> Unless im not using nvidia inspector right... :/


Quote:


> Working SLI profile (only for Nvidia users)
> I tried a lot of SLI profiles, but I only found this one working. I am getting around 145 fps in the benchmark, all maxed out at 1080p.
> 1) Go to Nvidia Inspector
> 2) Select Middle Earth Shadow of Mordor
> 3) Then go to "Add application to current profile", browse, and select Middle Earth Shadow of Mordor.exe.
> 4) Last and final, select Fear 3 in "SLI compatibility bits (DX1x)" and then hit Apply.


http://forums.guru3d.com/showpost.php?p=4925921&postcount=290

From the comparisons I'm seeing Ultra textures don't add much of anything. That's a shame.


----------



## daviejams

Quote:


> Originally Posted by *cstkl1*
> 
> errr where is this ultra hd pack..
> seems like its pre-installed.. looking for it.


I read that it comes as a separate download. Maybe look on steam store for the download ?


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Yungbenny911*
> 
> Yet another wall of text without proof to back it up. Just stop, you clearly don't know what you're talking about
> 
> 
> 
> 
> 
> 
> 
> . I've had hands on experience WITH PROOF on Bf4 @ 4k, and yes, you're completely spreading misinformation. You ran into a performance wall, end of story...
> 
> I was able to play at 4k res because i OC'ed my 770's in SLI to 1400Mhz (core)/ 1950Mhz (mem), running at stock clocks resulted in 1-2 lockup's and what not. At 1400Mhz, they played bf4 at 4k very well, much better than i ever imagined.I can bet a million you never tested a 770 or 680 at such high OC on games, that's why you believe you hit a v-ram wall...
> 
> *BF4 @ 160%, and 200% 1080p.*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *BF4 @ 5120x2160p Downsampled Resolution (OGSSAA)*
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *Zipperly*
> 
> I speak from personal experience, so go ahead and try to pretend that you know more than me in this subject because I can guarantee that you do not! And speaking of a Great Wall of text ^ By the way it's very apparent you don't know what the hell you are talking about, over clocking your gpu will not substitute for running out of VRAM but if it makes you feel any better my 680 was at 1380 on the core.


In the interest of anyone reading this who's trying to decide on buying a new GPU....

770 SLI in the video would have CLEARLY demonstrated ANY VRAM limitation much better than a single 680.

770 SLI and a single 680 both have 2GB, but 770 SLI has more power to expose any such limitation much more clearly....

Anyone who knows anything about VRAM understands this....


----------



## cstkl1

Quote:


> Originally Posted by *daviejams*
> 
> I read that it comes as a separate download. Maybe look on steam store for the download ?


I have all the DLC downloaded.

But hmm, when I enabled Ultra and gamed a bit just now at 2560x1080 (haven't unboxed the Swift since I only just got it), it was topping out at 5GB of VRAM.


----------



## daviejams

Quote:


> Originally Posted by *cstkl1*
> 
> i have all the dlc etc all downloaded..
> 
> but hmm when i enabled ultra.. so far gamed a bit just now at 2560x1080. ( havent unbox the swift since i just got it just now)
> it was topping at 5gb vram.


Well, if you can set it to Ultra you must have the HD textures. What card are you playing it on? 5GB is a lot!

I don't think I'll bother buying the version with all the DLC, TBH. I like LOTR but wouldn't say I'm a huge fan or anything like that.


----------



## Clockster

Pre-Caching


----------



## cstkl1

Quote:


> Originally Posted by *daviejams*
> 
> Well if you can set it to ultra you must have the HD textures. What card you playing it on ? 5gb is a lot !
> 
> I don't think I'll bother buying the version with all the DLC tbh , I like LOTR but would not say I am a huge fan or anything like that


Titan Black.

Hmm, the benchmark starts with a min FPS of 48 / max of 3xx, but during the run it stays above 60... max was 4.5GB on the benchmark run.

Also, the game has a bug. I have an Xbox 360 controller and a keyboard.
During the initial stage, on the hold-to-interrogate, the WASD keys go whack and you can see some Chinese characters at the bottom right... no idea what that's about, but the controller works.

Was hoping to use the keyboard, as in ranged mode it will definitely be faster than the controller.

This game definitely uses a lot of RAM; it was hitting 9GB at one point.
Even Watch Dogs at its insane setting, which took all my VRAM, only hit just under 8.

Running at 150%, one card is playable for 3840x1646.

So this game doesn't use AA; instead, the resolution settings support downsampling, up to 200%.


----------



## Pip Boy

Quote:


> Originally Posted by *Yungbenny911*
> 
> Blah Blah Blah 2Gb V-ram this and that, i wonder where the V-ram warriors are now?
> 
> 
> 
> 
> 
> 
> 
> . After seeing that video up there, I want to play this game so bad haha.


It's either unoptimized, or that VRAM is some sort of caching that's just used IF the RAM is there. No one has been truly clear or posted screenshots from Fraps or any other program in the last 10 pages... not even the "56GB of VRAM is needed and we haven't needed it before because... consoles" crowd.

Get some proper evaluation done; this is OCN after all. I'm sure the game will use 4GB, but does it really need 6GB? And I mean that in the sense of SHOULD it really need it, given the quality @ 1080p.


----------



## The Source

so what is the vram usage with 200% res scaling and ultra textures, at 1080p?


----------



## JSTe

Quote:


> Originally Posted by *Promisedpain*
> 
> http://www.pcgameshardware.de/commoncfm/comparison/?id=121926
> 
> I don't see much difference... and I'm not going to say how awful I think the textures look like....


Damn, that's just... rancid.

Low resolution, and the colours consist of purple and green compression blocks. These allegedly "require" 3GB and 6GB of VRAM respectively?


----------



## cstkl1

Quote:


> Originally Posted by *The Source*
> 
> so what is the vram usage with 200% res scaling and ultra textures, at 1080p?


Should be 6. Shelving the game till that keyboard bug is fixed, and also waiting for SLI. But from light gaming and the benchmark: a single Titan Black at 1202 was enough for downsampling at 3840x1646. FPS averaged 40, with a max of 70-80 and a min of 30 FPS.

So GPU-horsepower-wise this game is not intensive. It's more a VRAM issue.

So I guess that 6GB at 1080p was with 200% downsampling at Ultra textures. BTW, seriously, this game takes up a lot of my RAM.

So I hope all the bugs and drivers will be updated this weekend, in time for my G-Sync experience.
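For anyone wondering why 200% scaling at 1080p chews VRAM like 4K: the render scale multiplies both dimensions, so the pixel count goes up with the square of the scale. A minimal sketch of the framebuffer math only (texture memory is on top of this):

```python
# A 200% render scale doubles width and height, so the game renders
# 4x the pixels of the display resolution before downsampling.
def scaled_resolution(width, height, scale_pct):
    factor = scale_pct / 100.0
    return int(width * factor), int(height * factor)

w, h = scaled_resolution(1920, 1080, 200)
print(w, h)                      # 3840 2160, i.e. effectively 4K
print((w * h) / (1920 * 1080))   # 4.0x the native pixel count
```

Which lines up with 1080p + 200% scaling showing VRAM usage similar to a native 4K run.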


----------



## The Source

Quote:


> Originally Posted by *cstkl1*
> 
> Shld be 6. Storing the game till that keyboard bug is fixed n also waiting for sli. But on slight gaming n benchmark. A single titan black at 1202 was enough for downsampling at 3840x 1646. Fps was avg at 40 with max at 70-80 with min 30fps.
> 
> So gpu hrsepower wise this game is not intensive. More a vram issue.
> 
> So i guess that 6gb at 1080p was at 200% downsampling at ultra texture. Btw seriously this game takes up a lot of my ram.
> 
> So hope all the bug n driver will be updated this weekend intime for my gsync experience


Did you give the FEAR 3 SLI profile a go?


----------



## cstkl1

Quote:


> Originally Posted by *The Source*
> 
> Did you give the FEAR 3 SLI profile a go?


Err, no. Back to benching. That keyboard thing was annoying. This game definitely works best on a keyboard.


----------



## John Shepard

So was the 6GB requirement bs or not?
On the steam forums there are people claiming that they can run ultra on a 2GB card.


----------



## QxY

Really interested in seeing 780 Ti vs 980 benches on Ultra.


----------



## cstkl1

Quote:


> Originally Posted by *John Shepard*
> 
> So was the 6GB requirement bs or not?
> On the steam forums there are people claiming that they can run ultra on a 2GB card.


It's true. Reducing to High never exceeded 4GB. That 2GB claim is a lie. Ultra was around 5GB; the benchmark was already at 4.5GB.


----------



## daviejams

I'll buy it and run the benchmark tonight on my i5 2500K with an R9 280X (3GB) card and post what it says.

If I have time I'll install it on my i7 with 2x7970 cards also to see


----------



## WhiteCrane

How much VRAM does PS4 have? Think it can handle ultra textures?


----------



## John Shepard

Either way, my copy arrives tomorrow. I guess I'll try it on my 680 and see how it runs.
Hopefully it won't be like Watch Dogs.


----------



## Rickles

I bought it, it will be a day or more before it finishes downloading though..........


----------



## TopicClocker

Quote:


> Originally Posted by *WhiteCrane*
> 
> How much VRAM does PS4 have? Think it can handle ultra textures?


The PS4 currently has 5.5GB available for games out of its 8GB. If this required 6GB of VRAM, I don't think it would fit, unless the memory demands on PS4 and PC are different.

The memory requirements seem to be more than is actually needed
(so perhaps the PS4 can run them).

It also sounded as if the Ultra textures were an optional download. If they ran the same textures on the PS4, why would they make it an optional download?


----------



## WhiteCrane

Quote:


> Originally Posted by *TopicClocker*
> 
> The PS4 currently has 5.5GB available for games out of it's 8GB, if it required 6GB for Vram i don't think it would be able to fit unless the memory demands of the PS4 and PC are different.
> 
> The memory requirements seem to be possibly more than is needed.
> (So perhaps the PS4 can run them them).
> 
> It also sounded as if the Ultra textures were an optional download, if they ran the same textures on the PS4 why would they make it an optional download?


wait, so the PS4 system RAM is shared between general memory and VRAM?

It has no dedicated VRAM? I didn't know that.


----------



## t00sl0w

So what is the consensus on this so far?
BS requirement or actuality?
And not just allocation; I mean an actual comparison showing massive performance loss from not having the VRAM available.
Also noting if the comparison uses something like a 770 vs. a Titan, as that is silly due to the power difference, period.

Quote:


> Originally Posted by *WhiteCrane*
> 
> wait, so the PS4 system RAM is shared between general memory and VRAM?
> 
> It has no dedicated VRAM? I didn't know that.


Yeah, both consoles use unified memory, with the Xbone adding the extra ESRAM layer.

In reality though, VRAM isn't the limiting factor for texture quality on the consoles; it's the actual power of the GPU and what it can process.


----------



## Chargeit

I can't wait to test my 780.

I will say one thing: I've been seeing some really bad minimums. A 30 FPS minimum is fine, but 5? That might be an indication of the game stuttering because it's having to swap textures on the fly. At least that makes sense.

Oh, and yea, the textures in the game look like old murky water from the pics. =P Game still looks fun.


----------



## RagingCain

Quote:


> Originally Posted by *Chargeit*
> 
> I can't wait to test my 780.
> 
> I will say one thing, I've been seeing some really bad min. 30 min is fine, but 5, that might be a indication of the game stuttering because it's having to swap textures on the fly. At leas that makes sense.
> 
> Oh, and yea, the textures in the game look like old murky water from the pics. =P Game still looks fun.


That probably just needs a driver patch to optimize for the game a bit better. There should be plenty of room for VRAM texture swapping on a 3GB card, if they didn't load everything into VRAM to begin with, whether in use or not.

For SLI users, someone suggested trying the Far Cry 3 profile.


----------



## Chargeit

Quote:


> Originally Posted by *RagingCain*
> 
> That probably just needs a driver patch to optimize for the game a bit better. There is plenty sorry, there should be plenty of room for VRAM texture swap on a 3GB card, if they didn't load everything, whether in use or not, into VRAM to begin with.
> 
> For SLI users, someone suggested try using the FarCry 3 profile.


Yea, lets hope it's a driver thing.

I'm not worried about the 3GB on my 780. I'm not one of those people who gets into the inner workings of the tech, but it would make sense to me that the larger bandwidth should help make up for less RAM.


----------



## NoDoz

2560x1600 SLI on Ultra



1080P SLI on Ultra


----------



## y2kcamaross

Quote:


> Originally Posted by *NoDoz*
> 
> 2560x1600 on Ultra


Is that with the working FEAR 3 sli profile, or just one card?


----------



## NoDoz

Here are my results with a single 980 now.

2560x1600



1080P



So as you can see, I'm barely getting an increase in performance using SLI. On a good note, when SLI is working properly, 980 SLI is going to crush this game, as a single card still gets 60 FPS on Ultra at 2560x1600.


----------



## Dasboogieman

Quote:


> Originally Posted by *NoDoz*
> 
> Here are my results with a single 980 now.
> 
> 2560x1600
> 
> 
> 
> 1080P
> 
> 
> 
> So as you can see Im barely getting an increase in performance using SLI. On a good note, when SLI is working properly 980SLI is going to crush this game as a single still gets 60fps on ultra at 2560x1600.


This is really, really weird: my CrossFire AMD 290s send the max FPS through the roof, yet the average is the same as a single GTX 980.
Not sure what to make of this game now.

When I disable CrossFire, the single 290 ranges from 25-90 FPS, averaging more toward the 75 FPS range; however, the max FPS is 600.


----------



## TopicClocker

Quote:


> Originally Posted by *t00sl0w*
> 
> so what is the consensus on this so far?
> BS requirement or actuality?
> and not past allocation, I mean, actual comparison showing massive performance loss in not having the VRAM available.
> also, noting if the comparison uses something like a 770 VS titan as that is silly due to the power difference period.
> yeah, both consoles use unified memory with the xbone adding the extra ESRAM layer.
> 
> in reality though, the VRAM isn't the limiting factor for the texture quality on the consoles, it would be the actual power of the GPU and what it can process.


Someone on Reddit said a few days ago that the developers are assuming you're going to use the game's super-sampling at 1080p, which would be something like 4K.

From what I've seen that may just be the case.

And the performance is fine.
Here's someone running Ultra settings at 1080p with High textures.




To enable SLI I've heard that using the F.E.A.R 3 profile may help.


----------



## FallenFaux

I think the game is running on the same engine as the Batman games. So maybe try one of the Arkham SLI profiles?


----------



## gooface

I have high hopes for my PC with this game. I might have to get a GTX 980 later this year.


----------



## TFL Replica

Quote:


> Originally Posted by *FallenFaux*
> 
> I think the game is running on the same engine as the Batman games. So maybe try one of the Arkham SLI profiles?


The Batman games use modified UE3, if I'm not mistaken. This uses Monolith's LithTech engine.


----------



## Dasboogieman

My bad, just managed to force Crossfire, now the darn game scales properly


----------



## FallenFaux

Quote:


> Originally Posted by *TFL Replica*
> 
> The Batman games use modified UE3, if I'm not mistaken. This uses Monolith's LithTech engine.


Oh yeah, you're right. I guess F.E.A.R. 3 probably is the best choice.


----------



## jtw473

After playing for a bit on 980s at 1440p with everything maxed out on Ultra, I'm dropping textures to High. VRAM usage is pegged at 4GB and I get stutters panning the camera and during combat. If you are stubborn, Ultra is definitely playable, but I can see why 6GB of VRAM is recommended, at least for 1440p. High doesn't look much worse and the actual gameplay experience is far better.


----------



## The Source

Quote:


> Originally Posted by *TFL Replica*
> 
> The Batman games use modified UE3, if I'm not mistaken. This uses Monolith's LithTech engine.


That is correct. I think I've posted it three times already, and others have as well, but no one reads, even a few pages back.

*FEAR 3 SLI PROFILE.*


----------



## DNMock

Quote:


> Originally Posted by *Dasboogieman*
> 
> My bad, just managed to force Crossfire, now the darn game scales properly


Did the game give your 290's 8gb of VRAM like it did me?


----------



## StrongForce

http://www.pcgameshardware.de/commoncfm/comparison/?id=121926

Did the person who made this slider thing really have the ultra textures installed? I mean, the rocks look pretty ugly, and they look identical, lol. I'll probably get this game when it's on sale; awaiting reviews out of curiosity now.


----------



## Ferreal

lol what's with AMD users getting 1000+ max fps?


----------



## Dasboogieman

Quote:


> Originally Posted by *DNMock*
> 
> Did the game give your 290's 8gb of VRAM like it did me?


Yeah it did. It might've been a misreport by GPU-Z though, since it's counting both GPUs. Though weirdly, it still reported 8GB of VRAM usage even when I had the game running on a single GPU (since CrossFire was disabled by mistake).

Also, where are people getting this texture pack from? I can't find any official sources, only dodgy torrents or Ukrainian warez.
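On those "8GB" readouts: with alternate-frame rendering, each GPU holds its own full copy of every texture and buffer, so per-card VRAM does not stack. A toy sketch of the gap between what a naive tool sums and what a frame can actually use:

```python
# In AFR CrossFire/SLI, each GPU mirrors the full working set, so a tool
# that sums memory across adapters over-reports the usable pool.
def vram_report(per_card_gb, num_cards):
    naive_total = per_card_gb * num_cards  # what a buggy readout shows
    usable = per_card_gb                   # what one frame can actually use
    return naive_total, usable

print(vram_report(4, 2))  # two 4GB 290Xs: "8GB" reported, 4GB usable
```

Which would explain the game listing 8GB for CrossFired 290Xs while behaving like a 4GB card.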


----------



## Crouch

A friend of mine with a 770 is running the game at Very High on a 1440p monitor and getting 45-60 FPS average. On Ultra at 1440p he's getting 30 FPS average with the texture pack installed. So at 1080p, I guess that 6GB VRAM madness is just absurd.


----------



## Yungbenny911

The only thing that annoys me is that they make blanket statements like "6GB is required for Ultra 1080p". Yes, 6GB might be required for SSAA x8 + Ultra textures + DDOF etc., but how many GPUs can actually run that at playable settings?

It's almost like they put VRAM first, before taking a GPU's processing capabilities into account. Like saying, "Oh, if your single GTX 770 happened to have 6GB of VRAM, you would definitely run everything on the highest settings possible @ 1080p."

Even a 6GB Titan wouldn't be able to run that at playable FPS.... (-__-)"


----------



## Dasboogieman

Quote:


> Originally Posted by *Yungbenny911*
> 
> The only thing that annoys me is that they just make blank statements like "6gb is required for Ultra 1080p". Yes 6Gb might be required for SSAA x8 + Ultra Textures + DDOF e.t.c., but how many GPU's can actually run that at playable settings?
> 
> It's almost like they put V-RAM first before they take into account a GPU's processing capabilities. Like saying, "Oh if your single GTX 770 happened to have 6GB of V-ram, you would definitely run everything on the highest settings possible @ 1080p"
> 
> Even a 6gb Titan wouldn't be able to run at playable FPS.... (-__-)"


I think the devs were really lazy. Similar to the Watch Dogs team, they simply cached the entire game environment in VRAM and had it stream in real time to the GPU. If you consider the bare-bones code and shaders actually required by the game, I doubt you'd use more than 2GB.


----------



## LaBestiaHumana

Based on some videos, the game looks to be running ok, unless you max out the vram, then you get dips.


----------



## Paladin Goo

My own benchmark with sig rig:


----------



## StrongForce

Quote:


> Originally Posted by *Yungbenny911*
> 
> The only thing that annoys me is that they just make blank statements like "6gb is required for Ultra 1080p". Yes 6Gb might be required for SSAA x8 + Ultra Textures + DDOF e.t.c., but how many GPU's can actually run that at playable settings?
> 
> It's almost like they put V-RAM first before they take into account a GPU's processing capabilities. Like saying, "Oh if your single GTX 770 happened to have 6GB of V-ram, you would definitely run everything on the highest settings possible @ 1080p"
> 
> Even a 6gb Titan wouldn't be able to run at playable FPS.... (-__-)"


Good point..

As for SLI/CrossFire showing double the VRAM, maybe the devs didn't know... you would think they would know that kind of stuff, though! Hah.


----------



## SoloCamo

Quote:


> Originally Posted by *Raven Dizzle*
> 
> My own benchmark with sig rig:


Thanks - glad to see it's fine

'Tis a shame the game's textures look blah at best, regardless of YouTube's compression.


----------



## The Source

Quote:


> Originally Posted by *StrongForce*
> 
> http://www.pcgameshardware.de/commoncfm/comparison/?id=121926
> 
> Did the person who made this slider thing really had the ultra textures installed ? I mean, the rocks looks pretty ugly, and they look identical lol will probably get this game when it on sale, awaiting reviews out of curiosity now.


Reviews have been out for a few days now, and the consensus is that the gameplay is actually good. Ground textures benefit most noticeably from the texture pack.

Yes, the devs should have been more specific when recommending 6GB, and I can't believe the internet ran amok over nothing.... again.


----------



## n780tivs980

So what they meant was 3GB, but doubled it for SLI, just like 290/290X users reporting that the game shows 8GB?


----------



## speedy2721

My GTX 780ti is at 1250 core and 7400 memory. I am using a 1440p monitor. Here are my results for high and ultra textures with everything else on max:

Ultra: Maxed out VRAM
Average- 62.16
Max- 249.25
Min- 27.81

High: Used 2400MB VRAM
Average- 62.96
Max- 135-280
Min- 35-40

The max seems to vary each time I run the benchmark on High, so that is affecting the average. The main differences between High and Ultra seem to be minimum FPS and memory usage.
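The point about a varying max skewing the average checks out: averaging per-frame FPS lets a handful of near-instant frames dominate, while a time-weighted average barely moves. A toy illustration with made-up frame times (not this benchmark's actual data):

```python
# Made-up frame times: mostly ~60 FPS with five near-instant frames,
# mimicking the FPS spikes seen in this game's benchmark.
frame_times_ms = [16.7] * 95 + [2.0] * 5

# Averaging per-frame FPS lets the 500 FPS spikes inflate the result...
mean_fps = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

# ...while the time-weighted average reflects what you actually experience.
true_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(round(mean_fps, 1), round(true_fps, 1))  # spiky average is much higher
```

So a benchmark that averages instantaneous FPS will jump around with the max, even when the felt frame rate is steady.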


----------



## Menta

stock 970 ultra pack and ultra


----------



## sugiik

So how/where do I download the ultra HD textures for the Steam version?


----------



## NoDoz

Here is my new result with the FEAR 3 SLI profile. An increase of 37 FPS.

2560x1600 on ULTRA


----------



## TopicClocker

Quote:


> Originally Posted by *sugiik*
> 
> so how / where to download ultra hd texture for steam version ?


I'm not sure if it's listed on Steam yet, but earlier today on NeoGaf people were using this.

https://steamdb.info/app/311670/


----------



## 47 Knucklehead

Quote:


> Originally Posted by *John Shepard*
> 
> So was the 6GB requirement bs or not?
> On the steam forums there are people claiming that they can run ultra on a 2GB card.


You can select whatever you want, but if your card doesn't have the memory, it automatically steps down to a resolution that it can play at.

So yes, if you only have a video card with 3GB of VRAM, you can select "Ultra", but it will only play using the 3GB "High" settings.

6GB is absurd. Also, I'm not impressed with the game anyway. It's basically "Batman" or "Assassins Creed" meets Mordor.


----------



## Glottis

Quote:


> Originally Posted by *StrongForce*
> 
> http://www.pcgameshardware.de/commoncfm/comparison/?id=121926
> 
> Did the person who made this slider thing really had the ultra textures installed ? I mean, the rocks looks pretty ugly, and they look identical lol will probably get this game when it on sale, awaiting reviews out of curiosity now.


no they didn't have ultra textures installed. here's updated comparison with ultra textures installed.

http://www.pcgameshardware.de/Mittelerde-Mordors-Schatten-PC-258069/Specials/Hands-on-Test-Ultra-Texturen-1137689/


----------



## TopicClocker

I can't wait for EuroGamer to do an analysis and a review of this game.









I'm hearing from multiple sources that the Xbox One version is 900p and the PS4 version is 1080p.

I'm not sure about the frame-rate but I'm hearing 30fps somewhere, and 60fps elsewhere, it's pretty unclear.


----------



## 970Rules

Quote:


> Originally Posted by *47 Knucklehead*
> 
> if you only have a video card with 3GB of VRAM, you can select "Ultra", but it will only play using the 3GB "High" settings.


What you're saying is downright false, and at worst it keeps the "6GB needed" myth alive.
If you have the HD texture pack installed and selected on a 780 Ti, it WILL be ultra; same for the 970/980....
I know this since I've already seen many 780 Ti users on reddit, and other GTX 900 series users, post screenshots of high vs ultra at the same in-game spot, comparing the two settings!

There's also a real fps difference between high and ultra. Take this stock 980 user's report: "Got 90 fps on high and 83 on ultra even with a 4gb card." If it were the same textures, why is he getting a 7 fps lower average on the same settings? You don't know what you're talking about, period, sir.

Now, for the people asking for a link (I know one has already been given here, yet people keep asking): it's on the official DLC list of the main game's Steam store page, for crying out loud.
Always double-check the Steam store page; 99% of the time it has the DLC/download links somewhere on that page.

http://store.steampowered.com/app/311670/


----------



## 47 Knucklehead

Quote:


> Originally Posted by *970Rules*
> 
> What you're saying is downright false, and it keeps the "6GB needed" myth alive.


http://www.gamefront.com/shadow-of-mordors-pc-ultra-settings-require-6gb-vram/
http://www.overclock3d.net/articles/gpu_displays/shadow_of_mordor_requires_6gb_of_vram/1
http://www.kitguru.net/gaming/matthew-wilson/shadow-of-mordor-wants-6gb-of-vram-for-1080p-ultra-settings/
https://www.facebook.com/permalink.php?story_fbid=744434572291050&id=162236020510911
http://n4g.com/news/1594176/middle-earth-shadow-of-mordor-pc-settings-unveiled-ultra-textures-require-6gb-of-vram
http://www.gamespot.com/forums/system-wars-314159282/shadow-of-mordor-needs-6gb-vram-to-max-out-texture-31583922/
http://tweakers.net/nieuws/98708/lotr-shadows-of-mordor-vraagt-6gb-vram-voor-optimale-textures.html

But by all means, keep calling people "idiots" and "trolls".










And since you link to Steam ...

http://steamcommunity.com/app/241930/discussions/0/613937306836737824/
Quote:


> Recommended for starting max graphics is 3GB VRAM for High graphics, 2 for Medium and 1 for Low. Recommended is just a suggestion and you may fare better with lower, but don't bet on it.
> If you look at the Recommended specs for this game (as written on Steam) you will see it recommends a GTX 660 or a Radeon HD 7950, neither of which has a 6GB variety.
> 
> *6GB VRAM is only required if you want to run the optional Ultra textures*, textures so high in quality you would need 6GB of VRAM for the PC to even load them; everything else remains the same in the game as High. Unless you are running a screen with 4K resolution there is little point worrying about Ultra textures; you will be perfectly fine with High on a 1080p screen or lower.


So YES, if you want to run on Ultra, you DO NEED 6GB of VRAM. Otherwise, AS I SAID, you will be running something that is NOT Ultra... aka HIGH for cards with 3-4GB of VRAM, as I said in my previous post.

PERIOD. END OF STORY.


----------



## NoDoz

Running the game at 2560x1600 on ultra maxes my 980s out at 4GB.


----------



## The Source

Quote:


> Originally Posted by *47 Knucklehead*
> 
> http://www.gamefront.com/shadow-of-mordors-pc-ultra-settings-require-6gb-vram/
> http://www.overclock3d.net/articles/gpu_displays/shadow_of_mordor_requires_6gb_of_vram/1
> http://www.kitguru.net/gaming/matthew-wilson/shadow-of-mordor-wants-6gb-of-vram-for-1080p-ultra-settings/
> https://www.facebook.com/permalink.php?story_fbid=744434572291050&id=162236020510911
> http://n4g.com/news/1594176/middle-earth-shadow-of-mordor-pc-settings-unveiled-ultra-textures-require-6gb-of-vram
> http://www.gamespot.com/forums/system-wars-314159282/shadow-of-mordor-needs-6gb-vram-to-max-out-texture-31583922/
> http://tweakers.net/nieuws/98708/lotr-shadows-of-mordor-vraagt-6gb-vram-voor-optimale-textures.html
> 
> But by all means, keep calling people "idiots" and "trolls".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And since you link to Steam ...
> 
> http://steamcommunity.com/app/241930/discussions/0/613937306836737824/
> So YES, if you want to run on Ultra, you DO NEED 6GB of VRAM. Otherwise, AS I SAID, you will be running in something NOT Ultra ... aka HIGH for cards with 3-4GB of VRAM. As I said in my previous post.
> 
> PERIOD. END OF STORY.


So now we need comparison shots of actual 6GB cards running ultra versus cards with 4GB and less running ultra? I'm not buying it. If it's manually selected, it's going to run it. I didn't see anything you posted suggesting otherwise, just people saying you _need_ 6GB to run it smoothly.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *The Source*
> 
> So now we need comparison shots of actual 6GB cards running ultra and those of 4GB and less running ultra? I'm not buying it. If it's manually selected, its going to run it. I didn't see anything you posted suggesting otherwise. Just people saying you _need_ 6GB to run it smoothly.


If only there were a demo or benchmark; I'd test it out and post screenshots and video. I'm still on the fence about buying it, and I certainly won't buy it just to benchmark.


----------



## kx11

Instead of enjoying this awesome game, people would rather argue about PS4 RAM.

Man, give it a rest.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *kx11*
> 
> instead of enjoying this awesome game people would like to argue about PS4 RAM
> 
> man give it a rest


That's the problem with a lot of PC users. "The grass and rocks must be realistic in order for them to enjoy a good game." lol

Just play the game with what you have and enjoy it; if you can't run HD textures, oh well. It's just one game.


----------



## The Source

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> If only there was a demo or benchmark, I'll test it out and post screenshots and video. I'm still on the fence about buying it. Certainly won't buy it to benchmark.


There is a benchmark included.


----------



## TopicClocker

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> That's the problem with a lot of PC users. "The grass and rocks must be realistic in order for them to enjoy a good game." lol
> 
> Just play the game with what you have and enjoy it; if you can't run HD textures, oh well. It's just one game.


There's more to it than that; people are fearful of how much memory the next-generation consoles have.


----------



## Newbie2009

How does this game run on 290 cards? I'm deciding whether to go console or PC depending on how the drivers are.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *The Source*
> 
> There is a benchmark included.


Free?


----------



## The Source

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> That's the problem with a lot of PC users. "The grass and rocks must be realistic in order for them to enjoy a good game." lol
> 
> Just play the game with what you have and enjoy it; if you can't run HD textures, oh well. It's just one game.


What we care about is results that reflect the quality given. We have been seeing the same visual quality for the last few years while hardware requirements continually rise. It's a concerning trend.
Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Free?


No. If you have no interest in the game, why participate in the conversation?


----------



## Rickles

I remember when games didn't have grass, and then games where grass appeared and was dynamic. Darn Pokemon has spoiled me.


----------



## Matthew89

970 Oc'd
Sometimes the benchmark starts with low mins and a very high max fps; it's weird lol


----------



## Chargeit

True all that.

I know I'd like to see what it does with ultra textures on my 780, but I'm not even going to download them until later. Most likely I'll wait until I'm ready to make the move to a newer 6 or 8GB card, whenever those hit. Until then, I'll play this sob with high textures and a freaking Xbox controller.









It's funny how excited I now am about a game I hadn't really thought of until the other day.

I will kill many Orcs.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *The Source*
> 
> What we care about is results that reflect the quality given. We are still seeing the same quality for the last few years and hardware requirements are continually rising. It's a concerning trend.


I don't know if Ultra looks better than or the same as Very High, but all I'm saying is that if High is all your PC can handle, then enjoy the game.

A lot of games from 2011-2012 don't look as good as some of the latest titles. There may be a few exceptions, but improvements have been made. We all know the number 1 priority right now is consoles, and we shall continue to get lazily ported games simply because more money is made from console versions.


----------



## Chargeit

Quote:


> Originally Posted by *Rickles*
> 
> i remember when games didn't have grass, and then games that had grass and it was dynamic, darn pokemon has spoiled me.


I remember when games looked like this...



Then suddenly games looked like this...



And now games look like this!



Yea.

*Search Pacman for a cool pacman game around a Google logo. =D


----------



## 47 Knucklehead

Quote:


> Originally Posted by *The Source*
> 
> So now we need comparison shots of actual 6GB cards running ultra and those of 4GB and less running ultra? I'm not buying it. If it's manually selected, its going to run it. I didn't see anything you posted suggesting otherwise. Just people saying you _need_ 6GB to run it smoothly.


http://gearnuke.com/middle-earth-shadow-mordor-low-vs-ultra-quality-comparison-confirms-6-gb-vram-requirement/


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Chargeit*
> 
> I remember when games looked like this...
> 
> 
> 
> Then suddenly games looked like this...
> 
> 
> 
> And now games look like this!
> 
> 
> 
> Yea.
> 
> *Search Pacman for a cool pacman game around a Google logo. =D


One day they will look like this:


----------



## The Source

Quote:


> Originally Posted by *47 Knucklehead*
> 
> http://gearnuke.com/middle-earth-shadow-mordor-low-vs-ultra-quality-comparison-confirms-6-gb-vram-requirement/


That's a bit clearer. But some are saying this is another case of "it will use as much VRAM as it can." That doesn't mean it will scale back manually selected settings if it doesn't like the hardware.

Maybe the game needs an update or two.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *47 Knucklehead*
> 
> http://gearnuke.com/middle-earth-shadow-mordor-low-vs-ultra-quality-comparison-confirms-6-gb-vram-requirement/


Ultra does look significantly better than high.


----------



## kx11

Quote:


> Originally Posted by *Matthew89*
> 
> 970 Oc'd
> Sometimes the Benchmark starts with low mins and very high max fps it's weird lol


That benchmark reminds me A LOT of the Thief benchmark, like it's the same but with a sun and some weird creatures.


----------



## cstkl1

Quote:


> Originally Posted by *The Source*
> 
> That's a bit more clear. But some are saying this is another case of it will use as much vram as it can. That doesn't mean it will scale back manually selected settings if it doesn't like the hardware.
> 
> Maybe the game needs an update or two.


No it doesn't. So far at 2560x1080 it's about 5GB.
It is smooth, but I really hope the SLI will be fixed. I really wanna play it at 144Hz.


----------



## Chargeit

A fully modded Skyrim looks pretty damned nice.





I didn't get too crazy with mine when I tested it out, but it looks really good.


----------



## cstkl1

Quote:


> Originally Posted by *Matthew89*
> 
> 970 Oc'd
> Sometimes the Benchmark starts with low mins and very high max fps it's weird lol


Check the Excel sheet in the save game folder WB creates in Documents. It will show the fps per frame for that benchmark.
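For anyone who wants to sanity-check the reported numbers themselves, here's a rough sketch of summarising such a per-frame log. The file layout (one fps reading per row of a CSV) is an assumption, not a documented format; adjust the parsing to match whatever the real sheet contains.

```python
# Hypothetical: summarise a per-frame fps log exported by a benchmark.
import csv
import io

def fps_stats(fps_values):
    """Return (min, max, average) for a list of per-frame fps readings."""
    fps = list(fps_values)
    return min(fps), max(fps), sum(fps) / len(fps)

# Stand-in data in place of the real log file:
sample_log = "62.1\n58.4\n27.8\n61.0\n60.5\n"
frames = [float(row[0]) for row in csv.reader(io.StringIO(sample_log))]
lo, hi, avg = fps_stats(frames)
print(lo, hi, avg)
```

With the real sheet you would pass `open(path)` to `csv.reader` instead of the inline sample.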


----------



## Alatar

Highest usage during the benchmark at 2560x1440

Not sure if that actually matters at all. I think I'll now play a while and when I'm in a place where I can easily do comparisons I'll do that.


----------



## TopicClocker

Quote:


> Originally Posted by *47 Knucklehead*
> 
> http://gearnuke.com/middle-earth-shadow-mordor-low-vs-ultra-quality-comparison-confirms-6-gb-vram-requirement/


Why are the PS4 Gamers on NeoGaf saying that the game runs at 30fps?
That article links to another which claims 60fps?


----------



## cstkl1

Quote:


> Originally Posted by *TopicClocker*
> 
> Why are the PS4 Gamers on NeoGaf saying that the game runs at 30fps?
> That article links to another which claims 60fps?


Cutscenes and finishing moves. Not sure if it's a monitor Hz thing, etc.


----------



## cstkl1

Quote:


> Originally Posted by *Alatar*
> 
> 
> 
> Highest usage during the benchmark at 2560x1440
> 
> Not sure if that actually matters at all. I think I'll now play a while and when I'm in a place where I can easily do comparisons I'll do that.


Finally, another one the same. Here the benchmark is 4.5GB and real in-game it's about 5. I assume they went with the 6GB requirement since there is no 5GB VRAM graphics card.

Bro, check your system memory. This game eats up a lot: a full 8GB on top of my background, ending up around 10GB.


----------



## TopicClocker

GAMEGPU benchmarks.
http://www.overclock.net/t/1516227/gamegpu-middle-earth-shadow-of-mordor


----------



## Pawelr98

Quote:


> Originally Posted by *47 Knucklehead*
> 
> You can select whatever you want, but if your card doesn't have the memory, it automatically steps down to a resolution that it can play at.
> 
> So yes, if you only have a video card with 3GB of VRAM, you can select "Ultra", but it will only play using the 3GB "High" settings.
> 
> 6GB is absurd. Also, I'm not impressed with the game anyway. It's basically "Batman" or "Assassins Creed" meets Mordor.


GTA IV also has such a system, but it can be bypassed with a single line in a shortcut.
I will check what this game can do, because my brother ordered it. I'm not going to play it, but I will test it on his 7950 3GB.


----------



## Chargeit

Quote:


> Originally Posted by *cstkl1*
> 
> Bro, check your system memory. *This game eats up a lot: a full 8GB* on top of my background, ending up around 10GB.


I guess going for 16GB of system RAM instead of 8GB will pay off for more than just a RAM disk and SSD caching.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Chargeit*
> 
> A fully modded Skyrim looks pretty damned nice.
> 
> 
> 
> 
> 
> I didn't get too crazy with mine when I tested it out, but it looks really good.


Yes it does, and also uses more than 3GB of Vram at 1080p.


----------



## Chargeit

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Yes it does, and also uses more than 3GB of Vram at 1080p.


The mods you see in that screenshot didn't use 3GB of VRAM (I don't think they did, at least; it's been a while), maybe 2-2.5. Though yeah, you can get there, I guess, if you get really crazy with it.

I just used that "Realvision" guide and did everything max. I think I cut back on the grass though; I wanted to keep it at a playable fps, which it was.


----------



## cstkl1

Quote:


> Originally Posted by *Chargeit*
> 
> I guess going for 16gb system ram instead of 8gb will pay off for more then just Ramdisk and SSD caching.


Yeah, even Watch Dogs needed about 8GB in total, mainly using 6GB itself. This is the first time I have seen it this high.


----------



## Sisaroth

Are there any benchmarks up yet? And I mean actually good ones that don't just show memory used, but actually compare the same 3 (or 4) GB cards to 6 (or 8) GB cards to see if there really is a difference in performance.


----------



## cstkl1

Quote:


> Originally Posted by *Sisaroth*
> 
> Are there any benchmarks up yet? And I mean actually good ones that don't just show memory used, but actually compare the same 3 (or 4) GB cards to 6 (or 8) GB cards to see if there really is a difference in performance.


Judging from wolfy using uncompressed textures, I think it affects frame time variance, so current benches based on average fps won't suffice.
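A quick illustration of why average fps can hide this: converting per-frame fps to frame times and looking at the spread exposes hitches that an average flattens out. The numbers below are made up for the example.

```python
# Two runs with nearly identical average fps, one with a single long
# hitch of the kind texture streaming can cause when VRAM runs out.
import statistics

def frame_times_ms(fps_values):
    """Convert per-frame fps readings to frame times in milliseconds."""
    return [1000.0 / f for f in fps_values]

def stutter_report(fps_values):
    times = frame_times_ms(fps_values)
    return {
        "avg_fps": round(len(times) * 1000.0 / sum(times), 1),
        "frametime_stdev_ms": round(statistics.stdev(times), 2),
        "worst_frame_ms": round(max(times), 2),
    }

smooth = [60.0] * 60              # steady 60 fps, no spread at all
hitchy = [62.0] * 59 + [15.0]     # similar average, one long hitch
print(stutter_report(smooth))
print(stutter_report(hitchy))
```

Both runs average close to 60 fps, but the second has a 66 ms worst frame and a much larger frame-time spread, which is exactly what avg-fps benches miss.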


----------



## TopicClocker

Quote:


> Originally Posted by *cstkl1*
> 
> Yeah even watch dog needed abt 8gb on total. Mainly using 6gb etc. this is a first i have seen this high.


Personally I never had trouble running Watch Dogs with 6GB of RAM.
Although there was this one time when I was playing Watch Dogs and I was tired, so I paused the game and went to sleep; when I woke up and started playing again it was crawling LOL. I think this happened twice, from what I recall.


----------



## cstkl1

Quote:


> Originally Posted by *TopicClocker*
> 
> Personally I never had trouble running Watch Dogs with 6GB of RAM.
> Although there was this one time I was tired and went to sleep; when I woke up and started playing again it was crawling LOL.


It depends on the read/write/copy speeds, doesn't it?

I really hope Nvidia's unified memory thing will materialize.


----------



## DrBrogbo

Quote:


> Originally Posted by *TopicClocker*
> 
> Personally I never had trouble running Watch Dogs with 6GB of ram.
> Although there was this one time when I was playing Watch Dogs and I was tired so I paused the game and I went to sleep, when I woke up and started playing again it was crawling LOL, I think this happened twice from what I recall.


That says more about the failings of the game than running it with 6GB RAM.


----------



## Menta

How are you all doing the test on ultra, V-Sync on or off?

It makes a huge difference.

Some benches I have seen here on some cards won't bench like that with V-Sync ON!

970 overclocked



ultra VSync off 1080p


----------



## MonarchX

I apologize if this was already asked, but does this game run faster on 4GB cards with Ultra textures than on 3GB cards with Ultra textures @ 1080p? GTX 980 vs GTX 780 Ti @ 1080p - does GTX 780 Ti suffer a large FPS loss due to having only 3GB VRAM??? Any benches?


----------



## The Source

Well, I tried playing at 1440p with ultra textures on 3GB 780's and it's a bit stuttery using the latest WHQL driver. Usage tanks when it stutters, so it's VRAM. Any resolution above 1440p is unplayable and results in a CTD with a low-memory error. The game barely uses any system RAM, so I don't know where some of you are seeing that usage. The game needs AA, so resolution scaling is kind of a must. I might just shelve this until later.


----------



## kx11

The game is updating with the HD content (3.7GB).

We'll see after that.


----------



## cstkl1

Quote:


> Originally Posted by *The Source*
> 
> Well, I tried playing at 1440p with ultra textures on 3GB 780's and it's a bit stuttery. Any resolution above 1440p is unplayable and results in a CTD with a low-memory error. The game barely uses any system RAM, so I don't know where some of you are seeing that usage. The game needs AA, so resolution scaling is kind of a must. I might just shelve this until later.


On the RivaTuner OSD.

How did you come up with the conclusion that it doesn't, with that limited game time?


----------



## delellod123

On this page, memory usage is only 3GB? It is an SLI or Crossfire rig, both with 3GB duplicated across the cards. Without a good profile, GPU 2 isn't being utilized, as shown by its 15% usage. Why does this site say the screenshots confirm 6GB?


----------



## delellod123

Quote:


> Originally Posted by *TopicClocker*
> 
> GAMEGPU benchmarks.
> http://www.overclock.net/t/1516227/gamegpu-middle-earth-shadow-of-mordor


Sorry, I was referring to this...


----------



## MonarchX

http://www.pcgameshardware.de/Mittel...turen-1137689/ shows that you even need 4GB to run the game on HIGH, let alone ULTRA! Is that WITH SSAA or WITHOUT SSAA?

One way or another, this game is NOT stunning looks-wise. It's great, but not amazing...


----------



## delellod123

Quote:


> Originally Posted by *TopicClocker*
> 
> Why are the PS4 Gamers on NeoGaf saying that the game runs at 30fps?
> That article links to another which claims 60fps?


I meant this... sorry for the triple post; I am used to being able to edit my posts.


----------



## cstkl1

Hmm, there was a 96MB update just now; no idea what it is.

Ah, finally, the HD content...

Crap, I was benchmarking just now without the HD content...


----------



## Alatar

Some steam screenshots:


Spoiler: Warning: Spoiler!


----------



## cstkl1

Quote:


> Originally Posted by *Alatar*
> 
> Some steam screenshots:


That's definitely better than what I saw.

Waiting for the HD content pack to download; maybe it's region-delayed on Steam.

Funny that there was a difference between ultra and high even without it.


----------



## cstkl1

Anyone wonder what that 96MB update is all about?


----------



## Alatar

No idea about the sub-100MB patch, but I do know the HD content pack does something, since I now have a 3.7GB download:


----------



## cstkl1

Quote:


> Originally Posted by *Alatar*
> 
> No idea about the sub 100mb patch but I do know that the HD content pack does something since I now have a 3.7GB download:


That's the HD content pack.

Now let's see how this does. Makes me wonder about all those people who posted just now: how many actually had it?

Waiting for the download to complete... rechecked the DLC content, and now it shows the option. It wasn't there before.


BTW, how fares the Haswell-E on phase?


----------



## LaBestiaHumana

Quote:


> Originally Posted by *The Source*
> 
> If you have no interest in the game, why participate in the conversation?


I do have interest, but I'm done purchasing games that require some fixin'. Once SLI works and it's confirmed to be running smoothly, I'll pick it up immediately. What I won't do is buy it just for benchmarks; I thought I made that clear.

Game looks promising, I like how Ultra actually differs from High settings.


----------



## SONICDK

So does this mean we need to wait for the 9xx 8GB models?


----------



## Qu1ckset

*Here is my Benchmark with the Ultra Textures installed with everything on max @2560x1080 with a 780ti 3Gb*


----------



## cstkl1

Quote:


> Originally Posted by *Qu1ckset*
> 
> *Here is my Benchmark with the Ultra Textures installed with everything on max @2560x1080 with a 780ti 3Gb*


What's that card clocked at?

I will do a head-to-head comparison with yours, although my RAM is kinda insanely optimized.


----------



## Murlocke

Quote:


> Originally Posted by *Qu1ckset*
> 
> *Here is my Benchmark with the Ultra Textures installed with everything on max @2560x1080 with a 780ti 3Gb*


What a joke. Glad I waited to buy this... Anyone with 1600p or higher is probably going to need 8GB of VRAM.


----------



## Qu1ckset

Quote:


> Originally Posted by *cstkl1*
> 
> whats that gc clocked at.
> 
> will do a head to head comparison with yours although my ram is kindda optimized insanely


My card's at stock clocks; I haven't overclocked anything on my system due to my case's PSU only being 500 watts.


----------



## rt123

So can somebody please tell me how much system RAM this game uses while running on Ultra?

I remember reading 8GB somewhere.
Just want to be sure.


----------



## cstkl1

Quote:


> Originally Posted by *rt123*
> 
> So can somebody please tell me how much System RAM this game uses while running on Ultra..?
> 
> I remember reading 8GB somewhere.
> Just want to be sure.


It took a full 8.5GB alone on top of a 1.5GB background, but that was about 1 hour into gameplay, without the ultra texture pack installed.


----------



## Azefore

Alright guys, figured I'd share as well, this is everything on ultra without v-sync, 2560x1440 with a GTX 780 on my 1st level OC (1215/6700)



*Has hit my vram limit on it as you can see but playable without turning down anything yet


----------



## rt123

Quote:


> Originally Posted by *cstkl1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rt123*
> 
> So can somebody please tell me how much System RAM this game uses while running on Ultra..?
> 
> I remember reading 8GB somewhere.
> Just want to be sure.
> 
> 
> 
> It took a full 8.5GB alone on top of a 1.5GB background, but that was about 1 hour into gameplay, without the ultra texture pack installed.
Click to expand...

That was quick.
Thank you Sir.


----------



## villain

I don't see much of a difference between High and Ultra. The grass looks better and some textures are slightly more detailed, but you wouldn't know unless you had two pictures to compare. I'm more impressed by how downsampling gets rid of all the rough edges. It looks so much better, but it's obviously a huge hit in terms of performance.

*Middle mouse click or left mouse click and select Original to be able to see the full sized pictures.*

High vs. Ultra


1200p vs 2400p (downsampled)


----------



## Qu1ckset

Quote:


> Originally Posted by *Azefore*
> 
> Alright guys, figured I'd share as well, this is everything on ultra without v-sync, 2560x1440 with a GTX 780 on my 1st level OC (1215/6700)
> 
> 
> 
> *Has hit my vram limit on it as you can see but playable without turning down anything yet


Is V-Sync even active while benching? I have it on, but I thought that stuff was disabled during benchmarks?


----------



## Murlocke

Quote:


> Originally Posted by *villain*
> 
> I don't see much of a difference between High and Ultra. The grass looks better and some textures are slightly more detailed, but you wouldn't know unless you had 2 pictures to compare. I'm more impressed how downsampling gets rid of all the rough edges. It looks so much better, but it's obviously a huge hit in terms of performance.


Welcome to the world of JPG compression and downsized pictures. It was the same thing with Watch Dogs; you really needed to be playing the game to see the difference. No one really hosts uncompressed images.
Quote:


> Originally Posted by *Qu1ckset*
> 
> Is V-Sync even active while benching? I have it on, but I thought that stuff was disabled during benchmarks?


If you enable it or force it on, it's always going to be on while rendering something.


----------



## KenjiS

Well the result is in!

Fired up Shadow of Mordor on my new GTX 970, 1440p + Ultra settings = 58fps in the benchmark

Also uses about 3.5gb of VRAM

So I think I'm safe in saying that yes, the game is VRAM limited


----------



## cix92

Quote:


> Originally Posted by *cstkl1*
> 
> Some people can't see that min fps drop.
> It does prove you need to click it bigger and see the fps on it, or actually test the game.


Min fps in that bench is irrelevant since it's from the initial loading stutter; it doesn't happen during the middle of the actual benchmark.

6GB is just super overkill; some people with actual knowledge explained it earlier in this thread.

Also, the textures being shown clearly don't look like they need more than 2-3GB of VRAM; it's just badly coded PC memory management.


----------



## rt123

Quote:


> Originally Posted by *KenjiS*
> 
> Well the result is in!
> 
> Fired up Shadow of Mordor on my new GTX 970, 1440p + Ultra settings = 58fps in the benchmark
> 
> Also uses about 3.5gb of VRAM
> 
> So I think I'm safe in saying that yes, the game is VRAM limited


How much system RAM usage.?


----------



## KenjiS

Quote:


> Originally Posted by *rt123*
> 
> How much system RAM usage.?


Wasn't paying attention. As soon as I'm done sorting out Rome Total War II's sudden failure to load, I will rebench and tell you...


----------



## cix92

I just don't believe that the textures in this game require more than 2GB of video RAM. 3GB would be the absolute maximum, since it's "open world."

There are tons of other games that have similar-quality textures and use far, far less VRAM.

Questionable ports like this shouldn't be taken seriously as a predictor of future PC games. It's laughable for someone to actually buy a 6+ GB GPU just because someone didn't code their game properly on PC. Let alone the actual game; from what I saw it's just Batman and Assassin's Creed mashed up with Mordor, a casual brawler/slasher.


----------



## Murlocke

Quote:


> Originally Posted by *cix92*
> 
> Min fps in that bench is irrelevant since it's from the initial loading stutter; it doesn't happen during the middle of the actual benchmark.
> 
> 6GB is just super overkill; some people with actual knowledge explained it earlier in this thread.
> 
> Also, the textures being shown clearly don't look like they need more than 2-3GB of VRAM; it's just badly coded PC memory management.


There are plenty of benchmark screenshots with 60+ FPS minimums. If you are all of a sudden going from 60+ during the initial loading to 19 FPS, then you are likely hitting the VRAM cap and the game is having to dump textures constantly. The benchmark is a small area of the world, so it doesn't have to stream many new textures in. It would be very noticeable in real gameplay, especially when moving the camera around, since you constantly have to load new textures.
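One way to separate the two cases in a per-frame log is to drop an assumed warm-up window before taking the minimum: a one-off load spike disappears, while sustained VRAM thrashing keeps producing low readings. A minimal sketch, with the warm-up length being a guess rather than anything the benchmark documents:

```python
def steady_state_min(fps_values, warmup_frames=120):
    """Minimum fps after discarding an assumed warm-up window."""
    steady = fps_values[warmup_frames:]
    return min(steady) if steady else min(fps_values)

load_spike_only = [19.0, 22.0] + [60.0] * 300    # dip only at load-in
vram_capped = [19.0, 22.0] + [60.0, 31.0] * 150  # dips keep recurring
print(steady_state_min(load_spike_only, warmup_frames=2))
print(steady_state_min(vram_capped, warmup_frames=2))
```

The first run's minimum recovers to the steady 60 once the load-in is trimmed; the capped run still shows the recurring dips.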


----------



## Qu1ckset

*Here is my Benchmark with the Ultra Textures installed with everything on max @2560x1080 with a 780ti 3Gb *No Vsync**


*Here is my Benchmark with the Ultra Textures installed with everything on max @2560x1080 with a 780ti 3Gb *With Vsync**


----------



## villain

Quote:


> Originally Posted by *Murlocke*
> 
> Welcome to the world of JPG, Compression, and downsized pictures. It was the same thing with Watch Dogs, you really needed to be playing the game to see the difference. No one really hosts uncompressed images.


Did you open them in full size? These are my pictures, and they are a good representation of the differences I saw when I ran the benchmark with different settings. High and Ultra don't look that different unless you can compare them side by side. The difference between 1200p and 2400p in terms of smooth edges is significant and can be seen in that picture.


----------



## rt123

Quote:


> Originally Posted by *KenjiS*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rt123*
> 
> How much system RAM usage.?
> 
> 
> 
> Wasnt paying attention, As soon as im sorting out Rome Total War II's failure to load all of a sudden I will rebench and tell you...

Thanks.
Take your time.


----------



## Promisedpain

Quote:


> Originally Posted by *Qu1ckset*
> 
> *Here is my Benchmark with the Ultra Textures installed with everything on max @2560x1080 with a 780ti 3Gb *No Vsync**


Is it playable on ultra? I actually have the same card. 1080p.


----------



## Murlocke

Quote:


> Originally Posted by *villain*
> 
> Did you open them at full size? These are my pictures and they are a good representation of the differences I saw when I ran the benchmark with different settings. High and Ultra don't look that different unless you can compare them side by side. The difference between 1200p and 2400p in terms of smooth edges is significant and can be seen in that picture.


I did, but I'm on 3440x1440 so the picture is still tiny for me. I agree the difference in those pictures is small, but I would imagine it would be larger if I were rendering it myself. Need more close-ups of walls/ground to really see the benefits of higher resolution textures.









1200p vs 2400p is major, even on your screenshots.


----------



## Qu1ckset

Quote:


> Originally Posted by *Promisedpain*
> 
> Is it playable on ultra? I actually have the same card. 1080p.


I only played about 10 minutes into the start of the game and everything runs smooth as butter. I probably won't get a good playthrough in until the weekend though, since I'm on overnights this week, but yeah, I don't see any issues whatsoever with ultra textures...


----------



## KenjiS

Quote:


> Originally Posted by *rt123*
> 
> Thanks.
> Take your time.


About 3gb.

Did a bit of tweaking. Here are two more benchmark runs:

1440p Ultra Textures:



64.97 Average, Not bad at all!

Now for giggles, 2160p (4k) Supersampled, Ultra Textures:



Ignore the min FPS; that was just the load-in. During the entire sequence it was pegged at that 36fps number...

Either is easily playable..

Now I need to eat and then I'd like to actually play some games!


----------



## cstkl1

Quote:


> Originally Posted by *rt123*
> 
> How much system RAM usage.?


I had my OSD label wrong; the system memory used maxed out around 5GB.


----------



## rt123

Quote:


> Originally Posted by *KenjiS*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rt123*
> 
> Thanks.
> Take your time.
> 
> 
> 
> About 3gb.

Thanks.
Quote:


> Originally Posted by *cstkl1*
> 
> I had my OSD label wrong; the system memory used maxed out around 5GB.


Okay.


----------



## phenom01

2560x1440 downsampled to 1920x1080 HD patch enabled all settings maxed.

















The in-game FPS bar during the test causes a bit of stuttering. In the game itself it never drops below 60.


----------



## cstkl1

Quote:


> Originally Posted by *cix92*
> 
> Min. fps in that bench is irrelevant since it's from the initial loading stutter; it doesn't happen during the middle of the actual benchmark.
> 
> 6GB is just super overkill, some people with actual knowledge explained it earlier in this thread.
> 
> Also the textures being shown clearly don't look like they need more than 2-3GB of VRAM, it's just badly coded PC memory management.


Was that somebody who wrote the game or designed the engine? Until then, how did you come up with what's real and what's not?
Same thing again... people claimed Watch Dogs was fine with a 780 Ti on ultra. It wasn't. So this debate I can only test over the weekend, since both rigs will have close to the same spec.


----------



## Promisedpain

Quote:


> Originally Posted by *cstkl1*
> 
> Was that somebody who wrote the game or designed the engine? Until then, how did you come up with what's real and what's not?
> Same thing again... people claimed Watch Dogs was fine with a 780 Ti on ultra. It wasn't. So this debate I can only test over the weekend, since both rigs will have close to the same spec.


Wasn't Watch Dogs the game where even people with a Titan had problems while driving?

https://www.youtube.com/watch?v=GoMCLFNoRik Seems to drop to the 40s, maybe even the 30s, on a Titan.


----------



## The Source

Quote:


> Originally Posted by *KenjiS*
> 
> About 3gb.
> 
> Did a bit of tweaking, Heres two more Benchmark runs:
> 
> 1440p Ultra Textures:
> 
> 
> 
> 64.97 Average, Not bad at all!
> 
> Now for giggles, 2160p (4k) Supersampled, Ultra Textures:
> 
> 
> 
> Ignore the min FPS, That was just the load in, during the entire sequence it was pegged at that 36fps number...
> 
> Either is easily playable..
> 
> Now I need to eat and then I'd like to actually play some games!


I'll eat my shoe if it's playable in game. The benchmark runs a lot better than in game.


----------



## Chargeit

Wow, the game sure doesn't seem to like the fact that I have Nvidia surround enabled. I didn't notice a proper 1920x1080 option.









You have to wonder how an indie game can handle this without a problem, yet an AAA game acts all ******ed about it... Annoyance level rising.


----------



## supermi

I installed the game on my M18x R2 with a 780M (4GB) at 1019MHz, and I played at 720p with ultra textures installed, all settings at max and supersampling from 1080p.
4GB is pegged most of the time, though not always; no slowdowns...
With ultra textures but no supersampling I was at around 3.2GB at 720p...
Supersampling to 1080p cut FPS from a 60-something average (32 or so minimum) to a 45-47 average with a low of 17 in the benchmark. The game so far is smooth with an Xbox controller.

I am installing the game on my main rig now and HOPING SLI works or is fixed soon and that it works with surround vision









Combat is fun and the game is cool. 720p, even with supersampling, is low res







but 132 inches makes up for it a little ... again hope surround works cause I do want more pixels LOL!


----------



## Chargeit

Quote:


> Originally Posted by *supermi*
> 
> combat is fun and game is cool 720p even with supersampling is low res
> 
> 
> 
> 
> 
> 
> 
> but 132 inches makes up for it a little ... again *hope surround works cause I do want more pixels LOL*!


It don't.


----------



## supermi

Quote:


> Originally Posted by *Chargeit*
> 
> It don't.


YAY good times


----------



## Chargeit

Quote:


> Originally Posted by *supermi*
> 
> YAY good times


It kind of evens out since you can't switch to 1080p. You're going to have to turn off surround if you want to play the game without it being displayed on the middle screen, and 1/6 of each side screen.


----------



## ElectroManiac

Sorry if this was asked before, but where do I download the ultra texture pack?


----------



## supermi

Quote:


> Originally Posted by *ElectroManiac*
> 
> Sorry if this was asked before, but where do I download the ultra texture pack?


You can download it right from steam, just go to the game's steam page in the store and look under the DLC









Quote:


> Originally Posted by *Chargeit*
> 
> It kind of evens out since you can't switch to 1080p. You're going to have to turn off surround if you want to play the game without it being displayed on the middle screen, and 1/6 of each side screen.


I do love surround, but I am considering letting my lovely 120hz surround 3d monitors go and getting a single 4k screen then the Rift for my 3d needs


----------



## KenjiS

Quote:


> Originally Posted by *The Source*
> 
> I'll eat my shoe if it's playable in game. The benchmark runs a lot better than in game.


Will find out later. Sadly I'm too busy to actually sit and play -sigh-

Can't people stop interrupting Kenji's fun time :C


----------



## Leopard2lx

I am getting 97 fps average at 1080p Max Settings + High Textures

and 60 fps average at 1600p Max Settings + High Textures

Not bad. Also, my min frames were 40+ fps. This is with a 780 @ 1320 MHz.

Now to try the HD texture setting....


----------



## Sannakji

Quote:


> Originally Posted by *Clocknut*
> 
> as I said b4.. new console port = Large VRAM req.
> 
> it has begun....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX950/960 4Gb > 780ti 3GB


Nowt to do with port. Textures are just large.


----------



## The Source

So the res scaling option jumps from 100% to 150%, is there a way to make a custom res of say 125%?

And honestly, I'm already getting bored of the repetitive gameplay.
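On the res-scaling question: the factor applies per axis, so the pixel (and roughly VRAM) cost grows with its square. A quick sketch of what the steps work out to; the 125% step is hypothetical since the game only exposes 100%/150%:

```python
# Effective render resolution for a resolution-scale percentage.
# The factor applies to each axis, so pixel count scales with its square.

def scaled_res(width, height, percent):
    s = percent / 100
    return round(width * s), round(height * s)

w, h = scaled_res(2560, 1440, 150)
print(w, h)                         # 3840 2160 (i.e. 4K supersampled)
print((w * h) / (2560 * 1440))      # 2.25x the pixels of native

# A hypothetical 125% step would land in between:
print(scaled_res(2560, 1440, 125))  # (3200, 1800)
```

That squared growth is why 150% feels like such a big jump from 100%: it more than doubles the render load in one step.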


----------



## fizzle

Running a 780 TI here @ 1214 Mhz. The benchmarks with the ultra textures are meaningless. When you actually play the game and the screen gets populated with enemies it turns into a slide show as the VRAM fills up. Shame because the visuals are incredible with the HD textures. I wonder if there's a workaround?


----------



## supermi

Ok so surround works fine!

But the aspect ratio in landscape surround DOES NOT. Portrait Surround is GREAT!

I have 3x 27-inch 1080p displays, so 3/4 of a 4K, with SLI Titans. SLI seems to be working, ALL ULTRA including the texture pack... native resolution

114 high
35 average
17 low

VRAM 5.5gb

Seemed smooth actually no stutters or any issue.

Maybe that small patch earlier helped SLI? Not using Inspector and no renamed .exe, just plug and play










Seems fun I hope the story keeps it from feeling repetitive !


----------



## Chargeit

Maxed out 1080p without the high res texture pack using my 780...



Will test with ultra textures when I finish dl'ing them later.

This game has some big issues, such as not allowing you to set your own res like every computer game I've ever played, but it isn't unplayable at high settings.

I agree with "The Source". I've only put about 30 min of game time into this, and it's already getting old.


----------



## fizzle

BTW Guys, if you don't want to supersample the nvidia control panel FXAA works, and makes the game look much better in my opinion.


----------



## Azefore

Got ~5 hours of gameplay in and 7/20 main missions done. I turned back a few of the graphical settings to stay above 60 fps. With everything on ultra, in certain scenes I would see FPS sit at ~22; I think those were mostly in strongholds, but it was pretty bad.


----------



## DrBrogbo

Just another user to confirm that 6GB of VRAM is not required. My 980 keeps it buttery smooth above 60 at all times (benchmark was 88fps). VRAM usage is pegged at 4030MB or something like that, but no stuttering or choppiness.

Not sure how well it does with 3 or 2GB, though. MAN those ultra textures look fantastic.


----------



## fizzle

Interesting. With 3GB mine definitely stutters during combat with multiple enemies. It's fine otherwise, but those are the moments where it's do or die lol


----------



## ToxicAdam

Quote:


> Originally Posted by *fizzle*
> 
> Interesting. With 3GB mine definitely stutters during combat with multiple enemies. It's fine otherwise, but those are the moments where it's do or die lol


Would turning off AA and Motion blur help or is it all about the textures?

I don't use those two options in my games.


----------



## NoDoz

1080P ULTRA. 980SLI with FEAR 3 profile


----------



## Chargeit

I'm playing it on my 780 with ultra texture now... Runs great so far. I started it up and didn't realize I had finished my download. I was thinking, damn, these textures look pretty good on high.











Steam screenshots don't seem to work; the game saves its screenshots in the Documents folder. It doesn't seem to save them at full res, which is why some people are saying the game looks bad... Really, some of these textures are freaking sweet.


----------



## saeedkunna

Here are my tests.
Titan Black @ 4K, maxed, ultra textures, vsync off:


Titan Black @ 4K, maxed, ultra textures, vsync on:


----------



## Ferreal

I'm getting consistently 5.8gb to 6gb vram usage.

Anyone with Titan Z on here? lol

I'm curious to see how it would do in this game.


----------



## cix92

Quote:


> Originally Posted by *NoDoz*
> 
> 1080P ULTRA. 980SLI with FEAR 3 profile


That setup is a beast, so much power, yet they screwed it up with terrible memory management.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Ferreal*
> 
> I'm getting consistently 5.8gb to 6gb vram usage.
> 
> Anyone with Titan Z on here? lol
> 
> I'm curious to see how it would do in this game.


The Titan Z only has 6GB of VRAM per GPU, and SLI won't do so well here, so basically it will perform like one Titan Black.


----------



## MapRef41N93W

Single Gaming 970 at 1550MHz I get 67 on the benchmark for high and 66.5 for ultra 1440p. Game was really smooth on ultra from what I played (only about a half hour). 6GB definitely not needed.


----------



## Silent Scone

Not read other people's findings, but ultra textures with the HD pack is a no-go on the 980 GTX

Benchmark results with *TRI SLi at 1440P* using FEAR3 SLi bits

There isn't a driver out for this yet, which is a pretty poor show for an NV-endorsed title.

1555/2065

[email protected] 4.35


----------



## Promisedpain

Quote:


> Originally Posted by *Silent Scone*
> 
> Not read other peoples findings but Ultra textures with HD pack is a no go on the 980 GTX
> 
> Benchmark results with *TRI SLi at 1440P* using FEAR3 SLi bits
> 
> There isn't a driver out for this yet, which for an NV endorsed title is pretty poor show.
> 
> 1555/2065
> 
> [email protected] 4.35


Funny how the card is a few weeks old and cannot max out games... terrible optimization, or a poor move by Nvidia putting only 4GB on a brand new card? I wonder if it will be the same with The Evil Within; it recommends 4GB VRAM. RIP my 780 Ti.


----------



## Kriant

Quote:


> Originally Posted by *Silent Scone*
> 
> Not read other peoples findings but Ultra textures with HD pack is a no go on the 980 GTX
> 
> Benchmark results with *TRI SLi at 1440P* using FEAR3 SLi bits
> 
> There isn't a driver out for this yet, which for an NV endorsed title is pretty poor show.
> 
> 1555/2065
> 
> [email protected] 4.35


0_o How is that a "no go" with the AVG fps being in the 70s? If the benchmark is anything like the Metro benchmarks, those tend to show min fps of 4 to 10 fps, yet that never actually happens in game or visually in the benchmark; it's just something that gets registered when the benchmark starts at a black screen. Is it actually dropping to single digits on multiple occasions during the benchmark? Or is it just a minimum number that magically appears at the end of the benchmark without you noticing it?


----------



## Qu1ckset

Quote:


> Originally Posted by *Promisedpain*
> 
> Funny how the card is a few weeks old and cannot max out games... terrible optimization, or a poor move by Nvidia putting only 4GB on a brand new card? I wonder if it will be the same with The Evil Within; it recommends 4GB VRAM. RIP my 780 Ti.


What are you talking about? My single stock 780 Ti 3GB maxes this game with ultra textures @2560x1080, never dipping below 60fps in what I've played so far..

The 980 is faster and has 1GB more VRAM; playing the game at 1080p with ultra textures, it should have zero difficulty!

The Evil Within from what I've seen looks super MEH. Why is it even requiring 4GB of VRAM? Nothing I've seen makes the textures look like anything more than nothing-special average graphics..


----------



## Alatar

Played around 3 hours. vram usage according to gpu-z usually hovers around 5.9GB or so, maximum here:


----------



## daviejams

Bought it last night. The in-game benchmark was 70-odd frames, but actual gameplay is a bit different; it's probably 50-70 in game, and all cutscenes are 60fps, which is a nice change from the 30fps cutscenes we usually get.

Played it for about an hour; really good game so far. Going to have a go at this over the weekend.


----------



## Promisedpain

Quote:


> Originally Posted by *Qu1ckset*
> 
> What are you talking about ? My Single Stock 780ti 3Gb maxes this game with ultra textures @2560x1080 never dipping below 60fps on what I've played so far..
> 
> 980 is faster and has 1Gb more Vram, if trying to play the game at 1080p with Ultra textures it should have zero difficulties!
> 
> Evil Within from what I've seen looks super MEH , why is it even requiring 4Gb Vram nothing I've seen makes the textures look anything more then nothing special avg graphics..


A few hours and I can test this out myself, still downloading the game T_T

Some people are reporting stuttering on 780 ti and ultra, strange. I'm pretty sure you have the ultra textures installed since you're on oc.net, so no point in asking that, right?


----------



## DrBrogbo

Quote:


> Originally Posted by *fizzle*
> 
> Interesting. With 3GB mine definitely stutters during combat with multiple enemies. It's fine otherwise, but those are the moments where it's do or die lol


I could see that, seeing as how running on high used about 2500MB
Quote:


> Originally Posted by *Promisedpain*
> 
> Funny how the card is a few weeks old and cannot max out games... terrible optimization, or a poor move by Nvidia putting only 4GB on a brand new card? I wonder if it will be the same with The Evil Within; it recommends 4GB VRAM. RIP my 780 Ti.


It CAN max out the game. My 980 at 1920x1200 with everything maxed (including ultra textures) never ever EVER dips below 60fps. Benchmark had it at 88.

That's at stock clocks, too.


----------



## Qu1ckset

Quote:


> Originally Posted by *Promisedpain*
> 
> A few hours and I can test this out myself, still downloading the game T_T
> 
> Some people are reporting stuttering on 780 ti and ultra, strange. I'm pretty sure you have the ultra textures installed since you're on oc.net, so no point in asking that, right?


Ya man, didn't even start my game till the ultra textures were finished downloading, and it all runs fine on my system!


----------



## daviejams

I did not have to download any ultra textures by the way , and it's set to ultra in the options ?

33.9gb steam


----------



## Silent Scone

Quote:


> Originally Posted by *DrBrogbo*
> 
> I could see that, seeing as how running on high used about 2500MB
> It CAN max out the game. My 980 at 1920x1200 with everything maxed (including ultra textures) never ever EVER dips below 60fps. Benchmark had it at 88.
> 
> That's at stock clocks, too.


However, you're only playing at 1200p. Make sure you've downloaded the HD pack.

VRAM usage for me instantly goes over 4GB and it stutters more than occasionally. It's the minimums you need to look at: the 7fps minimum is due to the VRAM.

Also be sure to quit the game and relaunch after applying ultra textures, as otherwise they don't load into memory.

Enjoying this so far though


----------



## fashric

Tested it out at 1440p, everything set to ultra, using the FEAR 3 SLI bits, and the benchmark ran fine

but the game was a different story. Only played for about 30 mins but it was an absolute stutterfest and not playable with ultra textures. The textures look really good, but I'm still sure the 6GB VRAM requirement is just poor PC optimisation. Turned the textures down to high and it's playable at least.


----------



## Silent Scone

Thank you! I'm not going mad









Seeing Alatar's results I'm really not that fussed; this game just seems to use a huge amount of VRAM, and it looks just as great on high textures!

GM200 had better come with at least 8GB


----------



## QxY

Quote:


> Originally Posted by *fashric*
> 
> but the game was a different story. Only played for about 30 mins but it was an absolute stutterfest and not playable with ultra textures. The textures look really good but I'm still sure that the 6gb vram requirement is just poor PC optimisation. Turned the textures down to high and its playable at least.


Could be an SLI issue? Some users reported it's fine with a single 780 Ti with Ultra Textures, some stuttering but still playable. I'd try a single card just to be sure.


----------



## Silent Scone

Well it uses more VRAM with TRI SLi, but I think you'll still struggle in single card configuration too, just maybe not as much.


----------



## Hilpi234

Well, I tried with a single 980, fought for around 20 minutes, and didn't see any struggling... I'll give it another try when my second card shows up

@1440p - Gsync


----------



## Silent Scone

Quote:


> Originally Posted by *Hilpi234*
> 
> Well, I tried with a single 980, fought for around 20 minutes, and didn't see any struggling... I'll give it another try when my second card shows up
> 
> @1440p - Gsync


Make sure people are installing the HD Pack before commenting


----------



## Promisedpain

A few screenshots that I took on high textures (everything else maxed); the game looks really good. Now I'm curious how much better ultra looks, since it requires 3GB more VRAM. I'm downloading the textures, but I'm expecting a slideshow, even though some people have reported ultra working on a 780 Ti.


----------



## Sisaroth

Quote:


> Originally Posted by *daviejams*
> 
> I did not have to download any ultra textures by the way , and it's set to ultra in the options ?
> 
> 33.9gb steam


Doesn't it say in the options menu "you can select ultra but it will run as high if you haven't installed the ultra textures"?


----------



## The Source

Quote:


> Originally Posted by *QxY*
> 
> Could be an SLI issue? Some users reported it's fine with a single 780 Ti with Ultra Textures, some stuttering but still playable. I'd try a single card just to be sure.


It plays a bit smoother and is more stable with a single card. Still, 30fps isn't really playable; during combat it just lags too much. I'm finding that if I try ultra, even at 1440p with SLI, it will crash, and it stutters constantly, so there's no point.

For those with 3GB cards claiming it runs just fine with the ultra textures even at 1080p: what the heck are you doing about the horrid aliasing? Nothing? So what you all mean to say is, if you can tolerate the eye bleed (I wish I was exaggerating; it's difficult to see the improvement ultra brings through the noise), the 6GB recommendation isn't necessary.


----------



## Promisedpain

Quote:


> Originally Posted by *Sisaroth*
> 
> Doesn't it say in the options menu "you can select ultra but it will run as high if you havent installed ultra textures" ?


Yeah. You'll need to download the texture pack in order to play ultra. You can turn it on sure, but it will be high, not actual ultra.


----------



## fizzle

Quote:


> Originally Posted by *Qu1ckset*
> 
> What are you talking about ? My Single Stock 780ti 3Gb maxes this game with ultra textures @2560x1080 never dipping below 60fps on what I've played so far..
> 
> 980 is faster and has 1Gb more Vram, if trying to play the game at 1080p with Ultra textures it should have zero difficulties!
> 
> Evil Within from what I've seen looks super MEH , why is it even requiring 4Gb Vram nothing I've seen makes the textures look anything more then nothing special avg graphics..


Try playing the game. For me, with a 780 Ti, it turns into a slideshow when fighting multiple enemies as it dumps and loads memory. The benchmark ran just fine though, never dipping below my capped 60 fps. But in game? Not playable. It stutters and slows down to a crawl sometimes.
Quote:


> Originally Posted by *Qu1ckset*
> 
> Ya man, didn't even start my game till the ultra textures were finished downloading, and it all runs fine on my system!


I wonder what you are doing differently. I want to know! HD textures look great...


----------



## The Source

It's another case of the benchmark not representing the game properly, and everyone just assumes as usual.

I think the game looks pretty good on ultra beginning at around 1800p, but I don't see that happening even with 4GB.

And I'm starting to wonder how this game received all the great reviews it did. It's just repetitive and dull from the get-go. The combat is good and somewhat satisfying, but that's all there is. Maybe I'm just not creative enough. I'm playing Black Flag at the moment as well, so I'm sure that's not helping, as it's a much better game.


----------



## Promisedpain

Quote:


> Originally Posted by *The Source*
> 
> It's another case of the benchmark not representing the game properly, and everyone just assumes as usual.
> 
> I think the game looks pretty good on ultra beginning at around 1800p, but I don't see that happening even with 4GB.
> 
> And I'm starting to wonder how this game received all the great reviews it did. It's just repetitive and dull from the get-go. The combat is good and somewhat satisfying, but that's all there is. Maybe I'm just not creative enough. I'm playing Black Flag at the moment as well, so I'm sure that's not helping, as it's a much better game.


It feels like playing AC, attack, parry, repeat. Still a good game imo.


----------



## JSTe

Quote:


> Originally Posted by *fizzle*
> 
> Try playing the game. For me, with a 780 Ti, it turns into a slideshow when fighting multiple enemies as it dumps and loads memory. The benchmark ran just fine though, never dipping below my capped 60 fps. But in game? Not playable. It stutters and slows down to a crawl sometimes.


Yeah, sounds like a fantastic port.

And doesn't even have the looks to justify the ******ed HW requirements.


----------



## daviejams

Quote:


> Originally Posted by *JSTe*
> 
> Yeah, sounds like a fantastic port.
> 
> And doesn't even have the looks to justify the ******ed HW requirements.


It's a really good game tbf

Looks great and plays great , can't ask for much more than that


----------



## TopicClocker

Quote:


> Originally Posted by *JSTe*
> 
> *Yeah, sounds like a fantastic port.*


It is, minus the Ultra textures.


----------



## iSlayer

Benchmark scores for the 770 are good.


----------



## Ferreal

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> The Titan Z only has 6GB of VRAM per GPU, and SLI won't do so well here, so basically it will perform like one Titan Black.


I believe the Titan Z is 12GB of VRAM


----------



## Chargeit

I benched it last night maxed with the ultra texture pack on my 780 @ 1080p (keeping the res reasonable). The min is normally 28 - 30, but I had to run the bench two times in a row because I forgot to screenshot the first go.



The game can look pretty damned good at those settings. It runs fine, though like with Dead Rising 3, I'm using a controller. I also use Adaptive Vsync, not in game Vsync. There are fps drops, and times where the game gets load stutter, but it isn't bad and easy to combat with a nice 360 camera pan to load things in. I'm only 3 hours into the game, 1 hour of that with the ultra texture pack, so, maybe deeper in it becomes worse, but for now, it plays well. The game also looks good with high textures, so, I'd just drop the ultra if need be.

My biggest issue is the fact you can't just set your own res. It means I have to disable surround to play correctly. Anyone who has used triple monitors knows that isn't fun (DisplayFusion makes it a lot more bearable). I guess my second issue would be that the gameplay is fairly repetitive, though I'm enjoying it more now that I'm getting used to playing.


----------



## Promisedpain

Quote:


> Originally Posted by *Ferreal*
> 
> I believe titan z is 12gb vram


It's 6GB x 2 = 12GB on the box, but just like any other SLI setup each GPU mirrors the data, so you can only utilize 6GB of VRAM.
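A minimal illustration of why the advertised total doesn't help here, assuming alternate-frame rendering, where each GPU keeps its own full copy of every resource:

```python
# In AFR SLI, each GPU mirrors the full resource set, so the working set
# must fit in ONE GPU's memory pool; the advertised total just multiplies.

def sli_vram(per_gpu_gb, gpu_count):
    advertised = per_gpu_gb * gpu_count
    usable = per_gpu_gb  # textures are duplicated across GPUs, not split
    return advertised, usable

print(sli_vram(6, 2))  # Titan Z: (12, 6) -> 12GB on the box, 6GB usable
```

So for this game's 6GB ultra recommendation, a Titan Z sits right at the limit rather than comfortably above it.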


----------



## fizzle

Quote:


> Originally Posted by *TopicClocker*
> 
> It is, minus the Ultra textures.


That really is it. Even without the high res textures it still looks quite wonderful and runs very well.

It's just a VRAM problem.

I *hope* this is not where the future is headed, otherwise my 780 Ti will be irrelevant very fast.


----------



## Promisedpain

Quote:


> Originally Posted by *fizzle*
> 
> That really is it. Even without the high res textures it still looks quite wonderful and runs very well.
> 
> It's just a VRAM problem.
> 
> I *hope* this is not where the future is headed, otherwise my 780 Ti will be irrelevant very fast.


We probably have to upgrade very soon... The Evil Within is also requiring 4GB VRAM.


----------



## TopicClocker

Quote:


> Originally Posted by *fizzle*
> 
> That really is it. Even without the high res textures it still looks quite wonderful and runs very well.
> 
> It's just a VRAM problem.
> 
> I *hope* this is not where the future is headed, otherwise my 780 Ti will be irrelevant very fast.


For some reason I'm worried about getting a 4GB GTX 970.
Theoretically 4GB should be fine against the next gen consoles at 1080p, 6GB is over that.








The PS4 currently has 4.5 - 5.5GB available for games.

sigh.


----------



## fizzle

Quote:


> Originally Posted by *Promisedpain*
> 
> We probably have to upgrade very soon... The Evil Within is also requiring 4GB VRAM.


Damnit I JUST got my 780 Ti. Did not expect the whole VRAM thing to be a major issue so bloody soon.


----------



## Promisedpain

Quote:


> Originally Posted by *TopicClocker*
> 
> For some reason I'm worried about getting a 4GB GTX 970.
> Theoretically 4GB should be fine against the next gen consoles at 1080p, 6GB is over that.
> 
> 
> 
> 
> 
> 
> 
> 
> The PS4 currently has 5.5GB available for games.
> 
> sigh.


I heard 8GB 970/980s are on the way. That should be enough...


----------



## TopicClocker

Quote:


> Originally Posted by *Promisedpain*
> 
> I heard 8GB 970/980s are on the way. That should be enough...


I know but there still hasn't been any dates.
I don't feel like waiting 2 months for 8GB models.

LBP3 and GTA 5 on the PS4 are calling me.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Promisedpain*
> 
> Funny how the card is a few weeks old and cannot max out games... terrible optimization, or a poor move by Nvidia putting only 4GB on a brand new card? I wonder if it will be the same with The Evil Within; it recommends 4GB VRAM. RIP my 780 Ti.


Are you mad? Just because the card can't max out a game doesn't mean it's rendered useless.

Can't run ultra? The game should be enjoyable on high settings.


----------



## fizzle

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Are you mad? Just because the card can't max out a game doesn't mean it's rendered useless.
> 
> Can't run ultra? The game should be enjoyable on high settings.


It's not that... It's just stupid that the rendering performance is there but the vRAM is not.


----------



## Promisedpain

Quote:


> Originally Posted by *fizzle*
> 
> It's not that... It's just stupid that the rendering performance is there but the vRAM is not.


Yeah, 780 ti would destroy this game if it wasn't for the VRAM - It's still one of the fastest cards out there.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *fizzle*
> 
> It's not that... It's just stupid that the rendering performance is there but the vRAM is not.


You'll be fine. Like I said high settings will still look good. I'd start to worry when it struggles with medium settings.

Just enjoy the games people.


----------



## Leopard2lx

Those who are having issues with frame rate fluctuation should turn on D3DOverrider as it makes the game a lot smoother by forcing triple buffering while V-Sync is on.
Also, FXAA can be turned on from NVCP for a bit extra AA.


----------



## FissioN2222

Quote:


> Originally Posted by *Leopard2lx*
> 
> Also, FXAA can be turned on from NVCP for a bit extra AA.


ew..


----------



## Yungbenny911

Why do people still use a console's V-RAM amount to predict increase in V-RAM usage in PC gaming? Even though they have such high amount of V-RAM and low level API, the PS4 and Xcrap's GPU *CANNOT* render anything close to what a 780ti can.

So...

1) Lower resolution and texture quality on PS4 games = Lower V-RAM usage (if you didn't know, some demanding games run at 1600x900p on the PS4)

2) One can argue that V-RAM can hold textures without affecting performance too much, but can they verify how much of those textures would be visible in the game? From what i understand, a higher field of view (FOV) = more for the GPU to render, and we all know the PS4 does not have a powerful GPU in it, so why do people believe consoles use so much V-RAM?

It's like saying the GTX 880m actually utilizes its 8GB of V-RAM, or the 555m uses its 3GB of V-RAM. Oh lord...


----------



## Dasboogieman

Quote:


> Originally Posted by *fizzle*
> 
> Damnit I JUST got my 780 Ti. Did not expect the whole VRAM thing to be a major issue so bloody soon.


Beautiful planned obsolescence from NVIDIA right there. The GK110 design is pretty much perfect: no flaws, robust all-round performance. However, they had to do something to keep you upgrading (perhaps even to their exorbitant Titan Black), hence the 3GB and a blanket ban on AIBs adding 6GB variants. That was the dealbreaker for me; I refuse to pay $800 for a GPU I can see will be plainly obsolete in 1-2 years due to a tiny, purposely placed design flaw. The 290 has some serious issues but at least I know it will withstand the test of time better.


----------



## supermi

Quote:


> Originally Posted by *Chargeit*
> 
> I benched it last night maxed with the ultra texture pack on my 780 @ 1080p (keeping the res reasonable). The min is normally 28 - 30, but I had to run the bench two times in a row because I forgot to screenshot the first go.
> 
> 
> 
> The game can look pretty damned good at those settings. It runs fine, though like with Dead Rising 3, I'm using a controller. I also use Adaptive Vsync, not in game Vsync. There are fps drops, and times where the game gets load stutter, but it isn't bad and easy to combat with a nice 360 camera pan to load things in. I'm only 3 hours into the game, 1 hour of that with the ultra texture pack, so, maybe deeper in it becomes worse, but for now, it plays well. The game also looks good with high textures, so, I'd just drop the ultra if need be.
> 
> My biggest issues is the fact you can't just set your own res. It means I have to disable surround to play correctly. Anyone who has used triple monitors knows that isn't fun (Display fusion makes it a lot more bearable). I guess my second issue would be the gameplay is fairly repetitive, though I'm enjoying it more now that I'm getting used to playing.


For now you can flip into portrait surround it gets the aspect ratio right!

Different than landscape surround? Yes

Better than single monitor? Yup


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Dasboogieman*
> 
> Beautiful planned *obsolescence* from NVIDIA right there. The GK110 design is pretty much perfect: no flaws, robust all-round performance. However, they had to do something to keep you upgrading (perhaps even to their exorbitant Titan Black), hence the 3GB and a blanket ban on AIBs adding 6GB variants. That was the dealbreaker for me; I refuse to pay $800 for a GPU I can see will be plainly obsolete in 1-2 years due to a tiny, purposely placed design flaw. The 290 has some serious issues but at least I know it will withstand the test of time better.


You guys are getting ridiculous. 780ti will not be obsolete anytime soon.


----------



## iSlayer

Quote:


> Originally Posted by *fizzle*
> 
> Damnit I JUST got my 780 Ti. Did not expect the whole VRAM thing to be a major issue so bloody soon.


Did we read the same thread? Ultra doesn't seem to be running crippled due to VRAM, at all.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *iSlayer*
> 
> Did we read the same thread? Ultra doesn't seem to be running crippled due to VRAM, at all.


Stuttering and frame dips have been confirmed when it caps the vram. Depending on how picky someone is, it may or may not be a problem.

Apparently no one wants to run on high, since it renders the game unplayable and the 780ti is obsolete according to some people.


----------



## Azefore

Interesting thing I noticed, the dark ranger skin when active on main menu yields ~99fps, the regular Talion skin active on main menu yields ~22fps.


----------



## The Source

Quote:


> Originally Posted by *iSlayer*
> 
> Did we read the same thread? Ultra doesn't seem to be running crippled due to VRAM, at all.


Well you must be fixating on the benchmark screenshots and not actually reading anything.

I give up. People these days only read what they want to see. It's astounding.

For an Nvidia-sponsored title, there's still no driver update. lol


----------



## iSlayer

I said running crippled, not running less than god's gift to man.


----------



## TopicClocker

Everyone needs to see this.

http://www.eurogamer.net/articles/digitalfoundry-2014-eyes-on-with-pc-shadow-of-mordors-6gb-textures
http://www.overclock.net/t/1516372/eurogamer-eyes-on-with-pc-shadow-of-mordors-6gb-ultra-hd-textures


----------



## Majorhi

Figured I'd add my $.02. I ran the benchmark with the HD texture pack addon installed. All settings were set to Ultra except one, which would only go to High. And the results... looks like I'll be turning the settings down if I want a decent frame rate. This was run on an HD 7950 @ 1000/1500

AVG: 56.01
MAX: 234.93
MIN: 9.92
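Numbers like these show why the average alone is misleading: a handful of long frames barely move the average but tank the minimum. A minimal Python sketch of how a benchmark's AVG/MIN/MAX fall out of frame times (the frame-time list below is invented for illustration, not Majorhi's actual data):

```python
# Invented frame times (ms) for illustration: 95 smooth ~60fps frames
# plus 5 hitches of 100 ms apiece.
frame_times_ms = [16.7] * 95 + [100.0] * 5

def fps_stats(times_ms):
    """Return (avg, min, max) FPS the way in-game benchmarks report them:
    average over total elapsed time, min/max over individual frames."""
    fps = [1000.0 / t for t in times_ms]
    avg = len(times_ms) * 1000.0 / sum(times_ms)
    return avg, min(fps), max(fps)

avg, lo, hi = fps_stats(frame_times_ms)
print(f"AVG: {avg:.2f}  MIN: {lo:.2f}  MAX: {hi:.2f}")
# Five bad frames out of a hundred: the average still looks healthy
# while the minimum sits at 10 FPS.
```

Only five hitched frames per hundred, and the average still reads close to 48 FPS — which is why the MIN figure, not the AVG, is the number to watch for VRAM overflow.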


----------



## TopicClocker

Quote:


> Originally Posted by *Majorhi*
> 
> Figured I'd add my $.02. I ran the benchmark with the HD texture pack addon installed. All settings were set to Ultra except one, which would only go to High. And the results... looks like I'll be turning the settings down if I want a decent frame rate. This was run on an HD 7950 @ 1000/1500


That ran really well!


----------



## Silent Scone

Quote:


> Originally Posted by *The Source*
> 
> Well you must be fixating on the benchmark screenshots and not actually reading anything.
> 
> I give up. People these days only read what they want to see. It's astounding.


Let them get on with it. Memory is memory and I've tried the HD pack at 1440p with Ultra and it exceeds the VRAM limit, which results in hitching. It's incredibly frustrating watching people say "WORKS OK HERE" with absolutely no information, and it's completely misleading.

You need more than 4GB to successfully run the HD Texture Pack with Ultra at 1440p and have a hitch-free experience.

Anyone disputing that either has low standards or is in denial lol. That said, game looks great without!


----------



## fizzle

Quote:


> Originally Posted by *TopicClocker*
> 
> Everyone needs to see this.
> 
> http://www.eurogamer.net/articles/digitalfoundry-2014-eyes-on-with-pc-shadow-of-mordors-6gb-textures
> http://www.overclock.net/t/1516372/eurogamer-eyes-on-with-pc-shadow-of-mordors-6gb-ultra-hd-textures


Interesting, thanks. I didn't really notice it on the characters however. The Uruk armors look quite different with the HD textures. The majority of the difference was in the world textures, the rocks, grass, mud etc.


----------



## Silent Scone

Quote:


> Originally Posted by *TopicClocker*
> 
> Everyone needs to see this.
> 
> http://www.eurogamer.net/articles/digitalfoundry-2014-eyes-on-with-pc-shadow-of-mordors-6gb-textures
> http://www.overclock.net/t/1516372/eurogamer-eyes-on-with-pc-shadow-of-mordors-6gb-ultra-hd-textures


Thanks


----------



## fizzle

Quote:


> Originally Posted by *Silent Scone*
> 
> Let them get on with it. Memory is memory and I've tried the HD pack at 1440p with Ultra and it exceeds the VRAM limit, which results in hitching. It's incredibly frustrating watching people say "WORKS OK HERE" with absolutely no information, and it's completely misleading.
> 
> You need more than 4GB to successfully run the HD Texture Pack with Ultra at 1440p and have a hitch-free experience.
> 
> Anyone disputing that either has low standards or is in denial lol. That said, game looks great without!


Exactly this. Benchmark means literally nothing. I noticed no FPS differences on my setup when comparing ultra to high textures. Playing the game was a different story. Put it on ultra, stand on a tower and simply rotate the camera and you will begin to see what happens as the VRAM fills up.
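For anyone who would rather watch the fill-up happen than guess, polling `nvidia-smi` once a second while panning the camera makes it obvious. A rough Python sketch — it assumes an NVIDIA card with the driver's `nvidia-smi` tool on the PATH, and `watch()` is just a name made up here:

```python
import subprocess
import time

def parse_mib(smi_output, gpu_index=0):
    """Extract the MiB figure for one GPU from nvidia-smi's
    csv,noheader,nounits output (one number per line, one line per GPU)."""
    return int(smi_output.splitlines()[gpu_index].strip())

def vram_used_mib(gpu_index=0):
    """Query current VRAM usage in MiB via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_mib(out, gpu_index)

def watch(seconds=30, interval=1.0):
    """Print VRAM usage once a second; run this while panning the camera
    in-game and look for a steady climb toward the card's limit."""
    for _ in range(int(seconds / interval)):
        print(f"VRAM used: {vram_used_mib()} MiB")
        time.sleep(interval)
```

Call `watch(30)` while doing the tower pan described above: a climb to the card's ceiling followed by stutter is the overflow in action.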


----------



## Chargeit

Quote:


> Originally Posted by *supermi*
> 
> For now you can flip into portrait surround it gets the aspect ratio right!
> 
> Different than landscape surround? Yes
> 
> Better than single monitor? Yup


My monitor stand comes in tomorrow, will be able to test out portrait when I get it. =D

http://www.newegg.com/Product/Product.aspx?Item=N82E16824980006

The one review on Newegg is bad, but everywhere else those stands get good reviews. Also, it allows for each side monitor to be adjusted up or down, which I need since the mount on my left monitor is kind of in a wonky spot.

I'm on a single 780 right now (I did sli, but couldn't deal with the heat, waiting for 6/8gb Maxwell). Don't think I'll even attempt this game on 3 screen, least I'm not willing to lower the settings to the point that it would be playable. I was planning on playing at 1080p. My issue was, the game doesn't give you the option to set that up correctly. I had to turn off surround to get it to play on a single monitor.


----------



## speedy2721

Quote:


> Originally Posted by *fizzle*
> 
> Exactly this. Benchmark means literally nothing. I noticed no FPS differences on my setup when comparing ultra to high textures. Playing the game was a different story. Put it on ultra, stand on a tower and simply rotate the camera and you will begin to see what happens as the VRAM fills up.


I have the same exact thing happen to me on my 780ti when set to Ultra.

Not sure if this is normal, but when on Ultra I noticed that the game keeps caching into my 8GB of RAM and eventually it will crash due to running out of memory; sometimes I get an error saying invalid memory address. The memory seems to fill up more and more as I rotate the screen really fast at different locations.


----------



## Baasha

Quote:


> Originally Posted by *TopicClocker*
> 
> Everyone needs to see this.
> 
> http://www.eurogamer.net/articles/digitalfoundry-2014-eyes-on-with-pc-shadow-of-mordors-6gb-textures
> http://www.overclock.net/t/1516372/eurogamer-eyes-on-with-pc-shadow-of-mordors-6gb-ultra-hd-textures


Wow.. this is quite incredible! I wonder then, at 4K Surround, Ultra textures are definitely not possible. What about 4K? Is 6GB enough?

Also, before I pull the trigger on this game, does it work well with SLI? I've read a few threads on various forums about how SLI is not yet supported(?). Really? Can anyone who has the game and two or more cards confirm whether SLI works w/ this game or not?


----------



## Promisedpain

Quote:


> Originally Posted by *Baasha*
> 
> Wow.. this is quite incredible! I wonder then, at 4K Surround, Ultra textures are definitely not possible. What about 4K? Is 6GB enough?
> 
> Also, before I pull the trigger on this game, does it work well with SLI? I've read a few threads on various forums about how SLI is not yet supported(?). Really? Can anyone who has the game and two or more cards confirm whether SLI works w/ this game or not?


3GB of VRAM for slightly better ground textures? Other than that I don't see much difference... you gotta be kidding me.

http://cdn.overclock.net/a/ac/ac58f054_215212126jpg.jpeg

http://cdn.overclock.net/8/8d/8da46e35_120874466jpg.jpeg


----------



## TopicClocker

Quote:


> Originally Posted by *Baasha*
> 
> Wow.. this is quite incredible! I wonder then, at 4K Surround, Ultra textures are definitely not possible. What about 4K? Is 6GB enough?
> 
> Also, before I pull the trigger on this game, does it work well with SLI? I've read a few threads on various forums about how SLI is not yet supported(?). Really? Can anyone who has the game and two or more cards confirm whether SLI works w/ this game or not?


From what I've heard, SLI isn't properly supported yet; however, some have had luck using the F.E.A.R 3 profile for SLI.


----------



## The Source

Quote:


> Originally Posted by *Baasha*
> 
> Wow.. this is quite incredible! I wonder then, at 4K Surround, Ultra textures are definitely not possible. What about 4K? Is 6GB enough?
> 
> Also, before I pull the trigger on this game, does it work well with SLI? I've read a few threads on various forums about how SLI is not yet supported(?). Really? Can anyone who has the game and two or more cards confirm whether SLI works w/ this game or not?


It's not officially supported, but most of us SLI users, including myself, have found some success using the FEAR 3 SLI profile. Still waiting to hear from Nvidia though.

I'd even be content with lowering the resolution to 1080p to use ultra if there were some kind of AA option.


----------



## supermi

Quote:


> Originally Posted by *Chargeit*
> 
> My monitor stand comes in tomorrow, will be able to test out portrait when I get it. =D
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16824980006
> 
> The one review on Newegg is bad, but everywhere else those stands get good reviews. Also, it allows for each side monitor to be adjusted up or down, which I need since the mount on my left monitor is kind of in a wonky spot.
> 
> I'm on a single 780 right now (I did sli, but couldn't deal with the heat, waiting for 6/8gb Maxwell). Don't think I'll even attempt this game on 3 screen, least I'm not willing to lower the settings to the point that it would be playable. I was planning on playing at 1080p. My issue was, the game doesn't give you the option to set that up correctly. I had to turn off surround to get it to play on a single monitor.


The stand will make it so much easier!
When I saw the game did not work in landscape surround, I had it rotated and the surround settings adjusted in 5 min, and now I have a 46-inch 3/4-4K monitor of awesome LOL

YOU WILL TOO!
You may need to lower some settings with a 3GB single card at that res. It was using 5.5-6GB of VRAM for me with ultra textures. Tried supersampling; it did not crash the game, but choppy would be a good description LOL


----------



## Silent Scone

Quote:


> Originally Posted by *fizzle*
> 
> Exactly this. Benchmark means literally nothing. I noticed no FPS differences on my setup when comparing ultra to high textures. Playing the game was a different story. Put it on ultra, stand on a tower and simply rotate the camera and you will begin to see what happens as the VRAM fills up.


The bench is completely inconsistent too. I actually had a 30 FPS higher average with Ultra than with High enabled, but the minimum dropped from 30 to 6 FPS. There is your tell, anyway, before you even experience the problems in game.


----------



## Ferreal

I'm running this game @ 1440p on a RoG Swift with SLI Titan Blacks. I can tell the cards are struggling to keep up: VRAM at 5.8GB to 6GB consistently, fps drops when there's a lot going on. Dropping below 60fps quite often, and stutters here and there, but the trade-off is worth it. HD textures look amazing.

One complaint: a game supported by Nvidia but no SLI update? I had to force Alternate Frame Rendering 2 to get SLI working.


----------



## Silent Scone

Quote:


> Originally Posted by *Ferreal*
> 
> I'm running this game @ 1440p on a RoG Swift with SLI Titan Blacks. I can tell the cards are struggling to keep up: VRAM at 5.8GB to 6GB consistently, fps drops when there's a lot going on. Dropping below 60fps quite often, and stutters here and there, but the trade-off is worth it. HD textures look amazing.
> 
> One complaint, a game supported by Nvidia but no SLI update? I had to force rendering 2 to get SLI working.


Open NV INSPECTOR and set SLI bits to F.E.A.R 3. Should help the frame drops


----------



## Ferreal

Quote:


> Originally Posted by *Silent Scone*
> 
> Open NV INSPECTOR and set SLI bits to F.E.A.R 3. Should help the frame drops


Thanks! I'll give that a try when I get home.


----------



## criznit

Quote:


> Originally Posted by *speedy2721*
> 
> I have the same exact thing happen to me on my 780ti when set to Ultra.
> 
> Not sure if this is normal, but when on Ultra I noticed that the game keeps caching into my 8GB of RAM and eventually it will crash due to running out of memory; sometimes I get an error saying invalid memory address. The memory seems to fill up more and more as I rotate the screen really fast at different locations.


I had a question about this exact scenario. I read somewhere that systems with 3GB of VRAM should run a minimum of 16GB of system memory to prevent hitching and stuttering in games like Watch Dogs (high VRAM requirements). I will try to find the link when I get home, and will probably buy another 8GB of memory and test it out when I get some time.


----------



## Aparition

From what I have seen from other people's posted images and the new article, Ultra adds a tiny improvement to some environment textures, such as the ground (a tiny difference versus High). Ultra adds more tessellation to objects such as fur on cloaks.

That added tessellation is probably what is driving up the memory requirement so much.


----------



## kx11

tested the benchmark with everything maxed + HD content @ 1440p


----------



## RagingCain

Quote:


> Originally Posted by *Promisedpain*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Baasha*
> 
> Wow.. this is quite incredible! I wonder then, at 4K Surround, Ultra textures are definitely not possible. What about 4K? Is 6GB enough?
> 
> Also, before I pull the trigger on this game, does it work well with SLI? I've read a few threads on various forums about how SLI is not yet supported(?). Really? Can anyone who has the game and two or more cards confirm whether SLI works w/ this game or not?
> 
> 
> 
> 3GB of VRAM for slightly better ground textures? Other than that I don't see much difference... you gotta be kidding me.
> 
> http://cdn.overclock.net/a/ac/ac58f054_215212126jpg.jpeg
> 
> http://cdn.overclock.net/8/8d/8da46e35_120874466jpg.jpeg
Click to expand...

This is the best example I have been trying to illustrate to people.
6GB of VRAM used to make image 2. Insane.


----------



## Aparition

Quote:


> Originally Posted by *RagingCain*
> 
> This is the best example I have been trying to illustrate to people.
> 6GB of VRAM used to make image 2. Insane.


Looks more and more like "Ultra" just adds extra storage cost for a very tiny image improvement.
The difference between the above two images is almost nil. Ultra may have slightly better lighting, but it's hard to say from the images.

I didn't see any texture differences on the bag or the wall.
The lighting and the tessellation are the only noticeable real differences.

It may also be that the max view/draw distance is higher on Ultra.


----------



## John Shepard

Game runs like arse on my 680, even on medium textures. There's a spot right now where I am getting 11fps.... What a joke.
I should have gone with the PS4 version.


----------



## pterois

The Ultra textures make a huge difference at 4K at least. There is a lot more clarity and sharpness as well as more detail. Rendering the game at 4K with high textures make it seem a bit fuzzy compared to Ultra.
It does take a hit though. I am using a single GTX 970 (overclocked at 1450 MHz) as I am still waiting for the second one, and it hovers around 30 FPS. It fares a lot better with the High textures.


----------



## Qu1ckset

Quote:


> Originally Posted by *fizzle*
> 
> Try playing the game. For me with a 780 Ti it turns into a slideshow when fighting multiple enemies as it dumps and loads memory. The benchmark ran just fine though, never dipped below my capped 60 fps. But in game? Not playable. It stutters and and slows down to a crawl sometimes.
> I wonder what you are doing differently. I want to know! HD textures look great...


I already stated in another post that I've only played about 30 mins of the game, so I'll probably experience what you are going through when I have more time to play this game! (Weekend).

For all the users saying the 780ti is useless now and 3/4GB of VRAM isn't enough because it can't run this ridiculous HD texture pack demanding 6GB of VRAM with nothing amazing to show for it: same goes for The Evil Within and its 4GB VRAM requirement; that game's graphics look nothing special!

Obviously these developers aren't good at optimizing their games, because Crysis 3 or even modded Skyrim look way better than The Evil Within and Shadow of Mordor and don't need anywhere near 6GB of VRAM!

Consoles have a shared pool of memory, so that usable 5.5GB of RAM in the PS4 is worse than a PC with 8-16GB of system memory + 2-4GB of VRAM. The PS4's other partitioned memory isn't for gaming whatsoever; it's reserved to run the OS smoothly and to run background services like party chat, and to keep things smooth when you press the guide button to go check out the PSN Store or whatever else you want to do while still in game!


----------



## JoHnYBLaZe

Nvidia doesn't even have to improve their GPUs anymore....

Just have devs saturate the crap out of the GPU buffer with god knows what.

Targeting everyone with 780's or better, cuz, you know....that's where the money is.

Is anyone the least bit curious as to why this game's textures need 6GB, or are they too busy saving for an 8GB card with the SAME performance as the card they already have?


----------



## KenjiS

Quote:


> Originally Posted by *John Shepard*
> 
> Game runs like arse on my 680, even on medium textures. There's a spot right now where I am getting 11fps.... What a joke.
> I should have gone with the PS4 version.


I don't get that.. my 770 was handling it fine at Very High, and a 680 should not be THAT far behind since a 770 is just a slightly tweaked 680...

By "fine" I mean about 52fps in the benchmark at 1440p.... You should not be having an issue running it, but I have seen a few folks with 680s saying the same thing, so driver issue maybe?


----------



## MonarchX

Could someone definitively let me know whether my rig can handle the *ULTRA Texture Pack* @ 1080p @ Ultra settings, but WITHOUT any downscaling? By now we know that people with 2GB of VRAM can run the game smoothly with HIGH textures, but can high-end 3GB VRAM rigs run this game smoothly @ 1080p with ULTRA textures?


----------



## Sisaroth

I wonder if there is a difference between PCI-E 2 and 3, and x8 vs x16. If you don't have enough VRAM then PCI-E might become the bottleneck. Aside from people not actually running Ultra and saying Ultra is fine because they can't read the menus, it might explain why some people seem to be doing fine with less than 6GB.

I would also wait for a driver update before drawing conclusions like the 980 being obsolete; you never know, Nvidia might find a way to use the VRAM more efficiently.
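Back-of-envelope numbers for the PCI-E angle: once textures spill out of VRAM they have to come across the bus, and the theoretical per-direction rates put a floor on how long that takes. A quick sketch (the figures are idealized line rates after encoding overhead; real transfers are slower):

```python
# Idealized per-direction PCIe bandwidth in GB/s (lanes * per-lane rate
# after line encoding); real-world transfers fall short of these.
PCIE_GBPS = {
    ("2.0", 8): 4.0,     # 5 GT/s with 8b/10b encoding -> 500 MB/s per lane
    ("2.0", 16): 8.0,
    ("3.0", 8): 7.88,    # 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane
    ("3.0", 16): 15.75,
}

def overflow_stall_ms(overflow_gb, gen, lanes):
    """Rough time (ms) to move `overflow_gb` of texture data over the bus --
    an upper bound on the hitch when VRAM overflows by that much."""
    return overflow_gb / PCIE_GBPS[(gen, lanes)] * 1000.0

for (gen, lanes) in sorted(PCIE_GBPS):
    print(f"PCIe {gen} x{lanes}: {overflow_stall_ms(1.0, gen, lanes):.0f} ms per GB")
```

On those figures, even shuffling a single gigabyte costs on the order of a tenth of a second, which fits visible hitching rather than a smooth frame-rate drop — and PCIe 2.0 x8 takes roughly four times as long as 3.0 x16.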


----------



## NoDoz

Few screen shots I took


----------



## MrWhiteRX7

Haven't played the game on my 780ti yet, but my triple xfire 290 setup is handling it like a champ







No stutters and I'm pegging my 100fps cap @ 1440p ultra.

Funny how this is an "nvidia" title


----------



## NoDoz

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Haven't played the game on my 780ti yet, but my triple xfire 290 setup is handling it like a champ
> 
> 
> 
> 
> 
> 
> 
> No stutters and I'm pegging my 100fps cap @ 1440p ultra.
> 
> Funny how this is an "nvidia" title


I'm over 100fps at 1600p on ultra with the HD pack with 2 cards in sli. That's not too bad


----------



## MrWhiteRX7

Quote:


> Originally Posted by *NoDoz*
> 
> I'm over 100fps at 1600p on ultra with the HD pack with 2 cards in sli. That's not too bad


Yes it is.. I have mine capped at 100 and it never dips; that's all I need with a 96Hz refresh. I am just amused that it seems it's always the red team getting yelled at about launch drivers / xfire profiles, and my trifire worked out of the gate while a lot of the green team is having SLI issues. Not bashing (I love my TI!!!!!), just a funny turn of events on an Nvidia title.


----------



## Clazman55

Quote:


> Originally Posted by *John Shepard*
> 
> Game runs like arse on my 680, even on medium textures. There's a spot right now where I am getting 11fps.... What a joke.
> I should have gone with the PS4 version.


I would double-check your drivers; using a single 670 I'm easily averaging 45, with occasional dips to like 30.


----------



## battletoad

Here are my thoughts after spending around 10 hours with 780 Ti SLI:

The in-game benchmark is not representative of the performance you can expect when actually playing the game. Don't even bother with it. Your game will stutter if you try everything at Ultra, even if the bench says you are good to go.

FEAR 3 SLI via Nvidia Injector works really well overall. However, when cycling through the menus screens (map, ability tree, etc) I see noticeable "white line" artifacts. Watch Dogs did the exact same thing during gameplay for me, so if it did for you as well you know what to expect here. Also, the placement of controller buttons on the menu screens (such as A button to confirm, or B button to go back) become slightly out of place and return to proper alignments very quickly when cycling through.

I'm probably not explaining this well, but basically there is some bugginess in the menu screens here that I do not experience when disabling SLI. I have not noticed any artifacting during gameplay whatsoever, so I found this to be a minor issue overall. It will probably be corrected once an official SLI profile comes out and/or a minor patch.

I found High Texture Quality/High AO/Medium Shadows/DOF OFF/Motion Blur OFF/ everything else MAX + Vsync ON produced 60fps through the vast majority of my 10 hours of play. When things get crazy (with 30+ Uruks on screen) sometimes the framerate might drop to about 50fps at these settings. Regrettably, the same 10fps hit will also happen more often than not with only around 8-10 orcs if you use the "Flurry" combo move where your character turns blue and has lots of particle effects and the camera zooms in on you and the Uruk you are attacking. This is a go-to move, so you will have this happen far more often than you will see 30+ enemies on screen.

AO set to "Ultra" takes the same situations and drops about 10 more FPS in my experience. That's low 40s for those keeping score. I found this to be too much. This is the first game I have ever played where I noticed any kind of impact at all when attempting to max Ambient Occlusion.

I can notice barely any difference between High textures and Medium, both in performance and visual quality. Either way, they don't look that great, with the better ones appearing a bit too fuzzy or washed out, and the bad ones (some ground textures, fabrics on rooftops) looking pretty terrible.

This can be mostly eliminated by playing as much of the game at night during the rain as possible. The wet glistening look on everything really helps mask the textures, and it goes a long way to improving things. Unlike AC4, when it rains in Mordor there is no performance hit for all the puddles and ripples on the ground, which is nice. That said, the rain drops themselves are REALLY bad. They look more like marbles or some other solid object than liquid. Great puddles. Bad raindrops.

To be fair, when you play the game without looking for flaws, it IS harder to notice all the things I have complained about. It is an extremely fun game with acceptable/good visuals. It WILL make you question why it takes so much GPU power to run, and why it doesn't look/run better for those with $2000+ PCs, but overall I am very satisfied with the game.

This is NOT the "sky is falling" moment for PC gamers that you might have been worried about when the 6GB requirement for Ultra was first mentioned. This isn't Watch Dogs. It sure as hell isn't Dead Rising 3. Shadow of Mordor runs smooth as silk compared to either of those terrible ports.

That said, this game really makes me second guess if I want to buy the Acer 4K G-Sync monitor. As someone who is not upgrading GPUs until Pascal in 2016/2017, I now have significant doubts about any cards with less than 6GB running new console port games at 4K, and would really only feel comfortable with 8GB or more. It might be fine for last-gen ports like Borderlands 2 and Bioshock Infinite, but the new crop of games seem to be 1440p-only with 3-4GB cards.


----------



## JoHnYBLaZe

Hold up....Can anyone else confirm this game is using 8gb of SYSTEM RAM?!?

Does that not sound totally wrong?......I know I'm not the only one who builds PC's here

That would make this game an effective MEMORY STRESS TEST

Go ahead and feel the need to upgrade if you want to....might want to order some ram to go with that new GPU...=/

In other news.....I smell fish


----------



## Menta

It's what you get from console ports... lazy games, uncompressed data.

Sometimes I ask myself why I bother.


----------



## Chargeit

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Hold up....Can anyone else confirm this game is using 8gb of SYSTEM RAM?!?
> 
> Does that not sound totally wrong?......I know I'm not the only one who builds PC's here
> 
> That would make this game an effective MEMORY STRESS TEST
> 
> Go ahead and feel the need to upgrade if you want to....might want to order some ram to go with that new GPU...=/
> 
> In other news.....I smell fish


I was using 10GB of system RAM when playing.

*Testing it now and sitting at 8GB total RAM usage. 3.5GB of that is "other". I did see 10GB used at one point last night, least I think I did.

*Maxed out while watching it at 8.5GB.


----------



## cstkl1

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Hold up....Can anyone else confirm this game is using 8gb of SYSTEM RAM?!?
> 
> Does that not sound totally wrong?......I know I'm not the only one who builds PC's here
> 
> That would make this game an effective MEMORY STRESS TEST
> 
> Go ahead and feel the need to upgrade if you want to....might want to order some ram to go with that new GPU...=/
> 
> In other news.....I smell fish


5GB max with background apps. My post before was false; I mislabeled my pagefile as RAM and vice versa.

Hmm, will be testing with the ROG Swift this weekend on both of my rigs. Decided to go TN on both. Can't stand tearing anymore.

But for now a single Titan Black at 2560x1080 is smooth, although there is some tearing once in a while.

Memory test: HCI Memtest in Windows on your OC. That will test your RAM and IMC stability; auto clocking here won't do.

The game is odd in that it occasionally has fps spikes up close to 1k. No downspikes. Odd.


----------



## Ferreal

Quote:


> Originally Posted by *MonarchX*
> 
> Could someone definitively let me know whether my rig can handle the *ULTRA Texture Pack* @ 1080p @ Ultra settings, but WITHOUT any downscaling? By now we know that people with 2GB of VRAM can run the game smoothly with HIGH textures, but can high-end 3GB VRAM rigs run this game smoothly @ 1080p with ULTRA textures?


Well, they did recommend 6GB of VRAM for the Ultra HD textures. I am running 1440p on Ultra and it is consistently using 5.8-6GB.

People are reporting running Ultra with a 780ti. Either way, this game is pretty amazing @ high or ultra settings.

I'm not a huge fan of Batman-style combat, but they got it right here.


----------



## cstkl1

Quote:


> Originally Posted by *Ferreal*
> 
> Well, they did recommend 6GB of VRAM for the Ultra HD textures. I am running 1440p on Ultra and it is consistently using 5.8-6GB.
> 
> People are reporting running Ultra with a 780ti. Either way, this game is pretty amazing @ high or ultra settings.
> 
> I'm not a huge fan of Batman-style combat, but they got it right here.


That's what they said about Wolfenstein, but since I had both, I know it was false.
Same thing again with what people claimed about Watch Dogs. And in fact Watch Dogs ran best on a single Titan; uber smooth, but the fps wasn't anything to shout about.

About ultra vs high quality: I really need to test it out. But what's funny is that before I installed the HD pack, ultra vs high had like a 1-2GB VRAM difference. After installing, it's still the same. Not sure whether there was an increase in fidelity, as there were a lot of cutscenes during the first gameplay.


----------



## ZealotKi11er

I wish i had a 8GB 290X just for this game.


----------



## ThePath

Quote:


> Originally Posted by *Menta*
> 
> its what you get from console ports....lazy games uncompressed data
> 
> sometimes i ask my self why bother


First of all, the console version's textures are similar to the high setting; the ultra textures are PC exclusive.

Second, the game is not very demanding. Even a GPU like the GTX 750 Ti can get decent FPS at 1080p:
http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-RPG-Middle-earth_Shadow_of_Mordor-test-ShadowOfMordor_1920.jpg

It doesn't look like a bad console port.


----------



## ZealotKi11er

Quote:


> Originally Posted by *ThePath*
> 
> First of all, the console version texture is similar to high settings. The Ultra texture is PC exclusive
> 
> Second of all, the game is not very demanding. Even GPU like GTX 750 Ti can get decent fps at 1080p
> http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-RPG-Middle-earth_Shadow_of_Mordor-test-ShadowOfMordor_1920.jpg
> 
> It doesn't look like bad console port


Considering those FPS, the consoles should have no problem with this game, since they have enough VRAM.


----------



## Ferreal

Quote:


> Originally Posted by *cstkl1*
> 
> Thats what they said on wolfy. But since i had both. It was false.
> Samething again ppl claimed watchdogs. N infact watchdogs ran best on a single titan. Uber smooth but fps wasnt anything to shout about.
> 
> About the ultra vs high quality. Really to test it
> out. But whats funny is before i instaleed the hd pack .. Ultra vs high had like 1-2gn vram difference. After installing. Its still the same. Not sure whether there was a increase in fidelity as there was a lot of cutscenes during the first gameplay.


What kind of FPS are you getting in game? I haven't had a chance to test out high vs ultra, do you see a big difference?


----------



## Darklyric

Anyone seen any news of a 290 running this with the HD pack and ultra all around? I don't want to pay full price if it can't be maxed... And is CF or SLI working?

Thanks


----------



## Promisedpain

I've now downloaded the ultra res pack and it seems to be working on my 780 Ti too. BUT there was a problem at the beginning: the game eats a lot of RAM at ultra settings. I was crashing to desktop with an out-of-memory error; 8GB seems barely enough, and I had to close some programs before I could get into the game. The weather and time of day seem to change every time I reload the game, so it's hard to take comparable screenshots, but ultra and high look about the same. I did notice the fur looking better on the coat at the main menu, just like the screenshots around the internet show, but nothing huge in-game though...

EDIT: Barely any difference between high and ultra; the screenshots around the internet are pretty much accurate...


----------



## Chargeit

Nice.

Now I'm messing with upscaling.

I set it to 150% of whatever 1080p is. I limited the FPS to 30 (by using adaptive vsync at half refresh) to remove as much FPS drop and stutter as possible (30 FPS is very playable with a controller). The game plays great like this and looks even better.

I just got done fighting a massive group of at least 15 orcs with little to no noticeable fps drops or stutter.


----------



## MapRef41N93W

Hmm, I'm getting a lot of crashing with the game now: "Video device removed or reset". It's only happening with this game.


----------



## supermi

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Hmm I'm getting a lot of crashing with the game now. "Video deviced removed or reset". Only happening with this game.


I am getting that as well. Are you overclocked?


----------



## MapRef41N93W

Quote:


> Originally Posted by *supermi*
> 
> I am getting that as well. Are you overclocked?


Getting this crash at stock and OC. It didn't happen to me once last night, but I've had it happen 3 times today, on two different operating systems (it was happening both before and after I migrated to the Windows 10 Technical Preview).


----------



## hurleyef

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Hmm I'm getting a lot of crashing with the game now. "Video deviced removed or reset". Only happening with this game.


I'm getting the same thing. I've been assuming that it was my overclock and have been ticking it down.


----------



## 12Cores

So a console-port, reskinned version of Assassin's Creed requires 6GB of VRAM and Crysis 3 does not. OK.


----------



## MapRef41N93W

Quote:


> Originally Posted by *12Cores*
> 
> So a console port skinned version of Assassins Creed requires 6gb of Vram and Crysys 3 does not, ok
> 
> 
> 
> 
> 
> 
> 
> .


Definitely not a skinned version of Assassin's Creed. It's way better at the same types of things AC does than AC itself.


----------



## fizzle

Quote:


> Originally Posted by *12Cores*
> 
> So a console port skinned version of Assassins Creed requires 6gb of Vram and Crysys 3 does not, ok
> 
> 
> 
> 
> 
> 
> 
> .


Have you played it? Why bash a game you haven't played yet? I don't get why people do this.


----------



## 12Cores

Quote:


> Originally Posted by *fizzle*
> 
> Have you played it? Why bash a game you haven't played yet? I don't get why people do this.


This game is not my cup of tea; I will probably get it for $10 on Steam next year. I just watched FrankieonthePC1080p's video on YouTube and the game looks like Assassin's Creed with a few tweaks, sorry. That doesn't mean it's a bad game; I'm just stating my opinion, not trying to cause any issues in the thread.

Cheers


----------



## fizzle

Quote:


> Originally Posted by *12Cores*
> 
> This game is not my cup of tea, I will probably get it for $10 on Steam next year. I just watched FrankieonthePC1080p's video on youtube and the game looks like Assassin Creed with a few tweaks sorry, it does not mean its a bad game just stating my opinion not trying to cause any issues on the thread.
> 
> Cheers


Fair enough! I'd say give it a shot. I got bored of assassins creed after the second one, but this is pretty refreshing. It's like they took the best of batman and ass creed and polished the crap out of it with a bonus: it's LOTR.


----------



## phenom01

God, I am loving this game. Such brutal combat... just killed my first warchief and man, am I hooked on this. I played the Assassin's Creed series and the Batman series... and this game is sooooo much better than them IMO. Nothing like getting a 50+ hit combo and decapitating every orc in sight with finishers... then when one tries to run, catching him and draining him for arrows to use to pick off the other stragglers.


----------



## Chargeit

These were taken during a side mission where you have to pit fight and kill 50 Orcs... I died with 1 kill left.


----------



## KenjiS

Quote:


> Originally Posted by *phenom01*
> 
> God I am loving this game. Such brutal combat...just killed my first warchief and man am I hooked on this. I played Assassins creed series and the Batman series...and this game is sooooo much better than them IMO. Nothing like getting a 50+ hit combo and decapitating every orc in sight with finishers...then when one trys to run catching him and draining him for arrows to use to pick off other stragglers.


I know right









The game is definitely a love-it-or-hate-it sort of game. If you like AC and Batman, you'll love it; if you don't like either, you won't. Simple as that.

As for the "port", it's a very good one; it's well optimized. My frames haven't dipped below 50 with everything maxed out at ultra on a single 970...


----------



## wholeeo

Played for about an hour and am very happy with the purchase.









edit: Removed benchmark image, looks like once you download the Ultra textures the game sets textures back down to high automatically.


----------



## andrew110

Quote:


> Originally Posted by *KenjiS*
> 
> AS for the "port" its a very good port, its well optimized, My frames havent dipped below 50 with everything maxed out at Ultra on a single 970...


That's at 1440p? I take it without the texture pack?


----------



## KenjiS

Quote:


> Originally Posted by *andrew110*
> 
> That's at 1440p? I take it without the texture pack?


With the texture pack

As others said though, it might be that Ultra is dynamically scaling or something based on my VRAM...


----------



## andrew110

Quote:


> Originally Posted by *KenjiS*
> 
> With the texture pack
> 
> As others said though, it might be that Ultra is dynamically scaling or something based on my VRAM...


Either way that's good performance from a sub $400 card.

I have a 970 on the way and can't wait to play the game on my 1440p monitor. Hopefully FC4 and GTA 5 are just as smooth.

Coming from a single 670 I'm expecting huge gains.


----------



## KenjiS

Quote:


> Originally Posted by *andrew110*
> 
> Either way that's good performance from a sub $400 card.
> 
> I have a 970 on the way and can't wait to play the game on my 1440p monitor. Hopefully FC4 and GTA 5 are just as smooth.
> 
> Coming from a single 670 I'm expecting huge gains.


Well, if it helps, my 770 SLI wasn't playable at the same settings, probably due to VRAM constraints.

I've found the 970 to be a nice... let's call it a side-grade from my 770 SLI.


----------



## andrew110

Quote:


> Originally Posted by *KenjiS*
> 
> Well if it helps, my 770 SLI wasnt playable at the same settings, probubly due to VRAM constraints
> 
> ive found the 970 to be a nice...lets call it side grade from my 770 SLI


4GB is definitely the minimum at our res.

Do you have it OC'd? I went with the base model Zotac because it was the only one in stock. I hope I don't regret it.


----------



## KenjiS

Quote:


> Originally Posted by *andrew110*
> 
> 4GB is definately the minimum at our res.
> 
> Do you have it OC? I went with the base model Zotac because it was the only one in stock. I hope I don't regret it.


Yeah, OC'd to 1554 core and 8000 memory with very little tweaking.

The 970s are generally super good at OCing, so you should get some good numbers out of the Zotac...


----------



## cstkl1

Quote:


> Originally Posted by *Ferreal*
> 
> What kind of FPS are you getting in game? I haven't had a chance to test out high vs ultra, do you see a big difference?


The benchmark shows a 70 average; in game I limited it to 60. It never moved from that, even when it was one vs. 50.

Really need to see the difference. Will do a side-by-side with the 780 Ti, since both rigs use the same monitor. Hope the TN monitor won't affect the colors.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Darklyric*
> 
> Anyone seen any new of a 290 running this with a the hd pack and ultra all around? Don't want to pay full price for it not being maxed... And is CF or SLI working?
> 
> Thanks


My 290s are working just fine in crossfire with ultra + the HD pack.


----------



## Dsrt

Running 2x R9 290X at 1100/1375 in crossfire and having problems with the ultra HD texture pack; turning the camera makes it choppy. It benchmarks ~90 FPS average (frame pacing enabled), but in-game you can feel the FPS drops when you stand in towers and turn the camera. Had to lower to high textures. Might be CPU related though; I didn't check the usage (i5-3570K @ 4.3GHz).


----------



## Silent Scone

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> My 290's are working just fine in crossfire and ultra + hd pack.


No, they're not.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Silent Scone*
> 
> No, they're not.


LOL, ok... Because it's not 3 x 980s? Played all last night and about 4 hours tonight, 1440p ultra with the HD pack. Let me guess, screenshots and video probably wouldn't even suffice for you.


----------



## Silent Scone

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> LOL ok... Because it's not 3 x 980s? Played all last night and about 4 hours tonight 1440p ultra and hd pack. Let me guess screenshots and video probably wouldn't even suffice for you.


Because you're not some magical cliché that can make a 4GB card run with a texture pack that uses more than 4GB.

As I said previously, some people have low standards.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Chargeit*
> 
> I was using 10gb system ram when playing.
> 
> *Testing it now and sitting at 8gb total ram usage. 3.5gb of that is other. I did see 10gb used at one point last night, least I think I did.
> 
> *Maxed out while watching it at 8.5gb.


Quote:


> Originally Posted by *Ferreal*
> 
> Well they did recommend 6gb vram for ultra HD textures. I am running 1440p on ultra and it is using consistently 5.8gb - 6gb.
> 
> People are reporting to be running ultra with 780ti. Either way, this game is pretty amazing @ high or ultra settings.
> 
> I'm not a huge fan of batman style combat but they got it right here.


Quote:


> Originally Posted by *Promisedpain*
> 
> I've now downloaded the ULTRA res pack and it seems to working on my 780 ti too. BUT there was a problem at the beginning, seems like the game eats a lot of RAM at ultra settings, I was crashing to desktop with out of memory error, seems like 8GB is barely enough. I had to close some programs before I got into game. It seems like the weather and time of the day change everytime I reload the game, so it's hard to take comparable screenshots - but Ultra and High look about the same. And I did notice the fur looking better on the coat at the main menu, just like the screenshots around the internet show, nothing huge in-game tho...
> 
> EDIT: Barely any difference between high and ultra, the screenshots around the internet are pretty much accurate....


WoW....All the vram and then around 8gb of system ram!!

Couldn't even do that with my most intensive benchmarking program....LoL

Would love to know how in the hell that is happening from just 1 game....


----------



## Silent Scone

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> WoW....All the vram and then around 8gb of system ram!!
> 
> Couldn't even do that with my most intensive benchmarking program....LoL
> 
> Would love to know how in the hell that is happening from just 1 game....


Afterthought: poor compression, poor LOD adjustment. But insane texture detail.

It's like they said in the Eurogamer interview: their development systems will be running K6000s with potentially up to 12GB of frame buffer, so rather than just scrap everything they've been working on, they've let the end user have a go at it too. Which I think is perfectly fair game.
Quote:


> "They are on monster PCs making the highest possible quality stuff and then we find ways to optimise it, to fit onto next-gen, to fit onto PCs at high-end specs. Then obviously there's going to be that boundary where our monster development PCs are running it OK - but why not give people the option to crank it up? It makes sense to get it out into the world there - we have it, we built it that way to look as good as possible. You might as well, right?"


Which is why it's funny when you see people saying "urgh it runs fine for me with my 4GB card".

Sure thing!


----------



## StrongForce

A gameplay FRAPS benchmark that checks min, avg, and max FPS would be good!


----------



## funfordcobra

I have two 780 Ti cards in SLI and it runs fine. Everything is ultra except textures (high).

At 2560x1440 I get 80-100 FPS in game.

3440x1440 is locked at 60 with vsync, no dips at all. Without vsync it's at 70-80 avg.

Both resolutions hover right under 3GB of VRAM.


----------



## Promisedpain

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> WoW....All the vram and then around 8gb of system ram!!
> 
> Couldn't even do that with my most intensive benchmarking program....LoL
> 
> Would love to know how in the hell that is happening from just 1 game....



I think it's because of the lack of VRAM; the game starts using my RAM instead.


----------



## hurleyef

This was while fighting a roomful of orcs in a stronghold with the alarm raised for two minutes at 1440p with all settings at ultra, including the dlc ultra texture pack. Everything seemed smooth to me, but I don't necessarily have the best eyes for that.

FRAPSLOG.TXT 0k .TXT file


ShadowOfMordor2014-10-0203-40-00-16fps.csv 0k .csv file


ShadowOfMordor2014-10-0203-40-00-16frametimes.csv 154k .csv file


ShadowOfMordor2014-10-0203-40-00-16minmaxavg.csv 0k .csv file
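For anyone who wants min/avg/max and 1% lows out of a frametimes log like the ones attached, a short script can do it. This is a hedged sketch: the file path and the "frame index, cumulative milliseconds" column layout are assumptions about typical FRAPS frametimes output, not something verified against the attachments above.

```python
# Sketch: summarize a FRAPS frametimes CSV.
# Assumed layout (typical FRAPS output): a header row, then rows of
# "frame number, cumulative time in ms".
import csv

def frametime_stats(path):
    """Return (min_fps, avg_fps, max_fps, one_percent_low_fps)."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    # Skip the header; column 1 holds the cumulative timestamp in ms.
    times = [float(r[1]) for r in rows[1:] if len(r) >= 2]
    # Per-frame durations are the differences between timestamps.
    deltas = [b - a for a, b in zip(times, times[1:]) if b > a]
    fps = sorted(1000.0 / d for d in deltas)
    one_pct = fps[: max(1, len(fps) // 100)]  # slowest 1% of frames
    return fps[0], sum(fps) / len(fps), fps[-1], sum(one_pct) / len(one_pct)
```

The 1% low is usually a better stutter indicator than the bare minimum, which is the number people tend to argue about in threads like this.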


----------



## Silent Scone




----------



## Defoler

KitGuru has a review using an FX-8350 and a stock 780.
At 1440p with ultra textures they got a 55 FPS average.
Quote:


> We did notice frame drops during 4K testing and gameplay, which is to be expected, current single GPU technology just doesn't have the horsepower.


Quote:


> Judging from our testing, its safe to say that you can ignore that 6GB 'requirement' for Ultra textures at sub 4K resolutions, the game is perfectly playable with a 3GB GPU. On another note, I have to say, so far I'm really enjoying the game and its dynamic enemy system. However, I'm still at the early stages so its too soon for me to say whether its definitely worth buying or not.


They don't have higher-memory cards on hand, but I doubt there's much difference, as the GPU core is more of a bottleneck than the VRAM usage.


----------



## daviejams

I've got it running at about 70 FPS with everything at ultra but textures at high.

i5 2500K & R9 280X

The game is incredible, by the way; this is the first genuinely different game since the new consoles came out. Loads going on behind the scenes.

Looks amazing, and I love how they have taken elements from the AC games, Far Cry 3, and Batman, rolled them into one, and set it in Mordor.

Well done


----------



## Promisedpain

Quote:


> Originally Posted by *Defoler*
> 
> Kitguru has a review using a FX-8350 and 780 stock.
> At 1440p and ultra textures they got a 55fps average.
> 
> They don't have higher memory cards on hand, but I doubt that the difference is there as the GPU core is the bottleneck than the vram usage.


Yeah it seems playable at ultra on my 780 ti.


----------



## Silent Scone

Sweet. Does your Ti have more VRAM than my 980?

Just curious


----------



## funfordcobra

People who run 3+ cards and think they are superior are idiots. That is all.


----------



## Silent Scone

Nobody is referring to number of cards

I think idiots are idiots, especially ones with random rage posts









Even more so ones who think they can run over the frame buffer and are adamant it works OK when quite clearly it doesn't. It's just that some of us aren't butthurt enough to lie about it, lol.


----------



## friend'scatdied

Quote:


> Originally Posted by *Silent Scone*
> 
> Nobody is referring to number of cards
> 
> I think idiots are idiots, especially ones with random rage posts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even more so especially ones who think they can run on the frame buffer and are adamant it works o.k when quite clearly it doesn't. It's just some of us aren't butt hurt enough to lie about it lol


Is there data to validate that it doesn't work OK?

TBH in this context I think both subjective anecdote and developer PR is bunk. Objectivity is somewhat lacking for now -- comparative benchmarks might help.

KitGuru's assessment is insufficient IMHO. A good test might be 780 Ti vs. Titan Black (at the same clocks), or even 7970 3GB vs. 7970 6GB, to determine whether less VRAM results in diminished performance.


----------



## Silent Scone

Quote:


> Originally Posted by *friend'scatdied*
> 
> Is there data to validate that it doesn't work OK?
> 
> TBH in this context I think both subjective anecdote and developer PR is bunk. Objectivity is somewhat lacking for now -- comparative benchmarks might help.
> 
> KitGuru's assessment is insufficient IMHO. Good test might be 780 Ti vs. Titan Black (at same clocks) or even 7970 3GB vs. 7970 6GB to determine whether the lesser VRAM results in diminished performance.


I was going to fit my blocks later, but I am sorely tempted to do a comprehensive video / FRAPS run of this in action. I trust my own results (obviously), and there are a lot, and I mean a lot, of people saying it is working OK for them even at 1440p.

Frankly, I don't see how. Memory is memory!


----------



## funfordcobra

I can't run ultra settings on 3GB. That is all.


----------



## friend'scatdied

Quote:


> Originally Posted by *Silent Scone*
> 
> Frankly, I don't see how. Memory is memory!


Yeah, that's the thing. The best way to confirm or deny either claim is to isolate as many variables as possible -- use two GPU SKUs with varying VRAM that are otherwise completely identical. In recent memory this could only be done with the 7970, 780, and 780 Ti (Titan Black).

Unfortunately, not many people have these on hand, especially with the mass transition to 4GB-only GM204. We might see some level of validation if an 8GB GM204 becomes available and people provide before/afters.


----------



## wholeeo

I think a few people need to go back and check what the texture setting is actually set to. I thought all was great until I realized that after I had downloaded the ultra HD textures, my texture setting had been dropped down to high. Also, performance should be gauged from in-game play; the benchmark is much easier to run than the actual game. Ultra @ 1440p on my 3GB 780s was fine in the benchmark but ran like trash in game. That's with SLI enabled by forcing alternate frame rendering 2 in the NVCP. At high texture settings, however, everything is perfect.


----------



## friend'scatdied

So basically people who think they're running Ultra textures smoothly on 3GB might actually be running High?

I guess seeing those people walk back their claims of "ultra textures buttery smooth on 3GB" is another way to prove >3GB is necessary.


----------



## Silent Scone

My son's 680 (2GB) runs high fairly well, so that's no great feat, lol.


----------



## daviejams

Who cares, really; it runs great on 3GB cards.

I am happy with everything set to ultra except textures.


----------



## cstkl1

Quote:


> Originally Posted by *daviejams*
> 
> Who cares really , it runs great on 3gb cards.
> 
> I am happy with everything but textures set to ultra


That's the thing. There are people claiming it's fine with 3GB on ultra with the HD pack. I don't believe it, hence I will test this weekend. But judging from the past, with people claiming Wolfenstein/Watch Dogs were fine, which was a lie, it fits. And reviews... there was no ultra HD pack until like 10 hours after the game launched, and none for the review download codes, etc.

Too many reviews chasing site hits with no proper testing.

Another in-game bug: on ultra, certain grass areas have these illuminated white lines on them when you move left to right.


----------



## daviejams

Quote:


> Originally Posted by *cstkl1*
> 
> Thats the thing. There are ppl claiming its fine with a 3gb on uktra with hd pack. I dont believe it. Hence will test this weekend. But judging from the past on ppl claiming wolfy/watchdog was fine which was a lie, it fits. And reviews... There was no ultra hd pack until like 10hrs after the game was launched and none for the review download codes etc.
> 
> Too many reviews trying to get site hits and no proper testing.
> 
> Another bug in game. On ultra certain grass areas theres like this illuminating white lines on it when you move left to right.


I don't believe them either; just play it on high textures if you have a 3GB card. That said, I've not tried it myself, just read what folk on here have been saying about the game using 5.5GB of VRAM. If it was using that and you had a 3GB card, it would stutter -- same as the Watch Dogs problem!

Wolfenstein was fine for me on an R9 280X 3GB. No stuttering, but the occasional drop to the 40s.

Some game, this Shadow of Mordor; so much going on in the background.


----------



## BusterOddo

Quote:


> Originally Posted by *wholeeo*
> 
> Think a few people need to back and check what the texture setting is actually set to. I thought all was great until I realized that after I had downloaded the Ultra HD textures my texture settings were dropped down to High. Also, performance should be gauged from in game play, the benchmark is much easier to run than the actual game. Ultra @ 1440p on my 3GB 780s was fine in the benchmark, ran like trash in game. That's with SLI enabled via alternate force frame rendering 2 in the nvcp. However at high textures settings everything is perfect.


This is exactly what I did. I thought I was running the HD pack on ultra after it installed, then posted in the new driver thread that it was running well on my CF 7970s. Launched the game yesterday after work and noticed it was on high. Setting it to ultra makes the game an unplayable slideshow with my 3GB cards. So with everything on ultra except textures, the game runs really well. Getting almost perfect scaling too.

As a side note I am having a blast with this game! Was never a fan of Batman, but love the AC games. This is a really nice mix of both. I will admit I would much rather have the ultra textures cause they do look really good, but the game still looks really nice without. Kind of makes me not want to play anymore until I can max it out though...


----------



## Leopard2lx

I can post a run with FRAPS using HD textures on a 3gb card when I get back from work.


----------



## FreeElectron

OK, so if I want to game at 1440p on ultra:

Code:


(2560*1440)/(1920*1080) = 1.78

I would need:

Code:


6 * 1.78 = 10.68GB

11GB of VRAM?
Dafuq?
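The back-of-envelope scaling above can be written out, with the caveat that it badly overestimates: texture memory, which is presumably most of the 6GB here, does not grow with resolution; only framebuffers and render targets do. A hypothetical sketch of the calculation:

```python
# Naive back-of-envelope from the post above: assume VRAM use scales
# linearly with pixel count. This overestimates badly, since texture
# memory does not grow with resolution -- only framebuffers and
# render targets do.
def naive_vram_estimate(base_gb, base_res, target_res):
    """Scale a VRAM figure by the ratio of pixel counts."""
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return base_gb * target_px / base_px

estimate = naive_vram_estimate(6.0, (1920, 1080), (2560, 1440))
# roughly 10.7GB by this (flawed) method
```

In practice the 1440p reports in this thread hover around the same 5.8-6GB as 1080p, which is what you'd expect if textures dominate.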


----------



## ChronoBodi

Uh, guys: with the ultra HD textures installed, playing at 4K with every setting maxed except motion blur off and ambient occlusion at medium (can't tell the difference enough to be worth it),

I confirm it takes up 5700MB of VRAM at 4K.

The whole "1080p needs 6GB of VRAM" claim forgot to tell you that that's what happens when in-game supersampling is enabled at 200% -- of course, you're playing at 4K by that point.

It's really 4K where 6GB is needed; anything below is fine with 4GB of VRAM.
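The pixel math behind that claim is easy to check. The sketch below assumes, as the post implies, that the in-game supersampling slider scales each axis rather than total pixel count:

```python
# Check of the claim above: if the in-game 200% supersampling scales
# each axis (an assumption about how the slider works), then 1080p at
# 200% renders exactly as many pixels as native 4K UHD.
def supersampled_res(width, height, percent):
    """Return the internal render resolution for a per-axis scale."""
    scale = percent / 100.0
    return int(width * scale), int(height * scale)

assert supersampled_res(1920, 1080, 200) == (3840, 2160)  # same as 4K UHD
```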


----------



## FreeElectron

Quote:


> Originally Posted by *ChronoBodi*
> 
> uh guys, with Ultra HD textures installed, and playing on 4K with every settings maxed except for Motion Blur off and Ambient Occlusion at medium (Can't tell the difference enough to be worth it)
> 
> i confirm it takes up 5700MB of VRAM at 4K.
> 
> the whole "1080p needs 6GB vram" forgot to tell you that's what happens if supersampling is enable ingame to 200%, then of course you're playing at 4K by that point.
> 
> It's really for 4K that 6GB is needed, anything below is fine with 4GB VRAM.


Just to be sure, can you record the test with everything at max at 4K (I know it's unplayable) with the MSI OSD or any OSD on, to show GPU usage and VRAM usage?


----------



## ChronoBodi

Quote:


> Originally Posted by *FreeElectron*
> 
> Just to be sure
> Can you record the test with everything at max at 4k (I know it is unplayable) and turn on MSI OSD or any OSD to show GPU usage and VRAM usage?


Well, will do later.

We still have no official SLI profile, so it's either FEAR 3 SLI bits or forced AFR2.

Either way, it's 60 FPS when you're out in the open in the fields, but it all goes to crap when you go into combat: awful 30 FPS and even 20 FPS drops. It's weird -- smooth 60 FPS out in the open, but crap framerates in combat.

And this is gameplay; the benchmark is a lot more forgiving.


----------



## Ferreal

Quote:


> Originally Posted by *ChronoBodi*
> 
> Well, will do later.
> 
> We still have no official SLI profile, so it's either FEAR 3 SLI bits or forced AFR2.
> 
> Either way, it's 60 FPS when you're out in the open in the fields, but it all goes to crap when you go into combat, awful 30 FPS and even 20 FPS drops. it's weird, smooth 60 fps out in the open, but crap framerates in combat.
> 
> And this is gameplay, the benchmark is a lot more forgiving.


That just shows you NEED more VRAM for 4K. I am running 1440p here and not getting drops during combat.

I have everything that can be set to ultra on ultra, except ambient occlusion, which is on high.


----------



## ChronoBodi

Then why would they release a game stating it needs 6GB of VRAM when you have that much?

The 6GB of VRAM @ 1080p is with supersampling, which means I'm running the same thing anyway @ 4K, which IS 200%-upscaled 1080p.

On top of that, there are VERY, VERY few cards with 8GB of VRAM but quite a few 6GB cards, so the crappy 30 FPS performance has to come down to not having a proper SLI profile.


----------



## Ferreal

I am running Titan Black SLI and using the NVCP to force AFR2. Works fine for now, until NVIDIA releases new drivers.

Having the OSD on did affect my FPS for some reason. I was using PrecisionX for the OSD; after turning it off I am getting steady FPS with no drops.


----------



## ChronoBodi

Quote:


> Originally Posted by *Ferreal*
> 
> I am running Titan black SLI and using NVCP to force AFR2. Works fine for now until nvidia releases new drivers.
> 
> Having OSD on did affect my fps for some reason. Using precisionX for OSD, after turning it off I am getting steady FPS with no drops.


OK, I do the same thing with SLI original Titans...

Yet I get crappy FPS drops in combat, and 60 FPS everywhere else. How are you not getting any drops?


----------



## Baasha

Just got this game yesterday. My goodness does it look good! It is a real surprise since I don't really care about LotR etc.

The combat is highly refreshing after playing the garbage that is Assassin's Creed -- a one-button-mashing turd. This at least has flip-dodges, combos, and brilliant executions! I can't believe the muppets at Ubi won't implement something like this -- they had a similar system in PoP Warrior Within years ago. Ugh...

Anyway, Surround does NOT work on this game - just black screens and "stopped working." So I played it on just one 4K monitor.

No SLI support either - forcing AFR2 is meh... since I'm running 4-Way SLI, only two GPUs are utilized and the scaling is not that great. If I force it to use all 4 GPUs (Nvidia Inspector), scaling is in the 20 - 30% range and the FPS is around 20.

With AFR2 (using 2 GPUs), I'm seeing around 50 - 60 FPS with everything maxed out, including Ambient Occlusion, and Ultra HD Texture Pack.

The VRAM usage is at a constant 6GB which is incredible - but the textures are absolutely GORGEOUS!

Stay tuned for some incredible 4K videos of this sweet game!

Really wish other companies would focus more on combat system - combos, finishers etc. Makes the game SO much better.


----------



## 970Rules

I can't speak for 700-series people with 3GB.
I just know the GTX 900 series has no issues with 4GB of VRAM in this game; my 970, clocked at 1500+ core / 8000 memory, is stutter-free and smooth at 1080p with "ultra" textures installed and selected, just as it should be with the newest and greatest tech.
I've had about 10+ hours with the game so far.


----------



## Ferreal

Quote:


> Originally Posted by *ChronoBodi*
> 
> Ok, i do the same thing with SLI Original Titans...
> 
> Yet i have crappy FPS drops in combat gameplay, yet 60 FPS everywhere else. How are you not getting any drops?


You did say that you are running @ 4k resolution.

I am running this on 1440p, that may be the difference.


----------



## friend'scatdied

Honest question:

Does the 3rd-gen delta color compression introduced in Maxwell have any implications for game VRAM footprint vs. Kepler and earlier?

My understanding is it only really affects memory bandwidth (allowing the 256-bit bus at 7GHz to perform 25% better than otherwise), but I'd be blown away if it allowed that 4GB to go farther.

Would be curious to see VRAM utilization on a 6GB GK110 vs. any of the GM204s.
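For reference, the rough arithmetic behind that "25% better" figure, using the commonly quoted GTX 980 specs (my numbers, not anything from this thread):

```python
# Rough arithmetic behind the bandwidth point above: a 256-bit bus at
# 7 GHz effective moves 256/8 * 7 = 224 GB/s raw, and an average 25%
# gain from delta color compression makes it behave more like 280 GB/s.
# Note compression changes effective bandwidth, not VRAM capacity --
# the 4GB footprint is unchanged.
def effective_bandwidth_gbps(bus_bits, mem_clock_ghz, compression_gain=1.25):
    raw = bus_bits / 8 * mem_clock_ghz  # bytes per transfer * rate, in GB/s
    return raw * compression_gain

assert effective_bandwidth_gbps(256, 7.0) == 280.0
```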


----------



## RSharpe

Quote:


> Originally Posted by *Baasha*
> 
> Just got this game yesterday. My goodness does it look good! It is a real surprise since I don't really care about LotR etc.
> 
> The combat is highly refreshing after playing the garbage that is Assassins' Creed - one button mashing turd. This at least has flip-dodge, combos, and brilliant executions! I can't believe the muppets at Ubi won't implement something like this - they had a similar system in PoP Warrior Within years ago. Ugh...
> 
> Anyway, Surround does NOT work on this game - just black screens and "stopped working." So I played it on just one 4K monitor.
> 
> No SLI support either - forcing AFR2 is meh... since I'm running 4-Way SLI, only two GPUs are utilized and the scaling is not that great. If I force it to use all 4 GPUs (Nvidia Inspector), scaling is in the 20 - 30% range and the FPS is around 20.
> 
> With AFR2 (using 2 GPUs), I'm seeing around 50 - 60 FPS with everything maxed out, including Ambient Occlusion, and Ultra HD Texture Pack.
> 
> The VRAM usage is at a constant 6GB which is incredible - but the textures are absolutely GORGEOUS!
> 
> Stay tuned for some incredible 4K videos of this sweet game!
> 
> Really wish other companies would focus more on combat system - combos, finishers etc. Makes the game SO much better.


Have you tried using Nvidia Inspector with the F.E.A.R. 3 compatibility bits? I've been using them since launch and have managed to get 97%+ utilization on my two GPUs fairly regularly in-game. The cut-scenes appear to be pre-rendered at 60fps, but usage in-game is otherwise good.


----------



## DADDYDC650

Runs great at 1440p ultra settings with the HD pack installed on my 780 6GB SC. Thanks go out to Jacob over at EVGA for the 6GB GPU recommendation.


----------



## marcus556

I'm running Ultra everything on a 780 Ti 3GB model and not having any issues. Vsync is even turned on and I still get 60 frames.


----------



## Promisedpain

Quote:


> Originally Posted by *marcus556*
> 
> Im running Ultra everything on a 780ti 3gb model and not having any issues. Vsync is even turned on and i still get 60 frames


Yeah, same here. I wonder if The Evil Within will be the same; then I won't have to upgrade for a while... though a 980 Ti sounds tempting, depending on how much VRAM it will have.


----------



## Menta

http://www.gamespot.com/videos/middle-earth-shadow-of-mordor-graphics-comparison/2300-6421635/

I don't recall the PC version being so hazy and white!


----------



## Azefore

Quote:


> Originally Posted by *Menta*
> 
> http://www.gamespot.com/videos/middle-earth-shadow-of-mordor-graphics-comparison/2300-6421635/
> 
> i dont recall the PC version being so hazzy and white!!!


Could be your monitor's calibration, my game looks pretty much the same as the video.

Quick snippet I took with Shadowplay; didn't bother capturing at 1440p (not my best performance either):


----------



## Menta

LOL, it's not my monitor, it's a GameSpot showdown... but there's no way the PC version looks like that video.

The PC version in GameSpot's video looks like it has a strange layer of white over it.


----------



## Azefore

Quote:


> Originally Posted by *Menta*
> 
> LOL its not my monitor it is a gamespot showdown.... but no way the PC version looks like that vídeo


Perhaps the inherent console over-saturation looks jarring; almost all console games are overly saturated and lose the really nice subtle blackened details. 1:42 is the biggest demonstration of this point. Their PC footage looks as neutral as can be to me.


----------



## Menta

I don't know, but on my 970 the game is very sharp, much better than that video.


----------



## Azefore

Quote:


> Originally Posted by *Menta*
> 
> i dont know but on my 970 the game is very Sharp much better then that vídeo


Aye, I hear you. Don't take the video at face value either; YouTube and other players are all at fault for compression, etc. The only good way to do it is to have a download of the original source video available.


----------



## TopicClocker

Quote:


> Originally Posted by *Menta*
> 
> http://www.gamespot.com/videos/middle-earth-shadow-of-mordor-graphics-comparison/2300-6421635/
> 
> i dont recall the PC version being so hazzy and white!!!


The PC footage has completely washed out colours.

In the comments someone who is staff said this:
Quote:


> We just want to give fans, and developers the most accurate representation of the game we can. Hence, we wanted to be as transparent as possible, and state the DLC Ultra-HD texture pack was not installed. *That said, Gamma settings on all 3 platforms are a little different. And I am working on a better solution so we don't confuse people.*


Well, that explains the colour differences across all three platforms.
I'm happy that they are responding to people's concerns.


----------



## LancerVI

Ok, I just couldn't resist. I have to get this. Just purchased it. I want to see my R9 290's in CF fall to their knees.


----------



## Buris

I'll definitely have to check whether my R9 290 can run it on ultra with the ultra texture pack.

It shouldn't be an issue. I feel like this type of info is released simply because it draws news attention; making a game that pushes limits automatically attracts tons of PC elitists like me.


----------



## LancerVI

Quote:


> Originally Posted by *Buris*
> 
> I'll definitely have to check out if my R9 290 can run it on ultra with the ultra texture pack--
> 
> Shouldn't be an issue- I feel like this type of info is released simply because it draws news attention, and making a game that pushes limits automatically brings a bunch of attention to tons of PC Elitists like me


I think so too, and to be honest, it definitely worked on me. I just can't help myself now. Crysis 1 had a similar effect. I usually don't buy games at launch, but I keep hearing how it's going to bring my system to its knees, so I just have to see for myself.

I also thought the 32GB kit in my system was overkill, but hey, it sounds like it may not be for too long.

OT Old Man Moment: Ultima VII complete is 30MB. I remember thinking, when it was just Black Gate, coming in at over 10MB... OMG! Think about that: music, speech, graphics, everything, and they fit it all on fewer than ten 5.25" 1.2MB floppies. I know it's a silly, crazy comparison, but I honestly think Ultima VII still looks pretty good. There's a part of me that thinks the efficiency of game programming was lost somewhere, though I'm completely speaking out of my backside; I know very little of the subject, it's just a feeling. BUT 6GB VRAM!?!?


----------



## Darklyric

so has anyone run this on a 290 with the extra texture pack installed at ultra settings?


----------



## paulerxx

Very High
 Ultra (without texture pack)

i5 3570k(stock)
8GB 1600 DDR3
HD 7870 1100/1350

Not bad considering PS4 runs at 30fps at 1080p with a similar GPU.


----------



## LancerVI

Quote:


> Originally Posted by *Darklyric*
> 
> so has anyone run this on a 290 with the extra texture pack installed at ultra settings?


I will in about an hour. I'll try to record a run with a single 290 and then in crossfire with the ultra textures installed. Give me a couple hours. Kids are coming home from school. Homework and then dinner time!


----------



## MapRef41N93W

Quote:


> Originally Posted by *paulerxx*
> 
> Very High
> Ultra (without texture pack)
> 
> i5 3570k(stock)
> 8GB 1600 DDR3
> HD 7870 1100/1350
> 
> Not bad considering PS4 runs at 30fps at 1080p with a similar GPU.


PS4 has an underpowered mobile CPU and it's a well threaded game.


----------



## Darklyric

Quote:


> Originally Posted by *LancerVI*
> 
> I will in about an hour. I'll try to record a run with a single 290 and then in crossfire with the ultra textures installed. Give me a couple hours. Kids are coming home from school. Homework and then dinner time!


Too late, lol, I bought it already... You should run your CPU at stock so we can compare an i7 vs. an 8350, though, just for fun.


----------



## MonarchX

I think it was already mentioned that Ultra 4K textures create a more visible difference at higher resolutions of 1440p and 4K. I fully understand that screen resolution and texture resolution are 2 different things, but even with Skyrim, I noticed 4K textures don't look that much better than 2K textures on far away objects and the difference is truly visible only if you get real close to the texture or even zoom-in on it. At 1080p, 4K Ultra textures won't look that much better than 2K High textures, but at 4K, the difference will be more visible. You also need true 4K and not 4K to 1080p downscaling to fully appreciate such high resolution textures. IMHO, 4K to 1080p downscaling with 2K textures looks much better than regular 1080p with 4K textures. All in all, it makes little sense to use Ultra textures @ 1080p, unless you have 6GB of VRAM on a high-end SLI GPU setup with a very low input lag monitor. I think smooth motion is more important than slightly sharper textures in this game because of its fast-paced style of combat.
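As a rough illustration of why texture resolution tiers eat VRAM the way this post describes (the numbers and function below are my own back-of-envelope sketch, not the game's actual budget): an uncompressed RGBA texture costs width x height x 4 bytes, plus roughly a third more for the mip chain, so each doubling of texture resolution quadruples the memory.

```python
# Hypothetical, uncompressed texture footprint estimate. Real games use
# block compression (BC1/BC7 etc.), so actual sizes are several times
# smaller -- this only shows the 4x scaling between "2K" and "4K" tiers.

def texture_mib(width: int, height: int, bytes_per_texel: int = 4) -> float:
    """Approximate size in MiB, including a full mip chain (~4/3 overhead)."""
    return width * height * bytes_per_texel * 4 / 3 / 2**20

two_k  = texture_mib(2048, 2048)   # ~21.3 MiB per texture
four_k = texture_mib(4096, 4096)   # ~85.3 MiB, 4x the 2K version
```

That 4x per texture, multiplied across every material in an open world, is how an Ultra pack can balloon a 1.5GB budget toward 6GB even though each individual texture only looks "a bit" sharper up close.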


----------



## GoDucks2014

Watch: after all this huff and fuss, it will end up being something like a 780-level card or better with less than 6GB of VRAM gets 106 FPS vs. 123 FPS with 6GB, and that's what they'll call "negatively impacted".


----------



## LancerVI

Quote:


> Originally Posted by *GoDucks2014*
> 
> Watch, all this huff and fuss and it will end up being something like on a 780 level card or better with less than 6GB of VRAM will result in FPS of 106 vs FPS of 123 with 6GB and that will be what they call "negatively impacted".


You're probably right and as a Huskies fan, I hate to say that to you!!! Go Huskies!!!! Bow Down!! Woof Woof!!!


----------



## Menta

The game has good graphics but is very, very far from being a "system breaker".

It's not all about the graphics, though.


----------



## paulerxx

The game looks great even on my HD 7870, and runs really well too. Everything is on ultra except textures/ambient occlusion. The game seems to be like Dead Rising 3 and runs more fluidly with a 30fps cap + vsync (animations seem a bit off if not set this way; test it yourself, you'll see).

If you experience stuttering or low frames per second, adjust these settings in this order: textures, ambient occlusion, and then shadows.
"Below is the performance impact of each setting:

Lighting Quality: It has a LOW performance impact.

Mesh Quality: It has a LOW performance impact.

Motion Blur: It has a negligible performance impact.

Shadow Quality: It has a HIGH performance impact.

Texture Filtering: It has a HIGH performance impact.

Texture Quality: It has a LOW performance impact. Having said that you will have to play with the corresponding settings otherwise some serious FPS variance will occur. If you have a GPU from the future with 6 GB VRAM you can play on Ultra. Otherwise you have: High (if you have 3GB VRAM), Medium (if you have 2 GB VRAM) and Low (if you have 1GB VRAM).

Ambient Occlusion: It has a VERY HIGH performance impact. I recommend using the "High" Setting or lower. Drop this setting first.

Vegetation Range: It has a HIGH performance impact.

Depth of Field: It has a negligible performance impact.

Order independent Transparency: It has a MEDIUM performance impact. Turn OFF if you are playing with VSync On.

Tessellation: It has a LOW performance impact."

Here's an actual guide http://in2gpu.com/2014/09/30/middle-earth-shadow-mordor-performance-guide/
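The guide's VRAM-to-texture-quality tiers boil down to a simple lookup. A minimal sketch (the function name and structure are mine, not the game's; only the tier boundaries come from the quoted guide):

```python
# The quoted guide's recommendation: Ultra needs 6GB, High 3GB,
# Medium 2GB, Low 1GB. Anything in between rounds down to the tier
# you can actually hold.

def recommended_texture_quality(vram_gb: float) -> str:
    if vram_gb >= 6:
        return "Ultra"
    if vram_gb >= 3:
        return "High"
    if vram_gb >= 2:
        return "Medium"
    return "Low"

recommended_texture_quality(4)   # "High" -- e.g. a 4GB GTX 970/980
```

Which matches the experience reported throughout this thread: 3-4GB cards land on High, and Ultra only behaves once you're at 6GB.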


----------



## Leopard2lx

Here are the results of a couple of runs I did using FRAPS. Ultra settings with Ultra textures with a 780 clocked at 1306 Mhz

First run at 1080p

2014-10-02 16:45:57 - ShadowOfMordor
Frames: 27602 - Time: 309484ms - Avg: *89.187* - Min: 0 - Max: 101

Second run at 1600p

2014-10-02 16:55:25 - ShadowOfMordor
Frames: 11694 - Time: 198844ms - Avg: *58.810* - Min: 28 - Max: 79

The frame rate fluctuates substantially more than with High textures (most likely due to the VRAM limitation), but other than that, as you can see, the game is still perfectly playable with 3GB of VRAM.
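(Editor's aside: the FRAPS averages above are just total frames over elapsed time, so they're easy to sanity-check.)

```python
# FRAPS reports Frames and Time in ms; the average is frames / seconds.

def avg_fps(frames: int, time_ms: int) -> float:
    return frames / (time_ms / 1000)

avg_fps(27602, 309484)   # ~89.19, matching the 1080p run above
avg_fps(11694, 198844)   # ~58.81, matching the 1600p run above
```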


----------



## StrongForce

Quote:


> Originally Posted by *wholeeo*
> 
> Think a few people need to back and check what the texture setting is actually set to. I thought all was great until I realized that after I had downloaded the Ultra HD textures my texture settings were dropped down to High. Also, performance should be gauged from in game play, the benchmark is much easier to run than the actual game. Ultra @ 1440p on my 3GB 780s was fine in the benchmark, ran like trash in game. That's with SLI enabled via alternate force frame rendering 2 in the nvcp. However at high textures settings everything is perfect.


Exactly. I haven't played the game, but I pretty much had that assumption; it's why I asked people to benchmark actual gameplay, if possible even in intensive areas, and post min/avg/max frames. Now that's interesting data, especially along with resolution, card, etc.


----------



## Chargeit

I wonder how many copies that 6GB VRAM suggestion sold. I know I bought it because of that.









I know mine's set to ultra textures.


----------



## chronicfx

Has anyone actually played the game or just benchmarked it? This game kind of snuck up on me; I thought the good games were coming out in November. I've never bought an Assassin's Creed game or a Batman game; I've been more of a Skyrim, Witcher, Dragon Age, Mass Effect, Fallout, Crysis kind of guy, rinse and repeat throughout the years. I was waiting for Far Cry and Dragon Age. Should I get this?


----------



## NoDoz

Quote:


> Originally Posted by *chronicfx*
> 
> Has anyone actually played the game or just benchmarked it? This game kind of snuck up on me, I thought the good games were coming out in november. Never bought an assassins creed game or a batman game, have been more a skyrim, witcher, dragon age. Mass effect , fallout, crysis kind of guy rinse and repeat throughout the years. Was waiting for falr cry and dragon age, should I get this?


I enjoy the game a lot. It seems well thought out.


----------



## Chargeit

Quote:


> Originally Posted by *chronicfx*
> 
> Has anyone actually played the game or just benchmarked it? This game kind of snuck up on me, I thought the good games were coming out in november. Never bought an assassins creed game or a batman game, have been more a skyrim, witcher, dragon age. Mass effect , fallout, crysis kind of guy rinse and repeat throughout the years. Was waiting for falr cry and dragon age, should I get this?


It's more like Assassin's Creed / Batman than the other ones you've mentioned.

Catch it on a Steam sale maybe.


----------



## KenjiS

Quote:


> Originally Posted by *chronicfx*
> 
> Has anyone actually played the game or just benchmarked it? This game kind of snuck up on me, I thought the good games were coming out in november. Never bought an assassins creed game or a batman game, have been more a skyrim, witcher, dragon age. Mass effect , fallout, crysis kind of guy rinse and repeat throughout the years. Was waiting for falr cry and dragon age, should I get this?


Yes, I've sunk like 6-7 hours into it now and have been having a blast.

It's got some great stuff going for it: a solid storyline, excellent animations, solid combat mechanics, and some great parkour mechanics. I'd even say better than AC (Talion doesn't seem to stop or slow down anywhere near as much as Ezio/Connor/Edward...). The graphics are pretty good to look at too, it's well optimized, and you should have no trouble hitting 60fps.

It also controls a lot better than AC with mouse/keyboard, as in I find it a lot more playable than AC4 was...


----------



## StrongForce

So I'm a wee bit confused, lol. If 6GB is for 4K downscaled, then approximately how much VRAM would 2K downscaled to 1080p with the ultra texture pack and most settings on ultra need to run smoothly? Would 4GB be enough? That would mean Nvidia wasn't so cheap to release with "only", ahem, 4GB.

And by the way, nice to hear this game is good. Like someone said, it's not all about the textures, plus I believe I heard the words "open world", which usually doesn't come with epic ultra textures! Nice comeback, Monolith. I'm looking forward to buying this game once I upgrade my card, and then running it with the ultra pack, muahaha







, I have bout 10 other games to finish too before starting a new one lolz.


----------



## KenjiS

Quote:


> Originally Posted by *StrongForce*
> 
> So I'm a wee bit confused lol, 6gb is for 4k downscaled, then approximately how much VRAM a 2k downscaled 1080p with ultra texture pack and most settings on ultra would need to run smooth would 4 be enough ? would that mean Nvidia was'nt so cheapass to release with "only" ahem lol, 4gb.


I'm running 1440p very well with 4GB of VRAM. 4K is pushing it and slows to a crawl in too many places (not VRAM, though... I think this actually just needs a hair more processing power, like 970 SLI...).


----------



## Sequences

You don't need 6GB of VRAM to enjoy the game. I'm running 3600x1920 with 2GB VRAM and gameplay is nice and smooth.


----------



## SoloCamo

Quote:


> Originally Posted by *Sequences*
> 
> You don't need 6GB of VRAM to enjoy the game. I'm running 3600x1920 with 2GB VRAM and gameplay is nice and smooth.


at what settings?


----------



## hanzy

So, I am running the game with everything maxed BUT textures, those are on high.

What I noticed on my system with ultra textures was good FPS ~ 80-96 FPS, but it would stutter at times, especially during combat with groups.
I dropped textures to high, and the game is very smooth, running average of ~100 FPS.
I prefer this setting.

There is a difference in IQ, but it's not enough of a difference to make me wish I could play with ultra textures enabled.

So far I think it is a really good game, but it is not a god tier game to me. I liked Tomb Raider 2013 much much more, and I think Black Flag is more fun, and has a lot more content(but much more repetitive), and looks a lot better.

I would rate it 7.5/10.


----------



## BusterOddo

Quote:


> Originally Posted by *chronicfx*
> 
> Has anyone actually played the game or just benchmarked it? This game kind of snuck up on me, I thought the good games were coming out in november. Never bought an assassins creed game or a batman game, have been more a skyrim, witcher, dragon age. Mass effect , fallout, crysis kind of guy rinse and repeat throughout the years. Was waiting for falr cry and dragon age, should I get this?


I only heard about it when this thread popped up. I would absolutely recommend this game. Very action-packed, a good story so far, and the graphics can rival the top games out.
Quote:


> Originally Posted by *KenjiS*
> 
> Yes I've sunk like, 6-7 hours into it now and been having a blast.
> 
> Its got some great stuff going for it, Solid storyline, excellent animations, solid combat mechanics and some great parkour mechanics, I'd even say better than AC *(Talion doesnt seem to stop or slow down anywhere near as much as Ezio/Connor/Edward...)* The graphics are pretty good to look at too and its well optimized and you should have no trouble hitting 60fps
> 
> It also controls a lot better than AC with mouse/keyboard, as in I find it a lot more playable than AC4 was...


Yes, I really like how you never feel "stuck" in a move while fighting. It's very fluid and makes really nice transitions quickly in and out of all the different moves you have.


----------



## MapRef41N93W

Quote:


> Originally Posted by *hanzy*
> 
> So, I am running the game with everything maxed BUT textures, those are on high.
> 
> What I noticed on my system with ultra textures was good FPS ~ 80-96 FPS, but it would stutter at times, especially during combat with groups.
> I dropped textures to high, and the game is very smooth, running average of ~100 FPS.
> I prefer this setting.
> 
> There is a difference in IQ, but is not enough of a difference to make me play/wish I could play on with ultra textures enabled.
> 
> So far I think it is a really good game, but it is not a god tier game to me. I liked Tomb Raider 2013 much much more, and I think Black Flag is more fun, and has a lot more content(but much more repetitive), and looks a lot better.
> 
> I would rate it 7.5/10.


I would say this game absolutely crushes Black Flag (at least when Black Flag actually becomes an AC game, rather than the really good ship combat). AC wishes it had this good of a combat system, and you get much more freedom in this game without the ridiculous "simulation" B.S. and hovering menus everywhere.


----------



## StrongForce

Quote:


> Originally Posted by *Sequences*
> 
> You don't need 6GB of VRAM to enjoy the game. I'm running 3600x1920 with 2GB VRAM and gameplay is nice and smooth.


I know, but we all have different definitions of "smoothness", I guess, lol. What's your min FPS? I was just wondering if 2K downscaled to 1080p was runnable on a 4GB card, say a GTX 970, and I meant all ultra with the texture pack, etc.!


----------



## LancerVI

My Crossfire 290s are running this game on ULTRA all the way across the board with the HD textures JUST FINE at 1080p, 60+ FPS most of the time. That said, I'm only 30 minutes into the game, so I'm sure I haven't gotten into any major battles. I just killed that slaver Orc with 10+ orcs on screen, give or take, and was in the mid 60s to 70s the entire way through.

It's actually weird. Game DVR stayed pretty much at 60 (I do have vsync on) while Bandicam was reporting FPS well in the 80 to 100 range. I don't know what the discrepancy was, but there have been no hitches so far, and while the game looks good, I'm still trying to figure out why the 6GB of VRAM is necessary.

Also, there is clearly a problem with Crossfire and the menu system. It flickers like crazy. Turn off crossfire and it goes away.

As far as the game goes, I have to say, I'm not a big fan of AC or Batman, but I'm loving this game. Maybe it's because I'm a JRR Tolkien fanboy. Nothing like ole leathery tomes around the house and the Silmarillion.


----------



## dph314

The benchmark is in no way indicative of having enough VRAM for Ultra textures. I got 88fps in the benchmark with everything maxed and Ultra textures, then the game was a stutter-fest as soon as there was more than 2 guys on the screen at the same time.

Just to clarify for anyone wondering...my personal experience is that 3GB is in no way enough. Not even close. Doesn't matter what framerates people post in the benchmark runs. As soon as you're fighting 6 orcs, you'll turn the textures down to High.


----------



## LancerVI

Quote:


> Originally Posted by *dph314*
> 
> The benchmark is in no way indicative of having enough VRAM for Ultra textures. I got 88fps in the benchmark with everything maxed and Ultra textures, then the game was a stutter-fest as soon as there was more than 2 guys on the screen at the same time.
> 
> Just to clarify for anyone wondering...my personal experience is that 3GB is in no way enough. Not even close. Doesn't matter what framerates people post in the benchmark runs. As soon as you're fighting 6 orcs, you'll turn the textures down to High.


Well, that's interesting because that's not my experience AT ALL.

I'm sorry your machine is struggling, but the fact of the matter is that my Crossfire 290s with 4GB are running this fine, albeit in the beginning stages of the game, with 10-15 orcs on screen at 60+ fps at 1080p.

Again, my only problem so far is the dang flickering menu with crossfire enabled.


----------



## DADDYDC650

Quote:


> Originally Posted by *LancerVI*
> 
> Well, that's interesting because that's not my experience AT ALL.
> 
> I'm sorry your machine is struggling, but the fact of the matter is that my crossfire 290's with 4GB's are running this fine, albeit, in the beggining stages of the game with 10-15 orcs on screen at 60+ fps just fine @ 1080p.
> 
> Again, my only problem so far is the dang flickering menu with crossfire enabled.


He did say 3GB wasn't enough and not 4.


----------



## dph314

Quote:


> Originally Posted by *LancerVI*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dph314*
> 
> The benchmark is in no way indicative of having enough VRAM for Ultra textures. I got 88fps in the benchmark with everything maxed and Ultra textures, then the game was a stutter-fest as soon as there was more than 2 guys on the screen at the same time.
> 
> Just to clarify for anyone wondering...my personal experience is that 3GB is in no way enough. Not even close. Doesn't matter what framerates people post in the benchmark runs. As soon as you're fighting 6 orcs, you'll turn the textures down to High.
> 
> 
> 
> Well, that's interesting because that's not my experience AT ALL.
> 
> I'm sorry your machine is struggling, but the fact of the matter is that my crossfire 290's with 4GB's are running this fine, albeit, in the beggining stages of the game with 10-15 orcs on screen at 60+ fps just fine @ 1080p.
> 
> Again, my only problem so far is the dang flickering menu with crossfire enabled.

Yeah, 4GB might be perfectly fine. It must not need 6, but I don't think 3 will cut it. I've seen 970s doing pretty well, so, I don't know.


----------



## LancerVI

I'm really starting to get the feeling that it's all a bit overblown. New cards launch; a new game launches that says your old card is going to be crap for it. I'm not much for conspiracy theories, but I'm sure this sold a card or two.

The characters look great. The other textures? Meh. But again, I really am enjoying the game.


----------



## dph314

Quote:


> Originally Posted by *LancerVI*
> 
> I'm really starting to get the feeling that it's all a bit overblown. New cards launch. New game launches that says your old card is going to be crap for it. Not much on conspiracy, but I'm sure this might have sold a card or two.
> 
> The characters look great. The other textures? Meh. But again, I really am enjoying the game.


I agree, definitely enjoying the game. The level of detail is very impressive. Kill an enemy on mud and you can actually see their body sunk a few inches into it. When Captains 'come back from the dead' and rise back up in the army menu, they show scars of how they were killed. Detail like that just really makes me appreciate a game and the effort that went into it.

Edit: Ever since dropping down to High textures, I see a consistent ~2800MB of VRAM usage and no stuttering. I guess I'll be sticking with High until I get my 980 Classified.


----------



## Baasha

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I would say this game absolutely crushes Black Flag (at least when Black Flag actually becomes an AC game, rather than the really good ship combat). AC wishes it had this good of a combat system, and you get much more freedom in this game without the ridiculous "simulation" B.S. and hovering menus everywhere.


Agree with you 1000%.

If ONLY Ubi made the combat system in AC like this game, it would be a million times better. I LOVE combos, executions, and finishers etc. in this game and they should expand on that and build an awesome combat system with several upgrade trees etc.

AC combat is absolutely worthless.

*I used to be a big fighting game enthusiast - I won a Killer Instinct 2 tournament in the arcade back in the day!







CA-CA-CA-CA COMBO BREAKER! ULTRAAAAAAAA COMBOOOOOOOOOOO!


----------



## Baasha

Quote:


> Originally Posted by *dph314*
> 
> I agree, definitely enjoying the game. The level of detail is very impressive. Kill an enemy on mud and you can actually see their body sunk a few inches into it. When Captains 'come back from the dead' and rise back up in the army menu, they show scars of how they were killed. Detail like that just really makes me appreciate a game and the effort that went into it.
> 
> Edit: Ever since dropping down to High textures, I see a consistent ~2800MBs of VRAM usage and no stuttering. I guess I'll be sticking with High until I get my 980 Classified.


Agreed, this game's visuals are absolutely stunning, especially in 4K with everything on Ultra. A constant 6GB of VRAM is being used, and the dumb thing is that with no real SLI support, the ghetto method makes only 2 GPUs work, and even then the scaling is all wacky.

I've had it with "Tarz" - he kept killing me, but then I got the old bugger several times, and he went from puke-green to pale white!


----------



## Hilpi234

Yes, the HD pack is installed. I just played a side mission: no stuttering, only the dip after loading the map.

G-Sync and frame limit @ 119.


----------



## Promisedpain

Quote:


> Originally Posted by *Baasha*
> 
> Agreed - this game's visuals are absolutely stunning - especially in 4K w/ everything on Ultra. Constant 6GB of VRAM being used and the dumb thing is that no real SLI working so the ghetto method makes only 2 GPUs work - and that too the scaling is all whacky.
> 
> I've had it with "Tarz" - kept killing me and then I got the old bugger - several times that he went from puke-green to pale white!


Wish I could see the game on 4K. I can only run 1080p.







Still looks amazing tho.


----------



## FreeElectron

Quote:


> Originally Posted by *Hilpi234*
> 
> 
> 
> 
> 
> Yes, the Hd Pack is installed, i just played a side Mission, no Stuttering.... Only the dip after loading the Map.
> 
> Gsync and Framelimit @ 119


What's your rig's cooling?


----------



## Hilpi234

1x 480, 1x420 Rad @ Bottom and Top


----------



## Sequences

Quote:


> Originally Posted by *StrongForce*
> 
> I know but we all have different definitions of "smoothness" I guess lol, what's your min fps ? I was just wondering if 2k downscale to 1080 was runnable on a 4gb say gtx 970 ! and I meant all ultra with texture pack etc !


I haven't checked and I don't plan on worrying about it. This is the point I'm making: gameplay > super high texture pack. So many people obsess over playing everything at max. I think I'm playing on high with my dinky setup and having fun. Just play at a setting that has good FPS for you.


----------



## FreeElectron

Quote:


> Originally Posted by *Hilpi234*
> 
> 1x 480, 1x420 Rad @ Bottom and Top


CPU? Mobo?


----------



## Murlocke

How do you disable the 30fps cap with vsync? I have FPS limiter set to "Unlimited". I get about 50FPS average at 3440x1440, Ultra textures installed, Ultra preset. The only thing that isn't maxed is Ambient Occlusion (which is on High by default with the Ultra preset). For good reason it seems, if I set it to Ultra, I can't tell the difference other than the fact I get 35-40FPS.


----------



## Hilpi234

No, only the CPU and GPU are in the loop... a mainboard block is not really necessary.

4770K

Gigabyte SOC Z97


----------



## FreeElectron

Quote:


> Originally Posted by *Hilpi234*
> 
> No only CPU and GPU ... Mainboard is not really necessary


I meant: what are the CPU and mobo?


----------



## dph314

Quote:


> Originally Posted by *Murlocke*
> 
> *How do you disable the 30fps cap with vsync*? I have FPS limiter set to "Unlimited". I get about 50FPS average at 3440x1440, Ultra textures installed, Ultra preset. The only thing that isn't maxed is Ambient Occlusion (which is on High by default with the Ultra preset). For good reason it seems, if I set it to Ultra, I can't tell the difference other than the fact I get 35-40FPS.


Get 60fps, that's how.

If you're hitting 50fps with v-sync on, you'll drop down to 30fps because you're not hitting 60fps. Then if you can't uphold 30fps, v-sync will drop you down to 15fps.

Is that what you're asking, or am I missing something? It sounds like what you'd want to use is Adaptive V-Sync, which disables v-sync when you can't maintain 60fps, so if you're at 50 you wouldn't be dropped down to 30.
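The quantization described above can be sketched as a few lines: classic double-buffered v-sync snaps the delivered rate to an integer divisor of the refresh rate (60, 30, 20, 15, ... at 60Hz). Note the post skips the 20fps step; the function below is my sketch of the general rule, not any specific driver's behavior.

```python
# With double-buffered v-sync, a frame that misses a refresh waits for
# the next one, so the delivered rate locks to refresh/n for integer n.

def vsync_locked_fps(render_fps: float, refresh: float = 60.0) -> float:
    """Largest integer divisor of the refresh rate not above render_fps."""
    n = 1
    while refresh / n > render_fps:
        n += 1
    return refresh / n

vsync_locked_fps(50)   # 30.0 -- a 50fps renderer is held at 30
vsync_locked_fps(65)   # 60.0 -- above refresh, you get the full 60
```

Adaptive V-Sync avoids the lock-down by simply turning v-sync off whenever the renderer is below the refresh rate, trading the drop to 30fps for some tearing.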


----------



## Foxrun

Quote:


> Originally Posted by *KenjiS*
> 
> With the texture pack
> 
> As others said though, it might be that Ultra is dynamically scaling or something based on my VRAM...


I can't hit that with my 980.


----------



## Silent Scone

1440p

980 GTX. Install HD Pack DLC.

Load game. Apply settings, including Ultra Texture Quality

Restart game

Load game

VRAM taps out at 4050

Stuttering and banding ensues

lowers detail.

Reads 980 Users online: "Works ok for me at Ultra!"

Laughs.


----------



## StrongForce

Quote:


> Originally Posted by *Silent Scone*
> 
> 1440p
> 
> 980 GTX. Install HD Pack DLC.
> 
> Load game. Apply settings, including Ultra Texture Quality
> 
> Restart game
> 
> Load game
> 
> VRAM taps out at 4050
> 
> Stuttering and banding ensues
> 
> lowers detail.
> 
> Reads 980 Users online: "Works ok for me at Ultra!"
> 
> Laughs.


Wow, that's sad. Yeah, like I said in another post, some people maybe just don't notice or don't care much: "different definition of smoothness".









Would be curious how much it uses at 1080p with the same settings, though.


----------



## Silent Scone

Like your signature says









Three things cannot be long hidden: the sun, the moon, and the truth.


----------



## KenjiS

Quote:


> Originally Posted by *Silent Scone*
> 
> 1440p
> 
> 980 GTX. Install HD Pack DLC.
> 
> Load game. Apply settings, including Ultra Texture Quality
> 
> Restart game
> 
> Load game
> 
> VRAM taps out at 4050
> 
> Stuttering and banding ensues
> 
> lowers detail.
> 
> Reads 980 Users online: "Works ok for me at Ultra!"
> 
> Laughs.


No clue man, I'm not having those issues, and my VRAM isn't tapping out at 4050... and I have the HD pack installed, everything set to Ultra, at 1440p.


----------



## Defoler

I'm running 2x 680 4GB, using alternate frame rendering 2 to force SLI, with vsync on.

The game is far from being even close to optimised.
Before you even enter the game it takes 1GB of VRAM, and it sits at 2GB constantly with just the face in the menu. Open a cinematic, which runs at a lower resolution, and the game takes 4GB of memory.

The benchmark shows a 59fps average with drops to 40fps.
In actual play it normally runs at 45-50fps with a few drops to 35fps. But the GPU isn't fully loaded some of the time, so I think it's mainly drivers, not VRAM, that's the bottleneck.

All of this with ultra textures.
Personally I think they are just dumping every single texture onto the card. There would be no real need for 6GB if they optimised the game. Even during cinematics the game keeps all the textures on the GPU; unused textures are never released.

I think when Nvidia brings out optimised drivers for the game, things will be different.
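For what it's worth, "releasing unused textures" in engines usually means evicting against a VRAM budget. Here is a minimal LRU sketch of the idea; all the texture names and sizes below are made up for illustration, not taken from the game:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture budget: evicts the least recently used texture
    when binding a new one would exceed the VRAM budget (sizes in MB)."""
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.cache = OrderedDict()  # name -> size, oldest first

    def bind(self, name, size_mb):
        if name in self.cache:           # already resident: mark as recent
            self.cache.move_to_end(name)
            return
        while self.used + size_mb > self.budget and self.cache:
            _, evicted = self.cache.popitem(last=False)  # drop oldest
            self.used -= evicted
        self.cache[name] = size_mb
        self.used += size_mb

cache = TextureCache(budget_mb=3000)
cache.bind("orc_armor", 1200)
cache.bind("ground_rock", 1200)
cache.bind("cinematic_face", 1200)   # evicts orc_armor to stay in budget
print(sorted(cache.cache))           # ['cinematic_face', 'ground_rock']
```

Under a scheme like this, cinematic-only textures would get evicted once gameplay textures push them past the budget, instead of pinning VRAM at the 6GB worst case.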


----------



## Foxrun

Quote:


> Originally Posted by *Silent Scone*
> 
> 1440p
> 
> 980 GTX. Install HD Pack DLC.
> 
> Load game. Apply settings, including Ultra Texture Quality
> 
> Restart game
> 
> Load game
> 
> VRAM taps out at 4050
> 
> Stuttering and banding ensues
> 
> lowers detail.
> 
> Reads 980 Users online: "Works ok for me at Ultra!"
> 
> Laughs.


Exactly


----------



## DADDYDC650

Played for a while last night and, according to EVGA Precision, the game used a little over 6.1GB









Settings are 1440p and everything maxed out. No vsync.


----------



## paulerxx

The game plays beautifully maxed out (minus Ultra textures) at 1080p with a 30fps cap and VSYNC on my HD7870...Some games are just meant to be played this way. Dead Rising 3 is a recent example. Test it out yourself. 30fps cap with vsync, whatever settings you want. No stuttering, no frame rate dips, no slow downs.


----------



## MxPhenom 216

Quote:


> Originally Posted by *Yungbenny911*
> 
> This V-ram argument again... "My GPU gets 3.6 MB usage in bf4 @ 1440p on my 4gb 970, 2gb would choke" <--- That's a Noob right there. People just need to educate themselves with hands on experience.
> 
> Go buy a 3GB 780, 6GB 780, 4k monitor, and do YOUR testing with factual results to prove it. All these walls of text would not help anyone, so just freaking GO BUY IT!!!. When you're done, come back and talk about V-ram being an issue.
> 
> We'll see how well that 6gb would "boost" performance on the 780 compared to the 3gb version...


So that's why people running 770s with 2GB were having problems running the game, right? No.

Battlefield is different. It'll show it's "using" all the memory you have, but is it ACTUALLY using it? No.

A lot more games these days are becoming that way.

Hell, at 1080p the game "uses" everything my 780 has in terms of memory, but like I said, it's not actually using it the way you'd think.
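The allocated-versus-used distinction above can be sketched in a few lines: overlays like Afterburner report the reservation, not the per-frame working set. The numbers below are invented for illustration, not measurements:

```python
# A process can reserve a large pool up front; only the textures actually
# referenced each frame form the working set that matters for stutter.
reserved_mb = 4096            # what an overlay reports as "VRAM usage"
frame_refs = {                # hypothetical textures touched this frame
    "terrain": 900,
    "characters": 800,
    "ui": 50,
}
working_set_mb = sum(frame_refs.values())

print(f"reported: {reserved_mb} MB, actually touched: {working_set_mb} MB")
```

That gap is why a card with less memory than the reported number can still run fine, right up until the true working set no longer fits.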


----------



## Arturo.Zise

Quote:


> Originally Posted by *paulerxx*
> 
> The game plays beautifully maxed out (minus Ultra textures) at 1080p with a 30fps cap and VSYNC on my HD7870...Some games are just meant to be played this way. Dead Rising 3 is a recent example. Test it out yourself. 30fps cap with vsync, whatever settings you want. No stuttering, no frame rate dips, no slow downs.


Does it feel smooth? 30fps would seem like playing in slow motion to me.


----------



## Buris

I can confirm I max it out with a R9 290 @1080p, HD texture packs and all, getting around 50 FPS in game, 80 FPS on the benchmark .









It doesn't bring my PC to its knees, but it puts up a good fight.

My only complaint is that while some Textures look downright amazing (particularly some cloth textures), some (leather, ground) textures could look much better.

Animation is amazing and the combat seems pretty good but I think the game is a bit over-hyped.


----------



## StrongForce

Quote:


> Originally Posted by *KenjiS*
> 
> No clue man, I'm not having those issues, and my VRAM isnt tapping out at 4050... and I have the HD pack installed and everything set to Ultra and 1440p


It certainly depends where you are playing, too... maybe Silent Scone can tell you where he was in the game.
Quote:


> Originally Posted by *Silent Scone*
> 
> Like your signature says
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Three things cannot be long hidden: the sun, the moon, and the truth.












Let's try to clarify things, especially for people like me who don't have the game (yet): is there a slider/option to set the resolution scaling like in BF4? Some people mentioned something like that, running 4K or 2K downscaled, or was that just Nvidia's DSR thing? I'm a bit confused about this.


----------



## Chargeit

Quote:


> Originally Posted by *Buris*
> 
> I can confirm I max it out with a R9 290 @1080p, HD texture packs and all, getting around 50 FPS in game, 80 FPS on the benchmark .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It doesn't bring my PC to it's knees, but it puts up a good fight.
> 
> My only complaint is that while some Textures look downright amazing (particularly some cloth textures), some (leather, ground) textures could look much better.
> 
> Animation is amazing and the combat seems pretty good but I think the game is a bit over-hyped.


Kick it to 1080p +150%. It makes the world look much better. Sadly, performance does suffer, and I had to use adaptive vsync at half refresh to get it running well (enough) with a controller. Still, you should at least check it out.

I'm kind of in an in-between stage. I wanted to play the game through to beat it, but I kind of want to see how it does when I upgrade my GPU to a 6 or 8GB Maxwell. This game can look amazing, and I can only guess what it looks like on a proper 4K monitor/TV. I'm now really starting to consider getting a 4K TV to put on the wall behind/over my triple monitors. Might need more horsepower to push it, but it would be nice.


----------



## LancerVI

Quote:


> Originally Posted by *Buris*
> 
> I can confirm I max it out with a R9 290 @1080p, HD texture packs and all, getting around 50 FPS in game, 80 FPS on the benchmark .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It doesn't bring my PC to it's knees, but it puts up a good fight.
> 
> My only complaint is that while some Textures look downright amazing (particularly some cloth textures), some (leather, ground) textures could look much better.
> 
> Animation is amazing and the combat seems pretty good but I think the game is a bit over-hyped.


Pretty much the same here. My CF'd 290's are doing just fine with Ultra across the board @ 1080p with HD textures: 70-80 FPS average gameplay, and I'm being conservative; I usually see 100FPS in general. But I'm still testing, especially given what I'm about to mention: the menu flickers horribly with CF enabled, and so do very distant textures. Again, only with CF enabled.


----------



## Promisedpain

Quote:


> Originally Posted by *Sequences*
> 
> I haven't checked and I don't plan on worrying about it. This is the point I'm making: gameplay > super high texture pack. So many people obsess over playing everything at max. I think I'm playing on high with my dinky setup and having fun. Just play at a setting that has good FPS for you.


Most of us are not running at "max" anyway; max settings would be 4K at Ultra, or some crazy multi-monitor setup. The possibilities are endless...


----------



## LancerVI

Quote:


> Originally Posted by *Promisedpain*
> 
> Most of us are not running on "max" anyway, max settings are 4K at ultra - or some crazy multimonitor setups. Possibilities are endless...


True enough. I don't know if you can consider 1080p 'ultra' anymore.


----------



## kx11

Quote:


> Originally Posted by *Silent Scone*
> 
> 1440p
> 
> 980 GTX. Install HD Pack DLC.
> 
> Load game. Apply settings, including Ultra Texture Quality
> 
> Restart game
> 
> Load game
> 
> VRAM taps out at 4050
> 
> Stuttering and banding ensues
> 
> lowers detail.
> 
> Reads 980 Users online: "Works ok for me at Ultra!"
> 
> Laughs.


Maybe an OC'd CPU causes the stuttering.

Works fine for me.


----------



## StrongForce

Quote:


> Originally Posted by *Promisedpain*
> 
> Most of us are not running on "max" anyway, max settings are 4K at ultra - or some crazy multimonitor setups. Possibilities are endless...


Yeah, also: when someone says "I'm running full ultra", please say whether you're at 100% or what resolution scale you're using.

Quote:


> Originally Posted by *kx11*
> 
> maybe OC cpu causes the stuttering
> 
> works fine with me


Like I said, maybe he's in a part of the game that's much more intense. How about you guys figure out which town or spot is the most intense and run your tests there?


----------



## kx11

Quote:


> Originally Posted by *StrongForce*
> 
> Yea also when someone said I'm running full ultra, please say if you're using 100% or what resolution scale ??
> Like I said maybe he's in a part in the game that's much more intense, how about you guys figure out which town or spot is the most intense and run your test in there


i'm talking about the benchmark


----------



## StrongForce

Quote:


> Originally Posted by *kx11*
> 
> i'm talking about the benchmark


Oh ok see, some people mentioned the benchmark not being very reflective of actual gameplay performance.


----------



## Defoler

Quote:


> Originally Posted by *StrongForce*
> 
> Oh ok see, some people mentioned the benchmark not being very reflecting of actual gameplay performance.


I think it does.
It gives you a very good average of the performance you will get in general, and its minimum reflects what you will see at certain points in the game during large events.


----------



## LancerVI

Quote:


> Originally Posted by *StrongForce*
> 
> Oh ok see, some people mentioned the benchmark not being very reflecting of actual gameplay performance.


True enough. Also, I have to totally stand corrected here.

I was playing at 1080p with NO SCALING, just 1:1. There obviously was no issue; I WAS getting 80 to 100 fps.

HOWEVER, I couldn't clearly see the menu, because when Crossfire is enabled, you get a really bad, flickering menu. It's very hard to read.

Soooooo, when I disabled CF and set scaling to 200%, I was in the 30fps range with strong dips into the low 20s, even single digits. Again, with CF disabled.

So, to be fair, to myself, I couldn't see the scaling option. Now that I can, I see what the confusion is. I did have everything set to ultra, but the scaling was 1:1 at 1080p. Changing that changes performance SIGNIFICANTLY, obviously.

So in short, with everything maxed out, including resolution scaling, this game brings my system to its KNEES!!!!!!









....and here's my video of it being embarrassed for ALL to see.



EDIT: ***!?!??! 480p???? Trying to fix that now, but that wasn't the point anyway.

So, for all my bloviating before about how it runs fine? Well, it does, if you plan on running it at 1080p at Ultra with HD textures and no supersampling. You'll be just fine. HOWEVER, if you supersample at all, be prepared to take a serious nosedive in performance. DUH!!! Again, I'm sorry, but I couldn't see the options; the screen was flickering so badly I thought I had covered everything.


----------



## wholeeo

Wanted to bench the game on my 750 Ti and wasn't able to drop the settings from 1440p to 1080p... even when setting the monitor itself to 1080p via the driver. Someone show me the light.


----------



## StrongForce

Quote:


> Originally Posted by *LancerVI*
> 
> True enough. Also, I have to totally stand corrected here.
> 
> I was playing at 1080p with NO SCALING. just 1:1. There obviously was no issue. I WAS geting 80 to 100 fps.
> 
> HOWEVER, I couldn't clearly see the menu, because when Crossfire is enabled, you getting a really bad, flickers menu. It's very hard to read.
> 
> Soooooo, when I disabled CF, set scaling to 200%, I was in the 30fps range with strong dips into the low 20's. Again, with CF disabled.
> 
> So, to be fair, to myself, I couldn't see the scaling option. Now that I can, I see what the confusion is. I did have everything set to ultra, but the scaling was 1:1 at 1080p. Changing that changes performance SIGNIFICANTLY, obviously.
> 
> So in short, with everything maxed out, including resolution scaling; this game brings my system to it's KNEES!!!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ....and here's my video of it being embarrassed for ALL to see.
> 
> 
> 
> EDIT: ***!?!??! 480p???? Trying to fix that now, but that wasn't the point anyway.
> 
> So for all my bloviating before about how it runs fine? Well, it does. If you plan on running it at 1080p with no supersampling at Ultra with HD textures. You'll be just fine. HOWEVER, if you supersample at all, be prepared to take a serious nose dive in performance. DUH!!! Again, I'm sorry, but I couldn't see the options, the screen was flickering so bad, I thought I had covered everything.


Ah yeah, it's pretty much like doubling the res, or quadrupling it like 4K. This article talks extensively about DSR. I know it's not exactly the same, but it sounds like it functions pretty similarly in terms of performance requirements; correct me if I'm wrong: http://techreport.com/review/27102/maxwell-dynamic-super-resolution-explored
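As a rough rule of thumb, resolution scaling (in-game or DSR) costs about the square of the scale factor, since fill-rate-bound work grows with pixel count. A quick sketch, illustrative only; real workloads are rarely purely fill-rate bound:

```python
def render_pixels(width, height, scale_pct):
    """Pixels actually rendered at a given resolution-scale percentage."""
    s = scale_pct / 100
    return int(width * s) * int(height * s)

base = render_pixels(1920, 1080, 100)
for pct in (100, 150, 200):
    px = render_pixels(1920, 1080, pct)
    print(f"{pct:>3}% scale: {px:>9,} px ({px / base:.2f}x the work)")
# 150% renders 2.25x the pixels of plain 1080p; 200% renders 4x,
# the same pixel count as native 4K.
```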


----------



## LancerVI

Quote:


> Originally Posted by *StrongForce*
> 
> Ah yea it's pretty much like doubling the res or quadrupling it like 4k, this article here talks extensively about DSR, I know it's not totally the same but it does sound like it functions pretty similarily in terms of performance requirement, correct me if I'm wrong : http://techreport.com/review/27102/maxwell-dynamic-super-resolution-explored


Yeah, I'm familiar with SS; we've been doing it for years with a lot of things. Dark Souls comes to mind most recently, with DSfix. It really fixes up muddy-looking textures, but boy, does it impact performance.


----------



## paulerxx

Quote:


> Originally Posted by *Arturo.Zise*
> 
> Does it feel smooth? 30fps would seem like playing in slow motion to me?


A 30fps cap with vsync is smoother in this game than low settings with a 60fps cap and vsync... It's odd. I think we should expect this from a lot of next-generation ports.


----------



## TopicClocker

If it's 480p, it's still processing; it should update to the max supported resolution once it's done.


----------



## LancerVI

Quote:


> Originally Posted by *TopicClocker*
> 
> If it's 480p it's still processing and should update to the max supported input resolution when its done.


Thanks. Not much of a youtuber. Submitter that is.


----------



## Toque

Yeah, I'm not going to bother with Ultra on my 4GB 290X and 1080p monitor. I just upscaled the image to 150% in the settings to make up for the lack of AA. Looks good and runs at an average of 60fps.









Finally get to use my Xbone controller.

So hot, want to touch the hiney


----------



## FlyingSolo

I'm not sure about you guys, but I think the built-in benchmark tool is wrong. With my sig rig, CPU OC'd to 4.2 and GPU at stock for now (still testing), at 1440p with everything on Ultra, the HD texture pack, and no motion blur, I get 29 FPS in game, measured with Afterburner. The benchmark tool shows higher FPS. So I'm not sure how people are playing this on Ultra with the HD pack at 1440p and getting 60 FPS with one card. Also, VRAM usage is anywhere from 3500 to 3800MB from what I've seen so far.


----------



## delellod123

Can someone tell me the cheapest place to get this game? No region-locked keys please. I'm in the USA.


----------



## Shogon

Quote:


> Originally Posted by *delellod123*
> 
> Can some one tell me the cheapest place to get this game? No region locked keys please. I am in USA


http://www.overclock.net/t/1516405/nuuvem-middle-earth-shadow-of-mordor-steam-key-28-20/0_20

Check it out! With the price I paid, I should have considered buying the premium edition lol.


----------



## Alatar

Quote:


> Originally Posted by *wholeeo*
> 
> Wanted to bench the game on my 750 ti and wasn't able to drop the settings from 1440p to 1080p.. Even when setting my monitor itself to 1080p via the driver. Someone show me the light.


I tried this as well. No luck on getting it to 1080p even with the monitor set to 1080p from the NV panel.


----------



## chronicfx

Quote:


> Originally Posted by *LancerVI*
> 
> True enough. Also, I have to totally stand corrected here.
> 
> I was playing at 1080p with NO SCALING. just 1:1. There obviously was no issue. I WAS geting 80 to 100 fps.
> 
> HOWEVER, I couldn't clearly see the menu, because when Crossfire is enabled, you get a really bad, flickering menu. It's very hard to read.
> 
> Soooooo, when I disabled CF, set scaling to 200%, I was in the 30fps range with strong dips into the low 20's single digits. Again, with CF disabled.
> 
> So, to be fair, to myself, I couldn't see the scaling option. Now that I can, I see what the confusion is. I did have everything set to ultra, but the scaling was 1:1 at 1080p. Changing that changes performance SIGNIFICANTLY, obviously.
> 
> So in short, with everything maxed out, including resolution scaling; this game brings my system to it's KNEES!!!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ....and here's my video of it being embarrassed for ALL to see.
> 
> 
> 
> EDIT: ***!?!??! 480p???? Trying to fix that now, but that wasn't the point anyway.
> 
> So for all my bloviating before about how it runs fine? Well, it does. If you plan on running it at 1080p with no supersampling at Ultra with HD textures. You'll be just fine. HOWEVER, if you supersample at all, be prepared to take a serious nose dive in performance. DUH!!! Again, I'm sorry, but I couldn't see the options, the screen was flickering so bad, I thought I had covered everything.


I just hope it doesn't run like that for me too. I feel there are two camps: some people have the hardware but have driver/software conflicts that make it run badly, and some have the same hardware and no problems... I guess we will see. I will base it off a single 290X, as I'm not expecting much out of trifire until I see proof it works.


----------



## LancerVI

Quote:


> Originally Posted by *chronicfx*
> 
> I just hope it doesn't run like that for me too. I feel there are two camps, some people have the hardware but have their drivers/software conflicts that cause it to run bad and there seem to be some that have the same hardware and no problems... I guess we will see. I will base it off a single 290x as i am not expecting much out of trifire until i see proof it works.


Yeah, so after playing with it a bit, keeping everything at Ultra and setting CF to 'Optimize 1x1' for ShadowOfMordor.exe, I'm getting mid-50s FPS with supersampling/scaling set to 150%.

Also, I've found that while the screen still flickers, it's nowhere near as bad with 'Optimize 1x1' as with the other CF method. And there is clearly a performance benefit with CF, make no mistake: turn it off and FPS dives. But AMD has some work to do here getting Crossfire working properly.

As for the game, I've said it before: I'm not a big AC or Batman fan, but I'm really enjoying this game. I really enjoy the Orc captain / Sauron's army system and it's just fun to play. Calling this an "AC and Batman had a baby" clone is a disservice, I think, but I'm no expert at either of those games. Though, again, I'm a HUGE Tolkien fanman. (Too old to be a fanboy.







)

BTW @Alatar, or some other knowledgeable person: in the vid I posted, the settings report my video cards as Radeon R9 200 series with total VRAM at 8192MB. That's not possible, right? I know SLI and Crossfire mirror the memory, not stack it, so what's going on there?


----------



## TopicClocker

Quote:


> Originally Posted by *LancerVI*
> 
> BTW @Alatar or some other knowledgeable person, in my vid I posted, in the settings it reports my video cards as Radeon R9 200 series with total VRAM @8192. That's not possible right? I know SLI and Crossfire mirror the memory not stack it, so what's going on there?


The game upgrades CF configs; I'd hit eBay with that.
Jokes aside, a couple of gamers have reported VRAM seemingly stacking in Shadow of Mordor. It's really weird.

Does anyone know if the same thing happens with SLI?

I have no idea if it will be fixed or not, but I'd think either a new driver or a patch for the game might fix it.


----------



## Gavush

Quote:


> Originally Posted by *Neilthran*
> 
> How many video cards at the consumer level have 6GB of vram right now?


My coworker put together a rig a few months ago: 2x EVGA Nvidia GTX 780 SC 6GB and an Intel i7 4930K 3.4GHz 6-core. So there are some 6GB consumer cards. He is playing it on Ultra with the texture pack and gets bad frames/graphics in the in-game benchmark, but says actual gameplay is fantastic.

Edit: yes he is playing it at 4k.


----------



## LancerVI

Quote:


> Originally Posted by *TopicClocker*
> 
> The game upgrades CF configs, I'd hit eBay with that.
> Jokes aside, a couple of gamers have reported the seemingly stacking of vram in Shadow of Mordor, it's really weird.
> 
> Does anyone know if the same thing happens with SLI?
> 
> I have no idea if it will be fixed or not, but I'd think either a new driver or a patch for the game might fix it.


LOL, yeah, no kidding. Need to list these bad boys. The VRAM is reproductive!

I know it only mirrors, so it's got to be a bug, unless I've missed some groundbreaking SLI/CF VRAM-striping technology.


----------



## BradleyW

Just tested Ultra at 4K and averaged 60fps, with a single dip to 54fps in the benchmark. Getting perfect 2x scaling.

However, I'm only getting 1.7x scaling at 1080p, with a min fps of 96. It's not a CPU issue, because the min fps increased when the GPUs were overclocked.

Good to see decent CFX support. Just hope the actual game performs well. Only tested the bench!
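For anyone comparing numbers: "scaling" figures quoted in threads like this are just the fps ratio against a single card. A trivial helper, with made-up sample numbers:

```python
def scaling(fps_multi, fps_single):
    """Multi-GPU scaling factor: fps with all cards / fps with one card."""
    return fps_multi / fps_single

print(round(scaling(120, 60), 2))  # 2.0  -> GPU-bound, perfect scaling
print(round(scaling(96, 56), 2))   # 1.71 -> something else (CPU, driver
                                   #         overhead) is pacing frames
```

Sub-2x scaling at lower resolutions usually points at a bottleneck other than the GPUs, which is consistent with seeing it at 1080p but not at 4K.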


----------



## Flames21891

Well, I've been able to play this game @ 1080p with everything maxed out sans Textures, which are set to High.

Given the absolute minimal difference between High and Ultra, this doesn't really bother me. Still, if my 2GB cards can run it on High without issues, despite the game saying it requires 3GB, I'd say the new 4GB 9xx cards should do pretty well on Ultra.

Also I haven't yet checked how much system RAM the game uses during extended play. If it goes over 8GB, then my decision to buy 16GB has finally been validated, lol. That aside, using that much memory is absolutely bonkers.

The FEAR 3 SLI profile works for the most part. Though there are some minor flickering artifacts here and there, it keeps the game well above 60 FPS, so for now I'll deal with it. Hopefully a proper profile is released soon.


----------



## Scorpion49

I did some checking using the built in benchmark, with 6GB Titan at 1080p. With Ultra textures downloaded and then putting the option on, I got 83fps average and 4700MB of Vram usage. Setting textures to high with everything else the same I got 84fps average and only 2400MB of Vram used. Doesn't seem to be any performance penalty to the Ultra textures so I guess I'll run those. System RAM usage was consistently right around 3GB no matter the settings.


----------



## Gorea

Quote:


> Originally Posted by *Scorpion49*
> 
> I did some checking using the built in benchmark, with 6GB Titan at 1080p. With Ultra textures downloaded and then putting the option on, I got 83fps average and *4700MB of Vram usage*. Setting textures to high with everything else the same I got 84fps average and only 2400MB of Vram used. Doesn't seem to be any performance penalty to the Ultra textures so I guess I'll run those. System RAM usage was consistently right around 3GB no matter the settings.


Just like I was guessing. It's a bit over 4GB of video RAM, but there is nothing between 4 and 6, so 6 is the requirement.

Many people on this forum take certain things a bit too literally and start making smartass comments about the high video RAM requirement.


----------



## Scorpion49

Quote:


> Originally Posted by *Gorea*
> 
> Just like I was guessing. It's a tiny bit higher than 4GB of video RAM, but there is nothing in between 4 and 6, so 6 is the requirement.
> 
> Many people on this forum take certain things a bit too literally and start making smartass comments about the high video RAM.


Yeah, I think this is about right as well. I can only think of a handful of 5GB cards and they were all GPGPU stuff so they went with the next highest number that people can actually buy.


----------



## BradleyW

SLI users, have you tried forcing AFR as opposed to game profiles?
Also, my min fps increased by 6 fps just by disabling HT!
In CFX at 1440p and 4K I get perfect 2x scaling; 1080p gives around 1.8x.


----------



## Zaid

Did the benchmark with graphics maxed out (Ultra textures) on a 1080p TV with SSAA at 150%, and I get an average of 55fps in the benchmark, with about 3030MB of VRAM usage reported by MSI Afterburner.

CPU @ 4.9 GPU @ 1220/1630.


----------



## MonarchX

I'm reporting something strange: I did NOT download the HD texture pack, and yet I can select Ultra textures AND that setting actually has a performance impact!

Benchmark results for my Intel Core i7 3770K @ 4.8Ghz, 16GB 2200Mhz RAM @ 10-11-10-30-1T, EVGA GTX 780 Ti @ 1250Mhz GPU (Skyn3t BIOS, Boost disabled) / 7500Mhz VRAM @ 1080p with:

1. Everything set to Ultra, including AO and Textures (not sure if anyone noticed that AO is set to High when the Ultra preset is selected):
Avg. FPS = 87
Min. FPS = 34

2. Everything set to Ultra, including AO, but Textures set to High:
Avg. FPS = 88
Min. FPS = 52

Why in the world would Ultra textures be selectable when the HD pack is NOT downloaded, and why does that setting affect my performance???


----------



## MonarchX

I was wrong: I do have the HD pack installed, and while my VRAM usage sits at 3014MB all the time, the game runs PERFECTLY SMOOTH as long as I don't mess with downscaling and leave it at 1080p! So much for the 6GB VRAM requirement! Battles with a dozen enemies lower FPS to 60 in the worst cases. I mean, even if I encounter more demanding areas later, as long as FPS dips don't go below 30 for more than a moment, I'm good!

Thing is, I see zero difference between Ultra and High, so using Ultra makes little sense. When I set Textures to High, my VRAM usage is a constant 2998MB and it's just as smooth as with Ultra textures!


----------



## LancerVI

Quote:


> Originally Posted by *BradleyW*
> 
> Just tested Ultra at 4K and averaged at 60fps with a single dip to 54fps on the benchmark. Getting perfect x2 scaling.
> 
> However only getting 1.7x scaling in 1080p, with a min fps of 96. It's not a CPU issue, because the min fps increased when the GPU's were overclocked.
> 
> *Good to see decent CFX support. Just hope the actual game performs well. Only tested the bench!*


Are you not getting the flickering menus in CF? If not how did you manage that?


----------



## hanzy

Quote:


> Originally Posted by *BradleyW*
> 
> SLI users, have you tried forcing AFR as oppose to game profiles?
> Also my min fps increased by 6 fps just by disabling HT!
> In CFX at 1440p and 4K I get perfect x2 scaling. 1080p gives around x1.8


I forced AFR2 via NVCP; I could not get the FEAR profile working for some reason.
This seems to be working fine, however.

I can't believe Nvidia doesn't have an SLI profile yet, ***.


----------



## CyberWolf575

Am I the only person who thinks this game looks like crap even on Ultra? The ground and rock textures are so bad that, despite the character textures being amazing, I cannot get over how horrible those ground/rock textures are. It's like a game from 2008.

But on the gameplay aspect, this game is a LOT of fun. Already sunk around 12 hours in to it, and loving it!


----------



## n780tivs980

Quote:


> Originally Posted by *CyberWolf575*
> 
> Am I the only person who thinks even on Ultra this game looks like crap? The ground and rock textures are so bad, that despite the fact that the character textures are amazing, I can not get over the fact of how horrible those ground/rock textures are. It's like a game from 2008.
> 
> But on the gameplay aspect, this game is a LOT of fun. Already sunk around 12 hours in to it, and loving it!


When it's raining some of the ground/rock textures are incredible.


----------



## Zipperly

Quote:


> Originally Posted by *CyberWolf575*
> 
> Am I the only person who thinks even on Ultra this game looks like crap? The ground and rock textures are so bad, that despite the fact that the character textures are amazing, I can not get over the fact of how horrible those ground/rock textures are. It's like a game from 2008.
> 
> But on the gameplay aspect, this game is a LOT of fun. Already sunk around 12 hours in to it, and loving it!


Well, I would disagree. I think the game as a whole looks great, and it's certainly better looking than any open-world type of game from back in '08.


----------



## brandon6199

If you know absolutely nothing about LOTR, would you still be able to play this game and not be lost?

It looks cool but I've never seen any of the movies, etc.


----------



## Zipperly

Quote:


> Originally Posted by *n780tivs980*
> 
> When it's raining some of the ground/rock textures are incredible.


Yep.


----------



## hurleyef

Quote:


> Originally Posted by *Silent Scone*
> 
> Nobody is referring to number of cards
> 
> I think idiots are idiots, especially ones with random rage posts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even more so especially ones who think they can run on the frame buffer and are adamant it works o.k when quite clearly it doesn't. It's just some of us aren't butt hurt enough to lie about it lol


Runs just fine for me, but don't take my word for it, as clearly anyone who claims an experience different from your expectations is either stupid or full of ****. Here are some benchmark results I took while playing this game at max settings with the HD textures installed, on a single GTX 980 at 1440p. And lest you doubt my honesty, I took video proof of the whole affair. Forgive the quality; this is the first time I've actually used the video camera, and it took a few tries because I kept forgetting to film this or that in one shot.

Results: min: 49, max: 61, avg: 56

https://www.youtube.com/watch?v=VgzpELuJG0w&feature=youtu.be

FRAPSLOG.TXT 0k .TXT file


ShadowOfMordor2014-10-0400-12-39-47fps.csv 0k .csv file


ShadowOfMordor2014-10-0400-12-39-47frametimes.csv 59k .csv file


ShadowOfMordor2014-10-0400-12-39-47minmaxavg.csv 0k .csv file


On second thought, I did just go through all this trouble just to prove someone wrong on the internet, so maybe the idiot part fits. But then, you did just get a world record firestrike air run, so does that make you a celebrity? Would that make it ok?
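For anyone who wants to sanity-check numbers like these, FRAPS writes a frametimes CSV alongside the minmaxavg file (one cumulative millisecond timestamp per frame), and min/avg/max fps can be recomputed from it directly. A minimal Python sketch, assuming the standard two-column layout (frame index, cumulative time in ms); the filename is just a placeholder:

```python
import csv

def fps_stats(frametimes_csv):
    """Compute (min, avg, max) fps from a FRAPS-style frametimes CSV.

    Assumes two columns per row: frame index and cumulative time
    in milliseconds. Adjust the parsing if your capture tool
    writes a different layout.
    """
    with open(frametimes_csv, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        times = [float(row[1]) for row in reader if row]
    # Per-frame durations in ms, then instantaneous fps per frame
    deltas = [b - a for a, b in zip(times, times[1:])]
    fps = [1000.0 / d for d in deltas if d > 0]
    return min(fps), sum(fps) / len(fps), max(fps)
```

Recomputing from frame times this way also lets you report a percentile low instead of the raw minimum, which is handy when a single slow frame would otherwise skew the result.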


----------



## Zaid

Quote:


> Originally Posted by *brandon6199*
> 
> If you know absolutely nothing about LOTR, would you still be able to play this game and not be lost?
> 
> It looks cool but I've never seen any of the movies, etc.


Yup, basically all you need to know is:

Sauron = main evil guy
Mordor = big bad evil land ruled by Sauron.
Gondor = major human city across the river from Mordor.


----------



## KenjiS

Quote:


> Originally Posted by *Zaid*
> 
> yup,basically all you need to know is:
> 
> sauron= main evil guy
> mordor= big bad evil land ruled by sauron.
> gondor= major human city across the river from mordor.


You forgot that one does not simply walk into Mordor...


----------



## Chargeit

Quote:


> Originally Posted by *CyberWolf575*
> 
> *Am I the only person who thinks even on Ultra this game looks like crap?* The ground and rock textures are so bad, that despite the fact that the character textures are amazing, I can not get over the fact of how horrible those ground/rock textures are. It's like a game from 2008.
> 
> But on the gameplay aspect, this game is a LOT of fun. Already sunk around 12 hours in to it, and loving it!


I thought it looked pretty bad until I set it to 1080p +150%... That made all the difference.


----------



## hollowtek

Quote:


> Originally Posted by *KenjiS*
> 
> You forgot that one does not simply walk into Mordor...


brace yourselves...


----------



## Chargeit

Quote:


> Originally Posted by *n780tivs980*
> 
> When it's raining some of the ground/rock textures are incredible.


No kidding.


----------



## KenjiS

Quote:


> Originally Posted by *hollowtek*
> 
> brace yourselves...


Oh come on, it's been all these pages and not ONE LOTR meme?


----------



## KenjiS

XD And i am not disappointed


----------



## Silent Scone

Quote:


> Originally Posted by *hurleyef*
> 
> Runs just fine for me, but don't take my word for it as clearly anyone who claims to have an experience different from your expectations is either stupid or full of ****. Here's some benchmark results I took while playing this game at max settings with the hd textures installed on a single gtx 980 at 1440p. Also, lest you should doubt my honesty, I took video proof of the whole affair. Forgive the quality, this is the first time I've actually used the video camera, and it took a few tries for forgetting to film this or that in the one shot.
> 
> Results: min: 49, max: 61, avg: 56
> 
> https://www.youtube.com/watch?v=VgzpELuJG0w&feature=youtu.be
> 
> FRAPSLOG.TXT 0k .TXT file
> 
> 
> ShadowOfMordor2014-10-0400-12-39-47fps.csv 0k .csv file
> 
> 
> ShadowOfMordor2014-10-0400-12-39-47frametimes.csv 59k .csv file
> 
> 
> ShadowOfMordor2014-10-0400-12-39-47minmaxavg.csv 0k .csv file
> 
> 
> On second thought, I did just go through all this trouble just to prove someone wrong on the internet, so maybe the idiot part fits. But then, you did just get a world record firestrike air run, so does that make you a celebrity? Would that make it ok?


It doesn't work OK for you though, does it? The texture pack uses over 4GB of VRAM. Also, when was the last time anyone played a game for LESS THAN A MINUTE?

One doesn't simply walk into more VRAM.


----------



## StrongForce

Quote:


> Originally Posted by *FlyingSolo*
> 
> Am not sure about you guys but i think the built in benchmark tool is wrong. With my sig rig with CPU OC @ 4.2 and GPU at stock for now still testing. On 1440p everything is set on ultra with hd texture pack with no motion blur. In game i get 29 FPS. So am not sure how people are playing this game on ultra with hd texture pack in 1440p and getting 60 FPS with one card. Tested with Afterburner. The benchmark tool shows i get more FPS. Plus the VRAM uses is anywhere from 3500 to 3800 that's what i have seen so far.


Something is odd here. Look at the resolution scale setting: is it turned up? And what's your Firestrike score, just to see if everything runs fine outside the game? Although I don't know the score for a 970, it should be pretty damn high.
Quote:


> Originally Posted by *Zaid*
> 
> Did the benchmark at maxed out graphics (ultra textures) on 1080p tv with SSAA @ 150% and i get average of 55fps on the benchmark. with about 3030mb of Vram usage reported by msi afterburner.
> 
> CPU @ 4.9 GPU @ 1220/1630.


You sure you applied the texture pack / ultra settings??

Wow, why is there so much difference between your performance?

What about in game? Any big frame drops during fights, etc.?

Lol, memes *** !


----------



## Zaid

Quote:


> Originally Posted by *StrongForce*
> 
> Sometime is odd here, look at the resolution scale thing, is it up ? and whats youre firestrike score just to see if everything runs fine outside the game, altought I don't know the score of a 970 it should be pretty damn high.
> You sure you applyed the texture pack /ultra settings??
> 
> WAW why is there so much difference between you guys performance.
> 
> What about in game ? any big frame drops during fights etc..
> 
> Lol memes *** !


100% sure it's on ultra textures. I just redid the benchmark on stock GPU clocks and got 44 fps on ultra with 3800MB of VRAM usage; on high I got 54 fps with 2700MB of VRAM.

The 55 fps I stated was with the GPU overclocked from 1060/1250 to 1220/1630. When I OC the GPU, CPU usage goes from 35-45% to 50-65%, and that's with a 4.9GHz overclock. This game is mad hungry for CPU and GPU power, but optimized very well. In game it barely goes above 3000MB of VRAM; average fps in game is ~55, minimum 45.


----------



## hurleyef

Quote:


> Originally Posted by *Silent Scone*
> 
> It doesn't work ok for you though does it. Because the texure pack uses over 4gb VRAM.


My evidence would seem to indicate that it doesn't use more than 4GB of VRAM. I wonder what factors are causing this discrepancy, as many people here are clearly getting conflicting results.
Quote:


> Originally Posted by *Silent Scone*
> 
> also when was the last time anyone played a game for LESS THAN A MINUTE.


The sample was chosen to be graphically intensive, not to be representative of a normal play session. Were I to benchmark a normal session, the results would be much more ambiguous, as they'd include periods of abnormally high frame rates (such as on menus) and abnormally low rates (such as during transitions between loading screens). One of the issues I have with the in-game benchmark is that it seems to start too soon, skewing the results.


----------



## blackhole2013

I don't get it. I put ultra textures on with my 3GB 780 and I see no slowdowns. Why is that?


----------



## BradleyW

Not sure about everyone here, but I think this game looks a bit dated in some aspects.


----------



## JSTe

Quote:


> Originally Posted by *BradleyW*
> 
> Not sure about everyone here, but I think this game looks a bit dated in some aspects.


I don't really think so, just generic in all aspects.


----------



## TopicClocker

Quote:


> Originally Posted by *Silent Scone*
> 
> It doesn't work ok for you though does it. Because the texure pack uses over 4gb VRAM. also when was the last time anyone played a game for LESS THAN A MINUTE.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One doesn't simply walk into more VRAM.


Ugh this is stressful, some say that 4GB is fine, some say it's not...
I game at 1080p so I'd think 4GB should be pretty safe.

Debating whether to buy a GTX 970 or a PS4. I damn well want the GTX 970, but the 4GB thing is getting to me. (Exclusives are why I want the PS4.)

Theoretically against the next gen consoles 4GB should be fine, and 6-8GB is more than enough.


----------



## hurleyef

Quote:


> Originally Posted by *JSTe*
> 
> I don't really think so, just generic in all aspects.


There's only so much you can do with Mordor... That said, I think they did an absolutely fantastic job with the orcs. And since this is, at its heart, a game about killing orcs, I think that should count for quite a bit.


----------



## cstkl1

Quote:


> Originally Posted by *hurleyef*
> 
> My evidence would seem to indicate that it doesn't use more that 4gb VRAM. I wonder what factors are causing this discrepancy, as many people here are clearly getting conflicting results.
> The sample was chosen to be graphically intensive, not to be representative of a normal play session. Were I to run a benchmark of a normal session, the results be much more ambiguous as they'd include periods of abnormally high frame rates such as on menus and abnormally low rates as in during transitions between loading screens etc. One of the issues that I have with the in game benchmark is that it seems to start too soon, skewing the results.
> Perhaps one could simply download more?


Uploading now: a proper 1440p maxed-out Shadowplay recording with fps (yours is missing in the crucial areas), plus other GPU info on the RivaTuner OSD: VRAM, RAM, pagefile, clocks, etc.

"Disclaimer: although I disabled vsync, and G-Sync is supposedly disabled with it, my monitor's LED is still running red, which indicates G-Sync is still on."

You will notice that when it hits 6GB, my fps drops to 40. Shadowplay did take a hit on fps, but not much, because I suspect the second card is doing the recording work, based on the 2nd card running at 1111MHz instead of 3xxMHz when Shadowplay is disabled.

Wait for it to be processed.

I'll also share the original 2.5GB file for those who want to see the actual look of the ultra textures at 1440p.

Edit: noticed GeForce Experience has an optimized setting. It sets everything maxed out, but textures at high.
*Full Shadowplay video

http://1drv.ms/1pQbuzs*


----------



## Ramzinho

I've read about 10 pages and the opinions are so conflicting. Some people with a 290X say they get 60 fps on ultra @ 1440p; others say they can't get past high.

Someone shed some insight, as I'm lost in the wall of posts. Maybe @Alatar or @BradleyW can help.


----------



## cstkl1

Quote:


> Originally Posted by *Ramzinho*
> 
> read about 10 pages. the opinions are so conflicting. some people with 290X says they get 60FPS on ultra @ 1400p, others says they can't get past high.
> 
> Someone shed some insight. as i'm lost in the wall of posts. maybe @Alatar or @BradleyW can help


Download the video above and also watch the YouTube clip. That's how 1440p ultra looks, with proper fps gameplay, scenes, etc.


----------



## Onikage

I'm on a GTX 760. I can barely keep 60 fps on medium, and I also get stuttering with high textures. Not sure what I am doing wrong.


----------



## Arnotts

3840x2160 downsampled to 1920x1080,
medium preset, except textures set to ultra (with the HD texture pack downloaded and installed).
Average fps is in the mid-40s, but _it feels smooth_.
The game looks FANTASTIC when it's downsampled; the combination of the medium preset, high (or ultra) textures and 4K resolution makes the game look amazing and feel smooth.

Ultra textures @ 4K (everything else on the medium preset) use ~3700MB of VRAM.
High textures @ 4K (everything else on the medium preset) use ~2800MB of VRAM.

On a single 970 OC'd to 1446MHz on the core.


----------



## StrongForce

Quote:


> Originally Posted by *Zaid*
> 
> Did the benchmark at maxed out graphics (ultra textures) on 1080p tv with SSAA @ 150% and i get average of 55fps on the benchmark. with about 3030mb of Vram usage reported by msi afterburner.
> 
> CPU @ 4.9 GPU @ 1220/1630.


Quote:


> Originally Posted by *JSTe*
> 
> I don't really think so, just generic in all aspects.


Care to explain your disappointment??


----------



## BradleyW

Quote:


> Originally Posted by *Ramzinho*
> 
> read about 10 pages. the opinions are so conflicting. some people with 290X says they get 60FPS on ultra @ 1400p, others says they can't get past high.
> 
> Someone shed some insight. as i'm lost in the wall of posts. maybe @Alatar or @BradleyW can help


Run the benchmark at 200% scaling.

Ultra Settings,
No motion blur,
HT disabled ,
High textures,

Report your lowest fps. (Use an fps reader like FRAPS and eyeball your lowest fps; the minimum fps result reported by the benchmark itself is broken.)

My lowest was 27fps.

150 scaling min fps was 47fps.

100 scaling min fps 57fps. (CFX = 101fps)

Hope this helps.
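For reference, those scaling percentages multiply the render resolution before the image is scaled back down to your display. A quick sketch of the pixel math, assuming the slider scales each axis independently (consistent with 1080p at 200% being roughly 4K); the function name is just illustrative:

```python
def scaled_resolution(width, height, scale_pct):
    """Effective render resolution for a resolution-scale slider.

    Assumes the percentage applies to each axis independently,
    so 200% renders 4x the pixels; check how your particular
    game implements its slider.
    """
    w = round(width * scale_pct / 100)
    h = round(height * scale_pct / 100)
    return w, h, (w * h) / (width * height)

# 1080p at the three scales used in the benchmark runs above
for pct in (100, 150, 200):
    w, h, ratio = scaled_resolution(1920, 1080, pct)
    print(f"{pct}%: {w}x{h} ({ratio:.2f}x the pixels)")
```

Under that assumption, 200% scaling pushes four times the pixels of native 1080p, which is why the minimum fps falls off so steeply between the 100% and 200% runs.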


----------



## Dasboogieman

Quote:


> Originally Posted by *BradleyW*
> 
> Run the benchmark at 200% scaling.
> 
> Ultra Settings,
> No motion blur,
> HT disabled ,
> High textures,
> 
> Report your lowest fps. (Use an fps reader like fraps and eyeball your lowest fps. Lowest fps result in the benchmark is broken).
> 
> My lowest was 30fps.


I reran the game with ultra textures; you get epic stuttering and screen artifacting with Crossfire 290s at 1080p. I seriously doubt 4GB is sufficient at ultra.


----------



## BradleyW

Quote:


> Originally Posted by *Dasboogieman*
> 
> I reran the game with ultra textures, you get epic stuttering and screen artifacting with crossfire 290s at 1080p. I seriously doubt 4gb is sufficient at ultra.


Maybe.
I just used high textures, and the benchmark appears to be stutter- and flicker-free.


----------



## Zipperly

Quote:


> Originally Posted by *Onikage*
> 
> Im on a gtx 760 i can barely keep 60 FPS on medium also i get stutering with high textures not sure what am i doing wrong


High textures use up more than 2GB; 3GB is where you need to be for smooth play on high. Turn the textures down to medium and you should be fine.


----------



## cstkl1

Quote:


> Originally Posted by *BradleyW*
> 
> Maybe.
> I just used High textures and the benchmark appears to be stutter and flicker free.


This is what I notice at 1440p: when it hits 6GB, my fps drops to the 30s for a second, and then the VRAM usage drops after a while. It happens especially when you run across one particular area. Tested on a 780 Ti: it just tanked and never recovered. It also stutters when there are a lot of enemies (10 or more), especially if you do a lot of finishing moves, etc.

So what's with a lot of people's BS? Heck, even Nvidia's recommended setting for the Titan Black at 1440p is high, not ultra.


----------



## BradleyW

Quote:


> Originally Posted by *cstkl1*
> 
> This is what i notice on 1440p. When it hits 6gb my fps will drop to 30s for a sec n then my vram will drop after a while It happens especially when u run across one area. Tested on 780ti. It just tanked n nvr recovered. It also stutters when a lot of enemies aka 10 or more especially if u do a lot of finishing moves etc.


Disabling HT could help you with the stutter issues. I think I will leave the textures on High.


----------



## Zipperly

I have found the perfect balance for me. I play at 2560x1440 downsampled to 1920x1080 + FXAA. Now, I normally do not like FXAA, but it works very well in combination with downsampling, and you don't get the blur you would normally get when just using your typical native resolution. I'm running everything on High except for *mesh and vegetation at Ultra*. With most of the other settings it is very difficult to tell the difference between ultra and high, yet there is a huge fps penalty that comes with running things like ambient occlusion or shadows on ultra, for example.

With the above settings I get a silky-smooth, non-stuttering 60 fps the entire time; only once, during a very big orc fight, did I see my frame rate drop to 58 fps.


----------



## Silent Scone

Like I said

Memory is memory. But apparently other people don't get the stuttering associated with it.

https://www.youtube.com/watch?v=Poe2h2TplYk&feature=youtu.be


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> I have found the perfect balance for me.. I play at 2560X1440 downsampled to 1920X1080+FXAA. No I normally do not like fxaa but it works very well in combination with downsampling and you dont get the blur you would normally get when you are just using your typical native resolution. Im running everything on High except for *"mesh and vegetation at Ultra"*. With most of the other settings it is very difficult to tell the difference between ultra and high yet there is a huge fps penalty that comes along with running things like ambient occlusion or shadows on ultra for example.
> 
> With the above settings this nets me a silky smooth non stuttering 60fps the entire time and only once during a very huge orc fight did I see my frame rate drop to 58fps.


I might have to try that combination just for fun. See what kind of fps I get.


----------



## cstkl1

Quote:


> Originally Posted by *BradleyW*
> 
> Disabling HT could help you with the stutter issues. I think I will leave the textures on High.


So you're telling me HT affects a 780 Ti when it doesn't on a Titan Black, even though the only difference between the two is the VRAM?

Disabling HT when the CPU vcore for the same multiplier is the same (HT doesn't reduce vcore) ain't an option, dude.

Btw, watch the video. I believe that's the best one can hope for on a single GPU.


----------



## Zipperly

Quote:


> Originally Posted by *BradleyW*
> 
> I might have to try that combination just for fun. See what kind of fps I get.


Keep in mind that is vsynced.

If I turn off vsync, I can run the same settings I mentioned, but at 4K resolution, and hold around 40 fps. I might have to try some adaptive vsync.


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> Keep in mind that is vsynced.
> 
> 
> 
> 
> 
> 
> 
> If i turn off vsync I can run the same settings that i mentioned but at 4k resolution and hold around 40fps. I might have to try some adaptive vsync.


Is the benchmark more demanding than the actual game? I ran your settings and dropped to 50 fps on a single 290X in the benchmark. Do you have the 3 options enabled below the graphics options?


----------



## Zipperly

Quote:


> Originally Posted by *BradleyW*
> 
> Is the benchmark more demanding than the actual game?


I have not even tried the benchmark but I have about 7hrs into the game.


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> I have not even tried the benchmark but I have about 7hrs into the game.


I see.
If you do run the bench, use an fps reader. The reader in the bench is broken; it reports silly min fps as low as 7 and highs up to 10000.


----------



## Zipperly

With adaptive vsync on I still get a little tearing: not as bad as with no vsync, but not as good as full vsync either. I think I will stick to full vsync; I don't like any amount of screen tear.


----------



## Zipperly

Quote:


> Originally Posted by *BradleyW*
> 
> I see.
> If you do run the bench, use an fps reader. The reader in the bench is broken. It reports silly min fps as low as 7 and highs up to 10000


Okay, I might play with it later. Right now I'm just hooked on this game and utterly amazed at how well my 780 handles it.


----------



## FissioN2222

Just finished the game, and I must say I think they did a great job on the port. The amount of options they put in is great, the game ran very smoothly, and it still looked very good at medium textures. I look at the ultra textures as something extra they put in for PC; they very well could have left them out and no one would have known or noticed.


----------



## GetToTheChopaa

Quote:


> Originally Posted by *Zipperly*
> 
> Im running everything on High except for *"mesh and vegetation at Ultra"*. With most of the other settings it is very difficult to tell the difference between ultra and high yet there is a huge fps penalty that comes along with running things like ambient occlusion or shadows on ultra for example.
> 
> With the above settings this nets me a silky smooth non stuttering 60fps the entire time and only once during a very huge orc fight did I see my frame rate drop to 58fps.


Same settings for me as well! I'm @1440p though, but still get 60+, no stuttering; I'm pleasantly surprised how well the game runs! Not gonna get into all this VRAM nonsense talk, but the game looks really, really good at high; can't tell the difference, really (actually, the only difference I noticed was the low fps I was getting). Tried ultra settings on my 780, but it stuttered, and 30 fps and lower when fighting the Uruks just doesn't do it for me anymore (yes, I have become some sort of snob, I guess...).


----------



## Zipperly

Quote:


> Originally Posted by *GetToTheChopaa*
> 
> Same settings for me as well! I'm @1440 though, but still get 60+, no stuttering, I'm pleasantly surprised how well the game runs! Not gonna get into all this VRM nonsense talk, but the game looks really, really good at high, can't tell the difference, really (actually, the only difference I noticed was the low fps I was getting). Tried ultra settings on my 780, but it stuttered and 30fps and lower when fighting the Uruks just doesn't do it for me anymore (yes, I have become some sort of snob, I guess...


Yep, theoretically I'm running the same resolution as you, just downsampled to 1080p, though that is just as demanding as native 1440p. I'm glad you see benefits using the same combination of settings.


----------



## Onikage

Where is this game wasting all the VRAM? I mean, seriously: 2 gigs for medium textures at 1080p? Assassin's Creed 4 looks better with 10 times more stuff on the screen and much more foliage, and uses at most 1.5GB on max settings with SMAA.


----------



## Zipperly

Quote:


> Originally Posted by *Onikage*
> 
> Where is this game wasting all the VRAM? i mean seriously 2 gigs for medium textures on 1080p Assassins creed 4 looks better with 10 times more stuff on the screen and much more foliage and uses max 1.5 gb on max setting with smaa


Please don't compare that mess of a game to SOM, and just because it's more "colorful" doesn't make it better looking. SOM, from a technological standpoint, has much more going on and as a whole looks better than Assassin's Creed 4.


----------



## cstkl1

Quote:


> Originally Posted by *Onikage*
> 
> Where is this game wasting all the VRAM? i mean seriously 2 gigs for medium textures on 1080p Assassins creed 4 looks better with 10 times more stuff on the screen and much more foliage and uses max 1.5 gb on max setting with smaa


Yeah, an awesome stutterfest on SLI. Yippee. While with this game, one card suffices.

So far this game is playable on current drivers in both camps.

Ubi needs to learn. But anyway, who cares about them? Boycotting them for life.

I hope WB builds more lore on Middle-earth. I'd love to play a dark elf versus that huge elephant, etc.


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> Please dont compare that mess of a game to SOM and just because its more "colorful" doesn't make it better looking. SOM from a technological stand point has much more going on and as a whole looks better than Assassins creed 4.


Although AC IV is poorly coded, it performed far better than the other ACs, and AC IV certainly looks over 9000 times better than Mordor.

For those trying to get CFX working, be careful with AFR. I've noticed AFR removes many in-game effects but still renders the calculations! Using the Tomb Raider profile offers the best fps and allows all effects to be shown! So, in other words, better graphics and higher fps.


----------



## Onikage

Quote:


> Originally Posted by *Zipperly*
> 
> Please dont compare that mess of a game to SOM and just because its more "colorful" doesn't make it better looking. SOM from a technological stand point has much more going on and as a whole looks better than Assassins creed 4.


I agree AC4 isn't very well optimized, but what exactly is so much more advanced in SOM? AC4 has dynamic shadows, lighting, screen-space reflections, tessellation, probably the most realistic water physics ever, and all the Nvidia features like HBAO+, TXAA and PCSS. Also compare the amount of foliage in AC4 to SOM: it has a couple of bushes here and there, but nowhere near as much. I'm just saying the game doesn't justify using so much VRAM; hell, even Crysis 3 doesn't go over 2GB at 1440p with everything maxed except AA.


----------



## Zipperly

It's easy to think that AC4 looks better when the entire game is in a more lush, vibrant, jungle type of setting, which can give you a false sense that it looks better. SOM's entire world is dull by comparison, but looking beyond that, it's easy to see that SOM as a whole is better looking than AC4, and yes, on a technical level to boot. That's my 2 cents; we can agree to disagree.


----------



## cstkl1

Quote:


> Originally Posted by *Onikage*
> 
> I agree ac4 inst very well optimized but what exactly is so much more advanced in SOM? AC4 has dynamic shadows,ligthing,screen space reflections,tesselation, probably the most realistic water physics ever all Nvidia features like hbao+,txaa,PCSS also compare the amount of foliage in AC4 to SOM im just saying game dosnt justify using so much VRAM


You don't have texture pop-ins like in AC4, Watch Dogs, Wolfenstein, etc.


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> Its easy to think that AC4 looks better when the entire game is in a more lush vibrant/jungle type of setting which can give you a false sense that it looks better....... SOM's entire world is dull by comparison but looking beyond that its easy to see that SOM as "a whole" is better looking than AC4 and yes on a technical level to boot. Thats my 2 cents, *we can agree to disagree*.


That's the spirit!








Quote:


> Originally Posted by *cstkl1*
> 
> You dont have texture popins like ac4, watchdog, wolfy etc.


That's an interesting point; I never got any texture pop-in on those 3 games. Not sure what I did differently, or......???
I had the odd car pop in on Watch Dogs, but it was rare.


----------



## cstkl1

Quote:


> Originally Posted by *BradleyW*
> 
> That's the spirit!
> 
> 
> 
> 
> 
> 
> 
> 
> That's an interesting point, I never got any pop in textures on those 3 games. Not sure what I did different or......???
> I had the odd car pop in on watch dogs but it was rare.


It's massive on Watch Dogs (more on the object pop-in side), Wolfenstein had texture pop-in when moving around, and AC4 had both. Now I am wondering what year you're living in: definitely the distant future, where the name 290 is reused. Because as of 2014, pop-in on all three games mentioned is massive.

J/k. It happens on both rigs, and even on 780 Strix SLI on a totally different setup. My last ATI card was a 7970, so I've skipped their last two series until they have a proper card that slam-dunks on tessellation. Resolutions played were 2560x1080 and 3840x1440, on different settings to get 60 fps as an average playable framerate.


----------



## Onikage

Quote:


> Originally Posted by *cstkl1*
> 
> You dont have texture popins like ac4, watchdog, wolfy etc.


Oh, whatever.
Quote:


> Originally Posted by *Zipperly*
> 
> Its easy to think that AC4 looks better when the entire game is in a more lush vibrant/jungle type of setting which can give you a false sense that it looks better....... SOM's entire world is dull by comparison but looking beyond that its easy to see that SOM as "a whole" is better looking than AC4 and yes on a technical level to boot. Thats my 2 cents, we can agree to disagree.


I understand, but you still haven't explained what exactly is so much more advanced tech-wise in SOM; nearly every graphical feature SOM has can be found in AC4. If you could elaborate a little bit... And don't get me wrong, I'm not defending Ubisoft; AC4 had its issues, but the visuals were superb.


----------



## Ramzinho

From what I read here and everywhere else, the game seems to be very good looking at 1080p with new GPUs, or GPUs with VRAM headroom, but I don't know how it is for you guys at 1440p. And regarding Crossfire, I'm sure the next Nvidia driver and 14.10 will address such a good game. I seriously hope devs keep optimizing games like they did with this one; it might not be 100% optimized, but it's a glimpse of hope.


----------



## BradleyW

Quote:


> Originally Posted by *cstkl1*
> 
> Its massive on watchdog more on object popin n wolfy had texture popin when moving around its n ac4 had both ..Now i am wondering what year ure living in. Def in the way distant future where the name 290 is reused. Cause as of 2014. Popins on all three games mentioned is massive.
> 
> J/k. Happens on both rigs n even on a 780 strix sli on a totally different setup. Last ati card was a 7970. So skipped their 2 series until they have a proper card that slam dunks on tesselation. Resolution played was 2560x1080, 3840x1440 on different settings to get 60fps as avg playable framerate.


Yes I'm from the future. In the future we have the AMD 390X. I bet you anything it comes true.


----------



## GetToTheChopaa

Quote:


> Originally Posted by *Ramzinho*
> 
> From what i read here and everywhere else. game seems to be Very good looking at 1080 with New GPUs. or GPUS with headroom of VRAM. but i don't know how it is for you guys on 1440p. and regarding crossfire i'm sure that next Nvidia driver and 14.10 will 100% address such a good game.. I seriously hope devs keep optimizing games like they did on this one. it might not be 100% optimized but it's a glimpse of hope..


Hey mate!
The game runs and looks awesome on a [email protected]: high settings + ultra meshes and vegetation (I added the ultra options since they didn't impact performance too much, but the game looks really good at high settings), no vsync, 60+ all the time, at least when I cared enough to check the fps... Remarkably well optimized!


----------



## Flames21891

Quote:


> Originally Posted by *Zipperly*
> 
> High textures use up more than 2gb's, 3gb is where you need to be for smooth play at high textures. Turn the textures down to medium and you should be fine.


Ehh, sorta.

I've been running it on High with 2GB 680's without a problem. I'm willing to bet a decent chunk of VRAM usage is just caching as usual, and based on my experience I would say it's feasible to get by on 2GB if your card is powerful enough.


----------



## cstkl1

Quote:


> Originally Posted by *BradleyW*
> 
> Yes I'm from the future. In the future we have the AMD 390X. I bet you anything it comes true.


Hope so, future man.

Because Nvidia said the future was now, and AMD said the future was soon and launched the AMD 285X.

Hope I get to live in that future of yours...

OK, found out something, lol. In this game G-Sync works without vsync, but it will hang after like 2 hours of gaming. I enabled vsync and the hanging stopped, so I'm wondering why the former worked at all?? Btw, this hanging actually made the whole OS go into slow mode.


----------



## CyberWolf575

Quote:


> Originally Posted by *Onikage*
> 
> Oh whatewer
> I understand but you still havent explaind what exactly is so much more advanced tech wise in SOM nerly all graphical feature SOM has can be found ac4 if you could elaborate a little bit and dont get get me wrong im not defending ubisoft AC4 had its issues but the visuals were superb.


I don't know why everyone says that SOM looks so much better. Honestly, I have the game on Ultra + 150% and I still can't say the ground/rock textures look good... they honestly don't.


----------



## The Source

Quote:


> Originally Posted by *Ramzinho*
> 
> From what i read here and everywhere else. game seems to be Very good looking at 1080 with New GPUs. or GPUS with headroom of VRAM. but i don't know how it is for you guys on 1440p. and regarding crossfire i'm sure that next Nvidia driver and 14.10 will 100% address such a good game.. I seriously hope devs keep optimizing games like they did on this one. it might not be 100% optimized but it's a glimpse of hope..


I don't see drivers from Nvidia helping much at all. It's pretty well optimized as it is. They'll just take the shortcut route and use the FEAR 3 SLI profile that the community has already made note of, and not change much of anything. They do this often, which would explain why they haven't addressed it yet at all. It is an Nvidia title, so what are we supposed to think here? Games coming out later this month have already been addressed.

Running this now on a fresh Windows install on RAID 0 SSDs seems to help a bit.


----------



## Silent Scone

Quote:


> Originally Posted by *The Source*
> 
> I don't see drivers from nvidia helping much at all. It's pretty well optimized as it is. They'll just take the short cut route and use the FEAR 3 SLI profile that the community has already made note of, and not change much of anything for it. They do this often. that would explain why they haven't addressed it yet at all. It is an nvidia title so what are we supposed to think here? Games coming out later this month have already been addressed.
> 
> Running this now on fresh win install on raid 0 ssds seems to help a bit.


It is a pretty poor show, but it's not really an 'Nvidia title'. They've just paid for the advertising. There aren't any GameWorks effects, PhysX or similar.

Even so, that makes them even more obviously aware of the game. They've dropped the ball like this a few times recently.


----------



## Zaid

Quote:


> Originally Posted by *Silent Scone*
> 
> *It is pretty poor show, but it's not really an 'Nvidia title'. They've just paid for the advertisement. There isn't any GW effects, Physx or similar*.
> 
> Even so, that makes them even more obviously aware of the game. They've dropped the ball like this a few times recently.


When I saw the Nvidia logo I immediately thought the game would run like horse **** on AMD, similar to how Batman was. I am glad to say this is not the case; the game has not been touched by the green hand of Nvidia (get it?).


----------



## Silent Scone

Quote:


> Originally Posted by *Zaid*
> 
> when i saw the nvidia logo i immediately though the game would run like horse **** on AMD, similiar to how batman was. i am glad to say this is not the case, game has not been touched by the green hand of nvidia(get it?).


That's a pretty naive way of looking at it if I'm honest. GameWorks is just a set of sub libraries and effects for developers to work with. It's up to the developers how these effects are used. In the case of Ubisoft for example, poorly. Shadow Of Mordor is just an endorsement, nothing more.


----------



## Zaid

Quote:


> Originally Posted by *Silent Scone*
> 
> That's a pretty naive way of looking at it if I'm honest. GameWorks is just a set of sub libraries and effects for developers to work with. It's up to the developers how these effects are used. In the case of Ubisoft for example, poorly. Shadow Of Mordor is just an endorsement, nothing more.


So blame the developers for poor AMD optimization in GameWorks-sponsored games?

Nvidia loves making things so awful for AMD that users will be forced to switch over to Nvidia; it's not as common now as it was a few years ago. This reminds me of the PhysX driver situation.

I had an old Nvidia card running as a PhysX card and an AMD card as the display adapter. Nvidia then updated their drivers to disable accelerated PhysX if an AMD GPU was present, and I had to get modified Nvidia drivers for it to work. It was a deliberate attack on people with AMD cards who wanted to use Nvidia as a secondary card. This isn't the same as "well, then AMD should have their own PhysX if they want their users to have PhysX"; I paid for the Nvidia card, which means I'm entitled to everything it comes with. It's like owning a Ford, going out and buying a Mazda, and then Ford comes along and says they're taking the car back until you get rid of that Mazda.


----------



## paulerxx

This game is not poorly optimized... I have good graphics with great performance, and that equals a good game to me. Not every game is meant to be maxed out at launch.


----------



## Gorea

This game is actually well optimized and nearly perfect (from a technical/performance point of view) relative to the vast majority of video games. The frame rate is very consistent and rarely jumps all over the place.

Just because a GTX 760 or an R7 260X doesn't max out a game at 60+ fps at 4K does not mean it is "unoptimized".


----------



## The Source

Quote:


> Originally Posted by *Gorea*
> 
> This game is actually very optimized and nearly perfect (technical/performance point of view) relative to the vast majority of video games. The frame rate is very consistent and rarely jumps all over the place.
> 
> Just because a GTX 760 or a R7 260x doesn't max out a game 60+ fps at 4k does not mean it is "unoptimized".


It's mostly us 780/Ti owners who are complaining the most.

When the highest end of last gen (minus Titan owners) can't max a game, threads like this happen. Especially when the only real-world difference in performance, save 5-10%, is a bit more VRAM.

If the benchmark tool actually represented the gameplay properly, we might have had a new Crysis, benchmark-wise.


----------



## swiftypoison

According to Afterburner, SoM uses all of my GTX 770 Classy's 4GB. After about 30 minutes of gameplay, it crashes with a low-memory error. I have 8GB of RAM.


----------



## Zipperly

Quote:


> Originally Posted by *swiftypoison*
> 
> According to afterburner, SoM uses all of my GTX 770 Classy's 4GB. After about 30minutes of game play, it crashes with a low memory error, I have 8GB ram


What res?


----------



## swiftypoison

Quote:


> Originally Posted by *Zipperly*
> 
> What res?


1080p


----------



## iSlayer

I kinda feel like I'm drowning in green in this thread, what's the good word on performance in this game from the red team?


----------



## DIYDeath

Quote:


> Originally Posted by *iSlayer*
> 
> I kinda feel like I'm drowning in green in this thread, what's the good word on performance in this game from the red team?


I haven't actually seen any reports from the red team either. I'd be interested to know how their cards are handling the game too.


----------



## Dasboogieman

Quote:


> Originally Posted by *DIYDeath*
> 
> I havent actually seen any reports from Red either. I'd be interested to know how their cards are handling the game too.


A single 290 handles the game well at around 70-80 in-game FPS on high; you get weird flickering and stutters when you run on ultra.
As for Crossfire, forcing AFR only works in the benchmark; you get artifacts all over the place in-game. Ironically, the FEAR 3 profile works great for us as well with good scaling, but we get these really annoying menu flickers that sometimes make it impossible to change graphics settings when Crossfire is used.


----------



## DIYDeath

Quote:


> Originally Posted by *Dasboogieman*
> 
> A single 290 handles the game well at around 70-80 in game FPS on high, you get weird flickering and stutters when you run on ultra.
> As for Crossfire, forcing AFR only works in the benchmark, you get artifacts all over the place in game. Ironically, the FEAR3 profile works great for us as well with good scaling, however we get these really annoying menu flickers that make it impossible to change graphic settings sometimes when crossfire is used.


That all sounds fixable which is pretty cool. Glad to see the launch was a success for most people.


----------



## swiftypoison

After some testing, it seems the crash was related to the settings being on Ultra and the game trying to use more VRAM than the 4GB available on my GTX 770 Classy. Setting the game to very high uses around 3,900MB at 60+ fps 99.9% of the time.


----------



## Zaid

Quote:


> Originally Posted by *DIYDeath*
> 
> I havent actually seen any reports from Red either. I'd be interested to know how their cards are handling the game too.


There are multiple people with AMD cards playing this and posting benchmark results; I am one of them.

My overclocked 290X, with everything maxed out at ultra + 150% SSAA @ 1080p, gets 60 fps unless I'm looking at a lot of buildings/NPCs, where it's about 50 fps. The game feels great; the only issue is camera movement indoors, which is already a known issue.


----------



## Exilon

This game looks awesome with 150% SSAA + FXAA/MLAA.


----------



## The Source

Quote:


> Originally Posted by *Zaid*
> 
> there are multiple ppl with amd cards playing this n posting benchmark results, i am one of them.
> 
> my overclocked 290x everything maxed out with ultra + 150% SSAA @ 1080p and i get 60 uless im looking at alot of buildings/npc then its about 50 fps. game feels great, only issue is camera movemnt indoors which is allrdy a known issue.


It's already been stated, many times, that the benchmark does not represent gameplay performance.


----------



## Zaid

Quote:


> Originally Posted by *The Source*
> 
> It's already been stated, many times, that the benchmark does not represent gameplay performance.


That is true; that is why I posted my gameplay numbers.


----------



## The Source

Quote:


> Originally Posted by *Zaid*
> 
> that is true, that is why i posted my gameplay numbers.


Combat is where the performance drops occur. I am happy it is working well for you. I need some of that magic sauce.


----------



## Murlocke

Quote:


> Originally Posted by *Zaid*
> 
> there are multiple ppl with amd cards playing this n posting benchmark results, i am one of them.
> 
> my overclocked 290x everything maxed out with ultra + 150% SSAA @ 1080p and i get 60 uless im looking at alot of buildings/npc then its about 50 fps. game feels great, only issue is camera movemnt indoors which is allrdy a known issue.


Hard time believing you aren't exaggerating here. I'd believe it if you were at 100% resolution. If you aren't exaggerating, then you probably didn't correctly install the ultra textures. I tested this game on a 980 and a 780 Ti; they both get about 40-55 FPS at 3440x1440 with everything maxed but high textures and ambient occlusion. Ultra textures tended to kill the 780 Ti due to VRAM, while the 980 managed, but only at about 30-35 FPS during fights.

150% of 1080p isn't that far off of 3440x1440, so I'm very skeptical of your performance. I think a lot of people are judging too quickly on what's playable; Act 2 is easily 2x harder to run due to the environment, and the tutorial is pretty easy to run.
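As a sanity check on that comparison, the pixel counts can be worked out directly. This is just a sketch; it assumes the game's "150%" setting scales each axis by 1.5, which is the usual render-scale convention:

```python
# Compare total rendered pixels: 1080p at 150% render scale vs. native 3440x1440.
# Assumes the percentage scales each axis (so 150% of 1920x1080 renders 2880x1620).

def pixels(width, height, scale=1.0):
    """Total pixels rendered at a given per-axis scale factor."""
    return int(width * scale) * int(height * scale)

native_1080p = pixels(1920, 1080)        # 2,073,600
scaled_150 = pixels(1920, 1080, 1.5)     # 2880 x 1620 = 4,665,600
ultrawide = pixels(3440, 1440)           # 4,953,600

print(scaled_150 / ultrawide)            # ~0.94 -- the two loads really are close
```

Under that assumption, 150% of 1080p renders about 94% as many pixels as 3440x1440, which is why the two setups land in a similar performance bracket.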


----------



## cstkl1

The best way for people to show they are not lying is to post a YouTube vid and share the actual video. So far I have. I am interested in the 290X; I want to see whether AMD did a tessellation tweak in their profile.


----------



## Zaid

Quote:


> Originally Posted by *The Source*
> 
> Combat is where the performance drops occur. I am happy it is working well for you. I need some that magic sauce.


My magic sauce is overclocking. I honestly cannot play during the day because temps get too hot and I get GPU errors; I have to wait until night time with the window open to keep temps at a stable 65-70.
Quote:


> Originally Posted by *Murlocke*
> 
> Hard time believing you aren't exaggerating here. I'd believe it if you were at 100% resolution. If you aren't exaggerating, then you probably didn't correctly install ultra textures. I tested this game on a 980 and a 780Ti, they both get about 40-55FPS on 3440x1400 with everything maxed but high textures and ambient occlusion. Ultra textures tended to kill the 780Ti due to VRAM, while the 980 managed but only at about 30-35FPS during fights.
> 
> 150% of 1080p isn't that far off of 3440x1440, so i'm very skeptical of your performance. I think a lot of people are judging too quickly on whats playable, Act 2 is easily 2x harder to run due to the environment and the tutorial is pretty easy to run.


If I don't overclock my GPU, my minimum fps drops like crazy; during heavy fights I get choppy fps and my average drops to about the low 50s. I have to OC my 290X to 1220/1620 to get that FPS.


----------



## cstkl1

I should have done a video of that fight in the pit against 50-100; 6GB was maxed out and it wants more.

Also, hmm, is VRAM related to the pagefile? I find it odd that my pagefile usage is pretty high for this game.


----------



## TopicClocker

Quote:


> Originally Posted by *LancerVI*
> 
> Thanks. Not much of a youtuber. Submitter that is.


Wow, that vid is still 480p?
Something went wrong, lol.


----------



## Exilon

150% render target at 1080p high textures with driver FXAA results in a very clean looking game. 45-50 fps on a GTX 780, so performance is acceptable for a 3rd person hack-and-slash.


----------



## DIYDeath

Quote:


> Originally Posted by *cstkl1*
> 
> I shld have done a video on that fighting in the pit against 50-100. 6gb was maxed out n it wants more.
> 
> Also hmm is vram related to pagefile. Find it off that my pagefile is pretty high for this game.


I disabled my pagefile. Not sure if it helped or not.


----------



## cstkl1

Quote:


> Originally Posted by *DIYDeath*
> 
> I disabled my pagefile. Not sure if it helped or not.


So is mine. This reading is from kernel paged memory, I think, or the cache?


----------



## NrGx

I'm getting around 30 FPS average with Ultra settings enabled at 1440p with the specs in my signature - does that sound about right?


----------



## Murlocke

Quote:


> Originally Posted by *NrGx*
> 
> I'm getting around 30 FPS average with Ultra settings enabled at 1440p with my specs in signature - does that sounds about right?


Yes. Turn ambient occlusion down to high for a good boost with literally no visual difference. I still wouldn't recommend running ultra textures; the difference is pretty small but the requirement is big.


----------



## hollowtek

Alright, so I have everything except textures on ultra at 1080p...
i5 2500 @ 4.4
GTX 970 at 1492/4001

Getting 91 fps; it uses 3.885GB of VRAM.

Bumping it up to 1522 with everything set to ultra (ultra textures on), I'm still getting an average of 91 fps. Weird.

LOL, and a pitiful 31 fps at 4K ultra.


----------



## Chargeit

Quote:


> Originally Posted by *Exilon*
> 
> 150% render target at 1080p high textures with driver FXAA results in a very clean looking game. 45-50 fps on a GTX 780, so performance is acceptable for a 3rd person hack-and-slash.


You should try adaptive Vsync half refresh rate, if you can deal with 30 fps. I use a controller for 3rd person games and don't mind a steady 30 as long as it's smooth.


----------



## Murlocke

Quote:


> Originally Posted by *hollowtek*
> 
> Alright, so I have everything except textures on ultra 1080p..
> i5 2500 @ 4.4
> gtx 970 at 1492/4001
> 
> getting 91fps. uses 3.885gb vram
> 
> bumping it up to 1522, everything set on ultra (and ultra textures on).. still getting avg 91fps. weird.


At the very beginning of the game? Play a couple of hours. It could just be that the 970/980 eats up 1080p... it's a pretty low resolution for those cards. Best to downsample unless you have a 120+ Hz monitor.


----------



## hollowtek

Quote:


> Originally Posted by *Murlocke*
> 
> At the very beginning of the game? Play a couple hours. It could just be that the 970/980 eat up 1080p... it's a pretty low resolution for those cards... best to downsample unless you have a 120+ hz monitor.


That's not a bad idea. I bumped it up to 1440p and am getting ~65 fps. That's the perfect range for me.


----------



## cstkl1

http://www.gamespot.com/articles/does-shadow-of-mordors-crazy-ultra-hd-content-pack/1100-6422735/

Interesting. You can run ultra if you cap your fps on a GPU that is VRAM-limited. Will try this later with the 780 Ti.

I was doing that on my 2560x1080 before shifting to the Swift.
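The fps-cap trick can be illustrated with a minimal frame-limiter loop. This is only a sketch in Python, not how the game or driver actually implements its limiter, and the explanation (capping inserts idle time each frame, so the engine has fewer frames in flight and streams data less aggressively) is my reading of why it might ease VRAM pressure, not something the article states. `run_capped` and its parameters are hypothetical names:

```python
import time

def run_capped(render_frame, target_fps=60, frames=0):
    """Call render_frame repeatedly, sleeping so the loop runs at ~target_fps.

    frames > 0 limits the number of iterations (handy for testing);
    frames == 0 loops forever, like a real game loop would.
    """
    frame_time = 1.0 / target_fps
    deadline = time.perf_counter()
    count = 0
    while frames == 0 or count < frames:
        render_frame()
        count += 1
        deadline += frame_time
        sleep_for = deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)               # the idle time a capped frame gains
        else:
            deadline = time.perf_counter()      # fell behind; don't try to catch up
```

For example, `run_capped(lambda: None, target_fps=50, frames=10)` should take roughly 0.2 seconds regardless of how fast the render stub is.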


----------



## blackhole2013

Please tell me why my 3GB GTX 780 running 1300/6400 is able to run the game at an average of 90 fps with ultra textures (I downloaded the extra 3.7GB of them). I thought I needed 6GB of VRAM?


----------



## Zaid

Quote:


> Originally Posted by *blackhole2013*
> 
> Please tell me why my gtx 780 3gb running 1300/6400 is able to run game average 90 fps with ultra textures which I dloaded the extra 3.7gb of them on I thought I need 6 gb of vram ???


6GB is required for ultra textures @ 4K; ultra @ 1080p should be fine with 3-4GB.
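A rough way to see why 4K is so much hungrier: it renders four times the pixels of 1080p, so every full-resolution buffer (color, depth, G-buffers) grows by the same factor. This is a back-of-envelope sketch; texture memory itself doesn't scale with resolution, so this only explains part of the gap:

```python
# Pixel counts: 4K UHD renders 4x the pixels of 1080p, so each
# full-resolution render target is ~4x larger in VRAM.

res_1080p = 1920 * 1080     # 2,073,600 pixels
res_4k = 3840 * 2160        # 8,294,400 pixels

ratio = res_4k / res_1080p  # 4.0

# Example: a single 32-bit (4 bytes/pixel) render target
mb_1080p = res_1080p * 4 / 1024**2   # ~7.9 MB
mb_4k = res_4k * 4 / 1024**2         # ~31.6 MB
print(ratio, round(mb_1080p, 1), round(mb_4k, 1))
```

A modern renderer keeps many such buffers around at once, which is why the resolution jump alone can swallow hundreds of megabytes before textures are even counted.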


----------



## cstkl1

Quote:


> Originally Posted by *blackhole2013*
> 
> Please tell me why my gtx 780 3gb running 1300/6400 is able to run game average 90 fps with ultra textures which I dloaded the extra 3.7gb of them on I thought I need 6 gb of vram ???


Play the game, because my 780 Ti running 1384/8000 kills the benchmark, but it's not playable at 1440p.

The Titan Black at 1254 is smooth with good gameplay, though it's not perfect. Gonna try that fps capping.


----------



## Exilon

Quote:


> Originally Posted by *blackhole2013*
> 
> Please tell me why my gtx 780 3gb running 1300/6400 is able to run game average 90 fps with ultra textures which I dloaded the extra 3.7gb of them on I thought I need 6 gb of vram ???


Did you restart after setting the textures to ultra? If you didn't, you're still running high.

If you did, you just have a very high tolerance for stuttering.


----------



## Clockdisaster

My question is: to have 6GB of VRAM, do you have to have two Titans?
That's just insane.
I think this new age of video gaming, where graphics matter too much, is really lame.
Games aren't as catchy as they were 5 years ago, when the graphics weren't as good but the gameplay was way more fun.


----------



## Murlocke

I just beat it (and got 100% at the same time). Two sittings, 23 total hours, including the Season Pass extras. Great game. Meh ending.

I feel like after you beat it, all warchiefs/captains should have a death threat applied automatically, be level 20 automatically, and be able to go higher than that. It's pretty annoying having to death-threat every single guy before you kill him; it's really the only way to find epic runes. The game is pretty easy as soon as you are able to double combat-brand every few seconds; nothing can really stop a mass swarm of your orcs.
Quote:


> Originally Posted by *Clockdisaster*
> 
> My question is . To have 6gb Vram you have to have two titans?
> Thats just insane.
> I thinnk new the age of video gaming where graphics matters too much for is really lame.
> The games arent as catchy as it was 5 years ago when graphics werent as good, but the gameplay was way more fun


No. SLI doesn't "add" VRAM, and a single Titan is already 6GB. You don't really need 6GB of VRAM unless you are running 4K, but the game has high requirements in every department. A single 980 Ti should maintain 60+ FPS in this game at practically any resolution but 4K, whenever it is released.

If graphics don't matter much to you, then don't max the game. This game has great gameplay.


----------



## sgs2008

Sigh. 3GB isn't enough for 1440p; my two 780 Tis stutter like hell with ultra textures, but are fine on high. Is 4GB enough for ultra textures at 1440p?


----------



## The Source

Quote:


> Originally Posted by *sgs2008*
> 
> Sigh 3gb isnt enough for 1440p my 2 780tis stutter like hell with ultra textures fine on high. IS 4gb enough for ultra textures at 1440p?


That's a dangerous question, and one that I'm asking myself as well.

I need to keep telling myself to just wait and see what's coming out early next year card-wise, and what requirements this year's fall games are going to have. I haven't had these 780s very long, and taking that financial hit thanks to Nvidia releasing the 970 at the price they did really doesn't sit well. Yes, it's partially my own fault for not waiting, but I think I made an educated guess based on their release history. They just decided to throw a curveball that caught a lot of people off guard.

4GB might be enough, but for how long? I really don't like the idea of having to upgrade every six months.

As for this game? Some are reporting that it's enough, barely, and some are saying it's not. It's just one game, and it's easy enough to set aside for a future playthrough. We haven't even seen a patch or official driver support for it yet.


----------



## BradleyW

I keep getting random fps drops during small battles when CFX is enabled. Come on AMD, release some gaming ready beta drivers. There are so many games out this month!

Is anyone having a terrible time with the mouse movement?


----------



## chronicfx

Quote:


> Originally Posted by *BradleyW*
> 
> I keep getting random fps drops during small battles when CFX is enabled. Come on AMD, release some gaming ready beta drivers. There are so many games out this month!
> 
> Is anyone having a terrible time with the mouse movement?


Are you playing in trifire? Does this game run with trifire?


----------



## Zipperly

Quote:


> Originally Posted by *cstkl1*
> 
> http://www.gamespot.com/articles/does-shadow-of-mordors-crazy-ultra-hd-content-pack/1100-6422735/
> 
> Interesting. U can run ultra if u cap ure fps on gpu that is vram limited. Will try this later with the 780ti.
> 
> I was doing that on my 2560x1080 before shifting to swift.


I can run full ultra everything very well at 1080p with my frame rate capped to 60, which is where it would stay most of the time, but... play the game long enough and eventually it will hitch and skip around, as VRAM is still an issue even at just 1080p. I prefer to run the game at 1440p downsampled to 1080p with everything on high except for Mesh and Vegetation, which are set to ultra. Those settings still net me 60fps capped/vsynced, and it looks better than running the game at 1080p full ultra.

Edit: I absolutely despise the "draft" feature on these forums; every time I quote someone I end up having to go back and delete something where I had previously started a different discussion with someone else. I know you can discard the draft, but I usually forget to do that.


----------



## Zipperly

Quote:


> Originally Posted by *The Source*
> 
> That's a dangerous question that I'm asking myself as well.
> 
> I need to keep telling myself to just wait and see what's coming out early next year card wise, and what requirement this years fall games are going to need. I haven't had these 780's very long, and taking that financial hit thanks to nvidia releasing the 970 at the price they did, really doesn't sit well. Yes, it's partially my own fault for not waiting, but I think I made an educated guess based on their release history. They just decided to throw a curve ball that caught a lot of people off guard.
> 
> 4GB might be enough, but for how long? I really don't like the idea of having to upgrade every six months.
> As for this game? Some are reporting that its enough, barely, and some are saying it's not enough. It's just one game that's easy enough to just set aside for a future play through. We haven't even seen a patch or official driver support for it yet.


I would wait; it's what I have decided to do. I sincerely think the 4GB cards will have a shorter life than the 3GB 780s, mostly because when the 780s were released we didn't have the PS4 or Xbone, and downsampling wasn't implemented in the driver control panel the way it is with the new cards.

Now of course you don't have to use downsampling, so 4GB "should" be OK for straight-up 1920x1080 *"future games"* for some time to come, but once you go downsampling it's extremely hard, IMO, to do without it, and it does eat into the VRAM.


----------



## Zipperly

Quote:


> Originally Posted by *Murlocke*
> 
> Yes. Turn ambient occlusion to high for a good boost with literally no graphic difference. Still wouldn't recommend running ultra textures, the difference is pretty small but the requirement is big.


+1, and actually set everything to high except for mesh and vegetation (keep those two on ultra) and you can gain about 15-20fps while not noticing any actual difference in the way the game looks, but you will see a huge improvement in how well it runs.


----------



## chronicfx

Quote:


> Originally Posted by *BradleyW*
> 
> I keep getting random fps drops during small battles when CFX is enabled. Come on AMD, release some gaming ready beta drivers. There are so many games out this month!
> 
> Is anyone having a terrible time with the mouse movement?


BradleyW, I was under the impression that the term CrossFireX (CFX) was reserved for tri-fire and quad-fire, and that if you are running two cards it is called CrossFire (CF). That was my confusion. It seems they have changed it to mean all configurations.


----------



## BradleyW

Quote:


> Originally Posted by *chronicfx*
> 
> Bradleyw i was under the impression the term crossfirex (cfx) was reserved for tri and quadfire and if your are running two cards it is called crossfire (cf). That was my confusion. It seems they have changed it to mean all configurations.


CFX means crossfire with *2 or more* GPU's.


----------



## cstkl1

Quote:


> Originally Posted by *chronicfx*
> 
> Bradleyw i was under the impression the term crossfirex (cfx) was reserved for tri and quadfire and if your are running two cards it is called crossfire (cf). That was my confusion. It seems they have changed it to mean all configurations.


CF was just renamed to CFX.

Most probably because of that speaker brand.


----------



## funfordcobra

With V-Sync on and the fps cap set to 60, I can now run all ultra textures on a 3440x1440 screen with SLI 780 Tis with only 3GB. The only change I made was turning the fps cap from unlimited to 60. It doesn't make sense, but it worked, and I can tell a difference between ultra and high.


----------



## Zipperly

Quote:


> Originally Posted by *funfordcobra*
> 
> With vsync on and fps cap set to 60 I can run all ultra textures now on a 3440x1440 screen with SLI780 tis with only 3gb. The only change I made was turning the fps cap from unlimited to 60. Doesn't make sense but it worked and I can tell a difference between ultra and high.


You don't get any skipping or hitching? That is odd, because I limited my frame rate to 60 as well, and even at 1080p full Ultra I eventually run into VRAM issues.


----------



## HighTemplar

So it seems everyone claiming that the 6GB VRAM requirement was BS has been proven wrong. Several benchmarks have shown thus far that it was actually stated for a reason, and the game really DOES saturate VRAM to the point of causing unplayability.

Whether that is because of bad optimization (most likely) or another reason remains to be seen.


----------



## Baasha

Shadow of Mordor in 1440P @ 60FPS (download link in description for 60FPS version):


----------



## BradleyW

I can't seem to interrogate Orcs; I grab them and press SPACE, but I throw them instead. Also, indoors my fps drops, but massive Orc fights in the open don't seem to drop my fps. Very odd.


----------



## Chargeit

Quote:


> Originally Posted by *BradleyW*
> 
> I can't seem to interrogate Orcs. I grab them and press SPACE, but I throw them instead. Also indoors my fps drops, but massive Orc fights in the open does not seem to drop my fps. Very odd.


You have to hold it down. It took a while for me to realize that. The game doesn't do a great job of explaining some things; it shoots a bunch of information at you quickly while setting up the backdrop... wham bam thank you ma'am.


----------



## daviejams

So I turned on the PC yesterday, and the AMD Raptr program opened, wanting to optimize this game. I let it; it set the textures to ultra and lighting to low, while everything else stayed at the highest. Works pretty well; you do get slowdown occasionally, but apart from the odd moment it's fine.
3GB R9 280X.
This game is great, by the way; I've nearly finished it.


----------



## abirli

If I have the ultra options in the menu and Titan SLI, does that mean I get the full HD textures, or do I still have to download the HD texture pack?


----------



## hurleyef

Quote:


> Originally Posted by *abirli*
> 
> if i have the ultra options in menu and titan sli does that mean i get full hd textures or do i still have to download the hd texture pack?


You still have to download it. If you select ultra without the texture pack installed then it just uses the high textures.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Zipperly*
> 
> I can run full ultra everything very well at 1080P with my frame rate capped to 60 which is where it would stay most of the time but...... play the game long enough and eventually it will hitch and skip around as vram is still an issue even at just 1080P. I prefer to run the game at 1440P downsampled to 1080P with High everything except for Mesh and Vegetation set to ultra. Those settings still net me 60fps capped/vsynced and it looks better than running the game at 1080P full ultra.
> 
> Edit: I absolutely despise the "draft" feature on these forums, every time I quote someone I end up having to go back and delete something where I had previously started a different discussion with someone else. I know you can discard the draft but I usually forget to do that.


Still telling everyone he's having a VRAM issue at 1080p on his 780...

Yet somehow doesn't understand he's rendering the game at 1440p.

Rendering at 1440p is rendering at 1440p, even if downsampling to a 1080p monitor... one GTX 780 is not enough, not even with 1000GB of VRAM.

Those of us with actual 1440p monitors understand this...


----------



## Zipperly

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Still telling everyone he's having a vram issue at 1080p on his 780.....


I am by FAR not the only 3GB 780 owner in this thread who has noticed that with SOM, full Ultra textures at 1920x1080 can eventually lead to some pretty bad stuttering. And no, it isn't a raw power issue, as I consistently get 60fps at full Ultra at 1080p resolution until, as I said, I hit the VRAM limitation of my card, which causes a massive frame-rate dive and introduces some pretty severe stutter - a telltale sign of a VRAM problem.

Quote:


> Yet somehow doesn't understand he's rendering the game at 1440


In this case I was not downsampling; I was running the resolution scale at 100 percent, which is 1920x1080 resolution. I am well aware of what downsampling does, and as I mentioned before, I run the game fine at 2560x1440 downsampled to 1920x1080 with *"HIGH TEXTURES"*. Next time you try to make me look like an idiot, be sure to get the facts straight first.
Quote:


> Rendering at 1440p is rendering at 1440p even if downscaling on 1080p monitor....1 gtx 780 is not enough...not even with 1000 gb of vram


Wrong. One 3GB GTX 780 handles SOM just fine at 1440p resolution with high textures. I can even enable Ultra textures at 1440p and still get a solid 50+ fps until, as I previously mentioned, I run into VRAM issues, which obviously happens even faster at 1440p than at 1080p.

In both cases, if this were a raw power issue, I would never be getting such a good frame rate at any given point to begin with.


----------



## GetToTheChopaa

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Still telling everyone he's having a vram issue at 1080p on his 780....
> 
> Yet somehow doesn't understand he's rendering the game at 1440
> 
> Rendering at 1440p is rendering at 1440p even if downscaling on 1080p monitor....1 gtx 780 is not enough...not even with 1000 gb of vram
> 
> Those of us with actual 1440p monitors understand this....


Let me make sure I've got this right: are you saying that one 780 can't run this at 1440p high settings and maintain ~60 fps?

Because if you are, my 780 and my Qnix would like to have a word with you...


----------



## Zipperly

Quote:


> Originally Posted by *GetToTheChopaa*
> 
> Let me make sure I got it right! Are you saying that one 780 can't run this at 1440p high settings and maintain ~60 fps?
> 
> Because if you do, my 780 and my Qnix would like to have a word with you...


He also doesn't believe VRAM is ever an issue. First the discussion was about 2GB cards, which he defended over and over religiously, and we see how that has turned out. Now he denies that SOM is a VRAM beast even at native 1080p.

He has been wrong on every single point that he and I have discussed, and now that SOM is out he is even more bitter about it.


----------



## Zipperly

Just fired up SOM again. I'm playing 1440p downsampled to 1080p with Ultra textures and getting 60 fps with the occasional hitch/stutter. The only difference for me at 1080p is that the hitch/stutter with Ultra textures enabled is a little less frequent and takes a little longer to show up, but eventually it happens. With textures on High I never get any hitching or stuttering at 1440p downsampled to 1080p.


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> Just fired up SOM again, im playing 1440P downsampled to 1080P Ultra textures and getting 60fps with the occasional hitch/stutter. Only difference for me at 1080P is that the hitch/stutter with Ultra textures enabled is a little less frequent and takes a little longer to show up but eventually it happens. With textures on High I never get any amount of hitching or stuttering at 1440P downsampled to 1080P.


With your settings I've dipped down to 54 fps, but I mostly stay over 60. I just play maxed out at 1080p on a single GPU; CFX frame rates are too erratic right now.


----------



## Chargeit

VRAM keeps the 780 from holding a steady 60 in this game with Ultra textures; it doesn't matter how many 780s you have.

I was able to run Ultra textures at first, but there comes a time when the stuttering starts.

I cut back to High textures and there's no more stutter.


----------



## Zipperly

Quote:


> Originally Posted by *Chargeit*
> 
> Vram limits the 780 from keeping a steady 60 running this with ultra textures. It doesn't matter how many 780's you have.
> 
> I was able to run the ultra textures at first, but, there will come a time when you'll start stuttering.
> 
> I cut back to high textures and no more stutter.


Exactly.


----------



## GetToTheChopaa

I'd never tried the game at 1080p until a few minutes ago, just to see how it runs at Ultra. The benchmark averaged 72 fps, but in-game was a different story: VRAM hovered around the 3000MB mark (I kept an eye on it at all times), and whenever it hit 3072MB it would stutter. Not too often, but enough to distract and annoy me. High settings at 1440p and I'm happy with how the game looks and runs!
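
That 3072MB ceiling is easy to sanity-check with some napkin math. The sketch below is illustrative only: the texture resolutions, the 1-byte-per-texel figure (typical of BC3/BC7 block compression), and the resident-texture count are my assumptions, not Monolith's actual asset data.

```python
def texture_bytes(width, height, bytes_per_texel, mip_chain=True):
    """Approximate GPU memory for one texture.

    A full mip chain adds roughly 1/3 on top of the base level
    (1/4 + 1/16 + ... converges to 1/3).
    """
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mip_chain else base

# Block-compressed (BC3/BC7) textures store ~1 byte per texel.
ultra = texture_bytes(4096, 4096, 1)  # ~21.3 MiB per 4K texture
high = texture_bytes(2048, 2048, 1)   # ~5.3 MiB per 2K texture

# Assume a few hundred unique materials resident at once (made-up count):
for name, per_tex in (("ultra", ultra), ("high", high)):
    print(f"{name}: {300 * per_tex / 2**30:.2f} GiB")
```

With those made-up numbers, the Ultra set alone lands north of 6 GiB while the High set stays around 1.5 GiB, which at least matches the shape of what people in this thread are reporting on 3GB cards.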


----------



## salamachaa

Anyone else notice how terrible vsync is in this game? I shut it off and then tearing happened, even though my frame rate wasn't over 110 (overclocked Qnix). I had to override vsync in the drivers because every time I dropped even 1 fps under 60, it would cut the frame rate to 30. Other than that, from the hour I've played, the game seems to run fine for a third-person game at Ultra 1440p on my 970. Yeah, there are dips, but for a game like this a frame rate around 40-50 is acceptable to me.
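
The 60-to-30 drop is classic double-buffered vsync: without triple buffering, every frame waits for the next vertical blank, so the displayed rate snaps to integer divisors of the refresh rate. A minimal model of that behavior (the function and numbers are mine, not from any driver API):

```python
import math

def vsync_fps(render_fps, refresh_hz=60):
    """Displayed frame rate under double-buffered vsync.

    A frame that takes even slightly longer than one refresh interval
    occupies two of them, snapping the rate to refresh_hz / n.
    """
    frame_time = 1.0 / render_fps
    vblank_interval = 1.0 / refresh_hz
    intervals = math.ceil(frame_time / vblank_interval)
    return refresh_hz / intervals

print(vsync_fps(59))   # one frame over budget halves the rate: 30.0
print(vsync_fps(61))   # 60.0
print(vsync_fps(29))   # 20.0
```

Driver-side "adaptive" vsync sidesteps the snap by disabling sync (and allowing tearing) whenever the render rate falls below refresh, which may be why forcing vsync through the control panel behaved differently from the in-game toggle.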


----------



## NrGx

Anyone else think this game doesn't look as good as it should, given the high system requirements? I went back and played some modded Skyrim today and it looks better in some regards... still having heaps of fun with SoM at the moment, I must say.


----------



## Silent Scone

Quote:


> Originally Posted by *Zipperly*
> 
> And?
> No you dont say! How many times have I now told you that I am well aware of what the resolution slider does? You can stop talking to me as if I dont know what im talking about. I have only gone over the resolution slider a dozen or so times now and yet you are the one who is still hung up on it!
> I am running a single 780 at 1440P right now in SOM ultra textures and getting mostly 60fps, only time it dips under that is when I run into stutter from insufficient vram and how many other 780 owners have you seen that have backed me on this yet you continue to argue.
> 
> *Also.....Do not tell me what I can and cannot do with a card that I clearly own*. If you are saying otherwise then you either do not own the hardware you claim to own or are simply trolling. In fact I will let the others around here with a single 780 take care of you, you are getting to be beyond the point of being ridiculous now.


Some people have a higher tolerance for poor performance. You can't just magic more VRAM out of thin air; if 4GB isn't enough, then 3GB certainly isn't.

There isn't any magical fix. For a hardware-enthusiast community, there are an awful lot of people here who seem naive to the blindingly obvious...


----------



## salamachaa

Quote:


> Originally Posted by *NrGx*
> 
> Anyone else think this game doesn't look as good as it should given the high system requirements? I went back and played some Skyrim modded today and it looks better in some regards... still having heaps of fun with SoM at the moment must say.


I agree with you. There are lots of games that look better IMO and run much better too: AC4 (now, not at launch), BF4, Crysis 3, and Lichdom: Battlemage, just to name a few. The game is quite good, and it feels like a natural progression from the Batman games with some Assassin's Creed mixed in. Honestly, I bet something funky is going on in SOM, like the game rendering something that isn't on screen.


----------



## blackhole2013

Quote:


> Originally Posted by *Exilon*
> 
> Did you restart after setting to texture to ultra? If you didn't, you're still running high.
> 
> If you did, you just have a very high tolerance for stuttering.


It's what Zaid said, I guess. I only have an ASUS 1080p 120Hz monitor, and my 3GB is enough for Ultra textures. Not so for 1440p users; they need 6GB of VRAM...


----------



## Zipperly

Quote:


> Originally Posted by *salamachaa*
> 
> Anyone else notice how terrible vsync is in this game? I shut it off then tearing happened even though my frame rate wasn't over 110 (OCed Qnix). Had to override vsync in drivers because every time I would get 1 fps under 60 it would drop the frame rate to 30. Other than that, from the hour that I have played, the game seems to run fine for a 3rd person game at ultra 1440p on my 970. Yeah there are dips, but for a game like this, it's acceptable to run at a frame rate around 40-50 to me.


I don't get those issues with vsync. If I drop under 60 fps, I only lose a few frames.


----------



## salamachaa

Quote:


> Originally Posted by *blackhole2013*
> 
> Its what Zaid said I guess I only have a Asus 1080p 120hz monitor and my 3gb is enough for ultra textures not so for 1440p users they need 6gb vram ...


I have not experienced stutter with 4GB of VRAM at 1440p and Ultra (yes, I did restart). I don't get 60 fps, but the frame rate stays pretty consistently in the high 30s to mid 40s, which for a non-twitch third-person game is fine for me.


----------



## Zipperly

Quote:


> Originally Posted by *Silent Scone*
> 
> Some people have a higher tolerance for poor performance. You can't just magic more VRAM out of thin air. If 4GB isn't enough then 3GB certainly isn't.
> 
> There isn't any magical fix, for a hardware enthusiast community there isn't half a lot of people who seem naive to the blatantly-stinking obvious...


Im not disagreeing with any of that.


----------



## salamachaa

Quote:


> Originally Posted by *Zipperly*
> 
> I dont get those issues with vsync. If I drop under 60fps then it drops only a few frames.


Strange. I don't have the issue when overriding vsync in the Nvidia control panel, but using the in-game vsync simply locked the frame rate, rounding down to 120, 60, 30, or 15 fps. I had the same problem in AC4 when it first came out, but that was with an R9 290; now I'm using a GTX 970.


----------



## blackhole2013

Quote:


> Originally Posted by *Zipperly*
> 
> Im not disagreeing with any of that.


Wow, so my 3GB 780 averaging 90 fps with Ultra textures and no stuttering is poor performance to that guy? Huh. I still can't run Watch Dogs on Ultra textures though; when I do, it stutters like crazy.


----------



## salamachaa

Quote:


> Originally Posted by *blackhole2013*
> 
> Wow so my 780 3gb with ultra textures and average 90 fps is poor performance no stuttering is not good performance for that guy .. Huh I still cant run watchdogs on ultra textures now when I do it stutters like crazy


That's just Watch Dogs. It's a terrible, terrible, awful port.


----------



## GetToTheChopaa

Quote:


> Originally Posted by *Silent Scone*
> 
> Some people have a higher tolerance for poor performance. You can't just magic more VRAM out of thin air.


Haha! So true!

Quote:


> Originally Posted by *salamachaa*
> 
> I have not experienced stutter with 4gb of vram at 1440p and ultra (yes I did restart). I do not get 60 fps, but the frame rate does stay pretty constantly in the high 30s - mid 40s which for a 3rd person game non-twitch game is fine for me.


I also tried Ultra at 1440p; not only was it a 30-something fps slideshow, there was stutter on top of that... never mind that 30 fps is stutter!
At the end of the day it's just a matter of standards!

Edit: After I upgraded my monitor to 1440p, I just couldn't play modded Skyrim (heavily modded + ENB) because of the low fps (30+) I was getting with my 780.


----------



## salamachaa

Quote:


> Originally Posted by *GetToTheChopaa*
> 
> Haha! So true!
> I also tried ultra at 1440p, not only was a 30-something fps slideshow, but stutter........ nevermind 30 fps is stutter!
> At the end of the day it's just a matter of standards!


I suppose. I'll admit it's not the best experience possible, but it's acceptable to me, and I prefer the extra eye candy to the extra frames in this kind of game.


----------



## GetToTheChopaa

Quote:


> Originally Posted by *blackhole2013*
> 
> Wow so my 780 3gb with ultra textures and average 90 fps is poor performance no stuttering is not good performance for that guy .. Huh I still cant run watchdogs on ultra textures now when I do it stutters like crazy


Sorry, but I just don't buy that you're getting 90 fps at Ultra settings on a 780, and without stuttering on top of that...


----------






## salamachaa

Quote:


> Originally Posted by *GetToTheChopaa*
> 
> Haha! So true!
> I also tried ultra at 1440p, not only was a 30-something fps slideshow, but stutter........ nevermind 30 fps is stutter!
> At the end of the day it's just a matter of standards!
> 
> Edit: After I upgraded my monitor to 1440 I just couldn't play modded Skrym (heavily modded+ENB) because of the low fps (30+) I was getting with my 780


Skyrim was probably Havok pummeling one thread. I had the same issue until I upped my Havok threads to 4 (not recommended for CPUs with fewer than 6 threads). It's a Skyrim Flora Overhaul / Grass on Steroids / any-flora-mod thing.
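
For anyone wanting to try the same tweak, it's usually done in Skyrim.ini. The section and key below follow commonly circulated community tweak guides rather than official documentation, so verify them against a current guide before editing, and as noted above, don't raise the value on CPUs with fewer than 6 hardware threads.

```ini
; Skyrim.ini -- Havok threading tweak (key name per community tweak guides)
[HAVOK]
iNumThreads=4
```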


----------



## GetToTheChopaa

Quote:


> Originally Posted by *salamachaa*
> 
> I suppose. I'll admit it's not the best experience possible, but it's acceptable to me and I prefer the extra eyecandy to the extra frames in this kind of game.


I thought that too before getting the 1440p panel, but I was wrong! Also, in this game there isn't a big enough difference between High and Ultra settings to justify the compromise. Personal opinion, of course!


----------



## blackhole2013

Quote:


> Originally Posted by *GetToTheChopaa*
> 
> Sorry, but I just don't buy it that you're getting 90 fps at ultra settings on a 780, without stuttering on top of that...


It's true; remember, I only have a 1080p monitor...


----------






## GetToTheChopaa

Quote:


> Originally Posted by *salamachaa*
> 
> Skyrim was probably Havok pummeling one thread. I had the same issue until I I upped my havok threads to 4 (not recommended for users with less than 6 threads). It's a Skyrim flora overhaul/grass on steroids/ any flora mod thing.


I only have 4! I'm not even running Grass on Steroids or the crazy, incredibly bushy trees (forgot the name of the mod), because that's just... no way!


----------



## Zipperly

Quote:


> Originally Posted by *GetToTheChopaa*
> 
> I thought that too before getting the 1440p panel, but I was wrong! Also, in this game there isn't a big enough difference between high and ultra settings to justify the compromise. Personal opinion, of course!


I did just spend the past two hours playing SOM at 1440p with Ultra textures out of curiosity, and I get mostly 60 fps except when it stutters a bit due to VRAM. So with Textures, Vegetation, and Mesh on Ultra and the rest on High, I've found 1440p very doable with a very good, vsynced frame rate aside from the occasional stutter. I do agree, though, that the visual difference between Ultra and High is pretty hard to notice, so High is generally better for a stutter-free experience.


----------



## salamachaa

Quote:


> Originally Posted by *GetToTheChopaa*
> 
> I have only 4! I'm not even doing grass on steroids and the crazy, incredibly bushy trees (forgot the name of the mod), because that's just........ no way!


Interesting. What 4 mods did you have?


----------



## GetToTheChopaa

Quote:


> Originally Posted by *blackhole2013*
> 
> Its true remember I only have a 1080p monitor ...


All the benchmarks I've seen on 780s were in the 70s, some lower! Can you post a video of your bench? (I dropped my 1440p to 1080p and ran an Ultra bench: 72 fps average.)
https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&es_th=1&ie=UTF-8#q=shadow+of+mordor+gtx+780+ultra&tbm=vid&start=0

Quote:


> Originally Posted by *Zipperly*
> 
> I did just spend the past 2hrs gaming with SOM 1440P Ultra textures out of curiosity and I get mostly 60fps except for the time that it stutters a bit due to vram. So with Textures, Vegetation and Mesh set to ultra with the others on high I have found 1440P to be very doable while getting a very good frame rate vsynced.
> 
> Keep in mind the overclock on my card too..


With everything on Ultra at 1440p, even the benchmark averages 40 fps, while in-game, fighting 20-30 Uruks is a slideshow. High with a couple of Ultra options is OK, as we agreed earlier.


----------



## Zipperly

Quote:


> Originally Posted by *GetToTheChopaa*
> 
> Everything on ultra at 1440p even the benchmark average is 40fps while in game when fighting 20-30 Uruks is a slideshow. High with a couple of ultra options is OK, as we agreed earlier.


Yes, full Ultra is another story, but I was able to get pretty close while holding 60 fps for the most part. Ultra textures, mesh, and vegetation with the rest on High does the trick. For completely stutter-free play, though, High textures are the best choice.


----------



## GetToTheChopaa

Quote:


> Originally Posted by *salamachaa*
> 
> Interesting. What 4 mods did you have?


I was referring to threads when I said I only have 4; I should have been more specific. Mods, I have around 150-200... I've lost track.


----------



## salamachaa

Quote:


> Originally Posted by *GetToTheChopaa*
> 
> I was referring to "threads" when I said I only have 4. I should have been more specific. Mods, I have around 150-200, lost track...


Did you clean them and make sure your load order was correct? I never had frame drops below 45 when I did heavy (300+) modding.


----------



## GetToTheChopaa

Quote:


> Originally Posted by *salamachaa*
> 
> Did you clean them and make sure your load order was correct? I never had frame drops below 45 when I did heavy (300+) modding.


Doesn't it CTD if the load order isn't correct? I've had instances where I manually changed the load order because of that, but I've never done anything else regarding load order. 300+ mods at 1440p?


----------



## Murlocke

Quote:


> Originally Posted by *sgs2008*
> 
> Sigh 3gb isnt enough for 1440p my 2 780tis stutter like hell with ultra textures fine on high. IS 4gb enough for ultra textures at 1440p?


Yes, a 980 was fine at 3440x1440 for me. No stuttering, just lower FPS than you'd like; I was getting about 30 in fights. The VRAM is there; the power is not.


----------



## salamachaa

Quote:


> Originally Posted by *Murlocke*
> 
> Yes, a 980 was fine at 3440x1440 for me. No stuttering, just lower than you'd like FPS. I was getting about 30 in fights. The VRAM is there, the power is not.


Similar experience for me on my 970 at 1440p. 35-45 fps no stuttering.


----------



## Zipperly

Quote:


> Originally Posted by *Murlocke*
> 
> Yes, a 980 was fine at 3440x1440 for me. No stuttering, just lower than you'd like FPS. I was getting about 30 in fights. The VRAM is there, the power is not.


I wonder if he was running full Ultra, because I can run 1440p with Ultra textures, Ultra mesh, and Ultra vegetation. I set everything else to High, and this gives me mostly 60 fps with minimal stutter. If I turn the other settings from High to Ultra, it does indeed become a stutter-fest... even at 1080p, though less frequently, obviously.

At any rate, High textures are pretty hard to tell apart from Ultra and will resolve his stuttering issues altogether at 1440p.


----------



## PhilWrir

I want to make it clear that at this point the game is out, and this topic can easily be read about and verified in reviews of the game.

It's going to stay open for a few more days *max*, but I expect everyone to get back on topic rather than let the bickering continue.


----------



## BradleyW

Quote:


> Originally Posted by *salamachaa*
> 
> Similar experience for me on my 970 at 1440p. 35-45 fps no stuttering.


The 290X seems to be doing Ultra at 1440p and hovering between 55 to 70 fps on my end. Not a single stutter, most likely due to the 290X's insane memory bandwidth.


----------



## Zipperly

Quote:


> Originally Posted by *BradleyW*
> 
> The 290X seems to be doing Ultra at 1440p and hovering between 55 to 70 fps on my end. Not a single stutter, most likely due to the 290X's insane memory bandwidth.


Full Ultra, as in every option on Ultra? And gameplay or benchmark?


----------



## salamachaa

Quote:


> Originally Posted by *BradleyW*
> 
> The 290X seems to be doing Ultra at 1440p and hovering between 55 to 70 fps on my end. Not a single stutter, most likely due to the 290X's insane memory bandwidth.


Are you referring to fights or just general play? The 35-45 fps was during bigger fights for me. That's really the only time I've looked at the frame rate, other than the benchmark, because I noticed the slowdown.


----------



## firebird1le

Is ultra shadows worth enabling?


----------



## Zipperly

Quote:


> Originally Posted by *firebird1le*
> 
> Is ultra shadows worth enabling?


From a visual standpoint, no. It's better for most people to go with High and enjoy the fps boost; it doesn't look much different from Ultra.


----------



## Juub

Whoa, looks like we've really hit the point of diminishing returns. Anyone recall when there was a world of difference between High and Max? Now even the difference between Low and Max isn't that big anymore. We've got super-powerful hardware, but what's the point when no game can show for it? I mean, Shadow of Mordor requires 6GB of VRAM, yet games like Crysis 3 and Metro Redux look much better and have much lower requirements.

On a different note, I ditched my 7950s for a single 970 and I'm loving it. Peak performance isn't as high, but it's much more consistent and performs a lot better in many games (Metro Redux, AC IV, Watch_Dogs, to name a few). I'll be using it as a placeholder until big Maxwell comes.


----------



## ToxicAdam

Quote:


> Originally Posted by *Juub*
> 
> We got super powerful hardware but what's the point when no game can show for it? I mean Shadow of Mordor requires 6GB of VRAM but how come games like Crysis 3 and Metro Redux look much better and have much lower requirements.


Not a very optimized console port.


----------



## Chargeit

Quote:


> Originally Posted by *ToxicAdam*
> 
> Not a very optimize console port.


How so? The Ultra textures look amazing.

Everyone complained that consoles held back PC gaming. Now consoles have access to a lot of VRAM; time to catch up.


----------



## Juub

Also, I thought this game was supposed to run at 60 fps on the PS4. I read some websites claiming they'd tried the game and it was buttery smooth at 60 fps. Now everywhere I look it seems to be capped at 30. What happened?


----------



## Murlocke

The reason this game requires more VRAM than others is that it loads higher-resolution textures much further away than most games do. Same with Watch Dogs. Many open-world games use lower-resolution textures on anything more than ~30 feet from your character, which looks awful in comparison to this game... Look at the orcs off in the distance: they are fully textured and look amazing.

As for Crysis 3 and Metro requiring less VRAM: well, yes, they aren't really open world. They have some larger levels, but that's not the same. At 4K, Metro did indeed eat more than 4GB of VRAM after about 15 minutes when I tried it, even with its much smaller level designs.
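
A sketch of the streaming behavior described above: engines typically pick a texture mip level from distance (or screen-space size), so pushing the full-resolution cutoff out keeps far-away orcs sharp at a steep VRAM cost. The function and the cutoff distances are illustrative assumptions, not Shadow of Mordor's actual streaming logic.

```python
import math

def mip_for_distance(distance_m, full_res_within_m):
    """Mip 0 (full resolution) inside the cutoff; each doubling of
    distance beyond it drops one mip level (a quarter of the memory)."""
    if distance_m <= full_res_within_m:
        return 0
    return math.ceil(math.log2(distance_m / full_res_within_m))

# Conservative streamer: full res only within ~10 m (the "~30 feet" above).
# Aggressive streamer: full res out to ~80 m, as SoM appears to do.
for d in (5, 40, 160):
    print(f"{d:>4} m -> conservative mip {mip_for_distance(d, 10)}, "
          f"aggressive mip {mip_for_distance(d, 80)}")
```

At 40 m the conservative streamer has already dropped two mip levels (1/16 the memory per texture) while the aggressive one is still at full resolution; multiplied across every material in view, that's the VRAM gap between the two approaches.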

I prefer the direction gaming is going; it only sucks right now because we need new GPUs. Now that we have games demanding it, it won't take long for new 6GB+ cards. I personally wouldn't buy a $500+ card that doesn't have 6GB of VRAM unless you're running 1080p. The 980 is a great card for 1080p and current games, but I see it becoming obsolete very quickly for anyone running higher. 780 Tis are in an even worse position, but at least most of their owners got good use out of them.
Quote:


> Originally Posted by *Juub*
> 
> Also I thought this game was supposed to run a 60fps on the PS4. I read some websites claiming they had tried the game and it was buttery smooth at 60fps. Now I watch everywhere and it seems to be capped at 30. What happened?


Because they probably demoed the game at closed events. Usually there's just a PS4 sitting on a stand with the cables running back to a hidden, very powerful PC. A few companies have been caught doing this recently, and it really should be frowned upon more. It's fraud, pure and simple.


----------



## DrBrogbo

Quote:


> Originally Posted by *Juub*
> 
> Also I thought this game was supposed to run a 60fps on the PS4. I read some websites claiming they had tried the game and it was buttery smooth at 60fps. Now I watch everywhere and it seems to be capped at 30. What happened?


Probably yet more people who fell for BS marketing hype and don't understand the difference between 30 and 60 fps.

People really need to stop expecting consoles to support 60Hz as a standard anyway. It's just flat-out not going to happen.


----------



## Ascii Aficionado

So, the general consensus is with a 780 Ti Classified (3GB) @ 1440p, I should just stick with high and not try to play ultra ?

I mean, I can test Ultra, but I should not feel like I'm missing out if I lower it to high, correct ?


----------



## Murlocke

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> So, the general consensus is with a 780 Ti Classified (3GB) @ 1440p, I should just stick with high and not try to play ultra ?
> 
> I mean, I can test Ultra, but I should not feel like I'm missing out if I lower it to high, correct ?


Set the preset to Ultra, then set textures to High.

I wouldn't bother with Ultra textures, since they won't run well and the difference is pretty small. Even on my friend's 4GB 980 we determined that Ultra textures weren't the best setting for 1440p, not because of VRAM but because no single GPU has enough power to run this game maxed out at 60 fps at 1440p. 30-40 fps isn't very enjoyable, and you still won't get a solid 60 even with High textures; closer to 40-50 fps during fights.


----------



## GTR Mclaren

Quote:


> Originally Posted by *Murlocke*
> 
> Set preset to Ultra, then set textures to High.
> 
> Wouldn't bother with ultra texture since it won't run well and the difference is pretty small. Even on my friend's 4GB 980 we determined that ultra textures wasn't the best setting for 1440p, not because of VRAM but because no single GPUs has enough power to run this game maxed out at 60 FPS @ 1440p. 30-40FPS isn't very enjoyable, and you still won't get a solid 60 even with high textures. Closer to 40-50FPS during fights.


But for 1080p it's good?


----------



## cstkl1

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> So, the general consensus is with a 780 Ti Classified (3GB) @ 1440p, I should just stick with high and not try to play ultra ?
> 
> I mean, I can test Ultra, but I should not feel like I'm missing out if I lower it to high, correct ?


GeForce Experience's optimization is pretty spot-on for a single GPU. This game doesn't really benefit from high clocks; it's the VRAM limit that matters. The optimized settings should be everything maxed with textures on High.

Fps in fights is 50-ish, with 60 fps most of the time. Fps capped at 60 and G-Sync enabled.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Murlocke*
> 
> The reason why this game requires more VRAM than others is because it loads higher resolution textures much further away than most games. Same with Watch Dogs. Many open world games use lower resolution textures on anything more than ~30 feet from your character. It looks awful in comparison to this game... Look at the orcs off in the distance, they are fully textured and look amazing.
> 
> As for Crysis 3 and Metro requiring less VRAM, well yes, they aren't really open world. They have some larger levels, but that's not the same. At 4K, Metro does indeed eat more than 4GB of VRAM after about 15 minutes when I tried it - even with it's much smaller level designs.
> 
> I prefer the direction gaming is going, it only sucks right now because we need new GPUs. Now that we have games demanding it, it won't take long for new 6+ GB cards. I personally wouldn't buy a $500+ card that doesn't have 6GB of VRAM unless you are running 1080p. The 980 is a great card for 1080p and current games, but I see it becoming obsolete very quickly for anyone running higher. 780Tis are even in a worse position, but at least most of their owners got a good use out of them.
> Cause they probably demo'd the game at closed events. Usually there's just a PS4 sitting on a stand with the cables going back to a hidden very powerful PC. Few companies have been caught doing this recently and it really should be frowned upon more. It's fraud, pure and simple.


^^^ Best answer I've seen on this so far.

Not even modded Skyrim loads things in the distance like this game does.

Nvidia knew very well this game was going to make their flagships look old and possibly push new sales.

On a side note, if this game never gets SLI support, that's a BIG FAIL in my book.


----------



## Fresh Sheep

Yeah, Ultra on all settings and then set textures to High. The game has great draw distances and models, and from my testing I doubt you'd notice the Ultra textures once in motion anyway. The game runs perfectly at 1440p on my 780, so you should be fine.

I have to say, I don't really agree with everyone calling the game poorly optimized. When you take into account how steady the frame rate is (the lowest mine has dropped was ~45 in a huge battle in a very dense area, and it was barely noticeable), especially with the fantastic draw distance and a bonkers number of highly detailed orcs on screen at once... I personally think they did a great job with this game.


----------



## NrGx

Checked my VRAM usage on Ultra settings and it hovers around 3.8GB. A bit absurd if you ask me; this game must be poorly optimized. Not even Skyrim takes this much with 8K LOD packs.


----------



## Arizonian

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> ^^^Best answer I've seen regarding this so far
> 
> Not even modded skyrim loads stuff in the distance like this game does
> 
> Nvidia knew very well this game was going to make there flagships look old and possibly push new sales
> 
> On a side note *if this game never gets SLi support that is a BIG FAIL* in my book


Agreed.

I'll spend money and try Shadows when it does support SLI, but not until then.


----------



## Ahnt

My system runs it great on High settings at 1080p. I haven't run Fraps yet, but it's silky smooth. It does use a lot of VRAM, though. It stutters a fair amount on Ultra, but I think that's the HDD more than anything.

2500K / Bone stock
H61
2x2 1333 6-8-7-14 1T
HD7850 with a high overclock


----------



## Murlocke

Quote:


> Originally Posted by *GTR Mclaren*
> 
> but for 1080p it is good ??


On 1080p, every game is pretty easy to max with a modern GPU.

It will still look a lot worse than 1440p with High textures, though. I doubt you can even see the difference between High and Ultra textures at 1080p; you can barely see it at 1440p. I have to stare, or compare side-by-side screenshots, but while actually playing I never notice the difference. I bet they'd make a big difference on a 32-inch-plus 4K monitor, though.


----------



## Silent Scone

Quote:


> Originally Posted by *Murlocke*
> 
> Set preset to Ultra, then set textures to High.
> 
> Wouldn't bother with ultra texture since it won't run well and the difference is pretty small. Even on my friend's 4GB 980 we determined that ultra textures wasn't the best setting for 1440p, not because of VRAM but because no single GPUs has enough power to run this game maxed out at 60 FPS @ 1440p. 30-40FPS isn't very enjoyable, and you still won't get a solid 60 even with high textures. Closer to 40-50FPS during fights.


Ultra textures don't really impact the frame rate that much. I can get a reasonably solid 60 FPS; it's the VRAM limit that is the issue.

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> ^^^Best answer I've seen regarding this so far
> 
> Not even modded skyrim loads stuff in the distance like this game does
> 
> Nvidia knew very well this game was going to make there flagships look old and possibly push new sales
> 
> On a side note if this game never gets SLi support that is a BIG FAIL in my book


Push what sales? Lol. Titan Black is EOL.


----------



## Murlocke

Quote:


> Originally Posted by *Silent Scone*
> 
> Ultra textures don't really impact the frame rate that much. I can get a reasonably solid 60 FPS; it's the VRAM limit that is the issue.


On what, 1080p? If you're getting that at 1440p, you've managed to actually utilize your 3x 980s. I'm talking about a single GPU at 1440p. You can't get a solid 60 fps on a single 980; I literally just tried yesterday with two different computers. Both averaged about 30-40 fps on Ultra textures and 40-50 fps on High textures. Try the second act, where there's tons of grass; the first act is much easier to run.


----------



## Silent Scone

Quote:


> Originally Posted by *Murlocke*
> 
> On what 1080p? If you are getting it on 1440p you managed to somewhat utilize your 3x 980s. I'm talking on a single GPU on 1440p. You can't get a solid 60FPS on a single 980, I literally just tried yesterday with two different computers. Both resulted in about 30-40FPS average on ultra textures, and 40-50FPS average on high textures. Try the second act, where there is tons of grass. The first act is way easier to run.


There's no consistency to it. I haven't run into any issues so far, but that said, I probably won't try it in Act 2 because of the VRAM swapping; I'd imagine it's even worse there. I'm also talking about a single GPU!

SLI isn't great at the moment, though; actually worse. All at 1440p.

Edit: last post was a bit jumbled, iOS 8 is a POS


----------



## Merlena

Runs fluidly on my computer with one 780 Ti. It gets a bit warm, of course, but I noticed turning down ambient occlusion and view distance helps.

I'm on 2560x1440 with pretty much solid frames; not sure if G-Sync is what's saving me, since weirdly enough it runs better on my old 780 Ti than my new 980. Either that or I'm just imagining things.

The anti-aliasing options in this game are a joke, settings-wise, so let's wait until a patch and some tweaks come out, and we should be good at probably any resolution.


----------



## hollowtek

yeah as many have already mentioned, the stutter on ultra texture isn't worth it right now lol. toning it back down to high textures.


----------



## 970Rules

Just got done with the game. The 970 was smooth; 6GB is a joke.


----------



## cstkl1

Quote:


> Originally Posted by *Murlocke*
> 
> On what 1080p? If you are getting it on 1440p you managed to somewhat utilize your 3x 980s. I'm talking on a single GPU on 1440p. You can't get a solid 60FPS on a single 980, I literally just tried yesterday with two different computers. Both resulted in about 30-40FPS average on ultra textures, and 40-50FPS average on high textures. Try the second act, where there is tons of grass. The first act is way easier to run.


My Titan Black's fps performance is still the same on both maps: avg 50-60, and 40ish in massive fights, just like that vid I posted. Also, whenever it hits 6GB, fps will drop to 39, and as I run forward it will climb back toward the cap. This is ultra @ 1440p, fps capped at 60.

Silent Scone is right. VRAM is what's causing that fps drop; not enough data is being fed to the GPU core to keep the fps steady.

Also pretty sure he is just running a single card till a proper SLI profile comes out.

Btw, can people who claim the 290X is doing all that post a vid with the RivaTuner overlay and share it?

I suspect tessellation tweaking in the AMD profile here. Need the vid, because they did it on the Batman series.


----------



## NoDoz

I have been testing gameplay with a single 780 @ 2560x1600 on very high settings. Gameplay is perfect and I have a few hours of gaming on it. Just an FYI.


----------



## cstkl1

Quote:


> Originally Posted by *Fresh Sheep*
> 
> Yea, ultra all settings and then set textures to high. Game has great draw distances and models and I doubt you'd notice the ultra textures once in motion from my testing anyways. Game runs perfectly at 1440p on my 780 so you should be fine.
> 
> I have to say as well, I don't really agree with everyone saying the game is poorly optimised. When you take into account how steady the framerate is (lowest I think mine has dropped to was ~45 in a huge battle in a very dense area, was barely noticeable), especially with the fantastic draw distance and when you have a bonkers amount of highly detailed orcs on the screen at once... I personally think they did a great job with this game.


Yup, the devs did it right with the initial release. Very few bugs etc., no issues for either camp.

Btw, please retell that draw-distance point to that AC4 dude. Watch Dogs should have been like this instead of its annoying pop-ins; expect FC4 to be the same. Wolfenstein with that id engine had the same texture pop-in issue as Rage.

This game doesn't have any of those annoying traits. Btw, on the Skyrim argument: I doubt you're killing 20 guys, running at flash speed across the map, shadow-striking a few people, then jumping back on a caragor to rampage, all in a matter of minutes. Hence the engine used on this game is superb.

So any patches will just make it even better.

Really want to see those 290X ultra vids. Please share them.


----------



## Silent Scone

Seconded, I don't think the game is poorly optimised for a moment. You don't have to run ultra textures, and without them the game runs perfectly well on a single 980, and I'm sure on a 780 also. High textures are hardly low-end either; they still look great, and at 1440p my GTX 980 still allocates over 3GB, which is considerable with nothing but forced FXAA. Looks are subjective, but the LOD distance and textures look really, really good in this game for third person.

It might come across as fairly forthright of me and others to palm off people saying it runs ok for them, but there isn't any logical reason why it would. You are, or at least should be, running the exact same textures with the exact same amount of frame buffer available.

Varying across vendors, you might find that the recent AMD drivers are bettering NV cards at the moment in how they handle memory allocation (albeit most of this is handled by DirectX, so it SHOULD be mostly the same for everyone), but when you are _right_ on the limit, there will 100% be times in the game where swap-out occurs and performance plummets.
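To put some rough numbers on why an ultra texture set blows past a 3-4GB card, here's a back-of-envelope sketch. All the figures are illustrative assumptions (a BC7-style block-compressed format at ~1 byte per pixel and a full mip chain adding ~1/3 on top), not the game's actual asset counts or formats:

```python
# Rough VRAM cost of one block-compressed texture with a full mip chain.
# Assumptions (illustrative, not Shadow of Mordor's real data): ~1 byte
# per pixel after compression, mips adding ~1/3 of the base level.

def texture_mib(width, height, bytes_per_pixel=1.0):
    """Approximate size in MiB of one texture plus its mip chain."""
    base = width * height * bytes_per_pixel
    with_mips = base * 4 / 3  # mip levels 1..n add ~1/3 of the base
    return with_mips / 2**20

high = texture_mib(2048, 2048)   # ~5.3 MiB each
ultra = texture_mib(4096, 4096)  # ~21.3 MiB each, 4x the high cost

# A few hundred unique textures resident at once is plausible for an
# open-world scene; at 4096^2 that alone approaches the 6GB figure.
print(f"high: {high:.1f} MiB, ultra: {ultra:.1f} MiB")
print(f"280 ultra textures ~= {280 * ultra / 1024:.1f} GiB")
```

The point of the sketch: doubling texture resolution quadruples the per-texture cost, so a set that fits comfortably in 3GB at "high" can overflow a 4GB card at "ultra", and anything that doesn't fit gets swapped over the PCIe bus, which is exactly when the fps falls off a cliff.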


----------



## Zipperly

Quote:


> Originally Posted by *Juub*
> 
> I mean Shadow of Mordor requires 6GB of VRAM but how come games like Crysis 3 and Metro Redux look much better and have much lower requirements.


Couple of things: Crysis 3 and Metro are not an apples-to-apples comparison with SOM, as neither is an open-world game. And the 6GB VRAM requirement for SOM is more or less a 4K recommendation. I have been playing SOM with ultra textures at 1440p just fine on my 3GB 780 with very minimal stuttering: ultra textures, ultra mesh, ultra vegetation, with the rest on high.
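For what it's worth, the 4K angle is easy to sanity-check: textures cost the same VRAM at any resolution, but every full-resolution render target scales with pixel count, and 4K has 2.25x the pixels of 1440p. A hedged sketch (the target count and bytes per pixel are made-up guesses for a generic deferred renderer, not this game's actual pipeline):

```python
# How render-target VRAM scales with resolution. The number of targets
# and bytes per pixel are illustrative assumptions, not Shadow of
# Mordor's actual G-buffer layout.

def render_target_mib(width, height, targets=6, bytes_per_pixel=4):
    """Approximate MiB for a stack of full-resolution render targets."""
    return width * height * targets * bytes_per_pixel / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: {render_target_mib(w, h):.0f} MiB")

# 4K is 2.25x the pixels of 1440p, so every full-res buffer grows 2.25x.
scale = (3840 * 2160) / (2560 * 1440)  # = 2.25
```

That's why the same ultra texture set that squeaks by on a 3-4GB card at 1440p can need the 6GB recommendation at 4K: the fixed texture budget plus a buffer stack that grows 2.25x tips it over.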


----------



## Zipperly

Quote:


> Originally Posted by *Murlocke*
> 
> Set preset to Ultra, then set textures to High.
> 
> Wouldn't bother with ultra texture since it won't run well and the difference is pretty small. Even on my friend's 4GB 980 we determined that ultra textures wasn't the best setting for 1440p, not because of VRAM but because no single GPUs has enough power to run this game maxed out at 60 FPS @ 1440p. 30-40FPS isn't very enjoyable, and you still won't get a solid 60 even with high textures. Closer to 40-50FPS during fights.


If your friend tried my combination of settings at 1440p, I bet he would get 60fps most of the time, because I certainly do with my overclocked 780. It can be done if you run textures, mesh, and vegetation at ultra but keep the remaining options on high.


----------



## Zipperly

Quote:


> Originally Posted by *Murlocke*
> 
> On 1080p every game is pretty easy to max with a modern GPU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Though it will still look a lot worse than 1440p with high textures. I'm kinda doubting you can even see the difference between High/Ultra textures on 1080p, you can barely see the difference on 1440p. I have to stare for it, or compare side by side screenshots, but while actually playing I never notice the difference. I bet they'd make a big difference on a 32" + 4K monitor though.


+1. High is the best choice for textures if you want to completely eliminate any stuttering on 3-4GB cards. Ultra textures can be done if a few other key settings are turned down to high; the end result is very minimal stuttering... and I do mean VERY MINIMAL, as long as you are not over 1440p.


----------



## funfordcobra

The F.E.A.R. 3 SLI profile works great, no AFR. I'm getting a solid 100fps on my 2560x1440/100Hz monitor with very few dips into the 90s, and on 3440x1440/60Hz it's locked at 60 with no dips at all.


----------



## Zaid

anybody figured out how the spacebar counter works?

do i have to tap a direction as well as spacebar? because i seem to be failing at 90% of those counters.


----------



## Murlocke

Quote:


> Originally Posted by *Zaid*
> 
> anybody figured out how the spacebar counter works?
> 
> do i have to tap a direction as well as spacebar? because i seem to be failing at 90% of those counters.


Join the club. There's a built-in delay with rolling that appears to be about the same length as the window they give you. You are supposed to roll in a direction, not just press space, but it still seems almost impossible sometimes. It seems like it basically requires 60FPS or you'll miss it. Dodging any of the animals in the world is even worse; I never get them, I just take them out from range...









Test it... just try rolling in one direction. I feel there's about a 250-350ms delay before I actually roll. Right click on the other hand is easy and instant...


----------



## supermi

What about executions? They seem hit or miss, ESPECIALLY with the keyboard; a controller seems a little more responsive, maybe.

I was wondering if anyone else was having those kinds of issues.


----------



## Daniel1236

I'm running high textures with everything else maxed out at 1080p with a 2GB GTX770 and have had no issues at all.

Very impressed with the quality standard of the development on the game, it runs great.


----------



## j0z3

So this means my HD5770 cannot run this game?

Mother of god that is a lot.


----------



## Zaid

Quote:


> Originally Posted by *Murlocke*
> 
> Join the club, there's a built-in delay with rolling that appears to be about the same length as the window they give you. You are suppose to roll in a direction, not just press space, but it's still almost impossible sometimes it seems. It seems like it basically requires 60FPS or you'll miss it. Dodging any of the animals in the world is even worse, I never get them. I just take them out from range...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Test it... just try rolling in one direction. I feel there's about a 250-350ms delay before I actually roll. Right click on the other hand is easy and instant...


Oh, this was a failure on my part, I wasn't executing it properly. I think you have to double-tap space and a direction for him to do the jump and roll. It's still a lot more difficult than the regular right-click counter.

Quote:


> Originally Posted by *supermi*
> 
> what about the execution, seems hit or miss ESPECIALLY with the KB a controller seams better a little more responsive maybe.
> 
> I was wondering if anyone else was having those kinda issues


It's fine for me on PC. At first it was awkward getting the timing for ground executions, but now it's smooth as butter, especially with the quicker execution ability.

Quote:


> Originally Posted by *j0z3*
> 
> So this means my HD5770 cannot run this game?
> 
> Mother of god that is a lot.


I think you would be able to run this game on low settings. You can even do what they did on Xbox One and run it at 900p.


----------



## DrBrogbo

Quote:


> Originally Posted by *Zaid*
> 
> Oh this was failure on my part, i wasn't executing properly. i think you have to double tap space and a direction for him to do the jump n roll. its still alot more difficult then then regular right click counter.
> its fine for me on pc, at first it was awkward getting the timing for ground execution. but now its smooth as butter, especially with the quicker execution ability.
> i think you would be able to run this game, on low settings. you can even do what they did on Xbox one and run it on 900p only.


I have issues with those sometimes. It happens with the captains and warchiefs, and it's supposed to mean it's a heavy attack you can't block/counter. Your only choice is to roll out of the way, which can be tricky, especially since a lot of the time the captains spawn nearly endless hordes of uruks. Those fights turn almost Witcher-2-combat-esque: slash a few times, roll away, slash a few more times, roll away, slash a few more times, do an execution, roll away, etc. Painful.


----------



## Zaid

Quote:


> Originally Posted by *DrBrogbo*
> 
> I have issue with those sometimes. It happens with the captains and warchiefs, and it's supposed to mean it's a heavy attack you can't block/counter. Your only choice is to roll out of the way, which can be tricky, especially since a lot of the times the captains spawn nearly-endless hordes of the uruks. Those fights turn almost Witcher-2-combat-esque: Slash a few times, roll away, slash a few more times, roll away, slash a few more times, do an execution, roll away, etc. Painful.


When the fight gets to that point, it's almost always lost if you fight them head on. The best thing to do when that happens is to run to a fire/wasp nest/explosives and use that when they're all bunched up. Works like magic.


----------



## hurleyef

The bow is op once you've upgraded your focus and arrow capacity and have the combat drain ability. You can easily clear hordes very quickly that way.


----------



## JoHnYBLaZe

Quote:


> Originally Posted by *Silent Scone*
> 
> Ultra textures doesn't really impact the frame rate that much. I can get a reasonably solid 60 FPS, it's the VRAM limit that is the issue.
> 
> Push what sales? Lol. Titan Black is EOL.


Wow... so Nvidia technically doesn't even have a card to run this game.

I meant they created demand for more cards even though 970s, 780s and up are plenty powerful...

For the prices of these cards, this whole VRAM business is a little too SLIMY for me.

I can't think of any other product offhand that, at these prices, degrades in performance in such a short time.


----------



## DIYDeath

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Wow....so nvidia technically doesn't even have a card to run this game
> 
> I meant they created demand for more cards even though 970's, 780's and up are plenty powerful....
> 
> For the prices of these cards, this whole vram business is a little too SLIMEY for me
> 
> I can't think of any product off hand that, for these prices, degrades in performance in such a short time


Well, there is the Titan Black, which handles the game quite nicely. But do you want to shell out $1000 for a card that's probably going to see its replacement within 12 months and end up on the secondary market? lol


----------



## Silent Scone

EOL means End of Life. There is always the Titan Z, though. Plenty of inventory of those.


----------



## DIYDeath

Quote:


> Originally Posted by *Silent Scone*
> 
> EOL means End Of Life. There is always the Titan-Z. Plenty inventory of those


I wonder what they'll have to do to sell them now that maxwell is here.


----------



## Silent Scone

Quote:


> Originally Posted by *DIYDeath*
> 
> I wonder what they'll have to do to sell them now that maxwell is here.


There is always a need for something, and they have been selling, or at least they did initially. Mostly for professional use from the info I heard, you'll be glad to hear; there can't be that many insane gamers.

I'd definitely say they'll have a rough time clearing the last of the Titan Z inventory, though Titan Blacks should clear up pretty quickly.


----------



## Rickles

Anyone else hate the stupid ghouls in this game???


----------



## Zipperly

Quote:


> Originally Posted by *DIYDeath*
> 
> Well there is the Titan Black, that handles the game quite nicely. Want to shell out $1000 for a card that's probably going to have its replacement within 12 months on the secondary market? lol


Or you can get a 6GB 780 used for roughly half that amount; overclock it and it would suffice just fine.


----------



## Zipperly

Quote:


> Originally Posted by *Rickles*
> 
> Anyone else hate the stupid ghouls in this game???


Yes, those little pricks are a pain in the butt for me too. Oh and last night I enjoyed trying to defeat one of the captains only to have 2 other bosses with him joining in on the fight along with 50 or so orcs. Every time I would restart the mission and go for that captain those other 2 jerk offs would still be with him.


----------



## Rickles

Quote:


> Originally Posted by *Zipperly*
> 
> Yes, those little pricks are a pain in the butt for me too. Oh and last night I enjoyed trying to defeat one of the captains only to have 2 other bosses with him joining in on the fight along with 50 or so orcs. Every time I would restart the mission and go for that captain those other 2 jerk offs would still be with him.


I've had very few issues with captains unless there are too many archers or spear-throwers around; I killed my first warchief taking 0 damage. But those ghouls, and the matron especially, just murder me.


----------



## Zipperly

Quote:


> Originally Posted by *Rickles*
> 
> I've had very few issues with captains, unless there are too many archers or spear throwers around, killed my first war chief taking 0 damage, but those ghouls and the matron especially just murder me.


My bad, it was a war chief I was going after and he had 2 captains hanging out with him every time i tried the mission. Then the plethora of orcs........ my gosh.


----------



## BusterOddo

Quote:


> Originally Posted by *Zipperly*
> 
> My bad, it was a war chief I was going after and he had 2 captains hanging out with him every time i tried the mission. Then the plethora of orcs........ my gosh.


Getting the info on the captains and killing them first really helps. Use their weaknesses to kill them quickly, and try to clear a bunch of the orcs before attacking the chief. My only complaint with this game is that it was too short.

It was so good I couldn't stop playing, and now I've already beaten it.


----------



## Zipperly

Quote:


> Originally Posted by *BusterOddo*
> 
> Getting the info on the captains and killing them first really helps. Use their weakness to kill them quickly, and try to clear a bunch of the orcs before attaching the chief. My only complaint with this game is it was too short
> 
> 
> 
> 
> 
> 
> 
> It was so good I couldn't stop playing and now I already beat it.


Okay, I will go for the captains first, especially the ones harassing me. I have around 10 hours in this game and I'm only on the first act, lol.


----------



## Faded

I don't have any real issues with the ghouls... build up a quick multiplier and then Wraith Flash; it freaking decimates them. The matron takes a HUGE amount of damage from the bonfires near her... like half her health, if I recall.

My biggest issue has been with one of the captains that has invuln to stealth, ranged AND combat mastery... couple that with the gang he roams with and the dude is a HUGE pain in the ass for me. I REALLY don't like the Combat Master perk...


----------



## DiNet

Gigabyte 970 G1:
75 avg at 1080p
32 avg at 4K DSR
With the HD texture pack.


----------



## BradleyW

Quote:


> Originally Posted by *DiNet*
> 
> gigabyte 970 G1
> 75 avg on 1080
> 32 avg on 4K DSR
> With HD texture packs.


But at what settings?
Are those fps results the min, max, avg?


----------



## ozlay

Running fine for me on a 7950 with everything maxed out and ultra textures at 1080p, so I guess the 6GB requirement is for 1440p+.


----------



## paulerxx

So I played this after playing Alien: Isolation, and this game looks just ugly and unoptimized compared to it.


----------



## The Source

Quote:


> Originally Posted by *Zipperly*
> 
> My bad, it was a war chief I was going after and he had 2 captains hanging out with him every time i tried the mission. Then the plethora of orcs........ my gosh.


Only two? Anytime I go after anyone in or near a stronghold, it's a constant stream and I can't get near my target no matter the tactic used. More than a little annoying.


----------



## The Source

Quote:


> Originally Posted by *BradleyW*
> 
> But at what settings?
> Are those fps results the min, max, avg?


If he's using the benchmark tool it doesn't matter; it doesn't represent in-game performance at all.


----------



## 2003M36sp

Here's 720p with everything maxed out and the HD texture pack. I don't know about the VRAM ceiling since I only have 3GB; just like Watch Dogs, it stays around 2700MB or more most of the time.

Either way it loves system RAM as well (I have 8GB), and the AMD 14.9 drivers aren't too bad, 90% GPU usage most of the time, better than the 14.7 beta I had for a while.

The game's fun. Holding A (which is X for me on a PS3 controller) to run isn't cool, but stunning him and the multi-strikes are cool.


----------



## Faded

Quote:


> Originally Posted by *2003M36sp*
> 
> Here's 720p everything maxed out with the HD Texture Pack, I dont know about the vram since i only have 3gb it's just like Watchdogs stays around 2700mb or more most of the time.
> 
> Either way it loves ram as well I have 8gb, and the AMD 14.9 drivers aren't to bad in the 90% gpu most of the time better then the 14.7 beta i had for a while.
> 
> 
> 
> 
> 
> Games fun A which is X for me (ps3) to run on the controller isn't cool stunning him n the multi strikes is cool.


Is that Eurotrip? Lucy Lawless cracked me up in that movie...


----------



## chronicfx

Quote:


> Originally Posted by *The Source*
> 
> If he's using the benchmark tool it doesn't matter. It doesnt represent in game performance at all.


Couldn't agree more! What a bunch of garbage to post; show me in-game, lol. I am playing at 1440p with no blur, high textures, high AO, all else ultra, and getting a solid 55-60fps with a single stock 290X. I can put everything on ultra and peg 100fps at 1440p with trifire; with friendly AFR and frame pacing on, I even feel like the dodges and blocks are way more responsive. It is smooth, but my Sound Blaster Zx has these very faint crackles and pops during dialogue whenever something gets frame-limited or vsynced; cutscenes do it a lot. Annoys the crap out of me. When I can play with vsync totally off it doesn't happen. But yes, my trifire seems to work ok. I have not tried dual crossfire, but I have always bought three cards because trifire seems to cure stutter.

Edit: yes, the menus are garbage. Disable CF, go in, set them, get out, enable, then play without touching them again.


----------



## Zipperly

Quote:


> Originally Posted by *paulerxx*
> 
> So played this after playing Alien Isolation and this game looks just ugly and unoptimized compared to it.


So you own SOM?


----------



## Zipperly

Quote:


> Originally Posted by *The Source*
> 
> Only two? Anytime I go after anyone in or near a stronghold, it's a constant stream and I can't get near my target no matter the tactic used. More than a little annoying.


No bro, 1 war chief and 2 captains, each one attacks and says what they have to say to me which is annoying as hell... and then the 50 or so orcs.


----------



## paulerxx

Quote:


> Originally Posted by *Zipperly*
> 
> So you own SOM?


I own both.


----------



## Zipperly

Quote:


> Originally Posted by *paulerxx*
> 
> I own both.


Ok, I do as well, but I don't get the feeling that SOM is ugly or unoptimized by comparison; one is a very impressive open-world game and the other is a corridor shooter.


----------



## paulerxx

Quote:


> Originally Posted by *Zipperly*
> 
> Ok I do as well but do not get the feeling that SOM is ugly or unoptimized by comparison, one is a very impressive open world game and the other is a corridor shooter.


All I'm saying is, Alien Isolation feels next generation to me and Shadow Of Mordor does not.


----------



## DIYDeath

Quote:


> Originally Posted by *The Source*
> 
> Only two? Anytime I go after anyone in or near a stronghold, it's a constant stream and I can't get near my target no matter the tactic used. More than a little annoying.


Get Shadow Strike so you can zip to the archers and pretend it's a Batman game; use counter and pwn 3/4 of everything attacking you. When you max your hit multiplier, use executions, and try to target them on berserkers, shield guys, or warchiefs/captains.

I utterly destroyed all 4 warchiefs yesterday doing that, and died twice.


----------



## n780tivs980

I think they shipped two different versions of SoM by mistake, but only to a few select people in this thread, because it looks immense. The character model textures are out of this world; the particle effects and the attention to detail across this massive open world are insane. All the little extras: dust moving under your feet, the rain that changes textures, being able to see the wind affect capes/banners/cloth.

All while being open world; you can see for miles, and all your enemies for miles. This is easily the most detailed and stable-running open-world game ever. I was not expecting much at all going in and have been blown away; they have put so much detail into so many different things.

Been playing Alien all night and they do not compare. Yes, Alien has great closed-off atmosphere, but it's just that: closed off. It's small corridors where the lighting is easy to project and control; you are not lighting an entire plain like in SoM. Character models in Alien are nowhere near on par with SoM either, nor are the particle effects.


----------



## Zipperly

Quote:


> Originally Posted by *paulerxx*
> 
> All I'm saying is, Alien Isolation feels next generation to me and Shadow Of Mordor does not.


I get you, well both feel next gen to me. Im loving these great releases that we are getting.


----------



## paulerxx

Quote:


> Originally Posted by *Zipperly*
> 
> I get you, well both feel next gen to me. Im loving these great releases that we are getting.


I really appreciate these great ports as well this generation. All I can say is keep them coming. I don't regret buying either game, not a bit.


----------



## Zipperly

Quote:


> Originally Posted by *n780tivs980*
> 
> Think they shipped two different version of SoM by mistake but only to a few select people in this thread, because it looks immense. The character model textures are out of this world, the particle effects the attention to detail over this massive open world is insane. All the little extras like dust moving under your feet, the rain that changes textures being able to see the wind effect capes/banners/cloth.
> 
> All while being open world, you can see for miles and all your enemies for miles. This is easily the most detailed and stable running open world game ever. I was not expecting much at all going in and have been blown away, they have put so much detail into so many different things.
> 
> Been playing aliens all night and they do not compare, yes aliens has great closed off atmosphere but it's just that closed off. It's small corridors where the lighting is easy to project and control, you are not lighting an entire plain like SoM. Character models in aliens are no where near on par with SoM either or particle effects.


My thoughts on this as well.


----------



## Zipperly

Quote:


> Originally Posted by *paulerxx*
> 
> I really appreciate these great ports aswell this generation. All I can say, is keep them coming. I don't regret buying either game. Not a bit.


----------



## sumitlian

Quote:


> Originally Posted by *BradleyW*
> 
> The 290X seems to be doing Ultra at 1440p and hovering between 55 to 70 fps on my end. Not a single stutter, most likely due to the 290X's insane memory bandwidth.


Before I sold my 280X three days ago, I benched it on SOM too.

FX-8350 @ 4.3GHz, CPU-NB 2600MHz
2 x 4GB 2400MHz, 10-10-12-32-1T
ASUS TOP 280X, 1180 core / 1710 memory
1080p 60Hz downsampled onto a 1600x900 60Hz 20" monitor.

Still good for an OCed 280X. I gained ~7fps with this overclock versus 1000 core / 1500 memory.


----------



## Zaid

Quote:


> Originally Posted by *n780tivs980*
> 
> Think they shipped two different version of SoM by mistake but only to a few select people in this thread, because it looks immense. The character model textures are out of this world, the particle effects the attention to detail over this massive open world is insane. All the little extras like dust moving under your feet, the rain that changes textures being able to see the wind effect capes/banners/cloth.
> 
> All while being open world, you can see for miles and all your enemies for miles. This is easily the most detailed and stable running open world game ever. I was not expecting much at all going in and have been blown away, they have put so much detail into so many different things.
> 
> Been playing aliens all night and they do not compare, yes aliens has great closed off atmosphere but it's just that closed off. It's small corridors where the lighting is easy to project and control, you are not lighting an entire plain like SoM. Character models in aliens are no where near on par with SoM either or particle effects.


I agree; for how much detail is on screen all the time, it's insane. My heavily modified Skyrim takes about 3GB of VRAM and it's nowhere near as good as SOM.

The only things I am not impressed by are the fires and the grass. But TBH it's the best grass I've seen in an RPG; I'm just picky when it comes to grass.


----------



## blackhole2013

Ok, fine, I was wrong: when lots of things are going on, my 3GB 780 was skipping with ultra textures... so you won, and I use high textures with everything else on ultra...


----------



## DiNet

Quote:


> Originally Posted by *BradleyW*
> 
> But at what settings?
> Are those fps results the min, max, avg?


Avg fps with the built-in benchmark, highest possible settings, 344 WHQL driver, and vsync enabled because it's a 60" screen.

Min is 22 at 4K and 50 at 1080p.
Max is 70 at 4K and >120 at 1080p.


----------



## paulerxx

Quote:


> Originally Posted by *blackhole2013*
> 
> Ok fine I was wrong when lots of things are going on my 780 3gb was skipping with Ultra textures.. so you won and I use high textures with everything else ultra ....


The game looks good either way bro. No foul


----------



## ElectroManiac

From what I see, ultra textures are not that much better than high; I don't think they're worth the extra VRAM, to be honest. The game with everything on ultra plus high textures looks amazing. Really having fun with this game.


----------



## paulerxx

Quote:


> Originally Posted by *ElectroManiac*
> 
> For what I see ultra texture are not that much better than high. I think is not worth the extra vram to be honest. The game on everything on ultra plus high texture look amazing. Really having fun on this game.


I agree, everything maxed with high textures and 150% = beautiful game.


----------



## n780tivs980

Got a frog-looking, shield-using warchief at power level 19 (he started at 15) kicking my butt every time; it's the first time I have died. He has some sort of AoE poison that keeps interrupting my combos, he has no second-chance buff, and he has a decimate attack that almost one-shots me.


----------



## cstkl1

Quote:


> Originally Posted by *n780tivs980*
> 
> Got a frog looking shield using warchief at power level 19(started at 15) kicking ma butt every time, it's the first time I have died. He has some sort of aoe poison that keeps interupting my combos, he has no second chance buff and he has decimate attack that almost 1 shots me.


As you go higher, intel on their weaknesses is pretty important:
Brand all archers.
Kill/brand all shield/berserker guys.
Brand after a few combos every time.
If possible, brand the spear-throwing dudes.
Keep changing fighting areas.
Shadow strike a few people, and later somebody far away, and drain them.

Where I am playing atm, one-on-one is impossible, because before my first strike he will blow the horn. And they all seem to have poison, which affects the block ability.


----------



## Flames21891

Quote:


> Originally Posted by *cstkl1*
> 
> As u go higher intel on their weakness is pretty important.
> Brand all archers.
> Kill/brand all shield/berserker guys
> Brand after a few combo everytime.
> If possible brand the spear throwing dudes.
> Keep changing fighting areas
> Shadow strike a few ppl n later somebody far away n drain them.
> 
> Where i am playing atm one to one is impossible cause before my first strike he will blow the horn. N they all seem to have poison which will affect block ability.


Easy way to deal with tough captains later in the game is to farm epic runes. There's some pretty godlike ones out there. Right now I have one that makes me immune to poison, one that increases my bow damage by 100% if my HP is below 25%, one that recharges Storm of Urfael in half as many combat finishers, and one that fully recharges my health and focus instantly if I kill a captain or warchief. Putting a couple of high level HP gain runes on your sword is also a very good idea. Right now I'm guaranteed to get 10% of my HP back on a flurry kill, and I have a 67% chance of getting all my HP back once my combo reaches 30.

Also you're basically unkillable once you get Storm of Urfael. I have yet to see an orc that is 100% immune to combat finishers, so my tactic is Wraith Stun>Flurry Combo>Double Execute>Pop Storm of Urfael and spam until dead.


----------



## Juub

Wait so I downloaded the game and set everything to Ultra and had no issues hitting above 60 with my 970. Now I see a 3.7GB download on my Steam. Does that mean I was running on Very High all along and not Ultra?


----------



## ElectroManiac

Quote:


> Originally Posted by *Juub*
> 
> Wait so I downloaded the game and set everything to Ultra and had no issues hitting above 60 with my 970. Now I see a 3.7GB download on my Steam. Does that mean I was running on Very High all along and not Ultra?


Yes


----------



## Juub

So I tried again and the fps is mostly at 60 except dips are much more frequent now.

Edit: Frame rate mostly stays in the high 70's to mid 80's.


----------



## BradleyW

Did anyone notice lower fps on the second map? Had to turn Veg from Ultra to High for a huge fps boost.


----------



## CryphicKing

Quote:


> Originally Posted by *Juub*
> 
> So I tried again and the fps is mostly at 60 except dips are much more frequent now.
> 
> Edit: Frame rate mostly stays in the high 70's to mid 80's.


You will find the second map to be far more demanding than the first, once high vegetation density plus some army-vs-army strategy elements kick in.


----------



## cstkl1

Quote:


> Originally Posted by *CryphicKing*
> 
> you will find second map to be far more demanding than the first, when high vege density + some army vs army strategy elements kicks in.


That's a VRAM issue. On TB it is still the same.


----------



## DIYDeath

Quote:


> Originally Posted by *BradleyW*
> 
> Did anyone notice lower fps on the second map? Had to turn Veg from Ultra to High for a huge fps boost.


I get stutters in the 2nd map from time to time. The 1st was no problem at all; maybe a stutter if I was unlucky enough to end up fighting 50 orcs at once.


----------



## BradleyW

Quote:


> Originally Posted by *cstkl1*
> 
> Thats a vram issue. On tb it is still the same.


No, it is the rendering requirement for the vast vegetation. The 1st map has little to no vegetation!


----------



## cstkl1

Quote:


> Originally Posted by *BradleyW*
> 
> No, it is the rendering requirement for the vast vegetation. 1st map has little to no veggy!


My fps is the same.


----------



## BusterOddo

Quote:


> Originally Posted by *BradleyW*
> 
> Did anyone notice lower fps on the second map? Had to turn Veg from Ultra to High for a huge fps boost.


Yes, the second half was much more demanding, most likely due to the increased vegetation.







The grass is highly detailed as far as the eye can see.


----------



## Chargeit

I haven't noticed any decreased performance in the 2nd half of the game. Of course, I just set it to adaptive Vsync at half refresh rate early in the game and ran with it. So, maxed out at 1080p +150% with High textures, it runs stable at half refresh rate throughout the game.

I'll take a stable 30 over a fluctuating 60.


----------



## BradleyW

Quote:


> Originally Posted by *Chargeit*
> 
> I haven't noticed any decreased performance in the 2nd half of the game. Of course, I just set it to adaptive Vsync half refresh rate early in the game and ran with it. So, maxed 1080 +150% High textures at half refresh rate runs stable throughout the game.
> 
> I'll take a stable 30 over a fluctuating 60.


That's why you did not see the fps reduction, because with your system you'll never dip below 30 at that resolution.








Only thing I hate about 30 fps is the massive mouse lag.


----------



## Chargeit

Quote:


> Originally Posted by *BradleyW*
> 
> That's why you did not see the fps reduction, because with your system you'll never dip below 30 at that resolution.
> 
> 
> 
> 
> 
> 
> 
> 
> Only thing I hate about 30 fps is the massive mouse lag.


Yea, the adaptive Vsync half refresh rate keeps the game at a very constant fps/frame timing.

I play this game with a controller. 30 fps with a KB&M isn't nearly as forgiving. I can take it on a controller. I'm fairly sure if I showed a friend this game right now I could tell them it was running 60 fps and they'd believe me.


----------



## Zipperly

Game controller FTW with this one.


----------



## Chargeit

Quote:


> Originally Posted by *Zipperly*
> 
> Game controller FTW with this one.


Man, I don't see how people can play a game like this without one, and I've been computer gaming since the early/mid 90's. Of course, the first video game I remember playing was Super Mario on the NES as a young kid in the 80's. I grew up using both KB&M and controllers. Since I can take advantage of both, there isn't much point in not using the method that works best for a given game. I wouldn't play a racing game with a KB&M any more than I'd play an FPS with a controller, for instance.

Speaking of controllers, ever since adding this 3rd monitor I've been thinking more and more about getting a flight stick... Though I have no flying games right now, I've been eyeballing a few space sims.


----------



## Zipperly

Quote:


> Originally Posted by *Chargeit*
> 
> Man, I don't see how people can play a game like this without one, and I've been computer gaming since the early/mid 90's. Of course, the first video game I remember playing was Super Mario on the NES as young kid in the 80's. I grew up using both kb&m and controllers. Since I can take advantage of both, there isn't much point in me not using the method that works best for a given game. I wouldn't play a racing game with a kb&m anymore then I'd play a fps with a controller for instance.
> 
> Speaking of controllers, ever since adding this 3rd monitor I've been thinking more and more about getting a flight stick... Though I have no flying games right now, I've been eyeballing a few space sims.


Same here, man; in fact I play most of my games with the controller now. Only first-person shooters get the attention of my keyboard and mouse.


----------



## Murlocke

Wouldn't a controller in this game put you at an extreme disadvantage for bow headshots? I mean, I can kill 10 orcs with my bow before focus runs out. Can't imagine doing that with a controller.


----------



## Zipperly

Quote:


> Originally Posted by *Murlocke*
> 
> Controller in this game would put you at an extreme disadvantage for bow headshots, no? I mean I can kill 10 orcs with my bow before focus runs out. Can't imagine doing that with a controller.


No not at all, I do that easily with a controller.


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> No not at all, I do that easily with a controller.


But that's because you are a controller legend.








You've got a gold plated 360 controller haven't you! Just having some fun with ya!


----------



## Zipperly

Quote:


> Originally Posted by *BradleyW*
> 
> But that's because you are a controller legend.
> 
> 
> 
> 
> 
> 
> 
> 
> You've got a gold plated 360 controller haven't you! Just having some fun with ya!


No doubt it's easier to do those headshots with a keyboard and mouse; I'm just rather good with the controller as well. The good thing is that it's very easy to just grab the keyboard and mouse if you really need them for certain parts; the game will pick up the keyboard and mouse controls without you having to change anything in the options.


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> No doubt its easier with a keyboard and mouse to do those head shots, im just rather good with the controller as well. Good thing is that its very easy to just grab the keyboard and mouse if you really need it for those certain parts, the game will pick up the keyboard and mouse controls without having to change anything in the options.


Yeah, that's true, you can change on the fly. Might be tricky to switch mid-combat to shoot someone in the head, though.


----------



## Zipperly

Quote:


> Originally Posted by *BradleyW*
> 
> Yeah that's true you can change on the fly. Might be tricky to change during heated mid combat to shoot someone in the head.


Well, yeah, it could be, but it's not something I have to deal with; 100 percent controller for me on this one.


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> Well yeah it could be but its not something I have to deal with, 100 percent controller for me on this one.


I just used the mouse. The mouse support was not as good as I hoped: it felt delayed and movement was shoddy. The keyboard was perfect, though. The mouse in combat felt much quicker, so maybe it was intentional.


----------



## Chargeit

Quote:


> Originally Posted by *Murlocke*
> 
> Controller in this game would put you at an extreme disadvantage for bow headshots, no? I mean I can kill 10 orcs with my bow before focus runs out. Can't imagine doing that with a controller.


Yea, that's the one spot where I don't like a controller. Fighting and just about everything else works out better. I'd be worried about breaking my mouse and keyboard playing this game.

I did notice that the game lets you switch between KB&M and controller on the fly. You could use the controller for most things, and then use the KB&M for aiming... The only issue for me is, I invert the Y axis when using a controller, which means the mouse is also inverted. Inverting the Y axis on a controller is old school I guess, but I'm so used to doing it that I can't play the other way.


----------



## Zaid

Quote:


> Originally Posted by *BradleyW*
> 
> Yeah that's true you can change on the fly. Might be tricky to change during heated mid combat to shoot someone in the head.


Not at all. With Tomb Raider I would play with a controller 90% of the time, until I got to those crazy fight scenes with over 20 people shooting at you. I would get behind cover and put my controller down; the mouse and keyboard are already in front of me, so I just grab them and start unloading headshots.

I haven't tried it with SoM, mostly because I really like how the keyboard and mouse are set up.


----------



## BradleyW

Quote:


> Originally Posted by *Zaid*
> 
> not at all, with tombraider i would play with a controller 90% of the time until i got to those crazy fight scenes with over 20 ppl shooting at you. i would get behind cover and put my controller down and mouse n keyboard are already infront of me so i just grab em and start unloading headshots.
> 
> i haven't tried it with SOM, mostly because i really like how the keyboard and mouse are set up.


My comment is about SOM.


----------



## Dasboogieman

Quote:


> Originally Posted by *JoHnYBLaZe*
> 
> Wow....so nvidia technically doesn't even have a card to run this game
> 
> I meant they created demand for more cards even though 970's, 780's and up are plenty powerful....
> 
> For the prices of these cards, this whole vram business is a little too SLIMEY for me
> 
> I can't think of any product off hand that, for these prices, degrades in performance in such a short time


NVIDIA has always been conservative with VRAM capacity. It helps the profit margins by ensuring their designs are cheaper to build and by forcing early upgrades for otherwise perfectly adequate GPU cores.
The 780 Ti would have been so perfect with 6GB of VRAM. In this regard, Hawaii cards are slightly better equipped than the regular 780 Ti, but you have to deal with the weaker core, power, heat, and driver issues.


----------



## Noufel

I hate that crossbow captain; he killed me 5 times.







I'll cut him in half next time I meet him.


----------



## Swolern

Loving this game so far! The finishing moves are gruesome and bloody awesome!!









My VRAM sits at about 5.5-5.8GB usage on a single Titan with the Ultra preset @2560x1440, and I'm averaging 60-75fps.


----------



## FreeElectron

Does ultra enable AA?


----------



## Juub

So I unlocked Critical Strikes, but it doesn't seem to be working. I figure it's like in the Batman games, where your combo counter increases by 2 instead of 1 when you time your strikes by tapping exactly once as you hit an enemy. Sometimes it works but sometimes it doesn't.


----------



## Wezzor

So is there any big difference between Ultra and High on the textures?


----------



## chronicfx

Several gigabytes' worth, apparently.
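For scale: texture memory grows with the square of resolution, which is why a higher-resolution texture pack can add gigabytes across thousands of assets. A rough back-of-the-envelope sketch (the resolutions and uncompressed RGBA format here are illustrative assumptions, not the game's actual asset formats):

```python
def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM footprint of one uncompressed RGBA texture.
    A full mipmap chain adds roughly one third on top of the base level."""
    base = width * height * bytes_per_pixel / (1024 ** 2)
    return base * 4 / 3 if mipmaps else base

high = texture_mb(2048, 2048)    # ~21.3 MB per texture
ultra = texture_mb(4096, 4096)   # ~85.3 MB per texture
print(f"High: {high:.1f} MB, Ultra: {ultra:.1f} MB, ratio: {ultra / high:.0f}x")
```

In practice games use block-compressed formats (DXT/BC), which cut these numbers several-fold, but the quadratic scaling with resolution is the same.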


----------



## chronicfx

Quote:


> Originally Posted by *Swolern*
> 
> Loving this game so far! The finishing moves are gruesome and bloody awesome!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My VRAM sits at about 5.5-5.8GB use on a single Titan with Ultra presets @2560x1440 and im averaging 60-75fps.


I don't know if this was ever looked at, but during all the VRAM commotion over Watch Dogs, some noticed that people with mechanical HDDs were having more issues than those with SSDs. Could be another piece of the puzzle, as those textures have to come from somewhere.


----------



## Zipperly

Quote:


> Originally Posted by *Wezzor*
> 
> So is there any big difference between Ultra and High on the textures?


Aside from VRAM usage? Not really, and unless you run a much higher resolution than 1080p you won't notice much of anything. There are some comparison screenshots floating around, and you will be hard pressed to find any noticeable difference.


----------



## Clazman55

Quote:


> Originally Posted by *Dasboogieman*
> 
> NVIDIA have always been conservative with VRAM capacity. It helps the profit margins by ensuring their designs are cheaper to build and by forcing early upgrades for otherwise perfectly adequate GPU cores.
> The 780 Ti would have been so perfect with 6GB of VRAM. In this regard, Hawaii cards are slightly better equipped than the regular 780 Ti, but you have to deal with the weaker core, power, heat, and driver issues.


You don't really need the VRAM. The game will use it, if it is there though.


----------



## The Source

Quote:


> Originally Posted by *Wezzor*
> 
> So is there any big difference between Ultra and High on the textures?


Since "bigger" is going to be as subjective as is gets around here, there is a difference and it is clear despite what the cross-eyed crowd is saying.


----------



## Chargeit

Quote:


> Originally Posted by *Wezzor*
> 
> So is there any big difference between Ultra and High on the textures?


There is a difference. It isn't huge, but I do notice it. Does it take away from the visuals? No.

The high textures still look good, and I'm not sure that everyone could pick out the differences.

Though the high textures look good, because of this game I will not buy a GPU with less than 6GB of VRAM.


----------



## BradleyW

Quote:


> Originally Posted by *Chargeit*
> 
> There is a difference. It isn't huge, but I do notice it. Does it take away from the visuals? No.
> 
> The high textures still look good, and I'm not sure that everyone could pick out the differences.
> 
> Though the high textures look good, because of this game I will not buy a GPU with less then 6gb of Vram.


I did some comparisons of my own with .bmp screenshots at 1080p and did not notice much of a difference at all. Maybe that was due to the 1080p resolution? I assume .bmp is fine as a file format for viewing, right?


----------



## bfromcolo

Wow, 128 pages in this thread already.

I am interested in this game, but I haven't seen a lot of reports from people with lesser cards; I have a 7850 1GB. Anyone with something similar running this? Is there any point in even trying? I don't want to buy a $50 game that requires me to purchase a $300 card to run it.


----------



## BradleyW

Quote:


> Originally Posted by *bfromcolo*
> 
> Wow 128 pages in this thread already.
> 
> I am interested in this game but I haven't seen a lot of reports from people with lesser cards, I have a 7850 1G. Anyone with similar running this? Is there any point in even trying? I dont want to buy a $50 game that requires I purchase a $300 card to run it.


I think a 7850 would struggle. I'd aim for low at 1080p or 720p @ 30fps.


----------



## Valeriant

I finally got this game today! Yay!

Just benchmarking at 1920x1200 (60Hz) on a GTX 780 6GB with ultra settings (default), the game ate 4.7GB of VRAM!
Playing until the game showed the first Uruk captain as a target and then quitting, VRAM usage was already at 98%.

I think the game keeps textures loaded until it maxes out the VRAM, and starts discarding old ones when new textures are needed. IMO this is a good strategy for an open-world game: you don't need to reload textures when the player runs back to a previous area. If new textures are needed, just discard the oldest ones from the cache. BUT with low VRAM, the previous area you ran back to must be loaded again, hence the need for more VRAM. Um, just a thought.

I'm installing the separate HD texture pack now.
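The keep-until-full, discard-oldest behavior described above is essentially an LRU cache bounded by a VRAM budget. A minimal sketch of that strategy (the class, sizes, and texture names are hypothetical, purely to illustrate the eviction pattern, not the game's actual streaming code):

```python
from collections import OrderedDict

class TextureCache:
    """Minimal LRU texture cache: textures stay resident until the VRAM
    budget is exceeded, then the least recently used ones are evicted."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self._cache = OrderedDict()  # texture_id -> size_mb

    def request(self, texture_id, size_mb):
        """Return True on a cache hit, False if the texture had to be (re)loaded."""
        if texture_id in self._cache:
            self._cache.move_to_end(texture_id)  # mark as recently used
            return True
        # Evict the oldest textures until the new one fits in the budget.
        while self.used_mb + size_mb > self.budget_mb and self._cache:
            _, evicted_mb = self._cache.popitem(last=False)
            self.used_mb -= evicted_mb
        self._cache[texture_id] = size_mb
        self.used_mb += size_mb
        return False

cache = TextureCache(budget_mb=100)
cache.request("area1", 60)         # miss: loaded
cache.request("area2", 60)         # miss: area1 evicted to make room
print(cache.request("area1", 60))  # False: backtracking forces a reload
```

With a budget larger than the working set (more VRAM), revisited areas stay resident and never reload; with a small budget, backtracking forces reloads, which matches the behavior described above.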


----------



## Chargeit

Quote:


> Originally Posted by *BradleyW*
> 
> I did some comparisons of my own with .bmp screenshots at 1080p. Did not notice much of a difference at all. Maybe that was due to "1080p"? I assume .bmp would be fine as a file format for viewing, right?


There's a difference. The orcs and the landscape have more depth when set to ultra: little things like the skin crevices on their elbows, and more/crisper layers of shading on their backs. That doesn't mean high looks bad; it just means ultra makes everything stick out a little more. At least from what I could see.

Anyway, like I mentioned, it isn't something you can just pick out without really looking for it.

I say if you have a card that can run the game on high textures (3GB+), then there isn't a reason not to get it. The game will look about as good as it would with ultra textures.


----------



## BradleyW

Quote:


> Originally Posted by *Chargeit*
> 
> There's a difference. The orcs and the landscape has more depth when set to ultra, little things like the skin crevices on their elbows, more/crisper layers of shading on their backs. That doesn't mean high looks bad, it just means ultra makes everything stick out a little more. Least from what I could see.
> 
> Anyway, like I mentioned, it isn't so bad that you can just pick it out without really looking for it.
> 
> I say if you have a card that can run the game on high textures (3gb+), then there isn't a reason to not get it. The game will look about as good as it would with ultra textures.


Yeah, you are correct about the added depth. But during normal gameplay, I could not see the difference until I started doing screenshot comparisons.


----------



## Chargeit

Quote:


> Originally Posted by *BradleyW*
> 
> Yeah you are correct on the added depth. But during normal gameplay, I could not see the difference until I started doing screenshot comparisons.


Yea, it is minor.

Now, I saw a YouTuber playing on a 2GB 770, and it looked horrible. I'm assuming he had the textures set to low, because it didn't look right at all.

All in all I'm enjoying the game. I do get annoyed that I can't max out textures, but that's only because I'm being anal, not because it dilutes the gaming experience.


----------



## jojoenglish85

I can see a difference. I game at 1440p and it looks amazing; just burned about 4 hours straight and I'm about to take another plunge back into the game.








Everything maxed out on Ultra, by the way, via a GTX 970 SC Non-ACX.


----------



## KenjiS

Quote:


> Originally Posted by *Chargeit*
> 
> Yea, it is minor.
> 
> Now, I saw a youtuber playing on a 2gb 770, and it looked horrible. I'm assuming he had the textures set to low, because it didn't look right at all.
> 
> All in all I'm enjoying the game. I do get annoyed that I can't max out textures, but, that's only because I'm being anal, not because it dilutes the gaming experience.


Weird, before my 970 got here I played on my 770 and it looked really darn good at Very High and 1440p...

Ultra is out on a 2gb card, but Very High still looks extremely good and plays extremely well


----------



## Chargeit

Quote:


> Originally Posted by *KenjiS*
> 
> Weird, before my 970 got here I played on my 770 and it looked really darn good at Very High and 1440p...
> 
> Ultra is out on a 2gb card, but Very High still looks extremely good and plays extremely well


I'm not so sure the guy is great with tinkering with his settings.


----------



## Zipperly

Quote:


> Originally Posted by *KenjiS*
> 
> Weird, before my 970 got here I played on my 770 and it looked really darn good at Very High and 1440p...
> 
> Ultra is out on a 2gb card, but Very High still looks extremely good and plays extremely well


LOL, I know you don't mean to, but it bugs me when people say "very high" when there is no "very high" setting.


----------



## Juub

Quote:


> Originally Posted by *Zipperly*
> 
> LOL I know you dont mean to but it bugs me when people say "very high" when there is no "very high" setting.


There actually is a ''Very High'' preset.


----------



## Zipperly

Quote:


> Originally Posted by *Juub*
> 
> There actually is a ''Very High'' preset.


Yeah, I'm talking about the actual individual options; there it's either high or ultra. I never messed with the presets, so my bad on that one.


----------



## KenjiS

Quote:


> Originally Posted by *Zipperly*
> 
> Yeah im talking the actual different options, its either high or ultra. I never messed with the preset so my bad on that one.


Yeah, I was gonna say: Very High is basically High textures and Ultra everything else.

It's pretty much 95% of Ultra and runs on 2GB GPUs just fine.


----------



## chronicfx

Tactical question: when there is a guy with intel but he is in a group, how do you keep from killing him? Do I have to start using throw? Sometimes I am accidentally killing these guys. Also, are you vulnerable when you grab them and interrogate them? Like, if I stunned the intel guy and just grabbed him in the crowd, would it work?


----------



## Darklyric

Quote:


> Originally Posted by *chronicfx*
> 
> Tactical question. When there is a guy with intel but he is in a group, how do you keep from killing him? Do i have to start using throw? Sometimes I am accidentally killing these guys. Also are you vulnerable when you grab them up and interrogate them, like if I stunned the intel guy and just grabbed him in the crowd would it work?


I just draw them all in, get him stunned, then pull out the bow and cap everyone close, which usually works. Not always, lol.


----------



## Flames21891

Quote:


> Originally Posted by *chronicfx*
> 
> Tactical question. When there is a guy with intel but he is in a group, how do you keep from killing him? Do i have to start using throw? Sometimes I am accidentally killing these guys. Also are you vulnerable when you grab them up and interrogate them, like if I stunned the intel guy and just grabbed him in the crowd would it work?


Counters don't seem to do damage (and if they do, it's extremely low), so I just keep countering him without doing any follow-up attacks while dispatching everyone else as fast as possible.

Also, you can grab them outright, but you are vulnerable while doing so. The fun thing, though, is that once you get past a certain point in the animation, you become immune to everything around you (time seems to stop, in fact) and can interrogate them without worry.

The best tactic is to whittle down their numbers while only countering your target; when there are about four or fewer orcs left, you usually have a pretty good chance of a successful interrogation if you just go for it.


----------



## HowHardCanItBe

This is nothing more than a way to drive up graphics card sales, plus lazy porting. People will fall for it no matter what, though. Let someone else be brave enough to show the difference between the texture packs. Not that texture packs have ever made a huge difference in the visuals for me. At least not the ones provided by the developers; the texture packs from the modding community, on the other hand, are a totally different story.


----------



## Juub

Quote:


> Originally Posted by *5entinel*
> 
> This is nothing more than to drive up the sales of graphics cards and lazy porting. People will fall for this though no matter what. Let someone else be brave enough to show the difference b/w the texture packs. Not that texture packs have really made a huge difference in the visuals for me.


6GB of VRAM is for advertisement purposes. The game loads a huge map and applies textures to the entirety of it in one loading screen. From the grass up close to the enemies in the distance, everything looks gorgeous. I'd say for such a good-looking open-world game the VRAM requirements are justified, but I run this game perfectly with my 970.

Also, the port is actually really good, as the frame rate is high even on 770s and there is no stutter at all. I consider it a very high quality port.









Mouse and keyboard presets, an Ultra preset exclusive to PC, unlocked frame rate, and plentiful settings to tweak. I'd say this is a PC port done right. I'd be extremely happy if every PC port were like that.


----------



## Zipperly

Quote:


> Originally Posted by *5entinel*
> 
> This is nothing more than to drive up the sales of graphics cards and lazy porting. People will fall for this though no matter what. Let someone else be brave enough to show the difference b/w the texture packs. Not that texture packs have really made a huge difference in the visuals for me. At least the texture packs provided by the developers. The texture packs provided by the modding community on the other hand is a totally different story.


Dude, this is one of the best ports to come out in a very long time... please don't talk like this when you don't even own the game.


----------



## DIYDeath

Quote:


> Originally Posted by *Juub*
> 
> 6GB of VRAM is for advertisement purposes. The game loads a huge map and applies textures to the entirety of it in one loading screen. From the grass up close to the enemies in the distance, everything looks gorgeous. I'd say for such a good looking open-world game, the VRAM requirements are justified but I run this game perfectly with my 970.
> 
> Also, the port is actually really good as the frame rate is high even on 770's and there is no stutter at all. I consider it a Very High quality port.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mouse and Keyboard presets, Ultra preset exclusive to PC, unlocked frame rate, settings are plentiful to tweak. I'd say this is a PC port done right. I'd be extremely happy if every PC port was like that.


Square Enix could learn a thing or two...if they decide to port another FF game to PC.


----------






## KenjiS

Quote:


> Originally Posted by *Juub*
> 
> 6GB of VRAM is for advertisement purposes. The game loads a huge map and applies textures to the entirety of it in one loading screen. From the grass up close to the enemies in the distance, everything looks gorgeous. I'd say for such a good looking open-world game, the VRAM requirements are justified but I run this game perfectly with my 970.
> 
> Also, the port is actually really good as the frame rate is high even on 770's and there is no stutter at all. I consider it a Very High quality port.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mouse and Keyboard presets, Ultra preset exclusive to PC, unlocked frame rate, settings are plentiful to tweak. I'd say this is a PC port done right. I'd be extremely happy if every PC port was like that.


This right here

That's something I keep forgetting to mention: the loading times are pretty bloody fast, and there's NEVER any slowdown or loading, really, when you, say, fast travel or anything else. INCREDIBLY impressed; it feels like I'm running it off an SSD.


----------



## Juub

Quote:


> Originally Posted by *KenjiS*
> 
> This right here
> 
> Thats something i keep forgetting to mention, the loading times are pretty bloody fast and theres NEVER any slow down or loading really when you say fast travel or anything else, INCREDIBLY impressed, feels like im running an SSD on it


I'm 6hrs into the game and I see no fast travel option. Is it an ability you unlock later on?


----------



## KenjiS

Quote:


> Originally Posted by *Juub*
> 
> I'm 6hrs into the game and I see no fast travel option. Is it an ability you unlock later on?


The towers you unlock with the Wraith can be fast traveled between by going to the map and pressing x


----------



## Juub

Quote:


> Originally Posted by *KenjiS*
> 
> The towers you unlock with the Wraith can be fast traveled between by going to the map and pressing x


Huh, really, so that's what they are for? I thought they were just respawn points. Guess I'll keep playing as if I didn't know.


----------



## TPCbench

I ran the built-in benchmark tool of Shadow of Mordor.

1080p, Very High quality preset
Avg fps = 47
Min fps = 29

1080p, Ultra quality preset
Avg fps = 30
Min fps = 5

Pentium G3258 @ 4.2 GHz + Palit GTX 750 Ti StormX Dual 2GB

The game is not playable for me when using Ultra textures, even if I reduce the shadows and ambient occlusion. It is very likely that the 2GB of VRAM is the culprit.

CPU usage is around 80%.

Sticking to the Very High quality preset.


----------



## Swolern

Quote:


> Originally Posted by *Juub*
> 
> Huh really so that's what they are for? I thought they were just respawnint points. Guess I'll keep playing as if I didn't know.


You have to climb to the top of the tower and forge the anvil at the top. I missed a couple of towers before I realized that too.


----------



## DiNet

This game is a Trojan horse from Nvidia.
Seeing it at 4K DSR really makes me want to buy two 980s...
I'm so glad that it's a fairly short game and I'm done with it.


----------



## DIYDeath

Quote:


> Originally Posted by *DiNet*
> 
> This game is a Trojan horse from Nvidia.
> Seeing it at 4K DSR really makes me want to buy two 980s...
> I'm so glad that it's a fairly short game and I'm done with it.


You know DSR works with Kepler too, right? Just enable it through Nvidia Inspector. Old cards will hate you for it, but cards like the Titan and maybe the 780/780 Ti won't mind.


----------



## Valeriant

I can't really feel the impact of the Ultra texture pack. I only notice more detailed and sharper images on grounds/walls, and only within a certain radius of Talion; I don't see any improvement on faraway objects. The first pic shows the white area on the ground near Talion being more refined and not blurry, and likewise objects like the wood planks in the second pic.




Maybe the Ultra texture pack would be good with Nvidia DSR. Hmm, right... the game has that setting; I will try the in-game DSR.


----------



## cstkl1

Quote:


> Originally Posted by *Valeriant*
> 
> I can't really felt the impact with Ultra Texture Pack, I only notices more detailed and sharper images on grounds/walls and that is only within certain radius from Talion. Don't see improvement on faraway objects. The first pic shows the white area on the ground near Talion having more refined and not blurry, and also objects like the wood planks on the second pic.
> 
> 
> 
> 
> 
> Maybe the Ultra Texture Pack would be good for Nvidia DSR. Hmm, right... the game has that setting, will try using the in-game DSR.


On both my rigs there is a difference: weather, smoke, etc. The 2nd map makes a lot of difference.
One thing, though: with G-Sync this game is awesome.
Big difference.


----------



## Wezzor

Just bought the game and currently downloading it.
Anyway, does anyone here use SweetFX, or is there really no need for it since the game already looks so good?


----------



## rbarrett96

I just saw a refurb Sapphire 6GB 7970 GHz Edition for $299. Seriously thinking of selling my 680 4GB Twin Frozr and buying two of these. What do you all think? Should I do it? How much could I get for my 680 if it's in good condition? I'm seeing really inflated prices on eBay because the card is hard to find, but I don't see many selling. My other option is to buy a used Twin Frozr 4GB for $315.


----------



## Juub

Quote:


> Originally Posted by *rbarrett96*
> 
> I just saw a refurb Sapphire 6GB 7970 GHz Edition for $299. Seriously thinking of selling my 680 4GB Twin Frozr and buying two of these. What do you all think? Should I do it? How much could I get for my 680 if it's in good condition? I'm seeing really inflated prices on eBay because the card is hard to find, but I don't see many selling. My other option is to buy a used Twin Frozr 4GB for $315.


Where do you live? Buy a 970 for $30-50 more at that price. 4GB is plenty for Ultra, and a 970 slaughters a 7970 by sometimes 50%.


----------



## rbarrett96

Well, the idea was to have 6GB, as that's what this game stated was required for ultra textures. And wouldn't crossfiring the 7970s be more than enough to outperform a 980, let alone a 970?


----------



## clerick

Quote:


> Originally Posted by *rbarrett96*
> 
> Well, the idea was to have 6GB, as that's what this game stated was required for ultra textures. And wouldn't crossfiring the 7970s be more than enough to outperform a 980, let alone a 970?


The pre-290X era cards like the 7xxx series have really poor crossfire stutter. Even with frame pacing it was still incredibly stuttery. I went from 2x 7950s to a single 970 and the difference was night and day.


----------



## chronicfx

Quote:


> Originally Posted by *clerick*
> 
> The pre-290X era cards like the 7xxx series have really poor crossfire stutter. Even with frame pacing it was still incredibly stuttery. I went from 2x 7950s to a single 970 and the difference was night and day.


Never mind, had my colors mixed up...


----------



## rbarrett96

I suppose with the price they are now, a pair of 290's would be good in crossfire, but wouldn't have the extra vram...


----------



## Juub

Quote:


> Originally Posted by *rbarrett96*
> 
> Well, the idea was to have 6GB, as that's what this game stated was required for ultra textures. And wouldn't crossfiring the 7970s be more than enough to outperform a 980, let alone a 970?


Always take the more powerful single card over dual weaker ones. The HD 7000 series had/still has horrible stutter when paired.

Quote:


> Originally Posted by *clerick*
> 
> The pre-290X era cards like the 7xxx series have really poor crossfire stutter. Even with frame pacing it was still incredibly stuttery. I went from 2x 7950s to a single 970 and the difference was night and day.


Exact same thing I did. My peak performance isn't as high but the consistency and smoothness make it totally worth it.









Quote:


> Originally Posted by *rbarrett96*
> 
> I suppose with the price they are now, a pair of 290's would be good in crossfire, but wouldn't have the extra vram...


Unless you can get them for less, there's no point in taking a R9 290 over a 970.

Edit: Also, it's highly doubtful anything will require over 4GB of VRAM at 1080p. I can play Shadow of Mordor at 2560x1440 and VRAM is still plentiful.


----------



## Faded

Quote:


> Originally Posted by *rbarrett96*
> 
> I just saw a refurb Sapphire 6 GB 7970 Gigahertz edition for $299. Seriously thinking of selling my 680 4 GB Twin Frozr and buying two of these. What do you all think? Should I do it? How much could I get for my 680 if it's in good condition? I'm seeing really inflated prices on ebay because the care is hard to find, but I don't see many selling. My other option is to buy a used Twin Frozr 4 GB for $315.


There's a 4GB 290X on sale for $349 at Newegg. It's not 6GB, but that's a damn good deal for that card.


----------



## Chargeit

Quote:


> Originally Posted by *rbarrett96*
> 
> I just saw a refurb Sapphire 6 GB 7970 Gigahertz edition for $299. Seriously thinking of selling my 680 4 GB Twin Frozr and buying two of these. What do you all think? Should I do it? How much could I get for my 680 if it's in good condition? I'm seeing really inflated prices on ebay because the care is hard to find, but I don't see many selling. My other option is to buy a used Twin Frozr 4 GB for $315.


I wouldn't buy a 7xxx line GPU for going crossfire if I were you.
Quote:


> Originally Posted by *Faded*
> 
> There's a 4GB 290X on sale for $349 at Newegg. It's not 6GB, but that's a damn good deal for that card.


I'll just keep my mouth shut about this one.


----------



## Swolern

Quote:


> Originally Posted by *cstkl1*
> 
> On my both rig there is a difference. Weather, smoke etc. 2nd map makes a lot of difference.
> One thing though with gsync this game is awesome.
> Big difference .


2 Blackies & a Swift!! Man I bet that is absolutely gorgeous. What fps are you getting in-game, with what settings?


----------



## Zipperly

Quote:


> Originally Posted by *Chargeit*
> 
> I'll just keep my mouth shut about this one.


I wont, get the GTX 970 instead.


----------



## cstkl1

Quote:


> Originally Posted by *Swolern*
> 
> 2 Blackies & a Swift!! Man I bet that is absolutely gorgeous. What fps are you getting in-game, with what settings?


Only running on one card. The SLI fix has a lot of issues, like white lines. It only improves fps when running around; the VRAM hit happens more often and performance tanks further.

Will wait for a profile.


----------



## supermi

Quote:


> Originally Posted by *cstkl1*
> 
> Only running on one card. The SLI fix has a lot of issues, like white lines. It only improves fps when running around; the VRAM hit happens more often and performance tanks further.
> 
> Will wait for a profile.


What is your GPU usage like with the broken SLI?

On three monitors I'm getting just about 99% usage, but I'm unsure if it's showing itself as more fps...


----------



## cstkl1

Quote:


> Originally Posted by *supermi*
> 
> What is your GPU usage like with the broken SLI?
> 
> On three monitors I'm getting just about 99% usage, but I'm unsure if it's showing itself as more fps...


Usage and power limit all show it's working.
But you get those white streaks off and on, and really the 6GB VRAM hit happens very frequently.

Same thing on the 780 Ti on High: enable SLI and the VRAM hit happens very often.


----------



## Cyro999

Hey guys - I'm playing with a 4770k @ 4.5GHz and a 970. I have EXTREMELY inconsistent mouse response; sometimes it's 5x more sensitive than other times, depending on how I'm turning the camera. It feels like really bad negative acceleration. I've tried some fixes with no success. Is there any way to get mouse response remotely close to 1:1?

I prefer to play with muscle memory, but right now I have to kind of wave the mouse to one side, then see if it turned nearly as much as I wanted, then move again, look, move again, look, etc., and it just makes all mouse actions take 5x as long as they otherwise would, because I have no idea where my aim will end up.


----------



## cstkl1

Quote:


> Originally Posted by *Cyro999*
> 
> Hey guys - I'm playing with a 4770k @ 4.5GHz and a 970. I have EXTREMELY inconsistent mouse response; sometimes it's 5x more sensitive than other times, depending on how I'm turning the camera. It feels like really bad negative acceleration. I've tried some fixes with no success. Is there any way to get mouse response remotely close to 1:1?
> 
> I prefer to play with muscle memory, but right now I have to kind of wave the mouse to one side, then see if it turned nearly as much as I wanted, then move again, look, move again, look, etc., and it just makes all mouse actions take 5x as long as they otherwise would, because I have no idea where my aim will end up.


I'm using an old G9 and a G502 with no issue. So...


----------



## Cyro999

Quote:


> Originally Posted by *cstkl1*
> 
> I'm using an old G9 and a G502 with no issue. So...


Upon further investigation, it looks like mouse sensitivity might be tied to framerate. I'm running settings that I can easily handle at the FPS cap now, and the mouse is laggy but the response is much closer to 1:1.

Previously I would wave the mouse and it'd take 25cm of movement to turn 180 degrees. Then I'd move it 5-10cm more and it'd flip back 180 degrees.
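A possible explanation (purely a sketch, not the game's actual code): if a game scales mouse deltas by frame time, the way you would an analog-stick axis, sensitivity ends up tied to framerate. Raw mouse counts are already per-event, so they should be applied 1:1:

```python
# Hypothetical sketch of framerate-dependent mouse handling.
# Mouse deltas are already "per event", so multiplying them by frame
# time (as you would for an analog stick) makes rotation scale with fps.

SENSITIVITY = 0.02  # degrees of yaw per mouse count (illustrative value)

def yaw_correct(mouse_counts):
    """Apply raw deltas 1:1 -- framerate independent."""
    return sum(SENSITIVITY * d for d in mouse_counts)

def yaw_buggy(mouse_counts, fps):
    """Scale each delta by frame time -- rotation now depends on fps."""
    dt = 1.0 / fps
    return sum(SENSITIVITY * d * dt * 60 for d in mouse_counts)

# The same physical mouse movement (1000 counts) split across 100 frames:
move = [10] * 100
print(yaw_correct(move))          # ~20 degrees, regardless of fps
print(yaw_buggy(move, fps=60))    # ~20 degrees at 60 fps
print(yaw_buggy(move, fps=30))    # ~40 degrees at 30 fps: twice as sensitive
```

That pattern would match the behavior above: halving the framerate doubles the effective sensitivity.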


----------



## Swolern

Quote:


> Originally Posted by *Cyro999*
> 
> Hey guys - I'm playing with a 4770k @ 4.5GHz and a 970. I have EXTREMELY inconsistent mouse response; sometimes it's 5x more sensitive than other times, depending on how I'm turning the camera. It feels like really bad negative acceleration. I've tried some fixes with no success. Is there any way to get mouse response remotely close to 1:1?
> 
> I prefer to play with muscle memory, but right now I have to kind of wave the mouse to one side, then see if it turned nearly as much as I wanted, then move again, look, move again, look, etc., and it just makes all mouse actions take 5x as long as they otherwise would, because I have no idea where my aim will end up.


I had that same issue. Playing around with settings fixed it for me. I can't remember if it was the Vsync or the motion blur setting that affected the negative mouse acceleration; I believe when I turned motion blur off, it went away. I ended up switching to an Xbox controller just because it's button-masher-type gameplay and my index finger got tired, lol.


----------



## The Source

So those of you with 4GB vram, not 6, but 4, how is the second map with all ultra settings except textures on high with resolution scaling? How high can you go with the scaling before stuttering takes hold? I'd like to know as I'm looking at 970's and before I make any kind of move I would like to know what to expect. Thanks.


----------



## Cyro999

Quote:


> Originally Posted by *The Source*
> 
> So those of you with 4GB vram, not 6, but 4, how is the second map with all ultra settings except textures on high with resolution scaling? How high can you go with the scaling before stuttering takes hold? I'd like to know as I'm looking at 970's and before I make any kind of move I would like to know what to expect. Thanks.


Not sure what you mean by second map, but I was playing at 1620p (2.25x 1080p) with those settings, with a few things like motion blur disabled, as well as out-of-order transparency and maybe 1-2 other settings turned down, and I didn't even hit 3GB of VRAM used. No stutter.

I had to fall back from 1620p with my single 970, though, because being anywhere above ~60-80% GPU load meant I was hitting 100% GPU load at times and breaking the 100 FPS cap, which made my mouse respond in a very inconsistent and confusing way and made the game unplayable. If you want to downscale from 4K and maintain 60fps, I'd guess you can do it with SLI 970s, but not with ultra textures.
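For reference, DSR/downsampling factors multiply the pixel count, not the linear dimensions, which is why "2.25x 1080p" comes out at 1620p. A quick sketch (the function name is just for illustration):

```python
import math

def dsr_resolution(base_w, base_h, factor):
    """DSR factors multiply pixel count; linear dimensions scale by sqrt(factor)."""
    s = math.sqrt(factor)
    return round(base_w * s), round(base_h * s)

print(dsr_resolution(1920, 1080, 2.25))  # (2880, 1620) -> "1620p"
print(dsr_resolution(1920, 1080, 4.0))   # (3840, 2160) -> 4K
```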


----------



## The Source

Quote:


> Originally Posted by *Cyro999*
> 
> Not sure what you mean by second map, but I was playing at 1620p (2.25x 1080p) with those settings, with a few things like motion blur disabled, as well as out-of-order transparency and maybe 1-2 other settings turned down, and I didn't even hit 3GB of VRAM used. No stutter.
> 
> I had to fall back from 1620p with my single 970, though, because being anywhere above ~60-80% GPU load meant I was hitting 100% GPU load at times and breaking the 100 FPS cap, which made my mouse respond in a very inconsistent and confusing way and made the game unplayable. If you want to downscale from 4K and maintain 60fps, I'd guess you can do it with SLI 970s, but not with ultra textures.


Thanks for the response. I don't expect to be able to run 4K, but perhaps 1620p with ultra textures or 1800p with high textures. As it sits right now, I run into stuttering with 3GB starting at 1620p (high textures), and that's in the first part of the game, which is supposed to be the less demanding part.


----------



## hurleyef

I'm playing using the ultra preset (everything max except AO) with the texture pack installed at 1440p and haven't had any issues with stuttering on either map.


----------



## revro

Quote:


> Originally Posted by *Chargeit*
> 
> Man, I don't see how people can play a game like this without one, and I've been computer gaming since the early/mid 90's. Of course, the first video game I remember playing was Super Mario on the NES as young kid in the 80's. I grew up using both kb&m and controllers. Since I can take advantage of both, there isn't much point in me not using the method that works best for a given game. I wouldn't play a racing game with a kb&m anymore then I'd play a fps with a controller for instance.
> 
> Speaking of controllers, ever since adding this 3rd monitor I've been thinking more and more about getting a flight stick... Though I have no flying games right now, I've been eyeballing a few space sims.


You should check out Star Citizen, made by Chris Roberts, the creator of Wing Commander, Privateer, Freelancer, and Starlancer.









I did prefer using controllers in some of the last few PC games. I liked using the Xbox 360 controller more for Dark Souls 1 and 2.


----------



## rbarrett96

Quote:


> Originally Posted by *Chargeit*
> 
> I wouldn't buy a 7xxx line GPU for going crossfire if I were you.
> I'll just keep my mouth shut about this one.


OK, so you've all convinced me NOT to go with two of the 7xxx cards. So now the question becomes: do I buy a 2nd 680 4GB for $315 for SLI, sell the one I have and get two 290Xs, or sell it and get a single 980? If I did the 980, I wouldn't be able to get a 2nd card for quite some time.


----------



## BradleyW

Quote:


> Originally Posted by *The Source*
> 
> So those of you with 4GB vram, not 6, but 4, how is the second map with all ultra settings except textures on high with resolution scaling? How high can you go with the scaling before stuttering takes hold? I'd like to know as I'm looking at 970's and before I make any kind of move I would like to know what to expect. Thanks.


Honestly? I've had no stuttering at all. Could be due to the 290X's high memory bandwidth combined with the 4GB. Not sure how the 970 would hold up though.


----------



## chronicfx

Quote:


> Originally Posted by *BradleyW*
> 
> Honestly? I've had no stuttering at all. Could be due to the 290X's high memory bandwidth combined with the 4GB. Not sure how the 970 would hold up though.


No stuttering here either; it handles 50 orcs, 2 captains, and a couple of caragors just fine on ultra with a 290X.


----------



## Zipperly

Quote:


> Originally Posted by *The Source*
> 
> So those of you with 4GB vram, not 6, but 4, how is the second map with all ultra settings except textures on high with resolution scaling? How high can you go with the scaling before stuttering takes hold? I'd like to know as I'm looking at 970's and before I make any kind of move I would like to know what to expect. Thanks.


Well, for what it's worth, with my GTX 780 I'm running 3072x1728, all ultra except *textures on High and texture filtering on High*. Adaptive vsync at half refresh rate keeps me at a very smooth 30fps, and VRAM isn't an issue at all. This is on the second map, and yes, I play with a wireless controller instead of keyboard and mouse.


----------



## Zipperly

Quote:


> Originally Posted by *The Source*
> 
> Thanks for the response. I don't expect to be able to run 4K, but perhaps 1620p with ultra textures or 1800p with high textures. As it sits right now, I run into stuttering with 3GB starting at 1620p (high textures), and that's in the first part of the game, which is supposed to be the less demanding part.


Uh, that shouldn't be happening if you're not running ultra textures...


----------



## rbarrett96

Quote:


> Originally Posted by *Faded*
> 
> There's a 4gb 290X on sale for 349 at newegg. It's not 6gb but that's a damn good deal for that card


Was this an open box? I wasn't able to find it on Newegg.


----------



## MonarchX

Is anyone else with 3GB of VRAM playing this game with the Ultra HD Texture Pack? I am, and I experience no problems aside from a VERY rare mini-stutter that lasts about half a second. My VRAM is maxed out all the time, but you don't need more than 3GB to play this game with the Ultra HD Texture Pack and ultra settings. I play at 1080p, no downscaling though.


----------



## Zipperly

Quote:


> Originally Posted by *MonarchX*
> 
> Is anyone else with 3GB of VRAM playing this game with the Ultra HD Texture Pack? I am, and I experience no problems aside from a VERY rare mini-stutter that lasts about half a second. My VRAM is maxed out all the time, but you don't need more than 3GB to play this game with the Ultra HD Texture Pack and ultra settings. I play at 1080p, no downscaling though.


Yes, that's around the same experience I have at 1080p, but I'd rather cut textures back to high and do some downsampling; the game looks loads better that way to me personally.


----------



## The Source

Quote:


> Originally Posted by *Zipperly*
> 
> Uh, that shouldnt be happening if you are not running Ultra textures.....


It might have something to do with SLI (an official profile would really help determine this), or that my native res is 1440p. It does help a bit to lock it at 30fps, but then it's 30fps and it shows.


----------



## Zipperly

Quote:


> Originally Posted by *The Source*
> 
> It might have something to do with SLI (an official profile would really help determine this), or that my native res is 1440p. It does help a bit to lock it at 30fps, but then it's 30fps and it shows.


It's probably SLI related; try "adaptive half refresh rate" in the control panel. If you're gaming with a controller, it makes 30fps very smooth in combination with adaptive sync.


----------



## Chargeit

Quote:


> Originally Posted by *rbarrett96*
> 
> OK, so you've all convinced me NOT to go with two of the 7xxx cards. So now the question becomes: do I buy a 2nd 680 4GB for $315 for SLI, sell the one I have and get two 290Xs, or sell it and get a single 980? If I did the 980, I wouldn't be able to get a 2nd card for quite some time.


What res are you playing at?

I mean, personally, I'd take a single 980 over the other options because you won't have to worry about sli or crossfire. I'm also hooked on some of the Nvidia control panel options, so, I don't think I could do without them myself.


----------



## Faded

Quote:


> Originally Posted by *rbarrett96*
> 
> Was this an open box? I wasn't able to find it on new egg


It's $379 with a $30 rebate.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150696&cm_re=xfx_r9_290x-_-14-150-696-_-Product

EDIT - and I just realized it's a mail-in rebate card... *sigh*... I'm not a fan of them, but they have always come through for me; it just takes a couple of weeks.


----------



## cstkl1

Again, somehow all these game threads recently end up being hardware or purchasing talk.

Nobody is actually talking about the gameplay.


----------



## Triniboi82

Really enjoying this game; it runs smooth at 1440p with a mix of ultra and high settings on a single 970, averaging between 57-75 fps, sometimes dropping to around 45fps but very rarely. Leaving motion blur off helps as well. Just finished the great white graug mission; kinda disappointed, I was expecting more of a fight.


----------



## rbarrett96

I'm only playing at 1080p, but I'd like the option to upgrade to a 27" high res or two monitors in the future, not to mention the Vram these texture packs are using. That was the only reason I considered the 6GB 7970s. To be somewhat future proof on the Vram front.


----------



## Juub

Quote:


> Originally Posted by *cstkl1*
> 
> Again, somehow all these game threads recently end up being hardware or purchasing talk.
> 
> Nobody is actually talking about the gameplay.


There is a thread in the PC Games section dedicated to the gameplay.


----------



## EniGma1987

Quote:


> Originally Posted by *cstkl1*
> 
> Again, somehow all these game threads recently end up being hardware or purchasing talk.
> 
> Nobody is actually talking about the gameplay.


And yet the point of this thread is that the game is/was supposed to require a huge amount of memory to play, so it makes sense that people are comparing it on the hardware side. Nothing in the original post of this thread is about 6GB of VRAM being needed for some gameplay aspect, just fidelity.


----------



## daviejams

Well, I completed it. Brilliant game, brilliant graphics.
10/10, well done to Monolith; maybe game of the year for me.

I liked killing large groups of orcs and then running into a couple of captains the most.


----------



## Baasha

Just finished it last night as well.

Great game - too bad poor/no SLI support.

Can't wait to really unleash this game w/ 4-Way SLI in 4K Surround!









Anyway, love the combat - the runes got old/useless pretty fast. They should add more weapons and build on the combat system for the next one.

There is... a sequel right?


----------



## SunBro

I really like the game, but I was baffled to find out that my peasant GTX 750 Ti can play it without dropping below 30 fps with everything on ultra (except texture quality and ambient occlusion at high). I mostly get 40 fps, sometimes dropping to 31-32, with some 50 fps here and there.


----------



## EniGma1987

Quote:


> Originally Posted by *SunBro*
> 
> I really like the game, but I was baffled to find out that my peasant GTX 750 Ti can play it without dropping below 30 fps with everything on ultra (except texture quality and ambient occlusion at high). I mostly get 40 fps, sometimes dropping to 31-32, with some 50 fps here and there.


Many people don't consider that playable.


----------



## sepiashimmer

Quote:


> Originally Posted by *Baasha*
> 
> Just finished it last night as well.
> 
> Great game - too bad poor/no SLI support.
> 
> Can't wait to really unleash this game w/ 4-Way SLI in 4K Surround!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, love the combat - the runes got old/useless pretty fast. They should add more weapons and build on the combat system for the next one.
> 
> There is... a sequel right?


Of course; after all, it's the hen that lays the golden eggs. But after four sequels it would become too repetitive to be considered interesting or fun.


----------



## SunBro

Quote:


> Originally Posted by *EniGma1987*
> 
> Many people don't consider that playable.


Of course, that was just my test run. I normally play at the high preset (you know, everything at high except textures and AO at medium). The in-game benchmark gives me an average of 50 fps (I rarely see it drop into the 40s on my ShadowPlay counter). Still, an average of 50 fps at the high preset is damn good for a budget card.


----------



## BrightCandle

I found that enabling AFR2 in the Nvidia profile for the game got SLI working. It didn't fix the problem that I can't reasonably go above high textures, but I'm getting there towards completing it now.


----------



## Cyro999

Quote:


> Originally Posted by *SunBro*
> 
> Of course, that was just my test run. I normally play at the high preset (you know, everything at high except textures and AO at medium). The in-game benchmark gives me an average of 50 fps (I rarely see it drop into the 40s on my ShadowPlay counter). Still, an average of 50 fps at the high preset is damn good for a budget card.


Well, it's way faster than the Xbox One, which has to play it at 1080p/30fps most of the time.


----------



## Newbie2009

I picked it up on PS4 to avoid driver issues. Have to say I'm impressed with how it runs. Looks pretty, no slowdown I have noticed. Great game. I'm really liking the little PS4.
9/10 for me.


----------



## chronicfx

Quote:


> Originally Posted by *Newbie2009*
> 
> I picked it up on PS4 to avoid driver issues. Have to say I'm impressed with how it runs. Looks pretty, no slowdown I have noticed. Great game. I'm really liking the little PS4.
> 9/10 for me.


The drivers are fine. I am running it on ultra and 1440p. I prefer 60fps


----------



## Silent Scone

Quote:


> Originally Posted by *chronicfx*
> 
> The drivers are fine. I am running it on ultra and 1440p. Just have to avoid trying to use crossfire and have a nice gpu. I am not even using Fps monitoring anymore because the 290x handles it fine and I spend too much time staring at numbers rather than playing. That said 30FPS is intolerable for me. *Sorry you have to give that a 9/10. I give anything 30fps a 1/10 personally.*


Slightly ignoramus-esque but ok.


----------



## chronicfx

Quote:


> Originally Posted by *Silent Scone*
> 
> Slightly ignoramus-esque but ok.


Ok


----------



## Silent Scone

Nice 4/10 evaluation. Not sure what that has to do with the other post.


----------



## jojoenglish85

My 970 runs this maxed with no overclock just fine. I sometimes get low dips when I teleport from tower to tower (right after teleporting, while the new map loads), but afterwards it's butter smooth again.


----------



## Frankzro

Quote:


> Originally Posted by *chronicfx*
> 
> A lot ignoramus on my part, but I'm allowed to be ignorant when I get home after spending all day elucidating and performing LC-MS/NMR structural analysis on monomers, polymers, and unknowns. 1/10 is my own research-based rating for 30fps; I have played both ways. I can't do 30fps. It is really tough on my eyes.


I know you're an intelligent man. Although... when I see people type words I NEVER HEAR (but can be read)... it bugs me. Words like *"elucidating"*... damn son


----------



## chronicfx

Quote:


> Originally Posted by *Frankzro*
> 
> I know you are an intelligent man. Although... when I see people type words I NEVER HEAR (but can be read)... it bugs me. Words like *"elucidating"* ... damn son


Yes


----------



## Newbie2009

Quote:


> Originally Posted by *chronicfx*
> 
> The drivers are fine. I am running it on ultra and 1440p. Just have to avoid trying to use crossfire and have a nice gpu. I am not even using Fps monitoring anymore because the 290x handles it fine and I spend too much time staring at numbers rather than playing. That said 30FPS is intolerable for me. Sorry you have to give that a 9/10. I give anything 30fps a 1/10 personally.


Running 1600p so was concerned one card wouldn't cut it and I heard xfire didn't work properly.


----------



## Juub

Quote:


> Originally Posted by *Silent Scone*
> 
> Slightly ignoramus-esque but ok.


How? If he can't tolerate 30fps he can't. Nothing wrong about that. I personally avoid 30fps like the plague. I'd rather put my game at medium settings and get 60fps than max settings and get 30fps. Frame rate directly affects the gameplay, the fluidity and the overall experience. It is far more important than graphics for most of us.


----------



## SunBro

Quote:


> Originally Posted by *Cyro999*
> 
> Well it's way faster than xbox1 which has to play it @1080p with 30fps most of the time


Exactly. I can't go back to playing at 30 fps(other than testing purposes)
Quote:


> Originally Posted by *Juub*
> 
> How? If he can't tolerate 30fps he can't. Nothing wrong about that. I personally avoid 30fps like the plague. I'd rather put my game at medium settings and get 60fps than max settings and get 30fps. Frame rate directly affects the gameplay, the fluidity and the overall experience. It is far more important than graphics for most of us.


Yup. Besides, many games don't look that horrid at medium/high settings these days. Certainly playable.


----------



## Juub

Quote:


> Originally Posted by *Cyro999*
> 
> Well it's way faster than xbox1 which has to play it @1080p with 30fps most of the time


More like 900p 30fps. 1080 is reserved for the PS4.


----------



## Silent Scone

Quote:


> Originally Posted by *Juub*
> 
> How? If he can't tolerate 30fps he can't. Nothing wrong about that. I personally avoid 30fps like the plague. I'd rather put my game at medium settings and get 60fps than max settings and get 30fps. Frame rate directly affects the gameplay, the fluidity and the overall experience. It is far more important than graphics for most of us.


Because it was a blanket statement and in certain instances 30fps is fine. Actually works better in The Evil Within.


----------



## BradleyW

Keyboard and mouse lag far too much at 30 fps. That's the only reason I dislike that kind of fps on PC. I don't want to use a pad; if I did, I would have gotten a console.


----------



## chronicfx

Quote:


> Originally Posted by *Silent Scone*
> 
> Because it was a blanket statement and in certain instances 30fps is fine. Actually works better in The Evil Within.


Cool


----------



## chronicfx

Quote:


> Originally Posted by *Silent Scone*
> 
> Because it was a blanket statement and in certain instances 30fps is fine. Actually works better in The Evil Within.


Yep


----------



## Silent Scone

Quote:


> Originally Posted by *chronicfx*
> 
> 30fps looks great on Myst and Zork too... I agree. You should do lunch with the Ubisoft CEO; you guys would get along great.


It's not that simple, but evidently, despite your occupational egotistic random post, you are.

Certain games just don't require that level of fluidity. There are a huge number of games that work just fine - and look great - without 60 frames per second. The Uncharted series is captivating, and you'd be hard-pressed to even realise it's running at 30 FPS.

So it depends on the genre. It just makes me laugh how people get so hung up on the FPS argument, yet are obviously too short-sighted anyway.

I don't agree with the Ubisoft press release, so I'll pretend that wasn't sarcastic. We all get it, 60 FPS looks great, but making out that 30 FPS burns your eyes and refusing to play a game at it is a massive generalisation. You've missed out on a lot if that's what you really think.


----------



## chronicfx

Edit: never mind, no one wants to listen to this garbage. I kind of like this thread. Conversation over.


----------



## Cyro999

Quote:


> Originally Posted by *Silent Scone*
> 
> It's not that simple, but evidently, despite your occupational egotistic random post, you are.
> 
> Certain games just don't require that level of fluidity. There are a huge number of games that work just fine - and look great - without 60 frames per second. The Uncharted series is captivating, and you'd be hard-pressed to even realise it's running at 30 FPS.
> 
> So it depends on the genre. It just makes me laugh how people get so hung up on the FPS argument, yet are obviously too short-sighted anyway.
> 
> I don't agree with the Ubisoft press release, so I'll pretend that wasn't sarcastic. We all get it, 60 FPS looks great, but making out that 30 FPS burns your eyes and refusing to play a game at it is a massive generalisation. You've missed out on a lot if that's what you really think.


Fluidity is only half of the argument; while low frame rates are unfluid and poor at representing motion, they also add a massive amount of input latency, which is very unpleasant.
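A rough back-of-envelope model of that latency cost (the buffered-frame count is an assumption standing in for typical double/triple buffering with vsync, not measured data):

```python
def worst_case_input_latency_ms(fps, buffered_frames=2):
    """Rough input-to-photon latency: input sampled at the start of a frame
    can wait up to one frame interval, then sits in the render/display
    queue for `buffered_frames` more intervals (assumed buffering)."""
    frame_ms = 1000.0 / fps
    return frame_ms * (1 + buffered_frames)

print(worst_case_input_latency_ms(60))  # ~50 ms at 60fps
print(worst_case_input_latency_ms(30))  # ~100 ms at 30fps: double the lag
```

Under this simple model, dropping from 60fps to 30fps doubles every term, which lines up with why low frame rates feel so much worse on a mouse than the raw motion-smoothness difference alone would suggest.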


----------



## Juub

Quote:


> Originally Posted by *Silent Scone*
> 
> Because it was a blanket statement and in certain instances 30fps is fine. Actually works better in The Evil Within.


It works better perhaps because the engine was designed around it? There is not a single instance where a lower frame rate is preferable to a higher one when they aren't tied to the engine.


----------



## cstkl1

Quote:


> Originally Posted by *Silent Scone*
> 
> It's not that simple, but evidently despite your occupational egotistic random post, you are
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Certain games just don't require the level of fluidity. There are a huge number of games that work just fine - and look great without 60 frames per second. Uncharted series is captivating and you'd be pressed to even realise it's running at 30 FPS.
> 
> So it depends on the genre. Just makes me laugh how people get so hung up on the FPS argument, yet are obviously too short sighted anyway.
> 
> I don't agree with Ubisoft press release, so I'll pretend that wasn't sarcastic. We all get it, 60 FPS looks great, but making out it burns your eyes and refusing to play a game at it is a massive generalisation. You've missed out on a lot if that's what you really think


I'm no game dev, but I think the net is more up in arms about game engines being developed around a fixed low fps rather than an open one, and it's all to appease console issues. Personally I think a game engine shouldn't have such limits, since engines are normally reused/retweaked/sold, so the underlying issue will crop up in other games. A good example is the id Tech engines. SoM is terrible at 30 for me because reaction time is all screwed up. I tried it. But the fidelity is still there.

So since Ubi tends to reuse their engines, I'm guessing that if concerns aren't voiced now, 30fps will become an industry standard, pushing gaming evolution back. For the first time, we can see GPUs exceeding game engines.


----------



## blackhole2013

You know what, I put the ultra textures back on from high with my 780 3GB, and it's using all my 3GB but now running smooth at 1080p 120Hz.


----------



## NrGx

Yeah, I was scared performance was going to be bad, but with everything on max and textures on high, I'm getting a pretty stable 60 fps at 1440p. VRAM stays pretty consistently at 3.9GB though, so I suspect ultra textures would tip it over. But that's okay, it's a fun game and it looks great regardless.


----------



## SunBro

This and The Evil Within are just system-spec hyperbole. Sure, they're demanding, but not *that* demanding. I think it hurts the game's rep/financial position to some degree. I would almost have skipped this game, for example, thinking that my 750 Ti would have to play it at low to get 60 fps... But here I am, slaying orcs at high settings at 60 fps.


----------



## cstkl1

Hmm, the SLI profile is out.

The mouse is freaking slow at any fps above 80,

and when fighting it's the same as single-card fps.


----------



## The Source

Quote:


> Originally Posted by *cstkl1*
> 
> hmm sli profile out
> 
> mouse is freaking slow at any fps 80 and above
> 
> and when fighting it same as single card fps.


I didn't see the profile for this game mentioned in the driver release notes.


----------



## cstkl1

Quote:


> Originally Posted by *The Source*
> 
> I didn't see the profile for this game mentioned in the driver release notes.


It's out, bro. The driver has it, but I think they didn't mention it as it's not really working well.

Using 0x080000F5 (Catzilla, Dragon Age 2, Ocean Demo, Middle-Earth: Shadow of Mordor....)

Hmm, initial tests look like it's still going to use a single card.

Redownloading Ryse from Steam atm,
gonna try it.

Works well with Ryse; getting double the frames, 80fps at the king's chapter.


----------



## The Source

Bah, I finally got a 780 6GB, and after reinstalling drivers and firing up the game, I can't remember how to get the in-between res-scaling resolutions to show up. I have native (1440p), and the next step is 2160p at 150%. Do I have to remove the custom resolutions I have in the NVIDIA control panel? Or is it something else?


----------



## LaBestiaHumana

Bought this game on G2A for 25 bucks. Runs pretty smooth with just one Titan, and does use 5GB+ VRAM on ultra.


----------



## Juub

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> Bought this game at G2A for 25 bucks. Runs pretty smooth with just one Titan, and does use 5gb+ vram on ultra.


What res? I can play it at 2560x1440 and get no stutter at all with 4GB. At 1080p I average around 75fps. The 5GB is allocated; it doesn't actually use that much.


----------



## Newbie2009

Almost tempted to get it on PC to see 1600p results with 4GB.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Juub*
> 
> What res? I can play it at 2560x1440 and get no stutter at all with 4GB. At 1080p I average around 75fps. The 5GB is allocated. It doesn't actually use that much.


1080p; with one Titan I'm getting 90-100 FPS, with SLI a steady 100. The game somehow got capped for me.

At 4K DSR I'm getting about 40fps, and VRAM maxes out completely.


----------



## Strileckifunk

Need to try these ultra settings with the 970. Only problem is I have to decide between DSR and 96Hz.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Strileckifunk*
> 
> Need to try these ultra settings with the 970. Only problem is I have to decide between DSR and 96 mhz


Not worth the performance hit on this game with DSR. I would just stick to native res and high frame rates.


----------



## CasualCat

Is it just me, or do the cutscenes in this game drop the resolution and/or texture quality? It almost seems like they're set at some standard setting no matter what your individual settings are.


----------



## Cyro999

Quote:


> Originally Posted by *CasualCat*
> 
> Is it just me or do the cut scenes in this game drop the resolution and or the texture quality? Almost seems like they're set at some standard setting no matter your individual settings.


Yea, they're at console quality. GPU load drops to 0 etc


----------



## thrgk

Where is the cheapest place to buy this game for PC? I think I missed the last sale.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *thrgk*
> 
> where is the cheapest place to buy this game for pc? missed the last sale i think


G2A for around 25 bucks.


----------



## Wezzor

Quote:


> Originally Posted by *LaBestiaHumana*
> 
> G2A for around 25 bucks.


Is G2A even a legit reseller? I'd never buy from them.


----------



## d3vour3r

Playing this with a single 7950 and it's fine at 1080p... what's this 6GB VRAM nonsense?


----------



## Bitemarks and bloodstains

Quote:


> Originally Posted by *Wezzor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LaBestiaHumana*
> 
> G2A for around 25 bucks.
> 
> 
> 
> is G2A even a legit reseller? I'd never buy from them.

They are a grey-market reseller.
They buy keys from cheaper regions such as Russia and Eastern Europe and then resell them.
I seem to recall they were also involved in the Sniper Elite 3 stolen-key fiasco.


----------



## Cyro999

Quote:


> Originally Posted by *d3vour3r*
> 
> playing this with a single 7950 and its fine at 1080p... whats this 6gb ram nonsense?


You'll pass 3GB of VRAM with everything maxed (including the ultra texture pack), even at 1080p.


----------



## d3vour3r

Is the ultra texture pack an extra download? Because I didn't do that, so that probably explains it.

Either way it's very playable and looks very nice.

Happy kicking back on my couch, playing on my 50" with surround sound.


----------



## Cyro999

Quote:


> is the ultra texture pack an extra download? because I didn't do that so that probably explains it.


Yea


----------



## thrgk

Quote:


> Originally Posted by *Cyro999*
> 
> Yea


Edit: found it.


----------



## SOCOM_HERO

So essentially this game requires you to have somehow obtained a Titan or another ludicrously high-end GPU for silly money just to play on ultra. No thanks. I prefer a game where you can have a decently high-range GPU, overclock it a tad, and enjoy ultra settings.


----------



## SoloCamo

Quote:


> Originally Posted by *SOCOM_HERO*
> 
> So essentially this game requires you to have somehow obtained a Titan or other ludicrously high end GPU for silly money. to play on ultra. No thanks. I prefer a game where you can have a decently high range GPU, overclock it a tad and enjoy ultra settings.


The sad thing is the 780 Tis and 980s, which have less VRAM than the Titans, are the faster cards but simply don't have enough VRAM.


----------



## Juub

Quote:


> Originally Posted by *SOCOM_HERO*
> 
> So essentially this game requires you to have somehow obtained a Titan or other ludicrously high end GPU for silly money. to play on ultra. No thanks. I prefer a game where you can have a decently high range GPU, overclock it a tad and enjoy ultra settings.


You can play on Ultra with 4GB with no issues at all. My 970 ran through this game like nothing. The benchmark averaged 79fps, in-game the first area was around 75fps, and the second area around 60-62fps on average.

Quote:


> Originally Posted by *SoloCamo*
> 
> The sad thing is the 780ti's and 980's which have less vram then the titans are the faster cards but simply don't have enough vram


They do, but I take it you're being sarcastic. That said, I've heard of people experiencing stutters with 3GB.


----------



## LaBestiaHumana

Quote:


> Originally Posted by *Wezzor*
> 
> is G2A even a legit reseller? I'd never buy from them.


Yep, a friend of mine and I always buy from there. I wouldn't recommend it if I hadn't tried it myself.


----------



## daviejams

Quote:


> Originally Posted by *Juub*
> 
> You can play on Ultra with 4GB with no issues at all. My 970 ran through this game like nothing. benchmark averaged 79fps and in-game in the first area was around 75fps and second area around 60-62fps on average.
> They do but I take it you are being sarcastic. That said, I've heard of people experiencing stutters with 3GB.


I played it on "ultra" with a 3GB R9 280X.

It would stutter occasionally when loading in new textures.


----------



## Noufel

Finished the game at a steady 100 fps (I think it's capped at 100) on 290 Tri-X CFX, ultra settings, on the first map and 85-90 on the second. VRAM usage was around 3600-3800 and sometimes maxed out, but the experience was smooth overall.
PS: the Black Speech of Mordor is great.


----------



## Sisaroth

I played on high with no issues. Actually, I never checked fps. I just used the settings it defaulted to and increased textures to high.


----------



## PunkX 1

Ultra, with textures on High at 1080p, my card's VRAM usage is at 2.8-2.9GB.


----------



## Dimaggio1103

Quote:


> Originally Posted by *PunkX 1*
> 
> Ultra, with textures on High at 1080p, my card's VRAM usage is at 2.8-2.9GB.


I'm at 5760x1080 Eyefinity with ultra textures and everything else on high, with max lighting and shadows. I get around 3.8GB of VRAM usage and frames average ~40 FPS. I think 6GB is a myth; I ran it just fine. And yes, I downloaded the HD pack and did several restarts to confirm: VRAM went from 3GB on high to 3.8GB on ultra, so I know it's working right. I just don't get why they said you need at least 6.


----------



## The Source

Does anyone in here have a 6GB EVGA 780 for sale?


----------



## darealist

Lol, I ain't buying any GPU until it has as much VRAM as those console toys. Newer games are gonna bring pre-2015 GPUs to their knees, and many here will be whining about "bad ports."


----------



## sumitlian

I ran this at 2560x1600 with everything at ultra plus the HD texture pack on a single 280X 3GB at 1.0GHz/1.5GHz. It ran fine at 35+ fps almost everywhere and never went below 30 fps.


----------



## Cyro999

Quote:


> Originally Posted by *sumitlian*
> 
> I ran this at 2560 x 1600 everything at ultra with HD texture pack on a single 280X 3GB 1.0GHz/1.5GHz, it ran fine with 35+ fps everywhere most of the time and never went below 30 fps.


Sure you had the ultra texture pack running fine? I saw way over 3GB even at 1080p, and it'd be much higher at almost double the resolution (1600p).


----------



## Leopard2lx

I ran this at 3840 x 2160 with everything at ultra plus the HD texture pack on a single 270X 2GB at 1.0GHz/1.5GHz. It ran fine at 30+ fps everywhere most of the time and never went below 25 fps. Smooth and no problems.


----------



## sumitlian

Quote:


> Originally Posted by *Cyro999*
> 
> Sure you had ultra texture pack running fine? I saw way over 3GB even at 1080p, and much higher at almost double the resolution (1600p)


Yes, I'm sure. The game was around 40GB after the HD texture pack. I used 1600p 60Hz downsampling on a 1600x900 monitor.
Make sure you have more than 8GB of RAM; with 8GB it stuttered a little bit. I had to use 16GB, and total system RAM usage was ~6800MB. At least that's been my experience with the 280X.
I never saw fps go below 30. I didn't complete it, just uninstalled it after 6-7 hours.


----------



## sumitlian

Quote:


> Originally Posted by *Leopard2lx*
> 
> I ran this at 3840 x 210 everything at ultra with HD texture pack on a single 270X 2GB 1.0GHz/1.5GHz, it ran fine with 30+ fps everywhere most of the time and never went below 25 fps. Smooth and no problems


Lol, you keep burning yourself.
You can also see on YouTube that there's been a negligible difference between the GTX 780/970 and the 280X in this game, at all resolutions and settings.


----------



## Greatness

The game seems to load over 3GB of uncompressed textures into memory in ultra mode when viewing long distances. If you enable ultra with 3GB and aren't playing in areas with high draw distances, you won't notice many issues. However, if you go outside and look into the distance, you'll get some slowdown as the textures attempt to load uncompressed into your memory. This is just the way they designed the game; the cards can handle the rendering just fine, but it obviously wasn't optimized with the ultra texture pack. I bet it was an afterthought.
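Back-of-the-envelope math shows why uncompressed textures eat VRAM so fast. These are illustrative numbers (a hypothetical 4096x4096 texture, not the game's actual assets): uncompressed RGBA is 4 bytes per pixel, while BC1/DXT1 block compression is 0.5 bytes per pixel, an 8x difference.

```python
# Rough texture-memory math; illustrative sizes, not the game's real assets.
def texture_mb(width, height, bytes_per_pixel, mipmaps=True):
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 * 1024)

uncompressed = texture_mb(4096, 4096, 4)    # RGBA8: 4 bytes/pixel
bc1 = texture_mb(4096, 4096, 0.5)           # BC1/DXT1: 0.5 bytes/pixel

print(f"4K RGBA8 texture with mips: {uncompressed:.1f} MB")
print(f"4K BC1 texture with mips:   {bc1:.1f} MB")
print(f"Uncompressed textures to fill 3 GB: {3 * 1024 / uncompressed:.0f}")
```

Only a few dozen uncompressed hero textures blow past a 3GB card, whereas the same set compressed fits with room to spare.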


----------



## LaBestiaHumana

2.5GB Vram in the main menu, 5.2 in game.


----------



## Silent Scone

Quote:


> Originally Posted by *Greatness*
> 
> The game seems to load over 3GB of uncompressed textures into memory when in ultra mode viewing long distances. If you enable ultra with 3GB and are not playing in area's with high draw distances you will not notice many issues. However if you go outside and look out to the distance you will get some slow down as the textures attempt to load uncompressed to you memory. This is just the way they designed the game the cards hand handle the rendering just fine but it was obviously not optimized with the ultra texture pack I bet it was an afterthought.


From the PR comment they made, it's not anything sloppy; they simply let users decide whether they want the better uncompressed textures they had available from rendering on their own machines, which I'd say is fair game, if you've got the memory to use them. Their workstations probably have 12 to 16GB depending on what cards they use. Hence 'High' really looks just as good if you don't train your eye to look for differences / LOD distance.


----------



## thrgk

Can you get different weapons in this game? How do you get more elf arrows?

Also, I set everything to ultra and it seems to play well. That's with the HD content pack downloaded.


----------



## maarten12100

Quote:


> Originally Posted by *thrgk*
> 
> Can u get different weapons in this game ? How do you get more elf arrows .
> 
> Also I set everything at ultra and seems to play well. Thats with the HD content pack downloaded


No, you can't, but you can upgrade them by binding runes. Also, you get more arrows by draining an enemy using the E button.


----------



## Cyro999

Quote:


> Originally Posted by *sumitlian*
> 
> Yes I'm sure. The game size was around 40 GB after HD texture pack. I used 1600p 60Hz downsampling on 1600x900 monitor.
> Make sure you have more than 8GB of RAM. With 8GB it stuttered a little bit. I had to use 16GB RAM and total system RAM usage was ~6800 MB. At least it has been in my case with 280X.
> I never saw fps go below 30. I didn't complete it, just uninstalled if after 6-7 hours.


I was not maxing out on RAM (or close to it), but VRAM at 1080p was >3.5GB with everything maxed.

30fps isn't okay for me though; 60 plays pretty badly, so I turned down some stuff to hold the 100fps hard cap.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Cyro999*
> 
> I was not maxing out on RAM (or close to it), but VRAM for 1080p was >3.5GB on max everything
> 
> 30fps isn't ok for me though, 60 plays pretty bad so i turned down some stuff to hold 100 hardcap


People need to remember there's a difference between allocating and actually using. Also, 60 unplayable? Man, I would hate to be you. lol


----------



## Cyro999

Quote:


> Originally Posted by *Dimaggio1103*
> 
> People need to remember there is a difference between allocating and actually using. Also 60 unplayable? Man I would hate to be you. lol


It's all relative: playing at 24fps on a 60Hz monitor is as discomforting as 60 on a 144Hz one, for me at least with fast motion. Both are a ~2.4x fps gap.

Not that 60 is bad, but it's nowhere near the same experience; it's kinda unpleasant to play every game at 150fps but one at 60fps.


----------



## bfromcolo

It appears Steam has to be online to play this game? Is there some way around this?

Edit: disregard. I was seeing a message about access requiring Steam to be online, but I guess that only matters if you want to use WBPlay; the game launches fine.


----------



## chronicfx

Has anyone seen whether the menu flickering is fixed with the 14.9.2 drivers in CrossFire? I haven't bothered to upgrade from 14.9 since one GPU has been handling it fine, but I wouldn't mind if they fixed CrossFire.


----------



## jordanecmusic

Whatever happened to proper texture streaming?!
Man, developers are getting lazy...

Ubisoft? More like U be sorry...
Yeah, I stole that from JonTron.
So what.
Sanction me.
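For what it's worth, texture streaming at its core is just a fixed VRAM budget with eviction: keep only what's in view resident and throw out the least-recently-used textures when space runs out. A toy sketch (hypothetical names and sizes, nothing from the actual engine):

```python
from collections import OrderedDict

# Toy model of texture streaming: a fixed VRAM budget with LRU eviction,
# so only recently used textures need to stay resident.
class TextureCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # name -> size in MB, kept in LRU order

    def request(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)  # touch: now most recently used
            return "hit"
        # Evict least-recently-used textures until the new one fits.
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[name] = size_mb
        return "streamed in"

cache = TextureCache(budget_mb=3072)  # a 3GB card
cache.request("orc_armor", 85)
cache.request("mordor_ground", 85)
print(cache.request("orc_armor", 85))  # already resident, no re-upload
```

The stutter people report on 3GB cards is what the "streamed in" path costs when it happens mid-frame instead of ahead of time.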


----------



## Kaltenbrunner

I've only played a little bit of one old LotR video game, so I'd love a big, modern-looking one.


----------



## blackhole2013

Why can't they use more system RAM to make up for less VRAM?


----------



## Cyro999

Quote:


> Originally Posted by *blackhole2013*
> 
> Why cant they use more system ram to make up for less VRAM ..


Because system RAM and VRAM are not the same thing. Also, 1600MHz C9 standard memory has about ~25GB/s of bandwidth in dual channel.

The ~224GB/s on the 970 is considered extremely low for a GPU of its power.

Don't games do that anyway? Store stuff that can't fit in VRAM in RAM and then swap it into VRAM when it's needed? That's the cause of the stuttering problems when you don't have enough VRAM.
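Quick sanity-check math on the gap (peak theoretical figures, assuming standard dual-channel DDR3-1600 and the 970's stock 7.0 Gbps GDDR5 on a 256-bit bus): bandwidth is just transfer rate times bus width in bytes.

```python
# Peak theoretical memory bandwidth = transfer rate (GT/s) x bus width in bytes.
def bandwidth_gbs(gigatransfers_per_sec, bus_bits):
    return gigatransfers_per_sec * bus_bits / 8

# Dual-channel DDR3-1600: two 64-bit channels at 1.6 GT/s.
ddr3_dual = bandwidth_gbs(1.6, 64 * 2)
# GTX 970: 7.0 Gbps effective GDDR5 on a 256-bit bus.
gtx970 = bandwidth_gbs(7.0, 256)

print(f"Dual-channel DDR3-1600: {ddr3_dual:.1f} GB/s")  # ~25.6
print(f"GTX 970 GDDR5:          {gtx970:.1f} GB/s")     # ~224
print(f"Ratio: {gtx970 / ddr3_dual:.1f}x")
```

Roughly a 9x gap, which is why spilling textures to system RAM stutters instead of just running a bit slower.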


----------



## blackhole2013

I have an idea: they should make a PCIe VRAM add-on card that you put in another slot, like SLI, that would add extra VRAM without a new card. Hey, you never know, right?


----------



## Cyro999

The memory chips are usually very close to the GPU, like 1-2 inches away on the PCB, and I think there's a limit to how many chips the PCB/chip/memory controller can handle. I think usually when they double the VRAM on a card, they're just using VRAM ICs that are twice as dense (like 512MB per chip instead of 256, for example).


----------



## Unknownm

Quote:


> Originally Posted by *blackhole2013*
> 
> I have an Idea they should make a pcie vram add on that you put in another slot like sli on that would add exrta vram without a new card .. Hey you never know right


Kinda off topic, but we should design PCIe VRAM cards not for more GPU space but for data use! One company way back released a PCI version with DDR.

32GB of GDDR5 with a 2-week battery pack!


----------



## BrightCandle

I just finished the game. It's much more stable FPS-wise on a 970 than it was on 2x 680s, even though they should be roughly the same performance, and I used significantly higher settings in this game. VRAM is almost certainly to blame, but so is the poor SLI support. Despite getting a 970, I didn't even bother to download the ultra texture pack; the game looks great anyway, and it running smoothly was more important to me.

I really enjoyed this game in the end, once it was running well. A little disappointed with the ending, too many QTEs for my taste rather than an actual fight, but the main game was one of the few games I've actually played to the finish recently.


----------



## Juub

Quote:


> Originally Posted by *BrightCandle*
> 
> I just finished the game. Its much more stable FPS wise on a 970 than it was on 2x 680's even if they should be roughly the same performance I used significantly higher settings in this game. VRAM is almost certainly to blame but so is the poor SLI support. Despite getting a 970 I didn't even bother to download the ultra texture pack, game looks great anyway and it running smoothly was more important to me.
> 
> I really enjoyed this game in the end once it was running well. A little disappointed with the ending, too many QTEs for my taste rather than an actual fight but the main game was one of the few games I actually played to finish recently.


I did, and a single 970 runs it like a charm. 70-75fps across the board, 55-60 in really busy areas.


----------



## ChronoBodi

Quote:


> Originally Posted by *delellod123*
> 
> Can some one tell me the cheapest place to get this game? No region locked keys please. I am in USA


Quote:


> Originally Posted by *chronicfx*
> 
> Anyone seen if the menu flickering is fixed with 14.9.2 drivers in crossfire? Haven't bothered to upgrade from 14.9.0 since one gpu has been handling it fine but would not mind if they fixed crossfire.


It's fixed with the latest Omega drivers. Also, why does it combine the VRAM of both cards? It shows up as 16000MB in the video options for me, and running at 4K ultra with no motion blur/AA (hated the motion blur), the VRAM usage is like 12500MB; divide that by 2 and actual per-card usage is 6250MB.

Folks, this is a sign of things to come. Finally, a game where not even 6GB of VRAM is enough, lol. To be fair, this applies only to 4K resolution; for anything lower, 6GB is fine.


----------



## chevy106

What I'm getting out of this is: with compressed textures, the devs do more work compressing the textures, thereby keeping VRAM usage low. On the flip side, with the HD package the devs do less compression (if any) of the SAME textures, and you have to fork over the $$ to use them by upgrading your video card?


----------



## Aparition

Quote:


> Originally Posted by *ChronoBodi*
> 
> it's fixed with the latest Omega drivers, and also, why does it combine the VRAM of both cards? it shows up as 16000MB in the video options for me, and running at 4K with Ultra, with no motion blur/AA (hated motion blur), the Vram usages is like 12500MB, so, divide that by 2 and total VRAM usage is now 6250MB.
> 
> Folks, this is a sign of things to come, Finally, a game that not even 6GB VRAM is enough, lol. to be fair, this applies only to 4K resolution, for anything lower 6GB is fine.


Do you know if that is actual VRAM usage, or just the game allocating VRAM for its use?
Allocated usage tends to be a lot higher than actual usage. The game just uses as much VRAM as you've got, but doesn't necessarily need that much to still perform well.


----------



## supermi

Quote:


> Originally Posted by *Aparition*
> 
> Do you know if that is actual VRAM usage or just the game allocating VRAM for its use?
> Allocated usage tends to be a lot higher than actual usage. The Game just using as much ram as you got, but not necessarily needing that much memory and still perform well.


Oh, it IS usage... It pegged my Titans at 6GB in 1080 surround, and that gave some loading stutters. Not many, but some!


----------



## Aparition

Quote:


> Originally Posted by *supermi*
> 
> Oh it IS usage.... It pegged my titans @ 6gb in 1080 surround and that gave some loading stutters, not many but some!


Surround setup, ya that would definitely do it.


----------



## Blameless

Quote:


> Originally Posted by *blackhole2013*
> 
> I have an Idea they should make a pcie vram add on that you put in another slot like sli on that would add exrta vram without a new card .. Hey you never know right


Transferring over PCI-E is no faster than going to system memory. System memory is generally faster, and the PCI-E bus is the bottleneck.
Quote:


> Originally Posted by *Unknownm*
> 
> Off topic kinda, we should design Vram PCI-e not for more GPU space but Data use! One company way back released one PCI version with DDR.
> 
> 32GB GDDR5 with 2 week battery pack!


You don't need GDDR5 when cheap dual-channel DDR3L could totally saturate a PCI-E 3.0 x16 slot.
Quote:


> Originally Posted by *ChronoBodi*
> 
> why does it combine the VRAM of both cards?


Each card has its own set of memory addresses. The driver mirrors things, but the rest of the system generally cannot assume this.
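The numbers back this up. Using the published specs (PCIe 3.0 at 8 GT/s per lane with 128b/130b encoding, DDR3-1600 in dual channel), the slot itself moves less data than cheap system memory:

```python
# Why a PCIe "VRAM add-on" can't beat system RAM: the slot is the bottleneck.
def pcie3_gbs(lanes):
    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding, per direction.
    return lanes * 8 * (128 / 130) / 8

pcie_x16 = pcie3_gbs(16)        # ~15.8 GB/s per direction
ddr3_dual = 1.6 * 128 / 8       # dual-channel DDR3-1600, ~25.6 GB/s

print(f"PCIe 3.0 x16:           {pcie_x16:.1f} GB/s per direction")
print(f"Dual-channel DDR3-1600: {ddr3_dual:.1f} GB/s")
```

So any memory sitting on the far side of the slot, however fast, would be throttled to ~16 GB/s, slower than just spilling to ordinary system RAM.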


----------



## The Source

Quote:


> Originally Posted by *chevy106*
> 
> What I'm getting out of this is, get compressed textures, devs do more more work to compress the textures in the game thereby keeping the VRAM usage low. On the flipside, the devs have you buy the HD package, the devs do less compression (if at all) of the SAME textures, and you have to fork over the $$ to use these textures by upgrading your video card?


The HD content is free, and no one is forcing anyone to put out money for cards that can run it. The devs were nice enough to make it available, which is something you rarely, if ever, see. It's there if you have the hardware; if not, don't worry about it. Texture development for every game starts out with higher-resolution textures that are then scaled down during optimization for the given platform. We might see this more often if people weren't so stupid.


----------

