# [GN] Watch Dogs On PC Requires 3 GB VRAM For Ultra Quality Textures



## brucethemoose

Is that a hard requirement or suggestion?

I'm sure you can mess with AA and tweak ini files to get full-res textures working on 2GB cards. This is an Xbox 360 game, after all.

However, the buffered frames setting is interesting...


----------



## Bit_reaper

Quote:


> Originally Posted by *brucethemoose*
> 
> Is that a hard requirement or suggestion?
> 
> I'm sure you can mess with AA and tweak ini files to get full res textures working on 2GB cards. This is an Xbox 360 game after all
> 
> *However, the buffered frames setting is interesting...*


I wonder if it's just a new name for the same old thing: the flip queue, aka pre-rendered frames. 3 is the most common flip queue size.
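If it is the flip queue, the latency cost of each queued frame is easy to ballpark. A simple model (my own sketch, ignoring driver specifics): each buffered frame adds roughly one frame-time of input lag before your click reaches the screen.

```python
# Rough model: each pre-rendered ("buffered") frame adds about one
# frame-time of input latency before it reaches the display.
def added_latency_ms(buffered_frames, fps):
    frame_time_ms = 1000.0 / fps
    return buffered_frames * frame_time_ms

# A 3-deep flip queue at 60 fps costs ~50 ms; a 1-deep queue only ~17 ms.
print(round(added_latency_ms(3, 60), 1))
print(round(added_latency_ms(1, 60), 1))
```

Which is why exposing the setting in-game is nice: latency-sensitive players can trade smoothness for responsiveness without digging into driver panels.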


----------



## 331149

Well that sucks. Only have 2 on my 270X.


----------



## kx11

It requires less than 3GB, but they ask for 3GB to make sure the game's performance will be smooth.

Also, 2x MSAA isn't enough @ 1200p.


----------



## brucethemoose

Quote:


> Originally Posted by *Bit_reaper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *brucethemoose*
> 
> Is that a hard requirement or suggestion?
> 
> I'm sure you can mess with AA and tweak ini files to get full res textures working on 2GB cards. This is an Xbox 360 game after all
> 
> *However, the buffered frames setting is interesting...*
> 
> 
> 
> I wonder if its just a new name for the same old thing meaning, Flip Queue aka pre-rendered frames. 3 is the most common size of the Flip Queue.

I bet that's another name for flip queue. It's interesting because devs usually don't expose that setting.


----------



## twerk

Is that a hard lock on the Ultra textures, so if you don't have 3GB you can't select it? If so, that's stupid.


----------



## Cretz

What settings would be the equivalent of how the game is on the PS4?


----------



## brucethemoose

Quote:


> Originally Posted by *twerk*
> 
> Is that a hard lock on the Ultra textures, so if you don't have 3GB you can't select it? If so, that's stupid.


Don't worry, someone will make a mod to get around it. You can probably spoof your GPU ID and make the game think you're running a 3GB card.


----------



## FattysGoneWild

2GB card myself. Wondering if you could get by playing on Ultra textures @ 1080p IF you turn off the following. I personally don't use any of these 3 settings in games anyway and always disable them.

AA
Motion Blur
DOF


----------



## eternal7trance

Got a 290, I ain't even mad


----------



## Newbie2009

Hmmm, wonder how much that will increase for 1600p.


----------



## RagingCain

If it's a "hard requirement" then it is totally unnecessary for PCs and shamefully programmed that way.


----------



## Pawelr98

It's probably just like Max Payne 3. Max settings showed 4060MB of VRAM usage (the game detected my HD 6990 as a 4GB card). That VRAM usage meter is simply unrealistic.


----------



## Outcasst

It's not a hard requirement. Was watching somebody streaming on Twitch before his account got deleted. He had a 2GB card but the game was stuttering like crazy on the highest setting.


----------



## frickfrock999

Holy crap, I never thought my 3 GB of VRAM would actually come in handy, especially since I don't play at 4K or have a multi-monitor setup.

I'm glad to hear this. It's a good thing games are taking advantage of high-end hardware now!


----------



## kx11

Someone played the game @ 4K without any problems.


----------



## Blameless

Quote:


> Originally Posted by *Outcasst*
> 
> It's not a hard requirement. Was watching somebody streaming on Twitch before his account got deleted. He had a 2GB card but the game was stuttering like crazy on the highest setting.


HardOCP had a similar experience. And they still complained about the preset not being available on 2GiB cards...

Obviously the game is using the memory for something if you can manually set ultra+ and it results in abysmal performance.


----------



## AndroidVageta

I played Titanfall at max texture resolution with triple 1080p monitors in Eyefinity on a single 7970 3GB.

I doubt this news very much.


----------



## B!0HaZard

Quote:


> Originally Posted by *brucethemoose*
> 
> I bet that's another name for flip queue. It's interesting because devs usually don't expose that setting.


Well, Far Cry 3 had the option too (with the exact same name in fact) so maybe it's a Ubi thing. AC4 doesn't, however.
Quote:


> Originally Posted by *RagingCain*
> 
> If it's a "hard requirement" then it is totally unnecessary for PCs and shamefully programmed that way.


It completely breaks the game a few years down the road when it can no longer detect the correct amount, or something stupid. I believe GTA4 had issues detecting RAM correctly. I've had older games tell me that I don't pass the requirements because they don't recognize a newer CPU/GPU; luckily none have prevented me from playing anyway.


----------



## CBZ323

Why the hell are people saying it's a hard requirement?

Someone made that up and everyone is freaking out. Sometimes I hate the internet.


----------



## John Shepard

50 bucks says any high-end 2GB card can easily max it out at 1080p (might have to turn down the MSAA though).
You will never ever need more than 2GB at 1080p.


----------



## y2kcamaross

Quote:


> Originally Posted by *John Shepard*
> 
> 50 bucks says any high end 2GB card can easily max it out at 1080p.(might have to turn down the MSAA though)
> *You will never ever need more than 2GB at 1080p*.


....


----------



## Murlocke

All I will say is: if 2x 780 Tis can't max this game at 1440p, then it almost certainly comes down to bad coding/a bad port. If it needs 3GB at 1080p, it's going to easily require 4+ GB at 1440p for good performance.

Let's recap:
- Quad core required to even play on low.
- 4GB RAM to even launch the game and play on low, 6GB recommended. Max will surely require more.
- DX11 required.
- 3GB VRAM to max on 1080p.

Honestly, this game is going to do abysmally with these requirements. I bought the digital deluxe, but I don't expect big PC sales, and I expect a lot of unhappy gamers who didn't read the requirements wanting refunds.


----------



## mrawesome421

Quote:


> Originally Posted by *John Shepard*
> 
> 50 bucks says any high end 2GB card can easily max it out at 1080p.(might have to turn down the MSAA though)
> You will never ever need more than 2GB at 1080p.


Skyrim would like a word with you....


----------



## Clairvoyant129

Quote:


> Originally Posted by *Murlocke*
> 
> All I will say is - If 2x 780 Tis can't max this game at 1440p, then it almost certainly comes down to bad coding/port.


I don't think we will have to worry (those with 780s, 290s, Titans etc).


----------



## Bit_reaper

Quote:


> Originally Posted by *mrawesome421*
> 
> Skyrim would like a word with you....


Modded Skyrim. In its vanilla state you won't get it over 2GB.

But yes, the time will come when more than 2GB of VRAM will be needed even @ 1080p. Will Watch Dogs be one of those titles? We'll see. Even so, there is also the GPU grunt to consider. There hasn't been a single game where a 2GB card would have had the GPU power but not the VRAM to run the game maxed.

VRAM-hungry games will mostly be an issue for people running 2GB multi-GPU setups.


----------



## Newbie2009

Quote:


> Originally Posted by *Murlocke*
> 
> All I will say is - If 2x 780 Tis can't max this game at 1440p, then it almost certainly comes down to bad coding/port. If it needs 3GB on 1080p, it's going to easily require 4+ GB on 1440p for good performance.
> 
> Let's recap:
> - Quad core required.
> - 4GB RAM to even launch the game and play on low, 6GB recommended. Max will surely require more.
> - DX11 required.
> - 3GB VRAM to max on 1080p.
> 
> Honestly this game is going to do abysmally with these requirements. I bought the digital deluxe, but I don't expect big PC sales and I expect a lot of unhappy gamers that didn't read the requirements wanting refunds.


Yeah, bad coding no doubt. Let's hope they don't pull a Ubisoft. Oh wait.


----------



## Alex132

Well if this is true I will have tons of fun with 2GB on 1440p.

I'd just like to point out that they said they could max the game with a single 670, and that has 2GB of VRAM.

Quote:


> Originally Posted by *Bit_reaper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mrawesome421*
> 
> Skyrim would like a word with you....
> 
> 
> 
> Modded Skyrim. In its vanilla state you won't get it over 2GB.
> 
> But yes, the time will come when more than 2GB of VRAM will be needed even @ 1080p. Will Watch Dogs be one of those titles? We'll see. Even so, there is also the GPU grunt to consider. There hasn't been a single game where a 2GB card would have had the GPU power but not the VRAM to run the game maxed.
> 
> VRAM-hungry games will mostly be an issue for people running 2GB multi-GPU setups.

I wish I coulda gotten 4GB on my 690


----------



## Murlocke

Quote:


> Originally Posted by *Alex132*
> 
> Just like to point out they said they could max the game with a single 670, and that has 2GB of VRAM.


That's why I feel this is all hype to increase game sales/advertising. GTA IV claimed outrageous VRAM requirements, and even locked settings behind them unless you tweaked config files, but many people managed to max the game with much less.


----------



## kx11

best review EVER

http://teamcoco.com/video/clueless-gamer-watchdogs


----------



## B!0HaZard

Quote:


> Originally Posted by *Bit_reaper*
> 
> VRAM-hungry games will mostly be an issue for people running 2GB multi-GPU setups.


I think most of the time, even if you could hit the VRAM limit, you wouldn't get good frame rates anyway; most 2GB cards just aren't that fast. 2GB Kepler cards are the only ones that might really be struggling.


----------



## Alex132

Quote:


> Originally Posted by *Murlocke*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Just like to point out they said they could max the game with a single 670, and that has 2GB of VRAM.
> 
> 
> 
> That's why I feel this is all hype to increase game sales/advertise more. GTA IV claimed outrageous VRAM requirements, and even locked them without tweaking, but many people managed to max the game with much less.

Sadly, it's actually become a sort of marketing thing where companies just try to post the highest recommended specs to impress people.


----------



## SoloCamo

Quote:


> Originally Posted by *John Shepard*
> 
> 50 bucks says any high end 2GB card can easily max it out at 1080p.(might have to turn down the MSAA though)
> *You will never ever need more than 1GB at 1080p*.


And less than 2 years ago it was the above quote... I learned back in the GeForce 4 days that it's better to go with the most VRAM you can get on a good card (don't go 4GB GT 640 over a 2GB 680, obviously). While my friend was struggling with his 64MB Ti 4200, I was sitting happy with my 128MB one and had enough headroom for some more AA, too.


----------



## Artikbot

6 gigs of RAM becoming a trend... That's it. This Christmas the computer needs an overhaul.

Quote:


> Originally Posted by *Bit_reaper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mrawesome421*
> 
> Skyrim would like a word with you....
> 
> 
> 
> Modded Skyrim. In its vanilla state you won't get it over 2GB.

Official hi-res texpacks are enough.


----------



## PunkX 1

I'm ready


----------



## Paladin Goo

Quote:


> Originally Posted by *brucethemoose*
> 
> Is that a hard requirement or suggestion?
> 
> I'm sure you can mess with AA and tweak ini files to get full res textures working on 2GB cards. *This is an Xbox 360 game after all*


There's so much wrong with that statement.


----------



## B!0HaZard

Quote:


> Originally Posted by *SoloCamo*
> 
> And less than 2 years ago it was the above quote... I learned back in the GeForce 4 days that it's better to go with the most VRAM you can get on a good card (don't go 4GB GT 640 over a 2GB 680, obviously). While my friend was struggling with his 64MB Ti 4200, I was sitting happy with my 128MB one and had enough headroom for some more AA, too.


Yeah, when I built my first gaming PC, everyone kept repeating "dual cores are just as good as quads in games" and "2 GB system RAM is enough". In 2011 when I upgraded it was, "eh, 4 GB system RAM is plenty for any game out currently". All of those statements were true at the time, but I had to upgrade early because of all of those decisions.


----------



## MPsilent

Time to upgrade my GTX 670 2GB then...


----------



## Paladin Goo

So, all that crap about "most of the design team are running it with 670s on Ultra with a good CPU" was a lie? Thanks Ubi.


----------



## eternal7trance

Quote:


> Originally Posted by *Raven Dizzle*
> 
> So, all that crap about "most of the design team are running it with 670s on Ultra with a good CPU" was a lie? Thanks Ubi.


Maybe they had 670 4gb models?


----------



## iamhollywood5

Quote:


> Originally Posted by *Raven Dizzle*
> 
> So, all that crap about "most of the design team are running it with 670s on Ultra with a good CPU" was a lie? Thanks Ubi.


Yeah, that Jonathon Morin guy said you could max the game with a 670 as long as there was no CPU bottleneck. I don't know what to believe about this game anymore. They keep changing the hardware specs, and each new spec is more ridiculous.


----------



## jattz

My 650 ti boost is obviously going to suffer.


----------



## Ascii Aficionado

I simply don't believe this.

I'd wait to see what reputable people say since this is solely based on a message in the game's menu.


----------



## mrawesome421

I am more interested in whether my 4GHz Phenom X4 will struggle on this title. My 7850 will do its job, I think.


----------



## Ascii Aficionado

Quote:


> Originally Posted by *mrawesome421*
> 
> I am more interested in if my 4Ghz Phenom X4 will struggle on this title. My 7850 will do it's job I think.


I assume it'll be a massive bottleneck simply due to architecture/IPC.


----------



## Paladin Goo

Quote:


> Originally Posted by *eternal7trance*
> 
> Maybe they had 670 4gb models?


Yes, 'cause when people hear GTX 670, their first thought is the few 4GB models, and not the garden-variety 2GB models that 90% of people have. lol.


----------



## mrawesome421

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> I assume it'll be a massive bottleneck simply due to architecture/IPC.


Eh.. I certainly hope not. But I'll be darned if I upgrade my PC just yet for THIS game.


----------



## driftingforlife

I doubt this, probably just has a memory leak.

I mean, I can max Crysis 3 on my 2GB 680s at 1440p, and this is no damn Crysis.


----------



## TopicClocker

Quote:


> Originally Posted by *driftingforlife*
> 
> I doubt this, probably just has a memory leak.
> 
> I mean, I can max Crysis 3 on my 2GB 680s at 1440p, and this is no damn Crysis.


It will be a Crisis if Ultra textures don't run well on 2GB cards.


----------



## BradleyW

I am ready for Watch Dogs.....


----------



## Bitemarks and bloodstains

This doesn't surprise me, as Wolfenstein is taking up 2.5GB of VRAM at 2560x1440 on my mate's setup.

I personally hope this is true (as long as the graphics are amazing). We have been stuck in the last gen for too long.


----------



## StrongForce

This almost sounds like a conspiracy to try to make me regret not waiting a bit more and get a 780 instead of that 770 I just got.

lol

But yeah, I hope that's not true. Yesterday I was playing BF4 for the first time; full Ultra would give me a few lag spikes, and removing MSAA altogether still gave lag spikes. So I don't know, maybe it was the server, but it could be the VRAM too, such a heavy game.

I'll have to do more testing.


----------



## t00sl0w

Sounds like "oh look, PC players have to buy the most expensive graphics cards and CPUs to be able to play the game smoothly and attractively; meanwhile, if you buy an Xbone or PS4 you don't have to spend more than $400 to experience the greatness" propaganda.


----------



## bencher

Quote:


> Originally Posted by *eternal7trance*
> 
> Got a 290, I ain't even mad


----------



## lostsurfer

Someone's going to find a value in a .cfg file to change to work around the 3GB requirement.


----------



## Bit_reaper

Quote:


> Originally Posted by *Artikbot*
> 
> 6 gigs of RAM becoming a trend... That's it. This Christmas the computer needs an overhaul.
> 
> Official hi-res texpacks are enough.


You sure about that? Even with the official hi-res pack I wasn't getting more than 1.8GB reserved. It wasn't until I started messing around with ini tweaks that I pushed over 2GB, bumping shadow resolution from the highest in-game setting of 4K up to 8K before I ran out. Really, shadow res sucked up way more than I expected it to.

I will admit, though, that I didn't play the game much unmodded, so I guess there could be a place/situation where it goes over 2GB with only the official texture pack, but I certainly didn't run into any VRAM issues with just that.
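That jump shouldn't be a surprise, in hindsight: shadow map memory grows with the square of the resolution. A quick estimate (assuming a single 32-bit depth texel per shadow map; Skyrim's real allocation may differ):

```python
# Shadow map memory grows with the square of its resolution.
# Assumes one 32-bit (4-byte) depth buffer; illustrative numbers only.
def shadow_map_mib(resolution, bytes_per_texel=4):
    return resolution * resolution * bytes_per_texel / 1024**2

print(shadow_map_mib(4096))  # 64.0  -> the highest in-game setting
print(shadow_map_mib(8192))  # 256.0 -> 4x the memory for 2x the resolution
```

So doubling the ini value quadruples the memory, which is exactly the kind of tweak that quietly blows past a 2GB budget.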


----------



## Alvarado

Quote:


> Originally Posted by *StrongForce*
> 
> This almost sounds like a conspiracy to try to make me regret not waiting a bit more and get a 780 instead of that 770 I just got. lol


I agree!


----------



## NavDigitalStorm

Hopefully my 295x2 can run this on 4K without issues


----------



## mohit9206

I told you guys to get the 4GB 770/780, but you wouldn't listen, saying 2GB is enough. Isn't it sensible that if you're spending $400 on a graphics card, you get one with at least 3GB of VRAM? The 770 is certainly powerful enough to utilise 3GB, so future-proofing is always a good idea. Hopefully the popularity of 3GB and 4GB cards will increase now; 2GB needs to die above the $400 price point.


----------



## MapRef41N93W

Quote:


> Originally Posted by *AndroidVageta*
> 
> I played Titanfall at max texture resolution with triple 1080p monitors in Eyefinity on a single 7970 3GB.
> 
> I doubt this news very much.


Titanfall is a Source engine game that runs on a potato. Watch Dogs is not.


----------



## mohit9206

Quote:


> Originally Posted by *SoloCamo*
> 
> And less than 2 years ago it was the above quote... I learned back in the GeForce 4 days that it's better to go with the most VRAM you can get on a good card (don't go 4GB GT 640 over a 2GB 680, obviously). While my friend was struggling with his 64MB Ti 4200, I was sitting happy with my 128MB one and had enough headroom for some more AA, too.


You were smart.
Quote:


> Originally Posted by *B!0HaZard*
> 
> Yeah, when I built my first gaming PC, everyone kept repeating "dual cores are just as good as quads in games" and "2 GB system RAM is enough". In 2011 when I upgraded it was, "eh, 4 GB system RAM is plenty for any game out currently". All of those statements were true at the time, but I had to upgrade early because of all of those decisions.


That is why it's always better to go with extra RAM, VRAM, cores, etc. if you can afford it; it lasts a bit longer. I still see so many people asking whether to go 2GB or 4GB on the 770, and the experts keep saying to get the 2GB unless you do multi-monitor, which is wrong advice. If someone can afford the 4GB 770, it's the better choice. Now you will see all those people regretting getting the 2GB. This is 2014; just imagine how much more demanding things will be in another few years. More VRAM is always better as long as the card is fast enough to utilise it.


----------



## djriful

Quote:


> Originally Posted by *brucethemoose*
> 
> Is that a hard requirement or suggestion?
> 
> I'm sure you can mess with AA and tweak ini files to get full res textures working on 2GB cards. This is an Xbox 360 game after all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However, the buffered frames setting is interesting...


Xbox game? No...

http://www.vg247.com/2013/02/27/watch-dogs-pc-is-lead-platform

The next-gen features, multi-simulation characters, and environments were designed with the PC in mind before the Xbox One and PS4 released. They only started porting to X1 and PS4 when the dev kits came out.


----------



## AndroidVageta

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Titanfall is a source engine game that runs on a potato. Watchdogs is not.


Titanfall still has high-res textures.


----------



## djriful

Me, running 6GB of VRAM since 2013. You guys have got some catching up to do!


----------



## djriful

Quote:


> Originally Posted by *stonedwarlock*
> 
> i played the game and its aswome..
> 
> one thing to notice is that textures are based on VRAM and the rest its up to gpu/cpu clocks.
> 
> running all ultra excpet the textures on HIGH and i get avg 40 fps with 670 2gb.
> 
> ya time to upgrade.
> 
> i have one question.. when you run out of vram the game will stop rendering textures? i can run the game at ultra textures but it will start suttering a lot but i dont see any difference between medium high and ultra, it all feels the same.. strange


Wait for Nvidia's new driver, and hope the game supports DirectX 11.2 Tiled Resources... that would solve the issue with VRAM.

http://blogs.windows.com/windows/b/extremewindows/archive/2014/01/30/directx-11-2-tiled-resources-enables-optimized-pc-gaming-experiences.aspx
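For the curious: the point of tiled resources is that only the 64 KB tiles a frame actually samples need to be resident in VRAM, not the whole texture. A toy estimate (the 64 KB tile size comes from the D3D spec; the texture size and the 25% residency figure are made-up numbers for illustration):

```python
TILE_BYTES = 64 * 1024  # D3D11.2 tiled-resource tile size

def texture_mib(width, height, bytes_per_texel=4):
    """Memory for a fully resident, uncompressed top mip."""
    return width * height * bytes_per_texel / 1024**2

def resident_mib(width, height, resident_fraction, bytes_per_texel=4):
    """Memory if only a fraction of the texture's tiles are mapped."""
    total_bytes = width * height * bytes_per_texel
    tiles = total_bytes / TILE_BYTES
    return tiles * resident_fraction * TILE_BYTES / 1024**2

print(texture_mib(8192, 8192))        # 256.0 MiB fully resident
print(resident_mib(8192, 8192, 0.25)) # 64.0 MiB with 25% of tiles mapped
```

That's the theory, anyway; it still needs hardware, driver, and (crucially) game support, which is why it can't rescue a 2GB card today.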


----------



## Mad Pistol

My GTX 780 is SO ready for this game!


----------



## EniGma1987

Quote:


> Originally Posted by *brucethemoose*
> 
> Is that a hard requirement or suggestion?
> 
> I'm sure you can mess with AA and tweak ini files to get full res textures working on 2GB cards. This is an Xbox 360 game after all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However, the buffered frames setting is interesting...


Nvidia cards have an option to set the max buffered/pre-rendered frames in the control panel globally or on a per-game basis. So I dont think it is anything really new.


----------



## cdoublejj

I wonder if 2x 780 Ti 6GB editions in SLI would take care of it?


----------



## ZealotKi11er

Come on guys. 2GB vRAM is so 2010.


----------



## Alvarado

Yeah... I'm not gonna call it unoptimized till we see drivers for it.


----------



## My Desired Display Name

Is there a demo, or do preorders get it early or something? A couple of people seem as though they're playing it. Or is it one of those dark-side-of-the-internet type of deals?


----------



## Ash568

Well, I hope my 7950 can handle high settings. Can't wait for the 870 to come out.


----------



## Alatar

A Titan at 1350MHz should do nicely.

Gotta use that 6GB frame buffer somewhere I guess...


----------



## tpi2007

Guys, the game will only be released on the 27th, today is the 23rd. No discussions based on illegally obtained copies please.


----------



## Seronx

Quote:


> Originally Posted by *tpi2007*
> 
> Guys, the game will only be released on the 27th, today is the 23rd. No discussions based on illegally obtained copies please.


A couple of retail stores have sent out the game early. I'm not 100% sure if it was by accident or by intent. These copies that were shipped are not illegal copies.


----------



## BusterOddo

Quote:


> Originally Posted by *Seronx*
> 
> A couple of retail stores have sent out the game early. I'm not 100% sure if it was by accident or intentional, but those copies are not illegal copies.


Really??? I pre-ordered from Steam and have to wait till Tuesday when I could be playing this weekend... Steam should release early too then.


----------



## waylo88

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Titanfall is a source engine game that runs on a potato. Watchdogs is not.


Not true at all. Go take a look at the Titanfall reddit and you'll see all kinds of people, from mid-range all the way up to high-end hardware, who have problems running the game at max settings.

Yes, that's due to it being unoptimized, but to say it runs on a potato just because it uses the Source engine is a gross exaggeration.


----------



## SkyNetSTI

Hey! I just replaced my 1080p monitor with a 1440p one! So does that mean 780 SLI won't be able to run the game on Ultra, according to this info???


----------



## Seronx

Quote:


> Originally Posted by *BusterOddo*
> 
> Really??? I pre-ordered from Steam and have to wait till Tuesday when I could be playing this weekend... Steam should release early too then


I want to reaffirm that I'm unsure if it was by accident or by intent. Valve is very strict about following release-date launches. Just because a couple of physical retail stores break the sale agreement doesn't mean Valve can as well.

I'm just pointing out that the copies shipped before the May 27th mark are legitimate copies of the game, while PC download versions not from Amazon, Uplay, Steam, etc. are illegitimate copies.
Quote:


> Originally Posted by *SkyNetSTI*
> 
> Hey! I just replaced my 1080p monitor with a 1440p one! So does that mean 780 SLI won't be able to run the game on Ultra, according to this info???


Your 780s will be able to run this game on Ultra; just make sure to have the latest Nvidia drivers.


----------



## ty12004

Quote:


> Originally Posted by *Seronx*
> 
> A couple retail stores have send out the game early. I'm not 100% sure if it was by accident or by intent. These copies that were shipped are not illegal copies.


Even if the PC copy is received early, it is locked by UPlay to prevent early access.

All that aside, really excited to see some benches for it haha.


----------



## B!0HaZard

Quote:


> Originally Posted by *ty12004*
> 
> Even if the PC copy is received early, it is locked by UPlay to prevent early access.


Just go offline. The system requirements don't mention online activation, so if you get a physical copy you should have no problem playing. One article claims that Uplay is needed to activate, but that could just mean you need the program installed; it doesn't mention needing an internet connection.


----------



## BulletSponge

Quote:


> Originally Posted by *t00sl0w*
> 
> sounds like, "oh guys look, PC players have to buy the most expensive graphics cards and CPUs to be able to play the game smoothly and attractively. meanwhile, if you buy a xbone or PS4 then you dont have to spend more than $400 to experience the greatness" propaganda.


Exactly what I was thinking. Anything to sell more consoles.


----------



## MonarchX

I upgraded from my old lady ASSus GTX 680 to my new hot, steamy, filthy, power-hungry slut EVGA GTX 780 Ti SC ACX *with Backplate* (that's right, it's important to mention the Backplate with a capital "B" or else it's just another average girl) just in time for this game then!!! Except that I was SO happy about my purchase, which I didn't expect to make for another 6 months since I had no money, that I gave away my Watch Dogs and even Daylight coupons. It didn't feel right to be happy by myself, and when I told others "Dude, get you a GTX 780 Ti!" to make them happy they got a wee bit annoyed, so I Watch Dogged them up, but now I just... over this game! I shouldn't have gotten that Backplate, but erm... no... she didn't look complete without it. Anyway, I feel very strangely off-topic today... not that there is a specific topic for any specific day... or... nah!


----------



## dboythagr8

I'll be at 4K tho.

I should be OK as long as I disable AA.


----------



## TopicClocker

Quote:


> Originally Posted by *ty12004*
> 
> Even if the PC copy is received early, it is locked by UPlay to prevent early access.
> 
> All that aside, really excited to see some benches for it haha.


Nope, my retail copy works fine through Uplay. There are likely to be some pirates about, though.


----------



## Murlocke

Quote:


> Originally Posted by *dboythagr8*
> 
> I'll be at 4k tho
> 
> I should be ok as long as a I disable AA


If this game truly needs 3GB of VRAM for 1080p Ultra, you will not be OK at 4K even without AA. Not even close, and that goes for every card on the market. You'd need more along the lines of 8+ GB of VRAM at 4K; it's 4x the pixels of 1080p. Even if this VRAM requirement is bogus... I still wouldn't expect to max this game at 4K on anything but 3-4x 6GB Titan Blacks given its current requirements. I think 2x 780 Tis would nicely max it at 1440p, but not much higher if you want to maintain 60FPS.
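To be fair, only the render targets scale with resolution; texture memory stays the same at any resolution, so total usage doesn't simply multiply by 4x at 4K. A back-of-the-envelope sketch (the five-target, 4-bytes-per-pixel figure is an assumption for illustration, not Watch Dogs' actual setup):

```python
# Resolution-dependent VRAM (render targets only). Assumes a deferred
# renderer with 5 full-screen targets at 4 bytes/pixel -- illustrative
# numbers, not Watch Dogs' actual setup. Texture memory is on top of
# this and does NOT grow with resolution.
def render_targets_mib(width, height, targets=5, bytes_per_pixel=4):
    return width * height * targets * bytes_per_pixel / 1024**2

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("1440p", (2560, 1440)),
                     ("4K", (3840, 2160))]:
    print(f"{name}: ~{render_targets_mib(w, h):.0f} MiB")
```

So the 4x-pixels figure applies to these buffers, but not to the much larger texture pool, which is why total VRAM at 4K usually lands well below 4x the 1080p number.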


----------



## BusterOddo

Quote:


> Originally Posted by *TopicClocker*
> 
> Nope, my retail copy works fine through Uplay. There are likely to be some pirates about, though.


Yeah, I'm pretty jealous right now.

How is it running? What settings are you using?


----------



## ty12004

Quote:


> Originally Posted by *TopicClocker*
> 
> Nope, my retail copy works fine through Uplay. There are likely to be some pirates about, though.


Ah sorry, I was wrong. Didn't mean to imply you were a pirate.

Nice of them to let people install and play early!


----------



## NuclearPeace

I don't know about you guys, but this game sounds like it either has graphics that have never been seen before or it just runs like dung. Probably the latter, going off how previous Ubisoft titles performed.


----------



## dboythagr8

Quote:


> Originally Posted by *Murlocke*
> 
> If this game truly needs 3GB of VRAM for 1080p Ultra, you will not be OK at 4K even without AA. Not even close, and that goes for every card on the market. You'd need more along the lines of 8+ GB of VRAM at 4K; it's 4x the pixels of 1080p. Even if this VRAM requirement is bogus... I still wouldn't expect to max this game at 4K on anything but 3-4x 6GB Titan Blacks given its current requirements. I think 2x 780 Tis would nicely max it at 1440p, but not much higher if you want to maintain 60FPS.


My Tis are faster than Titans. The only thing Titans have over them is the VRAM.


----------



## Alatar

Quote:


> Originally Posted by *dboythagr8*
> 
> My Ti's are faster than Titans. Only thing Titans have over them are the VRAM.


Unless you have Classifieds, Titans (not Blacks) will have the edge with high-end cooling, due to voltage control.


----------



## fateswarm

It's very easy to require more hardware resources: just do it badly. To require the same resources for a better outcome, that's a challenge.


----------



## omari79

A couple of days ago it was the minimum requirement of 6GB of RAM, and now this...

Granted, the 6GB RAM requirement is covered, but 3GB of VRAM is pushing it for the average gamer.

For an Xbox 360-era game... there's like zero optimization going into it, and it must be a viable business strategy, because they keep cooking and we keep eating.


----------



## startekee

Why is this game 15GB when Wolfenstein is 44GB?


----------



## Alatar

Quote:


> Originally Posted by *tpi2007*
> 
> Guys, the game will only be released on the 27th, today is the 23rd. No discussions based on illegally obtained copies please.


Reiterating this point. The game has not launched yet and isn't unlocked on uplay.

Please do not discuss playing with illegal, cracked etc. copies of the game. We're going to have to start issuing warnings if the posts about playing copies of the PC game continue.


----------



## kx11

Quote:


> Originally Posted by *startekee*
> 
> Why is this game 15GB and Wolfenstein is 44GB


Wolf' got an interesting SP, so the 40GB are worth it.


----------



## startekee

And whoever said this game looks like GTA must be smoking that good stuff, lol. And yeah, kx11, Wolfenstein is worth it, but I just don't understand the size when Watch Dogs should need more textures.


----------



## yunshin

Quote:


> Originally Posted by *kx11*
> 
> Wolf' got an interesting SP so the 40gb are worth it


Heh...


----------



## startekee

Quote:


> Originally Posted by *Alatar*
> 
> Reiterating this point. The game has not launched yet and isn't unlocked on uplay.
> 
> Please do not discuss playing with illegal, cracked etc. copies of the game. We're going to have to start issuing warnings if the posts about playing copies of the PC game continue.


Just saw this. Sorry


----------



## omari79

Quote:


> Originally Posted by *kx11*
> 
> Wolf' got an interesting SP so the 40gb are worth it


Yeah, the SP is engaging, but very few games have so little difference between low and ultra settings, plus no DX11/tessellation, in a 2014 title.


----------



## Faithh

TL;DR: it doesn't need 3 GB for ultra. What's going on with most people, really? They see their 780's VRAM usage at 95% and automatically assume 3 GB is the minimum. A lot of that amount is just cached resources that don't impact your performance. Skyrim used to fill your VRAM to 100% on any card: 3 GB, 4 GB, 8 or 12 GB.
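A toy model of the distinction being drawn here (all numbers are made up for illustration): VRAM *usage* reported by monitoring tools counts opportunistically cached assets, while smoothness only depends on whether the per-frame working set fits.

```python
# Toy model (illustrative numbers only): drivers cache assets in VRAM until
# space is needed, so reported "usage" climbs toward capacity on any card.
# Performance only suffers when the data a frame actually touches exceeds
# capacity, because then hot assets get evicted and re-uploaded.
def frame_is_smooth(working_set_mb: int, capacity_mb: int) -> bool:
    return working_set_mb <= capacity_mb

reported_usage_mb = 2900   # what a tool shows on a 3 GB card: looks "full"
working_set_mb = 1800      # hypothetical: what one frame really needs

print(frame_is_smooth(working_set_mb, 3072))  # True  - full-looking VRAM, no stutter
print(frame_is_smooth(working_set_mb, 2048))  # True  - fine on a 2 GB card too
print(frame_is_smooth(2200, 2048))            # False - eviction now hits hot data
```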


----------



## mercs213

The game isn't launched officially, but people are getting their copies early (ordered from uplay) and they can play it. Confirmed on the subreddit for watch dogs.


----------



## HeadlessKnight

Quote:


> Originally Posted by *Faithh*
> 
> TL;DR it doesnt need 3GB for ultra. Whats going on with most people really, they see their 780's vram usage at 95% and they automatically think 3GB is the minimum? Lots of that amount are just cached resources that just doesn't impact your performance. *Skyrim used to fill your vram to 100% on any card, a 3GB, 4GB, 8 & 12GB*.


Are you sure about that? Skyrim with many texture mods doesn't fill up the 3 GB on my 780 Ti; the highest I've seen is 2.5 GB, and it's the same story with my HD 7950. With heavy texture mods it did fill my GTX 670's 2 GB and stutter, though.


----------



## Faithh

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Are you sure about that? Skyrim with many texture mods don't fill up the 3 GB on my 780 Ti, highest I've seen is 2.5 GB, same attitude with my HD 7950. However using heavy texture mods it used to fill my GTX 670 2 GB and stutter.


If swapping doesn't work properly, you'd get exactly that. You might just have missed the issue, though.


----------



## sdlvx

An Nvidia-optimized title, and Ultra requires 3 GB of VRAM. A quick look at the Steam Hardware Survey shows most Nvidia cards are 2 GB GK104 models.

A slow golf clap for Nvidia, optimizing a title in a way that makes their own mid-to-high-end cards obsolete.

I suppose this is to make their expensive cards, which under-deliver beyond 1080p, look good in benchmarks at 1080p.


----------



## TopicClocker

Quote:


> Originally Posted by *BusterOddo*
> 
> Yeah I'm pretty jealous right now
> 
> 
> 
> 
> 
> 
> 
> How is it running? What settings are you using?


I gave ultra a shot and was stuttering; I presume it's this 3 GB VRAM requirement.
At the moment I'm running most things on ultra/highest settings, but I took ambient occlusion down to MHBAO; on HBAO+ it would stutter and drop frames for some reason. I'm not sure why, it probably needs a patch or a driver, hopefully.
I took my shadows down from ultra to high because the same thing would happen, and textures had to be dropped from ultra to high.

Another cause of stuttering was having ShadowPlay on in the background; turning it off helped smooth the game out. It's running at 30+ right now and pretty stable. I'm planning to do a thread on my performance experience this weekend if I'm not glued to the game.

From my experience so far, the killer settings are HBAO+, ultra shadows and ultra textures.
Another thing: this game is hammering my CPU like no other. I don't think I've ever seen my CPU hit this hard, with high utilization on every core; it just goes to show they put some effort into CPU-side optimization. I'm only on a quad core, but if it scales up to 8 cores and threads, there's certainly hope for 4-8 core chips, especially those Piledrivers.
Quote:


> Originally Posted by *ty12004*
> 
> Ah sorry, I was wrong. Didn't mean to imply you were a pirate.
> 
> Nice of them to allow people to play the install early!


It's alright man.
Quote:


> Originally Posted by *sdlvx*
> 
> Nvidia optimized title. Ultra requires 3GB of ram. A quick look at Steam Hardware Survey points to most Nvidia cards being 2GB GK104 models.
> 
> A slow golf clap for Nvidia optimizing a title to make their mid-high end cards obsolete.
> 
> I suppose this is to make their cards that are expensive yet under-deliver beyond 1080p look good in benchmarks at 1080p.


This. Ever since Titanfall I've wondered why Nvidia didn't make 3 GB or 4 GB standard; meanwhile the 7900s are chugging along, laughing at the GK104s.


----------



## pengs

Quote:


> Originally Posted by *startekee*
> 
> Why is this game 15GB and Wolfenstein is 44GB


MegaTextures (id Tech 5 streams huge unique textures, hence the install size).
Quote:


> Originally Posted by *startekee*
> 
> And whoever said this game look like GTA must me smoking on that good lol. And yea Kx11 wolfenstein is worth it but I just don't understand the size when Watch Dogs should need more textures


Game size and texture size aren't directly related, because each game has different things in the immediate view, and some games reuse textures more, lessening the need for VRAM.

Yeah I agree, it looks meh.


----------



## Ascii Aficionado

Hmm...

Now this makes sense

http://www.evga.com/articles/00830/


----------



## StrongForce

Man, thanks for posting this, because when I ordered my 770 I was kind of in a hurry, or hasty I should say, lol (all excited about finally getting a new card!), and didn't realise I could get the same card with 4 GB for 20-30 bucks more. No big deal.

So I emailed them today; good thing there's a 7-day retraction period, and I'm going to get the 4 GB model instead. It will be more worthwhile, and it's true that heavily modded games with texture packs can eat a LOT of VRAM, so it isn't surprising really. So yeah, damn, how come I didn't think of this when ordering?


----------



## NavDigitalStorm

4 more days... I can not handle this.


----------



## Alvarado

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> 4 more days... I can not handle this.


^ I know the feeling


----------



## stonedwarlock

Oh please, it's just bad coding. Everything maxed out in BF4 never reaches 2 GB, and Watch Dogs is nearly as impressive as BF4 yet reaches almost 4 GB.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Alvarado*
> 
> ^ I know the feeling


So how is everyone playing early? Pirated copies?


----------



## TopicClocker

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> So how is everyone playing early? Pirated copies?


Mr postman visited early


----------



## NavDigitalStorm

Oh pirates.... why....


----------



## XAslanX

Not the first time this has happened with Ubisoft. Blood Dragon leaked weeks before its official release, direct from UPlay; in other words, they have some pretty poor security.


----------



## Alvarado

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> So how is everyone playing early? Pirated copies?


Yeah.....a few of them on this one popular site.


----------



## mingocr83

Quote:


> Originally Posted by *XAslanX*
> 
> Not the first time this has happen with Ubisoft. Blood Dragon was leaked weeks direct from UPlay before it's official release, in other words they have some pretty poor security.


Agree!


----------



## TopicClocker

Quote:


> Originally Posted by *XAslanX*
> 
> Not the first time this has happen with Ubisoft. Blood Dragon was leaked weeks direct from UPlay before it's official release, in other words they have some pretty poor security.


Lol, Ubisoft shipped Watch Dogs from their Ubistore to people two days ago; there are a few people who bought the game and received early copies.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *TopicClocker*
> 
> lol ubisoft shipped watch dogs from their ubistore to people two days ago, there's a few people who bought the game with early copies.


But isn't the PC version locked until launch date?


----------



## TopicClocker

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> But isn't the PC version locked until launch date?


Mine isn't; it came through the door, I installed it, and it works.
UPlay version, btw.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *TopicClocker*
> 
> Mine isn't came through the door, installed and it works.
> UPlay version btw.


So people with the NVIDIA codes can play?


----------



## Clockster

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> So people with the NVIDIA codes can play?


Not if you ordered the digital version with the code.


----------



## TopicClocker

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> So people with the NVIDIA codes can play?


I don't know. I pre-ordered the boxed game, installed it, and it showed up in my library and asked me for a key, I think.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *TopicClocker*
> 
> I dont know, I pre ordered the box game, installed it and it showed up in my libray and asked me for a key I think


That's hilarious. No need for a crack even?


----------



## TopicClocker

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> That's hilarious. No need for a crack even?


Nope, lol. I was shocked myself; glad I got it from UPlay and from Ubisoft themselves.
I might do a thread tomorrow about performance.


----------



## mingocr83

I pre-ordered the digital version... does that mean I have to wait until release?


----------



## TopicClocker

Quote:


> Originally Posted by *mingocr83*
> 
> I preordered the digital version...it means that I have to wait until release?


I think so, that seems to be the general consensus, especially if ordered from Steam.


----------






## mingocr83

Quote:


> Originally Posted by *TopicClocker*
> 
> I think so, that seems to be the general consensus, especially if ordered from Steam.


PRE-ORDER
RELEASES: MAY 27, 2014
AT 12:01AM EST

Damn! At least that is May 26th, 10pm, my time...


----------



## Arm3nian

Quote:


> Originally Posted by *Alatar*
> 
> Unless you have classys Titans (not blacks) will have the edge with high end cooling due to voltage control.


Nope. By the time the Titan makes up the 10% difference, you'd be at benching-only, non-stable voltages/clocks. And even if you were stable, no one would run a Titan at those overvoltages 24/7. DP and 6 GB of VRAM are all you've got.


----------



## Redeemer

Hope my 5.0 GHz i5 2500K and 780 Ti cut it.


----------



## HeadlessKnight

Quote:


> Originally Posted by *Arm3nian*
> 
> Nope. By the time the titan makes up the 10% difference you would be at benching and non stable voltages/clocks. And even if you were stable, no one would run a titan at those overvoltages for 24/7. DP and 6gb vram is all you got.


He has a point. Titans can go up to 1.6 V (I think) with LLC enabled, while reference GTX 780 Ti's are locked to 1.21 V. Titans can beat out 780 Ti's much more easily than regular 780s, as the Ti's only have 7% more shaders, which translates to about a 4-5% difference clock-for-clock.
Titans are only slightly slower clock-for-clock but have much more voltage headroom, enough to easily overcome the SMX difference. Whether that's unstable depends on the card itself; this is a generalization, and generalizations don't always hold. Under watercooling I've seen impressive overclocks on some Titans; the only 780 Ti's that can hit 1350 MHz+ stable in everything are either vmodded or Classifieds.
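The 7% shader figure falls straight out of the public GK110 SMX counts (14 SMX enabled on the original Titan vs. 15 on the 780 Ti, at 192 CUDA cores per SMX); a quick check:

```python
# GK110 SMX counts: original GTX Titan enables 14 SMX, GTX 780 Ti all 15.
CORES_PER_SMX = 192
titan_cores = 14 * CORES_PER_SMX   # 2688
ti_cores = 15 * CORES_PER_SMX      # 2880

extra_shaders_pct = (ti_cores / titan_cores - 1) * 100
print(f"{extra_shaders_pct:.1f}% more shaders")  # 7.1% more shaders
```

Real-world scaling is sublinear, which is why the observed clock-for-clock gap lands around 4-5% rather than the full 7%.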


----------



## mingocr83

Quote:


> Originally Posted by *Redeemer*
> 
> Hope my 5.0Ghz i5 2500k and 780TI cut it


How much RAM though?

Twitch Dev Q&A

http://www.twitch.tv/ubisoft/b/531426190


----------



## TopicClocker

Quote:


> Originally Posted by *mingocr83*
> 
> How much RAM though?
> 
> Twitch Dev Q&A
> 
> http://www.twitch.tv/ubisoft/b/531426190


sigh I wish I had good speed to stream.


----------



## mingocr83

Quote:


> Originally Posted by *TopicClocker*
> 
> sigh I wish I had good speed to stream.


The Q&A is already over; that's the video recording from today. You can play at any resolution.


----------



## NavDigitalStorm

So apparently my mate's copy came early in the mail and he's playing. Thinking of heading over there after work to capture some game play for YouTube. Any clue on embargo details?


----------



## TopicClocker

Quote:


> Originally Posted by *mingocr83*
> 
> The Q/A is already over..that is the video recording from today...you can play at any resolution.


I mean to stream myself, my upload speed is pretty poor.
Quote:


> Originally Posted by *NavDigitalStorm*
> 
> So apparently my mate's copy came early in the mail and he's playing. Thinking of heading over there after work to capture some game play for YouTube. Any clue on embargo details?


Not sure, but it's pretty impressive.


----------



## StrongForce

Quote:


> Originally Posted by *Redeemer*
> 
> Hope my 5.0Ghz i5 2500k and 780TI cut it


Man, of course it does! A 4670K is barely more powerful than that, I bet.

Quote:


> Originally Posted by *mingocr83*
> 
> How much RAM though?
> 
> Twitch Dev Q&A
> 
> http://www.twitch.tv/ubisoft/b/531426190


I surely hope he has at least 8 GB with a PC like that.


----------



## Arm3nian

Quote:


> Originally Posted by *HeadlessKnight*
> 
> He got a point. Titans can go up to 1.6V (I think) with LLC enabled. Reference GTX 780 Ti's are locked to 1.21V. Titans can beat out 780 Ti's much easier than regular 780s. as the Ti's only have 7% more shaders which translate to about 4-5% difference cfc.
> Titans are only slightly slower cfc but have much voltage headroom to easily overcome the SMX difference. Unstable that depends on the card itself, this is a generalization and generalizations don't always work. Under watercooling I've seen impressive overclocks on some Titans, only 780 Ti's that can hit 1350 MHz+ stable at everything are either vmoded or the classifieds.


1.6 V, are you serious? That's LN2 zombie bench mode. Titans have died at 1.35-1.45 V. No one runs those voltages or clocks for 24/7 gaming.


----------



## HeadlessKnight

Quote:


> Originally Posted by *Arm3nian*
> 
> 1.6v are you serious? That's ln2 zombie bench mode. Titans have died at 1.35v-1.45v. No one runs those voltages or clocks for 24/7 gaming.


Yup. I know 1.6 V is unsafe for prolonged periods without exotic cooling; I was just pointing out how much voltage headroom the Titan has compared to reference 780 Ti's, and that's why they shine with watercooling, unlike the reference 780 Ti's, which won't gain as much due to voltage limitations. That's why I consider the Titan a better card than a reference 780 Ti under watercooling.
On air, though, the Ti's are better.


----------



## Clairvoyant129

Quote:


> Originally Posted by *Arm3nian*
> 
> 1.6v are you serious? That's ln2 zombie bench mode. Titans have died at 1.35v-1.45v. No one runs those voltages or clocks for 24/7 gaming.


He's just saying it can go up to those volts.


----------



## Arm3nian

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Yup. I know 1.6V is unsafe for prolonged periods without exotic cooling, I was just pointing out how much voltage headroom the Titan has when compared to reference 780 Ti's. and that's why they shine with watercooling unlike the reference 780 Ti's which won't gain that much due to voltage limitations. That's why I consider Titan to be a better card than ref 780 Ti with watercooling.
> However on air the Ti's are better though.


The Titan is better than a reference 780 Ti for benching under water, but it's not "faster" for gaming. It could match it, but you'd risk degrading the card running it like that 24/7. Not everyone lives in Finland; it was 100 degrees in Vegas yesterday.


----------



## Bit_reaper

Quote:


> Originally Posted by *Arm3nian*
> 
> Titan is better than ref 780ti for benching under water, but it's not "faster" for gaming. It could match it, but you will be risking degrading the card running it 24/7. Not everyone lives in Finland, it was 100 degrees in Vegas yesterday.


And here I thought all Yanks had air-con.


----------



## Arm3nian

Quote:


> Originally Posted by *Bit_reaper*
> 
> And here I taught all yanks had air-con


We do have air conditioning, but we can't just open the window where I live and have the ambients in the negatives.


----------



## XaNaX

I can confirm this works with 4 GB of RAM.


----------



## Flames21891

Quote:


> Originally Posted by *Bit_reaper*
> 
> And here I taught all yanks had air-con


Most do, but running it too often results in very high electric bills. I live in a desert myself, and here you sweat it out until you can't take it anymore. Unfortunately, that means your hardware has to sweat it out with you.









Hopefully the 3GB thing is mostly nonsense. Either that or the textures better look downright gorgeous.


----------



## Bit_reaper

Quote:


> Originally Posted by *Arm3nian*
> 
> We do have air conditioning, but we can't just open the window where I live and have the ambients in the negatives.


We might be pretty high up north, but summer is coming. It's already getting up to 28°C (82.4°F) in peak hours. It doesn't usually get super hot here, and cracking 35°C (95°F) is very rare, but when you don't have air-con it can get pretty damn toasty indoors even with moderately warm summers.

Even now at night, it's hard for me to get the room ambient to drop below 26°C (78°F).
Quote:


> Originally Posted by *Flames21891*
> 
> Most do, but running it too often results in very high electric bills. I live in a desert myself, and here you sweat it out until you can't take anymore. Unfortunately, that means your hardware has to sweat it out with you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hopefully the 3GB thing is mostly nonsense. Either that or the textures better look downright gorgeous.


That I do believe. The damn things chug down power like there's no tomorrow. High summer ambient temps were one of the things that got me into water cooling, and now I'm taking more drastic measures, like reducing my OC, as I just can't stand how hot the apartment is getting.


----------



## psp3000

Do you guys think I can run it on ultra with 8 GB of RAM, a GTX 780 Classy OC, and an i7-2600 non-K?


----------



## BusterOddo

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> 4 more days... I can not handle this.


Quote:


> Originally Posted by *Alvarado*
> 
> ^ I know the feeling


It's getting serious...lol


----------



## DF is BUSY

My 770 just died a bit on the inside.

Guess I'll have to turn down some settings @ 1200p.


----------



## Mygaffer

Quote:


> Originally Posted by *TheBDK*
> 
> Well that sucks. Only have 2 on my 270X.


Then upgrade. If true I am happy, because it means they are pushing the resolution of the textures.


----------



## Clocknut

Sounds like getting 2x GTX 860s in SLI is a bad idea.

Maybe I should hold on to my Radeon 7790 a while longer and get a single GTX 980/R9 490X.


----------



## NavDigitalStorm

The game can't even run on Ultra without stuttering on a 295X2 at 1440p.


----------



## daman246

My 670 is dying; can't run ultra, unfortunately.


----------



## SOCOM_HERO

Shameful they couldn't get it down to 2 GB. That really makes running the game at top settings an expensive endeavor for most.


----------



## ZealotKi11er

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Game cant even run on Ultra without stuttering on a 295x2 at 1440p.


Probably needs drivers from AMD, and it's an Nvidia title.


----------



## Redeemer

Quote:


> Originally Posted by *mingocr83*
> 
> How much RAM though?
> 
> Twitch Dev Q&A
> 
> http://www.twitch.tv/ubisoft/b/531426190


16 GB G.Skill Sniper 1866


----------



## NavDigitalStorm

So, any idea about posting graphics comparison and repercussions?


----------



## jason387

Quote:


> Originally Posted by *twerk*
> 
> Is that a hard lock on the Ultra textures, so if you don't have 3GB you can't select it? If so, that's stupid.


You can select ULTRA Textures even with a 1GB card. I can confirm that.


----------



## djriful

Quote:


> Originally Posted by *Redeemer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mingocr83*
> 
> How much RAM though?
> 
> Twitch Dev Q&A
> 
> http://www.twitch.tv/ubisoft/b/531426190
> 
> 
> 
> 16 GB G.Skill Sniper 1866
Click to expand...

Cheap Samsung 30 nm wonder RAM, 16 GB OC'd @ 2133-2400 MHz ($98 total).


----------



## TopicClocker

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> So, any idea about posting graphics comparison and repercussions?


I'll do the graphics comparisons in my new thread. What do you mean by repercussions, performance-wise?


----------



## Cpyro

Quote:


> Originally Posted by *TopicClocker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NavDigitalStorm*
> 
> So, any idea about posting graphics comparison and repercussions?
> 
> 
> 
> I'll do the graphics comparisons in my new thread, what do you mean by repecussions, performance wise?
Click to expand...

I think he means more along the lines of this (copied from the other thread): the game hasn't actually launched, so copyright issues and possible legal action could still be pursued by Ubisoft, I'd say. They have taken some videos off YouTube and other sites, but there are too many to go through, I'm sure.


----------



## TopicClocker

Quote:


> Originally Posted by *Cpyro*
> 
> i think he means more along the lines that (copied this from other thread
> 
> 
> 
> 
> 
> 
> 
> ) The game hasn't actually launched, so copyright issues and possible legal action can still be taken by Ubisoft i would say. They have taken some videos off you-tube and other sites but there are too many to go through, i'm sure.


I heard something about an embargo being lifted, and I think some seemingly major YouTubers have posted as well. I'm not sure.


----------



## NavDigitalStorm

I'm uploading my video. There's tons of graphics stuff already out there.


----------



## error-id10t

You guys using the PC tweaks to get supposed 4K textures..?


----------



## whtchocla7e

https://www.youtube.com/watch?v=uKto4jnkVVU


----------



## djriful

Quote:


> Originally Posted by *whtchocla7e*
> 
> https://www.youtube.com/watch?v=uKto4jnkVVU


I just laughed at the moment when you died; your ragdoll body just nudges the police car sideways...

Must be some super physics.


----------






## NinjaToast

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> http://www.digitalstormonline.com/unlocked/watch-dogs-graphics-comparison-ultra-to-low-pc-idnum276/


Quote:


> Originally Posted by *Alvarado*
> 
> Hmm.... not sure if I like that.


Yeah, the biggest difference was really the lighting; some of the textures didn't even look like they changed at all. xD


----------



## xutnubu

Quote:


> Originally Posted by *whtchocla7e*
> 
> https://www.youtube.com/watch?v=uKto4jnkVVU


In-game SMAA, that's nice. More games should support it.


----------



## MapRef41N93W

Quote:


> Originally Posted by *waylo88*
> 
> Not true at all. Go take a look at the Titanfall reddit and you'll see all kinds of people with mid-range all the way up to high-end hardware who have problems running the game with max settings.
> 
> Yes, that's due to it being unoptimized, but to say it runs on a potato just because it uses the Source engine is a gross exaggeration.


Yes, it was an obvious exaggeration. I've never heard of anyone on high-end hardware having trouble running the game. It's a mid-range game, and not exactly something someone should boast about running.


----------



## Kinaesthetic

Quote:


> Originally Posted by *NinjaToast*
> 
> Yeah, the most that looked different was the lighting really, some of the textures didn't even look like they changed at all. xD


I have a feeling they were using the leaked Chinese version rather than an official one.

The leaked Chinese version is 13.9 GB, whereas the actual system requirements given by Ubisoft show the game needing 25 GB of HDD space. I'd be highly suspicious of the Chinese cracked version; I doubt it has somehow compressed away over 11 GB of data. Even the Xbox One/PS4 versions are apparently 20 GB.

So I'd say some things are missing.


----------



## error-id10t

http://www.reddit.com/r/watch_dogs/comments/26c3ew/deferredfxqualityconsole/

Apply this and/or make everything Ultra and add more SuperSampling.
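For anyone who'd rather not dig through the thread: the tweak reportedly comes down to editing the game's GamerProfile.xml (under Documents\My Games\Watch_Dogs\) and forcing the hidden quality level. The element nesting below is a sketch of what's being described, not a verified excerpt, so back up the file first:

```xml
<!-- Sketch only: element and attribute names as described in the thread;
     the real element carries many other attributes, omitted here. -->
<CustomQuality>
  <quality DeferredFxQuality="console" />
</CustomQuality>
```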


----------



## Alvarado

Quote:


> Originally Posted by *error-id10t*
> 
> http://www.reddit.com/r/watch_dogs/comments/26c3ew/deferredfxqualityconsole/
> 
> Apply this and/or make everything Ultra and add more SuperSampling.


Wow...


----------



## iamhollywood5

Sad to see that my $700 GPU will not be able to run this the way it should. Saw my earlier comment about my gameplay experience was deleted, I do have an early, legit copy. I pre-ordered the physical CD from Gamestop with the local pick-up option - it's literally across from my backyard. (Yes I still try to get physical copies of some games due to resale value, and I wasn't confident I'd want to keep Watch Dogs but still wanted to try it day 1). I know the manager personally and twisted his arm enough today to pick mine up early. I'm not sure what you'd consider the "legal status" of my copy, but it's paid for and not pirated. I went to get it today because I saw that other people were playing early through uplay without problems.

Anywho, I just want to warn people what to expect. If you have a 3GB card, you will not be able to run ultra textures and any real AA at 1080p without some problems. The game will use more than 3GB at 1080p if you have it; 3GB is the minimum for these settings, not the recommended. Morin himself admitted that a 780 3GB probably won't be very smooth with those settings. I still chalk it up to poor optimization, and the realization that I couldn't run max settings at 1080p on a $700 GPU was immensely frustrating. The worst part is that when the game is not using all 3072MB, it runs at 90+ fps with everything else maxed out, but as soon as I hit the limit it tanks into the 20s for a second. I've never seen a game like that, where the textures take up so much memory but the core can render them extremely fast. And the ultra textures look decent, but not good enough to need 3GB. Both Crysis and BF4 have better-looking textures that use much less than 3072MB at 1080p.

I don't know what to do. With decent AA and ultra textures, the game actually looks pretty good. Dropping AA, textures, and AO down gets rid of the frame dives, but it looks unflattering and runs at a needlessly high FPS. I really hope they put out a patch that helps 3GB cards, because there are a ton of 3GB cards out there and the game sits at an awkward spot where it goes just past that. I've seriously never seen a case like this; usually if a game is using that much memory, the core can't render it all at 60+ fps. It's just awkward. At the very least, the game does not have that awful, inevitable single-frame stuttering that plagued Far Cry 3, which I was worried it would have.
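The 90-fps-to-20s cliff described above is consistent with textures spilling into system RAM once VRAM is exhausted: the overflow has to cross PCIe every frame. A back-of-envelope sketch (the bandwidth and overflow figures are round-number assumptions for illustration, not measurements):

```python
# Rough model: frames that fit in VRAM render fast; once the working set
# spills, the overflow streams over PCIe and stalls the frame.
base_frame_ms = 1000 / 90        # ~11.1 ms when everything fits (the 90+ fps case)
pcie_gb_per_s = 12.0             # assumed effective PCIe 3.0 x16 throughput
overflow_gb = 0.5                # assumed ~500 MB of textures that no longer fit

transfer_ms = overflow_gb / pcie_gb_per_s * 1000   # ~41.7 ms of transfer stalls
fps = 1000 / (base_frame_ms + transfer_ms)
print(f"{fps:.0f} fps")          # ~19 fps: the same order as the dips described
```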


----------



## StrongForce

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Game cant even run on Ultra without stuttering on a 295x2 at 1440p.


What's your FPS? Reducing MSAA doesn't help? Sounds crazy.

And yeah, what's the difference between SMAA and MSAA? I think MSAA takes more resources, but I don't know; I just got an Nvidia card and never saw SMAA before yesterday, when I popped in Arkham Origins. Which one looks better, and which one takes more resources?


----------



## AndroidVageta

Well... I heard from a friend with a system VERY similar to mine (he has a 4670K, but otherwise it's the same) who says he can play at 1080p with temporal SMAA and everything else maxed, and it's perfectly playable: maybe not a solid 60 FPS, but above ~40 FPS and very smooth.

He's also said that the graphics, while not mind-blowing, are among some of the, if not THE graphics he's ever seen. He says it's not quite the first E3 showing, but it still looks breathtaking; the water is the most realistic he's seen (and he's played Crysis, BF4, etc.). Overall he says it's better than even GTA 4 with iCEnhancer; the world is a lot more alive and detailed.

So who knows; I've yet to see it in person, so I can't say. He did say he wasn't too impressed by what he saw beforehand, but after playing it he is pleasantly surprised.


----------



## axizor

Quote:


> Originally Posted by *AndroidVageta*
> 
> Well... I heard from a friend with a system VERY similar to mine (he has a 4670K, but otherwise it's the same) who says he can play at 1080p with temporal SMAA and everything else maxed, and it's perfectly playable: maybe not a solid 60 FPS, but above ~40 FPS and very smooth.
> 
> He's also said that the graphics, while not mind blowing, *are among some of the, if not THE graphics he's ever seen*. Says it's not quite first E3 showing but that it still looks breath taking. Says the water is the most realistic he's seen (and he's played Crysis, BF4, etc). Over all he says it's better than even GTA 4 with iCEnhancer...the world is a lot more alive and detailed.
> 
> So who know's, I've yet to see it in person so I can't say. He will say that he wasn't too impressed by what he saw but after playing it is pleasantly surprised.


Some of the what graphics he's ever seen?

And how is he playing it? Bittorrent?


----------



## naved777

Wow, at 1080p it eats the whole 3 GB of VRAM.


----------



## bencher

Quote:


> Originally Posted by *iamhollywood5*
> 
> Sad to see that my $700 GPU will not be able to run this the way it should. Saw my earlier comment about my gameplay experience was deleted, I do have an early, legit copy. I pre-ordered the physical CD from Gamestop with the local pick-up option - it's literally across from my backyard. (Yes I still try to get physical copies of some games due to resale value, and I wasn't confident I'd want to keep Watch Dogs but still wanted to try it day 1). I know the manager personally and twisted his arm enough today to pick mine up early. I'm not sure what you'd consider the "legal status" of my copy, but it's paid for and not pirated. I went to get it today because I saw that other people were playing early through uplay without problems.
> 
> Anywho, I just want to warn people what to expect. If you have a 3GB card, you will not be able to run ultra textures and any real AA at 1080p without some problems. The game will use more than 3GB at 1080p if you have it. 3GB is the minimum for these settings, not the recommended. Morin himself admitted that a 780 3GB probably wont be very smooth with those settings. I still chalk it up to poor optimization, and the realization that I couldn't run max settings at 1080p on a $700 GPU was immensely frustrating. The worst part is that when the game is not using all 3072MB, it runs at 90+ fps with everything else maxed out, but as soon as I hit the limit it tanks into the 20s for a second. I've never seen a game like that, where the textures take up so much memory but the core can render them extremely fast. And the ultra textures look decent, but not good enough to need 3GB. Both Crysis and BF4 have better looking textures that use much less than 3072MB at 1080p.
> 
> I don't know what to do. With decent AA and ultra textures, the game actually looks pretty good. Dropping AA, textures, and AO down gets rid of the frame dives but looks unflattering and runs at a needlessly high FPS. I really hope they put out a patch that helps 3GB cards, because there are a ton of 3GB cards on there and the game is at an awkward position where it goes just past that. Seriously have never seen a case like this. Usually if the game is using that much memory, the core can't render it all at 60+ fps. It's just awkward. At the very least, the game does not have that awful and inevitable single-frame stuttering that plagued Far Cry 3, which I was worried it would have.


The price of the GPU doesn't always determine performance.









My $275 GPU will run it at higher settings.


----------



## AndroidVageta

Quote:


> Originally Posted by *bencher*
> 
> The price of the Gpu doesnt always determine performance
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My $275 gpu will run it at higher settings.


My buddy's 7970 plays the game fine...


----------



## degenn

http://www.dsogaming.com/news/watch_dogs-hidden-deferredfxquality-option-improves-significantly-visuals/#more-65558


----------



## iamhollywood5

Quote:


> Originally Posted by *AndroidVageta*
> 
> My buddies 7970 plays the game fine...


Yeah but with ultra textures?

Question for anyone who has played on a 4GB or 6GB card: At 1080p with ultra textures and AA on, do you ever see VRAM usage exceed 3072MB?


----------



## Alatar

I wonder if having to do some strange ini file tweak for actual max settings is a deliberate move from Ubi. Do this before launch and then surprise everyone at launch with much better visuals, like we see in those pics above. And those weren't even night shots, where the game would shine.
Quote:


> Originally Posted by *Arm3nian*
> 
> Nope. By the time the titan makes up the 10% difference you would be at benching and non stable voltages/clocks. And even if you were stable, no one would run a titan at those overvoltages for 24/7. DP and 6gb vram is all you got.


Quote:


> Originally Posted by *Arm3nian*
> 
> 1.6v are you serious? That's ln2 zombie bench mode. Titans have died at 1.35v-1.45v. No one runs those voltages or clocks for 24/7 gaming.


Quote:


> Originally Posted by *Arm3nian*
> 
> Titan is better than ref 780ti for benching under water, but it's not "faster" for gaming. It could match it, but you will be risking degrading the card running it 24/7. Not everyone lives in Finland, it was 100 degrees in Vegas yesterday.


The only titan that's actually known to have died (as far as I know at least) at ~1.45v was rbuass' card on LN2 without any vrm cooling. And it was hardmodded as well.

Plenty of people run say 1.325v on water for gaming. Sometimes enough for 1400MHz. The 780Ti is only a couple of percent faster clock for clock and has to deal with lower voltages. Feel free to run firestrike extreme with your max OC and I'll show you what a Titan can do 24/7. It might look bad in reviews but it's a completely different beast in the hands of enthusiasts.

Truth of the matter is that if you're using watercooling you need a Ti classy to beat Titans.


----------



## eternal7trance

Quote:


> Originally Posted by *iamhollywood5*
> 
> Yeah but with ultra textures?
> 
> Question for anyone who has played on a 4GB or 6GB card: At 1080p with ultra textures and AA on, do you ever see VRAM usage exceed 3072MB?


On this game or other games? Because I have with other games


----------



## bencher

Quote:


> Originally Posted by *naved777*
> 
> wow at 1080p it eats whole 3GB Vram


Quote:


> Originally Posted by *iamhollywood5*
> 
> Yeah but with ultra textures?
> 
> Question for anyone who has played on a 4GB or 6GB card: At 1080p with ultra textures and AA on, do you ever see VRAM usage exceed 3072MB?


----------



## Faithh

The game is quite CPU-bound sometimes, and it's lightly threaded, honestly.


----------



## AndroidVageta

Quote:


> Originally Posted by *iamhollywood5*
> 
> Yeah but with ultra textures?
> 
> Question for anyone who has played on a 4GB or 6GB card: At 1080p with ultra textures and AA on, do you ever see VRAM usage exceed 3072MB?


Yes. He's playing with everything maxed (including textures) with temporal SMAA at 1080p perfectly fine. He's playing it right now without a single issue. I don't know his memory usage or frame rate, but he's saying it's at least mid-30s and up constantly with no slowdown or anything.


----------



## bencher

Quote:


> Originally Posted by *AndroidVageta*
> 
> Yes. He's playing with everything maxed (including textures) with temporal SMAA at 1080p perfectly fine. He's playing it right now without a single issue. I don't know his memory usage or frame rate but he's saying it's at least mid-30's and up constantly with no slow down or anything.


Since when is 30fps perfectly fine?


----------



## Alvarado

Quote:


> Originally Posted by *AndroidVageta*
> 
> Yes. He's playing with everything maxed (including textures) with temporal SMAA at 1080p perfectly fine. He's playing it right now without a single issue. I don't know his memory usage or frame rate but he's saying it's at *least mid-30's* and up constantly with no slow down or anything.


Ouch...


----------



## iamhollywood5

Quote:


> Originally Posted by *eternal7trance*
> 
> On this game or other games? Because I have with other games


This particular game.

Today while playing with ultra textures and TXAA x4 I saw 2800-2900MB on average, sometimes spiking a little over 3000MB. I'm wondering if the stuttering issues I was having were due to the game actually running out of VRAM or something else. I'd experiment with it more, but I went out of town tonight and won't have a chance to play again until Monday night.


----------



## Serandur

You know that sinking feeling when you feel regretful and ripped off at the timing in which you made a major upgrade/new build? Where throwing bucketloads of money at a high-end machine still isn't enough to ensure well-made engineering decisions for longevity, quality, and packaging? I had to choose between a 780 and a 290 for a new build a couple months ago. Litecoin mining minimized the price difference, so I chose the 780. If I had to choose again now, I have no idea what I would take, because the 780 is ludicrously expensive in comparison and has less VRAM, but AMD don't support downsampling. One company charges premium prices and skimps out on good engineering sense for planned obsolescence and artificial market segmentation, and the other doesn't support the best and easiest form of AA on their high-end cards easily capable of handling it. 6 GB 780s debuted very shortly after my purchase, and Devil's Canyon now laughs at my dilemma between a 4770K and the 3770K I chose too (which is not a good overclocker; Intel's TIM solution is flat-out incompetent). Now Watch Dogs threatens my expensive, "flagship" card's anemic VRAM capacity (on an Nvidia-sponsored title no less) and the 290s have hit record-low prices. I've also had to RMA quite a few things with this build due to defects, and yet the most limiting factors are stupid packaging decisions (external components like the TIM/IHS and VRAM) on my high-end, flagship microprocessors. Why the hell does it feel like so many premium manufacturers are skimping out on quality? I'm feeling a bit down.









I hope The Witcher 3 and future Ubisoft games don't push reasonable texture VRAM requirements over my 3GB edge. I know spring is never the best time to buy, with the new stuff coming out in summer, but I had no choice. Nvidia being cheap bastards is pissing me off; I can't imagine the frustration of SLI 780 Ti owners. Ridiculous. This is not right, and the blame rests squarely on Nvidia, not on a developer for finding an actual use for more VRAM (Daylight similarly pushes VRAM, it would seem, but Watch Dogs really exposes how inadequate 3GB is for the GPU's capabilities). These cards should have had 6 GB models to begin with, or at least 4.5 GB models if possible with RAM chip sizes. Screw Nvidia and screw their dumbass Titan line, for which they've crippled what is supposed to be their flagship series. Sorry for the rant, I'm just extremely livid about the situation. I strongly desire for AMD to get with the program on high-end software features so I can at least be happy with properly-designed high-end graphics cards that receive proper VRAM amounts for their capabilities and don't slaughter my wallet.


----------



## PappaSmurfsHarem

Mouse movement sucks.


----------



## djriful

Quote:


> Originally Posted by *PappaSmurfsHarem*
> 
> Mouse movement sucks.


Get a controller.


----------



## djriful

Quote:


> Originally Posted by *Faithh*
> 
> The game is quiet cpu bound sometimes and it's lightthreaded honestly


You've got to put your rig under water. The CPU would run maxed at 50°C, and the GPUs at 40°C.


----------



## PappaSmurfsHarem

Quote:


> Originally Posted by *djriful*
> 
> Get a controller.


Have one. But I hate controllers.


----------



## iamhollywood5

Quote:


> Originally Posted by *Serandur*
> 
> You know that sinking feeling when you feel regretful and ripped off at the timing in which you made a major upgrade/new build? Where throwing bucketloads of money at a high-end machine still isn't enough to ensure well-made engineering decisions for longevity, quality, and packaging? I had to choose between a 780 and a 290 for a new build a couple months ago. Litecoin mining minimized the price difference, so I chose the 780. If I had to choose again now, I have no idea what I would take, because the 780 is ludicrously expensive in comparison and has less VRAM, but AMD don't support downsampling. One company charges premium prices and skimps out on good engineering sense for planned obsolence and artifical market segmentation and the other doesn't support the best and easiest form of AA on their high-end cards easily capable of handling it. 6 GB 780s debuted very shortly after my purchase and Devil's Canyon now laughs at my dilemma between a 4770K and the 3770K I chose too (which is not a good overclocker; Intel's TIM solution is flat-out incompetent). Now Watch Dogs threatens my expensive, "flagship" card's anemic VRAM capacity (on an Nvidia-sponsored title no less) and the 290s have hit record-low prices. I've also had to RMA quite a few things with this build due to defects, and yet the most limiting factors are stupid packaging decisions (external components like the TIM/IHS and VRAM) on my high-end, flagship microprocessors Why the hell does it feel like so many premium manufacturers are skimping out on quality? I'm feeling a bit down.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope The Witcher 3 and future Ubisoft games don't push reasonable texture VRAM requirements over my 3GB edge, I know spring is never the best time to buy with the new stuff coming out in summer, but I had no choice. Nvidia being cheap bastards is pissing me off, I can't imagine the frustration of SLI 780 Ti owners. Ridiculous, this is not right and the blame rests squarely on Nvidia for this, not a developer for finding an actual use for more VRAM (Daylight similarly pushes VRAM it would seem, but Watch Dogs really exposes how inadequate it is for the GPU's capabilities). These cards should have had 6 GB models to begin with or at least 4.5 GB models if possible with RAM chip sizes. Screw Nvidia and screw their dumbass Titan line for which they've crippled what is supposed to be their flagship series. Sorry for the rant, I'm just extremely livid with the situation. I strongly desire for AMD to get with the program on high-end software features so I can at least be happy with properly-designed high-end graphics cards that receive proper VRAM amounts for their capabilities and don't slaughter my wallet.


Can't blame ya, I feel the exact same way. Nvidia has taken way too long to allow 6GB 780s and they still don't allow any 6GB 780 Ti's, so that they can protect their Titan Blacks. At least I only sunk my money into one 780 Ti instead of 2. But the decision to limit GK110 gaming cards to 3GB is starting to look bad.


----------



## iamhollywood5

Quote:


> Originally Posted by *PappaSmurfsHarem*
> 
> Have one. But I hate controllers.


I must say this is one of the few PC games I prefer a controller for. It plays amazingly with a controller.


----------



## xCamoLegend

The graphics are the least of the concerns with this game.

Car physics/handling are Saints Row 2 tier, worse than San Andreas.
Gravity is nothing like reality and crashing a vehicle makes it bounce like a ball.
Curbs have a terrible hitbox and stop cars dead in their tracks.


----------



## bencher

Quote:


> Originally Posted by *Serandur*
> 
> You know that sinking feeling when you feel regretful and ripped off at the timing in which you made a major upgrade/new build? Where throwing bucketloads of money at a high-end machine still isn't enough to ensure well-made engineering decisions for longevity, quality, and packaging? I had to choose between a 780 and a 290 for a new build a couple months ago. Litecoin mining minimized the price difference, so I chose the 780. If I had to choose again now, I have no idea what I would take, because the 780 is ludicrously expensive in comparison and has less VRAM, but AMD don't support downsampling. One company charges premium prices and skimps out on good engineering sense for planned obsolence and artifical market segmentation and the other doesn't support the best and easiest form of AA on their high-end cards easily capable of handling it. 6 GB 780s debuted very shortly after my purchase and Devil's Canyon now laughs at my dilemma between a 4770K and the 3770K I chose too (which is not a good overclocker; Intel's TIM solution is flat-out incompetent). Now Watch Dogs threatens my expensive, "flagship" card's anemic VRAM capacity (on an Nvidia-sponsored title no less) and the 290s have hit record-low prices. I've also had to RMA quite a few things with this build due to defects, and yet the most limiting factors are stupid packaging decisions (external components like the TIM/IHS and VRAM) on my high-end, flagship microprocessors. Why does it feel like so many premium manufacturers are skimping out on quality? I'm feeling a bit down.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope The Witcher 3 and future Ubisoft games don't push reasonable texture VRAM requirements over my 3GB edge, I know spring is never the best time to buy with the new stuff coming out in summer, but I had no choice. Nvidia being cheap bastards is pissing me off, I can't imagine the frustration of SLI 780 Ti owners. Ridiculous, this is not right and the blame rests squarely on Nvidia for this, not a developer for finding an actual use for more VRAM (Daylight similarly pushes VRAM it would seem, but Watch Dogs really exposes how inadequate it is for the GPU's capabilities). These cards should have had 6 GB models to begin with or at least 4.5 GB models if possible with RAM chip sizes. Screw Nvidia and screw their dumbass Titan line for which they've crippled what is supposed to be their flagship series. Sorry for the rant, I'm just extremely livid with the situation. I strongly desire for AMD to get with the program on high-end software features so I can at least be happy with properly-designed high-end graphics cards that receive proper VRAM amounts for their capabilities and don't slaughter my wallet.


I don't see why I would spend $700 on a 780 Ti when a 290X offers similar performance for less.

Also, prepare to be burnt when buying the highest-end stuff. That's why I never spend over $350 on a graphics card. And it worked out great for me, because my card is at the top spot with the rest while only costing me a fraction.


----------



## djriful

I think everyone should stop making final assumptions about GPU performance; the drivers for this game haven't been released yet.


----------



## iamhollywood5

Quote:


> Originally Posted by *bencher*
> 
> I don't see why I would spend $700 on a 780ti when I 290x offers similar performance for less.
> 
> Also prepare to be burnt when buying highest end stuff. That's why I never spend over $350 on a graphics card. And it worked out great for me cause my card is at then top spot with the rest while only costing me a fraction


For some of us, we bought a 780 Ti when the 290X was at the same price or even higher due to the mining craze. Plus the 780 Ti's core is more powerful than the 290X, and there's the whole bag of Nvidia goodies like G-Sync, PhysX, CUDA, yadda yadda yadda. But NV vs AMD is not the point here. The problem is Nvidia's restriction on 780/Ti memory. I would have gladly paid $100 extra to have 6GB on my 780 Ti because it would have been worth it to me, but I won't pay the $300 premium of the Titan.


----------



## Alatar

If you guys read NeoGAF, there are reports of the game using 5GB of VRAM at max settings on Titans.

(or at least having that much reported usage, there's no way of monitoring actual usage)


----------



## kx11

Quote:


> Originally Posted by *Alatar*
> 
> If you guys read the Gaf there are reports of the game using 5GB Vram at max settings when using Titans.
> 
> (or at least having that much reported usage, there's no way of monitoring actual usage)


it's a Crysis 3 kind of thing, reserving 90% of the VRAM for no reason


----------



## Alatar

Quote:


> Originally Posted by *kx11*
> 
> it's Crysis 3 kinda thing , reserve 90% of the VRAM for no reason


Probably. However, if even 780s are stuttering with high texture settings, then the actual usage might also be quite high.

One thing is for sure though, the screens posted here: http://neogaf.com/forum/showpost.php?p=113213557&postcount=1 look amazing.


----------



## Arm3nian

Quote:


> Originally Posted by *Alatar*
> 
> The only titan that's actually known to have died (as far as I know at least) at ~1.45v was rbuass' card on LN2 without any vrm cooling. And it was hardmodded as well.
> 
> Plenty of people run say 1.325v on water for gaming. Sometimes enough for 1400MHz. The 780Ti is only a couple of percent faster clock for clock and has to deal with lower voltages. Feel free to run firestrike extreme with your max OC and I'll show you what a Titan can do 24/7. It might look bad in reviews but it's a completely different beast in the hands of enthusiasts.
> 
> Truth of the matter is that if you're using watercooling you need a Ti classy to beat Titans.


I couldn't care less about your opinion of 24/7. I've read lots of pages in the Titan club and know what people run 24/7 for gaming, and the best cards reach around 1300MHz for stable, non-degrading clocks/volts.


----------



## flashcrew

I think Crysis has a new challenger... "Can it run Watch Dogs?"


----------



## Toology

I think this game is going to require a driver. I hope Nvidia and AMD release something before it officially launches Monday night/Tuesday.


----------



## Clockster

The game definitely needs a day-one driver. I've now seen it at 1080p and it doesn't use 3GB of VRAM for Ultra; on 6GB cards the game automatically reserves more VRAM. The game is also a stuttering mess on both Nvidia and AMD cards, whether you play High or Ultra. Looks fairly good, though.


----------



## Murlocke

Quote:


> Originally Posted by *Alatar*


Guy is full of bull. I'm getting 60-70FPS on a single 780 Ti while driving around at fast speeds. There are chops that seem to be related to VRAM usage every now and then, but it is playable. SLI results in less performance for me at the moment; I tried the latest drivers that claim to have a Watch Dogs SLI profile and they seem to improve it, but it's still worse than single GPU. I'm starting to think SLI won't even help, the limitation here is VRAM... and I'm on freaking 1080p.

Personally, I don't see it in the graphics. They are good, but not 3GB of VRAM needed for 1080p good.

EDIT: Is that you? Haha, if it is I'm sorry, I should have worded the post nicer. Quite buzzed at the moment. Thought it was another developer hyping up the requirements.


----------



## Arm3nian

Quote:


> Originally Posted by *Murlocke*
> 
> Guy is full of bull. I'm getting 60-70FPS on a single 780Ti while driving around at fast speeds. There are chops that seem to be related to VRAM usages every now and then, but it is playable. SLI results in less performance for me at the moment, tried the latest drivers that claim to have a Watch Dogs SLI profile and it seems to improve it but it's still worse than single GPU. I'm starting to think SLI won't even help, the limitation here is VRAM... and I'm on freaking 1080p.
> 
> Personally, I don't see it in the graphics. They are good, but not 3GB of VRAM needed for 1080p good.


The more VRAM you have, the more will be used to avoid stutters. Basic concept. Also the reason I won't argue with Alatar; he'll just post any crap he finds to make everyone else agree.
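Roughly what that means, as a toy sketch (hypothetical texture sizes and a plain LRU policy; real drivers and engines are smarter than this): a card with more VRAM keeps more of the working set resident, so it has to re-stream textures less often.

```python
from collections import OrderedDict

class VramCache:
    """Toy LRU texture cache; each eviction stands in for a potential
    hitch when that texture has to be streamed back in later."""
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.resident = OrderedDict()  # texture id -> size in MB
        self.evictions = 0

    def touch(self, tex_id, size_mb):
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # mark recently used
            return
        # Make room, evicting least-recently-used textures first
        while self.used + size_mb > self.capacity:
            _, freed = self.resident.popitem(last=False)
            self.used -= freed
            self.evictions += 1
        self.resident[tex_id] = size_mb
        self.used += size_mb

# Same access pattern on a "3GB" card and a "6GB" card: a 4GB working
# set of 40 textures at 100MB each, cycled through ten times.
pattern = [i % 40 for i in range(400)]
small, big = VramCache(3000), VramCache(6000)
for tex in pattern:
    small.touch(tex, 100)
    big.touch(tex, 100)
print(small.evictions, big.evictions)  # the bigger card never evicts
```

The small cache thrashes on every pass once the working set overflows it, while the big one settles and stays quiet, which is the "more VRAM, fewer stutters" effect in miniature.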


----------



## Matt-Matt

Quote:


> Originally Posted by *eternal7trance*
> 
> Got a 290, I ain't even mad


Same, haha.


----------



## iamhollywood5

For some reason I don't think drivers will be able to fix it. So far it looks like the only way to max this game out at 1080p smoothly is with a Titan, 290X, or 760/770 4GB SLI. Maybe 270X 4GB CrossFire too.

Just crazy. I've seen several games with notably better textures use barely over 2GB maxed at 1080p (Thief, Tomb Raider, Crysis 3, BF4, etc...). Between system memory and graphics memory, this game is just a hog.


----------



## djriful

Watch Dogs needs a DX11.2 patch to support Tiled Resources for textures so it doesn't use up so much VRAM. Maybe the TITAN Z comes in to save the day? Shoving $3k at Watch Dogs... silly.
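The idea behind tiled resources, as a toy sketch (hypothetical texture and tile counts, not the D3D API itself; the actual D3D11.2 route involves creating the texture with the D3D11_RESOURCE_MISC_TILED flag and mapping tiles with UpdateTileMappings): only the tiles a frame actually samples need VRAM backing.

```python
# Toy model of tiled-resource residency, not the D3D API itself.
TILE_KB = 64                 # D3D11.2 uses fixed 64KB tiles
texture_tiles = 64 * 64      # one large texture split into 4096 tiles

# Without tiling, the whole texture must be resident in VRAM:
full_residency_mb = texture_tiles * TILE_KB / 1024

# Suppose the visible scene only samples half the tiles this frame;
# with tiled resources, only those tiles need backing memory:
visible_tiles = {(x, y) for x in range(32) for y in range(64)}
sparse_residency_mb = len(visible_tiles) * TILE_KB / 1024

print(full_residency_mb, sparse_residency_mb)
```

Same texture, half the footprint, which is why partially-resident textures are attractive for a streaming open-world game like this.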


----------



## DF is BUSY

Quote:


> Originally Posted by *iamhollywood5*
> 
> For some reason I dont think drivers will be able to fix it. So far it looks like the only way to max this game out at 1080p smoothly is with a Titan, 290X, or 760/770 4GB SLI. Maybe 270X 4GB Cfx too.
> 
> Just crazy. I've seen several games with notably better textures use barely over 2GB maxed at 1080p (Thief, Tomb Raider, Crysis 3, BF4, etc...). Between system memory and graphics memory, this game is just a hog.


and that's just on 1080p. 1200/1440/"4k" users must be irate right about now.


----------



## GoldenTiger

Quote:


> Originally Posted by *B!0HaZard*
> 
> Yeah, when I built my first gaming PC, everyone kept repeating "dual cores are just as good as quads in games" and "2 GB system RAM is enough". In 2011 when I upgraded it was, "eh, 4 GB system RAM is plenty for any game out currently". All of those statements were true at the time, but I had to upgrade early because of all of those decisions.


Yep, I advocated that people get quad cores, and before them dual cores, in place of slightly faster-per-thread single/dual cores at the time, but was always laughed off the boards. Of course, the people who did follow my advice were set for a much longer timeframe than those doing the laughing in the end. People tend to be hilariously short-sighted...


----------



## Serandur

Quote:


> Originally Posted by *bencher*
> 
> I don't see why I would spend $700 on a 780ti when I 290x offers similar performance for less.
> 
> Also prepare to be burnt when buying highest end stuff. That's why I never spend over $350 on a graphics card. And it worked out great for me cause my card is at then top spot with the rest while only costing me a fraction


It's pretty much what iamhollywood5 said, though I actually have a factory-overclocked GTX 780 GHz Edition I got for $510, not a Ti for $700. It was a very good deal relative to the ($485) Tri-X 290 I also wanted at the time (in fact, all throughout 2013 I had intended on going with an aftermarket 290, I thought it was so impressive).

My 780 GHz matches even the Tri-X 290X in performance and is faster than basically any out-of-the-box R9 290. Plus it's quiet and only takes up two slots. The only thing really going in the 290's favor at the time (sadly, given my enthusiasm for the series) was the extra gigabyte of VRAM, which I knew I wanted but just couldn't choose the card for on its own. I had really hoped the 290s might be so popular with gamers that they would force Nvidia's hand into dropping the price of their 780s further, to the mid-$400s, but the popularity of the R9 high-end cards for cryptocurrency mining caused prices to skyrocket right around the time the first custom 290s actually came out. By then I had already been waiting several months with no drop in sight, and I was without a desktop at all.

I needed to build already and became interested in downsampling (a feature Nvidia drivers always support, and AMD drivers break if anything) since both the 290 and the 780 are vastly overpowered for most games and too many games don't support proper AA. I love playing so many of my games scaled down from 2560x1440 or even 4K at 60 fps and the 780 was the only one of the two capable of giving me that along with better performance, quieter/cooler operation, better power-efficiency, better looks (for my machine's color scheme), etc.
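For anyone unfamiliar: downsampling is essentially ordered-grid supersampling. You render at a multiple of the display resolution and average blocks of pixels down, and that averaging is what smooths the edges. A minimal sketch of the 2x downscale step (pure Python, grayscale values, purely illustrative):

```python
def downsample_2x(image):
    """Average each 2x2 block of a (2H x 2W) grayscale image into one
    output pixel: the box filter behind a simple 2x downsample."""
    h, w = len(image) // 2, len(image[0]) // 2
    return [[(image[2*y][2*x] + image[2*y][2*x+1] +
              image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4
             for x in range(w)] for y in range(h)]

# A hard black/white edge rendered at 2x resolution comes out with
# intermediate shades at 1x; those in-between values are the AA.
rendered_2x = [
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 0, 255],
]
print(downsample_2x(rendered_2x))
```

The cost is that the GPU really does render four times the pixels, which is exactly why overpowered cards like these are the ones that can afford it.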

That's just my explanation of why the 780 seemed to me a better choice when I bought it, AMD make top-notch hardware but I really wish they better supported certain graphical effects and techniques. Downsampling is a must for me now, it produces gorgeous image quality. I understand higher-end cards diminishing in value more quickly and all that, but that's not the problem here.

The problem is the 780/Ti features arguably the most advanced microprocessor on the consumer market, is _still_ a top-of-the-line chip, and is easily powerful (especially in SLI) enough to achieve some great things, but Nvidia made the boneheaded decision to restrict VRAM amounts and prevent even their board partners from solving the problem.

By the point where I bought it, nearly a year after the 780 launched, it had already dropped to a more reasonable price, Maxwell was nowhere in sight (and still isn't), and it seemed as good a time as any to go for it. So I did, and then literally a couple of days later, some 6 GB models were finally allowed. Now, while the 780/Ti is still right up there at the top, Nvidia's own advertised partner games (Daylight, Watch Dogs) are proving the inadequacy of that limitation for the GPU's performance.

I understood and was ready to get burned in the sense that a newer top-of-the-line chip would eventually be released; I was not prepared for my still top-end GPU to be artificially limited by Nvidia's bull****. It's a different problem, not really surmountable, and Nvidia can go to hell if they think I have either the money or the lack of sense to pay twice as much for a very similar card that simply has what mine should have had in the first place so it doesn't choke.

I'm very happy with the processor's performance itself, not so happy with the way Nvidia intentionally crippled it so that it already can't keep up the way it should, almost immediately after I bought it. It's outrageous. You can't do better on the Nvidia side short of $1000 or the new 6GB 780s, which people like me who already own a 3GB model would be just plain wasteful to upgrade to under any budget limitations, and the AMD side is disappointing me in terms of support for the things I, as an enthusiast, want. Stuck between a rock and a hard place.

Quote:


> Originally Posted by *iamhollywood5*
> 
> For some of us, we bought a 780 Ti when the 290X was at the same price or even higher due to the mining craze. Plus the 780 Ti's core is more powerful than the 290X and there's the whole bag of nvidia goodies like g-sync, physX, CUDA, yadda yadda yadda. But NV vs AMD is not the point here. The problem is Nvidia's restriction on 780/Ti memory. I would have gladly paid $100 extra to have 6GB on my 780 Ti because it would have been worth it to me, but I wont pay the $300 premium of the Titan.


Yep, this.

Quote:


> Originally Posted by *iamhollywood5*
> 
> if it dropped down to a slideshow then you do need 3gb for ultra. That slideshow happened because you ran out of VRAM and were swapping textures from your hard drive.


Of all the ways to feel your (new flagship) card is no longer adequate...


----------



## Murlocke

Not my video, but this is what it does for me as well.
https://www.youtube.com/watch?v=Mlj7RTZV9FM&feature=youtu.be

It's smooth FPS, then chops, then smooth FPS. I think even people with 6GB Titans are experiencing the same. Hopefully it's a driver issue, this game shouldn't need more than 3GB of VRAM at 1080p. It clearly states 3GB for Ultra in the menus, so it should work.


----------



## Alatar

Quote:


> Originally Posted by *Murlocke*
> 
> Guy is full of bull. I'm getting 60-70FPS on a single 780Ti while driving around at fast speeds. There are chops that seem to be related to VRAM usages every now and then, but it is playable. SLI results in less performance for me at the moment, tried the latest drivers that claim to have a Watch Dogs SLI profile and it seems to improve it but it's still worse than single GPU. I'm starting to think SLI won't even help, the limitation here is VRAM... and I'm on freaking 1080p.
> 
> Personally, I don't see it in the graphics. They are good, but not 3GB of VRAM needed for 1080p good.
> 
> EDIT: Is that you? Haha, if it is I'm sorry should of worded the post nicer. Quite buzzed at the moment. Thought it was another developer hyping up the requirements.


Nah it's a Forbes writer. Guy has been testing the game with a 295X2 and some Titans.

And I'm just posting what I see, I don't have access to the game yet. I'll most likely post some tests after launch once I get it downloaded.
Quote:


> Originally Posted by *Arm3nian*
> 
> I could care less about your opinion of 24/7, I've read lots of pages in the titan club and know what people run 24/7 for gaming, and the best of cards reach around 1300mhz for stable non degrading clocks/volts.





Spoiler: Warning: Spoiler!







^ the clocks I use for intensive games

perfectly acceptable volts with LLC off
1368MHz reported by GPU-Z
my normal fan speed ( means ~45C+ temps), no crazy AP31s for benching purposes
~15-20 minutes of valley in the background for stability
ambients in the 25-28C range with outside temps now during the morning 20C+
And my card isn't even close to being the best one for high MHz at low volts. Mine actually likes the high volts.

Could probably go higher but I haven't had the need, and I don't have the time right now to stability test higher clocks. Maybe with watch_dogs I'll see if I can get it running at 1400MHz core. The clock for clock difference between Tis and Titans is less than 5%. It doesn't require much from a Titan to pass 780Ti speeds, which usually can be done due to voltage control. As long as you're on water that is.

1.325v is nowhere near degrading or damaging. It's perfectly fine if you're on water.


----------



## daman246

Enabling Vertical Sync through the Nvidia panel removed the stuttering and slowness it had when I used in-game sync.


----------



## PILMAN

Certainly glad I picked up a 4GB GTX 770. Everyone was telling me I was wasting my money, that 2GB would be plenty, and that I'd never be able to use the 4GB. Obviously that's not the case, as Skyrim uses a lot of VRAM for the high-res textures, and I'm hearing some people are using up to 4GB in this game, so it's pushing those cards to the limit.


----------



## Murlocke

Quote:


> Originally Posted by *daman246*
> 
> enabling Veryical Sync threw Nvdia panel remove the stutering and slowness it had when i used ingame Sync


Doesn't actually apply vsync in the game for me.


----------



## Deadboy90

Quote:


> Originally Posted by *brucethemoose*
> 
> Don't worry, someone will make a mod to get around it. You can probably spoof your GPU ID, and make the game think you're running a 3gb card.


Unless it actually needs more than 2GB of VRAM for ultra textures. This could be the new Crysis.


----------



## Serandur

Quote:


> Originally Posted by *djriful*
> 
> Watch Dogs need a DX11.2 patch to support Tiles Resources for textures so it doesn't need to use up so much VRAM. Maybe TITAN Z comes in save the day? Shove $3k just for Watch Dogs... silly.


Does DX11.2 work on 780s? I can't find a proper answer for this anywhere.

Quote:


> Originally Posted by *Murlocke*
> 
> Not my video, but this is what it does for me as well.
> https://www.youtube.com/watch?v=Mlj7RTZV9FM&feature=youtu.be
> 
> It's smooth FPS, then chops, then smooth FPS. I think even people with 6GB Titans are experiencing the same. Hopefully it's a driver issue, this game shouldn't need more than 3GB of VRAM at 1080p. It clearly states 3GB for Ultra in the menus, so it should work.


What makes you think Titans are having the same issue?


----------



## Deadboy90

Quote:


> Originally Posted by *PILMAN*
> 
> Certainly glad I picked up a 4 gb gtx 770, everyone was telling me I was wasting my money and that a 2 gb would be plenty and that I'd never be able to use the 4 gb, obviously that's not the case as skyrim uses a lot of VRAM for the high res testures and I'm hearing some people are using up to 4 gigs so it's pushing those cards to the limit,


Not sure who was telling you that, but clearly they don't think ahead. I get the same argument when I say I'm happy about having 3GB on my 7950, that games don't use more than 2GB maxed out, etc. But they said the same thing about 1GB of VRAM back in 2010-11.


----------



## Deadboy90

Quote:


> Originally Posted by *Murlocke*
> 
> Not my video, but this is what it does for me as well.
> https://www.youtube.com/watch?v=Mlj7RTZV9FM&feature=youtu.be
> 
> It's smooth FPS, then chops, then smooth FPS. I think even people with 6GB Titans are experiencing the same. Hopefully it's a driver issue, this game shouldn't need more than 3GB of VRAM at 1080p. It clearly states 3GB for Ultra in the menus, so it should work.


Probably a driver issue or something that will be fixed with a launch patch. If 6GB Titans are getting the same issue, it has nothing to do with VRAM usage. Remember, this game isn't even out yet.


----------



## jmcosta

2GB is enough for ultra settings with MSAA 2x or SMAA

MSAA looks gross on cars and objects


FXAA


SMAA


and the stuttering is the game loading the AI, reminds me of Stalker doing the same thing lol

graphics look like GTA4 in HD


----------



## PILMAN

Quote:


> Originally Posted by *Deadboy90*
> 
> Not sure who was telling you that but clearly they don't think forward. I get the same argument when I say I'm happy about having 3 GB on my 7950, that games don't use more than 2 GB maxed out etc. But they said the same thing about 1gb vram back in 2010-11.


Exactly. I had a 1GB Radeon 6870 back in the day, and what killed it wasn't the processor, it was the VRAM; many processors were comparable but VRAM demands kept climbing. I originally was going to buy a 2GB GTX 770, then realized that would be pointless if RAM usage increased, so I spent the extra money on the 4GB version and I'm glad I did. There have been a lot of arguments on forums claiming we will never use 4GB, that the bandwidth isn't capable of feeding 4GB, and any argument I made proving Skyrim with high-res texture mods used more than 2 gigs was shot down, or we were told that games using more than 2GB are unoptimized. Fortunately I didn't listen to all the crap on the internet. IMO a lot of people wanted to defend their choice of saving money on a lower-VRAM card because they didn't want to admit they were short-changing themselves. I feel bad for the people who got short-changed, but I wanted to secure my investment and now I know it's paying off.


----------



## hht92

Guys, just wait until the game is out with proper drivers (don't use this pirated copy as an example). If the game can't be maxed out at 1080p with a single 780, then epic fail.
Don't argue about the cards. 2GB or 3GB doesn't matter; if the game is badly coded and poorly optimized, it's not Nvidia's fault.


----------



## bencher

Quote:


> Originally Posted by *kx11*
> 
> it's Crysis 3 kinda thing , reserve 90% of the VRAM for no reason


Crysis doesn't work that way. I play Crysis 3 every day and it uses 1.7GB of VRAM at 1080p with 4x AA.


----------



## Arm3nian

Quote:


> Originally Posted by *Alatar*
> 
> Nah it's a Forbes writer. Guy has been testing the game with a 295X2 and some Titans.
> 
> And I'm just posting what I see, I don't have access to the game yet. I'll most likely post some tests after launch once I get it downloaded.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ^ the clocks I use for intensive games
> 
> perfectly acceptable volts with LLC off
> 1368MHz reported by GPU-Z
> my normal fan speed ( means ~45C+ temps), no crazy AP31s for benching purposes
> ~15-20 minutes of valley in the background for stability
> ambients in the 25-28C range with outside temps now during the morning 20C+
> And my card isn't even close to being the best one for high MHz at low volts. Mine actually likes the high volts.
> 
> Could probably go higher but I haven't had the need, and I don't have the time right now to stability test higher clocks. Maybe with watch_dogs I'll see if I can get it running at 1400MHz core. The clock for clock difference between Tis and Titans is less than 5%. It doesn't require much from a Titan to pass 780Ti speeds, which usually can be done due to voltage control. As long as you're on water that is.
> 
> 1.325v is nowhere near degrading or damaging. It's perfectly fine if you're on water.


Valley is probably the most terrible stability test, try BF4 64 player maps, that's a proper stability test. There's also no guarantee your card won't degrade at 1.325 for 24/7, anything over the max nvidia allows is risking it. Temps are meaningless, my 4930k sits at 55c and at 1.4v, am I going over the intel spec for games just cause it's not anywhere near 80c? No.

When we got the Tis we were expecting voltage control, but it never happened. Disappointing, yes, but the Titan was $300 more, and that's not worth the money for gaming. At stock clocks the Ti is shown to be 5-10% faster. I run my Ti at 1280 for games; I haven't tried more but that number is perfectly stable. Add up the difference and it's basically what you got, but you paid more.


----------



## Alatar

Does anyone know when the reviews that follow the NDA are going up?

I'm actually interested in seeing how this thing scores and what kind of criticism it's going to receive since it seems like the internet collectively changes its opinion on WD every 5 minutes








Quote:


> Originally Posted by *Arm3nian*
> 
> Valley is probably the most terrible stability test, try BF4 64 player maps, that's a proper stability test. There's also no guarantee your card won't degrade at 1.325 for 24/7, anything over the max nvidia allows is risking it. Temps are meaningless, my 4930k sits at 55c and at 1.4v, am I going over the intel spec for games just cause it's not anywhere near 80c? No.


There's no guarantee of anything when it comes to overclocking. But people have been running over NV spec on Titans for ages now without any issues at all. And 100mV over a very conservative official spec meant for air cooling isn't going to degrade anything at a pace that would matter in any way...

And there's quite a bit of difference between intel's and NV's voltage specs... NV's max spec is literally 37mV over the stock voltage.

And yeah I know valley is terrible but I just wanted a basic test running in the background. I don't have the time to record YT vids of me using those clocks to play 10 games








Quote:


> When we got the ti's we were expecting voltage control, but it never happened. Dissapointing yes, but the titan was $300 more, and that's not worth the money for gaming. At stock clocks the ti is shown to be 5-10% faster. I run my ti at 1280 for games, haven't tried more but that number is perfectly stable. If you add the difference it's basically what you got, but you paid more.


At stock clocks the Ti is clocked ~6% higher with ~15% higher memory clocks so again that comes to a ~4% clock for clock difference. So matching a 1280MHz Ti would require about a 1330MHz Titan, assuming the memory clocks were similar that is.

Yes I paid more but as I've said in the past, I would gladly pay that much more if it meant that I got:

a) A faster (once OC'd, because OC'd is what I care about) card
b) double the VRAM
c) Got my card 9 months (almost a whole generation) before the 780Ti launched.

I know some people will disagree but that's fine. In my opinion those three things are worth the $300 price premium. c) being the most important one of those.
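The clock-matching arithmetic above is just a ratio; here is a rough sketch in Python (the ~4% clock-for-clock figure and the function name are the thread's estimate and my own illustration, not official specs):

```python
# Rough clock-for-clock comparison between a 780 Ti and a Titan, using
# the ~4% per-clock deficit quoted above (a forum estimate, not a spec).
def titan_clock_to_match(ti_clock_mhz, per_clock_deficit=0.04):
    """Titan core clock needed to roughly match a 780 Ti at ti_clock_mhz."""
    return ti_clock_mhz * (1 + per_clock_deficit)

print(round(titan_clock_to_match(1280)))  # about 1331 MHz, near the ~1330 quoted
```

Which is why a ~1330MHz Titan lining up with a 1280MHz Ti is plausible, assuming similar memory clocks.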


----------



## Arm3nian

Quote:


> Originally Posted by *Alatar*
> 
> Does anyone know when the reviews that follow the NDA are going up?
> 
> I'm actually interested in seeing how this thing scores and what kind of criticism it's going to receive since it seems like the internet collectively changes its opinion on WD every 5 minutes
> 
> 
> 
> 
> 
> 
> 
> 
> There's not guarantee of anything when it comes to overclocking. But people have been running over NV spec on Titans for ages now without any issues at all. And 100mV over a very conservative official spec meant for air cooling isn't going to degrade anything at a pace that would matter in any way...
> 
> And there's quite a bit of difference between intel's and NV's voltage specs... NV's max spec is literally 37mV over the stock voltage.
> 
> And yeah I know valley is terrible but I just wanted a basic test running in the background. I don't have the time to record YT vids of me using those clocks to play 10 games
> 
> 
> 
> 
> 
> 
> 
> 
> At stock clocks the Ti is clocked ~6% higher with ~15% higher memory clocks so again that comes to a ~4% clock for clock difference. So matching a 1280MHz Ti would require about a 1330MHz Titan, assuming the memory clocks were similar that is.
> 
> Yes I paid more but as I've said in the past, I would gladly pay that much more if it meant that I got:
> 
> a) A faster (once OC'd, because OC'd is what I care about) card
> b) double the VRAM
> c) Got my card 9 months (almost a whole generation) before the 780Ti launched.
> 
> I know some people will disagree but that's fine. In my opinion those three things are worth the $300 price premium. c) being the most important one of those.


My card has a ridiculous memory clock, 8500MHz all day, but my Valley scores still suck. Diminishing returns after a certain speed, I think.

Maybe the titan would have been better for me since I'll never go back to air, but the ti still rocks for games. Any EVGA step ups to a classy?


----------



## HeadlessKnight

Quote:


> Originally Posted by *Murlocke*
> 
> Not my video, but this is what it does for me as well.
> https://www.youtube.com/watch?v=Mlj7RTZV9FM&feature=youtu.be
> 
> It's smooth FPS, then chops, then smooth FPS. I think even people with 6GB Titans are experiencing the same. Hopefully it's a driver issue, this game shouldn't need more than 3GB of VRAM at 1080p. It clearly states 3GB for Ultra in the menus, so it should work.


Wow this is horrible. Is this running on a 3 GB card?

As for Titan vs 780 Ti vs 780, this is how I see it from best to worst, based on cooling:

Air :

1- 780 Ti
2- Titan/ 780 Classified/ Lightning
3- 780 reference

WCing :

1- 780 Ti (voltage unlocked models)
2- Titan
3- 780 Ti reference
4- 780
Quote:


> Originally Posted by *Arm3nian*
> 
> Any EVGA step ups to a classy?


You can only step up to reference cards iirc.


----------



## TopicClocker

Quote:


> Originally Posted by *Murlocke*
> 
> Not my video, but this is what it does for me as well.
> https://www.youtube.com/watch?v=Mlj7RTZV9FM&feature=youtu.be
> 
> It's smooth FPS, then chops, then smooth FPS. I think even people with 6GB Titans are experiencing the same. Hopefully it's a driver issue, this game shouldn't need more than 3GB of VRAM at 1080p. It clearly states 3GB for Ultra in the menus, so it should work.


Did you have ShadowPlay's shadow time recording on? If so, turn that off, otherwise it'll be horrible.
I've more or less smoothed out my game entirely by turning it off.


----------



## Booty Warrior

Quote:


> Originally Posted by *jmcosta*
> 
> 2gb is enough for ultra settings with msaa2x or smaa
> 
> 
> 
> 
> 
> MSAA looks gross in cars and objects
> 
> 
> FXAA
> 
> 
> SMAA
> 
> 
> 
> 
> and the stuttering its the game loading the AI , reminds me of Stalker doing the same thing lol
> 
> graphics look gta4 hd


Interesting. Can we wait for an official driver before everyone starts panicking?


----------



## Arm3nian

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Wow this is horrible. Is this running on a 3 GB card?
> 
> As for Titan vs 780 Ti vs 780 this is how I see it from best to worst based on cooling .
> 
> Air :
> 
> 1- 780 Ti
> 2- Titan/ 780 Classified/ Lightning
> 3- 780 reference
> 
> WCing :
> 
> 1- 780 Ti (voltage unlocked models)
> 2- Titan
> 3- 780 Ti reference
> 4- 780


No disagreement, but we only know this now. If the 780 Ti had gotten voltage control and I had gotten the Titan, I would be kicking myself. The reference Ti has beefy power delivery; 1.21V is a waste on the card. Quite sad really, my GTX 690 was also volt-limited... the next card for me will have unlimited voltage.


----------



## stonedwarlock

Why is everyone forgetting mesh quality in Ubisoft games? I think they render everything with full-quality meshes, so even things you see 1km away are rendered like things 1 meter away from you. I think that's why VRAM usage is so high.

Btw, where are the physics of this game compared to the E3 gameplay? I think the retail version will have them all, or maybe I'm wrong and they downgraded the game, otherwise not even a Titan would be enough?


----------



## Redeemer

Quote:


> Originally Posted by *djriful*
> 
> Cheapo Samsung 30nm wonder rams 16GB oc @ 2133-2400Mhz.
> 
> 
> 
> 
> 
> 
> 
> ($98 total)


Mmm thanks sir!


----------



## bencher

Quote:


> Originally Posted by *iamhollywood5*
> 
> For some of us, we bought a 780 Ti when the 290X was at the same price or even higher due to the mining craze. Plus the 780 Ti's core is more powerful than the 290X and there's the whole bag of nvidia goodies like g-sync, physX, CUDA, yadda yadda yadda. But NV vs AMD is not the point here. The problem is Nvidia's restriction on 780/Ti memory. I would have gladly paid $100 extra to have 6GB on my 780 Ti because it would have been worth it to me, but I wont pay the $300 premium of the Titan.


I know the 780ti is a little faster than 290x.

Maybe the $300 premium would be worth it for you. It was worth it with 780ti.


----------



## djriful

Quote:


> Originally Posted by *jmcosta*
> 
> 2gb is enough for ultra settings with msaa2x or smaa
> 
> MSAA looks gross in cars and objects
> 
> 
> FXAA
> 
> 
> SMAA
> 
> 
> and the stuttering its the game loading the AI , reminds me of Stalker doing the same thing lol
> 
> graphics look gta4 hd


Where is TXAA?


----------



## TopicClocker

Quote:


> Originally Posted by *djriful*
> 
> Where is TXAA?


There's FXAA, SMAA, MSAA and TXAA.


----------



## djriful

Quote:


> Originally Posted by *TopicClocker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *djriful*
> 
> Where is TXAA?
> 
> 
> 
> There's FXAA, SMAA, MSAA and TXAA.

I just realized the poster isn't on Nvidia hardware, so no TXAA for them.


----------



## revro

Have you all forgotten that part of Windows 8.2 is shared RAM and VRAM, which will ease VRAM limitations in games?
I know it's still far off, but maybe we shouldn't write off our GTX 780s.









best
revro


----------



## iamhollywood5

Quote:


> Originally Posted by *Murlocke*
> 
> Not my video, but this is what it does for me as well.
> https://www.youtube.com/watch?v=Mlj7RTZV9FM&feature=youtu.be
> 
> It's smooth FPS, then chops, then smooth FPS. I think even people with 6GB Titans are experiencing the same. Hopefully it's a driver issue, this game shouldn't need more than 3GB of VRAM at 1080p. It clearly states 3GB for Ultra in the menus, so it should work.


That is the *exact* same experience I had playing yesterday, and he has a 3GB 780 Ti like me. If Titans are really having that same "smooth - chop - smooth - chop - smooth" experience then it's another issue. But I have yet to see that demonstrated.
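That smooth-chop pattern is consistent with the working set spilling out of VRAM and streaming over PCIe; a back-of-envelope sketch (bandwidth figures are approximate published numbers, and the helper is purely illustrative):

```python
# Back-of-envelope for why overflowing VRAM shows up as periodic hitching:
# any texture data that doesn't fit on the card must stream over PCIe,
# which is an order of magnitude slower than on-card GDDR5.
VRAM_GBPS = 336.0       # GTX 780 Ti GDDR5, ~336 GB/s peak
PCIE3_X16_GBPS = 15.75  # PCIe 3.0 x16, ~15.75 GB/s peak

def ms_to_move(megabytes, gbps):
    """Milliseconds to transfer `megabytes` of data at `gbps` GB/s."""
    return megabytes / 1024 / gbps * 1000

print(f"{ms_to_move(512, VRAM_GBPS):.1f} ms in VRAM")         # ~1.5 ms
print(f"{ms_to_move(512, PCIE3_X16_GBPS):.1f} ms over PCIe")  # ~31.7 ms
```

A 512MB spillover costing ~30ms is two whole frames at 60FPS, which would look exactly like periodic chop between smooth stretches.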

Quote:


> Originally Posted by *Serandur*
> 
> Does DX11.2 work on 780s? I can't find a proper answer for this anywhere.
> What makes you think Titans are having the same issue?


As of right now, I think the answer is no. However MS has demoed 11.2 running on a GTX 770 so hopefully support gets implemented for Kepler cards soon. 11.2 is already supported on AMD cards... But the bigger question is, does Watch Dogs support 11.2? It could really use what's shown in this video here:




Quote:


> Originally Posted by *bencher*
> 
> I know the 780ti is a little faster than 290x.
> 
> Maybe the $300 premium would be worth it for you. It was worth it with 780ti.


Like I said, at the time of purchase, the 780 Ti and 290X were the same price because of mining.


----------



## Hl86

About 35-45 FPS on Ultra with SMAA at 1440p with a [email protected] and [email protected]


----------



## .:hybrid:.

Quote:


> Originally Posted by *iamhollywood5*
> 
> That is the *exact* same experience I had playing yesterday, and he has a 3GB 780 Ti like me. If Titans are really having that same "smooth - chop - smooth - chop - smooth" experience then it's another issue. But I have yet to see that demonstrated.
> As of right now, I think the answer is no. However MS has demoed 11.2 running on a GTX 770 so hopefully support gets implemented for Kepler cards soon. 11.2 is already supported on AMD cards... But the bigger question is, does Watch Dogs support 11.2? It could really use what's shown in this video here:
> 
> 
> 
> 
> Like I said, at the time of purchase, the 780 Ti and 290X were the same price because of mining.


Cool, I didn't know about tiled resources. Seems like it would benefit SSDs a lot because of the constant texel swapping.

I doubt we'll see games making use of it until DX12 though; it seems only the base DirectX version gets supported by games, probably for good reason, since fragmentation sucks when maintaining a codebase.
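A toy sketch of why tiled resources cut VRAM use: only the tiles the camera actually samples need to be resident. The 64KB tile size matches D3D11.2 tiled resources; the tile counts and helper function are made up for illustration:

```python
# Toy illustration of tiled resources / virtual texturing: a huge texture
# is split into fixed-size tiles, and only the tiles currently sampled
# are kept resident in VRAM instead of the whole texture.
TILE_BYTES = 64 * 1024  # 64 KB tiles, as in D3D11.2 tiled resources

def resident_vram(total_tiles, visible_tiles):
    """VRAM footprint: loading the whole texture vs. only visible tiles."""
    full = total_tiles * TILE_BYTES
    tiled = visible_tiles * TILE_BYTES
    return full, tiled

full, tiled = resident_vram(total_tiles=16384, visible_tiles=900)
print(f"full: {full // 2**20} MB, tiled: {tiled // 2**20} MB")  # 1024 MB vs 56 MB
```

With hypothetical numbers like these, a texture set that would eat a gigabyte resident in full only costs tens of megabytes when tiled, which is why people keep bringing it up for this game.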


----------



## axizor

Am I missing something? How are you guys playing this game early?


----------



## DADDYDC650

Mostly all of them downloaded the game illegally although they won't admit it since they don't want to get in trouble.


----------



## axizor

Ahh, "swim" was playing the game. Lol


----------



## mattjm

I think I'll get the 4GB version when I get the 270X next month. It may help in some games as I plan to keep it for 2-3 years. Even if it isn't used, I don't think it'll be too much of a waste as the 4GB version is available for $20-30 more than the 2Gig one.


----------



## bencher

Quote:


> Originally Posted by *mattjm*
> 
> I think I'll get the 4GB version when I get the 270X next month. It may help in some games as I plan to keep it for 2-3 years. Even if it isn't used, I don't think it'll be too much of a waste as the 4GB version is available for $20-30 more than the 2Gig one.


are you going to crossfire?


----------



## mattjm

Quote:


> Originally Posted by *bencher*
> 
> are you going to crossfire?


No, most probably not. Not sure. I know the 270X is too weak to utilize 4GB, but I think it'd be better to get it since I mod games and it's only $20-30 more, and it's better than regretting it later LOL. Since I'll keep it for 2-3 years, it won't be too big a waste I guess. My 1GB 5770 used to run into VRAM shortages with mods, so I don't want that episode to happen again.


----------



## iamhollywood5

Quote:


> Originally Posted by *mattjm*
> 
> I think I'll get the 4GB version when I get the 270X next month. It may help in some games as I plan to keep it for 2-3 years. Even if it isn't used, I don't think it'll be too much of a waste as the 4GB version is available for $20-30 more than the 2Gig one.


The 4GB model is only good if you're crossfiring. And crossfiring a couple 270Xs doesn't make sense when you can get a 290 for $400 or less (2x 270X would be at least $440+) with the same amount of stream processors (2560), way more ROPs, double the memory bandwidth, and the avoidance of the frame pacing issues that come up with multi-GPU setups.

The 270X's core will not be able to render 3GB+ of textures at anywhere near an enjoyable frame rate.


----------



## Matt-Matt

Quote:


> Originally Posted by *mattjm*
> 
> No, most probably. Not sure. I know the 270X is too weak for utilizing 4GB but I think it'd be better to get it as I mod games and it's only 20-30$ more and it's better than repenting later LOL. Since I keep it for 2-3 years, it won't be too big a waste I guess. My 1GB 5770 used to face VRAM shortage with mods. So don't want that episode to happen again.


Yeah, if it's only $20 more it's worth it just in case you need it, unless you're getting a big memory clock decrease from it. Plus if you do CrossFire, you'll be kicking yourself with 2GB of VRAM.

And besides, if you sell it, it'll likely sell for $20 more than a 2GB one down the road.


----------



## StrongForce

Now my feelings are mixed between switching to an iChill 770 4GB or an iChill 780... perhaps even the DHS edition, but it seems nearly impossible to get my hands on one. I took a look at online shops that would ship to France... no one has it yet, geez.


----------



## iamhollywood5

Here's a little thread I started on the Watch Dogs sub-reddit about VRAM usage. Interesting comments in there about what people are finding. People with 4GB cards are definitely seeing the game go well over 3GB at 1080p with everything maxed.

link

Almost makes me wanna sell my 3GB 780 Ti Classy and pick up a 780 (non-Ti) 6GB. In most games my performance would go down a bit, but if this is what gaming is gonna be like from now on, I might have to do it. I'd probably come out of the swap with $100 extra in my pocket too.

The ideal balanced future proof setup now seems like 2x 6GB 780s (non Ti). And right now you can only get those directly through EVGA.


----------



## bencher

Quote:


> Originally Posted by *StrongForce*
> 
> Now My feelings are mixed between switching to Ichill 770 4gb, or a ichill 780 ... perhaps even the DHS edition but uh, seems nearly impossible to get my hands on one, I took a look at online shops that would ship in france... noone yet geeez.


I would get the 780 any day over any 770.


----------



## StrongForce

Quote:


> Originally Posted by *bencher*
> 
> I would get the 780 any day over any 770.


Yea, it's almost 10 frames more at 1080p (at least the iChill ones, which I became a huge fan of after watching the benchmarks, and the temps too!) but I don't wanna be limited by VRAM, that's just ridiculous! I wish they made an iChill 780 with more RAM; I'd be more tempted. After reading the previous posts, it almost seems like the 780 is bottlenecked by its 3GB of VRAM. I know people say the game just uses as much VRAM as it can to avoid stutters, but damn, what if the 3GB Ultra requirement is actually what's causing the stutters? That would mean the stated 3GB limit isn't even accurate, which, as someone pointed out, would make them look a bit stupid for an Nvidia-sponsored game whose high-end "main card", the 780, only has 3GB.

And yea, 3GB on the Ti... I don't know why they're doing it. Maybe it's savvy in a sense: the more they put in now, the more they'll have to put in next gen (which translates into millions more in spending for them). I don't know, mere speculation, I'm not an expert.


----------



## Sadmoto

I find it utterly stupid that games aren't supporting more than one save. I assume this is companies' way of making everyone buy their own copy, but come on...
I'm so tired of everyone's damn greed. The more I see, the more I feel like "Mick Dodge-ing" it and living in the woods.


----------



## bencher

Quote:


> Originally Posted by *StrongForce*
> 
> false
> Yea it's almost 10 frames more in 1080p (at least the ichill ones, which I became a huge fan of after watching the benchmarks and also the temps!) but I don't wanna be limited by VRAM, that's just ridiciulous ! I wish they made an ichill 780 with more ram.. would be more tempted, after reading the previous posts, seems almost like the 780 is bottlenecked by it's 3gb VRAM.. I know people say it just uses as much VRAM as it can to avoid stutters, but damn, what if 3gb Ultra limit is actually causing the stutters ? .. that would mean the 3gb limit is even not true, which as someone pointed out, would make them look a bit stupid for a Nvidia game who's high end "main card " 780 is only 3gb..
> 
> And yea 3Gb on the Ti.. I don't know why they're doing it.. maybe savy in a sense, the more they put now, the more they will have to put for the next gen ? (which translate in millions more of spending for them) I don't know.. mere speculation, not an expert


From what I understand 3GB is minimum.


----------



## revro

I'll see in the next few days how my Gigabyte 780 OC 3GB handles 1440p, might be fun. Worst case I drop the AA, or go down to Very High from Ultra. Then again my [email protected] is probably going to cry. Where is the 8-core 5930K when one needs her?


----------



## NavDigitalStorm

Here is the usage on a 295x2.

EDIT: at 1440p. Ultra graphics with 2x MSAA


----------



## AndroidVageta

Guys...GUYS!...3GB of VRAM is PLENNNNTTTYYYY for this game!!!

My friend's 3GB 7970 has NO problem running this game maxed out. As I've said previously, he's running MAX textures and MAX graphics options with temporal SMAA with NO problems. Sure, he's probably averaging 40FPS, but that's because he isn't overclocked or anything. I've played it today... even during gun fights and whatnot the performance was smooooooth, with no hiccups or texture issues or anything.

So...for 1080p...3GB is more than enough. All these cards using 4, 5, 6GB of RAM are doing so because it's AVAILABLE...NOT because it's NEEDED.

So...now let's continue on with how 6GB of VRAM is required...


----------



## edo101

Joke's on them, I got 4GB


----------



## iamhollywood5

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Here is the usage on a 295x2.
> 
> EDIT: at 1440p. Ultra graphics with 2x MSAA


Wait what? The 295x2 has 8GB *per core*? What the hell? Does the 295x2 have some kind of unified memory tech?


----------



## edo101

Quote:


> Originally Posted by *axizor*
> 
> Am I missing something? How are you guys playing this game early?


lol, same thing I'm wondering. I was like, when did this game come out? I googled it, it said May 27. Then I looked on torrents and there it was. Yeah, they're playing pirated copies, which makes me mad... mad because this only proves Ubi's point. But then again you can find copies of the game for Xbox and PS3 on torrents too.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *edo101*
> 
> lol same thing I'm wondering. I was like when did this game come out, I goggled it said May 27. then I looked on torrents and there it was.Yeah they are playing pirated copies which makes me mad...mad cause this only proves Ubi's point. But then again you can find copies of the game on Xbox and PS3 on torrents too


Ubisoft shipped copies early.


----------



## AndroidVageta

Quote:


> Originally Posted by *edo101*
> 
> lol same thing I'm wondering. I was like when did this game come out, I goggled it said May 27. then I looked on torrents and there it was.Yeah they are playing pirated copies which makes me mad...mad cause this only proves Ubi's point. But then again you can find copies of the game on Xbox and PS3 on torrents too


You ever consider that those that have downloaded might already have bought it? Both me and my friend have it pre-ordered and paid for. Sure I'm not playing a pirated copy (I can wait) but the fact that he is I don't see being a problem.


----------



## bencher

Quote:


> Originally Posted by *iamhollywood5*
> 
> Wait what? The 295x2 has 8GB *per core*? What the hell? Does the 295x2 have some kind of unified memory tech?


Divide the usage by 2


----------



## NinjaToast

Quote:


> Originally Posted by *edo101*
> 
> lol same thing I'm wondering. I was like when did this game come out, I goggled it said May 27. then I looked on torrents and there it was.Yeah they are playing pirated copies which makes me mad...mad cause this only proves Ubi's point. But then again you can find copies of the game on Xbox and PS3 on torrents too


See, now people are forgetting that physical copies on PC are playable right now thanks to Ubisoft's Uplay being highly ineffective. You researched the release date, but not whether early copies were available to consumers because of retailers and Uplay's inability to lock it down.


----------



## iamhollywood5

Quote:


> Originally Posted by *bencher*
> 
> Divide the usage by 2


Thought gpu-z read actual usage lol


----------



## edo101

Lol my bad then. Sorry about that. Didn't know UPlay was weak.


----------



## NinjaToast

Quote:


> Originally Posted by *edo101*
> 
> Lol my bad then. Sorry about that. Didn't know UPlay was weak.


No need to be sorry, just showing there is always more to the story than "oh everyone's a pirate."









Also it's well known Uplay is awful.


----------



## Bitemarks and bloodstains

That's just GPU-Z being screwy.
Quote:


> Originally Posted by *iamhollywood5*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bencher*
> 
> Divide the usage by 2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thought gpu-z read actual usage lol

It technically is reading actual usage, as 7.6GB of VRAM is being used; admittedly it's 3.8GB of data being mirrored, but it is still usage.
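A quick sanity check of the divide-by-two reading, assuming straightforward alternate-frame rendering where each GPU mirrors the same working set (function name is illustrative):

```python
# With AFR on a dual-GPU card like the 295X2, each GPU holds its own
# mirrored copy of the working set, so tools that sum across both GPUs
# report roughly double the unique data footprint.
def effective_usage(reported_total_gb, gpu_count=2):
    """Approximate unique working set from a summed multi-GPU reading."""
    return reported_total_gb / gpu_count

print(effective_usage(7.6))  # 3.8 GB of unique data behind a 7.6 GB reading
```

So a 7.6GB total reading on a 295X2 is consistent with the same ~3.8GB per-GPU usage people are seeing on single cards.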


----------



## axizor

I don't see how this one game justifies some R9 owners to throw it in the face of the 780 (Ti)/Nvidia people...


----------



## awdrifter

Only Nvidia would pull stuff like this. This is probably a way to limit the Ultra texture option to the GTX 780 and Titan. This is an Nvidia-sponsored game after all.


----------



## bencher

Quote:


> Originally Posted by *axizor*
> 
> I don't see how this one game justifies some R9 owners to throw it in the face of the 780 (Ti)/Nvidia people...


Quote:


> Originally Posted by *awdrifter*
> 
> Only Nvdia will pull stuff like this. This is probably a way to limit the Ultra texture option to the GTX780 and TItan. This is a Nivia sponsored game afterall.


I didn't look at it from this angle.


----------



## NuclearPeace

Quote:


> Originally Posted by *awdrifter*
> 
> Only Nvdia will pull stuff like this. This is probably a way to limit the Ultra texture option to the GTX780 and TItan. This is a Nivia sponsored game afterall.


Double-edged sword. It would make AMD look better, because the R9 280 has 3GB against the similarly priced 760's 2GB.


----------



## edo101

Quote:


> Originally Posted by *awdrifter*
> 
> Only Nvdia will pull stuff like this. This is probably a way to limit the Ultra texture option to the GTX780 and TItan. This is a Nivia sponsored game afterall.


wait what are you guys talking about? does this game not have ultra textures?


----------



## TopicClocker

Quote:


> Originally Posted by *NuclearPeace*
> 
> Double edged sword. It would make AMD look better because the R9 280 has 3GB against the similarly priced 760's 2gb


And the 680/770/670 vs 7900s..


----------



## TopicClocker

Quote:


> Originally Posted by *edo101*
> 
> wait what are you guys talking about? does this game not have ultra textures?


It does; 3GB of VRAM is supposedly required for Ultra textures.
I think awdrifter is insinuating that Nvidia is trying to limit it to Nvidia cards that come with 3GB+ standard from the get-go. A lot of cards like the 670/680/760/770 come with 2GB standard, with a price premium for 4GB, and for the cost of that premium you could probably buy a 2GB variant of the next card up. (It may be different in the US.) In the UK it's so bad that I might as well get a 290 for the cost of a 4GB 770; AMD's pricing is excellent on those cards. Either that, or go for a 280 or 280X, which have the 3GB standard a lot of games are demanding for the highest quality textures these days.


----------



## iamhollywood5

Quote:


> Originally Posted by *awdrifter*
> 
> Only Nvidia will pull stuff like this. This is probably a way to limit the Ultra texture option to the GTX 780 and Titan. This is an Nvidia-sponsored game, after all.


*limit the ultra textures to the Titan. Even the 780 seemingly runs out of VRAM. If anything, Nvidia is pushing me over to AMD here.


----------



## bencher

Quote:


> Originally Posted by *iamhollywood5*
> 
> *limit the ultra textures to the Titan. Even the 780 seemingly runs out of VRAM. If anything, Nvidia is pushing me over to AMD here.


AMD's offerings are cheaper, but very hard to cool on air.

I like the stock cooler on my card and don't want to change it, but I might have to, because it needs 60% fan speed to keep temps under 80°C while playing Crysis.

On the plus side, you can buy 4 used R9 290s for the price of your Ti, LOL.


----------



## edo101

Quote:


> Originally Posted by *TopicClocker*
> 
> No it does, 3GB vram is supposedly required for Ultra textures.
> I think awdrifter is insinuating Nvidia is trying to limit it to Nvidia cards which have 3GB+ standard from the get go, and a alot of cards like the 670/680/760/770 have a 2GB standard, with a price premium for 4GBs, and with the cost of that price premium you could probably buy a 2GB variant of the next card up. (May be different for the US) In the UK it's so bad over here I might as well get a 290 for the cost of a 4GB 770, AMD's pricing is excellent on those cards, either that or go for a 280 or 280X which have the 3GB standard a lot of games are demanding for the highest quality textures these days.


so Nvidia is restricting performance on their 2GB models? I still don't understand what the issue here is


----------



## Dudewitbow

Quote:


> Originally Posted by *edo101*
> 
> so Nvidia is restricting performance on their 2GB models? I still don't understand what the issue here is


The topic is that users with 2GB have to turn settings down, or they potentially run into performance problems when the memory fills up. The game uses more than 2GB (so they say), so anyone with a mid-range or previous-gen high-end Nvidia GPU with only 2GB of VRAM sort of got shafted for not paying the 4GB premium. AMD's lineup is a bit more balanced on VRAM: the 7900/280 cards have 3GB and should probably be able to run the game on Ultra, and the 7800/270 cards have 2GB, which is more reasonable now that they sit in the $150-200 mid/high tier, compared to the Nvidia GPUs in the $250-350 range (760, 660 Ti, 670, 680, 770), assuming you didn't pay the 4GB premium.


----------



## Yvese

Quote:


> Originally Posted by *bencher*
> 
> AMD offerings are cheaper, but very hard to cool on air.


Luckily there's a ton of great aftermarket solutions now.









Heck, you can get the best R9 290 out there, the Sapphire Tri-X, for $379: http://www.overclock.net/t/1491379/tiger-direct-r9-290-tri-x-359-99-after-rebate. That's half the price of a Ti.

Honestly, only those with more money than sense would get a Ti now that the mining craze is over (hopefully for good). You're paying a $300-400 premium, and for what... G-Sync? Which requires you to spend an extra $100-200 on your monitor (on top of what a new monitor itself costs). Note that I'm only talking about gaming, so if you need CUDA or whatever, by all means.

I admit ShadowPlay is really cool, though. Unfortunately, that's not enough to get me to pay an extra $300-400.


----------



## edo101

Quote:


> Originally Posted by *Dudewitbow*
> 
> The topic is that users with 2GB have to turn settings down, or they potentially run into performance problems when the memory fills up. The game uses more than 2GB (so they say), so anyone with a mid-range or previous-gen high-end Nvidia GPU with only 2GB of VRAM sort of got shafted for not paying the 4GB premium. AMD's lineup is a bit more balanced on VRAM: the 7900/280 cards have 3GB and should probably be able to run the game on Ultra, and the 7800/270 cards have 2GB, which is more reasonable now that they sit in the $150-200 mid/high tier, compared to the Nvidia GPUs in the $250-350 range (760, 660 Ti, 670, 680, 770), assuming you didn't pay the 4GB premium.


Yeah, that sucks, but it's not like they didn't have a choice to pick the red team. I think the same people also make the argument that 4GB or 3GB isn't needed. But yeah, Nvidia should've at least made 3GB standard... good thing we have competition, though.


----------



## L36

Played it with the Titan. Averaged 5.5GB of VRAM maxed out at 1600p with 2x TXAA and Ultra textures.


----------



## bencher

Quote:


> Originally Posted by *L36*
> 
> Played it with the Titan. Averaged 5.5GB of VRAM maxed out at 1600p with 2x TXAA and Ultra textures.


Did you need AA at that high of a resolution?


----------



## StrongForce

Getting tempted to wait for the EVGA 6GB card too... this or the iChill 780. What do you guys think, should I wait? Also, do you think the other brands will release 6GB models soon after EVGA? If not soon (since I know they have exclusivity and all that), how long would that take?

If it's months, then I might just get that iChill one.

Also, can the 384-bit bus even make use of 6GB... lol.


----------



## iamhollywood5

Quote:


> Originally Posted by *StrongForce*
> 
> Getting tempted to wait for the EVGA 6GB card too... this or the iChill 780. What do you guys think, should I wait? Also, do you think the other brands will release 6GB models soon after EVGA? If not soon (since I know they have exclusivity and all that), how long would that take?
> 
> If it's months, then I might just get that iChill one.
> 
> Also, can the 384-bit bus even make use of 6GB... lol.


With 7GHz memory, the 384-bit 780 Ti actually has slightly more memory bandwidth than the 512-bit 290X with its 5GHz memory. I believe the 780 has 6GHz memory, which falls a bit behind the 290X, but bandwidth is not the problem here; it's the amount.

But personally, I would always go with EVGA over any other vendor. I think they're hands-down the best.
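For what it's worth, the arithmetic behind that bandwidth claim holds up (a quick sketch; peak bandwidth = bus width in bytes × effective per-pin data rate, using the rates quoted above):

```python
# Quick sanity check on the bandwidth comparison (a sketch; the rates are
# the effective data rates quoted in the post, in Gbps per pin).
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width in bytes * data rate."""
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "GTX 780 Ti (384-bit @ 7 Gbps)": mem_bandwidth_gbs(384, 7.0),
    "R9 290X    (512-bit @ 5 Gbps)": mem_bandwidth_gbs(512, 5.0),
    "GTX 780    (384-bit @ 6 Gbps)": mem_bandwidth_gbs(384, 6.0),
}
for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
# 336, 320, and 288 GB/s respectively - the 780 Ti does edge out the 290X.
```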


----------



## edo101

Lol I do hope y'all are not thinking this game is worth shelling out 600 bucks for a Ti


----------



## bencher

Quote:


> Originally Posted by *StrongForce*
> 
> Getting tempted to wait for the EVGA 6GB card too... this or the iChill 780. What do you guys think, should I wait? Also, do you think the other brands will release 6GB models soon after EVGA? If not soon (since I know they have exclusivity and all that), how long would that take?
> 
> If it's months, then I might just get that iChill one.
> 
> Also, can the 384-bit bus even make use of 6GB... lol.


I mean there is also AMD


----------



## StrongForce

Quote:


> Originally Posted by *iamhollywood5*
> 
> With 7GHz memory, the 384-bit 780 Ti actually has slightly more memory bandwidth than the 512-bit 290X with its 5GHz memory. I believe the 780 has 6GHz memory, which falls a bit behind the 290X, but bandwidth is not the problem here; it's the amount.
> 
> But personally, I would always go with EVGA over any other vendor. I think they're hands-down the best.


That's what I hear everyone saying, but the iChill cooler seems better according to the reviews I checked.

Is the ACX dual-fan cooler the best they have (on air)? According to my research it is.

But yeah... looking it up right now:

http://www.anandtech.com/show/7356/capsule-review-evga-geforce-gtx-780-superclocked-acx/3

http://hexus.net/tech/reviews/graphics/63393-inno3d-ichill-geforce-gtx-780-herculez-x3-ultra/?page=11

Only a 2°C difference, so not much of a big deal I guess. It also depends on ambient temps, case, etc., but we can assume the gap isn't big.

If I can get the EVGA ACX 6GB with the Watch Dogs bundle, I'd be highly tempted.








Quote:


> Originally Posted by *bencher*
> 
> I mean there is also AMD


Yeah, I'd be tempted by a 290 to be honest, but I'm boycotting AMD for personal reasons; I had trouble with that memory leak in BF4, which wasted so much of my time.

I've also had trouble with the drivers in the past, and it runs much hotter, which isn't good next to my FX-8320.

The Catalyst Control Center is crappy too. For instance, one of the problems I had was that I'd click on it and it would never open, so I had to reinstall. No big deal, but yeah, those few reasons are enough for me: the memory leak is the boycott, the rest is just simple choice.


----------



## iamhollywood5

Quote:


> Originally Posted by *edo101*
> 
> Lol I do hope y'all are not thinking this game is worth shelling out 600 bucks for a Ti


No, it's not, but it does show that we're hitting that 3GB ceiling, and 3GB is no longer enough to guarantee you won't have a VRAM shortage at 1080p. I'm seriously considering swapping my 780 Ti 3GB for a 780 6GB. And no, I'm not going back to AMD.


----------



## bencher

Quote:


> Originally Posted by *iamhollywood5*
> 
> No, it's not, but it does show that we're hitting that 3GB ceiling, and 3GB is no longer enough to guarantee you won't have a VRAM shortage at 1080p. I'm seriously considering swapping my 780 Ti 3GB for a 780 6GB. And no, I'm not going back to AMD.


Not sure if swapping a faster card for a slower one is smart


----------



## StrongForce

Quote:


> Originally Posted by *bencher*
> 
> I mean there is also AMD


Quote:


> Originally Posted by *iamhollywood5*
> 
> No, it's not, but it does show that we're hitting that 3GB ceiling, and 3GB is no longer enough to guarantee you won't have a VRAM shortage at 1080p. I'm seriously considering swapping my 780 Ti 3GB for a 780 6GB. And no, I'm not going back to AMD.


Just read an article saying they're also going to make Tis with extra RAM too... if you're interested.

http://www.prodimex.ch/pInfos.aspx?CODE=F19E04&D=EVGA+GeForce+GTX+780+w%2f+ACX+Cooler+PCIe+%28+GeForce+GTX780+6144MB+2xDVI+HDMI+Display+port+%29

And damn, my local shop? Nope, no Watch Dogs bundle.









Wow, at this price there's a Zotac Ti... so they really want to make us choose between raw power and extra VRAM. Ugh.

And yeah, the extra 3GB is freaking pricey... holy, they're milking us.


----------



## bencher

Quote:


> Originally Posted by *StrongForce*
> 
> Just read an article saying they're also going to make Tis with extra RAM too... if you're interested.
> 
> http://www.prodimex.ch/pInfos.aspx?CODE=F19E04&D=EVGA+GeForce+GTX+780+w%2f+ACX+Cooler+PCIe+%28+GeForce+GTX780+6144MB+2xDVI+HDMI+Display+port+%29
> 
> And damn, my local shop? Nope, no Watch Dogs bundle.


The EVGA guy here already said the 6GB Ti won't be coming.


----------



## edo101

Quote:


> Originally Posted by *iamhollywood5*
> 
> No it's not, but it does show that we're hitting that 3GB ceiling and 3GB is no longer enough to guarantee you wont have a VRAM shortage at 1080p. I'm seriously considering swapping my 780 Ti 3GB for a 780 6GB. And no i'm not going back to AMD.


why no AMD?


----------



## Entreaty88

The textures in this game look like ass. If it isn't way too sharp, it's way too low-res. Nevertheless, my 780 Ti GHz and 4770 should do a decent job. Thing is, has anyone read that changing the .ini settings unlocks way better graphics?

EDIT: http://www.reddit.com/r/watch_dogs/comments/26c3ew/deferredfxqualityconsole/

EDIT 2: I have a suspicion that the old-gen consoles screwed up the development of this game.


----------



## Murlocke

Well, I got SLI to work; apparently the solution is running beyond 1080p. There must be some kind of bottleneck, because at 1080p SLI gives me worse performance, but at 1440p it's double the FPS.

I then put 3-4 hours into the game at 1440p and 2x MSAA, driving around fast and free roaming. I even put 1 hour into the spider tank mission and was above 60FPS almost the entire time. Overall, I average ~60FPS, with dips into the 50s. Sometimes, I will get the same chops I get on 1080p and drop to low FPS for a couple frames. I really don't think it's caused by VRAM though, otherwise 1440p simply wouldn't be playable. The graphics are insane in my eyes, and IMO the best we've ever seen in an open world game by a long shot.

I further tested whether it was VRAM by going up to 1440p 4x MSAA: the chops remained the same, but average FPS was about 45. Just to see how far I could actually go at 1440p before hitting 3GB of non-cached VRAM... the answer was 8x MSAA. That's where my average FPS goes from about 45 to about 10 and the game almost starts locking up. This is what I expect from a VRAM cap, not the chops many people are experiencing. I feel like drivers or patches will fix that.

By the way, the police AI is nuts in the game. It's really really good... best i've probably ever seen. A single cop can be a challenge to get away from at times.
Quote:


> Originally Posted by *Entreaty88*
> 
> The textures in this game look like ass. If it isn't way too sharp, it's way too low-res. Nevertheless, my 780 Ti GHz and 4770 should do a decent job. Thing is, has anyone read that changing the .ini settings unlocks way better graphics?
> 
> EDIT: http://www.reddit.com/r/watch_dogs/comments/26c3ew/deferredfxqualityconsole/


I totally disagree, the game looks great.

My 3-4 hours of testing above was with that setting on PC. Personally, I don't notice a difference with it off, and I'm guessing that's why it never made it into the shipping game.
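As a rough sanity check on why MSAA chews through VRAM so fast at 1440p, here's a back-of-envelope estimate (assumptions: 4 bytes per pixel for color and 4 for depth/stencil, one stored sample each; a real engine adds G-buffers, shadow maps, and streaming pools on top, so treat these as lower bounds):

```python
# Toy estimate of a single MSAA render target's memory at 2560x1440.
# Assumption: 4 bytes/pixel color + 4 bytes/pixel depth/stencil per sample.
def msaa_target_mib(width: int, height: int, samples: int) -> float:
    bytes_per_sample = 4 + 4  # color + depth/stencil
    return width * height * samples * bytes_per_sample / 2**20

for s in (1, 2, 4, 8):
    print(f"{s}x MSAA: {msaa_target_mib(2560, 1440, s):.1f} MiB")
# 8x at 1440p is 225 MiB for a single target; with several render targets
# and resolve buffers in flight, the jump from 4x to 8x adds up fast.
```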


----------



## Alvarado

Quote:


> Originally Posted by *Murlocke*
> 
> Well I got SLI to work, apparently the solution is running beyond 1080p. Must be some kind of bottleneck because on 1080p my SLI results in worse performance, but on 1440p it's double the FPS.
> 
> I then put 3-4 hours into the game at 1440p and 2x MSAA, driving around fast and free roaming. I even put 1 hour into the spider tank mission and was above 60FPS almost the entire time. Overall, I average ~60FPS, with dips into the 50s. Sometimes, I will get the same chops I get on 1080p and drop to low FPS for a couple frames. I really don't think it's caused by VRAM though, otherwise 1440p simply wouldn't be playable. The graphics are insane in my eyes, and IMO the best we've ever seen in an open world game by a long shot.
> 
> I further tested whether it was VRAM by going up to 1440p 4x MSAA: the chops remained the same, but average FPS was about 45. Just to see how far I could actually go at 1440p before hitting 3GB of non-cached VRAM... the answer was 8x MSAA. That's where my average FPS goes from about 45 to about 10 and the game almost starts locking up. This is what I expect from a VRAM cap, not the chops many people are experiencing. I feel like drivers or patches will fix that.
> 
> By the way, the police AI is nuts in the game. It's really really good... best i've probably ever seen. A single cop can be a challenge to get away from at times.


Woah!


----------



## mercs213

Quote:


> Originally Posted by *Entreaty88*
> 
> The textures in this game look like ass. If it isn't way too sharp, it's way too low-res. Nevertheless, my 780 Ti GHz and 4770 should do a decent job. Thing is, has anyone read that changing the .ini settings unlocks way better graphics?
> 
> EDIT: http://www.reddit.com/r/watch_dogs/comments/26c3ew/deferredfxqualityconsole/
> 
> EDIT 2: I have a suspicion that the old-gen consoles screwed up the development of this game.


Placebo. Also, the amazing screenshots you're seeing are downsampled from 4K: http://www.dsogaming.com/news/watch_dogs-hidden-deferredfxquality-option-improves-significantly-visuals/#more-65558


----------



## Booty Warrior

Quote:


> Originally Posted by *Murlocke*
> 
> Well I got SLI to work, apparently the solution is running beyond 1080p. Must be some kind of bottleneck because on 1080p my SLI results in worse performance, but on 1440p it's double the FPS.
> 
> I then put 3-4 hours into the game at 1440p and 2x MSAA, driving around fast and free roaming. I even put 1 hour into the spider tank mission and was above 60FPS almost the entire time. Overall, I average ~60FPS, with dips into the 50s. Sometimes, I will get the same chops I get on 1080p and drop to low FPS for a couple frames. I really don't think it's caused by VRAM though, otherwise 1440p simply wouldn't be playable. The graphics are insane in my eyes, and IMO the best we've ever seen in an open world game by a long shot.
> 
> I further tested whether it was VRAM by going up to 1440p 4x MSAA: the chops remained the same, but average FPS was about 45. Just to see how far I could actually go at 1440p before hitting 3GB of non-cached VRAM... the answer was 8x MSAA. That's where my average FPS goes from about 45 to about 10 and the game almost starts locking up. This is what I expect from a VRAM cap, not the chops many people are experiencing. I feel like drivers or patches will fix that.
> 
> By the way, the police AI is nuts in the game. It's really really good... best i've probably ever seen. A single cop can be a challenge to get away from at times.
> I totally disagree, the game looks great.
> 
> My 3-4 hours of testing above was with that set to PC. Personally, I don't notice a difference with it off, and i'm guessing that's why it was never put into the real game.


So, crisis averted?

People need to relax and wait for the official release + drivers before freaking out.


----------



## Silent Scone

Well things _are_ going this way. GPUs need to stack up in this department next time round.


----------



## Hl86

Best game I've played in a long time; even at 40 FPS it feels really smooth. It needs a heavy CPU though, because at 1680x1050 I'm getting 40 FPS with a 4.8GHz 2500K.


----------



## NinjaToast

Quote:


> Originally Posted by *Booty Warrior*
> 
> So, crisis averted?
> 
> People need to relax and wait for the official release + drivers before freaking out.


NO! RELAXING IS FER SQUARES AND STUFF!









I kid, that post is comforting.


----------



## zealord

Quote:


> Originally Posted by *Hl86*
> 
> Best game I've played in a long time; even at 40 FPS it feels really smooth. It needs a heavy CPU though, because at 1680x1050 I'm getting 40 FPS with a 4.8GHz 2500K.


what gpu?


----------



## Murlocke

Quote:


> Originally Posted by *Booty Warrior*
> 
> So, crisis averted?
> 
> People need to relax and wait for the official release + drivers before freaking out.


Well, it's not buttery smooth by any means, and there are still odd chops that will hopefully be fixed. It is definitely playable at 1440p with 2x MSAA on two 780 Tis, though... This game loads an incredible amount of highly detailed textures in the distance, and I think that's where the chops are coming from.


----------



## AndroidVageta

Again:

Guys...GUYS!...3GB of VRAM is PLENNNNTTTYYYY for this game!!!

My friend's 7970 3GB has NO problem running this game maxed out. As I've said previously, he's running MAX textures and MAX graphics options with Temporal SMAA with NO problems. Sure, he's probably averaging 40FPS, but that's because he isn't overclocked or anything. I've played it today... even during gun fights and whatnot, the performance was smooooooth with no hiccups or texture issues or anything.

So... for 1080p, 3GB is more than enough. All these cards using 4, 5, 6GB of VRAM are doing so because it's AVAILABLE... NOT because it's NEEDED.

So... now let's continue on with how 6GB of VRAM is "required"...


----------



## Alvarado

Yeah but....playing at 40 fps average? Crazy talk.


----------



## Silent Scone

I'd slit my wrists before I'd accept 40fps average. Just saying..

3GB is definitely not plenty anymore... it's quickly becoming the minimum requirement for max settings, especially when gaming at 1440p and up.

So yes, we could go back to last February and start taunting the "6GB is crazy" crowd, but games are going this way now, and requirements will only go up. I've no intention of busting out my wallet and getting rid of my Tis just yet, but I do hope the next installments come with plenty of memory.


----------



## y2kcamaross

Quote:


> Originally Posted by *Silent Scone*
> 
> I'd slit my wrists before I'd accept 40fps average. Just saying..
> 
> 3GB is definitely not plenty any more...It's quickly becoming the minimum requirement for max settings. Especially when gaming at 1440P and up.
> 
> So yes, we could go back to last February and start taunting the "6GB is crazy", but games are going this way now, and requirements will only go up. I've no intention of busting out my wallet and getting rid of my Ti's just yet, but I do hope the next installments will come with plenty of memory.


Slit your wrists? Really? A little overdramatic, aren't we?


----------



## Alvarado

Quote:


> Originally Posted by *y2kcamaross*
> 
> slit your wrists? Really? Little over dramatic aren't we


Don't think so. This is OCN afterall.


----------



## DADDYDC650

Quote:


> Originally Posted by *bencher*
> 
> The EVGA guy here already said the 6GB Ti won't be coming.


That's not exactly accurate.


----------



## Murlocke

Here's some shots I just took... Most have FPS in corners.

Settings: 1440p, Ultra (maxed everything), 2x MSAA. 3GB of VRAM. SLI enabled. I do experience 1-3 frames of chops every now and then, especially in high density areas, but it's totally playable.


----------



## bencher

Quote:


> Originally Posted by *DADDYDC650*
> 
> That's not exactly accurate.


Quote:


> Originally Posted by *EVGA-JacobF*
> 
> Have an unfortunate update on the 780 Ti 6GB. At this moment I cannot confirm when or even if this card will be released.
> 
> Although it was being planned at one time and did seem on track, sometimes things change and this was one of those times unfortunately.
> 
> On the other hand, the 780 6GB is available now.
> 
> Also, previously there was mentioned that there will be a 780 6GB with a reference cooler, unfortunately I cannot confirm if this one will be released either at this moment.


Accurate enough.


----------



## .:hybrid:.

Quote:


> Originally Posted by *Murlocke*
> 
> Seen a few people wanting uncompressed screenshots. Here's 28 of them that I just took with a legit copy, clocking in at 295MB total. FPS in the corner.
> 
> Settings: 1440p, Ultra (maxed everything), 2x MSAA. 3GB of VRAM. SLI enabled. Like stated above, I do experience 1-3 frames of chops every now and then, especially in high density areas, but it's totally playable. The game is beautiful, especially at night. Most importantly, the game seems really fun so far.


Doesn't imgur compress files though?

I think http://minus.com/ allows uncompressed uploading


----------



## iamhollywood5

Quote:


> Originally Posted by *edo101*
> 
> why no AMD?


Features and drivers. I know they've stepped up their game with drivers, but Nvidia is still leading in that department, and I don't wanna go back to life without ShadowPlay and CUDA (which I use for video production and motion graphics software like Adobe's). I also love the technology behind G-Sync and plan to get a BenQ G-Sync monitor once they're released, and I'm not convinced FreeSync will be as good a solution. I could list other things, but I don't want to start a fanboy war. I certainly don't hate AMD - in fact, I'm always tempted to pick up a 290X Lightning just to play AMD titles (but I'm not made of money, so...)


----------



## Murlocke

Quote:


> Originally Posted by *.:hybrid:.*
> 
> Doesn't imgur compress files though?
> 
> I think http://minus.com/ allows uncompressed uploading


Damn them, you are right. Tried that site but it was the same.


----------



## DADDYDC650

@Bencher, I still don't see any confirmation of the 780 TI 6GB never being released but ok.


----------



## edo101

Quote:


> Originally Posted by *Murlocke*
> 
> Seen a few people wanting uncompressed screenshots. Here's 28 of them that I just took with a legit copy, clocking in at 295MB total. FPS in the corner.
> 
> Settings: 1440p, Ultra (maxed everything), 2x MSAA. 3GB of VRAM. SLI enabled. Like stated above, I do experience 1-3 frames of chops every now and then, especially in high density areas, but it's totally playable. The game is beautiful, especially at night. Most importantly, the game seems really fun so far.


What kind of frames are you getting? And man, it's really stupid when a massive card with only 2GB has trouble running a game that looks almost identical on consoles.


----------



## Murlocke

Quote:


> Originally Posted by *edo101*
> 
> what kind of frames are you getting? And man this is really stupid when a massive card with only 2GB has trouble running a game that looks almost identical on consoles


Frames are on the screenshots.


----------



## TopicClocker

Quote:


> Originally Posted by *edo101*
> 
> what kind of frames are you getting? And man this is really stupid when a massive card with only 2GB has trouble running a game that looks almost identical on consoles


You might want to check this out: http://www.overclock.net/t/1491602/watch-dogs-2gb-vram-performance-analysis-and-graphics
I went over a couple problems.


----------



## SheepMoose

Is this game worthy of a pre-order?
Don't expect my system to be able to run it at max, or anywhere near, but is the gameplay fun enough?


----------



## StrongForce

So we can speculate that a 3GB card would be enough for Ultra after some patches... well, I don't mind dropping any type of MSAA or ultra shadows personally, but I like ultra textures.


----------



## hzac

Quote:


> Originally Posted by *Murlocke*
> 
> Seen a few people wanting uncompressed screenshots. Here's 28 of them.
> 
> Compressed JPG: Here
> Uncompressed: Here
> 
> Settings: 1440p, Ultra (maxed everything), 2x MSAA. 3GB of VRAM. SLI enabled. I do experience 1-3 frames of chops every now and then, especially in high density areas, but it's totally playable.


I think your uncompressed link is to a personal folder. It's making me sign in, then goes to my folders and doesn't give me a download.


----------



## Murlocke

Quote:


> Originally Posted by *hzac*
> 
> I think your uncompressed link is to a personal folder. It's making me sign in, then goes to my folders and doesn't give me a download.


Fixed. More images are being uploaded right now too.


----------



## Redeemer

Who would ever have thought that a top-of-the-line 780 Ti 3GB would get its buffer maxed out this soon, at only 1080p?


----------



## hzac

Quote:


> Originally Posted by *Murlocke*
> 
> Fixed. More images are being uploaded right now too.


Maybe it's just because I've only got a 1080p monitor, but there seems to be an awful lot of jaggies even for 2x MSAA?


----------



## Murlocke

Quote:


> Originally Posted by *hzac*
> 
> Maybe it's just because I've only got a 1080p monitor, but there seems to be an awful lot of jaggies even for 2x MSAA?


Yeah, looking at 1440p screenshots full size on 1080p will result in that. For scaling click "Full Screen", then use the arrows to browse through the pictures. This will scale them to your monitor and it should look better.


----------



## TopicClocker

Quote:


> Originally Posted by *Redeemer*
> 
> Who would ever have thought that a top of the line 780TI 3GB would get its buffer maxed out this soon at only 1080p


Nvidia's always giving cards stupid amounts of RAM.


----------



## Kinaesthetic

Quote:


> Originally Posted by *Redeemer*
> 
> Who would ever have thought that a top of the line 780TI 3GB would get its buffer maxed out this soon at only 1080p


You mean....

Who would've ever thought that a top of the line 780Ti 3GB would get its buffer maxed out this soon at only 1080p.

Who would've ever thought that a top of the line R9-290X would get its buffer maxed out this soon at only 1080p.

Who would've ever thought that a top of the line R9-295X would get its buffer maxed out this soon at only 1080p.

Who would've ever thought that a top of the line GTX Titan/TitanBlack would get its buffer maxed out this soon at only 1080p.

Who would've ever thought that a top of the line (insert high-end card here) would get its buffer maxed out this soon at only 1080p.

This VRAM issue is brand-agnostic and is a problem with the game itself. Everyone's cards are maxing, or nearly maxing, out their VRAM, regardless of VRAM size and at pretty much every common resolution (4K excluded, because I haven't heard from anyone playing this game at 4K, and that's honestly where a VRAM bottleneck would be). This is a Ubisoft garbage-coding problem, not an Nvidia/AMD problem.


----------



## bencher

Quote:


> Originally Posted by *Kinaesthetic*
> 
> You mean....
> 
> Who would've ever thought that a top of the line 780Ti 3GB would get its buffer maxed out this soon at only 1080p.
> 
> Who would've ever thought that a top of the line R9-290X would get its buffer maxed out this soon at only 1080p.
> 
> Who would've ever thought that a top of the line R9-295X would get its buffer maxed out this soon at only 1080p.
> 
> Who would've ever thought that a top of the line GTX Titan/TitanBlack would get its buffer maxed out this soon at only 1080p.
> 
> Who would've ever thought that a top of the line (insert high-end card here) would get its buffer maxed out this soon at only 1080p.
> 
> This VRAM issue is brand agnostic and is a problem with the game itself. Everyone's cards are maxing, or nearly maxing out the VRAM of their card, regardless of VRAM size, and pretty much regardless of most common resolutions (4K not included because I haven't heard anything from anyone on 4K playing this game, and that is honestly where there would be a VRAM bottleneck). This is a Ubisoft garbage-coding problem. Not an Nvidia/AMD problem.


LOL AMD high-end cards have no problem.


----------



## Murlocke

I've pretty much completely fixed my stuttering by *setting GPU render frames to 3* instead of 1.

1440p, Ultra/maxed, 2x MSAA, Vsync set to 1 frame. Steady 60FPS with drops to about 55FPS, and around 3 load chops lasting 2-3 frames each in about 30 minutes of driving around at full speed. Totally acceptable given the amount of detail the city has... it's nuts.

I'm pretty happy with my performance and the game's drivers aren't even out.
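For anyone curious why a deeper buffered-frames setting can smooth things out: the display drains one frame per refresh, and a longer render-ahead queue absorbs the occasional slow frame before the queue runs dry, at the cost of some input latency. Here's a toy queueing model of that intuition (all numbers invented; this is not how the actual driver setting is implemented, just the general idea):

```python
import random

def missed_vsyncs(render_times_ms, queue_depth, vsync_ms=16.7):
    """Count display refreshes where no finished frame was ready.

    The renderer works ahead, keeping up to `queue_depth` finished
    frames buffered; the display consumes one frame per vsync tick.
    """
    queued = 0      # finished frames waiting to be displayed
    budget = 0.0    # rendering time available in the current tick
    misses = 0
    i = 0
    while i < len(render_times_ms):
        budget += vsync_ms
        # Render ahead while there is queue space and time left this tick.
        while (queued < queue_depth and i < len(render_times_ms)
               and budget >= render_times_ms[i]):
            budget -= render_times_ms[i]
            queued += 1
            i += 1
        if queued == queue_depth:
            budget = 0.0        # renderer blocked on a full queue
        if queued:
            queued -= 1         # display shows one frame
        else:
            misses += 1         # nothing ready: a visible stutter
    return misses

random.seed(1)
# Mostly 10 ms frames with occasional 40 ms spikes (e.g. streaming hitches).
times = [40.0 if random.random() < 0.05 else 10.0 for _ in range(1000)]
print("queue depth 1:", missed_vsyncs(times, 1), "missed vsyncs")
print("queue depth 3:", missed_vsyncs(times, 3), "missed vsyncs")
```

In this toy model, a depth-1 queue stalls the display on every hitch, while depth 3 covers almost all of them, which lines up with the effect described above, traded for up to three frames of extra input lag.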


----------



## Kinaesthetic

Quote:


> Originally Posted by *bencher*
> 
> LOL AMD high-end cards have no problem.


Post #265 by Nav from DS shows the 295X using its full VRAM buffer at 2x MSAA 1440p. People are seeing the Titan use nearly its full VRAM buffer. People are seeing the 780/780 Ti use nearly their full VRAM buffers.

And people all over the internet are saying it plays like crap on both sides. It *is* *brand-agnostic.*

For once, bencher, cut the bias out of your posts. You've done nothing but troll this thread. This is a miserably coded game for ANYONE. The problem here is Ubisoft, not AMD or Nvidia.

Here are some of the people with AMD cards who have posted on the Watch_Dogs subreddit:

Quote:


> From zhohner:
> 
> *My Specs:*
> *Motherboard:* MSI X79A-GD45 (8D)
> *CPU:* i7 3820 @ 4.4Ghz
> *Memory:* 16GB DDR3 1866MHz
> *GPU:* 2 x Sapphire R9 290 TRI-X OC 4GB (Catalyst 14.4 Drivers)
> *Storage:* Samsung EVO 250GB SSD
> *Monitor:* Samsung U28D590D
> *OS:* Windows 8.1 64 bit
> 
> *Performance:* Playing at 2160p @ 60Hz, all settings at maximum save for AA, which is turned off. Vsync is disabled. Getting around 28 - 30fps. Crossfire does not seem to be working very well, when enabled I see medium to low usage split across both cards, when disabled it makes full use of the one card. Performance with either is the same at around 28 - 30fps. Hopefully we will see some improved drivers on or shortly after launch.
> 
> From Manuela45:
> 
> Getting around 29 to 50fps at ultra, V-sync Disabled, Smaa, HBAO+ HIGH, 2560x1440
> 
> PROC:Intel 3930k stock clocks. GPU: XFX R9 290 OC'ed MoBo: Asus Rampage IV Extreme RAM: Samsung 16GB DDR3 1600MHz


Not to mention the numerous posters here in this thread and around on OCN. Get a grip.


----------



## Sadmoto

Quote:


> Originally Posted by *Kinaesthetic*
> 
> This is a Ubisoft garbage-coding problem. Not an Nvidia/AMD problem.


This.

I told my friend who got his copy early to check CPU/GPU usage because I was curious.

And without fail, Ubisoft used the same craptastic coding they use in AC. We all remember going into Boston in AC3 and it just being a mess, even on high-end CPUs.

It baffles me that they still haven't learned how to properly code for multi-core CPUs. You'd think it would be common sense nowadays for major game companies to scale their work to "use up to X cores", where X is the number of cores on the current CPU. Instead it almost seems like they're taking a page out of ARMA 3's playbook and pinning each subsystem to its own core (one for physics, one for AI, one for sound, etc...).

Then you hit a CPU-bound scene, yet your usage isn't even at 60% on 4 of 8 cores and is under 30% on the other 4.
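The "use up to X cores" idea above can be sketched in a few lines. This is obviously not Ubisoft's code, just a minimal Python illustration with a made-up stand-in for the per-frame work: instead of hard-coding one thread per subsystem, you size a worker pool to whatever the machine actually has.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def simulate_subsystem_work(n):
    # Stand-in for one chunk of per-frame work (AI, physics, audio, ...)
    return sum(i * i for i in range(n))

# Size the pool to the machine's actual core count,
# instead of hard-coding one thread per subsystem.
workers = os.cpu_count() or 4

with ThreadPoolExecutor(max_workers=workers) as pool:
    # Hand out one chunk of work per available core.
    results = list(pool.map(simulate_subsystem_work, [10_000] * workers))

print(len(results))  # one result per core, however many there are
```

The point is that the same binary saturates a dual-core and an 8-core machine alike, which is exactly what fixed per-subsystem threads can't do.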









Seriously, does Ubisoft think we're all running ancient dual/quad cores while somehow having 6GB of RAM and 2GB+ of VRAM?









Even with their shoddy but expected coding, it runs pretty well.

I can have everything on ultra (except textures on high) with anything under 4x MSAA and still hold good frames.


----------



## StrongForce

Mhh yea I remember a friend telling me about how AC3 was optimized with the feets, lol.

Let's hope the drivers help, or even fix it...

Anyone have a 3GB card running at 1080p with no huge FPS drops?


----------



## Sadmoto

This must be how Ubisoft does their CPU coding.

Wasn't there a tweet or blog post about how this game would utilize up to 8 cores because it was being built around the new consoles?

They have 2 days and counting to fix that load of crap, either with one of their quote edits (like the whole res/FPS fiasco on consoles) or, I don't know, by actually sticking to their word? HAH. Riiiiight...








That's like asking EA to stop being greedy...


----------



## xxicrimsonixx

4GB on 290X


----------



## lilchronic

Here are some screenshots I've taken @ 2560x1440, everything ultra with SMAA... any more AA and I'll hit my VRAM limit.




Spoiler: Warning: Spoiler!


----------



## Ascii Aficionado

Quote:


> Originally Posted by *lilchronic*
> 
> Here's some screenshot's iv'e taken @ 2560x1440 everything ultra and SMAA.... any more AA and ill hit my vram limit


It looks like you don't have anisotropic filtering enabled.


----------



## StrongForce

I'm actually considering flip-flopping like crazy and just buying an R9 280 until Nvidia releases the 800 series... after much thought, lol.


----------



## lilchronic

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> It looks like you don't have anisotropic enabled.


These are my settings.


----------



## iamhollywood5

Quote:


> Originally Posted by *Murlocke*
> 
> I've pretty much completely fixed my stuttering by *setting GPU render frames to 3* instead of 1.
> 
> 1440p, Ultra/Maxed, 2x MSAA, Vsync set to 1 frame. Steady 60FPS with drops to about 55FPS, around 3 load chops that lasted about 2-3 frames in about 30 minutes of driving around at full speed. Totally acceptable given the amount of detail the city has.. it's nuts.
> 
> I'm pretty happy with my performance and the game's drivers aren't even out.


Hmm. I also had it set to 3 on my 780 Ti and didn't get the same kind of smoothness at Ultra 1080p, although I was trying to use TXAA (I know, I know). Weird.
Quote:


> Originally Posted by *Redeemer*
> 
> Who would ever have thought that a top of the line 780TI 3GB would get its buffer maxed out this soon at only 1080p


To be fair, GK110 is getting old, a lot older than Hawaii. Back when the 780 was first released (exactly a year ago), nobody was worried about 3GB at 1080p. Games weren't even close to that limit at 1080p, and people thought we wouldn't need more until higher resolutions were mainstream. But yes, it's insane that a $700 graphics card can run out of VRAM at 1080p in a game that doesn't exactly look mind-blowing.
Quote:


> Originally Posted by *hzac*
> 
> Maybe its just becuase ive only got a 1080p mointor but there seems to be an awful lot of jaggies even for 2xmsaa?


For some reason this game has jaggies and aliases like a biotch. More than any recent game I've seen.
Quote:


> Originally Posted by *Sadmoto*
> 
> and without fail ubisoft used their craptastic coding they do in AC. we all remember going into boston on AC3 and it just being a mess, even on high end cpus.
> 
> it baffles me that they still haven't learned how to properly code for multi-core cpus, you'd think that it would be common sense now a days for major gaming companies to come up with a method to code "use up to X core" where X is the # of cores on the current CPU.


And yet they said we NEEDED 8 thread CPUs to run the game on ultra -.-


----------



## Sadmoto

Quote:


> Originally Posted by *lilchronic*
> 
> thiese are my settings


I was wondering myself if that was an in-game setting, but I guess not.
You should be able to force anisotropic filtering through AMD's CCC or the Nvidia equivalent, though.


----------



## Murlocke

So, MSAA is not worth it in this game. It actually performs noticeably worse than SMAA or Temporal SMAA while costing much more VRAM. I'm going to list them here in order of VRAM consumption/performance loss. Pictures here are all maxed, with 16x AF forced in the drivers.

No AA: http://i.imgur.com/OiFM7S2.jpg

FXAA: http://i.imgur.com/HHMnuUE.jpg

SMAA: http://i.imgur.com/vLAMADd.jpg

Temporal SMAA: http://i.imgur.com/26ptRVL.jpg

2x MSAA: http://i.imgur.com/CHMim6C.jpg

SMAA/Temporal SMAA seem to be the best here. In theory, Temporal should look better in motion, and it's only a 2 FPS hit. Anything higher than 2x MSAA will eat VRAM for breakfast, and it takes 4x MSAA to be comparable to SMAA. TXAA is not even worth testing; it blurs the picture more than FXAA and has a huge performance cost.

Note: Any brightness differences were caused by shadows, not the AA method.


----------



## lilchronic

Forced 16x anisotropic filtering through NVCP... here's another with it on.


----------



## StrongForce

Anyone here with an HD 7950 or an R9 280 to test it? I guess I can wait for release day, but it's not going to be easy to find a benchmark with those cards for this game...


----------



## Blameless

Quote:


> Originally Posted by *Murlocke*
> 
> If it needs 3GB on 1080p, it's going to easily require 4+ GB on 1440p for good performance.


Not necessarily.

Many texture heavy games will need a far larger texture buffer than frame buffer, and increasing display resolution generally only increases the latter.
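A rough back-of-the-envelope sketch of that point (illustrative numbers only, not measured from Watch Dogs): the render targets scale with resolution, while a hypothetical fixed "ultra" texture pool does not.

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate render-target footprint: color buffers only,
    ignoring depth and MSAA, which scale with resolution the same way."""
    return width * height * bytes_per_pixel * buffers / 1024**2

# Hypothetical resolution-independent texture pool for "ultra" textures.
texture_pool_mb = 2200

for w, h in [(1920, 1080), (2560, 1440)]:
    fb = framebuffer_mb(w, h)
    print(f"{w}x{h}: ~{fb:.0f} MB render targets + {texture_pool_mb} MB textures")
```

Going from 1080p to 1440p here adds only about 18 MB of render targets; the texture pool, which dominates the total, doesn't move at all.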


----------



## Murlocke

16x AF forced in drivers does indeed make a difference, especially when it comes to distant ground textures:

On: http://i.imgur.com/vLAMADd.jpg
Off: http://i.imgur.com/RmJXi0N.jpg

(Ignore the FPS, one was with SLI and the other without)


----------



## awdrifter

Is that on Ultra quality textures? It still looks kinda blurry even with the AF turned on. It definitely looked better though.


----------



## Sadmoto

Quote:


> Originally Posted by *Murlocke*
> 
> 16x AF forced in drivers does indeed make a difference, especially when it comes to distant ground textures:
> 
> On: http://i.imgur.com/vLAMADd.jpg
> Off: http://i.imgur.com/RmJXi0N.jpg
> 
> (Ignore the FPS, one was with SLI and the other without)


Glad to hear the AF helps distant detail. I felt that was one thing that looked "off", aside from shadows.
Certain areas/items/buildings have VERY blocky shadows while the rest is all crisp; I've really seen these weird shadows on the window-washer lifts.

The stuff in the distance almost gets this gritty look, or is it just me?


----------



## Murlocke

Quote:


> Originally Posted by *awdrifter*
> 
> Is that on Ultra quality textures? It still looks kinda blurry even with the AF turned on. It definitely looked better though.


That's max; you can thank imgur for the JPG compression. lilchronic's photos above have far less compression (click them, then click the original), but in-game still looks better.









https://www.mediafire.com/#673cchrcdzkj0
I've got a massive number of uncompressed images here, but they lack 16x AF and I was using MSAA, thinking it was better than SMAA. Most of the jaggies/distant-texture blurriness has since been fixed... too lazy to reupload another 500MB of pictures.









Quote:


> Originally Posted by *Sadmoto*
> 
> Glad to hear about the ansi helping distant detail. I felt that was one thing that looked "off" aside from shadows.
> I've seen certain areas/items/buildings will have VERY blocky shadows but the rest is all crisp, ive really seems these weird shadows on the window washer lifts.
> 
> The stuff in the distance almost gets this gritty look or is it just me?


I haven't noticed either of those things, though I haven't been going around staring at shadows. From what I've seen, they are really good for an open-world game.


----------



## iamhollywood5

Little off topic but I sure hope people who pirated this game at least pre-ordered already. I know there are definitely some people with legit copies but the fact that so many people are torrenting the game thus proving Ubisoft's stereotype of PC gamers is sad. I mean people on twitter are straight up admitting they torrented the game to the devs when asking them for performance fixes. It's sad because they clearly put work into the PC version of this game instead of just porting it over, despite the current performance issues. With the pre-launch leak and all the torrenting I'll be shocked if we actually get performance-fixing patches, I'd be pissed if I was Ubisoft right now.


----------



## Murlocke

Quote:


> Originally Posted by *iamhollywood5*
> 
> Little off topic but I sure hope people who pirated this game at least pre-ordered already. I know there are definitely some people with legit copies but the fact that so many people are torrenting the game thus proving Ubisoft's stereotype of PC gamers is sad. I mean people on twitter are straight up admitting they torrented the game to the devs when asking them for performance fixes. It's sad because they clearly put work into the PC version of this game instead of just porting it over, despite the current performance issues. With the pre-launch leak and all the torrenting I'll be shocked if we actually get performance-fixing patches, I'd be pissed if I was Ubisoft right now.


I don't think piracy is nearly as bad as the big companies claim. Honestly, they should have released a benchmark when the spec requirements are this high. I bet most people pirating just want to see if they can run it before dropping $60, and many of the rest just want something to mess around with until they get their legit copy on Tuesday. Hopefully Ubisoft doesn't pull the "See, PC ruins sales" card. 360 copies were leaked shortly after the game went gold, and it's probably only a matter of time before the new consoles are the same way. If the game's good, most people will buy it IMO.

I really want to start playing this game; the copy I'm using is on a friend's uPlay account and I haven't gone past the first mission, to avoid spoiling myself. I have the digital deluxe pre-ordered and am looking forward to Tuesday.


----------



## Sadmoto

@Murlocke: I wasn't going around nitpicking shadows; I was trying to get to one of the ctOS towers and saw the shadow difference as I was using the lift







then I would notice it here and there.

And the shadow detail is completely different; the "off" shadows I'm talking about look really bad, it's a night-and-day difference. You can see each pixel, while the regular shadows are smooth and crisp.

Also, for whoever was looking for 7950 performance:
My 7870 XT @ 1100 gets 30-60 FPS at 1080p with high textures, everything else maxed, across four AA modes (FXAA, SMAA, temporal SMAA, and 2x MSAA; they all have roughly the same FPS impact).

Before you go "omg that's not a 7950": it's pretty close, it uses the same GPU.
From what I'm told, a 7870 XT @ 1200 has essentially the same horsepower as a stock 7950.


----------



## Silent Scone

Quote:


> Originally Posted by *Arm3nian*
> 
> I could care less about your opinion of 24/7, I've read lots of pages in the titan club and know what people run 24/7 for gaming, and the best of cards reach around 1300mhz for stable non degrading clocks/volts.


Quote:


> Originally Posted by *Alatar*
> 
> Nah it's a Forbes writer. Guy has been testing the game with a 295X2 and some Titans.
> 
> And I'm just posting what I see, I don't have access to the game yet. I'll most likely post some tests after launch once I get it downloaded.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> ^ the clocks I use for intensive games
> 
> perfectly acceptable volts with LLC off
> 1368MHz reported by GPU-Z
> my normal fan speed ( means ~45C+ temps), no crazy AP31s for benching purposes
> ~15-20 minutes of valley in the background for stability
> ambients in the 25-28C range with outside temps now during the morning 20C+
> And my card isn't even close to being the best one for high MHz at low volts. Mine actually likes the high volts.
> 
> Could probably go higher but I haven't had the need, and I don't have the time right now to stability test higher clocks. Maybe with watch_dogs I'll see if I can get it running at 1400MHz core. The clock for clock difference between Tis and Titans is less than 5%. It doesn't require much from a Titan to pass 780Ti speeds, which usually can be done due to voltage control. As long as you're on water that is.
> 
> 1.325v is nowhere near degrading or damaging. It's perfectly fine if you're on water.


I've seen Titans die with less than 1.325v. It's a lottery. The VRMs on the reference PCBs are not up to it... You're just dicing with death, honestly.


----------



## bencher

Quote:


> Originally Posted by *Kinaesthetic*
> 
> Post #265 by Nav from DS shows the 295X using its full VRAM buffer @ 2xMSAA 1440p. People are seeing the Titan use nearly its full VRAM buffer. People are seeing the 780/780Ti use nearly its full VRAM buffer.
> 
> And people all over the internet saying it plays like crap on both sides. It *is* *brand agnostic.*
> 
> For once bencher, cut the damn bias out of your posts. You've done nothing but troll this thread. This is a miserably coded game for ANYONE. The problem here is Ubisoft. Not AMD or Nvidia.
> 
> Here are some of the people with AMD cards who have posted on the Watch_Dogs subreddit:
> 
> Not to mention the numerous posters here in this thread and around on OCN. Get a grip.


He didn't report any stuttering.

And no... It doesn't play like crap on both sides.


----------



## TopicClocker

Quote:


> Originally Posted by *Sadmoto*
> 
> This.
> 
> I told my friend who got his copy early to look at cpu/gpu usage because I was curious.
> 
> and without fail ubisoft used their craptastic coding they do in AC. we all remember going into boston on AC3 and it just being a mess, even on high end cpus.
> 
> it baffles me that they still haven't learned how to properly code for multi-core cpus, you'd think that it would be common sense now a days for major gaming companies to come up with a method to code "use up to X core" where X is the # of cores on the current CPU.
> Instead it almost seems like they are using the page out of ARMA 3's coding structure and designating each cpu for a different process ( 1 for physics, 1 for AI, 1 for sounds, etc...)
> 
> or when you hit a CPU bound scene, but yet your usage isn't even at 60% with 4 out of 8 cores and less then 30% with the other 4.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> seriously does ubisoft think we are all running on ancient dual cores/quads, meanwhile having 6gb of ram and 2gb+ vram?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> even with their shoddy but expected coding, it runs pretty good
> 
> I can have everything on ultra except textures on high and with anything under 4xmsaa and still hold good frames


Are you sure? Maybe it depends on the CPU; mine is showing perhaps the highest utilization I've ever seen from it, stressing all four of my cores.
We need someone with an 8-thread/8-core CPU to test the game.
Quote:


> Originally Posted by *iamhollywood5*
> 
> Hmm. I also had it set to 3 on my 780 Ti and didn't have the same kind of smoothness at Ultra 1080p, although I was trying to use TXAA (i know i know). Weird.
> To be fair, GK110 is getting old, a lot older than Hawaii. Back when the 780 (exactly 1 year ago) was first released, nobody was worried about 3GB for 1080p. Games weren't even close to that limit at 1080p and people thought we wouldn't need more until higher resolutions were the mainstream. But yes it's insane that a $700 graphics card could run out of VRAM at 1080p on a game that doesn't exactly look mind-blowing.
> For some reason this game has jaggies and aliases like a biotch. More than any recent game I've seen.
> And yet they said we NEEDED 8 thread CPUs to run the game on ultra -.-


I've got a thread on 2GB performance; if this continues, it could be the end of the line for most of the previous high-end cards (670/680/690). I could swear the 2GB variants were the first to release and the 4GB ones came after?
http://www.overclock.net/t/1491602/watch-dogs-2gb-vram-performance-analysis-and-graphics
Quote:


> Originally Posted by *iamhollywood5*
> 
> Little off topic but I sure hope people who pirated this game at least pre-ordered already. I know there are definitely some people with legit copies but the fact that so many people are torrenting the game thus proving Ubisoft's stereotype of PC gamers is sad. I mean people on twitter are straight up admitting they torrented the game to the devs when asking them for performance fixes. It's sad because they clearly put work into the PC version of this game instead of just porting it over, despite the current performance issues. With the pre-launch leak and all the torrenting I'll be shocked if we actually get performance-fixing patches, I'd be pissed if I was Ubisoft right now.


The console versions are probably getting pirated just as much.


----------



## Silent Scone

Quote:


> Originally Posted by *bencher*
> 
> He didn't report any stuttering.
> 
> And no... It doesn't play like crap on both sides.


I take it you've played it then, seeing as you're so sure?

The option states 3GB required, not "3GB isn't enough". DX paging allows caching of all available frame buffer memory. You may find in some instances it's smoother with a larger frame buffer, but there shouldn't be an issue at 1080p with this game. It's poor optimisation, and if you think you know differently, please feel free to share it with everyone...


----------



## th3illusiveman

Quote:


> Originally Posted by *Murlocke*
> 
> I've pretty much completely fixed my stuttering by *setting GPU render frames to 3* instead of 1.
> 
> 1440p, Ultra/Maxed, 2x MSAA, Vsync set to 1 frame. Steady 60FPS with drops to about 55FPS, around 3 load chops that lasted about 2-3 frames in about 30 minutes of driving around at full speed. Totally acceptable given the amount of detail the city has.. it's nuts.
> 
> I'm pretty happy with my performance and the game's drivers aren't even out.


I have a big bowl of Jelly that i'm eating right now









I need to get a new card soon, lol. C'mon, R9 390X.


----------



## Serandur

http://www.techpowerup.com/mobile/201067/asus-unveils-the-geforce-gtx-780-strix-6-gb-graphics-card.html

I really wish I could return my 780 GHz for this. Why is Nvidia only now allowing 6 GB 780s? I should have gone with EVGA and their Step-Up program... :/

If I decide to sell my 780 and replace it this time next year, how much do you guys think I might still get for it?


----------



## Ascii Aficionado

Quote:


> Originally Posted by *Serandur*
> 
> http://www.techpowerup.com/mobile/201067/asus-unveils-the-geforce-gtx-780-strix-6-gb-graphics-card.html
> 
> I really wish I could return my 780 GHz for this, now Nvidia are finally allowing 6 GB 780s, why? Should have gone with EVGA and their step-up program... :/
> 
> Should I decide to sell my 780 and replace it this time next year, how much do you guys think I might still be able to get for it?


Sigh, I want a Ti version.

/entitlement


----------



## Serandur

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> Sigh, I want a Ti version.
> 
> /entitlement


For the cost of a Ti, Nvidia should be ashamed for not making it the base standard. I'm not the fanboy type; I've been pretty much exclusively Nvidia for years (for certain software features). But their $700 flagship has no more VRAM than AMD's 2.5-year-old 7970, in the middle of a console-generation transition. They knew what would happen, and they're going to use it to leverage upgrades more than performance, especially in these relatively slow times of advancement (with their TSMC problems and all). Their 2GB 770 is a bad joke now, and their VRAM-limited 780s/Tis are fatally flawed products. I feel like an AMD hardware fanboy forced into buying their cutthroat competitor's garbage for what I now consider basic feature support.


----------



## revro

Quote:


> Originally Posted by *Serandur*
> 
> For those cost of a Ti, Nvidia should be shamed for not making it a base standard. I'm not the fanboy type, I've pretty much been exclusively Nvidia for years (for certain software features), but their $700 flagship has no more VRAM than AMD's 2.5 year old 7970 in the middle of a console gen transition. They've known what would happen, and they're going to use it to leverage upgrades moreso than performance especially in these relatively slow times of advancement (with their TSMC problems and all). Their 2GB 770 is a bad joke, now, their limited 780s/Tis a fatally flawed product. I feel like an AMD hardware fanboy forced into buying their cutthroat competitor's garbage for what I now consider basic feature support.


yep, i can get a 350eur 4gb 290 windforce, 450eur 4gb 290x windforce, 550eur evga 780 6gb and 660eur evga 780ti SC ACX 3gb


----------



## ElementR

Quote:


> Originally Posted by *Murlocke*
> 
> I've pretty much completely fixed my stuttering by *setting GPU render frames to 3* instead of 1.
> 
> 1440p, Ultra/Maxed, 2x MSAA, Vsync set to 1 frame. Steady 60FPS with drops to about 55FPS, around 3 load chops that lasted about 2-3 frames in about 30 minutes of driving around at full speed. Totally acceptable given the amount of detail the city has.. it's nuts.
> 
> I'm pretty happy with my performance and the game's drivers aren't even out.


What drivers are you using? 337.81 added Watch Dogs SLI profile.


----------



## bencher

Wow this game is brilliant!


----------



## StrongForce

Quote:


> Originally Posted by *Sadmoto*
> 
> @murlocke: I wasn't going around and nitpicking shadows, I was trying to get to one of the ctos tower and saw the shadow differnce as I was using the lift
> 
> 
> 
> 
> 
> 
> 
> then I would notice it here and there.
> 
> And the shadow detail is completely different, the "off" shadows I'm talking about looking really bad, its a night&day differnce. You can see each pixel but if you look at the regular shadows they are smooth and crisp.
> 
> Also for who was looking for 7950 performance
> My 7870xt @ 1100 is at 30-60 fps, thats with, 1080p, high textures, everything else maxed and with 3 different aa (fxaa, smaa, smaa temp and msaa x2, they all have roughly the same fps impact)
> 
> Before you go omg that's not a 7950, but its pretty close, it uses the same gpu
> From what I'm told a 7870xt @ 1200 has essentially the same horsepower as a 7950 @ stock


Nice, so if I wanna play at 60 FPS-ish I could probably turn down a few options, remove AA and such, and I'd be good to go. Sounds good. Yeah, I have my eye on a Sapphire 280 OC, 850 MHz (940 MHz boost) / 1250 MHz; those are its frequencies. Plus I could just overclock it. Anyone know if basic overclocking is covered by warranty? No volt mods or anything.

Quote:


> Originally Posted by *Serandur*
> 
> For those cost of a Ti, Nvidia should be shamed for not making it a base standard. I'm not the fanboy type, I've pretty much been exclusively Nvidia for years (for certain software features), but their $700 flagship has no more VRAM than AMD's 2.5 year old 7970 in the middle of a console gen transition. They've known what would happen, and they're going to use it to leverage upgrades moreso than performance especially in these relatively slow times of advancement (with their TSMC problems and all). Their 2GB 770 is a bad joke, now, their limited 780s/Tis a fatally flawed product. I feel like an AMD hardware fanboy forced into buying their cutthroat competitor's garbage for what I now consider basic feature support.


Man, you pretty much just summed up how I feel... this 770 I've got right now is a few frames away from 780s and Titans in benchmarks, so it's got a lot of horsepower; more VRAM would only help for future titles, or Watch Dogs for instance.

The reasons I'm going to return my 770 and get an R9 280 are simple: I want more VRAM, the 4GB 770 is too close to a normal 3GB 780's price point, and the 6GB 780 is too close to a Ti. At this point, if they had released a 6GB Ti, I'd be tempted (never mind the money I'd save), because I doubt the difference between the Ti and a 6GB Ti would be 100 bucks like the 780. If I'm going to hurt my wallet that much, I'd rather get the best of the best.

So for almost half the price of my current 770 I can get a 280 OC, which will do me fine even if it doesn't run everything on ultra. Plus I upgraded mainly for BF4, and this card is just brilliant for that.

Nvidia's weird VRAM tactics (and cheesy pricing; well, that's mostly EVGA with their 100 bucks for 3GB more VRAM) make me "forgive" AMD for all the trouble they caused me, lol. I hope that if I get that card, at least the memory leak is gone (it was mainly an older-card problem, apparently... something they could probably have fixed in the drivers, but they dropped support, hmm...).


----------



## Bit_reaper

Quote:


> Originally Posted by *lilchronic*
> 
> thiese are my settings


I haven't played Watch Dogs myself, but my assumption is that _GPU max buffered frames_ is the same as the flip queue. If that's the case, I would strongly recommend people try running the game with it set to 3, as that usually gives the best balance between smoothness and responsiveness.

For those who don't know, the flip queue is basically a buffer for the instructions going to the GPU. Every time the GPU renders a frame, it takes one of these buffered frames, which contain all the work the CPU has to do before the frame can be rendered. Meanwhile, the CPU keeps churning out these "frames" until the buffer is full. A value of 1 means only one frame is ever calculated in advance, while 3 means up to three can be calculated ahead of time.

If you set the flip queue to 1, you are basically not allowing any buffering, so any little hiccup in the CPU side of the rendering pipeline will show up in the frame rate (the GPU is forced to sit idle until the CPU has finished the frame). The faster the CPU, the fewer hiccups, but in my experience even with a fast CPU some delays are still large enough to be noticed. So it almost always pays off to run with a flip queue larger than 1, even if it introduces a tiny bit of input lag.
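To visualize that buffering, here's a toy producer/consumer model in Python. It's not engine code, just the queueing behavior described above: the CPU prepares frames into a bounded queue and the GPU drains them, with the queue depth standing in for the "GPU max buffered frames" setting.

```python
import queue
import threading

FLIP_QUEUE_DEPTH = 3  # stand-in for "GPU max buffered frames" = 3

frames = queue.Queue(maxsize=FLIP_QUEUE_DEPTH)

def cpu_producer(n_frames):
    # The CPU prepares frames; put() blocks when the queue is full,
    # which is exactly the "buffer is full" state described above.
    for i in range(n_frames):
        frames.put(i)
    frames.put(None)  # sentinel: no more frames

def gpu_consumer(rendered):
    # The GPU drains frames; with depth > 1, one slow CPU frame
    # doesn't immediately leave it idle.
    while True:
        frame = frames.get()
        if frame is None:
            break
        rendered.append(frame)

rendered = []
t = threading.Thread(target=cpu_producer, args=(10,))
t.start()
gpu_consumer(rendered)
t.join()
print(rendered)  # frames arrive in order: [0, 1, ..., 9]
```

With `maxsize=1` the producer and consumer run in lockstep, so every CPU stall shows up downstream; a deeper queue absorbs those stalls at the cost of a little latency, which matches the smoothness-vs-input-lag trade-off described above.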


----------



## Sannakji

Quote:


> Originally Posted by *Jindaman*
> 
> 
> http://gearnuke.com/watch-dogs-pc-full-graphics-settings-revealed-requires-3-gb-vram-ultra-quality-textures/


Oh, the crow that will be eaten by all those who lulzed at seemingly underpowered cards that come in 4GB flavours...


----------



## tin0

3GB or no 3GB, my GTX690 is ready


----------



## yusupov

i got the 337.81s after finding out about them today... they don't change a thing. i did see a tip to add --disablepagefilecheck to the shortcut target of the .exe. will try setting 3 pre-rendered frames as well.

i have been getting horrible frame drops quite frequently. defragging my HDD now & hoping it will run better; if not i'll try moving it to an SSD. i assume there will be big day-1 changes & better-optimized drivers, but right now it is really not up to snuff. still enjoying it, but a long way to go on optimization in my view.


----------



## givmedew

Quote:


> Originally Posted by *iamhollywood5*
> 
> Little off topic but I sure hope people who pirated this game at least pre-ordered already. I know there are definitely some people with legit copies but the fact that so many people are torrenting the game thus proving Ubisoft's stereotype of PC gamers is sad. I mean people on twitter are straight up admitting they torrented the game to the devs when asking them for performance fixes. It's sad because they clearly put work into the PC version of this game instead of just porting it over, despite the current performance issues. With the pre-launch leak and all the torrenting I'll be shocked if we actually get performance-fixing patches, I'd be pissed if I was Ubisoft right now.


Willing to bet the majority of people who torrented it are either children or adults who paid for the game. Not much we can do about the kids, but like I said, it's my opinion that with this game most people bought it. I don't know a single friend who is a hardcore PC gamer that didn't pre-order this game months ago.

Also, to all those talking about the hard limit on the graphics... Max Payne was the same way...


----------



## mfknjadagr8

If I was Ubisoft I'd send my team of lawyers a list of names, lol... It's one thing to pirate something to test and then buy, but to have the gall to pirate it, never intend to purchase, AND expect to have your pirated game patched... those are the type of folks that deserve whatever happens.


----------



## yusupov

yeah, lets all shed a tear for poor, poor ubisoft... btw, funny how it seems like it's only their PC games that leak early. maybe some self-fulfilling prophecy going on there w/ their disgraceful attitude of late toward pc gamers. they deserve every bit of it.


----------



## Yungbenny911

Everyone going crazy about V-ram







... Are you guys checking your FPS? I mean, are you constantly above 60 FPS with 2GB/3GB cards and still stuttering? I'm waiting for the game to be released so I can test it on my 2GB 770s extensively and give my results.


----------



## Lex Luger

Pro tip for those with 2GB cards:

Set shadows to high, set textures to ultra.


----------



## Devnant

Found this on reddit. Make a shortcut to the game and add "-disablepagefilecheck" without the quotation marks after the target and the stuttering should be extremely minimal (essentially just when driving very fast and loading new environments).

Example (under target): "X:\Games\WD\watch_dogs.exe" -disablepagefilecheck

Haven't tested it myself, because I'm waiting for the launch, but let me know if it works for you.

http://www.reddit.com/r/watch_dogs/comments/26fji7/get_rid_of_ultrasetting_stuttering_by_adding/


----------



## TopicClocker

Quote:


> Originally Posted by *Yungbenny911*
> 
> Everyone going crazy about V-ram
> 
> 
> 
> 
> 
> 
> 
> ... Are you guys checking your FPS? I mean... Are you constantly above 60 fps with 2gb/3gb cards and still stuttering? I'm waiting for the game to be released so i can test it on my 2gb 770's extensively and give my results.


Quote:


> Originally Posted by *Lex Luger*
> 
> Pro tip for those with 2Gb. cards
> 
> Set shadows to high, Set textures to ultra


Yup, I posted it in my Performance thread.
http://www.overclock.net/t/1491602/watch-dogs-2gb-vram-performance-analysis-and-graphics

We are indeed going crazy about V-ram, check my thread Yungbenny911, you may be intrigued by the results.


----------



## Silent Scone

Quote:


> Originally Posted by *yusupov*
> 
> yeah, lets all shed a tear for poor, poor ubisoft...btw, funny how it seems like its only their PC games that leak early. maybe some self-fulfilling prophecies going on their w/ their disgraceful attitude of late toward pc gamers. they deserve every bit of it.


Yeh so that makes it ok...


----------



## KenLautner

Quote:


> Originally Posted by *Yungbenny911*
> 
> Everyone going crazy about V-ram
> 
> 
> 
> 
> 
> 
> 
> ... Are you guys checking your FPS? I mean... Are you constantly above 60 fps with 2gb/3gb cards and still stuttering? I'm waiting for the game to be released so i can test it on my 2gb 770's extensively and give my results.


Please do. I want to compare it to gtx 770 2gb but with an fx 8350 instead of your 3770k


----------



## yusupov

Quote:


> Originally Posted by *Silent Scone*
> 
> Yeh so that makes it ok...


It's fine with me. They're not as bad as EA, but some of the comments directed at PC gamers have been downright disgusting. And yes, I do support them (as well as EA) quite heavily myself, but that's partly nostalgia on the part of Ubisoft and partly my consideration for the devs, who don't deserve to be shovelled into the same pit as their publishers. And at least it's clear that Watch Dogs, unlike AC4 imo, seems to be pretty poorly optimized all-around. I'm sorry, but I find it difficult to believe that the PS4 shouldn't be able to handle this game at 1080p. If anything, PC gamers will very likely get the best of it in terms of support, and that's a refreshing change of course from their attitude of a few months ago.


----------



## sugarhell

Imagine if they hadn't delayed Watch Dogs...


----------



## TwiztidFreek

This makes me wonder. The GPU I was planning on buying is the GTX 770, which comes with this game. Should I get the 280X instead and forget about ShadowPlay?


----------



## majin662

So far I have not seen Watch Dogs go above my VRAM. Is it happening instantly for people, or gradually, or what? I just did a high-speed mission that led to a foot chase in a crowded area and still didn't max out. Am I missing something? I still get some stutter while driving, like everyone else, even after multiple "tweaks", but other than that the game is smooth and not exceeding my limits. I have FPS capped at 30 right now just to test if it helps with driving (not looking like it); uncapped I average 40+.




----------



## TopicClocker

Quote:


> Originally Posted by *majin662*
> 
> So far I have not seen Watch Dogs go above my VRAM. Is it happening instantly for people, or gradually, or what? I just did a high-speed mission that led to a foot chase in a crowded area and still didn't max out. Am I missing something? I still get some stutter while driving, like everyone else, even after multiple "tweaks", but other than that the game is smooth and not exceeding my limits. I have FPS capped at 30 right now just to test if it helps with driving (not looking like it); uncapped I average 40+.
> 
> 


Yup, you are missing something: anti-aliasing. I've been hearing that's what's choking people. 1440p is impressive nevertheless.


----------



## majin662

Quote:


> Originally Posted by *TopicClocker*
> 
> Yup, you are missing something: anti-aliasing. I've been hearing that's what's choking people. 1440p is impressive nevertheless.


Using FXAA in-game and doing the whole "Enhance application: 2x" plus 16x AF in the control panel (a carry-over from Far Cry 3 and AC IV), but yeah, no heavy hitters on the AA side.


----------



## TopicClocker

Quote:


> Originally Posted by *majin662*
> 
> Using FXAA in-game and doing the whole "Enhance application: 2x" in the control panel (a carry-over from Far Cry 3 and AC IV)


I know, I saw that. That and SMAA seem safe, but things like MSAA seem to be what's doing it.


----------



## StrongForce

Quote:


> Originally Posted by *Yungbenny911*
> 
> Everyone going crazy about V-ram
> 
> 
> 
> 
> 
> 
> 
> ... Are you guys checking your FPS? I mean... Are you constantly above 60 fps with 2gb/3gb cards and still stuttering? I'm waiting for the game to be released so i can test it on my 2gb 770's extensively and give my results.


I'm already disappointed with my 2GB 770 in BF4: I can run everything on Ultra without Vsync, it seems, but as soon as I turn Vsync on I get crazy frame-time spikes. I could run it on High, I guess...







I was expecting to max it (besides MSAA, I mean, I know that takes a lot).


----------



## Murlocke

FXAA is terrible, you are blurring your entire picture. Go with SMAA, the requirements are pretty much the same.

Side note: Rain causing puddles in the streets is incredible. It's yet to rain in my game after ~5 hours of playing...


----------



## bencher

Quote:


> Originally Posted by *Murlocke*
> 
> FXAA is terrible, you are blurring your entire picture. Go with SMAA, the requirements are pretty much the same.
> 
> Side note: Rain causing puddles in the streets is incredible. It's yet to rain in my game after ~5 hours of playing...


Man you guys are forcing me to get this game.

I am holding out for WIIU version though.


----------



## majin662

Quote:


> Originally Posted by *Murlocke*
> 
> FXAA is terrible, you are blurring your entire picture. Go with Temporal SMAA, the requirements are pretty much the same.
> 
> Side note: Rain causing puddles in the streets is incredible. It's yet to rain in my game after ~5 hours of playing...


I'll retest temporal now that I have a good baseline.

Have you tried raising buffered frames any higher to see if it helps driving? On my way to bbq but wanted to see if it helps any more or no. Guessing that issue is going to be patches and drivers though.


----------



## Murlocke

Quote:


> Originally Posted by *majin662*
> 
> I'll retest temporal now that I have a good baseline.
> 
> Have you tried raising buffered frames any higher to see if it helps driving? On my way to bbq but wanted to see if it helps any more or no. Guessing that issue is going to be patches and drivers though.


Higher buffer frames increases input lag for me. I find the default of 3 works best.


----------



## Onikage

I still think AC4 looks better. Sure, it's not as open-world as Watch Dogs, but it's still quite open, and it doesn't go over 1.8 GB of VRAM even with 4x TXAA.


----------



## Serandur

I haven't got it for my 780 GHz and 3770K yet, but my friend just got his copy on his GTX 660 and 3570K and I've been messing around with it. I've set the preset to high with temporal SMAA and it stutters/freezes a bit here and there, especially while driving. Setting textures to medium completely fixes the issue, setting to Ultra causes severe and extended hangups. The problem is definitely VRAM-related and though the 660 might be an outlier as far as 2 GB cards go (because the card is very reluctant to use more than 1.5 GB due to the awkward VRAM setup with a 192-bit bus and 2 GB of VRAM causing bandwidth to be divided by 3 when the extra is used), there's my own personal confirmation. Oh, and the game's Vsync implementation is garbage, where's triple-buffering?

I have heard, more than once, that lowering shadows also alleviates the issue and presumably VRAM usage, so I'll try that out when I get my own copy (preferable to lowering textures, historically my favorite and easiest-on-performance graphics setting, now a source of concern).


----------



## Hl86

Runs fine on a 290x


----------



## xxpantherrrxx

My game is installed to an SSD


----------



## LuminatX

Quote:


> Originally Posted by *twerk*
> 
> Do all the people experiencing stuttering when driving around have the game on an SSD? Just wondering if there is a bottleneck trying to load textures from a HDD.


I thought this was the problem too, I had it on my hard drive, so I switched it to my SSD and no change


----------



## Serandur

Hmm, why are the new Nvidia drivers not on their site?


----------



## LuminatX

the only "new" drivers I've heard of are the 337.82 and they were only released in china because of the launch for GW2 over there


----------



## Serandur

Quote:


> Originally Posted by *LuminatX*
> 
> the only "new" drivers I've heard of are the 337.82 and they were only released in china because of the launch for GW2 over there


I'm not sure if it's true or not, but some people are claiming they actually help with Watch Dogs. Even if they didn't, I still find it strange that Nvidia has localized drivers. Improvement is improvement, no?


----------



## killerhz

Well, here's hoping my 780 Classified can play this game maxed out. I read so much about games not being able to make use of cards with 6GB, so I passed on the upgrade to the 780 6GB.









Hoping what you're all experiencing is just driver-related and Nvidia releases something soon.


----------



## xxpantherrrxx

The game stutters horribly on any setting I have it on and is pretty much unplayable.

i7 4770K @ 4.4Ghz
16GB of RAM
780 Ti @ 1300 Mhz.
The game is installed to a SSD.


----------



## Razzaa

Quote:


> Originally Posted by *xxpantherrrxx*
> 
> The game stutters horribly on any setting I have it on and is pretty much unplayable.
> 
> i7 4770K @ 4.4Ghz
> 16GB of RAM
> 780 Ti @ 1300 Mhz.
> The game is installed to a SSD.


Of course it does lol.


----------



## MonarchX

Does anyone know how to activate ctOS boxes in that one place where you have to activate 2 of them? Pressing Q or E does nothing, nor does pressing mouse buttons. Pressing Z just shows network paths, but nothing seems to activate the ctOS box right next to the player.


----------



## DADDYDC650

So many people with "friends" that have an "early copy" of the game.

So does this game use more than 3GB VRAM at 1440p and up with max settings even with little to no AA?


----------



## APhamX

Quote:


> Originally Posted by *DADDYDC650*
> 
> So many people with "friend's" that have an "early copy" of the game.
> 
> So does this game use more than 3GB VRAM at 1440p and up with max settings even with little to no AA?


No AA, Medium Shaders, shadows and reflections, Ultra Textures. High Reflections, details and water.

4770k @ 4.4 GHZ
16GB DDR3
2 x 7950 3GB @ 1100/1350

MSI Afterburner reads 4.5GB of video memory used, 14GB of pagefile. 50-58 FPS @ 1440p.


----------



## Serandur

Quote:


> Originally Posted by *DADDYDC650*
> 
> So many people with "friend's" that have an "early copy" of the game.
> 
> So does this game use more than 3GB VRAM at 1440p and up with max settings even with little to no AA?


Not necessarily early copies; legitimate, normal copies. Uplay supposedly doesn't have any time restrictions on how early keys can be activated, and a lot of people are getting keys from Nvidia bundles, or for cheap ($35-$40) from people selling the ones that came with their Nvidia bundles. Not sure if those unlock early; probably just lucky physical-copy owners, aside from pirates. The caveat is that people with the Steam version didn't get any early playtime.

Edit: Interesting stuff going on in the NeoGAF thread by the way. Potential mass-ban.


----------



## DADDYDC650

Quote:


> Originally Posted by *APhamX*
> 
> No AA, Medium Shaders, shadows and reflections, Ultra Textures. High Reflections, details and water.
> 
> 4770k @ 4.4 GHZ
> 16GB DDR3
> 2 x 7950 3GB @ 1100/1350
> 
> MSI Afterburner reads 4.5GB of video memory used, 14GB of pagefile. 50-58 FPS @ 1440p.


Any stuttering at all?


----------



## APhamX

Quote:


> Originally Posted by *DADDYDC650*
> 
> Any stuttering at all?


After 4-5 hours of gameplay, I noticed stuttering while driving. However, I just restarted the game and it was good to go. I don't see any screen tearing either while in CF, even without vertical sync. I'm impressed.


----------



## Serandur

Quote:


> Originally Posted by *APhamX*
> 
> After 4-5 hours of gameplay, I noticed stuttering while driving. However, I just restarted the game and it was good to go. I don't see any screen tearing either while in CF, even without vertical sync. I'm impressed.


4-5 hours for stuttering to appear? That's interesting, what's going on with the 780s I wonder. Is it Tuesday yet? :/

Edit: With Crossfire, does that 4.5 GB mean only 2.25 per GPU?


----------



## Ascii Aficionado

Quote:


> Originally Posted by *APhamX*
> 
> After 4-5 hours of gameplay, I noticed stuttering while driving. However, I just restarted the game and it was good to go. I don't see any screen tearing either while in CF, even without vertical sync. I'm impressed.


Sounds like a textbook memory leak.


----------



## Ascii Aficionado

http://www.overclock.net/t/1491380/tpu-msi-radeon-r9-280x-gaming-6-gb-review#post_22308841


----------



## Serandur

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> Sounds like a textbook memory leak.


Do companies ever intentionally do anything like that just to discourage pirates/early players?


----------



## APhamX

Quote:


> Originally Posted by *Serandur*
> 
> Do companies ever intentionally do anything like that just to discourage pirates/early players?


Probably not. *stares at GTA IV*


----------



## d3vour3r

if i dont get 100fps max out at 1440p with my rig ill be pi$$ed


----------



## Ascii Aficionado

Quote:


> Originally Posted by *d3vour3r*
> 
> if i dont get 100fps max out at 1440p with my rig ill be pi$$ed


Is your 3930k @ stock ?

And vsync off should make a large difference, I'd have to play with it on though.


----------



## Serandur

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> http://www.overclock.net/t/1491380/tpu-msi-radeon-r9-280x-gaming-6-gb-review#post_22308841


Before Watch Dogs, I took benchmarks of Crysis 3 and AC IV as evidence enough that 3GB should be fine (780 vs Titan). Now I'm not so sure, need some Watch Dogs benchmarks.


----------



## Toology

People with stutter should try out this tweak http://forums.guru3d.com/showthread.php?t=389072 It is suppose to make a huge difference on games like watch dogs, crysis 3 and arma 3. Be sure to name the exe just right.


----------



## Hl86

Got a 290x, aint even mad.

No stuttering with ultra textures and temporal SMAA @1440p. Above 2x MSAA the stuttering begins, but temporal SMAA is a beast in quality.


----------



## LuminatX

I've noticed vsync locks you to 30fps as well; otherwise, with it off, you get crazy screen tearing :/


----------



## Ascii Aficionado

Are you sure it's 30fps, and not just half your refresh rate because the game can't maintain close to your refresh rate (double-buffered)?

Because if it uses double-buffered vsync, that's stupid.
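For anyone wondering where the 30 comes from: with plain double-buffered vsync, a frame that misses a refresh has to wait for the next one, so the displayed rate snaps to integer divisors of the refresh rate. A minimal sketch of that arithmetic (my own illustration, nothing from the game's code):

```python
import math

def double_buffered_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    """With only two buffers, a frame that isn't ready at a refresh must
    wait for the next vblank, so every frame occupies a whole number of
    refresh intervals: displayed rate = refresh / ceil(refresh / render)."""
    intervals = math.ceil(refresh_hz / render_fps)  # refreshes each frame holds
    return refresh_hz / intervals

# A card averaging 55 fps on a 60 Hz panel gets snapped down to 30 fps.
print(double_buffered_fps(55))  # 30.0
print(double_buffered_fps(61))  # 60.0
print(double_buffered_fps(29))  # 20.0
```

Triple buffering avoids the snap because the GPU can keep rendering into a third buffer instead of stalling, which is why people force it with external tools when a game only offers double-buffered vsync.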


----------



## Serandur

Quote:


> Originally Posted by *LuminatX*
> 
> ive noticed vsync locks you to 30fps as well, otherwise with it off you get crazy screen tearing :/


AC IV did that, I fixed it with D3DOverrider. I've heard that doesn't work with Watch Dogs though. I wonder if driver-forced triple-buffering will.


----------



## eternal7trance

Quote:


> Originally Posted by *APhamX*
> 
> No AA, Medium Shaders, shadows and reflections, Ultra Textures. High Reflections, details and water.
> 
> 4770k @ 4.4 GHZ
> 16GB DDR3
> 2 x 7950 3GB @ 1100/1350
> 
> MSI Afterburner reads 4.5GB of video memory used, 14GB of pagefile. 50-58 FPS @ 1440p.


Wait what. I don't understand. How are you getting 4.5gb of VRAM used if you only have 3gb?


----------



## bencher

Quote:


> Originally Posted by *eternal7trance*
> 
> Wait what. I don't understand. How are you getting 4.5gb of VRAM used if you only have 3gb?


4.5/2 -.-


----------



## Murlocke

Quote:


> Originally Posted by *d3vour3r*
> 
> if i dont get 100fps max out at 1440p with my rig ill be pi$$ed


How can you honestly expect that? I've made like 40 posts regarding my performance and our specs are very similar. You aren't going to come close to that number, even with no AA.
Quote:


> Originally Posted by *Ascii Aficionado*
> 
> Are you sure it's 30fps or just half your refresh rate if it can't maintain close to your refresh rate (double-buffered).
> 
> Because if it uses double-buffered vsync then that's stupid.


It has 1-frame buffering and 2-frame buffering; not sure why people would use 2-frame buffering and get locked to 30FPS.


----------



## Marc79

Murlocke, do you think running Watch dogs @1440p Ultra setting with AA off, would be playable (~40fps) with a single 780Ti?


----------



## LuminatX

So I have to change to a 3-frame buffer so vsync doesn't lock me to 30fps?


----------



## Murlocke

Quote:


> Originally Posted by *Marc79*
> 
> Murlocke, do you think running Watch dogs @1440p Ultra setting with AA off, would be playable (~40fps) with a single 780Ti?


I get a little more than that even with SMAA enabled on a single GPU. You'll probably get some minor VRAM chops when driving in dense areas, but nothing that makes the game unplayable.
Quote:


> Originally Posted by *LuminatX*
> 
> so I have to change to 3 frame buffer? so vsync doesn't lock me to 30fps?


No, set vsync to 1 frame instead of 2. It's the other setting for it.

I would still change pre-buffered frames to 3, as it seems to be the best blend of input lag vs. loading chops. If you go to 1, you get less input lag but way more chops. If you go to 5, you get way more input lag. (Or so it seems...)
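As a rough rule of thumb (my own back-of-envelope numbers, not anything measured in the game), each queued frame adds about one frame time of input latency, which is why 1 feels snappiest and 5 feels sluggish:

```python
def queue_latency_ms(frames_queued: int, fps: float) -> float:
    """Approximate extra input latency from the pre-rendered frame queue:
    your input waits behind every frame already queued for display."""
    frame_time_ms = 1000.0 / fps
    return frames_queued * frame_time_ms

# At 60 fps one frame takes ~16.7 ms, so:
for queued in (1, 3, 5):
    print(queued, "frames ->", round(queue_latency_ms(queued, 60.0), 1), "ms")
```

At 60 fps that works out to roughly 17 ms, 50 ms, and 83 ms of added lag, which lines up with 3 being a reasonable middle ground between responsiveness and smoothness.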


----------



## Marc79

Thanks. Hopefully the game delivers. The screens you posted earlier were great.

Pretty much the waiting game now till I can download it.


----------



## Murlocke

Quote:


> Originally Posted by *Marc79*
> 
> Thanks. Hopefully the game delivers. The screens you posted earlier were great.
> 
> Pretty much the waiting game now till I can download it.


It's an NVIDIA title, and GPU utilization in SLI mode is absolute garbage right now (it's pretty bad with a single GPU too). They have to release new drivers soon; the game's performance is even worse if you don't use the 337.81 beta drivers, which aren't even on NVIDIA's website.

I'm almost certain we'll see a driver on Tuesday with some healthy improvements.


----------



## majin662

Well, I tried Temporal SMAA and yeah, much better with roughly the same hit.

Tried GPU frames at 5: no further reduction in driving stutters.

I really think that's going to be a driver/patch thing, as it seems universal.

All in all, very happy with my settings now, and still not exceeding VRAM.


----------



## yusupov

how do you know yr not exceeding vram?? i bet some of the amd guys w/ 4GB cards (or titan users) will say it exceeds it...the bizarre inconsistency seems to be a memory thing plus i seem to get worse performance over time which suggests it draws even more. curious if im wrong though.

IF i had a copy of it, id probably be getting relatively very good performance now compared to if id played it earlier due to a couple of the tips given.


----------



## eternal7trance

Quote:


> Originally Posted by *bencher*
> 
> 4.5/2 -.-


Why would you say 4.5/2? Why not just say what you're using since the cards only have 3gb of vram? Less confusing


----------



## APhamX

Quote:


> Originally Posted by *d3vour3r*
> 
> if i dont get 100fps max out at 1440p with my rig ill be pi$$ed


Quote:


> Originally Posted by *yusupov*
> 
> how do you know yr not exceeding vram?? i bet some of the amd guys w/ 4GB cards (or titan users) will say it exceeds it...the bizarre inconsistency seems to be a memory thing plus i seem to get worse performance over time which suggests it draws even more. curious if im wrong though.
> 
> IF i had a copy of it, id probably be getting relatively very good performance now compared to if id played it earlier due to a couple of the tips given.


MSI Afterburner shows me the VRAM usage.


----------



## majin662

Quote:


> Originally Posted by *yusupov*
> 
> how do you know yr not exceeding vram?? i bet some of the amd guys w/ 4GB cards (or titan users) will say it exceeds it...the bizarre inconsistency seems to be a memory thing plus i seem to get worse performance over time which suggests it draws even more. curious if im wrong though.
> 
> IF i had a copy of it, id probably be getting relatively very good performance now compared to if id played it earlier due to a couple of the tips given.


If that was for me then the answer is I'm watching it via afterburner while I play. Posted screens a few pages back.


----------



## bencher

Quote:


> Originally Posted by *eternal7trance*
> 
> Why would you say 4.5/2? Why not just say what you're using since the cards only have 3gb of vram? Less confusing


The program reports 4.5. Do the math.
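For anyone still confused by that exchange: the assumption in this thread is that the monitoring tool is summing the mirrored allocation across both CrossFire cards. Since AFR keeps a copy of the same data on each GPU, the per-card figure is just the reported total divided by the GPU count. A tiny sketch of that arithmetic:

```python
def per_gpu_vram(reported_total_gb: float, gpu_count: int) -> float:
    """If the tool sums mirrored allocations across GPUs, divide the
    reported total by the number of cards to get per-card usage."""
    return reported_total_gb / gpu_count

usage = per_gpu_vram(4.5, 2)  # the 4.5 GB reading from the 2 x 7950 setup
print(usage)                  # 2.25 GB per card, under each card's 3 GB
```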


----------



## Murlocke

Just want to note that any VRAM monitoring for this game is completely misleading (and pointless). It will cache pretty much any amount of VRAM it can get; that doesn't mean it's actually what it needs.

You will know when you are hitting the VRAM cap when you start getting massive FPS drops.


----------



## reklaw75

Hey guys,

Am wondering if anyone has run this on a 4770K and also a hex core (e.g. 4930K)

I currently have my gaming system setup with 4770K (oc to 4.5ghz) 32 gig ram and a single 290x on a 55" 1080P TV.

I have another system (server at the moment) with X79 and 4930K (also at 4.5ghz).

Do you think it will be worth swapping in the 4930K over the 4770K for this game? i.e. will the extra cores be of benefit minus the slightly better IPC of the 4770K @ 4 cores?

Thoughts?

Reklaw


----------



## Murlocke

Quote:


> Originally Posted by *reklaw75*
> 
> Hey guys,
> 
> Am wondering if anyone has run this on a 4770K and also a hex core (e.g. 4930K)
> 
> I currently have my gaming system setup with 4770K (oc to 4.5ghz) 32 gig ram and a single 290x on a 55" 1080P TV.
> 
> I have another system (server at the moment) with X79 and 4930K (also at 4.5ghz).
> 
> Do you think it will be worth swapping in the 4930K over the 4770K for this game? i.e. will the extra cores be of benefit minus the slightly better IPC of the 4770K @ 4 cores?
> 
> Thoughts?
> 
> Reklaw


Based on utilization currently, I would say the difference would be small to none. With 4x 4GB 680s you really shouldn't have issues unless you are wanting to run 4K... but you are running 1080p which makes me really want to ask "What the hell?"


----------



## ElementR

Quote:


> Originally Posted by *Murlocke*
> 
> Based on utilization currently, I would say the difference would be small to none. With 4x 4GB 680s you really shouldn't have issues unless you are wanting to run 4K... but you are running 1080p which makes me really want to ask "What the hell?"


GK104s don't have the bandwidth to support 4 GB of VRAM. 4GB GK104 cards are a waste of money.


----------



## majin662

Quote:


> Originally Posted by *Murlocke*
> 
> Just want to note that any VRAM monitoring for this game is completely wrong (and pointless). It will cache pretty much any amount of VRAM it can, doesn't mean that's actually what it needs.
> 
> You will know when you are VRAM capping when you start getting massive spikes in FPS.


Yeah, the game seems to cache all you can give it, for me at least so far. Once it's reached the levels it likes, it stays right around those even after long play time.

That's the only thing I'm really basing my "monitoring" on, in relation to the folks who are asking.


----------



## reklaw75

Quote:


> Originally Posted by *Murlocke*
> 
> Based on utilization currently, I would say the difference would be small to none. With 4x 4GB 680s you really shouldn't have issues unless you are wanting to run 4K... but you are running 1080p which makes me really want to ask "What the hell?"


Apologies, Murlocke and others. My sig is way out of date; I'm utilising a single 290X (4GB VRAM) at the moment. I'm just using the big-screen TV instead of the 1600p monitor so I can play with a controller in relaxed mode









So it sounds like it may not be worth swapping in the 4930K then?

Another quick question: I have another 290X I can put in if needed. Reckon it would be worth it for just 1080p, or should I just OC the single 290X a bit?

Ta,

Reklaw


----------



## reklaw75

Quote:


> Originally Posted by *ElementR*
> 
> GK104s don't have the bandwidth to support 4 GB of VRAM. 4GB GK104 cards are a waste of money.


Missed your post, ElementR. It's been a while since I had my 680's (I was on trifire 290X's on water for a while before I downgraded for 1080p gaming).
I would disagree about the 680's not having enough bandwidth: back in the day I had Crysis 3 maxed at 1600p on 4 cards, and I had Skyrim at 1600p with tons of texture mods needing the 4GB (at least 3.8 of it). Now we see games like Watch Dogs needing >3GB. When I was going to go the 4K route, I would most certainly have gotten either 8GB 290X's, or waited for the next Nvidia generation for cards with 8GB+ to be future-proof. I learnt my lesson from the 580's with 1.5GB, and I've seen each generation punished in at least some games for skimping on VRAM.

Reklaw


----------



## MonarchX

Just to let you guys know, there are several cwacks out there and you need the latest Uplay-free cwack or else you will get framerate drops.


----------



## StrongForce

Quote:


> Originally Posted by *d3vour3r*
> 
> if i dont get 100fps max out at 1440p with my rig ill be pi$$ed


But you only need 60, though, don't you?? Unless they already released 1440p 120Hz... I didn't know.

And holy crap, I contacted the Inno3D guys for some info, and sometimes that's useful.

http://www.caseking.de/shop/catalog/Graphics-Card/All-Graphics-Card/Inno3D-GeForce-GTX-780-iChill-HerculeZ-X3-Ultra-6144-MB-DDR5::27657.html

They already have their 6gb for sale..









I think they just got me... until now I didn't spend more than 550 Swiss francs for a card (roughly $), but I might just go for that $600 one... lol. That way I'm SURE VRAM won't stop me in the next few years!








The dark side is tempting me, bahaha.

With that said, I won't have enough money left for Watch Dogs, roflmao.


----------



## iamhollywood5

So I got this from Jonathan Morin today:
Quote:


> @Digital_Veil_ A 3GB card is ok on Ultra. But pushing past 1080p or activating any MSAA/TXAA you can go over. You need 4 for that.


(link)

So there you have it. If you have a 3GB card at 1080p, you have to choose between ultra textures and good anti-aliasing. It really just screws over 780/Ti owners the most. If you have a 280X/7970 you likely wouldn't be able to render both ultra textures and good AA at the same time at 60+ fps anyway, unless you're crossfiring. Sigh...

As frustrating as this is, at least Jonathan Morin seems to be a lot more knowledgeable about the PC version than most multi-platform devs that interact with the community. He's been giving pretty specific answers, and I appreciate that.

I do like how the game uses full-res textures out to very far distances so texture pop-in is virtually eliminated, which is undoubtedly fueling the VRAM hogging, but I wish we could set that texture distance ourselves. I'd tolerate a little pop-in if the nearby textures were still ultra and I didn't have VRAM shortage.

I hope Nvidia learns a lesson from this game. Oh wait, they probably wanted this to happen. As much as I don't want to go back to AMD, this does make me look longingly at a 290X...


----------



## Murlocke

Quote:


> Originally Posted by *iamhollywood5*
> 
> So I got this from Jonathan Morin today:
> (link)
> 
> So there you have it. If you have a 3GB card at 1080p, you have to choose between ultra textures and good anti-aliasing. It really just screws over 780/Ti owners the most. If you have a 280X/7970 you likely wouldn't be able to render both ultra textures and good AA at the same time at 60+ fps anyway, unless you're crossfiring. Sigh...
> 
> As frustrating as this is, at least Jonathan Morin seems to be a lot more knowledgeable about the PC version than most multi-platform devs that interact with the community. He's been giving pretty specific answers, and I appreciate that.
> 
> I do like how the game uses full-res textures out to very far distances so texture pop-in is virtually eliminated, which is undoubtedly fueling the VRAM hogging, but I wish we could set that texture distance ourselves. I'd tolerate a little pop-in if the nearby textures were still ultra and I didn't have VRAM shortage.
> 
> I hope Nvidia learns a lesson from this game. Oh wait, they probably wanted this to happen. As much as I don't want to go back to AMD, this does make me look longingly at a 290X...


I play fine at 1440p, maxed, and Temporal SMAA (looks better than 2x MSAA) enabled on 3GB VRAM. I even enable VSYNC, which is a further VRAM hog. It's totally playable for me, and no major chops.


----------



## revro

Quote:


> Originally Posted by *Serandur*
> 
> Hmm, why are the new Nvidia drivers not on their site?


cause the official release is on 27th?
Quote:


> Originally Posted by *ElementR*
> 
> GK104s don't have the bandwidth to support 4 GB of VRAM. 4GB GK104 cards are a waste of money.


But does it support, let's say, 3-3.5GB of VRAM usage? If yes, it's still better than a 2GB card.


----------



## iamhollywood5

Quote:


> Originally Posted by *Murlocke*
> 
> I play fine at 1440p, maxed, and Temporal SMAA (looks better than 2x MSAA) enabled on 3GB VRAM. I even enable VSYNC, which is a further VRAM hog. It's totally playable for me, and no major chops.


Hmm. In my limited time playing I haven't tried SMAA, only MSAA and TXAA, so maybe that's the difference. Hopefully switching to SMAA helps. Although I guess the chops I experienced happen even with Titans, and I never had any insanely bad slowdowns, so maybe Morin was just giving a very conservative estimate.


----------



## Murlocke

Quote:


> Originally Posted by *iamhollywood5*
> 
> Hmm. In my limited time playing I haven't tried SMAA, only MSAA and TXAA so maybe that's the difference. Hopefully switching to SMAA helps. Although I guess the chops I experienced are even happening with Titans and I never had an insanely bad slowdowns so maybe Morin was just giving a very conservative estimate.


You need 4x MSAA to beat SMAA's quality; 2x MSAA is significantly worse. I took detailed screenshots and posted them around here. TXAA is awful as well; it just blurs the picture and costs a massive amount of performance.

I would only use SMAA or Temporal SMAA, unless you can handle 4x MSAA... however, 4x MSAA by itself seems to want another 1GB of VRAM or so.


----------



## zylonite

Quote:


> Originally Posted by *Cretz*
> 
> What settings would be the equivalent of how the game is on the PS4?


lowest settings with everything OFF


----------



## Silent Scone

Really, really trying not to purchase Titans right about now...


----------



## Murlocke

I made another post in the other thread after another go at tweaking settings. Updated AA comparison screenshots too. Copy/pasting.

No AA: http://i.imgur.com/OiFM7S2.jpg
FXAA: http://i.imgur.com/HHMnuUE.jpg
SMAA: http://i.imgur.com/vLAMADd.jpg
Temporal SMAA: http://i.imgur.com/26ptRVL.jpg
2x MSAA: http://i.imgur.com/CHMim6C.jpg
4x MSAA: http://i.imgur.com/7MFFfsR.jpg
8x MSAA: http://i.imgur.com/OkZuk7H.jpg

Note: FPS is higher in the last two because SLI is enabled. The FPS is fine when staring at one thing, but turning the camera results in massive single-digit FPS dips due to VRAM caps. 4x MSAA is definitely the sweet spot visually, but to run it at 1440p you would need 4GB VRAM at a minimum. It's going to look better to run 1440p with SMAA than 1080p with 4x MSAA. The best quality/performance options are Temporal SMAA or 4x MSAA, in my opinion. *Also, I'm not sure what's up with 2x MSAA. It almost seems like it's not even working in the game; there's no improvement compared to no AA, in fact it's actually worse.*

Edit: More testing. It seems I can get the game playing "semi-acceptably" at max settings, 1440p, and 4x MSAA with 2x 780 Tis by disabling only motion blur. GPU pre-render is 3, and Vsync is enabled (preference; you can disable it for even less VRAM usage). Motion blur seems to have a rather large impact on VRAM, and I honestly prefer it off anyway. FPS is fine, but loading chops still happen frequently while driving at high speed, though they're rarely bad enough to cause you to crash. When loading into the world at first it will be extremely choppy for a few seconds, but it does look quite a bit better than SMAA.









Still hoping for some drivers that improve VRAM management in the game.


----------



## revro

Quote:


> Originally Posted by *Serandur*
> 
> Not necessarily early copies, legitimate normal copies. Uplay supposedly doesn't have any time restrictions on how early the keys can be activated and a lot of people are getting keys from Nvidia bundles or for cheap ($35-$40) from people selling the ones that came with their Nvidia bundles. Not sure if those give keys early, probably just lucky physical copy owners aside from pirates. The caveat is the Steam version people didn't get any early fun time.
> 
> Edit: Interesting stuff going on in the NeoGAF thread by the way. Potential mass-ban.


do you have link for it? whats going on?

thanks
revro


----------



## Serandur

Quote:


> Originally Posted by *revro*
> 
> do you have link for it? whats going on?
> 
> thanks
> revro


Most of the commotion is pretty much over now, but there were so many people posting performance impressions that the mods started asking everyone who posted them for proof of purchase. One poster actually had an unused physical copy and decided to test whether physical copies were playable before the official release date, and concluded they were, in both online and offline mode. So it's proven not fake (not entirely, anyway); some people are legitimately able to play the game this early, including those with key codes from retail versions.

http://www.neogaf.com/forum/showthread.php?t=824563&page=25

It started on page 25, post #1215.


----------



## Silent Scone

Quote:


> Originally Posted by *Murlocke*
> 
> I made another post in the other thread after another go at tweaking settings. Updated AA comparison screenshots too. Copy/pasting.
> 
> No AA: http://i.imgur.com/OiFM7S2.jpg
> FXAA: http://i.imgur.com/HHMnuUE.jpg
> SMAA: http://i.imgur.com/vLAMADd.jpg
> Temporal SMAA: http://i.imgur.com/26ptRVL.jpg
> 2x MSAA: http://i.imgur.com/CHMim6C.jpg
> 4x MSAA: http://i.imgur.com/7MFFfsR.jpg
> 8x MSAA: http://i.imgur.com/OkZuk7H.jpg
> 
> Note: FPS is higher in the last two because SLI is enabled. The FPS is fine when staring at one thing, but turning the camera results in massive single-digit FPS dips due to VRAM caps. 4x MSAA is definitely the sweet spot visually, but to run it at 1440p you would need 4GB VRAM at a minimum. It's going to look better to run 1440p with SMAA than 1080p with 4x MSAA. Best quality/performance options are Temporal SMAA or 4x MSAA in my opinion. *Also, I'm not sure what's up with 2x MSAA. It almost seems like it's not even working in the game; there's no improvement compared to no AA, in fact it's actually worse.*
> 
> Edit: More testing. It seems I can get the game playing "semi-acceptable" on Max settings, 1440p, and 4x MSAA with 2x 780Tis by disabling only Motion Blur. GPU pre-render is 3, and Vsync enabled (preference, can disable for even less VRAM usage). Motion blur seems to have a rather large impact on VRAM and I honestly prefer it off anyway. FPS is fine, but loading chops still happen frequently while driving at fast speeds, but rarely are they bad enough to cause you to crash. When loading into the world at first, it will be extremely choppy for a few seconds, but it does look quite a bit better than SMAA.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still hoping for some drivers that improve VRAM management in the game.


VRAM is managed by DirectX, and any improvement of that sort will have to come from Ubisoft.


----------



## DADDYDC650

Mods here should ask for proof of purchase from those that have posted their impressions of Watch Dogs. I'm positive most folks would fail to provide any.


----------



## Clocknut

Quote:


> Originally Posted by *Murlocke*
> 
> Based on utilization currently, I would say the difference would be small to none. With 4x 4GB 680s you really shouldn't have issues unless you are wanting to run 4K... but you are running 1080p which makes me really want to ask "What the hell?"


After the game releases, I hope someone tests an i7 vs. an i5 at stock speed to see whether HT makes a big difference or not.


----------



## xxpantherrrxx

So what he is saying is that if you want 1080p/ Ultra textures/settings and any kind of MSAA added to that you have to have 4GB of VRAM? LOL!!!


----------



## Zamoldac

I didn't try 1080p, but I can confirm that usage for 1440p max settings (only MSAA 2x) averages 3.4GB of VRAM with peaks around 3.7GB.
R9 290 @ 1200MHz / 5800MHz


----------



## naved777

Nvidia released 337.88 specifically for Watch Dogs

http://www.geforce.com/whats-new/articles/nvidia-geforce-337-88-whql-watch-dog-drivers


----------



## Silent Scone

Awesome


----------



## LuminatX

Quote:


> Originally Posted by *naved777*
> 
> Nvidia released 337.88 specifically for Watch Dogs
> 
> http://www.geforce.com/whats-new/articles/nvidia-geforce-337-88-whql-watch-dog-drivers


Oh, definitely downloading right now, going to give it a try.


----------



## APhamX

Quote:


> Originally Posted by *LuminatX*
> 
> Oh definitely downloading right now, going to giver a try.


AMD is going to release their drivers tomorrow too! Hopefully that means I can hit 60+ fps on High/Ultra with 7950 CF








Quote:


> This was confirmed by Guru3D member Hilbert Hagedoorn who stated that, "BTW both AMD and NVIDIA WILL release an optimized driver tomorrow. We already have access to these and NO we can not distribute it just yet."
> Read more at http://gamingbolt.com/watch-dogs-receiving-amd-and-nvidia-optimized-drivers-on-launch-day#ihog3lQMLrewVilh.99


----------



## LuminatX

New Nvidia drivers work much better! No more stuttering for me, and I'm able to max out the game, only having to turn shadows to high instead of ultra (on my 2GB GTX 670), with Temporal SMAA at 1080p.


----------



## majin662

Quote:


> Originally Posted by *LuminatX*
> 
> New nvidia drivers work much better! no more stuttering for me, and I'm able to max out the game only having to turn shadows to high instead of ultra (on my 2gb gtx670), with temporal smaa at 1080p


So driving for you is fixed with these? Haven't had a chance to load them yet


----------



## LuminatX

Quote:


> Originally Posted by *majin662*
> 
> So driving for you is fixed with these? Haven't had a chance to load them yet


Yup, just played for about an hour and the first thing I did was hop into a car and fly around the city, no stutters


----------



## majin662

Quote:


> Originally Posted by *LuminatX*
> 
> Yup, just played for about an hour and the first thing I did was hop into a car and fly around the city, no stutters


That's good to know. Can't wait to try these out later. Thanks.


----------



## freaakk93

So finally a member







Just downloaded the driver and I am happy again (; Now I can play the game maxed out, ultra textures and 4x MSAA at 1080p, with 45fps+ and little to no stutters. (Inno3D GTX 770 2GB)


----------



## partyboy690

Just registered to say thank you; I saw the last two posts about people with 2GB cards running ultra textures. I'd drop the shadow resolution, but ultra textures are a must-have for me. Glad the new Nvidia drivers helped. If anyone else has a 2GB GTX 770 or a 2GB 680 with a quad-core i5, could they give me a rough estimate of performance at 1080p?


----------



## Serandur

Quote:


> Originally Posted by *LuminatX*
> 
> Yup, just played for about an hour and the first thing I did was hop into a car and fly around the city, no stutters


Quote:


> Originally Posted by *freaakk93*
> 
> So finally a member
> 
> 
> 
> 
> 
> 
> 
> just downloaded the driver and i am happy again (; Now i can play the game maxed out ultra textures and 4x msaa 1080p with 45fps+ and less to no stutters.(inno gtx770 2gb)


Thank you for sharing this, I have some hope my 780 will be alright again.


----------



## KenLautner

Quote:


> Originally Posted by *partyboy690*
> 
> Just registered to say thank feck, saw the last two posts about people with 2GB cards running ultra textures. I'd drop the shadow resolution but ultra textures are a must have for me. Glad the new Nvidia drivers helped, if anyone else has a 2GB GTX 770 or a 2GB 680 with an i5 quad could they give me a rough estimation of performance at 1080p.


Also can someone tell if the game is performing better on FX-8350 or i5-4670 at stock clocks?


----------



## andyboy

Wait, so it won't let you choose the option if you don't have 3GB?


----------



## freaakk93

Performance on a stock 4670K went from 80% usage to 75%. RAM is still 4GB. Performance is good on the 770 2GB even with ultra textures, but I had to turn shadows to high; I think they are still buggy or just eating performance.


----------



## BusterOddo

Pre-load is now available through Steam...annnnnd go









13.6 GB download


----------



## yusupov

new drivers = the fix.


----------



## mingocr83

6 hours left at 6Mbps... just got a new contract for 25Mbps... unfortunately, they won't come until Wednesday to install the new service...


----------



## Leopard2lx

this is so lame....it's still not available to download on uplay...


----------



## Emu105

Quote:


> Originally Posted by *Leopard2lx*
> 
> this is so lame....it's still not available to download on uplay...


I feel like around this time tomorrow is when we will be able to download the game just watch....


----------



## mingocr83

Quote:


> Originally Posted by *Leopard2lx*
> 
> this is so lame....it's still not available to download on uplay...


Agree.

That is why I purchased the Deluxe Edition on Steam. I still have the Uplay key; in fact, I'm selling it to a friend of mine at a discount just to get rid of it.


----------



## BusterOddo

Alright, downloaded and ready to go...now the wait...Steam updated the release date. It now shows May 26, but also says approximately 10 hours to go which would still make it May 27.


----------



## NinjaToast

Quote:


> Originally Posted by *BusterOddo*
> 
> Alright, downloaded and ready to go...now the wait...Steam updated the release date. It now shows May 26, but also says approximately 10 hours to go which would still make it May 27.


It's more like 9 hours (Steam doesn't do hours and minutes); it unlocks at 12am EST.


----------



## mingocr83

Quote:


> Originally Posted by *BusterOddo*
> 
> Alright, downloaded and ready to go...now the wait...Steam updated the release date. It now shows May 26, but also says approximately 10 hours to go which would still make it May 27.


10 hours for me...that means that WD will unlock at 11pm my time..May 26th.


----------



## MonarchX

New drivers, by themselves, did not fix Ultra texture driving stutters, but a very famous cwuker, who likes to Reload weapons, released a cwuk just a few hrs ago and that completely fixed Ultra texture stuttering!


----------



## xSociety

Quote:


> Originally Posted by *MonarchX*
> 
> New drivers, by themselves, did not fix Ultra texture driving stutters, but a very famous cwuker, who likes to Reload weapons, released a cwuk just a few hrs ago and that completely fixed Ultra texture stuttering!


What?


----------



## Alvarado

Quote:


> Originally Posted by *xSociety*
> 
> What?


^ On another note, wheres my origin preload!


----------



## NinjaToast

Quote:


> Originally Posted by *Alvarado*
> 
> ^ On another note, wheres my origin preload!


NONE FOR YOU! ALL THE STEAM PRE-LOADS MWAHAHAHAHAHAA!


----------



## Alvarado

Quote:


> Originally Posted by *NinjaToast*
> 
> NONE FOR YOU! ALL THE STEAM PRE-LOADS MWAHAHAHAHAHAA!


Fine I can live with that....least I didn't pay full price


----------



## NinjaToast

Quote:


> Originally Posted by *Alvarado*
> 
> Fine I can live with that....least I didn't pay full price


Paid full price to have less of a library separation, which is fine but I am kinda jelly that you didn't pay full price..


----------



## iamhollywood5

Wow it looks like Nvidia did WORK on this game! Can't wait to try them out. They claim Ultra textures are optimal on their 3GB cards, need to see it with my own eyes first but I like to hear that!


----------



## Silent Scone

That looks like the 360 version? lol.


----------



## NABBO

Quote:


> Originally Posted by *Silent Scone*
> 
> That looks like the 360 version? lol.


Watch Dogs vs Grand Theft Auto IV on PC

https://translate.google.it/translate?sl=ru&tl=en&js=y&prev=_t&hl=it&ie=UTF-8&u=http%3A%2F%2Fgamegpu.ru%2Figrovye-novosti%2Fwatch-dogs-vs-grand-theft-auto-iv-na-pk.html&edit-text=


----------



## NinjaToast

^ You had to have taken that screen at low res, because the 720p viewing has much higher quality. xD


----------



## NABBO

From that comparison, the old GTA4 actually looks better in many respects.


----------



## DADDYDC650

Watch Dogs looks better than GTA 4 without mods.


----------



## KenLautner

Guys guys shhhh..
Listen here.. Does anyone have a FX-8350 and an i5-4670 ?

I want to know the difference in performance using the two CPU's with the same GPU at stock clocks.
Basically I want to know which of the two CPU is performing better other hardware remaining constant.


----------



## NABBO

Quote:


> Originally Posted by *DADDYDC650*
> 
> Watch Dogs looks better than GTA 4 without mods.


In general Watch Dogs has the better graphics (lighting, etc.),
but for some effects the old GTA4 is better, even without mods.


----------



## SkyNetSTI

Quote:


> Originally Posted by *NABBO*
> 
> Watch Dogs vs Grand Theft Auto IV on PC
> 
> https://translate.google.it/translate?sl=ru&tl=en&js=y&prev=_t&hl=it&ie=UTF-8&u=http%3A%2F%2Fgamegpu.ru%2Figrovye-novosti%2Fwatch-dogs-vs-grand-theft-auto-iv-na-pk.html&edit-text=


edit: after this comparison my unhappiness has increased a lot!!!
Is there a way to refund a pre-order in Origin??? I want my money back!
doubleedit: found the way to refund, will apply asap!

Honestly... I am not very happy with this game... the graphics are nothing special to me (+ it runs slow), and the controls and driving physics are crap... I would prefer GTA V over Watch Dogs on PC any day!
Will play more; maybe I will change my mind...


----------



## My Desired Display Name

Yeah I'm not a fan of the driving physics at all, though I do like the other parts of the game, and the mini games are pretty fun.


----------



## revro

Is it as terribly repetitive as Assassin's Creed 1?


----------



## StrongForce

Guru3d released an article if anyone is interested.

They claim they're not impressed by the graphics; well, it's an open world game after all... and the stutters might just be caused by the view distance (which, according to someone, actually still looks good from far away, so that's good).

But a game isn't all about graphics anyway







, and some people say the car physics weren't impressive either, which sounds a bit sad! I would have enjoyed grabbing a controller and messing around with cars; I already had a lot of fun in GTA IV doing that (minus the controller).

http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,1.html


----------



## kersoz2003

Even for 3GB cards, the texture setting can't be ultra, because it uses nearly 4GB on ultra... right now I get a better result using high textures with my R9 280X.


----------



## Murlocke

Quote:


> Originally Posted by *kersoz2003*
> 
> Even for 3gb cards , texture settings cant be ultra..cause it uses nearly 4 gb for ultra...right now I can use high for textures with my r9 280x for better result..


I just did a fresh/clean install of the english (non-international) version of the drivers released today and noticed a large improvement over the international versions. Fired up the game, and did 1 full loop around the entire city at fast driving speeds on 3GB of VRAM. I counted 3 major FPS drops, the rest of the time I maintained 50+ FPS. Settings: 1440p, 4x MSAA, Vsync ON, Motion Blur off, Everything else maxed.

I also called a friend, and had him try the new drivers on his 2GB GTX 680. He managed 1440p, Temporal SMAA, Vsync off, Motion Blur off, Everything else maxed and was maintaining ~35FPS with very rare stuttering due to VRAM capping.

AMD's driver is coming tomorrow with an estimated 25-28% performance jump. NVIDIA's new driver greatly improved VRAM usage in the game, so I'd expect the same from AMD's new driver. 2GB should max the game at 1080p and SMAA without VRAM issues.


----------



## HeadlessKnight

Quote:


> Originally Posted by *Murlocke*
> 
> I just did a fresh install of the english (non-international) version of the drivers released today and noticed a large improvement over the international versions. Fired up the game, and did 1 full loop around the entire city at fast driving speeds. I counted 3 major FPS drops, the rest of the time I maintained 50+ FPS. Settings: 1440p, 4x MSAA, Vsync ON, Motion Blur off, Everything else maxed.
> 
> I also called a friend, and had him try the new drivers on his 2GB GTX 680. He managed 1440p, Temporal SMAA, Vsync off, Motion Blur off, Everything else maxed and was maintaining ~35FPS with very rare stuttering due to VRAM capping.
> 
> AMD's driver is coming tomorrow with an estimated 25-28% performance jump. NVIDIA's new driver greatly improved VRAM usage in the game, so I'd expect the same from AMD's new driver. 2GB should max the game at 1080p and SMAA without VRAM issues.


Great news. Thanks







. Since I am only at 1080p I hope it is stutter free.


----------



## Redeemer

Quote:


> Originally Posted by *Serandur*
> 
> For those cost of a Ti, Nvidia should be shamed for not making it a base standard. I'm not the fanboy type, I've pretty much been exclusively Nvidia for years (for certain software features), but their $700 flagship has no more VRAM than AMD's 2.5 year old 7970 in the middle of a console gen transition. They've known what would happen, and they're going to use it to leverage upgrades moreso than performance especially in these relatively slow times of advancement (with their TSMC problems and all). Their 2GB 770 is a bad joke, now, their limited 780s/Tis a fatally flawed product. I feel like an AMD hardware fanboy forced into buying their cutthroat competitor's garbage for what I now consider basic feature support.


This is how Nvidia makes money

they gave us a

Crippled 680 2GB 256 bit bus

Forced us to buy a 780 GTX 384 bit 3GB

Finally gave us a fully unlocked GK110 but with only 3GB (780TI)

Makes us pay 1K+ for top end

GK110 has been cut up so much over the last year


----------



## xxpantherrrxx

The game is still stuttering even with the new drivers. I have a GTX 780 Ti heavily overclocked to 1300MHz. It only stutters when entering the city; if you're off wandering on the outskirts it's totally fine. Running all Ultra settings with Temporal SMAA. Saw a max of 2941MB of VRAM usage.


----------



## twerk

Quote:


> Originally Posted by *xxpantherrrxx*
> 
> The game is still stuttering even with the new drivers. I have a GTX 780 Ti which is heavily overclocked to 1300mhz. It only stutters when entering the city, if you're off wondering on the outskirts it's totally fine. Running all Ultra settings with Temporal SMAA.


Have you tried SMAA? Temporal SMAA doesn't play well with my card.


----------



## d3vour3r

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> Is your 3930k @ stock ?
> 
> And vsync off should make a large difference, I'd have to play with it on though.


3930k is at 4.5ghz

i oc my monitor to 96hz so should get that with vsync on.


----------



## xxpantherrrxx

Yup I've tried it. It stutters no matter what setting I am on when I am in the city, on the outskirts the game runs flawlessly. This does not happen with any other game I own.


----------



## TopicClocker

I did a couple of performance tests on the 337.88 driver a few hours ago in my thread; those with 2GB or more may find it interesting. I found I could enable a lot more settings, practically maxing out the game with SMAA. Ultra textures cause a few stutters, but that may improve further.
http://www.overclock.net/t/1491602/watch-dogs-2gb-vram-performance-analysis-and-graphics-337-88-driver-improvement

Edit: Damn, this thread blew up. I don't look at it for a day and there's already 140+ replies.


----------



## Alvarado

Quote:


> Originally Posted by *Murlocke*
> 
> I just did a fresh/clean install of the english (non-international) version of the drivers released today and noticed a large improvement over the international versions. Fired up the game, and did 1 full loop around the entire city at fast driving speeds on 3GB of VRAM. I counted 3 major FPS drops, the rest of the time I maintained 50+ FPS. Settings: 1440p, 4x MSAA, Vsync ON, Motion Blur off, Everything else maxed.
> 
> I also called a friend, and had him try the new drivers on his 2GB GTX 680. He managed 1440p, Temporal SMAA, Vsync off, Motion Blur off, Everything else maxed and was maintaining ~35FPS with very rare stuttering due to VRAM capping.
> 
> AMD's driver is coming tomorrow with an estimated 25-28% performance jump. NVIDIA's new driver greatly improved VRAM usage in the game, so I'd expect the same from AMD's new driver. 2GB should max the game at 1080p and SMAA without VRAM issues.


Sweet! gives me hope now that I should be able to play it at 1080p


----------



## Shaded War

Quote:


> Originally Posted by *Alvarado*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Murlocke*
> 
> I just did a fresh/clean install of the english (non-international) version of the drivers released today and noticed a large improvement over the international versions. Fired up the game, and did 1 full loop around the entire city at fast driving speeds on 3GB of VRAM. I counted 3 major FPS drops, the rest of the time I maintained 50+ FPS. Settings: 1440p, 4x MSAA, Vsync ON, Motion Blur off, Everything else maxed.
> 
> I also called a friend, and had him try the new drivers on his 2GB GTX 680. He managed 1440p, Temporal SMAA, Vsync off, Motion Blur off, Everything else maxed and was maintaining ~35FPS with very rare stuttering due to VRAM capping.
> 
> AMD's driver is coming tomorrow with an estimated 25-28% performance jump. NVIDIA's new driver greatly improved VRAM usage in the game, so I'd expect the same from AMD's new driver. 2GB should max the game at 1080p and SMAA without VRAM issues.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sweet! gives me hope now that I should be able to play it at 1080p
Click to expand...

I'm hoping to run it at 5760 x 1080p

By the looks of things, I'll be dropping the settings or reducing to one screen.


----------



## Alvarado

Quote:


> Originally Posted by *Shaded War*
> 
> I'm hoping to run it at 5760 x 1080p
> 
> By the looks of things, I'll be dropping the settings or reducing to one screen.


Ouch..... though... I'd love to play at something higher than 1080; just waiting for Asus's G-Sync monitor. They're sure taking their sweet, sweet time with it, sigh.


----------



## StrongForce

Quote:


> Originally Posted by *Redeemer*
> 
> This is how Nvidia makes money
> 
> they gave us a
> 
> Crippled 680 2GB 256 bit bus
> 
> Forced us to buy a 780 GTX 384 bit 3GB
> 
> Finally gave us a fully unlocked GK110 but with only 3GB (780TI)
> 
> Makes us pay 1K+ for top end
> 
> GK110 has been cut up so much over the last year


Mmh yeah, and now they're gonna make money on the 6GB 780. Would I be crazy to predict the 800 series in the next few months, with the 6GB 780s being the last attempt to cash in on the 700s?









The fact they haven't mentioned any timeline for the 800 series kinda worries me.. I don't feel like getting a 780 6GB, then 2 months later seeing the 880s come out (a bit like with the Tis..)
Quote:


> Originally Posted by *xxpantherrrxx*
> 
> The game is still stuttering even with the new drivers. I have a GTX 780 Ti which is heavily overclocked to 1300mhz. It only stutters when entering the city, if you're off wondering on the outskirts it's totally fine. Running all Ultra settings with Temporal SMAA. Saw a max of 2941 MB's of VRAM usage.


What resolution? That's important.


----------



## xxpantherrrxx

Normal old 1920x1080 :\


----------



## eternal7trance

Quote:


> Originally Posted by *xxpantherrrxx*
> 
> Normal old 1920x1080 :\


Still better than console though


----------



## xxpantherrrxx

Not if it keeps stuttering the way it does. I will say it has improved a bit with the new drivers but it is still stuttery in the city.


----------



## Xboxmember1978

I get stutter too. I'm on the newest drivers released today and the rig in my sig, seeing 1700MB~2000MB on all low settings is for the birds


----------



## NABBO

Quote:


> Originally Posted by *Redeemer*
> 
> This is how Nvidia makes money
> 
> they gave us a
> 
> Crippled 680 2GB 256 bit bus
> 
> Forced us to buy a 780 GTX 384 bit 3GB
> 
> Finally gave us a fully unlocked GK110 but with only 3GB (780TI)
> 
> Makes us pay 1K+ for top end
> 
> GK110 has been cut up so much over the last year


And you continue to buy Nvidia?

Why?

I've read in other posts that you called the 680 "256-bit sucks," and then the 780 Ti a "miserable 3GB."

Instead of those, you could have bought a 7970, and later a 290X; since you often praise AMD, maybe that was the right choice.

You are not "forced" to buy Nvidia.
Nvidia doesn't point a gun in your face if you don't buy their video cards.


----------



## Alvarado

Quote:


> Originally Posted by *NABBO*
> 
> and continue to buy nvidia?
> 
> why?
> 
> I've read in other posts, you've got 680 "256bit sucks," and after 780 Ti "miserable 3Gb"
> 
> Instead of those, you bought 7970, and after 290x, as often praise amd, maybe it was the right choice.
> 
> not "are forced" to buy nvidia.
> *Nvidia do not point the gun in your face, if you do not buy their video cards*


Or do they! dun dun dun!


----------



## NABBO

Quote:


> Originally Posted by *Alvarado*
> 
> Or do they! dun dun dun!


"I was forced to buy 780 SLI", otherwise Nvidia would have killed my dog, set fire to my house, and tortured my girlfriend.


----------



## Ascii Aficionado

Quote:


> Originally Posted by *NABBO*
> 
> "I was forced to buy 780 SLI", otherwise Nvidia would have killed my dog, set fire to my house, and tortured my girlfriend.


Wow, you too ?

/s


----------



## xCamoLegend

Why is the driving so bad?

It's like they copied it from GTA 3.


----------



## bvsbutthd101

I actually like the driving. My computer seems to run this game fine but I keep getting frame drops. Quite frequently. Maybe I'll do a fresh install of drivers.


----------



## Ascii Aficionado

Quote:


> Originally Posted by *bvsbutthd101*
> 
> I actually like the driving. My computer seems to run this game fine but I keep getting frame drops. Quite frequently. Maybe I'll do a fresh install of drivers.


Wait for the launch patch and Nvidia's WD driver.


----------



## Serandur

Quote:


> Originally Posted by *NABBO*
> 
> and continue to buy nvidia?
> 
> why?
> 
> I've read in other posts, you've got 680 "256bit sucks," and after 780 Ti "miserable 3Gb"
> 
> Instead of those, you bought 7970, and after 290x, as often praise amd, maybe it was the right choice.
> 
> not "are forced" to buy nvidia.
> Nvidia do not point the gun in your face, if you do not buy their video cards


Correct as you are, and as personally approving of AMD's hardware as I am, there are certain key things Nvidia supports that AMD does not, like downsampling (to put a 780's overkill abilities to good use in less-demanding games), decent OpenGL drivers, G-Sync, CUDA if you're into that, etc. Personally, downsampling is the only must-have for me on that list, and if AMD would finally support it, or if I make a successful and relatively soon transition to a native 4K monitor where I no longer need downsampling, I would jump ship to AMD and never look back.

For whatever reason though, they just refuse to do it, unfortunately. It sucks, AMD sell fantastic hardware for a great price, but what good is a high-end processor without high-end software to take advantage of its capabilities? That's just my stance and situation, I really am wanting to forget Nvidia entirely, but it is difficult to abandon certain liked features and capabilities that a person is previously accustomed to.


----------



## PhilWrir

If there is any more discussion of piracy in this thread I will be handing out warnings/infractions from this post forward

There have been multiple warnings in this thread and multiple cleanings related to discussion of piracy.
It is forbidden by the TOS and everyone has had more than enough reminders now

Keep it on topic and keep it legal everyone


----------



## lilchronic

Quote:


> Originally Posted by *d3vour3r*
> 
> 3930k is at 4.5ghz
> 
> i oc my monitor to 96hz so should get that with vsync on.


Yeah, you could turn vsync off, unless the tearing bothers you. The higher your refresh rate, the less you notice the tearing...... at 120Hz+ tearing is barely noticeable.
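That observation tracks with simple frame-interval math: at a higher refresh rate each (possibly torn) scanout sits on screen for a shorter interval, so any tear line gets replaced sooner. A quick sketch:

```python
def frame_interval_ms(refresh_hz):
    """Time one refresh cycle stays on screen, in milliseconds."""
    return 1000.0 / refresh_hz

# A tear line at 60Hz persists nearly twice as long as one at 120Hz,
# which is why tearing is so much less noticeable on fast panels.
for hz in (60, 96, 120, 144):
    print(f"{hz:>3} Hz -> {frame_interval_ms(hz):.2f} ms per refresh")
```

At the 96Hz overclock mentioned above, each refresh lasts about 10.4ms versus 16.7ms at 60Hz, so vsync-off is a more tolerable trade there.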


----------



## Hydrored

So what's the deal on the Ti? Should I not upgrade to one and instead get the 290x for the extra VRAM? I'm not loyal to any camp


----------



## DADDYDC650

Unless you Need G Sync, don't waste extra cash on a TI. Get a nice custom 290x.


----------



## givmedew

Quote:


> Originally Posted by *Hydrored*
> 
> So what's the deal on the Ti? Should I not upgrade to one and instead get the 290x for the extra VRAM? I'm not loyal to any camp


I would just grab a 290 non-X and call it a day. The money you save you can put towards a second 290 later if you want it, or a 4K monitor, or whatever. Half of the performance difference between the 290 and 290X is the clock speed and the rest is the shaders. The non-X usually clocks just as high as the 290X cards; at least for most cards, the power delivery components are essentially the same.
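For the curious, here's a rough sketch of how the stock gap splits, using the commonly cited reference specs (2560 vs 2816 stream processors, 947 vs 1000MHz boost); real-game scaling varies, so treat these as ballpark numbers:

```python
def tflops(shaders, mhz):
    # 2 FLOPs per shader per clock (fused multiply-add)
    return shaders * mhz * 1e6 * 2 / 1e12

r9_290  = tflops(2560, 947)    # reference R9 290
r9_290x = tflops(2816, 1000)   # reference R9 290X

clock_gain  = 1000 / 947 - 1   # ~5.6% from clock alone
shader_gain = 2816 / 2560 - 1  # 10% from the extra shaders

print(f"290: {r9_290:.2f} TFLOPS, 290X: {r9_290x:.2f} TFLOPS "
      f"(+{(r9_290x / r9_290 - 1) * 100:.0f}% total)")
```

Since a 290 overclocked to 290X clocks erases the clock-speed component entirely, the remaining on-paper gap is small, which is the argument for the non-X.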


----------



## ChronoBodi

Well, 20% of the game's 14GB is corrupted; I have to repair it via Uplay repair, geez.


----------



## BusterOddo

Quote:


> Originally Posted by *ChronoBodi*
> 
> well, 20% of the game's 14GB is corrupted, i have to repair it via Uplay repair, geez.


The game won't launch for me either. I'm running Windows 7 64-bit, and when I try to launch the game I get an error message: "Watch Dogs requires Windows Vista SP1 or higher." There are 2 other guys with the same exact problem on the Ubisoft forum already. Not the start I was hoping for...


----------



## Clairvoyant129

Running smoothly on my 2 GTX Titans + 4930K @ 4.5GHz, all settings maxed out with additional settings in the Nvidia control panel (AF 16X + 8x supersampling). Using about 4GB+ of Vram.

No stuttering, must be an AMD problem.


----------



## degenn

Anyone else's download stuck at 99% with 14239MB of 14240MB downloaded?

*EDIT*

Quitting out of U-Play and then re-opening the client and clicking download seemed to have fixed the problem -- phew!


----------



## Bit_reaper

Quote:


> Originally Posted by *SkyNetSTI*
> 
> edit: after this comparison my unhappiness has increased alot!!!
> is there the way to "return" pre-order in origin??? I want my money back!
> doubleedit: found the way for refund, will apply asap!
> 
> Honestly... I am not very happy with this game... graphics is nothing special to me(+runs slow), controls and driving physics is crap... I would prefer GTA V over WatchDogs on PC anyday!
> will play more maybe I will change my mind...


Was unmodded GTA IV really that good looking?







It's been a long time since I fired it up. That said, the Watch Dogs NPC AI and animations seem pretty mediocre, which is pretty sad to see after all their _"living, breathing Chicago"_ hyping.


----------



## MapRef41N93W

So it's basically Assassin's Creed with cellphones like everyone was pretty much predicting? Sad, but totally expected from Ubisoft.


----------



## Clockster

Quote:


> Originally Posted by *Clairvoyant129*
> 
> Running smoothly on my 2 GTX Titans + 4930K @ 4.5GHz, all settings maxed out with additional settings in the Nvidia control panel (AF 16X + 8x supersampling). Using about 4GB+ of Vram.
> 
> No stuttering, must be an AMD problem.


Yeah that's why a lot of guys running Nvidia cards are complaining, because it must be an AMD problem.
The level of idiocy on these forums lately..


----------



## ChronoBodi

I'm stuck on the part where you have to make a blackout item, but I can't do it since I cannot unlock the skill. WHAT DO YOU do in this part? The game doesn't let you do anything else, and this is the first mission.


----------



## th3illusiveman

Quote:


> Originally Posted by *Clairvoyant129*
> 
> Running smoothly on my 2 GTX Titans + 4930K @ 4.5GHz, all settings maxed out with additional settings in the Nvidia control panel (AF 16X + 8x supersampling). Using about 4GB+ of Vram.
> 
> No stuttering, must be an AMD problem.


You're not very perceptive are you....
Quote:


> Originally Posted by *ChronoBodi*
> 
> im stuck on the part where you have to make a blackout item, but i can't do it since i cannot unlock the skill. WHAT DO YOU do in this part, the game doesn't let you do anything else, this is first mission.


Watch a quick let's play or walkthrough video. I think you find it in the weapons wheel and activate it using the throw grenade button [LB?].


----------



## DADDYDC650

Quote:


> Originally Posted by *PhilWrir*
> 
> If there is any more discussion of piracy in this thread I will be handing out warnings/infractions from this post forward
> 
> There have been multiple warnings in this thread and multiple cleanings related to discussion of piracy.
> It is forbidden by the TOS and everyone has had more than enough reminders now
> 
> Keep it on topic and keep it legal everyone


I doubt the AMD 290 cards will have issues maxing out the game once they receive their new driver tomorrow and they don't have to spend $1000+ to do it.









I've been playing it for almost an hour maxed out @1440p, Temporal SMAA with v-sync enabled and it runs great! No lag at all and practically no stuttering. Great game so far! Graphics look good and I don't have an issue with driving.


----------



## NateN34

Well, the game looks amazing and seems like it will be a blast to play.

However, the stuttering and frame-rate drops are horrible. Even on low, I get stutters/dips. SSD made the stuttering shorter, but it is still there. Going to hold out on playing it, until it is patched up a bit.


----------



## jimlaheysadrunk

having some pretty gnarly frame drops here too, 4770k, reference 780 3gb


----------



## mingocr83

I just played for 2 hours, ultra settings, not a single problem. I was a bit worried reading all the comments here, but in my case everything works perfectly.


----------



## NavDigitalStorm

Playing with my R9 295X2 on Ultra 1440p... very disappointed. Both GPU's showing load but only getting 35 fps.


----------



## StrongForce

How about the game itself, guys, is it good? Seems like the reviews are mixed


----------



## DADDYDC650

Quote:


> Originally Posted by *StrongForce*
> 
> How about the game itself guys is it good, seems like reviews are mixed


I'm liking it. Going on 3 hours so far. Nice gfx with solid gameplay. If I had to rate it early on, I'd give it a 7.5 out of 10 at least.


----------



## mingocr83

Quote:


> Originally Posted by *StrongForce*
> 
> How about the game itself guys is it good, seems like reviews are mixed


To be honest, the game is interesting, but not the kind of engagement I expected. The whole thing of hacking everyone and everything is a bit boring... you tend to do some cool stuff, but then... it's the same. It's a game to enjoy a couple of hours weekly...


----------



## DADDYDC650

Quote:


> Originally Posted by *mingocr83*
> 
> I just played for 2 hours, ultra settings, not a single problem. I was a bit worried reading all the comments here, but in case, everything works perfectly.


No problems here as well. Runs great on my rig using the latest Nvidia driver.


----------



## Clockster

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Playing with my R9 295X2 on Ultra 1440p... very disappointed. Both GPU's showing load but only getting 35 fps.


The Watch Dogs driver from AMD should be available in the next few hours; that should hopefully give a dramatic boost in performance.


----------



## Leopard2lx

I don't understand how they release a game like this with so many performance / technical problems on high end systems. Do they not bother testing before release? Or did they just say "well, it's fine like this, whatever...they'll deal with it."









Welcome to beta testing Watch Dogs.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Leopard2lx*
> 
> I don't understand how they release a game like this with so many performance / technical problems on high end systems. Do they not bother testing before release? Or did they just say "well, it's fine like this, whatever...they'll deal with it."
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Welcome to beta testing Watch Dogs.


Pretty much what happened


----------



## th3illusiveman

Quote:


> Originally Posted by *StrongForce*
> 
> How about the game itself guys is it good, seems like reviews are mixed


Everything but the shooting is good. Driving is kinda meh, but that's because of the performance issues.


----------



## mingocr83

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Playing with my R9 295X2 on Ultra 1440p... very disappointed. Both GPU's showing load but only getting 35 fps.


Add to that the fact that this is an Nvidia game... for sure this game will run like crap for a few months, until Nvidia decides it has messed with gamers from the "other brand" enough in this particular case.


----------



## Emu105

Yeah, I still get stuttering while I'm driving. Walking around and stuff, the game runs smooth, but while I drive it just hiccups out of nowhere, like I can hear my GPU stop working and then hear it again a millisecond later, and that's when the game stutters or hiccups.


----------



## Blackops_2

Quote:


> Originally Posted by *StrongForce*
> 
> How about the game itself guys is it good, seems like reviews are mixed


Are there any reviews out yet?


----------



## mingocr83

Quote:


> Originally Posted by *Blackops_2*
> 
> Are there any reviews out yet.


IGN's should be out in a few minutes... they published the review on Facebook, but it seems it's not online yet on their website.

EDIT: Here it is:

http://www.ign.com/articles/2014/05/27/watch-dogs-review


----------



## zantetheo

Has anyone tried the GTX 770 4GB? How much VRAM does it need for a playable experience?


----------



## KenjiS

Quote:


> Originally Posted by *zantetheo*
> 
> Has anyone tried Gtx 770 4GB? How much vram needs for a playable experiance?


I'm curious about this too; the few things I've read basically say no Ultra if you have 2GB, because it's unplayable.


----------



## zantetheo

Guess we will have to wait and see if 4GB on the GTX 770 is a plus for this game


----------



## Blackops_2

Quote:


> Originally Posted by *mingocr83*
> 
> IGN should be out in a few minutes...they published the review on facebook, but seems is not online yet on their website.
> 
> EDIT: Here it is:
> 
> http://www.ign.com/articles/2014/05/27/watch-dogs-review


Yeah a bunch have come out in the last hour.

Well, after seeing a decent number of reviews, only one seemed to really like the story.

I'll see eventually, but I was hoping for an awesome story. For me at least, I liked the premise laid out in the trailers.

IGN - 8.4/10
Gamespot - 8/10
Polygon - 8/10
Eurogamer - 7/10
GamesRadar - 4/5
PC Gamer 87/100


----------



## KenjiS

Quote:


> Originally Posted by *zantetheo*
> 
> Guess we will have to wait and see if 4Gb on Gtx 770 is a plus for this game


I quickly loaded it up on high/1440p/2x MSAA and I'm getting really good framerates with my 2GB 770 SLI (60-70), but I expected that... this game is why I went for SLI-ing my 770s.

Might try Ultra tomorrow


----------



## ChronoBodi

lol, this game sucks up 4.5GB of VRAM at 4K with no AA whatsoever. I don't think 6GB would even be enough for heavy AA at 4K, but... I'm on a 24-inch 4K, no AA needed here!


----------



## revro

Quote:


> Originally Posted by *ChronoBodi*
> 
> lol this game sucks up 4.5 GB VRAM on 4K with no AA whatsoever. i don't think 6GB would even be enough for heavy AA on 4K, but... i'm on 24 inch 4K, no AA needed here!


did you have the latest nvidia watchdogs drivers?


----------



## ChronoBodi

Quote:


> Originally Posted by *revro*
> 
> did you have the latest nvidia watchdogs drivers?


Yea it's the 337.88 driver.

I know it's 4K, but still, 4.5GB with no AA is mind-boggling.
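For a rough sense of why resolution alone inflates VRAM use, here's a back-of-envelope sketch. The buffer count and formats are hypothetical (real engines use many more intermediate buffers, and streamed textures take the bulk), so treat the numbers as illustrative only:

```python
# Rough sketch of render-target memory at 4K (hypothetical buffer setup).
W, H = 3840, 2160

def target_mib(bytes_per_pixel, samples=1):
    # One full-screen render target; MSAA multiplies per-pixel storage.
    return W * H * bytes_per_pixel * samples / 2**20

print(round(target_mib(4), 1))      # single RGBA8 target: ~31.6 MiB
# Assumed deferred setup: 5 G-buffer targets plus depth, no AA.
print(round(6 * target_mib(4), 1))  # ~189.8 MiB in render targets alone
```

Even with AA off, every extra full-screen buffer at 4K costs ~32 MiB, and streamed textures fill most of the rest, which is at least consistent with a 4.5GB reading.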


----------



## revro

Quote:


> Originally Posted by *ChronoBodi*
> 
> Yea it's the 337.88 driver.
> 
> I know it's 4k, but still, 4.5gb with no AA is still mindboggling.


And did you do the other recommended stuff? Turn off motion blur, set level of detail to high, and textures as well? Then use SMAA or temporal SMAA.


----------



## ChronoBodi

Quote:


> Originally Posted by *revro*
> 
> and did you done the other stuff recommended? turn off motion blur, set level of detail to high, and textures as well? then use smaa or temporal smaa


Its all Ultra with no motion blur and no AA.


----------



## KenjiS

I booted mine up to Ultra and it seems to be OK except when something moves. I'm thinking motion blur is the culprit; will try to investigate more tomorrow..


----------



## Thetbrett

Just had a play. With a 3770K @ 4.4, 2x 780 Ti, 16GB RAM, I ran at a solid 60 fps. I just used the auto-detect settings. I now need to find my Xbox controller..


----------



## silent man

I have a GTX 780 SC ACX 3GB,
latest Nvidia driver.

The game stutters badly when driving,
also when Shaders are set to High!!!

Just wanna know one thing:

even with everything on ULTRA... is there any difference between ULTRA and HIGH on the Texture option?


----------



## KenjiS

Eugh, now it runs like a dog and I'm getting texture glitching up the wazoo..

I'm also getting very scared that I just threw $330 down the toilet for SLI :/


----------



## yunshin

Hitting minimums of 40s during certain areas, otherwise it's a solid 60. Ultra settings with Temp SMAA.

Not exactly what I want, but I'll tolerate the drops in fps for now.


----------



## KenjiS

Quote:


> Originally Posted by *yunshin*
> 
> Hitting minimums of 40s during certain areas, otherwise it's a solid 60. Ultra settings with Temp SMAA.
> 
> Not exactly what I want, but I'll tolerate the drops in fps for now.


I'd be happy with 40-60 fps so long as it didn't drop to 2 on me


----------



## Silent Scone

Quote:


> Originally Posted by *Leopard2lx*
> 
> I don't understand how they release a game like this with so many performance / technical problems on high end systems. Do they not bother testing before release? Or did they just say "well, it's fine like this, whatever...they'll deal with it."
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Welcome to beta testing Watch Dogs.


Have you met developers? lol. It's quite a common mentality, without tarring them ALL with the same brush, that they can't see past their own nose. If it works for them, then their work is done.


----------



## FuryDharok

As long as it's smoother with my 2GB 670 on ultra than a PS4, then i'm happy.


----------



## SkyNetSTI

Quote:


> Originally Posted by *NABBO*
> 
> Watch Dogs vs Grand Theft Auto IV on PC
> 
> https://translate.google.it/translate?sl=ru&tl=en&js=y&prev=_t&hl=it&ie=UTF-8&u=http%3A%2F%2Fgamegpu.ru%2Figrovye-novosti%2Fwatch-dogs-vs-grand-theft-auto-iv-na-pk.html&edit-text=


Quote:


> Originally Posted by *silent man*
> 
> i have GTX 780 SC ACX 3GB
> latest Nvidia
> 
> game sucks stuttering when driving
> also when set Shader to Hight !!!!!!!!!!!!
> 
> just wanna know one thing
> 
> even if everything ULTRA ...is there any diffrence between ULTRA and HIGHT on Texture option?


It sucks for sure! With my SLI the game stutters every 3 seconds on any settings!
GPU load is between 30-65%.
Will request a refund tonight.


----------



## Ramzinho

Looking at like the past 8 pages, all I see are Nvidia GPUs... that makes me cry, and I wanna see how someone is doing with a 7970 on this


----------



## Ascii Aficionado

Quote:


> Originally Posted by *Ramzinho*
> 
> looking at like the past 8 pages.. all i see are Nvidia GPUs.. that makes cry and wanna see how someone is doing with a 7970 on this


http://www.techspot.com/review/827-watch-dogs-benchmarks/


----------



## Ramzinho

Quote:


> Originally Posted by *Ascii Aficionado*
> 
> http://www.techspot.com/review/827-watch-dogs-benchmarks/


Thanks a lot.. I know there are some posts, but you know, OCN reviews are much more accurate and personal.

What kind of resolution is 1680x1050?


----------



## Ascii Aficionado

Quote:


> Originally Posted by *Ramzinho*
> 
> what kind of resolution is: 1680x1050


It used to be the most common 16:10 desktop resolution; games supported it more than 16:9 back when 16:9 displays weren't common.


----------



## Psyco Flipside

Quote:


> Originally Posted by *Ramzinho*
> 
> looking at like the past 8 pages.. all i see are Nvidia GPUs.. that makes cry and wanna see how someone is doing with a 7970 on this


I'm averaging (1920x1200, pre-game drivers, [email protected], everything Ultra, HBAO Max, 4x MSAA):
- 7970 (@1100/1500): 37-38 FPS
- 7970 (@1250/1700): 44 FPS

VRAM Usage: > 2950 MB. RAM Usage: 4-4.5GB

Let's see if new drivers improve it


----------



## Ramzinho

Quote:


> Originally Posted by *Psyco Flipside*
> 
> I'm averaging (1920x1200, pre-game drivers, [email protected], everything Ultra, HBAO Max, 4x MSAA):
> - 7970 (@1100/1500): 37-38 FPS
> - 7970 (@1250/1700): 44 FPS
> 
> VRAM Usage: > 2950 MB. RAM Usage: 4-4.5GB
> 
> Let's see if new drivers improve it


I don't have my CPU OCed *cough*, and I don't think a stock CPU would make that huge of a difference in a more GPU-intensive game... and I kinda feel you can get a bit more than that... but again, I'm running 1080p.

Do you really see a difference between 2x MSAA and 4x MSAA?


----------



## xxpantherrrxx

And you're not stuttering?
Quote:


> Originally Posted by *Psyco Flipside*
> 
> I'm averaging (1920x1200, pre-game drivers, [email protected], everything Ultra, HBAO Max, 4x MSAA):
> - 7970 (@1100/1500): 37-38 FPS
> - 7970 (@1250/1700): 44 FPS
> 
> VRAM Usage: > 2950 MB. RAM Usage: 4-4.5GB
> 
> Let's see if new drivers improve it


How are you running those settings with just 3GB of VRAM? I get terrible stuttering with those settings until I drop the MSAA to Temp SMAA, and then the stuttering is about 98% gone.


----------



## Psyco Flipside

Quote:


> Originally Posted by *Ramzinho*
> 
> i don't have my CPU oced "cough" and i don't think Stock Cpu would make that huge of a difference in a more GPU intensive game... and i kinda feel you can get a bit more than that .... but again i'm running 1080p .
> Do you really feel difference in 2MSAA than 4MSAA?


CPU frequency should have little to zero impact on this game. The benchmarks in the link above show the same FPS within the 2.5GHz-4.5GHz margin, so you should be fine at stock









I haven't tried 2x MSAA, only 4x MSAA and Temp SMAA, and it runs much smoother with the latter.
Quote:


> Originally Posted by *xxpantherrrxx*
> 
> And you're not stuttering
> How are you running those settings with just 3GB of VRAM? I get terrible stuttering with those setting until I drop the MSAA to Temp SMAA and the suttering is about 98% gone.


Yep, I'm experiencing the same situation.
I usually get the stutters either when I'm driving fast or at intersections. With Temp SMAA I have almost no stutter, but it doesn't look the same... That's why I can't wait any longer for the new driver
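The "CPU clocks barely matter" result above is what you'd expect when the GPU is the limiting stage. A toy frame-time model makes the point (all the millisecond figures here are invented for illustration):

```python
def fps(cpu_ms, gpu_ms):
    # CPU and GPU work overlap across frames, so the slower stage sets the pace.
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(8.0, 25.0))   # GPU-bound at 25 ms/frame -> 40 fps
print(fps(5.0, 25.0))   # a much faster CPU changes nothing -> still 40 fps
print(fps(8.0, 12.5))   # halve the GPU cost and fps doubles -> 80 fps
```

Only when the CPU side grows past the GPU side (heavy AI, lots of NPCs) does clock speed start showing up in the frame rate.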


----------



## xxpantherrrxx

There is still stuttering in the city no matter what the settings; when you leave the city to roam the outskirts, the stuttering is 100% gone.


----------



## ahnafakeef

Hello everyone. I just started playing and am in need of some assistance.

I don't want to use vSync (so I can get the maximum FPS at all times), so I enabled a 60 FPS frame limiter in RivaTuner. But I still get tearing in the game.

How can I solve this issue?

Thanks in advance.


----------



## Silent Scone

DP 1.3 standard monitor or G-Sync capable monitor. lol


----------



## Twirlz

Game is running pretty well on an FX 8350 + 280X. I am running a mix of ultra/high settings, but the frame rate varies a lot. In cutscenes I get 90+ FPS with 99% GPU utilization, however when driving about I get 35 FPS with 60% GPU utilization, and walking about in highly populated areas is about 30-50 FPS with 60-70% GPU utilization. Is this a case of the CPU not coping well with all the AI etc. in highly populated areas, or will the performance likely improve through future patches?

When there's not much AI, or indoors, the GPU utilization shoots up to 90%+


----------



## BusterOddo

Quote:


> Originally Posted by *Ramzinho*
> 
> looking at like the past 8 pages.. all i see are Nvidia GPUs.. that makes cry and wanna see how someone is doing with a 7970 on this


I'm trying to run cf 7970's. I say trying because I am getting very low gpu usage. 35% on one, 65% on the other which is resulting in pretty much 30-35 fps everywhere I go regardless of settings. I'm seeing around 4500-5800mb of vram usage depending on the settings I try. I will be waiting for today's driver update.


----------



## dboythagr8

So what's the verdict on 4K and 3GB cards? Anybody run that yet?


----------



## Silent Scone

Quote:


> Originally Posted by *dboythagr8*
> 
> So what's the verdict on 4k and 3GB cards? Anybody run that yet


One of my friends has tried running it on Tri SLi Titans at 4K and ran out of VRAM with 4X MSAA with ultra settings lol. Not that this is at all surprising really, but he's been unable to get the game running since. Keeps crashing.


----------



## mfknjadagr8

Lol, $2400 in cards and can't play. I would be so happy lol


----------



## Ramzinho

Quote:


> Originally Posted by *BusterOddo*
> 
> I'm trying to run cf 7970's. I say trying because I am getting very low gpu usage. 35% on one, 65% on the other which is resulting in pretty much 30-35 fps everywhere I go regardless of settings. I'm seeing around 4500-5800mb of vram usage depending on the settings I try. I will be waiting for today's driver update.


driver is out mate.. check the AMD driver sub forum


----------



## Newbie2009

Is it just me or has UPLAY taken a Dump?


----------



## CaptainZombie

Does anyone know if the retail PC version that comes with the discs also has a code to just download via Uplay? With my 20% off at the BBY gamers club, it would be cheaper than just buying online at the moment. I don't have a DVD drive in my PC anymore either.


----------



## Silent Scone

Quote:


> Originally Posted by *Newbie2009*
> 
> Is it just me or has UPLAY taken a Dump?


I couldn't login for ages...


----------



## BusterOddo

Quote:


> Originally Posted by *Ramzinho*
> 
> driver is out mate.. check the AMD driver sub forum


Yeah, already have the 14.6 downloaded and installed. After the install process I noticed what you see quoted below. Uplay is down for the moment.
Quote:


> Originally Posted by *Newbie2009*
> 
> Is it just me or has UPLAY taken a Dump?


----------



## Newbie2009

Quote:


> Originally Posted by *Silent Scone*
> 
> I couldn't login for ages...


Yeah I'm getting unrecoverable error and uplay shutting down, can't log in


----------



## mattjm

So guys, can I expect it to run at 1080p with all settings on high (not ultra), AA turned off, with 45+ avg FPS on a Phenom II X4 965 and a 270X, both at stock clocks?


----------



## Silent Scone

I'd left mine downloading so hope it hasn't copped out.


----------



## TheBlindDeafMute

Guru3D did a benchmark and stated that at any ultra setting it used over 3GB of VRAM. At 1440p, it's well into 4GB. I just find that hard to believe, but I will test it tonight.


----------



## BusterOddo

Quote:


> Originally Posted by *mattjm*
> 
> So guys, can I expect it to run at @ 1080p with all settings on high(not ultra), AA turned off with 45+ avg FPS with a Phenom II X4 965 and a 270X, both at stock clocks?


I found this chart to be nearly identical with what fps I am seeing with one card:

http://www.overclock.net/t/1492068/techspot-watch-dogs-benchmarked


----------



## zylonite

ATI 7970 GHz 3GB and the game crashes on start with the latest AMD 14.6 beta drivers.


----------



## Leopard2lx

Unimpressed so far. Getting about 65-70 fps average on Ultra / 1080p / Temporal SMAA, but stutters and frame drops while driving, even with the latest drivers.


----------



## TopicClocker

Quote:


> Originally Posted by *BusterOddo*
> 
> I found this chart to be nearly identical with what fps I am seeing with one card:
> 
> http://www.overclock.net/t/1492068/techspot-watch-dogs-benchmarked


I would take those charts with a grain of salt; a couple of people, including myself, in that thread noticed things were off. No way is my Phenom II pushing 50fps; it's bottlenecking most of the time, and it's clocked higher than the one that's in the charts.

This could of course be the game itself, as I've read that people are reporting their 780s aren't at full utilization even with overclocked i5s.


----------



## wstanci3

Well, well. What a surprise. Ubisoft delivers yet again a poorly optimized title.
I guess I will wait a couple of months to grab this when the wrinkles are ironed out.


----------



## TheBlindDeafMute

Quote:


> Originally Posted by *wstanci3*
> 
> Well, well. What a surprise. Ubisoft delivers yet again a poorly optimized title.
> I guess I will wait a couple of months to grab this when the wrinkles are ironed out.


...and Uplay keeps crashing and giving me dump files. I thought Origin was the worst, and so far Uplay has been worse in every way... repeatedly. STEAM FTW.

Have I mentioned I hate Uplay?


----------



## Clockster

Quote:


> Originally Posted by *Newbie2009*
> 
> Yeah I'm getting unrecoverable error and uplay shutting down, can't log in


Uplay is down at the moment...lol people are mad xD
I don't know why; this is what we've come to expect from Ubisoft lol
I honestly don't care. I've now played the game, so the initial excitement is gone; will play again tomorrow.


----------



## wstanci3

Quote:


> Originally Posted by *TheBlindDeafMute*
> 
> ...and Uplay keeps crashing and giving me dump files. I though Origin was the worst, and so far Uplay has been worse in every way...repeatedly. STEAM FTW.
> 
> Have I mentioned I hate Uplay?


Steam >Origin >Uplay
All have their headaches but Uplay... ugh









I was really looking forward to the main story. Has anyone been following it yet, or are we all still trying to find a stable 60fps frame rate?


----------



## cstkl1

Has anybody managed to get SLI fully working, with both GPUs running close to max?
Because so far this magic driver for it ain't fully working.


----------



## Leopard2lx

Quote:


> Originally Posted by *TopicClocker*
> 
> I would take those charts with a grain of salt, a couple of people including myself in the thread noticed things were off.
> No way is my Phenom II pushing 50fps, it's bottlenecking most of the time, and it's clocked higher than the one that's in the charts.
> 
> This could of-course be the game itself as I've read that people are reporting their 780s aren't at full utilization with overclocked i5s.


Yep. My 780 usage is around 60% ish on 1080p with max settings and temporal smaa. Haven't checked the cpu usage.

Lame Ubisoft!


----------



## EVGA-JacobF

Quote:


> Originally Posted by *Cretz*
> 
> What settings would be the equivalent of how the game is on the PS4?


I believe "High" settings are equivalent to PS4.


----------



## cstkl1

Quote:


> Originally Posted by *EVGA-JacobF*
> 
> I believe "High" settings are equivalent to PS4.


Finally, the right guy.

Your SLI, is it working fully loaded, close to 100 percent on two cards, with Watch Dogs retail and the latest driver??

And do you find that it sometimes jerks etc., but your frame rates are all OK and so is the latency??


----------



## Zillerella

Anyone else experiencing crashes? Just downloaded the game and I receive a "the game stopped working" crash, and the video driver crashes all the time.
Can anyone help?

Also, Uplay doesn't work atm?


----------



## My Desired Display Name

Is there an auto aim with the mouse? It seems like the game is pulling towards targets even when I'm on a mouse.


----------



## TheBlindDeafMute

I don't think the textures look all that great. Where do all the hype and VRAM get used? I could turn on any number of games and blow this out of the water. Muddy mess, even when set to 8x MSAA with all the goodies. Blah. Oh well, glad it was free, lol. And Uplay sucks. Worst game client ever.


----------



## revro

Try setting Uplay to offline mode; that worked for me to get Uplay to start. Though I am copying data now, so I can't play Watch Dogs, it should work for you.
Also, it's kinda stupid that Forgotten Sands, the Prince of Persia game, can't be uninstalled from Uplay, as I got it off Steam. Basically it's in my list of installed games even though I uninstalled it. Strange.


----------



## BusterOddo

Yes, I agree with you, very early numbers. I was, however, getting almost full usage out of a single card.


----------



## cstkl1

My bad, I found it's because of ShadowPlay.
Without it, FPS increases by a boatload and it's uber smooth.


----------



## Benh23

Quote:


> looking at like the past 8 pages.. all i see are Nvidia GPUs.. that makes cry and wanna see how someone is doing with a 7970 on this


I have an R9 280X, which is basically the same thing as a 7970. It is fairly hard to keep an accurate FPS count due to the stuttering (until a new patch comes), but on ULTRA @ 1080p with SMAA it gets a solid 40-50 FPS, minus the stuttering.


----------



## KenjiS

To make all you AMD folks feel a bit better: my SLI GTX 770s can't run the game at 1440p High or Ultra because I only have 2GB of VRAM.

So, admittedly, I get 60-70 fps, but then it takes a dump every few inches to about 1-2 fps. A literal slideshow.

Obviously the game is unplayable, and the only way to make it playable is to drop down to medium settings or 1080p, where supposedly 1080p High is barely playable for a 2GB card...

So I'm kinda in a lousy mood. I went SLI expecting this game to need it, and it turns out I shouldn't have bothered, as VRAM ended up being the crippling factor...

And to top it off, now Uplay isn't even working...


----------



## Gaupz

Hmm, my cards sit @ 575MHz and I get 60 fps..... great.


----------



## y2kcamaross

Quote:


> Originally Posted by *Gaupz*
> 
> Hmm my cards sit @ 575Mhz and i get 60 fps..... great.


Shut down Afterburner/EVGA Precision, then restart the one you shut down, even with the game running; it will then show the correct clocks.


----------



## Clazman55

What I find weird is the GTX 700 series users at 50-60% usage.

My working theory based on the responses I've seen on OCN: 2GB of VRAM is limiting the usage % of the GPU. I have 4GB, WD uses about 3.2GB, and I'm at 99% usage on a 670. It seems like WD is trying to load as many assets as possible into VRAM, and when it can't, the CPU has to grab the files from HDD/SSD and then pass them to the GPU. That is the only thing I can think of that would bottleneck a better card. That, or the 700 series driver has a bug. I wouldn't call it an optimization issue, but it's definitely a weird design choice.
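That streaming theory can be illustrated with a toy LRU cache standing in for VRAM (the slot counts and access pattern below are made up purely for illustration): once the per-frame working set no longer fits, nearly every access becomes a disk fetch, which is exactly the kind of hitch people are describing.

```python
from collections import OrderedDict

def disk_fetches(requests, vram_slots):
    """Toy model: VRAM as an LRU cache of texture slots. A miss means
    the asset must be pulled from HDD/SSD (a potential stutter)."""
    vram = OrderedDict()
    misses = 0
    for tex in requests:
        if tex in vram:
            vram.move_to_end(tex)         # mark as recently used
        else:
            misses += 1                   # fetch from disk
            if len(vram) >= vram_slots:
                vram.popitem(last=False)  # evict least recently used
            vram[tex] = True
    return misses

scene = list(range(10)) * 100             # 10 textures drawn for 100 frames
print(disk_fetches(scene, 12))            # fits in "VRAM": 10 cold misses
print(disk_fetches(scene, 8))             # 2 slots short: every access misses
```

Being even slightly over budget makes an LRU cache thrash on a cyclic access pattern, so a 10-20% VRAM shortfall can hurt far more than the percentage suggests.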


----------



## MonarchX

What fascinates me is that the developers were more than aware of this bug before the game was released, and yet they decided to ship with such stuttering anyway. Then they complain that people pirate their games and don't pay for them in the end... The game stutters and offers very little of what people were expecting. Just ride the cameras!


----------



## TopicClocker

Quote:


> Originally Posted by *Clazman55*
> 
> What I find weird is the GTX700 series users at 50-60% usage.
> 
> My working theory on the responses I've seen on OCN: 2GB of VRAM is limiting the usage % of the GPU. I have 4GB and WD uses about 3.2GB and I'm at 99% usage on a 670. It seems like WD is trying to load as many assets as possible into VRAM and when it can't the CPU has to grab the files from HDD/SSD and then pass it to the GPU. That is the only thing I can think of that would bottleneck a better card. That or the 700 series driver has a bug. I wouldnt call it an optimization issue, but definitely a wired design choice.


Hmm, that's interesting if that's the case, but I'm not sure; I figured the bottleneck for me was my CPU. What frame rate do you get on your 4GB 670?
I'm actually glad I didn't buy a 770 2GB or a second 760. It's a damn shame if this doesn't improve in Watch Dogs and keeps happening to 2GB cards in future games; that means the 690 is going down with it too.

If only Nvidia had gone with 3-4GB standard for cards of that league, like AMD. 1440p sounds like it's out of the question for those running that resolution; 1080p is alright for me. I get trouble when enabling Ultra shadows with HBAO+; one or the other seems okay, though I think shadows cause the most trouble. Ultra textures is a definite no-go, and it generally sounds like tougher cards like the 780s are having problems too.

It is the first official day of launch, so things may improve. I also don't recall there being a second patch for this game, only one since the 23rd.


----------



## revro

I think we can thank AMD for the consoles' unified RAM and VRAM, hence this terrible port that can't work properly for most people without 4+ GB of VRAM


----------



## KenjiS

Quote:


> Originally Posted by *Clazman55*
> 
> What I find weird is the GTX700 series users at 50-60% usage.
> 
> My working theory on the responses I've seen on OCN: 2GB of VRAM is limiting the usage % of the GPU. I have 4GB and WD uses about 3.2GB and I'm at 99% usage on a 670. It seems like WD is trying to load as many assets as possible into VRAM and when it can't the CPU has to grab the files from HDD/SSD and then pass it to the GPU. That is the only thing I can think of that would bottleneck a better card. That or the 700 series driver has a bug. I wouldnt call it an optimization issue, but definitely a wired design choice.


Consider the design of the PS4 and Xbox One, however: both have unified memory architectures with 8GB of RAM in total. Even assuming overhead, it's a simple matter to allocate 3GB to VRAM; even 4 is technically possible (Watch Dogs on my PC is taking about 2GB of system RAM, and a console presumably uses less system RAM for assets..).

My GUESS here is that on these consoles you don't compress the textures. This saves processing overhead at the cost of RAM, but you have a lot of RAM to work with, so it's a good tradeoff. On the PC you have large processing reserves but most GPUs have 2-3GB of RAM, so you would be better off compressing textures to save RAM and using the processing overhead to decompress them.

Not sure this is technically correct though, so don't take it as the word of god; it just seems like a logical explanation
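The RAM cost of skipping block compression is easy to ballpark. A quick sketch (the 2048x2048 size is just an example; the byte-per-pixel rates are the standard ones for RGBA8, BC3/DXT5, and BC1/DXT1):

```python
def texture_mib(width, height, bytes_per_pixel, mipmapped=True):
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    return base * (4 / 3 if mipmapped else 1) / 2**20

# Example 2048x2048 texture in three common storage formats:
print(round(texture_mib(2048, 2048, 4), 1))    # uncompressed RGBA8: ~21.3 MiB
print(round(texture_mib(2048, 2048, 1), 1))    # BC3/DXT5 (1 B/px):  ~5.3 MiB
print(round(texture_mib(2048, 2048, 0.5), 1))  # BC1/DXT1 (.5 B/px): ~2.7 MiB
```

One caveat to the guess above: GPUs sample BC formats directly in hardware, so on PC compressed textures are nearly free at runtime. Either way, the 4-8x size difference shows why a console with a unified 8GB pool can afford looser packing than a 2-3GB card.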


----------



## reklaw75

Reporting in on how the game runs on an AMD GPU / Intel CPU.

Resolution: 1920 x 1080
CPU: 4770K @ 4.5GHz
GPU: 290X @ 1000MHz (40MHz underclocked)
RAM: 32GB @ 2400

Settings: Ultra
AA: Temporal AA (haven't tried anything else yet)

Running from an SSD and without the 14.6 AMD drivers (on 14.4 WHQL), the game last night was running beautifully in all areas EXCEPT for driving, which would hiccup very occasionally. I was using the -nopagefilecheck option.

Thinking about doing a run tonight with the game on a RAM drive and reporting back. I'll also try out different AA modes and the new 14.6 driver.

The key seems to be having loads of RAM. The only thing I could ask of the game at the moment is for the driving not to stutter occasionally, but until I've tried 14.6 I can't complain.


----------



## KenjiS

Might be the SSD as well, as that would load assets into VRAM faster than a mechanical HDD.


----------



## Seallone

Just don't bother. 3x 780 Ti and it's crap with only 3GB.

lol
4930K at stock at the moment,
1TB Samsung SSD,
full reinstall; my drive died installing WD, no joke.

4K runs so badly it's a joke.

WD feels and plays worse than on my PS4. This game is bad, really bad.

We can never preorder. Just pirate it, sorry mods. Nvidia and Uplay are trolling me here.

OMG, it's that bad; we should all make videos. I do enjoy my system, but WD has ticked me off for 2 days... Don't buy this crap. NVIDIA: THE WAY TO BE PLAYED.


----------



## Nightfallx

Quote:


> Originally Posted by *Seallone*
> 
> Just don't bother. 3x 780 Ti and it's crap with only 3GB.
> 
> lol
> 4930K at stock at the moment,
> 1TB Samsung SSD,
> full reinstall; my drive died installing WD, no joke.
> 
> 4K runs so badly it's a joke.
> 
> WD feels and plays worse than on my PS4. This game is bad, really bad.
> 
> We can never preorder. Just pirate it, sorry mods. Nvidia and Uplay are trolling me here.
> 
> OMG, it's that bad; we should all make videos. I do enjoy my system, but WD has ticked me off for 2 days... Don't buy this crap. NVIDIA: THE WAY TO BE PLAYED.


Someone's mad. I have 1x 780 and I feel the game performs really well.


----------



## Rangerscott

It doesn't run on my 12K multi-monitor system at super duper ultra 9000 with one GTX 660.

POOR OPTIMIZATION!

Ill wait for the $20 sale.


----------



## Seallone

Yeah, just wait. The only problem with that is more sales on consoles, meaning the same problem continues for those of us who like visuals.


----------



## Seallone

Quote:


> Originally Posted by *Nightfallx*
> 
> Someone's mad. I have 1x 780 and I feel the game performs really well.


Good for you, but some people like to play at high res; that's the reason for PCs. I've got 100+ games, and this game runs badly on £1500 of Nvidia hardware while other games run fine. You're missing the point, brother. Or sister.

We're mad because companies spend more on marketing than on the game. Yeah, mad. You should be too.


----------



## Flames21891

Quote:


> Originally Posted by *Nightfallx*
> 
> Someone's mad. I have 1x 780 and I feel the game performs really well.


I agree. My SLI 680's are doing great. My only complaint is that the framerate will dip into the 30's if I look in a particular direction in some parts of the city for unknown reasons, but everywhere else it's 60+ FPS. Of course, I have everything but AA cranked all the way up, so I should probably play with settings a bit and see if I can alleviate that.

I've actually had zero issues with stuttering so far, just occasional framerate dips as I already mentioned. Even then it never drops below playable so I'm pretty okay with it.

EDIT:
Quote:


> Originally Posted by *Seallone*
> Good for you, but some people like to play at high res; that's the reason for PCs. I've got 100+ games, and this game runs badly on £1500 of Nvidia hardware while other games run fine. You're missing the point, brother. Or sister.
> 
> We're mad because companies spend more on marketing than on the game. Yeah, mad. You should be too.


When playing at a very high resolution, you're putting much more strain on your hardware; it's a risk you run that for some games you'll have to turn settings down. In this case, you may actually be running out of VRAM. Even with Nvidia's new drivers, the VRAM usage is still obnoxiously high for how the game looks. If you're trying to run the game at or above 1440p, that's likely your problem.

Yeah, I agree it really shouldn't happen, but it could be teething problems with the engine (Watch_Dogs is built on a brand-new engine, after all) that they may be able to sort out with future updates. We'll just have to see.


----------



## givmedew

The game is unplayable on all of my NVIDIA setups at the moment.

I just installed the leaked 14.6 AMD beta, and it makes a huge difference. The game runs very well on my R9 290X and on my HD 6950 with unlocked shaders.


----------



## NinjaToast

Quote:


> Originally Posted by *givmedew*
> 
> The game is unplayable on all of my NVIDIA setups at the moment.
> 
> I just installed the 14.6 AMD beta *that is leaked* and it makes a huge difference. The game runs very well on my R9 290X and on my HD 6950 with unlocked shaders.


It's not leaked anymore, and hasn't been for hours; it's available on AMD's site right now.

Glad to hear the new drivers are working for you, though!


----------



## KenjiS

Probably, again, as I stated: VRAM!

3GB seems to be the bare minimum for High or Ultra settings; 2GB is incredibly optimistic, and the stuttering is being caused by having to reload textures from HDDs...

The people having problems with "better" GPUs are likely people who don't have 4GB of VRAM. So far, from what I've seen, if you have 4GB of VRAM the game runs great.
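For a sense of scale on the reload theory (the throughput figures below are typical ballparks I'm assuming, not measurements from this game):

```python
# Ballpark for the "reloading textures from disk" theory: pulling even
# one ~20 MiB texture mid-game costs several 60 fps frames' worth of
# time over a HDD. Throughput numbers are typical ballparks, not measured.

def fetch_ms(size_mib, throughput_mib_per_s):
    """Milliseconds to read `size_mib` at a sustained throughput."""
    return size_mib / throughput_mib_per_s * 1000

frame_budget_ms = 1000 / 60
print(f"frame budget at 60 fps: {frame_budget_ms:.1f} ms")  # 16.7 ms
print(f"HDD (150 MiB/s): {fetch_ms(20, 150):.0f} ms")       # 133 ms
print(f"SSD (500 MiB/s): {fetch_ms(20, 500):.0f} ms")       # 40 ms
```

Either way the fetch blows past a single frame's budget, which would show up as exactly the kind of hitching people describe unless the engine streams far enough ahead.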


----------



## Silent Scone

Playing at 1440p, 2x MSAA, Ultra with 'High' textures: stuttering, but only once in a while... Ultra textures are a stutter fest...

Frame rates in the 90s to 130s. Spec in sig.


----------



## Silent Scone

Quote:


> Originally Posted by *KenjiS*
> 
> Probably, again, as I stated: VRAM!
> 
> 3GB seems to be the bare minimum for High or Ultra settings; 2GB is incredibly optimistic, and the stuttering is being caused by having to reload textures from HDDs...
> 
> The people having problems with "better" GPUs are likely people who don't have 4GB of VRAM. So far, from what I've seen, if you have 4GB of VRAM the game runs great.


Nope, it's poor for Titan owners too.


----------



## Flames21891

Quote:


> Originally Posted by *KenjiS*
> 
> Probably, again, as I stated: VRAM!
> 
> 3GB seems to be the bare minimum for High or Ultra settings; 2GB is incredibly optimistic, and the stuttering is being caused by having to reload textures from HDDs...
> 
> The people having problems with "better" GPUs are likely people who don't have 4GB of VRAM. So far, from what I've seen, if you have 4GB of VRAM the game runs great.


Depends on the resolution too, I think. I'm getting no stuttering on 2GB 680's @ 1080p, so I would assume that anyone else gaming at that resolution should basically be fine unless their card is pretty outdated.

A lot of people are complaining about stuttering and mentioning what card they have, but not what resolution they're trying to run the game at. If 780 owners @ 1080p are having issues, I would think it's got to be something else. Either that, or my GPUs were somehow blessed by the silicon gods, lol.


----------



## EVGA-JacobF

Quote:


> Originally Posted by *KenjiS*
> 
> Consider the design of the PS4 and Xbox One however, Both have unified memory architechtures with 8gb of Ram in total. Even assuming overhead its a simple matter to allocate 3gb to VRAM, Even 4 is technically possible (Watch Dogs on my PC is taking about 2gb of system RAM, Assuming a console uses less system RAM for assets..)
> 
> My GUESS here is that on these consoles you dont compress the textures, This saves processing overhead at the cost of RAM, but you have a lot of RAM to work with so its a good tradeoff, On the PC you have large processing reserves but most GPUs have 2-3gb of ram, So you would be better off compressing textures to save RAM and use the processing overhead to decompress them
> 
> Not sure this is technically correct tho, So dont take it as word of god, just seems like a logical explanation


The textures are likely still compressed, but perhaps not as heavily, and the texture resolution is likely significantly higher as well. Remember, compared to the PS3, the PS4 has something like 16X the available memory at MINIMUM (since the memory is shared). It is only natural that games will take advantage of this, and on the PC the minimum requirements will increase dramatically. As the old consoles fade away, 3GB-4GB will likely become the new requirement for the highest detail settings.

Wolfenstein is a similar story as well.
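For what it's worth, the 16X figure checks out if you count both of the PS3's published memory pools against the PS4's unified pool:

```python
# Sanity check on the "16X the available memory" figure: the PS3 had
# 256 MB of XDR system RAM plus 256 MB of GDDR3 video RAM, while the
# PS4 exposes a unified 8 GB GDDR5 pool.
ps3_total_mb = 256 + 256
ps4_total_mb = 8 * 1024
print(ps4_total_mb // ps3_total_mb)  # 16
```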


----------



## KenjiS

Quote:


> Originally Posted by *Flames21891*
> 
> Depends on the resolution too, I think. I'm getting no stuttering on 2GB 680's @ 1080p, so I would assume that anyone else gaming at that resolution should basically be fine unless their card's pretty outdated.
> 
> A lot of people are complaining about stuttering and mentioning what card they have, but not at what resolution they're trying to run the game at. If 780 owners @ 1080p are having issues, it's gotta be something else I would think. Either that or my GPU's were somehow blessed by the silicon gods, lol.


I would love to try to see if I can play it at 1080p, but since Uplay is pitching fits, I can't... And that's going to look terrible given I have a 1440p monitor (my "solution," if it comes to that, is to play WD on my TV, where I can hopefully run it at 1080p High/Ultra).

Or it's a poor design choice. I don't remember where I explained it, but the fact that it's chewing up so much VRAM makes sense for the PS4 and Xbox One, which have large reserves of VRAM; PCs generally don't...


----------



## KenjiS

Quote:


> Originally Posted by *EVGA-JacobF*
> 
> The textures are likely still compressed, but perhaps not as heavily. Texture resolution is likely significantly higher as well. Remember compared to PS3 the PS4 has like 16X the available memory MINIMUM (since the memory is shared). It is only natural that games will take advantage of this and on the PC the min requirements will increase dramatically. As the old consoles fade away 3GB-4GB will likely become the new requirement for highest detail settings.
> 
> Wolfenstein is a similar story as well.


But at least I could play Wolfenstein on Ultra and get 50-60 FPS, on a single 770, in a game that didn't support SLI... and wasn't backed by Nvidia...


----------



## givmedew

Quote:


> Originally Posted by *KenjiS*
> 
> Probably, again, as I stated: VRAM!
> 
> 3GB seems to be the bare minimum for High or Ultra settings; 2GB is incredibly optimistic, and the stuttering is being caused by having to reload textures from HDDs...
> 
> The people having problems with "better" GPUs are likely people who don't have 4GB of VRAM. So far, from what I've seen, if you have 4GB of VRAM the game runs great.


The stutters are not PURELY a VRAM issue, since it doesn't happen on my HD 6950 with unlocked shaders but does on my 2GB SLI setup, regardless of whether SLI is enabled or disabled.

I don't have a 3+GB NVIDIA card or I would try it out.

I suspect something else is going on; otherwise my AMD card would stutter as well. I even tried running the NVIDIA card on the second-lowest texture setting and it still stutters, every couple of seconds. It is horrible.

Hopefully NVIDIA can fix it, but I think it might end up being on Ubisoft, since the NVIDIA drivers are already made with Watch Dogs in mind.


----------



## EVGA-JacobF

Quote:


> Originally Posted by *KenjiS*
> 
> But at least I could play Wolfenstein on Ultra and get 50-60 FPS, on a single 770, in a game that didn't support SLI... and wasn't backed by Nvidia...


How did you enable it? In Wolfenstein, if you do not have a 3GB card you cannot even select the highest setting!

Also, TITANFALL is a similar situation... the Insane texture quality is stutter city on a 2GB card with a bit of MSAA. The Very High setting is no problem, though.


----------



## KenjiS

Quote:


> Originally Posted by *givmedew*
> 
> The stutters are not PURELY a VRAM issue, since it doesn't happen on my HD 6950 with unlocked shaders but does on my 2GB SLI setup, regardless of whether SLI is enabled or disabled.
> 
> I don't have a 3+GB NVIDIA card or I would try it out.
> 
> I suspect something else is going on; otherwise my AMD card would stutter as well. I even tried running the NVIDIA card on the second-lowest texture setting and it still stutters, every couple of seconds. It is horrible.
> 
> Hopefully NVIDIA can fix it, but I think it might end up being on Ubisoft, since the NVIDIA drivers are already made with Watch Dogs in mind.


I'll keep that in mind then. Sorry, it's just the best theory I have to work with at the moment.


----------



## KenjiS

Quote:


> Originally Posted by *EVGA-JacobF*
> 
> How did you enable it? In Wolfenstein, if you do not have a 3GB card you cannot even select the highest setting!
> 
> Also, TITANFALL is a similar situation... the Insane texture quality is stutter city on a 2GB card with a bit of MSAA. The Very High setting is no problem, though.


Derp. Just double-checked: I'm on High.

But I can't even get High working in WD; I have to go down to Medium.

I totally understand progress and such. I just wish I had realized 2GB wasn't enough before buying the second GTX 770...

-edit- I just loaded it up and tried to play at 1080p; it still stutters. This game needs patching badly. Going to shelve it for now and wait to hear if there's going to be a patch from Ubi or something.


----------



## Flames21891

Quote:


> Originally Posted by *KenjiS*
> 
> Derp. Just double-checked: I'm on High.
> 
> But I can't even get High working in WD; I have to go down to Medium.
> 
> I totally understand progress and such. I just wish I had realized 2GB wasn't enough before buying the second GTX 770...
> 
> -edit- I just loaded it up and tried to play at 1080p; it still stutters. This game needs patching badly. Going to shelve it for now and wait to hear if there's going to be a patch from Ubi or something.


That's so strange. Like I said, I have 2 680's, and get zero stutter @ 1080p. Wonder if it's some weird thing with 700 series cards?


----------



## KenjiS

Quote:


> Originally Posted by *Flames21891*
> 
> That's so strange. Like I said, I have 2 680's, and get zero stutter @ 1080p. Wonder if it's some weird thing with 700 series cards?


I do not know... I'm spent on trying to get it working at this point.

Unless it's that you're running Win 7; Win 8 does consume a bit of VRAM.


----------



## MonarchX

Quote:


> Originally Posted by *reklaw75*
> 
> Reporting in on how the game runs on AMD GPU / Intel CPU.
> 
> Resolution: 1920 x 1080
> CPU: 4770K @ 4.5 Ghz
> GPU: 290x @ 1000 (40 mhz underclocked)
> RAM: 32 Gig @ 2400
> 
> Settings: Ultra
> AA: Temporal AA (Havent tried anything else yet)
> 
> Running on a SSD and without the 14.6 AMD Drivers (on 14.4 WHQL) the game
> last night was running beautifully in all areas EXCEPT for driving which would
> do a very occasional hiccup. Was using the *-nopagefilecheck option*.
> 
> Thinking about doing a run tonight with the game on a ram drive and reporting back.
> Will also try out different AA modes and the new 14.6 driver.
> 
> Seems the key is having loads of RAM, the only thing i could ask of the game at the moment
> is the driving to not stutter occasionaly, however until ive tried the 14.6 I can't complain.


How do I use that option and what does it do? Is it just something to add to the end of the .exe? Does it help?


----------



## MonarchX

Quote:


> Originally Posted by *EVGA-JacobF*
> 
> The textures are likely still compressed, but perhaps not as heavily. Texture resolution is likely significantly higher as well. Remember compared to PS3 the PS4 has like 16X the available memory MINIMUM (since the memory is shared). It is only natural that games will take advantage of this and on the PC the min requirements will increase dramatically. As the old consoles fade away 3GB-4GB will likely become the new requirement for highest detail settings.
> 
> Wolfenstein is a similar story as well.


It had better stay a 3GB requirement for a while, or else all GTX 780 and GTX 780 Ti 3GB owners are screwed; these are top-end cards worth $500-$750. However, VRAM capacity is NOT the issue, or else 4GB card owners wouldn't be complaining about identical stutters.


----------



## MonarchX

Quote:


> Originally Posted by *Flames21891*
> 
> That's so strange. Like I said, I have 2 680's, and get zero stutter @ 1080p. Wonder if it's some weird thing with 700 series cards?


OMG, I had the same thing going on when I had my GTX 680. Deus Ex: HR would run fine on it, but if I tried a GTX 770 I would get stutters, even though both cards were nearly identical. There is something strange going on.


----------



## degenn

Averaging 60-80fps (smooth, no stutter) completely maxed out Ultra everything @ 2560x1600 and 2x MSAA on my system:

RIVBE w/ 4930k @ 4.4ghz
16gb ddr3 2666mhz trident-x
SLI Titan Blacks
Samsung 840 Pro SSD

Using any more MSAA results in framerates under 60fps -- using any other method of AA results in noticeable blurring of the textures. MSAA is much sharper.

I use just over 4GB of VRAM @ 2x MSAA with the above settings. I tried 8x MSAA and my VRAM usage shot up to the full 6GB available on my Titans, and framerates took a dive into the mid-30s.

Enjoying the game so far.
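That jump from 2x to 8x MSAA tracks with how multisampled render targets behave: their memory grows linearly with the sample count. A back-of-the-envelope sketch (assuming a simple RGBA8 color + 32-bit depth/stencil layout; real engines keep many more buffers, so treat these as lower bounds):

```python
# MSAA color and depth targets store one value per subsample, so their
# footprint grows linearly with the sample count. Assumed layout: 4-byte
# RGBA8 color + 4-byte depth/stencil per sample (a simplification).

def msaa_target_mib(width, height, samples, bytes_per_sample=8):
    """MiB for a multisampled color + depth target pair."""
    return width * height * samples * bytes_per_sample / 2**20

for s in (1, 2, 8):
    print(f"{s}x @ 2560x1600: {msaa_target_mib(2560, 1600, s):.2f} MiB")
```

The render targets alone aren't the whole 2 GB swing, but once every intermediate buffer in the frame scales the same way, going from 2x to 8x multiplies a big chunk of the frame's working set by four.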


----------



## Rangerscott

Hope it's not VRAM. I thought I had just upgraded and future-proofed myself by getting a 780.


----------



## DADDYDC650

Quote:


> Originally Posted by *Rangerscott*
> 
> Hope it's not VRAM. I thought I had just upgraded and future-proofed myself by getting a 780.


It's a VRAM issue, and more titles will need at least 3GB in the near future.

Funny how some folks were claiming that 3GB would be enough for a while...


----------



## majin662

I've pretty much tried all the current "tweaks" that are floating around. None have completely gotten rid of the stutter while driving fast. Not the disable-this-and-that tweaks, not borderless, not buffered frames at 3, 1, or 5. Not dropping to 1080p. Not lowering textures, or textures and shadows... and so on.

I'm sure it'll be sorted out eventually. It's not completely VRAM and all that jazz; no combo of magic tweaks has completely gotten rid of it for me. Some have appeared to reduce it, but that's it.

I'm still enjoying myself and honestly think this game looks pretty dang good. No sense getting caught in the "tweaking" trap, IMO.

edit: Hilariously (in my opinion), GeForce Experience is finally detecting my game, and wouldn't you know, it's telling me to run the exact same settings I've been using, including 2560x1440. For some reason I find that funny.


----------



## KenjiS

Quote:


> Originally Posted by *MonarchX*
> 
> OMG - I had the same thing going on when I had my GTX 680. Deus Ex HR would run fine on it, but if I were to try GTX 770 I would get stutters, even though both cards were nearly identical. There is something strange going on.


I will load up WD on my laptop later (GTX 680M + SSDs).

At least I might be able to rule out a few things that way.


----------



## Marc79

I'm going to try the game soon and check how it runs on my setup. I will say this: I played the game at my friend's house yesterday on 2x 770's (2GB) @ 1440p (4770K), and to play with no stutters I had to lower textures to High, settings to Medium, and turn AA completely off; otherwise it was unplayable, with major stutters.


----------



## Brianmz

I keep getting the stuttering as well if I play with textures at Ultra at 1440p. I lowered them to High, with everything else maxed out and temporal SMAA, and the stuttering is mostly gone. Kind of disappointing, tbh.


----------



## eternal7trance

I keep getting stuttering no matter what setting I put it at.


----------



## anthonyg45157

Quote:


> Originally Posted by *Brianmz*
> 
> I keep getting the stuttering as well if I play with textures at Ultra at 1440p. I lowered them to High, with everything else maxed out and temporal SMAA, and the stuttering is mostly gone. Kind of disappointing, tbh.


Same here: 1440p, everything Ultra, HBAO+ High, textures on High instead of Ultra, and FXAA. It's decently smooth, but Ultra textures do look quite a bit better.

I find it smooth enough to play, but it's still disappointing. I'm using a heavily overclocked 780 Lightning (1320MHz) and my FPS is between 40 and 60+, hovering in the 50 range most often.

I decided to turn Fraps off because I end up watching the FPS counter too often, LOL.

I'm impressed with some aspects of the graphics but disappointed in others. The game looks much better at night than during the day. Water and reflections are amazing.


----------



## y2kcamaross

Why do you all think it's a VRAM issue? It's just an Ultra texture setting issue, IMO. I get the same stutters with my [email protected] with everything on Ultra and no AA as I do with my [email protected]; if I set the textures to High or Medium, the problem goes away.


----------



## KenjiS

Mine won't even do High... mine won't even do 1080p on High.


----------



## anthonyg45157

Quote:


> Originally Posted by *KenjiS*
> 
> Mine won't even do High... mine won't even do 1080p on High.


Not cool


----------



## KenjiS

Quote:


> Originally Posted by *anthonyg45157*
> 
> Not cool


Indeed... I'll go mess with it more, I guess... Maybe try Nvidia's "recommended" settings in GeForce Experience and see where that gets me.


----------



## StrongForce

Does the stutter happen only in areas with a big/long view distance and lots of NPCs on screen?

I find it amazing that this game is already limiting 3GB cards, though. So I sent my 770 back (thanks to this whole thread, lol) and am gonna wait a few weeks; if there's no word on the 800 series (yes, I'm hoping for that!), then I might just get a 780 6GB. That thing has got to be future-proof at 1080p. And I found a spot where I can apparently get it with Watch Dogs bundled! Yeah!

Also, if anyone wants to have some fun, a friend showed me this Watch Dogs prank: https://www.youtube.com/watch?v=GLyaqXaSsCA

A bit dumb as he's not very stealthy, but fun, lol.


----------



## ChronoBodi

Yeah, pretty much only the 4GB-6GB cards are having any luck avoiding VRAM issues. It's 4.5GB of VRAM used at 4K with no AA, never mind the higher AA settings.

Thank the new consoles' 8GB of RAM for this reality, especially the PS4's GDDR5.

More likely, 4GB will become the new standard, since they have to split the 8GB pool between system RAM and video RAM.


----------



## iamhollywood5

The game still stutters like crazy, even with the game-side update and new Nvidia drivers, and with settings lowered to High textures, no AA, and 1080p. I freaking knew it: Far Cry 3 all over again (which STILL stutters like hell to this day). Except this game somehow stutters even worse than FC3.

Done with Ubisoft.


----------



## KenjiS

I managed to get the stuttering down by turning off the motion blur and depth of field effects. I have it sort of going at 2560x1440, High, temporal SMAA... I think I might have it playable, though the controls are terrible and my controller needs charging.

The game ITSELF is giving me a good 50-80 FPS... just the stutters and drops are terrible.

And FYI, I dropped to Medium and the game STILL stuttered unplayably... 1920x1080, still stuttered unplayably...

I wonder if it's a depth of field/motion blur issue?


----------



## bencher

Quote:


> Originally Posted by *iamhollywood5*
> 
> Game still stutters like crazy even with game-side update and new Nvidia drivers, and settings lowered to high textures, no AA, and 1080p. I freaking knew it, Far Cry 3 all over again (which STILL stutters like hell to this day). Except this game somehow stutters even worse than FC3.
> 
> Done with Ubisoft.


And whats your vram usage?


----------



## TopicClocker

Quote:


> Originally Posted by *KenjiS*
> 
> I managed to get the stuttering down by turning off the motion blur and depth of field effects. I have it sort of going at 2560x1440, High, temporal SMAA... I think I might have it playable, though the controls are terrible and my controller needs charging.
> 
> The game ITSELF is giving me a good 50-80 FPS... just the stutters and drops are terrible.
> 
> And FYI, I dropped to Medium and the game STILL stuttered unplayably... 1920x1080, still stuttered unplayably...
> 
> I wonder if it's a depth of field/motion blur issue?


I'll look into the DOF and motion blur. That's an interesting experience on your end; it's a shame so many are having trouble with this game.


----------



## KenjiS

Quote:


> Originally Posted by *TopicClocker*
> 
> I'll look into the DOF and Motion Blur, that's an interesting experience on your end, it's a shame so many are having troubles with this game.


It's not completely fixed; it just sort of increased the periods between stutters. It stutters for a bit in a new area and then smooths out nicely, whereas before it just kept stuttering and wouldn't stop...

Hope it maybe helps someone out there, though.


----------



## iamhollywood5

Quote:


> Originally Posted by *bencher*
> 
> And whats your vram usage?


2.6-2.8GB tops (out of 3GB). I'm not running out of VRAM; the game is just horribly optimized.


----------



## feedtheducks

Can't even play 15 seconds of the game without stuttering. Ubisoft!!! *shakes fist* My settings are maxed out, minus motion blur, at 1440p. 4770K + GTX 780.


----------



## DADDYDC650

I have minor stuttering only when driving through the city, but nothing that affects gameplay. Otherwise the game runs great at 1440p, Ultra/max settings, temporal SMAA, and V-Sync enabled.


----------



## dboythagr8

I can't even start a game, because "Watch Dogs has stopped working" pops up when I click New Game and it starts loading.

So trash. I might be done with Ubi.

Faulting application name: Watch_Dogs.exe, version: 0.1.0.1, time stamp: 0x537507a1
Faulting module name: Disrupt_b64.dll, version: 0.0.0.0, time stamp: 0x5375077c
Exception code: 0xc0000005
Fault offset: 0x000000000246791d
Faulting process id: 0x1670
Faulting application start time: 0x01cf7a2b119ea592
Faulting module path: E:\Uplay\Ubisoft Game Launcher\games\Watch_Dogs\bin\Disrupt_b64.dll
Report Id: ad58dc65-e61e-11e3-be81-5404a63f554c
Faulting package full name:
Faulting package-relative application ID:


----------



## NinjaToast

^ I had the Disrupt .dll crash last night; what fixed it for me was turning down settings until I got past the first mission.


----------



## Emu105

Just found how to make this game look 10x better

Quote:


> *#3 - Graphics and Performance Fixes You Should Try*
> If you have a high-end PC and want to avoid graphics and performance issues before Ubisoft fixes the game, you should run the game with AA turned off, GPU Rendering set to 1, and Vsync turned off.
> 
> Moreover, using the Nvidia Control Panel, you should enable FXAA, set Anisotropic Filtering to 16X, enable AA Mode Override, set AA Settings to 32X CSAA, set AA Transparency to 8X, set Max Pre-Rendered Frames to 1 (optional), set Multi Display to 1, select Prefer Maximum Performance, and turn off Texture Filtering.
> 
> Make sure you have Clamp and Threaded Optimization on. Test turning Vsync on and off to see what reduces stuttering, and set it based on that. AMD graphics users can replicate similar settings in Catalyst Control Center.


The game looks freaking amazing!! It still stutters, but I can live with stuttering when it looks this good!!

http://segmentnext.com/2014/05/27/watch-dogs-crashes-fps-drops-stuttering-frame-skipping-errors-freezes-and-fixes/


----------



## th3illusiveman

I can run everything on Ultra except textures and shadows and it's smooth. Turn on Ultra textures and my card can't keep up :/ So much for the VRAM benefit... need another 7970.


----------



## givmedew

anyone else have the game crash randomly?


----------



## Flames21891

So I think I figured it out: SLI.

I just discovered that I've been playing this whole time with SLI disabled (I need to keep better track of this stuff) and, for the sake of science, I enabled it and ran the game again. Long story short, it's stutter city with SLI enabled.

Don't have an AMD rig to test CrossFire, but if you're running a multi-GPU rig and experiencing horrible stuttering, my suggestion would be to play it with a single card. My single 680 runs at 60+ fps most of the time, so all those with 780(Ti)s should still be fine on a single GPU.

Now the question is: Does Nvidia need to fix their SLI profile, or does Ubisoft need to fix their SLI support?


----------



## DADDYDC650

Quote:


> Originally Posted by *givmedew*
> 
> anyone else have the game crash randomly?


I'm in the middle of act 2. Not a single crash.


----------



## KenjiS

Quote:


> Originally Posted by *Flames21891*
> 
> So I think I figured it out: SLI.
> 
> I just discovered that I've been playing this whole time with SLI disabled (I need to keep better track of this stuff) and, for the sake of science, I enabled it and ran the game again. Long story short, it's stutter city with SLI enabled.
> 
> Don't have an AMD rig to test CrossFire, but if you're running a multi-GPU rig and experiencing horrible stuttering, my suggestion would be to play it with a single card. My single 680 runs at 60+ fps most of the time, so all those with 780(Ti)s should still be fine on a single GPU.
> 
> Now the question is: Does Nvidia need to fix their SLI profile, or does Ubisoft need to fix their SLI support?


Huh. Funny. Stuttering isn't completely eliminated on my rig, but it does seem to change the nature of it... perhaps there's an issue with AFR?

The game is giving me 35-40 FPS on a single card. Playable enough.


----------



## Silent Scone

Quote:


> Originally Posted by *Flames21891*
> 
> So I think I figured it out: SLI.
> 
> I just discovered that I've been playing this whole time with SLI disabled (I need to keep better track of this stuff) and, for the sake of science, I enabled it and ran the game again. Long story short, it's stutter city with SLI enabled.
> 
> Don't have an AMD rig to test CrossFire, but if you're running a multi-GPU rig and experiencing horrible stuttering, my suggestion would be to play it with a single card. My single 680 runs at 60+ fps most of the time, so all those with 780(Ti)s should still be fine on a single GPU.
> 
> Now the question is: Does Nvidia need to fix their SLI profile, or does Ubisoft need to fix their SLI support?


So basically, it's JUST like AC4? That has TERRIBLE stuttering with SLI, and it still isn't fixed!!!!


----------



## Devnant

Quote:


> Originally Posted by *Flames21891*
> 
> So I think I figured it out: SLI.
> 
> I just discovered that I've been playing this whole time with SLI disabled (I need to keep better track of this stuff) and, for the sake of science, I enabled it and ran the game again. Long story short, it's stutter city with SLI enabled.
> 
> Don't have an AMD rig to test CrossFire, but if you're running a multi-GPU rig and experiencing horrible stuttering, my suggestion would be to play it with a single card. My single 680 runs at 60+ fps most of the time, so all those with 780(Ti)s should still be fine on a single GPU.
> 
> Now the question is: Does Nvidia need to fix their SLI profile, or does Ubisoft need to fix their SLI support?


I've tested it. Playing without SLI I get about 40 FPS average, and when the frames dip into the 30s while driving, the experience is still smooth. With SLI on, my FPS is in the 70-80s most of the time, but there are still occasional huge drops to 30 FPS here and there, and hence stuttering.

It seems SLI doesn't improve minimum FPS.


----------



## Murlocke

Quote:


> Originally Posted by *Devnant*
> 
> I've tested it. Playing without SLI I get about 40 FPS average, and when the frames dip into the 30s while driving, the experience is still smooth. With SLI on, my FPS is in the 70-80s most of the time, but there are still occasional huge drops to 30 FPS here and there, and hence stuttering.
> 
> It seems SLI doesn't improve minimum FPS.


Same here, the game is better without SLI. I just finished packaging up my 2nd 780 Ti and sending it back to Amazon. I see no reason to keep two when 3GB of VRAM is going to be the limit in "next gen" games.

One thing I noticed is that with SLI you can get triple buffering with vsync, so you can get between 30-60 FPS. With a single GPU it's either 30 or 60, even with pre-buffered frames set to 3.

No idea why; it should work.


----------



## revro

Yep. In my country I can get a 4GB R9 290 Windforce for 350eur, a 290X for 450eur, and an EVGA 780 6GB for 550-570eur, and by now the 780 is at the end of its life. So I'd rather wait for the 880, or the 880 Ti 3 months later.

Who knows how much VRAM they'll give it. It seems like 6GB is a must, since we are going to get these PS4/XOne ports that can't live without their 6-8GB...


----------



## Devnant

Quote:


> Originally Posted by *Murlocke*
> 
> Same here, the game is better without SLI. I just got done packaging my 2nd 780 Ti up and sending it back to Amazon. I see no reason to keep 2 when 3GB of VRAM is going to be the limit in "next gen" games.
> 
> The one thing I noticed is with SLI you can get triple buffering with vsync, so you can get between 30-60FPS. However, with a single GPU It's either 30 or 60... even with pre-buffered frames set to 3.
> 
> No idea why, it should work.


Dunno if it helps, but I'm running @ 1440p TXAAx2 with SLI TITANS, [email protected] (HT enabled), and 16GB RAM. The game eats about 4.5GB of VRAM, uses about 70% of each GPU, 50% of each CPU thread, and 5.5GB of system RAM, but there's still stuttering.

Disabling the pagefile helps a bit, but then the game crashes during intense driving.

I think a few patches could help things, but considering we are dealing with Ubisoft, I wouldn't get your hopes up.


----------



## Nightfallx

Quote:


> Originally Posted by *KenjiS*
> 
> Derp, Just double checked, Im on High.
> 
> But I cant even get High working on WD
> 
> 
> 
> 
> 
> 
> 
> I have to go down to Medium
> 
> I totally understand progress and such. I just wish I had realized "yeah 2gb wasnt enough" before buying the second GTX 770...
> 
> -edit- and I just loaded it up and tried to play it at 1080p, Still stutters, This game needs patching badly. Going to shelve it for now and wait to hear if theres going to be a patch from ubi or something


sell your 770, I sold mine a week ago and only lost like $20


----------



## yusupov

Quote:


> Originally Posted by *Emu105*
> 
> Just found how to make this game look 10x better
> 
> Game looks freaking amazing!! It still stutters but i can live with it stuttering when it looks this good!!
> 
> http://segmentnext.com/2014/05/27/watch-dogs-crashes-fps-drops-stuttering-frame-skipping-errors-freezes-and-fixes/


???
This looks awful. The override doesn't do anything and the aliasing is atrocious.


----------



## TheBlindDeafMute

If you really think Ubisoft is going to care about a couple of NVIDIA and AMD fanboys whining about how the game won't run on their SLI setups, they won't. Consoles will sell in the millions; PC, realistically, they couldn't care less about. On my SLI rig, the game runs OK at best.


----------



## Clazman55

I just want to corroborate the SLI issue. The game stutters with SLI enabled; no stutter with SLI set to off or in Activate All Monitors mode.

I have another test I want to talk about but can't, as it's against OCN rules. IM me if you want to know; it's rather interesting.


----------



## anthonyg45157

Quote:


> Originally Posted by *yusupov*
> 
> ???
> this looks awful. the override doesnt do anything & the aliasing is atrocious.


Yeah, I tried what he said and it didn't look any better, at least at first glance. I'll take some screenshots to find out.

If you want it to look better, use SweetFX. I'm happy with the results I got with it.


----------



## Silent Scone

Quote:


> Originally Posted by *TheBlindDeafMute*
> 
> if you really think Ubisoft is going to care about a couple of NVIDIA and AMD fanboys whining about how the game won't run on their SLI setups, they won't. Consoles will sell in the millions PC realistically they could care less about. On my sli rig, game runs ok at best.


Wow, hi, you must be that guy who's barely out of nappies who still thinks only 5 people have multi-GPU setups...

Very silly way of looking at it. If people don't report issues, nothing gets fixed. There are many users with SLI / CrossFire setups.


----------



## zacker

Ubisoft doesn't care about PC gamers. The game is so unoptimized for PC that they don't even enable anisotropic filtering x16; it's probably x2 in game. All the anti-aliasing options look bad too. Seriously, I tried TXAA x4 on a 24" monitor at 1920x1080 and it looks weird to me. This game gets a 5/10; even GTA IV is better. Don't waste your precious money on this, buy something else.


----------



## DF is BUSY

Finally got around to finding the optimal levels for my graphical settings.

(settings screenshot) ^ sweet spot for my rig, averaging 50-60 fps (adaptive vsync on)

Now I have to find a proper AA level and I'm good to go!


----------



## TheBlindDeafMute

Quote:


> Originally Posted by *Silent Scone*
> 
> Wow hi you must be that guy who's barely out of nappies who still thinks only 5 people have multi GPU setups...
> 
> Very silly way of looking at it. If people don't report issues, nothing would get fixed. There are many users with SLI / Crossfire setups.


Very silly way of reading my post. I asked if *Ubisoft* is going to care, and the answer is no. NVIDIA and AMD will care, and will release drivers to fix the issue. Judging from their past history with Assassin's Creed and the like, I'm betting it's a while before we see a patch from Ubi, if at all. Please don't assume to know my level of insight on SLI setups, and take the time to read what is typed.


----------



## cstkl1

It's a real pity. With both cards and everything turned up (8x MSAA, custom Deferred FX, Rain Render, Alpha-something, 16x AF forced, Texture Quality set to High), this game is beautiful.

But it stutters....


----------



## CaptainZombie

I ran the game for about an hour this morning after it d/l last night and here are my settings. Luckily I have not run into any screen tearing issues, massive framerate drops, or really bad slowdown. I was worried that my setup would not be able to run this game at all.

i5 2500K @ 3.3GHz
770 @ stock
8GB RAM 1600mhz
Installed on Evo 120GB SSD
50" Plasma

- Resolution 1920 x 1080
- Refresh rate 60hz
- Fullscreen
- Vsync "1"
- GPU max buffered frames "3"
- Textures "High"
- Anti-aliasing "MSAA4x"
- Level of detail "Ultra"
- Shadows "High"
- Reflections "High"
- AO "HBAO+Low"
- Motion blur "On"
- Depth of field "On"
- Water "Ultra"
- Shader "High"

Any recommendations on what you guys think I can bump up a bit with my setup? Going to tinker around with this a bit more later tonight. I was thinking of maybe OC my 2500K or OC the 770 to see if that does much. I am pretty happy with the results, but wouldn't mind to try and squeeze out some more performance from my rig.


----------



## yusupov

Ugh, turn that motion blur off, DF. There's some frames AND better quality to be had.

I'm also still getting the big frame drops, but I have everything cranked... it does indeed look gorgeous, enough so that I'm just enduring it and hoping it gets patched out very soon.

@zombie, I've seen it said that 16x AF forced in the NVIDIA CP looks better... I'm not so sure, actually; gonna give it another try myself. I don't really see much that's bump-uppable other than AO? As said above, I'd turn off motion blur, but that's personal preference.


----------



## Silent Scone

Quote:


> Originally Posted by *yusupov*
> 
> ugh, turn that motion blur off DF. theres some frames AND better quality.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> im also still getting the big frame drops but i have everything cranked...it does indeed look gorgeous, enough so that im just enduring it & hoping it gets patched out very soon.
> 
> @zombie, ive seen it said that AF x16 forced in nvidia CP looks better...im not so sure actually, gonna give it another try myself. i don't really see much thats bump-uppable other than AO? as said above i'd turn off motion blur but thats personal preference.


I bumped AF in the control panel; it definitely looks better here.

There are certain times of day / weather / locations that really do have the lighting spot on. It's just a shame it's plagued with issues at the moment. Oh, and I know water is always a talking point when it's of no real consequence, but does it look good or what!


----------



## DF is BUSY

Quote:


> Originally Posted by *CaptainZombie*
> 
> I ran the game for about an hour this morning after it d/l last night and here are my settings. Luckily I have not run into any screen tearing issues, massive framerate drops, or really bad slowdown. I was worried that my setup would not be able to run this game at all.
> 
> i5 [email protected] 3.3ghz
> 770 @ stock
> 8GB RAM 1600mhz
> Installed on Evo 120GB SSD
> 50" Plasma
> 
> - Resolution 1920 x 1080
> - Refresh rate 60hz
> - Fullscreen
> - Vsync "1"
> - GPU max buffered frames "3"
> - Textures "High"
> - Anti-aliasing "MSAA4x"
> - Level of detail "Ultra"
> - Shadows "High"
> - Reflections "High"
> - AO "HBAO+Low"
> - Motion blur "On"
> - Depth of field "On"
> - Water "Ultra"
> - Shader "High"
> 
> Any recommendations on what you guys think I can bump up a bit with my setup? Going to tinker around with this a bit more later tonight. I was thinking of maybe OC my 2500K or OC the 770 to see if that does much. I am pretty happy with the results, but wouldn't mind to try and squeeze out some more performance from my rig.


- turn off in-game vsync

- msaa4x is a real killer, but if you're getting the fps you want then it should be okay

- I've read that changing GPU max buffered frames to "1" eliminates stuttering and improves overall smoothness
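For anyone wondering why that last setting matters: "GPU max buffered frames" is the render-ahead (flip) queue, and the deeper the queue, the more frames of input latency you carry. A toy model of the trade-off (purely illustrative; this is not how the driver is literally implemented):

```python
# Toy model of the render-ahead (flip) queue: the CPU may submit up to
# `queue_depth` frames before the GPU displays them, so input sampled for a
# frame shows up on screen roughly `queue_depth` frame-times later.
def input_latency_ms(queue_depth, frame_time_ms):
    """Worst-case added latency from buffered frames (simplified model)."""
    return queue_depth * frame_time_ms

for depth in (1, 2, 3):
    # at 60 fps, one frame is ~16.7 ms
    print(f"queue depth {depth}: ~{input_latency_ms(depth, 16.7):.0f} ms added latency")
```

A depth of 1 minimizes latency but gives the GPU less slack to absorb CPU spikes, which may be why some setups feel smoother at 2 or 3.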


----------



## TopicClocker

Quote:


> Originally Posted by *Silent Scone*
> 
> I bumped AF in control panel, definitely looks better here
> 
> 
> 
> 
> 
> 
> 
> .
> 
> There are certain times of day / weather / locations that really do have the lighting spot on. It's just a shame it's plagued with issues at the moment. Oh, and I know water is always a talking point when it's of no real consequence, but does it look good or what!


For real, and it is a shame. The game has barely been out a day and people are already picking up pitchforks and threatening to sell their cards.
I'm sure this game just needs a patch or two to solve most of these problems. It's not like the console versions are perfect either; they're having troubles of their own.

All I can see is hate for this game and constant comparisons to GTA, when no one is really looking at the big picture of how it does things differently. It's not just a copy-and-paste job; it's also a new IP from Ubisoft, and it was never going to be perfect. GTA 5 isn't perfect either, and I could pick out plenty of flaws and disappointments if I wanted to. I'm not defending either Ubisoft or Rockstar, but some of the attitudes I've seen concerning this are pretty silly.

It's like everyone is suddenly siding with Rockstar. Rockstar refuses to mention any word of "PC" and makes PC gamers hold out for almost a year for PC versions of Grand Theft Auto. Where is GTA 5 PC? Rockstar released GTA 4, which also ran like crap (and still does on some rigs); did anybody forget that? Not to mention having to wait months for a PC port which had no anti-aliasing and lacked various graphical settings.









It's almost been a year with no announcement or confirmation of whether they're doing it or not, just keeping people waiting, and we all know tons of PC gamers (including myself) are going to run out and buy it when it's announced. It shows you how they treat their fans and customers; at least Ubisoft does PC versions in a decent amount of time. Who knows, maybe they'll do a revamped GTA 5 for the next-gen systems along with the PC version. They said they won't do one for the next-gen systems, but that would be a missed opportunity; consoles are more or less the benchmark, the PS4 is pushing that benchmark forward, and if they make a "next gen" version we could be looking forward to some good enhancements.

I can't think of a single game that was released in a perfect state, on console or PC, since the PS2/Xbox era, when there was no such thing (if I recall correctly) as updates or patches on consoles. Maybe there were for the original Xbox; the N64 would be a better example, though.


----------



## Leopard2lx

How do I take a screenshot in Watch Dogs?


----------



## Bitemarks and bloodstains

Use Afterburner.
http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


----------



## Silent Scone

Uplay overlay is f12 for screens too.


----------



## KenjiS

Quote:


> Originally Posted by *Nightfallx*
> 
> sell your 770, I sold mine a week ago and only lost like $20


I'd probably take the newer one and use it in the EVGA Step-Up, sell my original here, then go for something like a 780 Ti 6GB lol.

I dunno, I'll probably just hang onto it for the immediate future. I'd rather wait on the 800 series before doing anything.


----------



## ChronoBodi

- I set Max pre-rendered frames in the NV control panel to application-controlled
- then in game I set frames to render ahead to 2
- then I set vsync in game to off
- then I set vsync in the NV control panel to adaptive

Seriously guys, for my setup, THIS is a LOT smoother and it feels more like 60 FPS than it did before. Try this now!


----------



## CaptainZombie

Quote:


> Originally Posted by *DF is BUSY*
> 
> - turn off in-game vsync
> - msaa4x is a real killer, but if you're getting the fps you want then it should be okay
> - changing gpu max buffered frames to "1" eliminates stuttering and provides overall smoothness i've read


Thanks, I went ahead and made those changes and so far its looking good.

Is MSAA4x better than TXAA or the opposite?


----------



## DF is BUSY

Quote:


> Originally Posted by *CaptainZombie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DF is BUSY*
> 
> - turn off in-game vsync
> - msaa4x is a real killer, but if you're getting the fps you want then it should be okay
> - changing gpu max buffered frames to "1" eliminates stuttering and provides overall smoothness i've read
> 
> 
> 
> Thanks, I went ahead and made those changes and so far its looking good.
> 
> Is MSAA4x better than TXAA or the opposite?
Click to expand...

From what I recall, MSAA 4x hits harder (or slightly harder than TXAA 2x) but is more sharp/crisp. I think a lot of users are using SMAA or temporal SMAA and enjoying it.

TXAA is actually pretty nice to look at, if you don't mind the blur.


----------



## DF is BUSY

Quote:


> Originally Posted by *ChronoBodi*
> 
> - I set Max pre-rendered frames in the NV control panel to application-controlled
> - then in game I set the frames to render ahead at 2
> - then I set vsync in game to off
> - then I set vsync in NV control panel to adaptive
> 
> seriously guys, for my setup, THIS is a LOT smoother and it feels more like 60 FPS than it was before, try this now!


yup, this is ideal.

you can set the render ahead to 1 even.


----------



## dboythagr8

Quote:


> Originally Posted by *NinjaToast*
> 
> ^ I had the disrupt dll crash last night, what fixed it for me was turning down settings till I got past the first mission.


I re-downloaded the game overnight, and this morning it worked.

Abysmal performance for me at 4K, though... then again, I'm trying to brute-force it @ Ultra, which is too much for 3GB of VRAM. But others said it's not, so I dunno... I will say that, watching my Precision X monitoring, it seems like the stuttering is not related to VRAM and has more to do with the engine itself...
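For a rough sense of why 4K plus MSAA blows past 3GB so quickly, here is a back-of-the-envelope calculation of render-target memory alone. All numbers are illustrative assumptions (4 bytes per pixel, a hypothetical 4-target G-buffer); real engines add shadow maps and texture-streaming pools on top of this:

```python
# Rough, illustrative VRAM math for screen-sized render targets (not textures).
# Assumes 4 bytes per pixel; a deferred renderer also keeps several G-buffer
# targets resident. These are assumptions, not the game's actual layout.

def render_target_mb(width, height, msaa=1, targets=1, bytes_per_pixel=4):
    """Memory for `targets` screen-sized buffers at a given MSAA level."""
    return width * height * msaa * targets * bytes_per_pixel / 1024**2

w, h = 3840, 2160
color_4x = render_target_mb(w, h, msaa=4)        # 4x MSAA color buffer
depth_4x = render_target_mb(w, h, msaa=4)        # matching depth buffer
gbuffer  = render_target_mb(w, h, targets=4)     # hypothetical 4-target G-buffer
print(f"{color_4x + depth_4x + gbuffer:.0f} MB") # ~380 MB before any textures
```

So at 4K the framebuffers alone eat a meaningful slice of a 3GB card before a single texture is loaded, which is consistent with textures, not buffers, being the bulk of the reported usage.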


----------



## NavDigitalStorm

Here are my AMD benchmarks: http://www.digitalstormonline.com/unlocked/watch-dogs-benchmarks-amd-radeon-r9-1080p-and-4k-idnum281/


----------



## Meebsy

It must not like my GTX 680 SLI on Ultra, as it acts like an epileptic disco with huge stutters. Turning the textures down to High fixed the epileptic disco but not the stutters. Shame, I wanted those Ultra textures.


----------



## iamhollywood5

Quote:


> Originally Posted by *Meebsy*
> 
> It must not like my GTX680 SLI on Ultra as it acts like an epileptic disco with huge stutters. Turning the textures down to High fixed the epileptic disco but not the stutters. Shame, i wanted those Ultra textures.


It's pretty obvious that Ultra textures will use more than 2GB. They certainly don't look good enough to require that much, but that's the reality. 690 and 680x2 owners got screwed over the worst here. Between the two cores there's definitely enough processing power to handle ultra over 60fps if you had 4GB...


----------



## Flames21891

Quote:


> Originally Posted by *iamhollywood5*
> 
> It's pretty obvious that Ultra textures will use more than 2GB. They certainly don't look good enough to require that much, but that's the reality. 690 and 680x2 owners got screwed over the worst here. Between the two cores there's definitely enough processing power to handle ultra over 60fps if you had 4GB...


I'm able to run it on Ultra @ 1080p with everything cranked up to max and FXAA on a single 2GB 680 without too many dips in framerate. Definitely no stuttering or anything. So I think the VRAM requirements for Ultra are slightly exaggerated, unless that recommendation is for resolutions above 1080p.

Regardless, SLI is definitely broken in this game.


----------



## KenjiS

Quote:


> Originally Posted by *iamhollywood5*
> 
> It's pretty obvious that Ultra textures will use more than 2GB. They certainly don't look good enough to require that much, but that's the reality. 690 and 680x2 owners got screwed over the worst here. Between the two cores there's definitely enough processing power to handle ultra over 60fps if you had 4GB...


770 SLI guys too :/ At least those of us with 2GB cards.

I'm pretty much done trying to get this working. I have spent a lot of time on it and tried most of the options out there (except the one I heard about, which is to try an illegal copy). To go through it all again:

-Turned off SLI
-Dropped to High
-Dropped to Medium
-Dropped to 1080p
-Tried borderless window
-Tried -disablepagecheck
-Turned off Motion Blur
-Turned off Depth of Field
-Various combinations of the above

None of it corrected the stuttering. It's as smooth as getting dragged behind a bus down a pothole-lined road, and the driving segments are unplayable on the "best" combo of settings I found (SLI off, High, no MB or DoF, 2560x1440).

The issues have killed my desire to play this game, and it was at the top of my list when I first saw it. I'm so done with it.

Will just keep watching the threads and hoping for a patch :/


----------



## Clairvoyant129

Even Skyrim modded to oblivion didn't take this much VRAM at 2560x1440.

All settings ultra with MSAA 4x.



4.8GB









Quote:


> Originally Posted by *KenjiS*
> 
> 770 SLI guys too :/ At least those of us with 2gb cards
> 
> Im pretty much done trying to get this working, I have spent more time trying and tried most of the options out there (Except the one i heard which is to try an illegal copy) To go through it all again:
> 
> -Turned off SLI
> -Dropped to High
> -Dropped to Medium
> -Dropped to 1080p
> -Tried borderless window
> -Tried -disablepagecheck
> -Turned off Motion Blur
> -Turned off Depth of Field
> -Various combinations of the above
> 
> None of it corrected the stuttering, Its as smooth as getting dragged behind a bus down a pothole-lined road and the driving segments are unplayable on the "best" combo of settings i found (SLI Off, High, No MB or DoF, 2560x1440)
> 
> The issues have killed my desire to play this game, and this game was at the top of my list to play when I saw it. Im so done with it.
> 
> Will just keep watching the threads and hoping for a patch :/


It's not just you, AMD cards with 4GB are having trouble as well. I'm sure Ubisoft will release some sort of patch to address these issues.


----------



## Thetbrett

In the end I decided to play this game on my TV, so I bumped it down to 1080p, all maxed. It's much better to play on the couch with a controller anyway. Looks great from there.


----------



## Unkzilla

Just read both the HardOCP and Tom's Hardware benchmarks, which seem to be done in hardware-intensive scenes (they are hitting 30-50 fps @ 1440p).

It appears that even the 4GB on the 290s is somewhat of a limitation for Ultra: minimum fps on the 290s seems to be in the low 30s (worse on the 3GB 780), but on the 6GB Titan it's in the 50s. That has to mean even 4GB isn't enough.

I wonder if this will ever be optimized, or if this is simply a case of waiting for the next series of GPUs to come out with more VRAM.


----------



## Leopard2lx

I just went back to the 337.50 driver and set everything to High / Level of Detail to Ultra / TXAA x4 / HBAO / pre-rendered frames to 2, and I have no more stuttering. It feels very nice not to have frame drops. If you are out of options, try this combo and see if it helps.
To be honest the game still looks good on High. I'm just going to leave it like this and try to enjoy the rest of the game. I'm done trying to tweak it.


----------



## Silent Scone

On my Tri-SLI 780 Ti setup I've opted to disable SLI at 1440p, as the stuttering is intermittent but still there. SMAA is a good alternative to MSAA; it has less of an impact and looks as good as 4x IMO.

Ultra settings

High Textures

1440P

SMAA



Although it was inevitable that the 3GB 780 series would run into trouble eventually, I wasn't expecting it quite so soon, or as low as 1440p. It's debatable whether it's justified in this game, though, as the difference between quality settings is negligible.


----------



## ChronoBodi

I just noticed the RAM and VRAM usage, which makes sense on a PS4 if you think about it.

4.5GB of VRAM and 2.1GB of RAM are taken up, so roughly 6.6-7GB total, which fits inside the PS4's 8GB RAM pool.

Yeah, we need more 6GB VRAM cards, and not just the Titans. Are there even 8GB 290s?


----------



## Marc79

My system RAM usage goes past 6GB, so I don't know. And I think that for Ultra textures at 1440p, 3GB might not be enough; I'm thinking 4GB at least, maybe 5GB, who knows. I've put 13 hours into the game, and with Ultra textures, High detail, and temporal SMAA I get playable (good) frames, but driving fast through the city, which I do 50% of the time, becomes somewhat choppy. Playable, but choppy. I don't know how anyone can run Ultra textures on 2GB cards; on my friend's 2x 770s (2GB) it was unplayable. Unless 2GB of VRAM is enough for Ultra textures at 1080p, but I doubt it.


----------



## carlhil2

After half an hour, my RAM usage hit a high of 5160MB... at 1080p.


----------



## Alatar

One thing this game does not want to do for some reason is Steam in home streaming.

problem 1)

- Stream doesn't work in fullscreen mode

problem 2)

- Even with perfectly smooth frame rates on the slave system the client will be extremely stuttery for no reason whatsoever...

And I wanted to play this on the couch


----------



## Alex132

I'm having a lot of problems running this game; I can't even run it at the lowest texture settings possible with no AA.

It instantly uses up ALL of my VRAM in the loading screen and freezes my entire computer.

Why is this game so poorly optimized? Since when do we need insane amounts of VRAM to run textures that are like 1024x1024, when I never had a problem with 4K textures on my 1GB 5870 in Skyrim?

This sounds like a HORRIBLE job of optimizing, like they're just placing artificial load on the game by pushing uncompressed crap into VRAM.

"Oh, you know that skyscraper 5000km away from you? Yeah, you're going to load the textures for that building right now. In the load screen."


----------



## lilchronic

What I notice while playing is that once I hit my VRAM limit of 3072MB, it drops to 2800MB of VRAM, then my system RAM loads 200MB and the game stutters; then my VRAM climbs back toward 3GB, drops again, and loads another 200MB into system RAM. It keeps doing that until I hit 7GB of system RAM usage...
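That pattern looks like texture residency thrashing: once the working set exceeds VRAM, the driver evicts resources to system RAM and pulls them back over PCIe mid-frame, which shows up as stutter. A crude simulation of the idea, with every number assumed for illustration (64MB texture groups, FIFO eviction; real drivers use smarter residency management):

```python
import random

# Crude model of VRAM oversubscription: frames touch a random slice of the
# texture working set, and anything not resident in VRAM must be paged in
# over PCIe (a stall). All sizes below are assumptions for illustration.
VRAM_MB, TEXTURE_MB = 3072, 64
capacity = VRAM_MB // TEXTURE_MB          # 48 texture groups fit in 3 GB
working_set = int(capacity * 1.2)         # oversubscribed by 20%
resident, order = set(), []               # FIFO eviction to system RAM

random.seed(0)
history = []
for frame in range(5):
    page_ins = 0
    for tex in random.sample(range(working_set), k=capacity // 2):
        if tex not in resident:
            page_ins += 1                 # stall: fetch from system RAM
            if len(resident) >= capacity:
                resident.discard(order.pop(0))  # evict oldest group
            resident.add(tex)
            order.append(tex)
    history.append(page_ins)
    print(f"frame {frame}: {page_ins} page-ins over PCIe")
```

With the working set fitting entirely in VRAM, later frames would show zero page-ins; oversubscribe it even slightly and every frame keeps paying transfer stalls, which matches the sawtooth VRAM graph described above.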


----------



## Ghoxt

I'll concur that my 6GB Titan has been running flawlessly with the game. Just another data point for what you've already heard.


----------



## carlhil2

Quote:


> Originally Posted by *Ghoxt*
> 
> I'll concur that my 6GB Titan has been running flawless with the game. Just a data point about what you've already heard.


Mine also, even though it's only been an hour+ of gaming so far... card running at 1300+, no crashes yet... I'll see how it goes once I lower my voltage and overclock the RAM...


----------



## Silent Scone

Quote:


> Originally Posted by *Ghoxt*
> 
> I'll concur that my 6GB Titan has been running flawless with the game. Just a data point about what you've already heard.


See, this is the kicker, and I'd be willing to bet at least a small portion of the people complaining about 'hitching' and performance issues were the ones back in Feb 2013 saying "6GB is crazy, you don't need that much."

I'm a former Titan owner, and admittedly I do like to throw money away, but I refuse to buy back into GK110 Titans when they're so very old now. It's great to see them so far down the line kicking above the field. That said, I don't think it's entirely justified in this game....


----------



## ChronoBodi

Quote:


> Originally Posted by *Silent Scone*
> 
> See, this is the kicker, and I'd be willing to bet at least a small portion of the people complaining about 'hitching' and performance issues were the ones back in Feb 2013 saying "6GB is crazy, you don't need that much."
> 
> I'm a former Titan owner, and admittedly I do like to throw money away, but I refuse to buy back into GK110 Titans when they're so very old now. It's great to see them so far down the line kicking above the field. That said, I don't think it's entirely justified in this game....


To be honest, GK110 itself is old really; the 780 Ti is only a ~5% faster version of the Titan with half the VRAM. If you bought the Titan in Feb 2013 it was pretty futureproof, though: it's still kicking ass with no VRAM troubles at all.

Nothing truly replaces it until the inevitable GM110 cards come, and they had better be 6GB or more.

Still, it's a shame the powerful cards are being handicapped by low VRAM, especially the 2GB GK104s.

I say it's better to have it and not need it than to need it and not have it.


----------



## Silent Scone

That's what I meant, dude; I didn't mean just Titans. It's all very, very long in the tooth! I'm ever so slightly regretting my transition to the Ti, but for one game I will let it slip. The Ti does clock so much better as well, even though reference is voltage-capped. If this is NV's way of trying to boost Titan-Z/6GB GTX sales, they're not doing a very good job; every Tom, Dick and Harry will have realised by now that these cards are barely any different.


----------



## KenjiS

Quote:


> Originally Posted by *ChronoBodi*
> 
> To be honest, GK110 itself is old really, the 780 Ti is only a 5% faster version of the Titan with half the Vram. It was pretty futureproof though if u bought the Titan in Feb 2013, it's still kicking ass with no VRAM troubles at all.
> 
> Nothing truly replaces it til the inevitable GM110 cards come, and they better be 6gb or more.
> 
> Still, its a shame the powerful cards are being handicapped by low Vram, especially the 2gb gk104s.
> 
> I say, better to have it and not need it, than need it and not have it.


I'm guessing the GTX 880 will be 8GB and the 870 4GB...

The long-in-the-tooth thing is why I am so hesitant to "step up" my one 770 to a 780. It's long in the tooth, and I doubt the 800 series is coming before the 90 days are up... The entire thinking behind the SLI 770s was "spend the $340 now that I'd spend on an 870 near the end of the year and get a bit of future proofing", which sadly backfired on me from the looks of it.


----------



## Silent Scone

If it were me, and it is in this case, as at 1440p I could now do with the extra memory... I'd wait. I wouldn't give them the satisfaction. NV are really taking the mickey with this line, as if it's our fault they're struggling with yields on the smaller process. And even in an ideal buyer's market, the 8GB 290X variants are available.


----------



## KenjiS

Quote:


> Originally Posted by *Silent Scone*
> 
> If it were me, and it is in this case as at 1440P I could now do with the extra memory...I'd wait. I wouldn't give them the satisfaction. NV are really taking the mickey with this line, as if it's our fault they're struggling with yields on the smaller process. Even in an ideal buyers market, the 290X 8GB variants are available.


Problem with that is an R9 290X is going to be a LOT more out of pocket compared to stepping up my newer 770 and selling the first one... I'd actually pocket money that way (I think I'd get around $560 selling both), and I'm scared the 290X is going to be *really* loud compared to my virtually silent rig right now...

And right now I'm essentially paying to power that second 770 while it's not being used. Half of what I play doesn't seem to ever use it, and the few things that do already ran fine without it :/ I was at least expecting the AAA titles coming out this year to use it, but now it seems I'm going to get kneecapped by having 2GB of VRAM, which probably takes that off the table. The point was to future-proof myself a bit in the most cost-effective way, and it seems I made a massive mistake...


----------



## revro

Quote:


> Originally Posted by *Silent Scone*
> 
> If it were me, and it is in this case as at 1440P I could now do with the extra memory...I'd wait. I wouldn't give them the satisfaction. NV are really taking the mickey with this line, as if it's our fault they're struggling with yields on the smaller process. Even in an ideal buyers market, the 290X 8GB variants are available.


This. I will go for two 880 Tis, because we know they will release them 6 months after the 880... fool me once, NVIDIA, shame on you; fool me twice...

Quote:


> Originally Posted by *KenjiS*
> 
> Problem with that is a R9 290X is going to be a LOT more out of pocket compared to stepping up my newer 770 and selling the first one... I'd actually pocket money that way... (I think id get around $560 for both selling them) and I'm scared the 290X is going to be -really- loud compared to my virtually silent rig right now...
> 
> And right now im essentially paying to power that second 770 and its not being used.. Half what i play doesnt seem to ever use it, the few things that do already ran fine without it :/ I was expecting at the very least to count on the AAA titles coming out this year to use it, but now its seeming like im going to get kneecapped by having 2gb of VRAM which means that is probubly off the table.. The point was to future proof myself a bit in the most cost effective way and i made a massive mistake it seems...


Just get a Windforce R9 290. In my country it costs 350eur while the 290X costs 450eur, and you would have 4GB and Mantle, plus less heat because of fewer shaders.


----------



## KenjiS

Except I could, for about $160 out of pocket, get a 3GB 780, or for $210 a 6GB 780...

I could then sell the other 770 I can't step up for about $280, so I'd pocket a good $70-120 out of the deal...

Or sell both and get around $560, which would pay for a 290X easily, yeah, but...


----------



## ACP84

Just to add to this: @1080p with my GTX Titan I maxed out at 5002MB VRAM with everything Ultra + 8x MSAA... Even at 4x MSAA I was at 4.5GB, and just with FXAA I was still around 3.5GB VRAM... So basically, if you have a 3GB video card, don't even bother with Ultra settings even at 1080p; this is why you're getting stuttering.

I can only imagine how much VRAM I would need if I jumped to 1440p or 4K.


----------



## givmedew

Quote:


> Originally Posted by *KenjiS*
> 
> Except I could, for about $160 out of pocket, get a 780 3GB, or for $210 get a 780 6GB...
> 
> Could then sell the other 770 I can't step up and get $280, so I'd pocket a good $70-120 out of the deal...
> 
> Sell both and get around $560, which would pay for a 290X easily, yeah, but...


You can get the R9 290 at retail pricing again. I would not buy the 290X, because half of its speed increase comes from the extra shaders and half from the small bump in clock speed. You can almost certainly apply that small clock bump to a 290 yourself. After that, I think it is a ~3-5% increase in speed over the 290. Totally not worth the extra $$$.

The 1040MHz Gigabyte card is $399 http://www.amazon.com/Gigabyte-GDDR5-4GB-2xDVI-Graphics-GV-R929OC-4GD/dp/B00HS84DFU/ref=sr_1_1?ie=UTF8&qid=1401372565&sr=8-1&keywords=r9+290

It has the same OC clock speed as the 290X OC, so the only difference in speed will come from the extra shaders.


----------



## DF is BUSY

Quote:


> Originally Posted by *Marc79*
> 
> My system RAM usage goes past 6GB, so I don't know. And I think that for Ultra textures at 1440p, 3GB might not be enough; I'm thinking 4GB at least, maybe 5GB, who knows. I've put 13 hours into the game, and with Ultra textures, High detail, and temporal SMAA I get playable (good) frames, but driving fast through the city, which I do 50% of the time, becomes somewhat choppy; playable but choppy. I don't know how anyone can run Ultra textures on 2GB cards; with my friend's 2x 770s (2GB) it was unplayable. Unless 2GB of VRAM is enough for Ultra textures at 1080p, but I doubt it.


2GB on Ultra? It might be borderline, right at the edge, I think.

If I set Ultra textures @1200p, I can max the game out in the other settings menu, but I get a huge lag spike/texture load in some parts of the city (though it is rather playable). Certain times of day, when shadows are more evident, also bring a big FPS hit.

I think I'm gonna stick to High textures and max the other settings out for some more comfortable playing.

And I've seen my RAM usage go anywhere from 6GB to 9GB... this game is a darn resource hog.


----------



## givmedew

Quote:


> Originally Posted by *DF is BUSY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Marc79*
> 
> My system ram usage goes past 6GB of Ram, so I don't know. And I think for 1440p to use ultra textures, 3GB might not be enough, I'm thinking 4GB at least, maybe 5GB who knows. I've put 13 hours into the game, and with Ultra textures, High detail, temporal SMAA I get playable (good) frames, but driving fast thru city, which I do 50% of the time, becomes somewhat choppy, playable but choppy. I don't know how anyone can run Ultra textrues on 2GB cards, with my friend's 2x770's (2GB) it was unplayable. Unless 2GB Vram is enough for Ultra textures at 1080p, but I doubt it.
> 
> 
> 
> 2gb on ultra? it might be borderline safe/at the edge's max i think.
> 
> if i set ultra textures @ 1200p, I can max the game out in the other settings menu but I get a huge lag spike/texture load at some parts of the city (though it is rather playable). Certain times of the day when shadows are more evident also get some big fps hit.
> 
> i think i'm gonna stick to high textures and max the other settings out for some more comfy playing.
> 
> and i seen my ram usage go anywhere from 6gb-9gb...... this game is a darn resource hog
Click to expand...

My opinion: 2GB isn't enough on Ultra.

The Ultra setting doesn't look that much better, so it isn't worth it anyway. I'd recommend textures on High, with FXAA, Temporal SMAA, or TXAA (if you like TXAA and have an NVIDIA card).

*ALSO, TO OPTIMIZE FRAMES PER SECOND*

For a higher-spec computer, leave *Level of Detail* on Ultra; for a medium to medium-low spec machine, put it on High.

*Motion blur* is worth turning off for just about anyone unless you really like the effect. It costs you frames and blurs everything when you move. I can go without.

*Shadows, Reflections, and Shader* can all be set to High on a high-spec computer without losing much detail. For medium-spec computers, set all of those to Medium.

*Ambient Occlusion* is going to cost you no matter what setting you put it on. I think for now MHBAO is the way to go. If you are not getting solid frame rates, you can try turning it off completely.

Also, don't mess with the *frame buffer* unless you are having issues that are solved by adjusting it. If you reduce it, you can lose a significant number of your lowest-FPS frames. You won't notice the change in most cases, but when your FPS starts to dip, it will dip even lower than if you had left it alone.
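If it helps anyone picture what that buffered-frames setting actually does, here's a toy Python simulation I threw together. It's a rough sketch of a pre-rendered frame queue in general, not Watch Dogs' actual renderer; the 16.7ms vsync interval and the frame times are made-up numbers:

```python
import math

def hitches(render_ms, depth, vsync=16.7):
    """Count refreshes where the previous frame had to be shown again.
    `depth` models the pre-rendered frame queue: the GPU may not start
    frame n until frame n-depth has been scanned out by the display."""
    finish, scanout = [], []
    for n, t in enumerate(render_ms):
        gate = scanout[n - depth] if n >= depth else 0.0
        start = max(finish[-1] if finish else 0.0, gate)
        finish.append(start + t)
        # a frame appears at the first vsync after it is ready,
        # and strictly after the previous frame's scan-out
        ready = math.ceil(finish[-1] / vsync)
        prev = round(scanout[-1] / vsync) if scanout else 0
        scanout.append(vsync * max(ready, prev + 1))
    return sum(round((b - a) / vsync) - 1 for a, b in zip(scanout, scanout[1:]))

# steady 12 ms frames with one 45 ms texture-streaming spike in the middle
frames = [12.0] * 10 + [45.0] + [12.0] * 10
print(hitches(frames, depth=1))  # 2 -- shallow queue: the spike shows up as stutter
print(hitches(frames, depth=3))  # 0 -- deeper queue rides the spike out
```

The tradeoff is input lag: with a depth of 3, the frame you're seeing was started up to three refreshes ago, which is exactly why drivers expose a way to lower the pre-rendered frames count.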


----------



## KenjiS

Quote:


> Originally Posted by *givmedew*
> 
> You can get the R9 290 at retail pricing again. I would not buy the 290X because 1/2 of the speed increase is the shaders and half is the small bump in clock speed. You can almost def do that small bump on the 290 that you would get. So after that I think it is ~3-5% increase in speed over the 290. Totally not worth the extra $$$.
> 
> The 1040MHz Gigabyte card is $399 http://www.amazon.com/Gigabyte-GDDR5-4GB-2xDVI-Graphics-GV-R929OC-4GD/dp/B00HS84DFU/ref=sr_1_1?ie=UTF8&qid=1401372565&sr=8-1&keywords=r9+290
> 
> Has the same OC clock speed as the 290X OC so the only difference in speed will come from the extra shaders.


Yeah, but isn't the 290 incredibly loud and also quite hot compared to the 290X? I do have my system on 24/7, and I like it to be relatively quiet lol


----------



## Silent Scone

Quote:


> Originally Posted by *ACP84*
> 
> Just to add to this @1080p with my Gtx titan I maxed out at 5002 vram with everything ultra + 8x MSAA .... Even at 4x MSAA I was at 4.5 vram and just with Fxaa I was still around 3.5gb vram ... So basically if you have a 3gb video card don't even both with Ultra settings even with 1080p, this is why your getting stuttering
> 
> I can only image if I jumped to 1440p or 4k how much vram I would need


Ultra is fine at 1080p. You're spreading misinformation by taking RivaTuner as gospel: what you're seeing is what is cached in the frame buffer, not what the game strictly needs. If Ultra @ 1080p is working for people, then it's fine. I can run the game with Ultra textures @ 1440p on my Ti with FXAA. At 1080p I'd be able to at least use some MSAA.
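For anyone wondering how reported "usage" can sit near a card's capacity while the game needs far less, here's a toy Python sketch of a streaming texture cache. This illustrates the general caching idea only, not the Disrupt engine's actual behavior, and all the sizes are invented; the point is that monitoring tools see `allocated`, while the data the current frame truly needs is much smaller:

```python
from collections import OrderedDict

class TextureCache:
    """Toy streaming cache: textures sampled this frame are resident by
    necessity; everything else is opportunistic prefetch that can be
    evicted for free. Afterburner/RivaTuner only see `allocated`."""
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.cache = OrderedDict()          # texture id -> size in MB

    def allocated(self):
        return sum(self.cache.values())

    def touch(self, tex, size_mb):
        self.cache.pop(tex, None)
        # make room if needed, then insert as most-recently-used
        while self.allocated() + size_mb > self.capacity and self.cache:
            self.cache.popitem(last=False)  # evict least-recently-used
        self.cache[tex] = size_mb

cache = TextureCache(capacity_mb=3072)      # a "3GB card"
frame_working_set = [(f"street_{i}", 32) for i in range(20)]   # ~640MB truly needed
prefetch = [(f"distant_{i}", 32) for i in range(70)]           # speculative streaming

for tex, mb in frame_working_set + prefetch:
    cache.touch(tex, mb)

print(sum(mb for _, mb in frame_working_set))  # 640  -- the real working set
print(cache.allocated())                       # 2880 -- what the overlay reports
```

So once the cache is warm, the overlay reports ~2.9GB "used" on a 3GB card even though only ~640MB is the actual working set; the rest could be dropped for free, which is why raw allocation numbers overstate the real requirement.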


----------



## TopicClocker

Has anybody seen the Steam review page for Watch Dogs?








Keep scrolling if you go to it.

There was even one that said "uDontplay."


----------



## Marc79

Yeah, pretty much all red with an occasional "thumbs up".


----------



## Alex132

Quote:


> Originally Posted by *Silent Scone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ACP84*
> 
> Just to add to this @1080p with my Gtx titan I maxed out at 5002 vram with everything ultra + 8x MSAA .... Even at 4x MSAA I was at 4.5 vram and just with Fxaa I was still around 3.5gb vram ... So basically if you have a 3gb video card don't even both with Ultra settings even with 1080p, this is why your getting stuttering
> 
> I can only image if I jumped to 1440p or 4k how much vram I would need
> 
> 
> 
> Ultra is fine at 1080p. You're spreading misinformation with taking Rivatuner as gospel. What you're seeing is what is cached in frame buffer. If it's working at Ultra @ 1080p for people, then it's fine. I can run the game with Ultra textures @ 1440P on my Ti with FXAA. At 1080p I'd be able to at least use some MSAA.
Click to expand...

He's not spreading misinformation, but he is senseless for using 8x MSAA. Just use SMAA.

I can't run this at lowest details / etc. on my 690.

This is the worst-"optimized" game I have ever seen. Honestly, they were even given extra time to polish the game. TOPKEK. What did they polish? It's hit-and-miss whether or not this game works on your system, and it hogs resources for no reason at all.

And Uplay? Yeah, just... don't get me started on Uplay.


----------



## TopicClocker

Quote:


> Originally Posted by *Alex132*
> 
> He's not spreading misinformation. But he is senseless for using 8xMSAA. Just use SMAA.
> 
> I can't run this at lowest details / etc. on my 690.
> 
> This is the world's worst "optimized" game I have ever seen. Like honestly, they were given extra time to polish the game. TOPKEK. What did they polish? It's a hit-and-miss whether or not this game works on your system, and it hogs resources for no reason at all.
> 
> And Uplay? yeah, just, don't get me started on uplay


If anything, it's between this and AC3. I'm astonished that only a few people can get this game running decently. I'm only not giving it the #1 spot because I can get an average FPS upwards of 30-40; in AC3 I ran the entire game in the 20s, which is worse than GTA 4 and Saints Row 2 combined.


----------



## Alex132

Quote:


> Originally Posted by *TopicClocker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> He's not spreading misinformation. But he is senseless for using 8xMSAA. Just use SMAA.
> 
> I can't run this at lowest details / etc. on my 690.
> 
> This is the world's worst "optimized" game I have ever seen. Like honestly, they were given extra time to polish the game. TOPKEK. What did they polish? It's a hit-and-miss whether or not this game works on your system, and it hogs resources for no reason at all.
> 
> And Uplay? yeah, just, don't get me started on uplay
> 
> If anything It's between this and AC3, I'm astonished how there's only a few that can get this game running decently, I'm only not giving it the 1# position as I can get an average fps of upwards of 30-40, AC3, I ran the entire game in the 20s, that's worse than GTA 4 or Saints Row 2 combined.
Click to expand...

Have you seen what IQ I am running at?


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Alex132*
> 
> LOOK AT THIS ULTRA 2014 EXPERIENCE YO.
> 
> 
> 
> 
> (1600x900, everything on low/off - still get 2GB VRAM usage and stuttering like crazy every ~15s)






35-45 FPS, stutters every ~15s when I load more textures into my VRAM, 1980-2044MB VRAM usage.

On top of that, I now actually can't play, because UdontPlay won't let me load my save games; if I click "skip" the game just crashes on launch.


----------



## ACP84

Quote:


> Originally Posted by *Alex132*
> 
> He's not spreading misinformation. But he is senseless for using 8xMSAA. Just use SMAA.
> 
> I can't run this at lowest details / etc. on my 690.
> 
> This is the world's worst "optimized" game I have ever seen. Like honestly, they were given extra time to polish the game. TOPKEK. What did they polish? It's a hit-and-miss whether or not this game works on your system, and it hogs resources for no reason at all.
> 
> And Uplay? yeah, just, don't get me started on uplay


Just for clarification, I was using 8x MSAA for testing VRAM usage only... While it was still playable at 40 FPS, it was choppy... Temporal SMAA or SMAA seems the way to go, as I was averaging about 75 FPS using that with Ultra.


----------



## TopicClocker

Quote:


> Originally Posted by *Alex132*
> 
> Have you seen what IQ I am running at?
> 
> 35-45 FPS, stutters every ~15s when I load more textures into my VRAM, 1980-2044MB VRAM usage.
> 
> Oh and plus now I actually can't play because UdontPlay won't let me load my save games, if I click "skip" it just crashes when I launch the game


Wow, if this is why Ubisoft delays AC PC versions, it's probably best they continue doing that.
But Watch Dogs was already delayed 6+ months....

My problems aren't too serious, but the stuttering is driving me mad from time to time. I find myself switching shadows down to High, as sometimes the game just stutters when it feels like it: one moment it's running quite smoothly, the next it's a slideshow and settings have to be dropped, or the game has to be restarted. This could be my 6GB of system RAM coming into play, since the supposed recommendation is 8GB. I've noticed increased system RAM usage over time, especially if I leave the game on overnight: 5GB+ of system RAM gobbled down, and stutter, stutter everywhere.

The 337.88 drivers helped me apply AA, HBAO+, and Ultra shadows @1080p, and decreased the stutter on Ultra textures, but I wouldn't call Ultra textures playable in my experience.
The low GPU usage others are reporting is pretty odd, especially on i5 machines. I thought my CPU was just bottlenecking, but it seems to be a widespread problem on more powerful processors too.

I don't think I've seen anything like this before.


----------



## Silent Scone

Just adding onto what I said a moment ago...

Running in Borderless Window fixes a lot of the hitching for me. I'm at Ultra textures with Ultra settings at 1440p, and even with 4x MSAA there is minimal stutter... with 2x MSAA there is practically none.







Seems fullscreen mode is partially to blame, at least for me.


----------



## Threx

Quote:


> Originally Posted by *KenjiS*
> 
> I'm guessing the GTX 880 will be 8GB and the 870 4GB...
> 
> The long-in-the-tooth thing is why I am so very hesitant to "step up" my one 770 to a 780. It's long in the tooth, but I doubt the 800 series is coming before the 90 days are up... The entire thinking with the SLI 770s was "I spend the $340 now that I'd spend on an 870 near the end of the year, and get a bit of future-proofing," which sadly backfired on me from the looks of it.


I'm also planning to get the 870 when it is released. Hopefully the base version has at least 4GB of VRAM, with another version having 6GB or so.

The Division (another game by Ubisoft) looks fantastic (at least until they dumb it down), and I'm now suspecting it will probably need huge amounts of VRAM. Crossing my fingers for a 6GB GTX 870.

Edit: I'm also planning on getting a 1440p ROG Swift, so I'll definitely need the 6GB. >.>


----------



## givmedew

Quote:


> Originally Posted by *KenjiS*
> 
> Quote:
> 
> 
> 
> Originally Posted by *givmedew*
> 
> You can get the R9 290 at retail pricing again. I would not buy the 290X because 1/2 of the speed increase is the shaders and half is the small bump in clock speed. You can almost def do that small bump on the 290 that you would get. So after that I think it is ~3-5% increase in speed over the 290. Totally not worth the extra $$$.
> 
> The 1040MHz Gigabyte card is $399 http://www.amazon.com/Gigabyte-GDDR5-4GB-2xDVI-Graphics-GV-R929OC-4GD/dp/B00HS84DFU/ref=sr_1_1?ie=UTF8&qid=1401372565&sr=8-1&keywords=r9+290
> 
> Has the same OC clock speed as the 290X OC so the only difference in speed will come from the extra shaders.
> 
> 
> 
> Yeah but isnt the 290 incredibly loud and also quite hot compared to the 290X? I do have my system on 24/7 and i like it to be relatively quiet lol
Click to expand...

They are the same card except for the chip. It is possible that, side by side, a 290X would be a bit louder than a 290 because of the extra shaders, but most likely not.

As far as them being loud... not at all. That was only the blower-type reference cards that were out in the first months after launch. AMD kind of shot themselves in the foot on that, honestly: they forced vendors to sell the cards with the factory cooler, and now we have people who still think the card is super loud even after the cooling systems have been replaced.

Quote from Techspot
_"What impresses us the most is how quiet Gigabyte's cards run, as they were inaudible over the test system's two 120mm case fans, which are also very quiet. From FurMark to seemingly every game, Gigabyte's cards remain whisper quiet under load."_

There is also the XFX Double D for $390

and


ASUS DC2 for $399

If it were me, I would get the Gigabyte one with the 1040MHz OC. The clock speeds they are overclocking it to suggest that they are binning the GPUs, since there is no way they are getting 1040MHz on every single chip at stock voltage without doing so. That they sell a non-OC'd version of the same exact card is, to me, almost proof of binning.

Hope that helps... the cards are amazing, 4GB seems to be the sweet spot, and you don't even have to pay a premium for it. My friend has two 780 Tis and he is having issues running Watch Dogs with Ultra textures. I am not having any problems... the game is using 3200-3300MB without MSAA. I'm using Temporal SMAA.


----------






## zantetheo

Tested the game and my GTX 770 4 GB uses 3.3-3.5 GB
All Ultra + Temp SMAA


----------



## revro

Quote:


> Originally Posted by *TopicClocker*
> 
> Wow, if this is why Ubisoft delay AC PC versions its probably best they continue doing that.
> But Watch Dogs was already delayed 6+ months....
> 
> My problems aren't too serious, but the stuttering is driving me mad from time to time, I find myself switching the Shadows down to High as sometimes It just feels like stuttering when It feels like it, one moment it's running quite smoothly, the next it's a slideshow and settings have to get dropped, or the game has to be restarted, this could be my 6GB ram coming into play and the supposed recommended is 8GB, I've noticed increased system ram usage over time, especially if I leave the game on overnight, 5GB+ of system ram gobbled down and stutter, stutter everywhere.
> 
> The 337.88 drivers helped me apply AA, HBAO+ and Ultra shadows @1080p, and decreased the stutter on Ultra textures, but I wouldn't call Ultra textures playable from my experience.
> The low GPU usage others seem to be reporting is pretty odd, especially on i5 machines, i thought my CPU was just bottlenecking but it seems to be a wide-spread problem on more powerful processors.
> 
> I dont think I've seen anything like this before.


Yeah, my 4 cores of [email protected] are all at 100% when driving, and the GPU is at ~50%, so I am averaging like 20 FPS, and that's not great even at 1440p. In normal scenes I have 40 FPS, so that's OK. I hope they do something, or I'll have to wait months for my new build...

Quote:


> Originally Posted by *KenjiS*
> 
> Yeah but isnt the 290 incredibly loud and also quite hot compared to the 290X? I do have my system on 24/7 and i like it to be relatively quiet lol


Yeah, I have a 780 Windforce and it's a great cooler; as others wrote above me, the Windforce cooler is quiet.


----------



## KenjiS

Quote:


> Originally Posted by *givmedew*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> They are the same card except the chip. It is possible that side by side a 290X would be a bit louder than a 290 because of the extra shaders but most likely not.
> 
> As far as them being loud... not at all. Only the BLOWER type reference cards that where out the first months the card was out. AMD kind of shot themselves in the foot on that honestly. They forcibly made the vendors sell the cards with the factory cooling unit and now we have people who still think the card is super loud after the cooling systems have been replaced.
> 
> Quote from Techspot
> _"What impresses us the most is how quiet Gigabyte's cards run, as they were inaudible over the test system's two 120mm case fans, which are also very quiet. From FurMark to seemingly every game, Gigabyte's cards remain whisper quiet under load."_
> 
> There is also the XFX Double D for $390
> 
> and
> 
> 
> ASUS DC2 for $399
> 
> If it where me I would get the Gigabyte one w/ the 1040MHz OC. The clock speeds that they are overclocking it to suggests that they are binning the CPUs since there is no way that they are getting 1040MHz on every single chip at stock voltage without doing so. That they are selling a non OC'd version of the same exact card to me is almost proof of binning.
> 
> Hope that helps... the cards are amazing and the 4GB seems to be the sweet spot and you don't even have to pay a premium. My friend has (2) 780TIs and he is having issues running watchdogs with ultra textures. I am not having any problems... the game is using 3200-3300 with out MSAA. I'm using the Temporal SMAA.


Good to know. As you said, it sucks that AMD did that with the blower type, as that WAS my impression until now (that the 290s were absurdly loud). It's an option on the table right now, at least. I'm really heavily weighing my options, as it's also possible this "VRAM hog" situation may cool off and my 770 SLI will end up being great...

Currently I'm mulling over whether I ought to just sell off the two 770s I have RIGHT now and go for an R9 290 (and pocket some cash, which could also go toward a 500GB SSD for my rig), or just wait a bit, sell them off later, and go for the R9 300 or GTX 800 series parts. Given that GTX 670s are still being listed at $225 or so on the forums, I think my 770s should hold their value decently well even after the new parts launch... Something that would tip the scales in AMD's favor would be announcing that Dragon Age: Inquisition uses Mantle from day 1. That's a game I'm very much looking forward to, and if it supported Mantle... well, that might tip the balance.

The only reason I don't just jump on it is the simple concern of future-proofing myself a little... The current crop of GPUs is getting long in the tooth, and it might be a better idea to wait a few months, take the loss in value, and jump on the next gen right off the bat...


----------



## MonarchX

After the Day 1 patch and setting pre-rendered frames to 1, stuttering was significantly reduced and became more of an annoyance downtown, especially at night. Stuttering never got in the way of doing anything, like getting away from a police army, though.

There is no evidence that more VRAM eliminates stuttering... AFAIK a 6GB Titan struggles as much as a GTX 780 Ti.

I beat the game and enjoyed it. It looked amazing on my high-contrast-ratio 240Hz monitor. I was hoping for a deeper story, like Deus Ex, and way more content, along with some ability to avoid killing innocent people and cops. I plowed through countless guards and cops to get away... so much for a righteous vigilante who hunts killers. I also hoped hacking would be more challenging and necessary, instead of being optional and at times slower and less effective than Rambo-style invasions. There is too little interaction with the main NPCs: no friendship development, no character growth. I loved the police chases and some cool ways to use hacking to kill enemies. It's a solid 75/100 IF you don't pay a penny for it.


----------



## MonarchX

Quote:


> Originally Posted by *KenjiS*
> 
> Good to know, as you said it sucks that AMD did that with the blower-type as that WAS my impression till now(That the 290s were absurdly loud), its an option on the table right now at least, I'm really heavily weighing my options right now as its also possible this "VRAM hog" situation may cool off and my 770 SLI will end up being great...
> 
> Currently my mulling is whether i ought to just sell off the two 770s I have RIGHT now and go for an R9 290 (Well and pocket some cash which could be used to also toss a 500gb SSD in my rig) or just wait a bit, sell them off later and go for the R9 300 or GTX 800 series parts, Given that GTX 670s are still being listed at $225 or so on the forums, I think my 770s should hold their value decently well even after the new parts launch... I would say something that would tip the scales to AMD's favor would be announcing that Dragon Age Inquisition uses Mantle from day 1 or something, Thats a game im very much looking foreward to and if it supported Mantle.... well that might tip the balance
> 
> The only reason i dont just jump on it is the simple concern of future proofing myself a little... The current crop of GPUs is getting long in the tooth and it mgiht be a better idea to wait a few months, take the loss in value and jump on the next gen off the bat...


Get the next or current nVidia card if you want problem-free gaming. WD is not representative of the VRAM needs of all other upcoming games. 3GB is future-proof for 1080p.


----------



## KenjiS

Quote:


> Originally Posted by *MonarchX*
> 
> Get the next or current nVidia card if you want problem-free gaming. WD is not representative of the VRAM needs of all other upcoming games. 3GB is future-proof for 1080p.


I am running 1440p, not 1080p


----------



## ACP84

Quote:


> Originally Posted by *zantetheo*
> 
> Tested the game and my GTX 770 4 GB uses 3.3-3.5 GB
> All Ultra + Temp SMAA


Can you please tell me what program you're using for monitoring? My Precision doesn't work with the game.


----------



## KenjiS

Quote:


> Originally Posted by *ACP84*
> 
> Can you please tell me what program you're using for monitoring? My Precision doesn't work with the game.


Probably MSI Afterburner; it has 64-bit support.


----------



## Cpyro

looks like Afterburner


----------



## SoloCamo

Quote:


> Originally Posted by *MonarchX*
> 
> Get the next or current nVidia card if you want problem-free gaming. WD is not representative of the VRAM needs of all other upcoming games. 3GB is future-proof for 1080p.


Yea, Nvidia for problem free gaming! Wait, what?









And 3GB is no more future-proof now than 2GB supposedly was last year, and than 1GB supposedly was the year before that. I hate that argument. Get the best card you can for the money, with the highest VRAM available for it at the time. That hasn't let me down yet.

A lot of people are failing to realize that these new consoles are not VRAM-limited... Prepare for higher VRAM needs, not necessarily because the games require it, but because developers can use more and don't need to be as cautious.


----------



## zantetheo

Quote:


> Originally Posted by *ACP84*
> 
> Can you please tell me what program you're using for monitoring? My Precision doesn't work with the game.


Yeap, Afterburner.

http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


----------



## NABBO

Quote:


> Originally Posted by *MonarchX*
> 
> After Day1 patch and setting pre-render frames to 1, stuttering was significantly reduced and was more of an annoyance in downtown, especially at night. Stuttering never got in the way of doing anything like getting away from a police army though.
> 
> There is no evidence that more VRAM eliminates stuttering... AFAIK 6GB Titan struggles as much as a GTX 780 Ti.
> 
> I beat the game and enjoyed it. It looked amazing on my high contrast ratio monitor with 240Hz. I was hoping for a deeper story like DeusEx and way more content along with some ability to avoid killing innocent people and cops. I plowed through countless guards and cops to get away... So much for a righteous vigilante who hunts killers. I also hoped hacking would be more challenging and necessary instead of being optional and at times slower and less effective than Rambo-style invasions. There is too little of main NPC interactions... No friendship development, no character growth. I loved police chases and some cool ways to use hacking to kill enemies. Its a solid 75/100 IF you don't pay a penny for it.


I bet there is also stuttering on cards with 4GB and 6GB.
In the past I had a TITAN... but BioShock Infinite (for example) suffered from stuttering as much as 670s in 2GB SLI...

And I remember at that time many people were paranoid, convinced that a card with more VRAM would solve those problems.

Watch this video:

"GTX Titan Black vs. Geforce GTX 780 Ti | Skyrim mit 150 Mods"

http://www.youtube.com/watch?v=CN8PLXnJxLA

The Titan Black allocates a lot of VRAM, but the performance is equal to a 780 Ti 3GB.

There is no improvement.


----------



## SkyNetSTI

Quote:


> Originally Posted by *zantetheo*
> 
> Tested the game and my GTX 770 4 GB uses 3.3-3.5 GB
> All Ultra + Temp SMAA


Dude, what software shows this info there?


----------



## lilchronic

MSI Afterburner.


----------



## Ghoxt

Why do people keep saying "I bet the Titans have stuttering too"? It just confuses things. Let's please deal in facts as we see them here on OCN.

Has anyone here on OCN had the dips down to 1-2 FPS stuttering while running a Titan with THIS game?


----------



## Cyberion

Quote:


> Originally Posted by *kx11*
> 
> it requires less than 3gb but they ask for 3gb to make sure the game performance will be smooth
> 
> also MSAAx2 isn't enough @ 1200p


It does require 3GB. I've hit 2.8GB roaming around the city and fiddling with the settings.


----------



## Cyberion

Quote:


> Originally Posted by *SkyNetSTI*
> 
> Dude, what software does show this info there?


It's RivaTuner; it comes with MSI Afterburner. You'll have to start it manually and set it to launch at startup, with the detection level set to High. Then make sure in the Afterburner settings that the info you want is shown (FPS, frametime, voltage, clock, etc.): select each item and click "Show in OSD."

http://imgur.com/d0aDwIn


----------



## NABBO

Quote:


> Originally Posted by *Ghoxt*
> 
> Why do people keep saying "I bet the Titans have stutterring too"


AndyB from Nvidia...

http://forums.guru3d.com/showpost.php?p=4825305&postcount=50

writes "Performance is identical" http://abload.de/image.php?img=boh2zup2.gif

(and the problems are identical, rotfl)


----------



## Frankzro

They ATE THIS GAME ALIVE! On MetaCritic!

OUCH!!


----------



## Razzaa

Quote:


> Originally Posted by *Frankzro*
> 
> They ATE THIS GAME ALIVE! On MetaCritic!
> 
> OUCH!!


I agree.


----------



## TopicClocker

Quote:


> Originally Posted by *Frankzro*
> 
> They ATE THIS GAME ALIVE! On MetaCritic!
> 
> OUCH!!


Dayum, and I thought Steam was bad enough.

This game has reached new levels of dislike.


----------



## Cyberion

Quote:


> Originally Posted by *Frankzro*
> 
> They ATE THIS GAME ALIVE! On MetaCritic!
> 
> OUCH!!


A score of 81 is still great. http://www.metacritic.com/game/pc/watch-dogs


----------



## TopicClocker

Quote:


> Originally Posted by *Cyberion*
> 
> A score of 81 is still great. http://www.metacritic.com/game/pc/watch-dogs


The user score isn't, however.


----------



## Norlig

A patch for PC is incoming









https://twitter.com/SebViard/status/472036622171930624

http://www.forbes.com/sites/jasonevangelho/2014/05/29/ubisoft-working-on-pc-patch-for-watch-dogs-offers-advice-to-boost-performance/


----------



## TopicClocker

Quote:


> Originally Posted by *Cyberion*
> 
> Reading into the user reviews, most of them have merit. I don't agree with giving the game a 0, but it certainly doesn't compare to, say, GTA V or Skyrim. This sums up most of the game's problems:
> EDIT: Metacritic rated it 81/100. Check what page this post is on. CONSPIRACIES.


Most people's problem is exactly that: it doesn't compare to GTA V. I think it's a nice game, excluding the performance problems. It's not meant to be, or trying to replace, GTA; gameplay-wise it's some kind of modern Splinter Cell/Assassin's Creed hybrid, IMO. But all this game has received is flak and negativity. I don't blame anyone for the negativity over the PC problems, as they sound pretty catastrophic for some people, so hopefully they get sorted out, but wow, the hate is crazy.


----------



## zacker

Anyone know how I can take a screenshot in-game? I have some useful tweaks that worked pretty well for me.


----------



## TopicClocker

Quote:


> Originally Posted by *Norlig*
> 
> Patch for PC is incomming
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://twitter.com/SebViard/status/472036622171930624
> 
> http://www.forbes.com/sites/jasonevangelho/2014/05/29/ubisoft-working-on-pc-patch-for-watch-dogs-offers-advice-to-boost-performance/


Gave the forbes one a skim read earlier.
Quote:


> Ubisoft's Sebastien Viard explains that "Making an open world run on [next-generation] and [current-generation] consoles plus supporting PC is an incredibly complex task." He goes on to say that Watch Dogs can use 3GB or more of RAM on next-gen consoles for graphics, and that "your PC GPU needs enough Video Ram for Ultra options due to the lack of unified memory."


Yup, this just confirms they focused mostly on the next-gen machines. Most of us haven't got 6GB+ to dump onto our graphics cards.







Titanfall all over again.


----------



## MonarchX

Quote:


> Originally Posted by *SoloCamo*
> 
> Yea, Nvidia for problem free gaming! Wait, what?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And 3gb is no more future proof now then 2gb was supposedly last year and that 1gb supposedly was the year before that. I hate that argument, get the best card you can with the money and get the highest vram available at the time for it. Hasn't let me down yet.
> 
> A lot of people are failing to realize, these new consoles are not vram limited.... Prepare for higher vram needs, not necessarily because of the games, but because developers can and don't need to be as cautious with use.


Erm... Never mind me, reviewers, or even developers who constantly complain about AMD's drivers and having constant issues, requiring rolling back drivers to get one thing to work and then moving back to newer drivers to get another thing to work. With nVidia drivers, things just work. You get an issue once in a blue moon, and even then it's probably due to an overly high OC!

Games barely use 2GB of VRAM these days, with the exception of two games - BF4 and Watch Dogs. BF4 runs just as fast on a 2GB GTX 770 as it does on a 4GB GTX 770. Realize that higher VRAM utilization may simply mean data pre-caching, not necessarily data needed for display at that moment. 3GB for 1080p is still quite future-proof. Last year 2GB of VRAM was future-proof for a good year, and so far only the poorly-optimized Watch Dogs demands 3GB for maximum candy.

The GTX 780 Ti is the fastest single-GPU card and nVidia JUST announced 6GB versions, so yeah, 3GB is what you get if you go for the fastest card right now. Watch Dogs VRAM usage is meaningless in terms of stutters, as having more VRAM doesn't eliminate stutters, and 3GB cards perform better than 4GB cards (check out the Guru3D review) - although that's because the game was optimized for nVidia hardware, so it runs better on nVidia cards.


----------



## SoloCamo

Quote:


> Originally Posted by *MonarchX*
> 
> Erm... Never mind me, reviewers, or even developers who constantly complain about AMD's drivers and having constant issues, requiring rolling back drivers to get one thing to work and then moving back to newer drivers to get another thing to work. With nVidia drivers, things just work. You get an issue once in a blue moon, and even then it's probably due to an overly high OC!
> 
> Games barely use 2GB of VRAM these days, with the exception of two games - BF4 and Watch Dogs. BF4 runs just as fast on a 2GB GTX 770 as it does on a 4GB GTX 770. Realize that higher VRAM utilization may simply mean data pre-caching, not necessarily data needed for display at that moment. 3GB for 1080p is still quite future-proof. Last year 2GB of VRAM was future-proof for a good year, and so far only the poorly-optimized Watch Dogs demands 3GB for maximum candy.
> 
> The GTX 780 Ti is the fastest single-GPU card and nVidia JUST announced 6GB versions, so yeah, 3GB is what you get if you go for the fastest card right now. Watch Dogs VRAM usage is meaningless in terms of stutters, as having more VRAM doesn't eliminate stutters, *and 3GB cards perform better than 4GB cards (check out the Guru3D review) - although that's because the game was optimized for nVidia hardware, so it runs better on nVidia cards.*


And yet this article on guru 3d says the exact opposite of what you just said?

http://www.hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/5#.U4fnb_mwKPl

Where are you getting your information from?

I rarely play the fanboy card, but you, sir, really need to look at the bigger picture. First of all, the driver 'issues' are an old, retired, and quite beaten horse that no longer applies. Things don't always 'just work' on either side of the fence; both parties have had their bad drivers. Stop being so blind.

Just to quote one of many parts from my link above:
Quote:


> "In terms of performance we were surprised how close the R9 290X and GTX 780 Ti are. There has been a lot of FUD around the internet about AMD potentially lacking in performance compared to NVIDIA. We hope we have smashed the rumors and provided facts based on gameplay and not some quick-use benchmark tool that will many times tell you little. We actually found the Radeon R9 290X slightly faster in some scenarios compared to the GeForce GTX 780 Ti. We also found out that gameplay consistency was a lot better on Radeon R9 290X with "Ultra" textures enabled thanks to its 4GB of VRAM."


----------



## zacker

OK, here are my tweaks for the game after 2 days of searching.
i5-3570K OC'd to 4.7GHz and a Gigabyte 780 Ti OC'd to 1250MHz with skyn3t's BIOS mod and 337.88 drivers installed.

These settings are a mix of 4x MSAA in game + FXAA and 32x CSAA from the NVIDIA control panel.

First I'll post screenshots of my settings. The game is almost lag-free for me; I hope some of these settings help people in here max out their settings.

Now my in-game settings. Windowed mode is for taking screens only; I play fullscreen on a 60Hz monitor, btw.

In-game screenshots:

I don't know why, but when I enabled 32x CSAA in the NVIDIA control panel I get almost no lag at all. If you try something from these, let me know. Also, I have the day 1 patch installed.


----------



## eternal7trance

Quote:


> Originally Posted by *TopicClocker*
> 
> Most people's problem is exactly that: it doesn't compare to GTA V. I think it's a nice game, excluding the performance problems. It's not meant to replace or be GTA; gameplay-wise it's some kind of modern Splinter Cell/Assassin's Creed hybrid, imo. But all this game has received is flak and negativity. I don't blame anyone for the PC complaints, as the problems sound pretty catastrophic for some people - hopefully they get sorted out - but wow, the hate is crazy.


Most of the reviews sound like they were written by a two-year-old just following what the other two-year-old sheep say.


----------



## Thetbrett

Had a quick run @ 1440p: Ultra, temporal AA, vsync 2, 3 pre-rendered frames, and yup, VRAM maxed out. Ran OK though.
Same settings at 1080p

actually use 14MB more. My conclusion is it just takes all it can and runs with it.


----------



## MonarchX

Quote:


> Originally Posted by *SoloCamo*
> 
> And yet this article on guru 3d says the exact opposite of what you just said?
> 
> http://www.hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/5#.U4fnb_mwKPl
> 
> Where are you getting your information from?
> 
> I rarely call the fanboy card at, but you sir, you need to really look at the bigger picture. The drivers 'issues' are an old retired and quite beaten horse that no longer apply first of all. Things don't alway's 'just work' on either side of the camp, both parties have their bad drivers, Stop being so blind.
> 
> just to quote one of many parts from my link above -


Did you mean HardOCP? You didn't provide a Guru3D link.







Here is the Guru3D link showing AMD hardware lags behind a little bit at the lower resolution of 1080p - http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review,9.html - and it makes sense, since the game is optimized for nVidia. It also shows that, at least at 1080p, 3GB of VRAM is plenty, and Watch Dogs only benefits from 4GB of VRAM at much higher resolutions. I've personally seen this game running on a 4GB R9 290 AMD card and my 3GB GTX 780 Ti nVidia card - there is no difference in stutters, but the AMD card produces about 10-15 fewer FPS, which had no negative or positive effect on stutters. MSI AB reported up to 3.7GB of VRAM usage for the R9 290, while it reported up to 2.9GB of VRAM usage for the GTX 780 Ti, and yet again - no stutter difference. That means the extra VRAM used on the 4GB R9 290 was more than likely for pre-caching data ahead of time, but that did not result in higher FPS or any stutter reduction. The same applies to 6GB GTX Titan cards - they report 3.5-3.7GB of VRAM usage, but run the game with similar FPS and stutters as the GTX 780 Ti.

Please realize that I think *AMD has superior GPU hardware to nVidia's and AMD hardware is capable of delivering more for a lower price*. This is why AMD hardware was picked for today's generation of consoles. nVidia, on the other hand, often lags in hardware and even software features, but it uses a much more simplified driver approach - a sane approach, as described by OpenGL developers - that results in much better drivers. I've yet to experience a single driver issue since I bought a GTX 680 two years ago and now a GTX 780 Ti. There were literally zero, and I used every official, unofficial, WHQL, or beta release. Take a look at nVidia's release notes and you will see that the bugs they fix in each release are very petty, mostly affecting a tiny number of people and usually older or niche cards. Also take a look at how few "hotfix" drivers nVidia has had to release to fix something important in a previous driver release that was affecting many people. nVidia also sends its engineers and developers out to different studios to assist game developers in optimizing their games for nVidia hardware, which is an expensive, but simple and working, methodology for providing its customers with stable and optimized drivers for any program out there. nVidia sticks to older approaches, older hardware architecture, and little & rare innovation, but that is what allows them to produce very stable drivers for all games.

AMD is different - they are creative, innovative, progressive, and provide support for the latest APIs, like full DirectX 11.1 support, while nVidia is still stuck on DirectX 11.0. However, the price of all that is consistent software issues. Driver issues for AMD are not a thing of the past, though they did get better since the Radeon 9700/9800 Pro days. As far as today's DirectX 11 driver scene goes - Catalyst 14.4 fixed several game issues, but it broke video rendering for madVR users (ALL of them, not just a select few). There is always some sort of nasty, pesky bug in each driver release... A black screen here, a crash there, an error somewhere else, a flashing texture in one game, or some incompatibility in another, or even broken video rendering... Don't even get me started on AMD Crossfire...

A recent OpenGL developer meeting provided descriptions of 3 companies and their products without naming any of them. It was incredibly easy to figure out which one was nVidia, which AMD, and which Intel. nVidia was described as a company with a sane approach to hardware and software for a premium price, while AMD was described as a hardware company mostly plagued by poor & complicated software/driver design and coding, often containing outdated low-level code created by people no longer working for the company - which all-in-all resulted in layers of messy code where one driver set fixed some things but broke a few others. When you try to fix a top-layer bug in one place and realize you can't fix it unless you fix a bottom-layer (foundation) bug, then fixing that bottom-layer bug does resolve the top-layer bug in one place, but creates a bug in another place, which in turn needs its own fixing. A poor foundation leads to a vicious cycle of fixing one bug and creating another. The only way to break the cycle is to re-write the foundation - *a perfect opportunity for MANTLE*. It's not just about a new API and CPU-overhead optimization, but also about having home-cooked drivers for a home-cooked API: a great API foundation plus a great driver foundation = stable drivers (in theory). AMD has already tried to re-write its DirectX drivers from the ground up once, and since then the drivers did get better, but I think they have now reached a point of no return, probably because it would take way too much time and effort to re-write the drivers from the ground up all over again, especially now that they are doing just that with Mantle. All that makes Mantle a really big gamble, but if they make it popular enough - it may just save AMD's reputation as a software-capable company.


----------



## Silent Scone

Quote:


> Originally Posted by *zacker*
> 
> I don't know why, but when I enabled 32x CSAA in the NVIDIA control panel I get almost no lag at all. If you try something from these, let me know. Also, I have the day 1 patch installed.


Are you sure it's even working? Are you using enhance or override?


----------



## KenjiS

Quote:


> Originally Posted by *Frankzro*
> 
> They ATE THIS GAME ALIVE! On MetaCritic!
> 
> OUCH!!


http://www.amazon.com/Watch-Dogs-Deluxe-Edition-Download/dp/B00FPQFXGK/ref=sr_1_3?ie=UTF8&qid=1401438917&sr=8-3&keywords=watch+dogs

Amazon too

Good way of getting their attention actually...


----------



## zacker

Quote:


> Originally Posted by *Silent Scone*
> 
> Are you sure it's even working? Are you using enhance or override?


Yes, it's working. Try the override option; I think it gives the best quality AA.


----------



## Silent Scone

Quote:


> Originally Posted by *Thetbrett*
> 
> 
> Had a quick run @ 1440p: Ultra, temporal AA, vsync 2, 3 pre-rendered frames, and yup, VRAM maxed out. Ran OK though.
> Same settings at 1080p
> 
> actually use 14MB more. My conclusion is it just takes all it can and runs with it.


Most games do this; they're just caching into all available space to avoid loading it in real time. This is why I get sick of people showing Afterburner readouts saying "look, it's using 4.7GB on my Titan". It's almost as stupid as saying "look, my HDD uses 400GB of space". What?

There is definitely some heavy VRAM usage going on in this game though, even at 1080p, which is why increasing the resolution has such a negative effect on lesser cards. You may find in games (not specifically this one) that the experience will be smoother on a card with more VRAM, as it's able to cache plenty more data ahead of time, but for the most part games are optimised well enough that you don't notice this happening.

If anything we should be pushing Nvidia and AMD to give us MORE VRAM at the more affordable level, and that is what NV are doing, admittedly, by giving us the GTX 780 with 6GB - but this is an aging card. The majority of gamers who were pushing for these new games will have already purchased the 3GB variants in the sales following the Ti launch. It's crude...

I think VRAM is going to be more and more of a hot topic as time goes on, and 3-4GB is just not going to cut it. Despite the criticism, the Ultra textures in this game, for a sandbox game, are very good - maybe not as optimised as they should be, but good nonetheless. It's better to have the extra memory and not need it than to need it and not have it. To compete with at least the PlayStation's level of capability, and to scale higher than 1080p, we need all the memory we can get. Just to note, this is not justifying the stuttering within Watch Dogs; if VRAM were absolutely imperative, you would find the game coming to a complete crawl. The game is clearly coping with what's available, just poorly.

Bottom line: thanks to the next-gen push moving away from the likes of the Xbox 360's 512MB of memory, seemingly 'large' amounts like 3GB aren't large any more... PC is about going above and beyond even when it's not strictly necessary, and for that we need more VRAM.
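To put the caching point in concrete terms, here's a toy Python sketch (purely illustrative - not how Disrupt or any real driver actually works, and all sizes are made up) of an LRU texture cache that only evicts when the VRAM budget is exhausted. It reports near-full usage on any card size, even though the per-frame working set is much smaller:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture cache that only evicts when the VRAM budget is full."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.cache = OrderedDict()  # texture id -> size in MB, oldest first

    def request(self, tex_id, size_mb):
        if tex_id in self.cache:
            self.cache.move_to_end(tex_id)  # mark as recently used
            return
        # Evict least-recently-used textures only when we actually must.
        while self.used_mb + size_mb > self.budget_mb and self.cache:
            _, evicted_mb = self.cache.popitem(last=False)
            self.used_mb -= evicted_mb
        self.cache[tex_id] = size_mb
        self.used_mb += size_mb

# Stream the same scene through a "2GB" and a "4GB" card:
for budget in (2048, 4096):
    cache = TextureCache(budget)
    for frame in range(1000):
        # ~50 textures (32MB each) visible per frame, slowly panning across the world
        for tex in range(frame % 50, frame % 50 + 50):
            cache.request(tex, 32)
    print(budget, cache.used_mb)
```

With these made-up numbers, the 2048MB card reports a full 2048MB and the 4096MB card reports 3168MB, even though only ~1600MB of textures are visible in any single frame - exactly the "it just takes all it can and runs with it" behavior being described.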


----------



## TopicClocker

Quote:


> Originally Posted by *Silent Scone*
> 
> Most games do this; they're just caching into all available space to avoid loading it in real time. This is why I get sick of people showing Afterburner readouts saying "look, it's using 4.7GB on my Titan". It's almost as stupid as saying "look, my HDD uses 400GB of space". What?


This - but when you start running into stutters, I think that's mostly when you know you're hitting a VRAM wall.

But for most people this game stutters everywhere, so it's practically impossible to say until they fix it.


----------



## eternal7trance

With everything cranked as high as it can go at 1440p, the highest I've seen was 3.6GB of VRAM usage. However, this is with 4x MSAA. If I try 8x MSAA the game slows down to almost nothing. So tired of people putting out badly optimized PC games.


----------



## NABBO

Quote:


> Originally Posted by *MonarchX*
> 
> There is no evidence that more VRAM eliminates stuttering... AFAIK the 6GB Titan struggles as much as a GTX 780 Ti.


"Politics aside, what's very clear is that those pursuing 60fps gameplay are facing a challenge. Texture management seems to be the major culprit - streaming ultra-level assets into and out of RAM simply doesn't happen fast enough, resulting in noticeable stutter. It's not a RAM issue based on our testing - we replaced a 3GB GTX 780 Ti with a 6GB GTX Titan, producing the exact same result. "

http://www.eurogamer.net/articles/digitalfoundry-2014-watch-dogs-face-off
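A toy simulation of what Digital Foundry is describing (every number here is invented for illustration): if a texture miss stalls the frame on a synchronous load, the frame-time spike is identical whether the card has 3GB or 6GB, because spare VRAM can't speed up the streaming path itself.

```python
FRAME_BUDGET_MS = 16.7   # 60fps target
LOAD_STALL_MS = 40.0     # made-up cost of a blocking texture upload

def simulate(frames, miss_frames, vram_mb):
    """Return per-frame times in ms. vram_mb is deliberately unused once the
    working set fits: extra VRAM cannot hide a synchronous streaming stall."""
    times = []
    for f in range(frames):
        t = FRAME_BUDGET_MS
        if f in miss_frames:  # streaming didn't finish in time: stall the frame
            t += LOAD_STALL_MS
        times.append(t)
    return times

misses = {30, 90, 150}  # e.g. driving fast into new city blocks
for vram in (3072, 6144):  # "3GB 780 Ti" vs "6GB Titan"
    times = simulate(200, misses, vram)
    print(vram, max(times))  # worst frame time is the same on both cards
```

The fix for this kind of stutter is making the loads asynchronous (or faster), not adding memory - which matches the patch-over-hardware conclusion in the article.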


----------



## KenjiS

^- interesting..

All we can do is wait till early June for a patch :/


----------



## NABBO

The problem is not the VRAM.
The problem is the game.
It needs a patch to improve the texture streaming.

Patch is coming.


----------



## Serandur

Quote:


> Originally Posted by *NABBO*
> 
> The problem is not the VRAM.
> The problem is the game.
> It needs a patch to improve the texture streaming.
> 
> Patch is coming.


That's a relief (780's still just fine). Not taking the blame away from Ubisoft or anything, but I wonder if VRAM bandwidth noticeably affects anything.


----------



## Leopard2lx

Quote:


> Originally Posted by *Norlig*
> 
> Patch for PC is incoming
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://twitter.com/SebViard/status/472036622171930624
> 
> http://www.forbes.com/sites/jasonevangelho/2014/05/29/ubisoft-working-on-pc-patch-for-watch-dogs-offers-advice-to-boost-performance/


I love how they recommend lowering the graphical settings... like the PC community are complete morons and we haven't thought of that.
I have news for you, Ubisoft: THE GAME STILL STUTTERS EVEN ON MEDIUM SETTINGS!!! (Less, but it still does.) How much lower do you want me to go on a $500 GPU just to play your garbage game, which isn't even as good as was promised? Downgraded, then delayed, and now it runs like crap! Turn on FXAA? Really? At 1080p FXAA does nothing but make my eyes bleed from all the jaggies on my 42-inch TV.

They simply still refuse to admit that their engine sucks. "Nvidia explains that Ubisoft's Disrupt engine is quite complex, and that the way the game streams in data may be a potential bottleneck."
So there you go!! It's not our PCs and settings, it's YOUR crap coding and engine!!


----------



## iamhollywood5

Quote:


> Originally Posted by *Serandur*
> 
> That's a relief (780's still just fine). Not taking the blame away from Ubisoft or anything, but I wonder if VRAM bandwidth noticeably affects anything.


I'm sure higher bandwidth does help make the stutters less severe, but it surely wouldn't eliminate them completely. But AMD's 512-bit bus isn't even fully utilized when they put such crappy low-bin 5GHz memory on their flagship card. I wish they'd release a clocked-up version with 6GHz or 7GHz memory, like they did with the 7970 GHz Edition (although they'd have to call it something different). As it stands, the 780 Ti actually has a slight lead in bandwidth thanks to its 7GHz memory.

If anybody has a 290X with golden memory chips that can be OC'd really far, I'd like to see how the game runs for them.
Quote:


> Originally Posted by *Leopard2lx*
> 
> I love how they recommend lowering the graphical settings... like the PC community are complete morons and we haven't thought of that.
> I have news for you, Ubisoft: THE GAME STILL STUTTERS EVEN ON MEDIUM SETTINGS!!! (Less, but it still does.) How much lower do you want me to go on a $500 GPU just to play your garbage game, which isn't even as good as was promised? Downgraded, then delayed, and now it runs like crap! Turn on FXAA? Really? At 1080p FXAA does nothing but make my eyes bleed from all the jaggies on my 42-inch TV.
> 
> They simply still refuse to admit that their engine sucks. "Nvidia explains that Ubisoft's Disrupt engine is quite complex, and that the way the game streams in data may be a potential bottleneck."
> So there you go!! It's not our PCs and settings, it's YOUR crap coding and engine!!


Yeah it's borderline insulting that they just tell us to lower settings instead of fixing their own product. I realize dropping $700 on a 3GB video card may not have been the wisest choice, but I'm just trying to run 1080p, Ultra textures, and Temp SMAA. Is it REALLY that much to ask to get the game running with those settings and stay under 3GB? Especially for a game that honestly looks quite mediocre even on max settings?

Nvidia should be raging at Ubisoft right now. This was supposed to be their banner-flying game.


----------



## emett

I'm playing an unpatched version on ultra with 4xTXAA and it's butter smooth.


----------



## zacker

TXAA sucks in Watch Dogs - it blurs way too much.


----------



## KenjiS

Quote:


> Originally Posted by *iamhollywood5*
> 
> Yeah it's borderline insulting that they just tell us to lower settings instead of fixing their own product. I realize dropping $700 on a 3GB video card may not have been the wisest choice, but I'm just trying to run 1080p, Ultra textures, and Temp SMAA. Is it REALLY that much to ask to get the game running with those settings and stay under 3GB? Especially for a game that honestly looks quite mediocre even on max settings?
> 
> Nvidia should be raging at Ubisoft right now. This was supposed to be their banner-flying game.


Glad I'm not the only one who felt that way. His entire tone felt almost condescending to the PC community: "Oh, you silly peons can't run it. You must not have a good enough computer / that's how PC games are because you lack unified memory."

Other games have managed just fine without unified memory. Heck, Crysis 3 runs great and looks gorgeous... It's not us, it's your complete lack of optimization.

I actually wrote up a big long post about what I think happened at Ubisoft with Watch Dogs. I don't remember which WD thread it was, though... I'll give the TL;DR version here:
Quote:


> Ubisoft started working on the PC version of Watch Dogs years ago; the version of WD we saw at E3 2012 was this original vision. At the time they didn't have much data on the Xbox One/PS4 beyond the fact that they used the x86 architecture, plus maybe some rough performance ideas. They proceeded with this knowledge and developed the Disrupt engine on the PC, figuring it would be simple to port. Then they got the initial dev kits for the PS4/Xbox One and found out - "oh crap" - they're nowhere near as powerful as expected, and the hardware's limitations in several areas meant the original Disrupt just wasn't going to work on them. Cue massive brick-crapping, and pretty much being forced to throw out all that work on the PC/next-gen version as they scrambled to get Disrupt working on the next-gen consoles. But no matter how hard they tried, they just couldn't get it working before November last year, thus forcing the delay. The 6 months since were basically spent getting a somewhat functional game on the Disrupt engine running correctly on the next-gen consoles. By the time all this messing about was done there was little to no choice: the PC version had to be ported out of the next-gen console version with little time for any form of testing or optimization.


I know it doesn't excuse the state the game is in, but it's a possible explanation.


----------



## Serandur

Quote:


> Originally Posted by *KenjiS*
> 
> Glad I'm not the only one who felt that way. His entire tone felt almost condescending to the PC community: "Oh, you silly peons can't run it. You must not have a good enough computer / that's how PC games are because you lack unified memory."
> 
> Other games have managed just fine without unified memory. Heck, Crysis 3 runs great and looks gorgeous... It's not us, it's your complete lack of optimization.
> 
> I actually wrote up a big long post about what I think happened at Ubisoft with Watch Dogs. I don't remember which WD thread it was, though... I'll give the TL;DR version here:
> I know it doesn't excuse the state the game is in, but it's a possible explanation.


I exited the thread quite a while back, but I strongly agree. Ubisoft's attitude towards PC gaming and gamers is disgusting, and it goes back to their original "PC guys just throw stronger hardware at it, who needs optimization" (a paraphrase, not an accurate quote) statement regarding AC IV, as well as their track record of pre-patch game performance. I'm not one to use the term "optimization" lightly, as it does get misused, but as far as incompetent programming goes, Ubisoft fit the bill, trying to dump as much as they can onto our hardware - or even onto their own flailing engine - with the justification that grossly overpowered future hardware will even out their issues. Their texture-streaming solution clearly isn't functioning properly; they need to be mindful of what hardware is on the market, especially when that hardware is multiple times more powerful and/or expensive than it needs to be to run their games properly. I almost wonder if they do this because they think it comes across positively to "bring high-end hardware to its knees" or something.


----------



## Bit_reaper

Quote:


> Originally Posted by *KenjiS*
> 
> Glad I'm not the only one who felt that way. His entire tone felt almost condescending to the PC community: "Oh, you silly peons can't run it. You must not have a good enough computer / that's how PC games are because you lack unified memory."
> 
> Other games have managed just fine without unified memory. Heck, Crysis 3 runs great and looks gorgeous... It's not us, it's your complete lack of optimization.
> 
> I actually wrote up a big long post about what I think happened at Ubisoft with Watch Dogs. I don't remember which WD thread it was, though... I'll give the TL;DR version here:
> I know it doesn't excuse the state the game is in, but it's a possible explanation.


Not to mention that it has been reported that the game still has stutter issues even on 6GB Titans. On the PS4, developers only have access to 6GB in total. Also, the PS4 hard drive is way slower than what any of us have as our main drives, so it's not a matter of bandwidth there either.

It's not the PC hardware, it's an engine problem. A lot of that going around lately: BioShock Infinite, Far Cry 3, CoD: Ghosts, etc. And it's not like it's the traditional micro-stutter caused by crappy multi-GPU support.


----------



## KenjiS

BioShock Infinite I don't recall having issues with, and I actually just played the Burial at Sea DLCs a couple of weeks ago to toy with my SLI setup (maybe it was fixed or some such). Far Cry 3 I didn't really have any problems with... but I played it well after it came out.

Not saying others didn't... I've just been fairly lucky, it appears.


----------



## Bit_reaper

Quote:


> Originally Posted by *KenjiS*
> 
> BioShock Infinite I don't recall having issues with, and I actually just played the Burial at Sea DLCs a couple of weeks ago to toy with my SLI setup (maybe it was fixed or some such). Far Cry 3 I didn't really have any problems with... but I played it well after it came out.
> 
> Not saying others didn't... I've just been fairly lucky, it appears.


Infinite got a patch that fixed most of the stuttering. It still had some minor hiccups when moving from certain areas to others, but as long as you set up your settings so you were pushing 60+ FPS at all times, the general stutter was pretty low.

I'm pretty sure Far Cry 3 got patched as well, but seeing as I didn't play it myself I don't know for sure.

The point is that those games stuttered on release and then got (mostly) fixed with patches, which just underlines the fact that it was never a hardware issue but a software one.

Putting out a poorly optimized game is one thing, but putting out a game that won't run smoothly is another. Random slowdowns, stutters, etc. should never be present in a final release. Sure, they can't check every possible hardware combo, but considering this is a widespread issue that affects what is essentially the target PC hardware, you have to assume they knew about it even before they shipped.

If the game won't run smoothly on a 4770K and a GTX 780, then what the hell is it supposed to run on in order not to stutter?


----------



## KenjiS

Quote:


> Originally Posted by *Bit_reaper*
> 
> If the game won't run smoothly on a 4770K and a GTX 780, then what the hell is it supposed to run on in order not to stutter?


I'd say, heck, even my rig should handle most games thrown at it and come out on top... I'd like to think my rig is pretty powerful, you know?


----------



## Silent Scone

Quote:


> Originally Posted by *KenjiS*
> 
> Glad I'm not the only one who felt that way. His entire tone felt almost condescending to the PC community: "Oh, you silly peons can't run it. You must not have a good enough computer / that's how PC games are because you lack unified memory."
> 
> Other games have managed just fine without unified memory. Heck, Crysis 3 runs great and looks gorgeous... It's not us, it's your complete lack of optimization.
> 
> I actually wrote up a big long post about what I think happened at Ubisoft with Watch Dogs. I don't remember which WD thread it was, though... I'll give the TL;DR version here:
> I know it doesn't excuse the state the game is in, but it's a possible explanation.


In fairness, what other sandbox games are there to speak of, though? Crysis 3, although a good game in my eyes, is still pretty much a rail shooter. I think, although there are definitely some underlying issues with their engine, he still has a point. The unified memory the PS4 has, along with the fact that it can allocate 3GB for graphics alone, means we are quite obviously no longer in the high end of the segment in this department. I think this was always going to happen regardless of poor optimisation. We know developers tend to brush us PC users aside as an afterthought a lot of the time, so now that they're not limited to the small memory of the last-generation boxes, we're probably seeing more symptoms of not only poor code, but DirectX overheads and memory management struggling to cope. Times are changing!

I also think there is a lot of denial across internet boards about just how much VRAM we really need these days lol.


----------



## Alatar

I'm trying to get WD to run well in windowed mode (so I could stream it) but the game refuses to have high GPU utilization while windowed.

GPU usage can even drop to sub 50% while driving...


----------



## Leopard2lx

Quote:


> Originally Posted by *KenjiS*
> 
> BioShock Infinite I don't recall having issues with, and I actually just played the Burial at Sea DLCs a couple of weeks ago to toy with my SLI setup (maybe it was fixed or some such). Far Cry 3 I didn't really have any problems with... but I played it well after it came out.
> 
> Not saying others didn't... I've just been fairly lucky, it appears.


The last time I played BioShock Infinite was a few months ago and it still stuttered sometimes, even though my average was around 120-130 FPS. LMAO!


----------



## KenjiS

Quote:


> Originally Posted by *Silent Scone*
> 
> I also think there is a lot of denial across internet boards about just how much VRAM we really need these days lol.


I think the issue with the amount of VRAM Watch Dogs is consuming is that the textures aren't even that sharp or crisp. They're not really high quality; I've seen better. We wouldn't be getting so much complaining if the game actually *looked* like how it performs...

Now, as you said, Crysis 3 isn't really the same genre... so I concede that.

I don't think the PC is getting a good benchmark title till The Witcher 3, at this rate... CD Projekt ALWAYS does right by the PC guys.


----------



## revro

Hmm, my Far Cry 3 had no problems, and I had a preorder and played it on a 660 on a 1080p monitor.


----------



## Alex132

Yeah, playing this game is basically quite a problem for me.

With all settings on Ultra, textures on Medium and TXAA x2, I get these frames:
Quote:


> 2014-05-30 22:44:03 - Watch_Dogs
> Frames: 77270 - Time: 1609867ms - Avg: 47.998 - Min: 0 - Max: 148


Note the min FPS? That's because there are MASSIVE slow-downs when loading textures; I mean the entire game freezes for 5-10 seconds, which is crippling during missions. Not only that, but the pop-in is a joke! It's actually rather hard to drive quickly on a highway because of how close the cars pop in!
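For what it's worth, FRAPS's summary line is internally consistent: the Avg is just Frames divided by elapsed time. A quick Python sanity check of the log line quoted above (standard FRAPS format, nothing game-specific):

```python
# Sanity-check a FRAPS benchmark summary line: Avg should equal Frames / Time.
line = "Frames: 77270 - Time: 1609867ms - Avg: 47.998 - Min: 0 - Max: 148"

fields = {}
for part in line.split(" - "):
    key, value = part.split(": ")
    fields[key] = float(value.rstrip("ms"))  # strip the "ms" unit from Time

avg_fps = fields["Frames"] / (fields["Time"] / 1000.0)  # ms -> seconds
print(round(avg_fps, 3))  # 47.998, matching the reported Avg
```

The Min of 0 is the tell: at least one full second passed with no frame rendered at all, which lines up with the multi-second texture-loading freezes.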
Quote:


> Originally Posted by *http://www.forbes.com/sites/jasonevangelho/2014/05/29/ubisoft-working-on-pc-patch-for-watch-dogs-offers-advice-to-boost-performance/*
> Unless you have 4GB of Video RAM, playing at 1440p or 4K will be problematic, at least for now.


Let's just go out and buy 6GB Titan Blacks why don't we? Oh wait, not everyone has $5000000 laying around.


----------



## SoloCamo

Quote:


> Originally Posted by *iamhollywood5*
> 
> I'm sure higher bandwidth does help make the stutters less severe, but it surely wouldn't eliminate them completely. But AMD's 512-bit bus isn't even utilized when they throw such crappy low-bin 5Ghz memory on their flagship card. Wish they'd release a clocked-up version with 6Ghz or 7Ghz memory, like they did with the 7970 Ghz Edition (although they'd have to call it something different). The way it stands, the 780 Ti actually has a slight lead in bandwidth thanks to 7Ghz memory.
> 
> If anybody has a 290X with golden memory chips that can be OC'd really far, I'd like to see how the game runs for them.


If I can do 1550-1600MHz, would that be considered golden? Shame I don't have the game to test with, but I regularly game at 1500MHz on the memory, which is just shy of 400GB/s of bandwidth vs the stock 320GB/s.
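Those bandwidth numbers check out: GDDR5 transfers four bits per pin per memory clock, so bandwidth is just memory clock × 4 × bus width / 8. A quick sketch of the arithmetic:

```python
def gddr5_bandwidth_gbps(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """GDDR5 is quad-pumped: effective rate = 4 x memory clock.
    Bandwidth (GB/s) = effective rate (GT/s) * bus width (bits) / 8."""
    effective_gtps = mem_clock_mhz * 4 / 1000.0
    return effective_gtps * bus_width_bits / 8

# 290X stock: 1250 MHz (5 GHz effective) on a 512-bit bus
print(gddr5_bandwidth_gbps(1250, 512))  # 320.0 GB/s
# Overclocked to 1500 MHz: just shy of 400 GB/s, as stated above
print(gddr5_bandwidth_gbps(1500, 512))  # 384.0 GB/s
# 780 Ti: 7 GHz effective on a 384-bit bus
print(gddr5_bandwidth_gbps(1750, 384))  # 336.0 GB/s
```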


----------



## inedenimadam

Playing on two 7970s at 6300x1152 with Ultra textures, the game is only playable if I turn practically everything else down and restrict myself to FXAA. MSAA x8 brings me down to 0-1 fps.


----------



## Onikage

Overclocked ASUS GTX 760 here, textures on High, SMAA, and it's still eating 2GB of VRAM. What a joke. The stuttering is still there too; not as bad as on Ultra, but it still happens.


----------



## Silent Scone

Think people should just wait for the patch. It's playable at 1440p on Ultra for me with an FXAA/SMAA combination, so I'm not sure why people with 3GB plus haven't been able to get it at least playable, especially at 1080p... maybe my spec helps.


----------



## philharmonik

Thanks for this, my friend! I have tried your settings and the game is running great. It looks even better than it did. I set everything as you suggested, except I'm running 1440p. I don't know what FPS I'm getting, but it's playing very smoothly, with only a few hiccups here and there, which I'm sure will get ironed out in the patch. The GPU is getting full load, and CPU utilization is between 50-60% on my FX-8350. I also uninstalled and re-installed to my SSD.
Quote:


> Originally Posted by *zacker*
> 
> ok, here are my tweaks for the game after 2 days of searching
> i5-3570K OC'd at 4.7GHz and a Gigabyte 780 Ti OC'd at 1250MHz with skyn3t's BIOS mod and the 337.88 drivers installed
> 
> these settings are a mix of MSAA x4 in game + FXAA from the Nvidia control panel and 32x CSAA
> 
> first I will post screenshots of my settings. the game is almost lag-free for me. I hope some of these settings help people in here max out their settings
> 
> 
> 
> 
> now, about my in-game settings: window mode is for taking screens only; I play fullscreen on a 60Hz monitor btw
> 
> 
> 
> in-game screenshots
> 
> i don't know why, but when I enabled 32x CSAA in the Nvidia control panel I get almost no lag at all. if you try any of these, let me know. also, I have the day-1 patch installed


----------



## KenjiS

What does 32x CSAA do anyway? It sounds like a really huge resource hog...

I may try it later.


----------



## xxpantherrrxx

What's even more hilarious is that one of the devs said a majority of their studio were running Ultra settings on GTX 670's.


----------



## ChronoBodi

Quote:


> Originally Posted by *xxpantherrrxx*
> 
> What's even more hilarious is that one of the devs said a majority of their studio were running Ultra settings on GTX 670's.


Was it a stuttering fest for them as well? They're running on mid-to-high GPUs; they need to seriously optimize for that.


----------



## Alex132

Quote:


> Originally Posted by *xxpantherrrxx*
> 
> What's even more hilarious is that one of the devs said a majority of their studio were running Ultra settings on GTX 670's.


Everybody lies.


----------



## Xboxmember1978

They can run it on onboard video but they never said it wasn't stuttering...lol...


----------



## philharmonik

Quote:


> Originally Posted by *KenjiS*
> 
> What does 32XCSAA do anyways.. that sounds like a really huge resource hog..
> 
> I may try that later.


http://gaming.stackexchange.com/questions/31801/what-are-the-differences-between-the-different-anti-aliasing-multisampling-set


----------



## zantetheo

Quote:


> Originally Posted by *eternal7trance*
> 
> With everything cranked as high as I can go at 1440p, the highest I've seen was 3.6GB of VRAM usage. However, this is with 4x MSAA. If I try 8x MSAA the game slows down to almost nothing. So tired of people coming out with badly optimized PC games.


GTX 770 4GB:

1080p Ultra + Temporal SMAA
3.3-3.5 GB
Playable, no stutter

1080p Ultra + 4x MSAA
3.9 GB
Not playable on a GTX 770 (frames drop to 35)


----------



## zacker

Yes guys, try the 32x CSAA. It looks perfect, better than everything, and gives good FPS for me. Also, 32x CSAA stacks with FXAA from the Nvidia control panel and stacks with the in-game settings, so choose whatever you like in game.


----------



## tor6770

I get around 40-60fps, 3.7 GB at 1080p, Ultra, 2x TXAA with my GTX 770 4GB OC (1200MHz/7500MHz).


----------



## zacker

Why do you use TXAA? It blurs faces and looks bad at distance.


----------



## StrongForce

Can anyone give an update on how a 7950/R9 280 runs at least High with the new drivers? Not necessarily with AA; I don't mind that.

This one is tempting: http://www.caseking.de/shop/catalog/Graphics-Card/All-Graphics-Card/VTX3D-Radeon-R9-280-X-Edition-3072-MB-DDR5-mini-DP-HDMI-DVI::26526.html and on sale that's like $260. Meh, I still have mixed feelings about what to get lol.


----------



## TopicClocker

Quote:


> Originally Posted by *StrongForce*
> 
> Any1 can update on how a 7950/r9 280 would run at least high with the new drivers ? not necessary with AA I don't mind that.
> 
> This one is tempting : http://www.caseking.de/shop/catalog/Graphics-Card/All-Graphics-Card/VTX3D-Radeon-R9-280-X-Edition-3072-MB-DDR5-mini-DP-HDMI-DVI::26526.html and on sale that's like 260$ Meh I'm still mixed feelings about what to get lol.


Have you got a 770? Not sure why it says 5870 under the 770. If so, you might want to wait for the newer-gen cards if you're interested in getting something from a newer generation; not to put you off or anything.


----------



## ChronoBodi

Quote:


> Originally Posted by *zacker*
> 
> yes guys try the x32 csaaa looks perfect better than everything and good fps for me also x32 cssa stacks with fxxa from nvida control panel and stacks with ingame settings choose what ever you like in game


How much of a hit is 32x CSAA? It might not be needed for me at 4K. Is it the same performance hit as 8x MSAA?


----------



## philharmonik

Quote:


> Originally Posted by *zacker*
> 
> yes guys try the x32 csaaa looks perfect better than everything and good fps for me also x32 cssa stacks with fxxa from nvida control panel and stacks with ingame settings choose what ever you like in game


Dude!!! Ever since I applied your settings plus SweetFX, I just played for almost 4 hours straight without a freeze/crash. Yes, it still has a slight 1-second freeze/hiccup here and there, but it looks incredible and I'm having a lot of fun. One thing I have noticed, compared to the E3 2012 demo, is that it seems like tessellation is disabled. In the E3 demo, at the nightclub, the lights on the marquee hang down like real light bulbs. In THIS PC version they are flat, not hanging down. I am disappointed with this and they had better fix it with the patch. I will say that I am enjoying the gameplay regardless of the graphics when compared to E3. Still, I wish it looked like the E3 graphics; it would make it that much more immersive. If anyone wants the SweetFX file that I have running, just let me know and I will post the link. Thanks!


----------



## Emu105

It's crazy; I posted about 32x CSAA and got flamed, with people saying it didn't do anything. It sure did for me.


----------



## pjckmen

Very interesting post, really informative. Of all the blogs I have read on this topic, this one is actually enlightening.


----------



## philharmonik

Quote:


> Originally Posted by *ChronoBodi*
> 
> how much of a hit is x32 CSAA? Might not be needed for me on 4K, is it the same performance hit as 8xMSAA?


I enabled the 32x CSAA in the Nvidia Control Panel and the game has been playing near flawlessly at 1080p. I switched from my 1440p monitor to my Samsung HDTV in the living room and I'm having a blast!!! When I first got the game, I couldn't play for more than 35-45 minutes without it freezing completely; I would have to restart the game all over again. Try zacker's settings and see if they work for you! Also, I got on YouTube and found some SweetFX settings, and that made the game look even better. I still believe that "tessellation" is missing, so I hope that is something that gets enabled in the incoming patch.


----------



## philharmonik

Quote:


> Originally Posted by *Emu105*
> 
> It's crazy I posted about 32XCSAaa and I got flamed saying it didn't do anything. Sure did for me


That just goes to show you, not everyone knows what they are talking about. When you get a game in this state, try EVERYTHING!


----------



## DIYDeath

Quote:


> Originally Posted by *twerk*
> 
> Is that a hard lock on the Ultra textures, so if you don't have 3GB you can't select it? If so, that's stupid.


I've seen several hundred comments from people complaining that their 400/600-series card can't play the game at Ultra.
I think that in itself says more than I ever could about the intelligence of some of the people complaining, and it probably has a lot to do with that lock, if it is a lock.
Quote:


> Originally Posted by *Emu105*
> 
> It's crazy I posted about 32XCSAaa and I got flamed saying it didn't do anything. Sure did for me


Welcome to the internet: where people don't do 10 minutes of research before opening their mouths to let a torrent of b.s. fall from their lips!

Of course 32x CSAA does something; CSAA stacks with other types of anti-aliasing. That's the whole point of CSAA.

Don't let those people get to you; they clearly have no clue.


----------



## StrongForce

Quote:


> Originally Posted by *TopicClocker*
> 
> Have you got a 770? Not sure why it says 5870 under the 770, If so you might want to wait for the newer gen cards if you're interested in getting something from a newer generation not to put you off or anything.


Oh, I forgot to remove it. I had it for 2 days, then decided to return it because I couldn't run BF4 fully maxed with MSAA x4 and vsync etc. (not that I care so much, but it was a $400 card). After seeing this thread I figured 2GB of VRAM simply wasn't future-proof enough, especially for a $400 card, so now I'm just considering getting an R9 280 as a temporary solution (my HD 5870 gets spikes in BF4 and is also pretty much dying on me; when I start my PC in the morning I see some mad flickering, like bad reception on a TV transmitter lol, and it was gone with the 770).

I'm also tempted to get the 780 6GB from Inno3D, but it feels a bit expensive. I'd almost rather wait for the 800 series and get something "newer" for that price.


----------



## degenn

I've been playing the game the last few days with SweetFX (OmniFX preset as usual, the preset seen in this thread is pretty ugly imo) and running at 2560x1600 with 4x MSAA/4x SGSSAA along with 32x CSAA and good lord does this game ever look incredible.







If only the gameplay could match the visuals, this game would be an instant classic. The gameplay isn't horrible, but it does get repetitive rather quickly -- it feels like it could've used another 6 months of dev time to add some more variety. Obviously it could've used some more optimization as well, though I'm not having problems running it, but that's probably only because my rig is on the overkill end of the spectrum.


----------



## Leopard2lx

Just played a bit with one of the SweetFX presets and man, does it look good! The visuals are now so much more "mature" and dramatic, and not so washed out. Now, if they could just fix the frame drops, it could redeem itself quite a bit. I actually decided to stop playing until the "promised" patch arrives because I want to have a better driving experience, now that the graphics look sooooo good with SweetFX.
Amazing what a few graphical filters can do! Was it really that difficult for Ubisoft to implement something similar?







I mean, wasn't SweetFX written by one guy? It's funny and sad at the same time that one or two guys can make something this good for free, but an entire team of highly paid devs can't.








This is just like Dark Souls and Durante's DSFix.


----------



## MonarchX

SweetFX won't work with Windows 8.1, will it?


----------



## TopicClocker

Quote:


> Originally Posted by *StrongForce*
> 
> Oh I forgot to remove it, I had it for 2 days then decided to return it as I couldn't run BF4 fully maxed with MSAAx4 and vsync etc ( not that I care so much but it was a 400$) card so after seeing this thread I figured 2Gb Vram simply was'nt enough future proof, especially for a 400$ card, so now I'm just considering getting a r9 280 as a temporary solution (my HD5870 gets spikes in bf4 and also is pretty much dying on me when I start my PC the morning I see some mad flickering/like if I had bad reception with a TV transmitter lol, and it was gone with the 770).
> 
> I'm also tempted to get the 780 6gb from Inno3d but.. feels a bit expensive, I'd almost rather wait for the 800's series and get something "newer" for that price.


Have you not got a card currently, then?
If you do, I'd say you might want to hold out for the newer gens (not entirely sure when they're going to hit, though). If you currently haven't got a GPU, you might want to get a card to hold you over until the new gens arrive. If you don't want to spend too much on a temporary solution, something along the lines of a 280 would probably be a good choice since it has 3GB of VRAM.
On the other hand, a 4GB card like the 290 is another powerful option, but you might want to research the current cards and what may or may not be releasing in the next few months. I almost bought a 290 that was on sale last month, but it's too soon for me to upgrade lol, and I don't want to sell my card, not yet anyway.

Why couldn't you fully max BF4 with MSAA x4? Were you running it at 1080p, and was it a lack of GPU power to push 60fps, or something else?

I myself am not sure about the 2GB situation. I'm happy with my card currently. For me it was either that or spend a bit more on a 2GB 770, and I'd still be in the same dilemma VRAM-wise, or go to the AMD side, where the VRAM situation is much better, but I'd be losing PhysX, which I've spent years missing out on (haven't experienced it since the 8000 series), plus ShadowPlay, long-term driver support, and so on.

If only Nvidia would give their cards sensible amounts of VRAM. 2GB on a 680 or 690 doesn't make any sense, while AMD was slapping 3GB on their 7900s. I'm not sure about 3GB; I wouldn't think there's a problem with it, but I don't know.
At this point in time I think it's best to shoot for 3-4GB cards. I hope Nvidia releases their newer generation with upwards of 4GB now; I do not want a repeat of this skimping on VRAM.


----------



## Leopard2lx

Quote:


> Originally Posted by *MonarchX*
> 
> SweetFX won't work with Windows 8.1, will it?


Yes, it does. Follow the YouTube guide posted here a few posts back. Works perfectly with Watch Dogs.


----------



## StrongForce

Quote:


> Originally Posted by *TopicClocker*
> 
> Have you not got a card currently then?
> 
> 
> 
> 
> 
> Well if you do I'd say you might want to hold out for the newer gens, not entirely sure when they're going to hit though, but if you currently haven't got a GPU maybe you might want to get a card to hold you over until the new gens if you want them, if you dont want to spend to much on a temporary solution something along the lines of a 280 would probably make for a good choice since it has 3GB vram.
> On the other hand a 4GB card like the 290 is another powerful card but you might want to research into the current cards and what new stuff may or may not be releasing in the next few months, I almost bought a 290 that was on sale last month but It's too soon for me to upgrade lol and I dont want to sell my card, not yet anyway.
> 
> Why couldn't you fully max BF4 with MSAAx4, were you running it at 1080p and was it a lack of GPU power to push 60fps or something else?
> 
> I myself am not sure about the 2GB situation, I'm happy with my card currently, for me it was either that or spend a bit more on a 2GB 770, and I'd still be in the same dilemma Vram wise, or go to the AMD side where the Vram side of things are much better, but I'd be losing Physx which I've wanted for years of missing out, hadn't experienced it since the 8000 series, and shadowplay, long-term driver support and so on.
> 
> If only Nvidia would give their cards sensible amounts of vram, 2GB on a 680 or 690 dont make any sense, meanwhile AMD was slapping on 3GB on their 7900s, I'm not sure about 3GB, I wouldn't think there's a problem with it but I dont know.
> At this point in time I think it's best to shoot for 3-4GB cards, I hope Nvidia release their newer generation cards with upwards of 4GB now, do not want a repeat of this skimming on vram
> 
> 


Yeah, agreed, it seems Nvidia is limiting their cards' potential with the amount of VRAM they put on. 4GB, on the other hand, sounds optimal.

Yeah, I have my HD 5870; guess I'll hold onto it for now... unless I see a really tempting cheap deal on that website, lol.

Also, when they release their new cards, they should release the Ti version and possibly the extra-VRAM options straight away (and the Ti with extra VRAM too lol..)

Quote:


> Originally Posted by *degenn*
> 
> I've been playing the game the last few days with SweetFX (OmniFX preset as usual, the preset seen in this thread is pretty ugly imo) and running at 2560x1600 with 4x MSAA/4x SGSSAA along with 32x CSAA and good lord does this game ever look incredible.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If only the gameplay could match the visuals then this game would be an instant classic. Gameplay isn't horrible but it does get repetitive rather quickly -- it feels like it could've used another 6 months dev-time to add some more variety to the gameplay. Obviously could've used some more optimization as well though I'm not having problems running it but that's probably only because my rig is on the overkill end of the spectrum.


Awesome, yeah. I've been playing The Witcher Enhanced Edition, and yesterday I installed SweetFX for it, and geez, it does look good! I might need to tweak the lighting a bit, but just adding the basic settings helps a lot. It's crazy; that thing is a really nice piece of software.


----------



## TopicClocker

Quote:


> Originally Posted by *StrongForce*
> 
> Yea agree it seems Nvidia is stripping their cards potential with the amount of VRAM they put on, 4gb on the other side sounds optimal..
> 
> Yea I have my HD5870, guess I'll hold onto it for now.. unless I see a really tempting cheap deal on that website, lol.
> 
> Also when they release their new cards, they should release the Ti version, and possibly the + Vram options straight away (and the Ti with extra Vram too lol..)
> Awesome yea..I've been playing the witcher enhanced edition and yesterday I installed SweetFX for it, and geez, it does look good ! I might need to tweak the lightning a bit but just adding the basic settings helps alot it's crazy, really nice piece of software that thing is.


Before I buy into the newer gens I'm trying to make sure I wait for the Ti versions to release; you know how Nvidia likes to do their Ti and Ti Boost editions lol. I'll probably get a new card offering at least 30% more performance, and that will be once they've revealed most of their lineup.
But if it doesn't have the RAM I expect, I won't buy it. I hope any card with 780-or-better performance has a minimum of 4GB of VRAM.


----------



## Silent Scone

4GB is definitely not optimal. If developers are already testing the limits of what they can push on the PS4 with the amount of unified memory it has, regardless of poor compression, then for above 1080p with any respectable level of AA, the Titan's 6GB is looking slightly less ridiculous right about now. If NV of all people are looking to put 8GB-plus on their next range, you can bet they know it's because they'll need it. I think people overthink the VRAM argument.

Currently, what other next-generation games are there to speak of? It's early days for the next-gen consoles. This is the first instance where my 780 Tis have hit a wall at 1440p, and it's not going to get better...

For years people have complained about poor textures and poorly ported games. Watch Dogs may not be the best example, or the exception, but it's definitely a shining example of memory allocation being pushed a lot further than before because of the new platforms.
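For rough context on why resolution plus AA eats memory so fast, here is some back-of-the-envelope math (illustrative only, not numbers from the game): the basic render targets alone scale linearly with both pixel count and MSAA sample count, before textures, which dominate actual usage, even enter the picture.

```python
# Back-of-the-envelope render-target sizes (color + depth, both multisampled).
# Illustrative only -- real engines add G-buffers, shadow maps, and textures,
# which dominate actual VRAM use.

def render_target_mb(width: int, height: int, msaa: int,
                     bytes_per_pixel: int = 4) -> float:
    samples = max(msaa, 1)
    color = width * height * bytes_per_pixel * samples
    depth = width * height * 4 * samples  # 32-bit depth/stencil
    return (color + depth) / 2**20

print(render_target_mb(1920, 1080, 4))  # 63.28125 MB
print(render_target_mb(2560, 1440, 4))  # 112.5 MB
print(render_target_mb(3840, 2160, 4))  # 253.125 MB
```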


----------



## TopicClocker

Quote:


> Originally Posted by *Silent Scone*
> 
> 4GB is definitely not optimal. If developers are already testing the limits of what they can push on PS4 with the amount of unified memory they have, regardless of poor compression, then for over 1080P and with any respectable level of AA then the Titans 6GB is looking slightly less ridiculous right about now. *If NV of all people are looking to plant 8GB plus on their next range*, you can count your chickens they know it's because they'll need it. I think people over think the VRAM argument.


Are they?
The 8GB can be allocated however the developers wish, but the thing is they have access to only 6GB IIRC (PS4) for games, unless this gets lifted years later, and that has to cover both graphics and the game itself. I would think 4GB would be good (for 1080p) and 6GB more or less optimal, but it's not easy to speculate when true next-generation exclusives aren't really available yet.

Another thing is that devs may use the VRAM just because it's there, which can help them load things quicker. Titanfall, for example, can stutter on 2GB cards with Insane textures because the game supposedly hasn't got texture streaming to prevent problems with loading the textures, which can be seen as a result of the Xbox One's unified memory. Watch_Dogs is another case of developers taking advantage of unified memory. Even the Xbox 360 benefited from its combined memory pool, while the PS3 had 256MB for system memory and 256MB for graphics.

What I don't understand is why and how Microsoft messed up the Xbox One so much. If it wasn't for Epic Games, the 360 supposedly wouldn't have had 512MB, and look at the Xbox One: DDR3. Could nobody tell them...


----------



## Leopard2lx

This isn't a VRAM issue. It's a "Ubisoft engine and coding sucks" issue. Just because the game uses so much VRAM doesn't mean it's justified. Even NV said it looks like the game engine is bottlenecking its texture streaming process. Also, how else do you explain turning everything down to Medium with no AA and the game still stuttering? Or the guys with Titans who still experience the same problem?


----------



## iamhollywood5

Quote:


> Originally Posted by *Leopard2lx*
> 
> This isn't a VRAM issue. *It's a "Ubisoft engine and coding sucks" issue*. Just because the game uses so much VRAM doesn't mean it's justified. Even NV said that it looks like the game engine is bottlenecking it's texture streaming process. Also, how else do you explain turning everything down to medium with no AA and the game still stutters? Or how about the guys with Titans that still experience the same problem?


Yes, it's a coding issue, but it's more just a "Ubisoft sucks" issue.

And yes, the texture streaming suckage is the reason they coded the game to reserve the vast majority of your VRAM, even if you have 6GB+, instead of only reserving what the game actually needs in a particular scene: it reduces how frequently textures need to be pulled from the system drive. The fact that Ubisoft intentionally coded the game to eat up all of your VRAM no matter how much you have reveals that they were FULLY aware of the texture streaming and stuttering issues before the game launched.

"Well, maybe PC gamers will be able to play our game 10 years from now when graphics cards have 24GB of VRAM"
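That "reserve nearly everything" behaviour is easy to picture. As a toy sketch (hypothetical code, not Disrupt's actual streamer; the class, names, and 90% figure are all made up for illustration), a texture cache that sizes its budget from total VRAM rather than from what the scene needs would look roughly like this:

```python
# Toy sketch (hypothetical, not Disrupt's actual code): a texture cache that
# budgets itself as a fixed fraction of total VRAM, regardless of scene needs.
from collections import OrderedDict

class TextureCache:
    def __init__(self, total_vram_mb: int, reserve_fraction: float = 0.9):
        # Budget scales with the card, not with the scene -> a 6GB card
        # gets "filled" just like a 3GB card does.
        self.budget_mb = int(total_vram_mb * reserve_fraction)
        self.used_mb = 0
        self.cache = OrderedDict()  # texture id -> size in MB, LRU order

    def request(self, tex_id: str, size_mb: int) -> bool:
        """True on a cache hit; on a miss, evict least-recently-used
        textures until the new one fits (that evict/reload churn against
        the system drive is where the stutter comes from)."""
        if tex_id in self.cache:
            self.cache.move_to_end(tex_id)
            return True
        while self.used_mb + size_mb > self.budget_mb and self.cache:
            _, evicted_mb = self.cache.popitem(last=False)
            self.used_mb -= evicted_mb
        self.cache[tex_id] = size_mb
        self.used_mb += size_mb
        return False

cache = TextureCache(total_vram_mb=3072)
cache.request("street_block_01", 512)
print(cache.budget_mb)  # 2764 -- ~90% of a 3GB card claimed up front
```

Under this scheme a monitoring tool always reports near-full VRAM, whatever the card, without that ever meaning the scene actually needs that much.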


----------



## revro

Yup, 32x CSAA works like a charm; all the stutter is gone. Though I'm still CPU bottlenecked, my driving FPS went from 20-22 to 28-30+, so that's pretty comfortable @1440p.


----------



## zacker

Nice to see that 32x CSAA helps you too.


----------



## DIYDeath

Quote:


> Originally Posted by *TopicClocker*
> 
> Are they?
> The 8GB can be allocated to whatever the developers wish, but the thing is that they have access to only 6GB IIRC (PS4) for games unless this gets lifted years later or so and this has to be used for graphics and for the game, I would think 4GB would be good (for 1080p), 6GB maybe be more or less optimal but it's not so easy to speculate when the true exclusive next generation games aren't really available yet.
> 
> Another thing is that the devs may just use the Vram because its there which can help them with loading things quicker, Titanfall for example which can stutter on 2GB cards with Insane textures, and this is because the game hasn't got texture streaming to supposedly prevent problems with loading the textures which can be looked as as a result of the Xbox One's unified memory, Watch_Dogs is another culprit of developers taking advantage of unified memory, even the Xbox 360 had its benefits because of it's combined memory pool whilst the PS3 had 256 for system memory and 256 for graphics.
> 
> What I dont understand is why and how Microsoft messed up the Xbox One so much, if it wasn't for Epic games the 360 supposedly wouldn't of had 512MB, and look at the Xbox One, DDR3. Could nobody tell them...


Agreed, what the heck happened there? Clearly the Xbox One is inferior to the PS4 and PC from a technical standpoint, and there's no reason it should be lagging behind the times. No offense to any Xbox One players, but I hope the console gets abandoned; a phrase from Angry Joe is what needs to be said to them at this point and no more.


----------



## Onikage

I thought forcing AA through the control panel doesn't work in DX11 games, or am I missing something?


----------



## Silent Scone

Quote:


> Originally Posted by *TopicClocker*
> 
> Are they?
> The 8GB can be allocated to whatever the developers wish, but the thing is that they have access to only 6GB IIRC (PS4) for games unless this gets lifted years later or so and this has to be used for graphics and for the game, I would think 4GB would be good (for 1080p), 6GB maybe be more or less optimal but it's not so easy to speculate when the true exclusive next generation games aren't really available yet.
> 
> A*nother thing is that the devs may just use the Vram because its there which can help them with loading things quicker*, Titanfall for example which can stutter on 2GB cards with Insane textures, and this is because the game hasn't got texture streaming to supposedly prevent problems with loading the textures which can be looked as as a result of the Xbox One's unified memory, Watch_Dogs is another culprit of developers taking advantage of unified memory, even the Xbox 360 had its benefits because of it's combined memory pool whilst the PS3 had 256 for system memory and 256 for graphics.
> 
> What I dont understand is why and how Microsoft messed up the Xbox One so much, if it wasn't for Epic games the 360 supposedly wouldn't of had 512MB, and look at the Xbox One, DDR3. Could nobody tell them...


Most modern rendering engines use this technique, but how well is obviously dependent on the game... Desktop Maxwell will undoubtedly have 8GB of VRAM, yes.


----------



## StrongForce

Would be nice if it had 8GB indeed. For all models? I'm a bit sceptical about this one. If Maxwell is 20nm with 8GB of VRAM (or at least their GTX 880, let's say), it's got to be quite badass... might be worth the wait after all lol. Ah well, I don't really care about 20nm, to be honest, as long as they can reduce power consumption and release it soon. I'd still be happy, because if I see a really good deal on an R9 280/280X soon, I might push the button.


----------



## DIYDeath

Quote:


> Originally Posted by *Onikage*
> 
> I thought forcing AA trough control panel dosnt work in dx11 games or am i missing something?


Sure it works, why wouldn't it?


----------



## MonarchX

Woah, these speculations are getting out of hand. *Let's get some things straight:*

1. Not all of the 8GB of RAM is used exclusively for graphics on the PS4 or XB1. They use about half for the OS and gameplay and the other half for graphics.
2. We don't really know how much memory Watch Dogs specifically uses for graphics on the PS4.
3. Just because a VRAM monitoring app shows a certain usage value doesn't mean the game needs that amount of VRAM to load all the textures and graphics required for smooth gameplay. Take BF4, for example: on 3-4GB cards it uses more than 2GB of VRAM, and yet it runs just as smoothly on 2GB cards. This was demonstrated on 2GB and 4GB GTX 770s.
4. Watch Dogs runs just as badly on 4-6GB cards as it does on 3GB cards, regardless of VRAM usage.
5. OCN rigs are not representative of your average PC gamer's specs; only the top 1-2% own high-end 3-4GB videocards.
6. Game developers and videocard makers cannot expect people to just upgrade to 4GB and 6GB cards, and yet they do... 3GB is within reason and can sustain even 4K in 99% of games today, but it's unlikely to keep doing so in the future.

No need to panic because Ubisoft messed up. You guys do realize that they knew very well the game would stutter like crazy even on high-end cards and yet still decided to release it, right??? If they were willing to release a game in such a state, what does that say about their coding skills? *Watch Dogs is a game everyone should play, but nobody should buy.*


----------



## Onikage

Quote:


> Originally Posted by *DIYDeath*
> 
> Sure it works, why wouldn't it?


I tried it in a couple of DX11 games and it never seemed to do anything. I also tried 32x CSAA in Watch Dogs and I'm not really sure it did anything; my FPS is the same. There seem to be fewer jaggies, but it could be a placebo effect.


----------



## Silent Scone

Sigh. Who said the PS4 had 8GB reserved for graphics? Nobody. The rumours of 8GB on Maxwell have nothing to do with the next-gen consoles directly; they come from leaked spec sheets. It's not unrealistic considering the 880M ships with that much. There is no speculation:

The PS4 has up to 3GB of its unified GDDR available for graphics, meaning that if this gets utilised, gaming on PC above 1080p is occasionally, and ever more often, going to require MORE than that. This is common sense... I'm not sure why people insist on denying that their precious cards aren't going to hold up much longer. I've got three of them and it doesn't bother me... On my phone, so excuse the crudeness, but 3GB is NOT plenty anymore.

Also, encouraging piracy of a game that's clearly had a lot of work put into it, bugs or not, is dumb. Get your coat...


----------



## BradleyW

People, the truth is, games don't need to use as much VRAM as Watch Dogs is using. It's just poor rendering techniques and bad optimization, which come hand in hand with the engine they've based Disrupt on.
Quote:


> Originally Posted by *Leopard2lx*
> 
> Yes, it does. Follow the YouTube guide posted here a few posts back. Works perfectly with Watch Dogs.


See sig.


----------



## StrongForce

Quote:


> Originally Posted by *BradleyW*
> 
> People, the truth is, games don't need to use as much VRAM as what Watch Dogs is using. It's just poor rendering techniques and bad optimization, which comes hand in hand with the engine they've based Disrupt on.
> See sig.


Remember it's an open-world game too, but yeah, I think you're right... it's probably a bit of both (heavy rendering and bad optimization/engine). Let's hope they don't use the Disrupt engine on other games then!


----------



## DIYDeath

Quote:


> Originally Posted by *Onikage*
> 
> I tried it in couple of dx11 games it never seemed to do anything also tried 32x csaa in watch dogs and not realy sure did it do anything my FPS is the same so im not realy sure there seems to be less jagies but it could be placebo effect


Without knowing all the details I can only speculate, and when it comes to computers, speculation without information is pointless. Can you tell me more about your hardware and setup?

It should just work, no tinkering needed. Hell, I even wrote a guide on forcing (OG/SG)SSAA in Watch Dogs on Win8.1, and no one has had issues yet bar a few SweetFX questions, so something fishy is going on.


----------



## TopicClocker

Quote:


> Originally Posted by *MonarchX*
> 
> Woah, these speculations are getting out of hand. *Let's get some things straight:*
> 
> 1. Not all of 8GB of GDDR5 RAM is used exclusively for graphics on PS4 or XB1. They use about half for OS and gameplay and the other half for graphics.
> 2. We don't really know how much RAM is being used by graphics on PS4 specifically for Watch Dogs.
> 3. Just because VRAM monitoring app shows a certain VRAM amount usage value doesn't mean that the game needs that amount of VRAM to load all the needed textures and graphics for smooth gameplay. Take BF4, for example. On 3-4GB VRAM cards, it uses above/more than 2GB of VRAM, and yet it runs just as smoothly on 2GB VRAM cards. This was demonstrated on 2GB & 4GB GTX 770 cards.
> 4. Watch Dogs runs just as crappy on 4-6GB cards as it does on 3GB cards, regardless of VRAM usage.
> 5. OCN.net rigs are not representative of your average PC gamer rig specs, where only top 1-2% own high-end 3-4GB VRAM videocards.
> 6. Game developers and videocard makers cannot expect people to just upgrade to 4GB and 6GB cards. 3GB is within reason and it can sustain even a 4K resolution in 99% of games.
> 
> No need to panic because Unisoft messed up. You guys do realize that they knew very well that the game would stutter like crazy even on high-end cards and yet they still decided to release it, right??? If they were willing to release a game in such a state, what can you say about their coding skills? *Watch Dogs is a game everyone should play, but nobody should buy.*


1. Yup, afaik the PS4 has 6GB of GDDR5 available for games, and that has to cover both the gameplay code and the graphics.
2. Guerrilla Games and Sucker Punch Productions have given presentations with profiler captures of video RAM usage, if that's anything to go by.
3. Indeed, some games will show 3-4GB of "usage" and still be easily playable on a 2GB card.
Quote:


> Originally Posted by *Silent Scone*
> 
> Sigh. Who said Ps4 had 8gb reserved for graphics? Nobody. The rumours of having 8gb on Maxwell are nothing to do directly with next gen consoles, they're through leaked spec sheets. It's not unrealistic considering 880m is equipped with this much. There is no speculation:
> 
> Ps4 has up to 3gb available within its unified GDDR for graphics meaning if this is utilised, gaming on PC at higher than 1080 is occasionally, and ever increasingly going to require MORE than this to be able to do so. This is common sense...I'm not sure why people insist on denying that their precious cards aren't going to hold up all that long. I've three of them and it doesn't bother me...On my phone so excuse the crudeness but 3GB is NOT plenty anymore.
> 
> Also encouraging piracy on a game that's clearly had a lot of work put into it regardless of bugs is dumb. Get your coat...


Why isn't 3GB plenty anymore? Well, it may not be "plenty", but I don't think it's a problem yet, for 1080p anyway. The only game that might give 3GB trouble is this one, and it only recommends 3GB for the highest-quality textures...

This game is in no shape to set benchmarks in its current state. I would think the next Batman or Witcher game, or another next-gen exclusive, would be able to.

Quote:


> Originally Posted by *StrongForce*
> 
> Remember it's an open world game too but yea I think you're right.. there is probably a bit of both (heavy rendering and bad optimization/engine) let's hope they don't use that Disrupt engine on other games then !


This is the first time they've used the engine; it probably just needs some work. GTA 4 wasn't too great when it used the RAGE engine (for the second time, I think), and over the years it's improved. Look at Max Payne 3, for instance, though I'm not sure if that was a one-off or if another Rockstar dev team ported it, so it's probably not a good example. On consoles, GTA 5 sports a higher resolution on the PS3: GTA 4 and Red Dead Redemption ran at 1152x640, while GTA 5 runs at 1280x720. The 360 versions were all 720p, I think.

Quote:


> Originally Posted by *DIYDeath*
> 
> Agreed what the heck happened there? Clearly the Xbox One is inferior to the PS4 and PC from a technical standpoint and there's no reason it should be technically lagging behind the times. No offense to any Xbox One players but I hope the console gets abandoned, a phrase from Angry Joe is what needs to be said to them at this point and no more.


I don't think it should get abandoned, but goddammit Microsoft: every system you've released has been more or less technically superior to the competition (in the 360/PS3 generation, the PS3's upper hand was the Cell's SPEs; it had the inferior GPU, but the SPEs could aid the RSX, and without them it would have been the weaker machine), and on your third-generation system you completely fluff it up in practically the worst way possible! What genius thought DDR3 was a good idea for a "next generation" system in 2013 that's expected to push 1080p for maybe another 7 or 8 years? The GPU is basically an R7 250, and the desktop version is clocked higher than the one in the console!

Why isn't it pushing for power like their previous generation systems? The 360 was close to cutting edge and brought new things to the table; the Xbox One's big addition was Kinect 2.0, which is now getting removed from some SKUs.

Sony didn't even have to try. I'm no fanboy, I like both the 360 and the PS3, but Microsoft has done nothing but disappoint people, especially with that always-online DRM they tried to push and then did a 180 on; it should be called the Xbox 180 if anything.

I know they'll squeeze power out of it and all, but it's practically doomed for the rest of the generation on the technical side. I'm so disappointed. The IPs are going to be good though; can't wait for Halo 5. But can you imagine what they could do with the PS4's GPU? No, forget that: what if they'd put something along the lines of a 7870 XT in there?


----------



## DIYDeath

Quote:


> Originally Posted by *TopicClocker*
> 
> 1. Yup, the PS4 afaik has 6GB GDDR5 RAM for it's use and this has to be used for the game/gameplay and the graphics.
> 2. Guerrilla Games and Sucker Punch Productions have presentations with profilers of the video ram usage if that's anything to go by.
> 3. Indeed some games will be showing usage of 3-4GB of "usage" and on a 2GB card it's easily playable.
> Why isn't 3GB not plenty anymore? Well it may not be "plenty" but I dont think it's a problem as of yet, for 1080p anyway, the only game to possibly give 3GB trouble may just be this game, and this game recommends 3GB for the highest quality textures...
> 
> 
> 
> 
> 
> 
> 
> 
> This game is in no shape to set benchmarks with its current state, I would think the next Batman or Witcher game, or another next gen exclusive would be be able to
> This is the first time they've used the engine, It probably just needs some work, GTA 4 wasn't to great when It used the RAGE engine for I think the second time, and over the years it's improved, just look at Max Payne 3 for instance not sure if that was a one off or another Rockstar dev team had ported it, but it's probably not a good example. Well for consoles GTA 5 sports a higher resolution on the PS3, it was 1152x640 with GTA 4 and Red Dead Redemption and now it's 1280x720 with GTA 5, the 360 versions were all 720p I think.
> I dont think it should get abandoned, but goddammit Microsoft, for all of your systems you've released they've been more or less technically superior to the other systems, (360, PS3 gen, the PS3's upper hand was the Cell's SPEs, it had an inferior GPU but the Cell's SPEs could aid the RSX without the SPEs it would be an inferior machine) and on your third generation system you completely fluff it up, in practically the worse way possible! What genius thought DDR3 was a good idea for a "Next Generation" system in 2013 expected to push resolutions of 1080p for maybe another 7 or 8 years? The GPU is basically a R7 250, and the desktop version is clocked higher than the one in the console!
> 
> I know they'll squeeze power out of it and all but it's practically doomed for the rest of the generation on the technical side, I'm so disappointed, the IPs are going to be good though, can't wait for Halo 5, but can you imagine what they could do with the PS4's GPU, no forget that, if they put something along the lines of a 7870XT in there?


I'm a PC guy myself, so in reality I'm just happy consoles are now 64-bit so they can stop holding back PC gaming on such a fundamental level, lol.


----------



## TopicClocker

Quote:


> Originally Posted by *DIYDeath*
> 
> Im a pc guy myself so in reality Im just happy consoles are now 64bit so they can stop holding back pc gaming on such a fundamental level, lol.


I'm primarily a PC gamer, but I do love some of the console exclusives; I just wish they were on PC too.

I'm glad they're 64-bit, but consoles will always hold back PC gaming, at least for multi-platform titles. PC hardware keeps moving forward while console hardware stays static, though the extra RAM they've got should do some good for the future.


----------



## StrongForce

Quote:


> Originally Posted by *TopicClocker*
> 
> This is the first time they've used the engine, It probably just needs some work, GTA 4 wasn't to great when It used the RAGE engine for I think the second time, and over the years it's improved, just look at Max Payne 3 for instance not sure if that was a one off or another Rockstar dev team had ported it, but it's probably not a good example. Well for consoles GTA 5 sports a higher resolution on the PS3, it was 1152x640 with GTA 4 and Red Dead Redemption and now it's 1280x720 with GTA 5, the 360 versions were all 720p I think.
> I


Didn't know Max Payne 3 was using the same engine, interesting. Yes, it looks much better, but it's also corridor gameplay, so it's easier for them to make it look good. I haven't played GTA 5 yet; I hope they ever release the PC version. That thing should be released already; what a bunch of ******* those devs are for not releasing on PC straight away. And I wasn't impressed with what I saw besides the graphics and the tons of useless minigames: what real stuff did they add? Ah well, that's a question for another thread; getting slightly off topic here.

By the way guys, I made a thread in Graphics Cards/General trying to figure out if multi-GPU could be an option for me, asking about the benefits and the downsides. If you're interested, check it out and enlighten me; I think it could be helpful for newbies who have no clue too.


----------



## philharmonik

Has anyone noticed there is no Tessellation in WD? In the E3 demo there was tessellation.


----------



## awdrifter

Probably because the consoles can't handle tessellation so they dropped it.


----------



## iamhollywood5

Quote:


> Originally Posted by *awdrifter*
> 
> Probably because the consoles can't handle tessellation so they dropped it.


Pretty sure the new consoles can do at least some tessellation, but they had enough problems getting the game to run at 30fps on these consoles even without it. But yeah the lack of tessellation is one of the reasons the game looks so bland.


----------



## StrongForce

Yeah, and why didn't they include it on PC? That's the question... Because the game is already too laggy as it is / lacks optimization? Pff.


----------



## Leopard2lx

I am really tired of PC gamers getting all these second-hand console ports. I really wish they would do more PC exclusives, properly optimized and executed. At the end of the day, it all boils down to GREED. They make more money by catering to a larger demographic, and PC gamers always get the leftovers.
They made their money off of all the PC users who bought it, and now maybe they'll fix it... or maybe they won't, whenever they feel like it. They probably don't even care that much about their reputation; look at EA, one of the worst companies, yet they still sell well and make good money. Case in point: Far Cry 3 and Assassin's Creed 3. Ubisoft messed those up good, and people still bought WD like crazy; it's now their fastest-selling game ever, even with their history of bad ports and downgrades. When the next Assassin's Creed comes out, it will probably sell well too. They don't care, because we keep buying their crap no matter what.

Well, lesson learned for me. Never pre-order again!


----------



## StrongForce

Yeah really. Look at GTA 5: didn't they take in a billion dollars in the first few days? Have they ported it to PC yet? No, supposedly because there aren't enough customers on PC?! Really! You just made 700 million in profit and you talk about it not being worth investing a few million for PC gamers, on something that would sell millions of copies over the next few years. It really is all about greed. They'll probably release it eventually, and I bet it'll be like their trainee developers did the work, like they didn't invest a cent in it, lol!

It kind of sums up the mentality of some developers towards PC gaming for me...

Does anyone know how Assassin's Creed 3 runs on consoles, by the way? I'd be curious; maybe it's just badly optimized on all platforms.


----------



## KenjiS

Tried the 32x CSAA settings listed earlier. WOW, it makes the game look good...

Sadly it didn't fix the stutter, but I admit I turned SLI back on, which is probably why.

Will likely wait for the patch. I'd really like SLI to work :C


----------



## error-id10t

Yeah, I also tried 32x CSAA; with no drop in FPS it's bizarre, but to my eyes it does work. I now use that with 16x AF, which I'd always been using anyway. I can't run 8x MSAA on my system, so I'd resort to 4x TXAA, but for those curious I saw a ~15 FPS drop with it, which is too large.
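That ~15 FPS figure is worth translating into frame time: the same FPS drop costs much more per-frame time at lower framerates, which is why it can feel "too large". A quick sketch (pure arithmetic, not measured Watch Dogs numbers; the example framerates are made up):

```python
# Convert an FPS drop into the extra per-frame cost in milliseconds.
# Pure arithmetic, not measurements from Watch Dogs.

def frame_time_ms(fps):
    return 1000.0 / fps

def extra_cost_ms(fps_before, fps_after):
    """Additional milliseconds spent on each frame after the drop."""
    return frame_time_ms(fps_after) - frame_time_ms(fps_before)

# The same "15 FPS" drop costs far more frame time at low framerates:
print(f"{extra_cost_ms(75, 60):.2f} ms")  # 75 -> 60 fps: ~3.33 ms per frame
print(f"{extra_cost_ms(45, 30):.2f} ms")  # 45 -> 30 fps: ~11.11 ms per frame
```

This is also why comparing AA modes by raw FPS alone can mislead: judge the cost in milliseconds at the framerate you actually play at.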


----------



## Emu105

Quote:


> Originally Posted by *error-id10t*
> 
> Yeah I also tried the CSAAx32, with no drop in FPS it's bizarre but for my eyes, it does work. I now use that with AFx16 which I had always been using. Can't run MSAAx8 on my system so I resort to TXAAx4 (for those curious I saw ~15FPS drop which is too large).


TXAA makes the game look really blurry


----------



## Devnant

Yes. TXAA is just plain horrible. MSAA is much, much better. I stopped playing the game, and will wait for a patch.


----------



## MonarchX

Quote:


> Originally Posted by *Silent Scone*
> 
> Sigh. Who said Ps4 had 8gb reserved for graphics? Nobody. The rumours of having 8gb on Maxwell are nothing to do directly with next gen consoles, they're through leaked spec sheets. It's not unrealistic considering 880m is equipped with this much. There is no speculation:
> 
> Ps4 has up to 3gb available within its unified GDDR for graphics meaning if this is utilised, gaming on PC at higher than 1080 is occasionally, and ever increasingly going to require MORE than this to be able to do so. This is common sense...I'm not sure why people insist on denying that their precious cards aren't going to hold up all that long. I've three of them and it doesn't bother me...On my phone so excuse the crudeness but 3GB is NOT plenty anymore.
> 
> Also encouraging piracy on a game that's clearly had a lot of work put into it regardless of bugs is dumb. Get your coat...


This is obviously OK with you, because someone who can afford your rig can afford to quickly upgrade to the newest technology.

Now I do agree that 4K is going to need more than 4GB, but only for a few titles. As of right now, I think 3GB can handle almost all games at 4K just as easily as 4GB, with one or two exceptions. You do know that 4K is the equivalent of 1080p with 4x OGSSAA, right?

1440p will also push those 3GB boundaries with console-optimized games. It's undeniable that many games will continue to be poorly optimized for PC, and if you want to play them instead of complaining about them, you'll have to suck it up and buy appropriate hardware for 1440p, 1600p, and certainly 4K.

For 1080p, 3GB is just right. I am glad I stayed with 1080p, because the best monitor only supports that much. The worst case for 3GB cards is using SMAA instead of MSAA to conserve memory. Again, you also need a GPU that can process data fast enough on the fly to reduce the need for pre-caching, so a 4GB GTX 770 or a 3GB R9 280X isn't going to cut it even for 2014 titles at 60fps; more like a 30-35fps average with dips into the low 20s and rare spikes up to 45fps. I do expect my card to manage about 40-60fps in poorly optimized games. AC4 is also poorly optimized, but it sure runs great on my humble rig.

It's also not unthinkable that a powerful $1000+ SLI system will be needed GPU-wise to render 4K, and even 1440p, until much better hardware comes out. A 3GB GTX 780 Ti with a good OC, or a 4GB R9 290X with a really good OC, really is the minimum for decent performance at 1080p in 2014 games. 2x SLI of the same cards is the minimum for 1440p, and 3x SLI of the same cards is the minimum for 4K.

The era where you could buy a decent gaming PC for $1K was over the day the new console generation went on sale. If console power increases 2x, PC power needs to increase 3x or even 4x to keep up. Either get a console or a premium-priced PC if you want to play recent games on Ultra settings. Ultra settings are for Ultra PCs, High settings are for high-end PCs, and so on. I feel kind of bad for people who buy a GTX 760 and expect to run Ultra or Very High settings; it's a card for Medium settings with some set to High. Don't trust nVidia's settings recommendations, or at least drop them all by one notch.
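The 4K-vs-OGSSAA comparison above is straight pixel arithmetic: 4x ordered-grid supersampling shades four samples per output pixel, and UHD has exactly four times the pixels of 1080p:

```python
# UHD shades exactly 4x the pixels of 1080p, the same sample count
# as 1080p with 4x ordered-grid supersampling (OGSSAA).
uhd = 3840 * 2160   # 8,294,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels
print(uhd / fhd)    # prints 4.0
```

Treat the 4x as a shading-cost equivalence rather than a memory one: framebuffers and render targets scale with resolution, but geometry and compressed textures largely don't.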


----------



## p00ter71

The game runs really smoothly on my rig: 30-40 fps on Ultra, so I dropped down to High, where I get 40-55 fps. No stutters.


----------



## eternal7trance

I noticed this game started running a lot smoother once I put it on my SSD.


----------



## p00ter71

Yea I installed mine to a SSD as well.


----------



## Silent Scone

Hmm, yeah, I'm starting to wonder if it's worth moving it over to my OS drive. It's currently on my secondary RAID SSD partition, but putting it on the OS SSD partition, which is capable of over 1GB/s, may help in some ways.


----------



## Jedson3614

I'm running Ultra on a 4770K and a 2GB GTX 670 with no stuttering. My GTX 670 is overclocked, and so is my CPU, but I've had no issues playing on Ultra with the high-res textures. It isn't even on my SSD; it's on my secondary mechanical drive.


----------



## philharmonik

Quote:


> Originally Posted by *Jedson3614*
> 
> I'm running ultra on 4770k and gtx 670 with 2gb no stuttering, my gtx 670 is overclocked, and so is my cpu, but I have had no issues playing in on ultra or with high res textures. it isn't even on my SSD either. It's on my secondary mechanical drive.


What resolution are you running at? It seems odd that you're having no issues with only 2GB of VRAM on a 670, especially since the requirement is 3GB for Ultra textures.


----------



## revro

Quote:


> Originally Posted by *MonarchX*
> 
> This is obviously OK with you because someone who can afford your rig, can afford to quickly upgrade to the newest technology.
> 
> Now I do agree that 4K is going to need more than 4GB, but only for a few titles. As of right now, I think 3GB can handle almost all games @ 4K just as easily as 4GB with 1 or 2 exception games. You do know that 4K is equivalent of 1080p with 4s OGSSAA, right?
> 
> 1440p will also push those 3GB boundaries with consoles optimized games. Its undeniable that many games will continue to be poorly optimized for PCs and if you want to play them instead of complaining about them then you will have to suck it up and buy appropriate hardware for 1440p, 1600p, and surely 4K.
> 
> For 1080p, 3GB is just right. I am glad I stayed with 1080p because the best monitor only supports that much. Worst case scenario for 3GB cards is using SMAA instead of MSAA to conserve memory. Again, you also need a fast GPU that can process data fast enough on the fly to reduce the need for pre-caching, so 4GB GTX 770 and 3GB R9 280X are not going to cut it even for 2014 titles to run at 60fps, but more like 30-35fps average with dips into low 20s and rare spikes up to 45fps. I do expect my card to perform at about 40-60fps on poorly optimized games. AC4 is also poorly optimized, but it sure runs great on my humble rig.
> 
> Its also not unthinkable that a powerful $1000 SLI system will be needed GPU-wise to render 4K and even 1440p untunt much better hardware comes out. 3GB GTX 780 Ti with good OC and 4GB R9 290X with really good OC @ 1080p really is the minimum for decent performance in 2014 games. 3GB-4GB+ 2x SLI for the same cards is the minimuum for 1440p. 4GB-6GB+ 3x SLI of the same cards is the minimum for 4K.
> 
> That era where you could buy a decent gaming PC for 1K was over the day new console generation went for sale. If a console power increases 2X, then PC power needs to increase 3X or even 4X to keep up. Either get a console or a premium-priced PC if you want to play recent games with Ultra settings. Ultra settings are for Ultra PCs. High aettingsbare for high-end PCs and etc. I feel kind of bad for people who buy GTX 760 and expect to run Ultra or Very High settings. Its a card for Medium settings with some set to High. Don't bother with nVidia settings recommendation or at least drop all their recommendations by 1 notch.


You don't necessarily need two 780 Tis or two 290Xs; a standard 780 or a 290 suffices, and you save hundreds of dollars/euros.


----------



## Jedson3614

1080p. I have the old Asus VGN nVidia 3D Vision monitor, and it runs at 120Hz with 3D off.


----------



## MonarchX

Quote:


> Originally Posted by *revro*
> 
> you dont necessary need 2 780ti or 2 290x, 780 standard and 290 suffice themselves and you save hundreds of dollars/euros


That's for PC-optimized games. I was talking about the fact that most AAA games out there will be optimized for consoles, and playing them on PC with even acceptable framerates takes massive horsepower. For PC-optimized games you can get by with an R9 280X 3GB / R9 290 4GB at 1080p with MSAA/SMAA, or a GTX 770 4GB at 1080p with SMAA (2GB would also work for a while, or with High/Very High settings instead of Ultra, but it sure isn't optimal), at least until the end of 2014. The GTX 780 and R9 290X are the cards you need for not just acceptable but good framerates in PC-optimized games. I don't consider the R9 290 a card for consistently great framerates, seeing how in many cases the R9 290X performs on the level of a non-Ti GTX 780; the R9 290 sits between acceptable and good, a best bang-for-the-buck deal. I think a GTX 770 4GB costs as much as an R9 290, but the 290 performs better. The GTX 780 Ti and Titan Black are the cards for excellent framerates. I also assume all the cards I mentioned would be overclocked by at least 30%. If I were forced to run stock clocks, or even SuperClocked settings, on my GTX 780 Ti, I'd be pissed about spending $700 on it.

For console-optimized games I also assume nothing less than a 4.7GHz i7 2600K, 4.5GHz i7 3770K, or 4.4GHz i7 4770K, or i5 CPUs with similar performance, like a 5GHz i5 2500K, 4.8GHz i5 3570K, or 4.7GHz i5 4670K. I could be wrong about the exact clocks that would yield identical performance, but I hope I'm close. 8GB of RAM would be the minimum, though 16GB would be better. An SSD would also be a must-have, but it's a must-have for ANY rig these days. That really is what it takes to get those "consolified" games with bad PC optimization running well. For PC-optimized games, 8GB of RAM is fine and you can safely game even on a stock i5 2500K, although you'd be dumb not to overclock it.


----------



## DF is BUSY

Quote:


> Originally Posted by *philharmonik*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jedson3614*
> 
> I'm running ultra on 4770k and gtx 670 with 2gb no stuttering, my gtx 670 is overclocked, and so is my cpu, but I have had no issues playing in on ultra or with high res textures. it isn't even on my SSD either. It's on my secondary mechanical drive.
> 
> 
> 
> What resolution are you running at? That seems odd that you are having no issues with only 2GB of VRAM on a 670. Especially since the requirement is 3GB for Ultra textures.

I recall a post somewhere on the internet saying that if you have a 2GB card and run ultra textures, the game will constantly re-render far-away textures, or take far-away textures and use them for things in your immediate view distance/surroundings... or something like that.

That aside, I run ultra textures on my 2GB 770 and it has been fine (playable), though not as "smooth" as high textures, I think.

Ultra shadows/reflections in certain areas at certain times of day kill my FPS though, ultra textures or not.

Probably a lack of raw horsepower, if anything.
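That streaming description is consistent with how a fixed VRAM budget behaves like an eviction cache: once the working set of ultra textures exceeds the budget, something gets thrown out and re-uploaded later, and those re-uploads are where stutter tends to come from. A toy sketch of the idea (hypothetical texture sizes and counts; Disrupt's actual streamer is certainly far more sophisticated):

```python
from collections import OrderedDict

# Toy model of a VRAM texture budget with LRU eviction (hypothetical
# sizes; not how Disrupt's real streamer is implemented).

class TextureCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.resident = OrderedDict()  # name -> size_mb, oldest first
        self.evictions = 0             # each eviction = a later re-upload

    def request(self, name, size_mb):
        if name in self.resident:              # already in VRAM: cheap
            self.resident.move_to_end(name)
            return
        while self.used + size_mb > self.budget and self.resident:
            _, freed = self.resident.popitem(last=False)  # evict LRU
            self.used -= freed
            self.evictions += 1
        self.resident[name] = size_mb
        self.used += size_mb

# Same 3000 MB working set, two budgets: the smaller one thrashes.
for budget in (2048, 3072):
    cache = TextureCache(budget)
    for frame in range(100):
        for tex in range(30):                  # 30 textures of 100 MB
            cache.request(f"tex{tex}", 100)
    print(budget, cache.evictions)
```

With the 3072 MB budget the whole working set stays resident after the first frame and evictions stop; the 2048 MB budget misses on every single request, which is the pathological pattern a 2GB card can hit with an ultra-texture working set.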


----------



## TopicClocker

Quote:


> Originally Posted by *MonarchX*
> 
> That era where you could buy a decent gaming PC for 1K was over the day new console generation went for sale. If a console power increases 2X, then PC power needs to increase 3X or even 4X to keep up. Either get a console or a premium-priced PC if you want to play recent games with Ultra settings. Ultra settings are for Ultra PCs. High aettingsbare for high-end PCs and etc. I feel kind of bad for people who buy GTX 760 and expect to run Ultra or Very High settings. Its a card for Medium settings with some set to High. Don't bother with nVidia settings recommendation or at least drop all their recommendations by 1 notch.


A card for medium settings? You do realize that 90% of the time on medium you'll be pushing 60fps with that card? The cards made for medium settings are cards like the 7770-7850. If the 760 is a medium-settings card, then the 7950/280, one of last year's high-end AMD cards before the 290, is also for medium settings.

It's hardly any slower than a stock 670, which is one of the highest cards in Nvidia's mid-range spectrum.

Not everybody is shooting for Ultra at 60fps at 1080p and up, or spending $400-700+ on GPUs. The GPU market has cards for a wide variety of people; you don't need the latest, greatest, most expensive card to have a great PC gaming experience and raise settings above High. My old 6850 would do medium with some settings at high.

So much wrong in this post. You're basically saying you need a 780 to run High settings, and two of them for Ultra?


----------



## PCgamer5150

Quote:


> Originally Posted by *MonarchX*
> 
> This is obviously OK with you because someone who can afford your rig, can afford to quickly upgrade to the newest technology.
> 
> Now I do agree that 4K is going to need more than 4GB, but only for a few titles. As of right now, I think 3GB can handle almost all games @ 4K just as easily as 4GB with 1 or 2 exception games. You do know that 4K is equivalent of 1080p with 4s OGSSAA, right?
> 
> 1440p will also push those 3GB boundaries with consoles optimized games. Its undeniable that many games will continue to be poorly optimized for PCs and if you want to play them instead of complaining about them then you will have to suck it up and buy appropriate hardware for 1440p, 1600p, and surely 4K.
> 
> For 1080p, 3GB is just right. I am glad I stayed with 1080p because the best monitor only supports that much. Worst case scenario for 3GB cards is using SMAA instead of MSAA to conserve memory. Again, you also need a fast GPU that can process data fast enough on the fly to reduce the need for pre-caching, so 4GB GTX 770 and 3GB R9 280X are not going to cut it even for 2014 titles to run at 60fps, but more like 30-35fps average with dips into low 20s and rare spikes up to 45fps. I do expect my card to perform at about 40-60fps on poorly optimized games. AC4 is also poorly optimized, but it sure runs great on my humble rig.
> 
> Its also not unthinkable that a powerful $1000 SLI system will be needed GPU-wise to render 4K and even 1440p untunt much better hardware comes out. 3GB GTX 780 Ti with good OC and 4GB R9 290X with really good OC @ 1080p really is the minimum for decent performance in 2014 games. 3GB-4GB+ 2x SLI for the same cards is the minimuum for 1440p. 4GB-6GB+ 3x SLI of the same cards is the minimum for 4K.
> 
> That era where you could buy a decent gaming PC for 1K was over the day new console generation went for sale. If a console power increases 2X, then PC power needs to increase 3X or even 4X to keep up. Either get a console or a premium-priced PC if you want to play recent games with Ultra settings. Ultra settings are for Ultra PCs. High aettingsbare for high-end PCs and etc. I feel kind of bad for people who buy GTX 760 and expect to run Ultra or Very High settings. Its a card for Medium settings with some set to High. Don't bother with nVidia settings recommendation or at least drop all their recommendations by 1 notch.


MonarchX, so did I do the right thing by getting a GTX 780 FTW edition a few days ago, or should I have waited? I game at 1080p on a single monitor.
I really wanted something better than my 760 and finally got the $$$, so I ordered the 780 FTW.
Some are saying I should have got a 290X, but I have anxiety about AMD stuff from past experience.
Maybe I should have gone with a 6GB 780?


----------



## anthonyg45157

Quote:


> Originally Posted by *TopicClocker*
> 
> The user score isn't however.


Quote:


> Originally Posted by *PCgamer5150*
> 
> MonarchX, so did I do the right thing by getting a GTX 780 FTW edition a few days ago, or should I have waited? I game at 1080p on a single monitor.
> I really wanted something better than my 760 and finally got the $$$, so I ordered a 780 FTW.
> Some are saying I should have gotten a 290X, but I have anxiety about AMD stuff from past experience.
> Maybe I should have gone with a 6GB 780?


Is that EVGA? Don't they have the Step-Up program, so you'd be able to "step up" to a 6GB 780?


----------



## PCgamer5150

Quote:


> Originally Posted by *anthonyg45157*
> 
> Is that EVGA? Don't they have the Step-Up program, so you'd be able to "step up" to a 6GB 780?


Yes, it's EVGA, but all the 780 6GB cards are out of stock, and from what I understand they may not be back anytime soon.


----------



## anthonyg45157

Quote:


> Originally Posted by *PCgamer5150*
> 
> Yes, it's EVGA, but all the 780 6GB cards are out of stock, and from what I understand they may not be back anytime soon.




Hmm, I don't know their policy on the Step-Up program. If they're out of stock, does that mean you're out of luck? Or do they promise you one when they come back into stock?


----------



## ghostrider85

Quote:


> Originally Posted by *PCgamer5150*
> 
> Yes, it's EVGA, but all the 780 6GB cards are out of stock, and from what I understand they may not be back anytime soon.


Once you apply for step up, you will get your upgrade even if it goes past 90 days, as long as you apply before that.


----------



## PCgamer5150

Quote:


> Originally Posted by *ghostrider85*
> 
> Once you apply for step up, you will get your upgrade even if it goes past 90 days, as long as you apply before that.


Great, I didn't know that, thanks. Some are saying the future is cards with at least 4GB of VRAM, so this may be the way to go.
UPDATE: It was a great idea, but I am out of luck, at least for now:
http://www.evga.com/articles/00830/


----------



## TopicClocker

Quote:


> Originally Posted by *ghostrider85*
> 
> Once you apply for step up, you will get your upgrade even if it goes past 90 days, as long as you apply before that.


You can get an upgrade even if it goes past 90 days?
Hmm might have to go EVGA next time...


----------



## PCgamer5150

Quote:


> Originally Posted by *TopicClocker*
> 
> You can get an upgrade even if it goes past 90 days?
> Hmm might have to go EVGA next time...


I believe you can, as long as you're signed up for one.
Right now, sign-up is CLOSED!


----------



## TopicClocker

Quote:


> Originally Posted by *PCgamer5150*
> 
> I believe you can, as long as you're signed up for one.
> Right now, sign-up is CLOSED!


Hmm, definitely going EVGA next time. I could spend the extra cash on a higher-tier GPU instead of contemplating selling an existing one.

I'm sitting out until most of the 800 series releases with 4GB+. After only a few months with my current card, I wouldn't feel good getting rid of it after under a year when it's serving me well; I usually keep my cards for 1-2 years before I consider upgrading (I almost bought a 290 a few weeks ago). This is how people get into crazy upgrade cycles.

Right now is perhaps the worst time to buy a new card if you want one for the long term, as newer cards are rumored to drop this year or maybe early next year, but there's no word from Nvidia or AMD yet.


----------



## PCgamer5150

Quote:


> Originally Posted by *ghostrider85*
> 
> Once you apply for step up, you will get your upgrade even if it goes past 90 days,
> as long as you apply before that.


This is correct, I just confirmed it with EVGA.
I am going to do it!


----------



## DIYDeath

Quote:


> Originally Posted by *TopicClocker*
> 
> Hmm, definitely going EVGA next time. I could spend the extra cash on a higher-tier GPU instead of contemplating selling an existing one.
> 
> I'm sitting out until most of the 800 series releases with 4GB+. After only a few months with my current card, I wouldn't feel good getting rid of it after under a year when it's serving me well; I usually keep my cards for 1-2 years before I consider upgrading (I almost bought a 290 a few weeks ago). This is how people get into crazy upgrade cycles.
> 
> Right now is perhaps the worst time to buy a new card if you want one for the long term, as newer cards are rumored to drop this year or maybe early next year, but there's no word from Nvidia or AMD yet.


Exactly, now is NOT the time to buy a new GPU. The Maxwell offerings so far have been total crap; you'd be better off with a 6GB 780 Ti or a Titan Black, since at least they're the cream of that chipset and have enough VRAM to be future-proof for a year or so.

However, once the cream of Maxwell is released, it'll be worth it.


----------



## philharmonik

Really wish I had gone with EVGA now. I completely forgot about their Step-Up program!


----------



## revro

Quote:


> Originally Posted by *DIYDeath*
> 
> Exactly, now is NOT the time to buy a new GPU. The Maxwell offerings so far have been total crap; you'd be better off with a 6GB 780 Ti or a Titan Black, since at least they're the cream of that chipset and have enough VRAM to be future-proof for a year or so.
> 
> However, once the cream of Maxwell is released, it'll be worth it.


Hasn't EVGA's Jacob confirmed there won't be any 6GB 780 Ti cards?


----------



## Blackops_2

Have we hit the 3GB limit, or is all this due to the game being a horrid port?


----------



## DIYDeath

Quote:


> Originally Posted by *revro*
> 
> Hasn't EVGA's Jacob confirmed there won't be any 6GB 780 Ti cards?


Months ago the 780 Ti 6GB was available as an option for the Step-Up program. To my knowledge it was never released as a "retail" card.
http://www.evga.com/articles/00830/

Yep, it wasn't the Ti, just the 780.
I love EVGA, and I don't blame them for the lack of a 6GB 780 Ti, but I'm getting frustrated with Nvidia. They're clearly doing shady things, and I've just about had it. Low-VRAM cards seem very purposeful, a way to force consumers to upgrade, probably to an otherwise inferior Maxwell chip (not that Maxwell is inferior, but the Maxwells being released right now sure as heck are).

If the trend continues I will boycott and switch to AMD plus a custom water loop (I don't want to, I love my Titan Black, but it isn't fair to others and I'm willing to stand on that). I find the path Nvidia is taking morally devoid (hyperbole), and it's my unwavering stance that these types of companies need to be punished for their complacency and abuse of customers (hyperbole; I think it's a marketing scheme, I don't like it, and it's borderline dishonest).


----------



## PCgamer5150

Quote:


> Originally Posted by *Blackops_2*
> 
> Have we hit the 3GB limit, or is all this due to the game being a horrid port?


Most say it's due to being poorly optimized, but who knows?
I'm hearing The Witcher 3 will use more than 3GB of VRAM.


----------



## ghostrider85

I don't understand why this game requires more than 3gb of vram with those crappy textures.


----------



## ChronoBodi

Quote:


> Originally Posted by *ghostrider85*
> 
> I don't understand why this game requires more than 3gb of vram with those crappy textures.


Probably a hasty port job from PS4 to PC, I dunno.

Considering the PS4 gives games roughly 6GB of its 8GB to use, 4GB of it can be VRAM while the other 2GB serves as system memory.

Of course they have this flexibility because it's unified memory, while we have separate pools for VRAM and system RAM.

But really, for guys like me with 6GB of VRAM and 16GB of system RAM, it's a moot point. For most gaming PCs with 2GB of VRAM or less, though, this can be an issue.


----------



## inedenimadam

Quote:


> Originally Posted by *ghostrider85*
> 
> I don't understand why this game requires more than 3gb of vram with those crappy textures.


I thought the textures were pretty slick, specifically the roads. My favorite, though, is right at the start of the game when you get to your hideout with the projector set up and walk in front of it. I was thoroughly impressed. It could be because I have a projector, making a personal connection with the character, or because it was new and different from the traditional in-game monitors/TVs with low-res images. The projection was brilliant, very well replicated and scaled onto the character based on how far from the projector you stood while in front of the wall.

I am having issues with Eyefinity zooming in way too far, and huge frame rate drops when driving, so I am going to wait for drivers/updates before I progress any further in the game. I don't have much time for more than one playthrough of a game anymore (well... Skyrim).


----------



## anthonyg45157

Quote:


> Originally Posted by *inedenimadam*
> 
> I thought the textures were pretty slick, specifically the roads. My favorite, though, is right at the start of the game when you get to your hideout with the projector set up and walk in front of it. I was thoroughly impressed. It could be because I have a projector, making a personal connection with the character, or because it was new and different from the traditional in-game monitors/TVs with low-res images. The projection was brilliant, very well replicated and scaled onto the character based on how far from the projector you stood while in front of the wall.
> 
> I am having issues with Eyefinity zooming in way too far, and huge frame rate drops when driving, so I am going to wait for drivers/updates before I progress any further in the game. I don't have much time for more than one playthrough of a game anymore (well... Skyrim).


I loved the projector part as well! Even took a few screen shots at the specific part


----------



## MonarchX

Quote:


> Originally Posted by *PCgamer5150*
> 
> MonarchX, so did I do the right thing by getting a GTX 780 FTW edition a few days ago, or should I have waited? I game at 1080p on a single monitor.
> I really wanted something better than my 760 and finally got the $$$, so I ordered a 780 FTW.
> Some are saying I should have gotten a 290X, but I have anxiety about AMD stuff from past experience.
> Maybe I should have gone with a 6GB 780?


I don't know what games you are talking about - PC-optimized or console-optimized - and I don't know the resolution you play at, your FPS requirements, or your graphics settings requirements... If I did, I would simply have re-stated the post you quoted already. No offense, but I think I provided my opinion for just about every combination of wants and needs.


----------



## MonarchX

Witcher 3 is sure as hell going to use more than 3GB of VRAM. Witcher 2 brings a GTX 780 to its knees @ 1440p with some ubersampling, and it isn't all that fast without it. Running Witcher 3 at the same 1440p will cost you some $1000+ worth of video cards.

Either be willing to spend a TON of money on upgrades or get a console. These are the rules. All the in-between graphics cards are just there for companies to make a profit and to make people jealous enough to save up for the next generation's in-between card, which will suck just as much UNLESS it's the very best and most expensive. When a new fastest video card comes out, that becomes the MINIMUM for acceptable framerate, and you need SLI to get a good framerate @ those high resolutions. This is why it is best to stay @ 1080p.


----------



## StrongForce

Quote:


> Originally Posted by *MonarchX*
> 
> Witcher 3 is sure as hell going to use more than 3GB of VRAM. Witcher 2 brings a GTX 780 to its knees @ 1440p with some ubersampling, and it isn't all that fast without it. Running Witcher 3 at the same 1440p will cost you some $1000+ worth of video cards.
> 
> Either be willing to spend a TON of money on upgrades or get a console. These are the rules. All the in-between graphics cards are just there for companies to make a profit and to make people jealous enough to save up for the next generation's in-between card, which will suck just as much UNLESS it's the very best and most expensive. When a new fastest video card comes out, that becomes the MINIMUM for acceptable framerate, and you need SLI to get a good framerate @ those high resolutions. This is why it is best to stay @ 1080p.


Do you think 3GB will be enough at 1080p, though, like a 280X?


----------



## PCgamer5150

Quote:


> Originally Posted by *MonarchX*
> 
> I don't know what games you are talking about - PC-optimized or console-optimized - and I don't know the resolution you play at, your FPS requirements, or your graphics settings requirements... If I did, I would simply have re-stated the post you quoted already. No offense, but I think I provided my opinion for just about every combination of wants and needs.


Yeah, I forgot to update or remove that question.
I got my answers, thanks.


----------



## PCgamer5150

Quote:


> Originally Posted by *StrongForce*
> 
> Do you think 3GB will be enough at 1080p, though, like a 280X?


Not for max settings.


----------



## MonarchX

Quote:


> Originally Posted by *StrongForce*
> 
> Do you think 3GB will be enough at 1080p, though, like a 280X?


3GB for 1080p is pushing it to the extreme with 4x MSAA, but with SMAA it should be JUST right. That is only the VRAM side, though; you also need a much faster GPU than a 280X!

Also, these are just my opinions - *I am not an all-knowing God who can predict everything 100%*. I just connect A to B to C and describe the picture that I see.
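As a rough illustration of why MSAA strains VRAM so much more than a post-process technique like SMAA, here is a back-of-envelope sketch (my own numbers, assuming a single RGBA8 color target plus a 32-bit depth buffer; real engines keep several more render targets, so actual usage is higher):

```python
def render_target_mib(width: int, height: int, samples: int) -> float:
    """Rough color+depth footprint in MiB: 4 B RGBA8 color + 4 B depth per sample."""
    bytes_per_sample = 4 + 4
    return width * height * samples * bytes_per_sample / 1024**2

# 1080p with 4x MSAA stores 4 samples per pixel; SMAA is a post-process
# filter, so it works on the single-sample image.
print(render_target_mib(1920, 1080, 4))  # ~63 MiB
print(render_target_mib(1920, 1080, 1))  # ~16 MiB
```

The gap only widens once the engine carries multiple G-buffer targets at the same sample count, which is why dropping MSAA is the usual first step to reclaim VRAM.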


----------



## Flames21891

Quote:


> Originally Posted by *Blackops_2*
> 
> Have we hit the 3GB limit, or is all this due to the game being a horrid port?


It's bad optimization and exaggeration. At 1080p I can run this game on a single 2GB 680 with everything cranked up and not hit my 2GB limit. All the issues this game has are purely the result of problems with the game engine, not of the game actually pushing the hardware to its limits.

I will agree that my 2GB 680s may become obsolete quite a bit faster than I would have liked, as the new consoles have an abundance of memory to play with. However, I hope the visuals we get out of it match the increase in requirements. Otherwise this generation's gonna really suck.


----------



## Blackops_2

Glad to hear that. I rarely reached the 3GB limit on my 7970, and I just switched to 780 Classifieds, though Nvidia could have gone with more RAM. That said, there were several situations, even at 1440p, where the limited memory and smaller bus width of the Kepler cards really didn't penalize them compared to Tahiti, which I always thought was interesting. The same could be said, to some extent, for the 290X/290 vs. the 780 Ti/780: in certain situations where you would think the larger bus width and extra memory of Hawaii would be a huge benefit, it really isn't the case until you get to 4K.

Sent from my iPhone using Tapatalk


----------



## bryjoered

The guy who says you need a 780 Ti at minimum, or SLI 780s, to run 1440p is just flat-out wrong. Maybe if you want a constant 60fps at max settings. The regular 780 is more than sufficient to run new high-end games at mostly Ultra, some High, with conservative levels of anti-aliasing, and at 1440p you don't need as much anti-aliasing anyway. Regardless, resolution is king: even if I had a 4K monitor and just a mid-range card, I would use that resolution no matter how many in-game settings I had to sacrifice. No other setting matters when compared to resolution. You don't need every setting cranked to max to get a good gaming experience; this is something enthusiasts don't understand.

The main problem I've been experiencing with my mid-range 670 is its 2GB limit, not its performance in any way. It is more than capable of pushing high-end shadow, particle, and tessellation effects in modern games. We now have four or five major releases that exceed the 2GB VRAM limit and force you to lower AA or texture quality: Titanfall, COD: Ghosts, Wolfenstein: The New Order, BF4, and now Watch Dogs. This is not an issue of performance; most cards simply don't have the memory required for these high-resolution textures.

The other thing I would like to mention is that gaming at 1080p is just fine; it is something that is not even achievable on the new consoles most of the time. It still looks great - not as good as 1440p, but still great. Take, for example, the GTX 770 4GB. That card will easily get close to Ultra settings in games like The Witcher 3 at 1080p. Hell, I'll probably be able to run The Witcher 3 on High settings with my 670. Believe it.


----------



## MonarchX

Quote:


> Originally Posted by *bryjoered*
> 
> The guy who says you need a 780 Ti at minimum, or SLI 780s, to run 1440p is just flat-out wrong. Maybe if you want a constant 60fps at max settings. The regular 780 is more than sufficient to run new high-end games at mostly Ultra, some High, with conservative levels of anti-aliasing, and at 1440p you don't need as much anti-aliasing anyway. Regardless, resolution is king: even if I had a 4K monitor and just a mid-range card, I would use that resolution no matter how many in-game settings I had to sacrifice. No other setting matters when compared to resolution. You don't need every setting cranked to max to get a good gaming experience; this is something enthusiasts don't understand.
> 
> The main problem I've been experiencing with my mid-range 670 is its 2GB limit, not its performance in any way. It is more than capable of pushing high-end shadow, particle, and tessellation effects in modern games. We now have four or five major releases that exceed the 2GB VRAM limit and force you to lower AA or texture quality: Titanfall, COD: Ghosts, Wolfenstein: The New Order, BF4, and now Watch Dogs. This is not an issue of performance; most cards simply don't have the memory required for these high-resolution textures.
> 
> The other thing I would like to mention is that gaming at 1080p is just fine; it is something that is not even achievable on the new consoles most of the time. It still looks great - not as good as 1440p, but still great. Take, for example, the GTX 770 4GB. That card will easily get close to Ultra settings in games like The Witcher 3 at 1080p. Hell, I'll probably be able to run The Witcher 3 on High settings with my 670. Believe it.


I was talking about upcoming games, not today's games. I also did say that I was talking about consistent 60fps @ 1440p, which does need something better than GTX 780...

As far as Witcher 3 goes - we don't know for sure, but the rumor goes that GTX 780 Ti can run it at constant 60fps with 4x MSAA, but not 8x MSAA.


----------



## Alex132

Quote:


> Originally Posted by *Blackops_2*
> 
> Glad to hear that. I rarely reached the 3GB limit on my 7970, and I just switched to 780 Classifieds, though Nvidia could have gone with more RAM. That said, there were several situations, even at 1440p, where the limited memory and smaller bus width of the Kepler cards really didn't penalize them compared to Tahiti, which I always thought was interesting. The same could be said, to some extent, for the 290X/290 vs. the 780 Ti/780: in certain situations where you would think the larger bus width and extra memory of Hawaii would be a huge benefit, it really isn't the case until you get to 4K.
> 
> Sent from my iPhone using Tapatalk


The 780 Ti has higher memory bandwidth than the 290X, by the way; it's not always just about the bus width.

http://www.techpowerup.com/gpudb/2512/geforce-gtx-780-ti.html
http://www.techpowerup.com/gpudb/2460/radeon-r9-290x.html
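Those two spec pages boil down to a one-line calculation: peak bandwidth is the bus width in bytes times the effective memory data rate. A quick sketch of the numbers behind the comparison:

```python
def bandwidth_gbs(bus_width_bits: int, effective_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_width_bits / 8 * effective_gtps

# GTX 780 Ti: 384-bit bus with 7.0 GT/s effective GDDR5 -> 336 GB/s
# R9 290X:    512-bit bus with 5.0 GT/s effective GDDR5 -> 320 GB/s
print(bandwidth_gbs(384, 7.0))  # 336.0
print(bandwidth_gbs(512, 5.0))  # 320.0
```

So the 290X's wider bus is more than offset, at stock, by its slower memory clock, which is exactly the point being made above.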


----------



## Silent Scone

lol, at stock maybe.

Although it's a bit of a paper tiger, agreed. All very well having a 512-bit wide memory bus, then putting slow-ass memory on it.

I suppose one way of looking at it would be imagining a freeway (or motorway, depending on which side of the pond you're on) and putting a Vespa on it.


----------



## bryjoered

Yeah, I've never been a guy who needs a constant 60 frames per second. I was playing Watch Dogs on my friend's PS4, which is locked at 30fps, and it appeared perfectly smooth to me. I get like a 45fps average in Watch Dogs on my system, and the performance feels way worse because it sometimes stutters so badly. I think consistency is more important than the actual framerate. This is not to discount the fact that 60fps is ideal; I just don't think it's the de facto standard unless you're playing a twitch multiplayer shooter.


----------



## PCgamer5150

Quote:


> Originally Posted by *bryjoered*
> 
> Yeah, I've never been a guy who needs a constant 60 frames per second. I was playing Watch Dogs on my friend's PS4, which is locked at 30fps, and it appeared perfectly smooth to me. I get like a 45fps average in Watch Dogs on my system, and the performance feels way worse because it sometimes stutters so badly. I think consistency is more important than the actual framerate. This is not to discount the fact that 60fps is ideal; I just don't think it's the de facto standard unless you're playing a twitch multiplayer shooter.


I am the same; give me a constant 30 in a game that IS properly optimized over 60fps in one that is not.
I don't play multiplayer, so 30 is fine by me.


----------



## Jedson3614

Interestingly enough, I updated to the most recent Nvidia drivers and now I am getting the stuttering issues, but I never did with the previous driver.


----------



## PCgamer5150

Quote:


> Originally Posted by *Jedson3614*
> 
> Interestingly enough, I updated to the most recent Nvidia drivers and now I am getting the stuttering issues, but I never did with the previous driver.


I would roll back, then.


----------



## cstkl1

My issue with this game: even if they optimize it, it will still have object pop-in. It's a defect in the game engine's design.


----------



## Murlocke

Quote:


> Originally Posted by *cstkl1*
> 
> My issue with this game: even if they optimize it, it will still have object pop-in. It's a defect in the game engine's design.


Strange... I don't have any object pop-in whatsoever. Everything is rendered *really* far away, and I've always assumed that's why the VRAM requirements are so high. Are you sure your VRAM is keeping up?


----------



## Silent Scone

Still no patch, then; hopefully we get something this week.


----------



## cstkl1

Quote:


> Originally Posted by *Murlocke*
> 
> Strange... I don't have any object pop-in whatsoever. Everything is rendered *really* far away, and I've always assumed that's why the VRAM requirements are so high. Are you sure your VRAM is keeping up?


Yeah.
You get it when driving fast.

The background is rendered; it's just things like cars popping up at the end of the street. Heck, I even had one guy, and some trash bins, appear out of nowhere. When you're really stepping on the pedal you will seriously notice it; look at the ends of the streets. I sometimes skip the GPS and use the total-view camera mode, so I can see it.

Try setting everything to low and drive, dude; you will see what I mean. It has nothing to do with the texture settings; it's an issue with how the engine handles objects. The background has all been rendered; it's just cars and such popping in.

It's nothing to do with VRAM. These are objects at a distance, and they pop up. On a bike, planning my route on the pedestrian walkway, things came up out of nowhere when I looked later.


----------



## Garzhad

Hmm, I haven't gotten this yet; I'll probably snag it in the first sale that comes by, but I'm willing to bet you can get it down to 1.5GB max on the VRAM after tweaking your settings. Far Cry 3 didn't even take that much with the DX9 high-res texture pack on Ultra.

Outside of sheer texture size, the other half of VRAM consumption tends to come from things like motion blur, depth of field, and other effects that are just annoying, plus excessive levels of MSAA. You don't really need more than 4x (the jump from 2x to 4x isn't very noticeable, and 4x to 8x makes virtually no difference), and you don't even need that if the game is compatible with SMAA (probably some ENB bias regarding that AA algorithm here, having used it for, like, ever).
Quote:


> Originally Posted by *cstkl1*
> 
> The background is rendered; it's just things like cars popping up at the end of the street. Heck, I even had one guy, and some trash bins, appear out of nowhere. When you're really stepping on the pedal you will seriously notice it; look at the ends of the streets. I sometimes skip the GPS and use the total-view camera mode, so I can see it.
> 
> It's nothing to do with VRAM. These are objects at a distance, and they pop up. On a bike, planning my route on the pedestrian walkway, things came up out of nowhere when I looked later.


Yeah, I had this in Far Cry 3. There, it was a problem with the LoD settings being too low. You wouldn't notice it much running around on foot, but driving the jeep, trees, vegetation, and cars would randomly appear/disappear, etc. Going into the FC3 config file and raising the LoD settings eliminated it. Maybe it will here too?


----------



## Silent Scone

Yep. Motion blur is a VRAM hog in this game.


----------



## KenjiS

Quote:


> Originally Posted by *Garzhad*
> 
> Hmm. Haven't gotten this yet, probably will snag it the first sale that comes by, but, i'm willing to bet you can get it down to 1.5GB max on the VRAM after tweaking your settings. Far Cry 3 didn't even take that much with the DX9 high-res texture pack on Ultra.
> 
> Outside of sheer tex size, the other half of VRAM consumption tends to come from crap like Motion Blur, Depth of Field and other things that are just annoying, and excessive levels of MSAA, since you don't really need more then 4x(the jump from 2-4 isn't very noticeable, 4-8, has virtually no difference) and you don't even really need that if it's compatible with SMAA(prolly some ENB bias regarding that AA algorithm here, having used it for, like, ever)


Which shows you've read no part of the thread, sorry.

People here are complaining that they're turning down the textures, turning off AA, turning off motion blur and DoF effects, and the game is *still* consuming an unacceptable amount of VRAM and stuttering on computers it *absolutely* should not stutter on.

The big problem is that changing the settings seems to make zero difference in actually making the game play any better; it just looks worse.


----------



## cstkl1

Double Post


----------



## Silent Scone

Quote:


> Originally Posted by *KenjiS*
> 
> Which shows you've read no part of the thread, sorry.
> 
> People here are complaining that they're turning down the textures, turning off AA, turning off motion blur and DoF effects, and the game is *still* consuming an unacceptable amount of VRAM and stuttering on computers it *absolutely* should not stutter on.
> 
> The big problem is that changing the settings seems to make zero difference in actually making the game play any better; it just looks worse.


Not how I've found it. Reducing texture quality to Medium alleviates almost all of the stutter - mainly, I might add, because the 'Medium' setting looks very, very flat! Not what I've come to expect from textures in other games. I've tried various methods to reduce stuttering whilst maintaining Ultra texture quality on my 780 Ti. First off and most important, SLI scaling is very broken; stuttering is an issue in the neighbouring engine used in AC4 as well. Enabling SLI will add an extra layer of micro-stuttering to your already poor experience.

With 3GB of VRAM on the Ti at 1440p, I've found that using temporal SMAA is possible at Ultra settings; however, there is still some stutter. The frame buffer tends to sit around 2.8GB, so it is not completely filled (there may be some swapping going on regardless).

If you are blessed with more than one NV GPU, setting one to dedicated PhysX whilst still having SLI disabled also alleviates some of the stutter for me. This is comparable to the poor PhysX performance in AC4.

Lastly a few people have suggested CSAA 32X removes stuttering entirely. I have tried this, and it does have a small impact as far as I can tell when set to Enhance application settings, but the problem is still there.

There is a line somewhere, and it's difficult to see the forest for the trees when Watch Dogs is so poorly optimised on the PC. VRAM is a factor with up-and-coming games, but I do not agree with Sebastian using this as a feasible argument for the performance in Watch Dogs; it's more a convenient excuse, in all honesty.

That said, I think people need to wake up and smell the roses as far as the continual VRAM arguments are concerned. Where 3GB may have been abundant when developers were working cross-platform with an Xbox 360 that had only 512MB, we are now seeing textures and visual effects built for machines with up to 8GB, almost half of which can be used purely for graphics. So it's not unrealistic in any respect to expect that, once you either increase the pixel count or simply add multi-sampling, the game will want more VRAM than a large majority of GPUs currently have. This is something all GPU vendors will undoubtedly address, and in some cases already have, with their next line of cards.


----------



## blackRott9

I haven't been hit with the bad stutter some are complaining about in WD.

I have shadows reduced to High. Motion blur and DoF are off. AO is set to HBAO+ High, Water Ultra, Shader High, Level of Detail Ultra, Reflections High, Textures Ultra, and Temporal SMAA is enabled. In-game vsync is disabled. I'm running it @ 1920x1200.

I'm running the game off a Samsung SSD 830. Yes, I see FPS drops when the game streams things in, and my SSD may be helping save me from hitches. I see the occasional hitch, like with any game, but nothing regular or frequent.

I'm using an FX-6300 @ 4.66GHz and a 7970 @ 1230/1630. It will do higher clocks, but the fan noise gets on my nerves at those settings. I'm on Catalyst 14.6.


----------



## cstkl1

@Murlocke
Quote:


> Originally Posted by *Silent Scone*
> 
> Not how I've found it. Reducing texture quality to medium alleviates almost all of the stutter. Mainly, I might add because the 'medium' setting looks very, very flat! Not what I've come to expect looking at textures in other games. I've tried various different methods to help reduce stuttering whilst maintaining Ultra texture quality on my 780Ti. First off and most important, SLI scaling is very broken. This is an issue in the neighbouring engine used in AC4 with stuttering. Enabling SLi will add an extra layer of micro stuttering to your already poor experience.
> 
> With 3GB of VRAM on the Ti at 1440P, I've found that using temporal SMAA is possible at Ultra settings, however there is still some stutter. Frame buffer tends to sit around 2.8GB, so it is not completely filled (there may be some swapping going out regardless).
> 
> If you are blessed with having more than one NV GPU, setting one to dedicated PhysX, whilst still having SLi disabled also alleviates some of the stutter for me. This is also comparable to the poor PhysX performance in AC4.
> 
> Lastly a few people have suggested CSAA 32X removes stuttering entirely. I have tried this, and it does have a small impact as far as I can tell when set to Enhance application settings, but the problem is still there.
> 
> There is a line somewhere, and it's difficult to see the forest for the trees when Watch Dogs is so poorly optimised on the PC. But VRAM is a factor with upcoming games. I do not agree with Sebastian using this as a feasible argument for the performance in Watch Dogs, however; it's more a convenient excuse, in all honesty.
> 
> That said, I think people need to wake up and smell the roses as far as the continual VRAM arguments are concerned. Where 3GB may have been abundant when developers were working cross-platform with the Xbox 360, which had only 512MB to work with, we are now seeing textures and visual effects rendered on machines with up to 8GB, almost half of which can be used purely for graphics. So it's not unrealistic in any respect to expect that, once you either increase the number of pixels or simply add more multisampling, the game will want more VRAM than a large majority of GPUs have at the moment. This is something all GPU vendors will undoubtedly address, and in some cases already have, with their next line of cards.
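As a rough illustration of the quote's point that more pixels or more multisampling drive VRAM use up, here's a back-of-envelope sketch. The byte-per-pixel figures are assumptions for illustration only; a real engine keeps G-buffers, shadow maps, and streaming texture pools on top of this, which is where the multi-gigabyte totals come from.

```python
def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=4, depth_bytes=4):
    """Rough VRAM cost of the color + depth render targets, in MiB.

    Assumed figures: 4 bytes/pixel RGBA8 color and 4 bytes/pixel
    depth/stencil. MSAA multiplies per-pixel sample storage.
    """
    samples = width * height * msaa
    return samples * (bytes_per_pixel + depth_bytes) / (1024 ** 2)

# 1440p with 2x MSAA: raw render targets alone, before any textures.
print(f"{framebuffer_mb(2560, 1440, msaa=2):.0f} MiB")  # prints: 56 MiB
```

Doubling the MSAA factor or stepping up the resolution scales this linearly, which is why "just add multisampling" eats VRAM headroom so quickly.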


Try these Watch Dogs settings.
On [email protected], GTX 780 Ti 1202/7200MHz (just a normal OC), at 1440p.

These are the settings for uber-smooth play:
1. Display at 2560x1440, refresh rate 60Hz, Vsync 1, Textures High, MSAA 2x
2. Set all other graphics options maxed out
3. Nvidia Control Panel: AF 16x, Texture Filtering set to High Quality
4. GamerProfile.xml: AlphaToCoverage 1, RainRender 1, Deferred FX pc
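A minimal sketch of what step 4 might look like in GamerProfile.xml. The file path and the exact attribute names here are assumptions based on the post, not verified against the game; back the file up before editing:

```xml
<!-- Typically under Documents\My Games\Watch Dogs\<profile id>\GamerProfile.xml
     (path varies by install). Attribute names below follow the post and
     common community guides; treat them as assumptions. -->
<RenderProfile AlphaToCoverage="1"
               RainRender="1"
               DeferredFxQuality="pc" />
```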

The lowest FPS drop I saw was 48-50 FPS, and most of the time it stays close to 60 FPS. Zero stutter.

Max Mem 2950MB

For 780 Ti SLI:
If you OC your monitor to 75Hz, keep MSAA at 2x as well.
Solid 75 FPS. Uber smooth.

Max Mem 2953MB.

Test was on a Dell U2713HM, Matrix 780 Ti [email protected] [email protected]/7200MHz, 4770K @ 4.2GHz, Corsair Dominator GT 4x2GB 2200MHz CL7 1.65V, Crucial M500 RAID 0. One hour of gameplay.

CSAA 32x has zero impact on FPS in my testing and no increase in fidelity. I think those guys did override the AA setting.

GTX Titan Black SLI is also zero stutter, with less object pop-in at a distance, with textures at High instead of Ultra. So something ain't right with this Dunia engine.


----------



## Silent Scone

Thanks, however the game runs OK for me at the minute. Simply reducing to High textures removes any stuttering I'm still getting.


----------



## StrongForce

http://www.overclock.net/t/1496918/dso-watch-dogs-stutter-fix-2-0-mod-lets-you-enjoy-ultra-textures-on-gpus-with-2gb-vram/0_30#post_22442411

Here is a thread I saw that might interest some of you. Good luck.


----------



## Silent Scone

Still a stuttering mess for me even on Titans...

Sebastian, if you somehow read this: you're crap.


----------

