PDA

View Full Version : PC specific features and performance figures?



viewer111
05-13-2014, 11:36 PM
We have been only hearing about console performance figures (frame rates, resolution .. etc.).

What about the PC version?

- What graphic options are available?
- At what frame rate would the game run with the minimum and recommended hardware?
- What are the supported resolutions?
- Is there going to be free roam on the PC version?

Ubi seems to be very quiet about the PC version, yet the PC seems to be one of the most popular platforms!

GuZZ33
05-14-2014, 09:01 AM
If you go on GeForce.com and check the Watch Dogs game info, the game has ultra-high-definition textures according to Nvidia, so anyone with a 4K TV or PC monitor will experience true quad-1080p screen resolution with UHD textures. What Nvidia claims is that game developers use high-resolution textures in their games, and when the game detects 4K it will enable 4K textures; anything below 4K is automatically reduced. So I would presume that if you have a 1080p screen then you will only see 2K textures.


I hope the PC version has SLI support and can utilize multi-GPU configurations. I just hope the game engine is not restricted to 60 frames per second like Ghost Recon: Future Soldier; that is bad news for PC gamers who have invested money in very high-end PC hardware, multiple GPUs and multi-core CPUs. Looking forward to the first batch of real-world PC user reviews and what they have to say about the game's graphics and PC support.

viewer111
05-14-2014, 01:43 PM
What Nvidia says is PR talk. We have not yet seen any recent PC gameplay.

It is worrying that dedicated hardware on the PS4 (equivalent to a high-end PC) runs at 900p at 30 FPS, so how would it run on a PC with similar specifications (with a load of programs running in the background of the operating system)?

Did Ubi make any optimisations for the PC platform? Is there any evidence?

Any benchmark demos or trailers?

Why is Ubi so quiet about the PC version?

RealHawxy
05-14-2014, 01:50 PM
We have been only hearing about console performance figures (frame rates, resolution .. etc.).

What about the PC version?

- What graphic options are available?
- At what frame rate would the game run with the minimum and recommended hardware?
- What are the supported resolutions?
- Is there going to be free roam on the PC version?

Ubi seems to be very quiet about the PC version, yet the PC seems to be one of the most popular platforms!
1. Lots, like any other game. Wait till release.
2. Minimum would be 30FPS, recommended would be 60FPS+
3. Anything up to 4k
4. Yes, we get everything the consoles get of course, I don't know why you would ask that.


What Nvidia says is PR talk. We have not yet seen any recent PC gameplay.

It is worrying that dedicated hardware on the PS4 (equivalent to a high-end PC) runs at 900p at 30 FPS, so how would it run on a PC with similar specifications (with a load of programs running in the background of the operating system)?

Did Ubi make any optimisations for the PC platform? Is there any evidence?

Any benchmark demos or trailers?

Why is Ubi so quiet about the PC version?

1. What it runs on the PS4 is irrelevant as the hardware and software differences are too vast. They also cut down on the res/FPS to add more detail and expand on certain features in some areas.
2. Yes, Watch_Dogs was built for the PC and then ported to console, confirmed by developers a few times already. Next gen systems didn't exist at the time they started developing. The Nvidia partnership assists in the optimization.
3. https://www.youtube.com/watch?v=fWKOyqOJMmQ
4. Because they have a launch partnership with Sony that requires all trailer footage to be from the PS4.

utack
05-14-2014, 03:19 PM
What Nvidia says is PR talk. We have not yet seen any recent PC gameplay.
Nvidia will have many optimizations for the game. But since they can't do magic, I would expect similar image quality and maybe a 10% performance loss on an equivalent AMD card.


It is worrying that dedicated hardware on the PS4 (equivalent to a high-end PC) runs at 900p at 30 FPS, so how would it run on a PC with similar specifications (with a load of programs running in the background of the operating system)?

The only thing "in the background" you need to worry about is DirectX. My personal guess would be that the PS4 version had too little development time to use all the advantages of the new generation and will also use a similar higher-level API. If it doesn't, it still can't use the console's full potential. Ubisoft had the hardware at best one year ago.


Did UBi made any optimisation for the PC platform? Is there any evidence?

Any benchmark demos or trailers?

Why Ubi are quiet about the PC version?

PC gamers don't have this inherent fear that a game might perform poorly or look poor on their hardware. Especially not when Nvidia is doing all the marketing of how great it is. And besides: we got our specs way before any console tech details were released.

dragoy
05-14-2014, 03:25 PM
The PS4 uses its own API, so it doesn't take advantage of or suffer drawbacks from DX at all; now that's out of the way. When it comes to Nvidia, the thing to keep in mind is hardware, hardware, hardware. They do bring major advances to games, but you need the hardware to realise it, and that's not just the GPU but the monitor too.

The graphics options will probably be low, medium, high, ultra and custom. The question of frame rate depending on hardware is a question with infinite answers.

Supported resolutions are unknown at this time on PC, but we can assume 1080p, surround and up to 4K.

There is free roam for up to 8 people with drop-in/drop-out functionality; the information is covered in the stickies above.

GuZZ33
05-14-2014, 03:34 PM
Watch_Dogs was built for the PC and then ported to console, confirmed by developers a few times already.

Can you link me to the Ubisoft developers who stated this? Just interested, thanks.


SLI is supported; they have been adding the profiles into the latest drivers, beta included.

Yeah, but sometimes patches and even drivers can break a game. Splinter Cell: Blacklist has problems if you enable Nvidia TXAA with SLI. I know this because it's what happened with two GTX 780s; in fact I found that one GTX 780 was just as good as two GTX 780s, as the patches caused some issues. Recent Nvidia drivers have no support for Blacklist and yet they have support for Aliens: Colonial Marines. Seriously, that is just a troll joke from Nvidia, surely?



Nvidia can't do magic


PC gamers don't have this inherent fear that a game might perform poorly or look poor on their hardware. Especially not when Nvidia is doing all the marketing of how great it is. And besides: we got our specs way before any console tech details were released.

Hahaha, yeah, I like the funny stuff; Nvidia certainly can't.

GuZZ33
05-14-2014, 03:37 PM
1080p and up to 4K...

Lol, why no 4K textures on 4K screens? I mean, yeah, 2K textures on 2K (1080p) screens is great; is it 4K BS or what?

dragoy
05-14-2014, 03:37 PM
SLI performance can vary from game to game depending on whether it shunts one card into a PhysX card or not, but the software will always try to give you the best performance.

dragoy
05-14-2014, 03:39 PM
Lol, why no 4K textures on 4K screens? I mean, yeah, 2K textures on 2K (1080p) screens is great; is it 4K BS or what?

I'm slightly confused right now: 4K = 4000 pixels, 1080p = 1080 pixels, so where are you getting 2K textures on a 1080p panel?

GuZZ33
05-14-2014, 03:54 PM
SLI performance can vary from game to game depending on whether it shunts one card into a PhysX card or not, but the software will always try to give you the best performance.

SLI performance for a video game is down to the game developers and the GPU driver engineers/teams.

dragoy
05-14-2014, 03:55 PM
SLI performance for a video game is down to the game developers and the GPU driver engineers/teams.

That's what I said, and how they designate the cards to shunt workloads.

GuZZ33
05-14-2014, 03:56 PM
4K pixels

I'm not interested in pixel count; I'm interested in the quality of textures and other in-game assets.

GuZZ33
05-14-2014, 03:57 PM
That's what I said, and how they designate the cards to shunt workloads.

Yeah, and that's why I said patches and drivers can break a game or make it run with less performance, etc.

fishers1989
05-14-2014, 04:02 PM
I'm slightly confused right now: 4K = 4000 pixels, 1080p = 1080 pixels, so where are you getting 2K textures on a 1080p panel?

Texture resolution doesn't behave like display resolution. In a game you don't care what DPI a texture has, since it changes depending on the distance from the viewing plane (LOD optimisation means 4K textures are replaced with lower-res ones beyond a few feet) and the scale of the object, whereas displays are static, so DPI is very important. So you can have 4K textures on a 720p TV if you want to, and you'll be able to see the difference.
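
The LOD behaviour fishers1989 describes can be sketched in code. This is an illustrative mock-up, not anything from Watch Dogs or a real engine: `mip_chain` and `level_for_coverage` are hypothetical names showing how a mipmapped 4K texture gets sampled at progressively lower resolutions as its on-screen footprint shrinks.

```python
def mip_chain(base=4096):
    """Resolutions of a square texture's mip chain; each level halves the previous."""
    levels = []
    size = base
    while size >= 1:
        levels.append(size)
        size //= 2
    return levels

def level_for_coverage(base, pixels_on_screen):
    """Pick the mip level whose resolution roughly matches the on-screen footprint,
    so a distant object never samples the full base-resolution level."""
    level = 0
    size = base
    while size > max(1, pixels_on_screen):
        size //= 2
        level += 1
    return level, size

print(mip_chain(4096))               # [4096, 2048, 1024, ..., 2, 1]
print(level_for_coverage(4096, 200)) # (5, 128)
```

So a wall that covers a couple of hundred screen pixels is sampled from a 128- or 256-pixel mip level, which is why source-texture resolution and display resolution are largely independent of each other.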

dragoy
05-14-2014, 04:04 PM
Texture resolution doesn't behave like display resolution. In a game you don't care what dpi a texture has since it changes depending on the distance from the viewing plane (LOD optimisation means 4k textures are replaced with lower res ones beyond a few feet) and the scale of the object, whereas displays are static so dpi is very important. So you can have 4K textures on a 720p TV if you want to and you'll be able to see the difference.

You might see smoother textures but you will not get 4k textures, the pixel count is needed to create the textures themselves and the gpu grunt to go along with it as 4k is basically 1080p x4 stacked. If you could we wouldnt need to buy 4k monitors to get 4k textures.

Edit, and can i just add im not stating that we cannot get better fidelity in textures, colour depth and lighting on 1080p.

GuZZ33
05-14-2014, 04:10 PM
So how do you actually know when a game developer has used ultra-high-definition textures? What are textures? Is it the same as texture mapping? What is image resolution? Is it the same as pixel resolution? I have come across Skyrim texture mods and they have 2K and 4K textures, but what exactly is that? Crysis 3 on the PC maxed out has some seriously low-looking textures in certain areas; I wonder why that is?

GuZZ33
05-14-2014, 04:12 PM
You might see smoother textures, but you will not get 4K textures; the pixel count is needed to create the textures themselves, and the GPU grunt to go along with it, as 4K is basically 1080p x4. If you could, we wouldn't need to buy 4K monitors to get 4K textures.

So are you saying that 4K textures equal 8 million pixels of textures?

dragoy
05-14-2014, 04:13 PM
That's all to do with the engine and its limitations in specific games. Frostbite 3 and Unreal Engine 4 can take advantage of newer technology and coding; other engines cannot. The textures, colours etc. are all specific to the engines.

dragoy
05-14-2014, 04:14 PM
So are you saying that 4K textures equal 8 million pixels of textures?

I'm walking away now.

GuZZ33
05-14-2014, 04:15 PM
Battlefield 4 maxed out looks like poop on my 1080p monitor; that's why when I sit 4 to 5 feet away the picture looks much smoother and better.

GuZZ33
05-14-2014, 04:19 PM
I'm walking away now.

4K, or full quad high definition (3840x2160; 4,096x2,160 pixels for projectors), is 8 million pixels, and 1920x1080, or 1080p, is approx. 2 million pixels. So what is the actual resolution of a 4K texture?

utack
05-14-2014, 04:21 PM
So are you saying that 4K textures equal 8 million pixels of textures?

4K textures are usually 4096 pixels a side, or 16.8 million pixels.
So 67 megabytes in VRAM.
Common 4K GPUs will surely have 4 GB of VRAM, so why not? Given that you maybe need 10 in a worst-case scenario, on objects that are fairly close.

GuZZ33
05-14-2014, 04:28 PM
Good stuff, so nobody really knows then?, hahaha :confused::D:confused:

Mr_Shade
05-14-2014, 04:30 PM
I will see if we have anything to share about the PC version - other than what's already been announced.

fishers1989
05-14-2014, 04:34 PM
You might see smoother textures, but you will not get 4K textures; the pixel count is needed to create the textures themselves, and the GPU grunt to go along with it, as 4K is basically 1080p x4. If you could, we wouldn't need to buy 4K monitors to get 4K textures.

Edit: and can I just add, I'm not stating that we cannot get better fidelity in textures, colour depth and lighting at 1080p.

Right, but you can say the same thing about any texture that's at an angle: you won't see all of the pixels of a texture unless it's completely face-on, and then you get into the issues of texture filtering, which makes things more complicated.

dragoy
05-14-2014, 04:36 PM
I'm just going to leave this here.


http://youtu.be/XZo0FX6uBnQ

GuZZ33
05-14-2014, 04:48 PM
4K is stupid I think, even for PCs. Seriously, who is going to sit close up to a huge sodding monitor? Surely you will have other issues kicking in; have you read any of the TV/monitor manuals? They tell you to take short and frequent breaks from sitting close up to the screen, hahaha.

I read a chart about my monitor preference when sat close at my PC desk: my monitor is 24 inches (23 viewable diagonal), and that's what is comfortable viewing for me (I have tested larger TVs; they are even worse). According to one chart, for a 23-24 inch 4K monitor I would have to sit approx. 1 foot away to get the full benefit, which is in my honest opinion an unreal viewing distance. Many companies like THX always recommend larger screen sizes, but the problems arise even for PC gamers when the screen becomes too big to sit in front of at a comfortable distance to reach your keyboard and mouse, at which point I would be 2 feet away; even a 28 inch 4K monitor is a complete waste of my money and time. 4K was meant for the large cinema houses, so I would agree with others who state that 4K in the home is stupid.
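
The viewing-distance argument can be put in rough numbers. A sketch with a hypothetical helper (16:9 panels assumed): it computes pixels per degree of visual angle, and a common rule of thumb puts 20/20 acuity around 60 pixels per degree, beyond which extra resolution becomes hard to see.

```python
import math

def pixels_per_degree(diagonal_in, horiz_px, vert_px, distance_in):
    """Pixels subtended by one degree of visual angle at a given viewing distance."""
    aspect = horiz_px / vert_px
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # panel width from diagonal
    ppi = horiz_px / width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# 24-inch panel viewed from 2 feet (24 inches):
print(pixels_per_degree(24, 1920, 1080, 24))  # ~38 ppd at 1080p
print(pixels_per_degree(24, 3840, 2160, 24))  # ~77 ppd at 4K
```

By this estimate a 24-inch 1080p panel at desk distance sits well under the ~60 ppd acuity rule of thumb (individual pixels are resolvable), while the same panel at 4K sits just past it; that is roughly what the charts GuZZ33 mentions are encoding.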

utack
05-14-2014, 04:53 PM
4K is stupid I think, even for PCs; seriously, who is going to sit close up to a huge sodding monitor?
I will definitely get a 4K PC display next rather than a 144 Hz one. Even with nice AA, stuff like grass at some distance just doesn't look any good at the current pixel density.

http://i7.minus.com/i8B6iadsWWlFY.png

The depth is lost; I can't determine the distance of this grass by its looks. It only appears to be two flat layers with a little distance between them.

Mr_Shade
05-14-2014, 04:56 PM
I'm happy with 1080p - and AA ;)

BlastThyName
05-14-2014, 06:23 PM
I will see if we have anything to share about the PC version - other than whats already been announced.
Thanks, that would be awesome if you could give us a list of the PC graphical settings.
I'm not talking about presets; they are irrelevant.

I hope the CPU-heavy settings will be editable.

EverAmbiguous
05-15-2014, 04:34 AM
I too would love to know a little more about PC features--particularly if the PC will have better textures and such than the PS4 version naturally does (in other words, is the PS4 version running at "High" or "Ultra").

Mostly because it's been intimated that the only real difference between the PC and PS4 version will be higher resolution and smoother FPS (not that that is bad, but I wonder if the textures and other graphical effects are really at max on a PS4).

GuZZ33
05-19-2014, 02:41 PM
Is there multi-GPU support for Nvidia and AMD? I guess DirectX 11 technology is there, so I presume tessellation, lighting and shadows will be there too. Can't wait for the PC game testers to comment on the final product; almost there :cool:

ChrisSoSik
05-19-2014, 05:26 PM
I would suggest that everyone keep up to date with Watch Dogs' PCGamingWiki page (http://pcgamingwiki.com/wiki/Watchdogs) after the game is released for information on the title's PC functionality. Anything said by PR before release is just fair game these days, nothing more.

Madasahat69
05-19-2014, 06:10 PM
If you have a 1080p monitor and your PC rig can handle it, downsampling from 2K to 1080p makes a nice improvement!

utack
05-20-2014, 10:12 AM
If you have a 1080p monitor and your PC rig can handle it, downsampling from 2K to 1080p makes a nice improvement!

SSAA/FSAA works OK, but it needs a lot of GPU power.
I would bet Nvidia did not include TXAA for nothing; I expect similar results at lower cost.
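
The downsampling being discussed is conceptually simple: render at a higher resolution, then average blocks of rendered pixels down to the display resolution. A minimal single-channel sketch (hypothetical helper, 2x2 box filter):

```python
def downsample_2x(img):
    """Average each 2x2 block of a high-res render into one output pixel (box filter).

    `img` is a list of rows of single-channel intensity values, with even dimensions.
    """
    h, w = len(img), len(img[0])
    return [
        [(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A hard vertical 0/8 edge averages to an intermediate value (the anti-aliased pixel):
print(downsample_2x([[0, 8],
                     [0, 8]]))  # [[4.0]]
```

This is also why SSAA is so expensive: rendering at 2x in each axis means shading four times as many pixels per frame just to produce one smoothed output pixel.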

Iweryn
05-20-2014, 10:28 AM
I would like to know if there is a possibility to set the resolution to 1920x1200 without the black bars at the top and bottom of the screen.

YazX_
05-20-2014, 10:44 AM
IMO, hardware is not ready to give a smooth 4K experience for average-budget users; if you want to game at 4K, you need to spend $4k or more on PC hardware alone, let alone the 4K monitor that costs around $3k. Yeah, sure, prices are going down, but it's going to be a while until we see them in the $300-400 range.

Anyway, as I said before, the game only supports 64-bit OSes, meaning a claimed 40% performance boost over 32-bit, in addition to full RAM utilization (a 32-bit process can normally only address 2 GB of RAM) and 8-core CPU support. Why would Ubisoft go so far with those requirements if the game didn't actually utilize that hardware?!

P.S: Moving thread to Watchdogs PC Discussion.

BlastThyName
05-20-2014, 11:21 AM
IMO, hardware is not ready to give a smooth 4K experience for average-budget users; if you want to game at 4K, you need to spend $4k or more on PC hardware alone, let alone the 4K monitor that costs around $3k. Yeah, sure, prices are going down, but it's going to be a while until we see them in the $300-400 range.
http://www.pcper.com/reviews/Displays/Video-Perspective-Samsung-U28D590D-28-4K-Single-Stream-60-Hz-Monitor-Review
This is getting close. The hardware required for smooth 4K gaming, however, isn't affordable yet. Especially if you want 60 fps on top of that.

But that's why PC gaming is flexible, 4K gaming will become a reality on PC long before it is on consoles.


Anyway, as I said before, the game only supports 64-bit OSes, meaning a claimed 40% performance boost over 32-bit, in addition to full RAM utilization (a 32-bit process can normally only address 2 GB of RAM) and 8-core CPU support. Why would Ubisoft go so far with those requirements if the game didn't actually utilize that hardware?!
P.S: Moving thread to Watchdogs PC Discussion.
Ubisoft's requirements are a complete joke more often than not; you can't fault people for not taking them too seriously.
Remember AC4's recommended specs? That was funny.
As I've stated before, specs without performance targets are as meaningless as it gets.

What we're sure of is that the CPU will play a major role. I would advise against purchasing WD on PC if you have less than a 2nd-gen Core i5, but that's just me.

Needless to say, I expect the game to be unoptimized at launch; this is Ubisoft we are talking about, after all.
We all know they don't hold the PC platform in high regard.

YazX_
05-20-2014, 01:06 PM
http://www.pcper.com/reviews/Displays/Video-Perspective-Samsung-U28D590D-28-4K-Single-Stream-60-Hz-Monitor-Review
This is getting close. The hardware required for smooth 4K gaming, however, isn't affordable yet. Especially if you want 60 fps on top of that.

But that's why PC gaming is flexible, 4K gaming will become a reality on PC long before it is on consoles.


Ubisoft's requirements are a complete joke more often than not; you can't fault people for not taking them too seriously.
Remember AC4's recommended specs? That was funny.
As I've stated before, specs without performance targets are as meaningless as it gets.

What we're sure of is that the CPU will play a major role. I would advise against purchasing WD on PC if you have less than a 2nd-gen Core i5, but that's just me.

Needless to say, I expect the game to be unoptimized at launch; this is Ubisoft we are talking about, after all.
We all know they don't hold the PC platform in high regard.

Well, you have every right to think that, and I'm not blaming you or anyone, but here the situation is completely different: this game was developed on and for PC, as next-gen console specs were not available yet.

All I can say is that WD uses the Disrupt engine, which is new and created specifically for this game; that, in addition to what I mentioned earlier, should lead to a well-optimized PC game. Let's all hope it is :)

BlastThyName
05-20-2014, 01:59 PM
Well, you have every right to think that, and I'm not blaming you or anyone, but here the situation is completely different: this game was developed on and for PC, as next-gen console specs were not available yet.
That really is no guarantee of anything positive pertaining to the PC version. I don't know why everyone brings this up in the hope of clearing things up; it's naive at best.
A "lead" platform is fairly meaningless, and we have heard this before; do you remember Black Flag?
http://gamingbolt.com/assassins-creed-4-black-flag-pc-is-lead-platform-will-be-ported-to-consoles
Do I really have to remind you how many games leading on PC were terribly optimized? Crysis and Metro 2033, off the top of my head.
Never underestimate Ubisoft's sheer incompetence when it comes to PC.
They don't care, and it shows.


All I can say is that WD uses the Disrupt engine, which is new and created specifically for this game; that, in addition to what I mentioned earlier, should lead to a well-optimized PC game. Let's all hope it is :)
I hope it is as much as anyone else, but I'm not confident it will be a good PC version. The engine has nothing to do with this; resources are key.
Ubisoft could very well have rearchitected AnvilNext into something other than an atrocious CPU hog, but they didn't.

Again, I'm no soothsayer, but I'm rather pessimistic.

dragoy
05-20-2014, 02:24 PM
Give it a rest already, Blast. We already know how much of a downer you have on PCs because you can't get your machine to run to its expected specs; we have a week to go, and then you can see for yourself without all the "boo hoo, it's gonna be bad" sideshow.

BlastThyName
05-20-2014, 02:30 PM
Give it a rest already, Blast
No offense, dragoy, but I don't take orders from you. I shall express my opinions in a respectful manner whenever I please.
Put me on ignore if you don't want to read my posts.


We already know how much of a downer you have on PCs because you can't get your machine to run to its expected specs; we have a week to go, and then you can see for yourself without all the "boo hoo, it's gonna be bad" sideshow.
My PC runs exactly as it should. The problem always lies in the software in my case, always.
I know how to take care of my machine, and games such as BF4/Crysis 3 perform admirably well.
I also happen to have a good knowledge of how my PC is supposed to perform.
That helps.

Mr_Shade
05-20-2014, 03:05 PM
New game engine = unproven on your system = shouldn't prejudge ;)

I'm sorry, but you can't compare one game to another - or even the same engine in another game - as you may have seen with Titanfall vs HL2; it's not that simple.

BlastThyName
05-20-2014, 03:29 PM
New game engine = unproven on your system = shouldn't prejudge ;)
Which is fortunately not what I'm doing at all; I'm keeping my expectations low, and I have never tried to pass my "pessimism" off as anything but an assumption based on history.

I never claimed to know how Watch Dogs would perform; I'm merely remaining extremely cautious about Ubisoft's PC efforts, which have been lackluster, to say the least, in recent years.

You can't fault anyone for being immensely sceptical when Ubisoft and PC are in the same sentence. Any worry is 100% warranted; I wasn't born yesterday.
I've been gaming on PC for more than a decade in spite of my young age. Experience taught me not to trust PR and easily falsifiable claims; that's why I remain cautious.

viewer111
05-20-2014, 04:51 PM
Which is fortunately not what I'm doing at all; I'm keeping my expectations low, and I have never tried to pass my "pessimism" off as anything but an assumption based on history.

I never claimed to know how Watch Dogs would perform; I'm merely remaining extremely cautious about Ubisoft's PC efforts, which have been lackluster, to say the least, in recent years.

You can't fault anyone for being immensely sceptical when Ubisoft and PC are in the same sentence. Any worry is 100% warranted; I wasn't born yesterday.
I've been gaming on PC for more than a decade in spite of my young age. Experience taught me not to trust PR and easily falsifiable claims; that's why I remain cautious.

I like your attitude and agree with you.

This is a multi-platform release, and the PC usually takes the lowest priority and therefore the least optimisation (resource limitations), and this is not CD Projekt Red we are talking about here (with their continuous support for their PC titles).

dragoy
05-20-2014, 05:15 PM
No offense, dragoy, but I don't take orders from you. I shall express my opinions in a respectful manner whenever I please.
Put me on ignore if you don't want to read my posts.


My PC runs exactly as it should. The problem always lies in the software in my case, always.
I know how to take care of my machine, and games such as BF4/Crysis 3 perform admirably well.
I also happen to have a good knowledge of how my PC is supposed to perform.
That helps.

Your ability to know PCs is let down by the simple fact that you think your CPU is a bottleneck for your graphics card, which I have constantly proven to be wrong. And the worst thing is that people who don't know any better might think you're correct, simply because you bomb threads that have anything to do with PCs as if you know what you're on about, when the case of you thinking you have a CPU bottleneck shows you are wrong.


I like your attitude and agree with you.

This is a multi-platform release, and the PC usually takes the lowest priority and therefore the least optimisation (resource limitations), and this is not CD Projekt Red we are talking about here (with their continuous support for their PC titles).

And it has already been stated that Watch Dogs was produced for PC and ported to console, because when Watch Dogs was being made the next-gen consoles had not even been released for it to be coded for.

viewer111
05-20-2014, 05:45 PM
Your ability to know PCs is let down by the simple fact that you think your CPU is a bottleneck for your graphics card, which I have constantly proven to be wrong. And the worst thing is that people who don't know any better might think you're correct, simply because you bomb threads that have anything to do with PCs as if you know what you're on about, when the case of you thinking you have a CPU bottleneck shows you are wrong.


Blast is right; here is proof of the CPU bottleneck from the horse's mouth:

http://www.dsogaming.com/news/watch_dogs-creative-director-shares-more-information-about-pc-specs-requirements/

BlastThyName
05-20-2014, 05:47 PM
Unfortunately, that was a foregone conclusion with open world games.
I hope the requirements are justified.

sunny52489
05-20-2014, 08:27 PM
Sadly my pc sucks so I don't think I'm gonna get great fps either way.

ChrisSoSik
05-21-2014, 02:12 AM
New game engine = unproven on your system = shouldn't prejudge ;)


Shouldn't prejudge? With Uplay's current refund policy (http://shop.ubi.com/store/ubina/ContentTheme/pbPage.en_US-ReturnsAndCancellations/ThemeID.8605600), why not encourage it? Currently, we have to purchase the game to find out that it doesn't work as well on our system as it does on Dragoy's. After we purchase, no refund. Our only option is to pray for a patch. And this happens year after year. Prejudging is all we have.

EverAmbiguous
05-21-2014, 05:47 AM
Blast is right; here is proof of the CPU bottleneck from the horse's mouth:

http://www.dsogaming.com/news/watch_dogs-creative-director-shares-more-information-about-pc-specs-requirements/

Hmm. That's interesting.

Well, my 3570k should be alright. Maybe High settings? We'll see.

I find myself wondering if I should consider OCing...

BlastThyName
05-21-2014, 10:13 AM
Hmm. That's interesting.

Well, my 3570k should be alright. Maybe High settings? We'll see.

I find myself wondering if I should consider OCing...

If you have a good aftermarket cooler, overclocking is a smart move. It should help in Watch Dogs at least, even though I think the number of threads will be the limiting factor.

pr0digy1
05-21-2014, 10:43 AM
Does the PC version get the PlayStation-exclusive content, like many other Ubisoft games?
I always find it weird when Ubi does that and it is included in the PC release.

YazX_
05-21-2014, 10:55 AM
Does the PC version get the PlayStation-exclusive content, like many other Ubisoft games?
I always find it weird when Ubi does that and it is included in the PC release.

Sony exclusives are CONSOLE exclusives, and the PC is not a console; that's why Ubisoft includes such content with the PC version. Regarding the question: most probably it will, same as other games.



Hmm. That's interesting.

Well, my 3570k should be alright. Maybe High settings? We'll see.

I find myself wondering if I should consider OCing...

Be very careful: Ivy Bridge and later generate a lot of heat when overclocked; you must have an aftermarket cooler and should take it slowly with the voltage.

EverAmbiguous
05-21-2014, 12:14 PM
I'm not too inclined to overclock, but the whole "k" thing is designed for it.

I'll see what kind of quality I wind up with and decide from there I think.

dragoy
05-21-2014, 03:37 PM
Blast is right, here is a proof about the CPU bottleneck from the horse's mouth:

http://www.dsogaming.com/news/watch_dogs-creative-director-shares-more-information-about-pc-specs-requirements/


One small problem with that: Blast isn't using a dual core or a 3470. She - I'm assuming she, but possibly he - is running a 4770K.

BlastThyName
05-21-2014, 04:44 PM
One small problem with that: Blast isn't using a dual core or a 3470. She - I'm assuming she, but possibly he - is running a 4770K.
I'm a straight male. Not sure why you thought I was female. Anyway:

I tested the game on the following (at my store):
AC4 on a 770/stock 4770K = 40-60 fps
AC4 on a 770/stock 4960X = above 60 fps.

I should probably overclock my CPU then, but Haswells don't overclock that well... I mean, it's the luck of the draw.

EDIT :
I'm interested to see the graphical settings of the game. I hope there are enough and most importantly CPU related ones.

dragoy
05-21-2014, 04:47 PM
I think what you're confusing is not a bottleneck but what a $500 price difference between CPUs looks like.

BlastThyName
05-21-2014, 04:50 PM
I think what you're confusing is not a bottleneck but what a $500 price difference between CPUs looks like.
That's the textbook definition of a bottleneck: without changing anything but the CPU, I get better performance; the game seems to scale well with the number of cores.

As it stands, I'm not dismayed all that much by the performance I get. D3D11 is no Mantle when it comes to efficient multithreading and work submission.

dragoy
05-21-2014, 04:52 PM
Well, no; the term bottleneck is where your CPU cannot handle what the GPU is trying to do and thus limits the GPU's capabilities. So the 4770K is in no way bottlenecking a 770; a 4960X can simply do way more than its cheaper, smaller, younger sibling can.

BlastThyName
05-21-2014, 04:57 PM
Well, no; the term bottleneck is where your CPU cannot handle what the GPU is trying to do and thus limits the GPU's capabilities. So the 4770K is in no way bottlenecking a 770; a 4960X can simply do way more than its cheaper, smaller, younger sibling can.
Nope, I know what I'm talking about.
But I was surprised to see such a difference when I'd assumed the GPU would have significantly more to do.
My GPU usage regularly dropped to 60-70% in Havana, for instance; the CPU was hammered.

I stand by my word: if the GPU usage is not at 99%, then the CPU is holding it back.
It used to be worse before the 337.50 BETA driver.
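
The rule of thumb being used here ("GPU usage well below ~99% at an uncapped frame rate means the CPU is the limiter") can be stated as a tiny heuristic. Illustrative only: the function name and threshold are made up, and note that vsync or a frame cap also lowers GPU utilisation without any CPU bottleneck.

```python
def likely_bottleneck(gpu_util_samples, threshold=95.0):
    """Classify from GPU-utilisation samples (percent), assuming no frame cap/vsync."""
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return "CPU-bound" if avg < threshold else "GPU-bound"

print(likely_bottleneck([65, 70, 60]))  # CPU-bound, like the Havana numbers above
print(likely_bottleneck([99, 98, 99]))  # GPU-bound
```

In practice you would feed this samples from a monitoring tool (e.g. readings taken while playing) rather than hard-coded numbers.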

dragoy
05-21-2014, 04:59 PM
Wait, you're actually shocked that a hexacore, top-of-the-range flagship CPU that costs nearly $800 is able to push more out of a game than a $230 quad-core CPU?

BlastThyName
05-21-2014, 05:02 PM
Wait, you're actually shocked that a hexacore, top-of-the-range flagship CPU that costs nearly 800 is able to push more out of a game than a quad-core 230 CPU?
Kind of, yes, because I was under the assumption that AC4 used compute shaders extensively; alas, I was wrong.
It's not a next-gen game by any stretch of the imagination.

In GPU bound games I didn't notice such a performance difference between a 4770K/4960X. Crysis 3, Battlefield 4, Metro Last Light, Tomb Raider to name a few.
Less than 4%.

dragoy
05-21-2014, 05:05 PM
Even in GPU-bound games the CPU will be the brain of the PC; it will shunt work around, and the faster it can shunt work around, the faster everything gets thrown out onto the screen. Even in GPU-heavy games the CPU is an important factor doing a lot of the workload. And if you are running all your tests on a 770 you won't see much difference in the latest batch of games, as it is in fact the 770 that is limiting you and not the CPU.

BlastThyName
05-21-2014, 05:09 PM
Even in GPU-bound games the CPU will be the brain of the PC; it will shunt work around, and the faster it can shunt work around, the faster everything gets thrown out onto the screen. Even in GPU-heavy games the CPU is an important factor doing a lot of the workload. And if you are running all your tests on a 770 you won't see much difference in the latest batch of games, as it is in fact the 770 that is limiting you and not the CPU.
Which is why I was surprised to see my CPU bottlenecking my GPU.
But I suppose this is almost inevitable in a massive open world game.

I suspect Watch Dogs will be even worse in that regard but as I've said repeatedly, I will scarcely mind if the game is that impressive behind the scenes.
From a purely graphical perspective I'm not impressed.

dragoy
05-21-2014, 05:12 PM
No, you have it the wrong way around: your GPU is hitting its limit, not the CPU. If what you suggest were in fact true there would be no GPU upgrade option for your machine. This is simply not the case, as you could get a 780, a 780 Ti or a Titan Black in SLI, or the R9 equivalents in CrossFire, and see a vast improvement in your fps, graphics and PhysX capabilities. The difference you see between the 4770K and the 4960X is sheer brute-force difference: faster cache, more cores/threads for the engines to work with.

GuZZ33
05-21-2014, 05:27 PM
Assassin’s Creed IV: Black Flag is plagued by performance issues and runs identically on tri-core and quad-core CPUs. Not only that, but the performance difference between a dual-core and a quad-core is around 5fps at best.

Assassin’s Creed IV: Black Flag – PC Performance Analysis | DSOGaming | The Dark Side Of Gaming (http://www.dsogaming.com/pc-performance-analyses/assassins-creed-iv-black-flag-pc-performance-analysis/)

dragoy
05-21-2014, 05:33 PM
Do they really still make triple-core CPUs? I thought the last ones were Phenoms?

Mr_Shade
05-21-2014, 05:34 PM
Watch_Dogs uses a different engine, so hopes are high - it will push the hardware more ;)

It looked fine performance wise what I saw ;)

BlastThyName
05-21-2014, 05:57 PM
Do they really still make triple-core CPUs? I thought the last ones were Phenoms?
DSO Gaming simulates a three-core CPU by disabling one core on their Core 2 Quad.

With regards to your previous comment, GPU usage below 99% = CPU bottleneck.
No way around it. A faster GPU will push more frames as long as the CPU is fast enough to feed it.

dragoy
05-21-2014, 05:59 PM
DSO Gaming simulates a three-core CPU by disabling one core on their Core 2 Quad.

With regards to your previous comment, GPU usage below 99% = CPU bottleneck.
No way around it. A faster GPU will push more frames as long as the CPU is fast enough to feed it.

Well, on the triple core: disabling a core just makes me wonder why?

On the bottleneck: if what you say were correct, a faster GPU wouldn't produce more fps on your machine, because according to you your CPU is stopping the 770 from going above a set point, and thus any card above it would also be crippled by this factor. It isn't, so you aren't bottlenecked via the CPU.

JkBax
05-21-2014, 06:02 PM
It looked fine performance wise what I saw ;)

What did you see?

BlastThyName
05-21-2014, 06:04 PM
Well, on the triple core: disabling a core just makes me wonder why?
They wanted to see if more cores translate into better performance.
They do, but only to a meagre extent. Black Flag should never be brought up when talking about quality multithreading.


On the bottleneck: if what you say were correct, a faster GPU wouldn't produce more fps on your machine, because according to you your CPU is stopping the 770 from going above a set point, and thus any card above it would also be crippled by this factor. It isn't, so you aren't bottlenecked via the CPU.
A 780 Ti, for instance, would produce more frames, but the gap between it and, say, a 770 would be rather small because the CPU is holding everything back. I will try with one of the Titan Blacks we have in store (that nobody is interested in, apparently).

The game is also capped at something like 63-65fps; that's kind of a problem in my case.

dragoy
05-21-2014, 06:06 PM
For a 780 Ti to produce more frames your CPU cannot be holding it back. It's the 770 holding you back; they aren't that great, you know.

BlastThyName
05-21-2014, 07:46 PM
For a 780 Ti to produce more frames your CPU cannot be holding it back. It's the 770 holding you back; they aren't that great, you know.
I know that, it's a mid-range GPU. You're the very last one who could teach me a single thing pertaining to hardware.

CPU bottlenecks don't prevent a better GPU from performing better; they just drastically reduce the gap between GPUs.
There is a reason why even at 1080p the CPU is the limiting factor in Black Flag, I assume the same will hold true for Watch Dogs as it has already been confirmed to be a CPU bound game.

Hopefully next-gen consoles raise the bar for PC, it's sorely needed.
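To illustrate what I mean by "reduces the gap": here's a toy model (all the millisecond numbers are made up, not measurements) where the frame rate is set by whichever of the CPU or GPU takes longer per frame.

```python
def fps(cpu_ms, gpu_ms):
    # Each frame must be both prepared by the CPU and rendered by the
    # GPU; the slower of the two stages sets the frame rate.
    return 1000.0 / max(cpu_ms, gpu_ms)

# Say the CPU needs 15 ms per frame, and three GPUs of increasing
# speed need 20, 12 and 8 ms respectively.
print(fps(15, 20))  # 50.0   - GPU-bound, a GPU upgrade helps a lot
print(fps(15, 12))  # ~66.7  - now CPU-bound
print(fps(15, 8))   # ~66.7  - a faster GPU no longer adds anything
```

So a better card still never performs worse, but once the CPU stage dominates, the differences between cards shrink toward zero.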

dragoy
05-21-2014, 07:58 PM
I know that, it's a mid-range GPU. You're the very last one who could teach me a single thing pertaining to hardware.

CPU bottlenecks don't prevent a better GPU from performing better; they just drastically reduce the gap between GPUs.
There is a reason why even at 1080p the CPU is the limiting factor in Black Flag, I assume the same will hold true for Watch Dogs as it has already been confirmed to be a CPU bound game.

Hopefully next-gen consoles raise the bar for PC, it's sorely needed.

Well, you know what, you have a paradox: you apparently have a CPU that is bottlenecking a 770 but won't bottleneck a 780.

BlastThyName
05-21-2014, 08:45 PM
Unfortunately both cards are bottlenecked. Black Flag is significantly more dependent on the CPU than the GPU.
I gained 9fps with a GTX Titan. GPU usage was really low (it dropped to 55% in places).

Well, this seems in line with this benchmark, but I'm running the latest released drivers.
http://gamegpu.ru/action-/-fps-/-tps/assassin-s-creed-4-black-flag-test-gpu.html

Wh1t3_5t0rm
05-22-2014, 03:01 PM
I'm slightly confused right now: 4K = 4000 pixels, 1080p = 1080 pixels, so where are you getting 2K textures on a 1080p panel?

That's not how it works. 4K is 3840x2160 and 2K is 1920x1080.

Psycold000
05-23-2014, 08:48 PM
4K will be utilized way more when VR headsets start to hit the retail market.

JkBax
05-23-2014, 08:52 PM
4K will be utilized way more when VR headsets start to hit the retail market.

How so?

dragoy
05-23-2014, 09:23 PM
How so?

I think they are referring to the fact that Oculus is making 4K headsets.

Psycold000
05-23-2014, 09:43 PM
How so?
VR headsets require the screen to be closer to your eyes than anything else so 4k and 8k would be more beneficial in that area.

ShadyFusion23
05-24-2014, 03:45 AM
VR headsets require the screen to be closer to your eyes than anything else so 4k and 8k would be more beneficial in that area.

I'm enjoying your post drag and blast.

Ok I need to set the record straight.

YAZ: 32-bit is limited to around 3.25GB due to the address-space limitation of a 32-bit OS.

Here is how I understand it. Say I have an i7-3770K. I put a GTX 285 in and play a game; in that game I get 20fps. Now I bump that card to a 760 and get 80 in the same game. Then I bump that to a 780 Ti and may now get 100. As the card goes up, the fps will rise. However, as Blast says, the performance gap will start to decrease if I keep upgrading my card while keeping the same CPU, to the point where the gains are no longer significant because the CPU has become the bottleneck. Now, if I take that same 780 Ti and put it into the fastest Intel on the market, it isn't bottlenecked as it was before.

Don't get me wrong, you both have great points. I was amused by the banter back and forth.

Also, 2K resolution is not 1920x1080, it is 2048x1536, sometimes 2048x1556.

1080p: 1920x1080
2k: 2048x1536
4k: 4096 x 2160.

Just need to put my 2 cents in. lol.

Wh1t3_5t0rm
05-24-2014, 04:26 AM
1080p: 1920x1080
2k: 2048x1536
4k: 4096 x 2160.


You are correct about 2K, thanks for clearing that up, but 4K is 3840x2160.

GuZZ33
05-24-2014, 09:45 AM
Also, 2K resolution is not 1920x1080, it is 2048x1536, sometimes 2048x1556.
You're correct, but 2K is just a generic term, a bit like saying 4K.

Another way of looking at it: 2K is about 2 million pixels and 4K is about 8 million pixels on your screen. Other terms used are Full HD (1080p) and Ultra HD (2160p).

From wiki,

2K resolution is a generic term for display devices or content having horizontal resolution on the order of 2,000 pixels. In the movie projection industry, Digital Cinema Initiatives is the dominant standard for 2K output. In the digital film production chain, a resolution of 2048x1556 is often used for acquiring "open gate" or anamorphic input material, a resolution based on the historical resolution of scanned Super 35mm film. In television, the top-end 1080p high-definition television format qualifies as 2K resolution, having a horizontal resolution of 1920 pixels, with a vertical resolution of 1080 pixels.


2K (resolution) (http://en.wikipedia.org/wiki/2K_(resolution))
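If it helps, the megapixel counts behind those labels work out as below (a quick sketch; "2K film scan" and "DCI 4K" are the cinema figures, the others are the TV ones):

```python
# Pixel counts behind the common "2K"/"4K" labels.
resolutions = [
    ("Full HD / 1080p",  1920, 1080),
    ("2K film scan",     2048, 1556),
    ("Ultra HD / 2160p", 3840, 2160),
    ("DCI 4K",           4096, 2160),
]

for name, w, h in resolutions:
    print(f"{name:<18} {w}x{h} = {w * h / 1e6:.2f} megapixels")
```

That's where the "2 million vs 8 million pixels" shorthand comes from: Ultra HD has exactly four times the pixels of 1080p.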

Clannsman
05-25-2014, 12:19 PM
I would be more than a little surprised if your average quad core can't handle WD. Open-world style games don't tend to be very CPU-demanding; the most CPU-demanding games I've encountered are the Total War franchise, and Rome 2 especially.

Even with my modded Rome 2 playing 20K men on the battlefield @ 1920x1200 res, my 1st-gen i5-760 @ 4.25 creams it!

What most forget when referring to a CPU bottleneck is the resolution. Sure, if you're going to play at a low 1080p res your CPU could bottleneck extreme fps on a serious G-card, but as you up the res the G-card is ALWAYS the limiting factor, as long as you have any half-decent 4-year-old CPU.

Case in point: i5-760 vs i5-4570K, both running AMD 7870 CrossFire @ 2560x1440 res. The i5-4570K averages 5fps faster!

Tully__
05-25-2014, 12:28 PM
.... the most demanding CPU games i've encountered are...
Combat flight sims, especially those based on WW-I and WW-II where the action is up close and personal. Those suckers eat CPU capacity for breakfast, swallow multi-gigs of RAM as a follow-up, then queue up at the GPU stand for more, more, more... car race sims aren't far behind if they've got anything close to real-world physics.

Clannsman
05-25-2014, 12:54 PM
Combat flight sims, especially those based on WW-I and WW-II where the action is up close and personal. Those suckers eat CPU capacity for breakfast, swallow multi-gigs of RAM as a follow-up, then queue up at the GPU stand for more, more, more... car race sims aren't far behind if they've got anything close to real-world physics.

As a keen combat flight sim enthusiast (Rise of Flight / IL-2, etc.) I would agree. I'm also into space sims (X series, etc.), all of which are real, genuine CPU testers, and again I've never found my now very old i5-760 (albeit overclocked) wanting... the most fps gain, certainly at 1920x1200 res (which is now below the considered norm), is always via the G-card.

The only place where a bottleneck on any half-decent CPU shows up is when you test games at a low res like 1080p with something like an R9 290X graphics card... and that normally shows the difference between 100fps and 125fps! Hardly an issue unless you're playing at 120Hz refresh.

As previously mentioned, I'll be very surprised if WD is CPU-intensive.

BlastThyName
05-25-2014, 12:56 PM
My copy will arrive tomorrow but my friends have been luckier than me and already received their Vigilante editions.
The game runs well apparently with the 337.82 drivers.

Careful about VRAM usage though; be sure not to set your textures to Ultra if you don't have 3GB of VRAM.

Clannsman
05-25-2014, 01:04 PM
My copy will arrive tomorrow but my friends have been luckier than me and already received their Vigilante editions.
The game runs well apparently with the 337.82 drivers.

Careful about VRAM usage though; be sure not to set your textures to Ultra if you don't have 3GB of VRAM.


I don't even see how VRAM will make any difference either... I've seen a million-and-one tests with 2GB vs 3GB+ VRAM cards @ 2560x1440... it didn't make a jot of difference! Only when they test 4K res does the VRAM start to show.

Tully__
05-25-2014, 01:09 PM
I don't even see how VRAM will make any difference either... I've seen a million-and-one tests with 2GB vs 3GB+ VRAM cards @ 2560x1440... it didn't make a jot of difference! Only when they test 4K res does the VRAM start to show.
That will depend on texture detail though; if this game is using much larger or higher-resolution textures than games have previously done, demand for memory may be much larger.
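As a back-of-the-envelope check on why texture detail moves the VRAM needle, here's a rough sketch of my own (it assumes uncompressed RGBA, with a full mip chain adding roughly a third on top; real games use block compression, which cuts these numbers 4-8x):

```python
def texture_vram_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """Rough VRAM footprint in MB of one uncompressed RGBA texture.
    A full mip chain adds roughly one third on top of the base level."""
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 * 1024)

# One uncompressed 4096x4096 "4K" texture with mips: ~85 MB.
print(round(texture_vram_mb(4096, 4096), 1))
# The 2048x2048 version: ~21 MB, a quarter of the cost.
print(round(texture_vram_mb(2048, 2048), 1))
```

So even a handful of genuinely 4K textures in view at once eats a 2GB card's budget fast, which would square with Ubi recommending 3GB for Ultra.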

Clannsman
05-25-2014, 01:12 PM
That will depend on texture detail though; if this game is using much larger or higher-resolution textures than games have previously done, demand for memory may be much larger.


Sure, but I run an AMD 7870 (heavily overclocked) with 2GB VRAM. I play Skyrim modded with 4K textures throughout @ 1200p... the VRAM is always maxed out and would use more if I had more. The result: constant 60fps with V-sync.

Wh1t3_5t0rm
05-25-2014, 01:52 PM
I heard from some people that Ubi aren't kidding about 3GB of VRAM for Ultra textures. Everyone I know who has a 2GB card says that the game just crashes outright with Ultra textures.