TotalBiscuit Takes On The Watch Dogs PC Controversy

It takes a long time to develop the AAA titles that you guys want in order to push that $2k+ hardware. The market JUST IS NOT THERE.

Most people completely fail to realize this point. Top-end video cards are roughly 1% of the PC market. If you look at the Steam hardware survey, the most popular PC display device is Intel integrated graphics. The top discrete card is the GTX 660, and for AMD the 7850.

Gaming is a business. There simply isn't enough return on investment to spend extra development cycles on features only a small fraction of gamers can use.
 
Picking up someone else's accent is normal. Having it integrated into your own over time is equally normal; I myself have unconsciously changed my accent from living across Europe and the U.S.

I saw a video on his YouTube channel where he was a guest speaker at some event, and that voice came right out from the get-go. That said, if you watch his mannerisms there, he plays with his hands and definitely has some idiosyncrasies. Then again, his job is to spend hours upon hours playing games and commenting on every little bit of them. Some might call that a dream job, but it takes a "special" person to do it :D
 
Most people completely fail to realize this point. Top-end video cards are roughly 1% of the PC market. If you look at the Steam hardware survey, the most popular PC display device is Intel integrated graphics. The top discrete card is the GTX 660, and for AMD the 7850.

Gaming is a business. There simply isn't enough return on investment to spend extra development cycles on features only a small fraction of gamers can use.

At the same time, though, a bit of optimization would make any game run decently on these popular 7850/660 cards at settings that exceed what the new consoles are doing. People with higher-end cards could then crank up additional settings like higher AO, AA, and so on. The game wouldn't necessarily be designed around the high end, but the high end would still benefit (as would the mid-range).

Looking at Watch Dogs, I can't see any reason why it shouldn't run at a high frame rate with high+ settings on something like a 7850 or 660. Many people have said the same. There are games that look a lot better while running on lesser hardware.
 
Why would Ubisoft waste all that time in development just to scrap it? Wouldn't the company just delay the release of the PC version?
The game was already delayed for something like 6 months. Delaying it again would be even worse from Ubisoft's business perspective.

I saw a video on his YouTube channel where he was a guest speaker at some event, and that voice came right out from the get-go. That said, if you watch his mannerisms there, he plays with his hands and definitely has some idiosyncrasies. Then again, his job is to spend hours upon hours playing games and commenting on every little bit of them. Some might call that a dream job, but it takes a "special" person to do it :D
TotalBiscuit suffers from dyspraxia, which could explain some of his idiosyncrasies.
 
Most people completely fail to realize this point. Top-end video cards are roughly 1% of the PC market. If you look at the Steam hardware survey, the most popular PC display device is Intel integrated graphics. The top discrete card is the GTX 660, and for AMD the 7850.

Gaming is a business. There simply isn't enough return on investment to spend extra development cycles on features only a small fraction of gamers can use.

Not sure if you're speaking generally or in reference to Watch Dogs... In Watch Dogs the textures and features are there; they were just dumbed down. So they could easily have been included in the final release, even as an "Ultra" setting. The development work on that part was already done.

If the content this mod unlocks hadn't been left in the release files at all, Ubisoft would be in much better shape. It would still suck, since the game wouldn't have met E3 expectations, but PC users wouldn't feel as ripped off...
 
TotalBiscuit suffers from dyspraxia, which could explain some of his idiosyncrasies.
Ah, that explains things. I know he's recently been diagnosed with cancer too, which sucks.

Who is TotalBiscuit? Just some YouTube guy?
You waited until the 3rd page to ask this question? I mean the topic is "TotalBiscuit Takes On The Watch Dogs..."

But to answer your question: "just some YouTube guy" in a general sense, but he's very widely watched. He does a lot of "First Look" type videos of Steam games and other often poorly advertised titles, and he obviously doesn't pull any punches when making them. He may have started as "some guy," but he's got a company now and makes money from the number of hits. He's also done other things, like speaking out about how YouTube's automatic copyright-claim system can fuck with people, such as when he was slapped with a claim by a game company that did not like his review.
 
Not sure if you're speaking generally or in reference to Watch Dogs... In Watch Dogs the textures and features are there; they were just dumbed down. So they could easily have been included in the final release, even as an "Ultra" setting. The development work on that part was already done.

If the content this mod unlocks hadn't been left in the release files at all, Ubisoft would be in much better shape. It would still suck, since the game wouldn't have met E3 expectations, but PC users wouldn't feel as ripped off...

The mod isn't complete. It's buggy as hell and things glitch out, especially some of the lighting/shadowing. So specifically for Watch Dogs, I can see it being a simple calculation: the cost of QA and bug-fixing those features versus the return on investment. Would enough additional people buy the game with these features in to cover the cost of making them work properly? Apparently Ubisoft didn't feel that fully implementing this stuff would generate enough sales to warrant it.
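
To make that calculation concrete, here's a back-of-the-envelope sketch; every number in it is invented purely for illustration and is not anything Ubisoft has published:

```python
# Back-of-the-envelope version of the QA-vs-ROI calculation described above.
# All numbers are made up for illustration; none of them are Ubisoft's.

qa_cost = 500_000         # hypothetical cost to debug and QA the extra effects
margin_per_copy = 30.0    # hypothetical net revenue per additional PC copy sold

break_even_copies = qa_cost / margin_per_copy
print(f"Extra copies needed just to break even: {break_even_copies:,.0f}")
# If the publisher doesn't believe the features move that many extra copies,
# the "rational" business call is to cut them.
```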

At the same time, though, a bit of optimization would make any game run decently on these popular 7850/660 cards at settings that exceed what the new consoles are doing. People with higher-end cards could then crank up additional settings like higher AO, AA, and so on. The game wouldn't necessarily be designed around the high end, but the high end would still benefit (as would the mid-range).

Looking at Watch Dogs, I can't see any reason why it shouldn't run at a high frame rate with high+ settings on something like a 7850 or 660. Many people have said the same. There are games that look a lot better while running on lesser hardware.

See, that's the point. That "bit" of optimization costs money. Would that "bit" generate enough additional sales to warrant the investment on the PC platform?
 
See, that's the point. That "bit" of optimization costs money. Would that "bit" generate enough additional sales to warrant the investment on the PC platform?

Well, it really depends on the users, doesn't it? If nobody bought it because it was unoptimized garbage, then I think it would be well worth their time (in their eyes) to do it. It will always be worth it in the users' eyes. The problem is that people will buy it either way, so there's no real monetary incentive for them to do it.

Then the other part is that the incentive should intrinsically be to put out a good product that you can be proud of and that people will enjoy. However, the reality is we have a half-assed, money-grabbing corporation and allegedly cocky devs running the show there.

So ideally, yes, it would be worth it, and there would be more sales, or sales period. Real world... sadly no...
 
Wish I could edit to add to that. Fortunately, there are companies that DO put out tested, optimized products. Whether or not it costs them more or takes more time, they do it. I don't think that necessarily hurts their sales or overall profits much either, at least not in any major way.
 
Well, it really depends on the users, doesn't it? If nobody bought it because it was unoptimized garbage, then I think it would be well worth their time (in their eyes) to do it. It will always be worth it in the users' eyes. The problem is that people will buy it either way, so there's no real monetary incentive for them to do it.

The real problem is that people will pre-order this shit sight unseen, which allows these companies to make money off of broken, unoptimized crap.
 
I was thinking that maybe they did indeed tone down the PC graphics because of the new consoles. Not because they were paid off by Microsoft or Sony, but because they feared a backlash from console gamers if they saw that the PC version looked that much better while being told, at the same time, that these shiny new all-powerful consoles still can't keep up.

So it was neither malice nor stupidity, it was simply damage prevention.

And those next-gen gamers would have had the honor of paying $10 MORE than the better-looking PC version costs, too.
 
You waited until the 3rd page to ask this question? I mean the topic is "TotalBiscuit Takes On The Watch Dogs..."

But to answer your question: "just some YouTube guy" in a general sense, but he's very widely watched. He does a lot of "First Look" type videos of Steam games and other often poorly advertised titles, and he obviously doesn't pull any punches when making them. He may have started as "some guy," but he's got a company now and makes money from the number of hits. He's also done other things, like speaking out about how YouTube's automatic copyright-claim system can fuck with people, such as when he was slapped with a claim by a game company that did not like his review.

Well, I didn't know, but since there was a lot of focus on him in some posts rather than on the actual news, I had to ask. I just hadn't heard of him at all. I don't look into new games too often, though.
 
The real problem is that people will pre-order this shit sight unseen, which allows these companies to make money off of broken, unoptimized crap.

I'll admit I do this occasionally. However, after playing video/computer games since around 1980 or so, I think I've developed a sense for what will or will not turn out well, or what I will or won't enjoy. That's not to say I'm 100% accurate, or that my tastes line up with everyone else's, but I don't fail myself very often. I didn't buy Daikatana... :D
 
Well, I didn't know, but since there was a lot of focus on him in some posts rather than on the actual news, I had to ask. I just hadn't heard of him at all. I don't look into new games too often, though.

Fair enough. I've only been swinging by his YouTube channel for the last 6-12 months; not sure what led me there, probably Googled a game that was on Humble Bundle or something and ended up on his channel. But I got hooked. I don't agree with everything he says, but he gives really good first impressions of games compared to so many other people out there.
 
And those next-gen gamers would have had the honor of paying $10 MORE than the better-looking PC version costs, too.

Last time I checked, Watch_Dogs (and most other "AAA" titles) were $60 on PC as well.
 
At the same time, though, a bit of optimization would make any game run decently on these popular 7850/660 cards at settings that exceed what the new consoles are doing. People with higher-end cards could then crank up additional settings like higher AO, AA, and so on. The game wouldn't necessarily be designed around the high end, but the high end would still benefit (as would the mid-range).

Looking at Watch Dogs, I can't see any reason why it shouldn't run at a high frame rate with high+ settings on something like a 7850 or 660. Many people have said the same. There are games that look a lot better while running on lesser hardware.

The closest equivalent to the PS4 GPU is the 7850, and that's only running the game at 30fps at 1600x900. Assuming no other bottlenecks, you'd need twice that performance to double the frame rate, plus another 44% to take it up to full 1080p. That's 2.9x the performance of a PS4 GPU and well into 780 Ti territory. That obviously isn't the case, since an HD 7850 is more than capable of running Watch Dogs at 1080p/30.
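
For reference, the arithmetic behind that 2.9x figure works out like this (a quick sanity check under the same linear-scaling assumption the post makes):

```python
# Sanity check of the scaling math above: going from the PS4's 1600x900 @ 30fps
# target to 1920x1080 @ 60fps, assuming performance scales linearly with pixel
# count and frame rate (a big simplification).

ps4_pixels = 1600 * 900        # 1,440,000
full_hd_pixels = 1920 * 1080   # 2,073,600

resolution_factor = full_hd_pixels / ps4_pixels  # ~1.44x
framerate_factor = 60 / 30                       # 2x

print(f"GPU power needed vs. PS4: {resolution_factor * framerate_factor:.2f}x")  # ~2.88x
```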

There's definitely something else going on, which is also why the notion of a 'gimped' PC version is a load of nonsense. If the PC version is gimped, then so is the PS4 version.
 
The closest equivalent to the PS4 GPU is the 7850, and that's only running the game at 30fps at 1600x900. Assuming no other bottlenecks, you'd need twice that performance to double the frame rate, plus another 44% to take it up to full 1080p. That's 2.9x the performance of a PS4 GPU and well into 780 Ti territory. That obviously isn't the case, since an HD 7850 is more than capable of running Watch Dogs at 1080p/30.
I wouldn't consider the PS4 a 7850. Sure, on paper it may seem like a 7850, but it barely matches an R7 260X. A GeForce GTX 750 Ti is only $150 and would obliterate the PS4.

I thought by now people would realize that paper talk is pointless. If the PS4 can't do Watch Dogs at 1080p/30fps but a 7850 can, then guess what: the PS4 is not equivalent to a 7850. It's barely an R7 260X, which is a $110 graphics card.
There's definitely something else going on, which is also why the notion of a 'gimped' PC version is a load of nonsense. If the PC version is gimped, then so is the PS4 version.

Or the PS4 is just gimped. But that couldn't be possible, right? Because on paper it's a 7850. Except it's not.
 
The closest equivalent to the PS4 GPU is the 7850, and that's only running the game at 30fps at 1600x900. Assuming no other bottlenecks, you'd need twice that performance to double the frame rate, plus another 44% to take it up to full 1080p. That's 2.9x the performance of a PS4 GPU and well into 780 Ti territory. That obviously isn't the case, since an HD 7850 is more than capable of running Watch Dogs at 1080p/30.

There's definitely something else going on, which is also why the notion of a 'gimped' PC version is a load of nonsense. If the PC version is gimped, then so is the PS4 version.

Gimped, I'd wager, by poor coding on the devs' part. I'm not a game programmer, but I know a few and have been playing all my life. Old or new, there have been plenty of games that went above and beyond people's expectations at the time, squeezing things out of hardware that people didn't know were possible. Anyone remember the first time they played Unreal? (Or Quake II in OpenGL for the first time?)

Look at Bethesda. They seem generally regarded as moderately inept, but their Skyrim engine is pretty nice, supports a huge world, and unlike Oblivion it hasn't crashed on me once, even when modded up pretty heavily (though I do know they've had unstable engines in the past). My point is: if I can play Skyrim at the highest settings, hardware has iterated a couple of times since then, and Ubi can't get this game running half-decently on cutting-edge hardware, then something is wrong.
 
Also, Dead Rising 3 for PC has a frame rate cap of 30 fps for no apparent reason. The developers say it's OK to increase the fps to 60 if you want, but they warn of possible problems. It just so happens that the Xbox One and PS4 are also limited to 30 fps.

"This game was built from the ground up for the Xbox One and what a lot of people don't realize is that our visual framerate is actually tied to the game's simulation under the hood," Airhart said. "Over three years ago, when we started building the game, that's kind of how we built it, targeting 30 FPS.

"When we started the PC project at the very end of last year and beginning of this year, we knew that we weren't going to be able to guarantee running above 30 FPS."

Does anyone really believe this? It's so apparent that the consoles can't keep up with even a cheap PC that it seems developers purposely gimp their games on PC so as not to overshadow the consoles.
 
Oh hey, look: the same guys have a Tomb Raider PS4 vs. PC video as well. Very detailed and slowed down so you can see everything. Not sure what they did on the PS4, but it looks worse to me: the textures look darker and less detailed for some reason, with the color saturation turned up a bit.
 
Also, Dead Rising 3 for PC has a frame rate cap of 30 fps for no apparent reason. The developers say it's OK to increase the fps to 60 if you want, but they warn of possible problems. It just so happens that the Xbox One and PS4 are also limited to 30 fps.



Does anyone really believe this? It's so apparent that the consoles can't keep up with even a cheap PC that it seems developers purposely gimp their games on PC so as not to overshadow the consoles.

Tying physics and other aspects of the gameplay to a fixed frame rate is something we've seen in the past, on one of those god-awful Need for Speed ports for PC. It's the sign of a lazy and/or incompetent developer. Sadly, the developers of Dead Rising 3 may be telling the truth; if they are, that just reflects even worse on them.
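
For anyone wondering why tying the simulation to the frame rate is such a problem, here's a minimal sketch of the two approaches. This is generic illustrative code, not anything from Dead Rising 3:

```python
# Minimal sketch of frame-tied vs. fixed-timestep simulation.
# Generic illustration only; not actual Dead Rising 3 code.

GRAVITY = -9.8

def update_frame_tied(velocity):
    # Bakes in the assumption that every frame lasts 1/30 s. Unlock the cap to
    # 60fps and physics, timers, and animation all run at double speed unless
    # every affected system is found and fixed -- hence the devs' warning.
    return velocity + GRAVITY * (1 / 30)

def update_fixed_timestep(velocity, frame_time, accumulator, step=1 / 60):
    # Decoupled approach: render as fast as the hardware allows, but advance
    # the simulation in fixed steps. This is what lets a port raise or remove
    # the frame rate cap without breaking gameplay.
    accumulator += frame_time
    while accumulator >= step:
        velocity += GRAVITY * step
        accumulator -= step
    return velocity, accumulator
```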

Personally, I think they had all those features in the game engine to showcase it for E3 and the media in general. They disabled them because those advanced features wouldn't work on the consoles, and they most likely locked them out of the PC version because they didn't want to take the time to ensure they worked properly, or to troubleshoot the game's other issues in combination with those effects. As it is, everything I've read on the subject says the mod creates additional problems in the game; the flickering headlights mentioned in the video are a great example. That's the type of thing the developers probably didn't want to address.

I'm not sure it was malice, or some sort of "make the console version look just as good by gimping the PC version" conspiracy at work. I think it was lazy developers who didn't want to take the time to ensure those things worked properly, or to add menu options for them, etc. Lazy and rushed for time due to impending publishing deadlines is probably close to the truth. In this business, it usually is.
 
No, you can't guarantee anything on a PC, because everyone has different hardware. That being said, at a hardware level equivalent to an Xbox One or better, you absolutely should be able to hit that target. Not necessarily "guarantee" it per se, as that could have legal ramifications, but as a developer you should absolutely expect it.
 
I wouldn't consider the PS4 a 7850. Sure, on paper it may seem like a 7850, but it barely matches an R7 260X. A GeForce GTX 750 Ti is only $150 and would obliterate the PS4.

I thought by now people would realize that paper talk is pointless. If the PS4 can't do Watch Dogs at 1080p/30fps but a 7850 can, then guess what: the PS4 is not equivalent to a 7850. It's barely an R7 260X, which is a $110 graphics card.


Or the PS4 is just gimped. But that couldn't be possible, right? Because on paper it's a 7850. Except it's not.

I'm not sure that argument works when there are already so many known quantities in these GPUs. We already know the PS4 GPU is 1.84 TFLOPs to the 7850's 1.76 TFLOPs, for example, so we know it's not shader bound. It also has more texture units (72 vs. 64), so it's not bound in that area either. The 260X also has a 100GB/sec memory path versus the 176GB/sec path in the PS4, so even that doesn't make sense.

These are all same architecture GPUs so the same rules apply.
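
Laying the paper numbers from this thread side by side (all figures as quoted in the posts here, so treat them as approximate):

```python
# Paper-spec ratios as quoted in this thread (approximate figures).
ps4    = {"tflops": 1.84, "tex_units": 72, "bandwidth_gbs": 176.0}
hd7850 = {"tflops": 1.76, "tex_units": 64, "bandwidth_gbs": 153.6}
r7_260x_bandwidth_gbs = 100.0

print(f"Shader:    PS4 is {ps4['tflops'] / hd7850['tflops']:.2f}x an HD 7850")          # ~1.05x
print(f"Texturing: PS4 is {ps4['tex_units'] / hd7850['tex_units']:.2f}x an HD 7850")    # ~1.13x
print(f"Bandwidth: PS4 is {ps4['bandwidth_gbs'] / r7_260x_bandwidth_gbs:.2f}x a 260X")  # ~1.76x
```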
 
Gimped, I'd wager, by poor coding on the devs' part. I'm not a game programmer, but I know a few and have been playing all my life. Old or new, there have been plenty of games that went above and beyond people's expectations at the time, squeezing things out of hardware that people didn't know were possible. Anyone remember the first time they played Unreal? (Or Quake II in OpenGL for the first time?)

Look at Bethesda. They seem generally regarded as moderately inept, but their Skyrim engine is pretty nice, supports a huge world, and unlike Oblivion it hasn't crashed on me once, even when modded up pretty heavily (though I do know they've had unstable engines in the past). My point is: if I can play Skyrim at the highest settings, hardware has iterated a couple of times since then, and Ubi can't get this game running half-decently on cutting-edge hardware, then something is wrong.

Poor coding is definitely more plausible than anything else. Personally, I'd just put the game's issues down to a lack of experience with the 64-bit address space and with having DX11 hardware as a baseline.

Some people also report artifacting and that's simply a no-no from a QA perspective. If something doesn't pass QA then you typically cut it from the product.
 
I'm not sure that argument works when there are already so many known quantities in these GPUs. We already know the PS4 GPU is 1.84 TFLOPs to the 7850's 1.76 TFLOPs, for example, so we know it's not shader bound. It also has more texture units (72 vs. 64), so it's not bound in that area either. The 260X also has a 100GB/sec memory path versus the 176GB/sec path in the PS4, so even that doesn't make sense.

These are all same architecture GPUs so the same rules apply.

Makes sense, because you're only looking at the GPUs. That 176 GB/s the PS4 has is shared with the 8-core CPU. BTW, the 7850 has 153.6 GB/s all to itself; on the PS4, some of that bandwidth is going to be lost to the CPU, a problem the 7850 doesn't have. Plus there are going to be contention issues between the CPU and GPU sharing the same memory. The CPU also plays a role regardless of whether it's a console or a PC; not as much as on PC, but it affects graphics performance on consoles as well.

But again, it's paper versus actual evidence of performance. You're going by paper, and I have a video that shows games on PC vs. PS4 vs. Xbone, with the PC using a $110 graphics card.
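
Taking the thread's numbers at face value, the shared-memory point can be sketched like this; the CPU's actual share of PS4 bandwidth isn't public, so it's treated as a knob here:

```python
# Rough illustration of the shared-bandwidth argument above. The CPU's real
# share of the PS4's 176 GB/s is not public, so it is varied here as a knob.
ps4_total_gbs = 176.0
hd7850_gbs = 153.6  # dedicated to the GPU on a PC

for cpu_share_gbs in (10, 20, 30):
    gpu_effective = ps4_total_gbs - cpu_share_gbs
    print(f"CPU takes {cpu_share_gbs} GB/s -> GPU keeps ~{gpu_effective:.0f} GB/s "
          f"({gpu_effective / hd7850_gbs:.2f}x an HD 7850)")
```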
 
I'm not sure it was malice, or some sort of "make the console version look just as good by gimping the PC version" conspiracy at work. I think it was lazy developers who didn't want to take the time to ensure those things worked properly, or to add menu options for them, etc. Lazy and rushed for time due to impending publishing deadlines is probably close to the truth. In this business, it usually is.
I think you would have a point about it not being malice if they hadn't come out and said the graphics were not downgraded from E3, which is quite obviously a lie at this point. If they had come out and said they had too much trouble getting the same graphics to work properly outside of the E3 demo in time for release, THEN you could talk about no malice being involved.
 
Makes sense, because you're only looking at the GPUs. That 176 GB/s the PS4 has is shared with the 8-core CPU. BTW, the 7850 has 153.6 GB/s all to itself; on the PS4, some of that bandwidth is going to be lost to the CPU, a problem the 7850 doesn't have. Plus there are going to be contention issues between the CPU and GPU sharing the same memory. The CPU also plays a role regardless of whether it's a console or a PC; not as much as on PC, but it affects graphics performance on consoles as well.

But again, it's paper versus actual evidence of performance. You're going by paper, and I have a video that shows games on PC vs. PS4 vs. Xbone, with the PC using a $110 graphics card.

I think the PS4 CPU is limited to only 20-ish GB/sec of that bandwidth but, yeah, I don't really have any argument there. That still leaves a 50% bandwidth advantage over the 260X, and it is a point of contention.

On the other hand, the X1 runs the game at 1408x792, which also doesn't account for the 50% performance differential between it and the PS4. That resolution differential is only about 30%, and that assumes the game is shader bound, since the PS4 also has a twofold pixel-fill advantage over the Xbox One.
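
The pixel counts behind that ~30% figure, for reference:

```python
# Pixel-count arithmetic behind the ~30% resolution differential above.
ps4_pixels = 1600 * 900   # 1,440,000
x1_pixels = 1408 * 792    # 1,115,136

print(f"PS4 renders {ps4_pixels / x1_pixels:.2f}x the pixels of the Xbox One")  # ~1.29x
```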

Either way, I don't buy the 'intentionally gimped' argument. There are simply too many confounds that don't add up.
 
No, you can't guarantee anything on a PC, because everyone has different hardware. That being said, at a hardware level equivalent to an Xbox One or better, you absolutely should be able to hit that target. Not necessarily "guarantee" it per se, as that could have legal ramifications, but as a developer you should absolutely expect it.

That's why PC games have more graphics options, so you can turn features off if the game is too slow.

But in this case, they decided to hide the highest quality options for no apparent reason, and then blatantly lied about hiding them.
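
To illustrate what "hiding" options looks like in practice, here's a generic sketch of a preset table that stops short of what the engine actually supports. The setting names are made up; this is not Watch Dogs' real configuration:

```python
# Hypothetical sketch of how a preset/menu table can hide renderer features.
# Setting names are invented for illustration; not Watch Dogs' actual config.

ENGINE_MAX = {"depth_of_field": "e3", "headlight_shadows": True, "fog_volumes": "high"}

MENU_PRESETS = {
    "high":  {"depth_of_field": "retail", "headlight_shadows": False, "fog_volumes": "medium"},
    "ultra": {"depth_of_field": "retail", "headlight_shadows": False, "fog_volumes": "high"},
    # No menu preset ever maps to the "e3" values, even though the renderer
    # still ships with them -- which is exactly what the mod flips back on.
}

def apply_preset(name: str) -> dict:
    # Players can only ever reach what the menu exposes, never ENGINE_MAX.
    return MENU_PRESETS[name]
```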
 
DICE is going to pull the same crap with Star Wars: Battlefront. Oh, they showed some nice development footage, but the limitations of the new consoles are going to rear their head in the end product. The sad thing is I know A LOT of people waiting until a month before SW:B's release to upgrade to high-end equipment, and it more than likely won't be needed. Hell, I might get by just fine on my system now. I just don't trust these developers anymore.
 
How about this for a theory:

Watch Dogs was always pitched as the advance guard that was going to set the bar for the next-gen consoles. Realistically, anyone with a brain realized the 'next-gen' consoles simply weren't going to be that powerful or offer much of a visual bump over the existing consoles. Nor were they going to approach PC standards, as apples-to-apples comparisons revealed they were obviously way underpowered in that regard.

So what was Ubi to do after all the shuffling of feet between MS and Sony to lock down exactly what their hardware was going to ship with? Ubi could never offer the experience they promised, not necessarily through any fault of their own, but because the 'next-gen' consoles simply aren't next-gen. They had also made the usual pandering promises to the PC user base, doubtless without any real intention of fulfilling them... but now it was going to appear that all those early console adopters salivating for Watch Dogs had been betrayed.

What were they to do? To say that the promise of Watch Dogs as a launch title didn't move many consoles would be disingenuous, too. Ubi set themselves up as the poster child for MS and Sony, so of course they were now committed to offering, at minimum, a 1:1 experience with the PC version.
 
People saying it was "poor coding"...was it that hard for them to flip a couple of switches in a config file? I don't really buy that.
 
People saying it was "poor coding"...was it that hard for them to flip a couple of switches in a config file? I don't really buy that.

I say poor coding because, based on other open-world games that arguably look better, this game doesn't perform as well. Also, as noted, flipping those switches causes glitches/artifacts, and so far the picture I'm getting is that maybe they didn't want to fix them, or (to give them the benefit of the doubt) didn't have enough time to. So either they're lazy, management sucks, they're incompetent, or there really is a conspiracy to make the PC version look worse than the consoles.

I'm going with a mix of incompetence, bad management, and laziness, which combine to produce the same result as the gimping conspiracy, only less intentional and premeditated.
 
People saying it was "poor coding"...was it that hard for them to flip a couple of switches in a config file? I don't really buy that.

It's exactly because it isn't that hard to flip a couple of switches. That means that there's more to it than just 'the man is keeping us down'.

The sheer fact that there are rendering artifacts when using this mod simply means that it didn't pass QA. Acting like there's some anti-PC sentiment going on is just silly.
 
I think the PS4 CPU is limited to only 20-ish GB/sec of that bandwidth but, yeah, I don't really have any argument there. That still leaves a 50% bandwidth advantage over the 260X, and it is a point of contention.

On the other hand, the X1 runs the game at 1408x792, which also doesn't account for the 50% performance differential between it and the PS4. That resolution differential is only about 30%, and that assumes the game is shader bound, since the PS4 also has a twofold pixel-fill advantage over the Xbox One.

Either way, I don't buy the 'intentionally gimped' argument. There are simply too many confounds that don't add up.

Then explain all the other games that run equivalently on an R7 260X compared to a PS4 and Xbone. Take a look at that video; better yet, go read the article that did the tests. It isn't just Watch Dogs. Paper specs are about as useful as the paper they're written on. If paper specs told the whole story, why would we have benchmarks?
 
How about this for a theory:

Watch Dogs was always pitched as the advance guard that was going to set the bar for the next-gen consoles. Realistically, anyone with a brain realized the 'next-gen' consoles simply weren't going to be that powerful or offer much of a visual bump over the existing consoles. Nor were they going to approach PC standards, as apples-to-apples comparisons revealed they were obviously way underpowered in that regard.

So what was Ubi to do after all the shuffling of feet between MS and Sony to lock down exactly what their hardware was going to ship with? Ubi could never offer the experience they promised, not necessarily through any fault of their own, but because the 'next-gen' consoles simply aren't next-gen. They had also made the usual pandering promises to the PC user base, doubtless without any real intention of fulfilling them... but now it was going to appear that all those early console adopters salivating for Watch Dogs had been betrayed.

What were they to do? To say that the promise of Watch Dogs as a launch title didn't move many consoles would be disingenuous, too. Ubi set themselves up as the poster child for MS and Sony, so of course they were now committed to offering, at minimum, a 1:1 experience with the PC version.

Why do people make ignorant statements on here? Go watch the video of GTA V on PS4 vs. PS3. There is a huge improvement with the new consoles.

You are part of a niche percentage of a huge industry. Find devs that will cater to you (good luck, they all work for a living), or just move on with your life.

This is not going to change, not because they are poo-pooing in your Cheerios, but because there isn't any fucking money in the high-end PC gaming market. The ROI is not there to design, tweak, and support these features across such a wide range of hardware.
 