What a stupid accent.
Wow, your ignorance and thread crapping abilities know no bounds. Your mother must be ultra-proud of you.
It takes a long time to develop the AAA titles that you guys want to push the $2k+ hardware. The market JUST IS NOT THERE.
Picking up someone else's accent is normal. Having it integrated into your own over time is equally normal; I myself have unconsciously changed my accent from living across Europe and the U.S.
Most people completely fail to realize this point. Top-end video cards are roughly 1% of the PC market. If you look at Steam hardware data, the most popular PC graphics device is Intel integrated graphics. The top discrete card is the GTX 660, and for AMD the 7850.
Gaming is a business. There simply isn't enough return on investment to spend extra development cycles on features only a small fraction of gamers can use.
The game was already delayed for something like six months. Delaying it again would be even worse from Ubisoft's business perspective.

Why would Ubisoft waste all that time in development just to scrap it? Wouldn't the company just delay the release of the PC version?
TotalBiscuit suffers from dyspraxia which could explain some of his idiosyncrasies.

I saw a video on his YouTube channel where he was a guest speaker at some event, and that voice just came right out from the get-go. That said, if you watch his mannerisms during that, he plays with his hands and has some idiosyncrasies for sure. But then again, his job is to spend hours upon hours playing games and commenting on every little bit of them. Some might call that a dream job, but it takes a "special" person to do that.
TotalBiscuit suffers from dyspraxia which could explain some of his idiosyncrasies.
Who is TotalBiscuit? Just some YouTube guy?
Ah, that explains things. I know he's recently been diagnosed with cancer too, which sucks.

TotalBiscuit suffers from dyspraxia which could explain some of his idiosyncrasies.
Yup, island apes need to learn how to speak American properly.

Yeah I know, fuck the 60-odd million people who talk similarly, amirite?
Not sure if you're speaking generally or in reference to Watch Dogs... Watch Dogs has the textures and features there; they were just dumbed down. So it could easily have been included in the final release, even as an "Ultra" setting. Development was already done on that part.
If the content this mod enables hadn't been in the actual release, they'd be in much better shape. It would still suck, as it wouldn't have met E3 expectations, but PC users wouldn't feel as ripped off.
At the same time though, a bit of optimization would make any game run decently at settings that exceed what the new consoles are doing on these popular 7850/660 cards. People with higher-end cards could crank up additional settings like high AO, AA, and so on. It still wouldn't necessarily be designed around the high end, but the high end would benefit (as well as the mid-range).
Looking at Watch Dogs, I can't see any reason why it shouldn't run at a high frame rate with high+ settings on something like a 7850 or 660. Many people have said it. There are games that look a lot better running on lower hardware.
See that's the point. That "bit" of optimization costs money. Would that "bit" generate enough additional sales to warrant the investment on the PC platform?
Well, it really depends on the users doesn't it? If nobody bought it because it was unoptimized garbage, then I think it would be well worth their time (in their eyes) to do it. It will always be worth it in the users' eyes. The problem is that people will buy it either way, so there's no real monetary incentive for them to do it.
I was thinking that maybe they did indeed tone down the PC graphics because of the new consoles. Not because they were paid off by Microsoft or Sony, but because they feared a backlash from console gamers if they saw that the PC version looks that much better while being told at the same time that these shiny new all-powerful consoles still can't keep up.
So it was neither malice nor stupidity, it was simply damage prevention.
You waited until the 3rd page to ask this question? I mean the topic is "TotalBiscuit Takes On The Watch Dogs..."
But to answer your question: "just some YouTube guy" in a general sense, but he's very widely watched. He does a lot of "First Look"-type videos of Steam games and other often poorly advertised games, and he obviously doesn't pull any punches in them. He may have started as "some guy," but he's got a company now and makes money from the number of hits he gets. He's also done other things, like speaking out about how YouTube's automated copyright claims can screw people over, as when he was slapped with one by a game company that didn't like his review.
The real problem is that people will pre-order this shit sight unseen which allows for these companies to make money off of broken, unoptimized crap.
Well, I didn't know, but since there was a lot of focus on him in some posts, rather than on the actual news part, I had to ask. I just hadn't heard of him at all. I don't look into new games too often, though.
And those next-gen gamers would have had the honor of paying $10 MORE than the better-looking PC version costs, too.
I wouldn't consider the PS4 a 7850. Sure on paper it may seem like a 7850, but it's barely capable of a R7 260X. A GeForce GTX 750 Ti is only $150 and would obliterate the PS4.

The closest equivalent to a PS4 GPU is the 7850, and that's only running the game at 30fps at 1600x900. Assuming no other bottlenecks, you'd need twice that to double the framerate, plus another 44% to take it up to a full 1080p. That's 2.9x the performance of a PS4 GPU and well into 780 Ti territory. That obviously isn't the case when an HD 7850 is more than capable of running Watch Dogs at 1080p/30.
There's definitely something else going on, which is also why the notion of a 'gimped' PC version is a load of nonsense. If the PC version is gimped, then so is the PS4 version.
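For what it's worth, the scaling arithmetic above checks out. A quick sketch in Python, using the resolutions and framerates quoted above and assuming performance scales linearly with pixels pushed per second (no other bottlenecks):

```python
# Going from 1600x900 @ 30fps (the PS4's reported target) to 1920x1080 @ 60fps.
base_pixels   = 1600 * 900     # PS4 render resolution
target_pixels = 1920 * 1080    # full 1080p

framerate_factor  = 60 / 30                      # 2.0x to double the framerate
resolution_factor = target_pixels / base_pixels  # ~1.44x for the extra pixels

total_factor = framerate_factor * resolution_factor
print(f"Required performance multiple: {total_factor:.2f}x")  # ~2.88x, i.e. ~2.9x
```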
"This game was built from the ground up for the Xbox One and what a lot of people don't realize is that our visual framerate is actually tied to the game's simulation under the hood," Airhart said. "Over three years ago, when we started building the game, that's kind of how we built it, targeting 30 FPS.
"When we started the PC project at the very end of last year and beginning of this year, we knew that we weren't going to be able to guarantee running above 30 FPS."
Also, Dead Rising 3 for PC has a frame rate cap of 30 fps, for no apparent reason. The developers say it's OK to increase the fps to 60 if you want, but warn of possible problems. It just so happens the Xbox One and PS4 are also limited to 30 fps.
Does anyone really believe this? It's so apparent that the consoles aren't up to speed with even a cheap PC that it seems developers purposely gimp their games on PC so as to not overshadow the consoles.
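For anyone wondering how a renderer ends up "tied to the game's simulation under the hood," the usual culprit is a loop that steps the game logic once per rendered frame with a fixed timestep. Here's a minimal sketch of that pattern in Python; it illustrates the general technique, not Capcom's actual code:

```python
import time

SIM_HZ = 30          # simulation tick rate the game was built around
DT = 1.0 / SIM_HZ    # fixed timestep baked into the game logic

def simulate(dt):
    """Advance physics, AI, scripting by dt; logic assumes dt == 1/30s."""
    pass

def render():
    """Draw the current game state."""
    pass

# One simulation step per rendered frame: correct at 30fps. Remove the sleep
# (i.e. uncap the framerate) and the whole game runs faster than intended,
# which is why the devs "warn of possible problems" when raising the cap.
for frame in range(90):  # ~3 seconds of game time at 30fps
    frame_start = time.monotonic()
    simulate(DT)   # world advances a fixed 1/30s...
    render()       # ...once per rendered frame, so render rate == game speed
    elapsed = time.monotonic() - frame_start
    time.sleep(max(0.0, DT - elapsed))  # hold ~30fps
```

Decoupling the two (fixed simulation ticks, interpolated rendering) is the standard fix, but retrofitting it onto a three-year-old codebase is real work, which fits the "built from the ground up" quote above.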
I wouldn't consider the PS4 a 7850. Sure on paper it may seem like a 7850, but it's barely capable of a R7 260X. A GeForce GTX 750 Ti is only $150 and would obliterate the PS4.
I thought by now people would realize that paper talk is pointless. If the PS4 can't do Watch Dogs at 1080p/30fps but a 7850 can, then guess what: the PS4 is not equivalent to a 7850. It's barely an R7 260X, which is a $110 graphics card.
Or the PS4 is just gimped. But that couldn't be possible, right? Cause on paper it's a 7850. Except it's not.
Gimped I'd wager, by poor coding on the devs' part. I'm not a game programmer, but I know a few, and I have been playing all my life. Old or new, there have been plenty of games that went above and beyond people's expectations at the time, squeezing things out of hardware that people didn't know was possible. Anyone remember the first time they played Unreal? (Or Quake II in OpenGL for the first time?)
Look at Bethesda. They seem generally regarded as moderately inept, but their Skyrim engine is pretty nice, supports a huge world, and unlike Oblivion, hasn't crashed on me once even when modded up pretty well. (though I do know they've had unstable engines in the past) My point is, if I can play Skyrim at highest settings, hardware has iterated a couple of times since then, and Ubi can't get this game running half-decently on cutting edge hardware, then something is wrong.
I'm not sure that argument works when there are already so many known quantities in these GPUs. We already know that the PS4 GPU is 1.84 TFLOPs to the 7850's 1.76 TFLOPs, for example, so we know it's not shader bound. It also has more texture units (72 vs. 64), so it's not bound in that area either. The 260X also has a 100GB/sec memory path versus the 176GB/sec path in the PS4, so even that doesn't make sense.
These are all same architecture GPUs so the same rules apply.
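Putting the figures quoted in this thread side by side (a quick sketch; only the numbers actually cited in the thread are used, and the 100GB/sec 260X figure is the poster's approximation):

```python
# Paper specs as quoted in this thread. All are GCN parts, so the
# same-architecture comparison is at least apples-to-apples.
gpus = {
    "PS4":     {"tflops": 1.84, "tex_units": 72, "bandwidth_gbs": 176.0},
    "HD 7850": {"tflops": 1.76, "tex_units": 64, "bandwidth_gbs": 153.6},
    "R7 260X": {"bandwidth_gbs": 100.0},  # only bandwidth was cited
}

ps4, hd7850 = gpus["PS4"], gpus["HD 7850"]
print(f"Shader ratio,  PS4 vs 7850: {ps4['tflops'] / hd7850['tflops']:.2f}x")        # ~1.05x
print(f"Texture ratio, PS4 vs 7850: {ps4['tex_units'] / hd7850['tex_units']:.2f}x")  # ~1.13x
print(f"Bandwidth,     PS4 vs 260X: "
      f"{ps4['bandwidth_gbs'] / gpus['R7 260X']['bandwidth_gbs']:.2f}x")             # 1.76x
```

On paper, none of those ratios explain a 7850 outrunning the PS4, which is the poster's point.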
I think you would have a point about it not being malice if they didn't come out and say the graphics were not downgraded from E3, which is quite obviously a lie at this point. If they had come out and said they had too much trouble getting the same graphics to work properly outside of the E3 demo in time for release, THEN you could talk about no malice being involved.

I'm not sure it was malice, or some sort of "make the console version just as good by gimping the PC version" type of conspiracy at work. I think it was lazy developers who didn't want to take the time to ensure those things worked properly or that there would be menu options for them, etc. Lazy and rushed for time due to impending publishing deadlines is probably close to the truth. In this business, it usually is.
Makes sense, cause you're only looking at the GPUs. That 176 GB/s that the PS4 has is shared with the 8-core CPU. BTW, the 7850 has 153.6GB/sec. Some of that 176 is going to be lost to the CPU, where the 7850 doesn't have that problem. Plus there are going to be issues with memory contention between the CPU and GPU. Also, the CPU plays a role regardless of whether it's console or PC. Not as much as on PC, but it affects graphics performance on consoles as well.
But again, paper vs. actual evidence of performance. You're going by paper, and I have a video that shows games on PC vs. PS4 vs. Xbone, with the PC using a $110 graphics card.
No, you can't guarantee anything on a PC, because everyone has different hardware. That being said, at a hardware level equivalent to an Xbox One or better, you absolutely should be able to. Not necessarily "guarantee" per se, as that could have legal ramifications, but as a developer you should absolutely expect it.
People saying it was "poor coding"...was it that hard for them to flip a couple of switches in a config file? I don't really buy that.
I think the PS4 CPU is limited to only 20ish GB/sec of that bandwidth but, yeah, I don't really have any argument there. That still leaves a 50% bandwidth advantage over the 260x and is a point of contention.
On the other hand, the X1 runs the game at 1408x792, which also doesn't account for the 50% performance differential between it and the PS4. That resolution differential is only about 30%, and that assumes the game is shader bound, as the PS4 also has a twofold pixel-fill advantage over the Xbox One.
Either way, I don't buy the 'intentionally gimped' argument. There are simply too many confounds that don't add up.
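Running the numbers from this exchange (the ~20GB/s CPU reservation is the figure mentioned above; the rest is straight arithmetic):

```python
# Effective GPU bandwidth on the PS4 after the CPU takes its slice of the
# unified GDDR5 pool, versus the discrete cards discussed above.
ps4_total_bw = 176.0   # GB/s, unified memory
cpu_share    = 20.0    # rough CPU slice per the post above
ps4_gpu_bw   = ps4_total_bw - cpu_share
print(f"PS4 GPU-effective bandwidth: ~{ps4_gpu_bw:.0f} GB/s "
      f"(7850: 153.6 GB/s, 260X: ~100 GB/s)")

# Resolution differential between the console versions of Watch Dogs:
ps4_pixels = 1600 * 900   # 1,440,000
x1_pixels  = 1408 * 792   # 1,115,136
print(f"PS4 vs X1 pixel count: {ps4_pixels / x1_pixels:.2f}x")  # ~1.29x, i.e. ~30%
```

So the PS4 pushes about 29% more pixels than the X1 despite the oft-cited ~50% raw GPU advantage, which is the mismatch being pointed at.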
How about this for a theory:
Watch Dogs was always pitched as the advance guard that was going to set the bar for the next-gen consoles. Realistically, anyone with a brain realized the 'next gen' consoles simply weren't going to be that powerful or offer much of a visual bump over the existing consoles. Nor were they going to approach PC standards, as apples-to-apples comparisons revealed they were obviously way underpowered in this regard.
So what was Ubi to do after all the shuffling of feet between MS and Sony to lock down exactly what their hardware was going to ship with? Ubi could never offer the experience they promised, not necessarily through any fault of their own, but because the 'next gen' consoles simply aren't. They had also made the usual pandering promises to the PC user base, doubtless without any real intention of fulfilling those... but now it was going to appear that all those early console adopters salivating for Watch Dogs had been betrayed.
What were they to do? To say that the promise of Watch Dogs as a launch title didn't move plenty of consoles would be disingenuous too. Ubi set themselves up as the poster child for MS and Sony, so of course they were now committed to offering a 1:1 experience with the PC version at minimum.
The ROI is not there to design, tweak, and support these features across the wide range of PC hardware.