I remember this game, it was awesome. My roommate and I in college used to get stoned and play it. Once he was so out of it that he thought he was using the Force to steer. I laughed for about 3 hours. Yeah.... Stoned.
That's my guess too. Games Workshop has said they plan to focus on fewer but higher-quality releases in the future. Frontier would be a decent fit; they're more experienced than most of the developers Games Workshop has used.
Yeah, I'm having a good giggle reading some of these responses. There will always be enthusiasts who want the highest-quality experience, of course. But everyone here is pretty out of touch with your average consumer. With media, easiest always wins, and streaming is easiest.
Not sure that analogy helps your point. Obviously it's worked for Hollywood for a hundred years. If the final result is a pretty actress, do we really care how it was achieved? Same thing for DLSS. We just want a good image and good frame rate. I don't care about the method, just the results.
The Mandalorian series is in production. Fett is on hold, and that's fine by me, I'd rather see new stories about new characters instead of dusting off whatever original series character they feel like this year. You have no secret info about the rest of your statement, so can we not pretend...
But he was the lead character. If you don't develop him, then the movie loses a lot of strength. I liked Solo overall, I've rewatched it a few times. I loved the fan service explaining the Kessel run parsec issue and Han shooting first. But the movie could have been better had his origins...
I really liked Rogue One, but found Solo to be a decent but fairly average movie. I don't know why you say it has good character development. I think it has none at all. Han starts out as a roguish good guy, in the middle of the movie he's a roguish good guy and he ends the movie being a...
Yes. But when you make a game in Unreal Engine and build it, the engine does all the D3D code-writing grunt work for you. Prior to this, it wouldn't use any of the ray tracing code MS added, simply because it didn't know it existed. Now, when you enable ray tracing in Unreal...
Yeah, this is not true. At the initial launch of BFV, the 2070 had some pretty bad ray tracing performance, and everybody latched onto those headlines. Then in early December, DICE released a patch that fixed that. All the Turing cards saw their ray tracing performance jump massively. At 1080p...
"an intentionally misrepresented proposition that is set up because it is easier to defeat than an opponent's real argument."
In other words, his best argument was inventing a theory that the R7 could undervolt and save 50W while the 2080 could not, and that this therefore made the R7 better. It might...
That's quite the strawman. To paraphrase: "Undervolting the R7 probably won't have problems, and undervolting the 2080 probably will. So the AMD is 50 watts less! Definitely better."
Yeah that's the direction things are heading. I can see the appeal in it, consumers never need to upgrade their console. Developers never need to focus on specific hardware generations. If the bandwidth is there and the latency is minimal, I'd be fine with that. I'm no teenage cyborg...
Didn't they do this because the Justice Department was up their ass about anti-trust shit? To show that they didn't have a monopoly they bailed Apple out. Seems to have worked out for all of them.
Just a thought, Kyle, but maybe you might want to look into the claims being made before running a story with the National Enquirer of tech news as your only source. Charlie has had the biggest hard-on for ruining Intel and Nvidia for decades. He's been calling for their demise since before...
Even SemiAccurate admitted that today's process names are more about nanomarketing than nanometers.
Here's another:
https://www.semiwiki.com/forum/content/7602-semicon-west-intel-10nm-gf-7nm-update.html
Quote at the bottom:
The other interesting Intel news is that they may skip 7nm and move...
Don't fall for the marketing. "10nm" and "7nm" are as much product names as actual feature sizes. In reality, when you look at chip density, Intel's 10nm was much denser than Samsung's 10nm. But it's not denser than Samsung's 7nm, although it's very close...
It never fails: people always seem to think TV series are going to cast all A-listers. I've got another spoiler: it's not going to be a word-for-word copy of the books, either. So if that's the other hill people plan to die on, good tidings.
"Some early adopters say that the screen is more 3D than holographic."
So then what makes it different from the old HTC Evo 3D I had 6 years ago? Man, I loved that slow-ass phone.