I wonder how far we are from having real-time raytracing in games.
I feel like unless there's a major breakthrough, it'll be about 10 years at least, if we even get there; engineers and developers love to stall at "good enough".
Handling cash has costs associated with it too, just less "upfront" ones. The people counting and transporting it aren't doing it for free, and cash is easier to steal and counterfeit (places that do lots of small cash transactions get screwed by fake bills being passed, especially bars and...
AFAIK the claim is that the tiny evil haxor chips were added at some point *after* the boards left the factory, so no board design elements were ever there to facilitate them.
Basically it's really frigging unlikely this is a thing :p
I'm not a motherboard engineer/designer, but I'm super skeptical that you can just put a chip on a board and have it do even simple things that a chip in that position on the board wasn't intended to do.
I mean, I'd maybe get it if an actual existing controller was cloned, and extra functionality...
2020 seems optimistic, but it's hard to know where they are starting from.
They've been making GPUs the whole time, just integrated, so they aren't starting from 0 in terms of either hardware or software knowledge and processes internally, and they've been getting better (slowly).
If they are...
Pretty much all 3D rendering for games is a bucket of workarounds for trying to achieve something visually similar to raytracing in the absence of the computational power to do the real thing in real time.
I'm super glad to see some green shoots on getting there, but unless there's been some big...
I moved from a 2600k to a 3770k about 5 years ago or thereabouts, and there wasn't much improvement. Until the latest round of CPUs I haven't felt like there was enough of a boost to be worth considering; even now when looking at the 8700k I'm thinking "I'd kind of like an M.2 socket on my...
I'm curious to see how quickly Intel can respond to this, as in, release products without the hardware fault. Depending on its exact nature and where they are in development of their current pipeline of products, it could be quite a while.
They are probably boned if they don't have something...
I have already basically decided this is my next board. I've been on the M-ITX train for a few years, and this will be replacing my P8Z77-I Deluxe. Looking forward to the review.
A thing I've wondered with more recent boards, though, is: does a heatsink really help an M.2 SSD? If so, why does...
*Mathematically*, I believe that death is inevitable on a long enough timeline, because even within a normal 80-year timeline people die from a lot of shit that isn't age related, and if you extend that lifespan you also increase the opportunities for random other shit to kill you.
But when it...
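That cumulative-risk intuition can be sketched with a toy calculation. The annual risk figure below is invented purely for illustration (not a real actuarial number); the point is just that even a tiny constant yearly risk compounds toward near-certain death on long enough timelines:

```python
# Toy model: assume a constant annual probability of dying from
# non-age-related causes (accidents, etc.). The 0.05% figure is a
# made-up placeholder, not real mortality data.
p_accident = 0.0005  # hypothetical annual risk

for years in (80, 500, 5000):
    # Probability of dodging that risk every single year
    survival = (1 - p_accident) ** years
    print(f"{years:>5} years: {survival:.1%} chance of surviving them all")
```

Over a normal 80-year span the risk barely registers, but stretch the timeline far enough and survival odds collapse even though nothing about the yearly risk changed.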
It's time for me to upgrade my 3770k, I was holding on for Coffee Lake in the hopes there'd be an actual performance increase for normal users, but once again it's just: "MASSIVE 15% performance increase! ... for the tiny slice of the PC market that just encodes video all day and/or run 70...
I don't dislike Prometheus, but it was hamstrung by a messed-up development process. They didn't really decide what it was until they were basically already shooting. They moved back and forth between making it a reboot, a prequel, or a fully standalone thing until way too late, so the...
There are great sequels, alright, but the ratio isn't that good.
That said, when a sequel is made a good while later, and isn't a straight-up attempt to cash in on the success of the first film while there's still buzz around it, the chances are way better.
Most sequels fail because they have...
I had an a7n8x deluxe for a long time, good board.
Currently I'm running the ASUS P8Z77-I Deluxe Mini-ITX with an i7 3770k, and until now-ish I've felt there was no real reason to upgrade; my main use case is gaming, and that MB/CPU combo doesn't come close to bottlenecking anything.
That said...
"...and a GPU that benefits from five years of improvements to Radeon technology"
So no real difference then?
*rimshot*
I have a really hard time seeing anyone doing a "new" console ever again, or at least for a very long time. Fully bespoke technology is more or less too costly and too risky...
Yeah, even if this hadn't happened I think prices were going to come down a lot this year, due to scaling up of data centre adoption and such, but this hopefully bodes well. I really want everything on an SSD at this point, but it's still like 8- or 9-to-1 for SSD vs HDD pricing, so it's not viable...
My main PC use case is gaming, and as such I've seen nothing yet to make me feel like I need an "upgrade" to the performance of my 3770k. However, I am interested in more recent stuff my PC lacks, like NVMe and USB 3.1 support, so I was looking to upgrade on that basis.
Until recently it hadn't...
I feel like, given the pricing and memory spec, Nvidia considers the Titans more like cheap workstation cards than high-end gamer cards (or just purely a prestige product; there can't be much money in those), and I suspect that's where most of those sales go, even if they don't market them that...
I'm curious to see what happens to volatile storage over the next few years. Will we see the death of RAM? With maybe a bit more on die cache on cpus, and Optane 2/3 doing everything else?
"....and it never seems to work when you need it."
Something about the description of WiFi there reminds me of those infomercials that pretend a problem exists that simply doesn't.
UK/Ireland Netflix is a joke. I've felt like that was more to do with the crazy fragmented way the content producers do licensing here, where basically Sky owns all content forever, so the streaming services just get scraps from the table, but increasingly I've felt like Netflix just doesn't...
I reckon they'll do it when/if they have something to add technologically.
HL did new things with narrative and tech, and HL2 did new things, mainly with tech/gameplay (kids forget how mind-blowing and innovative bringing physics into the gameplay to the extent they did was at the time), not so...
It always strikes me as odd when people equate long hours to productivity or "hard" work.
There is simply no way that someone "working" 130 hours isn't spending half of that time fixing shit they broke because their brain stopped working properly about 40 hours in.
I know it's seen as a badge...
The relatively fast release of this after the other GTX 10xx cards kind of makes me wonder how long we'll have to wait for the 1080 Ti. I promised myself I'd hold off on upgrading until there was a true 4K60 card; the TX is on that line, but soooo expensive. I'd probably puke if I bought it and...
I feel like we aren't *too* far off RAM in its current form becoming obsolete, or just being cache on the CPU or on something like an SSD (basically whatever comes after 3D XPoint).
I don't see CPU cache, RAM, GPU RAM, and SSDs all continuing to exist discretely, basically.
Kyle always calls it like he sees it. People who can't take his views (or any view that doesn't fit their personal narrative) should evaluate their own biases.