It's a good tradition :D. I am trying to convince myself not to buy this one since I seem to have moved beyond Civ. I loved Civ 1 as a kid, played lots of 2 and 3, and I played the everloving shit out of Civ 4. I never really got into 5. Part of it is that the hex layout bothered me for some reason...
I wish Epic weren't such dicks about their distribution but I totally get why Remedy did it. Alan Wake II was kinda risky. It isn't the kind of super-mass-market game that is going to appeal to everyone and their kids. It is a more limited kind of product. But they also wanted to do it big...
Stuff like this is why I love PC gaming: Because it can be different things to different people. I'm a graphics whore and I love my 4k monitor, I love RT, I want to play games cranked. Yet you can play the same games turned down on a portable and shit, we could actually play those same games...
If you don't set your expectations according to what reality is capable of, then you'll just be mad/underwhelmed all the time. There's no magic to get more transistors on a chip; it is increasingly hard engineering and it has slowed down a lot. Likewise, while things can be optimized, there's...
Or, and I'm just spitballing here, but keep the gaming hobby you have and just use the GPU you already have. It isn't like new GPUs make your old one break or ever perform any worse. I'll never get the desperation to have to have the newest card right away. The old card doesn't stop functioning...
Me too. Don't get me wrong, I like OLED and I like my OLED TV. QD-OLED in particular has amazing viewing angles that are a sight to behold. However the thing that got the most wow factor from me was the HDR on MiniLED. For games with good HDR, I still prefer to play them on my PG32UQX rather...
I'm on again off again with MMOs. I don't hate them, and it can be fun playing with others in a collaborative environment (I don't like competitive games, I get too worked up). However sometimes, now being one of those times, I want some good single player fun.
I have a few I REALLY still need...
Yep. I'm not going to shit on anyone who decides this generation is worth buying either. Depending on how old your card is, it very well might be, and even if you have a 40 series, well, maybe the small upgrade is worth it for you. I know someone who has a 7950X3D and is going to get a 9950X3D...
I've actually been pretty satisfied with the amount and quality of single player games recently. While you are right that a lot of the big-name games are multiplayer, 2024 was pretty robust for good single player games too. I have a stack that I haven't gotten around to playing yet.
Also just as an...
Well... I guess you have two options:
1) Skip this generation (that's what I'm doing) and maybe the next and maybe the one after that. Buy when there is a card that gives you enough of a performance increase to justify what you are spending. Same deal as you likely do with CPUs.
2) Stay mad and...
Shit like this is why I preferentially purchase from GOG, and I wish more services were like them. I'm not going to be the old man shouting at a cloud saying "everything should be physical" because I know that ship has sailed, but I would like the digital services to be good. Steam isn't bad...
The inflated price I listed is inflated correctly from $700. The point stands that the one that people are all doe-eyed over was near $1000 in today's money. It is a lot, but it isn't unprecedented.
I mean we'll see what happens with pricing, and if it gets too expensive (or is too expensive)...
I mean... The 90 series are expensive but they are ultra-high end. The 80 series are pretty in-line. The 5080 is $1000 MSRP. The much vaunted "awesome value" 1080Ti was $699 MSRP, which when adjusted for inflation is about $950 today, so around the same price for around the same class of...
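To show my math (the ~1.36 CPI multiplier from March 2017 to now is my ballpark from the BLS calculator, so treat it as approximate):

```python
# Rough inflation adjustment for the 1080 Ti's $699 launch MSRP.
# The ~1.36 CPI multiplier (March 2017 to today) is my ballpark from
# the BLS calculator; swap in the current figure if you want precision.
msrp_2017 = 699
cpi_multiplier = 1.36
print(f"${msrp_2017} in 2017 is about ${msrp_2017 * cpi_multiplier:.0f} today")  # ~$951
```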
Maybe, but probably not a ton. I think part of it is just that the new model takes more time to run, even on the 5090 and that some games just have issues with consistent frames. Jedi Survivor is always an example given where no matter what you did, there were some frame rate drops.
Personally...
Twice. Once was with my Core 2 Duo system, I got a Core 2 Quad Q6600 for it later. I can't remember what I was running that was having issues with only two cores, but something was, and the quad was a cheap(ish) option to fix it.
The time before that was a Pentium OverDrive chip. The 486SX2...
Have a look at DF's video, they tested it. On the 2080 it is about a 13% uplift, enough to allow for 1080p30 play with DLSS with PT on. 3080 was 10% uplift. 4090 was minimal, like 1% uplift, 5080 was also like 1% uplift. So it seems like they are not nerfing it on older GPUs, in fact in those more...
No. The reason they are excited for the 5090 is almost entirely the memory. They don't hate the performance increase either, but the memory is the big deal. Beeeg models need beeeg memory and when the GPU doesn't have it, things run a lot slower. That's why they'll pay lots for the enterprise...
The 5080 is far more likely to be around for the immediate future because:
1) It's easier to produce more of them. The die is a lot smaller, meaning more per wafer, meaning faster production (rough math after this list).
2) It's not really that impressive vs the 4080 and 80 series buyers are less likely to be "I gotta...
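To put rough numbers on point 1 (the die areas are the commonly reported figures for GB203 and GB202, and this ignores yield entirely, so it's strictly back-of-envelope):

```python
import math

# Back-of-envelope dies-per-wafer estimate on a 300mm wafer, ignoring
# yield/defects. Die areas are the commonly reported figures for
# GB203 (5080) and GB202 (5090); treat them as approximate.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    # Classic approximation: gross area divided by die area, minus
    # a correction for partial dies lost at the wafer edge.
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(378))  # GB203: ~150 candidates per wafer
print(dies_per_wafer(750))  # GB202: ~70, less than half as many
```

And that's before factoring in that smaller dies also yield better, which widens the gap further.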
I'll agree in that I would tell people not to judge HDR from the experience you get on one of those displays, if you got one and said "Man HDR doesn't impress me much," I'd say "Well try a display with better HDR and then tell me what you think." However I do think that if you get one, you'll...
To be fair, while I'm a high brightness MiniLED guy, 400 nit HDR isn't terrible. No, it doesn't have the impact you really want and get with higher values, but it isn't nothing. No Man's Sky has pretty broken HDR settings, necessitating a ReShade shader to put tone mapping on it. To get it...
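For the curious, this is the gist of what a tone mapping pass like that does. To be clear, this is generic extended Reinhard, not the actual ReShade shader, and the numbers are made up for illustration:

```python
# Generic extended-Reinhard curve (NOT the actual ReShade shader).
# The idea: roll scene luminance off so the brightest content lands
# exactly at the panel's real peak (say 400 nits) instead of clipping.
def tone_map(lum_nits, scene_max=1000.0, display_peak=400.0):
    x = lum_nits / display_peak
    w = scene_max / display_peak          # scene max maps to panel peak
    return display_peak * x * (1 + x / (w * w)) / (1 + x)

for nits in (100, 400, 1000):
    print(f"{nits} nits in -> {tone_map(nits):.0f} nits out")
# 100 -> 83, 400 -> 232, 1000 -> 400: shadows mostly kept, highlights rolled off
```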
It 100% is his entire identity, that's why he is such a bitch about things and goes so hard because it is all of who he is and all he has. An attack on his record is an attack on his ego.
Gee, that's a huge surprise :P.
The bigger shakeup from it wasn't actually how cheap it is, or isn't, but more that it kinda removed the mystique from the eyes of the normal person, and particularly finance bros. The AI companies had done a good job making it look like only they could do...
Same way it always does: By where the mouse input said it was going. The idea with Reflex 2 is you sample the mouse input at a much higher rate than you render (this is normally done anyhow) and you take where the input is as close to that moment as possible, warp the display to match, fill in...
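Here's a toy version of the warp step as I understand it. This is my own illustration, definitely not NVIDIA's code; the real thing reprojects with depth and inpaints the revealed strip properly instead of smearing:

```python
import numpy as np

# Toy version of the late-warp idea. The frame was rendered against an
# older mouse sample; just before display you shift it by however far
# the mouse has moved since, then patch the strip that gets revealed.
def late_warp(frame, yaw_at_render, yaw_now, pixels_per_degree=40.0):
    dx = int(round((yaw_now - yaw_at_render) * pixels_per_degree))
    warped = np.roll(frame, -dx, axis=1)   # crude horizontal reprojection
    if dx > 0:
        warped[:, -dx:] = frame[:, -1:]    # "fill in": smear the last column
    elif dx < 0:
        warped[:, :-dx] = frame[:, :1]
    return warped

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)        # pretend render
out = late_warp(frame, yaw_at_render=10.0, yaw_now=10.5)  # mouse kept moving
```

The point is the display gets corrected with input that's a millisecond or two old instead of a whole frame old.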
The idea with Reflex 2 is it would respond to where your cursor was when you triggered the shot, not what was on the generated frame, hence reducing perceived latency.
Also something to remember when talking about this stuff: Not everyone plays high speed shooters; in fact I'd go so far as to...
Apple just flat refuses to do shit like that. They never liked the industrial/enterprise market, despite seeing use there. People usually have to hack around and get 3rd party solutions to make their shit work like it should in such environments (why they don't buy better tools in the first...
Well... there is one possibility that might make it happen and that is the whole Reflex 2/frame warping thing. I'm not counting on it or saying it'll happen, but if they can get the warp to happen fast for each generated frame, it could make it feel much more responsive by lowering the latency...
That was always their problem: No killer app. All Apple's marketing was "You can do what you do on your iPhone or Macbook... BUT IN AR!!!1111one" While that sounds neat, in reality people will play with something like that and then set it aside because putting on your goggles is way more work...
Ok... but all that means is you get to wait longer anyhow. If you want nVidia to launch these cards when they have lots, that means they have to wait to build up stocks of them. So you still don't get to have one until later. It doesn't really change anything unless your objection is others...
It is kinda no-win for the GPU companies. You can't make the cards any faster; it just takes quite a lot of time to fab these new chips and then get everything built, so there's only going to be so much supply per day/week/etc. So what do you do? You can launch when the products are ready, that's...
I always loved when you'd walk in to the clothing store and the guy would say "I suspect you have a sharp eye for fashion," when you look like you'd just grabbed random shit off the rack at a thrift store.
Ya, I'm not sure why this is some kind of exposé from Der8auer. PCIe bandwidth generally has been growing faster than (consumer) need for it, which is great. The reason the new card supports PCIe 5 is that that's the latest final spec, and why shouldn't it? It doesn't need the extra bandwidth...
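For reference, x16 slot bandwidth by generation (per-lane rates are straight from the specs; the usable figures account for the 128b/130b line coding used since Gen3):

```python
# PCIe x16 throughput by generation. Per-lane rates are from the
# specs; "usable" accounts for the 128b/130b line coding (Gen3+).
for gen, gt_per_s in ((3, 8), (4, 16), (5, 32)):
    usable_gb_s = gt_per_s * 16 * (128 / 130) / 8   # lanes, coding, bits to bytes
    print(f"PCIe {gen}.0 x16: ~{usable_gb_s:.0f} GB/s")
# ~16, ~32, ~63 GB/s: each gen doubles, and reviews keep showing
# consumer GPUs losing very little even when dropped to older gens
```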
No kidding. It's amazing, the purist bitching about this, as though if they didn't do frame gen they could just magically get a higher render rate. Like, if someone else was doing it and nVidia wasn't, then ok. If Intel had a card that could throw down 400+fps in Alan Wake 2 with all the details...
Nah. I'll probably get one at some point, since I have a bad tech addiction and want the shiniest toys, but I don't want one bad enough to camp out for it. I'll either wait until they are in stock, or maybe someone on the forum will help me as they did with the 3090.
Realistically I shouldn't...
They are an established "China random name" brand on Amazon. They've been around for quite a while, and people have tested their cables (with real signal testers) and verified performance.
When stuff is quiet, that's when it matters most that there is minimal noise/whine/etc. That's why, despite our ears' rather low SNR, you want a very high SNR on audio equipment. If a power amp only had 40dB of SNR and could play 100dB SPL, the noise would be exceedingly loud any time you weren't...
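Running the arithmetic on that example, since it's stark:

```python
# Noise floor sits (SNR) dB below max output, so with the numbers above:
max_spl_db, snr_db = 100, 40
print(f"Idle hiss: ~{max_spl_db - snr_db} dB SPL")
# 60 dB SPL is roughly normal-conversation loud, constantly, from a
# "silent" amp. Hence wanting SNR way beyond our ears' own.
```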
1) The frame gen thing makes sense. So long as your rendered frames are fast enough to give a good feeling, latency-wise, the generated ones probably are just nice smoothness. What it won't do, which nVidia likes to imply, is make a sub-30 fps experience nice and playable. Maybe Reflex 2 will...
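Quick numbers on why (the 4x generation factor is just my example):

```python
# Frame gen multiplies *displayed* frames, not the rate the game
# samples your input. Assuming a 30 fps base and 4x generation:
base_fps, gen_factor = 30, 4
print(f"On screen: {base_fps * gen_factor} fps")       # looks like 120
print(f"Game reacts every: {1000 / base_fps:.0f} ms")  # still ~33 ms
# Motion smooths out, but it still feels like 30 fps (slightly worse,
# actually, since generation holds a frame back to interpolate).
```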
Also I would point out that while people are hating on 30% more performance as "not a generational upgrade"... CPUs usually get 10% performance uplift AT BEST, and often less, from their generations, and we don't hear screaming. I think we need to accept that GPUs are starting to plateau, just...