So does Nvidia if you tick the clean install box. If not, it will save your settings and keep some backup files of the old drivers (for what, I don't know). On an old Win 7 install I had literally gigabytes of leftovers from updating Nvidia drivers that way for years. That machine had no issues...
So much negativity in this thread, but this is the correct answer. Upscaling is the future: human senses have their limits, so a lot of power is being wasted on things we don't even see, especially at higher resolutions. Smart upscaling frees up resources to enhance aspects of the image...
Yea but Intel has a huge lead in this game with 13th gen. The 13900K is well over 20% faster than the 7800X3D. The cache is nice but sometimes you need more cores/frequency too.
Very sad that the 7950X3D handles this game so badly though, I don't know if it's because of data crossing the 2 CCDs or...
Isn't there some way to get AutoHDR working on it? I don't have the game but I've heard about that, I think by renaming the .exe and maybe with some mod too.
Quite shocking to have no HDR support though...
Yea, Starfield is hammering the threads. It might even scale with more cores than the 13900K has; nobody has tested that yet, I think. It's a good 30% faster than the 7800X3D, with trash RAM and stock clocks. That's a lot.
I'm with Dan_D, and btw, Prime95 does not stress the 7950X3D (or any other X3D) at all! So for those CPUs it's probably not even worth running it for more than a few minutes (they have a lot of safeties built in regarding voltage, amps, etc. to avoid killing the cache). You will heat up your...
Yea, I think 54 nits is very low even in a dark room. Well, it might be fine for reading black-on-white text and stuff like that, but I'm guessing he's in fact in the good old (and preferable) 80-120 nits range :)
Glad you solved it! I'm not totally surprised the CrystalDiskMark result was a "lie" though, synthetic benchmarks should never be trusted too much haha.
Yea that's true, QD-OLED may not actually burn in faster when driven at the same brightness level. That would require another set of tests, but I still like what Rtings is doing and it's very valuable. But at least we can say "if you run 100% brightness, this is the result", since most people won't...
HDR content still averages in the 100-120 nits range (you know how many people already complain that HDR is "too dark"). It shouldn't really burn in that much faster than SDR at 120 nits, even with the highlights. Of course, if you pause the video or have a static overlay that isn't HDR friendly...
I am out of ideas, but this is an interesting thread. Now it looks like a hardware issue again. Is there another PC you could test those drives in? Different mobo, CPU and all that.
OLED is flawed, and QD-OLED burns in faster - absolutely true. But LCD is flawed too, in different ways: grey blacks in a dark room (or blooming with FALD), and slow pixel response times.
I just don't think you understand the tech and the whole "burn-in" thing very well. There is no flawless...
No, you don't use a display at max brightness unless you're in a bright room. But then you don't use OLED, it's the wrong tech for that. So not a realistic scenario.
Heat also matters a lot for OLED; using it at max brightness for so many hours non-stop will make things worse, especially in...
Burn-in doesn't mean it's broken though; in the case of OLED it just means degraded uniformity because some pixels are darker than others, and it won't even be noticeable all the time, if at all. Mostly you will notice it by displaying a solid colour background, and it won't even show with all...
I have like 16000h on my LG CX by now and it still looks flawless (it's over 3 years old now). That's a better track record than several LCDs I've had.
But yea I'm using it in a dark room with low brightness setting for SDR, so it's not surprising given what I've learned about the tech over...
Has to be.
edit: but Google says it's TLC... So now I'm thinking it's just defective or the specs are a lie; that's way too low, and 600GB free isn't THAT bad.
The original Far Cry is an interesting game, but I never could be bothered to finish it for some reason. I guess the mutant monkeys and the fact I kept trying to play on the highest difficulty while being bad at the game. Actually, I'm pretty sure I lost my saves too at some point...
Was technically...
With PC parts it's not complicated: every watt pulled from the wall is dumped as heat into your room. A PC is an expensive electric heater that can do fancy things while heating the room.
The 4090 is so much more efficient that I'm sure it's drawing way fewer watts than your 3090 Ti on average...
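To put rough numbers on the heater comparison (a quick sketch; the wattages below are made-up examples, not measurements of anyone's actual system):

```python
# Quick sketch: every watt a PC pulls from the wall ends up as heat
# in the room. 1 W of electrical draw = 1 W of heat (~3.412 BTU/h).

def heat_output(watts: float) -> str:
    btu_per_hour = watts * 3.412
    return f"{watts:.0f} W draw -> {watts:.0f} W of heat (~{btu_per_hour:.0f} BTU/h)"

print(heat_output(450))  # hypothetical average for a 3090 Ti class system
print(heat_output(300))  # hypothetical average for a more efficient 4090 build
```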
Nah, I haven't managed to reproduce it with anything except that game so far. Since LigTasm mentioned the shader generation of Hogwarts Legacy, I figure maybe Jedi Survivor, which I do own (thanks AMD), may exhibit the same behaviour, but I did not get around to installing that yet. But for science I...
That is beside the point. And your CPU is a different model, with the heat probably spread over a bigger surface area (2 CCDs, each with fewer cores than mine).
I can increase my fan speeds further or slap a custom loop on it but the situation I'm talking about in this thread will continue to...
Here's what happens in that game I mentioned:
Looking at Task Manager, it loaded 2 cores but one after the other, so it still seems totally single-threaded.
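If you want to watch the same thing outside Task Manager, here's a rough sketch of the idea with psutil (not what I actually ran, just an illustration):

```python
# Sample per-core load for ~30 s; a single-threaded load that hops between
# cores shows up as one busy core at a time. Requires: pip install psutil
import psutil

for _ in range(30):
    loads = psutil.cpu_percent(interval=1.0, percpu=True)
    busy = [i for i, pct in enumerate(loads) if pct > 80]
    print(f"cores over 80% load: {busy}")
```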
Look at those watts and the temps!
X3D CPUs are special and have a lot of limits (frequency, voltage, amps, temperature...) in place to avoid blowing up the fragile cache on them.
But here you can see the difference:
The single core load I'm talking about in this thread looks more like CR23... but with 35W and only one core...
Missed the point. I clearly explained that multi-threaded workloads are not running "hot". I just tried Linpack Xtreme since you mentioned it, and it's not even close to the scenario I mentioned in this thread: I can't even reach 80C with it! With slower fans.
And a single-threaded run is below 60C.
I did not say that this was a problem, or that it bothered me or caused any issue whatsoever. OK, sure, it gets too loud for my taste, but I can find my way around that; it's just a detail and totally subjective.
It is just surprising and I'm trying to understand and wondering how common it is and if...
Interesting, thanks. I don't have that game to test it myself. Doesn't shader compilation use the GPU as well though? So it would dump more heat in the case, maybe? But still, this behaviour seems crazy; on my 11900K some loading screens were hot and loud, but those were multi-threaded ones like BF...
Hello!
I accidentally stumbled upon an interesting X3D torture scenario: a single-threaded game loading screen that pushes temperatures similar to or higher than the CR23 multi-core test for me (which has until now been the hottest test for a 7800X3D).
What you need: the game Age of Empires 2 DE (I...
It's fine for that purpose, but you still need a bit of cooling power due to the extreme heat density, mind you. At identical wattage, an Intel CPU will be way quieter (because it's easier to cool).
The fan speed does not matter too much though, it needs a large heatsink and good application...
It was a nice review, I didn't actually know about this monitor until seeing this thread (I'm so happy with my OLED TV I don't check display news too often anymore).
I'm not in the market for such a product right now (size / resolution / panel type / colours) but I'm still happy to see 500Hz out...
I got a cheap Crucial 4TB QLC drive, and while it fits my current use case (storing Blu-ray rips in a practically fanless home server in my living room), I can confirm that write speed is abysmal. It drops down to less than HDD speeds, then goes back up a bit, then down again. It's about 80%...
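If anyone wants to see that behaviour on their own drive, here's a rough sketch of a sustained-write test (the path and sizes are placeholders; give it enough data to run past the SLC cache):

```python
# Write incompressible data in big chunks non-stop and print each chunk's
# throughput; once the SLC cache fills, the MB/s figure falls off a cliff.
import os
import time

CHUNK = 256 * 1024 * 1024            # 256 MiB per chunk
TOTAL_CHUNKS = 256                   # 64 GiB total, adjust to taste
buf = os.urandom(CHUNK)

with open("/mnt/qlc/testfile.bin", "wb") as f:   # placeholder path
    for i in range(TOTAL_CHUNKS):
        start = time.monotonic()
        f.write(buf)
        f.flush()
        os.fsync(f.fileno())         # force it onto the drive, not the page cache
        mb_s = (CHUNK / (1024 * 1024)) / (time.monotonic() - start)
        print(f"chunk {i}: {mb_s:.0f} MB/s")
```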
Yup, that's basically what I was saying. With the CPUs I have used, I never had enough physical cores for disabling HT to make any sense, because of the dramatic loss in several different games that I play, while I gained only a puny 0-5% by disabling HT in the other titles.
But starting with...
Same on my Gigabyte board. Cost me the same as my Rocket Lake board back in the day, but with literally less than half the features and ports.
But then I did not expect to need any of those features so that didn't slow me down. Of course with the current situation I actually miss them now, RIP.
Yea, I don't think it has much, if anything, to do with a "specific" model of chair or whatever they are implying. But it happens to be a very popular model; heck, it's exactly what I've been using since forever at home (without issue, and it feels perfect for my body).
I'm not sure why you made that swap though; depending on what you do, and even which games you play, the 7950X will be better due to its higher clocks. It's totally a side grade.
You can power limit Intel to around 125W for gaming and still retain 98% of the performance (productivity suffers more if heavily multi-threaded). It will be very easy to keep cool and quiet at that wattage, even on air.
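On Linux you can even do it from software through the powercap interface; here's a rough sketch (run as root; the zone path can vary per system, and on Windows the BIOS or XTU does the same job):

```python
# Set PL1 (the long-term power limit) to 125 W via the intel_rapl
# powercap driver. Values are in microwatts.
RAPL = "/sys/class/powercap/intel-rapl:0"   # package zone, may differ per system

with open(f"{RAPL}/constraint_0_power_limit_uw", "w") as f:
    f.write(str(125 * 1_000_000))

with open(f"{RAPL}/constraint_0_power_limit_uw") as f:
    print("PL1 is now", int(f.read()) // 1_000_000, "W")
```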
Yea, but for example the thermal shutdown point has been a basic setting for many years now, and I can't believe they would mess this up in 2023. It seems all motherboard makers are at fault though. And AMD's communication remains amazingly bad, and not only with us customers lol.