Any happy 4080 owners here?

Will a decent 850W be enough? Or is that too risky? A Corsair RM850x, for example.
Yes, it should be fine. The Corsair RM850x is a very good PSU for the price, but make sure it's the 2021 model, not the 2018 one. Any good-quality PSU will do; 850W is really the bare minimum for a 4090, but a 4080 should be perfectly fine with a 750W or even a 650W PSU as long as it's a quality unit, not a cheap 750W. I used a Corsair HX850 for many years and it worked fine with my old Vega 56, which is an over-300W-TDP card. Heck, the 4080 doesn't use much power compared to my Vega 56, so I could imagine it would work with a lower-wattage PSU like a 650W. Again, just make sure it's a really good quality PSU; never buy a cheap one. I plan to undervolt my 13900K and 4090 eventually... just need to find the time to do it. Haha.
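For anyone wondering where those wattage recommendations come from, here's a rough back-of-the-envelope check in Python. The wattages are ballpark spec figures I'm assuming (13900K PL2 limit, 4080 total graphics power), not measurements from anyone's system:

```python
# Rough PSU headroom estimate -- all figures below are assumed spec values,
# not measurements.
cpu_peak = 253   # i9-13900K PL2 power limit, watts
gpu_peak = 320   # RTX 4080 total graphics power, watts
rest     = 75    # motherboard, RAM, SSDs, fans, peripherals (rough guess)

combined_peak = cpu_peak + gpu_peak + rest   # ~650 W worst case
for psu_watts in (650, 750, 850):
    load = combined_peak / psu_watts
    print(f"{psu_watts} W PSU -> {load:.0%} load at combined peak")
```

Games rarely peg the CPU and GPU at their limits at the same time, which is why a quality 750W unit usually has comfortable headroom here and 850W leaves extra room for transient spikes.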
 
I've been tempted by LG's C-series OLED HDTVs for a while now, but I've been concerned about the burn-in risks for something that's mostly going to be used as a giant computer monitor. That's the only thing really holding me back from OLED at the moment.
Honestly, I've had it for a year now, and never could be arsed to not have desktop icons or hide my taskbar. I don't see any burn in whatsoever.
 
I bought my FE open box for $1,019 and I am very happy with it. It's pricey at $1,200, but not outrageous by any means, imho. Everyone is free to disagree. My issue is that a narcissist (watch Jensen's Taipei presentation this year for full confirmation) worth 26 BILLION USD is the sole decider of video cards for the working poor, and everyone else, full stop. Jensen would rather destroy the PC gaming market than lower prices 10 or 20% on cards, during the first few years of what is very clearly the start of a decade-plus worldwide financial depression, not a recession. His decisions also fully affect AMD's pricing decisions as well; that's clear as day.
 
Video cards are a luxury, not a right. You can always buy lower on the stack, or older cards used.

P.S. You might be a little conceited too if you'd built a net worth of $26 billion.
 

I agree, you might, but I still firmly believe the man is a narcissist, pure and simple. Conceit does not begin to come within a light-year of the behavior he has shown publicly in the last few years, especially as exhibited by Jensen at Computex 23. You're free to disagree. I am a longtime (2007) shareholder, but I strongly feel that of late Jensen is a detriment to the company's public face overall. He is extremely abrasive now. Still enjoying my 4080 FE.
 
LOL. He's taken Nvidia to the top of the heap and as a shareholder you think he's a detriment...cognitive dissonance at work.

Jensen is brilliant and he's why Nvidia is a trillion-dollar company.
 
I bought my FE open box for $1,019 and I am very happy with it. It's pricey at $1,200, but not outrageous by any means, imho. Everyone is free to disagree. My issue is that a narcissist (watch Jensen's Taipei presentation this year for full confirmation) worth 26 BILLION USD is the sole decider of video cards for the working poor, and everyone else, full stop. Jensen would rather destroy the PC gaming market than lower prices 10 or 20% on cards, during the first few years of what is very clearly the start of a decade-plus worldwide financial depression, not a recession. His decisions also fully affect AMD's pricing decisions as well; that's clear as day.
Nah. He's a greedy jerk, but AMD almost appears to be colluding with their own high prices. Also, their tech is up for debate. I've read some complaints that they are reusing previous-generation hardware (I assume they mean RDNA 2).

AMD could sell more cards if they'd lower the price. People want a competitor to Nvidia, but AMD is content to just sit in the weeds and sell the few cards they are selling at the higher prices.
 
Intel was working towards being a 3rd competitor but I wouldn't put it past them to do the same high pricing bullsh!t Nvidia and AMD are doing. Assuming they can get anything better than they have done out the door.
 
Is 4K worth it?

I would say no unless you have a very large screen you're sitting close to. Otherwise get a nice 1440p monitor and a decent video card, and you're good to roll for years versus having to chase gimmicks to get good frame rates. Plus, if you like ray tracing and such, it's much easier to get decent frames at a lower resolution. Under 30 inches I think you would be hard-pressed to notice any difference, at least from what I have seen.
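If you want to put rough numbers on the "under 30 inches" point, here's a quick pixel-density sketch in Python; the sizes and resolutions are just illustrative picks, not anyone's actual setup:

```python
# Pixel density (PPI) = diagonal resolution in pixels / diagonal size in inches
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

for label, w, h, d in [('27" 1440p', 2560, 1440, 27),
                       ('27" 4K',    3840, 2160, 27),
                       ('32" 4K',    3840, 2160, 32)]:
    print(f"{label}: {ppi(w, h, d):.0f} PPI")
# Roughly 109, 163, and 138 PPI -- at typical desk distances the step from
# ~109 to ~138 PPI is far harder to spot than the frame-rate cost of 4K.
```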
 
Intel was working towards being a 3rd competitor but I wouldn't put it past them to do the same high pricing bullsh!t Nvidia and AMD are doing. Assuming they can get anything better than they have done out the door.
You're probably right. They are 'third' right now, and I don't think their cards are priced all that competitively, are they? They're not really expensive because they are not in the ballpark of the flagship AMD or Nvidia cards. If they ever get there, are they going to be cheaper than AMD or Nvidia cards? I would not make a bet on that.
 
LOL. He's taken Nvidia to the top of the heap and as a shareholder you think he's a detriment...cognitive dissonance at work.

Jensen is brilliant and he's why Nvidia is a trillion-dollar company.
I agree, and I never said he didn't greatly help take the company to its heights, but NVDA has tens of thousands of brilliant engineers; Jensen is the face, little more at this point, and a very bombastic, tone-deaf one. He doesn't design the chips or make them. AI, I predict, is just another dot-com bubble about to happen. Humanity will sour on it very badly. It's driven heavily by greed and Malthusian multi-billionaires, who do not have our best interests at heart, but our worst.
 
O11D XL - that's the largest version of that case, too, isn't it? I currently have a Phanteks P500A - it's black and I really wish I had (saved up for) a white case - but the cases I like - G500A and Lancool III - also have a lot of room for large video cards. I think I'll have enough room/space. But I see how the amount of space the card takes up - 3/3.5 slots or whatever it is - can be a problem or annoyance.

I'm glad that the card is quiet and there's no coil whine, though. The Zotac cards are usually the cheapest here - I've even found some ads for 2nd hand Zotac Trinity and AMP cards. So, it makes it easier if they're decent cards since the prices would be way more 'affordable' - if you can describe any 4080 that way...haha.
Yeah, it's the XL for a reason, though it's actually smaller than my old Corsair Obsidian 800D in every dimension except width. Much of that is likely due to not having 5.25" bays, so they can cut down the depth while still having plenty of room inside.

If anything, I'm kinda used to stonking huge full-towers, be it the aforementioned 800D, the totally-not-a-Lian-Li-PC-V2000 ABS Tagan Black Pearl that it replaced, or some of those truly tall monstrosities you'd find back in the AT form factor era when clock speed LCDs and turbo buttons were still expected.

Actually, MicroLED burns in too ;). All self-emissive techs do.
I suppose on a long enough timeline, burn-in is going to happen, but I should point out that CRT monitors from the FD Trinitron/Diamondtron NF era often didn't leave me staring at burned-in UI elements, even used ones.

If OLED and/or MicroLED can accomplish that for a solid decade, I'll be happy.

Honestly, I've had it for a year now, and never could be arsed to not have desktop icons or hide my taskbar. I don't see any burn in whatsoever.
And that's exactly what I'm looking for right there, enough peace of mind that I'm not seeing left-over taskbar or icons even when used extensively as a PC monitor, though I'd still have the sense to turn the ol' screen saver back on again.
 
Oh, I definitely have a screen saver come on after I think it's like 10 minutes of inactivity, and then the screen shuts off if it's still inactive.

But yeah, I like having desktop icons, and hiding the taskbar is too much of a change from my normal habits; it felt like adding an extra step and I didn't like that, so it stays. I'll have to see if there is some sort of test to check for burn-in, but just in my day-to-day use of the PC - gaming, streaming movies/TV - I don't see any burn-in at all.

Really my only complaint with the LG C1 is the auto-dimming when you are on, say, a white Word doc or web page and it dims the screen. This can apparently be solved with a service remote by changing that setting, so I need to look into that.
 
It's one of the few cards that has decent RGB on the side of the shroud. I always have a lighting theme on, either Twisted or Marquee.
Haven't even tried that yet :) Just been testing out some games. Cyberpunk 2077 maxed out with DLSS 3 on Quality is ridiculously beautiful, even at 1440p.

*edit* Marquee is excellent :)
 

I enjoy my 4080 but I'm looking farther into the future and excited for the 6080 series so I can upgrade to a 5080. Just got to wait another 2 or 3 years.
 
Actually, MicroLED burns in too ;). All self-emissive techs do.
Yes, it does. But MicroLED tech uses inorganic materials, which means in theory it will have a much longer lifespan than the equivalent organic LED tech.

It's still far off from mainstream consumer use, though. The Samsung Wall, which uses MicroLEDs, retails for a cool half-MILLION dollars. So if you're a Saudi oil prince, by all means buy it now.

The rest of us will use the available OLED tech which still looks gorgeous and is relatively affordable today.
 
Honestly, I've had it for a year now, and never could be arsed to not have desktop icons or hide my taskbar. I don't see any burn in whatsoever.
I use a free program called AutoHideDesktopIcons.

I can set a timer for how long the icons and taskbar appear. All I do is click anywhere on the desktop and the icons appear, then they auto-hide again after the set timer goes off.

I set it to 5 seconds, but if you need more time to hunt for your icons you can easily increase that timer to any amount you please.

What I really don't like is how Windows doesn't fully hide the white line of the taskbar when it auto-hides, and this app gets rid of that for me in addition to the icon-hiding function.
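Not the same thing as that app, but if anyone wants to script just the taskbar auto-hide part themselves, here's a minimal Python/ctypes sketch against the Win32 AppBar API. It's an untested sketch of the general approach, not how AutoHideDesktopIcons actually works:

```python
# Toggle the Windows taskbar auto-hide state via SHAppBarMessage (Win32).
# Sketch only: assumes Windows and Python with ctypes.
import ctypes
from ctypes import wintypes

ABM_SETSTATE = 0x0000000A   # AppBar message: set taskbar state
ABS_AUTOHIDE = 0x00000001   # state flag: auto-hide on

class APPBARDATA(ctypes.Structure):
    _fields_ = [("cbSize", wintypes.DWORD),
                ("hWnd", wintypes.HWND),
                ("uCallbackMessage", wintypes.UINT),
                ("uEdge", wintypes.UINT),
                ("rc", wintypes.RECT),
                ("lParam", wintypes.LPARAM)]

abd = APPBARDATA()
abd.cbSize = ctypes.sizeof(APPBARDATA)
abd.lParam = ABS_AUTOHIDE                     # use 0 to turn auto-hide back off
ctypes.windll.shell32.SHAppBarMessage(ABM_SETSTATE, ctypes.byref(abd))
```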
 
I really don't like extra steps to get to my taskbar, whether that's a click or waiting a second for it to pop up.
 

With the program that I mentioned, the taskbar pops up instantly when I move my mouse pointer over the area where it should be. Like I said, it's one of the better solutions out there, and it's free.

It's more of a peace of mind thing for OLED owners who don't relish the idea of seeing images of their program icons burnt into their expensive OLED display.
 
Interesting. Will take a look, thanks.
 
The fans on my MSI 4080 Trio seem to run most of the time while sitting at the desktop.

GPU temp: 34°C
Hot spot: 42°C
Memory: 36°C

The board power draw is 52 watts with two monitors; it was 25 watts with my 3070.
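If you want to log those numbers over time instead of eyeballing a monitoring tool, here's a small Python sketch using the pynvml bindings (pip install nvidia-ml-py); it assumes a single NVIDIA GPU at index 0 and isn't tied to any particular card:

```python
# Poll GPU temperature, power draw, and fan speed once per second via NVML.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first (only) GPU
try:
    while True:
        temp  = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        power = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # reported in milliwatts
        fan   = pynvml.nvmlDeviceGetFanSpeed(gpu)              # percent of max
        print(f"{temp} C  {power:.1f} W  fan {fan}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

As far as I know, hot-spot and memory temperatures aren't exposed through NVML, so those still need something like HWiNFO or GPU-Z.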
 
Jensen is good at promoting shareholders’ interests and propelling the company forward (as his job entails) but bad as a spokesperson speaking to consumers and customers. Let someone else do that, JH.
 
Nah. He's a greedy jerk, but AMD almost appears to be colluding with their own high prices. Also, their tech is up for debate. I've read some complaints that they are reusing previous-generation hardware (I assume they mean RDNA 2).

AMD could sell more cards if they'd lower the price. People want a competitor to Nvidia, but AMD is content to just sit in the weeds and sell the few cards they are selling at the higher prices.

In the end it's a business. AMD has basically thrown up their hands and said it's too expensive for us to go for the performance crown, so instead let's focus on increasing our profit margins. That means target the console market. That means only compete with Nvidia's second-tier video card while bumping up prices to the point where they are only a little cheaper than Nvidia.

The ripple effect of Nvidia pricing means it would be stupid to sell your AMD video cards for any price that is too low because that only cuts into your own profits. What does market share matter when you dominate the console market and you're selling PC GPUs just to keep a nominal footprint in that market?

This is sad but it's the result of Nvidia's dominance and high pricing strategy.

So far Intel is only going into the low-end to middle tier of GPU performance with their Arc cards. Will they ever produce a monster to compete with the likes of the upcoming Nvidia 5000 series? Doubtful, but I would welcome a new contender for the crown.
 
I'm pissed that the 4080 Super is supposedly going to cost $999, when my 4080 cost $1,200. I have a PNY 4080. I'm running a +200 core and +1000 memory OC and it's stable. MW3 plays awesome with DLAA at 4K (max settings) and I get 120 fps on my C2. Looks and plays gorgeous.

I am happy with my 4080.
 
I do love my 4080, but yes, we always knew the price was BS and it was never going to get better. At least my 4080 will last quite a long time given the VRAM amount, feature set, etc. I also generally enjoy how the thing runs so cool compared to the 3080 Ti it replaced. I expect that 3080 Ti to eventually just die from how hot it runs.
 
I'm pissed that the 4080 Super is supposedly going to cost $999, when my 4080 cost $1,200. I have a PNY 4080. I'm running a +200 core and +1000 memory OC and it's stable. MW3 plays awesome with DLAA at 4K (max settings) and I get 120 fps on my C2. Looks and plays gorgeous.

I am happy with my 4080.
Imagine how pissed I am that I bought a 4080 for $1,500* just days before the Super rumor surfaced. Although I don't believe the $999 price for a minute, even if it ends up only slightly cheaper or faster, it would have been worth the wait.
Otherwise I'm happy with the performance I got; I can finally play Cyberpunk 2077 with no compromises, and if the card lasts as long as my 2080 Ti before it (4 years), then I'll have no regrets.

*EU + taxes
 


$999 is likely accurate, but that will be for the short-supply non-OC models that will be very tough to get your hands on. AIBs want the margins on OC models.
 
I'm pissed that the 4080 Super is supposedly going to cost $999, when my 4080 cost $1,200. I have a PNY 4080. I'm running a +200 core and +1000 memory OC and it's stable. MW3 plays awesome with DLAA at 4K (max settings) and I get 120 fps on my C2. Looks and plays gorgeous.

I am happy with my 4080.

You could be my buddy, who paid almost $1,000 for a 3070 during the shortage. It doubled the price of the system I built him at the time.
 
In the end it's a business. AMD has basically thrown up their hands and said it's too expensive for us to go for the performance crown, so instead let's focus on increasing our profit margins. That means target the console market. That means only compete with Nvidia's second-tier video card while bumping up prices to the point where they are only a little cheaper than Nvidia.

The ripple effect of Nvidia pricing means it would be stupid to sell your AMD video cards for any price that is too low because that only cuts into your own profits. What does market share matter when you dominate the console market and you're selling PC GPUs just to keep a nominal footprint in that market?

This is sad but it's the result of Nvidia's dominance and high pricing strategy.

So far Intel is only going into the low-end to middle tier of GPU performance with their Arc cards. Will they ever produce a monster to compete with the likes of the upcoming Nvidia 5000 series? Doubtful, but I would welcome a new contender for the crown.
Intel has the money and budget - but I dunno about the tech. They were an integrated-graphics expert before they started making their Arc cards, as you know, and I believe it will take them a while to catch up - if 'catching up' means being an option for gamers and content creation, either one or both.

AMD could reduce the price and try to sell by volume - there are a lot of people on forums and sites saying they'd buy a GPU if it 'was just a bit cheaper' and that they prefer AMD because of the 'evil Nvidia monster', and it's a good reason, imho - but, like you said, AMD probably has the mentality that each card sold at a lower price eats into profits, and I guess they calculated that the extra volume wouldn't be enough to offset that.

They also dominate console sales - but, from what I've been told, the GPU hardware isn't quite 'comparable' to PC GPU hardware; it's 'equivalent' to certain PC GPU hardware. I suppose buyers of consoles often get PC hardware as well, but I am not a console user - although I have friends who have them. But I know they are pretty expensive, and I assume sales are sufficient for AMD?
 
I have had the RTX 4080 FE since December 2022 and totally love it; it handles all the games I play at ultra maxed settings. I play at 3440x1440. My monitor is a 240Hz display, and I can't get current games to run that fast on average, except older titles like Doom Eternal.

I will keep my eyes open for a 5080 this fall.
 
I'm torn between a 4080 or just waiting for the Supers. I'd like to get a 4090, but the prices are just insane.
 

At this point I'd just wait for the much-rumored RTX 5xxx launch coming in September.

Rumored performance puts the 5090 at 200% faster than the 4090.

Buying a 4090 after it's been out 16+ months would not be wise, unless you buy it now and sell it later to help pay for the 5090; that works.

I have the 4080 I bought a year ago, and it runs all my games at max ultra settings with high frame rates at 3440x1440.

Isn't the performance difference between a 4090 and a 4080 only like 30% to 40%?
 
Mmm, you make a fair point. I've been trying to get a 4090 from scalpers, but most of them won't budge. September isn't that far away, tbh. I could just wait, assuming the 50s do launch in September.
 

The RTX-2080 launched in September 2018, the RTX-3090 launched in September 2020, and the RTX-4090 launched in September 2022. So I would predict a September 2024 release for the 5090.

Rumored performance for the 5090 is crazy, supposed to be a big leap over the 4090.
 