RTX 3xxx performance speculation

It definitely is a space heater now. Before, with my Strix 2080, the glass was cool to the touch when gaming, and I grazed my leg against it plenty of times. Now with the RX 580 I actively make sure my legs don't touch that same window, in the same case that's been there since 2017 (the first time I grazed it with the 580 in there I panicked and took the GPU cooler off to regrease... no improvement). While it's not scientific, cool to the touch vs. feeling like you're being trained not to touch it is not something I need numbers for.

Oh, there's no OC on this thing at all; I'm not willing to bring more heat to the card. Card progression: Asus 1080 Ti Turbo converted to a G12 with a 240 AIO, Zotac 2080 Ti in Oct of '18, and finally the Asus 2080 Strix since Nov of '18.

Rubbish. Not a hope in hell that the 580 is heating up the air in your case enough to get the glass panel in the side that hot.

But let's say, for argument's sake, that this actually happened. Then the cooling inside your case is useless. In fact, you must have no cooling and all the vents blocked up.

What RX 580 was it?
 
Rubbish. Not a hope in hell that the 580 is heating up the air in your case enough to get the glass panel in the side that hot.

But let's say, for argument's sake, that this actually happened. Then the cooling inside your case is useless. In fact, you must have no cooling and all the vents blocked up.

What RX 580 was it?
It is, not was, a Nitro RX 580, and my case has sufficient cooling. The hot spot is literally the area around the 580: touch the glass over the 8700 and it's cooler than the section over the card. Swap the card and there's no more hot spot in the glass. Temps for the card vary between 78 and 80C. Do note that when I game and see these temps, those are 2-3 hour sessions, when I can find the time.
 
MSFS 2020 is a heap of trash: mesh issues, the orthos are VERY inconsistent outside major cities, and the flight model is trash. You're better off with X-Plane.
oh AND it's single-threaded because it's still based on FSX lol
This may be the dumbest post I've seen in recent history. I suggest you go drink some more apple juice from your sippy cup. Once you've done that, come post in the MSFS2020 thread.
 
It is, not was, a Nitro RX 580, and my case has sufficient cooling. The hot spot is literally the area around the 580: touch the glass over the 8700 and it's cooler than the section over the card. Swap the card and there's no more hot spot in the glass. Temps for the card vary between 78 and 80C. Do note that when I game and see these temps, those are 2-3 hour sessions, when I can find the time.

So your case cooling sucks basically.
 
This may be the dumbest post I've seen in recent history. I suggest you go drink some more apple juice from your sippy cup. Once you've done that, come post in the MSFS2020 thread.
Let's start with the TBM's backward torque curve... and the fact that the FMS blows up if you even look at it wrong...
 
I hope it's lower. I'm good with the 1080 Ti performance but I need it in a smaller package.

I doubt it's going to have worse performance per watt. So if you want 1080 Ti performance, I'm quite sure you'll get it using far less than 250 watts.

Now if you want the successor to, say, the 2080 Ti, then yeah, I can see it potentially having higher power consumption, but you're also getting a ton more performance based on the rumors currently circulating.
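A rough way to put numbers on the 1080 Ti point above; the perf-per-watt uplift below is purely an assumed figure for illustration, not anything leaked:

```python
# Toy perf-per-watt math; the 40% uplift is an assumption, not a leaked spec.
baseline_perf = 100.0      # call 1080 Ti performance "100"
baseline_power_w = 250.0   # roughly 1080 Ti board power

assumed_ppw_uplift = 1.40  # hypothetical: new card does 40% more work per watt

# Power needed to match 1080 Ti performance at that efficiency:
needed_power_w = baseline_power_w / assumed_ppw_uplift
print(f"~{needed_power_w:.0f} W for 1080 Ti-class performance")  # ~179 W
```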
 
I always liked all-aluminum cases or Silverstone-type cases because they are always cool to the touch and they act as a sort of giant heat sink. Inverted-motherboard cases are great for warm cards, where they vent out the top. Bring it!
 
Anecdotal evidence with a sample size of 1 isn't saying much. Got anything more credible?
it's still the FSX flight model at the core, so yeah
The flight model and frame rate are locked together. The flight modeling runs on a single thread, so you end up CPU-limited in how high you can push the frame rate, since you have to calculate the flight physics each frame.
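A back-of-the-envelope sketch of that cap, with made-up millisecond numbers (nothing measured from MSFS):

```python
# Toy numbers only: if flight physics runs on the same thread as the rest of
# the frame, frame time is physics time plus CPU-side render work, so a faster
# GPU can't push fps past what that one thread allows.

def max_fps_single_thread(physics_ms: float, render_ms: float) -> float:
    frame_ms = physics_ms + render_ms      # work is serialized on one thread
    return 1000.0 / frame_ms

def max_fps_parallel(physics_ms: float, render_ms: float) -> float:
    frame_ms = max(physics_ms, render_ms)  # hypothetical: physics on its own thread
    return 1000.0 / frame_ms

# e.g. 12 ms of flight physics + 4 ms of CPU-side render work per frame:
print(max_fps_single_thread(12.0, 4.0))    # ~62.5 fps, no matter how fast the GPU is
print(max_fps_parallel(12.0, 4.0))         # ~83.3 fps if the two could overlap
```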
 
it's still the FSX flight model at the core, so yeah
The flight model and frame rate are locked together. The flight modeling runs on a single thread, so you end up CPU-limited in how high you can push the frame rate, since you have to calculate the flight physics each frame.

Up until the most recent COD, it was still “based” on the Quake 3 engine, but it had no problems scaling through the years.
 
How many others want to buy the new Nvidia flagship day one but are just oh so curious as to what AMD is actually gonna get performance-wise? If Nvidia is actually gonna launch in September, and with AMD most likely not launching till the new Ryzen chips are ready closer to the end of the year, surely someone is going to let the big Navi numbers slip out just to make that decision a little muddier.
 
How many others want to buy the new Nvidia flagship day one but are just oh so curious as to what AMD is actually gonna get performance-wise? If Nvidia is actually gonna launch in September, and with AMD most likely not launching till the new Ryzen chips are ready closer to the end of the year, surely someone is going to let the big Navi numbers slip out just to make that decision a little muddier.

I'm curious, but I'm not playing the "wait for x" game. It's time for me to buy and whoever has the product that fits the bill at the time gets my money.
 
How many others want to buy the new Nvidia flagship day one but are just oh so curious as to what AMD is actually gonna get performance-wise? If Nvidia is actually gonna launch in September, and with AMD most likely not launching till the new Ryzen chips are ready closer to the end of the year, surely someone is going to let the big Navi numbers slip out just to make that decision a little muddier.

Nah. While I've got no problem switching between CPUs based on whatever's more bang for the buck when I'm buying, Nvidia has me happily (on my end and theirs) by the balls with their software suite (DLSS, GameStream, etc.).
 
Anyone have a 120Hz OLED panel paired up with a 2080 Ti right now? I'm thinking about getting one but I'm concerned with burn-in since it would be primarily for video games. If I upgraded to the new flagship, I'd really want to pair it with a new TV that could take advantage of the power.
 
Just replaced our LG C7 because of burn-in. My wife had a 2080 on it; she mainly plays Destiny 2 and The Division 2. We went with a Sony LED because she doesn't want to think about burn-in again or cut back her play time. I don't know if the newer ones have fixed it, but we also play longer sessions (2-3 hrs) a couple of times a weekend. That TV also doesn't get much use during the week and was only used for video games when it was on.
 
Just replaced our LG C7 because of burn-in. My wife had a 2080 on it; she mainly plays Destiny 2 and The Division 2. We went with a Sony LED because she doesn't want to think about burn-in again or cut back her play time. I don't know if the newer ones have fixed it, but we also play longer sessions (2-3 hrs) a couple of times a weekend. That TV also doesn't get much use during the week and was only used for video games when it was on.

Yeah, from what I'm hearing burn-in is a serious issue that, with current tech, has not been eliminated from large TVs, even this year's models. It's strange, because many flagship phones have OLED displays in one form or another and don't suffer from what would logically cause extreme burn-in. Perhaps I don't know enough about the details, but I've been fawning over a nice OLED for gaming for about 10 years now, and with them supporting 4K 120Hz and being affordable in the bigger sizes, I cringe at the thought of burn-in on such a gem. I have a decade-old Panasonic plasma that is still my favorite TV to watch movies on, which I keep in my room despite having a larger 4K panel in the living room that I rarely use anymore. I'll be straight though: I really just want the OLED for gaming, and anything else would be a distant second.

I'll have to wait and see if the new cards will be able to sustain 100 fps close to maxed out at 4K on modern next-gen titles. If so, I might just roll the dice... maybe a 43-inch would be a good stepping stone for OLED gaming.
 
Yeah, from what I'm hearing burn-in is a serious issue that, with current tech, has not been eliminated from large TVs, even this year's models. It's strange, because many flagship phones have OLED displays in one form or another and don't suffer from what would logically cause extreme burn-in.

Burn-in on OLED phones is about the same as burn-in on OLED TVs. Lots of examples of each.

If you're using it as a TV and not a dedicated monitor, burn-in likely won't be an issue with an OLED TV.
 
There's a thread about the LG 48CX in the displays section at the moment. Seems like the consensus on OLED is to keep it at ~25-30% brightness if using it as a monitor. The ones with older OLED screens that have suffered from burn-in tend to be the ones that kept their brightness levels way too high.
 
There's a thread about the LG 48CX in the displays section at the moment. Seems like the consensus on OLED is to keep it at ~25-30% brightness if using it as a monitor. The ones with older OLED screens that have suffered from burn-in tend to be the ones that kept their brightness levels way too high.

The biggest factor is having the same content on the screen. If you are watching a variety of TV, Movies, and playing a variety of games, then it really isn't an issue.

If you use it only for games, and you only play two games, and they have a HUD or other constant UI elements, then it will be a problem.
 
I think one of the biggest performance questions I have is whether the 3080 Ti will be a cut-down GA100 (or 101), or whether it shares a GA102 die with the 3080 and simply has more cores enabled. That, and how close the releases of the two will be to each other; usually they are staggered to maximize sales of each product.
 
Oh boy! The plot thickens...
[image attachment]
 
How many others want to buy the new Nvidia flagship day one but are just oh so curious as to what AMD is actually gonna get performance-wise? If Nvidia is actually gonna launch in September, and with AMD most likely not launching till the new Ryzen chips are ready closer to the end of the year, surely someone is going to let the big Navi numbers slip out just to make that decision a little muddier.

I'd never play the "wait for x" game with AMD. Got burned by that already at least once.

Plus they are completely unproven (in games) with new tech like RT. Not buying a high end card on promises.
 
You can build a decent PC for $500.00; it's no longer a luxury. Just buy parts that are a few years old on the cheap and bingo. Use 1080p instead of 4K and it's a cheap hobby. Free games off Epic every week.

Being bombarded by Google ads is the hard part, along with social media influencers that no longer even care about PC parts, even if they are new.
 
You can build a decent PC for $500.00; it's no longer a luxury. Just buy parts that are a few years old on the cheap and bingo. Use 1080p instead of 4K and it's a cheap hobby. Free games off Epic every week.

Not one I would game on...I have moved past console graphics, thank you very much...
 
You can build a decent PC for $500.00; it's no longer a luxury. Just buy parts that are a few years old on the cheap and bingo. Use 1080p instead of 4K and it's a cheap hobby. Free games off Epic every week.

Being bombarded by Google ads is the hard part, along with social media influencers that no longer even care about PC parts, even if they are new.

Not only that, you can upgrade just the GPU and give it new life.

Even “high end” is a cheap hobby if done over a few years. It’s pretty hard to find hobbies cheaper tbh.
 
So August 31st is the official paper launch, but September 17th is when cards will be available?... I don't want a Founders Edition and will most likely wait for the 3rd-party cards with better cooling.
 