Yep, also I would think their 3080ti/3090 would have more than 12GB of VRAM.
Yeah, the above needs a truckload of salt. Someone just grabbed some of the Twitter "leaks", printed up some specs on a laser printer, then took a picture.
Well, while some people do, others don't... just as you feel it's not worth golf course access, season tickets, guns, ammo, w/e, he doesn't think it's worth $1200 for a GPU. He's entitled to his opinion just as you are. It doesn't make him wrong or you wrong.
Some people don't spend $10k on upgrades for a $50k car and instead buy cheap cars to work on (my daily driver cost me $250 and another $60 to get on the road) and then spend a few dollars here and there to make them faster. Which person is the enthusiast? The one that paid someone else to put stuff on their car, or the one that tore into their own car? It's not just about spending money, it's about getting whatever you can out of what you have.
Some people have more discretionary funds than others and different priorities. This used to be the place to buy a Celeron and almost double your frequency... when did it turn into "buy the most expensive stuff or you're not [H]"? If you can only afford to buy third-hand used computer parts and build it yourself, who cares?
For me, it's not about having the funds, or priorities. It's more about a consumer ethos: I won't pay for a product whose price I think has been artificially jacked up through measures that are insidious and go well past any reasonable free-market argument.
Further, accepting that a $1200 top-end card is reasonable is effectively saying you're OK with the smoke-and-mirrors upward pricing scheme nV has been pushing for the past 5-6 years. nV is in fact banking on people tacitly accepting that it can charge 30-40% more than in the past. I'm not one of those people.
Being that I have a 2080ti that can run anything I throw at it with disgusting gobs of performance, I just don't personally see the need to get a 3k series. I will wait for the 4k series. Or let's see what Big Navi can do, but it probably won't exceed 30% more under any metric.
People like you
Finding excuses for nearly doubling the price of the top-end card while giving half the usual performance boost.
Get out of here.
Take with a grain of salt....
https://wccftech.com/rumor-alleged-...-up-to-23-tflops-of-peak-graphics-horsepower/
For sure. I wouldn't even have posted it if it wasn't already a "rumor" thread.
It's WCCFTech. I read the site, but their rumor articles should be taken with a salt shaker at minimum, escalating up through sacks of rock salt and salt-spreader trucks for GPU leaks.
30% in what though?
Compute? Non RT? RT?
Being that I have a 2080ti that can run anything I throw at it with disgusting gobs of performance, I just don't personally see the need to get a 3k series. I will wait for the 4k series. Or let's see what Big Navi can do, but it probably won't exceed 30% more under any metric.
There's plenty of cases just for enthusiasts, where guaranteeing a speed is a benefit worth paying for.
There's always a market for the uber-rich wanting something the dirty masses can't have, and I'm sure binning existing hardware will always make that niche market profitable regardless of its size.
Take with a grain of salt....
https://wccftech.com/rumor-alleged-...-up-to-23-tflops-of-peak-graphics-horsepower/
I assume you are running less than 4K? It seems the top end will be the realm of getting 4K speeds ultimately up to 144Hz. I don't know; I just think AMD is going to be much faster than the 2080 Ti this generation at 4K and possibly close to the 3090. We will see soon, hopefully.
3440x1440 which is hard on a GPU
Harder than the 2560 version of 1440, but still 67% less than 3840x2160, so not really that hard comparatively. But it still needs a solid video card.
Yeah, well 4K is nice but I feel like it's not worth it in the end, given how much you spend on your machine to barely get 60 fps.
I've done all sort of setups, including a 7680x1440 Surround setup and a 55" 4K TV on my desk, and I'm on 1080p ultrawide now (2560x1080).
It sounds crazy, but honestly 1080p still looks nice. Given that you can run ultra settings and get a really high, smooth refresh rate (I'm on 166 Hz), I feel it is a good trade-off.
Also good for ray tracing. I was able to play Control with RT on high and the other settings on medium at 90 FPS; very nice. No way that would be possible at 4K.
I'm exactly the opposite. I had an ASUS 165Hz 1080p 24" (sold it) and my Predator X34 (100Hz 34" 21:9 3440x1440); after going ultrawide I could not go back to a 16:9 monitor or go back to 1080p after experiencing the increased clarity of 1440p. Not to mention, over 100Hz I have a very hard time actually noticing the increased refresh rate. But to each their own, everyone has their own preference.
Like I have said in my previous post (not speaking for everyone, just me), there is no real benefit to 4k gaming. Movie watching, yes, but not gaming. I have an IPS Predator X34 as my left display and a Predator 1080p 240hz, and I game far, far more on the 240hz at 1080p than on the big widescreen. I have a 2080ti, so it's not the GPU holding me back. It's the refresh rate and smoothness of this 240hz panel. That is what makes gaming so fun to me. Smoothness equates to more immersion. 60hz is punishing to look at now, and I don't know how in the hell people can do that on their 4k screens, most of them barely hitting 35 fps on average.
But to keep this on topic in this particular reply, I sincerely hope that the 3000 series nV cards can shred the 2080ti in every metric, because that's good for the market.
I'm exactly the opposite. I had an ASUS 165Hz 1080p 24" (sold it) and my Predator X34 (100Hz 34" 21:9 3440x1440); after going ultrawide I could not go back to a 16:9 monitor or go back to 1080p after experiencing the increased clarity of 1440p. Not to mention, over 100Hz I have a very hard time actually noticing the increased refresh rate. But to each their own, everyone has their own preference.
3440*1440 = 4953600 pixels
3840*2160 = 8294400 pixels
4953600/8294400 = 0.597, so about 60%. I'm not being nitpicky, but if we're going to use numbers, let's be accurate.
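For anyone who wants to double-check, here's a quick Python sketch of the same arithmetic (nothing assumed beyond the two resolutions quoted above):

# Pixel counts for the two resolutions being compared
ultrawide = 3440 * 1440   # 4,953,600 pixels
uhd_4k    = 3840 * 2160   # 8,294,400 pixels

ratio = ultrawide / uhd_4k
print(f"3440x1440 is {ratio:.1%} of 4K")              # ~59.7%
print(f"3440x1440 has {1 - ratio:.0%} fewer pixels")  # ~40% fewer
print(f"4K has {1 / ratio - 1:.0%} more pixels")      # ~67% more

The 67% figure only works in the other direction: 4K has about 67% more pixels than 3440x1440, while 3440x1440 has about 40% fewer pixels than 4K.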
There is no real benefit to 4k, to be honest. Unless you're watching some 4k super hi-fi Dolby Atmos movie, 1080p on a good display at about 3 feet away looks stellar for gaming.
Now here's where everyone is going to argue, but whatever. I have tried every resolution under 5k, excluding 5k, and I find that the most enjoyable resolution overall is 1440p, but 1080p with good antialiasing is outstanding.
In fact I still have my 1080p Panasonic VT25 50" plasma that is full-on Pioneer Kuro Elite tech, and only recently is OLED surpassing the color reproduction of a top-end plasma panel. I had my plasma professionally calibrated a few years ago as well, and the person calibrating it was like, "fuck me, this thing looks good".
Thus, if you're blowing cash on ultra 4k HDR G-Sync $3000 monitors and a potential $2k GPU, you really are just being a neurotic person at that point. I mean, Far Cry 5 can only look so good. Or whatever the title is.
If I had my way I'd rather just rock a big-ass Sony Trinitron 1080p Wega CRT, but those are unobtanium.
Even at 3440x1440 I have stellar detail and can run that at 100 - 130 fps depending on the title. Your 4k is lucky to hit 60. So you are trading an absolute shit ton of performance for something that doesn't even look much better to the eye.
For the new gen of GPUs, sure, we might see them hit 90 fps at 4k, but yours and others' dreams of a steady 144 fps at 4k are so funny you can almost hear me laughing through the internet. There is just no way a 3080 or 3080ti is going to have a 40 to 50% increase in rendering performance in one single generation. Thus, do not hold your breath. But maybe, just maybe, nvidia found a way to pack 30 to 40% more transistors into 7nm. We might just see that big of a jump. And I am saying this only because we've seen AMD do it with their Ryzen stuff.
3440*1440 = 4953600 pixels
3840*2160 = 8294400 pixels
4953600/8294400 = 0.597, so about 60%. I'm not being nitpicky, but if we're going to use numbers, let's be accurate.
There is no real benefit to 4k, to be honest. Unless you're watching some 4k super hi-fi Dolby Atmos movie, 1080p on a good display at about 3 feet away looks stellar for gaming.
Now here's where everyone is going to argue, but whatever. I have tried every resolution under 5k, excluding 5k, and I find that the most enjoyable resolution overall is 1440p, but 1080p with good antialiasing is outstanding.
In fact I still have my 1080p Panasonic VT25 50" plasma that is full-on Pioneer Kuro Elite tech, and only recently is OLED surpassing the color reproduction of a top-end plasma panel. I had my plasma professionally calibrated a few years ago as well, and the person calibrating it was like, "fuck me, this thing looks good".
Thus, if you're blowing cash on ultra 4k HDR G-Sync $3000 monitors and a potential $2k GPU, you really are just being a neurotic person at that point. I mean, Far Cry 5 can only look so good. Or whatever the title is.
If I had my way I'd rather just rock a big-ass Sony Trinitron 1080p Wega CRT, but those are unobtanium.
Even at 3440x1440 I have stellar detail and can run that at 100 - 130 fps depending on the title. Your 4k is lucky to hit 60. So you are trading an absolute shit ton of performance for something that doesn't even look much better to the eye.
For the new gen of GPUs, sure, we might see them hit 90 fps at 4k, but yours and others' dreams of a steady 144 fps at 4k are so funny you can almost hear me laughing through the internet. There is just no way a 3080 or 3080ti is going to have a 40 to 50% increase in rendering performance in one single generation. Thus, do not hold your breath. But maybe, just maybe, nvidia found a way to pack 30 to 40% more transistors into 7nm. We might just see that big of a jump. And I am saying this only because we've seen AMD do it with their Ryzen stuff.
I disagree with everything you wrote there.
I was constantly annoyed at how bad 1440p/120hz looked compared to 4k/60hz on my previous Samsung TV, 2019 model.
Now I have a 43" 4k/120hz monitor and a Valve Index, and I need a top-end video card.
You can stay at 1080p and enjoy it, but saying that there's no benefit to 4k is ridiculous at best.
Yeah, the above needs a truckload of salt. Someone just grabbed some of the Twitter "leaks", printed up some specs on a laser printer, then took a picture.
The phrase is a grain or pinch of salt. A truckload of salt would be taking it very seriously.
Since I've never heard of a Samsung 1440P TV, that sounds more like an issue of running at non-native resolution. BTW, I actually wish there was a nice 1440P TV for computer/console gaming in the living room.
I am firmly in the camp of staying 1440P or lower for a PC desktop monitor and running faster and/or with higher settings, than merely throwing more pixels at it.
Since I've never heard of a Samsung 1440P TV, that sounds more like an issue of running at non-native resolution. BTW, I actually wish there was a nice 1440P TV for computer/console gaming in the living room.
I am firmly in the camp of staying 1440P or lower for a PC desktop monitor and running faster and/or with higher settings, than merely throwing more pixels at it.
Exactly, his issue was most likely due to running non-native. Even then the reports of 1440p looking terrible on a 4K TV are just hype in my experience. I run 1440p, 1620p and 1800p on a 55" OLED and they all look very good from 8ft away. Native 4K is best of course but certainly not noticeable enough to warrant the massive performance hit.
I have the Q7F. It's a great kit but the 1440p mode leaves something to be desired.
I'd rather look at 1080p native than 1440p upscaled.
Also, 1440p on Q7F is native resolution.
Definitely not. It's a 4K screen.
Yeah I'm on 1440p (16:9) and honestly the only thing I want is a nice rasterized performance bump so I can turn on TAA and use the sharpening filter in some games while still maintaining a frame rate north of 100 fps. Or honestly even if more devs adopted DLSS 2.0 that would boost me up with my current card enough.
Also, 1440p on Q7F is native resolution.
https://www.rtings.com/tv/reviews/samsung/q7fn-q7-q7f-qled-2018
It's not excuses; it's an understanding of how markets and life in a capitalist society work. If $1200 was too much, then NV wouldn't be able to sell any cards and they'd have to reduce their prices, but obviously there are enough people who do think it's worth it. Because of that, NV may even look to raise prices further in the future. Will I like it if they do? Absolutely not, but I'm sure there are still plenty of people out there with the disposable income who won't even flinch at spending $2k on the best of the best, and there will still be cards that cost what I'm willing to pay.
According to the site you linked, it says the TV is 4K and mentions nothing about a "native resolution" of 1440P.
Well... unless you're referring to "native support" under supported resolutions... but that's not the panel's native resolution. From what I can gather from the rtings site, "native support" means it supports that resolution and refresh rate without having to force the TV into that mode with a custom resolution profile.
Something like what you have to do with this Sony TV if you want it to run 1080P @ 120Hz https://www.rtings.com/tv/reviews/sony/x850e
"1080p @ 120Hz Yes (forced resolution required)"
"1080p @ 120 Hz @ 4:4:4 is supported but doesn't appear by default, so a custom resolution is needed. "
I've never even seen a 1440p TV.
Monitor? Of course, but TV? TV went straight from 1080p to 4k.
It's 40% less (he said 67% less), i.e. 60% of 4k.
Personally, I love 3440x1440. It’s a sweet spot imo.
Given their HPC part is ~54 billion transistors to the 2080ti's 18.6 billion, the sky is the limit. It comes down to what they think the market will support, $$$-wise.
Sad thing is I did use a calculator and somehow screwed up. 1440 does seem to be the sweet spot right now for many gamers, but I personally prefer the fidelity of 4K over 1440 more than frame rates, as long as it can stay close to 60 fps. I love 32" 4K since I can see the whole screen without panning. I assume on 3440x1440 you need to pan, which I guess could be more immersive, but I'm not sure I would enjoy that. I'm not really an immersive gamer; I play mostly RPGs, MMOs, and strategy games that can be enjoyed at a more casual pace.