PS5/XBOX X vs 3080Ti/Big Navi

Honestly it seems like some have 0 interest in consoles but just want to argue with those that have interest in them. Guess we'll see how it shakes out when people can get their hands on either one.

That's quite true.
I love my PC and I consider myself to be a PC gamer.

I owned a PS4 Pro and played Diablo 3 on it. Honestly, it was more enjoyable to play with a controller than kb+m. I mentioned that on a gaming forum, got tons of shit for it.
But I couldn't last with only a PS4 + a phone/tablet. I need a PC.

What I'm doing is questioning 4K gaming today. Nvidia has gone too far with their pricing. They turned the x080 Ti into a Titan X price-wise, yet cut back on the performance gain.
They did that because there's no competition. But they deserve to get shit for that, not praise.
 
I would say folks who do some thinking and research, and find accurate answers, will make the right choice for what is best for them and their family.

In my case, a 120Hz+ OLED with VRR and HDMI 2.1 or even DP 2.0 would give the best panel gaming experience. The 120Hz-or-higher capability gives an effective range for VRR; add in FRC on top of that. While most platforms, PC or console, may not be able to push 120 FPS in all games at max quality, it will still allow for a very smooth 4K HDR gaming experience with VRR. At present there are zero monitors that have that, only high-end TVs, and zero video cards with HDMI 2.1 that would support those high-end TVs.

Next-gen GPUs and consoles will have that capability, so to use it one would need either a next-gen console or GPU. One just needs to look at the $ and what value you will get with it. I don't see an issue popping a 3080 Ti or a top-end RDNA 2 card into my HTPC; it will hold it, has good airflow, and can then rock and roll on a new 2021 OLED TV. That is definitely an option, as are the next-gen consoles with their pluses and minuses.

There are monitors now that use DSC (Display Stream Compression) via DP 1.4 that will support 120Hz HDR 4K, and current-generation GPUs do support that. There are no OLEDs for PCs yet, nor support for DSC on the 4K 120Hz TVs. I'd rather invest in next-generation hardware and capabilities. The current Shield and probably many of the longer HDMI cables will not support HDMI 2.1, though the fiber-optic ones will probably work great as another option. If one does not need above-60Hz HDR 4K on a TV, there is plenty of current tech that works great, and also some high-end monitors that will do 120Hz 4K.
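For a rough sense of why DSC makes that work over DP 1.4, here is a quick sketch; the ~3:1 compression ratio and the 25.92 Gbps effective DP 1.4 payload are assumed typical figures, and blanking overhead is ignored:

```python
# Rough estimate of why DSC lets DP 1.4 carry 4K 120 Hz 10-bit HDR.
# Assumed figures: DP 1.4 effective payload ~25.92 Gbps (after 8b/10b),
# DSC treated as roughly 3:1; real panel timings and ratios will differ.

def raw_video_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed active-pixel data rate in Gbps (ignores blanking)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP14_EFFECTIVE_GBPS = 25.92          # 32.4 Gbps raw minus 8b/10b overhead
DSC_RATIO = 3.0                      # assumed visually lossless compression

raw = raw_video_gbps(3840, 2160, 120, bits_per_pixel=30)   # 10-bit RGB
compressed = raw / DSC_RATIO

print(f"4K 120 Hz 10-bit raw:   {raw:.1f} Gbps")            # ~29.9 Gbps
print(f"With ~3:1 DSC:          {compressed:.1f} Gbps")     # ~10.0 Gbps
print(f"Fits in DP 1.4 payload? {compressed < DP14_EFFECTIVE_GBPS}")
```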
 
That's quite true.
I love my PC and I consider myself to be a PC gamer.

I owned a PS4 Pro and played Diablo 3 on it. Honestly, it was more enjoyable to play with a controller than kb+m. I mentioned that on a gaming forum, got tons of shit for it.
But I couldn't last with only a PS4 + a phone/tablet. I need a PC.

What I'm doing is questioning 4K gaming today. Nvidia has gone too far with their pricing. They turned the x080 Ti into a Titan X price-wise, yet cut back on the performance gain.
They did that because there's no competition. But they deserve to get shit for that, not praise.

Moore's law is dead...it is not only NVIDIA that will charge more:

[attached chart]
 
Moore's law is dead...it is not only NVIDIA that will charge more:

[attached chart]
Kinda sucks seeing how prices for the chips will go up. Also, if Nvidia charged $1200 for the 2080 Ti, what are they going to charge for the 3080 Ti?

Maybe Nvidia will bring the Titan back down from its lofty $2500 price tag today to just over the $1200 mark, keeping that price range for the top gaming card while the 3080 Ti will be something else. Still, they will be confined by what the market will pay, and if there is good competition from AMD (Intel?), prices may be kept reasonable.
 
Kinda sucks seeing how prices for the chips will go up. Also, if Nvidia charged $1200 for the 2080 Ti, what are they going to charge for the 3080 Ti?

Maybe Nvidia will bring the Titan back down from its lofty $2500 price tag today to just over the $1200 mark, keeping that price range for the top gaming card while the 3080 Ti will be something else. Still, they will be confined by what the market will pay, and if there is good competition from AMD (Intel?), prices may be kept reasonable.

You cannot escape physics...the cost will keep rising the smaller the nodes get...and AMD/NVIDIA are not here to help you...they are here to earn money, like any company, so the cost will not be absorbed by them but put on the customers.
 
I also liked Diablo a lot better with a gamepad on PS4 - M+KB is clearly better for some game types... imho Diablo is not one of them.
 
You cannot escape physics...the cost will keep rising the smaller the nodes get...and AMD/NVIDIA are not here to help you...they are here to earn money, like any company, so the cost will not be absorbed by them but put on the customers.
I wonder how much of a loss AMD was taking with their Vega VII: 7nm with 16GB of HBM2 at $699?

Anyways, AMD's profit margin is around 45% while Nvidia's is around 65% (it varies per quarter).
  • So for an Nvidia card which costs $500 to make, Nvidia will charge an average of $825, and depending upon seller markup (middlemen, storage, shipping, store), let's say 10% -> ~$900
  • An AMD card costing $500 to make will be priced at an average of $725, and with seller markup -> ~$800
The point is AMD has somewhat better leeway for pricing without upsetting investors; they just need a good or better competitive product to go along with it. In reality Nvidia as well as AMD most likely make a much higher profit margin on the higher-end cards and less on the lower-end ones. We just need some good old-fashioned competition to keep the upper-end card prices in check.
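To make that arithmetic explicit, here is a small sketch; the $500 build cost, the margins applied as a markup on cost, and the 10% seller cut are the assumptions from above, not reported figures:

```python
# Worked version of the markup math above. The $500 build cost, the
# 65%/45% margins (applied here as markup on cost), and the 10% seller
# markup are assumptions carried over from the post, not reported figures.

def street_price(build_cost, vendor_margin, seller_markup=0.10):
    """Price after the GPU vendor's markup and the seller's cut."""
    vendor_price = build_cost * (1 + vendor_margin)
    return vendor_price * (1 + seller_markup)

cost = 500
print(f"Nvidia (~65%): ${street_price(cost, 0.65):.0f}")  # ~$908 -> call it ~$900
print(f"AMD    (~45%): ${street_price(cost, 0.45):.0f}")  # ~$798 -> call it ~$800
```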
 
Honestly it seems like some have 0 interest in consoles but just want to argue with those that have interest in them. Guess we'll see how it shakes out when people can get their hands on either one.

Actually it looks a lot more like this thread was started with the old "Consoles are so good now, why buy a PC?" rhetoric that we get when a new console generation debuts.

Naturally there is some push-back to that often repeated rhetoric.

Consoles and PC are different.

For me, it isn't about the HW. It's about the philosophy. Consoles are like iPhones: A Walled Garden.

For my gaming, I much prefer open gaming, with modding/hacking etc...
 
Actually it looks a lot more like this thread was started with the old "Consoles are so good now, why buy a PC?" rhetoric that we get when a new console generation debuts.

Naturally there is some push-back to that often repeated rhetoric.

Consoles and PC are different.

For me, it isn't about the HW. It's about the philosophy. Consoles are like iPhones: A Walled Garden.

For my gaming, I much prefer open gaming, with modding/hacking etc...
And that is it: preference, and probably also means for some. I pretty much prefer PCs as well, but I'm open to having a console alongside them. I pretty much doubt I will bother with the next-gen consoles unless something really stands out with them; an HTPC probably makes a better living room setup for the money. Anyways, one does not have to choose one or the other; having both can be an option as well, if one can afford them.
 
You cannot escape physics...the cost will keep rising the smaller the nodes get...and AMD/NVIDIA are not here to help you...they are here to earn money, like any company, so the cost will not be absorbed by them but put on the customers.
Except in the 2080 Ti's case it was not on a significantly smaller node; 12nm is basically an improved 16nm node, yet the price was rather stiff. If that price was due to the 16nm-to-12nm node change, with the 2080 Ti @ $1200, I shudder to think what Nvidia will want/charge for a 3080 Ti on a much improved node at 7nm! I am sure it was sheer profit-margin motive there, with a continued effort to push up the prices. They may not be as lucky this coming round -> let's hope so.
 
I also liked Diablo a lot better with a gamepad on PS4 - M+KB is clearly better for some game types... imho Diablo is not one of them.

I wholeheartedly agree. I loved Diablo 3 on console, but I'll probably buy Diablo 4 on PC since they stated it will have gamepad support from the get-go. Also, I believe Big Navi and the Nvidia equivalent will crush whatever console when it comes to performance. The consoles are estimated to have the speed of current-gen hardware, 2070 Super to 2080 Super performance depending on the source. With a die shrink on this new hardware it is safe to assume a 30 to 40 percent increase in performance on the high end. Not that it matters; I'm weird and will buy all the consoles along with whatever high-end card gives me the best performance.
 
I mean, if you can afford a 3080 Ti, you shouldn't have too much of an issue spending another $500 on a console.

So, with that being said, why not both?
:p
 
I wonder how much of a loss AMD was taking with their Vega VII: 7nm with 16GB of HBM2 at $699?

Anyways, AMD's profit margin is around 45% while Nvidia's is around 65% (it varies per quarter).
  • So for an Nvidia card which costs $500 to make, Nvidia will charge an average of $825, and depending upon seller markup (middlemen, storage, shipping, store), let's say 10% -> ~$900
  • An AMD card costing $500 to make will be priced at an average of $725, and with seller markup -> ~$800
The point is AMD has somewhat better leeway for pricing without upsetting investors; they just need a good or better competitive product to go along with it. In reality Nvidia as well as AMD most likely make a much higher profit margin on the higher-end cards and less on the lower-end ones. We just need some good old-fashioned competition to keep the upper-end card prices in check.

$500 per card seems like an extreme situation. A 7nm wafer costs a bit under $10K, and at current yields even a 700mm2 monster die should only cost about $150 to fab. The rest of the card is in no way going to cost $350 to make, otherwise nobody could make entry-level cards. I would be surprised if an actual 700mm2 7nm Titan card would even cost $250 to make, especially given they'll be using cut-down dies and not fully enabled ones, and probably under $300 fully packaged and shipped.
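For reference, the napkin math behind that ~$150 figure looks roughly like this; the wafer price, the salvage-yield guess, and the standard dies-per-wafer approximation are all assumptions for illustration:

```python
# Napkin math behind the ~$150-per-die claim. Wafer price, salvage yield,
# and the dies-per-wafer approximation are rough assumptions.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common approximation for gross (candidate) dies per wafer."""
    d = wafer_diameter_mm
    return math.floor(math.pi * d**2 / (4 * die_area_mm2)
                      - math.pi * d / math.sqrt(2 * die_area_mm2))

WAFER_COST = 10_000      # assumed ~$10K for a 7 nm wafer
DIE_AREA = 700           # mm^2, the "700mm2 monster die"
SALVAGE_YIELD = 0.9      # assume most dies are usable as cut-down parts

gross = dies_per_wafer(DIE_AREA)
usable = gross * SALVAGE_YIELD
print(f"Gross dies per wafer: {gross}")                     # ~75
print(f"Cost per usable die:  ${WAFER_COST / usable:.0f}")  # ~$148
```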
 
I mean, if you can afford a 3080 Ti, you shouldn't have too much of an issue spending another $500 on a console.

So, with that being said, why not both?
:p

I would not buy a console for the same reason I would not buy a GTX 1650 or an RX 5500...I can afford better graphics than a console or low/midrange GPU.
I dislike the limitations consoles put on graphics in general, so why would I support a platform that causes games to be visually downgraded?
 
I would not buy a console for the same reason I would not buy a GTX 1650 or an RX 5500...I can afford better graphics than a console or low/midrange GPU.
I dislike the limitations consoles put on graphics in general, so why would I support a platform that causes games to be visually downgraded?

I believe that's the question OP is asking and you've just answered it.

On the other hand, hardware specs to the side, owning a console along with a PC can enhance things in other ways. One reason is the ability to play with people you know (friends, family, colleagues, etc.) who don't have a gaming PC in their house. To add to that, there's really only a handful of games that support cross-play.

Second reason: interactive games during parties, get-togethers, etc. Yes, I know a PC is fully capable of displaying it on the big screen TV, and of making room for people to sit around and play/take part in specific games. It just becomes a pain to generally have to set that up each time. Additionally, it's just easier to have it set up with a console connected to the TV and be done with it. For a measly $400 to $500, why not? Furthermore, Rock Band, dance video games, and even co-op on one screen (Diablo 3 is one of the best examples) come pretty much exclusively to console platforms. Again, like I said, some games you can do the same with on a PC; it just becomes a bit more impractical at some point, with those first two reasons alone, given the general setting most people are in. Most people don't have their gaming rigs set up with their home theater system; they have it separated and usually in different rooms (office, bedroom corner, etc...).

So all in all, I'd say for in-person interactive stuff at parties and friend/family get-togethers, consoles are a solid addition to the house. After all, who gives a shit about graphics when you're having fun with everyone around you?

Solo experience though? PC all the way.

Both can enhance the household regardless if you use one once in a while, or every day.
 
I believe that's the question OP is asking and you've just answered it.

On the other hand, hardware specs to the side, owning a console along with a PC can enhance things in other ways. One reason is the ability to play with people you know (friends, family, colleagues, etc.) who don't have a gaming PC in their house. To add to that, there's really only a handful of games that support cross-play.

Second reason: interactive games during parties, get-togethers, etc. Yes, I know a PC is fully capable of displaying it on the big screen TV, and of making room for people to sit around and play/take part in specific games. It just becomes a pain to generally have to set that up each time. Additionally, it's just easier to have it set up with a console connected to the TV and be done with it. For a measly $400 to $500, why not? Furthermore, Rock Band, dance video games, and even co-op on one screen (Diablo 3 is one of the best examples) come pretty much exclusively to console platforms. Again, like I said, some games you can do the same with on a PC; it just becomes a bit more impractical at some point, with those first two reasons alone, given the general setting most people are in. Most people don't have their gaming rigs set up with their home theater system; they have it separated and usually in different rooms (office, bedroom corner, etc...).

So all in all, I'd say for in-person interactive stuff at parties and friend/family get-togethers, consoles are a solid addition to the house. After all, who gives a shit about graphics when you're having fun with everyone around you?

Solo experience though? PC all the way.

Both can enhance the household regardless if you use one once in a while, or every day.
I game online or solo IRL, but when people are involved IRL no gaming happens...unless you call socialising "gaming".
Besides, I don't own a TV; nothing being aired is worth a damn...and hasn't been for decades.

Besides, these days it is really, really, really simple to hook up your PC to a TV.
 
LG cx series says hi!
YES INDEED!
Been eyeing this one:
https://www.trustedreviews.com/reviews/lg-cx-48-inch-oled-oled48cxpua

It supports G-Sync currently but in a limited way; once Ampere comes out, the GPU should support the full capability. Since it is using the VESA VRR standard, I would expect it will support AMD's next generation as well, as long as it has HDMI 2.1. Anyways, this would make for an excellent bedroom system.
I was thinking of using the 48 inch LG CX as a monitor in my man cave whenever I can make a new build incorporating the next gen GPUs. But then I read this about LG cutting bandwidth on HDMI 2.1 for these displays: https://www.forbes.com/sites/johnar...d-tvs-dont-support-full-hdmi-21/#2727babb6276

Not sure if it's significant though.
 
I game online or solo IRL, but when people are involved IRL no gaming happens...unless you call socialising "gaming".
Besides, I don't own a TV; nothing being aired is worth a damn...and hasn't been for decades.

Besides, these days it is really, really, really simple to hook up your PC to a TV.

Then for you personally, owning a console doesn't make sense.
 
Then for you personally, owning a console doesn't make sense.

Not only does owning a console not make sense for me...but I still get affected by the subpar visual quality as games get downgraded/dumbed down due to the console limitations.
Currently consoles are the bane of progress...
 
Not only does owning a console not make sense for me...but I still get affected by the subpar visual quality as games get downgraded/dumbed down due to the console limitations.
Currently consoles are the bane of progress...

I have bad news for you. It’s gonna get worse with phones getting more and more powerful.
 
Not only does owning a console not make sense for me...but I still get affected by the subpar visual quality as games get downgraded/dumbed down due to the console limitations.
Currently consoles are the bane of progress...

I feel like that is the case for cross-console/ported games; however, when a game is exclusive to a specific console... they look pretty damn good.
 
I was thinking of using the 48 inch LG CX as a monitor in my man cave whenever I can make a new build incorporating the next gen GPUs. But then I read this about LG cutting bandwidth on HDMI 2.1 for these displays: https://www.forbes.com/sites/johnar...d-tvs-dont-support-full-hdmi-21/#2727babb6276

Not sure if it's significant though.
Did I read that right: their 2019 HDMI 2.1 TVs did support 48 Gbps while the 2020 ones are around 40 Gbps, because they think they can give better quality with their processing??? WTF? So 2020 LG HDMI TVs will support 10-bit at 120Hz at 4K, not 12-bit, which would require 48 Gbps, while their 2019 TVs with HDMI 2.1 would. Well, there are always the 2021 TVs (no doubt LG will brag about full bandwidth ability then :D). The other option is QLED from Samsung, where burn-in would not be an issue for desktop usage over long periods of time, with better brightness and decent blacks but not as good pixel response times, even when advertised at 1ms (BS). The Samsungs I looked at only had 1 HDMI 2.1 port though; the others are 2.0b I do believe. I am in no hurry.

Really, if one says they have HDMI 2.1, that should mean everything that standard represents; it is very misleading if it does not, especially when the limitation is not mentioned in the marketing materials.
 
$500 per card seems like an extreme situation. A 7nm wafer costs a bit under 10K, and under current yields even a 700mm2 monster die should only cost about $150 to fab. The rest of the card is in no way going to cost $350 make, otherwise nobody can make entry level cards. I would be surprised if an actual 700mm2 7nm Titan card would even cost $250 to make, especially given they'll be using cut down dies and not fully enabled ones, and probably under $300 fully packaged and shipped.
Exactly! Nvidia will have a very rude awakening when they have to tell their investors (stockholders) that they have to severely cut their profit margins to compete. That is, if they have to against AMD or Intel. Gaming still makes up a large chunk of their profits, but stiff competition is coming for their HPC line of GPUs. The price hikes for Turing had nothing to do with a significant node change; the node was very mature from the start with great yields. Their large-sized chips would definitely increase costs, which kinda also puts them on the spot for a significant improvement on 7nm or whatever node they will be using.

Will AMD raise their prices this time while staying very competitive, to keep Nvidia moving towards the cliff? I guess that will depend on how well AMD has executed on RDNA 2 and how secret they have been about it. When someone has very high profit margins, meaning a large market potential, others will come into play rapidly to fill the void. Nvidia is still unique in the quality not only of their hardware designs but more so of their software ecosystem. Very interesting and fun to watch how this will progress.
 
You can learn machine learning and computer vision, make excellent money, plus make your graphics card a tax write-off as a business expense.

 
I game online or solo IRL, but when people are involved IRL no gaming happens...unless you call socialising "gaming".
Besides, I don't own a TV; nothing being aired is worth a damn...and hasn't been for decades.

Besides, these days it is really, really, really simple to hook up your PC to a TV.
You got the aired stuff right; it is shit, from my view. The same movies play over and over again, while the newer and better movies all end up on streaming services. News is subpar, with repetitive opinionated stuff repeated ad nauseam into the next cycle of crap. History Channel and Science Channel are virtually the only channels I watch; the rest is streamed. My wife streams YouTube General Hospital videos from the '80s :D.

I have continually hooked up a PC to our TV for over a decade, and it is a pain in the ass overall to use. Win 10 for a while would abruptly update right in the middle of a movie, game, etc. WTF. Constant updates and driver updates requiring maintenance time; that is not the case now. Win 10 is a desktop operating system and not an entertainment operating system, so it is not as fine-tuned for controller use and interaction from the couch.

I've found just leaving the HTPC on all the time and letting it do its thing worked better. Then smart TVs could stream, making the computer redundant; what you previously could watch only from the HTPC is now much easier to do just using your TV remote made for the couch.

Then movies on Blu-ray became a pain in the ass, where 1/4 of them would not run and I had to wait for updates, Blu-ray player firmware updates, software updates -> a dedicated 4K HDR Blu-ray player is just so much easier to use. Now streaming is better: download and watch without worrying about disks for best quality, and streaming directly is in general usually good enough. Forget watching 4K Blu-ray on a PC - possible, yes, but worth it? HELL NO!

Games - I have yet to find a good way for a keyboard-and-mouse PC game to play well, mostly due to keyboard quality and lag from the wireless interface at usable distances; at one point I had USB cables strung, which was a total waste of time. Console games are made for the TV and play well with the controller - no compromises. At the distances one plays using a console, the up-close quality is irrelevant in my opinion, since you're not playing up close to notice the difference for the most part.

I also don't buy the quality difference as a strong point -> that would also imply one would not play games that are not as high in visual quality as others. There are many lower-IQ great games I would rather play than suck-ass great-looking ones, so what really counts? At least for me it is the gameplay and the experience one gets from it.

There are PC games, I do believe, that play best on the PC and will always play best on the PC due to the higher potential PCs have in general. Those that can maintain, or even build, great gaming PCs can reap the reward. Others can use consoles and have a great gaming experience. Some will do both, having even more options available to them.

Anyways, the new consoles, at least for a while, should make the games themselves in general better for the PC.
 
Did I read that right: their 2019 HDMI 2.1 TVs did support 48 Gbps while the 2020 ones are around 40 Gbps, because they think they can give better quality with their processing??? WTF? So 2020 LG HDMI TVs will support 10-bit at 120Hz at 4K, not 12-bit, which would require 48 Gbps, while their 2019 TVs with HDMI 2.1 would. Well, there are always the 2021 TVs (no doubt LG will brag about full bandwidth ability then :D). The other option is QLED from Samsung, where burn-in would not be an issue for desktop usage over long periods of time, with better brightness and decent blacks but not as good pixel response times, even when advertised at 1ms (BS). The Samsungs I looked at only had 1 HDMI 2.1 port though; the others are 2.0b I do believe. I am in no hurry.

Really, if one says they have HDMI 2.1, that should mean everything that standard represents; it is very misleading if it does not, especially when the limitation is not mentioned in the marketing materials.

We have no way of knowing what the 2019 TVs support, since there's no way to send a 12-bit 4K 120Hz 4:4:4 signal at this point. However, Phil Jones at Sound United (I don't have a timestamp, sorry, listened to this on the subway) mentioned that all 2020 "HDMI 2.1" TVs will be 40 Gbps as far as he knows, as they will only support up to 10-bit 4K@120 4:4:4.

 
We have no way of knowing what the 2019 TVs support, since there's no way to send a 12-bit 4K 120Hz 4:4:4 signal at this point. However, Phil Jones at Sound United (I don't have a timestamp, sorry, listened to this on the subway) mentioned that all 2020 "HDMI 2.1" TVs will be 40 Gbps as far as he knows, as they will only support up to 10-bit 4K@120 4:4:4.


That seems perfectly OK for 4K TVs, since they are all 10-bit panels at most anyways. Still, to use these newer HDMI 2.1 TVs, you will have to have a source supplying HDMI 2.1, which for GPUs and consoles does not exist yet. Fiber-optic cables incoming as well for very long distances -> cool!
 
Except in the 2080 Ti's case it was not on a significantly smaller node; 12nm is basically an improved 16nm node, yet the price was rather stiff. If that price was due to the 16nm-to-12nm node change, with the 2080 Ti @ $1200, I shudder to think what Nvidia will want/charge for a 3080 Ti on a much improved node at 7nm! I am sure it was sheer profit-margin motive there, with a continued effort to push up the prices. They may not be as lucky this coming round -> let's hope so.
The 2080 Ti has about 58% more transistors on a 60% larger die compared to the 1080 Ti. And price per transistor did not go down between the two. The profit margin on both is most likely the same for NVIDIA, and those profits go back into R&D. If NVIDIA wanted to stagnate like AMD, I'm sure they would sell Turing for a lot cheaper.
I was thinking of using the 48 inch LG CX as a monitor in my man cave whenever I can make a new build incorporating the next gen GPUs. But then I read this about LG cutting bandwidth on HDMI 2.1 for these displays: https://www.forbes.com/sites/johnar...d-tvs-dont-support-full-hdmi-21/#2727babb6276

Not sure if it's significant though.
It's not. 4K HDR10 at 120 Hz is only about 30 Gbps uncompressed (27.8 Gbps without HDR10). 4K at 120 Hz with 12-bit color is only 33.4 Gbps uncompressed. I don't know why people are all of a sudden spelling doom & gloom for the TV when this information came out.
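For what it's worth, the ~30 Gbps figure falls straight out of the active-pixel math; this quick sketch counts active pixels only and ignores blanking and link encoding, which is where the larger figures in the timing tables come from:

```python
# Active-pixel data rate for 4K 120 Hz 10-bit RGB (no blanking, no link
# encoding overhead).
width, height, refresh_hz = 3840, 2160, 120
bits_per_pixel = 10 * 3                     # 10 bits per colour component

gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"{gbps:.1f} Gbps")                   # ~29.9 Gbps
```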
 
The 2080 Ti has about 58% more transistors on a 60% larger die compared to the 1080 Ti. And price per transistor did not go down between the two. The profit margin on both is most likely the same for NVIDIA, and those profits go back into R&D. If NVIDIA wanted to stagnate like AMD, I'm sure they would sell Turing for a lot cheaper.

It's not. 4K HDR10 at 120 Hz is only about 30 Gbps uncompressed (27.8 Gbps without HDR10). 4K at 120 Hz with 12-bit color is only 33.4 Gbps uncompressed. I don't know why people are all of a sudden spelling doom & gloom for the TV when this information came out.
So you're saying the die for the 2080 Ti cost $400 more to make? I doubt that two dies even cost $400. The process was mature even being 12nm, unless the rather large size had utterly terrible yields. Nvidia sold fewer cards, made more money, and the profit margins stayed the same - I don't think so.

Here are the actual data rates for different resolutions, chroma, refresh rates:

https://images.idgesg.net/images/article/2018/02/formatdataratetable-100750471-orig.jpg

No, 4K HDR10 at 120 is not 30 Gbps uncompressed, not even close.
 
So you're saying the die for the 2080 Ti cost $400 more to make? I doubt that two dies even cost $400. The process was mature even being 12nm, unless the rather large size had utterly terrible yields. Nvidia sold fewer cards, made more money, and the profit margins stayed the same - I don't think so.

Here are the actual data rates for different resolutions, chroma, refresh rates:

https://images.idgesg.net/images/article/2018/02/formatdataratetable-100750471-orig.jpg

No, 4K HDR10 at 120 is not 30 Gbps uncompressed, not even close.
That table is adding the overhead, including the blanking interval. I am giving the pure data rate of these resolutions. The effective data rate of 48 Gbps HDMI 2.1 is 42.7 Gbps.
 
That table is adding the overhead, including the blanking interval. I am giving the pure data rate of these resolutions. The effective data rate of 48 Gbps HDMI 2.1 is 42.7 Gbps.
So?
Neither HDMI 2.0 nor DP 1.4 will do 4K HDR 10-bit at 120fps uncompressed; 30 Gbps is not even close, even if a monitor could operate without a blanking interval. Still, the chart is accurate and shows the overall overhead for the data rates; the signal writes top to bottom with a blank in between two frames. In other words, the blank space is needed and cannot be ignored; it is still part of the signal.
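To put rough numbers on that, here is a sketch of how blanking and link coding change the requirement; the 4400x2250 total timing and the 16b/18b FRL coding ratio are assumed typical values, not any specific TV's actual timing:

```python
# Rough link-rate estimate once blanking and HDMI 2.1 FRL coding are
# counted. The 4400x2250 total timing (active 3840x2160 plus blanking)
# and the 16b/18b coding ratio are assumptions for illustration.

H_TOTAL, V_TOTAL, REFRESH_HZ = 4400, 2250, 120

def required_link_gbps(bits_per_component, components=3):
    pixel_clock = H_TOTAL * V_TOTAL * REFRESH_HZ                 # ~1.188 GHz
    payload = pixel_clock * bits_per_component * components / 1e9  # incl. blanking
    return payload * 18 / 16                                     # FRL 16b/18b overhead

for bpc in (8, 10, 12):
    print(f"4K 120 Hz {bpc}-bit 4:4:4 -> ~{required_link_gbps(bpc):.1f} Gbps on the link")
# ~32, ~40, ~48 Gbps: 10-bit squeezes into a 40 Gbps port, 12-bit wants the full 48
```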
 
I never used to really mess with consoles, but I have the Pro/One X for the one-offs/exclusives. Other than that they sit there and collect dust.

As for the panel: LG OLED B9 55

https://www.rtings.com/tv/reviews/lg/b9-oled
Got the LG C9 and won't look back. It is soooo nice, especially compared to the circa-2009 Vizio 550vx it replaced. 4K 60Hz doesn't get much use, however, as I much prefer 2560x1440 @120Hz, getting good use out of the G-Sync. Just to check it out I did run Destiny 2 @4K 60Hz HDR and it looked amazing, but it's better at 120Hz, especially in PvP. Can't wait for DisplayPort to HDMI 2.1 adapters so I can see if my 2080 Ti will run Destiny 2 at 4K 120Hz.
 
Got the LG C9 and won't look back. It is soooo nice, especially compared to the circa-2009 Vizio 550vx it replaced. 4K 60Hz doesn't get much use, however, as I much prefer 2560x1440 @120Hz, getting good use out of the G-Sync. Just to check it out I did run Destiny 2 @4K 60Hz HDR and it looked amazing, but it's better at 120Hz, especially in PvP. Can't wait for DisplayPort to HDMI 2.1 adapters so I can see if my 2080 Ti will run Destiny 2 at 4K 120Hz.

I replaced an old Sony Bravia that "could" do 1080@120Hz; however, it would show little white dashes/artifacts. Since the mode was not officially supported, there was nothing Sony or the retailer would do, and it ended up being just a TV again. But my 2060 laptop hooked up to the OLED makes me smile as I see tear-free 4K gaming (though at lower texture settings), so I usually leave it at 1440@120.
 
Assassin's Creed Valhalla will "run at least 30 FPS" at 4K on Xbox Series X
https://www.gamesradar.com/uk/assas...l-run-at-least-30-fps-at-4k-on-xbox-series-x/

So much for the original argument starting this thread.
Is that with ray tracing? Too much is not known still. For most of us PC gamers that would be unacceptable, but for the average console user a constant 30fps would be good. Even today console games will dip below 30fps and still get played. A game like that, graphically intensive, I would probably play at 1440p at 60fps+ using VRR; sitting on the couch I would not even notice a significant difference in graphics quality, but I would notice a significant difference in smoothness. Now, 4K up close on a big monitor or TV, PC style, that would kinda suck.

The recent Microsoft Xbox Series X gameplay event was, well, without gameplay :confused:??? Not too encouraging at this point in time. The next-gen hardware may be ready, but it does not look like the software is. Then again, the next-gen consoles could be a rather big letdown.
 
Interesting collage seen on Reddit:
[attached collage]


We all know how that turned out for the PS4.
Is that with ray tracing? Too much is not known still. For most of us PC gamers that would be unacceptable, but for the average console user a constant 30fps would be good. Even today console games will dip below 30fps and still get played. A game like that, graphically intensive, I would probably play at 1440p at 60fps+ using VRR; sitting on the couch I would not even notice a significant difference in graphics quality, but I would notice a significant difference in smoothness. Now, 4K up close on a big monitor or TV, PC style, that would kinda suck.

The recent Microsoft Xbox Series X gameplay event was, well, without gameplay :confused:??? Not too encouraging at this point in time. The next-gen hardware may be ready, but it does not look like the software is. Then again, the next-gen consoles could be a rather big letdown.
Software for consoles in the release window usually does suck, as developers continue to push out games for the last-generation hardware. It usually takes about 1-1.5 years to see any actual significant improvement in games when a new console generation comes out. I read somewhere that Valhalla is not using ray tracing, probably because of this.
 