On paper, the GPU in the Xbox Series X is faster than the RTX 2080 Super.
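The "on paper" part comes down to peak FP32 throughput. A back-of-the-envelope sketch in Python, using public spec-sheet shader counts and clocks (the 2080 Super boost clock is approximate, and FLOPS alone don't predict real-game performance):

```python
def tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS: shaders x 2 ops per clock (FMA) x clock."""
    return shaders * 2 * clock_ghz / 1000

# Xbox Series X: 52 CUs x 64 shaders = 3328, locked at 1.825 GHz
series_x = tflops(3328, 1.825)

# RTX 2080 Super: 3072 CUDA cores at an assumed ~1.815 GHz boost
rtx_2080s = tflops(3072, 1.815)

print(f"Series X:   {series_x:.2f} TFLOPS")
print(f"2080 Super: {rtx_2080s:.2f} TFLOPS")
```

On these numbers the console lands around 12 TFLOPS to the 2080 Super's ~11, hence "on paper faster"; architecture and drivers decide how much of that shows up in games.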


I have two ps4 controllers. One for the PC and one for the PS4. Works wirelessly with a simple Bluetooth dongle on almost every game (with the help of an Xbox 360 controller emulator)
 
In order to speculate on performance at all, we must at least consider RT performance. It's a cornerstone of both Turing and RDNA2.
This is one of the less-discussed question marks. While consoles have used low-overhead APIs -- if they even used APIs at all -- games and game engines on the desktop were not built around that idea; realistically, not supporting DX10+ (DX11 is an extension and requirement upgrade, but not a new 'regime') wouldn't have worked.

Now, DX12 / Vulkan / whatever BS Apple is peddling this year all work similarly enough to each other and to consoles that a more coherent low-level-API development approach may be taken. We don't really know what the fruits will be; we can't know, really, nor can anyone else, what released games will look like a few years from now from an API-overhead perspective. At worst, we should see some improvement, which would likely mean CPU requirements stagnating; more likely, CPU requirements for newer games will actually drop; and at very best, we'll see new features added that expand the gaming experience to take advantage of resources that are now less taxed.
 
Sure, maximum raster performance was sacrificed for RT in Turing, which is reflected in its die size. But that doesn't change the performance-per-FLOP metric. All I was trying to say is that AMD bridged that gap significantly, and RDNA2 will reduce it further. Since we don't know RT performance, even with the leaks, it's not wise to speculate on that at the moment. But we can speculate on rasterisation performance.

I agree that DX12/Vulkan reduces some of that historic benefit, for sure. But it's still present. I do agree with your final point though, absolutely.

P.S. It feels good to be back on the forums after a decade. I lost my old account/password, but I've been on the forums since 2003 (I've seen all the drama, ups, downs, and everything), so it's great to see the old timers still lurking around here, as well as the new ones. The forums seem to be working great.
If Turing didn't have ray tracing I highly doubt that NVIDIA would have made it ≈800mm² as they could not have justified the cost. The 2080 Ti is still 30-40% faster than the 1080 Ti in purely rasterized performance.
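To put that performance-per-FLOP point in numbers: a rough sketch using spec-sheet boost clocks, and assuming the midpoint of the 30-40% rasterized uplift quoted above (the 1.35x figure is an assumption for illustration, not a measurement):

```python
def tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS: shaders x 2 ops per clock (FMA) x clock."""
    return shaders * 2 * clock_ghz / 1000

gtx_1080ti = tflops(3584, 1.582)   # Pascal, spec-sheet boost clock
rtx_2080ti = tflops(4352, 1.545)   # Turing, spec-sheet boost clock

flops_ratio = rtx_2080ti / gtx_1080ti   # raw FP32 advantage, ~1.19x
perf_ratio = 1.35                       # assumed midpoint of 30-40% uplift
per_flop = perf_ratio / flops_ratio     # >1 means more perf per FLOP

print(f"FP32 ratio:    {flops_ratio:.2f}x")
print(f"Perf per FLOP: {per_flop:.2f}x")
```

Under those assumptions Turing delivers noticeably more rasterized performance per FLOP than Pascal, despite the die area spent on RT and tensor hardware.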
 
Rumors of a Dell laptop to be released next year with Navi 23 graphics

If this refers to a discrete "Navi 23" GPU derivative for mobile, then it could be as powerful as a PS5.

If this refers to an integrated iGPU with 23 CUs, that puts it in the ballpark of the 5500 XT. Assuming the iGPU is RDNA 2 based (skipping first-gen RDNA), it would compete with Nvidia's 1660 Super and 1660 Ti, and be more powerful than the rumored Xbox Series S, not to mention the Xbox One X and PS4 Pro.
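As a quick sanity check on the 5500 XT comparison: 23 RDNA CUs at an assumed ~1.8 GHz (the iGPU clock is a guess) land right around the 5500 XT's 22 CUs at its ~1.845 GHz boost:

```python
def cu_tflops(cus: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS for RDNA: CUs x 64 shaders x 2 ops x clock."""
    return cus * 64 * 2 * clock_ghz / 1000

navi23_igpu = cu_tflops(23, 1.8)    # hypothetical 23 CU iGPU, assumed clock
rx_5500xt = cu_tflops(22, 1.845)    # RX 5500 XT, spec-sheet boost clock

print(f"23 CU iGPU: {navi23_igpu:.2f} TFLOPS")
print(f"RX 5500 XT: {rx_5500xt:.2f} TFLOPS")
```

Both come out around 5.2-5.3 TFLOPS, so "ballpark of the 5500 XT" checks out on raw throughput, before accounting for any RDNA 2 efficiency gains or memory bandwidth limits of an iGPU.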

https://www.pcgamer.com/next-years-...er-than-a-next-gen-game-console/#comment-jump
 
Am I the only person who has a PC hooked up to their TV? Mine runs Linux Mint and has two wireless Xbox 360 controllers, and even uses the WiiMote for Wii games. I still use a mouse and keyboard, where the mouse sits on my armchair. I prefer games at my desk, but I use the TV too. Also, I play Xbox 360 games through an emulator, so that pretty much covers all the Xbox games. PS4 though... I'm still waiting for an emulator.
I have an actual xbox 360 & wii in my living room. The things are practically free.
 


Maintaining older console hardware is NOT free. If your 360 is one of the older models, it could die on you at any point. Also, the Wii sensor bars are not painless to keep up and running, and the native resolution from the Wii looks like shit on newer TVs.

Even if it's "effectively free" to acquire one, you still have to set up the transaction that hands you the console. And then you have to acquire all the parts that are not included/working in the used console sale (all the stuff that was broken, or just lost by the previous owner).


It costs you a lot of time. And a lot of dedicated home theater space (you have to store your console collection somewhere, including that stupid Wii sensor bar, or the extremely space-hungry Xbox Kinect monster).

You have to dedicate time and space to EVERY console in your collection. If one of your ancient consoles dies tomorrow, you have to troubleshoot it (swapping in a new one is not as easy as it used to be, now that we require hard drive game installations AND day-one patches). So you have to troubleshoot, or just give up on console playing for the next few days (while you troubleshoot, or dig out a replacement console from your collection).

Or you can just have an HTPC with the Bluetooth console controller of your choice, and call it a day. When was the last time your PC died on you? A Mini-ITX box also takes up about as much space as the Xbox Series X. It can't emulate today's consoles, but it's a whole lot less painful than keeping around 15+ year-old hardware.
 