|2020 High-End GPUs|NVIDIA TU102|AMD Vega 20|AMD Navi 10|XBOX Series X GPU|
|---|---|---|---|---|
|Fabrication Node|12nm Turing|7nm GCN5|7nm RDNA1|7nm RDNA2|
|Unified Cores|4608|3840|2560|?|
|Device|TITAN RTX|Radeon VII|Radeon RX 5700 XT|XBOX Series X|
|HDMI Support|2.0b|2.0b|2.0b|2.1|
|FP32 Max Compute|16.3 TFLOPS|13.8 TFLOPS|9.75 TFLOPS|12 TFLOPS|
Round-up of a few news links:
VideoCardZ - https://videocardz.com/newz/microsoft-confirms-xbox-series-x-feature-12-teraflops-amd-rdna-2-gpu
PC Gamer - https://www.pcgamer.com/on-paper-th...es-x-is-faster-than-a-geforce-rtx-2080-super/
Here's a rundown of graphics cards from both AMD and Nvidia in terms of TFLOPs:
- GeForce RTX 2080 Ti (Turing TU102)—13.45 TFLOPs
- Radeon RX Vega 64 (Vega 10)—12.66 TFLOPs
- Xbox Series X (Navi - RDNA 2)—12 TFLOPs
- GeForce RTX 2080 Super (Turing TU104)—11.15 TFLOPs
- Radeon RX Vega 56 (Vega 10)—10.54 TFLOPs
- GeForce RTX 2080 (Turing TU104)—10.07 TFLOPs
- Radeon RX 5700 XT (Navi 10)—9.754 TFLOPs
- GeForce RTX 2070 Super (Turing TU104)—9.062 TFLOPs
- Radeon RX 5700 (Navi 10 XL)—7.949 TFLOPs
- GeForce RTX 2070 (Turing TU106)—7.465 TFLOPs
- GeForce RTX 2060 Super (Turing TU106)—7.181 TFLOPs
- GeForce RTX 2060 (Turing TU106)—6.451 TFLOPs
Eurogamer - https://www.eurogamer.net/articles/digitalfoundry-2020-xbox-series-x-power-play-analysis
Microsoft's messaging is so confident that there's almost the sense that the firm is throwing down the gauntlet to its main rival, Sony. I didn't anticipate that Microsoft or indeed Sony would resuscitate the teraflop as a measure of a console's power - principally because it's not an especially accurate way of expressing what a GPU is really capable of, especially when comparing architectures from two very different generations. I can only imagine that Microsoft is comfortable enough with a comparison with the last-gen enhanced machine that possibly under-sells its new machine, while at the same time challenging Sony to follow suit with its latest offering.
> Yar, hard to take those numbers seriously when Vega 64 has a decent TFLOPs lead over a 2080 Super.

That list just illustrates how a FLOP/s number alone means nothing. AMD hardware historically posts high FP32 LINPACK numbers, but they rarely translate into real-time rendering performance.
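For reference, those headline figures are pure peak math: unified cores × 2 ops per clock (one fused multiply-add) × boost clock. A minimal sketch reproducing a few of the numbers above (core counts and clocks are published boost specs; the Series X entry assumes Microsoft's stated 52 CUs at 1825 MHz):

```python
# Theoretical peak FP32 throughput from published specs. Says nothing about
# how often the FMA units are actually kept busy in a real game.
def fp32_tflops(cores: int, boost_mhz: float) -> float:
    """Peak FP32 TFLOPs assuming one FMA (2 ops) per core per clock."""
    return cores * 2 * boost_mhz * 1e6 / 1e12

gpus = {
    "GeForce RTX 2080 Ti (TU102)": (4352, 1545),
    "Radeon RX Vega 64 (Vega 10)": (4096, 1546),
    "Radeon RX 5700 XT (Navi 10)": (2560, 1905),
    "Xbox Series X (RDNA 2)":      (3328, 1825),  # 52 CUs x 64 lanes
}

for name, (cores, mhz) in gpus.items():
    print(f"{name}: {fp32_tflops(cores, mhz):.2f} TFLOPs")
```

The formula is exactly why the metric misleads across architectures: it counts ALU slots, not how well a given design keeps them fed, which is how Vega 64 "beats" a 2080 Super on paper and loses in games.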
> What do you think will be the launch price of the Xbox Series X?
> Will the launch be delayed to next year?
> What will be the cost of the additional 1 TB NVMe CFexpress card, which is expected to run at the same speed as the internal 1 TB NVMe drive?

I'm standing by my $600 prediction for the price, and I don't think it will be delayed.
Is MS still going to try to do the all-in-one DVR/console, or was that a bust?

WTF? Microsoft dropped the optical output despite earlier prototypes having it.
More on the Xbox Series X Velocity architecture. The internal 1 TB NVMe SSD offers 2.4 GB/s of raw and 4.8 GB/s of compressed I/O throughput. Per Windows Central (https://www.windowscentral.com/xbox-velocity-architecture):

- Hardware decompression block: delivers the 2.4 GB/s of raw I/O throughput, or 4.8 GB/s compressed, roughly 40 times the 120 MB/s of the Xbox One X hard disk. It uses Zlib, a general-purpose data-compression library, and a mysterious new system named "BCPack," geared to GPU textures.
- DirectStorage, building upon DirectX: allows developers to utilize SSD storage as a virtual RAM buffer for games. The SSD is used as a cache, allowing the 16 GB RAM pool to offload, and even pull, extra power from the customized PCIe 4.0 SSD. Microsoft's Goossen told Digital Foundry that doing decompression on the 4K textures to match the speed of the SSD would have consumed three Zen 2 CPU cores, plus an additional two just for the I/O overhead; with DirectStorage, Microsoft reduced that down to just a tenth of one core. There are plans to expand this feature to Windows devices moving forward.
- Sampler Feedback Streaming (SFS): streamlines GPU usage by loading only the portions of textures demanded by a scene.

"The CPU is the brain of our new console, and the GPU is the heart, but the Xbox Velocity Architecture is the soul," stated Andrew Goossen, Technical Fellow on Xbox Series X at Microsoft, via Xbox Wire.

Microsoft provides examples of games in those categories, including Final Fantasy XV, Assassin's Creed Odyssey, and Red Dead Redemption 2. Such titles were notorious for lengthy initial loading screens, sometimes taking minutes to get started. And for games that mask loading by slowing players down, whether in elevators or hallways, the Xbox Series X may give developers additional creative freedom to avoid those limitations.
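To put Goossen's "three Zen 2 cores" figure in perspective, here's a rough single-threaded benchmark sketch using Python's zlib (one of the two codecs named above; BCPack is proprietary, and the synthetic test data here is an assumption, so treat the output as an order-of-magnitude estimate rather than the console's real numbers):

```python
# Estimate how many CPU threads software zlib decompression would need to
# keep up with a 4.8 GB/s compressed stream. Synthetic data and Python
# overhead make this a rough sketch, not a measurement of the real codecs.
import os
import time
import zlib

PAYLOAD_MB = 64
TARGET_GBPS = 4.8  # Series X effective (compressed) throughput

# Repeating a 1 KB random block yields data that actually compresses,
# unlike pure noise (which zlib would just store).
raw = os.urandom(1024) * (PAYLOAD_MB * 1024)
blob = zlib.compress(raw, 6)

start = time.perf_counter()
assert zlib.decompress(blob) == raw
elapsed = time.perf_counter() - start

per_thread_gbps = PAYLOAD_MB / 1024 / elapsed  # GB of output per second
print(f"one thread decompresses ~{per_thread_gbps:.2f} GB/s")
print(f"threads needed for {TARGET_GBPS} GB/s: ~{TARGET_GBPS / per_thread_gbps:.1f}")
```

The point isn't the exact count; it's that keeping a multi-GB/s stream fed in software eats whole cores, which is why a fixed-function decompression block is worth the silicon.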
> I keep hearing how the SSD on the PS5 is a lot faster. That is true, but I think these systems are so fast that there is going to be no noticeable difference between the two. I mean, really, are we going to notice or care about a one-second difference? I'd like to get both consoles if I can afford it, but for now I am getting the Xbox. Good time to be alive!

I'm def not getting hung up on console specs, as the end result is still going to be that cross-platform games play almost identically on both consoles. But I'll be getting the PS5 regardless for its exclusives, sticking to my PC for all Xbox games and streaming them locally to my living room via my Shield TV or Steam Link, as I've been doing. The only thing I really like about the Xboxes is their BC for original Xbox and 360 titles, and that's basically all I use my current Xbone X for.

Well, they claim no loading times at all. It is not just a super-fast SSD; even with the fastest SSD on a PC there can still be a lot of loading.

At these speeds it's going to be on the order of milliseconds, not seconds. I'm sure games on both systems will be built to have no load times at all after the initial caching period. So while it is impressive that Sony was able to work with Seagate to get the full PCIe 4.0 bandwidth over 4 lanes, it's not really going to make that much of a difference compared to the Xbox.
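The "one second difference" intuition is easy to sanity-check. Assuming both consoles' officially quoted effective (compressed) throughput figures and a hypothetical 8 GB of assets per level load, with decompression and CPU-side setup ignored:

```python
# Hypothetical level load: the same 8 GB of assets read at each console's
# quoted effective throughput. Decompression, CPU setup, and shader work
# are ignored, so real loads would be slower on both machines.
LEVEL_GB = 8.0
EFFECTIVE_GBPS = {
    "Xbox Series X": 4.8,  # 2.4 GB/s raw, ~2:1 via Zlib/BCPack
    "PS5": 9.0,            # 5.5 GB/s raw, 8-9 GB/s typical via Kraken
}

times = {name: LEVEL_GB / rate for name, rate in EFFECTIVE_GBPS.items()}
for name, secs in times.items():
    print(f"{name}: {secs:.2f} s")
print(f"difference: {abs(times['Xbox Series X'] - times['PS5']):.2f} s")
```

That works out to roughly 1.7 s versus 0.9 s, a gap of well under a second on this simple model.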
> Hopefully install sizes will go down, as they will no longer need to include as many duplicate files as they do now to reduce load times.

The duplicate file trick was only necessary for the PS3, as it used a 2x Blu-ray drive (slower rotation speed meant slower access times). Now that data is all installed on the hard drive, they most likely don't make any copies.
I don't mind some loading times, but not like now, where, for example, in State of Decay 2 the game can freeze for a couple of seconds because you moved around the map too fast.
> The duplicate file trick was only necessary for the PS3, as it used a 2x Blu-ray drive (slower rotation speed meant slower access times). Now that data is all installed on the hard drive, they most likely don't make any copies.

At most, the hard drive is defragmented after install (while idle), but you can say this of any modern OS.
> Wrong on it being only necessary for the PS3. Data amplification is necessary for literally every game designed for HDDs; talk to any game developer if you want confirmation. Heck, there's even an example from Spider-Man on PS4: the ubiquitous mailbox had to be copied a hundred times or so across the HDD, and the same goes for similar objects in just about any game, at least until next-gen games are designed for both console generations.

Yeah, Mark Cerny hit on that specifically during his recent GDC talk, citing that exact instance of Spider-Man on PS4 needing a ton of duplicate assets on the HDD to make up for its slow file-seek speeds.
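For anyone wondering why developers duplicate assets on hard drives at all: seek time dominates scattered small reads, so shipping extra copies lets each scene's assets sit together and be read sequentially. A toy model with assumed figures (10 ms average seek, 120 MB/s sequential throughput, 1,000 half-megabyte assets per scene):

```python
# Seek-dominated vs. sequential scene load on an HDD. All figures are
# assumptions picked to be in the right ballpark for a console hard drive.
HDD_SEEK_S = 0.010      # ~10 ms average seek
HDD_READ_MBPS = 120.0   # sequential throughput (Xbox One X-era figure)
ASSETS = 1000           # assets touched by one scene load
ASSET_MB = 0.5

# Every asset in a different place: pay a seek per asset.
scattered = ASSETS * (HDD_SEEK_S + ASSET_MB / HDD_READ_MBPS)
# Duplicated copies packed next to each other: one seek, then stream.
packed = HDD_SEEK_S + ASSETS * ASSET_MB / HDD_READ_MBPS

print(f"scattered reads: {scattered:.1f} s")  # ~14 s
print(f"packed copies:   {packed:.1f} s")     # ~4 s
```

An SSD has effectively no seek penalty, which is exactly why both next-gen consoles can drop the duplicates.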
No big deal in the world of gaming, but I actually liked having the HDMI-In for my cable box. It makes all of my TV content use the same color space as my Xbox...which was well calibrated. It also provides faux-Atmos (or 7.1) for all of that content as well. It's capped at 1080p, but then again, luckily so is 99% of Comcast's content.
Looking at my Xbox One game library, the only game I own for it that I don't also have on PC is RDR2. If I ever see that game cheap on PC, I think I'll just sell my Xbox One X while I can still get some decent money for it.
Digital Foundry revealed the complete specifications of the Series X and gave details of new features like HDR, globally implemented through a machine-learning algorithm even on original Xbox games, and Dynamic Latency Input.

> When the original PS4 and Xbox One consoles came out, their lack of support for true 4K resolutions and high dynamic range made them arguably out of date before they'd even gone on sale.
>
> According to Eurogamer, the HDR conversion the Xbox Series X applies to games appears to be the real deal, applying genuine HDR light levels on a believably relative basis to different objects in every game frame.
> Intriguingly, the engine that's creating this HDR effect is apparently based on the widely acclaimed HDR system used in Gears 5 (the glorious results of which I talked about in this earlier article).
>
> Many TVs have tried to offer their own built-in SDR to HDR video conversions over recent years, and in most cases the results have been unconvincing to say the least. Strained and unbalanced colors; flared-out bright areas; and exaggerated noise and object edging are just some of the problems HDR upconverters in TVs have struggled to resolve.
>
> Ironically, given that we're talking about the Xbox Series X here, I'd argue that the only TV brand that's done a convincing SDR to HDR conversion is Sony. Sony's TV system only works, though, because it takes a very mild approach to the conversion process, producing only a relatively gentle HDR effect.

~John Archer in Forbes, on the HDR enhancement feature of the Xbox Series X
This is the first thing I've heard about the Series X that's interesting to me. I'm a big fan of HDR. I'm not sure it would sell me on a system by itself, but I'm definitely interested in seeing how this works and what effect it has on the industry. HDR support on PC already trails the consoles, and I wonder if this will make that even worse: if devs get "free" HDR on consoles, there's even less motivation to put in the effort of implementing it manually. I also wonder whether this auto-HDR can match the appearance of developer-implemented HDR. If this tech eliminates the need for developer-implemented HDR, will we see it come to PC, either by Microsoft rolling it into Windows 10 or by AMD and Nvidia rolling it into their graphics hardware?
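To make "applying genuine HDR light levels on a believably relative basis" concrete: the naive failure mode Archer describes is scaling everything up uniformly, which flares out the whole image. A toy inverse-tone-mapping curve, purely illustrative and nothing like Microsoft's ML-based approach, keeps midtones near SDR brightness and expands only the highlights:

```python
# Toy SDR-to-HDR expansion (inverse tone mapping). Peak values and the knee
# position are arbitrary assumptions for illustration only.
SDR_PEAK_NITS = 100.0    # reference SDR white
HDR_PEAK_NITS = 1000.0   # assumed display peak

def expand_to_hdr(sdr: float, knee: float = 0.7) -> float:
    """Map a normalized SDR value in [0, 1] to output nits.

    Below `knee`, brightness is left at SDR levels so the image isn't
    uniformly blown out; above it, highlights ease up toward the HDR peak.
    """
    if sdr <= knee:
        return sdr * SDR_PEAK_NITS
    t = (sdr - knee) / (1.0 - knee)  # 0..1 across the highlight range
    return knee * SDR_PEAK_NITS + (t ** 2) * (HDR_PEAK_NITS - knee * SDR_PEAK_NITS)

for v in (0.25, 0.5, 0.7, 0.9, 1.0):
    print(f"SDR {v:.2f} -> {expand_to_hdr(v):6.1f} nits")
```

A shadow at 0.25 stays at 25 nits while a specular highlight at 1.0 reaches the full 1000 nits, which is the "relative basis" idea in miniature.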
> With every new thing we're hearing about the PS5 and Xbox Series X, however, it's looking like this time it's the TVs that will struggle to keep up with everything the latest consoles can do, rather than the other way round.

That's mostly because the TV industry has thus far avoided HDMI 2.1 (aside from LG), since there's no signal source for it yet (*glares at NVIDIA*). LG's top-end TVs from last year already support most of the HDMI 2.1 spec (hell, they got NVIDIA to backport HDMI VRR into HDMI 2.0), and most of their TVs this year will support HDMI 2.1 as well. I think a few of Samsung's models also support HDMI 2.1 on at least one port (how else are they supporting 8K?).
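For context on why those console modes force the issue, here's a quick bandwidth sketch counting active pixel data only (blanking intervals and link encoding overhead are ignored, so real requirements run higher; the 80% usable-rate factor is a rough allowance for that):

```python
# Raw RGB pixel bandwidth for common modes vs. HDMI link rates.
def active_gbps(width: int, height: int, hz: int, bits_per_component: int) -> float:
    """Pixel data rate in Gbit/s, three components per pixel, no subsampling."""
    return width * height * hz * bits_per_component * 3 / 1e9

HDMI_2_0_GBPS = 18.0  # total link rate; ~14.4 Gbps usable after encoding
HDMI_2_1_GBPS = 48.0

modes = [
    ("1080p60, 8-bit", 1920, 1080, 60, 8),
    ("4K60, 8-bit",    3840, 2160, 60, 8),
    ("4K120, 10-bit",  3840, 2160, 120, 10),
]

for name, w, h, hz, bpc in modes:
    need = active_gbps(w, h, hz, bpc)
    link = "HDMI 2.0" if need < HDMI_2_0_GBPS * 0.8 else "HDMI 2.1"
    print(f"{name}: ~{need:.1f} Gbps -> needs {link}")
```

4K120 with 10-bit color needs roughly 30 Gbps of pixel data alone, which is why the consoles' marquee mode simply doesn't fit in HDMI 2.0's 18 Gbps link.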
> Samsung made a big announcement about their HDMI 2.1 TVs earlier this week.

I was looking at TVs a bit yesterday. Compared to what I paid for a 65" 4K set in 2016, "upgrading" to a similar one with VRR and HDMI 2.1 is almost half the price. I could easily see 2.1 becoming a more common thing in households a year from now.
I remember when the PS3 touted 1080p as a feature. News articles couldn't say "...but nobody even HAS a 1080p TV yet!" enough.
A few years later and everyone and their mom had one. Granted, that was the first and only move to "HD" for a lot of people, but it didn't take long to catch on.
HDMI 2.1 could be like that depending on how the manufacturers choose to push them and if the new consoles catch on.
The only real knock on the concept is that, like HDR, you kinda have to see it in person to fully understand it.
Something that could really help is a catchy term (like "HD" or "4K") to describe higher refresh rates.