Xbox Series X

When will the new Xbox console be available?
Around Thanksgiving this year. If you're outside the USA, that's sometime in November. Microsoft will have an event in April where we get all the details, supposedly including a hard release date and price.
 
Round-up of a few news links:

VideoCardZ - https://videocardz.com/newz/microsoft-confirms-xbox-series-x-feature-12-teraflops-amd-rdna-2-gpu


2020 High-End GPUs

Device             TITAN RTX      Radeon VII    Radeon RX 5700 XT   XBOX Series X
GPU                NVIDIA TU102   AMD Vega 20   AMD Navi 10         XBOX Series X GPU
Fabrication Node   12nm Turing    7nm GCN4      7nm RDNA1           7nm RDNA2
Unified Cores      4608           3840          2560                ?
HDMI Support       2.0b           2.0b          2.0b                2.1
FP32 Max Compute   16.3 TFLOPs    13.4 TFLOPs   9.0 TFLOPs          12.0 TFLOPs


PC Gamer - https://www.pcgamer.com/on-paper-th...es-x-is-faster-than-a-geforce-rtx-2080-super/

Here's a rundown of graphics cards from both AMD and Nvidia in terms of TFLOPs:

  • GeForce RTX 2080 Ti (Turing TU102)—13.45 TFLOPs
  • Radeon RX Vega 64 (Vega 10)—12.66 TFLOPs
  • Xbox Series X (Navi - RDNA 2)—12 TFLOPs
  • GeForce RTX 2080 Super (Turing TU104)—11.15 TFLOPs
  • Radeon RX Vega 56 (Vega 10)—10.54 TFLOPs
  • GeForce RTX 2080 (Turing TU104)—10.07 TFLOPs
  • Radeon RX 5700 XT (Navi 10)—9.754 TFLOPs
  • GeForce RTX 2070 Super (Turing TU104)—9.062 TFLOPs
  • Radeon RX 5700 (Navi 10 XL)—7.949 TFLOPs
  • GeForce RTX 2070 (Turing TU106)—7.465 TFLOPs
  • GeForce RTX 2060 Super (Turing TU106)—7.181 TFLOPs
  • GeForce RTX 2060 (Turing TU106)—6.451 TFLOPs
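
For reference, those peak FP32 numbers are just shader count × 2 FLOPs per clock (one fused multiply-add) × clock speed, so they say nothing about how efficiently an architecture actually uses its shaders. A quick sketch of the math, using the published Series X and 5700 XT figures:

```python
def peak_fp32_tflops(shader_cores: int, clock_ghz: float) -> float:
    # Peak FP32 = shader cores * 2 FLOPs per clock (an FMA counts as two) * clock speed
    return shader_cores * 2 * clock_ghz / 1000.0

# Xbox Series X: 52 CUs * 64 shaders = 3328 shaders at 1.825 GHz
print(peak_fp32_tflops(3328, 1.825))   # ~12.15 TFLOPs
# Radeon RX 5700 XT: 2560 shaders at its 1.905 GHz boost clock
print(peak_fp32_tflops(2560, 1.905))   # ~9.75 TFLOPs
```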

Euro Gamer - https://www.eurogamer.net/articles/digitalfoundry-2020-xbox-series-x-power-play-analysis

Microsoft's messaging is so confident that there's almost the sense that the firm is throwing down the gauntlet to its main rival, Sony. I didn't anticipate that Microsoft or indeed Sony would resuscitate the teraflop as a measure of a console's power - principally because it's not an especially accurate way of expressing what a GPU is really capable of, especially when comparing architectures from two very different generations. I can only imagine that Microsoft is comfortable enough with a comparison with the last-gen enhanced machine that possibly under-sells its new machine, while at the same time challenging Sony to follow suit with its latest offering.
 
That list just illustrates how a FLOP/s number alone means nothing. AMD hardware historically has high FP32 LINPACK performance but it rarely translates to actual real time rendering performance.
 
Yar, hard to take those numbers seriously when Vega64 has a decent TFLOPs lead over a 2080 Super.

My theory is that MS is pushing the TFLOPs number because they heard through their contacts that Sony's PS5 TFLOPs figure is lower. So they're making their messaging target an area where they'll appear to have a performance lead over Sony (though in reality the systems will have more or less parity with each other).

But that theory's pulled out of my butt, so who knows how accurate it will be.
 
What is interesting is that the 5700 XT is closer to what the Xbox will have in it, yet it outperforms the Vega 64 in a lot of cases. So take that or a better architecture (RDNA 2), give it more raw compute power, and then put it into a $500 console... sounds pretty impressive to me.
 
What do you think will be the launch price of the Xbox Series X?

Will the launch be delayed to next year?

What will be the cost of the additional 1 TB NVMe Compact Flash Express-style expansion card, which is expected to run at the same speed as the internal 1 TB NVMe drive?

https://www.thefpsreview.com/2020/0...storage-with-pci-express-4-0-expansion-cards/

“Microsoft has partnered with Seagate for its proprietary external 1TB SSD expansion,” wrote Eurogamer in its Xbox Series X deep-dive.
“It’s very short, quite weighty for its dimensions and actually presents rather like a memory card.”

The internal 1 TB NVMe SSD offers 2.4 GB/s of raw and 4.8 GB/s of compressed I/O throughput. Per Eurogamer:

the game package that sits on storage essentially becomes extended memory, allowing 100GB of game assets stored on the SSD to be instantly accessible by the developer. It’s a system that Microsoft calls the Velocity Architecture and the SSD itself is just one part of the system
 
I'm standing by my $600 prediction for price, and I don't think it will be delayed.
 
My money is on $499 as well. Many of us still remember the $599 PS3 and how insane that sounded. Somehow, $499 just seems a lot more reasonable. I expect Microsoft to be willing to take on a larger loss than Sony in order to not be the underdog this generation.

Both Sony and Microsoft are adamant they are launching this year. While I do believe that's the intention for both, I also wouldn't be surprised if that changed. This far out, neither company is going to talk about that. They don't want to appear weak or uncertain, and it's far enough off that the entire world is hoping for coronavirus to be over by then. The reality is this is all still very new, and growing... nobody really knows where the world is going to be a month from now, or six months from now. Will we still be battling this? Will people still be out of work? Will we be facing a global economic recession? All pretty important things that will impact if these consoles can truly hit their release target. Until the entire world settles down, nothing is 100% certain.

I'm sure both companies intended to talk launch details around E3. Three months from now, they should be able to say with a little more confidence if they are on track to launch this year.
 
I feel like there is going to be a real supply issue when both consoles are released... are the factories back up and running in China? I am pumped to see more about both consoles.
 
More on Xbox Series X Velocity architecture:

custom SSD storage
2.4 GB/s raw I/O throughput, or 4.8 GB/s compressed (roughly 40 times the 120 MB/s of the Xbox One X hard disk)

hardware decompression block
It uses Zlib, a general-purpose data-compression library, and a mysterious new system named "BCPack," geared to GPU textures.

DirectStorage, building upon DirectX
Microsoft’s Goossen told Digital Foundry that doing decompression of 4K textures fast enough to match the SSD's transfer rate would have consumed three Zen 2 CPU cores, plus an additional two just for the I/O overhead. With DirectStorage, Microsoft reduced that to just a tenth of one core (a rough single-core decompression sketch follows the links below). There are plans to expand this feature to Windows devices moving forward.

Sampler Feedback Streaming (SFS)
streamlining GPU usage by loading only the portions of textures that the scene actually demands

https://www.windowscentral.com/xbox-velocity-architecture

https://www.pcworld.com/article/353...ectstorage-velocity-architecture-windows.html
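
To put that "three Zen 2 cores just for decompression" figure in perspective, here's a rough, single-core software zlib throughput test. The synthetic data and chunk sizes are my own assumptions, and real game assets will compress and decompress differently:

```python
import os
import time
import zlib

# Synthetic ~1 MiB chunk: half random, half zeroes (assumption: real assets differ).
chunk = (os.urandom(64 * 1024) + b"\x00" * (64 * 1024)) * 8
compressed = zlib.compress(chunk, level=6)

start = time.perf_counter()
total_bytes = 0
for _ in range(200):
    total_bytes += len(zlib.decompress(compressed))
elapsed = time.perf_counter() - start

print(f"decompressed {total_bytes / 1e9:.2f} GB in {elapsed:.2f} s "
      f"-> {total_bytes / elapsed / 1e9:.2f} GB/s on one core")
# A single CPU core usually lands well below the 4.8 GB/s the console's dedicated
# decompression hardware is quoted at, which is the point of offloading the work.
```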

"The CPU is the brain of our new console, and the GPU is the heart, but the Xbox Velocity Architecture is the soul,"
stated Andrew Goossen, Technical Fellow on Xbox Series X at Microsoft via Xbox Wire.

Microsoft provides examples of games in those categories, including Final Fantasy XV, Assassin's Creed Odyssey, and Red Dead Redemption 2. Such titles were notorious for lengthy initial loading screens, sometimes taking minutes to get started. And for games that mask loading by slowing players down with elevators or hallways, the Xbox Series X may provide additional creative freedom to avoid those limitations.
 
DirectStorage allows developers to utilize SSD storage as a virtual RAM buffer for games. The SSD is used as a cache, allowing the 16GB RAM pool to offload, and even pull, extra power from the customized PCIe 4.0 SSD.

Armed with caching, compression systems, and a synergized API optimized to work alongside the SoC itself, the Xbox Velocity Architecture allows for massively improved asset fetching and processing, leading to ultra-fast loading times, more detailed game environments, and more.

Using DirectStorage, the SSD can now deliver assets, textures, and data to the Zen 2 CPU much more quickly, while also communicating with the system's 6GB of data-allocated GDDR6 memory pool (the Xbox Series X has 16GB GDDR6, but only 6GB of it is available to the SSD).

https://www.tweaktown.com/news/7127...revolutionary-changes-gaming-forever/amp.html
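
The "game package becomes extended memory" idea maps loosely onto memory-mapped files on PC. This is only an illustrative sketch, not the DirectStorage API; the pack file name and index below are made up:

```python
import mmap

# Hypothetical packed asset file plus a toy index of name -> (offset, size).
ASSET_PACK = "game_assets.pack"   # made-up file name, for illustration only
ASSET_INDEX = {
    "mailbox_texture": (0, 1 << 20),
    "brick_normal_map": (1 << 20, 1 << 20),
}

def read_asset(name: str) -> bytes:
    """Pull one asset out of the pack via a read-only memory map.

    Pages are faulted in from storage on demand, so assets sitting on the
    SSD behave a bit like directly addressable memory."""
    offset, size = ASSET_INDEX[name]
    with open(ASSET_PACK, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mapped:
            return bytes(mapped[offset:offset + size])
```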
 
I keep hearing how the SSD in the PS5 is a lot faster. That is true, but I think these systems are so fast that there isn't going to be a noticeable difference between the two. I mean, really, are we going to notice or care about a one-second difference? I'd like to get both consoles if I can afford it, but for now I am getting the Xbox. Good time to be alive!
 
I'm def not getting hung up on console specs, as the end result is still going to be that cross-platform games play almost identically between the consoles. But I'll be getting the PS5 regardless for its exclusives and sticking to my PC for all Xbox games, which I'll stream locally to my living room via my Shield TV or Steam Link as I've been doing. The only thing I really like about the Xboxes is their backwards compatibility with older original Xbox and 360 titles, and that's basically all I use my current Xbone X for.
 
Well, they claim no loading times at all. It is not just a super fast SSD. Even with the fastest SSD on PC there can still be a lot of loading.
 
At these speeds it's going to be on the order of milliseconds, not seconds. I'm sure games on both systems will be made with the ability to have no load times at all after the initial caching period. So while it is impressive that Sony is able to get the full PCIe 4.0 bandwidth out of 4 lanes, it's not really going to make that much of a difference compared to the Xbox.
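
Some napkin math on why the gap works out to a second or two at most. The throughput figures are the publicly quoted ones; the RAM budget is a rough assumption of mine:

```python
# Back-of-the-envelope: seconds to stream one full RAM budget's worth of assets.
def fill_time_s(ram_gb: float, throughput_gb_s: float) -> float:
    return ram_gb / throughput_gb_s

RAM_BUDGET_GB = 13.5  # assumed game-available memory, for illustration only
for label, gb_s in [("Series X raw", 2.4), ("Series X compressed", 4.8),
                    ("PS5 raw", 5.5), ("PS5 compressed (typical)", 8.5)]:
    print(f"{label}: {fill_time_s(RAM_BUDGET_GB, gb_s):.1f} s")
# Roughly 5.6 s vs 2.5 s raw, or 2.8 s vs 1.6 s compressed: a second or two of difference.
```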
 
Hopefully install sizes will go down as they will no longer need to include as many duplicate files as they do now to reduce load times.

I don't mind some loading times, but not like now, where, for example, State of Decay 2 can freeze for a couple of seconds because you moved around the map too fast.
 
The duplicate file trick was only necessary for the PS3, as it used a 2x Blu-ray drive (slower rotation speed meant slower access times). Now that data is all installed on the hard drive, they most likely don't make any copies.

At most, the hard drive is defragmented after install (while idle), but you can say that of any modern OS.
 
Wrong on it being only necessary for the PS3. Data amplification is necessary for practically every game designed around an HDD; talk to any game developer if you want confirmation. Spider-Man on PS4 is a well-known example: the ubiquitous mailbox asset had to be copied a hundred times or so across the HDD, and the same goes for similar objects in just about any game, at least until games are designed purely around the next-gen consoles.
 
Yeah, Mark Cerny hit on that specifically during his recent GDC talk and cited that exact case of Spider-Man on PS4 needing to put a ton of duplicate assets on the HDD to make up for its slow seek speeds.
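
For context, here's the back-of-the-envelope math on why duplication was worth it on a hard drive and isn't on these SSDs. The seek-time and throughput figures are generic assumptions, not measurements of either console:

```python
# Loading 1,000 small assets scattered across the disk vs. packed together.
ASSET_COUNT = 1000
ASSET_SIZE_MB = 0.064        # 64 KB each, assumed
HDD_SEEK_S = 0.010           # ~10 ms average seek, assumed
HDD_READ_MB_S = 120.0        # HDD sequential throughput, assumed
SSD_READ_MB_S = 2400.0       # Series X quoted raw throughput
SSD_LATENCY_S = 0.0001       # ~0.1 ms per request, assumed

hdd_scattered = ASSET_COUNT * (HDD_SEEK_S + ASSET_SIZE_MB / HDD_READ_MB_S)
hdd_packed = ASSET_COUNT * ASSET_SIZE_MB / HDD_READ_MB_S   # duplicates placed next to their level data
ssd_scattered = ASSET_COUNT * (SSD_LATENCY_S + ASSET_SIZE_MB / SSD_READ_MB_S)

print(f"HDD, scattered: {hdd_scattered:.2f} s")   # ~10.5 s -> why assets got duplicated
print(f"HDD, packed:    {hdd_packed:.2f} s")      # ~0.5 s
print(f"SSD, scattered: {ssd_scattered:.3f} s")   # ~0.13 s -> duplication no longer buys much
```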
 
No big deal in the world of gaming, but I actually liked having the HDMI-In for my cable box. It makes all of my TV content use the same color space as my Xbox...which was well calibrated. It also provides faux-Atmos (or 7.1) for all of that content as well. It's capped at 1080p, but then again, luckily so is 99% of Comcast's content.

Looking at my Xbox One game library, the only game I own for it that I don't also have on the PC is RDR2. If I ever see that game for cheap on PC, I think I'm just going to sell my Xbox One X while I can still get some decent money for it.
 
...and it's gone. Sold it to one of my friends for $175. He was looking for one and I wanted to swap over to the cheaper PC-only version of GamePass. Everyone wins. I ended up just setting up a separate color-space on my TV for that content. The only loss will be the fake surround sound that used all 7 speakers, but it's not like that added much.
 
Digital Foundry revealed the complete specifications of the Series X and gave details of new features like HDR globally implemented through a machine learning algorithm, even on original Xbox games, and Dynamic Latency Input.

https://hardforum.com/threads/digit...ck-compat-quick-resume-demo-showcase.1994146/

When the original PS4 and Xbox One consoles came out, their lack of support for true 4K resolutions and high dynamic range made them arguably out of date before they’d even gone on sale.

With every new thing we’re hearing about the PS5 and Xbox Series X, however, it’s looking like this time it’s the TVs that will struggle to keep up with everything the latest consoles can do, rather than the other way round.

According to Eurogamer, the HDR conversion the Xbox Series X applies to games appears to be the real deal, applying genuine HDR light levels on a believably relative basis to different objects in every game frame.
...
Intriguingly, the engine that’s creating this HDR effect is apparently based on the widely acclaimed HDR system used in Gears 5 (the glorious results of which I talked about in this earlier article).

~John Archer in Forbes on the Xbox Series X's HDR enhancement feature

https://www.forbes.com/sites/johnar...look-brand-new/amp/?__twitter_impression=true

Many TVs have tried to offer their own built-in SDR to HDR video conversions over recent years, and in most cases the results have been unconvincing to say the least. Strained and unbalanced colors; flared-out bright areas; and exaggerated noise and object edging are just some of the problems HDR upconverters in TVs have struggled to resolve.

Ironically given that we’re talking about the Xbox Series X here, I’d argue that the only TV brand that’s done a convincing SDR to HDR conversion is Sony. Sony’s TV system only works, though, because it takes a very mild approach to the conversion process, producing only a relatively gentle HDR effect.
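
For anyone wondering what an SDR-to-HDR expansion even does, here's a deliberately dumb inverse tone map. This is not the Gears 5-derived, machine-learned approach described above, just the basic idea of remapping SDR values onto an HDR brightness range; every constant is an assumption:

```python
import numpy as np

def srgb_to_linear(c: np.ndarray) -> np.ndarray:
    # Standard sRGB decoding, per channel, input in [0, 1]
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def naive_sdr_to_hdr_nits(rgb: np.ndarray, sdr_white=200.0, hdr_peak=1000.0) -> np.ndarray:
    """Toy inverse tone map: place SDR content on an absolute-nits scale and
    push only the top of the range toward the display's HDR peak.
    All constants are assumptions; real auto-HDR is far more sophisticated."""
    lin = srgb_to_linear(np.clip(rgb, 0.0, 1.0))
    nits = lin * sdr_white                                  # diffuse white sits at sdr_white
    highlight = np.clip(lin - 0.8, 0.0, None) / 0.2         # how far into the top 20% we are
    nits += (highlight ** 1.5) * (hdr_peak - sdr_white)     # expand highlights only
    return np.clip(nits, 0.0, hdr_peak)

# Example: mid grey stays tame, peak white lands at the 1000-nit ceiling.
print(naive_sdr_to_hdr_nits(np.array([0.5, 0.9, 1.0])))
```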
 
This is the first thing I've heard about the Series X that is interesting to me. I'm a big fan of HDR. I'm not necessarily sure it would sell me a system, but I'm definitely interested in seeing how this works and what effect it has on the industry. HDR support on PC already trails behind the consoles, and I kinda wonder if this will make that even worse. If devs can have "free" HDR on consoles, that gives even less motivation for them to put the effort into implementing it manually. I also wonder, though, if this auto-HDR can match the appearance of developer-implemented HDR. If this tech eliminates the need for developer-implemented HDR, will we see it come to PC, either by way of Microsoft rolling it into Windows 10 or AMD and Nvidia rolling it into their graphics hardware?
 
Seems like a neat feature for sure. Not neat enough to make me buy one (especially with all exclusives on the PC), but I still think it's really cool. I'd also be interested in similar technology on the PC.
 
With every new thing we’re hearing about the PS5 and Xbox Series X, however, it’s looking like this time it’s the TVs that will struggle to keep up with everything the latest consoles can do, rather than the other way round.

That's mostly because the TV industry thus far has avoided HDMI 2.1 (aside from LG), since there's no signal source for it yet (*glares at NVIDIA*). LG's top-end TVs from last year already support most of the HDMI 2.1 spec (hell, they got NVIDIA to backport HDMI VRR onto HDMI 2.0), and most of their TVs this year will support HDMI 2.1 as well. I think a few of Samsung's models also support HDMI 2.1 on at least one port (how else are they supporting 8K?).

All I'm really hearing here is Sony/MSFT announcing they'll be supporting HDMI 2.1, then repeating the HDMI 2.1 feature list to get people all excited.
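
Rough math on why HDMI 2.0 runs out of headroom before the 4K120 modes these consoles advertise. The 25% overhead factor is a crude stand-in for blanking and link encoding, so treat the numbers as ballpark:

```python
def video_bandwidth_gbps(width: int, height: int, fps: int,
                         bits_per_channel: int = 10, overhead: float = 1.25) -> float:
    # Uncompressed RGB bandwidth; 'overhead' roughly accounts for blanking/link encoding.
    bits_per_pixel = bits_per_channel * 3
    return width * height * fps * bits_per_pixel * overhead / 1e9

print(video_bandwidth_gbps(3840, 2160, 60))    # ~18.7 Gbps: already brushing HDMI 2.0's 18 Gbps cap
print(video_bandwidth_gbps(3840, 2160, 120))   # ~37.3 Gbps: needs HDMI 2.1 (up to 48 Gbps)
```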
 
Samsung made a big announcement about their HDMI 2.1 TVs earlier this week.
I remember when the PS3 touted 1080p as a feature. News articles couldn't say "...but nobody even HAS a 1080p TV yet!" enough.
A few years later and everyone and their mom had one. Granted, that was the first and only move to "HD" for a lot of people, but it didn't take long to catch on.
HDMI 2.1 could be like that depending on how the manufacturers choose to push them and if the new consoles catch on.
The only real knock on the concept is that (like HDR), you kinda have to see it in person to fully understand it.
Something that could really help is a catchy term (like "HD" or "4K") to describe higher refresh rates.
 
I was looking at TVs a bit yesterday. Compared to what I paid for a 65” 4K set in 2016, “upgrading” to something similar with VRR and HDMI 2.1 is almost half the price. I could easily see 2.1 becoming a more common thing in households a year from now.

I just wish manufacturers would fucking pick an HDR standard and stick with it. It seems like you can get HDR10+ or Dolby Vision but not quite both.
 
Same. I glanced at some of the new Samsung TVs after their announcement and they're much cheaper than they were a few years ago. They seem dead set on HDR10+, while it seems like Dolby Vision is what the others are going with. As someone who mostly uses it for gaming, I've only seen Dolby Vision pop up a couple of times. Optical disc movies seem to favor it, though.
 