Discussion in 'HardForum Tech News' started by Megalith, Dec 29, 2018.
4k pong at 240fps sure.
vr pong is gonna be sweet.
I'm going to assume his favorite hobby is huffing spray paint from a bag. Pro tip.... I heard the gold/silver ones are the best! Go Team!
Are any TVs capable of true 240 fps? I don't recall 4K at 240hz even being possible, due to bandwidth limitations.
I'm also expecting 1440p upscaled to 4K at 30-60fps, with variable refresh rate support.
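A rough back-of-the-envelope check of that bandwidth point (assuming 8-bit RGB and ignoring blanking overhead, so the real on-wire requirement is somewhat higher):

```python
# Uncompressed pixel data rate for 4K @ 240 Hz, 8 bits per RGB channel.
# Blanking intervals are ignored, so the actual link rate needed is higher.
width, height, refresh, bits_per_pixel = 3840, 2160, 240, 24

gbps = width * height * refresh * bits_per_pixel / 1e9
print(f"{gbps:.1f} Gbit/s")  # prints "47.8 Gbit/s"

# For comparison: HDMI 2.0 carries 18 Gbit/s; HDMI 2.1 carries 48 Gbit/s raw.
```

So even HDMI 2.1's raw 48 Gbit/s barely covers the pixel data alone, and HDMI 2.0 (what TVs actually ship with today) isn't remotely close.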
No 8k at 30fps? This is bullshit!
Really? Why is that?
He may be right about streaming systems being the new low end, but those performance predictions are ridiculous.
What’s weird is we already have a somewhat reliable idea of what 7nm Ryzen with Navi is, and to some degree what it would be capable of. So why guess so high, knowing you're already wrong, unless you intentionally lied in your analysis?
You guys are going to feel so silly when Pachter nails this prediction.
His entire career is defined by being wrong most of the time. If you make a wrong, outlandish claim, it gets a lot of views from people telling you what an idiot you are. This boosts ad traffic and keeps publications in business. It's how much of modern media works.
Care to put money where your mouth is? I'm willing to bet not even 5% of games on the next gen consoles will be able to perform at those settings. I say 5% since I could see a few demos or retro titles being tweaked to work on that.
Besides, betting against Pachter isn't exactly a risky proposition, he's wrong 75% of the time. He's so consistently wrong, you'll have statistically better predictions assuming the opposite of what he says is true.
This guy is talking shit.
how they gonna manage to fit all this high end hardware into a package that sells for $500?
I wish I could get paid for making up asinine things.
As long as they're on track with what the PS9 will be like, IDK.
240 FPS was a typo. He meant 24.
People didn’t notice* the drop from 60 to 30, and the drop from 30 to 24 is much smaller.
*: ask a random non-techie guy on the internet
...I wish that I was joking.
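The frame-time arithmetic actually backs that up, for what it's worth; the cost in milliseconds per frame shrinks at each step (a quick sketch):

```python
# Milliseconds per frame at each refresh rate.
frame_ms = {fps: 1000 / fps for fps in (60, 30, 24)}
for fps, ms in frame_ms.items():
    print(f"{fps} fps -> {ms:.1f} ms per frame")

# 60 -> 30 fps adds ~16.7 ms per frame; 30 -> 24 adds only ~8.3 ms.
print(f"{frame_ms[30] - frame_ms[60]:.1f}")  # prints "16.7"
print(f"{frame_ms[24] - frame_ms[30]:.1f}")  # prints "8.3"
```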
I won't bet money, but I'll let you have your 'told you so' moment.
I think realistically the next-gen consoles will do 4K/60fps in some titles and checkerboard rendering in others, with 60 fps as a minimum requirement, and hopefully 1440p @ 90fps per eye for VR. That's what I predict.
Newer AMD video cards are not much faster than the Xbox One X and the rate of improvement for video cards is very slow now just like it is for CPUs.
So my guess is, the Xbox Two will be twice the video horsepower of the One X and the CPU will be a Zen 2 which will be 2-3x faster than the One X.
What I am hoping for is 4K/60Hz on most games. The console may be able to run higher refresh rates however 99% of people use 60hz screens so no developer will waste their time on 120hz frame rates.
4k 240fps on a direct port of atari 2600 pacman maybe.
1080@60 has been out of reach, and you're hoping for 4k@60? Even if it gets the variable refresh rate someone else brought up, it doesn't matter if half the time you spend playing the game it keeps dipping down into "cinematic mode".
Your comment makes no sense as textures have nearly zero impact on performance unless you don't have enough vram.
My comment makes perfect sense, misterbobby.
So in other words you are clueless. Again textures have basically no impact on performance as long as you have enough vram and your ignorance on the subject changes nothing. They don't even have to use potato textures now and the next generation of consoles will have even more vram.
Microsoft already added 120hz adaptive sync support for the Xbox One, which will be locked to 60hz. 1080p/120hz (not sure about 144hz) and 4K/60hz are going to happen. As for 240hz, I would be skeptical of seeing native 1080p/240hz with no downsampling. 240hz is definitely aimed at VR, but I would love to see it in some fast-paced games.
Since the new consoles will be x86, full backward compatibility is going to happen very quickly. I expect many existing titles will get a big FPS boost if they are supported. Hopefully most PS4/Xbox One games locked at 30 will go to 60 on the newer consoles over time; as the hardware will support that.
The biggest hurdle will be 4k + high refresh rate adoption. While many 4k TV's support 1080p120hz, most console gamers do not have that ability. 120hz adoption rate could be a hurdle.
We also need to be careful with what resolution these consoles claim to be rendering at. They are gonna cheat their asses off to get 4k above 60fps.
People read 240 Hz and immediately think AAA games with super duper graphics. With this "next gen" there's going to be plenty of performance for simpler games. VR could use more frames, no?
PS7 will offer that
A lot of games designed for PS4 Pro and Xbox One X run at 1080p/60. The GPU in both systems is powerful enough to render 1080p/60, but both have weak CPUs that are unable to keep up. I think in the next generation, both the CPU and the GPU will be able to run most if not all games at 1440p/60 (maybe some will even push it up to 4K). The problem is, both MS and Sony give developers the liberty to do whatever they want, and most find it a lot easier to develop games locked at 30fps because a) they value art over frame rate and b) with limited CPU power, it is hard to ensure a constant 60fps.
Given the work Microsoft put into backwards compatibility, making the One support both 360 and original Xbox games, I'd say that's a given for the new Xbox. Almost have to say the same for 4K, as 4K is supported by both platforms now. So half this stuff is a no-shit, the next generation will have it. It will also support HDMI, use USB 3, and support controllers. It will also support TCP/IP v4 and v6.
It takes time for high-res textures to get loaded into VRAM from disk, so VRAM bandwidth (data transfer rate) also matters on top of VRAM quantity.
I've seen this first hand with quite a few games, especially DOOM 2016 with the nightmare mode and Fallout 4 with the HD texture pack with a GTX 980 Ti.
With the consoles, the unified memory architectures helps immensely with this task since it normally won't need to copy the memory twice from the disk to RAM to VRAM, but only from DISK to unified RAM, so that is definitely a plus.
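As a simplified sketch of the copy counts being described (the real paths depend on the API and driver, and the 512 MB figure is just an illustrative asset size):

```python
# Hypothetical texture payload; real asset sizes vary widely.
texture_mb = 512

# Typical discrete-GPU PC path: disk -> system RAM -> VRAM (two copies).
pc_mb_moved = texture_mb * 2

# Unified-memory console path: disk -> shared RAM; the GPU reads it in place.
console_mb_moved = texture_mb * 1

print(pc_mb_moved, console_mb_moved)  # prints "1024 512"
```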
The point is that he is clueless for thinking texture resolution is any type of limitation to getting 240 fps. It is the very last thing that would ever matter if there is enough vram so saying they would need to use potato level textures to get 240 fps is beyond stupid.
You are right, it doesn't look like higher-resolution textures negatively impact performance.
If the number of said textures were to increase (regardless of increasing/decreasing resolution), then that would be a different story as the GPU would have to calculate more of them.
But if the number of said textures remains the same, then yes, the resolution increase normally only relies more on disk access to load said textures into RAM/VRAM/unified RAM, but not overall GPU calculations.
Like AMD.... Hmmm....... You're making yourself look like a fanboy...... so maybe post the link to those rumors so you won't be judged for fake news?
The concept is fake. Current PC GPUs can't claim that. Stop the BS.
Not exactly, but a lot of people get confused by the difference between a skin and a texture. In the bad old days, a 2-dimensional image used as a wrap for a 3D object was a 'skin'*, and if it was directly proportional to the 3D object it was called a 'map'. A texture was an additional wrap that identified highs and lows on a skin and, depending on the direction of the light source, gave the 2D skin the appearance of a 3D surface. Nowadays there are a lot of different textures that can be applied to a skin or map.
*Note: We think of skins being form-fitted to a particular object, but this wasn't always the case - in the 90's, developers would often make 'skins' or 'wraps' as large, square graphics, and then 'dip' the 3D object into the graphic. This way they could use one graphic for many different purposes (if you played Everquest you saw a lot of this - a grass skin that was used as an animal skin or the leaves of trees, stone that was used for beaches, buildings or cliff walls, wood that was used for bark, doors, docks, and walls, etc.) In 2018 we think of 'skins' as being form-fitting colorizations for guns or clothing, and that kind of makes sense, since they are a 'skin' over an object. But in the old days, this would be correctly referred to as a 'map', since the edges of the graphic corresponded to points on the 3D model.
At some point, a skin that had a texture file attached started to be referred to as a 'texture'. And I can guarantee you, if you increase the resolution of a texture that has a lot of bump, light and opacity mapping (among other things) you'll slow your fps.
And I can guarantee you that I can compare performance between low and very high textures in any game out there and the difference will be almost meaningless if there is a difference at all. So again saying we would need potato textures is idiotic as it has nothing to do with being able to hit high framerates or not.
It is obvious that Mr. Pachter (Sony wins again) got it wrong. I mean, 240 hertz is not supported in any standard that I know of, unless there's some form of DisplayPort which allows this, but certainly not at 4K. That might mean he heard the number from someone or something. Maybe it is a VR-related comment, where they would have support for 120 hertz per "eye", and even then certainly not at 4K.
I can see 4K/60fps and/or 1080p/240hz, with both options provided for an à la carte type of consumer-preference approach. As for 4K/240hz, this analyst is dreaming, or really meant to say what I mentioned and is clearly not tech savvy.
All the marketing positions were filled!
Because they source the CPU and GPU from AMD, not Intel and Nvidia.
Direct X 12 + GNM/ GNMX 2 + powerful hardware from AMD = Profit.
DX 12 + AMD Hardware = more fps
240 FPS 4K from a console, ROFL. Does this guy work at the Onion? So the PS5 is going to be 8 times faster than the Xbox One X eh.
I love how most so-called industry professionals/analysts are morons.
More likely $300+. It would take 4 top-of-the-line NVIDIA cards to get that done....