NVIDIA GeForce RTX 3070 and RTX 3070 Ti Rumored Specifications Appear

Hmmm... We are talking about selling used video cards here. This is how it works for 99% of use cases. Sure, you may find a company with a Blender render farm using 12 2080 Tis that wants to keep them all the same and is willing to pay a little extra for the consistency... but that one person (or few people) is not going to keep prices high. Sure, when stock runs out, prices on new old stock may go up, but nobody here is dumb enough to buy one, and that's the target market we're talking about. I can say with pretty high confidence that at the end of the year you won't be selling many used 2080 Tis for MORE than an equivalent Ampere card. Is it possible a couple may? Sure, but it won't be the norm.

Yeah I was referring to parts sitting on shelves. I don't know what will happen with used 2080 Ti's but a 50% price cut seems unlikely. It would be great if the $800 Ampere card puts it to shame but I wouldn't bet on it.
 
IF, and I mean IF, it becomes common that new console AAA games run at 4K 60 FPS, then it would be a bit disappointing if you needed a $700+ GPU to achieve the same...
But the DLSS solution in Death Stranding, for example, implies that even the lowest-tier RTX cards are capable of that.

It could very well be that traditional GPU horsepower gets only a minor bump this time around, while RT and DLSS get a major one.
 
Just looking at PassMark real quick for average G3D Mark scores, if I had to guess, this is going to go the same way it usually does. The pattern seems to be that the Ti of the last gen lands at or near the base card of the next gen, without the new features. If this deviates from the historical progression, color me surprised. It's generally why, if you buy near the top, you can skip the next release. I don't have a crystal ball, but this is usually close enough to think about. I haven't done a deeper look at the history, but I would think the pattern holds true enough.
 
IF, and I mean IF, it becomes common that new console AAA games run at 4K 60 FPS, then it would be a bit disappointing if you needed a $700+ GPU to achieve the same...

There's no way AAA console games will run 4K native in this generation. Way too many better things to spend resources on.
 
But we also know that both the PS5 and Xbox Series X are made to deliver fantastic gaming at 4K resolution. For the TVs selling now, that will mean 4K resolution at 60 frames per second (fps). That matches what the current Xbox One X and the PS4 Pro can do, though most current-generation games are still capped at 30 fps.

But the next-gen consoles coming later this year go a step beyond that in two ways: frame rate and resolution. Early specs and announcements make it clear that both consoles will offer 4K gaming at 120 Hz, doubling or even quadrupling the current frame rates you can get and producing smooth, judder-free gameplay as a result.

Then, there's resolution. In addition to ramping up the number of frames, the new consoles will also be able to handle much higher resolution, with games playable at 8K Ultra HD or 7680 × 4320 pixels. But jumping up to more than 33 million pixels does come with one trade-off: 8K gameplay will be limited to 60 fps.

If you're in the market for a TV to go with one of the next-gen consoles coming this fall, there are three key features you'll want to take into account: connectivity, frame rates and resolution.

https://www.tomsguide.com/best-pick...


One of the things that has concerned me about going Intel with my next build is that the PS5 and Xbox are going to be on Ryzen hardware. This is going to result in a massive shift in how software is optimized in game development, the tools that support it, and so on. As big as the gaming industry is now, it is a major driver. While Intel holds an advantage now, I can't help but think they are concentrating on the server and mobile markets. The implications of all this aren't clear to me yet; maybe some of you who are better connected know.

Over 50% of US homes are expected to have a 4K TV by the end of 2020, and it would seem these consoles are going to be very capable of delivering. If I'm off base here, please correct me; I'd like accurate information.
 
The only way consoles will do 4K 60 FPS is with FFX... meaning subpar image quality compared to the PC right out of the gate.

I have heard all the FUD before...EVERY single console launch...and it ALWAYS fails.

I am getting too old for this shite; the same stupid patterns repeat and muppets refuse to learn 🤷‍♂️
 
Yeah I was referring to parts sitting on shelves. I don't know what will happen with used 2080 Ti's but a 50% price cut seems unlikely. It would be great if the $800 Ampere card puts it to shame but I wouldn't bet on it.
Yeah, they will drop to a level that is close to the value of the same/similar performance. I'm not positive what the final numbers will be, nobody is, but whatever price/performance Ampere and RDNA2 come in at is what older cards will have to drop to in order to sell. The used market will have to come in even lower than that in general.
 
There's no way AAA console games will run 4K native in this generation. Way too many better things to spend resources on.
I agree they won't do so native, but then again why would they have to do it native? I mean, the PS4 Pro/Xbox One X do 4K with checkerboard rendering, and current AMD GPUs use FidelityFX CAS or RIS, so there is no reason (for me at least) to think they won't continue to use some sort of upscaling technique to run AAA console games at 4K. Maybe not DLSS, but something that does some scaling (either rendering more detail at the center of focus, or rendering at a lower resolution and upscaling, or some combination). See the rough numbers after the link below.

https://www.extremetech.com/gaming/235274-how-sonys-playstation-4-pro-does-4k-without-delivering-4k
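
Just to put rough numbers on it (purely illustrative, not any console's actual pipeline): checkerboard-style techniques shade roughly half the samples of a native 4K frame and reconstruct the rest, so the per-frame shading cost drops to about half. A minimal sketch of that arithmetic, assuming a simple half-width render target:

# Purely illustrative: compare pixels shaded per frame for native 4K versus a
# half-resolution render that is reconstructed to 4K, which is roughly what
# checkerboard rendering does (the real technique is more involved than this).

def pixels(width, height):
    return width * height

native_4k    = pixels(3840, 2160)   # full native 4K frame
checkerboard = pixels(1920, 2160)   # ~half the samples shaded per frame

print(f"Native 4K:    {native_4k:,} pixels/frame")
print(f"Checkerboard: {checkerboard:,} pixels/frame "
      f"({checkerboard / native_4k:.0%} of native)")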
 
Actual developers have been coy about how they’re going to deliver 4K content to the PS4 Pro. Johan Andersson claims that Frostbite 3 is using a custom method of delivering 4K that’s neither native “or” upscaled but something altogether different.

From the article you linked, that made me chuckle.

Just looking at what current PC hardware can do, and at what's going into these consoles... it didn't make sense to me. They sure would like you to believe it, and I doubt most console players care about the details.

Xbox One and PS4 together sold about 150 million consoles combined over their lifetimes. That's likely to happen again with these new consoles.
 
I agree they won't do so native, but then again why would they have to do it native?

They certainly don't. I was referring to the earlier comment on the price of PC GPUs. Can't really make a price comparison if consoles are running hacked 4K vs PC native.
 
delivering 4K that’s neither native “or” upscaled but something altogether different.

DAFQ? I think the quick maths failed for him.
 
delivering 4K that’s neither native “or” upscaled but something altogether different.

DAFQ? I think the quick maths failed for him.

(attached meme image)
 
One of the things that has concerned me about going Intel with my next build is that the PS5 and Xbox are going to be on Ryzen hardware. This is going to result in a massive shift in how software is optimized in game development, the tools that support it, and so on. As big as the gaming industry is now, it is a major driver. While Intel holds an advantage now, I can't help but think they are concentrating on the server and mobile markets. The implications of all this aren't clear to me yet; maybe some of you who are better connected know.
Really no implications. Software is built for Intel CPUs, and AMD builds CPUs to run software that's optimized for Intel.

It's not like they're running the same binaries from the consoles on Windows, after all.
 
Really no implications. Software is built for Intel CPUs, and AMD builds CPUs to run software that's optimized for Intel.

It's not like they're running the same binaries from the consoles on Windows, after all.

Thanks for this. What I've noticed over the years is that games have always run better on the Intel/Nvidia side of things. Whether that remains true for future releases is another matter. In addition, gaming at 2K has changed my thoughts some, as being GPU-bound makes the CPU brand choice less relevant.

I can't help but wonder how much more similar these binaries will be going forward with these new consoles. At higher resolutions I'm not sure it matters on the CPU end at all. I can't see GPUs being less valuable in this scenario; if anything, they'll hold their current pricing or go higher.
 
How was Turing a bad buy, when you compare it to the launches from Fermi to Turing?
The performance delta data does not agree with your stance.
The die data does not agree with your stance.
The transistor count/price does not agree with your stance.

If people think that Ampere will be like Pascal, they are setting themselves up for a major disappointment...
This kind of argument only holds up if you own stock in the company. I couldn't care less, as the buyer, how much your house renovation cost. I am only going to pay the value per square foot that the comps show. If you chose to put in Italian marble, you get to eat that cost. 2080 Ti pricing was based on a false bubble created by a mining craze that died, not on a performance increase. So go ahead and name-call, but your position is what you'd find in an Nvidia company email, not some cool hot take.
 
This kind of argument only holds up if you own stock in the company. I couldn't care less, as the buyer, how much your house renovation cost. I am only going to pay the value per square foot that the comps show. If you chose to put in Italian marble, you get to eat that cost. 2080 Ti pricing was based on a false bubble created by a mining craze that died, not on a performance increase. So go ahead and name-call, but your position is what you'd find in an Nvidia company email, not some cool hot take.

Must really annoy you that the market does not agree?
I sense a lot of emotion coming from you... a bad foundation for a debate about hardware.

I have what correlates to an RTX Titan in "for fun" money every month; you don't hear me whining about a luxury hobby.

That entitlement 🤣
 
You left out the 3080 Super Ultra God Tier, and the upgraded version of that, the 3080 Super Ultra God Tier Ti.
Not quite right. There will be no 3080 Super Ultra God Tier Ti..... that will be the 3090, which will also have Super, Ti, and Ultra variants.
 
You left out the 3080 Super Ultra God Tier, and the upgraded version of that, the 3080 Super Ultra God Tier Ti.

All released in staggered fashion, 3 months one after the other to maximize sales to upgraders. The more you buy...
 
Nvidia has been competing mostly with themselves since AMD bought ATi; yet performance has continued to get cheaper.

I'd put the blame for the recent stagnation in price/performance growth more on a confluence of limitations, largely the cadence of foundry availability and the need to introduce RT hardware.

Well, RT has been introduced, and fab capacity for the latest processes has grown at more than one foundry, so unless Nvidia offers something really attractive, they're not going to get the unit sales. The people who were willing to pay up for Turing have already paid up.

See, I don't agree with that. A top-of-the-line 3D card (Voodoo 1, Voodoo 2) used to be $300. If you adjust that for inflation, that's $474.47 for a top-of-the-line GPU.

Today, top-of-the-line GPUs are $1,200 - $1,500.

Yeah.
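
For what it's worth, that inflation figure is just a CPI ratio. A back-of-the-envelope version, assuming approximate CPI values (the exact numbers below are my assumption, not official data):

# Rough CPI adjustment: adjusted = price * (CPI_now / CPI_then).
# CPI values are assumed approximate annual averages, not official figures.

VOODOO2_LAUNCH_PRICE = 300.0   # approximate 1998 launch price in USD
CPI_1998 = 163.0               # assumed CPI annual average for 1998
CPI_2020 = 258.0               # assumed CPI annual average for 2020

adjusted = VOODOO2_LAUNCH_PRICE * (CPI_2020 / CPI_1998)
print(f"${VOODOO2_LAUNCH_PRICE:.0f} in 1998 is roughly ${adjusted:.2f} in 2020 dollars")
# prints roughly $474.85, in line with the ~$474 figure above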
 
See, I don't agree with that. A top-of-the-line 3D card (Voodoo 1, Voodoo 2) used to be $300. If you adjust that for inflation, that's $474.47 for a top-of-the-line GPU.

Today, top-of-the-line GPUs are $1,200 - $1,500.

Yeah.

Those cards required a separate 2D card to run, FYI.
 
See, I don't agree with that. A top-of-the-line 3D card (Voodoo 1, Voodoo 2) used to be $300. If you adjust that for inflation, that's $474.47 for a top-of-the-line GPU.

Today, top-of-the-line GPUs are $1,200 - $1,500.

Yeah.

Inflation is for bread and milk. Completely irrelevant for state-of-the-art technology.
 