My prediction is that the RTX 3xxx is gonna be faster than the Riva 128
I hope big Navi isn’t that expensive... it better match nVidia in features if it is.
An 80% performance increase in RT is not enough. You'll need at least twice the performance if you want to get anywhere near 60fps at 4K, which would be the holy grail.
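The frame-rate arithmetic behind that claim, as a minimal sketch (the 30 fps baseline is a hypothetical round number, not a measured benchmark):

```python
# Why +80% doesn't get you to 60 fps at 4K with RT, assuming a
# hypothetical 30 fps baseline (a round number, not a benchmark).
baseline_fps = 30.0

for speedup in (1.8, 2.0):
    fps = baseline_fps * speedup
    verdict = "meets" if fps >= 60 else "misses"
    print(f"{speedup:.1f}x -> {fps:.0f} fps ({verdict} the 60 fps target)")
```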
Based on what AMD just did with TR3 pricing, they are going to charge out the ass for Big Navi; most likely no value incentive other than maybe 50 bucks off for parity with Nvidia performance.
For now... people also lost their shit when gunpowder was invented... they couldn't even fathom a nuclear explosion at the time, and neither can we for graphics.
Not for current 2080ti owners, but if they could deliver 10% more perf with a 2080ti Super and maybe offer an entry price of $899, that would be a big thing to a lot of non-owners.
As a 1080ti owner I've been looking for a decent upgrade (without being scammed), but I wouldn't get an RT card now with version 2 around the corner.
I wouldn't hold my breath for next gen; it could be a whole two years before you have one in your hands. I mean, people have this damn thing called money and they don't know what to do with it, basically. So they stare at the screen until next gen is pumped out at record pace. I never use my cards as long as I think I'm going to. I used my 1080ti less for gaming than my 980ti, 970, and 670. I used my RTX 2080 more than the last four cards in a shorter time frame. I got a ton of use out of my 550 Ti even though that was the cheapest card of the lot.
The only game I really spent a lot of time on with my 1080ti was Kingdom Come: Deliverance; I spent almost 60 hours with that game.
Voxels, the mythical unicorn we've all dreamed about. I think everyone is putting too much emphasis on RT's importance in the near future. Unless the PS5 can do RT that blows your panties off at 60 fps, don't expect devs to put much effort into it for the next 5-10 years. Give me a gigantic leap in raster performance; I couldn't care less for RT right now or even 5 years from now. Shit, some of you old bastards on this forum won't even be alive by the time RT is truly ubiquitous lol!!
If Ampere is another overpriced dud like Turing, I'm definitely going with whatever AMD has unless they really screw up too. Then I'll have to cry in a corner and hope Intel pulls a miracle out of its big blue ass.
P.S. We should have a HardOCP betting pool on future gpu releases and the winner gets a nice chunk of change. My bet is AMD will catch up to nVidia by 2021-2022 across the board because of vulkan/dx12 and RT won't be as big of a deal as some think it will. The writing is already on the wall, nVidia can't pull driver magic anymore and they're only slightly better in efficiency now. Games like Call of Duty MW and RDR2 are indicative of what's to come from AAA devs.
Nope, you still need to shade voxels.
Shading is dynamic lighting + dynamic shadows + anything that isn't geometry (fog, clouds, fire, smoke etc) + post processing (dof, motion blur, AA)
Raytracing doesn't reduce the need for shading in any way. It just changes what data you pass to the shader.
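A minimal sketch of that point in Python (a toy renderer, not any real engine's API): the trace step only decides what each pixel sees, and the color still comes from a shading function, just as it would in a rasterizer.

```python
import math

# Toy single-sphere ray tracer. trace() answers visibility only;
# shade() is where the actual "look" comes from, exactly as in a
# raster pipeline -- ray tracing just feeds it different hit data.

SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, -3.0), 1.0
LIGHT_DIR = (0.577, 0.577, 0.577)  # roughly normalized directional light

def trace(origin, direction):
    """Visibility only: return the first hit point on the sphere, or None."""
    lx, ly, lz = (o - c for o, c in zip(origin, SPHERE_CENTER))
    dx, dy, dz = direction
    b = 2 * (dx * lx + dy * ly + dz * lz)
    c = lx * lx + ly * ly + lz * lz - SPHERE_RADIUS ** 2
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    if t <= 0:
        return None
    ox, oy, oz = origin
    return (ox + t * dx, oy + t * dy, oz + t * dz)

def shade(hit):
    """Lambert lighting; a real renderer also does shadows, fog,
    post-processing, etc. in this same stage."""
    normal = [(h - c) / SPHERE_RADIUS for h, c in zip(hit, SPHERE_CENTER)]
    return max(0.0, sum(n * l for n, l in zip(normal, LIGHT_DIR)))

for y in range(12):
    row = ""
    for x in range(24):
        # Camera at the origin; fire a ray through each "pixel".
        d = ((x - 12) / 12, (6 - y) / 6, -1.5)
        n = math.sqrt(sum(c * c for c in d))
        hit = trace((0.0, 0.0, 0.0), tuple(c / n for c in d))
        row += " .:-=+*#"[int(shade(hit) * 7)] if hit else " "
    print(row)
```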
Well, historically their high end card is $50-100 too much. I guess we shouldn't expect different.
Based on what metric? The 290X was a 780 Ti competitor in actual performance and it was $100 cheaper.
The Fury X was a 980 Ti competitor in actual performance and it released at the same price point.
Vega 64 was a 1080 competitor in actual performance and it released at the same price point.
This is the NVIDIA subforum and I always enjoy a good circlejerk, but come on.
It will be very interesting to see how the 5500 does in mainstream sales, as it's already making a splash in the PR world, beating the 1650 soundly across the board. A 1650 superpooper won't close that gap.
It's because AMD essentially built a 1660 competitor and aimed it at the 1650.
5500: 6.4B transistors, 1408 cores
1660: 6.6B transistors, 1408 cores
I expect AMD to price the 5500 above the 1650.
Does the 5500 have all CUs enabled? Not sure if Navi 14 has 22 or 24 CUs total.
A 5500XT with 24 CUs and 8 GB GDDR6 for $200 would be a kickass mainstream card and reinvigorate the market.
Today rogame spotted a new entry in the Geekbench database. This is not the RX 5500 XT though, but a Radeon PRO (a workstation card) named 5500M. Unlike the Radeon RX variant, the PRO is listed with 24 CUs enabled.
Not necessarily. Apparently it's 24 CUs, which yields 1536 cores:
https://videocardz.com/newz/amd-radeon-pro-5500m-gets-full-navi-14
This is identical to the 1660 Ti. So fully enabled it has the exact core count of the 1660 Ti, and the cut-down version has the exact core count of the 1660 non-Ti. Interesting coincidence.
The only issue compared to the 1660 Ti/Super is the lower memory bandwidth due to the 128-bit bus vs the 192-bit bus on the 1660s.
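A quick sketch of both numbers: cores from CU count (RDNA packs 64 shaders per CU), and peak bandwidth from bus width times data rate. The per-card memory speeds are the reference specs as I recall them, so treat them as assumptions:

```python
# Cores scale with CU count: RDNA has 64 shaders per CU.
print(f"Navi 14, fully enabled: 24 CUs x 64 = {24 * 64} cores")

# Peak memory bandwidth (GB/s) = (bus width / 8) * data rate in Gbps.
# Data rates below are reference specs from memory -- assumptions.
cards = [
    ("RX 5500 (GDDR6)",        128, 14),
    ("GTX 1660 (GDDR5)",       192, 8),
    ("GTX 1660 Super (GDDR6)", 192, 14),
    ("GTX 1660 Ti (GDDR6)",    192, 12),
]
for name, bus_bits, gbps in cards:
    print(f"{name}: {bus_bits}-bit @ {gbps} Gbps -> {bus_bits // 8 * gbps} GB/s")
```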
RTX will move forward for at least one more generation, and either become more popular or fade away. I feel this will be NV's biggest challenge: finding a way to make it a thing with devs and publishers that doesn't cost too much to implement.
I'm kind of surprised we haven't seen a 2080 Ti Super yet. Despite AMD having absolutely nothing to compete with it, it would be pretty typical of Nvidia to milk the market for all it can, especially if they price it at $1000. That should clear up their inventory of TU102 and make room for Ampere a few months later.
Would make no sense.
Why devalue existing product when you can just sell it for $1,300
and muppets will still buy it.
This assumes that they have an inventory to sell...
And this is unnecessary on a hardware enthusiast forum. There are always buyers for the very best, and it's always priced higher than a linear value plot would dictate.
My dream state wants a 3080 Ti with 12GB or more VRAM that outperforms a 2080 Ti by 10 to 15% and is priced around $900. Might not happen but one can dream.
Of course there is inventory to sell; there certainly isn't a shortage, or they'd be 'out of stock' everywhere already, wouldn't they?
Sorry to hurt your sensitive feelings, by muppets I am referring to people buying hardware at top dollar when it's basically EOL in production terms. That is a stupid investment. But if you have the money to lose, be my guest.
Prediction: the 3080 Ti will be sold out for months.
If anyone looks at buying computer parts as an investment, they are looking at it the wrong way. Just like a car, you are losing money from the start. You are buying high end computer parts for entertainment; that's exactly how I look at it, and I budget the money to do it high end and look at it as such.
You can buy a high end GPU for many reasons, and one could indeed be an investment. Some buy GPUs, as in hundreds of them, to mine with (more in the past than the present), for their business, for rendering, etc. Some could probably argue it is an investment in themselves, for having fun. Anyway, investments can take many forms, and a high end gaming card could be one.
My guess is 10 to 15% better than the 2000 series. I expect ray tracing performance to be the larger increase, at around 30 to 50%.
Well, that's not going to cut it, especially if they are going to keep charging these insane prices. If they want to charge around $1200 US, it had better have around a 20-30% increase in rasterised graphics and close to 100% in ray tracing.
Pricing will depend on how well Big Navi, coming from AMD, does. This may also be Nvidia's last monolithic chip design, which kind of tells you they have hit a wall on what they can do. I just don't see large jumps in performance being very likely on these small process nodes.
It's either gonna be $1500 with a 30% increase over the 2080ti,
or it's gonna be $699 with a 500% performance increase over the 2080ti and they will call it "U L T R A".
I think you will be disappointed. 7nm (Intel's 10nm equivalent) hasn't really seen much of a clock speed boost. Intel actually went backwards. AMD got a few percent.
Nvidia never disappoints in performance on a new process; the 7nm 3080 and 3080ti are going to be insanely fast.
Why would you bring CPUs and Intel into this conversation?
Because there are three big players in PC silicon parts: AMD, Intel and NVidia.
Transitioning to 14nm/16nm, they made impressive performance gains.
But in the transition to 7nm so far, neither AMD nor Intel has made big gains, so I wouldn't expect big gains from NVidia either.
AMD's gains seem to be mostly from architecture (Navi, Zen 2), not from clock speed boosts.
So the evidence is that the 28nm->14nm transition was MUCH better than the 14nm->7nm transition looks so far.
On top of that, you have to factor in that transistor economics are worse this time.
In the 28nm->14nm transition, going from the 980Ti to the 1080Ti increased transistor count by 50%. That isn't going to happen this time. I would expect 10%-20% more transistors; count ourselves lucky if it's 20%.
This is NOT going to be like 1080Ti vs 980Ti. That was the last gasp of big transistor count gains.
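Putting rough numbers on that (die transistor counts quoted from memory, so treat them as approximate; the 10%-20% growth is the speculation above, not a known Ampere spec):

```python
# Transistor counts in billions, from Nvidia's published die specs
# (quoted from memory -- treat as approximate). The growth projection
# is the post's speculation, not a known next-gen spec.
gm200 = 8.0    # 980 Ti (28nm)
gp102 = 12.0   # 1080 Ti (16nm)
tu102 = 18.6   # 2080 Ti (12nm)

print(f"28nm -> 14/16nm: +{gp102 / gm200 - 1:.0%} transistors")  # ~+50%
for growth in (0.10, 0.20):
    print(f"TU102 +{growth:.0%} -> {tu102 * (1 + growth):.1f}B transistors")
```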