"The 16GB makes no sense. Is it an architecture limitation, like they HAD to ship it with 16GB if they wanted that many compute units or something like that?"

A lot of not-so-tech-minded people I know used to believe, and some still do, that a card with more memory was better than a card with less memory, regardless of what chip was on it. Might just be that AMD wants to point and gesticulate about how much RAM they have.
I'm hoping for the best and expecting the worst. Would be pretty great if real-world numbers match or exceed the 2080.

Sounds decent but I'll reserve judgement for when the [H] review comes out. I don't believe anything AMD or nVidia (or any other tech company) say about their stuff any more. Let me see the independent reviews.
What a bummer, 2080 performance at 2080 prices without the 2080 feature set.
The only people who would really be interested in such a card are the ones who need the 16GB of RAM.
The 16GB makes no sense. Is it an architecture limitation, like they HAD to ship it with 16GB if they wanted that many compute units or something like that?
After the updates the 2080 does fine with RTX fps. I mean, you aren't going to get 4K performance, but reasonable 2K isn't out of the question. According to all the benchmarks, the 2080 Ti gets 60-90 fps now at 1080p, and the 2060 apparently targets 60 fps or greater at 1080p with RTX on low. It's new, so without DLSS it's really hard to quantify.
"The 16GB makes no sense. Is it an architecture limitation, like they HAD to ship it with 16GB if they wanted that many compute units or something like that?"

HBM works in stacks of 4 and each stack has to be the same. They probably could have gimped the card with 12GB of RAM, but because of the complexity in producing the stacked memory it probably would have cost more to gimp it than to ship it as is. That being said, this is the consumer version of one of their data center cards, where I can assure you 16GB of RAM is sufficient but not great; my Quadros are running 24GB apiece and they sit at 70+% memory utilization at any point, so on that front 16GB is insufficient.
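For a rough sense of the stacked-memory math, here's a minimal sketch; the per-stack capacity, bus width, and data rate are assumptions for illustration, roughly in line with what has been reported for Vega 20, not confirmed figures.

```python
# Back-of-envelope for a 4-stack HBM2 configuration.
# Per-stack figures below are assumptions for illustration,
# roughly in line with what has been reported for Vega 20.
stacks = 4
gib_per_stack = 4            # 4 GiB per HBM2 stack
bus_bits_per_stack = 1024    # each HBM2 stack has a 1024-bit interface
data_rate_gbps = 2.0         # effective Gbps per pin

capacity_gib = stacks * gib_per_stack
bus_width_bits = stacks * bus_bits_per_stack
bandwidth_gbs = bus_width_bits * data_rate_gbps / 8  # bits/s -> bytes/s

print(f"{capacity_gib} GiB over a {bus_width_bits}-bit bus, ~{bandwidth_gbs:.0f} GB/s")
# -> 16 GiB over a 4096-bit bus, ~1024 GB/s
```

Note that dropping a stack would also shrink the bus (3 stacks would mean a 3072-bit interface), which is one reason a 12GB configuration isn't a free lunch.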
"According to all the benchmarks, the 2080 Ti gets 60-90 fps now at 1080p."

[H] numbers say 30 fps is the low.
If this has the clock speeds to match a 2080 it might actually outperform in a number of games at 4k with that 16GB. It's a well proven fact that even without ray tracing there are many games now that can exceed 8GB at 4k.
"If the card was going to beat a 2080Ti they would have shown that in the slides."

Not 2080 Ti, 2080.
"Not 2080 Ti, 2080."

Oh lol. I read “2080 it” as 2080 Ti.
"Unknowns: Power consumption. It's 7nm so power might be comparable to nvidia this time around but we'll have to see."
Dual 8-pin power connector indicates to me that this will hardly help slow down climate change...
Gonna try to follow up with both of your comments with one of my own. I'm looking at this from a 3D artist and rendering perspective, and based on the tentative analysis of the specs (or lack thereof in this case), AMD is clearly not in the market for GPU rendering because it requires CUDA cores, so their intent is pretty clear they're not interested. Which is understandable but troubling nonetheless, because Nvidia will continue to have monopolistic control in this market. And unlike the severe amount of negativity RTX is getting for its ray tracing engine, GPU rendering is where it excels, and if these early test benchmarks from Chaosgroup's VRay are any indication, then it's going to be an attractive seller.
Flipping the script back to gaming: admittedly I'm only a light gamer these days, so I have no skin in the game. But anyway, if the performance is only marginally better than the 2080, if 90% of gamers won't ever take full advantage of 16GB of VRAM, if they're pricing it exactly the same as the already overpriced 2080, and there are no CUDA cores, then unfortunately in this humble motherfucker's opinion AMD dropped the ball, especially with the pricing. Unless they were to set the floor at $599 (and I could argue all day that even $499 would be more attractive), it's going to be difficult to convince the hobbyist gamer or loyalist to jump ship.
At the end of the day nobody is going to care about watts or any of that; gamers are going to speak with their wallets, and sadly for me I have no choice but to play into Nvidia's long-term strategy, because I would love nothing more than to give them the bird after their arbitrary NDAs and the forced GeForce Partner Program scandal. This is what happens in a capitalistic system where there is no competition: progress stagnates and consumers lose to corporate interests and their shareholders.
2080ti has dual 8 pin
Please don't suck.
Please don't suck.
Please don't suck.
As I understand it:
Pros - Slightly cheaper than a 2080, equivalent performance, double the RAM.
Cons - No RTX, no CUDA.
My Concerns: 16GB of HBM2 (if it's anything like Vega) suggests limited availability and that prices won't change much.
Unknowns: Power consumption. It's 7nm so power might be comparable to nvidia this time around but we'll have to see.
Other thoughts: Might be a good machine learning card vs the RTX 2080 with all the RAM and raw processing power, depending on how much the tensor cores help a given problem. Though you'll have to use custom AMD builds of TensorFlow and may have less or no support in other frameworks.
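On the "custom AMD builds of TensorFlow" point: AMD publishes a ROCm build (the tensorflow-rocm package) rather than the stock CUDA build, and a quick sanity check that either build actually sees the GPU might look like the sketch below; treat it as an illustration, not a supported recipe.

```python
# Minimal check that a TensorFlow build actually sees a GPU.
# The same call works for the stock CUDA build and for AMD's ROCm
# build (published as the tensorflow-rocm package).
from tensorflow.python.client import device_lib

devices = device_lib.list_local_devices()
gpus = [d for d in devices if d.device_type == "GPU"]

for gpu in gpus:
    print(gpu.name, gpu.physical_device_desc)

if not gpus:
    print("No GPU visible - wrong build (CUDA vs ROCm) or missing drivers.")
```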
AMD renders just fine. What apps do you use that are Cuda only?
Ok, and? They’re not going to do dual 8-pin unless it needs dual 8-pin. It will still be a power hungry GPU. That’s my point.
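For what it's worth, the connector layout does put a ceiling on things: the PCIe spec rates each 8-pin connector at 150 W and the x16 slot at 75 W, so dual 8-pin allows up to roughly 375 W of board power on paper. A trivial sketch of that arithmetic:

```python
# Maximum board power a connector layout allows on paper (PCIe spec):
# 150 W per 8-pin connector, 75 W per 6-pin, 75 W from the x16 slot.
def max_board_power(num_8pin, num_6pin=0):
    return num_8pin * 150 + num_6pin * 75 + 75

print(max_board_power(num_8pin=2))  # dual 8-pin card -> 375 W ceiling
```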
"VRay"

Well then you chose the correct GPU, being that it is Nvidia-only software.
This varies from your opinion, where you blame AMD or, more importantly, declare AMD unable to render: "AMD is clearly not in the market for GPU rendering because it requires CUDA cores, so their intent is pretty clear they're not interested."
AMD has no control over software design when a company chooses to support only Nvidia's proprietary CUDA. You do see the difference, I hope.
AMD GPUs have traditionally been quite strong in compute.
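To make that last point concrete: AMD cards expose their compute units through OpenCL (and now ROCm) rather than CUDA, so whether a renderer can use them comes down to which backend the vendor ships, not the hardware. A minimal sketch with pyopencl (assumed installed) that just lists whatever compute devices the system exposes:

```python
# List the OpenCL platforms and devices the system exposes.
# On an AMD card this shows the GPU's compute units even if a
# CUDA-only renderer can't use them; requires the pyopencl package.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(f"{platform.name}: {device.name} "
              f"({device.max_compute_units} compute units, "
              f"{device.global_mem_size // 2**30} GiB)")
```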
What point do I need to make? Did you make an asinine comment about 2080 Tis not helping the climate?
"Sounds decent but I'll reserve judgement for when the [H] review comes out."

Exactly. In [H] we trust. I can't remember the last time I bought a major PC component that hadn't been reviewed here.
"Pretty sure it's overstock of the MI60. Basically a stopgap card till Navi comes out. The Big Navi they have promised us in 2020 will be going into production soon."

Let's just hope Navi is as good as people are saying and they're just trying to dump all the Vega chips before they release Navi later this year.
"[H] numbers say 30 fps is the low."
I'm not paying $1300 to play on a 2007 resolution and have it slow down to 30fps. That's insanity.
Which benchmarks, and when? They did patches, drivers, and optimizations; I wasn't aware they retested, because that sounds like the original numbers from when it first released.
Also, tessellation was the same exact way when it released.
These are prosumer features on prosumer products. If you are crying about the price then you were never the intended market. People are acting like this exact scenario hasn't played out a dozen times before within gaming...
You can buy the cards and use (or ignore) the RTX features, or you can buy AMD; you have a choice.
"So is there a possibility they are 1% lows?"

There's a possibility that I'm a bored billionaire.
"There are wild swinging differences in framerate depending where you are in the map. The minimums are not exactly welcoming."

They came to the conclusion that RTX just doesn't work well, at least in BF1. This is using the ASUS ROG STRIX RTX 2080 Ti OC, mind you. Not a 2070, a 2080, or even a regular 2080 Ti; it's an OC model. $1300 for 23, 24, 30 fps minimums. I don't want it to drop to shit frames every minute and a half (~1/100), if it really is only 1% lows.
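For anyone unsure what's actually being argued about: a "1% low" isn't the single worst frame, it's (by most tools' convention) the average frame rate of the slowest 1% of frames in a run, so a 30 fps 1% low means roughly one frame in a hundred is that slow. A rough sketch of the calculation; the exact method varies between capture tools, this is just one common convention:

```python
# Rough illustration of "1% low" fps from a list of frame times (ms):
# the average fps of the slowest 1% of frames in the run.
def one_percent_low(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    count = max(1, len(worst) // 100)              # slowest 1% of the run
    avg_ms = sum(worst[:count]) / count
    return 1000.0 / avg_ms                         # ms per frame -> fps

# Example: a run that averages ~59 fps but stutters once in a while.
times = [16.7] * 990 + [40.0] * 5 + [35.0] * 5
print(f"average fps: {1000 * len(times) / sum(times):.1f}")
print(f"1% low fps:  {one_percent_low(times):.1f}")
```

The example shows how a run can average near 60 fps while the 1% lows sit in the 20s, which is exactly the gap being argued over here.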