> Tractor?
Yup, grow my own food etc... I'm like a techy farmer guy, ugghh.
> I think shifts in market share can happen pretty quickly. And 90+% of the market is non flag ship parts.
I think the limits on how fast they can actually manufacture the cards make any shift slow.
> I think the resistance in the ability to just manufacture the cards make the possible speed of a shift, slow.
At that price, any market share they gained would come with such a small profit margin that the investors would shit themselves with rage and then throw it at Lisa. AMD's stock price would tank, and Lisa would lose her bonuses, and likely her job too.
Imagine AMD selling the 7900 XTX at $599: would they gain market share, or simply be back-ordered everywhere? Maybe chiplets open that up a bit more than before, but in the current reality I imagine any shift must by definition be quite gradual (reserving more capacity next gen, designing for more sales, taking fewer orders on your CPUs and the other products competing with your own business, building up AIB channels, and so on).
> I think its obvious they are on a path the beating Nvidia up. Assuming they don't screw something up... OR Nvidia manages a switch to chiplets themselves. Reports are their next gen will be monolithic as well.
IMHO, they're fine as long as they don't require 4+ slots and a kickstand. I still think the 4090 is ridiculous.
AMD has dramatically reduced their development costs, not just their production costs. Not only are these cheaper to fab, it will be much, much cheaper to design new generations. Reusing the 6nm bits for 2-3 generations is viable; they did it with Zen. They only really need a full redesign every couple of gens. They could get even more savings by fabbing the older-process bits all at once, ordering chips for two generations of GPUs and just warehousing them for a year.
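The fab-cost side of that argument can be made concrete with a toy yield model. The sketch below is purely illustrative: every number (wafer prices, defect density, die areas, the six-chiplet split) is a guess for the sake of the example, not AMD's actual figures. It just shows why carving a big die into a leading-node logic die plus small trailing-node cache chiplets lowers the cost per good die.

```python
import math

# Toy back-of-envelope model. All inputs are assumptions, not real data.
# Classic Poisson yield model: yield = exp(-die_area * defect_density).
WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2  # 300 mm wafer, ignoring edge loss
DEFECT_DENSITY = 0.001                     # defects per mm^2 (assumed)

def cost_per_good_die(die_area_mm2, wafer_cost):
    """Wafer cost divided across the dies that come out defect-free."""
    dies_per_wafer = WAFER_AREA_MM2 // die_area_mm2
    yield_fraction = math.exp(-die_area_mm2 * DEFECT_DENSITY)
    return wafer_cost / (dies_per_wafer * yield_fraction)

# Hypothetical monolithic ~520 mm^2 die on an expensive leading-edge node
mono = cost_per_good_die(520, wafer_cost=17000)

# Hypothetical chiplet split: ~300 mm^2 logic die on the leading node,
# plus six ~37 mm^2 cache dies on a cheaper trailing node
gcd = cost_per_good_die(300, wafer_cost=17000)
mcds = 6 * cost_per_good_die(37, wafer_cost=9000)

print(f"monolithic: ${mono:.0f} per good die, chiplet: ${gcd + mcds:.0f}")
```

Small dies win twice: more of them fit on a wafer, and each one is far less likely to catch a defect, so the chiplet total comes out well under the monolithic cost even before counting the cheaper node for the cache.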
> Maybe not, lets say they got it back to a 50/50 split then the investors would be fine with that as it will lead to more profitable sales later. Only AMD really knows what the cost is for each GPU, but I would assume $750.00 would still be quite profitable for them on their Halo card. I am just not convinced that AMD goal is to take market share.
It is a bit hard to know, because there's the actual margin on the GPU itself versus everything else on a graphics card. The way AIBs talked about the latest generation, their margins did not seem to be that high even at the very high price tag.
> Your not wrong about disrupting the market leader.
If AMD really wanted market share, they'd crank out those $250 RX 6600s at all the capacity they could buy at TSMC.
I think we should probably wait for benchmarks, though. We don't know by how much the 7900 will beat the 4080; it seems like it will, but we don't even know that for sure.
I mean, how crazy do things need to get before people bite on AMD? If 90% of the time a 7900 XTX is within single digits of a 4090, and beats a 4080 by double digits in basically everything (not counting a couple of RT titles) for $200 less, I think people start biting at that point if they are on NV 2000-series cards or older.
I think shifts in market share can happen pretty quickly, and 90+% of the market is non-flagship parts. All the holding off on mid-range next-gen cards from both NV and AMD is annoying. What AMD really needs to gain market share is a 7800 and a 7700. Come in with those cards at $700/$500 and put the screws to last gen. I wish AMD had done that and just said, fine, we are going to have to blow out a bunch of 6000-series cards and lose some money.
I agree with you, as revolutionary as AMD's tech is. Alchemist sucking really sucks... hopefully BM does kick NV and AMD in the backside. I just hope Intel has learned enough lessons to do that.
> At that price the market share they gain would be done with such a small profit margin that the investors would shit themselves with rage and then throw it at Lisa, AMD's stock price would tank, Lisa would lose her bonuses, and likely her job too.
She's got that nice ring to fall back on though. She would be fine.
> If amd really wanted market share, they'd crank out those $250 rx6600's at all the capacity they could buy at tsmc.
Not many are looking for an RX 6600. They can barely drum up enough excitement for their top-of-the-line, brand-spanking-new chiplet-design GPU, and that excitement is mostly about comparing it to what Nvidia has already released; it doesn't even stand on its own two legs. Their almost bottom-of-the-barrel offerings wouldn't be flying off the shelves; they would likely be collecting dust on them.
> If amd really wanted market share, they'd crank out those $250 rx6600's at all the capacity they could buy at tsmc.
If they cranked up production of the RX 6600, its price would drop to less than $200 due to oversupply.
> If they cranked the production of RX 6600 then its price would drop to less than $200 due to oversupply
They're already regularly priced at $200; I almost bought one as a backup card for $219 a month or two ago. I saw a couple of deals on 6600 XTs in the low $200s.
I am not sure why a massive price cut would lead to more profitable sales later.
> If amd really wanted market share, they'd crank out those $250 rx6600's at all the capacity they could buy at tsmc.
Or just release a 7600 for $250 and take the loss on the remaining 6600s they would have to sell for $150. lol
> A little would probably go a long way I imagine. I mean all the cache that takes up tons of die space are on the chiplets. The GCD is all logic... even if they decided to go up to 350mm or so the amount of extra horsepower would be impressive. I don't know if they will... heck this might be it for this gen. I think of it like early zen. As ground breaking as zen 1 was, it was the follow ups that took the idea and ran with it. I think this gen they where probably conservative. Make sure it works they couldn't afford to have a generation fall on its face. Knowing how Su dealt with Zen though I'm sure they already have the next gen well on the way... cause they probably are going to use the exact same cache chiplets. I wouldn't be shocked if they already had very early engineering bits for the next gen.
If they do have a refresh with a larger-die version in a 7950 XTX, they need to focus on whatever accelerates ray tracing. That's my biggest worry/disappointment with the latest AMD GPU cards so far: the gap this generation seems to have widened when AMD needed to narrow it.
> If they do have a refresh with a larger die version in the 7950 xtx, they need to focus on whatever accelerates ray tracing.
That's their compute units.
> If they do have a refresh with a larger die version in the 7950 xtx, they need to focus on whatever accelerates ray tracing. That's my biggest worry/disappointment with the latest amd gpu cards so far, the gap this generation seems to have widened when amd needed to narrow it.
I am pretty sure that gap will actually have gotten much smaller. But we'll have to wait for benchmarks.
When the benchmarks come out, if the 7900 XTX can't even beat a 3090 in ray tracing, I won't bother.
Great interview on how AMD with RDNA3 plan to quote "Take Nvidia by the shoulders, wheel them into a corner, and turn the lights out", figuratively speaking. And speaking figuratively, that they are "literally going to take Nvidia and nearly drown them in the ocean, then, like a powerful wave toss them onto the shore like they're nothing, with their pants pulled down and their ass in the breeze", figuratively, literally (I'm paraphrasing).
> I think this gen of AMD cards will be very good bang for buck, albeit not any better than Ampere in ray tracing. I think the next will challenge nVidia for the top spot. I just hope they don't price them equivalent to nVidia at that point and instead use the opportunity to earn some mind share.
You're still cheering for a $1,000 card in a performance segment that was around $700 not too long ago.
> You're still cheering for a $1,000 card in a performance segment that was around $700 not too long ago.
He might be talking about the lower-tier AMD cards, not necessarily the XTX. If he wasn't, well, then I guess Nvidia's pricing strategy is working its magic.
> You're still cheering for a $1,000 card in a performance segment that was around $700 not too long ago.
AMD is playing a bit of the same game Nvidia is playing to move old stock. At least they have been getting their older product out at an actual discount. Having said that, I would expect the 7900 XT will probably not get restocked super heavily while the XTX will. Then in a few months AMD will drop a 7800 XT that might drop a bit of RAM or something, perform pretty darn close, and come in at a proper 7800 price.
> He might be talking about the lower tier AMD cards, not necessarily the XTX. If he wasn't, well then I guess nvidia's pricing strategy is working its magic.
I'm only cheering for competition in hopes that it gets price/performance into a better place. The reality we live in is that people are still buying up almost everything at ridiculous prices. The 4080s are showing that people are a little more hesitant to buy them out in seconds, but they are still going to sell everything that's available in a day or two. This $1,000 7900 XTX is a step in the right direction: if it matches or surpasses the 4080 in rasterization, then at least nVidia will be put on notice. If AMD uses that to try to unseat nVidia in the next gen of cards by going for market share, then maybe that performance level comes down to $899, or even $850, or something more reasonable.

If you watch the GN video that came out overnight, you'll see that AMD should be seeing more and more of a competitive pricing advantage from chiplets. I'm sure nVidia is going to go down the chiplet road too, but AMD has a massive head start. We didn't get into this mess overnight and we're not getting out of it overnight. The 1080 Ti was $700 and nVidia decided to never make that mistake again; it's been three generations since then. The 3080's MSRP was somewhat OK, but it was unobtainium. Now look at CPUs: after four generations of Ryzen, AMD and Intel are just now slitting each other's throats on pricing. I think there's more GPU pain ahead, but hopefully it will improve a generation at a time. And, as Furious_Styles said, there should still be some value in the 7800-7700-7600 range for the time being, but I still don't see them beating the used market on 3080s. Myself, just this week, I kind of side-graded from a 3080 to a used 3090 because I saw a deal too good to pass up and I had a buyer for my 3080.
> You're still cheering for a $1,000 card in a performance segment that was around $700 not too long ago.
Would I go to jail if I admitted part of me is cheering for the idea of a $1,600 monster AMD card with dual exhausts, so that more chairs are thrown, more bottles are broken, and more guys are saying something unclean about some other guy's girlfriend?
> I haven't dug into the chip design apart from high-level architecture. Is 7900xtx the full fat chip or is there more fat to be had?
If you mean are there disabled bits on the 7900 XTX: no, it doesn't look like it. The logic chip is what it is. IMO they went conservative with it. I mean, it's an upgrade from the previous gen's logic, but they didn't swing too hard; I am sure they wanted to prove the concept and get a working product first. The MCM cache chips, though, are probably good to go for one more generation. I would think they will create a slightly bumped, still-RDNA 3 chip for a refresh using the exact same cache MCM chips. When they move to RDNA 4 they will most likely also use the same 6nm MCM chips.
I am batting for AMD on this one. They talk a mean game, but I am really hoping to see some follow-through; I want an actual race between AMD and Nvidia, not a leisurely jog.
> If you mean are their disabled bits on the 7900 XTX. No it doesn't look like it. The logic chip is what it is. IMO they went conservative with it. I mean its an upgrade from the previous gen logic... but they didn't swing to hard. I am sure they wanted to prove the concept and get a working product first. The MCM cache chips though are probably good to go for one more generation. I would think they will create a slightly bumped still RDNA 3 chip for a refresh using the exact same cache MCM chips. When they move the RDNA 4 most likely they will also use the same 6nm MCM chips.
The thing about the MCM design, like you note, is that if they want to improve the logic or do a die shrink, it's more viable since they don't have to port the RAM or the controllers. AMD has never bothered to compete with "Titan", and the 4090 is basically nVidia's Titan-class part renamed. Any tech enthusiast will be more than happy if the 7900 XTX has a clear win vs the 4080 in raster and is at least as good as the 3080 Ti (or so) in ray tracing, especially at (at minimum) a $300 lower price point. Other than nVidia die-hards, I can't see anyone picking the 4080 over the 7900 XTX for any reason but that, or availability. I have a feeling AMD is going to need to print as many chips as they can possibly squeeze out this gen, because this may be the next 9700-level banger.
> Well I think AMD has more room to grow with the chiplet design, while Nvidia's back is pretty much against the wall with monolithic design. Not really sure if they can really crank the wick much more for the next generation and if so that will give AMD ample room to catch up. But who knows Nvidia might have a Ace up their sleeve. I am however more concerned by the pricing these days then the performance. If 80% of the market can't afford the cards then it hurts the gaming market as a whole. This is also why I expect Ray Tracing to go nowhere, as it needs a user base for any developer to seriously utilize it.
I think the whole market is waiting to see the ~$300 parts. AMD is likely to sell everything they make. nVidia may not, just due to pricing and the fact that, as the market leader, they'll have way more parts in the channel, as evidenced by the 4080.
> Well I think AMD has more room to grow with the chiplet design, while Nvidia's back is pretty much against the wall with monolithic design. Not really sure if they can really crank the wick much more for the next generation and if so that will give AMD ample room to catch up. But who knows Nvidia might have a Ace up their sleeve. I am however more concerned by the pricing these days then the performance. If 80% of the market can't afford the cards then it hurts the gaming market as a whole. This is also why I expect Ray Tracing to go nowhere, as it needs a user base for any developer to seriously utilize it.
It's more a question of "does AMD do anything with it?" rather than "can they?", because it's obvious that they can.
> will AMD choose to put that saved silicon into more GPUs, or will they take those silicon savings and put it towards Enterprise
Ding ding ding. Or towards enterprise GPUs. Sell the same piece of silicon for 4x-7x the consumer price; why wouldn't they?
Their chiplet design saves silicon and cuts costs, but will AMD put that saved silicon into more GPUs, or put it towards enterprise, where they consistently fail to meet OEM demand for Threadrippers and EPYCs? Will they try to take a larger share of the GPU market, or take the safe bet and apply it to the enterprise market, where they are guaranteed sales and high margins?
I (and probably most of us here) want AMD to take the fight to Nvidia in the GPU space, but investors and shareholders will want them to go after Enterprise because it gets them better returns now and isn't seeing a reduction in demand.
So I don't at all question AMD's ability to deliver, I question their priorities.
> Well they are using different nodes this time from the cpu and part of the gpu, so that should allow them to dedicate more to the gpu side this time without coming at the cost of cpu production. Also since the new cpu's are not flying off the shelves I think they will just dedicate more production to the server side instead of desktop. But reality is you never want to have more then what the market demands, so being slightly under market demand is the goal to maintain maximum prices. So the answer is maybe? Just depends on how much more demand there is for the enterprise side and how much they want to actually supply in that market.
Well, for the last two years AMD failed to meet both Dell's and Lenovo's demand for EPYCs and Threadrippers; both had to push customers to Xeons because AMD could not get them the chips. So there was certainly a failure to meet demand in those spaces.
> Waiting for single slot with a preinstalled water block.
Same. Or a Heatkiller block for the reference GPU. I'm sure there will be a Liquid Devil, which are usually great, but I just don't like the 'devil' on the front of the GPU. Either way I'm getting rid of the 3090 and going 7900 XTX. Jensen can eat shit with the stupid pricing of the 4090 and, IMO, probably the worst-priced GPU ever with the 4080.
> I just don't like the 'devil' on the front of the GPU.
If you close your eyes, he cannot hurt you.
> They should have called it the 7800, not 7900.
I would lean harder into the "it's just a name" part.
Makes them look impotent. I know it's just a name....
> But people are already complaining about a 4080 costing 1200 but not a 4090 costing 1600....
Eh, I think it has just as much to do with paying more money for the scaling, instead of an equal amount, as with the 2-series and to some extent the 3-series. nVidia more or less took away all the multiplicative cost-vs-performance gains people usually get when moving up a generation. That's more or less what the GN video was about.
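That "multiplicative gains" complaint is simple arithmetic. A tiny sketch with entirely made-up performance and price figures (not real benchmark or MSRP data) shows how raising the price in step with performance leaves perf-per-dollar flat or even worse:

```python
# Hypothetical numbers only: (relative performance, launch price in $).
cards = {
    "last-gen 80-class, $700": (100, 700),
    "new-gen 80-class, $1200": (150, 1200),  # 50% faster, ~71% pricier
}

for name, (perf, price) in cards.items():
    # Perf per $100 spent: the generational gain buyers usually expect
    print(f"{name}: {perf / price * 100:.1f} perf per $100")
# If price rises faster than performance, perf/$ goes DOWN gen over gen,
# even though the new card is absolutely faster.
```

With these toy numbers the newer, faster card actually delivers less performance per dollar, which is exactly the pattern people object to.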