MLID: AMD RX 8800 XT Leak, RDNA 4 Ray Tracing, Release Date, Price

I mean chiplet crossfire ofc. Not 2 physical GPUs

There is an AMD RDNA 4 patent for 3 chiplets in crossfire mode inside the same GPU
Ah, I have PCI Express in mind when I hear CrossFire. I doubt the patent mentions anything specific to RDNA 4, or RDNA in general, if this is the one mentioned:
https://www.freepatentsonline.com/20240193844.pdf

It seems really generic; there is no doubt that all the giants have been looking at this forever. It's a bit like the talk that started a while ago about the PS5 Pro, based on a 2019 patent from Sony: https://www.freepatentsonline.com/y2020/0242723.html, which is extremely generic (multi-GPU on the same die, chiplets, etc., for cloud gaming servers and consoles).

Could be, but I am not sure the patent or its absence means much. There was a leaked design for the cancelled top-end RDNA 4, and the MI300 uses chiplets in an extreme way (13 chiplets), so they are gaining a lot of experience in that regard. Maybe 3D stacking could help: if you put one GPU over the other, that gives a lot of surface area for them to talk to each other and reduces the latency issue (or a giant cache covering both that they can share, like they do with HBM memory on the MI300).
 
I doubt the patent mentions anything specific to RDNA 4, or RDNA in general, if this is the one mentioned:
https://www.freepatentsonline.com/20240193844.pdf
The patent itself doesn't mention which GPU it is, but MLID claims that it is RDNA 4.

Also refer below:

AMD’s canceled Radeon RX 8000 “Navi 4C” GPU diagram has been partially leaked (by MLID)


https://videocardz.com/newz/amds-ca...navi-4c-gpu-diagram-has-been-partially-leaked

 
Those all look like remote workstation configurations, so a different type of datacenter stuff, and Nvidia hasn't even announced the new version of them yet, let alone started taking orders.
Besides, the 5090 will be made from the chips that couldn't bin to be one of those. Given an 80% yield rate, that would generally mean 8 of every 10 chips produced get sold as the expensive workstation card, with 1 as a consumer card and 1 unusable.

Nvidia and AMD aren't artificially gimping silicon unless they've planned poorly. They let the natural binning process sort out their product lineup.
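To put that binning arithmetic in concrete terms, here's a minimal sketch (assuming the 80% yield figure above, and assuming half of the defective dies are salvageable as the consumer part; both numbers are illustrative, not Nvidia's actual bin split):

```python
# Hypothetical binning split per 10 large dies produced.
# Assumes 80% of dies bin as fully functional (workstation-bound)
# and half of the remainder are salvageable as a cut-down consumer part.
dies_produced = 10
yield_rate = 0.80  # illustrative figure from the post above

workstation = round(dies_produced * yield_rate)  # 8 of 10 -> workstation card
defective = dies_produced - workstation          # 2 of 10 fail full binning
consumer = defective // 2                        # 1 salvaged as the gaming card
unusable = defective - consumer                  # 1 scrapped

print(f"workstation: {workstation}, consumer: {consumer}, unusable: {unusable}")
# -> workstation: 8, consumer: 1, unusable: 1
```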
 
Yes, RDNA 4 was scaled back. They are just doing one chip and not going after the high end. The original plan was to go all in with an XTX-like part using 2-3 chips.

RDNA 5 may go that route if AMD chooses to. I imagine it will depend on whether they can gain some market share with 4. If not, who knows what the future of Radeon is. AMD has laid the groundwork to do high end with the same silicon at least... if the market will support it. Hard to blame them for not wanting to make another XTX-like low-margin part that eats twice the silicon allotment. Right now, anyway.
I don't know… if RDNA 4 doesn't turn around AMD's gaming numbers, I'm not confident there will be an RDNA 5.
 
I don't know… if RDNA 4 doesn't turn around AMD's gaming numbers, I'm not confident there will be an RDNA 5.
There will certainly be an RDNA 5 card, but it will likely be relegated to mid-tier. AMD is already reportedly contracted to do the PS6, which will almost certainly use some derivation of RDNA 5, and that's largely fungible with the PC market/PC cards. What we will see more of, however, is a sharper turn towards SoCs by AMD -- and so, even less of a reason to compete with nV in dGPUs.
 
There will certainly be an RDNA 5 card, but it will likely be relegated to mid-tier. AMD is already reportedly contracted to do the PS6, which will almost certainly use some derivation of RDNA 5, and that's largely fungible with the PC market/PC cards. What we will see more of, however, is a sharper turn towards SoCs by AMD -- and so, even less of a reason to compete with nV in dGPUs.
I don't know, the PS5 Pro is more like RDNA 3.5 than 4. I could easily see them just launching the PS6 with full RDNA 4. Unless it's far enough out that it's a UDNA thing.

Because remember, AMD is unifying RDNA and CDNA going forward.
 
I don't know… if RDNA 4 doesn't turn around AMD's gaming numbers, I'm not confident there will be an RDNA 5.
I think that about sums it up. From what I have heard, it sounds to me like they have the right idea. The problem with the 7900 XT and XTX was the price. AMD couldn't afford to do them a whole lot cheaper than they did, yet at that price point and performance level they were also not going to gain any real market share. It also means their lower-end parts were not as attractive. (I think it's telling that one of the best-selling 7000 parts is the made-for-China-market cut-down GRE... at a bit lower price point, gamers are willing to roll the dice on AMD.)

If the 8800 XT is what the rumors suggest... 4080 raster performance with at least 4070 Ti RT performance, at a great price. Maybe they gain some actual market share. I think aiming for the $1k price point would be a big mistake right now. When you start getting into that price point, most gamers will just choose to spend a little more for the NV part, even if the AMD part is superior at those prices, which I believe it is. The mindshare is too great. When you start getting down into the $500-600 range, people are a lot more willing to go AMD. Heck, even Intel was able to move cards once you get out of that ouch-too-much-to-take-a-chance price range. Hard to blame anyone dropping a grand on a GPU for just going with what they figure will be the reliable choice.
 
I don't know, the PS5 Pro is more like RDNA 3.5 than 4. I could easily see them just launching the PS6 with full RDNA 4. Unless it's far enough out that it's a UDNA thing.

Because remember, AMD is unifying RDNA and CDNA going forward.
I am not sure the whole RDNA/CDNA unification is as big a deal as we may expect. I think we'll get a much better idea when we see what RDNA 4 actually is when it launches. The much-improved RT performance is probably because they have already essentially moved the two together, even if it's not official or specifically the same silicon. I mean, the matrix math capabilities of the MI325 CDNA are essentially AMD's version of tensor cores. If RDNA 4 has been moved to work in much the same way, then at an architecture level they are already mostly unified. Perhaps what the AMD guy was really referring to with UDNA was that they want to get to a place where they really do use the same GPU chiplet silicon in both lines. Which makes a lot of sense: if you're going to use chiplets (which AMD is already doing with CDNA... and which we know they had planned to do with RDNA 4), what is the point if you're not either stacking them or using them across product lines? Ideally you're able to do both.

It will be interesting to see the arch breakdowns of RDNA 4 when they announce them. I feel like it will basically be UDNA 0.5, with the replacements for the MI325/8000 being the point where AMD fully utilizes one core GPU chiplet for everything. It is also how they can stay in the gaming sector without really spending a ton of money on gaming hardware design specifically.
 
Those all look like remote workstation configurations, so a different type of datacenter stuff, and Nvidia hasn't even announced the new version of them yet, let alone started taking orders.
I am not sure running the metaverse is akin to a remote workstation; they do remote workstations obviously, but also batch rendering and AI workloads.

If we end up with over a million gaming xx90s, I would imagine it is because they are making millions of those RTX 6000/L40s and just the ones that are not good enough go down the line to the gaming version. I am not sure the workstation (local or cloud) market alone would be big enough for that.

For a concrete example:
https://clustervision.com/genomics-workloads-powered-by-nvidia-l40-gpu/
Genomics work done with a supercomputer driven by L40s.

Supermicro seems to offer it for AI clusters, simulation clusters, etc. They do not offer it for the top-of-the-line largest-scale solutions, but just below that tier, there it is:
https://www.supermicro.com/en/solutions/ai-deep-learning

If you are not a very rich client, I can see it being a nicer price point (and a much shorter waiting list).
 
Why make $12B when you could make a multiple of that amount by using that same silicon for AI products instead? It literally doesn't make sense to "waste" it on gaming products. You have to use the defective dies for something…
That gaming revenue is not wasted if there is an upper limit to AI-based markets or if manufacturing yields can't be improved further.

So let's say that with increased AI product volume, there is "only" $6 billion of gaming revenue. That ain't exactly chicken feed.
 
I am not sure running the metaverse is akin to a remote workstation; they do remote workstations obviously, but also batch rendering and AI workloads.

If we end up with millions of gaming xx90s, I would imagine it is because they are making a lot of those RTX 6000/L40s and the ones that are not good enough go down the line to the gaming version. I am not sure the workstation (local or cloud) market alone would be big enough for that.

For a concrete example:
https://clustervision.com/genomics-workloads-powered-by-nvidia-l40-gpu/
Genomics work done with a supercomputer driven by L40s.

Supermicro seems to offer it for AI clusters, simulation clusters, etc. They do not offer it for the top-of-the-line largest-scale solutions, but just below that tier, there it is:
https://www.supermicro.com/en/solutions/ai-deep-learning

If you are not a very rich client, I can see it being a nicer price point (and a much shorter waiting list).
Yeah, but this far down the stack, TSMC can churn them out like nothing. Hell, GDDR7 is likely to have more constraints than TSMC for those chips. So Nvidia has the ability to crank out more to meet demand.
 
I am not sure running the metaverse is akin to a remote workstation; they do remote workstations obviously, but also batch rendering and AI workloads.

If we end up with millions of gaming xx90s, I would imagine it is because they are making a lot of those RTX 6000/L40s and the ones that are not good enough go down the line to the gaming version. I am not sure the workstation (local or cloud) market alone would be big enough for that.

For a concrete example:
https://clustervision.com/genomics-workloads-powered-by-nvidia-l40-gpu/
Genomics work done with a supercomputer driven by L40s.

Supermicro seems to offer it for AI clusters, simulation clusters, etc. They do not offer it for the top-of-the-line largest-scale solutions, but just below that tier, there it is:
https://www.supermicro.com/en/solutions/ai-deep-learning

If you are not a very rich client, I can see it being a nicer price point (and a much shorter waiting list).
Nvidia has no shortage of AI markets to saturate. This is what makes me doubt they are going to want to cast off a lot of 5090 dies.
Even the cut-down versions... we know Nvidia has designed cut-down versions for other markets, such as China. There are plenty of places to sell those dies at very high profit margins. The 5090 is coming, sure; I just don't know if I would expect massive supply. Even cast-off chips might be more valuable to NV than oversupplying 5090s.
 
If the 8800 XT is what the rumors suggest... 4080 raster performance with at least 4070 Ti RT performance, I think aiming for the $1k price point would be a big mistake right now.
That would be pretty much exactly what a 7900 XTX is in terms of performance (a tiny bit slower raster, a tiny bit faster RT)?

They are often around ~$900 right now; going over that would seem like going backward for sure. That card, with 24 GB of VRAM, launched for $1,000 two years ago.

The 5070 will probably be around the card you describe (maybe a bit faster in RT). If the 5070 launches at $599, that will be the mark to beat, and the 5070 will have DLSS 4.0, whatever that will be. That could mean the described 8800 XT needs to be $550 or less. An issue is that it seems they cannot launch it before the 5070, Nvidia being ready.
 
That gaming revenue is not wasted if there is an upper limit to AI-based markets or if manufacturing yields can't be improved further.

So let's say that with increased AI product volume, there is "only" $6 billion of gaming revenue. That ain't exactly chicken feed.
True... that's the problem for Nvidia, who is now the highest-valued company in the world. Adding 6 billion in revenue at 20 points lower margin... hurts their stock more than it helps. Better to feed 3 billion of that market at 20 points more. I would be happy to be wrong, but I feel like NV is about to drop the most expensive gaming GPU ever made... and stock is not going to be plentiful.
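To put rough numbers on that margin trade-off, here's a minimal sketch; every figure is made up for illustration (a $90B base business at 75% gross margin, gaming at 55% vs. 75%):

```python
# Toy blended-margin comparison; all figures are illustrative assumptions.
base_rev, base_margin = 90e9, 0.75  # assumed base business

def blended(extra_rev, extra_margin):
    """Gross profit and blended margin after adding an extra revenue stream."""
    profit = base_rev * base_margin + extra_rev * extra_margin
    return profit, profit / (base_rev + extra_rev)

# Option A: add $6B of gaming revenue at 20 points lower margin (55%).
profit_a, margin_a = blended(6e9, 0.55)
# Option B: add only $3B of gaming revenue, but at the high margin (75%).
profit_b, margin_b = blended(3e9, 0.75)

print(f"A: ${profit_a / 1e9:.2f}B gross profit, {margin_a:.1%} blended margin")
print(f"B: ${profit_b / 1e9:.2f}B gross profit, {margin_b:.1%} blended margin")
# Option A earns slightly more in absolute terms (~$70.8B vs ~$69.8B) but
# drags the blended margin down (~73.8% vs 75.0%) -- which is what a stock
# priced on margins tends to punish.
```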
 
That would be pretty much exactly what a 7900 XTX is in terms of performance (a tiny bit slower raster, a tiny bit faster RT)?

They are often around ~$900 right now; going over that would seem like going backward for sure. That card, with 24 GB of VRAM, launched for $1,000 two years ago.

The 5070 will probably be around the card you describe (maybe a bit faster in RT). If the 5070 launches at $599, that will be the mark to beat.
Possibly. We'll see how much Nvidia cuts those cards down. I'm sure they will both play a little chicken on announcements and pricing. Hopefully gamers get some good options from both companies. My expectation is that NV is going to wow on the 5090 side and disappoint anyone in the market for a strong 70-class card. I would like to be wrong.
 
True... that's the problem for Nvidia, who is now the highest-valued company in the world. Adding 6 billion in revenue at 20 points lower margin... hurts their stock more than it helps. Better to feed 3 billion of that market at 20 points more. I would be happy to be wrong, but I feel like NV is about to drop the most expensive gaming GPU ever made... and stock is not going to be plentiful.
But it also insulates them against any bubble effects that AI may bring, if that bubble pops then that gaming segment is still a solid foundation to land on.
 
But it also insulates them against any bubble effects that AI may bring, if that bubble pops then that gaming segment is still a solid foundation to land on.
If the bubble pops, frankly, they will be more screwed than Intel currently is. As much as 8 billion dollars sounds like insane revenue... to the highest-valuation company in the world, it's probably under 5% of total revenue by this time next year. If the bubble pops, Nvidia will be in serious trouble as a company. That is the downside to their current valuation. I don't see anything popping and Nvidia having to worry, but in the case that their data center business craters... Nvidia would drop so hard and so fast that someone would buy 'em up.
 
But it also insulates them against any bubble effects that AI may bring, if that bubble pops then that gaming segment is still a solid foundation to land on.
The gaming segment is too small (what is it currently, about 10-15% of their total business?), and they already have the majority of the gaming segment, so they can't grow that business substantially. If 85-90% of their business evaporates (not saying the AI bubble is going to pop for certain), gaming is not going to save them. They'll have to pivot to some other business or shrink drastically.
 
True... that's the problem for Nvidia, who is now the highest-valued company in the world.

I guess even Google can't acquire them right now.

Wonder if Jensen is thinking about taking the company private.
Adding 6 billion in revenue at 20 points lower margin... hurts their stock more than it helps.
How much does Jensen care at this point?
Better to feed 3 billion of that market at 20 points more. I would be happy to be wrong, but I feel like NV is about to drop the most expensive gaming GPU ever made... and stock is not going to be plentiful.
Stock split anyone?
 
If the 8800 XT is what the rumors suggest... 4080 raster performance with at least 4070 Ti RT performance, at a great price. Maybe they gain some actual market share. I think aiming for the $1k price point would be a big mistake right now. When you start getting into that price point, most gamers will just choose to spend a little more for the NV part, even if the AMD part is superior at those prices, which I believe it is. The mindshare is too great. When you start getting down into the $500-600 range, people are a lot more willing to go AMD. Heck, even Intel was able to move cards once you get out of that ouch-too-much-to-take-a-chance price range. Hard to blame anyone dropping a grand on a GPU for just going with what they figure will be the reliable choice.
While I don't disagree that aiming to provide last-gen high end at this-gen mid-range pricing is good, I can't help but feel the exit from trying to compete at the high-end/enthusiast tier will just continue to hurt AMD in the long run.

This isn't the first time they have tried the "sweet spot" strategy, and ultimately Nvidia always gained market share the other times they tried it. The HD 4000 and HD 5000 series come to mind: a fantastic time for Radeon and one of the times we saw some good pricing wars, but ultimately Nvidia continued to outsell and outgain AMD in the market. The next time that happened was with the 290X, which put a stop to Nvidia sandbagging HARD on GK110 during the Kepler years, and you got the 780 Ti and some pricing adjustments. The RX 480 and RX 580 I guess sold pretty well to that market, but I'm not sure they really moved the needle any for AMD.

Since then, though? Nvidia doesn't even see Radeon as being on the same level as them at this stage. The only way AMD is going to have any power to break the Nvidia mindshare is actually leapfrogging Nvidia with something of their own that's revolutionary to gaming graphics that Nvidia can't do, AND competing at the high-end/enthusiast tiers.

Just my opinion, could be wrong, but I think AMD's ultimate problem is the perception that they are the second-rate graphics company always playing catch-up. Also, AMD as usual never misses an opportunity to miss an opportunity, and pricing the RDNA 3 7900 XTX at $999 with the 7900 XT at $899 was just bad press from the start. They sure know how to shoot themselves in the foot at every turn - not to mention clowning on Nvidia over the 12VHPWR and then immediately having a bunch of bad vapor chambers on their own coolers at launch. Just terrible, AMD.
 
If the 8800 XT is what the rumors suggest... 4080 raster performance with at least 4070 Ti RT performance, at a great price. Maybe they gain some actual market share. I think aiming for the $1k price point would be a big mistake right now.
I mean, the 4080 exists and could easily sell at $999; it's not like the 5000 series is likely to add anything new that the 4000 series can't do. And RDNA 4 isn't going to offer anything that Nvidia can't do, so it becomes a choice: do I buy a 4080, which is a known entity and a safe, solid choice that does all the things? Or do I buy the AMD?

That card from AMD, if it's within 15% of the 4080's price, is something you would need to think about, but 20-30% cheaper becomes a no-brainer.

But in a race to the bottom, the winner is the team with the deeper pockets.
 
While I don't disagree that aiming to provide last-gen high end at this-gen mid-range pricing is good, I can't help but feel the exit from trying to compete at the high-end/enthusiast tier will just continue to hurt AMD in the long run.

This isn't the first time they have tried the "sweet spot" strategy
I was going to say: depending on what people mean by last-gen high end and this-gen mid-range... isn't that the norm?

The 1060 6GB was close to 980 performance.
The 2060 had 1080 performance; would that have been an example of mid-range getting last-gen high-end performance (from one of the most acclaimed high-end GPUs ever)? How fondly do people remember that 2060 launch?
The 6700 XT/3070 had 2080 Ti performance.
The 4070 Super is quite close to the 3090/6950 XT, and the 7800 XT is close to the 6900 XT.

If AMD did try to compete at the high end, I am not sure how or why their mid-range would not have been 7900 XT/XTX type performance. There seems to be some spin about that strategy, or its possible results, that oversold it a little bit. Would people find it strange, and think Nvidia is giving too much performance in the 5070/5070 Super tier, if those are a little bit faster than a 7900 XT?

That card from AMD, if it's within 15% of the 4080's price, is something you would need to think about, but 20-30% cheaper becomes a no-brainer.
Not if the 5070 is 20-30% cheaper than the 4080 super with similar performance.
 
I mean, the 4080 exists and could easily sell at $999; it's not like the 5000 series is likely to add anything new that the 4000 series can't do. And RDNA 4 isn't going to offer anything that Nvidia can't do, so it becomes a choice: do I buy a 4080, which is a known entity and a safe, solid choice that does all the things? Or do I buy the AMD?

That card from AMD, if it's within 15% of the 4080's price, is something you would need to think about, but 20-30% cheaper becomes a no-brainer.

But in a race to the bottom, the winner is the team with the deeper pockets.
If AMD tries to sell an 8800 XT at a grand, yeah, they lose of course.
Everything is pointing more toward $500-600. (Or AMD will do their normal stupid thing of launching it at $750 and dropping it to $675 after the reviews lambast its poor value.)
Guess we'll have to wait and see, both performance and price. I agree that if AMD isn't a massive value win, they die. I don't think they have to do anything new or unique; they need to deliver performance improvements at cost-saving prices. I doubt Nvidia massively slashes 4080 prices, and the 5080 is almost for sure going to cost more than the 4080 did. A 5070, depending on how Nvidia has sliced it, may end up being anemic. From what I have seen rumor-wise, I believe it may well be the case that Nvidia drops a 5070 that loses in RT benchmarks to the 8800 XT. So if AMD prices right, they may have a real winner. We'll see; we have been down this road with AMD before. We'll just have to see what they actually release, and if its perf and cost really do put them into a sweet spot that actually pushes gamers to choose them over Nvidia. The stars need to align a bit: they need to execute and hope that Nvidia really has decided to feed gamers scraps. (Rumors for the NV 5000s are all over; some say big gains, others say be ready to be disappointed.)

It would be nice if AMD were the master of their own destiny here, but we all know the truth: if Nvidia delivers good gains on 5070 parts and ships them at reasonable, if not pure value, price points, they win. Nvidia has to pull what they pulled last gen once more, with 70-class cards that should really be 60s. I have a feeling that is exactly what is going to happen, but yeah, that isn't in AMD's hands. We'll just have to wait and see. lol :)
 
I mean, the 4080 exists and could easily sell at $999
Not that easily when you can buy the 4080 Super at $999 instead ;)... not a big upgrade, but why not.... $999 Gigabyte 4080 Supers are in stock at Amazon and Best Buy right now.

As for the 5000 series adding something the 4000 will not: DLSS-like generative AI that starts from low-quality assets or compressed textures, instead of just upscaling the final image, could be a 5000-series exclusive, for actual or artificial reasons.
 
Just my opinion, could be wrong, but I think AMD's ultimate problem is the perception that they are the second-rate graphics company always playing catch-up. Also, AMD as usual never misses an opportunity to miss an opportunity, and pricing the RDNA 3 7900 XTX at $999 with the 7900 XT at $899 was just bad press from the start. They sure know how to shoot themselves in the foot at every turn - not to mention clowning on Nvidia over the 12VHPWR and then immediately having a bunch of bad vapor chambers on their own coolers at launch. Just terrible, AMD.
Don't disagree with anything you said. But overall AMD has done quite well under Lisa Su.

I'm going to guess that GPUs are the red-headed stepchild at AMD, and the only reason they are still in the game is the game console business. The consumer GPU business may not have A-list players in charge. Just a guess, of course.
 
But it also insulates them against any bubble effects that AI may bring, if that bubble pops then that gaming segment is still a solid foundation to land on.
Keep in mind that just making money isn't enough for companies in 2024. If the AI market crashes and the need for Nvidia GPUs dries up, then what's going on with Intel will pale in comparison. Nvidia was already propped up by crypto for many years, and this affected their stock positively. The fact that Nvidia went from a crypto company to an AI company was just amazing luck. Nvidia isn't going to get the 1000% profit on gaming GPUs like they do with AI. If anything, gamers have been complaining that gaming GPUs have been overpriced. I'm not sure when, but at some point there's going to be an Nvidia shareholder who's going to be left holding the worst bag in history.
 
Don't disagree with anything you said. But overall AMD has done quite well under Lisa Su.

I'm going to guess that GPUs are the red headed stepchild in AMD and the only reason they are still in the game is because of game console business. The consumer GPU business may not have A-list players in charge. Just a guess of course.
Oh, I know they have. I mean, I'm using a Ryzen-based system after being on Intel for like 15 years.

But Radeon division? Yeah I get the sense you are right.
 
The idea from that video, that a Q1 2025 RDNA 4 launch was some disinformation campaign to throw off Nvidia and that it would actually launch in 2024 with high volume, was maybe ........ wrong.
 
The idea from that video, that a Q1 2025 RDNA 4 launch was some disinformation campaign to throw off Nvidia and that it would actually launch in 2024 with high volume, was maybe ........ wrong.
AMD can throw all the disinformation they want, but the AIBs leak data like there is no tomorrow.
Hell, the AIBs likely include Nvidia on the monthly news blast about upcoming work schedules.
 
AMD can throw all the disinformation they want, but the AIBs leak data like there is no tomorrow.
Hell, the AIBs likely include Nvidia on the monthly news blast about upcoming work schedules.
Doesn't Nvidia have some means to limit the data leaked by AIBs? Or are the AIBs a bunch of bottom-dwelling slime and sleaze?
 
AMD can throw all the disinformation they want, but the AIBs leak data like there is no tomorrow.
Hell, the AIBs likely include Nvidia on the monthly news blast about upcoming work schedules.
What makes it funny is that the premise of that (old by now) video seems to be that leakers are just a tool of disinformation used by the companies to try to mislead their opposition, while the video itself makes a lot of rumors and predictions that did not happen (like an AMD GPU with 7900 XTX performance that would have launched this past October for $500... and used only 210 watts to do said performance...).

And like my first post in the thread says, he builds his premise on old RDNA 2 leaks, which he says were AMD throwing disinformation to mix up Nvidia, but the leaks he uses as examples of a disinformation campaign were all pretty much 100% exact.

Doesn't Nvidia have some means to limit the data leaked by AIBs? Or are the AIBs a bunch of bottom-dwelling slime and sleaze?
It would be hard; it starts to involve a lot of people in the know who are not paid enough to have to keep things quiet. That's why the stuff a lot of people need to know, like memory bus size, die size, and cooler size, tends to leak well, while what fewer people need to know, like performance, leaks far less.
 
Doesn't Nvidia have some means to limit the data leaked by AIBs? Or are the AIBs a bunch of bottom-dwelling slime and sleaze?
Bottom-dwelling sleaze; 90% of leaks in the tech industry come from AIBs.
 
The idea from that video, that a Q1 2025 RDNA 4 launch was some disinformation campaign to throw off Nvidia and that it would actually launch in 2024 with high volume, was maybe ........ wrong.
If AMD did that then I'd be amazed. I swear that AMD has some employees who also work for Nvidia.
 
If AMD did that then I'd be amazed. I swear that AMD has some employees who also work for Nvidia.
We know for a fact that they did not; Lisa Su said in the last quarterly results that RDNA 4 will launch in Q1 2025.
 
If AMD did that then I'd be amazed. I swear that AMD has some employees who also work for Nvidia.
You don't buy their baiting claims? lol

As much as I like AMD and want to support their GPU division, you can't help but laugh at the idea that AMD is going to surprise Nvidia with some super card, a months-early launch of a part, or even a launch price point. People like to believe that sort of stuff... it can be fun, but it can also make you seem a little delusional. I must admit, when AMD was bragging about baiting them a few years ago... I found it funny, but it was also very cringe. As was the trolling on the power connector stuff. I believe NV made a terrible decision on the power connector, but AMD doing the teasing marketing was just stupid.

AIB leakers aside, the truth is engineers go back and forth between Intel, AMD, Nvidia, and other companies so much that few big secrets really are all that secret. A quarter of the industry's GPU engineers probably play in the same weekend D&D games. lol
 
You don't buy their baiting claims? lol

As much as I like AMD and want to support their GPU division, you can't help but laugh at the idea that AMD is going to surprise Nvidia with some super card, a months-early launch of a part, or even a launch price point. People like to believe that sort of stuff... it can be fun, but it can also make you seem a little delusional. I must admit, when AMD was bragging about baiting them a few years ago... I found it funny, but it was also very cringe. As was the trolling on the power connector stuff. I believe NV made a terrible decision on the power connector, but AMD doing the teasing marketing was just stupid.
That's immature and unprofessional. People who do that sort of thing should have their asses fired out of the company. Brand damage and all that stuff.
 
...you can't help but laugh at the idea that AMD is going to surprise Nvidia with some super card, a months-early launch of a part, or even a launch price point. People like to believe that sort of stuff... it can be fun, but it can also make you seem a little delusional...
What year is this!? I would also like x1800 ...ahem, 3870 ...ahem, 4870 ...cough, POLARIS (all 86 pages of posts in this forum) ...cough, Vega 10 ...no wait, Vega 20 to come in faster and cheaper than the competition.
 