AMD RX 5300 belatedly appears

DanNeely

It's a slightly cut-down RX 5500: the same number of shader cores at slightly lower clock speeds (343 MHz lower base, 200 MHz lower boost), but only three memory controllers and thus only 3 GB of GDDR6 instead of 4 GB. TDP comes in at 100 W, down from 130 W.

Tom's speculates it might be OEM-only, which fits with a release shortly before the next generation comes out. It doesn't always happen, but this is when weird OEM-only cards appear to use up the dregs from die harvesting. If it were intended for a broader consumer release, I'd expect it to be cut a bit further to hit the 75 W point and not need an extra power connector.

The use of GDDR6 means it has more bandwidth than the 5300 XT with 4 GB of GDDR5, but the higher power needed for it is presumably why it's at a lower clock rate. I am curious whether the use of GDDR6 was primarily to keep the GPU from becoming memory-starved, or if GDDR5 production is winding down to the point that lack of availability was the driving factor in the change.
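For anyone curious, here's a quick back-of-envelope sketch of that bandwidth difference. The bus widths follow from the memory-controller counts (3 x 32-bit vs. 4 x 32-bit); the 14 Gbps GDDR6 and 8 Gbps GDDR5 per-pin rates are my assumptions rather than confirmed specs:

```python
# Back-of-envelope peak memory bandwidth: GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
# Assumed per-pin rates: 14 Gbps GDDR6 and 8 Gbps GDDR5 (guesses, not confirmed specs).

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RX 5300: 3 x 32-bit memory controllers = 96-bit bus of GDDR6
rx5300 = peak_bandwidth_gb_s(96, 14.0)    # ~168 GB/s
# RX 5300 XT: 4 x 32-bit memory controllers = 128-bit bus of GDDR5
rx5300xt = peak_bandwidth_gb_s(128, 8.0)  # ~128 GB/s

print(f"RX 5300 (96-bit GDDR6):     {rx5300:.0f} GB/s")
print(f"RX 5300 XT (128-bit GDDR5): {rx5300xt:.0f} GB/s")
```

Even with one fewer controller, the faster memory comes out comfortably ahead under those assumptions.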

https://www.tomshardware.com/news/amd-sneaks-out-the-radeon-rx-5300-for-casual-1080p-gaming

https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units#Radeon_RX_5000_series
 
Maybe it's just me, but this is a disappointment. With Zen 2 APUs nowhere to be found in retail, dedicated graphics are needed if you want Zen 2 (or, say, 6/8/12/16 cores). Where are the "low power" 2D graphics cards for people who need > 4c/8t (AMD) and want an AMD video solution that isn't ancient and/or doesn't require an extra power connector?
 
Maybe it's just me, but this is a disappointment. With Zen 2 APUs nowhere to be found in retail, dedicated graphics are needed if you want Zen 2 (or, say, 6/8/12/16 cores). Where are the "low power" 2D graphics cards for people who need > 4c/8t (AMD) and want an AMD video solution that isn't ancient and/or doesn't require an extra power connector?
What's the size of this market?
 
Yeah, that's an extremely niche user base. If you're doing it for video encode/decode you could use an Intel proc, or a first/second-gen AMD APU, or wait for third-gen parts, or use a Zen 2 proc with an Nvidia card. The customer base for low-power Zen 2 with a low-power RDNA GPU has got to be pretty small.

I understand the frustration though. We don't really get a "top to bottom" refresh anymore; at the low end we just get rebadges for a few generations, and then every once in a while the parts are refreshed with something from the last-generation architecture (the RX 550 after years of HD 7750 cards being the only AMD option).

What I don't understand is AMD's continued insistence on separating CPUs from APUs. It would be nice if all their procs had some basic graphics functionality - Vega 3 or something. Just a small block to drive displays and play video. Then they could differentiate the APUs by increasing their spec some.

And then maybe the market for sub-75 W GPUs would just dry up and they wouldn't need to spend the "effort" they already do on the low end anymore.
 
What's the size of this market?

In the OEM market it could be significant: they could sell all-AMD rigs with a 6+ core processor and a dedicated GPU while keeping them on a 350 W PSU for cost savings. Outside of that it would be a pretty niche market for people who build their own systems. It could also just be that they have a ton of extra Navi 14 chips they need to get rid of.

What I don't understand is AMD's continued insistence on separating CPUs from APUs. It would be nice if all their procs had some basic graphics functionality - Vega 3 or something. Just a small block to drive displays and play video. Then they could differentiate the APUs by increasing their spec some.

And then maybe the market for sub-75 W GPUs would just dry up and they wouldn't need to spend the "effort" they already do on the low end anymore.

Complexity and cost are why they do it. The dies are far less complex to manufacture, with fewer points of failure and less waste, and it lets them produce a one-chip-fits-all stack. Everything from the 3100 to the Epyc 7H12 uses the same chiplet. Try doing that with something that's also supposed to be an APU.
 
In the OEM market it could be significant: they could sell all-AMD rigs with a 6+ core processor and a dedicated GPU while keeping them on a 350 W PSU for cost savings. Outside of that it would be a pretty niche market for people who build their own systems. It could also just be that they have a ton of extra Navi 14 chips they need to get rid of.
But OEMs have this: https://www.amd.com/en/products/apu/amd-ryzen-3-4300ge
I guess I just don't see the point.
 
/shrug. I'm not a chip designer, I'm just saying - they've got a bunch of talented people there who can probably figure it out. Why they haven't tried to bolt a tiny GPU onto the I/O die or something, and have instead created this weird segmentation in their product line (which is going to get even more bizarre when they have 8C/16T desktop APUs right alongside 8C/16T processors without graphics), seems strange to me. Early shots of 8C/16T Zen 2 chips even showed a whole "chiplet" worth of spare space on the package; surely someone at AMD is looking into a graphics chiplet as a possibility?

Like why have an 8C/16T processor with graphics and one without? At a certain point you're eating into your own customer base there - hmm, do I want to get the 3700X with slightly higher clocks and no graphics, or the 4700G with slightly lower clocks and graphics? The 4700G would mean if my GPU ever dies I can keep using the PC, but the 3700X might be a little quicker, hmm....

And unless they change their naming scheme we'll end up with the Ryzen 4700G (Zen 2 + APU) and the Ryzen 4700X (Zen 3 without APU), with nothing really obvious signifying that they're different generations with different capabilities, making it even more confusing for people just getting into the DIY scene, wondering why their 4700 doesn't work with the HDMI output on their motherboard but their friend's is just fine.

Anyway, this is way off topic. I'm just hoping they sort out the lineup and make some kind of basic graphics universal in their processors - it's the one area where Intel still has them, and I think it's why Intel still has so much of the market in pre-builts.
 
/shrug. I'm not a chip designer, I'm just saying - they've got a bunch of talented people there who can probably figure it out. Why they haven't tried to bolt a tiny GPU onto the I/O die or something, and have instead created this weird segmentation in their product line (which is going to get even more bizarre when they have 8C/16T desktop APUs right alongside 8C/16T processors without graphics), seems strange to me. Early shots of 8C/16T Zen 2 chips even showed a whole "chiplet" worth of spare space on the package; surely someone at AMD is looking into a graphics chiplet as a possibility?

Could be a limitation of the AM4 socket pinout, given where the iGPU power plane is located. They pretty much shot down any chance of that happening immediately when the chiplet design was announced, so there has to be some type of hardware limitation.

Like why have an 8C/16T processor with graphics and one without? At a certain point you're eating into your own customer base there - hmm, do I want to get the 3700X with slightly higher clocks and no graphics, or the 4700G with slightly lower clocks and graphics? The 4700G would mean if my GPU ever dies I can keep using the PC, but the 3700X might be a little quicker, hmm....

A better way to look at it is... would you buy a 3700X without an iGPU for 350 dollars, or a 3700X with an iGPU that costs 400 dollars?

And unless they change their naming scheme we'll end up with the Ryzen 4700G (Zen 2 + APU) and the Ryzen 4700X (Zen 3 without APU), with nothing really obvious signifying that they're different generations with different capabilities, making it even more confusing for people just getting into the DIY scene, wondering why their 4700 doesn't work with the HDMI output on their motherboard but their friend's is just fine.

Anyway, this is way off topic. I'm just hoping they sort out the lineup and make some kind of basic graphics universal in their processors - it's the one area where Intel still has them, and I think it's why Intel still has so much of the market in pre-builts.

I can't remember exactly when it was, maybe during one of the Zen+ announcement events, but I remember someone from AMD mentioning that the APU naming scheme was a complete mistake and should never have happened the way it did. Because the first-gen APUs were a hybrid of the Zen and Zen+ architectures (a Zen die with the Zen+ boost algorithm), they originally started them at the 2000-series branding. I could be wrong though and the person was full of shit.


But OEMs have this: https://www.amd.com/en/products/apu/amd-ryzen-3-4300ge
I guess I just don't see the point.

You'd be amazed. As a salesperson you can point out that system X with an iGPU is all-around better than system Y with a dedicated GPU, and they'll take the one with the dedicated GPU because they saw some YouTube video or article about dedicated GPUs being better than iGPUs, lol.

Also think about all those parents buying PCs for their kids, not remembering what their kid asked for, just that it was some 2000-series GPU. Well, 5000 is better than 2000 and it's under 800 bucks, good enough.
 
What I don't understand is AMD's continued insistence on separating CPUs from APUs. It would be nice if all their procs had some basic graphics functionality

This is expected in the 2022 "Raphael" model

https://hardforum.com/threads/amd-ryzen-in-2021-warhol-van-gogh-and-cezanne.2000402/post-1044696431

If I got this right

Desktop Roadmap:
  • Matisse — (Zen 2, no iGPU, AM4 socket) current version
  • Vermeer — (Zen 3, no iGPU, AM4 socket) expected by end of this year
  • Warhol — (Zen 3+, no iGPU, AM5 socket?) expected next year
  • Raphael — (Zen 4 + Navi RDNA(2?), AM5 socket?) expected in 2022
Mobile Roadmap:
  • Renoir — (Zen 2 + optimised Vega) released at the beginning of this year
  • Cezanne — (Zen 3 + further optimised Vega) expected end of this year!?
  • Rembrandt — (Zen 3 + Navi RDNA2) expected by end of next year
Ultra Portable Roadmap:
  • None so far
  • Van Gogh — (Zen 2 + Navi RDNA2) expected by end of this year
  • Dragon Crest? — (Zen(4?) + Navi RDNA(3?)) expected by end of next year

EDIT:
(updated with the information below)

CoffeeFox (@CoffeeFox3) Tweeted:
https://twitter.com/CoffeeFox3/status/1297225481331175424?s=20

h/t erek
 
This is almost the card I want.... I eagerly await a low-profile, single-slot GPU that doesn't need external power so I can upgrade my MAME cabinet.
I build them using retired surplus OptiPlexes, so I'm very limited on space and power. Also super niche, but the profile isn't.
Edit:
Currently using the 1650s, which aren't bad but not quite there.
 
This seems like a slap in the face to consumers. The 5500 XT was bad enough, costing more than an RX 580 while not offering better performance. The introduction of the 5300 shows that AMD isn't about to introduce a sub-$250 card with ray-tracing capabilities. Nvidia doesn't seem to be either, and I don't know what crack pipe they were smoking when the RTX 3060 is rumored to be $400. Long live the RX 580/GTX 1060 cards, because those owners have no reason to start looking to upgrade for some years now. If they think people are going to save money or work extra hours during a recession to buy these things, then AMD and Nvidia are going to get a rude awakening.
 
I could see this working well as an OEM card for Intel-based machines, and as a slightly better option for AMD APUs.

There's always been a market for these cheap cards, and a big one. They've got extra glass and they're selling it. What's the hubbub?
 
This seems like a slap in the face to consumers. The 5500 XT was bad enough, costing more than an RX 580 while not offering better performance. The introduction of the 5300 shows that AMD isn't about to introduce a sub-$250 card with ray-tracing capabilities. Nvidia doesn't seem to be either, and I don't know what crack pipe they were smoking when the RTX 3060 is rumored to be $400. Long live the RX 580/GTX 1060 cards, because those owners have no reason to start looking to upgrade for some years now. If they think people are going to save money or work extra hours during a recession to buy these things, then AMD and Nvidia are going to get a rude awakening.

I'm not sure how you can draw any conclusions about AMD's plans for Big Navi from their dumping bad dies into a presumed OEM-only card at the end of the previous generation.

Flames about NVidia's rumored pricing belong in other threads.

I wouldn't be surprised if the lower-end cards for this generation from both companies either don't have ray tracing, or only offer token levels of it (e.g. RTX 2xxx equivalents). The performance gap between shading and ray tracing in the RTX 2000 series (a 50/50 split in die area between the two functions) suggests that to do RT without killing FPS, the RT cores need to take about 3-4x as much area as the shader cores do (that, or gen 2 RT cores need a massive performance-per-transistor increase). That's a big enough jump that, even with the die shrinks coming, it might still be too much for the cheapest cards. We probably won't know for a few months more, because the cards in that market segment won't be released until a while after their higher-performing siblings.
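To make that area math concrete, here's a tiny sketch of the reasoning. The ~3.5x slowdown with RT enabled and the assumption that RT throughput scales roughly linearly with the die area given to it are illustrative round numbers, not measured figures:

```python
# Illustrative sketch of the die-area argument: Turing spends roughly half its die on
# shading and half on RT, yet enabling RT still cuts frame rates to a fraction.
# The 3.5x slowdown and the linear area -> throughput scaling are assumptions.

shader_area = 1.0        # normalise shader area to 1 unit
rt_area = 1.0            # ~50/50 split in the RTX 2000 series, per the post
rt_slowdown = 3.5        # assume FPS drops to roughly 1/3 - 1/4 with RT enabled

# If RT throughput scales ~linearly with RT die area, matching raster frame rates
# would need roughly rt_slowdown times as much RT hardware:
rt_area_needed = rt_area * rt_slowdown

print(f"RT area needed vs. shader area: ~{rt_area_needed:.1f}x")
print(f"Relative die size vs. today:    ~{(shader_area + rt_area_needed) / (shader_area + rt_area):.2f}x")
```

Under those assumptions the die roughly doubles, which is exactly the kind of jump that's hard to absorb on a budget-sized chip even after a node shrink.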

If your finances or employment are hanging on by a thread, you probably shouldn't buy a new card anytime soon, regardless of what the new ones cost. But many people whose employment is safe are going to spend like normal, and a lot of gamers probably decided to save their stimulus for a new GPU in the fall because their CPU was fast enough that they didn't think upgrading to Ryzen was a good use of their windfall this spring.
 
I'm not sure how you can draw any conclusions about AMD's plans for Big Navi from their dumping bad dies into a presumed OEM-only card at the end of the previous generation.
Because AMD should be releasing all new products based on RDNA 2.0. If you want ray tracing to take off, even as an RTX 2080 Ti owner, then sub-$250 cards must have ray tracing. There's a reason why an RTX 2080 Ti owner has half a dozen games with ray tracing, and one is Quake 2 and the other is Minecraft. Consoles will bring ray tracing to modern games, but that depends on whether this generation of consoles sells, and right now nobody has any official pricing. I can't stress this enough, but sub-$250 GPUs must have ray tracing in order for most new games to have this feature. Game developers are not going to include a feature for the few people who own RTX or Big Navi GPUs.
I wouldn't be surprised if the lower-end cards for this generation from both companies either don't have ray tracing, or only offer token levels of it (e.g. RTX 2xxx equivalents). The performance gap between shading and ray tracing in the RTX 2000 series (a 50/50 split in die area between the two functions) suggests that to do RT without killing FPS, the RT cores need to take about 3-4x as much area as the shader cores do (that, or gen 2 RT cores need a massive performance-per-transistor increase). That's a big enough jump that, even with the die shrinks coming, it might still be too much for the cheapest cards. We probably won't know for a few months more, because the cards in that market segment won't be released until a while after their higher-performing siblings.
The fact that AMD is releasing a 5300 suggests that there's no RDNA 2.0 6500 XT or 6300 XT. If it eats a lot of transistors then they should have implemented a chiplet-like design to reduce costs. Nvidia seems to be separating ray tracing from the GPU itself, probably to reduce cost.
If your finances or employment are hanging on by a thread, you probably shouldn't buy a new card anytime soon, regardless of what the new ones cost. But many people whose employment is safe are going to spend like normal, and a lot of gamers probably decided to save their stimulus for a new GPU in the fall because their CPU was fast enough that they didn't think upgrading to Ryzen was a good use of their windfall this spring.
I'm just being realistic here. There's a reason why Sony and Microsoft have not released the prices of their new consoles during 2020, and we're like 2-3 months away from their release. I'm not sure AMD and Nvidia are as aware of what's going on around the world as Sony and Microsoft are.
 
Because AMD should be releasing all new products based on RDNA 2.0. If you want ray tracing to take off, even as an RTX 2080 Ti owner, then sub-$250 cards must have ray tracing. There's a reason why an RTX 2080 Ti owner has half a dozen games with ray tracing, and one is Quake 2 and the other is Minecraft. Consoles will bring ray tracing to modern games, but that depends on whether this generation of consoles sells, and right now nobody has any official pricing. I can't stress this enough, but sub-$250 GPUs must have ray tracing in order for most new games to have this feature. Game developers are not going to include a feature for the few people who own RTX or Big Navi GPUs.

The fact that AMD is releasing a 5300 suggests that there's no RDNA 2.0 6500 XT or 6300 XT. If it eats a lot of transistors then they should have implemented a chiplet-like design to reduce costs. Nvidia seems to be separating ray tracing from the GPU itself, probably to reduce cost.

I'm just being realistic here. There's a reason why Sony and Microsoft have not released the prices of their new consoles during 2020, and we're like 2-3 months away from their release. I'm not sure AMD and Nvidia are as aware of what's going on around the world as Sony and Microsoft are.

Look at it from AMD's point of view

They caught up with Intel by removing the iGPU from the CPU and adding more cores.

Applying the same strategy to GPUs, they would likely spend more of the silicon budget on rasterization (rather than ray tracing) to catch up with Nvidia.

FWIW, the rumors say that RDNA 2 is reserved for the class of cards above the 5700 XT.

The 5700 XT (and 5500) might get a refresh, RDNA 1.1 if you will. That should reduce the prices of RDNA 1 cards, which should benefit budget gamers not interested in ray tracing.
 
They caught up with Intel by removing the iGPU from the CPU and adding more cores.
No, they didn't, and their APU is the company's strongest foot forward.

So much so that other companies are playing catch-up on the entire concept of APU, even though it seems like such an old idea now.
 
No, they didn't, and their APU is the company's strongest foot forward.

So much so that other companies are playing catch-up on the entire concept of APU, even though it seems like such an old idea now.

Yes, it is very ironic that AMD went almost bankrupt buying ATI but finally caught up with Intel using a CPU without an iGPU.

If Nvidia takes their eyes off the ball, then AMD has the best chance of hitting them by focusing on rasterization rather than ray tracing

https://hardforum.com/threads/nvidi...ps-up-to-24gb-of-vram.1997795/post-1044706734

Bottom line is that RT in its current form is useless, and even with a 200% boost it'll still be useless for the most part. DLSS 2x is far more interesting, but unfortunately it requires devs to add it, and since it's proprietary we'll never see mass adoption of it.

So this brings me back to good old raster performance. I want tons of it at 7 nm+ and Nvidia needs to deliver. I don't want to fund their server architectures that are watered down to double as graphics cards. So my only hope is that Big Navi brings raster performance as its main driver, with RT as an afterthought, and gives it to us at a reasonable price.

As a 2080 Ti owner, I don't feel ripped off, because I bought this thing for $800 off some schmuck. If I had paid $1300+ (what this Strix costs new) I'd have been pretty upset about it all. But fortunately, as an enthusiast, I already knew RT was never my main reason for getting the 2080 Ti.

If Jensen spends his time praising RT and DLSS but only shows mediocre gains in raster performance, I'll do a hard pass on this and see what his niece has up her sleeve. Besides, if Nvidia really wanted to push features, they should've bought a stake in a big game publisher like EA or Activision, but it's clear to me gaming is becoming less and less of a focus for them, and as a result I'm losing interest in Nvidia.
 
Because AMD should be releasing all new products based on RDNA 2.0. If you want ray tracing to take off, even as an RTX 2080 Ti owner, then sub-$250 cards must have ray tracing. There's a reason why an RTX 2080 Ti owner has half a dozen games with ray tracing, and one is Quake 2 and the other is Minecraft. Consoles will bring ray tracing to modern games, but that depends on whether this generation of consoles sells, and right now nobody has any official pricing. I can't stress this enough, but sub-$250 GPUs must have ray tracing in order for most new games to have this feature. Game developers are not going to include a feature for the few people who own RTX or Big Navi GPUs.

The fact that AMD is releasing a 5300 suggests that there's no RDNA 2.0 6500 XT or 6300 XT. If it eats a lot of transistors then they should have implemented a chiplet-like design to reduce costs. Nvidia seems to be separating ray tracing from the GPU itself, probably to reduce cost.

I'm just being realistic here. There's a reason why Sony and Microsoft have not released the prices of their new consoles during 2020, and we're like 2-3 months away from their release. I'm not sure AMD and Nvidia are as aware of what's going on around the world as Sony and Microsoft are.
I think we can expect to see full DXR support across the line to some degree, unless they don't have DX12U support, which I doubt. I can see it coming in the $299 segment at launch, then a few months down the road those $299 cards getting hit with $50 mail-in rebates.

I know that AMD and Nvidia both have chiplet GPUs in the works, and unless AMD has been incredibly tight-lipped, they are a solid 2 years behind Nvidia on it. The biggest issue with chiplets is the delay moving data between the chips; Nvidia has some strong patents and practice implementing the tech to some degree in their Tesla cards. Unless you want first-gen chiplet GPUs to run 50% slower on existing and older titles but 100% faster on titles that support it, there are going to have to be changes made on the software side first, or they need an architecture advancement so great that they can match their previous flagship on one chip.

I don't think they have released prices because they are waiting to see how the wind blows. Memory prices are going down, so that means costs are too; yeah, the launch is close, but the first one to leak a number is at a disadvantage. Both consoles are going to be costly, unless they are going back to the days where they lose money on them and focus on software sales, which, given Sony's recent announcement of PC support, may be an actual thing. Whoever comes out first can expect to be undercut, and if they respond by lowering their prices they are "the ones who were gouging". They probably have their techs working overtime trying to figure out each other's costs so they can launch as close to each other as they can, assuming they haven't already just colluded on the launch price.
 