> when AMD have the balls to price the 6500 within 50 bucks of that ass-raping 3050!
Lowest on newegg.com/pcpartpicker right now:
RTX 3050: $330 / $318
6500 XT: $210 / $199
Not sure people really do that.
> No, it's not - with DLSS, that NVIDIA result is more than playable
Thanks.
> No, it's not - with DLSS, that NVIDIA result is more than playable
What does any of that have to do with the initial chart you posted? None of them were playable. That specific chart was dumb.
Quit justifying a shit card that should have shipped with a 96-bit bus, 6 GB of RAM, and an x8 PCIe link.
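To put rough numbers on the bus-width complaint, here's a quick sketch of the bandwidth arithmetic. It assumes 18 Gbps GDDR6, which is the memory speed the 6500 XT ships with; the 96-bit figure is the hypothetical from the post above, and the helper name is just illustrative.

```python
# Back-of-envelope GDDR6 bandwidth for the bus widths being argued about.
# Assumes 18 Gbps per pin (the 6500 XT's rated memory speed).
def gddr6_bandwidth(bus_width_bits: int, gbps_per_pin: float = 18.0) -> float:
    """Peak memory bandwidth in GB/s for a given bus width."""
    return bus_width_bits * gbps_per_pin / 8  # divide by 8: bits -> bytes

print(gddr6_bandwidth(64))  # as shipped: 144.0 GB/s
print(gddr6_bandwidth(96))  # the hypothetical 96-bit card: 216.0 GB/s
```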
There are a lot of less demanding RT games out there, and they all MOSTLY maintain that massive 5:1 performance advantage - when AMD have the balls to price the 6500 within 50 bucks of that ass-raping 3050!
> Same performance as the 1650 while using the same amount of power... without encoding ability.
I'm waiting for people to realize that the reason this card is going up against 4-year-old Nvidia cards is that Nvidia hasn't developed their replacements. There's a reason for that. There's also a reason why TechPowerUp is being so disingenuous here: Nvidia still makes the 1650 and the 1050 Ti and cleans up selling them for $200+, with ZERO reviewers saying anything about it. There's a reason for that.
When restricted to PCIe 3.0, it slots between a 1650 and a 1050 Ti. Even low-profile users are better off with either of the old Nvidia cards. Standard-size case owners are much better off with a used RX 570 or a 1650 Super.
This trash is useful for nobody.
> I'd imagine with it being RDNA 2 it'll have support for Radeon Super Resolution. So upscale 1080p from 720p and I'd imagine you'd get some solid FPS at 1080p medium/high settings.
FSR would help, but according to this, 1650s and 1050 Tis would get it too.
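For what it's worth, the 720p-to-1080p upscale mentioned above is just pixel-count arithmetic, nothing FSR- or RSR-specific:

```python
# Pixel-count arithmetic for the 720p -> 1080p upscale.
native = 1920 * 1080   # 1080p output resolution
render = 1280 * 720    # 720p internal render resolution

print(native / render)            # 2.25: 1080p has 2.25x the pixels of 720p
print(round(render / native, 3))  # 0.444: the GPU shades ~44% of the native load
```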
> FSR would help, but according to this, 1650s and 1050 Tis would get it too.
IF a baby Ampere card gets made. Nvidia seems to have zero interest in that.
https://www.amd.com/en/partner/changing-the-game-amd-fidelityfx-super-resolution
A baby Ampere card would get DLSS and true RTX support as well.
> Both AMD and Nvidia have said they want the used market to replace the entry-level market. And AMD and Intel both have plans for stonkin' integrated graphics.
OEMs need something that is new and entry-level, I would imagine. I think what tends to replace a new entry-level card is an older generation's mid-level card.
> It's all going to be integrated graphics. I wouldn't be surprised if integrated graphics comes to workstation and server parts. If any of these three companies has plans to fart out a "make the monitors go" card, it's going to be Intel.
I am not sure why that's something NVIDIA would particularly want, but maybe there is no money in this for them anyway. I would think that the 1650s of the world sold as entry options (or, until very recently, the 2060) would have nice margins.
There is still a giant gap between integrated graphics and a new generation of cards, where older cards seem to fit very well.
> Nvidia is only going to focus on the mid-to-high end, along with halo parts, for the foreseeable future
Well, duh. Look at all the new-old 5450s and 730s still on the shelves.
> FSR would help, but according to this, 1650s and 1050 Tis would get it too.
Yeah, I know, but Radeon Super Resolution is driver-level FSR, which means in games that don't have FSR, the 6500/6400 cards would get a leg up.
> They are pretty much down to MSRP
Right after the return window closed on the RX 6800 I bought. You're all welcome.
The 6500 XT is now available with 8 GB of VRAM:
https://www.sapphiretech.com/en/consumer/pulse-radeon-rx-6500-xt-8g-gddr6
> The 6500 XT is now available with 8 GB of VRAM:
> https://www.sapphiretech.com/en/consumer/pulse-radeon-rx-6500-xt-8g-gddr6
The card no one cares about.
Why? Performance will still be castrated by the 64-bit bus and the PCIe x4 connector.
> The 6500 XT is now available with 8 GB of VRAM:
> https://www.sapphiretech.com/en/consumer/pulse-radeon-rx-6500-xt-8g-gddr6
How is this even possible with 64 bits (two modules) of GDDR6? I didn't know there was a 4 GB GDDR6 module.
> Hey man, when prices are falling hard on this card, you need some excuse to charge a premium.
Skeptical, as Samsung only shows 8 and 16 Gb modules (1 or 2 GB). But yeah, very much pointless unless there is some obscure compute application that demands a lot of VRAM while needing limited GPU performance.
> How is this even possible with 64 bits (two modules) of GDDR6? I didn't know there was a 4 GB GDDR6 module.
Looking at the datasheet of this Micron chip [0], it seems that in "clamshell" topology, in x8 mode, two chips can share a channel ("Pseudo-Channel mode"), so you could have four chips on just a 64-bit bus. But I could be misinterpreting the information in that datasheet.
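If the clamshell reading is right, the capacity arithmetic works out with standard 16 Gb (2 GB) GDDR6 dies. That density is an assumption on my part; the actual chips on the Sapphire card aren't confirmed here.

```python
# Sanity check on the clamshell idea. A GDDR6 device normally presents a
# 32-bit interface; in clamshell mode, two devices share a channel and each
# runs at half width (16 bits). Die density of 16 Gbit is assumed, not confirmed.
bus_bits = 64
bits_per_chip = 16                  # half-width interface in clamshell mode
chips = bus_bits // bits_per_chip   # 4 devices fit on a 64-bit bus
capacity_gb = chips * 16 // 8       # 4 x 16 Gbit dies -> 8 GB total
print(chips, capacity_gb)           # 4 chips, 8 GB
```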
> Ah, the 8 GB does seem legit, but how exactly did they make the die run x8 PCIe?
No clue. Actually, does it need to? That's just the number of lanes available, and you can run a 16x PCIe card in an 8x slot. It's not optimal, but it runs just fine. I think the 8x and 16x modes are referring to something else here, but I'd have to examine the data more closely to know what exactly.
> No clue. Actually, does it need to? That's just the number of lanes available, and you can run a 16x PCIe card in an 8x slot. It's not optimal, but it runs just fine. I think the 8x and 16x modes are referring to something else here, but I'd have to examine the data more closely to know what exactly.
There was a definite performance hit going from PCIe 4.0 x4 to PCIe 3.0 x4 with that card. Since PCIe 3.0 x8 has the same bandwidth as PCIe 4.0 x4, it would be a nice improvement for that card on older platforms.
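The bandwidth math behind that equivalence, as a quick sketch. The per-lane figures are approximate usable throughput after 128b/130b encoding, and the helper function is just illustrative:

```python
# Approximate usable PCIe bandwidth per lane, in GB/s, after 128b/130b encoding.
PER_LANE = {"3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total link bandwidth in GB/s for a given generation and lane count."""
    return PER_LANE[gen] * lanes

print(link_bandwidth("4.0", 4))  # ~7.9 GB/s
print(link_bandwidth("3.0", 8))  # ~7.9 GB/s - effectively the same link
print(link_bandwidth("3.0", 4))  # ~3.9 GB/s - the constrained case discussed above
```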
> It doesn't need to, but for people who are on PCIe 2.0 like me, the improvement to x8 makes the card a much better value.
Right, but he asked how they managed to do it. I was just saying it must be possible, or we wouldn't be able to run 16x cards in an 8x slot (or 4x, or 1x, for that matter).
> Right, but he asked how they managed to do it. I was just saying it must be possible, or we wouldn't be able to run 16x cards in an 8x slot (or 4x, or 1x, for that matter).
Each socket has multiple presence-detect pins, so the host knows how many lanes the card has. The host and card negotiate how many lanes to use, presumably at boot.
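A toy sketch of that negotiation, grossly simplified: the link trains to the widest width both ends support. Real PCIe link training also negotiates speed and can degrade width on bad lanes; this just shows the basic idea.

```python
# Toy model of PCIe link-width negotiation: both ends advertise their
# maximum width and the link trains at the smaller of the two.
def negotiated_width(card_lanes: int, slot_lanes: int) -> int:
    return min(card_lanes, slot_lanes)

print(negotiated_width(16, 8))  # x16 card in an x8 slot -> trains at x8
print(negotiated_width(4, 16))  # x4 card (like the 6500 XT) in an x16 slot -> x4
```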