RTX 3xxx performance speculation


I'm seeing the sold prices more around $750-850 for the past week through that filter (not counting fees).

And from experience, marking an item as "no returns" on eBay means little to nothing nowadays. Buyers still have the right to return for 30 days regardless and will manufacture the most minor reasons to do so when a new, better series is incoming. I'd bet the return rate on 20-series cards right now is sky high. Case in point: a guy in the post above was suggesting he'd buy a 2060 to "hold him over" now and just return it when the 3080 comes in stock.

The only way to avoid this issue is to sell at least 30 days ahead of a release so the eBay return window elapses. You're still not guaranteed to avoid these asses, but you're much less likely to get a buyer trying to beat the system. I had a guy try to return one of the 2080 Supers I recently sold over vague "overheating" issues (very coincidentally right around when specs and rumored prices for Ampere were solidifying). He'd had the card for exactly 32 days and eBay rejected the request -- sorry, SOB.
 
If you meant the dig at me, you are way off... read my second post. As an eBayer I'd never advise doing that to an individual seller or small business, but as I said, the trillion-dollar establishments getting billions from the Fed money printer daily (and yes, they are, quite directly, through that money bumping their stock up multiple times)? Now that is totally fair game.
 

My point was simply that the motivation to game the system and return a card is strong right now, when better options are imminent, and if there's a gap in any policy, people will take it. Apologies if it came off as a dig at you. I would agree there is a vast ethical difference between screwing over an individual seller (and outright lying about a defect) vs. taking advantage of the return policy of a multinational corporation that has an established resale system (and may well pay little to no tax).
 
Agreed, we're on the same page... BTW, anyone remember the days when Newegg had a 365-day, no-questions-asked return policy on video cards? Man, those were the days if you really wanted to game the system. I actually had a 7900 GS die on me 9 months in, and since the card was out of stock for replacement I got a refund, much to my amazement. Sadly those days are gone, as are other things like tax-free shopping on the net. Even fleabay and Swappa collect tax from you these days, depressing prices further for individuals. Sad state of affairs indeed.
 
If AMD doesn't try to piss in Nvidia's Cheerios by announcing something, even just a date for the reveal, or by releasing benchmarks to tempt people waiting on the fence, they don't deserve to increase market share. Take what Jeep did when Ford announced the Bronco: they announced updates and gave people reasons to think about getting a Jeep instead. No more passes for being poorly run.

NVIDIA is up to 80% market share of the AIB market:
https://www.jonpeddie.com/press-releases/pandemic-distorts-global-gpu-market-results

AMD needs more than "just below NVIDIA" to change that trend.
 

Everyone thinks they are Robin Hood when in actuality they are just thieves.
 
Videocardz has spec confirmation:
https://videocardz.com/newz/nvidia-geforce-rtx-3090-and-geforce-rtx-3080-specifications-leaked

With GeForce RTX Ampere, NVIDIA introduces 2nd Generation Ray Tracing Cores and 3rd Generation Tensor Cores. The new cards will also support PCI Express 4.0. Additionally, Ampere supports HDMI 2.1 and DisplayPort 1.4a display connectors.

NVIDIA GeForce RTX 3090 features GA102-300 GPU with 5248 cores and 24GB of GDDR6X memory across a 384-bit bus. This gives a maximum theoretical bandwidth of 936 GB/s. Custom boards are powered by dual 8-pin power connectors which are required because the card has a TGP of 350W.

The RTX 3080 gets 4352 CUDA cores and 10GB of GDDR6X memory. This card will have a maximum bandwidth of 760 GB/s thanks to a 320-bit bus and 19 Gbps memory speed. This model has a TGP of 320W and the custom models that we saw also require dual 8-pin connectors.

The RTX 3070 also launches at the end of next month (unless it changes). We can confirm it has 8GB of GDDR6 memory (non-X). The memory speed is estimated at 16 Gbps and TGP at 220W. The CUDA Core specs are still to be confirmed.
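(Quick sanity check on the quoted bandwidth figures: peak bandwidth is just bus width in bytes times the per-pin data rate. A minimal sketch below; the 19.5 Gbps rate for the 3090 is implied by the 936 GB/s number rather than stated, and the 3070's 256-bit bus at 16 Gbps is an assumption, not confirmed.)

```python
# Peak theoretical memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(384, 19.5))  # RTX 3090: 936.0 GB/s
print(mem_bandwidth_gbs(320, 19.0))  # RTX 3080: 760.0 GB/s
print(mem_bandwidth_gbs(256, 16.0))  # RTX 3070, assuming a 256-bit bus: 512.0 GB/s
```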

No real surprises. RTX 3070 only gets 8GB of regular GDDR6, which I speculated earlier, and one more thing:

"The data that we saw clearly mention the 7nm fabrication node. At this time we are unable to confirm if this is indeed true."

You know, for the people building house-of-cards stories on top of the 8nm rumors...
 

Those specs are as expected.

I'm still on the fence about whether the entire lineup will be on 7nm, a mix of 7nm and 8nm, or only 8nm. I do think that if it's 7nm across the board, supply will be extremely tight, likely only a few thousand chips globally per model at launch.
 
Hey guys, question about the 2:28 minute mark.

What is the additional resistance that may be induced by the adapter? Something bad? A short circuit?
I'm just asking whether using the adapter will be safe. I have a Seasonic Prime TX-750 Plus Titanium.
 
My poor wallet, I don't really need 24GB of memory.
Oh well, good for CUDA I guess.
 


Theoretically it's safe or they wouldn't be doing it. They don't want a class action lawsuit...

But for my own comfort, it depends on card power. Personally I wouldn't want to use adapters on cards pulling 300+ watts; adapters can only add resistance (and heat). If getting a 300+ watt card with the 12-pin connector, I would either get an AIB card with 8-pin connectors or a proper 12-pin cable from my PSU maker.

If it was closer to 200W on a 3070, I wouldn't worry about it, and would use the adapters.
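To put rough numbers on the "adapters add resistance and heat" point: at 12V, a ~350W card pulls around 29A through its supply pins, and the extra heat dissipated in a contact scales with the square of the current. A back-of-the-envelope sketch, where the 5 milliohms of added resistance per contact is a purely illustrative assumption, not a measured figure:

```python
# Extra heat from an adapter's added contact resistance: P = I^2 * R per contact.
# The 5 mOhm per-contact figure is an illustrative assumption, not a measurement.
def adapter_heat_watts(card_watts: float, supply_pins: int, extra_mohm_per_contact: float) -> float:
    current_total = card_watts / 12.0              # total 12V current drawn by the card
    current_per_pin = current_total / supply_pins  # assume an even split across supply pins
    r_ohms = extra_mohm_per_contact / 1000.0       # milliohms -> ohms
    return supply_pins * current_per_pin ** 2 * r_ohms

# Dual 8-pin gives roughly 6 12V supply pins; compare a ~350W card vs a ~220W card:
print(adapter_heat_watts(350, 6, 5))  # ~0.71 W of extra heat spread across the adapter
print(adapter_heat_watts(220, 6, 5))  # ~0.28 W
```

Small in absolute terms under these assumed numbers, but it grows with the square of the current, which is the gist of preferring native cables on 300+ watt cards.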
 
So then should I change my Seasonic Prime TX-750 Plus Titanium for a new PSU that doesn't need an adapter? :D I'm just asking; maybe I will stay with the 2080 Ti, I don't know for sure.
 
$1200 for a used 2080ti with the 3000 series being announced next week? You’re dreaming.
I sold my 2080 Ti last night for $1000 cash. Dude was even aware of the 3000 series but didn't care and just wanted to play Warzone with the highest FPS possible.
 
Interesting if it's TSMC 7nm at a 350W rating, meaning over 400W when overclocked if there is headroom. Makes me wonder what Nvidia is expecting from AMD, if that's even a consideration. I am hoping there will be headroom for a significant OC on the 3090.

As for AMD making up market share, I would say they will have to tackle the discrete mobile market in a big way, on perf/W as well as perf/$.
 

Seems like you got lucky, and the buyer is stupid. I wonder, are all the cards on eBay listed at $1K+ actually selling? A quick glance and they all looked like BIN listings, not auctions with bids over $1K. I saw a few ~$900 bids though, which still seems a bit nuts to me. Do these people not know how close we are to new cards?

I picked up a used 2080 Ti back in April for $840. I think I got a good deal on a good card. Once we see what the 3000 series is capable of, and what it costs, I'll weigh whether it's worth selling or holding onto for another generation.
 
Lots with bids over $1k based on model:

https://www.ebay.com/sch/i.html?_fr...te=1&LH_ItemCondition=3000&rt=nc&LH_Auction=1

I've actually sold 2 brand new 2080 ti's in the last 2 days for $1750 each. That's crazy for you.
 
Frustrating that AMD rumors are next to nonexistent.
I expect some more news next week, but so far, barely anything.

Ya it's a little odd that they haven't had anything leak yet other than the 72-80 CU rumor.

At this point I will probably just get a 3090 assuming the $1400 price turns out to be true. I don't want a 10GB 3080 and I imagine a 20GB 3080 will approach 3090 prices so might as well just get the 3090 at launch.
 
Sold my 2080 Ti on eBay last night for $950, which is only $100 less than I paid for it in November of 2018.
 

Anything on the size of the 3080? Is it a 3-slot card as well?
Power draw implies it could need the cooling space....

As a mITX guy, this might limit my choices to the 3070 and below.
 
Everything about the cooler in the recent NVIDIA video was showing the 2-slot version, the one we have multiple images of from multiple sources. I think it would be strange if the only cooler actually verified isn't on either the 3080 or the 3090 that show up at launch.

There is no NVIDIA verification of the gigantic 3-slot version. If it is indeed real, then it seems it would only be on the 3090.
 

Told everyone here: if they are developing a new memory design, it means they're pushing performance this time around... but it also means Nvidia would rather not use that new, pricey memory on volume parts.

The 3070 matching the 2080 Ti is completely reasonable, as is the 3090 being at least 50% faster (20% higher SP count plus 10% higher clocks plus 20% higher IPC). They're absorbing the massive power increase with that crazy-looking cooler.

The lack of a 16GB 3070 also implies they may stay on a 256-bit bus for the 3060. Otherwise, a 192-bit 3060 would be stuck with an already-castrated 6GB of VRAM (I can't see them putting 12GB of RAM on the 3060 and 8GB on the 3070).

I would expect 16GB 3070 Super refresh cards later (and maybe a 3060 variant too?)
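(Those factors compound multiplicatively rather than add, which is how 20/10/20 lands past the 50% mark; a minimal sketch using the rumored figures above:)

```python
# The "at least 50% faster" estimate compounds multiplicatively:
# 20% more SPs * 10% higher clocks * 20% higher IPC (rumored/assumed factors, not confirmed).
factors = {"SP count": 1.20, "clock speed": 1.10, "IPC": 1.20}

speedup = 1.0
for name, factor in factors.items():
    speedup *= factor

print(f"combined speedup: {speedup:.2f}x")  # ~1.58x, i.e. roughly 58% faster than a 2080 Ti
```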
 
Why are people making such a fuss over this new 12 pin design? Doesn't it exist just because they ran out of PCB space? We have had dual 8 pin cards for a fairly long time now. I'm pretty sure I have even come across triple 8 pin models before, although I can't remember specifics.

I highly doubt anyone's PC is going to have trouble running the adapters; the only real downside is aesthetics.
 
The 3070 has significantly less memory bandwidth than the 2080 Ti, so matching it might be more of a challenge. I expect it will land between the 2080 Super and the 2080 Ti, with a 3070 Ti/Super later down the road with GDDR6X catching the 2080 Ti.

The 3090 at least has ~50% more memory bandwidth than the 2080 Ti, so it has the bandwidth to pull it off.
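(For reference, the ratios being argued here, using the 2080 Ti's 616 GB/s, 352-bit at 14 Gbps, as the baseline; the 3070 figure assumes the rumored 256-bit bus at 16 Gbps, which is not confirmed:)

```python
# Memory bandwidth relative to the 2080 Ti (352-bit at 14 Gbps = 616 GB/s).
# The 3070 entry assumes a 256-bit bus at 16 Gbps, which is not yet confirmed.
baseline = 352 / 8 * 14.0  # 2080 Ti: 616 GB/s

cards = {
    "RTX 3090": 384 / 8 * 19.5,            # 936 GB/s
    "RTX 3080": 320 / 8 * 19.0,            # 760 GB/s
    "RTX 3070 (assumed)": 256 / 8 * 16.0,  # 512 GB/s
}

for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s, {bw / baseline - 1:+.0%} vs 2080 Ti")
# RTX 3090: +52%, RTX 3080: +23%, RTX 3070 (assumed): -17%
```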
 
I'm assuming the rest will be made up by compression/improved cache. It's only a ~25% theoretical deficit.

Turing didn't really do much on that front over Pascal (they mostly just rode the GDDR6 train to success), so they have room for improvement.
 
I need to get a backup GPU for in-between times like this.

I have no interest in a 3090FE anyways, I'm waiting on a no-frills version, particularly from EVGA, like a 3090 Black.
 
I really doubt compression/cache will make up for that much bandwidth. Given that the 3080 is rumored to have about 20-30% more performance and has ~25% more bandwidth, and the 3090 is expected to have ~50% more performance and has ~50% more bandwidth, it seems they may need the bandwidth to match the performance increases.
 

They touted Turing as having effectively 50% higher bandwidth than Pascal. A good portion of that was from "traffic reduction".

I don't remember memory OC'ing having a huge effect on Turing. Hopefully in a week we'll have an idea with Ampere. Personally I am not too concerned about memory bandwidth with NVIDIA; they have a pretty good history of balancing bandwidth with overall GPU performance. (Except for GTX 970s in SLI....)

[Attached image: Turing Memory.PNG]
 
For folks asking whether people just don't care and are loose with their money: don't forget we have the dollar printing press working overtime right now, and many folks are tapped into that. Robinhood traders, people receiving more on unemployment than they made before...
 
Well, if you're right about no compression improvement on Ampere, then I'm right about the 3060 having a 256-bit memory bus. I recall idiots earlier in this thread postulating some GDDR6X-powered 192-bit 3060 wish-mobile :rolleyes:
 
I hope the 3080 ends up being worthwhile, 10 GB of VRAM is sort of meh, and it seems almost identical to the 2080 Ti in terms of specs.
 

Well, Nvidia knows the vast majority of gamers won't care, because 10GB is just fine for them, just like 8GB is fine today. Those that do care will pony up for the 20GB 3080 or the 3090, or go AMD.
 