WTB: Nvidia Card for ML

Status
Not open for further replies.

Stratus_ss

Limp Gawd
Joined
Jul 22, 2005
Messages
200
I'm looking for a card for some ML stuff that is newer than Kepler.

A 1080 Ti could be a good option; ultimately I'm looking to stay around $250. I know what's on eBay, but I thought I'd come here first to see if anyone has a card that needs a new home.

Only cards supporting the latest drivers (535 or similar, due to the way the CUDA tooling works on modern distros), please.
 
Would Quadros work for you?
Provided they are of a new enough generation, yes. Current CUDA packages want a driver newer than 5xx. The issue with Kepler et al. is that they're only supported by the 470.xx drivers.
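For reference, a quick way to check which driver branch a card is on (a sketch using nvidia-smi; the 525 cutoff is my assumption based on what the current CUDA 12 packages want, not an official floor):

import subprocess

# Query the installed driver version and GPU name via nvidia-smi.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version,name", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in out.stdout.strip().splitlines():
    version, name = [s.strip() for s in line.split(",", 1)]
    major = int(version.split(".")[0])
    # Assumption: current CUDA 12 wheels want a 525+ branch; Kepler tops
    # out at 470.xx, which is why it's off the table.
    verdict = "should work" if major >= 525 else "too old for current CUDA tooling"
    print(f"{name}: driver {version} -> {verdict}")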

What did you have in mind?
 
Thought I was going to have a free Quadro P4000 and P6000 this week.
The P4000 looks to have fewer CUDA cores than the 1080 Ti, but the P6000 has a little more.
 
Thought I was going to have a free Quadro P4000 and P6000 this week.
The P4000 looks to have fewer CUDA cores than the 1080 Ti, but the P6000 has a little more.
How much were you asking for the P6000?

EDIT: moving to PM
 
I've got an EVGA 2080 Super for a little more than your budget, if that interests you.
 
I've got an Asus Strix 1080 Ti if you're interested. Will come with box. Let me know.
 
I want to clarify that I'm looking for cards with 11+ GB of VRAM in order to hold larger large language models (LLMs).
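For context, some back-of-the-envelope math on weight storage alone (my own rough numbers; this ignores the KV cache and runtime overhead, which eat a few more GB):

# Rough VRAM needed just to hold model weights, by parameter count
# and precision. Illustrative estimates, not measured figures.
def weights_gb(params_billion, bytes_per_param):
    return params_billion * 1e9 * bytes_per_param / 1024**3

for params in (7, 13):
    for label, bpp in (("fp16", 2), ("int8", 1), ("4-bit", 0.5)):
        print(f"{params}B {label}: ~{weights_gb(params, bpp):.1f} GB")

A 13B model at int8 works out to roughly 12 GB before any overhead, which is why 11+ GB is the floor.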
 
Best shot is still eBay if you need it sooner. Nvidia Tesla P100 12GB cards are going for $138.90 shipped (excl. tax) and the 16GB version for $150 shipped (excl. tax). Either one meets the 11+ GB VRAM you're looking for and comes in well under budget. These cards are passively cooled, though. GL.
 
I have some HP DL380 G8s and G9s, but my preference while playing around would be a desktop model, unless there was a really stellar deal on something that works with the DL380s.

Just out of curiosity - what are you doing? Or is that classified? 👀
 
If you're looking for a 12GB card for inference, it's hard to beat the 3060: tensor cores, a recent architecture (Nvidia has supply chain issues, so Hopper is basically MIA, meaning Ampere will be supported for some time to come), and a clean upgrade path through the Ampere stack to the A100. V100s are about twice as fast, but the cards don't have fans and cost twice as much.
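If it helps, here's a quick sketch (assuming a CUDA build of PyTorch) to confirm tensor core support on whatever card turns up. Anything reporting compute capability 7.0+ (Volta onward) has them; Pascal cards like the 1080 Ti report 6.1:

import torch

# List visible GPUs and flag tensor core support by compute capability.
if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        major, minor = torch.cuda.get_device_capability(i)
        name = torch.cuda.get_device_name(i)
        print(f"{name}: sm_{major}{minor}, tensor cores: {major >= 7}")
else:
    print("No CUDA device visible")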
 
I have some HP DL380 G8s and G9s, but my preference while playing around would be a desktop model, unless there was a really stellar deal on something that works with the DL380s.
Does NVIDIA even have desktop GPU drivers for a server OS?
 
If you're looking for a 12GB card for inference, it's hard to beat the 3060: tensor cores, a recent architecture (Nvidia has supply chain issues, so Hopper is basically MIA, meaning Ampere will be supported for some time to come), and a clean upgrade path through the Ampere stack to the A100. V100s are about twice as fast, but the cards don't have fans and cost twice as much.

That's true, but the 3060 is a chunk of change. If I were going to spend that kind of money, I'd start pestering dbwillis for another P6000 :cool:
 
P6000 24GB would also work @ $250

Going rate on those (eBay) is closer to $500. That said, if you could find one around $250 it would be a good deal. At $500 you’re better off with newer cards unless you absolutely need 24GB and can’t scrounge another $100 for a well-worn 3090.
 