Threadripper Pays for Itself Using Cryptocurrency Mining @ [H]

FrgMstr

Just Plain Mean
Staff member
Joined: May 18, 1997
Messages: 55,629
Threadripper Pays for Itself Using Cryptocurrency Mining

If you could have your AMD Ryzen Threadripper pay for itself over time, would you? No matter your feelings toward cryptocurrency mining, you can get your Threadripper mining today and paying for itself. The process could not be much easier either. And the big kicker is that the actual wattage load on your system is likely much less than you would guess.
 
Odd that your system is unusable when you have CPU and GPU mining going at the same time. I have my 4670K and 1080 Ti both going and my system is plenty usable: YouTube, surfing, Outlook/Excel/Word, etc., no issues.


Also, you look like Ron Hextall. LOL
 
Don't forget that any profit must be claimed on income tax, so that is likely another 30-40% off, making the payback period longer.

30-40%? They're from Texas, not Californistan. Plus, they can write off the cost of the CPU (IIRC they buy their own from retail channels) as business expenses.
 
I must admit, when I read the PCGamesN article on this, my thoughts were, one, "AMD fuckers trying to make money," and two, "bullshit." (No offense to PCGN, but it was just AMD's numbers being quoted.)

Glad to hear your take on this Kyle. Personally, I don't think we will suddenly see Threadrippers selling out for mining. But it's kinda hard to get pissed about existing Threadripper owners potentially being able to get more value per dollar out of their systems. Sounds like a good deal, and that's a good thing for them!
 
30-40%? They're from Texas, not Californistan. Plus, they can write off the cost of the CPU (IIRC they buy their own from retail channels) as business expenses.

You are damn right we can. :)

Now I'm really leaning towards Threadripper.....
 
Glad to hear your take on this Kyle. Personally, I don't think we will suddenly see Threadrippers selling out for mining.

If you wanted to make money and not deal with the crap GPU market right now (with the long-term profitability of GPUs going south thanks to markups and upcoming releases), why wouldn't you consider it as a new purchase option?

Maybe I should buy 20 @ $1000 each now and sell them for $1200 in a few weeks... but then I'm not an opportunist creep.
 
With the current Monero rates, and running 24/7, our Threadripper 1950X could pay for itself in about 1.5 years.
I assume the calculations are made at current rates and difficulty. So it's really anyone's guess as to when, if ever, it will pay for itself.

Take my case with ETH, for instance. It took me almost 4 months to mine 1 ETH. When I started, I was projecting 3 months tops at the then-current price/difficulty to recover the cost of the GTX 1070. When I reached 1 ETH, it was at like $300. I was hoping it would go back to $360 so the card would pay for itself. Lo and behold, a few weeks later it jumped to $740 and I cashed out at almost double what I paid.

Now it's at $1,000 and it will take me more than a year to mine another ETH. Hopefully the price will go up again.
 
If you wanted to make money and not deal with the crap GPU market right now (with the long-term profitability of GPUs going south thanks to markups and upcoming releases), why wouldn't you consider it as a new purchase option?

Maybe I should buy 20 @ $1000 each now and sell them for $1200 in a few weeks... but then I'm not an opportunist creep.

Because for each $600 a year you'll make, it'll cost you a $300 motherboard, $200 in RAM, $750 for the processor, and a minimum of $100 for the power supply, GPU, and cooling. Or you can buy four 1060s and make that same $600 in one quarter.
 
I've had mine going since I put it together in August. I was running xmr-stak on NiceHash until the big debacle; it was always showing $5-6/day or so, tweaked properly.

Still have it going now, but on a different pool. Haven't really been tracking stats on it as closely. I'm sure the CPU has just about paid for itself by now though.
 
Because for each $600 a year you'll make, it'll cost you a $300 motherboard, $200 in RAM, $750 for the processor, and a minimum of $100 for the power supply, GPU, and cooling. Or you can buy four 1060s and make that same $600 in one quarter.

IF you can get hold of a 1060 at a reasonable price. That's the problem. Your proposed system is:
$300 motherboard (4 PCIe slots)
$200 in RAM
$150 baseline processor
$200+ power supply
$1,400 for 4 × GTX 1060 at $350 each
= $2,250 total

just doing some paper-napkin math.
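If it helps, here is that napkin math as a quick Python sketch. Every dollar figure is just the estimate quoted in this thread (GPU street prices especially are a moving target), so treat the output as ballpark only:

```python
# Napkin math: 4x GTX 1060 rig cost vs. the "$600 in one quarter" claim above.
# All dollar figures are thread estimates, not market data.
parts = {
    "motherboard (4 PCIe slots)": 300,
    "RAM": 200,
    "baseline processor": 150,
    "power supply": 200,
    "4x GTX 1060 @ $350": 4 * 350,
}

rig_cost = sum(parts.values())      # $2,250
earnings_per_quarter = 600          # claimed earlier in the thread
payback_quarters = rig_cost / earnings_per_quarter

print(f"Rig cost: ${rig_cost}")
print(f"Payback: {payback_quarters:.1f} quarters (~{payback_quarters * 3:.0f} months)")
```

With those numbers the GPU rig needs a bit under a year to cover its hardware cost, before electricity.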
 
I have both my Threadrippers mining. What is surprising to me is how close you can get the 1920X to the 1950X in hash rate. I've been doing ETN for about a month. My 1920X at 4.125 GHz does around 1050 to 1120 H/s, whereas the 1950X at 3.95 GHz gets 1150 to 1320 H/s on Nanopool. Now if I watch a video or start doing something that uses CPU clock, it will drop down into the 600s for hash rate. The systems are doing other things for the most part, so I don't see max H/s rates very often.

The core package on the 1920X sucks around 140 W compared to around 160 W on the 1950X. Those are averages; my peaks are north of 200 W on the 1950X.

https://etn.nanopool.org/account/et...b1NvXBgKxh9j7CgC5DF88bMDxCqPFk/Threadripper01
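Out of curiosity, here is a quick hashes-per-watt comparison using the averages above; the hash rates and package power are this poster's observations rather than controlled benchmarks, so it is ballpark only:

```python
# Rough efficiency comparison from the figures reported above (not benchmarks).
systems = {
    "1920X @ 4.125 GHz": {"hashrate_hs": (1050 + 1120) / 2, "package_w": 140},
    "1950X @ 3.95 GHz":  {"hashrate_hs": (1150 + 1320) / 2, "package_w": 160},
}

for name, s in systems.items():
    eff = s["hashrate_hs"] / s["package_w"]
    print(f"{name}: ~{s['hashrate_hs']:.0f} H/s at ~{s['package_w']} W -> {eff:.2f} H/s per watt")
```

Both chips land around 7.7 H/s per watt with those figures, so the efficiency is essentially a wash between them.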
 
Because for each $600 a year you'll make, it'll cost you a $300 motherboard, $200 in RAM, $750 for the processor, and a minimum of $100 for the power supply, GPU, and cooling. Or you can buy four 1060s and make that same $600 in one quarter.
Guess you did not watch the video, eh?
 
I have both my Threadrippers mining. What is surprising to me is how close you can get the 1920X to the 1950X in hash rate. I've been doing ETN for about a month. My 1920X at 4.125 GHz does around 1050 to 1120 H/s, whereas the 1950X at 3.95 GHz gets 1150 to 1320 H/s on Nanopool. Now if I watch a video or start doing something that uses CPU clock, it will drop down into the 600s for hash rate. The systems are doing other things for the most part, so I don't see max H/s rates very often.

The core package on the 1920X sucks around 140 W compared to around 160 W on the 1950X. Those are averages; my peaks are north of 200 W on the 1950X.

https://etn.nanopool.org/account/et...b1NvXBgKxh9j7CgC5DF88bMDxCqPFk/Threadripper01
You are utilizing the same number of threads as the 1950X, as I pointed out in the video. That means you are utilizing more of the SMT/HyperThreading capability than the 1950X is.
 
IF you can get hold of a 1060 at a reasonable price. That's the problem. Your proposed system is:
$300 motherboard (4 PCIe slots)
$200 in RAM
$150 baseline processor
$200+ power supply
$1,400 for 4 × GTX 1060 at $350 each
= $2,250 total

just doing some paper-napkin math.


I've got four 1060s mining out of an Optiplex 390 that I bought for $75 and a CX550 that I got for $50. So even at $350 apiece (I paid $190 for them), that's only $1,525, which can be made back in half a year or so.
 
I assume the calculations are made at current rates and difficulty. So it's really anyone's guess as to when, if ever, it will pay for itself.

+1

1.5 year payback calculated with current difficulty means it'll probably never ROI.
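To put a rough number on that, here is a toy Python sketch of how a fixed 1.5-year payback stretches once difficulty keeps climbing. The revenue figure and the 5% monthly growth are made-up assumptions for illustration, not a forecast of Monero difficulty:

```python
# Toy model: payback time when network difficulty (and thus your share) grows monthly.
cpu_cost = 1000.0         # assumed Threadripper 1950X street price
monthly_revenue = 55.0    # implies ~18 months payback at constant difficulty
difficulty_growth = 0.05  # assumed 5% more network hashrate per month

earned, month = 0.0, 0
while earned < cpu_cost and month < 60:
    earned += monthly_revenue / (1 + difficulty_growth) ** month
    month += 1

if earned >= cpu_cost:
    print(f"Paid off after {month} months")
else:
    print(f"Not paid off within {month} months (total mined: ${earned:.0f})")
```

At constant difficulty those numbers pay back in roughly 18 months; with even a modest 5% monthly growth it takes about three and a half years, and anything much faster means it never pays back at all, which is the point being made above.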
 
Not under next year's tax laws :)

Short-term capital gains are taxed as ordinary income... which is nowhere near 30-40% unless you're making more than $157,500 (or $315,000 filing jointly).

 
I have both my Threadrippers mining. What is surprising to me is how close you can get the 1920X to the 1950X in hash rate. I've been doing ETN for about a month. My 1920X at 4.125 GHz does around 1050 to 1120 H/s, whereas the 1950X at 3.95 GHz gets 1150 to 1320 H/s on Nanopool. Now if I watch a video or start doing something that uses CPU clock, it will drop down into the 600s for hash rate. The systems are doing other things for the most part, so I don't see max H/s rates very often.

The core package on the 1920X sucks around 140 W compared to around 160 W on the 1950X. Those are averages; my peaks are north of 200 W on the 1950X.

https://etn.nanopool.org/account/et...b1NvXBgKxh9j7CgC5DF88bMDxCqPFk/Threadripper01

That's interesting, considering it's 4C/8T less, but the hash rate is only 14% less. Makes me wonder if the important thing is just getting that sweet 32 MB of L3, then.
Of course, even saving, say, $150-200 on the CPU still leaves the other platform costs. Hrmmm.

Oh and then there's the issue of a cooler. I already got the free Noctua AM4 bracket last year, but that doesn't matter since you'd ideally be using the TR4 models for Threadripper.
 
Nah, all the gamers without video cards are the ones paying for it. And we'll all be paying the environmental costs down the line.
 
It's not only Threadripper that's good for mining. There are many CPUs around that pay for themselves in that same 1.5 years. My own 8700K mines NiceHash CryptoNight non-delidded at less than 50°C and less than 50 W, and with Process Lasso I tie the algo threads to physical cores, so 12 half-to-three-quarters-utilised cores become 6 fully utilised cores and 6 sitting at or just above idle. Either way, I can actually work on my PC and do not feel much lag or stutter; it is actually astonishingly smooth to work with. My hashrate does not fluctuate a lot while using Chrome, VPN, RDP, and other daily lightweight stuff. The only time I cannot mine on CPU and GPU and watch a YT vid at the same time is with the LBRY algo, IIRC; with that algo the video is not lag-free on my end, on a 1080 Ti that is.
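For anyone who would rather script that affinity step than click through Process Lasso, here is a minimal sketch of the same idea using Python's psutil library. The miner process name and the even-numbered-logical-CPU layout are my assumptions (sibling numbering differs between OSes and CPUs), so check your own topology with something like coreinfo or lscpu first:

```python
# Pin a running miner process to one logical CPU per physical core so the
# SMT siblings stay free for everything else (roughly what Process Lasso does).
import psutil

MINER_NAME = "xmr-stak.exe"  # hypothetical process name; change to your miner
physical_cores = psutil.cpu_count(logical=False)

# Assumes the common layout where CPUs 0, 2, 4, ... are the first thread of
# each physical core and the odd-numbered ones are their SMT siblings.
physical_cpus = [2 * i for i in range(physical_cores)]

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == MINER_NAME:
        proc.cpu_affinity(physical_cpus)
        print(f"Pinned PID {proc.pid} to logical CPUs {physical_cpus}")
```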
 
Huh, interesting

I had thought that CPU mining was a thing of the past and that GPU was the only way to go these days, and that with more popular currencies like Bitcoin, even GPUs aren't profitable anymore.

Always learning new stuff on here!
 
ASIC resistance is one thing which new coins try for by design. ASICs put too much hashrate power into the hands of a strong few, whereas GPUs bring a little hashrate each, evenly distributed across a lot of small GPUs, ordinary people like you and me vs. the Chinese crypto farms. That is why Ethereum is not mined on ASICs and probably never will be; the devs don't want it to be ASIC-friendly, they want many small miners instead, you and me.

CryptoNight is so far the only CPU algo that I am aware of that brings a profit worth speaking of. Mind you, even an old AMD FX can be a €3/day CPU if your electricity cost is as cheap as in Argentina or Brazil, around €0.05/kWh. Where I live, the net would be about zero at €0.30/kWh running 300 W 24/7. I just wanted to say, a 5 GHz AMD FX can mine a lot, energy aside. A pal of mine does about 6x my hashrate and uses about 6x my energy, roughly, with an FX. Very astonishing. He also usually runs 2x 1070 on ETH+Sia in that system. The two GPUs together need LESS wattage than his CPU! LOL

I tried two 5650 Xeons... too old, wouldn't work. Then I tried my 5820K, same thing, wouldn't work with the NH miner. It runs on a 4th-gen i5 but won't, or at least didn't, on a 5820K, which is odd.

Anyway, yeah, CPUs are back on stage, CPU mining is hip: little wattage with modern processors, almost lag-free, a very nice first impression. Running it 24/7 with the rig in my sig.
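To make the electricity side of that concrete, here is a tiny net-profit sketch. The €3/day and 300 W figures are just rounded from the FX example above, and the result swings a lot with actual wall draw, coin price, and difficulty:

```python
# Back-of-envelope daily mining profit after electricity (illustrative numbers only).
def daily_net(revenue_per_day, watts, price_per_kwh):
    electricity = (watts / 1000.0) * 24 * price_per_kwh
    return revenue_per_day - electricity, electricity

# ~300 W FX-class box grossing roughly 3 EUR/day
for price in (0.05, 0.30):
    net, cost = daily_net(3.00, 300, price)
    print(f"@ {price:.2f}/kWh: electricity {cost:.2f}/day -> net {net:.2f}/day")
```

Cheap electricity keeps most of the revenue; at €0.30/kWh the margin is thin enough that a bit more wall draw or a dip in the coin price wipes it out.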
 
Mining Monero is a good way to not pay taxes on cryptocurrency. It is a privacy coin, and if you find private ways to spend it, there is no way for the government to know you have that kind of income, unless you make a video and post it on the internet or something similar :p

Additionally, anything cashed out within a year is a short-term capital gain, taxed at I think 25-33% depending on tax bracket. After a year it is a long-term gain, taxed at 15%.
 
Don't forget that any profit must be claimed on income tax, so that is likely another 30-40% off, making the payback period longer.

Is it income or capital gains?


A payback period of a year-plus is kinda crap, really.
 
ASIC resistance is one thing which new coins try for by design. ASICs put too much hashrate power into the hands of a strong few, whereas GPUs bring a little hashrate each, evenly distributed across a lot of small GPUs, ordinary people like you and me vs. the Chinese crypto farms. That is why Ethereum is not mined on ASICs and probably never will be; the devs don't want it to be ASIC-friendly, they want many small miners instead, you and me.

CryptoNight is so far the only CPU algo that I am aware of that brings a profit worth speaking of. Mind you, even an old AMD FX can be a €3/day CPU if your electricity cost is as cheap as in Argentina or Brazil, around €0.05/kWh. Where I live, the net would be about zero at €0.30/kWh running 300 W 24/7. I just wanted to say, a 5 GHz AMD FX can mine a lot, energy aside. A pal of mine does about 6x my hashrate and uses about 6x my energy, roughly, with an FX. Very astonishing. He also usually runs 2x 1070 on ETH+Sia in that system. The two GPUs together need LESS wattage than his CPU! LOL

I tried two 5650 Xeons... too old, wouldn't work. Then I tried my 5820K, same thing, wouldn't work with the NH miner. It runs on a 4th-gen i5 but won't, or at least didn't, on a 5820K, which is odd.

Anyway, yeah, CPUs are back on stage, CPU mining is hip: little wattage with modern processors, almost lag-free, a very nice first impression. Running it 24/7 with the rig in my sig.

NH is trash. 5650s will easily run it and they do alright. Same with the 5820K.
 
It's not only Threadripper that's good for mining. There are many CPUs around that pay for themselves in that same 1.5 years. My own 8700K mines NiceHash CryptoNight non-delidded at less than 50°C and less than 50 W, and with Process Lasso I tie the algo threads to physical cores, so 12 half-to-three-quarters-utilised cores become 6 fully utilised cores and 6 sitting at or just above idle. Either way, I can actually work on my PC and do not feel much lag or stutter; it is actually astonishingly smooth to work with. My hashrate does not fluctuate a lot while using Chrome, VPN, RDP, and other daily lightweight stuff. The only time I cannot mine on CPU and GPU and watch a YT vid at the same time is with the LBRY algo, IIRC; with that algo the video is not lag-free on my end, on a 1080 Ti that is.


Could you share your daily rate with the 8700k?

This may be just the excuse I need...
 
So is this applicable to the Intel equivalent?
Basically, lots of cores plus a large cache means it's good for mining?
 
The trick is to configure the number of threads properly. xmr-stak doesn't like HT 'cores', so it's best to only run as many threads as you have physical cores.

Any 2012/2013+ Intel will mine alright. I have a 3770K going.
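As a sketch of that, here is a small Python helper that emits an xmr-stak "cpu_threads_conf" block with one mining thread per physical core, pinned to even-numbered logical CPUs. The field names match the xmr-stak 2.x cpu.txt format as I recall it, and the even/odd sibling layout is an assumption, so verify both against your own version and topology:

```python
# Emit an xmr-stak-style "cpu_threads_conf" with one thread per physical core.
def cpu_threads_conf(physical_cores):
    entries = [
        f'    {{ "low_power_mode" : false, "no_prefetch" : true, "affine_to_cpu" : {2 * i} }},'
        for i in range(physical_cores)
    ]
    return '"cpu_threads_conf" :\n[\n' + "\n".join(entries) + "\n],"

# e.g. a 1950X: 16 physical cores -> 16 entries pinned to logical CPUs 0, 2, ..., 30
print(cpu_threads_conf(16))
```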
 
IMO, on the "not stable" point: in my experience mining on my systems, it really depends on the algo, the load, specific settings, etc. I have had mine going full bore while gaming, with extra windows open, etc., without a problem, and "suffering" maybe 1-5% lost performance at the end of the day. So basically I made the bank and was still able to do what I wanted to do.

With other algos, drivers, or whatever, it was either mine at, at best, 40% load to stay "stable", or only mine, so essentially I could not do anything else.

GeForce/Titan or Radeon, Intel or AMD, they all very much have their own strengths and weaknesses. The nice thing about having a shitload of cores is that you can always leave some of those cores out of the mining program; you lose a bit of performance but can still do everything you want to do, and with TR there is anything but a lack of cores ^.^
It could also be what someone referenced about the Intel chip: higher IPC and higher clock speed mean TR has to run "harder" to achieve a similar gain, but because it has that many more cores it likely also allows a much higher overall level of performance in the results.

Computing, there is never a silver bullet ^.^
 