NVIDIA Coin Mining Performance Increases With Maxwell

I think this might do the opposite. It would be like the OpenGL CAD cards: basically a gaming card with different firmware, but at a price 5x higher.
I was actually thinking the opposite. Mining cards would be priced higher because their main purpose is to make money. But these cards wouldn't just be for mining; they'd be tuned for mining-like workloads: folding proteins, mining coins, and so on.

Mining does not kill the GPU market. Do you think AMD or Nvidia cares who buys their GPUs? All they care about is that their GPUs are being purchased, regardless of the reason, and they are laughing all the way to the bank.
I don't think you understood what I meant. By "killing the GPU market" I meant that mining demand prices products out of reach of the core customers. That is all.

Given the fear that Nvidia itself may end up with shortages (and price increases) when the higher-end Maxwell parts arrive later this year because of cryptocurrency mining, I am curious about two things:

  1. Why aren't there other fabrication companies besides TSMC for AMD and Nvidia GPUs? Perhaps Global Foundries or someone on that level?

    My reasoning is this: If production can't keep up with demand, why not open another fabrication facility or add another company to help with production?
  2. If the Geforce and Radeon series are for gaming and the Quadro and Firepro are for compute and 3D rendering, and as others have asked in other threads in the last two weeks-- why not a dedicated mining card from both Nvidia and AMD?

    If you think about it, going down this path sounds logical, and it would seriously challenge ASIC manufacturers to keep up. Put a single dual-link DVI port on the card and offer specifically tuned and binned GPUs. Offer a BIOS editor that can be adjusted from Windows or Linux, or a UEFI-equivalent for the GPU that one can configure before booting into an OS. Remove the video encoding features, Crossfire/SLI support, HDMI HDCP, and onboard audio. Finally, offer at most two GPUs per card, linked via a PLX PCI-Express bridge rather than Crossfire/SLI, since the mining software doesn't need them to be linked that way anyway.

    Also, it'd be a way to cash in on this, because regardless of what we think about cryptocurrency, even if one coin dies out, another will take its place. We're already seeing other hashing algorithms in use: adaptive N-factor scrypt, SHA-3 (Keccak), and possibly others I can't remember. People will mine cryptocurrency no matter what you or I think about it, and there will be those willing to make a market for it.
I was thinking along the same lines as you.
 
hey amd i think i know how you can solve your money problems.

step 1. produce a lot of 290x and load them up.
step 2. fire up your fav coin client
step 3. ?????
step 4. insane PROFIT.
 
Yeah, you can say it doesn't impact us, but I just sold two 6990s for a total of $841. That's insane for such old cards.
 
ASICs are on the horizon for scrypt, and will also come for any other mining algo that entrenches itself in the future. I'd hate to be the guy just getting started pouring money into GPUs to mine scrypt right now. Existing miners are OK, but it seems to me that scrypt mining is at the point SHA mining was maybe two years ago.

Think of this: who is best positioned and tooled to crank out mining ASICs, should the market solidify to the point where it's in the cards (pun intended) financially? AMD and Nvidia. The GPUs they already make could almost be considered ASICs.

Cryptomining could end up being a huge win for them.
 
Nope.

ASICs are being made, but there is so little inventory that no one will benefit from them for a long time. The ones that will have first dibs are HUGE group buys and big farms.

A lot of coins are already ASIC-resistant, but can still be mined with GPUs.
There's also cost. A unit made by Hashra costs $2,400 for 3.5 MH/s. You can build a GPU rig MUCH cheaper than that with the same speed; the only thing you miss out on is power savings. Oh, and I forgot to mention warranty: ASICs come with a 90-day warranty, so a unit that costs $2,400 becomes a paperweight if something happens after 90 days, which is likely.

Do more research.
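To put the post's cost argument in concrete terms, here is a tiny sketch of the dollars-per-hashrate math. The ASIC figures ($2,400 for 3.5 MH/s) come from the post above; the GPU rig price used for comparison is an illustrative assumption, not a vendor figure.

```python
# Rough capital-cost-per-hashrate comparison (scrypt mining).
# ASIC numbers are from the post; the GPU rig total is an assumed example.

def cost_per_khs(price_usd, khs):
    """Dollars spent per kH/s of hashrate."""
    return price_usd / khs

asic = cost_per_khs(2400, 3500)      # Hashra unit: $2,400 for 3.5 MH/s
gpu_rig = cost_per_khs(1600, 3500)   # assumed: multi-GPU rig at ~$1,600 for the same speed

print(f"ASIC:    ${asic:.2f} per kH/s")
print(f"GPU rig: ${gpu_rig:.2f} per kH/s")
```

Under those assumptions the GPU rig comes in meaningfully cheaper per kH/s up front, with power draw (not shown) as the trade-off.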
 
And there is this:
https://bitcointalk.org/index.php?topic=421921.0

Lightning ASIC.

300 kH/s at 8 W, scrypt mining only, for $200. It was released recently.

Compare that to the 750 Ti at around 280 kH/s for nearly 75 W (PCI-E bus powered) at $150 to $200, and the R7 260 at the same price for around the same hashrate (or higher, depending on clock speeds) but at 150 W.
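The efficiency gap in that comparison is easier to see as hashrate per watt. This uses exactly the figures quoted above:

```python
# kH/s per watt for the three options mentioned in the post.
cards = {
    "Lightning ASIC": (300, 8),     # (kH/s, watts)
    "GTX 750 Ti":     (280, 75),
    "R7 260":         (280, 150),
}

for name, (khs, watts) in cards.items():
    print(f"{name:15s} {khs / watts:6.2f} kH/s per watt")
```

The ASIC is roughly 10x more power-efficient than the 750 Ti and 20x more than the R7 260, which is why electricity cost, not purchase price, is where ASICs win.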
 

You mean they'll start off exactly how BTC ASICs started?

Do any research.
 
IIRC, the reason GPU mining of Bitcoin fell off so quickly was the way the algorithm was designed: you could build ASICs to do the task really fast and efficiently. There was little complexity to it at all.

Newer cryptocurrencies incorporate memory-intensive operations which cannot be bypassed easily with ASICs. It isn't completely infeasible; you'd just have to invest more in the memory controller, bandwidth, and capacity. It'll probably cost more and still not be nearly as dramatic a jump as you saw with Bitcoin, but it will be leaps and bounds better than GPUs in the future.

In fact, GPUs are fairly inefficient at a lot of these distributed-computing-style projects, because they weren't designed with them in mind. It just so happens that the software exists and GPUs naturally specialize in functions that let them handle those parallel workloads much better than any other off-the-shelf hardware. There is really nothing special about a GPU compared to an ASIC designed around a specific task; right now, memory is the biggest advantage GPUs have in non-Bitcoin projects.

This is all assuming I remember my reading correctly. I must say I'm not deeply invested in mining, just going off what I read every now and then, so I may be wrong or a little off base.
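The "memory-intensive operations" idea the post describes can be sketched with a toy scrypt-like construction: fill a large table with chained hashes, then do data-dependent lookups into it. This is an illustrative simplification (real scrypt differs in detail), but it shows why an ASIC must pay either in die area (store the table) or in time (recompute entries):

```python
import hashlib

def memory_hard_hash(data: bytes, n: int = 1024) -> bytes:
    """Toy sequential-memory-hard hash, loosely in the spirit of scrypt.

    Phase 1 builds a table where each entry depends on the previous one;
    phase 2 mixes in table reads whose indices depend on the running state,
    so accesses can't be predicted or cheaply pipelined in hardware.
    """
    # Phase 1: fill the table with a hash chain.
    table = [hashlib.sha256(data).digest()]
    for _ in range(n - 1):
        table.append(hashlib.sha256(table[-1]).digest())

    # Phase 2: data-dependent reads from the table.
    state = table[-1]
    for _ in range(n):
        idx = int.from_bytes(state[:4], "big") % n
        state = hashlib.sha256(state + table[idx]).digest()
    return state
```

With `n = 1024` and 32-byte digests, the table is 32 KB; real parameters scale it to sizes that make a pure-logic ASIC uneconomical.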
 
To execute the general idea of an "ASIC-resistant" currency, all the developers have to do is change the proof-of-work algorithm. Since ASICs are single-function machines, at that point any ASIC dedicated to the old PoW algorithm effectively becomes a paperweight.

That's not a bulletproof answer, though. For one, ASIC development for a specific mining algorithm has become somewhat of a "coming of age" for a coin. Some coins that said they'd be ASIC-resistant are waffling on that statement because of the psychological "legitimacy" ASIC support brings. There are also ways for ASIC developers to work around changing PoW algorithms if they know in advance that that's what they're dealing with.
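The algorithm-swap argument above can be made concrete: if the network's validity check treats the hash function as a parameter, a software update swaps it, while an ASIC has one algorithm etched into silicon. This is a minimal sketch with made-up function names, not any coin's actual code:

```python
import hashlib

def meets_target(header: bytes, nonce: int, target: int, hash_fn) -> bool:
    """Generic PoW check: hash(header || nonce) must fall below target."""
    digest = hash_fn(header + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < target

# Before the fork the network checks with SHA-256; after, with SHA-3.
# A SHA-256 ASIC keeps producing hashes the new rule simply rejects.
old_rule = lambda h, n, t: meets_target(h, n, t, hashlib.sha256)
new_rule = lambda h, n, t: meets_target(h, n, t, hashlib.sha3_256)
```

A nonce that satisfies `old_rule` has no particular chance of satisfying `new_rule`, which is exactly why the old hardware becomes a paperweight.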
 
The first thought that crossed my mind when I saw the title was, "Noooo, please don't do that!"

The situation with AMD cards is bad; if it happens to nVidia as well, those of us who only care about gaming will be out of options. :(
 
I want to get back to gaming. The minute new nVidia cards are released may be the only time they sell at MSRP.
 
Why aren't there other fabrication companies besides TSMC for AMD and Nvidia GPUs? Perhaps Global Foundries or someone on that level?

My reasoning is this: If production can't keep up with demand, why not open another fabrication facility or add another company to help with production?

Fab processes are not interchangeable in that way, even if they are the same node. It isn't as simple as having TSMC and Samsung (for example) fab the same design just because both are 28nm.

The interesting thing is that the supply shortage actually isn't simply a case of demand outstripping production capacity. Forecasts and expectations were for a continued downward trend in the market, so TSMC customers (AMD was specifically reported as one) actually cut orders heading into the end of 2013 (yes, they cut orders despite launching a new chip and a new line). TSMC's 28nm capacity is actually not fully utilized (Qualcomm also cut orders, for example).

For these types of products supply ramping isn't very quick or simple. You really need to be able to accurately gauge and forecast demand well in advance.

If the Geforce and Radeon series are for gaming and the Quadro and Firepro are for compute and 3D rendering, and as others have asked in other threads in the last two weeks-- why not a dedicated mining card from both Nvidia and AMD?

Something that has been overlooked here is that Nvidia was in the same situation AMD is currently in fairly recently; they just managed to take better advantage of it. Kepler GK110 was in high demand for GPGPU purposes and was essentially delayed for, and priced out of, the consumer market. Think back to how GK110 launched: first to the professional market at a very high markup as a Tesla product (if you think the hundreds-of-GPU orders by some miners are large, the Titan supercomputer order dwarfs them :p). Then they staggered it out to the "prosumer" market with the GTX Titan at a high markup. Only then did you get a somewhat reasonable consumer variant in the GTX 780 (which, keep in mind, launched at a $650 MSRP despite being a cut chip, whereas GF100/GF110 were both $499), and the consumer market did not see a full GK110 card until the 780 Ti (before that, Nvidia's Quadro line got it first as well). Throughout all this, Nvidia had been selling GK104 parts for much more than their GF104 counterparts.

Just for some further context, I believe Tesla product demand with Kepler last year was up over 100% compared to Fermi.
 
If I can get a Maxwell chip that does 700 kH/s to 1 MH/s under 250 W, come and talk to me.

I am not gonna pay $150-200 for a card that doesn't even break 300 kH/s.

Every single rig I throw together costs me extra cash beyond just the GPU (CPU, board, RAM, SSD, chassis, etc.), so at some point you gotta make it count in terms of pure performance.
 
But with a whole lot of them, your overhead for motherboards, CPUs, DRAM, connectors, etc. increases. The cost and mining rate of 1x R9 290X is roughly equivalent to 4x 750 Ti. It also depends on how many 750 Ti GPUs the Nvidia driver will recognize in the OS, which is unknown at this time.

Another drawback is that mining as many coins as possible, as quickly as possible, is highly advantageous. Faced with increasing mining difficulty and the potential for a burst bubble, slow-and-efficient-and-steady doesn't sound quite as appealing.
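The platform-overhead point can be sketched numerically: each rig carries a fixed cost (board, CPU, RAM, PSU), and a slower card means more rigs per unit of hashrate. All prices and hashrates below are illustrative assumptions from the thread's era, not measured figures:

```python
# Capital cost per kH/s for a fully populated 6-slot rig, including a
# fixed platform cost shared across the cards. Numbers are assumptions.

def rig_cost_per_khs(card_price, khs_per_card, cards_per_rig, platform_cost):
    total_cost = card_price * cards_per_rig + platform_cost
    total_khs = khs_per_card * cards_per_rig
    return total_cost / total_khs

r9_290x  = rig_cost_per_khs(550, 900, 6, 400)  # assumed $550 card at ~900 kH/s
gtx_750ti = rig_cost_per_khs(150, 280, 6, 400) # assumed $150 card at ~280 kH/s

print(f"R9 290X rig:  ${r9_290x:.3f} per kH/s")
print(f"750 Ti rig:   ${gtx_750ti:.3f} per kH/s")
```

Under these assumptions the 750 Ti rig needs roughly three times as many systems for the same hashrate, so the shared platform cost erodes the cheap card's advantage, which is the post's point.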
 

300 kH/s? Something is screwball somewhere.

On the rig in my sig (using Bitminter 1.4.3), I'm cranking out 44 MH/s, and this is a "tinker project" mainly to see what I can do with a background miner on supposedly-irrelevant hardware. A GTX 550 Ti is considered irrelevant even for CUDA-based mining (which this is); any GTX 6xx or 7xx (including the GTX 750 Ti) would smash it flat.

As I said, this is NOT a serious project (though with minimal effort I could convert it into a serious one if I wanted to), and I'm using EOL hardware for it.
However, mining is like folding: efficiency, not just raw power, is the watchword. (It's also why, as with folding, CLI clients, followed by Java-based clients, are the most prevalent; Bitminter is Java-based and requires an up-to-date JRE.)
 
My 780 does 720-730 kH/s at around 380-ish watts ($550).

It would take three 750 Tis to easily beat that, at around 240 watts for around 900 kH/s ($450-500).

Now, the drawback is you use up more PCI-E slots going that route, so the money you saved kind of goes out the window once you have to build more systems for the 750 Tis.

I would rather fill my six PCI-E slots with six 280Xs than six 750 Tis.

But if the 750 Tis are that damn good for mining at this wattage, bring on the 880 GTXs!!!
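Taking the poster's own numbers, the efficiency and price-per-hashrate comparison works out like this:

```python
# Figures quoted in the post above: (total kH/s, watts, price in USD).
setups = {
    "1x GTX 780":    (725, 380, 550),
    "3x GTX 750 Ti": (900, 240, 475),
}

for name, (khs, watts, price) in setups.items():
    print(f"{name:14s} {khs / watts:5.2f} kH/s per watt, ${price / khs:.2f} per kH/s")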
 

You're talking about different algorithms. The 700 kH/s he is referring to is for scrypt, not SHA-256.

You guys should move this convo to the mining subforum!
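For anyone confused by the MH/s-vs-kH/s mismatch in the posts above: Bitcoin mining computes double SHA-256, while scrypt (Litecoin-style) mining is deliberately slower and memory-hard, so their hashrates aren't comparable. Both primitives happen to be available in Python's `hashlib` (scrypt needs Python 3.6+ with OpenSSL support), so the difference is easy to poke at:

```python
import hashlib

header = b"example block header"

# Bitcoin-style proof of work: SHA-256 applied twice.
sha_digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()

# Litecoin-style proof of work: scrypt with its standard N=1024, r=1, p=1
# parameters (the salt choice here is just for illustration).
scrypt_digest = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

print(len(sha_digest), len(scrypt_digest))  # both are 32-byte digests
```

Each scrypt evaluation touches ~128 KB of memory, which is the whole reason one card's "kH/s" and another's "MH/s" describe different races.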
 

You can get pretty close to that now with a 280X. I run two of my 280Xs at 1.065 V, and the bottom one needs 1.1 V... that bastard. My averages are around 770-780 kH/s on each (scrypt).

I run them at 1800 MHz memory and 1110 MHz core.

Again, I gotta say: bring on the BIG MAXWELL cards.
 