HW issues once intensity goes past 13, R9 290X

XViper · Gawd · Joined: Aug 24, 2012 · Messages: 838
I tried everything to get this to work properly. I previously had the cards running in BAMT on a different rig, and since then I moved them to my personal rig and watercooled and crossfired both of them. My personal rig runs Windows, and I figured the settings would be identical for the system. I was wrong...

I've used Furmark and Heaven to benchmark the cards and the temps are exactly where they should be.

I'm using cgminer 3.7.2 .

cgminer -o stratum+tcp://xxx -u 1234 -p d=xxx --scrypt

It does about 500 each now. When I had it running, I was doing 800 and 950.

Is there a setting that I need to set for watercooled setups? Do I need to increase voltage?

Setting the intensity to above 13 throws HW errors.

Using Catalyst 14.2

Any ideas?
 
WCing shouldn't mean a different cgminer config.

Beyond setting the --scrypt option, you still need to fill out more flags.

Scroll down to the 290x configs here, and give some a shot.

That being said, 290Xs typically like -g 1 / high-intensity configs, as you can see from that link. Now, if those configs don't work and you're still getting errors, you may need to uninstall and reinstall drivers.
 
Also, any time you're running a high intensity, the TC (thread concurrency) needs to be cranked up. Windows requires way more RAM to crank the TC up than Linux does.

You also said you crossfired them... that's bad for mining.
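A fuller command line in that -g 1 / high-intensity style might look like the sketch below; the pool URL, worker credentials, and the exact TC/worksize numbers are illustrative placeholders, not known-good settings from that link:

```shell
:: Hypothetical 290X scrypt invocation; pool, worker, and the TC/worksize
:: values are placeholders to show which flags typically get filled in.
cgminer --scrypt -o stratum+tcp://pool.example.com:3333 -u worker -p x -g 1 -I 20 --thread-concurrency 24000 -w 512 --lookup-gap 2
```

The key difference from the bare --scrypt command above is that -g, -I, --thread-concurrency, -w, and --lookup-gap are set explicitly instead of left at defaults.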
 
Thanks for the link. I noticed more people run Catalyst 13.11

Is it necessary to install the SDK?

I have disabled crossfire for mining. Shouldn't 16 GB be enough, though?
 
^^ Depends on the Catalyst pack. I don't even remember if the 13.11s / 13.12s come with the SDK built in, because I almost always install the standalone SDK by default.

I would try the 13.12s + 2.9 SDK.

I've got 4 290 rigs up and they are rock solid on that driver / SDK combo.

When you do switch your drivers, make sure to use Display Driver Uninstaller. I've personally found the express uninstall function from the CP hit or miss; sometimes it does its job and sometimes it doesn't. You could use that too, just to be overly thorough.
 

It's not necessary to install the SDK pack for Windows anyway. I have no idea about Linux, but I read somewhere that all the AMD drivers after 13.xx already come with the SDK.

I use Catalyst 13.12's with all my cards, no matter what make/model/brand. I have not tested the 14.xx drivers for mining yet.
 
No need to install the SDK. What TC are you running? Bump it up. Delete the .bin files in the cgminer folder. Threads?
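The .bin deletion step, from a Windows command prompt opened in the cgminer folder (a sketch; cgminer regenerates these files on the next launch):

```shell
:: From the cgminer folder: removes the cached OpenCL kernel binaries so
:: cgminer rebuilds them against the currently installed driver/SDK.
del *.bin
```

Stale .bin kernels built against an old driver are a common cause of HW errors after a driver change.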
 
Disable crossfire.
It will crash cgminer.

I haven't noticed massive ram usage, 16 GB is plenty.

The SDK was integrated into the drivers during 2013, I don't remember which month but I believe 13.6 or earlier.

Restart the PC between cgminer crashes. Once it fails it seems to leave something hanging which only a reboot helps.
 
Crossfire has been disabled. When cgminer crashes, it bluescreens. I have to do a hard reset.

 
The massive RAM usage is in a normal mining rig; at one point I was only at 2 GB, since that's all most need. I'm running 8 GB most of the time in my rigs now, but I think 4 GB is enough most of the time.
 
I run 8 GB of RAM in my main rig and have never seen usage above 1.6 GB, and that includes all of Windows 8's preloaded services and cache. I've also heard that enabling Crossfire is bad, but I couldn't tell you why. I always use the 13.12 drivers and never manually installed the SDK. I can get about 4-5 clean shutdowns (X) out of cgminer before it freezes Windows.

A 290/X should take 20 intensity all day. Are your VRMs cooled properly by the waterblock?
 
The VRMs are really cool. Benched through Furmark.

GPU Temp: 43c
VRM Temp1: 57c
VRM Temp2: 33c

So far, I'm noticing a trend that I need 13.12.

If it works, I'm going to have to say stay away from 14.2. Maybe it doesn't like Windows.

The odd thing is that I ran BAMT with 14.2 and it was working on the other rig. My main is below in the sig, but the other rig I had is a cheap 1150 build on an ASRock H81 Pro.

 
Oh I completely missed (didn't register) the 14.2. 14.1 would only give me about half of the normal hashrate.

13.12 is good.

I read somewhere that OpenGL was messed up on 14.1 but I don't know if it's true nor if 14.2 fixed it.

I want to upgrade to 14.x to try Mantle, but I haven't had a chance, and I want to be sure mining is doable first.
 
Too bad I can't get crossfire working on BF4 with 13.12.

I know 14.2 was when they fixed it.
 
I installed 13.12 + SDK.

I get this message:
[2014-03-04 19:28:09] Started cgminer 3.7.2
[2014-03-04 19:28:09] Probing for an alive pool
[2014-03-04 19:28:09] Pool 0 difficulty changed to 512
[2014-03-04 19:28:10] Maximum buffer memory device 0 supports says 536870912
[2014-03-04 19:28:10] Your scrypt settings come to 2147287040
[2014-03-04 19:28:10] Error -61: clCreateBuffer (padbuffer8), decrease TC or increase LG
[2014-03-04 19:28:10] Failed to init GPU thread 0, disabling device 0
[2014-03-04 19:28:10] Restarting the GPU from the menu will not fix this.
[2014-03-04 19:28:10] Try restarting cgminer.
Press enter to continue:

I already ran the setx command. Do I need to lower the TC? I used 32765.
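The two sizes in that log are consistent with cgminer allocating roughly 64 KiB (65536 bytes) per thread for scrypt (an assumption about this version at lookup-gap 2, not something the log states): a TC of 32765 works out to the 2147287040 bytes it tried to allocate, about four times the 536870912-byte (512 MiB) single-buffer limit the device reports, so the TC would need to drop to around 8192 to fit. Checked with POSIX shell arithmetic:

```shell
# Bytes requested at TC = 32765, assuming ~65536 bytes per thread
echo $((32765 * 65536))        # prints 2147287040, matching the log
# Largest TC that fits the reported 512 MiB single-buffer limit
echo $((536870912 / 65536))    # prints 8192
```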
 
14.2 seems to require completely different settings than 13.12. I installed 14.2 on a new rig I built the other day and was getting tons of HW errors and lockups, using settings that are rock solid on my other builds with the same cards. I thought it was an issue with the card, but then I went back to 13.12 and everything worked fine.
 
Have you deleted all the .bin files in the cgminer folder?
And what is your complete command line to start cgminer?
 
Open a dos box (start -> run, enter "cmd" - without the quotes), then enter
setx GPU_MAX_ALLOC_PERCENT 100
and then
setx GPU_USE_SYNC_OBJECTS 1

then try with the higher TC again.

H.
 
I did the setx commands. They get executed every time, but they don't take for some reason.

Haven't been able to figure out Windows mining yet. I was able to get it running again in Linux with sgminer 4.0; it takes the TC setting and intensity up to 20. Guess it's something with Windows.
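For comparison, the Linux side of that setup might look like the sketch below; the pool, worker, and TC value are placeholders, not settings confirmed in this thread:

```shell
# Hypothetical sgminer 4.0 launch on Linux; pool/worker/TC are placeholders.
# Same environment variables as the Windows setx lines, set the Linux way:
export GPU_MAX_ALLOC_PERCENT=100
export GPU_USE_SYNC_OBJECTS=1
sgminer -o stratum+tcp://pool.example.com:3333 -u worker -p x \
  -I 20 --thread-concurrency 24000
```

On Linux these exports only apply to the current shell session, so they usually go in the launch script rather than being set once like setx on Windows.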
 
Verify that the setx commands have been executed properly; I had one machine where they were converted to lower case. From a DOS window, do:
set | find /i "gpu"
and verify spelling and case.

H.
 