# GUIMiner outperforming CGMiner on my R9 270?



## NitroOC

Long story short: I used GUIminer to rough in my config for CGminer, since everyone says CGminer is the best and gets you the best hash rates. But if I enter the same settings GUIminer uses, I get hardware errors like crazy, plus stales, and my effective hashrate (what my pool sees) is around 150-200 KH/s, where it would be 400-450 using GUIminer.

I guess I could just keep using GUIminer, but what the heck is wrong with CGminer and this?

Quick specs on the situation:

- Dell Dimension 4700
- Pentium 4
- 4 GB RAM
- 250 GB HDD
- Windows 7 32-bit
- Twin 400 W PSUs (the Dell PSU doesn't have enough 4-pin connectors, but oddly enough has 18 A on both 12 V rails)
- Gigabyte R9 270

Here's the settings in GUIminer:



Last CGminer settings that did anything somewhat respectable; it won't even start up with thread concurrency over 7680.

Code:

```json
{
"pools" : [
        {
                "url" : "stratum+tcp://pool1.eu.multipool.us:3352",
                "user" : "",
                "pass" : ""
        }
],
"intensity" : "19",
"vectors" : "1",
"worksize" : "256",
"kernel" : "scrypt",
"lookup-gap" : "2",
"thread-concurrency" : "7680",
"shaders" : "0",
"gpu-engine" : "0-0",
"gpu-fan" : "0-85",
"gpu-memclock" : "0",
"gpu-memdiff" : "0",
"gpu-powertune" : "0",
"gpu-vddc" : "0.000",
"temp-cutoff" : "95",
"temp-overheat" : "85",
"temp-target" : "75",
"api-mcast-port" : "4028",
"api-port" : "4028",
"expiry" : "120",
"gpu-dyninterval" : "7",
"gpu-platform" : "0",
"gpu-threads" : "1",
"hotplug" : "5",
"log" : "1",
"no-pool-disable" : true,
"queue" : "1",
"scan-time" : "30",
"scrypt" : true,
"temp-hysteresis" : "3",
"shares" : "0",
"kernel-path" : "/usr/local/bin",
"device" : "0"
}
```

When I turn up the TC in CGminer, it throws an error on startup (error -61) saying I'm trying to allocate more memory for mining than is available.
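As a rough back-of-the-envelope check (my own arithmetic, not from cgminer itself): scrypt with N=1024 needs a 128·N-byte scratchpad per thread, divided by lookup-gap, so the buffer cgminer tries to allocate is roughly thread-concurrency × 128 × 1024 / lookup-gap bytes:

```python
# Rough scrypt scratchpad estimate (assumption: 128 * N bytes per thread,
# N = 1024 for Litecoin/Dogecoin-style scrypt, reduced by lookup-gap).
def scratchpad_bytes(thread_concurrency, lookup_gap=2, n=1024):
    return thread_concurrency * 128 * n // lookup_gap

MB = 1024 * 1024
print(scratchpad_bytes(7680) / MB)    # TC 7680, gap 2  -> 480.0 MB
print(scratchpad_bytes(20400) / MB)   # TC 20400, gap 2 -> 1275.0 MB
```

On a 2 GB card, ~1275 MB reportedly exceeds what the AMD OpenCL driver allows for a single buffer by default, which would fit with an allocation error at high TC; the `GPU_MAX_ALLOC_PERCENT` environment-variable fix mentioned later in the thread raises that limit.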

What do you guys think?


----------



## Namwons

I think your thread concurrency, worksize, and intensity are all too high. Try concurrency between 4,000-5,000, worksize 256, and intensity 13. Start low and work your way up, just like overclocking. Also try some settings that others have used here. Not sure if you are going to get more out of the 270, though; maybe up to 450 or so.
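That starting point could look something like this as a cgminer command line (a sketch only; the pool URL is taken from the config above, and USER/PASS are placeholders):

```
cgminer --scrypt -o stratum+tcp://pool1.eu.multipool.us:3352 -u USER -p PASS ^
  -I 13 -w 256 --thread-concurrency 4096 --lookup-gap 2 -g 1
```

Then nudge `-I` and `--thread-concurrency` up one step at a time, watching for hardware errors, the same way you'd creep up an overclock.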


----------



## NitroOC

What confuses me is how GUIminer runs fine at the settings I'm using. Or is there no correlation between the settings the two run?

Last night I ran guiminer with TC at 20400 and an intensity of 19, OC on the card is 1050/1500, 100% fan, and 20% power increase. It achieved 6650 shares accepted, 1096 stale. On my pool that's about 660 DOGE/hr.


----------



## Namwons

Quote:


> Originally Posted by *NitroOC*
> 
> What confuses me is how does guiminer run at the settings I'm running it at? Or is there no correlation between the settings they run?
> 
> Last night I ran guiminer with TC at 20400 and an intensity of 19, OC on the card is 1050/1500, 100% fan, and 20% power increase. It achieved 6650 shares accepted, 1096 stale. On my pool that's about 660 DOGE/hr.


Somewhere I read that intensity has to be raised or lowered until the accepted-to-reject rate is within 1-2%.

As for GUIminer, I'm not sure why it runs in GUI but not in CG. I thought GUIminer is just a user interface that automatically creates a .bat file for your miner. I have used a GUI-created .bat for CUDAminer no problem; not sure why it wouldn't be the same for CG.
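For reference, the quoted numbers can be checked against that 1-2% rule directly (my arithmetic, counting stales as rejects):

```python
# Reject rate from NitroOC's quoted GUIminer run:
# 6650 accepted shares, 1096 stale/rejected.
accepted = 6650
stale = 1096
reject_rate = stale / (accepted + stale)
print(f"{reject_rate:.1%}")  # about 14.1% -- far above the 1-2% target
```

So even though the raw hashrate looks high at intensity 19, a big chunk of that work is being thrown away.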


----------



## NitroOC

Update: I worked on the CGminer config for a bit today, and this is working nearly perfectly:

Code:

```json
{
"pools" : [
        {
                "url" : "stratum+tcp://pool1.eu.multipool.us:3352",
                "user" : "",
                "pass" : ""
        }
],
"intensity" : "19",
"vectors" : "1",
"worksize" : "256",
"kernel" : "scrypt",
"lookup-gap" : "0",
"thread-concurrency" : "20400",
"shaders" : "0",
"gpu-engine" : "0-0",
"gpu-fan" : "0-85",
"gpu-memclock" : "0",
"gpu-memdiff" : "0",
"gpu-powertune" : "0",
"gpu-vddc" : "0.000",
"temp-cutoff" : "95",
"temp-overheat" : "85",
"temp-target" : "75",
"api-mcast-port" : "4028",
"api-port" : "4028",
"expiry" : "120",
"gpu-dyninterval" : "7",
"gpu-platform" : "0",
"gpu-threads" : "1",
"hotplug" : "5",
"log" : "1",
"no-pool-disable" : true,
"queue" : "1",
"scan-time" : "30",
"scrypt" : true,
"temp-hysteresis" : "3",
"shares" : "0",
"kernel-path" : "/usr/local/bin",
"device" : "0"
}
```


----------



## Epipo

Good job "not really knowing how well it works"; hope you keep updating.


----------



## test tube

To run with high memory usage using the external program via the .bat, you need to set your environment variables; see https://bitcointalk.org/index.php?topic=117221.0

GUIminer bundles the latest version of cgminer and does this for you, though, so there's no real point...


----------



## moejama

You need to add the environment variables as stated above.

Open a command prompt and run each command. `setx` makes them persistent, so you only need to do this once instead of including it in a batch file:

Code:

```
setx GPU_MAX_ALLOC_PERCENT 100
setx GPU_USE_SYNC_OBJECTS 1
```
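If you'd rather not set them system-wide, the same variables can go at the top of the launcher .bat instead, so they only apply to that session (a sketch; the `cgminer.exe` path and config filename are assumptions):

```bat
@echo off
rem Per-session alternative to setx: only affects this cgminer launch.
set GPU_MAX_ALLOC_PERCENT=100
set GPU_USE_SYNC_OBJECTS=1
cgminer.exe --config cgminer.conf
```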


----------



## dathaeus

Hey... I have slightly different components, but I'm on Win7 64-bit and have two R9 270s in there... did these final settings give you around 450?
Quote:


> Originally Posted by *NitroOC*
> 
> Update: I worked on the CGminer config for a bit today, this is working nearly perfect:
> 
> Code:
> 
> 
> {
> "pools" : [
> {
> "url" : "stratum+tcp://pool1.eu.multipool.us:3352",
> "user" : "",
> "pass" : ""
> }
> ]
> ,
> "intensity" : "19",
> "vectors" : "1",
> "worksize" : "256",
> "kernel" : "scrypt",
> "lookup-gap" : "0",
> "thread-concurrency" : "20400",
> "shaders" : "0",
> "gpu-engine" : "0-0",
> "gpu-fan" : "0-85",
> "gpu-memclock" : "0",
> "gpu-memdiff" : "0",
> "gpu-powertune" : "0",
> "gpu-vddc" : "0.000",
> "temp-cutoff" : "95",
> "temp-overheat" : "85",
> "temp-target" : "75",
> "api-mcast-port" : "4028",
> "api-port" : "4028",
> "expiry" : "120",
> "gpu-dyninterval" : "7",
> "gpu-platform" : "0",
> "gpu-threads" : "1",
> "hotplug" : "5",
> "log" : "1",
> "no-pool-disable" : true,
> "queue" : "1",
> "scan-time" : "30",
> "scrypt" : true,
> "temp-hysteresis" : "3",
> "shares" : "0",
> "kernel-path" : "/usr/local/bin",
> "device" : "0"
> }


----------



## NitroOC

Been black screening... Only got it to run stable at 410, stock clocks.


----------



## FatSpitfire

Guys, I just made this account on the forum because this is the only topic on the net about the R9 270, so to start off: who has what rig?

I have an Asus P8 motherboard with a (slow!) P4 3.0 GHz and second-hand 4 GB DDR3 RAM.

I didn't even know they were the best buy right now when I bought my two R9 270s (also Asus). I hear some 270s may get up to 470; it completely depends on the manufacturer. I see them like this: Gigabyte for performance, Asus for endurance, Sapphire for best price. Please correct me if I'm wrong. Your setups are good, but I think you run them at a very high intensity level (19 is a lot). I get a lot more errors when I kick them to the max at I=19 than when I use less intensity and more thread concurrency. Don't push your cards too hard; for 8-10 KH/s more you could lose a lot more than just a couple of shares!


----------



## dathaeus

LOL, there are several threads on the 270 on the internet... the best one I found a while back was on the Feathercoin forum... methinks you need to search Google better.


----------



## FatSpitfire

... And I also use GUIminer, which I think is better than CG, but there is another problem at hand here: how do you guys watch out for your card temp readings?


----------



## FatSpitfire

Haha, maybe you're right.

Let me say then that I just found it by accident.


----------

