# AMD Richland A10-6800K APU thread



## Castaa

I'm updating the Wikipedia entry, and I thought I'd pass along the latest info about Richland:

4 Piledriver Cores (2 Modules)
4.4 GHz Turbo/4.1 GHz normal
HD 8670D GPU w/ 384 Stream Processors @ 844 MHz
DDR3 2133 support
Socket FM2
An AMD-reported 20-40% faster graphics core over the previous-generation APU (Trinity)
Mobile Richland APUs were officially announced March 12th, 2013.

Desktop APUs are expected in June 2013.

Sources:
http://www.fudzilla.com/home/item/29933-top-richland-28nm-apu-is-a10-6800k
http://www.cpu-world.com/CPUs/Bulldozer/AMD-A10-Series%20A10-6800K.html
http://www.fudzilla.com/home/item/30299-richland-clocks-leaked


----------



## boot318

Two socket CPU upgrades in one year... This just goes to show how AMD feels about their APUs. I think they should have just waited for SR, but they may surprise me on the iGPU end.


----------



## skitz9417

Well, at least we'll see how the Steamroller cores turn out, and whether they're better than the old APUs.


----------



## xd_1771

I'm curious as to whether there'll be a CPU performance improvement in this. At least, I know to expect Phenom II x4 class performance (and the associated loss of two multithreaded cores for me). I'm interested in pursuing Socket FM2 because I would like to downsize to Mini ITX at some time in the future. Kaveri APU and HSA support will really make it for me but I'm thinking of moving as early as the Richland generation release. Since it is theoretically a downgrade for me (though the downgraded machine will continue to fit my needs), I can free up new cash to balance my computing demands with items such as a very good 10.1" tablet.


----------



## skitz9417

Yeah, I hope Steamroller will be better than the Phenom X4s and X6s as well; plus, Bulldozer to Piledriver was only a little improvement.


----------



## boostinsteve

I jumped on the APU bandwagon with a computer I built for my GF. She used to come use my computer, and it was aggravating, as this is mine, and she gets the front room TV. Now she has that, and I am actually very impressed. Hopefully they can start packing some really good iGPU power, and I will just make the switch myself. I will wait and see what happens with Kaveri, however, before doing anything too rushed.


----------



## skitz9417

I'll change to an APU if the Steamroller desktop CPUs ain't any good.


----------



## xzamples

what gpu will this use?


----------



## Deadboy90

Quote:


> Originally Posted by *xzamples*
> 
> what gpu will this use?


It's supposed to use GCN architecture.


----------



## vanara.hen

Quote:


> Originally Posted by *xzamples*
> 
> what gpu will this use?


HD 8670


----------



## DeadFire

I have very high hopes for the APU platform! If they go where I would like to see them, BIG cooling is going to be very important! I want to see the IGP with a 256-bit bus and native support for DDR3-2400. Not that the current chips won't run 2400; the APUs are limited because of their bus. I'm only getting 40 GB/s of bandwidth on my A10 with 2400, but the textures and pixels it puts out are easily on par with a GTX 460 SE!
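A quick back-of-envelope check on that figure, as a sketch (the dual-channel, 64-bit-per-channel DDR3 configuration is assumed):

```python
# Theoretical peak bandwidth for a dual-channel DDR3 memory controller.
# Each 64-bit channel moves 8 bytes per transfer; DDR3-2400 runs 2400 MT/s.
def ddr3_peak_gbs(mt_per_s: int, channels: int = 2, bytes_per_xfer: int = 8) -> float:
    """Peak bandwidth in GB/s (decimal)."""
    return mt_per_s * 1e6 * bytes_per_xfer * channels / 1e9

print(ddr3_peak_gbs(2400))  # 38.4 GB/s -- right in line with the ~40 GB/s observed
```

Real-world throughput lands below this peak, which is why the measured number and the theoretical one are so close together here.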


----------



## Castaa

Disappointing to find that leaked info shows the A10-6800K will have the same number (384) of stream processors as the 5800K.

http://www.fudzilla.com/home/item/30264-more-richland-details-leak-six-parts-confirmed

So that 20-40% improvement quote looks to be only a bump in officially supported memory speed and, I would assume, GPU clock speed. Meaning the GPU clock would have to be raised to at least 900-1000 MHz.


----------



## zulk

Aren't they rumored to be GCN cores?


----------



## spatulator

No... there is a slide from AMD released recently that shows the 2013 roadmap, and although it does not say which GPU tech will go into Richland, it does say GCN for Kaveri. Just read between the lines. Also, official support for 2133 memory speed should give about a 5-10% graphics boost alone, so Richland looks like a very minor update to me. I also read that the 20% increase was referring to the laptop version of Richland. Do we even know if desktop Richland is coming?


----------



## Castaa

The A10-6800K name itself suggests it's a desktop part. No laptops that I know of used the Trinity A10-5800K part.

And yes, I would agree: Richland is only going to be a modest improvement over Trinity. Given what we know (or are rumored to know), I would be shocked if we see even a 20% improvement in actual game performance, which is the only thing that matters in the end.


----------



## Castaa

All the Richland clock speeds have apparently leaked, though it's not official.

http://www.fudzilla.com/home/item/30299-richland-clocks-leaked


----------



## A Bad Day

I wouldn't expect any major improvements within six months. Not even Intel can do that, unless someone is really threatening their market share.


----------



## MrPerforations

nvm


----------



## FIRINMYLAZERMAN

I have an AMD Phenom II X4 965 BE CPU @ 3.4GHz stock speed paired with an AMD Radeon HD 6770 1GB GDDR5 dedicated GPU. How much better or worse will the AMD A10-6800K with the AMD Radeon HD 8670D IGP be in comparison to my Phenom II X4 965 BE/Radeon HD 6770 setup in terms of gaming?


----------



## DaveLT

The new A10-6800K is said to go head to head with an HD 7750 (yes, no joke!)


----------



## FIRINMYLAZERMAN

So, what you're saying is that the integrated Radeon HD 8670D graphics chip is supposed to be about equal to a dedicated Radeon HD 7750 chip in terms of overall performance, or... are you saying that the A10-6800K w/ the Radeon HD 8670D IGP can be paired in Hybrid CrossFire with an HD 7750 dedicated GPU?


----------



## DaveLT

Going head to head means equal with an HD 7750.


----------



## artk2219

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> I have an AMD Phenom II X4 965 BE CPU @ 3.4GHz stock speed paired with an AMD Radeon HD 6770 1GB GDDR5 dedicated GPU. How much better or worse will the AMD A10-6800K with the AMD Radeon HD 8670D IGP be in comparison to my Phenom II X4 965 BE/Radeon HD 6770 setup in terms of gaming?


I would say the CPU would be about the same as your 965, maybe slightly faster, but if it is, it's nothing you couldn't achieve through a modest OC. Graphics-wise is another question altogether, although I would be surprised if it matched, much less beat, your 6770. However, AMD is supposed to be supporting Socket FM2 in the future, so Kaveri may give you a sizable boost if you go that route; alternatively, if you already have an AM3+ mobo, Steamroller is "supposed" to be AM3+ compatible, but we won't know until we get some more information. Hope this helped!


----------



## spatulator

I agree, wait for Kaveri... you don't want to buy a bunch of new hardware only to end up with similar performance to the old setup.


----------



## DaveLT

It's been said to have performance like a 7750.
Anyway, Kaveri won't be until the end of the year and I really need a new build soon, so I'm gunning for Richland; but if it's not all that much faster I will grab Trinity at cut prices (and I doubt it will be all that much faster).


----------



## BlankName

Quote:


> Originally Posted by *DaveLT*
> 
> It's been said to have performance like a 7750.
> Anyway, Kaveri won't be until the end of the year and I really need a new build soon, so I'm gunning for Richland; but if it's not all that much faster I will grab Trinity at cut prices (and I doubt it will be all that much faster).


It won't be able to match a 7750 in terms of performance, simply because it is rumored to have only 384 SPUs compared to the 7750's 512 SPUs. It might have the same per-core performance, if that's what you are saying. Though again, it's still crippled by desktop DDR3 speeds; even with the best memory you'll be lucky to get 2800 MHz, which is far off the effective 4500 MHz of a stock 7750. If AMD uses HD 7000 architecture in Richland, we could see more shaders than 384. No one knows for sure until release day.
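To put rough numbers on that gap (a sketch; I'm assuming 128 bits of effective width on both sides: dual-channel 64-bit DDR3 for the APU, and the 7750's 128-bit GDDR5 bus):

```python
# Peak memory bandwidth from effective transfer rate and bus width.
def peak_gbs(effective_mt_s: int, bus_bits: int) -> float:
    return effective_mt_s * 1e6 * (bus_bits // 8) / 1e9

apu_bw    = peak_gbs(2133, 128)  # dual-channel DDR3-2133: ~34 GB/s
hd7750_bw = peak_gbs(4500, 128)  # GDDR5 at 4500 MT/s effective: 72 GB/s
print(round(hd7750_bw / apu_bw, 1))  # 2.1 -- the discrete card has ~2x the bandwidth
```

That roughly 2x bandwidth deficit is the real bottleneck for the iGPU, independent of shader counts.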


----------



## FIRINMYLAZERMAN

Quote:


> Originally Posted by *BlankName*
> 
> It won't be able to match a 7750 in terms of performance, simply because it only has 384 SPUs compared to the 7750's 512 SPUs. It might have the same per-core performance, if that's what you are saying. Though again, it's still crippled by desktop DDR3 speeds; even with the best memory you'll be lucky to get 2800 MHz, which is far off the effective 4500 MHz of a stock 7750.


Well, it'll at least perform better than the A10-5800K's IGP, that's for sure.

How MUCH better has yet to be seen.


----------



## Wall

Isn't the A10-6800K (the desktop part) expected on March 19? That's what the Wikipedia source suggests.


----------



## FIRINMYLAZERMAN

Quote:


> Originally Posted by *Wall*
> 
> Isn't the A10-6800K (the desktop part) expected on March 19? That's what the Wikipedia source suggests.


That's what I've heard from certain sources, but I've also heard from other sources that it could be coming out in June of this year. I'm really hoping that it comes out in March though!


----------



## ericore

Technically, the A10-6800K can outperform the 7750 video card.
Remember, by default the 7750 uses DDR3, not GDDR5; only the more expensive models of the 7750 ship with that.
You don't need GDDR5 for 720p gaming; you do need it for 1080p and above.
Though the A10-6800K has 384 cores and the 7750 has 512 cores, the performance does not really scale up in this type of comparison.
Case in point: the A8-5600K has 256 cores and is almost as fast as the A10-5800K; it might be 4 FPS slower on average compared to the A10-5800K.
Technically, an APU should have a natural advantage to render more FPS as long as memory bandwidth is not an issue.
If you want the best FPS for 720p, then get DDR3-1600 with CAS 7.
If you want the best FPS for 1080p, then get DDR3-2133 with the lowest CAS.
Don't expect the A10-6800K to be sufficient for most 1080p games; it will work on many, but on low settings.
Expect the APU to hit prime time come DDR4; then there won't be nearly as much point in getting a discrete GPU.
Also, the reason the A10-6800K is faster than the 7750, besides all the benefits of being on one die, is that the cores are not the same as those in the 7750; AMD made some updates and they are more efficient.
The real question is: can you get a 20-40% improvement on the A8-6600K, or did AMD mostly find a way to use up the mostly unused cores in the A10-5800K?
There is a 100 MHz higher GPU clock, which can't hurt.
Overclocking the GPU by another 100-150 MHz would yield enough power for ALL games at 720p high-ultra settings, 1080p low-medium.
When DDR4 comes out, add one to the quality settings.
DDR4 for AMD will hit Q4 2014, and that will be the next AMD motherboard to get.
By 2014, AMD will have nearly completely caught up in terms of CPU performance (floating point) for games.
Bar none, the A10-6800K is the best/most affordable CPU for DolphinEMU.


----------



## DaveLT

Are you sure they're gonna improve the FPU this time round? I'm sorta sick of waiting since Stars for AMD to improve their FPU, but they actually made it worse.
It's better to just buy 1866, best of both worlds. After all, most 1866 kits can OC to 2133 easily, or an ARES 1866 CL9 can get you to 2400 (Trinity is capable of 2800 on air... but that's another story).


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> Are you sure they're gonna improve the FPU this time round? I'm sorta sick of waiting since Stars for AMD to improve their FPU, but they actually made it worse.
> It's better to just buy 1866, best of both worlds. After all, most 1866 kits can OC to 2133 easily, or an ARES 1866 CL9 can get you to 2400 (Trinity is capable of 2800 on air... but that's another story).


No, the FPU is going to stay as it is _in terms of actual unit capabilities_.

Steamroller is the next architectural restructuring. It won't come with more FPU power _per se_, but due to changes in the way the cores are fed, the throughput will increase quite substantially.


----------



## Heavy MG

Quote:


> Originally Posted by *Artikbot*
> 
> No, the FPU is going to stay as it is _in terms of actual unit capabilities_.
> 
> Steamroller is the next architectural restructuring. It won't come with more FPU power _per se_, but due to changes in the way the cores are fed, the throughput will increase quite substantially.


Is it worth it to upgrade to a 6800K and whatever graphics card it will CrossFire with, if you already have a 5800K? I'm thinking of just waiting for the Steamroller & GCN based Kaveri, as long as AMD keeps FM2.


----------



## DaveLT

Not quite. Wait for Kaveri.

It's only a 10-20% CPU performance increase (from the extra clock), even if it might be cheaper this time round.


----------



## M3T4LM4N222

Although the A10-6800K doesn't look like it will be overly impressive or a huge increase from the A10-5800K, I am probably going to dive in and upgrade. I'm toying with the idea of getting rid of my A10-5800K + 6670 in an AMD Dual Graphics combo.


----------



## Deadboy90

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> Although the A10-6800K doesn't look like it will be overly impressive or a huge increase from the A10-5800K, I am probably going to dive in and upgrade. I'm toying with the idea of getting rid of my A10-5800K + 6670 in an AMD Dual Graphics combo.


Might be worth it; word is the 6800K is supposed to use GCN for its graphics, so I'm thinking CrossFire with a 7750 might be possible.


----------



## Dimaggio1103

Just purchased an A8-5600K build for my store. It's gonna be a dual-purpose rig: runs my QuickBooks (register) and light gaming when I don't have any work or customers in.

I only grabbed a 5600K because I wanna buy a Richland APU. I did a review on the first APUs in my sig; hope Trinity is a nice step up. Would love some Left 4 Dead 2 on my down time. lol


----------



## M3T4LM4N222

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Just purchased an A8-5600K build for my store. It's gonna be a dual-purpose rig: runs my QuickBooks (register) and light gaming when I don't have any work or customers in.
>
> I only grabbed a 5600K because I wanna buy a Richland APU. I did a review on the first APUs in my sig; hope Trinity is a nice step up. Would love some Left 4 Dead 2 on my down time. lol


My A10's 7660D maxes L4D2 at 41 FPS on average, with maxes in the 60s and lows in the 30s. It actually plays extremely smooth. The A8's iGPU has fewer shaders, so you may have to turn down some settings to yield better frames.


----------



## Dimaggio1103

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> My A10's 7660D maxes L4D2 at 41 FPS on average, with maxes in the 60s and lows in the 30s. It actually plays extremely smooth. The A8's iGPU has fewer shaders, so you may have to turn down some settings to yield better frames.


Right on. I mostly game on my sig rig, but hopefully this little APU will work for those slow work days.


----------



## spatulator

Quote:


> Originally Posted by *Deadboy90*
> 
> Might be worth it, word is 6800k is supposed to use GNC for its graphics so im thinking xfire with a 7750 might be possible.


I don't expect GCN...

If Richland were using GCN, AMD would be saying so in this slide. They show Kaveri having GCN; I expect Richland to use the same VLIW4 arch as Trinity. In fact, they are calling it 2nd generation. Notice that Trinity is referred to as 2nd generation? The whole platform is being shown here as being in the same generation category as Trinity.


----------



## ericore

*Kaveri isn't worth the upgrade; better to get Richland and the APU that comes after Kaveri.*
Kaveri is being released on the FM2 platform, so you would think "I should wait."
The reason you may want to skip Kaveri is that since it's shipping on FM2, it will still be DDR3.
The APU after Kaveri will use DDR4, which will be a more substantial performance increase for both the GPU and CPU parts of the APU.
Richland uses VLIW5 + Piledriver. Kaveri will use GCN + Steamroller.
I don't think GCN will make much difference for FPS. Steamroller cores will be nice, but anyone who buys this APU wants either a gaming or an entertainment station.


----------



## DaveLT

Umm, no. GCN will make a HUGE difference clock for clock. IPC will probably be better, hopefully. The FPU will probably get a boost. And for god's sake, Trinity and Richland use VLIW4!


----------



## thepoopscooper

I think I am going to skip this and wait for Kaveri...


----------



## spatulator

Quote:


> Originally Posted by *ericore*
> 
> *Kaveri isn't worth the upgrade; better to get Richland and the APU that comes after Kaveri.*
> Kaveri is being released on the FM2 platform, so you would think "I should wait."
> The reason you may want to skip Kaveri is that since it's shipping on FM2, it will still be DDR3.
> The APU after Kaveri will use DDR4, which will be a more substantial performance increase for both the GPU and CPU parts of the APU.
> Richland uses VLIW5 + Piledriver. Kaveri will use GCN + Steamroller.
> I don't think GCN will make much difference for FPS. Steamroller cores will be nice, but anyone who buys this APU wants either a gaming or an entertainment station.


AMD said that FM2 will be supported at least until Excavator, so I would expect to wait a little longer than you are figuring for DDR4. Richland uses VLIW4, which is better suited than VLIW5 for on-die graphics; where on earth are you getting your information?


----------



## Deadboy90

Quote:


> Originally Posted by *spatulator*
> 
> I don't expect GCN...
>
> If Richland were using GCN, AMD would be saying so in this slide. They show Kaveri having GCN; I expect Richland to use the same VLIW4 arch as Trinity. In fact, they are calling it 2nd generation. Notice that Trinity is referred to as 2nd generation? The whole platform is being shown here as being in the same generation category as Trinity.


If they aren't using GCN what's the point of refreshing the line? Might as well wait for Kaveri.


----------



## BlankName

Quote:


> Originally Posted by *Deadboy90*
> 
> If they aren't using GCN what's the point of refreshing the line? Might as well wait for Kaveri.


A 20-40% increase in graphics performance? That's quite a big improvement over Trinity, so this will give OEMs a newer and better product for their machines before the summer back-to-school sales. It may be 9+ months before we even see Kaveri released to OEMs, and it could be next year before Kaveri hits store shelves. So Richland is a perfect chip to keep AMD's APU line rolling between now and then.


----------



## ericore

VLIW4 = Radeon 5000 series
VLIW5 = Radeon 6000 series
GCN = Radeon 7000 series

Fact: numerous leaks indicate significant performance improvements. We know it can't be the CPU, and it can't simply be better power management; we know the core count is the same, and the core clock received something like an 84 MHz boost, which can't possibly give 20% better FPS.
1. That's how you can logically deduce that Richland uses VLIW5: it is the only thing consistent with the claim.
2. You'll notice the slide says "2nd generation DirectX 11 GPU". The first gen was the Radeon 5000 series, so it follows that this is the Radeon 6000 series, AKA VLIW5.

Also, AMD did not state it would use FM2 for 2 years; what it stated was that it has committed to using sockets for at least 2 more years. Seems like a short window for a new socket, but then again it didn't take long to release FM2 after FM1, so don't be surprised if they release at least 1 more socket for APUs. Kaveri, though, will work with FM2 and will be released in Q4 2013.

Intel is releasing DDR4 with Broadwell in 2014. It's impossible to foresee whether AMD will be able to do so in 2014, but if they miss that window you're probably looking at Q2-Q3 of 2015.
It really doesn't matter whether AMD releases this with or without a new socket; the point is the APU will be much more efficient regardless, due to DDR4 being more than twice as fast.


----------



## DaveLT

*facepalm* 2nd-gen IGPs, not discrete cards. It uses VLIW4; for god's sake, even AMD said so themselves. Are you being an idiot?


----------



## spatulator

If you're wondering why AMD went with VLIW4 vs VLIW5 for Trinity (and presumably for Richland too), here's a THG article that explains it: http://www.tomshardware.com/reviews/a10-4600m-trinity-piledriver,3202-3.html. If you're wondering where the 20-40% performance boost for Richland comes from, take the clock speed increase (800 to 840 MHz = 5%)... and add the gains from 2133 MHz DDR3 support... voila. If there is some front-end memory sharing or bus enhancement, that would be the icing on the cake. It's good that there will be a better chip available, but this really is an "in between" kind of stopgap until Kaveri comes out. But if it's reasonably priced and you're building a new light-gaming system, why not... it should be a good option.

You have to assume that AMD is going to tout this new chip in the most positive way possible. I read that the 40% they are throwing out there refers to their low-power notebook chips, which makes sense because Richland is supposed to be more power efficient, which manifests itself as a performance increase in 25-35 watt CPUs. There are many different ways to configure desktop Richland's graphics with different memory speeds... AMD's statistics are showing, let's say, 20% for the desktop 6800; this could be based on a comparison of a Trinity build with 1600 MHz memory vs a Richland build with 2133 MHz memory. That would certainly account for the other 15%.
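The arithmetic on that breakdown, as a sketch (rumored clocks; the memory term is a best-case ceiling that only applies if the game is fully bandwidth-bound):

```python
# Multiply the two independent gains to get a best-case combined uplift.
clock_gain = 840 / 800 - 1    # 800 -> 840 MHz GPU clock: 5%
mem_gain   = 2133 / 1600 - 1  # DDR3-1600 -> DDR3-2133: ~33% more peak bandwidth
best_case  = (1 + clock_gain) * (1 + mem_gain) - 1
print(f"{best_case:.0%}")  # 40% ceiling; real-game gains will land well below this
```

Notice that the "other 15%" mostly evaporates if the Trinity baseline is also given 2133 MHz memory, which is exactly the marketing angle being described above.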


----------



## Heavy MG

Quote:


> Originally Posted by *spatulator*
> 
> If you're wondering why AMD went with VLIW4 vs VLIW5 for Trinity (and presumably for Richland too), here's a THG article that explains it: http://www.tomshardware.com/reviews/a10-4600m-trinity-piledriver,3202-3.html. If you're wondering where the 20-40% performance boost for Richland comes from, take the clock speed increase (800 to 840 MHz = 5%)... and add the gains from 2133 MHz DDR3 support... voila. If there is some front-end memory sharing or bus enhancement, that would be the icing on the cake. It's good that there will be a better chip available, but this really is an "in between" kind of stopgap until Kaveri comes out. But if it's reasonably priced and you're building a new light-gaming system, why not... it should be a good option.
> 
> You have to assume that AMD is going to tout this new chip in the most positive way possible. I read that the 40% they are throwing out there refers to their low-power notebook chips, which makes sense because Richland is supposed to be more power efficient, which manifests itself as a performance increase in 25-35 watt CPUs. There are many different ways to configure desktop Richland's graphics with different memory speeds... AMD's statistics are showing, let's say, 20% for the desktop 6800; this could be based on a comparison of a Trinity build with 1600 MHz memory vs a Richland build with 2133 MHz memory. That would certainly account for the other 15%.


Once you purchase 2133 MHz RAM and OC it along with the GPU, it is kind of pointless to upgrade to Richland, unless AMD improved the lacking IMC performance or allows for a better graphics card in Dual Graphics mode. I would even consider Richland a nice upgrade from my 5800K if Richland has a lower TDP. If Kaveri is FM2 it will be a worthy upgrade, but there will probably be new motherboards for it by then. I'd like to see some GDDR3 or GDDR5 GPU RAM cache on the motherboard; it would make for an epic iGPU.


----------



## DaveLT

Same TDP but higher factory clocks. The reason they can release Richland as a stopgap is that yields have improved, not least having done 32nm for 2-3 years already.


----------



## spatulator

I just noticed this slide from CES and what it says about Richland...

Notice the wording:
"20-40% more performance than previous generation".
Now take a look back at the slide I posted earlier in this thread...

Which generation is Richland a part of?
What is the previous generation?

It's plain to see: AMD was comparing Richland to Llano. Why else would they say "previous generation"? They are even calling Richland a 2nd-generation APU. Well, folks... there it is. Richland is really Trinity 2.0.


----------



## Castaa

Quote:


> Originally Posted by *spatulator*
> 
> I just noticed this slide from CES and what it says about Richland...
>
> Notice the wording:
> "20-40% more performance than previous generation".
> Now take a look back at the slide I posted earlier in this thread...
>
> Which generation is Richland a part of?
> What is the previous generation?
>
> It's plain to see: AMD was comparing Richland to Llano. Why else would they say "previous generation"? They are even calling Richland a 2nd-generation APU. Well, folks... there it is. Richland is really Trinity 2.0.


Judging by the specs, you are probably right. No way the fastest Richland is even 20% faster than the fastest Trinity part.


----------



## FIRINMYLAZERMAN

I honestly don't think I'm going to replace my current CPU and GPU in my main computer for a little while yet. However, I'm still thinking about getting an A10-6800K for my future budget mini-ITX LAN rig. Does anyone know when the A10-6800K is supposed to be available to the public to buy?


----------



## Castaa

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> I honestly don't think I'm going to replace my current CPU and GPU in my main computer for a little while yet. However, I'm still thinking about getting an A10-6800K for my future budget mini-ITX LAN rig. Does anyone know when the A10-6800K is supposed to be available to the public to buy?


I've read this month, but I've also read June. My guess is this means laptops this month and a retail desktop part this June, if it follows the pattern Trinity set.


----------



## Artikbot

Quote:


> Originally Posted by *ericore*
> 
> VLIW4 = Radeon 5000 series
> VLIW5 = Radeon 6000 series
> GCN = Radeon 7000 series


Wrong. VLIW4 and VLIW5 are both present in the HD 6000 series. In fact, VLIW5 was the architecture behind the Radeon 2900, and onwards up to the HD 6870. Only the HD 6900 series uses VLIW4, which came after VLIW5.

And GCN is brutally more powerful than VLIW4 (and of course than the ancient VLIW5) per core. So a 40% improvement by Kaveri over Trinity in GPU performance alone is very feasible. CPU performance will get a very decent boost as well, perhaps in the region of 25-30%.

As for Richland, it's just a tweaked and improved Trinity. I'm expecting a 15-20% general performance improvement.


----------



## RegularBear

Richland is Trinity 2.0.
Still has Piledriver cores, still has VLIW4 shaders. It does not make use of GCN shaders, and it is still a 32nm part.
The performance increases being touted for Richland are due in part to some tweaked microcode, and some tweaks to GloFo's previously very problematic 32nm node.
The 20-40% performance-increase figures being thrown around are taken from lower-clocked laptop chips. These low-end parts could clock marginally higher to help squeeze out more performance, thanks to process improvements and the aforementioned microcode tweaks. The slide shown at CES with these claims carried a footnote, and this attached slide clarifies where those numbers came from.
The higher-binned parts, especially the highest-end desktop part, will not see nearly such a large increase in performance.

Just to really drive this home:
Richland is not Kaveri
Richland does not use GCN graphics
Richland is a slight upgrade of Trinity with no real architectural changes

I hope this clarifies things.

If you wish you may view the full slidestack, including footnotes, from the 2013 CES presentation here:
http://www.slideshare.net/AMD/amd-ces-2013-press-conference



----------



## spatulator

I got the same info about the 20-40% claim being based on a laptop part, not a desktop. I read it on a third-party website covering AMD at CES. I believed that claim to be accurate at the time because it makes a lot of sense. But after looking at these slides, both coming directly from AMD (see my previous post above), I now have to change my opinion on the 20-40% claim, based on what is coming from AMD directly here, and very plainly: AMD was comparing Richland to Llano. In kind of a sneaky way... but I don't blame 'em, just marketing BS to be expected.


----------



## Castaa

Launching March 12th:

http://www.fudzilla.com/home/item/30662-amd-to-launch-richland-on-12th-of-march


----------



## numba18

So, noting that this new APU will only be a tad faster than the current Trinity model, is there any benefit in waiting if I plan to upgrade to a discrete graphics card in the future?


----------



## hathornd

Didn't they mean 20-40% faster *GPU*? I don't remember seeing anyone say 20-40% faster everything.

-Donny

EDIT: Ahh, it's on the board behind the guy at the press conference... But it does have a superscript "5" by it, and we don't see 5... Suspect.


----------



## DaveLT

I think the 20% more performance is on the whole (higher clocks = more GFLOPS from both GPU and CPU).
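On the GPU side, the clock bump alone is easy to quantify, since peak shader GFLOPS scale linearly with clock at a fixed shader count (a sketch; the usual 2 FLOPs per shader per clock and the leaked 844 MHz clock are assumed):

```python
# Peak single-precision throughput: shaders x FLOPs-per-clock x clock (MHz) / 1000.
def gpu_gflops(shaders: int, clock_mhz: float, flops_per_clock: int = 2) -> float:
    return shaders * flops_per_clock * clock_mhz / 1000

a10_5800k = gpu_gflops(384, 800)  # Trinity:  614.4 GFLOPS
a10_6800k = gpu_gflops(384, 844)  # Richland: ~648 GFLOPS (leaked clock)
print(f"{a10_6800k / a10_5800k - 1:.1%}")  # 5.5% -- clocks alone buy only this much
```

Anything above that ~5.5% would have to come from memory speed or other tweaks, not raw shader throughput.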


----------



## Artikbot

Yeah I also think the 20-40% is for the package as a whole.

Now, 20-40% on Kaveri's GPU side alone, I can well see that.


----------



## Wall

Quote:


> Originally Posted by *Fudzilla*
> Richland will feature an integrated GPU that will be upgraded to Radeon HD 8000 series, a generation ahead of Trinity


Doesn't that mean GCN?


----------



## Castaa

Quote:


> Originally Posted by *Wall*
> 
> Doesn't that mean GCN?


Next gen graphics core. 8000 series.


----------



## RegularBear

It is simply a rebranding, just like the VLIW4 GPU in Trinity was originally branded as part of the 7XXX series. New products always get a flashy new number, it doesn't matter if they contain new architecture or not. The series of cards that carries the 8XXX branding are simply rebranded 7XXX series cards for OEMs. They aren't GCN2.0 products despite being rebranded with a higher number.

You have to seriously question anyone who would suggest that you take an existing GPU part made on a 28nm bulk node, and then scale it up to fit it on a 32nm PD SOI part. Scaling down is a viable option, scaling up is incredibly laughable.


----------



## Artikbot

Quote:


> Originally Posted by *RegularBear*
> 
> It is simply a rebranding, just like the VLIW4 GPU in Trinity was originally branded as part of the 7XXX series. New products always get a flashy new number, it doesn't matter if they contain new architecture or not.


Yup, this. Richland only gets minor improvements in both GPU and CPU, which basically means higher clock speeds and nothing else.


----------



## DaveLT

Kind of matches up, since the APUs' GPU parts have always lagged a year behind for the graphics division to "tune their drivers".
And the fact that GCN 2.0 is not coming until Q4 of this year means Kaveri will get GCN 1.0, if everything is on track for Q4.
Which also points to the fact that even the APU division doesn't wait for a new CPU arch and uses the one before; BD has been rubbish and PD is just much less power hungry.


----------



## Wall

If the A10-6800K packs more GFLOPS at the same wattage, wouldn't that reasonably mean you can get the same performance as a 5800K at a lower wattage, say 10-15 W? That could be important for people (me included) building mini-ITX PCs in very tight spaces.


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> And the fact that GCN 2.0 is not coming until Q4 of this year so Kaveri will then get GCN 1.0 if everything is right at Q4


AMD always sends the 1st version of the desktop parts to the APUs.

As in, Kaveri will use Steamroller cores before they are released to the desktop market, and most certainly it will use improved GCN graphics, something similar to what we'll see in GCN2.


----------



## DaveLT

Quote:


> Originally Posted by *Wall*
> 
> If the A10-6800k packs more GFLOPS at the same wattage, wouldn't that reasonably mean that you can get the same performance as a 5800k at a lower wattage, say 10-15 W ? That could be important for people (me included) building mini-ITX PCs in very tight spaces.


Yes or no. Maybe


----------



## runs2far

Looks like mobile first :-(

http://techreport.com/news/24482/amd-intros-35w-richland-mobile-apus


----------



## Castaa

Quote:


> Originally Posted by *runs2far*
> 
> Looks like mobile first :-(
> 
> http://techreport.com/news/24482/amd-intros-35w-richland-mobile-apus


Not surprising, since Trinity also launched on laptops before desktops.


----------



## spatulator

Quote:


> Originally Posted by *Wall*
> 
> Originally Posted by Fudzilla
> Richland will feature an integrated GPU that will be upgraded to Radeon HD 8000 series, a generation ahead of Trinity
> 
> Doesn't that mean GCN?


Fudzilla, I think, was duped by AMD's tricky marketing gibberish about generational improvements. According to AMD, Richland's CPU and GPU are both 2nd-generation APU tech, the same generation as Trinity.


----------



## Artikbot

Quote:


> Originally Posted by *spatulator*
> 
> Fudzilla I think was duped by AMD's tricky marketing jibberish about generation improvements.


And what the crap does this mean?









_this as in the sentence, _


----------



## spatulator

Quote:


> Originally Posted by *Artikbot*
> 
> And what the crap does this mean?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _this as in the sentence, _


Richland is not a generation ahead of Trinity, it is a 2nd generation APU like Trinity. The quote from Fudzilla was incorrect.


----------



## Wall

I know the article says they have "no timeline" for the desktop parts, but if anybody has any info, I would very much like to know it


----------



## azanimefan

Richland will utilize the Oland GPU, which is like the GCN version of the 6670. The 8670 will be on sale in the next month or two and is currently shipping as an OEM part in Dells and HPs.

So, yes, Richland will ship with a GCN-generation GPU, and apparently that GPU is 20-30% "better" than the 6670 it's replacing.

That said, Richland simply represents a moderate step up in performance. According to net rumors and press releases, it's Kaveri that will be a generational leap forward.


----------



## DaveLT

Quote:


> Originally Posted by *azanimefan*
> 
> The Richland will utilize the Oland CPU... which is like the GCN version of the 6670. The 8670 will be on sale in the next month or two... and is currently shipping as an OEM part in dells and hps.
> 
> So, yes, the Richland will ship with a GCN generation GPU... and apparently that GPU is 20%-30% "better" then the 6670 its replacing.
> 
> That said, Richland simply represents a moderate step up in performance. According to net rumors and press releases its Kaveri which will be a generational leap forward.


What? Dafuq?
Richland is not going to be a GCN GPU. It's merely Trinity 2.0


----------



## Wall

OK, just let me know when the desktop a10-6800k will be shipping!


----------



## Artikbot

Quote:


> Originally Posted by *spatulator*
> 
> Richland is not a generation ahead of Trinity, it is a 2nd generation APU like Trinity. The quote from Fudzilla was incorrect.


Agree.

They should have named them, e.g., A10-5850K instead of 6800K.


----------



## DaveLT

Quote:


> Originally Posted by *Artikbot*
> 
> Agree.
> 
> They should have named them, p.ex A10-5850K instead of 6800K.


Who knows if AMD will name Kaveri the A12-7800K?


----------



## Atomic Man

Quote:


> Originally Posted by *Wall*
> 
> OK, just let me know when the desktop a10-6800k will be shipping!


I was waiting for the A10-6800k myself, but then I did some research. Last year when Trinity was first announced for mobile it took 4 months before the desktop versions started shipping, and rumors earlier this year were pointing to a July release, 4 months from now.

I bought an A10-5800k and I will be skipping Richland, I may upgrade to Kaveri if my mobo is compatible.


----------



## Castaa

Quote:


> Originally Posted by *Wall*
> 
> I know the article says they have "no timeline" for the desktop parts, but if anybody has any info, I would very much like to know it


June is the month I've read for the desktop part.


----------



## A Bad Day

Quote:


> Originally Posted by *Artikbot*
> 
> Agree.
> 
> They should have named them, p.ex A10-5850K instead of 6800K.


But then the OEMs would complain, as a smaller number increase means less consumer excitement...


----------



## azanimefan

You should see a boost of nearly 20-30% in performance with the Richland 6800K, so while it's just a minor tinker with the previous 5800K, it does boost performance a bit.

It should be interesting to see if they can actually get that type of boost.


----------



## DaveLT

Quote:


> Originally Posted by *azanimefan*
> 
> you should see a boost of nearly 20%-30% in performance with the richland 6800k so while it's just a minor tinker with the previous 5800k, it does boost performance a bit.
> 
> it should be interesting to see if they can get that type of boost.


20% ... Nah, not 20%. Definitely not.
Clocks only went up 10% this time; you're not going to get a 20-30% increase.


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> 20% ... Nah, not 20%. Definitely not.
> Clocks only went up 10% this time you're not going to get 20-30% increase


10% on the CPU and 5% on the GPU, plus a possibly improved IMC. 20% is doable.


----------



## DaveLT

Quote:


> Originally Posted by *Artikbot*
> 
> 10% on the CPU and 5% on the GPU. Plus a possibly improved IMC. 20% is doable


20% how? Overall? Not again. A 10% clock boost means 10% better framerates at best. This chip actually bottlenecks a single 7870 (I think?), but not by much.

An improved IMC would mean a redesign. With a redesign you can get better throughput, but there's no way in hell that happens. They would officially support 2133 out of the box now, but... come on, ALL A10-5800Ks were able to do 2133.

Trinity's IMC on LN2 hit 2800 ... and more.


----------



## runs2far

The promised high performance increase from Trinity to Richland is for the low-wattage models, and it's a perf/watt figure.
The 6800K will not achieve the highest promised performance increase.

DaveLT:
Whether the CPU or GPU is the bottleneck depends on the game.


----------



## azanimefan

Quote:


> Originally Posted by *DaveLT*
> 
> 20% ... Nah, not 20%. Definitely not.
> Clocks only went up 10% this time you're not going to get 20-30% increase


They're saying they got a 10-15% boost just from tinkering with the FSB or something, and that they improved the efficiency of the CPU/GPU, allowing higher base clocks with less energy, netting a 20-30% improvement in both CPU and GPU tasks.

Granted, it was looking like the A10 will probably see just a 20% improvement, while the lower-end and laptop chips will see the biggest gains. Overall, while the update is just a polishing of Trinity, it seems to be netting a solid improvement in performance. We'll have to wait until the chip is released to see if any of this is true.


----------



## spatulator

And now everyone is forgetting that AMD was comparing Richland to Llano in their own words. I guess people just believe what they want to believe.


----------



## Soup4Lunch

I was excited about the launch of the A10-6800K, but after some research I discovered that Richland is the same architecture as Trinity! How disappointing. For AMD to claim 15% increases in CPU performance and 30% in iGPU performance just from increasing clock speeds is a bunch of baloney...

My A10-5800K overclocked to the same specs they are offering on this launch gives the exact same performance increases. Synthetic benchmarks consistently show a 16% CPU increase going from 3.8 GHz to 4.4 GHz, and 30% on the iGPU going from 800 MHz to 844 MHz (with 2133 MHz RAM).
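Just checking the clock ratios themselves (simple arithmetic; note the GPU clearly gets most of its gain from somewhere other than the 800 to 844 MHz bump, presumably the faster RAM):

```python
def pct_gain(new, old):
    """Percent increase going from old to new."""
    return (new / old - 1) * 100

print(f"CPU: {pct_gain(4.4, 3.8):.1f}%")   # 3.8 GHz -> 4.4 GHz, ~15.8%
print(f"GPU: {pct_gain(844, 800):.1f}%")   # 800 MHz -> 844 MHz, ~5.5%
```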

Now it looks like Kaveri (with GCN) will not be launched on FM2 as promised. Thanks, AMD...

http://wccftech.com/amds-kaveri-based-28nm-richland-apu-features-steamroller-cores-compatibility-fm2-socket/

Now we are being told it will launch on FM3 and _*may*_ be backwards compatible with FM2. I understand the need for the higher memory clock speeds of DDR4/5, but when I made the choice to go with AMD, it was with the idea of getting HD 7750 performance from an upcoming APU on the FM2 socket.

Anyone else feel this way?

-Chris


----------



## Ultracarpet

Quote:


> Originally Posted by *Soup4Lunch*
> 
> I was excited about the launch of the A10-6800K, but now after some research, only to discover that it richland is the same architecture as trinity! How disappointing. For AMD to claim that there are 15% increases in CPU performance and 30% in iGPU by increasing the clock speeds is just a bunch of baloney...
> 
> My A10-5800K overclocked to the same specs as they are offering on this launch gives the exact same performance increases. Artificial benchmarks consistently increase 16% on the CPU from 3.8GHz to 4.4GHz, and 30% on the iGPU from 800MHz to 844MHz (with 2133MHz RAM).
> 
> Now it looks like kaveri (with GCN) will not be launched on FM2. Thanks AMD...
> 
> Anyone else feel this way!?
> 
> http://wccftech.com/amds-kaveri-based-28nm-richland-apu-features-steamroller-cores-compatibility-fm2-socket/
> 
> -Chris


Overclocked Trinity = stock Richland... sure, but:

Overclocked Trinity < Overclocked Richland

and I'm pretty sure Kaveri is still going to be FM2. Where did you hear otherwise?


----------



## Soup4Lunch

I am basing my disappointed outlook on a few articles like this one:

http://lensfire.in/25341/news/amd-kaveri-apu-will-use-fm3-socket-but-supports-fm2/

And I remember reading somewhere(?) that FM2 would get one CPU update before FM3. I expected that update to have the GCN cores.

Does anyone really see the industry moving toward DDR4 anytime soon? The wiki article here estimates 50% market penetration by 2015 (http://en.wikipedia.org/wiki/DDR4_SDRAM). That would make AMD a very early adopter IMO.


----------



## Soup4Lunch

Quote:


> Originally Posted by *Ultracarpet*
> 
> Overclocked Trinity = stock Richland... sure, but:
> 
> Overclocked Trinity < Overclocked Richland
> 
> and I'm pretty sure Kaveri is still going to be FM2 where did you hear otherwise?


Do you think there will be the same headroom for OC'ing on the A10-6800K, or that maybe they have just realized that the APU can be stable at those frequencies and are shipping them out pre-OC'd?


----------



## spatulator

Quote:


> Originally Posted by *Soup4Lunch*
> 
> Do you think there will be the same headroom for OC'ing on the A10-6800K, or that maybe they have just realized that the APU can be stable at those frequencies and are shipping them out pre-OC'd?


There is not a lot of headroom on Trinity to begin with, so my guess is that Richland has a few internal tweaks to power management to facilitate the higher clock speeds.

I am disappointed that FM2 will be left in the dust, but at the same time it really is necessary: if APUs are to thrive, they need enough memory bandwidth to compete with discrete cards. If you ask me, the right way to do it would have been to offer FM2 mobos with GDDR5 sideport memory, or just keep FM1 around for another generation. But hey, this is cutting-edge technology for PCs, so it's also good to see the changes that are needed.


----------



## Yeroon

DDR4 and APUs can't happen fast enough. DDR4 is supposed to double throughput out of the gate. Current-gen APUs are at the limits of their bandwidth, so going to 512 GCN cores would need something like DDR4 to feed them effectively.
If Kaveri is FM3 but backwards compatible, yet supports DDR4 in an FM3 setup, that gives Trinity owners two upgrade paths: first Kaveri itself, then a mobo plus DDR4 later.

AMD also has TDP caps to think about, so a Trinity at Richland clocks will use (marginally) more power than Richland at stock clocks.
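For a rough idea of the bandwidth on the table, peak dual-channel DDR3 works out like this (a back-of-the-envelope sketch; real sustained bandwidth is lower):

```python
# Peak DDR bandwidth: transfers/s * bytes per transfer * channels.
# Assumes the standard 64-bit channel width; sustained rates are lower.
def ddr_bandwidth_gbs(mt_per_s, bus_bits=64, channels=2):
    """Peak bandwidth in GB/s for a given transfer rate and bus config."""
    return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

print(f"DDR3-1866 dual channel: {ddr_bandwidth_gbs(1866):.1f} GB/s")  # ~29.9
print(f"DDR3-2133 dual channel: {ddr_bandwidth_gbs(2133):.1f} GB/s")  # ~34.1
```

Doubling transfer rate (or channel count) doubles the peak figure, which is why "DDR4 doubles throughput" matters so much for an iGPU sharing this bus with the CPU.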


----------



## DaveLT

Quote:


> Originally Posted by *spatulator*
> 
> There is not alot of headroom on trinity to begin with so my guess is that Richland is going to have a few internal tweaks to power management to facilitate higher clock speed.
> 
> I am disappointed that FM2 will be left in the dust, but at the same time it really is necessary...if APUs are to thrive they need enough memory bandwidth to compete with discrete cards. If you ask me the right way to do it would have been to have FM2 mobos available with GDDR5 sideport memory, or just keep FM1 around for another generation. But, hey this is cutting edge technology for PC's so its also good to see the changes that are needed.


Better yields = ?, better clocks of course.
Quote:


> Originally Posted by *Yeroon*
> 
> DDR4 and APUs can't happen fast enough. DDR4 is supposed to double the throughput out of the gate. Current gen APUs are at the limits of bandwidth, so going to 512 gcn cores would need something (DDR4) to work effectively.
> If Kaveri is fm3 but backwards compatible, yet supports dd4 in the fm3 setup, this will give those who have trinity 2 upgrade paths. First, Kaveri itself, then a mobo +ddr4 later.
> 
> AMD also has the tdp caps to think about, so a trinity at richland clocks will use more power (marginally) then richland at stock clocks.


Nope. That's what you get from better yields. Improved perf/watt
Quote:


> Originally Posted by *Soup4Lunch*
> 
> I am basing by disappointing outlook on a few articles like this one:
> 
> http://lensfire.in/25341/news/amd-kaveri-apu-will-use-fm3-socket-but-supports-fm2/
> 
> And that I remember reading somewhere(?) that FM2 would get one CPU update before FM3. I expected that update to have the GCN cores.
> 
> Does anyone really see the industry moving toward DDR4 anytime soon? The wiki article here estimates 50% market penetration by 2015 (http://en.wikipedia.org/wiki/DDR4_SDRAM). That would make AMD a very early adopter IMO.


It's all just rumours right now, isn't it? And really, DDR4 won't be out until about 2014, when Intel will launch it with Haswell-E of course; knowing AMD, they'll be much later.


----------



## spatulator

OK, so at first glance this looks like a pretty nice graphics performance bump, but then when I look at the numbers... wait a second, I've never scored that low with the 5800K, even before I started overclocking. Hmm, perhaps the 5800K is being fed slower RAM? Really, the difference there is bigger than a 40 MHz clock boost should produce.

What would be fun is if they could somehow open up dual-graphics compatibility to include more discrete options.

source
http://wccftech.com/amd-apu-performance-numbers-revealed-details-launch-schedule-richland-kabini-apus-leaked/


----------



## Soup4Lunch

The 3DMark11 score is just what I expected, lol. According to the article you provided, the A10-6700 with a score of 1667 still does not outperform my A10-5800K at the exact same GPU frequency, which scores 1684 (http://www.3dmark.com/3dm11/6222902).

I am grabbing PCMark7 now to run against the productivity score of 4176, and I don't think it will be much different.

Already not looking like enough to win anyone over!


----------



## DaveLT

Well ...


----------



## Kuivamaa

Quote:


> Originally Posted by *Soup4Lunch*
> 
> Now it looks like kaveri (with GCN) will not be launched on FM2 as promised. Thanks AMD...
> 
> http://wccftech.com/amds-kaveri-based-28nm-richland-apu-features-steamroller-cores-compatibility-fm2-socket/
> 
> Now we are being told it will launch on FM3, and _*may*_ be backwards compatible with FM2. I understand the need for the higher memory clock speeds of DDR4/5, but when making the choice to go for AMD it was with the idea of HD 7750 performance from an upcoming APU on the FM2 socket.
> 
> Anyone else feel this way?
> 
> -Chris


WCCF is largely inaccurate.

http://wccftech.com/amd-apu-performance-numbers-revealed-details-launch-schedule-richland-kabini-apus-leaked/

They pretty much claimed richland will have GCN cores. Also kaveri WILL be FM2 compatible so I don't see the problem here.


----------



## Yeroon

Quote:


> Originally Posted by *DaveLT*
> 
> Better yields = ?, better clocks of course.
> Nope. That's what you get from better yields. Improved perf/watt
> It's all just rumours right now, isn't it? And really the only time DDR4 will ever be first out is about 2014, where Intel will launch it with their Haswell-E of course and knowing AMD much later


I was agreeing with you that Richland is a better-quality Trinity; dunno why you "noped" my explanation that a Trinity at Richland clocks would need a higher TDP (and higher power draw) than Richland at Richland clocks.

However, I think you are wrong about the DDR4 (unsupported rumor ATM). Apparently Samsung was not happy with how their sales of the green RAM went (I am assuming server-side) and is working on getting DDR4 out now. Micron has working chips as well.

Kaveri has already been confirmed to be able to use GDDR5, which apparently uses the same type of memory controller as DDR4, so Kaveri should support DDR4 before it's even out. Why do you think Intel has to be first with DDR4? It benefits AMD way more ATM to use DDR4 ASAP, with the CPU side lacking bandwidth compared to Intel and the APU's GPU needing as much as it can get out of standard RAM.


----------



## M3T4LM4N222

I've seen several sources saying Richland was coming out today, but no sign of it.


----------



## svenge

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> I've seen several sources saying Richland was coming out today, but no sights of it.


The desktop ones are supposed to come out in June.


----------



## Wall

wccftech.com is indeed VERY inaccurate. I remember reading this article, which makes claims that have already been proven wrong (for example, Richland using GCN), plus it's already 9 months old, so I wouldn't use it to prove a point. Now this article states:
Quote:


> Among the information provided are the release and technical details of Richland A-Series AMD APU which will arrive on 19th March for desktop.


We now know for sure that Richland for desktop is not here yet.

Point being, rumors are not to be taken too seriously, plus I wouldn't trust that site too much, or an article written by that skinny guy.


----------



## A Bad Day

Isn't that the same site that Photoshopped a GTX 200s GPU and claimed it was the Titan?


----------



## Kuivamaa

Yep,that's them.


----------



## Soup4Lunch

I was just reading this article, which stated Kaveri will utilize GDDR5 RAM (http://www.brightsideofnews.com/news/2013/3/6/analysis-amd-kaveri-apu-and-steamroller-core-architectural-enhancements-unveiled.aspx). I have to say that after digging a bit deeper, it makes sense given the announcement that AMD will be providing the PlayStation 4 with a "semi-custom" 8-core APU utilizing 8GB of onboard GDDR5. The articles I have been reading say that GDDR5 needs to be soldered on, and if the Kaveri APU is to utilize GDDR5... is anyone thinking what I am thinking?

Back to the days of CPU cartridges! Lol... I'm thinking of you Pentium III Katmai SECC2

Maybe this custom PS4 APU won't be too far off what we should be expecting from Kaveri? Am I crazy?


----------



## A Bad Day

I think it would be more feasible if AMD focused on DDR4 or at least a quad-channel or even a hexa-channel DDR3 system.

GDDR5 has to be point-to-point to maintain its high clock speed, thus it must be soldered on. Although Samsung's product page mentions a 4 Gb GDDR5 chip, AMD would need 16 of those high-density chips to reach 8 GB of system RAM.
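The chip-count arithmetic checks out; it's just unit conversion (Gb = gigabits per chip, GB = gigabytes of system RAM):

```python
# Gb = gigabits (chip density), GB = gigabytes (system RAM)
target_gbyte = 8     # desired system RAM in GB
chip_gbit = 4        # assumed per-chip GDDR5 density in Gb

chips_needed = target_gbyte * 8 // chip_gbit  # 8 bits per byte
print(chips_needed)  # 16
```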


----------



## Artikbot

Quote:


> Originally Posted by *A Bad Day*
> 
> I think it would be more feasible if AMD focused on DDR4 or at least a quad-channel or even a hexa-channel DDR3 system.


Price would go through the roof.


----------



## spatulator

The article seems to reference a GDDR5 memory controller, not actual memory on the chip. I think this would make more sense: if you need the extra bandwidth for modern games, buy a mobo with sideport memory soldered on. AMD can keep their prices more attractive this way, and mobo manufacturers can sell a nifty high-end feature.

Personally I would love it if I could drop Kaveri into my normal FM2 board and get GDDR5 performance without having to buy a new mobo; I just think the other option seems more likely.


----------



## runs2far

Quote:


> Originally Posted by *Soup4Lunch*
> 
> I was just reading this article which stated Kaveri will utilize GDDR5 RAM (http://www.brightsideofnews.com/news/2013/3/6/analysis-amd-kaveri-apu-and-steamroller-core-architectural-enhancements-unveiled.aspx). And I have to say that after digging a bit deeper, that it makes sense given the announcement that AMD would be providing the PlayStation 4 with a "semi-custom" 8-core APU utilizing 8GB of onboard GDDR5. The articles I have been reading say that it is necessary for GDDR5 to be soldered on, and if the Kaveri APU is to utilize GDDR5... is anyone thinking what I am thinking?
> 
> Back to the days of CPU cartridges! Lol... I'm thinking of you Pentium III Katmai SECC2
> 
> Maybe this custom PS4 APU won't be too far off what we should be expecting from Kaveri? Am I crazy?


The rumoured GDDR5 controller is not able to work alongside DDR3, i.e. the CPU will only be able to use GDDR5 or DDR3 memory, not both at the same time.
You would not use GDDR5 with an x86 CPU and a regular x86 OS; the performance would be poor due to the high latency of GDDR5.
The PS4 can get away with it thanks to programs being written directly for it.


----------



## Soup4Lunch

You are absolutely right. To gain the bandwidth, the latency would be terrible. Therefore, I would conclude that we are likely to see 2GB of GDDR5 soldered on by the manufacturer, with DDR3 use continuing, which would be in line with keeping FM2 compatibility.

But couldn't the GPU access the GDDR5 and the CPU the DDR3? I don't understand why not.


----------



## runs2far

Quote:


> Originally Posted by *Soup4Lunch*
> 
> But the GPU couldn't access the GDDR5 and the CPU the DDR3? I don't understand why not?


They probably use the same pins on the CPU for both types of memory, which removes the need for a new socket.
If the CPU had to support both memory types at the same time, they would have to add pins for the GDDR5. Pin count is a big part of a CPU's price, making a CPU wired for both DDR3 and GDDR5 far more expensive than current FM2 models.


----------



## DaveLT

It would probably need something like 1300+ pins.

Because I'll say that if it needs extra speed, it needs at least 256-bit DDR3 access, since the current dual-channel (128-bit) DDR3 bus seems to be what's bottlenecking the poor chip now.


----------



## A Bad Day

Quote:


> Originally Posted by *DaveLT*
> 
> It would probably have something like 1300+ pins
> 
> 
> 
> 
> 
> 
> 
> Because i'll say if it needs extra speed it needs at least 256bit DDR3 access bandwidth because it seems that's what's bottlenecking the poor chip now (Dual channel or 128bit DDR3)


DDR3 and GDDR5 are pin-compatible. As long as the memory bus can recognize the two different memory types, it should be fine.


----------



## DaveLT

Quote:


> Originally Posted by *A Bad Day*
> 
> DDR3 and GDDR5 are pin compatible. As long as they have a memory bus that can recognize the two different memories then it should be fine.


Huh? Pin-compatible? I was just saying the iGPU needs a larger memory bus. Quad-channel at minimum, I think.


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> Huh? Pin compatible? I was just saying the iGPU needs a larger memory bus. A 4 channel minimum at least i think.


Intel is capable of squeezing 50% more bandwidth out of a dual-channel setup. AMD just needs to improve their IMCs.


----------



## DaveLT

Quote:


> Originally Posted by *Artikbot*
> 
> Intel is capable of squeezing a 50% more bandwidth from a dualchannel setup. AMD just needs to improve the IMCs.


Absolutely.


----------



## Castaa

GREAT news for AMD's Richland. It turns out that leaked A10-6700 APU (Richland) benchmarks are indeed 30% faster than Trinity.

http://wccftech.com/amd-a10-6700-gpu-performance-exposed-80-faster-core-i53570k-20-a105800k-3dmark-11-fire-strike/

Color me impressed and wrong.


----------



## runs2far

Quote:


> Originally Posted by *Castaa*
> 
> GREAT news for AMD's Richland. It turns out that leaked A10-6700 APU (Richland) benchmarks are indeed 30% faster than Trinity.
> 
> http://wccftech.com/amd-a10-6700-gpu-performance-exposed-80-faster-core-i53570k-20-a105800k-3dmark-11-fire-strike/
> 
> Color me impressed and wrong.


Looks good, but I will still wait for a better source before I believe it..


----------



## Mopar63

The testing is not complete, so it's very suspect. If they did the testing, I'm wondering why we did not see the Physics scores for those tests, to get a feel for the CPU side...


----------



## spatulator

If there is a 30% increase in 3DMark11, I'm curious where it comes from. The article referenced by wccftech, http://www.expreview.com/24493.html (Chinese), says it's the same graphics core as Trinity, so on the GPU side the only difference we know of is a 5% clock speed increase. The other 25%? Hmm, well... perhaps better memory bandwidth available to the GPU? I can't think of any other factor.


----------



## sdlvx

This is pure conjecture, but AMD seems to be fixing specific areas of original Bulldozer at a time.

BD -> PD: fix FPU
PD-> PD 2.0: improve IMC
PD 2.0 -> Steamroller: Improve front end

As for the numbers AMD is throwing around with % increase, it's the same crap that Intel does. "We increased our 3dmark score by 40% so we're just going to call our new chip up to 40% faster and tack on a 5% to 10% faster CPU and just not talk about that."

But you guys are missing the big elephant in the room. AMD increased clock speed by 10% just because the 32nm process got better. Intel has spent a lot of time and money to get a 3% to 7% performance increase out of the Bridge-to-Haswell evolution. Richland is going to close the CPU gap, and it's just a rebrand with higher clocks; I don't know how that isn't a bigger deal. If AMD released a new architecture that was only barely faster than the old one, while Intel released new parts with a clock speed bump bigger than AMD's change in performance, the internet would be full of "LOL AMD IS DOOMED SELL UR STOCK WHATS GOING TO HAPPEN WHEN THERES A MONOPOLY!?!?!?!?"

AMD beat Haswell's CPU gains (assuming Tom's Hardware is correct) with significantly fewer resources.


----------



## Castaa

Quote:


> Originally Posted by *spatulator*
> 
> If there is a 30% increase in 3dmark11 I'm curious to know where it comes from. The article referenced by wccftech http://www.expreview.com/24493.html (chinese) ...says its the same graphics core as trinity, so on the GPU side of things, the only difference we know of is a 5% clock speed increase. The other 25%? hmm, well...perhaps better memory bandwidth available to the gpu? I cant think of any other factor.


It is a head-scratcher. Maybe next gen GPU improvements. Does more per clock.


----------



## Artikbot

Quote:


> Originally Posted by *Castaa*
> 
> It is a head-scratcher. Maybe next gen GPU improvements. Does more per clock.


Apparently they use VLIW4-based graphics, so there is only so much you can do. I pray to the Heavens that the improvements come from the IMC!!

Edit: 30% faster GPU!! Look at those 3DMark results O__O

Boy Kaveri is going to be soooo sweet


----------



## DaveLT

Intel finally caught up on their GPU department to Llano and AMD blew them away again


----------



## AlphaC

Quote:


> Originally Posted by *Castaa*
> 
> GREAT news for AMD's Richland. It turns out that leaked A10-6700 APU (Richland) benchmarks are indeed 30% faster than Trinity.
> 
> http://wccftech.com/amd-a10-6700-gpu-performance-exposed-80-faster-core-i53570k-20-a105800k-3dmark-11-fire-strike/
> 
> Color me impressed and wrong.


Color me skeptical. It's plausible for 15% , we saw that from Piledriver fixing Bulldozer's scheduler.

VLIW4 is still HD6xxx series


----------



## DaveLT

Quote:


> Originally Posted by *AlphaC*
> 
> Color me skeptical. It's plausible for 15% , we saw that from Piledriver fixing Bulldozer's scheduler.
> 
> VLIW4 is still HD6xxx series


Yes, indeed. I don't see how they can improve anything with just a seemingly small clock boost, and by now AMD is pretty much done with VLIW, so it can't be drivers.
Or more ROPs or TMUs?


----------



## Castaa

Assuming the benchmarks are legit, maybe their Trinity comparison numbers are somehow too low. Can someone with a A10 Trinity run the newest 3DMark and report it?


----------



## FIRINMYLAZERMAN

I haven't been on this thread in quite some time. Is the A10-6800K available to consumers now, or is it not yet released? If not, then is there a solid date when the A10-6800K is supposed to be available?


----------



## DaveLT

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> I haven't been on this thread in quite some time. Is the A10-6800K available to consumers now, or is it not yet released? If not, then is there a solid date when the A10-6800K is supposed to be available?


Only in June, as expected. I believe they are rolling out Richland for laptops for now.


----------



## FIRINMYLAZERMAN

Are there confirmed prices (USD/CAD) for the A10-6800K yet, or...?


----------



## ololosh

http://wccftech.com/amd-a-series-apus-simcity-bundle-promotion-compares-richland-apu-performance-haswell-igpu/

_The promotion would be expanded to Richland A-Series APUs (A10/ A8) when they launch on 18th-19th April 2013_


----------



## overkll

Richland's improvements come from better power management and a new temperature-based turbo, as well as process improvements and higher clocks.

http://semiaccurate.com/2013/03/12/amd-goes-mobile-first-with-richland/


----------



## DeadFire

Quote:


> Originally Posted by *overkll*
> 
> Richland's improvements come from better power managment and new temperature based turbo, as well as process improvements and higher clocks.
> 
> http://semiaccurate.com/2013/03/12/amd-goes-mobile-first-with-richland/


True, and not to mention it will include an HD 7000-tech GPU branded HD 8000D, so you will get a performance boost there as well. The APUs are capable of parallel processing right alongside the GPU and CPU, so a better iGPU would help performance just as much as changing the stepping or caches.


----------



## beers

Quote:


> Originally Posted by *Castaa*
> 
> Can someone with a A10 Trinity run the newest 3DMark and report it?


I'd be interested in this as well. It looks like a lot of the dual graphics mode setups score around the same (I've only really seen 3dm11 results though) even though some may have significantly more hardware resources than others.

Here's one from the HTPC for comparison, not sure how much it would fall off having lesser shaders and being a dual core: http://www.3dmark.com/3dm/430732

I'm pretty excited for Kaveri/GCN based APU though, especially if they implement a nice solution for utilizing GDDR5 as a RAM platform.


----------



## DaveLT

Quote:


> Originally Posted by *DeadFire*
> 
> True, not to mention it will include HD7000 tech GPU coded HD8000D. So you will get a performance boost there as well, the APUs are capable or parallel processing right along side the GPU and CPU, so a better iGPU would help performance just as much as changing stepping or caches.


It's NOT GCN. It's still VLIW4 for god's sake


----------



## Soup4Lunch

http://www.3dmark.com/3dm/515447?
http://www.3dmark.com/3dm11/6382616

Ran 3DMark with my A10-5800K for benchmark comparison.


----------



## Castaa

Quote:


> Originally Posted by *Soup4Lunch*
> 
> http://www.3dmark.com/3dm/515447?
> http://www.3dmark.com/3dm11/6382616
> 
> Ran 3DMark with my A10-5800K for benchmark comparison.


Ice Storm
*70253*
AMD A10-6700 - 86027 GPU Points
AMD A10-5800 - 65657 GPU Points

Cloud Gate
*6381*
AMD A10-6700 - 6450 Points / GPU score 8933
AMD A10-5800 - 5645 Points / GPU score 7418

Fire Strike
*1128*
AMD A10-6700 - 1131 Points / GPU score 1212
AMD A10-5800 - 919 Points / GPU score 987

So yeah, the article's Trinity scores are lower. I see you're using 2133 memory; maybe they used 1600.
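For anyone wanting the deltas spelled out, here's a quick Python sketch computing the generational GPU-score gains from the numbers quoted above (the scores are straight from this post; nothing else here is official):

```python
# Generational GPU-score gains computed from the 3DMark numbers
# quoted above (A10-6700 vs A10-5800 GPU sub-scores).
scores = {
    "Ice Storm":   (86027, 65657),
    "Cloud Gate":  (8933, 7418),
    "Fire Strike": (1212, 987),
}

for test, (richland, trinity) in scores.items():
    gain = (richland - trinity) / trinity * 100
    print(f"{test}: +{gain:.1f}%")  # roughly +31%, +20%, +23%
```

All three deltas land inside AMD's advertised 20-40% graphics uplift over Trinity.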


----------



## Soup4Lunch

Yes 2133MHz RAM and OC'd to 950MHz

With the stock speed of 800MHz and 2133MHz RAM:

http://www.3dmark.com/3dm/515278?

Ice Storm
*67505*
AMD A10-6700 - 86027 GPU Points
AMD A10-5800 - 65657 GPU Points

Cloud Gate
*6079*
AMD A10-6700 - 6450 Points / GPU score 8933
AMD A10-5800 - 5645 Points / GPU score 7418

Fire Strike
*1009*
AMD A10-6700 - 1131 Points / GPU score 1212
AMD A10-5800 - 919 Points / GPU score 987

But I think the performance comparison should be based on what is reasonably attainable by an inexperienced overclocker (950MHz GPU and 2133MHz RAM), since it is the unlocked K version.

However, if the Richland APU's have the same OC potential then there is actually a considerable improvement in performance by this "slight tweak" AMD has implemented.


----------



## DaveLT

Those new GPU scores are nothing to scoff at, though. It's quite a leap for a "tweak"


----------



## spatulator

I've been tinkering in the BIOS with the A10-5800K and I'm getting slightly higher 3DMark scores with power management turned on (Turbo Core and Cool'n'Quiet). I am of course keeping a baseline clock speed in my testing; Turbo Core is only limiting power use, not set to boost CPU cores. The results surprised me because overclocking usually means turning these features off for better performance. Perhaps the power-management tweaks in Richland are translating to better GPU performance.


----------



## Nemisor

I find this interesting...

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/60147-detailing-richland-s-dual-graphics-gcn-compatibility.html

Assuming this is correct, it appears they managed to enable Crossfire between Richland and discrete 7xxx/8xxx series cards, despite the fact that Richland isn't GCN


----------



## FIRINMYLAZERMAN

Will the A10-6800K be compatible with current socket FM2 motherboards?

I'm currently trying to build an AMD budget mini-ITX system, and this is the motherboard I'm thinking about buying: http://www.newegg.ca/Product/Product.aspx?Item=N82E16813130664

I'm probably going to buy an A10-5800K for now, just because I'll be wanting to use this system before the A10-6800K is released, but I would still like to know if the A10-6800K has been confirmed to be compatible with current socket FM2 motherboards.


----------



## beers

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> Will the A10-6800K be compatible with current socket FM2 motherboards?
> 
> I'm currently trying to build an AMD budget mini-ITX system, and this is the motherboard I'm thinking about buying: http://www.newegg.ca/Product/Product.aspx?Item=N82E16813130664
> 
> I'm probably going to buy an A10-5800K for now, just because I'll be wanting to use this system before the A10-6800K is released, but I would still like to know if the A10-6800K has been confirmed to be compatible with current socket FM2 motherboards.


Yep, can confirm here if you want another source:
http://www.cpu-world.com/CPUs/Bulldozer/AMD-A10-Series%20A10-6800K.html


----------



## DeadFire

Quote:


> Originally Posted by *DaveLT*
> 
> It's NOT GCN. It's still VLIW4 for god's sake


Excuse me... but it's still only RUMORS and SPECULATION! We won't know until it's on the market, "for god's sake". AMD has reported both, so who really knows??


----------



## Kuivamaa

If it were GCN, they would have advertised it as such and made sure everybody knew. It isn't; it's just VLIW.


----------



## Artikbot

It is VLIW4.


----------



## ololosh

No Richland APUs for desktop until June. I'm very disappointed, but at least this is final:

http://www.biostar.com.tw/app/en/news/news.php?S_ID=166


----------



## beers

I'm always amused at the TDP figures for these..
How does an unlocked multiplier automatically constitute 35w of extra heat?

Edit: Nixing some mild clock differences that you could likely achieve with stock voltage.


----------



## DaveLT

Quote:


> Originally Posted by *beers*
> 
> I'm always amused at the TDP figures for these..
> How does an unlocked multiplier automatically constitute 35w of extra heat?
> 
> Edit: Nixing some mild clock differences that you could likely achieve with stock voltage.


AMD's engineers ... not very efficient







(Pun intended.) In reality they only draw a little more than the locked ones


----------



## artk2219

Quote:


> Originally Posted by *beers*
> 
> I'm always amused at the TDP figures for these..
> How does an unlocked multiplier automatically constitute 35w of extra heat?
> 
> Edit: Nixing some mild clock differences that you could likely achieve with stock voltage.


It's not that it automatically makes the chip 35 watts hotter; it's that it kicks it ever so slightly out of the 65W TDP bracket. The next bracket up in AMD's TDP ratings is 100W. So even though the chip may not actually output 100W of heat, this gives them some thermal room to sell some of the less efficient chips. Some of those 100W chips may run hotter, but they should all work fine.
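To make the bracketing idea concrete, here's a minimal sketch; the bracket values other than 65W and 100W are illustrative guesses, not an official AMD list:

```python
# Hypothetical sketch of TDP "bracketing": the chip's worst-case power
# draw is rounded UP to the nearest marketing bracket, so a part that
# really dissipates ~70 W still ships with a 100 W rating.
BRACKETS = [35, 45, 65, 100]  # watts; only 65 and 100 are from the thread

def tdp_rating(worst_case_watts):
    """Return the smallest bracket that covers the chip's draw."""
    for bracket in BRACKETS:
        if worst_case_watts <= bracket:
            return bracket
    raise ValueError("exceeds all defined brackets")

print(tdp_rating(63))  # locked 65W-class part -> 65
print(tdp_rating(70))  # unlocked part just over the line -> 100
```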


----------



## Artikbot

Quote:


> Originally Posted by *beers*
> 
> I'm always amused at the TDP figures for these..
> How does an unlocked multiplier automatically constitute 35w of extra heat?


The secret to achieving a lower TDP on the locked parts is that Turbo almost never reaches the maximum state.

The A10-5700, for example, has a 4GHz turbo, but in reality it will almost never get past 3.7GHz, only hitting 4GHz on a single-threaded load, and only ramping one core.

That, and a much lower VID. The A10-5700 has a 1.225V stock VID, as opposed to 1.375V, I believe, for the 5800K.
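Those two VIDs are enough for a back-of-the-envelope check. First-order CMOS dynamic power scales roughly with V² × f; the VIDs below are the ones quoted above, while the 3.7 vs 4.2GHz all-core clocks are my assumption for illustration:

```python
# First-order CMOS dynamic-power estimate: P is proportional to V^2 * f.
# VIDs (1.225 V vs 1.375 V) are from the post above; the all-core
# clocks (3.7 vs 4.2 GHz) are assumed for illustration only.
def relative_power(v1, f1, v2, f2):
    """Return P2/P1 for the same silicon (the capacitance term cancels)."""
    return (v2 ** 2 * f2) / (v1 ** 2 * f1)

ratio = relative_power(1.225, 3.7, 1.375, 4.2)
print(f"higher-VID operating point draws ~{ratio:.2f}x the power")
```

Under those assumptions the ratio comes out around 1.4x, which is in the same ballpark as the 65W vs 100W rating gap.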


----------



## Dromihetes

Is there a clear date for this chip to arrive on the market ?!


----------



## ololosh

No clear date, but the idea is for the chips to be presented at Computex 2013 (June 4th-8th), so I guess desktop Richland APUs will be available close to that time frame, a little sooner or later, in the beginning of June


----------



## GreyHayze

You have got to be kidding me, I literally JUST bought my A10-5800k a month ago..


----------



## Artikbot

Quote:


> Originally Posted by *GreyHayze*
> 
> You have got to be kidding me, I literally JUST bought my A10-5800k a month ago..


Don't worry. The big update for us Trinity users is Kaveri, bound to release very early next year


----------



## A Bad Day

GCN GPU, Steamroller CPU, native support for DDR3-2500 or GDDR5, possible tri/quad modules...

Worth waiting instead of picking up a Haswell or Richland laptop.


----------



## Artikbot

Quote:


> Originally Posted by *A Bad Day*
> 
> GCN GPU, Steamroller CPU, native support for DDR3-2500 or GDDR5, possible tri/quad modules...
> 
> Worth waiting instead of picking up a Haswell or Richland laptop.


Yup.


----------



## DaveLT

Quote:


> Originally Posted by *A Bad Day*
> 
> GCN GPU, Steamroller CPU, native support for DDR3-2500 or GDDR5, possible tri/quad modules...
> 
> Worth waiting instead of picking up a Haswell or Richland laptop.


Definitely. But it better not be expensive


----------



## Dromihetes

Imagine FM 2 board with 2 APU sockets and GDDR5









You have eight cores and Crossfire right there with incredible power .


----------



## beers

Quote:


> Originally Posted by *Dromihetes*
> 
> Imagine FM 2 board with 2 APU sockets and GDDR5
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You have eight cores and Crossfire right there with incredible power .


Isn't that called a PS4?


----------



## Dromihetes

No.
You have and you will have FM 2 APU-s with different CPU architecture opposed to those in the console.
Over the socket timeline you would be able to upgrade to more powerful APU-s
Such dual socket FM2 APU mobos would be an excellent choice for enthusiasts that overclock.I doubt you can get your PS 4 and overclock it without some serious modding if possible ,those APU-s must be locked at all levels.
Such boards would simply be awesome in my opinion.Maybe Sapphire will release some in the future.


----------



## DaveLT

Quote:


> Originally Posted by *Dromihetes*
> 
> No.
> You have and you will have FM 2 APU-s with different CPU architecture opposed to those in the console.
> Over the socket timeline you would be able to upgrade to more powerful APU-s
> Such dual socket FM2 APU mobos would be an excellent choice for enthusiasts that overclock.I doubt you can get your PS 4 and overclock it without some serious modding if possible ,those APU-s must be locked at all levels.
> Such boards would simply be awesome in my opinion.Maybe Sapphire will release some in the future.


You don't even need dual sockets; Kaveri is probably coming in a 6-core SR option


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> Definitely. But it better not be expensive


You bet it won't!!

I say the top-end 7800K will be below €150, probably hovering around the €130 mark.

And if that rumor speculating 3 Steamroller modules is true... I'll drool all over the place


----------



## DaveLT

Quote:


> Originally Posted by *Artikbot*
> 
> You bet it won't!!
> 
> I say top end 7800K will be below €150. Probably hovering the €130 mark.
> 
> And if that rumor speculating 3 Steamroller modules is true... I'll drool all over the place


It better be true







If IPC went up plus extra cores... FTW! I just hope it's about 100W TDP; then I can buy an AMD system for real and put my current Xeon to bed.
I call myself an AMD enthusiast but I've never bought an AMD CPU before


----------



## Dromihetes

A dual-socket FM2 board would bring some interesting advantages: the two APUs could clock higher independently and would run cooler than if they were in one chip, and you'd effectively get quad-channel memory, eventually with GDDR5. There would also be two VRM stages instead of one, so more headroom there too.
With such a thing there would be no need for AMD standalone CPUs, so more development money could go in one direction: optimization of the APUs and faster adoption of new fabrication processes.

I wonder if some motherboard manufacturer would try such an experiment and see what it brings.


----------



## DaveLT

AMD would really have to scratch their heads to get two CPUs down to the idle power consumption of one chip though... And remember, although this is OCN, AMD's top priority now is perf/W and lower idle power consumption.

It's a great idea nonetheless, but not something either AMD or Intel would do. Also, multi-CPU systems have always required some way for the chips to interface with each other, so if the CPU never shipped with that...


----------



## SAimNE

They would actually probably pull it off pretty fast; AMD has done very well with anything involving APUs for quite a while. When I first got my APU + discrete setup I was vaguely sure I would regret it and end up with sporadic fps and tearing, all for 5 fps... Instead I'm jumping up like 20% and have yet to run into any significant dips/spikes that I hadn't experienced with just a discrete card before, and tearing was actually extremely rare. So yeah, AMD would probably figure it out in no time flat as long as an APU is involved. It feels weird to say about a company notorious for messing up graphics drivers, but the Dual Graphics drivers were pretty much perfect. First time in a long while I was ever so completely behind AMD... And meanwhile the forums were still filled with Intel fanboys who had never owned one, discouraging any buyers with threats of low performance and screen tearing -_-


----------



## Wall

Quote:


> Originally Posted by *A Bad Day*
> 
> GCN GPU, Steamroller CPU, native support for DDR3-2500 or GDDR5, possible tri/quad modules...
> 
> Worth waiting instead of picking up a Haswell or Richland laptop.


Source?


----------



## Artikbot

Quote:


> Originally Posted by *Wall*
> 
> Source?


I can't remember it, but if you google it you'll see that AMD themselves have announced that Kaveri will be able to run GDDR5 as system memory.


----------



## Stormscion

There were pictures pointing out that the new CPU designs will have a memory controller capable of both DDR3 and GDDR5... and that makes sense, since we already know the PS4 is all about GDDR5


----------



## OldtimeGamer

Quote:


> Originally Posted by *GreyHayze*
> 
> You have got to be kidding me, I literally JUST bought my A10-5800k a month ago..


I'm in the same boat, having just purchased my A10-5800K two months ago. However, I don't suspect I'll run out and buy an A10-6800K any time soon.

However, I'm interested to see which discrete AMD cards the A10-6800K will work with. Being limited to an HD 6670 or 6570 is a real drawback... especially when I have an HD 6950 sitting around that I can't use with it.


----------



## Papadope

Quote:


> Originally Posted by *OldtimeGamer*
> 
> However, I'm interested to see which discrete AMD cards the A10-6800k will work with. Being currently limited to a HD 6670 or 6570 is a real drawback.... especially when I have a HD6950 sitting around, that I cant use with it.


Why would you let an HD 6950 sit around? It's much faster than an A10-6800K in Dual Graphics mode with a 6670.


----------



## DaveLT

Quote:


> Originally Posted by *OldtimeGamer*
> 
> I'm in the same boat...having just purchased my A10-5800k two months ago. However, I don't suspect I'll run out and buy a A10-6800k any time soon.
> 
> However, I'm interested to see which discrete AMD cards the A10-6800k will work with. Being currently limited to a HD 6670 or 6570 is a real drawback.... especially when I have a HD6950 sitting around, that I cant use with it.


It's a sad one ... but you can use the 6950 though


----------



## OldtimeGamer

I don't want to get this thread off topic..... so I'll keep my reply brief about why I can't use my 6950 with my A10 5800K.

I tried to add my 6950 to my A10-5800K, but because it's not a 6670 or 6570, the discrete GPU is giving me nothing but problems. I can't run the monitor cable out of the 6950, only the motherboard. Additionally, it won't let me into the BIOS while the 6950 is physically installed. (I watched a YouTube video where the narrator said AMD was aware of that problem as of last summer and was "looking into it", but I suppose it was never rectified.)

With the card installed, I also lose my ASUS start-up splash screen, plus the splash screen that shows Windows 7 is starting up. Losing those two features is no big deal to me... but the lack of access to the BIOS is a big deal. Who wants to pull the discrete card every time you want to access the BIOS?

Additionally, I ran 3DMark before adding the 6950 and again after installing it, still running the monitor cable out of the motherboard. The numbers were near identical, so I wasn't really gaining anything... even though 3DMark recognized the presence of both GPUs.

So anyway, that is a quick summary of how the HD 7660D and HD 6950 are not working together for me.


----------



## Papadope

Couldn't you install the card, plug your monitor into the video port on the motherboard, enter the BIOS through the integrated GPU and motherboard output, disable the integrated graphics in the BIOS, then reboot and plug the monitor cable directly into the 6950?


----------



## OldtimeGamer

Thanks guys for being willing to help. Maybe you guys can help figure this out.

However....I don't want to hijack this thread, since this is supposed to be all about the upcoming Richland A10-6800K APU .

I'll start a new thread and just post a link to it here, in the event anyone talking about the A10-6800K is interested in jumping over to help me resolve my problem.

Here is the discussion thread I just now started, should anyone here wish to take a look.
http://www.overclock.net/t/1388771/amd-a10-5800k-and-hd-6950-compatibility-problems-causing-no-access-to-bios


----------



## Soup4Lunch

MSI Richland BIOS update

Just found this today









Brings support for Richland to FM2.

Also spotted this article


----------



## Papadope

Nice find! I literally just purchased that board on Newegg an hour ago. Building a budget gaming rig for a friend; he only needs it to be capable of running Civ V lol.


----------



## Artikbot

Gigabyte also updated my motherboard BIOS like three weeks ago or so for Richland. It must be around the corner


----------



## ololosh

http://www.techpowerup.com/179248/amd-richland-desktop-apu-lineup-detailed.html

_yet AMD plans to launch a trio of new FCH chipsets. Leading the pack is the A88X (eight SATA 6 Gb/s ports), followed by A78 (six SATA 6 Gb/s ports),and A68 (probably four SATA 6 Gb/s ports, entry-level)_

No sign of these new chipsets, I wonder if the new motherboards will be released


----------



## DaveLT

Quote:


> Originally Posted by *ololosh*
> 
> http://www.techpowerup.com/179248/amd-richland-desktop-apu-lineup-detailed.html
> 
> _yet AMD plans to launch a trio of new FCH chipsets. Leading the pack is the A88X (eight SATA 6 Gb/s ports), followed by A78 (six SATA 6 Gb/s ports),and A68 (probably four SATA 6 Gb/s ports, entry-level)_
> 
> No sign of these new chipsets, I wonder if the new motherboards will be released


I wonder what extra features they have this time round, seeing that the A85X already has eight SATA III ports


----------



## notarat

I built an HTPC with an AMD A10 5800K and, for what it's designed to do, it does it quite well. It's nice to see another upgrade available for FM2 because I was wondering if they'd get around to putting Richland into an FM2 board.


----------



## ololosh

Quote:


> Originally Posted by *DaveLT*
> 
> I wonder what extra features do the have this time round as seeing that A85X already has eight SATAIII ports


The benefit of such chipsets would be that someone buying an FM2-based PC for the first time wouldn't need to buy an extra Trinity APU, besides the intended Richland chip, just to flash a BIOS that makes the new APU compatible


----------



## Himo5

Quote:


> Originally Posted by *Artikbot*
> 
> Gigabyte also updated my motherboard BIOS like three weeks ago or so for Richland. It must be around the corner


I notice also that - presumably in preparation for Richland - ASUS have updated the BIOS for the F2A85-V PRO to v.6002 with additional updates for Chipset and Graphics, a new version of AI Suite II and an update to the QVL.

So upgrading the APU will now involve a not-so-simple reinstall of W8 (since I am still bedding it in) and the usual problem of working out how to update the motherboard DVD with the new files, something that somehow never gets mentioned.


----------



## Papadope

Quote:


> Originally Posted by *Himo5*
> 
> So upgrading the APU will now involve a - not so simple - reinstall of W8.


Wait, what? Why does upgrading the APU to Richland force you to reinstall Windows 8?


----------



## Himo5

Quote:


> Originally Posted by *Papadope*
> 
> Wait what? why does upgrading the APU to Richland force you to reinstall Windows 8?


It doesn't *force* a reinstall, but to my mind leaving the residues of chipset/graphics updates hanging around in the none too settled environment of a new OS is just asking for trouble in the long term.


----------



## Artikbot

Quote:


> Originally Posted by *Himo5*
> 
> It doesn't *force* a reinstall, but to my mind leaving the residues of chipset/graphics updates hanging around in the none too settled environment of a new OS is just asking for trouble in the long term.


Why? Just wipe any AMD drivers from your current install, swap APUs, and tadah!


----------



## Himo5

Quote:


> Originally Posted by *Artikbot*
> 
> Why? Just wipe any AMD drivers from your current install, swap APUs, and tadah!


Bearing in mind the move from Trinity to Richland is supposed to be a minor performance upgrade, this waiting period is best spent eliminating anything that is already hurting performance, and what better assurance of optimum performance is there than the ability to perform a clean install of the OS?

If this is already too much of a hassle after less than a year of operation, going to Richland may only be a way of getting back to the performance Trinity started with.

There are far too many kinds of dependencies and compromises that clicking the uninstall button has so often failed to clear away in the past to suppose W8 and all its possible third-party updaters have suddenly achieved immunity from them.

After six months, now's the time to take a long, hard look at your installation: sort out any botched installs or second thoughts not carried through; make sure you can do a clean install without loss of data and settings; take a second look at questions like choosing between IDE overclocking headroom and AHCI boot speed, or making sure your RAM is good enough to take advantage of the 2133MHz IMC.

While Kaveri may be worth a new motherboard, Richland is probably not, so it's worth making this as much a matter of consolidation as upgrade.


----------



## Himo5

Oh... and I still wish like crazy ASUS et al. would offer an ISO for download when they do a major driver update!


----------



## lordeamon

I have an A8-5600K with 8GB of 1866MHz RAM on an MSI A75IA-E53 mobo. I just updated my mobo to support Richland APUs, so I think Richland is almost here. But is it really worth the upgrade? Will it let me play BioShock Infinite (medium settings) at 720p with decent fps? I'm planning to get the A10-6700 btw. I'm just not sure if it's a good idea to upgrade my APU or wait for Kaveri.


----------



## DaveLT

Quote:


> Originally Posted by *Himo5*
> 
> IDE OC capacity and AHCI booting speed or making sure ram is good enough to take advantage of the 2133MHz IMC.


Why would IDE affect OC capability?


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> Why would IDE affect OC capability?


I have zero idea, but I've experienced the AHCI bug with high FSB speeds too.

Odds are it runs off some FSB strap.


----------



## Himo5

When FM1 boards came out, the first problem to be faced was getting access to the drives when the OnChip SATA type in the BIOS defaulted to AHCI, and even after that, an AHCI setting (if you fitted an SSD) would cut the OC capacity of the board in half.

So it was a straight choice between OC and SSD, and a lot of people switching to FM2 last autumn probably kept the choice they made on FM1 if there was no clear info on whether things had changed. I was just hinting that in such cases, now might be a good moment to review the situation.


----------



## Artikbot

You can run an SSD and go for moderate bus speeds, as long as you disable AHCI. Running SATA mode is fine, no need to force IDE. At least, on my board it is, until 117MHz or so.


----------



## Himo5

Quote:


> Originally Posted by *Artikbot*
> 
> You can run an SSD and go for moderate bus speeds, as long as you disable AHCI. Running SATA mode is fine, no need to force IDE. At least, on my board it is, until 117MHz or so.


I think the point I was trying to get at is that a lot of issues like this were not clearly explained in the user manuals on these boards, so it is worth reviewing what has since come to light online which installers couldn't get a clear picture of at the time.


----------



## Castaa

Rumor update and prices: June 4th launch

| Model | Cores/Threads | CPU Clock | Boost | L2 Cache | Graphics | TDP | Price |
| --- | --- | --- | --- | --- | --- | --- | --- |
| AMD A10-6800K | 4/4 | 4.1 GHz | 4.4 GHz | 4 MB | HD 8670D (844 MHz) | 100W | $142 |
| AMD A10-6700 | 4/4 | 3.7 GHz | 4.3 GHz | 4 MB | HD 8670D (844 MHz) | 65W | $122 |
| AMD A10-6600K | 4/4 | 3.9 GHz | 4.2 GHz | 4 MB | HD 8570D (844 MHz) | 100W | $112 |
| AMD A8-6500 | 4/4 | 3.5 GHz | 4.1 GHz | 4 MB | HD 8570D (800 MHz) | 65W | $91 |
| AMD A4-6400K | 2/2 | 3.9 GHz | 4.1 GHz | 1 MB | HD 8470D (844 MHz) | 65W | $69 |

source: http://wccftech.com/amd-richland-apu-prices-leaked-scheduled-launch-4th-june/


----------



## OldtimeGamer

Now if we could just learn which discrete graphics cards you can crossfire them with... Either way, June 4th is right around the corner.


----------



## Artikbot

Quote:


> Originally Posted by *OldtimeGamer*
> 
> Now if we could just learn which discrete graphics cards you can crossfire them with....but if not, June 4th is right around the corner.


Odds are they'll be the same as now. HD6670 GDDR5 for optimal experience.


----------



## Heavy MG

Quote:


> Originally Posted by *Artikbot*
> 
> Odds are they'll be the same as now. HD6670 GDDR5 for optimal experience.


Why not the HD7750 or 7770? I've seen videos of people running 5800k's with a HD7750 in Dual Graphics mode.


----------



## artk2219

Quote:


> Originally Posted by *Heavy MG*
> 
> Why not the HD7750 or 7770? I've seen videos of people running 5800k's with a HD7750 in Dual Graphics mode.


Because the 7750 and 7770 are way more powerful than what's in Trinity or Richland. You could maybe run them in Dual Graphics mode, but those cards won't really gain anything; if anything it will probably slow them down, since they'll more than likely be waiting for the APU's graphics to finish its portion of the frame most of the time.
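That waiting-on-the-slow-GPU argument is easy to see with a toy alternate-frame-rendering model. All the fps figures below are made up for illustration, and real Dual Graphics scheduling is more complex than this sketch:

```python
# Toy model of alternate-frame-rendering (AFR) Dual Graphics: the two
# GPUs take turns rendering frames, so the sustained frame rate is
# paced by whichever GPU is slower when its turn comes around.
def afr_fps(fps_a, fps_b):
    # each GPU renders every other frame
    return 2 * min(fps_a, fps_b)

print(afr_fps(25, 90))  # weak iGPU + fast discrete -> 50, worse than the card alone at 90
print(afr_fps(25, 30))  # closely matched pair -> 50, better than either alone
```

That's why pairing the iGPU with a roughly equal card like the 6670 helps, while a much faster 7750/7770 is better off running solo.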


----------



## agrims

Quote:


> Originally Posted by *artk2219*
> 
> Because the 7750 and 7770 are way more powerful than what's in Trinity or Richland. You could maybe run them in Dual Graphics mode, but those cards won't really gain anything; if anything it will probably slow them down, since they'll more than likely be waiting for the APU's graphics to finish its portion of the frame most of the time.


Well, maybe not; a lot of it could be made up in driver coding. I'm sure they could code the drivers so the APU augments the discrete card. It may work now; there are many reviews online showing it working with the 5800K and 7750, so AMD could optimize the driver code, and Richland may be a pleasant surprise to all of us.

What would be really great is if they keep FM2 boards through Kaveri but bring out a new chipset that can support GDDR4 or 5 on the motherboard, say a 95x series. I know we're currently limited by DDR3 speeds, but maybe once it's out, the current gen would be able to use the onboard memory through coding. We aren't even sure if the APU uses the GDDR5 on the 6670 as-is, or if it splits the work between card and chip. Could you imagine the happy smiles from everyone who wouldn't have to upgrade board and chip, just board!! George Takei: OHHH MYYYY!









I know, I know, it's a pipe dream, but a man can dream!


----------



## OldtimeGamer

Quote:


> Originally Posted by *artk2219*
> 
> Because the 7750 and 7770 are way more powerful than what's in Trinity or Richland. You could maybe run them in Dual Graphics mode, but those cards won't really gain anything; if anything it will probably slow them down, since they'll more than likely be waiting for the APU's graphics to finish its portion of the frame most of the time.


I tried to run my 6950 with the A10-5800K and that was exactly what I got: the 6950 was slowed down so much it hardly made any difference at all being installed. Now I'm just running the 6950 on its own and it's much, much better.

But it would certainly be nice if we could move past the limitations of a 6670 with this new release.


----------



## DaveLT

Quote:


> Originally Posted by *OldtimeGamer*
> 
> I tried to run my 6950 with the A10 5800K and that was exactly what I got...the 6950 was slowed down so much, it hardly made any difference at all being installed. Now I'm just running the 6950 and its much ..much better.
> 
> But it would certainly be nice if we could move past the limitations of a 6670 with this new release.


I think you mean 6770?


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> I think you mean 6770?


6670 is the fastest card you can crossfire with good results.


----------



## OldtimeGamer

Quote:


> Originally Posted by *Artikbot*
> 
> 6670 is the fastest card you can crossfire with good results.


Right...I was just referring to crossfire options.


----------



## Castaa

Quote:


> Richland had its name changed to 2013 AMD Elite Performance APU and although it doesn't deliver the same leap in performance like Jaguar-based parts, *it ends up 12 percent faster than Trinity in productivity, while visual performance is 20 to 40 percent better. The big news is that it is up to 51 percent more efficient than its predecessor.* It also bests the competition in gaming by 39 to 72 percent.


Richland for desktop official announced by AMD.

http://www.fudzilla.com/home/item/31468-amd-announces-kabini-temash-and-richland


----------



## Artikbot

Must... resist... Must... wait... for... Kaveri...


----------



## sanket779292

What about power consumption???


----------



## DaveLT

Quote:


> Originally Posted by *sanket779292*
> 
> what about power consumption? ???


Same as before, could possibly be lower. Of course Richland is only good for those who haven't gone to Trinity


----------



## Castaa

Quote:


> Originally Posted by *sanket779292*
> 
> what about power consumption? ???


Did you even read the quote? It's even in bold type. Up to 51% more efficient.


----------



## Milestailsprowe

Is there any sign of new motherboards for Richland?


----------



## agrims

Artikbot, you can wait; I'll tell you how it goes when they release! Going for the mac daddy. And the article specifically states it will be FM2. Kaveri may very well be FM2+ though


----------



## Heavy MG

Quote:


> Originally Posted by *agrims*
> 
> Artikbot, you can wait, I'll tell you how it goes when they release! Going for the mac daddy. And the article specifically states it will be FM2. Kaveri may very well be a FM2+ though


Buying into Richland and selling your Trinity chip might be an OK deal, seeing as AMD is probably going FM2+ or something, with all the talk of integrating GDDR5 onto the mobo. I'm thinking of selling my 5800K and HD 6670 for a 6800K.


----------



## Artikbot

Quote:


> Originally Posted by *Heavy MG*
> 
> Buying into Richland and selling your Trinity chip might be a ok deal,seeing as AMD is probably going FM2+ or something with all the talk of integrating GDDR5 onto the mobo. I'm thinking of selling my 5800K and HD6670 for a 6800K.


My Trinity chip is delidded, and although I have the lid right in front of me, I know that unless I sell it here on OCN, people will go a bit bonkers if they see the IHS come off the processor









I'll wait for Kaveri anyway, it's not like this one isn't fast already!!

@agrims: Thanks buddy!


----------



## DaveLT

What temps do you get @ stock before delidding?


----------



## Dromihetes

Quote:


> Originally Posted by *Artikbot*
> 
> My Trinity chip is delidded, and although I have the lid right in front of me, I knok unless I sell it here on OCN, people will go a bit bonkers if they see the IHS knocking off the processor
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll wait for Kaveri anyway, it's not like this one isn't fast already!!
> 
> @agrims: Thanks buddy!


So it's not soldered?!
I killed an X2 6400+ some years ago doing this; it was soldered and I discovered that too late


----------



## EmpireTrooper86

Is the new A10-6800K going to work with FM2 motherboards? My friend wants to know; he has a 5800K right now. Or would he have to get a whole different mobo? Just wondering.


----------



## DaveLT

Quote:


> Originally Posted by *msi intel gamer*
> 
> Is the new A10-6800K going to work with FM2 motherboards? My friend wants to know; he has a 5800K right now. Or would he have to get a whole different mobo? Just wondering.


Just update the motherboard BIOS using the 5800K if it doesn't boot with the 6800K installed.


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> What temps do you get @ stock before delidding?


Before delidding, thermal shutdown at 3.6GHz on stock fan settings (Gigabyte board misreading temperatures).

After delidding, rock solid at 3.6GHz on stock fan settings, exhaust air is much cooler.

For me, that's enough


----------



## MKHunt

Anybody going to bite on the A4-4000?

It's up on newegg.

http://www.newegg.com/Product/Product.aspx?Item=N82E16819113343

I'm holding out for Richland for my HTPC build.


----------



## DaveLT

Wow, that is seriously some good value there!
Kind of making me regret my HTPC purchase ...


----------



## overkll

Quote:


> Originally Posted by *MKHunt*
> 
> Anybody going to bite on the A4-4000?
> 
> It's up on newegg.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16819113343
> 
> I'm holding out for Richland for my HTPC build.


I did. Had it about 4 days now. It's not my main system; just wanted a new toy to play with while waiting for the A10s to come out.

According to a Kill A Watt meter, the whole system idles at 20-25 watts and draws 40-53 watts under load. The stock heatsink fan never even speeds up.

WEI is 5.8 for the processor in Win7 Pro. This is the low man on the Richland totem pole: dual-core, 3.0GHz, 3.2GHz Turbo, 1MB of L2 cache, and a Radeon 7460 IIRC. Don't have it powered up to check.

One more thing: this model only supports up to 1333MHz DDR3.


----------



## MKHunt

Quote:


> Originally Posted by *overkll*
> 
> I did. Had it about 4 days now. It's not my main system. Just wanted a new toy to play with. while waiting from A10's to come out.
> 
> According to a killawatt unit, whole system idles at 20-25 watts. 40 to 53 watts under load. Stock HS fan never even speeds up.
> 
> WEI is 5.8 for the processor in Win7 Pro. This is the low man on the Richland totem pole. Dual-core, 3.0Ghz, 3.2Ghz Turbo with 1MB of L2 cache and a Radeon 7460 IIRC. Don't have it powered up to check.
> 
> One more thing. This model only supports up to 1333MHz DDR3.


Any testing of 3D BD ISO playback support? I know with Trinity the A6-5400K was the lowest-end proc to support that, but I haven't heard any specifics on support in Richland. Not sure which to spring for in my build. I do wish there were more good mini-ITX FM2 boards.


----------



## overkll

Quote:


> Originally Posted by *MKHunt*
> 
> Any testing of 3D BD iso playback support? I know with trinity the A6-5400k was the lowest end proc to support that but I haven't heard of any specifics on support in Richland. Not sure which to spring for for my build. I do wish there were more good mini-ITX FM2 boards.


Nope. This is a cheap box - No optical drive as of now.

Correction: the IGP is a Radeon HD 7480D. Weird; everything I've read about Richland led me to believe the IGP would be 8xxx series. But then again, the desktop Richlands should also be Ax-6xxx series. What is this Frankenstein? Laptop silicon on a desktop FM2 PGA?


----------



## DaveLT

But the laptop 4xxx parts are Trinity ...


----------



## EmpireTrooper86

When is the A10-6800K coming out? Anyone know?


----------



## Milestailsprowe

Quote:


> Originally Posted by *overkll*
> 
> I did. Had it about 4 days now. It's not my main system. Just wanted a new toy to play with. while waiting from A10's to come out.
> 
> According to a killawatt unit, whole system idles at 20-25 watts. 40 to 53 watts under load. Stock HS fan never even speeds up.
> 
> WEI is 5.8 for the processor in Win7 Pro. This is the low man on the Richland totem pole. Dual-core, 3.0Ghz, 3.2Ghz Turbo with 1MB of L2 cache and a Radeon 7460 IIRC. Don't have it powered up to check.
> 
> One more thing. This model only supports up to 1333MHz DDR3.


did you have to update the BIOS?


----------



## overkll

Quote:


> Originally Posted by *msi intel gamer*
> 
> When is the A10-6800K coming out? Anyone know?


June 4th from what I've heard.


----------



## overkll

Quote:


> Originally Posted by *Milestailsprowe*
> 
> did you have to update the BIOS?


Yes. Had to use my Trinity chip to do the BIOS upgrade. The update seems to have fixed the high temp issue when in the BIOS, and the stock heatsink fan is much quieter. That's on an MSI FM2-A75MA-E35 mATX mobo.


----------



## Milestailsprowe

Quote:


> Originally Posted by *overkll*
> 
> Yes. Had to use my Trinity chip to do the bios upgrade. The update seems to have fixed the high temp issue when in the bios. And the stock heat sink fan is much quieter. That's on a MSI FM2-A75MA-E35 mATX mobo.


I was looking at a Richland CPU, but the motherboards for them aren't out yet.


----------



## overkll

Quote:


> Originally Posted by *Milestailsprowe*
> 
> I was looking at a Richland CPU, but the motherboards for them aren't out yet.


Huh? One can use *ANY* FM2 socket motherboard. Probably have to update the BIOS to the latest version.


----------



## Milestailsprowe

Quote:


> Originally Posted by *overkll*
> 
> Huh? One can use *ANY* FM2 socket motherboard. Probably have to update the bios to the latest version.


I mean a motherboard that will work with Richland right out of the box.


----------



## Artikbot

Quote:


> Originally Posted by *Milestailsprowe*
> 
> I mean a motherboard that will work with Richland right out of the box.


Pretty much every motherboard on the market supports it, even current models, as long as it appears in the supported product list.

If it doesn't work, though, the manufacturer should update your BIOS free of charge (not sure about shipping)


----------



## Milestailsprowe

Quote:


> Originally Posted by *Artikbot*
> 
> Pretty much every motherboard in the market with support for it, even current models if it appears in the supported product list.
> 
> If it doesn't work though, the manufacturer should update your BIOS free of charge (not sure about shipping)


Thing is I'm ITX and I have only one choice of board, which is the MSI ITX board, because it supports backplates and won't catch fire.

It does not support Richland out of the box.


----------



## MKHunt

Quote:


> Originally Posted by *Milestailsprowe*
> 
> Thing is I'm ITX and I have only one choice of board, which is the MSI ITX board, because it supports backplates *and won't catch fire*.
> 
> It does not support Richland out of the box.


That is the best feature IMO.


----------



## DaveLT

Unlike ASRock, lol.


----------



## Alanim

Quote:


> Originally Posted by *MKHunt*
> 
> That is the best feature IMO.


That's the WORST feature. It should have included the house heating feature.

I loved the feature where I did a caseless test of the motherboard on its cardboard box, then entered the BIOS so that I could heat my entire house in mere moments as the INNOVATIVE fire feature activated. No other motherboard can heat your entire house like that.

I mean, if that wasn't a feature, then why wasn't there a mandatory recall given the roughly 70% feature activation rate?


----------



## MKHunt

I admit, that's a pretty valid point. From a power draw perspective, it's hard to beat that efficiency for a small space heater.


----------



## Alanim

Quote:


> Originally Posted by *MKHunt*
> 
> I admit, that's a pretty valid point. From a power draw perspective, it's hard to beat that efficiency for a small space heater.


House heating was more implying your entire house catching fire, epic heat.

*edit* although space heaters are known for that as well.

I'm sure it has killed someone by now; you just don't hear about it because they're dead.


----------



## Artikbot

Quote:


> Originally Posted by *Alanim*
> 
> I'm sure it has killed someone by now; you just don't hear about it because they're dead.


That's pretty sadistic


----------



## DaveLT

Quote:


> Originally Posted by *Artikbot*
> 
> That's pretty sadistic


I think ... that's just the bare minimum of ... "respect".
Being sadistic would be talking about how they got slowly burned and then rolled into their graves while on fire.


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> I think ... that's just being in the least of ... "respect"
> Being sadistic is talking about how they got slowly burned and then rolling into their graves while on fire


----------



## badrapper

http://diybbs.zol.com.cn/11/11_106618.html

Some Info here on *AMD Richland A10-6800K* Performance


----------



## Artikbot

112W for the 6800K... Damn, I might even box up this A10-5700 and put it on the market; the A10-6700 should be a quiet monster!


----------



## brasslad

Gotta love APUs for low end builds.
My greatest aggravation is that for upgraders there should be at least an FM2 FX-6300/6350 no-iGPU variant for people going to discrete cards. Or even a 95-watt FX-8300/8350.
Yes I know it would be a small market.


----------



## DaveLT

Quote:


> Originally Posted by *brasslad*
> 
> Gotta love APUs for low end builds.
> My greatest aggravation is for upgrade there should be at least a FM2 FX-6300/6350 no iGPU variant for people going to discrete cards. Or even a 95 watt FX-8300/8350.
> Yes I know it would be a small market.


It would probably be clocked at 3GHz .. lol. The problem with making bigger dies like an 8350 for an APU is that it costs more to cut down, so they start off with dual cores and work up to quad cores.
So they don't do bigger dies. Usually the silicon is there in the die but lasered off for various reasons.
That's how they keep prices of the APUs down. I would have liked to see a 100W 6-core PD variant though.








Nevermind, SR promises a 6-core APU and an IPC increase to near SB level. I think.
And on desktop a 10-core is rumored to be released.

But wow, Richland even OC'd is still pulling much lower power consumption than the A10-5800K, according to the review. I doubt it is correct though.


----------



## Castaa

Quote:


> Originally Posted by *badrapper*
> 
> http://diybbs.zol.com.cn/11/11_106618.html
> 
> Some Info here on *AMD Richland A10-6800K* Performance


Overclocked *A10-6800K* @ 4.4GHz: 3DMark11 *P2221*

Versus Xbit's review:

Overclocked *A10-5800K* @ 4.5GHz: 3DMark11 *P2020*

So taking overclocking into account, Richland is only about 10% faster. Still half the 3DMark performance of my discrete OC'ed GTX 460.
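For what it's worth, the ~10% figure falls straight out of those two scores; a quick sanity check:

```python
# Relative improvement implied by the overclocked 3DMark11 scores above:
# A10-6800K @ 4.4GHz scoring P2221 vs A10-5800K @ 4.5GHz scoring P2020.
richland_score = 2221
trinity_score = 2020
gain_pct = (richland_score - trinity_score) / trinity_score * 100
print(f"Richland OC over Trinity OC: +{gain_pct:.1f}%")  # roughly +10%
```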


----------



## DaveLT

"Only," lol. It's just a lower-power-consumption OC'd 5800K ... Keep in mind this is an integrated GPU. Pretty darn much better than what Intel puts out, even with Haswell.


----------



## teldar

I don't know if this was addressed or not.

The GPU has some architecture changes. I don't remember where I read this, but I believe they changed the layout of the clusters. They supposedly went from 6 to 3 clusters, I believe. With the new layout it increased efficiency and decreased power consumption, not just a clock speed bump.


----------



## Castaa

Quote:


> Originally Posted by *DaveLT*
> 
> "only" lol. It's just a lower power consumption OC'd 5800k ... Keep in mind this is a integrated. Pretty darn much better than what intel puts out even for haswell


Well considering they claimed 20-40% performance increases a few months back, I say "only".


----------



## agrims

It may not be a powerhouse, but for the money, where are you going to find this performance per dollar? I am in line and checking daily for the release of the 6800K. It will be mine. And I may not bite on Kaveri right away when it releases after Richland, but I personally think the numbers for Trinity are very solid at the end of the day, in ways that mere computer mortals can't see or feel other than their wallets being lighter. How much of a waste is Haswell or SB/IB to the average person? I truly think AMD is on the right path for the new standard of computing. OK, rant over.

I still cannot wait to give the 6800 a go!


----------



## DaveLT

Quote:


> Originally Posted by *Castaa*
> 
> Well considering they claimed 20-40% performance increases a few months back, I say "only".


Their performance increase claims come from CPU+GPU boosts


----------



## overkll

Found some A8-6600K's for sale already in the U.S.!!!! Provantage has em in stock for $119.61.

http://www.provantage.com/amd-ad660kwohlbox~7AAMD2UV.htm

I know it's not the 6800K or even the 6700, but for those who cannot wait another week or so, it's a good option.


----------



## Artikbot

Quote:


> Originally Posted by *Castaa*
> 
> Well considering they claimed 20-40% performance increases a few months back, I say "only".


Because 3DMark is extremely representative of performance, is it?


----------



## Castaa

Quote:


> Originally Posted by *Artikbot*
> 
> Because 3DMark is extremely representative of performance, is it?


What other measure would you like to use to show a 20-40% improvement? Graphics processing seems to be the benchmark most friendly to Richland.


----------



## DaveLT

Quote:


> Originally Posted by *Castaa*
> 
> What other measure would you like to use to show a 20-40% improvement? Graphics processing seems to be the most Richland friendly benchmark for it.


Try ... a game? Or unigine heaven.


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> Try ... a game? Or unigine heaven.


Exactly. Game performance is measured, well, in games. Not in 3DMark.


----------



## Castaa

Quote:


> Originally Posted by *Artikbot*
> 
> Exactly. Game performance is measured, well, in games. Not in 3DMark.


You two are on crack if you expect *any* game to show a 20-40% frame rate improvement over Trinity.

3DMark11 is going to be the most Richland-friendly measure because it scales the most with improved GPU performance.


----------



## Darias

Quote:


> Originally Posted by *overkll*
> 
> Found some A8-6600K's for sale already in the U.S.!!!! Provantage has em in stock for $119.61.
> 
> http://www.provantage.com/amd-ad660kwohlbox~7AAMD2UV.htm
> 
> I know it's not the 6800K or even the 6700, but for those who cannot wait another week or so, it's a good option.


I just went to your link and the A8-6600K is already out of stock, lol!!!


----------



## spatulator

Guys, Richland is not a generation ahead of Trinity. The 20-40% improvement is compared to Llano, which was the first-generation APU. Both Trinity and Richland (being Piledriver-based) are considered by AMD to be second-generation APUs. I posted a slide somewhere that shows this; I'm not going to bother reposting it. Richland actually sounds like a nice evolution of Piledriver though, with low power draw. Hopefully this one will OC like a beast.


----------



## DaveLT

The GPU section had a bit of change along with the CPU to lower power consumption


----------



## Bbrad

Quote:


> Originally Posted by *spatulator*
> 
> Guys, Richland is not a generation ahead of Trinity. The 20-40% improvement is when compared to Llano, which was the first generation apu. Both trinity and richland (being piledriver based) are considered by AMD to be second generation APUs. I posted a slide somewhere that shows this I'm not going to bother reposting it. Richland actually sounds like a nice evolution of piledriver though with low power draw. Hopefully this one will OC like a beast.


The 20% improvement is over Trinity; that's already been said and discussed.

Sent from my SCH-I500 using Tapatalk 2


----------



## boot318

http://www.youtube.com/watch?v=KE6pIMqMQ2E

Newegg talking about the new APUs.


----------



## Alanim

Quote:


> Originally Posted by *boot318*
> 
> http://www.youtube.com/watch?v=KE6pIMqMQ2E
> 
> Newegg talking about the new APUs.


They said it uses GCN cores in the video, that can't be right?


----------



## Milestailsprowe

Figured it out


----------



## DaveLT

Quote:


> Originally Posted by *Alanim*
> 
> They said it uses GCN cores in the video, that can't be right?


IF it used GCN cores we would be seeing a 40-50% performance boost with far fewer stream processors and lower power consumption ...


----------



## overkll

Quote:


> Originally Posted by *Darias*
> 
> I just went to your link and the A8-6600K is already out of stock, lol!!!


I don't think it's out of stock. They had 997 in stock and 1750 incoming. I think the distributor listed them early; once AMD found out, they put the kibosh on 'em. Provantage even delisted the part number!

I, however, did get one! Delivered yesterday. In use today.


----------



## peter-mafia

Quote:


> Originally Posted by *overkll*
> 
> I, however, did get one! Delivered yesterday. In use today.


Do you mind telling us anything about it? Any substantial improvement? What does the stock cooler look like?
I don't see any reason to pay $142 for a 6800K; the 5800K was $129 when it came out. Already sold my 5800K, though. Probably will end up buying an A8-5600K for $91 + a low-profile 7750 for $85 after rebate.


----------



## DaveLT

Maybe good time to get a FM2 proc!


----------



## tuffy12345

Why can't I just buy it yettttttt?


----------



## overkll

Quote:


> Originally Posted by *peter-mafia*
> 
> Do you mind telling us anything about it? Any substantial improvement? How does the stock cooler look like?
> I don't see any reason to pay $142 for a 6800K. 5800K was $129 when it came out. Already sold my 5800K, though. Probably will end up buying a A8-5600K for $91 + a low profile 7750 for $85 after rebate.


I haven't played with it much yet. I don't have the same Trinity chip to compare it to; I have an A10-5700, which I love: cool, quiet, low power usage. The biggest difference I see so far is that the minimum clock speed for Cool'n'Quiet is 1900MHz, as opposed to 1400MHz for my A10-5700. It also handles 2133MHz RAM without overclocking.

Seems like Funtoo Linux likes it. Overall WEI in Win7 Pro (meaningless) is 6.9 (graphics). Proc is 7.3 IIRC.

Like I said, haven't played with it much yet. Need to hook it up to a Kill A Watt meter. I suspect the idle usage is insanely low.

Stock cooler is the same as Trinity.

Unfortunately playtime has to wait. Got Saturday chores/errands to do.


----------



## DaveLT

Jeez, 7.3? Goes to show WEI is super crappy. I am at 2.52GHz and it gives me 7.3.
At 3.6GHz it hardly changes, and I can promise you this thing is a lot faster than an A10-5800K even at 2.52GHz. You get a bottleneck with an A10-5800K and a 7850, but I don't







(Tested : BF3)


----------



## tuffy12345

Quote:


> Originally Posted by *DaveLT*
> 
> Jeez, 7.3? Goes to show WEI is super crappy. I am on 2.52GHz and it gives me 7.3.
> 3.6GHz and it hardly changes and i can promise you ... this thing is alot faster than a A10-5800k even at 2.52GHz. You get a bottleneck with the A10-5800k with a 7850 but i don't
> 
> 
> 
> 
> 
> 
> 
> (Tested : BF3)


----------



## darkusx45

A10-6800k @ Newegg: http://bit.ly/15AODAS


----------



## Bbrad

Zoomed in on the side of the box: it still lists the 6670 as the highest GPU it can be CrossFired with.

Sent from my SCH-I500 using Tapatalk 2


----------



## Himo5

A10-6800K @ aria .co.uk


----------



## EliteReplay

reviews?


----------



## mircopolo

Wccftech have some numbers at the bottom of this slide for a few games and random CPU benchmarks (not sure where they got them from). I keep pinging Google every 15 minutes, but no detailed reviews yet.









http://wccftech.com/amd-launches-generation-aseries-richland-apus-desktop-fm2-platform/


----------



## sanket779292

a10 5800k available at newegg


----------



## beers

Quote:


> Originally Posted by *sanket779292*
> 
> a10 5800k avaliable at newegg


6800k you mean..

Microcenter already has some bundles for these as well.
http://www.microcenter.com/site/products/amd_bundles.aspx


----------



## ET3D

Quote:


> Originally Posted by *mircopolo*
> 
> Wccftech have some number at the bottom of this slide for a few games and random CPU benchmarks (not sure where they got them from), I keep pinging google every 15 minutes but no detailed reviews yet.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://wccftech.com/amd-launches-generation-aseries-richland-apus-desktop-fm2-platform/


Thanks for the link. It's the only Richland results I've found yet. I hope the reviews arrive soon. Looks like a marginal improvement.


----------



## edge929

Richland is just supposed to be a small bump over Trinity. Iris Pro should easily beat the 8670D in the 6800K. Kaveri should provide a bigger bump in late 2013.


----------



## Artikbot

Quote:


> Originally Posted by *edge929*
> 
> Richland is just supposed to be a small bump over Trinity. Iris Pro should easily beat the 8670D in the 6800K. Kaveri should provide a bigger bump in late 2013.


----------



## Milestailsprowe

I might have to pick up one of those Microcenter bundles. That price, with the cooler I want, saves me $70. Any difference between the MSI ITX board and the A85X ASRock?


----------



## DaveLT

Quote:


> Originally Posted by *edge929*
> 
> Richland is just supposed to be a small bump over Trinity. Iris Pro should easily beat the 8670D in the 6800K. Kaveri should provide a bigger bump in late 2013.


Oh christ ... the Iris Pro is only a TINY bit better than the 7660D; a 5-10% increase puts them on par.
BUT ALSO: how much does the Iris Pro cost, if anyone would remind me? That's right. *$617*


----------



## Seronx

8670D can Hybrid Crossfire with the 7750.


----------



## edge929

Quote:


> Originally Posted by *DaveLT*
> 
> Oh christ ... The Iris Pro is only a TINY bit better than 7660D. 5-10% increase puts them on par.
> BUT ALSO. How much does the iris pro cost if anyone would remind me? That's right. *617$*


Of course Intel will want the "Intel fee" for Iris Pro as they do with all their products; I wasn't comparing on price. The 5200 is consistently faster than the 7660D and, as you mention, should be roughly on par with the 8670D. The big difference here is power consumption: the 5200 accomplishes at 47W what the 8670D does at 100W. Would I ever buy a 5200 chip? Absolutely not.
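A back-of-envelope way to frame that TDP argument (treating the quoted 47W and 100W TDPs as a loose proxy for actual draw, and assuming roughly equal graphics performance, which is itself debatable):

```python
# Rough efficiency ratio implied by the TDPs quoted above: Iris Pro 5200 at
# 47W vs the A10-6800K (HD 8670D) at 100W. TDP is only a loose proxy for
# real power draw, and "equal performance" is an assumption, not a measurement.
iris_pro_tdp = 47    # watts
a10_6800k_tdp = 100  # watts
ratio = a10_6800k_tdp / iris_pro_tdp
print(f"Implied efficiency advantage: ~{ratio:.1f}x")
```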


----------



## Bbrad

Quote:


> Originally Posted by *Seronx*
> 
> 8670D can Hybrid Crossfire with the 7750.


You didn't read it; it clearly states on the side of the box that it can only be CrossFired with the 6670.









Sent from my SCH-I500 using Tapatalk 2


----------



## beers

Quote:


> Originally Posted by *Seronx*
> 
> 8670D can Hybrid Crossfire with the 7750.


Do you have a source?
Even if you could, you probably wouldn't want to. The iGPU is in the same performance class as the 6670 or 6570 GDDR5, not to mention also being VLIW


----------



## DaveLT

Quote:


> Originally Posted by *edge929*
> 
> Of course Intel will want the "Intel fee" for Iris Pro as they do with all their products, I wasn't comparing on price. The 5200 is consistently faster than the 7660D and as you mention, should be roughly on par with the 8670D. The big difference here is power consumption. The 5200 accomplishes at 47W what the 8670D does at 100W. Would I ever buy a 5200 chip? Absolutely not.


And your point is?
Don't be nonsensical. AMD accomplishes what Intel does for $500 less.


----------



## nakano2k1

Quote:


> Originally Posted by *Bbrad*
> 
> You didn't read it; it clearly states on the side of the box that it can only be CrossFired with the 6670.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sent from my SCH-I500 using Tapatalk 2


He can read and do research as well, instead of just taking AMD's word for it.

http://www.youtube.com/watch?v=WOENTgy7z5c

Even though this is an A10-5800K, it's the same VLIW4 architecture as in the new A10-6800K, so it will CrossFire.

As for YOU, if you're going to post on the forums with the big boys, learn the difference between "They're, Their and There" PLEASE!!!


----------



## edge929

Quote:


> Originally Posted by *DaveLT*
> 
> And your point is?
> Don't be nonsensical. AMD accomplishes what Intel does at 500$ less


My point is the Intel chip is more impressive from a power standpoint. Intel accomplishes what AMD can at half the TDP. Some will pay the premium for less power/heat, I won't. With that said, I'm neither an Intel or AMD fan.


----------



## Bbrad

Quote:


> Originally Posted by *edge929*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DaveLT*
> 
> And your point is?
> Don't be nonsensical. AMD accomplishes what Intel does at 500$ less
> 
> 
> 
> My point is the Intel chip is more impressive from a power standpoint. Intel accomplishes what AMD can at half the TDP. Some will pay the premium for less power/heat, I won't. With that said, I'm neither an Intel or AMD fan.
Click to expand...

You wanna talk power consumption? Look at an APU: two times the graphics power of Haswell plus a quad core, all at a very low TDP ([MINDBLOWN])

Sent from my SCH-I500 using Tapatalk 2


----------



## Assimilator87

Quote:


> Originally Posted by *edge929*
> 
> My point is the Intel chip is more impressive from a power standpoint. Intel accomplishes what AMD can at half the TDP. Some will pay the premium for less power/heat, I won't. With that said, I'm neither an Intel or AMD fan.


It's not entirely fair to compare a 22nm chip to a 32nm one. Even with Kaveri, AMD will still be quite a bit behind in process technology (28nm). So AMD will pretty much *always* use more power at the same performance level if we're not considering architecture differences.


----------



## MKHunt

Amazon had the Asus F2 A85 Pro-M for 99 shipped. Richland here I come.


----------



## lostmage

Anyone here know how the 6800k and a crossfired 6670 would compare to my current 1055t @3.4ghz and 5850?


----------



## beers

Quote:


> Originally Posted by *lostmage*
> 
> Anyone here know how the 6800k and a crossfired 6670 would compare to my current 1055t @3.4ghz and 5850?


Should be quite a bit less. I would expect something around 2500 in 3DMark11, which is around on par with a 7750.


----------



## Bbrad

Quote:


> Originally Posted by *beers*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lostmage*
> 
> Anyone here know how the 6800k and a crossfired 6670 would compare to my current 1055t @3.4ghz and 5850?
> 
> 
> 
> Should be quite a bit less. I would expect something around 2500 in 3dm11 which is around on par with a 7750.
Click to expand...

less? I think you mean more haha

Sent from my SCH-I500 using Tapatalk 2


----------



## nakano2k1

Quote:


> Originally Posted by *Bbrad*
> 
> less? I think you mean more haha
> 
> Sent from my SCH-I500 using Tapatalk 2


What are YOU talking about??

When CrossFire between the 8670D and the 7750 actually works well (which isn't very often), it would still be easily beaten by a 5850 and a true six-core Phenom CPU.

The only thing the A10 + 7750 combo would win at is being more efficient, given the smaller manufacturing process and newer CPU instructions.


----------



## xd_1771

Quote:


> Originally Posted by *edge929*
> 
> My point is the Intel chip is more impressive from a power standpoint. Intel accomplishes what AMD can at half the TDP. Some will pay the premium for less power/heat, I won't. With that said, I'm neither an Intel or AMD fan.


You know, there are 65W TDP APUs that have the 8670D in them and 4 cores, like the A10-6700.


----------



## xd_1771

*On the thread*

*I ask that all members do their part in complying with the OCN Terms of Service and Professionalism Initiative (links in sig) throughout this discussion.*

*Regards, -xd*


----------



## beers

Quote:


> Originally Posted by *Bbrad*
> 
> less? I think you mean more haha
> 
> Sent from my SCH-I500 using Tapatalk 2


Probably not. I'll likely pick up a 6800K here in the next week or so, and I have a 7570 to CrossFire it with.
If you look at the 5800K CrossFire figures, they weren't much over 2000 in the default-level 3DMark11 tests.


----------



## Bbrad

Quote:


> Originally Posted by *beers*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bbrad*
> 
> less? I think you mean more haha
> 
> Sent from my SCH-I500 using Tapatalk 2
> 
> 
> 
> Probably not. I'll pick up a 6800k likely here in the next week or so and have a 7570 to crossfire it with.
> If you look at the 5800k crossfire figures they weren't too much over 2000 for default level 3dmark11 tests.
Click to expand...

tell me how it goes cross firing it







and are there any guides to doing it, considering it's not officially supported?

Sent from my SCH-I500 using Tapatalk 2


----------



## Scooter31

Quote:


> Originally Posted by *Bbrad*
> 
> tell me how it goes cross firing it
> 
> 
> 
> 
> 
> 
> 
> and is there any guides to doing it considering its not officially supported?
> 
> Sent from my SCH-I500 using Tapatalk 2


I think he means AMD Dual Graphics. http://www.amd.com/us/products/technologies/dual-graphics/Pages/dual-graphics.aspx#3


----------



## Clockdripdoor

A10-6800K is available at Newegg. I ordered mine a few hours ago.

http://www.newegg.com/Product/Product.aspx?Item=N82E16819113331&cm_sp=recommend-_-19-113-331-_-06042013_2


----------



## Bbrad

Quote:


> Originally Posted by *Scooter31*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bbrad*
> 
> tell me how it goes cross firing it
> 
> 
> 
> 
> 
> 
> 
> and is there any guides to doing it considering its not officially supported?
> 
> Sent from my SCH-I500 using Tapatalk 2
> 
> 
> 
> I think he means AMD Dual Graphics. http://www.amd.com/us/products/technologies/dual-graphics/Pages/dual-graphics.aspx#3
Click to expand...

-_- that's what I meant lol

Sent from my SCH-I500 using Tapatalk 2


----------



## peter-mafia

Was hoping till the last minute it would CrossFire with a 7750. Very disappointed (taking into account the new price tag). Money-wise, Trinity is much better (wish I hadn't sold my A10-5800K). Already ordered an A8-5600K for $89 (SimCity included, to be sold) + a HIS 7750 ($85 after MIR). The IGD will be turned off. Waiting for Kaveri


----------



## M3T4LM4N222

I wanna see some benchmarks on the A10-6800K. I wanna see what the actual improvements are.


----------



## Bateman2343

What settings could the A10-6800K run Crysis 3 at by itself (without a discrete GPU)? What about Kaveri?


----------



## MKHunt

Of course Amazon is super mega out of stock of the 6800k


----------



## Dr_Asik

Quote:


> Originally Posted by *Bateman2343*
> 
> What settings could the A10-6800K run Crysis 3 at by itself (without a discrete GPU)? What about Kaveri?


720p, low settings, perhaps med.


----------



## M3T4LM4N222

Quote:


> Originally Posted by *Dr_Asik*
> 
> 720p, low settings, perhaps med.


No... I ran the BETA on low settings @ 1920 x 1080 @ about 35fps on average with the A10-5800K. That was with 13.1B drivers.

You get about the same frames in FC3, but I haven't played Far Cry since 12.11B drivers.


----------



## overkll

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> I wanna see some benchmarks on the A10-6800K. I wanna see what the actual improvements are.


I heard the review embargo will be lifted at midnight Eastern Daylight time. That'd be about 5 minutes from now...


----------



## Bbrad

Quote:


> Originally Posted by *overkll*
> 
> Quote:
> 
> 
> 
> Originally Posted by *M3T4LM4N222*
> 
> I wanna see some benchmarks on the A10-6800K. I wanna see what the actual improvements are.
> 
> 
> 
> I heard the review embargo will be lifted at midnight Eastern Daylight time. That'd be about 5 minutes from now...

which review?

Sent from my SCH-I500 using Tapatalk 2


----------



## overkll

Quote:


> Originally Posted by *Bbrad*
> 
> which review?
> 
> Sent from my SCH-I500 using Tapatalk 2


It's starting. Here's Anandtech's:

http://www.anandtech.com/show/7028/amd-richland-desktop-apus-now-available


----------



## RegularBear

S|A has a well done review up.

http://semiaccurate.com/2013/06/04/amd-brings-richland-to-the-desktop-the-a10-6800k/

Take note that the performance increases here are absolutely nothing like the wild figures that have been rumored at multiple unscrupulous sources.


----------



## Artikbot

Quote:


> Originally Posted by *RegularBear*
> 
> S|A has a well done review up.
> 
> http://semiaccurate.com/2013/06/04/amd-brings-richland-to-the-desktop-the-a10-6800k/
> 
> Take note that the performance increases here are absolutely nothing like the wild figures that have been rumored at multiple unscrupulous sources.


Richland is an overclocked Trinity anyway; we all knew how it would turn out compared to it.









But the really interesting thing is how they pulled general performance up by about 10% and lowered power consumption by about as much.

After all, it's the only thing K-series APUs are trailing Intel's offerings on. Non-K already match them.

Also, SA dropped the ball... again.
Quote:


> It's a good chip, but to a large extent it suffers from the same flaws as Trinity which is to say that its CPU is uncompetitive and offers lower performance per watt than competing Intel chips.


They aren't comparing Richland's CPU to a 3770K, are they?


----------



## DaveLT

Quote:


> Originally Posted by *Artikbot*
> 
> They aren't comparing Richland's CPU to a 3770K, aren't they?


Yet again... taking Richland's GPU into account, Intel can forget about fighting in the perf/watt arena, and we all know that.


----------



## Th4natos

Maybe someone can answer the one question that's stopping me from jumping into the A10-6800K with both feet right now: BIOS support on FM2 boards. Obviously it's compatible, but since it comes with a disclaimer saying you'll need to update the BIOS, and I don't have any other chip that fits an FM2 socket, am I screwed?


----------



## Artikbot

Quote:


> Originally Posted by *Th4natos*
> 
> Maybe someone can answer this question for me, as it is the only thing stopping me from jumping into the A10 6800K with both feet right now and that is BIOS support from FM2 boards. Obviously, it is compatible, but since it comes with a disclaimer saying that you will need to update the BIOS and I don't have any other chips that would fit into an FM2 socket, am I screwed?


You could ask the retailer to do the update for you. Most won't have any problem doing so.


----------



## sanket779292

I'm also not buying the 6800K because of the BIOS hassles. Is there any way to update the BIOS without an older chip, or will the chip at least run long enough to do the BIOS update the first time?


----------



## sorance2000

Quote:


> Originally Posted by *Artikbot*
> 
> Richland is an overclocked Trinity anyway; we all knew how it would turn out compared to it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But the real interesting thing is how they pulled general performance by about a 10% and lowered power consumption equally as much.
> 
> After all, it's the only thing K-series APUs are trailing Intel's offerings on. Non-K already match them.
> 
> Also, SA dropped the ball... again.
> They aren't comparing Richland's CPU to a 3770K, aren't they?


I don't think so. If Richland were just an overclocked Trinity, the TDP would have had to go up, but it didn't. That's the reason it's named Richland: more performance in the same TDP.
Try overclocking a Trinity up to Richland's frequencies while maintaining a 100W TDP.


----------



## DaveLT

It also has lower power consumption, even at a higher clock.


----------



## majorleague

Here is a YouTube link showing 3DMark11 and the Windows Experience Index rating for the 4770K 3.5GHz Haswell. Not overclocked.

This is around 10-20fps slower than the 6800K in most games. And almost twice the price!!

Youtube link:
click


----------



## overkll

Quote:


> Originally Posted by *RegularBear*
> 
> S|A has a well done review up.
> 
> http://semiaccurate.com/2013/06/04/amd-brings-richland-to-the-desktop-the-a10-6800k/
> 
> Take note that the performance increases here are absolutely nothing like the wild figures that have been rumored at multiple unscrupulous sources.


They also didn't test with 2133MHz memory:
Quote:


> ...and our RAM of choice is an 8 GB DDR3 1866Mhz kit from AMD. [Editor's note: This is the memory we have available. We would prefer to test with faster memory. We hope our budget allows such at a future date.]...


----------



## Himo5

Quote:


> Originally Posted by *Papadope*
> 
> Quote:
> 
> 
> 
> I notice also that - presumably in preparation for Richland - ASUS have updated the BIOS for the F2A85-V PRO to v.6002 with additional updates for Chipset and Graphics, a new version of AI Suite II and an update to the QVL.
> So upgrading the APU will now involve a - not so simple - reinstall of W8 (since I am still bedding it in) and the usual problem of working out how to upgrade the motherboard DVD with the new files - something that somehow never gets mentioned.
> 
> 
> 
> Wait what? why does upgrading the APU to Richland force you to reinstall Windows 8?

As I feared, booting into W8 with a new A10-6800K in place of an A10-5800K was greeted as follows:



It looks like updating the driver DVD on Asus FM2 boards is about to become an issue for a lot of people.


----------



## Bbrad

Can I use 2400MHz memory with the A10 just to improve performance, even though it's not officially supported?

Sent from my SCH-I500 using Tapatalk 2


----------



## Himo5

If you have the A10-6800K, an Asus FM2 board and some 2400MHz RAM whose SPD includes an XMP profile, you can do this in the BIOS AI Tweaker menu by choosing DOCP mode for [AI Overclock Tuner]. This should POST and boot. Then go into AI Suite, open DIGI+ Control and hit the big red Smart DIGI+ key, accept the VRM setting it offers, then go into TurboV EVO and set the APU multiplier and CPU voltage (leave the APU frequency at 100). This will give you a 5GHz CPU and RAM at 2400MHz, and leave you to see what you can get with GPU boost.

Here's a running example:


PS. Just to say that, of course, you need an aftermarket cooler to do this, but you don't have to go overboard with it; something like my Mugen 3B will do nicely.
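Since the APU (base) frequency stays at 100MHz, the multiplier alone sets the final CPU speed. A quick sketch of that arithmetic, using the 100MHz base and 5GHz target from above:

```python
# CPU speed = base clock x multiplier; the base clock is left at stock 100 MHz
base_clock_mhz = 100      # stock APU frequency, left untouched
target_mhz = 5000         # the 5 GHz target

multiplier = target_mhz / base_clock_mhz
print(multiplier)         # 50.0 -> a 50x APU multiplier gives 5 GHz
```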


----------



## Castaa

Quote:


> As the last step in overclocking our hybrid platform we selected the highest system memory frequency. The Richland's controller supports DDR3-2400 and we had no problems enabling that clock rate. As a result, we had a well-overclocked platform whose graphics performance was much higher than at the default settings.
> 
> The overall score of the overclocked A10-6800K is almost 25% higher, its x86 part having accelerated by 10% according to the gaming physics test. The integrated Radeon HD 8670D is comparable to Radeon HD 6670 graphics cards when overclocked.


Xbit has their overclocking review up. http://www.xbitlabs.com/articles/cpu/display/amd-a10-6800k_9.html
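To make the quoted gains concrete, here's the arithmetic with a hypothetical stock score indexed to 100 (the percentages are Xbit's; the baseline is made up):

```python
# Hypothetical stock score, indexed to 100, purely to illustrate the
# percentage gains quoted from Xbit's review.
stock = 100.0
overall_oc = stock * 1.25   # "overall score ... almost 25% higher"
x86_oc = stock * 1.10       # "x86 part having accelerated by 10%"
print(f"overall: {overall_oc:.0f}, x86: {x86_oc:.0f}")  # overall: 125, x86: 110
```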


----------



## beers

Quote:


> Originally Posted by *Castaa*
> 
> Xbit has their overclocking review up. http://www.xbitlabs.com/articles/cpu/display/amd-a10-6800k_9.html


Dang, that article is full of typos, misnamings and unit errors...


----------



## RyanJC

Are there any reviews of the 6800K with dual graphics? All the reviews I've come across avoid this subject...


----------



## Bbrad

Quote:


> Originally Posted by *RyanJC*
> 
> Are there any reviews of the 6800-K with dual graphics? All the reviews I've come across avoid this subject...


It's going to be the exact same performance as the 5800K, maybe with one more fps because of the overclock.

Sent from my SCH-I500 using Tapatalk 2


----------



## Dr_Asik

Quote:


> Originally Posted by *M3T4LM4N222*
> 
> No... I ran the BETA on low settings @ 1920 x 1080 @ about 35fps on average with the A10-5800K. That was with 13.1B drivers.
> 
> You get about the same frames in FC3, but I haven't played Far Cry since 12.11B drivers.


1080p at 35fps? Benchmarks show below 20 for Richland: http://uk.hardware.info/reviews/4461/8/amd-a10-6800k--a10-6700-cpu-review-richland-tested-benchmarks-igpu-crysis-3-1920x1080-low


----------



## Bbrad

Quote:


> Originally Posted by *Dr_Asik*
> 
> Quote:
> 
> 
> 
> Originally Posted by *M3T4LM4N222*
> 
> No... I ran the BETA on low settings @ 1920 x 1080 @ about 35fps on average with the A10-5800K. That was with 13.1B drivers.
> 
> You get about the same frames in FC3, but I haven't played Far Cry since 12.11B drivers.
> 
> 
> 
> 1080p at 35fps? Benchmarks show below 20 for Richland: http://uk.hardware.info/reviews/4461/8/amd-a10-6800k--a10-6700-cpu-review-richland-tested-benchmarks-igpu-crysis-3-1920x1080-low

Those benchmarks are rigged for Intel. On my old A4 I get 20+ fps in Battlefield.

Sent from my SCH-I500 using Tapatalk 2


----------



## Dr_Asik

Quote:


> Originally Posted by *Bbrad*
> 
> them benchmarks is rigged for Intel on my old a4 I get 20+ fps on battliefield
> 
> Sent from my SCH-I500 using Tapatalk 2


We were talking about Crysis 3. The Intel chips actually do way worse in that benchmark, and the A10s are tested with 2133MHz memory, so I don't see what you're talking about.


----------



## FIRINMYLAZERMAN

Does anyone know if there's any information available comparing the A10-6800K to a Phenom II X4 965 BE? I'm very interested in how the A10-6800K compares, specifically in gaming.


----------



## lacrossewacker

Quote:


> Originally Posted by *Bbrad*
> 
> You wanna talk power consumption? Look at an APU: two times the graphics power of Haswell and a quad core for a very low TDP ([MINDBLOWN])
> 
> Sent from my SCH-I500 using Tapatalk 2


Well, to bring in the TDP talk, you'd better take a look at the i7-4950HQ.

Half of the A10-6800k's TDP, yet....

http://images.anandtech.com/graphs/graph6993/55292.png
http://images.anandtech.com/graphs/graph6993/55286.png
http://images.anandtech.com/graphs/graph6993/55313.png
http://images.anandtech.com/graphs/graph6993/55315.png

That's all within a 47 watt envelope...for laptops....

They didn't have a 6800k in that review, I was using the 5800k as my comparison...but they're basically the same.

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/20

Just FYI







(oh, but the Intel CPU is expensive as balls)


----------



## MKHunt

Just ordered a 6800K, an SSD, and some 2400MHz RAM. HTPC, assemble!


----------



## jsc1973

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> I'm curious to know if any of you know if there's any information available comparing the A10-6800K to a Phenom II X4 965 BE? I'm very interested to know how well the A10-6800K performs in comparison to the Phenom II X4 965 BE, more specifically in gaming.


That would depend on what graphics card is running with the 965BE. Without that information, there's no way to know.


----------



## RyanJC

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> I'm curious to know if any of you know if there's any information available comparing the A10-6800K to a Phenom II X4 965 BE? I'm very interested to know how well the A10-6800K performs in comparison to the Phenom II X4 965 BE, more specifically in gaming.


Performance of the 6800K is comparable to the 5800K:

http://www.anandtech.com/bench/Product/675?vs=102


----------



## mtcn77

Quote:


> Originally Posted by *lacrossewacker*
> 
> well to bring in the TDP talk, better take a look at the i7-4950hq
> 
> Half of the A10-6800k's TDP, yet....
> 
> http://images.anandtech.com/graphs/graph6993/55292.png
> http://images.anandtech.com/graphs/graph6993/55286.png
> http://images.anandtech.com/graphs/graph6993/55313.png
> http://images.anandtech.com/graphs/graph6993/55315.png
> 
> That's all within a 47 watt envelope...for laptops....
> 
> They didn't have a 6800k in that review, I was using the 5800k as my comparison...but they're basically the same.
> 
> http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/20
> 
> Just FYI
> 
> 
> 
> 
> 
> 
> 
> (oh, but the Intel CPU is expensive as balls)


Yep, I believe you.

http://techreport.com/review/24879/intel-core-i7-4770k-and-4950hq-haswell-processors-reviewed/7


----------



## lacrossewacker

Quote:


> Originally Posted by *mtcn77*
> 
> Yep, I believe you.
> 
> http://techreport.com/review/24879/intel-core-i7-4770k-and-4950hq-haswell-processors-reviewed/7


um....yeah?
http://techreport.com/r.x/core-i7-4770k/power-peak.png
http://techreport.com/r.x/core-i7-4770k/power-task-energy.png
http://techreport.com/r.x/core-i7-4770k/igp-g2-fps.png
http://techreport.com/r.x/core-i7-4770k/igp-metro-fps.png
http://techreport.com/r.x/core-i7-4770k/qtbench.png
http://techreport.com/r.x/core-i7-4770k/tc-aes.png
http://techreport.com/r.x/core-i7-4770k/7z-comp.png
http://techreport.com/r.x/core-i7-4770k/x264.png
http://techreport.com/r.x/core-i7-4770k/handbrake.png
http://techreport.com/r.x/core-i7-4770k/lux-icd.png
http://techreport.com/r.x/core-i7-4770k/lux-igp.png
http://techreport.com/r.x/core-i7-4770k/lux-icdigp.png
http://techreport.com/r.x/core-i7-4770k/cinebench.png (Cinebench 7.41 vs 3.32)
http://techreport.com/r.x/core-i7-4770k/pov-chess.png

The 4950hq goes on to win in almost EVERY SINGLE benchmark in CPU and GPU performance. (don't wanna link all of them)

But here's part of their conclusion

"With that said, Haswell's integrated graphics have made bigger strides than the CPU cores this time around. The HD 4600 IGP in the Core i7-4770K isn't quite as fast as the one in AMD's A10-5800K, but it comes perilously close to wiping out AMD's one consistent advantage in this class of chip. *And the Iris Pro graphics solution in the Core i7-4950HQ not only wipes out that advantage but threatens low-end discrete mobile GPUs, as well*."

The highlight of this thread isn't the LAPTOP 4950HQ; I'm just putting the DESKTOP A10-6800K in perspective and clarifying the false claim that AMD is the only one with this level of performance.

EDIT: Actually, looking back at the benchmarks, Intel's latest mobile CPU is faster than an FX-8350 in most benchmarks. It'd just lose once you OC the FX-8350.


----------



## Artikbot

Quote:


> Originally Posted by *lacrossewacker*
> 
> um....yeah?
> http://techreport.com/r.x/core-i7-4770k/power-peak.png
> http://techreport.com/r.x/core-i7-4770k/power-task-energy.png
> http://techreport.com/r.x/core-i7-4770k/igp-g2-fps.png
> http://techreport.com/r.x/core-i7-4770k/igp-metro-fps.png
> http://techreport.com/r.x/core-i7-4770k/qtbench.png
> http://techreport.com/r.x/core-i7-4770k/tc-aes.png
> http://techreport.com/r.x/core-i7-4770k/7z-comp.png
> http://techreport.com/r.x/core-i7-4770k/x264.png
> http://techreport.com/r.x/core-i7-4770k/handbrake.png
> http://techreport.com/r.x/core-i7-4770k/lux-icd.png
> http://techreport.com/r.x/core-i7-4770k/lux-igp.png
> http://techreport.com/r.x/core-i7-4770k/lux-icdigp.png
> http://techreport.com/r.x/core-i7-4770k/cinebench.png (Cinebench 7.41 vs 3.32)
> http://techreport.com/r.x/core-i7-4770k/pov-chess.png
> 
> The 4950hq goes on to win in almost EVERY SINGLE benchmark in CPU and GPU performance. (don't wanna link all of them)
> 
> But here's part of their conclusion
> 
> "With that said, Haswell's integrated graphics have made bigger strides than the CPU cores this time around. The HD 4600 IGP in the Core i7-4770K isn't quite a fast as the one in AMD's A10-5800K, but it comes perilously close to wiping out AMD's one consistent advantage in this class of chip. *And the Iris Pro graphics solution in the Core i7-4950HQ not only wipes out that advantage but threatens low-end discrete mobile GPUs, as well*."
> 
> The highlight of this thread isn't about the LAPTOP 4950hq, just putting the DESKTOP A10-6800k in perspective and clarifying the false claims about AMD being the only one with this level of performance.
> 
> EDIT: Actually looking back at the benchmarks, Intel's latest mobile CPU is faster than a FX 8350 in most benchmarks. It'd just lose once you OC the FX 8350.


Hell yeah, let's use $657 parts to make a comparison against a 6800K.

It's just showing how far Intel needs to go up the price scale to beat AMD's IGP overall gaming performance.


----------



## lacrossewacker

Quote:


> Originally Posted by *Artikbot*
> 
> Hell yeah, let's use 657$ parts to make a comparison against a 6800K.
> 
> It's just showing how far Intel needs to go up the price scale to beat AMD's IGP overall gaming performance.


A. It's overpriced.
B. The 6800K is not a comparable product.
C. Suggesting Intel doesn't have anything that can compete is now false. It smokes the 6800K's CPU and beats the 6800K's GPU by a good margin.
D. It's still overpriced; I acknowledged that in my first post.
E. It's still a mobile CPU versus a desktop APU.


----------



## Bbrad

Quote:


> Originally Posted by *lacrossewacker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Artikbot*
> 
> Hell yeah, let's use 657$ parts to make a comparison against a 6800K.
> 
> It's just showing how far Intel needs to go up the price scale to beat AMD's IGP overall gaming performance.
> 
> 
> 
> A. It's over priced
> B. The 6800k is not a comparable product.
> C. Suggesting Intel doesn't have anything that can compete is now false. It smokes the 6800k's CPU and beats the 6800k's GPU by a good margin.
> D. It's still over priced, I acknowledged that in my first post.
> E. Still a mobile CPU versus a desktop APU

fanboy







Intel struggles to compete with the 5800K, which it loses to, and you're comparing the 6800K.







Just leave GPUs to AMD and compute power to Intel. Your post is a fail.

Sent from my SCH-I500 using Tapatalk 2


----------



## mtcn77

Guys, AMD isn't flexing any GPU muscle on either of its APUs. You're arguing with such fury, yet IMO there's no comparing chips that idle at 5 watts (the 4600M) and at 37 watts.
AMD is working within a certain power budget, and that is soon about to change. As HSA takes off, the CPU will continue to shrink on the die while the GPU engulfs more than the 42% of the die that Richland's GPU unit does.
Please stop comparing top-of-the-line processors with low mainstream parts. It's overkill. Intel wins, so what? It's next-gen hardware; that's what it's supposed to do, replace the old (VLIW4). What's more, it won't even launch on the same platform. 37 watts idle is desktop-niche territory, and also the territory of GDDR5-enhanced GPUs. If you need a challenge, there you have it.


----------



## ihatelolcats

Is the max temp on these really 120°C? That's nuts.


----------



## Bbrad

Quote:


> Originally Posted by *ihatelolcats*
> 
> is the max temp on these really 120C? thats nuts


If you go over 65°C it's going to hurt it. 120°C is probably the blow-up temp, haha, but really, it would just go out.

Sent from my SCH-I500 using Tapatalk 2


----------



## FIRINMYLAZERMAN

Are there any websites out there (aside from CPUBoss) that have compared (either directly or indirectly compared) the A10-6800K to a Phenom II X4 965 BE?


----------



## Castaa

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> Are there any websites out there (aside from CPUBoss) that have compared (either directly or indirectly compared) the A10-6800K to a Phenom II X4 965 BE?


If you are thinking of upgrading, it'd be wiser to consider an FX-6300 or FX-8320, given your hardware sig. CPU-performance-wise, an A10-6800K is going to be around the level of an FX-4350 or FX-4300, though the latter CPUs benefit from a large L3 cache, where APUs have none. There are lots of reviews comparing FX-4000 series CPUs and Phenom II X4 CPUs.


----------



## DaveLT

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> Are there any websites out there (aside from CPUBoss) that have compared (either directly or indirectly compared) the A10-6800K to a Phenom II X4 965 BE?


Forget the 6800K; the 5800K already roflstomps the Phenom II X4.


----------



## DrBrogbo

Quote:


> Originally Posted by *Bbrad*
> 
> fanboy
> 
> 
> 
> 
> 
> 
> 
> Intel struggles to compete with the 5800K, which it loses to, and you're comparing the 6800K.
> 
> 
> 
> 
> 
> 
> 
> Just leave GPUs to AMD and compute power to Intel. Your post is a fail.
> 
> Sent from my SCH-I500 using Tapatalk 2


Did you even read his post? The i7-4950hq STOMPS all of AMD's APU offerings, even ones that use twice as much power (the desktop A10-6800K, for example).


----------



## FIRINMYLAZERMAN

What do you guys think about this build? It would be used mostly for LAN events and to take with me to places. Prices are AFTER tax and shipping and discounts, and prices are in Canadian dollars.

CPU: AMD A10-6800K 4.1GHz Quad-Core APU - $181
GPU: AMD Radeon HD 8670D IGP (as part of the A10-6800K APU)
MOBO: ASRock Extreme4-M A85X Socket FM2 Micro-ATX Motherboard - $97
RAM: 8GB (2 x 4GB) Patriot Viper 3 Mamba DDR3 2133MHz RAM Kit - $86
PSU: Seasonic SS-500ET 500W 80 PLUS Bronze Certified Power Supply - $75
Case: NZXT Vulcan Micro-ATX Black Case - $82

Total (including tax, shipping, and discounts) = $521

*** I can get an AMD A6-5400K right now for $75 after tax, which would bring the price down to $415, and I know it wouldn't be as powerful as the A10-6800K, but I would probably have to buy a 5000 series APU anyway, so that I can update the motherboard's BIOS ***
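The totals above are easy to verify; a quick sketch of the arithmetic (prices copied from the list, in CAD):

```python
# Prices from the build list above (CAD, after tax/shipping/discounts)
parts = {
    "A10-6800K APU": 181,
    "ASRock Extreme4-M A85X mobo": 97,
    "8GB Patriot Viper 3 2133MHz RAM": 86,
    "Seasonic SS-500ET 500W PSU": 75,
    "NZXT Vulcan case": 82,
}

total = sum(parts.values())
print(total)  # 521, matching the quoted total

# Swapping the A10-6800K for the $75 A6-5400K:
alt_total = total - parts["A10-6800K APU"] + 75
print(alt_total)  # 415, matching the quoted alternative
```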


----------



## DaveLT

Quote:


> Originally Posted by *DrBrogbo*
> 
> Did you even read his post? The i7-4950hq STOMPS all of AMD's APU offerings, even ones that use twice as much power (the desktop A10-6800K, for example).


Jeez. At six times the price, it better bloody do so.
We're talking the HD 4600 here, which Intel rumored to be double the HD 4000. But did it double? No. And that's the highest-grade LGA chip there is.
He must be talking about the HD 4600, because the A10-5800K still roflstomps it, much less the 6800K.


----------



## Castaa

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> What do you guys think about this build? It would be used mostly for LAN events and to take with me to places. Prices are AFTER tax and shipping and discounts, and prices are in Canadian dollars.
> 
> CPU: AMD A10-6800K 4.1GHz Quad-Core APU - $181
> GPU: AMD Radeon HD 8670D IGP (as part of the A10-6800K APU)
> MOBO: ASRock Extreme4-M A85X Socket FM2 Micro-ATX Motherboard - $97
> RAM: 8GB (2 x 4GB) Patriot Viper 3 Mamba DDR3 2133MHz RAM Kit - $86
> PSU: Seasonic SS-500ET 500W 80 PLUS Bronze Certified Power Supply - $75
> Case: NZXT Vulcan Micro-ATX Black Case - $82
> 
> Total (including tax, shipping, and discounts) = $521
> 
> *** I can get an AMD A6-5400K right now for $75 after tax, which would bring the price down to $415 ***


It all depends on what you're going to be using the box for. The A10-6800K is the end of the line for the FM2 socket; FM2+ is required for Kaveri, so your CPU upgrade path is over.

Xbit found that 2400MHz memory worked with the A10-6800K and made games run faster.


----------



## FIRINMYLAZERMAN

Quote:


> Originally Posted by *Castaa*
> 
> It all depends on what are you going to be using the box for? The A10-6800K is the end of the line for the FM2 socket. FM2+ is required for Kaveri. So you CPU upgrade path is over.
> 
> Xbit found that 2400 Mhz worked with the A10-6800K and made game run faster.


I would be playing games at 720p (1280 x 720 resolution) or lower, and I would be trying to play games at low/mixed medium settings.

Games I would be playing would be Hawken, League of Legends, Battlefield 3 (online/multiplayer) Blacklight: Retribution, and other online/multiplayer games, so I wouldn't need an overly powerful computer.

2400MHz+ RAM is really expensive, and I'm trying to keep the cost down however I can


----------



## DaveLT

Quote:


> Originally Posted by *Castaa*
> 
> It all depends on what are you going to be using the box for? The A10-6800K is the end of the line for the FM2 socket. FM2+ is required for Kaveri. So you CPU upgrade path is over.
> 
> Xbit found that 2400 Mhz worked with the A10-6800K and made game run faster.


No, Kaveri will drop into FM2 motherboards, no problem.


----------



## Milestailsprowe

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> Are there any websites out there (aside from CPUBoss) that have compared (either directly or indirectly compared) the A10-6800K to a Phenom II X4 965 BE?


http://www.anandtech.com/bench/Product/675?vs=102

Going by this, I'm gonna assume the 6800K is much better.


----------



## Himo5

TridentX 2400MHz 2x4GB for $96 at Amazon.com

This one item in the range is below £100 this side of the pond as well, and only $10 more than your Patriot kit.


----------



## FIRINMYLAZERMAN

Quote:


> Originally Posted by *Himo5*
> 
> TridentX 2400MHz 2x4GB for $96 at Amazon.com
> 
> This one item in the range is below £100 this side of the pond as well, and only $10 more than your Patriot kit.


That same kit is at least $15+ more where I live, and your price is before tax. My price is after tax (the Patriot RAM is $74 before tax)

EDIT:

I actually found that TridentX RAM for $90 plus tax, but after tax is going to be just over $100, so... $86 for 2133MHz RAM or over $100 for 2400MHz RAM... I'd rather save the $15+ if I can lol

I can understand paying extra if I were going from 1333MHz RAM to 2133MHz RAM, but not from 2133MHz RAM to 2400MHz RAM, at least, not right now anyway.


----------



## pedantic

Quote:


> Originally Posted by *Castaa*
> 
> It all depends on what are you going to be using the box for? The A10-6800K is the end of the line for the FM2 socket. FM2+ is required for Kaveri. So you CPU upgrade path is over.
> 
> Xbit found that 2400 Mhz worked with the A10-6800K and made game run faster.


I sure hope not. I have always gotten AMD but the frequency of socket upgrades is getting a little much for me.


----------



## MKHunt

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> That same kit is at least $15+ more where I live, and your price is before tax. My price is after tax (the Patriot RAM is $74 before tax)
> 
> EDIT:
> 
> I actually found that TridentX RAM for $90 plus tax, but after tax is going to be just over $100, so... $86 for 2133MHz RAM or over $100 for 2400MHz RAM... I'd rather save the $15+ if I can lol
> 
> I can understand paying extra if I were going from 1333MHz RAM to 2133MHz RAM, but not from 2133MHz RAM to 2400MHz RAM, at least, not right now anyway.


http://www.amazon.com/gp/product/B00A771ZWI/ref=oh_details_o02_s00_i00?ie=UTF8&psc=1

2400MHz for $85.


----------



## Castaa

Quote:


> Originally Posted by *DaveLT*
> 
> No, kaveri will drop into FM2 motherboards no problem


Quote:


> The difference between the sockets lies in two pins, two that are covered on the FM2 socket, but uncovered on the FM2+ socket. This means that while 'older' Trinity and Richland APUs will drop right into the FM2+ socket, it also means that the upcoming Kaveri APUs will not fit, or work in the current FM2 socket -- pins will bend.


Kaveri *will not* work with FM2 mobos.

http://www.tomshardware.com/news/Kaveri-AMD-FM2-FM2b-Socket,22949.html

Trinity/Richland will work with newer FM2+


----------



## Castaa

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> I would be playing games at 720p (1280 x 720 resolution) or lower, and I would be trying to play games at low/mixed medium settings.
> 
> Games I would be playing would be Hawken, League of Legends, Battlefield 3 (online/multiplayer) Blacklight: Retribution, and other online/multiplayer games, so I wouldn't need an overly powerful computer.
> 
> 2400MHz+ RAM is really expensive, and I'm trying to keep the cost down however I can


Ya, that's true. Go with the 2133 memory then and an A10-6800K, and get a decent cooler to OC it.

I'm a bit confused, because your sig rig is already a gaming box.


----------



## agrims

If AMD follows the same path of AM3/+, we will be able to use Kaveri in select FM2 mobos. Right now it looks bleak, but it may work. If not, I will be upgrading my mobo again! At least the next upgrade won't be as steep due to all my parts being upgraded right now.


----------



## FIRINMYLAZERMAN

Quote:


> Originally Posted by *Castaa*
> 
> Ya that's true, go with the 2133 memory then and a A10-6800K and get a decent cooler to OC it.
> 
> I'm a bit confused because you sig rig is already a gaming box.


I'm just throwing possible ideas out there for a second (more portable) gaming PC that I could take with me, since my main PC isn't portable by any means.









As far as my main PC, I do plan on upgrading, but I won't have the money for what I want for a while







I'm planning on upgrading my main PC's CPU, RAM and motherboard to an FX-8350 CPU, 8GB of Kingston HyperX Beast 2400MHz RAM and an ASUS TUF SABERTOOTH 990FX/GEN3 R2.0 motherboard


----------



## spatulator

Quote:


> Kaveri will not work with FM2 mobos.
> 
> http://www.tomshardware.com/news/Kaveri-AMD-FM2-FM2b-Socket,22949.html
> 
> Trinity/Richland will work with newer FM2+


It's important to keep pushing the envelope, and I suppose it's necessary given the memory limitations of the current mobo architecture. But I would have liked to be able to drop a new Kaveri CPU into my FM2 mobo. I like AMD and all, but now I'm feeling the frustration I heard from Llano owners complaining about the end of socket life for FM1.
These "value" APUs are geared toward budget-minded enthusiasts, but I think consumers who got on the bandwagon for a Llano or Trinity CPU might feel that buying a new mobo for each iteration of new APU technology is asking too much for a budget setup.
For both the first- and second-gen APUs, AMD promised much longer socket life than it delivered.


----------



## Bbrad

Quote:


> Originally Posted by *spatulator*
> 
> Quote:
> 
> 
> 
> Kaveri will not work with FM2 mobos.
> 
> http://www.tomshardware.com/news/Kaveri-AMD-FM2-FM2b-Socket,22949.html
> 
> Trinity/Richland will work with newer FM2+
> 
> 
> 
> It's important to keep pushing the envelope and I suppose it is necessary with the memory limitation of the current mobo architecture. But I would have liked to be able to drop in a new kaveri cpu in my fm2 mobo. I like AMD and all, but now I'm feeling the frustration that I heard when owners of Llano were complaining about end of socket life for fm1.
> These "value" APUs are geared toward budget minded enthusiasts, but I think that consumers who got on the bandwagon for a Llano or Trinity cpu might feel like buying a new mobo for each iteration of new apu technology is asking too much for a budget setup.
> For both these 1st and second gen APUs, AMD was promising much longer socket life than delivered.

First, that's a rumor. Second, AMD has already said they intend to keep FM2 till Steamroller.

Sent from my SCH-I500 using Tapatalk 2


----------



## spatulator

I think they said they would keep the socket till Excavator, or whatever that next one is called, but I don't expect that to happen. Yes, it's a rumor at this point, but it makes a lot of sense... unlike rumors of Richland having a 20-40% faster GPU than Trinity. That rumor made no sense.


----------



## azanimefan

Quote:


> Originally Posted by *spatulator*
> 
> I think they said they would keep the socket till excavator or whatever that next one is called but I don't expect that to happen. Yes it's a rumor at this point but it makes alot of sense..*.unlike rumors of Richland being 20-40% faster GPU than trinity*. That rumor made no sense.


Yet it is... when overclocked and paired with DDR3-2400 RAM. Since Richland overclocks better, people are pulling 25%+ better performance out of it. It's actually matching a 6670 now, which was 30% faster than the A10-5800K.


----------



## MKHunt

Speaking of which, I got a package today


----------



## Bbrad

Quote:


> Originally Posted by *MKHunt*
> 
> Speaking of which, I got a package today


Mind uploading some gameplay videos? Hope you don't mind me asking; I would be grateful to see just how the GPU does.

Sent from my SCH-I500 using Tapatalk 2


----------



## MKHunt

I can try my durndest. Any specific titles and settings? I only have access to my 1440p screens until June 23ish, then I'll be able to test 1080p.


----------



## Bbrad

Quote:


> Originally Posted by *MKHunt*
> 
> I can try my durndest. Any specific titles and settings? I only have access to my 1440p screens until June 23ish, then I'll be able to test 1080p.


How about Battlefield 3 on low at 1440p? If it can run that on low at decent frames, I should be able to run it at medium at 1080p.







thanks again.

Sent from my SCH-I500 using Tapatalk 2


----------



## Wumbologist

http://www.scan.co.uk/products/amd-athlon-x4-760k-black-edition-unlocked-richland-quad-core-s-fm2-38ghz-4mb-cache-100w-retail

Athlon 760K


----------



## azanimefan

At $90 that's a pretty good deal. We might have a chip that will dethrone the PhII X4 965 as the sub-$100 chip of choice. I mean, all things equal, the A10-5800K was basically identical to the PhII in compute tasks.

The A10-6800K overclocks much better than the 5800K... so the Athlon X4 760K might be a winner.


----------



## Artikbot

Quote:


> Originally Posted by *azanimefan*
> 
> at $90 bucks that's a pretty good deal. We might have a chip that will dethrone the PhIIx4 965 as the sub $100 chip of choice. I mean all things equal the a10-5800k was basically identical to the PhII in compute tasks.
> 
> the A10-6800k overclocks much better then the 5800k... so the athlon ii x4 760k might be a winner.


It stomps the PhII X4 in gaming


----------



## spatulator

Quote:


> Quote:
> Originally Posted by spatulator
> 
> I think they said they would keep the socket till excavator or whatever that next one is called but I don't expect that to happen. Yes it's a rumor at this point but it makes alot of sense...unlike rumors of Richland being 20-40% faster GPU than trinity. That rumor made no sense.
> 
> Originally Posted by azanimefan
> yet it is... when overclocked and paired with ddr3 2400 ram... since richland overclocks better... they're pulling 25%+ better performance out of it... now it's actually matching a 6670... which was 30% faster then the a10-5800k


The 6800K is 5% faster out of the box than Trinity. The 5800K is OC-able to about 4.2GHz on the CPU side and 1GHz on the GPU side, YMMV. The best thing Richland has going for it performance-wise has to be the 2400MHz RAM support, which is cool, but pricey too. (I got 2100 sticks, 8GB for $35, when I got the 5800K.)

So yeah, let's see how this thing can overclock... could be great... 5GHz would be awesome!

Understand that when we refer to the company claim of 20-40% faster, we are talking about stock setup, not OC capability.


----------



## Wumbologist

I would say the Athlon is better than the Phenom because of the motherboard offerings. There are no AM3 ITX or decent micro-ATX motherboards compared to FM2.


----------



## Bbrad

Quote:


> Originally Posted by *Wumbologist*
> 
> I would say the Athlon is better than the phenom because of the motherboard offerings. There are no am3 itx or decent micro atx motherboards compared to fm2


Phenoms use Socket AM3+...

Sent from my SCH-I500 using Tapatalk 2


----------



## EliteReplay

Quote:


> Originally Posted by *MKHunt*
> 
> I can try my durndest. Any specific titles and settings? I only have access to my 1440p screens until June 23ish, then I'll be able to test 1080p.


What about if you scale the screen res to 1080p? Just do BF3 multiplayer, at least Noshahr Canals, Metro, Seine Crossing,
Crysis 2, StarCraft 2, any other game that is not from 2013...


----------



## EliteReplay

Well, I just found this video on YouTube... BF3Gameplay


----------



## EliteReplay

Another VideoGameplay with A10 6800k


----------



## lowfat

Grabbed an A8-6600K for my HTPC. Got it running at 4.7GHz on an FM2A75M-ITX. Anything higher requires quite a bit of voltage.


----------



## Opcode

Being officially sponsored by AMD, my A10-6800k unit should be coming in the next couple of weeks. I will be able to post benchmarks, and review this APU in wider detail at that time.


----------



## DaveLT

Quote:


> Originally Posted by *lowfat*
> 
> Grabbed an a8-6600k for my HTPC. Got it running at 4.7GHz on a FM2A75M-ITX. Anything higher requires quit a bit of voltage.


That is quite a clock


----------



## Stormscion

Quote:


> Originally Posted by *DaveLT*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lowfat*
> 
> Grabbed an a8-6600k for my HTPC. Got it running at 4.7GHz on a FM2A75M-ITX. Anything higher requires quit a bit of voltage.
> 
> 
> 
> That is quite a clock
Click to expand...

Nice clocks


----------



## Bbrad

Quote:


> Originally Posted by *Opcode*
> 
> Being officially sponsored by AMD, my A10-6800k unit should be coming in the next couple of weeks. I will be able to post benchmarks, and review this APU in wider detail at that time.


How did you get sponsored? lol

Sent from my SCH-I500 using Tapatalk 2


----------



## Wumbologist

Quote:


> Originally Posted by *Bbrad*
> 
> phenoms use socket am3+...
> 
> Sent from my SCH-I500 using Tapatalk 2


Phenom is AM3.

And there are still no AM3+ ITX motherboards or decent micro-ATX ones, so I don't know what you are trying to say.


----------



## azanimefan

Quote:


> Originally Posted by *Wumbologist*
> 
> Phenom is AM3


yes, but they work fine with AM3+


----------



## MKHunt

My SFF HTPC almost didn't come together. I don't trust the Richland heatsinks to be adequate, but I also couldn't find an SFF heatsink that had acceptable numbers on a 100W CPU. So I went the Silverstone route.





There was a wee bit of anxiety about whether or not it would fit...


----------



## pedantic

Quote:


> Originally Posted by *MKHunt*
> 
> I experienced my SFF HTPC almost not coming together. I don't trust the RIchland heatsinkls to be adequate. But I also couldn't find a SFF heatsink that had acceptable numbers on a 100W CPU. So I went the way of SIlverstone.
> 
> 
> 
> 
> 
> There was a wee bit of anxiety about whether or not it would fit...


Looks good. I have an HTPC case as well and couldn't find a cooler that fit... so I'm using stock.


----------



## FIRINMYLAZERMAN

Quote:


> Originally Posted by *MKHunt*
> 
> I can try my durndest. Any specific titles and settings? I only have access to my 1440p screens until June 23ish, then I'll be able to test 1080p.


I would like to see how well Skyrim, NFS: Most Wanted 2012, Battlefield 3, Tomb Raider 2013, and Crysis 3 run at 720p (1280 x 720) at mixed medium/high settings on the A10-6800K at stock 4.1GHz with just the Radeon HD 8670D IGP


----------



## MKHunt

I can't do BF3 multiplayer or NFS. And if the W8 download would budge from "6 hours remaining", that would be nice too.


----------



## FIRINMYLAZERMAN

Quote:


> Originally Posted by *MKHunt*
> 
> I can't do BF3 multiplayer nor NFS. And if the W8 download would budge from "6 hours remaining" that would be nice too


Hey, no problem! Yeah, NFS: MW isn't a hard game to run anyway, so that's okay. Wow, 6 hours to download Windows 8... damn, that sucks


----------



## Opcode

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> I would like to see how well Skyrim, NFS: Most Wanted 2012, Battlefield 3, Tomb Raider 2013, and Crysis 3 run at 720p (1280 x 720) at mixed medium/high settings on the A10-6800K at stock 4.1GHz with just the Radeon HD 8670D IGP


I can benchmark those games for you once my sponsored 6800K unit arrives.


----------



## DaveLT

Quote:


> Originally Posted by *MKHunt*
> 
> I experienced my SFF HTPC almost not coming together. I don't trust the RIchland heatsinkls to be adequate. But I also couldn't find a SFF heatsink that had acceptable numbers on a 100W CPU. So I went the way of SIlverstone.


There are heatsinks under 80mm tall that will dissipate 100W fine








This one is actually underrated: http://www.amazon.com/Logisys-Corp-AC4400BT-Beta-Cooling/dp/B007RWXCQS/ref=sr_1_6?s=pc&ie=UTF8&qid=1371013932&sr=1-6&keywords=logisys
Your case is 80mm+ tall, right?
Stock cooling is actually good enough; you'd actually be surprised how cool Trinity procs run.


----------



## MKHunt

The case is 87.5mm internal. When putting it together, I could hear light scratching as the HS and case made contact. Maybe I should add some TIM and try to make the case an extension of the heatsink.









Mostly I'm thrilled that I can continue my legacy of fitting oversized coolers into uATX cases.


----------



## Mark the Bold

I ordered the parts and I think I done goofed. I ordered a G.Skill 2133MHz kit for the A10-6800K. However, it's a 1.6V kit, and I read that the chip will only support 1.5V kits at 2133MHz natively.

Have any of you had experience with this? I'm wondering if I should just return the G.Skill 1.6V kit and find an officially supported 1.5V memory kit.

Appreciate the feedback! Any kits you recommend? Will be running this stock...


----------



## FIRINMYLAZERMAN

What's the maximum safe voltage and temperature that the A10-6800K can go up to without damaging the APU and affecting the longevity?


----------



## Artikbot

74ºC for 24/7 usage is AMD's thermal limit. So, say 71-72ºC.

Good to see they have raised this ceiling from the 71.3ºC Trinity had!


----------



## Opcode

Quote:


> Originally Posted by *Mark the Bold*
> 
> I ordered the parts and I think I done goofed. I ordered some g skill 2133hz kits for the a10-6800k. However these are a 1.6v kit, and I read that the chip will only support 1.5v kits at 2133hz natively.
> 
> Any of you had any experience with this? I'm wondering if I should just return the G.skill 1.6v kit and find an officially supported 1.5v memory kit.
> 
> Appreciate the feedback! Any kits you recommend. Will be running this stock.....


Memory voltage has nothing to do with the APU; memory voltage is controlled entirely by the motherboard. This day and age all motherboards support DRAM up to 1.65V, and you can even go higher than that if your board has DRAM voltage control in the BIOS. Just pop them into your motherboard and they should boot right up, though keep in mind most A85X chipset boards don't run 2133 natively without overclocking. So you're going to be "overclocking" them either way, even though it's the board doing the OC and not the DRAM. I wouldn't worry too much about it; at worst you will need to set a profile so they run at their rated speeds. It's not really a big issue and nothing I would be concerned about.


----------



## DaveLT

Quote:


> Originally Posted by *Mark the Bold*
> 
> I ordered the parts and I think I done goofed. I ordered some g skill 2133hz kits for the a10-6800k. However these are a 1.6v kit, and I read that the chip will only support 1.5v kits at 2133hz natively.
> 
> Any of you had any experience with this? I'm wondering if I should just return the G.skill 1.6v kit and find an officially supported 1.5v memory kit.
> 
> Appreciate the feedback! Any kits you recommend. Will be running this stock.....


Nah. That's recommended. Nobody said you can't run a 2133 kit @ 1.65V on Trinity; Trinity's IMC is just like Vishera's. Now that they've strengthened the IMC on Richland, you can run 2133 flawlessly and 2400 is no problem.
Back on Trinity it's one step lower.
Unless it's... Sandy Bridge, where anything over 1.5V will degrade the IMC...


----------



## FIRINMYLAZERMAN

Quote:


> Originally Posted by *Artikbot*
> 
> 74ºC for 24/7 usage is AMD's thermal limit. So, say 71-72ºC.
> 
> Good to see they have risen this edge from the 71.3ºC Trinity had!


What's the maximum safe operating voltage for the A10-6800K without damaging the APU?


----------



## Bbrad

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Artikbot*
> 
> 74ºC for 24/7 usage is AMD's thermal limit. So, say 71-72ºC.
> 
> Good to see they have risen this edge from the 71.3ºC Trinity had!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's the maximum safe operating voltage for the A10-6800K without damaging the APU?
Click to expand...

I say around 1.45V; some push it to 1.5V, but I wouldn't. And please, god, do not listen to that guy you quoted. If you keep your CPU near 70°C 24/7 you're going to ruin it. Keep it at 60°C, though it's safe to rise to 65°C sometimes.

Sent from my SCH-I500 using Tapatalk 2


----------



## spatulator

My 2 cents: I have my 2133 sticks at 1.6V with the 5800K, and it's been that way for 6+ months. That is the manufacturer's limit, but I think you can push it a little further. When I was trying to max them out, I think I had them running at 1.63V for a few days without issues.


----------



## FIRINMYLAZERMAN

Quote:


> Originally Posted by *Bbrad*
> 
> I say around 1.45v some push it at 1.5v but I wouldn't and please god do not listen to that guy you quoted if you get your CPU near 70°c for 24/7 your going to ruin it keep it at 60°c but its safe to rise to 65°c sometimes.
> 
> Sent from my SCH-I500 using Tapatalk 2


Yeah, I kind of thought 1.45V would be the safe limit.

How realistic do you think it would be to get the A10-6800K running at a stable 5GHz on air at 1.45V or under while still maintaining reasonable temperatures? I'm thinking about building a mini-ITX or micro-ATX system with the A10-6800K as my CPU + GPU, so that's why I'm asking.









----------



## Bbrad

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bbrad*
> 
> I say around 1.45v some push it at 1.5v but I wouldn't and please god do not listen to that guy you quoted if you get your CPU near 70°c for 24/7 your going to ruin it keep it at 60°c but its safe to rise to 65°c sometimes.
> 
> Sent from my SCH-I500 using Tapatalk 2
> 
> 
> 
> Yeah, I kind-of thought 1.45v would be the safe limit.
> 
> How realistic do you think it would be to get the A10-6800K running at a stable 5GHz on air at 1.45v or under while still maintaining reasonable temperatures? *** I'm thinking about building a mini-ITX or micro-ATX system with the A10-6800K being my CPU + GPU, so that's why I'm asking
> 
> 
> 
> 
> 
> 
> 
> ***
Click to expand...

I'm actually currently building a micro-ITX build with an H80i water cooler. I think 5GHz is pretty crazy on air, but there are some really awesome air coolers that could probably lift you there with good temps; the voltage would probably be above 1.45V at 5.0GHz unless you're lucky.

Sent from my SCH-I500 using Tapatalk 2


----------



## FIRINMYLAZERMAN

Quote:


> Originally Posted by *Bbrad*
> 
> I'm actually currently building a micro it build with a H80i watercooler I think that's pretty crazy on air but there's some really awesome air coolers that can probably lift you there with good temps but the voltage would be above 1.45 at 5.0ghz probably unless your lucky.
> 
> Sent from my SCH-I500 using Tapatalk 2


Yeah, I would probably need a really good motherboard and a Noctua CPU cooler to achieve a stable 5GHz clock speed on air.

Realistically, how far do you think I could overclock the A10-6800K at 1.45V with reasonable temperatures, a good air cooler and motherboard, while staying stable?


----------



## DaveLT

Quote:


> Originally Posted by *Bbrad*
> 
> I say around 1.45v some push it at 1.5v but I wouldn't and please god do not listen to that guy you quoted if you get your CPU near 70°c for 24/7 your going to ruin it keep it at 60°c but its safe to rise to 65°c sometimes.
> 
> Sent from my SCH-I500 using Tapatalk 2


This is not like Vishera or Zambezi.
This is TRINITY AND RICHLAND! They have upped the max thermal junction that is perfectly safe to run 24/7/365 to 74°C.


----------



## Bbrad

Quote:


> Originally Posted by *DaveLT*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bbrad*
> 
> I say around 1.45v some push it at 1.5v but I wouldn't and please god do not listen to that guy you quoted if you get your CPU near 70°c for 24/7 your going to ruin it keep it at 60°c but its safe to rise to 65°c sometimes.
> 
> Sent from my SCH-I500 using Tapatalk 2
> 
> 
> 
> This is not like Vishera or Zambezi.
> This is TRINITY AND RICHLAND! In which they have upped the safe max thermal junction that is perfectly safe to run 24/7 365 to 74C
Click to expand...

Dude, it's already been said many times not to go over 65°C at the most for daily usage...

Jeez, dude, yeah, that's the max. Do you know what max is? The temp before frying the chip.

Sent from my SCH-I500 using Tapatalk 2


----------



## Mark the Bold

Quote:


> Originally Posted by *Opcode*
> 
> Memory voltage has nothing to do with the APU, memory voltages is controlled entirely by the motherboard. This day in age all motherboards support dram up to 1.65v, you can even go higher than that if your board has dram voltage control in the bios. Just pop them into your motherboard and they should boot right up, tho keep in mind most A85X chipset boards don't support 2133 native without overclocking. So you're going to be "overclocking" them either way, even tho its the board doing the OC and not the dram. I wouldn't worry too much about it, you will need to set a profile so they run at their rated speeds at best. It's not really a big issue and nothing I would be concerned about.


Well, thanks guys, I feel a little better. I have stuck with Intel for the last several builds, so I always thought, like you said above, that memory voltage is controlled by the MB, not the chip.

It was this review here that made me have second thoughts: http://www.legitreviews.com/article/2209/14/

Specifically:
Quote:


> We contacted AMD about our performance numbers and they said that we did not see huge performance gains since we ran the old AMD A10-5800K APU with 2133MHz memory and not 1866MHz, which is the highest memory clock speeds officially supports on it. We ran both the 5800K and 6800K at 2133MHz to keep the test systems as close as possible and hoped to run it on the A10-6700. Hoped is the key word as we were unable to get the A10-6700 stable with the Corsair Dominator Platinum 2133MHz CL9 1.65V memory kit. AMD specifies that the A10-6800K officially supports 2133MHz memory with 1.50V, so maybe our kit requires too much voltage. *Just a heads up to anyone looking to run 2133MHz memory with a Richland APU. Our experience with Richland is that you should run 2133MHz memory to get the most performance out of it, but the A10-6800K seems more receptive to it and we suggest a 1.50V kit*.


If it matters, I got this kit here:

http://www.newegg.com/Product/Product.aspx?Item=N82E16820231655

Appreciate the feedback.


----------



## MKHunt

Quote:


> Originally Posted by *Mark the Bold*
> 
> Well thanks guys. I feel a little better. I have stuck with Intel for the last several builds, so I always thought like you said above: the memory voltage was controlled by the MB not the chip.
> 
> It was this review here that made me have second thoughts: http://www.legitreviews.com/article/2209/14/
> 
> Specifically:
> If it matters, I got this kit here:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16820231655
> 
> Appreciate the feedback.


The 2400 kit I got for my A10 is 1.65V and I'm not worried. Any review site will recommend 1.5V because that is the manufacturer's recommended spec. It's like how on Ivy any chip with a decent cooler can run at 1.35V at 4GHz+ with minimal to no degradation, but no review site will ever recommend that people run 1.35V.









Sandy Bridge was the only recent CPU where the IMC was so sensitive that exceeding the spec actually resulted in rapid degradation. The only other semi-recent proc I can think of with that limitation is the Phenom II 955BE C2 stepping, but that was fixed with the C3.


----------



## DaveLT

Quote:


> Originally Posted by *Bbrad*
> 
> dude its already been said many times not to go over 65°c at the most for daily usage
> 
> 
> 
> 
> 
> 
> 
> jeez dude yea that's the max do you know what max is?the temp before frying the chip.
> 
> Sent from my SCH-I500 using Tapatalk 2



74°C IS THE MAX recommended for Richland to run 24/7.


----------



## Bbrad

Quote:


> Originally Posted by *DaveLT*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bbrad*
> 
> dude its already been said many times not to go over 65°c at the most for daily usage
> 
> 
> 
> 
> 
> 
> 
> jeez dude yea that's the max do you know what max is?the temp before frying the chip.
> 
> Sent from my SCH-I500 using Tapatalk 2
> 
> 
> 
> 
> 74C IS THE MAX recommended for Richland to run 24/7
Click to expand...

I will admit to defeat and apologize if you show me proof

Sent from my SCH-I500 using Tapatalk 2


----------



## MKHunt

Quote:


> Originally Posted by *Bbrad*
> 
> I will admit to defeat and apologize if you show me proof
> 
> Sent from my SCH-I500 using Tapatalk 2


Same architecture, same processes, etc as the 5800k

http://www.cpu-world.com/CPUs/Bulldozer/AMD-A10-Series%20A10-5800K.html

http://www.cpu-world.com/CPUs/Bulldozer/AMD-A10-Series%20A10-6800K.html


----------



## Bbrad

Quote:


> Originally Posted by *MKHunt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bbrad*
> 
> I will admit to defeat and apologize if you show me proof
> 
> Sent from my SCH-I500 using Tapatalk 2
> 
> 
> 
> Same architecture, same processes, etc as the 5800k
> 
> http://www.cpu-world.com/CPUs/Bulldozer/AMD-A10-Series%20A10-5800K.html
> 
> http://www.cpu-world.com/CPUs/Bulldozer/AMD-A10-Series%20A10-6800K.html
Click to expand...

??? That doesn't answer our question.

Sent from my SCH-I500 using Tapatalk 2


----------



## spatulator

Please do not apologize, just use Google.


----------



## dave12

I saw the sign outside Microcenter that said they had these in stock, and I have been thinking about getting one to replace my 1090T/HD5870 box that has been relegated to media PC status. I intend to buy one, but I was wondering about noise with the stock HS.

Assuming I have no plan on overclocking, will the stock cooler keep these chips cool silently while pushing at least one screen playing 1080p plus a desktop screen, or do I need to pick up an aftermarket cooler? Quiet means I probably wouldn't notice the HS fan over normal video playback volume. Appreciate any feedback from current owners.


----------



## FIRINMYLAZERMAN

Does anyone on here have any experience overclocking an A10-5800K (not 6800K) on an ASRock FM2A85X Extreme4-M micro-ATX motherboard with an air CPU cooler? If so, how much of a stable safe overclock were you able to achieve, and what air CPU cooler did you use?


----------



## Opcode

Quote:


> Originally Posted by *dave12*
> 
> I saw the sign outside Microcenter that said they had these in stock and I have been thinking about getting one to replace my 1090t/HD5870 box that has been relegated to media PC status. I intend to buy one, but I was wondering about noise with stock HS.
> 
> Assuming I have no plan on overclocking, will the stock processor keep these guys cool silently pushing at least one screen playing 1080p and a desktop screen or do I need to pick up an aftermarket cooler? Quiet means that I probably wouldn't notice the HS fan over normal video playback volume. Appreciate any feedback from current owners.


The stock heatsink is probably fairly noisy; it's the same heatsink that shipped with the Athlon IIs. So I guess AMD wasn't big on changing the heatsink design, though I do wonder if it features the new "silent" fan that shipped with their FX-6300. I'll be able to tell you how noisy it is once my sponsored AMD unit arrives.


----------



## Mark the Bold

Man, I'm getting crazy high temps on this A10-6800K thing without overclocking.

Like 75-80°C under Prime95 after like 10 seconds. And around 50°C at idle.

My cooler is a CM Hyper 212. Not the best, but it certainly should perform better than this...

What kind of temps are you peeps getting? Doesn't seem right. The install seems spot on; reseated it twice.

Some people were saying that Core Temp doesn't jibe with AMD sensors, but Real Temp + HWMonitor say the same thing...

I've always heard AMDs ran hot, but damn.


----------



## DaveLT

AMD = hot? Those misconceptions still exist today?
Check your mounting and everything. Case airflow as well.


----------



## Mark the Bold

Sorry man, I'm no AMD hater. In fact, for $150 plus a free SimCity game, I can't think of a better deal for a 4.4GHz chip and a decent Radeon 6670-equivalent graphics card all in one.

I installed the Gigabyte MB utility and my CPU temps are just splendid now. Never went over 40°C during 30 minutes of Prime95.

Go AMD! Great little chip this Richland.....









FYI: Real Temp, Core Temp, Everest and HWMonitor are all taking the piss with these new Richland chips.


----------



## Wumbologist

I have had trouble getting correct readings from my 750K too.

I get wild readings from Core Temp, Real Temp, HWMonitor and even AMD OverDrive (stuff like 0°C to 107°C). I'm certain OCCT, SpeedFan and ASRock AXTU are correct, as they never give erratic readings and are always within a degree of each other.
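For anyone who wants a sanity check independent of those tools on Windows: the ACPI thermal zone can be read over WMI via the `MSAcpi_ThermalZoneTemperature` class in the `root/wmi` namespace, and the raw value is in tenths of Kelvin. A minimal conversion sketch (note: not every board actually populates that WMI class, so treat this as a cross-check, not a guarantee):

```python
def acpi_raw_to_celsius(raw_tenths_kelvin: float) -> float:
    """Convert MSAcpi_ThermalZoneTemperature.CurrentTemperature
    (reported in tenths of Kelvin) to degrees Celsius."""
    return raw_tenths_kelvin / 10.0 - 273.15

# Example: a raw reading of 3331 corresponds to 59.95 degrees C
```

If two of the software readouts disagree, whichever one matches the ACPI value is the one to trust.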


----------



## Mark the Bold

I was nervous too, because the law of averages would say that if five independent tools say one thing and only one says another, chances are the one is wrong.

But sticking my finger under the heatsink in there (scientific, ain't I?) shows that there is no way this chip is at 80°C. No way.


----------



## shark77

Has anyone tested the A10-6800K with a dedicated GPU in SC2? I would like to see how this APU handles it.


----------



## Opcode

Quote:


> Originally Posted by *Mark the Bold*
> 
> Sorry man. I'm no AMD hater. In fact, for $150 PLUS a free sim city game I can't think of a better deal for a 4.4 Ghz chip and a decent equivalent Radeon 6670 graphics card all in one.
> 
> I installed the Gigabyte MB utility and my CPU temps are just splendid now. Never went over 40 C during 30 minutes of Prime 95.
> 
> Go AMD! Great little chip this Richland.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FYI: Real Temp, Core Temp, Everest and HW Monitor are all taking the piss with these new Richland chips.


Was about to say, they probably haven't updated any of these tools to read the sensors in Richland properly. I would go by official motherboard software/BIOS readings, as Core Temp and others were known to throw reading problems before (e.g. 0°C).
Quote:


> Originally Posted by *shark77*
> 
> anyone have tested a10 6800k with a dedicated gpu,in sc2 ?I would like to see how this apu handles it.


I imagine you wouldn't need a discrete GPU to play this game near maxed out. It's a top-down RTS, so the graphics end of the game isn't going to be that demanding. RTS games are always CPU-bound because of the sheer number of units and the AI involved for those units.

The video below is the game running on an overclocked A10-5800K at 1080p, core @ 4GHz and GPU @ 800MHz, which is slightly slower than what the A10-6800K offers at stock clocks. So you can expect similar performance with the A10-6800K at stock. If you add a discrete card I am sure the frame rate will go up, though it should be playable on medium settings on the APU all by itself. Also note the memory used in this video is 2400MHz DDR3.


----------



## Wumbologist

If you are going to use a dedicated GPU, you'd be better off getting the 6800K-equivalent CPU, the Athlon 760K, and saving yourself £50-60.









My 750K does great with my 7850 and can max everything; however, I am using a 17-inch monitor.


----------



## azanimefan

Quote:


> Originally Posted by *Mark the Bold*
> 
> Man. I'm getting crazy high temps on this A10-6800k thing without overclocking.
> 
> Like 75-80 C under prime95 after like 10 seconds. And around 50C at idle.
> 
> My cooler is a CM Hyper 212. Not the best, but certainly should perform better than this....


Either the CPU is reporting its temps wrong or you put your Hyper 212 EVO on wrong. Please find out which by taking the CPU cooler off, cleaning both the bottom of the cooler and the top of the chip, reapplying the thermal paste, and reseating the cooler.

By all reports Richland runs cooler than Trinity...


----------



## majorleague

For real Richland performance video check this out!!
Lots of other videos too.
I think the guy is working flat out.
Crossfire videos have started appearing too.


https://www.youtube.com/watch?v=bQq50l3bnLs


----------



## majorleague

Quote:


> Originally Posted by *azanimefan*
> 
> either the cpu is reporting it's temps wrong or you put your hyperevo 212 on wrong. Please find out which by taking the cpu cooler off... clean both the bottom of the cooler and the top of the chip, reapply the thermal paste, and reseat the cpu cooler.
> 
> by all reports richland runs cooler then trinity...


I have that cooler and I must say, Richland is so much better with heat. I had to watch my step with Trinity, as it could overheat fairly easily.
With Richland this has not happened once yet.


----------



## sanket779292

I have 4GB of 1600MHz Vengeance RAM. Will this be good enough for Richland? I'm confused between this and an i3.


----------



## agrims

You would be starving it of memory speed; APUs love speed. You could probably OC it to 1866, but it could prove really hard to push any higher. Also, you would want 8GB minimum so you don't run out.
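To put rough numbers on why APUs love memory speed: the iGPU shares the two 64-bit DDR3 channels with the CPU, so peak bandwidth is just transfer rate × 8 bytes per transfer × channels. A back-of-envelope sketch (theoretical peaks only; real-world throughput is lower):

```python
def ddr3_peak_bandwidth_gbs(transfer_rate_mts: int, channels: int = 2) -> float:
    """Theoretical peak bandwidth in GB/s.

    transfer_rate_mts: e.g. 1600 for DDR3-1600 (mega-transfers/sec)
    Each 64-bit channel moves 8 bytes per transfer.
    """
    return transfer_rate_mts * 8 * channels / 1000.0

# Dual-channel DDR3-1600 -> 25.6 GB/s peak; DDR3-2133 -> ~34.1 GB/s
```

That jump from 1600 to 2133 is roughly a third more bandwidth for the iGPU to draw on, which is why reviews keep seeing real framerate gains from faster kits.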


----------



## sanket779292

Thanks, I'll try to get 8GB, but can I OC that RAM to 1866 without a voltage bump on an MSI A75MA-P33 mobo?


----------



## Artikbot

Quote:


> Originally Posted by *Bbrad*
> 
> I say around 1.45v some push it at 1.5v but I wouldn't and please god do not listen to that guy you quoted if you get your CPU near 70°c for 24/7 your going to ruin it keep it at 60°c but its safe to rise to 65°c sometimes.
> 
> Sent from my SCH-I500 using Tapatalk 2












Do you know how ridiculous you sound?

http://www.cpu-world.com/CPUs/Bulldozer/AMD-A10-Series%20A10-6800K.html

Now go back to whatever cave you came from.


----------



## DaveLT

Quote:


> Originally Posted by *sanket779292*
> 
> Thanks, i'll try to get 8 but can i oc that ram to 1866 without voltage bump on msi a75 ma p33 mobo


No. Most Corsair Vengeance kits in the new batches are binned so tightly it isn't enthusiast RAM... it just isn't.


----------



## agrims

+1 on the Vengeance RAM. It is capable of stepping up to the next speed grade most of the time, but the kits are binned quite tight and they are finicky when it comes to OCing. Try and see what you can get out of them, or return them for something else. If I were running an APU, I personally wouldn't settle for less than 2133 and would probably just go with 2400.


----------



## sanket779292

Even if it does 1866 at stock voltage, it will be fine for me.


----------



## Jim888

How will the 6800K do for video rendering/editing? I've got a friend who's looking to build a budget machine but wants to use Lightworks and Sony Vegas.


----------



## Opcode

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> What's the maximum safe operating voltage for the A10-6800K without damaging the APU?


Well, according to CPU-World, Trinity ran between 0.825V and 1.475V, so my guess is up to 1.45V would be fine for Richland as it uses the exact same CPU cores. The A10-5800K runs at 1.45V out of the box when in turbo mode. You should hit 4.6 GHz on the 6800K before reaching 1.433V, so my guess is the 4.7 GHz range around 1.45V would be the best-case scenario for 24/7 use. Though the A10-6800K may handle more volts than that, I personally wouldn't feel comfortable running anything above 1.5V. I'm still awaiting my 6800K, so I haven't gotten to play with one yet.
Quote:


> Originally Posted by *Jim888*
> 
> how will the 6800k Do for Video Rendering/editing? I've got a friend who's looking to build a budget machine but wants to use lightworks and sony vegas


Any software that's GPU accelerated will perform better on an APU than on a similar-budget i3. Both Lightworks and Sony Vegas are GPU accelerated, so he should expect better real-time effects in Lightworks etc. Not to mention the four cores should make a big difference when it comes time to encode or convert videos, though some of those processes are GPU accelerated these days as well.


----------



## DaveLT

Quote:


> Originally Posted by *sanket779292*
> 
> Even If it does 1866 at stock voltage it will be fine for me


Problem is, it won't even do 1866 no matter how much voltage you feed it, at least for the new chips.


----------



## Himo5

Quote:


> Originally Posted by *Opcode*
> 
> Well according to CPU World Trinity ran between 0.825V - 1.475V, so my guess is up to 1.45V would be fine for Richland as it uses the same exact same CPU cores. The A10-5800k runs at 1.45V out of the box when in turbo mode. So you should hit 4.6 GHz on the 6800k before hitting 1.433V, so my guess is 4.7 GHz range around 1.45V would be best case scenario for 24/7. Tho the A10-6800k may handle more volts than that, I personally wouldn't feel comfortable running it on anything above 1.5V myself. I still await my 6800k, so I haven't gotten yet to play with one.
> Any software that's GPU accelerated will perform better on a APU than a similar budget i3. Both lightworks and sony vegas are GPU accelerated, so he should expect to see better at real time effects in lightworks etc. Not to mention the four cores should make a big difference when it comes time to encode or convert videos. Tho some of them processes are already GPU accelerated this day in age as well.


For A10-6800K voltages, see this APU Club post
Quote:


> Originally Posted by *The Stilt*
> 
> 1.61V is the absolute maximum value for Richland APUs (Core VDD).
> Beyond that point AMD cannot guarantee the functionality of the part, even after the voltage has been lowered to normal levels (i.e. permanent damage). On conventional cooling methods nothing above 1.55V should be used even for short periods of time.


Luckily my deeply unconventional 38mm Silverstone kept the APU from harm.


----------



## Opcode

Quote:


> Originally Posted by *Himo5*
> 
> For A10-6800K voltages, see this APU Club post
> Luckily my deeply unconventional 38mm Silverstone kept the apu from harm.


I still don't trust anything above 1.5v, especially with the clock mesh.


----------



## sanket779292

Thanks, but I'll try to do 1866 with my Vengeance, as some people have done at 9-10-9 [email protected]. But I want to ask: will 1.65V degrade my RAM?


----------



## agrims

No, you should be fine at that voltage. Good luck, and tell us what you get. Remember that your overall overclock can be affected by each part's OC; you may not be able to push your CPU as far.


----------



## spatulator

Quote:


> Originally Posted by *Jim888*
> 
> how will the 6800k Do for Video Rendering/editing? I've got a friend who's looking to build a budget machine but wants to use lightworks and sony vegas


I tried Vegas on my 5800K and I was very happy with the rendering speed and real-time effects. The latest version is OpenCL accelerated and is smooth as silk on a budget APU.


----------



## Opcode

Quote:


> Originally Posted by *spatulator*
> 
> I tried Vegas on my 5800k and I was very happy with the rendering speed and real time effects. The latest version is open cl accelerated and is just smooth as silk on a budget apu.


Lightworks should take advantage of the iGPU also, for real time effects etc.


----------



## sanket779292

Will buy a 6800K next month. Let's see how fast it is.


----------



## EliteReplay

Quote:


> Originally Posted by *sanket779292*
> 
> will buy a 6800k next month., let's see how fast it is


I will get the A10-6700... looking for good performance but, at the same time, less power hungry...


----------



## Artikbot

Quote:


> Originally Posted by *EliteReplay*
> 
> i will get the A10 6700... looking for good performance but same time less power hungry...


If you have a way to, I'd love some power consumption figures!!


----------



## beers

Finally picked up a 6800K at microcenter. If you look at the amd bundles on their website they actually advertise the 6800K+Asus F2A85-V Pro at $10 less than they should ($40 discount instead of $30).

Seems pretty snappy, I'll have to flash this 7570 back to a functioning BIOS and give it a shot with DGM.
Quote:


> If you have a way to, I'd love some power consumption figures!!


Were you looking for that chip specifically? I can get some kill-a-watt figures if you want something to compare with.


----------



## EliteReplay

Quote:


> Originally Posted by *beers*
> 
> Finally picked up a 6800K at microcenter. If you look at the amd bundles on their website they actually advertise the 6800K+Asus F2A85-V Pro at $10 less than they should ($40 discount instead of $30).
> 
> Seems pretty snappy, I'll have to flash this 7570 back to a functioning BIOS and give it a shot with DGM.
> Were you looking for that chip specifically? I can get some kill-a-watt figures if you want something to compare with.


Just go ahead and test it with the Kill-A-Watt, that will be great...
The A10-6700 consumes like 15 to 25W less than the A10-6800K.


----------



## beers

Quote:


> Originally Posted by *EliteReplay*
> 
> just go ahead and test it with the kill a watt i will begreat...
> the a10 6700 consume like 15 to 25 less than the a106800k


The hardware info is under my 'HTPC' in the sig, but in brief summary:

6800K, Asus F2A85-V Pro, 2x 4 GB Mushkin Redline 2133 @ 1.65v, 120 GB Corsair Force 3, Corsair CX430 PSU

Results:
Idle : 26-27w (@75% eff -> ~19-21w)
wPrime : 99-108w (@ 78% eff -> ~77-84w)
Furmark : 98-102w (@ 78% eff -> ~76-80w)
wPrime + Furmark : 143-150w (@ 81% eff -> ~115-122w)

I included the estimated DC system wattage after factoring in PSU efficiency based on figures from the Jonnyguru review here:
http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story2&reid=214

Surprisingly, it looked like it stayed in turbo mode @ 4.3 GHz most of the time, although it did fluctuate a bit between 4.3 and 4.2 GHz while the wPrime + Furmark combo was running. I also played a network stream through VLC from a Samba share, and the draw seemed to jump around from the high 30s to the upper 60s/low 70s (watts), but it wasn't very consistent.
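If anyone wants to reproduce the AC-to-DC conversion, it's just the wall reading times the estimated PSU efficiency. A quick sketch (the efficiency percentages are approximations taken from the Jonnyguru review, not measured values):

```python
# Estimate DC system draw from a Kill-A-Watt (AC) reading and PSU efficiency.
# Efficiency figures are rough approximations from the Jonnyguru CX430 review.
def dc_watts(ac_watts, efficiency):
    """Power actually delivered to the system = wall power * PSU efficiency."""
    return ac_watts * efficiency

readings = {
    "Idle": (26, 0.75),
    "wPrime": (99, 0.78),
    "Furmark": (98, 0.78),
    "wPrime + Furmark": (143, 0.81),
}

for test, (ac, eff) in readings.items():
    print(f"{test}: {ac}W AC -> ~{dc_watts(ac, eff):.0f}W DC")
```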


----------



## Artikbot

Wow B, lovely figures right there!

How well does it undervolt?


----------



## beers

Quote:


> Originally Posted by *Artikbot*
> 
> Wow B, lovely figures right there!
> 
> How well does it undervolt?


Unfortunately it looks like undervolting *really* pisses it off. I was getting instant crashes at load even at 1.275v.


----------



## Artikbot

Quote:


> Originally Posted by *beers*
> 
> Unfortunately it looks like undervolting *really* pisses it off. I was getting instant crashes at load even at 1.275v.


Turbo screwing around?

I disabled it when undervolting my A10-5700. It really crapped the thing.


----------



## DaveLT

Quote:


> Originally Posted by *EliteReplay*
> 
> just go ahead and test it with the kill a watt i will begreat...
> the a10 6700 consume like 15 to 25 less than the a106800k


Are you sure? I'm very confident they draw at least 35-45W less


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> Are you sure? I'm very confident they draw at least 35-45W less


The Trinities did indeed draw around 35W less, mostly because the A10-5800K exceeded its rated power quite a bit. Richland might be better optimised.


----------



## beers

Quote:


> Originally Posted by *Artikbot*
> 
> Turbo screwing around?
> 
> I disabled it when undervolting my A10-5700. It really crapped the thing.


Good call, I don't think I disabled it.

Do you know of any fix for modifying BIOS/CMOS settings over HDMI? I see nothing on the screen unless I use VGA...
The issue existed with my old A55 board and 5400K too; not sure if there's a setting for it or if it's broken by design.


----------



## Opcode

My A10-6800k system still doesn't come for at least another 1-2 weeks.


----------



## Opcode

Hmm... here is another report of the HD 7750 working in DGM with a 6800k.

Image Linky #1

Image Linky #2

Image Linky #3


----------



## Bbrad

Anybody know how many VRM phases are on the Biostar Hi-Fi A85W? I want to do some overclocking, and I like the board because of the Hi-Fi audio.

Sent from my SCH-I500 using Tapatalk 2


----------



## Opcode

Quote:


> Originally Posted by *Bbrad*
> 
> Anybody know how many vrms are on this board want to do some over clocking I like it because of the HiFi Biostar Hi-Fi A85W
> 
> Sent from my SCH-I500 using Tapatalk 2


Board looks like a 4+2 phase, mosfets are heatsinked so it should be good to go for overclocking.


----------



## Artikbot

Quote:


> Originally Posted by *beers*
> 
> Good call, I don't think I disabled it any.
> 
> Do you know of any fix for modifying BIOS/CMOS settings over HDMI? I see nothing on the screen unless I use VGA...
> The issue existed with my old A55 board and 5400K too, not sure if there is a setting or if it's an error by design.


Woah, nope O.O

Sounds like one hell of an issue. I use DVI, and it works just fine.


----------



## EliteReplay

This APU runs the game very well at medium settings.


----------



## sanket779292

How much RAM will be shared with this APU's graphics, and will 4GB of 1600 RAM be enough for an A10 system for light gaming at 720p?


----------



## DaveLT

Quote:


> Originally Posted by *sanket779292*
> 
> How much ram will be shared for this apu graphics and will a 4gb 1600 ram be enough for a10 system for light gaming at 720p


Let me guess ... probably yes. But I would not recommend 4GB for ANY system ... 8GB is the minimum.


----------



## beers

Quote:


> Originally Posted by *sanket779292*
> 
> How much ram will be shared for this apu graphics and will a 4gb 1600 ram be enough for a10 system for light gaming at 720p


You can configure how much you want allocated. I seem to recall that leaving it on auto used something like 384 MB, but I didn't dip into any 3D environments to see if it bumped the amount up. You can choose up to 2GB.

That being said, I don't feel 4 GB is enough for a modern system, as also stated above (especially when you are allocating 0.5-1 GB to the GPU as well).


----------



## Mr-Mechraven

Hey guys, quick question, since you all have some idea about these new APUs. Is the A6-6400K OK for light gaming at low resolutions (720p)?

If not, which would be the best-choice APU on a really tight budget?

Thanks


----------



## sanket779292

A8 would be better


----------



## Mr-Mechraven

Quote:


> Originally Posted by *sanket779292*
> 
> A8 would be better


I was just looking at that one and the A8-5600K; I figure the extra £20 is worth it. It's for a friend's daughter's first gaming PC build that I'm doing. Very limited budget, but I figured an APU is the way to go.
I'm giving her my old Phantom 410 White case and some new Xigmatek 120mm fans for free.


----------



## subyman

I wish there were some nicer mITX options for FM2. I needed one with DisplayPort out, but none offered it, so I ended up going with AM3+ and a discrete card.


----------



## Wumbologist

Quote:


> Originally Posted by *subyman*
> 
> I wish there were some nicer mITX options for FM2. I needed one with DP out, but none offered it so I ended up going with AM3+ and a discrete card


There's ITX AM3+?


----------



## sanket779292

I can't even find micro-ATX AM3+; all I see is ATX.


----------



## void

Quote:


> Originally Posted by *subyman*
> 
> I wish there were some nicer mITX options for FM2. I needed one with DP out, but none offered it so I ended up going with AM3+ and a discrete card


Yeah, I agree. I think AMD's APUs would be perfect for a very compact mITX PC: the decent integrated graphics removes the need for a larger discrete video card. Unfortunately there seem to be so few quality FM2 mITX boards, and the few there are, my local distributors seem very reluctant to import.

Hopefully this changes in the future.


----------



## Artikbot

Quote:


> Originally Posted by *void*
> 
> Hopefully this changes in the future.


Since FM2+ is shaping up to be quite the powerhouse and FM2 is becoming a rather popular HTPC platform, we can assume there's going to be a very nice selection of FM2+ mITX boards.


----------



## Opcode

Quote:


> Originally Posted by *EliteReplay*
> 
> this apu cpu runs very well the game at med settings


Worth mentioning the memory is 2400 and the iGPU core is clocked to 1080 MHz. These chips are really good at overclocking; no clue why he didn't go for a CPU clock as well. Below is the standard overclock I have seen done stable on this chip. Pushing it to 5.0 GHz seems to cause problems and requires you to drop the memory speed to get it stable (not worth sacrificing GPU performance for a measly 100 MHz). People are getting the settings below on the stock cooler and Hyper 212+ no problem, so it's definitely an amazing budget gaming chip if you're willing to overclock. If you run a dedicated GPU, you can bump down memory speeds and this is an easy 5.0 GHz (1.450V) chip.

Code:

CPU = 4900 MHz
GPU = 1086 MHz
MEM = 2400 MHz
VCORE = 1.400v


----------



## beers

Here's a dual 3dm11 result with a Dell 7570 (1 GB GDDR5) if anyone's interested:

Stock clocks all around, RAM @ 2133 9-10-11-28
http://www.3dmark.com/3dm11/6756239


----------



## peter-mafia

Regarding CrossFiring with a 7750.

Was pretty skeptical about Richland. Sold my 5800K, bought an A8-5600K, turned off the IGD, and popped in a low-profile 7750.
3dmark11 P3073 ([email protected] ddr3 2133. gpu nb 1.3V.... [email protected]/1200)
http://www.3dmark.com/3dm11/6762319
24-35 fps in GTA4 at 1080p, everything ultra (shadows high).

Was very disappointed. An overclocked A10-5800K + a GDDR5 6670 (micro-stuttering in SOME games) would give the same result.
Turned on the IGD (the A8 sucks: just 256 stream processors) and OC'd it to 1013 MHz.
3dmark11 P3653 (WOW)
http://www.3dmark.com/3dm11/6763440

Was excited. Almost ordered a 6800K. But... GTA4 is unplayable. FPS drops as low as 14 and the stuttering is unbearable.

My conclusions:
-a 7750 won't work properly with an A8 APU, whether Trinity or Richland (same IGD).
-a 7750 alone is too weak for poorly optimized games.

A10s have 384 "old" VLIW4 processing units while a 7750 has 512 GCN ones, so I'm assuming they won't work properly together.


----------



## DaveLT

It's because you CrossFired the APU with the dGPU ... that's why. 512 + 384, do the sums ... with a 1 GHz, 384-SP APU.
Yeah, sure, GCN is very optimized, but it isn't as good for older games; 512 GCN SPs are probably worth about 768 VLIW4 SPs.
If I'm not wrong, the 7770 = 6850. You need to OC the 7750, but they can go really far because it's GCN ...


----------



## peter-mafia

So, what's your point? I don't get it. What's your solution?
Quote:


> If i wasn't wrong 7770 = 6850. You need to OC the 7750 but they can go really far because it's GCN ...


I don't understand the above quote at all.
I'm pretty sure a lot of people want to CrossFire their A10-6800K/5800K with a low-profile 7750 (the fastest low-profile VGA card one can get).
Tried GTA4 again. No stuttering. FPS is around 29-35. Go figure.
Gonna try Tomb Raider and Skyrim later.
Overclocked the IGD to 1083 MHz at 1.4V. NB at 2200 MHz. DDR3-2400 10-12-12-30. 3DMark11 P3887
http://www.3dmark.com/3dm11/6763608
Hopefully I was wrong and the idea of coupling with a 7750 is feasible.


----------



## DaveLT

Yeah, my point is that two 6670s are definitely faster than a single 7750 ... OC that 7750 and run Dual Graphics with the APU (DON'T FORGET TO USE FULLSCREEN).


----------



## peter-mafia

OCing the 7750 makes no sense at all. It is the IGD that should be OCed to be on par with a 7750.


----------



## Opcode

Quote:


> Originally Posted by *peter-mafia*
> 
> Regarding crossfiring with a 7750.
> 
> Was pretty skeptical about Richland. Sold my 5800k. Bought a8-5600K. Turned off IGD. Popped in a low profile 7750.
> 3dmark11 P3073 ([email protected] ddr3 2133. gpu nb 1.3V.... [email protected]/1200)
> http://www.3dmark.com/3dm11/6762319
> 24-35fps in GTA4 1080p. Everything ultra (shadows high)
> 
> Was very disappointed. A10-5800K OC + a gddr5 6670 (micro-stuttering in SOME games) would give the same result.
> Turned on IGD (A8 sucks- just 256 processing units). OC 1013mhz.
> 3dmark11 P3653 (WOW)
> http://www.3dmark.com/3dm11/6763440
> 
> Was excited. Almost ordered a 6800K. But.... GTA4 is unplayable. FPS drops down to as low as 14fps. Stuttering is unbearable.
> 
> My conclusions:
> -a 7750 won't work properly with a A8 APU whether it be trinity or richland (the same IGD).
> -a 7750 alone is too weak for poorly optimized games
> 
> A10s have 384 "old" processing units. A 7750 has 512 GCN, I'm assuming it won't be working properly


Quote:


> Originally Posted by *peter-mafia*
> 
> OCing the 7750 makes no sense at all. It is the IGD that should be OCed to be on par with a 7750.


I don't see the problem; the A10-6800K will do 1080 MHz core on the HD 8670D graphics, and with 2400 MHz DDR3 it can play BF3 at medium settings @ 1080p above 30 FPS. In fact, here is GTA IV being played at medium settings with high textures, running at 30-52 FPS at 1080p with the GPU at 1080 MHz. If he bothered to overclock the CPU side, he wouldn't get such bad frame-rate dips in action-heavy scenes. The HD 7750 alone should push 30-50 FPS on all-high settings. Don't expect too much from the cheapest GCN card AMD has on the market (an $89 card).

HD 8670D @ 1080 MHz @ 1080p




HD 7750 @ 880 MHz @ 900p


----------



## peter-mafia

Played Tomb Raider a bit at 1080p Ultra settings. With 7750 Dual Graphics the frame rate won't drop below 35 fps; with the 7750 only, it drops below 30 fps. Awesome game, btw.
6800K, here I come.
Opcode, the idea is to squeeze as much graphics power out of a mITX rig as possible (4" case; a 7970 GHz won't fit, nor will a 7850).


----------



## cssorkinman

Quote:


> Originally Posted by *peter-mafia*
> 
> Played Tomb Raider a bit.at 1080p Ultra settings. dual graphics 7750-> the frame rate won't drop below 35 fps. 7750 only-> drops below 30fps. Awesome game btw.
> 6800k here I come.
> Opcode, the idea is to squeeze as much graphics power out of a mITX rig as possible (4" case; a 7970 GHz won't fit, nor will a 7850).


What do you get in the TR benchmark?
Agreed, very good game


----------



## peter-mafia

Didn't know there was a benchmark. Thanks!

1080p Ultra settings. Catalyst 13.4
A8 + 7750 dual graphics
min FPS 26.7
max FPS 42.8
avg FPS 36.0

7750only
20.7
32.1
26.3
Pretty impressive, isn't it?
Don't forget I'm using the A8 APU, which lacks a lot of stream processors compared to the A10-5800K/6800K.
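Just to put numbers on it, the percentage gain from enabling Dual Graphics works out like this (plain arithmetic on the benchmark results above):

```python
# Percent FPS gain of A8 + HD 7750 Dual Graphics over the 7750 alone,
# using the Tomb Raider benchmark numbers above.
def gain(dual, single):
    return (dual / single - 1) * 100

for label, dual, single in [("min", 26.7, 20.7), ("max", 42.8, 32.1), ("avg", 36.0, 26.3)]:
    print(f"{label}: +{gain(dual, single):.0f}%")
```

Roughly a third more frames across the board, which matches the "pretty impressive" verdict.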


----------



## Opcode

Quote:


> Originally Posted by *peter-mafia*
> 
> Played Tomb Raider a bit.at 1080p Ultra settings. dual graphics 7750-> the frame rate won't drop below 35 fps. 7750 only-> drops below 30fps. Awesome game btw.
> 6800k here I come.
> Opcode, the idea is to squeeze as much graphics power out of a mITX rig as possible (4" case; a 7970 GHz won't fit, nor will a 7850).


My rig doesn't arrive until Monday, but a few days after that I will be posting benchmarks of some games with my A10-6800K. I plan on testing quite a few titles: GTA IV, Tomb Raider, BF3, Crysis 3, Saints Row 3, etc. I have a review topic planned for the A10-6800K and what the CPU side can offer when I tie an HD 5870 into the mix. So if you're looking for numbers, I will be able to supply them before you make a purchase.


----------



## cssorkinman

Quote:


> Originally Posted by *peter-mafia*
> 
> Didn't know there was a benchmark. Thanks!
> 
> 1080p Ultra settings. Catalyst 13.4
> A8 + 7750 dual graphics
> min FPS 26.7
> max FPS 42.8
> avg FPS 36.0
> 
> 7750only
> 20.7
> 32.1
> 26.3
> Pretty impressive, isn't it?
> Don't forget I'm using the A8 APU which lacks a lot of stream processors compared to the A10-5800/6800K


All things considered, it seems like it does very well.
FWIW, my 8350 + 7970 rig gets about 60 fps average on Ultimate at stock speeds, 1920x1200. Visually spectacular game.


----------



## Cyrious

Question for you guys: how much of an upgrade would a system based on the A10-6800K be vs my sig rig "Midget"?

Because I really want a low-power-consumption rig that is at minimum on par with what I have right now in terms of performance.

Other question: has anyone run Folding@home on it? And if so, how much PPD did it reel in (both SMP and GPU)?


----------



## DaveLT

Quote:


> Originally Posted by *Cyrious*
> 
> Question for you guys: how much of an upgrade would getting a system based off the a10-6800k be vs my sig rig "Midget"?
> 
> Because i am really wanting a low power consumption rig that is at minimum on par with what i have right now in terms of performance.
> 
> Other question: has anyone run [email protected] on it? and if so how much PPD did it reel in (both SMP and GPU)?


More than just your current rig. That's for sure.


----------



## Opcode

Quote:


> Originally Posted by *Cyrious*
> 
> Question for you guys: how much of an upgrade would getting a system based off the a10-6800k be vs my sig rig "Midget"?
> 
> Because i am really wanting a low power consumption rig that is at minimum on par with what i have right now in terms of performance.
> 
> Other question: has anyone run [email protected] on it? and if so how much PPD did it reel in (both SMP and GPU)?


Both CPU and GPU performance will improve if you move to an A10-6800K system. In terms of iGPU, I heard it's a little stronger than a discrete HD 6670 (which is about 2x as strong as your GT 430). And the A10-5800K already beats the Phenom II X4 940 in CPU performance, so with the higher 4.1 GHz base clock plus the 4.4 GHz turbo, the 6800K should offer superior CPU performance over the 940. Though if your 940 is a BE, a simple overclock would easily put it ahead of the A10-6800K in CPU performance (at the cost of more power consumption). So if power consumption is a key aspect, a stock A10-6800K beats a stock Phenom II X4 940 in both power consumption and performance, according to statistics found on the web from the same source (could be right, could be wrong).

A10-6800k

Code:

Idle = 53W
Load = 145W

Phenom II x4 940

Code:

Idle = 159W
Load = 229W
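To put those idle figures in perspective, here's a rough sketch of what the delta means over a year. It uses the unverified numbers above, and the $0.12/kWh electricity rate is my own assumption for illustration:

```python
# Rough yearly energy savings at idle, moving from the Phenom II X4 940
# system to the A10-6800K, per the (unverified) figures above.
idle_old, idle_new = 159, 53              # watts at idle
hours_per_year = 24 * 365
kwh_saved = (idle_old - idle_new) * hours_per_year / 1000
rate = 0.12                               # assumed $/kWh, illustrative only
print(f"~{kwh_saved:.0f} kWh/year saved, ~${kwh_saved * rate:.0f}/year at ${rate}/kWh")
```

Even if the idle figures are off by half, that is still a meaningful saving for an always-on box.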


----------



## Cyrious

Quote:


> Originally Posted by *DaveLT*
> 
> More than just your current rig. That's for sure.


If it pulled in more than 13,000 PPD i would be very impressed.

Quote:


> Originally Posted by *Opcode*
> 
> Both CPU and GPU performance will improve if you move to a A10-6800k system. In terms of iGPU, I heard its a little bit stronger than a discrete HD 6670 (which is about 2x as strong as your GT 430). And the A10-5800k already beats the Phenom II x4 940 in CPU performance, so with the higher base clock being 4.1 GHz plus the 4.4 GHz Turbo it should offer superior CPU performance over the Phenom II x4 940. Tho if your 940 is a BE, a simple overclock would put it ahead of the A10-6800k in terms of CPU performance easily (at the cost of more power consumption). So if power consumption is a key aspect, a stock A10-6800k beats out a stock Phenom II x4 940 in terms of both power consumption and performance. According to statistics found on the web from the same source (could be right, could be wrong).
> 
> A10-6800k
> 
> Code:
> 
> Idle = 53W
> Load = 145W
> 
> Phenom II x4 940
> 
> Code:
> 
> Idle = 159W
> Load = 229W


Oh, very nice then. The Phenom II power figures are way off though. According to my Kill-A-Watt, my system idling pulls just shy of 100W from the wall (80W system power). Under load it pulls approx 280W from the wall (224W system power). Either way it's still a very nice consumption drop, especially for the performance increase.

Edit: I neglected to take into account my monitor and the box fan running off the same power strip the Kill-A-Watt is measuring for the load numbers.
Scratch that, the load power figures are fairly accurate for an overclocked Phenom II and a low-power dedicated GPU. The idle number is definitely high though.


----------



## Opcode

Quote:


> Originally Posted by *Cyrious*
> 
> If it pulled in more than 13,000 PPD i would be very impressed.
> 
> Oh, very nice then. The Phenom II power figures are way off though. According to my kill-a-watt, my system, idling, pulls just shy of 100W from the wall (80W system power). Load, it pulls approx 350W from the wall (280W system power). Either way its still a very nice consumption drop especially for the performance increase.


Though from what I am reading, the GT 430 is better at folding. That's something you can research to see if there are any results for APUs; all I found is that a 6670 can produce about 4600 PPD, which isn't a whole lot compared to what you are getting with your GT 430.


----------



## Cyrious

Quote:


> Originally Posted by *Opcode*
> 
> Tho from what I am reading the GT 430 is better at folding. Something you can research and see if there is any results for APU's, because I only found a 6670 can produce 4600 PPD. Which isn't a whole lot to what you are getting with your GT 430.


No need for specifics, just ballpark figures. 4600 PPD is actually what my 430 was getting before the new units started coming out, and since an APU uses less power for the graphics segment than my dedicated card does (the card pulls about 55W right now), the PPD/watt should not drop by much. Now, the question is: what does the CPU side do in terms of PPD? I've tried looking, but either my google-fu is bad or no one has really folded on one of these. Hardware Canucks used to do Folding@home benching with their reviews but I haven't seen anything yet.


----------



## beers

Edit:
Never mind, it actually said 7770.

Any ideas on modifications to force CF mode when using a 7770?


----------



## Opcode

Quote:


> Originally Posted by *beers*
> 
> I bet none of you suckers can match this 'integrated' 3dm11 result
> 
> P3813
> http://www.3dmark.com/3dm11/6764303


I don't get my 6800k until tomorrow, then I can try


----------



## beers

Quote:


> Originally Posted by *Opcode*
> 
> I don't get my 6800k until tomorrow, then I can try


Sounds good. I gave it a shot earlier but there's no option in CCC. Even trawling through registry values and trying to force a hardware ID of 7750 was a no-go.
I guess I could try flashing it to a 7750, but meh. Let me know if you have any luck.


----------



## Opcode

Quote:


> Originally Posted by *beers*
> 
> Sounds good. I gave it a shot earlier but there's no option in CCC. Even trolling through registry values and trying to force a hardware ID of 7750 was no-go.
> I guess I could try flashing it to a 7750, but meh. Let me know if you have any luck


Did you try cranking the iGPU up to 1080 MHz? He could very well have had his OC'd while it registered as being in power-savings mode. Just curious what numbers will come of it, since it's the GPU holding the score back (the other numbers are legit).


----------



## beers

Quote:


> Originally Posted by *Opcode*
> 
> You try cranking the iGPU up to 1080 MHz? He could very well have his OC'd as it was registered in power savings mode. Just curious as to what numbers will come of it, since its the GPU holding the score back (other numbers are legit).


Haven't messed with it too much. I am under the assumption that the iGPU is faster than the other 7570 I have in there now. If I could get some degree of 7770 to work in dual mode then I'd go about OCing it a bit.


----------



## Opcode

Quote:


> Originally Posted by *beers*
> 
> Haven't messed with it too much. I am under the assumption that the iGPU is faster than the other 7570 I have in there now. If I could get some degree of 7770 to work in dual mode then I'd go about OCing it a bit.


After I am done toying with the iGPU I have a brand-new ASUS HD 5870 Voltage Tweak Edition to throw into it. So no DGM for me; my only concern is a CPU bottleneck. Though people are easily getting 4.9 GHz on really decent volts, so making it a 5 GHz chip should alleviate that quite a bit. That is, if I don't get a crappy MSI board or something (crosses fingers, eyes, toes, derp!).


----------



## Cyrious

Quote:


> Originally Posted by *Opcode*
> 
> I don't get my 6800k until tomorrow, then I can try


Hey, when you get it, can you do some Folding@home benching for me?
GPU by itself, CPU by itself, then both together (preferably overclocked in all 3 tests). If you need help getting it set up to fold, even just for a little bit, PM me and I'll walk you through it.


----------



## Opcode

Quote:


> Originally Posted by *Cyrious*
> 
> Hey, when you get it can you do some [email protected] benching for me?
> GPU by itself, CPU by itself, then both together (preferrably overclocked in all 3 tests). If you need help getting it set up to fold at least for a little bit PM me and i'll walk you through it.


I can set it up on stock clocks, possibly, and see what I get. I won't be doing any overclocking until I get my hands on a Hyper 212+. Unless I magically get a nicely lapped stock heatsink that keeps it plenty cool for overclocking (yeah, right).


----------



## Cyrious

Quote:


> Originally Posted by *Opcode*
> 
> I can set it up possibly on stock clocks and see what I get, won't be doing any overclocking until I get my hands on a Hyper 212+. Unless I magically get a nicely lapped stock heatsink that keeps it plenty cool for overclocking (yea right).


That will be fine too. I can just toss in an overclock percentage and churn out some rough numbers from there.


----------



## spatulator

Quote:


> Originally Posted by *Opcode*
> 
> After I am done toying with the iGPU I got a brand new ASUS HD 5870 Voltage Tweak Edition to throw into it. So no DGM for me, my only concern is a CPU bottleneck. Tho people are easily getting 4.9 GHz on really decent volts, so making it a 5 GHz chip should alleviate that quite a bit. That is if I don't get a crappy MSI board or something (crosses fingers, eyes, toes, derp!).


You could disable the iGPU and even one of the Piledriver modules when you get around to testing that discrete card, for maximum OC headroom =)


----------



## s33dless

Let's talk OC! I'm running about 4900 MHz stable right now (I can grab multipliers etc. when I get home) at 1.5V. I forgot the memory timings, but my RAM was a bit of a ***** to get to comply at its rated speed (mostly because I discovered the hard way that "Auto" sucks even for setting RAM voltage). The iGPU is the hard part: I can push 1000 MHz+, but the display driver will crash and recover from time to time under heavy loads. I tried raising the "APU 1.2V" setting, which I _believe_ to be the iGPU voltage, to 1.25, but stability hasn't improved much. I would push it higher, but the number goes yellow at 1.26, so I'm kind of worried. Has anyone else pushed it higher?


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> Let's talk OC! I'm running about 4900 MHz stable right now (I can grab multipliers etc. when I get home) at 1.5V. Forgot the memory timings, but my RAM was a bit of a ***** to get to comply at rated speed (mostly because I discovered the hard way "Auto" even sucks for setting RAM voltage). My iGPU is the hard part--I can push 1000 MHz+, but the display driver will crash and recover from time to time under heavy loads. I tried raising the "APU 1.2V", which I _believe_ to be the iGPU voltage, to 1.25 but stability hasn't improved much. I would push it higher, but the number goes yellow at 1.26, so I'm kind of worried. Has anyone else pushed it higher?


What is your memory speed? If it's above 1866 you may end up with stability issues when trying to go above 4.9 GHz. Try backing your memory clock down and see if that helps. I am awaiting replacement parts; once I get them I will be trying for the 5.0 GHz mark as well.


----------



## s33dless

I'm running it at the rated 1600 for the time being (or as close as I can get with my FSB where it is, I think 105). My CPU overclock is stable; I'm just trying to push the GPU now. I've seen people get 1000 MHz on their GPUs, so I know it's more than possible, I just can't get it stable. Even 900 MHz gives me slight issues. How high can you safely raise the iGPU voltage?

It's a great damn chip though; Skyrim is beautiful in 1080p.


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> I'm running it at the rated 1600 for the time being (or as close as i can get with my FSB where it is, i think 105). my CPU overclock is stable, i'm just trying to push the GPU now. i've seen people get 1000 MHz on their GPUs, so i know it's more then possible, just i can't get it stable. even 900 MHz gives me slight issues. how high can you safely raise the iGPU voltage?
> 
> it's a great damn chip though--skyrim is beautiful in 1080.


I have never played with APU overclocking; this will be my first APU rig. Though I do know the maximum average OC on the iGPU for this chip is 1080 MHz; once you get there you pretty much hit a wall.


----------



## beers

Would there be any large performance hit from, say, getting a 2 GB 7750 or similar for DGM with the iGPU forced @ 2 GB?
I imagine when using large textures or higher resolutions it wouldn't faceplant as a result of insufficient VRAM.

The memory bandwidth of the iGPU is already gimped; my perception is that it wouldn't make a large difference, since system RAM is already the bottleneck on one of the DGM member cards.

The GDDR5 variant looks more appealing, but apparently the only one available is the six-port DisplayPort one that's about $250..

Edit:
Based on the reviews though, the DDR3 7750 takes a pretty massive performance hit.


----------



## s33dless

I'm looking at getting an Eyefinity 6 card right before getting my 4th monitor. I'm hoping by then they'll have released a new one; all the ones available are kind of old. A 7900-series Eyefinity 6... that would be nice. I don't want to fuss with all those DVI/HDMI/VGA ports. DP all day, baby.


----------



## spatulator

Quote:


> Originally Posted by *s33dless*
> 
> Let's talk OC! I'm running about 4900 MHz stable right now (I can grab multipliers etc. when I get home) at 1.5V. Forgot the memory timings, but my RAM was a bit of a ***** to get to comply at rated speed (mostly because I discovered the hard way "Auto" even sucks for setting RAM voltage). My iGPU is the hard part--I can push 1000 MHz+, but the display driver will crash and recover from time to time under heavy loads. I tried raising the "APU 1.2V", which I _believe_ to be the iGPU voltage, to 1.25 but stability hasn't improved much. I would push it higher, but the number goes yellow at 1.26, so I'm kind of worried. Has anyone else pushed it higher?


Have you tried to OC the GPU with the CPU at stock? That's what I would try first. Try with Turbo off and with it on. Try with Cool'n'Quiet on and off. These APUs have unconventional power gating which could help or hinder a stable OC. I suggest using AMD OverDrive or HWMonitor to monitor the GPU voltage while under load.

I have my 5800K iGPU set to 975 MHz. I can bench as high as 1050 MHz, but that will cause instability after a couple hours of gaming. I chatted with a guy who had his watercooled 5800K at 1150 MHz.


----------



## By-Tor

I'm looking to upgrade my 4 y/o rig (listed below) and have been looking at the A10-6800 and the Fx-8320 as my new build.

How's the A10 in games?

How would it stack up to my Phenom II 940?

Thanks


----------



## DaveLT

Quote:


> Originally Posted by *By-Tor*
> 
> I'm looking to upgrade my 4 y/o rig (listed below) and have been looking at the A10-6800 and the Fx-8320 as my new build.
> 
> How's the A10 in games?
> 
> How would it stack up to my Phenom II 940?
> 
> Thanks


It would quite literally ... ROFLSTOMP your PII, TBH. But seriously, get an 8320 instead.


----------



## By-Tor

Yeah I'm leaning toward the FX series processors, but these A series are very interesting.

My wife's AM2 MB died, so I ended up building her an FM2 setup with a dual-core Trinity processor, and the thing really runs great. I pulled her 4850 video card and I'm just using the on-chip video, and I really like it. For what she does it is more than she needs, but for the price it couldn't be beat.

After building hers I now want to upgrade mine...

Going to change out the MB, processor and memory first and just use the 4870x2 for now till I can get a 7800 or 7900 video card and water block.

Thanks


----------



## s33dless

I say it hauls ass, especially for integrated graphics. It eats pretty much any game I play at 720p. To go full 1080 I have to use the next tier down, but "high" on Skyrim in 1080 looks way better than "ultra" at 720. My original plan was to use this as a stopgap, but I can definitely see myself sticking with it for a while, especially with some of those hybrid CrossFire setups I've heard about; that's going to be fun to experiment with.

$150 for a more than decent unlocked processor that's already clocked high as crap, AND I can put off buying a graphics card until I get a 4th monitor (at which point the pixel count would definitely swamp the iGPU no matter how you OC it)? It was a no-brainer for me.


----------



## hakz

Is it true that you can CF a 7750 with a 6800K?


----------



## Opcode

Quote:


> Originally Posted by *hakz*
> 
> is it true that you can CF a 7750 with 6800k's?


A few people have had success with doing so.


----------



## EliteReplay

Very impressive


----------



## Opcode

Quote:


> Originally Posted by *EliteReplay*
> 
> Very impresive


Not bad for what a $150 APU coupled with 2400 MHz memory and a 1080 MHz iGPU overclock can do. The video above is with the iGPU at 1080 MHz, and so is the one below; once you crank up the iGPU, the APU really starts showing its value. I may make a bunch of gaming videos like these once I get my unit up and running.

DiRT3 on Ultra @ 1080p


----------



## s33dless

Overclocking that iGPU is getting to be a pain for me. Even at stock, the display drivers crash if I play anything for over an hour. They come back up 2 seconds later, but it's still rather irritating. I've been trying to make 100% sure it's not a stability issue with my system. Once I'm satisfied that it's not my fault, AMD will be getting yet another email complaining about their shoddy drivers.


----------



## spatulator

Perhaps your RAM is clocked too high. Memory errors can look like GPU errors because of the shared system memory. Remember that no OC is guaranteed, and a lot of claimed OC speeds you will read about are exaggerated. Also, tinkering with the speed of an APU is more difficult than on a traditional setup. Do you have an MSI motherboard by chance? My MSI motherboard was not cooperative with GPU overclocking, and it took me weeks to find a way to do it.


----------



## s33dless

No, I have an ASUS. Again, I thought it was my OC, so I took everything back to stock. I'm running my RAM at rated right now (or at least I'm pretty sure I am; 1600 MHz 10-10-10 for sure, but 1T/2T... I don't even remember which one I have set lol).

My multiplier is locked to 46 though. Intentionally: 47+ works, but the coil whine is unbearable.


----------



## peter-mafia

Quote:


> Originally Posted by *s33dless*
> 
> Even at stock, the display drivers crash if i play anything for over an hour. They come back up 2 seconds later, but it's still rather irritating. .


Quote:


> Originally Posted by *s33dless*
> 
> Intentionally: 47+ work, but the coil whine is unbearable.


Check for an overheating problem.
Coil whine from where? Any chance it's from the PSU?
That's definitely not a driver issue.


----------



## jsc1973

Quote:


> Originally Posted by *hakz*
> 
> is it true that you can CF a 7750 with 6800k's?


Yes, and it supposedly gives very good performance. It just doesn't make sense unless you already have the 6800K or the 7750, because you can get an FX-6300 and a 7870 for less and it will be faster, so you don't hear many reports of people doing it.


----------



## spatulator

Can't really comment on the coil whine, I've never had that happen, but I suspect the display driver crashes are being caused by a memory error, especially if your GPU is at stock. If you have already tried a clean reinstall of the driver, run memtest at the rated speed for your sticks and determine if you need an RMA. If the sticks are good, test your PSU; a PSU tester is only 5 bucks and you'll use it again for sure (or use a multimeter if you have one).


----------



## DaveLT

Quote:


> Originally Posted by *s33dless*
> 
> Overclocking that iGPU is getting to be a pain for me. Even at stock, the display drivers crash if i play anything for over an hour. They come back up 2 seconds later, but it's still rather irritating. I've been trying to make 100% sure it's not a stability issue with my system. Once I'm satisfied that it's not my fault, AMD will be getting yet another email complaining about their shoddy drivers.


Memory errors. Driver crashes are mostly caused by unstable memory OCs. Also, have you stress-tested your RAM?
Quote:


> Originally Posted by *jsc1973*
> 
> Yes, and it supposedly gives very good performance. It just doesn't make sense unless you already have the 6800K or the 7750, because you can get an FX-6300 and a 7870 for less and it will be faster, so you don't hear many reports of people doing it.


I'm baffled by your statement


----------



## s33dless

The last memtest I ran came up clean; maybe the application was bad. I'll try a bunch of other ones. Any recommendations?


----------



## spatulator

While you can pair up a 7750 or 6670 with a 6800K, there are better ways to spend your budget if graphics performance is your #1 goal. Although, without looking it up myself, I'm guessing an FX-6300 and a 7870 will cost more?


----------



## DaveLT

Quote:


> Originally Posted by *spatulator*
> 
> While you can pair up a 7750 or 6670 with a 6800k there are better ways to spend your budget if graphics performance is your #1 goal. Although without looking it up myself Im guessing a fx6300 and 7870 will cost more?


Absolutely.


----------



## jsc1973

Quote:


> Originally Posted by *DaveLT*
> 
> Memory errors. Driver crashes are mostly caused by unstable memory OCs also have you stress tested your RAM?
> I'm baffled by your statement


Sorry. Run-on sentence up there.









The point was that yes, you can take a 6800K and 7750 and crossfire them if you want to, but you don't see this configuration done by design, because something like an FX-6300 and a stand-alone 7870 is more cost-effective. However, if you already own a 6800K and want more graphics performance, crossfiring it with a 7750 can be done.

This wasn't the case with a Llano 3870K. A 3870K plus a 6670 was often more cost-effective than a similarly performing AMD system using a regular desktop CPU and GPU, so you saw that combination done a lot. Still do, actually.


----------



## DaveLT

I'll agree with you on that, though. But still, an HD 8670D AND a 7750 together aren't to be trifled with.
Still, how much faster is a 7870 @ 1 GHz than a 7850 @ 1 GHz? Because I know full well a 7850 at stock clocks is 50% faster than a 7770 GHz Edition lol








Hell, an FX-6350 (cheaper than the 6800K, strangely) + 7850 will probably be more cost-effective than a 7870.
But with AM3+ boards the selection is a bit limited, and a bit more expensive for the same feature set as FM2 boards.


----------



## stl drifter

Hey guys, would this be a good CPU for a midrange video editing rig, or should I go with an FX-6300?


----------



## computerparts

Quote:


> Originally Posted by *stl drifter*
> 
> hey guys would this be a good Cpu for a midrange video editing rig or should i go with a FX6300 cpu


I'd go with the FX6300 for that.


----------



## DaveLT

Quote:


> Originally Posted by *stl drifter*
> 
> hey guys would this be a good Cpu for a midrange video editing rig or should i go with a FX6300 cpu


Actually a 6350, but a true "midrange" video editing platform would be an 8350


----------



## peter-mafia

Quote:


> Originally Posted by *jsc1973*
> 
> Yes, and it supposedly gives very good performance. It just doesn't make sense unless you already have the 6800K or the 7750, because you can get an FX-6300 and a 7870 for less and it will be faster, so you don't hear many reports of people doing it.


A10-5800K (has the same GPU as the A10-6800K) + 7750 = $110 + $90 (Newegg) = $200
In fact the A10-6800K comes with SimCity 2013, which can be sold on eBay for $30, so it's not $150 but approximately $127 (after eBay + PayPal fees)
FX-6300 + 7870 = $120 + $190 (after MIR) = $310 - $10 = $300
The 7870 comes with Crysis 3 only (Never Settle Reloaded is over for that card), which can be sold on eBay for $9-$10
Not exactly the same


----------



## DaveLT

Quote:


> Originally Posted by *peter-mafia*
> 
> A10-5800 (has the same GPU as the A10-6800k)+ 7750 = $110 + $90 (NewEgg)= $200
> In fact A10-6800K comes with SimCity 2013, which can be sold on ebay for $30, thus it's not $150 but approximately $127 (ebay+paypal fees)
> FX-6300 + 7870 = $120 + $190 (after MIR). = 310-10=$300
> 7870 comes with Crysis 3 only. Never settle reloaded is over for that card. Can be sold on eBay for $9-$10
> Not exactly the same


You're making yourself look like an arse here. And since when did NRS end?


----------



## stl drifter

Quote:


> Originally Posted by *DaveLT*
> 
> Actually 6350 but a true "midrange" video editing platform would be a 6350


what would be a cheap good graphics card to go along with this?


----------



## DaveLT

Quote:


> Originally Posted by *stl drifter*
> 
> what would be a cheap good graphics card to go along with this?


Oops, I meant 8350. Otherwise, an HD 7770 minimum, I think. If your program supports OpenCL, grab an HD 7950, HD 7870, or HD 7850.


----------



## spatulator

Quote:


> Originally Posted by *stl drifter*
> 
> hey guys would this be a good Cpu for a midrange video editing rig or should i go with a FX6300 cpu


With Sony Vegas Pro and a couple of lesser-known video editing programs, yes, the 6800K is a great choice, because the iGPU accelerates rendering and real-time effects using OpenCL. For Adobe Premiere you should build a system with NVIDIA graphics to take advantage of GPU rendering.


----------



## s33dless

lol, you guys keep recommending people buy so much hardware.

If you're not gaming, that APU will do just about everything in good time. I use it for MATLAB/Simulink, AutoCAD, and 3ds Max. I go grab a glass of water and my renders are done when I get back. I always feel slightly guilty making my computer wait lol


----------



## void

Quote:


> Originally Posted by *s33dless*
> 
> lol, you guys keep recommending people buy so much hardware.
> 
> if you're not gaming, that APU will do just about everything ever in good time. i use it for MATLAB/Simulink, AutoCAD, and 3DSMax. I go to grab a glass of water and my renders are done when I get back. I always feel slightly guilty making my computer wait lol


I get what you mean. But this is OCN, a forum dedicated to high-performance PC hardware, so without being given a budget or usage scenario most will recommend the fastest possible hardware to get the job done. If your productivity is dependent on render times, faster hardware will be worth the cost.


----------



## s33dless

Yes, but always keep marginal returns in mind. At some point (don't ask me where; that involves a lot of research, analysis, and debate, definitely debate) the performance jump you're paying for is negligible. Sometimes the difference between the very top tier and the one right below it is just a few kiloflops, which in my mind is not worth a hundred bucks' difference.

Then there are things like the mega supergame black xtreme [insert whatever else] i7s that go for like a grand. If you buy that processor, you have to have absolutely 0 bottlenecks or you just wasted a good $800, which means spending another couple thousand dollars, and then you'll hit the ultimate bottleneck:

PROGRAMS. Hardware gets better every year; software just hogs more resources as time goes on. Not only that, STILL very few code for multiple cores, so 8/10 times you have 1 core loaded and 3 just idling.


----------



## spatulator

To me the 6700 is the best AMD has to offer with Richland. It's a pretty powerful processor and GPU that really looks like the best option out there for a low-power-draw HTPC build. We all know that PC sales are down and that people primarily buy laptops. To non-enthusiast PC users the old "big black box" looks like a dinosaur. So why don't we see more mini PC systems in stores?


----------



## s33dless

Because, honestly, most non-enthusiasts/non-professionals don't need a computer. They need a Netflix and/or Facebook machine (which is why tablets and Macs sell so damn well).


----------



## DaveLT

Quote:


> Originally Posted by *s33dless*
> 
> yes, but always keep in mind marginal returns. at some point (don't ask me which, that involves a lot research, analysis, and debate, definitely debate) the performance jump you're paying for is negligible. sometimes the difference between the very top tier and the one right below it is just a few kiloflops, which in my mind is not worth a hundred bucks difference.
> 
> then there's things like the mega supergame black xtreme [insert whatever else] i7's that go for like a grand. if you buy that processor, you have to have absolutely 0 bottlenecks or you just wasted a good $800, which means spending another couple thousand dollars and you'll hit the ultimate bottleneck:
> 
> PROGRAMS. hardware gets better every year, software just hogs more resources as time goes on. not only that, STILL very few code for multiple cores, so 8/10 times you have 1 core loaded and 3 just idling.


----------



## spatulator

Quote:


> Originally Posted by *s33dless*
> 
> because, honestly, most non-enthusiasts/non-professionals don't need a computer. they need a netflix and/or facebook machine (which is why tablets and macs sell so damn well).


I think AMD should push to market mini PCs. They have won the console bid; now make a move to put a few teensy workstations on the shelves. Get a demo-rig endcap thing going and show people what they can offer for the price of a cheapo laptop. Don't let the vendors nerf the graphics either, like what happened with a lot of Trinity laptops (single-channel RAM).


----------



## s33dless

Quote:


> Originally Posted by *DaveLT*


What was so confusing about my post?


----------



## beers

Quote:


> Originally Posted by *s33dless*
> 
> PROGRAMS. hardware gets better every year, software just hogs more resources as time goes on. not only that, STILL very few code for multiple cores, so 8/10 times you have 1 core loaded and 3 just idling.


This argument is increasingly fail as time progresses.


----------



## s33dless

Well, I'm a software developer, and that's what I'm aware of. If someone else knows something I'm missing, please, let me know.


----------



## beers

Y u no write multithreaded code?


----------



## s33dless

I do for my personal projects, but not at work for sure lol. It takes too much time, and I work in industrial stuff. The few PCs they do use are built for robustness, not performance; they have dual cores in them at best.

And it's not like it's just me. Try decompiling some of the programs you use and take a look: threads are hardly ever implemented. Even when they are, a lot of the time it's done in a way that one thread still does about 98% of the work and runs little threads off on the side for a couple of milliseconds at most (whenever there's a value/calculation it doesn't want to wait for).

But it doesn't just stop at threading. If you REALLY want to make a program shine on multiple cores, you need to use concurrency, and that's hard. In fact, a lot of the time it's not worth the risk, because if done wrong, you can LOSE overall performance.

I stick by my point: I honestly think software is still the #1 bottleneck. A lot of people seem to think that slapping "x64" into the executable name means it was optimized for multi-core. I lol at them.

EDIT: horrible run-on fixed.
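A minimal sketch of the split-the-work pattern being argued about here. Python is used purely for brevity (the thread is talking about C-family code, and `count_primes`/`count_primes_parallel` are made-up names for illustration); the point is that the chunking and merging has to be written explicitly — the extra cores don't get used for free:

```python
from concurrent.futures import ThreadPoolExecutor

def count_primes(lo, hi):
    """The single-threaded version: one core grinds while the rest idle."""
    total = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            total += 1
    return total

def count_primes_parallel(limit, workers=4):
    """Same work, but explicitly chunked and farmed out to a pool.
    The split/merge bookkeeping is exactly the extra effort that
    most programs never bother with."""
    step = limit // workers
    chunks = [(i * step, limit if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each chunk is counted independently, then the results are merged.
        return sum(pool.map(lambda c: count_primes(*c), chunks))
```

(In CPython the threaded version won't actually speed up a pure-Python loop because of the GIL; you'd swap in `ProcessPoolExecutor` for real CPU parallelism. Which is itself a decent illustration of how easy it is to "multithread" without actually going faster.)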


----------



## s-x

Quote:


> Originally Posted by *spatulator*
> 
> I think AMD should push to market mini pcs. They have won the console bid, now make a move to put a few teensy workstations on the shelves. Get a demo rig end cap thing going, show people what they can offer for the price of a cheapo laptop. Dont let the vendors nerf the graphics either like what happened with alot of Trinity laptops (single channel ram).


Yeah, I could see a low- to mid-end mini PC selling pretty well. The only one I can think of that is commercially available at retail stores is the Mac mini...
Speaking of which, manufacturers are freaking horrible at designing systems. HP has a desktop that is a medium-sized tower; it has no power supply but uses a power brick. When you open the system it has something like a micro-ATX motherboard, and the thing is super lightweight and feels like there's nothing inside. It's so pitiful; it seems like HP was trying to save money and used leftover parts to make the damn thing.


----------



## DaveLT

Quote:


> Originally Posted by *s33dless*
> 
> i do for my personal projects, but not at work for sure lol. takes too much time, and i work in *industrial stuff*. the few PCs they do use are built for robustness, not performance. they have dual cores in them at best.


That's your problem there


----------



## s33dless

...as of now. Do you know what I've worked in before? Do you know how many programmers I know in other fields? How about you present an argument and some evidence instead of taking ad hominem jabs at me? I know you can do better.









And like I said, I do for all my personal projects, but those are tools nobody else would be anywhere near interested in.

Also like I said: multithreading != concurrency. The main point of multithreading is to not make stuff wait.

Moreover, no matter how many cores you have, there is no true "parallel" processing; everything is sequential at the low level.

Do some reading, you might learn something.


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> well, I'm a software developer, and that's what i'm aware of. if someone else knows something i'm missing, please, let me know.


In fact it's the exact opposite: hardware has been pulling away from software for a while now. A prime example is how FX processors aren't so good in gaming; it has nothing to do with the hardware. It's that the software isn't optimized to utilize all of the available resources. Almost 90% of all your desktop software is single-threaded.


----------



## s33dless

Quote:


> Originally Posted by *Opcode*
> 
> In fact its the exact opposite, hardware has been pulling away from software for a while now. A prime example is how FX processors aren't so good in gaming, it's nothing to do with the hardware. It's that the software isn't optimized to utilize all of the available resources. Almost 90% of all your desktop software is single threaded.


lol, you're saying the same thing I said. Glad to see someone else agrees: the bottleneck is software that isn't fully optimized.


----------



## DaveLT

Quote:


> Originally Posted by *s33dless*
> 
> ...as of now. do you know what i've worked in before? do you know how many programmers i know in other fields? how about you present an argument and some evidence instead of taking ad hominem jabs at me? i know you can do better


I studied C, C#, C++, and Java programming.


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> lol, you're saying the same thing i said. glad to see someone else agrees: the bottleneck is software. not fully optimized.


Well, you said software is heavy, which means the hardware isn't strong enough to run it efficiently. What I am saying is that there is a fine line between software optimized for dual cores and the terrible performance that results on an eight-core machine. A prime example is Battlefield: every title ran like crap on FX chips until they released Battlefield 3, which was properly optimized to support all eight cores of these machines. If the software isn't written with the hardware in mind, it will always be one step behind that hardware. Hopefully this doesn't happen with HSAIL. I do agree with a few of your points, but I was mostly objecting.


----------



## s33dless

What does that have to do with anything I just said? Let me do a recap:
1) I said software is the bottleneck.
2) You gave a flat "wat".
3) I gave further explanation for my stance.
4) You said me working in industrial applications is my problem.
5) I told you you weren't making any actual points in the argument, but simply taking jabs at me.
6) You say you've studied C, C#, C++ and Java.

See the non sequitur? I said a vast majority of programs are not optimized for multi-core. You say that's not true. Why is that not true? I don't think you would just say that just to say it, so I would like to hear your reasoning; I might be able to learn something.

And what I have to say for #6: all C-based. You're practically using the same language 3 times, except:
C++ adds object orientation,
C# is C++ with a bunch of Microsoft libraries,
and Java runs in its own VM for portability, with other assorted syntax differences like the keywords and polymorphism being implemented differently.

But don't take that the wrong way. I honestly believe that if you can program in one language, you can program in any once you get a syntax guide. It may not be the most efficient code possible and you may not use all the tools that particular language has available, but an algorithm is an algorithm. If the logic works, it *will* port (though not necessarily cleanly).

But, sorry for the tangent and taking this thread off topic.

I would recommend this APU to anybody. The price/performance ratio is very good. I have a couple of buddies with i3/i5 rigs who can't believe I built a liquid-cooled rig that can pull games at 1080p with more than playable frame rates, with MATLAB running in the background, for around $600.


----------



## peter-mafia

Just received my A10-6800K. Immediately de-lidded it (razor blade, 0.08 mm). Under the heat spreader it looks exactly the same as Trinity... The thermal grease is very dry. I'll be surprised if it conducts anything.



Cut off one capacitor, though. Still seems to be working fine.
Started the OC from the GPU side:
CPU NB voltage: 1.400 V
NB frequency: 2200 MHz
DDR3: 2400 MHz
GPU frequency: 1169 MHz

Thermal grease: MX-4 (because of the missing capacitor I didn't want to apply the Coollaboratory Liquid Pro before testing the APU)

Looks like the IMC is much better now. 2400 MHz is not a problem at all, even at 1.300 V. I had to raise the voltage to crank the GPU frequency up.
Gonna test some more, then OC the CPU part and crossfire it with a 7750.

As of now 3dMark11 P2111
http://www.3dmark.com/3dm11/6810442
Not bad at all









Mobo Asrock FM2A85-ITX


----------



## cainy1991

The GPU score you got will increase steadily with a RAM upgrade.
*Or I should say... run the RAM at correct speeds*


----------



## peter-mafia

I'm gonna surprise you: the RAM is running at 2400 MHz. The reported clock is half the effective DDR rate, so you need to multiply what you see by 2. The result is very good, BTW.
Thanks for the advice, though. Been overclocking since 1998


----------



## Opcode

Quote:


> Originally Posted by *peter-mafia*
> 
> Just received my A10-6800K. Immediately de-lidded it (razor blade 0.08 mm). Under the heat spreader looks exactly the same as Trinity... The thermal grease is very dry. Will be surprised if it conducts anything.
> 
> 
> 
> Cut off one capacitor, though. Still seems to be working fine.
> Started OC from the GPU.
> CPU NB voltage 1.400
> NB frequency 2200
> DDR3 2400
> GPU frequency 1169
> 
> Thermal grease MX4 (Because of the missing capacitor didn't want to apply the Coollaboratory Liquid Pro before testing the APU)
> 
> Looks like the IMC is much better now. 2400mhz is not a problem at all even at 1.300V. Had to raise the voltage to crank the GPU frequency up.
> Gonna test some more, then OC the CPU part and to crossfire it with a 7750.
> 
> As of now 3dMark11 P2111
> http://www.3dmark.com/3dm11/6810442
> Not bad at all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mobo Asrock FM2A85-ITX


My iGPU will run 1169 MHz perfectly stable at the same stock voltages.


----------



## peter-mafia

That's awesome







I was a little bit worried about such a high voltage for the CPU NB (1.4 V).
Just for comparison: my A10-5800K coupled with DDR3-2400 would run stable only at 1.4625 V (CPU NB), and the GPU was limited to 1086 MHz.
I tried the CPU part at 4500 MHz at 1.4375 V. It runs stable, but performance-wise there's no difference from stock. I won't be pushing any further.
It's sad, but Newegg won't give you a free copy of SimCity 2013 with the purchase of the A10-6800K anymore (I got one and sold it)


----------



## s33dless

"Sad"? lol, with the massive flop the new SimCity was, giving it away for free is a liability. Customers might take it as an insult lol


----------



## peter-mafia

I never played it







A month ago I sold one for $30 (from an A8-5600K); the day before yesterday I sold another for $28 (A10-6800K). No insult at all







People will pay much less for decent games like BioShock Infinite, for reasons I cannot understand.
My latest results:
A10-6800K (4400 MHz CPU; 1169 MHz iGPU) + 7750 LP (865/1250).
3DMark11 Performance: 4231
http://www.3dmark.com/3dm11/6820113

It runs well but hot (62 °C in Prime95; Zalman CNPS8000B + Coollaboratory Liquid Pro). Still thinking I shouldn't have sold the A10-5800K... I don't see a difference.


----------



## DaveLT

To be frank, I wouldn't ever OC anything on such a tiny heatsink.


----------



## Opcode

Quote:


> Originally Posted by *peter-mafia*
> 
> I never played it
> 
> 
> 
> 
> 
> 
> 
> A month ago sold one for $30 (A8-5600k), the day before yesterday sold another one for $28 (A10-6800K). No insult at all
> 
> 
> 
> 
> 
> 
> 
> People will pay much less for decent games like Bioshock Infinite for the reasons I cannot understand.
> My latest results:
> A10-6800K (4400mhz; 1169mhz) + 7750 LP (865/1250).
> 3dMark11 performance 4231
> http://www.3dmark.com/3dm11/6820113
> 
> Runs well but hot (62c prime95. Zalman CNPS8000B + Coollaboratory liquid Pro). Still thinking I shouldn't have sold the A10-5800K... Don't see a difference.


I am running a $9 heatsink jobby, and I never hit above 55 °C while running P95.


----------



## nz3777

Hello guys, can I jump in here for a second? I'm curious; I want to build with one of these A10-6800Ks for my kids and wife to game on. What cards are compatible for Hybrid Crossfire? And will it make a decent gaming setup for someone that DOESN'T play games like Last Light and so forth? We already have a rig that can handle the high-end stuff; I want this one to be in the living room for them so they can use it for media and light gaming. Any ideas what I can achieve here if I crossfire that thing? Also, what's the BEST board to run with the 6800K? Thank you


----------



## Opcode

Quote:


> Originally Posted by *nz3777*
> 
> Hello guys can I jump in here for a second>? Iam curious I wanna build with one of these A10 6800KS for my kids and wife to game on, What cards are compatable for Hybrid Crossfire? And will it make a decent gaming setup for someone that DOSENT play games like Last light and so forth....We already have a rig that can handle the high end stuff I want this one to be in the living room for them so they can use it as media and light gaming~ Any ideas what I can achive here if i crossfire that thing? Also whats the BEST board to run with the 6800k? Thank you


The iGPU alone is fairly good and can handle older games quite well. I am able to allocate up to 2GB of my system memory to the iGPU using my Extreme6, so handling large resolutions isn't a big issue either. Here is a set of benchmarks I conducted with the APU using only 512 MB of allocated memory. Keep in mind these are with the CPU side at stock.

When paired with a 6670, or with an HD 7750 if you can get Dual Graphics Mode (DGM) to work, these chips are more than capable of pushing impressive frame rates. I can run Prototype 2 fully maxed out above 40 FPS with this APU at stock and my dedicated 5870. Older games like Battlefield 2 and Modern Warfare 2 should run smooth on this APU alone, though if you're running it on a big TV or something, there could be a marginal difference in performance. People have had success with the HD 7750; if not, the 6670 is the largest card to run in DGM.


----------



## DaveLT

Just remember to use 1866 (if you're buying Corsair, stop) or 2133 minimum, or you can simply take 1866 C9 and OC it to 2133 C10; that's still faster than 1866 C9.
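For anyone weighing the 1866 C9 vs 2133 C10 advice above, the first-word latency arithmetic backs it up. A quick sketch (standard DDR3 transfer rates, not any specific kit):

```python
def first_word_latency_ns(transfer_rate_mts, cas):
    # The DDR I/O clock runs at half the transfer rate (double data rate),
    # so one CAS cycle lasts 1 / (MT/s / 2) microseconds = 1000 / clock_mhz ns.
    clock_mhz = transfer_rate_mts / 2
    return cas * 1000.0 / clock_mhz

lat_1866_c9 = first_word_latency_ns(1866, 9)    # ~9.65 ns
lat_2133_c10 = first_word_latency_ns(2133, 10)  # ~9.38 ns
print(f"1866 C9:  {lat_1866_c9:.2f} ns")
print(f"2133 C10: {lat_2133_c10:.2f} ns")
```

So 2133 C10 has slightly *lower* real latency than 1866 C9, on top of the extra bandwidth; the higher CAS number looks worse on paper but each cycle is shorter.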


----------



## nz3777

Opcode, thanks for that man, I appreciate you taking the time to answer! I'm kind of familiar with the 6670; I had two 6570s in CrossFire as my first cards. They did pretty well, but the only problem is that back then I used a 1366x768 resolution, which they handled easily. Once I switched to 1920x1080, OMG, it's like they just died or something, lol. If we game on, say, 3840x1080 (two monitors) and maybe don't max out every game, what do you think my chances are with Hybrid CrossFire?


----------



## s33dless

Quote:


> Originally Posted by *Opcode*
> 
> I am running a $9 heatsink jobby, and I never hit above 55C while running P95.


That's really good; I run about the same with my liquid cooler. I was about to be mad, then I remembered our AC is always off, so I idle at like 38°C.

EDIT: lol, tops out at 50 now. Boy does it sure help when ambient isn't 85°F+...


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> That's really good, I run about the same with my liquid cooler. I was about to be mad then I remembered our AC is always off so I idle at like 38C.
> 
> EDIT: lol, tops out at 50 now. boy does it sure help when ambient isnt 85F+...


Yeah, my liquid loop tops out at 49°C, but it makes way too much noise, so I switched over to air, which is much quieter at idle.


----------



## s33dless

I can never hear my pump, and my radiator is more than quiet enough even with the filter on. I also decoupled my hard drives; god, that thing is quiet as all **** now.

What helps a lot is my mobo's built-in fan controller. ASUS is awesome; having the ability to set custom temp/RPM curves PER FAN is something I would have killed for back in the day (this should make it obvious when I last built myself a machine). My load is mad loud though. I think I might add another fan on the radiator for push-pull. Has anyone played with that?


----------



## Cyrious

Quote:


> Originally Posted by *s33dless*
> 
> I can never hear my pump, and my radiator is more than quiet enough even with the filter on. I also decoupled my hard drives, god that thing is quiet as all **** now.
> 
> What helps a lot is my mobo's built-in fan controller. ASUS is awesome, having the ability to set custom temp/rpm curves PER FAN is something I would have killed for back in the day (this should make it obvious when I last built myself a machine). My load is mad loud thogh, I think I might add another fan on the radiator for a push-pull. Has anyone played with that?


I may not have a 6800K to play with, but I can tell you that going push-pull can definitely improve performance on any cooling system that supports it. Take my H50, for example. I have it on my Q9400, and with the chip cranked to 3800 MHz at 1.5 V, the two fans on it (Cooler Master Blade Master 120mm PWM) keep the temps in the low to mid 70s.
And on my Phenom II (3.6 GHz, 1.45 V), my Xigmatek Gaia with a mismatched pair of fans (the stock 1500 fan pushing and an Antec Tri-Cool 120 running full bore) keeps it below 55°C, whereas before, with just the stock fan, it would be into the 60s at full load.

Get a pair of good PWM fans and a PWM splitter, stick 'em on your cooler, then tweak your fan profile as needed. Word of warning though: push-pull on a radiator will cause both fans to make a sort of dull whining noise that is very noticeable when they are cranked to full speed.


----------



## s33dless

Quote:


> Originally Posted by *Cyrious*
> 
> I may not have a 6800k to play with, but i can tell you that going push pull definitely can improve performance on any cooling system that supports it. Take my H50 for example. I have it on my Q9400 and with the chip cranked to 3800mhz 1.5v the 2 fans (Coolermaster Blade Master 120mm PWM) on it keep the temps in the lower to mid 70s.
> And on my Phenom II (3.6ghz 1.45v), my xigmatek gaia with a mismatched pair of fans (the stock 1500 fan pushing and an antec tri-cool 120 running full bore) keeps it below 55C, whereas before with just the stock fan it would be into the 60s at full load.
> 
> Get a pair of good PWM fans and a PWM splitter, stick em on your cooler, then tweak your fan profile to as needed. Word of warning though, push pull on a radiator will cause both fans to have a sort of dull whining noise that is very noticeable when they are cranked to full speed.


Excellent reply, thanks for the insight. I have an H60, so all I need to do is pop another fan on the inside. The whining is probably due to the mechanical coupling between the two fans (yes, you can couple through fluids, including air). Getting rid of that would take quite a bit of fine tuning. I've messed with turbine design, so it shouldn't be too much of a ***** of a problem to model.

Glad to hear mismatched fans work though, because that is 100% what I was aiming for (forget buying a matching $20 Corsair fan). Which means forgetting the splitter too, since both fans will need different profiles.

I love how cold/quiet I can run this thing. I idle at 36 OC'd with the A/C off. My roommate idles at 38 stock with all fans running full blast on his high-end i3. And my integrated runs pretty much as well as whatever mid-tier card he has in there. Can't wait for payday. 7970, here I come!


----------



## Cyrious

Quote:


> Originally Posted by *s33dless*
> 
> Excellent reply, thanks for the insight. I have an H60, so all I need to do is pop another fan on the inside. The whining is probably due to the mechanical coupling between the 2 fans (yes, through the air, you can couple through fluids). Getting rid of that would take quite a bit of fine tuning. I've messed with turbine design, so it shouldnt be too much of a ***** of a problem to model.
> 
> Glad to hear mismatched fans work tho, because that is 100% what I was aiming for (forget buying a matching $20 corsair fan). Wich means forgetting the splitter too, since both fans will need different profiles.
> 
> I love how cold/quiet I can run this thing. I idle at 36 OC'd with the A/C off. My roommate idles at 38 stock with all fans running full blast on his high end i3. And my integrated runs pretty much as well as whatever mid tier card he has in there. Can't wait for payday. 7970, here I come!


The mechanical coupling you mentioned arises from the fact that both fans are the same make and model, yet one sits in the other's airstream, forcing it to spin faster; the blades moving in and out of phase then cause the noise. My suggestion: get a Cooler Master SickleFlow (9 blades instead of 7), use it as intake, and use the stock Corsair fan as exhaust. In theory, that should help prevent it from happening. I of course won't do it, as I spent my cash on my Blade Masters and I intend on getting some use out of them.

36°C idle while overclocked is pretty good. My setup gets me a little bit cooler, but then again, at these voltages any load is going to result in a considerable core temp increase. Yay for 1.5 V.


----------



## s33dless

Quote:


> Originally Posted by *Cyrious*
> 
> The mechanical coupling you mentioned arises from the fact both fans are of the same make and model yet one is in the others airstream, forcing it to spin faster. The blades moving in and out of phase then cause it. IMO in your opinion get a coolermaster sickleflow (9 blades instead of 7), use it as intake, and then use the stock corsair fan as exhaust. That in theory should help prevent it from happening. I of course wont do it as i spent my cash on my Blademasters and i intend on getting some use out of them.
> 
> 36C OCed idle is pretty good, my setup gets me a little bit cooler, but then again at these voltages any load is going to result in a considerable core temp increase. Yay for 1.5v.


I was going to use the Corsair fan as intake and the stock case fan that came with my NZXT Source 210 as exhaust. It has 9 blades, 1200 RPM max. You're saying the inverse would be better? I have a dust filter on the intake side, so I wanted the fan with the higher static pressure there.

Back on topic for a bit: what does anybody know about the plans for this socket type? I hear Kaveri is going to be on a new one; is my mobo basically going to be a paperweight if I want to get a new APU down the line?

EDIT: the Corsair fan also has a much higher flow rate (~74 CFM compared to ~45). Pull should have the higher flow rate, no?


----------



## agrims

No one knows for sure yet. FM2+ is definitely coming, but as with AM3 and AM3+, backward compatibility is possible, though probably only on the A85 series mobos. It's a waiting game, though.

As for fans: fewer blades in the front, more blades in the rear. If you want quiet, that will work, as you are compressing the air like a true turbine. However, this may not be as efficient at cooling, since you will have a pressure differential across the radiator and will not move as much air through it. The best option would be to use two like fans, as they will try to equalize each other naturally.


----------



## Cyrious

Quote:


> Originally Posted by *agrims*
> 
> No one knows yet for sure. There will be FM2+ coming for sure out, but as with AM3, and AM3+, it will possibly work in AM3, but probably only the 85 series mobo's. it is a waiting game though..
> 
> As for fans, less blades in the front more blades in the rear. You want quiet that will work as you are compressing the air like a true turbine. However this may not be as efficient at cooling as you will have differential pressure in the radiator and will not move as much air through it. The best option would be to use 2 like fans as they will try and equalize each other naturally.


What he said. Taking some 120mm fans you don't need anymore and turning them into shrouds so the entire radiator face gets airflow (no dead zones from the hub and the motor mounts) would also help greatly, but it's costly in terms of space used.


----------



## Sodalink

I got a 7750 2GB off Best Buy for $15 on a clearance mistake, and I've been very tempted to try this as opposed to getting an FX CPU and a dedicated card, since it will save me quite a few watts of power. Or am I wrong?

I'm just waiting to see if they offer free SimCity with the A10-6800K, as they seem to offer it with most of the other APUs.


----------






## s33dless

Quote:


> Originally Posted by *Sodalink*
> 
> I got a7750 2GB I got off Best buy for $15 bucks on a clearance mistake and have been very tempted to try this as appose of getting a FX cpu and dedicated card since it will save me quite a few Watts of power or am I wrong?
> 
> I'm just waiting to see if they offer free Simcity with the A10-6800k as they seem to offer it with most of the other APUs.


do it. i've seen vids on youtube of A10 + 7750 xfire; looks solid, especially at 15 effen dollars.

i have a buddy who works for a company that does auto exhaust, so he knows a lot about mass flow. i told him what i was aiming for: having the lower flow rate pushing, just to relieve some of the drag introduced by the radiator. he told me it should work. matching fans would also do that, but it's a bit overkill IMO if all i'm trying to do is compensate for some drag (those fancy fans are expensive).

and that pressure differential may not be a bad thing. remember, any delta in pressure leads to a delta in temp (and vice versa); a little math should reveal which way it goes. i actually have a model i made in simulink; it would be perfect for this (i just need to adapt it to have 2 differing rotor stages and no stators).
good to hear there's a little glimmer of hope for the socket tho. i do have an 85, so hopefully they don't screw us on that.


----------



## beers

Quote:


> Originally Posted by *Sodalink*
> 
> I got a7750 2GB I got off Best buy for $15 bucks on a clearance mistake and have been very tempted to try this as appose of getting a FX cpu and dedicated card since it will save me quite a few Watts of power or am I wrong?
> 
> I'm just waiting to see if they offer free Simcity with the A10-6800k as they seem to offer it with most of the other APUs.


Some comparison benches would be awesome, since there isn't any real comparison data for the DDR3 variant of the 7750.

Edit:
Crap, I didn't even really read/comprehend your post, apparently. Which FX were you going for? For a full-time rig I'd probably suggest AM3+ or seeing how SR 'rolls' out.


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> Excellent reply, thanks for the insight. I have an H60, so all I need to do is pop another fan on the inside. The whining is probably due to the mechanical coupling between the 2 fans (yes, through the air, you can couple through fluids). Getting rid of that would take quite a bit of fine tuning. I've messed with turbine design, so it shouldnt be too much of a ***** of a problem to model.
> 
> Glad to hear mismatched fans work tho, because that is 100% what I was aiming for (forget buying a matching $20 corsair fan). Wich means forgetting the splitter too, since both fans will need different profiles.
> 
> I love how cold/quiet I can run this thing. I idle at 36 OC'd with the A/C off. My roommate idles at 38 stock with all fans running full blast on his high end i3. And my integrated runs pretty much as well as whatever mid tier card he has in there. Can't wait for payday. 7970, here I come!


You're going to pair a 7970 with a 6800K?

I can run BF3 with my 6800K stock and my HD 5870 stock at the high preset @ 1600x900, and my frame rate is 62-113 (97 average), without dips below 62. I think a 7970 would be a bit much for an APU-based machine.


----------



## james8

Bought an A10-6800K a couple days ago to build an HTPC; first time building an AMD rig. When I saw the stock voltage I freaked out a little and thought my mobo was malfunctioning, since I see 1.344-1.404 vcore in UEFI, and those are usually max-OC voltages on Intel i5s and i7s. Then I read somewhere about AMD having high voltages...

Used an Asus F2A85M/CSM motherboard. Tried HWMonitor and Core Temp and cannot get a good CPU temp reading: idles in the high 40s and loads in the 90s??? Stock cooler. The socket reading is about right at 31 idle and 58-59 load in Intel Burn Test, but the super-high readings in HWMonitor and the 0°C idle in Core Temp seem completely off, not to mention the voltage constantly fluctuating between 1.1-1.3.
Anyone know of any way to accurately measure core temperatures?

Also, I occasionally notice that when stress testing with Intel Burn Test, 1-2 of the cores drop from 4.1 GHz to 3.8 GHz, then go back up again. Sometimes they'd all go to 4.3 GHz before immediately dropping to 3.8 GHz. Is this a sign of heat throttling?
I'm worried


----------



## Sodalink

Quote:


> Originally Posted by *beers*
> 
> Some comparison benches would be awesome since there isn't any real comparison data with the DDR3 variant of 7750.
> 
> Edit:
> Crap I didn't even really read/comprehend your post, apparently. What FX were you going for? For a full time rig I'd probably suggest AM3+ or seeing how SR 'rolls' out.


An FX6600, I think? The new 6-core. However, I was leaning more toward the APU, since my daughter, who is 5, will probably forget to turn off the computer quite often, and I wanted to save as much power as possible.

She would probably play Minecraft and some other kids' games.
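Since the worry is a machine left running, the electricity math is easy to sketch. The wattages and rate below are rough illustrative assumptions (a typical idle APU box vs an idle FX + dedicated card box), not measurements of either system:

```python
def annual_cost_usd(watts, hours_per_day, usd_per_kwh=0.12):
    # kWh per year = (W / 1000) * hours/day * 365 days; multiply by the rate
    return watts / 1000 * hours_per_day * 365 * usd_per_kwh

# Assumed figures: ~45 W idle for an APU build, ~95 W for FX + dGPU,
# left on 12 hours a day at $0.12/kWh.
apu = annual_cost_usd(45, 12)
fx = annual_cost_usd(95, 12)
print(f"APU: ${apu:.2f}/yr, FX+dGPU: ${fx:.2f}/yr, savings: ${fx - apu:.2f}/yr")
```

Under those assumptions the gap is on the order of $25 a year, so the APU saves real money but it's not dramatic; the bigger win is heat and noise.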


----------



## Opcode

Quote:


> Originally Posted by *Sodalink*
> 
> FX6600 I think? The new 6x core. However I was leaning more towards the APU siince my daughter who is 5 will probably forget to turn off the computer really often and wanted to save as much power as possible.
> 
> She would probably play minecraft and some other kids games.


Minecraft and little games like that will not be a problem with just an A10-6800K paired with 8GB of 2133 memory. No need to add a dedicated card if she's only going to be playing games of that magnitude, though if you wanna throw it in because you picked it up for cheap, go ahead. It's supposed to pair with the A10-6800K (not officially), though I am not sure DGM is 100% guaranteed to work every time with the HD 7750. If it doesn't, I guess you could fall back onto the APU's built-in iGPU, which is still quite good; I was able to play Tomb Raider at around 30 FPS on normal settings with it. I would look into buying aftermarket cooling, such as the Hyper 212 Plus at least, because this chip runs really hot and doesn't have a high temperature limit. An A10-6800K, 8GB of DDR3 2133 MHz, and a Hyper 212 Plus would be a good build. If the card works in DGM, that's a bonus; if not, the APU alone will suit whatever a child would want to play.


----------



## s33dless

Quote:


> Originally Posted by *Opcode*
> 
> You're going to pair a 7970 with a 6800k?
> 
> 
> 
> 
> 
> 
> 
> I can run BF3 with my 6800k stock and my HD 5870 stock at high preset @ 1600x900. And my frame rate is 62-113 (97 average), without dips below 62. I think a 7970 would be quite much for a APU based machine.


Because Eyefinity 5x1. The iGPU cannot handle that many pixels at any sort of playable frame rate; even 3x1 gives it issues. I'm obviously not going to CrossFire (bottleneck, plus I get 9 displays that way instead of 6), but I'm 100% sure I won't get any CPU lag (especially not OC'd); this thing hauls ass. Game on 5 screens and 3 more for status etc.? Yes please.
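To put the "that many pixels" point in numbers, here's a quick sketch assuming 1920x1080 panels (the panel resolution is my assumption, not stated above):

```python
def pixel_count(cols, rows, w=1920, h=1080):
    # Total pixels the GPU must fill each frame for a cols x rows display group
    return cols * rows * w * h

single = pixel_count(1, 1)          # one 1080p screen: 2,073,600 px
eyefinity_5x1 = pixel_count(5, 1)   # 5x1 group: 10,368,000 px
print(f"5x1 is {eyefinity_5x1 / single:.0f}x the fill load of one screen")
```

Five times the fill-rate and memory-bandwidth load of a single 1080p screen, which is why an iGPU fed from shared DDR3 falls over well before a 7970 would.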


----------



## Sodalink

Quote:


> Originally Posted by *Opcode*
> 
> Minecraft and little games like that will be not a problem with just a A10-6800k paired with 8GB of 2133 memory. No need to add a dedicated card if she's only going to be playing games of that magnitude. Tho if you wanna throw it in because you picked it up for cheap. It's suppose to pair with the A10-6800k (not officially), tho I am not sure if DGM is 100% guaranteed to work every time with the HD 7750. If it doesn't I guess you could fall back onto the APU's built in iGPU which is still quite good. I was able to play Tomb Raider around 30 FPS on normal settings with it. Tho I would look into buying aftermarket cooling for it, such as the Hyper 212 Plus at least. Because this chip runs really hot, and doesn't have a high temperature limit. A10-6800k, 8GB DDR3 2133 MHz, and a Hyper 212 Plus would be a good build. If the card works in DGM that's a bonus, if not then the APU alone will suit whatever a child would want to play.


Yeah, worst-case scenario the 7750 2GB doesn't work and I can try to sell it again. And since the 7750 card is a 2GB one, can the graphics of the APU be set to use 2GB? I currently have an A3500 x3 Llano in my server, and a while ago I couldn't find anywhere in the BIOS to do that. Is that a BIOS thing?

Also, I would be using 1600 and might try to overclock to 1866 if possible. I already have the sticks, and as of now RAM is a bit expensive. As for an aftermarket cooler, I have plenty lying around: 2x Thermaltake Frio, a Zerotherm Nirvana, and a Corsair H50 I haven't put back into my gaming rig and probably won't; I may get something else instead. I was going to wait to see if they would bundle the A10-6800K with SimCity, but right now TigerDirect has it for $135 with 3% cash back and no taxes, which makes it tempting. Newegg has the ASUS F2A85-M PRO FM2 AMD A85X for like $90 AR plus taxes, or maybe I could try the open box for $75 after taxes, if the rebate goes through.


----------



## MKHunt

The M-Pro is a sweet little board. At 90 AR it's a steal.

I paired it with a silverstone cooler for a sweet SFF build.


Spoiler: Warning: Spoiler!







I have some CL9 2400MHz trident x coming in for my 6800k.


----------



## void

Quote:


> Originally Posted by *MKHunt*
> 
> The M-Pro is a sweet little board. At 90 AR it's a steal.
> 
> I paired it with a silverstone cooler for a sweet SFF build.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I have some CL9 2400MHz trident x coming in for my 6800k.


Nice build!









What's the model number for the cooler?


----------



## Abundant Cores

I hear these things can churn out 5 GHz on air, true?


----------



## s33dless

Quote:


> Originally Posted by *Abundant Cores*
> 
> I hear these things can churn out 5Ghz on Air, true?


I pushed 4.9 on water and it was still pretty cool (it never went to 60), but I had ridiculous amounts of coil whine, so I clocked it down. If you don't mind the noise and can undervolt your system a good bit, then yes, you could most definitely hit 5 GHz. Dunno about it being stable, though.


----------



## Abundant Cores

Quote:


> Originally Posted by *s33dless*
> 
> I pused 4.9 on water and it was still pretty cool (It never went to 60). But I had ******ed amounts of coil whine, so I clocked it down. If you don't mind the noise and can undervolt your system a good bit, yes, you could most definitely hit 5 GHz


That coil whine is coming from your motherboard or your PSU; CPUs don't have coils.

The reason I ask is because a lot of reviewers said they got 5 GHz out of the 6800K on air, some easily, much to their surprise, given that pushing a 5800K much past 4.4 GHz was a bit like drilling a hole into your own kneecap.

This, and that idiotic FX-9590 coming out of the box at 5 GHz, makes me think the Piledriver architecture has had a much-needed seeing-to.


----------



## s33dless

I never said the CPU had coils, just that I was getting coil whine. And though they don't have "coils" in the traditional sense, the internal circuitry has to have inductive elements; being microscopic as they are, though, they can't be a huge source of noise.


----------



## MKHunt

Quote:


> Originally Posted by *void*
> 
> NIce build
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's the model number for the cooler?


Silverstone NT-06. ~28°C in BIOS, with ICD for TIM. I don't have the guts to delid my $150 6800K, but I had no hesitation about taking a razor to my 3770K, lol.

The fit is TIGHT.

Half a millimeter more and there would be problems.


----------



## DaveLT

Quote:


> Originally Posted by *MKHunt*
> 
> Silverstone NT-06. ~28C in BIOS with ICD for TIM. I don't have the guts to delid my $150 6800k but I had no hesitation about taking a razor to my 3770k lol.


Had the guts to delid a $300 chip but no guts to delid a $100 chip? Priceless.


----------



## Farmer Boe

Quote:


> Originally Posted by *MKHunt*
> 
> Silverstone NT-06. ~28C in BIOS with ICD for TIM. I don't have the guts to delid my $150 6800k but I had no hesitation about taking a razor to my 3770k lol.
> 
> The fit is TIGHT.
> 
> 
> 
> 1/2mm more and there would be problems.


Is that NT-06 the passive type? Pretty beastly HTPC rig!


----------



## MKHunt

Nope, it has a slim 120mm fan mounted on the bottom. At full speed it can get pretty loud, but luckily it never has to run very fast to keep things cool.


----------



## Sodalink

Is this RAM (link) good to pair with this motherboard and the A10-6800K?

It was the cheapest I found. I'm hoping to use the $15-off coupon and sell the other 2x4GB set I have for around $50, so that I don't have to invest that much money into this. I haven't really ever used RAM beyond 1333 or 1600.


----------



## FLCLimax

Quote:


> Originally Posted by *Sodalink*
> 
> Is this RAM link to pair with this motherboard and A6800k?
> 
> It was the cheapest I found, I'm hoping to use the -$15 coupon and sell the other 2x4GB set I have for around $50 that way I don't have to invest that much money into this. I haven't really ever used ram besides the 1333 or 1600.


i just got this RAM and a 6800K. i got the brand new gigabyte itx mobo though, i can post results tonight or tomorrow. my stuff will be delivered any minute.


----------



## Farmer Boe

Quote:


> Originally Posted by *Sodalink*
> 
> Is this RAM link to pair with this motherboard and A6800k?
> 
> It was the cheapest I found, I'm hoping to use the -$15 coupon and sell the other 2x4GB set I have for around $50 that way I don't have to invest that much money into this. I haven't really ever used ram besides the 1333 or 1600.


That RAM seems pretty decent. You could drop the speed to 2133 and tighten those timings quite a bit; I know G.Skill RAM loves to be tweaked. Higher-frequency kits are always better for APUs if you can afford it. I have all mine cranked way past their rated speeds.


----------



## Opcode

Quote:


> Originally Posted by *Farmer Boe*
> 
> That ram seems pretty decent. You could drop the speed to 2133 DDR and tighten those timings quite a bit. I know G.Skill ram loves to be tweaked. Higher frequency kits are always better for APU's if you can afford it. I have all mine cranked way past their rated speeds.


Why would you ever do that? APUs LOVE frequency over timings. The jump from 2133 to 2400 could give him up to 5 more FPS in games.
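The frequency-over-timings argument is mostly about raw bandwidth feeding the iGPU, since it renders out of system RAM. A rough sketch of theoretical peak dual-channel DDR3 throughput (ideal numbers; sustained real-world bandwidth is lower):

```python
def peak_bandwidth_gbs(transfer_rate_mts, channels=2, bus_bytes=8):
    # Each DDR3 channel has a 64-bit (8-byte) bus; MT/s * bytes/transfer = MB/s
    return transfer_rate_mts * bus_bytes * channels / 1000

bw_2133 = peak_bandwidth_gbs(2133)  # ~34.1 GB/s
bw_2400 = peak_bandwidth_gbs(2400)  # ~38.4 GB/s
print(f"{bw_2133:.1f} -> {bw_2400:.1f} GB/s (+{(bw_2400 / bw_2133 - 1) * 100:.0f}%)")
```

That's roughly a 12-13% bump in peak bandwidth from 2133 to 2400, which is in the ballpark of the frame-rate gains people report from faster RAM on these APUs.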


----------



## void

Quote:


> Originally Posted by *FLCLimax*
> 
> i just got this RAM and a 6800K. i got the brand new gigabyte itx mobo though, i can post results tonight or tomorrow. my stuff will be delivered any minute.


Gigabyte has a new FM2 ITX board?

Can't wait to hear your thoughts on it.


----------



## peter-mafia

Quote:


> Originally Posted by *Sodalink*
> 
> Is this RAM link to pair with this motherboard and A6800k?
> 
> It was the cheapest I found, I'm hoping to use the -$15 coupon and sell the other 2x4GB set I have for around $50 that way I don't have to invest that much money into this. I haven't really ever used ram besides the 1333 or 1600.


You'll be fine. Your mobo has nothing to do with memory compatibility, because the IMC is inside the APU.
Mine is Ripjaws Z 2400.
The first kit was DOA; the second one, no problems at all. At stock speeds my A10-6800K got 1749 in 3DMark 11 Performance. Not very impressive.


----------



## FLCLimax

hmm...well the board is nice. my CPU temp is too damn high to believe though...60 degrees idle with an H80i.


----------



## Sodalink

Quote:


> Originally Posted by *peter-mafia*
> 
> You'll be fine. Your mobo has nothing to do with memory compatibility because the IMC is inside the APU.
> Mine is Ripjaws Z 2400
> The first kit was DOA. The second one- no problems at all. At stock speeda my A10-6800K got 1749 in 3dmark11 performance. Not very impressive


That score seems way too low for what I was expecting, even though I usually don't care much about scores, only real performance. Like when I upgraded from an AMD 1050T six-core to a 2500K: in many things I felt like I had downgraded, and where it excels I didn't feel much difference. I will get my CPU next week, then I should do some testing.

Quote:


> Originally Posted by *FLCLimax*
> 
> hmm...well the board is nice. my CPU temp is too damn high to believe though...60 degrees idle with an H80i.


That's a bit high. I wonder what temps I will get with a Thermaltake Frio. I was planning to use an H50, but I think I will stick with the Frio.


----------



## FLCLimax

there's no way the temp is being reported correctly.


----------



## beers

Quote:


> Originally Posted by *FLCLimax*
> 
> hmm...well the board is nice. my CPU temp is too damn high to believe though...60 degrees idle with an H80i.


It's 100% either not mounted correctly or making poor contact; make sure the pump is getting a full 12 V, etc.
Quote:


> Originally Posted by *peter-mafia*
> 
> You'll be fine. Your mobo has nothing to do with memory compatibility because the IMC is inside the APU.
> Mine is Ripjaws Z 2400
> The first kit was DOA. The second one- no problems at all. At stock speeda my A10-6800K got 1749 in 3dmark11 performance. Not very impressive


That's kind of surprising. Mine at stock pulled around 2100-ish on 2133 CL9. Maybe it's a Windows 8 thing.


----------



## Opcode

Quote:


> Originally Posted by *FLCLimax*
> 
> hmm...well the board is nice. my CPU temp is too damn high to believe though...60 degrees idle with an H80i.


Core Temp, HWMonitor, and all that jazz give the wrong temperatures, even though HWMonitor is supposed to have been updated to support Richland. The only two things I have found that read the right temperatures are your motherboard's OC suite (it should have temps) and AIDA64 Extreme (Computer -> Sensor -> CPU). Don't mind the CPU # readings, as they are wrong too, though the base CPU temperature should be right.


----------



## FLCLimax

figured it out, the OC software for my board shows the right temps. now i need some more thermal paste to reinstall my H80i. idles at 31 on the stock cooler.


----------



## Sodalink

So with the APU 6800K or the others... can you set the video to allocate RAM to be something to like 2GB? I will be trying to crossfire it with a 7750 2GB I have. I tried doing this for my A3500 x3 Llano and I couldn't find any information as to how to do it a while ago.


----------



## beers

Quote:


> Originally Posted by *Sodalink*
> 
> So with the APU 6800K or the others... can you set the video to allocate RAM to be something to like 2GB? I will be trying to crossfire it with a 7750 2GB I have. I tried doing this for my A3500 x3 Llano and I couldn't find any information as to how to do it a while ago.


Yep, you sure can. I can't remember the specific name of the feature but you can allocate it under a 'force' setting up to 2 GiB


----------



## Sodalink

That's a BIOS setting right?


----------



## s33dless

Yeah. what MoBo do you have? mine tops out at 2GB shared memory, but that's still pretty beastly. too bad my ram is just 1600 MHz (but 16 GB of it, i can have everything open forever). i hear this flies on 2400, but christ almighty those prices are steep.


----------



## Opcode

Quote:


> Originally Posted by *Sodalink*
> 
> So with the APU 6800K or the others... can you set the video to allocate RAM to be something to like 2GB? I will be trying to crossfire it with a 7750 2GB I have. I tried doing this for my A3500 x3 Llano and I couldn't find any information as to how to do it a while ago.


Yes, it should be a BIOS option in the "North Bridge" section (that's where it's located on mine, which is the most logical place). It can be marked as "shared memory" as well. My Extreme6 also allows me to allocate up to 2GB of memory to the iGPU.


----------



## darkusx45

Try the newest version of SpeedFan; it reports the correct temp for my 6800K, along with the OC software. Let us know if you can CrossFire the 2GB version of the HD 7750 with our processor.


----------



## Farmer Boe

Am I correct in assuming only the 2GB GDDR3 version of the 7750 works for hybrid crossfire?


----------



## Opcode

Quote:


> Originally Posted by *Farmer Boe*
> 
> Am I correct in assuming only the 2GB GDDR3 version of the 7750 works for hybrid crossfire?


The GDDR5 versions apparently do also. Though some people are noticing stuttering gameplay that they think is due to the GDDR5 being paired with DDR3.


----------



## Sodalink

Quote:


> Originally Posted by *darkusx45*
> 
> Try the newest version of Speedfan it reports the correct temp for my 6800k along with the OC software. Let us know if you could crossfire the 2gb version of the HD 7750 with our processor.


I will. I got the motherboard yesterday and will hopefully be getting the CPU next Tuesday. I hope I get the RAM by then; the rest of the parts I have already. This weekend I will be taking the 90GB SSD out of my server to put into this build and replacing it with a 40GB SSD (I hope it's enough).
Quote:


> Originally Posted by *Farmer Boe*
> 
> Am I correct in assuming only the 2GB GDDR3 version of the 7750 works for hybrid crossfire?


I hope it does... otherwise I might regret making this build and not going with something else.









----------



## Kyno

Hi A10-6800k users!

May I ask your advice based on your experience with the CPU?

From what I've seen the 8670D will be enough for my small gaming needs, and I don't need huge CPU power. I thought I would be better off with an A10-6800K alone rather than an i3-3220 with a discrete graphics card.

Silence is very important to me, so I feel like this CPU on its own would be easier to cool (with PWM fans and Asus Fan Xpert 2) when not playing. I don't mind noise at full load.

Do you know how the temps would compare to a small i3 when doing a bit of multitasking? (desktop work+streaming...)

Thanks!


----------



## darkusx45

Awesome. I'm debating whether to get an HD 7750 to crossfire or wait and get a GTX 760 around the holiday season.


----------



## beers

Quote:


> Originally Posted by *Farmer Boe*
> 
> Am I correct in assuming only the 2GB GDDR3 version of the 7750 works for hybrid crossfire?


It should be either; most reports I've seen are using the GDDR5 card in tandem with the iGPU. IIRC none of the 7750 cards have a crossfire bridge on them, which may contribute to their 'dual graphics' eligibility (although there are a few cards that have a dedicated bridge that you can still crossfire through PCIe).

I hadn't seen any reports of the 2 GB 7750 version in use but would be interested if there is still a giant RAM speed penalty like discrete-only setups or if they'd be a pretty good match together in DGM.


----------



## DaveLT

Quote:


> Originally Posted by *Kyno*
> 
> Hi A10-6800k users!
> 
> May I ask your advice based on your experience with the CPU?
> 
> From what I've seen the 8670D will be enough for my small gaming needs, and i don't need a huge CPU power. I thought I would be better of with a A10-6800K alone rather than a i3-3220 with a discret graphic card.
> 
> Silence is very important to me, so I feel like this CPU being alone would be easier to cool (with PWM fans and Asus Fan Xpert 2) when not playing. I don't mind noise at full load.
> 
> Do you know how the temps would compare to a small i3 when doing a bit of multitasking? (desktop work+streaming...)


From what I've seen, it just ... stomps the i3 for that.

Here's my disclaimer: I like Intel.
Quote:


> Originally Posted by *beers*
> 
> It should be either, most reports I've seen are using the GDDR5 card in tandem with the iGPU. IIRC none of the 7750 cards have a crossfire bridge on them that may contribute to their 'dual graphics' eligibility (although there are a few cards that have a dedicated bridge that you can still crossfire through PCIE).
> 
> I hadn't seen any reports of the 2 GB 7750 version in use but would be interested if there is still a giant RAM speed penalty like discrete-only setups or if they'd be a pretty good match together in DGM.


Avoid the DDR3 version is what I have to say. The GT 640 lost massively to the 7750 from day one because it has DDR3.


----------



## Kyno

Quote:


> Originally Posted by *DaveLT*
> 
> So far what i've seen it just ... stomps the i3 for that.


I suppose you're talking about the iGPU?


----------



## spatulator

Quote:


> Originally Posted by *Farmer Boe*
> 
> Am I correct in assuming only the 2GB GDDR3 version of the 7750 works for hybrid crossfire?


The GDDR5 version of the 6670 performs waay better than the GDDR3 version according to some test results I read awhile ago about the 5800k in DG with the 6670 discrete. The DDR3 version did not offer much performance boost at all. The biggest limitation of these apus is memory bandwidth. If you are doing DG setup I'd suggest getting a GDDR5 discrete card over a ddr3 version.


----------



## Farmer Boe

Quote:


> Originally Posted by *spatulator*
> 
> The GDDR5 version of the 6670 performs waay better than the GDDR3 version according to some test results I read awhile ago about the 5800k in DG with the 6670 discrete. The DDR3 version did not offer much performance boost at all. The biggest limitation of these apus is memory bandwidth. If you are doing DG setup I'd suggest getting a GDDR5 discrete card over a ddr3 version.


Yeah I have a few 7750's that I can try with my 5800K but I'm messing around with the Athlon 750K at the moment.


----------



## DaveLT

Quote:


> Originally Posted by *Kyno*
> 
> I suppose you're talking about the iGPU?


Not just the GPU. It's just better when it comes to streaming.


----------



## FLCLimax

Got everything set up properly. I must say I am impressed with the 8670D's performance. The most demanding game I play is Guild Wars 2, but still, just over 30fps @ 1920x1080 is good for an iGPU. Got the GPU @ 1024MHz, CPU @ 4.4GHz, and allocated 2GB of RAM to the iGPU.


----------



## FLCLimax

Plays GW2 decently @ 1080p; can't ask for more than 35 fps with how CPU intensive the game is. Dirt, Trine 2, Borderlands 2, Torchlight 2, and TF2 all run great.


----------



## Sodalink

Quote:


> Originally Posted by *FLCLimax*
> 
> 
> 
> 
> 
> plays GW2 decently @ 1080p, can't look for more than 35 fps with how cpu intensive the game is. Dirt, Trine 2, Borderlands 2, torchlight 2, TF2 all run great.


How many fps do you get with Borderlands 2? I'm getting horrible performance with mine crossfired with a 2GB 7750... if I run the card only I get the same performance, which is still awful. I get around 10 fps. Something must be wrong, because my 7770 can play on high settings just fine.

Also, my 2400 RAM is running at 1600 and I need to figure out how to manually set it to run at rated speeds. If I just tell it to run at 2400 the PC doesn't boot.


----------



## Ryude

Quote:


> Originally Posted by *Sodalink*
> 
> How many fps do you get with borderlands 2? I'm getting horrible performance with mine crossfired with a 2gb 7750... if I run the card only I get same performance which still is so awful. I get like around 10 fps. Something must be wrong because my 7770 can play in high settings just fine.
> 
> Also my 2400 ram is running at 1600 and need to figure out how to manually set it to run at rated speeds. If I just tell it to run at 2400 the pc doesn't boot.


Some specific games will give worse performance when using crossfire. In those cases, you're better off just using the discrete GPU and disabling crossfire. But I know that when it does work, you should expect anywhere from 60-80% more FPS.


----------



## axelroll

Hello, and sorry for these ultra-noob questions. I'm using the 6800K processor; can I buy a graphics card later (something high-end like a 7970) and use it as my primary and only GPU (like disabling the integrated one), or can I only buy one of the cards listed before for crossfire?

And another thing: I have to replace my mobo (the other one died) and I'm looking at the Asus F2A85-V-PRO http://www.amazon.com/ASUS-240-Pin-Motherboards-F2A85-V-PRO/dp/B009F1DUAM/ref=cm_cd_al_qh_dp_t. Can I use it as my motherboard right out of the box? This is my first AMD PC, so I can't use an older CPU to update the BIOS to support my CPU (if needed).

Thanks in Advance


----------



## DaveLT

Of course you can, but I wouldn't really buy a 7970 for a 6800K ... bottleneck fest.

If it doesn't have the latest BIOS then ... you'll have to send it to ASUS, I think.


----------



## beers

You can boot on a F2A85-V PRO with an older BIOS, but you'll get some random BSODs until you update it. At least that was the behavior of the HTPC around here, the BIOS that shipped on the board was a year out of date and paired with a 6800k.


----------



## axelroll

Quote:


> Originally Posted by *DaveLT*
> 
> Of course you can but i wouldn't really buy a 7970 for a 6800k ... Bottleneck fest
> 
> If it isn't the latest bios then ... you'll have to send it to ASUS i think


Thanks I'll check graphic cards later.

Quote:


> Originally Posted by *beers*
> 
> You can boot on a F2A85-V PRO with an older BIOS, but you'll get some random BSODs until you update it. At least that was the behavior of the HTPC around here, the BIOS that shipped on the board was a year out of date and paired with a 6800k.


So can I go ahead and buy it without problems (I mean, the PC would turn on so I can update the BIOS), or do you recommend another good motherboard?

***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---***---

The reason I chose the A10 is because I have a low budget (and the A10 has a good price + integrated graphics).
I have around $550 to build the PC. Can you help me choose the parts, please?
Something that is good and doesn't have to be upgraded soon (just a graphics card later).
I also need a case for the PC. I like the Tempest 410 (or any other that has that style).

PS: I already have 2 HDDs and 2 SSDs, 8GB of 2133MHz RAM, and 2 internal Blu-ray drives (so I need at least 7 SATA ports).

PS2: Monitor, mouse, keyboard, and speakers not needed (I already have them).

PS3: Thanks again, and sorry for all this, I'm still a noob.


----------



## agrims

I would go for the Tempest 210 instead and save some boneage. Also, if you want to go high-end later on the GPU, then get a 600-650W PSU. That way you have one less part to buy later.


----------



## Milestailsprowe

I need suggestions on an APU build for a friend. Let's say the price is around $400-ish.



How is this? The grand total for it with Newegg coupons is $391.43.


----------



## OldtimeGamer

Quote:


> Originally Posted by *Ryude*
> 
> Some specific games will give worse performance when using crossfire. In those cases, you're better off just using the discrete GPU and disabling crossfire. But I know that when it does work, you should expect anywhere from 60-80% more FPS.


Some games are definitely worse in crossfire.

I have crossfired 6950s flashed to 6970s and learned that TERA Online is not better in crossfire. It stutters very noticeably. Maybe I can adjust some settings, but so far it has made things worse... not better.


----------



## Farmer Boe

Quote:


> Originally Posted by *OldtimeGamer*
> 
> Some games are definitely worse in crossfire.
> 
> I have cross-fired 6950's, flashed to 6970 and learned that TERA Online is not better in crossfire. It stutters very noticeably. Maybe I can adjust some settings but it made things worse...not better.


Crossfire stuttering will be (hopefully) fixed with a patch at the end of July from AMD.


----------



## DaveLT

Quote:


> Originally Posted by *Farmer Boe*
> 
> Crossfire stuttering will be (hopefully) fixed with a patch at the end of July from AMD.


It's not about that, it's about the minor stuttering that happens in all games. But if it stutters massively without improving fps, it's THE GAME.


----------



## Opcode

Quote:


> Originally Posted by *Milestailsprowe*
> 
> I need suggestions on a APU Build for a friend. Lets say the price is around $400 ish.
> 
> 
> 
> How is this and the Grand Total for it with New Egg coupons is $391.43


Only thing I would recommend is to try your best to get aftermarket cooling. I would try to get at least a Hyper 212+. The A10-6800K runs hot, especially on a copper-core stock heatsink designed for FX chips. I was seeing 59C in games (only with a CPU load, as I run an HD 5870), so the stock aluminum cooler might cause the chip to run way too hot. It's easy to exceed these chips' thermal limit of 74C. I hit around 54C in Bad Company 2, again only a CPU load, and that's on water. I guess he could try the stock cooler, but make sure he pays close attention to temperatures as he does his first load testing.


----------



## DaveLT

Jeez, the Hyper 212 again. It's wayyyy overrated!
It performs well but IT'S TOO LOUD.
http://www.newegg.com/Product/Product.aspx?Item=N82E16835103179 $30 right now.
Or this, http://www.amazon.com/DeepCool-Gamer-Storm-Extreme-Overclocking/dp/B003XWVG2I which is WAY better than the Hyper 212 if you're not reading the wrong sites.


----------



## s33dless

Ok, a lot of people say not to buy the 7970: I did (ASUS DC2-3GD5).

IT. IS. PERFECT. Holy crap. I'm running 3x1 Eyefinity, all games maxed (Skyrim, Dark Souls, DmC, Ace Combat, Assassin's Creed II, etc.), 0 stutter. I upgraded my cooling a bit (put a second fan on my water block radiator) and never even hit 50 at full load on my CPU. Good stuff.

Not only that: I keep my iGPU enabled. The only problems it causes are that GPU OC utilities get confused and sometimes crash, but I don't need to OC anyhow; everything I throw at it is butter smooth (I just wanted fan control).

Now as to WHY I would keep my iGPU on:
1) Extra screens. The iGPU is also Eyefinity capable; that's 3 more screens, since my mobo has the ports. Yes, this does help a lot, because I have a 4th "status" monitor... and being on a different graphics device it adds no load whatsoever, no matter how small, to my gaming screens.

2) OpenCL. See, I could have CPU + dGPU OR CPU + dGPU + iGPU. While crossfiring gives you a hardware-level performance boost, my code is going to ******* fly once I start popping OpenCL kernels in there. With heterogeneous computing, more devices = better. Hell, even Folding@home and those other compute-intensive things people are a lot into always say you should turn your SLI/xfire off and use the cards independently. Much, much more efficient.
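To show what I mean by more devices = better, here's a toy sketch in plain Python (no real OpenCL calls; `split_work` is just a made-up helper showing how you'd carve a kernel's work range into independent per-device chunks):

```python
def split_work(total_items, device_count):
    """Carve a work range into one contiguous chunk per device, so each
    device (CPU, dGPU, iGPU) crunches its slice independently instead of
    running in lockstep like crossfire. Hypothetical helper, illustration only.
    """
    base, extra = divmod(total_items, device_count)
    chunks, start = [], 0
    for i in range(device_count):
        size = base + (1 if i < extra else 0)  # spread the remainder evenly
        chunks.append((start, start + size))
        start += size
    return chunks

# e.g. 10 work items across CPU + dGPU + iGPU:
# split_work(10, 3) -> [(0, 4), (4, 7), (7, 10)]
```

Each device gets its own range and runs flat out; no cross-device sync needed until you gather the results.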

My recommendation is YES, GET THE CARD. Not a waste in the least. Whatever "bottleneck" there is, I don't see it. I AM OC'd to 4.7GHz though, so stock might be a different story, but 4.7 is EASY to get on these things.

I recommend getting that card if your mobo/case have the room (3 PCIe slots wide!!!). It's a PCIe 3.0 card, but most of those are backward compatible with PCIe 2 (this one surely is, I'm using it after all!). Apparently a lot of people don't know that, and are getting mid-tier hardware because of it (my buddy didn't know, and he's one of the few other PC enthusiasts I know personally). I also recommend leaving the iGPU on if you play with heterogeneous computing, but make sure you know how to get your workflows straight.


----------



## azanimefan

Glad you're enjoying the 7970... it is a MONSTER of a GPU... there might be better out there (780/Titan), but just because it's the 3rd most powerful gaming card on the planet doesn't make it crap. It's silly overpowered. And if you're doing OpenCL/compute, it puts to shame anything anyone has outside of the Quadro lineup (and some of the Quadro lineup too).

I think people said not to get the 7970 if you were getting a 6800K because it sorta makes getting the 6800K pointless. For the same cash you could have got an FX-8320 (they're both $145)... that said, only people with a _*core i*_ who need to tell themselves all their CPU power is 100% necessary, regardless of any evidence to the contrary, would claim a 6800K would meaningfully bottleneck any GPU. AMD sells 2nd-rate CPUs, but those 2nd-rate CPUs are generally good enough in almost every situation.

love the creativity of using the igpu as an additional monitor. that's a really awesome idea and makes me a bit jealous i never thought of it.


----------



## s33dless

Quote:


> Originally Posted by *azanimefan*
> 
> glad you're enjoying the 7970... it is a MONSTER of a gpu... there might be better out there (780/Titan), but just because it's the 3rd most powerful gaming card on the planet doesn't make it crap. it's silly overpowered. And if you're doing opencl/compute, it puts to shame anything anyone has outside of the quatro lineup (and some of the quatro lineup too)
> 
> i think people said not to get the 7970 if you were getting a 6800k because it sorta makes getting the 6800k pointless. For the same cash you could have got a FX 8320 (they're both $145)... that said, only people with a _*core i*_ who need to tell themselves all their cpu power is 100% necessary regardless of any evidence to the contrary would claim a 6800k would meaningfully bottleneck any gpu. AMD sell 2nd rate cpus, but those 2nd rate cpus are generally good enough in almost every situation.
> 
> love the creativity of using the igpu as an additional monitor. that's a really awesome idea and makes me a bit jealous i never thought of it.


It's only pointless if you don't use the iGPU, which I do. Not to mention the main thing about the APU: it buys you time. While I was buying monitors and other equipment, card included, I could still game despite only having spent 25% of the cost of the completed system up front.

But yeah, my enthusiasm for the heterogeneity might be why AMD said they're going all APU in the future. Hell, why not? It's a genius idea, and straight CPU power, as experimenting with OpenCL has shown me, is HORRIBLE for some operations. If programmers get smarter about implementing these hybrid setups in their code, computer architecture is going to take a massive turn for the better.

EDIT:
Just found out I qualify for a $20 rebate. Going 'till 7/31 on all ASUS cards apparently, grab one!


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> Ok, a lot of people say not to buy the 7970: I did (ASUS DC2-3GD5).
> 
> IT. IS. PERFECT. Holy crap. I'm running 3x1 eyefinity, all games max (skyrim, dark souls, dmc, ace combat, assassin's creed II, etc. etc...), 0 stutter, i upgraded my cooling a bit (put a second fan on my water block radiator), never even hit 50 at full load on my CPU. Good stuff.
> 
> Not only that: I keep my iGPU enabled. The only problems it causes are that GPU OC utilities get confused and sometimes crashed, but I don't need to OC anyhow, everything i throw at it is butter smooth (i just wanted fan control).
> 
> Now as to WHY I would keep my iGPU on:
> 1) extra screens. The iGPU is also eyefinity capable, that's 3 more screens since my mobo has the ports. yes, this does help a lot, because i have a 4th "status" monitor...and being on a different gfx device it adds no load whatsoever, no matter how small, to my gaming screens.
> 
> 2) OpenCL. See, I could have CPU + dGPU OR CPU + dGPU + iGPU. While crossfiring gives you a hardware level performance boost, my code is going to ******* fly once i start popping open cl kernels in there. with hetero geneous computing, more devices = better. hell, even [email protected] and those other compute intensive things people are a lot into always say you should turn your sli/xfire off and use the cards independently. much, much more efficient.
> 
> my recommendation is YES GET THE CARD. not a waste in the least. whatever "bottleneck", i don't see it. i AM oc'd to 4.7 ghz though, so stock might be a different story, but 4.7 is EASY to get on these things.
> 
> I recommend getting that card if your mobo/case have the room (3 PCIe slots wide!!!) it's a PCIe 3.0 card, but most of those are back compatible to pcie 2 (this one surely is, I'm using it after all!). apparently a lot of people don't know that, and are getting mid-tier hardware because of it (my buddy didnt know and he's one of the few other pc enthusiasts i know personally). i also recommend leaving the iGPU on if you play with hetero computing, but make sure you know how to get your workflows straight.


What temps are you seeing with the H60? I'm running the FX-8150 water loop on mine; its rad is about twice as thick as the H60's. It competes with the H80, though I am curious about the general temperatures on liquid overall. In Bad Company 2 I am pushing up to around 54C with the loop on silent (fans at 600 rpm). I also have the fans set up in a push-pull config exhausting out the back of the case (my 5870 is a blower design, so it exhausts out the back of the case also). So case temps aren't bad at all (otherwise I would have the fans sucking air from outside the case).
Quote:


> Originally Posted by *s33dless*
> 
> It's only pointless if you don;t use the iGPU, which I do. Not to mention the main thing about the APU: it buys you time. While I was buying monitors and other equipment, card included, I could still game with only having spent 25% of the cost of the completed system up front.
> 
> But yeah, my boner for the heterogeneity might be why AMD said they're going all APU in the future. Hell, why not? It's a genius Idea, and straight CPU power, as experimenting with OpenCL has shown me, is HORRIBLE for some operations. If programmers get smarter about implementing these hyda setups in their code, computer architecture is going to take a massive turn for the better.
> 
> EDIT:
> Just found out I qualify for a $20 rebate. Going 'till 7/31 on all ASUS cards apparently, grab one!


Got a new HD 5870 for free, so I will be sticking to that.


----------



## s33dless

You absolutely cannot beat free my friend, and I'm pretty sure you can xfire with that and haul all sorts of ass.

My H60 is slightly modified: I added the case fan that came with the case to it, so it's basically an H80 with one janky fan. With the A/C on, I never hit 50 (averaging around 45 at full load), whereas before I was averaging 50 at the same ambient, 55 if the A/C was off.

But I run my fans much faster than you do...600 RPM is whisper quiet. Mine scale with temp.


----------



## mlibby1980

Anyone know if you can use a 2GB 6970 for dual graphics?


----------



## s33dless

Quote:


> Originally Posted by *mlibby1980*
> 
> anyone know if you could use a 6970 2gb for the dual graphics


Looked on AMD's website; they said "amd.com/dualgraphics" would have a chart, I click the link... 404. AMD's website is VERY bad in general; even half of their OpenCL documentation is dead links.


----------



## mlibby1980

Yeah, I saw that too. That's the reason I posted here.


----------



## azanimefan

Quote:


> Originally Posted by *mlibby1980*
> 
> anyone know if you could use a 6970 2gb for the dual graphics


Nope. The 6670 matches a Richland; your 6970 is way too much GPU to match the APU.


----------



## s33dless

Quote:


> Originally Posted by *azanimefan*
> 
> nope. the 6670 matches a Richland, your 6970 is way too much gpu to match the apu.


Yes, but as mentioned before in this thread, that doesn't necessarily stop you. There are vids of guys xfiring Richland to 7750s and getting mad performance boosts. Just open Catalyst and if it's there, try it and see if it helps.


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> You absolutely cannot beat free my friend, and I'm pretty sure you can xfire with that and haul all sorts of ass.
> 
> My H60 is slightly modified--I added the case fan that came with the case to it, so it's basically an H80 with one janky fan. With the A/C on, I never hit 50 (avg around 45 at full load) wheras before I was averaging 50 at the same ambient, 55 if the A/C was off.
> 
> But I run my fans much faster than you do...600 RPM is whisper quiet. Mine scale with temp.


It indeed does game amazingly well for as old a card as it is. I can run BF3 on it with this APU at 1600x900 on the High preset with an average of 90 FPS. I never see dips lower than 60 FPS in BF3. This is the cooling unit that I use; its rad is double thickness (the same as an H80). It runs neck and neck with the H100 according to Frostytech with the fans at full RPM. Got it brand new for free also, so can't beat that either.
Quote:


> Originally Posted by *mlibby1980*
> 
> anyone know if you could use a 6970 2gb for the dual graphics


No, the biggest officially supported card is an HD 6670, though the HD 7750 seems to work in DGM as well (unofficially). All other cards won't run in DGM mode regardless. With a 6970 you're not missing out on anything; that card is pretty beast. It's stronger than my 5870 and I can run any game I want maxed out.


----------



## s33dless

Tried Dead Space 3 yesterday too. Same deal--everything maxed, no stutter, 3x1 24" 1080p.


----------



## mlibby1980

Yeah, just trying to figure out the cheapest way to game. I just sold an i7 3770K build and still have my 670 4GB card, which I'm selling. I rarely game, so I'm looking for a cheap alternative for when I do. Is it worth running dual graphics? Going to do a mITX build since I have the case, PSU, and two 2TB hard drives left over after I sell my video card.


----------



## s33dless

Well, many will say more performance is always "worth" it, but it depends on your cutoff. The only reason I got a card was Eyefinity. If you're only playing on one screen, the iGPU on its own will gobble ANYTHING up maxed out at 720p. At 1080p you have to start turning things down, but it's still gorgeous and smooth.

So I would put it this way:
resolution <= 720p: not worth it (the iGPU handles it alone)
720p < resolution <= 1080p: worth it (that few-fps boost at max settings will make things a little better than playable at least)
resolution > 1080p: not worth it (can't haul it)
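If it helps, that rule of thumb boils down to something like this (Python; `dual_graphics_worth_it` is just a made-up name encoding my cutoffs, not a benchmark):

```python
def dual_graphics_worth_it(width, height):
    """Rough cutoff from above: pairing a discrete card with the 8670D
    only really pays off between 720p and 1080p. Hypothetical helper,
    just encoding the rule of thumb."""
    pixels = width * height
    if pixels <= 1280 * 720:
        return False  # the iGPU alone handles it
    if pixels <= 1920 * 1080:
        return True   # the extra few fps makes max settings playable
    return False      # above 1080p the pairing can't haul it

# dual_graphics_worth_it(1280, 720)  -> False
# dual_graphics_worth_it(1920, 1080) -> True
# dual_graphics_worth_it(5760, 1080) -> False (3x1 eyefinity)
```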

A lot of the advice I've been seeing in this forum is very generalized: "Your CPU will bottleneck your GPU", "It's not worth it with memory that slow", "Overkill", etc. You need to be able to weigh your own situation. What you need and use will dictate what is a useful path for you, common advice be damned.

I mean, look at that up there. Is it worth it to pair a 6970 to an A10? Yes AND no. Depends on what you're doing! MY advice is always, especially if you already have the hardware, try it. And take benchmarks with a grain of salt: there's no better test than actually using the damn thing.


----------



## DaveLT

I just built an A10-6800K rig for someone today and jeez ... the stock cooler is terrible. I don't know why (or whether I have not updated the BIOS), but idle was showing 56C with the stock cooler. HOLY CHRIST. But the actual temp was only 40C ...
Maybe AMD changed to an "Intel-type" temp display instead of a real temp measurement (e.g. 42C reported at 42C actual), therefore I am inclined to believe it will read up to 105C with 70C being the actual temp ...
I put a 92mm tower on it and the temps probably went down to 33C actual, but it still shows 48C in the BIOS/HWMonitor.

Is it because of an old BIOS? It's the MSI A55M-E33.
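For what it's worth, my two readings line up with a roughly constant offset rather than a rescale (quick Python sanity check on my own numbers, nothing official):

```python
# (reported, actual) temperature pairs from the two coolers above
readings = [(56, 40), (48, 33)]

# difference between what the sensor reports and the real temp
offsets = [reported - actual for reported, actual in readings]
print(offsets)  # [16, 15]: roughly a constant ~15-16C offset,
                # so the readout looks shifted, not rescaled
```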


----------



## Deadboy90

Quote:


> Originally Posted by *DaveLT*
> 
> I just built for someone a A10-6800k rig today and jeez ... the stock cooler is terrible. I don't why (or if i have not updated the bios) but idle was showing 56C with the stock cooler. HOLY CHRIST. But the actual temp was only 40C ...
> Maybe AMD changed to a "Intel-type" temp display instead of a real temp measurement (eg. 42C at 42C) therefore i am inclined to believe it will go up to 105C being 70C the actual temp ...
> I put a 92mm tower on it and the temps probably went down to 33C actual temp but still showing 48C in bios/hwmonitor
> 
> Is it because of an old bios? It's the MSI A55M-E33


It's possible, update it.


----------



## DaveLT

Quote:


> Originally Posted by *Deadboy90*
> 
> It's possible, update it.


I'll do it soon


----------



## OldtimeGamer

Quote:


> Originally Posted by *DaveLT*
> 
> Jeez, Hyper 212 again. It's wayyyy overrated!
> It performs well but IT'S TOO LOUD.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835103179 30$ right now
> Or this, http://www.amazon.com/DeepCool-Gamer-Storm-Extreme-Overclocking/dp/B003XWVG2I which is WAY better than the hyper212 if you're not reading the wrong sites


I have a Hyper 212 and have no complaints at all. I don't think it's loud in any way compared to all the rest of my fans.

My case sits about 7' from me, so at least for me, I think its pretty quiet and keeps my iGPU cool.


----------



## Sodalink

Quote:


> Originally Posted by *s33dless*
> 
> yes, but as mentioned before in this thread, that doesn't necessarily stop you--there are vids of guys xfiring richland to 7750's and getting mad performance boosts. Jus open catalyst and if it's there, try it and see if it helps.


I just built an A10-6800K system and I was expecting too much out of the 2GB GDDR3 7750; I didn't get the performance I was hoping for. It might be because drivers are not optimized for the card, since it's not listed as a hybrid crossfire possibility? If you guys want me to run some kind of benchmarks, let me know before I sell the system, since I changed my mind about saving energy instead of going for power. I think I will go with another kind of build. I loved the A10-6800K's performance by itself though. Can't really go wrong with APUs if you don't plan to add a graphics card. My 3500 x3 Llano is still rocking strong on my 8TB HTPC server, playing Blu-rays and even some light gaming.


----------



## OldtimeGamer

nevermind...


----------



## beers

Quote:


> Originally Posted by *Sodalink*
> 
> I just built an A 6800K build and I was expecting too much out of the 2GB GDR3 7750 and I didn't get the performance I was hoping for. It might be because drivers are not optimized for the card since is not listed as hybrid crossfire possibilities? If you guys want to run some kind of benchmarks let me know before I sell the system since I changed my mind about saving energy instead of power. I think I will go with another kind of build. I loved the A6800k performance by itself though. Can't really go wrong with APUs if you don't plan to add a graphics card. My 3500 x3 Llano is still rocking strong on my HTPC 8TB Server playing Blu rays and even some light gaming.


What do you get in 3dmark11 with dual graphics mode enabled?


----------



## s33dless

Quote:


> Originally Posted by *Sodalink*
> 
> I just built an A 6800K build and I was expecting too much out of the 2GB GDR3 7750 and I didn't get the performance I was hoping for. It might be because drivers are not optimized for the card since is not listed as hybrid crossfire possibilities? If you guys want to run some kind of benchmarks let me know before I sell the system since I changed my mind about saving energy instead of power. I think I will go with another kind of build. I loved the A6800k performance by itself though. Can't really go wrong with APUs if you don't plan to add a graphics card. My 3500 x3 Llano is still rocking strong on my HTPC 8TB Server playing Blu rays and even some light gaming.


TSK. TSK. You used the GDDR3 version; it's not surprising it didn't fly. Try it with a GDDR5 and see.

And again with the "if you're not going to add a card". The CPU can pull way more than enough FLOPS to carry a 7970 and a 3x1 1080p Eyefinity SLS (6,220,800 pixels). If you're only using a single screen, anything in the 7800 range would probably get you flying through everything. You just have to pay attention to what you're doing.


----------



## Sodalink

Quote:


> Originally Posted by *beers*
> 
> What do you get in 3dmark11 with dual graphics mode enabled?


I'll post you the pic in a few hours.

Quote:


> Originally Posted by *s33dless*
> 
> TSK. TSK. You used the GDDR3 version, it's not surprising it didn't fly. Try it with a GDDR5 and see.
> 
> And again with the "if you're not going to add a card". The CPU can pull way more than enough FLOPS to carry a 7970 and a 3x1 1080 eyefinity SLS (6,220,800 pixels). If you're only using a single screen, anything in the 7800 range would probably get you flying through everything. You just have to pay attention to what you're doing.


Yep, I already had the card from a great clearance deal. My daughter wanted a computer, and I was itching to build a new system, which I hadn't done in more than a year, hehe; that's why I did it. It was a bit of an impulse buy, since I sold some of my daughter's stuff she didn't use anymore and got the cash.

I was looking at some benchmarks, and the A10-6800K seems to come really close to i5-2500K speeds, which is the chip I was thinking of getting for her. She can run the 7750 until I can get her a better card.


----------



## FLCLimax

Quote:


> Originally Posted by *DaveLT*
> 
> I just built an A10-6800K rig for someone today and jeez ... the stock cooler is terrible. I don't know why (or whether I just haven't updated the BIOS), but idle was showing 56C with the stock cooler. HOLY CHRIST. The actual temp was only 40C, though ...
> Maybe AMD changed to an "Intel-type" temp display instead of a real temp measurement (e.g. reading 42C at an actual 42C), so I'm inclined to believe it will read up to 105C when the actual temp is 70C ...
> I put a 92mm tower cooler on it and the actual temps probably went down to 33C, but it still shows 48C in the BIOS/HWMonitor.
> 
> Is it because of an old BIOS? It's an MSI A55M-E33.


There is no software out right now that reads the 6800K properly. Use your motherboard's included overclocking/monitoring software; that will be the only accurate temp readout you get.


----------



## beers

Quote:


> Originally Posted by *Sodalink*
> 
> I'll post you the pic in a few hours.


A compare URL is more useful


----------



## Clockdripdoor

Quote:


> Originally Posted by *Mark the Bold*
> 
> Man. I'm getting crazy high temps on this A10-6800k thing without overclocking.
> 
> Like 75-80 C under prime95 after like 10 seconds. And around 50C at idle.
> 
> My cooler is a CM Hyper 212. Not the best, but certainly should perform better than this....
> 
> What kind of temps you peeps getting? Doesn't seem right. Install seems spot on. Reset it twice.
> 
> Some people were saying that Core Temp doesn't jive with AMD sensors, but real temp + hwmonitor says same thing....
> 
> I've always heard AMD's were hot, but damn.


I am running my first water-cooled build.

A10-6800K at 4.4 GHz

32C at idle

Slim 240 rad with one San Ace PWM fan at 1400 RPM.


----------



## Sodalink

Quote:


> Originally Posted by *beers*
> 
> A compare URL is more useful


here you go:

http://www.3dmark.com/3dm11/6896740

I'm not sure how bad or good that is









CPU clocked at 4.3 GHz
iGPU clocked at 1267 MHz
RAM running at 2400 MHz

I ran the trial's default settings.

Edit: I just noticed this:
Memory
8,192 MB
Module 1
4,096 MB G.Skill DDR3 @ 800 MHz
Module 2
4,096 MB G.Skill DDR3 @ 800 MHz

What the? The BIOS shows the RAM running at 2400.


----------



## Milestailsprowe

What is it like without the 7750 or the 7750 on its own?


----------



## Opcode

Quote:


> Originally Posted by *DaveLT*
> 
> I just built an A10-6800K rig for someone today and jeez ... the stock cooler is terrible. I don't know why (or whether I just haven't updated the BIOS), but idle was showing 56C with the stock cooler. HOLY CHRIST. The actual temp was only 40C, though ...
> Maybe AMD changed to an "Intel-type" temp display instead of a real temp measurement (e.g. reading 42C at an actual 42C), so I'm inclined to believe it will read up to 105C when the actual temp is 70C ...
> I put a 92mm tower cooler on it and the actual temps probably went down to 33C, but it still shows 48C in the BIOS/HWMonitor.
> 
> Is it because of an old BIOS? It's an MSI A55M-E33.


It's because most software doesn't display the temperature correctly. Also these chips do put out quite a bit of heat. Use AIDA64 Extreme to show the proper CPU temperature.
Quote:


> Originally Posted by *FLCLimax*
> 
> there is no software out right now that will work properly with the 6800k. use your motherboard's included overclocking/monitoring software, that will be the only accurate temp readout you get.


There is, use AIDA64 Extreme. The core temperatures are wrong, but the CPU temperature is correct.


----------



## beers

Quote:


> Originally Posted by *Sodalink*
> 
> here you go:
> 
> http://www.3dmark.com/3dm11/6896740
> 
> I'm not sure how bad or good that is


Thanks for running that. I am surprised the gains from that aren't more substantial, though.
That score is about 10% higher than my 6800k+7570 bench (http://www.3dmark.com/3dm11/6765119)


----------



## rppot

Hi guys, I've been reading about crossfiring a 7750 with this APU. There really isn't much information out there on this; this thread seems to be the best resource. I've skimmed some of this thread and haven't come across anyone with any problems doing it.

The only person I've seen to have had a problem getting it to work is this guy:



I'm looking at buying the XFX FX-775A-ZNP4 Radeon HD 7750 Core Edition here: http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150612
Is there anything wrong with my build or this card that could cause problems?

AMD A10 6800K APU
Corsair CX430W power supply
Biostar HI FI A85W motherboard
8GB Crucial Ballistix 1600 RAM
1 SSD, 2 mechanical HDDs


----------



## Sodalink

Quote:


> Originally Posted by *Milestailsprowe*
> 
> What is it like without the 7750 or the 7750 on its own?


I'll run those tests later.
Quote:


> Originally Posted by *beers*
> 
> Thanks for running that. I am surprised the gains from that aren't more substantial, though.
> That score is about 10% higher than my 6800k+7570 bench (http://www.3dmark.com/3dm11/6765119)


Might be because the 7750 isn't officially supported, and it has GDDR3?

By the way, should I believe what that link says about the RAM running at 800, or the BIOS, which says 2400?


----------



## beers

Quote:


> Originally Posted by *Sodalink*
> 
> by the way should I believe that link on what it says about the ram running at 800 or BIOS which says 2400?


Use the BIOS/CPU-Z. Mine sits at 2133, but the result sheet usually says 666 MHz, so it's being erroneously reported.
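For anyone puzzled by these readouts: DDR means two transfers per I/O clock, so a tool showing the true clock reads half the rated speed. A sketch of the relationship (using standard DDR3 speed grades, not figures from these particular result sheets):

```python
def effective_rate(io_clock_mhz):
    """DDR = double data rate: two transfers per I/O clock cycle."""
    return io_clock_mhz * 2

# A tool reading the real I/O clock of DDR3-2400 shows 1200 MHz:
print(effective_rate(1200))  # 2400
# A reading of 800 MHz corresponds to DDR3-1600 (a common JEDEC
# fallback), so "800" against a 2400 BIOS setting may mean the rated
# profile didn't apply. A 666 MHz report would correspond to
# DDR3-1333, not 2133, so there the tool is likely just misreading.
```

Cross-checking against CPU-Z's memory tab, as suggested above, is the sensible tie-breaker.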


----------



## tambok2012

Quote:


> Originally Posted by *rppot*
> 
> Hi guys, I've been reading about crossfiring a 7750 with this APU. There really isn't much information out there on this; this thread seems to be the best resource. I've skimmed some of this thread and haven't come across anyone with any problems doing it.
> 
> The only person I've seen to have had a problem getting it to work is this guy:
> 
> 
> 
> I'm looking at buying the XFX FX-775A-ZNP4 Radeon HD 7750 Core Edition here: http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150612
> Is there anything wrong with my build or this card that could cause problems?
> 
> AMD A10 6800K APU
> Corsair CX430W power supply
> Biostar HI FI A85W motherboard
> 8GB Crucial Ballistix 1600 RAM
> 1 SSD, 2 mechanical HDDs


Can you go with the GDDR3 version?


----------



## rppot

Quote:


> Originally Posted by *tambok2012*
> 
> can you go gddr3 version


What's the difference between GDDR5 and DDR3? I thought 1GB GDDR5 would outperform 2GB DDR3 in the same card?

I'm also led to believe this is the same card as in this video http://www.youtube.com/watch?v=_2VrYX1YRgY


----------



## tambok2012

Quote:


> Originally Posted by *rppot*
> 
> What's the difference between GDDR5 and DDR3? I thought 1GB GDDR5 would outperform 2GB DDR3 in the same card?
> 
> I'm also led to believe this is the same card as in this video http://www.youtube.com/watch?v=_2VrYX1YRgY


I've heard that even if you have GDDR5, the speed drops back to GDDR3 levels once you go Hybrid CrossFire with an APU.


----------



## Opcode

Quote:


> Originally Posted by *rppot*
> 
> What's the difference between GDDR5 and DDR3? I thought 1GB GDDR5 would outperform 2GB DDR3 in the same card?
> 
> I'm also led to believe this is the same card as in this video http://www.youtube.com/watch?v=_2VrYX1YRgY


It doesn't matter which you get; the results of DGM aren't going to be all that impressive. It's unofficially supported, which means you won't see drivers tweaked to enhance DGM performance between Richland and the HD 7750, and regardless of which memory type you get, it will still scale like crap. You're better off skipping DGM with the HD 7750 and going straight to an HD 7770 GHz Edition. There is no point in buying an HD 7750 when you can get an HD 7770 GHz Edition for the same exact price (actually cheaper, $89). Those extra 128 GCN cores, clocked 200 MHz faster, will most likely deliver much more performance than the HD 7750 running in DGM with the A10-6800K (whose iGPU is still the VLIW4 architecture).

APUs are great if you're going to run only the iGPU, because you don't need a discrete card. If you're looking to game, though, the HD 7770 + Athlon X4 750K is a better option than any DGM setup with the A10-6800K. Plus you also have to deal with the current CrossFire frame-latency issues. I personally would just turn the iGPU down to 400 MHz in the BIOS, set the allocated memory to 32 MB, and use the iGPU for a second display (eyefinity). Run a faster and cheaper discrete card by itself and get better gaming performance.
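To put rough numbers on that comparison, here is a back-of-the-envelope estimate of theoretical peak shader throughput (shaders x clock x 2 ops per cycle for a multiply-add), using the commonly listed shader counts and clocks; real-game results, and especially DGM scaling, will not track these figures:

```python
def peak_gflops(shaders, clock_mhz, ops_per_cycle=2):
    """Theoretical peak: shaders x clock (MHz) x 2 ops (multiply-add), in GFLOPS."""
    return shaders * clock_mhz * ops_per_cycle / 1000.0

# HD 8670D iGPU (VLIW4): 384 shaders @ 844 MHz
print(peak_gflops(384, 844))    # ~648 GFLOPS
# HD 7750 (GCN): 512 shaders @ 800 MHz
print(peak_gflops(512, 800))    # ~819 GFLOPS
# HD 7770 GHz Edition (GCN): 640 shaders @ 1000 MHz
print(peak_gflops(640, 1000))   # 1280 GFLOPS
```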


----------



## rppot

Thanks for confirming that I should have reservations about a 6800K + 7750 setup. I'm not sure what you're trying to say, though. Will I not get the performance of a 7750 while gaming, with a slight boost from the 6800K?

How much better would the 7770 be versus the 7750 + 6800K, in idiot language? Where can I go to compare the theoretical performance of a 6800K + 7750 versus a 6800K running a 7770 on its own? I'm still leaning towards CrossFiring with the 7750, as this is my best option for a 7770: http://www.ncix.com/products/?sku=68528&vpn=FX777AZNF4&manufacture=XFX&promoid=1311 and I don't want to deal with a mail-in rebate. Also, I'm baselessly skeptical of PowerColor; they are always the cheapest brand, usually with mediocre reviews, and I don't want to get burned.

After initial disappointment with my first build, I'm actually somewhat impressed with the 6800K being able to run games like Fallout 3 and Borderlands at 25-30 FPS at 720p. But it's squarely in the low end in terms of gaming. Almost the bare minimum of what I needed, but not quite.

I am now strongly considering putting the 6800K + motherboard + RAM into a smaller case with a lower-wattage power supply and turning it into a media center computer. Reading others' plans along the same lines, they're concerned about potential heat issues. Is this something I should be worried about?

This is probably too off topic... That would leave the case and maybe the power supply and RAM. Where should I start in looking for a CPU + GPU (bought around Black Friday) to last me 3-4 years, purely for gaming?


----------



## Sodalink

Quote:


> Originally Posted by *rppot*
> 
> Thanks for confirming that I should have reservations about a 6800k + 7750 setup. I don't know what you're trying to say though. Will I not get the performance of a 7750 while gaming, with a slight boost from the 6800k?
> 
> How much better would the 7770 vs the 7750 + 6800k in idiot language? Where can I go to compare the theoretical performance of a 6800k + 7750 vs 6800k running a 7770 on its own? I'm still leaning towards crossfiring with the 7750 as this is my best option for a 7770: http://www.ncix.com/products/?sku=68528&vpn=FX777AZNF4&manufacture=XFX&promoid=1311 and I don't want to deal with a mail in rebate. Also I'm baselessly skeptical of Powercolor, but they are always the cheapest brand with usually mediocre reviews and I don't want to get burned.
> 
> After initial disappointment with my first build, I'm actually somewhat impressed with the 6800k being able to run games like Fallout 3 and Borderlands at 25-30fps at 720p. But it's square in the low end in terms of gaming. Almost what I needed at the bare minimum, but not quite.
> 
> I am now strongly considering putting the 6800k + motherboard + RAM??? into a smaller case and lower power supply and turning it into a media center computer. Reading others' plans on this same idea, they're concerned about potential heat issues. Is this an issue I should be concerned about?
> 
> This is probably too off topic...That would leave the case and maybe the power supply and RAM. Where should I start in looking for a CPU + GPU to last me 3-4 years into the future (from Black Friday) for purely gaming?


Like mentioned above... the build I just did with the 7750 and the 6800K does not outperform a 7770 alone. I also have a 7770 in my HTPC with a Phenom II 840 X4 at 2.8 GHz, and gaming is better on it. However, the 6800K is much better when it comes to CPU power. I only did the Hybrid CrossFire with the 7750 because I already had the card and got it extremely cheap. I was hoping the 6800K + 7750 would do a bit better than the 7770 while using less power, with a video card that doesn't need a PSU connection. I was wrong, though, and I think it's mostly because the combo isn't officially supported by current drivers.

This is what I got for my build:
Asus F2A85-M Pro
AMD A10-6800K
G.Skill 2x4GB DDR3-2400

However, I think I'm going to go with a different build for my daughter, as the 7750 Hybrid CrossFire didn't work like I expected.


----------



## DaveLT

Quote:


> Originally Posted by *Opcode*
> 
> It's because most software doesn't display the temperature correctly. Also these chips do put out quite a bit of heat. Use AIDA64 Extreme to show the proper CPU temperature.
> There is, use AIDA64 Extreme. The core temperatures are wrong, but the CPU temperature is correct.


I do have AIDA64 Extreme, but I didn't fire it up since I had to disable UAC first, and I forgot about it soon after.







I do know the actual temp of the CPU anyway







*Edit: I checked AIDA64 Extreme and it showed correct temps after updating the BIOS.
Quote:


> Originally Posted by *rppot*
> 
> What's the difference between GDDR5 and DDR3? I thought 1GB GDDR5 would outperform 2GB DDR3 in the same card?
> 
> I'm also led to believe this is the same card as in this video http://www.youtube.com/watch?v=_2VrYX1YRgY


The GDDR5 version will absolutely eat the DDR3 one for lunch; that's the reason the GT 640 is massively inferior to the 7750.
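The bandwidth gap behind that claim is easy to estimate. A sketch using the commonly listed reference figures (a 128-bit bus, with GDDR5 at 4.5 Gbps per pin versus DDR3 at 1.6 Gbps; individual cards vary):

```python
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mtps):
    """Peak memory bandwidth in GB/s: bus width in bytes x transfers per second."""
    return bus_width_bits / 8 * transfer_rate_mtps / 1000.0

print(peak_bandwidth_gbs(128, 4500))  # HD 7750 GDDR5: 72.0 GB/s
print(peak_bandwidth_gbs(128, 1600))  # DDR3 variant:  25.6 GB/s
```

Roughly a 3x gap on the same GPU, which is why the DDR3 cards fall so far behind at anything fill-rate or texture heavy.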
Quote:


> Originally Posted by *tambok2012*
> 
> i've heard that even if you have GDDR5 speed it will be back to GDDR3 once you will go Hybrid Crossfire w/an APU


DGM is NOT Hybrid CrossFire! It simply uses each GPU to render half the screen, without actually touching the other's memory subsystem.
Quote:


> Originally Posted by *rppot*
> 
> Thanks for confirming that I should have reservations about a 6800k + 7750 setup. I don't know what you're trying to say though. Will I not get the performance of a 7750 while gaming, with a slight boost from the 6800k?
> 
> How much better would the 7770 vs the 7750 + 6800k in idiot language? Where can I go to compare the theoretical performance of a 6800k + 7750 vs 6800k running a 7770 on its own? I'm still leaning towards crossfiring with the 7750 as this is my best option for a 7770: http://www.ncix.com/products/?sku=68528&vpn=FX777AZNF4&manufacture=XFX&promoid=1311 and I don't want to deal with a mail in rebate. Also I'm baselessly skeptical of Powercolor, but they are always the cheapest brand with usually mediocre reviews and I don't want to get burned.
> 
> After initial disappointment with my first build, I'm actually somewhat impressed with the 6800k being able to run games like Fallout 3 and Borderlands at 25-30fps at 720p. But it's square in the low end in terms of gaming. Almost what I needed at the bare minimum, but not quite.
> 
> I am now strongly considering putting the 6800k + motherboard + RAM??? into a smaller case and lower power supply and turning it into a media center computer. Reading others' plans on this same idea, they're concerned about potential heat issues. Is this an issue I should be concerned about?
> 
> This is probably too off topic...That would leave the case and maybe the power supply and RAM. Where should I start in looking for a CPU + GPU to last me 3-4 years into the future (from Black Friday) for purely gaming?


PowerColor is a seriously good brand now; I bought Sapphire and got low-clocking cards twice.








There isn't any heat issue with bigger heatsinks (which I recommend), but the utilities are mostly wrong, apart from AIDA64. Idle temps for the A10-6800K were roughly 40C with the stock heatsink. I used to have a Q9400, and its actual idle temp was around 43C ... on the same heatsink I use today for his A10-6800K.
Not a bad temp for a terrible heatsink.








If you want a good, cheap low-profile heatsink, buy an AMD FX stock heatsink.


----------



## rppot

I just noticed that Aero doesn't run for me. Is this the case for anyone else? The troubleshooter detects no issues yet no Aero.


----------



## Opcode

Quote:


> Originally Posted by *rppot*
> 
> I just noticed that Aero doesn't run for me. Is this the case for anyone else? The troubleshooter detects no issues yet no Aero.


If the GPU drivers are installed and you're running Windows 7, just run the Windows Experience Index assessment; it should enable Aero on its own.


----------



## DaveLT

Change the theme. On a fresh install, Aero won't run because the theme defaults to Windows Basic.


----------



## peter-mafia

So I've checked the 3DMark 11 database.
A Phenom 965 + 7770 scores 3700 on average:
http://www.3dmark.com/3dm11/3198331
On the other hand, my mini-ITX rig with the A10-6800K @ 4400/1169, DDR3-2400, CPU-NB at 2400, a low-profile PowerColor 7750 GDDR5 @ 865/1250, and an OCZ Vertex Plus R2 120 SSD scores 4231:
http://www.3dmark.com/3dm11/6820113

In the Tomb Raider benchmark I got 30 FPS (lowest) / 51 FPS (highest) / 39 FPS (average) on Ultra settings at 1080p, Catalyst 13.4. I highly doubt a single 7770 would be able to do that.
The difference is huge. And look at the size of my rig.


----------



## DaveLT

Vs. a 965 in a minimum ATX case


----------



## EliteReplay

Can people post their builds here, please? I mean pictures of those nice mATX cases and all of that.


----------



## s33dless

Quote:


> Originally Posted by *Opcode*
> 
> It doesn't matter which you get, the results of DGM aren't going to be all that impressive. It's unofficially supported, which means you wont see drivers tweaked to enhance DGM performance between Richland and the HD 7750. Regardless to what memory type you get it will still scale like crap. You're better off skipping the DGM with the HD 7750 and go straight to buying an HD 7770 GHz edition. There is no point in buying a HD 7750 when you can get a HD 7770 GHz edition for the same exact price (actually cheaper $89). Them extra 128 GCN cores clocked 200 MHz faster will most likely deliver much more performance than the HD 7750 running in DGM with the A10-6800k (still VLIW4 architecture). _*APU's are great if you're going to run only the iGPU because you don't need a discrete card.*_ _*Tho if you're looking to game, the HD 7770 + Athlon X4 750k is a better option than any DGM setup with the A10-6800k.*_ Plus you also have to deal with the current crossfire issues with frame latency. _*I personally would just turn the iGPU down to 400 MHz in the bios and set the allocated memory to 32mb and just use the iGPU for a second display (eyefinity).*_ Run a faster and cheaper discrete card by itself and get better gaming performance.


1) Not true: the A10-6800K is a good processor, PERIOD.
2) Sure, for dual graphics. But if you're willing to abandon that and go to the 7800/7900 series (which you have the time to save for, because guess what? your iGPU can play things if you turn the options down), you'll get much, much better performance.
3) A "second display" is not Eyefinity. Eyefinity is a hardware/software blend that allows you to have up to six displays plugged into one AMD display device. Part of Eyefinity is SLS (single large surface) display, which lets you use multiple monitors as one big monitor. If you have a mobo with at least one DisplayPort, you can get Eyefinity SLS on just the APU (which I did for a while, and it was sweet, though it stuttered a bit; but whaddyawant at 3x 1080p?). And if you've ever played with heterogeneous programming, you know GPU cores are awesome, and having a nice spread throughout your system is very useful. OpenCLers should love this platform; it's great stuff.


----------



## azanimefan

Quote:


> Originally Posted by *s33dless*
> 
> 1) Not true: the A10-6800k is a good processor PERIOD.
> 2) Sure, for dual graphics. but if you're willing to abandon that and go to 7800/7900 series (which you have the time to save for because guess what? your iGPU can play things if you turn the options down), you'll get much, much better performance.
> 3) A "second display" is not eyefinity. "eyefinity" is a hard/soft blend that allows you to have up to 6 displays plugged in to 1 AMD display device. Part of eyefinity is SLS (single large surface) display which lets you use multiple monitors as one big monitor. If you have a mobo with at least one displayport, you can get eyefinity SLS on just the APU (which I did for a while and it was sweet, though it stuttered a bit--but whaddyawant at 3x 1080p?). But if you've ever played with hetero programming, you know GPU cores are awesome and having a nice spread throughout your system is very useful. OpenCLer's should love this platform, it's great stuff.


RE:
1) I think you missed his point. His point wasn't that the Athlon X4 750K was a better CPU; his point was that the APU isn't cost-effective, and therefore it's silly to go with one if you intend to get a discrete card. He's right, by the way. Let me demonstrate using the lowest chip/GPU prices on PCPartPicker:

5800k + 6770 - $171.63
6800k + 6770 - $198.95
5800k + 7750 - $202.43
6800k + 7750 - $229.73

Every single one of the following combos would outperform EITHER of the dual-graphics combos above.
g2020 + 7770GE - $131.99
i3-3210+ 7770GE - $189.09
Athon II + 7770GE - $159.28
Ph II x4 + 7770GE - $170.98
FX4300+ 7770GE - $193.96
FX6300+ 7770GE - $197.97

THAT'S why you don't buy an APU if you are planning to get a discrete GPU. The $$ just doesn't work out; it's not cost-effective at all.

*For reference, here are the prices:
HD 6770 1GB DDR3 - $48.98
HD 7750 4GB DDR3 - $79.76
HD 7770GE 1gb ddr5 - $79.99
a10-5800k - $122.68
a10-6800k - $149.97
Pentium G2020 - $52.00
Core i3-3210 - $109.10
Athlon II x4 750k - $79.29
Phenom II x4 965be - $90.98
FX 4300 - $113.97
FX 6300 - $117.98
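Plugging those quoted prices into a quick script confirms the ordering (the prices are the snapshot above and will drift; a couple of the listed totals differ by a cent or three from these sums, presumably from price changes between lookups):

```python
# Part prices as quoted above (USD).
cpus = {"A10-5800K": 122.68, "A10-6800K": 149.97, "G2020": 52.00,
        "i3-3210": 109.10, "Athlon X4 750K": 79.29,
        "Phenom II X4 965BE": 90.98, "FX-4300": 113.97, "FX-6300": 117.98}
gpus = {"HD 6770 DDR3": 48.98, "HD 7750 DDR3": 79.76, "HD 7770 GHz": 79.99}

# Dual-graphics pairings vs. plain CPU + HD 7770 GHz pairings.
dgm = [(apu, gpu) for apu in ("A10-5800K", "A10-6800K")
       for gpu in ("HD 6770 DDR3", "HD 7750 DDR3")]
discrete = [(cpu, "HD 7770 GHz") for cpu in cpus if not cpu.startswith("A10")]

for cpu, gpu in dgm + discrete:
    print(f"{cpu} + {gpu}: ${cpus[cpu] + gpus[gpu]:.2f}")
# Every discrete combo undercuts both A10-6800K pairings; the cheapest
# overall is G2020 + HD 7770 GHz at $131.99.
```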


----------



## DaveLT

But did you know that the 7750 is available in half-height form? The 7770 isn't.







In terms of power density, 7750 + 6800K is utterly insane.
And 7770 + A10-6800K is an insane combo too. No way in hell will the 7770 alone be more powerful.


----------



## MrJava

Just imagine what mini-ITX Kaveri rigs (w/ 7750, 9750) will be like.








Quote:


> Originally Posted by *peter-mafia*
> 
> So I"ve checked the 3dMark 11 database
> Phenom 965 + 7770 score 3700 on average
> http://www.3dmark.com/3dm11/3198331
> On the other hand my mini-ITX rig including the A10-6800K @ 4400/1169; ddr3 2400: CPU NB 2400: LP VGA powercolor 7750 gddr5 @865/1250: ssd OCZ vertex plus r2 120 scores 4231
> http://www.3dmark.com/3dm11/6820113
> 
> In the Tomb Raider benchmark I got 30fps (lowest) /51fps (highest)/ 39fps (average) on Ultra settings 1080p. Catalyst 13.4. Highly doubt, a single 7770 will be able to do that
> The difference is huge. And look at the size of my rig.


----------



## nitrubbb

Would really like mini-ITX with Kaveri this year :-|

or laptop


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> 1) Not true: the A10-6800k is a good processor PERIOD.
> 2) Sure, for dual graphics. but if you're willing to abandon that and go to 7800/7900 series (which you have the time to save for because guess what? your iGPU can play things if you turn the options down), you'll get much, much better performance.
> 3) A "second display" is not eyefinity. "eyefinity" is a hard/soft blend that allows you to have up to 6 displays plugged in to 1 AMD display device. Part of eyefinity is SLS (single large surface) display which lets you use multiple monitors as one big monitor. If you have a mobo with at least one displayport, you can get eyefinity SLS on just the APU (which I did for a while and it was sweet, though it stuttered a bit--but whaddyawant at 3x 1080p?). But if you've ever played with hetero programming, you know GPU cores are awesome and having a nice spread throughout your system is very useful. OpenCLer's should love this platform, it's great stuff.


1. I agree that it's a good chip, but it's not a good chip for a budget gaming platform. The iGPU creates a whole mess of extra, un-needed heat. It's better to save the $75 and get just the CPU if you're pairing it with a discrete card and plan to do some overclocking.
2. Why run the iGPU? There isn't a single game my HD 5870 can't handle. Why would I want to run the iGPU if I can max games like BF3 without a problem?
3. I meant to say dual display; I don't know where the Eyefinity came from.
4. I prefer using discrete for OpenCL; these Richland APUs don't make any difference. It's still hardware intercommunication, as there is no full HSA support. I can crunch a hundred times more with my HD 5870, which supports Compute, than the 8670D is capable of. You're better off waiting for HSA to come along before even diving into HSAIL. I myself plan on writing a few new applications to make use of it once it becomes widely available on the market.

Overall it's a great chip, though at this price point gamers should just buy the FX-6350, which will roflstomp this chip ten times over. APUs are really only a good value for the work office or someone who uses their machine for casual stuff (web surfing, messaging, maybe a little Word, and very light gaming). Like I said, it's not a good value for everyone; I would rather have the Athlon X4 750K so I could cut out the extra iGPU heat (especially since I am not using it). It's cheaper and can maintain a higher overclock at lower temperatures.

A10-6800k + 2400 MHz Memory = Athlon X4 750k + HD 7770

You might as well go with plan B, as it will game a whole lot better than the 6800K ever could, and the two come to around the same exact price. The time to buy an APU is when Kaveri comes around: Steamroller + 512 GCN cores will make it the heart of budget gaming builds. Plus it will have full HSA support, so it will actually have an edge over OpenCL with a discrete card.


----------



## s33dless

It's only not cost-effective if you don't use the iGPU, like I've said before. And as I've also said before, in an OpenCL context, the more stream processors you have, the better. If I can find the hardware, I'll run some computations on the FX-6300 and the 6800K's GPU and post results. I bet that even with its six cores, the 6800K will smoke it.

But if all you're going to do is game on one screen, then certainly it's not the best way to go. To be frank, though, I don't see why you would build a non-esoteric system with this. I know it's marketed as "budget gaming", but the specs told me otherwise. This thing is a workhorse if you can tap it properly; understandably, though, esoteric setups aren't for everyone, so most people will just turn the iGPU off.

Buying a dGPU does not make buying an A10 non-cost-effective. Buying an APU, then a dGPU, and _TURNING THE iGPU OFF_ makes it a waste. And if you're going super-multi-monitor, having an APU will get you there much, much faster and cheaper than the other options (because more displays = more cards in that situation).

EDIT: just saw the reply above. Of course the dGPU hauls more. But CPU + dGPU doesn't beat CPU + dGPU + iGPU. Now, heterogeneous support just means more work on your end for a workflow, but I'm willing to deal with that; I like building my own implementations most of the time anyway (it takes more time, but it's guaranteed to work the way you want). Using the MATLAB OpenCL toolkit now lets me run my simulations while I game, without bogging down the main processor cores or the dGPU.


----------



## Stormscion

LOL, no way in hell is the A10-6800K's iGPU better than a 7770. I have mine overclocked @ 1100 core / 1500 memory and it is very, very powerful. The APU is great and all, but it still can't reach that performance.

ALSO, DDR3-2400 RAM costs almost as much as a 7770, or as that APU alone, and it will still have far lower bandwidth than GDDR5.
In terms of CPU performance, the 750K is basically a 5800K, and the 6800K is a higher-clocked 5800K, so they are quite similar.

Until they get HSA enabled on the dedicated Radeon/FirePro GPUs (and that is planned as well!), APU + dedicated GPU has a worse price-to-performance ratio than just CPU + dedicated GPU. Of course there are upsides to APU + dGPU, such as higher failure tolerance: if your dGPU dies, you are still left with a very capable machine. And you can also have tons of monitors and other good stuff.

But still, from a pure price-to-performance standpoint, I would argue it comes out worse.

Also, TBH, I would not bother with FM2; just get FM2+ with the new chipsets. I think AMD will make sure the new FM2+ motherboards get the features it has been lacking compared to Intel.


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> it's only not cost effective if you don't use the iGPU, like i've said before. and as i've also said before, in an OpenCL context the more stream processors you have, the better. If i can find the hardware, I'll run some computations on the FX6300 and the 6800K GPU and post results. I bet you even with the 6 cores, the 6800K will smoke it.
> 
> but if all you're going to do is game on one screen, then certainly it's not the best way to go. but to be frank, i don't see why you would build a non-esoteric system with this. i know it's marketed as "budget gaming", but the specs told me otherwise. this thing is a workhorse if you can tap it properly, but understandably, esoteric setups aren't for everyone so they'll just turn the GPU off.
> 
> buying a dGPU does not make buying an a10 non-cost effective. buying an apu, then a dGPU and _TURNING THE iGPU OFF_ makes it a waste. and if you're going super-multi monitor, having an APU will get you there much, much faster and cheaper than the other stuff will (because more displays = more cards in that situation).


I think you're missing the point: I bet the 7770, with its GCN architecture packing 640 ALUs, will smoke the 384 VLIW4 shaders inside the A10-6800K. If anyone here has a 7770 GHz Edition, they can run the same tests as yours, and I imagine the difference will be more than marginal. Meanwhile, the 750K + 7770 route would be the same price or cheaper (roughly $160, which is exactly what the A10-6800K costs alone). Don't get me wrong, AMD's APUs are great, but they aren't that great when it comes to custom builds other than ITX.

Also, you can't turn the iGPU off; even my Extreme6 doesn't have that ability. It will always be running, just never under load, sitting in a power-saving mode (and still consuming power). I can turn the core clock down to 400 MHz and lower the shared memory down to 32 MB, but there is no option to "disable" it in the BIOS.

And yes, CPU + dGPU does put a spanking on APU + DGM. You can't tell me your A10-6800K running in DGM with a 7750 will spank my HD 5870. Heck, you can buy one of these cards brand new on eBay for under $99. Spend the extra $10 and go from the HD 7770 to the HD 5870, which competes with the HD 7850.

I don't game while I code, and vice versa, as that's just counterproductive. I also don't need to run simulations of anything; when I work on something, I just debug it myself in real time. If I need to test whether something is stable, I run it overnight; no reason for it to cut into my own personal time. Any GPU-based calculation with OpenCL is still being queued by your serial processor anyway (that's how OpenCL works).

The 750K + HD 7770 is a better enthusiast platform than the A10-6800K + 2400 MHz memory. Just add a decent motherboard to the mix to overclock with, and you'll easily match and beat the A10-6800K in CPU performance; the 750K alone will do 4.5 GHz with ease. So sure, it would cost you more "my way" in the long run, but what's an extra $20-30 for a better motherboard? Chances are you'd be spending that on a better board anyway, even if you owned the A10-6800K. The same can be said about the cooling solution; most people will go aftermarket with a Hyper 212 or similar. So your wallet really isn't taking much of a hit at all, and meanwhile you're getting a better setup in the long run. And before you ask why I bought one: easy, I didn't.


----------



## s33dless

Ok, I think everybody's misunderstanding me here. I never said that I had a 7770, nor that A10-6800K can outperform an FX + 7770. I said an APU can smoke the reg processors if you're smart about offloading tasks to the iGPU. Of course, you take a hit because the memory here is much slower, being system memory, BUT you don't have to do PCIe transfers, which saves that whole situation. And again, **** the DGM, it's only a help in a slim set of cases. When I said "APU+iGPU+dGPU" I meant all 3 in parallel, not Dual Graphics Mode.
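For what it's worth, the transfer-saving argument is easy to ballpark. A rough sketch (nominal peak bandwidth figures, not measured numbers; the 256 MB buffer size is made up for illustration):

```python
# Back-of-envelope: moving a 256 MB work buffer to a discrete GPU over
# PCIe 2.0 x16 vs. the iGPU reading it in place from system RAM.
buffer_bytes = 256 * 1024**2
pcie2_x16_gbs = 8.0        # ~8 GB/s nominal peak for PCIe 2.0 x16
ddr3_2133_gbs = 34.1       # ~34 GB/s peak, dual-channel DDR3-2133

copy_ms = buffer_bytes / (pcie2_x16_gbs * 1e9) * 1e3
in_place_ms = buffer_bytes / (ddr3_2133_gbs * 1e9) * 1e3

print(f"PCIe copy: {copy_ms:.1f} ms, in-place read: {in_place_ms:.1f} ms")
```

It's a crude model (ignores latency, contention with the CPU, and the fact that a dGPU's VRAM is far faster once the data is there), but it shows why skipping the bus matters for workloads that bounce data back and forth.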

I have a 7970. Was it worth getting? According to the advice you've been giving everyone, the answer is always "no". I disagree. If you're willing to put in the work and come up with a scheme that uses the hardware, the answer is a resounding "yes". It all depends on your situation:
Quote:


> I don't game while I code, and vice-versa as that's just counter productive. I also don't need to run simulations of anything, as when I work on something I just debug it myself in real time.


Great, that's you. I run simulations because I also design hardware and do physical simulations (like i was planning an electric conversion on a classic car the other day, fun stuff but my starting torque never seemed to come out right). "Debugging in real-time" doesn't really mean anything in this context, and I don't want my computer to be basically unusable for what could be hours as it pounds through matrix after matrix after matrix.

If your set of circumstances is anything like mine, then it's in fact a much better idea than the other route. I wanted more screens to enhance my workflow (I can support 9 right now, working on buying more monitors), more stream processors to crunch all my matrix math, but admittedly, most of all, I wanted to play on a promising new architecture.

So, again, yes, if you're just trying to game, and then at that only on one screen, there are much more cost-effective routes.


----------



## nitrubbb

Quote:


> Originally Posted by *Stormscion*
> 
> ALso tbh i would not bother with FM2 just get FM2+ with new chipsets... i think AMD will make sure to get features it is lacking compared to the intel on there new FM2+ motherboards.


when will FM2+ mobos be available?


----------



## s33dless

Kaveri is supposed to launch by end of year from what I've heard, so the next few months should see that answered.

Anybody hear anything new about the future of FM2? There are rumors going around that AMD won't leave FM2 in the dust in favor of FM2+ and will just disable some features if you drop in an FM2+-native processor, but nothing official, and I put very little trust in internet rumors.


----------



## rppot

I'm aware that 6800K + another card isn't the most cost-effective option. I thought the performance of the iGPU on its own would do me fine, but it isn't enough. I haven't had the chance to use better RAM, so this is unfair on my part. (How much of a performance increase could I expect from 2400 MHz, by the way?)

It's just one notch below what I need, though still very usable, so now I want to fix that. I think 2400 MHz RAM could be an effective solution but it's about the same price as a 7750 HD card (I'm also looking at higher end cards to run on their own). I think the performance benefits from the RAM would be less and only really applicable to any APU builds in the future, so I am leaning towards adding a card.

PS: In regards to the Aero being missing, I think it has something to do with Catalyst Control Center and the overclocking tool. I am using the latest beta drivers.


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> Ok, I think everybody's misunderstanding me here. I never said that I had a 7770, nor that A10-6800K can outperform an FX + 7770. I said an APU can smoke the reg processors if you're smart about offloading tasks to the iGPU. Of course, you take a hit because the memory here is much slower, being system memory, BUT you don't have to do PCIe transfers, which saves that whole situation. And again, **** the DGM, it's only a help in a slim set of cases. When I said "APU+iGPU+dGPU" I meant all 3 in parallel, not Dual Graphics Mode.
> 
> I have a 7970. Was it worth getting? According to the advice you've been giving everyone, the answer is always "no". I disagree. If you're willing to put in the work and come up with a scheme that uses the hardware, the answer is a resounding "yes". It all depends on your situation:
> Great, that's you. I run simulations because I also design hardware and do physical simulations (like i was planning an electric conversion on a classic car the other day, fun stuff but my starting torque never seemed to come out right). "Debugging in real-time" doesn't really mean anything in this context, and I don't want my computer to be basically unusable for what could be hours as it pounds through matrix after matrix after matrix.
> 
> If your set of circumstances is anything like mine, then it's in fact a much better idea than the other route. I wanted more screens to enhance my workflow (I can support 9 right now, working on buying more monitors), more stream processors to crunch all my matrix math, but admittedly, most of all, I wanted to play on a promising new architecture.
> 
> So, again, yes, if you're just trying to game, and then at that only on one screen, there are much more cost-effective routes.


I never said that the HD 7970 isn't a good buy, though it's not an "excellent" buy if you're looking to pair it with an A10-6800K for gaming. I can provide proof of that, as this APU even bottlenecks my HD 5870: I'm only getting roughly 50-60% utilization in-game on several titles. That's a CPU bottleneck; I bet if I cranked it up to 5.0 GHz that bottleneck would be nearly non-existent, though I'm not comfortable at such high volts, and it's not a 24/7-safe overclock. The only reason to go that big on a card is if you use it for compute or other related tasks; otherwise it's never going to run at 99% load in most games. If it bottlenecks my card, it's going to bottleneck your card twice as badly. Does that mean games are unplayable? Absolutely not, I still get 40-60 FPS in BC2 completely maxed out @ 1600x900, though I really should be getting 60-120+ at 99% utilization. If you need the card for number crunching, that's all good, though I have no idea why one would buy a $500 GPU to put into a $300 rig. If it floats your boat then so be it, but I wouldn't go around recommending a 7970 to everyone running this APU. It's a waste for gaming, as you can get the same exact performance out of an HD 7870 (still bottlenecked by this APU).
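A crude way to see what that 50-60% utilization implies, assuming frame rate scales linearly with GPU load (a simplification that ignores frame pacing and engine caps; the example numbers are illustrative):

```python
def gpu_limited_fps(observed_fps, gpu_utilization):
    """Estimate the frame rate the card could deliver at full load,
    if the CPU could feed it frames fast enough."""
    return observed_fps / gpu_utilization

# e.g. 50 FPS at ~55% GPU load suggests the card itself has ~90 FPS in it
print(round(gpu_limited_fps(50, 0.55)))
```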
Quote:


> Originally Posted by *s33dless*
> 
> Kaveri is supposed to launch by end of year from what I've heard, so the next few months should see that answered.
> 
> Anybody hear anything new about the future of FM2? There are rumors going about that AMD won't leave FM2 in the dust in favor of FM2+ and just nuke some features if you shove an FM2+ native processor on, but nothing official, and my trust in internet rumor is very little.


It's rumored that FM2+ will be a completely new socket with two extra pins (no backwards compatibility).
Quote:


> Originally Posted by *rppot*
> 
> I'm aware that 6800K + another card isn't the most cost effective option. I thought the performance of the iGPU on its own would do me fine, but it isn't enough. I haven't had the chance to use better RAM so this is unfair (How much of a performance increase could I expect using 2400 MHz, by the way?)
> 
> It's just one notch below what I need, though still very usable, so now I want to fix that. I think 2400 MHz RAM could be an effective solution but it's about the same price as a 7750 HD card (I'm also looking at higher end cards to run on their own). I think the performance benefits from the RAM would be less and only really applicable to any APU builds in the future, so I am leaning towards adding a card.
> 
> PS: In regards to the Aero being missing, I think it has something to do with Catalyst Control Center and the overclocking tool. I am using the latest beta drivers.


I ran a benchmark testing your theory in Tomb Raider a few weeks ago. Memory speed scales twice as well as core clock: for every bump in core speed, you'll get twice the performance out of a single bump in memory speed (e.g. 1866 -> 2133). Even with 2400 MHz memory I wouldn't have high expectations, though if you overclock the iGPU core to 1086 MHz along with it, you should be able to run games like BF3 at playable frame rates. Still, like I said in a previous post, you can grab an HD 7770 GHz Edition card for what an HD 7750 costs. A 7770 GHz Edition overclocked to 1100/5000 will just murder any DGM setup you can put together.
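The reason memory speed matters so much here is raw bandwidth: the iGPU has no dedicated VRAM, so its framebuffer traffic shares the dual-channel DDR3 bus with the CPU. The peak numbers at the speed grades discussed work out like this (theoretical peaks; real-world throughput is lower):

```python
def ddr3_peak_gbs(mt_per_s, channels=2, bus_bytes=8):
    """Theoretical peak DDR3 bandwidth: transfer rate x 64-bit bus
    width (8 bytes) x number of channels."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

for speed in (1866, 2133, 2400):
    print(f"DDR3-{speed}: {ddr3_peak_gbs(speed):.1f} GB/s")
```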


----------



## azanimefan

Quote:


> Originally Posted by *peter-mafia*
> 
> So I"ve checked the 3dMark 11 database
> Phenom 965 + 7770 score 3700 on average
> http://www.3dmark.com/3dm11/3198331
> On the other hand my mini-ITX rig including the A10-6800K @ 4400/1169; ddr3 2400: CPU NB 2400: LP VGA powercolor 7750 gddr5 @865/1250: ssd OCZ vertex plus r2 120 scores 4231
> http://www.3dmark.com/3dm11/6820113
> 
> In the Tomb Raider benchmark I got 30fps (lowest) /51fps (highest)/ 39fps (average) on Ultra settings 1080p. Catalyst 13.4. Highly doubt, a single 7770 will be able to do that
> The difference is huge. And look at the size of my rig.


According to the database score you linked to prove you scored better than a Phenom II X4 965 + 7770, your *A10-6800K is running 3 7750s in CrossFire*. That's not an A10 in dual graphics with a 7750. What kind of game are you playing? Are you linking to someone else's score and claiming it as your own? Or are you lying about your rig?

Because that's not the rig you are claiming you own; please clarify the mistake. (I got suspicious when I searched for PhII X4 965BE + 7770GE scores and saw that even those with two 7770s in CrossFire weren't touching that claimed 4231 score. That's when I looked closer at your machine in the bench... it's three CrossFired 7750s, not a dual graphics setup (which I still don't think can work).)

BTW: that system you linked to isn't running its RAM at your claimed 2400 MHz; rather it's running at 1600 MHz... Finally, not a single overclocked 6800K + 7750 comes close to the highest-scored stock 965 + 7770 in any of those benches (to say nothing of the overclocked results). So even if dual graphics is working on any of those, it still isn't a match for a 965 + 7770.

_Checkmate_


----------



## DaveLT

Quote:


> Originally Posted by *azanimefan*
> 
> according to the database score you linked to prove you scored better then a phiix4965 +7770 your *a10-6800k is running 3 7750s in xfire*. That's not a a10 in dual graphics with a 7750. What type of game are you playing? Are you linking to someone's score and claiming it as your own? Or are you lying about your rig?
> 
> Because that's not the rig you are claiming you own. please clarify the mistake. (i got suspicious when i searched for phIIx4 965be + 7770ge scores, and saw even those with 2 7770 in xfire weren't touching that claimed 4231 score. That's when i looked closer at your machine in the bench... it's a x3 xfired 7750. not a dual graphics setup (which i still don't think can work))
> 
> btw: that system you linked to isn't running it's ram at your claimed 2400mhz, rather it's running it at 1600mhz... finally not a single overclocked 6800k + 7750 comes close to the highest scored stock 965+7770 in any of those benches (to say nothing for the overclocked results). so even if dual graphics is working on any of those, it still isn't a match for a 965+7770.
> 
> _Checkmate_


Wait 3x7750s? Dude. There are only 1 Xfire connector on ALL 7750s


----------



## Stormscion

I don't know, but my FX-8120 with a 7770 plays Crysis 3 at 1080p medium over 35 FPS average, and FC3 Blood Dragon on high settings at 45 FPS average. Though both are overclocked (4 GHz CPU and 1100/1500 GPU).

It would be pretty impressive to see an APU with just the iGPU do that. I think it is impossible.


----------



## azanimefan

Quote:


> Originally Posted by *DaveLT*
> 
> Wait 3x7750s? Dude. There are only 1 Xfire connector on ALL 7750s


7750s are slow enough not to need CrossFire bridges; they can CrossFire through a CrossFire-capable motherboard. BTW: not a single 7750 has a CrossFire bridge link port. They can only CrossFire through the motherboard.


----------



## DaveLT

Quote:


> Originally Posted by *azanimefan*
> 
> 7750s are slow enough to not need xfire bridges.. they can xfire through a xfire capable motherboard. btw: not a single 7750 has a xfire bridge link port. They only can xfire through the motherboard.


But surely can they even do 3x crossfire?


----------



## s33dless

Quote:


> Originally Posted by *Opcode*
> 
> If it floats your boat than so be it, tho I wouldn't go around recommending a 7970 to *everyone* running this APU.


Right, but I'm saying you shouldn't be recommending it to nobody. There are sets of conditions where it works. Single monitor 1080p is not one of them, sure, but I think other people out there after crazier setups should definitely experiment with this.

I mean, could you imagine a server mobo running a bunch of these? It would be so much cheaper to get an effective cluster going, crunch numbers all day and night.


----------



## DaveLT

Quote:


> Originally Posted by *s33dless*
> 
> Right, but I'm saying you shouldn't be recommending it to nobody. There are sets of conditions where it works. Single monitor 1080p is not one of them, sure, but I think other people out there after crazier setups should definitely experiment with this.
> 
> I mean, could you imagine a server mobo running a bunch of these? It would be so much cheaper to get an effective cluster going, crunch numbers all day and night.


At only 100W TDP each, i don't see why AMD shouldn't turn the APU platform into a quad-APU platform








Totally kickass for the amount of computing horsepower combined once HSA arrives with Kaveri


----------



## s33dless

Quote:


> Originally Posted by *DaveLT*
> 
> At only 100W TDP each, i don't see why AMD shouldn't turn the APU platform into a quad-APU platform
> 
> 
> 
> 
> 
> 
> 
> 
> Totally kickass for the amount of computing horsepower combined once HSA arrives with Kaveri


Yeah, no joke. I heard rumors that AMD is going to slowly kill off the other architectures and go all-APU. I think that's mighty weird, not to mention not smart, but I think more APUs is always a good idea. Notebooks especially have a lot to gain from this platform as far as consumers go, but the avenues you can take this are ridiculous.


----------



## azanimefan

Quote:


> Originally Posted by *DaveLT*
> 
> But surely can they even do 3x crossfire?


If the motherboard chipset allows it... sure, I don't see why not. There are 3x and 4x CrossFire 7750 setups on that benching site putting up numbers that suggest 3x and 4x CrossFire is working properly.


----------



## s33dless

I don't know how much progress has been made, but when this SLI/CrossFire business started, I got really excited only to discover that two cards as one only gave you 1.5x performance max (and god only knows what metric they used for "performance"). I was turned off by that: why pay 2x for 1.5x?

I've had a distaste for it ever since and never really looked into it after that, because even the Folding@home crowd and the other compute-heavy guys say two single cards are better for number crunching than two bridged (CrossFire/SLI/whatever they poop out next).

Gamers seem to love it though. How much has the architecture really changed? (I noticed they called it a few different things as time progressed too, but I dismissed that as marketing mumbo-jumbo.)

EDIT:
Looks like they did indeed step their game up while I wasn't looking. Not forcing the exact same GPU (unlike NVIDIA) is pretty sweet, and apparently you can top out at 2x "performance" (again, no metrics...), which is way better than before. Still wouldn't use it for straight-up compute, but gaming/movies must get a mad boost (driver issues and micro-stutter aside).
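The "why pay 2x for 1.5x" complaint is really a performance-per-dollar argument; a quick sketch with illustrative numbers (the card price is hypothetical, the scaling factors are the ones mentioned above):

```python
def perf_per_dollar(n_cards, card_price, scaling):
    """scaling = performance of n cards relative to one (1.5 = 150%)."""
    return scaling / (n_cards * card_price)

single = perf_per_dollar(1, 80.0, 1.0)       # one card, baseline
dual_old = perf_per_dollar(2, 80.0, 1.5)     # older ~1.5x scaling
dual_ideal = perf_per_dollar(2, 80.0, 2.0)   # ideal ~2x scaling

print(single, dual_old, dual_ideal)
```

At 1.5x scaling, two cards deliver only 75% of the value per dollar of one card; at ideal 2x scaling they break even, and the only reason left to do it is absolute performance, not value.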


----------



## peter-mafia

Quote:


> Originally Posted by *azanimefan*
> 
> according to the database score you linked to prove you scored better then a phiix4965 +7770 your *a10-6800k is running 3 7750s in xfire*. That's not a a10 in dual graphics with a 7750. What type of game are you playing? Are you linking to someone's score and claiming it as your own? Or are you lying about your rig?
> 
> Because that's not the rig you are claiming you own. please clarify the mistake. (i got suspicious when i searched for phIIx4 965be + 7770ge scores, and saw even those with 2 7770 in xfire weren't touching that claimed 4231 score. That's when i looked closer at your machine in the bench... it's a x3 xfired 7750. not a dual graphics setup (which i still don't think can work))
> 
> btw: that system you linked to isn't running it's ram at your claimed 2400mhz, rather it's running it at 1600mhz... finally not a single overclocked 6800k + 7750 comes close to the highest scored stock 965+7770 in any of those benches (to say nothing for the overclocked results). so even if dual graphics is working on any of those, it still isn't a match for a 965+7770.
> 
> _Checkmate_


You made me laugh.
Scroll *my* results page down and you'll see *ASRock FM2A85X-ITX*.
Still don't get it? It's a mini-ITX board with only ONE PCIe slot. The memory is running at 2400 MHz. I did a review of that board and memory on Newegg.
You are just embarrassing yourself

Saw the "checkmate" thing and laughed again. I need an ambulance


----------



## peter-mafia

1080p ultra
Highly doubt the 7770ge is even close


----------



## Mopar63

Quote:


> Originally Posted by *s33dless*
> 
> Yeah, no joke. I heard rumors that AMD is going to slowly kill off the other architectures and go all-APU. I think that's mighty weird, not to mention not smart, but I think more APUs is always a good idea. Notebooks especially have a lot to gain from this platform as far as consumers go, but the avenues you can take this are ridiculous.


The truth is it's very smart; the APU is the next evolution of the CPU. Look at the CPU pre-FPU: no one thought people needed the FPU, and when it was released on a single chip most people thought it a waste. Today our programs would run like crap without that FPU directly on the chip.

The parallel co-processor of the APU is the new FPU. This will eventually be something everyone uses, and AMD is ahead of the curve on it.

The APU is not, as too many people think, about graphics. In the long run it is about a parallel co-processor in every pot.


----------



## darkusx45

Quote:


> Originally Posted by *peter-mafia*
> 
> 
> 1080p ultra
> Highly doubt the 7770ge is even close


What HD 7750 version is that card? I'm looking forward to getting the same setup.


----------



## Opcode

Quote:


> Originally Posted by *peter-mafia*
> 
> 
> 1080p ultra
> Highly doubt the 7770ge is even close


I get that with just the iGPU at normal settings @ 1600x900. Also, you've got two different backgrounds and two active windows at the same time, which is not possible. The screenshot is all copy-and-paste, so it's hard to validate the results.


----------



## Stormscion

Quote:


> Originally Posted by *Opcode*
> 
> Quote:
> 
> 
> 
> Originally Posted by *peter-mafia*
> 
> 
> 1080p ultra
> Highly doubt the 7770ge is even close
> 
> 
> 
> I get that with just the iGPU at normal settings @ 1600x900. Also you got two different backgrounds and two active windows at the same time which is not possible. The screen is all copy and paste so its hard to validate the results.
Click to expand...

You must be pretty delusional to think that the iGPU is stronger than a 7770.


----------



## DaveLT

Quote:


> Originally Posted by *Stormscion*
> 
> You must be pretty delusional to think that the iGPU is stronger than a 7770.


That's spot on







Although he said 1600x900 but no idea what the settings were either


----------



## malmental

Quick question, fellas.
The A10-6800K can be CrossFired with which GPUs, any besides the HD 6670?
Also, are there any hacks to allow more if the 6670 is the only one?
The net is so vague sometimes, and I thought you guys would know anyway.

thanks.


----------



## azanimefan

Quote:


> Originally Posted by *DaveLT*
> 
> That's spot on
> 
> 
> 
> 
> 
> 
> 
> 
> Although he said 1600x900 but no idea what the settings were either


Unfortunately for him, I have Tomb Raider, and I have a screenie at 1080p, same bench. The trick is that his graphics settings aren't displayed, but just looking at the difference in picture resolution you can see he's not playing at 1080p.

That said, I played with the settings too. You see, I thought the picture quality in his photo was screwy... something was off for sure; at least it didn't look like my version of Tomb Raider. So I played a little with the graphics settings till I got close. Turns out if you turn off Tessellation and High Precision, you get a picture that looks pretty much identical to his (and a bump in FPS too). See, my 7770GE benches about the same as every other 7770GE on Tomb Raider, ultra quality, 1080p; that is, I average low 30s in FPS. But by turning both those settings off I got a 10 FPS bump on the spot and a much more impressive screenie to show off around here.

Picture taken with FRAPS (FPS at time of photo shown in yellow in the top corner), 965BE overclocked to 3.8 GHz, 7770GE at stock. I backed the overclock off the GPU yesterday because of a tech issue I was trying to iron out (unrelated to the GPU). I figured the issue out (it was a weird short/power fluctuation causing my GPU to not wake from sleep mode, or my H100 to turn off; THAT was alarming) when I found an unrelated fan's plug had been loosened on the motherboard. I never would have guessed that would cause a GPU/H100 issue, but once the plug was fixed the problems vanished. Until now I forgot to turn Afterburner back on, so the picture was taken with the GPU at stock settings.


----------



## akromatic

Quote:


> Originally Posted by *azanimefan*
> 
> RE:
> 1) I think you missed his point. his point wasn't that the athlon II x4 750k was a better cpu; his point was it wasn't cost effective, and therefore it was silly to go with an APU if you intended to get a discrete card. he's right by the way. let me demonstrate using the lowest chip/gpu prices on pcpartpicker
> 
> 5800k + 6770 - $171.63
> 6800k + 6770 - $198.95
> 5800k + 7750 - $202.43
> 6800k + 7750 - $229.73
> 
> Every single ones of the following combos would out perform EITHER of the dual graphics combos above.
> g2020 + 7770GE - $131.99
> i3-3210+ 7770GE - $189.09
> Athon II + 7770GE - $159.28
> Ph II x4 + 7770GE - $170.98
> FX4300+ 7770GE - $193.96
> FX6300+ 7770GE - $197.97
> 
> THATS why you don't buy an APU if you are planning to get a discrete gpu. the $$ just doesn't work out right. it's not cost effective at all.
> 
> *for reference, here are the prices
> HD 6770 1gb ddr3 - $48.98
> HE 7750 4gb ddr3 - $79.76
> HD 7770GE 1gb ddr5 - $79.99
> a10-5800k - $122.68
> a10-6800k - $149.97
> Pentium G2020 - $52.00
> Core i3-3210 - $109.10
> Athlon II x4 750k - $79.29
> Phenom II x4 965be - $90.98
> FX 4300 - $113.97
> FX 6300 - $117.98


But the APU is a good chip that gets your foot in the door when you're tight on cash; you can drop in a card any time later when it's affordable, so it's not silly to get an APU with discrete graphics.

Besides, an A8/A10 costs as much as an i3 and performs on par, which makes it a valid competitor on the CPU side alone.

The only issue with CFX mode is that not all games like it.
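azanimefan's combo totals above are just component sums, so the comparison is easy to reproduce and re-sort as prices move. A sketch using the prices quoted in the thread (mid-2013 USD, 7770 GHz Edition pairings only, ignoring shipping):

```python
# Pair each quoted CPU price with the 7770 GHz Edition and sort by total.
cpus = {
    "Pentium G2020": 52.00,
    "Athlon II X4 750K": 79.29,
    "Phenom II X4 965BE": 90.98,
    "Core i3-3210": 109.10,
    "FX-4300": 113.97,
    "FX-6300": 117.98,
}
gpu_7770ge = 79.99

for total, name in sorted((p + gpu_7770ge, n) for n, p in cpus.items()):
    print(f"{name} + 7770GE: ${total:.2f}")
```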


----------



## Opcode

Quote:


> Originally Posted by *Stormscion*
> 
> You must be pretty delusional to think that the iGPU is stronger than a 7770.


Quote:


> Originally Posted by *DaveLT*
> 
> That's spot on
> 
> 
> 
> 
> 
> 
> 
> Although he said 1600x900 but no idea what the settings were either


No one knows how to read?







Quote:


> Originally Posted by *Opcode*
> 
> I get that with just the iGPU at *normal settings* @ 1600x900. Also you got two different backgrounds and two active windows at the same time which is not possible. The screen is all copy and paste so its hard to validate the results.


----------



## azanimefan

BTW: this is Tomb Raider at 1080p, ultra settings, no games played with the graphics settings (not sure why, but my FPS starts at 2 or 3 on the first frame of the bench on ultra; at no other time in the bench are the frames that low), matching the screenie as well as I can. Figured I'd upload this picture so you can compare.





As you can see, Lara is a bit wider in 1080p than in the previous screenie... leading me to doubt he's playing at 1080p. Note the light bloom on the metal behind her; it's different from the one in his screenie. The light bloom on the metal in his screen is much closer to the one in my first shot with the settings played with, leading me to believe he played with the settings a bit too.

Additionally, look at the fingers in the 1080p version I posted and the one he did; the difference in texture quality is plain. He turned down the texture quality as well, I think (_we can't tell for certain, he might have taken a low-res screenie for all I know; I just have trouble believing he's getting 7850 performance out of his 6800K + 7750, because you'd need a 7850 to pull those kinds of FPS from this game on ultra_).


----------



## Stormscion

Quote:


> Originally Posted by *akromatic*
> 
> but the APU is a good chip that gets your foot in the door when you are tight on cash and drop in a card any other time when affordable so its not silly to get an APU with discrete graphics
> 
> besides an A8/10 costs as much as an i3 and performs on par which makes is a valid competitor on the CPU bit alone
> 
> the only issue with CFx mode is that not all games like it


Nobody is denying that it's a good buy if you want to use the system now and get a stronger GPU later on a tight budget.









The APU is strong indeed! But not as strong as a 7770GE... yet! Maybe with Kaveri.









It certainly allows for interesting possibilities. Unfortunately AMD is trolling us and not releasing an APU with more cores... apparently Kaveri will be 2M/4C as well, and that is sad, tbh. I would get an APU over an FX if it had 8 cores and a great iGPU on top of it. That would be a monster chip.


----------



## s33dless

Quote:


> Originally Posted by *Mopar63*
> 
> The APU is not, as too many people think, about graphics. In the long run it is about a parallel co-processor in every pot.


THIS. This is why buying the APU and a discrete card is not always a bad idea, and it's what I've been trying to communicate in this thread. Not only do you have a new processor in parallel, it's a bunch of _stream processors_, which excel at certain tasks, AND it's not subject to the PCIe bandwidth limit. I know what CPU lag looks like, and I haven't seen any on any A10 + HD 7900 system (some reviewers actually used that as a test bed).

I also brought up the time angle. Getting an APU buys you lots of time: you can game at 1080p with good settings at good frame rates (not excellent on either unless you go 720p), deal with it for a few weeks, and drop a high-end card in later.


----------



## peter-mafia

Quote:


> Originally Posted by *darkusx45*
> 
> What HD 7750 version is that card? I'm looking forward to getting the same setup.


It's the Powercolor 7750 LP with a modified cooler. Slightly OC 865/1250. Stays cool. If you don't wanna modify it then buy the Sapphire LP

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131480

Quote:


> Originally Posted by *Opcode*
> 
> I get that with just the iGPU at normal settings @ 1600x900. Also you got two different backgrounds and two active windows at the same time which is not possible. The screen is all copy and paste so its hard to validate the results.


Finish the test. *Press ALT+Enter* . Get surprised
Quote:


> Originally Posted by *azanimefan*
> 
> btw: this is tombraider at 1080p, ultra settings, no games played with the graphic settings (not sure why but my fps starts at 2 or 3 on the first frame of the bench when on ultra, at no other time in the bench is the frames that low), matching the screenie as good as i can. figured i'd upload this picture so you can compare.
> 
> 
> as you can see laura is a bit wider in 1080p then in the previous screenie... leading me to doubt he's playing at 1080p. note the light bloom on the metal behind her. it's different from the one in his screenie... the light bloom on the metal in his screen is much closer to the one in my first one with some settings played with. Leading me to believe he played with the settings a bit too.


I told you that card sucked. The newer the game (DX11), the bigger the difference will be.

OK. Seriously, fellas. I don't have much time, but you are making me look like a liar. Especially for ya:


----------



## Mopar63

Quote:


> Originally Posted by *s33dless*
> 
> THIS. This is why buying the APU and a discrete card is not always a bad idea. This is what I've been trying to communicate on this thread. Not only do you have a new processor in parallel, it's a nuch of _stream procesors_, which excel at some tasks AND it's not subject to the PCIe bandwidth limit. i know what CPU lag looks like, and i haven't seen any on any A10 + HD7900 system (some reviewers actually used that as a test bed).
> 
> I also brought up the time thing as well. Getting an APU buys you lots of time. You can 1080p game with good settings at good frame rates (not excellent on either unless you go 720p), but deal with it for a few weeks and drop a high end card in later.


There is, however, currently a problem: the current APUs turn off the parallel co-processor if a video card bigger than a 6670 is detected. The good news is that with the new chips due at the end of this year, this flaw is fixed.

AMD needs to pull focus from the FX lineup and put that research money and effort into the APU design. Intel has already moved their entire CPU lineup, with the small exception of the Extreme parts, to onboard graphics. Why? It is not because the i7 is being used with onboard graphics; it is because that graphics processor can be used to handle parallel work for the CPU without taxing the discrete graphics. They already see which way the wind is blowing.

AMD and their APU design have a pretty solid lead on the parallel processing side. Okay, they are kicking Intel's ass there, but the issue for them is that they are behind in CPU core design.


----------



## DaveLT

It will be a long time before Intel starts using their terribly weak iGPU (the GFLOPS on Intel IGPs are really low!) for parallel computing; as I see it, they may never do it, since all they want is to provide a low-power GPU onboard that saves cost (not really, though, considering the prices Intel charges...) for mobile platforms.
Sadly for the HD 4600, AMD has the lead in idle power consumption. AMD has ATI, which has been researching ways to save GPU power for years...
(Otherwise, Broadwell-D has been cancelled)


----------



## s33dless

Quote:


> Originally Posted by *Mopar63*
> 
> There is however currently a problem, the current APUs turn off the parallel co-processor if a video card bigger than a 6670 is detected. The good news is with the new chips due the end of this year, this flaw is fixed.


You can force it on. There should be a BIOS option, right where you choose your primary display device (PCIe/iGPU). I set PCIe to main and iGPU to force, and voila! extra screens and floating point operations! yay~


----------



## tuffy12345

Ok, I've got a 6800K that I really love and I'm about to order a 7750 within the next week or so. Am I good to go, or do I need to get the 6670? For crossfire or whatever they're calling it these days. I've got the ASrock ITX mobo. I went back about 7 pages, and saw the UF guy defending his crossfire but no one seems to believe him?


----------



## azanimefan

Quote:


> Originally Posted by *peter-mafia*
> 
> It's the Powercolor 7750 LP with a modified cooler. Slightly OC 865/1250. Stays cool. If you don't wanna modify it then buy the Sapphire LP
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131480
> Finish the test. *Press ALT+Enter* . Get surprised
> I told you that card sucked. The newer the game (DX 11) the bigger the difference will; be.
> 
> OK. Seriously, fellas. I don't have much time, but you are making me look like a liar. Specially for ya:


I'm bowing out of this argument with an apology to you. My apologies. Apparently your machine breaks every benchmark, review site, and tech barrier known to man. Apparently a 6800K + 7750 = better than i5 + 7850 GPU performance. Learn something new every day, I guess. If you're pulling our leg and playing games with the video, that's on you. I have my issues with it and the screenies, but nothing I can substantiate, so further accusations would be simply sour grapes and poor form.

So I'll officially apologize. Not sure what you needed to do to squeeze that performance out of that machine, as it's completely contrary to my understanding of computer electronics, but in this case I'm not ashamed to ask "how?" How did you make your setup blow up all the benches? How did you take your setup to 2+ times the performance it should be giving out? I want to know, because your numbers are so significantly beyond anyone else's with the same setup (by 20-50%) that you must have figured out some hardware combo, or some type of overclock, to push your system well beyond what it should be capable of. I mean, you're getting numbers I'd expect to see out of an i5 + 7850 right now. And as an overclocker my whole adult life, I gotta ask: how?

Because I'd love to take what I learn and apply it to something down the road, to pull more out of tech than you're supposed to. That's what overclocking is about, for the most part.


----------



## Opcode

Quote:


> Originally Posted by *Stormscion*
> 
> nobody is denying that it is good buy if you want to use system and get stronger GPU later on if you are on the tight budget
> 
> 
> 
> 
> 
> 
> 
> 
> 
> APU is strong indeed! But not as strong as 7770ge...yet! Maybe with kaveri
> 
> 
> 
> 
> 
> 
> 
> 
> 
> certainly it allows for interesting possibilities. Unfortunately AMD is trolling us and not releasing APU with more cores ... apparently kaveri will have 2m4core as well and that is sad tbh. I would get APU over FX if it had 8 cores and great iGPU on top of it
> 
> 
> 
> 
> 
> 
> 
> would be monster chip


The reason for this is cost; AMD also can't really release an eight-core APU, their technology is just not there yet. Eight cores would take up nearly the entire die, leaving no room for ALUs. This is why quad core has become the norm for APUs. Rumor is Kaveri will be the first step to changing that: as rumor has it, there will be a six-core Kaveri APU. Though with that many cores the IGP might not be that impressive due to die size. I would prefer the quad core with 512 ALUs, which should beat out the discrete HD 7750. Plus, things will get much more interesting with Kaveri: if you were to pair a 7750 with the APU, you would be packing 1024 GCN ALUs, which is equivalent to the power of the HD 7850 minus the crossfire impact, less memory, and a smaller memory bus. It should still be interesting to see; I wonder what cards will be supported for DGM with Kaveri (the HD 7750 is almost guaranteed).
Quote:


> Originally Posted by *tuffy12345*
> 
> Ok, I've got a 6800K that I really love and I'm about to order a 7750 within the next week or so. Am I good to go, or do I need to get the 6670? For crossfire or whatever they're calling it these days. I've got the ASrock ITX mobo. I went back about 7 pages, and saw the UF guy defending his crossfire but no one seems to believe him?


It's really hit or miss; not everyone can get DGM to work with the A10-6800K and the 7750.


----------



## s33dless

Quote:


> Originally Posted by *tuffy12345*
> 
> Ok, I've got a 6800K that I really love and I'm about to order a 7750 within the next week or so. Am I good to go, or do I need to get the 6670? For crossfire or whatever they're calling it these days. I've got the ASrock ITX mobo. I went back about 7 pages, and saw the UF guy defending his crossfire but no one seems to believe him?


If you:
a)will use the iGPU for compute/extra screens alongside the card

or

b)game on more than 1 screen (eyefinity SLS)

I would recommend ditching the DGM idea and getting a 78-something or 79-something. If neither of those apply (you only want to game on 1 1080p screen and will leave the iGPU off and unused when you pop your card in), then you should go for the xfire--but keep in mind what Opcode said, it's hit or miss, so make sure you do your research into what manufacturer's card works (mobo might be a factor too).


----------



## Notty

I have a problem


Bought a A10-6800k today along with a AsRock FM2A75M-DGS.

Installed a fresh Windows 8, and when I try to install the AMD drivers/Catalyst, just after the display driver is installed (in the middle of the installation) my PC restarts. When it enters Windows again there is no driver installed.

I also tried to install only the display driver, and the installer finished. But it rebooted automatically, and every time I entered Windows the PC rebooted again.

Does anyone know what I am doing wrong? I need the PC for work purposes, with a bit of gaming here and there, and I'm starting to regret buying this hardware. And you know how bad buyer's regret is


At this point I can only think I should spend 100€ more and get an i3 with a 7770/650 Ti

Help please


----------



## azanimefan

Quote:


> Originally Posted by *Notty*
> 
> I have a problem
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bought a AsRock FM2A75M-DGS.


There is your problem. Hate to say it, but I'm willing to bet, at the end of the day, it's the motherboard. See if you can update the BIOS; if that doesn't work, I'd contact ASRock about the problem. Their FM2 A75-chipset motherboards are known disasters for reliability; that's why they're so cheap. Generally, ASRock has a lot of problems with AMD chips (though their A85 motherboards aren't too bad on the whole).


----------



## s33dless

Quote:


> Originally Posted by *Notty*
> 
> I have a problem
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bought a A10-6800k today along with a AsRock FM2A75M-DGS.
> 
> Installed a fresh Windows 8 and when I try to install AMD Drivers/Catalyst just after the display driver is installed (in the middle of the installation) my PC restarts. When it enter Windows again there is no driver installed.
> 
> I also tried to only install the display driver and the installer finished. But it rebooted automatically and every time I entered on Windows the PC reboots.
> 
> Anyone knows what am I doing wrong? I need the PC to work purposes, while gaming a bit here and there and I´m starting to regret buying this hardware. And you know how bad is buyer regret
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At this time i can only think I should spend 100€ more and get an i3 with a 7770/650ti
> 
> Help please


That sounds like a driver conflict. Try talking to AMD support, but here are a few steps they might make you go through:
-undo all over-/under-clocking
-make sure you're using the latest drivers (which I always saw as odd advice... sometimes the latest is the worst!)
-get rid of all display drivers and the whole Catalyst suite; run the default Windows display driver (your display resolution will be weird, this is fine)
-reinstall
-check your BIOS options for power saving

EDIT:
Are you running Win8x64, or 32? You might be installing the wrong driver.


----------



## Notty

I'm running Win8 x64. Did everything you listed, no luck.

I think the problem is the motherboard, after reading all the problems on the web with the same model (the ATX version)...

So now I must buy the ASUS version, and there we go, 20€ more... For 80€ more I could fly with an i3-3240 3.4 GHz + HD 7770. Must think about this. Grrr.

Thanks anyway.


----------



## s33dless

Quote:


> Originally Posted by *Notty*
> 
> I´m running Win8 x64. Did everything you listed no luck.
> 
> I think the problem is the motherboard, reading all the problems on the web with the same model (ATX version)...
> 
> So now I must buy an Asus version and there we go 20€ more... 80€ more and I would fly with i3 3240 3,4ghz + HD7770. Must think about this. grrr
> 
> Thanks anyway.


Haha. FWIW, I have one of the ASUS mobos, and it's 100% fine. Or just keep getting RMAs until one works. You DID pay for it, they owe you a working product, but I understand a lot of the time it's not worth the hassle.


----------



## Notty

Yeah, I'm not even going to RMA this junk... I will trade it for an ASUS F2A55-M and pay the 20€ difference, because I just bought this today lol

Yeah, the hassle... you're right. Now I must go to the store again and talk with them, more thermal paste, more board mounting, etc. etc... whatever, I just want to open some Excel files and play some games at 720p low settings.


----------



## s33dless

Quote:


> Originally Posted by *Notty*
> 
> Yeah I´m not even going to RMA this junk... I will trade for an ASUS F2A55-M and pay the 20€ difference because I just bought this today lol
> 
> Yeah the hassle... you´re right. Now I must go to the store again and talk with them, more thermal paste more board mouting etc etc...wathever, I just want to open some excel files and play some games at 720p low settings.


If you're expecting 720 low, you will be very, very surprised and very, very pleased with the iGPU performance.


----------



## peter-mafia

Quote:


> Originally Posted by *azanimefan*
> 
> i'm bowing out of this argument with an apology to you. My apologies. Apparently your machine breaks every benchmark, review site, and tech barrier known to man. Apparently a 6800+7750 = better then i5+7850 gpu performance. Learn something new every day i guess. If you're pulling our leg and playing games with the video thats on you. I have my issues with it and the screenies, but nothing i can substantiate, so further accusations would be simply sour grapes and poor form.
> 
> So i'll officially apologize. not sure what you needed to do to squeeze that performance out of that machine, as it's completely contrary to my understanding of computer electronics, but in this case i'm not ashamed to ask "how?" how did you make your setup blow up all benches? how did you take your setup to 2+ times the performance it should be giving out. I want to know how, because your numbers are so significantly beyond anyone else's with the same setup (by 20-50%), that you must have figured out some hardware combo or something, some type of overclock, to push your system well beyond what it should be capable of. I mean you're getting numbers i'd expect to see out of an i5+7850 right now. And as an overclocker my whole adult life i gotta ask how?
> 
> cause i'd love to take what i learn to apply it to something down the road to pull more out of tech then you're supposed to. That's what overclocking is about for the most part.



I was pretty skeptical about that combination, too. Since I was restricted to my mini-ITX low-profile case, I had just a couple of options:
1) Started with the A10-5800K crossfired with the low-profile Sapphire 6670. CPU 4200, integrated graphics 1083, DDR3-2133 (2400 wouldn't work). Wasn't impressed. Managed to get 3200 in 3DMark11 Performance.

2) After Richland was released, I sold both the VGA and the APU. Bought the A8-5600K and the 7750 LP. Turned the IGD off (yes, with my ASRock mini-ITX board it can be done). CPU 4200, NB 2200 MHz, no VGA OC (it overheated). Got around 2900 3DMarks; don't remember for sure, you can find the results a few pages back. Got disappointed. Found some videos on YouTube regarding dual graphics with a Trinity A10 + 7750 and even a 7770. Turned the IGD on in the BIOS, dual graphics. Was shocked: got 3700 or so 3DMarks (the IGD was overclocked to 1083 MHz). The link was posted a few pages back.

3) Sold the A8-5600K. Bought the A10-6800K. (The de-lidding photos belong to my A10-5800K, but there is no difference.)

The memory shown in the photo (RipJaws X 2133) was replaced with RipJaws Z 2400.


Zalman CNPS8000B, filed down (it wouldn't clear the RAM).



The SB heatsink was replaced with a hand-made one. The thermal pad under the VRM heatsink was replaced with MX-4. Hand-made heatsink over the chokes (attached with AS5 thermal adhesive).





OCing:
1) CPU speed 4400 MHz, turbo off (vs 4100/4400 stock)
2) CPU voltage 1.425 V (or 1.4375; need to get into the BIOS to make sure, in Windows it fluctuates)
3) CPU-NB frequency 2400 MHz (vs 1800 stock)
4) CPU-NB voltage 1.400 V (vs 1.25 V stock?), responsible for CPU-NB OCing and integrated-GFX OCing
5) integrated graphics 1169 MHz (vs 844 MHz stock!!!! That's a huge increase)
6) dual-channel memory 2400 MHz (vs 1600 for most rigs)
7) The discrete graphics card seemed to run too hot (78 °C easily), so I took the turbine off, cut through the connections between the fins for better airflow, and put an 80x10 mm 5 V fan on top. Ice cold! Less than 60 °C. Slightly OCed the card:
gfx 865 (vs 800)
memory 1250 (vs 1125 stock)


That's it. I think Catalyst 13.4 is what allows me to do DGM with the 7750.
All I can say is it definitely works with the ASRock FM2A85X-ITX (BIOS 1.50); the chipset is A85X. I don't see any reason for that combination not to work with another mobo, though.
Guys, the 7750/7770 is merely $80-110. Pretty sure that's about what you usually spend at bars


It's worth a try.

And I doubt the 7850 is slower than my rig. A friend of mine's i7-2600+660 is like 20% faster
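As a rough sanity check on that iGPU overclock (item 5 above), the usual theoretical-peak formula for these shaders is stream processors x 2 FLOPs per clock (one multiply-add) x clock speed. This is a back-of-the-envelope sketch using the numbers quoted in this thread (384 stream processors, 844 MHz stock, 1169 MHz overclocked), not a measured figure:

```python
# Theoretical peak throughput for the HD 8670D iGPU:
# shaders x 2 FLOPs per clock (fused multiply-add) x clock in GHz.
# Numbers are from the spec and the OC settings posted in this thread.
def peak_gflops(shaders, mhz):
    return shaders * 2 * mhz / 1000.0

stock = peak_gflops(384, 844)    # stock 844 MHz
oced = peak_gflops(384, 1169)    # overclocked 1169 MHz

print(round(stock), "GFLOPS stock")      # -> 648 GFLOPS stock
print(round(oced), "GFLOPS overclocked") # -> 898 GFLOPS overclocked
print(f"{oced / stock - 1:.0%} gain")    # -> 39% gain
```

The gain tracks the clock bump exactly, since shader count doesn't change; real-world scores scale less than this because memory bandwidth stays put.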


----------



## Clockdripdoor

I am not happy with my FPS in BF3 on low settings. After reading this thread, I see that I may have luck with a 7750.

How would you rate this XFX 7750? I wanted something with DisplayPort for a future monitor.

Cheap: $89.99

XFX FX-775A-ZNP4

PCI Express 3.0 x16

AMD Radeon HD 7750, core clock 800 MHz, 512 stream processors

Effective memory clock 1125 MHz, 1 GB GDDR5

HDMI / DisplayPort / DVI

Warranty: 2 years


----------



## beers

Quote:


> Originally Posted by *Notty*
> 
> I have a problem
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bought a A10-6800k today along with a AsRock FM2A75M-DGS.
> 
> Installed a fresh Windows 8 and when I try to install AMD Drivers/Catalyst just after the display driver is installed (in the middle of the installation) my PC restarts. When it enter Windows again there is no driver installed.
> 
> I also tried to only install the display driver and the installer finished. But it rebooted automatically and every time I entered on Windows the PC reboots.
> 
> Anyone knows what am I doing wrong? I need the PC to work purposes, while gaming a bit here and there and I´m starting to regret buying this hardware. And you know how bad is buyer regret
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At this time i can only think I should spend 100€ more and get an i3 with a 7770/650ti
> 
> Help please


Make sure the BIOS is up to date, or it will randomly BSOD.
That happened with my HTPC, BIOS update cleared it right up.


----------



## peter-mafia

Quote:


> Originally Posted by *svtfast*
> 
> I am not happy with my FPS in BF3 on low setting. After reading this thread, I see that I may have luck with a 7750.
> 
> How would you rate this XFX 7750? I wanted something with display port for future monitor.
> 
> Cheap, $89.99
> 
> XFX FX-775A-ZNP4
> 
> Express 3.0 x16
> 
> AMD GPU Radeon HD 7750 Core Clock 800MHz 512 Stream Processors
> 
> Effective Memory Clock 1125MHz 1GB Memory Type GDDR5
> 
> HDMI x DisplayPort x DVI
> 
> Warranty 2 years


The only con I see is that it's a single-slot card, meaning the heatsink is very tiny; it may overheat if OCed. The other specs are pretty standard. On the other hand, BF3 is a very picky game. I've heard even the 7870 is not enough (never played BF myself).
I've got a bunch of AAA games that I don't have time to play. If any of these games has a benchmark, I can test it:
Batman: Arkham City
BioShock Infinite
Skyrim
Assassin's Creed 3
Far Cry 3
Far Cry 3: Blood Dragon
Crysis 3
Tomb Raider (obviously)


----------



## Notty

Quote:


> Originally Posted by *beers*
> 
> Make sure the BIOS is up to date, or it will randomly BSOD.
> That happened with my HTPC, BIOS update cleared it right up.


Yes, I updated the BIOS; no luck.

In fact, I have read that the A10-6800K is not suited for A75 motherboards. That could be the issue here. AMD specified for these new Richland CPUs that A85 is for the A8 and A10, and A75 for the A6 and A8:
Quote:


> With the release of the FM2 APUs, AMD has specified which processors they have targeted for which chipsets. According to AMD, the A55 chipset is targeted for use with the A4 and A6 series APUs. The A75 chipset is targeted for the A6 and A8 APUs and the A85X chipset targets the A8 and the A10 series APUs


http://archive.benchmarkreviews.com/index.php?option=com_content&task=view&id=860&Itemid=63&limitstart=1

Also, the box of my ASRock FM2A75M-DGS says "recommended for A8 and A6".

Maybe if I switch to an A8-6600K it will work better?

The thing is, I don't want to spend 80-85€ on an A85 board; I want to keep the price as low as possible. This board cost me 50€, and the A8-6600K is 107€ here in Portugal, for a total of 157€ (CPU + GPU + board).

If I keep the A10 with an 80€ board I would spend around 215€ total... and for 260€ I can get an ASUS B75-M + Intel i3-3240 3.4 GHz + HD 7770 GHz Edition, and it would spank the A10 + A85 board any day


However, if I can save the 100€ and keep the worse-performing A75 + A8 combo, I would be happy with that. But I can't accept paying only slightly less than an i3 3.4 GHz + 7770.


----------



## DaveLT

Quote:


> Originally Posted by *Notty*
> 
> If I keep the A10 with a 80€ board I would spend around 215€ total... for 260€ I can get an Asus B75-M + Intel i3 3240 3,4ghz + HD7770 GHZ, and it would spank A10 + A85 board anyday
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However If I can save the 100€ and keep a worse performer A75 + A8 combo, I would be happy with that. But I can´t accept paying slightly less than i3 3,4ghz + 7770


Surely not; an A55 board worked well with my customer's 6800K.
Quote:


> Originally Posted by *Notty*
> 
> I´m running Win8 x64. Did everything you listed no luck.
> 
> I think the problem is the motherboard, reading all the problems on the web with the same model (ATX version)...
> 
> So now I must buy an Asus version and there we go 20€ more... 80€ more and I would fly with i3 3240 3,4ghz + HD7770. Must think about this. grrr
> 
> Thanks anyway.


Are you installing the right driver? APU drivers are different
Quote:


> Originally Posted by *Notty*
> 
> Yeah I´m not even going to RMA this junk... I will trade for an ASUS F2A55-M and pay the 20€ difference because I just bought this today lol
> 
> Yeah the hassle... you´re right. Now I must go to the store again and talk with them, more thermal paste more board mouting etc etc...wathever, I just want to open some excel files and play some games at 720p low settings.


Gigabyte F2A85XM-D3H is a better bet ...
Quote:


> Originally Posted by *peter-mafia*
> 
> And I doubt the 7850 is slower than my rig. A friend of mine's i7-2600+660 is like 20% faster


Duh. The 660 is about on par with a 7870.


----------



## peter-mafia

Quote:


> Originally Posted by *DaveLT*
> 
> Duh. 660 is about onpar with 7870.


Found his message in my Skype history. His rig is stock (a 2600, not a K, and a GTX 660).
Tomb Raider, 1080p Ultra:
lowest FPS 46.5
highest 62
average 55
Basically, he gets 15 more FPS than I do (a 27% difference).
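For the curious, the quoted 27% works out as a quick sketch of the arithmetic, measuring the 15 FPS gap against the faster rig's 55 FPS average:

```python
# The friend's stock i7-2600 + GTX 660 averages 55 FPS in Tomb Raider;
# peter-mafia's APU rig averages 15 FPS less. Relative difference,
# taken against the faster rig's average:
friend_avg = 55.0
my_avg = friend_avg - 15.0          # 40 FPS
diff = (friend_avg - my_avg) / friend_avg
print(f"{diff:.0%}")                # -> 27%
```

(Measured against the slower rig instead, 15/40 would read as a 38% lead, so it matters which baseline you pick.)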


----------



## Clockdripdoor

Quote:


> Originally Posted by *peter-mafia*
> 
> The only con I see is it's a 1 slot card meaning the heatsink is very tiny. May overheat if OCed. The other specs are pretty standard. On the other hand BF3 is a very picky game. Have heard even the 7870 is not enough (never played BF myself)
> Got a bunch of AAA games that I don't have time to play. If any of these games has a benchmark I can test it:
> Batman Arkham Coty
> Bioshock Infinite
> Skyrim
> Assassin's creed 3
> FarCry 3
> FarCry3 Blood Dragon
> Crysis 3
> TR (obviously
> 
> 
> 
> 
> 
> 
> 
> )


OK, then what about this?

Radeon 7790

GPU 1075 MHz
Memory 1500 MHz
1 GB GDDR5

Never mind. I see that the 7790 and 7770 require CrossFire bridges. Crap.

or

Radeon 7770

GPU 1000 MHz
Memory 1125 MHz
1 GB GDDR5


----------



## DaveLT

Quote:


> Originally Posted by *svtfast*
> 
> Ok then what about this?
> 
> Radeon 7790
> 
> GPU 1075Mhz
> Memory 1500Mhz
> 1GB GDDR5
> 
> Never mind. I see that the 7790 and 7770 require crossfire bridges. Crap.
> 
> or
> 
> Radeon 7770
> 
> GPU 1000Mhz
> Memory 1125Mhz
> 1GB GDDR5


The 7790, in a heartbeat; it only costs slightly more. This is the stuff you want when going single-card without much money.


----------



## Notty

And then for slightly more you can buy a 650 Ti Boost, which performs almost as well as a 660. And then for slightly more you get a 7870, which outperforms it, and then for just a bit more you can get a GTX 760 and be almost on par with a GTX 670.

You can always pay slightly more and get a better card, at least until you reach the 400€+ cards.

So, in fact, I disagree with you. The HD 7770 has great value for what's on offer. Between a 7790/GTX 650 Ti and a GTX 650 Ti Boost, for just 20-30€ more and much more performance, it would be a no-brainer. However, its price is far from the HD 7770's, at least here where I live.

I don't think the 7790 is the "great bang for the buck" you're saying it is. It's a good card, and it's not expensive, but definitely not the best performance per dollar/euro around.

Also keep in mind the HD 7770 overclocks so well that for 95€ you can get the same performance, especially if you get the GHz version.


----------



## Clockdripdoor

Quote:


> Originally Posted by *DaveLT*
> 
> 7790 in a heartbeat, only costs slightly more. These are the stuff you want when going single and don't have much money


If I go 7770 or 7790 I can't use CrossFire.

I have been reading the reviews of the 7770, so I am leaning towards that now. There is only one review for the 7790 model I selected.

I will have to figure out how I can re-route my water-cooling loop to fit a GPU. I wish there were a water block for a 7770.


----------



## s33dless

Quote:


> Originally Posted by *Opcode*
> 
> I never said that the HD 7970 isn't a good buy, tho it's not a "excellent" buy if you're looking to pair it with it a A10-6800k for gaming. I can provide proof of that as it even bottlenecks my HD 5870. *I am only getting roughly 50-60% utilization while in-game on several games. That's because of a CPU bottleneck*, I bet if I cranked it up to 5.0 GHz that bottleneck would be nearly non-existent.


Screenshot, normal photo, and panoramic. Running stock, as the screenshot will show. Apparently this "bottleneck" only exists for you.


ALSO:
"*While the A10-6800K has enough grunt to get the most out of the Radeon HD 7970 GHz Edition at 1920x1200, the A4-4000 doesn't.*"
from "this review", at the bottom.

So, bad tuning on your part?

EDIT: the screen I'm running AI Tune and Catalyst on is the iGPU, but all the GPU utilities read whatever you have the main system display set to.


----------



## glussier

I don't know if all HD 7770s can be CrossFired, but the Sapphire model can be: http://www.eteknix.com/amd-radeon-hd-7770-crossfire-review/


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> Screenshot, normal photo, and panoramic. Running stock, as the screenshot will show. Apparently this "bottleneck" only exists for you.
> 
> 
> 
> 
> 
> 
> 
> ALSO:
> "*While the A10-6800K has enough grunt to get the most out of the Radeon HD 7970 GHz Edition at 1920x1200, the A4-4000 doesn't.*"
> from "this review", at the bottom.
> 
> So, bad tuning on your part?
> 
> EDIT: the screen I'm running the AITune and Catalyst is on the iGPU, but all the GPU utilities use whatever you have the main system display set on.


Try running Bad Company 2 @ 1080p, smart one.

As a software and game programmer/engineer/developer, I can tell you there is no such thing as "bad tuning" your hardware, unless you're referring to BIOS tweaks? All of my hardware is tweaked properly; everything is running at factory-rated specifications (the Extreme6 UEFI is pretty easy). The latest drivers are all you should need otherwise. I don't bother to OC my 5870 because it never runs above 80% unless I am in quad-core-optimized games. So my reasoning still stands: in quite a few games the 7970 will be extremely bottlenecked by this APU. Try running single- and dual-core-optimized games, which are more than 80% of the market, and let us know how your tests go. Like I said above, try Battlefield: Bad Company 2, which is an insanely hot game still today. I bet you'll never hit over 60% with your GPU @ 1080p.

Edit: Install and run Bad Company 2 and let us know your GPU utilization. Here is mine on a 32-player map with only 16 players. Almost forgot: this is at 1600x900 @ max settings.


----------



## DaveLT

Quote:


> Originally Posted by *svtfast*
> 
> If I go 7770 or 7790 I cant use crossfire.
> 
> I have been reading the reviews of the 7770 so I am leaning towards that now. Only one review for the 7790 model I selected.
> 
> I will have to figure out how I can rerun my water cooling loop to fit a GPU. I wish there was a water block for a 7770.


Don't waste your money on a 7770. 7790 cards are not made by every manufacturer, but it's a lot better than a 7770 and much more efficient.
Best part? Only a little bit more expensive.
Quote:


> Originally Posted by *Notty*
> 
> And then for slightly more you can buy a 650ti Boost wich almost performs as good as a 660. And then for slightly more you get a 7870 wich outperforms it, and then for just a bit more you can get a gtx760 and be almost on pair with a gtx670.
> 
> You can always pay slightly more and get a better card, at least until you reach the 400€+ cards.
> 
> So in fact I disagree with you. HD7770 has great value for what´s in offer. Between a 7790/GTX650 ti and a GTX650ti Boost for just 20€/30€ more and much more performance, it would be a no brainer. However its price is far from HD7770, at least here where I live.
> 
> I don´t think 7790 is the "great bang for the buck" you´re saying. It´s a good card, and it´s not expensive, but definetly not the best performance vs dollar/euro around.
> 
> Also keep in mind HD7770 overclocks so well that for 95€ you can get the same performance, specially if you get the Ghz version.


1) You might as well buy a GTX 780 if you're taking it there. Jeez. You don't understand it at all, do you?
2) If you want bang for buck and insane overclocking, the PowerColor 7850 PCS+ always turns out to be the best: 1) digital PWM, 2) unlocked voltage, 3) very, very cool running, unlike the Keplers.
3) I disagree about the HD 7770 overclocking well. I had many HD 7770s on the bench (I had my hands on them because I was preparing many rigs for a competition), and the 7770s can forget about hitting the clocks a 7790 can reach.
The best I got on a 7770, for my friend, was 1.1 GHz ...
4) And according to what you just showed me, the 7790 is ahead of the 650 Ti Boost at least.


----------



## Mopar63

Quote:


> Originally Posted by *s33dless*
> 
> "*While the A10-6800K has enough grunt to get the most out of the Radeon HD 7970 GHz Edition at 1920x1200, the A4-4000 doesn't.*"
> from "this review", at the bottom.
> .


Look, I am the first to admit that the A10 chips are amazing. However, they do NOT always have enough grunt to push higher-end gaming. Using Skyrim as an example: my wife's machine can play it, with the official HD mods as well as a few game-content patches. However, on a 7950 (less grunt than a 7970) she is quite a bit behind an i5-3450 in in-game performance. Add in a bunch of the unofficial HD mods and her system just cannot handle it.

Now, we can lower the resolution and bring her back to great gaming, but in the end the A10 does not have the grunt.

However, let me be clear: this is pushing the envelope. If we stay within what 90% of people do, then the A10 can handle the load. In a game that is GPU-limited it will do awesome.


----------



## Kuivamaa

A stock A10-6800K will definitely lose to any i5 in Skyrim, which is pretty much the worst-case scenario for the BD/PD architecture (poorly threaded, old instructions used). A nice 4.6 GHz base overclock should improve things, though.


----------



## Durquavian

Yeah, Skyrim uses the x87 instruction set. http://forum.hwbot.org/showthread.php?t=78490 Run version 2 of this and it will get a bit better on AMD CPUs; it releases the block on the x87 instruction set. You know it is on the up and up: Stilt found it, and he is the one who OCed one of these chips to 8.2 GHz the other day.


----------



## DaveLT

8.2GHz ... HOLY.


----------



## peter-mafia

Quote:


> Originally Posted by *svtfast*
> 
> Ok then what about this?
> 
> Radeon 7790
> 
> GPU 1075Mhz
> Memory 1500Mhz
> 1GB GDDR5
> 
> Never mind. I see that the 7790 and 7770 require crossfire bridges. Crap.
> 
> or
> 
> Radeon 7770
> 
> GPU 1000Mhz
> Memory 1125Mhz
> 1GB GDDR5


The MSI R7770-PMD1GD5 doesn't seem to require a bridge. $95 after MIR. I would suggest you buy a 7750/7770 on eBay from a seller with a 14-day money-back policy. If for some reason you don't like how the dual graphics works, just send it back ($12-14 shipping) and get a full refund. If you buy the card on Newegg and don't like it, they will make you pay a restocking fee.
A 7770 alone, without being in CF, is not worth any attention if you wanna play at decent settings.


----------



## s33dless

Quote:


> Originally Posted by *Opcode*
> 
> Like I said above, try Battlefield Bad Company 2 which is a insanely hot game still today. I bet you'll never hit over 60% with your GPU @ 1080p.


a) I already said don't do it for single-screen gaming _several times_, because we all know the frame-rate vs. CPU scaling issues when you go down in resolution. Notice how I'm running 3 screens?
Quote:


> Originally Posted by *s33dless*
> 
> Right, but I'm saying you shouldn't be recommending it to nobody. *There are sets of conditions where it works. Single monitor 1080p is not one of them, sure, but I think other people out there after crazier setups should definitely experiment with this.*
> 
> I mean, could you imagine a server mobo running a bunch of these? It would be so much cheaper to get an effective cluster going, crunch numbers all day and night.


b) This thread. If you want to look at it the way you do, any CPU will technically "bottleneck", just to varying degrees. Nothing a decent OC won't blow out of the water, and at the end of the day, even though you may not have the raw FPS some systems have (I'm looking at you, quad-SLI TITAN 800000000-FPS guys), you can still game with no problems (fewer, even, than some systems with higher frame rates, because zero stutter!).

Again: I know what CPU lag looks like. I'm getting none in any game, including Skyrim, which is horribly CPU-bound (I hear all the shadows are CPU-rendered... *** Bethesda?).

c) Bad Company 2 is from 2010 AND isn't optimized. This isn't the hardware's fault; it's just sub-par programming. I've also mentioned the general problem with that in this thread. But that aside, seeing as the game is from 2 years before the card, I'd be glad to run it for you if you buy me a copy



d) I admittedly do get low (~60-70%) utilization in Skyrim (with everything maxed), but again, that's because a lot of stuff is CPU-rendered, so the card doesn't have much work to do. Despite this, it runs 100% smooth with no dips in frame rate, whatever the frame rate may be... I never checked. For the aforementioned reasons, I don't trust FPS as a good metric; I mean, FPS is an average over a whole second, and in the computer world, a second is forever. 8 trillion FPS means piddly squat if you render 7.9 trillion of them in the last 10 ms. I'd MUCH rather run 40 FPS with 1 frame drawn every 25 ms.
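To make that frame-pacing point concrete, here's a minimal sketch; the two frame-time traces are made up for illustration, not measured data. Both runs average exactly 40 FPS over one second, but average FPS can't tell them apart, while the worst frame time can:

```python
# Hypothetical frame-time traces (milliseconds): one run delivers frames
# evenly, the other bunches 39 quick frames together and then stalls.
smooth = [25.0] * 40               # 40 frames, one every 25 ms
stuttery = [2.5] * 39 + [902.5]    # 39 quick frames, then a 900+ ms stall

def avg_fps(frame_times_ms):
    """Average FPS = frames rendered / total seconds elapsed."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def worst_frame(frame_times_ms):
    """Longest single frame time, in milliseconds."""
    return max(frame_times_ms)

print(avg_fps(smooth), worst_frame(smooth))      # -> 40.0 25.0
print(avg_fps(stuttery), worst_frame(stuttery))  # -> 40.0 902.5
```

Same headline FPS, wildly different experience, which is exactly why frame-time percentiles beat average FPS as a smoothness metric.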

So, to reiterate (yet again) my point:
Quote:


> Originally Posted by *s33dless*
> 
> I'm saying you shouldn't be recommending it to nobody. *There are sets of conditions where it works. Single monitor 1080p is not one of them, sure, but I think other people out there after crazier setups should definitely experiment with this.*
> 
> I mean, could you imagine a server mobo running a bunch of these? It would be so much cheaper to get an effective cluster going, crunch numbers all day and night.


Now I'm done with this bottlenecking discussion. I agree to disagree, just as long as the data I presented is here for others to look at as well.

I've started this thread here because my iGPU/dGPU hydra won't let me run the utils to control the fans, and with the way I have everything else set, the 7970 is the loudest component in my system by a long shot. It's the only thing I can hear over the sound of the A/C vent in my room at load, which is REALLY damn good all things considered...but still slightly annoying.


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> a) i already said don't do it if single screen gaming _several times_, because we all know the frame rate vs. CPU scaling issues when you go down in resolution. notice how i'm running 3 screens?
> b) This thread. If you want to look at it the way you do, any CPU will technically "bottleneck", just with varying degrees. Nothing a decent OC won't blow out of the water and at the end of the day, even though you may not have the raw FPS some systems have (I'm looking at you quad-SLi TITAN 800000000 FPS guys), you can still game with no problems (even less than some systems with higher frame rates because 0 stutter!).
> 
> Again: I know what CPU lag looks like. I'm getting 0 on any games, including Skyrim, which is horribly CPU bound (I hear all the shadows are CPU rendered...*** Bethesda?).
> 
> c) Bad Company 2 is from 2010 AND isn't optimized. This isn't the hardware's fault, it's just sub-par programming. I've also mentioned the general problem with that on this thread. But that aside, seeing as how the game is from 2 years before the card I'd be glad to run it for you if you buy me a copy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> d) I admittedly do get low (~60-70%) util in Skyrim (with everything maxed), but again, that's because a lot of stuff is CPU rendered, so the card doesn't have much work to do. Despite this, it runs 100% smooth with no dips in frame rate, whatever the frame rate may be...I never checked. For the aforementioned reasons, I don't trust FPS as a good metric, I mean, FPS is an average over a whole second. In the computer world, a second is forever. 8 Trillion FPS means piddly squat if you render 7.9 billion of them in the last 10 ms. I'd MUCH rather run 40 FPS with 1 frame drawn every 25 ms.
> 
> So, to reiterate (yet again) my point:
> Now I'm done with this bottlenecking discussion. I agree to disagree, just as long as the data I presented is here for others to look at as well.
> 
> I've started this thread here because my iGPU/dGPU hydra won't let me run the utils to control the fans, and with the way I have everything else set, the 7970 is the loudest component in my system by a long shot. It's the only thing I can hear over the sound of the A/C vent in my room at load, which is REALLY damn good all things considered...but still slightly annoying.


a) You have to keep in mind that anyone buying a graphics card for gaming will, 90% of the time, be using it on a single monitor @ 1080p. Using it across Eyefinity is a completely different story and irrelevant to this thread, and to my subject at hand.

b) My statement still stands: even an overclocked A10-6800K bottlenecks an HD 5870 big time, let alone an HD 7970. And yes, of course it can still run games that are playable at max settings, though that's not the point. You could be seeing a 30% higher frame rate (going from 60 FPS to 78 FPS) if the CPU weren't such a bottleneck.

c) Bad Company 2 doesn't need to be optimized for anything; the game was written in 2009-2010. Hardware was much slower back then and nobody ever had issues playing the game. So the game's programming has nothing to do with the bottleneck, other than it not being threaded for more than a dual-core machine (which was the typical platform at the time). I am pretty sure I made it quite clear; I even posted actual evidence of my claims. Yet you insist that it's not a bottleneck, and choose to question my knowledge as the culprit of the problem. Bad Company 2 is only one of the hundreds of still-popular games that will show the same bottleneck.

d) Regardless of how much of a load the GPU has to handle, it should be running at 99% load in any DirectX game. This is how games are written, to utilize as much hardware as possible. Even drawing a small 3D rotating box can put your GPU at 99% load. The fact that you're getting only 60-70% means your CPU is bottlenecking. You're losing out on 30% of your card's performance because the CPU cannot keep up with it.
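
As a rough back-of-the-envelope (a sketch of the reasoning above, not a measurement tool; the function name and numbers are mine), if the GPU is the sole remaining limiter, the frame rate left on the table scales with the card's idle fraction:

```python
def estimated_uncapped_fps(observed_fps, gpu_util):
    """Rough heuristic: if GPU utilization sits below ~99%, something upstream
    (usually the CPU) is the limiter; were the GPU fully fed, the frame rate
    would scale roughly as observed_fps / gpu_util."""
    return observed_fps / gpu_util

# e.g. 60 FPS observed at 70% GPU load -> roughly 85 FPS if the GPU were saturated
print(round(estimated_uncapped_fps(60, 0.70), 1))  # ~85.7
```

It's only a first-order estimate (utilization counters are themselves averages), but it shows why 60-70% load reads as "~30% of the card wasted".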

In conclusion, my point is still the same. There's nothing wrong with buying an HD 7970; it's a great and powerful card. But based on the game titles available on the market, you won't ever use it to its full potential with an A10-6800K unless you're playing quad-core-optimized games. To make matters worse, even Battlefield 3, which is optimized for multi-core, doesn't run at a solid 99% load. I see 93% dips with my HD 5870, which means the CPU is struggling to feed it from time to time (bottlenecking). If you've got $400 to dump into a graphics card, you'd think the same person could afford a $200 i5-3570K to pair it with and properly utilize every bit of the card regardless of the game title. Or at least enough to move over to FX chips, which offer better performance than these Athlons. Don't get me wrong, there's nothing wrong with putting an HD 7970 into an APU build, though it's something very few will ever do (you being one of them), as you have to have more purpose behind doing such a thing than gaming and Eyefinity. Your average single-screen gamer would be better off with an FX-4300, which is cheaper and much faster, to pair with the HD 7970. My dispute isn't over running multiple monitors, or using the card for compute. It's about putting a 7970 into an APU build for 1080p gaming.


----------



## lacrossewacker

Thinking about building a pc for my brother.

Just a small mini-atx box with the 6800k.

If I paired it with an SSD, would the 6800K really need any OC'ing out of the box, or is it perfectly fine for your average college work and maybe a little Counter-Strike?

How about the heat sink? I'd like to make this as small as possible. Thanks for your input.


----------



## Kuivamaa

Quote:


> Originally Posted by *lacrossewacker*
> 
> Thinking about building a pc for my brother.
> 
> Just a small mini-atx box with the 6800k.
> 
> If I paired it with a SSD, would the 6800k really need any OC'ing out the box, or is it perfectly fine for your average college work and maybe a little counter strike?
> 
> How about the heat sink? I'd like to make this a small as possible. Thanks for you input.


http://forum.oktabit.gr/topic/oktabit-vero-w-pc-mainstream-a6800a#comment-115079

All sorts of A10-6800k synthetic and gaming benchies there.


----------



## s33dless

Quote:


> Originally Posted by *Opcode*
> 
> My dispute isn't over running multiple monitors, or using the card for compute. It's about putting a 7970 into a APU build for 1080p gaming.


Which I've agreed with you on since the beginning: it isn't the best route! I'm just trying to make sure we don't screw over the 3% of the population that route would serve.

I would expect you could run Counter-Strike at stock without breaking much of a sweat. It could do Skyrim, so unless you're running some crazy mod I haven't heard of, this thing would eat Counter-Strike for breakfast.


----------



## Mopar63

Quote:


> Originally Posted by *lacrossewacker*
> 
> Thinking about building a pc for my brother.
> 
> Just a small mini-atx box with the 6800k.
> 
> If I paired it with a SSD, would the 6800k really need any OC'ing out the box, or is it perfectly fine for your average college work and maybe a little counter strike?
> 
> How about the heat sink? I'd like to make this a small as possible. Thanks for you input.


My wife runs an A10 5800K as her gaming system at stock with a 6950 for video and plays anything she wants at 1080 with a good gaming experience.


----------



## malmental

Quote:


> Originally Posted by *Mopar63*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lacrossewacker*
> 
> Thinking about building a pc for my brother.
> 
> Just a small mini-atx box with the 6800k.
> 
> If I paired it with a SSD, would the 6800k really need any OC'ing out the box, or is it perfectly fine for your average college work and maybe a little counter strike?
> 
> How about the heat sink? I'd like to make this a small as possible. Thanks for you input.
> 
> 
> 
> My wife runs an A10 5800K as her gaming system at stock with a 6950 for video and plays anything she wants at 1080 with a good gaming experience.

a wife that games.....
priceless.


----------



## Mopar63

Quote:


> Originally Posted by *malmental*
> 
> a wife that games.....
> priceless.


You have no idea. She is the best. Some nights it is her staying up late gaming, not me


----------



## Clockdripdoor

Does anyone have a PowerColor TurboDuo AX7790 1GBD5-TDH/OC Radeon HD 7790 1GB 128-bit GDDR5 PCI Express 3.0 CrossFireX Support Video Card?

Looks like a good card for the money. Only going to play BF3. Right now I get a pathetic 15 FPS.


----------



## beers

Quote:


> Originally Posted by *svtfast*
> 
> Does any have a PowerColor TurboDuo AX7790 1GBD5-TDH/OC Radeon HD 7790 1GB 128-bit GDDR5 PCI Express 3.0 CrossFireX Support Video Card?
> 
> Looks like a good card for the money. Only going to play BF3. Right now I get a pathetic 15 FPS.


It's easier to just go look at reviews of products than to wait for a first hand account (and this thread isn't really even based on any portion of your question).

here: http://www.bit-tech.net/hardware/graphics/2013/03/22/sapphire-radeon-hd-7790-1gb-review/3


----------



## DaveLT

Quote:


> Originally Posted by *svtfast*
> 
> Does any have a PowerColor TurboDuo AX7790 1GBD5-TDH/OC Radeon HD 7790 1GB 128-bit GDDR5 PCI Express 3.0 CrossFireX Support Video Card?
> 
> Looks like a good card for the money. Only going to play BF3. Right now I get a pathetic 15 FPS.


It's actually quite good for BF3; I reckon it will definitely do nearly 60 FPS at 1080p High.


----------



## tuffy12345

Ordered a 7750. Hopefully it pairs correctly. This is for my HTPC so I'm not looking for anything crazy. I really like the APU concept, and being able to crossfire your CPU with the GPU seems pretty sweet.


----------



## malmental

Quote:


> Originally Posted by *tuffy12345*
> 
> Ordered a 7750. Hopefully it pairs correctly. This is for my HTPC so I'm not looking for anything crazy. I really like the APU concept, and being able to crossfire your CPU with the GPU seems pretty sweet.


this is the combo I'm curious about as well.

please keep me posted on your rig's operation and configuration?
thanks.


----------



## Opcode

Quote:


> Originally Posted by *tuffy12345*
> 
> Ordered a 7750. Hopefully it pairs correctly. This is for my HTPC so I'm not looking for anything crazy. I really like the APU concept, and being able to crossfire your CPU with the GPU seems pretty sweet.


It will be even sweeter when Kaveri delivers; the 7750 will be a perfect match if there is a 512-ALU Kaveri APU.


----------



## nitrubbb

what is the max GPU I can put in my rig so that the APU and GPU will crossfire?

also, what will be the max GPU for the upcoming Kaveri?


----------



## Mopar63

For Kaveri there is no info yet. Officially, GPUs will not pair with anything above a 6670.


----------



## Artikbot

Quote:


> Originally Posted by *Mopar63*
> 
> for Kaveri there is no info yet. Officially GPUs will not pair with anything above a 6670.


I would like to believe that it will crossfire with the upcoming HD 87xx series, seeing as it should contain a similar CU count to the HD 7750.


----------



## Clockdripdoor

I have an old Radeon 4870 that I am going to put in my system (A10-6800k at 4.7Ghz) for stuff n giggles. AMD still has a driver for it. 15+ FPS in BF3 or bust!!! (card got 30+ in old comp)


----------



## malmental

Quote:


> Originally Posted by *Artikbot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mopar63*
> 
> for Kaveri there is no info yet. Officially GPUs will not pair with anything above a 6670.
> 
> 
> 
> I would like to believe that it will crossfire with the upcoming HD87xx series, seeing how it contains a similar CU count as the HD7750.

where is there an updated list of the GPUs that can be CF-X'd with the 5800K / 6800K? also, is there a driver hack to allow more options?


----------



## Clockdripdoor

Quote:


> Originally Posted by *svtfast*
> 
> I have an old Radeon 4870 that I am going to put in my system (A10-6800k at 4.7Ghz) for stuff n giggles. AMD still has a driver for it. 15+ FPS in BF3 or bust!!! (card got 30+ in old comp)


Damn!! The old 4870 got up to 65 FPS on 32-player multiplayer maps.
That's 50 more than what I get with my A10. I would keep this GPU, but it runs too hot, only has DX 10, and eats up power.


----------



## Mopar63

Quote:


> Originally Posted by *malmental*
> 
> where is there an updated list on the GPU's that can be CF-X'd with the 5800K / 6800K, also is there a driver hack to allow more options.?


Same list there has always been. This is not a driver issue, this is a performance balance issue. The current APUs do not have enough horsepower to balance crossfire with a 7700 series. It's like trying to crossfire a 7850 with a 7970.


----------



## malmental

Quote:


> Originally Posted by *Mopar63*
> 
> Quote:
> 
> 
> 
> Originally Posted by *malmental*
> 
> where is there an updated list on the GPU's that can be CF-X'd with the 5800K / 6800K, also is there a driver hack to allow more options.?
> 
> 
> 
> Same list there has always been. This is not a driver issue, this is a performance balance issue. The current APUs do not have enough horsepower to balance crossfire with a 7700 series. It's like trying to crossfire a 7850 with a 7970.

ah, understood and didn't think about it like that...
thanks.


----------



## Opcode

Quote:


> Originally Posted by *nitrubbb*
> 
> what is the max GPU I can put in my rig that APU and GPU would crossfire?
> 
> also what will be max GPU for upcoming kaveri?


We don't and won't know until Kaveri is released, or until AMD releases any official documentation outlining the DGM support. As for Trinity and Richland, the current max officially supported card is the HD 6670, though people seem to be getting the HD 7750 to work in DGM with the A10-6800K.
Quote:


> Originally Posted by *svtfast*
> 
> Damn!! Old 4870 got up to 65 FPS in 32 player map multiplayer.
> That is 50 more than what I get with my A10. I would keep this GPU but it runs too hot, has DX 10, and eats up power.


You can buy a better cooler that fits it on Newegg. The HD 5870 I got for free is a blessing: I can run BF3 in single-player at around 90 FPS on the High preset (without dips below 60 FPS). It does have DX11 and runs fairly cool for a reference design (which I prefer, as the heat goes out the back of the case). The only downside is that it's another huge power hog, though I guess that's the trade-off if you want good frame rates.


----------



## Clockdripdoor

Quote:


> Originally Posted by *Opcode*
> 
> We don't and wont know until Kaveri is released. Or until AMD releases any official documentation outlining the DGM support. As for Trinity and Richland the current max officially supported card is the HD 6670. Tho people seem to be getting the HD 7750 to work in DGM with the A10-6800k.
> You can buy a better cooler that fits it on Newegg. My HD 5870 I got for free is a blessing, I can run BF3 in single player mode at around 90 FPS on High preset (without dips below 60 FPS). It does have DX11 and runs fairly cool for being a reference design (I prefer it as the heat goes out the back of the case). Only downside is it is another huge power hog, tho I guess that's the trade off if you want good frame rates.


I am going to get a Radeon 7790 instead. It is much more powerful than my 4870, with a higher GPU clock and higher memory clock. I will not be able to crossfire the iGPU and the dGPU, but the dGPU will make up the difference. I may get another 7790 if prices go down (and if I can hide it from my wife).

I would like to water cool the dGPU but I have not been able to find any water blocks that would fit it except for universal ones.


----------



## agrims

Hide from the wife! Ha! Why not buy now and ask for forgiveness? I kid.. I kid..

You should see a solid improvement with that card...


----------



## peter-mafia

Quote:


> Originally Posted by *Mopar63*
> 
> Same list there has always been. This is not a driver issue, this is a performance balance issue. The current APUs do not have enough horsepower to balance crossfire with a 7700 series. It's like trying to crossfire a 7850 with a 7970.


Obviously, that's not true. There are some problems with obsolete DX9 games; new games run flawlessly. You may remember a venture called Lucid Virtu that can crossfire different cards, even NVIDIA+AMD. They have problems with their drivers, though. AMD doesn't have such a problem.
I don't have any stutter in Tomb Raider, Skyrim, or AC3.
Your claim is unsubstantiated.


----------



## Mopar63

Quote:


> Originally Posted by *peter-mafia*
> 
> Obviously, it's not true. There are some problems with obsolete dx9 games. New games run flawlessly. You may remember such a venture as lucid Virtu that can crossfire different cards like nvidia+amd. They have problems with the drivers, though. AMD don't have such a problem.
> I don't have any stutter neither in Tomb Raider nor Skyrim, nor AC3
> Your claim is unsubstantiated.


My claim is based on the official position of AMD, sounds pretty substantiated to me...


----------



## bassium

Upgrading my rig this week from an A8-5600K + 7750 dual-graphics setup to an A10-6800K; hope to see a big boost : )


----------



## peter-mafia

Quote:


> Originally Posted by *bassium*
> 
> Upgrading my rig this week from a a8-5600k 7750 dg to an a10-6800k hope to see a big boost : )


Finally, another person got DGM to work with the 7750.
You'll get some increase but nothing overwhelming. I had the A8-5600K+7750 too


I think it's the RAM that limits the APU's capabilities, not the number of stream processors.
Quote:


> My claim is based on the official position of AMD, sounds pretty substantiated to me...


They didn't say the 7750 wouldn't work in DGM; they just guarantee that the 6670 and below will work. It's like overclocking: you can do it, but at your own risk.
And with older drivers DGM won't work. Thanks, AMD, for fixing it


----------



## CCast88

Ok, so I just read the last 6 pages or so, and from what I read, the A10 cannot push high-end cards. Currently I still have the system in my rig info, and I am planning on getting the new HD 9000 series for this rig, building an HTPC ITX rig with the upcoming Kaveri APU, and switching one of my HD 5850s over into it. From what I read, I guess the A10 won't be able to push a 5850, even though it's a 3-4 year old card? And another question: will my i7 930 have the balls to push an HD 9970?


----------



## DaveLT

Quote:


> Originally Posted by *CCast88*
> 
> Ok so I just read the last 6 pages or so and looking at what I read, The A10 cannot push high end cards. Currently, I still have my system on my rig info and I am planning on getting the new HD 9000 series for this rig and building an HTPC ITX rig with the upcoming Kaveri apu and switching over one of my HD 5850s into it. From what I read, I guess the A10 won't be able to push a 5850, even though it's a 3-4 year old card? And another qauestion.. will my i7 930 have the balls to push an HD 9970?


Nah, it will push a 5850 fine, even in CPU-centric games (or, in other words, badly coded ones).

For a i7 930 ... that's hard to say.
AFAIK a 1.1 GHz 7850 puts about 50% CPU load on my 2.4 GHz Xeon in BF3 MP, which means it could handle two OVERCLOCKED 1.1 GHz 7850s (the overclock pushes them to stock 7970 level).

And let's say the 9970 is 60% better than the 7970, which would mean... it will handle a 9970 fine with room to spare, methinks.
But really, a 9970 on an i7 930? By then it's long past time to upgrade.


----------



## CCast88

Quote:


> Originally Posted by *DaveLT*
> 
> Nah, it will push a 5850 fine. Even in CPU-centric games (Or just in other words badly coded)
> 
> For a i7 930 ... that's hard to say.
> AFAIK a 1.1GHz 7850 gets about 50% cpu load on 2.4GHz (My xeon) in BF3 MP. Which means it can handle 2 OVERCLOCKED (Pushes it to stock 7970 level) 1.1GHz 7850s
> 
> 
> 
> 
> 
> 
> 
> 
> And let's say 9970 is 60% better than 7970 which would mean ... It will handle 9970 fine with spare room. Me thinks.
> But really 9970 on i7 930? By then it's long due time to upgrade.


Well, since AMD is skipping the 8000 series, the HD 9000 series should be here around the holiday season.


----------



## Clockdripdoor

AMD is not skipping the 8000 series. They should ship late this year (in a few months), with the 9000 series right behind them.

I decided to get a 7790. Cheap enough and should push everything I need. Still using my old 4870 with my A10.


----------



## Mopar63

Quote:


> Originally Posted by *CCast88*
> 
> Ok so I just read the last 6 pages or so and looking at what I read, The A10 cannot push high end cards. Currently, I still have my system on my rig info and I am planning on getting the new HD 9000 series for this rig and building an HTPC ITX rig with the upcoming Kaveri apu and switching over one of my HD 5850s into it. From what I read, I guess the A10 won't be able to push a 5850, even though it's a 3-4 year old card? And another qauestion.. will my i7 930 have the balls to push an HD 9970?


You know, I think we need something to define "pushing" a card. According to an article on AnandTech, at 1440p most games run great on an A10 even compared to higher-end cards. 1440p pushes most cards a bit, so the A10 looks like it can handle the load. It is when you get to dual cards that the APU stumbles.


----------



## s33dless

Quote:


> Originally Posted by *Mopar63*
> 
> You know I think we need something to define pushing a card. According to an article done on Anand at 1440 most games run great on an A10 even compared to higher end cards. 1440 is pushing most cards a bit so the A10 looks like it can handle the load. It is when you get to dual cards that the APU stumbles.


In anything that's not fps capped (like Skyrim, Dark Souls, Ace Combat, and F1 2012, all capped @ 60 fps), I get ridiculous frame rates (400+ in DmC) with my 7970 on an APU system. A lot of reviews and benchmarks were also taken with high end cards, I would suggest reading them.

But, of course, if you're just going to disable the iGPU then don't get it.


----------



## DaveLT

One advantage is the ridiculously low CPU power consumption compared to the FX-6300.
Remember that the 100W TDP is for the CPU+GPU combined. They squeezed a lot of performance into such a package... Intel can forget about squeezing this much CPU+GPU power into 100W, at least at this price point.


----------



## Opcode

Quote:


> Originally Posted by *s33dless*
> 
> In anything that's not fps capped (like Skyrim, Dark Souls, Ace Combat, and F1 2012, all capped @ 60 fps), I get ridiculous frame rates (400+ in DmC) with my 7970 on an APU system. A lot of reviews and benchmarks were also taken with high end cards, I would suggest reading them.
> 
> But, of course, if you're just going to disable the iGPU then don't get it.


Again, a game that utilizes all four cores of the APU, so it's expected that you get good frame rates (I probably get 150+ with my HD 5870). The A10-6800K has a high base clock, which makes it pretty good for gaming on the CPU side of things. In multi-core-optimized games it's a workhorse, there's no denying that. But its gaming performance is still hindered once you step away from multi-core-optimized games; it still bottlenecks your GPU in a lot of titles. It has little to do with being FPS capped; it's easily determined by whether the game was made to support more than two cores. One module is not enough for these Athlon X4s to push games to their full potential, though such games are still perfectly playable on them. If you were to go into your BIOS and turn one of your modules off, I bet it would make a world of difference in every game you play. Now, I'm not saying it's a bad APU, but if you're pairing it with a discrete card, it's pointless to buy one, as you can pick up an FX-6300 for $10-20 cheaper and it will give you a huge margin of better performance. Especially considering FM2 is a dead socket, if you were to buy an APU I would at least wait till FM2+ boards come out. That way you can switch over to Kaveri later, as it's supposed to be an epic upgrade over the current APUs.
Quote:


> Originally Posted by *Hideaki Itsuno*
> Multi-threading and multi-core processers are most certainly supported. The engine is optimized to run in parallel processing, and it is possible to scale support up to 8 cores. Processes such as rendering, sound processing and compressed resource decoding utilize all available processing power.
> By running the above-mentioned processes in parallel, we can achieve a performance boost of a factor of 1.6 to 1.8 when using a dual core CPU, 2.2 to 2.5 with a 4 core CPU, and 3 to 3.3 with an 8 core CPU. Load times alone are reduced by about 40% when using a dual core processor.
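
Those quoted scaling factors line up with Amdahl's law for a workload that is roughly 80-90% parallel; here's a quick back-fit (my own calculation, not from the developer):

```python
def parallel_fraction(speedup, cores):
    """Invert Amdahl's law, S = 1 / ((1 - p) + p/n), to recover the parallel
    fraction p implied by a measured speedup S on n cores."""
    return (1.0 - 1.0 / speedup) / (1.0 - 1.0 / cores)

# Using the upper end of each quoted range:
print(round(parallel_fraction(1.8, 2), 2))  # ~0.89 implied on 2 cores
print(round(parallel_fraction(2.5, 4), 2))  # ~0.80 implied on 4 cores
print(round(parallel_fraction(3.3, 8), 2))  # ~0.80 implied on 8 cores
```

The implied parallel fraction sits around 0.8, which is also why quad-core chips like these APUs benefit so much from engines built this way.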


----------



## DaveLT

Yeah, waiting for FM2+ to provide an IPC boost to somewhere between Nehalem and Sandy Bridge.
That way I can finally put aside my Xeon


----------



## litoralis

A10-6800k with RAM OC'ed to 2400 and OC'ed HD6670 or HD7750 in Hybrid Crossfire, Anyone try this yet? Any Benchmarks?

I've been running this thread: http://www.overclock.net/t/1312363/a10-5800k-with-ram-oced-to-2400-and-oced-hd6670-in-crossfire-anyone-try-this-yet-any-benchmarks
A10-5800k with RAM OC'ed to 2400 and OC'ed HD6670 in Crossfire, Anyone try this yet? Any Benchmarks?

and I've been asked to build a HTPC light gaming PC again for a friend.

The A10-6800k is my first choice of CPU here, but I've not kept up with the OC'ing potential.

Baseline for the proposed system:

AMD A10-6800k
ASRock FM2A75M-ITX Rev. 2.0 Socket FM2
Corsair H60 or Corsair Hydro Series H80i Extreme
8 GB of some decent RAM
an SSD
OC'ed HD6670 or HD 7750
Is this a reasonable build concept? Can I actually OC the 6800k like the 5800k?


----------



## Devildog83

Quote:


> Originally Posted by *litoralis*
> 
> A10-6800k with RAM OC'ed to 2400 and OC'ed HD6670 or HD7750 in Hybrid Crossfire, Anyone try this yet? Any Benchmarks?
> 
> I've been running this thread: http://www.overclock.net/t/1312363/a10-5800k-with-ram-oced-to-2400-and-oced-hd6670-in-crossfire-anyone-try-this-yet-any-benchmarks
> A10-5800k with RAM OC'ed to 2400 and OC'ed HD6670 in Crossfire, Anyone try this yet? Any Benchmarks?
> 
> and I've been asked to build a HTPC light gaming PC again for a friend.
> 
> The A10-6800k is my first choice of CPU here, but I've not kept up with the OC'ing potential.
> 
> Baseline for the proposed system:
> 
> AMD A10-6800k
> ASRock FM2A75M-ITX Rev. 2.0 Socket FM2
> Corsair H60 or Corsair Hydro Series H80i Extreme
> 8 GB of some decent RAM
> an SSD
> OC'ed HD6670 or HD 7750
> Is this a reasonable build concept? Can I actually OC the 6800k like the 5800k?


I just bought the FM2A75M-DGS micro ATX. It looks nice but I haven't got a CPU yet. Man the thing is narrow for a micro ATX. Good features though. It's a budget build and I hope the board will be OK.


----------



## MKHunt

I have all that you're seeking (even 2400 CL9 1T RAM) but lack the 6670 lol. My 6800k rig was HTPC only in build concept, so the dGPU was never ordered


----------



## tuffy12345

I have a 6800K with 2133 RAM, and I just got a 7750 in today. The dual graphics option shows up, and allows me to enable, but it won't stick. I'm tinkering, hopefully I will be able to figure it out.

EDIT: Appears to be working just fine. Just had to update to the most recent drivers. Windows experience index or whatever jumped from a 5.1 in desktop graphics performance to a 6.9. Now to see if there is any improvement from games.

EDIT: Ok, played a couple. Today, I was playing CS:GO at 720p and getting about 70-80fps, just tried it in 1440p and was getting roughly the same, plus about 5fps. I used to play Skyrim at 1600x900 on medium and get 20fps constant. I booted it up for S&Gs, ran it at 1440p on high and had 45-50.

Very satisfied so far (an hour in). Can't wait to see how this technology comes along.


----------



## Mopar63

Quote:


> Originally Posted by *litoralis*
> 
> A10-6800k with RAM OC'ed to 2400 and OC'ed HD6670 or HD7750 in Hybrid Crossfire, Anyone try this yet? Any Benchmarks?
> 
> I've been running this thread: http://www.overclock.net/t/1312363/a10-5800k-with-ram-oced-to-2400-and-oced-hd6670-in-crossfire-anyone-try-this-yet-any-benchmarks
> A10-5800k with RAM OC'ed to 2400 and OC'ed HD6670 in Crossfire, Anyone try this yet? Any Benchmarks?
> 
> and I've been asked to build a HTPC light gaming PC again for a friend.
> 
> The A10-6800k is my first choice of CPU here, but I've not kept up with the OC'ing potential.
> 
> Baseline for the proposed system:
> 
> AMD A10-6800k
> ASRock FM2A75M-ITX Rev. 2.0 Socket FM2
> Corsair H60 or Corsair Hydro Series H80i Extreme
> 8 GB of some decent RAM
> an SSD
> OC'ed HD6670 or HD 7750
> 
> Is this a reasonable build concept? Can I actually OC the 6800k like the 5800k?


Solid build. I have a 6800K on the Gigabyte F2A85ZXN-WiFi and was able to hit 4.6 GHz pretty easily. I used 8 GB of Kingston 2133 Beast, and this little beast is a rocket. Get yourself an OCZ or Samsung SSD and hang on for the ride...


----------



## DaveLT

Quote:


> Originally Posted by *Mopar63*
> 
> Solid build, I have a 6800K using the Gigabyte F2A85ZXN-WiFi and was able to hit 4.6 Ghz pretty easily. I used 8 gig of Kingston 2133 Beast and this little beat is a rocket. Get yourself an OCZ or Samsung SSD and hang on for the ride...


Consider Plextor M5 Pro SSDs instead. Or the M5S.


----------



## peter-mafia

Got a Kill-a-Watt meter. Under full load (Prime95 + FurMark, 1280x1024, no AA) my rig draws 241 W. The PSU is a PowerMan 300W (basically no-name, NOT 80 Plus). I think it's responsible for 30-40% of that total. Not bad

IN WIN BP655.300TB3L (300W PSU)
mini ITX Asrock Fm2A85x-ITX
GSklill-Z series 8GB 2400 10-12-12-31
Powercolor 7750 Low Profile (custom cooler) @865/1250
A10-6800K @ 4400mhz at 1.425V. De-Lidded
CPU NB 1.4V
CPU NB frequency 2400
IGD (APU graphics) 1169mhz
Cooler Zalman CNPS8000B
Thermal Grease- CoolLaboratory Liquid Pro
SSD OCZ Vertex plus R2 120GB
HDD WD 250GB
BD-RE Lite-On
Win 8 Pro
Catalyst 13.4

*tuffy12345*, great news

You are right, DG won't work with the 7750 unless you use Catalyst 13.4+


----------



## Opcode

Quote:


> Originally Posted by *peter-mafia*
> 
> Got a Kill-a-Watt meter. Under full load (prime95+ FurMark 1280*1024 no AA) my rig drains 241w. The PSU is PowerMan 300W (basically, noname. NOT 80 plus). I think it's responsible for 30-40% of that total. Not bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> IN WIN BP655.300TB3L (300W PSU)
> mini ITX Asrock Fm2A85x-ITX
> GSklill-Z series 8GB 2400 10-12-12-31
> Powercolor 7750 Low Profile (custom cooler) @865/1250
> A10-6800K @ 4400mhz at 1.425V. De-Lidded
> CPU NB 1.4V
> CPU NB frequency 2400
> IGD (APU graphics) 1169mhz
> Cooler Zalman CNPS8000B
> Thermal Grease- CoolLaboratory Liquid Pro
> SSD OCZ Vertex plus R2 120GB
> HDD WD 250GB
> BD-RE Lite-On
> Win 8 Pro
> Catalyst 13.4
> 
> *tuffy12345*, great news
> 
> 
> 
> 
> 
> 
> 
> You are right, DG won't work with the 7750 unless you use Catalyst 13.4 or newer.


It's worth trying the new 13.8 beta drivers to see if they improve performance at all.


----------



## Derp

Quote:


> Originally Posted by *peter-mafia*
> 
> Got a Kill-a-Watt meter. Under full load (prime95+ FurMark 1280*1024 no AA) my rig drains 241w. The PSU is PowerMan 300W (basically, noname. NOT 80 plus). I think it's responsible for 30-40% of that total. Not bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> IN WIN BP655.300TB3L (300W PSU)
> mini ITX Asrock Fm2A85x-ITX
> GSklill-Z series 8GB 2400 10-12-12-31
> Powercolor 7750 Low Profile (custom cooler) @865/1250
> A10-6800K @ 4400mhz at 1.425V. De-Lidded
> CPU NB 1.4V
> CPU NB frequency 2400
> IGD (APU graphics) 1169mhz
> Cooler Zalman CNPS8000B
> Thermal Grease- CoolLaboratory Liquid Pro
> SSD OCZ Vertex plus R2 120GB
> HDD WD 250GB
> BD-RE Lite-On
> Win 8 Pro
> Catalyst 13.4
> 
> *tuffy12345*, great news
> 
> 
> 
> 
> 
> 
> 
> You are right, DG won't work with the 7750 unless you use Catalyst 13.4 or newer.


Even with a poor PSU, that seems really high for that build.


----------



## truckerguy

Well, I've ordered an AMD Richland A10-6800K and an ASRock FM2A85X Extreme4 (A85X). I have a Mushkin Enhanced Chronos 240 GB SSD, some Samsung Wonder RAM, and a case with a water-cooling loop. I'm looking to see if I can get this APU to run at 5 GHz day in, day out.


----------



## beers

Quote:


> Originally Posted by *Derp*
> 
> Even with a poor psu, that seems really high for that build.


That would be a >30 W difference just from going from 70% to 80% PSU efficiency.
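That estimate checks out. A quick back-of-the-envelope sketch, assuming peter-mafia's 241 W wall reading and a roughly 70% efficient no-name unit (both assumed figures):

```python
# Assumed figures: 241 W measured at the wall, ~70% efficiency for the no-name PSU.
wall_watts = 241.0
dc_watts = wall_watts * 0.70               # power actually delivered to the components

# Same DC load through a hypothetical 80%-efficient PSU
wall_at_80pct = dc_watts / 0.80

print(round(dc_watts, 1))                  # 168.7 W to the hardware
print(round(wall_watts - wall_at_80pct))   # ~30 W saved at the wall from efficiency alone
```

So most of the "high" reading is just the PSU burning watts as heat.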


----------



## Opcode

Quote:


> Originally Posted by *truckerguy*
> 
> Well I've order a AMD Richland A10-6800K and a ASRock FM2A85X Extreme4 FM2 AMD A85X I have a Mushkin Enhanced Chronos 240gig ssd and some Samsung wonder ram a case with a w/c loop. Im looking to see if I can get this apu to run at 5GHz day in day out


It will punch 5.0 GHz at around 1.525 V, though you might need a bit more to get it stable. The iGPU I have in mine is impressive; I am able to get it to 1100+ MHz on the stock iGPU volts. I've been tempted to see if I can do 1300+ MHz with the iGPU, though it doesn't scale all that well because it's being held back by my 1866 MHz memory. These APUs love memory speed: each jump in memory frequency (1333 -> 1600 -> 1866 -> 2133 -> etc.) gives roughly twice the performance gain of a comparable jump in iGPU core clock. So if you plan on doing some gaming on the iGPU, I would try to get your Wonder sticks to do at least 2133 MHz. Higher would ultimately be better; timings don't matter much when gaming. If you can do 2400 MHz at a CAS of, say, 12, it's still going to offer better gaming performance than 2133 at a CAS of 9.
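To put rough numbers on why memory frequency matters so much for the iGPU, here's a sketch of theoretical peak bandwidth for dual-channel DDR3 at each step listed above (idealized figures; real sustained bandwidth is lower):

```python
# Theoretical peak bandwidth for dual-channel DDR3:
# (effective MT/s) x 8 bytes per transfer per 64-bit channel x 2 channels
def ddr3_peak_gbs(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000.0

for speed in (1333, 1600, 1866, 2133, 2400):
    print(f"DDR3-{speed}: {ddr3_peak_gbs(speed):.1f} GB/s")
# DDR3-1866 -> 29.9 GB/s, DDR3-2133 -> 34.1 GB/s, DDR3-2400 -> 38.4 GB/s
```

The iGPU shares every byte of that with the CPU cores, which is why each frequency step shows up so directly in frame rates.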


----------



## DaveLT

Quote:


> Originally Posted by *Opcode*
> 
> It will punch 5.0 GHz at around 1.525v, tho you might need a bit more to get it stable. The iGPU I have in mine is impressive, I am able to get it to 1100+ MHz on the stock iGPU volts. I've been tempted to see if I can do 1300+ MHz with the iGPU, tho it doesn't scale all that well because its being held back by my 1866 MHz memory. These APU's love memory speed, each jump on the memory frequency (1333->1600->1866->2133->etc) gives twice the amount of performance than a jump on the iGPU core clock. So if you plan on doing some gaming on the iGPU, I would try and get your wonder sticks to do at least 2133 MHz. Tho higher would ultimately be better, timing don't mean nothing when gaming. If you can do 2400 MHz at a CAS of like 12, its still going to offer better gaming performance than at 2133 with a CAS of 9.


*Weak IMC








Otherwise, when Kaveri is out (hopefully with onboard GDDR5) I will buy Trident-X 2400 C11s; they're cheap and do wonders ...


----------



## Opcode

Quote:


> Originally Posted by *DaveLT*
> 
> *Weak IMC
> 
> 
> 
> 
> 
> 
> 
> 
> Otherwise when Kaveri is out (hopefully with onboard GDDR5) i will buy Trident-X 2400 C11s they're like cheap and do wonders ...


From what I hear there will be no GDDR5 support with Kaveri. They don't really need to add it either, since Excavator will use DDR4. APUs will continue to get better and better game performance as they start to utilize the latest technology. If there were GDDR5 support, we would have seen it on the FM2+ boards that are being shown off. Take the ASUS board below as an example; it tops out at 2600 MHz DDR3.

ASUS A88XM-Pro FM2+




----------



## DaveLT

Quote:


> Originally Posted by *Opcode*
> 
> From what I hear there will be no GDDR5 support with Kaveri. They don't really need to add it either, since Excavator will use DDR4. APU's will continue to get better and better game performance as they start to utilize the latest technology. If there was GDDR5 support, we would of seen it on the FM2+ boards that are being shown off. Take the ASUS board below for an example, it tops out a 2600 MHz DDR3.
> 
> ASUS A88XM-Pro FM2+
> 
> 
> Spoiler: Warning: Spoiler!


WAIT. Kaveri is Steamroller, not Excavator, and not using GDDR5 for an HD 7750-level GPU is a BAD, BAD idea!
And no, DDR4 will not beat GDDR5 in any way, since GDDR5 is actually quad-pumped, and 6 GHz GDDR5 chips are a hell of a lot less expensive than 2600 MHz DDR3 ...
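For a rough sense of the gap being argued here, a sketch comparing peak bandwidth, assuming a 128-bit interface on both sides (the bus widths are assumptions for illustration):

```python
# Peak bandwidth = effective transfer rate x bus width in bytes
def peak_gbs(mt_per_s, bus_bits=128):
    return mt_per_s * (bus_bits // 8) / 1000.0

ddr3 = peak_gbs(2133)    # dual-channel DDR3-2133 (2 x 64-bit)
gddr5 = peak_gbs(6000)   # "6 GHz" GDDR5, i.e. 6000 MT/s effective (quad-pumped 1.5 GHz)

print(f"DDR3-2133: {ddr3:.1f} GB/s, GDDR5 @ 6 GT/s: {gddr5:.0f} GB/s")  # 34.1 vs 96
```

Nearly a 3x gap on the same bus width, which is the whole argument for GDDR5 feeding a 7750-class iGPU.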


----------



## agrims

True, but GDDR5 has much higher latency than DDR4. I am sure it will come full circle, but the market has spoken and DDR4 will be the next big thing. GDDR5 is super fast on bandwidth but slow latency-wise, and for the most part a system runs better with low latency. HSA will probably change that as well, but it won't be fully here until Excavator.


----------



## DaveLT

Quote:


> Originally Posted by *agrims*
> 
> True but GDDR5 is super latent compared to DDR4. I am sure it will come full circle but the market has spoken and DDR4 will be the next big thing. 5 is super fast on bandwidth but slow latency wise and for the most part a system runs better with low latency. HSA will probably change that as well, but it won't be around fully until excavator.


Does it matter for GPUs? Nope. GPUs don't give an arse about latency ...


----------



## agrims

True, but none will be incorporated, and I doubt they would even if they wanted to, because they are after a harmonious bond of CPU and GPU. GDDR5 has too much latency for a system running normal programs, so we will be getting what they want. Besides, all they want the GPU to do is help the CPU out with daily tasks, not just gaming.


----------



## Opcode

Quote:


> Originally Posted by *DaveLT*
> 
> WAIT. Kaveri is with Steamroller not Excavator, not using GDDR5 for a HD7750-level GPU is a BAD, BAD IDEA!
> And no, DDR4 will not beat GDDR5 in no way since GDDR5 is actually quad-clocked and 6GHz GDDR5 chips are a hell lot less expensive than 2600MHz DDR3 ...


I think you're missing my point: the upgrade path for GDDR5 would be moot. AMD themselves would have to put the sticks into production, as no other manufacturer makes 240-pin desktop GDDR5 sticks. That would be great for AMD, since they would get most of the business. Though, like I said, it puts AMD in a very delicate place where each APU generation after Kaveri wouldn't offer as much improvement over the previous one. By sticking to DDR3 with Kaveri and then DDR4 with Excavator, AMD can keep people buying their APUs every generation. Kaveri will be a huge step forward with Steamroller, GCN, and HSA support. Then Carrizo will be a huge step forward with most likely GCN 2.0 and DDR4 support. Besides, GDDR5 would only hurt application performance across the board because of its latency.


----------



## s33dless

Started mining with this (basically all of the scrypt-based currencies in rotation) just for ****s. With the iGPU at stock, it gets ~50 kH/s (anything from 48 to 61, basically). While pathetic on its own, it's certainly much better than you'd get CPU mining (<10 kH/s), so using this as the core of a (SCRYPT!!!) mining rig would be very good. All your cards mining... then an extra 50, because why not?

I'd like to see someone with 2400 MHz RAM try it. Or in xfire with a 7750? I remember a few claiming to have done it in this thread; somebody give us a run of the 7750 alone, the iGPU alone, then both in xfire.

Any takers? All you have to do is install the AMD OpenCL stuff, download GUIMiner-Scrypt, and register at a pool. Or solo, doesn't really matter, I just want to see some numbers. I know it sounds like work, but let's bench this for the sake of our fellow enthusiasts, hmm?


----------



## beers

On the subject of compute, the 6800K + 7570 combo I have can burn through around 35,000 keys/s using hashcat against a WPA2 hash.

The iGPU provides just about the same level of performance as the card.


----------



## Devildog83

Do you think you can get a 7770 to work with an A10 or A8?


----------



## Opcode

Quote:


> Originally Posted by *Devildog83*
> 
> Do you think you can get a 7770 to work with an A10 or A8?


Not without hacking up the device driver, and even then it won't be stable. A10 + 7750 shouldn't even be possible, though a driver bug allows it. And it shows, because it scales terribly. A single HD 7770 GHz Edition should provide faster frame rates than an A10 and an HD 7750 in DGM.


----------



## DaveLT

Quote:


> Originally Posted by *Opcode*
> 
> Not without hacking up the device driver, and even then it wont be stable any. A10 + 7750 shouldn't even be possible, tho a driver bug allows it. Which shows, because it scales terribly. A single HD 7770 GHz edition should provide faster frame rates than a A10 and a HD 7750 in DGM.


Proof plz.


----------



## void

So an FM2 mITX board has turned up in NZ. It's the ASRock FM2A85X-ITX. Any thoughts on the quality of this board?


----------



## DaveLT

Quote:


> Originally Posted by *void*
> 
> So a FM2 mITX has turned up in NZ. It's the Asrock FM2A85X-ITX any thoughts on the quality of this board?


Well, it's not bad, much better than the garbage FM2A75 but certainly not the best. Not so keen on the 4+2 power phase design though.


----------



## void

Thanks for your input.









I may just stick to mATX or even jump back up to ATX. What would be the best boards in each of those form factors from either of the major players, Gigabyte or Asus?


----------



## s33dless

Back to the compute discussion: I've noticed something odd when mining. The display driver sometimes crashes when I use the iGPU. I've nailed the symptom down to one event: displaying things.

If too much action (windows moving, charts scrolling, etc.) happens on the displays the iGPU is driving while it mines, the driver crashes. This happens on both Catalyst 13.6 and 13.8, with AMD APP SDK 2.8. I'm sure there's some combo of drivers that would make it work, but forget rolling back; it's not worth the effort, I don't take mining that seriously.

But just an FYI for those who can't get it to run, or get it to run but not for long: keep what's displayed on the iGPU as static as possible and it will crunch away happily.


----------



## darkusx45

Just got my HD 7750 1 GB GDDR5; going to be doing some tests with different drivers.
13.4 driver:

3DMark 11
HD 7750 = P3103
A10-6800k = P2026
DGM A10-6800k + HD 7750 = P4170







I will be working next with the 13.6 beta drivers and the new 13.8.


----------



## Farmer Boe

Quote:


> Originally Posted by *darkusx45*
> 
> Just got my HD 7750 1gb DDR5 going to be doing some test with different drivers.
> 13.4 Driver
> 
> 3DMark 11
> HD 7750 = P3103
> A10-6800k = P2026
> DGM A10-6800k + HD 7750 = P4170
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will be working next with the beta drivers 13.6 and the new 13.8.


Thanks for doing the tests. +rep


----------



## Fabriz89

I have an A8-6600K. The new 13.8 Beta drivers solved an annoying issue I had with my new monitor: it went black randomly many times while I was on the desktop (gaming and fullscreen video didn't have any problems). And they fixed the stuttering I had in Diablo 3 too. Now I'm really satisfied with it.
I'm wondering if, with these drivers, a crossfire with a 7750 would be any good with my processor.


----------



## darkusx45

13.6 beta drivers

3DMark 11
7750 = P3103
A10-6800k = P2026
DGM A10-6800k + HD 7750 = P4173

Not much change, so I don't want to post pics of the 13.6 driver test.

13.8 beta drivers

3DMark 11
7750 = P3041
A10-6800k = P2027
DGM A10-6800k + HD 7750 = P4078







Glad to post results to give back to the community for helping me choose the right HD 7750 card. Let me know if you want me to do any more tests.

Forgot to mention the RAM is @ 2400 MHz in all tests.
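As a rough read on those numbers, here's how the 13.4 Dual Graphics result scales relative to the individual scores (a quick sketch using darkusx45's figures):

```python
# darkusx45's 3DMark 11 scores on Catalyst 13.4
igpu, hd7750, dgm = 2026, 3103, 4170

gain_over_card = (dgm / hd7750 - 1) * 100   # DG uplift over the discrete card alone
efficiency = dgm / (igpu + hd7750) * 100    # vs the (unreachable) sum of both scores

print(round(gain_over_card))  # ~34% faster than the 7750 by itself
print(round(efficiency))      # ~81% of the two scores combined
```

A 34% uplift for "free" iGPU silicon is decent scaling for Dual Graphics, even if it's well short of the two scores simply adding up.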


----------



## Javardo69

Has anyone here undervolted? How low did you get it stable?


----------



## truckerguy

I just ordered a 6800K; I will find its lowest stable voltage.


----------



## Javardo69

What is the max temperature threshold of this processor?


----------



## Opcode

Quote:


> Originally Posted by *Javardo69*
> 
> what is the max threshold temperature of this processor?


74C as I was told by AMD's APU department.


----------



## DaveLT

Quote:


> Originally Posted by *Opcode*
> 
> 74C as I was told by AMD's APU department.


74C IN AIDA64 that is


----------



## s33dless

I got an old GTS 450 the other day and popped it in the box to play with. With CudaMiner I get ~46 kH/s stock; OC'd to a rock-stable 925 core / 1900 mem, I get ~53, which is right where the iGPU was at stock. I know Nvidia cards aren't supposed to be good at mining, and this is an older, lower-end card, but still.

I'm going through and trying out all the games on my 7970 while the iGPU and GTS 450 mine. So far no stutter in anything... God I love heterogeneity.


----------



## Opcode

Quote:


> Originally Posted by *DaveLT*
> 
> 74C IN AIDA64 that is


Well, that's the actual temp limit for the chip. If you use other software it will most likely give you wrong temps (HWMonitor, SpeedFan, etc.), so I would recommend just downloading the trial of AIDA64, or downloading your motherboard's OC suite.


----------



## s33dless

Best I can get out of that GTS 450 is ~57 @ 980/2000. It'll still run with artifacts if I clock it higher, but there's no performance gain. I wish I could OC my iGPU more; too bad my RAM flips out if I get too close to 900 MHz.

Good stuff though. Ran all I could with the iGPU and GTS both mining and got no grief. The worst thing that happened was some artifacting in Skyrim when I got to the Thalmor embassy. I thought it might have been the snow, but no, Winterhold doesn't give me that problem. I went to all the most visually intense places I could think of, and only got a single hiccup in frames (when I tripped on a USB cable and unplugged a bunch of stuff).

I keep forgetting to get that x87 fix for APUs though. Does anybody have a link? I forgot where I saw the latest one...


----------



## DaveLT

Quote:


> Originally Posted by *Opcode*
> 
> Well, that's the actual temp limit for the chip. If you use other software it will most likely give you wrong temps (HWMonitor, Speedfan, etc). So I would recommend just downloading the trial of AIDA64, or downloading your motherboards OC suite.


I actually have a copy of AIDA64 Extreme on my HDD








But anyway, when I really upgrade it will either be to a P67 Deluxe/Sabertooth + i5 3570K AND whatever comes out for Kaveri ...
It's nice to have two systems


----------



## Indy1944

Hey guys, have a question for you. I recently built my A10-6800K rig. I have duel graphics enabled and an HD 6670 DDR3 2 GB card, with DDR3 2133 RAM in there with great timings. So why is it stuttering? I even have the latest Catalyst 13.8 drivers.


----------



## Indy1944

I mean, I'd like to get the hybrid crossfire working, but since I went back to just the overclocked iGPU I must say the game quality is smooth and fluid by itself. Must have something to do with that Dominator Platinum memory running at 2133.


----------



## DaveLT

Quote:


> Originally Posted by *Indy1944*
> 
> I mean I'd like to get the hybrid crossfire working but since I went back to just the gpui overclocked I must say the game quality is smooth and fluid by itself. Must have something to do with that dominator platinum memory running at 2133


Nah, it isn't anything to do with the dom plats ... You might need to try 2400


----------



## Indy1944

DaveLT, you're not understanding what I'm saying... I said I'm getting great performance with the iGPU only. To a certain point performance is about the quality of your hardware, and the Dominator Platinums are far superior to anything out there.


----------



## Indy1944

I have used many RAM kits and the Dom Plats are the only ones that were truly plug and play, nice tight timings built in.


----------



## s33dless

Hey now, let's not let this turn into a flame war.

I tried adding an HD 46-something (I forget which) to the pile, but it caused all sorts of driver conflicts. I thought the GTS 450 might be the issue, but removing it made no difference. I can get either the iGPU or the 46xx to run, but not both. They both appear in Device Manager, but one will have a "Code 43".

And AMD support is closed until Wednesday, which is fun. I know it's a weird setup, but they shoved everything in the HD 4xxx line into legacy, meaning it should have zero problems ever, hence why they stopped development on the drivers. Looks like some programmers are going to be in for some extra work this week...


----------



## glussier

RAM frequency and timings have a great effect on performance. RAM brand has very little effect on performance, even for an APU.


----------



## Indy1944

We can go at it all day. We all have our favorites...


----------



## glussier

I buy Kingston or Corsair indifferently. And the only reason I only buy these 2 brands is because this is what my dealer has in stock.


----------



## s33dless

Quote:


> Originally Posted by *glussier*
> 
> I buy Kingston or Corsair indifferently. And the only reason I only buy these 2 brands is because this is what my dealer has in stock.


Best reason to buy anything bro.


----------



## DaveLT

Quote:


> Originally Posted by *s33dless*
> 
> Best reason to buy anything bro.


I only buy Kingston or G.Skill because I care about my memory OCs ... Staying away from Corsair because their RAM is very tightly binned now.


----------



## Devildog83

Quote:


> Originally Posted by *DaveLT*
> 
> I only buy Kingston or G.Skill because i care about my memory OCs ... Staying away from Corsair because their RAM is very tightly binned now.


G.Skill Trident X gets my vote. I have the 2400 kit and love the stuff.


----------



## DaveLT

Quote:


> Originally Posted by *Devildog83*
> 
> G-Skill Trident X get's my vote. I have the 2400 and love the stuff.


AND IT'S CHEAP!


----------



## Mopar63

Personally the only ram I use is Kingston HyperX. It is the only brand I have used without ever having stick failure or a single compatibility issue.


----------



## bassium

Still in the midst of upgrading my rig. Any recommendations for a mobo/PSU to go with the A10-6800K? My motherboard can't change voltages, so it's definitely time to upgrade : D


----------



## Indy1944

ASRock Extreme6 FM2, easiest board to overclock. If you're near a Micro Center you can get 30 dollars off a combo deal.


----------



## Devildog83

Quote:


> Originally Posted by *DaveLT*
> 
> AND IT'S CHEAP!


In all of the rigs I have had, rebuilt, or built, I have used Kingston HyperX, Samsung Wonder, Corsair Vengeance, G.Skill Trident X, and Team Vulcan RAM, and I have never had a stick fail, ever. Must be lucky I guess. I like the G.Skill best of all of them, but they all worked well.


----------



## Opcode

Quote:


> Originally Posted by *bassium*
> 
> Still in the midst of upgrading my rig. Any recommendations for a mobo/PSU to go with the a10-6800k? My motherboard can't change voltage so it's definitely time to upgrade : D


Quote:


> Originally Posted by *Indy1944*
> 
> ASRock extreme 6 FM2, easiest board to overclock, if your near a Micro Center you can get 30 dollors off a combo deal


I do own an Extreme6; it is a pretty decent board. The only issue that I have with mine is that the BIOS runs the CPU at 100% load when you go into it to change settings. Otherwise there isn't an issue, so if you do get one, watch your CPU temps when fiddling around in the BIOS. On the upside it does pack a beefy 8+2 phase VRM, which is quite a bit for a 100 W TDP chip. I was able to get my 6800K to 5.0 GHz on 1.50 V (not stable), so it definitely can overclock.


----------



## Indy1944

Quote:


> Originally Posted by *Opcode*
> 
> I do own a Extreme6, it is a pretty decent board. The only issue that I have with mine, is the BIOS runs the CPU at 100% load when you go into it to change settings. Otherwise there isn't an issue, so if you do get one watch your CPU temps when fiddling around in the bios. On the up side it does pack a beefy 8+2 phase vrm which is quite a bit for a 100w TDP chip. I was able to get my 6800k to 5.0 GHz on 1.50v (not stable) so it definitely can overclock.


Yeah, I noticed my CPU speed stayed maxed. But I'm usually only gaming on it so I'm OK, plus temps are low, so I don't worry too much. Honestly the best board I've used; yes, the 2133 memory was a breeze. I haven't overclocked the memory but might try.


----------



## s33dless

FYI for those planning to xfire with a 7750:

I called AMD about the HD 4650 I popped in for legacy development/mining, and they claim that they offer absolutely no support for mixing cards from different families (not even for xfire, which I wasn't trying, but as in running them at all). This is news to me, as I'd done it plenty of times before with no issue.


----------



## Indy1944

I have the HIS HD 6670 2 GB version and I can't figure out how to set up duel graphics; it lags like nuts.


----------



## peter-mafia

Quote:


> Originally Posted by *s33dless*
> 
> they offer absolutely no support for mixing cards from different families


I would be surprised if they did.
Wow, AMD dropped the prices on the 7950, 7970 and 7990 (yesterday the 7990 was selling for $650). While I was debating whether I needed the 7990, it was gone already...
When the 7950 goes for $180 there is little to no reason to crossfire the 6800K. The only problem is it's huge. I'm grateful to Nvidia for rolling out the GTX 760.


----------



## Artikbot

Quote:


> Originally Posted by *peter-mafia*
> 
> When the 7950 goes for $180 there is little to no reason to crossfiring the 6800k..


That won't happen. Or if it does, Kaveri will already be out.

Still, a dedicated GPU is outside the scope of an APU build.

For instance, you could not fit a dedicated GPU, no matter which one, inside my all-in-one chassis.


----------



## peter-mafia

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127737
Already happened. I'm so excited. It won't fit in my case either, but I've got one crazy idea (hacksaw)


----------



## DaveLT

Quote:


> Originally Posted by *Opcode*
> 
> I do own a Extreme6, it is a pretty decent board. The only issue that I have with mine, is the BIOS runs the CPU at 100% load when you go into it to change settings. Otherwise there isn't an issue, so if you do get one watch your CPU temps when fiddling around in the bios. On the up side it does pack a beefy 8+2 phase vrm which is quite a bit for a 100w TDP chip. I was able to get my 6800k to 5.0 GHz on 1.50v (not stable) so it definitely can overclock.


It's an awesome board for its price. In fact that board has claimed the most top spots for CPU OC'ing on HWBot








Quote:


> Originally Posted by *peter-mafia*
> 
> I would be surprised if they did.
> Wow,AMD dropped the prices on the 7950, 7970 and 7990 (yesterday it was selling for$650).Whiie I was debating on whether I needed the 7990 it was gone already...
> When the 7950 goes for $180 there is little to no reason to crossfiring the 6800k.. The only problem is it's huge. I'm grateful to Nvidia for rolling out the gtx760


What? The 7950's huge? Just so you know, Kepler cards do run hot








Besides, a 7950 @ 860 MHz is just as fast as a GTX 760 @ boost clock ...


----------



## nitrubbb

any FM2+ mobos for sale yet?


----------



## Indy1944

Quote:


> Originally Posted by *nitrubbb*
> 
> any FM2+ mobos for sale yet?


Check Newegg. They usually get them first.


----------



## Indy1944

Anyone have luck duel graphicing this thing? A10 6800 and HD 6670


----------



## peter-mafia

A lot of people did. You need to be more specific, though. It's quite obvious the problem is at your end.


----------



## peter-mafia

Keep *duel*ing then. I know you'll defeat your foe )))


----------



## void

Quote:


> Originally Posted by *nitrubbb*
> 
> any FM2+ mobos for sale yet?


I don't think so. Asus has shown off a few but I'm certain they haven't been released to the public.
Quote:


> Originally Posted by *Indy1944*
> 
> Another know it all


Where in the process are you hitting problems? It is an officially supported combination so it shouldn't be too hard to get working.


----------



## Indy1944

Well, when I enable duel graphics I get great framerates but lag or bad micro-stutter; maybe I'm missing something.


----------



## Durquavian

Quote:


> Originally Posted by *Indy1944*
> 
> well, when I enable duel graphics I get great framerates but lag or bad micro stutter, maybe im missing something


Not sure how they sync clocks for those. I can tell you MSI Afterburner is of great help. http://www.overclock.net/t/1265543/the-amd-how-to-thread This thread will help with setups, but the only part you would still be interested in is the Afterburner (AB) part, really. The 13.8 driver from AMD is straightforward and requires no real changes.


----------



## Mopar63

You know, I have heard a lot of people say the APU cannot be used for a higher-end gaming build because it bottlenecks. I was curious about this, so since I have the new Haswell system up, I decided to strip down Mini Mopar and rebuild it as an APU system using the 6800K.

The APU is overclocked to 4.6 GHz and has 8 GB of DDR3 2133 for memory. The Haswell is an i7 4770 overclocked to 4.3 GHz with 16 GB of DDR3 1600. Both systems were tested in 3DMark using a 7970 GHz Edition at stock speeds.

In Fire Strike the Haswell system scored an overall 7299; the APU came in at 6069, meaning overall the Haswell was about 20% faster. Not quite the bump I expected. Looking at the Physics score, the Haswell destroyed the 6800K with a score 135% faster.

That said, I expected the Haswell to be faster. The 4770 has a faster core, and add to that Hyper-Threading, so this score makes a lot of sense. What I was curious about was whether the APU was bottlenecking the 7970.

The Graphics score on the Haswell was 8096; on the APU it scored 8035. That means the Haswell system was driving the 7970 faster by only 0.7%, that's right, less than a single percentage point. Guess it is safe to say that a 7970 is not bottlenecked by the APU, at least not at high-end gaming levels at 1080p.

Next up is to see how a modded Skyrim plays on the APU compared to the Haswell.
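Sanity-checking the percentages above from the raw scores (same arithmetic as the post, just spelled out):

```python
# Relative speedup from two benchmark scores, in percent
def faster_pct(a, b):
    return (a / b - 1) * 100

print(round(faster_pct(7299, 6069)))     # overall: ~20% (Haswell over APU)
print(round(faster_pct(8096, 8035), 2))  # graphics: ~0.76%, under a single point
```

The graphics delta really is noise-level, which is the whole point of the comparison.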


----------



## Ryude

Quote:


> Originally Posted by *Mopar63*
> 
> You know I have heard a lot of people talk about how the APU cannot be used for a higher end gaming build because it bottlenecks. I was curious about this so since I have the new Haswell system up I decided to strip down Mini Mopar and rebuild it as an APU using the 6800k.
> 
> The APU is overclocked to 4.6 GHz and has 8 gigs of DDR3 2133 for memory. The Haswell is an i7 4770 overclocked to 4.3 GHz and has 16 gig of DDR3 1600. Both systems were tested in 3DMark using a 7970 GHz Edition at stock speeds.
> 
> In Firestorm the Haswell system scored an overall rating of 7299.0, the APU hit in at 6069, this means overall the Haswell was showing as about 20% faster. Not quite the bump I expected. Looking at the Physics score the Haswell destroyed the 6800K with a score that was 135% faster.
> 
> However I expected the has well to be faster. The 4770 has a faster base core and add to that Hyper threading this score makes a lot of sense. However I was curious if the APU was bottle necking the 7970.
> 
> The Graphics score on the Haswell was 8096, on the APU it scored 8035, that means the Haswell system was only driving the 7970 faster by a rate of 0.7%, that's right less than a single percentage point. Guess it is safe the say that a 7970 is not bottle necked by Haswell at least not at high end gaming levels with 1080 resolution.
> 
> Next up is to see how a modded Skyrim plays on the APU compared to the Haswell.


Synthetic tests like that won't show true bottlenecking. Go play BF3 or SC2.


----------



## Durquavian

Quote:


> Originally Posted by *Mopar63*
> 
> You know I have heard a lot of people talk about how the APU cannot be used for a higher end gaming build because it bottlenecks. I was curious about this so since I have the new Haswell system up I decided to strip down Mini Mopar and rebuild it as an APU using the 6800k.
> 
> The APU is overclocked to 4.6 GHz and has 8 gigs of DDR3 2133 for memory. The Haswell is an i7 4770 overclocked to 4.3 GHz and has 16 gig of DDR3 1600. Both systems were tested in 3DMark using a 7970 GHz Edition at stock speeds.
> 
> In Firestorm the Haswell system scored an overall rating of 7299.0, the APU hit in at 6069, this means overall the Haswell was showing as about 20% faster. Not quite the bump I expected. Looking at the Physics score the Haswell destroyed the 6800K with a score that was 135% faster.
> 
> However I expected the has well to be faster. The 4770 has a faster base core and add to that Hyper threading this score makes a lot of sense. However I was curious if the APU was bottle necking the 7970.
> 
> The Graphics score on the Haswell was 8096, on the APU it scored 8035, that means the Haswell system was only driving the 7970 faster by a rate of 0.7%, that's right less than a single percentage point. Guess it is safe the say that a 7970 is not bottle necked by Haswell at least not at high end gaming levels with 1080 resolution.
> 
> Next up is to see how a modded Skyrim plays on the APU compared to the Haswell.


Make sure you use the x87 fix first: http://downloads.hwbot.org/downloads/tools/BDC_R1.02B.zip or, if you are reluctant to just trust the download (don't blame you), http://forum.hwbot.org/showthread.php?t=78490


----------



## Opcode

Quote:


> Originally Posted by *Ryude*
> 
> Synthetic tests like that won't show true bottlenecking. Go play BF3 or SC2 *in multiplayer*.


Otherwise the CPU cores do seem to max out my HD 5870 in single-player mode. Though once online, with the CPU having to calculate many more numbers at once, you start to see the bottleneck. I get about 65% GPU utilization on a 32-player map in Bad Company 2, and 99% utilization in the single-player campaign.


----------



## Mopar63

Multiplayer games can often take frame rate hits due to internet issues and are unreliable for testing. I did, however, just do a Skyrim run and actually played hard for about 30 minutes using Fraps. This, BTW, is the same Skyrim setup and settings I use on the Haswell system. The Haswell in testing had a minimum of 51 FPS and an average of 61 FPS. The APU delivered a minimum of 43 FPS and an average of 59.7. Skyrim is a game that can hit hard at the CPU level, so it seems a good test.

Now let me be clear: I am NOT trying to say an APU is equal to an i7 or even close; any fool knows that in raw processor power the i7 destroys it. What I am seeing, however, is that an APU can be mated with a powerful video card and give solid gaming performance, with an experience on par with more expensive CPUs.


----------



## Ryude

Quote:


> Originally Posted by *Mopar63*
> 
> Multiplayer games can often have frame rate hits due to internet issues and are unreliable for testing. I did however just do a Skyrim run and actually played hard for about 30 minutes using Fraps. This BTW is the same Skyrim setup and settings I use on the Haswell system. Haswell in testing has a minimum rate of 51 FPS and an average of 61 FPS. The APU delivered a minimum of 43 FPS and an average of 59.7. Skyrim is a game that can take a serious hit at the CPU level so it seems a good test.
> 
> Now let me be clear I am NOT trying to say an APU is equal to an i7 or even close, any fool knows in raw processor power the i7 destroys it. What I am seeing however is that an APU can be mated with a powerful video card and give some solid gaming performance with an experience on par with more expensive CPUs.


I know what you're saying, but for most multiplayer games Intel is way better. Single-player games seem to do just fine on an APU. Look at my sig rig: I have an FX-6300 and an i5-4670K, both overclocked to the same 4.6GHz. On the AMD system I get massive framerate drops in Neverwinter, SC2, Firefall, BF3, etc. On the Intel system I don't have that; it's smooth as butter.

Edit: I used the same 7950 in both systems.


----------



## Indy1944

You can't compare the two CPUs, c'mon, it's not an even fight.


----------



## Mopar63

Quote:


> Originally Posted by *Ryude*
> 
> I know what you're saying, but for most multiplayer games intel is way better. Single player games seem to do just fine on APU. Look at my sig rig, I have an FX-6300 and a i5-4670K both overclocked to the same 4.6GHz. On the AMD system I get massive framerate drops in Neverwinter, SC2, Firefall, BF3, etc. On the Intel system I don't have that, it's smooth as butter.
> 
> Edit: I used the same 7950 in both systems.


Okay, fired up Neverwinter to see. In PvP and PvE at 1080p with the in-game sliders at max, I am able to play with an identical gaming experience on both systems. To make this test fair, BTW, I dual-boxed: both at the same time, on the same server and the same connection.

Quote:


> Originally Posted by *Indy1944*
> 
> You cant compare the 2 cpu, cmon, not an even fight,


Well, that is only partially true. In benchmarks with a pure raw-performance test the Haswell smashes it, and I have never said otherwise. However, when you get to the gaming experience and turn off the benchmarks, the performance is much closer than anyone would imagine.

Now, someone commented here that synthetic benchmarks are not meaningful, and I usually would be in 100% agreement. However, for this particular round of testing and what I was looking for, I am going to disagree. I was looking at how the CPU affects the performance of the GPU. In this case, if the APU was holding back the 7970, it should have shown up as the Haswell scoring a much higher GPU test score. The fact is, however, that the APU appears to have enough power, at least in this test (1080p gaming), to push the 7970 just as well as the Haswell.
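For anyone who wants the arithmetic behind that claim, a one-liner sketch using the GPU sub-scores reported earlier in the thread (variable names are mine):

```python
# GPU sub-scores reported earlier: Haswell 8096 vs APU 8035.
haswell_gpu, apu_gpu = 8096, 8035

# If the APU were starving the 7970, this gap would be much larger.
delta_pct = (haswell_gpu - apu_gpu) / apu_gpu * 100
print(f"Haswell drives the 7970 only {delta_pct:.2f}% harder")
```

That works out to roughly three-quarters of a percent, matching the "under one percentage point" observation from earlier in the thread.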


----------



## Durquavian

Quote:


> Originally Posted by *Mopar63*
> 
> Okay fired up Neverwinter to see. In PvP and PVE at 1080 with the sliders for in game at max I am able to play with identical gaming experience on both systems. To make this test fair BTW I dual boxed, so both at the same time on the same server and the same connection.
> Well that is only partially true, in benchmarks with a pure raw performance test the Haswell smashes it and I have never said otherwise. However when you get to the gaming experience and turn off the benchmarks the performance is much closer than anyone would imagine.
> 
> Now someone commented here that synthetic benchmarks are not meaningful and I usually would be in 100% agreement. However for this particular round of testing and what I was looking for I am going to disagree. I was looking at how the CPU effects the performance of the GPU. In this case if the APU was holding back the 7970 it should have shown with the Haswell scoring a much higher GPU test score. The fact is however when looking at the score that the APU appears to have enough power, at least in this test (1080 gaming) to push the 7970 just as well as the Haswell.


Agreed. The question is, "Can it perform well enough to be playable?" And as most who own these systems will tell you, YES. Is it the best experience possible? Probably not, but it is good enough that probably 80% of users wouldn't notice.


----------



## DaveLT

Quote:


> Originally Posted by *Durquavian*
> 
> Agreed. The question is "Can it perform well enough to be playable"? And as most will tell you, that own these systems, YES. Is it the best experience possible, probably not, but it is good enough that probably 80% of users wouldn't notice.


Now comes the question of why Intel "users" knock it, saying it's unplayable with a 7970.


----------



## glussier

Nobody, even an Intel user, will tell you that it is unplayable with a 7970. What they will tell you is that with an APU the 7970 won't give its maximum performance, which is totally the truth.


----------



## Durquavian

Quote:


> Originally Posted by *glussier*
> 
> Nobody, even an Intel user, will tell you that it is unplayable with a 7970. What they will tell you is that with an APU the 7970 won't give it's maximum performance, which is totally the truth.


I wouldn't say nobody; maybe "most rational people" would be a better term.


----------



## DaveLT

Quote:


> Originally Posted by *Durquavian*
> 
> I wouldn't say nobody, maybe most rational people would be a better term.


It won't run at full potential compared to a 4770K, yes, but that chip is in another world of pricing, plus their mobos are pricier than ever.


----------



## glussier

Quote:


> Originally Posted by *DaveLT*
> 
> It won't run at full potential when compared to a 4770k yes but that chip is on the other world of pricing + their mobos are pricier than ever


It won't even match a 4670K or a 3570K.


----------



## Indy1944

See... I think part of the issue is people are stuck on benchmarks, and they tend to forget that for real-life usage (games and such) it is more than enough as far as processing power is concerned.


----------



## DaveLT

Quote:


> Originally Posted by *glussier*
> 
> It won't even match a 4670k or a 3570k


Which is still far more expensive than the 6800K is. At 50% more cost, it had better be.


----------



## glussier

Where I live it's $224.00 vs $270.00, so I wouldn't call that far more expensive. If you can spend the money on an HD 7970, you sure can spend the difference on the Intel CPU.


----------



## Indy1944

I'm going to buy a dedicated video card to play BF3 on my A10-6800K rig, and I'm going to disable the GPU embedded in the CPU. Any recommendations on a GPU?


----------



## DaveLT

Quote:


> Originally Posted by *Indy1944*
> 
> I'm going to buy a dedicated video card to play BF3 on my A10 6800 rig. I'm going to disable the gpu imbedded in the CPU. Any recommendations on GPU?


Powercolor 7850 PCS+
The best 7850 you can get for your money


----------



## glussier

Like I told you in another thread, the 7850 would be the best match for an APU. I would go with the Gigabyte model, as it's clocked @ 975MHz, but that's just me.


----------



## Mopar63

Quote:


> Originally Posted by *glussier*
> 
> Nobody, even an Intel user, will tell you that it is unplayable with a 7970. What they will tell you is that with an APU the 7970 won't give it's maximum performance, which is totally the truth.


Actually, it is only partially the truth, as my tests are showing. At 1080p, which the 7970 is overkill for, the APU is not having trouble feeding the GPU. The 7970 does not begin to stretch its legs until higher resolutions, but Anand showed that even up to 1440p the APU can keep the 7970 fed. The issue, I think, comes in as you go even higher in resolution; there I am pretty sure the APU will begin to fall short.

Now, does it make sense to buy a 7970 with an APU? Of course not; the system is unbalanced. However, it also makes no sense to buy a 7970 if you are only going to game at 1080p. I think what I am showing is that the APU makes a very acceptable gaming system and at 1080p delivers an experience to rival Haswell, even up to the i7.


----------



## Pip Boy

If you only experience a 20-30% loss in peak frames on an APU at 1080p and above vs a decent Intel, then that's not bad really. It should mean that Kaveri will remove the 20% loss, and most people in real-world gaming situations will see almost no difference.

For what it's worth, maybe it's placebo, but my A10-5800K @ stock feels snappier on my HTPC than my 1055T @ stock? Not sure why, it just does. I don't mean running compression, file conversion, etc. (although I could check), but general desktop responsiveness (and I'm using Linux, BTW).

I haven't played around much with my setup ATM, been busy with stuff, but I increased the multi and booted fine @ 4.5GHz. I might add some volts and see if I can go further. Also the temps aren't too bad: about 30C idle in the BIOS with ambient around 25C, in a fairly closed-in case with only 3 fans running around 700rpm.

Still, it can climb quite quickly into the 40s, so I'm going to de-lid it for the japes before I start pushing some bigger clocks.


----------



## Mopar63

Phill1978, the snappiness thing is something I have observed as well. I have noticed my APU systems seem to snap through everyday functions a little quicker than my Haswell or any other Intel. It is not enough to be measurable, but there is a feeling that the system is doing basic tasks quicker.


----------



## Pip Boy

Quote:


> Originally Posted by *Mopar63*
> 
> Phill1978, the snappiness thing is something I have observed as well. I have noticed my APU systems seem to snap through everyday functions a little quicker than my Haswell or any other Intel. It is not enough to be measurable but there is a feeling that the system is doing basic tasks quicker.


Yeah, it's hard to explain, but I dunno, it just feels less laggy, like there is no build-up to an action and then a burst of power, just immediate action. That sounds dumb, but it's what I've experienced. Perhaps it's just that I didn't expect the performance to be so good?

Might be placebo. It's slower, I think, when doing big conversions vs the Phenom, but with these APUs it's all about day-to-day tasks and a bit of light gaming IMO, which they excel at for the price. A lot of people will be impressed when they do some surfing and desktop work with these; they 'feel' fast.


----------



## glussier

Must be a placebo effect; whether the speed is apparent will depend on what you are doing.

I have a Q9650 @ 4GHz, an i5 3570 @ 4.2, a dual-Xeon workstation with 2 Xeon E5-2687W (32 threads), and an AMD 6800K, and in Windows they all seem equally fast. But under the proper workloads my dual-Xeon workstation beats the crap out of all my other computers.


----------



## Indy1944

It could be from the faster RAM people use in the Richland CPUs (2133 and up).


----------



## DaveLT

Quote:


> Originally Posted by *Indy1944*
> 
> It could be from the faster RAM people use in the Richland CPU's(2133 on up)



Hell no it isn't


----------



## Indy1944

Does anyone have a dedicated gpu paired up with the 6800?


----------



## Mopar63

Quote:


> Originally Posted by *Indy1944*
> 
> Does anyone have a dedicated gpu paired up with the 6800?


As I have stated I have a 7970 with one right now. My wife's system has a 6950 with a 5800 and I have tested with 7850 as well and gotten good results.

http://www.overclock.net/t/1347709/amd-richland-a10-6800k-apu-thread/880#post_20573212


----------



## Indy1944

Quote:


> Originally Posted by *Mopar63*
> 
> As I have stated I have a 7970 with one right now. My wife's system has a 6950 with a 5800 and I have tested with 7850 as well and gotten good results.
> 
> http://www.overclock.net/t/1347709/amd-richland-a10-6800k-apu-thread/880#post_20573212


Wow. Pretty impressive.


----------



## Indy1944

Quote:


> Originally Posted by *Mopar63*
> 
> As I have stated I have a 7970 with one right now. My wife's system has a 6950 with a 5800 and I have tested with 7850 as well and gotten good results.
> 
> http://www.overclock.net/t/1347709/amd-richland-a10-6800k-apu-thread/880#post_20573212


May I ask what your specs are?


----------



## Opcode

Quote:


> Originally Posted by *Indy1944*
> 
> Does anyone have a dedicated gpu paired up with the 6800?


HD 5870 paired up with my A10-6800k.


----------



## awdrifter

Can I join this club? I just ordered the A10-6800K and MSI FM2-A55M-E33 mobo combo from TigerDirect. I already have a set of Kingston 8GB DDR3-1600 RAM, so I'm going to use that in this build. This is going to be an HTPC/light-gaming build (old games). Now, I know the mobo is just a budget board, but it seems like it does have some overclocking features. Has anyone used this board? Do you think I can get 4.4GHz (all cores) out of it? Also, are there tools that can overclock the IGP shader clock? Thanks.


----------



## Indy1944

MSI sucks... why would you buy that board?


----------



## awdrifter

It's a bundle from TigerDirect. If it's completely not overclockable, I'm OK with that too. But on the MSI site it does claim that OC Genie II can overclock it a bit. My experience with MSI is not too bad, actually. I built an i5 3570K + HD 7950 rig for a friend using the MSI Z77A-G41, and it was able to overclock the CPU to 4.5GHz stable. So while MSI is bad for extreme overclocking, for a mild overclock, in my experience, it's not too bad.


----------



## DaveLT

Quote:


> Originally Posted by *awdrifter*
> 
> Can I join this club? I just ordered the A10 6800K and MSI FM2-A55M-E33 mobo combo from Tigerdirect. I already have a set of Kingston 8GB DDR3-1600 ram, so I'm going to use that in this build. This is going to be an HTPC/light gaming build (old games). Now I know the mobo is just a budget board, but it seems like it does have some overclocking features. Has anyone used this board? Do you think I can get 4.4ghz (all cores) out of this mobo? Also, are there tools that can overclock the IGP shader clock? Thanks.


Quote:


> Originally Posted by *awdrifter*
> 
> It's a bundle from Tigerdirect. If it's completely not overclockable, I'm ok with it too. But on the MSI site it does claim that OC Genie II can overclock it a bit. My experience with MSI is not too bad actually. I've built a i5 3570k + HD7950 build for a friend using the MSI Z77A-G41 and it was able to overclock the CPU to 4.5ghz stable. So while MSI is bad for extreme overclocking, for a mild overclock in my experience is not too bad.


It is actually completely voltage-unlocked, lol. Now, the MSI A85X (the highest board in the lineup) has claimed quite a few HWBot scores.
Otherwise, 4.4GHz sounds very possible, since the boost frequency is already 4.4, lol.


----------



## awdrifter

I know the APU is unlocked, I'm just not sure if the bios will have the options to allow me to set the IGP shader clocks.


----------



## DaveLT

Quote:


> Originally Posted by *awdrifter*
> 
> I know the APU is unlocked, I'm just not sure if the bios will have the options to allow me to set the IGP shader clocks.


It will, I think...


----------



## agrims

MSI makes a great FM2 board. They excel at making boards for CPUs that don't require much voltage, e.g. Intel, FM2, etc. Their boards are also good on AM3+ with the lower-power chips, like the 6 and 4 cores. They aren't great at supplying the 8 cores what they need unless you go top of the top line. The board you bought doesn't come with VRM heatsinks, so don't push it much past the turbo speeds, as they will get toasty, but that will happen with any board without VRM heatsinks. The MSI boards will allow you to OC anything that can OC, and their Click BIOS 2 is almost dummy-proof, very easy to navigate.

The only real gripe I have with mine (FM2-A85XA-G65A) is that there's a safety in the BIOS that will fight you tooth and nail past a certain point. I want to overvolt the hell out of my CPU, but the board won't let me. If only I could get past it... I can say that the VRMs stay nice and cool on the board, never over 65C, and that is with just heatsinks, no fans on them or anything, and it keeps the voltage very stable all the time. It is a consistent board.


----------



## DaveLT

Their FM2 boards, good? Hell no.
Gigabyte and ASUS make BETTER AM3+ boards from bottom to top, with the exception of the 990FX-UD3.

Their FM2 boards are only worth it when they're cheaper; for my town, no. (ASUS is suffering bad RMA problems in the US.)
The F2A85-M LE costs LESS and is actually A85X and not A55 (which the A55M-E33 is, BTW).


----------



## agrims

OK... you win. My board sucks, even though MSI has some of the highest-OCing FM2 boards on HWBot... Also, the OP stated that the board he got came through a combo deal, meaning he doesn't have a say in a different board through that route. I never said that they make great AM3+ boards. I stated that they are, and I quote, "good in AM3+ in lower power chips, like 6 and 4 cores. They aren't great at supplying the 8 cores what they need, unless you go top of the top line." Please don't argue with plain English. Also, misquoting and misinterpreting are bad business for everyone.

MSI also has one of the easiest-to-navigate BIOSes around. I also warned the OP about the lack of cooling on that board's VRMs, and that goes for ANY board without it when OC'ing. Besides, the OP has already bought the combo deal, Click BIOS 2 will be his best friend, and buying a combo deal tells me he is more likely than not new to the OC scene. DaveLT, you sure do naysay a crap load of anything anyone says; you are like Sex Panther, works 80% of the time, every time.


----------



## Mopar63

Quote:


> Originally Posted by *Indy1944*
> 
> May I ask what your specs are?


My 6800K test rig, I am about to move some parts around before I go final... is:

AMD 6800K APU @ 4.6GHz
Gigabyte F2A85XN-WIFI mITX
Kingston HyperX Beast 2400 (2x4) @ 2133
Sapphire VaporX 7970 GHz
OCZ Vector 256 Gig SSD
Water 2.0 Pro (single fan in push)
Fractal Design Node 304
Fractal Design Integra R2 650

This is a pretty beastly little rig for 1080p gaming. I am still software testing; loading up Crysis 3 later today.

Now, in fairness, this system is a bit overkill, especially the video card. If I had not already had the card I would have gone for a 7870 or maybe a 7950, but no higher. Anything above those at 1080p really does not have a big impact on the gaming experience. You could also shave some money off the cost if you went with a slower SSD, something like a Samsung 840 (non-Pro).

If you wanted the system to be a little physically bigger the Asus F2A85-M Pro is a great mATX board for FM2.


----------



## Durquavian

Quote:


> Originally Posted by *agrims*
> 
> MSI makes a great FM2 board. They accel at making boards for CPU's that don't require much voltage, cue Intel, FM2, etc. Their boards are also good in AM3+ in the lower power chips, like 6 and 4 cores. They aren't great at supplying the 8 cores what they need, unless you go top of the top line. The board you bought doesn't come with VRM heatsinks, so don't push it much past the turbo speeds, as they will get toasty, but that will happen with any board without VRM heatsinks. The MSI boards will allow you to OC anything that can OC, and their Click Bios 2 is almost dummy proof, so easy to navigate.
> 
> The only real grip I have with mine is that they have a safety in the bios, (FM2-A85XA-G65A), that will fight you tooth and nail past a certain point. I want to over volt the hell out of my CPU, but the board won't let me. If only I could get past it... I can say that the VRM's stay nice and cool on the board, never over 65c, and that is just with heatsinks, no fans on them or anything, and it keeps the voltage very stable all the time. It is a consistent board.


Actually, the 8-core Visheras require a 990-chipset board. Nearly all 970-chipset boards (maybe 1 or 2 exceptions) will not handle the power draw of 8 cores. The only issue MSI has is that voltage cap in the BIOS. I get past mine by using the FSB to log in at a lower clock so it doesn't BSOD/lock up, and once logged in I use MSI Control Center to adjust the voltage to much higher levels and the FSB to attain the desired clock (4.8GHz on mine for now).


----------



## Durquavian

Quote:


> Originally Posted by *awdrifter*
> 
> It's a bundle from Tigerdirect. If it's completely not overclockable, I'm ok with it too. But on the MSI site it does claim that OC Genie II can overclock it a bit. My experience with MSI is not too bad actually. I've built a i5 3570k + HD7950 build for a friend using the MSI Z77A-G41 and it was able to overclock the CPU to 4.5ghz stable. So while MSI is bad for extreme overclocking, for a mild overclock in my experience is not too bad.


OC Genie never worked worth a **** on mine. Not sure if it worked for many.


----------



## awdrifter

Quote:


> Originally Posted by *agrims*
> 
> Ok.. you win. My board sucks even though MSI has some of the highest OC FM2 board in HWbot... Also, the OP stated that the board he would get is through a combo deal, means he doesn't have a say in a different board through that route. I never said that they make great AM3+ boards. I stated that they are, and I quote, "Good in AM3+ in lower power chips, like 6 and 4 cores. They aren't great at supplying the 8 cores what they need, unless you go top of the top line." Please don't argue with plain English. Also, misquoting and misinterpreting are bad business for everyone.
> 
> MSI also has one of the easiest to navigate bios' around. I also warned the OP about no cooling options on the VRM's with that board, and that goes for ANY board without it and OC'ing. Besides, if the OP has already bought the combo deal, and the Click Bios 2 will be his best friend, and buying the combo deal tells me that he more likely than not is new to the OC scene. DaveLT, you sure do naysay a crap load anything anyone says, you are like sex panther, works 80% of the time, every time.


I'm only looking for a mild OC. In my experience, Intel CPUs can usually run at their turbo clocks on all cores without raising the voltage, so I'm hoping the 6800K can do the same (or maybe with a minor voltage bump like 0.05V). As long as the BIOS has the option, I think I'll be OK with it.


----------



## agrims

You will be fine. They turbo at the same voltage as base clock. You may have to up it a hair for an all-core turbo OC, but that will be fine on the board you will get. Also, the "you win" bit was directed at DaveLT...


----------



## Pip Boy

Just a heads-up to anyone running a Gigabyte F2A85 board. I have an A10-5800K, and because of the funny temp sensor locations and the fact I run Linux, I couldn't get a proper reading from the APU. I was worried and about to de-lid for better temps: sat idle in the BIOS, the system would start at a nice low temp and then rise to 47C from a 25C ambient, in a quite closed-in Fractal Design Define Mini case with good fans, though they only run low in the BIOS, 500-600rpm.

Anyway, I had heard of others seeing this immediate temperature rise from a sensible "about ambient" reading on a well-ventilated system up to loaded-core values, but hadn't thought any more about it, because I had expected a loaded core to run a lot hotter than 47C. So I bought a cheap power meter to measure draw from the wall and booted into the BIOS, minus the monitor but with the HDD and 4 fans.

Setup: Seasonic 350W Gold PSU / A10-5800K 100W TDP / 8GB Samsung green 1600MHz RAM / 128GB SSD / WD Black

Power in BIOS, from the wall meter: 105W
Power in Linux Mint XFCE, from the wall meter: 36-44W!

So there you go: in the BIOS the system is running the full 4.2GHz all the time, fully loaded, maxing the TDP. At that power level, given my PSU, I'm in an extremely efficient position, so the reading is pretty much spot on, with the extra ~5W being the fans/HDD.

The temps come from this power load! I turned off CPB and reduced the CPU to 2.4GHz, but power only dropped to 76W, so again there were no idle states available in the BIOS; it was lower only because I lowered it. The temps dropped to close to ambient again.

Note: I haven't touched the GPU, so the 36W in Linux is at stock clocks and voltage.

I will experiment for the giggles with reducing some stuff to save more power. Even so, I'm really impressed, because this setup is to be used as a quasi HTPC/server.

Might help someone...
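As a rough sketch of what those wall readings mean on the DC side, assuming about 90% efficiency for an 80 Plus Gold unit at these loads (an assumed figure; real efficiency varies with load and isn't in the post):

```python
# Convert measured wall (AC) watts to an estimated DC-side draw.
def dc_watts(wall_watts: float, efficiency: float = 0.90) -> float:
    """Estimated DC load given measured AC draw and PSU efficiency."""
    return wall_watts * efficiency

bios_dc = dc_watts(105.0)   # BIOS pegs the cores at full turbo
idle_dc = dc_watts(36.0)    # Linux idle, with power states working

print(f"BIOS: ~{bios_dc:.1f} W DC, Linux idle: ~{idle_dc:.1f} W DC")
```

The BIOS figure lands right around the chip's 100W TDP plus fans/HDD, which is consistent with the "no idle states in the BIOS" conclusion above.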


----------



## Durquavian

Quote:


> Originally Posted by *awdrifter*
> 
> I'm only looking for a mild oc. In my experience Intel CPUs can run at their turbo clocks for all cores without raising the voltage usually, so I'm hoping the 6800k can do the same (or maybe with minor voltage bump like 0.05v). As long as the bios has the option i think I'll be ok with it.


You won't have any issues, seeing that a lot of people get 5.0GHz on air with little effort. And 0.05V is actually a pretty big jump in voltage, far more than you'll need for the slight OC you want.


----------



## DaveLT

Quote:


> Originally Posted by *Durquavian*
> 
> Actually the 8 core visheras require a 990 chipset board. Nearly all ( maybe 1 or 2 exceptions) 970 chipset will not handle the power draw of 8 cores. The only issue MSI has is that voltage cap in the bios. I get past mine by using FSB to log in at a lower clock so it doesn't BSOD/lockup and once logged in use MSI Control center for adjusting the voltage to muchhigher levels and fsb to attain desired clock ( 4.8ghz on mine for now).


Well, unless you're talking about the 970 EVOs and the 970A-UD3... there aren't many 970 boards to begin with.
But I can kindly say 970 boards might handle an 8350 but not an 8150. That's for sure.


----------



## Mopar63

If you are going to buy a 970 board, I would only get the ones from Asus or Gigabyte. They easily handle the baseline 8-cores, not the super models, and have some good overclocking headroom. That is something I love about Gigabyte: they do not skimp as much as others on their budget lineups.


----------



## Indy1944

I have an FM2 board from ASRock, the Extreme6. Must say the software utility for OC is rather accurate and dead-on, but I use the BIOS for that; all I really use the utility for is temp monitoring and bumping up the GPU.


----------



## Indy1944

What PSU do you use? I have a Corsair 600; I want a 7870, will that be enough juice for it?


----------



## DaveLT

Quote:


> Originally Posted by *Indy1944*
> 
> what PSU you use, I have a corsair 600, I want a 7870 will that be enough juice for it?


Absolutely. That's more than enough, in fact, as long as it's not a VS or CX.


----------



## Indy1944

It's a CX.


----------



## Mopar63

You will probably be fine. The CX units are not well regarded because under hard running conditions they will not meet their full rated load; they are only rated for full load at temps up to 30C. The thing to remember is that they are not a "bad" PSU, just a low-cost one. Use a CX600 like a 500W PSU and you will be fine.

As for your specific question, you should be fine. My testing rig with the 6800K and the 7970 is only pulling around 430 watts under load. When gaming I am under 400 watts.
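Following that derating advice, a quick hypothetical headroom check (numbers from the post; treating the wall reading as the load is a simplification, since the DC-side draw is somewhat lower):

```python
# Headroom left on a CX600 derated to 500 W, against the ~430 W
# measured full-system load mentioned above.
def headroom(rated_watts: float, measured_watts: float) -> float:
    """Remaining capacity as a fraction of the derated rating."""
    return (rated_watts - measured_watts) / rated_watts

print(f"headroom: {headroom(500, 430):.0%}")   # prints "headroom: 14%"
```

Tight but workable, which matches the "probably fine" verdict, and the sub-400W gaming draw leaves a little more margin.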


----------



## agrims

You should be fine. The power requirement for a 78xx is 500W, and that is a conservative number as well. Running a 6800K and 7870 at 90% is around a 405W requirement with a hefty OC. Any reason you are opting for the 7870 versus the 7850? The 7850 will net you the best bang for your buck. Once overclocked and unlocked with ASUS GPU Tweak, you can reach just a hair under a stock 7950 in performance. I have mine at an everyday 50% overclock (Powercolor PCS+).

Or go 7950 for $180.00.


----------



## DaveLT

Quote:


> Originally Posted by *agrims*
> 
> You should be fine. The power requirements for a 78xx are for 500W. That is a conservative number as well. Running a 6800K and 7870 at 90% is around 405W requirement with a hefty OC. Any reason you are opting for the 7870 vice 7850? The 7850 will net you the best bang for your buck. Once overclocked and unlocked with ASUS GPU Tweak, you can reach just a hair under a stock 7950 in performance. I have mine at a everyday 50% overclock(powercolor PCS+)
> 
> Or go 7950 for $180.00


Why specifically ASUS GPU Tweak? You should at least use your manufacturer's tuning utility. If I had bought a PCS+ (which my friend did) I would have used the PowerUp Tuner.


----------



## agrims

ASUS GPU Tweak has a hidden box that lets you overclock any 7850 further. You can ramp the voltage way past what you can do with the stock software, where PowerTune stops at 1150 on the core. With the ASUS tool you can go up to 1700, but mine is sitting at 1260 core and 5800 memory at 1.20V.


----------



## DaveLT

Quote:


> Originally Posted by *agrims*
> 
> ASUS GPU tweak has a hidden box that allows you to over overclock any 7850. You can ramp voltage way past what you can do with stock software, and the power tune stops at 1150 on the core. With ASUS tool you can go up to 1700, but mine is sitting at 1260 and 5800 at 1.20v.


Of course it will on the PCS+... it's practically unlocked.
Please tell me more about the hidden box. I can't even go beyond 1050MHz with GPU Tweak.


----------



## agrims

OK, go into GPU Tweak, and on the left side go to Settings. Click on Tune, and right below that there will be two tabs, Display Priority and TuneSetting (there is no space). There are two boxes in this tab: "Overclocking range enhancement" and "Keep setting for next start on close application". Check "Overclocking range enhancement". You have now turned on the cheapest 7950 in the world! BTW, each brand is different on OC, with the Powercolor PCS+ being the one with the highest OCability.


----------



## By-Tor

Quote:


> Originally Posted by *agrims*
> 
> Ok, go into GPU tweak, on the left side go to settings. Click on Tune, and right below that there will be two tabs, Display Priority and TuneSetting (there is no space). There is two boxes in this tab, Overclocking range enhancement, and Keep setting for next start on close application. Check the Overclocking range enhancement. You have now turned on the cheapest 7950 in the world! BTW, each brand is different on OC, with the Powercolor PCS+ being the one with the highest OCability.


I'm using ASUS GPU Tweak version 2.4.1.0 on my Asus 7850, but when going by your directions I see no "Overclocking range enhancement" under the TuneSetting tab. I can only go as high as 1050 on the GPU clock and 1.228 on GPU voltage, but I wish I could find a magic anything that would let me raise both of these settings.

What version are you using?


----------



## agrims

I am using the same version, but I am at work right now... stupid duty stations half a world away... When did that version come out? If it is newer than sometime in June, then you need to backdate it one version if at all possible. Oh, I forgot that you also have to click on GPU voltage... oops. Try that and see. You can also get guidance, and proof, here: http://forums.overclockers.co.uk/showthread.php?t=18389760 It starts at post 11.

I am NOT advocating or advertising the referenced site in any way, mods. This is posted as it is helpful for everyone who is looking to unlock their 7850 to the performance of a 7950 or better.

Good luck DaveLT and By-Tor, and anyone else wanting the most performance per dollar.

BTW, if you want truly incredible performance, buy the Arctic Accelero Twin Turbo II and do a BIOS flash to ASUS DCII TOP v1.3. (I have not tried it myself, but it has been reported to work well with other manufacturers' cards; people have gotten up to 1376 on the GPU core, benched and stable... I am just too cheap to mess with it myself...)


----------



## Mopar63

Cross-threading this slightly, but I thought it relevant. In my testing of an APU-based high-end gaming system, I was asked if it could stand up to Crysis 3 on high at 1080p. The answer is yes, it can. I was able to run over an hour of smooth gameplay and never noticed any hitching. The benchmarks are not all that impressive, with the average frame rate hitting 45 FPS. The minimum showed 17, but I presume that was during transitions to staged segments, since I never noticed it during actual play.

The more I push on this little chip the more impressed I become.


----------



## s33dless

Quote:


> Originally Posted by *Mopar63*
> 
> Actually it is only partially the truth, as my tests are showing. At 1080, which the 7970 is overkill for, the APU is not having trouble feeding the GPU. However, the 7970 does not begin to stretch its legs until higher resolutions, but Anand showed that even up to 1440 the APU can keep the 7970 fed. The issue I think comes in as you go even higher in resolution, and then I am pretty sure the APU will begin to fall short.
> 
> Now, does it make sense to buy a 7970 with an APU? Of course not, the system is unbalanced. However, it also makes no sense to buy a 7970 if you are only going to game at 1080 resolutions. I think what I am showing, however, is that the APU makes a very acceptable gaming system and at 1080 delivers an experience to rival Haswell, even up to the i7.


I have a 7970 and use 5760x1080 as my resolution on anything I can, and I suffer from no stuttering issues. Even without the x87 fix, Skyrim would stay glued at max FPS with everything turned up.

Can't wait for tomorrow when LCD #5 gets in.


----------



## Mopar63

s33dless, what overclock are you running?


----------



## blunt eastwood

I have a noob question. I'd like to build a SFF PC and an APU sounds like exactly what I need to keep it small.

My question is whether or not BF3 would be playable with memory of the right speed. I don't mind dropping the settings down, but I only play multiplayer. Thanks for the help.


----------



## Opcode

Quote:


> Originally Posted by *blunt eastwood*
> 
> I have a noob question. I'd like to build a SFF PC and an APU sounds like exactly what I need to keep it small.
> 
> My question is whether or not BF3 would be playable with memory of the right speed. I don't mind dropping the settings down, but I only play multiplayer. Thanks for the help.


If you can find a board to OC on, the APU is capable of playing BF3 on a 64-player map @ 720p on low/med at around 40-50 FPS. That's from a test conducted with the APU's CPU @ 4.8 GHz and the iGPU @ 1086 MHz, coupled with 2400 MHz memory. I am sure you will still get 25-35 FPS without any overclocking; you just have to make sure you get the fastest memory that you can. The only thing holding these APUs back is memory bandwidth.
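Opcode's last point is easy to sanity-check with napkin math. Here's a quick Python sketch of theoretical peak DDR3 bandwidth (the helper name and the dual-channel, 64-bit-per-channel assumptions are mine, not from this thread), showing why faster memory matters so much for the iGPU:

```python
# Theoretical peak DDR3 bandwidth: transfers/s x 8 bytes per 64-bit channel
# x number of channels. Illustrative only; real-world throughput is lower.
def ddr3_bandwidth_gb_s(mt_per_s, channels=2, bytes_per_transfer=8):
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

print(ddr3_bandwidth_gb_s(1333))  # ~21.3 GB/s
print(ddr3_bandwidth_gb_s(2400))  # ~38.4 GB/s, nearly double
```

Shared between four CPU cores and the iGPU, that gap is exactly the bottleneck being described.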


----------



## agrims

For those having issues with ASUS GPU Tweak on the 7850: I just turned on my computer and figured it out. Any version will work, but on the newer versions there is a little stop-sign button that says advanced settings. Once you click that, the screen will show more options. You should now be able to go into the settings menu and change what is needed to go well beyond what AMD intends for the 7850. Good luck!


----------



## Indy1944

Ok. Just got my HD 7870 hooked up to my A10-6800... first impression is wow. Amazing gameplay with BF3; every now and then it will lag for a split second. Gonna OC my CPU a little to see if it helps. Any suggestions?


----------



## By-Tor

Quote:


> Originally Posted by *agrims*
> 
> For those having issues with ASUS GPU Tweak on the 7850: I just turned on my computer and figured it out. Any version will work, but on the newer versions there is a little stop-sign button that says advanced settings. Once you click that, the screen will show more options. You should now be able to go into the settings menu and change what is needed to go well beyond what AMD intends for the 7850. Good luck!


I see that, and it does allow higher GPU clocks, but the voltage still maxes out at 1.228.

PS: So far I have been able to run benches at 1165/1410. Going to keep pushing to see where it will go.

So far very happy...

Thanks


----------



## Devildog83

I got my new APU system up and running. I have the A8-6600K with the ASRock A75 board; there is no 6600K thread, so I will ask here. I have never overclocked an APU before and was wondering if I could get some advice. I would just like to get it to its boost clock, but unlike the Asus board with the FX CPU, there does not seem to be a turbo feature. What do I need to change to get the 4.2 boost here?


----------



## Indy1944

Quote:


> Originally Posted by *Devildog83*
> 
> I got my new APU system up and running. I have the A8-6600K with the ASRock A75 board; there is no 6600K thread, so I will ask here. I have never overclocked an APU before and was wondering if I could get some advice. I would just like to get it to its boost clock, but unlike the Asus board with the FX CPU, there does not seem to be a turbo feature. What do I need to change to get the 4.2 boost here?


You're out of luck. Should have gotten the unlocked 6800K and the Extreme6 board from ASRock.


----------



## Devildog83

Quote:


> Originally Posted by *Indy1944*
> 
> You're out of luck. Should have gotten the unlocked 6800K and the Extreme6 board from ASRock.


This is unlocked. It has a 4.2 GHz turbo, and I know I can overclock it. I will look elsewhere for help, I guess.


----------



## Opcode

Quote:


> Originally Posted by *Devildog83*
> 
> I got my new APU system up and running. I have the A8-6600K with the ASRock A75 board; there is no 6600K thread, so I will ask here. I have never overclocked an APU before and was wondering if I could get some advice. I would just like to get it to its boost clock, but unlike the Asus board with the FX CPU, there does not seem to be a turbo feature. What do I need to change to get the 4.2 boost here?


From what I know about my own setup, you don't overclock the turbo with these APUs. Once you change the multiplier above the default turbo speed, that becomes the base clock and turbo no longer exists.

Here are the boost states for your chip.

Code:


#1: 4200 MHz, 1.35V
#2: 4100 MHz, 1.275V
#3: 4000 MHz, 1.2V

So I would just set the overclock mode to manual, set the volts to 1.35V and the multiplier to 42. Then when you go into Windows, all four cores should run at 4.2 GHz all the time. Plus, with PowerNow enabled, it will kick the chip's volts and clock down to around 1900 MHz @ 0.275v, so it's not running at 4.2 GHz all the time. Which is perfectly fine if you have sufficient cooling, as the chip doesn't generate heat unless there is a load to begin with.
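For anyone following along, the multiplier math Opcode describes is straightforward: these FM2 chips use a 100 MHz reference clock, and the core clock is just multiplier times reference. A tiny sketch (the helper function is mine, for illustration):

```python
REF_CLOCK_MHZ = 100  # FM2 APU reference clock

def core_clock_mhz(multiplier, ref_mhz=REF_CLOCK_MHZ):
    # Core clock = multiplier x reference clock
    return multiplier * ref_mhz

# A 42x multiplier pins all cores at the old 4.2 GHz turbo speed:
print(core_clock_mhz(42))  # 4200
```

The same arithmetic says a 53x multiplier is the 5.3 GHz figure mentioned later in this thread.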
Quote:


> Originally Posted by *Indy1944*
> 
> Your out of luck. Should of gotten the 6800 unlocked and the extreme 6 board from astock


It is an unlocked chip; all black-box chips are unlocked (K SKU). And his board is the FM2A75 Pro4-M is my guess, which has a 4+2 phase VRM and heatsinked MOSFETs, so overclocking is perfectly possible.
Quote:


> Originally Posted by *Devildog83*
> 
> This is unlocked. It has a 4,2 Ghz turbo. I know I can overclock it. I will look else where for help I guess.


Like I said above, I am uncertain about turbo, though I have overclocked my 6800K to 5.3 GHz with just the multiplier. I've never seen a turbo setting in the BIOS on my Extreme6, so I doubt you'll find one in yours either. Though I could be entirely wrong and there could be a turbo option in the BIOS somewhere; it never caught my eye while I was tinkering around with mine.


----------



## Devildog83

Quote:


> Originally Posted by *Opcode*
> 
> From what I know about my own setup, you don't overclock the turbo with these APUs. Once you change the multiplier above the default turbo speed, that becomes the base clock and turbo no longer exists.
> It is an unlocked chip; all black-box chips are unlocked (K SKU). And his board is the FM2A75 Pro4-M is my guess, which has a 4+2 phase VRM and heatsinked MOSFETs, so overclocking is perfectly possible.
> Like I said above, I am uncertain about turbo, though I have overclocked my 6800K to 5.3 GHz with just the multiplier. I've never seen a turbo setting in the BIOS on my Extreme6, so I doubt you'll find one in yours either.


Cool, thanks. I have noticed when I run a stability test it goes over 4 GHz on its own. It seems to only clock up when needed; much different from my 8350, which stays at or near 4.7 all of the time.


----------



## Indy1944

Quote:


> Originally Posted by *Devildog83*
> 
> Cool, thanks. I have noticed when I run a stability test it goes over 4 GHz on its own. It seems to only clock up when needed; much different from my 8350, which stays at or near 4.7 all of the time.


That's not true; the 8350s throttle down and throttle up based on CPU load. You can disable the energy-saving functions in the BIOS.


----------



## DaveLT

Quote:


> Originally Posted by *Indy1944*
> 
> You're out of luck. Should have gotten the unlocked 6800K and the Extreme6 board from ASRock.


You don't need an Extreme6 to OC.

Quote:


> Originally Posted by *Opcode*
> 
> From what I know about my own setup, you don't overclock the turbo with these APUs. Once you change the multiplier above the default turbo speed, that becomes the base clock and turbo no longer exists.
> 
> Here are the boost states for your chip.
> 
> Code:
> 
> 
> #1: 4200 MHz, 1.35V
> #2: 4100 MHz, 1.275V
> #3: 4000 MHz, 1.2V
> 
> So I would just set the overclock mode to manual, set the volts to 1.35V and the multiplier to 42. Then when you go into Windows, all four cores should run at 4.2 GHz all the time. Plus, with PowerNow enabled, it will kick the chip's volts and clock down to around 1900 MHz @ 0.275v, so it's not running at 4.2 GHz all the time. Which is perfectly fine if you have sufficient cooling, as the chip doesn't generate heat unless there is a load to begin with.
> It is an unlocked chip; all black-box chips are unlocked (K SKU). And his board is the FM2A75 Pro4-M is my guess, which has a 4+2 phase VRM and heatsinked MOSFETs, so overclocking is perfectly possible.
> Like I said above, I am uncertain about turbo, though I have overclocked my 6800K to 5.3 GHz with just the multiplier. I've never seen a turbo setting in the BIOS on my Extreme6, so I doubt you'll find one in yours either. Though I could be entirely wrong and there could be a turbo option in the BIOS somewhere; it never caught my eye while I was tinkering around with mine.


At least from what I've seen on the MSI A55M-E33, you could simply do a turbo OC instead by raising the max turbo multiplier.


----------



## DaveLT

Quote:


> Originally Posted by *agrims*
> 
> For those having issues with ASUS GPU Tweak on the 7850, I just turned on my computer, and figured it out.. Any version will work, but on the newer versions there is a little stop sign button that will say advanced settings. Once you click that, the screen will have more options. You should now be able to go into the settings menu and change what is needed to go well beyond what AMD wants for the 7850. Good luck!


So I tried 2.0.6; max voltage is still 1.225V. Just stick to the utility your manufacturer provides.


----------



## blunt eastwood

Quote:


> Originally Posted by *Opcode*
> 
> If you can find a board to OC on, the APU is capable of playing BF3 on a 64 player map @ 720p on low/med at around 40-50 FPS. That's coming from a test conducted with the APU CPU @ 4.8 GHz and the iGPU @ 1086 MHz coupled with 2400 MHz memory. I am sure you will still get 25-35 FPS without any overclocking, you just have to make sure you get the fastest memory that you can. The only thing holding these APU's back is memory bandwidth.


Thanks, I will look into your suggestions. I've also read that an OC'ed 5800K is just as fast. Is it worth getting one of those instead, or is the price/performance difference not worth it?


----------



## DaveLT

Quote:


> Originally Posted by *blunt eastwood*
> 
> Thanks, I will look into your suggestions. I've also read that a 5800K that's OC'ed is just as fast. Is it worth getting one of those instead or is the price/performance difference not worth it?


6800k will OC much more easily


----------



## agrims

I am running my 7850's GPU core OC at 1.15V. What GPU Tweak unlocks is the GPU core past 1150 and the memory past 5400, to extreme levels. You shouldn't need such high volts to push the 7850 50% and possibly beyond. The stock max OC limits in most tools are around 1150 and 5400. The ASUS tool allows you to go well beyond that... I tried other tools, but ASUS is the one for extreme 7850 OCs.
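A note on that 5400 figure: GDDR5 transfers four times per clock, so the number most OC tools show is the effective data rate, not the real memory clock. A rough sketch of the conversion (the helper name is mine, for illustration):

```python
# GDDR5 is quad-pumped: effective (reported) rate = real clock x 4.
def gddr5_effective_mhz(real_clock_mhz):
    return real_clock_mhz * 4

print(gddr5_effective_mhz(1350))  # 5400, the stock-tool ceiling mentioned above
```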


----------



## s33dless

Is the OP still active in the thread? I think we need to start posting tools and some general info up there to increase the thread's value. What I can already think of:
1) Bulldozer conditioner
2) OC clocks, voltages
3) temps (along with cooler types)

What else?


----------



## DaveLT

Quote:


> Originally Posted by *agrims*
> 
> I am running my 7850's GPU core OC at 1.15V. What GPU Tweak unlocks is the GPU core past 1150 and the memory past 5400, to extreme levels. You shouldn't need such high volts to push the 7850 50% and possibly beyond. The stock max OC limits in most tools are around 1150 and 5400. The ASUS tool allows you to go well beyond that... I tried other tools, but ASUS is the one for extreme 7850 OCs.



Are you sure about that? I needed a full 1.18V to maintain stability on my 7850 @ 1.12 GHz, and this one clocks better than the one I bought before (actually RMA'd that one due to some "issues").
Btw, keep your GPU Tweak talk elsewhere. Sapphire allows me to do extreme OCs if I want (needs a new vBIOS ...)


----------



## Artikbot

Quote:


> Originally Posted by *DaveLT*
> 
> 6800k will OC much more easily


Also it will eat slightly less power at the same speeds.


----------



## ContrastTech

Hello all,

Stumbled across this thread and figured I'd throw in an OC result.
We got an A10-6800K to 5.1 GHz, although figuring it out was kind of fun, as there were many BSODs lol.

Any input or questions are appreciated; otherwise, I just wanted to comment and share our results.


-ContrastTech


----------



## Mopar63

Okay dude, you have to tell us how you got it there.


----------



## ContrastTech

Hey Mopar!

This is the link to my CPUID readout : http://valid.canardpc.com/2888675

We are running it in Crossfire with an XFX HD 7750, all 4 cores still active.

CPU Multiplier: 51x (5.1GHz)
CPU Voltage: 1.51250V
Motherboard NB Multiplier: 10x (2.0GHz)
Motherboard NB Voltage: 1.2812V

I have disabled Turbo Core and practically all auto features on the ASRock A85X Extreme4 board we are using.
Also, there will be a review video posted within the next few days from us, as this is a Test Drive rig that was sent to us by AMD to test (given we added a CoolerMaster Seidon 240M pre-built liquid cooling setup onto it for cooling).

I hope you have luck OCing! We will be pushing it higher later on, but as 5.1 was such a struggle (spent 3 hours getting it stable), it's being left at this speed for now.

-ContrastTech

PS: We will have 3DMark11, Unigine Heaven, and other benchmarks posted as well on our YouTube channel soon. I'll provide results here as well if they are requested.
Also, it smelt interesting when the benchmarks were running, but temps are around 50C for the CPU and 32C for the motherboard. All temps are 50C or lower. RAM is at 23C, and so are our SSD and HDD.


----------



## Papadope

Wow ContrastTech,

I'm surprised how low the voltage is for 5+ GHz; that's not bad for an AMD chip if you can cool it.


----------



## ContrastTech

Yes, that is true. Again, we had to run liquid (well, didn't HAVE to; that's what we chose to do, as air didn't sit with us too well).
I'm interested to see if it can go any higher (I saw someone do 8.2 GHz, but that's on LN2).
-ContrastTech


----------



## Opcode

Quote:


> Originally Posted by *ContrastTech*
> 
> Hello all,
> 
> Stumbled across this thread and figured I'd throw in an OC result.
> We got an A10-6800K to 5.1 GHz, although figuring it out was kind of fun, as there were many BSODs lol.
> 
> Any input or questions is appreciated otherwise was just wanting to make a comment and share our results.
> 
> 
> -ContrastTech


Is that a stable clock, or just a suicide run?


----------



## ContrastTech

OPCode,

This is an absolutely stable clock; I can submit a video if you'd like.
Currently testing Crysis 2 with this clock, and I have run through 3DMark11 already along with a few other benchmarks.

No stability issues at all!

-ContrastTech


----------



## iceman595

Quote:


> Originally Posted by *ContrastTech*
> 
> OPCode,
> 
> This is an absolutely stable clock. I can submit a video if you'd like?
> Currently testing Crysis 2 with this clock and have run through 3DMark11 already along with a few other benchmarks.
> 
> No stability issues at all!
> 
> -ContrastTech


What's your whole setup?


----------



## ContrastTech

Iceman,

Our rig is as follows:

AMD A10-6800K
Fractal 600Watt Continuous PSU
ASRock A85X Extreme4 Motherboard
AMD AMP&XMP1.3 1866MHz DDR3 8GB
Kingston 60GB SSD (OS)
HGST 2TB HDD (Storage)
Windows 7 64Bit Professional
XFX Core Edition HD7750
CoolerMaster Seidon 240m CPU cooler
Fractal Midi R2 Case

Thanks for your interest,
-ContrastTech


----------



## iceman595

What kind of power draw are you seeing? I'd assume most of it is coming from the GPU.


----------



## Opcode

Quote:


> Originally Posted by *ContrastTech*
> 
> OPCode,
> 
> This is an absolutely stable clock. I can submit a video if you'd like?
> Currently testing Crysis 2 with this clock and have run through 3DMark11 already along with a few other benchmarks.
> 
> No stability issues at all!
> 
> -ContrastTech


I will give my A10-6800K a go at this same clock and volts and see how it turns out.

See if I can maybe top that.

I will use Bad Company 2 as a test medium, because it takes huge FPS hits in multiplayer due to a CPU bottleneck.


----------



## ContrastTech

OPcode,

Good luck, man! Let us know how it all goes.

BTW, what's your build in comparison to ours?

-ContrastTech


----------



## Opcode

Quote:


> Originally Posted by *ContrastTech*
> 
> OPcode,
> 
> Goodluck man! Let us know how it all goes.
> 
> btw whats your build in comparison to ours?
> 
> -ContrastTech


I run virtually the same build (I am also a Test Driver), except instead of the Seidon 240M I run the FX liquid loop, and instead of the Extreme4 I run the Extreme6. So less CPU cooling, but more VRM horsepower. I also run an HD 5870 in mine, so quite a bit more GPU power, though it's irrelevant to the OC.

AMD A10-6800K
Fractal Tesla R2 650w
ASRock FM2A85X Extreme6
8GB AMD Radeon Performance 1866 MHz DDR3
Kingston 60GB SSD
Windows 8 Pro 64-bit
ASUS Voltage Tweak Edition HD 5870
AMD FX Liquid Loop
Fractal Arc Midi R2 Case


----------



## ContrastTech

Opcode,

NICE!!! I think it's possible to push the APU to 5.5, and I'm certain I can get it to 5.4 GHz, maybe with the board's NB @ 2.2 GHz and its voltage stepped up from 1.28V to 1.3V or a notch higher, and the APU voltage from 1.512 to maybe 1.53V, or maxed at 1.55V.

I'll play around with it some more tomorrow when I'm back at the shop. Temps are starting to worry me, as in Crysis 2 under heavy physics load the CPU tripped 88C. It never got hotter than that, but it did cause me some concern.

Let me know how your test drive rig decides to behave!

-ContrastTech

PS: downloading some games on Steam and will have some pics to show with Portal 2, MW3, Far Cry 3, Metro 2033, BioShock Infinite, Borderlands 2, and Sid Meier's Civ V.


----------



## Opcode

Quote:


> Originally Posted by *ContrastTech*
> 
> Opcode,
> 
> NICE!!! I think it's possible to push the APU to 5.5, and I'm certain I can get it to 5.4 GHz, maybe with the board's NB @ 2.2 GHz and its voltage stepped up from 1.28V to 1.3V or a notch higher, and the APU voltage from 1.512 to maybe 1.53V, or maxed at 1.55V.
> 
> I'll play around with it some more tomorrow when I'm back at the shop. Temps are starting to worry me, as in Crysis 2 under heavy physics load the CPU tripped 88C. It never got hotter than that, but it did cause me some concern.
> 
> Let me know how your test drive rig decides to behave!
> 
> -ContrastTech
> 
> PS: downloading some games on Steam and will have some pics to show with Portal 2, MW3, Far Cry 3, Metro 2033, BioShock Infinite, Borderlands 2, and Sid Meier's Civ V.


Yeah, 88C is quite a high temperature. That already exceeds the 74C limit these chips have before damage is done. What are you using to monitor your temps?


----------



## ContrastTech

OPCode,

Yeah it is higher than I would like.

I was using CPUID Hardware Monitor.

-ContrastTech


----------



## agrims

Try SpeedFan or the ASUS Windows OC tool to get temps on these chips. I believe AIDA64 works on them as well. I didn't have good luck with any of the others, HWMonitor included.


----------



## Opcode

Quote:


> Originally Posted by *ContrastTech*
> 
> OPCode,
> 
> Yeah it is higher than I would like.
> 
> I was using CPUID Hardware Monitor.
> 
> -ContrastTech


That would be why; CPUID Hardware Monitor's sensor readings are bad. SpeedFan, CoreTemp, and many others are too! Try downloading the trial of AIDA64 Extreme. I think you will be pleasantly surprised by your real temps.


----------



## agrims

SpeedFan is accurate; it just labels the temp as System Temp 1. Nothing beats the mobo BIOS tool for accurate temps, though!


----------



## ContrastTech

Hello again,

I've tried SpeedFan and CoreTemp with no success ("error: CPU NOT supported").

Don't know why, but CPUID Hardware Monitor is the first one to give me any readings. Unless there is a newer version out that we missed somehow?
I'll try AIDA and the other ones you guys mentioned. Thanks!

-ContrastTech


----------



## agrims

Nope, HWMonitor is garbage on AMD. Mine showed a lowly 107C at stock idle. I was like, hmmm, no fires, nothing smells hot, must be wrong... Tried SpeedFan and it worked for me, but it seems to be a case-by-case basis. AIDA64 works, but you gotta pay after using it a few times...


----------



## DaveLT

Quote:


> Originally Posted by *agrims*
> 
> Speedfan is accurate, it just labels the temp as system temp 1. Nothing beats the mobo bios tool though for accurate temps!


Quote:


> Originally Posted by *Opcode*
> 
> That would be why, CPUID Hardware Monitor's sensor probes are bad. Speedfan, CoreTemp, and many others are too! Try downloading the trial of AIDA64 Extreme. I think you will be pleasantly surprised of your real temps.


The only one that worked for my friend's 6800K is AIDA64; the rest read 88C, but it wasn't anywhere near that...


----------



## Durquavian

Quote:


> Originally Posted by *ContrastTech*
> 
> Hello again,
> 
> I've tried SpeedFan and CoreTemp with no success ("error: CPU NOT supported").
> 
> Don't know why, but CPUID Hardware Monitor is the first one to give me any readings. Unless there is a newer version out that we missed somehow?
> I'll try AIDA and the other ones you guys mentioned. Thanks!
> 
> -ContrastTech


Download HWiNFO64 or HWiNFO32, whichever matches the OS you are running. It is the best free option out there; most OCers use it. http://www.hwinfo.com/download.php


----------



## EliteReplay

Really interested to see some power consumption numbers if you guys have a Kill A Watt.

Thanks, you are awesome!


----------



## Mopar63

I am gonna push for 5.0, I think, but will likely stop there. As a rule of thumb, since I use these systems a LOT after I get done with them, I shoot for a load-tested temp of around 60C for AMD. Anything over that, I tend to back off.


----------



## blunt eastwood

Quote:


> Originally Posted by *blunt eastwood*
> 
> Thanks, I will look into your suggestions. I've also read that a 5800K that's OC'ed is just as fast. Is it worth getting one of those instead or is the price/performance difference not worth it?


Quote:


> Originally Posted by *DaveLT*
> 
> 6800k will OC much more easily


So I will stick with the 6800K, but if I have to OC in order to play BF3, can I still use an SFF case?


----------



## Opcode

Quote:


> Originally Posted by *blunt eastwood*
> 
> So I will stick with the 6800K, but if I have to OC in order to play BF3, can I still use a SFF?


You don't have to OC it to play BF3. Though in all honesty, you could skip the whole APU idea and go with the 750K + HD 7770.

Code:


http://www.newegg.com/Product/Product.aspx?Item=N82E16819113328
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202011

Which turns out to be around $175 for the pair. You will spend more than that for an A10-6800K and 2400 MHz memory in today's market (RAM prices are horrible). The extra GPU grunt will make a huge difference, and you can easily overclock the CPU to 4.2 GHz or so to match the A10-6800K's CPU. You shouldn't need an extreme board to do it either, just something that has a heatsink and at least a 4+2 phase VRM. It really depends on how much you plan on investing in your SFF build, and which case you plan on using. I personally would go with what I recommended above if the case supports full-width cards. I would also slap in an H50 or so and just pick up a board that will handle a moderately good overclock. Like I said, it will get you further in gaming than an APU alone, and it won't cost you any extra, because you'll need aftermarket cooling for the A10-6800K anyway (very hot chip). What case do you have planned for your SFF, and do you have a set budget/list of components that you will need?


----------



## ContrastTech

Hello again folks,

Thanks so much for your replies

Dave, Agrims and Durq:

I will try HWiNFO64 (Win7 64-bit is what we are running).
AIDA64 is pay-to-use, so we will most likely be avoiding that.

EliteReplay:

I have no way of testing the power consumption at this time (unless there is, again, some software I don't know about).

OPCode:

Agreed with everything mentioned. The A10 isn't the best option out there.

Eastwood:

The 6800K seems to be a decent APU; just be aware that it needs cooling. It gets hot.

To Everyone:

Well, I would have to agree there are some funny readings from CPUID's program, because I don't smell anything funny. If my temps are low enough I'll try pushing to 5.4 GHz; I'll let you all know how that goes after, of course, testing the current clocks and settings in some more games. Pretty sure we will have to get a better cooling system than the Seidon 240M to go higher, but I'm fairly certain it can climb a bit more. If I don't try to push for 5.4 today, it will probably be tomorrow. I will keep watching this thread for info, as I have had no idea how to monitor my temps; the programs I used to use (RealTemp and so on) all give up on this APU. (lol)

-ContrastTech


----------



## blunt eastwood

Quote:


> Originally Posted by *Opcode*
> 
> You don't have to OC it to play BF3. Though in all honesty, you could skip the whole APU idea and go with the 750K + HD 7770.
> 
> Code:
> 
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16819113328
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202011
> 
> Which turns out to be around $175 for the pair. You will spend more than that for an A10-6800K and 2400 MHz memory in today's market (RAM prices are horrible). The extra GPU grunt will make a huge difference, and you can easily overclock the CPU to 4.2 GHz or so to match the A10-6800K's CPU. You shouldn't need an extreme board to do it either, just something that has a heatsink and at least a 4+2 phase VRM. It really depends on how much you plan on investing in your SFF build, and which case you plan on using. I personally would go with what I recommended above if the case supports full-width cards. I would also slap in an H50 or so and just pick up a board that will handle a moderately good overclock. Like I said, it will get you further in gaming than an APU alone, and it won't cost you any extra, because you'll need aftermarket cooling for the A10-6800K anyway (very hot chip). What case do you have planned for your SFF, and do you have a set budget/list of components that you will need?


I'm trying to stay as low as possible budget-wise. I like your suggestion of the 750K + HD 7770, especially if I don't have to OC, since I'm a noob at it and would rather avoid it if it's not necessary to play BF3.

I don't have a case in mind and was going to ask for suggestions. I'd like to stay around the $300 mark, and I already have an OS, optical drive, KB+M, HDD, and 4 GB of Crucial PC3-10600 DDR3-1333 SDRAM. I just need a CPU, GPU, case and mobo, and I have no problem getting used parts.


----------



## Durquavian

Quote:


> Originally Posted by *blunt eastwood*
> 
> I'm trying to stay as low as possible budget-wise. I like your suggestion of the 750k + 7770HD, especially if I don't have to OC since I'm a noob at it and would rather avoid doing it if it's not necessary to play BF3.
> 
> I don't have a case in mind and was going to ask for suggestions. I'd like to stay around the $300 mark and already have an OS, Optical Drive, KB+M, HDD, and 4 GB of Crucial PC310600 DDR3 1333 SDRAM. I just need a CPU, GPU, case and mobo, and I have no problem getting used parts.


On the point of the 7770: it is good, but it won't get you high quality in games. Crossfiring two 7770s will, though. I have these, so I speak from experience.


----------



## Opcode

Quote:


> Originally Posted by *blunt eastwood*
> 
> I'm trying to stay as low as possible budget-wise. I like your suggestion of the 750k + 7770HD, especially if I don't have to OC since I'm a noob at it and would rather avoid doing it if it's not necessary to play BF3.
> 
> I don't have a case in mind and was going to ask for suggestions. I'd like to stay around the $300 mark and already have an OS, Optical Drive, KB+M, HDD, and 4 GB of Crucial PC310600 DDR3 1333 SDRAM. I just need a CPU, GPU, case and mobo, and I have no problem getting used parts.


Do you need a PSU too? If so, that will make a big impact on your budget, so it's good to know whether you do or don't.


----------



## ContrastTech

Hello,

So we just came back from the Twitter AMD Test Drive happy hour for AMD's APUs, and it seems a bit sad how they went about it.

Any serious questions seemed to be completely avoided, while anyone talking about "Famous Amos cookies" and other irrelevant nonsense was immediately responded to.

It could just be jumping to conclusions, but it seems like something is awry.

Also figured to share some scores for 3DMark11 and Valley


----------



## blunt eastwood

Quote:


> Originally Posted by *Durquavian*
> 
> On the point of the 7770, it is good but it won't get you high quality in games. Now crossfiring 2 7770s will. I have these so I speak from XP.


By high quality I assume you mean resolution, in which case that's not a huge consideration of mine, because I'm only looking for light gaming, mostly BF3 online, and I don't mind lowering the settings.
Quote:


> Originally Posted by *Opcode*
> 
> Do you need a PSU also? If so that will make a big impact on your budget. So it's good to know whether you do or don't.


Yes, I do have a PSU, but it's a regular-sized one, so it may not work for an SFF build. Do you think a Q6600 and a 7770 would meet my needs?


----------



## DaveLT

Quote:


> Originally Posted by *blunt eastwood*
> 
> By high quality I assume you mean resolution, in which case that's not a huge consideration of mine because I'm only looking for light gaming, mostly BF3 online, and don't mind lowering the settings.
> Yes, I do have a PSU, but it's a regular-sized one, so it may not work for an SFF build. Do you think a Q6600 and a 7770 would meet my needs?


Of course.


----------



## Opcode

Quote:


> Originally Posted by *blunt eastwood*
> 
> By high quality I assume you mean resolution, in which case that's not a huge consideration of mine because I'm only looking for light gaming, mostly BF3 online, and don't mind lowering the settings.
> Yes, I do have a PSU, but it's a regular-sized one, so it may not work for an SFF build. Do you think a Q6600 and a 7770 would meet my needs?


If you have a Q6600 on hand, with a board and memory to go with it, then you wouldn't need to invest in main components just yet. You can always upgrade later to a faster platform such as Kaveri in Q1 next year. And if Kaveri can CrossFire with, say, an HD 7770, then you're future-proofing your purchases right now. Though I can't guarantee there will be any HD 7770 DGM support. Either way, you can hold out until AMD starts selling the binned chips that have the iGPU disabled due to defects. Four Steamroller cores will probably run only about $95 new, and that will be one hell of a chip for the budget gamer.

P.S. You should open your own thread, you will get more responses and collaboration with your build ideas.


----------



## awdrifter

I'm having problems installing Windows 7 x64 on my setup (6800K + MSI A55M-E33). It's stuck at the "Completing Installation" step. Does anyone know a way around this issue? I can install the 32-bit version just fine, but since I have 8GB of RAM, I want to use the 64-bit version. Thanks.


----------



## awdrifter

So I finally got it to work; I just needed to remove all USB devices after the first reboot. The BIOS is pretty limited and doesn't have CPU voltage control, but I found that I can use AMD OverDrive to adjust CPU and NB voltage. Is there a way to set AMD OverDrive to automatically load the OC settings on boot? Thanks.


----------



## Alatar

Just ordered an A10-6800K for my 11 year old brother but I'll be testing it on phase and LN2 first because the board he's getting is mine









So gonna be posting here again in a few days


----------



## s33dless

Does anyone have access to the datasheets for the processor/chipset? ASUS gives a lot of two-word explanations in their manual for most of the BIOS parameters; I want to know what they are and what they do before I start screwing with them.


----------



## Stormscion

Quote:


> Originally Posted by *Durquavian*
> 
> On the point of the 7770, it is good, but it won't get you high quality in games. Crossfiring two 7770s will, though. I have these, so I speak from experience.


I play many games on high.

The stock 7770 is very conservatively clocked, and that is what you see in most benchmarks. An 1150 memory clock is laughable when you can push the memory up to 1500 easily on every model; the core is also usually able to hit anywhere from ~1 GHz to 1.2 GHz. Those are huge performance gains.


----------



## DaveLT

Quote:


> Originally Posted by *Stormscion*
> 
> I play many games on high.
> 
> The stock 7770 is very conservatively clocked, and that is what you see in most benchmarks. An 1150 memory clock is laughable when you can push the memory up to 1500 easily on every model; the core is also usually able to hit anywhere from ~1 GHz to 1.2 GHz. Those are huge performance gains.


What ... I am having problems pushing my Sapphire 7850's memory over 1320 ... it can't be done, despite using 6 GHz (effective) Elpida chips -_-


----------



## agrims

Yes, but the 7850 has a 256-bit memory bus whereas the 7770's is only 128-bit, and those speeds are possible because of the constriction. Think of it like water: the same current moving through a wide channel flows slower than when it's forced through a narrow one. So we get more performance than those with the 7770, at slower clocks.
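The bus-width point works out in a quick back-of-the-envelope calculation. This is just an illustrative sketch; the reference clocks below are assumed typical values for these cards, not figures quoted in this thread:

```python
def bandwidth_gbps(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bus width in bytes times effective data rate."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# HD 7770: 128-bit bus, GDDR5 at ~1125 MHz (4500 MHz effective)
print(bandwidth_gbps(128, 4500))   # 72.0 GB/s
# HD 7850: 256-bit bus, GDDR5 at ~1200 MHz (4800 MHz effective)
print(bandwidth_gbps(256, 4800))   # 153.6 GB/s
```

So even at a lower memory clock, the wider bus roughly doubles the available bandwidth.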


----------



## DaveLT

Quote:


> Originally Posted by *agrims*
> 
> Yes, but the 7850 has a 256-bit memory bus whereas the 7770's is only 128-bit, and those speeds are possible because of the constriction. Think of it like water: the same current moving through a wide channel flows slower than when it's forced through a narrow one. So we get more performance than those with the 7770, at slower clocks.


Most other 7850s clock way higher on the memory than mine does, though. The 7850 needs that big a bus anyway; it's just as powerful as a 6970 (now, at least), so it had better make good use of it.


----------



## agrims

You should throw down on an Accelero Twin for it; I imagine you are hitting the temp limit for those chips. I found out something interesting, though: you will get a far better OC out of the core than the memory. Try stock speeds on the memory, then go up to 1260 core. Test, and if stable, do 1290 core. If that's stable, then bring the memory up to 4900, 5000, etc.; once unstable, back off 10-20 MHz and you will be rock solid. Oh, and to reach the fabled 1.3v you have to raise the voltage to 1.25, then set the power limit to 120%; at full load you will hit 1.3v, or very close to it. You may simply have a bad chip, though, as it sounds like it is error-correcting itself into a slowdown. GDDR5 doesn't crash like DDR3, as I am sure you are well aware; for those that don't know, it slows down due to error checking (checksums) long before it crashes. A guy in London I know of has his set at 4900 on the memory and 1320 core and is faster than mine...


----------



## DaveLT

Quote:


> Originally Posted by *agrims*
> 
> You should throw down on an Accelero Twin for it; I imagine you are hitting the temp limit for those chips. I found out something interesting, though: you will get a far better OC out of the core than the memory. Try stock speeds on the memory, then go up to 1260 core. Test, and if stable, do 1290 core. If that's stable, then bring the memory up to 4900, 5000, etc.; once unstable, back off 10-20 MHz and you will be rock solid. Oh, and to reach the fabled 1.3v you have to raise the voltage to 1.25, then set the power limit to 120%; at full load you will hit 1.3v, or very close to it. You may simply have a bad chip, though, as it sounds like it is error-correcting itself into a slowdown. GDDR5 doesn't crash like DDR3, as I am sure you are well aware; for those that don't know, it slows down due to error checking (checksums) long before it crashes. A guy in London I know of has his set at 4900 on the memory and 1320 core and is faster than mine...


Nah, it isn't a memory temp problem. I placed memory sinks (and even VRM sinks) on it and it did nought. Nada, didn't do a thing; still crashing at 1350 mem.
The reason I want the memory higher is for mining; there needs to be a "golden" ratio.
Also, I can't do 1180 core either, at a max voltage of 1.225v (power limit goes to 150%).
This is already the better card; I RMA'd one other and got this one, which turns out to be only slightly better but suffers worse memory speeds. I'm suspecting the mem voltage is 1.5v.

But yes, my memory is at 5280 MHz at the very least, and that's the most stable I can get. Crashes take some time though, and I know why: excessive ECC retries and then a BSOD.


----------



## Alatar

http://valid.canardpc.com/hbjhmm


----------



## DaveLT

5.5 GHz on air? If so ... HOLY COW.


----------



## Farmer Boe

I highly doubt that is on air cooling. Knowing Alatar, he's dumping LN2 all over it!


----------



## Durquavian

Probably not LN2 either. It's that phase-change setup he has, most likely. These things can do 7 GHz to 8 GHz on LN2.


----------



## void

Very impressive Alatar.


----------



## Alatar

Quote:


> Originally Posted by *DaveLT*
> 
> 5.5GHz Air? If so ... HOLY COW.


Quote:


> Originally Posted by *Farmer Boe*
> 
> I highly doubt that is on air cooling. Knowing Alatar, he's dumping LN2 all over it!


Quote:


> Originally Posted by *Durquavian*
> 
> Probably not LN2 either. It's that phase-change setup he has, most likely. These things can do 7 GHz to 8 GHz on LN2.


Phase change. However, the voltages are really low.

Over 6 GHz at 1.65v is really good, even compared to single-core runs on Vishera. Also much better than the 6800K-based 760K I had, which only did 5700 MHz at 1.75v...

Who knows, this might be an 8 GHz chip. Maybe I'll try LN2 next week?


----------



## Lukinrats

Need some help, guys. I bought this processor out of sheer curiosity; I like playing around with new things, so I decided to do a simple APU build.
Bought the Black Edition and an ECS A85F2-A Deluxe. I only put 4GB of RAM in to start, but I have another stick to add.

Right now I am just wondering about a couple of things. First, this BIOS doesn't have an option for editing the NB multiplier or frequency, only voltage. Is that normal for these FM2 builds?

I thought I would start my overclocking by reducing the multi to 16x or 18x and dropping the RAM to 800 or so, then increasing the FSB until I find the highest stable setting. However, this is not working. I am wondering if it is affecting the SATA controller(s) or something.

Is there a decent guide for these? I am new to these APUs and Socket FM2.


----------



## Opcode

Quote:


> Originally Posted by *Lukinrats*
> 
> Need some help, guys. I bought this processor out of sheer curiosity; I like playing around with new things, so I decided to do a simple APU build.
> Bought the Black Edition and an ECS A85F2-A Deluxe. I only put 4GB of RAM in to start, but I have another stick to add.
> 
> Right now I am just wondering about a couple of things. First, this BIOS doesn't have an option for editing the NB multiplier or frequency, only voltage. Is that normal for these FM2 builds?


No; with my Extreme6 I can change the OC profile to Manual and tweak the multiplier, volts, overvolts, etc. If your board has the EZ-BIOS, make sure you are in the advanced view; this could be why you are not seeing these settings.
Quote:


> Originally Posted by *Lukinrats*
> 
> I thought I would start my overclocking by reducing the multi to 16x or 18x and dropping the RAM to 800 or so, then increasing the FSB until I find the highest stable setting. However, this is not working. I am wondering if it is affecting the SATA controller(s) or something.


You should be able to overclock via the multiplier, if you bought an unlocked processor.
Quote:


> Originally Posted by *Lukinrats*
> 
> Is there a decent guide for these? I am new to these APUs and Socket FM2.


Every board's BIOS is different; finding and learning the settings is part of the whole experience. Though I can say hitting 5.0 GHz stable isn't likely in some cases without backing the RAM down to 1866 MHz. For some reason these APUs don't like to run at 5.0 GHz with the memory above 1866 MHz. Set the north bridge to 2.2 GHz, and just crank away at the multiplier.


----------



## Lukinrats

Quote:


> Originally Posted by *Opcode*
> 
> No; with my Extreme6 I can change the OC profile to Manual and tweak the multiplier, volts, overvolts, etc. If your board has the EZ-BIOS, make sure you are in the advanced view; this could be why you are not seeing these settings.
> You should be able to overclock via the multiplier, if you bought an unlocked processor.
> Every board's BIOS is different; finding and learning the settings is part of the whole experience. Though I can say hitting 5.0 GHz stable isn't likely in some cases without backing the RAM down to 1866 MHz. For some reason these APUs don't like to run at 5.0 GHz with the memory above 1866 MHz. Set the north bridge to 2.2 GHz, and just crank away at the multiplier.


No, I am in the advanced menu. I will just have to post some screenshots, I guess. I have plenty of experience with OC, but mostly with Intel; I just sold a Phenom 1045t that I overclocked. I don't know why the NB multi is not there. I know I can use the unlocked multi for the CPU, but I wanted to get my FSB as high as I could before I start that.

I will see about posting some screens when I get home

Thanks for the quick reply


----------



## DaveLT

If you think FM2 OC'ing is anything like AM3 ... you're very wrong. BCLK OC'ing is not entirely easy; I have seen 180 MHz, but don't bother. Just use the multi all the way.


----------



## Lukinrats

I don't really think anything to be honest. If I did, I wouldn't come and ask questions. However, I went looking for a guide and all I found was this

http://www.ocinside.de/go_e.html?http://www.ocinside.de/html/workshop/amd_fm2_overclock.html


----------



## Alatar

Quote:


> Originally Posted by *DaveLT*
> 
> If you think FM2 OC'ing is anything like AM3 ... you're very wrong. BCLK OC'ing is not entirely easy; I have seen 180 MHz, but don't bother. Just use the multi all the way.


But multi only goes to x63


----------



## DaveLT

Quote:


> Originally Posted by *Alatar*
> 
> But multi only goes to x63


What a shame ... but only at that point should you begin using BCLK








Quote:


> Originally Posted by *Lukinrats*
> 
> I don't really think anything to be honest. If I did, I wouldn't come and ask questions. However, I went looking for a guide and all I found was this
> 
> http://www.ocinside.de/go_e.html?http://www.ocinside.de/html/workshop/amd_fm2_overclock.html


There's a guide on OCN


----------



## Lukinrats

Quote:


> Originally Posted by *DaveLT*
> 
> What a shame ... but only at that point should you begin using BCLK
> 
> 
> 
> 
> 
> 
> 
> 
> There's a guide on OCN


There may be one on OCN, but I can't find it so far.


----------



## mtcn77

What is the expert opinion on overclocking the 6800K?
It is obvious that the 6800K requires less voltage for the same clocks compared to the 5800K, but does it really pay off? 4.4 GHz @ 1.3v on auto; I have the stock cooler, so please take that into account.


----------



## DaveLT

Quote:


> Originally Posted by *mtcn77*
> 
> What is the expert opinion on overclocking the 6800K?
> It is obvious that the 6800K requires less voltage for the same clocks compared to the 5800K, but does it really pay off? 4.4 GHz @ 1.3v on auto; I have the stock cooler, so please take that into account.


Once you get rid of the stock cooler, the 6800K begins to shine. 4.4 GHz on the stock cooler is already a big feat for any chip that uses an aluminium-base heatsink ...
But yes, the 6800K roughly requires less voltage or delivers higher clocks, depending on how you look at it.

Haswells can barely do 4.2 GHz on the copper-core stock cooler (throttling included)


----------



## Alatar

Quote:


> Originally Posted by *DaveLT*
> 
> Once you get rid of the stock cooler, the 6800K begins to shine. 4.4 GHz on the stock cooler is already a big feat for any chip that uses an aluminium-base heatsink ...
> But yes, the 6800K roughly requires less voltage or delivers higher clocks, depending on how you look at it.
> 
> Haswells can barely do 4.2 GHz on the copper-core stock cooler (throttling included)


....

Mine does more than that just fine, and without throttling. Also, a 4.2 GHz Haswell is a 14% OC, so to match that overclocking potential a 6800K would have to reach nearly 4.7 GHz on the stock cooler.

As for the 6800K stuff, I'll be doing an LN2 run tomorrow and then installing the thing for my younger brother. Will probably do some limited OCing for him on the stock cooler so I'll comment here after I've done that.
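The percentage comparison above can be sketched as a quick calculation. A rough illustration only; using the A10-6800K's 4.1 GHz base clock as the baseline is my assumption here:

```python
def oc_percent(stock_mhz, oc_mhz):
    """Overclock expressed as a percentage gain over the stock clock."""
    return (oc_mhz / stock_mhz - 1) * 100

# A 6800K pushed from 4.1 GHz stock to ~4.7 GHz is roughly a 14% overclock.
print(round(oc_percent(4100, 4700), 1))  # 14.6
```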


----------



## DaveLT

You got lucky, that's all. Besides, 80C (which correlates to an 80C TjMax) is really high, and that's at 4.3 GHz.

Also, don't give me the "14% OC" BS; if it's hot, it's hot.


----------



## Alatar

Quote:


> Originally Posted by *DaveLT*
> 
> You got lucky, that's all. Besides, 80C (which correlates to an 80C TjMax) is really high, and that's at 4.3 GHz.
> 
> Also, don't give me the "14% OC" BS; if it's hot, it's hot.


TjMax is 108C... These aren't AMD CPUs; you can run them at really high temps.

And yes, I will give you the percentages, since those are all that matter; a pure clock-speed number doesn't matter any more than a pure IPC number.









But yeah, I'll probably be also posting 6800K stock cooler OCs in a day or two.


----------



## DaveLT

Quote:


> Originally Posted by *Alatar*
> 
> TjMax is 108C... These aren't AMD CPUs; you can run them at really high temps.
> 
> And yes, I will give you the percentages, since those are all that matter; a pure clock-speed number doesn't matter any more than a pure IPC number.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But yeah, I'll probably be also posting 6800K stock cooler OCs in a day or two.


108C this time round still coincides with 80C again, lol. Thing is, Intel keeps raising TjMax because they had to.

Still, it's surprising what you can do with the 6800K stock cooler even though it's total crap ... it's even WORSE than the i5/i7 stock cooler.

The only FM2 rig I built for a client ended up with me putting a $5 aftermarket tower cooler on it. OK, it's not particularly small, but it performed decently, so I didn't care after that; it was overheating quite badly all the time with the stock cooler on, and in these ambient temps ... I left the GPU @ 897MHz and didn't touch anything else. Still pulled under 80W. Hmm, not bad.

You had better post them


----------



## Papadope

AMD uses TIM like Intel on the APUs, right? I know the FX series are soldered. I wonder what kind of overclocks could be hit on a delidded chip with a closed-loop water cooler, an H80 for example. I doubt anybody would be investing in a custom loop for an APU, lol. Maybe with Carrizo.


----------



## mtcn77

I think the 6800K is worthy of recommendation. My PC is truly "silent", but I think there is some sort of capacitor hum; might be due to my aged PSU.
Note: Nope, disabling C6 fixed it.


----------



## DaveLT

Quote:


> Originally Posted by *mtcn77*
> 
> I think the 6800K is worthy of recommendation. My PC is truly "silent", but I think there is some sort of capacitor hum; might be due to my aged PSU.
> Note: Nope, disabling C6 fixed it.


Coil whine








Quote:


> Originally Posted by *Papadope*
> 
> AMD uses TIM like Intel on the APUs, right? I know the FX series are soldered. I wonder what kind of overclocks could be hit on a delidded chip with a closed-loop water cooler, an H80 for example. I doubt anybody would be investing in a custom loop for an APU, lol. Maybe with Carrizo.


Yes, but thankfully they don't run nearly as hot. It runs hot on the stock cooler, yes, but that's an aluminium-base cooler attempting to cool a 100W TDP proc.

HWMonitor and similar programs (apart from AIDA64 EE) will report a limit all the way up to 105C as if that were fine, but AIDA64 EE reports the correct one: 73C is the max on the 6800K.
In fact the 6800K runs pretty cool on other heatsinks, I reckon. But on OCN, who doesn't want to invest in a custom loop? There's no need to buy everything new ... an XSPC EX240, if you try a bit harder, is $20 cheaper in China. That's ... cheap.

(NOTE: I just spent $20 on a Raystorm. SWEEEEEEEEEEEEEEEEEEEEEET)


----------



## mtcn77

Geez, Arctic Silver 5 really takes its time to set in. Now prime95-stable at 69C on stock settings. :/
These compute engines are a real blessing; now I need not search for passwords.


----------



## JoeelMex

Hey guys, I'm thinking of building a mini-PC with one of these chips for my daughter. Lately we both play Final Fantasy XIV: A Realm Reborn. How do you think this CPU with its integrated GPU would handle the game at 1080p? Right now she plays on my laptop, which has a Haswell 4800 with an Nvidia GTX 765M GPU. Do you think it will come close to my laptop? Thanks


----------



## DaveLT

Quote:


> Originally Posted by *JoeelMex*
> 
> Hey guys, I'm thinking of building a mini-PC with one of these chips for my daughter. Lately we both play Final Fantasy XIV: A Realm Reborn. How do you think this CPU with its integrated GPU would handle the game at 1080p? Right now she plays on my laptop, which has a Haswell 4800 with an Nvidia GTX 765M GPU. Do you think it will come close to my laptop? Thanks


I'm not sure but you might have to lower the settings if the GTX765M is being stressed at 100% on the laptop (which is unlikely i think)


----------



## JoeelMex

Would someone with a 6800K please run this benchmark using the integrated GPU:

http://na.finalfantasyxiv.com/benchmark/

It's the Final Fantasy benchmark tool. Keep the resolution at 1080p and decrease the settings until you get a HIGH score in green.

I would be very grateful.

Thanks


----------



## s33dless

Can somebody put up a visual reference for delidding? I want to go for it, but I have no idea where the components are located underneath the plate, don't want to chop anything off.


----------



## Papadope

Quote:


> Originally Posted by *s33dless*
> 
> Can somebody put up a visual reference for delidding? I want to go for it, but I have no idea where the components are located underneath the plate, don't want to chop anything off.


You have to be more careful with the APUs compared to Intel's chips; here is a picture of a delidded A10-5700. More at the *Source*.










Image provided by Artikbot


----------



## MKHunt

Well shoot. Now I want to delid my 6800K. I do wish Artikbot posted pics of his final pre-lidding setup. 7C seems like a tiny drop (going from early posts) for fixing any contact issues. Maybe the APUs use a proper amount of black adhesive that doesn't result in poor contact, unlike Intel chips?


----------



## Lukinrats

Ok folks, still trying to get a handle on overclocking this processor. I have a few questions. I think most of my problems can be attributed to my motherboard's BIOS settings: there are no settings for the NB multi or NB freq, and I can't find any setting to help with Vdroop.

Do these Richland APUs stay at max clocks all the time? I thought they would clock down when not under load. I had disabled turbo mode when I started my overclock, and I thought that might be why it did not clock down, but I went back to default settings with Turbo set to auto and it is still not clocking down.

*Edit: I reset the BIOS to defaults and noticed that the downclocking is functioning, so I guess it will not clock down after you change the multi???*

Does anyone have a picture of their board's BIOS options for overclocking? I would like to compare them to the options I have on this ECS A85F2-A Deluxe; it just seems lacking in the settings department.

Lastly, what is the standard result from overclocking the GPU? Just wondering what to expect when I try it. I figure with the defaults being 844/800 respectively, there is not much overclocking to be had on the GPU.

Thanks in advance

Nathan


----------



## s33dless

Quote:


> Originally Posted by *Lukinrats*
> 
> Ok folks, still trying to get a handle on overclocking this processor. I have a few questions. I think most of my problems can be attributed to my motherboard's BIOS settings: there are no settings for the NB multi or NB freq, and I can't find any setting to help with Vdroop.
> 
> Do these Richland APUs stay at max clocks all the time? I thought they would clock down when not under load. I had disabled turbo mode when I started my overclock, and I thought that might be why it did not clock down, but I went back to default settings with Turbo set to auto and it is still not clocking down.
> 
> *Edit: I reset the BIOS to defaults and noticed that the downclocking is functioning, so I guess it will not clock down after you change the multi???*
> 
> Does anyone have a picture of their board's BIOS options for overclocking? I would like to compare them to the options I have on this ECS A85F2-A Deluxe; it just seems lacking in the settings department.
> 
> Lastly, what is the standard result from overclocking the GPU? Just wondering what to expect when I try it. I figure with the defaults being 844/800 respectively, there is not much overclocking to be had on the GPU.
> 
> Thanks in advance
> 
> Nathan


Turn off all of your power-saving options in both the BIOS and AMD OverDrive and whatnot. That ought to kill whatever downclocking you have left.

What pastes are people using after delids for this? Is putting the IHS back on better?


----------



## Lukinrats

Quote:


> Originally Posted by *s33dless*
> 
> Turn off all of your power-saving options in both the BIOS and AMD OverDrive and whatnot. That ought to kill whatever downclocking you have left.


Well, actually I have already done that, and it stopped the clocking-down function (if it has one). I am trying to figure out how to overclock the CPU and still have the processor clock down when not under load.

For instance, my i7 is overclocked to 4.9 GHz; however, when I am not demanding that from it, it will clock down below 2 GHz. Now I realize this is an AMD, but CPU-Z is reporting a multi of 10-44x, so I just assumed that it must have this ability also. However, even at defaults, and 0-1% CPU usage, it is still running at full clocks.


----------



## Lukinrats

Also, still looking for a screenshot of someone's BIOS settings for overclocking. I do not need the actual settings; I only want to see what choices you are working with. I feel my board is lacking several options, but before I contact the manufacturer, I want to see what others provide.


----------



## Lukinrats

Also, I have overclocked the GPU in the BIOS, but nothing is reporting it at 1 GHz. GPU-Z still shows 844. Is that normal?


----------



## agrims

Turn Cool'n'Quiet back on. If you are 100% stable at the OC you have, chances are that with CnQ on it will still be stable and downclock like you want. If I am remembering right, the Windows 7 "FX fix" they implemented automatically shuts down the C6 core state, and that is what plays hell on BD-based chips, as Windows says no while our boards with it enabled say yes. Read up on the BD/PD OC thread; most of their info applies to our FM2 builds, as they run the same tech core-wise.


----------



## Lukinrats

Quote:


> Originally Posted by *agrims*
> 
> Turn Cool'n'Quiet back on. If you are 100% stable at the OC you have, chances are that with CnQ on it will still be stable and downclock like you want. If I am remembering right, the Windows 7 "FX fix" they implemented automatically shuts down the C6 core state, and that is what plays hell on BD-based chips, as Windows says no while our boards with it enabled say yes. Read up on the BD/PD OC thread; most of their info applies to our FM2 builds, as they run the same tech core-wise.


Will do. I have not disabled C&Q at all. I did disable C6 and Turbo, but have since turned all of them back on. My proc still runs at 44x all the time; sometimes I will see the voltage drop, but not the multi.

I just can't decide if my GPU is being overclocked. I am telling it to in the BIOS, but I can't find anything that reports it as such.


----------



## Opcode

Quote:


> Originally Posted by *Lukinrats*
> 
> Also, I have overclocked the GPU in the BIOS, but nothing is reporting it at 1 GHz. GPU-Z still shows 844. Is that normal?


GPU-Z works for me. I ran mine at 1086 MHz, which I am certain is the same clock you are at right now with yours. Your BIOS must have a toggle option or something for enabling and disabling the custom clocks. What motherboard are you using?


----------



## Lukinrats

Quote:


> Originally Posted by *Opcode*
> 
> GPU-Z works for me. I ran mine at 1086 MHz, which I am certain is the same clock you are at right now with yours. Your BIOS must have a toggle option or something for enabling and disabling the custom clocks. What motherboard are you using?


Yes, 1080. Using the ECS A85F2-A Deluxe. There is an option in the BIOS to OC the IGD, so I enable it and set it to 1080. There is no option to enable the clocks.

I am not very happy with the board, I don't think. I keep asking for someone to post a shot of their BIOS so that I can compare; I feel this board is lacking in the settings department.

I can't get a good OC either. It clocks itself to 44x with turbo; however, when I try to clock past that, Prime causes hardware failures. I can boot in at 46x, but it is not stable enough to run Prime.


----------



## Opcode

Quote:


> Originally Posted by *Lukinrats*
> 
> Yes, 1080. Using the ECS A85F2-A Deluxe. There is an option in the BIOS to OC the IGD, so I enable it and set it to 1080. There is no option to enable the clocks.
> 
> I am not very happy with the board, I don't think. I keep asking for someone to post a shot of their BIOS so that I can compare; I feel this board is lacking in the settings department.
> 
> I can't get a good OC either. It clocks itself to 44x with turbo; however, when I try to clock past that, Prime causes hardware failures. I can boot in at 46x, but it is not stable enough to run Prime.


I'm not a fan of ECS boards. I never owned one, and I never will; they're not exactly a popular company when it comes to hardware. I personally would have spent the extra $5-10 and got the ASRock Extreme6. The problem with getting a BIOS shot of that board is that not very many people own it.


----------



## Lukinrats

Quote:


> Originally Posted by *Opcode*
> 
> I'm not a fan of ECS boards. I never owned one, and I never will; they're not exactly a popular company when it comes to hardware. I personally would have spent the extra $5-10 and got the ASRock Extreme6. The problem with getting a BIOS shot of that board is that not very many people own it.


Funny you should mention that. I just filled out a return to Newegg on that piece of crap board, and I also went ahead and ordered the Extreme6. Weird.


----------



## mtcn77

Is there an overclocking guide for Piledriver-based CPUs depicting transistor delay (clock speed) data across the voltage spectrum?
Intel's excellent FinFET shmoo plot graphs left me yearning.


----------



## Skitsofrantik

I'm building a PC with the A10-6800K APU. I've been trying to find out what graphics capabilities I could expect and found this YouTube channel with a lot of videos of games running on the A10-6800K. I was wondering if these were real or fake. Can I get this kind of performance from this APU?

Youtube Link: http://www.youtube.com/user/STRIKESHARK?feature=csp-in-feed

any help would be appreciated


----------



## Opcode

Quote:


> Originally Posted by *Skitsofrantik*
> 
> I'm building a PC with the A10-6800K APU. I've been trying to find out what graphics capabilities I could expect and found this YouTube channel with a lot of videos of games running on the A10-6800K. I was wondering if these were real or fake. Can I get this kind of performance from this APU?
> 
> Youtube Link: http://www.youtube.com/user/STRIKESHARK?feature=csp-in-feed
> 
> any help would be appreciated


Yes, but keep in mind he is running 2400 MHz memory which provides a huge boost. And his iGPU is overclocked from 844 MHz to 1080 MHz.


----------



## glussier

He is also playing at 720p. While 2400 MHz RAM provides a boost, it's not as huge as some people want us to believe.


----------



## DaveLT

If you want a cheap low-power gaming rig, either:
1) Grab a decent FM2 board like an Extreme6 and the 6800K + TridentX 2400MHz/C10 and be done with it, or
2) Wait for Kaveri. It might be extremely good. The upcoming A88X boards from Gigabyte made me drool.

I can bet I won't like the G1.Sniper A88X, since it's green with no features to bother buying it over even the D3H. The UP4, though, is a work of art.


----------



## Papadope

Maybe don't wait for Kaveri, but definitely wait for FM2+. The boards are going to be so much better, and then you also have an upgrade path to Kaveri later on. Besides, they should be out within 2 months tops.

I bought an A8-3850 on FM1 about a month before FM2 came out instead of waiting. I regret that one now; I would really have liked to upgrade to an A10-6800K.


----------



## DaveLT

Quote:


> Originally Posted by *Papadope*
> 
> Maybe don't wait for Kaveri, but definitely wait for FM2+. The boards are going to be so much better, and then you also have an upgrade path to Kaveri later on. Besides, they should be out within 2 months tops.
> 
> I bought an A8-3850 on FM1 about a month before FM2 came out instead of waiting. I regret that one now; I would really have liked to upgrade to an A10-6800K.


Err ... Isn't Kaveri coming out at the same time as FM2+?


----------



## Opcode

Quote:


> Originally Posted by *glussier*
> 
> He is also playing at 720p. While 2400MHz RAM provides a boost, it's not as huge as some people want us to believe.


Yes it is; running 2400 MHz memory makes a huge difference. I conducted a benchmark using Tomb Raider to prove it. Memory bandwidth is holding these APUs back big time. The graph below shows that the jump from 1866 MHz to 2133 MHz alone provided a 3-4 FPS boost on average. So going from the typical 1866 MHz that most people run to 2400 MHz would provide a 6-8 FPS gain in games. That's the difference between playing at 23 FPS and playing at 30 FPS.
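As a back-of-envelope check, the 6-8 FPS figure follows from the measured 1866→2133 step. This is a hypothetical linear model, my assumption only; real gains taper off once the iGPU stops being bandwidth-bound:

```python
# Hypothetical linear extrapolation from the Tomb Raider numbers above.
# Assumes FPS scales linearly with memory clock, which only holds while
# the iGPU is memory-bandwidth-bound.
def fps_gain(base_mhz, target_mhz, measured_gain_fps, measured_step_mhz):
    """Extrapolate an FPS gain from one measured memory-clock step."""
    per_mhz = measured_gain_fps / measured_step_mhz
    return (target_mhz - base_mhz) * per_mhz

# 1866 -> 2133 MHz measured ~3.5 FPS (midpoint of the 3-4 FPS range).
gain = fps_gain(1866, 2400, measured_gain_fps=3.5, measured_step_mhz=2133 - 1866)
print(round(gain, 1))  # ~7.0 FPS, consistent with the 6-8 FPS estimate
```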


----------



## Papadope

Quote:


> Originally Posted by *DaveLT*
> 
> Err ... Isn't Kaveri coming out at the same time as FM2+?


No, because FM2+ supports all FM2 processors. It's the same as AM3+, I purchased my AM3+ motherboard and Phenom II 1100T (AM3 Processor) months before the first FX chips were released.


----------



## Papadope

Wow, the price of DDR3 2400 has come down significantly over the last 6 months. It's definitely a viable purchase in an APU build now. I remember a few months back, when I built my friend's computer, there was a significant jump in price from 2133MHz to 2400MHz. They are the same price now.


----------



## DaveLT

It used to be 80 bucks; now it's 95 bucks


----------



## Opcode

Quote:


> Originally Posted by *DaveLT*
> 
> It used to be 80 bucks; now it's 95 bucks


It is?


----------



## DaveLT

Quote:


> Originally Posted by *Opcode*
> 
> It is?


Yes, a month or two back, for the Trident-X.
Remember, I wrote Trident-X C10.


----------



## Cyberburnout

I was a part of the AMD Test Drive program and received an A10-6800K. I love this APU. It's a great balance of CPU power, GPU, and pocket-friendliness. I am using an SSD for a boot drive and a 1TB Seagate hybrid drive for storage. It boots in no time flat. The kit came with 1866 performance memory, but I switched it to 2133 and it made a pretty big difference: snappier, games run smoother, and streaming HD video also benefits from the switch. So if anyone is thinking of getting one for a build or an HTPC, go with at least 2133 instead of the cheaper 1866. It's a great overclocker too. Had mine running at 4.9 under water; backed it down to 4.7 so it can run silent.


----------



## Opcode

Quote:


> Originally Posted by *DaveLT*
> 
> Yes, a month or two back, for the Trident-X.
> Remember, I wrote Trident-X C10.


Bah, who needs them? Just pick up a pair of these; it's all you'll need. Games can't use more than 4GB of memory, and with 2GB dedicated to the iGPU you've still got some to spare.


----------



## nitrubbb

How does the 6800K (or 5800K) handle TrackMania 2 Valley and America's Army 4? (On the iGPU, I mean.)


----------



## m3nt4t

Hello everyone... I'm horribly confused as to what temps I'm actually getting on my A10-6800K. EVERY monitoring tool reads something different.... the software that came with my motherboard (Easy Tune 6) almost ALWAYS reads 28C. The BIOS itself is telling me 37-39C and SpeedFan is currently telling me 41C... aside from Easy Tune most of these readings seem a bit high considering I'm running a Corsair H60 water cooling unit and I've applied Arctic Silver 5 EXACTLY the way it was shown on Newegg's YouTube vid... I really want to see what this APU will overclock to but I'm not even going to bother until I can get some good temps... any info or advice would help


----------



## Durquavian

Quote:


> Originally Posted by *m3nt4t*
> 
> Hello everyone... I'm horribly confused as to what temps I'm actually getting on my A10-6800K. EVERY monitoring tool reads something different.... the software that came with my motherboard (Easy Tune 6) almost ALWAYS reads 28C. The BIOS itself is telling me 37-39C and SpeedFan is currently telling me 41C... aside from Easy Tune most of these readings seem a bit high considering I'm running a Corsair H60 water cooling unit and I've applied Arctic Silver 5 EXACTLY the way it was shown on Newegg's YouTube vid... I really want to see what this APU will overclock to but I'm not even going to bother until I can get some good temps... any info or advice would help


Like many will probably tell you, try loading the APU and check temps that way. If the readings you list above are at low load or idle, they will vary a bit from one monitoring program to another.


----------



## m3nt4t

I see the reasoning I believe... use say Prime 95 and see what temps that produces???


----------



## Farmer Boe

Quote:


> Originally Posted by *m3nt4t*
> 
> I see the reasoning I believe... use say Prime 95 and see what temps that produces???


Yes, install a bunch of monitoring programs and have them all running as you start prime95. Watch which one looks the most accurate/plausible under load and idle. Sometimes the idle temps will drop to 0C depending on the program but as long as the load temp readout is working accurately, you're fine.
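A quick way to apply that advice is to jot down each tool's idle and load readings and discard the obviously glitched ones. This is just an illustrative sketch; the tool names, the example numbers, and the 74C limit quoted elsewhere in this thread are assumptions, and the script does not talk to any real sensor API:

```python
# Illustrative sanity check for comparing temperature readouts from
# several monitoring tools. Values are hand-entered from each tool;
# this does not read any hardware sensor.
T_MAX = 74  # thermal limit quoted in this thread for these APUs (assumption)

def plausible(idle_c, load_c):
    """A reading is plausible if load > idle and both are in a sane range.
    Some tools report 0C at idle, so only the load reading is trusted then."""
    if idle_c <= 0:
        return 0 < load_c <= T_MAX + 20
    return 0 < idle_c < load_c <= T_MAX + 20

readings = {            # (idle C, Prime95 load C) -- example numbers
    "EasyTune6": (28, 51),
    "SpeedFan":  (41, 62),
    "Glitchy":   (0, 55),   # idle dropout, load still usable
    "Broken":    (17, 5),   # load below idle: reject
}
for tool, (idle, load) in readings.items():
    print(tool, plausible(idle, load))
```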


----------



## DaveLT

Trust ET6. Don't go over 74C on that one and you will be fine


----------



## Himo5

Quote:


> Originally Posted by *m3nt4t*
> 
> Hello everyone... I'm horribly confused as to what temps I'm actually getting on my A10-6800K. EVERY monitoring tool reads something different.... the software that came with my motherboard (Easy Tune 6) almost ALWAYS reads 28C. The BIOS itself is telling me 37-39C and SpeedFan is currently telling me 41C... aside from Easy Tune most of these readings seem a bit high considering I'm running a Corsair H60 water cooling unit and I've applied Arctic Silver 5 EXACTLY the way it was shown on Newegg's YouTube vid... I really want to see what this APU will overclock to but I'm not even going to bother until I can get some good temps... any info or advice would help


If that's a Gigabyte board it may also be worth your while checking the website for updates to utilities/drivers/bios since a lot of them were revised after release, both at the start of 2013 and later for Richland.


----------



## MacClipper

Quote:


> Originally Posted by *m3nt4t*
> 
> I see the reasoning I believe... use say Prime 95 and see what temps that produces???


See the HWiNFO64 readouts below.

My latest effort: using my budget pair of KVR1333 and cranking them up to DDR3-1866, still at 1.50V.

A much easier time with P95 Blend on 8GB of RAM (vs 16GB).


----------



## DaveLT

... KVR1333 @ 1866 CL9


----------



## m3nt4t

Thank you everyone for your replies, I apologize for my inexperience but I am learning.... and I think I've found a concurrence although I'm not entirely sure what it means just yet. It seems that the System reading on Easy Tune and the Temp1 reading on SpeedFan match exactly through my (relatively tame) testing... as Easy Tune is made by my motherboard manufacturer I want to say that it has the correct CPU reading and the System reading and Temp1 are probably reading the chipset(?). The curveball in all this is that the BIOS reads my CPU temps as only a couple of degrees below the System/Temp1 readings... my motherboard is the Gigabyte GA-F2A85XN-WIFI, all drivers and the BIOS are up to date. My H60's plugged in to the CPU header as per the instructions and the SP120 fan is plugged in to the System header, the board only has two fan headers... the case fans run off a fan controller and the PSU. I want to believe that the watercooled CPU is going to run a bit cooler than just about ANY air-cooled solution and since most of what I've read says the processor idles at 27-35C 28C doesn't seem like too much of a stretch... anyways, here's a couple of screenshots of my monitoring apps, I'll post some more when I send it through Prime95. Thanks again for the advice!


----------



## MacClipper

KVR1333 - rather transient Hynix CFR batch, also does DDR3-2133 CL9 at 1.65V but I prefer running it mostly at its stock 1.5V thus DDR3-1866C9 instead.

Further testing done, I am liking my 6800K/Extreme6+ combo more and more... rainy weather with ambient temps of ~27C.

Same Vcore at 1.40V BIOS LLC 0%







Killed off a few startup TSRs like CCC and finally broke through the stubborn 4.0 ceiling for Cinebench 11.5 score, not bad for an AMD APU on ambient cooling... yay!


----------



## spatulator

Has anyone had success with a higher overclock by disabling the iGPU? Does it make a difference? What about disabling 2 cores and the iGPU to see the max threshold?


----------



## Himo5

Quote:


> Originally Posted by *spatulator*
> 
> Has anyone had success with a higher overclock by disabling the iGPU? Does it make a difference? What about disabling 2 cores and the iGPU to see the max threshold?


Looking at the breakdown of the top 7 OCs for the A10-5800K that I found a few months back, 5 of them have 'GPU Type: Standard VGA Graphics Adapter', which I presume means the iGPU has been disabled.

Most of them have also dropped the RAM speeds right down and tightened them up as far as possible, although the amount of RAM doesn't seem to have been a factor.

On BIOS upgrades for ASUS FM2 boards post Richland, under Advanced>CPU Configuration> there is now a Core Leveling Mode [Automatic/OneCorePerProcessor/OneComputeUnit/OneCorePerCU]. So you are probably onto something there.


----------



## nz3777

I just bought the A10-6800K for the kids and wife to have a gaming/media computer. I got lucky and got a free MSI motherboard and Far Cry 3; not bad for $129.99. Micro Center rocks! What's the best board I can get for this little quad-core monster? I might just use the MSI board as a backup; I'm getting the feeling it's not the best overclocker lol~


----------



## iceman595

Quote:


> Originally Posted by *nz3777*
> 
> I just bought the A10-6800K for the kids and wife to have a gaming/media computer. I got lucky and got a free MSI motherboard and Far Cry 3; not bad for $129.99. Micro Center rocks! What's the best board I can get for this little quad-core monster? I might just use the MSI board as a backup; I'm getting the feeling it's not the best overclocker lol~


The Asus Pro mobo is the way to go:
http://www.newegg.com/Product/Product.aspx?Item=N82E16813131883


----------



## Opcode

Quote:


> Originally Posted by *nz3777*
> 
> I just bought the A10-6800K for the kids and wife to have a gaming/media computer. I got lucky and got a free MSI motherboard and Far Cry 3; not bad for $129.99. Micro Center rocks! What's the best board I can get for this little quad-core monster? I might just use the MSI board as a backup; I'm getting the feeling it's not the best overclocker lol~


Save yourself any headaches and get one of these while they are in stock. The Extreme6 for the A85X chipset holds the world record for the FM2 platform. This is the exact same board, just updated with some better features and support for the FM2+ platform. So if you ever decide to drop in a Kaveri chip later (which could totally be worth it) you'll have a board capable of that option. I own the FM2A85X Extreme6 and can say it's a solid board. That's all assuming you can fit an ATX board in your case.


----------



## nz3777

I haven't bought a case for them yet; I'm still looking around. But the ASRock Extreme6, I like those actually. OK, thanks for the info. How many PCIe lanes on that one?


----------



## Durquavian

Quote:


> Originally Posted by *nz3777*
> 
> I haven't bought a case for them yet; I'm still looking around. But the ASRock Extreme6, I like those actually. OK, thanks for the info. How many PCIe lanes on that one?


2 PCIe x16 slots: a single card runs at x16, dual cards at x8/x8, but at PCIe 3.0 the x8 may not hurt much.
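For a rough sense of why a 3.0 x8 link is rarely a bottleneck, the per-direction bandwidth works out as follows. These are back-of-envelope figures assuming the standard PCIe signaling rates and encoding overheads, not measurements from any board:

```python
# Approximate per-direction PCIe bandwidth in GB/s per lane, after
# encoding overhead: Gen2 runs 5 GT/s with 8b/10b encoding (x0.8),
# Gen3 runs 8 GT/s with 128b/130b (~x0.985). 8 bits per byte.
GBPS_PER_LANE = {
    "2.0": 5.0 * (8 / 10) / 8,     # 0.5 GB/s
    "3.0": 8.0 * (128 / 130) / 8,  # ~0.985 GB/s
}

def pcie_bw_gbps(gen, lanes):
    """Aggregate per-direction link bandwidth for a given generation/width."""
    return GBPS_PER_LANE[gen] * lanes

print(round(pcie_bw_gbps("3.0", 8), 2))   # ~7.88 GB/s
print(round(pcie_bw_gbps("2.0", 16), 2))  # 8.0 GB/s -- Gen3 x8 ~= Gen2 x16
```

So a Gen3 x8 slot carries about as much as a Gen2 x16 slot, which is why x8/x8 CrossFire on these boards costs so little in practice.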


----------



## nz3777

What GPU do you guys recommend with that board and the A10-6800K? Like, what would work best? A 7970?


----------



## DaveLT

Quote:


> Originally Posted by *nz3777*
> 
> I haven't bought a case for them yet; I'm still looking around. But the ASRock Extreme6, I like those actually. OK, thanks for the info. How many PCIe lanes on that one?


Indeed. But the UP4 is better on the power consumption front.
Quote:


> Originally Posted by *Durquavian*
> 
> 2 PCIe x16 slots: a single card runs at x16, dual cards at x8/x8, but at PCIe 3.0 the x8 may not hurt much.


It won't even hurt to begin with, BUT it's down to the APU whether it supports 2.0 or 3.0.


----------



## Indy1944

Dude, a 7970? Really? I own the 6800 and I'd never abuse myself by putting a 7970 in my case. The sweet-spot video card is the 7850, maybe a 7870 if you overclock the CPU....


----------



## Indy1944

MSI is garbage, will prolly burn up if u overclock


----------



## nz3777

Yeah, I think you're right! A nice 7870 might do the trick. Maybe the PowerColor Devil?


----------



## DaveLT

Quote:


> Originally Posted by *nz3777*
> 
> Yeah, I think you're right! A nice 7870 might do the trick. Maybe the PowerColor Devil?


Nah too expensive, don't bother with the Devil edition.
Quote:


> Originally Posted by *Indy1944*
> 
> MSI is garbage, will prolly burn up if u overclock


Your assumption level is off the charts here:
1) The FM2 motherboards are actually decent.
2) The GPUs are okay.


----------



## nz3777

Anyone have any benchmarks with the Richland? Id like to see how it stacks up against other chips.


----------



## beers

Quote:


> Originally Posted by *nz3777*
> 
> Anyone have any benchmarks with the Richland? Id like to see how it stacks up against other chips.


Which ones were you interested in? They've been out quite a while, I'm sure you could turn a few up through some searching..


----------



## robbo2

I grabbed one of these chips for a competition on HWBot. Unfortunately, it died a little prematurely









http://valid.canardpc.com/2897181


----------



## DaveLT

Quote:


> Originally Posted by *robbo2*
> 
> I grabbed one of these chips for a competition on HWBot. Unfortunately, it died a little prematurely
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://valid.canardpc.com/2897181


Are you running PSCheck? I think if you did that it might kill it prematurely ... running the APU on single-core


----------



## robbo2

Quote:


> Originally Posted by *DaveLT*
> 
> Are you running PSCheck? I think if you did that it might kill it prematurely ... running the APU on single-core


I was, but couldn't clock single cores with this board. So both cores in the second module were running at that speed.


----------



## nz3777

Guys, what would happen if I put a Radeon 6950/70 in with the A10-6800K? Will it boost the graphics? I just need a case and a PSU now and I can fire this thing up. Well, maybe a motherboard as well; I don't like this MSI they gave me for free! Maybe use it as a backup? Should I go out and buy a new operating system or just use the Windows 7 SP1 I currently run on my FX-6100 based system? I know there's a ton of updates to do, but if I'll save $110, the hell with it, I will do it. I personally don't like Windows 8 too much.


----------



## Devildog83

Quote:


> Originally Posted by *nz3777*
> 
> Yeah, I think you're right! A nice 7870 might do the trick. Maybe the PowerColor Devil?


The Devil made me do it.


----------



## nz3777

Nice!!! How's it run, btw? Have you benched it? I'm curious if it's better than a regular 7870. It's gotta run cool as hell considering it has those 3 fans! Beastly little card!


----------



## DannyT

My A10 should be coming in today; I should get a hell of a performance boost over an A6.


----------



## NewHighScore

Hey guys. I'm thinking about grabbing a 6800K for the wifey to play games on. While I have absolutely no hate for AMD, I have always run Intel CPUs in my rigs and am not quite up to speed on the AMD news. Are there any new APUs on the horizon? We don't mind waiting a month or so if need be; otherwise I will probably just grab a 6800K.

Cheers!


----------



## nz3777

I just got ours the other day but I haven't had a chance to fire it up yet. I think there IS a new APU scheduled to release but I'm not sure exactly when; Kaveri or something like that lol. I heard the only difference between Trinity and Richland, 5800K vs 6800K, is just clock speed. I'm really curious to fire this thing up; I just need a case and a PSU now and that's about it. I will be glad to post some benchmarks in a few days for you. They say the hybrid CrossFire is no good, but I'm still gonna invest $50-$75 in a GPU; I wanna try it out for myself. If it doesn't work out I'll stick in a Radeon 6970 for the time being until I get my hands on the 9000 series~


----------



## Opcode

Quote:


> Originally Posted by *NewHighScore*
> 
> Hey guys. I'm thinking about grabbing a 6800K for the wifey to play games on. While I have absolutely no hate for AMD, I have always run Intel CPUs in my rigs and am not quite up to speed on the AMD news. Are there any new APUs on the horizon? We don't mind waiting a month or so if need be; otherwise I will probably just grab a 6800K.
> 
> Cheers!


Kaveri should be hitting sometime in Q4 2013 to Q1 2014. So yes a whole new platform is on the horizon, and with Steamroller cores and GCN architecture it should be worth the wait.


----------



## Devildog83

Quote:


> Originally Posted by *nz3777*
> 
> Nice!!! How's it run, btw? Have you benched it? I'm curious if it's better than a regular 7870. It's gotta run cool as hell considering it has those 3 fans! Beastly little card!


Here's a heaven run, maxed at 60c. I set the fans to 80% for the test.


Spoiler: Warning: Spoiler!







3DMark11 -




It's really not that little; it's longer and taller than most 7950s at 11.22" long and 5.37" high.


----------



## Devildog83

By the way, since I am around here: I built an A8-6600K system, and since I am not the APU expert, I am wondering: if I stick my old HD 7770 in that system and shut down the iGPU side, will it perform better than stock? When Kaveri comes out we will upgrade to the max.


----------



## glussier

Yes it will be faster with the 7770.


----------



## Opcode

Quote:


> Originally Posted by *Devildog83*
> 
> By the way, since I am around here: I built an A8-6600K system, and since I am not the APU expert, I am wondering: if I stick my old HD 7770 in that system and shut down the iGPU side, will it perform better than stock? When Kaveri comes out we will upgrade to the max.


The HD 7770 will outperform the built-in iGPU big time. I don't think you can shut down the iGPU though; it just runs in an idle state of like 150 MHz so it doesn't eat a bunch of power or produce a bunch of heat. I have an HD 5870 paired with my A10-6800K, and it handles the card pretty well. I get 90% and higher utilization in games like Bad Company 2.


----------



## nz3777

If I were ever to get a 7870 that would be the one! The Mini-Devil I like to call it lol. Pretty nice fps On Heaven btw.


----------



## Devildog83

Quote:


> Originally Posted by *Opcode*
> 
> The HD 7770 will outperform the built-in iGPU big time. I don't think you can shut down the iGPU though; it just runs in an idle state of like 150 MHz so it doesn't eat a bunch of power or produce a bunch of heat. I have an HD 5870 paired with my A10-6800K, and it handles the card pretty well. I get 90% and higher utilization in games like Bad Company 2.


Cool thanks.


----------



## Devildog83

Quote:


> Originally Posted by *nz3777*
> 
> If I were ever to get a 7870 that would be the one! The Mini-Devil I like to call it lol. Pretty nice fps On Heaven btw.


Yep, I love the back-plate too. I would love to have the Devil 13 7990 if you can even find them anymore. I would have to get a bigger PSU.


----------



## DannyT

Um... I think I have a problem. My CPU seems to be running at ridiculously high temps. In the BIOS it reads around 56 degrees Celsius.


----------



## Durquavian

Quote:


> Originally Posted by *DannyT*
> 
> Um... I think I have a problem. My CPU seems to be running at ridiculously high temps. In the BIOS it reads around 56 degrees Celsius.


The BIOS will usually read higher than the OS; something about fan profiles not running yet, or something like that. Load into your OS, Windows or whatever, and check it there.


----------



## DannyT

It's 69 degrees in HWMonitor and 41 degrees in SpeedFan.


----------



## Opcode

Quote:


> Originally Posted by *DannyT*
> 
> It's 69 degrees in HWMonitor and 41 degrees in SpeedFan.


Try the trial of AIDA64 Extreme. On the Extreme6, almost no other software seems to read temperatures right.


----------



## m3nt4t

I've gotten a different reading from EVERYTHING I've tried... AIDA64EE has me at 17C, EasyTune6 has me at 28C, the BIOS has me at 35C, Speedfan and Open Hardware Monitor have me at 40C so I understand your confusion.... as I've been advised in this thread before I've chosen to go by what the temps read under load... none of them have gotten higher than 51C which is WAY under the thermal barrier of 74C for this APU. From what I've been able to find these APU's DO run hot and at 100W TDP this isn't very surprising... that said DannyT your temps DO seem a lot higher than normal... maybe compare temps with the stock cooler and fresh smear of Arctic Silver??? If they continued to stay in these ranges I'd consider an RMA....


----------



## Ultisym

Quote:


> Originally Posted by *Devildog83*
> 
> Here's a heaven run, maxed at 60c. I set the fans to 80% for the test.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 3DMark11 -
> 
> 
> 
> 
> It's really not that little; it's longer and taller than most 7950s at 11.22" long and 5.37" high.


Im curious, what did this 7870 cost you? I just picked up a pair of 7950s for $199 each.


----------



## DaveLT

Quote:


> Originally Posted by *m3nt4t*
> 
> I've gotten a different reading from EVERYTHING I've tried... AIDA64EE has me at 17C, EasyTune6 has me at 28C, the BIOS has me at 35C, Speedfan and Open Hardware Monitor have me at 40C so I understand your confusion.... as I've been advised in this thread before I've chosen to go by what the temps read under load... none of them have gotten higher than 51C which is WAY under the thermal barrier of 74C for this APU. From what I've been able to find these APU's DO run hot and at 100W TDP this isn't very surprising... that said DannyT your temps DO seem a lot higher than normal... maybe compare temps with the stock cooler and fresh smear of Arctic Silver??? If they continued to stay in these ranges I'd consider an RMA....


They don't run hot; the stock cooler does. I mean, just look at it: they supplied a crappy aluminium cooler with a fan that tops out at 3500 RPM. It's temporary cooling at best.
It's quite obvious the chip is aimed at the enthusiast crowd, as TBH none of us uses stock cooling or would even imagine using it...


----------



## m3nt4t

I don't use a stock cooler (H60 water cooler), I merely suggested using a different cooler to get an estimate on temps............ and in other news my APU seems to be climbing in idle temps as well, I just walked in to the room to find my temps at 51C. Starting to think I should have gotten an i3









----------



## m3nt4t

I might add that using anything OTHER than the stock cooler voids the warranty (according to the accompanying literature)... it IS a crappy cooler though, and loud too!


----------



## Devildog83

Quote:


> Originally Posted by *Ultisym*
> 
> Im curious, what did this 7870 cost you? I just picked up a pair of 7950s for $199 each.


$208.00. It did come with a solid back-plate, which would have cost about $30 and which I would have had to purchase anyway, so it's still a good deal to me. I have gone into long debates about getting a low-end 7950 as opposed to this high-end 7870, and I understand, but this is the route I chose and I don't regret it one bit. A 7950 near this quality would have cost near or even over $300 and doesn't look as good IMO, which does matter to me. If I had to pay the $260 retail price I would have thought a lot more about getting a 7950.


----------



## Papadope

Quote:


> Originally Posted by *Devildog83*
> 
> $208.00. It did come with a solid back-plate, which would have cost about $30 and which I would have had to purchase anyway, so it's still a good deal to me. I have gone into long debates about getting a low-end 7950 as opposed to this high-end 7870, and I understand, but this is the route I chose and I don't regret it one bit. A 7950 near this quality would have cost near or even over $300 and doesn't look as good IMO, which does matter to me. If I had to pay the $260 retail price I would have thought a lot more about getting a 7950.


That 7870 is a nice card, I'm liking the look.







Never saw it before.


----------



## Devildog83

Thanks Papadope, I love it. Now I am considering another for X Fire.

I just put my HD 7770 in the Jonny 1 leg A8 6600k rig and it is a nice improvement from the onboard graphics. It's actually very close in 3Dmark11 to what I got with the FX 4100 and that card. I am impressed with AMD APU's so far. I put together a very nice rig for $450 plus the 7770.


----------



## phillyd

Might be buying one soon. Anyone know where I can get one used?

Also looking to buy the Gigabyte A85x ITX mobo, anyone know if its any good?


----------



## DaveLT

Quote:


> Originally Posted by *phillyd*
> 
> Might be buying one soon. Anyone know where I can get one used?
> 
> Also looking to buy the Gigabyte A85x ITX mobo, anyone know if its any good?


It's already the best A85X ITX mobo there is; if you want one, you don't have to think any further.


----------



## NewHighScore

Quote:


> Originally Posted by *phillyd*
> 
> Might be buying one soon. Anyone know where I can get one used?
> 
> Also looking to buy the Gigabyte A85x ITX mobo, anyone know if its any good?


I have that motherboard but I have yet to test it out. I'll let you know my opinion once I set it up.


----------



## phillyd

Thanks for the opinions guys. I'm definitely willing to pay $5 more for it over the ASRock.


----------



## DaveLT

Quote:


> Originally Posted by *phillyd*
> 
> Thanks for the opinions guys. I'm definitely willing to pay $5 more for it over the ASRock.


It's also kinda worth it over the ASRock for the IR VRMs, since they are more efficient.


----------



## ChrisB17

How difficult is this chip to get stable at 5 ghz?


----------



## mtcn77

Pretty much the apprehension of having to pay as much for water cooling as you did for the chip itself, imo. The Corsair H100i is my best bet. The guys at vmodtech have even done 5.3GHz; funny that it equates to a Phenom II X4 at 4.2GHz in some benchmarks.


----------



## PiMaster9001

Quote:


> Originally Posted by *Alatar*
> 
> 
> 
> http://valid.canardpc.com/hbjhmm


That's a pretty good voltage readout. How well did it scale to stock/near stock clocks?


----------



## Justinbaileyman

That's got to be a suicide run; I can tell you he isn't using that for 24/7 use!!
I wanna see some more benchies at those speeds please..
Can you run PassMark for me, pretty please??
Also, what cooling are you using??
I am running a 760K with a Kraken X60 all-in-one water cooling setup and the max I can squeeze out is 5.2GHz at 1.525V for 24/7 use.


----------



## DaveLT

Quote:


> Originally Posted by *Justinbaileyman*
> 
> That's got to be a suicide run; I can tell you he isn't using that for 24/7 use!!
> I wanna see some more benchies at those speeds please..
> Can you run PassMark for me, pretty please??
> Also, what cooling are you using??
> I am running a 760K with a Kraken X60 all-in-one water cooling setup and the max I can squeeze out is 5.2GHz at 1.525V for 24/7 use.


Wow ... Power of RCM.


----------



## Justinbaileyman

Quote:


> Originally Posted by *DaveLT*
> 
> Wow ... Power of RCM.


What is RCM??


----------



## Durquavian

Quote:


> Originally Posted by *Justinbaileyman*
> 
> That's got to be a suicide run; I can tell you he isn't using that for 24/7 use!!
> I wanna see some more benchies at those speeds please..
> Can you run PassMark for me, pretty please??
> Also, what cooling are you using??
> I am running a 760K with a Kraken X60 all-in-one water cooling setup and the max I can squeeze out is 5.2GHz at 1.525V for 24/7 use.


He is using extreme cooling. The name escaped me for a moment, something like a Peltier. Not LN2, but beyond water cooling. That's it: phase-change cooling.


----------



## Justinbaileyman

Quote:


> Originally Posted by *Durquavian*
> 
> He is using extreme cooling. The name escaped me for a moment, something like a Peltier. Not LN2, but beyond water cooling. That's it: phase-change cooling.


Ahh, got it, thanks... I'd hate to have his electric bill, that's for sure.


----------



## Opcode

Quote:


> Originally Posted by *Justinbaileyman*
> 
> What is RCM??


Resonant Clock Mesh. It's a power-saving feature incorporated into Richland APUs, and it's why we can reach higher clocks on less voltage with the A10-6800K than the A10-5800K.


----------



## awdrifter

My lightly overclocked (4.5GHz) 6800K is throttling. This happens after a few minutes of OCCT. Does anyone know what would cause a 6800K to throttle? In the BIOS I already set the target temp to 70C, and the CPU is running at 60-62C. Thanks.


----------



## mtcn77

It still is throttled back by temperature.


----------



## EastCoast

There are more things to consider than just the APU.


----------



## DaveLT

Quote:


> Originally Posted by *mtcn77*
> 
> It still is throttled back by temperature.


But it WON'T throttle until beyond 70C


----------



## Farmer Boe

Quote:


> Originally Posted by *DaveLT*
> 
> But it WON'T throttle until beyond 70C


Which program are you using to monitor the load temps? These things are notorious for giving inaccurate readings depending on the program used.


----------



## mtcn77

Quote:


> Originally Posted by *DaveLT*
> 
> But it WON'T throttle until beyond 70C


Well, at least mine does... It is a hard threshold; the CPU monitoring circuit won't let you.


----------



## awdrifter

Thanks for the info, I'll try with a better cooler.


----------



## phillyd

Anyone know when the Gigabyte FM2+ motherboards, particularly the F2A88XN-Wifi, will be released? Trying to decide if I should wait for it or just get an FM2 board now, and upgrade later.


----------



## DaveLT

Quote:


> Originally Posted by *Farmer Boe*
> 
> Which program are you using to monitor the load temps? These things are notorious for giving inaccurate readings depending on the program used.


AIDA64. The only program i will use for Richland and up


----------



## jezzer

Quote:


> Originally Posted by *phillyd*
> 
> Anyone know when the Gigabyte FM2+ motherboards, particularly the F2A88XN-Wifi, will be released? Trying to decide if I should wait for it or just get an FM2 board now, and upgrade later.


They are already released, at least here where I am.


----------



## phillyd

Yeah, not in the US


----------



## nitrubbb

which apu should I get until Kaveri?

only requirement is that Trackmania 2 Stadium needs to be playable


----------



## phillyd

Quote:


> Originally Posted by *nitrubbb*
> 
> which apu should I get until Kaveri?
> 
> only requirement is that Trackmania 2 Stadium needs to be playable


The 6800k is $130 on amazon. I have one and love it. I'll go run TM2 Stadium and see how it runs on here.

If you want to know more about the rig, it's the second one in my sig. Just built it yesterday. I am downloading and installing it now.


----------



## phillyd

My rig with this APU










Please check out the *Build log* for more pics and sub! Feedback is appreciated


----------



## Indy1944

I'm running my 6800 at 4.6GHz with Dominator Platinums at 2133. I can play BF3 on medium with no AA and it's pretty smooth; I can play it and still have a good experience. It looks better than console, which for a combined GPU/CPU chip is amazing. I'm buying an HD 7870 today for BF4 and gonna up the CPU to 4.7. I don't think I'm gonna lose a lot of frame rate, maybe a 10% hit; still not bad, and I hope BF4 looks just as good....


----------



## phillyd

Quote:


> Originally Posted by *nitrubbb*
> 
> which apu should I get until Kaveri?
> 
> only requirement is that Trackmania 2 Stadium needs to be playable


Sorry to get back to you so late, but I ran TM2 Stadium on the Normal preset with no adjustments and got 50 FPS with my 6800K at 1080p.


----------



## Noobism

Quote:


> Originally Posted by *phillyd*
> 
> My rig with this APU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please check out the *Build log* for more pics and sub! Feedback is appreciated


Very nice









Was thinking about putting one of these into a build for the GF.


----------



## nitrubbb

Quote:


> Originally Posted by *phillyd*
> 
> Sorry to get back to you so late but I ran TM2 Stadium with no adjustments to the settings from Normal preset at 50 FPS with my 6800k at 1080p


great


----------



## tuffy12345

Quote:


> Originally Posted by *phillyd*
> 
> My rig with this APU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please check out the *Build log* for more pics and sub! Feedback is appreciated


ugh...why does that look so good? You're going to make me take the time to try and get my fat fingers to make mine look like that.
Quote:


> Originally Posted by *nitrubbb*
> 
> which apu should I get until Kaveri?
> 
> only requirement is that Trackmania 2 Stadium needs to be playable


Yup. 6800K plays it fine. One of the first games I got it with on the steam sale.


----------



## phillyd

Quote:


> Originally Posted by *Noobism*
> 
> Very nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Was thinking about putting one of these into a build for the GF.


Thanks








Quote:


> Originally Posted by *nitrubbb*
> 
> Quote:
> 
> 
> 
> Originally Posted by *phillyd*
> 
> Sorry to get back to you so late but I ran TM2 Stadium with no adjustments to the settings from Normal preset at 50 FPS with my 6800k at 1080p
> 
> 
> 
> great

I'm sure you can get even better-looking settings with some tweaking and still run just fine. I'm also using 1333MHz RAM atm, so faster RAM would give a boost in GPU performance.
Quote:


> Originally Posted by *tuffy12345*
> 
> ugh...why does that look so good? You're going to make me take the time to try and get my fat fingers to make mine look like that.


It took a lot of effort and a slightly bulging back-panel but it was worth it. Thanks









*I hate to sound like a broken record, but my build log needs feedback everyone*


----------



## Noobism

Quote:


> Originally Posted by *phillyd*
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> i'm sure you can get even better looking settings with some tweaking to run just fine. I'm also Using 1333MHz RAM atm so faster ram would give a boost in GPU performance.
> It took a lot of effort and a slightly bulging back-panel but it was worth it. Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *I hate to sound like a broken record, but my build log needs feedback everyone*


Curious, how much did everything run you?

Think it might just be best to buy a new case, SSD, and HDD, and transplant everything out of my old build into something for her.








Decisions, decisions


----------



## phillyd

Mobo: $120
CPU: $100
RAM: already had it
PSU: $50 used
SSD's: $120 used
Case is $90 for the standard one.

So with those numbers, $480


----------



## miklkit

I'm looking to upgrade my wife's puter and am thinking of getting an A10-6800K. Is this still a good deal or should I wait a few months?


----------



## Durquavian

Quote:


> Originally Posted by *miklkit*
> 
> I'm looking to upgrade my wife's puter and am thinking of getting an A10-6800K. Is this still a good deal or should I wait a few months?


It's a good deal, but being soooo close to the Kaveri release I'd wait. In fact I'm like you and am waiting for Kaveri for the wife.


----------



## FIRINMYLAZERMAN

How much better or worse do you suppose the AMD A10-7850K will perform in comparison to the AMD Phenom II X4 965 BE?


----------



## DaveLT

Quote:


> Originally Posted by *FIRINMYLAZERMAN*
> 
> How much better or worse do you suppose the AMD A10-7850K will perform in comparison to the AMD Phenom II X4 965 BE?


It will eat it for lunch. Simply put


----------



## grassh0ppa

Quote:


> Originally Posted by *Opcode*
> 
> Kaveri should be hitting sometime in Q4 2013 to Q1 2014. So yes a whole new platform is on the horizon, and with Steamroller cores and GCN architecture it should be worth the wait.


Hopefully Kaveri allows Dual Graphics with higher-end GPUs as well (at least with high-end 7xxx cards). It would make APU systems the perfect "build it and buy a GPU later" rig.


----------



## kzone75

http://valid.canardpc.com/6xc4ij

Now to figure out why the fans are running at such a high speed..


----------



## mdocod

Quote:


> Originally Posted by *grassh0ppa*
> 
> Hopefully Kaveri allows Dual Graphics with higher end GPUs as well (atleast with high end 7xxx cards). It would make APU systems the perfect "Build it and buy a GPU later" rig.


There's a reason for the "recommended" Xfire configurations. Once the strength of one of the GPUs in Xfire is double that of the other, there is nothing to be gained from Xfire. Placing (for instance) an HD7750 and HD7850 in Xfire would result in performance similar to the discrete HD7850 alone, except WITH the scaling-related tradeoffs (the HD7850 in this configuration would just wind up running 50% idle cycles waiting for the HD7750). As such, Xfiring anything more powerful than an HD7750 with an equal-clocked 512-shader Kaveri would result in "wasted" GPU, as the more powerful card is forced to run idle cycles waiting for the slower card to do its half of the workload.

Simply put: unless they come up with a way to split workloads asymmetrically, asymmetrical hardware won't scale very well.
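As a rough illustration of that 2:1 cutoff, here is a toy alternate-frame-rendering model (my own back-of-the-envelope sketch, not AMD's actual scaling behavior; the FPS figures are made up):

```python
# Toy model of alternate-frame rendering (AFR) with mismatched GPUs.
# Illustration only -- real CrossFire scaling and frame pacing are messier,
# and the FPS numbers below are hypothetical.

def afr_fps(fps_a: float, fps_b: float) -> float:
    """In AFR each GPU renders every other frame, so the pair can sustain
    at most two frames per frame-time of the slower card."""
    return 2.0 * min(fps_a, fps_b)

slow_card = 30.0   # hypothetical iGPU-class performer
fast_card = 70.0   # hypothetical discrete card, more than 2x faster

paired = afr_fps(slow_card, fast_card)  # 60.0 -- worse than the 70 FPS
# the fast card manages alone, so beyond a 2:1 gap the pairing wastes GPU.
```

Right at a 2:1 ratio, 2 × min equals the faster card's own FPS, which is why pairings more lopsided than the recommended matches gain nothing.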


----------



## Durquavian

Quote:


> Originally Posted by *mdocod*
> 
> There's a reason for the "recommended" Xfire configurations. Once the strength of one of the GPUs in Xfire is double that of the other, there is nothing to be gained from Xfire. Placing (for instance) an HD7750 and HD7850 in Xfire would result in performance similar to the discrete HD7850 alone, except WITH the scaling-related tradeoffs (the HD7850 in this configuration would just wind up running 50% idle cycles waiting for the HD7750). As such, Xfiring anything more powerful than an HD7750 with an equal-clocked 512-shader Kaveri would result in "wasted" GPU, as the more powerful card is forced to run idle cycles waiting for the slower card to do its half of the workload.
> 
> Simply put: unless they come up with a way to split workloads asymmetrically, asymmetrical hardware won't scale very well.


Guess you could setup a 2 to 1 configuration. There are some selections in Radeonpro for CF config. I never tried any, was curious though, about them that is.


----------



## Stormscion

Here are some tests of the A10-6800K in relevant, modern, popular games that will stay popular for years to come (unlike most throwaway AAA games used to boost hardware sales).

(they used different settings at lower resolutions to keep it playable, i.e. increasing fidelity at lower resolutions)

RAM at 2133MHz; nothing is overclocked (resolutions up to 1080p)

World of Tanks
http://www.youtube.com/watch?v=6wEbFhPI1Sg

Dota 2
http://www.youtube.com/watch?feature=player_embedded&v=pK2jSmSMUpQ

LOL
http://www.youtube.com/watch?feature=player_embedded&v=NLhL6_MbLk0


----------



## EduFurtado

Folks, please take a quick peek at my thread, especially if you also own Counter-Strike: Global Offensive (CS:GO)









http://www.overclock.net/t/1455555/a10-6800k-and-a10-5800k-owners-i-need-your-help/0_100


----------



## 12Cores

Waiting for the A10-7850K to replace my Xbox 360 as a living room HTPC. My wife still plays Lego games on the 360, and I'm hoping I can run those games at 1080p with this chip. If not, I'm going to pick up a 7770.


----------



## EduFurtado

This must have been asked before but.. .What program should I use to check CPU temperatures?
What about GPU temps?


----------



## cssorkinman

Quote:


> Originally Posted by *EduFurtado*
> 
> This must have been asked before but.. .What program should I use to check CPU temperatures?
> What about GPU temps?


GPU-Z for GPU temps and Core Temp for CPU temps.
Be careful when you download them, however; they will sometimes try to add search bars etc.

http://www.techpowerup.com/downloads/2297/techpowerup-gpu-z-v0-7-4/
http://www.alcpu.com/CoreTemp/

Download them at your own risk


----------



## EduFurtado

Quote:


> Originally Posted by *cssorkinman*
> 
> Gpu-z for gpu temps and coretemp for cpu temps .
> Be careful when you download them however, they will try to add search bars etc some times.
> 
> http://www.techpowerup.com/downloads/2297/techpowerup-gpu-z-v0-7-4/
> http://www.alcpu.com/CoreTemp/
> 
> Download them at your own risk


Thanks, but core temp 1.0 RC6 isn't working on my machine :/


----------



## cssorkinman

Quote:


> Originally Posted by *EduFurtado*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> Gpu-z for gpu temps and coretemp for cpu temps .
> Be careful when you download them however, they will try to add search bars etc some times.
> 
> http://www.techpowerup.com/downloads/2297/techpowerup-gpu-z-v0-7-4/
> http://www.alcpu.com/CoreTemp/
> 
> Download them at your own risk
> 
> 
> 
> Thanks, but core temp 1.0 RC6 isn't working on my machine :/
Click to expand...

Try hwinfo or hwmonitor


----------



## EduFurtado

Quote:


> Originally Posted by *cssorkinman*
> 
> Try hwinfo or hwmonitor


So... there is no standard?

HWMonitor shows me readings, but I don't want to believe them.

I need something that works. Why is this so confusing - does it change from motherboard to motherboard? Chipset to chipset?


----------



## cssorkinman

Quote:


> Originally Posted by *EduFurtado*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> Try hwinfo or hwmonitor
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So... there is no standard?
> 
> HW monitor shows me anything, I don't want to believe it.
> 
> I need something that works. Why is this so confusing - does it changes from motherboard to motherboard? Chipset to chipset?
Click to expand...

Sometimes you just have to learn which reading matches which piece of hardware. Post a screenshot of what HWMonitor gives you and someone can interpret it for you.


----------



## Papadope

@EduFurtado

For Trinity and Richland, I have had the greatest success with HWMonitor. Post a screenshot during a 1 minute Prime95 run and we will help you identify the cpu core temp. Hard to determine at idle.


----------



## DaveLT

Quote:


> Originally Posted by *Papadope*
> 
> @EduFurtado
> 
> For Trinity and Richland, I have had the greatest success with HWMonitor. Post a screenshot during a 1 minute Prime95 run and we will help you identify the cpu core temp. Hard to determine at idle.


I, on the other hand, have not had any success with HWMonitor on Richland; it just showed the weirdest temps under load. 105C? Really?


----------



## Papadope

Quote:


> Originally Posted by *DaveLT*
> 
> I on the other hand have not had any success with HWMonitor on Richland, it just showed the weirdest temps during load. 105C? Really?


What motherboard?


----------



## DaveLT

Quote:


> Originally Posted by *Papadope*
> 
> What motherboard?


MSI A55M-P33


----------



## kzone75

Easytune 6 and GPUZ are the only programs that show the correct temps for me. CPU-Z on the other hand may or may not show this:


----------



## Papadope

Quote:


> Originally Posted by *kzone75*
> 
> Easytune 6 and GPUZ are the only programs that show the correct temps for me. CPU-Z on the other hand may or may not show this:


Lol what the hell. Overclock record right there.

I have the A10-6700 on that same motherboard and I've never seen that, and CPUID HWMonitor works great for temps. Well, mine is the FM2+ version; I'm not sure if yours is FM2. Also running Windows 8.1.


----------



## Papadope

Quote:


> Originally Posted by *DaveLT*
> 
> MSI A55M-P33


I built my friend a computer with that board and the A8-5600K, and CPUID HWMonitor worked on it for temps. Windows 8.


----------



## kzone75

Movin' on up..









http://valid.canardpc.com/0zb98c http://www.techpowerup.com/gpuz/5herw/


----------



## joeybuddy96

I bought my second A10-6800K. It's been mentioned before that the heatsink is terrible; that may or may not be true, but I think the retention bracket might be to blame. When there are microscopic gaps filled with air, no matter what TIM is used, the resulting temperatures will be higher, and the retention bracket makes it hard to get good initial contact between the heatsink and the heat spreader. Most users are going with aftermarket coolers anyway, but I plan to move to FM2+, so I'd rather wait until then to start working on a cooling plan for the system. I'm trying to figure out if I should drop the heatsink onto the heat spreader and then hook the retention bracket on, or hook one side of the retention bracket on while hovering the heatsink level, then gently hook the other side of the bracket on and shove the retention hook closed. I keep practicing both methods, but the heatsink keeps slipping around either way, and I can see why that would cause air pockets. I'm using PK-3, a single drop a little bigger than a rice grain. Once I figure out how to get the heatsink on without movement after it's been dropped into place, I'm going to take the heatsink off, clean the PK-3 off with isopropyl alcohol, then do the whole thing over.


----------



## DaveLT

Quote:


> Originally Posted by *joeybuddy96*
> 
> I bought my second A10-6800K. It's been mentioned before that the heatsink is terrible; that may or may not be true, but I think the retention bracket might be to blame. When there are microscopic gaps filled with air, no matter what TIM is used, the resulting temperatures will be higher, and the retention bracket makes it hard to get good initial contact between the heatsink and the heat spreader. Most users are going with aftermarket coolers anyway, but I plan to move to FM2+, so I'd rather wait until then to start working on a cooling plan for the system. I'm trying to figure out if I should drop the heatsink onto the heat spreader and then hook the retention bracket on, or hook one side of the retention bracket on while hovering the heatsink level, then gently hook the other side of the bracket on and shove the retention hook closed. I keep practicing both methods, but the heatsink keeps slipping around either way, and I can see why that would cause air pockets. I'm using PK-3, a single drop a little bigger than a rice grain. Once I figure out how to get the heatsink on without movement after it's been dropped into place, I'm going to take the heatsink off, clean the PK-3 off with isopropyl alcohol, then do the whole thing over.


I didn't have air bubbles, but the thing is it has a pretty slow fan (for a stock heatsink) and an all-aluminum base... coupled with the fact that the heatsink is slightly shorter than the ones on AM3+.


----------



## beers

Quote:


> Originally Posted by *joeybuddy96*
> 
> microscopic gaps filled with air.


Out of all the actual variables, you try to pass THAT off?







The bracket for similar coolers like the OCZ Vendetta is exactly the same.

The stock heatsink doesn't even get full surface area with the CPU, not to mention being a bit undersized. Under max load at stock it's also really audible.


----------



## akromatic

hmm anyone running hybrid CF?

mind answering my question here
http://www.overclock.net/t/1458934/apu-hybrid-crossfire-bandwidth-requirements


----------



## joeybuddy96

Yeah, it's definitely not a quiet heatsink. It's not even a good heatsink. However, as far as I'm concerned, everything is a bad cooling solution compared to a three-stage helium compression system that is in turn cooled by water. I wasn't asking for cooler recommendations, just about applying the stock heatsink I already have; I found the best method was to hover it level, drop it, straighten it, hook on one side, hook on the bracket side, then close the bracket.


----------



## mtcn77

Quote:


> Originally Posted by *joeybuddy96*
> 
> Yeah, it's definitely not a quiet heatsink. It's not even a good heatsink. However, everything compared to a three stage helium compression system that is in turn cooled by water is a bad cooling solution as far as I'm concerned. I wasn't asking for cooler recommendations, just application of the stock heatsink I already have; I found the method of level hover, drop, straighten, hook on one side, hook on the bracket side, then closure of the bracket to be the best one.


Well, it is the CPU, not the heatsink, that's to blame. When I had mine, a quick check in the Asus CPU monitoring software showed the CPU was running at 1.45V... on a 28nm chip? - 32nm, it is - no, thanks! I advise that you try undervolting your CPU. If you let the motherboard pick what it thinks is best, well - it will try to overclock the CPU. Mine went from 1.34V to 1.24V without an issue. I am currently using it at 1.25V, completely avoiding any speed-up from the "quiet" mode.
The 6800K and HD6870 never stir up any noise. The case also needs to be ventilated accordingly, to flush the heat coming off the devices.


----------



## DaveLT

Quote:


> Originally Posted by *mtcn77*
> 
> Well, it is the cpu not the heatsink to blame. When I had mine, a quick check at Asus Cpu monitoring software showed the cpu was running at 1.45v... on a 28 nm chip? - 32nm, it is - No, thanks! I advise that you try out undervolting your cpu. If you let the motherboard pick what it selects is best, well - it will try to overclock the cpu. Mine went from 1.34v to 1.24v without an issue. I am currently using it at 1.25v, completely avoiding any speed up from the "quiet" mode.
> 6800K and HD6870 don't ever stir up any noise. The case also needs to be ventilated accordingly, as per heat flushed from devices.


But thankfully because it's SOI it doesn't mind 1.45v. God bless SOI


----------



## dph314

Can anyone tell me if temperatures are reported correctly? More specifically, in HWinfo64. When I run a bench it reports 80C, but the stock cooler can't be that bad, can it? In the BIOS it gets up to 52C, and since I believe UEFI BIOSes can put a full load on CPUs, I want to assume _that's_ the load temp. But either way, I'm a bit nervous and would like some confirmation one way or the other.

I don't do many AMD builds, but I recommended the 6800k for a friend's budget build and everything went great and is running well, just concerned about the temperature reading I seem to be getting in HWinfo64.

Edit: I just tried AMD OverDrive for the first time. You can imagine how surprised I was until I found out that the Thermal Margin reports distance from dangerous temperatures as opposed to the actual temps







But anyway, Thermal Margin got down to about 32C. I don't know what the max safe temp it counts down from is, though. Would it be TjMax, 74C? If so, I guess the cores would be getting up to around 42-44C (74C minus the ~30-32C that Thermal Margin got down to).

If that's the case then I guess temps aren't that bad after all. I'd still appreciate knowing for sure, though. Also, I have the voltage manually set in the BIOS at 1.32V, yet AMD OverDrive keeps showing "Current Voltage" at 1.4V, even though it reads the "Target Voltage" correctly, at 1.32V.

If anyone could clear this up a bit for me it would be much appreciated.
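If the "distance from danger" reading is right, converting a Thermal Margin back to an absolute temperature is just a subtraction. A minimal sketch, assuming the 74C TjMax figure floated in this thread (not an AMD-confirmed spec):

```python
# Estimate an absolute core temp from AMD OverDrive's Thermal Margin.
# Assumes the margin counts down from TjMax; the 74 C ceiling is the
# value discussed in this thread, not an official AMD specification.

TJMAX_C = 74.0

def core_temp_from_margin(margin_c: float, tjmax_c: float = TJMAX_C) -> float:
    """Absolute temp = assumed ceiling minus reported headroom."""
    return tjmax_c - margin_c

print(core_temp_from_margin(30.0))  # 44.0 -- matches the estimate above
```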


----------



## DaveLT

Oh wow. Don't rely on AMD OverDrive for readings though; I'd much rather use AIDA64. The 80C might just be that 30-105C scale some programs use.
74C TjMax is literally the max, and Intel's TjMax works similarly, except that for the 6800K programs STILL tend to read it wrongly. Some read on a 30-105C scale; some read a proper 0-to-TjMax scale, like AIDA64 does.


----------



## dph314

Quote:


> Originally Posted by *DaveLT*
> 
> Oh wow. Don't rely on AMD OvD for readings though, I'd much rather use AIDA64 for readings. 80C might be just that 30-105C scale that it uses
> 74C Tjmax is the literally the max but Intel Tjmax is similar as well expect that for 6800k programs STILL tend to read it wrongly. Some read as 30-105C scale some read as proper 0-Tjmax which is apps like AIDA64


Yeah, the BIOS sits for a while and will get the temp up to 53C. And if your suggestion is correct, then the 80C I'm seeing, minus 30, is 50C, which is right around what I see in the BIOS. _Plus_, when using OverDrive and running quick-load tests in PerformanceTest64, Thermal Margin gets down to around 29C, and a TjMax of 74 minus 29 = 45, which is in line with the 53C from the continuous full load the BIOS puts on it.

So... I'm apt to believe that, in the end, you're right and HWinfo64 is adding 30C to the temp. All the evidence I've seen points to that.

One thing I'm still not sure on though is the monitoring of voltages in OverDrive. I have the BIOS set to 1.32v, and OverDrive reads the Target at 1.32v, but yet it says the Current is at 1.4v. Does it read the VID for Current and the Vcore for Target?


----------



## cssorkinman

I finally put the 6800k under water
Here are some results thus far : http://valid.canardpc.com/60sz1m
The MSI FM2-A85XA-G65 is a very capable overclocking board. If you ever use one, ease your way into the digital power settings; the 100% setting gives hella v-boost.












----------



## kr00t0n

So I'm running a 6800k in a slimline case as I wanted something portable for gaming and djing, but didn't want to shell out for a laptop that cannot be upgraded over time.

Motherboard: Gigabyte F2A88XN-Wifi
Cpu: A10-6800k FM2 @ stock
Cooler: Zalman CNPS8900 Extreme
SSD: Sandisk Pulse 25 64GB
RAM: 4GB Corsair Vengeance 12800
Case: CiT S003B Slim MATX
Screen: Asus MB168B+ 1080p USB
GPU: Sapphire 7750 low profile

Everything is running fine, Smite @ 1080p medium settings running with no problems.
CPU is hitting 65ºC max, which is to be expected in a slimline case with subpar cooling.

My question is, what is the average undervolt that people have managed keeping stock speeds? Might help bring the temps down.

Alternatively, what is the max overclock at stock volts that people are getting?

Thanks


----------



## mdocod

Kr00t0n,

The Giga board probably has CPB turned on by default, which will run the chip with ~1.40V and aggressively try to maintain turbo speeds. As you're finding, even with a fairly nice compact aftermarket cooler, the chip is pushing near thermal limits at these default settings. Answering your question regarding "stock" voltage is difficult, because every board will tend to run a little different at "default" settings, some will have proprietary performance enhancing features that run more voltage than others. On a giga board, there are "3" stock voltages for an A10-6800k.
1: The VID with CPB enabled: 1.40V IIRC.
2. The VID with CPB disabled: 1.3625V IIRC (pretty sure this is the standard 4.4ghz Pstate that all boards would target for a 6800k, but I could be wrong on this)
3. The "base" VID with turbo disabled: "chip" (in my case 1.275V, as I understand this varies by chip, however I believe it is unlikely to see much deviation from this VID as richland 6800k is already a "premium binned" part, as such, most of the chips sold as this bin will probably share pretty similar base VID characteristics.)

A10-6800K's should be capable of reasonable stability at approximately the following speeds/voltages:
[email protected]~1.20V, [email protected]~1.30V, [email protected]~1.40V, [email protected]~1.50, etc

Keeping in mind, that a voltage setting, and a voltage under a load, are not the same. I use an offset voltage setting with turbo and CPB disabled and medium LLC to run 4.8ghz. Under a load it dips to 1.375V and maintains stability. If we were to consider the "1.40V" default setting of the giga board as "stock," then my particular sample is capable of 4.9ghz on "stock" voltage, provided adjustments are made to accomplish that "1.40V" at the chip under a load. On the other hand, if we consider the 1.275V VID of my chip as "stock," then I can run somewhere around 4.3-4.4ghz at "stock" voltage. Again, assuming things are adjusted to actually run AT that voltage.

Most of those slim cases come with total garbage PSUs. Try to figure out a replacement for it. I believe those are usually a microATX standard PSU size. FSP has a replacement that might fit: FSP300-60GH
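The gap mdocod describes between a set voltage and the voltage under load (vdroop), and how LLC tightens it, can be sketched numerically. The droop figure and the linear LLC model below are illustrative assumptions, not measurements from any particular board:

```python
# Rough sketch of vdroop and load-line calibration (LLC).
# The 0.05 V full-droop figure and the linear LLC model are
# illustrative assumptions; real VRM behavior varies by board.

def loaded_voltage(v_set: float, full_droop_v: float, llc_level: float) -> float:
    """llc_level = 0.0 means no compensation (full droop under load),
    1.0 means the VRM holds the set voltage under load."""
    return v_set - full_droop_v * (1.0 - llc_level)

# e.g. 1.425 V set, 0.05 V of natural droop, medium (50%) LLC:
v_load = loaded_voltage(1.425, 0.05, 0.5)   # -> 1.40 V at the chip
```

This is why a "1.40V under load" target can correspond to a noticeably higher BIOS setting, and why stability should be judged against the loaded reading rather than the set value.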


----------



## kr00t0n

Wow, thanks for the detailed reply!









I already replaced the crud PSU with a Be Quiet SFX that is doing a great job.

Based on the limited cooling, it looks like aiming for 4.4Ghz @ 1.3v is my best bet, but I will deffo have a play around.

I haven't had an AMD chip since my A64 4000+ years ago, so I'm totally out of the loop with modern AMD BIOS settings.


----------



## DaveLT

Quote:


> Originally Posted by *mdocod*
> 
> Kr00t0n,
> 
> The Giga board probably has CPB turned on by default, which will run the chip with ~1.40V and aggressively try to maintain turbo speeds. As you're finding, even with a fairly nice compact aftermarket cooler, the chip is pushing near thermal limits at these default settings. Answering your question regarding "stock" voltage is difficult, because every board will tend to run a little different at "default" settings, some will have proprietary performance enhancing features that run more voltage than others. On a giga board, there are "3" stock voltages for an A10-6800k.
> 1: The VID with CPB enabled: 1.40V IIRC.
> 2. The VID with CPB disabled: 1.3625V IIRC (pretty sure this is the standard 4.4ghz Pstate that all boards would target for a 6800k, but I could be wrong on this)
> 3. The "base" VID with turbo disabled: "chip" (in my case 1.275V, as I understand this varies by chip, however I believe it is unlikely to see much deviation from this VID as richland 6800k is already a "premium binned" part, as such, most of the chips sold as this bin will probably share pretty similar base VID characteristics.)
> 
> A10-6800K's should be capable of reasonable stability at approximately the following speeds/voltages:
> [email protected]~1.20V, [email protected]~1.30V, [email protected]~1.40V, [email protected]~1.50, etc
> 
> Keeping in mind, that a voltage setting, and a voltage under a load, are not the same. I use an offset voltage setting with turbo and CPB disabled and medium LLC to run 4.8ghz. Under a load it dips to 1.375V and maintains stability. If we were to consider the "1.40V" default setting of the giga board as "stock," then my particular sample is capable of 4.9ghz on "stock" voltage, provided adjustments are made to accomplish that "1.40V" at the chip under a load. On the other hand, if we consider the 1.275V VID of my chip as "stock," then I can run somewhere around 4.3-4.4ghz at "stock" voltage. Again, assuming things are adjusted to actually run AT that voltage.
> 
> Most of those slim cases come with total garbage PSUs. Try to figure out a replacement for it. I believe those are usually a microATX standard PSU size. FSP has a replacement that might fit: FSP300-60GH


I didn't expect that, lol. My friend's 6800K ran extremely hot on the stock cooler, so I had to swap in a small 92mm tower heatsink and that fixed the problem.
4.2GHz on that cooler, and 1GHz or so on the GPU.


----------



## kr00t0n

Oh, one more thing: I have disabled the iGPU in the BIOS (as I have a 7750). That is all I need to do, right? I don't want my CPU wasting any resources on the iGPU side of things.


----------



## DaveLT

Quote:


> Originally Posted by *kr00t0n*
> 
> Oh one more thing, I have disable the APU in the BIOS (as I have a 7750), that is all I need to do right? I don't want my cpu wasting any resources on the APU side of things.


Might as well Hybrid CF them.


----------



## kr00t0n

My understanding is that doing so with my 1600MHz DDR3 vs the GDDR5 of my 7750 would give worse performance than the 7750 on its own, along with the little niggles that CrossFire brings.


----------



## DaveLT

Quote:


> Originally Posted by *kr00t0n*
> 
> My understanding is that doing so with my 1600mhz DDR3 vs the DDR5 of my 7750 would give worse performance than the 7750 on it's own, along with the little niggles that crossfire brings.


No.


----------



## boolsheat

Hi guys, I have a couple of questions regarding my A10-5800k.

Here's my setup:
A10-5800k
FM2A88M-HD+
2x4GB Teamgroup Team Vulcan PC3 19200 DDR3 2400

So I'm running this thing under a stock AMD cooler and fan. I'm searching for an all round solid and stable performance, I won't game on the machine. So after playing around with different settings I'm currently at the following settings:

-APU/PCIE at 100Mhz
-CPU: x38 -> 3.8Ghz at 1.275V,
-Turbo disabled
Under full load in Prime95 it jumps, at rare times, up to 1.224V; most of the time it's 1.216V or less, so I guess I can undervolt it to that region, 1.225V?

Currently, after a bit less than 2 hours of Prime95, the CPU is at around 63°C (and less). That seems OK; there's still some room to the max of 74°C, right? I mean, under normal circumstances the CPU won't be at full load for that amount of time, and the normal temperature would be around 40°C or less, as I have C&Q enabled (and some other power-saving features).

Now the tricky part, Northbridge and RAM.
- NB Voltage: 1.2313V
- Speed: 20x -> 2000Mhz
- RAM at 1866
- Timings manually set to the XMP profile detected by AIDA64: @ 981 MHz 9-11-11-29 (CL-RCD-RP-RAS) / 40-257-2-6-15-8-8-25-11 (RC-RFC-CR-RRD-WR-WTR-RTP-FAW-WCL). The motherboard only loaded timings for 1200MHz, and the RAM specification on the homepage isn't that useful as it shows only CL-RCD-RP-RAS; luckily AIDA loads the full specs, very useful stuff.
- RAM Voltage: 1.62V

So now, any comments? Is the voltage properly set? At the current CPU voltage I guess I could push the max frequency a bit higher? If I leave it at x38, should I lower the volts? And what NB frequency should I actually set?

I find this a stable config at the moment.

If I push the RAM to 2133, would the only benefit be FPS, or would there be other benefits?

I was already running them at 2133 but it wasn't that stable: Prime95 crashed (rounding error, I think) after an hour, and after running the AIDA stress test for 3 hours some artifacts were seen in Windows. The CPU was set as above, the NB was at [email protected], and the RAM @ 1.62V. Maybe the RAM should be set to 1.65V and the NB frequency higher? Or maybe just the NB voltage?

OK, that's it for now. I'm sure some of this has already been answered, but finding it all is quite a hassle and I'm completely new to all this, so any help and information will be very appreciated.
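One way to sanity-check whether pushing from 1866 to 2133 is worth the stability fight: first-word latency in nanoseconds barely moves if CAS has to loosen, so the win is mostly bandwidth (and thus iGPU FPS). A quick sketch using generic DDR3 arithmetic, not numbers specific to this kit:

```python
# First-word latency for DDR3: CAS cycles divided by the memory clock
# (half the data rate, since DDR transfers twice per clock).
# Generic arithmetic for illustration, not tuning advice.

def cas_latency_ns(data_rate_mt_s: float, cas_cycles: int) -> float:
    mem_clock_mhz = data_rate_mt_s / 2.0          # e.g. 1866 MT/s -> 933 MHz
    return cas_cycles / mem_clock_mhz * 1000.0    # cycles / MHz -> ns

print(round(cas_latency_ns(1866, 9), 2))   # 9.65 ns
print(round(cas_latency_ns(2133, 11), 2))  # 10.31 ns -- a looser CL can
# give back the latency gain, leaving bandwidth as the main benefit
```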


----------



## mdocod

Kr00t0n,

Interestingly enough, I've run into a lot of claims on this board that it is not possible to disable the iGPU in these APU builds, though I found what appears to be a way to disable it in the BIOS on my Giga-wifi board as well. I suspect you're already 3 steps ahead of most people on disabling the iGPU.









You are correct, there is no benefit to crossfiring any GPU with the A10-6800K at this time, especially not with an HD7750-GDDR5 edition. In most game titles it will simply override your dual graphics setting and run the HD7750 discrete anyway because of the architecture difference, and in the few games that will use both, the frame pacing will be so bad that the improvements in FPS are squandered.

In the coming years, we may see improvements in drivers and in hardware utilization, such that you could use the iGPU in the A10 for compute purposes like PhysX, while the discrete GPU continues to handle the primary render. For now it's probably best to leave the iGPU disabled, but keep your eyes "peeled" for opportunities to use the iGPU in tandem as software side changes are made.

DaveLT,

The A10-6800K is certainly up against a self-conflicted thermal struggle on the stock cooler when run at "default" settings in some cases, especially when run with proprietary performance enhancement features like "core performance boost" (Gigabyte's proprietary performance enhancement drug). Through fine tuning, however, there's room for improvement. In a case with really good airflow, or on a "test bench" (or open case) install, the stock cooler can probably get most of these chips to [email protected]~1.30V (base clocks, no turbo) and ~1150MHz iGPU clocks (the iGPU also needs around 1.30V, but I haven't seen the actual voltage under load reported anywhere, so I'm unsure).

Boolsheat,

It sounds like everything is running well. I'll offer some insight to ponder, which may or may not help...

Is 1.275V the default VID for the chip at base (non-turbo) clocks? If so, then there's a pretty good chance you could run that chip [email protected] on the stock cooler. However, you'll need to make adjustments to get a true 1.25V to the chip.

Some voltage sag is pretty normal when the chip is under a heavy load. It's also not unusual to have the actual running voltage on a chip be slightly different from what has been set regardless. Look through your motherboard BIOS settings for load line calibration. You can adjust the LLC settings to help compensate for voltage sag, effectively "tightening" up the difference in voltage from idle to loaded state. In order to "test" the results of your changes to LLC settings, it would be best to turn off all power saving modes in BIOS (like cool/quiet and the "C" states).

If you adjust the voltage down manually to "match" where you have seen it running, then the actual voltage under a load will likely sag even lower, which may or may not introduce instability.

I'm not familiar with default NB speeds/voltages for Trinity. I believe it was [email protected] on my Richland chip. In my experience NB overclocking returns very little benefit, and as such, should be left at stock or near stock to save that dissipation for another area of overclocking that will return greater results. Assuming you are interested in overclocking the chip...

The NB voltage control will also directly affect your iGPU and memory controller voltage. (Some boards may have a separate voltage control for the iGPU.)

In order to overclock the memory and iGPU (which should be overclocked together to improve FPS in games), you'll probably need to bump up the NB voltage a bit. So while you shouldn't waste dissipation on overclocking the NB itself (at least for now), you should throw some extra voltage here to support higher iGPU and memory controller clocks. I run my Richland iGPU @~1.33V and 1200MHz, with the memory at 2400MT/s. In _theory_ the CPU silicon itself should be reasonably tolerant up to ~1.55V under strong air cooling for any part of the chip, but I have not personally found many reports of NB tinkering beyond 1.40V on these, so that's as high as I'm personally willing to tinker for now. It's probably best not to take any part of the chip beyond ~1.40V on the stock cooler anyway, as it will just overheat (more on this in a moment).
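As a rough back-of-the-envelope illustration of why memory speed matters so much for iGPU FPS (my gloss, not from the post: the iGPU renders out of system DDR3, so its ceiling is the ordinary dual-channel DDR3 bandwidth figure):

```python
# Peak theoretical DDR3 bandwidth: each 64-bit channel moves 8 bytes
# per transfer, and APU boards run two channels.

def ddr3_bandwidth_gbs(transfer_rate_mts, channels=2, bus_width_bytes=8):
    """Peak bandwidth in GB/s for DDR3 at the given MT/s rating."""
    return transfer_rate_mts * 1e6 * channels * bus_width_bytes / 1e9

for speed in (1600, 1866, 2133, 2400):
    print(f"DDR3-{speed}: {ddr3_bandwidth_gbs(speed):.1f} GB/s")
# DDR3-1600 gives 25.6 GB/s; DDR3-2400 gives 38.4 GB/s, a 50% jump.
```

Going from 1600 to 2400 is a 50% bandwidth increase, which is why the iGPU and memory clocks are worth pushing together.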

You may need to adjust the memory voltage such that it reads 1.65V (actual) in order to run at the rated speeds. I'm not sure how much "tolerance" or "headroom" there will be in that Team brand stuff. Keep in mind that memory instability can come from either the memory itself or the memory controller. If you get to 1.70V on the memory and still aren't stable, then it's probably the memory controller; bump up the NB voltage to compensate for instability on that side. My experience on Richland leads me to believe there is unlikely to be anything gained on the stock cooler above ~1.30V on the NB. Above that point, any achievable gains come with instability, likely related to unreported thermals.

I don't know whether or not 2400MT/s speeds are possible on Trinity. Interestingly enough, a lot of review sites had a hard time getting Richland to run that fast, but we have lots of reports of folks running those speeds (lazy reviewers aren't taking the time to solve the problem and find stability, lol).

I think there are 2 primary mistakes that are made by most people overclocking on 32nm AMD chips:
1. Not enough time spent establishing a base-line minimum voltage for stability. (Going somewhere, without first knowing where we are, is counterproductive)
2. Not enough time spent experimenting with board settings to get the tightest possible voltage regulation as the load changes. (This offers us the most efficient path to get where we have decided to go)

These areas of neglect lead to a lot of guesswork, throwing settings at the problem hoping for an outcome. My experience has been that throwing numbers/settings at the problem wastes a lot of time (I wasted plenty, though I have fun either way). By going through a discovery process first regarding voltage control and chip stability, we establish a strong "you are here" beacon on a map. From there, it's easy to point to anywhere on the map we might want to go, and know exactly what voltage setting and LLC setting to use to get there with reasonable stability on the first try. Then it's a simple matter of finding out whether or not there is enough thermal dissipation and steady power delivery to run at the destination chosen. If so, begin fine tuning.
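The discovery process above can be sketched as a loop: walk the voltage down at fixed clocks until the stress test fails, and the last passing voltage is your baseline. Everything here is a simulation for illustration; `stress_test` stands in for a real hour of Prime95, and the 1.225V threshold is invented:

```python
# Simulated "establish a baseline first" procedure. In reality each
# stress_test() call is an hour-long Prime95/AIDA run at fixed clocks.

SIMULATED_VMIN = 1.225  # pretend threshold: the chip's real minimum

def stress_test(vcore):
    """Stand-in for a real stability run at a fixed multiplier."""
    return vcore >= SIMULATED_VMIN

def find_baseline(start=1.400, step=0.025):
    """Step the voltage down until the test fails; the last passing
    voltage is the baseline minimum for these clocks."""
    v = start
    while stress_test(round(v - step, 3)):
        v = round(v - step, 3)
    return v

print(f"baseline minimum: {find_baseline():.3f} V")
```

With the baseline pinned down, every later clock/voltage target is a known offset from a known point rather than a guess.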


----------



## boolsheat

Thanks for some really useful information.

I don't know what my CPU's VID is, but it might just be 1.275V; AFAIK the CPU always runs at default settings while in UEFI. The HW monitor in UEFI says 3.8GHz and 1.27V (occasionally jumping lower). I can't confirm this, as I haven't tested whether the voltage changes with different settings.

So I think I'll try to get my RAM stable at 2133MHz. I'll see if I can make it work at 1.62V (XMP profile timings) with the NB at around 1.25V. I tried to make it work at 2400MHz but it wasn't stable even at 1.35V on the northbridge. I have to say that my motherboard has the NB and GFX voltage tied together. Also, I found some info from ASRock saying that to make 2400MHz RAM run stable, the NB needs at least 1.38V. I think that's quite high if the recommended NB setting is around 1.275V. Also, der8auer says he wouldn't go above 1.325V for a 24/7 build, and his max safe voltage for testing with proper cooling is 1.4V.

I also disabled the C6 state, Cool'n'Quiet and APM (or whatever it's called) so that the CPU frequency and voltage are stable. I find the system a lot snappier with those things disabled. The CPU (at x3.8) is set to 1.275V and runs at 1.216-1.224V during Prime95. So I guess I could really run the CPU at x40 at this voltage (during Prime95 the temps are stable at 62°C). I haven't tested it though, as I'm still working on the NB and RAM, which I still think are the trickiest things to set.









Also, since you said the NB frequency isn't that important, I lowered it to 1800MHz. I found a small northbridge benchmark at madshrimps.be which shows that there indeed aren't many benefits from setting it any higher: http://www.madshrimps.be/articles/article/1000359/AMD-Trinity-A10-5800K-APU-Review/6#axzz2rKihRfv8


----------



## boolsheat

So I tried to go for 4GHz at 1.272V, but soon after boot it crashed. What's different now is that the CPU runs at 1.264-1.272V at 3.8GHz all the time, while previously it never went past 1.224V at the same frequency. Why the change? I'm quite sure everything else is set the same as before. Did I somehow damage the CPU?

EDIT:
Nevermind, I figured it out. Previously I was always running Prime in the background when looking at the voltage; the last couple of times I checked it was without Prime. So basically, when I run the CPU under Prime at ~100% load, it runs at a lower voltage than when it's idling at the same frequency!? I guess this is some automatic temperature-saving feature when the CPU is under full stress?


----------



## DaveLT

Quote:


> Originally Posted by *boolsheat*
> 
> So I tried to go for 4Ghz at 1.272V but after soon after boot it crashed. What's different now is that the CPU now runs at 1.264-1.272V at 3.8Ghz all the time while previously it never went past 1.224V at the same frequency. Why the change? I'm quite sure everything else is set the same as before. Did I somehow damage the CPU?
> 
> EDIT:
> Nevermind, I figured it out. Previously I was always running Prime in the background when looking at the voltage. The last couple of times I checked it was without prime. So basically if I run the CPU with prime when its like 100% load the CPU runs at lower voltage then when its idling at the same frequency!? I guess that this is some auto temperature saving feature when the CPU is at full stress?


Voltage droop.


----------



## mdocod

boolsheat,

When you run Prime, it increases the electrical load substantially. The voltage to the chip sags due to resistance. It has nothing to do with a "feature" it is just a consequence of imperfect conductivity. The voltage regulation circuit on the motherboard is constantly monitoring and attempting to compensate for the change in load. The aggressiveness of that attempt to correct for voltage sag under a load can be adjusted on most motherboards with a setting called "load-line calibration." You should have this feature under the "OC Tweaker" in BIOS. You should experiment with the LLC settings available to see which offers the tightest voltage regulation when transitioning from idle to load state. (but still erring on the side of sagging, rather than rising). Very aggressive LLC settings will sometimes cause the voltage to RISE when a workload is applied. This can be counterproductive to achieving an efficient overclock.
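The behaviour described above can be captured in a toy model (illustration only; real boards expose LLC as named presets like "Medium" or "Extreme", not a 0-1 knob, and the numbers below are invented):

```python
# Toy model of Vdroop and LLC: vdroop_raw is the sag with LLC disabled,
# and "compensation" models how aggressively the VRM corrects it.
# compensation > 1.0 models the over-aggressive preset that makes
# voltage RISE under load, as warned about above.

def load_voltage(v_set, vdroop_raw, compensation):
    """Voltage seen under load: LLC cancels part (or more) of the droop."""
    return round(v_set - vdroop_raw * (1.0 - compensation), 3)

print(load_voltage(1.300, 0.060, 0.0))   # no LLC: sags under load
print(load_voltage(1.300, 0.060, 0.75))  # moderate LLC: tighter
print(load_voltage(1.300, 0.060, 1.25))  # too aggressive: rises
```

The goal mdocod describes is the middle case: the idle-to-load delta tightened up, but still erring on the side of a slight sag.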

You're in the conservative zone here for voltages. At default settings many boards will run the chip at 1.40V.

If you're bumping into stability limitations but still able to boot with ~1.225V under load at 4.0ghz, then you may have your baseline more or less figured out already. Manually set the voltage to 1.30V, and begin testing your LLC settings. When you have figured out which setting holds voltage the tightest from idle to load, you'll have the upper hand in finding an efficient clock/voltage setting that improves performance without causing thermal problems.

I can't recall if you mentioned, are you on the factory heatsink or something else?


----------



## boolsheat

Oh, really didn't know about these things, so this is called voltage droop: http://en.wikipedia.org/wiki/Voltage_droop

Well I don't have any LLC settings available unfortunately in BIOS. I wonder what is the default setting?

Yes, I'm on the stock cooler and fan.

I was still having some issues with RAM; I'm not sure if the NB or RAM voltage was too low. Now I'm at 1.25V for the NB and 1.67V for the RAM, as I can't set the RAM voltage to 1.65V: the BIOS only allows 1.57V, 1.62V, 1.67V, 1.72V, so fine tuning will be quite hard.

Does it really matter if RAM has a higher voltage? I wonder in what relation are NB and RAM voltage when it comes to stability.

I think I'll set the NB voltage a bit higher, set RAM to 1.62V and run memtest so I'll see if RAM can work safe at 1.62V. Then I'll know that the problem was NB voltage and not RAM.


----------



## Durquavian

Quote:


> Originally Posted by *boolsheat*
> 
> Oh, really didn't know about these things, so this is called voltage droop: http://en.wikipedia.org/wiki/Voltage_droop
> 
> Well I don't have any LLC settings available unfortunately in BIOS. I wonder what is the default setting?
> 
> Yes, I'm on the stock cooler and fan.
> 
> I was still having some issues with RAM, I'm not sure if the NB or RAM voltage was too low. Now I'm at 1.25V for NB and 1.67V for RAM as I can't set the voltage for RAM to 1.65V, the BIOS only allows me 1.57V, 1.62V 1.67V, 1.72V so fine tuning will be quite hard.
> 
> Does it really matter if RAM has a higher voltage? I wonder in what relation are NB and RAM voltage when it comes to stability.
> 
> I think I'll set the NB voltage a bit higher, set RAM to 1.62V and run memtest so I'll see if RAM can work safe at 1.62V. Then I'll know that the problem was NB voltage and not RAM.


Couple of things:

1: Voltage droop without LLC causes some confusion. Technically max temp is the concern, more so than max voltage. LN2/suicide runs go to 2.0V, not something we would even attempt for daily use. For instance: my 8350 needs ~1.48V for 4.84GHz. Without LLC I need to account for Vdroop, which at that speed is about 0.06V, so I need to set the BIOS voltage to 1.54V. A lot of 8350 users above 5.0GHz with no LLC must run above 1.55V to adjust for Vdroop, which can range anywhere from 0.04-0.1V. The only reason I mention this is the loose max of 1.55V that was stated early on. Of course, we now know the issue is the heat more so than the voltage. Some are running 1.7V daily to account for Vdroop with no dead chips yet. Degradation is unknown for now.

2: The JEDEC standard for memory states that up to 1.93V must be tolerated with no permanent damage to the memory module; it just doesn't have to run stable there. So you could safely run in the 1.75V range with little issue.


----------



## boolsheat

So basically it doesn't matter if the memory modules are running at a 0.2V higher voltage? The modules will just get a bit warmer?


----------



## Durquavian

Quote:


> Originally Posted by *boolsheat*
> 
> So basically it doesn't matter if memory modules are running at 0.2 higher voltage. The modules will just get a bit warmer?


Not sure if anyone has had heat issues with today's RAM, but I gather that has more to do with speed. At 1866 and lower, probably not a lot of heat. But 2400 plus some volts may create a bit; I just haven't seen anyone mention it.


----------



## Lutfij

Hey Everyone,

I'd like to join the club with my very first AMD build and in effect my very first APU.

Specs:
A10-6800K(stock) @ stock volts
Asrock FM2A85X-itx
Mushkin Blackline Ridgeback 996991 running at 1866MHz on XMP settings (it was manually set to around 2133MHz in BIOS until the unit got really hot)
Crucial M4 128GB ssd
Toshiba 1TB 2.5" HDD
Thermaltake Litepower 450W PSU
Connected to Dell U2311H + HP LA2306x

^ all assembled on a table top.

Any tips on how I can undervolt it? I don't play games on it and am running the stock cooler. I'll hopefully transition to a Lone Industries L2 case and cram it all in there.


----------



## Lutfij

My ASRock FM2A85X-ITX is now dead and I'll be moving forward with an RMA process through Newegg. I actually think the board is broken, since no matter what I did in the BIOS (undervolting), the temps would skyrocket on my stock cooler, even at idle.

It initially came with BIOS 1.2 and I updated to 1.6 without issues. However, I noticed the fan on the stock cooler wouldn't ramp up when my APU would jump to Turbo frequencies. I also tried disabling Turbo mode, IOMMU and the SATA IDE mode.

Is it the fault of the APU, or meself, or something else? I was under the impression ASRock had reworked their QC and built a great reputation. Would anyone here suggest I take another route apart from ASRock, since I may just get a new board altogether in a mITX form factor?

I could also learn the ins and outs, but up until now I haven't been able to locate a good undervolting guide for those running at stock speeds/turbo modes on 2133MHz RAM. If I may reiterate: do NB clocks and voltages matter for running 2133MHz?


----------



## Durquavian

The NB clock must be higher than the RAM clock. The stock NB setting should be enough to cover 2133, and it only requires extra voltage if you populate all four RAM slots rather than two.


----------



## Lutfij

Yeah, thanks for the info mate, +rep. I figured as much, but no matter what I did, it'd abruptly shut down, and the chipset heatsink as well as the stock heatsink would be really hot to the touch, like finger-hurting hot. Should I bother with cooling the chipset too, with an aftermarket higher-finned chipset cooler from FrozenCPU?

Is it possible to give a rough idea of where a proper A10-6800K's bare-minimum voltages would be? I'm interested in undervolting the APU and also seeing whether I can run 2133MHz DIMMs on it. Is it possible, or am I beating a dead horse?

I've learnt that timings don't matter much on an APU system, as long as the frequencies are there.


----------



## mtcn77

Boy! Am I happy to have downloaded the "Devastator PowerTune" software that Stilt uploaded! Only by giving that one a try have I been able to set the voltage spot on.


----------



## Lutfij

Is it possible for you to provide a download link? The site where it's located turns up a dead Dropbox link; the file may have been removed or deleted...


----------



## mtcn77

Quote:


> Originally Posted by *Lutfij*
> 
> Is it possible for you to provide a download link? The site where its located turns up to a dead dropbox link where the file may have been removed or deleted...


http://forum.hwbot.org/showthread.php?t=86959&highlight=devastator+powertune
Stilt is awesome.


----------



## mtcn77

The stock cooler, even though small, can cool the 6800K with the iGPU on at 1.13V. I had to downclock to 3.8GHz, though that's not a problem in Dead Space. There is a 1 to 5 Celsius difference with the setting off. If I don't turn off GPU PowerTune, the CPU will just set the voltage 100mV higher than what I specified in BIOS. It causes more trouble than it's worth; I would have to reduce boost voltages, which would stall the CPU when idle. Glad it's done with.


----------



## Lutfij

1.13V with all cores enabled? If so, those are good numbers at those volts.


----------



## mtcn77

Quote:


> Originally Posted by *Lutfij*
> 
> 1.13V on all cores enabled? If so that's some good numbers on those volts.


I agree. The CPU is working flawlessly at 3.8GHz with the GPU at 950MHz when the voltage is set at 1.3125V, VDDNB at 1.09375V. I think I have undervolted the APU to the point where, if I were to decrease it even one hundredth less, the system crashes in 10 seconds.
Quite worth the effort, considering the auto setting would have the CPU reaching 72 degrees Celsius. 59 Celsius is much more stable, and I don't have to suffer the fan noise.


----------



## Lutfij

I'm going to see what Newegg has to say about my dead ASRock mobo so I can get working on it again and shoot for the same kind of undervolting results as you.


----------



## mtcn77

Now at 1.1375V & 1.1V. The APU creeps steadily over 60C if I set any sort of LLC. The Asus EPU software reads 3D power at approximately 55 watts. Amazing.


----------



## mtcn77

Wow, a dynamic frame lock at 30 FPS via RadeonPro, plus opting for SMAA rather than SSAA, decreased my temperatures by another 5 degrees.


----------



## DaveLT

F2A85-M Pro + A10-6800K. So far I've only tested it for operational readiness. Plonked an oversized Zalman clone on it and it still hit 50C across the VRMs and CPU under load. I guess this is normal.


----------



## Lutfij

Speaking of high temps, I was reading through some reviews on Newegg and I came across one member saying that the board would overheat while just sitting in the BIOS screen. Have any of you seen this on your various APU builds/mobos? I usually tend to stay in the BIOS a little while longer before I confirm my settings and exit to Windows.


----------



## DaveLT

Quote:


> Originally Posted by *Lutfij*
> 
> Speaking of high temps, I was reading through some reviews on Newegg and I cam across one member saying that the board would overheat when in only BIOS screen. Have any of you seen this on your various APU builds/mobo's since I usually tend to stay in BIOS for a little while longer before I confirm my settings and exit to windows.


The BIOS always puts load on the chip. That's normal, but for it to overheat ... high ambient temp and a faulty fan on the stock cooler?


----------



## Lutfij

I wouldn't say it's a faulty fan on the stock cooler, since the fan would ramp up immediately after I turned the system back on (after it shut down due to thermal protection), but it would only do that for a few seconds and would then remain that way, irrespective of whether the fan setting for the cooler was set to auto or the highest setting in BIOS.

I do know the BIOS puts some load on the chip, but this being my first AMD+APU build, I was dumbfounded by the way this system behaved... and it's now dead







Had a Sandy Bridge i3-2100 build running side by side and the temps were more than favorable on the same ambient temps









I've set my sights on an ASRock FM2A88X-ITX+ and a Noctua NH-L12 cooler for taming this hot head. Any suggestions not to go that route and look at MSI/Gigabyte instead?

Thanks for the input DaveLT!


----------



## DaveLT

Quote:


> Originally Posted by *Lutfij*
> 
> I wouldn't say faulty fan on the stock cooler since the fan would ramp immediately after I turned the system back up(after it shut down due to thermal protection being on) but it would do that for a few seconds and would remain that way irrespective of whether the fan settings for the cooler was set to auto or the highest setting in BIOS.
> 
> I do know BIOS does put some load but this being my first AMD build+APU build I was dumb founded with the way this system behaved...and is now dead
> 
> 
> 
> 
> 
> 
> 
> Had a Sandy Bridge i3-2100 build running side by side and the temps were more than favorable on the same ambient temps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've set my sights on an ASrock FM2A88X-ITX+ & a Noctua NH-L12 cooler for taming this hit head. Any suggestions not to go that route and look at MSI/Gigabyte?
> 
> Thanks for the input DaveLT!


I wouldn't say anything good about MSI ... ._. Go with the ASRock, it's not bad. How about going with a thin AIO instead?


----------



## Lutfij

Hmmm... my last board was an ASRock, thus the slight discredit to them, but in the world of consumer electronics there can be duds in a batch. As per your suggestion, I think my limiter is the case.

I'm going to put my system inside this case, hence the low-profile-only option for this particular build, unless there is considerable doubt that the airflow in the case may kill the board.

^has 2 bottom and 2 top points for 80mm fans.


----------



## cssorkinman

Quote:


> Originally Posted by *DaveLT*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lutfij*
> 
> I wouldn't say faulty fan on the stock cooler since the fan would ramp immediately after I turned the system back up(after it shut down due to thermal protection being on) but it would do that for a few seconds and would remain that way irrespective of whether the fan settings for the cooler was set to auto or the highest setting in BIOS.
> 
> I do know BIOS does put some load but this being my first AMD build+APU build I was dumb founded with the way this system behaved...and is now dead
> 
> 
> 
> 
> 
> 
> 
> Had a Sandy Bridge i3-2100 build running side by side and the temps were more than favorable on the same ambient temps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've set my sights on an ASrock FM2A88X-ITX+ & a Noctua NH-L12 cooler for taming this hit head. Any suggestions not to go that route and look at MSI/Gigabyte?
> 
> Thanks for the input DaveLT!
> 
> 
> 
> I wouldn't say anything good about MSI ... ._. Go with the ASRock, it's not bad. How about going with an thin AIO instead?
Click to expand...

5500 good things to say about MSI







http://valid.canardpc.com/60sz1m ( on water)


----------



## Lutfij

I'm confused right about now


----------



## sgtgates

Hey guys!

I'm buying a 6800K and this ITX Giga board http://www.newegg.com/Product/Product.aspx?Item=N82E16813128663 for a friend of mine. Can this chip and board run triple monitors? Not looking for Eyefinity gaming or anything, just an office setting... And which video connections would I need, and in what combination?

Thanks


----------



## DaveLT

Displayport, HDMI and DVI.


----------



## sgtgates

Quote:


> Originally Posted by *DaveLT*
> 
> Displayport, HDMI and DVI.


So on that board: the DVI, the HDMI and that other HDMI? I thought that other HDMI wasn't an out?


----------



## DaveLT

Quote:


> Originally Posted by *sgtgates*
> 
> So on that board the dvi hdmi and that other hdmi? Thought that other hdmi wasn't an out?


That's a DisplayPort, not an HDMI -_-


----------



## Papadope

Quote:


> Originally Posted by *Lutfij*
> 
> I wouldn't say faulty fan on the stock cooler since the fan would ramp immediately after I turned the system back up(after it shut down due to thermal protection being on) but it would do that for a few seconds and would remain that way irrespective of whether the fan settings for the cooler was set to auto or the highest setting in BIOS.
> 
> I do know BIOS does put some load but this being my first AMD build+APU build I was dumb founded with the way this system behaved...and is now dead
> 
> 
> 
> 
> 
> 
> 
> Had a Sandy Bridge i3-2100 build running side by side and the temps were more than favorable on the same ambient temps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've set my sights on an ASrock FM2A88X-ITX+ & a Noctua NH-L12 cooler for taming this hit head. Any suggestions not to go that route and look at MSI/Gigabyte?
> 
> Thanks for the input DaveLT!


I can't comment on the quality of the ASRock FM2A88X-ITX+, but I have owned 3 ASRock ITX boards and I am done with ASRock. I can only compare the FM1 A75M-ITX to Gigabyte's F2A88XN-WIFI.

The problems I have with the FM1 A75M-ITX:
-Thin PCB; it bends extremely easily
-Does not respect AMD's keep-out zone for CPU coolers
-Has transistors under the board that interfere with backplates
-3+1 phase design that can't handle my A8-3850 at stock settings; the computer just shuts off when running a demanding game

They followed this up with the FM2A75M-ITX, this board was known for catching on fire. It's documented all over the internet.

I am currently typing this on a Gigabyte F2A88XN-WIFI and I can say the quality is far, far, far superior to that of the ASRock. The board is thick and sturdy. The VRMs have a heatsink on them. There are no transistors on the back of the board. The design respects AMD's keep-out zone, and I was able to install the Noctua NH-L9a with zero problems.

All that being said, the ASrock FM2A88X-ITX+ does look like a nice board, but I have 0 confidence in the company anymore.


----------



## mtcn77

My 6800k is Y cruncher stable at 1.1375v High LLC 3.8Ghz with the stock heatsink.
For reference:
250M Digit

Computation Time: 129.469 seconds
Total Time: 136.607 seconds
CPU Utilization: 371.588 %
Multi-core Efficiency: 92.897 %
Last Digits:
3673748634 2742427296 0219667627 3141599893 4569474921 : 249,999,950
9958866734 1705167068 8515785208 0067520395 3452027780 : 250,000,000
500M Digit

Computation Time: 285.373 seconds
Total Time: 299.974 seconds
CPU Utilization: 382.917 %
Multi-core Efficiency: 95.729 %
Last Digits:
3896531789 0364496761 5664275325 5483742003 7847987772 : 499,999,950
5002477883 0364214864 5906800532 7052368734 3293261427 : 500,000,000
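For anyone curious, the "Multi-core Efficiency" line in y-cruncher's output appears to be just the reported CPU utilization divided by the ideal for the thread count; a quick sanity check of the numbers above (assuming 4 threads):

```python
# Cross-check y-cruncher's reported multi-core efficiency:
# efficiency (%) = CPU utilization (%) / thread count.

def multicore_efficiency(cpu_utilization_pct, threads=4):
    """Efficiency in percent, rounded as y-cruncher prints it."""
    return round(cpu_utilization_pct / threads, 3)

print(multicore_efficiency(371.588))  # 250M-digit run → 92.897
print(multicore_efficiency(382.917))  # 500M-digit run → 95.729
```

Both values match the run logs, which is a nice sign that the undervolted chip isn't throttling mid-run.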


----------



## EduFurtado

I'm about to throw my 6800k out the window.

It throttles like a champion. Every power saving feature is disabled; I even underclocked the CPU to see what would happen, and it persists.
Even with everything disabled to see if I can run at a steady 4.1GHz, which would be enough for me, it still throttles.
Also tried playing with LLC...

Looks like it's the BUS speed that drops down. So this is what I'm targeting to keep stable.

The only way I managed to stop it is by using only 2 cores, but then I hit a wall at 5ghz and can't get enough performance.

Any ideas? ...


----------



## cssorkinman

Quote:


> Originally Posted by *EduFurtado*
> 
> I'm about to throw my 6800k out the window.
> 
> It's throttles like a champion. Every power saving feature is disabled. I even underclocked the CPU to see what would happen and it persists.
> Disabling everything to see if I can run at a steady 4.1ghz which would be enough for me, it still throttles.
> Also tried playing with LLC...
> 
> Looks like it's the BUS speed that drops down. So this is what I'm targeting to keep stable.
> 
> The only way I managed to stop it is by using only 2 cores, but then I hit a wall at 5ghz and can't get enough performance.
> 
> Any ideas? ...


Have you got good airflow over the vrms and socket area?


----------



## Hotrod33809

Hmm, so every power saving thing is disabled yet it still throttles? How is the airflow, like the above poster mentioned? Is the CPU power range in CCC set to only run at 4.1GHz?


----------



## mtcn77

Quote:


> Originally Posted by *EduFurtado*
> 
> I'm about to throw my 6800k out the window.
> 
> It's throttles like a champion. Every power saving feature is disabled. I even underclocked the CPU to see what would happen and it persists.
> Disabling everything to see if I can run at a steady 4.1ghz which would be enough for me, it still throttles.
> Also tried playing with LLC...
> 
> Looks like it's the BUS speed that drops down. So this is what I'm targeting to keep stable.
> 
> The only way I managed to stop it is by using only 2 cores, but then I hit a wall at 5ghz and can't get enough performance.
> 
> Any ideas? ...


What sort of performance are you looking for? PowerTune by default throttles the CPU if you overclock the GPU. Also, the default CPU voltage is way too high. If you cannot cool it, you're better off undervolting than expecting a better clock yield.


----------



## damric

We're having throttling problems over at the 750K/760K club as well when shooting for very high overclocks. Tried everything; the VRMs are cooled, the CPU is cool, etc. There seems to be a TDP lock coded into these chips. See 2.5.1.2 and 2.5.9.2:

http://support.amd.com/TechDocs/49125_15h_Models_30h-3Fh_BKDG.pdf


----------



## cssorkinman

I didn't have any throttling issues with the 6800K , pushed it all the way up to 5.5ghz.


----------



## mtcn77

The GPU asserts throttling when digital monitoring estimates the GPU consumption at more than 45 watts.
You should check out Stilt's contributions to the Bulldozer microarchitecture:
The Hand of Stilt
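A toy model of that behaviour (the 45 W figure is from the post; the step size and the linear power-vs-clock scaling are invented purely for illustration, not from AMD documentation):

```python
# Illustrative PowerTune-style throttle: while the estimated iGPU power
# is over the cap, pull the clock down a step and re-estimate, assuming
# power scales roughly linearly with clock.

POWER_CAP_W = 45.0  # the threshold mentioned in the post

def throttled_clock(requested_mhz, estimated_power_w, step_mhz=57):
    """Clock the firmware would settle on under the power cap.
    step_mhz is a made-up throttle granularity."""
    clock, power = requested_mhz, estimated_power_w
    while power > POWER_CAP_W and clock > step_mhz:
        clock -= step_mhz
        power = estimated_power_w * clock / requested_mhz
    return clock

print(throttled_clock(844, 40))  # under the cap: stays at 844
print(throttled_clock(844, 55))  # over the cap: stepped down
```

This is why an overclocked iGPU can end up slower than stock: once the power estimate crosses the cap, the clock is pulled back regardless of what was set in BIOS.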


----------



## damric

Quote:


> Originally Posted by *mtcn77*
> 
> The gpu asserts throttling when digital monitoring assumes the gpu consumption is more than 45 watts.
> You should check out Stilt's contribution to the Bulldozer microarchitecture.
> The Hand of Stilt


Thanks. I was aware of bulldozer conditioner, but not the other tools. I'll try them out.


----------



## mtcn77

Quote:


> Originally Posted by *damric*
> 
> We're having throttling problems over at the 750K/760K club as well, when shooting for very high overclocks. Tried everything, VRMs are cooled, CPU is cool, ect. There seems to be a TDP lock coded on these chips. See 2.5.1.2 and 2.5.9.2
> 
> http://support.amd.com/TechDocs/49125_15h_Models_30h-3Fh_BKDG.pdf


You can turn off BAPM (2.5.9.3) if you check the link I provided above.
Quote:


> Originally Posted by *damric*
> 
> Thanks. I was aware of bulldozer conditioner, but not the other tools. I'll try them out.


OK, you are welcome! Glad I could help.


----------



## damric

Quote:


> Originally Posted by *mtcn77*
> 
> You can turn of BAPM(2.5.9.3), if you check the link I provided above.
> OK, you are welcome! Glad I could help.


Is BAPM what is causing throttle? I just turned it off and I'm about to test.

Also, this leakage indicator is pretty cool:



How does that compare to what you guys have seen?


----------



## mtcn77

Quote:


> Originally Posted by *damric*
> 
> Thanks. I was aware of bulldozer conditioner, but not the other tools. I'll try them out.


Quote:


> Originally Posted by *damric*
> 
> Is BAPM what is causing throttle? I just turned it off and I'm about to test.
> 
> Also, this leakage indicator is pretty cool:
> 
> 
> 
> How does that compare to what you guys have seen?


Quote:


> The Stilt
> 
> What's the "best" chips ?
> low leakage or high leakage (for LN2) ?
> and scalar high or low ?
> High leakage parts are a must for high frequencies.
> Otherwise you will run into a voltage wall.
> 
> The best parts for high frequency have ultra high to extreme leakage and ultra low (<1980) leakage scalar.
> Lower the leakage scalar value, higher the silicon quality (despite the leakage).
> 
> In my personal experience:
> 
> 1.3375V = 7.6GHz
> 1.3125V = 7.8GHz
> 1.2875V = 8.0GHz
> 1.2750V = 8.2GHz(+)
> 1.2500V = ?.?GHz


"Low" rating means a low-leakage part; you can safely increase the voltage a little more. It is a good chip to undervolt.


----------



## EduFurtado

Quote:


> Originally Posted by *cssorkinman*
> 
> Have you got good airflow over the vrms and socket area?


Airflow: yes, I'm currently running out of a case. However, the VRMs do not have any heatsinks. They don't feel insanely hot to the touch, and even though I don't trust the software I'm using to monitor my temps, nothing is going above 75°C (VRMs are supposed to be good up to 110°C, right?)

Quote:


> Originally Posted by *Hotrod33809*
> 
> Hmm so every power saving thing disabled yet it still throttles? Yeah how is the airflow like the above poster mentioned? CPU power range in CCC set to only run at 4.1 ghz?


Yep, everything is disabled in the BIOS. Windows is set to use 100% CPU all the time, CCC as well, and AMD OverDrive seems to be useless.

Quote:


> Originally Posted by *mtcn77*
> 
> What sort of performance are you looking for? Powertune by default throttles the cpu, if you overclocked the gpu. Also, default v-cpu is way too high. If you cannot cool it, you better undervolt than expect a better clock yield.


Right now all I want is for it not to throttle so I can maintain a stable FPS on CS:GO.


----------



## mtcn77

Quote:


> Originally Posted by *EduFurtado*
> 
> Airflow: yes. I currently out of a case. However they do not have any heatsinks. They do not feel insanely hot to the touch, and even though I don't trust the software I'm using to monitor my temps, there isn't one going above 75C (VRMs are supposed to be good up to 110C, right?)
> Yep, everything disabled on the BIOS, also windows is set to use 100% CPU all the time, CCC as well and AMD overdrive seems to be useless.
> Right now all I want is for it not to throttle so I can maintain a stable FPS on CS:GO.


If you want stable FPS, you should install RadeonPro and run the game through its interface after setting the driver to refresh the scene with "dynamic frame lock". Generally, your monitor's refresh rate is a good reference.
It decreases power consumption, too.
There are many other options to pick from, like SMAA and a driver frame-forward limit, but that's too much to cover at once.
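The idea behind that kind of frame cap can be sketched in a few lines (a minimal illustration of the concept only, not RadeonPro's actual implementation; the 60Hz target is an assumed refresh rate):

```python
import time

def frame_budget(refresh_hz: float) -> float:
    """Seconds each frame may take when capped to the monitor's refresh rate."""
    return 1.0 / refresh_hz

def run_capped(render_frame, refresh_hz=60.0, frames=3):
    """Render frames, sleeping off whatever is left of each frame's budget.

    Capping avoids rendering frames the monitor can never display,
    which is where the power savings come from.
    """
    budget = frame_budget(refresh_hz)
    for _ in range(frames):
        start = time.monotonic()
        render_frame()
        elapsed = time.monotonic() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle instead of drawing extra frames

# A 60Hz cap gives each frame a ~16.7ms budget.
print(round(frame_budget(60.0) * 1000, 1))  # 16.7
```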


----------



## EduFurtado

Quote:


> Originally Posted by *mtcn77*
> 
> If you want stable fps, you should install RadeonPro and run the game through its interface, after you set the driver to refresh the scene with "dynamic frame lock". Generally, your monitor's refresh rate is a good reference.
> It decreases power consumption, too.
> There are many other options to select, like SMAA and driver frame forward limit that is too much to handle at once.


Unfortunately competitive CS:GO is a whole different story.

To solve my problem I can either throw my APU out the window or stop it from throttling. :/


----------



## mtcn77

Quote:


> Originally Posted by *EduFurtado*
> 
> Unfortunately competitive CS:GO is a whole different story.
> 
> To solve my problem I can either throw my APU out the window or stop it from throttling. :/


To be clear in this conversation: if it throttles, I don't think it has to do with the CPU, rather the settings it is running.
I would understand if you said it didn't generate the necessary FPS, or that the CPU was slow (which it can be), but you are saying it does not operate at peak velocity. BAPM still has to be enabled for what you say to be true. It is the main monitoring software for this chip, afaik.


----------



## EduFurtado

Quote:


> Originally Posted by *mtcn77*
> 
> To be clear with this conversation, if it throttles, I don't think it has to do with the cpu, rather the setting it is running.
> I would understand, if you said it didn't generate the necessary fps, or that the cpu was slow which it can be, but you are saying it does not operate at peak velocity. BAPM still has to be enabled for that which you say to be true. It is the main monitoring sotware for this chip, afaik.


BAPM? Don't know what that is; googled it and found no answer.

I believe it's my motherboard. I'm currently running at 3.8GHz









I have room to run it at 5GHz and 1088MHz on the GPU.


----------



## mtcn77

Quote:


> Originally Posted by *EduFurtado*
> 
> BAMP? Don't know what that is, googled for it and found no answer.
> 
> I believe it's my motherboard. I'm currently runnign at 3.8ghz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have room to run it at 5ghz and 1088mhz on the gpu


You should install Devastator PowerTune and press Apply to "disable" it. Found in this link: check the third sticky


----------



## EduFurtado

Quote:


> Originally Posted by *mtcn77*
> 
> You should install Devastator PowerTune and press apply to "disable" it. Found in this link:Check the third sticky


Sir, you deserve the Nobel Prize for best linking.

I'm feeling good about my APU once again, now that it doesn't throttle anymore.

Thank you very much!


----------



## Deero

Hey guys, just want to ask for your opinion about my temperatures. I am currently using the stock HSF.

I ran Prime95 for 5 minutes.



Is the line method more effective for applying thermal paste to this particular processor?

Thanks


----------



## PcGamer1977

I just wanted to see what kind of results you guys were getting with the iGPU. I built my 6800K for a Steam machine in the living room. I ran some tests on the 8670D (I believe it was called) and I didn't get good results. Lucky for me, I had a pair of GPUs sitting around (doing nothing), so I threw a Radeon 6970 into the mix. There's no way the 8670D can compete with the 6970, right? Even if I get the faster 2133MHz RAM? Currently only running 1600MHz. Thanks for any input.


----------



## mdocod

8670D: ~7GP/s, ~20GT/s, ~15-30GB/s, ~650GFLOP, VLIW4
HD6790: ~13GP/s, ~34GT/s, ~134GB/s, ~1350GFLOP, VLIW5
Interestingly, the VLIW4 architecture in the 8670D is slightly more efficient, but that's not enough to overcome the deficit in raw performance.
HD6970: ~28GP/s, ~85GT/s, ~176GB/s, ~2700GFLOP, VLIW4

The ROPs and VRAM bandwidth become the major problem for the A10. The HD6970 will generally be 4-6X faster depending on the conditions of the workload.

[edited for correction (see below)]
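For what it's worth, those GFLOP figures fall out of shader count x clock x 2 FLOPs per cycle (one fused multiply-add per ALU per clock). A rough sketch, with clocks taken from public spec listings:

```python
def gflops(shaders: int, clock_mhz: float) -> float:
    # Each shader ALU retires one fused multiply-add (2 FLOPs) per clock.
    return shaders * 2 * clock_mhz / 1000.0

hd8670d = gflops(384, 844)   # A10-6800K iGPU: 384 SPs @ 844MHz
hd6970 = gflops(1536, 880)   # HD 6970: 1536 SPs @ 880MHz

print(round(hd8670d))               # 648, i.e. the ~650 GFLOP figure above
print(round(hd6970))                # 2703, i.e. the ~2700 GFLOP figure above
print(round(hd6970 / hd8670d, 1))   # 4.2 -- the raw compute gap
```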


----------



## PcGamer1977

I'm sorry, are you talking about the 6970 or the 6790? I didn't even think there was a 6790. But it did feel A LOT faster in games like Metro, for example. I wasn't able to play on the lowest settings with the 8670D, and even then I was barely hitting 15 fps (1920x1080). I also have a 720p monitor I can run it on, and it does a lot better at 720p as well. But thanks for that, I was not aware it was that much faster :thumb: Edit: I know a lot of people will say you neutralize the value of the APU if you add a dedicated video card, and I agree to a certain point. Thing is, I didn't go out and buy a GPU; I had it sitting around unused anyway, and it makes a pretty nice little gaming machine, to be honest. So far it's handled every game I've tested and then some. Wonder if I should overclock it, though? I have the Gigabyte A88XN/Wi-Fi board.


----------



## mdocod

Woops.... dyslexia... lol

Yes, there is a 6790 ... see edit.


----------



## PcGamer1977

Damn, my friend, you know the specs like a hardware rep! More power to ya.


----------



## mdocod

lol, nope, just know where to find 'em: http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units


----------



## PcGamer1977

Lol, still impressive nonetheless... So basically the 8670D is an entry-level or midrange GPU? Even if I were to get some 2133MHz RAM, there's no way to game with the iGPU? What about Hybrid CrossFire? It would still be slower than a 6970, right? And thanks, btw, for answering my questions.


----------



## beers

You can game with the iGPU, just don't expect things like BF4 on ultra... or high.. or maybe medium.

Here's a 'hybrid crossfire' bench with a 7570 for reference:
http://www.3dmark.com/3dm11/7876480


----------



## PcGamer1977

It was probably wrong of me to try to play Metro Last Light, but even on the lowest settings and at 720p the game was just a mess, getting anywhere from 5fps to 9fps max! This makes me wonder how long until AMD is able to put a stronger iGPU on the CPU itself, something like a 7970. Can that even be done, or not yet? If they can pull something like that off, and in time I'm pretty sure they will, it would make one hell of a gaming chip if you ask me lol. Thanks for the benchmarks, beers.


----------



## DaveLT

Quote:


> Originally Posted by *PcGamer1977*
> 
> It was probbly wrong of me to try and play metro last light but even on the lowest settings and at 720p the game was just a mess getting anywhere from 5fps to 9fps max! This makes me wonder how long until amd is able to make a stronger igpi on the cpu itself, something like a 7970 can that even be done or not yet? If they can pull something like that off and in time iam pretty sure they will, it would make one hell of a gaming chip if you ask me lol. Thanks for the benchmarks beer.


Unless you want a 300W TDP APU, no. It would also be quite expensive, and you'd need better DRAM on board, which we don't have yet. HBM is the answer, though.


----------



## mdocod

Quote:


> Originally Posted by *PcGamer1977*
> 
> Lol still impressive nonetheless,.......So basicly the 8670d is a entry level or midrange gpu? Even if I were to get some 2133mhz ramm theres no way to game with the I-gpu? What about hybrid crossfire? It would still be slower then a 6970 right? And thanks btw for answering my questions.


You probably don't have to buy "2133MHz" RAM to test the performance difference. As long as the kit you have exceeds minimum JEDEC standards by a reasonable margin ([email protected] or better), you can probably overclock it to 2000MT/s+ speeds. If it's a "standard" 1600-11 kit, or requires 1.65V for 1600-9, then probably not.

I am surprised to hear only 5FPS in any game at 720p with the A10-6800K. That has major settings errors or improper hardware utilization (driver?) written all over it. Turn off AA and make sure the driver isn't overriding it and running it anyway. Adjust settings to accommodate the GPU. The 8670D should be able to achieve 30FPS minimums in nearly ANY existing game at 720p with carefully adjusted settings. Driver quality overrides and post-processing all need to be turned down or off. In-game settings need to be reduced across the board. One post-processing setting left cranked up can bottleneck the whole thing.


----------



## DaveLT

Quote:


> Originally Posted by *mdocod*
> 
> Probably don't have to buy "2133mhz" ram to test out the performance difference. As long as the kit you have exceeds minimum JEDEC standards by a reasonable margin ([email protected] or better) then you can probably overclock it to 2000MT/s+ speeds. If it's a "standard" 1600-11 kit, or requires 1.65V for 1600-9 then probably not.


My Kingston HyperX Genesis 1600 is rated at 1.65V for 1600-9 but has gone to 2000! ... until my motherboard suddenly decided in the middle of benching that it's too high. (Not a surprise on a low-power six-core Gulftown chip.)


----------



## PcGamer1977

It plays really nicely with the 6970, but the 8670D wasn't cutting it for me. I will take your suggestions into consideration and re-run the tests. Let me ask you guys this: what's faster, Hybrid CrossFire with the A10 or the single 6970? What do you guys think?


----------



## matty50racer

Quote:


> Originally Posted by *PcGamer1977*
> 
> It was probbly wrong of me to try and play metro last light but even on the lowest settings and at 720p the game was just a mess getting anywhere from 5fps to 9fps max! This makes me wonder how long until amd is able to make a stronger igpi on the cpu itself, something like a 7970 can that even be done or not yet? If they can pull something like that off and in time iam pretty sure they will, it would make one hell of a gaming chip if you ask me lol. Thanks for the benchmarks beer.


You must not have been truly on the lowest settings at 720p. Metro LL is playable on my 7850K at 1080p low with a mild overclock. It gets over 30fps most of the time, with a few short dips to the upper 20s. I know it's faster than Richland, but not by that much.


----------



## PcGamer1977

I didn't mess around with overclocking the 8670D; it was kind of boring, to be honest with you. But now I'm just curious what can be done with it. I'm gonna have to rip open my Node 304 and take the 6970 out; I'm so lazy and have been tied up at work, so I don't have the time. Thanks guys. Btw, I forgot to mention I'm using a Cooler Master Seidon 120 to cool my 6800K. How high do you think I can overclock on this board, the Gigabyte F2A88XN-WIFI? It's probably not the best board for high overclocking, but I needed an SFF build to fit in the living room; a full-size tower was not gonna fly with the family lol.


----------



## mdocod

Cross-firing will just make the performance worse, don't bother.


----------



## matty50racer

Quote:


> Originally Posted by *PcGamer1977*
> 
> I didn't mess around with Overclocking the 8670 d it was kinda boring to be honest with you, But now iam just curious what can be done with it. Iam gonna have to rip open my Node 304 and take the 6970 out,Iam so lazy and been tied up at work don't have the time. Thanks guys. Btw I forgot to mention iam using a CoolerMaster seidon 120 to cool my 6800k how high do you think I can overclock on this board- GigaByte F2A88XN-WI/FI? its probably not the best board for high overclocking but I needed a (sff) to fit in the living room, Full-size tower was not gonna fly with the Family lol.


I'm using the same board and a similar AIO cooler. I have no problem with cooling or VRM temps up to about [email protected] on my Kaveri. I can get it stable at 4.5 with 1.525V, but the temps get too high during encoding, making it throttle. I don't think the board and cooler will be a problem unless you need that last 100-200MHz to be stable at full load for long periods of time.


----------



## PcGamer1977

Very well, then I'm fine with its stock speed; we use it mainly for Netflix and a few games every now and then. I kind of feel bad putting in that 6970 since it's not being used to its full potential, but better to have it and not need it than the other way around, right? I will get some pics up for you guys of my 6800K. I really enjoyed building it, and I'm really fond of the Fractal Node 304; very high quality if you ask me!


----------



## DaveLT

Quote:


> Originally Posted by *PcGamer1977*
> 
> Very well, then iam fine with its stock speed we use it mainly for netflix and a few games every now and then. I kinda feel bad putting in that 6970 its not being used to its full potential but better to have it and not need it then the other way around right? I will get some pics up for you guys of my 6800k.I really enjoyed building it and iam really found of the fractal node 304, very high quality if you ask me!


You can put a 6970 in it and use it to full potential actually. A 6970 is NOT that fast.


----------



## PcGamer1977

Not saying it's that fast, but it's for sure faster than the 8670D, Dave! Or is it not, according to you? lol


----------



## mdocod

The VLIW4 based HD6970 is very comparable to the performance of a GCN based R9 270. I'd call that pretty good. All about perspective I guess.


----------



## PcGamer1977

The R9 270 is an 8670D?


----------



## mdocod

No, 8670D is most similar to an HD6670


----------



## DaveLT

Quote:


> Originally Posted by *PcGamer1977*
> 
> Not saying its that fast but its for sure Faster then the 8670 Dave! Or is it not according to you? lol


It definitely is faster, and if you need more power, I definitely won't say no to someone using a 6970 instead.
Quote:


> Originally Posted by *mdocod*
> 
> The VLIW4 based HD6970 is very comparable to the performance of a GCN based R9 270. I'd call that pretty good. All about perspective I guess.


6970 = 7850.


----------



## PcGamer1977

Oh cool, thanks, I wasn't aware of that. You don't measure a card's performance by the number of processing shaders available? Similar to Nvidia's CUDA cores, isn't it always better to have more? I don't know, I read somewhere that's what determines a card's performance: the number of shaders available.


----------



## DaveLT

Quote:


> Originally Posted by *PcGamer1977*
> 
> Oh cool thanks I wasnt aware of that. You dont measure a cards performance by the number of prosessing shaders available? Or similar to Nvidias cuda cores, isnt it always better to have more? I dont know I read somewhere thats what determines a cards performance the number of shaders available.


No, not at all. Also, the 7970 > 680, just as the 7850 > 650 Ti Boost.
What you read is BS.


----------



## mdocod

You CAN compare GPUs of the same architecture by observing core configuration and clocks. Comparing different architectures this way doesn't work, as the "shaders" change in every architecture.

The 6970 =/= 7850. Performance may be similar/comparable, but they are certainly not the same card or anything.


----------



## DaveLT

Quote:


> Originally Posted by *mdocod*
> 
> You CAN compare GPUs of the same architecture by observing core configuration and clocks. Comparing different architectures this way doesn't work, as the "shaders" change in every architecture.
> 
> The 6970 =/= 7850. Performance may be similar/comparable, but they are certainly not the same card or anything.


In performance terms. I did not say they are the same GPU.


----------



## PcGamer1977

Quote:


> Originally Posted by *mdocod*
> 
> You CAN compare GPUs of the same architecture by observing core configuration and clocks. Comparing different architectures this way doesn't work, as the "shaders" change in every architecture.
> 
> The 6970 =/= 7850. Performance may be similar/comparable, but they are certainly not the same card or anything.


I was not aware of this, Thank you for pointing that out for me.


----------



## PcGamer1977

Howdy guys, just wanted to ask if it's OK to overclock the 8670D from 866MHz core to 1000MHz? And how much will faster RAM help me in gaming, percentage-wise? 10%? 20%?

Go for the 2133MHz RAM? I still can't believe I'm able to play Battlefield 2 on medium settings with this thing, really amazing! I think AMD is doing great with the APUs; I had to try one for myself. Is this board any good for overclocking: Gigabyte A88XN Wi-Fi? I got it because I needed a small form factor system to fit the Node 304.

I wonder what the future holds for AMD APUs 10 years from now. Can you imagine the graphics core muscle they will be able to squeeze in by then?!


----------



## mdocod

Hi PcGamer,

I have our 8670D running near 1.2GHz, so yeah, I think you should be fine at 1GHz.









Don't buy faster RAM; just adjust what you have. I can assist on speeds/timings/voltages if you want to try this. Our APU rig is running [email protected] on a kit that was sold as [email protected] Runs great. Lots of high-speed RAM kits are single rank and will actually perform worse than just overclocking an older low-speed dual-rank kit. We get slightly better performance from the APU with a dual-rank kit at 2133-10-12-12 than a single-rank kit at 2400-9-11-11.

What specific RAM kit and motherboard do you have?


----------



## PcGamer1977

Oh, that would be highly appreciated, thank you. The board is a Gigabyte A88XN Wi-Fi Mini-ITX, and the RAM is Corsair Vengeance 1600MHz 1.5V.

I'm kind of disappointed now that I didn't get the Kaveri 7850K instead. I had a chance to pick either, and I chose the older Richland for some reason. I appreciate your help, thank you.


----------



## jsc1973

Quote:


> Originally Posted by *PcGamer1977*
> 
> I wonder what the futute holds for Amd apus 10 years from now,can u imagine the Graphics core muscle they will be able tp squeeze in bye then?!


It's really only limited by the capabilities of the fabs that AMD has access to. They already can put some pretty strong graphics on an APU if they want. Kaveri's onboard GPU is about as strong as an HD 7750, and the custom APU in the consoles is more or less a 7790/R7-260.

I'm hoping that when Carrizo comes, it has the 7790 or better on board. That's a pretty good level of graphics power and a powerhouse for HSA work.
Quote:


> Originally Posted by *PcGamer1977*
> 
> Iam kinda dissapointed now i didnt get the kaveri 7850k instead,i had a chance to pick either and i choose the older trinity for some reason,i appricate your help thank you.


If you need better graphics performance, crossfiring a 6670 with your APU will bring you about up to the level of a 6850.


----------



## PcGamer1977

I can make that happen; a 6670 is pretty cheap. Here are some screenshots of the 6800K, guys.

Quote:


> Originally Posted by *jsc1973*
> 
> It's really only limited by the capabilities of the fabs that AMD has access to. They already can put some pretty strong graphics on an APU if they want. Kaveri's onboard GPU is about as strong as an HD 7750, and the custom APU in the consoles is more or less a 7790/R7-260.
> 
> I'm hoping that when Carrizo comes, it has the 7790 or better on board. That's a pretty good level of graphics power and a powerhouse for HSA work.
> If you need better graphics performance, crossfiring a 6670 with your APU will bring you about up to the level of a 6850.


What version should I get? They have DDR3 and GDDR5 versions; 2GB or 1GB?

I had a 6870 in the past. If I can get this machine to that level of GPU performance, that would be IMPRESSIVE, to say the least!


On a similar note, my FX-9370 scores over 12,000 on this test, and my old FX-6100 scored 6,000 points, so the 6800K is stronger! Not bad.


----------



## jsc1973

Quote:


> Originally Posted by *PcGamer1977*
> 
> What version schould I get? They have DDR3 and DDR5 as well, 2 gb or 1 gb version?


The GDDR5 version of the 6670 is about 20 percent faster, but it might be hobbled by crossfiring with the APU, which is using DDR3. Someone else might be able to answer that better than me. I don't think the extra RAM makes much if any difference on a 6670.


----------



## PcGamer1977

Then I'll just opt for the cheaper DDR3 version. But I noticed in most benchmark scores the 6800K is better than the FX-6100 in CPU-related or gaming performance. Do you guys agree or no?









Also, what drivers do you guys recommend for this 6800K? Is the Radeon 8670D Mantle-compatible or not? Thank you.

OK, here's the FX-9370 at 4.9GHz vs. the A10-6800K at 4.3GHz. I didn't think the difference was so big, WOW. Last time I ran this test I only scored 12,000; don't know what happened, but I'll take it lol.


----------



## jsc1973

Quote:


> Originally Posted by *PcGamer1977*
> 
> But I noticed in most benchmark scores the 6800k is better then the Fx 6100 in cpu related or gaming performance.


On workloads that can't use all six cores on the FX-6100, the 6800K is faster because it uses the superior Piledriver core, rather than the Bulldozer core, and it's clocked higher to boot. It will be much faster on any workloads using between one and four cores.


----------



## PcGamer1977

I got a problem, guys, and need your expert opinions PLEASE!? Last night, like the idiot that I am, I decided to change the resolution on my 6800K-based machine through AMD Catalyst Control Center, and now I cannot get any kind of picture in my living room at all! I hooked it up to my other monitor and it works fine, but I can't remember what setting I used, either Basic or HDTV. Now there's nothing but a black screen in the living room; I don't know what the heck to do! Anyone know how to fix this?

The living room uses a Panasonic Viera 50-inch flat screen TV. Until I changed the res settings everything was working just fine; now all I get is a black screen upon booting into Windows! I can't even make it to the Windows login screen. What the hell did I do, for god's sake? The TV is 720p. I tested it on my 1080p monitor and it boots just fine on that, but nothing in the living room!


----------



## mdocod

Don't bother with the HD6670. Pointless upgrade. The improved FPS comes with so many frame pacing problems that there is no net benefit to game-play at all. It's a novelty implementation with no useful benefits in 90% of games. Crossfire does not scale well on asymmetrical VLIW4 hardware. This has been thoroughly debunked as an upgrade path. Having HD6850 FPS is meaningless if the added frames are not positioned on the timeline correctly.

Your Corsair kit should have no problems running higher speeds. Just need to configure it manually...

Quote from Corsair: http://www.corsair.com/en-us/blog/2013/october/amds-a10-apu-and-memory-bandwidth
*"Recognizing that AMD's A-series APUs are mainly a budget play, you don't necessarily need to buy our high end Dominator kit to get the most out of them. Forums across the internet are alight with reports of people taking our mainstream Vengeance DDR3-1600, applying a little voltage, and getting it up to DDR3-1866 or better with relaxed latencies similar to what we tested with. If you're doing any casual gaming on an A-series APU, though, just playing with memory speed is a quick and dirty way to extract a meaningful boost in performance and potentially allow you to even bump up settings in-game."*

You should have a page in BIOS looks about like this:



Memory overclocking comes with the risk of file corruption. Back up important data before continuing.

There will be a separate page for each memory channel, so set it to "manual" and prepare the timings for higher speeds. I would recommend just planning on going straight to 2133 speeds. Loosen all the timings by about +30% and the kit should run at 2133 speeds with ~1.6V. Bump it up to ~1.7V and you can probably come back and tighten the timings back down some, or maybe even shoot for 2400MT/s speeds.

If you want a cheat sheet, try the following:

2133 "MHz"
DRAM: 1.65V

CAS: 11
RCD: 13
RP: 13

RAS: 33
RC: 44
RRD: 7
WTR: 8
WR: 16
WL: 9
RFC: 4
RTP: 8
FAW: 33
CMD: 2

The bottleneck for these APUs is both the ROPs and memory bandwidth in most workloads. As you overclock the iGPU the bottleneck is placed more squarely on the memory bandwidth, so performance scaling from memory overclocking gets better as the iGPU is overclocked.

As far as CPU performance in gaming is concerned, the A10-6800K is better than an FX-6100, for the same reason that an i5-4670K is better than an i7-970.

Consolidating more execution performance per core scales better for real-time workloads than having many cores.
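To put rough numbers on the memory-bandwidth side of that bottleneck: theoretical dual-channel DDR3 throughput is just transfer rate x 8 bytes per 64-bit channel x 2 channels (a simplified peak figure; real-world efficiency is lower):

```python
def ddr3_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    """Peak theoretical bandwidth in GB/s for 64-bit DDR3 channels."""
    # Each 64-bit channel moves 8 bytes per transfer.
    return mt_per_s * 8 * channels / 1000.0

print(ddr3_bandwidth_gbs(1600))  # 25.6 GB/s
print(ddr3_bandwidth_gbs(2133))  # 34.128 GB/s -- roughly a third more headroom for the iGPU
```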


----------



## PcGamer1977

I need to fix the resolution issue first, but I am going to try your settings on the memory! I'm only using 4GB on this machine, plus the heatsink came loose on the Corsair Vengeance lol. I did not know they use hot glue and stickers to keep the heatsink in place; wow, how ghetto lol. I think I will just start over with 8 or 16GB of decent memory that won't fall apart on me this time hehe. Don't buy Vengeance memory, guys.


----------



## mdocod

Well, if you're going to buy a new kit for an ITX board with only 2 slots, the best performance will come from a kit with a dual-rank chip configuration. 99% of 2x8GB kits are going to be dual rank by default, because they are built from 16 x 8-bit x 512MB chips per DIMM. 2x4GB kits are a bit more sketchy, since a 4GB DIMM can be built from half as many of the same chips used for an 8GB DIMM, which cuts the cost of materials. Many brands are doing this now, since those 8-bit 512MB DDR3 chips are cheaper per MB than 8-bit 256MB chips.

Ideally, if you're going to get a 2x4GB kit, you want DIMMs made from 16 x 8-bit x 256MB chips instead of 8 x 8-bit x 512MB chips. This can be difficult to distinguish, as most brands aren't sharing this information on consumer DIMMs anymore.
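The chip arithmetic above can be sketched like this (a simplified model of DIMMs built from identical x8 chips; real modules can vary):

```python
def dimm(chips: int, chip_width_bits: int, chip_density_mb: int):
    """Return (capacity in MB, rank count) for a DIMM of identical chips."""
    capacity_mb = chips * chip_density_mb
    # A rank is one 64-bit-wide group of chips on the module.
    ranks = chips * chip_width_bits // 64
    return capacity_mb, ranks

print(dimm(16, 8, 512))  # (8192, 2): the typical dual-rank 8GB DIMM
print(dimm(8, 8, 512))   # (4096, 1): the cheaper single-rank 4GB DIMM
print(dimm(16, 8, 256))  # (4096, 2): the preferred dual-rank 4GB DIMM
```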


----------



## DaveLT

Just buy a G.Skill ValueRam kit and clock it to hell







That way you can see if it's 64x256 or 32x512.


----------



## PcGamer1977

No problem, I will look into the memory. Can someone please help me with the resolution thing? All I have is a black screen now in my living room, and I don't know how to change it back.


----------



## mdocod

Do the obvious stuff: make sure the TV works with other inputs, make sure it's set to the proper input, disconnect and reconnect everything, shut down and unplug everything, then plug it all back in.

I wasted 3 hours of my life trying to get a monitor working yesterday. I kept thinking I had a software-side glitch; I was down to a prompt with the video drivers all purged from the system. Come to find out, the monitor was "glitched" and just needed to be unplugged for a minute.


----------



## DaveLT

Make sure the resolution and refresh rate matches your TV. TVs can be very finicky when it comes to supported resolutions and refresh rates.


----------



## PcGamer1977

I changed the resolution using AMD Catalyst Control Center from 1080p to 720p, and I lost the picture from that point forward. Everything else works OK; the TV and the computer work with all the other connected stuff. I just can't boot up on the TV in order to change the resolution back to what it was.

I will try unplugging everything like you guys said. I think this happened to me once before; I just cannot recall what I did to fix it!


----------



## PcGamer1977

I wonder if I should install my GTX 580, change the res again, take the card out, then try again. Nothing is happening; I can't even get into the BIOS with this POS!


----------



## PcGamer1977

Nothing worked. I tried installing a GTX 580 in the 6800K machine, and I somehow managed to completely mess up the resolution on my TV! So now, instead of using the APU with the TV, I have to use it with an Asus monitor instead. Kind of defeats the whole purpose of a media PC, but whatever, it's my own fault for messing around with the settings, I guess!


----------



## PcGamer1977

OK, I did some testing at 1080p, and I'm under the impression it's just better to buy a used Radeon 6850/6870 or an Nvidia GTX 650 Ti or regular 660. I overclocked the 8670D to 1000MHz and still could not play Rainbow Six Vegas; if I put everything at the lowest settings, I'm rewarded with 35 to 40 fps at 1920x1080. If we turn down the resolution a tad, then things start to get interesting! Maybe I need to overclock the memory like that guy was saying, but so far it's a no-go at 1080p for a game that's probably ten years old now (I think, not sure). 1600MHz RAM is no good for this APU.


----------



## DaveLT

Quote:


> Originally Posted by *PcGamer1977*
> 
> Ok did some testing at 1080p and iam under the impression its just better to buy a used Radeon 6850,6870 or Nvidia gtx 650 ti or regular 660?......I overclocked the 8670d to 1000mhz still could not play Rainbow six vegas if I put everything at lowest settings iam rewarded with 35 to 40 fps at 1920x1080p......if we turn down the resolution a tad then things strat to get interesting! Maybe I need to overclock the memory like that guy was saying,but so far its a no go at 1080p for a game thats probbly ten years old now (I think not sure ( .....1600mhz ram is no good for this Apu.


We all know that 1600 isn't enough for the APU. 1866 will get you quite a bit further.


----------



## mdocod

GPUs with only 8 ROPs (and that includes the A10) are really only well suited to 720p resolution if you want to play at >30FPS.


----------



## PcGamer1977

See, that's what I mean: if you want anything decent for 1080p, I would say the minimum would be a Radeon 6850 or 6870. On the Nvidia side I'm not so sure which card best competes with the 6870, maybe a 650 Ti? Not sure. I will see if I can find something used for a decent price.


----------



## DaveLT

Quote:


> Originally Posted by *PcGamer1977*
> 
> See, that's what I mean: if you want anything decent for 1080p, I would say the minimum would be a Radeon 6850 or 6870. On the Nvidia side I'm not so sure which card best competes with the 6870, maybe a 650 Ti? Not sure. I will see if I can find something used for a decent price.


GT640 I think. A 650Ti is 6950 territory as a 7850 is 6970 territory.


----------



## PcGamer1977

Don't have the 6970s anymore, sold them. I don't like the 7850 either; I'd much rather pick up a used 6850 or 6870.


----------



## mdocod

Go for a GCN/Kepler/Maxwell generation GPU. No reason to purposely buy a 4 year old discrete GPU at this time unless it is really really cheap. VLIW5, VLIW4, and Fermi based GPUs are all going to start showing their age here sooner rather than later. Used GCN hardware is cheap right now with mining on the pull-back.


----------



## PcGamer1977

Yeah, makes sense. I was reading about those Maxwell GPUs, very low power usage. I don't wanna spend more than like $120 to $150 max; anything more kinda defeats the whole purpose of the APU build, wouldn't you say?


----------



## Hotrod33809

Looks like I am a little late to the party. I need to get on these forums more. Staying right around 47C under load @ 4.7. May bump up the clock a little more and see where temps go. See what this old Corsair A70 cooler can handle, haha. http://valid.canardpc.com/wtdek5


----------



## Horsemama1956

Quote:


> Originally Posted by *PcGamer1977*
> 
> Don't have the 6970s anymore, sold them. I don't like the 7850 either; I'd much rather pick up a used 6850 or 6870.


I don't see how you can look at the 6850/6870 and then say you don't like the 7850. It's better all around and cheap now.


----------



## nanosour

New overclocker here with my first build, using an A10-6800K on an MSI A78M-E45 board. Through the MSI BIOS I've bumped the Radeon 8670D to 950 MHz from the stock 844 MHz. I used the AMD Catalyst software to OC the CPU to 4.8 GHz, but the Cinebench R15 CPU benchmark scores were considerably lower than when I run the test at the stock 4.1 GHz. This has me quite puzzled. The CPU score goes down to 250 from 308 if I overclock. Any thoughts?

My system is:

A10 6800K w/IGP
MSI A78M-E45
G.Skill PC3 1866 (8GB)
Win 7 Home


----------



## cssorkinman

Quote:


> Originally Posted by *nanosour*
> 
> New overclocker here with my first build, using an A10-6800K on an MSI A78M-E45 board. Through the MSI BIOS I've bumped the Radeon 8670D to 950 MHz from the stock 844 MHz. I used the AMD Catalyst software to OC the CPU to 4.8 GHz, but the Cinebench R15 CPU benchmark scores were considerably lower than when I run the test at the stock 4.1 GHz. This has me quite puzzled. The CPU score goes down to 250 from 308 if I overclock. Any thoughts?
> 
> My system is:
> 
> A10 6800K w/IGP
> MSI A78M-E45
> G.Skill PC3 1866 (8GB)
> Win 7 Home


I'd say something is getting hot and it's throttling, what are you running for a cooler?


----------



## nanosour

For
Quote:


> Originally Posted by *cssorkinman*
> 
> I'd say something is getting hot and it's throttling, what are you running for a cooler?


Forgot to mention that:

CoolerMaster Hyper 212


----------



## cssorkinman

Quote:


> Originally Posted by *nanosour*
> 
> For
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> I'd say something is getting hot and it's throttling, what are you running for a cooler?
> 
> 
> 
> Forgot to mention that:
> 
> CoolerMaster Hyper 212

ARRRRGH not that cooler! lol sorry, it's a bit of a running joke in another thread.

Yeah either the chip is over heating or the board is and it's lowering the clockspeed under load.


----------



## nanosour

Quote:


> Originally Posted by *cssorkinman*
> 
> ARRRRGH not that cooler! lol sorry, it's a bit of a running joke in another thread.
> 
> Yeah either the chip is over heating or the board is and it's lowering the clockspeed under load.


Thanks for the reply.

So I came across AMD Overdrive and, after installing it, ran "auto-clock". It kicked out 4.6 GHz as a safe OC. Again, it didn't seem to increase my benchmark score. Also, I ran Prime95 for 10 minutes with a stable temp of 52C as reported by HWMonitor. I believe that is the correct temperature; it matches the MSI Command Center temperature.

I've been messing with the BIOS and there is a way to OC the IGP. It's currently set to 844 MHz and I'm wondering if I can bump that up. Is this taken into consideration in the AMD Overdrive software? Maybe I should just leave it alone as it's internal in the 6800K.

So far this is pretty cool and I don't think I've hurt anything


----------



## Hotrod33809

Idk if it was just me, but AMD Overdrive was giving me fits. In order to run 4.6 or 4.7 you may have to give it a little voltage bump. I have the iGPU clocked to 1013 MHz and it runs fine; just watch the temps and play around with it. Also, I personally disable Turbo Core. Make sure you have it set to run at 4.6 all the time, or it could be throttling itself.


----------



## azanimefan

Quote:


> Originally Posted by *Hotrod33809*
> 
> Idk if it was just me, but AMD Overdrive was giving me fits. In order to run 4.6 or 4.7 you may have to give it a little voltage bump. I have the iGPU clocked to 1013 MHz and it runs fine; just watch the temps and play around with it. Also, I personally disable Turbo Core. Make sure you have it set to run at 4.6 all the time, or it could be throttling itself.


AMD Overdrive tends to downclock your RAM/NB/HT to stabilize higher clock speeds... and tends to use too much voltage, which makes it run hot. You probably had your RAM so underclocked that any performance gains from the overclock on the CPU were wiped out.

For example, just for giggles I ran AMD Overdrive on my 8320... it stabilized my CPU at 4.5GHz with 1.45V on the vcore, my RAM clocked at 1333 with 11-11-13 timings, and 2000MHz on the NB... just so you know, my CPU can run 100% stable at 1.45V on the vcore at 4.8GHz, with 1600 RAM at 9-9-9 timings and 2400MHz on the NB.

I think that right there is all the evidence you need to not use AMD Overdrive.

(Sidenote: my CPU runs 100% stable with STOCK vcore at 4.5GHz... there was no reason for Overdrive to even touch the voltages, or the RAM timings, or anything else. Out of the box, you can just set this chip at 4.5GHz without touching another setting in the BIOS.)


----------



## nanosour

I've got it running stable at 4.9 GHz via the MSI Command Center. I've also OC'd the iGPU to 950 MHz. Everything looks good during Prime95 runs, but for some reason I can't get Thief to launch unless I reset to the stock CPU/iGPU settings; it will not run under any overclocked condition. This is my first time playing a game on a PC. Anyone have this type of problem? I would like to take advantage of overclocking for increased fps, but there's no point in OCing if the game won't run.


----------



## Hotrod33809

Quote:


> Originally Posted by *azanimefan*
> 
> AMD Overdrive tends to downclock your RAM/NB/HT to stabilize higher clock speeds... and tends to use too much voltage, which makes it run hot. You probably had your RAM so underclocked that any performance gains from the overclock on the CPU were wiped out.
> 
> For example, just for giggles I ran AMD Overdrive on my 8320... it stabilized my CPU at 4.5GHz with 1.45V on the vcore, my RAM clocked at 1333 with 11-11-13 timings, and 2000MHz on the NB... just so you know, my CPU can run 100% stable at 1.45V on the vcore at 4.8GHz, with 1600 RAM at 9-9-9 timings and 2400MHz on the NB.
> 
> I think that right there is all the evidence you need to not use AMD Overdrive.
> 
> (Sidenote: my CPU runs 100% stable with STOCK vcore at 4.5GHz... there was no reason for Overdrive to even touch the voltages, or the RAM timings, or anything else. Out of the box, you can just set this chip at 4.5GHz without touching another setting in the BIOS.)


Yeah, I don't overclock with AMD Overdrive. Gave it a quick try and found it much easier to overclock via the BIOS, with better results. But I agree, there are much better ways to do things than Overdrive.


----------



## Hotrod33809

So I was playing around with an overclock and ran a Prime95 torture test, and noticed that while running it each of the cores would occasionally drop down to 3.8 GHz but then go back up to whatever the previous clock was. Whether it was set to 4.4 or 4.7 GHz, it still throttled down to 3.8 GHz while passing the torture tests fine. Any ideas on what this could be? It also appears to happen on 2 cores at a time, alternating.
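For anyone chasing the same thing: if you log per-core clock samples from your monitoring tool, spotting the dips is trivial to script. A minimal sketch (the sample numbers below are made up to mirror the symptom described, not real logs):

```python
def find_throttle_dips(samples_mhz, target_mhz, tolerance=50):
    """Return indices of samples where the core clock fell below the set clock."""
    return [i for i, f in enumerate(samples_mhz) if f < target_mhz - tolerance]

# A core set to 4700 MHz that occasionally drops to 3800 MHz under load:
samples = [4700, 4700, 3800, 4700, 3800, 4700]
print(find_throttle_dips(samples, 4700))  # -> [2, 4]
```

If the dips line up with temperature or power spikes in the same log, that points at thermal/APM throttling rather than Turbo behavior.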


----------



## Lutfij

Well, it feels good to be back here with some news!









My motherboard came back and it's working nicely, as initially seen. I've managed to follow up on this link: http://forum.hwbot.org/showthread.php?t=86959&highlight=devastator+powertune and downloaded AFI and DPT. So far I've downloaded just those two, since I didn't think the rest were necessary for my unit/system. Will reconsider if the rest are indeed necessary.

These are the results;




What can I do as a follow-up with the above software? I am a little new to all this and would be humbled if you could show me the ropes. A guide/tutorial perhaps? I'm running the stock APU cooler and will soon move to the Cryorig C1 ITX cooler and a suitable ITX case.

My Mushkin 996991 kit is under its XMP profile but is running at 1866 MHz. This system will primarily be for SketchUp, CAD and some light rendering duties, as well as watching some shows in 1080p (nothing more).

Lutfij


----------



## mdocod

Quote:


> Originally Posted by *Hotrod33809*
> 
> So I was playing around with an overclock and ran a Prime95 torture test, and noticed that while running it each of the cores would occasionally drop down to 3.8 GHz but then go back up to whatever the previous clock was. Whether it was set to 4.4 or 4.7 GHz, it still throttled down to 3.8 GHz while passing the torture tests fine. Any ideas on what this could be? It also appears to happen on 2 cores at a time, alternating.


Some form of throttling no doubt. What motherboard? What HSF?


----------



## TPCbench

Quote:


> Originally Posted by *Hotrod33809*
> 
> So I was playing around with an overclock and ran a Prime95 torture test, and noticed that while running it each of the cores would occasionally drop down to 3.8 GHz but then go back up to whatever the previous clock was. Whether it was set to 4.4 or 4.7 GHz, it still throttled down to 3.8 GHz while passing the torture tests fine. Any ideas on what this could be? It also appears to happen on 2 cores at a time, alternating.


I experienced that when I first overclocked my A10-5800K. The core frequency downclocks when running a stress test (Prime95, Intel Burn Test, etc.) or video encoding (Handbrake).

Disabling Turbo in the BIOS should solve it. Don't ask me why, coz I also don't know, but Turbo just interferes with overclocking.

If the problem persists, the chip might be throttling due to too much heat, which can be caused by an improperly installed cooler or insufficient cooling.


----------



## Pip Boy

Disabled turbo in the BIOS and got a 20-30 fps performance boost on Linux.









Bit peeved; I want the extra performance for upcoming games while retaining efficiency, but I only discovered this while getting ready for my new 750 Ti, due to average performance from the AMD drivers on Linux.

Got a 3 second reduction on sysbench running a prime number CPU test!


----------



## Hotrod33809

Quote:


> Originally Posted by *TPCbench*
> 
> I experienced that when I first overclocked my A10-5800K. The core frequency downclocks when running a stress test (Prime95, Intel Burn Test, etc.) or video encoding (Handbrake).
> 
> Disabling Turbo in the BIOS should solve it. Don't ask me why, coz I also don't know, but Turbo just interferes with overclocking.
> 
> If the problem persists, the chip might be throttling due to too much heat, which can be caused by an improperly installed cooler or insufficient cooling.


This was it. I had disabled it before for overclocking; I guess I reset those settings. Thanks for the help :thumb: I figured it wasn't a heating issue, as I was only hitting 51C under full load.


----------



## LordOfTots

Would any of you guys consider a heavily overclocked 6800K a decent sidegrade from an FX-6300 at stock clocks? I have a friend looking to downsize his FX-6300 rig, but the mATX options for AM3+ are kinda crap.


----------



## mdocod

Hi LordOfTots,

Yes, a decently overclocked A8/A10/750k/760K is a reasonable side-grade from the FX-6300 at stock... http://www.overclock.net/t/1493307/relative-access-to-execution-throughput-comparison-chart/0_100#post_22355151

Note the overclocked Richland vs. the stock FX-6300 in that chart. The stock-clocked FX-6300 doesn't overtake the overclocked Richland until the workload is fully saturating all 6 cores.

While you're there, look at the relative performance of the i3-4150 and the overclocked G3258. If you're planning to "downsize" the rig such that it uses the iGPU (very small build), then the A10 is the better choice, but if the build will still have a discrete GPU, then a 760K or one of those Intel options is going to offer better value. (The 760K is the same as the A10-6800K, just without the iGPU.)


----------



## DaveLT

Unlike an AMD A10, which is actually fairly useful, a Pentium G3258 (in fact any Pentium) isn't much use outside of overclocking. It has gimped cache, no HT, and no AVX either.


----------



## jsc1973

Quote:


> Originally Posted by *LordOfTots*
> 
> Would any of you guys consider a heavily overclocked 6800k a decent sidegrade from a FX 6300 at stock clocks? I have a friend looking to downsize his FX 6300 rig, but the MATX options for AM3+ are kinda crap.


Would be just fine unless he does a lot of workloads that can max out the two extra cores. On anything else, the Richland platform would be superior, due to lower power consumption and a more up-to-date platform than AM3+, not to mention that if you get an FM2+ board, you have Kaveri and Carrizo as upgrade options later on.


----------



## LordOfTots

Quote:


> Originally Posted by *mdocod*
> 
> Hi LordOfTots,
> 
> Yes, a decently overclocked A8/A10/750k/760K is a reasonable side-grade from the FX-6300 at stock... http://www.overclock.net/t/1493307/relative-access-to-execution-throughput-comparison-chart/0_100#post_22355151
> 
> Note the overclocked Richland vs. the stock FX-6300 in that chart. The stock-clocked FX-6300 doesn't overtake the overclocked Richland until the workload is fully saturating all 6 cores.
> 
> While you're there, look at the relative performance of the i3-4150 and the overclocked G3258. If you're planning to "downsize" the rig such that it uses the iGPU (very small build), then the A10 is the better choice, but if the build will still have a discrete GPU, then a 760K or one of those Intel options is going to offer better value. (The 760K is the same as the A10-6800K, just without the iGPU.)


Good info, thanks! Looks like I'm definitely going with the 6800K then, unless the 760K clocks just as easily.

Anyone here have experience with the 760K? If it's not binned lower than the 6800K, it would be a steal for this build.


----------



## jsc1973

The 6800K's are speed-binned higher, but by all accounts, even a mediocre 760K is good for 4.5 GHz, and often a lot more, with decent aftermarket cooling. The Athlons are probably a little more subject to the silicon lottery since those are Richland cores with defects in the GPU section, and there are no doubt some that come off a weaker part of the silicon wafer and won't clock as well as an A10. But most of them perform very well. The dual-core Richland Athlon is actually clocked 100 MHz higher at stock than the comparable A6-6400K.


----------



## LordOfTots

Quote:


> Originally Posted by *jsc1973*
> 
> The 6800K's are speed-binned higher, but by all accounts, even a mediocre 760K is good for 4.5 GHz, and often a lot more, with decent aftermarket cooling. The Athlons are probably a little more subject to the silicon lottery since those are Richland cores with defects in the GPU section, and there are no doubt some that come off a weaker part of the silicon wafer and won't clock as well as an A10. But most of them perform very well. The dual-core Richland Athlon is actually clocked 100 MHz higher at stock than the comparable A6-6400K.


Probably going with the 6800K then; that binning is worth it to me, especially since the cooling and motherboard for this build will be more than enough to max out the chip.


----------



## Lutfij

Could you wonderful folks please help me out and figure some of the stuff out on this APU system I have?

Currently these are the settings that I'm working with:

Asrock A85X-Itx bios v1.6
Disabled Turbo Core Technology
Disabled APM
CPU Frequency Multiplier X41 - 4100MHz
CPU Voltage - 1.125v
CPU voltage offset - 0%
CPU NB frequency - x11 - 2200MHz
APU LLC - 1/2 Vcore
GFX Engine Clock - 844MHz
DRAM Timings configuration - Auto
DRAM Frequency - 2133MHz

DRAM voltage - 1.65v
APU PCIE voltage - 1.208v
SB voltage - 1.10v

Does anything look off? I'm new to AMD and APUs, so I thought you guys would offer a better word of advice! I play DiRT 3 a lot, and the reason for downclocking/undervolting is to hold off the heat until my Cryorig C1 arrives.


----------



## Bizoom

I have the exact same problem and you are the only other person talking about it. Any luck on finding a solution?


----------



## Lutfij

Sorry I couldn't get back to you earlier. Are you talking about the overheating?


----------



## Freely

So, first post here. I want to throw an XSPC loop on my A10-6800K so I can get higher clocks and, well... because I want to play around with something. Are there any downsides to this besides the fact it might be a little overkill? Or maybe it's not... I have no idea, which is why I'm asking. Hoping to learn something. Thanks guys.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *Opcode*
> 
> I'm not a fan of ECS boards. I never owned one, and I never will. They're not exactly a popular company when it comes to hardware. I personally would of spent the extra $5-10 and got the ASRock Extreme6. The problem with getting a bios shot of that board, is not very many people own it.


Yeah, in general they're not well known or loved even though they've been around since 1987. They were basically makers of low-cost motherboards that began trying to make products in broader ranges. The only ECS products I own are a USB expansion card and an AM3 motherboard. Both work well for the cost, but the largest issues are drivers and BIOS updates IMO.


----------



## Tisser12

Ooh, really old thread, but maybe someone could help me through the baby steps of applying a good stable OC to my 6800K. I just recently bought a Sapphire RX 470 Nitro and wanna get my CPU up to snuff as well.

Gigabyte F2A85X-UP4 mobo and a Hyper 212 EVO cooler, running a Silverstone 500W PSU. (Probably going to buy a Corsair CX550M tonight from Newegg just to get a newer PSU in there to help.)

I THOUGHT I had a stable OC of 4.6 GHz at 1.3xx volts the other day, but towards the end of my torture test my PC shut down. I was using AMD's Overdrive program and everything went well up till the very end of my test, then my PC shut down. I was in the bathroom when it happened, so I was unable to see the status of it before it crashed.

I have a weak understanding of voltages and BIOS settings at this point, and all the talk about NB and RAM and timings I don't understand yet. Picking up a lot of it quickly, but still struggling with the finer details.

Appreciate you guys for any help!


----------



## DannyDK

A shutdown is usually because of either the CPU or the motherboard being too hot.


----------



## Himo5

Here's my voltage table for Richland running on the equivalent Asus FM2 board. As you can see, 1.3V is just a little low for 4600 MHz, and Prime95 pushed it to 45C on a cool winter's day, admittedly with a giant 140 x 32mm 2500rpm fan hanging off the Mugen heatsink. The LLC settings are the default Extreme set in AI Suite II for that board, but if you go to the Advanced Voltage Settings page in the UP4 BIOS there are two items at the bottom of the page, VCore and NB VID Loadline Calibration, that you can raise to, say, 120% to get roughly the same effect.


----------



## Tisser12

Quote:


> Originally Posted by *Himo5*
> 
> Here's my voltage table for Richland running on the equivalent Asus FM2 board. As you can see, 1.3V is just a little low for 4600 MHz, and Prime95 pushed it to 45C on a cool winter's day, admittedly with a giant 140 x 32mm 2500rpm fan hanging off the Mugen heatsink. The LLC settings are the default Extreme set in AI Suite II for that board, but if you go to the Advanced Voltage Settings page in the UP4 BIOS there are two items at the bottom of the page, VCore and NB VID Loadline Calibration, that you can raise to, say, 120% to get roughly the same effect.


Here's a screenshot of my one attempt at 4.5 GHz. I can see now I had the voltage too low, but it was fairly stable during this stability test; this screenshot was from after running the test for 15 minutes. I don't think I ended up saving any from my 4.6 attempt. Another thing: I should have probably used Overdrive instead of EasyTune, haha, but it's all I had at the time.



I still don't understand a lot about the voltage stuff though. Not sure what NB means or is. I'd really like to not fry my system through ignorance, haha. What gets me is the difference between the BIOS and Overdrive; I don't want the two to combine the "voltage upping" and just fry everything, so I get nervous about that stuff. But once I find a stable overclock setup I would rather have it set in the BIOS, so that it's ready to go as soon as my PC turns on (which is always, it's never off usually).


----------



## Himo5

To put it in perspective the default CPU voltage for the A10-6800K when it is running at its base frequency of 4100MHz is 1.3125V. There is a lot of spare capacity in there and you can stress test it up to around 4330MHz at the base voltage before it breaks down.

If you look at my table you'll see that to get 4600MHz I set CPU voltage to 1.35625V (7 increments above the base voltage) and loosen the VRM controls (CPU Loadline Calibration etc) to allow Prime95 to raise CPU voltage to a maximum of 1.432V (20 increments above base voltage) during the 30 minute run.
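For anyone following the increment arithmetic: assuming the 6.25 mV VID step that Himo5's numbers imply (1.3125 V to 1.35625 V in 7 increments), the offsets work out like this. A sketch of the arithmetic only, not a board-specific tool:

```python
BASE_VID = 1.3125   # A10-6800K base-frequency CPU voltage, per the post above
STEP = 0.00625      # assumed VID step of 6.25 mV

def vid(increments):
    """CPU voltage after raising the VID by a number of increments."""
    return round(BASE_VID + increments * STEP, 5)

print(vid(7))   # -> 1.35625, the 4600MHz setting above
print(vid(20))  # -> 1.4375, roughly the ~1.432V load peak observed
```

Twenty nominal increments land at 1.4375 V; the 1.432 V actually measured under load is a touch lower, which is consistent with a bit of vdroop even with loose LLC.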

If you look at the default PState settings for the A10-6800K in this AmdMsrTweaker image you'll see three Turbo PStates for 4100MHz at 1.35V, 4200MHz at 1.4V and 4300MHz at 1.4375V (you can also see the Base State at P3), so you can see that 4600MHz at 1.432V is well within the range of what Richland can do.



In practice this is the most overclockable of all the APU processors, and many people have taken it above 5000MHz on air cooling at voltages between 1.5-1.6V. I wouldn't advise running it permanently at those levels, and watch the CPU temperature carefully while you are doing so, but anything you can run below 60C isn't going to do it any harm.

NB stands for North Bridge and denotes the part of the APU that deals with the functions that used to be dealt with by the motherboard's North Bridge chip before APUs were developed.

Unless you are overclocking the iGpu (the HD8670D Radeon part of the APU) from its default 844MHz you can leave NB Voltage at its default value.

NB Frequency should be at least 1.25:1 in relation to Memory Frequency where the given DDR3 (Double Data Rate) Ram speed is divided by two, so 2133MHz Ram requires 1.25 * 2133 / 2 = 1333MHz minimum NB Frequency and you will usually find the BIOS setting a higher default value than that. You can raise it above the default to increase the data flow but once you get above 1800MHz you may need to raise NB Voltage by 2 or 3 increments to support it.
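That 1.25:1 rule as a quick calculation (just the arithmetic from the paragraph above, nothing more):

```python
def min_nb_mhz(ddr3_rating_mts, ratio=1.25):
    """Minimum NB frequency for a given DDR3 rating, per the 1.25:1 rule."""
    return ratio * ddr3_rating_mts / 2

print(min_nb_mhz(2133))  # -> 1333.125 (the ~1333 MHz minimum above)
print(min_nb_mhz(1866))  # -> 1166.25
```

As noted, boards usually default the NB well above this floor anyway; the rule just tells you how low you can safely go.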

As far as overclocking in the BIOS versus doing so in Windows is concerned, the BIOS settings are read by Windows during boot and those are the values that your overclocking software will start from. Certainly you can use Overdrive and EasyTune to experiment and find an overclock, but when you set it up in the BIOS you need to check what the operating values end up as, in case there is a transaction during boot.


----------



## Tisser12

Quote:


> Originally Posted by *Himo5*
> 
> To put it in perspective the default CPU voltage for the A10-6800K when it is running at its base frequency of 4100MHz is 1.3125V. There is a lot of spare capacity in there and you can stress test it up to around 4330MHz at the base voltage before it breaks down.
> 
> If you look at my table you'll see that to get 4600MHz I set CPU voltage to 1.35625V (7 increments above the base voltage) and loosen the VRM controls (CPU Loadline Calibration etc) to allow Prime95 to raise CPU voltage to a maximum of 1.432V (20 increments above base voltage) during the 30 minute run.
> 
> If you look at the default PState settings for the A10-6800K in this AmdMsrTweaker image you'll see three Turbo PStates for 4100MHz at 1.35V, 4200MHz at 1.4V and 4300MHz at 1.4375V (you can also see the Base State at P3), so you can see that 4600MHz at 1.432V is well within the range of what Richland can do.
> 
> 
> 
> In practice this is the most overclockable of all the APU processors and many people have taken it above 5000MHz with air cooling at voltages ranging between 1.5-1.6V. I wouldn't advise running it permanently at those levels and to watch the CPU temperature carefully while you are doing so, but anything you can run below 60C isn't going to do it any harm.
> 
> NB stands for North Bridge and denotes the part of the APU that deals with the functions that used to be dealt with by the motherboard's North Bridge chip before APUs were developed.
> 
> Unless you are overclocking the iGpu (the HD8670D Radeon part of the APU) from its default 844MHz you can leave NB Voltage at its default value.
> 
> NB Frequency should be at least 1.25:1 in relation to Memory Frequency where the given DDR3 (Double Data Rate) Ram speed is divided by two, so 2133MHz Ram requires 1.25 * 2133 / 2 = 1333MHz minimum NB Frequency and you will usually find the BIOS setting a higher default value than that. You can raise it above the default to increase the data flow but once you get above 1800MHz you may need to raise NB Voltage by 2 or 3 increments to support it.
> 
> As far as overclocking in the BIOS versus doing so in Windows is concerned, the BIOS settings are read by Windows during boot and those are the values that your overclocking software will start from. Certainly you can use Overdrive and EasyTune to experiment and find an overclock, but when you set it up in the BIOS you need to check what the operating values end up as, in case there is a transaction during boot.


Awesome, thanks again. I'm starting to understand some of this now. So basically just use Overdrive to find a stable OC, then set that in the BIOS and leave it at that. I don't really PLAN on trying for crazy clocks, just trying to get my CPU to perform the best it can *safely*.

I'll give 4.6-4.7 a shot with some upped voltages.

My current question is: once that's set in the BIOS, you said there can be weirdness after boot. Do I just need to watch my voltages to make sure Windows doesn't make them screwy when I boot up, or what? Or is there a way to make sure that BIOS settings stay the way I set them even after boot?


----------



## Himo5

The BIOS of some motherboards will adjust voltages and frequencies during POST when they find certain values changed by the user.

So if an ASUS FM2+ board finds CPU Frequency above the base setting it may increase CPU Voltage according to the value it finds - but only if CPU Voltage was left at its default value, if you just add a single increment to CPU Voltage it won't make the adjustment.

If you raise the iGpu frequency some motherboards will increase NB Voltage and other boards will adjust the value of NB Frequency according to what frequency setting they find for the memory.

Just about all of these transactional changes are undocumented, so take before and after screenshots (Shift+PrtScrn) of all these values, save them as files through your image editor (paste with Ctrl+V), and store them in a folder somewhere so you can check whether they came through as the expected values before testing your OC.


----------



## Tisser12

Quote:


> Originally Posted by *Himo5*
> 
> The BIOS of some motherboards will adjust voltages and frequencies during POST when they find certain values changed by the user.
> 
> So if an ASUS FM2+ board finds CPU Frequency above the base setting it may increase CPU Voltage according to the value it finds - but only if CPU Voltage was left at its default value, if you just add a single increment to CPU Voltage it won't make the adjustment.
> 
> If you raise the iGpu frequency some motherboards will increase NB Voltage and other boards will adjust the value of NB Frequency according to what frequency setting they find for the memory.
> 
> Just about all of these transactional changes are undocumented, so take before and after screenshots (Shift+PrtScrn) of all these values, save them as files through your image editor (paste with Ctrl+V), and store them in a folder somewhere so you can check whether they came through as the expected values before testing your OC.


Awesome. I'll give it a shot.


----------



## Tisser12

Ran Overdrive to do a pass at 4.5 GHz as a starting point, since that pretty much is stock (counting the turbo speed @ 4.4), and ran the built-in stress test. It showed around 35-36C throughout the test. I watched the core voltage in both Overdrive and CPU-Z and it didn't seem to fluctuate at all while under load; all voltages (to my eyes) were okay. The only concern I had: when checking HWiNFO64 I saw 65C as a max temp in a CPU column, and that scared me. Can you explain what that was/why that was?

Start:




The next two pics have hwinfo opened up where i was seeing that high temp at.


----------



## Tisser12

Sorry, I realized those pictures are probably illegible.


----------



## Himo5

That's great! If you save the images as .PNG it will stop their resolution being squeezed. If your image editor lets you, start with a large new image and paste your screenshots into it, so you can use select-and-drag to build a composite of your target data; then select the result and save that as your file.

Since you're trying to see if values displayed in the BIOS match the OS operating values, it's best to use the motherboard's native OC tool to find out what they are.

AMD Overdrive isn't really geared to FM2/APUs, and even HWiNFO64 needs a fair bit of interpretation to marry up its display with what you saw in the BIOS.

If you google EasyTune6 and switch to Images you will see the screenshots of it that others have used.

You're looking for how the values you set for your OC in the BIOS correspond to their equivalents in EasyTune6, compared to what they were in your OS-built OC.

Build the EasyTune6 screenshot for the OC you've tested in the OS, then set the BIOS and build the EasyTune6 screenshot for the result.


----------



## Tisser12

Quote:


> Originally Posted by *Himo5*
> 
> That's great! Saving the images as .PNG will stop their resolution being squeezed. If your image editor lets you, start with a large new image and paste your screenshots into it, using Select and Drag to build a composite of your target data; then select the result and save that as your file.
> 
> Since you're trying to see whether values displayed in the BIOS match OS operating values, it's best to use the motherboard's native OC tool to find out what they are.
> 
> AMD Overdrive isn't really geared to FM2/APU, and even HWiNFO64 needs a fair bit of interpretation to marry up its display with what you saw in the BIOS.
> 
> If you google EasyTune6 and switch to Images, you will see the screenshots of it that others have used.
> 
> You're looking at how the values you set for your OC in the BIOS correspond to their equivalents in EasyTune6, compared to what they were in your OS-built OC.
> 
> Build the EasyTune6 screenshot for the OC you've tested in the OS, then set the BIOS and build the EasyTune6 screenshot for the result.


I tried using ET6 before and it didn't go so well because I didn't know all the stuff I was tuning lol. I'll do that today. So far I haven't set any OC or tweaked anything in the BIOS yet. I have a dual UEFI BIOS though, so I'm hoping that means there's a way to keep defaults on one and my OC attempts on the other, so it's as easy as switching BIOS profiles if something goes wonky.

I'll give an OC pass with my settings from last night in ET6 and then I'll open up the BIOS and see how badly I can screw it up there. lol


----------



## Tisser12

Went and did runs at 4.6, 4.7, and 4.8 using EasyTune6 as my overclocking tool and CPU-Z, HWiNFO64, Rainmeter, and Overdrive for monitoring, then ran the stability test built into Overdrive for 25 minutes each. Results were decent, with temps never going above 48.5C even at full load at 4.8GHz.

After I was done testing and rebooted, my BIOS said it was unable to boot; I opened it up and everything looked normal, so I exited and then it booted fine. Dunno. My only concern now is that in my HWiNFO screen there are temps around 60C near the bottom of the window in the CPU column. What reading is that from? And is that even safe?




4.6











4.7














4.8




----------



## Tisser12

Okay, so I set my 4.8GHz OC in the BIOS and everything booted just fine. I have the voltage set at 1.450000 and the Vcore LLC set to medium. I ran a quick stress test from CPU-Z and was seeing an average voltage of 1.4400 (different from the ET6 OC run at 1.404), and much higher temps: 55-56C instead of the 46-50C I was seeing before. Nothing looks like it's changed that I've seen; I'm going to go over everything again and see what might have gotten wonky.


----------



## Tisser12

Turned it down to the 4.6GHz settings and lowered the LLC to low, ran a quick stress, and was seeing around 35C, which is a LOT more bearable. I'm going to attempt to play Fallout and see if I can't make my computer crash trying to power everything haha. Wish me luck. I'll be watching the temps of everything very closely.


----------



## Himo5

Quote:


> Originally Posted by *Tisser12*
> 
> Ooh, really old thread, but maybe someone could help me through the baby steps of applying a good stable OC to my 6800K. I just recently bought a Sapphire RX 470 Nitro and wanna get my CPU up to snuff as well.


Fixed.


----------



## Tisser12

Quote:


> Originally Posted by *Himo5*
> 
> Fixed.


Fixed?

Played roughly 6 hours of Fallout 4 with maxed settings and a bunch of mods through Far Harbor (with all that wonderful fog), averaging 55-60FPS with dips to around 35-45 in denser areas. GPU temps were around 70-75C (stock OC) and my CPU was mostly around 35-38C, peaking at 45C at times, but I never saw it go above 45C.

OC is applied in BIOS. I'll work on getting screenshots of all the settings just to double check I actually sort of know what I'm doing.

Thanks again guys for all your help. I really mean it, I'd be nowhere without this forum!!!


----------



## Tisser12

And again, looking over the min/max values since last night, including my long gaming session, here are the values on my GPU. Some of them seem a bit... off. Such as the maximum GPU core power entry... 35 MILLION watts?

And I don't like seeing that 95C near my VRM temps either. I've gotta figure out what's going on, either with HWiNFO or my PC, because I don't care where on my GPU that is, I don't EVER want it at 95C. I haven't OCed or tweaked anything on my new 470 yet.


----------



## Himo5

HWiNFO64 may not know how to read your graphics card, you could try MSI AfterBurner or Sapphire Trixx.


----------



## Tisser12

Quote:


> Originally Posted by *Himo5*
> 
> HWiNFO64 may not know how to read your graphics card, you could try MSI AfterBurner or Sapphire Trixx.


I have definitely thought the same thing, especially considering it's still such a new card, but you can understand my worry. I don't want ANY VRM to be above 70-80C. But seeing as it's completely stock, I don't see how it would let itself get anywhere near that hot. Hopefully.


----------



## Himo5

If it's not on your card's driver disc, it's well worth downloading Sapphire TRIXX 3.0, which is now fully set up for the Radeon RX 400 series.


----------



## Tisser12

I have TRIXX; it's currently showing N/A where the VRM temp should be. Ugh.


----------



## Tisser12

Anyone?? Why are my temps so much higher after setting the OC in the BIOS than when setting it in Overdrive or ET6?

Why does ET6 say my CPU is at 51C when everything else says it's at 35C?



And what is that really high reading on my CPU right below my motherboard?



I'm not going to do any gaming or anything until I get those questions figured out, because I'm not gonna toast my computer. I ticked the voltage down one notch in my BIOS to see if it would stay stable and lower the temps, and my Explorer crashed while running a stress test. So I ticked it back up and changed the LLC to auto (it was set on low during my OC settings), and after about 1-2 hours of Fallout 4 my PC bluescreened.

As of right now I have everything back to where it was when it ran well, 4.6GHz @ 1.356250 with the LLC set on LOW (I have Auto, Extreme, Medium, Low, and Normal settings), and I left my PC running all night; so far so good. But I'd really like some more info on the above questions if at all possible.


----------



## Himo5

The Thermal Margin given in Overdrive is the clue to what is happening: it indicates how much more heating the processor can take. In the first image, adding 51C and 33.1C gives a breakdown temperature of 84C; in the second, HWiNFO64's 63C current temperature added to Overdrive's 21.9C thermal margin gives 84.9C. So the other gauge showing 35C in the first image is obviously wrong, and the 0.9C apparent discrepancy is probably down to different bits of software trying to read the same sensor at the same time. These CPU temperatures under 100% load look about right for 4600MHz and 4800MHz, but you could check them by setting 4700MHz and seeing if it loads to around 56C.
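Himo5's arithmetic can be sketched as a quick sanity check (a hypothetical illustration only, not any tool's actual API; the readings are the ones from the screenshots discussed above, and AMD's "thermal margin" is taken to mean degrees of headroom remaining):

```python
def breakdown_temp(current_c: float, thermal_margin_c: float) -> float:
    """Implied maximum temperature: current reading plus remaining headroom."""
    return current_c + thermal_margin_c

# Readings from the two screenshots discussed above.
via_et6 = breakdown_temp(51.0, 33.1)     # ET6 temp + Overdrive margin, ~84.1C
via_hwinfo = breakdown_temp(63.0, 21.9)  # HWiNFO64 temp + Overdrive margin, ~84.9C

# If both tools read the same sensor, the two estimates should agree;
# a sub-1C gap is plausibly just non-simultaneous sampling.
print(via_et6, via_hwinfo)
```

Two independent temperature/margin pairs implying the same ceiling is what points to the 35C gauge, not the 51C one, being the bad reading.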


----------



## Tisser12

Quote:


> Originally Posted by *Himo5*
> 
> The Thermal Margin given in Overdrive is the clue to what is happening: it indicates how much more heating the processor can take. In the first image, adding 51C and 33.1C gives a breakdown temperature of 84C; in the second, HWiNFO64's 63C current temperature added to Overdrive's 21.9C thermal margin gives 84.9C. So the other gauge showing 35C in the first image is obviously wrong, and the 0.9C apparent discrepancy is probably down to different bits of software trying to read the same sensor at the same time. These CPU temperatures under 100% load look about right for 4600MHz and 4800MHz, but you could check them by setting 4700MHz and seeing if it loads to around 56C.


Wooooah, so my CPU has been running at 84C?? Or are you saying that's what the "temp + thermal margin" is (basically my max temp)? What IS the max temp for this CPU? I really, really, really don't have the money to toast my processor or mobo haha.

The little widget in the bottom middle is a Rainmeter skin, reading stuff straight from HWiNFO.

Just to be sure, could I get a small list of monitoring programs I SHOULD be using that will give me accurate information?

Also, I bluescreened my PC again last night while playing FO4, even after resetting my stuff to where it ran fine at 4.6.

Not enough juice? Too hot?

I really appreciate your help though, guys, seriously.

My CX550M showed up today. I'll either attempt to install that tonight or wait till Monday when I'm off work so I have all day to do it. Since I've gotta redo a bit of my cable management, I'm gonna redo it all, clean the fans, and maybe move some around too.


----------



## Himo5

So 84C is what Overdrive takes to be the maximum safe temperature for the A10-6800K - or it is reading the temperatures 10C higher than they actually are. In fact, now that I come to look it up, the actual max safe temp is 74C. As I mentioned, Overdrive isn't really geared to APU systems.

If you look at my voltage chart you'll see that I was running 4600MHz on load at 45C and 4800MHz at 53C; however, that was in a low ambient temperature with the case open, I had two 3000rpm Gentle Typhoon fans in push/pull on a Mugen heatsink, and I didn't have a graphics card heating up the case interior. So I would say your temperatures of 51C and 63C for 4600MHz and 4800MHz under 100% load, given by ET6 and HWiNFO64, look about right - the different gaps between the two sets of readings, 8C and 12C, can be accounted for by the extra cooling affecting my temperatures.

Given that, I would say use ET6 and check it against HWiNFO64. If you want to be more certain, another one you could check against is SpeedFan. But all these sensor readers have to be tested to see if they work on your setup. In this case Overdrive obviously doesn't, and it's also a pity that TRIXX doesn't yet seem to be up to speed on the RX 470.


----------



## Tisser12

Quote:


> Originally Posted by *Himo5*
> 
> So 84C is what Overdrive takes to be the maximum safe temperature for the A10-6800K - or it is reading the temperatures 10C higher than they actually are. In fact, now that I come to look it up, the actual max safe temp is 74C. As I mentioned, Overdrive isn't really geared to APU systems.
> 
> If you look at my voltage chart you'll see that I was running 4600MHz on load at 45C and 4800MHz at 53C; however, that was in a low ambient temperature with the case open, I had two 3000rpm Gentle Typhoon fans in push/pull on a Mugen heatsink, and I didn't have a graphics card heating up the case interior. So I would say your temperatures of 51C and 63C for 4600MHz and 4800MHz under 100% load, given by ET6 and HWiNFO64, look about right - the different gaps between the two sets of readings, 8C and 12C, can be accounted for by the extra cooling affecting my temperatures.
> 
> Given that, I would say use ET6 and check it against HWiNFO64. If you want to be more certain, another one you could check against is SpeedFan. But all these sensor readers have to be tested to see if they work on your setup. In this case Overdrive obviously doesn't, and it's also a pity that TRIXX doesn't yet seem to be up to speed on the RX 470.


Okay, I really do appreciate the input. I thought it was something like 75C but I wasn't sure. I've been really close to overheating my CPU then, if I haven't already.

As far as the GPU goes, I don't plan on overclocking it and haven't (at least not for a while; I wanna keep that warranty intact for as long as I can hold out), so I'm trying not to worry too much about any weird readings on it just yet, because most programs haven't been updated to read it properly aside from a select few. The TRIXX beta is SUPPOSED to have support for my 470, but obviously not full support since it doesn't show me my VRM temps. Or maybe it does and I'm just dumb...

I'm still not totally sure what I did that's causing bluescreens now. Gonna reset everything and start over.


----------



## Tisser12

I just downloaded a program titled "Core Temp". I have every temp-sensing program I own open right now to cross-compare.

HWiNFO shows 9-10C
Rainmeter shows 9-10C
Core Temp shows 9-10C
Overdrive shows a thermal margin of 60-61C (9-10C)
ET6 is the ONLY program that shows 35C as my CPU temp.

So ET6 is either the only one reading correctly, or the only one reading incorrectly. My guess would be the latter.
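The cross-comparison above amounts to a simple outlier check; a minimal sketch, assuming you jot each program's reading down by hand (the names and values are just the ones listed in this post; nothing is read programmatically):

```python
from statistics import median

# Hand-recorded idle CPU readings in degrees C, from the list above.
readings = {
    "HWiNFO64": 9.5,
    "Rainmeter": 9.5,
    "Core Temp": 9.5,
    "Overdrive (derived from thermal margin)": 9.5,
    "EasyTune6": 35.0,
}

mid = median(readings.values())
# Flag anything more than 5C away from the pack as the likely misreader.
outliers = [name for name, temp in readings.items() if abs(temp - mid) > 5.0]
print(outliers)  # ['EasyTune6']
```

With four readers agreeing and one disagreeing, the median sides with the pack, which matches the "ET6 is probably the one reading incorrectly" conclusion.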


----------



## Himo5

That's what you've got to do. Because there is no yardstick on the market, managing your sensor readings is a purely empirical thing; it's not really something you can advise on at a distance.


----------



## Tisser12

Yeah, I understand. You've done a ton for me already though, so THANKS! I realized I have one of those laser thermometers; it won't give me a 100% accurate reading, but it should at least put me in the ballpark if I hit the edge of my CPU with the sensor. I reset my LLC to medium and played Fallout again last night for a while and didn't get a bluescreen, so that's SOME progress. Now I just gotta figure out why being in Far Harbor and Sanctuary makes my CPU jump to 100% and drops my FPS to 30...


----------



## Tisser12

Found a consistency with the higher temps: in HWiNFO the CPU and CPU0 temps are different; one reads from the chip, the other from the board. The higher temps come from the board readings, and that's where ET6 is reading its temps from. All the other programs must be using the temp sensor actually built into the chip, which I'm coming to find out tends to be inaccurate on AMD chips. I'll definitely be keeping an eye on the higher temps to make sure they stay within range.

Installed my CX550M today, which involved basically tearing my whole PC down, including almost having to fully remove the motherboard to get the OLD PSU out. I didn't get the cable routed through the same spot because there wasn't enough room (I only loosened my mobo).

Booted, and nothing's fried yet, so we'll see how it goes.


----------



## Lutfij

Welp, I've been on this thread for quite a while and I've done a couple of adjustments to my build, though with some hardware additions I've gotten back to square one. I picked up a be quiet! Shadow Rock LP cooler and later added a Klevv Urbane 16GB DDR3-2400MHz kit. I'm hitting a couple of BSODs, numerous ones in fact, when I tax the system with StarCraft II; just watching 1080p videos on YouTube doesn't send it into a fit.



Help would be appreciated.

Namely, what northbridge frequency and voltages should I be looking at for the DDR3 2400MHz kit?

Board: ASRock A85X-ITX.

CPU Frequency Multiplier - x41 4100MHz
CPU Voltage - 1.21875v
CPU voltage offset - +0%
NB Frequency - x12 2400MHz
CPU NB/GFX voltage - 1.33750v
APU LLC - 1/2 Vcore
GPU Engine Clock - 950

Load XMP settings - Auto
DRAM frequency - DDR3-2400MHz
DRAM settings - Auto

I operate off a dual 23" monitor setup, does that mean my settings will need some more change on the GPU clocks as well?


----------



## Tisser12

Okay, so after weeks of a no-problem 24/7 4.6GHz OC, I was playing Fallout the other night and checked my readings afterwards like I always do, and I saw this. That's a world record, no? lol. But seriously, it freaked me out seeing those clocks; there shouldn't be clocks anywhere near that high. Anyone have an explanation? Should I be worried?


----------



## Himo5

If this was not an HWiNFO64 bug, then over a period of 73 hours producing an average just 388MHz above a minimum of 4211MHz, it must have peaked at 5923MHz CPU / 1.3V / 2318MHz NB for only a very short moment. However, as I noted before, Richland has been noted for its performance above 5GHz, as pages like this will show. I've been above 5.4GHz on air with it myself.
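The averaging argument is easy to check: one glitched sample in a 73-hour log barely moves the mean, so a genuinely sustained 5.9GHz run would have dragged the average far above 4599MHz. A toy calculation (the once-per-second sample rate is an assumption for illustration):

```python
# 73 hours of once-per-second clock samples sitting at the logged 4599MHz average...
samples = [4599.0] * (73 * 3600)
# ...with a single spurious 5923MHz spike substituted in.
samples[0] = 5923.0

mean = sum(samples) / len(samples)
print(round(mean, 3))  # 4599.005 - the spike shifts the mean by only ~0.005MHz
```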


----------



## Tisser12

Quote:


> Originally Posted by *Himo5*
> 
> If this was not an HWiNFO64 bug, then over a period of 73 hours producing an average just 388MHz above a minimum of 4211MHz, it must have peaked at 5923MHz CPU / 1.3V / 2318MHz NB for only a very short moment. However, as I noted before, Richland has been noted for its performance above 5GHz, as pages like this will show. I've been above 5.4GHz on air with it myself.


Of course, I've seen a few people run 5.0GHz daily on air. This was almost 6GHz lmao. Almost a full 2GHz OC, on the same voltages I use for my 4.6. (Could very likely up my OC, but I still fear that one high CPU temp reading I get.) I'm assuming it was a glitch. I'll keep my eyes out for any other spikes like that.


----------



## cssorkinman

Quote:


> Originally Posted by *Tisser12*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Himo5*
> 
> If this was not an HWiNFO64 bug, then over a period of 73 hours producing an average just 388MHz above a minimum of 4211MHz, it must have peaked at 5923MHz CPU / 1.3V / 2318MHz NB for only a very short moment. However, as I noted before, Richland has been noted for its performance above 5GHz, as pages like this will show. I've been above 5.4GHz on air with it myself.
> 
> 
> 
> Of course, I've seen a few people run 5.0GHz daily on air. This was almost 6GHz lmao. Almost a full 2GHz OC, on the same voltages I use for my 4.6. (Could very likely up my OC, but I still fear that one high CPU temp reading I get.) I'm assuming it was a glitch. I'll keep my eyes out for any other spikes like that.

If the computer was allowed to go into sleep or another power-saving mode while the monitoring program was running, it can produce misreads like this.

I have a screenshot showing 1 million+ GHz somewhere because of this - I'll look around and see if I can find it.


----------



## Tisser12

Quote:


> Originally Posted by *cssorkinman*
> 
> If the computer was allowed to go into sleep or another power-saving mode while the monitoring program was running, it can produce misreads like this.
> 
> I have a screenshot showing 1 million+ GHz somewhere because of this - I'll look around and see if I can find it.


My PC doesn't go to sleep or power saving mode. The readings didn't show up till after I was gaming for a couple of hours. So it was really odd that anything changed, let alone that drastically (if it wasn't just a glitch). Sort of makes me want to push my OC higher than 4.6..... lol


----------



## cssorkinman

Quote:


> Originally Posted by *Tisser12*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cssorkinman*
> 
> If the computer was allowed to go into sleep or another power-saving mode while the monitoring program was running, it can produce misreads like this.
> 
> I have a screenshot showing 1 million+ GHz somewhere because of this - I'll look around and see if I can find it.
> 
> 
> 
> My PC doesn't go to sleep or power saving mode. The readings didn't show up till after I was gaming for a couple of hours. So it was really odd that anything changed, let alone that drastically (if it wasn't just a glitch). Sort of makes me want to push my OC higher than 4.6..... lol

There's no way it was running 5.9GHz on 1.35V - I'll guarantee that.









ULPS disabled?


----------



## Tisser12

Quote:


> Originally Posted by *cssorkinman*
> 
> There's no way it was running 5.9GHz on 1.35V - I'll guarantee that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ULPS disabled?


Yep. I'm not expecting it ran that speed; there should be no possible way for it to clock up, let alone almost 1.5GHz, and let alone on that voltage lol. I just wasn't sure if it was anything that might randomly happen or if you guys thought it was just a bug. Picking up another 120mm Corsair AF fan tomorrow, so I'm gonna install it on my Hyper 212 EVO so I have a push-pull going on, then see if I can't get 4.8 with semi-decent temps.


----------



## Tisser12

Speaking of high clocks, since my CPU was running so much cooler, I decided to attempt a higher overclock. I worked up to 4.8GHz @ 1.4500V and my max temps are reaching 53C, which was my max at 4.6GHz before I added my 140 into my drive bay.

After I ran a stability test:



After 2 hours of Fallout 4 (have to do some real-world testing lol)


I'll leave my PC on overnight like usual and check everything again tomorrow. So far everything seems stable. Nothing out of the ordinary at least. May try for higher just to see how far I can push it.


----------



## Tisser12

Bumped my OC up to 4.8GHz like I said, and so far everything's performing very well. I was seeing a 120FPS average in DOOM multiplayer last night, which was surprising to say the least; my rig really likes Vulkan and the DOOM code.

Side question: how many volts (well, tenths of a volt) do I go up or down to see how low my voltage can go before it crashes? I want to try upping my OC to 4.9 and possibly hit 5.0, but using the table provided above as a base to start with, I was seeing 75C on my cores within 30 seconds of stressing it.


----------

