# [Official] AMD Bulldozer Reviews Thread



## Chunky_Chimp

This thread is for any reviews posted on or after October 12th, 2011 (Midnight EST or later, NOT counting any discrepancy in time zones). Previews, unboxings, and video-only reviews will not be merged here, unless the unboxing is accompanied by a review in the same article; actual articles are required. Feel free to post the review in standard news thread format, even though it won't be a new thread; this is to make it more eye-catching for those with high post-per-page counts such as myself.

Now then, getting on with the reviews:

*PCTuning*

http://pctuning.tyden.cz/hardware/procesory-pameti/22227-amd-bulldozer-procesory-fx-8150-a-8120-v-testu-1-2?start=14

Translated:

http://translate.google.com.sg/translate?sl=auto&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&u=http%3A%2F%2Fpctuning.tyden.cz%2Fhardware%2Fprocesory-pameti%2F22227-amd-bulldozer-procesory-fx-8150-a-8120-v-testu-1-2%3Fstart%3D14

*Ozeros*

http://www.ozeros.com/2011/10/review-amd-bulldozer-fx-8150/

Translated:

http://translate.google.com.sg/translate?sl=auto&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&u=http%3A%2F%2Fwww.ozeros.com%2F2011%2F10%2Freview-amd-bulldozer-fx-8150%2F

*Overclockers.com*

http://www.overclockers.com/amd-fx-8150-bulldozer-processor-review

*HotHardware*

http://hothardware.com/Reviews/AMD-FX8150-8Core-Processor-Review-Bulldozer-Has-Landed/

*[H]ardOCP*

http://www.hardocp.com/article/2011/10/11/amd_bulldozer_fx8150_desktop_performance_review

http://www.hardocp.com/article/2011/10/11/amd_bulldozer_fx8150_gameplay_performance_review/

*TechSpot*

http://www.techspot.com/review/452-amd-bulldozer-fx-cpus/

*VR-Zone*

http://vr-zone.com/articles/amd-fx-8150-cpu-overclocking-review-a-bulldozer-for-gamers-/13694-1.html

*Guru3D*

http://www.guru3d.com/article/amd-fx-8150-processor-review/1

*HardwareCanucks*

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/47155-amd-bulldozer-fx-8150-processor-review.html

*Vortez.net*

http://www.vortez.net/articles_pages/amd_fx_8150_bulldozer_cpu_review,1.html

*BenchmarkReviews*

http://benchmarkreviews.com/index.php?option=com_content&task=view&id=831&Itemid=63

*Bit-Tech*

http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/1

*Expert Reviews UK*

http://www.expertreviews.co.uk/processors/1287799/amd-fx-8150

*Legit Reviews*

http://www.legitreviews.com/article/1741/1/

*Hi Tech Legion*

http://www.hitechlegion.com/reviews/processors/13752

*PureOC*

http://www.pureoverclock.com/article1376.html

*Bjorn3D*

http://www.bjorn3d.com/articles/AMD_FX-8150_CPU_Bulldozer/2125.html

*Computer Base*

http://www.computerbase.de/artikel/prozessoren/2011/test-amd-bulldozer/

Translation:

http://translate.google.com/translate?sl=de&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&u=http%3A%2F%2Fwww.computerbase.de%2Fartikel%2Fprozessoren%2F2011%2Ftest-amd-bulldozer%2F

*ExtraHardware.cz*

http://extrahardware.cnews.cz/amd-fx-8150-6100-bulldozer-zambezi-recenze-test?page=0%2C0

Translation:

http://translate.google.com/translate?sl=cs&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&u=http%3A%2F%2Fextrahardware.cnews.cz%2Famd-fx-8150-6100-bulldozer-zambezi-recenze-test%3Fpage%3D0%252C0

*Hardware.fr*

http://www.hardware.fr/articles/842-1/amd-fx-8150-fx-6100-bulldozer-debarque-am3.html

Translation:

http://translate.googleusercontent.com/translate_c?hl=en&rurl=translate.google.com&sl=fr&tl=en&twu=1&u=http://www.hardware.fr/articles/842-1/amd-fx-8150-fx-6100-bulldozer-debarque-am3.html&usg=ALkJrhjyzKiSYAvGNJY-tu7n7pUE6B4FrQ

*Hardware Heaven*

http://www.hardwareheaven.com/reviews/1285/pg1/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-introduction.html

*Hexus*

http://hexus.net/tech/reviews/cpu/32110-amd-bulldozer-fx-8150/

*LegionHardware*

http://www.legionhardware.com/articles_pages/amd_fx_8150fx_8120fx_6100_and_fx_4170,1.html

*Lost Circuits*

http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=102&Itemid=1

*PC Perspective*

http://www.pcper.com/reviews/Processors/AMD-FX-8150-Processor-Review-Can-Bulldozer-Unearth-AMD-Victory

*Hardware TechStation*

http://www.techstation.it/hardware/articoli/amd-fx-il-ritorno-al-vertice/introduzione

Translation:

http://translate.google.com/translate?hl=en&sl=it&tl=en&u=http%3A%2F%2Fwww.techstation.it%2Fhardware%2Farticoli%2Famd-fx-il-ritorno-al-vertice%2Fintroduzione

*Tom's Hardware*

http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-1.html

*Anandtech*

http://www.anandtech.com/show/4955/

*Overclock3D*

http://www.overclock3d.net/reviews/cpu_mainboard/amd_fx8150_cpu_review/1

*Tech Report*

http://techreport.com/articles.x/21813

*X-bit*

http://www.xbitlabs.com/articles/cpu/display/amd-fx-8150.html

*HardwareHeaven*

http://www.hardwareheaven.com/reviews/1285/pg1/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-introduction.html

*Hardware.info*

http://nl.hardware.info/reviews/2382/amd-fx-8150--8120--6100--4100-bulldozer-review

Translation:

http://translate.google.com.sg/translate?hl=en&sl=auto&tl=en&u=http%3A%2F%2Fnl.hardware.info%2Freviews%2F2382%2Famd-fx-8150--8120--6100--4100-bulldozer-review

*TweakTown*

http://www.tweaktown.com/articles/4353/amd_fx_8150_vs_intel_i7_2600k_crossfirex_hd_6970_x3_head_to_head/index1.html

*AlienBabeltech*

http://alienbabeltech.com/main/amds-fx-8150-vs-core-i7-phenom-ii-bulldozer-arrives/all/1

*Rage3D*

http://www.rage3d.com/reviews/cpu/amd_fx_8150/index.php?p=1


----------



## Tanid

First thing I noticed was they tested it with 3gigs of ram. Yeah, I stopped looking at it at that point. If they can't get ram right what else can't they get right?

Bulldozer: MOAR POWER, NEEDS MOAR POWAH!!!!


----------



## otakunorth

fyi the members of ocn are anal about posting things that have been posted, and generic threads about products


----------



## Usario

Yall realize this is OBR (or at least his organization or whatever), right?


----------



## Oedipus

Quote:


> Originally Posted by *Tanid;15269471*
> First thing I noticed was they tested it with 3gigs of ram. Yeah, I stopped looking at it at that point. If they can't get ram right what else can't they get right?


If it were a kinder review, you wouldn't be saying that.


----------



## Dmac73

Quote:


> Originally Posted by *Tanid;15269471*
> First thing I noticed was they tested it with 3gigs of ram. Yeah, I stopped looking at it at that point. If they can't get ram right what else can't they get right?


If you even looked, that's just a screenshot from a GT system: 12 threads, 6 of them 100% idle, 3 GB of RAM, triple channel.

Just as easily another reason to discredit the review, but just sayin'


----------



## Dmac73

Quote:


> Originally Posted by *Usario;15269507*
> Yall realize this is OBR (or at least his organization or whatever), right?


And every other leaked review? You're gonna be an Intel guy within the hour. Never has one had so much consistency. Best of luck, sir.

edit: Sorry for the DP, wasn't ninja'd in time.


----------



## Imglidinhere

If this is the Bulldozer we've been waiting for, I'm sorry AMD, *you suck*.


----------



## Scorpion49

Quote:


> Originally Posted by *Oedipus;15269525*
> If it were a kinder review, you wouldn't be saying that.


Officially official isn't official enough.

Since it's the 12th there, I guess this goes right along with what we have seen already leaked?


----------



## Sophath

Quote:


> Originally Posted by *Tanid;15269471*
> First thing I noticed was they tested it with 3gigs of ram. Yeah, I stopped looking at it at that point. If they can't get ram right what else can't they get right?


You do realise that the Bulldozer rig is running in dual channel, right? And with 4 GB of RAM?


----------



## Tanid

Quote:


> Originally Posted by *Dmac73;15269529*
> If you even looked, that's just a screenshot from a GT system: 12 threads, 6 of them 100% idle, 3 GB of RAM, triple channel.
> 
> Just as easily another reason to discredit the review, but just sayin'


Lmao, I didn't even see the thread count! I just glanced over it and the 3 gigs caught my eye. Obviously it's not real: the "test systems" for the Phenom x6 and Bulldozer used AM3/AM3+ motherboards, but different motherboards. The x6 used a Gigabyte while the Bulldozer used a Crosshair. Failure.

EDIT: Why would one use a useless screenshot instead of a Bulldozer screenshot for the games section?


----------



## chasefrench

really nothing to say, this is not worth buying. I think IB will be my next upgrade now


----------



## Lostcase

Tomorrow will be one for the history books... cannot wait to get to work to read what ocn has to say about the official benches

Sent from my SCH-I500 using Tapatalk


----------



## AddictedGamer93

Quote:


> Originally Posted by *Lostcase;15269676*
> Tomorrow will be one for the history books... cannot wait to get to work to read what ocn has to say about the official benches
> 
> Sent from my SCH-I500 using Tapatalk


Tomorrow is in 48 minutes, but I doubt the benches will be up at midnight.


----------



## Dmac73

Quote:


> Originally Posted by *AddictedGamer93;15269710*
> Tomorrow is in 48 minutes, but I doubt the benches will be up at midnight.


Benches have been up since yesterday. Hell, 3 months ago.

11:59 you'll see benches.


----------



## xPwn

++Sets alarm clock for 6:00AM++


----------



## MarvinDessica

Quote:


> Originally Posted by *Sophath;15269581*
> You do realise that the Bulldozer rig is running in dual channel, right? And with 4 GB of RAM?


Stop arguing with these people, dude. It's a hunk of freaking metal from a company, but it's like AMD is their lifeblood. It's a product; move on and stop defending AMD. This processor will end up being replaced with a better one, LIKE ALWAYS. It's life, life disappoints, move on.


----------



## Xenthos

Not sure if I trust these benches so far but if true, I'm very disappointed.


----------



## mbudden

Owch. BD flop?


----------



## daman246

Wow, the way things are going, AMD will never be Intel's equal. They should just drop out of manufacturing processors. Seriously, if this is what Bulldozer brings us, then this is the biggest disappointment ever.


----------



## venom55520

ok i don't get it, the second "review/preview" shows very different results from the first one.


----------



## Homeles

Quote:


> Originally Posted by *daman246;15269804*
> Wow, the way things are going, AMD will never be Intel's equal. They should just drop out of manufacturing processors. Seriously, if this is what Bulldozer brings us, then this is the biggest disappointment ever.


They don't have to be the king of performance to still sell their processors. If they're still offering the best bang for the buck, they'll be ok.


----------



## Xenthos

Quote:


> Originally Posted by *Homeles;15269874*
> They don't have to be the king of performance to still sell their processors. If they're still offering the best bang for the buck, they'll be ok.


I don't know man, look at that power consumption (if true). That's bad.


----------



## MarvinDessica

Quote:


> Originally Posted by *Homeles;15269874*
> They don't have to be the king of performance to still sell their processors. If they're still offering the best bang for the buck, they'll be ok.


A 2400 can be had cheaper than any Bulldozer price, if we're going by what Microcenter is showing. And they sell processors either at a small loss or at what they buy them for. BD is not only going to be more expensive, but at a terrible price point.


----------



## Sin0822

I am really sorry guys :'( but it OCs very well.


----------



## Alatar

Just woke up, what's the ETA on anand, tpu etc.?


----------



## Sophath

Should be coming during the day.


----------



## rubicsphere

Quote:


> Originally Posted by *venom55520;15269845*
> ok i don't get it, the second "review/preview" shows very different results from the first one.


This. I'll be waiting for more credible sources.


----------



## venom55520

i'm now even more so contemplating sending back my 990FX and getting a p67 with a 2500 or 2600k from microcenter


----------



## Sir Shfvingle

So the reviews will or will not be going up at midnight EST? It's not like they can only start reviews after the 12th, so I would assume most are done already...


----------



## Sin0822

12 AM... wait, 12:01 is the lift.


----------



## Sophath

Quote:


> Originally Posted by *venom55520;15269845*
> ok i don't get it, the second "review/preview" shows very different results from the first one.


Most of the tests are different.
As for the game benchmarks, they are run at different resolutions.


----------



## marsey99

not a review but has some results


----------



## Ruckol1

Records will be going down tonight; history will be made. Most views on one thread in OCN history, I'll be willing to bet. If not, it will definitely be the currently-active-users-viewing record.


----------



## Sin0822

All these tests are legit; it performs like an 1100T, there is no question about it. No kernel tweak, no enhancement, other than having the latest CPU code, which all of these have.

Also, BD shows very inconsistent, VERY inconsistent benchmark scores; something is wrong with the CPU and how it handles data with its modules. It feels like it's two cores behind at all times. I can run wPrime, SuperPi, CineBench, and Vantage and get different scores every time, with only the voltage changed, or nothing at all, just run it again, and the scores are different. But you can grab an average, or wait until you see the same score, or take the best.
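If you're chasing that kind of run-to-run variance, it helps to script the loop instead of eyeballing single runs. A minimal Python sketch of "grab an average, or take the best": the `bench` callable and the scores below are made up for illustration; you'd wire in whatever actually launches your wPrime/Cinebench run and parses its score.

```python
import statistics
from typing import Callable

def summarize_runs(bench: Callable[[], float], runs: int = 5) -> dict:
    """Run a benchmark several times and report best, mean, and spread,
    so one noisy run isn't mistaken for the chip's real score."""
    scores = [bench() for _ in range(runs)]
    return {
        "best": max(scores),                                     # headline number
        "mean": statistics.mean(scores),                         # typical number
        "stdev": statistics.stdev(scores) if runs > 1 else 0.0,  # run-to-run noise
    }

# Deterministic stand-in "benchmark" returning Cinebench-style points:
fake_scores = iter([6.10, 6.45, 6.02, 6.38, 6.20])
result = summarize_runs(lambda: next(fake_scores), runs=5)
print(result["best"], round(result["mean"], 3), round(result["stdev"], 3))
```

A large `stdev` relative to the mean is exactly the module-scheduling symptom being described: the score depends on which cores the threads landed on.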


----------



## Lampen

Standing by and hoping for some decent info and reviews.


----------



## PhilWrir

Kiss the enthusiast market goodbye AMD...


----------



## radaja

the leaks were right all along, who would have thought


----------



## ChrisB17

Oh god, AMD in my books just became *A Major Disappointment*. Damn, this is such a heartbreak.


----------



## Blameless

Anyone surprised by these results, which have been largely old news for months, should be ashamed of their narrow-mindedness and skepticism that borders on abject paranoia.

Quote:



Originally Posted by *AddictedGamer93*


Tomorrow is in 48 minutes, but I doubt the benches will be up at midnight.


Many will be.

Most major sites have likely had final retail samples and benches of them for quite some time. They were just waiting for the NDA to be lifted.

Quote:



Originally Posted by *Sin0822*


All these tests are legit; it performs like an 1100T, there is no question about it. No kernel tweak, no enhancement, other than having the latest CPU code, which all of these have.

Also, BD shows very inconsistent, VERY inconsistent benchmark scores; something is wrong with the CPU and how it handles data with its modules. It feels like it's two cores behind at all times. I can run wPrime, SuperPi, CineBench, and Vantage and get different scores every time, with only the voltage changed, or nothing at all, just run it again, and the scores are different. But you can grab an average, or wait until you see the same score, or take the best.


Can you try manually setting affinity to force threads to be spread across modules as much as possible, and see if that makes a difference?
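For anyone who wants to try that affinity experiment, here is a minimal sketch. It assumes the FX-8150's layout pairs cores 2n and 2n+1 into one module; the `spread_mask` helper is hypothetical, and the `os.sched_setaffinity` call at the end is Linux-only (on Windows, Task Manager's affinity dialog or `SetThreadAffinityMask` plays the same role).

```python
import os

MODULE_SIZE = 2  # assumption: on Bulldozer, cores 2n and 2n+1 share one module

def spread_mask(n_threads: int, n_cores: int = 8) -> set:
    """Pick one core per module first (0, 2, 4, 6), then fall back to the
    shared-module siblings (1, 3, 5, 7) once every module has a thread."""
    primary = list(range(0, n_cores, MODULE_SIZE))    # first core of each module
    secondary = list(range(1, n_cores, MODULE_SIZE))  # second core of each module
    order = primary + secondary
    return set(order[:n_threads])

# Four threads land on four different modules instead of sharing two:
print(sorted(spread_mask(4)))  # [0, 2, 4, 6]

# On Linux, apply the mask to the current process before launching the bench:
# os.sched_setaffinity(0, spread_mask(4))
```

If scores with the spread mask differ consistently from the default scheduling, that would point at the shared front-end/FPU per module rather than at the benchmarks themselves.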

Quote:



Originally Posted by *radaja*


the leaks were right all along, who would have thought


Why wouldn't they have been?

A dozen different samples from a dozen different places all in the hands of liars with an anti-AMD agenda?

Anyone who thought that was a fool.


----------



## Ghoxt

I'm actually surprised by what we are seeing so far, albeit it's early. Most surprising is the power...good god.

Did they use my beloved GTX590's first set of drivers as a blueprint?

How could they be so close to Thuban on most performance tests? I mean, they get the MHz world record single core/thread etc., then with all ponies running it's only a marginal improvement over the previous generation. Reminds me of the overhead of 4x SLI and a drop in performance.

Again, it's too soon to say, but I'd hate to hear that they over-engineered it to end up with cycle degradation and additional power consumption.

Yeah, this thread will be the smallest one we've seen in a while.


----------



## microfister

Sub'd, want to see what BD amounts to. I'm not gonna trust anything until it's available for purchase.


----------



## redalert

http://hardforum.com/showthread.php?t=1642861

http://hardforum.com/showthread.php?t=1642860

This article will go live at 11PM CDT

It was posted in the Bulldozer blog thread


----------



## GTR Mclaren

this is going to end bad !!


----------



## pengs

This round is going to be shunned quite badly until higher-threaded games and software start making their way in and it becomes fully used. Unfortunately it's not going to take Intel but a nanosecond to put this chip even further in the dust. I don't think it's going to be worth more than 200 dollars; Intel is going to wipe the floor with the FX line before software can catch up.


----------



## black96ws6

I still think it was very cool of the Russian guys to let us benchmark their system. I mean how cool is that? They built the system and let us benchmark it first!


----------



## Sophath

Quote:



Originally Posted by *pengs*


I understand referring someone to an i5, but an i3 2400? I don't care how low BD's PPC is, the i3 isn't going to last nearly as long if you game or whatnot. I'd go i3 if I played ancient games or wanted a media PC. They are aimed at very opposite ways of computing. Even though BD isn't boding too well atm, you're still talking about an order of magnitude in power when it comes to highly threaded apps.

This round is going to be shunned quite badly until higher-threaded games and software start making their way in and it becomes fully used. Unfortunately it's not going to take Intel but a nanosecond to put this chip even further in the dust. I don't think it's going to be worth more than 200 dollars; Intel is going to wipe the floor with the FX line.


The 2400 is an i5, you know?


----------



## Hiep

Quote:



Originally Posted by *pengs*


I understand referring someone to an i5, but an i3 2400? I don't care how low BD's PPC is, the i3 isn't going to last nearly as long if you game or whatnot. I'd go i3 if I played ancient games or wanted a media PC. They are aimed at very opposite ways of computing. Even though BD isn't boding too well atm, you're still talking about an order of magnitude in power when it comes to highly threaded apps.

This round is going to be shunned quite badly until higher-threaded games and software start making their way in and it becomes fully used. Unfortunately it's not going to take Intel but a nanosecond to put this chip even further in the dust. I don't think it's going to be worth more than 200 dollars; Intel is going to wipe the floor with the FX line.


a 2400 is an i5....


----------



## Xenthos

Quote:



Originally Posted by *redalert*


http://hardforum.com/showthread.php?t=1642861

http://hardforum.com/showthread.php?t=1642860

This article will go live at 11PM CDT

It was posted in the Bulldozer blog thread


That's a source I was waiting for. Thx


----------



## Blameless

You can barely overclock a 2400, while Bulldozer OCs quite well. I'm not a member of OCN because I'm fond of running my CPUs at stock.

I find it interesting that these first two reviews have used DOSBox as a test. This is something I do myself, but haven't seen anyone else do, until now.

Quote:



Originally Posted by *Ghoxt*


I'm actually surprised by what we are seeing so far, albeit it's early. Most surprising is the power...good god.


Why are you surprised?

These results are not especially different from what's been out for weeks or months.

Quote:



Originally Posted by *PhilWrir*


Kiss the enthusiast market goodbye AMD...


I'm still getting one.

No surprises in performance or overclockability, thus no surprises regarding value, which is fairly decent, all things considered.


----------



## PyroTechNiK

Benches have been out for the longest time, the "official" benchmarks I think will closely match the leaked ones.


----------



## radaja

Quote:



Originally Posted by *Blameless*


Anyone surprised by these results, which have been largely old news for months, should be ashamed of their narrow-mindedness and skepticism that borders on abject paranoia.

Many will be.

Most major sites have likely had final retail samples and benches of them for quite some time. They were just waiting for the NDA to be lifted.

Can you try manually setting affinity to force threads to be spread across modules as much as possible, and see if that makes a difference?

Why wouldn't they have been?

A dozen different samples from a dozen different places all in the hands of liars with an anti-AMD agenda?

Anyone who thought that was a fool.


I was being sarcastic. I agree the writing has been on the wall for some time now, but the deniers will be deniers and will now wait for the magical 22.5% BIOS fix and patch, and if that doesn't work they will only use BD on Linux because Windows is an Intel shill. Denial is nasty when it rears its ugly head.


----------



## conzilla

I don't think people thought all those reviewers were liars. They more than likely thought AMD would pull an 11th-hour magic BIOS. I for one hoped for it too.


----------



## Derp

Quote:



Originally Posted by *Xenthos*


That's a source I was waiting for. Thx


If their gaming tests are only done at eyefinity resolutions I'm going to cry.


----------



## gamer11200

Based on leaked stuff, I was really hoping for more ;(

Hopefully magic occurred and we get some chips that can properly rival a 2600K


----------



## Blameless

Quote:



Originally Posted by *radaja*


I was being sarcastic. I agree the writing has been on the wall for some time now, but the deniers will be deniers and will now wait for the magical 22.5% BIOS fix and patch, and if that doesn't work they will only use BD on Linux because Windows is an Intel shill. Denial is nasty when it rears its ugly head.


I know, I was just emphasizing the should-have-been-obvious for the benefit of others.


----------



## PandaAttack

If these are anywhere near accurate, debacle status achieved. We'll see when the hardocp article goes live.


----------



## conzilla

Hell, I still have 15 min of hope left lol.


----------



## Nocturin

Haven't read the results, although your responses are disheartening. 9 minutes.


----------



## Hawk777th

Good jump for AMD; never expected it to blow away SB. Decent performance for the price. Glad I didn't wait and went SB.


----------



## Core2uu

From what I'm seeing so far, its performance is quite varied across the board. It's beating the 2600K in one or two benches, staying between the 2500K and 2600K in most, and then struggling to keep pace with the Thubans in others. There's still lots of room for improvement and I do believe it will get better; for now though, looking at where it is priced, the 8150 (while not at the performance levels we were hoping for in all places) does not make a bad value proposition.


----------



## redalert

Quote:


> Originally Posted by *conzilla;15270192*
> Hell, I still have 15 min of hope left lol.


more like 7 min


----------



## Schmuckley

7 mins left now...


----------



## Blameless

Quote:


> Originally Posted by *conzilla;15270168*
> I don't think people thought all those reviewers were liars. They more than likely thought AMD would pull an 11th-hour magic BIOS. I for one hoped for it too.


BIOS has virtually nothing to do with anything that could have improved performance significantly at this point.

BIOS can control clock speeds and memory tables, but the clock speeds in all but the earliest previews were spot on and memory performance did not hold any surprises.

BIOS has nothing to do with how a CPU executes instructions or any other internal operations. The OS assigns priorities and affinities, while everything else of consequence is internal to the CPU.

There is no such thing as a magic BIOS.


----------



## uncholowapo

2 more long min...

EDIT: WOW

*Currently Active Users Viewing This Thread: 262 (115 members and 147 guests)*


----------



## Oedipus

I'm looking forward to an entirely new wave of off-the-wall excuses in the coming hours.


----------



## Sophath

Quote:



Originally Posted by *Core2uu*


From what I'm seeing so far, its performance is quite varied across the board. It's beating the 2600K in one or two benches, staying between the 2500K and 2600K in most, and then struggling to keep pace with the Thubans in others. There's still lots of room for improvement and I do believe it will get better; for now though, looking at where it is priced, the 8150 does not make a bad value proposition.


Unfortunately, all of the benchmarks where it wins against the 2600k are the multithreaded ones. And then again, in single threaded benches, it only keeps up with Thuban.


----------



## Xenthos

Quote:



Originally Posted by *Oedipus*


I'm looking forward to an entirely new wave of off-the-wall excuses in the coming hours.


Please... we can do without these comments. Try keeping one thread clean...


----------



## radaja

Quote:



Originally Posted by *Blameless*


I know, I was just emphasizing the should-have-been-obvious for the benefit of others.


I don't think emphasizing works on this type.

Like I said in another thread:

Fantasy and reality are different things altogether, but fantasy seems to be winning more than Charlie Sheen these last few weeks.

But I feel their pain; I really wanted AMD to get this right. Maybe a certain mod will admit now that AMD botched this launch big time. He knows who he is.


----------



## Tunapiano

Looks like all the rumors and benches on test-sample chips were right: it competes with a 2400 or 2500 depending on the program you bench it with. It can't touch the 980X in multicore benching, and the 2600K beats everything in single-core benching.

That means when LGA 2011 comes out, it will be the end of AMD competing in the enthusiast market; LGA 2011 will most likely be twice as good as BD.


----------



## pteek

Why is my clock running so slow?


----------



## Badness

Well, at least the box art is good and bf3 is GPU limited. That's as optimistic as I can be.


----------



## Blameless

Quote:



Originally Posted by *Sophath*


Unfortunately, all of the benchmarks where it wins against the 2600k are the multithreaded ones. And then again, in single threaded benches, it only keeps up with Thuban.


How is this unfortunate?

Who the hell buys an eight core CPU to run single or lightly threaded apps?

Unimpressive single threaded performance was almost a given from the beginning. Multi-threaded performance is what will make or break the CPU for nearly everyone except gamers, who are such a small portion of users that they may as well be irrelevant.

Multi-threaded performance compares well to Intel chips in the same price range, and that's why I do not see these results as a failure.


----------



## redalert

Quote:



Originally Posted by *Badness*


Well, at least the box art is good and bf3 is GPU limited. That's as optimistic as I can be.


They will overclock pretty well


----------



## WizrdSleevz

Please BD, don't disappoint me..


----------



## razaice

Quote:


> Originally Posted by *Blameless;15270293*
> How is this unfortunate?
> 
> Who the hell buys an eight core CPU to run single or lightly threaded apps?
> 
> Unimpressive single threaded performance was almost a given from the beginning. Multi-threaded performance is what will make or break the CPU for nearly everyone except gamers, who are such a small portion of users that they may as well be irrelevant.
> 
> Multi-threaded performance compares well to Intel chips in the same price range, and that's why I do not see these results as a failure.


Wow good point.


----------



## pioneerisloud

Subbed (will probably regret this later....)


----------



## allupinya

http://vr-zone.com/articles/amd-fx-8150-cpu-overclocking-review-a-bulldozer-for-gamers-/13694-8.html


----------



## uncholowapo

1 min left!!!


----------



## Nocturin

For the Record. The minute before reviews hit:

273 (125 members and 148 guests)
Nocturin, 996gt2, AK-47, Alatar, alentor, AMD_Freak, andydam, Anonymous->Object, Arviel, B!0HaZard, Badness, bcornell11, Bitemarks and bloodstains, Blameless, bojinglebells, born2bwild, bossie2000, bullet_101, canoners, Check101, ChrisB17, Chuckclc, CLeeFESQ, conzilla, cook, Core2uu, cutegelo, dantoddd, davieg, DeltaUpsilon, dennisjai, Derp, Diabolical999, Dmac73, Dranx, ElectroGeek007, ElfyOC, exergy, E_man, Favelax895, Firebeard, flyin15sec, formula m, Forsakenfire, gboeds, giver660, GuilT1, HA3AP, Hawk777th, Hiep, hokiealumnus, Homeles, HWI, iamwardicus, Improvidus, jammo2k5, jorpe, jpyumul, kow_ciller, Krame, Lampen, lordikon, Maich, marsey99, MarvinDessica, mav451, Max78, mbudden, moedank, MrSleepin, Nicolas11x12, Oedipus, onethreehill, otakunorth, PandaAttack, Papas, pengs, pinkfloyd48, PinkSlippers, pkmnfreak125, Poisoner, proximo, pteek, pyra, Ra1nman, radaja, razaice, RebelRising, redalert, rx7racer, Scrappy, SeanPoe, Shion314, Sir Shfvingle, slackingoff7, SlaveOnDope, sLowEnd, Sm0keydaBear, SmasherBasher, Sophath, spice003, spiderm0nkey, SSJVegeta, ssjwizard, sub50hz, Tanid, The Nightwatchman, The-Beast, TheStealthyOne, The_0ctogon, tictoc, TiFFman, torsp, Turbo16, Twixter, uncholowapo, WAYN3H3AD, wedge, WizrdSleevz, Xenthos, xx9e02, xxbassplayerxx, y2kcamaross, yesitsmario, [-Snake-]


----------



## Hiep

Quote:



Originally Posted by *allupinya*


http://vr-zone.com/articles/amd-fx-8...-/13694-8.html


Woww...


----------



## xd_1771

OVERCLOCKERS.COM
go


----------



## ChrisB17

Quote:



Originally Posted by *allupinya*


http://vr-zone.com/articles/amd-fx-8...-/13694-8.html


Oh WOW.


----------



## GBCirino

TechSpot review


----------



## pengs

Quote:



Originally Posted by *Badness*


Well, at least the box art is good and bf3 is GPU limited. That's as optimistic as I can be.


I feel some of your disappointment but I really think any game that uses all 8 cores will do quite well and probably end up ahead of the 2500K. All is not lost, it's up to the devs to use it and make BD something worth buying for future software and games.


----------



## Lampen

Load damn you!


----------



## poyyiee

http://www.guru3d.com/article/amd-fx...ssor-review/12

guru3d


----------



## pteek

It's live!


----------



## MarvinDessica

Quote:



Originally Posted by *Blameless*


How is this unfortunate?

Who the hell buys an eight core CPU to run single or lightly threaded apps?

Unimpressive single threaded performance was almost a given from the beginning. Multi-threaded performance is what will make or break the CPU for nearly everyone except gamers, who are such a small portion of users that they may as well be irrelevant.

Multi-threaded performance compares well to Intel chips in the same price range, and that why I do not see these results as a failure.


Yeah, but why are 8 cores JUST barely outperforming 4?


----------



## B!0HaZard

285 viewing this thread.

Holy balls, that's a performance increase.

Equals or beats 2500K in a lot of the multithreaded tests.

Topping out at 5 GHz is decent, but I'm not sure 1.6 V is healthy for a 32 nm CPU. Wish I understood the article.

Depending on the price, this is a good alternative to the i5's.


----------



## Nocturin

waiting on tech report too


----------



## dejanh

Well damn

I think it is time to start refining this architecture. It seems unfinished, but it looks promising.


----------



## cokezone

guru3d

http://www.guru3d.com/article/amd-fx-8150-processor-review/


----------



## cook

As we all eagerly await on the verge of disappointment -
Quote:


> Members currently online:
> 
> cook, 996gt2, AK-47, Alatar, alentor, AMD_Freak, andydam, Anonymous->Object, Arviel, B!0HaZard, Badness, Bitemarks and bloodstains, Blameless, bojinglebells, born2bwild, bossie2000, BSE, bullet_101, canoners, Check101, ChrisB17, Chuckclc, Chunky_Chimp, CJRhoades, CLeeFESQ, conzilla, Core2uu, cutegelo, dantoddd, davieg, DeltaUpsilon, dennisjai, Dhalmel, Diabolical999, Dmac73, Dranx, ElectroGeek007, ElfyOC, exergy, E_man, fatmario, Favelax895, Firebeard, flyin15sec, formula m, Forsakenfire, gamer11200, GBCirino, gboeds, giver660, GTR Mclaren, GuilT1, HA3AP, Hawk777th, Hiep, hokiealumnus, Homeles, iamwardicus, Improvidus, jammo2k5, jorpe, jpyumul, kow_ciller, Krame, Lampen, lordikon, Lucky 13 SpeedShop, Maich, marsey99, MarvinDessica, mav451, Max78, mickeyfuqinp, moedank, MrSleepin, mystikalrush, Nezmen, Nicolas11x12, Nocturin, onethreehill, otakunorth, PandaAttack, Papas, pengs, pinkfloyd48, PinkSlippers, pkmnfreak125, Poisoner, poyyiee, Prox, proximo, pteek, pyra, QxY, Ra1nman, radaja, razaice, Rebelord, RebelRising, redalert, Rpg2, rubicsphere, rx7racer, Scrappy, Shion314, Sir Shfvingle, Skidooer93, slackingoff7, SlaveOnDope, sLowEnd, Sm0keydaBear, SmasherBasher, smoki, Sophath, spice003, ssjwizard, staryoshi, Tanid, The Nightwatchman, The-Beast, TheStealthyOne, The_0ctogon, TiFFman, Turbo16, Twixter, uncholowapo, WAYN3H3AD, wedge, WizrdSleevz, Xenthos, xx9e02, xxbassplayerxx, y2kcamaross, [-Snake-]


----------



## Hiep

Quote:



Originally Posted by *B!0HaZard*


271 viewing this thread.









Holy balls, that's a performance increase.

Equals or beats 2500K in a lot of the multithreaded tests.

Topping out at 5 GHz is decent, but I'm not sure 1.6 V is healthy for a 32 nm CPU. Wish I understood the article.

*Depending on the price, this is a good alternative to the i5's.*


If priced at $240+, no.. just no.


----------



## mystikalrush

OMG the power consumption just... just... seriously... what the hell is the point in overclocking it now....


----------



## 996gt2

*Where's Anandtech's review?*


----------



## Firehawk

Hardware Canucks


----------



## Homeles

Well damn, those slides were real. Who was right?

<--- this guy


----------



## Canis-X

One up here now...

http://benchmarkreviews.com/index.ph...=831&Itemid=63


----------



## Nocturin

I was going to say something but plain forgot.

EDIT: Oh ya, these slides are going to be fun to compare against the Google cache; they look slightly different. Can't see it fully yet!
EDIT EDIT: Haven't been able to read a damn thing yet because [H] broke >.<.

OMG BD CRASHED THE INTERNET!
:sarcasm:


----------



## CrazyDiamond

http://www.techspot.com/review/452-a...dozer-fx-cpus/


----------



## 996gt2

*Test Setup | Idle (Watts) | CPU Loaded (Watts)
i7 2600K | 97 W | 158 W
FX-8150 | 121 W | 246 W*

Really AMD, really?????
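That 88 W load-power gap adds up over time. A rough sketch of what it costs (the hours of daily use and electricity price are assumptions for illustration, not figures from any review):

```python
# Rough electricity-cost estimate from the review's system load figures.
# Assumed: 4 h/day at full load, $0.12/kWh (both are assumptions).
def annual_cost(load_watts, hours_per_day=4, usd_per_kwh=0.12):
    """Yearly electricity cost of running at load_watts for hours_per_day."""
    kwh_per_year = load_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# FX-8150 system (246 W) vs i7-2600K system (158 W) under load:
delta = annual_cost(246) - annual_cost(158)
print(f"~${delta:.2f} extra per year")  # ~$15.42 under these assumptions
```

Small in absolute terms, but it eats into the FX-8150's price advantage, and it scales up fast for anyone folding or encoding around the clock.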


----------



## xd_1771

Check this out








The FX 6XXX is on the same level as the 8XXX. The 8XXX should have a VERY clear advantage. Something is wrong.


----------



## [-Snake-]

Quote:



Originally Posted by *allupinya*


http://vr-zone.com/articles/amd-fx-8...-/13694-8.html


----------



## Derp

Such awful gaming performance, the leaks were right all along.


----------



## B!0HaZard

Quote:



Originally Posted by *Hiep*


If priced at $240+, no.. just no.


Well, is it priced at $240? Anyway, people with BD ready motherboards will have a very cheap upgrade in these CPUs.


----------



## HWI

This pretty much sums it up:

Quote:



Originally Posted by *PCtuning*

If I simplify, we can say: the AMD FX-8150 needs roughly an 800 MHz higher frequency to achieve performance comparable to the Core i7-2600K, after overclocking all cores to a constant value (rather than leaving the processor at its defaults).


It's not too bad considering AMD's price point.


----------



## anubis1127

Well, I guess I am glad I did not wait for this CPU to drop. Gaming performance worse than Phenom II's and i3's, yay.


----------



## Diabolical999

_"Hey guys, new girl OCn member here needing help._"


----------



## Lampen

Wow. Didn't think it would be quite this bad...


----------



## 996gt2

Quote:



Originally Posted by *B!0HaZard*


Well, is it priced at $240? Anyway, people with BD ready motherboards will have a very cheap upgrade in these CPUs.


Unless they shelled out big bucks for a Crosshair V or 990FX-UD7


----------



## Firehawk

Bit-tech


----------



## Clairvoyant129

The leaks from the very beginning were right all along. I guess all the die hard fanboys are eating their own words now.

BD isn't just mediocre, it's plain garbage.


----------



## TheStealthyOne

What a shame.


----------



## Hiep

Quote:


> Originally Posted by *B!0HaZard;15270470*
> *Well, is it priced at $240?* Anyway, people with BD ready motherboards will have a very cheap upgrade in these CPUs.


http://www.techspot.com/review/452-amd-bulldozer-fx-cpus/page13.html
Quote:


> *Then there is the question of value. At $245 the FX-8150 is pretty good, as is the FX-8120 at $205, and the FX-6100 at $165.* The FX-8150 is 22% cheaper than the Core i7-2600K and this works to AMD's favor as the FX-8150 was often less than 20% slower.
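TechSpot's value argument (22% cheaper, often less than 20% slower) can be sanity-checked with a quick perf-per-dollar calculation. The relative-performance figure below is an illustrative assumption, not a measured average:

```python
# Hypothetical value comparison; the 0.82 relative-performance figure is
# assumed for illustration (TechSpot says "often less than 20% slower").
prices = {"FX-8150": 245, "i7-2600K": 315}           # approximate USD launch prices
relative_perf = {"FX-8150": 0.82, "i7-2600K": 1.00}  # assumed multithreaded average

value = {cpu: relative_perf[cpu] / prices[cpu] * 100 for cpu in prices}
for cpu, v in value.items():
    print(f"{cpu}: {v:.3f} perf points per dollar (x100)")
```

Under those assumptions the FX-8150 does edge out the 2600K on raw perf-per-dollar; the catch, as the power figures show, is that it loses that edge once electricity and cooling enter the equation.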


----------



## Lampen

Quote:


> Originally Posted by *[-Snake-];15270465*


WUT


----------



## ChrisB17

Meh I guess I am keeping what I have. BD was a huge fail after all.


----------



## Shion314

Quote:


> Originally Posted by *Tunapiano;15270273*
> looks like all the rumors and benches on test sample chips were right. it competes with a 2400 or 2500 depending on the program you bench it with. It can't touch the 980x in multicore benching and the 2600k beats everything in single core benching.
> 
> That would mean when LGA 2011 comes out that it will be the end of AMD competing with the enthusiast market and also LGA 2011 will most likely be twice as good as BD.


At this point, that seems to be the case. I had doubts about where BD would land. I felt like it was going to compete with the Sandy Bridge Processors but right now it seems like they aren't up to snuff. Intel will have a field day with this.


----------



## Blameless

Quote:


> Originally Posted by *Tunapiano;15270273*
> looks like all the rumors and benches on test sample chips were right. it competes with a 2400 or 2500 depending on the program you bench it with. It can't touch the 980x in multicore benching and the 2600k beats everything in single core benching.


Exactly what every reasonable and sane person was expecting.
Quote:


> Originally Posted by *Tunapiano;15270273*
> That would mean when LGA 2011 comes out that it will be the end of AMD competing with the enthusiast market and also LGA 2011 will most likely be twice as good as BD.


I can't agree with this. Enthusiast does not imply high cost.

Final performance per dollar of Bulldozer CPUs is not bad at all, and platform costs are extremely competitive.
Quote:


> Originally Posted by *MarvinDessica;15270394*
> Yeah but why are 8 cores JUST barely outperforming 4?


Because the cores are narrower and share an FPU?

It was clear that some sacrifices would have to be made to fit 8 cores in the same transistor count as their previous six core chips, which already had inferior IPC to the last two generations of Intel processors.

Not all cores are created equal.
Quote:


> Originally Posted by *xd_1771;15270463*
> Check this out
> http://static.techspot.com/articles-info/452/bench/Encoding_03.png
> The FX 6XXX is on the same level as the 8XXX. The 8XXX should have a VERY clear advantage. Something is wrong.


How many threads does TMPGEnc actually use?
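The thread-count question is really Amdahl's law: if a workload's parallel fraction is limited, doubling cores buys very little. A minimal sketch with made-up numbers (the 70% parallel fraction is purely illustrative, not a measurement of TMPGEnc):

```python
# Amdahl's law sketch: why 8 cores may barely beat 4 on a given workload.
def speedup(cores: float, parallel_fraction: float) -> float:
    """Ideal speedup when only parallel_fraction of the work scales with cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# If an encoder is only ~70% parallel, 8 cores gain little over 4:
for n in (4, 8):
    print(n, round(speedup(n, 0.70), 2))  # 4 -> 2.11, 8 -> 2.58
```

That is before accounting for the shared FPU per module, which further caps what the extra cores can contribute in FP-heavy encoding.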


----------



## slackingoff7

All that matters is power consumption for AMD's long-term viability. Its performance per watt is ridiculous.

Server markets are not going to be happy with AMD.


----------



## cook

That voltage is insane! I bet it runs super hot as well, especially if it does come with an H60-like cooler. For $240 this is not a good deal. I can get a 2600K for $279 with a huge discount on a mobo at Microcenter, and just might.


----------



## The-Beast

The chip is terrible, made more so by indications of 10% lower IPC than Thuban; that's actually a 25% deficit from projections. Considering Llano gained about 5% IPC just from cleaning up the Phenom II architecture, you have to wonder what improvements could have been made to that architecture in moving to 32nm, and how it would have compared to Bulldozer.


----------



## PyroTechNiK

The leaks were right all along, BD is a failure.


----------



## allupinya

and so much for the 5ghz on air rumor...


----------



## Plex

And now we see why the CEO jumped ship when he saw what Sandy Bridge could do.


----------



## Ghoxt

Quote:



Originally Posted by *Blameless*


Why are you surprised?

These results are not especially different from what's been out for weeks or months.


To be specific, prior to release all we heard was "those are ES chips," etc., and history going back years shows that many ES chips were not exactly the same as production chips; some differed by a small margin, others by a large one.

Also, the leaked sources were shady in the extreme (Ukraine, some back alley in Turkey, etc.).









So no, I believe many didn't have any ground to stand on to believe the leaks as factual, as none were even benched using any reliable tools we trust.

Overall, I'm sure there's more info to come so I hate to be too polarised this early. Seriously I am still hoping for some magic pill to fix this early sentiment, and I'm an Intel guy. I strongly believe competition is best for everyone.

Speaking of Watts, my shop with 2500 Servers is currently all AMD, all are run at stock speeds of course. My hardware Director is watching Bulldozer and was talking about the next batch. Will be interesting long term when the next round of Server lease replacements start.


----------



## GTR Mclaren

damn...1100T is almost as good as 8150....


----------



## Oedipus

This is legitimately humorous.

I genuinely hope they can do better next time.


----------



## Papas

I'll be the first to admit it, amd failed..big time. I am eating my words.


----------



## Toology

Faildozer is official.


----------



## Billy_5110

Poor you guys, waiting for a CPU that could beat up Sandy Bridge. Really, that's so bad. I was going to get my brother a Bulldozer rig in a week, but now... lol. That's clearly not what we expected from it.

AMD did something stupid with Bulldozer; it doesn't even beat Phenom II in some cases. WTH?


----------



## SmasherBasher

Wonder how they fold. To me that would be the only reason to buy one at this point.


----------



## spice003

Damn it, should have grabbed a 2500K from Micro Center when they had it on sale.


----------



## Blameless

Quote:


> Originally Posted by *allupinya;15270558*
> and so much for the 5ghz on air rumor...


Overclocking seems overall roughly on par with SB. At least one site shown today did have a chip that did 5GHz on air.

Still, no one should have really expected suicide runs to be representative of actual, practical, clocks. A SuperPi 32M run is not representative of 24/7 stability.

4.5-5GHz as a maximum clock is not bad at all.


----------



## Clairvoyant129

Power consumption is god awful. AMD always ran on performance per watt; I guess Intel will take another 5% of the server market and end up with 95%.

Significantly slower than Nehalems from 2008. BD is garbage.


----------



## Plex

Intel is very happy right now.


----------



## 40.oz to freedom

Fail.


----------



## BallaTheFeared

Man the power draw and poor scaling with MHz wounds me deeply.


----------



## Hiep

Quote:



Originally Posted by *spice003*


damn it should of grabbed 2500k from micro center when they had it on sale.


Isn't it still 179.99? http://www.microcenter.com/single_pr...uct_id=0354589


----------



## xd_1771

Quote:



Originally Posted by *allupinya*


and so much for the 5ghz on air rumor...


el gappo hit 5Ghz+ on air with a VenX


----------



## lordikon

Quote:



Originally Posted by *Clairvoyant129*


The leaks from the very beginning were right all along. I guess all the die hard fanboys are eating their own words now.

BD isn't just mediocre, it's plain garbage.


I wouldn't call it garbage. It's a decent upgrade for those who have compatible motherboards, but yeah, its price/performance isn't great, and its heat output for its performance is just terrible.

I think most people knew the leaks were right, you don't get leaks from 20 different sources all lying with the same results in some big conspiracy against AMD. Like you said, it's only an issue for those die-hard fanboys who refused to believe AMD could release a CPU that wasn't going to compete with Intel's best CPUs.

It's sad news though, I was hoping against hope that AMD would pull something out of their sleeve. Now I'm afraid that it'll be another couple of years before AMD has time to release something new that'll stand a chance, but maybe we'll get lucky and that won't happen. As it stands right now AMD's new CPUs don't stack up to 9-month old Intel CPUs, so I'd say they're at least a year behind Intel.


----------



## BallaTheFeared

Did he hit it prime95 stable though?


----------



## staryoshi

The power consumption figures I am seeing make my head explode. I wish the product were more impressive so that Intel would be motivated to release Ivy Bridge more quickly.

Side-note: For AMD to sell me a BD proc, it would have to sport two modules, run at 65 W, and come in under $100.


----------



## defuz3d

Legit have just posted theirs http://www.legitreviews.com/article/1741/1/


----------



## Dmac73

Usario where you at? Jk.

This is fail. You could say it's a good CPU when overclocked, but the power consumption... MOTHEROFGOD.JPG

Terrible clock-for-clock performance, a whole lot of nothing at high frequency, and ridiculous heat and power consumption (not even ridiculous: unacceptable).

Netburst all over again? Yes? Yes? Yes.


----------



## dodger.blue

Okay, so it's a better workstation cpu than the i5 2500k, but that's about it. It seems like the 2500k is a much better option for most people.


----------



## Fallendreams

Those CPUs should be priced lower. I'd rather get a 2500K than the 8150.

Sent from my SAMSUNG-SGH-I777 using Tapatalk


----------



## Lampen

Quote:



Originally Posted by *BallaTheFeared*


Man the power draw and poor scaling with MHz wounds me deeply.


Agreed. The power draw is insane...


----------



## smartasien

moving on to bigger and better things.

hello INTEL!


----------



## Derp

Glad you got those pre-orders in right? LOOOOOOOOOOOOOOOOOOOOOOOOOOL


----------



## defuz3d

Hi-tech Legion: http://www.hitechlegion.com/reviews/processors/13752


----------



## Blameless

Only question that still remains in my mind is regarding the uncore/cpu-NB and how this is clocked by default as well as how overclocking it (if possible) will influence performance.


----------



## rockosmodlife

Breaks my heart, oh well.









Next upgrade, hexa or 2600k. FX=poo.


----------



## born2bwild

i5 2500k is both cheaper and a much better option for enthusiasts/gamers and the average consumer.
I do not see BD becoming successful at all with these prices. And even if AMD reduces prices, BD itself as a technological step is underwhelming to say the least.


----------



## c0nnection

*Hugs* Sandy Bridge. I knew Bulldozer was going to be a failure. I am also disappointed about it too. I remember when my AMD 3700+ was ripping Intel apart in gaming benchmarks.


----------



## WizrdSleevz

BD... why?? why?!!!


----------



## Nocturin

Quote:


> Originally Posted by *Firehawk;15270498*
> Bit-tech


txs for that. anyone seen any NB OC yet?


----------



## Dmac73

Quote:


> Originally Posted by *xd_1771;15270636*
> el gappo hit 5Ghz+ on air with a VenX


Bet it's not stable. I want to see 5+ GHz on air, Cinebench looped or Prime95 for anything more than half an hour.

A CPU-Z MHz reading doesn't mean squat. Also, when it takes 6.5 GHz and beyond to match 5 GHz SB, it's not even worth bragging about anymore. You're consuming WAY more wattage than a 480, rofl, with nothing to show for it.


----------



## Clairvoyant129

Quote:


> Originally Posted by *dodger.blue;15270664*
> Okay, *so it's a better workstation cpu than the i5 2500k*, but that's about it. It seems like the 2500k is a much better option for most people.


Are you joking? No it isn't.


----------



## UbNub

Hopefully the 1100T still comes down in price. And will Bulldozer-E or whatever run on AM3+? They kind of owe a refined version to the people who invested in AM3+.


----------



## Lampen

Quote:


> Originally Posted by *Fallendreams;15270669*
> Those CPUs should be price lower. I rather get 2500k then the 8150.
> 
> Sent from my SAMSUNG-SGH-I777 using Tapatalk


Yep. Price/performance and performance/watt were AMD's niches in the market, and looking at this, they've clearly lost them. Given the choice between a 2500K and an 8150, most people are going to go Intel.


----------



## hgfdsa

I have always gone for the best performance versus price, but this is lame.

Now we are going to see Intel getting expensive again instead of doing another SB move. By that I mean they won't compete on price like they did with SB.


----------



## lordikon

Intel fanboys please hold your tongue, this is a sad day even for you, as it will mean higher prices for everyone.

AMD failing to compete with Intel's best is a bad thing for all of us, even if you don't buy AMD products, nobody here should be gloating or celebrating.


----------



## Shion314

Quote:



Originally Posted by *techspot*

http://www.techspot.com/review/452-a...us/page13.html

*Final Thoughts*
Breaking down our benchmark results we find that the AMD FX-8150 offers huge performance improvements over the Phenom II range when testing with Excel 2010, where it matched the Core i5-2500K and Core i7 920 processors. Our custom WinRAR benchmark also heavily favored the FX-8150 over the Phenom II; it matched the Core i7 920 and trailed behind the Sandy Bridge processors in this test.


Ouch.... BD is a huge disappointment. I thought AMD would at least catch up to the current 2600k and 980x but not even that. Sad day for AMD....

I can't see how AMD enthusiasts can talk their way out of this.


----------



## Liquidpain

Quote:



Originally Posted by *lordikon*


Intel fanboys please hold your tongue, this is a sad day even for you, as it will mean higher prices for everyone.

AMD failing to compete with Intel's best is a bad thing for all of us, even if you don't buy AMD products, nobody here should be gloating or celebrating.


You dang right! Ivy Bridge is gonna be a premium. Good thing I got a good job.


----------



## gamer11200

Quote:



Originally Posted by *lordikon*


Intel fanboys please hold your tongue, this is a sad day even for you, as it will mean higher prices for everyone.

AMD failing to compete with Intel's best is a bad thing for all of us, even if you don't buy AMD products, nobody here should be gloating or celebrating.


This!


----------



## WizrdSleevz

Quote:



Originally Posted by *lordikon*


Intel fanboys please hold your tongue, this is a sad day even for you, as it will mean higher prices for everyone.

AMD failing to compete with Intel's best is a bad thing for all of us, even if you don't buy AMD products, nobody here should be gloating or celebrating.


This^


----------



## mickeyfuqinp

http://www.techspot.com/review/452-a...us/page13.html










edit: ninja-ed.. >_>


----------



## BallaTheFeared

I already bought my i5-2500K, so I won't be paying any more...

Not only that, Intel has kept the same pricing regardless of what AMD was doing, good or bad, for as long as I can remember.

It's using almost double the power at 4.8 GHz compared to the i5-2500K at 4.8 GHz, yet performance is lower... It's like, I don't even know what to say.


----------



## Alatar

Doesn't look good :/ The power draw is just plain ridiculous...

What I find hilarious is the currently active users viewing the thread







(can we get over 500?)


----------



## dodger.blue

Quote:



Originally Posted by *Clairvoyant129*


Are you joking? No it isn't.


I realize that most people jump straight to the gaming benchmarks, but gaming is not an aspect of a workstation cpu.

It's true that there are a couple real world benchmarks that put the 2500k slightly above the 8150, but in most cases the 8150 beats it out and loses (pretty significantly) to the 2600k.

You should check out the non-gaming benchmarks sir.


----------



## 8ight

OCN knows what time it is, uh-huh..









OT: This makes me want to laugh in the face of everyone who constantly discredited the many _very consistent_ leaks because they didn't want to accept that BD is really the Phenom II X8 and complete crap.


----------



## Flying Donkey

This is bad, really bad, let's just hope that Intel doesn't turn into a monopoly







Why AMD? Why?


----------



## yesitsmario

Quote:



Originally Posted by *lordikon*


Intel fanboys please hold your tongue, this is a sad day even for you, as it will mean higher prices for everyone.

AMD failing to compete with Intel's best is a bad thing for all of us, even if you don't buy AMD products, nobody here should be gloating or celebrating.


Agreed, I was going to wait for an ivy bridge build next year, but I don't think I'm going to be able to afford it lol.


----------



## surfbumb

I'm glad I don't have to see any more rumored Bulldozer articles on the front page after today.

The power draw on load is crazy with the 8150.

Gonna watch AMD's stock tomorrow, as well as Intel's... should be interesting.


----------



## Blameless

Quote:



Originally Posted by *lordikon*


Intel fanboys please hold your tongue, this is a sad day even for you, as it will mean higher prices for everyone.

AMD failing to compete with Intel's best is a bad thing for all of us, even if you don't buy AMD products, nobody here should be gloating or celebrating.


Not an Intel fanboy, and not seeing BD as especially disappointing (except OCed power consumption).

Still, I wouldn't mind changing the name of the thread to the "Big giant I SO told you so thread".

There is really no excuse to be surprised.


----------



## Hiep

Quote:



Originally Posted by *lordikon*


Intel fanboys please hold your tongue, this is a sad day even for you, as it will mean higher prices for everyone.

AMD failing to compete with Intel's best is a bad thing for all of us, even if you don't buy AMD products, nobody here should be gloating or celebrating.


Higher prices will just make most hold on to their processors even longer?


----------



## Dranx

I.... I don't even know what to say....

How? How in the world could you spend five years and millions of dollars in R&D and go BACKWARDS in performance?


----------



## radaja

this one graph really sends it home


----------



## WizrdSleevz

Quote:


> Originally Posted by *Alatar;15270765*
> Doesn't look good :/ The power draw is just plain ridiculous...
> 
> What I find hilarious is the currently active users viewing the thread
> 
> 
> 
> 
> 
> 
> 
> (can we get over 500?)


Lets get OVER 9000!!!


----------



## Liquidpain

Quote:


> Originally Posted by *Dranx;15270789*
> I.... I don't even know what to say....
> 
> How? How in the world could you spend five years and millions of dollars in R&D and go BACKWARDS in performance?


This.


----------



## Clairvoyant129

I think this sums it up perfectly.










Lower IPC compared to PII. Anyone looking to build AMD should pick up a Phenom instead, lol. What a sad day to be an AMD fan.


----------



## Flash99

Sorry guys, been trolling this site for a few days. Very disappointed in the BD-FX processor. Was hoping it would turn the tide a bit on the Intel chips.

There is ONE subject that I haven't seen brought up here, and it is an essential part of what this site is about: not just overclocking but unlocking.

Think about this... FX-4100 for $115.

Soon they will offer (if not now) BIOS unlocks for the 8 cores.

Think of getting an 8-core FX processor for $115. That seems like a hell of a deal to me. Worst case, if they don't unlock, you can still sell them off cheap or return them. That's a HUGE bang for the buck for me.

8 cores = $115... Sweet deal. I'll update my sig tomorrow with a pic when I have one unlocked and working.


----------



## Oedipus

Quote:


> Originally Posted by *lordikon;15270722*
> Intel fanboys please hold your tongue, this is a sad day even for you, as it will mean higher prices for everyone.
> 
> AMD failing to compete with Intel's best is a bad thing for all of us, even if you don't buy AMD products, nobody here should be gloating or celebrating.


They haven't "competed" in years. Their pricing is still fine.


----------



## sub50hz

Quote:



Originally Posted by *Shion314*


I can't see how AMD enthusiasts can talk their way out of this.


Why does everything regarding CPUs and GPUs have to end up being pushed towards some internet pissing match?

OT: Pretty underwhelming on a whole. Perhaps some kind of BIOS revision will improve.... whatever the hell it is that's causing such a huge "_meh_" degree of performance. In any case, the preliminary reviews most certainly wiggle the FX closer to my "not buying" list. I'll wait to see what happens when some members here get a hold of some of these and start tinkering.

edit: I think the biggest portion of people who _really_ wanted to see BD succeed are those who were hoping for a drop-in solution on their existing mobo to bring them closer to SB performance levels. "Fanboys" can argue all they want, while the rest of the *adults* lean back and form objective opinions based on current and upcoming information.


----------



## amd-dude

ah damn it...now im gonna have to change my name to intel-guy







..not really...i'll just buy a 965 and go to town with it


----------



## Hiep

Quote:



Originally Posted by *Flash99*


Sorry guys, been trolling this site for a few days. Very disappointed in the BD-FX processor. Was hoping it would turn the tide a bit on the Intel chips.

There is ONE subject that I haven't seen brought up here, and it is an essential part of what this site is about: not just overclocking but unlocking.

Think about this... FX-4100 for $115.

Soon they will offer (if not now) BIOS unlocks for the 8 cores.

Think of getting an 8-core FX processor for $115. That seems like a hell of a deal to me. Worst case, if they don't unlock, you can still sell them off cheap or return them. That's a HUGE bang for the buck for me.

8 cores = $115... Sweet deal. I'll update my sig tomorrow with a pic when I have one unlocked and working.


Right, like core unlocking is 100% guaranteed.


----------



## otakunorth

I can't wait until AMD finally explains what happened (won't be any time soon).
Five years for a CPU that's about equal to what they had.


----------



## jacksonv

I'm getting a 2600K tonight, because the price might just go up!

My AMD rig is for sale as of now!


----------



## Alatar

just read the canucks review...










not pretty...


----------



## dejanh

What I do not get is how AMD thought releasing this would be a good idea. I guess it was a Sword of Damocles situation: whether you release it or not, you get hammered in public opinion, and the latter was calculated to be the "lesser" hit.

It's really sad... and I feel horrible for AMD. I mean that genuinely. This chip was just not ready for prime time.


----------



## yesitsmario

Quote:



Originally Posted by *radaja*


this one graph really sends it home











wow! -_-


----------



## Diabolical999

Athlon III X8


----------



## Clairvoyant129

Quote:



Originally Posted by *sub50hz*


Why does everything regarding CPUs and GPUs have to end up being pushed towards some internet pissing match?

OT: Pretty underwhelming on a whole. Perhaps some kind of BIOS revision will improve.... whatever the hell it is that's causing such a huge "_meh_" degree of performance. In any case, the preliminary reviews most certainly wiggle the FX closer to my "not buying" list. I'll wait to see what happens when some members here get a hold of some of these and start tinkering.

edit: I think the biggest portion of people who _really_ wanted to see BD succeed are those who were hoping for a drop-in solution on their existing mobo to bring them closer to SB performance levels. "Fanboys" can argue all they want, while the rest of the *adults* lean back and form objective opinions based on current and upcoming information.


Nothing in software is causing this kind of performance; a BIOS revision won't change anything. It's obvious the modules are creating a bottleneck. AMD wanted to experiment, but it backfired.


----------



## lordikon

Quote:



Originally Posted by *BallaTheFeared*


I already bought my i5-2500k, I won't be paying anymore...

Not only that Intel has had the same pricing regardless of what AMD was doing good or bad for as long as I can remember.


Yes, but if AMD had come out with something better than what Intel had then prices of Intel CPUs would drop to compete, instead Intel doesn't have to do anything to compete. In fact, many speculations are that the upcoming Sandy Bridge-E and Ivy Bridge have been pushed back a bit because Intel has no reason to compete with their own existing line up.

Quote:



Originally Posted by *8ight*


This makes me want to laugh in the face of everyone who constantly discredited the many _very consistent_ leaks because they didn't want to accept that BD is really the Phenom II X8 and complete crap.


Nah, I would hope by now they realize how foolish they were being. If they don't they're even more hopeless than I originally thought.


----------



## anubis1127

Quote:



Originally Posted by *Liquidpain*


You dang right! Ivy Bridge is gonna be a premium. Good thing I got a good job.


^This. Although I would be surprised if Ivy Bridge cost much more than the current SB lineup. I still suspect the socket 2011 CPUs will be the high-$$ ones.


----------



## Mad Pistol

BD is a dud. It is matched (and sometimes beaten) by the old X6 1100T, AMD's previous architecture.

Suddenly, I feel like I have something special with my PhII 965BE... the last good AMD quad core.

Game over AMD. My next build will be intel based.


----------



## dodger.blue

They weren't kidding...their future is Fusion.


----------



## ToxicAdam

*PUREOC Review:*

Bulldozer arrives amidst hype and expectations. Can it live up?

http://www.pureoverclock.com/article1376.html


----------



## sub50hz

Quote:


> Originally Posted by *Clairvoyant129;15270869*
> A BIOS revision won't change anything given how BD is built.


Since you seem to be so adept at recognizing how BIOS tweaks are useless, perhaps you would care to elaborate on why that is. I'll be waiting.


----------



## otakunorth

Quote:


> Originally Posted by *Diabolical999;15270867*
> Athlon III X8


lol, if they had named it this they would have saved face


----------



## JonnyBigBoss

Quote:


> Originally Posted by *Mad Pistol;15270877*
> BD is a dud. It is matched (and sometimes beaten) by the old X6 1100T, AMD's previous architecture.
> 
> Game over AMD. My next build will be intel based.


Enjoy paying an extra arm and a leg. Spending $200+ on a desktop CPU is ridiculous.


----------



## Absauston

I honestly didn't think it was possible to decrease performance and increase power consumption...


----------



## GTR Mclaren

so...1100T at 4.0GHz is a better option ??


----------



## AK-47

I am disappoint
guess my next rig will be SB


----------



## born2bwild

Quote:


> Originally Posted by *lordikon;15270722*
> Intel fanboys please hold your tongue, this is a sad day even for you, as it will mean higher prices for everyone.
> 
> AMD failing to compete with Intel's best is a bad thing for all of us, even if you don't buy AMD products, nobody here should be gloating or celebrating.


Not really, Intel always uses the same pricing scheme:

$200 - entry enthusiast - i5 750 -> i5 2500k
$300 - mid enthusiast - i7 920 -> i7 2600k
$600 - high enthusiast - i7 970 -> i7 3930k
$1000 - extreme edition - i7 990x -> i7 3980X

Intel kept these prices even when Sandy Bridge had no competition. Intel will keep the same prices for Ivy Bridge.

I wish some of AMD hardcore "supporters" on this forum would at least admit they were wrong.


----------



## Imglidinhere

I knew this was too good to be true! It's another failure on AMD's part! Another Pentium 4 incident!

Thanks for wasting our time AMD! Thanks for NOTHING!


----------



## Sophath

Quote:


> Originally Posted by *dejanh;15270858*
> What I do not get is how AMD thought that this would be a good idea to release? I guess it was a case of the sword of Damocles...whether you release it or not, you get hammered in public opinion...I guess the latter was calculated to be a "lesser" hit.
> 
> It's really sad...and I feel horrible for AMD. I actually mean that genuinely. This chip was just not ready for prime-time.


They couldn't stop it. After all their marketing, they needed to release it to the public. That's why they delayed it so much.


----------



## Blameless

Quote:



Originally Posted by *Clairvoyant129*


I think this sums it up perfectly.

Lower IPC compared to PII.  Anyone looking to build AMD, should pick up Phenom instead lol. What a sad day to be AMD fans.



Quote:



Originally Posted by *radaja*


this one graph really sends it home


I disagree. Per-core IPC is not everything.

Dollar for dollar, the 8-core BDs are appreciably better than Phenom II, and fairly comparable to Sandy Bridge in multi-threaded tasks.

Quote:



Originally Posted by *dejanh*


What I do not get is how AMD thought that this would be a good idea to release?


They couldn't not release it. They need to sell CPUs to stay in business.

Better to disappoint those with unrealistic expectations and get what they can for what they've got than to take the whole thing as a complete loss.


----------



## allupinya

wow look at that max temp


----------



## redalert

Quote:



Originally Posted by *dejanh*


What I do not get is how AMD thought that this would be a good idea to release? I guess it was a case of the sword of Damocles...whether you release it or not, you get hammered in public opinion...I guess the latter was calculated to be a "lesser" hit.

It's really sad...and I feel horrible for AMD. They have been my brand of choice for many years but this is not something that I can just look past.


They should have just released it back in June and moved on to BD-E ASAP instead of dragging this out. The power draw at full load is just awful.


----------



## Mad Pistol

Quote:



Originally Posted by *JonnyBigBoss*


Enjoy paying an extra arm and a leg. Spending $200+ on a desktop CPU is ridiculous.


The performance increase (especially in gaming) is justified. Intel's Ivy Bridge is looking to drive a wooden stake into the heart of AMD. I hope this is not the case.


----------



## xPwn

This is very unexpected. Well, i7 2600k, here I come!

EDIT: I would also like to see JF-AMD's statement on this, as he was bragging that BD would have awesome performance and whatnot...


----------



## Sophath

Quote:



Originally Posted by *JonnyBigBoss*


Enjoy paying an extra arm and a leg. Spending $200+ on a desktop CPU is ridiculous.


BD was going to be over $200 anyway


----------



## BallaTheFeared

Quote:



Originally Posted by *lordikon*


Yes, but if AMD had come out with something better than what Intel had then prices of Intel CPUs would drop to compete, instead Intel doesn't have to do anything to compete. In fact, many speculations are that the upcoming Sandy Bridge-E and Ivy Bridge have been pushed back a bit because Intel has no reason to compete with their own existing line up.



No they wouldn't have, their pricing structure has been unchanged since the dawn of time.

Even when AMD's FX processors were beating them, they still had $1000 EEs that were slower than AMD's $200 chips.

Intel is a machine, their price structure is unchanging.

When SB i5 came out it went right into the price slot the 760 had, and the 750 had before that. AMD had nothing to compete with SB, Intel didn't gouge.


----------



## lordikon

Quote:



Originally Posted by *JonnyBigBoss*


Enjoy paying an extra arm and a leg. Spending $200+ on a desktop CPU is ridiculous.


You should realize what forum you're on; people here regularly spend $2,000+ USD on computers. I see machines with an i7 990x and 4 GTX 580s in them. Spending over $200 for a desktop CPU is the norm around here. Also, not everyone here buys CPUs to play games with; spending $1000 on a 990x pays for itself in about 2 months due to the time it saves me.

Quote:



Originally Posted by *allupinya*


wow look at that max temp


Forget the temps, look at the lithography! At 0nm, I would've figured on a max temp right around ambient.


----------



## Raven.7

Wow, this is looking U.G.L.Y.

Thank god I bought an X6 instead of waiting.


----------



## otakunorth

Quote:



Originally Posted by *Sophath*


They couldn't stop it. After all their marketing, they needed to release it to the public. That's why they delayed it so much.


The thing is, Bulldozer is a large architectural change. How did it end up so badly? That's what I want to know.

Hopefully it ends up like the Phenom I did.


----------



## fatmario

Glad I bought my i5 2500k back in January 2011 from Microcenter when it was on sale. I am not regretting my purchase right now.

I can't wait to rub these benchmarks in my friend's face; he's been waiting for Bulldozer for so long.

Power consumption is ridiculous. :eek:
Bulldozer overclocked to 4.6GHz still can't keep up with a stock-clocked 2500k/2600k in benchmarks.


----------



## dejanh

Quote:



Originally Posted by *Sophath*


They couldn't stop it. After all their marketing, they needed to release it to the public. That's why they delayed it so much.


Yeah, I know. That's why I called it a sword of Damocles. You are damned if you do and damned if you don't. Not releasing would probably have been the final straw, so this is better than nothing.

Quote:



Originally Posted by *otakunorth*


The thing is, Bulldozer is a large architectural change. How did it end up so badly? That's what I want to know.

Hopefully it ends up like the Phenom I did.


I'm looking at the engineers as well, wondering: did nobody run a true test on this sucker along the way? Ultimately, though, maybe we are seeing AMD pulling a Press(hot) on us all...maybe the next iteration will really blow the lid off all of our expectations.


----------



## bojinglebells

Quote:



Originally Posted by *Hiep*


Isn't it still $179.99? http://www.microcenter.com/single_pr...uct_id=0354589


it was $149 for two days about a week ago.


----------



## Siegfried262

Well darn unless something dramatically changes over the next few months it looks like I'll be getting a 2500k come Christmas time. It's more suited to my situation and uses. I'll have to make the drive out to the Microcenter by Detroit.

It will be interesting to see how Piledriver competes with Ivy Bridge, time will tell though.


----------



## Check101

Something just doesn't make sense here... How can you go backwards in performance? Maybe we need more thorough benchmarks, or a more thorough explanation from AMD... Something just isn't right...


----------



## surfbumb

high tech legion gave it a gold award?


----------



## dodger.blue

Quote:


> Originally Posted by *born2bwild;15270913*
> Not really, Intel always uses the same prices:
> 
> $200 - entry enthusiast - i5 750 -> i5 2500k
> $300 - mid enthusiast - i7 920 -> i7 2600k
> $600 - high enthusiast - i7 970 -> i7 3930k
> $1000 - extreme edition - i7 990x -> i7 3980X
> 
> Intel kept these prices even when Sandy Bridge had no competition.
> 
> I wish some of AMD users on this forum would at least admit they were wrong.


It's true that Intel is not the same company it once was when it screwed the Tech Industry. Even more so, it's true that AMD is certainly _not_ the same company it was when Jerry Sanders was CEO.


----------



## redalert

nvm


----------



## AK-47

I didn't think it was possible to go backwards


----------



## 996gt2

*Mind=Blown*


----------



## Clairvoyant129

Quote:


> Originally Posted by *sub50hz;15270902*
> Since you seem to be so adept at recognizing how BIOS tweaks are useless, perhaps you would care to elaborate on why that is. I'll be waiting.


Do you really believe a BIOS revision will suddenly turn around the 30%-50% performance deficit (vs. the comparable Intel CPU)? The cores are narrower and share an FPU. These modules are obviously creating a bottleneck.

4-core BD is slower than a 4-core PII... sometimes by a significant amount. If you want to believe in the magical pony BIOS that will change those numbers, go right ahead and waste your money on a failing product.


----------



## dejanh

Quote:



Originally Posted by *Check101*


Something just doesn't make sense here... How can you go backwards in performance? Maybe we need more thorough benchmarks, or a more thorough explanation from AMD... Something just isn't right...


I would not be surprised if we get some statements in the coming days from AMD. I simply cannot see them just sitting on the sideline going "I think that went well"...however, see my comment below.

Quote:



Originally Posted by *AK-47*


I didn't think it was possible to go backwards


Sometimes in life you need to take one step back to go two steps forward. Don't underestimate the value of that lesson, regardless of whether you want to apply it to AMD (in this case) or life in general.


----------



## iamwardicus

Overall I'm glad it's out; however, I am disappointed in the performance numbers. AMD only having 4 FPUs seems to be the major limiting factor in some of the benchmarks. The power consumption disappoints me. Overall I think it's a bit lackluster for the enthusiast crowd, but for business uses it ought to work out fairly well.

I'm glad they are taking a step forward in some areas, and software needs to be coded to take advantage of more cores for BD to really show its stuff. I'm also glad I didn't upgrade to BD; from the performance numbers, I think I'll wait for Ivy Bridge to be released and either snag some 1366 hardware or save the extra for actual IB, depending on what it can do.


----------



## allikat

The results point at it being priced perfectly. It competes with the i5 2500k on price and performance. It clocks as well as the 2500k, and in many cases, further.

Solid, but not exactly world shattering. But it's shown AMD is capable of competing with Intel.


----------



## KittensMewMew

I wanted it to be good, and waited patiently for these reviews, but AMD just gets bear-mauled at the price point the 8150 is set at. Seriously, you will only buy it if you have either already invested $150+ in a 990FX mobo and want to get some performance out of it, or have recently been inception'd into blindly believing AMD can do no wrong. Because a 2500K is better. Hands down.

edit: 
Quote:



Originally Posted by *allikat*


The results point at it being priced perfectly. It competes with the i5 2500k on price and performance. It clocks as well as the 2500k, and in many cases, further.

Solid, but not exactly world shattering. But it's shown AMD is capable of competing with Intel.


Look at anything other than games. Go look at the Hardwarecanucks review.


----------



## AK-47

Quote:



Originally Posted by *redalert*


microcenter has the FX for sale but not the x8 only the x4

http://www.microcenter.com/single_pr...uct_id=0366948
http://www.microcenter.com/single_pr...uct_id=0366949


isn't that fusion?


----------



## Check101

The only way the 8150 will sell is if it's priced between 2500k and 2600k. It seems to outperform the 2500k, barely, not so much the 2600k...

The extra cores may come in handy for certain apps tho... so maybe it could be a success, just not as big as one people were expecting... But power draw is ridiculous.


----------



## redalert

Quote:



Originally Posted by *AK-47*


isn't that fusion?


yeah I saw the bright red box and thought it was FX lol


----------



## xPwn

Quote:



Originally Posted by *996gt2*


*Mind=Blown*


DUDE! That's more than my GPU? HOW?


----------



## Clairvoyant129

Quote:



Originally Posted by *iamwardicus*


Overall I'm glad it's out; however, I am disappointed in the performance numbers. AMD only having 4 FPUs seems to be the major limiting factor in some of the benchmarks. The power consumption disappoints me. Overall I think it's a bit lackluster for the enthusiast crowd*, but for business uses it ought to work out fairly well.*


No way. With that kind of power consumption/performance, most firms and businesses will go nowhere near BD. Why should the firm I work for deploy thousands of BD-based platforms/servers so we can pay higher electricity bills while getting significantly less performance?


----------



## dlee7283

Quote:



Originally Posted by *Prox*


That's all nice and stuff, except the fact that you're wrong of course.


http://www.anandtech.com/bench/Product/362?vs=50

The Phenom II X4 is better value than the Q9550, bottom line.

As for Bulldozer, hopefully it doesn't bottleneck like the X6 does in high-end SLI/Xfire setups.


----------



## surfbumb

Quote:



Originally Posted by *allikat*


The results point at it being priced perfectly. It competes with the i5 2500k on price and performance. It clocks as well as the 2500k, and in many cases, further.

Solid, but not exactly world shattering. But it's shown AMD is capable of competing with Intel.


did you even read the reviews lol


----------



## MASSKILLA

AMD why not just kick my teeth in!!
Waited all this time for something you lied about!!
AMD never again will i buy a AMD system!!


----------



## UNOE

Quote:


> Originally Posted by *996gt2;15271025*
> *Mind=Blown*


That is quite a bit more than the 990X as well.


----------



## Clairvoyant129

Quote:


> Originally Posted by *dlee7283;15271084*
> http://www.anandtech.com/bench/Product/362?vs=50


You do realize you're comparing a PII with almost 1GHz clock speed advantage. At the same clock speeds, PII is slower than Core 2 Quads.

Quote:


> Originally Posted by *allikat;15271046*
> The results point at it being priced perfectly. It competes with the i5 2500k on price and performance. It clocks as well as the 2500k, and in many cases, further.
> 
> Solid, but not exactly world shattering. But it's shown AMD is capable of competing with Intel.


You're joking right?


----------



## Megacharge

Cool, now I know to stick with my 1055T until AMD perfects this architecture. This saves me money!


----------



## Sophath

Quote:


> Originally Posted by *dlee7283;15271084*
> http://www.anandtech.com/bench/Product/362?vs=50


But the Phenom was released later too, and is clocked nearly 1GHz higher.


----------



## UbNub

You guys are acting like these won't sell. Most of us sure won't buy one, unless it is just to overclock, but the real thing that gives it an advantage in the mainstream is "8-core processor." That is all they'll need to sell a computer with one of these at a place like Best Buy.


----------



## lordikon

Quote:


> Originally Posted by *dlee7283;15271084*
> http://www.anandtech.com/bench/Product/362?vs=50


You're comparing a 3.5 year old Intel CPU against a 5 month old AMD CPU? And the AMD CPU is just a notch faster.


----------



## ZealotKi11er

So many things to say.
1) We thought the PII X6 was slow, but apparently it trumps BD.
2) AMD, I am disappointed. If this is all you've got, maybe I'll have better luck getting Nvidia cards next time.
3) Been waiting to go AMD ever since my amazing Athlon 64 3700+. Got a Q6600, Q9550, 920 and now a 2500K, and still no hope.
4) It's an 8-core CPU which is as fast as a 4-core from 3 years ago.
5) We all thought PI was bad, but this is horrid.
6) I don't know what to say; maybe AMD is preparing for eternal sleep.


----------



## xPwn

Quote:



Originally Posted by *MASSKILLA*


AMD why not just kick my teeth in!!
Waited all this time for something you lied about!!
AMD never again will i buy a AMD system!!


Dude, AMD would have been better off selling a Phenom III X6/X8 3.8GHz factory-overclocked processor; I would have bought one.

Quote:



Originally Posted by *lordikon*


You're comparing a 3 year old Intel CPU against a 5 month old AMD CPU? And the AMD CPU is just a notch faster.


Yes, because they are wrong. Sadly, I wish they were right.


----------



## WizrdSleevz

Quote:



Originally Posted by *redalert*


nvm


I believe those are the APU's.


----------



## AK-47

Quote:



Originally Posted by *Clairvoyant129*


You do realize you're comparing a PII with almost 1GHz clock speed advantage. At the same clock speeds, PII is slower than Core 2 Quads.


pretty sure it's about equal


----------



## potsherds

Quote:



Originally Posted by *BallaTheFeared*


No they wouldn't have, their pricing structure has been unchanged since the dawn of time.

Even when AMD's FX processors were beating them, they still had $1000 EEs that were slower than AMD's $200 chips.

Intel is a machine, their price structure is unchanging.


And this is why I dislike Intel. And why AMD's fantastic failure with this chip really sucks. Is it unreasonable to be worried about the future of the enthusiast market and the possibility that AMD will drop out of it?


----------



## Plex

Quote:



Originally Posted by *allikat*


Solid, but not exactly world shattering. But it's shown AMD is capable of competing with Intel.


Wait, what? I'm no fanboy or anything, but what reviews are you reading? I just went through about 7 of them from beginning to end and the conclusion on each one said something like:

"This is a flop. Even as this price, there's no reason to buy this AMD over this xx Intel. Intel has won this round."

The best FX chip randomly inched ahead in some random benches, but other than that, it got completely thrashed by the 1-year-old Sandy. It barely beat the 3-year-old 1366 chips (and in some cases lost), and on some occasions their own 1100T was ahead. From what part of all of this do you get "at least we know AMD can compete"?

Man, I wanted AMD to knock this out of the park. I really did. But that just didn't happen. Not at all.


----------



## Megacharge

It'll probably be "uncomfortable" for JF posting in here for a little while.


----------



## Clairvoyant129

Quote:



Originally Posted by *AK-47*


pretty sure it's about equal


At the same clock speeds, PII is about 3%-5% slower than Core 2 Quad 45nm Yorkfields. They are about the same or a bit faster than Core 2 Quad 65nm Kentsfields. You can look it up on Anandtech CPU benches.

But obviously PIIs are significantly cheaper so there is no reason to buy C2Qs unless you're upgrading.


----------



## dlee7283

Quote:



Originally Posted by *lordikon*


You're comparing a 3 year old Intel CPU against a 5 month old AMD CPU? And the AMD CPU is just a notch faster.


I could have easily chosen the 965BE and it would still be a better value than buying a Q9550. You can go to Microcenter and get a combo deal with AMD, and the margin is even larger....

Just give credit where credit is deserved, quit being an Intel fanboy.


----------



## Kvjavs

I woke up and read a few reviews... Feel better about looking to buy an i3 2100 now.


----------



## CJRhoades

Read through the Guru3D, Hardware Canucks, and TechSpot reviews. What can I say.... AMD, I'm disappointed. I'm really tired of my current rig and was planning to sell it and build a new one when Bulldozer came out, but now I don't know what to do.


----------



## wermad

Really disappointed. Just going to go w/ a 955BE for my amd budget build


----------



## Prox

Quote:



Originally Posted by *dlee7283*


I could have easily chosen the 965BE and it still be a better value than buying a Q9550.....


I bought this CPU in 2008. Your argument that I overpaid for it is void.


----------



## Shiobock

Hate to say it, but AMD really shot themselves in the foot with all this. I'm happy I didn't wait and got the 1055T, although an i5 2500 would've been very good too if I had waited for that.


----------



## Clairvoyant129

Quote:


> Originally Posted by *dlee7283;15271172*
> I could have easily chosen the 965BE and it still be a better value than buying a Q9550.....
> 
> Just give credit where credit is deserved, quit being an Intel fanboy.


You realize you're comparing a CPU with almost 1GHz clock speed advantage? That's horrible. Of course 965 is a better value, LGA 775 platform is dead.


----------



## alawadhi3000

Those numbers looks horrible.
I've lost the tiny bit of hope I had in AMD.


----------



## Megacharge

Quote:


> Originally Posted by *CJRhoades;15271192*
> Read through the Guru3D, Hardware Canucks, and TechSpot reviews. What can I say.... AMD, I'm disappointed. I'm really tired of my current rig and was planning to sell it and build a new one when Bulldozer came out, but now I don't know what to do.


Just keep it until AMD makes Bulldozer right.


----------



## Blameless

Quote:


> Originally Posted by *Check101;15270989*
> Something just doesn't make sense here... How can you go backwards in performance? Maybe we need more through benchmarks, or a more thorough explanation from AMD... Something just isn't right...


Performance is only "going backwards" in a handful of cases, and this should not surprise anyone.

Basic architectural changes have been known for a very long time, and one of those fundamental changes was that each BD core is narrower, which almost always means lower IPC in at least some scenarios.

Also, this sort of thing is not unprecedented. If you can remember back to late 2000, when the Pentium IV was released at 1.3 and 1.4GHz, you will recall that they were usually slower than 1GHz Pentium IIIs.

Just like the PIII was in 2000, the Phenom II has reached a dead end. BD's first incarnation certainly has its pros and cons, but the changes are generally positive and will pave the way for more options in the future.


----------



## xPwn

Quote:


> Originally Posted by *UbNub;15271125*
> You guys are acting like these won't sell. Most of us sure wont buy one, unles it is just to overclock, but the real thing that gives it an advantage in mainstream... "8 core processor". That is all they'll need to sell a computer with one of these at a place like Bestbuy.


Dude, wanna be my buddy? Let's join forces and go to Best Buy stores, and right before people buy an overpriced PC we go like "HEY!!! THIS ONEZZ BETTER MORE C00r2" and watch the employees be like "HEY!!! NO THIS HAZ MORE GIGGLEHUTS". Then we pull out some benchmarks that we printed beforehand.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless;15271209*
> Performance is only "going backwards" in a handful of cases, and this should not surprise anyone.
> 
> Basic architectural changes have been known for a very long time, and one of those fundamental changes was that each BD core is narrower, which almost always means lower IPC in at least some scenarios.
> 
> Also, this sort of thing is not unprecedented. If you can remember back to late 2000, when the Pentium IV was released at 1.3 and 1.4GHz, you will recall that they were usually slower than 1GHz Pentium IIIs.
> 
> Just like the PIII was in 2000, the Phenom II has reached a dead end. BD's first incarnation certainly has its pros and cons, but the changes are generally positive and will pave the way for more options in the future.


So going by P4 logic, BD is going for 6GHz?


----------



## TaNgY

imma get the next set of Bulldozer; seems fine to me. Too many people expect AMD to take over from Intel with a new arch lol..... it's just not doable. They are on the right track though: drop the power usage and increase its power in certain areas and I'd rather have an AMD product. Intel needs to push its big chips or they will end up like Ferrari in F1.


----------



## Chiefpuff420

I think a part of me died somewhere in this thread.

Man, I can't believe the scores aren't higher, or that there isn't at least some better pricing... wow. I mean, don't ya think that BD chip needs to be like $160-175 MAX if they plan on competing? I can pick up an i5 2500k for $180 like I did once before. So if this is what's going to go down, I'm going to have a lot of computers to sell. And I think the whole world just gave up on AMD.

---from an AMD fan


----------



## Disturbed117

Looks like after my X6 is dead, I'm done with AMD.


----------



## JMCB

=(

I kind of expected this. BTW, if you see any deals on AMD Phenom II X6 3.2 GHz, this might be a decent 'upgrade' for people waiting on bulldozer. =/


----------



## Megacharge

I'm just going to hold on to my 1055T; it's fine for my needs, and I'll give AMD time to evolve this architecture. Given some time, die shrinks, and a few modifications, this new arch will be a monster.


----------



## Derek1387

Wow... and i just bought an AM3+ board hoping BD would be....good....

***.


----------



## ArtistDeAlec

So disappointing.

I was still holding out for SB-E/IB to upgrade, but had hope that Bulldozer would sway me back over to AMD. Their GPU line has let me down and I don't plan to go back anytime soon either. If there isn't some competition out there, these big companies are going to end up milking us for all our cash pretty soon.


----------



## chasent

Wow, that was disappointing.


----------



## anubis1127

Quote:



Originally Posted by *dlee7283*


I could have easily chosen the 965BE and it still be a better value than buying a Q9550. You can go to Microcenter and get a combo deal with AMD, the margin is even larger....

Just give credit where credit is deserved, quit being an Intel fanboy.


And you can't go to Microcenter and buy a q9550, so why even mention it?


----------



## ZealotKi11er

Next is probably a 12-core version at a stock 4.6GHz that just barely beats the 2600K in multi-tasking.


----------



## xPwn

Quote:



Originally Posted by *disturbed117*


Looks like after my x6 is dead Im done with amd.


I'll gladly, ummm, recycle *cough* [email protected] *cough* with it.

Quote:



Originally Posted by *anubis1127*


And you can't go to Microcenter and buy a q9550, so why even mention it?


Yeah, seriously, it's like comparing a P4 to a Core 2 chip. We all need to take a chill pill lol


----------



## Kaze105

AMD slides from a previous thread indicated Bulldozer crushing SB (at least the graphs seemed so), but this is a disappointment. I was hoping to change to AMD due to the cheaper CPU and motherboard, but this just doesn't seem worth it. Guess I'll just wait again, this time for IB.


----------



## Shodhanth

Daaayyyymn.

Talk about disappointment.
Well, guess I'll just get a 990FX board for its good looks and upgrade to a 1090T when I feel underpowered. :/


----------



## qwertymac93

Anandtech: notably missing...


----------



## MGF Derp

Bulldozer is crap


----------



## redalert

Quote:



Originally Posted by *TaNgY*


imma get the next set of Bulldozer; seems fine to me. Too many people expect AMD to take over from Intel with a new arch lol..... it's just not doable. They are on the right track though: drop the power usage and increase its power in certain areas and I'd rather have an AMD product. Intel needs to push its big chips or they will end up like Ferrari in F1.


People are disappointed in the power consumption and the fact that Phenom II beats it in a lot of benchmarks. I expect the BD-E CPUs to be a lot better.


----------



## lordikon

Quote:



Originally Posted by *dlee7283*


I could have easily chosen the 965BE and it still be a better value than buying a Q9550. You can go to Microcenter and get a combo deal with AMD, the margin is even larger....

Just give credit where credit is deserved, quit being an Intel fanboy.


Then you'd still be comparing a 3.5 year old CPU to a 2.2 year old CPU. Maybe you should re-read this thread before calling me an Intel fanboy. I owned AMD CPUs when they were on top, I'm a performance fanboy, nothing more.

You shouldn't call people fanboys whenever they don't agree with you.

I'm not going to argue that you can't buy a decent AMD system at a low price, because that's true, but for 20% more you can buy a much faster system right now by getting an Intel.


----------



## Badness

Quote:



Originally Posted by *MGF Derp*


Bulldozer is crap


Yeah, pretty much. I'll probably end up buying it too. Same thing I said about fermi. What is wrong with me :S


----------



## CJRhoades

Quote:



Originally Posted by *Megacharge*


Just keep it until until AMD makes Bulldozer right.


I have a feeling my rig isn't going to be worth much by then.


----------



## Plex

http://www.youtube.com/watch?v=8rDwX...ayer_embedded#!

AMD's official Youtube video says otherwise! Haha.


----------



## Lucky 13 SpeedShop

Wow, this is just sad. :/ I'd have been happy with an across-the-board increase in performance, but there's no way I'm spending that much on an 8150 just to lose out on single-thread performance. Not to mention what it would do to my power bill...

Now where did I hear the words "IPC increases" from, John? As for earlier comments saying this was an X8 Ph. II: not hardly; that would perform better overall. In point of fact, it looks like they'd have been better off just adding 4 cores to Llano and deleting the GPU from the die.

This is a pretty massive failure. I'm fairly disgusted I wasted my time and money on this...


----------



## staryoshi

It's going to be quite a while before threading beyond 4 cores is truly relevant to mainstream users (and gamers). As such, I would have much rather seen per-core improvements than additional "cores." Hopefully the next generation of BD processors will address this.


----------



## Evil Penguin

It would be sad to even call this a Phenom III x8. 
This is bad news for everyone here in my opinion.

AMD should have never brought the FX name back from the grave on this POC.


----------



## dlee7283

Quote:



Originally Posted by *Clairvoyant129*


You realize you're comparing a CPU with almost 1GHz clock speed advantage? That's horrible. Of course 965 is a better value, LGA 775 platform is dead.


They both can reach 4GHz at the end of the day,

except the Phenom II has always been significantly cheaper than a Q9550... take that into consideration.

Take the $100 or so you saved by going AMD, and you have a nice GPU upgrade.

And we are not even talking about the combo deals Microcenter has with the Phenom X4..

LGA775 is just as alive as AM3; they both are still selling right now.....

As for the 2500K, it spanks the Phenom X4/X6 but that does not fit the same market segment.


----------



## phibrizo

It's right where I expected it: not completely crushing SB, but not getting stomped. It's a new architecture; I'll let it mature and see where it goes before I jump on it.


----------



## Sophath

Quote:



Originally Posted by *Plex*


http://www.youtube.com/watch?v=8rDwX...ayer_embedded#!

AMD's official Youtube video says otherwise! Haha.


And Intel's i7 980X scores 5.41 on Cinebench in there


----------



## Captain Han

I am deeply, deeply saddened by what AMD has done. They've been non-competitive ever since Core 2 Duo came out in 2006, and Bulldozer was supposed to be their savior.

I am not so much pissed about the actual product, but as a technology enthusiast I am deeply saddened that such a big microprocessor company is putting out such a crap offering with such poor business execution, not to mention I believe they've just killed their FX brand. They are not pushing the industry forward, and that is what I am mostly pissed about.

For the past 5 years, AMD has been competing on price rather than performance. They have basically established themselves as a 'value brand', not a premium or industry-leading brand. They have failed in every aspect to deliver a competitive CPU.

Might I remind you, Trinity is also gonna be based on a Bulldozer + Northern Islands core. So you know APU performance is gonna be crap.

AMD can now only compete on Fusion and discrete graphics. But they don't have much market share there to begin with.

I hope Rory Read does something drastic, otherwise AMD is going down.


----------



## dmasteR

So umm, what happened to the people saying those leaked benchmarks weren't real?

On a serious note, this is pretty disappointing. Guess there won't be a whole lot to think about when it comes to price/performance between Sandy Bridge and Bulldozer anymore.


----------



## Sin0822

Quote:



Originally Posted by *Evil Penguin*


It would be sad to even call this a Phenom III x8. 
This is bad news for everyone here in my opinion.

AMD should have never brought the FX name back from the grave on this POC.


agreed


----------



## Mattousai

I was really hoping that BD would at least compete with SB. Competition is a win for us, the consumers.

Sadly this is not the case. I just hope Intel doesn't feel the need to gouge us on Ivy.

I hold no loyalties to any company, because IMO that's just silly. That being said, my next upgrade will be Intel.

So disappointed.


----------



## xd_1771

I'm curious,
I am very, very curious
if any of the reviewers bothered to overclock the CPU-NB.


----------



## Velcrowchickensoup

Next board and cpu: 
ASrock Extreme 7,
2700k.

Been fun AMD...


----------



## supersaiyenx

I was expecting to see slightly better numbers than the ones presented... it is disheartening to see a Thuban matching/beating it in some cases. I hope AMD can take this as a lesson learned and improve it as best they can.

I'm in no big hurry to upgrade, so I'll wait until BD-E/Piledriver to decide on my next upgrade.


----------



## BrutusMaximus

This post is going to get heated! I'll admit it's quite a letdown, but I wasn't expecting it to compete with a 2500K/2600K, at least not at that price. What blew me away is that they're being beaten by their last generation's chips; that was a shocker to me. I also know that now that AMD has shown their cards, SB-E will be super expensive; seeing how they have no competition, I'm expecting Intel to charge $1k for those new chips. The true downside to AMD losing the performance race is that Intel users are going to get screwed by paying too much because of the lack of competition.


----------



## mav451

Nobody was expecting crushing, or even equal, but within 10% would have been nice. As had been said before, the consistency of the "leaks" was never encouraging.

What really caught me off guard is how utterly awful the power consumption is. It truly adds insult to injury. chew of XS had even brought up CPU-NB OCing as an avenue to explore; this was fairly early (August?)...but with BD being SO far behind, it's really not gonna do much good.

People keep harping on price - um, MC's Intel specific deals have made this a moot point. AMD simply can't compete with the $149/$179 i5 2500K deals.

@Evil Penguin - I'd rather call this an Athlon III X8. At least the expectations would be more hedged lol.


----------



## TaNgY

Quote:



Originally Posted by *redalert*


People are disappointed in the power consumption and the fact that the Phenom II beats it in a lot of benchmarks. I expect the BD-E CPUs to be a lot better


Totally agree with you.


----------



## pioneerisloud

Quote:



Originally Posted by *xd_1771*


I'm curious,
I am very, very curious
if any of the reviewers bothered to overclock the CPU-NB.


You know what, that's a VERY good question. We all know how well Phenom II responded to NB clocks; perhaps it's the same thing with BD?


----------



## timmay556

gg amd


----------



## Kirby1

I do not envy chimp with his task of overseeing this thread.


----------



## linkin93

Quote:



Originally Posted by *xd_1771*


I'm curious,
I am very, very curious
if any of the reviewers bothered to overclock the CPU-NB.


And if they had the latest BIOS

And CPU drivers


----------



## ZealotKi11er

Quote:



Originally Posted by *staryoshi*


It's going to be quite a while before threading beyond 4 cores is truly relevant to mainstream users (and gamers). As such, I would have much rather seen per-core improvements than additional "cores." Hopefully the next generation of BD processors will address this.


I will probably end up in my 80s before AMD ever beats Intel. Three things have to happen at the same time for AMD to get on top: they release a CPU that's very fast, Intel fails with their next CPU, and the market takes a different approach to CPUs.


----------



## staryoshi

Quote:



Originally Posted by *pioneerisloud*


You know what, that's a VERY good question. We all know how well Phenom II responded to NB clocks; perhaps it's the same thing with BD?


As if the platform needs an excuse to consume more power







From a pure performance aspect that is a valid question, though.


----------



## mav451

Quote:



Originally Posted by *pioneerisloud*


You know what, that's a VERY good question. We all know how well Phenom II responded to NB clocks; perhaps it's the same thing with BD?


I realize that, but is AMD seriously expecting users to overclock both the core clock and the CPU-NB just to match a _stock_ SB?


----------



## JE Nightmare

i only care about the folding power.


----------



## Segovax

So they didn't "bulldoze" Intel, big deal.

AMD is still doing what they always do; offering excellent performance at a reasonable price.

This coming from someone who has owned almost every high end consumer CPU since 2005 excluding 980/990X and QX9650.

Little kids are freaking out like usual, over nothing. What's next another thread about BF3 beta actually having bugs?


----------



## andydam

Quote:



Originally Posted by *Derek1387*


Wow... and i just bought an AM3+ board hoping BD would be....good....



yeah same here...


----------



## Check101

Although I am disappointed, my disbelief is still high, so I will wait until the processors are released and several thorough benchmarks are done before I make the decision on whether or not to put this in my future rig...


----------



## G3RG

Apparently it folds on par with a Phenom II X6 too.


----------



## HWI

Quote:



Originally Posted by *phibrizo*


It's right where I expected it: not completely crushing SB, but not getting stomped.


You must have read some different reviews than me, 'cause in the ones I read it got stomped by the 2600K.

Quote:



Originally Posted by *xd_1771*


I'm curious,
I am very, very curious
if any of the reviewers bothered to overclock the CPU-NB.


Good question, I bet they didn't. I'd definitely be interested to see how much that increases performance.


----------



## Faster_is_better

The response in this thread doesn't sound too good. We need some OCNers to get ahold of these chips and OC/bench them. I'd like to see those threads.


----------



## Badness

Quote:



Originally Posted by *xd_1771*


I'm curious,
I am very, very curious
if any of the reviewers bothered to overclock the CPU-NB.


I don't think that does anything for games though.


----------



## uncholowapo

Looks like my only options now are to get the quad core, a PII X4/X6, or just dump AMD forever. There's no way I'm paying over $100 at this point for an AMD processor. They just lost another user.


----------



## xPwn

Quote:



Originally Posted by *Sophath*


And intel's i7 980x scores 5,41 on cinebench in there


Dude, my brother is spamming all of AMD's videos with this: *"Well, have you seen the latest benchmarks? BD at 5GHz sucks over 520W of power alone! Also, worse single-thread performance than Phenom II."*


----------



## matt1898




----------



## Armand Hammer

Just reading through the Vortez and [H]ardOCP reviews, and it looks like our old friend OBR was right all along; this thing stinks big time.

Sadly it looks like AMD have thrown in the towel and we're all going to pay for it.

Surely this will go down as one of the epic fails in tech history?


----------



## robbo2

Quote:



Originally Posted by *xd_1771*


I'm curious,
I am very, very curious
if any of the reviewers bothered to overclock the CPU-NB.


I'm interested to see el gappos review. At least I have faith in his clocking abilities.


----------



## ZealotKi11er

Quote:



Originally Posted by *dlee7283*


WRONG, WRONG, WRONG

The Phenom I was the only thing that couldn't compete with Intel.

The Athlon II X2 is better than an E7200 and priced better, and you could get a combo deal with the X2 as well

The Phenom II X2 is better than the E8400

The Athlon II X3 is better value than the Pentium G6950

The Athlon II X4 is on par with a 1st-gen i3 until the i3 is overclocked, except you can get an X4/motherboard for the price of one i3

The Phenom II X4 is better value than a Q9550

The Phenom II X6 was on par with the i7 920, especially for gaming, but the i7 can handle higher-end GPU setups a lot better.

AMD got its ass kicked when Sandy Bridge came out; up until that point it was competitive except for 2008.


AMD was never on par with Intel, especially the Core i series. Unless the game was completely GPU-bound and running at 1600p or higher, then yes. The 920 destroys the X6 in gaming and multi-GPU setups.


----------



## G3RG

Quote:



Originally Posted by *xPwn*


Dude, my brother is spamming all of AMD's videos with this: *"Well, have you seen the latest benchmarks? BD at 5GHz sucks over 520W of power alone! Also, worse single-thread performance than Phenom II."*










You do realize that just makes you look like a childish fanboy right?


----------



## Kvjavs

Quote:



Originally Posted by *xd_1771*


I'm curious,
I am very, very curious
if any of the reviewers bothered to overclock the CPU-NB.


Anandtech still hasn't released their benchmarks... and I remember they were one of the first to discover the CPU-NB's relation to performance when overclocking...

Only time will tell. I hope this is true; then maybe Bulldozer won't be a COMPLETE failure.

Quote:



Originally Posted by *Badness*


I don't think that does anything for games though.


Oh yes it does.


----------



## KittensMewMew

Quote:



Originally Posted by *xd_1771*


I'm curious,
I am very, very curious
if any of the reviewers bothered to overclock the CPU-NB.


No, because they tested against stock Sandy Bridge systems. I know people here on the forums are going to get these CPUs and overclock the crap out of them, and we'll see results more in line with what we were expecting, but that doesn't change the fact that Bulldozer's IPC is down from Phenom II. Which is saddening.

The Overclockers Club review, whenever it comes, should be good. They seem to grasp the concept that on AMD mobos the NB needs to be overclocked along with the CPU in order to get good results.


----------



## xPwn

Quote:



Originally Posted by *Armand Hammer*









Just reading through the vortez and HardOcp reviews and it looks like our old friend OBR was right all along, this thing stinks big time.

Sadly it looks like AMD have thrown in the towel and we're all going to pay for it.

Surely this will go down as one of the epic fails in tech history?


I remember writing in my sig that BD would suck, and I received an infraction for "not having proof"







Well, I do have to admit that was childish. I should think more of people.


----------



## mav2000

Well, AMD has screwed us over in more ways than one... I guess they could not say outright that BD was not going to compete with even an 1100T P II. I guess JF-AMD could not say that, as he is part of the company...

But what about the other stalwarts on the many forums across the globe who gave all of us false hope that something good was coming out of this... I really don't know what to say, other than that the lying and deceiving was a bit too much...

I have been using AMD setups since 1998, and this is probably going to be my last... sorry AMD, but the way this has been handled was just not on..... RANT done... back to the real world now.


----------



## LUNAR

Well, is this 50% FASTER than the previous i7s and Phenom II X4s? Doesn't seem so.


----------



## davidtran007

Waiting patiently for the people who preordered one to come in with RAGE


----------



## mav451

Quote:



Originally Posted by *Badness*


I don't think that does anything for games though.


You might want to take a glance at this article AT ran on CPU-NB OCing with the Thuban:
http://www.anandtech.com/show/3877/a...ance-scaling/7

It most definitely makes a difference.


----------



## ismet

The benchmarks look very disappointing. I waited all summer to see good results/reviews. You win, Intel...


----------



## AtomicFrost

These results are pretty disappointing. Unless AMD drops the pricing on these by a fair amount they won't sell well.

IMO the 8150 should really be priced at ~$189, the 8120 at ~$169, etc.

Can the current AM3+ motherboards actually handle the kind of power an overclocked 8150 is using? I foresee a lot of people blowing up VRMs on cheap AM3+ motherboards in the near future.


----------



## Velcrowchickensoup

Quote:



Originally Posted by *uncholowapo*


Looks like my only options now are either to get the quad core, a PII x4/6, or just dump AMD forever. I am in no way paying over $100 at this point for an AMD processor. They just lost another user










Same here, haha. They have been good to me, but this is just one of the most disappointing things I've seen.


----------



## Anonymous->Object

Wow, this is sad for AMD Bulldozer fans - it seems the only thing BD managed to bulldoze was itself.

I can hear AMD's market share dropping even more.


----------



## Badness

Quote:



Originally Posted by *Kvjavs*


Oh yes it does.


Proof please. I've looked around for evidence and I have not found anything more than a few fps at 1980's resolution. Maybe it'll do something in source engine, but anything else?


----------



## G3RG

Well hopefully this is just the beginning for Bulldozer.... They'll probably phase out the phenoms and replace them with Bulldozer (sidegrade/very mild upgrade). Gen 2 will hopefully be a step up....hopefully...


----------



## Sophath

Quote:



Originally Posted by *xPwn*


Dude, my brother is spamming all of AMD's videos with this: *"Well, have you seen the latest benchmarks? BD at 5GHz sucks over 520W of power alone! Also, worse single-thread performance than Phenom II."*










Yeah, I just meant that it was false advertising. In that same picture you could actually see that it was a Core i5 2500K; the 980X scores around 9 fps on that test.


----------



## fatmario

Quote:



Originally Posted by *davidtran007*


Waiting patiently for the people who preordered one to come in with RAGE


----------



## pyra

It's actually a blessing to me.
I can throw a Thuban 6-core into my rig without changing anything else and get the same or better performance than Bulldozer at a lower price.

Thanks for failing so hard AMD, this will be my last non-intel rig for a while.


----------



## Exostenza

Looks like the only good thing that will come out of bulldozer is that the Phenom II will be cheaper...

Crappy results for AMD.


----------



## Mad Pistol

Quote:



Originally Posted by *Badness*


Proof please. I've looked around for evidence and I have not found anything more than a few fps at 1980's resolution. Maybe it'll do something in source engine, but anything else?


http://www.anandtech.com/show/3877/a...ance-scaling/7

NB frequency does help when the game is CPU limited.

Resolution is set to 1680x1050 to make sure the CPU is being taxed and not the GPU.


----------



## Sophath

Quote:



Originally Posted by *Exostenza*


Looks like the only good thing that will come out of bulldozer is that the Phenom II will be cheaper...

Crappy results for AMD.


Heard they were going to phase them out soon.


----------



## bloodmech21

I don't think I've ever been so disappointed with a new piece of hardware. What happened, AMD? An 8-core Phenom II at 32nm would have destroyed this. I can't wait for AMD to explain what went wrong.

You can see in some of the benches that Bulldozer starts to close the gap at higher clocks. Maybe it was meant to run at ~5GHz but GF dropped the ball. They certainly dropped the ball on power consumption; that alone makes this chip a failure.


----------



## G3RG

Quote:



Originally Posted by *Mad Pistol*


http://www.anandtech.com/show/3877/a...ance-scaling/7

NB frequency does help.


Here's to hoping Bulldozer scales as well...that'd be a pretty significant step up in price/performance...


----------



## 8ight

Quote:



Originally Posted by *Sophath*


Heard they were going to phase them out soon.


Secondhand acquisition unaffected


----------



## Evil-Jester

So the main thing I gather here is that if I want the best chip for gaming, I should just snag a 2500K, right? Or am I wrong?


----------



## Crag

so..............is it good or not


----------



## Badness

Quote:



Originally Posted by *Mad Pistol*


http://www.anandtech.com/show/3877/a...ance-scaling/7

NB frequency does help.


Good newssssssssssssssssssssss: cigarette juice!
That is good news to hear. BD, not so much.


----------



## AtomicFrost

Quote:



Originally Posted by *Mad Pistol*


http://www.anandtech.com/show/3877/a...ance-scaling/7

NB frequency does help.


That does give a decent performance bump. I wonder if the same will hold true for BD. I wish a review website would OC the NB and post the results.


----------



## WizrdSleevz

I wonder what JF AMD has to say about this... I feel bad for him


----------



## Blameless

Quote:



Originally Posted by *Badness*


Proof please. I've looked around for evidence and I have not found anything more than a few fps at 1980's resolution. Maybe it'll do something in source engine, but anything else?


Already been posted, look for the anandtech article.

Game performance is actually one of the areas most influenced by CPU-NB.

That said, even if CPU-NB can be OCed, and has significant OC headroom, it's not going to turn most results around. At best it will improve general core OC scaling and take the edge off the weaker results.


----------



## AddictedGamer93

Wow, what a disaster. Guess I'm jumping ship to Intel.


----------



## ArchLinuxFTW

I waited a year for this?

......

I don't know what to say. I am disappoint!


----------



## Kvjavs

Quote:



Originally Posted by *WizrdSleevz*


I wonder what JF AMD has to say about this... I feel bad for him










I don't, JF-AMD is just another sub-par product of AMD.


----------



## matt1898

Seriously? drunk guy.........is bulldozer that baddddd?


----------



## Blameless

Quote:



Originally Posted by *WizrdSleevz*


I wonder what JF AMD has to say about this... I feel bad for him










Why? JF-AMD hasn't said anything that directly contradicts these results.

He's a server guy anyway and likely won't have many comments.


----------



## rubicsphere

http://www.amazon.com/gp/feature.html/ref=amb_link_357768702_4?ie=UTF8&docId=1000730111&pf_rd_m=ATVPDKIKX0DER&pf_rd_s=top-1&pf_rd_r=1SN4VPT25CVT9X2KPENH&pf_rd_t=301&pf_rd_p=1323326862&pf_rd_i=fx-8150
Clicking it doesn't work; copy & paste.


----------



## Peremptor

Quote:



Originally Posted by *MGF Derp*


Bulldozer is crap


For those of us seeking the best bang for the buck, taking into account power consumption, overclockability (and its relative ease/efficiency), IPC, and just plain overall performance... I'd have to agree.


----------



## solar0987

Speechless,
especially since I just sold my 1055T..
SAD SAD DAY


----------



## Diabolical999

I am really curious what went wrong here with Bulldozer. Why is there no horsepower behind specs like that? When the Phenom X4 debuted, it was supposed to blow everyone away; they were AMD's first quad-cores, yet they suffered from the TLB bug. This release just seems like a repeat of that debacle. Really curious for AMD's explanation, if they even decide to give one.


----------



## AtomicFrost

Quote:



Originally Posted by *rubicsphere*


http://www.amazon.com/gp/feature.htm...f_rd_i=fx-8150

Clicking it doesn't work copy&paste










That pricing is terrible.


----------



## Blameless

What's the mental disorder characterized by poor reading comprehension, lack of foresight, and unrealistic expectations again?


----------



## Fuell

It's not bad, actually; not sure what everyone is so freaked out about. It was between the 2500K and 2600K in threaded benches but lost out in games. It's a good CPU for those who need to run lots of threads, though a little disappointing for gamers.

That makes it a bit of a niche CPU. It needs a small price reduction to really compete; I think $210-$215 for an 8150 would be fine, a great upgrade for some people.

I'm a little disappointed, I wanted better, though I knew it wasn't gonna beat Intel, but it's a step in the right direction, maybe. Hopefully they can get single-threaded performance up a bit with the next versions and get Llano yields under control so they can get moving on maturing the architecture.

But again, I see no real reason for people to be calling it an epic fail; that's far from the truth. It's simply not a gaming monster, and that's what a lot of people here seem to want, at least the very vocal ones...


----------



## Mad Pistol

Quote:



Originally Posted by *Diabolical999*


I am really curious what went wrong here with Bulldozer. Why is there no horsepower behind specs like that? When the Phenom X4 debuted, it was supposed to blow everyone away; they were AMD's first quad-cores, yet they suffered from the TLB bug. This release just seems like a repeat of that debacle.


The Phenom II was a lot better, though (albeit clocked much higher).

Here's to hoping going from FX to FX-II will be just as much of an increase in performance (and a decrease in power consumption...)


----------



## Syjeklye

I'm confused at who this chip is aimed at. It seems like they are targeting an enthusiast who doesn't want to be an enthusiast?

I've always liked AMD since the K6-2 days, but this isn't what anyone expected.


----------



## Liquidpain

Quote:



Originally Posted by *Fuell*


It's not bad, actually; not sure what everyone is so freaked out about. It was between the 2500K and 2600K in threaded benches but lost out in games. It's a good CPU for those who need to run lots of threads, though a little disappointing for gamers.

That makes it a bit of a niche CPU. It needs a small price reduction to really compete; I think $210-$215 for an 8150 would be fine, a great upgrade for some people.

I'm a little disappointed, I wanted better, though I knew it wasn't gonna beat Intel, but it's a step in the right direction, maybe. Hopefully they can get single-threaded performance up a bit with the next versions and get Llano yields under control so they can get moving on maturing the architecture.

But again, I see no real reason for people to be calling it an epic fail; that's far from the truth. It's simply not a gaming monster, and that's what a lot of people here seem to want, at least the very vocal ones...


Uhh, the power draw alone makes it a failure.


----------



## NuclearSlurpee

Holy hell, AMD you failed. I'll run my Phenom until it dies but this is unacceptable. The old 1100T beats the new top of the line 8-core processor. -____-


----------



## dantoddd

My memory is a little hazy, but isn't this worse than the P4 debacle?


----------



## Derek1387

Wow...wonder if i should sell my AMD and switch to Intel now.


----------



## Derp

Quote:



Originally Posted by *Blameless*


What's the mental disorder characterized by poor reading comprehension, lack of foresight, and unrealistic expectations again?


Not sure if you're trolling, but you're trying way too hard to stay as unbiased as possible. Staying unbiased is a good thing, but sometimes things are just plain bad. BD is a failed processor, and there's no way to word your way around it.


----------



## dantoddd

If AMD has designed this from the ground up, we may be able to see a significant performance increase in a year or so.


----------



## Sophath

This is bad for people who were going to go 2 way Crossfire/SLI with BD.


----------



## Kaze105

Quote:



Originally Posted by *rubicsphere*


http://www.amazon.com/gp/feature.htm...f_rd_i=fx-8150

Clicking it doesn't work copy&paste


Holy crap, I can buy a 2600K at Microcenter for that price.


----------



## Lord Xeb

I am disappoint..... Then again, I am really not expecting much, as there isn't much throughput you can push through a 940-pin socket...


----------



## pyra

Quote:



Originally Posted by *Fuell*


It's not bad, actually; not sure what everyone is so freaked out about. It was between the 2500K and 2600K in threaded benches but lost out in games. It's a good CPU for those who need to run lots of threads, though a little disappointing for gamers.

That makes it a bit of a niche CPU. It needs a small price reduction to really compete; I think $210-$215 for an 8150 would be fine, a great upgrade for some people.

I'm a little disappointed, I wanted better, though I knew it wasn't gonna beat Intel, but it's a step in the right direction, maybe. Hopefully they can get single-threaded performance up a bit with the next versions and get Llano yields under control so they can get moving on maturing the architecture.

But again, I see no real reason for people to be calling it an epic fail; that's far from the truth. It's simply not a gaming monster, and that's what a lot of people here seem to want, at least the very vocal ones...


You reading some different reviews?

Based on the reviews in the OP there is absolutely no reason for anyone to buy a bulldozer CPU for anything.


----------



## Liquidpain

Folks were expecting this to crush SB and at a minimum, at least have net gains across the board. NEITHER situation has happened.


----------



## NuclearSlurpee

$285 for a lame processor that the previous-gen Phenom beats while being $100 cheaper. Then we have the wonderful $200 i5 2500K, which beats them all. Wow, this is just BS, AMD.

  Introducing the AMD FX Processors


----------



## xPwn

Quote:



Originally Posted by *Fuell*


It's not bad, actually; not sure what everyone is so freaked out about. It was between the 2500K and 2600K in threaded benches but lost out in games. It's a good CPU for those who need to run lots of threads, though a little disappointing for gamers.

That makes it a bit of a niche CPU. It needs a small price reduction to really compete; I think $210-$215 for an 8150 would be fine, a great upgrade for some people.

I'm a little disappointed, I wanted better, though I knew it wasn't gonna beat Intel, but it's a step in the right direction, maybe. Hopefully they can get single-threaded performance up a bit with the next versions and get Llano yields under control so they can get moving on maturing the architecture.

But again, I see no real reason for people to be calling it an epic fail; that's far from the truth. It's simply not a gaming monster, and that's what a lot of people here seem to want, at least the very vocal ones...


Well, SB was like Q1/Q2 and BD is Q4, I think?


----------



## Carlos Hilgert Ferrari

Let the flame war begin!


----------



## Blameless

Quote:



Originally Posted by *Diabolical999*


Why is there no horsepower behind specs like that?


Specs like what?

99% of the people in this thread don't know a damn thing beyond number of cores, cache size, and clock speed, none of which are even close to being significant basic architectural features.

Quote:



Originally Posted by *Syjeklye*


I've always liked AMD since the K6-2 days, but this isn't what anyone expected.


What?!

This is almost _exactly_ what anyone with _any_ sense expected.

Power consumption is a bit high when OCed, but otherwise we have all known about the performance and general overclockability for quite some time.

Some people like to pretend that final release steppings of chips aren't made many months before release, that everyone on the planet has signed an AMD NDA, and that English-speaking nations are the only ones not stuck in the stone age, but they were only deluding themselves.


----------



## BLKKROW

I am actually quite pleased. It just goes to show that AMD, a company that is much smaller than Intel and has a lot less money for research and tech, can still be somewhat aggressive toward Intel. Granted, it does not beat Intel in every test, but for a company with a lot less revenue, they did as well as I expected.


----------



## GameBoy

Meh, most results are pretty much in line with what I expected. Not sure why the 4-core FX is so much slower than a Phenom II X4 in some of those tests, though.

I'm curious to see whether performance would improve with CPU-NB overclocking.


----------



## xPwn

Quote:



Originally Posted by *NuclearSlurpee*


$285 for a lame processor that the previous-gen Phenom beats while being $100 cheaper. Then we have the wonderful $200 i5 2500K, which beats them all. Wow, this is just BS, AMD.

Introducing the AMD FX Processors


Oh, what do we have here? http://www.microcenter.com/single_pr...uct_id=0354587
OMG!! An i7 2600K for $270









OT: I hope AMD does not go out of business, because then Intel will be split up. I hope AMD makes some random breakthrough with BD-E. Please!


----------



## Lxcivic2k1

I guess my computer upgrade will have to wait, or I'll be building my first Intel system... maybe AMD will somehow pull something out of its ass by Q1 2012.


----------



## SMK

Holy power consumption!!!


----------



## NuclearSlurpee

AMD, please don't screw up Radeon 7xxx series.


----------



## Blameless

Quote:



Originally Posted by *NuclearSlurpee*


$285 for a lame processor that the previous-gen Phenom beats while being $100 cheaper. Then we have the wonderful $200 i5 2500K, which beats them all. Wow, this is just BS, AMD.

Introducing the AMD FX Processors


Not AMD's doing, Amazon is gouging.


----------



## Addict1973

Quote:



Originally Posted by *NuclearSlurpee*


Holy hell, AMD you failed. I'll run my Phenom until it dies but this is unacceptable. The old 1100T beats the new top of the line 8-core processor. -____-


One question. What reviews have you been reading?







I believe its time to sleep! I am sure to dream of....... multi-tasking!


----------



## 996gt2

*Anandtech:*

http://www.anandtech.com/show/4955/t...-fx8150-tested


----------



## Armand Hammer

Quote:



Originally Posted by *dantoddd*


My memory is a little hazy, but isn't this worse than the P4 debacle?


Pretty much. It took AMD 5-6 years of R&D, including 2 years of delays, to release something that is basically a side-grade from Thuban.

Why they didn't just do a Thuban die shrink to 32nm with 8 cores and keep working on Bulldozer until it was clearly better than what they already had, we will never know.

Now I'm off to play Duke Nukem Forever with my FX-8150!


----------



## jrbroad77

Only positive spin: get an FX-4100 and hope it OCs like a beast with watercooling. It should be able to beat an 8150, max OC vs. max OC, for gaming. And if it can hit 5.5GHz with a Corsair H100... might be promising.

And yes, I am disappoint. Most likely going IB. No big deal; it's not like I was delaying a purchase (went for a 24" monitor this summer over a platform switch, not disappointed at all lol).


----------



## Syjeklye

Quote:



Originally Posted by *Blameless*


This is almost _exactly_ what anyone with _any_ sense expected.

Power consumption is a bit high when OCed, but otherwise we have all known about the performance and general overclockability for quite some time.


This is exactly what I am talking about. AMD has been the king of power consumption and price/performance. This chip is neither good on power nor priced to make me buy it over a 2500K.

How could you expect this? The single core performance is even down from the previous generation. No one expected that.


----------



## AtomicFrost

Quote:



Originally Posted by *Blameless*


Not AMD's doing, Amazon is gouging.


Looks like a pricing error. I have a feeling that the price will drop down to list when you are able to actually order.

8150 List = $269.99
8120 List = $229.99
6100 List = $189.99

Still too much coin to compete with Intel.


----------



## WizrdSleevz

Quote:



Originally Posted by *SMK*


Holy power consumption!!!


----------



## Liquidpain

Quote:



Originally Posted by *Blameless*


Specs like what?

99% of the people in this thread don't know a damn thing beyond number of cores, cache size, and clock speed, none of which are even close to being significant basic architectural features.

What?!

This is almost _exactly_ what anyone with _any_ sense expected.

Power consumption is a bit high when OCed, but otherwise we have all known about the performance and general overclockability for quite some time.

Some people like to pretend that final release stepping of chips aren't made many months before release, that everyone on the planet has signed an AMD NDA, and that English speaking nations are the only ones not stuck in the stone age, but they were only deluding themselves.


Sorry but I don't think people were hyped for lower IPC and higher power draw. It's obvious that something, somewhere went wrong and to think otherwise is just being naive.


----------



## 996gt2

*From Anandtech:*

Quote:



Bulldozer is an interesting architecture for sure, but I'm not sure it's quite ready for prime time. AMD clearly needed higher clocks to really make Bulldozer shine and for whatever reason it was unable to attain that. With Piledriver due out next year, boasting at least 10 - 15% performance gains at the core level it seems to me that AMD plans to aggressively address the shortcomings of this architecture. My only concern is whether or not a 15% improvement at the core level will be enough to close some of the gaps we've seen here today. Single threaded performance is my biggest concern, and compared to Sandy Bridge there's a good 40 - 50% advantage the i5 2500K enjoys over the FX-8150.


----------



## Chiefpuff420

Quote:



Originally Posted by *Syjeklye*


This is exactly what I am talking about. AMD has been the king of power consumption and price/performance. This chip is neither good on power or at a price that would make me buy it over a 2500k.

How could you expect this? The single core performance is even down from the previous generation. No one expected that.


Clearly, I agree. I didn't expect it to lose to the old Thuban in any category. As a Sabertooth 990FX owner, I'm stumped and am going to chill off my frustration.

Should have never gone AMD. Intel is much further ahead in single-threaded as well as overall architectural performance.


----------



## pioneerisloud

I suggest everyone give this a good read. It's fully possible that the reviewers missed a very important MS hotfix for an L1 cache bug.

http://www.xtremesystems.org/forums/...64#post4969164


----------



## Peremptor

Quote:



Originally Posted by *996gt2*


*Anandtech:*

http://www.anandtech.com/show/4955/t...-fx8150-tested


I got tired of refreshing the main page over an hour ago. Let's see how Anand spins this debacle...


----------



## DoktorCreepy

Well after looking at the benches I'll be going 1155 or 2011. Still looking forward to the HD 7000 series though.


----------



## HowHardCanItBe

Guys, please stop posting memes. It contributes nothing to the thread.


----------



## Fortunex

Le flop.


----------



## 996gt2

Quote:



Originally Posted by *pioneerisloud*


I suggest everyone give this a good read. It's fully possible that the reviewers might have missed a very important MS hotfix to fix a L1 cache bug.

http://www.xtremesystems.org/forums/...64#post4969164


You'd think AMD would have given the reviewers a heads-up about the patch so that their new CPU doesn't look any worse on the initial batch of reviews.

First impressions are important. Certainly AMD wouldn't have overlooked the patch?


----------



## Blameless

Quote:



Originally Posted by *Syjeklye*


This is exactly what I am talking about. AMD has been the king of power consumption and price/performance. This chip is neither good on power or at a price that would make me buy it over a 2500k.


AMD hasn't been competitive in performance vs power consumption since 2004-2005.

Both first and second gen i7s murdered Phenom II in the performance/watt ratio.

Quote:



Originally Posted by *Syjeklye*


How could you expect this?


I can read.

Specifically, we have known a BD core was narrower than a Phenom core for over a year.

More recently, leaked benchmarks dating back at least a few months have shown results very similar to what we are seeing today.

Quote:



Originally Posted by *Syjeklye*


The single core performance is even down from the previous generation. No one expected that.


I and dozens of other people here anticipated this.

If you didn't, you haven't been paying attention.


----------



## nuklearwax

Wow....looks like I'll be going Intel for the first time ever next year and I've been building AMD rigs since Barton cores.


----------



## Sophath

Quote:



Originally Posted by *Blameless*


AMD hasn't been competitive in performance vs power consumption since 2004-2005.

Both first and second gen i7s murdered Phenom II in the performance/watt ratio.

I can read.

Specifically, we have known a BD core was narrower than a Phenom core for over a year.

I and dozens of other people here anticipated this.

If you didn't, you haven't been paying attention.


I heard that bulldozer has 80% of the IPC Phenom had per core. Not too sure where I saw that though.


----------



## TheRockMonsi

Quote:



Originally Posted by *pioneerisloud*


I suggest everyone give this a good read. It's fully possible that the reviewers might have missed a very important MS hotfix to fix a L1 cache bug.

http://www.xtremesystems.org/forums/...64#post4969164


Maybe, but for me the biggest thing I wanted to see was how it did in games; apparently, there's no reason for me to upgrade from my quad core.


----------



## xPwn

Quote:



Originally Posted by *nuklearwax*


Wow....looks like I'll be going Intel for the first time ever next year and I've been building AMD rigs since Barton cores.


Yeah, seriously this makes me 50% happy because I have Intel and 50% sad because I wanted 8 Cores of madness


----------



## RotaryKnight

You know, I've been an avid AMD user since 2000 with the 1GHz T-Bird CPU. With recent events, I am saddened, and yet strangely happy, to move to Intel. AMD better get their asses in line and make better CPUs.

This is a smack across the face for the FX line. I mean, WTH AMD!!!


----------



## Lampen

Quote:



Originally Posted by *pioneerisloud*


I suggest everyone give this a good read. It's fully possible that the reviewers might have missed a very important MS hotfix to fix a L1 cache bug.

http://www.xtremesystems.org/forums/...64#post4969164


We were discussing this about an hour ago in the [email protected] Steam chatroom. We'll see what this yields, though. The fix would have to yield massive improvements to really impact what we're seeing in these reviews.


----------



## microfister

Wow, thanks AMD, you just handed Intel a monopoly (so much for competitive price drops). And way to go, bottlenecking your own upcoming GPUs; looks like if you want to CrossFire the 7 series you may have to turn away from AMD for your rig. A 2600K at stock is better than BD at 4.6GHz. This is epic fail.


----------



## Jinny1

I guess my signature quote ain't gonna happen :/


----------



## Fuell

Quote:



Originally Posted by *pyra*


You reading some different reviews?

Based on the reviews in the OP there is absolutely no reason for anyone to buy a bulldozer CPU for anything.


3DMark 11 it kept up with and often beat the 2500K... beat the 2500K in Sandra... also beat the 2600K once... 3ds Max 2012... Photoshop CS5.1... Premiere Pro 5.5... SolidWorks 2010... ABBYY FineReader... 7-Zip... MainConcept 2.2... Handbrake...

So yeah... it's definitely not better than the 2500K overall if you ask me... but in no way does the 2500K "trounce" or "destroy" the 8150 in all cases. There is a market for this, though I think it's too small. For most people this isn't going to be a good choice over a 2500K. But in cases like those above, it's a cheaper alternative to a 2600K while delivering better performance than a 2500K.

So while I agree BD did disappoint in some cases, it shines in others. It's an architecture that will get better with maturity, but for now, AMD needs to work out a few wrinkles.

In my book, getting a CPU from AMD that beats or competes with a 2500K consistently in any type of workload is a good step forward considering the state of affairs before this. But they obviously have some work to do. Let's hope AMD can drop the prices and get Llano yields up to make some money to get BD and the modular architecture where they need to be, so that Intel has a reason to move.


----------



## RotaryKnight

Quote:



Originally Posted by *pioneerisloud*


I suggest everyone give this a good read. It's fully possible that the reviewers might have missed a very important MS hotfix to fix a L1 cache bug.

http://www.xtremesystems.org/forums/...64#post4969164


It's said to be a 3% difference, which makes almost no difference at all when you look at it overall.


----------



## black96ws6

Quote:



Originally Posted by *Lampen*


We were discussing this about a hour ago in the [email protected] Steam chatroom. We'll see what this yields though. The fix would have to yield massive improvements to really impact what we're seeing in these reviews.


Couldn't sleep, thought I'd log back on and see what people were saying.

By the way, this patch was discussed in another thread, either on that board or another, can't remember. It turns out this patch issue doesn't come into play as often as they thought, 3% was the last I heard. So, no, this won't do anything. Running BD on Windows 8 supposedly doesn't really help either.

EDIT: Ninja'd by RotaryKnight.

EDIT2: So much for the Russian benches being bogus. As I said, they were legit. All these reviews are reviewing the 8150, you can at least look at the Russian ones to see how an 8120 performs (basically the same but slower due to lower clocks)


----------



## pioneerisloud

Quote:



Originally Posted by *Lampen*


We were discussing this about a hour ago in the [email protected] Steam chatroom. We'll see what this yields though. The fix would have to yield massive improvements to really impact what we're seeing in these reviews.


Well, that fix, plus PROPER overclocking (assuming NB clocks increase performance on BD like it does on Phenom II)....it could make things a lot more interesting.


----------



## RJacobs28

My new Sabertooth 990fx may not have been the best move...


----------



## jrbroad77

Quote:



Originally Posted by *pioneerisloud*


I suggest everyone give this a good read. It's fully possible that the reviewers might have missed a very important MS hotfix to fix a L1 cache bug.

http://www.xtremesystems.org/forums/...64#post4969164


This is a good point. After seeing Anandtech's review stating that BD has 2B transistors vs. the ~1B of Phenom II X6, SB, and Gulftown, it just didn't add up. Each BD core should be bigger than Phenom II, meaning integer performance way up and worst-case FP performance in line with Phenom II X4/X6.

Idk. Bit baffled.


----------



## Quantium40

I said way back before the summer that bulldozer was decreasing instructions per clock as compared to previous gen, but I was laughed at.


----------



## xPwn

Quote:



Originally Posted by *RJacobs28*


My new Sabertooth 990fx may not have been the best move...


Maybe, maybe not. It's cases like this that are why I read reviews and benchmarks before buying expensive stuff, especially computer hardware. Because it's a mixed bag.


----------



## Phantom123

Well, this is rather disappointing. My main thing is power consumption. If it had performed better while overclocked and drawn less power (which I thought it would), this would be an easy choice. However, the power numbers look rather poor.

Sad to say, but when tomorrow comes I may be choosing the Intel 2500K, even though I waited almost 10 months for the FX.


----------



## xPwn

Quote:



Originally Posted by *jrbroad77*


This is a good point. After seeing Anandtech's review stating that BD has 2B transistors vs. the ~1B of Phenom II X6, SB, and Gulftown, it just didn't add up. Each BD core should be bigger than Phenom II, meaning integer performance way up and worst-case FP performance in line with Phenom II X4/X6.

Idk. Bit baffled.


Yeah, I read that and was amazed! But maybe that is why it runs so hot!


----------



## eggs2see

Quote:



Originally Posted by *Quantium40*


I said way back before the summer that bulldozer was decreasing instructions per clock as compared to previous gen, but I was laughed at.










If this is the case, I don't understand AMD's angle, why did they think less IPC would be better than more?


----------



## gooface

And I thought I had almost made a mistake buying this 2500K; guess not... My 1090T is better than this thing, and I bought that back in May of 2010. AMD, you disappoint me. This is a complete flop in my book, when the previous generation CPUs beat out the new ones. Reminds me of the first-gen Pentium 4s.


----------



## geoxile

Quote:



Originally Posted by *Sophath*


I heard that bulldozer has 80% of the IPC Phenom had per core. Not too sure where I saw that though.


IIRC AMD themselves stated a single "core" within a bulldozer "module" will yield approximately 80% the performance of a full core.


----------



## Mad Pistol

The results are all over the place. I'm seriously wondering if a bump in the NB clock could eliminate some of the strange results reviewers are getting. Surely it couldn't hurt.

I mean, about half of the results have the 8150 losing to the 1100T, and half of those lose to the Phenom II X4 980. That's a serious misstep when AMD can't outright beat their previous flagship with their new one.


----------



## Senator

So after the QQ subsides, will we begin seeing "Just wait for Piledriver to stick it to Ivy Bridge!"?

Seriously, AMD forecasts what, a 10-15% improvement per core in the next iteration due out next year?

Bulldozer fails hard. Incredibly disappointing. Intel seems the only way to fly currently for most desktop choices and is certainly in no danger of losing the crown.


----------



## tpi2007

Quote:



Originally Posted by *Blameless*


Only question that still remains in my mind is regarding the uncore/cpu-NB and how this is clocked by default as well as how overclocking it (if possible) will influence performance.


I would guess that AMD:

1. Would have thought of that and increased the default frequency if they felt the performance was not up to par; they did it with the CPU clockspeed,

or

2. They would have told reviewers not to forget to overclock the NB to get better results, seeing as performance would take a hit otherwise.

Quote:



Originally Posted by *pioneerisloud*


I suggest everyone give this a good read. It's fully possible that the reviewers might have missed a very important MS hotfix to fix a L1 cache bug.

http://www.xtremesystems.org/forums/...64#post4969164


Yes, but...

Quote:



Originally Posted by *996gt2*


You'd think AMD would have given the reviewers a heads-up about the patch so that their new CPU doesn't look any worse on the initial batch of reviews.

First impressions are important. Certainly AMD wouldn't have overlooked the patch?


+1

My thoughts exactly. If AMD knows about this problem - and they MUST know; they designed the chip this way, so they knew the implications, and their internal testing would have shown them - then their job was to work closely with Microsoft to address the issue as soon as possible. And after all, yesterday was Patch Tuesday, and nothing.


----------



## Addict1973

"i will still be buying Zambezi cpu's only due its Box, i want still one for candy, one for lollipops and one for screws."


----------



## anubis1127

Quote:



Originally Posted by *dlee7283*


I said the Athlon X2 traded blows with the Core 2 Duo, I didnt say it beat it. The X2 was much cheaper compared to the e6600.

Also I was talking about the Athlon II 250 noob, not an energy efficient Athlon II, LMAO

The PII X2 vs e8400 comes down to the price, in which the X2 beats is for price/performance. Also not including a combo deal.

Your other argument is invalid because I didnt have to worry about the price of 1156 being the same as AMD motherboard because I got a free motherboard with my X4









The Phenom II X6 is a better value even though the i7 920 is better. The X58 is a huge investment, some of the motherboard were $250-300 for a long time. With Phenom I could get an el cheapo AM2+/AM3


Not sure why you are still comparing prices of 2 year old hardware.

Going by MC prices, the 8150 will be the same price as an i7 2600K, and the mobos are about the same price, except with the i7 you get $80 off in a combo. I haven't heard of BD combos, but I'm sure there will be some eventually.

The 8120 will actually be $40 more than the i5 2500k.

My point, current gen hardware is what people care about in 2011, and in 2011 Intel still retains price performance crown ATM.

Sent from my DROIDX using Tapatalk


----------



## Sin0822

NB clock cannot go above 2800/2700 on air. I have tried; it'll be in my OC guide.

NB clock makes almost no difference, but under LN2 you can bump it up to 4GHz+, which helps with LN2 benching, and that I think is what BD will be good at.


----------



## MrSleepin

anyone else see this?

http://www.ozeros.com/2011/10/bulldo...2-by-hicookie/


----------



## Blameless

Quote:



Originally Posted by *xPwn*


Yeah, I read that and was amazed! But maybe that is why it runs so hot!


It probably runs so hot because its a big die, with a large transistor count, and has a longer pipeline with higher clocks than prior chips. Also, leaky chips tend to OC better, even as their power consumption is worse.

Seems reasonable that the leakiest dies are going into the FX chips while the ones with more favorable power profiles are being saved for Opterons.

Quote:



Originally Posted by *Sin0822*


NB clock cannot go above 2800/2700 on air. I have tried, itll be in my OC guide.

NB clock makes almost no difference. but under LN2 you can bump it upto like 4ghz+ which helps with LN2 benching, which i think is what BD will be good at.


Thanks for the info.


----------



## Sin0822

Yes, it's impressive. I'mma try the same tomorrow and see if we can bench 7GHz full go with BD on all cores; dude, that might be enough headroom to do well.

It does run hot, but the stock voltage is WAYYY too high. I can do 4.8GHz stable with 1.41V load/idle. AMD says stock VIDs are 1.3-1.4V.

I can do 4.5GHz with 1.28V idle/load, and it runs as cool as stock.

They are overvolting too much
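As a rough sketch of why that overvolting matters: dynamic CPU power scales roughly with C·V²·f, so at a fixed clock the voltage drop reported above is a sizable cut on its own. The quadratic scaling is the textbook approximation and leakage is ignored; the voltages are just the ones quoted in this post.

```python
# Back-of-envelope: dynamic power ratio at a fixed frequency, P ∝ V^2.
# Quadratic scaling is an approximation; static/leakage power is ignored.
def relative_power(v_new, v_old):
    """Ratio of dynamic power after a voltage change, same clock."""
    return (v_new / v_old) ** 2

ratio = relative_power(1.28, 1.40)
print(f"1.28V vs 1.40V at the same clock: ~{(1 - ratio) * 100:.0f}% less dynamic power")
```

So even before touching the multiplier, undervolting from a 1.40V VID to the 1.28V reported here would shave roughly a sixth off the dynamic power, which lines up with "runs as cool as stock."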


----------



## Sophath

Quote:



Originally Posted by *MrSleepin*


anyone else see this?

http://www.ozeros.com/2011/10/bulldo...2-by-hicookie/


Question is, how much power would those 2 cores draw?


----------



## sLowEnd

Aye...
Well, for AMD's sake, BD had better put up a better showing on the server side of things.


----------



## microfister

In gaming at stock it's getting trumped by the 980 BE?

A world-record-holding bottleneck; this is quite sad.

Are there any reviews with cores disabled and overclocked? I'm curious.


----------



## Sin0822

How about max OC on air? Anyone?
I did this: http://valid.canardpc.com/show_oc.php?id=2041683

Then I did this: http://valid.canardpc.com/show_oc.php?id=2041712

Another thing: BD loves a high HTT link. Like 230+ is better than multiplier. SPi loves the multi; other benches don't.


----------



## aznofazns

I didn't expect Bulldozer to be amazing, but this is just terribad.


----------



## Blameless

Quote:



Originally Posted by *Sin0822*


They are overvolting too much


Initial reviews almost always do.

They tend to brute force everything in order to push out an article on time, rather than learning the nuances of the platform/architecture.


----------



## hak8or

Quick thing guys:

Quote:



Originally Posted by *looncraz*

Actually, we already have such an issue known for Bulldozer, and NO bench-marked system has the patch installed!

The shared L1 cache is causing cross invalidations across threads so that the prefetch data is incorrect in too many cases and data must be fetched again. The fix is a "simple" memory alignment and (possible)tagging system in the kernel of Windows/Linux.

I reviewed the code for the Linux patch and was astonished by just how little I know of the Linux kernel... lol! In any event, it could easily cost 10% in terms of single threaded performance, possibly more than double that in multi-threaded loads on the same module due to the increased contention and randomness of accesses.

Not sure if ordained reviewers have been given access to the MS patch, but I'd imagine (and hope) so! Last I saw, the Linux kernel patch was still being worked on by AMD (publicly) and Linus was showing some distaste for the method used to address the issue. One comment questioned the performance cost but had received no replies... but you don't go re-working kernel memory mapping for anything less than 5-10%... just not worth it!


Here is some information on the patch for LINUX:

Quote:



On Wed, Jul 27, 2011 at 11:57:45AM -0400, Avi Kivity wrote:
> Out of curiosity, what's the performance impact if the workaround is
> not enabled?

Up to 3% for a CPU-intensive style benchmark, and it can vary highly in
a microbenchmark depending on workload and compiler.


Now, this patch has been taken apart by the linux community for being a very very bad fix, as explained here:

Quote:



Originally Posted by *LINUS*

Anyway, I seriously think that this patch is completely unacceptable
in this form, and is quite possibly going to break real applications.
Maybe most of the applications that had problems with ASLR only had
trouble with anonymous memory, and the fact that you only do this for
file mappings might mean that it's ok. But I'd be really worried.
Changing address space layout is not a small decision.


http://article.gmane.org/gmane.linux.kernel/1170744

I just took this from a few posts on another awesome forum:
http://www.xtremesystems.org/forums/...64#post4969164
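For anyone wondering what "memory alignment" has to do with a cache bug, here's a minimal sketch of index aliasing in a set-associative cache. The geometry used (64 KB, 2-way, 64 B lines) is an assumption for illustration, not taken from AMD documentation; the point is only that two mappings landing at the same index bits contend for very few ways, and shifting one mapping changes its index.

```python
# Illustrative set-associative cache geometry (assumed, not AMD's spec):
LINE = 64                      # bytes per cache line
WAYS = 2                       # low associativity -> aliasing hurts
SIZE = 64 * 1024               # total cache size
SETS = SIZE // (LINE * WAYS)   # 512 sets

def set_index(addr):
    """Which cache set an address maps to (index bits above the line offset)."""
    return (addr // LINE) % SETS

# Two threads whose hot code sits at the same virtual offset hit the
# same sets, so they fight over the 2 ways and evict each other:
a = 0x400000
b = 0x400000
print(set_index(a) == set_index(b))   # they alias

# Offsetting one mapping by a couple of pages flips upper index bits,
# which is the general idea behind a kernel-side alignment workaround:
b_shifted = b + 2 * 4096
print(set_index(a) == set_index(b_shifted))
```

This is only a sketch of the mechanism, not the actual kernel patch; the real fix debated on the list involved changing file-mapping layout, which is why Linus objected that it could break applications.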


----------



## coolhandluke41

Quote:



Originally Posted by *Captain Han*


I am deeply deeply saddened by what AMD has done. They've been non competitive ever since Core 2 Duo came out in 2006, and Bulldozer was suppose to be their savior.

I am not so much pissed about the actual product, but as a technology enthusiast I am deeply saddened such a big microprocessor company is developing such crap offering and poor business execution, not to mention I believe they've just killed their FX-brand. They are not pushing the industry forward, and that is what I am mostly pissed about.

In the past 5 years, AMD has been competing on the price point rather than performance. They have basically established themselves as a 'value brand', not a premium or industry brand. They have failed in every aspect to deliver a competitive CPU.

Mind I remind you Trinity is also gonna be based on Bulldozer + Northern Island core. So you know APU performance is gonna be crap.

AMD can now only compete on Fusion and discrete graphics. But they have not much market share to begin with.

I hope Rory Read does something drastic otherwise AMD is going down.


that's about the size of it


----------



## UNOE

Yeah 500 watts on load. People building budget rigs will have to account for spending $40 more on a PSU with some more power.


----------



## xPwn

Quote:



Originally Posted by *UNOE*


Yeah 500 watts on load. People building budget rigs will have to account for spending $40 more on a PSU with some more power.



And the dedicated water cooling loop for 250w TDP


----------



## WizrdSleevz

Can I cry??

I'll be switching to Intel or just gonna sell 1 of my 6950's..


----------



## Blameless

Quote:



Originally Posted by *UNOE*


Yeah 500 watts on load. People building budget rigs will have to account for spending $40 more on a PSU with some more power.


The CPUs themselves don't pull 500w, those are whole system figures, usually at the wall.

Also, motherboard VRM is going to be a greater concern than PSU.


----------



## microfister

So when is Piledriver coming out?


----------



## BallaTheFeared

Quote:



Originally Posted by *Sin0822*


NB clock cannot go above 2800/2700 on air. I have tried, itll be in my OC guide.

NB clock makes almost no difference. but under LN2 you can bump it upto like 4ghz+ which helps with LN2 benching, which i think is what BD will be good at.


Thanks.


----------



## mad0314

Quote:



Originally Posted by *Sophath*


I heard that bulldozer has 80% of the IPC Phenom had per core. Not too sure where I saw that though.


I think you're talking about the "module" explanation: the shared resources make each core worth 80% of a "true" core (something along those lines, not exact; I think saying a module is worth 80% of a true pair of cores would be a better explanation). Note this was not relative to a Phenom II core, but to a core of the same architecture without shared resources, which does not exist for comparison. AMD's plan was to increase clock speed while trying to maintain IPC.

I hope this is "Phenom I v2" and "Phenom II v2" is yet to come...
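Taking the "~80% of a full core" reading at face value, the arithmetic is simple. These factors are AMD's marketing estimate as recalled in this thread, not measured IPC, so treat the result as an upper bound on what sharing costs.

```python
# AMD's claimed module scaling, taken at face value (marketing figure,
# not a measurement):
CORES_PER_MODULE = 2
PER_CORE_FACTOR = 0.8   # each shared-frontend core vs. a hypothetical full core

module_throughput = CORES_PER_MODULE * PER_CORE_FACTOR  # vs. ONE full core
two_full_cores = 2.0

print(module_throughput)                    # 1.6
print(module_throughput / two_full_cores)   # 0.8: a module ~80% of a true pair
```

In other words, under the claim a module delivers about 1.6x a single full core, i.e. 80% of what two fully independent cores would, which matches both readings of the figure floating around this thread.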


----------



## xPwn

Quote:



Originally Posted by *Blameless*


The CPUs themselves don't pull 500w, those are whole system figures, usually at the wall.


Trust me, the SB systems were pulling less than 350W, so BD draws an extra 150W.


----------



## yesitsmario

Quote:



Originally Posted by *microfister*


so when is pyledriver coming out?


Isn't it only going to be 10% faster?


----------



## qwertymac93

why u no keep IPC the same AMD?
I hope it's some glitch like the original phenoms had and we see 15% performance improvements with the next stepping...
Very sad, I was expecting 8 core phenom II level performance.


----------



## Scorpion49

I had a gut feeling this would be the turnout, but I really really wish it wasn't.

The only pleasure I take from this is all the super-obnoxious fanboys that were sure that BD was the second coming get to be quiet now, as well as the other super obnoxious fanboys that were sure it would fail.

Bad for market, bad for AMD, bad for performance minded computer enthusiasts. Hopefully they can pull some trick out of their sleeve to bump it up 10-15% and call it a day.


----------



## MrSleepin

Quote:



Currently Active Users Viewing This Thread: 283 (145 members and 138 guests)



never seen this many people in one thread! crazy!


----------



## Blameless

Quote:



Originally Posted by *qwertymac93*









why u no keep IPC the same AMD?


Because core count is more important for the segments that actually make AMD money. Desktop chips are not particularly profitable.

If they can get 33% more cores in the same space for a modest hit to IPC, it will be well worth it.
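The break-even math on that trade-off is worth spelling out. Perfectly linear multi-threaded scaling is assumed here, which real workloads will not fully reach, so the real break-even IPC is higher than this idealized figure.

```python
# Idealized throughput trade-off: 33% more cores vs. a per-core IPC hit.
# Assumes perfectly linear multi-threaded scaling (an optimistic bound).
cores_factor = 4 / 3            # "33% more cores in the same space"
break_even_ipc = 1 / cores_factor

print(f"IPC can drop to {break_even_ipc:.0%} of the old core and break even")

# Example: a 10% per-core IPC hit still nets a gain under this assumption:
throughput = cores_factor * 0.90
print(f"net throughput vs. old design: {throughput:.2f}x")
```

So under the optimistic assumption, anything less than a 25% IPC loss still wins on aggregate throughput, which is presumably the bet AMD made for the server segment.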


----------



## xPwn

Quote:



Originally Posted by *MrSleepin*


never seen this many people in one thread! crazy!


Earlier there was 500+


----------



## Dopamin3

Dirk Meyer wasn't ousted like most people think; he left the company when he saw the first Bulldozer benchmarks


----------



## HothBase

Uh, just woke up and glanced at some of the reviews... pretty sure I won't be buying one of these, lol.


----------



## AddictedGamer93

Guess I'm waiting for IB :/


----------



## manifest3r

Quote:



Originally Posted by *MrSleepin*


never seen this many people in one thread! crazy!


285 (143 members and 142 guests)

Now you've seen more!









Yeah... pretty much what I expected. Already planned to buy a 1055T from a fellow OCN member


----------



## Alatar

Quote:



Originally Posted by *MrSleepin*


never seen this many people in one thread! crazy!


it was better earlier


----------



## black96ws6

Quote:



Originally Posted by *MrSleepin*


never seen this many people in one thread! crazy!


That's because someone ninja'd the title to "Hi I'm a girl and I'm new here and need help with my PC"


----------



## xPwn

Quote:



Originally Posted by *black96ws6*


That's because someone ninja'd the title to "Hi I'm a girl and I'm new here and need help with my PC"


----------



## Badness

I am so happy that bf3 is gpu limited...


----------



## Scorpion49

Quote:



Originally Posted by *black96ws6*


That's because someone ninja'd the title to "Hi I'm a girl and I'm new here and need help with my PC"










Some of these guys need to get to some more popular sites on the net, SVTP has had a few epic ownage threads with thousands of viewers (up until the servers crash).


----------



## AtomicFrost

Quote:



Originally Posted by *manifest3r*


285 (143 members and 142 guests)

Now you've seen more!









Yeah... pretty much what I expected. Already planned to buy a 1055T from a fellow OCN member










This might be a pretty good budget upgrade idea. It would be a decent amount cheaper than BD with around the same performance.

For someone like me still using a C2D, buying a used 1055T plus a decent AM3+ motherboard could be a decent budget upgrade option. Plus you would keep the ability to upgrade to Piledriver if AMD gets their act together.

It also looks like a used 1366 CPU/motherboard would still be pretty similar performance-wise to BD in everyday tasks.


----------



## totallynotshooped

This is just sad... In the Tom's Hardware review, it loses to Sandy Bridge in a benchmark while clocked 1GHz higher.
Fail, AMD.


----------



## MrWilson

OK, I wasn't sure if I was going Bulldozer or waiting for IB... I'm sure this Core 2 Quad can hang on till next year.


----------



## xEnt

Oh, I'm glad I didn't waste my time waiting for this piece of junk. Shame on you, AMD. 2500K all the way.

I at least expected the highest-end Bulldozer to be on par with the Nehalem architecture per clock, but from the looks of things it's their old **** with 2 cores slapped on...


----------



## black96ws6

I'm still amazed at the power consumption. Compare the review chart with Intel/AMD CPUs stock and OC'd:


----------



## AMC

Time to read Anand's article. Thankfully I went SB.


----------



## totallynotshooped

Comment, John Fruehe?


----------



## Blameless

The front-end comparison FPU details in Anandtech's article does a pretty good job of summing up the "why":

http://www.anandtech.com/show/4955/t...x8150-tested/2


----------



## RussianJ

Just got out of the hospital waiting room, got a signal and started reading. May turn around and check in for depression now.


----------



## BLKKROW

I want to see benchies after the L1 cache fix.


----------



## kevink82

Saw the Vortez review; he stated AMD says that with cores turned off down to 2, it will hit 5GHz pretty easily....... seriously ***.....









Pretty sure the fat lady just sang.... LOUD!!!


----------



## black96ws6

From Hardwarecanucks:

Quote:



In order to hit 4.5GHz, we set the vCore to 1.50V with some light Load-Line Calibration (LLC). At this voltage, the processor heated up considerably, reaching well over 77°C when stress tested by Prime95 In-place large FFTs.

At these settings, the system is idling at around 195W, but when running the aforementioned Prime95 stress test,* it pulls an immense 550W from the socket.* If we also add a fully loaded GeForce GTX 460 1GB to the mix, that number spikes up to almost* 800W*.


----------



## microfister

Quote:



Originally Posted by *yesitsmario*


Isn't it only going to be 10% faster ?


10% faster would still put it quite a bit under the standard


----------



## Kvjavs

Quote:



Originally Posted by *Badness*


I am so happy that bf3 is gpu limited...


Why? The Phenom I X8 actually seems to do good in BF3.


----------



## venom55520

Alrighty then, it's official: I'm gonna have to send back my motherboard and go SB. I'm kind of heartbroken.

I simply cannot justify buying an 8150 since SB outperforms it almost every time, and even when it loses, it loses by single-digit percentages.

...and apparently my PSU wouldn't support it


----------



## kriios

AMD is still value, though; those gaming performance numbers are an eyesore, but the Bulldozer line is priced significantly cheaper than the Intel breed of chips. It's not all bad, just not as awesome as some of us expected.


----------



## tpi2007

Anandtech:










The difference to the i7 2600K is 73.6W, and to the i5 2500K it's 95.7W; that last figure is the entire TDP of a Sandy Bridge CPU like the i7 2600K!

Also, both from Anand and PC Pro, these chips don't have much headroom (but who would want to overclock these when they use so much power already ?):

Anandtech:

Quote:



AMD indicated the FX-8150 was good for around 4.6GHz using air cooling, 5GHz using water cooling and beyond with more aggressive cooling methods. In our experience with the platform hitting 4.6GHz, stable, on a stock AMD HSF was not an issue. Moving beyond 4.6GHz on air saw a significant decrease in stability however. I could boot and run benchmarks at 4.7GHz but I'd almost always encounter a crash. I couldn't hit 5GHz on air.



PC Pro:

Quote:



Theoretically, that gives AMD a glimmer of hope when it comes to overclocking but, in our tests, we had mixed results. Boosting the FX-8150 to 4.2GHz was straightforward, but the system became unstable at 4.4GHz.


----------



## cjc75

Quote:



Originally Posted by *Sophath*


You do realise that the bulldozer rig is running in dual channel right? And with 4 GB of ram?











Shouldn't the "codename" on CPUz say "Zambezi" and not "Bulldozer" ?

I thought Bulldozer was the name of the overall Architecture around which several chips were based, and _ZAMBEZI_ was the "codename" for the FX Desktop Series Processor based on that Architecture?


----------



## xxbassplayerxx

Quote:



Originally Posted by *Plex*


http://www.youtube.com/watch?v=8rDwX...ayer_embedded#!

AMD's official Youtube video says otherwise! Haha.



Quote:



Originally Posted by *Sophath*


And Intel's i7 980X scores 5.41 on Cinebench in there


Yup... AMD either legitimately messed up or they were seriously trying to deceive people.


















I'll be buying one for LN2 anyway


----------



## Ruckol1

tpi2007 said:


> Anandtech:
> 
> The difference to the i7 2600K is 73.6W, and to the i5 2500K it's 95.7W; that last figure is the entire TDP of a Sandy Bridge chip like the i7 2600K!
> 
> Also, both from Anand and PC Pro, these chips don't have much headroom (but who would want to overclock these when they use so much power already?):


5GHz on a H50? 4.5 on stock cooler?! Where is all the powah going?


----------



## Tchernobyl

Review in German from Gamestar.de, if you want to add that link~

http://www.gamestar.de/hardware/proz...6,2561298.html

Google translate source


----------



## jrbroad77

Quote:



Originally Posted by *totallynotshooped*


Comment, John Freuhe?


I'm sure he's in hiding now; he won't be checking email and such until the AMD guys get a decent patch working for Windows 7. Hopefully they can just whip up a W7 SP2 in a week or so to give it all the Windows 8 performance gains.

On a more serious note, where's the Hitler reaction video about Bulldozer (the scene from Inglourious Basterds)?


----------



## Sin0822

Quote:



Originally Posted by *Blameless*


Initial reviews almost always do.

They tend to brute force everything in order to push out an article on time, rather than learning the nuances of the platform/architecture.


True, and the issue with BD is that it's very sensitive to voltages; it can do an OC on very low voltage, but people don't know that. I didn't bother doing a performance review; I figured everyone would come to the same conclusion. So I focused on its HT and NB vs. CPU clocks vs. multi vs. HTT vs. voltage vs. DRAM performance and OC. My article should come sometime tomorrow on this forum and others.


----------



## Sophath

Anandtech:

Quote:

AMD indicated the FX-8150 was good for around 4.6GHz using air cooling, 5GHz using water cooling and beyond with more aggressive cooling methods. In our experience with the platform hitting 4.6GHz, stable, on a stock AMD HSF was not an issue. Moving beyond 4.6GHz on air saw a significant decrease in stability however. I could boot and run benchmarks at 4.7GHz but I'd almost always encounter a crash. I couldn't hit 5GHz on air.

PC Pro:

Quote:
Theoretically, that gives AMD a glimmer of hope when it comes to overclocking but, in our tests, we had mixed results. Boosting the FX-8150 to 4.2GHz was straightforward, but the system became unstable at 4.4GHz.

You think it might be due to lack of power from the PSU?


----------



## mad0314

Quote:



Originally Posted by *Kvjavs*


Why? The Phenom I X8 actually seems to do good in BF3.


More like Pentium 4 X8.


----------



## AMC

Quote:



Originally Posted by *Sin0822*


True, and the issue with BD is that it's very sensitive to voltages; it can do an OC on very low voltage, but people don't know that. I didn't bother doing a performance review; I figured everyone would come to the same conclusion. So I focused on its HT and NB vs. CPU clocks vs. multi vs. HTT vs. voltage vs. DRAM performance and OC. My article should come sometime tomorrow on this forum and others.


Something I will look into. Will wait for that









What was your facial expression when you first started using the chip?


----------



## BALAST

Quote:



Originally Posted by *SMK*


Holy power consumption!!!


----------



## Sin0822

Quote:



Originally Posted by *Sophath*


AMD indicated the FX-8150 was good for around 4.6GHz using air cooling, 5GHz using water cooling and beyond with more aggressive cooling methods. In our experience with the platform hitting 4.6GHz, stable, on a stock AMD HSF was not an issue. Moving beyond 4.6GHz on air saw a significant decrease in stability however. I could boot and run benchmarks at 4.7GHz but I'd almost always encounter a crash. I couldn't hit 5GHz on air.

PC Pro:

Quote:
Theoretically, that gives AMD a glimmer of hope when it comes to overclocking but, in our tests, we had mixed results. Boosting the FX-8150 to 4.2GHz was straightforward, but the system became unstable at 4.4GHz.

You think it might be due to lack of power from the PSU?


Reviewers use overrated review sample PSUs; they don't underpower the system.


----------



## G3RG

Weird that every single review I read could only hit exactly 4.6GHz lol...


----------



## Sophath

Quote:



Originally Posted by *cjc75*


Shouldn't the "codename" on CPUz say "Zambezi" and not "Bulldozer" ?

I thought Bulldozer was the name of the overall Architecture around which several chips were based, and _ZAMBEZI_ was the "codename" for the FX Desktop Series Processor based on that Architecture?


Yeah, but the codename isn't shown anywhere. None of the reviews' CPU-Z shots show the codename Zambezi.


----------



## gooface

AMD needs to go back to the drawing board; bye bye AMD for a couple of years, till they rethink things for a while.


----------



## Jared2608

I'm still interested to see what these chips will cost over here when they eventually arrive. The benchies are disappointing though..


----------



## ChicknWafflZ

Quote:



Originally Posted by *gooface*


AMD needs to go back to the drawing board; bye bye AMD for a couple of years, till they rethink things for a while.


The worst part about this is that Bulldozer is supposed to be the result of AMD rethinking things for a while. Sad.


----------



## xxbassplayerxx

Quote:



Originally Posted by *AMC*


Something I will look into. Will wait for that









What was your facial expression when you first started using the chip?










--->







--->







--->


----------



## Hawk777th

Quote:



Originally Posted by *xxbassplayerxx*









--->







--->







--->










That is sig worthy.


----------



## AMC

Quote:



Originally Posted by *xxbassplayerxx*









--->







--->







--->










My thoughts exactly.


----------



## rockosmodlife

Quote:



Originally Posted by *xxbassplayerxx*









--->







--->







--->










haha, I bet.


----------



## th3illusiveman

That is some horrendous power consumption.... They really needed to upgrade their IPC.

"MOAR CORES (tm)" doesn't work for the majority of PC users (including gamers)


----------



## CJRhoades

So basically, for this:

AMD FX-8150 @ ~4.8GHz
AMD Radeon HD 6950
8GB DDR3 1600MHz

You'd need a 1000w PSU?


----------



## omni_vision

They will need to review their FX prices.


----------



## AMC

Quote:



Originally Posted by *Hawk777th*


That is sig worthy.


Done.

Quote:



Originally Posted by *CJRhoades*


So basically, for this:

AMD FX-8150 @ ~4.8GHz
AMD Radeon HD 6950
8GB DDR3 1600MHz

You'd need a 1000w PSU?


Imagine an SR-2 equivalent lol.


----------



## th3illusiveman

Oh god i hope Balla doesn't find this thread









please!!


----------



## dreameer111

Absolutely ridiculous









The only thing I'm really pissed about is the fact that I didn't listen to my gut and go Intel months ago. 
Instead I waited... and waited... hopeful that the FX would outshine Intel. /sigh


----------



## jrbroad77

I think it's safe to assume a Phenom II X8 on 32nm with 2B transistors would've performed better.

I think it's time to relegate AMD's Bulldozer design team strictly to the Llano/Fusion side, and take some of ATi's engineers to work on Bulldozer's successor.


----------



## Blameless

Quote:



Originally Posted by *th3illusiveman*


"MOAR CORES (tm)" doesn't work for the majority of PC users (including gamers)


PC users are a secondary concern to workstation, server, and HPC markets.

Quote:



Originally Posted by *CJRhoades*


So basically, for this:

AMD FX-8150 @ ~4.8GHz
AMD Radeon HD 6950
8GB DDR3 1600MHz

You'd need a 1000w PSU?


No.


----------



## Arni90

Dear god, this is terrible.
This isn't even too little, too late, it's more like sucky, suck, suck.

They would have been better off die-shrinking Phenom II with a bit faster L3 cache. It's barely competing with the 1100T


----------



## yoshi245

I hope Piledriver does a better job, especially on Win8, but that's just me being optimistic. Glad I went with my 2500k.

I only really see BD as an alternative for someone who wants a minor upgrade/sidegrade on an existing 890/990 mobo, or someone who wants to OC really high on water, phase change, LN2 etc.


----------



## Hawk777th

I am just amazed at the power draw. I was wanting one for folding....Nope.


----------



## gerickjohn

IGN apparently has a video about it.









Seems the 8150 iirc is gonna be bundled with a water kit? =o


----------



## CJRhoades

Quote:



Originally Posted by *Blameless*


No.


Why not? Bit-Tech was showing nearly 600w usage with only the CPU being stressed at 4.8GHz.


----------



## black96ws6

Quote:



Originally Posted by *Blameless*


PC users are a secondary concern to workstation, server, and HPC markets.

No.


No??? I'd go with that just to be safe if adding a decent video card bumps power up to 800W!! And what if you OC the video card??

From Hardwarecanucks:

Quote:



In order to hit 4.5GHz, we set the vCore to 1.50V with some light Load-Line Calibration (LLC). At this voltage, the processor heated up considerably, reaching well over 77°C when stress tested by Prime95 In-place large FFTs.

At these settings, the system is idling at around 195W, but when running the aforementioned Prime95 stress test,* it pulls an immense 550W from the socket.* If we also add a fully loaded GeForce GTX 460 1GB to the mix, that number spikes up to almost* 800W*.


----------



## Kvjavs

They locked the comments on their YouTube video displaying the Phenom I X8 "beating" Intel. Haha!


----------



## PsikyoJebus

Well, it looks like I'm going to keep this chip a little while longer. I don't see a need to switch to this. Bring on the next generation of chips!


----------



## Blameless

Quote:



Originally Posted by *CJRhoades*


Why not? Bit-Tech was showing nearly 600w usage with only the CPU being stressed at 4.8GHz.



Quote:



Originally Posted by *black96ws6*


No??? I'd go with that just to be safe if adding a decent video card bumps power up to 800W!! And what if you OC the video card??

From Hardwarecanucks:


They were using too much vcore, and measuring from the socket (so you can subtract 15% right there).


----------



## omni_vision

Quote:



Originally Posted by *CJRhoades*


So basically, for this:

AMD FX-8150 @ ~4.8GHz
AMD Radeon HD 6950
8GB DDR3 1600MHz

You'd need a 1000w PSU?


exactly


----------



## hammertime850

Quote:



Originally Posted by *CJRhoades*


Why not? Bit-Tech was showing nearly 600w usage with only the CPU being stressed at 4.8GHz.


That was measured at the wall; multiply by the PSU's efficiency (~85%?), so roughly 510W of actual draw for 600W at the wall.
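That wall-versus-DC arithmetic can be sketched out like this (the 85% efficiency is the figure assumed in the post; real PSU efficiency varies with load, so treat this as an estimate only):

```python
def dc_draw_from_wall(wall_watts, efficiency=0.85):
    """Estimate the DC power the PSU actually delivers to the system,
    given a wall (AC) measurement and an assumed PSU efficiency."""
    return wall_watts * efficiency

# 600W measured at the wall at ~85% efficiency -> roughly 510W DC
print(dc_draw_from_wall(600))
```

The same conversion explains why "at the socket" numbers from reviews overstate what the components themselves draw.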


----------



## rusky1

Well, good thing I waited for the reviews. I see no reason whatsoever to upgrade (more like a downgrade) from my 965. If I build a PC in the next 6 months, assuming AMD doesn't have some sort of miracle breakthrough, I'm going Intel.


----------



## nagle3092

So happy I jumped on SB the day it released. As for AMD, well, at least they still have good GPUs and the mobile APUs....


----------



## Dr. Zoidberg

From Hardware Canucks with FX-8150 @ 4.6 GHz:

"At these settings, the system is idling at around 195W, but when running the aforementioned Prime95 stress test, it pulls an immense 550W from the socket. If we also add a fully loaded GeForce GTX 460 1GB to the mix, that number spikes up to almost 800W."

http://www.hardwarecanucks.com/forum...review-19.html


----------



## microfister

Quote:



Originally Posted by *Arni90*


Dear god, this is terrible.
This isn't even too little, too late, it's more like sucky, suck, suck.

They would have been better off die-shrinking Phenom II with a bit faster L3 cache. It's barely competing with the 1100T


My thoughts exactly. I'm just glad I had others sway me towards a 2600K Z68 build in an effort to prepare for Ivy Bridge. I was originally going to go 955 and upgrade to BD when it was released; good thing I didn't, as $250ish in upgrades later I'd still be bottlenecking my GPUs.

AMD had to have known that this was going in the wrong direction long before reviews started leaking; why didn't they try to push the core performance up? What are they going to do when their new 7-series graphics cards come out, if BD bottlenecks a single 580? They are on a pretty steep downhill path right now.


----------



## black96ws6

OIC. But still, forget about going SLI; it will melt your PSU with BD (not that you'd want BD for SLI anyway)


----------



## CJRhoades

Quote:



Originally Posted by *Blameless*


They were using too much vcore, and measuring from the socket (so you can subtract 15% right there).



Quote:



Originally Posted by *hammertime850*


that was measured at the wall, then multiply by the psu's power efficiency ~85? so 510w for 600 at the wall.


Even so... wouldn't you go with 1000W just to be safe? I'd be scared to try it with an 800W PSU.


----------



## Badness

Quote:



Originally Posted by *Kvjavs*


Why? The Phenom I X8 actually seems to do good in BF3.


Because if BF3 were CPU limited, that wouldn't be the case.

Unless maybe BF3 uses 8 cores. Idk...


----------



## hammertime850

Quote:



Originally Posted by *Dr. Zoidberg*


From Hardware Canucks with FX-8150 @ 4.6 GHz:

"At these settings, the system is idling at around 195W, but when running the aforementioned Prime95 stress test, it pulls an immense 550W from the socket. If we also add a fully loaded GeForce GTX 460 1GB to the mix, that number spikes up to almost 800W."

http://www.hardwarecanucks.com/forum...review-19.html


So you would need a good 650W.


----------



## Derp

Quote:



Originally Posted by *JF-AMD*


IPC will be higher
Single threaded performance will be higher


Glad to hear; this means that if I were an AMD user, buying a new AMD motherboard would mean I'd be ready for faster IPC when BD came out!!! Oh wait.... what's this then?










OCN TOS: 
Quote:



You are EXPECTED to: Post factual information


 Deserves a ban IMO.


----------



## Dr. Zoidberg

To be on the safe side, I would probably go with a 900 watt PSU.


----------



## sLowEnd

Quote:



Originally Posted by *BLKKROW*


I want to see benchies after theL1 Cache fix.


That band-aid fix is for Linux

http://www.xtremesystems.org/forums/...-tested/page16


----------



## omni_vision

Maybe this is why the HD 7000 series will use a lot less power.


----------



## Cyrious

So, I went and read through all that, and it was rather underwhelming. Yes, I know the platform needs some time to mature, but even so, it's... lame, for lack of a better term.

Guess when I do upgrade to an AMD system (I decided on this a long time ago), I'm just gonna grab a PhII X6, a decent board with at least 3 PCI-E x16 slots, a decent midrange card, and just leave it at that for a few years.


----------



## jrbroad77

Quote:



Originally Posted by *Derp*


Glad to hear; this means that if I were an AMD user, buying a new AMD motherboard would mean I'd be ready for faster IPC when BD came out!!! Oh wait.... what's this then?

http://images.hardwarecanucks.com/image/mac/reviews/AMD/Bulldozer/AMD_FX-8150-18.jpg

OCN TOS:

Deserves a ban IMO.


Those benches are heavily Intel biased. ;)


----------



## black96ws6

Quote:



It is hard to find value in the $245 FX-8150 when the X6 1100T retails for a mere $190. That is almost 30% cheaper and you are getting very similar performance in all but the most highly-threaded applications. *With 3 to 4 years of development time, a new manufacturing process, and twice as many transistors as a Phenom II X6, you can't blame us for being slightly disappointed*. Heck, the FX-8150 doesn't even consume less power than the Phenom II X6 1100T.


The 1090T can actually be found for $159 these days; it's basically the same as the 1100T...


----------



## Uncivilised

AMD really screwed up hard, but life has taught me that success evolves from failure and that history always repeats itself. So one day, no matter how long it takes them, AMD will have the crown again...

On a more positive note, I'm glad my i7 920 easily keeps up with, and in some cases even beats, the "8 core" FX-8150 CPU.


----------



## EAPidgeon

Well, it looks like all of my half-computer-literate friends will treat me like the next messiah now, for denying with all my heart that Bulldozer could possibly be good.

Except for poor Nick... who bought an AM3+ motherboard and decided to never use it.


----------



## PyroTechNiK

http://www.youtube.com/watch?v=8rDwXuAINJk

They finally disabled comments on this fraudulent video... I wonder why...


----------



## toX0rz

There you go, rumours turned out to be true; FX is an epic fail.

Can't be bothered to go through every review, but does anyone know if there's an MGPU review with the FX-8150?

I'd really like to see how much of a bottleneck that CPU is; I can imagine an old Phenom X6 would scale better with MGPU setups.


----------



## Phantom123

Extremely disappointed. Just ordered my motherboard and CPU below. I am glad I waited 10 months anyway; lower prices for everything else. Power consumption is too high for Bulldozer and the performance is just not there for my uses.


----------



## Dr. Zoidberg

Quote:



Originally Posted by *PyroTechNiK*


http://www.youtube.com/watch?v=8rDwXuAINJk

They finally disabled comments on this fraudulent video... I wonder why...


I like that AMD says that the core i7 980x scores 5.41, when in reality it scores close to 9.


----------



## omni_vision

Quote:



Originally Posted by *black96ws6*


The 1090T can actually be found for $159 these days; it's basically the same as the 1100T...


The 2500K can be found for $150...


----------



## mksteez

What a flop


----------



## xPwn

Quote:



Originally Posted by *PyroTechNiK*


http://www.youtube.com/watch?v=8rDwXuAINJk

They finally disabled comments on this fraudulent video... I wonder why...


Maybe it was after my brother used his 17 youtube accounts and disliked every AMD video and posted tons of spam on them


----------



## Blameless

Quote:



Originally Posted by *CJRhoades*


Even so... wouldn't you go with 1000W just to be safe? I'd be scared to try it with an 800W PSU.


I'd be much, much, more worried about my motherboard's VRM.

~700W on the PSU is likely more than enough for any OC you could achieve with BD on air, including the rest of the system (assuming a single GPU).
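As a rough sketch of that sizing logic (the 30% headroom margin and the list of common PSU sizes below are illustrative assumptions, not figures from the reviews):

```python
def recommended_psu(dc_load_watts, headroom=0.30):
    """Round a worst-case DC load up by a safety margin to the next
    common PSU size. Margin and size list are illustrative assumptions."""
    common_sizes = [450, 550, 650, 750, 850, 1000, 1200]
    target = dc_load_watts * (1 + headroom)
    for size in common_sizes:
        if size >= target:
            return size
    return common_sizes[-1]

# ~510W worst-case DC load with 30% headroom suggests a 750W unit
print(recommended_psu(510))
```

On numbers like these, a quality 700-750W unit already covers an overclocked BD plus a single GPU; 1000W is comfort margin, not a requirement.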


----------



## Derp

Disabled comments instead of removing the video tells me that they were purposely trying to LIE and trick their customers instead of it being a genuine mistake.


----------



## Domino

Well, what an absolute garbage chip. I was really looking forward for bulldozer, and frankly, it didn't deliver.

Quote:



Originally Posted by *JF-AMD*


Single threaded performance will be higher

That is all we can say at this point.


Was this even true?


----------



## Diabolical999

Bulldozer is like the Windows ME of AMD cpus.


----------



## QuackPot

Quote:



Originally Posted by *PyroTechNiK*


http://www.youtube.com/watch?v=8rDwXuAINJk

They finally disabled comments on this fraudulent video... I wonder why...


Next thing the like/dislike will go.


----------



## meetajhu

AMD Bulldozer just got demolished by Intel!

AMD is the joke of the generation..


----------



## Blameless

Quote:



Originally Posted by *Domino*


Was this even true?


In a handful of scenarios.

Overall, no.


----------



## QuackPot

Quote:



Originally Posted by *Domino*


Well, what an absolute garbage chip. I was really looking forward for bulldozer, and frankly, it didn't deliver.

Was this even true?


That's going in my sig.


----------



## xPwn

Quote:



Originally Posted by *Blameless*


In a handful of scenarios.

Overall, no.


Do you happen to be JF-AMD ?









EDIT: Guys, im just kidding. WHY SO Serious???


----------



## nagle3092

I am getting a little chuckle thinking about all those guys that ran out and bought AM3+ boards. What a waste of money at this point.


----------



## Domino

Quote:



Originally Posted by *Blameless*


In a handful of scenarios.

Overall, no.


Such a disappointment too. :/

I have no use for this chip.

I think the people trying to scorn AMD users should take a hike.


----------



## wongwarren

All this hype and still stomped by a 920, AMD, I am disappoint.


----------



## just_nuke_em

Soooo, is there any confirmation that these fit in the white sockets? I want something new to DICE and I only got a CHIV.


----------



## Derp

Quote:



Originally Posted by *nagle3092*


I am getting a little chuckle thinking about all those guys that ran out and bought AM3+ boards. What a waste of money at this point.


Well they were told by AMD that IPC would increase so they thought they would have an upgrade path........ Add that to the video where AMD is trying to fool ignorant customers into thinking the i7 980x only gets 5.41 in cinebench when it actually gets 9....

Intel is the shady company right? Not AMD?


----------



## Seid Dark

Even AMD fanboys gotta admit that this is a complete failure. It seems every time AMD brings out a new architecture, it fails to deliver. It's slower than Phenom II in some games and benchmarks







I miss the glory days of AMD Athlon 64.


----------



## xxbassplayerxx

Quote:



Originally Posted by *just_nuke_em*


Soooo, is there any confirmation that these fit in the white sockets? I want something new to DICE and I only got a CHIV.


Going to be a lot of 990 boards up for sale for really cheap in the near future!

Quote:



Originally Posted by *Derp*


Well they were told by AMD that IPC would increase so they thought they would have an upgrade path........ Add that to the video where AMD is trying to fool ignorant customers into thinking the i7 980x only gets 5.41 in cinebench when it actually gets 9....

Intel is the shady company right? Not AMD?










In the credits they state that it was a 2500K. Legitimate mistake or purposeful deceit is the question.


----------



## Badness

Quote:



Originally Posted by *nagle3092*


I am getting a little chuckle thinking about all those guys that ran out and bought AM3+ boards. What a waste of money at this point.


Hey! I only did it because my old board sucked :3 Now I have 8x/8x and ddr3.
But, I really was banking on getting an upgrade path and small upgrade, not just the latter.


----------



## Blameless

Quote:



Originally Posted by *xPwn*


Do you happen to be JF-AMD ?










No, but I can see the fine line between fact and truth, and if I wanted to, I could spin quite a fairy tale out of nothing but factual statements.


----------



## Dr. Zoidberg

I bet AMD will dupe a lot of people into buying this with that meaningless world-record overclock. What good is an overclock if you need a nuclear reactor to power it?


----------



## xPwn

Well, time for bed







School 2morrow xD Also, In 16 hours when I come back I will bet that the page count will have doubled. Peace, bros !


----------



## Derp

Quote:



Originally Posted by *xxbassplayerxx*


In the credits they state that it was a 2500K. Legitimate mistake or purposeful deceit is the question.


If it was a legit mistake, the video would be removed, edited, and re-uploaded. Instead they leave the false video up and disable the comments. Obviously trying to trick their customers.


----------



## Blameless

Quote:



Originally Posted by *Derp*


Intel is the shady company right? Not AMD?










All companies are shady.


----------



## just_nuke_em

Quote:



Originally Posted by *xxbassplayerxx*


Going to be a lot of 990 boards up for sale for really cheap in the near future!


Mmmm, I do love new hardware.

LOL, just saw this:

Quote:



Originally Posted by *Lost Circuits*

All it takes is a tweezer to break off the two extra pins and it will at least mechanically fit into the older socket.


----------



## AtomicFrost

Quote:



Originally Posted by *xxbassplayerxx*


Going to be a lot of 990 boards up for sale for really cheap in the near future!


I wonder how low people will be dumping them for on here. Might actually make a decent budget upgrade if AMD drops the prices on BD.


----------



## XxBeNigNxX

Quote:



Originally Posted by *xxbassplayerxx*


In the credits they state that it was a 2500K. Legitimate mistake or purposeful deceit is the question.


There is NO mistake with what they showed. It would have to go through a room full of people to get approved as the final edit.

It would be deceit, but even more so misdirection.

AMD is NO angel, and at the end of the day, just like any company, they want consumers' money.


----------



## tw33k

I don't consider my 990FXA a waste. I am disappointed that BD wasn't what I hoped. I need to upgrade tho so I'm gonna grab a 1090T and wait to see what BD rev2.0 brings.


----------



## BallaTheFeared

Quote:



Originally Posted by *th3illusiveman*


Oh god i hope Balla doesn't find this thread









please!!


Muahahahaha.

I've been reading up


----------



## mad0314

AMD fell down a few steps as a company in my eyes, not because of this horrible product, but because of how they handled the situation in front of their customers.


----------



## sLowEnd

The reaction about the power consumption seems a bit overblown.

Many people on OCN have overkill power supplies that can easily handle FX's power consumption and then some.
Heck, we even have an overkill PSU club.
http://www.overclock.net/power-suppl...-psu-club.html

Also, unless you are planning to do a CPU intensive task long term (e.g. 24/7 Folding), load power consumption should be the least of your worries when it comes to your power bill. The idle power consumption of FX is actually pretty impressive.
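To put that load-versus-idle point in numbers, a quick sketch (the $0.12/kWh price and the wattages here are made-up illustrative values, not measurements from any review):

```python
def monthly_cost(watts, hours_per_day, price_per_kwh=0.12, days=30):
    """Electricity cost of running a component at a given draw
    for a given number of hours per day over a month."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

# A couple of hours per day at a 550W full-system load barely registers...
gaming = monthly_cost(550, 2)
# ...while 24/7 folding at the same draw is another story entirely.
folding = monthly_cost(550, 24)
print(round(gaming, 2), round(folding, 2))
```

The gap between those two figures is why load consumption mostly matters for folders and other 24/7 workloads, which is exactly the point above.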


----------



## jivenjune

Quote:



Originally Posted by *Derp*


Well they were told by AMD that IPC would increase so they thought they would have an upgrade path........ Add that to the video where AMD is trying to fool ignorant customers into thinking the i7 980x only gets 5.41 in cinebench when it actually gets 9....

Intel is the shady company right? Not AMD?










Agreed.

Edit:

Sigh, letting my emotions get the better of me again. I'll refrain.


----------



## Serious Dude

Bought a 955 and AM3+ board thinking Bulldozer would be an Intel killer.... What a waste of money; should've bought a 2500K.


----------



## Oedipus

Quote:



Originally Posted by *sLowEnd*


The reaction about the power consumption seems a bit overblown.

Many people on OCN have overkill power supplies that can easily handle FX's power consumption and then some.
Heck, we even have an overkill PSU club.
http://www.overclock.net/power-suppl...-psu-club.html

Also, unless you are planning to do a CPU intensive task long term (e.g. 24/7 Folding), load power consumption should be the least of your worries when it comes to your power bill. The idle power consumption of FX is actually pretty impressive.


Whether people have overkill PSUs or not does not excuse the breathtaking load power consumption, especially when OCed. This is a non-starter chip for folders unless they generate their own power or don't pay for it.


----------



## Badness

Won't BD 2.0 or w/e you want to call it NOT be using am3+?


----------



## Dopamin3

Quote:



Originally Posted by *Badness*


Won't BD 2.0 or w/e you want to call it NOT be using am3+?


Nope, AM3+


----------



## terraprime

I still wonder what this puppy does with Folding@home.


----------



## jivenjune

Quote:



Originally Posted by *Dopamin3*


Nope, AM3+

Isn't Piledriver still incredibly underwhelming when considering that it's only expected to be 10 percent better than the current FX processors?


----------



## sLowEnd

Quote:



Originally Posted by *Oedipus*


Whether people have overkill PSUs or not does not excuse the breathtaking load power consumption, especially when OCed. This is a non-starter chip for folders unless they generate their own power or don't pay for it.


And?
Load power consumption will not greatly impact your power bill unless you keep the CPU loaded for extended periods of time.

As for folding, you're basically reiterating my point.


----------



## swindle

Quote:



Originally Posted by *Serious Dude*


Bought a 955 and an AM3+ board thinking Bulldozer would be an Intel killer... what a waste of money. Should've bought a 2500K.

Yeah man.

You and I did the exact same thing, and feel the exact same way.


----------



## ReignsOfPower

Add this to the thread
http://www.overclock3d.net/reviews/c...0_cpu_review/1


----------



## toX0rz

Quote:



Originally Posted by *jivenjune*


Isn't Piledriver still incredibly underwhelming when considering that it's only expected to be 10 percent better than the current FX processors?


Yep, barely 2500K-2600K performance.

It's even worse: the following generations after Piledriver are also only going to boost performance by 10-15%: http://www.overclock.net/hardware-ne...l#post15273068

That means somewhere between 2013 and 2014 we might see Gulftown performance lol.


----------



## Dopamin3

Quote:



Originally Posted by *jivenjune*


Isn't Piledriver still incredibly underwhelming when considering that it's only expected to be 10 percent better than the current FX processors?


10% better isn't even going to match Phenom II IPC in the vast majority of cases so it is very underwhelming. Also keep in mind Ivy Bridge will most likely be out around the time this releases, further slamming AMD. It makes me sad, not only how much of a fail Bulldozer is, but how the AMD FX brand name is completely ruined. And then you've got AMD trying to mislead customers (see here and here.)


----------



## th3illusiveman

Quote:



Originally Posted by *sLowEnd*


And?
Load power consumption will not greatly impact your power bill unless you keep the CPU loaded for extended periods of time.

As for folding, you're basically reiterating my point.


It's a bad thing; there is no use in rationalizing it. The CPU consumes way too much power for its underwhelming performance, and that is really it.

I can't understand how anyone (even AMD themselves) could defend this.


----------



## terraprime

Quote:



Originally Posted by *sLowEnd*


And?
Load power consumption will not greatly impact your power bill unless you keep the CPU loaded for extended periods of time.

As for folding, you're basically reiterating my point.


Yeah, but I doubt it will be cost effective vs. the 2600K the way this thing is looking. Especially since Folding@home is FP heavy, they are screwed; at least it's decent in multi-threaded apps, though not crazy good.

And I was kind of excited to see what BD was going to bring, and now I know what I'm buying in the next six months unless Piledriver brings more than a 10% performance increase over BD's current lackluster creation.


----------



## Oedipus

Quote:



Originally Posted by *th3illusiveman*


Its a bad thing, there is no use in rationalizing it. The CPU consumes way too much power for its underwhelming performance and that is really it.

I can't understand how anyone (even AMD themselves) could defend this.


Exactly.


----------



## sLowEnd

Quote:



Originally Posted by *th3illusiveman*


Its a bad thing, there is no use in rationalizing it. The CPU consumes way too much power for its underwhelming performance and that is really it.

I can't understand how anyone (even AMD themselves) could defend this.


I didn't say it was a good thing. I said I think the reaction is overblown.


----------



## geovas77

This is a really sad day for computing, hopefully AMD can recover and learn from this although it will certainly not be easy after letting down so many people and certainly the largest part of the enthusiast community.


----------



## scotty453

I guarantee that AMD will change their road map and scrap this architecture; it sucks. Man, I was so looking forward to it...

EDIT: I'm currently ripping up everything AMD-related I have. I'm not happy; I wanted something that could beat my Athlon single-threaded, and it can't... just wow.


----------



## ivymaxwell

AMD, I can't believe the inefficient piece of junk you made.

No wonder all those executives and CEOs left.

And oh yeah, never trust people with an agenda in the company. *JF-AMD is a politician*.


----------



## omninmo

Quick question: I just woke up to a massive 60-page thread lol, and can't afford to read all the reviews and all the pages to see if this was answered.

*Did any review actually try to disable 1 core per module so that the FX-8150 had 4 "un-shared" threads, and see if it overclocked higher?*

I'm guessing this approach would not only improve IPC, sort of like what the Windows 8 scheduler will do (check the final pages of Tom's review: *12.7% increase in FPS* for WoW), but also cut back on power and heat, allowing more OC headroom.

I'm thinking a 5GHz 4-core (1-core-per-module) AMD Overdrive profile for GAMING would provide less of an embarrassment?

SOMEONE PLEASE TEST THIS

(or let me know if someone already has...)


----------



## AMC

Anand's article was a great read. Relatively decent concepts but the fabrication is not there. Maybe Revision 2 will be better.

All in all, a total bust. Another server CPU it seems.


----------



## Djmatrix32

When is it coming out?


----------



## rockosmodlife

Quote:



Originally Posted by *Djmatrix32*


When is it coming out?


About 4 hours ago


----------



## djxput

Just skimmed through about 12 reviews here...

- low idle power consumption, but quite high load power consumption
- fares not as well in games, and for me this is what I care about most (don't need a fast processor to watch movies, browse forums, etc.)
- low temps (which is nice)

I didn't read all the reviews in depth, but did they talk much about BIOS/drivers etc. and how updated ones will impact the numbers?
I did notice one review comparing 1333 speed RAM to 1866, and there was no difference in performance.

Alas, while I would really like to go AMD this time around... saving more trees (power consumption), faster performance in games (what my CPU will be used for at load)... it doesn't look like the FX Bulldozer will be my cup of tea.

Even if we saw a 10% improvement in the numbers from this or that, would I still want it instead of a 2600K for games? ...doesn't look like it.


----------



## Captain Bucket

Can't wait for Ivy on Z77.

Hope the lack of competition doesn't hurt Intel's performance too much.


----------



## Djmatrix32

Quote:



Originally Posted by *rockosmodlife*


About 4 hours ago


I don't see it on Newegg or TigerDirect.


----------



## rockosmodlife

Quote:



Originally Posted by *Djmatrix32*


I don't see it on Newegg or TigerDirect.

Reflex99 bought one earlier, I dunno.


----------



## Shalhoub

I should have bought SB at the beginning of the year... damn, AMD... you lost a big customer... Intel, here I come.


----------



## TheBlademaster01

I'm glad that I went with Intel back in February; my Q9550 from '07 even manages to beat this FX-8150 in most games. It has better IPC, after all...


----------



## MisterMalv

Well, I'm glad I didn't fork out on that 990 motherboard the other day....
Looks like I'm keeping my 955 for the foreseeable future, which is fine by me.


----------



## Hawk777th

Not trying to gloat or start a war. I am just so glad that I jumped on SB when it came out! I kept feeling a couple months afterwards that I had made a mistake not waiting for BD. My fears were confirmed with the early leaks on the chips.

Guess sometimes you get lucky!


----------



## IXcrispyXI

Well, looking at the benches, I'm very glad I didn't wait for BD. Money well spent on my 2600K.


----------



## dreameer111

Quote:



Originally Posted by *Shalhoub*


I should have bought SB at the beginning of the year... damn, AMD... you lost a big customer... Intel, here I come.


I just finished making my SB order a few mins ago.


----------



## GreekElite

Would the FX-8150 be good for running a lot of virtual machines, since it has 8 cores?


----------



## mad0314

Quote:



Originally Posted by *omninmo*


Quick question: I just woke up to a massive 60-page thread lol, and can't afford to read all the reviews and all the pages to see if this was answered.

*Did any review actually try to disable 1 core per module so that the FX-8150 had 4 "un-shared" threads, and see if it overclocked higher?*

I'm guessing this approach would not only improve IPC, sort of like what the Windows 8 scheduler will do (check the final pages of Tom's review: *12.7% increase in FPS* for WoW), but also cut back on power and heat, allowing more OC headroom.

I'm thinking a 5GHz 4-core (1-core-per-module) AMD Overdrive profile for GAMING would provide less of an embarrassment?

SOMEONE PLEASE TEST THIS

(or let me know if someone already has...)


I don't think you can disable a single integer core; it's the whole module or nothing. I would much rather have seen a beastly 4-core with 4 integer cores and 4 unshared FPUs than a 2-module 4-core, but all the chips are the same die as the 8-core with modules disabled, and not unlockable.

Quote:



Originally Posted by *AMC*


Anand's article was a great read. Relatively decent concepts but the fabrication is not there. Maybe Revision 2 will be better.

All in all, a total bust. Another server CPU it seems.


They had a lot of neat ideas and I was rooting for them, but it seems they tried to do too much. Hopefully we will see better results from their current ideas in their next chips, a la Phenom -> Phenom II. It's either that or jump ship like Pentium 4 (which is what it's looking a LOT like).

Quote:



Originally Posted by *Oedipus*


Whether people have overkill PSUs or not does not excuse the breathtaking load power consumption, especially when OCed. This is a non-starter chip for folders unless they generate their own power or don't pay for it.


Agreed. People being misinformed about their power consumption does not make up for high power consumption with terrible performance. Low power consumption OR good performance would be OK, but it came with the worst of both...


----------



## gooface

If AMD was smart, they would drop the prices on these CPUs by a lot, scrap their plans to make any better versions of it, do a complete redesign, and shrink the Phenom IIs to 32nm, or just make them faster or add more cores. Just not this...


----------



## CJRhoades

Quote:



Originally Posted by *gooface*


if AMD was smart they would drop the prices on these cpus by a lot, then scrap their plans to make any better versions of it, and do a complete redesign, and make the phenom II's 32nm or something or just faster or add more cores to them. Just not this...


You can't just drop something as big as this and "make" new CPUs. New architectures take years of R&D. AMD has probably been working on Bulldozer since 2006 or something...


----------



## Derp

Quote:



Originally Posted by *gooface*


if AMD was smart


That went out the window when they decided to continue developing this failbot processor, and even release it at the prices they're asking.


----------



## djxput

Quote:



Originally Posted by *dreameer111*


I just finished making my SB order a few mins ago.

Off-topic: what did you go with? Mobo, etc.? (No idea what motherboard to be looking at.)

Back on topic: I actually did find one review that was somewhat positive:
http://www.hardwareheaven.com/review...onclusion.html

Didn't look like they tested as thoroughly as some places though; but again, I pretty much just skimmed the info (so many reviews).


----------



## Wishmaker

So the previews were accurate. Discrediting them was pointless, especially the LAB501 one. It is pretty obvious that AMD has contacted these guys, because many of the paragraphs are sugarcoated. It is good to see that some did not lose objectivity:

Quote:



There is a silver lining to this. AMD's unlocked FX-8120 is to be priced at around $200 in retail. You noticed we did not spend a lot of time pontificating about the FX-8150? *The reason is because, from an enthusiast standpoint, there is simply no reason to buy an 8150*



The Guru3D review, as opposed to HardOCP's, blames the software entirely for the poor results. Another AMD intervention, pretty obvious, as this is damage limitation yet again. While I agree the software is not on par, I do not buy it given that Thuban is right there where Bulldozer is. So why the software excuse, when the old architecture from AMD does just fine??

From hitechlegion :

Quote:



Incredible Multi Threaded Performance
Excellent Price/Performance Ratio
Smooth Working Turbo Core For Speed Boost
Dual Turbo Modes For Increased Boost In Lightly Threaded Apps
Maintains Excellent Temps At Stock Speeds
Black Edition With Unlocked Multiplier
Eight Physical Cores
Easily Achieves High Overclocks
Excellent OC Performance Scaling
Easily Tweaked With AOD For Maximum Performance


Reading this review makes one ask: where is INTEL in all this? It is pretty obvious that AMD is the performance king... from this review.

The LAB501 guys were spot on. AMD has pushed some buttons, and the fact that review sites have responded to this is an insult to us as readers. I was very interested in buying a Bulldozer rig, but with such photoshopped reviews, I will wait for IVY.


----------



## weebeast

So the leaked benchmarks were real after all. Interesting reviews; I read 6-7 of them.


----------



## awdrifter

Quote:



Originally Posted by *gooface*


if AMD was smart they would drop the prices on these cpus by a lot, then scrap their plans to make any better versions of it, and do a complete redesign, and make the phenom II's 32nm or something or just faster or add more cores to them. Just not this...


Agreed. Just make some higher-clocked Thubans for the desktop market and leave the Bulldozer design for servers. Thuban has only about half the transistor count of BD; if they shrink it down to 32nm and maybe improve its clock speed a bit (4.5GHz or so stock, 5GHz OC), it'll perform better and be cheaper to make as a desktop CPU.


----------



## meetajhu

Quote:



Originally Posted by *CJRhoades*


You can't just drop something as big as this and "make" new CPUs. New architectures take years of R&D. AMD has probably been working on Bulldozer since 2006 or something...


I call AMD's R&D an EPIC fail.


----------



## omninmo

Quote:



Originally Posted by *mad0314*


I don't think you can disable a single integer core. Its the whole module or nothing. I would much rather have seen a beastly 4 core with 4 integer 4 FPU unshared resources than a 2 module 4 core, but all chips are the same chip as the 8 cores with modules disabled, but not unlockable.


Are you sure? I was under the impression they achieved the OC WR by disabling all but 1 core?

Anyhow, you could alternatively, with more hassle, underclock one core per module to minimum and OC the other with AMD Overdrive, and set CPU affinity accordingly... even if active, an 800MHz core (or lower, if possible) won't hog many FP cycles, I suppose, and will still boost performance.

Heck, you could even cherry-pick which core OCs the most, and we might end up having people hitting 4C > 5GHz on high-end air or water... that's what I'd do!

Really wish someone would test this.

Too bad no one will actually be buying this for a while, until AMD drops prices.
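For anyone who wants to try the affinity half of this, it is scriptable. A minimal Linux sketch, assuming the common enumeration where sibling cores within a module are numbered (0,1), (2,3), (4,5), (6,7) — verify the real topology with `lscpu -e` before relying on it:

```python
import os

def one_core_per_module(num_cores=8):
    # First core of each sibling pair: {0, 2, 4, 6} on an 8-core FX
    # (assumes modules are numbered as consecutive pairs -- check `lscpu -e`).
    return set(range(0, num_cores, 2))

def pin_current_process(cpus):
    # Linux-only: restrict the calling process (pid 0 = self) to `cpus`.
    os.sched_setaffinity(0, cpus)
    return os.sched_getaffinity(0)

if __name__ == "__main__":
    cores = one_core_per_module()
    if hasattr(os, "sched_setaffinity") and (os.cpu_count() or 0) >= 8:
        print("pinned to CPUs:", sorted(pin_current_process(cores)))
    else:
        print("would pin to CPUs:", sorted(cores))  # not an 8-core Linux box
```

From the shell, `taskset -c 0,2,4,6 <game>` does the same thing, giving each thread a module to itself.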


----------



## Anarion

I am glad I did not listen to the forum's replies when they told me to wait for BD. I will never make the mistake of waiting for an AMD CPU release EVER again. I am sorry, but AMD is ages behind. Maybe I will buy one of their GPUs in the future, but unless a miracle happens (which I doubt) and they produce a CPU superior to Intel's, I will only think about it. They even failed with their so-called good pricing; they really made Intel seem CHEAPER. Unfortunately, Intel will be left alone in the desktop CPU business. That's life. Farewell, AMD.


----------



## TFL Replica

Reading those reviews was painful. I never expected Bulldozer to get its rear kicked by AMD's very own previous generation.


----------



## Wishmaker

Shockingly, TOMS and ANAND have pretty objective reviews and make sure to touch extensively on aspects where AMD could have done better. It seems that Bulldozer is not really a failure. It is a test where AMD is trying to shift the classical pc way to a more cores = more performance way. I wonder if people will adopt this or keep to what INTEL does.


----------



## gooface

What makes me sick is that they knew back before June, when they delayed it because of the stepping issue, that it was bad, and they were like "the B3 chips will be way better"... oh wait...


----------



## Hawk777th

I really wondered what went on in the testing department the first time they benchmarked it. They had to know they were toast. Bet that was an unpleasant situation.


----------



## toX0rz

Quote:



Originally Posted by *Wishmaker*


Shockingly, TOMS and ANAND have pretty objective reviews and make sure to touch extensively on aspects where AMD could have done better. *It seems that Bulldozer is not really a failure. It is a test where AMD is trying to shift the classical pc way to a more cores = more performance way.* I wonder if people will adopt this or keep to what INTEL does.


Except BD doesn't mean more cores = more performance, but rather more cores and still less performance than 4 cores of the competition.

Most shockingly, even less performance than their own six-core Phenoms on some occasions.

It's OK if AMD is going the "more cores" way, but it's completely pointless and unfavorable if you have to drop the IPC for it.

Same IPC, more cores -> sure, why not.
Lower IPC, more cores -> no thanks.

AMD did it the wrong way.

A Phenom X8 would have had 8 cores while retaining the same IPC and per-clock performance, effectively making it faster than Bulldozer currently is.


----------



## ivymaxwell

Quote:



Originally Posted by *Wishmaker*


Shockingly, TOMS and ANAND have pretty objective reviews and make sure to touch extensively on aspects where AMD could have done better. It seems that Bulldozer is not really a failure. It is a test where AMD is trying to shift the classical pc way to a more cores = more performance way. I wonder if people will adopt this or keep to what INTEL does.


Why would anyone use this new way when the old way is faster and uses less power?

Edit: actually, about half the power lol.


----------



## snowshoe

Having just read the BitTech review, I can safely say that Bulldozer sucks.

Very glad that I saw the writing on the wall and jumped ship to Intel.


----------



## GameBoy

Quote:



Originally Posted by *Wishmaker*


The Guru3D review, as opposed to the HardOCP, blames entirely the software for the poor results. Another AMD intervention, pretty obvious as this is damage limitation yet again. While I agree software is not on par, I do not agree with the fact that Thuban is right there where Bulldozer is. *So why the software excuse when the old architecture from AMD does just fine??*


The older architecture is nothing like Bulldozer. So whether Phenom II's do fine or not is completely irrelevant.

As for the performance, it's not _that_ bad, I also think some software fixes/patches/updates to Windows scheduler will provide some (albeit very small) boosts. The only thing I find completely unacceptable is the power consumption, especially overclocked.

Quote:



Originally Posted by *Anarion*


I am glad I did not listen to the forum's replies when they told me to wait for BD. I will never make the mistake of waiting for an AMD CPU release EVER again. I am sorry, but AMD is ages behind. Maybe I will buy one of their GPUs in the future, but unless a miracle happens (which I doubt) and they produce a CPU superior to Intel's, I will only think about it. They even failed with their so-called good pricing; they really made Intel seem CHEAPER. Unfortunately, Intel will be left alone in the desktop CPU business. That's life. Farewell, AMD.


I never really understood this kind of ludicrous thinking. What makes you think AMD will never have competitive CPU's ever again?

Quote:



Originally Posted by *gooface*


if AMD was smart they would drop the prices on these cpus by a lot, then scrap their plans to make any better versions of it, and do a complete redesign, and make the phenom II's 32nm or something or just faster or add more cores to them. Just not this...


No. If AMD were _stupid_, they would do what you just said.


----------



## QuackPot

Here's another review.

http://hexus.net/tech/reviews/cpu/32...dozer-fx-8150/


----------



## Riskitall84

I was going to take some money out of my X79 fund to play with Bulldozer, but it's not worth buying even over a Phenom II.

What a joke!


----------



## Dublin_Gunner

Quote:



Originally Posted by *GameBoy*


The older architecture is nothing like Bulldozer. *So whether Phenom II's do fine or not is completely irrelevant.*

As for the performance, it's not _that_ bad, I also think some software fixes/patches/updates to Windows scheduler will provide some (albeit very small) boosts. The only thing I find completely unacceptable is the power consumption, especially overclocked.


It's completely relevant from a consumer point of view though.

Why would someone 'upgrade' to BD when their Phenom II performs pretty much on par?

BD should have been kicking Phenom II into touch; unfortunately, in most tasks, it just doesn't.

As irrelevant as some might say this is, game performance is a HUGE letdown with BD. This shows up huge inefficiencies in the architecture, and suggests they may have really dropped the ball on integer performance.

Sure, FP performance is pretty good (as can be seen in the encoding and compression/decompression performance), but AMD is relying too heavily on the industry moving to multi-core-aware software, and there just isn't enough uptake on the average PC to make software developers really push this side of things.

If every main-street PC came equipped with a quad core, this might be different. I feel AMD were hoping for software to be 5 years ahead of where it actually is.

Power consumption is another issue: they are not going to win any server contracts with Zambezi-based CPUs.


----------



## ThePath

It seems that the i5 2500 is better overall than the FX-8150, cheaper, and less power hungry.


----------



## GameBoy

Quote:



Originally Posted by *Dublin_Gunner*


It's completely relevant from a consumer point of view though.

Why would someone 'upgrade' to BD, when their Phenom II performs pretty much on par?


That wasn't my point.


----------



## Xinc

Quote:



Originally Posted by *Bit_reaper*


It's kind of sad how these benches make the old Thuban look more attractive than the FX lineup.

I don't think BD is a total dog, but it's hard to see a reason to buy one.


Well said!!


----------



## Evil Penguin

You know how AMD has multiple CPU divisions?
Seems to me that the BD division was competing with the Thuban/Deneb division.


----------



## gooface

Quote:



Originally Posted by *Evil Penguin*


You know how AMD has multiple CPU divisions?
Seems to me that the BD division was competing with the Thuban/Deneb division.

lol, I love this.


----------



## Devilmaypoop

Quote:



Originally Posted by *Evil Penguin*


You know how AMD has multiple CPU divisions?
Seems to me that the BD division was competing with the Thuban/Deneb division.

+rep

Maybe we're going to see the Phenom division come up with a Phenom III that's just a tweaked K10.5 core on 32nm, but still owns Bulldozer.


----------



## Dublin_Gunner

Quote:



Originally Posted by *Evil Penguin*


You know how AMD has multiple CPU divisions?
Seems to me that the BD division was competing with the Thuban/Deneb division.

lol so true.

It's been in development long enough!


----------



## CramComplex

Ugh...there goes my plans for an AM3+ update...


----------



## Somenamehere

I am still waiting for the only review that matters.

The OCN review, I want to see how high this thing can go 24/7 and what the performance is.


----------



## criminal

Very sad. AMD would have been better off releasing an eight core Thuban.


----------



## Dynomutt

Well, that just sucks!!!!! Looks like my next CPU is a Thuban. Seriously, AMD, how could you even think of releasing this? You should have stuck with the Phenom II architecture and added more cores and MHz. This is just embarrassing for AMD.


----------



## RedCloudFuneral

At least the 4- and 6-core variants might have some redeeming qualities if the pricing is good. I don't see much reason to buy an 8-core, and I bet the 4- and 6-cores OC better. Still, I'm glad I bought my 2500K.


----------



## cayennemist

Wow, just wow...
I've been playing Minecraft all night and came here to check the news before bed.
Yeah, I puked in my mouth a little. Anyone want to buy a CH-V? T.T


----------



## gooface

Quote:



Originally Posted by *criminal*


Very sad. AMD would have been better off releasing an eight core Thuban.


Or a 32nm version...

http://www.newegg.com/Product/Produc...82E16819103995
Did anyone else see this just come out to the retail market? This is the unlockable quad core that unlocks to a hexa-core... sorta weird to release that right before BD.


----------



## fongg

Funny how people were saying AMD wasn't giving out benchmarks early because it would decrease sales of their current products. If they had released them early, 1090T and 1100T sales would've increased. Advanced Micro Dog Crap.


----------



## RedCloudFuneral

Quote:



Originally Posted by *cayennemist*


Wow, just wow..
I been playing minecraft all night, came here to check the news before bed.
Yeah I puked in my mouth a little. Any one want to buy a CH-V? T.T


Keep your Thuban, you don't need an upgrade anyway.


----------



## ivymaxwell

The FX-8150 does not deserve to cost $45 more than the 2500K.


----------



## wamubu

One test I've yet to see is a VM test. I would imagine it would work pretty well with VMs... and also when W8 comes out.

Any news of how Linux handles it?


----------



## gooface

Quote:



Originally Posted by *wamubu*


One test I've yet to see is a VM test. I would imagine it would work pretty good with VMs... And also when W8 comes out.

Any news of how Linux handles it?


with the whole theory about windows 8, SB-E and Ivy will be out when that releases, so BD will be failing even more by then.


----------



## Chrono Detector

All I can say is that AMD failed big time here, what a major disappointment, was planning on getting one but not anymore.


----------



## wamubu

Quote:



Originally Posted by *gooface*


with the whole theory about windows 8, SB-E and Ivy will be out when that releases, so BD will be failing even more by then.


I saw some W8 benches, and they didn't look promising. Most under 5%.

But I still wait for VM tests... Per the bulldozer architecture, it looks like it will eat up VMs. Until patches are out to optimize Windows for bulldozer, maybe VMs will be the way to maximize performance? Just spit-wadding here...


----------



## Serious Dude

Quote:



Originally Posted by *swindle*


Yeah man.

Me and you, we did the exact same thing, and feel the exact same way.


I feel sorry for myself... lol. Maybe I'll wait for some revisions/steppings to come out and see if they change the chip dramatically; then I might buy it. If not, then I've gotta save up to go Intel...


----------



## Swiftes

Thoroughly disappointed; rang Scan and cancelled my preorder this morning.


----------



## jrbroad77

Up to 56x OCL Mandelbrot performance (???). I guess it's good for anything that needs FMA4. Pretty much, AMD just needs to get a game developer to heavily favor FMA4. It's probably a bit too future-proof as it is, since Intel won't use FMA4 for a couple of years.


----------



## Xylian

Disappointed, to say the least. Was hoping it'd at least be competitively priced, but even that is debatable.

I'm getting an i5-2500K :/


----------



## ThePath

Here's the performance-per-dollar graph from the TechReport review:

[performance-per-dollar chart]

Looks like Sandy Bridge has better performance per dollar than Bulldozer.
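The metric behind that chart is easy to recompute from any review's numbers. A sketch with approximate launch list prices and placeholder scores (the `score` values are made up — substitute a real aggregate benchmark result from whichever review you trust):

```python
# Performance per dollar, in the spirit of TechReport's chart.
# Prices are approximate October 2011 list prices; the scores are placeholders.
chips = {
    "FX-8150":  {"price": 245, "score": 100},
    "i5-2500K": {"price": 220, "score": 115},
    "i7-2600K": {"price": 315, "score": 130},
}

def perf_per_dollar(chip):
    return chip["score"] / chip["price"]

# Rank the chips by value, best first:
for name in sorted(chips, key=lambda n: -perf_per_dollar(chips[n])):
    print(f"{name:10s} {perf_per_dollar(chips[name]):.3f} points per dollar")
```

With any score set where the 2500K is faster at a lower price, it wins this metric automatically.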


----------



## Derp

Quote:



Originally Posted by *ThePath*


Here's the performance-per-dollar graph from the TechReport review:

[performance-per-dollar chart]

Looks like Sandy Bridge has better performance per dollar than Bulldozer.


And better efficiency.


----------



## Riskitall84

£194: http://www.aria.co.uk/Products/Compo...roductId=46821

vs. £155 for the 2500K.

Just wow, AMD has really screwed up here!


----------



## swindle

I don't like that graph at all. Very broad and open to all sorts of misconstrued and misguided comparisons and reasoning.


----------



## Dr. Zoidberg

Quote:



Originally Posted by *ThePath*


Here's the performance-per-dollar graph from the TechReport review:

[performance-per-dollar chart]

Looks like Sandy Bridge has better performance per dollar than Bulldozer.


It also has better performance per watt.


----------



## dlee7283

The Phenom X6 has problems with higher-end SLI/Crossfire setups causing bottlenecks; maybe Bulldozer can handle this problem better. Any test setups with higher-end GPUs?

But like someone said earlier, they should have just made a 32nm Phenom II X8 instead of taking 8 months out of our lives for nothing.


----------



## cayennemist

Quote:



Intended primarily for gamers and enthusiasts who regularly perform "extreme" tasks


http://www.pcmag.com/article2/0,2817...id=U_7qv1nfFT4
........ Really AMD?

or

Quote:



Unlock Your Record Setting AMD FX Series Processor Today
October 12, 2011 -- With the first eight-core desktop processor, enthusiasts and overclockers get an amazing PC experience at *unheard of prices*


http://www.amd.com/us/aboutamd/newsr.../newsroom.aspx


----------



## Armand Hammer

Quote:



Originally Posted by *Dr. Zoidberg*


It also has better performance per watt.


The question has to be asked: just how bad was the performance of the early Bulldozer chips, that it caused them to keep delaying the release?

It must have been shocking if this is the best they can come up with; probably around the original Phenom.

Looks like it's back to the future for AMD!


----------



## Artikbot

Has M$ released the kernel update that was intended to fully utilize the FX processors' horsepower?

Have the board manufacturers released BIOSes that are 100% compatible with FX processors yet?

Many ifs, I see.

I remember when the Core i series came out... until the BIOSes were revised and updated, a PhII 940 beat the i7 920 on a regular basis.

Can someone fold on one of those, please?

If these are the definitive results... I don't know what to think.


----------



## Armand Hammer

Quote:



Originally Posted by *swindle*


I don't like that graph at all. Very broad and open to all sorts of misconstrued and misguided comparisons and reasoning.


Such as?

Or is that the fanboy within speaking?


----------



## TheRockMonsi

I wonder how the 4 and 6 core variants of Bulldozer fare.


----------



## cayennemist

What pisses me off the most isn't the performance, or lack thereof, but that they think they can use marketing to cover their butts.

Quote:



Unlock Your Record Setting AMD FX Series Processor Today


----------



## Munkypoo7

I really, really hope there are still optimizations to be had, otherwise AMD really ruined the FX line.

Makes my FX60 look like the last of the true FX line..


----------



## Dr. Zoidberg

What I don't understand is how AMD can claim that their module concept is better than Intel's Hyper-Threading technology when it isn't.

"As a result, Bulldozer cores do not just work at half the speed of Sandy Bridge cores. In addition to that the performance of the Bulldozer processor module with two cores is even lower than that of a single Sandy Bridge core with enabled Hyper-Threading technology."

http://www.xbitlabs.com/articles/cpu...0_8.html#sect0


----------



## Buttermilk

After reading a little it isn't as bad as I thought. Still not impressive. A bigger disappointment than I thought it would be. I think after a price drop this will be perfect for builders. Calling it a flop will discourage people from the branding though. I think that in time this product will find its place and thrive. I'm still waiting for IvyBridge to release and then price drop so I'll probably build another X58 now.


----------



## swindle

Quote:



Originally Posted by *Armand Hammer*


Such as?

Or is that the fanboy within speaking?


Fanboy? How so?

Well I look at that graph and I see an 8150 next to the i5-2500K.

As we can all see from today's benchmarks, and in its current state, the 8150 is so far away from the i5-2500K I wouldn't even call it an alternate choice. Especially with the power draw.

I'd recommend a PII X6 over the 8150.


----------



## Wazige

So in short:

1: Performance sucks.
2: IPC lower than Thuban.
3: All chips use the same silicon, so they might even lose money on the 4-core parts.

They should have shrunk Thuban to 32nm, made some IPC improvements and clocked it at 3800MHz-4000MHz stock.


----------



## dlee7283

Quote:



Originally Posted by *Hawk777th*


I really wondered what went on in the testing department the first time they benchmarked it. They had to know they were toast. Bet that was an unpleasant situation.


+1

Basically the problem is that no one, not even Intel, expected Sandy Bridge to turn out as good as it did. AMD panicked and ended up worse off than when they started Bulldozer.

AMD still has a lot of great-priced CPUs out there ($60 Phenom X4 840 @MC), not including their APUs, which I will buy for my next laptop.

The only problem in this thread is Intel owners spitting on AMD owners like they are dogs and saying their company is garbage.

AMD prevented the Pentium 5


----------



## Armand Hammer

Quote:



Originally Posted by *swindle*


Fanboy? How so?

Well I look at that graph and I see an 8150 next to the i5-2500K.

As we can all see from today's benchmarks, and in its current state, the 8150 is so far away from the i5-2500K I wouldn't even call it an alternate choice. Especially with the power draw.

I'd recommend a PII X6 over the 8150.


Oh OK, I misinterpreted where you were going with that; I thought you were going to try and defend faildozer.

Fanboy jibe retracted.


----------



## vinton13

I'm just going to buy a 1100T...


----------



## Artikbot

^You're just going to wait till new BIOS/Windows updates are out









Just pointing out something... When reviews refer to a single thread... Are they referring to a single module or to a single thread by itself (half an FPU, two ALUs and an FMAC)?


----------



## swindle

Quote:



Originally Posted by *Armand Hammer*


Oh OK, I misinterpreted where you were going with that; I thought you were going to try and defend faildozer.

Fanboy jibe retracted.










Haha all good dude









Poor post was poor on my part tbh.

EDIT: GL to anyone who does try to defend the situation though. Seriously, GL.


----------



## cayennemist

On a more positive note, suddenly I can appreciate my 1100T a lot more.
It has been running @ 4.0 since I bought it. And more recently, on AIR!
Cheap air too...
CM-Hyper 520
30C-40C load, solid stable 24/7

I want a PhIII X8 with a die shrink. Scrap FX!!


----------



## dlee7283

Quote:



Originally Posted by *cayennemist*


On a more positive note, suddenly a can appreciate my 1100T a lot more.


and u can change ur avatar lol


----------



## dlee7283

Quote:



Originally Posted by *Dr. Zoidberg*


What I don't understand is how AMD can claim that their module concept is better than Intel's Hyper-Threading technology when it isn't.

"As a result, Bulldozer cores do not just work at half the speed of Sandy Bridge cores. In addition to that the performance of the Bulldozer processor module with two cores is even lower than that of a single Sandy Bridge core with enabled Hyper-Threading technology."

http://www.xbitlabs.com/articles/cpu...0_8.html#sect0


Bulldozer will probably thrive in the server market at the end of the day.

A consumer Bulldozer was unneeded. The APUs AMD came out with are more than sufficient for most gamers out there.

AMD should have just let Bulldozer be a server-market-only chip that would have eventually made its way to the enthusiast market.

AMD should have just said that their focus is all about APUs now, and that Bulldozer is a side project.


----------



## ivymaxwell

Quote:



Originally Posted by *dlee7283*


Bulldozer will probably thrive in the server market at the end of the day.

A consumer Bulldozer was unneeded. The APUs AMD came out with are more than sufficient for most gamers out there.


But the insane power draw? Businesses worry about the electricity bill.


----------



## swindle

Yeah?

This will sit with servers about as well as it does with gamers.

8150 vs Xeon?

Man, you may as well just go to the circus tbh.


----------



## Dr. Zoidberg

"The novel Bulldozer design of the FX-8150 seems to be light on performance per core, as our image editing test shows. This test is single-threaded, and the FX-8150 fared extremely poorly with a stock-speed score of 887. *To put that into context, a Core 2 Duo E6700 is 13 per cent faster and a Core i5-2500K is almost twice as fast as the FX-8150 in this kind of situation*."

http://www.bit-tech.net/hardware/cpu...8150-review/11

Bulldozer has trouble even keeping up with a dual-core CPU made in 2006. I don't know how much worse it can get.


----------



## omninmo

Quote:



Originally Posted by *ivymaxwell*


But the insane power draw? Businesses worry about the electricity bill.


Insane power draw is more of an issue with OCed CPUs, if I understand correctly... it's not a very common business/server practice to OC CPUs.


----------



## cayennemist

Quote:



Originally Posted by *dlee7283*


and u can change ur avatar lol










I know









I want to see if it affects their stock tomorrow.


----------



## dlee7283

Quote:



Originally Posted by *ivymaxwell*


But the insane power draw? Businesses worry about the electricity bill.


It seems like AMD felt pressure to release Bulldozer when they probably needed another 6 months to get it to where it needed to be. By that time the power issues would have been addressed. At load Bulldozer seems to be a fail, but not at idle.

I just don't see why the higher-ups at AMD let Bulldozer become a dog and pony show. It takes away press from their good product, their APUs.


----------



## Dublin_Gunner

Quote:



Originally Posted by *dlee7283*


*Bulldozer will probably thrive in the server market at the end of the day.*

A consumer Bulldozer was unneeded. The APUs AMD came out with are more than sufficient for most gamers out there.

AMD should have just let Bulldozer be a server-market-only chip that would have eventually made its way to the enthusiast market.

AMD should have just said that their focus is all about APUs now, and that Bulldozer is a side project.


It won't. Performance per watt is atrocious. Server market is all about performance per watt.


----------



## Armand Hammer

Quote:



Originally Posted by *Dr. Zoidberg*


"The novel Bulldozer design of the FX-8150 seems to be light on performance per core, as our image editing test shows. This test is single-threaded, and the FX-8150 fared extremely poorly with a stock-speed score of 887. *To put that into context, a Core 2 Duo E6700 is 13 per cent faster and a Core i5-2500K is almost twice as fast as the FX-8150 in this kind of situation*."

http://www.bit-tech.net/hardware/cpu...8150-review/11

Bulldozer has trouble even keeping up with a dual-core CPU made in 2006. I don't know how much worse it can get.


Cheer up, it's not all bad! Piledriver should increase IPC by about 15%, so AMD can finally knock off the 2006 Core 2 Duos, in 2012!


----------



## dlee7283

Quote:



Originally Posted by *Dublin_Gunner*


It won't. Performance per watt is atrocious. Server market is all about performance per watt.


True, but companies still bought the NetBurst-based Xeons, didn't they lol

At a lower clock speed, like most server-market chips, I'm sure Bulldozer evens out more.


----------



## QuackPot

Video review:

  


----------



## mad0314

Quote:



Originally Posted by *djxput*


back on topic; I actually did find one review that was somewhat positive
http://www.hardwareheaven.com/review...onclusion.html

didnt look like they tested as well as some places thou - but again pretty much just skimmed the info (so many reviews).



Quote:



There are two areas where the FX-8150 excells though, those are gaming and overclocking.


Do they know something nobody else does? Oh wait, no, they showed GPU bottlenecks. They are testing CPUs with all the graphics settings turned up... For some reason, when there is a GPU bottleneck present, AMD systems are usually a few FPS higher. BD has worse IPC than Phenom II, and games don't make full use of many cores. You want a few strong cores for games.

Nobody else seems to be able to get over 4.6-ish GHz. I wonder if they overclocked at all or if they're just commenting on AMD's comments.


----------



## Oranuro

Bulldozer could very well kill AMD, wait, I'll rephrase that, Bulldozer could very well be the end of the high end microprocessor business for AMD. They still have some great tech coming out of that company like the APUs and the graphics line, but a product that's been over 4 years in the making should look a lot better than this.

This won't force Intel to lower prices on sandy bridge so either AMD will have to lower their prices for Bulldozer or stop production on silicon that costs a lot more than Thuban/Deneb. They really need someone at the head of that company to really push the R&D and release products that make the competition sweat their balls off. They should have perfected the process until they had something competitive, no matter how many years it would've taken them.


----------



## vinton13

So...how many of you are planning to get BD? I know I'm just going for a 1100T.


----------



## Armand Hammer

Quote:



Originally Posted by *QuackPot*


Video review:

http://www.youtube.com/watch?v=jOVJn...layer_embedded


Ouch. You can tell he'd had enough of reviewing this @ 18:19, yet he had to trudge on through another 16 mins!

Otherwise good review though.


----------



## iceblade008

Oh.

Well I guess I will be recommending that my friend goes with a 2500K for his new build then...

How disappointing.


----------



## Armand Hammer

Quote:



Originally Posted by *vinton13*


So...how many of you are planning to get BD? I know I'm just going for a 1100T.


Dude if you can you should just cut and run. Sell the 990FX and get whatever you can for it, then buy a 2500k.


----------



## microfister

Quote:



Originally Posted by *rockosmodlife*


Reflex99 bought one earlier, I dunno.


Check your emails. I just received an email from TigerDirect showcasing the new FX series. Some pretty good bundled deals on there too.

Just out of curiosity, when did the Phenom IIs come down in price? I don't recall checking too recently, but $120 for a 955 on Newegg.


----------



## vinton13

Quote:



Originally Posted by *Armand Hammer*


Dude if you can you should just cut and run. Sell the 990FX and get whatever you can for it, then buy a 2500k.


Yeah... how disappointing. Thankfully, I ordered it and it hasn't even come in yet.


----------



## JonnyFenix

I'm soooo going for a 2500K. Bulldozer is a massive FAIL! Boys at school are going to lose their minds when they read this. Phenom II may be the last "good" processor that they have.


----------



## Armand Hammer

Quote:



Originally Posted by *Oranuro*


Bulldozer could very well kill AMD, wait, I'll rephrase that, Bulldozer could very well be the end of the high end microprocessor business for AMD. They still have some great tech coming out of that company like the APUs and the graphics line, but a product that's been over 4 years in the making should look a lot better than this.

This won't force Intel to lower prices on sandy bridge so either AMD will have to lower their prices for Bulldozer or stop production on silicon that costs a lot more than Thuban/Deneb. They really need someone at the head of that company to really push the R&D and release products that make the competition sweat their balls off. They should have perfected the process until they had something competitive, no matter how many years it would've taken them.


Well said dude.

AMD should probably keep the 8150 and 8120 as an 8-core marketing ploy but withdraw the FX 6-cores and FX 4-cores from the desktop marketplace and produce them only for servers. Then keep a couple of the Thubans at 45nm (such as the 1075T and 1100T) and then pump out some highly clocked 32nm Denebs on the AM3+ socket to fill the gap.


----------



## cayennemist

Quote:



Originally Posted by *Armand Hammer*


Dude if you can you should just cut and run. Sell the 990FX and get whatever you can for it, then buy a 2500k.


That's what I'm going to do! Unless there is some sort of miracle in the next few days and BIOS updates fix stuff (wishful thinking).

My mind hurts! I mean, really? Thuban is better! WTH?
I haven't had Intel since the Q9550, and that was a very nice chip!


----------



## Darkslayer7

Yay. AMD made a 2500K.

I am disappointed, and not.
I was hoping for more from AMD (was a fanboy), and I'm happy that I made the right decision when I went 2500K.

I had an old board with a 770 chipset, and an Athlon II X4 @ 3.51GHz.
Now I have a P67 with a 2500K.
If I had waited for the release, then maybe I would have gone with FX, but I am happy as it is.


----------



## greydor

This is awful. AMD, come on. You've been a part of almost everyone's computer building since, well, computer building was enjoyable. Why disappoint?


----------



## microfister

Quote:



Originally Posted by *Armand Hammer*


Well said dude.

AMD should probably keep the 8150 and 8120 as an 8-core marketing ploy but withdraw the FX 6-cores and FX 4-cores from the desktop marketplace and produce them only for servers. Then keep a couple of the Thubans at 45nm (such as the 1075T and 1100T) and then pump out some highly clocked 32nm Denebs on the AM3+ socket to fill the gap.


Yeah, a die shrink or rework of Deneb and Thuban would have been a far better plan than this. This is the opposite of what Intel did between their two generations of i7s (less power, more performance). And as much as I want to root for the underdog in this, it is disgraceful that this is what came out of 5 years of work.

Of course, then again, after seeing the TigerDirect ad, I have a feeling they will still meet their sales quota.

http://www.tigerdirect.com/email/WEM...tigeremail2850


----------



## Shame486

Wow, people have been saying that Bulldozer has 8 real cores, just lol.
It has a very hard time beating the 2500K in multi-task operations, even with the 2500K's 4 cores vs Bulldozer's 8 "real cores".


----------



## vinton13

They'd have been better off releasing the Phenom II X8.


----------



## crossy82

Quote:



Originally Posted by *ZealotKi11er*


Next is probably 12-Core version stock 4.6Ghz and just barely beats 2600K in multi-tasking.


Lol, I doubt it. The power it would use, lol, you would need a 1000W PSU for the CPU alone, and some serious cooling.

AMD? What have you done? The FX moniker has been made a total mockery of, all that waiting for such a pile of crap. Just glad I never bought an AM3+ mobo.

And for anyone thinking the next Bulldozer will be good, lol, I wouldn't bother, a 10% increase won't exactly help this junk.


----------



## Xenthos

Meh, I guess that's it then... Ivy here I come...


----------



## ryand

Well this architecture is clearly terrible. A die shrunk Phenom II with 8 REAL cores would have done better and probably overclocked just as well without the insane power draw.


----------



## mad0314

Quote:



Originally Posted by *Armand Hammer*


Well said dude.

AMD should probably keep the 8150 and 8120 as an 8-core marketing ploy but withdraw the FX 6-cores and FX 4-cores from the desktop marketplace and produce them only for servers. Then keep a couple of the Thubans at 45nm (such as the 1075T and 1100T) and then pump out some highly clocked 32nm Denebs on the AM3+ socket to fill the gap.


If what I read is true, all BD chips are exactly the same, but with one or two modules disabled for the quad/hex parts and the 8150 cherry-picked (to run lower voltages at the higher clocks). I doubt they would stop selling the quad/hex, as they can sell a chip that had one or two modules fail. They have a lot of work ahead of them, though.

And no, you can't unlock the modules, at least according to AMD. ~$120 for an 8120/8150 just might make it worth it if unlocking somehow becomes possible... might.


----------



## dlee7283

I am pretty sure there are people who would pay $199.99 for a FX 8150 just to have 8 cores. Hopefully Microcenter is going to do something like that.


----------



## SteveMcQueen

I can't believe that the 8150 sucks so much wattage; you can compare it to an awful 980X...
At least you can drop about 30 watts with undervolting.


----------



## LukaTCE

What the .......... I'm stuck with my Phenom 555 until Ivy Bridge comes :S
This CPU probably won't improve performance in BF3, right?
Not buying AMD ever again.


----------



## 2010rig

Quote:



Originally Posted by *gooface*


if AMD was smart they would drop the prices on these cpus by a lot, then scrap their plans to make any better versions of it, and do a complete redesign, and make the phenom II's 32nm or something or just faster or add more cores to them. Just not this...


And take another 5-6 years to release? I don't think so; they will try and fix whatever is wrong, who knows at this point. A die shrink of Phenom II may be more worthwhile for them.

Quote:



Originally Posted by *Wishmaker*


So the previews were accurate. Discrediting them was pointless, especially the LAB501 one. It is pretty obvious that AMD has contacted these guys, because many of the paragraphs are sugarcoated. It is good to see that some did not lose objectivity.

The Guru3D review, as opposed to the HardOCP one, blames the software entirely for the poor results. Another AMD intervention, pretty obvious, as this is damage limitation yet again. While I agree software is not on par, I do not agree that Thuban is right there where Bulldozer is. So why the software excuse when the old architecture from AMD does just fine??

From hitechlegion:

"Reading this review makes one ask, where is INTEL in all this? It is pretty obvious that AMD is the performance king ... from this review."

The lab501 guys were spot on. AMD has pushed some buttons, and the fact that review sites have responded to this is an insult to us as readers. I was very interested in buying a Bulldozer rig, but with such photoshopped reviews, I will wait for IVY.


I had been wondering for a LONG time how people could discredit everything. After JF-AMD called everything fake and not indicative of final performance, people followed suit. They took his word as gospel, because he works at AMD. Wasn't it obvious that he was here to save the public perception of the company?

There were so many different leaks from different sources; heck, OBR was right all along and people bashed him. I kept stating facts and, well, if you spent any time in the BD blog, you know how that went.

Remember what the LAB501 guy said, that AMD would attempt damage control; looks like it worked with some reviewers.

Quote:



Originally Posted by *Evil Penguin*


You know how AMD has multiple CPU divisions?
Seems to me that the BD division was competing with the Thuban/Deneb division.










CLASSIC!

Quote:



Originally Posted by *xxbassplayerxx*


Going to be a lot of 990 boards up for sale for really cheap in the near future!

In the credits they state that it was a 2500K. Legitimate mistake or purposeful deceit is the question.


They left that for the end. If it was a legitimate mistake, they could've edited it and corrected the BEGINNING; that doesn't take long. Send me the video and I'll fix it free of charge.









Quote:



Originally Posted by *Derp*


Glad to hear, this means if I was an AMD user buying a new AMD motherboard would mean I would be ready for faster IPC when BD comes out!!! Oh wait.... whats this then?










OCN TOS:

Deserves a ban IMO.


A ban may be going too far. However, JF-AMD really deceived a lot of people, though I look forward to what he has to say. I called him out on a lot of stuff he said which didn't add up, but ran into a TON of naysayers defending JF-AMD and AMD as a whole.

Quote:



Originally Posted by *PyroTechNiK*


http://www.youtube.com/watch?v=8rDwXuAINJk

They finally disabled comments on this fraudulent video... I wonder why...


False advertising at its finest; I can't believe AMD would stoop this low. Remember, 80% of their purchases come from the uninformed, so BD will still sell.

What looks better to the average Joe? 8 cores @ 3.6 or 4 cores @ 3.4?

If those people do any sort of research though...they will be disappointed with their purchase.


----------



## Armand Hammer

Quote:



Originally Posted by *mad0314*


If what I read is true, all BD chips are exactly the same, but with one or two modules disabled for the quad/hex parts and the 8150 cherry-picked (to run lower voltages at the higher clocks). I doubt they would stop selling the quad/hex, as they can sell a chip that had one or two modules fail. They have a lot of work ahead of them, though.

And no, you can't unlock the modules, at least according to AMD. ~$120 for an 8120/8150 just might make it worth it if unlocking somehow becomes possible... might.


True, but they won't be able to make a dime selling them at that price; if they are reduced to selling their top-of-the-line 8-core CPUs for less than $150, then they are finished in this market sector.

All in all this is just a monumental failure from AMD. There's not much they can do to salvage this, although I think my earlier suggestions would probably be the least damaging to their bottom line long term.


----------



## Chris13002




----------



## Xenthos

Quote:



Originally Posted by *ryand*


Well this architecture is clearly terrible. A die shrunk Phenom II with 8 REAL cores would have done better and probably overclocked just as well without the insane power draw.


I have to agree with you on this one.

An 8-core Phenom II with the die shrink would have performed better. Even the 6-core Phenom II without the die shrink outperforms Bulldozer in almost every benchmark.


----------



## Skrillex

Quote:



Originally Posted by *crossy82*


Lol, I doubt it. The power it would use, lol, you would need a 1000W PSU for the CPU alone, and some serious cooling.

AMD? What have you done? The FX moniker has been made a total mockery of, all that waiting for such a pile of crap. Just glad I never bought an AM3+ mobo.

And for anyone thinking the next Bulldozer will be good, lol, I wouldn't bother, a 10% increase won't exactly help this junk.










1000W PSU

Considering the TDP of the current Bulldozer lineup is less than 200W, I can pretty much guess you wouldn't need anywhere near 1000W.

Also, you have an 850W PSU for a system that would run on 300W.
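For what it's worth, the PSU-sizing arithmetic behind that point can be sketched like this (every wattage below is a rough assumption, not a measured figure):

```python
# Back-of-the-envelope PSU sizing; all numbers here are assumptions.
cpu_oc_draw = 200      # generous allowance for an overclocked FX-8150 (125W TDP stock)
gpu_draw = 250         # a single high-end GPU under load
rest_of_system = 100   # board, RAM, drives, fans

total = cpu_oc_draw + gpu_draw + rest_of_system  # worst-case system draw
recommended_psu = total / 0.8                    # keep the PSU at ~80% load max

print(f"Worst case: {total}W -> recommended PSU: {recommended_psu:.0f}W")
```

Even with generous allowances, that lands well under 1000W, let alone 1000W for the CPU alone.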


----------



## mav2000

Quote:



Originally Posted by *2010rig*


I had been wondering for a LONG time how people could discredit everything. After JF-AMD called everything fake and not indicative of final performance, people followed suit. They took his word as gospel, because he works at AMD. Wasn't it obvious that he was here to save the public perception of the company?

There were so many different leaks from different sources; heck, OBR was right all along and people bashed him. I kept stating facts and, well, if you spent any time in the BD blog, you know how that went.

Remember what the LAB501 guy said, that AMD would attempt damage control; looks like it worked with some reviewers.

A ban may be going too far. However, JF-AMD really deceived a lot of people, though I look forward to what he has to say. I called him out on a lot of stuff he said which didn't add up, but ran into a TON of naysayers defending JF-AMD and AMD as a whole.

False advertising at its finest; I can't believe AMD would stoop this low. Remember, 80% of their purchases come from the uninformed, so BD will still sell.

What looks better to the average Joe? 8 cores @ 3.6 or 4 cores @ 3.4?

If those people do any sort of research though...they will be disappointed with their purchase.


Right on the button...along with a couple of "Senior" guys at some of the other well known forums who said that BD will "perform".

JF-AMD ain't going to be back anytime soon... maybe he will be back in time for BD II... to fool us into believing that there "is" a pot of gold at the end of the rainbow.

Sorry buddy, all the deception and lies just caught up, and now guys like OBR and Lab will sleep easy knowing they were right.

I am happy at least a few of the senior guys with inside information decided to keep quiet rather than add to the ever-growing list of people who gave us wrong indications.

BTW, I wonder if AMD has said anything about the reviews or have they decided to have an NDA on replying to reviews.....


----------



## Kauke

Lol at the scam video from AMD with comments disabled


----------



## caffeinescandal

So.. To summarize, Bulldozer is a flop? Sorry guys, I just woke up from a really, really, really long nap. xD


----------



## twich12

AMD's Fermi.... I'm so disappointed it hurts my brain! How could they possibly settle for WORSE single-core performance? Heck, they're going backwards!


----------



## Liranan

Absolutely terrible. What have AMD released?


----------



## t00sl0w

Well, it looks like they dropped support for older instruction sets, then they killed IPC in favour of their new hybrid core functionality........ hmm


----------



## Kand

Quote:



Originally Posted by *twich12*


AMD's Fermi.... I'm so disappointed it hurts my brain! How could they possibly settle for WORSE single-core performance? Heck, they're going backwards!


Fermi however, was FASTER than its predecessor, the GT200.


----------



## Stuuut

*One small stepping for AMD, one giant LOL for INTEL.*


----------



## crossy82

Quote:



Originally Posted by *Skrillex*


1000W PSU

Considering the TDP of the current Bulldozer lineup is less than 200W, I can pretty much guess you wouldn't need anywhere near 1000W.

Also, you have an 850W PSU for a system that would run on 300W.


Lol, I was mocking AMD, and yes I know it wouldn't require that amount, but it sure as hell is power hungry at load, and if you had 2 on a server board, well, that would be some serious waste of power for rubbish performance.


----------



## Artikbot

Quote:



Originally Posted by *Kand*


Fermi however, was FASTER than its predecessor, the GT200.


And had an army of fanboys ready to buy it


----------



## twich12

Quote:



Originally Posted by *Kand*


Fermi however, was FASTER than its predecessor, the GT200.


You are correct sir.... in that case AMD's is worse than Fermi. Seriously! I'm looking at SB boards right now, sick of my crappy AMD 965 that can't even keep up with my 3x 5870s.


----------



## TheRockMonsi

Quote:



Originally Posted by *twich12*


AMD's Fermi.... I'm so disappointed it hurts my brain! How could they possibly settle for WORSE single-core performance? Heck, they're going backwards!


In no way do they compare - Fermi took the performance crown, at the expense of lots of heat and power consumption. AMD's Bulldozer just totally sucks monkey nuts, and eats up power.


----------



## Dr. Zoidberg

Quote:



Originally Posted by *Liranan*


Absolutely terrible. What have AMD released?


Instead of bulldozer, they have released "man with shovel."


----------



## 2010rig

Quote:



Originally Posted by *twich12*


AMD's Fermi.... I'm so disappointed it hurts my brain! How could they possibly settle for WORSE single-core performance? Heck, they're going backwards!


Fermi was a power hungry beast, but the 480 was faster than the 5870.

Fermi refined gave us the 580, the fastest single GPU today.


----------



## Th0m0_202

So, not worth buying over a current AMD CPU?


----------



## MoBeeJ

In most applications, it's not worth it over the PII X4s and X6s. In games it's a little bit different, but still not worth it. I am waiting for multi-GPU scenarios.

If all goes well, AMD is definitely a gamer's CPU, given that you have an old PC.


----------



## drBlahMan

Sandy,







you are the best *1* for me


----------



## th3illusiveman

Quote:



Originally Posted by *Dr. Zoidberg*


Instead of bulldozer, they have released "man with shovel."










Hahaha, saw that on /g/ a while ago; it didn't *click* with me back then.


----------



## Mattousai

Quote:



Originally Posted by *Dr. Zoidberg*


Instead of bulldozer, they have released "man with shovel."










That statement, my good sir, made me laugh









Being sick all week, and then seeing this train wreck, well I could use a few good laughs


----------



## microfister

Maybe, just maybe, with so many consumers turning to Intel (previous AMD fanboys as well), we may still get a price drop.... keeping my fingers crossed. I'd really like to build an SB rig for folding.


----------



## TFL Replica

Instead of comparing Bulldozer to Fermi (which was not a flop by any measure), it's much easier to compare it to AMD's Phenom (not Phenom II) or ATI's R600. As a flop, Bulldozer lies somewhere between those two.


----------



## Kevlo

Wow, that's a lot of pages really quick, but I'm more happy than anything. Sure, the standard streamlined benchmarks are not that great, about 10%-15% faster than Phenom IIs, but look at the gaming performance (what I was more concerned with): it's fantastic, it matches the 2600K most of the time and seems to almost always beat the 2500K, when both are OCed, mind you. I am honestly impressed....and look forward to getting my father's 1090T once he buys himself an 8120 or 8150....lol....

TL;DR: Bulldozer made up a ton of progress on the gaming front, not so much on the other stuff.


----------



## Dublin_Gunner

28nm Phenom II x6 or x8 would have been MUCH more financially efficient.

They could have tweaked the architecture to add AVX and some other extensions and left it at that.

They would compete better with Intel's offerings also.

Someone will lose their job over this.

Actually, someone _should_ lose their job over this. Or indeed a whole design team.

They would have been running simulations on this architecture for the past couple of years, so they would have had an idea of the performance.

It's shocking to think they could release something that isn't even competitive with their current products, especially as a 1055T can be had for what, €130?

Phenom II X6 actually looks like great value for money now, and those craving every last drop of performance will be best off sticking with Sandy Bridge.

AMD are damn lucky they bought ATI a few years ago. They would have very little income now otherwise.

They'd better hope they can string together some deals with OEMs for BD, because it's going to sell little or nothing at retail.


----------



## microfister

Quote:



Originally Posted by *Dr. Zoidberg*


Instead of bulldozer, they have released "man with shovel."


Sig-worthy, at least for a little while.


----------



## Dublin_Gunner

Quote:



Originally Posted by *Kevlo*


Wow, thats a lot of pages really quick, but im more happy than anything. Sure the standard streamlined benchmarks are not that great, about 10%-15% faster than Phenom IIs, but look at the Gaming performance (What i was more concerned with), its fantastic, it matches the 2600K most of the time, and seems to almost always beat the 2500K, when both are OCed, mind you. I am honestly impressed....and look forward to getting my father's 1090T once he buys himself a 8120 or 8150....lol....

Tl;Dr: Bulldozer made up a ton of progress on the gaming aspect, not so much on the other stuff.


Did you read the same reviews as the rest of us?


----------



## ryand

Quote:



Originally Posted by *Kevlo*


but look at the Gaming performance (What i was more concerned with), its fantastic, it matches the 2600K most of the time, and seems to almost always beat the 2500K, when both are OCed, mind you.


Check again.

http://www.anandtech.com/show/4955/t...x8150-tested/8

It comes dead last pretty often...


----------



## mechtech

"Pushing the 8120, or 8150, to the 4.6GHz mark takes an extra 200 watts of power at load compared to stock clocks."

While Intel can slap another two cores onto an affordable Sandy Bridge if they feel threatened and still have a low-power chip, AMD looks like it has some scalability issues with BD...

At the end of the day, Sandy Bridge is an architecture with tons of headroom, while Bulldozer is pushed to the limit of acceptable wattage for a modern CPU. Unfortunately, it looks like Intel can sit back and price-gouge on six-core chips until AMD has real competition for a six-core Sandy Bridge, and by the looks of it, that won't come to pass any time soon.


----------



## Dr. Zoidberg

Quote:



Originally Posted by *Kevlo*


Wow, thats a lot of pages really quick, but im more happy than anything. Sure the standard streamlined benchmarks are not that great, about 10%-15% faster than Phenom IIs, but look at the Gaming performance (What i was more concerned with), its fantastic, it matches the 2600K nearly all the time, and seems to almost always beat the 2500K, when both are OCed, mind you. I am honestly impressed....and look forward to getting my father's 1090T once he buys himself a 8120 or 8150....lol....

Tl;Dr: Bulldozer made up a ton of progress on the gaming aspect, not so much on the other stuff.


You have to understand that when bulldozer is overclocked, it consumes a massive amount of power. Hardware Canucks overclocked an FX-8150 to 4.6 GHz, and here are their surprising results:

"At these settings, the system is idling at around 195W, *but when running the aforementioned Prime95 stress test, it pulls an immense 550W from the socket. If we also add a fully loaded GeForce GTX 460 1GB to the mix, that number spikes up to almost 800W*. Obviously, no one runs CPU and GPU stress tests at the same time, but this does illustrate what the worst-case scenario could be when it comes to power consumption."
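
For rough context, the deltas in those figures work out as below. This is only a back-of-envelope sketch: the numbers are whole-system draw measured at the wall socket, so PSU inefficiency inflates them versus actual component draw.

```python
# Back-of-envelope deltas from Hardware Canucks' reported wall-socket figures
# for an FX-8150 overclocked to 4.6 GHz. Whole-system numbers, PSU losses included.
idle_w = 195          # system idle
prime95_w = 550       # CPU fully loaded (Prime95)
with_gpu_w = 800      # CPU plus a fully loaded GTX 460 1GB (approximate)

cpu_load_delta = prime95_w - idle_w       # extra draw attributable to CPU load
gpu_load_delta = with_gpu_w - prime95_w   # extra draw attributable to GPU load

print(f"CPU load adds ~{cpu_load_delta} W")   # ~355 W
print(f"GPU load adds ~{gpu_load_delta} W")   # ~250 W
```

A roughly 355W swing from idle to CPU load, at the wall, is the figure people are reacting to here.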


----------



## ivymaxwell

We need IBM to step up and release consumer desktop chips. I don't have hope in AMD's engineers or directors.


----------



## awdrifter

Quote:



Originally Posted by *twich12*


AMD's Fermi... I'm so disappointed it hurts my brain! How could they possibly settle for WORSE single-core performance? Heck, they are going backwards!


No, this is AMD's Netburst. It's slower (in some cases) than the Thuban and uses more power. The only advantage is it clocks higher.


----------



## faulkton

Looks like I'm sticking with my current rig for a while... thanks, AMD, for saving me $500.


----------



## nagle3092

Biggest upset this year award goes to Bulldozer...


----------



## ikcti

I feel like Bulldozer is very innovative and definitely has an opportunity to succeed; it just wasn't executed properly. Maybe if Intel came up with something like Bulldozer (sharing of resources), it'd do better. I'm just hoping Piledriver will do more than 10-15%, because if it does not, they will *NEVER* catch up to Intel.


----------



## microfister

Quote:



Originally Posted by *Kevlo*


Wow, thats a lot of pages really quick, but im more happy than anything. Sure the standard streamlined benchmarks are not that great, about 10%-15% faster than Phenom IIs, but look at the Gaming performance (What i was more concerned with), its fantastic, it matches the 2600K most of the time, and seems to almost always beat the 2500K, when both are OCed, mind you. I am honestly impressed....and look forward to getting my father's 1090T once he buys himself a 8120 or 8150....lol....

Tl;Dr: Bulldozer made up a ton of progress on the gaming aspect, not so much on the other stuff.


Are you kidding me? The 8150 @ 4.6 bottlenecks a single GTX 580; it has horrible gaming performance. The Phenom II 980 BE at stock clocks beats Bulldozer at stock clocks in gaming. I'm not too sure about your reading ability, or maybe you're just one of those "sees/hears what they want to" kind of individuals.


----------



## Wildcard36qs

Goodness what did I wake up to?! My hopes and dreams just got bulldozed.


----------



## ivymaxwell

Only IBM can become a true competitor to Intel and keep prices reasonable and innovation high.
I give up on AMD.

IBM is the only hope of competition for Intel; they are brilliant. We just need them to release consumer desktop chips.


----------



## Artikbot

Ready to see Intel skyrocket prices again in 3, 2, 1...


----------



## Agenesis

Business as usual for AMD I see.

Now to hunt down those guys who said they would eat their hats...


----------



## Phil~

The first I heard of Bulldozer was in '07 or '08. They could have baked something more potent than this in that time. Hmm, AMD is still the underdog; what else is new. At least they can finally beat Core 2.

The power draw is startling, though. Those tests have to be flukes... no way a chip should need 550 watts; that's ludicrous.


----------



## microfister

Quote:



Originally Posted by *ikcti*


I feel like Bulldozer is very innovative, definitely has an opportunity to succeed. It just wasn't done and executed properly. Maybe if Intel came up with something like Bulldozer (sharing of resources) it'd do better. I'm just hoping Piledriver will do more than 10-15%, because it if does not, they will *NEVER* catch up to Intel.


Really, Intel and AMD sharing resources? Let's think about it for a second. Intel, from first-gen i7s to second-gen i7s, improved performance, cut power consumption, and also made "top of the line" much more affordable ($315). AMD, from Phenom II to Bulldozer, managed to drop in performance and skyrocket power consumption (explain to me how this is green again), and for these changes you have to pay more ($90 more than a 1090T, $40 more than a 2500K).

Now I ask you: WHO benefits from the two companies sharing resources, hmmm?


----------



## Chiefpuff420

Quote:



Originally Posted by *microfister*


Really, Intel and AMD sharing resources? Let's think about it for a second. Intel, from first-gen i7s to second-gen i7s, improved performance, cut power consumption, and also made "top of the line" much more affordable ($315). AMD, from Phenom II to Bulldozer, managed to drop in performance and skyrocket power consumption (explain to me how this is green again), and for these changes you have to pay more ($260).

Now I ask you: WHO benefits from the two companies sharing resources, hmmm?


Lol good one.


----------



## Mattousai

Quote:



Originally Posted by *awdrifter*


No, this is AMD's Netburst. It's slower (in some cases) than the Thuban and uses more power. The only advantage is it clocks higher.


Great comparison, sums this up nicely.


----------



## Fuell

I still don't understand all the doom and gloom... for workstations, BD works fairly well, often beating a 2500K in highly threaded tasks and sometimes competing with a 2600K, for less money. So in these workloads it performs between the 2500K and 2600K and is priced between them, but closer to the 2500K, which easily beats it in single-threaded performance...

What we all need is to hope that AMD gets single-threaded performance up, gets power consumption under control, and then adds 10-15% on top of its multi-threaded performance... THEN it will compete... but that will be no easy task.

So while for gamers and most regular people BD isn't looking like a great CPU, for those who want workstations it's got decent price/performance that will likely get better once AMD lowers prices a little after launch...

Though not a great CPU, it has potential. It hit a home run in some cases but tripped over its own feet in others... Just too bad it trips in the cases I wanted it for most... Might go Llano in the new year, then... or just get a cheap Thuban when prices drop even more... I miss the old 1090T I had.


----------



## thrgk

I'm sticking with my 2600K; it's faster, and I like Intel, lol.


----------



## Dr. Zoidberg

Quote:



Originally Posted by *Phil~*


Hmm, AMD is still the underdog; what else is new. At least they can finally beat Core 2.


Not necessarily.

"The novel Bulldozer design of the FX-8150 seems to be light on performance per core, as our image editing test shows. This test is single-threaded, and the FX-8150 fared extremely poorly with a stock-speed score of 887. *To put that into context, a Core 2 Duo E6700 is 13 per cent faster and a Core i5-2500K is almost twice as fast as the FX-8150 in this kind of situation*."

http://www.bit-tech.net/hardware/cpu...8150-review/11

Kind of sad when Bulldozer fails to beat a dual-core CPU released in 2006.
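
To put concrete numbers on those percentages, a quick sketch of the implied single-threaded scores (higher is better; these are extrapolated from bit-tech's quoted figures, not separately measured results):

```python
# Implied single-threaded image-editing scores from bit-tech's percentages:
# FX-8150 scores 887 at stock, the E6700 is quoted as 13% faster,
# and the i5-2500K as "almost twice as fast" (so slightly under double).
fx8150 = 887
e6700 = round(fx8150 * 1.13)     # implied E6700 score
i5_2500k_ceiling = fx8150 * 2    # upper bound for the 2500K

print(e6700, i5_2500k_ceiling)   # roughly 1002 and 1774
```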


----------



## Phil~

Overclocking is going to be hell with these chips. If I were AMD, I would be working on the power draw problem now, not in three years' time.

They are basically handing Intel the mobile market on a platter.


----------



## microfister

The 2600K and 2500K trounce it, and on price-to-performance a 955 slaps it around pretty good too. And over the next year AMD has Ivy Bridge and Sandy Bridge-E to look forward to... I should buy Intel stock.


----------



## Phil~

Quote:



Originally Posted by *Dr. Zoidberg*


Not necessarily.

"The novel Bulldozer design of the FX-8150 seems to be light on performance per core, as our image editing test shows. This test is single-threaded, and the FX-8150 fared extremely poorly with a stock-speed score of 887. *To put that into context, a Core 2 Duo E6700 is 13 per cent faster and a Core i5-2500K is almost twice as fast as the FX-8150 in this kind of situation*."

http://www.bit-tech.net/hardware/cpu...8150-review/11

Kind of sad when Bulldozer fails to beat a dual-core CPU released in 2006.


Well, overall. Taken as a whole, Bulldozer chips look to be on par with or slightly faster than Core 2s, especially when overclocked.


----------



## ikcti

Quote:



Originally Posted by *microfister*


Really, Intel and AMD sharing resources? Let's think about it for a second. Intel, from first-gen i7s to second-gen i7s, improved performance, cut power consumption, and also made "top of the line" much more affordable ($315). AMD, from Phenom II to Bulldozer, managed to drop in performance and skyrocket power consumption (explain to me how this is green again), and for these changes you have to pay more ($90 more than a 1090T, $40 more than a 2500K).

Now I ask you: WHO benefits from the two companies sharing resources, hmmm?


I'm afraid you didn't quite get what I meant. What I was saying was: if Intel made an architecture quite like Bulldozer, where resources (like the FPU) are shared within the CPU itself, they could probably pull it off better.


----------



## Cyclonic

Quote:


> Originally Posted by *Dr. Zoidberg;15274866*
> Not necessarily.
> 
> "The novel Bulldozer design of the FX-8150 seems to be light on performance per core, as our image editing test shows. This test is single-threaded, and the FX-8150 fared extremely poorly with a stock-speed score of 887. *To put that into context, a Core 2 Duo E6700 is 13 per cent faster and a Core i5-2500K is almost twice as fast as the FX-8150 in this kind of situation*."
> 
> http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/11
> 
> Kind of sad when Bulldozer fails to beat a dual-core CPU released in 2006.


I so fell for AMD's marketing and bought a Crosshair V. Sigh, what a waste.


----------



## Chiefpuff420

I'm kinda confused about what's going on with that FX-4100 model. Did they cut core performance out of that as well? Because if so, they have truly gone drunk. Maybe the FX-4100 is the ace up their sleeve... probably not.


----------



## faulkton

Quote:


> Originally Posted by *Fuell;15274858*
> What we all need, is to hope that AMD gets single threaded performance up, gets power consumption under control and then increases 10-15% above its multi-threaded performance... THEN it will compete... but that will be no easy task.


"hope" + "and" x "then" = fail.


----------



## Electroneng

This is a sad day for computer enthusiasts. AMD's latest architecture cannot gain any ground on Intel.

Why would Intel push new products out now with no competition whatsoever? SB-E is ready for launch in November, so that will come out, but why even push out Ivy Bridge next year? Intel can just sit back and continue to gain market share with current offerings.


----------



## Phil~

Quote:


> The Phenom II X6 has six cores operating at 3.3GHz, which gives it an eBay frequency of 19.8GHz; the FX-8150 has eight cores at 3.6GHz, giving it a comparable speed of 28.8GHz - an increase of 45 per cent. However, the Cinebench score of 6.01 is only 2.6 per cent faster than the 5.86 of the Phenom II X6 1100T. That's a poor result for the FX-8150 and indicates that each of those eight cores aren't a match for any of the six within the Phenom II. The 8-core FX-8150 was only marginally faster than the quad-core Core i5-2500K, which managed 5.93. The Hyper-Threaded Core i7-2600K was much faster than the FX-8150, though, with a score of 7.49.


Wow. That is just bad. Eight cores barely winning against a CPU with roughly half the theoretical throughput.
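
The quoted arithmetic does check out; here is the throughput-versus-score comparison as a quick sketch (Cinebench numbers exactly as given in the quote):

```python
# Aggregate "cores x clock" versus actual Cinebench score, per the quoted figures.
x6_ghz = 6 * 3.3     # 19.8 "GHz" aggregate for the Phenom II X6 1100T
fx_ghz = 8 * 3.6     # 28.8 "GHz" aggregate for the FX-8150

clock_gain = (fx_ghz - x6_ghz) / x6_ghz * 100    # ~45% more raw clock-cores
score_gain = (6.01 - 5.86) / 5.86 * 100          # ~2.6% more Cinebench score

print(f"{clock_gain:.0f}% more aggregate clock, {score_gain:.1f}% more score")
```

That gap between 45% more nominal throughput and 2.6% more score is the per-core efficiency problem people are describing.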


----------



## Liranan

Quote:


> Originally Posted by *Dr. Zoidberg;15274742*
> You have to understand that when bulldozer is overclocked, it consumes a massive amount of power. Hardware Canucks overclocked an FX-8150 to 4.6 GHz, and here are their surprising results:
> 
> "At these settings, the system is idling at around 195W, *but when running the aforementioned Prime95 stress test, it pulls an immense 550W from the socket. If we also add a fully loaded GeForce GTX 460 1GB to the mix, that number spikes up to almost 800W*. Obviously, no one runs CPU and GPU stress tests at the same time, but this does illustrate what the worst-case scenario could be when it comes to power consumption."


800W?!

I max my overclocked CPU and GPU all day while running BOINC. Looking at BD, I would need to underclock it to prevent my system from exploding.

That may be a slight exaggeration, but it's not far from the truth. Damned AMD.

I hope Thubans drop in price fast; I'd love to get a 1090T at least.


----------



## Fuell

Quote:


> Originally Posted by *faulkton;15274905*
> "hope" + "and" x "then" = fail.


Way to quote a small fraction of a post out of context to make a useless comment... BD is not an epic fail, unless you guys think a 2500K is utter garbage for threaded workloads... there are areas where BD simply shines, especially compared to PII, and then there are games and lightly threaded tasks...

But go ahead and keep the flame train going... Talking about an architecture's ups and downs is one thing... but calling "outright fail" on a chip that CAN best a 2500K consistently in some workloads, that's funny...

(I would like to point out that in no way am I saying BD is "great" or better than a 2500K... just that there is far too much negative commenting going on, and it's simply beyond reason.)


----------



## Chris13002

damn...
And I thought of replacing my i7 920 + gtx 480 system with this...

I am disappoint...


----------



## redhat_ownage

Quote:


> Originally Posted by *Agenesis;15274789*
> Business as usual for AMD I see.
> 
> Now to hunt down those guys who said they would eat their hats...


this!!!


----------



## paulerxx

Damn, AMD, I was hoping you had another Athlon 64 on your hands.


----------



## t00sl0w

To the people saying IBM should make processors: I don't think they have an x86 license, so it would be more PPC that no one develops desktop software for.


----------



## Kevlo

Quote:



Originally Posted by *ryand*


Check again.

http://www.anandtech.com/show/4955/t...x8150-tested/8

It comes dead last pretty often...


Why don't you look again? That is at stock.
When OCed, it does pretty well: http://vr-zone.com/articles/amd-fx-8.../13694-10.html
http://www.hardocp.com/article/2011/...mance_review/2

Quote:



Originally Posted by *Dr. Zoidberg*


You have to understand that when bulldozer is overclocked, it consumes a massive amount of power. Hardware Canucks overclocked an FX-8150 to 4.6 GHz, and here are their surprising results:

"At these settings, the system is idling at around 195W, *but when running the aforementioned Prime95 stress test, it pulls an immense 550W from the socket. If we also add a fully loaded GeForce GTX 460 1GB to the mix, that number spikes up to almost 800W*. Obviously, no one runs CPU and GPU stress tests at the same time, but this does illustrate what the worst-case scenario could be when it comes to power consumption."


Yes, I saw that, and I understand it, but I doubt it will pull THAT much power in standard gaming sessions; Prime95 is designed to load it to the max, and a regular game will not.

Quote:



Originally Posted by *microfister*


Are you kidding me? The 8150 @ 4.6 bottlenecks a single GTX 580; it has horrible gaming performance. The Phenom II 980 BE at stock clocks beats Bulldozer at stock clocks in gaming. I'm not too sure about your reading ability, or maybe you're just one of those "sees/hears what they want to" kind of individuals.


See, this is what I don't get: we are on a forum for OCing enthusiasts, but to prove one product better than another, people pull _stock_ numbers out. No one here cares about stock performance; people care about OCed performance. I'm not saying the SB CPUs are bad, I think they are great too, but for around the same price or less I could get an 8120 or 8150 that, when OCed, can match them. Maybe I am a "sees/hears what they want to kind of individual"; however, people who are saying Bulldozer is nothing but a flop are just as ignorant.
AMD has made progress and will continue to build on it; a new architecture of any hardware generally takes one to two generations to get going properly.


----------



## RallyMaster

*pets 2500K computer*


----------



## faulkton

Quote:



Originally Posted by *Fuell*


Way to quote a small fraction of a post out of context to make a useless comment... BD is not an epic fail, unless you guys think a 2500K is utter garbage for threaded workloads... there are areas where BD simply shines, especially compared to PII, and then there are games and lightly threaded tasks...

But go ahead and keep the flame train going... Talking about an architecture's ups and downs is one thing... but calling "outright fail" on a chip that CAN best a 2500K consistently in some workloads, that's funny...

(I would like to point out that in no way am I saying BD is "great" or better than a 2500K... just that there is far too much negative commenting going on, and it's simply beyond reason.)



I had hoped AMD would put something out which would trounce my 5GHz 2600K and induce me to part with my monies.

For me, it's an epic fail.


----------



## Phil~

To be fair, Intel has far more money to burn on R&D.


----------



## magicmike

The use of the word epic bothers me; sorry to rant.

And I agree that Bulldozer may not have met expectations, but those expectations were built up by the community; AMD never dropped info or flat out said it would be better than Sandy Bridge, Ivy Bridge, or whatever you wish to compare it to.

I'm glad they put out some new architecture, and I'm a little disappointed it isn't faster, but that's why we all should be glad we have options. Also, as people mentioned, in multithreaded applications the chips perform extremely well, and for some of us that makes them worth their premium; for others it may not. But to call the chips a fail before they have been out and the patched Windows kernel comes out is unfair.


----------



## iamwardicus

Quote:



Originally Posted by *Cyclonic*


I so fell for AMD's marketing and bought a Crosshair V. Sigh, what a waste.


Get an X6 processor and shoot for 4+ GHz on it, is what it looks like, if you game.


----------



## Lord Venom

Well, that means I'm going back to Intel. Either Sandy Bridge-E or Ivy Bridge!


----------



## chadamir

Quote:



Originally Posted by *magicmike*


The use of the word epic bothers me; sorry to rant.

And I agree that Bulldozer may not have met expectations, but those expectations were built up by the community; AMD never dropped info or flat out said it would be better than Sandy Bridge, Ivy Bridge, or whatever you wish to compare it to.

I'm glad they put out some new architecture, and I'm a little disappointed it isn't faster, but that's why we all should be glad we have options. Also, as people mentioned, in multithreaded applications the chips perform extremely well, and for some of us that makes them worth their premium; for others it may not. But to call the chips a fail before they have been out and the patched Windows kernel comes out is unfair.


I disagree a bit. Let's not forget AMD's big "real cores" push. It seems to indicate real cores aren't all they made them out to be.


----------



## Izvire

Quote:



Originally Posted by *Phil~*


To be fair Intel has far more money to burn in R&D


AMD fanboys said for a year that Bulldozer would run over Sandy Bridge like it was nothing, and now that it loses, as expected, it's suddenly OK because "Intel has far more money to burn on R&D"?


----------



## Skripka

Quote:



Originally Posted by *Izvire*


AMD fanboys said for a year that Bulldozer would run over Sandy Bridge like it was nothing, and now that it loses, as expected, it's suddenly OK because "Intel has far more money to burn on R&D"?


Yup. The whole thing is pretty darn funny now.


----------



## Kevlo

Quote:



Originally Posted by *Izvire*


AMD fanboys said for a year that Bulldozer would run over Sandy Bridge like it was nothing, and now that it loses, as expected, it's suddenly OK because "Intel has far more money to burn on R&D"?


You're assuming that all people who say that are fanboys. Some people have been saying it ever since Intel grew larger than AMD.


----------



## Sophath

The expectations weren't just built up by the community. Their marketing team hyped it. All their benchmark slides were promises of something good. It was hyped up for years. And even when early leaks were showing signs of a flop, they kept blaming it on the fact that those were engineering samples.


----------



## Izvire

Say what you want, it still is one big flop in my eyes.


----------



## Dr. Zoidberg

Quote:



Originally Posted by *chadamir*


I disagree a bit. Let's not forget AMD's big "real cores" push. It seems to indicate real cores aren't all they made them out to be.


In addition, it appears that AMD's module approach is even less efficient than Intel's hyper-threading technology on Sandy Bridge.

"As a result, Bulldozer cores do not just work at half the speed of Sandy Bridge cores. In addition to that *the performance of the Bulldozer processor module with two cores is even lower than that of a single Sandy Bridge core with enabled Hyper-Threading technology*."

http://www.xbitlabs.com/articles/cpu...0_8.html#sect0


----------



## Dublin_Gunner

Quote:



Originally Posted by *paulerxx*


Damn, AMD, I was hoping you had another Athlon 64 on your hands.


That's pretty much what you got. That's the problem.


----------



## dev1ance

Hmm... I was hoping for BD to be a decent chip. It would've been nice to replace my Q6600 home server (always queued with encoding tasks). Guess I'll retire my 2600K to that task instead and buy myself X79.


----------



## nubz

Typical letdown by AMD. I was waiting for this boat of fail; the same thing happened with the Phenom II 940. I was a sucker to buy that at release and won't be making the same mistake. AMD, ya blow, kid.


----------



## CAHOP240

Oh yeah... time to grab my coffee and a Kit Kat bar and read up on these reviews.


----------



## Phil~

Quote:



Originally Posted by *Izvire*


AMD fanboys said for a year that Bulldozer would run over Sandy Bridge like it was nothing, and now that it loses, as expected, it's suddenly OK because "Intel has far more money to burn on R&D"?


Well, that's the fan community. If AMD themselves had said that Bulldozer would flat-out destroy Sandy Bridge, that would be a different matter. The fan community does not make the product.

With Intel as it is, it is surprising AMD has even been able to stay alive in the x86 arena. So yes, to compete with Intel, you will have to have money, and lots of it. Samsung could probably do it, but they can't innovate like Intel can.


----------



## ivymaxwell

Quote:



Originally Posted by *Phil~*


Well that's the fan community. If AMD themselves were to say that BullDozer would flat out destroy Sandy Bridge then that would be a different matter. The fan community does not make the product.

With Intel as it is, it is surprising AMD is even able to be kept alive in the x86 arena. So yes, to compete with Intel, you will have to have money, and lots of it. Samsung could probably do it, but they can't innovate like Intel can.


IBM could innovate and compete if they wanted to. IBM people are so brilliant.


----------



## ilhe4e12345

I have gone over a few reviews... and I can't understand what AMD was trying to do here...

I'm an AMD fanboy, I openly admit it, but my god, I'm glad I didn't get my hopes too high for this... From the reviews I'm seeing, it's almost like AMD tried to make a "server processor for home use." The performance is garbage, and the fact that the 1100T is just as good as an 8150 makes me sick to my stomach... Like, really? You're going to overhype this, saying how much better it's going to be than Intel's current chips, and then the official release shows it can barely touch Intel's previous generation? Mother of god, if that's the case, then Intel's next-gen stuff is going to wreck it...

I'm sorry for the past, if I ever argued that BD would be "the all-winning AMD chip"... The only thing I would consider ever buying a BD for would be a server, BUT with the power consumption so high, I don't think I would pick one up even for that... The Opterons I loved back in the Socket 939 days were little beasts, so if I build anything AMD, it will be back to Opterons for server use... This is just disgusting...

I was going to hold out for the 10-core BDs next year... but it looks like BD is just another Phenom I all over again... I really want to like BD, but my god, my current 1075T is only slightly behind the 8150, and the extra cores aren't worth it at this point... Thanks a lot, AMD; I appreciate the epic amount of failure I have seen...

Now I know that my next upgrade is going to be a graphics card rather than a new system... At least I know my current one is as good as the new "NEW AMAZING OMG" chip.

***... now I'm kind of sad...


----------



## Kasp1js

Funny how the price of a 2500K has risen by about $30 where I live.

Performance seems decent when all cores are fully used; not that great of a gaming chip, though.


----------



## Cyclonic

Quote:



Originally Posted by *Phil~*


Well that's the fan community. If AMD themselves were to say that BullDozer would flat out destroy Sandy Bridge then that would be a different matter. The fan community does not make the product.

With Intel as it is, it is surprising AMD is even able to be kept alive in the x86 arena. So yes, to compete with Intel, you will have to have money, and lots of it. Samsung could probably do it, but they can't innovate like Intel can.


So Samsung needs to buy AMD


----------



## geovas77

Here is a review looking at northbridge scaling; it seems to scale, but not nearly enough to rescue the chip, IMO.

http://www.madshrimps.be/articles/ar...#axzz1aY4HupTU


----------



## Disturbed117

Quote:



Originally Posted by *Cyclonic*


So Samsung needs to buy AMD


If anyone does, I hope it's IBM.


----------



## xisintheend

A quick glance at this thread tells me I'm future-proof with my 1055T for a while.


----------



## Dr. Zoidberg

http://finance.yahoo.com/blogs/daily...150326494.html

Please someone add bulldozer to this list.


----------



## CryWin

At this point I honestly can't see anyone buying these chips for anything other than to play with. AMD literally could have put two more cores on the Phenom II X6 and released a better product.


----------



## Skripka

Quote:



Originally Posted by *xisintheend*


Quick glance at this thread tells me I am future proof with my 1055T for a while


And when you say "future-proof", you mean?


----------



## p-saurus

AMD fanboy here who will be building an Intel system in the near future.

I guess it doesn't matter what's inside the case as long as it performs well, but it's going to make me sad to break this silly emotional attachment I have to AMD.


----------



## ThePath

Bulldozer should be called Pentium 4 X8


----------



## mav451

Quote:



Originally Posted by *geovas77*


Here is a review considering Northbridge scaling, seems to scale but not nearly enough to rescue the chip imo.

http://www.madshrimps.be/articles/ar...#axzz1aY4HupTU


Lol, what is it with review sites not being transparent about settings? Madshrimps shows game benches without a single word on the settings/resolution/quality used.


----------



## MPIXAPP

Intel all the way ..

Sigh for AMD :/ !

Quote:



Originally Posted by *thepath*


bulldozer should be called pentium 4 x8


Epic!


----------



## Xenthos

Quote:



Originally Posted by *ThePath*


Bulldozer should be called Pentium 4 X8


True. Imagine a Thuban with eight cores and a die shrink... it would outperform Bulldozer by miles.


----------



## Phil~

Quote:



Originally Posted by *Cyclonic*


So Samsung needs to buy AMD


Couldn't happen; such a merger would cause a shift in power and trigger antitrust scrutiny.


----------



## exnihilo

I got to post 200 and didn't see it posted, so: anything and everything mentioned in this thread means absolutely squat. Got your attention? The bulk of a chip manufacturer's profit comes from sales that have nothing to do with what we do on this site. Is BD, thus far, a disappointment for us, the overclockers/enthusiasts? Yes. Is the average user opening a Tigerdirect email exclaiming "8 cores! New tech! Full rig! $500" going to give two flips? Nope. AMD will be fine <---- is the purpose of this post. Maybe one day they'll get it right and we can once again return to the glory days of my Athlon XP vs. Netburst, etc. For right now, revel in the goodness that is SB.

cg


----------



## dev1ance

Hitler's take (might be offensive if you're a serious person):

(YouTube video)


----------



## Kevlo

Quote:



Originally Posted by *CryWin*


At this point I honestly can't see anyone buying these chips for anything other than to play with. AMD literally could have put two more cores on the Phenom II X6 and released a better product.


Yes, they could have, and do you think they would have gotten any less hate?
This way they have released a new architecture and are innovating.

Also, has anyone else thought about how the Athlon 64s didn't have the best performance when first released, but once things were optimized a little they dominated Intel? Perhaps the Windows kernel and programs haven't been optimized to fully utilize Bulldozer yet. Just a thought.


----------



## Wildcard36qs

It really isn't that huge of a fail though, guys. For those of you building BF3 systems, you want this processor lol:


----------



## ivymaxwell

Quote:



Originally Posted by *Kevlo*


Yes, they could have, and do you think they would have gotten any less hate?
This way they have released a new architecture and are innovating.

Also, has anyone else thought about how the Athlon 64s didn't have the best performance when first released, but once things were optimized a little they dominated Intel? Perhaps the Windows kernel and programs haven't been optimized to fully utilize Bulldozer yet. Just a thought.


Nope, it was instant domination once the first Athlon 64s released.


----------



## Xenthos

Quote:



Originally Posted by *Wildcard36qs*


It really isn't that huge of a fail though, guys. For those of you building BF3 systems, you want this processor lol:

I have to agree. With a multi-threaded future in mind, this thing will be sweet. But right now, at this very moment, most games hardly use 2 threads, let alone 4. I hope this changes in the near future, or BD just came too soon.


----------



## NitrousX

Bahaha AMD fail.


----------



## khurios2000

Quote:



Oh hey now, after months of speculation, rumors, and a lot, seriously a lot, of gossip AMD today finally puts an end to all that. Yes, the all new AMD FX series processors are released today -- *octacores for the win my man, octomom will be pleased alright*







.... guru3d.com


i lol'ed


----------



## lordikon

Quote:



Originally Posted by *Alatar*


it was better earlier

Only thread I've seen with more people at once was the news thread about Creative suing the kid who fixed their Vista drivers; his modified drivers showed that Creative had intentionally disabled features on many of their sound cards so they could blame Vista and get customers to buy their newest sound cards.

Ah, nostalgia. I'm not even sure this link is the original thread that started it all:
http://www.overclock.net/hardware-ne...all-glory.html


----------



## Skripka

Quote:



Originally Posted by *Wildcard36qs*


It really isn't that huge of a fail though, guys. For those of you building BF3 systems, you want this processor lol:
http://www.pcper.com/files/imagecache/article_max_width/review/2011-10-10/bf3.jpg
http://www.hardocp.com/images/articles/13182343781D3JFR9LiH_2_2.gif
http://www.hardocp.com/images/articles/13182343781D3JFR9LiH_2_3.gif


Anyone buying any 8-core CPU for gaming is downright silly in the head. Your argument is invalid.


----------



## BlackOmega

Quote:



Originally Posted by *Kevlo*


Wow, thats a lot of pages really quick, but im more happy than anything. Sure the standard streamlined benchmarks are not that great, about 10%-15% faster than Phenom IIs, but look at the Gaming performance (What i was more concerned with), its fantastic, it matches the 2600K most of the time, and seems to almost always beat the 2500K, when both are OCed, mind you. I am honestly impressed....and look forward to getting my father's 1090T once he buys himself a 8120 or 8150....lol....

Tl;Dr: Bulldozer made up a ton of progress on the gaming aspect, not so much on the other stuff.


Exactly. All of these Intel fanboys are blah blah blah flop, yet these BD CPUs are, undeniably, AMD's fastest CPUs ever.

Quote:



Originally Posted by *Dublin_Gunner*


Did you read the same reviews as the rest of us?










I've been wondering if you guys were also reading the same reviews.

Quote:



Originally Posted by *ryand*


Check again.

http://www.anandtech.com/show/4955/t...x8150-tested/8

It comes dead last pretty often...


The hell it does. But I suppose it's easier for people to believe that. Although you all might want to take a closer look at [H]'s game testing results: at a 200 MHz lower OC, the BD chip (4600 MHz) beat out even the OC'd i7 2600K (4800 MHz) in 2 of the 3 games tested.

But I suppose it's easier to jump on the BD bashing bandwagon.









Quote:



Originally Posted by *microfister*


Are you kidding me? The 8150 @ 4.6 bottlenecks a single GTX 580; it has horrible gaming performance. The Phenom II 980 BE at stock clocks beats Bulldozer @ stock clocks in gaming. I'm not too sure about your reading ability, or maybe you're just one of those 'sees/hears what they want to' kind of individuals.


I think you're the one that's only focusing on the negative reviews and synthetic benchmarks that mean about squat in real-world performance. Why not look at some _real-world_ testing? How about this Handbrake video encoding test? It always beats out the 2500K, which has a 200 MHz higher OC.

Quote:



Originally Posted by *microfister*


Really, Intel and AMD sharing resources? Let's think about it for a second. Intel, from 1st-gen i7s to 2nd-gen i7s, improved performance, cut power consumption, and also made "top of the line" much more affordable ($315). AMD, from Phenom IIs to Bulldozer, managed to drop in performance and skyrocket power consumption (explain to me how this is green again), and for these changes you have to pay more ($90 more than a 1090T, $40 more than a 2500K).

Now I ask you: WHO benefits from the two companies sharing resources, hmmm?


Intel has always stuck it to AMD. Intel is and always has been shady. When they decided on resource sharing, Intel would always give AMD incomplete or bogus information, while AMD was forthright and honest.

As I said before, it's without a doubt that BD is AMD's fastest CPU to date. You can't deny it. Sure, in single-threaded apps it's not the greatest, but those are going to fall by the wayside.

It also seems that a lot of the reviews contradict each other. In some, it shows BD doing very well against Intel in terms of encoding, and in others it shows it performing very poorly.

However, I will say that I believe some of the reviews aren't performed correctly. I mean, BD supports 1866 MHz RAM, yet even OC'd (not extreme) they're running it at 1600 MHz. Secondly, the Hardware Canucks review uses RipjawsX RAM, which is *specifically* designed for SB.

There's a lot of inconsistencies and ways of skewing results. I mean, 3 sticks of RAM in one test? Really?









From what I can tell, BD appears to be good for encoding and decent for gaming. It was alluded to in many reviews that Windows 7 and its scheduler could very well play a big part in BD's poor performance, and it was said that this should be rectified in Windows 8. I also believe that once programs are optimized for it, BD will show a marked improvement over its current results. Which, in all honesty, aren't bad, just not as good as some people had hoped.


----------



## Liranan

I have to say, it does depend on price. If they price this right it'll still be a good CPU. Right now it's ridiculous, but a 100 USD price drop would make it a reasonable chip, and yes, it will have to be 100 before it's worth anything.

The other thing is that this is just a paper launch. AMD might not release these chips for retail but wait until the architecture's been fine-tuned in a few years.


----------



## Skripka

Quote:



Originally Posted by *Liranan*


I have to say, it does depend on price. If they price this right it'll still be a good CPU. Right now it's ridiculous, but a 100 USD price drop would make it a reasonable chip, and yes, it will have to be 100 before it's worth anything.

The other thing is that this is just a paper launch. AMD might not release these chips for retail but wait until the architecture's been fine-tuned in a few years.










TigerDirect has them on sale now grasshopper. Check the Deals board.


----------



## khurios2000

Can't wait for Bulldozer Shell Shockers @ Newegg.


----------



## Chewy

Quote:



Originally Posted by *Skripka*


Anyone buying any 8-core CPU for gaming is downright silly in the head. Your argument is invalid.


+1

Let's not forget it would need twice the juice to power the system overclocked, and cost more.

I didn't think Bulldozer was going to be anything special, but this is nothing short of a joke! Honestly, right now I feel sorry for the AMD guys who have actually spent money preparing for this.


----------



## AK-47

Quote:



Originally Posted by *Phil~*


To be fair Intel has far more money to burn in R&D


That's a lie; AMD recently opened an R&D facility in Israel.
They get no pity from me. I was hoping they'd make a better product, or even come close to the 2500K and 2600K.
But they didn't, so screw them, and stop making excuses for them.


----------



## Liranan

Quote:



Originally Posted by *BlackOmega*


Exactly. All of these Intel fanboys are blah blah blah flop, yet these BD CPUs are, undeniably, AMD's fastest CPUs ever.

I've been wondering if you guys were also reading the same reviews.

The hell it does. But I suppose it's easier for people to believe that. Although you all might want to take a closer look at [H]'s game testing results: at a 200 MHz lower OC, the BD chip (4600 MHz) beat out even the OC'd i7 2600K (4800 MHz) in 2 of the 3 games tested.

But I suppose it's easier to jump on the BD bashing bandwagon.

I think you're the one that's only focusing on the negative reviews and synthetic benchmarks that mean about squat in real-world performance. Why not look at some _real-world_ testing? How about this Handbrake video encoding test? It always beats out the 2500K, which has a 200 MHz higher OC.

Intel has always stuck it to AMD. Intel is and always has been shady. When they decided on resource sharing, Intel would always give AMD incomplete or bogus information, while AMD was forthright and honest.

As I said before, it's without a doubt that BD is AMD's fastest CPU to date. You can't deny it. Sure, in single-threaded apps it's not the greatest, but those are going to fall by the wayside.

It also seems that a lot of the reviews contradict each other. In some, it shows BD doing very well against Intel in terms of encoding, and in others it shows it performing very poorly.

However, I will say that I believe some of the reviews aren't performed correctly. I mean, BD supports 1866 MHz RAM, yet even OC'd (not extreme) they're running it at 1600 MHz. Secondly, the Hardware Canucks review uses RipjawsX RAM, which is *specifically* designed for SB.

There's a lot of inconsistencies and ways of skewing results. I mean, 3 sticks of RAM in one test? Really?

From what I can tell, BD appears to be good for encoding and decent for gaming. It was alluded to in many reviews that Windows 7 and its scheduler could very well play a big part in BD's poor performance, and it was said that this should be rectified in Windows 8. I also believe that once programs are optimized for it, BD will show a marked improvement over its current results. Which, in all honesty, aren't bad, just not as good as some people had hoped.


If what you are saying is right about the possible performance increase then it will be interesting.

I like Hilbert's reviews, I've found him to be more reliable than most, and he does claim that BD is being hampered by software. As another poster mentioned when the A64 was first released it also didn't perform as well as it could have but after a few tweaks it beat anything and everything Intel threw at it. It would be really interesting if we have the same situation here.


----------



## BlackOmega

Quote:



Originally Posted by *Chewy*


+1

Let's not forget it would need twice the juice to power the system overclocked, and cost more.

I didn't think Bulldozer was going to be anything special, but this is nothing short of a joke! Honestly, right now I feel sorry for the AMD guys who have actually spent money preparing for this.










Why? According to the results, they'll beat your rig.


----------



## Tokkan

This CPU isn't even an upgrade over the Phenom IIs, it seems...


----------



## Ha-Nocri

The key to 'dozer's poor performance lies in this:

Quote:



Originally Posted by *AnandTech*

AMD also shared with us that Windows 7 isn't really all that optimized for Bulldozer. Given AMD's unique multi-core module architecture, the OS scheduler needs to know when to place threads on a single module (with shared caches) vs. on separate modules with dedicated caches. Windows 7's scheduler isn't aware of Bulldozer's architecture and as a result sort of places threads wherever it sees fit, regardless of optimal placement.


The 8150 isn't really an 8-core CPU; it was supposed to be a 4-core with something like Intel's Hyper-Threading, just better. But studying computer science, I can understand why it is hard to implement.
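The AnandTech point about module-aware scheduling can be made concrete in a few lines. A minimal sketch, assuming an FX-8150-style layout where logical CPUs pair up into modules as (0,1), (2,3), (4,5), (6,7) — the helper name and the layout are illustrative, not AMD's or Microsoft's actual scheduler logic:

```python
# Sketch: choose one logical CPU per Bulldozer module so each thread gets a
# dedicated L1/L2 instead of sharing a module with a sibling thread.
# The pairing is an assumed FX-8150 layout: modules = (0,1), (2,3), (4,5), (6,7).

def one_core_per_module(num_logical, cores_per_module=2):
    """Return the logical CPUs that are the first core of each module."""
    return {cpu for cpu in range(num_logical) if cpu % cores_per_module == 0}

mask = one_core_per_module(8)
print(sorted(mask))  # [0, 2, 4, 6]

# On Linux you could then pin a process to that mask (hypothetical usage,
# only meaningful on an actual Bulldozer box):
#   import os
#   os.sched_setaffinity(0, mask)
```

A module-aware scheduler does the reverse trade-off when a workload benefits from the shared L2: it packs cooperating threads onto the same module instead.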


----------



## Liranan

Quote:



Originally Posted by *AK-47*


That's a lie; AMD recently opened an R&D facility in Israel.


YOU WHAT? How the hell did I miss this? I am very disappointed, very disappointed indeed in AMD


----------



## KingGreasy

How sad
I had been waiting patiently, with dedication and belief that Bulldozer would be a lot more comparable, but now I pretty much know for sure that my next build will be Intel, probably with an AMD 7000 card.
With the leaks I wasn't expecting it to be all that great, but I at least thought it would compete with the 2600K better.


----------



## kevink82

Mine will come tomorrow, although I wonder how much my Crosshair V can handle with those tiny 8+2 phases; hopefully I'll get it up and running with all the watercooling parts in the next 2 days...

I'm not too worried, since I knew from the start that engineering samples are pretty much 90% of final performance anyway. Whoever believes AMD when they say final samples will have a huge gain is just silly; look back at every chip ever made, and the ES chips are exactly at or pretty close to final performance.

BTW, did anyone notice this year is pretty much an all-tech-product screw-up year? Intel P67 bug >>> iPhone 5 hype >>> Bulldozer hype >>> X79 cutting back on specs...


----------



## gtsteviiee

And this is why I moved to intel..


----------



## Vagrant Storm

Quote:



Originally Posted by *mav451*


Lol what is it with review sites not being transparent about settings? Madshrimps shows game benches without even a single word on the settings/resolution/quality used.


It doesn't really matter as long as they use the same settings for each tested platform. They aren't testing the game... they are testing the hardware. Plus, they might have a page on their site that goes into those details so they don't have to include it in every review. These are the legit reviewers that people have been waiting for... you just sort of have to trust them when it comes to those things.

Having said that... I feel sick. I was really expecting an FX-8120 to be on par with 2500K performance once both CPUs had a good OC on them. It looks like it will barely win or match it on a couple of things... but be far under in the rest.

These aren't even good for the server market... not with that power consumption. Power saving is what the server market is all about these days, save for the instances where performance is everything.

A Bulldozer-based server CPU might get used where threaded performance is everything, but at that power consumption you could almost have another Xeon added to the rack. I'd have to look it up, but I am sure it is close. It doesn't matter if it had more cores; there is no way a couple more threads on one CPU will match the performance of another system.

Perhaps for a company that doesn't care about the utility bill, but at the same time doesn't want to spend a lot on hardware? I don't see many companies doing that... not even the ones known for the stupider moves. I wouldn't be surprised to find out that AMD was using Xeons in their servers.


----------



## Kand

Quote:



Originally Posted by *Wildcard36qs*


It really isn't that huge of a fail though, guys. For those of you building BF3 systems, you want this processor lol:


GPU limited.


----------



## Skripka

Quote:



Originally Posted by *BlackOmega*


Why? According to the results, they'll beat your rig.

O rly?

http://www.guru3d.com/article/amd-fx...ssor-review/13

Once it gets up to 4.6 it starts trading punches with SB in multi-threaded...in single-threaded it compares nicely to Old Faithful Thuban.


----------



## BlackOmega

Quote:



Originally Posted by *Liranan*


If what you are saying is right about the possible performance increase then it will be interesting.

I like Hilbert's reviews, I've found him to be more reliable than most, and he does claim that BD is being hampered by software. As another poster mentioned when the A64 was first released it also didn't perform as well as it could have but after a few tweaks it beat anything and everything Intel threw at it. It would be really interesting if we have the same situation here.


 Just like with any new hardware, you have to program the software to be able to take advantage of it. Programs aren't going to miraculously know how to utilize it. Just think of all the GPUs' and how drivers have noticeably improved their performance.

However, this might also just be a stepping stone kind of like how Fermi's initial release led to today's much more powerful cards.

Quote:



Originally Posted by *Skripka*


O rly?

http://www.guru3d.com/article/amd-fx...ssor-review/13

Once it gets up to 4.6 it starts trading punches with SB in multi-threaded...in single-threaded it compares nicely to Old Faithful Thuban.


Once again focusing on synthetics. Show me *real-world*.


----------



## kevink82

Quote:



Originally Posted by *Skripka*


O rly?

http://www.guru3d.com/article/amd-fx...ssor-review/13

Once it gets up to 4.6 it starts trading punches with SB in multi-threaded...in single-threaded it compares nicely to Old Faithful Thuban.


Yes, but you have to take into account that the 2500K can be overclocked as well, to a good average of 4.5 GHz...


----------



## Liranan

Quote:



Originally Posted by *BlackOmega*


Just like with any new hardware, you have to program the software to be able to take advantage of it. Programs aren't going to miraculously know how to utilize it. Just think of all the GPUs' and how drivers have noticeably improved their performance.

However, this might also just be a stepping stone kind of like how Fermi's initial release led to today's much more powerful cards.


I will hang on to what I have and see whether Win 8 brings any improvements. If it does then I will look into BD again. In the meantime I will keep an eye on these threads.

I wouldn't be surprised if MS releases some patch to improve or optimise performance a little, like they did when Intel released their P4 HTs.


----------



## ttaylor0024

Well, bulldozer's performance is kind of disappointing... At least I wont waste any money on it!


----------



## hydropwnics

i was hoping to upgrade my 1100T but it doesnt even look like its worth it, maybe i'll spring for a 2600k in a few months or wait for Ivy


----------



## Kand

Quote:



Originally Posted by *Skripka*


O rly?

http://www.guru3d.com/article/amd-fx...ssor-review/13

Once it gets up to 4.6 it starts trading punches with SB in multi-threaded...in single-threaded it compares nicely to Old Faithful Thuban.


Great. Comparing an overclocked processor that draws >150w at that frequency against a processor at stock that draws 95w. :3

Sounds fair, and logical? Totally.


----------



## Skripka

Quote:



Originally Posted by *BlackOmega*


Once again focusing on synthetics. Show me *real-world*.

"Real world"? So you'd be fine if AMD or Intel released an Athlon 64 or Core 2 Quad level chip today, marketed for the enthusiast/performance market, since in the real world you couldn't tell the difference?


----------



## Skripka

Quote:



Originally Posted by *Kand*


Great. Comparing an overclocked processor that draws >150w at that frequency against a processor at stock that draws 95w. :3

Sounds fair, and logical? Totally.


No...it shows how far you have to tweak BD to get it to compete...note the also included 8150 at stock for comparison.


----------



## AK-47

Quote:



Originally Posted by *Liranan*


YOU WHAT? How the hell did I miss this? I am very disappointed, very disappointed indeed in AMD

Yeah, it was announced around June. Their stance on the matter was always "you never know what the future might hold", or something like that, so it doesn't exactly surprise me.

http://www.khronos.org/news/permalin...io-and-opens-a

http://www.globes.co.il/serveen/glob...51071&fid=1725

http://www.itproportal.com/2011/06/0...m-partnership/


----------



## Phil~

Quote:



Originally Posted by *AK-47*


That's a lie; AMD recently opened an R&D facility in Israel.
They get no pity from me. I was hoping they'd make a better product, or even come close to the 2500K and 2600K.
But they didn't, so screw them, and stop making excuses for them.


Excuse me?

AMD Operating Income 848 Million

Intel Operating Income 16 Billion


----------



## Dmac73

Quote:



Originally Posted by *Wildcard36qs*


It really isn't that huge of a fail though, guys. For those of you building BF3 systems, you want this processor lol:

GPU limited game is GPU limited.

BD is fail.


----------



## phibrizo

Quote:



Originally Posted by *HWI*


You must have read some different reviews than me, cause the ones I read it got stomped by the 2600K.


The 2600K would be out of my price range, so for what I am looking at, the 8120 is decent compared to the 2500K. If I was looking at the top performance for my money, yes, I would say BD gets stomped; but compare that to the 990X and the 2600K gets owned. Again, it's all about what people can afford and their price bracket.


----------



## Kand

Quote:



Originally Posted by *Skripka*


No...it shows how far you have to tweak BD to get it to compete...note the also included 8150 at stock for comparison.


Keep in mind, Sandy Bridge can be tweaked as well.


----------



## Skripka

Quote:



Originally Posted by *phibrizo*


The 2600K would be out of my price range, so for what I am looking at, the 8120 is decent compared to the 2500K. If I was looking at the top performance for my money, yes, I would say BD gets stomped; but compare that to the 990X and the 2600K gets owned. Again, it's all about what people can afford and their price bracket.


OTOH, for folks like you on a budget who already have a Thuban... you really don't have an upgrade to go to in performance. Sure, the price is decent for someone still on an Athlon 64 or Core 2 Duo, with a big relative bump in performance... but those already on Thuban will likely not see a big boost in most tasks, certainly not one that would be worth the coin IMHO.


----------



## BlackOmega

Quote:



Originally Posted by *Dmac73*


Gpu limited game is gpu limited.

BD is fail.


So wait, if it was GPU limited, wouldn't they have the *exact same* results? Yet the SB, even with a 200MHz higher clock, scores lower? Hmmm.... I believe there's something amiss here with your logic.


----------



## Vagrant Storm

Quote:



Originally Posted by *phibrizo*


The 2600K would be out of my price range, so for what I am looking at, the 8120 is decent compared to the 2500K. If I was looking at the top performance for my money, yes, I would say BD gets stomped; but compare that to the 990X and the 2600K gets owned. Again, it's all about what people can afford and their price bracket.


I never thought I'd say this about a desktop CPU, but... the cost savings an FX-8120 has over a 2500K might not cover your higher electric bill. I think performance- and power-usage-wise the 2500K is the better deal, meaning in most things you will have better performance and a lower electric bill. At casual usage... it would probably be close. If you are a folder or a person who leaves the computer on 24/7... then it would probably be a huge difference.
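The electric-bill argument is easy to put rough numbers on. A back-of-envelope sketch, where the 100 W load delta, the usage hours, and the $0.12/kWh rate are all assumptions picked for illustration, not measured figures:

```python
# Back-of-envelope cost of extra CPU power draw. All inputs are assumptions:
# extra_watts (load delta vs. a 2500K), hours_per_day, and the $/kWh rate.

def annual_cost_delta(extra_watts, hours_per_day, dollars_per_kwh=0.12):
    """Extra dollars per year for drawing `extra_watts` more, `hours_per_day` a day."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * dollars_per_kwh

print(round(annual_cost_delta(100, 4), 2))   # casual use, a few hours a day
print(round(annual_cost_delta(100, 24), 2))  # 24/7 folding rig
```

Under these assumptions the casual case is a couple of dollars a month, while the 24/7 case adds up to roughly a hundred dollars a year, which is on the order of the chips' price gap.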


----------



## Skripka

Quote:



Originally Posted by *Kand*


Sandy Bridge can be tweaked as well. Keep in mind.


Oh, I remember. That is why I linked to the Guru3D review. BD as an enthusiast/performance part barely competes OC'd when the competition is held to stock.

Shame. Was thinking of moving my 1055T folding rig out and throwing a BD in there... definitely not worth the coin as it stands now. Hopefully folks are right and it can be band-aided with some software patching.


----------



## Dr. Zoidberg

Quote:



Originally Posted by *BlackOmega*


So wait, if it was GPU limited, wouldn't they have the *exact same* results? Yet the SB, even with a 200MHz higher clock, scores lower? Hmmm.... I believe there's something amiss here with your logic.










At that resolution any difference can be explained by the margin of error.


----------



## linskingdom

Quite disappointed with the performance given that long wait. With a typical quad-core chip being 'good enough' for most applications, I don't see any urgency to upgrade unless the price looks right. So at this point, AMD is still playing that 'catch-up' game. On the other hand, Intel can just put a higher multiplier on SB (the 2700K, etc.), easily contain the whole thing, and continue pushing back Ivy...


----------



## Blameless

Quote:



Originally Posted by *BlackOmega*


So wait, if it was GPU limited, wouldn't they have the *exact same* results? Yet the SB, even with a 200MHz higher clock, scores lower? Hmmm.... I believe there's something amiss here with your logic.










Margin of error.


----------



## Horsemama1956

Quote:



Originally Posted by *BlackOmega*


So wait, if it was GPU limited, wouldn't they have the *exact same* results? Yet the SB, even with a 200MHz higher clock, scores lower? Hmmm.... I believe there's something amiss here with your logic.










2 frames is nothing; it's within the +/-.

Seems like a decent chip, though not much of an upgrade over current stuff. Seems like they made the architecture more complicated than it needed to be.

Not a horrible chip, but for the money Intel is the better option (unless you're a blind fanboy). Might get one of the cheaper options for a general-use PC, just for the hell of it.
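The margin-of-error point several posters are making can be put in code. A toy sketch, assuming a ±3% run-to-run noise band on each benchmark result (the function and the 3% figure are illustrative, not how any review site actually computes its error bars):

```python
# Toy "is this gap just noise?" check: model each FPS result as a band of
# mean +/- noise_frac, and treat the gap as meaningful only if the two
# bands don't overlap. The 3% default is an assumed figure.

def within_margin(fps_a, fps_b, noise_frac=0.03):
    """True if the two results' noise bands overlap, i.e. the gap may be noise."""
    return abs(fps_a - fps_b) <= noise_frac * (fps_a + fps_b)

print(within_margin(62.0, 64.0))  # a 2 fps gap around 63 fps: likely noise
print(within_margin(40.0, 60.0))  # a 20 fps gap: a real difference
```

By this rule, the 2-frame differences in the GPU-limited BF3 charts say nothing about the CPUs, while the large gaps in CPU-bound tests do.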


----------



## DayoftheGreek

Quote:



Originally Posted by *BlackOmega*


So wait, if it was GPU limited, wouldn't they have the *exact same* results? Yet the SB, even with a 200MHz higher clock, scores lower? Hmmm.... I believe there's something amiss here with your logic.










No they wouldn't have the exact same results. This happens in every GPU limited review ever.


----------



## Phil~

The Bulldozer built the road, to be overlapped with a Sandy Bridge. Pretty soon, we are going to have an Ivy covered Bridge as well.

I am curious if any of the water cooling companies will make kits for this. I have a feeling they won't....even though these chips basically require water for high clocks.


----------



## BinaryDemon

Although I see no logical reason for Bulldozer to impact any prices, I hope it at least continues to push down the prices of AM3 Phenom II chips. Maybe retailers will still feel the need to get rid of 'older' tech.


----------



## AK-47

Quote:



Originally Posted by *Phil~*


Excuse me?

AMD Operating Income 848 Million

Intel Operating Income 16 Billion

IDC....
You want to prove your point? Show me what each company spent on R&D.
Stop making excuses for them.
Bottom line: they worked 5+ years on an architecture that is worse than their current one.
That = fail.


----------



## Yvese

Well, glad I didn't wait for this. Guess it's an ok upgrade for those on Thubans.


----------



## traktor

AMD: "I fell down some stairs."


----------



## Dublin_Gunner

http://www.anandtech.com/bench/Product/188?vs=434#

Wow.

That's my current CPU versus the 8150. Bear in mind that's stock speed, and I have mine at 3.8 GHz.

Even still, in some tests, the stock 645 spanks the BD chip. Very, very odd.

There is zero reason for me to upgrade to Bulldozer.

Saying that, and I don't know why, I feel slightly compelled to get one :?

Maybe it's because everyone has Sandy Bridge, so just to be different lol


----------



## Rookie1337

All I got out of the reviews is that BD is slightly better than current Phenoms, draws a lot more power, and when OC'd can trade blows with a 2600k at stock. Does that sum it up?


----------



## Phil~

Quote:



Originally Posted by *AK-47*


IDC....
you want to prove your point? show me what each company spent in R&D
Stop making excuses for them
Bottom line they worked 5+ years on architecture that is worse than their current
That=fail


Intel spends more. They have literally four times the workforce, and they dominate the market. To be brutally honest, it is nearly impossible for AMD to overtake or even match Intel; that only happened back in the FX-60 days. Both companies' CPUs use similar instruction sets, so how would a company with money to burn lose to a company without it? Intel has released four major refreshes since '07; AMD hit with Phenom, Phenom II, and now Bulldozer. I'm not making excuses for them, I am just saying don't expect a small company to miraculously topple a much larger one by using snake oil.


----------



## Fasista

A little disappointed with the performance of the FX-8150.

If Intel drops the 2500K's price to $200 or less, AMD will be in serious trouble!


----------



## lightsout

I'm sorry but lol at the people who were holding on to their hope that there would be some miracle and BD was going to be so great.

Sucks for the consumer though.


----------



## Skripka

Quote:



Originally Posted by *Rookie1337*


All I got out of the reviews is that BD is slightly better than current Phenoms, draws a lot more power, and when OC'd can trade blows with a 2600k at stock. Does that sum it up?


You missed the part where BD comes in a tin with a fancy belt-buckle.


----------



## nagle3092

Quote:



Originally Posted by *Rookie1337*


All I got out of the reviews is that BD is slightly *worse* than current Phenoms, draws a lot more power, and when OC'd can trade blows with a 2600k at stock. Does that sum it up?


Fixed


----------



## lightsout

Quote:



Originally Posted by *Dublin_Gunner*


http://www.anandtech.com/bench/Product/188?vs=434#

Wow.

That's my current CPU versus the 8150. Bear in mind that's at stock speed, and I have mine at 3.8GHz.

Even still, in some tests, stock 645 spanks the BD chip. Very very odd.

There is 0 reason for me to upgrade to Bulldozer.


I didn't see yours win once. In a lot of those tests lower is better.


----------



## t-ramp

Well, this is kind of a letdown.


----------



## Rookie1337

Quote:



Originally Posted by *Dublin_Gunner*


http://www.anandtech.com/bench/Product/188?vs=434#

Wow.

That's my current CPU versus the 8150. Bear in mind that's at stock speed, and I have mine at 3.8GHz.

Even still, in some tests, stock 645 spanks the BD chip. Very very odd.

There is 0 reason for me to upgrade to Bulldozer.


What the hell are you talking about? It didn't beat BD in more than one or two tests. I'm confused.

EDIT: Correction, it didn't beat BD in any.


----------



## hokiealumnus

Quote:



Originally Posted by *xd_1771*


I'm curious,
I am very, very curious
if any of the reviewers bothered to overclock the CPU-NB.


We did. In the overclocked results the CPU-NB and HT clocks were both at 2500 MHz. 3000 was not stable at the small voltage increase I gave it (1.25 V on CPU-NB, which was a bit of overkill for that clock but time was not on our side; I had the chip 12 days before the review published). FWIW

Quote:



Originally Posted by *JE Nightmare*


i only care about the folding power.



Quote:



Originally Posted by *pioneerisloud*


I suggest everyone give this a good read. It's fully possible that the reviewers might have missed a very important MS hotfix to fix an L1 cache bug.

http://www.xtremesystems.org/forums/...64#post4969164


The discussion was about a Linux fix that somehow morphed into there being a miracle Windows hotfix out there. I have no idea how it ended up there, but there is no fix. All of the reviewers were on the same mailing list. AMD made no mention (and still has made no mention) of any sort of hotfix that would improve performance. Trust me, if it existed we'd have known about it. An up-to-date BIOS/UEFI is the main item people considering BD need to worry about.

Quote:



Originally Posted by *G3RG*


Weird that every single review I read could only hit exactly 4.6ghz lol...


Ours was 4.75 GHz for a 24/7 clock, but we used a water loop; a real one, not one of those Corsair/Asetek deals. It's my setup for all reviews; Intel will get the same treatment with SB-E, assuming I can get the mount to fit.

Quote:



Originally Posted by *terraprime*


I still wonder what this puppy does with Folding@home



Quote:



Originally Posted by *Artikbot*


Has M$ released the kernel update that was intended to fully utilize FX processors' horsepower?

Have the board manus yet released the latest BIOSes compatible 100% with FX processors?

Can someone fold on one of those please?


In order: 
As far as anyone knows right now, there is none.
Yes, at least ASUS has: 0813 was the BIOS used in the review on the Crosshair V Formula, and it's up on their site now. The version immediately before it (0805, IIRC) was what AMD shipped the boards with; 0813 was an optional update we could have skipped, so the boards are definitely compatible with it.
You're not the only one to ask (see question above yours). Unfortunately it didn't make it into our review, but I did manage to get results folding and posted it in the first post of the comment thread, which is quoted below.

Quote:



Originally Posted by *hokiealumnus*

I had promised folding results to Shelnutt2, but they couldn't make the review. So, they will be posted in the first post!

Regular SMP work unit - 13698.9ppd

Bigadv work unit - 13859.2ppd

So between 13,500 and 14,000 at stock, which is right where it's positioned - around the PPD of a 2500K.


My $.02 on DC performance with these - no one is going to touch them with a ten foot pole unless they have free electricity.


----------



## BlackOmega

Quote:



Originally Posted by *Dr. Zoidberg*


At that resolution any difference can be explained by the margin of error.



Quote:



Originally Posted by *Blameless*


Margin of error.



Quote:



Originally Posted by *DayoftheGreek*


No they wouldn't have the exact same results. This happens in every GPU limited review ever.


7.6% seems like a rather large margin of error, does it not?

Quote:



Originally Posted by *Horsemama1956*


2 frames is nothing. It's within the +/-.

Seems like a decent chip, though not much of an upgrade over current stuff. Seems like they made the architecture more complicated than it needed to be.

Not a horrible chip, but for the money Intel is the better option (unless you're a blind fanboy). Might get one of the cheaper options for a general use PC, just for the hell of it.


I still believe that this chip falls in between the 2500K and the 2600K in overall performance. It is undoubtedly better than the current Phenom IIs, including the X6s, for most rendering/encoding tasks, and even for most games. 
I do believe performance will only increase as programs or optimizations are released for the BD chip.
Is it worth the price premium over the X6 chips? That's really up to the end user.


----------



## GMcDougal

I never expected Bulldozer to give a beatdown to Sandy Bridge, but I did expect more than this. If AMD doesn't find some more performance somewhere, and fast, this CPU isn't going to sell well.


----------



## Tunapiano

Quote:



Originally Posted by *Blameless*


Exactly what every reasonable and sane person was expecting.

I can't agree with this. Enthusiast does not imply high cost.

Final performance per dollar of Bulldozer CPUs is not bad at all, and platform costs are extremely competitive.

Because the cores are narrower and share an FPU?

It was clear that some sacrifices would have to be made to fit 8 cores in the same transistor count as their previous six core chips, which already had inferior IPC to the last two generations of Intel processors.

Not all cores are created equal.

How many threads does TMPGEnc use?


Well, it can't compete in the enthusiast market for one reason: power consumption. You have to add a much higher-wattage PSU just to run this chip.

It has a power appetite much greater than what Intel's 900 series chips had. That's a sizable price increase for your build to compensate.


----------



## Mygaffer

You know, when you look at multi-threaded performance, the Bulldozer CPU performs very well. I know we aren't there yet on the software side, but we will be, and at that time this type of architecture will shine.

An interesting chip and all the content creation guys will have a new favorite cpu.


----------



## hydropwnics

Quote:



Originally Posted by *BlackOmega*


Is it worth the price premium over the x6 chips? That's really up to the end user.


Not sure I can justify getting a 990X board and an 8150; I'd almost rather get a 2500K/2600K and an Intel mobo at that point, even though I was looking forward to Bulldozer.


----------



## Dr. Zoidberg

Quote:



Originally Posted by *BlackOmega*


7.6% seems like a rather large margin of error, does it not?

I still believe that this chip falls in between the 2500K and the 2600K in overall performance. It is undoubtedly better than the current Phenom IIs, including the X6s, for most rendering/encoding tasks, and even for most games. 
I do believe performance will only increase as programs or optimizers are released for the BD chip.
Is it worth the price premium over the x6 chips? That's really up to the end user.


It doesn't fall in between the 2500K and 2600K. Look how many times it loses in these results:


----------



## Chewy

Quote:



Originally Posted by *BlackOmega*


Why? According to the results they'll beat your rig.




















Yeah, you're right on!! These Bulldozer CPUs destroy 2500Ks in power consumption


----------



## Chris13002

ENOUGH SAID......................


----------



## Acefire

the 2500k has been $180 at microcenter for almost 6 months now.....


----------



## mechtech

Apparently the 8c BD has 2B transistors while Sandy is under 1B. AMD could have a Pentium 4 on their hands.

The only hope for this architecture is if the 32nm process from GF is the sole culprit for this.


----------



## Dublin_Gunner

Quote:



Originally Posted by *lightsout*


I didn't see yours win once. In a lot of those tests lower is better.


*Mental note to self* - learn to read









For some reason, I still want a BD chip though.


----------



## Mygaffer

Wow, the power consumption is horrible.


----------



## Mygaffer

Quote:



Originally Posted by *Dublin_Gunner*


*Mental note to self* - learn to read









For some reason, I still want a BD chip though.


Of course, for the true enthusiast playing with new hardware is its own reward, no matter which chip is "better".


----------



## microfister

Quote:



Originally Posted by *BlackOmega*


I still believe that this chip falls in between the 2500K and the 2600K in overall performance. It is undoubtedly better than the current PhenomII's, including the x6's for most rendering/encoding tasks, and even for most games. 
I do believe performance will only increase as programs or optimizers are released for the BD chip.
Is it worth the price premium over the x6 chips? That's really up to the end user.


Don't you mean it's clawing at the ankles of the 2500K?

Like Hitler said, "a bunch of half cores mashed together"


----------



## Chewy

Quote:



Originally Posted by *Chris13002*


ENOUGH SAID......................











2500k killer right there!


----------



## Mygaffer

Quote:



Originally Posted by *Chewy*


2500k killer right there!


Someone is gloating.


----------



## LastBucsfan

Well.... at this point it looks like I might start doing some homework on what it will cost me to get a SB rig built.


----------



## Chewy

Quote:



Originally Posted by *Mygaffer*


Someone is gloating.


Not at all, reread my older posts


----------



## BlackOmega

Quote:



Originally Posted by *Dr. Zoidberg*


It doesn't fall in between the 2500K and 2600K. Look how many times it loses in these results:











Bah, more synthetic garbage. Real world, man, real-world apps.









Quote:



Originally Posted by *microfister*


dont you mean its clawing at the ankles of the 2500k?

like hitler said "a bunch of half cores mashed together"


Depends on the review you *want* to believe. I can see numerous things wrong with their methodology and test setups, RAM being the first thing I see wrong. I believe the Hardware Canucks review is skewed, whether intentionally or not: it's not running the RAM at 1866, but at 1333 at CAS 7 and CAS 9 respectively. If you know anything about AMDs, you KNOW how much their performance can be altered just by messing with the RAM, especially timings. Hell, I ran 1600MHz @ CAS 7 on an X3 720, and they could only muster 1333 CAS 7.


----------



## Ladamyre

Lookit guys, there are AMD fans that had high hopes for it and then there are Intel fans that would take any shortcoming as a sign AMD is going under. It gets to where I can pretty much go down and look at the signature and tell what type of remarks I'm going to see.

Bottom line for me is, there isn't enough data yet to make an informed decision. Plus the issue of 8 physical cores versus 8 virtual cores isn't ripe yet. Game developers are just now writing software that utilizes 4 cores, and this is why much of the benchmarking scores shouldn't be used to condemn this new processor. Add the fact that this is just the first release of a new architecture, and I think criticism is premature.

And to the Intel fans, a question: Where do you think Intel's pricing policy will go if AMD does go under and Intel gets a monopoly in the CPU market?

The right answer is up, way up.


----------



## nagle3092

Saddest part about all this Bulldozer nonsense is AMD totally ruined the FX name for themselves. They should have named it Phenom 2.5.


----------



## Chewy

Quote:


> Originally Posted by *BlackOmega;15276083*
> Bah more synthetic garbage. real world man, realworld apps.


Looks like you have already made your mind up on your next upgrade


----------



## DayoftheGreek

Quote:


> Originally Posted by *BlackOmega;15276083*
> Bah more synthetic garbage. real world man, realworld apps.


How exactly is video encoding, LAME encoding, games, photoshop, winrar, 7zip, not real world?

Last time you posted handbrake. These are all similarly real world. People actually use these applications.


----------



## Mad Pistol

Quote:


> Originally Posted by *BlackOmega;15276083*
> Bah more synthetic garbage. real world man, realworld apps.


Give me one instance where Bulldozer is the superior chip in real-world usage.

People keep saying this. If you're so confident that BD will be different in "real world usage," why do so many benchmarks not confirm it? Benchmarks are a measure of perceived performance, but they are also an indicator of how the chips will perform in the real world across different scenarios.

I mean, if you (or anyone else) can provide evidence that BD performs better in "real world applications," I'll jump all over it. As it stands right now, though, it does nearly everything slower while using more energy. That might be OK if the price were better... but it's not.

So, I'm going to wait for someone to prove that BD performs well in the "real world," because as it stands right now, it doesn't perform... at all.


----------



## Hallock

I am trying to save my comments till later. I really don't want to say I am disappointed with what I am seeing. For me it wasn't about whether AMD could surpass Intel; I just wanted the new chips to be faster than the old chips. Even 20% would have made me happy. Well, I guess my 955 will do me till......... sorry, not going to 'Intel'. NEVER!!!!


----------



## alwang17

I'm more interested in seeing the PPD these things will put out. If they're efficient in that department, especially compared to the 2600K, then they'd be worth a lot more if you wanted a powerful folder. Still, there'd be the power consumption, but at least that'd be one more thing going for it.


----------



## corky dorkelson

I am surprised at how well AMD's stock is doing so far today. I expected it to tank when markets opened this morning, but the stock has remained pretty flat so far....


----------



## Vagrant Storm

Quote:


> Originally Posted by *BlackOmega;15276083*
> Bah more synthetic garbage. real world man, realworld apps.


Just in case my sarcasm meter is on the fritz today... you do see all those games listed there, right? How much more real-world do you need? I'd like to see some compiling benchmarks, but those usually mirror the synthetic benchmarks.

And it doesn't really matter what the memory was set at, as long as the Intel CPU's was the same. However, I am very sure that for the OC testing they had the memory as fast as it could go.


----------



## badatgames18

omg!

source: hardware canucks: http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/47155-amd-bulldozer-fx-8150-processor-review-3.html










so it's true after all, huge bummer


----------



## kevink82

The problem is that AMD went with a GPU-style approach: simpler cores, but more of them. CPU software development just didn't go the same way GPU development did....

Even Intel is working on a CPU in the AMD style, with lots of simple cores, though just as a prototype. I have to agree Intel has far more financial backbone to work on stuff like this; just look at Larrabee. They can make something that costs them millions and never release it. AMD can't; they would have been nearly bankrupt not too long ago if they hadn't split off GlobalFoundries.


----------



## nagle3092

Quote:


> Originally Posted by *corky dorkelson;15276164*
> I am surprised at how well AMDs stock is doing so far today. I expected it to tank when markets opened this morning, but the stock has remained pretty flat so far....


I was checking that also this morning.


----------



## badatgames18

Quote:


> Originally Posted by *corky dorkelson;15276164*
> I am surprised at how well AMDs stock is doing so far today. I expected it to tank when markets opened this morning, but the stock has remained pretty flat so far....


That's because the general consumer has no idea what IPC is, nor do they have any clue which processor is actually faster clock for clock. They just think "oh, 8 'true' (not) cores must be better than 4c/8t."


----------



## Steak House

Quote:


> Originally Posted by *badatgames18;15276173*
> omg!
> 
> source: hardware canucks: http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/47155-amd-bulldozer-fx-8150-processor-review-3.html
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so it's true after all, huge bummer


This cannot be posted enough... It loses to a Phenom II...

Unreal AMD, Unreal...


----------



## bern43

2600K on the way. I'm in the middle of RMAing my AMD board, and the Microcenter deal is too good to pass up.


----------



## radaja

Quote:


> Originally Posted by *Steak House;15276201*
> This cannot be posted enough... It loses to a Phenom II...
> 
> Unreal AMD, Unreal...


Yep, I said the same and posted it last night. I put some in my sig, and now I'll post it once again; this graph really drives that painful nail deep into what AMD just did.


----------



## Kand

Quote:


> Originally Posted by *corky dorkelson;15276164*
> I am surprised at how well AMDs stock is doing so far today. I expected it to tank when markets opened this morning, but the stock has remained pretty flat so far....


With misinformation like this?

http://www.youtube.com/watch?v=8rDwXuAINJk

I don't see how they wouldn't fool the masses, if not investors, with it.


----------



## furmark

Quote:


> Originally Posted by *badatgames18;15276199*
> that's because the general consumer has no idea what ipc is.. nor do they have any clue which processor is actually faster clock per clock. They just think "oh 8 'true' (not) cores must be better than 4c/8t"


So true.


----------



## Wbroach23

All the ones I've looked at so far have the memory at 1333 or 1600. Are there any with 1866 or higher, and if so, which ones? I think I read somewhere that it performs better with 1866 or higher memory.


----------



## Baskt_Case

Quote:


> Originally Posted by *corky dorkelson;15276164*
> I am surprised at how well AMDs stock is doing so far today. I expected it to tank when markets opened this morning, but the stock has remained pretty flat so far....


Yeah, I was amazed too. Ha! This was the first thing my wife asked me when I told her how badly BD was doing; she wanted to know the stock prices!

And how can anyone holler bullcrap on the benchmarks? I like Hardware Canucks' reviews. Maybe not the best, but I do enjoy them and they are thorough enough for me. I use WinRAR, 7zip, and a lot of x264 encoding. BD just sucks. Even overclocked, it sucks. They measured something like 550W draw at the socket with it OC'd; that's insane for the crappy performance it achieved.

My lady wants to know what I want for Xmas, and the 2600K is at the top of my list now. Goodbye AMD, it was fun while it lasted.


----------



## Vagrant Storm

Quote:


> Originally Posted by *kevink82;15276177*
> The problem lies with AMD went with gpu style, simpler core and more. Just cpu software developement didnt really went the same way as gpu did....
> 
> Even intel is working on cpu that is AMD style with lots of cores but simple ones, just as a prototype. I have to agree intel has far more financial backbone to work on stuff like these, just look at their larrabee, they can make something that cost they millions and not release it, AMD cant they nearly announce they are bankrupt not too long ago if they didnt split and form GlobalFoundries.


I think this is true as well. Multithreaded code can get difficult to program. I am not a true programmer, but I have to make a lot of my own configuration and testing tools. I am working on one right now and trying to make it multithreaded just for fun... and it isn't working. It really is rather difficult. I can sort of see why it took so long to get multithreaded apps into the world. I think AMD was betting that apps using 6 or more threads would become the norm over the last few years, but that really hasn't happened.
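
The classic trap behind "it isn't working" is worth spelling out: two threads read the same shared value, both write back an increment, and one update is silently lost. A minimal sketch of that lost-update problem and the lock that prevents it (my own illustration, not code from the thread), in Python:

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    """Increment the shared counter n times."""
    global counter
    for _ in range(n):
        # Without the lock, "counter += 1" is a read-modify-write that
        # two threads can interleave, silently losing increments.
        with lock:
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000 every time with the lock held
```

Delete the `with lock:` line and the final count will often come up short, but not reliably and not by the same amount each run, which is exactly why these bugs are so maddening to chase down.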


----------



## Skripka

Quote:


> Originally Posted by *Wbroach23;15276277*
> All the ones Ive looked at so far have the memory at 1333 or 1600 is there any with 1866 or higher if so which ones? I think I read some where it performs better with 1866 or higher memory


Guru3D was using 2x4GB of 1866:

http://www.guru3d.com/article/amd-fx-8150-processor-review/11


----------



## Rookie1337

Dear OCN,

What happened? When did we let threads drag on into massive tirades of fanboyism on either side? What happened to the middle?

I swear in this thread I've seen people just refuse to look at the results. Yes, BD is a letdown because it's not a huge leap forward and consumes a larger amount of power. That doesn't mean you Intel fanboys have to start trolling here, saying it's a complete failure and that AMD should bottom out. All you're going to do is bring out the AMD fanboys and start a flame war. We don't need that.

Thank you,
Rookie1337


----------



## Nioxic

Now, I didn't see all the reviews (because I have no intention of buying one), just clicked a bit through Tom's Hardware's review. Seems like it's either slower or slightly faster (depending on the test) than my 3-year-old CPU?


----------



## badatgames18

Quote:


> Originally Posted by *Skripka;15276289*
> Guru3D was using 2x4GB of 1866:
> 
> http://www.guru3d.com/article/amd-fx-8150-processor-review/11


Am I reading their review right? Are they comparing BD @ 4.6GHz vs other procs @ stock?

EDIT: nvm, they compared it @ stock also... good review


----------



## Ruckol1

Quote:


> Originally Posted by *nagle3092;15276189*
> I was checking that also this morning.


Yeah, me too! +2.5%!?


----------



## Spicy61

Quote:


> Originally Posted by *badatgames18;15276199*
> that's because the general consumer has no idea what ipc is.. nor do they have any clue which processor is actually faster clock per clock. They just think "oh 8 'true' (not) cores must be better than 4c/8t"


No. More than likely AMD's stock is about the same because investors look _forward_. AMD's stock already took a hit when they announced a lower outlook due to poor yields a couple of days ago, i.e. their stock price had probably already adjusted.

It's just like how stocks can take a hit based on what briefcase the chairman of the Fed brings to a media meeting. Investors look forward and will adjust even before events occur.


----------



## xd_1771

Quote:


> Originally Posted by *badatgames18;15276199*
> that's because the general consumer has no idea what ipc is.. nor do they have any clue which processor is actually faster clock per clock. They just think "oh 8 'true' (not) cores must be better than 4c/8t"


No, it would be because we enthusiasts who want FX represent such a small, small share of the market. AMD's stock has plummeted lately, but that was due to poor Llano APU yields, not poor FX CPU yields.
I didn't expect this release to make much of a difference in stock prices at all.


----------



## Skripka

Quote:


> Originally Posted by *badatgames18;15276327*
> am i reading their review right? they are comparing bd @ 4.6 vs other procs @ stock?


Well, to be fair, they compared BD at stock _and_ at 4.6 against other CPUs at stock... and the comparison still goes that badly.


----------



## Wbroach23

Quote:


> Originally Posted by *Skripka;15276289*
> Guru3D was using 2x4GB of 1866:
> 
> http://www.guru3d.com/article/amd-fx-8150-processor-review/11


Thank you, at least I'll feel better looking at this one.


----------



## ghost_z

Another thing: if AMD ever went down the drain, it would mean disaster, doomsday even, for the Intel guys. Intel's prices are kept in check by AMD. If AMD goes down, Intel becomes a dictator, and that won't be good for any geek.

I hope BD sells well.


----------



## Vagrant Storm

Quote:


> Originally Posted by *Rookie1337;15276310*
> Dear OCN,
> 
> What happened? When did we let threads drag on into massive tirades of fanboyism on either side? What happened to the middle?
> 
> I swear in this thread I've seen people just refuse to look at the results. Yes BD is a let down because it's not a huge leap forward and consumes a larger amount of power. Doesn't mean you Intel fanboys have to start trolling here saying it's an complete failure and that AMD should bottom out. All you're going to do is bring out the AMD fanboys and start a flame war. We don't need that.
> 
> Thank you,
> Rookie1337


I think a lot of the trolls trolling AMD are the AMD fanboys.

But yeah... I think I'd vote for this thread to get a sticky in the hardware news section and then get closed. I think everyone has gotten their disgust out. Plus, there is the AMD CPU section to complain in.


----------



## Fasista

Quote:


> Originally Posted by *badatgames18;15276327*
> am i reading their review right? they are comparing bd @ 4.6Ghz vs other procs @ stock?
> 
> EDIT: nvm they compared it @ stock also.. good review


I think they're compared at stock speed.


----------



## manolith

This is horrible for AMD. I was anticipating Bulldozer until I started reading a few weeks back. I was still waiting to see reviews, but man, I don't see how they will pick this mess up. What a disaster.


----------



## Kauke

Awesome benchmark pics. Great performance. Look thick. Solid. Tight. Keep us all posted on your continued progress with any new progress pics or vid clips. Show us what you got, AMD. Wanna see how freakin' huge, solid, thick and fast you can get. Thanks for the motivation.


----------



## Dublin_Gunner

Quote:


> Originally Posted by *Rookie1337;15276310*
> Dear OCN,
> 
> What happened? When did we let threads drag on into massive tirades of fanboyism on either side? What happened to the middle?
> 
> I swear in this thread I've seen people just refuse to look at the results. Yes BD is a let down because it's not a huge leap forward and consumes a larger amount of power. Doesn't mean you Intel fanboys have to start trolling here saying it's an complete failure and that AMD should bottom out. All you're going to do is bring out the AMD fanboys and start a flame war. We don't need that.
> 
> Thank you,
> Rookie1337


It's inevitable though. Happens on every forum around the web.

If the 8150 dropped to about €180, it would be a very viable CPU.

I saw some gaming-related benches where it keeps up with, and beats, the 2600K. These are DX11, with supposed multi-threaded rendering enabled in drivers (HardOCP).

However, I would expect similar results in Civ 5 (as it definitely utilises this), yet for some reason it cannot keep up.

BD isn't a _bad_ CPU, it's just not much better than what came before, and worse in some scenarios.

Power consumption is ridiculous for a 2011 CPU, especially considering they allegedly put a lot of time and effort into clock and power gating.

The design might work better on a smaller process node; 28nm could be where the BD design shines.


----------



## kweechy

Quote:


> Originally Posted by *Spicy61;15276336*
> No. More than likely AMD stock is relatively the same because investors look _forward_. AMD stock already took a hit when they announced a lower outlook due to poor yields a couple days ago. IE their stock was probably already adjusted.
> 
> It's just like how stocks can take a hit by what briefcase the chairman of the fed brings to a media meeting. Investors look forward and will adjust even before events occur.


But investors had no real insight into Bulldozer performance other than the leaked benchmarks.

Mind you, I used the leaks and sold my shares, so maybe they all did the same a couple of weeks ago, and that's why the shares have been below $5 for a while now.


----------



## kweechy

Quote:


> Originally Posted by *ghost_z;15276363*
> another thing is that, if ever amd went down the drain then it will mean disaster or doomsday for intel guys....intel prises are kept in check by amd.....if amd goes down then intel will become a dictator.......and that won't be good for any geek.........
> 
> i hope bd sells well........


It's bad for Intel too; they'd probably get split up by anti-trust measures if that happened.


----------



## badatgames18

I'm too poor to have stocks... my money goes towards books, food, and computer stuff. But if I were an investor, I wouldn't sell. I'm no expert, but stocks are supposed to be long-term investments, which should yield some gain over time if the company is any good at all.

EDIT: my criticism is purely from the perspective of an enthusiast expecting more


----------



## Djankie

This won't be good enough for competition.


----------



## Kauke

I think I'm going to cry.


----------



## hokiealumnus

Quote:


> Originally Posted by *Wbroach23;15276277*
> All the ones Ive looked at so far have the memory at 1333 or 1600 is there any with 1866 or higher if so which ones? I think I read some where it performs better with 1866 or higher memory


Ours (I'm not allowed to link to it, but it's the Overclockers.com review linked in the OP) was run at 1866 with plenty tight latencies of 6-9-7-24. Fast RAM is far from making BD compete with a 2600K.

The memory controller is quite impressive once you freeze it though. I ran DDR3-2400+ when benching under LN2.


----------



## Dublin_Gunner

Quote:


> Originally Posted by *kweechy;15276391*
> But investors had no real insight into Bulldozer performance other than the leaked benchmarks.
> 
> Mind you, I used the leaks and sold my shares, so maybe they all did the same a couple weeks ago and that's why the shares have been below $5 for a while now.


They actually broke $5 earlier today. Some analysts obviously still have faith.


----------



## Lampen

Well I see nothing much has changed since last night.


----------



## badatgames18

Quote:


> Originally Posted by *hokiealumnus;15276471*
> Ours (I'm not allowed to link to it, but it's the Overclockers.com review linked in the OP) was run at 1866 with plenty tight latencies of 6-9-7-24. Fast RAM is far from making BD compete with a 2600K.
> 
> The memory controller is quite impressive once you freeze it though. I ran DDR3-2400+ when benching under LN2.


What are the set dividers, hokiealumnus?
I'm still buying one.









How cold did you have to get it?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Rookie1337;15276310*
> Dear OCN,
> 
> What happened? When did we let threads drag on into massive tirades of fanboyism on either side? What happened to the middle?
> 
> I swear in this thread I've seen people just refuse to look at the results. Yes BD is a let down because it's not a huge leap forward and consumes a larger amount of power. Doesn't mean you Intel fanboys have to start trolling here saying it's an complete failure and that AMD should bottom out. All you're going to do is bring out the AMD fanboys and start a flame war. We don't need that.
> 
> Thank you,
> Rookie1337


So what are we supposed to do, just not comment on the reviews? This is a public FORUM. Get that through your heads, people.

As far as the reviews are concerned I, unfortunately, am not at all surprised. The writing was on the wall the second Q2 slipped to 60-90 days for what were rumored to be poor performance yields. I said at the time (amidst rabid AMD fanboy objections) that things looked very bad for BD. This was back in June when several people were claiming that Anand and Tom's were "trolling" with their reports claiming poor performance was the issue with the delay. Some people still cannot bring themselves to concede that BD was ever actually delayed to begin with.

This was very predictable at least since mid-summer. It is a massive disappointment to me because Intel now has no reason to stay in competitive mode. I expect them to let IB and SB-E roll out more slowly now and at higher prices, and that sucks for us Intel guys. Nobody wins here...


----------



## Lampen

OMG NEWEGG FAIL

http://www.newegg.com/Product/ComboDealDetails.aspx?ItemList=Combo.739582

Act now before the combo is gone! LOLOL


----------



## G3RG

Quote:


> Originally Posted by *corky dorkelson;15276164*
> I am surprised at how well AMDs stock is doing so far today. I expected it to tank when markets opened this morning, but the stock has remained pretty flat so far....


That would be because we're part of the tiny minority that:
1. Knows what Bulldozer is...
2. Gives a frick


----------



## Dublin_Gunner

Quote:


> Originally Posted by *Lampen;15276539*
> OMG NEWEGG FAIL
> 
> http://www.newegg.com/Product/ComboDealDetails.aspx?ItemList=Combo.739582
> 
> Act now before the combo is gone! LOLOL


It's no wonder it's performing so poorly!


----------



## Rookie1337

http://www.guru3d.com/article/amd-fx-8150-processor-review/14

Some "complete" failure. Guys calm down and think logically. Did you really expect them to completely trounce the 2600k and the 990x? I'm just glad there's places it beats the old Phenoms. However, I'm pretty much unable to justify the power consumption results.


----------



## pursuinginsanity

Quote:


> Originally Posted by *badatgames18;15276513*
> i'm still buying one


Rofl. I can't believe ANYONE would buy one. Maybe at $120 like my 965 BE, but at current prices?


----------



## hokiealumnus

Quote:


> Originally Posted by *badatgames18;15276513*
> what are the set dividers hokealumnus?
> i'm still buying one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> how cold did you have to get it?


Well, I don't know at what point the IMC starts being happy. I know it _doesn't_ simply run DDR3-2133 on ambient cooling with no tweaks other than setting memory specs. It ran at what seemed stable until we ran SuperPi 32M, which would not complete.

As far as _how_ cold, I couldn't tell you unfortunately. The limited time we had these (12 days from receipt to publishing) didn't allow too much experimentation. Since I knew from the WR they didn't have a cold bug, I just filled the pot with LN2 and kept it there until I was done.

EDIT - At stock 200 MHz bus, you can set DDR3-1600, 1866, 2133 & 2400. Not that it will run all of them, but you can set them on the Crosshair V Formula.
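For anyone wondering how those divider settings map onto the 200 MHz reference clock, here is a quick back-of-envelope sketch (my own arithmetic, not from the review; DDR3's effective rating is twice the memory clock):

```python
# Back-of-envelope math, not from the review: DDR3 is double data rate, so
# effective rating = 2 x memory clock, and memory clock = bus clock x multiplier.
BUS_MHZ = 200  # stock reference clock


def memory_multiplier(ddr3_rating, bus_mhz=BUS_MHZ):
    """Memory multiplier needed to hit a given DDR3-xxxx effective rating."""
    return ddr3_rating / 2 / bus_mhz


for rating in (1600, 1866, 2133, 2400):
    mult = memory_multiplier(rating)
    print(f"DDR3-{rating}: memory clock {rating // 2} MHz, multiplier {mult:.2f}")
```

So DDR3-2400 at the stock bus is a 6x memory multiplier, and raising the reference clock scales every rating proportionally, which is how overclockers push past the top divider.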


----------



## GameBoy

Quote:


> Originally Posted by *Rookie1337;15276310*
> Dear OCN,
> 
> What happened? When did we let threads drag on into massive tirades of fanboyism on either side? What happened to the middle?
> 
> I swear in this thread I've seen people just refuse to look at the results. Yes BD is a let down because it's not a huge leap forward and consumes a larger amount of power. Doesn't mean you Intel fanboys have to start trolling here saying it's an complete failure and that AMD should bottom out. All you're going to do is bring out the AMD fanboys and start a flame war. We don't need that.
> 
> Thank you,
> Rookie1337


To be honest, people have been far better behaved in this thread than in others.


----------



## Benz

These are not the benchmarks I was expecting.



These are the benchmarks I've had from my cousin for over 2 months.

Yes, I know they're remarkably similar to the Bjorn3D ones, but maybe they were using the same software as my cousin.

I don't know what to say...


----------



## microfister

Quote:


> Originally Posted by *Lampen;15276539*
> OMG NEWEGG FAIL
> 
> http://www.newegg.com/Product/ComboDealDetails.aspx?ItemList=Combo.739582
> 
> Act now before the combo is gone! LOLOL


lol, WTH, why didn't anyone tell me I could put a Bulldozer on an X58 chipset?!?!

This opens up a whole new level of possibilities!


----------



## GMcDougal

Sent JF-AMD a PM and asked for an explanation on performance. This is his reply:

"This processor is targeted at higher threaded workloads, not single thread performance.

8 cores, everything unclocked, great price. There is definitely a segment of the market that wants that."

So if im understanding this correctly, the future will be much brighter for bulldozer assuming applications in the future will utilize more cores?


----------



## Vagrant Storm

Quote:


> Originally Posted by *Lampen;15276539*
> OMG NEWEGG FAIL
> 
> http://www.newegg.com/Product/ComboDealDetails.aspx?ItemList=Combo.739582
> 
> Act now before the combo is gone! LOLOL


What the heck? Think that was a mistake, or is Newegg trying to be funny?

For those that didn't look, it's a combo deal for an Intel motherboard and an FX-8120.


----------



## BlackOmega

Quote:


> Originally Posted by *Chewy;15276141*
> Looks like you have already made your mind up on your next upgrade


Who said anything about an upgrade? My rig is perfectly fine, does everything I want it to, and is enough for gaming; albeit with some GPU refreshing when the next gens come out.

Although, I may just get BD to mess around with.








Quote:


> Originally Posted by *Mad Pistol;15276152*
> *Give me one instance where Bulldozer is the superior chip in real world usage?*
> 
> People keep saying this. If you're so confident that BD will be different in "real world usage", why do so many benchmarks not confirm this? Benchmarks are a measure of perceived performance. However, they are an indicator of how the chips will perform in the real world if given different scenarios.
> 
> I mean, if you (or anyone else) can provide evidence that BD performs better in "real world applications" I'll jump all over it. However, as it stands right now, it does nearly everything slow and while using more energy. That might be ok if the price was better... but it's not.
> 
> So, I'm going to wait for someone to prove that BD performs well in the "real world" because as it stands right now, it doesn't perform... at all.


How about Handbrake, Videora, and the x264 benchmark? (Note: the first run of the x264 BM is just scanning; the second pass is actual encoding.)
Is that not enough? How about Battlefield 3?

Well, there's four.

Look, I'm not saying BD is superduperawesomewow! However, it's not as bad as people claim it to be. There are still a lot of wrinkles that need to be ironed out, the most notable being the OS scheduler. Since it's not optimized for BD yet, it'll randomly assign tasks to whichever core is handy, in a very inefficient manner.
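On the scheduler point, here is a hedged sketch of the idea (the core pairing below is my assumption about the FX-8150's topology, not AMD documentation). Each module hosts two integer cores that share a front-end and FPU, so stacking two busy threads in one module while another module idles costs throughput; spreading one thread per module first avoids that:

```python
# Hedged illustration of module-aware thread placement on an assumed FX-8150
# layout where logical cores (0,1), (2,3), (4,5), (6,7) pair into modules.
MODULES = [(0, 1), (2, 3), (4, 5), (6, 7)]


def spread_affinity(n_threads):
    """Return core IDs, choosing one core per module before doubling up."""
    first = [pair[0] for pair in MODULES]   # one core from each module
    second = [pair[1] for pair in MODULES]  # module siblings, used last
    return (first + second)[:n_threads]


# A module-blind scheduler might put 4 threads on cores 0-3 (two crowded
# modules, two idle); a module-aware spread would use cores 0, 2, 4 and 6.
```

As I understand it, this is roughly the placement policy a BD-aware OS scheduler would apply automatically; on current Windows you would have to pin threads by hand to get it.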


----------



## ghost_z

Quote:


> Originally Posted by *Benz;15276599*
> These are not the benchmarks I had for over 2 months.
> 
> 
> 
> These are the benchmarks I've got from my cousin.
> 
> Yes I know they're remarkably similar to Bjorn3D ones but maybe they were using the same software as my cousin.
> 
> I don't know what to say...


That's impressive... but we still need more user-made reviews to judge the actual real-world performance.

But the insane power requirements are still the biggest deal-breaker, IMO...


----------



## badatgames18

Quote:


> Originally Posted by *hokiealumnus;15276587*
> Well, I don't know at what point the IMC starts being happy. *I know it doesn't simply run DDR3-2133 on ambient cooling* with no tweaks other than setting memory specs. Well, it ran it at what seemed stable until you run SuperPi32M, which would not complete.
> 
> As far as _how_ cold, I couldn't tell you unfortunately. The limited time we had these (12 days from receipt to publishing) didn't allow too much experimentation. Since I knew from the WR they didn't have a cold bug, I just filled the pot with LN2 and kept it there until I was done.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT - At stock 200 MHz bus, you can set DDR3-1600, 1866, 2133 & 2400. Not that it will run all of them, but you can set them on the Crosshair V Formula.


Wow, thanks for the info. Maybe it can bench @ 7GHz? Hopefully at least 2D.

But if things stay the way they are, it's gonna be like benching a Phenom II with a better IMC.


----------



## Wbroach23

Quote:


> Originally Posted by *hokiealumnus;15276471*
> Ours (I'm not allowed to link to it, but it's the Overclockers.com review linked in the OP) was run at 1866 with plenty tight latencies of 6-9-7-24. Fast RAM is far from making BD compete with a 2600K.
> 
> The memory controller is quite impressive once you freeze it though. I ran DDR3-2400+ when benching under LN2.


thanks


----------



## bringonblink

This was... disappointing. Don't get me wrong, I was never under any illusions that it would be competing with the i7 at a decent level, but I did expect it to consistently beat Phenom II.

Was gonna upgrade; looks like I'm sticking with my 955.


----------



## Chewy

Quote:


> Originally Posted by *ghost_z;15276659*
> thats impressive.....but still we need more user made reviews to judge the actual real world performance....
> 
> but the insane power requirements are still the biggest deal breaker imo.......


How are home user reviews any more viable than the ones listed on the first page?

In fact, I would not trust ANY home user reviews at all; Photoshop is amazing and all...


----------



## jprovido

Quote:


> Originally Posted by *ghost_z;15276659*
> thats impressive.....but still we need more user made reviews to judge the actual real world performance....
> 
> but the insane power requirements are still the biggest deal breaker imo.......


Can i haz exact power consumption? Both at stock and overclocked, and what's the difference versus a stock/overclocked i7 2600K and i5 2500K? Ty guys, reps will be given btw.


----------



## Milestailsprowe

AMD Bulldozer does not seem toooo bad. It's slow, yes, but I see Bulldozer getting better when things that require quad cores come around. In the [H]ardOCP review it kept up and more in Battlefield 3. If I was one of the ones who waited for Bulldozer, then I say get one.


----------



## Benz

This is just another one of AMD's overambitious failures.

Hooray to my sig.


----------



## Kand

Quote:


> Originally Posted by *Milestailsprowe;15276721*
> Amd Bulldozer does not seem toooooooooooooooooooo bad. Its slow yes but I see bulldozer getting better for when things that require quad cores come around. in the HardOC review it keep up and more for battle field 3. If I was one of the ones who waited for Bulldozer then I say get one.


Battlefield 3 is a GPU limited game.


----------



## ghost_z

Quote:


> Originally Posted by *Chewy;15276702*
> How are home user reviews any more viable than the ones listed on the first page
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Infact i would not trust ANY home user reviews atall, Photoshop is amazing and all.....


Guys like us on the same forum can at least be truthful to each other... so it might make more sense if people here made some reviews themselves, although I agree about the Photoshop concern.

And btw, with users having absolutely different overall hardware configurations and different applications/games, we can get an even better idea about BD's overall performance... but that's just my point of view.


----------



## radaja

Quote:


> Originally Posted by *Benz;15276722*
> This it just another one of AMD's overambitious failures.
> 
> Hooray to my sig.


Why did you wait so long? Did your cousin already send you benchmarks? You should have known, right?


----------



## GMcDougal

Quote:


> Originally Posted by *Kand;15276729*
> Battlefield 3 is a GPU limited game.


Also, let's keep in mind we are comparing a beta here. I would not put all my faith in a beta... especially one that will supposedly change a lot of things.


----------



## TFL Replica

Quote:


> Originally Posted by *GMcDougal;15276639*
> Sent JF-AMD a PM and asked for an explanation on performance. This is his reply:
> 
> "This processor is targeted at higher threaded workloads, not single thread performance.
> 
> 8 cores, everything unclocked, great price. There is definitely a segment of the market that wants that."
> 
> So if im understanding this correctly, the future will be much brighter for bulldozer assuming applications in the future will utilize more cores?


Definitely. It's just that the segment of the market he's referring to is not PC gamers.


----------



## Riou

I get worse performance in Dirt 3 with a 4.6GHz i5 2500K than with a 4.0GHz i5 750.


----------



## OC'ing Noob

All my lingering concerns about purchasing SB too soon have been laid to rest. As for people who are still somehow surprised by this, the only thing I have to say is that you should have seen this coming, and it is reality, not a nightmare.


----------



## hokiealumnus

Quote:


> Originally Posted by *badatgames18;15276676*
> wow.. thanks for the info.. maybe it can bench @ 7ghz? hopefully at least 2d..
> 
> but if things stay the way they are.. it's gonna be like benching a phenom II with better imc


Heh, go read our review! LN2 results are down toward the bottom. It benched WPrime 32M @ 6.5 GHz and SuperPi1M @ 7.5 GHz. Max CPUz validation was 7623 MHz.


----------



## snowman88

Back to the drawing board for AMD. Not enough multithreaded applications out there to justify grabbing one of these. We gamers want better per-core performance, not just two more cores slapped on an X6 Thuban.


----------



## Liquidpain

Quote:


> Originally Posted by *Rookie1337;15276553*
> http://www.guru3d.com/article/amd-fx-8150-processor-review/14
> 
> Some "complete" failure. Guys calm down and think logically. Did you really expect them to completely trounce the 2600k and the 990x? I'm just glad there's places it beats the old Phenoms. However, I'm pretty much unable to justify the power consumption results.


According to a lot of folks on here, Bulldozer was supposed to DESTROY Sandy Bridge. So yes, it is a COMPLETE failure.

How, you may ask? Simple. It has lower IPC yet consumes nearly DOUBLE the power, it can barely hold its own in heavily threaded apps, it's damn near useless in single to lightly threaded apps, and the price per performance is laughable.


----------



## Benz

Quote:


> Originally Posted by *radaja;15276746*
> why did you wait so long?did your cousin already send you benchmarks?you should have know right?


If you'd followed this thread more carefully you would know that I already posted them. Oh, and he sent me those benchmarks over 2 months ago.

Quote:


> Originally Posted by *Benz;15276599*
> These are not the benchmarks I was expecting.
> 
> 
> 
> These are the benchmarks I had for over 2 months from my cousin.
> 
> Yes I know they're remarkably similar to Bjorn3D ones but maybe they were using the same software as my cousin.
> 
> I don't know what to say...


----------



## levathar

Wow!! I know that this is an enthusiast forum, but I thought I would see people looking at this with "other" eyes.
On a first hard look, BD is not what everyone expected, but it is not bad.

My analysis is that the BD architecture is really a "step" forward; current software can't take the juice out of it.
You can see the potential of Bulldozer in highly threaded applications! If it were simply crap, reviewers could easily write BD off as being as bad as the Phenom II, but remember what the Phenom II was targeted at? Core 2 Quads... so...
Is it as bad as everyone says? AMD is going up against current hardware made by Intel, so it is not bad at all. Single-thread performance is really bad, indeed; I myself was disappointed (I wanted BD for BOINC).

I am sure that the architecture has plenty of headroom for development in future years.
I am sure AMD engineers will take every point raised by reviewers into account for Piledriver...

Considering that AMD only has a slice of the market... it is an amazing result.
Intel will manage to kill the smaller CPU makers (some are almost gone... VIA?) like they did with Cyrix and so on...

Come on AMD, keep up the pace and fight for survival...

As for me, I might go for a 6100 or an 8120... depends on how things evolve in the following weeks, prices and so on...

Ok, now you can flame me lol. Yeah, do that, flame the newbie lol lol lol


----------



## kenolak

I don't see why people are in such a fuss about things. It's an 8-core CPU that handles everything more than well enough for the average person's wants/needs, and handles multithreaded work very close to or better than anything else at the same price. Of course the price is a bit high right now... At least they will drop it at a reasonable rate before changing the entire socket, unlike the competitor, who sets prices and leaves them the same for a great while.

Personally, I like it. How fast do you really need to browse the web and play videogames?
Wow, a 2-frame difference is SUCH A LET DOWN! If my frames aren't at least 4 times what my monitor will display while ghosting, why bother!
Seriously....

One thing I would like to see benched is micro stuttering.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *BlackOmega;15276964*
> 
> 
> 
> 
> 
> 
> 
> This is not a Thuban in any way. It's a COMPLETELY NEW architecture.


The sad part about this is if they had just added two more cores to Thuban it would have obliterated these BD results...


----------



## Liquidpain

Quote:


> Originally Posted by *kenolak;15276913*
> I don't see why people are in such a fussy about things. It's an 8 core cpu that handles everything more than well enough for the average persons wants/needs, and handles the multi threaded very close or better than anything else at the same price. Of course the price is a bit high right now... At least they will drop at a reasonable rate before changing the entire socket unlike the competitor who sets and leaves prices the same for a great while.
> 
> Personally, I like it. How fast do you really need to browse the web, and videogames?
> Wow, a 2 frame difference is SUCH A LET DOWN! If my frames aren't at least 4 times what my monitor will display while ghosting why bother!
> Seriously....
> 
> One thing I would like to see benched is micro stuttering.


The only area where BD is worth a hoot is where it consumes the most insane amounts of power. Even if a business were to get BD for its computing power, the power draw is too much for them to find any type of savings, and that is vs. SB. When it comes to SB-E or Ivy, forget it.


----------



## criminal

Quote:


> Originally Posted by *twich12;15274517*
> amd's fermi.... im so disappointed it hurts my brain! how could they possible settle for WORSE single core performance? heck they are going backwards!


Fermi? Haha... not hardly. Fermi was never a failure.


----------



## Majin SSJ Eric

Like I said, this is a sad day for all computer enthusiasts. Intel can't start thinking they have no competition or we will not see a major improvement over SB for years....


----------



## Skripka

Quote:


> Originally Posted by *BlackOmega;15276964*
> OF course their answer will be..
> 
> ^this .
> 
> and that it somehow falls into "Margin of error".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is not a Thuban in any way. It's a COMPLETELY NEW architecture.


Using a modern GPU-dependent video game as a yardstick for CPU performance is just silly. Let us just call it that and move on, shall we? For a part supposedly targeted at enthusiasts, it falls way short. It does decently against the 2600K under heavily multi-threaded apps... Unfortunately, the overwhelming majority of apps are not that multi-threaded. Single-threaded performance is what counts for most users, even enthusiasts... and the 8150 as it stands now falls flat on that score.

Heck, I'm disappointed in the folding PPD BD generates, and that too is a highly multi-threaded task. From what I'm seeing it doesn't do any better than my 1055T, or my i7-950's ~13k PPD in SMP. Hopefully something is wonky software-wise, but it doesn't sound like AMD has told the reviewers about this.


----------



## ghost_z

Quote:


> Originally Posted by *criminal;15277005*
> Fermi? Haha... not hardly. Fermi was never a failure.


In power consumption and heat dissipation they were... so I guess the analogy is quite accurate. (Leave the performance part out of it; Fermi performed well even with those flaws, but Bulldozer is another story.)


----------



## RagingCain

I guess the AMD camp is down voting the official review thread?

Very disappointed in AMD, although I suspected as much. Too many negative leaks with similar stories.


----------



## ghost_z

If this is the way things are going, then I think my i7 will last me another 3 years, maybe even more...


----------



## AMD2600

I see the BD FX-8150 as a little before its time. The world isn't ready for such a multithreaded task-crunching CPU. Maybe revisions will be released that address the low-threaded task performance.


----------



## kenolak

Quote:


> Originally Posted by *Liquidpain;15276999*
> The only area where BD is worth a hoot is where it consumes the most insane amounts of power. Even if a buisness were to get BD for it's computing power, the power draw is too much for them to find any type of savings and that is vs. SB. When it comes to SB-E or Ivy, forget it.


Very valid point from a business perspective, if you're not already in a building that uses free (solar/wind) energy. You turn off the lights and don't let giant printers sit turned on all day, along with all the other small things that sip power. The power draw is too high, but that isn't really a new problem, is it?


----------



## badatgames18

Quote:


> Originally Posted by *hokiealumnus;15276829*
> Heh, go read our review! LN2 results are down toward the bottom. It benched WPrime 32M @ 6.5 GHz and SuperPi1M @ 7.5 GHz. Max CPUz validation was 7623 MHz.


Very nice review, hokiealumnus.

... that IMC has got me pumped to finally try some extreme memory benches.

Such big numbers, but the scores are so mediocre.

btw, I assume you were using PSC in that review? You didn't try out Hypers by any chance, did you? lol.. I heard they do well on FX.

EDIT: nvm, I saw you used some Flares.. I know I have some somewhere.. wow, 2400MHz with them.. nice


----------



## t-ramp

Even if one calls the performance similar to Sandy Bridge, the power draw is absurd. If I'm going to upgrade my PC, I don't want to go the route that will cost me more perpetually.


----------



## microfister

Quote:


> Originally Posted by *Majin SSJ Eric;15276996*
> The sad part about this is if they had just added two more cores to Thuban it would have obliterated these BD results...


Yeah, a die shrink, maybe 3D transistors. Let's face it, they had five years to do it. That means they could've spent two years at the drawing board and three years completing the task. Not sure what happened.

AMD will not suffer greatly though, as much as some of us would want, and who knows; the enthusiast is a very small portion of the consumers. With them pushing the "8 Cores" it may do really well on the market, in which case everybody benefits.

AMD meets their quota, the 7*** series comes in shortly after, Intel sees the competition stealing sales and lowers prices, and hopefully releases Ivy without too much delay.

Everybody wins.


----------



## Fletcherea

I dunno, looks decent enough for a 200 dollar processor. I'm really not that enthused about the power usage numbers so far, though. The quad-core version is the one I'll be looking for, and the price for that will be really decent. No need whatsoever for 8 cores in my homestead; had to throw in my pennies =P


----------



## radaja

Quote:


> Originally Posted by *Benz;15276897*
> If you'd observe this thread more carefully you would know that I already posted them. Oh and he sent me those benchmarks over 2 months ago.


Yeah, I saw them. Why then did you say to everyone that BD would be great, "trust me guys, my cousin sent me graphs"? Those graphs show the BD x8 barely trading blows with the i5-2500K quad core.
My point still stands: two months ago, when you saw these graphs, why didn't you go with SB? You already knew the rumors of BD costing more than a 2500K. I just don't get it. Maybe you don't know how to evaluate graphs?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *microfister;15277148*
> yea die shrink, maybe 3d transistors. lets face it, they had five years to do it. that means they couldve spent two years at the drawing board, and three years completing the task. not sure what happened.
> 
> amd will not suffer greatly though as much as some of us would want, and who knows, the enthusiast is a very small portion of the consumers. with them pushing the "8 Cores" may do really well on the market. in which case everybody benefits.
> 
> amd meets their quota, the 7*** series comes in shortly after. intel sees the competition steeling sales and lowers prices, and hopfully releases Ivy without too much delay.
> 
> everybody wins.


That would be nice. Sucks we have to put all our hopes in the ignorant masses though...


----------



## MoRLoK

Well, goodbye AMD.

After 20 years of being a fanatical AMD fan, I feel cheated. When prices where I live were really low, they bombarded the whole world with screens and bar charts showing it would compete with the 2600K and even the 990X, so I waited. Now prices are really high because of the strong $.

Now if I want to buy a 2600K I must pay $150 more for it! Time to stop selling AMD platforms. I'm not angry about all of this; I just thank AMD for showing me how they treat their long-time customers.

And with today's power bills... no way, AMD.

Yes, my English is bad.


----------



## RagingCain

Quote:


> Originally Posted by *Majin SSJ Eric;15277194*
> That would be nice. Sucks we have to put all our hopes in the *ignorant masses though...*


Well, everyone was (or should have been) ignorant until today, with the exception of AMD and a handful of people. Claiming otherwise is a lie.

Still extremely disappointing. The worst part is AMD marketing this chip like the second coming. Almost as bad as Apple can be; shame on AMD.


----------



## Artikbot

Quote:


> Originally Posted by *AMD2600;15277104*
> I see BD FX8150 a little before it's time. The world isn't ready for such a multi threaded task crunching CPU. Maybe revisions will be released that address the low threaded task performance.


That's what I think as well.

The APU approach was designed for a world where GPUs come into play during real-world tasks, not a world where the GPU is reserved merely for extensive vector-based apps or gaming.

AMD thinks forward. Almost too forward. If they hadn't dropped as much 'old tech' in this new architecture, it'd have performed like a boss. But who's to say that in a year Windows 8 doesn't come up with god-knows-what multitasking improvement, and Bulldozer/Bulldozer II trounces and tears through the competition?

We sincerely don't know.


----------



## Liquidpain

Quote:


> Originally Posted by *kenolak;15277105*
> Very valid point from a business perspective if you're not already in a building that uses free(solar/wind) energy. You turn off the lights, and don't let giant printers sit turned on all day a long with all the other small things that sip power. The power is too high, but that isn't really a new problem is it.


Ok. Cool. So what percentage of buildings use wind and solar?

Let's not forget how expensive those forms of energy are in themselves, but I digress. BD fails on every level, no matter how people try to spin it.


----------



## hajile

The chip was launched too early. I think that two performance increases are coming. The first will be Piledriver. I suspect (given that it is launching in just a few months) that AMD pushed out this chip knowing that it had problems. I further suspect that the biggest problem is the one thing AMD won't talk about: branch prediction. Anand's N x N queens simulation helps show that branch prediction has taken a large step backwards. Given that branch prediction is even more important with a longer pipeline, this is one of the most likely culprits. I believe that Piledriver will focus almost entirely on fixing this problem.

The second problem is GF. AMD planned on a base frequency 30% faster and only achieved 9% faster. Even with the problems the chip has, overclocked benchmarks @ 4.6 GHz (about 30% faster) place the chip more where it should be (at stock). In the upcoming revisions I think that speed bumps of 200 MHz+ are likely. Part of the cache latency problem is also linked to poor fab yields (increasing cache latency can help improve yields), and I suspect that Piledriver will also have cache latencies more along the lines of Phenom II (or maybe just a bit faster).

All this said, this chip's intended audience was never primarily consumers. I think that server benchmarks show this chip to be more than competitive with Sandy Bridge.

OT edit: I think that with the BD transistor budget, AMD should have done what IBM did with POWER7 and made the L3 cache eDRAM (IBM did this to fit more in while using less die space), forgoing L3 for consumer chips, because L3 is primarily there to help with multi-processor bottlenecking in server/HPC environments.
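To put rough numbers on the frequency claim above, here is my own reconstruction (it assumes the "+9%" compares the FX-8150's 3.6 GHz base clock to the Phenom II 1100T's 3.3 GHz; the baseline choice is my assumption, not something the post states):

```python
# Reconstruction of the clock-speed claims; baseline choice is an assumption.
PRIOR_GEN = 3.3   # GHz, Phenom II X6 1100T base clock
SHIPPED = 3.6     # GHz, FX-8150 base clock as launched (~9% up)
OVERCLOCK = 4.6   # GHz, a common review overclock


def uplift(freq, base):
    """Percentage gain of freq over base."""
    return (freq / base - 1) * 100


planned = PRIOR_GEN * 1.30  # ~4.3 GHz, if the rumored 30% target had held

# Shipped gains only ~9%; the 4.6 GHz overclock (~28% over the shipped base)
# lands close to where the rumored stock target would have been.
print(f"shipped +{uplift(SHIPPED, PRIOR_GEN):.1f}%, planned ~{planned:.2f} GHz, "
      f"OC +{uplift(OVERCLOCK, SHIPPED):.1f}% over shipped")
```

Under those assumptions the arithmetic is consistent with the post: a 4.6 GHz overclock sits roughly where the planned stock part would have clocked.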


----------



## kenolak

Quote:


> Originally Posted by *Liquidpain;15277237*
> Ok. Cool. So what percentage of buildings use wind and solar?


How many limit power use, or have a data center?


----------



## GTR Mclaren

Man... biggest thread on OCN... 10 hours of life and more than one thousand replies.

I expected more of BD... not to crush SB or anything like that...

But c'mon... even the Phenom II X6 1100T beats the 8150 in many cases...


----------



## Liranan

Quote:


> Originally Posted by *Rookie1337;15276553*
> http://www.guru3d.com/article/amd-fx-8150-processor-review/14
> 
> Some "complete" failure. Guys calm down and think logically. Did you really expect them to completely trounce the 2600k and the 990x? I'm just glad there's places it beats the old Phenoms. However, I'm pretty much unable to justify the power consumption results.


I actually concur. Looking at the benchmarks again, I see they're not as bad as I thought. I was pretty upset, so I didn't actually look at the benchmarks.

The only problem I see with BD is the insane power usage. Once we've had a refresh it might be an excellent CPU to get; in fact, I bet BOINC would love it. Though I couldn't run this and my GPU at the same time.

I am now reconsidering my original disappointment.


----------



## microfister

Quote:


> Originally Posted by *Majin SSJ Eric;15277194*
> That would be nice. Sucks we have to put all our hopes in the ignorant masses though...


true, as a much wiser man than my self once said.
Quote:


> Originally Posted by *Kay*
> A person is smart. People are dumb, panicky dangerous animals and you know it.


you said it kay ^^^


----------



## obsidian86

Quote:


> Originally Posted by *microfister;15271828*
> wow, thanks amd, you just monopolized intel(so much for competitive price drops). and way to go for being a bottleneck on your upcoming gpus, looks like if you want to xfire 7 series you may have to turn away from amd for your rig. a 2600k @ stock is better than the BD @ 4.6. this is epic fail.


How quick we are to forget.

http://en.wikipedia.org/wiki/AMD_v._Intel

Intel caused AMD's downfall, and now you expect them to totally rebuild in 2 years what a decade of damage has caused?

AMD is the little guy; AMD's yearly budget won't even run Intel for a month.


----------



## badatgames18

Go here: http://www.overclockers.com/amd-fx-8150-bulldozer-processor-review and look at the benches with it at 6+ GHz (everything looks so bad considering the clock speed).

WPrime is multithreaded, so more cores = a better score, but it doesn't even compare to a 2600K @ 5.3-5.4GHz.

If you look at most of the reviews, the only thing I can see this being used as is an encoding/rendering slave.


----------



## de Cossatot

Quote:


> Originally Posted by *badatgames18;15276199*
> that's because the general consumer has no idea what ipc is.. nor do they have any clue which processor is actually faster clock per clock. They just think "oh 8 'true' (not) cores must be better than 4c/8t"


I can totally agree with this statement.

When I first tried to build my computer, I did not do a lot of research; I went to Microcenter, saw an Intel chip (I forget which one) clocked at 2.8GHz (I think) and an AMD clocked at 3.2GHz, and I bought the AMD one due to the higher clocks. Turned out it was a C2 955. The Intel was more expensive, but I didn't know why. Figured it was like horsepower, I guess.

I guess to the uninformed, bigger numbers do mean better. Haha


----------



## hokiealumnus

Quote:


> Originally Posted by *badatgames18;15277116*
> very nice review hokiealumnus
> 
> 
> 
> 
> 
> 
> 
> ... that imc has got me pumped to finally try some extreme memory benches
> 
> 
> 
> 
> 
> 
> 
> 
> 
> such big numbers, but scores are so mediocre
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw i assume you were using psc in that review? you didn't try out hypers by any chance did you lol.. i heard they do good on fx
> 
> EDIT: nvm i saw you used some flare.. i know i have some somewhere.. wow 2400MHz with them.. nice


**NOTE** The Flare kit wasn't used for the sub-zero runs! They were used for the ambient part of the review (stock @ 1866 / 6-9-7-24, overclocked @ 2000 / 7-9-7-24). Since I had heard FX had such a strong IMC I popped in an older pair of Pis rated for DDR3-2400 / 9-11-9-24 for going sub-zero. IIRC they were the strong PSC chips from before pretty much everyone stopped producing strong memory (at least that's affordable).


----------



## OC'ing Noob

Wow... those power consumption numbers when overclocked are eye-popping. LOL


----------



## hajile

Quote:


> Originally Posted by *de Cossatot;15277337*
> I can totally agree with this statement.
> 
> When I first tried to build my computer, I did not do a lot of research; I went to Microcenter, saw an Intel chip (I forget which one) clocked at 2.8GHz (I think) and an AMD clocked at 3.2GHz, and I bought the AMD one due to the higher clocks. Turned out it was a C2 955. The Intel was more expensive, but I didn't know why. Figured it was like horsepower, I guess.
> 
> I guess to the uninformed, bigger numbers do mean better. Haha


True, but I think anyone so uneducated won't have need for anything more than an i3 or Phenom II anyway.
Quote:


> Originally Posted by *OC'ing Noob;15277380*
> Wow... those power consumption numbers when overclocked are eye-popping. LOL


I think that the numbers are great when considering the massive size of the chip.


----------



## badatgames18

Quote:


> Originally Posted by *hokiealumnus;15277341*
> **NOTE** The Flare kit wasn't used for the sub-zero runs! They were used for the ambient part of the review (stock @ 1866 / 6-9-7-24, overclocked @ 2000 / 7-9-7-24). Since I had heard FX had such a strong IMC I popped in an older pair of Pis rated for DDR3-2400 / 9-11-9-24 for going sub-zero. IIRC they were the strong PSC chips from before pretty much everyone stopped producing strong memory (at least that's affordable).


Thanks! I guess I can check what my memory can do without a BCLK wall now. It looks like it has a better IMC than my 980X, too.

I was surprised that the Flare could do such high clocks, lol.


----------



## microfister

Quote:


> Originally Posted by *obsidian86;15277307*
> How quick we are to forget:
> 
> http://en.wikipedia.org/wiki/AMD_v._Intel
> 
> Intel caused AMD's downfall, and now you expect them to rebuild in two years what a decade of damage caused?
> 
> AMD is the little guy; AMD's yearly budget won't even run Intel for a month.


Yes, I understand that it can't really happen, but in our community (more or less the enthusiast community: those going for multiple GPUs, max overclocks, etc.), AMD has just failed and all but given Intel this portion of the market on a platter.


----------



## Mygaffer

Quote:


> Originally Posted by *de Cossatot;15277337*
> I can totally agree with this statement.
> 
> When I first tried to build my computer I did not do a pot of research and went to microcenter and saw the intel chip (I forgot which one it was) clocked at 2.8Ghz (I think) and then a AMD clocked at 3.2Ghz and I bought the amd one due to the higher clocks. Turned out it was a C2 955. The intel was more but I didn't know why. Figured it was like Horse power I guess.
> 
> I guess to the uniformed bigger numbers do mean better. Haha


This is so very, very true. I work at a shop that repairs and sells computers. Most people who don't know anything about computers will just buy whichever one has the most "gigabytes" for their dollar. It doesn't matter if those gigabytes are RAM, storage, or gigahertz; in the consumer's mind they might as well be the same thing.

There are always a few out there who take the time to understand what it means but they are the exception and not the rule.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *hajile;15277399*
> I think that the numbers are great when considering the massive size of the chip.


Yeah, but not when you consider the lack of performance that comes from such a massive size and power draw...


----------



## tsm106

I feel for all those waiting to fix their cpu bottlenecks.... wth AMD?


----------



## capitaltpt

Anybody else notice that the reviews that didn't use the Crosshair V seem to have much better results from BD?


----------



## PvtHudson

Quote:


> Originally Posted by *BlackOmega;15276654*
> Who said anything about an upgrade? My rig is perfectly fine, does everything I want it to, and is enough for gaming. Albeit with some GPU refreshing when the next gen's come out.
> 
> Although, I may just get BD to mess around with.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hows about Handbrake and Videora and the x264 benchmark? (Note: the first run of the x264 BM is just scanning and the second pass is actual encoding)
> Is that not enough? Hows about Battlefield 3?
> 
> Well there's 4.
> 
> Look I'm not saying the BD is superduperawesomewow! However, it's not as bad as people claim it to be. There's still a lot of wrinkles that need to be ironed out. The most notable one is the OS scheduler. Since it's not optimized for BD yet, it'll randomly assign tasks to whatever core in a very inefficient manner.


Did you even check the HardOCP article you linked to? The Battlefield 3 performance is identical...

How is that superior to Sandy Bridge?


----------



## Vispor

This makes me so sad. Really curious to see Thuban prices, new and used, though...

Sent from my MB611 using Tapatalk


----------



## Kand

Quote:


> Originally Posted by *PvtHudson;15277479*
> Did you even check the HardOCP article you linked to? The Battlefield 3 performance is identical...
> 
> How is that superior to Sandy Bridge?


Because it's GPU limited.

And a Beta.

And not indicative of how well a CPU performs because both CPUs are limited by the equipped GPU.


----------



## hajile

Quote:


> Originally Posted by *Majin SSJ Eric;15277447*
> Yeah, but not when you consider the lack of performance that comes from such a massive size and power draw...


I agree entirely. I also think that doubling a chip's size to obtain equal performance shows a serious bottleneck somewhere. If I may quote myself:
Quote:


> Originally Posted by *hajile;15277245*
> The chip was launched too early. I think that two performance increases are coming. The first will be piledriver. I suspect (given that it is launching in just a few months) that AMD pushed out this chip knowing that it had problems. I further suspect that the biggest problem is the one thing AMD won't talk about: branch prediction. Anand's N x N queen simulation helps to show that branch prediction has taken a large step backwards. Given that branch prediction is even more important with a longer pipeline, this is one of the most likely culprits. I believe that piledriver will focus almost entirely on fixing this problem.
> 
> The second problem is GF. AMD planned on a base frequency 30% faster and only achieved 9% faster. Even with the problems the chip has, overclock benchmarks @4.6Ghz (about 30% faster) places the chip more where it should be (at stock). In the upcoming revisions I think that speedbumps of 200Mhz+ are likely. Part of the cache latency problem is also linked to poor fab yields (increasing cache latency can help improve yields) and I suspect that piledriver will also have cache latencies more along the lines of Phenom II (or maybe just a bit faster).
> 
> All this said, this chip's intended audience was never primarily consumers. I think that server benchmarks show this chip to be more than competitive with sandy bridge.
> 
> OT edit: I think that with the BD transistor budget, AMD should have done what IBM did with POWER7 and made the L3 cache eDRAM (IBM did this to put more in while using less die space) and forgoing L3 for consumer chips because L3 is primarily to help with multi-processor bottlenecking in server/HPC environments.


edit: I do wonder if the cores are being starved by a too-narrow decode unit.
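(Side note for anyone curious: the N x N queens simulation mentioned above is a classic branch-heavy recursive search. A minimal Python sketch of that *kind* of workload, not Anand's actual benchmark, looks like this; the dense, data-dependent branching is exactly what stresses a branch predictor.)

```python
def count_queens(n, row=0, cols=0, diag1=0, diag2=0):
    """Count N-queens solutions with bitmask backtracking.

    The data-dependent branches in a search like this are the kind
    of thing a weak branch predictor pays dearly for.
    """
    if row == n:
        return 1
    total = 0
    # Bits set = columns on this row not attacked by earlier queens.
    free = ~(cols | diag1 | diag2) & ((1 << n) - 1)
    while free:
        bit = free & -free          # lowest available column
        free -= bit
        total += count_queens(n, row + 1,
                              cols | bit,
                              (diag1 | bit) << 1,
                              (diag2 | bit) >> 1)
    return total

print(count_queens(8))  # 92 solutions on a standard 8x8 board
```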


----------



## black96ws6

Quote:


> Originally Posted by *JF-AMD*
> *Q. I saw a benchmark on xyz website. Is that how bulldozer will perform?*A. No. Nothing posted before launch will be representative of actual performance. To get actual performance, you need:
> 
> Final production silicon
> Final processor microcode
> Final system BIOS
> Final OS optimizations
> Final drivers
> An app compiled with the latest flags
> A person who understands the app and configures the test properly
> 
> Without these things (and there are probably more), you cannot get an accurate benchmark. Any extrapolation of a crappy benchmark gives you a crappy estimate of actual performance. Period.
> 
> Oh, and many of the benchmarks that you see were probably not run, those are just charts made in excel. It's really easy to make a chart in excel - what do you want, bulldozer faster by 3716%? Intel faster by 293%? Sure, I can do either one in 10 seconds.


----------



## B3anbag

AMdont?

OK, so I'm still learning a _lot_ from reading everyone's posts throughout OCN, but I do have a question or two here re: BD. Someone earlier mentioned how the Overdrive software rocked for PIIs but not for BDs... perhaps because it was designed for the PIIs? Is it possible that AMD was looking to the long game, when multi-core software would be the norm, and just implemented and released BD too soon, as the software market hasn't caught up? Maybe they know something we don't? Just wondering.


----------



## pyra

http://www.hardwareheaven.com/reviews/1285/pg16/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-conclusion.html (link is already in the op)

Are these guys ******ed, or did they do something no one else did? The FX-8150 actually looks worth it over a 2600K in these benches.
Quote:


> There are two areas where the FX-8150 excels though, those are gaming and overclocking. In the former we saw the processor give us improved framerates over the Intel model. In the latter the ability to exceed 5GHz with ease offers additional value for money. In fact AMD have indicated that they expect most users to exceed 4.8GHz on air cooling.


----------



## goldcrow

Are there any benchmarks showing off an NB overclock together with the CPU? Sites always seem to forget this part of overclocking on AMD systems.


----------



## rubicsphere

No doubt AMD is looking ahead to when MT tasks are more the norm. However, Intel is so far ahead even at most MT stuff with 4 cores that when they drop X79 and there are 6C/12T chips, the thrashing BD will take in MT won't even be funny. It's sad really.


----------



## Vagrant Storm

Quote:


> Originally Posted by *OC'ing Noob;15277380*
> Wow... those power consumption numbers when overclocked are eye-popping. LOL


Well... that is the one justifiable result. If you consider the four modules to be equal to eight physical cores, it kind of makes sense that it uses double the power of a Sandy Bridge, at least if you want to compare power usage of BD and SB on equal footing, core for core.

However, since some resources are shared within the modules, I was expecting the power usage to be a decent amount lower. Still higher overall, but with eight threads figured in I would think it would be relatively lower. The increase is still fairly linear, though.

Imagine if nothing was shared... the power usage would have been a little higher yet.
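(As a rough illustration of why overclocked power numbers balloon: dynamic power scales roughly with frequency times voltage squared. The clocks and voltages below are made-up illustrative values, not measurements from any review.)

```python
def dynamic_power_ratio(f_base, v_base, f_oc, v_oc):
    """Approximate dynamic power increase using P ~ f * V^2.

    Ignores static/leakage power, so real-world increases
    are usually even worse than this estimate.
    """
    return (f_oc / f_base) * (v_oc / v_base) ** 2

# Hypothetical example: 3.6 GHz @ 1.2 V pushed to 4.6 GHz @ 1.45 V
print(round(dynamic_power_ratio(3.6, 1.2, 4.6, 1.45), 2))  # 1.87
```

A 28% clock bump with a voltage increase nearly doubles dynamic power, which is roughly the shape of the overclocked numbers the reviews show.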
Quote:


> Originally Posted by *rubicsphere;15277573*
> No doubt AMD is looking ahead to when MT tasks are more the norm. However, Intel is so far ahead even at most MT stuff with 4 cores that when they drop X79 and there are 6C/12T chips, the thrashing BD will take in MT won't even be funny. It's sad really.


Not to mention I can turn on HT and still compete in the multithreaded world as well... and not be too far behind with a "last gen" CPU that is a couple of years old now.


----------



## Partol

Quote:


> The 8150 isn't really an 8-core CPU; it was supposed to be a 4-core but with something like Intel's Hyper-Threading, just better. But studying computer science, I can understand why it is hard to implement.


How do BD cores look in Windows Task Manager?

Is it like this?
[module1] [module2] [module3] [module4]
[core0 core1] [core2 core3] [core4 core5] [core6 core7]

Or like this?
[core0 core2] [core4 core6] [core1 core3] [core5 core7]

It could help to have a BIOS/UEFI setting or a Windows setting that changes the core numbering, so that Windows prefers to use one core from each module before loading two cores in one module.

With my Intel Core i3, Windows usually prefers to use real cores before using hyperthreads.
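(As a rough sketch of the ordering being asked for here, assuming the first layout, i.e. Windows numbers cores module by module, a scheduler that spreads threads across modules before doubling up would pick cores in this order. Purely illustrative; this is not how Windows actually enumerates or schedules them.)

```python
def preferred_core_order(num_cores, cores_per_module=2):
    """Return a core ordering that spreads work across modules
    before loading a second core in any module.

    Assumes cores are numbered module by module: module 0 holds
    cores 0 and 1, module 1 holds cores 2 and 3, and so on.
    """
    order = []
    for offset in range(cores_per_module):
        # First pass grabs the first core of every module,
        # second pass grabs each module's sibling core.
        for module_start in range(0, num_cores, cores_per_module):
            order.append(module_start + offset)
    return order

# For an FX-8150 (4 modules, 2 cores each):
print(preferred_core_order(8))  # [0, 2, 4, 6, 1, 3, 5, 7]
```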


----------



## Lampen

Quote:


> Originally Posted by *goldcrow;15277569*
> Are there any benchmarks showing off an NB overclock together with the CPU? Sites always seem to forget this part of overclocking on AMD systems.


A couple of people have brought it up. I can't remember where it is in the thread now, but one of the OCN'ers who has been playing with an 8150 for a while tried OCing the NB to improve overall performance. He could only get it to 2700 or 2800, and it really didn't improve the chip all that much.


----------



## capitaltpt

Quote:


> Originally Posted by *pyra;15277568*
> http://www.hardwareheaven.com/reviews/1285/pg16/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-conclusion.html (link is already in the op)
> 
> Are these guys ******ed, or did they do something no one else did? The FX-8150 actually looks worth it over a 2600K in these benches.


Like I said earlier: notice they used an ASRock 990FX board and NOT the Crosshair.


----------



## badatgames18

I'm still hoping there will be some magical BIOS flash or updated microcode that will help these out... it's an AMD Prescott so far.


----------



## BlackOmega

Quote:


> Originally Posted by *Majin SSJ Eric;15276996*
> The sad part about this is if they had just added two more cores to Thuban it would have obliterated these BD results...


Quite possibly. I believe BD is meant for the next generation of software that can fully utilize it. I'm hoping so, at least.
I'm going to see what happens with BD in the next few months: what gets fixed, and how OSes may be patched to fully utilize BD's architecture.
Quote:


> Originally Posted by *criminal;15277005*
> Fermi? Haha... not hardly. Fermi was never a failure.


That's why ATi's prices went up 30% after its release, right?
Quote:


> Originally Posted by *PvtHudson;15277479*
> Did you even check the HardOCP article you linked to? The Battlefield 3 performance is identical...
> 
> How is that superior to Sandy Bridge?


Errr... what? Identical, you say? Let's have a look, shall we?










OK, so at stock speed:

MIN. difference: *6.7% better* than the 2600K; and *8.9% better* than the 2500K.
MAX. difference: *6.1% better* than the 2600K; and *7.4% better* than the 2500K.
AVG. difference: 3.8% worse than the 2600K; and *1.4% better* than the 2500K.










Now let's look at the overclocked results:

MIN: *17.4% better* than the 2600K; *13.1% better* than the 2500K.
MAX: *17.2% better* than the 2600K; identical to the 2500K.
AVG: *5.2% better* than the 2600K; *5.8% better* than the 2500K.

It's interesting how the 2600K actually loses performance when overclocked.

Regardless, as you can clearly see, they're not identical performers.
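(For the curious, those deltas are just relative differences between the charted framerates. A quick sketch of the arithmetic, with hypothetical fps numbers since the charts themselves aren't reproduced here:)

```python
def percent_diff(fx_fps, intel_fps):
    """How much faster (positive) or slower (negative) the FX
    result is relative to the Intel result, in percent."""
    return (fx_fps - intel_fps) / intel_fps * 100.0

# Hypothetical example: FX at 64 fps vs. Intel at 60 fps
print(round(percent_diff(64.0, 60.0), 1))  # 6.7
```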


----------



## Behemoth777

Quote:


> Originally Posted by *badatgames18;15277646*
> I'm still hoping there will be some magical BIOS flash or updated microcode that will help these out... it's an AMD Prescott so far.


Maybe more optimization with programs and such, but processors don't improve in performance like video cards do when they get a bios/driver update.

I've lost all faith in amd. Can't say how happy I am that I went with sandy bridge all those months ago, even through the recall.


----------



## djriful

If Bulldozer still hasn't proven its power in several weeks or months and the cost stays at $250+, that's ridiculous. I'd rather get a 2nd-gen Core i5 at $150 that gives better performance.

This is lame.

Here comes the waiting game again... holding out.


----------



## rubicsphere

Were there any non-GPU limited high end dual card setup benchmarks?


----------



## Lampen

Quote:


> Originally Posted by *badatgames18;15277646*
> I'm still hoping there will be some magical BIOS flash or updated microcode that will help these out... it's an AMD Prescott so far.


Even if the fixes brought it up to within 5% of the 2500k it still won't really matter. The price/performance ratio is just not there like it was with other AMD lineups. I'll go with whatever chip is cheapest and will do what I need it to. You could really make an argument for going with a 965 over a i7-9xx for gaming because it was cheaper and did what you needed. The BD lineup now requires a pricing premium and offers sub-par performance for that premium.


----------



## G3RG

Quote:


> Originally Posted by *pyra;15277568*
> http://www.hardwareheaven.com/reviews/1285/pg16/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-conclusion.html (link is already in the op)
> 
> Are these guys ******ed, or did they do something no one else did? The FX-8150 actually looks worth it over a 2600K in these benches.


Interestingly that review does make Bulldozer's performance look a lot better...


----------



## Strat79

Can't say I am surprised, but it still comes as a huge disappointment. I really thought they would perform better than this. I just couldn't fathom them releasing a new CPU arch that performed worse per core than the PhII. I am still hopeful they can improve it by a good amount with a new revision, though. Something seems off with the branch prediction or scheduler; these results are all over the place between different sites and tests. Some even get wildly different results on the same test run back to back. Something is awry, just not sure what it is.

That said, AMD will probably still do just fine on sales. As said already, the average consumer knows very little in the way of IPC and per-core performance, or even power consumption. They will see 8 cores and buy if it is cheaper than the competition's "piddly 4-core" sitting next to it at a higher price. Our market, the enthusiasts', is very small and will have little impact on overall sales and profits. AMD will be around and thrive for a very long time to come.


----------



## nagle3092

Quote:


> Originally Posted by *G3RG;15277753*
> Interestingly that review does make Bulldozer's performance look a lot better...


It does, I wonder if there is something up with the Crosshair boards they sent out.


----------



## obsidian86

Quote:


> Originally Posted by *pyra;15277568*
> http://www.hardwareheaven.com/reviews/1285/pg16/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-conclusion.html (link is already in the op)
> 
> Are these guys ******ed, or did they do something no one else did? The FX-8150 actually looks worth it over a 2600K in these benches.


Might have been the L1 cache fix.


----------



## badatgames18

Quote:


> Originally Posted by *Lampen;15277743*
> Even if the fixes brought it up to within 5% of the 2500k it still won't really matter. The price/performance ratio is just not there like it was with other AMD lineups. I'll go with whatever chip is cheapest and will do what I need it to. You could really make an argument for going with a 965 over a i7-9xx for gaming because it was cheaper and did what you needed. The BD lineup now requires a pricing premium and offers sub-par performance for that premium.


When I bought an 1100T it was around 240 bucks... then it dropped within a few months.

Most likely scenario for the FX procs.

Quote:


> Originally Posted by *Behemoth777;15277722*
> Maybe more optimization with programs and such, but processors don't improve in performance like video cards do when they get a bios/driver update.
> 
> I've lost all faith in amd. Can't say how happy I am that I went with sandy bridge all those months ago, even through the recall.


I started out with AMD, then moved to Intel... I still have a soft spot for AMD.


----------



## kapulek

Quote:


> Originally Posted by *Benz;15276599*
> These are not the benchmarks I was expecting.
> These are the benchmarks I had for over 2 months from my cousin.


You said this 4-core Zambezi leak score was real: FX-4xxx faster than an i7-2600K.










Now you're saying that your cousin's Zambezi 8-core score is lower than a 2500K?










Please explain.


----------



## Papas

Quote:


> Originally Posted by *Lampen;15277743*
> Even if the fixes brought it up to within 5% of the 2500k it still won't really matter. The price/performance ratio is just not there like it was with other AMD lineups. I'll go with whatever chip is cheapest and will do what I need it to. You could really make an argument for going with a 965 over a i7-9xx for gaming because it was cheaper and did what you needed. The BD lineup now requires a pricing premium and offers sub-par performance for that premium.


Didn't Intel just release a fix for their onboard video? After a year of being in use, it was somehow neglected to the point that the update improved performance by *30%*. That shows that released hardware may not be fully utilized.


----------



## badatgames18

Quote:


> Originally Posted by *Papas;15277884*
> didnt intel just release a fix for there onboard video. after a year of being used it was somehow neglected to the point that the update improved performance *30%*. that shows that stuff released may not be fully utilized.


There is a distinction between the optimization Intel's procs needed and what AMD needs to do for their CPU.

They need to sprinkle magical fairy dust on it and chant incantations in order for it to perform comparably to its Intel counterparts.

The pros, however, are the same as with Phenom II: no cold-boot bug, and scaling under cold. (I understand this doesn't matter for general consumers.)

But they also added the ability to clock higher under normal cooling conditions vs. the older Phenom II.

So there are some bright spots.


----------



## Vagrant Storm

Quote:


> Originally Posted by *BlackOmega;15277706*
> Quite possibly. I believe that BD is meant for the next generation of software that can fully utilize it. I'm hoping at least.
> I'm going to see what happens with BD in the next few months and see what is fixed with it. And how OS's may be patched to fully utilize BD's architecture.
> 
> That's why ATi's prices went up 30% after its release right?
> 
> Errr......what? Identical you say? Lets have a look shall we?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OK, so at stock speed:
> 
> MIN. difference: *6.7% better* than the 2600K; and *8.9% better* than the 2500K.
> MAX. difference: *6.1% better* than the 2600K; and *7.4% better* than the 2500K.
> AVG. difference: 3.8% worse than the 2600K; and *1.4% better* than the 2500K.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now let's look at the overclocked results:
> 
> MIN: *17.4% better* than the 2600K; *13.1% better* than the 2500K.
> MAX: *17.2% better* than the 2600K; identical to the 2500K.
> AVG: *5.2% better* than the 2600K; *5.8% better* than the 2500K.
> 
> It's interesting how the 2600K actually loses performance when overclocked.
> 
> Regardless, as you can clearly see, they're not identical performers.


Benchmarks are not valid until after release, I thought? That is what I've read about 14,713 times on OCN over the last few days. BF3 is in beta, so you can't use it. Plus, if you are over 40fps, 3fps or so is the margin of error, unless you can tell me that you get the exact same results every time you run a bench on your system. In any other hardware test it would be universally accepted that those results were limited by some other piece of hardware, since they are so similar.

And I say again: a beta version of a game could easily account for the results too, especially when the 2500K beat the 2600K by a few fps (though I bet that is more to do with an unstable OC). Plus, under the law of statistics you have to throw out the best and throw out the worst. BF3 is probably the best right now as far as gaming goes. I am not even sure what to pick to throw out as the worst... it could be one of many.


----------



## ZealotKi11er

Are there any CFX or SLI benchmarks? When testing CPUs in games, they'd better have 2-3 GPUs to simulate how future-proof the CPUs are. Even an X6 can handle most single GPUs just fine.


----------



## Artikbot

Quote:


> Originally Posted by *obsidian86;15277845*
> might have been the L1 cache fix


So, do these processors _indeed_ need a BIOS update?

I sees the light!! Look at there!!

Makes a lot of sense. AMD's business model would NEVER push a CPU that performs 30% worse than last generation at 1.5x the price.


----------



## Papas

Quote:


> Originally Posted by *badatgames18;15277932*
> there is a distinction between what was needed to be done with intel procs and optimization and what amd needs to do for their cpu..
> 
> they need to sprinkle magical fairy dust and chant incantations inorder for it to preform comparable to their intel counterparts..
> 
> pros however are the same as with Phenom II.. no cold boot.. scaling under cold. (i understand for general consumers this doesn't matter)
> 
> but they also added ability to clock higher on normal cooling conditions vs older phenom II.
> 
> so there is some bright spots


Now, this is a serious question: since when has any CPU beaten another CPU with twice as many cores in a multithreaded task? I have never seen a test where a dual-core beat a quad in something that used all the cores. To me, in my honest opinion, it seems like Bulldozer is not being fully utilized, not that it's underpowered. If it were underpowered, it would be losing in all tests, not just some of them. It seems like some of the tests are more optimized than others.

Trying to explain myself better: in a bunch of tests it scores worse than the 2500K, then in some others it scores better than a 2600K. How is that possible unless the testing is not using the full power of the cores? Oh, BTW, I'm not talking about single-threaded performance; I'm talking about multithreaded, where it loses to the 2500K and then somehow wins against the 2600K in others.
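(Amdahl's law gives one way this can happen: if enough of a "multithreaded" test is actually serial, or bound by shared resources, doubling the core count barely helps. A quick sketch with illustrative parallel fractions, not measured ones:)

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Ideal speedup when only part of a workload runs in parallel
    (Amdahl's law: the serial fraction caps total speedup)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# With only 60% of the work parallel, 8 slower cores gain little
# over 4 faster ones (fractions here are purely illustrative):
print(round(amdahl_speedup(0.6, 8), 2))  # 2.11
print(round(amdahl_speedup(0.6, 4), 2))  # 1.82
```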


----------



## radaja

Quote:


> Originally Posted by *kapulek;15277859*
> You said this 4-core Zambezi leak score was real. FX4xxx faster than I7-2600K.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now you're saying that your cousins Zambezi 8-core score is lower than 2500K?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please explain.


Yes, I would also like to know why he claimed what he did. The graphs his cousin sent him don't make the FX-8150 look too good; it barely keeps up with a quad-core 2500K. Yet all we heard from him was "don't believe the leaks, I have seen real performance figures and BD is a winner" type stuff.


----------



## pyra

JF-AMD's last activity was an hour ago and he hasn't posted a thing for 3 days


----------



## badatgames18

Quote:


> Originally Posted by *Artikbot;15277958*
> So, do this processors _indeed_ need a BIOS update?
> 
> I sees the light!! Look at there!!
> 
> Makes a lot of sense. AMD's business model would NEVER push a CPU that performs 30% less than last generation at 1.5X the price.


Someone on XS posted that there was an error in the coding for FX procs in Linux, and that the error should also exist in Windows... he also said it could have decreased performance by up to 15%, so yes, there is some optimization that will be done/needed.

However, it still will be slower vs. the i5s and i7s... hoping it will be slightly faster than the Phenoms, though.


----------



## hammertime850

Quote:


> Originally Posted by *rubicsphere;15277741*
> Were there any non-GPU limited high end dual card setup benchmarks?


I'm also interested in this. I remember I saw a review which showed Thuban putting up impressive MIN fps in multicard setups.

I might be mistaken.


----------



## sub50hz

Quote:


> Originally Posted by *Behemoth777;15277722*
> Maybe more optimization with programs and such, but processors don't improve in performance like video cards do when they get a bios/driver update.


Try and remember: this is such a radical departure from every x86 CPU ever produced that it's entirely possible OS patches, restructuring of schedulers, and/or BIOS fixes may actually fix a lot of what we're seeing as bad performance. Hell, if anyone reads the Anand review, the last page pretty much sums it up -- AMD says Windows 7's scheduler cannot allocate threads appropriately. So you may say, "Well, why didn't they make something more appropriate for the times?" And I would say that making small architectural changes is no longer "looking forward". BD is a huge gamble, and the payoff is not going to be immediate.

That being said, I would like to see how the FX would do in some Linux benches -- wishful thinking, and I'm not sure I'm up to spending 250 bucks to find out.


----------



## cky2k6

This just feels like AMD released Bulldozer for the sake of releasing it. Power- and performance-wise, it cannot compete with Stars yet. They should've shelved this beast until it could actually beat the previous architecture, and released an 8-core Stars-based chip instead. Maybe Piledriver can be to Bulldozer what Stars was to Barcelona.


----------



## Malcolm

Quote:


> Originally Posted by *sub50hz;15278047*
> Try and remember: this is such a radical departure from every x86 CPU ever produced, that it's entirely possible that OS patches and re-structuring of schedulers and/or BIOS fixes may actually fix a lot of what we're seeing as bad performance. Hell, if anyone reads the Anand review, the last page pretty much sums it up -- AMD says Windows 7's scheduler cannot allocate threads appropriately.


What did they use to test Bulldozer on during development, then?

How would they be able to know what BD's performance would be like on Windows 7 if it currently doesn't work properly? And how were they getting those 1337 benches that showed BD neck and neck with the 980X? That sounds like a bunch of evasive BS on AMD's part if you ask me.


----------



## Vagrant Storm

Quote:


> Originally Posted by *Papas;15277999*
> Now, this is a serious question: since when has any CPU beaten another CPU with twice as many cores in a multithreaded task? I have never seen a test where a dual-core beat a quad in something that used all the cores. To me, in my honest opinion, it seems like Bulldozer is not being fully utilized, not that it's underpowered. If it were underpowered, it would be losing in all tests, not just some of them. It seems like some of the tests are more optimized than others.
> 
> Trying to explain myself better: in a bunch of tests it scores worse than the 2500K, then in some others it scores better than a 2600K. How is that possible unless the testing is not using the full power of the cores? Oh, BTW, I'm not talking about single-threaded performance; I'm talking about multithreaded, where it loses to the 2500K and then somehow wins against the 2600K in others.


It would have been nice if AMD had let some of the benchmark makers play with Bulldozer before now; then this could have been mitigated some. As it stands right now... this is very possible. We might have to wait a few days for new versions of some benches to be made, though I don't see massive improvement coming. From the looks of it there might actually be bottlenecks INSIDE the CPU itself. When these bottlenecks don't matter, it does OK. When they do matter, it does horribly.

This is what we get with AMD's hush-hush nonsense.

Argh... I hate to say it, but I think some more time is needed to know for sure. Though my guess is that any benchmark software not fully utilizing Bulldozer-based CPUs will have a new version out by this weekend.

If AMD releases a special driver fix for Catalyst to make it perform better, I think I might have to go down to AMD corporate and start slapping people... they've had plenty of time to make Catalyst make good use of Bulldozer.


----------



## sub50hz

Quote:


> Originally Posted by *Malcolm;15278103*
> What did they use to test Bulldozer on during development then?
> 
> How would they be able to know what BD's performance would be like on Windows 7 if it currently doesn't work properly? And how were they getting those 1337 benches that showed BD neck and neck with the 980X? That sounds like a bunch of evasive BS on AMD's part if you ask me.


It's very possible that AMD has worked with/is currently working with MS to see if it's something that can be rolled out with a hotfix. Rather than calling bullcrap, try and be a little more objective.


----------



## ZealotKi11er

I remember when the first speculation was a 50% performance increase with a 33% increase in core count. Now it's 25% less performance with 33% more cores, and back then we were complaining that a 50% increase wasn't much considering the extra two cores.


----------



## BlackOmega

Quote:


> Originally Posted by *nagle3092;15277830*
> It does, I wonder if there is something up with the Crosshair boards they sent out.


That's exactly what I've been thinking.

Quote:


> Originally Posted by *Vagrant Storm;15277941*
> Benchmarks are not valid until after release, I thought? That is what I've read about 14,713 times on OCN the last few days. BF3 is in beta, so you can't use it. Plus if you are over 40fps... 3fps or so is the margin of error. Unless you can tell me that you get the same exact results every time you run a bench on your system. In every other hardware test it would be universally accepted that those results were limited by some other piece of hardware, since they are so similar.
> 
> And I say again... a beta version of a game could easily account for the results too... especially when the 2500k beat the 2600k by a few fps (though I bet it is more to do with an unstable OC). Plus under the law of statistics you have to throw out the best and throw out the worst. BF3 is probably the best right now as far as gaming goes. I am not even sure what to pick to throw out as the worst... it could be one of many.


Why can't I use BF3? Because it's beta? Seriously? That's about the silliest thing I've ever heard. You're comparing performance of the same program, regardless of whether or not it's beta. That argument is invalid, and its beta status is irrelevant.

And while you may try to minimize it as "just a few FPS", when you look at it in percentages there's a much larger margin. I surely can tolerate margin of error; however, they said that they ran each test three times. Besides, we're not talking just "a few fps", we're talking between *8 and 15*. Don't try to tell me Intel isn't favored.

As for my results, while sure they're not identical every time, all of my results have been within 1% of each other. That's more about methodology and how the tests/BM's are run. It has to be consistent if you expect consistent results.
Quote:


> Originally Posted by *Papas;15277999*
> Now this is a serious question: since when has any CPU beaten another CPU (with 2x as many cores) in a multi-threaded task? I have never seen a test where a dual core beat a quad in something that used all the cores. To me, in my honest opinion, it seems like Bulldozer is not being fully utilized, not that it's underpowered. If it was underpowered, it would be losing in all tests, not just some of them. It seems like some of the tests are more optimized than others.
> 
> Trying to explain myself better: in a bunch of tests it scores worse than the 2500K, then in some others it scores better than a 2600K. How is that possible unless the testing is not using the full power of the cores? Oh, BTW, I'm not talking about single-threaded performance; I'm talking about multi-threaded, where it loses to the 2500K and then somehow wins in others against the 2600K.


Exactly. The results are so mixed, they almost make no damn sense.


----------



## Oedipus

Quote:


> Originally Posted by *sub50hz;15278130*
> It's very possible that AMD has worked with/is currently working with MS to see if it's something that can be rolled out with a hotfix. Rather than calling BS, try and be a little more objective.


BD has been in development for this long and they still haven't managed to get this mythical patch out yet?


----------



## Da1Nonly

Thanks for the reviews. Does anyone know if there are any reviews with CrossFire/SLI setups? Just wondering about multi-GPU stability with these CPUs.


----------



## Am*

Said it before and I'll say it again: Bulldozer = fail of epic proportions. Judging by the core-for-core performance, it feels like they duct-taped Pentium 4 cores together, and seeing it lose to the previous-gen Phenom II X4, which has half the core count, was beyond a joke. This is a several-thousand-times bigger fail than their Phenom 1 ever was, and worse than I thought it could ever be. After all those philosophical theory posts by the man AMD fans consider a "prophet", John (from whom we won't be seeing any more hype threads/posts, I bet), we get an 8-core Pentium 4. Now where are all the quotes of people saying "BD will be the fastest CPU out", to make them eat those words after the thousands of flameposts most of them made here?

I kind of saw it coming anyway. The FX name signifies core performance of 2003; the biscuit-tin packaging is a metaphor for getting eaten for breakfast performance-wise. If this is the best you can do, AMD, after over half a decade of research & development, save yourselves the billions in cash and forget making/designing any more x86 CPUs (and spend it on developing/designing your GPUs instead).


----------



## mechtech

Quote:


> Originally Posted by *badatgames18;15278018*
> someone on XS posted that there was an error in coding for fx procs in linux and the error should exist in windows... he also said that it could have decreased performance up to 15% so yes.. there is some optimization that will be done/needed


This is not an error. This is a bad design implementation on Bulldozer that AMD wants OS vendors to hack around.

Linus Torvalds commented "Argh. This is a small disaster, you know that, right?... I'd be really worried. Changing address space layout is not a small decision."

http://us.generation-nt.com/answer/patch-x86-amd-correct-f15h-ic-aliasing-issue-help-204200361.html
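For anyone wondering what "aliasing" means here: as I read the linked patch thread, code pages whose virtual addresses agree in bits [14:12] collide in the shared front-end's indexing, which is why the proposed fix fiddles with address-space layout. A toy illustration (the addresses are made up):

```python
# Toy illustration of the family-15h I-cache aliasing issue from the
# linked patch thread: two code pages whose virtual addresses match in
# bits [14:12] index to the same slot. Addresses below are invented.

def aliasing_bits(addr: int) -> int:
    """Extract virtual-address bits [14:12]."""
    return (addr >> 12) & 0x7

a = 0x7f0000002000        # hypothetical code page
b = a + 0x8000            # 32 KiB away: bits [14:12] wrap to the same value
print(hex(a), hex(b), aliasing_bits(a) == aliasing_bits(b))  # they collide
```

Which is why Torvalds objects: avoiding the collisions means constraining where the loader may place code, and that is an address-space-layout decision, not a one-line patch.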


----------



## G3RG

Why is this review:
http://www.hardwareheaven.com/reviews/1285/pg1/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-introduction.html

Scoring so much better than the rest?

It's actually beating a 2600k in MANY tests in that review, and if not beating...matching. I'd definitely buy an 8150 if it can actually produce that level of performance....

edit: and the power usage isn't Fermi-like either =o


----------



## Papas

Quote:


> Originally Posted by *Vagrant Storm;15278118*
> It would have been nice if AMD had let some of the benchmark makers play with Bulldozer before now; then this could have been mitigated some. As it stands right now... this is very possible. We might have to wait a few days for new versions of some benches to be made, though I don't see massive improvement coming. From the looks of it there might actually be bottlenecks INSIDE the CPU itself. When these bottlenecks don't matter, it does OK. When they do matter, it does horribly.
> 
> This is what we get with AMD's hush-hush nonsense.
> 
> Argh... I hate to say it, but I think some more time is needed to know for sure. Though my guess is that any benchmark software not fully utilizing Bulldozer-based CPUs will have a new version out by this weekend.
> 
> If AMD releases a special driver fix for Catalyst to make it perform better, I think I might have to go down to AMD corporate and start slapping people... they've had plenty of time to make Catalyst make good use of Bulldozer.


In the end, I'm kinda happy. If BD was what everyone thought it'd be (the new king CPU), I would've had to upgrade from my 2500K, which runs perfectly. So this is kinda a blessing in disguise, lol. At least now I have more time to put some money together and upgrade my GPU, which is killing me... even at 1600x1200 on a 15" CRT it lags like crazy in a lot of games.


----------



## sub50hz

Quote:


> Originally Posted by *Oedipus;15278158*
> BD has been in development for this long and they still haven't managed to get this mythical patch out yet?


I assume you have a background in OS development that would qualify you to make such a comment. If so, let me know how easy it is to solve the aforementioned error. I'm waiting.


----------



## nagle3092

Quote:


> Originally Posted by *BlackOmega;15278148*
> Why can't I use BF3? Because it's beta? Seriously?


To be fair, using an unscripted playthrough from a multiplayer game is a piss-poor benchmark. There are too many variables that could change from one run to another.


----------



## blackbalt89

Windows 7 SP2 coming soon?


Kind of like AVX support was added in SP1.


----------



## Epsi

Found another review from The Netherlands. They compared the other FX versions also.

Original:

http://nl.hardware.info/reviews/2382/amd-fx-8150--8120--6100--4100-bulldozer-review

Translated:

http://translate.google.com.sg/translate?hl=en&sl=auto&tl=en&u=http%3A%2F%2Fnl.hardware.info%2Freviews%2F2382%2Famd-fx-8150--8120--6100--4100-bulldozer-review


----------



## Malcolm

Quote:


> Originally Posted by *sub50hz;15278130*
> It's very possible that AMD has worked with/is currently working with MS to see if it's something that can be rolled out with a hotfix. Rather than calling BS, try and be a little more objective.


I'm confused. I'm assuming AMD would have a fleet of test systems for development, many of them running Windows 7, right? Again I ask, how were they able to develop BD properly on an OS with unoptimized schedulers? BD has been in development since long before Windows 7 was released, right? I don't see how they were able to do this and worry about fixing it later. Additionally if that were true I would expect the hotfix from MS to have been rolled out *prior* to the BD release. It'd be monumentally stupid not to.


----------



## mbudden

A lot of people in denial in here...


----------



## Oedipus

Quote:


> Originally Posted by *sub50hz;15278210*
> I assume you have a background in OS development that would qualify you to make such a comment. If so, let me know how easy it is to solve the aforementioned error. I'm waiting.


I didn't say it would be a simple fix. Still, AMD and MS are filled with smart people that should have been able to get this done by now.

Which leads me to believe that this hotfix or patch is not real.


----------



## AddictedGamer93

Quote:


> Originally Posted by *G3RG;15278193*
> Why is this review:
> http://www.hardwareheaven.com/reviews/1285/pg1/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-introduction.html
> 
> Scoring so much better than the rest?
> 
> It's actually beating a 2600k in MANY tests in that review, and if not beating...matching. I'd definitely buy an 8150 if it can actually produce that level of performance....
> 
> edit: and the power usage isn't fermi like either =o


All I see is a different mobo


----------



## GameBoy

Quote:


> Originally Posted by *Am*;15278177*
> Said it before and I'll say it again: Bulldozer = fail of epic proportions. Judging by the core-for-core performance, it feels like they duct-taped Pentium 4 cores together, and seeing it lose to the previous-gen Phenom II X4, which has half the core count, was beyond a joke. This is a several-thousand-times bigger fail than their Phenom 1 ever was, and worse than I thought it could ever be. After all those philosophical theory posts by the man AMD fans consider a "prophet", John (from whom we won't be seeing any more hype threads/posts, I bet), we get an 8-core Pentium 4. Now where are all the quotes of people saying "BD will be the fastest CPU out", to make them eat those words after the thousands of flameposts most of them made here?
> 
> I kind of saw it coming anyway. The FX name signifies core performance of 2003; the biscuit-tin packaging is a metaphor for getting eaten for breakfast performance-wise. If this is the best you can do, AMD, after over half a decade of research & development, save yourselves the billions in cash and forget making/designing any more x86 CPUs (and spend it on developing/designing your GPUs instead).


How exactly does the 8150P "fail" against a Phenom II X4? You either didn't read the reviews properly, or you're another person who had ridiculous expectations.

I'm somewhat disappointed with Bulldozer (largely due to the absurd power consumption), but honestly, some of you guys need to quit throwing your toys out of the pram.


----------



## G3RG

Quote:


> Originally Posted by *AddictedGamer93;15278293*
> All I see is a different mobo


Has me wondering.... there has to be a reason it performed so much better for them =o.


----------



## M3T4LM4N222

Maybe bulldozer wasn't meant to break the bridge but rather to be slow. Maybe Bulldozer was made slow like a Bulldozer but then a reform will come out and blow the bridge away? MAYBE ITS A MARKETING SCHEME!! LoLolOLASLKLSKALSKALKSLAKSLAKSLAKDLKLSADKSLDLSKDLLMAO TROLFLMF;DCNSPIRICYILOVEM


----------



## blackbalt89

Quote:


> Originally Posted by *G3RG;15278307*
> Has me wondering.... there has to be a reason it performed so much better for them =o.


Maybe AsRock is the only one that can get the BIOS working properly?


----------



## cjc75

NewEgg now has the BD's listed!


----------



## Vagrant Storm

Quote:


> Originally Posted by *BlackOmega;15278148*
> That's exactly what I've been thinking.
> 
> Why can't I use BF3? Because it's beta? Seriously? That's about the silliest thing I've ever heard. You're comparing performance of the same program. Regardless of whether or not it's beta. That argument is invalid and its beta status is irrelevant.
> 
> And while you may try to minimize it as "just a few FPS", when you look at it, in percents, there's a much larger margin. I surely can tolerate margin of error, however, they said that they ran each test three times.
> 
> As for my results, while sure they're not identical every time, all of my results have been within 1% of each other. That's more about methodology and how the tests/BM's are run. It has to be consistent if you expect consistent results.
> 
> Exactly. The results are so mixed, they almost make no damn sense.


You can't go off percents... that is just sensationalizing the numbers. I absolutely dread it when news and marketing people do that. If you want to talk margin of error in percents, you pretty much have to accept that +/- 7.5% is the margin of error when dealing in numbers this big. If the numbers were 10 and 15, that's a 50% increase!!!! Woot! But once you have a significant base number, like 100 versus 105, you are at 5%. As the base number goes up, so does the margin of error.

Keep things mathematical and not subjective. We are MEASURING performance, so stick with actual measurements and leave the OMG percents alone. Same goes for those graphs showing SB like 70% better than BD in something. They're pointless percentages that do nothing but stir people up, because 70% sounds much worse than 10 seconds slower.

Plus that overclocked 2600K must be unstable if the 2500K is beating it; they are essentially the same CPU once you OC. So you need to focus on the stock measurements if you are going to continue using that data. Though since it is a beta game, it could get released and have drastically different results... I've seen it happen before.
Quote:


> Originally Posted by *cjc75;15278338*
> NewEgg now has the BD's listed!


lol, and notice they are still in stock. I remember trying to get a 5870 and they would be out of stock before you could even get one added to your cart.
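The absolute-vs-percentage disagreement a few posts up really comes down to the base of the comparison; a throwaway sketch with made-up numbers:

```python
# The same absolute gap reads very differently as a percentage
# depending on the base value, which is the point being argued above.
# All numbers are invented for illustration.

def pct_diff(base: float, other: float) -> float:
    """Relative difference of `other` versus `base`, in percent."""
    return (other - base) / base * 100.0

print(pct_diff(10, 15))    # +50.0 -- small base, scary headline
print(pct_diff(100, 105))  # +5.0  -- same 5-unit gap, larger base
```

Neither view is wrong; they just answer different questions (how big is the gap vs. how big is the gap relative to what you started with).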


----------



## gsa700

I must say I am dismayed that BD is so unimpressive compared to the 1100t in these reviews.

Never mind Sandy Bridge; they should have shrunk Thuban, added two cores and more clock speed. That would have crushed this BD disaster into dust.

.02

And for the sig watchers: I have two AMD and two sandy bridge systems. I prefer AMD but can't overlook the performance gap anymore.


----------



## Rookie1337

Quote:


> Originally Posted by *AddictedGamer93;15278293*
> All I see is a different mobo


1866 RAM was used. Just like in Guru3D's review: the higher the RAM speed, the better BD does, it seems.


----------



## Schmuckley

Quote:


> Originally Posted by *Am*;15278177*
> Said it before and I'll say it again: Bulldozer = fail of epic proportions. Judging by the core-for-core performance, it feels like they duct-taped Pentium 4 cores together, and seeing it lose to the previous-gen Phenom II X4, which has half the core count, was beyond a joke. This is a several-thousand-times bigger fail than their Phenom 1 ever was, and worse than I thought it could ever be. After all those philosophical theory posts by the man AMD fans consider a "prophet", John (from whom we won't be seeing any more hype threads/posts, I bet), we get an 8-core Pentium 4. Now where are all the quotes of people saying "BD will be the fastest CPU out", to make them eat those words after the thousands of flameposts most of them made here?
> 
> I kind of saw it coming anyway. The FX name signifies core performance of 2003; the biscuit-tin packaging is a metaphor for getting eaten for breakfast performance-wise. If this is the best you can do, AMD, after over half a decade of research & development, save yourselves the billions in cash and forget making/designing any more x86 CPUs (and spend it on developing/designing your GPUs instead).


I agree.. the epic failness of this release is unprecedented.. makes P4/Phenom 1 look like small fry.. wait.. it's 2011.. and AMD comes up with.. NetBurst architecture? ..they need to get their heads screwed on straight


----------



## neonlazer

Quote:


> Originally Posted by *cjc75;15278338*
> NewEgg now has the BD's listed!


Ouch... the 2500K at $219.. vs the 8150 at $279..


----------



## black96ws6

Thanks for the link to the review that compares the 6100 and 4100, here's a snippet:


----------



## Epsi

Quote:


> Originally Posted by *black96ws6;15278396*
> Thanks for the link to the review that compares the 6100 and 4100, here's a snippet:


You're welcome


----------



## G3RG

Quote:


> Originally Posted by *Rookie1337;15278352*
> 1866RAM used. Just like in Guru3Ds the higher the RAM speed the better BD does it seems.


Sounds plausible... though that is a pretty hefty jump just for higher-speed RAM, lol. I'm thinking it may have more to do with the motherboard.


----------



## Axon14

A sad day for AMD. I understand that they were doing something of a forward looking redesign of their CPU architecture, but they should have put a stop to this before it launched. Surely they knew this was the performance they had.


----------



## bavarianblessed

Well, this clears up quite a bit. Guess I'll buy a board for that 2500K I have sitting at home.


----------



## sub50hz

Quote:


> Originally Posted by *Oedipus;15278278*
> Which leads me to believe that this hotfix or patch is not real.


Did someone claim there was going to be one? Because reviewing my posts, I can see that I only alluded to a possibility; I have no first-, second- or tenth-hand knowledge of anything of the sort. It would _make sense_ that if it's a patchable issue, something is already being worked on.

I want to be very clear about this, as some of you with blinders on may somehow mistake me for some "AMD fanboy" -- I have no self-fabricated allegiance to Intel/AMD/Nvidia/Via/whoever else. I buy what's good for my dollar, and that's the bottom line. I don't hope for one company to have more success than the other because it makes for a fragmented market with less choice and poorly leveraged pricing.

AMD's complete failure would have farther-reaching implications than many of you realize, although I guess it's easier to be ignorant.


----------



## badatgames18

Quote:


> Originally Posted by *mechtech;15278187*
> This is not an error. This is a bad design implementation on Bulldozer that AMD wants OS vendors to hack around.
> 
> Linus Torvalds commented "Argh. This is a small disaster, you know that, right?... I'd be really worried. Changing address space layout is not a small decision."
> 
> http://us.generation-nt.com/answer/patch-x86-amd-correct-f15h-ic-aliasing-issue-help-204200361.html


thanks for the link..will try it after i get a chip
EDIT: nvm i think i see


----------



## born2bwild

Quote:


> Originally Posted by *G3RG;15278307*
> Has me wondering.... there has to be a reason it performed so much better for them =o.


Also, while most sites are impartial, several sites are clearly showing benchmarks favouring BD.

The review you posted earlier, for example, only uses game benchmarks that are GPU-limited (1080p, high detail, etc.) and posts a ~1fps advantage for BD to claim it is better, while in reality that 1% is practically the margin of error of the test, and the test itself is clearly GPU-bottlenecked. So in reality, the performance of the CPU is not seen.
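The GPU-limited point can be sketched with a toy frame-time model (every number below is invented): per-frame cost is set by whichever stage is slower, so a GPU-bound test hides even a large CPU gap.

```python
# Toy bottleneck model for the GPU-limited benchmarks discussed above:
# frame time is dictated by the slower of the CPU and GPU stages, so
# when the GPU is the limit, CPU differences vanish from the fps chart.
# All timings are invented.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower stage sets the frame time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(6.0, 16.7))   # fast CPU, heavy GPU load
print(fps(10.0, 16.7))  # much slower CPU, identical fps: GPU-bound
print(fps(6.0, 8.0))    # lighter GPU load: the CPU gap shows up
print(fps(10.0, 8.0))
```

That's why reviewers drop the resolution to expose CPU differences: it shrinks gpu_ms until the CPU stage is the one that matters.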


----------



## blackbalt89

Quote:


> Originally Posted by *black96ws6;15278396*
> Thanks for the link to the review that compares the 6100 and 4100, here's a snippet:


Wow. Those scores are abysmal.


----------



## t00sl0w

Can anyone confirm whether the original project lead/core team were fired or left?

I swear it's like AMD originally was designing a quad core with physical hyperthreading, instead of the magic faerie dust Intel uses... but somewhere in the middle they forgot that, KEPT THE SAME layout of the chip, and started calling it an octo-core instead of a quad.


----------



## tafkar

Quote:


> Originally Posted by *G3RG;15278409*
> Sounds plausible...though that is a pretty hefty jump just for a higher speed ram lol. I'm thinking it may have more to do with the motherboard.


It does make some sense... if the caching logic is even worse than people are saying it is.


----------



## nagle3092

Quote:


> Originally Posted by *blackbalt89;15278447*
> Wow. Those scores are abysmal.


LOL the A8 and A6 both beat the quad core FX.


----------



## SOCOM_HERO

I guess my processor isn't as obsolete as I thought it was becoming; I'm about on par with the FX. Pretty sad news indeed, as I was going to use BD in an upcoming build for a family member. Guess I'll have to find another chip to use, and it will in all likelihood be Intel now. I wish there was a good value-for-money chip out there that truly epitomized the idea that you can have good quad-core-or-higher performance for under $200.


----------



## cusideabelincoln

Quote:


> Originally Posted by *hajile;15277509*
> I agree entirely. I also think that doubling a chip's size to obtain equal performance shows a serious bottleneck somewhere. If I may quote myself.
> 
> "The chip was launched too early. I think that two performance increases are coming. The first will be piledriver. I suspect (given that it is launching in just a few months) that AMD pushed out this chip knowing that it had problems. I further suspect that the biggest problem is the one thing AMD won't talk about: branch prediction. Anand's N x N queen simulation helps to show that branch prediction has taken a large step backwards. Given that branch prediction is even more important with a longer pipeline, this is one of the most likely culprits. I believe that piledriver will focus almost entirely on fixing this problem.
> 
> The second problem is GF. AMD planned on a base frequency 30% faster and only achieved 9% faster. Even with the problems the chip has, overclock benchmarks @4.6Ghz (about 30% faster) places the chip more where it should be (at stock). In the upcoming revisions I think that speedbumps of 200Mhz+ are likely. Part of the cache latency problem is also linked to poor fab yields (increasing cache latency can help improve yields) and I suspect that piledriver will also have cache latencies more along the lines of Phenom II (or maybe just a bit faster).
> 
> All this said, this chip's intended audience was never primarily consumers. I think that server benchmarks show this chip to be more than competitive with sandy bridge.
> 
> OT edit: I think that with the BD transistor budget, AMD should have done what IBM did with POWER7 and made the L3 cache eDRAM (IBM did this to put more in while using less die space) and forgoing L3 for consumer chips because L3 is primarily to help with multi-processor bottlenecking in server/HPC environments."
> 
> edit: I do wonder if the cores are being starved by a too-narrow decode unit.


All of this sounds very plausible.


----------



## Oedipus

Quote:


> Originally Posted by *sub50hz;15278433*
> Did someone claim there was going to be one? Because reviewing my posts I can see that I only eluded to a possiblity, but I have no first-, second- or tenth-hand knowlegde of anything of the sort. It would _make sense_ that if it's a patchable issue something is already being worked on.


For someone who goes to great lengths to appear impartial, you sure appear partial to yourself. You're not the only one who has brought up the potential for an L1 fix or kernel fix or whatever that will magically ameliorate all of the issues BD has shown. You brought it up, I responded with skepticism. That's not a personal attack on you or your beliefs. Personally, I think it is highly unlikely that AMD and MS have been working on this for any significant length of time, if at all.

It's easier for AMD to not say anything and post extremely questionable YouTube videos with the comments disabled than it is to make their product work as advertised.


----------



## sub50hz

Quote:


> Originally Posted by *Oedipus;15278521*
> For someone who goes to great lengths to appear impartial, you sure appear partial to yourself.


I have to go out of my way to remind people not to take my words out of context or twist them into their own agenda because it happens here so very often.


----------



## G3RG

Quote:


> Originally Posted by *Oedipus;15278521*
> For someone who goes to great lengths to appear impartial, you sure appear partial to yourself. You're not the only one who has brought up the potential for a L1 fix or kernel fix or whatever that will magically ameliorate all of the issues that BD has shown. You brought it up, I responded with skepticism. That's not a personal attack on you or your beliefs. Personally, I think it is highly unlikely that AMD and MS have been working on this for any significant length of time, if at all.
> 
> It's easier for AMD to not say anything and post extremely questionable youtube videos with the comments disabled than it is to make their product work as advertised.


How does hoping for a fix make someone partial? I would hope both sides of the fence are hoping for a fix....


----------



## BlackOmega

Quote:


> Originally Posted by *Oedipus;15278158*
> BD has been in development for this long and they still haven't managed to get this mythical patch out yet?


Quote:


> Originally Posted by *Oedipus;15278278*
> I didn't say it would be a simple fix. Still, AMD and MS are filled with smart people that should have been able to get this done by now.
> 
> Which leads me to believe that this hotfix or patch is not real.


I suppose you forgot how drivers and these "mythical patches" helped Fermi's performance.

I'm sure it's taking longer than expected to optimize BD's new instruction sets.
Quote:


> Originally Posted by *nagle3092;15278217*
> To be fair, using an unscripted play through from a multiplayer game is a piss poor benchmark. There are to many variables that could happen from one run to another.


I'll give you that, as it can be fairly inconsistent.

Quote:


> Originally Posted by *Vagrant Storm;15278342*
> You can't go off percents... that is just sensationalizing the numbers. I absolutely dread it when news and marketing people do that. If you want to talk margin of error in percents, you pretty much have to accept that +/- 7.5% is the margin of error when dealing in numbers this big. If the numbers were 10 and 15, that's a 50% increase!!!! Woot! But once you have a significant base number, like 100 versus 105, you are at 5%. As the base number goes up, so does the margin of error.
> 
> Keep things mathematical and not subjective. We are MEASURING performance, so stick with actual measurements and leave the OMG percents alone. Same goes for those graphs showing SB like 70% better than BD in something. They're pointless percentages that do nothing but stir people up, because 70% sounds much worse than 10 seconds slower.
> 
> Plus that overclocked 2600K must be unstable if the 2500K is beating it; they are essentially the same CPU once you OC. So you need to focus on the stock measurements if you are going to continue using that data. Though since it is a beta game, it could get released and have drastically different results... I've seen it happen before.


I was just proving a point with the percents. A lot of people have said that BD performs worse, and I just put it in a format that makes it easy for people to decipher and see that that's not true. Not as great as expected? Sure. But not as bad as a lot of people would have you believe.

As for the 2500K, I'm not sure if it's an unstable overclock; as I read through a lot of the reviews today, I noticed that diminished performance after overclocking was fairly common throughout.
Quote:


> Originally Posted by *Rookie1337;15278352*
> 1866RAM used. Just like in Guru3Ds the higher the RAM speed the better BD does it seems.


As with all AMD chips since the x64 architecture: RAM frequency, timings and, for AM3 at least, CPU-NB frequency play a huge role in overall performance.

And contrary to popular belief about Intel chips, memory timings play just as big a role there too.
Quote:


> Originally Posted by *G3RG;15278409*
> Sounds plausible...though that is a pretty hefty jump just for a higher speed ram lol. I'm thinking it may have more to do with the motherboard.


I'd have to agree. It's odd that in all of the reviews that used the ASUS board, BD did so poorly. Something smells fishy.

Oddly enough, ASRock is a spin-off of ASUS.


----------



## tafkar

Quote:


> Originally Posted by *G3RG;15278575*
> How is it impartial to hope for a fix? I would hope both sides of the fence would be hoping for a fix....


Hope is usually an indicator of strong emotional attachment to an ideal.


----------



## arctia

Shame, could've been an awesome multithread performer for my server. The power consumption makes me want to kill a kitten.

Xeon here I come.



----------



## Nocturin

1150 freaking replies.

I was gonna read this whole thread today, but it would take a week.

Can someone sum up the controversy on the validity of the testing styles in the reviews for me?


----------



## Benz

Quote:


> Originally Posted by *kapulek;15277859*
> You said this 4-core Zambezi leak score was real, FX4xxx faster than an i7-2600K.
> 
> Now you're saying that your cousin's Zambezi 8-core score is lower than a 2500K?
> 
> Please explain.


You think I can explain? No, I can't.
Quote:


> Originally Posted by *radaja;15278011*
> Yes, I would also like to know why he claimed what he did. The graphs his cousin sent him don't make an FX-8150 look too good; it barely keeps up with a quad-core 2500K. Yet all we heard from him was "don't believe the leaks, I have seen real performance figures and BD is a winner" type stuff.


I never said that Bulldozer is a winner; I merely said don't believe anything that's not from AMD.

I don't know why the scores are so low, and I don't care anymore. I'm getting myself a 2500K, end of story.


----------



## G3RG

Quote:


> Originally Posted by *tafkar;15278599*
> Hope is usually an indicator of strong emotional attachment to an ideal.


But what does ANYBODY have to gain by wanting Bulldozer to be anything other than a success? That's why I don't get fanboys... they want the other side to fail even though it will never gain them anything.


----------



## Majin SSJ Eric

And now the mobo conspiracies begin...


----------



## Axon14

Quote:


> Originally Posted by *Nocturin;15278636*
> 1150 freaking replies.
> 
> I was gonna read this whole thread today, but it would take a week.
> 
> Can someone sum up the controversy on the validity of the testing styles in the reviews for me?


Fairy dust and broken dreams, my friend. http://en.wikipedia.org/wiki/K%C3%BCbler-Ross_model


----------



## BlackOmega

Quote:


> Originally Posted by *Majin SSJ Eric;15278684*
> And now the mobo conspiracies begin...


Can you explain the rather large discrepancy in performance, when the only visible difference is the motherboard?


----------



## Nocturin

Quote:


> Originally Posted by *Majin SSJ Eric;15278684*
> And now the mobo conspiracies begin...


You forgot all the other components:

SSD vs. HDD, memory, memory speed, CPU-NB, RAID controller, Nvidia vs. ATI, etc...


----------



## radaja

Quote:


> Originally Posted by *Majin SSJ Eric;15278684*
> And now the mobo conspiracies begin...


Yes, and all we need now is for wuttz to show up and start the "bentmark" argument. I can hear it now: "all the reviews relied on bentmarks, Bulldozer is great."


----------



## wanako

Oh my... I first read BitTech's review, saw dismal results, and then Anandtech's to clarify, and I've got to say, I am disappoint, AMD. I really hoped they would bring a good challenge to the table. That's how things get better. Intel must be ROFL'ing at their epic failure right about now.


----------



## ShiftedReality

I found a few issues with a few of the sites... one shows F1 2011 on par with the 2600K and the other shows it 20 FPS behind... wonder what the reason is for this.

http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-20.html

vs

http://www.hardwareheaven.com/reviews/1285/pg11/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-f1-2011.html


----------



## G3RG

Quote:


> Originally Posted by *radaja;15278772*
> yes and all we need now is for wuttz to show up and start the "bentmark" argument,i can hear it now,"all the reviews relied on bentmarks,bulldozer is great"


As I said a few posts back, what do you Intel fanboys have to gain by Bulldozer failing? Why are so many of you excitedly proclaiming the end of AMD? Do you work for Intel? Do you earn money when Intel succeeds? I kinda doubt it. So why the childish attitude?

Yes, I lean towards AMD as they're the underdog and I always tend to support the underdog. No, I'm not a fanboy. I'm likely about to build a 2500K system unless AMD can pull something magical out of their sleeve.


----------



## Evil-Jester

Quote:


> Originally Posted by *G3RG;15278839*
> As I said a few posts back what do you Intel fanboys have to gain by bulldozer failing? Why are so many of you excitedly proclaiming the end of AMD? Do you work for Intel? Do you earn money when Intel succeeds? I kinda doubt it. So why the childish attitude?
> 
> *Yes I lean towards AMD as they're the underdog and I always tend to support the underdog. No I'm not a fanboy. I'm likely about to build a 2500k system unless AMD can pull something magical out of their sleeve*.


That's how I feel. I wanted to see how BD was, but for a few extra $$ I can pick up a 2500K and be happy.


----------



## crashdummy35

Quote:


> Originally Posted by *G3RG;15278681*
> But what does ANYBODY have to gain by wanting Bulldozer to be anything other than a success? That's why I don't get fanboys....they want the other side to fail even though it will never gain them anything.


True. With no solid competition from AMD at the moment, Intel is free to do what they want with prices. BD isn't even a true 8-core CPU; eventually Intel's going to call them out on that, watch.

I'm sure we'll learn more in the weeks to come, but it just isn't looking good right now.

Intel has 3 choices, I think: keep prices steady and hold the lead they already have; lower prices a tiny bit during the holiday season and mop the friggin' floor with AMD; or raise prices, because they know their products are absolutely superior and it can't be argued they aren't.


----------



## G3RG

Quote:


> Originally Posted by *ShiftedReality;15278820*
> I found a few issues with a few of the sites... 1 shows F1 2011 on par with 2600k and the other shows it 20fps behind.. wonder what the reason is for this.
> 
> http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-20.html
> 
> vs
> 
> http://www.hardwareheaven.com/reviews/1285/pg11/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-f1-2011.html


We've been trying to discuss this for a while. Apparently Hardware Heaven was one of the only reviewers not to use an ASUS motherboard, and they used faster RAM. Other than that, I'm not sure.


----------



## BankaiKiller

So is this 8-core Bulldozer FX better than the Phenom or what... and is it able to match the i5-2500K at stock, and can it overclock well or what? I'm wanting to drop this chip in my mobo already!


----------



## 40.oz to freedom

Waiting for multi-GPU benchmarks before I really make a decision, but from the looks of things right now it's not gonna be any better.


----------



## Eolas

Quote:


> Originally Posted by *G3RG;15278839*
> As I said a few posts back what do you Intel fanboys have to gain by bulldozer failing? Why are so many of you excitedly proclaiming the end of AMD? Do you work for Intel? Do you earn money when Intel succeeds? I kinda doubt it. So why the childish attitude?
> 
> Yes I lean towards AMD as they're the underdog and I always tend to support the underdog. No I'm not a fanboy. I'm likely about to build a 2500k system unless AMD can pull something magical out of their sleeve.


Agreed. I don't understand all the people making fun of those who thought this chip was going to outdo Phenom II. I mean, Phenom II is old...

Something just doesn't seem right about this whole thing, IMO. It isn't a conspiracy or hopefulness; something just doesn't seem right about all of this. AMD goes into the think tank and continually delays, for a product that isn't better than their previous offering?

With all that R&D you would think they would have just made a Phenom II X8, but I really don't know.

Again, something seems off.


----------



## Kaze105

Quote:


> Originally Posted by *G3RG;15278839*
> As I said a few posts back what do you Intel fanboys have to gain by bulldozer failing? Why are so many of you excitedly proclaiming the end of AMD? Do you work for Intel? Do you earn money when Intel succeeds? I kinda doubt it. So why the childish attitude?
> 
> Yes I lean towards AMD as they're the underdog and I always tend to support the underdog. No I'm not a fanboy. I'm likely about to build a 2500k system unless AMD can pull something magical out of their sleeve.


In my case, I am just disappointed by its performance for the cost of $280. AMD is known for low cost for decent performance (at least to me), but at Microcenter the 2600K costs that much. Obviously they don't show the price of the FX-8150 right now, but there doesn't seem to be much point in getting this over the 2500K unless it's going to be sold for a similar cost at Microcenter.


----------



## tafkar

Quote:


> Originally Posted by *G3RG;15278681*
> But what does ANYBODY have to gain by wanting Bulldozer to be anything other than a success? That's why I don't get fanboys....they want the other side to fail even though it will never gain them anything.


Someone is always gaining somewhere. In this case, the real winners would be people who are invested in Intel stock, as Intel will remain secure in their ability to sell their consumer-space chips at higher profit margins for the next six months and beyond.

For people not investing in tech stocks? Looking for long-term benefits isn't something that is ever going to catch on in this culture. Most people would prefer either to justify their personal expenses, or to use their group identification to distract from their shortcomings as individuals.


----------



## radaja

Quote:


> Originally Posted by *Benz;15278653*
> I never said that Bulldozer is a winner I merely said don't believe anything that's not from AMD.
> 
> I don't know why the scores are so low, and I don't care anymore I'm getting myself a 2500K end of story.


OK, maybe I'm remembering what you said wrong, and I apologize for that. But still, two months ago when you saw those graphs, couldn't you conclude from looking at them that BD barely competes with the 2500K? If that's what AMD believed and showed in the graphs, what made you wait to go SB? Were you hoping AMD was downplaying the performance in those graphs?
Looking at your cousin's graphs, I just don't see a reason to think BD would be worth waiting two months for before making a decision on which would be better.


----------



## ShiftedReality

Also, Hardware Heaven was one of the few to use a board other than an ASUS, as has been said, and also one of the few to use a Radeon card... guess that makes a difference.


----------



## Kyronn94

Well, the 8150 is now out of stock on Newegg.

I'm pretty sure it's not AS bad as everyone's making it out to be.

I'm a little disappointed too, but it's not all bad.

OK, it didn't beat Sandy Bridge as everyone had hoped, but this is a completely new architecture. If Intel did the same thing with Ivy, a similar thing would probably happen.
Come Piledriver, they'll have it sorted.

The prices will almost DEFINITELY drop, now that everyone knows it has not lived up to expectations.

Want to build a NEW gaming build?
Get a 2500K.

Bought an AM3+ board for Bulldozer?
Wait for the prices to drop, and buy an FX.

Bulldozer fixes what was wrong with Phenom II: overclocking potential.
Either way, Bulldozer will be a MASSIVE success in the OEM/pre-built system world, due to the core count and high stock clock speeds.

Just my thoughts.


----------



## Vagrant Storm

Quote:


> Originally Posted by *ShiftedReality;15278820*
> I found a few issues with a few of the sites... 1 shows F1 2011 on par with 2600k and the other shows it 20fps behind.. wonder what the reason is for this.
> 
> http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-20.html
> 
> vs
> 
> http://www.hardwareheaven.com/reviews/1285/pg11/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-f1-2011.html


Could be any number of things... a 580 vs. a 6950... ASRock vs. ASUS motherboards... I am almost wondering if Hardware Heaven tested the CPUs while overclocked. They never say what frequency they tested at, that I can see... but I don't typically go to their site, so I might just not be seeing it.


----------



## Axon14

Quote:


> Originally Posted by *BankaiKiller;15278881*
> so is this 8core bulldozer fx better then the phenom or what..


Yes, in all but a very small number of benches.
Quote:


> and is it able to match i5 2500k on stock and can it overclock good or what. I'm wanting to drop this chip in my mobo already!


Match a stock 2500K? Probably not. It does seem to overclock well, but it needs tons of power.


----------



## Benz

Quote:


> Originally Posted by *radaja;15278911*
> OK, maybe I'm remembering what you said wrong, and I apologize for that. But still, two months ago when you saw those graphs, couldn't you conclude from looking at them that BD barely competes with the 2500K? If that's what AMD believed and showed in the graphs, what made you wait to go SB? Were you hoping AMD was downplaying the performance in those graphs?
> Looking at your cousin's graphs, I just don't see a reason to think BD would be worth waiting two months for before making a decision on which would be better.


I was absolutely sure that the performance levels had something to do with BIOS updates; that's why I waited so long. If I had known for sure that Bulldozer would be such a piece of crap, I wouldn't have waited even a second.


----------



## IXcrispyXI

With the power consumption as high as it is overclocked, I'm worried to even think how long the CPU will last. I've only had Intel CPUs, but I really wanted more out of BD.

The price of 2600Ks has jumped up $20 here in less than a day...


----------



## hydropwnics

Dunno how they are gonna charge $280 for an 8150 when you can get a 2500K for $180 at a Microcenter.


----------



## radaja

Quote:


> Originally Posted by *Benz;15279007*
> I was absolutely sure that the performance levels had something to do with BIOS updates, that's why I waited so long, If I knew for sure that Bulldozer will be such a piece of crap I wouldn't have waited for even a second.


OK, I see: optimistic outlook. Well, I still hope they figure this out. Over at XS there's a guy who disabled one core in each module of an FX-8150 and ran it as a 2-module/4-core quad and got much better results, which seems to mean the sharing of front-end resources is bugged. But if true, it's only good news in the immediate future if you're willing to pay for an x8 and run it as an x4. Maybe an Interlagos chip with every other core disabled, run as an x8, will show what the FX-8150 could really do?
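For anyone who wants to try that one-core-per-module experiment without a BIOS option for it, pinning the workload to one logical CPU per module gets close. A rough sketch, assuming Linux enumerates a module's two cores as adjacent logical CPUs (pairs 0/1, 2/3, ...), which is worth double-checking on your own board first:

```python
def one_core_per_module(total_cpus=8, cores_per_module=2):
    """Logical CPU ids that keep only the first core of each module."""
    return set(range(0, total_cpus, cores_per_module))

if __name__ == "__main__":
    cpus = one_core_per_module()  # {0, 2, 4, 6} on an FX-8150-style layout
    print(sorted(cpus))
    # On Linux you could then pin this process (and any benchmark it
    # spawns) to those cores with os.sched_setaffinity(0, cpus), or run
    # the benchmark under `taskset -c 0,2,4,6`.
```

Not the same as truly powering the cores down, but it removes the shared-front-end contention the XS thread is talking about.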


----------



## IXcrispyXI

Quote:


> Originally Posted by *Kyronn94;15278961*
> Bulldozer fixes what was wrong with Phenom II - Overclocking potential.
> Either way, Bulldozer will be a MASSIVE success in the OEM/pre built system world, due to the cores and high stock clock speeds.
> 
> Just my thoughts


This I can see. A lot of the not-so-tech-savvy people will see 8 cores, be impressed, and buy an OEM/pre-built PC.


----------



## ShiftedReality

Would someone on OCN who gets an FX-8150 bench it for us with a board other than an ASUS? I'm curious to see whether or not it is an issue with a particular board.


----------



## ToxicAdam

Quote:


> Originally Posted by *crashdummy35;15278863*
> BD isn't even a true 8-core cpu,


Is this true?


----------



## OC'ing Noob

Quote:


> Originally Posted by *IXcrispyXI;15279072*
> This I can see. A lot of the not-so-tech-savvy people will see 8 cores, be impressed, and buy an OEM/pre-built PC.


I can see the ads on Ebay now:

- Custom built gaming computer with 8 cores!
- Overclocked to 4GHz on each core = 32GHz
- OMG YOU MUST BUY!


----------



## hydropwnics

Quote:


> Originally Posted by *OC'ing Noob;15279098*
> I can see the ads on Ebay now:
> 
> - Custom built gaming computer with 8 cores!
> - Overclocked to 4GHz on each core = 32GHz
> - OMG YOU MUST BUY!


----------



## Rpg2

Quote:


> Originally Posted by *radaja;15279067*
> OK, I see: optimistic outlook. Well, I still hope they figure this out. Over at XS there's a guy who disabled one core in each module of an FX-8150 and ran it as a 2-module/4-core quad and got much better results, which seems to mean the sharing of front-end resources is bugged. But if true, it's only good news in the immediate future if you're willing to pay for an x8 and run it as an x4. Maybe an Interlagos chip with every other core disabled, run as an x8, will show what the FX-8150 could really do?


Link? That sounds interesting.

How much is much better, and are there benches? 10%? 20%?


----------



## hydropwnics

Anyone know if I can drop an 8150 in my Crosshair IV, or was that a myth with that beta BIOS update?


----------



## wanako

Oh LOOK! Someone got their BD already and posted a "review" on NewEgg about it.
Quote:


> Pros: I have been extensively researching this since its announcement in June.
> 
> Benchmarks from all over are proving that this truly is the next best CPU. This thing performs equally to the 2600k in nearly everything, and better than the i7 990x in a surprisingly large number of mainstream games.
> 
> As for people who may want to get a Phenom: DO NOT! The 8150 actually performs better than the phenom for the price! This thing gets scores around 40% better than the best Phenom available.
> 
> As for the 8 cores? Some people may complain that they arent true cores, but they are. AMD uses what are called "modules" rather than cores. A module is a core, but with its own individual cache and other resources. The only downside is that it has an equal number of threads that a quad core i7 would have. This doesnt really matter, as it still has 8 threads, and 8 threads surpass what anything would use anyways.
> 
> Cons: Beware of speed. I got pulled over by the CPU Police for going 4GHz in a 3.2GHz zone. (LOL)
> 
> Other Thoughts: Using this with a Radeon 7990 will no doubt make you the coolest person in the world.


Absolute idiocy at its finest.


----------



## anubis1127

Has anybody seen reviews of gaming performance with multiple GPUs? That's all I'm really curious about. I couldn't care less that an 8-core BD CPU can keep up with my i3 and a single high-end GPU; I'm more curious about how an 8150 does with 2 or 3 HD 6970s or GTX 580s. To me, that was always the area where the Phenom IIs were lacking.


----------



## Vagrant Storm

Quote:


> Originally Posted by *IXcrispyXI;15279072*
> This I can see. A lot of the not-so-tech-savvy people will see 8 cores, be impressed, and buy an OEM/pre-built PC.


This has been the case for a long time... BD will probably make it worse. I wonder if AMD will actually start advertising, though. If they don't, then it might not matter.

However, the salesmen for pre-builts will try to sell the more expensive Intel CPUs over a system with an FX-8120 in it.


----------



## mad0314

Quote:


> Originally Posted by *ShiftedReality;15278920*
> Also, Hardware Heaven was one of the few to use a board other than an ASUS, as has been said, and also one of the few to use a Radeon card... guess that makes a difference.


And also terrible testing and reporting. They did a single run in their gaming tests, with graphics settings turned all the way up on a single 6950, I believe, and called it a win. That was one of the worst articles I read, as it was very vague and short. If their results are true, I would love to see them go into more depth, or have someone else who does go into depth reproduce them. They need to spend more time on their tests than on their silly animated graphs.


----------



## ToxicAdam

Quote:


> Originally Posted by *wanako;15279151*
> Oh LOOK! Someone got their BD already and posted a "review" on NewEgg about it.
> 
> Absolute idiocy at it's finest.


Quote:


> Cons: Beware of speed. I got pulled over by the CPU Police for going 4GHz in a 3.2GHz zone.


lol


----------



## Vagrant Storm

Quote:


> Originally Posted by *mad0314;15279160*
> And also terrible testing and reporting. They did a single test in their gaming tests, with graphics settings turned all the way up with a single 6950, I believe, and called it a win. That was one of the worst articles I read as it was very vague and short. If their results are true, I would love to see them go into more depth or have someone else that does go into depth reproduce them. They need to spend more time on their tests than their silly animated graphs.


Anyone with a shred of spare time want to go through and compile results from each site? I would... but I've messed around too much already today.

If not... give me a few hours and I will have an Excel sheet up.


----------



## Chuckclc

Where are the 4100 and 6100 reviews? I think I am going to get one. No need for the 8120 or 8150, though.


----------



## radaja

Quote:


> Originally Posted by *Rpg2;15279126*
> Link? That sounds interesting.
> 
> How much is much better, and are there benches? 10%? 20%?


XS is down for maintenance at the moment, but here is the link to the thread.
I really only quickly skimmed through it, so I don't know the actual % increase,
but some tests clearly did much better with one core disabled in each module.
*AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*

But like I said, this only helps if you're willing to pay for an 8-core and use it as a 4-core, if it turns out to hold any weight.
I will wait to see more findings on this issue.


----------



## ShiftedReality

Quote:


> Originally Posted by *Chuckclc;15279190*
> Where are the 4100 and 6100 reviews? I think I am going to get one. No need for the 8120 or 8150, though.


http://www.techspot.com/review/452-amd-bulldozer-fx-cpus/

has them all in the one review.


----------



## ToxicAdam

Quote:


> Originally Posted by *crashdummy35;15278863*
> BD isn't even a true 8-core cpu,


Link?


----------



## Benz

Quote:


> Originally Posted by *radaja;15279067*
> OK, I see: optimistic outlook. Well, I still hope they figure this out. Over at XS there's a guy who disabled one core in each module of an FX-8150 and ran it as a 2-module/4-core quad and got much better results, which seems to mean the sharing of front-end resources is bugged. But if true, it's only good news in the immediate future if you're willing to pay for an x8 and run it as an x4. Maybe an Interlagos chip with every other core disabled, run as an x8, will show what the FX-8150 could really do?


I can't really pretend to answer this, because I'm so pissed right now that I could eat my face off. This is sooooo the last time I trust AMD, no more.


----------



## cjc75

Quote:


> Originally Posted by *Vagrant Storm;15278342*
> Quote:
> 
> > Originally Posted by *cjc75;15278338*
> > NewEgg now has the BD's listed!
> 
> lol _*and notice they are still in stock*_. I remember trying to get a 5870 and they would be out of stock before you got it added to your cart.

Actually...

No, they're not.

*The 8150's are already sold out!*

Oh... and that's "sold out" within 2 hours of them first being listed as "In Stock"...


----------



## BlackandDecker

So let me get this straight: it's slow as hell, while using more juice than a 2600K?
What a loss.
What a loss.


----------



## finalturismo

Well, I got one thing right:

The scaling is great.

You can hit a 5GHz overclock on Bulldozer.

But... it's slower clock for clock.

So, Faildozer.

If the low-end Bulldozer can hit 5GHz, then I will buy one for the best price/performance ratio.

I don't understand AMD...

Why not just release the 16-core server Bulldozer for the desktop?

Then you could have at least beaten Intel in raw power...

This is AMD's greatest fail...

The only thing I see good about this is that it's a 4-module processor, so there is potential to cram tons of cores onto a single die.

Great for the server side.

But anyway, FAILDOZER.

Maybe Intel is doing some insider trading with AMD?


----------



## Nocturin

Quote:


> Originally Posted by *Vagrant Storm;15279182*
> Any one with a shred of spare time what to go through and compile results from each site? i would...but I've messed around too much already today
> 
> If not...give me a few hours and i will have an excel sheet up


I've been wanting to do this, but family and work interfere. If you find it, link me.


----------



## Lampen

Quote:


> Originally Posted by *radaja;15279067*
> OK, I see: optimistic outlook. Well, I still hope they figure this out. Over at XS there's a guy who disabled one core in each module of an FX-8150 and ran it as a 2-module/4-core quad and got much better results, which seems to mean the sharing of front-end resources is bugged. But if true, it's only good news in the immediate future if you're willing to pay for an x8 and run it as an x4. Maybe an Interlagos chip with every other core disabled, run as an x8, will show what the FX-8150 could really do?


Link?


----------



## omninmo

Quote:


> Originally Posted by *radaja;15279067*
> OK, I see: optimistic outlook. Well, I still hope they figure this out. Over at XS there's a guy who disabled one core in each module of an FX-8150 and ran it as a 2-module/4-core quad and got much better results, which seems to mean the sharing of front-end resources is bugged. But if true, it's only good news in the immediate future if you're willing to pay for an x8 and run it as an x4. Maybe an Interlagos chip with every other core disabled, run as an x8, will show what the FX-8150 could really do?


I've been saying this ALL DAY LONG and no one gave a crap... I must've posted like 3 or 4 posts with this suggestion, lolol.

I even asked someone to test it here:
http://www.overclock.net/benchmarking-software-discussion/1137376-what-benchmarks-do-you-want-see-4.html#post15274384

Mua hua hua, time to pat myself on the back? Nahh, not really; I'd like to see some numbers first...

Could you link me to that thread?


----------



## radaja

Quote:


> Originally Posted by *Benz;15279234*
> I cant really pretend to answer this, because I'm so pissed right now that I could eat my face off. This is sooooo the last time I trusted AMD and no more.


I can tell you're upset. You know how?

That wasn't a question. :lachen:

I was telling you about some testing a guy did at XS, and that's all.
Sorry for laughing about it; it tickled me, I guess.


----------



## BlackandDecker

crapdozer


----------



## PyroTechNiK

Not only are the performance numbers a fail, but so is the power consumption.

I cannot see how anyone can justify the purchase of this failure.


----------



## Stuuut

Why is Hardware Heaven the only review site that shows Bulldozer performing well? Did they receive a different chip or something??


----------



## Dmac73

Quote:


> Originally Posted by *Benz;15279234*
> I cant really pretend to answer this, because I'm so pissed right now that I could eat my face off. This is sooooo the last time I trusted AMD and no more.


It's just a bad uarch in its current state.

Who's going to buy a module-based CPU to disable 1 core per module? That's ******ed, and the performance increase probably still won't even match the IPC of Phenom II.

A 3.6GHz 8-core Bobcat: that's what this chip is. With Fermi power consumption. Rofl.


----------



## ikem

Quote:


> Originally Posted by *ShiftedReality;15279075*
> Would someone on OCN who gets a FX-8150 bench it for us with a different board besides a ASUS? I'm curious to see if it is a issue with a particular board or not.


I will. I have an order in at TigerDirect, but it is now on back order... Once I get the CPU I'll do a nice review; I will even OC the CPU-NB.

I hope it turns out a little better than these reviews.


----------



## radaja

Quote:


> Originally Posted by *Lampen;15279277*
> Link?


Quote:


> Originally Posted by *omninmo;15279289*
> GODDAMN IT i've been saying this ALL DAY LONG and no one gave a crap.. I must've posted like 3 or 4 posts with this suggestion lolol
> 
> i even asked someone to test it here:
> http://www.overclock.net/benchmarking-software-discussion/1137376-what-benchmarks-do-you-want-see-4.html#post15274384
> 
> mua hua hua time to pat myself on the back? nahh not really, though i'd like to see some numbers first..
> 
> could you link me to that thread?


XS is down at the moment, but here is the link once again.

*AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*

and once again here

*AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*

and once again here

*AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*

Sorry for the 3 links, but I posted this once or twice already in this thread; maybe 5 will be the charm.


----------



## Rpg2

Quote:


> Originally Posted by *radaja;15279321*
> XS's is down at the moment but here is the link once again.
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> and once again here
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> and once again here
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> sorry for 3 links but i posted this once or twice already in this thread maybe 5 will be the charm


Throw it into your signature for the next week or so.


----------



## doomlord52

So you need 80W more power and a 1GHz OC to tie a 2600K? Sounds terrible.
As is, stock vs. stock it's equal to a 2500K, but STILL worse core for core (it's equal to a 920 at stock!?!?).

Faildozer is fail. Too much power, worse performance than what's already out there, terrible core efficiency. I'll stick with my 2600K @ 4.75GHz, which will absolutely EAT a BD 8150.


----------



## Dmac73

An FX-4170 @ 4.2GHz gets spanked by an 1156 i5 @ 2.66GHz. Lol... A 3.7GHz Phenom II 980 beats up on it badly with a 500MHz disadvantage.


----------



## omninmo

Quote:


> Originally Posted by *radaja;15279321*
> XS's is down at the moment but here is the link once again.
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> and once again here
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> and once again here
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> sorry for 3 links but i posted this once or twice already in this thread maybe 5 will be the charm


Thanks.

Will check back in a while!

Damn, and here I was thinking I was the only one who had thought of this...

There go my hopes of having it named:
"OmNINmO's Grand Method for making Bulldozer not fail.. as hard.. sort of"

Oh, and @ the people who say it's stupid to buy an 8-core to use it as a 4C:

Well, multithreaded performance ain't that bad at all; it trades SOME blows with a 2600K in the right situations.

This would be a "rough fix" for poor performance in GAMES, where you could clock it higher with less power and heat, and not be affected by the shared-resources penalty...

Thinking 5.2GHz+ on water, maybe?

Who knows...


----------



## yakuzapuppy

With the octos not looking so good compared to the 2500K, I'm curious to see how the hex and quad CPUs hold up in their respective brackets.


----------



## Kand

Quote:


> Originally Posted by *radaja;15278772*
> yes and all we need now is for wuttz to show up and start the "bentmark" argument,i can hear it now,"all the reviews relied on bentmarks,bulldozer is great"


http://tipidpc.com/viewtopic.php?tid=172787
He did! On another forum, however.


----------



## Awhoon

Quote:


> Originally Posted by *Stuuut;15279308*
> Why is Hardware Heaven the only review site that shows Bulldozer is performing good? Did they receive a different chip or something??


I noticed that too. Initially I was like "cool!", but that story wasn't being shared by other reviewers.


----------



## mad0314

Quote:


> Originally Posted by *radaja;15279321*
> XS's is down at the moment but here is the link once again.
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> and once again here
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> and once again here
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> sorry for 3 links but i posted this once or twice already in this thread maybe 5 will be the charm


Yeah, but what if we disable 1 integer core per module, so that it doesn't share resources?


----------



## B3anbag

Quote:


> Originally Posted by *finalturismo;15279249*
> well i got 1 thing right; maybe intel is doing some insider trading with amd?


AMD's R&D dept is full of Intel moles.

sorry, couldn't resist the misquote. Won't happen again.


----------



## guitarholic2008

Quote:


> Originally Posted by *daman246;15269804*
> Wow the way things are going AMD will never BE intels Equal they should just drop out of manufacturing Processors seriously if this is what BUlldozer brings us then this is the biggest Dissapointment ever


10 years ago I wouldn't have believed that Intel would ever beat out AMD. All the negativity in this thread is sad. When you compare flagship to flagship, AMD still has the price advantage. And for the average computer user, both are more processor than the average Joe will ever use. I've built AMD and Intel machines for people, and even though you could consider me an AMD fanboy, I will tip my hat to the performance Intel gives!

People will find a use for BD, and the market will reflect it eventually. As for AMD dropping out of the picture, that would be bad for the market: competitive pricing is what keeps Intel in check. As far as BD goes, yes, I'm a little disappointed, but even if they had come out and smashed Sandy Bridge, it would still be a year at best before I bought one.

Food for thought, people... personally I'd be happy if every thread wasn't a p***ing contest...


----------



## Stuuut

Quote:


> Originally Posted by *Awhoon;15279405*
> I noticed that too. Initially I was like "cool!" but story wasnt being shared by other reviewers.


Yeah, but how can every other review on the planet say that Bulldozer isn't performing well, while in their gaming benchmarks Bulldozer beats all...


----------



## Wildcard36qs

Do these work at all in the AM3 socket with a BIOS update? This has probably been answered a long time ago... wasn't sure if you had to have the new socket or not.


----------



## Benz

Quote:


> Originally Posted by *radaja;15279292*
> i can tell your upset,you know how?
> 
> that wasnt a question:lachen:
> 
> i was telling you about some testing a guy did at XS's and that all.
> sorry for laughing about it,i tickled me i guess


Yeah, well, I thought there was a question in there somewhere.

And don't worry about laughing; I guess I'm getting over it and laughing too.

Oh, I'm thinking of sticking a few grams of C4 on my GA-990FXA-UD3 and remote-detonating it from a safe distance.


----------



## bru_05

Quote:


> Originally Posted by *mad0314;15279415*
> Yea, but, what if we disable 1 integer core per module, so that it doesn't share resources?


I think that's what he requested a few pages back. Would be interesting to see.


----------



## radaja

Quote:


> Originally Posted by *mad0314;15279415*
> Yea, but, what if we disable 1 integer core per module, so that it doesn't share resources?


That's the point I'm making: this guy tested it, disabled one core in each module, ran it as a 2-module/4-core chip, and got better performance. The only thing that would explain that is AMD's CMT not working like they said it would; the sharing must be causing big problems with BD.


----------



## 2010rig

Quote:


> Originally Posted by *Stuuut;15279308*
> Why is Hardware Heaven the only review site that shows Bulldozer is performing good? Did they receive a different chip or something??


I'm not saying the following is true, but it's a possibility. Check out what a little birdie told him.
Quote:


> Originally Posted by *2010rig;15240573*
> Oh my...
> 
> From this preview:
> http://lab501.ro/procesoare-chipseturi/amd-fx-8150-bulldozer-preview


----------



## anubis1127

Quote:


> Originally Posted by *guitarholic2008;15279441*
> When you compare flagship to flagship, AMD still has the price advantage.


What? I guess if you throw in Intel's 980X, but comparing the SB flagship to the BD flagship, they are both $280 in the US.


----------



## mad0314

Quote:


> Originally Posted by *guitarholic2008;15279441*
> 10 years ago I wouldn't believe that Intel would ever beat out AMD. All the negativity in this thread is sad. When you compare flagship to flagship, AMD still has the price advantage. When you look at the average computer user, they all are more processor than the average Joe will use. I've built AMD and Intel machines for people, and even though you could consider me an AMD fan boy, I will tip my hat to the performance Intel gives!
> 
> People will find a use for BD, and the market will reflect it eventually. As far as AMD dropping out of the picture, that would be bad for the market. Competitive pricing is what keeps intel in check. As far as BD goes, yes I'm a little disappointed, but even if they came out and smashed Sandy Bridge, it would still be a year at best before I bought one.
> 
> Food for thought people... personally I could be happy if every thread wasn't a p***ing contest...


This is what we do as enthusiasts. We break it down and look at the performance of every aspect of it. You would not compare flagship to flagship when there is a $700 price difference. You would not compare Honda's flagship to Ferrari's flagship, just because both are their "flagship." They are in totally different territories. Someone that needs the power won't even think about the one that doesn't deliver it.

I do agree that these chips are much more than what the average consumer needs. But with the single thread performance being the way it is, I cannot see the point of anyone buying the quad core over a Phenom II quad core. The only advantage BD has is core count, and if you go into the lower segment that flies right out the window and you are left with only the disadvantages.
Quote:


> Originally Posted by *Stuuut;15279454*
> Yeah but i mean how can every other review on this planet say that Bulldozer isn't performing good and in their gaming benchmarks Bulldozer beats all....


I posted this a few pages back, but with the speed of this thread posts get missed easily. Their testing was not thorough and their article is bad. They showed a GPU bottleneck and called it a CPU win. They did only a single test with each CPU in each game benchmark with very high graphics settings.
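
The GPU-bottleneck point is easy to sketch with a toy model (all numbers hypothetical): the frame rate you observe is capped by whichever stage is slower, so at very high graphics settings two very different CPUs can post identical scores:

```python
def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    """A frame finishes no faster than the slower of the CPU and GPU stages."""
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers: with the GPU capped at 60 fps, a much faster CPU
# changes nothing, so the benchmark shows a tie rather than a CPU result.
gpu_bound = observed_fps(cpu_fps=90, gpu_fps=60)
still_gpu_bound = observed_fps(cpu_fps=140, gpu_fps=60)
```

Lowering the graphics settings or resolution until `observed_fps` tracks the CPU term is exactly what a thorough CPU benchmark does.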


----------



## Vagrant Storm

Quote:


> Originally Posted by *Awhoon;15279405*
> I noticed that too. Initially I was like "cool!" but story wasnt being shared by other reviewers.


This is why we need a comprehensive database of results, with how the hardware was set up and at what clock speeds. I never use Hardware Heaven's site, but I fully trust their results... as in, they observed what they reported. However, there must be a reason why they have different results.

I find it hard to believe the better results were due to an ASRock motherboard (or have they gotten better? I'm used to them being more of a budget brand with a few good options, but then I can't help thinking that about Foxconn too, and they've been making some decent boards).

Ah, to hell with it... I am going to call it a day early. Heh... not getting any work done anyway, thanks to Bulldozer.


----------



## omninmo

Quote:


> Originally Posted by *bru_05;15279540*
> I think that's what he requested a few pages back. Would be interesting to see.


well, it would be easier to cool on air, maybe making 5GHz achievable with decent air.. I've already seen 5.2 on water for all 8 cores, so perhaps even more can be pulled off with this strategy and water..

and if IPC per core went up even a bit, it could at least help make it decidedly faster than Phenom II in gaming, making it a decent upgrade path if AMD lowered prices...

excited to see


----------



## Bowser

Quote:


> Originally Posted by *radaja;15279321*
> XS's is down at the moment but here is the link once again.
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> and once again here
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> and once again here
> 
> *AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.*
> 
> sorry for 3 links but i posted this once or twice already in this thread maybe 5 will be the charm


Quote:


> (*EDIT*: Check out the bottom page for updated and more accurate clock-per-clock results) Now you should take these results with a grain of salt. Unlike with Intel's chips, where it is easy to disable cores from within the BIOS, there is no such luxury with Bulldozer at this time. Therefore, we had to limit the FX-8150's number of cores from within the OS. This fact and perhaps some peculiarities when it comes to how Windows 7 assigns workloads to the Bulldozer microarchitecture might have caused exaggerated results. We will be better able to gauge C-P-C performance once we get our hands on a true four-core Zambezi chip.


So how are the cores being disabled then?

And are you trying to compare the 4-core Deneb with an 8150 (with an odd half of the cores "on")?

I'm trying to clarify what you've posted in several threads. I don't think it's an apt comparison and would rather see the 4100 compared to 4-core Denebs than to an 8150.

**Edit** I see that they explain the process a little bit, but I'm curious why they chose those cores on each module. I saw a graph somewhere (too many reviews) that had an 8-core turbo and then a 4-core turbo mode (which was higher than the 8-core one).

Just trying to sort through some of the reviews. It's the first time I've actually been interested in a CPU launch.


----------



## Stuuut

Quote:


> Originally Posted by *2010rig;15279568*
> I'm not saying the following is true, but it's a possibility. Check out what a little birdie told him.


So they're basically calling reviewers to tell them how to benchmark it to get the best scores, or am I missing something?


----------



## Stuuut

Quote:


> Originally Posted by *mad0314;15279576*
> I posted this a few pages back, but with the speed of this thread posts get missed easily. Their testing was not thorough and their article is bad. They showed a GPU bottleneck and called it a CPU win. They did only a single test with each CPU in each game benchmark with very high graphics settings.


Aaah thnx for explaining


----------



## omninmo

Quote:


> Originally Posted by *Bowser;15279595*
> 
> So how are the cores being disabled then?
> 
> And are you trying to compare the 4 core deneb with a 8150 (with a odd half the cores "on"?
> 
> I'm trying to clarify what you've posted in several threads. I don't think it's an apt comparison and would rather see the 4100 compared to 4 core denebs than an 8150.


they disabled 2 modules to simulate an FX-4110..

what we need is to see someone disable one core PER module so that none of the active ones have to share resources.. I'm hoping that is possible, but of course I can't say for sure whether there's a way to disable or massively downclock one core per module!


----------



## caffeinescandal

Quote:


> Originally Posted by *Wildcard36qs;15279469*
> Do these work at all in the AM3 socket with bios update? This is probably been answered a long time ago...wasnt sure if you had to have the new socket or not.


Some manufacturers are updating their BIOS code to support AM3+ CPUs on 890-series chipsets. Others, however, don't.


----------



## kweechy

I see that the FX-8150 tends to average around 6 points in Cinebench 11.5.

How does turbo play into this equation? Is it likely at 4.2GHz the whole time it runs the benchmark?


----------



## cjc75

Quote:


> Originally Posted by *ShiftedReality;15278820*
> I found a few issues with a few of the sites... 1 shows F1 2011 on par with 2600k and the other shows it 20fps behind.. wonder what the reason is for this.
> 
> http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-20.html
> 
> vs
> 
> http://www.hardwareheaven.com/reviews/1285/pg11/amd-fx-8150-black-edition-8-core-processor-vs-core-i7-2600k-review-f1-2011.html


The difference being that Tom's and most others did not go with AMD's full "Scorpius" platform as their test bed, using an nVidia GTX580...

While HardwareHeaven DID, by using a Sapphire TOXIC Radeon HD6950 2GB switched to its "performance" mode with all shaders unlocked.

ALSO, Tom's used cheap Crucial DDR3-1333 RAM...

While HardwareHeaven used high-end Corsair Vengeance DDR3-_1866_ RAM.

Also, as for ASRock and BIOS support: yes, ASRock did recently release a BIOS update for their 990FX-series boards intended to optimize for Bulldozer better than their previous BIOS updates... THAT you can read about in the ASRock 990FX fan club threads here on OCN.

http://www.overclock.net/amd-motherboards/1054875-amd-asrock-extreme-series-motherboard-club.html
http://www.overclock.net/amd-motherboards/1078815-preview-asrock-990fx-fatal1ty-990fx-professional.html

There's your difference...

It's all in the hardware and how it's set up.


----------



## capitaltpt

Quote:


> Originally Posted by *Vagrant Storm;15279578*
> This is why we need a comprehensive database of the results with how hardware was used and what clock speeds. I never use Hardware Heaven's site, but I fully trust their results...as in they observed what they reported. However, there must be a reason why they have different results
> 
> I find it hard to believe the better results was do to an ASRock mother board (or have they gotten better? i am used to them being more of a budget board with a few good options, but then...I can't help but think that about Foxconn too and they've been making some decent boards.)
> 
> All to hell with it...I am going to call it a day early. heh...not getting any work done anyway thanks to Bulldozer.


It doesn't necessarily mean that the ASRock board is the answer; more likely, something could be wrong with the Crosshair V boards with respect to BD. I'm just remembering how many 1-star reviews the Crosshair V got on Newegg (25% last I checked). Maybe there is a design flaw or BIOS flaw with it. It would be helpful to see BD benched on other mobos like the Sabertooth and Gigabyte's UD7 to prove or disprove this.


----------



## Ganglartoronto

Quote:


> Originally Posted by *doomlord52;15279377*
> So you need 80w more power, a 1ghz OC to tie a 2600k? Sounds terrible.
> As is, its stock v. stock its equal to a 2500k, but STILL worse core for core (its equal to a 920!? at stock!?!?).
> 
> Faildozer is fail. Too much power, worse perf than whats already out there, terrible core efficiency. I'll stick with my 2600k @ 4.75ghz, which will absolutely EAT a BD 8150.


thanks for sharing.


----------



## 2010rig

Quote:


> Originally Posted by *Stuuut;15279614*
> So they are basically calling reviewers on how to benchmark it to get the best scores or am i missing something?


Honestly, I cannot say for certain if this is the case because I have no proof, but read the highlighted part carefully...
Quote:


> "Now the saddest thing is not only the performance but the way in which AMD tries to manage this. I heard from a little birdie that some folks at AMD will start calling press tomorrow to ask them how reviews are going and try to do some damage control (this reminds me of Nvidia calling press before the GTX 480 launch). Actually many of the press guys I talked to are a little bit puzzled and really don't know how to approach the situation. From my point of view it is pretty clear: the truth (no matter how much it hurts) is the only way"


I leave you to make your own conclusions and what you make of it.

Remember this was from a "leaked review" that showed exactly what we are seeing from MOST reviews.


----------



## radaja

Quote:


> Originally Posted by *Bowser;15279595*
> 
> So how are the cores being disabled then?
> 
> And are you trying to compare the 4 core deneb with a 8150 (with a odd half the cores "on"?
> 
> I'm trying to clarify what you've posted in several threads. I don't think it's an apt comparison and would rather see the 4100 compared to 4 core denebs than an 8150.
> 
> **Edit** I see that they explain the process a little bit, but I'm curious on why they chose those cores on each module. I saw a graph somewhere (too many reviews) that had 8 core turbo, and then a 4 core turbo mode (which was higher than the 8 core).
> 
> Just trying to sort through some of the reviews. It's my first time I've actually been interested in a launch of a CPU


The guy in that thread disabled one core in each module so it was a quad core; that way each core would avoid sharing resources in its module. I have seen others post this info from that thread and report a 10% to 30% increase in performance in some tests. Either way it's still a serious issue and AMD screwed up big time. I don't think many will pay for an 8150 only to run it as a quad-core. This is most likely the cache thrashing everyone is talking about.
Quote:


> Originally Posted by *omninmo;15279668*
> they disabled 2 modules to simulate a FX 4110..
> 
> what we need is to see someone disable one core PER module so that none of the active ones have to share resources.. i'm hoping that is possible, but ofc can't say for sure if there is a chance to disable or massively downclock one core per module!


Correct: in the above graph, disabling two modules only simulates the FX-4110, sharing resources and all.
The guy at XS used a newer version of the ASUS OC software (or UEFI BIOS, I think) that allowed disabling individual cores; the original software/UEFI BIOS didn't have that setting, only the one that turned off whole modules. Again, I didn't read the thread thoroughly before XS went down.


----------



## cjc75

Oh by the way guys... PRICES.

You all are comparing an EIGHT CORE chip, to the price of a FOUR core 2500K?

HAH!

Well of course the EIGHT core is going to cost MORE than the FOUR core!

Wait till the FOUR-core FX-B4150 or the FX-4170 hits the shelves and reviews on those two start showing up... THEN you can have a fair comparison on prices.


----------



## Bowser

Quote:


> Originally Posted by *omninmo;15279668*
> they disabled 2 modules to simulate a FX 4110..
> 
> what we need is to see someone disable one core PER module so that none of the active ones have to share resources.. i'm hoping that is possible, but ofc can't say for sure if there is a chance to disable or massively downclock one core per module!


In the edited section they said they disabled one core per module, not two modules (cores 0, 2, 4, 6 were running).

The problem, though, was that this had to be done through Windows.

----------



## crossy82

Quote:


> Originally Posted by *AMC;15279707*
> This never gets old to me


Lol, amazing vid; made my day all the better after reading those reviews earlier.


----------



## Stuuut

Quote:


> Originally Posted by *cjc75;15279834*
> Oh by the way guys... PRICES.
> 
> You all are comparing an EIGHT CORE chip, to the price of a FOUR core 2500K?
> 
> HAH!
> 
> Well of course the EIGHT core is going to cost MORE then the FOUR Core!
> 
> Wait till the FOUR Core FX-B4150 or the FX-4170 hits the shelves and reviews on those two start showing up... THEN you can have a fair comparision on prices.


Well, it's usual for people to compare a company's top product with its competitor's top product... so that would include price.


----------



## 2010rig

Quote:


> Originally Posted by *cjc75;15279834*
> Oh by the way guys... PRICES.
> 
> You all are comparing an EIGHT CORE chip, to the price of a FOUR core 2500K?
> 
> HAH!
> 
> Well of course the EIGHT core is going to cost MORE then the FOUR Core!
> 
> Wait till the FOUR Core FX-B4150 or the FX-4170 hits the shelves and reviews on those two start showing up... THEN you can have a fair comparision on prices.


----------



## 161029

Disappointing. The high TDP seems normal for 8 cores but for the performance, it's disappointing. Looks like I'll be getting SB.


----------



## awdrifter

On AMDZone someone mentioned that the L1 cache is significantly slower than Thuban's; this could be the culprit behind the performance problems. While I didn't expect a 6-core SB beater, performing worse than Thuban per core does seem a bit too bad. Hopefully it'll be like Phenom I's TLB bug: after they fix it we'll get a 10% or so performance gain, but even then that only means it's on par with Thuban clock for clock.


----------



## guitarholic2008

Quote:


> Originally Posted by *mad0314;15279576*
> This is what we do as enthusiasts. We break it down and look at the performance of every aspect of it. You would not compare flagship to flagship when there is a $700 price difference. You would not compare Honda's flagship to Ferrari's flagship, just because both are their "flagship." They are in totally different territories. Someone that needs the power won't even think about the one that doesn't deliver it.
> 
> I do agree that these chips are much more than what the average consumer needs. But with the single thread performance being the way it is, I cannot see the point of anyone buying the quad core over a Phenom II quad core. The only advantage BD has is core count, and if you go into the lower segment that flies right out the window and you are left with only the disadvantages.


People throw money at a computer just because it's the most expensive in the lineup (Intel's 990 for example); I have seen it. I built one for someone just because he had the money, and he doesn't really do anything with it: surfing porn, Craigslist, and some minimal gaming. It's what people do to feel better than other people.

And by the way, Honda made the NSX to make people feel like they were driving a car comparable to a Ferrari!


----------



## Skylit

Aren't 8-core Bulldozers 4-module CPUs?

If so, it's perfectly justified to compare an 8-core AMD to a 4-core Intel.


----------



## black96ws6

Quote:


> Originally Posted by *AMC;15279707*
> This never gets old to me


LoL I hadn't seen that one yet, that's a great video. Love that and the one where the newswoman busts up reporting the Cinebench score


----------



## 2010rig

Quote:


> Originally Posted by *Skylit;15279911*
> Aren't 8 core bulldozer's a 4 module CPU?
> 
> If so, it's perfectly justified comparing a 8 core AMD to a 4 core Intel.


Or 8 thread vs 8 thread.

I think the whole "8 core" thing was just a marketing gimmick.

Chew* for example pointed out that Bulldozer's design was more of a 4-core / 8-thread part.


----------



## sub50hz

Quote:


> Originally Posted by *cjc75;15279834*
> Oh by the way guys... PRICES.
> 
> You all are comparing an EIGHT CORE chip, to the price of a FOUR core 2500K?
> 
> HAH!


Anyone with the correct number of chromosomes compares not _core count_, but relative performance per dollar.


----------



## finalturismo

LOL @ amd

Something has GOT to be wrong,

slower clock per clock than previous generation?

Ya something is wrong.....

If AMD's next release still sucks, then bye bye AMD forever.

Although I will still purchase AMD, because I am scared of the Intel kill switch.


----------



## HMBR

Quote:


> Originally Posted by *Bowser;15279595*
> 
> So how are the cores being disabled then?
> 
> And are you trying to compare the 4 core deneb with a 8150 (with a odd half the cores "on"?
> 
> I'm trying to clarify what you've posted in several threads. I don't think it's an apt comparison and would rather see the 4100 compared to 4 core denebs than an 8150.
> 
> **Edit** I see that they explain the process a little bit, but I'm curious on why they chose those cores on each module. I saw a graph somewhere (too many reviews) that had 8 core turbo, and then a 4 core turbo mode (which was higher than the 8 core).
> 
> Just trying to sort through some of the reviews. It's my first time I've actually been interested in a launch of a CPU


Something to keep in mind: BD has 4 "full cores" (4 modules), and each module has 2 "cores", i.e. 2 threads with some shared resources.
So a 2-module, 4-core BD will suffer against a PII X4, which has twice as much of some of that hardware. The ideal scenario for BD in this case would be to disable 1 thread per module and keep all 4 modules, but that is simply not viable; and even so, the IPC of each module is still very low compared to Intel and PII, so............

2 threads on one module vs. 2 threads on one core with HT looks bad;
1 thread on one module vs. 1 thread on an SB core without HT would look A LOT worse.
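
That module layout can be written down directly; the adjacent-pair numbering (module m owning cores 2m and 2m+1) is an assumption, but it matches the cores-0/2/4/6 experiment discussed earlier in the thread:

```python
# FX-8150 topology sketch: 4 modules, 2 integer cores per module, with the
# front end, L2, and FPU shared inside each module (assumed adjacent numbering).
MODULES = {m: (2 * m, 2 * m + 1) for m in range(4)}

def share_a_module(core_a: int, core_b: int) -> bool:
    """True iff the two logical cores contend for one module's shared resources."""
    return core_a // 2 == core_b // 2

# Taking the first core of each module gives a 4-thread set with no sharing.
one_per_module = sorted(cores[0] for cores in MODULES.values())
```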

anyway, always keep in mind that we can cause a very different impression with basically the same thing:

[chart: xbitlabs tested this game with the PhysX option enabled]

[chart: they didn't]

[chart: GPU bound]

[chart: CPU bound]

http://www.hardocp.com/article/2011/10/11/amd_bulldozer_fx8150_gameplay_performance_review/4


----------



## awdrifter

Quote:


> Originally Posted by *guitarholic2008;15279908*
> When people are throwing money at a computer just because it's the most expensive in the lineup (intel's 990 for example) which I have seen. I built one for someone just cause he had the money, he doesn't really do anything with it. Surfing porn, craigslist, and some minimal gaming. It's what people do to feel better than other people.
> 
> And by the way, Honda made the NSX to make people feel like they were driving a car compariable to a ferrari!


Honda's NSX was actually competitive with the Ferrari 348 when it first came out. It's just that Honda never bothered to significantly update the car after that, but Ferrari made two major revisions (360 and 430) during the same period.


----------



## hokiealumnus

Quote:


> Originally Posted by *cjc75;15279715*
> The difference being, that Tom's and most others did not go with AMD's full "Scorpius" platform as their test bed; using an nVidia GTX580...
> 
> While HardwareHeaven DID; by using a Sapphire TOXIC Radeon HD6950 2gb switched to its "performance" mode with all shaders unlocked.
> 
> ALSO, TOM's used cheap Crucial DRR3-1333 RAM...
> 
> While HardwareHeaven used high end Corsair Vengeance DDR3-_1866_ RAM.
> 
> Also, as for ASRock and BIOS support etc; yes ASrock did recently release a BIOS update for their 990FX Series boards that are intended to optimize the Bulldozer better then their previous BIOS Updates... THAT, you can read about in the ASRock 990FX Fan club threads here on OCN.
> 
> http://www.overclock.net/amd-motherboards/1054875-amd-asrock-extreme-series-motherboard-club.html
> http://www.overclock.net/amd-motherboards/1078815-preview-asrock-990fx-fatal1ty-990fx-professional.html
> 
> There's your difference...
> 
> Its all in the hardware and how its setup.


Sorry, try again. I used a full scorpius platform (CVF, FX-8150, HD6970) and solid ram (G.Skill Flare DDR3-1866 / 6-9-7-24). My results pretty much mirror everyone else's - the FX-8150 is just plain not as good as a 2600K.

It's not the board, as some have surmised - why on earth would AMD send the board _with_ the chip if it performed worse than someone else's board? It just doesn't make any sense. The most up-to-the-minute BIOS update was available from AMD too, so it's not that.

I think Hardwareheaven's selection of games just happened to make BD look better than other selections of games; nothing more nothing less. There's no conspiracy. HH didn't do anything wrong; it was just a different test suite.


----------



## Skylit

Quote:



> Originally Posted by *2010rig*
> Or 8 thread vs 8 thread.
> 
> I think the whole 8 core thing was just a marketing gimmick.
> 
> Chew* for example pointed out that Bulldozer's design was more of a 4 Core / 8 thread part.


Yeah. I don't understand why certain people continue to defend AMD..

They basically pulled an Intel, like when the P4s were slower than the P3s clock for clock. Funnily enough, AMD actually had the faster CPU compared to anything Intel put out back then.


----------



## Majin SSJ Eric

Quote:



> Originally Posted by *awdrifter*
> Honda's NSX was actually competitive with the Ferrari 348 when it first came out. It's just that Honda never bothered to significantly update the car after that, but Ferrari made two major revisions (360 and 430) during the same period.


You forgot the F355...


----------



## Artemus

How about changing the thread's name to All Things Bulldozer and keeping all the news in one place? There are quite a few Bulldozer threads popping up and flooding out other news posts.


----------



## phenom01

Quote:



> Originally Posted by *hokiealumnus*
> Sorry, try again. I used a full scorpius platform (CVF, FX-8150, HD6970) and solid ram (G.Skill Flare DDR3-1866 / 6-9-7-24). My results pretty much mirror everyone else's - the FX-8150 is just plain not as good as a 2600K.
> 
> It's not the board, as some have surmised - why on earth would AMD send the board _with_ the chip if it performed worse than someone else's board? It just doesn't make any sense. The most up-to-the-minute BIOS update was available from AMD too, so it's not that.
> 
> I think Hardwareheaven's selection of games just happened to make BD look better than other selections of games; nothing more nothing less. There's no conspiracy. HH didn't do anything wrong; it was just a different test suite.


So you're saying you're using the same setup that some believe is bugged, and you're getting the same results. Reallllyyy.


----------



## Liquidpain

Damn, wrong quote. This is for the guy that said the Honda was for people who wanted to drive Ferraris.
False. The NSX set the automotive world on fire with its performance in its time. It put the Raging Bull and the Prancing Horse on notice, to the point that they created the Diablo and F355. Were it not for the NSX, we would not have been graced with the FD3S, R33, R34, R35, C5, C6 (both of which were based off the FD3S), 996s, 997s, 3000GTs, and other iconic sports cars.

BD blows, BTW.


----------



## shadowtroop121

screw you AMD.

If Trinity disappoints I will abandon that tag under my username.


----------



## enri95

not sure if this was posted:
Overclocking Bulldozer:

[YouTube video]

Notice the 5.53 score in Cinebench at 4.8GHz?

I know that is the lower-end Bulldozer chip, but still...


----------



## mad0314

Quote:



> Originally Posted by *guitarholic2008*
> When people are throwing money at a computer just because it's the most expensive in the lineup (intel's 990 for example) which I have seen. I built one for someone just cause he had the money, he doesn't really do anything with it. Surfing porn, craigslist, and some minimal gaming. It's what people do to feel better than other people.
> 
> And by the way, Honda made the NSX to make people feel like they were driving a car compariable to a ferrari!


That is irrelevant as no logic is used in that situation.

And while the NSX is a great car, it still does not reach the level of the flagship European production cars, and was significantly cheaper. But if you want we can use Hyundai as an example instead, or Kia.


----------



## Hawk777th

Not to mention the NSX was designed by the great Ayrton Senna.


----------



## hokiealumnus

Quote:



Originally Posted by *phenom01*


Quote:



Originally Posted by *hokiealumnus*


Sorry, try again. I used a full scorpius platform (CVF, FX-8150, HD6970) and solid ram (G.Skill Flare DDR3-1866 / 6-9-7-24). My results pretty much mirror everyone else's - the FX-8150 is just plain not as good as a 2600K.

It's not the board, as some have surmised - why on earth would AMD send the board _with_ the chip if it performed worse than someone else's board? It just doesn't make any sense. The most up-to-the-minute BIOS update was available from AMD too, so it's not that.

I think Hardwareheaven's selection of games just happened to make BD look better than other selections of games; nothing more, nothing less. There's no conspiracy. HH didn't do anything wrong; it was just a different test suite.


So you're saying you're using the same setup that some believe is bugged, and you're getting the same results. Really?


Huh? Who believes this setup is bugged? The setup that AMD themselves sent to reviewers is bugged? I've seen people who think BD should be performing better say that, but no one with any actual knowledge of the product. I'd love it if this were all a bad dream and AMD trounced the 2600K, forcing Intel into a CPU war, but I'm sorry, that's just not the way it is.


----------



## mad0314

Quote:



Originally Posted by *Hawk777th*


Not to mention the NSX was designed by the great Ayrton Senna.


Alright, all the discussion about the NSX is completely IRRELEVANT. Substitute Kia or Hyundai instead of Honda and try to grasp the actual point of the post.


----------



## Baking Soda

My reaction: http://youtu.be/ee925OTFBCA?t=15s

Feels bad man.


----------



## awdrifter

Quote:



Originally Posted by *Majin SSJ Eric*


You forgot the F355...










I stand corrected. But my point is, the NSX was at least competitive when it came out, BD is already outperformed now.


----------



## cjc75

I suppose you guys all think that all these reviewers bought their FX chips last night, bought all their testing hardware last night... set it all up last night...

And did all their testing, and wrote their reviews, last night... all in time to have it ready for you to read this morning?

Seriously?

AMD sent out their sample chips over a month ago!

Do you really think all these reviews were suddenly generated _overnight_ the moment the chip went retail?

It was published right here on OCN - I don't recall where or in what thread exactly - but an article was posted here stating that AMD had sent out some 1,200 sample chips for reviewers and manufacturers to test...

Therefore, most of these reviews were done _a month ago_ using those sample chips... _they just weren't allowed to publish their results until now, likely due to signing an NDA with AMD_... and it's very likely that the FX chips at retail now have had some performance improvements added to them.

I would think it might be more important to wait a little longer for all this "review dust" to settle, and wait until more people have these chips and start putting them through more everyday use... and maybe some more updated reviews show up from everyday users.

...and no, I'm not trying to defend AMD here, just trying to look at this logically.


----------



## Am*

Quote:


> Originally Posted by *cjc75;15280491*
> I suppose you guys all think that all these Reviewers bought their FX Chips last night, bought all their testing hardware, last night... set it all up, last night...
> 
> And did all their testing, and wrote their Reviews last night.. all in time to have it all ready for you all to read this morning?
> 
> Seriously?
> 
> AMD sent out their sample chips over a month ago!
> 
> Do you all really think that all these Reviews were just suddenly generated _overnight_ on the very moment the chip goes Retail?
> 
> It was published right here on OCN, though I don't recall where it was or what thread exactly, but the article was published here stating that AMD had sent out some 1200 sample chips for reviewers and manufacturers to test...
> 
> Therefore, most of these reviews were done _a month ago_ using those sample chips... _they just weren't allowed to publish their results until now_... and it's very likely that the FX chips at retail now have had some performance improvements added to them.
> 
> I would think it might be a bit more important to wait a little longer for all this "review dust" to settle; and wait until more people have these chips and start putting them through more every day use... and maybe some more updated Reviews show up from more every day users.
> 
> ..and no, I'm not trying to defend AMD here, just trying to look at this logically.


This is not a graphics card, dude. Driver/BIOS updates aren't going to fix its performance by much, if at all.


----------



## Piegoodman

Quote:



Originally Posted by *cjc75*


I suppose you guys all think that all these Reviewers bought their FX Chips last night, bought all their testing hardware, last night... set it all up, last night...

And did all their testing, and wrote their Reviews last night.. all in time to have it all ready for you all to read this morning?

Seriously?

AMD sent out their sample chips over a month ago!

Do you all really think that all these Reviews were just suddenly generated _overnight_ on the very moment the chip goes Retail?

It was published right here on OCN, though I don't recall where it was or what thread exactly, but the article was published here stating that AMD had sent out some 1200 sample chips for reviewers and manufacturers to test...

Therefore, most of these reviews were done _a month ago_ using those sample chips... _they just weren't allowed to publish their results until now_... and it's very likely that the FX chips at retail now have had some performance improvements added to them.

I would think it might be a bit more important to wait a little longer for all this "review dust" to settle; and wait until more people have these chips and start putting them through more every day use... and maybe some more updated Reviews show up from more every day users.

..and no, I'm not trying to defend AMD here, just trying to look at this logically.


Bull. There are no performance enhancements in the retail chips.

AMD failed on this one. Looks like my next rig is going to be Intel-powered, seeing as the i5-2500K is on the same performance level as this piece of crap.


----------



## Anomalous

Like many people here on OCN, I was highly anticipating the release of Bulldozer. Sadly, the results are a little underwhelming. I guess my next rig is going to be an Intel machine, although I'm still hoping Trinity will have something to offer.


----------



## Papas

Quote:


> Originally Posted by *Piegoodman;15280536*
> Bull. There are no performance enhancements for the retail chips.
> 
> AMD failed this one. Looks like my next rig is going to be Intel powered seeing as the i5-2500k is on the same performance level as this piece of crap.


Lol, it's a piece of crap but has the same performance as the 2500K... You do realize how ignorant that sounded, right?

Edit: We have seen hundreds of BIOS updates and Windows updates that improved CPU performance before.


----------



## AtomicFrost

Quote:


> Originally Posted by *cjc75;15280491*
> I suppose you guys all think that all these Reviewers bought their FX Chips last night, bought all their testing hardware, last night... set it all up, last night...
> 
> And did all their testing, and wrote their Reviews last night.. all in time to have it all ready for you all to read it all, this morning?
> 
> Seriously?
> 
> AMD sent out their sample chips over a month ago!
> 
> Do you all really think that all these Reviews were just suddenly generated _overnight_ on the very moment the chip goes Retail?
> 
> It was published right here on OCN, though I don't recall where it was or what thread exactly, but the article was published here stating that AMD had sent out some 1200 sample chips for reviewers and manufacturers to test...
> 
> Therefore, most of these reviews were done _a month ago_ using those sample chips... _they just weren't allowed to publish their results until now, likely due to signing an NDA with AMD_... and it's very likely that the FX chips at retail now have had some performance improvements added to them.
> 
> I would think it might be a bit more important to wait a little longer for all this "review dust" to settle; and wait until more people have these chips and start putting them through more every day use... and maybe some more updated Reviews show up from more every day users.
> 
> ..and no, I'm not trying to defend AMD here, just trying to look at this logically.


The chips that were sent to reviewers are retail chips. They are the exact same stepping/design that AMD is sending to distributors.

Unless this OS cache issue is really causing a huge performance drop, or there is a major BIOS bug, this is how BD will perform.

It sounds like Windows 8 will handle thread scheduling with this CPU better than Windows 7 does, but by that point both Ivy Bridge and Piledriver will be ready. Unless you can get a BD setup pretty cheap, I recommend going for SB now, or waiting for SB-E (LGA2011), which is coming later this year.


----------



## mad0314

Quote:



Originally Posted by *cjc75*


I suppose you guys all think that all these Reviewers bought their FX Chips last night, bought all their testing hardware, last night... set it all up, last night...

And did all their testing, and wrote their Reviews last night.. all in time to have it all ready for you all to read this morning?

Seriously?

AMD sent out their sample chips over a month ago!

Do you all really think that all these Reviews were just suddenly generated _overnight_ on the very moment the chip goes Retail?

It was published right here on OCN, though I don't recall where it was or what thread exactly, but the article was published here stating that AMD had sent out some 1200 sample chips for reviewers and manufacturers to test...

Therefore, most of these reviews were done _a month ago_ using those sample chips... _they just weren't allowed to publish their results until now_... and it's very likely that the FX chips at retail now have had some performance improvements added to them.

I would think it might be a bit more important to wait a little longer for all this "review dust" to settle; and wait until more people have these chips and start putting them through more every day use... and maybe some more updated Reviews show up from more every day users.

..and no, I'm not trying to defend AMD here, just trying to look at this logically.


I guess it's possible that within the last month they found a huge bug that was hurting performance and fixed it, but I would imagine AMD would send the new chips to reviewers and make damn sure the old ones did not get out. The NDA was lifted a few hours before the chips went on sale, meaning stores already had them. If that were the case, I can assure you they would not lift the NDA without at least contacting reviewers first, so that reviews of the bad chips did not get out.

That said, I really do hope there is something wrong and they find and correct it soon. Either that, or roll out BD's "Phenom II" ASAP.

Edit: Oh yeah, and here's another thought to add to the "disabling half a module" discussion earlier:

Quote:



Originally Posted by *Anandtech*

There's only a single Bulldozer die. The 6 and 4 core versions simply feature cores disabled on the die. AMD insists this time around, core unlocking won't be possible on these harvested parts.


If they found a way to disable a single integer core - which I believe was said not to be possible, but I could be wrong - then maybe there's hope for a ~$120 octacore.


----------



## Piegoodman

Quote:



Originally Posted by *Papas*


Lol, it's a piece of crap but has the same performance as the 2500k....U do realize how ignorant that sounded right?


The 2500K beats it in video encoding benchmarks; I need to encode videos, so it's crap for me. You can also bet that Intel will lower prices following the release of Ivy Bridge.

Beyond that, the 2500K tramples Bulldozer in MOST of the benchmarks listed here.

http://www.anandtech.com/show/4955/t...x8150-tested/7


----------



## dzalias

Quote:



Originally Posted by *Piegoodman*


Bull. There are no performance enhancements for the retail chips.

AMD failed this one. Looks like my next rig is going to be Intel powered seeing as the i5-2500k is on the same performance level as this piece of crap.


If the chip has jumped enough generations to be on par with an i5, yet is cheaper than an i5, how is that a failure?

Genuinely curious here; I just came into this Bulldozer bonanza. Someone send me a PM and explain the situation to me.


----------



## t-ramp

Quote:



Originally Posted by *dzalias*


If the chip's skipped enough generations to be on par with an i5, yet is cheaper than an i5, how is that failure?

Genuinely curious here, I just came into this Bulldozer bonanza. Someone send me a PM and explain the situation to me.


Well, I don't think calling it "on par with" or "cheaper than" the 2500K is quite accurate, and Bulldozer's power consumption is borderline ridiculous.


----------



## AtomicFrost

Quote:


> Originally Posted by *dzalias;15280664*
> If the chip's skipped enough generations to be on par with an i5, yet is cheaper than an i5, how is that failure?
> 
> Genuinely curious here, I just came into this Bulldozer bonanza. Someone send me a PM and explain the situation to me.


The issue is that Bulldozer isn't on par with an i5 under many common workloads. On top of that, it (the 8150) costs more than a 2500K.

BD 8120 (slower than the 8150 that was reviewed), $220:
http://www.newegg.com/Product/Product.aspx?Item=N82E16819103961

vs

2500K:
http://www.newegg.com/Product/Product.aspx?Item=N82E16819115072

Also, a few reviews have overclocked an 8150 to 4.6GHz. At that point it is still slower than an i5 2500K running at stock clocks. Once you overclock the 2500K to ~4.6GHz, there is no competition.

AMD needs to get IPC back up to Phenom II levels, and boost the achievable frequencies, to remain a strong competitor in this marketplace.

I really don't expect Piledriver to accomplish this.


----------



## robwadeson

Quote:


> Originally Posted by *t-ramp;15280693*
> Well, I don't think calling it "on par" or "cheaper" than the 2500K is quite accurate, and Bulldozer's power consumption is borderline ridiculous.


Did you see the overclocked power consumption? 430W for just the CPU is ridiculous.


----------



## dzalias

Quote:



Originally Posted by *AtomicFrost*


The issue is that Bulldozer isn't on par with an i5 under many common workloads. On top of that, it (the 8150) costs more than a 2500K.

BD 8120 (slower than the 8150 that was reviewed), $220
http://www.newegg.com/Product/Produc...82E16819103961

vs

2500K
http://www.newegg.com/Product/Produc...82E16819115072


Would it be worth my time to sell my AMD mobo and buy an Intel 1155 one to get an i5?

I don't do any encoding of any sort. Just some moderate gaming and programming / development.


----------



## Benz

OK, that doesn't make any sense... Why is Hardwareheaven's review so different from the other 30? Is there an issue with ASUS 990FX boards that we don't know about?


----------



## Piegoodman

Quote:



Originally Posted by *dzalias*


Would it be worth my time to sell my AMD mobo and buy an Intel 1155 one to get an i5?

I don't do any encoding of any sort. Just some moderate gaming and programming / development.


Do you have a 990X board?

If so, it would probably save you a couple bucks to just grab Bulldozer. For performance, take the 2500K.


----------



## dzalias

Quote:



Originally Posted by *Piegoodman*


Do you have a 990X board?

If so, it would probably save you a couple bucks to just grab Bulldozer. For performance, take the 2500K.


I've got an 890 CHIV.


----------



## t-ramp

Quote:



Originally Posted by *dzalias*


Would it be worth my time to sell my AMD mobo and buy an Intel 1155 one to get an i5?

I don't do any encoding of any sort. Just some moderate gaming and programming / development.


If you already have the board, probably not. Were I you, I'd look for a used Phenom II.


----------



## AtomicFrost

Quote:



Originally Posted by *dzalias*


I've got an 890 CHIV.


What CPU are you currently using with it?

It really is difficult to say which platform you should go with. If you have a decent chip right now (a Black Edition X4 or an X6) and you haven't really had any performance issues, I would wait a bit. I have a feeling AMD will be dropping prices on BD within the next month. However, with the superior performance of the 2500K, and LGA2011 right around the corner, the number of choices increases.

I remember reading a website claiming Intel is going to release a quad-core i7 on the LGA2011 platform for around $300, which is what a current-gen 2600K goes for.

Personally, I am going to wait a bit to see how this BD launch plays out. If I can snag a 990FX board and an 8120 on the cheap (i.e., ~$325-$350), I might jump on that.


----------



## Kevlo

Quote:


> Originally Posted by *Piegoodman;15280536*
> Bull. There are no performance enhancements for the retail chips.
> 
> AMD failed this one. Looks like my next rig is going to be Intel powered seeing as the i5-2500k is on the same performance level as this piece of crap.


So you're saying the i5-2500K is a piece of crap too, right?
I mean, it's at the same performance level as BD, and since BD is a piece of crap, then the 2500K should be too, right?

Think before you post.


----------



## Piegoodman

Quote:



Originally Posted by *Kevlo*


So you're saying the i5-2500K is a piece of crap too, right?
I mean, it's at the same performance level as BD, and since BD is a piece of crap, then the 2500K should be too, right?

Think before you post.


Look at the benchmarks.

2500K > Bulldozer in most cases. That's why I want the 2500K.

Use common sense.


----------



## Allen86

Quote:



Originally Posted by *robwadeson*


did you see the overclocked power consumption? 430w for just the cpu is ridiculous _power_










430W???? Holy balls, what the crap.

Not even GPUs use that much!


----------



## dzalias

Quote:



Originally Posted by *AtomicFrost*


What CPU are you currently using with it?










I'm not. I sold my Phenom II X4 955 BE before it devalued any more and I've been waiting to drop a chip in.


----------



## Bunnywinkles

I want to know how this is a fail. Do you guys realize how much further AMD just came? If you were expecting it to beat Intel's chips, then I'm sorry, you were bound to be disappointed. I'd like to believe I'm not alone in seeing it this way.

Bulldozer was meant to catch up to Intel, which it did; it was not meant to beat them. If you believed anything else, you were trolled.


----------



## hokiealumnus

Quote:



Originally Posted by *cjc75*


I suppose you guys all think that all these Reviewers bought their FX Chips last night, bought all their testing hardware, last night... set it all up, last night...

And did all their testing, and wrote their Reviews last night.. all in time to have it all ready for you all to read it all, this morning?

Seriously?

AMD sent out their sample chips over a month ago!

Do you all really think that all these Reviews were just suddenly generated _overnight_ on the very moment the chip goes Retail?

It was published right here on OCN, though I don't recall where it was or what thread exactly, but the article was published here stating that AMD had sent out some 1200 sample chips for reviewers and manufacturers to test...

Therefore, most of these reviews were done _a month ago_ using those sample chips... _they just weren't allowed to publish their results until now, likely due to signing an NDA with AMD_... and it's very likely that the FX chips at retail now have had some performance improvements added to them.

I would think it might be a bit more important to wait a little longer for all this "review dust" to settle; and wait until more people have these chips and start putting them through more every day use... and maybe some more updated Reviews show up from more every day users.

..and no, I'm not trying to defend AMD here, just trying to look at this logically.


Hah... a month ago, that's funny. I'm sorry to say it, man, but you have no idea what you're talking about. All of us had very little lead time to write these, much less than we normally do, actually. I had our Sandy Bridge chip almost a month before the review published, same with previous Thubans; but not with Bulldozer, probably because they were tweaking everything up to the last minute.
These chips arrived in reviewers' hands - at the earliest - 12 days before the reviews published, some even less if they weren't extremely quick confirming their shipping address.
There were maybe two dozen sample chips sent to reviewers. Obviously manufacturers already had them before we did.
The chips sent to reviewers are the same stepping people are buying in stores right now.
The BIOS used is the most recent (0813) currently available on ASUS' web site.


----------



## Behemoth777

Quote:



Originally Posted by *Kevlo*


So you're saying the i5-2500K is a piece of crap too, right?
I mean, it's at the same performance level as BD, and since BD is a piece of crap, then the 2500K should be too, right?

Think before you post.


The 2500K is also much cheaper than this chip will be. I got my 2500K at Fry's for $160; that beats the crap out of this chip as far as price/performance goes.


----------



## anubis1127

Quote:



Originally Posted by *dzalias*


I've got an 890 CHIV.


Sell the board and get a 2500K for gaming. I'm pretty sure i5 2500K combos are still a lot cheaper than 990FX + BD combos.


----------



## marik123

I bought my 2500K back in April, when I was still deciding whether I should wait for Bulldozer. Glad I made that decision. For those gamers who already bought a 990X board and are disappointed with Bulldozer's performance, why not wait till Ivy Bridge comes out and then upgrade? I see no reason to upgrade your rig unless there is a game you cannot play smoothly on your current PC...


----------



## BrEnKeR

So basically AMD decided to lower IPC for more cores instead? Shouldn't it have been the other way around?


----------



## anubis1127

Quote:



Originally Posted by *BrEnKeR*


So basically AMD decided to lower IPC for more cores instead? Shouldn't it have been the other way around?


Only if they wanted to beat Intel.

j/k


----------



## AtomicFrost

Quote:



Originally Posted by *dzalias*


I'm not. I sold my Phenom II X4 955 BE before it devalued any more and I've been waiting to drop a chip in.


In that case, I guess it depends on how much coin you could get for your motherboard.

If you manage to get ~$100 for your Crosshair, you could pick up a decent 1155 motherboard for an extra $75-$100.

It might be worth doing if you need the extra performance the 2500K would give.

Also, there is the occasional deal on i5 2500Ks at some retail stores (Fry's, Micro Center); usually you can grab one for under $200. I wish there was a Micro Center down here in Florida. I always miss out on the great in-store deals.


----------



## RedCloudFuneral

Just playing with the idea here, but I'm currently forced to run my 6970 in a PCI-E x8 slot because of my large cooler and sound card. If I traded my 1155 build for a Bulldozer rig and got my card into a true x16 slot, would I be better or worse off than if I just stayed put? I've read it's about a 1-3 frame drop running at x8, but I'm not sure.


----------



## theamdman

YES and no.

YES, I got my stuff before BD came out. NO, I bought a 990FX mobo...


----------



## sub50hz

Quote:



Originally Posted by *Allen86*


430w ???? holy balls, what the crap









Not even GPU's use that much!


430W is for the whole system, not the CPU. Do you have any idea what kind of cooling would be necessary to dissipate 430W of heat from a surface area as small as a CPU lid?


----------



## AtomicFrost

Quote:


> Originally Posted by *RedCloudFuneral;15281035*
> Just playing with the idea here, but I'm currently forced to run my 6970 in a PCI-E x8 port because of my large cooler and sound card, if I traded my 1155 build for a bulldozer rig and got my card in a true x16 slot would I be better or worse than if I just stayed put? I've read its about a 1-3 frame drop running on the x8, but I'm not sure.


I would stick with your i5 2500K. The performance difference between x8 and x16 is quite small with current-generation GPUs, and your 2500K will give you a lot more FPS in any game that's CPU-limited (and doesn't properly make use of more than 4 cores). Plus, you don't have to spend any more money on your rig for a while.
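To put the x8/x16 question in rough perspective, here is a back-of-envelope of theoretical PCIe bandwidth; a sketch only, assuming a PCIe 2.0 link (5 GT/s per lane with 8b/10b encoding), which is what these boards would provide:

```python
# Rough theoretical PCIe 2.0 link bandwidth per link width.
# 5 GT/s per lane; 8b/10b encoding means 8 of every 10 line bits are data.
GT_PER_LANE = 5e9          # transfers per second per lane
ENCODING_EFFICIENCY = 0.8  # 8b/10b overhead

def pcie2_bandwidth_gbs(lanes):
    """Usable bandwidth in gigabytes per second for a PCIe 2.0 link."""
    data_bits_per_sec = GT_PER_LANE * ENCODING_EFFICIENCY * lanes
    return data_bits_per_sec / 8 / 1e9  # bits -> bytes -> GB

for lanes in (8, 16):
    print(f"x{lanes}: {pcie2_bandwidth_gbs(lanes):.1f} GB/s")

# x8 still offers roughly 4 GB/s of one-way bandwidth; a single GPU
# rarely saturates that in games, which is why the FPS hit is tiny.
```

The point being: doubling the slot width doubles theoretical bandwidth, but a single card seldom uses it, so the CPU matters far more here.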


----------



## Benz

Quote:


> Originally Posted by *Behemoth777;15280863*
> The 2500k is also much cheaper than this chip will be. I got my 2500k at frys for 160, that beats the crap out of this chip as far as price/performance goes.


It depends on where you buy it. I can't get it under €200.



Quote:


> Originally Posted by *BrEnKeR;15280918*
> So basically AMD decided to lower IPC for more cores instead? Shouldn't it have been the other way around?


Not exactly. Because they chose a modular approach they just made things worse.


----------



## cayennemist

What is the average air OC of the 2600K?
What is the estimated average air OC of BD? 4.8-5.0GHz, I have read.

How do those compare in benches?


----------



## Chewy

Quote:



Originally Posted by *cayennemist*


What is the average air OC of the 2600K?
What is the estimated average air OC of BD? 4.8-5.0GHz, I have read.

How do those compare in benches?


Have you even looked through the benchies, man?


----------



## theamdman

Quote:



Originally Posted by *Chewy*


You traded 1155 for a BD build?

Big cojones!


He made a good choice. Bye bye Sandy, hello BD!


----------



## Blameless

Quote:



Originally Posted by *cayennemist*


What is the average air OC of the 2600K?
What is the estimated average air OC of BD? 4.8-5.0GHz, I have read.

How do those compare in benches?


Average air OC is 4.5 to 5GHz on both of them, but the 2600K is considerably easier to power and cool.

A 2600K at the same speed as BD is faster all-around.


----------



## DayoftheGreek

Quote:


> Originally Posted by *sub50hz;15281080*
> 430W for the whole system, not the CPU. Do you have any idea what kind of cooling would be necessary to dissipate 430W of heat from a small surface area like that of a CPU lid?


http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/10

The first one that came up on Google shows power consumption of 580W at 4.8GHz. That is whole-system, but during a CPU benchmark; really, the CPU is using the majority of the power here, not the RAM or HDD, lol. The GPU is sitting idle. Can you imagine if they fired up a graphics benchmark?


----------



## cayennemist

Quote:



Originally Posted by *Chewy*


have you even looked through the benchies man??


Yes, and there are crap-loads of different reviews, *most on the CH-V*.

I'm asking: what is the average overclock of the 2600K on air?

What is the average overclock of BD on air?

I'm wondering at what point *OC limitations come into play and give BD an edge*. Is it at something unachievable by the average enthusiast, say 7GHz+ on LN2?

Clearly the 2600K is the winner, but some reviews with different motherboards are giving BD a little bit of a boost, e.g. the ASRock Extreme4.

http://www.hardwareheaven.com/review...roduction.html


----------



## sub50hz

Quote:



Originally Posted by *DayoftheGreek*


Really, the CPU is using a majority of the power here, not the RAM or HDD lol.


430W cannot be delivered to the socket, period.


----------



## ranger052

I7 2600K for the win


----------



## Dmac73

Quote:



Originally Posted by *sub50hz*


430W cannot be delivered to the socket, period.


You know what's bad, though? At least 310W is being pulled by the CPU. That's incredibly bad. 430W isn't the number to acknowledge; the 250W+ numbers we have seen are. 300W+ is butt ugly.


----------



## DayoftheGreek

Quote:



Originally Posted by *sub50hz*


430W cannot be delivered to the socket, period.


Well, it's sure as hell coming out of the wall and not going to the GPU. I don't know if it's ending up at the socket, the VRMs, or the chipset on the mobo, but it's way too high either way, and everything is probably VERY hot.

I guess there was a reason they had to cool it with liquid HELIUM.


----------



## sub50hz

Quote:



Originally Posted by *Dmac73*


You know what's bad though. AT LEAST 310w are being pulled from the CPU.


What calculations, pray tell, are you using to determine that? Does any review site have a meter hooked up showing draw at the socket?

I'm not denying that BD is terrible on power, but stating things as fact without any information to back it up is one of the reasons a lot of threads on OCN become trainwrecks.


----------



## RedCloudFuneral

The results are clear, we need to do more testing.


----------



## HAF_wit

As an original owner of both an FX-57 and a Sandy Bridge processor, these benchmarks disappoint me on so many levels. We need competition to drive innovation, and without a major miracle that isn't going to happen. I think what I find most disappointing is that they brought back the FX moniker, tarnishing the brand name in my eyes.

This is a sad day for us all.


----------



## Dmac73

Also, the CH5s that every big-name reviewer used (AT, [H]) came directly from AMD via the press kit, with the latest BIOS/AGESA. So if you think you see better or differing performance numbers from someone far less reputable, like Hardware Heaven, maybe consider some out-of-the-box variables here. And I'm not talking about a different motherboard.


----------



## Chewy

Quote:



Originally Posted by *Dmac73*


Also, the CH5 that every big name reviewer used(AT, [H]), were CH5's directly from AMD via the press deck kit. Latest BIOS/Agesa, so if you see what you think may be better or differing performance numbers from someone way less reputable like hardware heaven, maybe consider some out of the box variables here. And i'm not talking about a different motherboard.


Seriously ^^^^^^


----------



## Vispor

I would like to see more reviews using different boards and different chips. I hope some people here on OCN buy this and tinker around with it. I would buy now if I could afford it. I'm pretty sure I can get BD to work on my mobo.


----------



## tedman

The only thing Bulldozer has bulldozed away is my fondness and respect for AMD. How the mighty have fallen. *looks back to 2003/2004 golden years*


----------



## Kirby1

AMD's stock has really fallen compared to Intel's over the last year.


----------



## kweechy

You know, when I was planning around render node purchases a few months ago, I assumed that WORST CASE POSSIBLE would be that Bulldozer would be nothing more than an 8 core Phenom II.

I can't believe that they've managed to not even meet my expectations for the worst possible outcome.


----------



## Corrupted

Quote:



Originally Posted by *RedCloudFuneral*


The results are clear, we need to do more testing.


I don't know...they seem to be taking a real beating based on the few reviews I've seen.


----------



## xPwn

Quote:



Originally Posted by *Kirby1*


AMD's stock has really fallen compared to Intel's over the last year.


Dang, that's pretty harsh.


----------



## Anth0789

I'm kinda disappointed by Bulldozer; I thought it would have been better than this.


----------



## HowHardCanItBe

Guys, could we please watch the language in this thread? Also, please don't resort to personal attacks.


----------



## jivenjune

Quote:



Originally Posted by *BlackOmega*


Can you explain the rather large discrepancy in performance, when the only visible difference is the motherboard?


Yes, it's called damage control on AMD's part. The reliable sources unanimously agree that Bulldozer is remarkably unimpressive. The sources that indicate otherwise seem questionable at best, or merely illustrate what happens when a GPU bottleneck is induced. Even Anand explained this, noting that in a GPU-bottlenecked system AMD tends to pull slightly ahead, for reasons neither they nor anyone else could fully explain.


----------



## evilghaleon

The chips seem pretty decent for the price, and I think that if they were released at the same time as Sandy Bridge, they would have given Intel a run for their money. But with SB-E and IB coming so soon, AMD had better start dropping those prices. If they don't have the margins for that, they are in big trouble.


----------



## metal_gunjee

Quote:



Originally Posted by *kweechy*


I assumed that WORST CASE POSSIBLE would be that Bulldozer would be nothing more than an 8 core Phenom II.

I can't believe that they've managed to not even meet my expectations for the worst possible outcome.


I have to concur. 
This totally sums up my thoughts on the whole thing. It's very sad to me too, as I've kinda been an AMD fan for a while now. I'm glad I went with my gut instinct and a 2500K.

Such a major sucky disappointment.


----------



## Sickened1

So, when will these be available for purchase?

Sent from my EVO 3D using Tapatalk


----------



## pioneerisloud

Quote:



Originally Posted by *jivenjune*


Yes, it's called damage control on AMD's part. The reliable sources unanimously agree that Bulldozer is remarkably unimpressive. The sources that indicate otherwise seem questionable at best, or merely illustrate what happens when a GPU bottleneck is induced. Even Anand explained this, noting that in a GPU-bottlenecked system AMD tends to pull slightly ahead, for reasons neither they nor anyone else could fully explain.


It's honestly very EASILY possible that some motherboards are more "Bulldozer ready" than others. Maybe the BIOSes on the "worse" boards just haven't matured enough yet and aren't fully supporting the FX features? It's a real possibility if you ask me.

The potential of BD is there. Nobody can deny that...it does have potential. It's just a matter of figuring out what went wrong: whether it's a chip problem, the reviewers not overclocking properly, or a badly needed BIOS update.

Only time will tell.


----------



## Jotun

This is what happens when you work on an architecture for 5 years. You get performance that would be great 5 years ago.


----------



## Vagrant Storm

Quote:



Originally Posted by *Sickened1*


So, when will these be available for purchase?

Sent from my EVO 3D using Tapatalk


They are on Newegg.us already... might be sold out by now... for a new piece of hardware on release, I hope they are sold out by now... if they aren't, then that means the fail is too widespread.


----------



## jivenjune

Quote:



Originally Posted by *pioneerisloud*


It's honestly very EASILY possible that some motherboards are more "Bulldozer ready" than others. Maybe the BIOSes on the "worse" boards just haven't matured enough yet and aren't fully supporting the FX features? It's a real possibility if you ask me.

The potential of BD is there. Nobody can deny that...it does have potential. It's just a matter of figuring out what went wrong: whether it's a chip problem, the reviewers not overclocking properly, or a badly needed BIOS update.

Only time will tell.


I agree that it may be possible, but a lot of things are within the realm of possibility. Even within that realm, it seems highly unlikely, sadly.

But yeah, I agree. I'd like to see this thing tested on a few other boards, particularly Gigabyte and MSI, to see if the results deviate from the standard Asus board that was used.

The thing is, though... Asus isn't particularly known for poor boards or poor BIOSes, from my understanding.


----------



## Kasp1js

FYI, TechSpot also used an ASRock board.


----------



## Fierce Mullet

For those that are curious, X-bit tested with the Gigabyte 990FXA-UD5.

Also interesting is that the review thread on the Hardware Heaven forum has been closed temporarily.


----------



## Jagged_Steel

Quote:



Originally Posted by *Sickened1*


So, when will these be available for purchase?

Sent from my EVO 3D using Tapatalk


Yesterday.

Maybe tomorrow, depending on where you are.


----------



## rivaldog

Quote:



Originally Posted by *MarvinDessica*


A 2400 can be had cheaper than any Bulldozer price if we're going by what Microcenter is showing. And they sell processors either at a small loss or at what they buy them for. BD is not only going to be more expensive, but at a terrible price point.


That's garbage. That has to be complete garbage. There is no way Microcenter sells anything at a loss; they have absolutely _horrid_ prices. Newegg has a better price on nearly everything I have ever even thought about buying, and there's no way Newegg isn't making a net profit. You can't sustain a business without profit. That's off topic anyway. To make this on topic, I'm surprised Newegg is already out of stock on the 8150 and has pricing higher than TigerDirect's on it.


----------



## Clairvoyant129

Quote:



Originally Posted by *Kevlo*


So your saying the i5-2500K is a piece of crap too, right?
I mean its at the same performance level as the BD and since BD is a piece of crap then the 2500K should be too, right?

Think before you post.


Are you blind? BD is nowhere near 2500K performance. I guess fanboys just want to see what they want to see. In most scenarios the FX-8150 is even slower than the i5 2400, despite a 500MHz clock speed advantage and a 4.2GHz turbo versus the lowly i5 2400's 3.4GHz turbo. Not to mention the FX-8150 has 8 cores. BD is just plain garbage.

http://www.anandtech.com/show/4955/t...x8150-tested/5


----------



## Clipze

Quote:



Originally Posted by *jivenjune*


I agree that it may be possible, but a lot of things are within the realm of possibility. Even within that realm, it seems highly unlikely, sadly.

But yeah, I agree. I'd like to see this thing tested on a few other boards, particularly Gigabyte and MSI, to see if the results deviate from the standard Asus board that was used.

The thing is, though... Asus isn't particularly known for poor boards or poor BIOSes, from my understanding.


processors have steppings and revisions for a reason you know


----------



## Axon14

Quote:



Originally Posted by *rivaldog*


That's garbage. That has to be complete garbage. There is no way Microcenter sells anything at a loss; they have absolutely _horrid_ prices. Newegg has a better price on nearly everything I have ever even thought about buying, and there's no way Newegg isn't making a net profit. You can't sustain a business without profit. That's off topic anyway. To make this on topic, I'm surprised Newegg is already out of stock on the 8150 and has pricing higher than TigerDirect's on it.


What? Microcenter absolutely uses CPUs as loss leaders. That's their strategy: get you in on a $180 2500K and sell you the rest of their Bullcrap.


----------



## mad0314

Quote:



Originally Posted by *theamdman*


YES and no.

YES, I got my stuff before BD came out... NOOO, I bought a 990FX mobo....


told you to wait... it made no sense to buy the night before.


----------



## jivenjune

Quote:



Originally Posted by *Clipze*


processors have steppings and revisions for a reason you know


From my understanding, Piledriver, AMD's next iteration of Bulldozer, is estimated to give only about a 10 percent performance increase, and even with that revision the Bulldozer line remains relatively unimpressive. The 10 percent figure is taken directly from AMD's slides.

That being said, I don't know what a new stepping or revision would have to do with different motherboards showing different levels of performance on the same chip.


----------



## mad0314

Quote:



Originally Posted by *Axon14*


What? Microcenter is an absolute loss leader in the CPU market. That's their strategy - get you in on a $180 2500k and sell you the rest of their BS.


That's why it's in-store only. You go in saving a bunch of money, and you are more likely to buy a graphics card or something else on impulse with the extra weight in your pocket and the product in your face.


----------



## Jagged_Steel

Quote:



Originally Posted by *Clairvoyant129*


Are you blind? BD is nowhere near 2500K performance. I guess fanboys just want to see what they want to see. In most scenarios the FX-8150 is even slower than the i5 2400, despite a 500MHz clock speed advantage and a 4.2GHz turbo versus the lowly i5 2400's 3.4GHz turbo. Not to mention the FX-8150 has 8 cores. BD is just plain garbage.

http://www.anandtech.com/show/4955/t...x8150-tested/5


Looks like FX is doing ok to me.

Those looking for a reason to not buy the latest greatest processor will find one.


----------



## Kand

Quote:



Originally Posted by *Jagged_Steel*


Looks like FX is doing ok to me.

Those looking for a reason to not buy the latest greatest processor will find one.


Keep posting GPU limited benchmarks.

LIKE A BAUS.


----------



## gooface

Quote:



Originally Posted by *Jagged_Steel*


Looks like FX is doing ok to me.

Those looking for a reason to not buy the latest greatest processor will find one.


Let me guess, that was done with a single-GPU setup, right? Yep, just a 6950. What a joke of a review.


----------



## jivenjune

Quote:


> Originally Posted by *Jagged_Steel;15282323*
> 
> Looks like FX is doing ok to me.
> 
> Those looking for a reason to not buy the latest greatest processor will find one.


I would hardly call this the "latest greatest processor," but the same argument can be made about those looking to buy one of these.


----------



## djriful

Quote:


> Originally Posted by *jivenjune;15282284*
> From my understanding, Piledriver, AMD's next iteration of Bulldozer, is estimated to give only about a 10 percent performance increase, and even with that revision the Bulldozer line remains relatively unimpressive. The 10 percent figure is taken directly from AMD's slides.
> 
> That being said, I don't know what a new stepping or revision would have to do with different motherboards showing different levels of performance on the same chip.


This is going to seriously depress a large number of people. Everyone will be ditching AMD, because we can all foresee the upcoming plans, unless they scrap it.

Sent from my iPhone using Tapatalk


----------



## Papas

Quote:


> Originally Posted by *Piegoodman;15280839*
> Look at the benchmarks.
> 
> 2500K > Bulldozer in most cases. That's why I want the 2500K.
> 
> Use common sense.


Lol, we are all going off of what you said, that they were on par with each other, yet because one is AMD it's trash... so the 2500K must be trash too, since they are equal.


----------



## Demonkev666

http://www.kitguru.net/components/cpu/zardon/amd-fx-8150-black-edition-8-core-review-with-gigabyte-990fxa-ud7/

F5 BIOS UD7 tested with Bulldozer.

There is an F6E out.

Maybe they should try all the BIOSes, lol.


----------



## Liquidpain

If AMD is projecting only a 10% increase with piledriver, then they are in deep trouble. Remember 33% more cores and 50% more power?

Yea...


----------



## redalert

Quote:


> Originally Posted by *Demonkev666;15282414*
> http://www.kitguru.net/components/cpu/zardon/amd-fx-8150-black-edition-8-core-review-with-gigabyte-990fxa-ud7/
> 
> F5 bios UD7 tested with bulldozer.
> 
> there is F6E out.
> 
> maybe they should try all bios lol.


Depending on when the review was done, the F6E might not have been available.


----------



## wupah

Holy carp, did you see the power these chips use? Simply ridiculous; almost 500W at 100% usage!


----------



## oicw

Quote:


> Originally Posted by *Jotun;15281997*
> This is what happens when you work on an architecture for 5 years. You get performance that would be great 5 years ago.


What's worse? They could've simply kept the Deneb architecture, done a die shrink to 32nm, worked in more OC potential, and been further ahead than Bulldozer. Perhaps someone can do a 4.8 - 5.1GHz dry ice OC on a 1090T to compare to a BD?

They should call it "out-of-date farm tractor with a blown engine" instead.


----------



## jivenjune

Quote:


> Originally Posted by *djriful;15282387*
> This is going to seriously depress a large number of people. Everyone will be ditching AMD, because we can all foresee the upcoming plans, unless they scrap it.
> 
> Sent from my iPhone using Tapatalk


I almost wish that AMD had just indicated a higher number for the sake of offering people a little more hope... just say 20 percent and hope that the next iteration could achieve anywhere between 10 and 20 percent.

Blatantly indicating that performance is only expected to increase by 10 percent seems to imply that the next iteration is merely a correction of what Bulldozer perhaps should be at this point.

It'd still probably be slightly underwhelming by today's standards, but at least there wouldn't be such a large gap between the multi-threaded and single-threaded performance that we're currently seeing in comparisons with an i5 2500K and an i7 2600K.

What's worse is that even after that 10 percent increase is taken into account with Piledriver in 2012, the next iteration of Intel CPUs (Ivy Bridge and SB-E) is expected to net a gain of 10 to 20 percent this year alone. The gap is widening fast, and the end result isn't going to be a pretty one for AMD.
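To put rough numbers on that argument, here's a quick sketch of how the quoted generational estimates compound. The percentages are just the figures mentioned in this thread (AMD's ~10% Piledriver estimate, and a 15% midpoint of the 10-20% expected from Ivy Bridge/SB-E), not measurements, and both chips are normalized to 1.0 today purely for illustration:

```python
# Hypothetical compounding of generational gains; percentages are the
# thread's quoted estimates, not benchmark results.
amd = 1.00    # normalized FX performance today (illustrative)
intel = 1.00  # normalized Sandy Bridge performance today (illustrative)

amd_next = amd * 1.10      # Piledriver: ~10% over Bulldozer (AMD slide)
intel_next = intel * 1.15  # Ivy Bridge / SB-E: midpoint of 10-20%

gap = intel_next / amd_next - 1
print(f"relative gap after one generation: {gap:.1%}")  # -> 4.5%
```

Even starting from parity, a smaller per-generation gain means the relative gap grows every cycle; starting from behind, as the reviews suggest, it grows faster still.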


----------



## BradleyW

Oh dear, AMD... just stick to making GPUs!


----------



## srsdude

http://www.maximumpc.com/article/features/bulldozer_benchmarked_and_analyzed_amd_back_game


----------



## HMBR

Quote:



Originally Posted by *Jagged_Steel*


Looks like FX is doing ok to me.

Those looking for a reason to not buy the latest greatest processor will find one.


lol

best case scenario for the FX, when performance is limited by the VGA;
now let's see when the CPU has more weight on the final performance:

http://www.hardware.fr/articles/842-...rma-ii-oa.html

So, at best, when limited by the GPU, the FX is as good as anything (Phenom II, Core 2 Quad, i7s and so on), but in other cases, when a game depends more on the CPU, performance is a lot worse.


----------



## Fletcherea

Quote:



Originally Posted by *Kand*


You can't spell Mad without AMD.


Tee hee


----------



## lloyd mcclendon

If those last few graphs are correct, I am very glad I am not the AMD product manager... incoming.

Either something is not quite right yet, or this is really bad.


----------



## Jagged_Steel

Quote:



Originally Posted by *HMBR*


lol

best case scenario for the FX, when performance is limited by the VGA;
now let's see when the CPU has more weight on the final performance. So, at best, when limited by the GPU, the FX is as good as anything (Phenom II, Core 2 Quad, i7s and so on), but in other cases, when a game depends more on the CPU, performance is a lot worse.


Shogun 2 is one of the heaviest games for CPU usage in existence, and the FX is showing off what it can do in that test.

If you are determined to find a reason to not buy an FX, I am certain you will find one.


----------



## Majin SSJ Eric

Quote:



Originally Posted by *Jagged_Steel*


*Shogun2 is one of the heaviest games for CPU usage that exists *and the FX is showing off what it can do in that test.

If you are determined to find a reason to not buy an FX, I am certain you will find one.


Not when it's hopelessly bottlenecked by a single 6950.


----------



## HMBR

Quote:



Originally Posted by *Jagged_Steel*


Shogun2 is one of the heaviest games for CPU usage that exists and the FX is showing off what it can do in that test.

If you are determined to find a reason to not buy an FX, I am certain you will find one.


Exactly the same result, and you think it's impossible that something else (the GPU) can be limiting performance in the test they used? (Notice, the game can be CPU bound in some parts, but not always, and their benchmark is probably a more GPU-limited demo/test.) Seriously?

The one trying to handpick and take things out of context here is not me;
it's very easy to find reasons to not buy the FX, finding reasons to buy one is harder.

They weren't even able to run the game on the FX, but look at the Phenoms.

same here
http://www.hardware.fr/articles/842-...anno-1404.html
http://www.bit-tech.net/hardware/cpu...-8150-review/9
http://www.techradar.com/reviews/pc-...view?artc_pg=4


----------



## formula m

Quote:



Originally Posted by *kweechy*


You know, when I was planning around render node purchases a few months ago, I assumed that WORST CASE POSSIBLE would be that Bulldozer would be nothing more than an 8 core Phenom II.

I can't believe that they've managed to not even meet my expectations for the worst possible outcome.


Lol...

The BD architecture will inherently get faster as software developers move into 64-bit space. A BD chip now doesn't hurt anything; it's just that single benchmarks only test derivatives of single aspects. Granted, AMD didn't add enough decoders to their uarch, and PD will fix this. But the BD chip doesn't slow down the computing experience; it just doesn't meet the threshold of being a dominant chip.

Also, I can guarantee that this BD (4 years from now) will handle future software better than the current SB. It is inevitable, and spending $245 now doesn't mean your rig is slow! And since "Joe Public" doesn't buy a computer every 3 years, I see no loss for AMD. It just isn't a big win for current software.

So a Bulldozer chip is not a bad choice; it is just not the best choice if all you are concerned about is software in 2011.

Coincidentally, has anyone tested BD with BF3? (With full-on apps running in the background, under Windows 8?)

To me, benchmarking with only one piece of software running is a fallacy. Run these benches with WoW/etc. in the background and see how they stack up. That is what real-world use is about. Hardly anybody runs a single (multi-threaded) program alone... so why run benchmarks that way?

Clutter up your OS and see which chip has the beef to chug through the mayhem. That is a true benchmark of your system.

That said, I would choose a 2500K (over the FX-8150) if I made my living off benchmarking. Though if the FX-8150 were competitively priced @ $215... I'd buy it every time, because it has more value & longer legs. (Well... at least until it is EOL in Feb, lol)


----------



## Jagged_Steel

Quote:



Originally Posted by *Majin SSJ Eric*


Not when it's hopelessly bottlenecked by a single 6950.


They turned down the graphics settings to maximize the CPU test. At some point you are going to have to come to grips with the fact that a company called AMD is selling a CPU called "FX" that kicks butt at gaming. It even costs a lot less than an Intel CPU that performs about the same.


----------



## anubis1127

Quote:



Originally Posted by *Jagged_Steel*


They turned down the graphics settings to maximize the CPU test. At some point you are going to have to come to grips with the fact that a company called AMD is selling a CPU called "FX" that kicks butt at gaming. It even costs a lot less than an Intel CPU that performs about the same.


How does the $260 FX-8150 cost less than the $220 i5 2500K ($180 where I buy CPUs)?


----------



## machinehead

I had a feeling Bulldozer was way overhyped and wouldn't even beat SB. Good thing I already planned on going Ivy Bridge. But these numbers are way worse than I thought they would be.

Intel didn't like it when King Athlon ruled, and will never let the underdog take a shot at them again.


----------



## Mad Pistol

Quote:



Originally Posted by *Jagged_Steel*


*They turned down the graphics settings to maximize the CPU test.* At some point you are going to have to come to grips with the fact that a company called AMD is selling a CPU called "FX" that kicks butt at gaming. It even costs a lot less than an Intel CPU that performs about the same.


1920x1080 @ Ultra is NOT turning down the settings; that's putting stress on the GPU.

The only way we can be sure whether BD will bottleneck gaming performance is to add a second card, which no review has done yet.


----------



## anubis1127

Quote:



Originally Posted by *Mad Pistol*


1920x1080 @ Ultra is NOT turning down the settings. That's putting stress on the GPU

The only way that we can be sure if BD is going to bottleneck on gaming performance is to add in a 2nd card, which no review has done yet.


Or they have, and just didn't want to show us the results


----------



## tianh

Alright im going to the green team. PEACE!


----------



## Jagged_Steel

Quote:



Originally Posted by *HMBR*


Exactly the same result, and you think it's impossible that something else (the GPU) can be limiting performance in the test they used? (Notice, the game can be CPU bound in some parts, but not always, and their benchmark is probably a more GPU-limited demo/test.) Seriously?

The one trying to handpick and take things out of context here is not me;
it's very easy to find reasons to not buy the FX, finding reasons to buy one is harder.

They weren't even able to run the game on the FX, but look at the Phenoms.
same here
http://www.hardware.fr/articles/842-...anno-1404.html
http://www.bit-tech.net/hardware/cpu...-8150-review/9


And you can't figure out that there is some issue with drivers/setup, etc.? So I guess the Hardware Heaven guys are pure geniuses, because they figured out how to make the FX play that game, and better than a 2600 at that. You go ahead and set up your system the way those reviewers did, and I will set mine up like the HH reviewers did, deal?


----------



## Majin SSJ Eric

Quote:



Originally Posted by *Mad Pistol*


1920x1080 @ Ultra is NOT turning down the settings. That's putting stress on the GPU

The only way that we can be sure if BD is going to bottleneck on gaming performance is to add in a 2nd card, which no review has done yet.


Somebody send me an 8150 and a mobo and I'll happily plug in my 580's and see what's up!


----------



## dioxholster

It's strange that Shogun 2 shows AMD doing better, but I don't believe it's CPU intensive; I think it uses more GPU.


----------



## czin125

So the 2600K at 3.4GHz uses 155 watts, but the FX-8150 uses 223 watts at full load?


----------



## Reload_X

The reviews on Hardware Heaven are using an ASRock 990FX Extreme4, and the 8150 is ahead of the 2600K in some tests... wonder if there is a problem with the Crosshair V that every other review site is using? idk


----------



## anubis1127

Quote:



Originally Posted by *Jagged_Steel*


. You go ahead and set up your system the way those reviewers reviewers did, and I will set mine up like the HH reviewers did, deal?


Says the guy with an AMD 970 mobo and an HD 6790...


----------



## dejanh

Maybe I should just buy an FX-8150 and a matching board and do the tests myself...so we can put this all to rest...


----------



## Slappy Mcgee

Quote:



Originally Posted by *czin125*


so the 2600k at 3.4ghz uses 155 watts but the fx-8150 uses 223 watts at full load?


Remember:
Intel 4 cores = 155 watts
AMD 8 cores = 223 watts

More cores = more power

*requirements


----------



## anubis1127

Quote:



Originally Posted by *dejanh*


Maybe I should just buy an FX-8150 and a matching board and do the tests myself...so we can put this all to rest...


I say do it. I would if they had them in stock at MC, maybe by this weekend. I'd like to see how the FX 8150 does with SLI, not just crossfire.


----------



## HMBR

Quote:



Originally Posted by *Jagged_Steel*


And you can't figure out that there is some issue with drivers/setup, etc.? So I guess the Hardware Heaven guys are pure geniuses, because they figured out how to make the FX play that game better than a 2600. You go ahead and set up your system the way those reviewers did, and I will set mine up like the HH reviewers did, deal?


I found more than three reviews with issues with this single game and CPU.

Hardware Heaven did find a way to make the FX look good: they simply picked a gameplay situation clearly limited by the GPU they used, and to no one's surprise the FX performed exactly like any other "good enough" CPU for that situation, like the 2600. Now, wouldn't it be interesting if they had included the 2500, 2400, PII and so on?
This is quite obvious, but whatever; there are plenty of other tests. If you only care about one GPU-limited test, OK.


----------



## Jagged_Steel

Quote:



Originally Posted by *Reload_X*


the reviews on hardware heaven are using a ASRock 990FX Extreme4 and the 8150 is ahead of the 2600k in some tests............ wonder if there is a problem with the crosshair V that every other review site is using ???????? idk


This may very well be the case. Another thing I am seeing is that FX seems to perform best in a pure Scorpius system. The really trashy reviews I have seen all use Nvidia cards. My honest guess is that at least some of these reviews were done with intentionally bad setups.


----------



## Trogdor

Quote:



Originally Posted by *Kand*


Keep posting GPU limited benchmarks.

LIKE A BAUS.


In what world is this not a result? Do you really think most people have more than one GPU?


----------



## Jagged_Steel

Quote:



Originally Posted by *HMBR*


I found more than three reviews with issues with this single game and CPU.

Hardware Heaven did find a way to make the FX look good: they simply picked a gameplay situation clearly limited by the GPU they used, and to no one's surprise the FX performed exactly like any other "good enough" CPU for that situation, like the 2600. Now, wouldn't it be interesting if they had included the 2500, 2400, PII and so on?
This is quite obvious, but whatever; there are plenty of other tests. If you only care about one GPU-limited test, OK.


I don't play tests, I play games, and the FX kicks butt at gaming. Any other questions?


----------



## crucifix85

Quote:



Originally Posted by *dioxholster*


its strange that Shogun2 shows amd doing better, but i dont believe its CPU intensive, it uses more GPU i think.


I'm pretty sure most, if not all, RTS games are CPU intensive.


----------



## doomlord52

Quote:



Originally Posted by *Slappy Mcgee*


Remember:
Intel 4 cores = 155 watts
AMD 8 cores = 223 watts

More cores = more power

*requirements


Yet Intel's four lower-power cores outperform AMD's eight gas guzzlers.


----------



## Jagged_Steel

Quote:



Originally Posted by *crucifix85*


I'm pretty sure most, if not all, RTS games are CPU intensive.


Yes, RTS games are indeed the CPU hogs of the gaming world, and this is also the genre I generally play. This is why I posted the Shogun 2 results. It was pretty funny to immediately see the "Shogun 2 is GPU limited! Test is blargh!" posts the moment I posted them, even though it is one of the most CPU-intensive games in existence.


----------



## michintom

Quote:



Originally Posted by *Slappy Mcgee*


Remember:
Intel 4 cores = 155 watts
AMD 8 cores = 223 watts

More cores = more power

*requirements


So... my sig rig's PSU isn't enough for an 8150 OC'd?


----------



## Majin SSJ Eric

Quote:



Originally Posted by *Jagged_Steel*


I don't play tests, I play games, and the FX kicks butt at gaming. Any other questions?


I think most rational people will continue buying the 2500k for less money and better gaming performance....


----------



## TheRockMonsi

Quote:



Originally Posted by *crucifix85*


I'm pretty sure most, if not all, RTS games are CPU intensive.


They most certainly are CPU intensive.

Quote:



Originally Posted by *michintom*


So... my sig rig's PSU isn't enough for an 8150 OC'd?

It should be, if I'm not mistaken.


----------



## i7Stealth1366

By the looks of Metro 2033 at 1920x1200, BD is behind even the 2500K, and even the 2400.

All I have to say about this is FAIL.

I am sorry, AMD fanboys, but you would have to be high to spend $200+ on this CPU. Even at $150 it wouldn't be worth it, because the 2500K is about that price.


----------



## dodger.blue

Is there a review comparing the i3 2100 to the FX 4100 or 6100?


----------



## Indulgence

Quote:



Originally Posted by *pursuinginsanity*


I think it's funny that even after 50+ negative reviews that all show Thuban stomping BD you're STILL in here doing damage control. Do they pay you for this or something? Are you just that deep in denial?


That's one "diehard-fanboy-neversaynever-justinbiebersnotgay-amdBDrules" right there. Period.

Trolling aside, deymn AMD, you really failed hard this time... will you ever give Intel a good fight in terms of performance?


----------



## no1Joeno1

Quote:



Originally Posted by *Jagged_Steel*


Yes RTS games are indeed the CPU hogs of the gaming world, and this is also the genre I generally play. This is why I posted the Shogun2 results. It was pretty funny to immediately see the " Shogun2 is GPU limited! Test is blargh!" posts the moment I posted that, even though it is one of the most CPU intensive games in existence.


And it is the graphics card that renders the frames...

But yeah, go ahead and get that CPU because it does 1fps better with a GPU bottleneck.


----------



## mothergoose729

Pretty much what I expected. It is nice, at least, that in some applications it outperforms or competes closely with the i7 2600K. In consumer applications, though, it's no contest; the IPC is exactly the same as Phenom's. I wouldn't recommend Bulldozer to anybody: get a 2600K or a 2500K; there's no room for Bulldozer in between.


----------



## Slappy Mcgee

Quote:



Originally Posted by *doomlord52*


Yet intel's 4, lower-power usage cores outperform AMD's 8 gas guzzlers.


How is it a power guzzler? Here, break it down like this:
Intel 4 cores = 155 watts = 38.75 watts/core
AMD 8 cores = 223 watts = 27.875 watts/core

How is it a power guzzler?
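The per-core arithmetic in the post checks out, but it's worth noting it says nothing about efficiency on its own. A quick sketch using the wattages quoted in this thread (the thread's figures, not independent measurements):

```python
# Watts-per-core breakdown as quoted above (155 W for the 4-core Intel
# chip, 223 W for the 8-core FX under load; these are the thread's
# numbers, not measurements).
intel_watts, intel_cores = 155, 4
amd_watts, amd_cores = 223, 8

print(intel_watts / intel_cores)  # 38.75 W per core
print(amd_watts / amd_cores)      # 27.875 W per core

# Caveat: efficiency is performance per watt, not watts per core.
# A chip that draws less per core can still be less efficient overall
# if each core does less work. (No performance figures assumed here.)
```

So both sides of this argument can be right: the FX does draw fewer watts per core, and it can still be the less efficient chip once per-core performance is factored in.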


----------



## no1Joeno1

Quote:



Originally Posted by *Slappy Mcgee*


How is it a power guzzler? Here, break it down like this:
Intel 4 cores = 155 watts = 38.75 watts/core
AMD 8 cores = 223 watts = 27.875 watts/core

How is it a power guzzler?


Stop looking at the specs; look at the performance.


----------



## dodger.blue

Quote:



Originally Posted by *Indulgence*


that's one "diehard-fanboy-neversaynever-justinbiebersnotgay-amdBDrules" right there. Period.

Trolling aside, deymn AMD, you really fail hard this time.. would you ever give intel a good fight in terms of performance?


It goes both ways, though. There are Intel fans that just plain want Intel to have a monopoly for some reason. Most AMD fans use AMD for the sake of competition (of which there is *none* now), but there are those fanatics that claim that no matter what, AMD has a better value. Or, no matter what, Intel is the best solution in every single category.

People need to wake up.

The fact is, Bulldozer only works in certain workstation settings, nothing else. I see it failing at servers too, in addition to the areas where it has already failed, due to its high power draw. Intel should take the rest of the server market next year if tri-gate lives up to everything claimed of it.


----------



## Jagged_Steel

Quote:



Originally Posted by *no1Joeno1*


And it is the graphics card which renders the frames...

But yeah, go ahead and get that CPU because it does 1 fps better with a GPU bottleneck.


I am OK with this idea. 1 frame better for $100 less = WIN in my book.

Are you honestly suggesting that paying $100 more for less performance is a good deal?







I guess if you can't grasp this aspect of how AMD is a price/performance winner then you probably never will.


----------



## i7Stealth1366

Quote:



Originally Posted by *Slappy Mcgee*


How is it a power guzzler? Here, break it down like this:
Intel 4 cores = 155 watts = 38.75 watts/core
AMD 8 cores = 223 watts = 27.875 watts/core

How is it a power guzzler?


Mmm, let's not go calling BD a true 8-core. If they had put two Phenom II BEs together, it would have double the performance of BD.


----------



## sub50hz

Quote:



Originally Posted by *mothergoose729*


Pretty much what I expected. It is nice that at least in some applications it outperforms or competes closely with the i7 2600k. In consumer applications, though, it's no contest; the IPC is about the same as Phenom. I wouldn't recommend Bulldozer to anybody: get a 2600k or a 2500k, there's no room for Bulldozer in between.


I think the biggest thing working in Intel's favor is incredibly good performance on a platform that doesn't cost the end-user an arm and a leg.


----------



## doomlord52

Quote:



Originally Posted by *Slappy Mcgee*


How is it a power guzzler? Here, break it down like this:
Intel 4 cores = 155 watts = 38.75 watts/core
AMD 8 cores = 223 watts = 27.875 watts/core

How is it a power guzzler?


AMD uses more power to do less. Hence, Intel has a greater power/performance ratio.


----------



## anubis1127

Quote:



Originally Posted by *dodger.blue*


Is there a review comparing the i3 2100 to the FX 4100 or 6100?


I'm pretty sure the Czech site compared those. I remember seeing it on one of the sites I couldn't read.


----------



## DayoftheGreek

Quote:



Originally Posted by *Slappy Mcgee*


How is it a power guzzler? Here, break it down like this:
Intel 4 cores = 155 watts = 38.75 watts/core
AMD 8 cores = 223 watts = 27.875 watts/core

How is it a power guzzler?


You can bullcrap the math however you want, but when you plug it into the wall and run it at full load, you will use way more power and get way less done.

You can divide by transistors, cores, threads, modules, letters in the name, gigahurtz, years before release, TEARS OF AMD, or whatever else you want. IT USES WAY MORE POWER. And it gets less done.


----------



## dodger.blue

Quote:



Originally Posted by *Slappy Mcgee*


How is it a power guzzler? Here, break it down like this:
Intel 4 cores = 155 watts = 38.75 watts/core
AMD 8 cores = 223 watts = 27.875 watts/core

How is it a power guzzler?


That is all well and good and makes sense, but the i7 2600 outperforms even an overclocked FX 8-core at 8 threads while consuming 69% of the power. So each Intel core might consume a little more, but overall the chip accomplishes more with less power.

Quote:



Originally Posted by *Jagged_Steel*


I am OK with this idea. 1 frame better for $100 less = WIN in my book.

Are you honestly suggesting that paying $100 more for less performance is a good deal?







I guess if you can't grasp this aspect of how AMD is a winner then you probably never will.


$100 less than what? AMD FX is far too expensive!


----------



## anubis1127

Quote:



Originally Posted by *Jagged_Steel*


I am OK with this idea. 1 frame better for $100 less = WIN in my book.

Are you honestly suggesting that paying $100 more for less performance is a good deal?







I guess if you can't grasp this aspect of how AMD is a price/performance winner then you probably never will.


I don't understand why you don't understand that the fx8150 is $80 more than an i5 2500k.

Somebody worried about gaming price/performance certainly isn't going to be looking at an i7 in the first place.

Sent from my DROIDX using Tapatalk


----------



## anubis1127

Quote:



Originally Posted by *hydropwnics*


but will it run crysis?


It does, just slower than the competition, including other AMD offerings.


----------



## Jagged_Steel

Quote:



Originally Posted by *anubis1127*


I don't understand why you don't understand that the fx8150 is $80 more than an i5 2500k.

Somebody worried about gaming price/performance certainly isn't going to be looking at an i7 in the first place.

Sent from my DROIDX using Tapatalk


 FX is pretty even or slightly better than the 2600 in DX-11 gaming, and costs $100 less. It outperforms the 2500, and costs a little more. Seems about right to me. Why is this confusing to you?


----------



## Slappy Mcgee

OMG, guys, were you not following the conversation and how it began? If you had followed my comment back to the post it was originally replying to, you would see I was not arguing performance/watt.

What was posted:

Quote:



Originally Posted by *czin125*


so the 2600k at 3.4ghz uses 155 watts but the fx-8150 uses 223 watts at full load?


My response:

Quote:



Originally Posted by *Slappy Mcgee*


Remember
Intel 4 cores = 155 watts 
AMD 8 cores = 223 watts

More cores = more power







*requirements


So doomlord chimes in:

Quote:



Originally Posted by *doomlord52*


Yet Intel's four lower-power cores outperform AMD's eight gas guzzlers.


To which I replied:

Quote:



Originally Posted by *Slappy Mcgee*


How is it a power guzzler? Here, break it down like this:
Intel 4 cores = 155 watts = 38.75 watts/core
AMD 8 cores = 223 watts = 27.875 watts/core

How is it a power guzzler?


I was just pointing out that they really are not using more power per core.


----------



## anubis1127

Quote:



Originally Posted by *Jagged_Steel*


So if I play a game and everything performs perfectly, how am I "getting less done"? I don't get it. It either does what you want it to, or it doesn't.


Because you are playing the game slower, and using more power, than you would be with an i5 2500k. Not to mention you paid more money. To me, that's "getting less done".


----------



## mad0314

Quote:



Originally Posted by *Jagged_Steel*


So if I play a game and everything performs perfectly, how am I "getting less done"? I don't get it. It either does what you want it to, or it doesn't.


Well then, by your logic, get a 955. Is there any game you want to run that it can't handle? If there is, get a 1090T. That doesn't make it as good as a 2500K, though. These benchmarks' purpose is to look at the performance of each CPU, and Bulldozer is underwhelming in most.

The funny thing is, those chips all beat BD in IPC, which is what matters in games, not core count.


----------



## amstech

Wow this thread is nasty!

I think the expectations for Bulldozer were too high.
It performs quite well IMO... at least a company is challenging Intel.


----------



## BloodyRory

I'm glad I bought Sandy Bridge now.


----------



## dodger.blue

Quote:



Originally Posted by *Jagged_Steel*


FX is pretty even or slightly better than the 2600 in DX-11 gaming, and costs $100 less. It outperforms the 2500, and costs a little more. Seems about right to me. Why is this confusing to you?


It costs about $35 less. In an intelligent society, you usually need to provide evidence or else your claims tend to be dismissed.

http://www.newegg.com/Product/Produc...82E16819103960

vs.

http://www.newegg.com/Product/Produc...Tpk=i7%202600k


----------



## kurt1288

Quote:



Originally Posted by *amstech*


Wow this thread is nasty!

I think the expectations for Bulldozer were too high.
It performs quite well IMO... at least a company is challenging Intel.


There's really no reason a company should release a CPU with a new architecture that is supposed to carry them forward, and have it underperform or merely equal the chips they already sell. And yet it equals or underperforms the competition....

I didn't have any expectations for BD. I didn't follow it at all. Seeing the reviews and benchmarks, it's sad.


----------



## Kand

Quote:



Originally Posted by *dodger.blue*


It costs about $35 less. In an intelligent society, you usually need to provide evidence or else your claims tend to be dismissed.

http://www.newegg.com/Product/Produc...82E16819103960

vs.

http://www.newegg.com/Product/Produc...Tpk=i7%202600k


http://www.microcenter.com/single_pr...uct_id=0354589


----------



## BloodyRory

I did notice that it does very well on video rendering and competes with the 980X there. Not too shabby for video encoding.


----------



## Iceman23

Quote:



Originally Posted by *formula m*


Lol...

The BD architecture will inherently get faster as software developers move into 64-bit space. A BD chip now doesn't hurt anything; it's just that single benchmarks only test derivatives of single aspects. Granted, AMD didn't add enough decoders to their uarch, and PD will fix this. But the BD chip doesn't slow down the computing experience; it just doesn't meet the threshold of being a dominant chip.

Also, I can guarantee that this BD (4 years from now) will handle future software better than the current SB. It is inevitable, and spending $245 now doesn't mean your rig is slow! And since "Joe Public" doesn't buy a computer every 3 years, I see no loss for AMD. It just isn't a big win for current software.


4 years? Do you know how long that is in computer time? Buying a CPU that will only reach its full potential in 4 years is a terrible, terrible idea. By that time something far, far faster will be out. And this is all based on the assumption that software will suddenly undergo a REVOLUTION and somehow begin utilizing 8 full cores, rather than the 2-4 that the VAST majority of software uses. You can go to the grave saying it's a good purchase, but really, wake up and smell the flowers.


----------



## james8

Quote:



Originally Posted by *dodger.blue*


It costs about $35 less. In an intelligent society, you usually need to provide evidence or else your claims tend to be dismissed.

http://www.newegg.com/Product/Produc...82E16819103960

vs.

http://www.newegg.com/Product/Produc...Tpk=i7%202600k


not to mention the 8150 is "out of stock" so it's not like you can get it anyway


----------



## Jagged_Steel

Quote:



Originally Posted by *ekg84*


Hmm, I think some people keep having really bad hallucinations. More performance? OK.

BFBC2 is known as a CPU-intensive game:

(benchmark graph)


Why is it that other sites are able to show better results? Perhaps there are some setup/driver/money-motivation issues that some sites are running into.

The OverClockersClub team managed completely different test results for BFBC2:









So, what you are thinking is that nobody else is going to be able to tune these things properly, and I happen to know that is not the case. If these guys can get these results then so can others, probably even better than this. We are on OCN are we not?


----------



## HMBR

Quote:



Originally Posted by *amstech*


Wow this thread is nasty!

I think the expectations for Bulldozer were too high.
It performs quite well IMO... at least a company is challenging Intel.



To be honest, my expectations were never too high,
but I really didn't expect it to be 7% slower than a Phenom II (IPC), to be released at only 3.6GHz (if it really needs a high clock to perform), to use so much power when overclocked, and to look generations apart from Intel in terms of IPC (40-50% in many cases).

So yes, when you have something like rendering, encoding, or file compression, it does a good job, because 8 less capable "cores" can still be competitive against 4. But much software, including games, can't use as many as 8 "integer cores", and BD has other bottlenecks.


----------



## dodger.blue

Quote:



Originally Posted by *Slappy Mcgee*


OMG, guys, were you not following the conversation and how it began? If you had followed my comment back to the post it was originally replying to, you would see I was not arguing performance/watt.


So your calculation was pointless then.

Who cares what an individual core consumes if you jam a bunch of them together and make them consume more? It would be forgivable if the cores themselves were efficient, but they aren't.

Face it, Intel's 8-thread solution is better and more efficient than AMD's.


----------



## anubis1127

Quote:



Originally Posted by *Jagged_Steel*


FX is pretty even or slightly better than the 2600 in DX-11 gaming, and costs $100 less. It outperforms the 2500, and costs a little more. Seems about right to me. Why is this confusing to you?


I'm not the one confused. The 2500k is clearly the better price/performance CPU available at the moment. It does better in nearly every gaming scenario, costs less, and uses less power.

Not to mention it scales well with multiple GPUs, something I don't have much faith in for the FX series at the moment, given its lower IPC.

Sent from my DROIDX using Tapatalk


----------



## mad0314

Quote:



Originally Posted by *Jagged_Steel*


Why is it that other sites are able to show better results? Perhaps there are some setup/driver/moneymotivation issues that some sites are running into.

The OverClockersClub team managed competely different test results for BF2









So, what you are thinking is that nobody else is going to be able to tune these things properly, and I happen to know that is not the case. If these guys can get these results then so can others, probably even better than this. We are on OCN are we not?


Lmao... keep showing more GPU bottlenecked benchmarks as a CPU argument.


----------



## theamdman

I think that Bulldozer has a great amount of performance, considering no games can fully use all the cores; you would have to run two copies of a game at once. Too much power at one time.


----------



## aznofazns

I don't see how anyone can argue in favor of Bulldozer's price/performance. It barely beats the 2500K in heavily threaded tasks, loses immensely in less threaded tasks, costs the same ($220 for FX-8120), and uses much, much more power.

I think Bulldozer would be "ok" if it were priced $50 cheaper.


----------



## kurt1288

Quote:



Originally Posted by *formula m*


Lol...

The BD architecture will inherently get faster as software developers move into 64-bit space. A BD chip now doesn't hurt anything; it's just that single benchmarks only test derivatives of single aspects. Granted, AMD didn't add enough decoders to their uarch, and PD will fix this. But the BD chip doesn't slow down the computing experience; it just doesn't meet the threshold of being a dominant chip.

Also, I can guarantee that this BD (4 years from now) will handle future software better than the current SB. It is inevitable, and spending $245 now doesn't mean your rig is slow! And since "Joe Public" doesn't buy a computer every 3 years, I see no loss for AMD. It just isn't a big win for current software.


I bought my CPU about 4 years ago. I'm now behind quite a few releases. As already mentioned, 4 years is forever in computer tech. "Joe Public" doesn't buy a computer every 3 years, but this CPU isn't really marketed toward them. Joe Public doesn't need 8 cores. People who would want 8 cores are looking for performance, and this chip lacks that. Maybe in 4 years it won't, but some companies will buy hardware and then not upgrade it for years.


----------



## Fuell

Geez, some people need to grow up, on both sides. Sure, FX may be a letdown for gamers and such, but for some people it's a great CPU.
Check the Xbit review pages here and here.

I mean, it keeps up with and sometimes beats both the 2500K and 2600K. But overall I'd still call it a letdown, just because of its backwards progress in things like gaming and what most people would use a consumer CPU for... This chip was obviously designed for server and workstation workloads, and that just rubs a lot of people the wrong way, it seems... But when it's in an environment that can take advantage of the architecture of the CPU, BD just shines.

So overall not that great, and it's overpriced, but it does have some kick in it... I just wanted it to be in a different way







Oh well... here's hoping AMD works out the kinks of a totally new arch on a new node; I wish them the best of luck. We all need them to pull a rabbit out of a hat for the next showing, or performance progress will stall a little...


----------



## Deacon

This is sad news for me; I was truly hoping for the FX. I'm in the middle of a rig change, and this just pushed me away from the FX-8150. I might just wait for Ivy Bridge after all, or just go with the 2700k; not sure yet. But I think AMD needs to step up their game and give Intel a run for its money, otherwise we'll have Intel monopolizing the processor market, and that is bad for consumers.


----------



## kev012in

Well, now that the official benchmarks are out, I have successfully wasted my time waiting on this turd. I guess it's time to start planning my upgrade to Sandy Bridge. So sad; barely a challenge to the 2600k. You know, over the years I'm more of a reader than a poster, and it takes quite a topic for me to post. This is absurd, AMD; such a letdown. Once again, thanks for wasting a lot of people's time holding out for your "next big thing".


----------



## Sickened1

Quote:



Originally Posted by *mad0314*


Lmao... keep showing more GPU bottlenecked benchmarks as a CPU argument.


How about this then:

http://www.overclockersclub.com/revi..._fx8150/12.htm


----------



## dodger.blue

Quote:



Originally Posted by *Fuell*


Geez, some people need to grow up, on both sides. Sure, FX may be a letdown for gamers and such, but for some people it's a great CPU.
Check the Xbit review pages here and here.

I mean, it keeps up with and sometimes beats both the 2500K and 2600K. But overall I'd still call it a letdown, just because of its backwards progress in things like gaming and what most people would use a consumer CPU for... This chip was obviously designed for server and workstation workloads, and that just rubs a lot of people the wrong way, it seems... But when it's in an environment that can take advantage of the architecture of the CPU, BD just shines.

So overall not that great, and it's overpriced, but it does have some kick in it... I just wanted it to be in a different way







Oh well... here's hoping AMD works out the kinks of a totally new arch on a new node; I wish them the best of luck. We all need them to pull a rabbit out of a hat for the next showing, or performance progress will stall a little...


Workstations, but probably not servers. Cost saving is a huge factor in that. Bulldozer pretty much chucks AMD's previous "performance per watt" server strategy out the window.

The thing is, the 8150 performs better than the 2500k in many workstation applications, but in most cases it is handily beaten by the 2600k.


----------



## Fuell

Quote:



Originally Posted by *dodger.blue*


Workstations, but probably not servers. Cost saving is a huge factor in that. Bulldozer pretty much chucks AMD's previous "performance per watt" server strategy out the window.

The thing is, the 8150 performs better than the 2500k in many workstation applications, but in most cases it is handily beaten by the 2600k.


Which, if priced right (lower the price, AMD), there would be a market for... and it would make a good upgrade for existing platform owners... but the single-threaded and gaming performance really is disappointing... I'm crossing my fingers it's due to both being on a newer 32nm node and being a completely new, radically different arch... though that's kind of a cop-out, because AMD should have had time to raise performance on the other side of things as well... Can't say it's a fail... but it's not a win... especially at the price... what the heck...


----------



## mad0314

Quote:



Originally Posted by *Sickened1*


How about this then:

http://www.overclockersclub.com/revi..._fx8150/12.htm


That's not a game...


----------



## Jagged_Steel

Quote:



Originally Posted by *mad0314*


Lmao... keep showing more GPU bottlenecked benchmarks as a CPU argument.


?? I didn't post both of those, FYI. Somebody posted a BFBC2 performance graph that showed FX performing badly, and I posted one showing it performing well in the very same game. Which review was it that you were trying to poo-poo?


----------



## swindle

Quote:



Originally Posted by *Sickened1*


How about this then:

http://www.overclockersclub.com/revi..._fx8150/12.htm


And it's a terrible review.

Look at the BFBC2 benchmarks...

You have all those processors at virtually the exact same level, from the 980X to the 8150 to Phenom IIs...

It's ******ed.


----------



## mad0314

You don't find it funny that, across a WIDE range of processors, most of their benchmarks were all within 1-3 FPS?


----------



## anubis1127

Quote:



Originally Posted by *swindle*


It's ******ed.


Come on, watch your mouth. That is uncalled for.


----------



## missingno

lol this bulldozer thing sounds like a big overblown pile of fail


----------



## anubis1127

Quote:



Originally Posted by *mad0314*


You don't find it funny that, across a WIDE range of processors, most of their benchmarks were all within 1-3 FPS?


Maybe they were just trying to show that in those games, at those settings, with a single GPU, you will be limited by the GPU.


----------



## anubis1127

Oops. Anyway, I'm just going to stop now.

Probably going to buy a fx8150 to play around with, and rack up my electricity bill.


----------



## aznofazns

Quote:



Originally Posted by *anubis1127*


Oops. Anyway, I'm just going to stop now.

Probably going to buy a fx8150 to play around with, and rack up my electricity bill.


Sounds like a plan. Post some benchmarks and overclocking results in the AMD section once you do.


----------



## Paladin Goo

Glad I didn't hesitate when picking up my Intel system when I switched from my 1090T.


----------



## Papas

Quote:



Originally Posted by *HMBR*


I found more than 3 reviews with issues with this single game and CPU.

Hardware Heaven did find a way to make the FX look good: they simply picked a gameplay situation clearly limited by the GPU they used, and to no one's surprise the FX performed exactly like any other "good enough" CPU for that situation, like the 2600. Now, wouldn't it be interesting if they had included the 2500, 2400, PII, and so on?
This is quite obvious, but whatever; there are plenty of other tests. If you only care about one GPU-limited test, OK.


Can someone please explain what it means when the game is GPU-limited and frame rates go up for the BD over the i7/i5? The way you are explaining it, if the game is GPU-limited and relying heavily on the CPU, then BD should have lower frame rates, not higher. That is, unless I'm missing the point.


----------



## ekg84

Quote:



Originally Posted by *Jagged_Steel*


?? I didn't post both of those, FYI. Somebody posted a BF2 performance graph that showed FX performing badly, and I posted one showing it performing well in the very same game. Which review was it that you were trying to poo poo?










lol, here are the settings they used in the techreport review.










Techreport's settings are lower, which means the result is less GPU-bound. That's why you got a different result with the overclockersclub review.

And here are the overclockersclub settings:

4x AA
16x AF
Global settings = High

I hope that answers your question.


----------



## kdashjl

core i5 2500k here i come...


----------



## coolhandluke41

Since this is a "review" thread, I think this one deserves to be in the mix. Found it in the AMD section, thanks to Mighty Neph:
"I've found this to be a very objective approach in analysis of Bulldozer. Very informative."
It's a long one, so you can start watching at around the 18:17 mark.

(YouTube video)


----------



## SirWaWa

Intel wins.
Intel probably knew... that's why they acted as if BD never existed.


----------



## Jagged_Steel

Quote:



Originally Posted by *ekg84*


lol, here are the settings they used in the techreport review.










Techreport's settings are lower, which means the result is less GPU-bound. That's why you got a different result with the overclockersclub review.

And here are the overclockersclub settings:

4x AA
16x AF
Global settings = High

I hope that answers your question.


No, my question was which review the last attacker was trying to poo-poo. Thanks for all the graphs. Does this mean that the reviews showing FX kicking butt in games are going to disappear just because you managed to dig one up that makes it look bad? You have lost me here.


----------



## Clairvoyant129

Obviously, if you increase the resolution and the quality settings, it will become GPU-bound. But looking at those results, it's safe to say BD gaming performance is garbage. I don't know what Jagged_Steel is talking about when he claims it performs better than the 2600k; BD can't even compete with the PII X6s.









Quote:



Originally Posted by *Jagged_Steel*


No, my question was which review the last attacker was trying to poo-poo. Thanks for all the graphs. Does this mean that the reviews showing FX kicking butt in games are going to disappear just because you managed to dig one up that makes it look bad? You have lost me here.


To prove that if you increase the resolution, all CPUs will start performing similarly in GPU-bound games. But why would you pick up a $280 FX-8150 when you can get a 2500K that will outperform it in almost every scenario?


----------



## DayoftheGreek

Quote:



Originally Posted by *Jagged_Steel*


No, my question was which review the last attacker was trying to poo-poo. Thanks for all the graphs. Does this mean that the reviews showing FX kicking butt in games are going to disappear just because you managed to dig one up that makes it look bad? You have lost me here.


No, I'm pretty sure you can pick pretty much any benchmark or any game at random to make BD look bad. He has plenty more benchmarks to post. BD certainly isn't doing itself any favors.

EDIT: Oh look, there they are now! Yeah, most of them look like that. Pretty much every game benchmark at something other than 1080p ultra settings with a good GPU puts BD below the X6, and even the X4 sometimes.


----------



## Code-Red

Wow, this is a bigger launch failure than Fermi was.

I'm truly saddened by this. It's been years since I rocked an AMD computer. With results like this, it looks like it's going to be a few more.


----------



## lordikon

Quote:


> Originally Posted by *Slappy Mcgee;15283539*
> How is it a power guzzler? Here break it down liked this
> Intel 4 cores = 155 watts = 38.75 watts/core
> AMD 8 cores = 223 watts = 27.875 watts /core
> 
> How is it a power guzzler?


They're both eight threads; the OS couldn't care less whether that's 4 cores with Hyper-Threading giving 8 threads, or 8 cores in 4 "modules". Here are the few things that matter to most people on OCN:

1.) Price/performance: Intel is the winner here in most categories.
2.) Power/performance: AMD is epic-failing this one with any of its "high performance" CPUs.
3.) Pure performance: goes to Intel, no contest.

AMD fans, answer me this: you save $60 buying an AMD FX over an Intel 2600k, get performance that isn't even close to the 2600k, and then end up paying most of that $60 back on your electricity bill. Seriously, you'd have to be high or just completely irrational to want Bulldozer for any reason.

And for the record, let me state that I've owned numerous AMD rigs over the years; I'm a performance enthusiast, not some hopeless fanboy, and at the moment AMD has nothing to bring to the table.
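The electricity-bill claim above can be sanity-checked with back-of-the-envelope numbers. A minimal sketch: the 68 W full-load gap comes from the 155 W and 223 W figures quoted earlier in the thread, while the daily usage hours and price per kWh below are illustrative assumptions, not measurements:

```python
# Full-load draw gap quoted in this thread; the usage and price values are assumptions.
gap_watts = 223 - 155          # FX system vs. 2600K system at full load, in watts
hours_per_day = 4              # assumed hours of full-load use per day (gaming/encoding)
price_per_kwh = 0.12           # assumed electricity price in $/kWh

kwh_per_year = gap_watts * hours_per_day * 365 / 1000
cost_per_year = kwh_per_year * price_per_kwh
print(f"~{kwh_per_year:.0f} kWh/yr, ~${cost_per_year:.2f}/yr extra")
```

Under these particular assumptions the gap works out to roughly $12 a year, so how quickly the savings get "paid back" depends heavily on usage hours and local rates.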


----------



## Clairvoyant129

Quote:



Originally Posted by *Code-Red*


Wow, this is a bigger launch failure than Fermi was.

I'm truly saddened by this. It's been years since I rocked an AMD computer. With results like this, it looks like it's going to be a few more.


I know what you mean. I sold my PC to wait for BD. Looks like I can safely pick up SB or SB-E.


----------



## HowHardCanItBe

I'll remind everyone again. Could we all act like *civilized *human beings for once in this thread?


----------



## Jagged_Steel

Quote:



Originally Posted by *DayoftheGreek*


No, I'm pretty sure you can pretty much pick any benchmark or any game at random to make BD look bad. He has plenty of more benchmarks to post. BD certainly isn't doing itself any favors.

EDIT: Oh look, there they are now! Yeah, most of them look like that. Pretty much every games benchmark on something other than 1080p and ultra settings with a good GPU puts BD worse than X6 and X4 even sometimes.












I see the FX on top here. Are you saying that if you bought an FX system, you would not be capable of getting it to function as well as these guys?


----------



## RagingCain

I keep popping in hoping to see that there was some massive... magical fairy microcode or BIOS fix. Any news on this front?

I see mention of Linux kernels needing to be adjusted to compensate, though supposedly Windows should not have the same issue. Something about the memory-address indexing of the level 2 cache...

Jagged... it's bad, give it up.


----------



## mjpd1983

Quote:



Originally Posted by *lordikon*


They're both eight threads; the OS couldn't care less whether that's 4 cores with Hyper-Threading giving 8 threads, or 8 cores in 4 "modules". Here are the few things that matter to most people on OCN:

1.) Price/performance: Intel is the winner here in most categories.
2.) Power/performance: AMD is epic-failing this one with any of its "high performance" CPUs.
3.) Pure performance: goes to Intel, no contest.

AMD fans, answer me this: you save $60 buying an AMD FX over an Intel 2600k, get performance that isn't even close to the 2600k, and then end up paying most of that $60 back on your electricity bill. Seriously, you'd have to be high or just completely irrational to want Bulldozer for any reason.

And for the record, let me state that I've owned numerous AMD rigs over the years; I'm a performance enthusiast, not some hopeless fanboy, and at the moment AMD has nothing to bring to the table.


Perfect.


----------



## Derp

Quote:



Originally Posted by *5entinel*


I'll remind everyone again. Could we all act like *civilized *human beings for once in this thread?


People aren't being uncivilized, Jagged_Steel is riling other users up with absolute crap. Any reminders need to be directed at him.


----------



## HowHardCanItBe

Please refrain from using provocative words like fanboy.


----------



## savagebunny

I shall wait for the next revision. Seems like my 955 will long live on!


----------



## Disturbed117

Better not back talk to mod people lol
Finally coming out of my depression now.


----------



## lordikon

Quote:



Originally Posted by *Derp*


People aren't being uncivilized, Jagged_Steel is riling other users up with absolute crap. Any reminders need to be directed at him.


Agreed. He comes into all manner of Bulldozer threads making claims that are in direct conflict with 99% of the information out there, riling everyone up, and when people try to step in with valid information he just ignores it and continues on anyway. Well, at least that's what I can gather from the quotes of him; I personally have him on my ignore list because I couldn't stand it any more.


----------



## Dr. Zoidberg

Quote:



Originally Posted by *Jagged_Steel*











I see the FX on top here. Are you saying that if you bought an FX system that you are not capable of getting it to function as well as these guys?










Why do you keep bringing up this single benchmark to justify Bulldozer's lackluster performance? This benchmark clearly shows that the game is GPU limited. Anyway, the mere 3 FPS difference is probably within the margin of error.


----------



## gooface

Quote:



Originally Posted by *mjpd1983*


Perfect.


This is logic. No one can ignore this.


----------



## Jagged_Steel

Quote:



Originally Posted by *Derp*


People aren't being uncivilized, Jagged_Steel is riling other users up with absolute crap. Any reminders need to be directed at him.


So posting graphs and charts from reviews and asking questions about the results I am seeing is now crap? AFAIK this is a thread about the reviews, and all I see is everybody focusing on the negative and completely ignoring the positive reviews. I happen to know that if some credible sources are reporting good results, then those results can be repeated. If you want to focus solely on the bad reports and not buy an FX, you are quite free to do so.


----------



## Papas

Love how no one answered my question... guess they can't explain it either.


----------



## jprovido

How's the crow tasting, Jagged_Steel? Accept it, dude. Defending Bulldozer is a lost cause.


----------



## gooface

Quote:



Originally Posted by *Dr. Zoidberg*


Why do you keep bringing up this single benchmark to justify Bulldozer's lackluster performance? This benchmark clearly shows that the game is GPU limited. Anyway, the mere 3 FPS difference is probably within the margin of error.


They are using a 6950, a pretty weak GPU for testing the CPU out.


----------



## Accuracy158

Meh... gaming is one of the more important things for me, and that's not really BD's strong point. I really have no need to upgrade my i5-760.


----------



## HMBR

Quote:



Originally Posted by *Papas*


Can someone please explain what it means when the game is GPU limited but frame rates go up for BD over the i7/i5? The way you are explaining it, if the game is GPU limited and relying heavily on the CPU, then BD should have lower frame rates, not higher. That is, unless I'm missing the point.



If the game is GPU limited, performance should be the same on any CPU (provided the CPU is fast enough). The difference you see on that Shogun 2 graph is almost nothing; it is within the margin of error, as in any GPU-limited test. If you add another card (SLI/CF) or a faster one (or lower the details), things can change, because then the GPU is fast enough that it is no longer the main limit on frame rate.

Keep in mind that in the same game you can go from GPU limited to CPU limited quite easily, depending on the scene, the action, what's happening, and how fast your CPU/GPU are.

Here they tried to isolate a CPU-limited scenario in the same game:
http://www.bit-tech.net/hardware/cpu...-8150-review/9

Unfortunately the FX crashed (as it did in this game in at least three other reviews), but you can clearly see a difference: CPUs scaling with overclocks and so on, which you will not see in a GPU-limited part of the game or at GPU-limited settings. You can also see that the 2600's numbers are much lower here, because they used a much more CPU-demanding demo, and that is how you normally want to compare CPUs: when the CPU actually matters, not when the result is just a measure of how fast the graphics card can run the thing.

If you test Crysis with a Radeon HD 5450, performance will be around, say, 10 fps no matter what CPU you use (a Celeron or an i7-2700K); swap in a GTX 590 and the Celeron will run much slower than the i7. On a much smaller scale, that is what is happening in that Shogun 2 graph: the difference is irrelevant, the result is the same, and it is simply what the GPU is capable of in that case.
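HMBR's bottleneck argument can be sketched as a toy model: the observed frame rate is roughly the minimum of what the CPU alone and the GPU alone could sustain. The fps caps below are made up purely to mirror the Crysis example:

```python
# Toy bottleneck model: observed FPS is capped by the slower component.
def observed_fps(cpu_limit_fps, gpu_limit_fps):
    return min(cpu_limit_fps, gpu_limit_fps)

weak_gpu, fast_gpu = 10, 120   # e.g. an HD 5450 vs. a GTX 590 (assumed caps)
slow_cpu, fast_cpu = 40, 200   # e.g. a Celeron vs. an i7 (assumed caps)

# With the weak GPU, the CPU choice is invisible:
print(observed_fps(slow_cpu, weak_gpu))  # 10
print(observed_fps(fast_cpu, weak_gpu))  # 10

# With the fast GPU, the CPU difference finally shows:
print(observed_fps(slow_cpu, fast_gpu))  # 40
print(observed_fps(fast_cpu, fast_gpu))  # 120
```

This is why a review can flip from "all CPUs tie" to "the CPUs separate widely" just by changing the graphics card, the settings, or the scene.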


----------



## Papas

Quote:



Originally Posted by *HMBR*


If the game is GPU limited, performance should be the same on any CPU (provided the CPU is fast enough). The difference you see on that Shogun 2 graph is almost nothing; it is within the margin of error, as in any GPU-limited test. If you add another card (SLI/CF) or a faster one (or lower the details), things can change, because then the GPU is fast enough that it is no longer the main limit on frame rate.

Keep in mind that in the same game you can go from GPU limited to CPU limited quite easily, depending on the scene, the action, what's happening, and how fast your CPU/GPU are.

Here they tried to isolate a CPU-limited scenario in the same game:
http://www.bit-tech.net/hardware/cpu...-8150-review/9

Unfortunately the FX crashed (as it did in this game in at least three other reviews), but you can clearly see a difference: CPUs scaling with overclocks and so on, which you will not see in a GPU-limited part of the game or at GPU-limited settings. You can also see that the 2600's numbers are much lower here, because they used a much more CPU-demanding demo, and that is how you normally want to compare CPUs: when the CPU actually matters, not when the result is just a measure of how fast the graphics card can run the thing.

If you test Crysis with a Radeon HD 5450, performance will be around, say, 10 fps no matter what CPU you use (a Celeron or an i7-2700K); swap in a GTX 590 and the Celeron will run much slower than the i7. On a much smaller scale, that is what is happening in that Shogun 2 graph: the difference is irrelevant, the result is the same, and it is simply what the GPU is capable of in that case.


OK, so that makes sense. Ty


----------



## Majin SSJ Eric

Quote:



Originally Posted by *Papas*


Can someone please explain what it means when the game is GPU limited but frame rates go up for BD over the i7/i5? The way you are explaining it, if the game is GPU limited and relying heavily on the CPU, then BD should have lower frame rates, not higher. That is, unless I'm missing the point.


The point is that a couple of fps up or down is really insignificant in a GPU-limited benchmark. In CPU-specific tests there is simply no parity between these processors at all. The 2600K/2500K run away from the 8150 with ease...


----------



## Jagged_Steel

Quote:



Originally Posted by *jprovido*


how's the crow tasting jagged_steel? accept it dude. defending the bulldozer is a lost cause












So having top scores in the latest games now constitutes "eating crow"? Piling on the Intel bandwagon is easy; sticking up for AMD is hard. It is a phenomenon called "rooting for the underdog".


----------



## 2010rig

whoa, this thread is HUGE, no idea if this has been posted.

AMD has posted a Press Release:
http://www.amd.com/us/aboutamd/newsr.../newsroom.aspx

Quote:



*Unlock Your Record Setting AMD FX Series Processor Today*

October 12, 2011 -- With the first eight-core desktop processor, enthusiasts and overclockers get an amazing PC experience at unheard of prices

AMD (NYSE: AMD) today unleashed the AMD FX family of CPUs, delivering a fully unlocked and customizable experience for desktop PC users. The AMD FX series of desktop CPUs includes the first-ever eight-core desktop processor, enabling extreme multi-display gaming, mega-tasking and HD content creation for PC and digital enthusiasts - all for less than $245 (suggested U.S. retail price). This marks the first retail availability of processors that use AMD's new multi-core architecture (codenamed "Bulldozer"), which is included in AMD's upcoming server CPU (codenamed "Interlagos") and the next-generation of AMD Accelerated Processing Units.

"AMD FX CPUs are back with a vengeance, as validated by the recent feat of setting a Guinness World RecordsÂ® title for 'Highest Frequency of a Computer Processor,'" said Chris Cloran, corporate vice president and general manager, Client Group at AMD. "While overclockers will certainly enjoy the frequencies the AMD FX processors can achieve, PC enthusiasts and HD media aficionados will appreciate the remarkable experience that AMD FX processors can provide as part of a balanced, affordable desktop system."

All AMD FX CPUs offer completely unlocked processor clock multipliers for easier overclocking, paving the way for PC enthusiasts to enjoy higher CPU speeds and related performance gains. Additionally, these processors use AMD Turbo Core Technology to dynamically optimize performance across CPU cores enabling maximum performance for intense workloads.

Without spending a small fortune, users can combine an AMD FX CPU with an AMD 9-series chipset motherboard and AMD Radeon™ HD 6000 series graphics cards to create the AMD "Scorpius" platform for an astounding gaming and HD entertainment experience. As part of the "Scorpius" platform, AMD FX CPUs also support AMD CrossFireX™ technology, which allows the combination of multiple graphics cards in a PC for stunning visual experiences, and AMD Eyefinity technology support for super resolution on up to six monitors.1 With AMD Catalyst Control Center™ / AMD VISION Engine Control Center, users can get regular updates to help improve system performance and stability, and to add new software enhancements.


My opinion, fire your copywriters, hire more engineers.


----------



## djriful

The problem is that Bulldozer is considered to be on the same level as Phenom II. There isn't really any improvement, and at that $260 price... you've got to be kidding me.

If it were priced right, around $170, I'd still grab one, since it performs below the X6 1100T.


----------



## Check101

After reading some reviews, and seeing how the FX-8120/8150 are priced around the 2500K, exactly what audience is Bulldozer marketed to? Gamers? Average users? Enthusiasts? Business? The only audience I see actually utilizing this is the video/photo editing crowd, and even for them this is hardly a competitive product at this price point.

Am I right? Please correct me if I am wrong...


----------



## djriful

Quote:



Originally Posted by *2010rig*


whoa, this thread is HUGE, no idea if this has been posted.

AMD has posted a Press Release:
http://www.amd.com/us/aboutamd/newsr.../newsroom.aspx

My opinion, fire your copywriters, hire more engineers.


Bigger GHz numbers don't mean anything if real-world performance falls behind today's decent CPUs...

They say this series is aimed at enthusiasts, but enthusiasts aren't dumb enough to be fooled by an 8 GHz number. You might fool the computer-illiterate, but that's just a plain rip-off and gouging of the market.


----------



## DayoftheGreek

Quote:



Originally Posted by *Check101*


After reading some reviews, and seeing how the FX-8120/8150 are priced around the 2500K, exactly what audience is Bulldozer marketed to? Gamers? Average users? Enthusiasts? Business? The only audience I see actually utilizing this is the video/photo editing crowd, and even for them this is hardly a competitive product at this price point.

Am I right? Please correct me if I am wrong...


It isn't competitive at all, really. Pretty much every review out there (except one, which is flawed) paints an ugly picture. Even AMD is asking reviewers for help on how to make this launch look less terrible.


----------



## Omlet

Quote:



Originally Posted by *Check101*


After reading some reviews, and seeing how the FX-8120/8150 are priced around the 2500K, exactly what audience is Bulldozer marketed to? Gamers? Average users? Enthusiasts? Business? The only audience I see actually utilizing this is the video/photo editing crowd, and even for them this is hardly a competitive product at this price point.

Am I right? Please correct me if I am wrong...


The server market at best.

In any other instance it just fails.


----------



## lordikon

Quote:



Originally Posted by *Check101*


After reading some reviews, and seeing how the FX-8120/8150 are priced around the 2500K, exactly what audience is Bulldozer marketed to? Gamers? Average users? Enthusiasts? Business? The only audience I see actually utilizing this is the video/photo editing crowd, and even for them this is hardly a competitive product at this price point.

Am I right? Please correct me if I am wrong...


Yeah, that's about right, and that's what everyone is complaining about. The only point I can see to Bulldozer is if this new architecture opens AMD up for big, sweeping changes over the next few years where Phenom might have hit a brick wall, but that's still no reason to purchase a Bulldozer now.


----------



## djriful

Quote:



Originally Posted by *Omlet*


The server market at best.

In any other instance it just fails.


Or some OEM junk machines from Dell and HP. Tagline: 8 cores = awesome! The marketing pitch is that more cores mean you can do more, faster... true in some sense, but it fails to deliver real performance.


----------



## DayoftheGreek

Quote:



Originally Posted by *Omlet*


The server market at best.

In any other instance it just fails.


Power usage is way too high for servers.

It would make an awesome dedicated WinZip machine though, hahah. Except not every review has it on top, so maybe it would make a dedicated WinZip machine for certain files.


----------



## Check101

Quote:



Originally Posted by *lordikon*


Yeah, that's about right, and that's what everyone is complaining about. The only point I can see to Bulldozer is if this new architecture opens AMD up for big, sweeping changes over the next few years where Phenom might have hit a brick wall, but that's still no reason to purchase a Bulldozer now.


I read something in the AnandTech review about how the Windows 7 scheduler isn't optimized for Bulldozer's modules. How significant would a scheduler tweak for Bulldozer's modules be in terms of performance?


----------



## Majin SSJ Eric

Quote:



Originally Posted by *Jagged_Steel*











So having top scores in the latest games now constitutes "eating crow"? Piling on the Intel bandwagon is easy; sticking up for AMD is hard. It is a phenomenon called "rooting for the underdog".










Lol at having "top scores" by just 3 fps in a non-repeatable MP environment with tons of variables that are impossible to account for. You go ahead and buy an 8150 based solely on the fact that, out of hundreds of benches performed, you managed to find a couple of flawed ones that put BD just on par with SB. The rest of us understand what a consensus of opinion is...


----------



## lordikon

Quote:



Originally Posted by *Check101*


I read something in the AnandTech review about how the Windows 7 scheduler isn't optimized for Bulldozer's modules. How significant would a scheduler tweak for Bulldozer's modules be in terms of performance?


It could make a noticeable difference, but probably not enough to justify purchasing one. Windows 7 has been out for quite a while now; if that change were all it took to give Bulldozer a decent boost, you would think AMD would have already worked it out with Microsoft.


----------



## GameBoy

Quote:



Originally Posted by *Dr. Zoidberg*


Why do you keep bringing up this single benchmark to justify Bulldozer's lackluster performance? This benchmark clearly shows that the game is GPU limited. Anyway, the mere 3 FPS difference is probably within the margin of error.


Actually, all of the gaming benchmarks in the HardwareHeaven review have the 8150 slightly ahead of a 2600K. That could easily be due to the slight performance impact of Hyper-Threading (around 1-3%), though.


----------



## djriful

Quote:



Originally Posted by *lordikon*


It could make a noticeable difference, but probably not enough to justify purchasing one.


It's more expensive in dollars-per-performance, IMO.


----------



## Dr. Zoidberg

Quote:



Originally Posted by *Check101*


I read something in the AnandTech review about how the Windows 7 scheduler isn't optimized for Bulldozer's modules. How significant would a scheduler tweak for Bulldozer's modules be in terms of performance?


Supposedly, Windows 8 has a better task scheduler. Even so, Bulldozer barely does any better in Windows 8.

"I also did some multithreaded tests to compare Windows 7 versus Windows 8. The task scheduler in the latter is more optimised for multi core CPUs.* Yet my finding were not that shocking in my test suite. 1-4% difference was maximum spotted. Nothing that would give the Scorpius platform that required significant boost*."

http://www.madshrimps.be/articles/ar...#axzz1acYVpqDO
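The scheduler tweak being discussed amounts to a placement policy: because two threads on one module share front-end and FPU resources, a module-aware scheduler spreads threads across modules before doubling up on any of them. A minimal sketch, assuming a hypothetical numbering where logical cores 2k and 2k+1 share module k:

```python
# Module-aware thread placement for a 4-module / 8-core Bulldozer-style CPU.
# Hypothetical numbering: logical cores 2k and 2k+1 share module k.
def place_threads(n_threads, n_modules=4):
    """Assign each thread a core id, filling one core per module first.

    Valid for n_threads <= 2 * n_modules.
    """
    cores = []
    for t in range(n_threads):
        module = t % n_modules    # round-robin across modules first
        sibling = t // n_modules  # second pass lands on the shared sibling core
        cores.append(module * 2 + sibling)
    return cores

print(place_threads(4))  # [0, 2, 4, 6] -- every thread gets a module to itself
print(place_threads(6))  # [0, 2, 4, 6, 1, 3] -- only two modules are shared
```

A naive scheduler that fills cores 0, 1, 2, 3 in order would instead pack four threads onto just two modules, which is the pathology the Windows 8 scheduler changes were meant to address; the 1-4% figure above suggests how much it mattered in practice.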


----------



## Jagged_Steel

Quote:



Originally Posted by *Majin SSJ Eric*


Lol at having "top scores" by just 3 fps in a non-repeatable MP environment with tons of variables that are impossible to account for. You go ahead and buy an 8150 based solely on the fact that, out of hundreds of benches performed, you managed to find a couple of flawed ones that put BD just on par with SB. The rest of us understand what a consensus of opinion is...


By "flawed" I guess you are referring to the ones that show FX performing well. What is your explanation for some reviews showing opposite results in some tests, particularly DX11 games? Do you think some people got magic 8150s that performed better than a 2600 in games, and that this will never happen again? Or do you think it is more likely that the bad reviews showing a 2600 blowing away an 8150 in the same game are "flawed" or skewed in some way? Are you expecting the reviews showing great FX performance to just disappear? Will the sites posting them fold up in shame? I am guessing not. The results they posted are real. If others are not getting results as good, then something is wrong with their rig, testing methods, or something else.


----------



## HMBR

Quote:



Originally Posted by *Jagged_Steel*











So having top scores in the latest games now constitutes "eating crow"? Piling on the Intel bandwagon is easy; sticking up for AMD is hard. It is a phenomenon called "rooting for the underdog".










"Keep in mind also that these data sets represent a live server with real players so not every performance run-through is the same, but these fully represent true BF3 gameplay."

"we used an AMD Radeon HD 6970"

"1920x1200 with NO AA, and 16X AF with "Ultra" settings."

If you've played BF3 you should know that a 6970 on Ultra, even without AA, is quite GPU limited. Also, when they overclocked the i5-2500K from 3.3 to 4.8 GHz they gained nothing; what does that tell you?

GPU limited. Add to that a less-than-ideal testing method (a live MP game, with varying numbers of players and servers) and all the CPUs land within what you should consider the margin of error...


----------



## 2010rig

Quote:



Originally Posted by *djriful*


Bigger GHz numbers don't mean anything if real-world performance falls behind today's decent CPUs...

They say this series is aimed at enthusiasts, but enthusiasts aren't dumb enough to be fooled by an 8 GHz number. You might fool the computer-illiterate, but that's just a plain rip-off and gouging of the market.


Agreed. I mentioned on the World Record thread that if the IPC is weak, it doesn't matter what frequency the CPU can hit on liquid helium.

Now we're finding out that Bulldozer's average stable overclocks are 4.6-4.8 GHz on AIR, and that its IPC is weak. Sadly, AMD is now pushing this as "the fastest CPU in the world".

http://www.overclock.net/hardware-ne...l#post14936697

http://www.overclock.net/hardware-ne...l#post14936976

Quote:



Originally Posted by *HMBR*


"Keep in mind also that these data sets represent a live server with real players so not every performance run-through is the same, but these fully represent true BF3 gameplay."

"we used an AMD Radeon HD 6970"

"1920x1200 with NO AA, and 16X AF with "Ultra" settings."

If you've played BF3 you should know that a 6970 on Ultra, even without AA, is quite GPU limited. Also, when they overclocked the i5-2500K from 3.3 to 4.8 GHz they gained nothing; what does that tell you?

GPU limited. Add to that a less-than-ideal testing method (a live MP game, with varying numbers of players and servers) and all the CPUs land within what you should consider the margin of error...


He still won't get it. As other reviewers pointed out, GPU-limited scenarios for some reason slightly favor AMD. That's not because Bulldozer is faster; it's the platform. It's one of those things you can't explain.
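2010rig's IPC point reduces to the usual first-order model: throughput is roughly IPC times clock speed, so a frequency record means little if per-clock performance is low. The IPC ratios below are assumptions for illustration only, not measured values:

```python
# First-order throughput model: relative performance is roughly IPC x clock (GHz).
def relative_perf(ipc, clock_ghz):
    return ipc * clock_ghz

# Assumed per-clock ratios, Sandy Bridge normalized to 1.0:
bulldozer_oc = relative_perf(ipc=0.7, clock_ghz=4.6)  # high clock, weak IPC
sandy_oc     = relative_perf(ipc=1.0, clock_ghz=4.4)  # typical 2500K/2600K OC

print(bulldozer_oc < sandy_oc)  # True: the clock advantage can't close the gap
```

Under these assumed ratios, even a sizeable frequency lead doesn't make up the per-clock deficit, which is the whole objection to marketing on GHz alone.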


----------



## tosh.0

So how does this compare to the i5-750/760?


----------



## coolhandluke41

Interesting read here:
UPDATE: AMD Insiders Speak Out: BAPCo Exit is An Excuse for Bulldozer

"Bulldozer is going to disappoint people because we did not get the resources to build a great CPU, and it's not that we needed billions of dollars to make it a leader. We needed investment in people, tools and technology."
http://www.brightsideofnews.com/news....aspx?pageid=0


----------



## Sophath

Shogun 2: Total War is GPU bottlenecked.
It was also tested with a Gulftown CPU.


----------



## Majin SSJ Eric

Quote:



Originally Posted by *Jagged_Steel*


By "flawed" I guess you are referring to the ones that show FX performing well. What is your explanation for some reviews showing opposite results in some tests, particularly DX11 games? Do you think some people got magic 8150s that performed better than a 2600 in games, and that this will never happen again? Or do you think it is more likely that the bad reviews showing a 2600 blowing away an 8150 are "flawed" or skewed in some way? Are you expecting the reviews showing great FX performance to just disappear? Will the sites posting them fold up in shame? I am guessing not. The results they posted are real. If others are not getting results as good, then something is wrong with their rig, testing methods, or something else.


Yeah, something is wrong with the overwhelming majority of the reviews, not the lone aberration that went against the norm. Out of 50 reviews, ONLY HardwareHeaven got it right, huh? Lol...


----------



## 996gt2

Quote:



Originally Posted by *Jagged_Steel*


Piling on the Intel bandwagon is easy; sticking up for AMD is hard. It is a phenomenon called "rooting for the underdog".










*This is what Bulldozer is packing:*

- 2 billion transistors, about 2x what Sandy Bridge has
- The first wholly new architecture from AMD in a decade
- 8 cores
- Higher clock speeds

*Yet despite all this, it only sometimes outperforms its predecessor, the Phenom II. Additionally, it doesn't really do all that well in multi-threaded workloads either, even though it was supposedly built for those tasks. Not to mention the thrashing it gets when pitted against a 2600K or 2500K.*

Look at this: *Bulldozer vs. Nehalem.* Bulldozer has a *1 GHz clock advantage AND is 3 years newer AND is on a 32nm process:*

http://www.anandtech.com/bench/Product/47?vs=434

They trade blows. Despite all its advantages it *does NOT outright beat what Intel had 3 years ago*; that is a failure in my view.


----------



## jivenjune

Quote:



Originally Posted by *Jagged_Steel*


By "flawed" I guess you are referring to the ones that show FX performing well. What is your explanation for some reviews showing opposite results in some tests, particularly DX11 games? Do you think some people got magic 8150s that performed better than a 2600 in games, and that this will never happen again? Or do you think it is more likely that the bad reviews are "flawed" or skewed in some way? Are you expecting the reviews showing great FX performance to just disappear? Will the sites posting them fold up in shame? I am guessing not. The results they posted are real. If others are not getting results as good, then something is wrong with their rig, testing methods, or something else.


Anand gives a good explanation of this.

Quote:



Civilization V

Civ V's lateGameView benchmark presents us with two separate scores: average frame rate for the entire test as well as a no-render score that only looks at CPU performance.
*
While we're GPU bound in the full render score*, AMD's platform appears to have a bit of an advantage here. We've seen this in the past where one platform will hold an advantage over another in a GPU bound scenario and it's always tough to explain. Within each family however there is no advantage to a faster CPU, everything is just GPU bound.












Quote:



Looking at the no render score, the CPU standings are pretty much as we'd expect. The FX-8150 is thankfully a bit faster than its predecessors, but it still falls behind Sandy Bridge.












Notice that the margin of error in these tests, when GPU bound, is within +/- 3 frames per second? That is exactly the margin you keep using as "factual" evidence for why --you-- believe the FX-8150 is a superior processor.
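The margin-of-error point is just an interval check: a measured delta smaller than the run-to-run noise cannot separate two CPUs. A minimal sketch; the fps values are made up, and the 3 fps margin mirrors the figure quoted above:

```python
# Two benchmark results are distinguishable only if their gap exceeds the noise.
def distinguishable(fps_a, fps_b, margin=3.0):
    return abs(fps_a - fps_b) > margin

print(distinguishable(45, 42))  # False -- a "3 fps win" is indistinguishable from noise
print(distinguishable(80, 55))  # True  -- a CPU-bound gap is a real signal
```

By this test, the GPU-bound "wins" being argued over in this thread fail to clear the noise floor, while the CPU-bound gaps do.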


----------



## Majin SSJ Eric

Quote:



Originally Posted by *2010rig*


Agreed. I mentioned on the World Record thread that if the IPC is weak, it doesn't matter what frequency the CPU can hit on liquid helium.

Now we're finding out that Bulldozer's average stable overclocks are 4.6-4.8 GHz on AIR, and that its IPC is weak. Sadly, AMD is now pushing this as "the fastest CPU in the world".

http://www.overclock.net/hardware-ne...l#post14936697

http://www.overclock.net/hardware-ne...l#post14936976

*He still won't get it. As other reviewers pointed out, GPU-limited scenarios for some reason slightly favor AMD. That's not because Bulldozer is faster; it's the platform. It's one of those things you can't explain.*


Could it be because of the improved pci-e bandwidth of the BD platform?


----------



## Dr. Zoidberg

Quote:



Originally Posted by *996gt2*


*This is what Bulldozer is packing:*

- 2 billion transistors, about 2x what Sandy Bridge has
- The first wholly new architecture from AMD in a decade
- 8 cores
- Higher clock speeds

*Yet despite all this, it only sometimes outperforms its predecessor, the Phenom II. Additionally, it doesn't really do all that well in multi-threaded workloads either, even though it was supposedly built for those tasks. Not to mention the thrashing it gets when pitted against a 2600K or 2500K.*

Look at this: *Bulldozer vs. Nehalem.* Bulldozer has a *1 GHz clock advantage, is 3 years newer, is on a 32nm process*, *and is best in class against the slowest of the 9xx series:*

http://www.anandtech.com/bench/Product/47?vs=434

They trade blows. Despite all its advantages it *does NOT outright beat what Intel had 3 years ago*; that is a failure in my view.



That's nothing. I have found even worse.

"The novel Bulldozer design of the FX-8150 seems to be light on performance per core, as our image editing test shows. This test is single-threaded, and the FX-8150 fared extremely poorly with a stock-speed score of 887. *To put that into context, a Core 2 Duo E6700 is 13 per cent faster and a Core i5-2500K is almost twice as fast as the FX-8150 in this kind of situation*."

http://www.bit-tech.net/hardware/cpu...8150-review/11

A dual-core CPU released in 2006 manages to beat Bulldozer.


----------



## dioxholster

I think we can all agree this is bad for AMD and Intel; now there is no competition.


----------



## GameBoy

I think the Bulldozer architecture will start to shine in later revisions and die shrinks. Really, this is just a starting point that is probably being held back by 32nm.


----------



## Majin SSJ Eric

It's not bad for Intel; it's bad for us enthusiasts...


----------



## 2010rig

Quote:



Originally Posted by *Majin SSJ Eric*


Could it be because of the improved pci-e bandwidth of the BD platform?


jivenjune posted what I was referring to. It's strictly the AMD platform; notice it's the EXACT same 45 FPS score as the 1100T, even though the 1100T is clocked at 3.3 and BD at 3.6.









Quote:



Originally Posted by *AnandTech*

Civilization V

Civ V's lateGameView benchmark presents us with two separate scores: average frame rate for the entire test as well as a no-render score that only looks at CPU performance.

While we're GPU bound in the full render score, AMD's platform appears to have a bit of an advantage here. We've seen this in the past where one platform will hold an advantage over another in a GPU bound scenario and it's always tough to explain. *Within each family however there is no advantage to a faster CPU, everything is just GPU bound.*


----------



## Lampen

Quote:



Originally Posted by *jivenjune*


Anand gives a good explanation of this.



















Notice that the margin of error in these tests, when GPU bound, is within +/- 3 frames per second? That is exactly the margin you keep using as "factual" evidence for why --you-- believe the FX-8150 is a superior processor.


Nice post. ^_^ +rep


----------



## Amdkillsintel

Looks like they targeted parallelism the way GPUs do; it needs more modules to make up for the low-IPC cores.


----------



## 996gt2

Quote:



Originally Posted by *Amdkillsintel*


Looks like they targeted parallelism the way GPUs do; it needs more modules to make up for the low-IPC cores.


This is what confuses me a bit. Modern GPUs are already so good at highly parallel tasks that I don't understand what part of the market AMD was aiming for. Those who need high performance on such tasks likely already have a CUDA cluster.


----------



## Jagged_Steel

Quote:



Originally Posted by *jivenjune*


Anand gives a good explanation of this.



















Notice that the margin of error in these tests, when GPU bound, is within +/- 3 frames per second? That is exactly the margin you keep using as "factual" evidence for why --you-- believe the FX-8150 is a superior processor.


All that matters is that a computer does what you want it to, period. I game on my computer. The FX kicks butt at games and has been shown to do so in some reviews; bad reviews showing the exact opposite will not make the good performance reviews disappear. You are quite welcome to believe all of the bad reviews, but please, I will make my own decision. Thank you for your concern.


----------



## sintricate

Quote:



Originally Posted by *Jagged_Steel*


The FX kicks butt at games and has been shown to do so in some reviews; bad reviews showing the exact opposite will not make the good performance reviews disappear.


Wouldn't the reverse be true as well?


----------



## Jagged_Steel

Quote:



Originally Posted by *sintricate*


Wouldn't the reverse be true as well?


Use some logic: for the good reviews to be "false positives" would require either that these review sites got some magical CPUs for their great performance numbers, or that the reviewers were lying. You can't accidentally make a CPU perform beyond its capabilities. You definitely CAN accidentally (or intentionally) make a CPU perform badly.


----------



## rusky1

This is basically the same position I was in during the Phenom I launch. I had an Athlon X2 5000+ that I wanted to retire, but the absolutely horrid reviews led me to stick with the dual core until the Phenom IIs came out. Looks like I'll be waiting for Bulldozer II this time.


----------



## Dmac73

Quote:



Originally Posted by *Jagged_Steel*


Use some logic: for the good reviews to be "false positives" would require either that these review sites got some magical CPUs for their great performance numbers, or that the reviewers were lying. You can't accidentally make a CPU perform beyond its capabilities. You definitely CAN accidentally (or intentionally) make a CPU perform badly.


Nothing magical. You just don't know what a bottleneck is, I guess. You still haven't acknowledged that. Enjoy your FX.


----------



## jivenjune

Quote:



Originally Posted by *Jagged_Steel*


All that matters is that a computer does what you want it to, period. I game on my computer. The FX kicks butt at games and has been shown to do so in some reviews; some bad reviews showing the exact opposite will not make the good performance reviews disappear. You are quite welcome to believe all of the bad reviews, but please, I will make my own decision. Thank you for your concern.


Yes, I know, and I absolutely have no problem with that.

If you're happy with your computer, then that's all that matters at the end of the day. I think many people with AMD processors have perfectly fine rigs that do exactly what they want it to, and I think that's great since their purchase was well within a reasonable budget.

However, if someone asked me right now which processor I'd buy for gaming, a Phenom II x4 or an 8150 FX, I'd immediately tell them that I'd pick up the Phenom II x4.

Its performance is often superior in terms of gaming from what we've seen, and it's incredibly inexpensive for a fairly powerful budget-oriented rig--an area where AMD has always excelled. It's also significantly less expensive, which leaves room to upgrade to a better AMD-based GPU solution.

Interestingly enough, if someone asked me if I'd take an 8150 FX over a Phenom II x6 for video rendering and encoding, I'd still have to say no. The price of the Phenom II x6 is likely to drop significantly, making it an incredible value while also performing similarly to, if not sometimes better than, the 8150 in certain scenarios.

I think this is why a lot of people, AMD fans in particular, are perplexed about where this CPU belongs or what its intended purpose is directed at.


----------



## hammertime850

Quote:



Originally Posted by *jivenjune*


Yes, I know, and I absolutely have no problem with that.

If you're happy with your computer, then that's all that matters at the end of the day. I think many people with AMD processors have perfectly fine rigs that do exactly what they want it to, and I think that's great since their purchase was well within a reasonable budget.

However, if someone asked me right now which processor I'd buy for gaming, a Phenom II x4 or an 8150 FX, I'd immediately tell them that I'd pick up the Phenom II x4.

Its performance is often superior in terms of gaming from what we've seen, and it's incredibly inexpensive for a fairly powerful budget-oriented rig--an area where AMD has always excelled. It's also significantly less expensive, which leaves room to upgrade to a better AMD-based GPU solution.

Interestingly enough, if someone asked me if I'd take an 8150 FX over a Phenom II x6 for video rendering and encoding, I'd still have to say no. The price of the Phenom II x6 is likely to drop significantly, making it an incredible value while also performing similarly to, if not sometimes better than, the 8150.


this ^


----------



## Majin SSJ Eric

I think most of us understand that BD does NOT "kick butt" at gaming. It is basically no improvement over an 1100t....


----------



## born2bwild

Quote:



Originally Posted by *Jagged_Steel*


All that matters is that a computer does what you want it to, period. I game on my computer. The FX kicks butt at games and has been shown to do so in some reviews; some bad reviews showing the exact opposite will not make the good performance reviews disappear. You are quite welcome to believe all of the bad reviews, but please, I will make my own decision. Thank you for your concern.


See that's the point; FX does NOT "kick butt at games."

If you have a _single_ graphics card (midrange or so) and play at a _high resolution_, then in most cases the 8-core FX will deliver enough CPU power to let you play fine - read: a GPU-bound application.

So in this case, the FX *will* tie an i7 2600k. But _it would *also* tie a Phenom II X4 or an i3 2100_, because these would have been just as good, since the task is _GPU limited_. The Phenom II quad-core and the i3 2100 would both provide enough performance, and if you need a processor to run GPU-bound applications, you'd be better off with these *much cheaper options*.

To see how well an architecture actually does in gaming _based on the CPU_, we try running at lower resolutions, use multiple graphics cards, or resort to more CPU-bound games. And in all of those, BD falls behind even Phenom II hexacores.

CPUs that excel in CPU-bound cases are the ones truly good for gaming.
Just because a BD gets the same result as an i7 2600k doesn't mean that it's actually equal to it; it means that in both cases the CPU power was _enough_ to drive that single GPU at that resolution. If one were to use more GPUs or change the resolution, the result would NOT be the same, and it would favor the truly superior architecture.

*Hence, GPU-bottlenecked benchmarks do not say anything about CPU architecture and how good or bad it is.*
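
To make the bottleneck argument concrete, here is a minimal sketch in Python. All the FPS numbers are made up purely for illustration; they are not measurements from any review:

```python
# Toy model: the frame rate a benchmark reports is capped by the slower
# of the two components. All numbers below are hypothetical.

def observed_fps(cpu_cap: float, gpu_cap: float) -> float:
    """A benchmark reports roughly the minimum of the CPU and GPU frame caps."""
    return min(cpu_cap, gpu_cap)

# Hypothetical CPU-side frame caps in a CPU-heavy game:
cpu_caps = {"i7 2600K": 120, "Phenom II X4": 95, "i3 2100": 92, "FX-8150": 85}

gpu_bound = 60    # single midrange card at high resolution
cpu_bound = 300   # low resolution or multi-GPU: the GPU cap stops mattering

for name, cap in cpu_caps.items():
    print(f"{name}: {observed_fps(cap, gpu_bound)} fps GPU-bound, "
          f"{observed_fps(cap, cpu_bound)} fps CPU-bound")

# In the GPU-bound column every CPU "ties" at 60 fps, so the benchmark says
# nothing about the CPUs; only the CPU-bound column separates them.
```

With these toy numbers, every CPU posts the same result in the GPU-bound case; only lowering the GPU load exposes the difference, which is exactly the point being made above.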


----------



## rusky1

Quote:



Originally Posted by *Jagged_Steel*


Use some logic: for the good reviews to be "false positives" would require either that these review sites got some magical CPUs to get those great performance readings, or that the reviewers were lying. You can't accidentally make a CPU perform beyond its capabilities. You definitely CAN accidentally (or intentionally) make a CPU perform badly.


Every review that has favored Bulldozer has used either a 6850 or a 6950, making the overall setup bottlenecked on the GPU side. Give it a few days and we will have a bunch of reviews come back whose test rigs use GTX 590s and 6990s. You'll then see how poorly BD performs.

There is only a slight chance (maybe 5%) that there is some huge problem with drivers/BIOS which will get fixed and give us an additional 40% in performance. There is a better chance that newer revisions of the chip will perform better. However, the best chance is that the next iteration of the architecture will perform as advertised.

I'm sure everyone remembers AMD's "50% more cores, 33% more performance" statement. Even THAT hasn't come true.
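
As a quick sanity check, note that the claim already concedes lower per-core throughput. The 1.5x and 1.33x figures come from the marketing line; the division below is just arithmetic:

```python
# AMD's marketing claim: 50% more cores yielding 33% more performance.
cores_ratio = 1.50
perf_ratio = 1.33

# Implied per-core throughput relative to the previous generation:
per_core = perf_ratio / cores_ratio
print(f"implied per-core scaling: {per_core:.2f}")  # about 0.89

# So even if the claim had held exactly, each new core would only need to
# deliver ~89% of the old per-core throughput; the reviews suggest the chip
# fell short of even that.
```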


----------



## jprovido

Quote:



Originally Posted by *Majin SSJ Eric*


I think most of us understand that BD does NOT "kick butt" at gaming. It is basically no improvement over an 1100t....


No improvement would've been acceptable (at the same clock speeds as Phenom II), but a step back at gaming? Wth was AMD thinking?


----------



## Badness

Quote:


> Originally Posted by *jprovido;15285714*
> no improvement would've been acceptable(on same clock speeds with phenom II). but a step back at gaming? wth was amd thinking.


They're thinking not about us. Obviously gamers weren't even an afterthought in this thing's design. Maybe it is supposed to be future proof. I can't tell you what they were thinking, but I'm sure they were not thinking of us.


----------



## Jagged_Steel

Quote:


> Originally Posted by *rusky1;15285693*
> Every review that has favored Bulldozer has used either a 6850 or a 6950, making the overall setup bottlenecked on the GPU side. Give it a few days and we will have a bunch of reviews come back whose test rigs use GTX 590s and 6990s. You'll then see how poorly BD performs.
> 
> There is only a slight chance (maybe 5%) that there is some huge problem with drivers/BIOS which will get fixed and give us an additional 40% in performance. There is a better chance that newer revisions of the chip will perform better. However, the best chance is that the next iteration of the architecture will perform as advertised.
> 
> I'm sure everyone remembers AMD's "50% more cores, 33% more performance" statement. Even THAT hasn't come true.


A 6950 is not a bottleneck to around 99% of the computing world, including me. I don't understand where you are going with this; are you suggesting that kicking butt in the very latest, most graphics-intensive game ever created, like BF3, means that the FX doesn't perform well? You are REALLY reaching for straws here.


----------



## Lampen

Quote:


> Originally Posted by *rusky1;15285693*
> Every review that has favored Bulldozer has used either a 6850 or a 6950, making the overall setup bottlenecked on the GPU side. Give it a few days and we will have a bunch of reviews come back whose test rigs use GTX 590s and 6990s. You'll then see how poorly BD performs.
> 
> There is only a slight chance (maybe 5%) that there is some huge problem with drivers/BIOS which will get fixed and give us an additional 40% in performance. There is a better chance that newer revisions of the chip will perform better. However, the best chance is that the next iteration of the architecture will perform as advertised.
> 
> I'm sure everyone remembers AMD's "50% more cores, 33% more performance" statement. Even THAT hasn't come true.


I support and agree with much of this statement. I wanna see 2-, 3-, and 4-way SLI, and the same for CrossFire, to see how this thing really holds up.


----------



## jprovido

Quote:


> Originally Posted by *Badness;15285739*
> They're thinking not about us. Obviously gamers weren't even an afterthought in this thing's design. Maybe it is supposed to be future proof. I can't tell you what they were thinking, but I'm sure they were not thinking of us.


If they sucked at single-threaded apps but did GREAT on multithreaded apps, then they'd have a good selling point. But even in the most heavily threaded applications it *barely* beats an i5 2500k. Why would anyone buy a Bulldozer if it barely beats the i5 2500k, which is a lot more versatile because it's great on lightly threaded applications as well? It also consumes more power than an overclocked i5 2500k, pretty much sucks in single-threaded applications, and most surprisingly does worse than its previous generation, the Phenom II.

AMD made a hater out of me. It's funny because I loved AMD to death just a couple of days ago. Good job AMD! Way to kill your fanbase.


----------



## Flying Toilet

Quote:


> Originally Posted by *Jagged_Steel;15285763*
> A 6950 is not a bottleneck to around 99% of the computing world, including me. I don't understand where you are going with this; are you suggesting that kicking butt in the very latest, most graphics-intensive game ever created, like BF3, means that the FX doesn't perform well? You are REALLY reaching for straws here.


What they're trying to tell you is that you can go with an i3 processor, or a Thuban, or a Phenom II X4 and get the same results as you would with that video card. So if you want to play BF3 with a 6970 and be able to achieve the maximum FPS from that card, you can basically go with Core 2 Quad/Phenom II x4 processors and up. You don't have to buy a $270 processor to do something an $80 processor can do.


----------



## sintricate

Quote:


> Originally Posted by *Jagged_Steel;15285607*
> Use some logic: for the good reviews to be "false positives" would require either that these review sites got some magical CPUs to get those great performance readings, or that the reviewers were lying. You can't accidentally make a CPU perform beyond its capabilities. You definitely CAN accidentally (or intentionally) make a CPU perform badly.


So your "logic" is that all the negative reviews are wrong or biased?


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Jagged_Steel;15285763*
> A 6950 is not a bottleneck to around 99% of the computing world, including me. I don't understand where you are going with this; are you suggesting that kicking butt in the very latest, most graphics-intensive game ever created, like BF3, means that the FX doesn't perform well? You are REALLY reaching for straws here.


Just admit that you don't understand it and move on. The rest of us understand that a game can be perfectly playable yet still be gpu bottlenecked and thus useless as a measure of ultimate CPU performance.


----------



## Jagged_Steel

Quote:


> Originally Posted by *sintricate;15285823*
> So your "logic" is that all the negative reviews are wrong or biased?


What is your explanation for why some reviews show the FX doing great and some do not? Did some reviewers get special "magic" FXs that perform great while some got regular ones? Or is it more likely that some people managed to get theirs working correctly and others did not?


----------



## PhillyOverclocker

Just tuning in. So at first glance all the chips seem to be duds? Do any of them stand out? I can't get a real grip on all the info that's out there, but the consensus seems to be pretty bad, I guess.


----------



## born2bwild

Quote:


> Originally Posted by *Jagged_Steel;15285875*
> What is your explanation for why some reviews show the FX doing great and some do not? Did some reviewers get special "magic" FXs that perform great while some got regular ones? Or is it more likely that some people managed to get theirs working correctly and others did not?


Please take 2 minutes to read my comment earlier on and you will understand.


----------



## Flying Toilet

Quote:


> Originally Posted by *Jagged_Steel;15285875*
> What is your explanation for why some reviews show the FX doing great and some do not? Did some reviewers get special "magic" FXs that perform great while some got regular ones? Or is it more likely that some people managed to get theirs working correctly and others did not?


What they're trying to tell you is that you can go with an i3 processor, or a Thuban, or a Phenom II X4 and get the same results as you would with that video card. So if you want to play BF3 with a 6970 and be able to achieve the maximum FPS from that card, you can basically go with Core 2 Quad/Phenom II x4 processors and up. You don't have to buy a $270 processor to do something an $80 processor can do.


----------



## Samurai Batgirl

Quote:



Originally Posted by *5entinel*


Please refrain from using provocative words like fanboy.


I think if someone is offended by the term, then they are either one of the fanboys, much too sensitive, or both. No offense, but if a word is provocative, it's probably because the other person is being defensive; they do so because they know what they are but won't admit it.

I came back to this thread today hoping I could say, "Sheesh! Twas a dream, y'all!"
I can't...


----------



## 2010rig

Quote:


> Originally Posted by *born2bwild*
> *Hence, GPU-bottlenecked benchmarks do not say anything about CPU architecture and how good or bad it is.*


And this is how AMD has demonstrated Bulldozer, during GPU bottlenecked situations.

They know in CPU bound scenarios Bulldozer doesn't stand a chance.

http://www.youtube.com/watch?v=8rDwXuAINJk


----------



## Lampen

Quote:



Originally Posted by *PhillyOverclocker*


Just tuning in. So at first glance all the chips seem to be a dud? Is there any of them that stand out? I can't get a real grip on all the info that's out there but the consensus seems to be pretty bad I guess.


Nothing good. Things that were claimed didn't happen. Things that were expected really didn't happen.


----------



## Kand

Quote:



Originally Posted by *2010rig*


And this is how AMD has demonstrated Bulldozer, during GPU bottlenecked situations.







They know in CPU bound scenarios Bulldozer doesn't stand a chance.

http://www.youtube.com/watch?v=8rDwXuAINJk


That video does have some discrepancies in it, as stated by some other users.

Try flagging that video for misleading text.

(The 980x doing 5.something in Cinebench when it should be doing 9.xxx.)

And the dream.

It may no longer be one, but it's become a delusion for some.


----------



## anubis1127

Quote:



Originally Posted by *Flying Toilet*


What they're trying to tell you is that you can go with an i3 processor, or a Thuban, or a Phenom II X4 and get the same results as you would with that video card. So if you want to play BF3 with a 6970 and be able to achieve the maximum FPS from that card, you can basically go with Core 2 Quad/Phenom II x4 processors and up. You don't have to buy a $270 processor to do something an $80 processor can do.


^This. Unfortunately Jagged just isn't grasping that. Oh well, let him enjoy his "butt kicking" FX 8150 with his HD 6790. He clearly wouldn't be better off keeping his Phenom II and putting that $260 into an HD 6950. That would not be logical; it is clearly a better option for him to buy a $260 CPU to go with his $120 GPU.


----------



## Jagged_Steel

Quote:



Originally Posted by *Lampen*


Nothing good. Things that were claimed didn't happen. Things that were expected really didn't happen.


What thing that was claimed did not happen? What was "expected" and did not happen? What I see is people dreaming up grand expectations for something and then poo-pooing it when it didn't meet their fantasy expectations. The reality is that the FX is doing everything that AMD promised.


----------



## dlee7283

I wonder if AMD could have just skipped 32nm altogether and released a 22nm Phenom III X8:

-take out the DDR2 support
-add the AVX instruction set
-default support for DDR3-2400

That could have at least made a winner winner chicken dinner. If you can't beat Intel clock for clock, give the processor a ton of memory bandwidth.
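
For what it's worth, the bandwidth part of that wishlist is easy to put numbers on. This assumes the standard dual-channel, 64-bit-per-channel layout, with DDR3-1333 (Phenom II's officially supported speed) as the comparison point:

```python
# Theoretical peak DRAM bandwidth = transfers/s * bytes per transfer * channels.

def peak_bandwidth_gb_s(mt_per_s: float, channels: int = 2, bus_bytes: int = 8) -> float:
    """Peak bandwidth in GB/s for 64-bit (8-byte) DDR channels."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

print(peak_bandwidth_gb_s(2400))  # DDR3-2400 dual channel: 38.4 GB/s
print(peak_bandwidth_gb_s(1333))  # DDR3-1333 dual channel: ~21.3 GB/s
```

So stock DDR3-2400 support would have nearly doubled the theoretical memory bandwidth over a DDR3-1333 Phenom II setup, which is the "ton of memory bandwidth" being asked for.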


----------



## BrEnKeR

Quote:



Originally Posted by *Jagged_Steel*


What thing that was claimed did not happen? What was "expected" and did not happen? What I see is people dreaming up grand expectations for something and then poo-pooing it when it didn't meet their fantasy expectations. The reality is that the FX is doing everything that AMD promised.


Did you even get to read the quote from born2bwild? I find it very detailed and informative enough. Here it is:

Quote:



Originally Posted by *born2bwild*


See that's the point; FX does NOT "kick butt at games."

If you have a _single_ graphics card (midrange or so) and play at a _high resolution_, then in most cases the 8-core FX will deliver enough CPU power to let you play fine - read: a GPU-bound application.

So in this case, the FX *will* tie an i7 2600k. But _it would *also* tie a Phenom II X4 or an i3 2100_, because these would have been just as good, since the task is _GPU limited_. The Phenom II quad-core and the i3 2100 would both provide enough performance, and if you need a processor to run GPU-bound applications, you'd be better off with these *much cheaper options*.

To see how well an architecture actually does in gaming _based on the CPU_, we try running at lower resolutions, use multiple graphics cards, or resort to more CPU-bound games. And in all of those, BD falls behind even Phenom II hexacores.

CPUs that excel in CPU-bound cases are the ones truly good for gaming.
Just because a BD gets the same result as an i7 2600k doesn't mean that it's actually equal to it; it means that in both cases the CPU power was _enough_ to drive that single GPU at that resolution. If one were to use more GPUs or change the resolution, the result would NOT be the same, and it would favor the truly superior architecture.

*Hence, GPU-bottlenecked benchmarks do not say anything about CPU architecture and how good or bad it is.*


----------



## redalert

Quote:



Originally Posted by *Kand*


Need we show you slides of how AMD expected BD to perform?

50% faster than ..

Have a quote instead. Straight from the AMD Rep.

Higher. Right? Not at par or lower?


dont forget about power consumption


----------



## Flying Toilet

Quote:


> Originally Posted by *anubis1127;15285991*
> ^This. Unfortunately Jagged just isn't grasping that. Oh well, let him enjoy his "butt kicking" FX 8150 with his HD 6790. He clearly wouldn't be better off keeping his phenom II, and putting that $260 into a HD6950. That would not be logical, it is clearly a better option for him to buy a $260 CPU to go with his $120 GPU.


I sent him a PM, either he's ignoring my posts or they're getting drowned out by everyone else.


----------



## jrbroad77

Quote:


> Originally Posted by *anubis1127;15285991*
> ^This. Unfortunately Jagged just isn't grasping that. Oh well, let him enjoy his "butt kicking" FX 8150 with his HD 6790. He clearly wouldn't be better off keeping his phenom II, and putting that $260 into a HD6950. That would not be logical, it is clearly a better option for him to buy a $260 CPU to go with his $120 GPU.


Did I miss a memo? Someone thinks an FX 8150 is good for gaming?!?! If OCN had a poll, "Is the FX-8150/6100 good value for gaming?", it would unanimously be a "NO" vote. As for the FX-4100/4170, I'm hoping they OC high enough to be strong contenders at the price. Personally, between that and an i3-2100, I'd go FX-4100 for the OC.


----------



## lloyd mcclendon

nevermind


----------



## 996gt2

Quote:


> Originally Posted by *Flying Toilet;15286105*
> I sent him a PM, either he's ignoring my posts or they're getting drowned out by everyone else.


He's still trying to think of a way to spin the response so that it doesn't hurt AMD's feelings


----------



## Dr. Zoidberg

Quote:


> Originally Posted by *Jagged_Steel;15286003*
> What thing that was claimed did not happen? What was "expected" and did not happen? What I see is people dreaming up grand expectations for something and then poo-pooing it when it didn't meet their fantasy expectations. The reality is that the FX is doing everything that AMD promised.


Notice how AMD claimed that bulldozer would be the highest performing single and multi-threaded compute core in history. I believe most will say AMD failed to fulfill this goal.


----------



## Derp

Quote:


> Originally Posted by *Jagged_Steel;15286003*
> What thing that was claimed did not happen? What was "expected" and did not happen?


See my sig.


----------



## Lampen

Quote:


> Originally Posted by *Jagged_Steel;15286003*
> What thing that was claimed did not happen? What was "expected" and did not happen? What I see is people dreaming up grand expectations for something and then poo-pooing it when it didn't meet their fantasy expectations. The reality is that the FX is doing everything that AMD promised.


They might have gotten the extra cores in, as promised. Unfortunately, the performance increase promised along with those cores certainly didn't materialize. And no, I'm not talking about gaming; I'm talking about all the other applications.


----------



## 996gt2

Quote:


> Originally Posted by *Lampen;15286196*
> 50% more cores and 33% more performance happened? Oh wait... No it didn't.


More like 50% more cores and 100% more power consumption.


----------



## Kand

Quote:


> Originally Posted by *Jagged_Steel;15286224*
> Posted again without reading other posts.


Quote:


> Originally Posted by *Kand;15286045*
> Need we show you slides of how AMD expected BD to perform?
> 
> 50% faster than ..
> 
> Have a quote instead. Straight from the AMD Rep.
> Quote:
> 
> 
> 
> Originally Posted by *JF-AMD;10528811*
> IPC will be *higher*
> Single threaded performance will be *higher*
> 
> That is all we can say at this point.
> 
> 
> 
> *Higher*. Right? Not at _par_ or lower?

Reposting.


----------



## dlee7283

Obviously the general public likes Bulldozer; it's sold out:

http://www.newegg.com/Product/Product.aspx?Item=N82E16819103960


----------



## amd4200

Waited all that time for nothing... Sorry AMD, but I'm not going to buy a chip that goes backwards. Here I come, Sandy Bridge!


----------



## rocklobsta1109

Quote:


> Originally Posted by *Lampen;15286196*
> Might have gotten more cores out of their promise. Unfortunately the promise of performance increase that was associated with those cores certainly didn't. And no I'm not talking about gaming. I'm talking about all the other applications.


Exactly. Their 8-core barely hangs with or beats Intel's quad-core offerings.


----------



## Flying Toilet

I don't get what you guys get out of arguing with the misinformed... I've posted twice, in a non-partisan, unbiased manner, about why the benchmarks with the 6970 video card are irrelevant due to a GPU bottleneck, and I sent him a private message addressing it, and he is still carrying on and arguing that the 8150 is better. I'm going to go ahead and let you guys duke it out; I don't plan on raging over an argument on the internet.


----------



## Sophath

Ok, fine, your CPU won on those 2 benchmarks, but then again, it got arse-raped in 98 others. Have fun paying higher electricity bills while we go play on our Sandy Bridge or Thuban based builds with better results all across the board.


----------



## jrbroad77

Quote:


> Originally Posted by *dlee7283;15286251*
> obviously the general public like Bulldozer,its sold out
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16819103960
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16819103960


Clearly there's a supply issue with BD (I'd be shocked if more than 100 sold). They could meet demand with 10k CPUs through the end of Q4.


----------



## anubis1127

Looks like some early BD folding results are in over in this thread
Quote:


> Originally Posted by *el gappo;15285147*
> 39 minutes for the first loop.


Quote:


> Originally Posted by *robbo2;15285715*
> It takes my 2600K @ 4.6 with 2133 ram 23:04 to complete a percent on that unit. Running native linux that is.


Not the result I was hoping for. I don't think I can even justify buying an FX 8150 just for fun at this point.


----------



## HowHardCanItBe

Alright reopened.


----------



## 996gt2

I feel like this image from over at AMDZone pretty much sums up the whole Bulldozer fiasco:

When pictures like this are posted on an AMD forum, that's when you know it's a sad sad situation.


----------



## Lampen

Second! And yes that picture sums it up nicely.


----------



## 996gt2

Quote:


> Originally Posted by *oc_user;15286954*
> hey yo wth is *2mil* transistors doing? That's double the sandy vagina. is it an optimization issue?


Try 2 *BILLION*


----------



## Biorganic

As an admitted AMD supporter, I am absolutely disgusted by this release. I don't even understand why they would release this chip with its "forward-looking" architecture when it so blatantly underperforms its predecessors. Shouldn't the engineering team be able to figure out that this arch is terrible and cannot compete?

At what point did AMD really think this was a win??? Why, AMD? With products like this you deserve to lose customers.

Besides, what's up with these overly assertive names for CPU generations? With a name like Excavator or Piledriver, you'd better be burying the competitor, not yourself. Unless Pilepooper can drive my car for me while buying flowers for my girlfriend, I'm going Intel for my next build.

Sorry AMD, if you didn't suck so hard I would still support you.

Sad but true...


----------



## Flying Toilet

Quote:


> Originally Posted by *Lampen;15286936*
> Second! And yes that picture sums it up nicely.


Fail.

Honestly, I hope there is some firmware update that corrects the issue. What I would like to see is a benchmark that shows L1 cache performance in comparison with the other top processors. I know it was rumored that some of the first K10 Opterons might have had an L1 bug, which turned out to actually be a northbridge issue.

IRT dlee, I guess the hype and PR worked in their favor if Newegg sold out their stock. Hopefully when AMD realizes what is bogging down their processors, they'll apologize for the initial confusion, _if it's a bug._ I still find it hard to believe that this would happen.


----------



## Riou

Quote:


> Originally Posted by *dlee7283;15286251*
> obviously the general public like Bulldozer,its sold out
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16819103960
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16819103960


It can be popular with system builders. They can sell 8-core CPU computers to consumers at a low price and people will buy it since they don't really care about benchmarks.


----------



## TFL Replica

Quote:


> Originally Posted by *Riou;15287022*
> It can be popular with system builders. They can sell 8-core CPU computers to consumers at a low price and people will buy it since they don't really care about benchmarks.


The irony is that Intel was doing a similar thing in the Athlon64 days. They'd sell higher clock frequencies to people that had no idea what a benchmark was.


----------



## aznofazns

I'm sure AMD is quite aware that Bulldozer is not competitive performance-wise. They know they messed up.

But some of you guys need to realize that AMD can't just admit BD is terrible and scrap it. They have no choice but to market it and sell it to the uninformed. If they didn't, and consequently no one bought it, they would suffer enormous losses, and any hope of AMD spending on R&D to bring out a competent architecture in the future would be crushed.

I'm not saying I support their deceptive marketing strategies, but there's really no easy way out of this right now. We, as consumers, also cannot afford to have Bulldozer fail in the market. You are all aware of what would happen if AMD were to disappear from the desktop CPU market altogether.


----------



## 996gt2

False advertising, anyone?


----------



## MoRLoK

Could someone test it under win 8 ?


----------



## microfister

Quote:


> Originally Posted by *Riou;15287022*
> It can be popular with system builders. They can sell 8-core CPU computers to consumers at a low price and people will buy it since they don't really care about benchmarks.


Exactly. They are pushing the "world's first eight-core" and "world record holder" stuff, and it's working. Newegg is sold out of the 8150. I'm sure this was the plan months ago, after AMD started seeing what their new BD was amounting to. They aren't targeting the enthusiast with this; they're going for the everyday consumer on the premise that more is better.
Quote:


> Originally Posted by *TFL Replica;15287049*
> The irony is that Intel was doing a similar thing in the Athlon64 days. They'd sell higher clock frequencies to people that had no idea what a benchmark was.


Exactly. It's like when new members pop up thinking that the 1100T outperforms the 2500k because it has more cores. The angle AMD is going for is working for them. What's more, even if Newegg only had 50 8150s in stock, a new computer builder will see "world's first eight core," "world record holder," and "sold out," put it all together, and assume it's the best thing on the market.


----------



## tastycakes

It's so funny to watch this thread explode with fanboy rage. It's sad to see AMD perform like this. If they keep failing, maybe Qualcomm or Texas Instruments might buy them out... I just hope Intel doesn't get slammed with antitrust suits if AMD keeps messing up. I hope the best for them.


----------



## hammertime850

Quote:


> Originally Posted by *MoRLoK;15287172*
> Could someone test it under win 8 ?


They did; it showed a 1-4% increase.


----------



## Jagged_Steel

Quote:


> Originally Posted by *aznofazns;15287114*
> I'm sure AMD is quite aware that Bulldozer is not competitive performance-wise. They know they messed up.
> 
> But some of you guys needs to realize that AMD can't just admit BD is terrible and scrap it. They have no choice but to market it to sell to the uninformed. If they didn't, and consequently no one bought it, they would suffer enormous losses and any hope of AMD spending on R&D to bring out a competent architecture in the future would be crushed.
> 
> I'm not saying I support their deceptive marketing strategies, but there's really no easy way out of this right now. We, as consumers, can also not afford to have Bulldozer fail in the market. You all are aware of what would happen if AMD were to disappear from the desktop CPU market altogether.


How in the world do you see a processor that will play the latest games at the best settings for less than the competition as "terrible"? In my world, performing the same task for less is called a "bargain," not "deceptive marketing" and "terrible" as you have imagined. If you want to pay more to do the exact same thing you could do for less, you are welcome to do so, but it doesn't change anything for those who choose not to.


----------



## tastycakes

Quote:


> Originally Posted by *996gt2;15287122*
> False advertising, anyone?


It is efficient, 4 cores = 300 watts so 8 cores should = 600 watts... LOL:gotproof:


----------



## microfister

Another thought: TigerDirect could very well be sold out as well. Notice the "ships within" period.

The retailers aren't planning on ordering 10,000 of these chips without the intention of selling them. Newegg had the 5970 on their site forever and kept dropping the price until they disappeared (not one review was uploaded for them); they aren't going to make that mistake again. Looks like TigerDirect is going to see how these sell before they even have them in stock; seems like they may already be back-ordered to me.


----------



## kweechy

Quote:


> Originally Posted by *Jagged_Steel;15287268*
> How in the world do you see a processor that will play the latest games at the best settings for less than the competition as "terrible? In my world performing the same task for less is called a "bargain", not "deceptive marketing" and "terrible" as you have imagined. If you want to pay more to do the exact same thing as you could do for less you are welcome to do so, but it doesn't change anything about those that choose not to.


I feel like a lot of those tests were more GPU-bound than CPU-bound. That's the problem.

When they did run benchmarks that weren't GPU bottlenecked, the Bulldozer often did even worse than Phenom X6 due to lower IPC and fewer cores being utilized by games.


----------



## Lampen

Quote:


> Originally Posted by *Jagged_Steel;15287268*
> *How in the world do you see a processor that will play the latest games at the best settings for less than the competition as "terrible?* In my world performing the same task for less is called a "bargain", not "deceptive marketing" and "terrible" as you have imagined. If you want to pay more to do the exact same thing as you could do for less you are welcome to do so, but it doesn't change anything about those that choose not to.


i5-2500k - 220 bucks on the Egg. FX-8150 - 279 bucks. Don't see how 279 is less than 220. Was there a memo I didn't get?


----------



## Skylit

Guys... this is a good thing.

An everyday consumer who doesn't really care about benchmarks is going to see an 8-core, high-clock-speed CPU and be amazed. Intel will eventually have to counter, since the mass public doesn't really understand that AMD is actually selling an inferior product. It worked for Intel back in the P4 vs. Athlon days.

Win/Win for consumers. Now to play the waiting game.


----------



## aznofazns

Quote:


> Originally Posted by *Jagged_Steel;15287268*
> How in the world do you see a processor that will play the latest games at the best settings for less than the competition as "terrible? In my world performing the same task for less is called a "bargain", not "deceptive marketing" and "terrible" as you have imagined. If you want to pay more to do the exact same thing as you could do for less you are welcome to do so, but it doesn't change anything about those that choose not to.


Could you please back up your claim that BD can compete with other processors on the market? More specifically, what makes the $220 FX-8120 competitive with the $220 2500K? All I've seen in the reviews so far is the FX winning in a very limited number of heavily-threaded benchmarks while the 2500K wipes the floor with it in everything else while consuming much, much less power.

I'm sorry if you already posted this previously, but this thread is far too long to scan for a single post.


----------



## 996gt2

Quote:


> Originally Posted by *aznofazns;15287308*
> Could you please back up your claim that BD can compete with other processors on the market? More specifically, what makes the $220 FX-8120 competitive with the $220 2500K? All I've seen in the reviews so far is the FX winning in a very limited number of heavily-threaded benchmarks while the 2500K wipes the floor with it in everything else while consuming much, much less power.
> 
> I'm sorry if you already posted this previously, but this thread is far too long to scan for a single post.


He's just going to post those GPU limited tests again lol...just wait for it


----------



## microfister

Quote:


> Originally Posted by *Jagged_Steel;15287268*
> How in the world do you see a processor that will play the latest games at the best settings for less than the competition as "terrible?


Because it still yields lower FPS than a 2500K for $40 more, plus power costs, provided you're not limiting the equation with a weak GPU.

And yes, it can play the newest games at the highest settings.

A $270 CPU for 102 FPS in Crysis 2 (after overclock), or a $219 CPU for 130 FPS in Crysis 2 (before overclock).


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Jagged_Steel;15287268*
> How in the world do you see a processor that will play the latest games at the best settings for less than the competition as "terrible? In my world performing the same task for less is called a "bargain", not "deceptive marketing" and "terrible" as you have imagined. If you want to pay more to do the exact same thing as you could do for less you are welcome to do so, but it doesn't change anything about those that choose not to.


Lol, wut? The 2500k is cheaper than BD and performs much better in gaming according to 99% of benchmarks....


----------



## aznofazns

Quote:


> Originally Posted by *996gt2;15287317*
> He's just going to post those GPU limited tests again lol...just wait for it


Haha... well, regardless, I hope Bulldozer sells. Early stockouts on Newegg and other e-tailers are a good sign. I mean, we enthusiasts (most of us, anyway) know what the better product is, so it shouldn't matter too much. We're such a small % of the overall market that our informed realization that BD is a bad product should not have a huge effect on sales. If the masses want to buy it because they're clueless, let them. It will only keep Intel on their toes and allow AMD some time to get things together.


----------



## Jagged_Steel

Quote:


> Originally Posted by *tastycakes;15287286*
> It is efficient, 4 cores = 300 watts so 8 cores should = 600 watts... LOL:gotproof:


And here are power consumption numbers for FX that are completely different from that. These tests show the FX using less energy in nearly every situation, particularly at idle, which is where most CPUs spend the vast majority of their time.


----------



## Annex

I haven't been around long enough to get involved in the fanboy thing, but it seems clear, just from the tone of the thread and the couple of graphs I've seen, that AMD has indeed created a lackluster product compared to Intel's offerings.


----------



## shredzy

Jagged Steal =

Keep diggin', son; just another AMD fanboy trying to find some faith in BD who can't face the fact that it is rubbish in gaming. Have fun with your FX using 400W+ overclocked as well; heard that's pretty good.


----------



## kenolak

Quote:


> Originally Posted by *shredzy;15287431*
> Jagged Steal =
> 
> Keep diggin son, just another AMD fanboy trying to find some faith in BD and can't face the facts that it is rubbish in gaming.


I don't follow the threads enough to care about you trying to personally attack someone.

But how exactly does 5-10 FPS ruin a game running at 100 FPS? Really?
Thanks.


----------



## aznofazns

Quote:


> Originally Posted by *Jagged_Steel;15287406*
> And here are power consumption numbers for FX that are completely different than that. These tests show the FX using less energy in nearly every situation, paricularly at idle which most CPUs spend the vast majority of their time doing.


Idle power consumption is important, but that doesn't mean we should ignore load power consumption. What matters to me is that FX consumes around 200W more than the comparably priced 2500K when both are overclocked similarly. TWO. HUNDRED. WATTS. That's an entire CPU by itself. That is a huge factor in deciding what PSU and cooler to buy.

Also, considering Bulldozer is really a server processor at its heart, load power consumption is actually quite important.
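To put a 200 W load delta in perspective, here's a back-of-the-envelope cost sketch. The usage hours and electricity rate are my own assumptions, not numbers from any review in this thread; adjust them for your own situation:

```python
# Rough yearly cost of a 200 W gaming-load delta between two CPUs.
# Assumptions (mine): 4 hours at load per day, $0.12 per kWh.
extra_watts = 200
hours_per_day = 4
price_per_kwh = 0.12

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh

print(round(extra_kwh_per_year, 1))   # 292.0 extra kWh per year
print(round(extra_cost_per_year, 2))  # 35.04 extra dollars per year
```

So even before counting the bigger PSU and cooler, the power gap has a real running cost for heavy users.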


----------



## Lampen

I'm sorry, but what is this? I've never seen any chart EVER that had the i7 920 at over 500 watts. And it's pulling more watts than a 990X, when the 990X has two more cores and is clocked even higher? WUT?


----------



## mad0314

Quote:


> Originally Posted by *Jagged_Steel;15287268*
> How in the world do you see a processor that will play the latest games at the best settings for *less than the competition* as "terrible? In my world performing the same task for less is called a "bargain", not "deceptive marketing" and "terrible" as you have imagined. If you want to pay more to do the exact same thing as you could do for less you are welcome to do so, but it doesn't change anything about those that choose not to.


At this point I'm pretty sure you're trolling...


----------



## kenolak

Quote:


> Originally Posted by *aznofazns;15287450*
> Idle power consumption is important, but that doesn't mean we should ignore load power consumption. What matters to me is that FX consumes around 200W more than the comparatively priced 2500K when both are overclocked similarly. TWO. HUNDRED. WATTS. That's an entire CPU by itself. That is a huge factor in deciding what PSU and cooler to buy.
> 
> Also, considering Bulldozer is really a server processor at its heart, load power consumption is actually quite important.


The server versions are far from the same chip, but your point still stands.


----------



## microfister

Quote:


> Originally Posted by *shredzy;15287431*
> Jagged Steal =
> 
> Keep diggin son, just another AMD fanboy trying to find some faith in BD and can't face the facts that it is rubbish in gaming. Have fun with your FX using 400W+ overclocked as well, heard thats pretty good.


Seriously, if anyone with an already high-wattage PSU and enthusiast-level discrete graphics made the jump to BD, they'd still be facing a possible PSU upgrade.


----------



## Sir Shfvingle

The funny thing about this thread now is that it's just become so repetitive. Intel fans (or just anyone) continuously telling AMD fans "I told ya so," with most AMD fans responding sadly (like me). Then the extreme fanboys keep coming in with their crazy arguments about how BD is still better somehow. STOP. It's annoying, even to an AMD fan like me. But the same goes for you Intel guys: you're clearly right. Let them wallow in their ignorance, but don't keep bringing up the obviously obvious charts and stats. Keep the discussion moving. Please and thank you.
/rant


----------



## sbeast

I'd say wait a few months for more long-term testing before completely abandoning ship on Bulldozer. In the meantime, I'm hoping I can score a 990-series motherboard at a decent price soon; I want to keep my Phenom II but have integrated SLI support without hacking the BIOS.


----------



## xPwn

Quote:


> Originally Posted by *Lampen;15287457*
> I'm sorry but what is this? I've never seen any chart EVER that had the i7 920 at over 500 watts. And its pulling more watts than a 990x when the 990x has two more cores and is clocked even higher? WUT?


I think it's because the 920 is 45nm, and I believe 32nm has less electrical resistance, which reduces heat output and thus the energy input required. I'm also pretty sure the 920, 980X, and 990X are all rated for 130W TDP. (I need backup; I'm not 100% sure that what I'm posting is factual.)


----------



## Jagged_Steel

Quote:


> Originally Posted by *shredzy;15287431*
> Jagged Steal =
> 
> Keep diggin son, just another AMD fanboy trying to find some faith in BD and can't face the facts that it is rubbish in gaming. Have fun with your FX using 400W+ overclocked as well, heard thats pretty good.


Here is a chart showing that the FX-8150 outperforms both the 2500 and 2600 in BF3, THE latest and greatest game. According to your logic all of these CPUs are "rubbish," so what exactly should people game with, a Cray?


----------



## shredzy

Quote:


> Originally Posted by *kenolak;15287446*
> I don't follow the threads enough to even care about you trying to personally attack someone.
> 
> But how exactly do 5-10 FPS ruin a game running at 100FPS ? Really?
> Thanks.


Err, that's a personal attack? LOL, do you even know what a personal attack is?

It's not 5-10 FPS; it's actually a lot more than that in CPU-limited tests (i.e., not running a game at 1080p+ where it's GPU-limited). For example, http://www.guru3d.com/article/amd-fx-8150-processor-review/19 and the other 20+ reviews in the OP.

I heard it's pretty energy efficient as well: http://www.guru3d.com/article/amd-fx-8150-processor-review/7

Keep digging, good sir.


----------



## apass

Man, this is awful. Should I upgrade to a cheap Phenom II 955 or grab an Intel system instead?


----------



## mrcool63

In summary:

AMD says BD is built for multithreaded apps and isn't suited to present-day software. That's AMD's explanation for why BD fares so badly in benchmarks, and honestly they may be right up to a certain extent. Broadly speaking, no software today is written for these modules, so it's a real deviation from the normal architecture, which may prove to be AMD's bane. They forgot that the majority of software today can't understand s**t about the core layout; even Windows is flummoxed by it.

However, I believe one thing they got really wrong is per-core performance. It is literally abysmal, absolute crap! That is why our dear banik is screwing AMD totally up.

Secondly, they released an underperforming flagship first, without releasing the lower models alongside it (which, who knows, may actually be better). TweakTown goes all volatile on AMD's marketing policy, pounding it to a pulp.

BD is AMD's too-hurried, too-underprepared, and too-minuscule effort to answer Intel!


----------



## dantoddd

Quote:


> Originally Posted by *Jagged_Steel;15287406*
> And here are power consumption numbers for FX that are completely different than that. These tests show the FX using less energy in nearly every situation, paricularly at idle which most CPUs spend the vast majority of their time doing.


Those load numbers don't look too good, especially if you are going to be gaming.


----------



## shredzy

Quote:


> Originally Posted by *Jagged_Steel;15287500*
> Here is a chart showing that the FX 8150 performs neck and neck with a 2600 and outperforming a 2500 in BF3. According to your logic all of these CPUs are "rubbish", so what exactly should people game with , a Cray?


Nice work linking a HardOCP review, man; you do realize that site is AMD-biased (look at the ATI reviews)? How about you compare it to the other 30+ reviews in the OP? BF3 is a GPU-limited game as well, nice try.

Keep digging.

EDIT: On another note, I'd upgrade your PSU to 1000W as well, because your FX will make that 500W explode.


----------



## aznofazns

Quote:


> Originally Posted by *Jagged_Steel;15287500*
> Here is a chart showing that the FX 8150 outperforms both the 25 and 2600 in BF3, THE latest and greatest game. According to your logic all of these CPUs are "rubbish", so what exactly should people game with , a Cray?


Sir, they are testing BF3, the latest and greatest in PC gaming graphics, at 1200p on Ultra settings. That screams GPU bottleneck.


----------



## importflip

Quote:


> Originally Posted by *jrbroad77;15286336*
> Clearly there's a supply issue with BD(I'd be shocked if more than 100 sold). They could meet demand with 10k CPU's throught the end of Q4.


Nah. Regular people saw "8 Cores" and were like:

Oh em gee, dat haz da cores. imma get.


----------



## tastycakes

Quote:


> Originally Posted by *apass;15287508*
> Man. this is awful. Should I upgrade to a cheap Phenom II 955 or grab an Intel system instead?


Get a 2500K, best bang for the buck! With an H60 cooling it, you'll be the bee's knees and have an overclocking machine.


----------



## jrbroad77

Quote:


> Originally Posted by *importflip;15287548*
> Nah. Regular people saw "8 Cores" and were like:
> 
> Oh em gee, dat haz da cores. imma get.


The joke's on them; once they swap in an 8-core for their 6-core and see lower performance, they'll probably break their computer.


----------



## Jagged_Steel

Quote:


> Originally Posted by *aznofazns;15287541*
> Sir, they are testing BF3, the latest and greatest in PC gaming graphics, at 1200p on Ultra settings. That screams GPU bottleneck.


From the H OCP Review:
Quote:


> In order to keep the game from being GPU limited we set back the resolution and settings. We ran this game at 1920x1200 with NO AA, and 16X AF with "Ultra" settings. Note that all AA was off, including deferred MSAA and Post AA. Our average framerates were at an acceptable level of gameplay performance for competitive BF3 gamers with these settings.


Any other straws you are thinking about grasping at to poo-poo a review you don't like? At some point you and the others are just going to have to come to grips with the fact that the FXs are going to be gaming beasts. Or not; I guess you can just keep on denying, but for everybody else the reality will be that FX = gaming beast.


----------



## aznofazns

Quote:


> Originally Posted by *jrbroad77;15287588*
> The joke's on them, once they switch in a 8-core for their 6-core and see lower performance they'll probably break their computer.


But that's the thing. Are regular consumers even going to notice? If they cared about higher performance, wouldn't they have read at least ONE review before purchasing? I can't see someone going to the effort of buying a new CPU to replace his/her old one without doing some research first. That's why I think Bulldozer may do quite well in the pre-built PC market.


----------



## GTR Mclaren

OK, this thread is now full of AMD and Intel fanboys.

We are mature people... people!! Stop this nonsense.


----------



## kweechy

Quote:


> Originally Posted by *jrbroad77;15287588*
> The joke's on them, once they switch in a 8-core for their 6-core and see lower performance they'll probably break their computer.


No, they'll have not a goddam clue they're getting less performance at all. First off, their old machine was probably bogged down to crap by malware cause they have no idea how to take care of a machine or format...so ANY new PC is going to be blazing in comparison without them realizing that's the main reason. Secondly, since they have no clue how to get objective benchmarks from the computer, they simply get placebo'd into thinking it's fast. The CPU is brand new! The machine is brand new! That means it's fast! Therefore I think it's fast too!

You guys don't realize this, but the number of people who actually know anything of real substance about computers and computing is WAY less than 1 in 100. I'd even say 1 in 1000 is more realistic.

99.9% of people see 8 cores, see the speed, see the sweet packaging and graphic design and see the price. The salesman says it's great too and it's cheaper than the Intel computer.

All they know is this:
1) I can get 8 cores @ 3.6 gigasomethings for $1,000
2) I can get 4 cores @ 3.4 gigasomethings for $1,200

"You'd have to be a moron to pick #2!!!"

AMD is turning CPU cores into what megapixels became.

A single number you look at and decide on the purchase without understanding what it actually means to you. "WOW a 34 megapixel camera!!!" when they don't realize that unless you're rocking a full 35mm sensor, those 34 MPs are actually WORSE than 6 would have been.


----------



## microfister

Here's a thought: what about disabling 2-4 cores and boosting the remaining cores to the max, like what some users do with the Phenom II X6s? Think maybe that would boost overall performance and drop a little power consumption? Just a thought.


----------



## aznofazns

Quote:


> Originally Posted by *Jagged_Steel;15287602*
> From the H OCP Review:
> 
> Any other straws you are thinking about reaching for to poo poo a review that you don't like? At some point you and the other are just going to have to come to grips with the fact that FXs are going to be gaming beasts. Or not, I guess you can just keep on denying, but for everybody else the reality will be that FX = Game beast


The reviewer is simply wrong. 1920x1200 with all Ultra (read: max) settings is still very much a GPU bottleneck. You don't need to ramp up AA to create a bottleneck in that game.


----------



## mad0314

Did you by chance take a look at BOTH of the BF3 charts?
In case you didn't here they are again:

Stock clocks: [chart]

Overclocked to 4.8GHz: [chart]

Clearly a GPU bottleneck. If you think otherwise you are delusional. I fully expect this post to be ignored though.


----------



## shredzy

Quote:


> Originally Posted by *Jagged_Steel;15287602*
> From the H OCP Review:
> 
> Any other straws you are thinking about reaching for to poo poo a review that you don't like? At some point you and the other are just going to have to come to grips with the fact that FXs are going to be gaming beasts. Or not, I guess you can just keep on denying, but for everybody else the reality will be that FX = Game beast


Oh my god. How about you stop linking your 'poo poo' review, because I love how it's totally different from the other 30 reviews.

Just because they ran no AA/AF they say it's CPU-limited, but they run it at 1200p on ultra settings? LOL, are you kidding?

Also, I think you're mistaken on FX being a "gaming beast." How about you come back to reality and see that Bulldozer has failed to deliver what it was touted to be?

I honestly think you're trolling right now, because no one can be this arrogant and stupid.

How big is your hole now?


----------



## Flying Toilet

Quote:


> Originally Posted by *Jagged_Steel;15287602*
> From the H OCP Review:
> 
> Any other straws you are thinking about reaching for to poo poo a review that you don't like? At some point you and the other are just going to have to come to grips with the fact that FXs are going to be gaming beasts. Or not, I guess you can just keep on denying, but for everybody else the reality will be that FX = Game beast


I'm going to make this really simple for you, due to the fact that I've responded to you twice in this topic and once via private message, all of which you've ignored.

The same performance you get from the i7 2600K, i5 2500K, and FX-8150 with a Radeon HD 6970 can also be had with a Core 2 Quad Q9450, an AMD Phenom II X4, or a Core i3 and up. That's called a video card bottleneck. The same performance as those $300, $270, and $200 processors can be had for $80 with a Phenom II X4 when paired with an HD 6970.

Please, stop. You're making people with AMD platforms look bad by continuing.
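A quick way to see what this kind of video-card bottleneck looks like in review data: with the same GPU, if every CPU lands at essentially the same frame rate, the GPU is the limiter. This is a toy heuristic of my own, and the numbers are illustrative, not taken from any linked review:

```python
def looks_gpu_bound(fps_by_cpu, tolerance=0.05):
    """True if every CPU lands within ~tolerance of the same frame rate
    on the same GPU -- the classic signature of a GPU bottleneck."""
    lo, hi = min(fps_by_cpu.values()), max(fps_by_cpu.values())
    return (hi - lo) / hi <= tolerance

# Illustrative numbers only:
print(looks_gpu_bound({"FX-8150": 67, "i5-2500K": 68, "i7-2600K": 68}))   # True
print(looks_gpu_bound({"FX-8150": 78, "i5-2500K": 102, "i7-2600K": 105}))  # False
```

When the spread opens up (second case), the CPUs are actually being exercised, and that is where the reviews separate the chips.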


----------



## 996gt2

Quote:


> Originally Posted by *mad0314;15287669*
> Did you by chance take a look at BOTH of the BF3 charts?
> In case you didn't here they are again:
> Stock clocks: [chart]
> 
> Overclocked to 4.8GHz: [chart]
> 
> Clearly a GPU bottleneck. If you think otherwise you are delusional. I fully expect this post to be ignored though.


Quote:


> Originally Posted by *Flying Toilet;15287715*
> *I'm going to make this really simple for you, due to the fact that I've responded to you twice in this topic and once via private message, all of which you've ignored.
> 
> The same performance from the i7 2600k, i5 2500k, and FX-8150 with a Radeon HD 6970 can also be had with a Core 2 Quad Q9450, Amd Phenom II X4, and Core i3 and up processor. That's called a video card bottleneck. The same performance out of the $300, $270, and $200 processors can be had for $80 with a Phenom II x4 if paired with an HD 6970.*
> 
> Please, stop. You're making people with AMD platforms look bad by continuing.


QFT

Awaiting Jagged_Steel's response.


----------



## jrbroad77

So I'm genuinely curious: has AMD done any damage control on these reviews yet? More likely they'll not say a word and just act confused when someone says "benchmarks."


----------



## Jagged_Steel

Quote:


> Originally Posted by *kweechy;15287627*
> No, they'll have not a goddam clue they're getting less performance at all.
> 
> You guys don't realize this, but the amount of people that actually know anything of any real substance about computers and about computing is WAY less than 1 in 100.
> 
> 99.9% of people see 8 cores, see the speed, see the sweet packaging and graphic design and see the price.
> 
> All they know is this:
> 1) I can get 8 cores @ 3.6 gigasomethings for $1,000
> 2) I can get 4 cores @ 3.4 gigasomethings for $1,200
> 
> "You'd have to be a moron to pick #2!!!"
> 
> AMD is turning CPU cores into what megapixels became.
> 
> A single number you look at and decide on the purchase without understanding what it actually means to you. "WOW a 34 megapixel camera!!!" when they don't realize that unless you're rocking a full 35mm sensor, those 34 MPs are actually WORSE than 6 would have been.


I think you are giving the "average computer buyer" way more credit than they deserve. Your average computer buyer does not know what a core is or how many of them is adequate. They are more inclined to buy one computer over another based on something like hard drive size, and maybe CPU speed, which they can fathom, rather than lists of numbers and names like "Sandy Bridge" or "Bulldozer" that have no meaning to them. I would wager that your "average" computer buyer would have a hard time naming both Intel and AMD, let alone having a preference for one or the other based on anything more than name recognition from advertising. To them, the CPU brand is as relevant as the brand of resistors used on the mobo. Also, most computer buyers will still be sporting dual-core machines for quite some time; four cores and up is still a specialty market aimed at "enthusiasts" and professionals who need high computing power.


----------



## 996gt2

Quote:


> Originally Posted by *jrbroad77;15287766*
> So I'm genuinely curious, has *AMD done any damage control about these reviews yet?* More likely they'll say not a word and just act confused when someone says "benchmarks".


They got Jagged_Steel onto these forums?


----------



## aznofazns

Quote:


> Originally Posted by *Jagged_Steel;15287771*
> I think you are giving the "average computer buyer" way more credit than they actually deserve. Your average computer buyer does not know what a core is or how many of them is adequate . They are more inclined to buy one computer over another based on something like HD size and maybe CPU speed which they can fathom rather than some lists of numbers and names like "Sandy bridge" or Bulldozer" that have no meaning to them. I would wager a bet that your "average " computer buyer would have a hard time naming both Intel and AMD, let alone having a preference for one or the other based on anything more than name recognition by advertising. To them the CPU brand is as relevant as the brand of resistors used on the mobo. Also, most computer buyers will still be sporting two core machines for quite some time, four cores on up is still a specialty market aimed at "enthusiasts" and professionals that need high computing power.


Your well-put logic here makes me wonder if you've been trolling about BD this entire time. If so, well done. You've ignited an impressive flame war.


----------



## mad0314

Quote:


> Originally Posted by *aznofazns;15287797*
> Your well-put logic here makes me wonder if you've been trolling about BD this entire time. If so, well done. You've ignited an impressive flame war.


It's not a flame war, it's a firing range.


----------



## microfister

Quote:


> Originally Posted by *996gt2;15287774*
> They got Jagged_Steel onto these forums?


LOL,

but seriously though, anybody thought about my question above? disabling cores? maybe better performance on the remaining cores/less power draw?

i wana know!


----------



## aznofazns

Quote:


> Originally Posted by *microfister;15287845*
> LOL,
> 
> but seriously though, anybody thought about my question above? disabling cores? maybe better performance on the remaining cores/less power draw?
> 
> i wana know!


I'm curious about this too.

However, I'd imagine that if you could disable cores, you'd only be able to disable entire modules. If that's the case, the remaining modules still consist of two integer cores competing for resources. I don't think performance would improve.


----------



## mad0314

Quote:


> Originally Posted by *microfister;15287845*
> LOL,
> 
> but seriously though, anybody thought about my question above? disabling cores? maybe better performance on the remaining cores/less power draw?
> 
> i wana know!


Yes and no. You can disable modules to get a higher clock speed, but the gains will be marginal. The 8+ GHz world record was set with one active module. The problem with this route is that you take away BD's seemingly only advantage (core count) and are left with all of its drawbacks. Paying for an 8-core and turning it into a subpar quad-core that costs MORE than other, stronger quad cores? No thanks.

There was a rumor about people being able to disable single integer cores, so that each module is left with the FPU and one integer core. Do this to every module and you have a single integer core per module, with an effectively beefed-up FPU and no shared resources. I haven't followed up on it, though; maybe someone else can comment.


----------



## microfister

Quote:


> Originally Posted by *aznofazns;15287880*
> I'm curious about this too.
> 
> However, I'd imagine that if you could disable cores, you'd only be able to disable entire modules. If that's the case, the remaining modules still consist of two integer cores competing for resources. I don't think performance would improve.


Quote:


> Originally Posted by *mad0314;15287921*
> Yes and no. You can disable modules to get higher clock speed, but the gains will be marginal. The 8+GHz world record was set on 1 active module. The problem with this route is that you take away BDs seemingly only advantage (core count) and are left with all of its drawbacks. Paying for an 8 core and turning it into a sub par quad core for MORE than other available stronger quad cores? No thanks.
> 
> There was a rumor about people being able to disable single integer cores so that the module is left with the FPU and a single integer core. Do this to each module and you are left with a single integer core and FPU per module without the shared resources and a beefed up FPU. I haven't followed up on it though, maybe someone else can comment.


Hmmm... so I... should add more modules, after disabling cores?

I'm just kidding, what a fail.


----------



## aznofazns

Quote:


> Originally Posted by *microfister;15287923*
> hmmm... so i... should add more modules, after disabling cores?
> 
> im just kidding, what a fail


Each module consists of 2 integer cores, a floating point unit, and shared L2 cache. The poor per-core performance we've seen suggests that each integer core may be crippled by having to share resources with with the 2nd core. If we could disable one integer core in each module, per-core performance might improve.

I'm really just speculating, though. A main point in the Bulldozer architecture was that each integer core won't be severely hindered by its counterpart since they have dedicated integer schedulers and execution pipelines. I guess we'll see once more enthusiasts get their hands on it.
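If anyone wants to experiment with this without waiting for BIOS support, Linux exposes per-CPU hotplug through sysfs. Here's a dry-run sketch that only prints the commands it would run; note the (0,1)/(2,3) sibling pairing per module is my assumption, not something I've verified for BD:

```python
# Sketch: offline the 2nd integer core of each Bulldozer module via Linux sysfs.
# ASSUMPTION: logical CPUs pair up per module as (0,1), (2,3), ... -- check
# /sys/devices/system/cpu/cpu*/topology before trusting this on real hardware.

def offline_commands(n_modules: int) -> list[str]:
    """Return the shell commands that would offline one core per module."""
    return [
        f"echo 0 > /sys/devices/system/cpu/cpu{2 * m + 1}/online"
        for m in range(n_modules)
    ]

if __name__ == "__main__":
    for cmd in offline_commands(4):   # FX-8150: 4 modules
        print(cmd)                    # run these as root to actually apply
```

This is deliberately a dry run; whether the scheduler then keeps the remaining core's module resources to itself is exactly what needs benchmarking.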


----------



## microfister

Quote:


> Originally Posted by *aznofazns;15288083*
> Each module consists of 2 integer cores, a floating point unit, and shared L2 cache. The poor per-core performance we've seen suggests that each integer core may be crippled by having to share resources with the 2nd core. If we could disable one integer core in each module, per-core performance might improve.
> 
> I'm really just speculating, though.


Interesting. I'll wait to see more in-depth testing with BD. I surely can't justify picking one up to test when overall performance out of the box is comparable to its predecessors.


----------



## MoRLoK

Quote:


> Originally Posted by *hammertime850;15287227*
> they did, 1-4% increase


Ok ty...

So I'm confused right now. Would any of you buy it? In my case? I have a Gigabyte 970A-UD3 (excellent board; I will never buy expensive boards since I don't really like xfire/sli) and an Athlon X3 450 @ B50 3.9GHz + GTX 460 OC (still a very good card); at 100% load it takes around 250W (playing Assassin's Creed: Brotherhood). I'm scared about Bulldozer's power usage. Maybe an FX-4170 or FX-6100. I see them available here: FX-8150 $250, 2600K almost $450. When I compare my old Athlon (it's like Bear Grylls, it's been through a lot) to the FXs I see almost no difference in gaming. Do you think it will be good for the future, or is it simply a bad architecture that will never show anything better?


----------



## aznofazns

Quote:


> Originally Posted by *MoRLoK;15288112*
> Ok ty...
> 
> So I'm confused right now. Would any of you buy it? In my case? I have a Gigabyte 970A-UD3 (excellent board; I will never buy expensive boards since I don't really like xfire/sli) and an Athlon X3 450 @ B50 3.9GHz + GTX 460 OC (still a very good card); at 100% load it takes around 250W (playing Assassin's Creed: Brotherhood). I'm scared about Bulldozer's power usage. Maybe an FX-4170 or FX-6100. I see them available here: FX-8150 $250, 2600K almost $450. When I compare my old Athlon (it's like Bear Grylls, it's been through a lot) to the FXs I see almost no difference in gaming. Do you think it will be good for the future, or is it simply a bad architecture that will never show anything better?


Stick with your Rana chip at least until we see what Piledriver brings. Like you said, your CPU is not significantly slower than BD for gaming.


----------



## daman246

The ppl wondering about the Integers and Modules and what not
http://www.xtremesystems.org/forums/showthread.php?275873-AMD-FX-quot-Bulldozer-quot-Review-%284%29-!exclusive!-Excuse-for-1-Threaded-Perf.


----------



## hammertime850

Quote:


> Originally Posted by *MoRLoK;15288112*
> Ok ty...
> 
> So I'm confused right now. Would any of you buy it? In my case? I have a Gigabyte 970A-UD3 (excellent board; I will never buy expensive boards since I don't really like xfire/sli) and an Athlon X3 450 @ B50 3.9GHz + GTX 460 OC (still a very good card); at 100% load it takes around 250W (playing Assassin's Creed: Brotherhood). I'm scared about Bulldozer's power usage. Maybe an FX-4170 or FX-6100. I see them available here: FX-8150 $250, 2600K almost $450. When I compare my old Athlon (it's like Bear Grylls, it's been through a lot) to the FXs I see almost no difference in gaming. Do you think it will be good for the future, or is it simply a bad architecture that will never show anything better?


Get an X4 955.


----------



## daman246

If turning the 8150 into a quad core does indeed increase its performance, then this feels just like the X2 555 Callisto core that unlocks into a quad, lol. AMD failed again; now we have to disable instead of unlock.


----------



## mav2000

Quote:


> Originally Posted by *jrbroad77;15287766*
> So I'm genuinely curious, has AMD done any damage control about these reviews yet? More likely they'll say not a word and just act confused when someone says "benchmarks".


I don't think they can come out and say it's a failure... so I guess that's exactly what they will do.


----------



## mav451

Quote:


> Originally Posted by *daman246;15288252*
> If turning the 8150 into a quad core does indeed increase its performance, then this feels just like the X2 555 Callisto core that unlocks into a quad, lol. AMD failed again; now we have to disable instead of unlock.


Haha, yeah, I can't believe we have to disable cores to get good IPC... but crazier things have happened. I don't want to get my hopes up over nothing though; I really need to see more numbers.


----------



## aznofazns

Quote:


> Originally Posted by *daman246;15288221*
> The ppl wondering about the Integers and Modules and what not
> http://www.xtremesystems.org/forums/showthread.php?275873-AMD-FX-quot-Bulldozer-quot-Review-%284%29-!exclusive!-Excuse-for-1-Threaded-Perf.


Wow, that explains a lot. The 4M/4C tests seem to fall in line with what I expected. IPC is actually higher than Phenom II once each integer core has a module's resources all to itself.

Still, it doesn't make sense to pay $220 for a 4M/8C and turn it into a 4M/4C, since performance is still much lower than Sandy Bridge.


----------



## nagle3092

I have a feeling that the horrible power consumption is due to a shoddy 32nm process at GF. There is really no other way to explain it.


----------



## mad0314

Quote:


> Originally Posted by *daman246;15288221*
> The ppl wondering about the Integers and Modules and what not
> http://www.xtremesystems.org/forums/showthread.php?275873-AMD-FX-quot-Bulldozer-quot-Review-%284%29-!exclusive!-Excuse-for-1-Threaded-Perf.


Ah, that's the link I was talking about, but the site was under maintenance when I tried to view it, and then I went to school and whatnot.

Thanks for posting it.


----------



## caffeinescandal

Damn it AMD. I trusted you; I was hoping for something phenomenal since the early benchmarks were supposedly fake. I waited, and got myself a 990FX board. And now this is what I see? I am very disappoint.

Still, that doesn't mean I'll go Intel soon. I already spent money on this board; should I get a Phenom II X6?


----------



## jrbroad77

Quote:


> Originally Posted by *nagle3092;15288359*
> I have a feeling that the horrible power consumption is due to a shoddy 32nm process at GF. There is really no other way to explain it.


2B transistors at 4.6GHz; pretty sure that explains it. The 990X is on 32nm and its load power consumption is only 25W less, at 4.5GHz (see [H]'s review). Just to clarify for anyone else, I'm talking peak load power consumption. Disregarding performance, it seems reasonable.


----------



## Strat79

Quote:


> Originally Posted by *nagle3092;15288359*
> I have a feeling that the horrible power consumption is due to a shoddy 32nm process at GF. There is really no other way to explain it.


The horrible power consumption is due to it being an 8-integer-core CPU with almost double the transistors of other CPUs like the 2600K. Of course it is going to draw quite a bit of power, especially when overclocked. I think people are forgetting the massive transistor count these things are packing, plus 8 full integer cores. If you packed 4 more integer cores onto SB chips, they would draw just as much and most likely quite a bit more. It still doesn't account for the lackluster performance-per-watt, mind you; I'm just stating why it is most likely using so much power.

Edit: jrbroad77 beat me to it.
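As a rough sanity check of the transistor-count argument: first-order dynamic power scales with switched capacitance (proxied here by transistor count), voltage squared, and frequency. The voltages below are assumed round numbers and the 2B figure is just the one quoted in this thread, so treat this as a crude illustration that ignores leakage and process differences:

```python
# First-order dynamic-power comparison: P is roughly proportional to
# (transistor count) * V^2 * f. Inputs are approximate / assumed.
def relative_power(transistors_billion: float, volts: float, ghz: float) -> float:
    return transistors_billion * volts ** 2 * ghz

bd = relative_power(2.0, 1.4, 4.6)    # overclocked FX-8150-ish inputs (voltage assumed)
sb = relative_power(1.16, 1.35, 3.8)  # i7-2600K-ish inputs (voltage assumed)
print(f"BD draws ~{bd / sb:.1f}x SB in this crude model")  # ~2.2x
```

Even a toy model like this says a much bigger, higher-clocked chip should draw a lot more, before blaming the process at all.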


----------



## 2010rig

Quote:


> Originally Posted by *Lampen;15287457*
> I'm sorry but what is this? I've never seen any chart EVER that had the i7 920 at over 500 watts. And its pulling more watts than a 990x when the 990x has two more cores and is clocked even higher? WUT?


I've never seen such numbers either.

Quote:


> Originally Posted by *mad0314;15287669*
> Did you by chance take a look at BOTH of the BF3 charts?
> In case you didn't here they are again:
> Stock clocks:
> 
> Overclocked to 4.8GHz
> 
> Clearly a GPU bottleneck. If you think otherwise you are delusional. I fully expect this post to be ignored though.


Makes one wonder how they got more FPS at stock than overclocked.
Quote:


> Originally Posted by *aznofazns;15288316*
> Wow, that explains a lot. The 4M/4C tests seem to fall in line with what I expected. IPC is actually higher than Phenom II once each integer core has a module's resources all to itself.
> 
> Still, it doesn't make sense to pay $220 for a 4M/8C and turn it into a 4M/4C, since performance is still much lower than Sandy Bridge.


Imagine if AMD had just made 4 strong cores?

But that doesn't have the same ring to it as "the world's first 8-core CPU".


----------



## Eolas

Quote:


> Originally Posted by *Strat79;15288413*
> The horrible power consumption is due to it being an 8-integer-core CPU with almost double the transistors of other CPUs like the 2600K. Of course it is going to draw quite a bit of power, especially when overclocked. I think people are forgetting the massive transistor count these things are packing, plus 8 full integer cores. If you packed 4 more integer cores onto SB chips, they would draw just as much and most likely quite a bit more. It still doesn't account for the lackluster performance-per-watt, mind you; I'm just stating why it is most likely using so much power.
> 
> Edit: jrbroad77 beat me to it.


The question is...with all that horsepower why is the bulldozer floundering? And can it be fixed?


----------



## WhitePrQjser

So right now, what?

Sandy Bridge or Bulldozer?


----------



## Strat79

Quote:


> Originally Posted by *Eolas;15288430*
> The question is...with all that horsepower why is the bulldozer floundering? And can it be fixed?


I have a feeling it has the raw horsepower to easily overpower the current top-end chips if everything else had been engineered correctly within the package. Something is definitely off with BD. With that much under the hood, it seems like it is being starved of something, or lacking a proper scheduler, or something along those lines. I don't know much about it honestly, but it just seems very odd that it could perform so badly with all the stuff packed inside. I still have hopes that some improvements will come in some form, whether software coded to take advantage of BD's unique methods or a hardware revision to fix decode/prefetch/scheduling. I'm not expecting a miracle like some others around here, but even a 10-15% boost from a quick new revision would be a pretty huge step forward if they did it quickly enough.


----------



## aznofazns

Quote:


> Originally Posted by *WhitePrQjser;15288438*
> So right now, what?
> 
> Sandy Bridge or Bulldozer?


Definitely Sandy Bridge, but the FX-6100 may be a good choice for lower end builds.

http://www.tweaktown.com/articles/4347/shi_y_marketing_killed_the_bulldozer_star/index.html

Excellent read right there.


----------



## Cyclonic

Quote:


> Originally Posted by *WhitePrQjser;15288438*
> So right now, what?
> 
> Sandy Bridge or Bulldozer?


Seriously? Check my sig and my avatar.


----------



## WhitePrQjser

I'm gonna make a poll about it, so look around in the General Processor Discussions thread


----------



## tpi2007

This whole situation is really kind of awkward and a little sad. I was reading Anand's review and looking at how Thuban performs, and it really raises the question: what if AMD had just made a few architectural improvements, lowered the cache and memory latencies, doubled the L3 cache to 12 MB, and added two extra cores and Turbo Core 2.0, on a 32nm process?

I mean, when you look at the data in Anand's review you can see they actually made a few improvements: the cache and memory latencies are better in Thuban compared to the Phenom II X4, and sometimes it possibly doesn't do better than a 3.6 GHz X4 only because Turbo Core is very rudimentary and doesn't really work that well. But you get the feeling they were on the right path with Thuban.

And after reading some more, it really looks like they have a Prescott on their hands, even if that wasn't exactly their goal. It uses too much power, they are possibly having problems with the process technology (they are moving to a better 32nm one next year), and it does well in some tasks, especially multi-threaded ones, but loses in games, single-threaded tasks, and other tests.

This is Prescott all over again. Prescott also didn't lose every benchmark; it won in video editing and some productivity software. But games were FX territory back then, which makes this moniker's comeback all the more embarrassing.

It's funny how they went about designing this chip: they are thinking about the future, but they can't afford to think like this; they literally don't have the money to do it. Intel could do it with Prescott, but AMD can't. Because by the time their approach finally makes sense (if it ever does), it will be too late. They need to sell chips with good performance now, not in a year's time, when their relative performance will be lower in any case and in need of an upgrade anyhow.

The same thing happened with Prescott's Hyper-Threading, which in many cases was detrimental to performance, including games. You had better multitasking in Windows, and some applications took advantage of it, but it wasn't a win-win situation. Even today the gamer's choice is the 2500K: you can clearly see from the benchmarks that the 2600K needs an additional 100 MHz and 2MB of cache to somewhat compensate for HT being on. Sometimes it wins in games that already take advantage of it, but in many others it loses and comes in just ahead of the i5 2400.

This is a testament that you can't have your cake and fully eat it, even today. HT has gotten much better, but Intel knows it's not perfect, and hence the 2500K. If you want the benefits in multithreaded applications and are willing to sacrifice a few fps in some games, then the 2600K is the best choice, but for general use the 2500K is the king of the hill, also because of price/performance.

So Intel learned that you need to differentiate according to users' needs, and give them the performance they need right now. It's really of no benefit to have a CPU with HT that your apps don't take advantage of while still taking a performance hit for it. AMD is only now going to learn that lesson.

And they really can't afford to lower single-threaded performance when their previous effort still wasn't enough. This is just ridiculous. You can only do that many years after the fact. Let me give an example: my Core 2 Quad Q9550 overclocked to 3.4 GHz has comparable performance to a Phenom II X4 975/980, and it plays most games fine. However, some games that like fast dual cores, like Crysis, sometimes dip into the 20s fps. And there are many other applications and games that favour faster dual- or single-threaded execution.

So until a CPU manufacturer reaches the point where you get unquestionably good performance in these types of usage scenarios, single- and dual-threaded performance has to keep going up, not down. As I said, you can only lower it once a CPU's performance is so high that you can afford to lose a bit in some cases, just like dropping a native CPU instruction because few people use it and emulation will still run it more than fast enough.


----------



## Strat79

Quote:


> Originally Posted by *WhitePrQjser;15288507*
> I'm gonna make a poll about it, so look around in the General Processor Discussions thread


It really depends on what you plan on doing with it. For most things, and I mean almost everything outside of maybe video encoding, it is a no brainer to go with SB over BD as of right now. Even in the few things BD is showing promise in, you have to weigh the power consumption and cost to run it, so even then it may not be worth it. I'd have a hard time recommending a BD build to anyone over SB and this is coming from a long, long time AMD supporter and fan. I'd almost go as far as saying I was a fanboy, just not a crazed fanatical fanboy like some on here. I always pull for underdogs and AMD was one of my favs for well over a decade. They have lost a long time customer with this stunt though. At least temporarily, if they can fix this quickly.


----------



## WhitePrQjser

Quote:


> Originally Posted by *Strat79;15288539*
> It really depends on what you plan on doing with it. For most things, and I mean almost everything outside of maybe video encoding, it is a no brainer to go with SB over BD as of right now. Even in the few things BD is showing promise in, you have to weigh the power consumption and cost to run it, so even then it may not be worth it. I'd have a hard time recommending a BD build to anyone over SB and this is coming from a long, long time AMD supporter and fan.


Yeah, well, I think the reviews are kind of misleading, because most of them say negative stuff about BD, but then again some have made positive reviews... It's confusing my mind! Especially because I really wanted to build an AMD BD rig for a couple of friends :S


----------



## Strat79

Quote:


> Originally Posted by *WhitePrQjser;15288553*
> Yeah, well, I think the reviews are kind of misleading, because most of them say negative stuff about BD, but then again some have made positive reviews... It's confusing my mind! Especially because I really wanted to build an AMD BD rig for a couple of friends :S


I'd just wait a few weeks until all the people on here get their hands on them and bench them independently so we know for sure. Plus it may give time for any new patches/fixes that could possibly, though unlikely, come out to improve performance. Make your decision after being more informed; it can't hurt to wait.


----------



## microfister

Quote:


> Originally Posted by *urbandictionary.com*
> 1. Faildozer
> To fail over and over again in spectacular manner.
> "Here comes the Faildozer!"
> by KayCeeGee, Apr 20, 2008
> 2. Faildozer
> 1. An individual whose attempts to complete any and all tasks are rewarded only by shame and humiliation.
> 2. The heavy machinery of failure.
> " *beep* *beep* HERE COMES THE FAILDOZER!!!"
> by S.D.B.P., May 10, 2008


lol, I was unaware that it came from Urban Dictionary. It's sad, and I feel bad for everyone that held out to upgrade, but some of the stuff about it is too funny. My favorite is still the Hitler one.


----------



## Trogdor

AMD owners sure are the minority in this thread. It's like Intel owners are trying to make up for the recall 1155 had. Yay Intel for gaining the lead and staying ahead the last few years.


----------



## omninmo

OK FOLKS SOME GOOD NEWS (for some people!)

BTW: This is not my work; I just compiled info and ported it here from the XtremeSystems thread posted a couple of pages back.

Disabling one core per module to avoid resource sharing improves single-threaded IPC A LOT, ACROSS THE BOARD, and surely explains some of the stranger results we've seen!
Credit to DGLee @ XS who tested this, and to chew* who stepped in to confirm having similar results when he was playing around with the chip. Here are the benchmark improvements comparing 2M/4C vs 4M/4C:

*Fritz Chess: 39.1%* improvement
*wPrime 32M: 31%* improvement
*WinRAR: 9.5%* improvement
*3DMark06 CPU: 5.8%* improvement
*3DMark Vantage CPU: 22.1%* improvement
*3DMark11 Physics: 14.1%* improvement
*Cinebench R10: 21.4%* improvement
*Cinebench R11.5: 19.1%* improvement
*Blender: 21.7%* improvement
*TechARP x264 enc: 20%* improvement
*Daum PotEncoder H.264 transcoding: 11.7%* improvement

EDIT TO ELABORATE

There isn't a direct comparison benchmark, but word going around is that these across-the-board gains seem to leave *Bulldozer with a bit more IPC than Phenom II!*

This means, for people willing to *turn off 4 of their threads*, that Bulldozer *will NOT suck as hard* as it first appeared in gaming and lightly threaded apps (although we are still waiting on gaming benches to confirm!). In particular, the gains are especially big in some of the weirder results we had seen in the original reviews.

IMO this makes BD at least *a bit more viable*, as with 4 cores disabled you will likely be seeing close to or actually *5GHz on air, and more on water, with slightly higher IPC than Phenom*! And all the while *lowering power draw and heat output!*

Ergo, *not a 2500K killer* by any stretch, BUT *at least a viable upgrade* for those who already have a 990FX board.

ATTENTION: it is currently *not known whether all boards support disabling individual cores*; testing was done on a CHV, so I can't comment on others!

SOURCE: http://www.xtremesystems.org/forums/showthread.php?275873-AMD-FX-quot-Bulldozer-quot-Review-%284%29-!exclusive!-Excuse-for-1-Threaded-Perf.
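A quick average of the gains as quoted above (simple arithmetic over the posted numbers, not new data):

```python
# Rough mean of DGLee's quoted 2M/4C -> 4M/4C gains (numbers as posted above).
gains = [39.1, 31.0, 9.5, 5.8, 22.1, 14.1, 21.4, 19.1, 21.7, 20.0, 11.7]

mean_gain = sum(gains) / len(gains)
print(f"average improvement: {mean_gain:.1f}%")  # about 19.6% across these tests
```

That average hides a big spread (5.8% up to 39.1%), so how much it helps clearly depends on the workload.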


----------



## Moparman

See, AMD should have scrapped their idea of Bulldozer and used mine: the FX81, aka an 1100T + 2 cores. Chop up a 2600K, add its Hyper-Threading, and you get the monster FX81 8-core/16-thread CPU.

Well it looked good on paper anyway.


----------



## Partol

According to this,
gaming performance improves a little when running in 4 Module/4 Core mode vs 4 Module/8 Core mode

http://www.hardware.fr/articles/842-9/efficacite-cmt.html


----------



## Artikbot

AMD really messed this up.

Shoulda scrapped the module thingy, added the second FPU, and done a TRUE 8-core.

That 20-25% IPC increase would've skyrocketed performance.

But what do we have? A bunch of under-performing useless silicon that runs WAY SLOWER than the last gen.

sigh ~


----------



## omninmo

Quote:


> Originally Posted by *Partol;15288859*
> According to this,
> gaming performance improves a little when running in 4 Module/4 Core mode vs 4 Module/8 Core mode
> 
> http://www.hardware.fr/articles/842-9/efficacite-cmt.html


That falls in line with DGLee's findings I posted above; see more results in this post:

http://www.overclock.net/hardware-news/1139571-official-amd-bulldozer-reviews-thread-161.html#post15288816


----------



## mad0314

Quote:


> Originally Posted by *Artikbot;15288890*
> AMD really messed this up.
> 
> Shoulda scrapped the module thingy, added the second FPU, and done a TRUE 8-core.
> 
> That 20-25% IPC increase would've skyrocketed performance.
> 
> But what do we have? A bunch of under-performing useless silicon that runs WAY SLOWER than the last gen.
> 
> sigh ~


You can't have everything; there are tradeoffs involved, mainly die size. It would make the die larger and they would need a new socket, but they want to keep the same socket... this is the price of trying to stay on the same socket forever. At some point it was bound to hit a limit.


----------



## aznofazns

Quote:


> Originally Posted by *Partol;15288859*
> According to this,
> gaming performance improves a little when running in 4 Module/4 Core mode vs 4 Module/8 Core mode
> 
> http://www.hardware.fr/articles/842-9/efficacite-cmt.html



Sadly, the performance gains in gaming when going from 4M/8C to 4M/4C are minimal. See the bottom six results.

Although I think those tests were run at 1080p, which would likely result in a GPU bottleneck. Not sure.


----------



## nagle3092

Quote:


> Originally Posted by *mad0314;15288905*
> You can't have everything. There's tradeoffs involved, mainly die size. It would make it larger and they would need a new socket, but they want to use the same socket... this is the price for trying to stay on the same socket forever. At some point it was bound to be a limit.


Well, they need to stop designing CPUs around sockets and compatibility and focus on architecture and performance gains.

I have a feeling that as long as they keep AM3/3+ around, nothing is going to change much.


----------



## omninmo

If we could build a profile in AMD OverDrive or something that could automatically disable 4 threads every time you ran a game or some other specified task and OC to 5GHz+ or whatever, and then revert back to, say, 8 cores @ 4.5GHz for general purpose and other highly threaded apps where BD shines naturally (like video encoding or compression or productivity), then BD wouldn't be such a disappointment, I think...

IF you could do it automatically without needing to go into the BIOS to do it properly... And after a price drop or something...

Not that I intend to upgrade for the moment anyway.
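Short of BIOS toggles or an OverDrive profile, a per-game approximation on Linux is to restrict the game's CPU affinity to one core per module before launching it. A sketch, assuming the (0,1)/(2,3) core pairing per module (unverified for BD), and noting `os.sched_setaffinity` is Linux-only:

```python
import os

# ASSUMPTION: logical CPUs pair up per module as (0,1), (2,3), ...
def cores_to_keep(n_modules: int = 4) -> set[int]:
    """One logical CPU per module, e.g. {0, 2, 4, 6} on an FX-8150."""
    return {2 * m for m in range(n_modules)}

def pin_current_process(n_modules: int = 4) -> None:
    # Linux-only: restrict this process (and any game it then launches,
    # since children inherit the mask) to one core per module.
    os.sched_setaffinity(0, cores_to_keep(n_modules))
```

Unlike offlining cores in the BIOS, this doesn't change clocks or power states; it just keeps the game's threads from landing on both cores of a module.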


----------



## mad0314

Quote:


> Originally Posted by *omninmo;15289011*
> if we could build a profile on AMD Overdrive or something that could automatically disable 4 threads everytime you ran a game or some other specified task and OC to 5Ghz+ or whatever
> 
> and then
> 
> reverting back to say, 8 cores @ 4.5ghz for general purpose and other highly threaded apps where BD shines naturally (like video encoding or compression or productivity etc..) then BD wouldn't be such a disappointment, i think...
> 
> IF you cuold do it automatically without needing to go to the BIOS to do it properly... And after a price drop or something...
> 
> Not that I intend to upgrade anyway for the moment


Or if they just fixed the way it works... the ideas are good; the results are not. They have a lot of new ideas in this chip, but sadly the rest of the system, mainly the OS and software, is not set up to take advantage of it, and it just ends up being a disadvantage instead of an advantage.


----------



## P.J

Bulldozer?


----------



## PROBN4LYFE

Lol AMD...come on son!!!!!


----------



## Captain Bucket

I was going to come in here saying "Told you so", but apparently, there are still some folks around who refuse to be told


----------



## DMac84

This is unfortunate

I'm disappointed with AMD because, as consumers, we really needed Bulldozer to challenge Intel so we could get some better/cheaper chips.

Does not look like that's gonna happen this round. :/


----------



## Wishmaker

Some of the posts in this thread are puzzling. Before the BD benches, people said the reason for low scores was the ES name in the identifier. Furthermore, they argued over and over again that BD is a 'true' octa-core CPU even though we all know that is not the case. Now they are saying it's all good, and if you want more performance you need to disable stuff on it? If I have to disable things on it, why should I buy an octa-core CPU?

This does not make sense. AMD is advertising one thing but selling something else. They did the same with the 5870: 'best gaming card in the world, best graphics card on the planet'. Make it tessellate and it will go down on its knees compared to the NVIDIA counterpart. Now AMD advertises the awesome price associated with the FX series, trying to distract from the larger picture. Instead of admitting 'yes, we dropped the ball, we are fixing it', they are sweeping it under the rug?

At this point in time, I will not consider BD as a solution for my third rig. I'll go IVY.


----------



## SpuddGunn

Didn't the new CEO cancel a load of upcoming products?

Based on the theory that they were not of the quality the new guy wants, I'd say Bulldozer would be in the same boat, except the new CEO couldn't cancel this product with it being so close to release.

I'd call Bulldozer the last of the old AMD, so I'm more interested in seeing what happens when the first products developed under the stewardship of the new guy come out.


----------



## linskingdom

Taking four years to deliver a product like this clearly leaves many disappointed, regardless of which market and user level it was targeting. Based on most reviews here, it is clear that Bulldozer doesn't work well with current OSes and hardware components. You can argue that its architecture is focused on the future and may work better with upcoming software and hardware, but it seems to me that this requires a lot of accommodation from software and hardware vendors, which is an extremely difficult task. A typical user just can't spend $200 on it and hope that happens, not to mention the power consumption when OCing. It also seems there is a disconnect among the development, production, and marketing teams, and they all dropped the ball...


----------



## tout

Has anyone seen benchmarks when running in Eyefinity? That's what I'm interested in, since that's what I run. I saw several tests showing BD performing equal to a 2600K in high-resolution gaming (2560x1600).


----------



## Newbie2009

AMD maybe misunderstood the term teraflop and just went for the flop part.

Well, they delivered that in spades.


----------



## mav451

Quote:



> Originally Posted by *aznofazns*
> Sadly, the performance gains in gaming when going from 4M/8C to 4M/4C are minimal. See the bottom six results.
> 
> Although I think those tests were run at 1080p, which would likely result in a GPU bottleneck. Not sure.


Uh, 1080p is as _mainstream_ a resolution as you're going to get. It'd be another story if they were running _2560x1600_ or _Eyefinity resolutions_; then you'd have a better argument.

Thanks for posting that though...I think that pretty much removes all my optimism now. Sigh - chew* had given me a lot of hope from his posts too. It's one thing to be manipulated by marketing folks...but frankly I'm disappointed that I was led on by one of the more prominent OCers in the community too.


----------



## RedCloudFuneral

I don't get why people complain so much about them turning up the graphics settings in the gaming tests? I don't think the whole point of a fast CPU is to turn down your settings and watch the frames fly.


----------



## hak8or

Quote:



> Originally Posted by *RedCloudFuneral*
> I don't get why people complain so much about them turning up the graphics settings in the gaming tests? I don't think the whole point of a fast CPU is to turn down your settings and watch the frames fly.


I am not exactly sure what you are referring to, but if you mean reviewers using higher game settings in benchmarks than other reviewers, people are unhappy because it makes the conditions unequal. When you are comparing two separate things, you want the environment to be as controlled as possible, including but not limited to game patches, resolution, settings, and the usual measures to get rid of bottlenecking.


----------



## lordikon

Quote:



> Originally Posted by *RedCloudFuneral*
> I don't get why people complain so much about them turning up the graphics settings in the gaming tests? I don't think the whole point of a fast CPU is to turn down your settings and watch the frames fly.


Because when the graphics are turned up, the game becomes GPU-bound, meaning the reason you can no longer get higher FPS is that the graphics card cannot keep up. At that point most CPUs will all show the same framerates, which makes the test worthless for showing which CPU handles games faster. What they should have done is thrown in another 1-2 GPUs to get rid of the GPU bottleneck, so we could see where the CPUs really stand in a high-performance gaming environment.
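The bottleneck argument is easy to model: per-frame time is roughly the slower of the CPU's and the GPU's share of the work, so once the GPU dominates, CPU speed stops showing up in the numbers. A toy illustration with made-up frame times:

```python
# Toy model of a GPU-bound benchmark: each frame takes roughly
# max(cpu_time, gpu_time), so a faster CPU only helps while cpu_time > gpu_time.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# Low settings: the GPU is fast (5 ms/frame), so the CPU difference shows.
print(fps(cpu_ms=10.0, gpu_ms=5.0))   # slower CPU -> 100 fps
print(fps(cpu_ms=6.0, gpu_ms=5.0))    # faster CPU -> ~167 fps

# Maxed-out settings: the GPU needs 20 ms/frame; both CPUs read 50 fps.
print(fps(cpu_ms=10.0, gpu_ms=20.0))  # 50 fps
print(fps(cpu_ms=6.0, gpu_ms=20.0))   # 50 fps
```

All numbers are invented for illustration; the point is just that once `gpu_ms` exceeds `cpu_ms`, every CPU scores identically.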


----------



## anubis1127

Quote:



> Originally Posted by *RedCloudFuneral*
> I don't get why people complain so much about them turning up the graphics settings in the gaming tests? I don't think the whole point of a fast CPU is to turn down your settings and watch the frames fly.


Because when you do that, the test becomes GPU-limited. At that point the test is all about the GPU. You would notice little variance between CPUs as long as they are fast enough to feed the GPU, as shown in the Overclockers Club review, I believe. At that point the test is worthless, because you would get the same result with an i3 as with a Phenom II, an i7, an FX chip, a C2D, an Athlon II, etc.


----------



## tout

As far as real-world performance, from what I have seen, BD is performing equal to a 2500K. Rendering and encoding seem to be where it shines. Add in that these CPUs seem to overclock like a beast (compared to Phenom IIs) and I don't see where the massive fail is. If I can run an FX-8150 at 5+ GHz on water, then sign me up! It will blow away my 1090T at 4 GHz.

Yes, for low-resolution gaming (1920 x 1080 and under) it is a fail, although many gaming benchmarks show it competing with a 2500K at 1080p and up. Single-threaded apps, also a fail, but I don't buy 6- and 8-core CPUs for single-threaded use. I would buy a 4-core or less for that.

AMD claims Windows 8 is better optimized for Bulldozer, with up to a 10% increase in performance. We'll see.

I will own one eventually.


----------



## Ulysses Cazuquel

best link about the Bulldozer

http://www.xtremesystems.org/forums/showthread.php?275873-AMD-FX-quot-Bulldozer-quot-Review-(4)-!exclusive!-Excuse-for-1-Threaded-Perf.


----------



## DayoftheGreek

Quote:


> Originally Posted by *RedCloudFuneral;15290593*
> I don't get why people complain so much about them turning up the graphics settings in the gaming tests. I don't think the whole point of a fast CPU is to turn down your settings and watch the frames fly.


People complain because it is a bad test of CPU power, which is exactly what we are trying to test. If you have the graphics super high, the graphics card is doing all the work and the processor sits idle most of the time. An i3, an X4, an X6, and a 2600K will all get about the same frame rate when you pair them with an awesome graphics card and crank up the graphics. When you lower the graphics settings, the CPU has to do much more work to keep up with the graphics card that is spitting out frames left and right. So if you were to lower graphics settings slowly, weaker processors would start to get left behind first.

And as you can see in some of the tests, a high-graphics test produces funny results. There is one test where the 2600K scores 3-4 FPS LESS when it is OC'ed a lot. That happens because the processor is completely out of the equation, since the GPU is doing all the work. The test could be redone at 3GHz, 5GHz, or 8GHz, and all the results would just be the random variation of the test.
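The pipeline argument above can be reduced to a toy model (the numbers below are purely illustrative and not from any review): the delivered frame rate is capped by the slower of the CPU and GPU stages, so raising graphics settings (lowering the GPU's rate) makes very different CPUs converge on the same FPS.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Delivered frame rate is limited by the slower pipeline stage."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPUs that can prepare 90/130/170 frames per second.
cpus = {"slow CPU": 90.0, "mid CPU": 130.0, "fast CPU": 170.0}

# GPU-bound (maxed settings, GPU manages 60 FPS): every CPU shows 60.
print([effective_fps(fps, 60.0) for fps in cpus.values()])   # [60.0, 60.0, 60.0]

# CPU-bound (low settings, GPU manages 300 FPS): the CPUs finally separate.
print([effective_fps(fps, 300.0) for fps in cpus.values()])  # [90.0, 130.0, 170.0]
```

This is exactly why a CPU review benchmarks at low settings even though nobody games that way: it is the only regime where the CPU is the limiting stage.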


----------



## odditory

Quote:



Originally Posted by *tout*


As far as real world performance, from what I have seen, BD is performing equal to a 2500K.


LOL, "performing equal" -- but sucking way more power; you left that part out. So exactly how is that performing equal? Ever heard of the term "performance per watt"?

Make all the excuses you want, but BD is DOA, brutha. You know it, I know it, the whole internet knows it - and I give credit to some of the review sites for trying to break the news compassionately, letting AMD retain some small shred of dignity. Because there are some sites painting a much bleaker scene - one with Intel smooshing AMD's face into the dirt again and again and again.
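For anyone unfamiliar with the term being argued about: "performance per watt" is just a benchmark score divided by measured power draw. A minimal sketch with entirely made-up numbers (neither the scores nor the wattages below come from any review):

```python
def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark score delivered per watt of measured power draw."""
    return score / watts

# Hypothetical chips with equal scores but different power draw.
chip_a = perf_per_watt(score=100.0, watts=125.0)  # 0.8 points/W
chip_b = perf_per_watt(score=100.0, watts=95.0)   # ~1.05 points/W

# Equal raw performance, unequal efficiency.
print(chip_a < chip_b)  # True
```

The point of the metric is that two chips with identical benchmark scores are not "equal" if one burns substantially more power to get there.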


----------



## ryboto

Quote:



Originally Posted by *tout*


As far as real world performance, from what I have seen, BD is performing equal to a 2500K. Rendering and encoding seem to be where it shines. Add in that these CPUs seem to overclock like a beast (compared to Phenom IIs) I don't see where the massive fail is. If I can run a FX 8150 at 5+ GHz on water then sign me up! It will blow away my 1090T at 4 GHz.

Yes for low resolution gaming (1920 x 1080 and under) it is a fail. Although many gaming benchmarks show it competing with a 2500K at 1080p and up. Single threaded apps, also a fail but I don't buy 6 and 8 core CPUs for single threaded use. I would buy a 4 core or less for that.

AMD claims Windows 8 is better optimized for BullDozers. They claim up to 10% increase in performance. We'll see.

I will own one eventually.


I still think something is severely limiting Bulldozer's performance. There's no way 2 billion transistors and a more "advanced" design yield this kind of crap performance versus the previous generation. It may be linked to the process technology and yields... at least I hope so, or something is inherently flawed in the microarch. I wouldn't have nearly as much issue with the performance if the power consumption were on par with or better than the competition. As it is, I wouldn't touch one... I really wanted to, but I feel like my affair with Intel may turn into something more serious.


----------



## mav451

Quote:



Originally Posted by *ryboto*


I still think something is severely limiting Bulldozer's performance. There's no way 2 Billion transistors and a more "advanced" design yield this kind of crap performance over a previous generation. It may be linked to the process technology, and yields...at least I hope so, or something is inherently flawed in the microarch. I wouldn't have nearly as much issue with the performance if the power consumption was on par or better than the competition. *As it is, I wouldn't touch one....I really wanted to, but I feel like my affair with Intel may turn into something more serious.*


Hahah this thread is delivering.


----------



## Darkcyde


Saving for an Intel 2011 setup. Sorry AMD. .....wait.... I'm not sorry at all.


----------



## tout

Quote:



Originally Posted by *odditory*


LOL "performing equal" -- but sucking way more power, exactly how is that equal? Ever heard of the term "performance per watt"? Let me google that for you.

Make all the excuses you want but BD is DOA, brutha. You know it, I know it, the whole tech review internet knows it. We're talking layoffs and people jumping out of windows. Think I'm kidding?


lol, I don't care about performance per watt; I am not running a laptop here. I run my system full bore 24/7. Yes, it shows the inefficiencies that BD has, but for us enthusiasts performance per watt means next to nothing.

I also don't ever run my system at a paltry 1080p resolution; I am way beyond that, brotha. I want to know where BD stands with high-end GPUs, at high-end resolutions and high-end graphics settings, if we are gonna discuss gaming. 1080p is so 2 years ago.

In the 'real world' most people don't sit around running benchmarks and overclocking their CPUs to the limits. They run at stock for everything. At stock, BD compares to a 2500K in rendering, encoding and other multi-threaded apps, just as it should being a multi-core CPU. No one is gonna notice that their web browser opened 1/1000th of a second slower because single-threaded performance is lower. At stock settings the CPU uses 125 (or 95) watts, just like a Phenom II. From the review sites, at stock settings, Cool'n'Quiet actually works on BD compared to Thubans. It even does better than Intel's i5 series in some tests.


----------



## Epsi

Dunno if posted already.

AMD FX-8150 vs. Intel i7-2600k CrossFireX HD 6970 x3 Head-to-Head:

http://www.tweaktown.com/articles/43...ad/index1.html


----------



## Don Karnage

http://www.tweaktown.com/articles/4353/amd_fx_8150_vs_intel_i7_2600k_crossfirex_hd_6970_x3_head_to_head/index1.html
Quote:


> I walk away from this going, "why do we even have boards with more than two PCIe x16 slots on it?" - That's really disappointing. It's sad for AMD that the best machine for someone to install three HD 6970s into isn't an AMD based one, by a long way.


----------



## racer86

Probably been posted, but I'm not reading through 160 pages of arguing lol.

Why do the benches on the non-Asus boards look so much better across the board? Hell, even the power consumption is different between the boards. Problems with the Asus BIOS?

If it really does perform that well on non-Asus boards, then I might look at picking one up.


----------



## anubis1127

Quote:



Originally Posted by *Epsi*


Dunno if posted already.

AMD FX-8150 vs. Intel i7-2600k CrossFireX HD 6970 x3 Head-to-Head:

http://www.tweaktown.com/articles/43...ad/index1.html


Cool man, thanks. I had been looking forward to seeing a test like this.

[edit]

After reading the review, I must say the FX-8150 did better than I thought it would, but it still failed hard at keeping multiple GPUs fed.

It looks like if you're a serious PC gamer and want to build a multi-GPU setup, the FX is not for you.


----------



## matty0610

In what case would you need 3 6970s at 1920x1200 for gaming?


----------



## Don Karnage

Quote:



Originally Posted by *matty0610*


In what case would you need 3 6970s at 1920x1200 for gaming?


They tested at 2560X1600 as well.


----------



## Twitchie

Oh my god, the bottleneck hurts my eyes.


----------



## Hiep

Quote:



Originally Posted by *matty0610*


In what case would you need 3 6970s at 1920x1200 for gaming?


Metro 2033.


----------



## no1Joeno1

Quote:



Originally Posted by *matty0610*


In what case would you need 3 6970s at 1920x1200 for gaming?


In a case where you can't allow a GPU to be a bottleneck?


----------



## Epsi

Quote:



Originally Posted by *racer86*


prolly been posted but im not reading through 160 pages of arguing lol

why do the benches on the non asus boards look so much better across the board? hell even the power consumption is different on the boards problems with asus bios??

if it really does perform that well on non asus boards then i might look at picking up one


Yeah, the power consumption is only +20 watts for AMD vs. Intel... strange.


----------



## CerealKillah

OK, I understand this graph shows the superiority of the 2600K platform with multiple video cards, but it also shows something else (which cannot be ignored).

The benchmark shows that the user experience will be nearly identical no matter which platform.

OK, SB is faster in this benchmark, but this "faster" does not translate into an improved user experience.

Yes, I put my flame-resistant suit on prior to writing this comment. Have at me, kids.


----------



## Don Karnage

I'm not sure why they didn't use a 2500K in the review instead of the 2600K. Results would have been identical.


----------



## cusideabelincoln

Quote:



Originally Posted by *omninmo*


OK FOLKS, SOME GOOD NEWS (for some people!)

BTW: This is not my work; I just compiled info and ported it here from the thread posted on XtremeSystems a couple pages back.

Disabling one core per module to avoid resource sharing improves single-threaded IPC A LOT - ACROSS THE BOARD! And it surely explains some of the stranger results we've seen!
Credit to DGLee @ XS who tested this, and to chew who stepped in to confirm having similar results when he was playing around with the chip. Here are the benchmark improvements comparing 2M/4C vs 4M/4C:

*Fritz Chess: 39.1%* improvement
*wPrime 32M: 31%* improvement
*WinRAR: 9.5%* improvement
*3DMark06 CPU: 5.8%* improvement
*3DMark Vantage CPU: 22.1%* improvement
*3DMark11 Physics: 14.1%* improvement
*Cinebench R10: 21.4%* improvement
*Cinebench R11.5: 19.1%* improvement
*Blender: 21.7%* improvement
*TechARP x264 enc: 20%* improvement
*Daum PotEncoding H264 transcoding: 11.7%* improvement

-------------- EDIT TO ELABORATE ----------------

Now, there isn't a direct comparison benchmark, but word going around is that these across-the-board gains seem to leave *Bulldozer with a bit more IPC than Phenom II!*

This means, for people willing to *turn off 4 of their threads*, that Bulldozer *will NOT SUCK AS HARD* as originally thought in gaming and lightly threaded apps (although we are still waiting on gaming benches to confirm!). In particular, the gains are particularly big in some of the weirder results we had seen in the original reviews.

IMO this makes BD at least *a bit more viable*, as with 4 cores disabled you will likely be seeing close to or actual *5GHz on air, and more on water, with slightly higher IPC than Phenom*! And all the while *lowering power draw and heat output!*

Ergo, *not a 2500K killer* by any stretch, BUT *AT LEAST a viable upgrade* for those who already have a 990FX board.

ATTENTION: it is currently *not known if all boards support disabling individual cores*; testing was done on a CHV, can't comment on others!

SOURCE: http://www.xtremesystems.org/forums/...Threaded-Perf.



Quote:



Originally Posted by *Partol*


According to this,
gaming performance improves a little when running in 4 Module/4 Core mode vs 4 Module/8 Core mode

http://www.hardware.fr/articles/842-...acite-cmt.html


Software core/module load balancing should be able to provide the same effect as disabling one core per module to prevent the sharing performance penalty. In any case, it looks like the hardware.fr site shows a 10% increase in gaming performance when using 4 cores across 4 modules compared to 4 cores across 2 modules, which seems to be right in line with these results:




Windows 8 can distribute threads for optimal performance?
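As a quick sanity check, the per-benchmark gains quoted above for disabling one core per module can be averaged; this is just arithmetic over the numbers already listed, not a new measurement:

```python
# Improvements (%) quoted above, one value per benchmark
# (Fritz Chess through Daum PotEncoding).
gains = [39.1, 31.0, 9.5, 5.8, 22.1, 14.1, 21.4, 19.1, 21.7, 20.0, 11.7]

mean_gain = sum(gains) / len(gains)
print(f"average improvement: {mean_gain:.1f}%")  # average improvement: 19.6%
```

So on average these tests gained roughly 20% per thread from avoiding the shared front-end, at the cost of halving the thread count.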


----------



## glytyr

The 2500K might have been a fairer choice, but OK.


----------



## Razi3l

The silliest thing here is the fact that they compared 4.7GHz to 5.2GHz. TweakTown reviews are terrible anyway.

SB owners won't stop gloating and being smug now that Bulldozer isn't all it was expected to be. How sad, lol. If it were ~$40-$50 cheaper it would be amazing.


----------



## Kasp1js

The sad part is that the 5.2GHz i7 probably still bottlenecks the 6970s.


----------



## DrFPS

Quote:



Originally Posted by *Don Karnage*


I'm not sure why they didn't use a 2500K in the review instead of the 2600K. Results would have been identical.


There is a BD review thread! This review is already in there, so that makes it a repost!


----------



## Don Karnage

Quote:



Originally Posted by *DrFPS*


There is a BD review thread! This review is already in there, so it makes it a repost!


This is more a video card scaling review than a Bulldozer review, but if the mods want to close it, go ahead.


----------



## goldcrow

Quote:



I'm not sure why they didn't use a 2500K in the review instead of the 2500K. Results would have been identical


Doesn't that answer your question? Since they're identical, they can use either of the two?


----------



## justinjja

Quote:



Originally Posted by *CerealKillah*


OK, I understand this graph shows the superiority of the 2600K platform with multiple video cards, but it also shows something else (which cannot be ignored).

The benchmark shows that the user experience will be nearly identical not matter which platform.

OK, SB is faster in this benchmark, but this "faster" does not translate into an improved user experience.

Yes, I put my flame resistant suit on prior to writing this comment. Have at me kids



Until you use Eyefinity.


----------



## glytyr

Quote:



Originally Posted by *goldcrow*


^Doesn't that answer your question? Since they're identical, they can use either of the two?


I think he meant identical FPS between 8150 and 2500K


----------



## Don Karnage

Quote:



Originally Posted by *Epsi*


Ye the power consumption is only +20watts for AMD vs Intel.. strange thing.


The 2600K was at 5.2GHz and the 8150 at 4.76GHz. That's the reason they're so close.


----------



## DrFPS

Quote:



Originally Posted by *Don Karnage*


This is more a video card scaling review then a bulldozer review but if the mods want to close go ahead.




What's the title again? Ohhh yeah, "AMD vs. Intel" video card head-to-head, sure.

We get it: AMD Bulldozer is "stick a fork in it" done. Donedozer.


----------



## CerealKillah

Quote:



Originally Posted by *justinjja*


Until you use eyefinity


That's fine, then show off the Eyefinity benchmarks. Just saying, the graph used by the OP shows that at those resolutions, the gameplay experience would be identical for users of either platform.


----------



## Sterisk

Quote:



Originally Posted by *CerealKillah*


OK, I understand this graph shows the superiority of the 2600K platform with multiple video cards, but it also shows something else (which cannot be ignored).

The benchmark shows that the user experience will be nearly identical not matter which platform.

OK, SB is faster in this benchmark, but this "faster" does not translate into an improved user experience.

Yes, I put my flame resistant suit on prior to writing this comment. Have at me kids




It also shows that you will have to replace your GPUs sooner if you insist on keeping the same CPU and want the best performance in newer titles, versus using a 2600K.


----------



## hammertime850

Quote:



Originally Posted by *Sterisk*


It also shows that you will have to replace your gpus sooner if you insist on keeping the same cpu and want to have the best performance in newer titles vs using a 2600k.


This ^. An SB CPU will still be more than enough to game on for a while.


----------



## Diabolical999

Quote:



*We've again opted to use both our processors here today at their maximum overclock.* While some may argue it's unfair for AMD, as clock for clock Intel is already faster, others will argue that it's unfair for Intel to be clocked down because the AMD can't clock as high.
*If you're going to go with one of these setups, you're going to want to clock it as high as you can, and that's the reason we're using our i7 2600K @ 5.2GHz versus the FX-8150 @ 4.76GHz.*


I partially agree, but they at least could've set one page aside to show _both_ performing at the same clock speed.


----------



## mayford5

Either way my upgrade path is shot unless some miracle (haha) happens.


----------



## justinjja

Quote:



Originally Posted by *CerealKillah*


That's fine, then show off the eyefinity benchmarks. Just saying, the graph used by the OP shows that at those resolutions, gameplay experience would be identical for users of either platform.


In the article they mentioned the use of 120Hz monitors, where there would be a difference.


----------



## RagingCain

Quote:



Originally Posted by *CerealKillah*


That's fine, then show off the eyefinity benchmarks. Just saying, the graph used by the OP shows that at those resolutions, gameplay experience would be identical for users of either platform.


You are also assuming that all users have a 60Hz monitor. The review also doesn't deal with microstutter, one known cause of which is slow CPU frame delivery.

You also forget that an overclocked 1100T will probably outperform the FX-8150, making it not only a poor choice on performance but a poor choice on price as well.


----------



## Partol

Quote:



Originally Posted by *ryboto*


I still think something is severely limiting Bulldozer's performance. There's no way 2 Billion transistors and a more "advanced" design yield this kind of crap performance over a previous generation.


I agree. It seems to me one performance limitation is the shared resources inside each module. The same issue affects Intel Hyper-Threading; it's a known fact that Hyper-Threading can reduce performance in some situations.

One possible fix is to change the core enumeration in Windows. Something like this:

Instead of
cores 0,1,2,3,4,5,6,7
change to
cores 0,2,4,6,7,5,3,1

That way the scheduler uses only one core from each module first, and only after that loads the second core in each module.

All these reviews/benchmarks we see will need to be redone after this change is made.

Questions:
How will this change be made? And why isn't AMD rushing to fix this right now?
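That reordering amounts to pinning work to one core per module. On Linux a process can do this itself via `os.sched_setaffinity`; the sketch below assumes the usual enumeration where logical CPUs (0,1), (2,3), (4,5), (6,7) share a module, which is worth confirming against /proc/cpuinfo before relying on it:

```python
import os

# Assumed Bulldozer layout: each pair of logical CPUs shares one module.
MODULE_PAIRS = [(0, 1), (2, 3), (4, 5), (6, 7)]

def one_core_per_module(pairs):
    """Pick the first logical CPU of each module, avoiding the shared front-end."""
    return {first for first, _second in pairs}

cpus = one_core_per_module(MODULE_PAIRS)
print(sorted(cpus))  # [0, 2, 4, 6]
# os.sched_setaffinity(0, cpus)  # uncomment on Linux to pin this process
```

A scheduler-level fix (as AMD later claimed for Windows 8) would do the same thing automatically: fill one core per module before doubling up.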


----------



## ekg84

I apologize if this has already been posted.
Guru3D has a BD review with the FX-4100/6100.

http://www.guru3d.com/article/amd-fx...mance-review/1

Looks like the FX-4100 in many cases is even slower than the A8-3850.


----------



## Don Karnage

Quote:



Originally Posted by *RagingCain*


You are also under the assumption that all users have a 60 Hz monitor. The review also doesn't deal with microstutter which one particular cause has been known to be slow CPU delivery.

You also forget that there is a probability that an overclocked 1100T will perform superior to the FX8150, making it not only a poor choice in performance, but a poor choice in price to pick one up.


Do people still buy 60Hz monitors? I've had my 120Hz for months and it's amazing.


----------



## Diabolical999

Quote:



Originally Posted by *Don Karnage*


I'm not sure why they didn't use a 2500K in the review instead of the 2600K. Results would have been identical


Hyper-Threading isn't being utilized in any of the games, so its performance is on par with the 2500K, like it always is in games. They were merely benching the latest top offerings from both.


----------



## Don Karnage

Quote:



Originally Posted by *ekg84*


I apologize if this has already been posted.
Guru3d has a BD review with fx4100/6100.

http://www.guru3d.com/article/amd-fx...mance-review/1

Looks like fx4100 in many cases is even slower than A8-3850.


Where is the 2500K in that review?


----------



## Don Karnage

Quote:



Originally Posted by *Diabolical999*


Its Hyperthreading isn't being utilized in any of the games, so the performance is on par with the 2500K, like it always is in games. They were merely benching the latest top offerings from both.


I know why they did it. Having a 2500K wipe the floor with a more expensive 8150 would have been more embarrassing.


----------



## willistech

If they are so bad, why is EVERY retailer sold out of them?


----------



## anubis1127

Quote:



Originally Posted by *willistech*


if they are so bad why is EVERY retailer sold out of them?


Because AMD is better at marketing than manufacturing


----------



## CerealKillah

Quote:



Originally Posted by *RagingCain*


You are also under the assumption that all users have a 60 Hz monitor. The review also doesn't deal with microstutter which one particular cause has been known to be slow CPU delivery.

You also forget that there is a probability that an overclocked 1100T will perform superior to the FX8150, making it not only a poor choice in performance, but a poor choice in price to pick one up.


Again, if that is your point, they need benchmarks to illustrate the point.

Everyone that has quoted me seems to miss the point of my post (which is still valid): the graph shown by the OP shows the superiority of the SB platform, but does not show/indicate a bad user experience on BD.

P.S. I would bet that 60Hz is the STANDARD for gaming monitors and accounts for 90% of the current systems out there.


----------



## Don Karnage

Quote:



Originally Posted by *willistech*


if they are so bad why is EVERY retailer sold out of them?


Because normal consumers are dumb


----------



## Dublin_Gunner

Quote:



Originally Posted by *willistech*


if they are so bad why is EVERY retailer sold out of them?


None of them have stock yet lol


----------



## M3T4LM4N222

Quote:



Originally Posted by *nagle3092*


Well they need to stop designing cpus around sockets and compatibility and focus on architecture and performance gains.

I have a feeling as long as they keep the AM3/3+ around nothing is going to change much.


That's a problem though. Compatibility was one of the things that made AMD CPUs so viable and great: you didn't need to buy a new board every time you wanted a new CPU. AMD has generally always been better at the money-saving aspect. But Bulldozer is the exact opposite of what AMD usually produces.


----------



## lordikon

Quote:



Originally Posted by *tout*


lol, I don't care about performance per watt, I am not running a laptop here. I run my system full bore 24/7. Yes, it shows the inefficiencies that BD has but for us enthusiasts, performance per watt means next to nothing.

I also don't ever run my system at a paltry 1080 resolution, I am way beyond that, Brotha. I want to know where BD stands with high end GPUs, with high end resolutions and high end graphic settings if we are gonna discuss gaming. 1080p is so 2 years ago.

In the 'real world' most people don't sit around running benchmarks and overclocking their CPUs to the limits. They run at stock for everything. At stock, BD compares to a 2500K in rendering, encoding and other multi threaded apps just as it should being a multi core CPU. No one is gonna notice that their web browser opened 1/1000th of a second slower because single threaded performance is lower. At stock settings the CPU uses 125 (or 95) watts just like a Phenom II. From the review sites, at stock settings, cool n quiet actually work on BD compared to Thubans. It even does better than Intel's i5 series in some tests.


You make it sound like you're an enthusiast. If that's the case, BD shouldn't even be a consideration for you, considering its miserable performance in almost all cases, and it is grossly inefficient. If you want efficiency, go with Sandy Bridge; if you want the best performance, go with Sandy Bridge or possibly a 990X (depending on what you use the CPU for); if you want the best price/performance, go with Sandy Bridge. I'd like to say there's a good reason to go with Bulldozer, but I'm being entirely serious when I say I really cannot think of one.

As was already posted on the last page, at extreme gaming settings Bulldozer fails even harder:
http://www.tweaktown.com/articles/43...d/index14.html


----------



## ChrisB17

Just made the switch from my sig rig to BD. I am hoping I can bulldoze my way out of a box.


----------



## Sazar

Quote:



Originally Posted by *CerealKillah*


Again, if that is your point, they need benchmarks to illustrate the point.

Everyone that has quoted me seems to miss the point of my post (which is still valid) in that the graph shown by the OP shows superiority of the SB platform, but does not show/indicate a bad user experience on BD.

PS. I would bet that 60hz is the STANDARD for gaming monitors and will account for 90% of the current systems out there.


Agreed. Just because some new tech comes out doesn't mean you HAVE to have it.

I am perfectly fine with my Acer. Looks great after 4 years, not a single dead pixel. Still looks beautiful. Why the hell would I upgrade just to say OMGZ I HAV TEH LATEST HZ MONITORZ.


----------



## Majin SSJ Eric

Quote:



Originally Posted by *Epsi*


Dunno if posted already.

AMD FX-8150 vs. Intel i7-2600k CrossFireX HD 6970 x3 Head-to-Head:

http://www.tweaktown.com/articles/43...ad/index1.html


Wow, that is a BRUTAL test for BD. Maybe now Jagged can see what a CPU bottleneck looks like....


----------



## Fletcherea

Super sexy budget Phenom II 955 or FX-4100 build I see in my future. Them 955s are just over a hundred bucks now. Love it when new stuff comes out; I can almost afford new old stuff, lol =). Was really hoping to see some low-wattage 'dozers though, just for some powerhouse ITX builds.


----------



## Majin SSJ Eric

I honestly can't figure out a reason to buy an FX-4100 over an A8-3850. The Llano is faster, and you don't even need to buy a GPU if you just do casual computing. I'm seriously considering the A8-3850 for a build I'm doing for my father-in-law...


----------



## Don Karnage

Quote:



Originally Posted by *Sazar*


Agreed. Just because some new tech comes out doesn't mean you HAVE to have it.

I am perfectly fine with my Acer. Looks great after 4 years, not a single dead pixel. Still looks beautiful. Why the hell would I upgrade just to say OMGZ I HAV TEH LATEST HZ MONITORZ.


You've never gamed on a 120Hz monitor, have you? The difference is quite nice, especially in BF3.


----------



## Dman

Quote:



Originally Posted by *lordikon*


I'd like to say there's a good reason to go with Bulldozer, but I'm being entirely serious when I say I really cannot think of a good reason.


Agreed, in fact there doesn't even seem to be a reason to upgrade if you are using a phenom II right now.


----------



## RotaryKnight

Quote:



Originally Posted by *Don Karnage*


I'm not sure why they didn't use a 2500K in the review instead of the 2600K. Results would have been identical


Hyperthreading, though I don't know how useful that is in games.

Even if they did a clock-for-clock comparison with a 2500K, it would still beat Bulldozer.


----------



## Papas

Quote:



Originally Posted by *Don Karnage*


Because normal consumers are dumb


I hate to say it, but that is the normal response from everyone when something happens they don't like. FX chips sold out in hours; that's not a failure. You have to remember we are the 1%'ers here. 99% of regular PC people know nothing about overclocking or any of the stuff we know about. BD will end up being a big deal for AMD. Normal people will see the 8 cores and go, "I want." It's as simple as that. Will they be overpaying? Not any more than anyone else who buys a pre-built system. Stop being negative to every response that isn't how you think; it's ignorant and rude.


----------



## Majin SSJ Eric

Quote:



Originally Posted by *CerealKillah*


OK, I understand this graph shows the superiority of the 2600K platform with multiple video cards, but it also shows something else (which cannot be ignored).

The benchmark shows that the user experience will be nearly identical not matter which platform.

OK, SB is faster in this benchmark, but this "faster" does not translate into an improved user experience.

Yes, I put my flame resistant suit on prior to writing this comment. Have at me kids




In other games tested you would see a difference. Mafia, for instance, has the 2600K supporting a 120Hz refresh rate for 3D gaming, whereas the 8150 won't even approach 120Hz...


----------



## Papas

Quote:



Originally Posted by *lordikon*


You make it sound like you're an enthusiast. If that's the case, BD shouldn't even be a consideration for you considering it's miserable performance in almost all cases, and it is grossly inefficient. If you want efficiency, go with Sandy Bridge, if you want best performance, go with Sandy Bridge or possibly a 990x (depending on what you use the CPU for), if you want best price/performance, go with Sandy Bridge. I'd like to say there's a good reason to go with Bulldozer, but I'm being entirely serious when I say I really cannot think of a good reason.

As was already posted on the last page, at extreme gaming settings Bulldozer fails even harder:
http://www.tweaktown.com/articles/43...d/index14.html


The test is invalid, as anyone here would tell you. A 0.5GHz difference is huge and you act like it's nothing. I'm the first to admit BD is a flop for us here, but to talk about failing harder... it is you, sir, who have failed.


----------



## Flying Toilet

Quote:



Originally Posted by *Dman*


Agreed, in fact there doesn't even seem to be a reason to upgrade if you are using a phenom II right now.


"Hey, let's duct tape two Athlon II x4's together and see what kind of numbers we can produce."

I don't know what's worse, this release, or the fact that AMD doesn't right itself how much a Macbook is going to cost in the next ten years. Not for myself, for my fiance... I can hear her pitching a fit right now.


----------



## Iceman23

Quote:



Originally Posted by *Papas*


I hate to say it, but that is the normal response to everyone when something happens they dont like. FX chips sold out in hours. thats not a failure. you have to remember we are the 1%'ers here. 99% of the regular pc people know nothing about overclocking or any of the stuff we know about. BD will end up being a big deal for amd. normal people will see the 8 cores and go, i want. its as simple as that. will they be overpaying? not anymore than anyone else that buys a pre-built system. stop being negative to every response that isnt how you think, its ignorant and rude.


Are they really selling out that fast? Selling out of only a few processors isn't a great accomplishment; we have to know the initial stock before drawing any conclusions about how it is selling. Simply seeing an "out of stock" notification from a retailer could mean many things.


----------



## 8ight

Quote:



Originally Posted by *Don Karnage*


I'm not sure why they didn't use a 2500K in the review instead of the 2600K. The results would have been identical.


That may be so, but this is AMD's top dog vs. Intel's top dog. Not every 2600K test needs to be a 2500K test, but that's OCN's mentality.


----------



## QuackPot

Quote:



Originally Posted by *willistech*


If they are so bad, why is EVERY retailer sold out of them?


Because AMD haven't produced enough of them for most online retailers.


----------



## ZealotKi11er

The sad part is that they used a 5.2GHz 2600K, which in my book is a golden chip and represents maybe 1% of all 2600Ks. Most 8150s have been hitting 5GHz, so I don't see how theirs could not. Another thing I noticed is that AMD is still using the same chipset, and the internals of BD don't improve CrossFire (and probably SLI) scaling as I thought they would. Having more than two cards with BD will only help in less than 20% of games, making it a pointless upgrade. Some people argue about 120Hz gaming, but 95% of us are still at 60Hz, and an AMD CPU is more than capable of delivering that.


----------



## Don Karnage

Quote:



Originally Posted by *Dublin_Gunner*


None of them have stock yet lol


Tigerdirect and Newegg had them yesterday


----------



## dominique120

Reading this makes me think AMD wasted 5 years

That's just my opinion.


----------



## CrazyNikel

Quote:



Originally Posted by *ZealotKi11er*


The sad part is that they used a 5.2GHz 2600K, which in my book is a golden chip and represents maybe 1% of all 2600Ks. Most 8150s have been hitting 5GHz, so I don't see how theirs could not. Another thing I noticed is that AMD is still using the same chipset, and the internals of BD don't improve CrossFire (and probably SLI) scaling as I thought they would. Having more than two cards with BD will only help in less than 20% of games, making it a pointless upgrade. Some people argue about 120Hz gaming, but 95% of us are still at 60Hz, and an AMD CPU is more than capable of delivering that.


Dude.....seriously! How can ANYONE take this review seriously.

This has to be the one of the most lopsided tests I have ever seen.


----------



## criminal

No thanks. I may only have a 60Hz (IPS) monitor, but I want the best performance I can get.


----------



## CerealKillah

Quote:



Originally Posted by *Majin SSJ Eric*


In other games tested you would see a difference. Mafia, for instance, has the 2600K supporting a 120Hz refresh rate for 3D gaming, whereas the 8150 won't even approach 120Hz...


I don't doubt it and I totally smell what you are cooking. I don't disagree with SB's superiority in gaming.

My original reply was in response to the OP's graph and the OP's graph only.


----------



## Papas

Quote:



Originally Posted by *Iceman23*


Are they really selling out that fast? Selling out of only a few processors isn't a great accomplishment; we have to know the initial stock before drawing any conclusions about how it is selling. Simply seeing an "out of stock" notification from a retailer could mean many things.


True; it never crossed my mind that companies like Newegg and Tiger Direct would carry limited quantities of an item that is such a huge release and has been touted for over a year.

On another note: people saying BD performance can't be increased should really look at the 2500K/2600K. During initial testing around the release date, the 2500K and 2600K were scoring within 14 points of each other in Vantage and 100 points in 3DMark 11 (while getting beaten by the i7 950 and i7 875K); now they score hundreds of points more (almost 1,000 for the 2600K) and beat the i7 950 and i7 875K. So again, how can BD performance not increase from some magical driver when Intel released the same magical driver that made their CPUs perform better?


----------



## Shion314

Quote:



Originally Posted by *lordikon*


You make it sound like you're an enthusiast. If that's the case, BD shouldn't even be a consideration for you, given its miserable performance in almost all cases and its gross inefficiency. If you want efficiency, go with Sandy Bridge; if you want the best performance, go with Sandy Bridge or possibly a 990X (depending on what you use the CPU for); if you want the best price/performance, go with Sandy Bridge. I'd like to say there's a good reason to go with Bulldozer, but I'm being entirely serious when I say I really cannot think of one.

As was already posted on the last page, at extreme gaming settings Bulldozer fails even harder:
http://www.tweaktown.com/articles/43...d/index14.html


FINALLY. Someone supports a 990x for performance. Woot. I feel like my 990x gets hated on more than loved.....

OT: Sandybridge still beats out BD and that's not even with SB-E on the table.


----------



## ZealotKi11er

I also forgot to mention: considering they had three GPUs, why did they not test one and two cards as well?


----------



## Domino

Quote:



Originally Posted by *matty0610*


In what case would you need 3 6970s at 1920x1200 for gaming?


Where you don't want to spend a grand on GPUs for the next five years, while maintaining 60+ FPS at max settings for the majority of those years.


----------



## Don Karnage

Quote:



Originally Posted by *ZealotKi11er*


I also forgot to mention: considering they had three GPUs, why did they not test one and two cards as well?


Personally, I could have lived with a CrossFire comparison. I don't believe I'll ever run three cards in my lifetime.


----------



## Papas

Quote:



Originally Posted by *Shion314*


FINALLY. Someone supports a 990x for performance. Woot. I feel like my 990x gets hated on more than loved.....

OT: Sandybridge still beats out BD and that's not even with SB-E on the table.


I love the 990X, just not the price tag. To be honest, if I had the money I would definitely buy one... then I would sell more sperm to pay for a motherboard.


----------



## elito

May I ask why reviewers ALWAYS, like 99% of the time, bench these things at DIFFERENT clocks? Honestly, we all knew SB beats BD, but really, what's the point if you're going to do 5.2GHz vs. 4.75GHz? What's wrong with leaving both at 4.7? I don't get these reviewers, man. Having a better/stronger/faster architecture is one thing, but putting the chips at clocks that hand one of them a pure win is another. It's a Ferrari with 600hp vs. a Porsche with 500hp: the Ferrari obviously wins, but thanks to the great difference in the engines, it's now a PURE WIN.
P.S. They dare to call it "apples to apples"; well, obviously you've failed if you can't even configure both systems to match each other.


----------



## criminal

Quote:



Originally Posted by *CerealKillah*


I don't doubt it and I totally smell what you are cooking. I don't disagree with SB's superiority in gaming.

My original reply was in response to the OP's graph and the OP's graph only.


So you get the same user experience in one game? Again, no thanks. I really have been a huge AMD fan over the years, but Bulldozer is disappointing.


----------



## Jagged_Steel

Quote:



Originally Posted by *QuackPot*


Because AMD haven't produced enough of them for most online retailers.


And selling a product faster than you can make it is bad? AMD is also selling more Llanos than it can produce, and more 6950/70/90s than it can produce. So they have several new products that people are standing in line for, and what I hear (in this forum, anyway) is that somehow this = failure for AMD, which is completely ridiculous. I operated a manufacturing business for a decade or so, and I can tell you from experience that having more demand than you can keep up with is a far better position to be in than being able to crank out lots of product that nobody wants.


----------



## CerealKillah

Quote:



Originally Posted by *criminal*


So you get the same user experience in one game? Again, no thanks. I really have been a huge AMD fan over the years, but Bulldozer is disappointing.


*sigh*

Please note I did not say: Buy BD anyway, this one benchmark on this one site clearly shows all things are equal.

Instead I said: In this scenario, while SB is faster the user experience is the same.

It is that simple. Nothing more, nothing less.


----------



## BlackOmega

Quote:



Originally Posted by *Razi3l*


The silliest thing here is the fact that they compared 4.7Ghz to 5.2Ghz. TweakTown reviews are terrible anyway.


^This. How stupid. Really fair, isn't it? Looks like there's one reviewer that should be nixed.


----------



## lordikon

Quote:



Originally Posted by *Papas*


The test is invalid, as anyone here would tell you. A 0.5GHz difference is huge, and you act like it's nothing. I'm the first to admit BD is a flop for us here, but to talk about failing harder... it is you, sir, who have failed.


Surely you're not comparing clock speed between two entirely different architectures as if that matters somehow.

So if I compare a Pentium 4 to a Sandy Bridge I need to compare them at the same clock speeds? It's probably better to instead test any CPUs you're comparing at both stock, and the fastest you can get them to run on air/water/etc.

If you want to go down that road then maybe we also rig the test to run both CPUs at the same TDP as well, just to be fair...
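To put a number on the point above (with invented, purely hypothetical IPC figures; nothing here is measured): delivered performance is roughly IPC times clock speed, which is why matching clocks across two different architectures proves nothing.

```python
# Illustrative only: the IPC values below are made up for the example.
# The point: performance ~ IPC x clock, so equal-clock comparisons
# between different architectures are meaningless.
def effective_perf(ipc, clock_ghz):
    """Rough model: billions of instructions retired per second."""
    return ipc * clock_ghz

pentium4 = effective_perf(0.5, 3.8)  # high clock, low IPC
sandy    = effective_perf(2.0, 3.4)  # lower clock, much higher IPC
print(round(sandy / pentium4, 2))    # SB wins big despite the lower clock
```

Despite a 400MHz clock deficit in this toy example, the higher-IPC chip comes out several times faster, which is exactly why stock-plus-max-overclock testing is more informative than clock-for-clock testing.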


----------



## staryoshi

AMD is facing supply-side deficiencies due to poor 32nm yields as well as ramp and manufacturing issues with GF Dresden. Demand may exceed their limited supply, but that reflects supply-side issues more so than strong demand IMO.


----------



## Aedric

Quote:



Originally Posted by *lordikon*


Surely you're not comparing clock speed between two entirely different architectures as if that matters somehow.

So if I compare a Pentium 4 to a Sandy Bridge I need to compare them at the same clock speeds? It's probably better to instead test any CPUs you're comparing at both stock, and the fastest you can get them to run on air/water/etc.

If you want to go down that road then maybe we also rig the test to run both CPUs at the same TDP as well, just to be fair...


The average person has no idea what IPC is. In reality, I think they'll sell well to the average consumer at the local Best Buy:

"Oh, it has a bigger number; it must be better."

"8 cores! Wow, that must be so much better than these 4-core CPUs."


----------



## Pillz Here

Quote:


> Originally Posted by *matty0610;15291261*
> In what case would you need 3 6970s at 1920x1200 for gaming?


The case where you're using a Bulldozer chip apparently.


----------



## Kasp1js

The clock speed difference would have been somewhat legitimate if they had tested frequencies these chips commonly achieve, but 5.2GHz is a good bit more than an average SB 24/7 overclock.


----------



## Papas

Quote:


> Originally Posted by *lordikon;15292639*
> Surely you're not comparing clock speed between two entirely different architectures as if that matters somehow.
> 
> So if I compare a Pentium 4 to a Sandy Bridge I need to compare them at the same clock speeds? It's probably better to instead test any CPUs you're comparing at both stock, and the fastest you can get them to run on air/water/etc.
> 
> If you want to go down that road then maybe we also rig the test to run both CPUs at the same TDP as well, just to be fair...


I'll point this out: I was trying to make a point. YOU KNOW FOR A FACT Intel people would be complaining that BD was clocked higher if it beat the 2600K in tests, and would say they should overclock the 2600K to make it fair. I can bring up hundreds of posts showing that (people complaining about higher AMD clocks when AMD wins some tests) if you want; it's all over overclock.net. To me it doesn't matter. I have a 2500K, and all this BD release did was make me save some more money and look at upgrading my GPU instead.
Quote:


> Originally Posted by *staryoshi;15292656*
> AMD is facing supply-side deficiencies due to poor 32nm yields as well as ramp and manufacturing issues with GF Dresden. Demand may exceed their limited supply, but that reflects supply-side issues more so than strong demand.


I'll have to go back and find it, but I thought the figure of 250K was put out as the number of BD chips that have shipped to suppliers.


----------



## hajile

Quote:


> Originally Posted by *Papas;15292409*
> True; it never crossed my mind that companies like Newegg and Tiger Direct would carry limited quantities of an item that is such a huge release and has been touted for over a year.
> 
> On another note: people saying BD performance can't be increased should really look at the 2500K/2600K. During initial testing around the release date, the 2500K and 2600K were scoring within 14 points of each other in Vantage and 100 points in 3DMark 11 (while getting beaten by the i7 950 and i7 875K); now they score hundreds of points more (almost 1,000 for the 2600K) and beat the i7 950 and i7 875K. So again, how can BD performance not increase from some magical driver when Intel released the same magical driver that made their CPUs perform better?


Yields are terrible. The launch was probably mostly a paper launch.
I don't expect some magic 50% performance increase.

Linux developers seem to have had (and may still be having) a difficult time redesigning the scheduler to suit the new architecture; I assume Windows developers are hitting similar issues. Consider the issue discussed earlier in this thread, where disabling the second integer core in each module improves performance per clock. That could point to a problem with the decode unit, but the decoder is probably not the biggest problem. Data and instructions reach the integer cores only *after* being decoded, so a decoder bottleneck exists regardless of which integer cores are working downstream (the maximum number of instructions decoded per unit of time caps the number of instructions executed per unit of time, no matter how many execution units are present). Rearranging the chip in software (by disabling some cores) to simulate a more traditional architecture therefore suggests the gains are *more likely* due to better scheduling and less cache thrashing, not to "getting more instructions to fewer cores".

AMD's problem is that the CPU (apparently) can't effectively use all the available integer cores, thanks to poor scheduling by the OS, large cache latencies, too few decoders, and poor branch prediction. That is to say, the integer units (and possibly the FPUs) are being starved.

The poor scheduling can be fixed, and (based on Anandtech's not-fully-optimized Windows 8 alpha test) fixing it will decrease normal power consumption (unused cores can downclock) while increasing performance by 10% or maybe more (some 4c/4cu benches showed >20% increases). The cache latencies were probably raised above the intended figures to increase yields, given poor 32nm performance at GlobalFoundries, and should improve over the next stepping or two. (Side note: one of AMD's goals was near-linear performance scaling with clock speed, something SB doesn't achieve, and the 30% clock speed advantage over Deneb that was initially expected will also be a side effect of fab improvements.)

Improving the decoders (if necessary) and the branch prediction requires a complete reworking of the processor's front end. With normal development times for even simple chip redesigns running a couple of years, I suspect AMD knew about the poor decode and branch prediction months ago; that is the only explanation for how soon Piledriver is being released (just a few months, rather than the couple of years a major redesign takes). AMD likely counted on the 30% higher stock clock speed to carry them until the redesign was finished (notice that the 4.6GHz overclocked benches, roughly 30% faster than Deneb designs, were fairly competitive), but AMD was let down by the fab (though I believe AMD is also at fault for shipping a flawed design).

My prediction (please don't quote me later; I am being optimistic, but in reality I have little faith):

Between a 10-15% average OS performance increase (this seems fairly definite), better fabs giving (I guess) a 20% increase in clock speed rather than 30% (scaling almost linearly), better fabs giving nearly an 80% improvement in cache latencies (matching Deneb latencies should be entirely possible; cache is cache), and a 10-15% IPC improvement (also fairly definite) from more decode throughput and better branch prediction, I believe the next iteration will show more of the architecture's theoretical potential.

edit: At best, 15% from the OS and 15% from the redesign gives roughly a 30% IPC boost (making it 20% faster than Deneb and 20% slower than Sandy Bridge). Better cache latencies are a mixed bag; they may give less than 2% in some applications and more than 20% in others. If clock speed can also be increased, the combined gains land anywhere between 35% and 90% overall (a huge delta). Even with a 70-90% increase in overall performance, performance per transistor would still be worse than Sandy Bridge's.

That seems to be the only explanation for how a chip with half Bulldozer's transistor count can perform better. As the chip stands, I couldn't recommend that anyone buy one (I don't think I could recommend it even if the OS problem went away).
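The ranges in that prediction are just compounded multiplicative gains. As a quick sketch (every percentage is the post's own guess, not a measurement), the arithmetic works out like this:

```python
# Sanity check of the compounded-gain estimates above. All percentages
# are guesses from the post, not measured data.
def compound(*gains):
    """Combine multiplicative speedups given as fractions (0.15 == +15%)."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

# Low end: ~10% each from OS scheduling, IPC tweaks, and clock speed.
low = compound(0.10, 0.10, 0.10)         # about +33%, near the 35% floor
# High end: 15% OS + 15% IPC + 20% clock + 20% from cache latency wins.
high = compound(0.15, 0.15, 0.20, 0.20)  # about +90%, the ceiling quoted
print(f"{low:.0%} to {high:.0%}")
```

Stacking four modest improvements multiplicatively is how a 90% ceiling emerges from individually unremarkable 15-20% gains, which is also why the delta between the optimistic and pessimistic cases is so wide.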


----------



## MoRLoK

All of you make me want to buy it after all. I will test it myself (an FX-6100); I see there is sick demand for all of them in my country. Today a very big wholesale (not retail) reseller had 30+ FX-6100s and FX-8120s in stock; after one hour, zero. So instead of crying I will try it myself. I should order it today and get it tomorrow; too bad I now have to wait until Monday. If I get one I'll post some of my findings (most important for me: how power-hungry it is).


----------



## hammertime850

Quote:


> Originally Posted by *elito;15292481*
> May I ask why reviewers ALWAYS, like 99% of the time, bench these things at DIFFERENT clocks? Honestly, we all knew SB beats BD, but really, what's the point if you're going to do 5.2GHz vs. 4.75GHz? What's wrong with leaving both at 4.7? I don't get these reviewers, man. Having a better/stronger/faster architecture is one thing, but putting the chips at clocks that hand one of them a pure win is another. It's a Ferrari with 600hp vs. a Porsche with 500hp: the Ferrari obviously wins, but thanks to the great difference in the engines, it's now a PURE WIN.
> P.S. They dare to call it "apples to apples"; well, obviously you've failed if you can't even configure both systems to match each other.


They put them at their highest overclocks; we are testing the top speed of these processors.


----------



## Domino

So what's the verdict on this puppy? Are we looking at a faulty scheduler, poor cache latencies, and maybe a lack of proper Windows support? Can we expect a Fermi situation here, where performance improves after an update, rather than a NetBurst rehash?


----------



## Papas

Quote:


> Originally Posted by *Domino;15292816*
> So what's the verdict on this puppy? Are we looking at a faulty scheduler, poor cache latencies, and maybe a lack of proper Windows support? Can we expect a Fermi situation here, where performance improves after an update, rather than a NetBurst rehash?


I'm guessing it's a little bit of everything, with lower performance than everyone expected. I'm not expecting BD to wipe the floor once everything is fixed; I'm expecting it to (hopefully) run alongside SB.


----------



## hammertime850

Quote:


> Originally Posted by *Jagged_Steel;15292522*
> And selling a product faster than you can make it is bad? AMD is also selling more Llanos than it can produce, and more 6950/70/90s than it can produce. So they have several new products that people are standing in line for, and what I hear (in this forum, anyway) is that somehow this = failure for AMD, which is completely ridiculous. I operated a manufacturing business for a decade or so, and I can tell you from experience that having more demand than you can keep up with is a far better position to be in than being able to crank out lots of product that nobody wants.


I agree with you here; AMD's Llano and graphics cards are selling great. Bulldozer is just bad news for enthusiasts.


----------



## smoothjk

Here's to hoping that the next iteration will be like Phenom --> Phenom II...


----------



## Pillz Here

Quote:


> Originally Posted by *CerealKillah;15291863*
> Again, if that is your point, they need benchmarks to illustrate the point.
> 
> Everyone that has quoted me seems to miss the point of my post (which is still valid) in that the graph shown by the OP shows superiority of the SB platform, but does not show/indicate a bad user experience on BD.
> 
> PS. I would bet that 60hz is the STANDARD for gaming monitors and will account for 90% of the current systems out there.


These are enthusiast level setups, not the "90% of the current systems out there" that you're referring to. I'd say it's highly probable for anyone that throws down the cash for either of these setups to also have a 120hz monitor, or multiple 120hz monitors. I know if I paid for 3 6970's and chose to game on a single 1920x1200 monitor, it certainly wouldn't be 60hz.


----------



## Fuell

Bulldozer has one area where it absolutely shines, and I would recommend it hands down to any AM3+ board owner... but that's sadly a small percentage of even enthusiast-level applications.

I still think all the issues listed in the last few posts, along with AMD's radically new architecture and the transition to 32nm, add up to a poor launch. They obviously had high hopes and took a serious chance; it was just too risky.

Though that sounds like a cop-out, so let's all hope and pray they work out the kinks for the next edition... is there a BD-E, or does it go straight to Piledriver now?


----------



## scaz

Quote:



Originally Posted by *Don Karnage*


Personally, I could have lived with a CrossFire comparison. I don't believe I'll ever run three cards in my lifetime.


Wow, you must be really old. I remember when I was amazed by the Voodoo cards. Then I was amazed that CPUs were going to hit 1GHz. Now I have a server at work with 24 cores and 256GB of RAM. I really can't foresee how things will be in the future.


----------



## EvanPitts

With all of the problems - looks like a good time to pick up some discounted Phenom II processors...


----------



## Knuxr

At the end of the day there are no excuses: Bulldozer fell flat on its face. Sure, there are some things that will help after the fact, but it underwhelmed overall. It sucks to see because I am always pulling for AMD, but it looks like I will be sticking with Intel for a while.


----------



## CerealKillah

Quote:



Originally Posted by *Pillz Here*


These are enthusiast level setups, not the "90% of the current systems out there" that you're referring to. I'd say it's highly probable for anyone that throws down the cash for either of these setups to also have a 120hz monitor, or multiple 120hz monitors. I know if I paid for 3 6970's and chose to game on a single 1920x1200 monitor, it certainly wouldn't be 60hz.


Valid point, for sure.


----------



## Chewy

It boggles me how AMD can get it so right with their GPUs but can't reach people's expectations with their CPUs.


----------



## Lampen

Quote:



Originally Posted by *hajile*


Yields are terrible. The launch was probably mostly a paper launch.
I don't expect some magic 50% performance increase.

Linux developers seem to have had (and may still be having) a difficult time redesigning the scheduler to suit the new architecture; I assume Windows developers are hitting similar issues. Consider the issue discussed earlier in this thread, where disabling the second integer core in each module improves performance per clock. That could point to a problem with the decode unit, but the decoder is probably not the biggest problem. Data and instructions reach the integer cores only *after* being decoded, so a decoder bottleneck exists regardless of which integer cores are working downstream (the maximum number of instructions decoded per unit of time caps the number of instructions executed per unit of time, no matter how many execution units are present). Rearranging the chip in software (by disabling some cores) to simulate a more traditional architecture therefore suggests the gains are *more likely* due to better scheduling and less cache thrashing, not to "getting more instructions to fewer cores".

AMD's problem is that the CPU (apparently) can't effectively use all the available integer cores, thanks to poor scheduling by the OS, large cache latencies, too few decoders, and poor branch prediction. That is to say, the integer units (and possibly the FPUs) are being starved.

The poor scheduling can be fixed, and (based on Anandtech's not-fully-optimized Windows 8 alpha test) fixing it will decrease normal power consumption (unused cores can downclock) while increasing performance by 10% or maybe more (some 4c/4cu benches showed >20% increases). The cache latencies were probably raised above the intended figures to increase yields, given poor 32nm performance at GlobalFoundries, and should improve over the next stepping or two. (Side note: one of AMD's goals was near-linear performance scaling with clock speed, something SB doesn't achieve, and the 30% clock speed advantage over Deneb that was initially expected will also be a side effect of fab improvements.)

Improving the decoders (if necessary) and the branch prediction requires a complete reworking of the processor's front end. With normal development times for even simple chip redesigns running a couple of years, I suspect AMD knew about the poor decode and branch prediction months ago; that is the only explanation for how soon Piledriver is being released (just a few months, rather than the couple of years a major redesign takes). AMD likely counted on the 30% higher stock clock speed to carry them until the redesign was finished (notice that the 4.6GHz overclocked benches, roughly 30% faster than Deneb designs, were fairly competitive), but AMD was let down by the fab (though I believe AMD is also at fault for shipping a flawed design).

My prediction (please don't quote me later; I am being optimistic, but in reality I have little faith):

Between a 10-15% average OS performance increase (this seems fairly definite), better fabs giving (I guess) a 20% increase in clock speed rather than 30% (scaling almost linearly), better fabs giving nearly an 80% improvement in cache latencies (matching Deneb latencies should be entirely possible; cache is cache), and a 10-15% IPC improvement (also fairly definite) from more decode throughput and better branch prediction, I believe the next iteration will show more of the architecture's theoretical potential.

edit: At best, 15% from the OS and 15% from the redesign gives roughly a 30% IPC boost (making it 20% faster than Deneb and 20% slower than Sandy Bridge). Better cache latencies are a mixed bag; they may give less than 2% in some applications and more than 20% in others. If clock speed can also be increased, the combined gains land anywhere between 35% and 90% overall (a huge delta). Even with a 70-90% increase in overall performance, performance per transistor would still be worse than Sandy Bridge's.

That seems to be the only explanation for how a chip with half Bulldozer's transistor count can perform better. As the chip stands, I couldn't recommend that anyone buy one (I don't think I could recommend it even if the OS problem went away).


Excellent post, sir!

+rep


----------



## lloyd mcclendon

This is what I was expecting:

====================== bulldozer

==================== i7 2600k

========= i5 etc

======= phenom II


----------



## Blitz6804

Hey there, everyone!

Please remember that we should all be civil and professional on this forum. Please check your profanity and personal insults at the door.


----------



## Lampen

Quote:



Originally Posted by *Blitz6804*


Hey there, everyone!

Please remember that we should all be civil and professional on this forum. Please check your profanity and personal insults at the door.


Whoa now. I was told I could have at least one carry-on bag for my insults and profanity!


----------



## AMDMAXX

lol, my 2600K is only capable of 5.0GHz if I turn off HT.

This review is a joke... 4.7GHz is the highest I can go with HT on.

Granted, my 2600K is from the first generation of these processors; I'm sure the newer ones are better, but still.

No matter what, the other reviews show me BD failed. I will be picking up Ivy Bridge and running the SB as my second computer, most likely turning it (the SB box) into a virtual machine box. I was thinking about doing a Bulldozer build for my VM box, but not with its single-threaded performance...


----------



## M3T4LM4N222

I just died.


----------



## Blitz6804

As long as the bag remains closed and locked, have at it.

Lest this thread be closed and locked.


----------



## dklimitless

Quote:



Originally Posted by *Pillz Here*


The case where you're using a Bulldozer chip apparently.


Ooohhh, snap!

BD is spanked really badly by SB in these tests...

However, this review is incomplete (as has been mentioned here over and over again):
1. Test max overclocks; after all, enthusiasts want to go as far as they can (which is roughly what they did). However, I dare say fewer than 2% of SB chips are run at 5.2GHz 24/7... and BD has reported even higher (~5.5GHz) overclocks. It is only fair that they find such a BD chip and test with it if they really want to play with golden samples.
2. There is nothing wrong with testing beyond-average OCs, but the review will remain unbalanced until they include tests at the same clocks.

We still need to realize that though this may not be reflective of the general gaming population, there *do* exist enthusiasts with such setups who have to be catered for, so...


----------



## Lampen

Quote:



Originally Posted by *Blitz6804*


As long as the bag remains closed and locked, have at it.

Lest this thread be closed and locked.


Fine, I suppose I can keep my personal threats to a minimum. Especially since it is actually time to board my flight!

See everyone later tonight!


----------



## Steak House

LOL at all the AMD Fanboys making excuses for BD - Just admit the 2500K is the way to go already...


----------



## scyy

Quote:



Originally Posted by *hajile*


Yields are terrible. The launch was probably comprised of mostly paper.
I don't expect some magic 50% performance increase.

Linux developers seem to have had (still having?) a difficult time redesigning the scheduler to suit the new architecture. I assume that Windows developers are having similar issues. The issue talked about earlier in this thread (where disabling the second integer core in each module improves performance/clock). While this could show a problem with the decode unit *but* the decode unit is probably not the biggest problem. Data and instructions reach the integer cores only *after* being decoded. A decoder bottleneck will exist regardless of which integer core or cores are working downstream(the max no. of instructions decoded per unit of time is the max no. of instructions which can be executed per unit of time regardless of how many execution units are present); however, rearranging the chip in software (via disabling some cores) to simulate a more traditional architecture shows that the performance gains are *more likely* to be due to better scheduling optimization and less cache thrashing (not "getting more instructions to fewer core").

AMD's problem is that the CPU (apparently) can't effectively use all its available integer cores, due to poor scheduling by the OS, large cache latencies, too few decoders, and poor branch prediction. That is to say, the integer units (and possibly the FPUs) are being bottlenecked.

The poor scheduling can be fixed and (based on AnandTech's not-fully-optimized Windows 8 alpha test) will decrease normal power consumption (unused cores can downclock) while increasing performance by 10% or maybe more (some 4-core/4-module benches showed >20% increases). The cache latencies were probably raised above the expected numbers to improve yields, given poor 32nm performance at GlobalFoundries; they will probably improve with the next stepping or two. (Side note: one of AMD's goals was near-linear performance scaling with clockspeed, something SB doesn't achieve, and the 30% clockspeed advantage over Deneb that was initially expected will also be a side effect of fab improvements.)

Improving the decoders (if necessary) and improving branch prediction require a complete reworking of the processor's front end. With normal development times for even simple chip redesigns being a couple of years, I suspect AMD knew months ago about the poor decode and branch prediction; that is the only explanation for how soon Piledriver is being released (just a few months, rather than the couple of years a major redesign takes). AMD likely counted on the 30% greater stock clockspeed to carry them until the redesign was finished (notice that the 4.6 GHz overclock benches, roughly 30% faster than Deneb designs, were fairly competitive), but AMD was screwed by the bad fab (though I believe AMD is at fault as well for shipping a faulty design).

My prediction (please don't quote me later; I am being optimistic, and in reality I have little faith):

Between a 10-15% average performance increase from the OS (this seems fairly definite), better fabs giving (I guess) a 20% increase in clockspeed rather than 30% (scaling almost linearly), better fabs giving nearly 80% improvement in cache latencies (matching Deneb latencies should be completely possible; cache is cache), and a 10-15% IPC improvement (also fairly definite) from more decode and better branch prediction, I believe the next iteration will show more of the theoretical potential.

edit: At best, 15% from the OS and 15% from the redesign gives a 30% IPC boost (making it 20% faster than Deneb and 20% slower than Sandy Bridge). Better cache latencies are a mixed bag; they may give less than 2% for some applications or more than 20% for others. If clockspeed can also be increased, the combined gains land between 35% and 90% (a huge delta). Even with a 70-90% increase in overall performance, performance per transistor would still be worse than Sandy Bridge's.

This seems to be the only explanation for how a chip with half the transistor count of Bulldozer can have better performance. As the chip currently stands, I couldn't recommend that anyone buy one (I don't think I could recommend one even if the OS problem went away).
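hajile's compounded estimate can be sanity-checked with quick arithmetic. The sketch below treats each guessed factor as an independent (low, high) multiplier, which is an assumption on my part (gains from scheduling, IPC, clocks, and cache certainly overlap in reality); the low clock bound and the cache-latency range are hedged placeholders, not figures from the post:

```python
# Back-of-envelope compounding of the (low, high) multiplier guesses above.
# Independence of the factors is an assumption, not a claim about the chip.
factors = {
    "OS scheduling fix": (1.10, 1.15),
    "IPC redesign":      (1.10, 1.15),
    "clockspeed":        (1.10, 1.30),  # low end hedged below the 20% guess
    "cache latency":     (1.00, 1.10),  # "mixed bag": 0% to ~10% assumed
}

low = high = 1.0
for lo, hi in factors.values():
    low *= lo
    high *= hi

print(f"combined gain: {low - 1:.0%} to {high - 1:.0%}")
# -> combined gain: 33% to 89%
```

With those placeholders the compounded band comes out near the 35-90% range quoted above, which suggests the post's arithmetic is multiplicative rather than additive.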


I completely agree with this post; Bulldozer was pushed out too soon. The architecture clearly has potential if they can get past these frankly performance-breaking issues. AMD really should have taken a page from Intel's playbook and done a die shrink of the Phenom II, so GlobalFoundries would have gained 32nm experience, rather than jumping into a new architecture and a new manufacturing process at the same time.


----------



## GTR Mclaren

OK, now I'm tired of the Bulldozer haters.

Yeah, it's inferior to SB. We get it, people.


----------



## Majin SSJ Eric

Quote:



Originally Posted by *elito*


May I ask why reviewers ALWAYS, LIKE 99% ALWAYS, bench these things at DIFFERENT CLOCKS? I mean, honestly, we all knew SB beats BD. But really, what's the point if you're going to do 5.2 GHz vs. 4.7 GHz?? What's wrong with leaving both at 4.7??!?!? I don't get these reviewers, man. Having a better/stronger/faster architecture is one thing, but putting the chips at clocks that hand one of them PURE WIN is another. A Ferrari with 600 hp vs. a Porsche with 500 hp: hmm, the Ferrari def wins, but thanks to the great difference in the engine, it's now PURE WIN.
P.S. They dare to say "apples to apples"; well, obviously you failed if you can't even configure both systems to match each other.


Because then you are downclocking the 2600k to a speed it is more than capable of surpassing just because the 8150 can't go any higher. How on earth is that considered fair to SB, that it be punished for being a BETTER clocker?!?!


----------



## [T]yphoon

Quote:



Originally Posted by *Razi3l*


The silliest thing here is that they compared 4.7 GHz to 5.2 GHz. TweakTown reviews are terrible anyway.

SB owners won't stop gloating and being smug now that Bulldozer isn't all it was expected to be. How sad, lol. If it were ~$40-$50 cheaper it would be amazing.


So a slight increase in clocks gives the user 50% more FPS???

Quote:



Originally Posted by *Don Karnage*


I'm not sure why they didn't use a 2500K in the review instead of the 2600K. Results would have been identical


Why use an FX-8150 8-core against an i5-2500K 4-core? 8 cores vs. 8 cores.


----------



## Pendulum

Disappointing...


----------



## Majin SSJ Eric

Quote:



Originally Posted by *Jagged_Steel*


And selling a product faster than you can make them is bad? AMD is also selling more Llanos than they can produce. AMD is also selling more 6950/70/90s than they can produce. So they have several new products that people are standing in line for, and what I hear (in this forum anyways) is that somehow this = failure for AMD which of course is completely ridiculous. I operated a manufacturing business for a decade or so, and I can tell you from experience that having higher demand than you can keep up with is a heck of a lot better position to be in than being able to crank out lots of products that are not wanted.


Last I checked, Intel wasn't exactly having trouble shifting SB processors....


----------



## missingno

The clock speeds bother me. Why wouldn't you put both at 5 GHz? Or 4.5?

This is embarrassing for AMD, though. The 8-core 3.6 GHz chip should be taking on the 980X (~$700 price difference aside). Flagship vs. flagship.

But no, it has trouble dealing with a mid-grade processor (though "enthusiast" now, it will be mid-tier once SB-E is out), the 2500K. Let alone the 2600K they used in this review.

Pathetic.

And don't tell me it's not supposed to compete with Intel's best. If it were faster than the 980X, you know there would be no mercy from the AMD fans, and Intel people would use the "lol 8 core vs 6 core" excuse we saw in the i7 950/930 vs. 1090T reviews (though that was 6 vs. 4 cores, and AMD users still QQed about Hyper-Threading).


----------



## tpi2007

Quote:



Originally Posted by *M3T4LM4N222*


That's a problem though. Compatibility was one of the things that made AMD CPUs so viable and great: you didn't need to buy a new board every time you wanted a new CPU. AMD has generally been better at the money-saving aspect, but Bulldozer is the exact opposite of what AMD usually produces.


And then there are the AGESA BIOS updates, which are giving more trouble than they should. It appears that Bulldozer BIOS updates can actually hurt the performance of the Deneb and Thuban CPUs people already have in these boards. So much for backwards compatibility and AMD's advantage of having "one socket fits all".

Quote:



Originally Posted by *Domino*


So what's the verdict on this puppy? Are we looking at a faulty scheduler, poor cache latencies, and maybe a lack of proper Windows support? Can we expect a Fermi here, where performance increases after an update, rather than a NetBurst rehash?


Some people keep comparing this to Fermi, but it's not really comparable. Fermi did perform better as a single GPU than AMD's competitor of the time; it just came with inadequate cooling (hot and loud). Now that coolers like the AXP exist, reference coolers use a more sophisticated design, and high-power-consumption GPUs like the HD 6970, GTX 570 and GTX 580 draw far fewer complaints, the GTX 480 is a fine card.

Changing the stock cooler on the FX-8150 is not going to have the same effect though.


----------



## Majin SSJ Eric

Quote:



Originally Posted by *missingno*


The clock speeds bother me. Why wouldn't you put both at 5 GHz? Or 4.5?

This is embarrassing for AMD, though. The 8-core 3.6 GHz chip should be taking on the 980X (~$700 price difference aside). Flagship vs. flagship.

But no, it has trouble dealing with a mid-grade processor (though "enthusiast" now, it will be mid-tier once SB-E is out), the 2500K. Let alone the 2600K they used in this review.

Pathetic.


As was said in the article, they OC'ed each processor as high as it would go (which is what any enthusiast would do). How is that not fair when comparing ultimate speed? Crippling the 2600k with an artificially low clock just to make things fair for the 8150 is inherently UNFAIR to the 2600k. What is so hard to understand here?

Put another way, if the 8150 clocked higher than the 2600k, I would not complain. OCing results are NEVER guaranteed. None of this matters anyway because we all know the 2600k would have mopped the floor with the 8150 at 4700MHz just the same....


----------



## GTR Mclaren

Funny... all the reviews use Far Cry 2 as their game benchmark... a well-known anti-AMD game...


----------



## scyy

Quote:



Originally Posted by *Majin SSJ Eric*


Because then you are downclocking the 2600k to a speed it is more than capable of surpassing just because the 8150 can't go any higher. How on earth is that considered fair to SB, that it be punished for being a BETTER clocker?!?!


Actually, lots of reports show 5 GHz+ being pretty average for BD, whereas a 5 GHz+ SB is a golden chip.


----------



## ToTheSun!

Quote:



Originally Posted by *[T]yphoon*


Why use an FX-8150 8-core against an i5-2500K 4-core? 8 cores vs. 8 cores.


8 cores vs 8 threads*

Sorry, I'm a prick =(


----------



## missingno

Quote:



Originally Posted by *Majin SSJ Eric*


As was said in the article, they OC'ed each processor as high as it would go (which is what any enthusiast would do). How is that not fair when comparing ultimate speed? Crippling the 2600k with an artificially low clock just to make things fair for the 8150 is inherently UNFAIR to the 2600k. What is so hard to understand here?

Put another way, if the 8150 clocked higher than the 2600k, I would not complain. OCing results are NEVER guaranteed. None of this matters anyway because we all know the 2600k would have mopped the floor with the 8150 at 4700MHz just the same....


The point isn't overclockability; anyone can shove 2 V through a chip for a review. The point is to see instructions per clock cycle, then go to the AMD and Intel subforums to see what most people get out of these chips. The truth is, it's decently difficult to break the 5 GHz / 1.45 V barrier with Sandy Bridge (for 24/7 use). I have four 2500K CPUs from various Microcenters, and one got past 5 GHz without trouble. The motherboard is a P8P67 Pro 3.1 with 14-phase power, so it's not the board's fault.
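The per-clock point above can be made concrete with a toy model: to a first approximation, throughput scales as IPC × clock, so testing at equal clocks isolates the IPC gap, while testing each chip at its maximum overclock mixes IPC with achievable frequency. All numbers below are hypothetical, purely for illustration:

```python
# Toy model: relative throughput ~ IPC * clock. All numbers are hypothetical.
def perf(ipc: float, ghz: float) -> float:
    """Relative throughput under a simple IPC-times-frequency model."""
    return ipc * ghz

# Equal-clock comparison isolates the IPC gap:
same_clock = perf(1.25, 4.7) / perf(1.00, 4.7)
print(round(same_clock, 2))  # -> 1.25, the raw IPC ratio

# Max-OC comparison folds in each chip's achievable clock as well:
max_oc = perf(1.25, 5.2) / perf(1.00, 4.7)
print(round(max_oc, 2))  # -> 1.38
```

Under this model a chip with 25% higher IPC needs no clock advantage to win an equal-clock test; a max-OC test simply multiplies the two effects together, which is why the two review styles answer different questions.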


----------



## Amhro

Quote:



Originally Posted by *smoothjk*


Here's to hoping that the next iteration will be like Phenom --> Phenom II...


Were the first Phenoms really that bad?

Mine is pretty good.


----------



## HAVO

Quote:



Originally Posted by *Majin SSJ Eric*


Somebody send me an 8150 and a mobo and I'll happily plug in my 580's and see *my PSU explode*










Fixed


----------



## Don Karnage

Quote:



Originally Posted by *[T]yphoon*


Why use an FX-8150 8-core against an i5-2500K 4-core? 8 cores vs. 8 cores.


The 2600K has 8 threads via Hyper-Threading. It's still a 4-core processor, and games don't utilize HT, so the 2500K would have been fine.

Intel's 8-core processors run 16 threads.


----------



## missingno

I have a feeling people will get upper-5 GHz clocks on these Bulldozer chips, which may make them compete with 5 GHz 2600Ks. Or maybe they won't. The TDP is awful.

http://www.overclock.net/intel-cpus/...ue-4-8ghz.html

Just backing up my claim; look what I see when I refresh the home page. Trouble with 4.8??


----------



## kweechy

Quote:



Originally Posted by *GTR Mclaren*


funny...all the reviews use Farcry 2 as their game benchmark...a well know anti AMD game...


They also used about 100 other apps.


----------



## Xenthos

I wanna see a Phenom II x6 1090T in there. Just for giggles


----------



## Majin SSJ Eric

Fine, I don't even really care about the 5.2 GHz number. Somebody here on OCN will end up comparing the 8150 and 2600K at the same clocks (probably fairly soon), and I will be totally unsurprised when the results are still a significant loss for BD...


----------



## VinhDiezel

Quote:



Originally Posted by *missingno*


i have a feeling people will get upper 5ghz clocks on these bulldozer chips, and may make it compete with the 5ghz 2600ks. maybe they wont. the tdp is awful.

http://www.overclock.net/intel-cpus/...ue-4-8ghz.html

just backing up my claim, look what i see when i refresh the home page. trouble with 4.8??


That goes to show that each and every CPU will OC differently.


----------



## missingno

Quote:



Originally Posted by *Majin SSJ Eric*


Fine, I don't even really care about the 5.2GHz number. Somebody here on OCN will end up comparing the 8150 and 2600k at the same clocks (probably fairly soon) and I will be totally unsurprised when the results are still a significant loss for the BD....


No, I totally agree; the 2600K beats anything AMD has. I was criticizing the review.


----------



## Booty Warrior

Yeah, I don't really see why people are complaining about the clocks used. It's not like the results were even in the same ballpark.

The 2600k was over 75% faster in some of those benches... You could probably drop the i7 down to 4GHz and it would still be ahead.


----------



## QuackPot

Quote:



Originally Posted by *Jagged_Steel*


And selling a product faster than you can make them is bad? AMD is also selling more Llanos than they can produce. AMD is also selling more 6950/70/90s than they can produce. So they have several new products that people are standing in line for, and what I hear (in this forum anyways) is that somehow this = failure for AMD which of course is completely ridiculous. I operated a manufacturing business for a decade or so, and I can tell you from experience that having higher demand than you can keep up with is a heck of a lot better position to be in than being able to crank out lots of products that are not wanted.


Uhh, it's not that there's a high demand for them. They just haven't produced enough, because they delayed BD for so long.

Hardly any UK or EU suppliers have them, because AMD hasn't been producing them long enough to build reasonable stockpiles. That's why there won't be much stock here until the 24th.

AMD is just slow. The chips aren't in that much demand.


----------



## ekg84

Quote:



Originally Posted by *Devilmaypoop*


I wonder where Jagged Steel went..


Probably having a briefing at AMD headquarters.


----------



## Iceman23

It's obviously meant to compare the two processors at their peak performance; I don't get how that's hard to see. We already know IPC is better on Intel, so what additional knowledge would you gain by running them at the same clocks? This is the much more realistic scenario: most of us overclock to some limit and then evaluate performance from there.


----------



## MoBeeJ

http://www.guru3d.com/article/amd-fx...rmance-review/

Quote:



This article is nothing more than a performance overview of all four processors released yesterday. So this article is basically a big performance chart; for the architecture specifications and conclusions we urge you to read our AMD FX 8150 review.

The processors added into the chart are the eight core AMD FX 8120, the six-core AMD FX 6100 and the four core AMD FX 4100.


All I can say now is: mmmmm

I know there's already a BD thread with all the links, but this is the first one with the 6- and 4-core BD parts, so I wanted it to be separate. Mods can merge it.


----------



## formula m

If a specific # in a benchmark is all you are after, then know this about yourself and stop projecting this specific (personal) need as a want of the populace.

It is irrational and ignorant.

"Joe Public" will be served by BD as well as by SB over the next 2-4 years. But if you change CPUs more often than that, and somehow decide BD is a bust because next year something better is coming, then understand that you are once again a special human being, with special and personal needs.

But you should also know about your specialness and temper your posts accordingly.

If you are not special and do not care about the arbitrary, then run any 3 benchmarks at once and get back to "Joe Public"... because that is how people abuse their systems: with 20 animated icons, different mouse pointers, Active Desktop, Minecraft and FarmVille BOTH running, plus the ambiguous Windows tray full of cute things, etc.

Real-life benchmarking?
Benchmark after any 13-year-old girl has had the computer for 6 months, with no fresh install of the OS. Then see which architecture can handle the OS.

Or you can get a 6-pack, some Vaseline, etc., run Cinebench all night, and call all of your buddies to partake in how special you are and how special your rig is... _after formatting the C drive an hour before._








That is about as fake as "real world" benching can be (for "Joe Public").


----------



## mrcool63

I don't understand this!!! The fact is that SB maxes out at dual 8x, so how the hell did it manage to beat BD, which actually has dual 16x PCIe capability? This is absolute crap!!!
SB offers only 20 PCIe lanes, whereas BD has 32+. Now please explain how this s**t is possible!!


----------



## Dapman02

AMD, what happened to you


----------



## Iceman23

Quote:



Originally Posted by *formula m*


If a specific # in a benchmark is all you are after, then know this about yourself and stop projecting this specific (personal) need as a want of the populace.

It is irrational and ignorant.

"Joe Public" will be served by BD as well as by SB over the next 2-4 years. But if you change CPUs more often than that, and somehow decide BD is a bust because next year something better is coming, then understand that you are once again a special human being, with special and personal needs.

But you should also know about your specialness and temper your posts accordingly.

If you are not special and do not care about the arbitrary, then run any 3 benchmarks at once and get back to "Joe Public"... because that is how people abuse their systems: with 20 animated icons, different mouse pointers, Active Desktop, Minecraft and FarmVille BOTH running, plus the ambiguous Windows tray full of cute things, etc.

Real-life benchmarking?
Benchmark after any 13-year-old girl has had the computer for 6 months, with no fresh install of the OS. Then see which architecture can handle the OS.

Or you can get a 6-pack, some Vaseline, etc., run Cinebench all night, and call all of your buddies to partake in how special you are and how special your rig is... _after formatting the C drive an hour before._








That is about as fake as "real world" benching can be (for "Joe Public").


So you would argue, at this point in time, that BD is a good value upgrade for those who already have a PII X4 or an X6 Thuban? Would you argue that it's the best value for this average home user you describe who is looking for a new PC? I sure hope not. This is an ENTHUSIAST chip, marketed to ENTHUSIASTS. Your argument is baseless: the average home user would see little difference between any current CPUs. You can tell "Joe Public" to buy an 8-core CPU; I'll tell them that a quad-core PII for half the price will offer a user experience that is probably indistinguishable. That is real life.


----------



## finalturismo

The only things Bulldozer beats Sandy Bridge at are encoding and archive compression/decompression.

FAILDOZER


----------



## SOCOM_HERO

My goodness, what in the world is going on with AMD? I really wanted BD to live up to its name, but given how pathetic these results are, the bottleneck is the CPU, people. The 170+ page thread of doom proves that beyond doubt.


----------



## el gappo

SB @ 5.2 and BD @ 4.7!???? Yeah because that's fair...


----------



## jagz

Quote:



Originally Posted by *el gappo*


SB @ 5.2 and BD @ 4.7!???? Yeah because that's fair...


True, but even if they were at the same clock, it wouldn't be close.


----------



## Fletcherea

Jeez, I don't feel so bad about grabbing my tiny little i3 now. Seems I made a fairly informed purchase (see, OCN isn't just arguing all the time; look around and ask, and you'll get some help!). Based on the charts (i5-665k) that involve *my usage*, the little guy is hanging right there with the FX 4100, slightly below, but this is a little dualie!


----------



## Baron_Davis

They must have a trick up their sleeve... because this is pretty pathetic.


----------



## MiKE_nz

Bulldozer should be shunned. We should all pretend that it never happened.


----------



## Axon14

Quote:



Originally Posted by *Domino*


So what's the verdict on this puppy? Are we looking at a faulty scheduler, poor cache latencies, and maybe a lack of proper Windows support? Can we expect a Fermi here, where performance increases after an update, rather than a NetBurst rehash?


I think the BD architecture can ultimately work, as the performance in heavily threaded apps is what you would expect. But this generation doesn't have the single-threaded/lightly threaded performance you'd want or expect.

AMD took a chance here in redesigning everything from the ground up, so I'm giving this a pass. I hope they learn a lesson from it like NVIDIA did with Fermi.


----------



## Kieran

According to the charts, the FX 8150 is equivalent to or better than the i5 2500K but worse than the i7 2600K. I'm quite disappointed by the performance of the 6- and 4-core models; the Bulldozer 6-core is worse than my current CPU.


----------



## PappaSmurfsHarem

Quote:



Originally Posted by *MiKE_nz*


Bulldozer should be shunned. We should all pretend that it never happened.



SHUUUUUUUUN!


----------



## Homeles

The problem with Bulldozer isn't Bulldozer itself, it's the lack of software support for multithreading.


----------



## mickeyfuqinp

im gona buy bulldozer so i can take pictures with it, and post them all over OCN.

then everyone will be like "omgzz y u no buy inteal 4 peecturz!"


----------



## sloppyjoe123

Even for an Intel user, these reviews are hard to read....


----------



## Jagged_Steel

Quote:



We would like to thank the following companies for supplying and supporting us with our test system hardware and equipment: Intel, ASUS, Sapphire, Western Digital and Corsair.


So let me get this straight: Intel hands this website money and/or equipment, and then the "test" ends up slanted in their favor. Hmm, do you think there is a connection?


----------



## mrcool63

My advice to AMD would be to shift this entire architecture onto a 28nm die.

It would give them tighter integration of components, and thereby less leakage and lower power draw, and thus a lower TDP.

Shorter pipelines, as one of my friends mentioned, so that AMD can run shorter threads better than BD currently does.

And increased per-core performance to boost overall performance, along with larger L2 and L3 caches!


----------



## Iceman23

Quote:



Originally Posted by *el gappo*


SB @ 5.2 and BD @ 4.7!???? Yeah because that's fair...


I'll quote myself again.

Quote:



It's obviously meant to compare the two processors at their peak performance; I don't get how that's hard to see. We already know IPC is better on Intel, so what additional knowledge would you gain by running them at the same clocks? This is the much more realistic scenario: most of us overclock to some limit and then evaluate performance from there.


Are there not enough benchmarks already out comparing the processors with similar clock speeds/at stock for you to determine per clock performance?

Quote:



Originally Posted by *Jagged_Steel*


So let me get this straight. Intel hands this website money and or equipment, and then the "test" ends up getting slanted in their favor. Hmm, do think there is a connection ?










So let me get this straight: Intel has paid off every reviewer? This review shows performance quite similar to the others; if you argue there is a connection, then I can logically assume every review showing a similar performance discrepancy is also the result of Intel bribes. Can't wait to hear your next rationalization.


----------



## mrcool63

Yes, but scaling is the question, right? And at higher resolutions bandwidth plays a very major role... the 8150 is not such an inferior CPU that it gets bottlenecked that easily!!

Something is inherently wrong with those results!!


----------



## Devilmaypoop

Quote:



Originally Posted by *mrcool63*


Yes, but scaling is the question, right? And at higher resolutions bandwidth plays a very major role... the 8150 is not such an inferior CPU that it gets bottlenecked that easily!!

Something is inherently wrong with those results!!


...

Bulldozer is 25% slower on average than Phenom II. Phenom II is around 30%-50% slower than Sandy Bridge.

Games only use 2-4 cores most of the time.

Those benchmarks make perfect sense.

Quote:



So let me get this straight, Intel has paid off every reviewer? This shows quite similar performance to other reviews, if you argue there is a connection then I can logically assume every review showing similar performance discrepancy is also the result of Intel bribes. Can't wait to hear your next rationalization


It all makes sense now.. Bulldozer must be around 200% faster than Sandy Bridge, but Intel has bribed everyone!


----------



## illsupra

welp.....guess i'll be picking up a 1100T


----------



## kiwiasian

Well, on the bright side, they are priced just right.
Oh wait......


----------



## Bit_reaper

They need to drop the FX-8150's price down to the 1100T's level if they intend to sell any Bulldozers.


----------



## hajile

Quote:



Originally Posted by *mrcool63*


My advice to AMD would be to shift this entire architecture onto a 28nm die.

It would give them tighter integration of components, and thereby less leakage and lower power draw, and thus a lower TDP.

Shorter pipelines, as one of my friends mentioned, so that AMD can run shorter threads better than BD currently does.

And increased per-core performance to boost overall performance, along with larger L2 and L3 caches!


28nm is a bulk process. Even if AMD redesigned the processor for it, clockspeeds would be terrible (think less than 2 GHz). A bulk process is also leakier; switching to it would increase power consumption and raise TDP despite the smaller node.

Redesigning Bulldozer with a shorter pipeline is not related to the manufacturing process, and making a shorter pipeline would take several years of research and design.

AMD is already attempting to increase IPC (instructions per clock) with the upcoming Piledriver core (as I stated before, probably focusing on the decode and branch prediction units). The chip is massive (I think it is by far the largest consumer CPU ever made), and it doesn't need more L2 or L3 as much as it needs its current bottlenecks fixed.

While a more refined half-node could be created, that would take a year or two; in the meantime, AMD will already have fixed most of the serious Bulldozer problems.


----------



## Jagged_Steel

Quote:



Originally Posted by *Iceman23*


I'll quote myself again.

Are there not enough benchmarks already out comparing the processors with similar clock speeds/at stock for you to determine per clock performance?

So let me get this straight, Intel has paid off every reviewer? This shows quite similar performance to other reviews, if you argue there is a connection then I can logically assume every review showing similar performance discrepancy is also the result of Intel bribes. Can't wait to hear your next rationalization










I don't know about "every" reviewer, but this one here ADMITS that they are getting paid by Intel. When somebody openly announces to me that they are dishonest, I believe them. You can pull the wool over your own eyes and pretend that this isn't the case all you want; I will stick to reality.

Quote:



We would like to thank the following companies for supplying and supporting us with our test system hardware and equipment: Intel, ASUS, Sapphire, Western Digital and Corsair.


Gee thanks for the big check and all the cool gear Intel, you guys sure are swell.







We would like to thank you by throwing our readers a little spitball making FX look bad.


----------



## tafkar

Quote:



Originally Posted by *mickeyfuqinp*


im gona buy bulldozer so i can take pictures with it, and post them all over OCN.

then everyone will be like "omgzz y u no buy inteal 4 peecturz!"











Put some lipstick on the chip, some glue-on eyes with fake eyelashes, and put a doll's dress on the aluminum box.

Take it on a date. Order it a salad. _Then_ get your picture taken with it.


----------



## Snowmen

Quote:



Originally Posted by *Razi3l*


The silliest thing here is that they compared 4.7 GHz to 5.2 GHz. TweakTown reviews are terrible anyway.

SB owners won't stop gloating and being smug now that Bulldozer isn't all it was expected to be. How sad, lol. If it were ~$40-$50 cheaper it would be amazing.


Yeah, because if I buy a 4 GHz CPU instead of a 3 GHz CPU, I'll downclock the faster one so it's not unfair to the slower one. If you're going to buy a chip, you're going to run it as fast as it will go, so a review that runs both chips as fast as they'll go is perfect for me.


----------



## TheBlademaster01

So 4 BD modules are an upgrade over 4 K10 cores, and 3 modules over 3 cores, etc.? But 6 BD cores < 6 K10 cores? Even that doesn't always hold. I'm confused...


----------



## Iceman23

Quote:



Originally Posted by *Jagged_Steel*


I don't know about "every" reviewer, but this one here ADMITS that they are getting paid by Intel. When somebody openly announces to me that they are dishonest, I believe them. You can pull the wool over your own eyes and pretend that this isn't the case all you want; I will stick to reality.


Just because they were given testing hardware by Intel does not automatically mean the reviewer was dishonest in presenting the results. The fact that EVERY review has placed BD's performance very similarly should tell you that either 1) every review is dishonest and skewed in Intel's favor, or 2) BD's performance is being accurately reported. I know which is more plausible to me, but I guess in your fantasy land things may be different.


----------



## gsa700

Quote:



Originally Posted by *Bit_reaper*


They need to drop the FX-8150 price down to the same level as an 1100T if they intend to sell any Bulldozers.


You mean below the 1100T; the Thuban is faster in almost every way.


----------



## pale_neon

Remember when the first Phenom X4 came out and dual-core Core 2 Duos blew it out of the water in the games of the day? After the B3 stepping arrived and fixed the TLB bug five months later, that was no longer the case.

Hopefully BD will be the same. It's pretty close to Sandy Bridge right now, so if they can match the kind of performance boost they got with the revision of the first Phenom, it would end up faster than SB.


----------



## Alatar

Quote:



Originally Posted by *Jagged_Steel*


I don't know about "every" reviewer , but this one here ADMITS that they are getting paid by Intel. When somebody openly announces to me that they are dishonest, I believe them. You can pull the wool over your own eyes and pretend that this isn't the case all you want, I will stick to reality.


Supplied by Intel? Just like all reviewers are supplied with review samples of pretty much everything? Just like all the BD reviewers were supplied with FX procs?

Nah, can't be.


----------



## Jagged_Steel

Quote:



Originally Posted by *Iceman23*


Just because they were given testing hardware by Intel does not automatically mean that the reviewer was dishonest in presenting the results. The fact the EVERY review has placed BD performance very similarly should tell you that either 1) every review is dishonest and skewed in Intel's favor or 2) BD performance is being accurately reported. I know which one is more plausible to me, but I guess in your fantasy land things may be different


Wrong, there are reviews that say FX is doing just fine. How do you explain some people getting good results, and others getting bad results in the very same tests? Do you think some reviewers got special magic CPUs that work way better than the "regular" ones, or do you think that the ones showing good performance might be the accurate ones and those that do not have run into a BIOS/driver/hardware/MoneyMotivation issue that has skewed their results?

Edit to add:

Quote:



Originally Posted by *Alatar*


supplied by intel? Just like all the reviewers are supplied with review samples of pretty much everything? Just like all the BD reviewers were supplied with FX procs?

nah, can't be.


Please show me where AMD gifted anything to the reviewers in this article. If they had gotten gifts from AMD, it would be listed, and it isn't; therefore your "all the reviewers are supplied with review samples" statement is completely false. Maybe you should stick to the facts and avoid sweeping generalizations based on complete conjecture in the future.


----------



## B NEGATIVE

Link your evidence Devilmaypoop? 
http://www.guru3d.com/article/amd-fx...mance-review/4

Seeing as it's the cache that is doing BD in, it doesn't look too bad...


----------



## Bit_reaper

Quote:


> Originally Posted by *gsa700;15294722*
> You mean below the 1100t, the Thuban is faster in almost every way.


Now let's be fair, the Thuban is good but not that good. In most cases the BD is faster, but the problem is that it struggles to get any significant lead over its predecessor. I would consider the 8150 the better CPU when the OC headroom is taken into account. The BD can still find its uses, but it can't be put on equal ground with the 2500K, so it needs to be cheaper.


----------



## Alatar

Quote:


> Originally Posted by *Jagged_Steel;15294829*
> Wrong, there are reviews that say FX is doing just fine. How do you explain some people getting good results, and others getting bad results in the very same tests? Do you think some reviewers got special magic CPUs that work way better than the "regular" ones, or do you think that the ones showing good performance might be the accurate ones and those that do not have run into a BIOS/driver/MoneyMotivation issue that has skewed their results?


The difference is GPU power. These guys used three 6970s to remove the GPU bottleneck in games and showed what the performance of the FX really is. There's no use in testing CPUs at high resolutions and detail with only single-GPU setups, because even C2Qs can get the same frames as FX, i7s, etc. at that point.


----------



## hajile

Quote:


> Originally Posted by *pale_neon;15294754*
> Remember when the first Phenom X4 came out & the dual core C2 Duos blew it out of the water in current games but after the B3 stepping came out and they fixed the TLB 5 months later it wasn't the case anymore.
> 
> Hopefully BD will be the same. It's pretty close to sandy bridge right now, so if they can match the kind of performance boost they got w/ the revision of the first Phenom it would make it faster than SB.


Speaking in consumer terms (Server and HPC were a different story).

Core 2 was still 10% faster per clock and overclocked better because Phenom had a cold bug. AMD didn't catch up with core 2 until Phenom II. Bulldozer is (at best) 25% slower per clock. Until clockspeeds improve and the bottlenecks are removed, Bulldozer is not competitive.


----------



## gsa700

Quote:


> Originally Posted by *Bit_reaper;15294876*
> Now lets be fair the thuban is good but not that good. In most cases the BD is faster but the problem is that it struggles to get any significant lead over its predecessor. I would consider the 8150 the better CPU when the OC head room is taken into account. The BD can still find its uses but it cant be put on equal ground wit the 2500k so it need to be cheaper


All the reviews I've seen have the 1100T ahead in most cases, including IPC, which is the one that counts. Look at the numbers for the 6-core BD as well.

As Master Yoda might say: "Pathetic it is."


----------



## Iceman23

Quote:


> Originally Posted by *Jagged_Steel;15294829*
> Wrong, there are reviews that say FX is doing just fine. How do you explain some people getting good results, and others getting bad results in the very same tests? Do you think some reviewers got special magic CPUs that work way better than the "regular" ones, or do you think that the ones showing good performance might be the accurate ones and those that do not have run into a BIOS/driver/MoneyMotivation issue that has skewed their results?


Well then I can only conclude that they were paid off by AMD.


----------



## jrbroad77

Quote:


> Originally Posted by *pale_neon;15294754*
> Remember when the first Phenom X4 came out & the dual core C2 Duos blew it out of the water in current games but after the B3 stepping came out and they fixed the TLB 5 months later it wasn't the case anymore.
> 
> Hopefully BD will be the same. It's pretty close to sandy bridge right now, so if they can match the kind of performance boost they got w/ the revision of the first Phenom it would make it faster than SB.


Now come on! Phenom X4 came out after Core 2 Quad. It was no competition; a Q6600 at 4GHz blew away a 9950 at 3.2ish (max OC vs. max OC).

Anyway, the 8-core BD comes close to competing with Intel's best quads. I guess that's good? It doesn't really beat a 2600K even multithreaded (well, by 20% on occasion, with double the transistor count).


----------



## dave12

Quote:


> Originally Posted by *Bit_reaper;15294626*
> They need to drop the FX-8150 price down to the same level as a T1100 if they intend to sell any Bulldozers.


I did a super official poll (called my buddy Sean), and the MC he works at has sold "1" 6100 so far.


----------



## awdrifter

Quote:


> Originally Posted by *mrcool63;15294471*
> My advice to AMD would be to shift this entire architecture onto the 28nm die..
> 
> It would give them a tighter integration of components thereby a less leak and reduce power required and thus the TDP
> 
> Shorter pipelines as one of my friends mentioned so that AMD will be able to perform shorter threads better than it currently does as BD!!
> 
> Increase per core performance to boost the overall performance along with an increase of the L2 and L3 cache!!


TSMC is building their 28nm APUs, I think, not GlobalFoundries. GF is building most of their CPUs, and it only has a 32nm process. If TSMC has the capacity, they should definitely shrink the Phenom II X6 to 28nm, produce that as a consumer product (call it Phenom III or something), and leave Bulldozer to servers.


----------



## jivenjune

Quote:


> Originally Posted by *Epsi;15291143*
> Dunno if posted already.
> 
> AMD FX-8150 vs. Intel i7-2600k CrossFireX HD 6970 x3 Head-to-Head:
> 
> http://www.tweaktown.com/articles/4353/amd_fx_8150_vs_intel_i7_2600k_crossfirex_hd_6970_x3_head_to_head/index1.html


Wow, that was hard to read. It seriously shows the huge discrepancies between the two processors.


----------



## aznofazns

Quote:


> Originally Posted by *jivenjune;15295072*
> Wow,that was hard to read. It seriously shows the huge discrepancies between the two processors.


It almost looks like the 3rd 6970 wasn't being used at all in some of those tests.


----------






## hajile

Quote:



Originally Posted by *jivenjune*


Wow,that was hard to read. It seriously shows the huge discrepancies between the two processors.


As big as Bulldozer's problems are, the only huge discrepancy is TweakTown's bench methods. The elephant in the room is clock speed. Simply put, Bulldozer can clock higher than 4.7GHz, and Sandy Bridge almost never hits 5+ GHz. I don't need to say any more to show the problem.


----------



## jivenjune

Quote:



Originally Posted by *hajile*


As big as bulldozer's problems are, the only huge discrepancy is tweaktown's bench methods. The elephant in the room is clockspeed. Simply put, Bulldozer can clock higher than 4.7Ghz and Sandybridge almost never hits 5+ Ghz. I don't need to say any more to show the problem.


I've been hearing from multiple reviews, sites, sources, and even OCN testers that Bulldozer has an incredibly difficult time getting past the 4.6GHz wall without proper cooling, and even with proper cooling there have been a significant number of stability issues.

A recent OCN member posted a review on how to properly overclock a Bulldozer CPU using a Gigabyte 990FX UD5 or UD7, and his overclocking pretty much hit a brick wall around 5.2GHz, I believe. Even there, it seemed he had some serious stability issues, and the voltages he was pumping into the CPU were probably a bit high for 24/7 comfort.


----------



## HowHardCanItBe

Decided to merge the TweakTown thread into the official thread. Folks, please keep it clean... no flaming.


----------



## XxBeNigNxX

Quote:



Originally Posted by *hajile*


As big as bulldozer's problems are, the only huge discrepancy is tweaktown's bench methods. The elephant in the room is clockspeed. Simply put, Bulldozer can clock higher than 4.7Ghz and Sandybridge almost never hits 5+ Ghz. I don't need to say any more to show the problem.


Most if not all reviews have said that getting past 4.6ish without dropping cores/modules or using alternative cooling methods is somewhat difficult, and that doing so while retaining stability has been very difficult.

Your argument for clocking Bulldozer higher doesn't hold water.

All reviews should, however, review chips at the same clock... so if Bulldozer turbos to 4.0-4.2, that is where SB should be. But at the same time reviewers should clock them at their max stable speed as well, under the same realistic cooling conditions (air/water), because let's face it, most people don't own anything else, nor are they going to have someone constantly pour LN2 so they can use the computer.

The Problem is Bulldozer... End of Story


----------



## Madmanden

Quote:



Originally Posted by *hajile*


As big as bulldozer's problems are, the only huge discrepancy is tweaktown's bench methods. The elephant in the room is clockspeed. Simply put, Bulldozer can clock higher than 4.7Ghz and Sandybridge almost never hits 5+ Ghz. I don't need to say any more to show the problem.


Is this established fact? Most BD reviews I've seen have been at 4.4-4.8GHz (where it was often still underperforming stock SB). Besides, I wouldn't like to pay the electricity bill at that clock speed.


----------



## mrcool63

The TweakTown review is literally stupid. As I keep saying, an 8x,8x,4x tri-fire demolishes a 16x,16x,8x tri-fire???

Is BD that bad? And don't tell me 8x and 4x will not make a difference... because it will!!

Too many discrepancies in the reviews for BD. Many people are getting into the habit of picking out the worst ones and ignoring the better ones!!


----------



## Iceman23

Quote:



Originally Posted by *Jagged_Steel*


Looks about right, but you are missing the point: The slice of pie is big enough and exactly what AMD promised, and it costs less than the same slice of Pie from Intel.


Forget about Intel, Thuban will give you the same piece of pie for even less money.


----------



## pyra

Quote:



Originally Posted by *XxBeNigNxX*


Most if not all reviews have said that getting past 4.6ish without dropping cores/modules and or alternative cooling methods have been somewhat difficult and or in doing so while retaining stability has been very difficult.

Your argument for clocking bulldozer higher doesn't hold water.

*All reviews should however review chips at the same clock*.. so if Bulldozer turbos to 4.0-4.2 that is where SB should be. But at the same time reviewers should clock them at their max Stable speed as well under the same realistic cooling conditions (air/water) because lets face it anything else most people do not own nor are they going to have someone constantly pour LN2 so they can use the computer.

The Problem is Bulldozer... End of Story


I disagree.

Both CPUs are at roughly the same price point, so they should be reviewed for what they are capable of. If I am going to spend over $200 on a CPU, I'll judge it by what it can achieve at its max stable overclock, not by how fast it is at the same frequency as another chip.


----------



## mannyfc

I wonder about LN2 vs. LN2 results.


----------



## XxBeNigNxX

Quote:



Originally Posted by *mrcool63*


the tweaktown review is stupid literally. as i keep saying 8x,8x,4x trifire demolishes a 16x, 16x,8x trifire???

*is BD that bad?* and dont tell me 8x and 4x will not make a difference... because it will!!

too many descrepencies in the reviews for BD. Many people are getting into the habit of picking out the worst ones and ignoring the better ones!!


Yes, it is THAT bad... have you not been reading the reviews?


----------



## Madmanden

Quote:



Originally Posted by *mrcool63*


the tweaktown review is stupid literally. as i keep saying 8x,8x,4x trifire demolishes a 16x, 16x,8x trifire???

is BD that bad? and dont tell me 8x and 4x will not make a difference... because it will!!

too many descrepencies in the reviews for BD. Many people are getting into the habit of picking out the worst ones and ignoring the better ones!!


Clearly they were paid off by Intel! Because there haven't been enough reviews showing BD failing, so Intel needed to pay off TweakTown.

Or, BD plainly sucks at most everything compared to SB (and even Phenom II). And yes, 8x vs. 16x performance is almost the same.
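For a rough sense of the 8x vs. 16x point, here's a quick back-of-envelope sketch (assuming PCIe 2.0 figures: 5 GT/s per lane with 8b/10b encoding, i.e. roughly 500 MB/s usable per lane per direction):

```python
# PCIe 2.0: 5 GT/s per lane; 8b/10b encoding leaves ~500 MB/s
# of usable bandwidth per lane, per direction.
MB_PER_LANE = 500

for lanes in (4, 8, 16):
    gb_s = lanes * MB_PER_LANE / 1000
    print(f"x{lanes}: {gb_s} GB/s per direction")
```

An x16 slot has twice the headroom of x8, but a single GPU of this era rarely saturates even x8, which is why most reviews measure little difference between the two.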


----------



## aznofazns

Quote:



Originally Posted by *Jagged_Steel*


Looks about right, but you are missing the point: The slice of pie is big enough and exactly what AMD promised, and it costs less than the same slice of Pie from Intel.


BD definitely does not perform exactly as AMD promised. Based on their marketing and the plethora of both official and user reviews, that is a fact.


----------



## XxBeNigNxX

Quote:



Originally Posted by *pyra*


I disagree.

Both cpus are at roughly the same price point so they should be reviewed for what they are capable of. If I am going to be spending over $200 on a cpu I'll judge it by what it can achieve at it's max stable overclock not by how fast it is at the same frequency as another chip.


OK... going off of that statement: according to 99% of the reviews, they have been having a hard time OC'ing BD past 4.6-4.8 on air while keeping stability in check, without dropping cores/modules or using a more radical cooling method.

Need a Source? read the reviews located on page 1 of this thread.


----------



## hajile

Quote:



Originally Posted by *jivenjune*


I've been hearing from multiple reviews, sites, sources and even OCN testers that Bulldozer has an incredibly difficult time getting past the 4.6 Ghz wall without proper cooling, and even with proper cooling there has been a significant amount of stability issues.

A recent OCN member posted a review on how to properly overclck a Bulldozer CPU using a 990FX Gigabyte-UD5 or 7, and his overclocking pretty much hit a brick wall around 5.2 Ghz I believe. Even there, it seemed he had some serious stability issues, and the voltages he was pumping into the CPU were probably a bit high for 24/7 comfort.


Even in the event that Bulldozer chips clocking over 4.6 on air are rare, I would put forward that Sandy Bridge chips clocking to 5.2 are just as rare, or even rarer.

Quote:



Originally Posted by *XxBeNigNxX*


Most if not all reviews have said that getting past 4.6ish without dropping cores/modules and or alternative cooling methods have been somewhat difficult and or in doing so while retaining stability has been very difficult.

Your argument for clocking bulldozer higher doesn't hold water.

All reviews should however review chips at the same clock.. so if Bulldozer turbos to 4.0-4.2 that is where SB should be. But at the same time reviewers should clock them at their max Stable speed as well under the same realistic cooling conditions (air/water) because lets face it anything else most people do not own nor are they going to have someone constantly pour LN2 so they can use the computer.

The Problem is Bulldozer... End of Story


Please do not take me for a bulldozer defender. I know full well its performance and that it has many problems (look in previous pages for my enumeration of these problems). I even go so far as to claim that I see no reason to recommend bulldozer to any consumer for any reason. That said, little of this relates to the fundamental problem with the benchmark methods used by tweaktown in this test.


----------



## radaja

I wouldn't trust these reviews; I heard that Intel paid most of these sites with free hardware and money and asked the actual Bulldozer reviewers, "When you are done reviewing BD, could you write a bad review to make AMD look bad for us, because we're in no position to compete at the moment." *In essence*: "Could you please ram a piece of jagged steel into the heart of AMD for us? We are having issues with IB and need a little cover."

Come on people, a little reality never hurt anybody.

AMD messed up and that's a fact. Anyone who continues to deny this isn't being honest with themselves.


----------



## hammertime850

Quote:



Originally Posted by *XxBeNigNxX*


Ok... Going off of that statement.. According to 99% of the reviews they have been having a hard time OC'ing BD past 4.6-4.8 on air while keeping stability in check with out dropping cores/modules or using a more radical cooling method.

Need a Source? read the reviews located on page 1 of this thread.


I think you guys are saying the same thing.


----------



## XxBeNigNxX

Quote:



Originally Posted by *hajile*


Even in the event that Bulldozer chips clocking over 4.6 on air are rare, I would put forward that Sandy bridge chips clocking to 5.2 are just as or even more rare.

Please do not take me for a bulldozer defender. I know full well its performance and that it has many problems (look in previous pages for my enumeration of these problems). I even go so far as to claim that I see no reason to recommend bulldozer to any consumer for any reason. That said, little of this relates to the fundamental problem with the benchmark methods used by tweaktown in this test.


It's all good, hajile... I don't take you for anything other than a fellow OCN member and a human being.


----------



## aznofazns

Quote:



Originally Posted by *Jagged_Steel*


Quote from the Tweaktown review:

TT openly admits that they have been paid by Intel.

I disagree. What exactly did AMD promise you that you feel that they did not deliver on?


That does NOT prove that Intel is paying TT to give BD a poor review. Intel supplied TT with processors to perform benchmarks on. So what?

For the second part, this is a good place to start:


----------



## 113802

Quote:



Originally Posted by *mannyfc*


I wonder ln2 vs ln2 results


Of course Bulldozer will win, because its chipset is capable of being overclocked, unlike Sandy Bridge, which is capped at a 57x multiplier. So even if the base clock could be pushed to 120, the max SB would reach is about 6.8GHz, which is kinda hard to beat; still, I think Bulldozer can reach 7GHz with 4 modules.

I can't wait for Windows 8 to be released so processors can actually take advantage of the extra cores that usually sit idle.
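The clock-ceiling math in the post is easy to check (57x multiplier cap, and the hypothetical 120 MHz base clock assumed above):

```python
# Sandy Bridge tops out at a 57x multiplier; stock BCLK is 100 MHz.
# With a hypothetical 120 MHz base clock, the ceiling would be:
multiplier = 57
bclk_mhz = 120
max_ghz = multiplier * bclk_mhz / 1000
print(max_ghz)  # 6.84
```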


----------



## mrcool63

Quote:



Originally Posted by *XxBeNigNxX*


Yes it is THAT Bad.. have you not been reading reviews?


You have to read the last line I wrote... people are getting into the habit of taking the worst-written reviews, like AnandTech's, quoting them extensively, and conveniently ignoring Guru3D, TechSpot, and similar sites who have shown BD in a better light.


----------



## Madmanden

Quote:



Originally Posted by *Jagged_Steel*


I disagree. What exactly did AMD promise you that you feel that they did not deliver on?


What about "the highest single- and multithreaded CPU performance"? Or even just an IPC increase over Phenom II? Its per-core performance is slower than Phenom II's in CB 11.5, and it's just barely faster in the multi-CPU test despite being clocked higher. If clocked similarly, the X6 would still be faster. How is that something to celebrate?


----------



## solar0987

And they're still sold out at every shop I've looked at for the 8150. Now I have to decide how long I want to wait without a computer: hopefully an 8150 will be in stock by payday, or I try to sell my motherboard, get an Intel board, and then wait for money for a processor. I'm kinda disappointed, but meh, maybe it will get better. Maybe. Sad fact is I bought my motherboard beforehand and ordered the motherboard blocks for it. Send the blocks back and I lose 20% restocking; sell my motherboard and I lose at least $40. As someone living paycheck to paycheck, I think I'm gonna have to stick with AMD...


----------



## Madmanden

Also to the guy who seemed to think 4.7 GHz BD was totally unfair since it's basically capable of 5.5+ GHz - from the TechReport review:

*Yep, 4.4GHz was about it.* Perhaps we were a little timid, but raising the voltage beyond 1.465V on a brand-new, pre-release 32-nm processor felt like asking for trouble to us, especially with just air cooling. [...]

*Worried that we weren't reaching our chip's full potential, we pinged AMD PR on the matter*, who pointed us to a section in the reviewer's guide (a document we shamelessly ignore after extracting any useful info) *that suggests 4.5GHz is a reasonable expectation for FX-8150 overclocking* with an air cooler.

EDIT: Jagged Steel, still waiting for your response to my earlier post about what AMD promised and what was delivered.


----------



## Milestailsprowe

Why are the gaming benchmarks all over the place? In some reviews it's 2-4 frames less than the 2600K.


----------



## lordikon

Quote:



Originally Posted by *el gappo*


SB @ 5.2 and BD @ 4.7!???? Yeah because that's fair...


You're a benchmarks editor, you should know that comparing clock speeds between different architectures is essentially worthless (compare a Pentium 4 at 3GHz to a Sandy Bridge at 3GHz). Besides, there are plenty of benchmarks out there where they matched the clock speeds and Bulldozer still lags behind.
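That clock-for-clock caveat can be sketched with a toy calculation (the IPC values here are made-up round numbers, purely illustrative):

```python
# Performance is roughly IPC * clock, so equal clocks say nothing
# across architectures. The IPC figures below are illustrative only.
def perf(ipc, ghz):
    """Rough throughput proxy: billions of instructions per second."""
    return ipc * ghz

p4 = perf(ipc=0.7, ghz=3.0)  # deep pipeline, low IPC (assumed)
sb = perf(ipc=2.0, ghz=3.0)  # much higher IPC (assumed)
print(round(sb / p4, 1))  # ~2.9x at the same clock
```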


----------



## XxBeNigNxX

Quote:



Originally Posted by *mrcool63*


you have to read the last line i wrote.. People are getting into the habit of taking the worst written reviews like anandtech and quoting them extensively and convieniently ignoring guru3d, techspot and similar sites who have shown BD in a better light.


But that's just it, "who have shown BD in a better light"...

There should be no showing of BD in a "better light"... show it as it is, for what it is, and the answer is the same: Bulldozer is not a good chip. It is not a good alternative to even AMD's previous chips. Bulldozer, from a performance standpoint and even an economic standpoint, is a failure.

For the desktop user BD can be good at encoding/decoding (at the expense of electricity), and even then it's not good enough to justify its horrid performance in everything else except extreme overclocking.

Speaking of extreme overclocking... remember when the AMD Black 940 came out and they did their whole 3DMark 05/06 liquid helium stunt, at least showing that the chip had some grunt behind it? Why did AMD not do the same for Bulldozer, instead just going after the "speed crown", which shows nothing of the power of the CPU? Maybe because they (AMD) knew the CPU was bad?

All they did was hide the fact that Bulldozer was and is a failure, and did what AMD does... build hype.

I support both camps (AMD and Intel), but I do not approve of Bulldozer and do not approve of AMD's tactics... because in the end all it does is hurt consumers that don't know any better.


----------



## TheBlademaster01

Quote:



Originally Posted by *lordikon*


You're a benchmarks editor, you should know that comparing clock speeds between different architectures is essentially worthless (compare a Pentium 4 at 3Ghz to a Sandy Bridge at 3Ghz). Besides, there are plenty of benchmarks out there where they matched the clock speeds and Bulldozer still lags behind.


At 4.8GHz it even loses against a stock 2600K. I mean even Thuban is faster clock for clock.


----------



## el gappo

Quote:



Originally Posted by *lordikon*


You're a benchmarks editor, you should know that comparing clock speeds between different architectures is essentially worthless (compare a Pentium 4 at 3Ghz to a Sandy Bridge at 3Ghz). Besides, there are plenty of benchmarks out there where they matched the clock speeds and Bulldozer still lags behind.


I know the average SB fails to reach 5.2, especially when you take safe temps on air into account. I know the opposite is true for Bulldozer too. So yes, it's an unfair test.

Clock-for-clock comparisons are daft; max stable overclocks that are relevant to users here, however, are not. They should have run stock.

I know it would still lose, but at the current clock speeds this is pointless.

Quote:



Originally Posted by *XxBeNigNxX*


But thats just it "who have shown BD in a better light"..

There should be No showing of BD in a "Better Light".. show it as it is for what it is and the Answer is the same.. Bulldozer is Not a Good Chip..it is Not a good alternative to even AMD's previous chips. Bulldozer from a performance stand point and even a economic stand point is a failure.

For the Desktop user BD can be good at encoding/decoding (at the expense of electricity) and even then it's not good enough to justify it's Horrid performance in everything else except for Extreme overclocking.

After mentioning extreme overclocking... remember when the AMD Black 940 came out and they did their whole 3dmark 05/06 liquid helium stunt and at least showing that it had Some grunt behind the chip? Why did AMD not do the same for Bulldozer and instead just went after the "Speed Crown" which shows nothing of the power of the cpu.. maybe because they(AMD) knew the CPU was bad?


They did a little. http://www.youtube.com/watch?feature...&v=-YzdkkCV_Pg


----------



## hokiealumnus

Quote:



Originally Posted by *mannyfc*


I wonder ln2 vs ln2 results

Quote:



Originally Posted by *WannaBeOCer*


Of course Bulldozer will win because it's chipset is cable of being overclocked unlike Sandy Bridge which has a 57x multiplier. So the max it would reach if the chipset could be pushed to reach 120 it would be 6.8Ghz which is kinda hard to beat but I think Bulldozer can reach 7Ghz with 4 modules.




It depends on what you're measuring. The only real reason to go LN2 is for HWBot benches (IMHO). We already know AMD just sucks at SuperPi (and pifast for that matter). That leaves WPrime, which Bulldozer is also horrible at because it lost FP cores relative to Thuban and _really_ sucks compared to a 2600K. Doesn't matter the max multi. Here are just my personal results:

| Time | Frequency | CPU |
|------|-----------|-----|
| 4sec 890ms | 6523 MHz | FX-8150 (WP32M wouldn't run any faster on my chip; full LN2 pot) |
| 4sec 708ms | 5644 MHz | Phenom II X6 1100T BE (also LN2, full pot) |
| 4sec 515ms | 5407.3 MHz | Core i7 2600K (...and the real kick-in-the-teeth: on *water*) |

These things are _fun_ to play with and do well at certain tasks, but don't buy one to go for HWBot global points.


----------



## t3haxle

I still don't understand why stock prices are up 25% after BD launch. Shouldn't they be going down?


----------



## lordikon

Quote:



Originally Posted by *el gappo*


I know the average SB fails to reach 5.2 especially when you take safe temps on air into account. I know the opposite is true for Bulldozer too. So yes it's an unfair test.


If the overclocks weren't both reached using similar methods, then I agree with you. If they were going to OC them, they should've used similar or identical heatsink/fan combos, outside of a case or in the exact same type of case, and seen how high they could go. As a customer, that's the exact kind of review I care about: if I buy this CPU, what's the best it will do on air?


----------



## BallaTheFeared

I haven't seen anything impressive from Bulldozer, not even at 5GHz+.


----------



## el gappo

Quote:



Originally Posted by *hokiealumnus*


These things are _fun_ to play with and do well at certain tasks, but don't buy one to go for HWBot global points.


CPU-Z and max memory clocks

Quote:



Originally Posted by *lordikon*


If the overclocks weren't both reached using similar methods then I agree with you. If they were going to OC them they should've used similar or identical heatsink/fan combos, outside of a case, or in the exact same type of case, and see how high they could go. As a customer that's the exact kind of review I care about: If I buy this CPU, what's the best it will do on air.


Exactly!

That's kind of what I did with my guide, and I put up a thread taking requests for benchmarks people want to see. I'll be working through them again next week after the LN2 event.

http://www.overclock.net/amd-general...ing-guide.html
http://www.overclock.net/benchmarkin...ant-see-5.html

Quote:



Originally Posted by *BallaTheFeared*


I haven't seen anything impressive from Bulldozer, not even at 5GHz+.


It beat your 2500K at the encoding bench with lower clocks.


----------



## HMBR

Quote:



Originally Posted by *Milestailsprowe*


Why are the gaming benchmarks all over the place? In some reviews its 2-4 frames less then the 2600K


In a way you could argue that it only shows that you don't need a lot of CPU "horsepower" to game, but I have the feeling some people do it deliberately, because some people love to compare CPUs past the point where the CPU is irrelevant (more performance = useless), to make the inferior product look good. But you could always look at more reviews and consider these when you are about to buy your gaming CPU:

http://www.hardware.fr/articles/842-...rma-ii-oa.html

One stands out, cheaper than the FX and faster: the 2500K. So at worst it is as good as the FX; at best, much faster, for less money.


----------



## redalert

Quote:


> Originally Posted by *t3haxle;15295797*
> I still don't understand why stock prices are up 25% after BD launch. Shouldn't they be going down?


where did you see that? I just checked and they went up less than 1% today http://www.google.com/finance?client=ob&q=NYSE:AMD


----------



## XxBeNigNxX

Quote:


> Originally Posted by *Jagged_Steel;15295956*
> AMD has been listed as "Outperform" and "Buy" by ratings services like Raymond James for a few months now. Contrary to what a handful of performance enthusiasts think, WallStreet is well aware that AMDs new product lineup is a winner with the general public, which is where they make their money.


That had to do with AMD's "Fusion" series, namely "Brazos" and the following "Llano" lineup with integrated graphics.

Source - MaximumPC

September 2011 edition - Page 8

Page title - "AMD Back in the game"

PDF DL (Source) - http://dl.maximumpc.com/Archives/MPC_2011_09-web.pdf -Scroll down to page 8 for the article.

--Here's a pic from the article in case members or onlookers do not want to click the link


----------



## ekg84

Quote:


> Originally Posted by *Jagged_Steel;15294829*
> Wrong, there are reviews that say FX is doing just fine. How do you explain some people getting good results, and others getting bad results in the very same tests? Do you think some reviewers got special magic CPUs that work way better than the "regular" ones, or do you think that the ones showing good performance might be the accurate ones and those that do not have run into a BIOS/driver/hardware/MoneyMotivation issue that has skewed their results?


So far, every gaming benchmark you posted was GPU-bottlenecked. The ones from OverclockersClub, for instance, clearly show a GPU bottleneck: they use max settings and a single 6970, so all CPUs land within a couple of FPS of each other:
bad company 2


civilization V


AvP


Same story with the HardwareHeaven review.

Every review that used lowered settings clearly showed Bulldozer lagging behind.

It did, however, show good performance in several non-gaming benchmarks.
Quote:


> Originally Posted by *Jagged_Steel;15294829*
> Please show me where AMD gifted anything to the reviewers in this article. If they had gotten gifts from AMD, it would be listed, and it isn't, therefore your "all the reviewers are supplied with review samples " statement is completely false. Maybe you should stick to the facts and avoid sweeping generalizations based on complete conjecture in the future.


You obviously have no idea what you are talking about, or maybe you're just pretending. Where do you think all these tech websites get most of the hardware they review? They get engineering samples FROM THE MANUFACTURERS, FOR FREE. One more time: whether it's an AMD CPU, an Intel CPU, a new BitFenix case, or a new video card, they receive it from the manufacturer. That is what they thank them for. Or do you think they bought BD samples before they even appeared in stores?

Stop posting nonsense.


----------



## Kasp1js

Do all the retail CPUs come with the Asetek-made water-cooling kits?


----------



## NuclearSlurpee

So does Bulldozer officially suck? I wasn't around yesterday, but I saw some of the stuff from the 11th. Disappointing.


----------



## scyy

Quote:


> Originally Posted by *Jagged_Steel;15295484*
> Quote from the Tweaktown review:
> 
> TT openly admits that they have been paid by Intel.
> 
> Edit to add:
> 
> I disagree. What exactly did AMD promise you that you feel that they did not deliver on?


Sites like that are almost always given the hardware for official reviews by whatever company makes it.


----------



## redalert

Quote:


> Originally Posted by *Kasp1js;15296603*
> Do all the retail cpus come with the Asetek made wc kits?


I think just the 8-core CPUs.


----------



## hokiealumnus

Quote:


> Originally Posted by *Kasp1js;15296603*
> Do all the retail cpus come with the Asetek made wc kits?


No; they come with an air cooler. The Asetek units (which aren't available yet) are expected to cost ~$100.


----------



## linskingdom

Quote:


> Originally Posted by *t3haxle;15295797*
> I still don't understand why stock prices are up 25% after BD launch. Shouldn't they be going down?


25% up? I must have missed something. Just be careful with any stock under $5; there's a reason it stays under $5. In AMD's case, it rebounded on a broad-market oversold condition. Its fundamentals haven't changed at all. It may continue moving higher if the broad market moves higher, but based on the price action it may need to revisit the low-$4 level. I won't touch it unless it holds above $6.20 for a period of time, or unless you are a trader.


----------



## P.J

Another red card


----------



## t3haxle

Whoops about the stock thing- 4.84 yesterday looked like 4.04. My bad.


----------



## mad0314

Quote:


> Originally Posted by *Jagged_Steel;15296796*
> The gaming benchmarks that I posted show the FX performing well in games. I have absolutely no interest in how fast a CPU can spin sitting on the table doing nothing or figuring out Pi to the billionth place. I want to know how it performs in REAL situations like Games, so those are the ones that I find relevant. If you don't like the benchmarks showing FX doing well in games, you are free to not put any credence in them, just as I am free to not care about benches that to ME have no meaning.
> 
> As far as how reviewers get their gear, I am quite aware that these sites get gifted much of this stuff that they review. That is why I personally put more value in benches from people that paid for their gear and are beholden to no one. If you want to believe that a website that depends on gifts and advertising dollars does not skew results to favor those that are buttering their bread, you are quite welcome to do so. I have life experience that tells me the opposite is true, so I will go with what I know. If you honeslty think that I am posting non-sense I highly recommend that you go directly to your user control panel and put me on your ignore list. Problem solved, and you don't have to go around demanding that anybody do as you have ordained.
> 
> Have a Nice Day!


The problem I have is that you are making claims from something that does not support them. You have claimed that it "excels at gaming" and then proceed to show GPU-bottlenecked charts. I agree that it can put out playable FPS in current games, but that is not a good measurement of its absolute performance when the CPUs are not being used close to 100%. You can claim that it will run all your games decently; I have no problem with that. But it does NOT excel at gaming. You cannot say that a Honda Civic is as fast as a Bugatti Veyron when they are only allowed to go the speed limit. Both can achieve that speed easily, but that is not indicative of their performance.

To reiterate: *You CANNOT make a PERFORMANCE claim from a benchmark that does NOT allow PERFORMANCE*
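The bottleneck argument above can be sketched with a toy model (entirely hypothetical numbers, not taken from any review): the frame rate a benchmark reports is roughly the minimum of what the CPU and the GPU can each deliver, so once the GPU is the limit, every CPU "scores" the same.

```python
# Toy model of a GPU-bound benchmark (hypothetical numbers, for
# illustration only): measured FPS is capped by the slower component.

def measured_fps(cpu_fps, gpu_fps):
    """The benchmark reports whichever component is the bottleneck."""
    return min(cpu_fps, gpu_fps)

# Two CPUs with very different headroom...
fast_cpu, slow_cpu = 200, 120

# ...look identical behind a GPU that can only render 60 FPS:
print(measured_fps(fast_cpu, gpu_fps=60))   # 60
print(measured_fps(slow_cpu, gpu_fps=60))   # 60

# Lower the settings (raise the GPU cap) and the CPU gap reappears:
print(measured_fps(fast_cpu, gpu_fps=300))  # 200
print(measured_fps(slow_cpu, gpu_fps=300))  # 120
```

That is why reviews at max settings on a single 6970 show all CPUs within a couple of FPS, while low-settings runs spread them apart.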


----------



## Rookie1337

Quote:


> Originally Posted by *Jagged_Steel;15296943*
> The graphs you posted show the FX-8150 outperforming both the 2500 and 2600 CPUs in BF3, generally considered to be the most demanding title around right now. If the FX outperforms both the 2500 and 2600 in THE PC game yet you say that the FX stinks at Gaming, what CPU are you imagining DOES excell at gaming? You are quite free to ignore all of the benchmarks that show the FX performing well, just as I am free to talk about them and question why some reviews show opposite results in the same tests.
> 
> Thank you for your concern , and Have a Nice Day!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit to add: I still do not see any promises from AMD that they failed to meet with the FX CPUs. You typed a bunch of random figures without any documentation, which is completely worthless.


Dude. I may not agree with the Intel trolls in this thread, but stop feeding them with BF3, which by all appearances is a GPU-limited game. When overclocking gives worse results for all the CPUs, that's kind of odd.

I'm just wondering why the Intel trolls are here, though. Does it hurt you guys to know there are a few areas where BD actually leaves your 2500Ks behind? They're the same areas where a 980X beats your 2500Ks without worry. Come on, guys, grow up. For transcoding, I think a 4.6GHz 2600K matches a 4.6GHz BD.

Again: get what you're going to need and use, and you'll be happy with the results. Well, except for the power consumption with BD.

Seriously this thread needs to end because people are just flaming and trolling now.


----------



## HAVO

Quote:


> Originally Posted by *formula m;15294203*
> If a specific # in a benchmark is all you are after, then know this about yourself and stop using this specific (personal) need, as a want for the populace.
> 
> It is irrational and ignorant.
> 
> "Joe Public" will be served by this BD, as much as SB over the next 2~4 years. But, If you change CPUs more often than this & somehow delude the fact that BD is a bust, because next year something better is coming, then understand you are once again a special human being, with special & personal needs.
> 
> But you should also know about your specialness and taper your post accordingly.
> 
> If you are not special and do not care about the arbitrary, then run any 3 benchmark at once and get back with "Joe Public"... because that is how people abuse their systems. With 20 animated icons, different mouse pointers, Active desktop, minecraft, farmville BOTH running. Plus, the ambiguous Windows Tray full of cute things, etc.
> 
> Real-life benchmarking?
> Benchmark after any 13-year-old girl has had the computer for 6 months, with no fresh install of the OS. Then see which architecture can handle the OS.
> 
> Or, you can get a 6-pack, some Vaseline, etc., run Cinebench all night, and call all of your buddies to partake in how special you are & how special your rig is... _after formatting the C drive an hour before_
> 
> That is about as fake as real-world benching can be (for "Joe Public").


Epic post full of win.. soooo true !!

+1 for you sir


----------



## cl04k3d

Where did all the faildozer pics go. They were hilarious!


----------



## MarvinDessica

Quote:


> Originally Posted by *NirXY;15297530*
> 
> 1. *AMD has no money for R&D as Intel has* - AMD is just too cheap to pour money into R&D. You can't make money without spending it.
> 2. *Intel hurt AMD financially in the past with their shady business* - AMD settled for $1.5B, so why can't you settle? Not to mention AMD made sure plenty of folks upgraded their motherboards before they showed any benchmarks, with only a lot of empty promises. Not shady enough? Also remember they were comparing against a generation-old chip this entire time; it's as if Sandy Bridge didn't exist for them at all. That should have been a day-one indicator to begin with.
> 3. *I buy AMD because they offer top perf/$* - not since the 2500K, and especially not now, yet you still repeat the same old phrase for some reason.
> 4. *I only care about real-life performance, I don't care about synthetics* - there were plenty of real-life benchmarks available in the last 24h, yet somehow you skip those. The only ones that actually show BD ahead are sites like Hardware Heaven, which has shown on many occasions that it bends toward AMD, like placing a 6970 ahead of Nvidia.
> 5. *AMD did not promise anything they didn't deliver* - besides the fact that they actually did promise improved performance and IPC, don't you find it a bit irrelevant? We buy CPUs, not promises.
> 6. *BD is aimed at workstations, not desktops* - not with that power usage, and it's not what AMD said either.
> 7. *BD isn't being utilized well by software* - we as consumers shouldn't care; if we can't use the CPU now, why sell it? Go ahead and ask MS to apologize for W7, see how well that goes.
> 
> .


I had to add a few things, but excellent post.


----------



## GTR Mclaren

What a sad thread this is:

AMD fanboys fighting desperately to find a reason to praise Bulldozer,

and Intel fanboys on the verge of an orgasm every time they see a benchmark or news item showing Bulldozer's failure.

Seriously... and I left GameTrailers and joined this forum for the "mature" users.


----------



## mad0314

Quote:


> Originally Posted by *GTR Mclaren;15298240*
> sad thread is this
> 
> amd fanboys fighting desperately to find a reason to praise Bulldozer
> 
> and Intel fanboys in the verge of an orgasm every time they see a Benchmark or news showing the failure of Bulldozer
> 
> seriously.....and I leave Gametrailers and join this forum for the "mature" users


Actually, very few people in the discussion have demonstrated that. Most of the "fanboys" leave a comment or two and exit the thread; those actually discussing things post valid points, whether they prefer and/or currently own Intel or AMD machines. Of those making valid arguments, I would guess that very few, if any, were rooting for BD to fail. It is because BD is widely seen as a failure that the failure keeps being brought up, and many people are looking for the source, if any, of its lackluster performance.


----------



## Majin SSJ Eric

I love my 2600k more and more every day. Even after the horror that was the Cougar Point fiasco, this little baby has been at the top of the performance charts since January and, judging by BD's lackluster performance, should remain there until Intel decides to one-up itself with IB and SB-E...


----------



## Blast

Sooo... we can pretty much agree BD croaked.
Really sad, considering BD had the potential to drop Intel prices.

BD did do well in the media area, but it's pretty lame when it comes to processing capability. On Anandtech, my Phenom 955 seems to be pretty close to the FX-8150. Sad.


----------



## MaDeOfMoNeY

AMD faltering is not good for anyone, to be honest; it hurts the market as a whole. Releasing products before they are completely done also seems to be the norm these days, because the whole world pressures companies to put their products out RIGHT NOW due to a lack of patience.


----------



## pengs

*[DG's Nerdy Story]*

_Translated by Google_
Quote:


> "½ module = 1 core, but 1 core ≠ ½ module"
> 
> In other words: half a module acts as one core, but one core is not simply half a module.
> 
> In this part of the review, one core in each Bulldozer "module" is selectively turned off, so "1 module as 2 cores" performance can be compared against "1 module as 1 core" performance, where the remaining core runs without contention for the module's shared resources.
> This shows how the modular design compares with a traditional core design.


Found
Source
By disabling cores 2, 4, 6 and 8 (as seen by the BIOS; 1, 3, 5, 7 logical), letting each remaining core fully use its module's resources and giving it more cache, this reviewer is seeing up to around 20% higher IPC than when running 4M/8C.
I thought it was interesting to see how sharing resources within a module depletes Bulldozer's IPC.

So... disable 4 cores for a few years until software catches up?

I would also like to see whether disabling half the cores lightens the power load at all.
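For anyone who wants to approximate the reviewer's one-core-per-module setup without touching the BIOS, a process can be pinned from software instead. A minimal sketch, assuming a 4-module/8-core FX chip whose logical CPUs pair up into modules as (0,1), (2,3), (4,5), (6,7) (the real mapping may differ; `os.sched_setaffinity` is Linux-only, and the helper name is mine):

```python
import os

def one_core_per_module(total_cores=8, cores_per_module=2):
    """Logical CPU ids taking only the first core of each module,
    e.g. {0, 2, 4, 6} on a hypothetical 4-module/8-core FX-8150."""
    return set(range(0, total_cores, cores_per_module))

mask = one_core_per_module()
print(sorted(mask))  # [0, 2, 4, 6]

# Apply to the current process (Linux-only; needs at least 7 logical
# CPUs present, so it is left commented out here):
# os.sched_setaffinity(0, mask)
```

Unlike disabling cores in the BIOS, this only keeps one benchmark's threads out of each module's second core; the rest of the system can still schedule onto them.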


----------



## Razi3l

Hmm.. Interesting..


----------



## JY

That's weird :s


----------



## knoxy_14

A 3.19 in Cinebench for 4 cores still isn't as good as Phenom II.


----------



## HK_47

It's not weird; it makes perfect sense...
I wonder how high it can OC on air with only 4 cores enabled.


----------



## subliminally incorrect

cool nerdy story bro


----------



## Snowmen

Stupid architecture if the only way to get decent performance out of an 8-core chip is to disable half the cores...


----------



## robbo2

This makes me

harder than when people buy an i7 and turn HT off.


----------



## jackeyjoe

Guys, keep it clean or I will close this thread.


----------



## Airolden

I thought AMD said one of their cores had 90% of the performance of a true core? So that means it worked out just as they planned, right?


----------



## mad0314

This is the idea behind Bulldozer.

Another thing that bugs me is the way AMD is handling it. There are issues inherent to its design that make it perform worse now than it might once those issues are fixed. Someone else mentioned the Cougar Point chipset issue; the thing is, Intel acknowledged it and was quick to fix it. AMD is lying to everyone's faces with their marketing. If they know there is an issue, they should come out and say so instead of trying to make BD seem like the fastest chip in the world.


----------



## linkin93

Or maybe software doesn't fully support it yet?


----------



## Armand Hammer

Quote:


> Originally Posted by *Snowmen;15299077*
> Stupid architecture if the only way to get decent performance out of an 8-core chip is to disable half the cores...


Yep this just keeps getting worse and worse for AMD


----------



## G3RG

Quote:


> Originally Posted by *robbo2;15299083*
> This makes me
> 
> 
> 
> 
> 
> 
> 
> harder than when people buy an i7 and turn HT off.


huh?


----------



## DesertRat

I refuse to believe that AMD didn't catch and know about this. I bet a release of a special series with only one core active per module would see performance increases over the Phenom II quad-cores.

From what I was reading in another thread, Bulldozer wasn't "designed" so much as computed: automated tools mapped out its architecture, with no human ingenuity. That would explain a lot.


----------



## Mad Pistol

Quote:


> Originally Posted by *linkin93;15299128*
> Or maybe software doesn't fully support it yet?


*THIS! THIS! THIS! THIS! A THOUSAND TIMES THIS!!!*

People don't seem to get that BD was designed with a certain software usage in mind, which is why the design is so radically different. If 2 cores are competing for L2 cache, of course it's going to drop efficiency (and IPC). However, if the program is designed to utilize this design, I bet IPC rises dramatically, even when using all 8 cores.

Windows 7 and programs don't know how to utilize this design. That's the problem.

I almost want to bet that AMD cuts the L2 cache into 8 sections (1 for each integer core) for the next iteration of Bulldozer chips, and puts in an interconnect between the sets of L2 cache in each module. I bet that increases IPC a bit.


----------



## Homeles

Still not as good as a 2500k.


----------



## Kand

I thought more cores = better.


----------



## linkin93

Quote:


> Originally Posted by *Homeles;15299194*
> Still not as good as a 2500k.


Quote:


> Originally Posted by *Kand;15299212*
> I thought More cores = better.


----------



## lordikon

Quote:


> Originally Posted by *Mad Pistol;15299189*
> *THIS! THIS! THIS! THIS! A THOUSAND TIMES THIS!!!*
> 
> People don't seem to get that BD was designed with a certain software usage in mind, which is why the design is so radically different. If 2 cores are competing for L2 cache, of course it's going to drop efficiency (and IPC). However, if the program is designed to utilize this design, I bet IPC rises dramatically, even when using all 8 cores.
> 
> Windows 7 and programs don't know how to utilize this design. That's the problem.


So why release a CPU that isn't properly supported? If AMD understood this, they should have been working with software developers to solve it before release.


----------



## Mad Pistol

Quote:


> Originally Posted by *lordikon;15299224*
> So why release a CPU that isn't properly supported?


Well, if you spent 5+ years developing a technology and needed to get some money back on the project, you'd probably release it too.









AMD knows where they screwed up. It's guaranteed that they already have a design to address the current flaws of BD.


----------



## robbo2

Quote:


> Originally Posted by *Mad Pistol;15299189*
> *THIS! THIS! THIS! THIS! A THOUSAND TIMES THIS!!!*
> 
> People don't seem to get that BD was designed with a certain software usage in mind, which is why the design is so radically different. If 2 cores are competing for L2 cache, of course it's going to drop efficiency (and IPC). However, if the program is designed to utilize this design, I bet IPC rises dramatically, even when using all 8 cores.
> 
> Windows 7 and programs don't know how to utilize this design. That's the problem.
> 
> I almost want to bet that AMD cuts the L2 cache into 8 sections (1 for each integer core) for the next iteration of Bulldozer chips, and puts in an interconnect between the sets of L2 cache in each module. I bet that increases IPC a bit.


I'm sorry for wanting a CPU that performs now rather than waiting for software developers to catch up.


----------



## MR KROGOTH

Why doesn't everybody just shut up and get off the subject?

Arguing on the internet is about as fruitless as swinging a hammer at concrete.


----------



## lordikon

Quote:


> Originally Posted by *Mad Pistol;15299247*
> Well, if you spent 5+ years developing a technology and needed to get some money back on the project, you'd probably release it too.


This release may harm them more than help them. At least in the enthusiast market, I can see plenty of AMD users ready to upgrade to Intel now instead of AMD because of this.


----------



## linkin93

Quote:


> Originally Posted by *lordikon;15299224*
> So why release a CPU that isn't properly supported?


Maybe all it needs is, you know, a software patch, perhaps something called a _driver?_

At the moment, using the FX CPUs as they are right now, without a proper driver or BIOS or software patch or whatever, could be like playing games without your GPU driver installed:

Sure it works, but what about optimisations?

Hence, people of OCN, you make me


----------



## Mad Pistol

Quote:


> Originally Posted by *robbo2;15299258*
> I'm sorry for wanting a CPU that performs now rather than waiting for software developers to catch up.


AMD bet on the future and lost. Live and learn.


----------



## Blameless

Quote:


> Originally Posted by *HK_47;15299014*
> its not weird it makes perfect sense...


Yeah, I don't see how this isn't a given.

Anything sharing a module has to share part of the L1 cache, the L2 cache, an FPU, and a few other resources.

In most cases, spreading threads out to different modules will improve performance, while in a few cases where threads are tightly linked but don't compete for FPU resources, putting them on the same module would be better.

For these reasons, it's likely that improved OS scheduling (in Windows 8, and a patch for Windows 7) will give BD a modest improvement in performance and efficiency.
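The "spread threads out first" scheduling policy described above can be sketched as a toy placement rule (an illustration only, not how any real OS scheduler is implemented; the function name and the module layout are assumptions): give each module one thread before any module gets a second.

```python
def assign_threads(n_threads, n_modules=4, cores_per_module=2):
    """Breadth-first placement: occupy one core in every module before
    pairing threads within a module. Returns (module, core) slots."""
    slots = []
    for core in range(cores_per_module):
        for module in range(n_modules):
            slots.append((module, core))
    return slots[:n_threads]

# Four threads land on four different modules, no sharing:
print(assign_threads(4))  # [(0, 0), (1, 0), (2, 0), (3, 0)]

# Only the 5th and later threads start sharing a module's resources:
print(assign_threads(6))
```

A naive scheduler that fills cores in numeric order would instead pack two threads into module 0 before module 1 saw any work, which is exactly the contention case the post describes.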


----------



## djriful

Hey guys,

Is AM3+ limited to 8 cores? I was reading the roadmap, which goes up to 16-core module configurations and even 48+ cores in 2014. Will all of those remain on AM3+?

Side note: that roadmap is killing them... it's too slow in terms of performance increase if it comes just from adding more cores.


----------



## mad0314

Quote:


> Originally Posted by *MR KROGOTH;15299272*
> Why doesnt everybody just shut up and get off the subject?
> 
> Arguing on the internet is about as fruitless as swinging a hammer at concrete.


You know this is a thread about Bulldozer reviews, and thus about its performance, right?


----------



## Derp

Quote:


> Originally Posted by *knoxy_14;15299012*
> 3.19 cinebench for 4 core still isnt as good as phenom II


This. Yes, it improved, but it's still slower than their previous CPU, which shows you just how slow BD is at launch. Nothing to be impressed by, guys; calm down. BD is still awful.


----------



## coupe

Quote:


> Originally Posted by *lordikon;15299224*
> So why release a CPU that isn't properly supported? If AMD understood these they should have been working with software developers to solve this before they released.


It's called innovation, and AMD has the guts to do it! Intel steals their design cues and masters them.


----------



## MR KROGOTH

Quote:


> Originally Posted by *mad0314;15299350*
> You know this is a thread about Bulldozer reviews, and thus about its performance, right?


More than you think.

It is NOT, however, a place to bash people who anticipated BD and its performance, such as it is, nor is it a license for those with no interest in discussing it to nit-pick at details.


----------



## jackeyjoe

Quote:


> Originally Posted by *mad0314;15299350*
> You know this is a thread about Bulldozer reviews, and thus about its performance, right?


It doesn't mean you need to outright flame each other over it... sensible discussions please


----------



## Blameless

The Anandtech review commented on the problems with current schedulers in some detail:

http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested/11
Quote:


> Originally Posted by *lordikon;15299224*
> So why release a CPU that isn't properly supported? If AMD understood these they should have been working with software developers to solve this before they released.


They do understand this and have been working with OS developers. Generally, the hardware comes first and the software comes later.

Think of how long it took for each incarnation of SSE to become widespread (even with the vastly larger Intel pushing for it), or for x86-64 to be supported.


----------



## AMD2600

Quote:


> Originally Posted by *lordikon;15299224*
> So why release a CPU that isn't properly supported? If AMD understood these they should have been working with software developers to solve this before they released.


CPU comes first, then software.


----------



## linkin93

Quote:


> Originally Posted by *AMD2600;15299398*
> CPU comes first, then software.


Exactly. It doesn't work the other way around.


----------



## Majin SSJ Eric

I haven't seen any flaming. What did I miss?


----------



## Mad Pistol

Quote:


> Originally Posted by *AMD2600;15299398*
> CPU comes first, then software.


Just like you get new GPUs with new APIs before the games using those APIs are released.

DX11 GPUs have been out for 2 years, and just now, Battlefield 3 is the first game designed from the ground up to utilize DX11. Hardware always comes before software.


----------



## BizzareRide

Quote:


> Originally Posted by *Mad Pistol;15299189*
> *THIS! THIS! THIS! THIS! A THOUSAND TIMES THIS!!!*
> 
> People don't seem to get that BD was designed with a certain software usage in mind, which is why the design is so radically different. If 2 cores are competing for L2 cache, of course it's going to drop efficiency (and IPC). However, if the program is designed to utilize this design, I bet IPC rises dramatically, even when using all 8 cores.
> 
> Windows 7 and programs don't know how to utilize this design. That's the problem.
> 
> I almost want to bet that AMD cuts the L2 cache into 8 sections (1 for each integer core) for the next iteration of Bulldozer chips, and puts in an interconnect between the sets of L2 cache in each module. I bet that increases IPC a bit.


This must be the relief you were looking for huh?


----------



## GuilT1

People are defending AMD because hardware comes first, then software? That's grasping at straws. With all the time that BD has been in development, you don't think there should be software to support it by launch time?


----------



## Mad Pistol

Quote:


> Originally Posted by *BizzareRide;15299440*
> This must be the relief you were looking for huh?


I wouldn't call it relief. It just does a good job at explaining why BD doesn't perform well.

I'm still not buying BD. There's no reason to.


----------



## jackeyjoe

Quote:


> Originally Posted by *Majin SSJ Eric;15299426*
> I haven't seen any flaming. What did I miss?


Around 80 deleted posts


----------



## Blameless

Quote:


> Originally Posted by *robbo2;15299083*
> This makes me
> 
> 
> 
> 
> 
> 
> 
> harder than when people buy an i7 and turn HT off.


Hyper-Threading went through its share of adoption issues as well.

Fortunately for Intel, HT has been largely understood by OSes since late 2002, when Intel implemented it in its Xeons and Pentium 4s. By the time the i7s hit the scene in late 2008/early 2009, every major OS had already had good HT support for years.

BD had no such predecessor to pave the way for the peculiarities of the module approach.

Obviously, you shouldn't buy an 8-core Bulldozer just to disable half the cores; that would be stupid and wasteful. However, manually setting core affinities to improve lightly threaded performance, at least until OSes are patched, may net some people worthwhile gains.

In the end, these updates and considerations aren't likely to completely turn BD's performance around; anyone expecting that is fooling themselves. But I would not be surprised if, in 6-12 months, BD were ~10% faster overall with no changes to the hardware.


----------



## Homeles

Quote:


> Originally Posted by *linkin93;15299274*
> Maybe all it needs is, you know, a software patch, perhaps something called a _driver?_
> 
> At the moment, using the FX CPU's as they are right now, without a proper driver or BIOS or software patch or whatever, it could be like playing games without your GPU driver installed:
> 
> Sure it works, but what about optimisations?
> 
> Hence, people of OCN, you make me


You make me









Stop making excuses. You're not an AMD engineer or anything. And Windows 8, which has an optimized scheduler? Still not equal to a 2500k.

Even if it was equal performance wise, the 2500k still has better power efficiency.
Quote:


> Originally Posted by *linkin93;15299411*
> Exactly. It doesn't work the other way


And by the time the software comes, both Bulldozer and Sandy Bridge will be irrelevant.


----------



## PvtHudson

Quote:


> Originally Posted by *Mad Pistol;15299429*
> Just like you get new GPUs with new APIs before the games using those APIs are released.
> 
> DX11 GPUs have been out for 2 years, and just now, Battlefield 3 is the first game designed from the ground up to utilize DX11. Hardware always comes before software.


Uh... no.

The difference here is the DX11 GPUs were actually faster. Even though DX11 technology wasn't utilized, the Radeon 5xxx series was simply faster in every way possible when using DX9 and DX 10 games.

BD is slower in everything other than the most threaded applications.

It's not the same thing. What AMD have created is a useless CPU. By the time the software comes around, there is no way in hell you're still going to be using a Bulldozer.


----------



## Sickened1

Quote:


> Originally Posted by *Homeles;15299469*
> You make me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stop making excuses. You're not an AMD engineer or anything. And Windows 8, which has an optimized scheduler? Still not equal to a 2500k.
> 
> Even if it was equal performance wise, the 2500k still has better power efficiency.
> 
> And by the time the software comes, both Bulldozer and Sandy Bridge will be irrelevant.


Since when did OCN become such a HUGE fan of power efficiency? Oh yeah, yesterday, when people needed something else to help them bash. Find a meaningful horn to blow.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Mad Pistol;15299247*
> Well, if you spent 5+ years developing a technology and needed to get some money back on the project, you'd probably release it too.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AMD knows where they screwed up. It's guaranteed that they already have a design to address the current flaws of BD.


So you're saying that AMD started designing hardware 5 years ago that was meant for software that won't be around for another few years??? That's a good strategy.

What my question has been, ever since this mythical "future software will fix BD" argument came out, is why on earth would you buy BD over an Intel processor that performs like a boss on software CURRENTLY available? Do I buy a CPU that performs great now, or one that I have to wait for software to fix?


----------



## Homeles

Quote:


> Originally Posted by *Sickened1;15299512*
> Since when did OCN become such a HUGE fan of power efficiency? Oh ya, yesterday when people needed something else to help them bash. Find a meaningful horn too blow.


What point are you trying to make? If a processor draws more power than another, and both yield equal performance, why the heck would anyone buy the one that draws more power?


----------



## mothergoose729

The problem with Bulldozer is that it is simply not as well designed as Sandy Bridge, period. AMD had the same die space to work with and the same instruction set and extensions, and Intel designed a better processor more than a year ago. Complex design decisions aside, Intel has better-funded R&D, more engineering talent on its projects, and more resources for its engineers than AMD does, and that is not likely to ever change. Bulldozer is the best AMD could come up with, and they gambled on making it fit the software of the future because they can't compete on the software of today. The problem is that, 5 years after the original planning of the design, 8 cores are not much more relevant in the consumer space (and even, to a large extent, in the server space) than they were 5 years ago. I don't think AMD can recover from this. They have really been dead in the water since Phenom I.


----------



## Mad Pistol

Quote:


> Originally Posted by *PvtHudson;15299483*
> Uh... no.
> 
> The difference here is the DX11 GPUs were actually faster. Even though DX11 technology wasn't utilized, the Radeon 5xxx series was simply faster in every way possible when using DX9 and DX 10 games.
> 
> BD is slower in everything other than the most threaded applications.
> 
> It's not the same thing. What AMD have created is a useless CPU. By the time the software comes around, there is no way in hell you're still going to be using a Bulldozer.


I was using that as an example of hardware coming before software, not performance increases from generation to generation. You took my statement completely out of context.


----------



## Blameless

Quote:


> Originally Posted by *Homeles;15299469*
> And by the time the software comes, both Bulldozer and Sandy Bridge will be irrelevant.


Not for the people who still have them.

I may buy four or five CPUs a year, but I am probably not a typical user. Most people run their chips till they are too slow to be tolerable.


----------



## FenrirXIII

Unfortunately for AMD marketing, I'm happy spending money on a new RAM kit and heatsink to get a higher stable OC on my 1055 until BDv2.0 or w/e comes out. I will support them over Intel just for competition's sake, but they still gotta earn it. :O


----------



## Sin0822

Anyone read that Hardware Heaven review? Bassplayer pointed it out earlier. They really need to give every product a 9/10 or higher, huh? lol


----------



## 8ight

But... it still sucks...
Quote:


> Originally Posted by *FenrirXIII;15299581*
> Unfortunately for AMD marketing, I'm happy spending money on a new RAM kit and heatsink to get a higher stable OC on my 1055 until *BDv2.0 or w/e* comes out. I will support them over Intel just for competition's sake, but they still gotta earn it. :O


Piledriver.


----------



## RagingCain

Quote:


> Originally Posted by *Sickened1;15299512*
> Since when did OCN become such a HUGE fan of power efficiency? Oh yeah, yesterday, when people needed something else to help them bash. Find a meaningful horn to blow.


A system idling at 215 watts gets pushed to around 530 watts just by running Prime95 (no GPU usage). Now add two video cards and game on it.

Enjoy the safety shutdown on that 800 watt PSU, which is enough for a 2600K and probably 2x 580s.


----------



## HWI

BD is still crap, no matter how you slice it. The only thing it has going for it is its low cost.


----------



## Mad Pistol

Quote:


> Originally Posted by *Majin SSJ Eric;15299539*
> So you're saying that AMD started designing hardware 5 years ago that was meant for software that won't be around for another few years? That's a good strategy.
> 
> My question, ever since this mythical "future software will fix BD" argument came out, has been: why on earth would you buy BD over an Intel processor that performs like a boss on software CURRENTLY available? Do I buy a CPU that performs great now, or one that I have to wait for software to fix?


When dual cores first came to the consumer market, there weren't many applications that could utilize them. The only thing that kept them afloat for the first couple of years was that they were based on an already existing (and successful) single-core architecture, and that architecture (Athlon 64 to Athlon 64 X2) was already very fast in single-core form. Putting 2 on the same chip was a no-brainer.

AMD went from the traditional 1c/1t model to a 2-cores-per-module design with shared L2 cache in Bulldozer. They took a risk, and lost. They probably should have tested this sort of design in a more discreet way instead of releasing it to the masses and then trying to cover it over with smoke and hype.


----------



## Xyxox

Quote:


> Originally Posted by *jackeyjoe;15299462*
> Around 80 deleted posts


That actually explains a lot about how this thread was reading until I finally gave up and just skipped to the end.


----------



## jivenjune

I'm generally a fairly rational person, but the immense amount of delusional insanity that has run rampant is still enough to just make me facepalm.

Everyone keeps talking about these software patches, utilization updates and so forth, but you know what? I haven't seen anything that confirms any of this yet, and the way AMD has marketed this whole Bulldozer thing leads me to believe that I shouldn't believe in its existence until it's firmly out in the public--just like every damn person tried to argue that we shouldn't believe in any early "preview" or "benchmark."

Even if I wanted to buy into any speculative rumors, in-depth searching on this subject indicates that the performance increase from any of these so-called "updates" or "patches" is at best a slight tweak--nothing groundbreaking.

Look, I'm pretty damn sure there would have been a ton of happy people if AMD had merely been able to close the gap between its current Phenom processors and Intel's Sandy Bridge, but you know what? Multiple scenarios indicate that even the architecture the Phenom line is based on is often superior, and I think this is what really jabs at the AMD supporters.

It took so damn long for AMD to release a CPU that was only better than an i5 2500k in a tiny fraction of minute scenarios, and often bested by AMD's own 3-4 year old Phenom architecture.

What.... WHAT? How can people not feel at least a little upset by that?

On one final note... even if I wanted to support AMD (which I have done by currently owning a Phenom II X2 system for light usage), I wouldn't do it by buying any of these new processors. Why? They're significantly overpriced compared to the current Phenom line, and the current Phenom line offers more balanced overall performance than any of these FX processors.

I see absolutely no reason to purchase an FX processor over the Phenom line, since a Phenom will offer a better overall balance of performance instead of only being adept in isolated scenarios which seem almost non-existent.


----------



## RagingCain

Quote:


> Originally Posted by *Sin0822;15299596*
> Anyone read that Hardware Heaven review? Bassplayer pointed it out earlier. They really need to give every product a 9/10 or higher, huh? lol


A discerning eye will see that site is very much in favor of AMD, but it's subtle. Reminds me of the 580 vs. 6970 benchmarking.


----------



## Blameless

Quote:


> Originally Posted by *DesertRat;15299157*
> I will not believe that AMD didn't catch, and know about this.


Of course they knew about this.
Quote:


> Originally Posted by *DesertRat;15299157*
> From what I was reading in another thread, I heard that Bulldozer wasn't "designed", it was computed. Automated programs mapped out its architecture. No human ingenuity. That would explain a lot.


There is a lot of automation that goes into modern processor design, for both AMD and Intel. You simply cannot rely on a team of humans to efficiently arrange hundreds of millions of transistors and their interconnects.

Humans do the basic design, automation does most of the grunt work, then the humans go through and hand tweak things.


----------



## PvtHudson

Quote:


> Originally Posted by *Mad Pistol;15299577*
> 
> I was using that as an example of hardware coming before software, not performance increases from generation to generation. You took my statement completely out of context.


I didn't take it out of context. I understood what you meant.

The problem, however, is that when new hardware does come out, even if software doesn't currently exist to make the fullest use of it, it still performs better in existing applications compared to previous hardware.

The Athlon 64 was the first 64-bit processor for consumers. Hell, even today most applications are still 32-bit, yet the Athlon 64 destroyed the Pentium 4 and dominated the market when just about everyone was using 32-bit Windows XP.

That is why your argument is a poor one unless destroying backwards compatibility was a goal for Bulldozer.


----------



## Blameless

Quote:


> Originally Posted by *RagingCain;15299636*
> A system idling at 215 watts gets pushed to around 530 watts just by running Prime95 (no GPU usage). Now add two video cards and game on it.


Remember that these are at-the-wall figures, which will be significantly (~15%) higher than what is actually drawn from the PSU.

BD's load power consumption is poor, especially when overclocked, but it's not as dire as some reviews make it out to be.
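To put rough numbers on that, here's a minimal sketch of the conversion (the 87% efficiency figure is my assumption, in line with the ~15% overhead mentioned above and typical of 80 Plus units at load; the helper name is made up for illustration):

```python
def dc_load_from_wall(wall_watts, psu_efficiency=0.87):
    """Estimate the DC load on the PSU from an at-the-wall reading.

    Reviews measure AC draw at the outlet; conversion losses mean the
    components actually pull less. Efficiency here is an assumed,
    typical figure, not a measured one.
    """
    return wall_watts * psu_efficiency

# The 530 W at-the-wall Prime95 figure quoted above works out to roughly
# 460 W of real DC load on an ~87%-efficient unit:
print(round(dc_load_from_wall(530)))  # 461
```

So an "800 W" PSU has more headroom against these review numbers than the raw wall readings suggest.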


----------



## gsa700

Quote:


> Originally Posted by *MR KROGOTH;15299272*
> Why doesnt everybody just shut up and get off the subject?
> 
> Arguing on the internet is about as fruitless as swinging a hammer at concrete.


Uhmmmm, you know hammers break concrete, right?


----------



## Maelthras

Quote:


> Originally Posted by *Dman;15299472*
> Again, who cares? Why release something that is essentially useless today and virtually no better than your previous product? It's like bringing a car optimized for a pavement track to an off-road race and then saying "Well, it would've done well on pavement" when you clearly knew it was an off-road race. There was no point in releasing this product at this time, and trying to justify it with the excuse "It doesn't work with anything that is out there today" is just dumb.


Does no one remember how crazy it was when they put TWO CORES on one CPU? How long did it take for that little bit of hardware change to be worked into programming? OH RIGHT, the majority of programs still don't take advantage of multi-core CPUs.

Mad beat me to it, but this is still a valid response.


----------



## Mad Pistol

Quote:


> Originally Posted by *PvtHudson;15299712*
> I didn't take it out of context. I understood what you meant.
> 
> The problem, however, is that when new hardware does come out, even if software doesn't currently exist to make the fullest use of it, it still performs better in existing applications compared to previous hardware.
> 
> The Athlon 64 was the first 64-bit processor for consumers. Hell, even today most applications are still 32-bit, yet the Athlon 64 destroyed the Pentium 4 and dominated the market when just about everyone was using 32-bit Windows XP.
> 
> That is why your argument is a poor one unless destroying backwards compatibility was a goal for Bulldozer.


I wasn't trying to cover for AMD. I was using an example to help quell some speculation.

AMD did it right with the Athlon 64. They tweaked an already existing architecture, moved the memory controller onto the chip, added 64-bit extensions, and released it. With the X2, they added a 2nd Athlon 64 core. AMD made sure that the Athlon 64 was still faster than the Athlon XP clock-for-clock, and as a bonus, they beat Intel as well. They also made the first consumer 64-bit CPU. The fact that it performed well in 32-bit apps is what really sold it, though.

As I said in an earlier post, AMD bet that software would be extremely threaded at this point in history, and in that regard, BD would probably be faster. The problem is that the software isn't there yet. They built a CPU based on an architecture that 99% of programs out there have no clue how to utilize. Thus, performance sucks.


----------



## tw33k

Quote:


> Originally Posted by *Sin0822;15299596*
> Anyone read that Hardware Heaven review? Bassplayer pointed it out earlier. They really need to give every product a 9/10 or higher, huh? lol


Yeah...I read that and laughed as well


----------



## PoopaScoopa

tl;dr summary:

All that for twice the power consumption compared to a $60 cheaper 2500K.


----------



## NickSim86

I really wish everyone would stop saying that "bulldozer" is a failure when they should be saying "zambezi" or "FX" is a failure.

As someone who sells x86 servers and VMware licensing, I can tell you that Bulldozer looks to be a very good server architecture for running virtual machines. A 16-core CPU is going to save a customer a lot of money on VMware licenses, as they license per CPU, not per core.

On a side note, I am looking forward to seeing more reviews of the FX-6100 and FX-4100 CPUs. These look much better price/performance-wise for client computing. I suspect we will see similar performance between the FX-8150 and FX-6100 in most benchmarks, especially when overclocked. I also expect thermals to be much lower on the 4- and 6-core parts. These chips could be FX's saving grace.
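The per-socket licensing point above is easy to quantify with a toy sketch (the $3000-per-socket price and the function name are made-up illustrations, not real VMware list prices):

```python
def license_cost(total_cores, cores_per_socket, price_per_socket):
    """Licensing cost when the vendor charges per physical CPU (socket),
    not per core, as described above."""
    sockets_needed = -(-total_cores // cores_per_socket)  # ceiling division
    return sockets_needed * price_per_socket

# 32 cores of capacity at a hypothetical $3000 per socket:
print(license_cost(32, 8, 3000))   # 8-core chips need 4 sockets -> 12000
print(license_cost(32, 16, 3000))  # 16-core Interlagos needs 2 -> 6000
```

Halving the socket count halves the per-socket licensing bill for the same core count, which is the whole appeal on the server side.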


----------



## ilhe4e12345

so wait....disabling 4 of the cores and BD is actually pretty decent...? it seems pretty good from this but maybe im wrong...?


----------



## Homeles

Quote:


> Originally Posted by *Blameless;15299576*
> Not for the people who still have them.
> 
> I may buy four or five CPUs a year, but I am probably not a typical user. Most people run their chips till they are too slow to be tolerable.


Now that's a legitimate argument... not one of the straws people are grasping at around here. If you're not going to upgrade for another 4 years (which doesn't describe most of the people around here), and there were some real promise of Bulldozer becoming a faster processor than what Intel currently offers, then buying Bulldozer could be justifiable. Although you'd probably save enough on energy to buy a new processor by the time that software support came around.

Unfortunately for AMD, the applications I use would benefit most from a Sandy Bridge processor. By the time I have money for an upgrade, it'll be after Ivy Bridge is out. This Phenom's really starting to wear on me.


----------



## Mad Pistol

Quote:


> Originally Posted by *ilhe4e12345;15299866*
> so wait....disabling 4 of the cores and BD is actually pretty decent...? it seems pretty good from this but maybe im wrong...?


You have to disable 4 of the integer cores, one in each of the 4 modules, for this to work. If you disable 2 cores in the same module, it doesn't work.

It basically takes an 8c/4m CPU and turns it into a 4c/4m CPU. That's when you see the IPC increase. At that level, IPC is pretty comparable to (if not a little better than) a Phenom II core.
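For anyone who wants to experiment without touching the BIOS, the same one-core-per-module layout can be approximated with CPU affinity. A minimal sketch, assuming the OS enumerates each module's two integer cores adjacently (0-1 in module 0, 2-3 in module 1, and so on); the helper name is made up for illustration:

```python
def one_core_per_module(total_cores=8, cores_per_module=2):
    """Pick the first integer core of each Bulldozer module, assuming
    a module's cores get adjacent logical IDs (0-1, 2-3, 4-5, 6-7)."""
    return [c for c in range(total_cores) if c % cores_per_module == 0]

# An FX-8150 (4 modules x 2 cores) pinned this way runs one thread per
# module, so nothing contends for a module's shared front end or L2:
print(one_core_per_module())  # [0, 2, 4, 6]

# On Linux, the current process could then be pinned with:
#   import os
#   os.sched_setaffinity(0, set(one_core_per_module()))
```

Whether this recovers the full gain of disabling cores in firmware is an open question; affinity only controls scheduling, not the power/turbo behavior of the idle cores.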


----------



## L36

This still does not change the fact that I'm very disappointed in how they treated the consumer segment with Bulldozer. It might be nice for the server space, but performance-wise it's just insulting to consumers like us.


----------



## Riou

Quote:


> Originally Posted by *Majin SSJ Eric;15299539*
> So you're saying that AMD started designing hardware 5 years ago that was meant for software that won't be around for another few years??? Thats a good strategy.


Intel did that with Skulltrail.

Anyway, just think back to Windows 98/ME, which did not properly support Intel Hyper-Threading. If the OS can properly use BD's architecture so it does not thrash, it can perform a bit better.


----------



## Dmac73

Quote:


> Originally Posted by *ilhe4e12345;15299866*
> so wait....disabling 4 of the cores and BD is actually pretty decent...? it seems pretty good from this but maybe im wrong...?


Still significantly worse than a Deneb clock for clock.

Quote:


> Originally Posted by *Mad Pistol;15299460*
> I wouldn't call it relief. It just does a good job at explaining why BD doesn't perform well.
> 
> I'm still not buying BD. There's no reason to.


BD doesn't work well because the design just isn't very good. Multithreaded software uses more threads; I don't see the X6 struggling anywhere, and BD is just a different approach to that same end result: more cores. The modular design shares resources, and they're already little, narrow, weak core clusters. Ouch. Anyway, slice it any way you want, but software's direction won't depend on any kind of architectural advancement brought by BD other than more cores--not some kind of new secret coding abilities you blindly seem to want to convey. BD just sucks. The ONLY reason it would do better in future software is more cores. With all due respect, please don't keep facepalming people's correct analysis out of some kind of blind hope.

Then you've got the scheduler in W8. W7 is the flagship now, and I'll be sticking with W7 for a long time, just like lots of others. We shouldn't have to upgrade PSUs AND an OS to get full performance. And what, 5% on average, 10% better in best-case scenarios? Here's where a facepalm is appropriate.


----------



## Blameless

Quote:



Originally Posted by *Riou*


Anyway, just think back to Windows 98/ME, which did not properly support Intel Hyper-Threading. If the OS can properly use BD's architecture so it does not thrash, it can perform a bit better.


A bit of a nitpick but Windows NT and 2000 (before the final service pack) are better examples.

98/ME don't support multiple cores (physical or logical) at all.


----------



## Mad Pistol

Quote:


> Originally Posted by *Dmac73;15300012*
> *stuff*


I bet you anything that in Piledriver, each integer core will have its own L2 cache to pull from rather than a shared one like in Bulldozer. I bet that clears up the "scheduler" issue and increases IPC 10-15% at the same time.

It's not rocket science. It's common sense.


----------



## Dman

Quote:



Originally Posted by *Maelthras*


Does no one remember how crazy it was when they put TWO CORES on one CPU? How long did it take for that little bit of hardware change to be worked into programming? OH RIGHT, the majority of programs still don't take advantage of multi-core CPUs.

Mad beat me to it, but this is still a valid response.


Actually, at the time of dual-core processors, XP supported and utilized them just fine. Software that utilized dual cores also generally worked better on one platform vs. another; in that case it was the Athlon 64, because its architecture as a whole was more efficient than the P4's. BD is AMD's P4; it has nothing to do with "software" optimization. Even if it did, it's irrelevant: by the time these changes come around, BD will be a distant memory.


----------



## Bkpizza

And hey, even before release AMD said they got 50% more performance from 33% more cores. That would be in a perfect world with the nicely threaded apps that BD loves, so it's not that surprising in that way.
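That pre-launch claim is worth unpacking with a little arithmetic. Under a best-case assumption of perfectly threaded software (my assumption, not AMD's wording), +50% throughput from +33% more cores implies only a modest per-core gain:

```python
# AMD's pre-launch claim: 50% more performance from 33% more cores
# (8 Bulldozer cores vs. 6 Thuban cores). With perfect scaling, the
# implied per-core throughput improvement is small:
per_core_gain = 1.50 / (8 / 6) - 1
print(f"{per_core_gain:.1%}")  # 12.5%
```

And that best case only materializes when all eight cores are saturated; in lightly threaded software the claim says nothing at all.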


----------



## Mad Pistol

Quote:



Originally Posted by *Homeles*


Dude, you should probably say why everyone's wrong instead of posting ******ed facepalm images and pretending like you know everything.


He bolded the section he was talking about. He is right, too. When you disable 1 integer core per module in Bulldozer, the IPC becomes comparable to that of each core in Deneb. Basically, a 4c/4m Bulldozer design becomes comparable to that of a 4-core Phenom II.

Quote:



Originally Posted by *Dmac73*


Piledriver is different. I'm waiting on Piledriver myself, as POSSIBLY my next CPU. We're talking about Bulldozer.


AMD has already stated that Piledriver will be a tweaked Bulldozer core. It's not going to be radically different.


----------



## Dmac73

Quote:


> Originally Posted by *Mad Pistol;15300188*
> He bolded the section he was talking about. He is right, too. When you disable 1 integer core per module in Bulldozer, the IPC becomes comparable to that of each core in Deneb. Basically, a 4c/4m Bulldozer design becomes comparable to that of a 4-core Phenom II.
> 
> AMD has already stated that Piledriver will be a tweaked Bulldozer core. It's not going to be radically different.


Yes, BD has problems, obviously. Tweaking a mature architecture and fixing the problems of a brand new one are totally different.

And, you mean we'll finally get our Phenom II X8 from AMD?

edit: my bolded point still stands. Go look at the results, then go look at a PhII clock for clock. IPC is STILL lower. In Cinebench clock for clock, 4m/4c BD is STILL a good bit lower. If comparable equals lower, sure, I'm stupid.


----------



## Shadowclock

This whole thing just sounds like deja vu from when Fermi was first released. Nvidia made a GPU (Tesla) that overheated and had way more built into it than was even being utilized, and it failed in most common respects... and ever since then the 580 has been doing great, even better in some cases (tessellation, which at the time was next-gen development) than its AMD counterparts.

AMD took the same chance Nvidia did by taking their architecture to the next level and patiently awaiting the next round, where they could do some serious catch-up...

Well, all this is what I hope, because I'd hate to see Intel with no competition.


----------



## Dmac73

Quote:



Originally Posted by *Shadowclock*


This whole thing just sounds like deja vu from when Fermi was first released. Nvidia made a GPU (Tesla) that overheated and had way more built into it than was even being utilized, and it failed in most common respects... and ever since then the 580 has been doing great, even better in some cases (tessellation, which at the time was next-gen development) than its AMD counterparts.

AMD took the same chance Nvidia did by taking their architecture to the next level and patiently awaiting the next round, where they could do some serious catch-up...

Well, all this is what I hope, because I'd hate to see Intel with no competition.



Except Fermi performed better than the competition, not worse than its predecessor.


----------



## Homeles

Quote:



Originally Posted by *Mad Pistol*


He bolded the section he was talking about. He is right, too. When you disable 1 integer core per module in Bulldozer, the IPC becomes comparable to that of each core in Deneb. Basically, a 4c/4m Bulldozer design becomes comparable to that of a 4-core Phenom II.


Which is irrelevant because a 4 module Bulldozer runs for $220 on newegg, while a Phenom II 955 runs for $120. Why would you spend 100 dollars more to break even?


----------



## 2010rig

Quote:


> Originally Posted by *NickSim86;15299850*
> I really wish everyone would stop saying that "bulldozer" is a failure when they should be saying "zambezi" or "FX" is a failure.
> 
> As someone who sells x86 servers and VMware licensing, I can tell you that Bulldozer looks to be a very good server architecture for running virtual machines. A 16-core CPU is going to save a customer a lot of money on VMware licenses, as they license per CPU, not per core.
> 
> On a side note, I am looking forward to seeing more reviews of the FX-6100 and FX-4100 CPUs. These look much better price/performance-wise for client computing. I suspect we will see similar performance between the FX-8150 and FX-6100 in most benchmarks, especially when overclocked. I also expect thermals to be much lower on the 4- and 6-core parts. These chips could be FX's saving grace.


So long as those server chips don't have crazy power consumption, they'll do fine in servers. We knew Bulldozer was server first, client second.


----------



## Namwons

Does 4 single-threaded modules of Bulldozer beat a 4c Phenom II, though? Or is it just the same? Anyway, AMD has some catching up to do in IPC.


----------



## AtomicFrost

Quote:



Originally Posted by *Maelthras*


Does no one remember how crazy it was when they put TWO CORES on one CPU? How long did it take for that little bit of hardware change to be worked into programming? OH RIGHT, the majority of programs still don't take advantage of multi-core CPUs.

Mad beat me to it, but this is still a valid response.


Actually, a large number of modern applications can utilize multiple cores.

Back when dual-core CPUs released, there actually was a decent number of dual-core applications in development, including both professional applications and some games.

When AMD released their first dual-core CPU (the X2), they didn't forsake single-thread performance. They realized that a lot of software was single-threaded, and that it would be incredibly detrimental to them as a company if they did.

Unfortunately, this time around whoever is making the engineering decisions at AMD decided to just focus on adding more cores. This would have been fine if each core were as powerful as Phenom II's with much higher clock speeds. Instead we have a chip that is a lot slower clock for clock than a product they released 3 years ago.

Personally, I am surprised that AMD would actually launch this product. They should have just announced that they were scrapping the AMD FX launch due to performance issues. They would have been better off releasing a die shrink of Thuban with an 8-core model: better IPC, better single-thread performance, and similar power consumption. Then take what they learned from working on BD (high clock speeds possible, crap performance per core, high power consumption, etc.) and make a new architecture.


----------



## Homeles

Bulldozer does have a purpose. It is an improvement over Phenom IIs. However that improvement won't come about for quite some time for most users.


----------



## Dmac73

Quote:


> Originally Posted by *Namwons;15300433*
> Does 4 single-threaded modules of Bulldozer beat a 4c Phenom II, though? Or is it just the same? Anyway, AMD has some catching up to do in IPC.


Dude, this was answered just a couple of posts ago. NO.


----------



## Riou

Quote:


> Originally Posted by *Blameless;15300081*
> A bit of a nitpick but Windows NT and 2000 (before the final service pack) are better examples.
> 
> 98/ME don't support multiple cores (physical or logical) at all.


You are right.


----------



## tpi2007

Quote:



Originally Posted by *NickSim86*


On a side note, I am looking forward to seeing more reviews of the FX-6100 and FX-4100 CPUs. These look much better price/performance-wise for client computing. I suspect we will see similar performance between the FX-8150 and FX-6100 in most benchmarks, especially when overclocked. I also expect thermals to be much lower on the 4- and 6-core parts. These chips could be FX's saving grace.



Not really, at least at current prices. The FX-4100 can't compete with a Phenom II X4 955 BE clock for clock, and it is even beaten in many benchmarks by the A8-3850. As for the FX-6100, it's more expensive than the Phenom II X6 1090T, and if the FX-8150 already struggles to beat Thuban and loses in other cases, I can't imagine what happens here.

I'm sure there is a set of benchmarks already featuring the FX-4100 and FX-6100. If someone remembers which site did them, it'd be much appreciated.


----------



## a pet rock

Quote:



Originally Posted by *AtomicFrost*


Personally, I am surprised that AMD would actually launch this product. They should have just announced that they were scrapping the AMD FX launch due to performance issues. They would have been better off releasing a die shrink of Thuban with an 8-core model: better IPC, better single-thread performance, and similar power consumption. Then take what they learned from working on BD (high clock speeds possible, crap performance per core, high power consumption, etc.) and make a new architecture.


That's just bad business sense. Yeah, you don't get butthurt by the reviews, but they've still spent a ton of money on it. I'm gonna pull random numbers out of my butt as an example: let's say two months ago AMD was already $13M in the hole on Bulldozer and had projected sales of only $10M. Now, do you release and be short only $3M, or scrap it and be down $13M?


----------



## mothergoose729

If AMD focuses 100% on improving IPC, inter-core bandwidth, and chipset performance, there may still be hope for AMD. They should dedicate themselves to an aggressive release cycle like Intel's. If AMD could improve IPC by 25% and implement their own version of Hyper-Threading for each integer core, they could be scary. Problem is, they don't have the money or resources to make that happen.


----------



## kweechy

Where are the benchmarks for the new Opteron 6200 CPUs? That's all I want to see.


----------



## gnarlybug5

Soooo... to my understanding it's basically the same thing as Hyper-Threading, except it's called 4 modules and 8 cores, not 4 cores and 8 threads. The 4 cores perform more efficiently when each module doesn't have 2 threads to juggle.

In the end, I personally see it as an AMD Phenom II X4 with Hyper-Threading, but with the threads called actual cores.

That is OBVIOUSLY not what it really is, but it's a good enough analogy that fits its reasoning.


----------



## Ryleh

Quote:



Originally Posted by *Dman*


Again, who cares? Why release something that is essentially useless today and virtually no better than your previous product? It's like bringing a car optimized for a pavement track to an off-road race and then saying "Well, it would've done well on pavement" when you clearly knew it was an off-road race. There was no point in releasing this product at this time, and trying to justify it with the excuse "It doesn't work with anything that is out there today" is just dumb.


Sony did the exact same thing with their CPU (not that they made it...).

If AMD feels this is the future, then they've got to start somewhere. If Ivy Bridge sucked, it wouldn't mean it was a failure... because the general concept is still great, and it'll get better in time as it is adopted.

Some people are still running XP (or any other software) because they hate change, whereas some people are willing to go through a temporary inconvenience if it means a payoff later.

AMD's and Intel's engineers are probably smarter than most of us in this thread. While it would have (seemingly) been smarter to dumb it down slightly and gradually introduce the changes in Bulldozer/Ivy Bridge (which has its share of problems currently too, like the RAM placement, for example), they didn't, so we'll have to deal with it.

My only complaint is that AMD wasn't clearer about what Bulldozer SHOULD do and what it is meant for. You could literally sell poop to people and satisfy them if that's what they wanted.


----------



## QuackPot

This is like buying a 4x4 and then forcing all the power to one wheel for better speed.

Pathetic.


----------



## paulerxx

The benchmarks make Bulldozer look horrible... but let's not forget how dual cores looked when they first came out: they weren't better than single-core CPUs unless the program supported them. Does anyone think that is the case here?


----------



## NickSim86

Quote:



Originally Posted by *2010rig*


So long as those server chips don't have crazy power consumption, they'll do fine in servers. We knew Bulldozer was server first, client second.


Well, the idle power consumption has been good in the tests. Also, the power consumption only really gets out of hand at high overclocks, which obviously won't be an issue on the Opterons.

Quote:



Originally Posted by *tpi2007*


Not really, at least at current prices. The FX-4100 can't compete with a Phenom II X4 955 BE clock for clock, and it is even beaten in many benchmarks by the A8-3850. As for the FX-6100, it's more expensive than the Phenom II X6 1090T, and if the FX-8150 already struggles to beat the Thuban and loses in other cases, I can't imagine what happens here.

I'm sure there is a set of benchmarks already featuring the FX-4100 and FX-6100. If someone remembers which site did them, it'd be much appreciated.


Who cares about clock-for-clock when the FX-41xx should be able to clock 500-1000MHz above the Phenom II X4?

I don't know how reputable this site is, but the benchmarks paint an interesting picture for the 4- and 6-core chips.
http://www.legionhardware.com/articl...fx_4170,1.html


----------



## Papas

Quote:



Originally Posted by *gooface*


But by the time that happens, something else with 8 cores will be out and better at it. Look at Conroe, and then the Athlon 64 X2s.

Look familiar?


But if they are not configured the same way, they are gonna have the same problem BD is having.


----------



## microfister

Since when is Blender a benchmarking tool? What did they render with it?


----------



## gooface

Quote:



Originally Posted by *Papas*


But if they are not configured the same way, they are gonna have the same problem bd is having.


It's something only time will tell; it's mostly a gamble right now, and it's not worth it when an established product (2500K/2600K) is already out there.


----------



## Majin SSJ Eric

I'm still buying an 8150 and a Sabertooth 990FX for my next build just because I want to try something different. You can never have too many computers right?


----------



## odditory

Quote:



Originally Posted by *NickSim86*


I'm referring to Interlagos-based Opterons, not Zambezi-based FX. You are correct that the Bulldozer architecture can't measure up to SB in lightly threaded client applications. Servers running dozens or hundreds of VMs are a totally different ballgame.


No doubt Opterons have been strong in the server market for many years, but that's irrelevant to the Bulldozer discussion happening far and wide.


----------



## NickSim86

Quote:



Originally Posted by *odditory*


No doubt Opterons have been strong in the server market for many years, but that's irrelevant to the Bulldozer discussion happening far and wide.


The point of my original post is that Zambezi is a failure, not Bulldozer. You can't say Bulldozer is a failure, because the architecture is primarily for servers, and I expect the Interlagos parts to perform well. Porting BD over to the client side prematurely is no doubt a failure. However, I will reserve final judgement until I see more reviews of the FX-41xx and FX-6100 on different motherboards, with DDR3-1866, overclocked, and with multiple GPUs.


----------



## Zetsou

This might convert me to AMD...


----------



## southernyankey1970

Me too...

Love my 1100 still, and it will be nice to play with 8 cores without having to fork over half a mortgage payment to do so. Intel still holds the performance crown for now; I just wish they weren't so damned greedy! I'd like to see what the real performance of BD will be in the coming months with further optimizations and tweaks. Should be interesting, to say the least.


----------



## RotaryKnight

Quote:



Originally Posted by *Majin SSJ Eric*


I'm still buying an 8150 and a Sabertooth 990FX for my next build just because I want to try something different. You can never have too many computers right?


I bought the same motherboard on Monday and sent it back on Wednesday, lol.

My 940 BE is getting old and I need an upgrade. I was hoping BD was that upgrade, but it's not.

Anything sharing resources in the same module will decrease performance; anybody with experience in hardware and software should know that. I knew that before BD was released, but I didn't know the performance was going to be that bad. That's why I bought the motherboard and was planning on buying BD.

Per-core IPC did increase, as JF-AMD said, which is why I find it hilarious that some of you guys are trolling his comments... I am not going to name names; you guys know who you are.


----------



## mad0314

It's not any better than a Phenom II X4 or X6 as long as your GPU is the bottleneck, and they are both cheaper; something you don't seem to understand. There is NO reason to get the 8-core purely for gaming. It is a waste of money for that purpose.


----------



## Dmac73

Quote:



Originally Posted by *RotaryKnight*


Per-core IPC did increase, as JF-AMD said, which is why I find it hilarious that some of you guys are trolling his comments... I am not going to name names; you guys know who you are.











Proof?


----------



## crucifix85

Quote:



Originally Posted by *lordikon*


So why release a CPU that isn't properly supported? If AMD understood these issues, they should have been working with software developers to solve them before release.


It's an AMD CPU, not Intel. Expect a lot of foot-dragging before the software gets optimized.


----------



## Jaxlb

People need to understand that you can't design a motherboard that fully supports the BD processors without the processor actually being released, or at least without the companies getting a completed processor before release. The motherboards out now may support 8 cores, but they don't know how to fully utilize them, so until manufacturers design boards that can utilize BD properly, it's going to be wasted.

So until they release the new boards it isn't really worth getting BD. Unless, of course, you're upgrading from a really outdated system.

Oh, and if anyone here calls me an AMD fan or anything like that, I'll have you know I have never owned an AMD-based system in my life.


----------



## matt1898

What the crud.......FX is the most amazing CPU ever made, bar none..............OH wait........................


----------



## Ethan Ravencrow

Quote:



Originally Posted by *Zetsou*


This might convert me to AMD...


/sarcasm...


----------



## Cyrilmak

So happy I went SB. I actually almost held out to see Faildozer.


----------



## NFL

Hard to believe something so hyped up could fail so spectacularly...makes me glad I went with Sandy Bridge


----------



## Homeles

Quote:



Originally Posted by *Jaxlb*


People need to understand that you can't design a motherboard that fully supports the BD processors without the processor actually being released, or at least without the companies getting a completed processor before release. The motherboards out now may support 8 cores, but they don't know how to fully utilize them, so until manufacturers design boards that can utilize BD properly, it's going to be wasted.

So until they release the new boards it isn't really worth getting BD. Unless, of course, you're upgrading from a really outdated system.

Oh, and if anyone here calls me an AMD fan or anything like that, I'll have you know I have never owned an AMD-based system in my life.


Motherboard manufacturers have had months to toy with Bulldozer.


----------



## 2010rig

Guys, seriously, this thread is so painful to read.

Can we unite and stop responding to certain individuals?

No matter what, the GPU-bottlenecked scores are amazing (to him), and he thinks Bulldozer rocks. Nothing that has been said, can be said, or WILL be said will change his mind.


----------



## RotaryKnight

Quote:



Originally Posted by *Dmac73*


Proof?


I guess you didn't read the news thread BEFORE it got merged into this thread... the damn thing is a jumbled mess now.


----------



## HowHardCanItBe

Reopened.


----------



## Oedipus

Let the games begin anew!


----------



## kikkO

*ALIENBABEL TECH REVIEW*


----------



## Strat79

Meh. I think this thread has just about run its course, honestly. Nothing new can be said, and it will most likely just be closed again. Good luck trying to keep it clean, though, mods.


----------



## jackeyjoe

Quote:



Originally Posted by *Strat79*


Meh. I think this thread has just about run its course, honestly. Nothing new can be said, and it will most likely just be closed again. Good luck trying to keep it clean, though, mods.










Pretty much. Chances are we'll just close it and leave it closed next time... You have been warned.


----------



## robbo2

All I can say, as someone who bought a motherboard with every intention of buying one of these chips, is that I'm disappointed. I stuck it out with the original Phenom, but I can't see myself buying a Bulldozer chip any time soon.


----------



## mrcool63

The main problem with this thread is people ranting unnecessarily about the obvious drawbacks of BD. The discussion has degraded to the level of people just giving opinions like "It's a fail!" and not a word more.
People need a bit more introspection and should explain their problem with BD's performance in words other than FAIL, Massive FAIL, FAILDOZER, and the like.

If you have anything to add beyond stating the obvious, then please post; otherwise, just don't!


----------



## Seronx

Quote:


> Originally Posted by *mrcool63;15302891*
> The main problem with this thread is people ranting unnecessarily about the obvious drawbacks of BD. The discussion has degraded to the level of people just giving opinions like "It's a fail!" and not a word more.
> People need a bit more introspection and should explain their problem with BD's performance in words other than FAIL, Massive FAIL, FAILDOZER, and the like.
> 
> If you have anything to add beyond stating the obvious, then please post; otherwise, just don't!


The main drawback of Bulldozer is that at stock it consumes 1.5x the power of the i7 2600K,

and at clock rates that match an overclocked i7 2600K (4.9GHz), it consumes 4x the power of the stock i7 2600K and 3x that of the overclocked one.


----------



## Eolas

It just seems like Bulldozer has so much more under the hood than any other processor, yet it is underperforming relative to its transistor count and core count. I really wish AMD had caught all these bottlenecks in the architecture.


----------



## Hawk777th

Interesting.

http://www.overclock.net/amd-cpus/1141188-asus-crosshair-v-formula-board-may.html


----------



## mrcool63

I refer you to the statement below:
Quote:


> If you can buy an 8-core for the price of a 4-core, then it will lag in anything that cannot make full use of its 8 cores.


If they could compete using a 4-core, why would they ever have created an 8-core?!
Simple logic, buddy.


----------



## Dopamin3

Quote:


> Originally Posted by *robbo2;15302854*
> All I can say, as someone who bought a motherboard with every intention of buying one of these chips, is that I'm disappointed. I stuck it out with the original Phenom, but I can't see myself buying a Bulldozer chip any time soon.


Just hold out for Piledriver; hopefully they'll improve the IPC (I'm hoping for >15%) and decrease power consumption.


----------



## Badness

"It just seems like Bulldozer has so much more under the hood than any other processor, yet it is underperforming relative to its transistor count and core count. I really wish AMD had caught all these bottlenecks in the architecture."
Yeah, and hopefully this new architecture will be a great starting point, like a slow investment. But I'm probably being 100 times too optimistic.


----------



## robbo2

Quote:


> Originally Posted by *Dopamin3;15303009*
> Just hold out for Piledriver; hopefully they'll improve the IPC (I'm hoping for >15%) and decrease power consumption.


I'm not joining the 'I'm going to sell my board' bandwagon just yet. I have faith in the architecture, but in all honesty I don't see it as much of an upgrade. I just hope Piledriver will iron out the bugs like Phenom II did.


----------



## mad0314

Quote:


> Originally Posted by *mrcool63;15302967*
> I refer you to the statement below:
> 
> If they could compete using a 4-core, why would they ever have created an 8-core?!
> Simple logic, buddy.


It's not so simple. They COULD have shrunk Phenom II and gotten better performance out of it. But the thing is, they are focused more on server chips, and this is a server-oriented chip for desktops. It's not that their R&D can't do it; it's that it would cost them a lot more. I was going to say that their budget can't cover it, but I don't know that. In the end, they are focused more on server chips, and this is a chip that reflects that.


----------



## Dublin_Gunner

Quote:


> Originally Posted by *Hawk777th;15302940*
> *Interesting.*
> 
> http://www.overclock.net/amd-cpus/1141188-asus-crosshair-v-formula-board-may.html


Not really.

This is nothing more than a terrible attempt at damage limitation and at buying themselves more time.

New boards / immature BIOS / new platform, so it must perform like crap, right?

I don't recall there being such issues with LGA1155 P67 boards and their UEFI BIOSes.


----------



## alancsalt

I suspect there may be some truth in this:

Quote:



On paper Bulldozer is a lovely chip. Bulldozer was on the drawing board (people were even working on it) even back when I was there. All I can say is that by the time you see silicon for sale, it will be a lot less impressive, both in its own terms and when compared to what Intel will be offering. (Because I have no faith AMD knows how to actually design chips anymore.) I don't really want to reveal what I know about Bulldozer from my time at AMD.

What did happen is that management decided there SHOULD BE such cross-engineering, which meant we had to stop hand-crafting our CPU designs and switch to an SoC design style. This results in giving up a lot of performance, chip area, and efficiency. The reason DEC Alphas were always much faster than anything else is that they designed each transistor by hand. Intel and AMD had always done so, at least for the critical parts of the chip. That changed before I left; they started to rely on synthesis tools, automatic place-and-route tools, etc. I had been in charge of our design flow in the years before I left, and I had tested these tools by asking the companies who sold them to design blocks (adders, multipliers, etc.) using their tools. I let them take as long as they wanted. They always came back to me with designs that were 20% bigger and 20% slower than our hand-crafted designs, and which suffered from electromigration and other problems.

That is now how AMD designs chips. [sarcasm]I'm sure it will turn out well for them.[/sarcasm]


http://www.insideris.com/amd-spreads...ee-speaks-out/

AMD used to lead the way. From 2000 to 2007 all my systems were AMD.


----------



## kweechy

Quote:



Originally Posted by *microfister*


Since when is Blender a benchmarking tool? What did they render with it?


If you render the same scene with any 3D application, it's a benchmarking tool.

They're probably better ones, in fact, than the synthetics, because they'll use the memory controller and RAM in general a lot more; you get a better idea of the system's power and how well the CPU accesses data, in addition to simply how well it processes it.


----------



## toX0rz

Quote:



Originally Posted by *mad0314*


It's not so simple. They COULD have shrunk Phenom II and gotten better performance out of it. But the thing is, they are focused more on server chips, and this is a server-oriented chip for desktops. It's not that their R&D can't do it; it's that it would cost them a lot more. I was going to say that their budget can't cover it, but I don't know that. In the end, they are focused more on server chips, and this is a chip that reflects that.


mrcool63 is right: if they could have made a high-performing 4-core chip to compete with Intel, they would have done it, but they couldn't... it IS that simple.
It has nothing to do with focusing on server chips. AMD is not dumb; I am sure even they know that achieving the same performance with fewer cores is better than having more, but weaker, cores.

You make it sound like a die-shrunk Phenom II would be able to keep up with SB core for core, but that's laughable. I am pretty sure it wouldn't even match Nehalem's IPC, so it's still MILES away from being a competitive 4-core.
The fact is that AMD has to use more cores in order to keep up because they simply can't build a faster core; it's hopeless.

A die-shrunk Phenom II X8 would have had a bit more performance, that's right, but it would still be 8 cores vs. 4, which means it would still fail in most benchmarks that don't really utilize all of them.


----------



## Am*

These CPUs are officially an epic fail. Prices of the 2500K/2600K and Intel SB motherboards went up with the release of these processors. Way to go, AMD...


----------



## Tabzilla

Quote:



Originally Posted by *mad0314*


You cannot say that a Honda Civic is as fast as a Bugatti Veyron when they are only allowed to go the speed limit.


Well, you unknowingly proved his point here...bravo!









I'll FTFY:

Quote:



Originally Posted by *mad0314*


You can say that a Honda Civic is as _good_ as a Bugatti Veyron when they are only allowed to go the speed limit.


----------



## jck

Quote:



Originally Posted by *Dublin_Gunner*


Not really.

This is nothing more than a terrible attempt at damage limitation and at buying themselves more time.

New boards / immature BIOS / new platform, so it must perform like crap, right?

I don't recall there being such issues with LGA1155 P67 boards and their UEFI BIOSes.


No, not with the BIOS. But as I recall, Intel's chipset for Sandy Bridge was not without its problems.

...From Intel's own technology blog...

No one is perfect.

Like the performance evals, I will now wait for multiple sources to do comparisons. In fact, I am gonna message a reviewer I have talked to and see if he's going to look into this. It would make a great article for his website.


----------



## hajile

Quote:



Originally Posted by *alancsalt*


I suspect there may be some truth in this:

http://www.insideris.com/amd-spreads...ee-speaks-out/

AMD used to lead the way. From 2000 to 2007 all my systems were AMD.


Computer fab processes are just a few years away from not being able to get any smaller. A 2D chip (note: "3D transistors" are not 3D, just turned on edge; they are not stacked) scales as a power of 2: to increase performance once the minimum feature size is reached, a 2D chip must get wider and longer. A 3D chip allows the same base size (and keeps devices small).

Computer-designed chips are the future. Some research shows little or no heat output from graphene chips, which would make 3D designs feasible. A cubic chip would have so many transistors and be so complicated (due to hundreds of 3D interconnects) that a computer is the only feasible method of design.

AMD designing chips with a computer (if true) simply shows that the company is forward-thinking. Getting experience and patents (I don't like patents, but it's a "legit" business strategy) would put AMD ahead of the curve. As experience with computer design grows, so will efficiency (note: AMD must still create and input basic structures for a computer program to work with). It's also important to note that AMD would still examine the computer-generated designs and hand-tweak the less efficient spots (as is done with computerized design in every other engineering profession).

That said, GPUs have been computer-designed for years and it works fairly well.

edit: I would be remiss if I did not say that Intel undoubtedly uses computers for at least the more boring jobs, like designing wire trace layers. And if computers are not used for current designs, Intel still likely has a team researching this area.

To answer some other people, I think I'll just quote myself to save typing:

Quote:



Originally Posted by *hajile*


Yields are terrible. The launch was probably mostly a paper launch.
I don't expect some magic 50% performance increase.

Linux developers seem to have had (and may still be having) a difficult time redesigning the scheduler to suit the new architecture, and I assume Windows developers are having similar issues. Consider the issue discussed earlier in this thread, where disabling the second integer core in each module improves performance per clock. This could point to a problem with the decode unit, *but* the decoder is probably not the biggest problem. Data and instructions reach the integer cores only *after* being decoded, so a decoder bottleneck exists no matter which integer cores are working downstream (the maximum number of instructions decoded per unit of time caps the number executed per unit of time, regardless of how many execution units are present). Rearranging the chip in software (by disabling some cores) to simulate a more traditional architecture therefore suggests the gains are *more likely* due to better scheduling and less cache thrashing, not "getting more instructions to fewer cores".

AMD's problem is that the CPU (apparently) can't effectively use all the available integer cores, due to poor scheduling by the OS, large cache latencies, too few decoders, and poor branch prediction. That is to say, the integer units (and possibly the FPUs) are being bottlenecked.

The poor scheduling can be fixed, and (based on Anandtech's not-fully-optimized Windows 8 alpha test) fixing it will decrease normal power consumption (unused cores can downclock) while increasing performance by 10% or maybe more (some 4c/4cu benches showed >20% increases). The cache latencies were probably raised above the expected numbers to improve yields, given poor 32nm performance at Globalfoundries; they will probably improve within the next stepping or two. (Side note: one of AMD's goals was near-linear performance scaling with clockspeed, something SB doesn't achieve, and reaching the 30% clockspeed advantage over Deneb that was initially expected will also be a side effect of fab improvements.)

Improving the decoders (if necessary) and improving branch prediction require a complete rework of the processor's front end. With normal development times for simple chip redesigns being a couple of years, I suspect AMD knew months ago about the poor decode and branch prediction. That is the only explanation for how soon Piledriver is being released (a few months, rather than the couple of years a major redesign takes). AMD likely counted on the 30% higher stock clockspeed to carry them until the redesign was finished (notice that the 4.6GHz overclock benches, roughly 30% faster than Deneb designs, were fairly competitive), but AMD was screwed by the bad fab (though I believe AMD is at fault as well for shipping a faulty design).

My prediction (please don't quote me later; I am being optimistic, but in reality I have little faith):

Between a 10-15% average OS performance increase (this seems fairly certain), better fabs giving (I guess) a 20% rather than 30% increase in clockspeed (scaling almost linearly), better fabs giving nearly 80% improvement in cache latencies (matching Deneb latencies should be completely possible; cache is cache), and a 10-15% IPC improvement (also fairly certain) from more decode capacity and better branch prediction, I believe the next iteration will show more of the theoretical potential.

edit: At best, 15% from the OS and 15% from the redesign gives a 30% IPC boost (making it 20% faster than Deneb and 20% slower than Sandy Bridge). Better cache latencies are a mixed bag; they may give less than 2% for some applications or more than 20% for others. If clockspeed can be increased, the combined improvements give between 35-90% more performance (that's a huge delta). Even with a 70-90% increase in overall performance, performance per transistor would still be worse than Sandy Bridge's.

This seems to be the only explanation for how a chip with half the transistor count of Bulldozer can have better performance. As the chip stands, I couldn't recommend anyone buy one (I don't think I could recommend one even if the OS problem went away).
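One nit on the arithmetic in that quote: independent speedups compound multiplicatively, so they come out slightly above their simple sum. A quick sketch of that arithmetic (the input percentages are the post's own guesses, not measurements):

```python
def compound(*gains: float) -> float:
    """Combine independent fractional speedups multiplicatively.

    Two 15% gains give ~32% combined, slightly more than the 30%
    you get from simple addition.
    """
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

# Best case sketched above: 15% OS + 15% redesign + 20% clockspeed
best = compound(0.15, 0.15, 0.20)
print(f"combined speedup: {best:.0%}")  # ~59%
```

The same compounding applies downward, too: stacking pessimistic estimates for each factor gives the low end of the quoted 35-90% range.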


----------



## hokiealumnus

Heh, Intel's chipset problem had nothing to do with performance; it was just a messed-up storage controller. My theory on all this unsupported supposition about ASUS being the culprit is that the whole thing is an overblown, wishful-thinking conspiracy theory. My reasoning:

First and foremost, AMD isn't about to send every reviewer a board that doesn't use their chip to its potential. Agreement or no, that would just be ridiculously stupid to do.

Second, ASUS' BIOS was as up to date as it got. The date on the UEFI AMD gave us was little more than a week before the board & CPU were shipped to us. There was one less than a week old (which I flashed) on their FTP, available for our use in reviews.

Third, I find it hard to believe (with all due respect to the other companies) that ASRock and MSI have some sort of secret sauce that yields a significantly stronger BIOS than ASUS' or Gigabyte's. Not only AMD's reputation but ASUS' as well could be hurt by screwing the pooch with a UEFI that messes up (nearly) everyone's reviews.

Subset of #3: ASRock doesn't even have a production BIOS on their website that can _function_ with BD right now. 1.30 for the 990FX Extreme 4 does not POST with FX chips; you have to get a non-public beta from them.

Last, but certainly not least: if that were even a remote possibility, why would AMD not have said anything at all to us? All the reviewers are on the same mailing list, and we have received no word whatsoever from AMD about this. Any reasonable person would see that as a telltale sign that it performed right where they thought it would. At the very least AMD would reach out and say 'hey guys, anybody have an ASRock board to test this out on and make sure we didn't screw something up?'

Sorry, I just don't think that theory (and the quite a few others like it over the last two days) holds water. Use common sense, people.


----------



## jck

Quote:



Originally Posted by *hokiealumnus*


Heh, Intel's chipset problem had nothing to do with performance and was just a messed up storage controller....


I wasn't saying it was a performance issue. Just that Intel's products aren't always perfect either.

Besides, Intel put out a solution for the issue. I'm sure if there is one with AMD's product that is resolvable, they will too.


----------



## Kand

Quote:



Originally Posted by *jck*


I wasn't saying it was a performance issue. Just that Intel's products aren't always perfect either.

Besides, Intel put out a solution for the issue. I'm sure if there is one with AMD's product that is resolvable, they will too.


ONE transistor.

Seriously. Just ONE transistor, for an issue that had never actually happened to a consumer.


----------



## Blameless

Quote:



Originally Posted by *kweechy*


If you render the same scene with any 3D application, they're benchmarking tools.

Probably better ones in fact than the synthetic ones because they'll use the memory controllers and the RAM in general a lot more, you get a better idea of the system's power and how well the CPU accesses data in addition to simply how well it processes it.


Blender is better than a synthetic benchmark precisely because it's not a synthetic benchmark.

Blender is a pretty popular program that is used to actually do stuff, sometimes even real work. So its performance is much more relevant.


----------



## Trogdor

Quote:



Originally Posted by *Kand*


ONE transistor.

Seriously. Just ONE transistor, for an issue that had never actually happened to a consumer.


Don't underplay this. 15% of the motherboards failed during Intel's testing; repairs would have run around $88 per motherboard, and the whole thing was estimated to cost Intel $1 billion.

Back on topic: I hope this Bulldozer issue is resolved soon. It's getting old wading through these troll posts to find worthwhile input.


----------



## jck

Quote:



Originally Posted by *Kand*


ONE transistor.

Seriously. Just ONE transistor, for an issue that had *never actually happened to a consumer.*


Er... it has reached some consumers. You can find them on blogs and message boards; people got these boards and didn't know about the recall until after it happened to them.

Intel did what it could to stop shipments and recall them, but some did get sold, as parts and in complete systems, to consumers.


----------



## Kand

Quote:



Originally Posted by *jck*


Er... it has reached some consumers. You can find them on blogs and message boards; people got these boards and didn't know about the recall until after it happened to them.

Intel did what it could to stop shipments and recall them, but some did get sold, as parts and in complete systems, to consumers.


What they did was switch their drives to the SATA 6Gb/s ports.

Some didn't bother to respond to the recall after that.


----------



## Dublin_Gunner

Quote:



Originally Posted by *jck*


No, not with the BIOS. But as I recall, Intel's chipset for Sandy Bridge was not without its problems.

...From Intel's own technology blog...

No one is perfect.

Like the performance evals, I will now wait for multiple sources to do comparisons. In fact, I am gonna message a reviewer I have talked to and see if he's going to look into this. It would make a great article for his website.


That has nothing to do with the conversation. It was a very slightly bugged SATA controller (a leaking transistor), and all you had to do was use the other SATA ports (if you were in the 3% it affected).

The issue never affected the platform's performance at all, but it could lead to degraded SATA functionality over time.


----------



## Indulgence

remember this?










hmmmmmm







my say is

*BD BROKE OUR HEARTS! (out of disappointment)*


----------



## Quantum Reality

Quote:



Originally Posted by *Xenthos*


Not sure if I trust these benches so far but if true, I'm very disappointed.


I've been reading a few of these reviews and... I'm really less than impressed. Even relatively non-memory-intensive benchmarks like SuperPi calculations really favor the Core i* series of CPUs.


----------



## Don Karnage

Quote:



Originally Posted by *Trogdor*


Don't underplay this. 15% of the motherboards failed during Intel's testing; repairs would have run around $88 per motherboard, and the whole thing was estimated to cost Intel $1 billion.

Back on topic: I hope this Bulldozer issue is resolved soon. It's getting old wading through these troll posts to find worthwhile input.


One of my friends who is stationed overseas is still running a B2 board that I bought and shipped to him in January, and he has had zero issues with it. I remember reading one thread over at XS, but that was it. If you use the SATA 6Gb/s ports you'll never have an issue.


----------



## Mad Pistol

Quote:



Originally Posted by *Dublin_Gunner*


That has nothing to do with the conversation. It was a very slightly bugged SATA controller (a leaking transistor), and all you had to do was use the other SATA ports (if you were in the 3% it affected).

The issue never affected the platform's performance at all, but it could lead to degraded SATA functionality over time.


BD is a radically different design from anything out there. A proper BIOS could make a difference in performance... not a 50% difference, but 10-15% isn't out of the question.

From the reviewers that used ASRock 990FX motherboards, BD was on par with or beat an i5 2500K in almost all of their tests. In some cases it was neck and neck with the 2600K (and in a rare few it beat the 2600K). The FX-8150 also showed itself to be significantly faster than any Phenom II X6 in multi-threaded applications on the ASRock motherboards.

The reviewers that used the ASUS 990FX board were collectively disappointed because BD failed to even match the i5 2500K. This isn't a coincidence; there is a real issue with the current BIOS on the Crosshair V motherboard.

Look, AMD priced the FX-8150 to slot in between the 2500K and 2600K. When paired with the right motherboard, the FX-8150 fits that slot perfectly. Do you honestly believe AMD would price themselves out of the market with a product that doesn't deliver? They don't have the funds to do that; they have to be competitive on price from the get-go or they can't move inventory.


----------



## Madmanden

Quote:



Originally Posted by *Mad Pistol*


Look, AMD priced the FX 8150 to slot inbetween the 2500k and 2600k. When paired with the right motherboard, the FX 8150 fits perfectly. Do you honestly believe AMD would price themselves out of the market when their product doesn't deliver? They don't have the funds to do that; they have to be competitive on the price front from the get-go, or they can't move inventory.


It does raise the question, then, of why AMD would use the ASUS board in the review kits for their new flagship processor. Surely they can't be that stupid?


----------



## Wbroach23

Quote:



Originally Posted by *Mad Pistol*


BD is a radically different design from anything out there. A proper bios could make a difference in performance... not 50% difference, but 10-15% isn't out of the question.

From the reviewers that used the ASRock 990FX motherboards, BD was on par with or beat an i5 2500k in almost all of their tests. In some cases, it was neck and neck with the 2600k (and in a rare few, it beat the 2600k). The FX 8150 also showed to be significantly faster than any Phenom II X6 in multi-threaded applications in the ASRock motherboards too.

The reviewers that used the ASUS 990FX board were collectively disappointed because BD failed to even match the i5 2500k. This isn't a coincidence; there is a real issue with the current BIOS in the Crosshair V motherboard.

Look, AMD priced the FX 8150 to slot inbetween the 2500k and 2600k. When paired with the right motherboard, the FX 8150 fits perfectly. Do you honestly believe AMD would price themselves out of the market when their product doesn't deliver? They don't have the funds to do that; they have to be competitive on the price front from the get-go, or they can't move inventory.


This^^^^^^ Over 9000%


----------



## Nocturin

Quote:



Originally Posted by *Don Karnage*


One of my friends who is stationed overseas still is running a B2 board that I had bought for him in january and shipped to him and he has had zero issues with it. I remember reading one thread over at XS but that was it. *If you use the sata 3 ports you'll never have an issue.*


So I don't use a feature that I paid for because I'm worried about data corruption, and also limit myself to fewer sata ports for my fancy RAID party?

Your argument is not valid, sir.


----------



## RotaryKnight

The only AMD CPUs I would use now are probably the APUs. They're the best HTPC CPUs I've ever seen, and a great choice for a low-budget build if you're gaming at low res.


----------



## lordikon

Quote:


> Originally Posted by *Nocturin;15306012*
> So I don't use a feature that I paid for because I'm worried about data corruption, and also limit myself to fewer sata ports for my fancy RAID party?
> 
> Your argument is not valid, sir.


What he's saying is that if you own something that's broken in a way you don't use then you're not really affected. Of course shipping broken hardware isn't acceptable, which is why Intel recalled them.


----------



## lordikon

Quote:


> Originally Posted by *linkin93;15299274*
> Maybe all it needs is, you know, a software patch, perhaps something called a _driver?_
> 
> At the moment, using the FX CPU's as they are right now, without a proper driver or BIOS or software patch or whatever, it could be like playing games without your GPU driver installed:
> 
> Sure it works, but what about optimisations?
> 
> Hence, people of OCN, you make me


You missed my point; I asked why release something that isn't already supported. By that I mean AMD should have known many months ago this would happen and lined up the support before they tried to sell this to people.


----------



## QuackPot

http://forum.jogos.uol.com.br/amd-vision-fx8150--fx8120-bulldozer---performance-em-jogos_t_1727300?page=7

Brazilian review. BD seriously sucks gaming-wise in their tests, anyway.


----------



## levathar

Quote:


> Originally Posted by *Wbroach23;15305952*


I want to see more people confirming this.

I think that, deep down, everyone wanted AMD to hit Intel hard; the long-running marketing campaign and so many delays only made us think that WOW, this would be a hell of a chip.

The biggest mistake was calling this the FX line. That name should only be used when you're sure it will melt the competition. For me, BE status would have sufficed.

Right now I want to see other reviewers confirm this. I have seen reviews with ASRock boards, but as the majority go with ASUS, I still want to wait and see.

Maybe this would put the knives away from AMD's neck...

EDIT:
Reply meant to quote this:
*Originally Posted by Mad Pistol View Post*
BD is a radically different design from anything out there. A proper bios could make a difference in performance... not 50% difference, but 10-15% isn't out of the question.

From the reviewers that used the ASRock 990FX motherboards, BD was on par with or beat an i5 2500k in almost all of their tests. In some cases, it was neck and neck with the 2600k (and in a rare few, it beat the 2600k). The FX 8150 also showed to be significantly faster than any Phenom II X6 in multi-threaded applications in the ASRock motherboards too.

The reviewers that used the ASUS 990FX board were collectively disappointed because BD failed to even match the i5 2500k. This isn't a coincidence; there is a real issue with the current BIOS in the Crosshair V motherboard.

Look, AMD priced the FX 8150 to slot inbetween the 2500k and 2600k. When paired with the right motherboard, the FX 8150 fits perfectly. Do you honestly believe AMD would price themselves out of the market when their product doesn't deliver? They don't have the funds to do that; they have to be competitive on the price front from the get-go, or they can't move inventory.


----------



## gsa700

Aren't the ASRock boards made by ASUS?

If that's the case, isn't it reasonable to assume the BIOS teams may be the same people?

Or are they not the same company?


----------



## levathar

Quote:


> Originally Posted by *gsa700;15306698*
> Aren't the Asrock boards made by ASUS?
> 
> IF that's the case, shouldn't it be reasonable to assume the BIOS teams may be the same people?
> 
> Or Are they not the same company?


AFAIK, ASRock and ASUS split some years ago.


----------



## Don Karnage

Quote:


> Originally Posted by *levathar;15306715*
> AFAIK, Asrock and Asus split some years ago.


In 2008


----------



## i7monkey

Cliffs notes anyone?


----------



## gsa700

Quote:


> Originally Posted by *levathar;15306715*
> AFAIK, Asrock and Asus split some years ago.


Oh, I didn't realize that. Apparently, there is still a lot of confusion about it:

http://www.techpowerup.com/forums/showthread.php?t=151469


----------



## awdrifter

Quote:


> Originally Posted by *i7monkey;15306834*
> Cliffs notes anyone?


Bulldozer sucked. Buy an i5 2500K for gaming, an i7 2600K for multithreaded work, or a Phenom II X6 as an upgrade if you already own an AM3 mobo.


----------



## B NEGATIVE

Quote:


> Originally Posted by *QuackPot;15306496*
> http://forum.jogos.uol.com.br/amd-vision-fx8150--fx8120-bulldozer---performance-em-jogos_t_1727300?page=7
> 
> Brazilian review. BD seriously sucks gaming wise in their tests anyway.


Same results, just in Brazilian Portuguese...
Did you even look at the sources?


----------



## ekg84

Interesting results from Rage3D: they claim that an NB overclock can significantly boost BD performance. The 2600K lagging behind the i7 920 in their charts looks a bit strange, though.

http://www.rage3d.com/reviews/cpu/amd_fx_8150/index.php?p=1


----------



## pale_neon

Quote:


> Originally Posted by *hajile;15294904*
> Speaking in consumer terms (Server and HPC were a different story).
> 
> Core 2 was still 10% faster per clock and overclocked better because Phenom had a cold bug. AMD didn't catch up with core 2 until Phenom II. Bulldozer is (at best) 25% slower per clock. Until clockspeeds improve and the bottlenecks are removed, Bulldozer is not competitive.


I wasn't talking about clock-for-clock; I was talking about value for gaming. The B3 stepping made a big difference for the first Phenom, and you could get a quad core for about the price of a Core 2 Duo. Which is sort of like the current situation: more efficient quads vs. a less efficient octo, where the octo needs an efficiency boost from a revision.
Quote:


> Originally Posted by *jrbroad77;15294931*
> Now come on! Phenom X4 came out after Core 2 Quad. It was no competition, a Q6600 at 4ghz blew away a 9950 at 3.2ish(max OC vs max OC).
> 
> Anyway the 8-core BD comes close to competing with Intel's best quads. I guess that's good? Doesn't really beat a 2600k multithreaded even (well, 20% on occasion, with double the transistor count)


I said Core 2 Duo, not Quad. The C2 Quad was more expensive than the Phenom X4 at launch; the Q6600 was over $850 when it was new, while the Phenom X4 9500 launched around $250, which is why I said it was competing with Core 2 Duos.

The funny part is the Core 2 Duos were even beating the Q6600 when it was new, because few games made much use of quad cores.

The point I was making is that we're in a similar situation now, where very few games actually make much use of more than 4 cores. But take one of those old quad cores now and match it against a dual core in a new game, and the dual core will be slower, even though it used to be faster in games.

http://www.legitreviews.com/article/682/11/

That being said, BD does need a slight tune-up.

I was waiting for BD, but now I guess I'll be waiting until Kepler/Southern Islands and re-evaluate my choices then. I refuse to go SB; I might go SB-E if BD doesn't show an improvement and SB-E doesn't include all the things I dislike about SB (remotely accessible serial #s, etc.).


----------



## Chuckclc

You know, I think I have been convinced; I think I am going to get a Bulldozer chip anyway. It looks like I'll get the FX 6100, since all the focus is on the 8150. I will probably get my chip this weekend from Microcenter, if they have any in stock. If they do, I will post current benches with my PhII chip at 4.2-4.4GHz and then run the same benches on the FX 6100.


----------



## Dopamin3

Quote:


> Originally Posted by *Chuckclc;15307180*
> You know I think I have been convinced, i think i am going to get a Bulldozer chip anyways. Looks like i am going to get the FX 6100, since all the focus is on the 8150. I will probably get my chip this weekend from Microcenter, if they have any in stock. If they do i will post current benches with my PHII chip at 4.2-4.4ghz and then use the same benches for the FX 6100.


http://www.guru3d.com/article/amd-fx-8150--8120-6100-and-4100-performance-review/

You would really spend money on the FX 6100? It doesn't even come boxed in a tin, and that's the only reason to buy the 8-cores. With that thrown out the window, why are you going to buy one?


----------



## Nocturin

Quote:


> Originally Posted by *Chuckclc;15307180*
> You know I think I have been convinced, i think i am going to get a Bulldozer chip anyways. Looks like i am going to get the FX 6100, since all the focus is on the 8150. I will probably get my chip this weekend from Microcenter, if they have any in stock. If they do i will post current benches with my PHII chip at 4.2-4.4ghz and then use the same benches for the FX 6100.


I recommend waiting for a new stepping, or at the least a few weeks for more data/information.

If you do, though, you'd better post impressions/benchies. Of course, you have a CHV, so that variance isn't there.


----------



## hydropwnics

Quote:


> Originally Posted by *Chuckclc;15307180*
> You know I think I have been convinced, i think i am going to get a Bulldozer chip anyways. Looks like i am going to get the FX 6100, since all the focus is on the 8150. I will probably get my chip this weekend from Microcenter, if they have any in stock. If they do i will post current benches with my PHII chip at 4.2-4.4ghz and then use the same benches for the FX 6100.










I guess you do have the Crosshair V already...


----------



## Chuckclc

Lol, it's almost like folks don't want others to buy these things because they are scared they will be happy with one. Or else why would people care so much? Too funny. I will still have my 955, and if the 6100 is absolute trash, then I'm sure there will be many buyers still out there at this point in time. The majority of the buying crowd are not enthusiasts like us. Plus, people just like to tinker. Selling these should be no problem.


----------



## Mad Pistol

Quote:


> Originally Posted by *Chuckclc;15307282*
> Lol, its almost like folks dont want others to buy these things because they are scared they will be happy with one. Or else why would people care sooo much. Too funny. I will still have my 955, and if the 6100 is just absolute trash, then Im sure there will be many buyers still out there at this point in time. Majority of the buying crowd are not enthusiast like us. Plus people just like to tinker. Selling these should be no problem.


...unless the person is an Intel fanboy, in which case said person will whip out their 2500k/2600k and rub it in their face. It's already happening a lot on these boards.

For those that like to tinker, BD is great. For those that want absolute performance, BD is not.


----------



## Dopamin3

Quote:


> Originally Posted by *Mad Pistol;15307379*
> ...unless the person is an Intel fanboy, in which case said person will whip out their 2500k/2600k and rub it in their face. It's already happening a lot on these boards.
> 
> For those that like to tinker, BD is great. For those that want absolute performance, BD is not.


I hope you're not referring to me... Look at the link I just posted from Guru3D: the FX 6100 loses fairly often to the older Phenom II architecture, even the quads! If you want to "tinker" on AMD you would be better off buying the 960T; you get the fun of unlocking cores and overclocking any way you want.

And by the way, check out my second computer, whose motherboard I upgraded specifically for Bulldozer. I'm just as disappointed as all the AMD fangirls. I buy either brand based on price/performance, and Bulldozer offers neither over the previous Phenom II models, let alone Sandy Bridge.


----------



## hajile

Quote:


> Originally Posted by *pale_neon;15307126*
> I wasn't talking about clock for clock. I was talking about value for gaming. the B3 stepping made a big difference in the first Phenom & you could get a quad core for about the price of a core 2 duo. Which is sort of like the current situation, more efficient quads vs less efficient octo & the octo needs a efficacy boost from a revision.
> 
> I said Core 2 Duo. not quad. the c2 quad was more expensive than the Phenom X4 at launch. the Q6600 was over $850 when it was new. The Phenom X4 9500 launched around $250, which is why i said it was competing w/ Core 2 duos.
> 
> The funny part is the Core 2 Duos were even beating the Q6600 when it was new because few games made much use of quad cores.
> 
> Point i was making is, we're in a similar situation now, where very few games actually make much use of over 4 cores. But you take one of those old quad cores now, and match it against a dual core in a new game; and the dual core will be slower, even though it used to be faster in games.
> 
> http://www.legitreviews.com/article/682/11/
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> that being said, BD does need a slight tune up.
> 
> I was waiting for BD, but now I guess ill be waiting until Kepler/Southern Islands and re-evaluate my choices then. I refuse to go SB, i might go SB-E if BD doesn't show an improvement & SB-E doesn't include all the things I dislike about SB (remotely accessible serial #s, etc..).


I somewhat agree. At stock speeds, my Phenom 9950 was a bit better in dollar/performance. When both my Phenom and Q6600 were overclocked, the dollar/performance was better for the Q6600 (though there was something to be said for overall system costs).


----------



## Kasp1js

Quote:


> Originally Posted by *ekg84;15306982*
> Interesting results from rage3d, they claim that NB OC can significantly boost BD performance. 2600K lagging behind i7 920 in charts looks a bit strange though.
> 
> http://www.rage3d.com/reviews/cpu/amd_fx_8150/index.php?p=1


Definitely the most thorough review I've read.

Shame they didn't include a 2500K. Also, the 2600K results seem odd; maybe due to slow RAM?


----------



## Nixuz

Ehhh.
I'll buy a BD on the basis that it's pretty much as good as what I have for 2-4 threads, and it will help with heavily threaded stuff when I need it.
I also don't have to buy a new mobo, which saves me some cash.
It seems to be a good general-purpose chip: not great at anything, but good for everything.


----------



## Roll Cam Tide

Quote:


> Originally Posted by *Mad Pistol;15307379*
> ...unless the person is an Intel fanboy, in which case said person will whip out their 2500k/2600k and rub it in their face. It's already happening a lot on these boards.
> 
> For those that like to tinker, BD is great. For those that want absolute performance, BD is not.


I've seen a lot of the Intel fanboy bragging as well. BD buyers can still point to how much those owners paid for their 2600K originally, not what it sells for right now. That evens the playing field a little in terms of "bragging," but if I were buying today there is no way I'd go for a Bulldozer over Sandy (or Ivy soon).


----------



## Trogdor

Quote:


> Originally Posted by *Don Karnage;15305735*
> One of my friends who is stationed overseas still is running a B2 board that I had bought for him in january and shipped to him and he has had zero issues with it. I remember reading one thread over at XS but that was it. If you use the sata 3 ports you'll never have an issue.


Sounds like you don't know too much about it. Intel says you'll have a problem within 3 years.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *Trogdor;15307956*
> Sounds like you don;t know too much about it. Intel says the you'll have a problem within 3 years.


You'll never have any issues if you never use the affected SATA ports...


----------



## Wbroach23

TRogDor Burninatin all the people In the Niiuiiiight!!!!!!!!


----------



## Nocturin

Quote:


> Originally Posted by *Majin SSJ Eric;15307999*
> You'll never have any issues if you never use the affected SATA ports...


That is a terrible argument, as I stated earlier.

"Look, my mobo has 8 SATA ports!!!1!" ...but I can only use 6 of them anyway. I could have bought the one with 6 for $50 cheaper.


----------



## Mad Pistol

Quote:


> Originally Posted by *Dopamin3;15307536*
> I hope you're not referring to me... Look at the link I just posted on guru3d, the *FX 6100 loses fairly often to the older Phenom II architecture, even the quads!* If you want to "tinker" on AMD you would be better off buying the 960T. You have the fun of unlocking cores and overclocking any way you want.
> 
> And by the way, check out my second computer, that I upgraded the motherboard for specifically Bulldozer. I'm just as disappointed as all the AMD fangirls. I buy either brand determined on price/performance and Bulldozer offers neither over previous Phenom II models and especially Sandy Bridge.


That's to be expected, though. IPC on Bulldozer is lower, so the clocks have to be higher. Since the clock speed isn't that much higher (if at all), BD just loses to Deneb and Thuban. It's sort of embarrassing for AMD's 3-year-old architecture to beat their newest one.
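
That IPC-versus-clock tradeoff can be sketched with back-of-the-envelope numbers. The IPC and clock figures below are purely illustrative assumptions for the sake of the arithmetic, not measured values for any real chip:

```python
# Rough model: per-core throughput scales as IPC x clock frequency.
# All numbers here are illustrative assumptions, not benchmark results.

def throughput(ipc: float, clock_ghz: float) -> float:
    """Billions of instructions retired per second on one core."""
    return ipc * clock_ghz

old_arch = throughput(ipc=1.00, clock_ghz=3.7)  # hypothetical older core
new_arch = throughput(ipc=0.85, clock_ghz=3.6)  # assume ~15% lower IPC, similar clock

print(f"old: {old_arch:.2f}  new: {new_arch:.2f}")  # prints "old: 3.70  new: 3.06"

# Clock the lower-IPC core would need just to break even with the old one:
print(f"break-even clock: {old_arch / 0.85:.2f} GHz")  # prints "break-even clock: 4.35 GHz"
```

Under those assumed numbers, the lower-IPC design would need roughly a 4.35 GHz clock just to match the older core, which is why so much of the discussion centers on clock speed and overclocking headroom.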


----------



## sLowEnd

Quote:


> Originally Posted by *jrbroad77;15294931*
> Now come on! Phenom X4 came out after Core 2 Quad. It was no competition, a Q6600 at 4ghz blew away a 9950 at 3.2ish(max OC vs max OC).
> 
> Anyway the 8-core BD comes close to competing with Intel's best quads. I guess that's good? Doesn't really beat a 2600k multithreaded even (well, 20% on occasion, with double the transistor count)


lol, you'd need a golden Q6600 for 4GHz.

Many top out at a 3.6GHz 24/7 OC on air.


----------



## Chuckclc

I'm still not 100% convinced BD cannot improve with BIOS updates. When a BIOS comes out that seems to be more targeted at the CPU, then I will concede defeat.


----------



## mad0314

Quote:


> Originally Posted by *Nocturin;15308132*
> that is a terrible argument, as i stated earlier.
> 
> look my mobo has 8 sata ports!!!1! ... but i can only use 6 of then anyways
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i coulda bought the one with 6 for 50$ cheaper


His argument was about said person overseas still using a B2 board, and it is true in his case.

Also, Intel did not say that you WILL undoubtedly have problems in exactly 3 years, 100% of the time.


----------



## Dopamin3

Quote:


> Originally Posted by *Mad Pistol;15308135*
> That's to be expected though. IPC on bulldozer is lower, so the clocks have to be higher. Since the clock speed isn't that much higher (if at all), BD just loses to Deneb and Thuban. Sort of embarrassing for AMD's 3 year old architecture to beat their newest one.


How is that to be expected? It's a brand-new CPU architecture; why would IPC decrease compared to the previous gen? Furthermore, you have JF-AMD swearing up and down that IPC would increase (see my signature; that was posted on XS, but he also posted it here). You would expect a newer architecture to be faster and more efficient, not slower and with higher power draw. I really don't understand why anyone is even defending BD. It really is terrible, and you've also got AMD lying on their YouTube channel, comparing the 980X (an EOL CPU, lulz) to BD in Cinebench when it's actually the 2500K. Instead of correcting the video, they locked the comments...


----------



## GTR Mclaren

lol

double Failure this year

Sandy Bridge recall and Bulldozer performance


----------



## sintricate

Quote:


> Originally Posted by *Dopamin3;15308749*
> It really is terrible and you've also got AMD lying on their youtube channel comparing the 980x (EOL CPU lulz) to BD in cinebench when it's actually the 2500k. Instead of correcting the video they lock the comments...


Not to mention, when you look at the end of the video where they show the specs for all the systems tested, you see that they used 1066 and 1333MHz memory for the Intel systems and 1600MHz for the AMD systems, and then compared results only where BD wins. They're all over the board hand-picking bench results: one minute they're comparing it to a 2500K, then a 2600K, and so on...


----------



## sintricate

Quote:


> Originally Posted by *GTR Mclaren;15309185*
> lol
> 
> double Failure this year
> 
> Sandy Bridge recall and Bulldozer performance


Hopefully AMD can pick themselves back up like Intel did.


----------



## tafkar

Quote:


> Originally Posted by *Chuckclc;15308225*
> Im still not 100% convinced BD cannot improve with Bios updates. When I Bios comes out that seems to be more targeted for the cpu then I will concede defeat.


Honestly, if you want to say "I believe in AMD," your money would be better spent buying sixty shares of their stock and selling them off when they have a good enough product to drive the price back up. If this magical BIOS fix happens, your good faith may pay for an SSD by the time the product actually becomes something worth buying!

Quote:


> Originally Posted by *GTR Mclaren;15309185*
> lol
> 
> double Failure this year
> 
> Sandy Bridge recall and Bulldozer performance


_Double failure all the way across the board? So intense!_


----------



## B3anbag

Quote:


> Originally Posted by *alancsalt;15303213*
> I suspect there may be some truth in this:
> 
> *...That changed before I left - they started to rely on synthesis tools, automatic place and route tools, etc...*
> 
> http://www.insideris.com/amd-spreads-propaganda-ex-employee-speaks-out/
> 
> AMD used to lead the way. From 2000 to 2007 all my systems were AMD.


If AMD supplies the design or blueprint (whatever) based on the hand-crafted CPUs to those companies, why can't the machines duplicate them correctly? Or are we dealing with something along the lines of a Stradivarius violin duplication problem?


----------



## NirXY

Quote:


> Originally Posted by *Roll Cam Tide;15307856*
> I've seen a lot of the Intel fanboy bragging as well, BD buyers can still point to how much they paid for their 2600k originally, not what it sells for right now. That helps even the playing field a little in terms of "bragging" but if I was buying today there is no way I'd go for a bulldozer over Sandy (or Ivy soon)


2600k price hasn't changed since launch.


----------



## Nocturin

Quote:


> Originally Posted by *mad0314;15308667*
> *His argument was for said person overseas still using a B2 board, and it is true in his case.*
> 
> Also, Intel did you say that you WILL undoubtedly have problems in exactly 3 years 100%.


The attitude surrounding a billion-dollar recall is astounding.

I'm done with this; if you want to continue, PM me.


----------



## BallaTheFeared

The i5-2500k is $204 on newegg, just sayin...


----------



## jjsoviet

So everyone calls BD a failure. Sure, when comparing against Sandy Bridge and Ivy Bridge. But is it _that_ much of a failure if upgrading from a Phenom X4 to an FX 6100 won't make much of a difference in benches and real-world performance?


----------



## jivenjune

Quote:


> Originally Posted by *jjsoviet;15309998*
> So everyone calls BD a failure. Sure, when comparing against Sandy Bridge and Ivy Bridge. But is it _that_ of a failure if upgrading from a Phenom X4 to an FX 6100 series won't be that much of a difference in benches and real-world performance?


The problem isn't that Bulldozer couldn't beat Sandy Bridge, but that it often gets tossed aside by even the Phenom. While there are scenarios in which the FX series may win here and there, the Phenom series offers an overall better balance of performance, especially if you play games at all.

Most AMD fans probably wouldn't mind buying this processor if they didn't feel like they were taking a step backwards from their current Phenom II processors.


----------



## tafkar

Quote:


> Originally Posted by *jjsoviet;15309998*
> So everyone calls BD a failure. Sure, when comparing against Sandy Bridge and Ivy Bridge. But is it _that_ of a failure if upgrading from a Phenom X4 to an FX 6100 series won't be that much of a difference in benches and real-world performance?


If I wanted to spend $190 to see no positive result, I'd take up smoking again.


----------



## jjsoviet

Quote:


> Originally Posted by *jivenjune;15310179*
> The problem isn't that Bulldozer couldn't beat Sandy Bridge but that it often gets tossed aside by even the Phenom. While there are scenarios in which the FX series may win here and there, the Phenom series offers an overall better balance of performance, especially if you play games at all.
> 
> Most AMD fans probably wouldn't mind buying this processor if they didn't feel like the were taking a step backwards from their current Phenom II processors.


Quote:


> Originally Posted by *tafkar;15310221*
> If I wanted to spend $190 to see no positive result, I'd take up smoking again.


Then I guess I've saved myself $200 for something else.


----------



## redalert

Quote:


> Originally Posted by *jjsoviet;15310248*
> Then I guess I have saved myself $200 for something else, then.


An SSD, or memory, since it's dirt cheap.


----------



## Trogdor

Quote:


> Originally Posted by *Majin SSJ Eric;15307999*
> You'll never have any issues if you never use the affected SATA ports...


Lol, that's true. I use 5 of the 6 SATA ports on my MB, though, so that wouldn't be an option for people with busier builds.


----------



## CramComplex

Any reviews yet from OCN members?


----------



## Nocturin

Quote:


> Originally Posted by *BallaTheFeared;15309943*
> The i5-2500k is $204 on newegg, just sayin...


call me when the 2600k is that price


----------



## bucdan

Given my current specs, would it be more worth it in the short run to get a Phenom II X6 and possibly a new GPU, or to overhaul to Bulldozer or a 2500K? I don't really plan on doing more than casual gaming with new games; I don't expect high quality, just the usual surfing of the net.


----------



## paulerxx

Quote:


> Originally Posted by *bucdan;15310869*
> Would it be more worth it in the short run as per my current specs to get a phenom ii x6 and possibly a new gpu or a overhaul to bulldozer/ or 2500k? I don't really plan on doing more than just casual gaming with new games, don't expect high quality and just the usual surfing the net.


Get a new GPU for now, I'm still gaming hard on my computer and it's quite dated.


----------



## tafkar

Quote:


> Originally Posted by *bucdan;15310869*
> Would it be more worth it in the short run as per my current specs to get a phenom ii x6 and possibly a new gpu or a overhaul to bulldozer/ or 2500k? I don't really plan on doing more than just casual gaming with new games, don't expect high quality and just the usual surfing the net.


Is your 720 actually sluggish in any of your current use scenarios?

If not, upgrading your video card to a 6850 for $150 would probably be the most noticeable and cost-effective thing you could do. You'll see a difference in those newfangled GPU-accelerated web browsers and DirectX 11 games.


----------



## ikem

Here's another.

http://www.guru3d.com/article/amd-fx...mance-review/1


----------



## tafkar

Quote:



Originally Posted by *ikem*


heres another.

http://www.guru3d.com/article/amd-fx...mance-review/1


Now taking bets on how many times that SHA1 score gets quoted.


----------



## 2010rig

Quote:



Originally Posted by *GTR Mclaren*


lol

double Failure this year

Sandy Bridge recall and Bulldozer performance


Sandy Bridge got fixed within a month. Bulldozer, on the other hand...

Quote:



Originally Posted by *tafkar*


Now taking bets for how many times that SHA1 score is quoted.


Bulldozer won a benchmark!


----------



## Quantum Reality

The fact that a Phenom II X6 can already (just barely) beat Sandy Bridge in SHA1 makes the BD score somewhat less impressive.
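
For context, the SHA1 tests being cited are single-threaded hash-throughput loops, so they mostly reward per-core integer performance. A rough sketch of that kind of microbenchmark (an illustration, not any reviewer's actual test code) might look like:

```python
import hashlib
import time

# Minimal SHA1 throughput microbenchmark, similar in spirit to the hashing
# tests in CPU review suites. Purely a sketch; parameters are arbitrary.
def sha1_throughput(total_mb: int = 64, chunk_kb: int = 64) -> float:
    """Hash `total_mb` MiB of zeros in `chunk_kb` KiB chunks; return MiB/s."""
    chunk = b"\x00" * (chunk_kb * 1024)
    iterations = (total_mb * 1024) // chunk_kb
    h = hashlib.sha1()
    start = time.perf_counter()
    for _ in range(iterations):
        h.update(chunk)
    elapsed = time.perf_counter() - start
    return total_mb / elapsed

if __name__ == "__main__":
    print(f"SHA1: {sha1_throughput():.0f} MiB/s")
```

Because the loop runs on a single core, winning it says little about how a chip behaves in mixed or lightly threaded workloads, which is why one benchmark victory carries limited weight.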


----------



## Jackeduphard

This just makes me sad: a $160 CPU is almost the same as a "new" $250 one...


----------



## Ghoxt

Quote:



Originally Posted by *CramComplex*


Any reviews yet from OCN members?


Exactly who is going to be the guinea pig and buy one from OCN's members? Good question... I can unequivocally say it won't be me.

If Bulldozer were a 15% performance increase over the 2600K for equal money, or equal performance for less money (value), then I would have taken the plunge.

Back to those guys that drove a single BD core up to 8+ GHz... ummm, never mind.

Actually, I will say it: what the hell did they measure for power usage during their world record? Did they happen to notice the neighborhood power outage?


----------



## dzalias

Quote:



Originally Posted by *Ghoxt*


Exactly who is going to be the guinea pig and buy one from OCNs members? Good question... I can equivocally say it wont be me.

If Bulldozer was a 15% performance increase over 2600K for equal money, or equal performance for less money(value) then I would have taken the plunge.

Back to those guys that drove a single BD core up to 8+Ghz...ummm, nevermind







Actually I will say it, what the hell did they measure for power usage during their world record? Did they happen to notice the neighborhood Power Outage?










Why would BD have to be cheaper AND better or equally performing for you to buy it? Fanboy.


----------



## Schmuckley

It'd have to be cheaper and/or better for anyone to buy it; that's just supply and demand. For me, it's not cheaper or better, so I'm not buying it. Then there's the power usage...









----------



## Schmuckley

Quote:



Originally Posted by *bucdan*


Would it be more worth it in the short run, as per my current specs, to get a Phenom II X6 and possibly a new GPU, or an overhaul to Bulldozer or a 2500K? I don't really plan on doing more than just casual gaming with new games; I don't expect high quality, just the usual surfing the net.


I'd go with the Thuban + GPU combo.








Wait... your mobo is suspect for running a Thuban...


----------



## 2010rig

Quote:



Originally Posted by *Ghoxt*


Exactly who is going to be the guinea pig and buy one from OCNs members? Good question... I can equivocally say it wont be me.

If Bulldozer was a 15% performance increase over 2600K for equal money, or equal performance for less money(value) then I would have taken the plunge.

Back to those guys that drove a single BD core up to 8+Ghz...ummm, nevermind







Actually I will say it, what the hell did they measure for power usage during their world record? Did they happen to notice the neighborhood Power Outage?










El Gappo has one, and he's been running benches, plus has an overclocking guide.

http://www.overclock.net/amd-general...ing-guide.html


----------



## DayoftheGreek

Quote:



Originally Posted by *dzalias*


Why would BD have to be cheaper AND better or equally performing for you to buy it? Fanboy.


Well if you want performance, cheaper and lower performing isn't exactly what you want. Stop insulting people for no reason.


----------



## Ghoxt

Quote:



Originally Posted by *dzalias*


Why would BD have to be cheaper AND better or equally performing for you to buy it? Fanboy.


Read what I said again... you misquoted me. Fanboy? You don't know this war vet... n/m, holding back an insult. Moving on...

To be specific, I'm running a 920, so why would I buy a BD over a 2600K? Naturally, if it's *equal performing*, I'd pick the one with the lower price. As would we all.

Consequently, if it flat-out outperformed, then I'd make that choice. I never said cheaper and better together as one condition...


----------



## Chuckclc

Quote:



Originally Posted by *Chuckclc*


You know, I think I have been convinced; I think I am going to get a Bulldozer chip anyway. Looks like I am going to get the FX-6100, since all the focus is on the 8150. I will probably get my chip this weekend from Microcenter, if they have any in stock. If they do, I will post current benches with my PhII chip at 4.2-4.4GHz and then use the same benches for the FX-6100.


You know what? No, no I am not. I just cannot bring myself to do it. I can use that money to get all kinds of other stuff. I just have to wait and see if AMD releases some kind of fixed chip, or if there is a better BIOS or something.


----------



## XtremeCuztoms

Nuff Said


----------



## Lucky 13 SpeedShop

Quote:



Originally Posted by *Chuckclc*


You know what? No, no I am not. I just cannot bring myself to do it. Can use that money to get all kinds of other stuff. I just have to wait and see if AMD releases some kind of fixed chip, or if there is a better Bios or something.


I wouldn't either. Frankly, I'm all kinds of p.o.'d about this whole fiasco too. When my friend gets back from visiting his family in Spain, he's picking this system up from me, and I'm going elsewhere for a CPU until things turn around. IF they ever turn around...

Quote:



Originally Posted by *XtremeCuztoms*











Nuff Said


Truth. The only thing I'm worried about now is exactly how badly I'm screwed when it comes to running Hypers with Sandy.


----------



## XtremeCuztoms

Quote:



Originally Posted by *Lucky 13 SpeedShop*


I wouldn't either. Frankly, I'm all kinds of p.o.'d about this whole fiasco too. When my friend gets back from visiting his family in Spain, he's picking this system up from me, and I'm going elsewhere for a cpu until things turn around. IF they ever turn around...

Truth. The only thing I'm worrying with now is exactly how bad I'm screwed when it comes to running Hyper's with Sandy?



Quote:



STT WX200UB2G7


I'd look for something else. Most people running those had issues with the boards even booting.


----------



## crossy82

Just a thought, I've not seen it mentioned, but won't motherboard prices go up after this to recoup lost revenue from the lack of sales of AM3+ boards?

Surely this will sting badly for Asus, Gigabyte, MSI, etc. Also, if so, won't that most likely be passed on to, say, X79 boards?


----------



## Lucky 13 SpeedShop

Quote:



Originally Posted by *XtremeCuztoms*


I'd look for something else.. Most people running those had issues with the boards even booting up running those.


Yeah, I'd heard there were problems. But they never elaborated on exactly what the problems were, or their severity. That... suuuucks. It's worse than I'd hoped for.









I feel pretty funny thanking someone for bad news, but thanks anyway


----------



## Chuckclc

I was actually thinking of picking up a 2500K tomorrow, but there are none in stock at my MC. They have 2600Ks, and they are only $249.99, but that's a little out of my budget. Maybe it's a sign: I'm ready to get a 2500K and there are none, so maybe I should wait and see how BD turns out.


----------



## tubers

Quote:



Originally Posted by *Chuckclc*


I was actually thinking of picking up a 2500K tomorrow but there are none in stock at my MC. They have 2600K's and they are only 249.99 though. But thats a little out of my budget. Maybe its a sign. Im ready to get a 2500K and there is none, maybe I should wait and see how BD turns out.


It's a sign that the 2500K's worth getting; that's why FATE is trying to keep it from you, LOL!


----------



## Chuckclc

Quote:



Originally Posted by *tubers*


It's a sign that the 2500k's worth getting that's why FATE is trying to keep it from you LOL!


Probably. Before Halloween I am about 80% sure I will be sporting one.


----------



## Lucky 13 SpeedShop

Quote:



Originally Posted by *Chuckclc*


Probably. Before Halloween I am about 80% sure i will be sporting one.


You and I both bro.


----------



## BankaiKiller

Can the 8-core Bulldozer CPU pump out more PPD than my six-core Phenom, 2500K, and 2600K, though?


----------



## robbo2

No, it gets beaten in folding by all three of those chips.




Was hoping for a folding beast.

Actually, if you're at 3.2 then they might be similar.


----------



## Chuckclc

Quote:


> Originally Posted by *BankaiKiller;15316063*
> Can the 8core bulldozer cpu pump out more ppd then my phenom sixcore, 2500k, and 2600k though?


Not sure yet. Within the next week I am sure someone from OCN will test the folding on one of these and let everyone know. My guess would be no, though, based on its single-threaded benches.


----------



## Bit_reaper

Quote:


> Originally Posted by *Chuckclc;15315848*
> I was actually thinking of picking up a 2500K tomorrow but there are none in stock at my MC. They have 2600K's and they are only 249.99 though. But thats a little out of my budget. Maybe its a sign. Im ready to get a 2500K and there is none, maybe I should wait and see how BD turns out.


It's a sign to get the 2600K; at that price it's a steal (at least compared with the prices I have seen). Bite the bullet and pay the premium.


----------



## snelan

I have hope in Piledriver, but in my eyes Bulldozer was more like a prototype and shouldn't have been released.


----------



## hokiealumnus

Quote:


> Originally Posted by *Chuckclc;15316160*
> Not sure yet. Within the next week i am sure someone from OCN will test the folding of one of these and let everyone know. My guess would be no though. Based on its single threaded benches.


Didn't make it into our article, but I did fold with it and posted the results in our first comment:
Quote:


> Originally Posted by *hokiealumnus*
> I had promised folding results to Shelnutt2, but they couldn't make the review. So, they will be posted in the first post!
> 
> Regular SMP work unit - 13698.9ppd
> 
> Bigadv work unit - 13859.2ppd
> 
> So between 13,500 and 14,000 at stock, which is right where it's positioned - around the PPD of a 2500K.


My $.02 is that, for DC builds, no one without free electricity is going to touch one of these with a ten-foot pole.


----------



## 2010rig

I thought for sure that it would've been a folding beast. High power consumption and low PPD numbers. Pathetic.


----------



## Don Karnage

JF posted over at Anandtech. Hopefully not a repost.

http://forums.anandtech.com/showpost.php?p=32421412&postcount=106


----------



## Rookie1337

Quote:


> Originally Posted by *Don Karnage;15318094*
> JF posted over at Anandtech. Hopefully not a repost.
> 
> http://forums.anandtech.com/showpost.php?p=32421412&postcount=106


Sadly for JF it's going to fall on deaf ears. I mean I got attacked for just showing that with 1866 RAM and a 4.6GHz clock the 8150 could almost match a stock clocked 980x in transcoding. I don't understand why people become so attached to companies or products. It's kind of scary.


----------



## james8

^of course you would get attacked.
comparing a 6-core, last-gen, 3.3 GHz CPU with an 8-core, just-released, 4.6 GHz CPU


----------



## Don Karnage

Quote:


> Originally Posted by *Rookie1337;15318189*
> Sadly for JF it's going to fall on deaf ears. I mean I got attacked for just showing that with 1866 RAM and a 4.6GHz clock the 8150 could almost match a stock clocked 980x in transcoding. I don't understand why people become so attached to companies or products. It's kind of scary.


It's like diehard sports fans. They just go crazy. I personally don't care who's better. I just like to argue, though.


----------



## kromar

I'm really sad that this chip turns out to be so weak and power-hungry; I would have loved to see some competition...


----------



## YangerD

Quote:


> Originally Posted by *kromar;15318605*
> im really sad that this chips turns out to be so weak and power hungry i would have loved to see some competition...


+1, I was hoping for a new chip from AMD to upgrade the 965 in my gaming rig.


----------



## GameBoy

Quote:


> Originally Posted by *james8;15318350*
> ^of course you would get attacked.
> comparing a 6-core last-gen 3.3 GHz CPU with a 8-core just-released 4.6 GHz CPU


The fact that it's being compared to a last-gen CPU, or the fact that it has fewer cores, or whatever, is completely irrelevant. The only thing that matters is the performance it gives for the price. I'm not defending Bulldozer or its performance. I'm merely pointing out how superfluous your bickering is.

And as for everyone being so butt-hurt because of Bulldozer - that's what you get for having ridiculous expectations.


----------



## DayoftheGreek

Quote:


> Originally Posted by *GameBoy;15319146*
> And as for everyone being so butt-hurt because of Bulldozer - that's what you get for having ridiculous expectations.


When people with X6's and X4's are calling it a side-grade at best, something is wrong. I wouldn't call those ridiculous expectations at all.


----------



## toX0rz

Indeed, it has nothing to do with high expectations; it's just ridiculous if a company puts out their new-gen CPU and it can't even beat their previous generation after five years of development.


----------



## GameBoy

Quote:


> Originally Posted by *DayoftheGreek;15319506*
> When people with X6's and X4's are calling it a side-grade at best, something is wrong. I wouldn't call those ridiculous expectations at all.


Bulldozer is just OK. I think it's mainly the inconsistent performance across software that's making it look worse. Considering AMD's position in the past few months/years, this kind of flaky launch was almost inevitable.

Hardware usually comes before software. And the Bulldozer architecture itself is pretty solid, so future AMD CPUs (revisions/die shrinks) based on it _should_ be a lot better.


----------



## Wishmaker

Quote:



Originally Posted by *Rookie1337*


Sadly for JF it's going to fall on deaf ears. I mean I got attacked for just showing that with 1866 RAM and a 4.6GHz clock the 8150 could almost match a stock clocked 980x in transcoding. I don't understand why people become so attached to companies or products. It's kind of scary.


Don't you find it a bit odd that AMD called out Intel for the jiggawatts race and now they are doing exactly that? Surely you have to see how ridiculous this claim is, that if you clock BD at 4.6 GHz it is as good as the 980X.


----------



## GameBoy

Quote:


> Originally Posted by *Wishmaker;15320053*
> *Don't you find it a bit odd that AMD called out intel for the jiggawatts race and now they are doing exactly that?* Surely you have to see how ridiculous this claim that if you clock BD at 4.6 GHz it is as good as the 980x.


Who cares? It's just corporations being corporations. They all do it.


----------



## a pet rock

Quote:


> Originally Posted by *DayoftheGreek;15319506*
> When people with X6's and X4's are calling it a side-grade at best, something is wrong. I wouldn't call those ridiculous expectations at all.


Didn't people on the x58 platform call Sandy Bridge a side-grade at best?


----------



## RedCloudFuneral

Quote:



Originally Posted by *a pet rock*


Didn't people on the x58 platform call Sandy Bridge a side-grade at best?


No, Sandy Bridge is faster; it's just that for gaming X58 is fast enough.


----------



## oicw

Quote:



Originally Posted by *a pet rock*


Didn't people on the x58 platform call Sandy Bridge a side-grade at best?


Ugh, that must be a pretty steep side slope then









Phenom to BD is a pretty steep side slope too ...... downwards







(except encoding and Winrar)


----------



## 8ight

Quote:



Originally Posted by *RedCloudFuneral*


No, Sandy Bridge is faster, its just that for gaming x58 is fast enough.


It depends on the application, hot-shot.


----------



## RedCloudFuneral

Quote:



Originally Posted by *8ight*


It depends on the application, hot-shot.


Ok, find me a game that runs faster on an X58 chip.


----------



## CULLEN

Has anyone seen this? http://quinetiam.com/?p=2356


----------



## a pet rock

Quote:



Originally Posted by *RedCloudFuneral*


No, Sandy Bridge is faster, its just that for gaming x58 is fast enough.



Quote:



Originally Posted by *oicw*


Ugh, that must be a pretty steep side slope then










I never said they were right. I just seem to recall a very significant number of i7 9xx owners raging about how Sandy Bridge was only a side grade on these very forums. It'll be interesting to see how things pan out.


----------



## dklimitless

Quote:



Originally Posted by *CULLEN*


Has anyone seen this? http://quinetiam.com/?p=2356


You just made me very, very, very, very happy. In fact, I could run to your place and kiss you right now







.

Bulldozer is definitely awaiting tweaks (both software and hardware). Oh, I can't wait for BD to spank SB so fanboys just shut up and appreciate tech. ^.^


----------



## FtL1776

8 cores and 125W TDP.

Isn't efficiency the AMD meme? Twice as many physical cores and more power usage, yet it's slower.


----------



## PvtHudson

Quote:



Originally Posted by *CULLEN*


Has anyone seen this? http://quinetiam.com/?p=2356


This is some dude's blog. Hardly real news.

And:

Quote:



Look for a 40% performance boost if this works… more to come.


Key word "if".

Also he writes:

Quote:



I hate overclocking…


So he's not exactly a benchmarker either.

These are some dude's opinions and hopes that a software fix is going to redeem Bulldozer. If you check the Tom's Hardware review under Windows 8, it still gets destroyed by a 2500K.


----------



## Spicy61

Quote:



Originally Posted by *CULLEN*


Has anyone seen this? http://quinetiam.com/?p=2356


Good post.


----------



## Phantom123

Quote:



Originally Posted by *CULLEN*


Has anyone seen this? http://quinetiam.com/?p=2356


If that is true, then I feel bad now for going Intel.


----------



## MPIXAPP

My laptop has a Sandy Bridge i7:
4 cores, 8 threads, and its max TDP is 45W!


----------



## Majin SSJ Eric

Quote:



Originally Posted by *CULLEN*


Has anyone seen this? http://quinetiam.com/?p=2356


I guess we just take that guy's word for it, huh? No proof, no screens, just his word. Not saying this isn't true but not saying it is either.


----------



## NickSim86

Quote:



Originally Posted by *MPIXAPP*


My laptop has an i7 sandy bridge ..
4 cores , 8 threads and it's max TDP is 45W !


and runs at 2.0GHz


----------



## MPIXAPP

Quote:



Originally Posted by *NickSim86*


and runs at 2.0GHz


and it turbo boosts to 2.9GHz...

So this extra 1GHz from Bulldozer requires about 200W?!


----------



## ZealotKi11er

BD is a completely new architecture, and it will probably take Windows 8 to use it properly. No one here should think of it as an 8-core CPU. Also, right now it might be slower than the X6 in some applications, but that could very well be because of Windows scheduling problems. BD has 8 cores which, when all are used, are roughly as fast as 6-7 Phenom II cores; this could really drop performance in single-threaded applications if the cores are operating in 8-core mode. For example, a single BD core might offer ~75% of the performance of a full module, which could mean a 25% or greater drop in performance when not all cores are used. In short, that article might have some truth behind it.
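A minimal sketch of the module-sharing arithmetic described above (the 0.8 shared-thread figure is an assumed, illustrative CMT penalty, not a measured one):

```python
# Relative per-thread speed: alone on a module vs. sharing one.
# SHARED = 0.8 is a hypothetical figure for illustration only.
ALONE, SHARED = 1.0, 0.8

# A module-unaware scheduler may pack 4 threads onto 2 modules:
packed = 2 * (2 * SHARED)   # two modules, two threads each

# A module-aware scheduler spreads them across all 4 modules:
spread = 4 * ALONE          # four modules, one thread each

penalty = 1 - packed / spread
print(f"throughput lost to bad scheduling: {penalty:.0%}")
```

Under these assumed numbers, naive packing costs 20%, which is in the same ballpark as the 25% scheduler-related loss speculated about in the post above.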


----------



## cusideabelincoln

Quote:



Originally Posted by *Don Karnage*


JF posted over at Anandtech. Hopefully not a repost.

http://forums.anandtech.com/showpost...&postcount=106


It's pretty pathetic that users here jumped on him, accusing him of lying, by taking his statements out of context. Think before you troll.


----------



## toX0rz

Quote:



Originally Posted by *CULLEN*


Has anyone seen this? http://quinetiam.com/?p=2356


lol 40% performance boost, sure.

If this were the case and BD were really underperforming by 40-70% like this guy says, AMD would have said something about it.

Also, I'm pretty sure I read something about a 10% performance boost with Win 8, not 40%.


----------



## ZealotKi11er

Quote:



Originally Posted by *toX0rz*


lol 40% performance boost, sure.

If this was the case and BD would really be underperforming by 40-70% like this guy says, AMD would have said something about this.

Also, Im pretty sure I read something about a 10% performance boost with win 8, not 40%.


It's because it's underperforming compared to the PII. Just getting it to be as fast as the PII would be a 30% increase in performance.


----------



## hammertime850

Quote:



Originally Posted by *toX0rz*


lol 40% performance boost, sure.

If this was the case and BD would really be underperforming by 40-70% like this guy says, AMD would have said something about this.

Also, Im pretty sure I read something about a 10% performance boost with win 8, not 40%.


Someone earlier in this thread posted Windows 8 results: a 1-5% increase.


----------



## CULLEN

Quote:



Originally Posted by *Majin SSJ Eric*


I guess we just take that guy's word for it, huh? No proof, no screens, just his word. Not saying this isn't true but not saying it is either.


I just asked if anyone had seen it. I'm not saying this is the complete truth, but some of it makes sense.

And who knows, maybe there will be a "service pack/patch" for Bulldozer that will show dramatic changes.


----------



## robert c james

Hopefully 40% would let them drive the PII X6 down into my price range a LOT faster.


----------



## Acefire

While everyone is waiting for a Bulldozer fix, I will be getting my utility out of a processor that worked the day it was launched. By the time they fix this damn thing it will be too late to compete anyway. Double fail...


----------



## Majin SSJ Eric

Quote:



Originally Posted by *CULLEN*


I just asked if anyone had seen it. Not saying this is the complete truth but some of it makes sens.

And who knows, maybe there will be a "service/patch" pack for Bulldozer that will show dramatic changes.


Would be interested to see if a patch was all BD needed to spank SB in highly threaded apps. I doubt it would be faster in single threaded apps no matter what they do though...


----------



## ACHILEE5

Quote:



Originally Posted by *Majin SSJ Eric*


Would be interested to see if a patch was all BD needed to spank SB in highly threaded apps. I doubt it would be faster in single threaded apps no matter what they do though...


Like, a patch that allows it to cheat


----------



## otakunorth

AHHH
all the intel users are trolls and the amd users have gone ******ed


----------



## DayoftheGreek

Quote:



Originally Posted by *otakunorth*


AHHH
all the intel users are trolls and the amd users have gone ******ed


----------



## ACHILEE5

Quote:



Originally Posted by *otakunorth*


AHHH
all the intel users are trolls and the amd users have gone ******ed


In my defence! That is my "one and only" vent about BD








And, I feel gutted for those that bought new mobos already









So yeah,


----------



## Naturecannon

Chasing rainbows??

5 years in the works, released as a dud, and a patch a week later? Are you serious?

Never going to happen... There's no way to spit-shine the BD turd besides a revision. A patch would have already been released before the reviews hit the web. Get realistic, guys! You've chased BD long enough.


----------



## PakJai

They need new steppings, like the Intel i7 920 C0 to D0, and they also need to make amends on power consumption; it's ludicrous! They cannot hope to have high overclockability and performance with a chip that drinks voltage like running tap water and gets bloated easily.


----------






## Rookie1337

Quote:



Originally Posted by *ACHILEE5*


Like, a patch that allows it to cheat










Intel had one of those a while ago; I sometimes think it's still there. Good old compiler bias, IIRC. It told software built with it to run slower on AMD hardware.


----------



## Roll Cam Tide

With all I've read on the terrible L2 and L3 cache latencies with BD, I doubt any hotfix will do anything to fix this generation. IF the cache latency problem is true, hopefully they fix it with the next revision.


----------



## 8ight

Quote:



Originally Posted by *RedCloudFuneral*


Ok, find me a game that runs faster on an X58 chip.


I said it depends on the application, where multithreading and memory bandwidth are important. Though only the 32nm X58 chips are competitive now, and I didn't mention gaming.


----------



## 8ight

Quote:



Originally Posted by *MPIXAPP*


and turbo boost to 2.9GHz ..

So this 1GHz of bulldozer requires about 200W ?!


Not all 4 cores go to 2.9GHz; some get clocked way down or disabled. Mobile chips are binned to have exceptionally low VIDs, meaning low TDPs for battery life's and heat output's sake.


----------



## Jinny1

No matter what kind of patch, it's not going to reduce your quadrupled power bill from using FX.


----------



## Dr. Zoidberg

Seems like that Quin Etiam guy was lying about the registry fix for Bulldozer.

Here's some proof:

http://kubuntuforums.net/forums/index.php?topic=3118749

Also what makes him even more suspicious is that he has opened up new accounts on anandtech and guru3d forums.

Here is one of his threads on guru3d:

http://forums.guru3d.com/showthread.php?t=352281


----------



## 2010rig

Quote:



Originally Posted by *Dr. Zoidberg*


Seems like that Quin Etiam guy was lying about the registry fix for bulldozer.

Here's some proof:

http://kubuntuforums.net/forums/index.php?topic=3118749

Also what makes him even more suspicious is that he has opened up new accounts on anandtech and guru3d forums.

Here is one of his threads on guru3d:

http://forums.guru3d.com/showthread.php?t=352281


This is why we should ALWAYS believe random dudes on the Internet, on top of believing hardware marketing reps as well.

/sarcasm

Good find. Also, this:
http://forums.anandtech.com/showthread.php?t=2198259


----------



## Seronx

Quote:


> Originally Posted by *2010rig;15326072*
> This is why we should ALWAYS believe random dudes on the Internet, on top of believe Marketing Hardware reps as well.


We shouldn't believe anyone, no sarcasm









Everyone is a communistic heathen SPY!!!


----------



## 2010rig

Quote:


> Originally Posted by *Seronx;15326106*
> We shouldn't believe anyone no sarcasm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Everyone is a communistic heathen SPY!!!


If someone can back up their statements with facts, why shouldn't they be believed? Unless said statements are deceitful, and not real. Not talking about anyone in particular.

Anyway, that guy has not shown any proof of anything, so it's hard to believe what he claims.


----------



## boosted6

Quote:


> Originally Posted by *MPIXAPP;15321522*
> and turbo boost to 2.9GHz ..


I disabled turbo boost!

I like this trick.




After reading through hundreds of pages, I'm going to get an X4 960T to replace my X2 555.


----------



## Kaze105

Quote:


> Originally Posted by *CULLEN;15321111*
> Has anyone seen this? http://quinetiam.com/?p=2356


I'm sure AMD would have noticed if this could have been done, and would probably have said so. (Considering they did give a reply regarding reviews, saying their tests make it even with the 2600K in BF3.)

The guy also seems to be talking about aliens and WWIII. Also, a mod on the Kubuntu forums said the following when asked about the blog post above:
Quote:


> generally speaking this sort of work would be done on the ubuntu side, kubuntu work mainly on packaging and kde stuff, not hardware.
> 
> but you can go on freenode irc in #kubuntu-devel and ask them.
> 
> I highly doubt it, haven't heard of this myself


I'm going to assume the 40% increase is not true for now.


----------



## daydream99

How will it do in video editing?


----------



## solar0987

Motherboard sold, Intel motherboard bought.




Happy day, happy day.
I really gave them, as in AMD, a chance, but they let me down for the last time. Going Intel and probably staying Intel.


----------



## otakunorth

Quote:


> Originally Posted by *solar0987;15326944*
> Motherboard sold, Intel motherboard bought
> 
> 
> 
> 
> 
> 
> 
> happy day happy day
> I really gave them as in "amd" a chance but they let me down for the last time going intel and probably staying intel.


when did they let you down before?


----------



## lan cable garrotte string

IDC about Bulldozer's performance, really. When the 8150 gets cheaper, I'll probably make a 24/7 folder with my old PSU. I'm just happy to be part of such an epic thread in hardware news.


----------



## nbmjhk6

Quote:


> Originally Posted by *lan cable garrotte string;15327998*
> Idc about Bulldozer's performance really. When the 8150 gets cheaper, probably make a 24/7 folder with my old psu. I'm just happy to be part of such an epic thread in hardware news.


Don't bother TBH. I hear the PPD on it is TERRIBLE.


----------



## Hukkel

I don't think I've seen this in the first post yet:

http://www.overclockersclub.com/reviews/amd_fx8150/

Overclockersclub review of bulldozer.


----------



## drufause

Quote:


> Originally Posted by *Hukkel;15328360*
> I didn't see this in the first post yet I think:
> 
> http://www.overclockersclub.com/reviews/amd_fx8150/
> 
> Overclockersclub review of bulldozer.


They said 2 million transistors instead of 2 billion.


----------



## Am*

Any reviews mentioning FX 4100/4170 overclocking? I can't find one anywhere and I'm way more interested in how the quad OCs.


----------



## Madmanden

Quote:


> Originally Posted by *drufause;15328473*
> They said 2 million transistors instead of 2 billion.


Well that could explain the performance.


----------



## J.M.D

So with all that hype, BD has finally fallen, for whatever reason. I am still happy with my Phenom II.

AMD usually doesn't perform its best at launch. We need revisions; an FX-II, maybe? A C3 or later stepping of BD could be kick-ass!


----------



## Dublin_Gunner

Quote:


> Originally Posted by *Am*;15328488*
> Any reviews mentioning FX 4100/4170 overclocking? I can't find one anywhere and I'm way more interested in how the quad OCs.


Guru3D BD series review

It doesn't cover overclocking the lower-end chips, but it gives a stock performance breakdown.

I wouldn't bother with the quad, to be honest.


----------



## MoBeeJ

On the contrary, I saw the quad being good. In games it's giving almost the same FPS as its 8-core brother. And I'm sure it will OC higher, and you can focus on NB overclocking.

But still, AMD







...


----------



## Elis

Has JF-AMD posted anything since the 12th?

Just curious is all . . .


----------



## djriful

Quote:


> Originally Posted by *Dr. Zoidberg;15326003*
> Seems like that Quin Etiam guy was lying about the registry fix for bulldozer.
> 
> Here's some proof:
> 
> http://kubuntuforums.net/forums/index.php?topic=3118749
> 
> Also what makes him even more suspicious is that he has opened up new accounts on anandtech and guru3d forums.
> 
> Here is one of his threads on guru3d:
> 
> http://forums.guru3d.com/showthread.php?t=352281


I had to stop reading the Kubuntu forums. Those people are just so clueless about computers.

/facepalm

They got it all wrong about BD, rendering purposes, and GPU purposes. OMG.

Sent from my iPhone using Tapatalk


----------



## Mech0z

Quote:


> Originally Posted by *Elis;15329744*
> Has JF-AMD posted anything since the 12th?
> 
> Just curious is all . . .


Would you, if all you got for sharing information (from a server-guy perspective) in your spare time was death threats? But yes, he did reply, on XS I think it was, about the bull**** he received; he was late to reply due to being away on a business trip (which probably enraged more angry nerds).


----------



## GameBoy

Quote:



Originally Posted by *Elis*


Has JF-AMD posted anything since the 12th?

Just curious is all . . .


Not here IIRC. He has posted on the Anandtech forums.


----------



## pursuinginsanity

What's sad is that BD doesn't perform any better than the Athlon IIs or the Llanos in CPU tests. That's pathetic.

http://www.guru3d.com/article/amd-fx...mance-review/4

^There, an Athlon II X4 645 is hanging with the 6-core BD in the Queens bench, which is obviously very well threaded. Lower clocks, two fewer cores, and still keeping up?

Rofl, look a little lower: the A8-3850 is beating the FX-6100 in Cinebench, for crying out loud.


----------



## drufause

When will the 3700K be out? We might have Piledriver by then.


----------



## GameBoy

Quote:



Originally Posted by *RagingCain*


Especially since the 3700K is specced @ 75~77W with a 37% performance increase.


Where are you pulling this 37% figure from?

Quote:



Originally Posted by *RedCloudFuneral*


Ok, find me a game that runs faster on an X58 chip.


Games aren't the only way to measure CPU performance.


----------



## MeatloafOverdose

Quote:



Originally Posted by *GameBoy*


Where are you pulling this 37% figure from?


http://www.overclock.net/rumors-unco...x-up-36-a.html


----------



## Reslivo

Quote:



Originally Posted by *MeatloafOverdose*


http://www.overclock.net/rumors-unco...x-up-36-a.html


That's the 3960X. That isn't 77W TDP.

I know it wasn't your post, but I felt obliged to make the correction in any case.


----------



## Kasp1js

Quote:



Originally Posted by *MeatloafOverdose*


http://www.overclock.net/rumors-unco...x-up-36-a.html


SB-E is not the same as IB.

IB will offer IPC improvements along the lines of Bloomfield >> Westmere.


----------



## kweechy

Quote:



Originally Posted by *MeatloafOverdose*


http://www.overclock.net/rumors-unco...x-up-36-a.html


That's 6 core SB-E, not Ivy.

However, I think Ivy Bridge quad cores could be that much faster than Sandy quads:

10-15% IPC increase with 15-20% greater OCing capability. Which puts it in a range of 26.5% to 38% faster than SB.

Even if it's only 5% IPC gains and 10% OCing gains, still looking at a 16% faster chip.

Judging by this 77W TDP, though, I'm guessing the clock speed gains will be pretty nice. Willing to bet 5.5GHz on air is obtainable... 95W is 24% more than 77W, which would turn a 5GHz OC into a 6.1GHz OC; however, with exponentially worse returns and so on, let's say 5.5GHz.
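The arithmetic above can be sketched out. As a rough model (my assumption, not anything from the reviews): dynamic CPU power scales roughly with f·V², and since voltage usually has to rise roughly with frequency near the limit, power grows closer to f³, which is why linear TDP scaling overestimates headroom. Function names and numbers below are illustrative only.

```python
# Back-of-envelope overclock headroom estimate (speculative, not a benchmark).
# Dynamic CPU power is roughly P ~ f * V^2; if voltage must scale roughly
# linearly with frequency near the limit, then P ~ f^3, so frequency
# headroom grows only with the cube root of the power ratio.

def linear_estimate(f_ghz: float, p_old_w: float, p_new_w: float) -> float:
    """Naive estimate: frequency scales 1:1 with TDP."""
    return f_ghz * (p_new_w / p_old_w)

def cube_law_estimate(f_ghz: float, p_old_w: float, p_new_w: float) -> float:
    """Rougher but more realistic: P ~ f^3 once voltage scaling kicks in."""
    return f_ghz * (p_new_w / p_old_w) ** (1 / 3)

if __name__ == "__main__":
    base_oc = 5.0  # GHz, the 5GHz OC assumed above
    print(round(linear_estimate(base_oc, 77, 95), 2))    # ~6.17 GHz (too optimistic)
    print(round(cube_law_estimate(base_oc, 77, 95), 2))  # ~5.36 GHz (closer to the 5.5 guess)
```

Interestingly, the cube-law version lands right around that 5.5GHz guess.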


----------



## WhitePrQjser

My friend's getting one. We oppose all the non-believers, haha!

Seriously, though, I still have faith in this chip


----------



## HothBase

Quote:



Originally Posted by *Heavy MG*


The 2600K consumes 300W of power when OC'd as well.


I think you need to revise that bit. Pretty sure that a 2600K alone will never draw 300W.


----------



## Heavy MG

Quote:



Originally Posted by *HothBase*


I think you need to revise that bit. Pretty sure that a 2600K alone will never draw 300W.


Then neither does the 8150. If it were to consume over 300 watts, how would normal air and water coolers still be keeping the chip at reasonable temps?


----------



## HothBase

Quote:



Originally Posted by *Heavy MG*


Then neither does the 8150,if it were to consume over 300 watts then how are normal air and water coolers still keeping the chip at reasonable temps?


You're right, it probably won't under a normal OC. However, IIRC some reviews measured consumption at ~4.8GHz at around 200W. So, I don't know, but with extreme frequencies and voltage you might be able to get it to touch 300W. Not that it really matters under those circumstances.


----------



## Kasp1js

The power consumption is also blown out of proportion; most reviews show it using as much power as last-gen i7s and the X6.


----------



## OC'ing Noob

Quote:



Originally Posted by *Heavy MG*


Like dlee7283 said, BD isn't that bad, and Intel sold the Pentium 4 and Pentium D even though AMD's Athlon 64 and Athlon 64 X2 were better.
Most people don't just upgrade, they buy a whole new system; you'd be buying a case, mobo, GPU, OS, etc. either way. The 2600K consumes 300W of power when OC'd as well.


BD isn't a bad processor from a price/performance standpoint. It is comparable on both counts. Unfortunately, it is far, far behind current-gen Intel offerings in power consumption. Again, AMD fanboys can argue that it is scaling appropriately due to the number of cores. Again, that is a stupid argument, especially given that the 2500K, with comparable performance, price, and overclock, achieves this with substantially lower power consumption under load.

Quote:



Originally Posted by *Kasp1js*


The power consumption is also blown out of proportion, most reviews show it using as much power as last gen i7's and x6.


Are you seriously trying to compare a new-gen CPU to a last-gen CPU? It is common knowledge that Nehalem was hugely power hungry, an issue Intel succeeded in addressing with Sandy Bridge. Nehalem was also the most powerful processor upon release, so the performance crown justified the higher power consumption from an enthusiast standpoint. BD has not seized any performance crowns while having a much greater power consumption level. Short of being a blatant AMD fanboy, there is little justification for recommending a BD build unless you are simply upgrading the CPU, in which case you may need to upgrade your PSU as well.


----------



## Don Karnage

Quote:



Originally Posted by *Kasp1js*


The power consumption is also blown out of proportion, most reviews show it using as much power as last gen i7's and x6.


Last-gen i7s were 45nm as well.

Quote:



Originally Posted by *OC'ing Noob*


Are you seriously trying to compare a new gen CPU to a last gen CPU? It is common knowledge that Nehalem was hugely power hungry and an issue Intel succeeded in addressing with *Westmere*. Nehalem was also the most powerful processor upon release, so the performance crown justified the higher power consumption from an enthusiast standpoint. BD has not seized any performance crowns, while having a much greater power consumption level. Short of being a blatant AMD fanboy, there is little justification for recommending a BD build unless you are simply upgrading the CPU, of which you may need to upgrade your PSU as well.


Fixed for you


----------



## mad0314

Quote:



Originally Posted by *OC'ing Noob*


BD isn't a bad processor from a price/performance standpoint. It is comparable on both counts. Unfortunately, it is far, far behind in terms of power consumption performance in comparison to current gen Intel offerings. Again, AMD fanboys can argue how it is scaling appropriately due to number of cores. Again, that is a stupid argument, especially given that the 2500K with comparable performance, price, and over clock, achieves this with a substantially lower power consumption on load.

Are you seriously trying to compare a new gen CPU to a last gen CPU? It is common knowledge that Nehalem was hugely power hungry and an issue Intel succeeded in addressing with Sandy Bridge. Nehalem was also the most powerful processor upon release, so the performance crown justified the higher power consumption from an enthusiast standpoint. BD has not seized any performance crowns, while having a much greater power consumption level. Short of being a blatant AMD fanboy, there is little justification for recommending a BD build unless you are simply upgrading the CPU, of which you may need to upgrade your PSU as well.


On price/performance, Sandy Bridge blows it away except in heavy tasks; there, BD is great for its price if it's actually utilized that way.

I do agree with him that the power consumption is blown out of proportion. Yes, they promised better efficiency, and it did not deliver as much as Intel's chips did or as much as expected, but many people are making comments like "OMG I WILL NEED A NEW PSU!!", which is just flat out not true. Also, the way different review sites calculated power draw adds to the confusion: some measured at the wall for the whole system, some accounted for PSU efficiency, some tried to isolate the CPU's power draw, etc., giving a huge range of numbers that most people have no clue what to do with. It is subpar compared to what was promised/expected, but it is not as bad as some people make it out to be.
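The measurement differences being described can be made concrete. Here is a rough sketch of how one might convert an at-the-wall reading into an estimated CPU load delta, assuming a known PSU efficiency; the numbers and function names are hypothetical examples, not figures from any review.

```python
# Rough conversion from at-the-wall readings to an estimated CPU load delta.
# All numbers here are hypothetical examples, not measurements from any review.

def dc_draw(wall_watts: float, psu_efficiency: float) -> float:
    """DC-side draw: the PSU wastes (1 - efficiency) of the wall power as heat."""
    return wall_watts * psu_efficiency

def cpu_load_delta(wall_idle: float, wall_load: float, psu_efficiency: float) -> float:
    """Estimate the extra DC power the CPU pulls under load vs idle.
    Crude: assumes the rest of the system's draw stays constant."""
    return dc_draw(wall_load, psu_efficiency) - dc_draw(wall_idle, psu_efficiency)

if __name__ == "__main__":
    # Example: 110 W idle, 320 W load at the wall, 85% efficient PSU.
    print(round(cpu_load_delta(110, 320, 0.85), 1))  # 178.5 W, well below the 210 W wall delta
```

Which is exactly why a 500W at-the-wall figure says very little about what the CPU itself is pulling.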


----------



## OC'ing Noob

Quote:



Originally Posted by *mad0314*


Price/performance, Sandy Bridge blows it away except for heavy tasks, then BD is great for its price if it is utilized for that.

I do agree with him that the power consumption is blown out of proportion. Yes, they promised better efficiency, and it did not deliver as much as Intel's chips did or as much as expected, but many people are making comments like "OMG I WILL NEED A NEW PSU!!" which is just flat out not true. Also, the way different review sites calculated power draw adds to the confusion, as some measured at the wall for the whole system, some accounted for PSU efficiency, some tried to isolate the CPU's power draw, etc, giving a huge range of numbers that most people have no clue what to do with. It is sub par compared to what was promised/expected, but it is not as big as some people make it out to be.


What a lot of us power naysayers are trying to point out is that for something that ONLY offers comparable price and performance, there is no excuse for it to use as much power as it does when its competitor is offering a more than competitive product with much lower power draw. Trying to justify it with "more cores, so it scales" or "it's comparable with last-gen Nehalem" is not a legitimate consumer argument. AMD flopped on the efficiency side of things, and considering that Intel's products are currently just as good if not outright better, that is a huge factor to consider.


----------



## mad0314

Quote:



Originally Posted by *OC'ing Noob*


What a lot of us power naysayers are trying to point out is that for something that ONLY offers comparable price and performance, there is no excuse for it to be using as much power as it does when it's competitor is offering a more than competitive product that has much lower power draw. To try to justify it as more cores so it scales or that it is comparable with last gen Nehalem is not a legitimate consumer argument. AMD flopped on the efficiency side of things and considering how Intel's products are currently just as good if not outright better, that is a huge factor to consider.


Oh believe me, I agree with you. I'm just pointing out that some people are blowing it way out of proportion when they see a review state that it pulled 500W, and they think they will need a 1000W+ PSU to run this chip.


----------



## Wishmaker

Have we sunk so low that we find excuses for a mediocre product? What's with the blatant push to shut one eye and ignore the lack of performance in areas where AMD has brainwashed half of the internet? How many here on OCN drank the AMD Kool-Aid and are now willing to let it slide? Where is the objectivity? Heck, you've been made fools of by a company that has done nothing but damage limitation since the official launch. You blame Intel's business practices, yet you accept being made a clown of by AMD?

AMD has lost its credibility as a chipmaker. All the attacks the AMD folk have made... 'Intel will be Bulldozered', 'Unrivalled performance at half the price'. Some here need to grow a backbone and think for themselves. How easily some are manipulated (e.g. the world record attempt). How many tens of pages did we have after the 8GHz record claiming BD would crush Intel?

This whole BD soap opera was rotten to the core. AMD did not expect SB to be such a good performer, saw the true potential of BD as merely a close match to Nehalem, and decided to play another card: 'Don't trust ES samples. Performance will be better. IPC will be better. Single-thread performance is better.'

This should be a lesson to those who purchase brand-new mobos for a chip they haven't seen in action.


----------



## omninmo

Hmmm, regardless of it being a flop, I think BD needs more testing, and more creative tests at that!

So far, from what I've gathered:
-performance scales with clock (obviously)
-performance scales with HTT clock (interesting, and quite a bit more than I expected)
-performance does not seem to scale with NB clock like Phenom did
-performance per thread increases when running 4 cores across 4 modules, one core each, instead of all 8

Still want to see how the 4M/4C approach affects temps, power consumption, and the max OC ceiling on air/water!


----------



## mad0314

Yea the thing I hated the most was AMDs attitude in the whole thing. Absolutely disgusting.


----------



## Kaze105

Quote:



Originally Posted by *mad0314*


Yea the thing I hated the most was AMDs attitude in the whole thing. Absolutely disgusting.


I have to agree with this. They even posted some GPU-bottlenecked benchmarks of BF3 on their blog, saying that Bulldozer is better than the 2600K. A 1fps difference between the 2600K and a Phenom II X6 1100T? No way. (No clock speed info or anything; they could have at least said it was at stock settings.)


----------



## motoray

How soon is the new stepping coming out, and when will the kernel updates take place? Being an AMD fanboy I want it to help, but I'll probably end up going Intel.


----------



## 2010rig

Quote:


> Originally Posted by *motoray;15334806*
> How soon is the new stepping coming out and when will the kernel updates take place? Being a AMD fanboy i want it to help but prolly end up going intel.


Check out this post I made in another thread.

It looks like mid-late Q1 2012.
http://www.overclock.net/amd-cpus/1142438-did-anyone-see-fx-price-cut.html#post15314673

Of course, this is from Semi-Accurate.com, but it does sound reasonable.


----------



## OC'ing Noob

Right now there simply is not any real place for BD among Intel's current offerings. I can see a future where, once operating systems and programs become more adept at multi-core management, we can benefit from the proposed modularity and scalability of AMD's architecture. However, they NEED to get the power down!


----------



## BallaTheFeared

Any magic bios, cpu drivers, cache fixes, registry hacks, or any other performance increases happen since release?


----------



## jrbroad77

Did anyone really think an 8-core would be a good idea for gaming? It's preposterous. Same goes for 6-cores (*cough* Thuban): if you don't plan on having all those cores under 100% load for some duration of time, there's no point. And honestly, nothing is pushing 8 cores besides encoding, rendering, simulation, etc.

I think the new hack will be running Windows 7 in 128-bit mode. Then it can get close to using all of BD's power.


Quote:


> Originally Posted by *OC'ing Noob;15333600*
> 
> Are you seriously trying to compare a new gen CPU to a last gen CPU? It is common knowledge that Nehalem was hugely power hungry and an issue Intel succeeded in addressing with Sandy Bridge. Nehalem was also the most powerful processor upon release, so the performance crown justified the higher power consumption from an enthusiast standpoint. BD has not seized any performance crowns, while having a much greater power consumption level. Short of being a blatant AMD fanboy, there is little justification for recommending a BD build unless you are simply upgrading the CPU, of which you may need to upgrade your PSU as well.


An 8150 at 4.6 under load uses 25W more than a 980X at 4.54 (AnandTech). Gulftown vs. BD is fair, both are on 32nm, so it's fair to compare power consumption. Of course BD doesn't win on performance, but its power usage is not that high, considering it has 2B transistors vs. the 980X's roughly 1.17B.


----------



## kweechy

Quote:


> Originally Posted by *OC'ing Noob;15337749*
> Right now there simply is not any real place for BD with Intel's current offerings. I can see a future where once OS systems and programs become more adept at multi-core management and we can benefit from AMD's proposed modularity and scalability of their architecture. However, they NEED to get the power down!


Unfortunately, even in apps that saturate all cores at 100%, BD still can't match the 2600K. Meanwhile, in apps that are lightly threaded, it gets completely demolished.

Maybe I'm out of touch with money, but is $60 really THAT much of a make-or-break deal for anyone out there that they would get a Bulldozer over a 2600K?


----------



## swindle

Quote:


> Originally Posted by *kweechy;15338033*
> Unfortunately, even in apps that saturate all cores 100%, BD still can't match the 2600k. Meanwhile, apps that are lightly threaded, it gets completely demolished.
> 
> Maybe I'm out of touch with money, but is $60 really THAT much of a make or break deal for anyone out there that they would get a Bulldozer over a 2600k?


Love the sig. Poor JF...

Perhaps I'm the same as you, and out of touch?

I have a ton of bills every week (approx. 80% of my wage goes to bills), and yet I still wouldn't mind parting with the extra dough for more than a little extra performance.


----------



## toX0rz

Quote:


> Originally Posted by *OC'ing Noob;15333600*
> BD isn't a bad processor from a price/performance standpoint.


Don't know where you're getting your figures from, but actually, Sandy Bridge and the Phenom X6s totally blow it away in terms of price/performance, so its price/performance ratio actually IS bad.

Almost $100 more than an 1100T while performing worse in games and being just 10% better (often not even reached) in multi-threaded apps? lolnothanks.


----------



## Dublin_Gunner

Quote:


> Originally Posted by *Liranan;15330178*
> This thread is a terrible read. Everyone who doesn't bother reading links and provided information should be given a ban until they stop parrotting other peoples conjecture. As for the Intel fanboys, you're all pathetic and need help if you need to constantly criticise AMD in order to compensate for lacking in certain parts of your anatomy. Also it's childish because it's obvious you were terrified AMD might dethrone your 'god' and now feel the need to spam with nonsense in order to make yourself feel better. This reminds me of the A64 days, had the same stupid nonsense then too.


Well, I've nearly always owned AMD CPUs (apart from a year or so when Core 2 Duo came out).

What was your first AMD CPU? Mine was an Am486 DX4-100. So please don't try to lump everyone into that group you like to define as 'Intel fanboys' just because their opinion is not something you wish to hear.
Quote:


> Originally Posted by *jrbroad77;15338008*
> Did anyone really think an 8-core would be a good idea for gaming? It's preposterous. Same goes with 6-cores(*cough* Thuban), if you don't plan on having all those cores under 100% load for some duration of time.. there's no point. And honestly, nothing is pushing 8 cores besides encoding, rendering, simulation, etc.


A little Googling on DX11 multi-threaded rendering might change your opinion on that one.

Both Civ 5 and BF3 use more than 4 cores if available.


----------



## Don Karnage

Quote:


> Originally Posted by *jrbroad77;15338008*
> 
> An 8150 at 4.6 under load uses 25W more than a 980X at 4.54(Anandtech). Gulftown vs. BD is fair, both are on 32nm, so it's fair to compare power consumption. Of course BD doesn't win on performance, but it's power usage is not that high, considering it has 2B transistors, vs. maybe 1.4ish of the 980X.


It's not fair at all, especially from a price or performance standpoint. Gulftown blows away BD, and the only reason BD fans compare it against a 980X is the price difference.

I can't wait for the 3930K vs. 8150 comparisons. It's going to be a bloodbath.


----------



## i7monkey

The poor performance of BD actually really sucks for us consumers, so I don't know why anyone would be happy about it. Seriously, guys: the less competition there is, the more we pay and the less we benefit, so in the end it's us who get screwed if one company lags behind.

*However*, there's one reason I find BD's epic fail hilarious: I get to see AMD fanboys drowning in their tears over nothing but loyalty to a company that shouldn't even get that loyalty in the first place. I would feel exactly the same if an Intel release flopped and Intel fanboys started feeling bad because their "mighty" company failed.

Screw brand loyalty, guys; go for whatever performs better or has better value.

I'm also shaking my head at AMD right now, because they screwed up royally with this terrible product. Shame on you, AMD!


----------



## 2010rig

I just saw this right now:
http://www.anandtech.com/bench/Product/203?vs=434

Pretty crazy, considering the 1100T is running at 3.3 while BD is running at 3.6, not to mention the $90 difference between them. BD costs 33% more, with 33% more cores; no power consumption numbers were given.

I know this isn't news, but anyone who won't go Intel for personal reasons would be much better off with an 1100T. Or, if gaming, go with a 955 for $120 and call it a day.


----------



## lem_

There is most definitely a Windows 7 AMD FX software patch in the works. By most estimates the AMD Bulldozer FX is underperforming by 40-70% in most Windows 7 benchmarks. By forcing Windows 7 to recognize 8 CPU cores, a huge performance hit has happened. The Bulldozer FX-8xxx design… really isn't 8 cores; it's a 4-core CPU with an extra integer pipeline on each core. If the FX-8xxx series scales in line with the 4- and 6-core Bulldozer designs, then there is a serious bug in Windows 7 that is crippling FX-8150 performance.
The one thing that is for sure here is that every hardware review website rushed to be the first to publish an AMD FX-8150 review; they all used the same generic benchmarks and NONE did any real-world computing. The game is fixed: the big dog spreads around the most ad dollars.

Source

has this been posted already?


----------



## tout

Quote:


> Originally Posted by *2010rig;15339360*
> I just saw this right now:
> http://www.anandtech.com/bench/Product/203?vs=434
> 
> Pretty crazy, considering the 1100T is running at 3.3, while BD is running at 3.6. Not to mention a $90 difference between them. BD is 33% more, with 25% more cores, no power consumption numbers were given.
> 
> I know this isn't news, but if anyone that doesn't go Intel for personal reasons, they'd be much better off with an 1100T. Or if gaming go with a 955 for $120 and call it a day.


You are forgetting that the BD hasn't matured yet with optimizations _and_ overclocks way beyond what the 1100T (or any Phenom II) is capable of.
Quote:


> Originally Posted by *lem_;15339784*
> There is most definitely a Windows 7 AMD FX software patch in the works. By most estimates the AMD Bulldozer FX is underperforming by 40-70% in most Windows 7 benchmarks. By forcing Windows 7 to recognize 8 CPU cores, a huge performance hit has happened. The Bulldozer FX-8xxx design… really isn't 8 cores; it's a 4-core CPU with an extra integer pipeline on each core. If the FX-8xxx series scales in line with the 4- and 6-core Bulldozer designs, then there is a serious bug in Windows 7 that is crippling FX-8150 performance.
> The one thing that is for sure here is that every hardware review website rushed to be the first to publish an AMD FX-8150 review; they all used the same generic benchmarks and NONE did any real-world computing. The game is fixed: the big dog spreads around the most ad dollars.
> 
> Source
> 
> has this been posted already?


I've seen this mentioned. I (along with many others) am waiting a while to see what new BIOSes and drivers will do to overall BD performance. You're an idiot if you don't think it will make any difference; it's already been shown by all the discrepancies between benchmarks on different motherboards. Some of the new BIOSes don't even work properly and crash the system. It's gonna be a few weeks until all is said and done.

Everyone needs to take all these reviews with a huge bag of salt for now. They were all rushed so that they would be the first to show the information. Most of the overclocking done was very hack-and-slash and not indicative of what could be achieved by a good, methodical approach. They just ramped the speed up and threw voltage at it. Most of them didn't adjust RAM to 1866MHz or beyond, and they didn't increase the north bridge much, if at all.

_Everyone has been jumping to conclusions, treating all the numbers you are seeing as final and not taking into account that this is a completely new design for AMD's CPUs. The operating system and BIOS have not yet been tailored to it. Relax._


----------



## DayoftheGreek

Quote:


> Originally Posted by *lem_;15339784*
> The one thing that is for-sure here is that every hardware review website rushed to be the first to publish an AMD FX-8150 review, they all used the same generic benchmarks and NONE did any real world computing. The game is fixed, the big-dog spreads around the most ad-dollars.
> 
> Source
> 
> has this been posted already?


I don't really understand this statement. Reviewers had chips for weeks and didn't rush anything. They tested and tested tons of different setups. It wasn't EVERY setup, but it never is. Now, what about this "real world" computing claim? Do they expect reviewers to open up their web browsers and Windows Explorer and say one "feels" better than the other? What a joke. Besides, they did plenty of real-world benchmarks: game performance with a lot of different games, Photoshop, x264, WinZip/7-Zip, and a bunch of other tests that are all very real-world.


----------



## tout

Quote:


> Originally Posted by *DayoftheGreek;15339938*
> I don't really understand this statement. Reviewers had chips for weeks and didn't rush anything. They tested and tested tons of different setups. It wasn't EVERY setup, but it never is. Now, what about this "real world" computing claim? Do they expect reviewers to open up their web browsers and windows explorer and say one "feels" better than the other? What a joke. Besides, the did plenty of real-world benchmarks. They did game performance with a lot of different games, photoshop, x264, winzip/7zip, and a bunch of other tests that are all very real-world.


If they had them for weeks, then why didn't they get some decent RAM running on the system? Most of the reviews I've seen are running 1333 to 1600MHz RAM, not 1866. _And_ their overclocks, while good, don't touch the north bridge or anything else. We know AMD CPUs respond very well to north bridge overclocks.

Yes, they did plenty of 'real world' tests with games and such, but when the settings are turned up (who plays games at 1024x768 with no effects?), BD performs just as well as an i5 2500K in most of the reviews I've seen. It even outperforms the i5 2500K in FPS at high settings, which is what gamers want.

You guys can pick and choose what you want out of the benchmarks, but so can we.


----------



## ToxicAdam

Quote:


> Originally Posted by *lem_;15339784*
> There is most definitely a Windows 7 AMD FX software patch in the works. By most estimates the AMD Bulldozer FX is underperforming by 40-70% in most Windows 7 benchmarks. By forcing Windows 7 to recognize 8 CPU cores, a huge performance hit has happened. The Bulldozer FX-8xxx design… really isn't 8 cores; it's a 4-core CPU with an extra integer pipeline on each core. If the FX-8xxx series scales in line with the 4- and 6-core Bulldozer designs, then there is a serious bug in Windows 7 that is crippling FX-8150 performance.
> The one thing that is for sure here is that every hardware review website rushed to be the first to publish an AMD FX-8150 review; they all used the same generic benchmarks and NONE did any real-world computing. The game is fixed: the big dog spreads around the most ad dollars.
> 
> Source
> 
> has this been posted already?


Quote:


> Unprecedented forum censorship on this article and objective AMD Bulldozer performance test. These sites are thrashing the Bulldozer and shutting down objective analysis.
> 
> *Overclock.net*
> Hardforum.com
> Techpowerup.com
> xtremesystems.org
> Guru3D.com
> tomshardware.com
> anandtech.com


----------



## DrCatHands

The only thing I'm not understanding is all this talk about how Windows 7 isn't "optimized" for BD's 4M/8C design. At work we use 8-core Xeon Windows 7 workstations and they work just fine without any OS "optimizations".

Is this because it's actually a 4-core CPU that's trying to trick Windows into thinking it's 8 cores? And yes, I know how Hyper-Threading works, so I'm just curious why all the talk about registry hacks and optimizations to make it work with Windows 7 more "optimally".

Did AMD not test their chips, or did they use some in-house operating system?


----------



## Wishmaker

Quote:



Originally Posted by *ToxicAdam*


I must say, objectivity is key on that website. Getting back to the whole W7 vs BD thing, one would think that Microsoft launched W7 after BD and AMD had no time to actually test the product on W7. W7 has been out for so long, and I wonder: what OS did the AMD engineers use to test these products?

To blame the OS because your architecture is too advanced tops the whole 'the dog ate my homework' excuse. If that is the case, why not launch the product when W8 comes out?


----------



## tout

Quote:



Originally Posted by *DrCatHands*


The only thing I'm not understanding is about how Windows 7 isn't "optimized" for BD's 4C/8M thing. At work we use 8Core Xeon Windows 7 Workstations and they work just fine without any OS "Optimizations".

Is this because it's actually a 4Core CPU that's trying to trick Windows into thinking it's actually 8cores? And yes, I know how Hyperthreading works, so I'm just curious as to why the all the talk about Registry Hacks and Optimizations to make it work with Windows 7 more "optimally".

Did AMD not test their chips, or use some In-House Operating System?


From what I have read, it's the scheduler not assigning tasks to the CPU properly. Windows 8 is supposed to handle it better, and an update for Windows 7 is planned that will hopefully help. I have no idea whether the update will be handled/created by Microsoft or AMD.

Years ago there was something similar for dual-core AMD CPUs: AMD made a driver that you downloaded and installed, and it made Windows more responsive afterwards.
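The scheduling issue being described comes down to thread placement: with one thread per module, each core keeps the module's shared front-end and L2 to itself, while packing two threads into one module saves power but costs per-thread speed. Here's a minimal sketch of the two placement policies, assuming logical CPUs 0-7 pair up as modules (0,1), (2,3), (4,5), (6,7); that layout is an assumption for illustration, not a statement about how Windows numbers the cores.

```python
# Sketch of the two thread-placement policies discussed above for a 4-module
# Bulldozer (logical CPUs 0-7, assumed paired as modules (0,1), (2,3), ...).

def spread_across_modules(n_threads: int, n_modules: int = 4) -> list[int]:
    """Performance-first: put one thread on each module before doubling up,
    so each thread keeps a module's shared front-end and L2 to itself."""
    cpus = []
    for i in range(n_threads):
        module, second_core = i % n_modules, i // n_modules
        cpus.append(module * 2 + second_core)
    return cpus

def pack_into_modules(n_threads: int) -> list[int]:
    """Power-first: fill both cores of a module before waking the next one
    (roughly what a module-unaware scheduler may end up doing)."""
    return list(range(n_threads))

if __name__ == "__main__":
    print(spread_across_modules(4))  # [0, 2, 4, 6] - one core per module
    print(pack_into_modules(4))      # [0, 1, 2, 3] - two busy modules, two idle
```

A scheduler patch would essentially teach Windows 7 to prefer the first policy for lightly threaded loads.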


----------



## DayoftheGreek

Quote:



Originally Posted by *tout*


If they had them for weeks then why didn't they get some decent RAM running on the system. Most of the reviews I've seen are running 1333 to 1600 MHz RAM not 1866. _And_ their overclocks, while good, are not overclocking the north bridge or anything else. We know AMD CPUs respond very well to north bridge overclocks.

Yes they did plenty of 'real world' tests with games and such but when the settings are turned up (who plays games at 1024 x 768 resolution with no effects?) BD performs just as well a i5 2500Ks in most of the reviews I've seen. It even outperforms i5 2500Ks in FPS at high settings. Which is what gamers want.

You guys can pick and choose what you want out of the benchmarks but so can we.


I can't believe we're at the point in this game where you complain about 1600MHz RAM. BD can use 2133 RAM against an Intel system with 1333MHz RAM, and the Intel will STILL wipe the floor with it, for less money, in everything but WinRAR and SOME encoding benchmarks.

I didn't comment on Bulldozer's performance at all. I didn't cite anything about low-res gaming benchmarks either; there are plenty of both floating around. I didn't pick and choose benchmarks or even link any in my post, so I'm not sure why you decided to get all defensive and explain to me that you don't understand GPU bottlenecks. The reviewers even explain them to you in the reviews. But since you brought it up...

Even the posters in the AMD forum (the ones that aren't already in the Intel forums by now) are recommending the X4 and X6 instead of Bulldozer for gaming, so apparently that 1 extra FPS at all-max settings isn't quite what gamers are looking for.


----------



## kiwiasian

Are there benchmarks on the Windows 8 beta?


----------



## Wishmaker

Quote:



Originally Posted by *kiwiasian*


Are there benchmarks on Windows 8 beta


Windows 8 is not even at the beta stage


----------



## Xyxox

Quote:



Originally Posted by *Wishmaker*


Windows 8 is not even at the beta stage


And there would be no reason to benchmark anything under the current developer's preview because you know the Beta will perform better when it comes.


----------



## Siigari

So, without reading 75 pages of posts, what is the general consensus on Bulldozer as it stands?


----------



## Steak House

^^^ Yeah, all this Windows 8 talk. What if it turns out to be Vista all over again? Will BD users switch to it because it's 4% faster? Will the Intel chips be 5% faster? Enough of dangling carrots in front of people, AMD. What are you doing for us now?


----------



## OC'ing Noob

Quote:



Originally Posted by *lem_*


There is most definitely a Windows 7 AMD FX software patch in the works. By most estimates the AMD Bulldozer FX is underperforming by 40-70% in most Windows 7 benchmarks. By forcing Windows 7 to recognize 8 CPU cores, a huge performance hit has happened. The Bulldozer FX-8xxx design... really isn't 8 cores, it's a 4-core CPU with an extra integer pipeline on each core. If the FX-8xxx series scales according to the 4- and 6-core Bulldozer design, then there is a serious bug in Windows 7 that is crippling the FX-8150 performance.

The one thing that is for sure here is that every hardware review website rushed to be the first to publish an AMD FX-8150 review, they all used the same generic benchmarks and NONE did any real-world computing. The game is fixed; the big dog spreads around the most ad dollars.

Source

has this been posted already?


Trying to blame Microsoft for poor AMD core management is ludicrous. The onus is on AMD to make sure their products are completely compatible with widely used OS platforms. If there was an incompatibility, it should have been handled between MS and AMD engineers a long time ago.

The game is hardly fixed. AMD brought forth an EXTREMELY underwhelming product after multiple delays. It is slower than its competitor most of the time and uses more power. This is not big-dog/underdog stuff. This is AMD releasing a product that provides no justification for it to be purchased over its competitor's product.


----------



## OC'ing Noob

Quote:



Originally Posted by *Siigari*


So, without reading 75 pages of posts, what is a good general consensus on Bulldozer as it stands?


Don't buy it. If you have Phenom II X4/X6, continue to stick with that. If you are looking to upgrade to a new build, go Intel. BD is simply not worth investing a new build into, nor does it justify an upgrade unless you are possibly upgrading from an Athlon II dual core.


----------



## Tabzilla

Quote:



Originally Posted by *Siigari*


So, without reading 75 pages of posts, what is a good general consensus on Bulldozer as it stands?


The easiest answer is, "It depends what review you're reading". However, a few things are consistent:
1. IPC is very poor, as is performance with any application using <4 cores.
2. Overclocked power consumption is absurdly high - somewhere around 450W vs 250-300W for SB.
3. Gaming performance:
3a. Low resolutions and detail (1024x768, low settings) = poor performance. 
3b. 1080P and high quality = competitive.
3c. MultiGPU performance appears to be bugged; several reviews show no increase in FPS with the addition of GPUs.


----------



## Fuell

I was gonna quote a whole bunch of people and such... but did anyone read the Rage3D review?

Both 8150 and 2600K OC'd to 4.5GHZ as well...

Power Draw:
Don't know what people are freaking out about... 479w load stock and 545w load on their OC2... an i7 920 uses about the same power... I don't remember everyone running for the hills and bringing it up over and over for the 920... it was a solid chip. and at most, its about 100 watts above a 2600K... total system power by the way, and 8 "cores" that are closer to real cores than SMT... so it makes sense. Plus... 100W ain't gonna kill anyone... people buy 500-700w PSU's for rigs that only need 300-400 max all the time... Most users almost always have a decent overhead...

Temps:
About the same temps as a 2600K stock and OC'd... not bad considering it uses more power... Must be a "cool" design...

Performance: Synthetic:
SiSoft Sandra Arith test: the 2600K clearly wins, but not by anything astronomical, though very noticeable.

SiSoft Sandra Multimedia test: BD blows away Intel with Integer work, but gets hammered in Float and Dbl...

SiSoft Memory Bandwidth: At stock and the 1st OC attempt BD is behind... but after a better OC it jumps substantially and beats the 2600K handily (by more than the 2600K beat the stock and 1st OC BD)

Aida64 Julia Test: BD beats the 920 but gets stomped by the 2600K.

Aida64 Mandel: Ouch for BD this time as well, even ousted by the 1100T.

Truecrypt: BD loses stock vs stock but quickly gets the upper hand in OC's.

SiSoft Sandra AES256: BD is behind stock vs stock and even after the 1st OC, but after OC2 it shows its teeth and pulls ahead by a decent margin.

SiSoft Sandra Sha256: Ouch for BD... very bad showing here.

Aida64 AES: BD 40K behind at stock vs stock and OC vs OC1 but the OC2 pulls ahead by about 40K. Big jump.

Aida Hashing: BD stomps 2600K

Performance: Application and Gaming:
3DMark11: keeps up with 2600K

Cinebench: Stock BD isn't great, but after OC it competes easily.

7-zip: BD wins

Winrar: Tie with 2600K

x264 AVX/XOP HD bench: Slightly behind...

BF:BC2 (1920x1080 - High - 8xMSAA): BD has a better min FPS but loses in max. Though anything above 120 isn't as noticeable as, say, the 52 FPS min for the 2600K vs 63 for BD. And OC2 brings it up to 2600K speeds.

Civ 5 (1080 - 8xMSAA): stock and first OC BD has a very low min fps... but after the OC2 it vastly improves, enough to tie 2600K min and be within 2fps max... (of 2600K OC'ed)

Crysis 2 (1080, Ultra, EdgeAA, DX11): behind 2600K but OC closes the gap, even against 2600K's OC.

DiRT 3 - 1920x1080 4xMSAA Ultra Preset: BD competes at stock and pulls away when OC'd.

Medal of Honor (SP) - 1920x1080 high settings with AA: BD has better min at stock again but loses max, OC fixes things.

So what I gather from such a review is that OC'ing methodology can play a large role in results as well... Rage3D had a 1st OC that performed similarly to most reviews, but then OC'd BD with some different settings, and look what happens... it competes with a 2600K in most scenarios, even games.

I think BD has a few more surprises left for us once the kinks are worked out, nothing magical but a little better. And comparing OC vs OC, BD can outshine the 2600K: the 2600K is often ahead at stock, yet when both are OC'd, BD competes and sometimes wins. Considering BD started out behind the 2600K at stock, it had to make up that deficit and then go further to catch or beat the 2600K.

It has potential. I still think they should have die-shrunk the Phenom II to 32nm to work out the fab kinks and THEN done BD... maybe more applications will be highly threaded by that point in time as well, making BD an even better choice.

So while the 8150 at stock isn't the best choice, it's by no means a fail. I think people had it in their heads that it had to beat Intel or nothing. With that mentality you will consider the BD launch a total fail for gamers. But when you read into it objectively and start seeing the OC results, things change, and BD looks promising. Prices are still above MSRP, which is confusing at best considering the tone of many reviews...

Wow, this is getting long. Oh, and I'm not trying to say BD is great or anything, just simply that it's better than most would admit.


----------



## OC'ing Noob

Quote:



Originally Posted by *Fuell*


I was gonna quote a whole bunch of people and such... but did anyone read the Rage3D review?

Both 8150 and 2600K OC'd to 4.5GHZ as well...

Power Draw:
Don't know what people are freaking out about... 479w load stock and 545w load on their OC2... an i7 920 uses about the same power... I don't remember everyone running for the hills and bringing it up over and over for the 920... it was a solid chip. and at most, its about 100 watts above a 2600K... total system power by the way, and 8 "cores" that are closer to real cores than SMT... so it makes sense. Plus... 100W ain't gonna kill anyone... people buy 500-700w PSU's for rigs that only need 300-400 max all the time... Most users almost always have a decent overhead...

In relation to the performance and price of the BD 8-core, the higher power consumption is simply not justified. It does not sound like a lot, until you realize that an OC'ed BD would be consistently drawing more power and that really does add up over time.

Temps:
About the same temps as a 2600K stock and OC'd... not bad considering it uses more power... Must be a "cool" design...

Like I mentioned in a post further up, that means we may expect good things from the BD design in the future, but the future does not help consumers NOW, and any BD purchase is for NOW. Until they work out the power consumption levels, though, this is not going to do well IMHO.

Performance: Synthetic:
SiSoft Sandra Arith test: the 2600K clearly wins, but not by anything astronomical, though very noticeable.

SiSoft Sandra Multimedia test: BD blows away Intel with Integer work, but gets hammered in Float and Dbl...

SiSoft Memory Bandwidth: At stock and the 1st OC attempt BD is behind... but after a better OC it jumps substantially and beats the 2600K handily (by more than the 2600K beat the stock and 1st OC BD)

Aida64 Julia Test: BD beats the 920 but gets stomped by the 2600K.

Aida64 Mandel: Ouch for BD this time as well, even ousted by the 1100T.

Truecrypt: BD loses stock vs stock but quickly gets the upper hand in OC's.

SiSoft Sandra AES256: BD is behind stock vs stock and even after the 1st OC, but after OC2 it shows its teeth and pulls ahead by a decent margin.

SiSoft Sandra Sha256: Ouch for BD... very bad showing here.

Aida64 AES: BD 40K behind at stock vs stock and OC vs OC1 but the OC2 pulls ahead by about 40K. Big jump.

Aida Hashing: BD stomps 2600K

Performance: Application and Gaming:
3DMark11: keeps up with 2600K

Cinebench: Stock BD isn't great, but after OC it competes easily.

7-zip: BD wins

Winrar: Tie with 2600K

x264 AVX/XOP HD bench: Slightly behind...

BF:BC2 (1920x1080 - High - 8xMSAA): BD has a better min FPS but loses in max. Though anything above 120 isn't as noticeable as, say, the 52 FPS min for the 2600K vs 63 for BD. And OC2 brings it up to 2600K speeds.

Civ 5 (1080 - 8xMSAA): stock and first OC BD has a very low min fps... but after the OC2 it vastly improves, enough to tie 2600K min and be within 2fps max... (of 2600K OC'ed)

Crysis 2 (1080, Ultra, EdgeAA, DX11): behind 2600K but OC closes the gap, even against 2600K's OC.

DiRT 3 - 1920x1080 4xMSAA Ultra Preset: BD competes at stock and pulls away when OC'd.

Medal of Honor (SP) - 1920x1080 high settings with AA: BD has better min at stock again but loses max, OC fixes things.

So what I gather from such a review is that OC'ing methodology can play a large role in results as well... Rage3D had a 1st OC that performed similarly to most reviews, but then OC'd BD with some different settings, and look what happens... it competes with a 2600K in most scenarios, even games.

I think BD has a few more surprises left for us once the kinks are worked out, nothing magical but a little better. And comparing OC vs OC, BD can outshine the 2600K: the 2600K is often ahead at stock, yet when both are OC'd, BD competes and sometimes wins. Considering BD started out behind the 2600K at stock, it had to make up that deficit and then go further to catch or beat the 2600K.

It has potential. I still think they should have die-shrunk the Phenom II to 32nm to work out the fab kinks and THEN done BD... maybe more applications will be highly threaded by that point in time as well, making BD an even better choice.

By the time we have enough programs taking advantage of multi-core programming, we will be looking at AMD and Intel's next offering. BD architecture shows promise, but is definitely not release worthy right now.

So while the 8150 at stock isn't the best choice, it's by no means a fail. I think people had it in their heads that it had to beat Intel or nothing. With that mentality you will consider the BD launch a total fail for gamers. But when you read into it objectively and start seeing the OC results, things change, and BD looks promising. Prices are still above MSRP, which is confusing at best considering the tone of many reviews...

BD is not a bad processor by any means. It is more than a year late, though, and unable to justify being an upgrade for anyone, really. It is simply not worth upgrading to a BD at this point in time.

Wow, this is getting long. Oh, and I'm not trying to say BD is great or anything, just simply that it's better than most would admit.


Addressed some points.


----------



## Fuell

Power Consumption Calculations (running 24/7):
8150
Idle: 128W * 8760h (1 year) / 1000 = 1121.28 kWh
Load: 260W * 8760h / 1000 = 2277.6 kWh

2600K
Idle: 124W * 8760h / 1000 = 1086.24 kWh
Load: 202W * 8760h / 1000 = 1769.52 kWh

Cost to me is $0.12 per kWh... So that means.......

8150 cost per year:
Idle: $134.55
Load: $273.31

2600K cost per year:
Idle: $130.35
Load: $212.34

Now I may be making this up... but most computers aren't running under full load 24/7/365. So the MAX savings you'd get is $60.97. I'd say with realistic usage scenarios and such, the real-world difference would be less than $10.

So yeah, I find it hard to see why people keep harping on this... Sure, it drinks more juice to get the job done... but it's extremely insignificant... Considering how the GPU market is with power consumption, this kind of variation is almost not worth pointing out in real-world cost...
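The arithmetic above can be sketched in a few lines of Python; the wattages (total-system draw) and the $0.12/kWh rate are the figures from the post, nothing else is assumed:

```python
# Annual electricity cost for a system drawing a constant wattage 24/7.
# Wattages and the $0.12/kWh rate are the figures quoted in the post.

RATE_PER_KWH = 0.12
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost(watts):
    """kWh/year = W * hours / 1000; cost = kWh * rate."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * RATE_PER_KWH

systems = {
    "FX-8150 idle": 128, "FX-8150 load": 260,
    "2600K idle": 124, "2600K load": 202,
}
for name, watts in systems.items():
    print(f"{name}: ${annual_cost(watts):.2f}/yr")
# Worst-case (load 24/7) gap: 273.31 - 212.34 = $60.97/yr
```

At a realistic duty cycle the gap shrinks proportionally, which is where the sub-$10 estimate comes from.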


----------



## neonlazer

Well, my friend is getting his Bulldozer parts today. I am going to drop by to help him put it together, and I WILL run benchmarks on it... and keep the results to myself... 'cause I have already annoyed him enough about why he bought it over Intel lol


----------



## Vagrant Storm

Quote:



Originally Posted by *lem_*


There is most definitely a Windows 7 AMD FX software patch in the works. By most estimates the AMD Bulldozer FX is underperforming by 40-70% in most Windows 7 benchmarks. By forcing Windows 7 to recognize 8 CPU cores, a huge performance hit has happened. The Bulldozer FX-8xxx design... really isn't 8 cores, it's a 4-core CPU with an extra integer pipeline on each core. If the FX-8xxx series scales according to the 4- and 6-core Bulldozer design, then there is a serious bug in Windows 7 that is crippling the FX-8150 performance.
The one thing that is for sure here is that every hardware review website rushed to be the first to publish an AMD FX-8150 review, they all used the same generic benchmarks and NONE did any real-world computing. The game is fixed; the big dog spreads around the most ad dollars.

Source

has this been posted already?



LOL... this has pretty much been proven to be a hoax. There is no registry patch. I was really hoping that site would have been taken down by now.


----------



## awdrifter

Quote:



Originally Posted by *Fuell*


Power Consumption Calculations (running 24/7):
8150
Idle: 128W * 8760h (1 year) / 1000 = 1121.28 kWh
Load: 260W * 8760h / 1000 = 2277.6 kWh

2600K
Idle: 124W * 8760h / 1000 = 1086.24 kWh
Load: 202W * 8760h / 1000 = 1769.52 kWh

Cost to me is $0.12 per kWh... So that means.......

8150 cost per year:
Idle: $134.55
Load: $273.31

2600K cost per year:
Idle: $130.35
Load: $212.34

Now I may be making this up... but most computers aren't running under full load 24/7/365. So the MAX savings you'd get is $60.97. I'd say with realistic usage scenarios and such, the real-world difference would be less than $10.

So yeah, I find it hard to see why people keep harping on this... Sure, it drinks more juice to get the job done... but it's extremely insignificant... Considering how the GPU market is with power consumption, this kind of variation is almost not worth pointing out in real-world cost...


But you have to factor in the need for a more powerful PSU. BD draws 66W more than SB, so with an OC it'll probably draw 100W more. You'll need to spend $15-20 more on the PSU when you're buying BD. So in the worst-case scenario, the BD is $80 more for the initial year, and $60 more every year after that. The price difference between the 2600K and FX-8150 can easily be recouped during the first year of use, and you'll get a system that's much faster in lightly threaded programs.
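As a rough break-even sketch of that claim: the launch prices (roughly $315 for the 2600K vs $245 for the FX-8150) are my assumption, not from the post; the PSU saving and the $60.97/yr worst-case power gap are the figures used in the thread.

```python
# How long until the 2600K's higher sticker price is paid back by its
# lower running cost? CPU prices are ASSUMED launch-era figures; the
# PSU saving and yearly power gap come from the posts above.

CPU_PREMIUM = 315 - 245      # assumed: 2600K price minus FX-8150 price
PSU_SAVING = 20              # a smaller PSU suffices for the SB build
YEARLY_POWER_SAVING = 60.97  # worst-case 24/7 full-load gap

net_upfront_extra = CPU_PREMIUM - PSU_SAVING  # net extra cost to go Intel
break_even_years = net_upfront_extra / YEARLY_POWER_SAVING
print(f"{break_even_years:.2f} years")  # under one year at full load
```

At lighter duty cycles the yearly saving shrinks, so the break-even point stretches out accordingly.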


----------



## cjc75

Quote:



Originally Posted by *Steak House*


^^^ Yeah, all this Windows 8 talk - what if it turns out to be Vista all over again? Will BD users switch to it because it's 4% faster? Will the Intel chips be 5% faster? Enough of dangling carrots in front of people, AMD - what are you doing for us now?


I can already tell you that Win8 will be another Vista.

Microsoft releases 1 good OS, then 1 bad OS, then 1 good OS, then 1 bad... and so on. Microsoft is just like Star Trek movies in this regard, always releasing 1 good film, then 1 bad, then 1 good, then 1 bad, and so on...

Generally this only applies to Microsoft's normal OSes and not their server stuff, and it follows a tradition going all the way back to Windows 3.0/3.1, which was revolutionary. That was followed by Win95, which, while a great leap in GUI, wasn't much better; then by Win98, which was great; then by WinME, which was so horrible it only lasted 3 months before MS had it yanked from store shelves; then WinXP, and we all know how great and awesome XP was in its prime; then Vista, then Win7... and as the cycle goes, Windows 8 is next in line to be the next horrible OS.

I foresee Win8 as Microsoft's first step into the mainstream mobile device market, and it won't do so well for the average desktop gamer; even more so for the average desktop user who's used to the current Windows GUI layout and is suddenly forced to relearn it all, all over again, with Win8's totally redesigned GUI.

I don't think so!

Windows 7 is just in its prime; many businesses are only just now starting to switch over to it, and some are still on Windows XP, as mine is!

So no...

Bulldozer will not pick up with Windows 8, because I foresee that not many people will rush to upgrade to Windows 8... the majority of us will want to stick with Windows 7, and Bulldozer will forever be handicapped by that.


----------



## OC'ing Noob

Quote:



Originally Posted by *Fuell*


Now I may be making this up... but most computers aren't running under full load 24/7/365. So the MAX savings you'd get is $60.97. I'd say with realistic usage scenarios and such, the real-world difference would be less than $10.

So yeah, I find it hard to see why people keep harping on this... Sure, it drinks more juice to get the job done... but it's extremely insignificant... Considering how the GPU market is with power consumption, this kind of variation is almost not worth pointing out in real-world cost...


Let's take a step back and look at the bigger picture here.

- Bulldozer is clearly being advertised by AMD as strong in programs that support multi-core workloads. Considering its piss-poor single-core performance, it is not for the standard consumer who only needs to open mail and watch YouTube. This means that BD will be used for servers, video/audio encoding, virtualization, and folding.
- Enthusiasts often leave their computers on all the time to do stuff. I have mine on 24/7 and it acts as a local Minecraft server, media server, and more.
- For some customers it may really be as little as $10, but that is not the argument here. The argument is that after all this time, with BD ONLY being comparable, AMD needs every reason it can get to convince people to buy their underwhelming offering. Why would people want to buy something that is merely comparable in pricing and performance when the Intel offerings can provide the same if not better with a much lower power consumption level?

I am not saying that BD is a bad CPU, just that since it is only comparable on performance and price, it cannot afford anything that pushes it further back, and the power consumption level does just that. AMD needs to address the power issue ASAP, just like how Intel addressed theirs with Nehalem. The difference, though, is that Nehalem had the performance crown to somewhat justify the heavy power usage.


----------



## Thereoncewasamaninparis

Quote:



Originally Posted by *kiwiasian*


Are there benchmarks on the Windows 8 beta?



Quote:



Originally Posted by *Wishmaker*


Windows 8 is not even at the beta stage


You can download a Windows 8 preview ISO right here. It says pre-beta, but you would think process management would have been one of the first things to be ironed out before adding all the bells and whistles.

Link.


----------



## Am*

Posted this in another thread, I think it still applies:

Quote:



Originally Posted by *Am**


The more reviews I read, the more confused I get about what sort of per-core performance BD has...

http://www.legionhardware.com/articl...fx_4170,6.html

If you look at the 4170, it isn't doing too badly for a quad. Now, I don't know if the quad core is a 4-module part with 1 broken core in each module (so it gets decent results because it's not sharing resources between cores) or a standard 2-module, 4-core version, but there is clearly something seriously wrong with their 8-core version: with 4 more of those same cores it should still be a lot faster in those games, and yet it refuses to scale up from there. I don't know if it's a software or a hardware problem, but AMD needs to explain and address THIS first, and fast, if they want anyone to buy any Bulldozer CPUs.

Also, if their FX-4170 isn't just a broken 8-core, I may well consider buying it just to see what kind of overclocks I can get out of it.


Also see this:

http://www.guru3d.com/article/amd-fx...ance-review/10

Again, I don't know if it's a problem with the processor itself or with the OS/BIOS/mobo or whatever, but the 8-core clearly has a lot of potential left, and there is clearly something wrong when a quad core with the same architecture gets the same or better results at the same clocks. If it had been at 100% utilization with those results, I would say it was a lost cause for AMD, but because of this, I'm hoping it's a screw-up on AMD's side (I'm not buying into any "registry fix" just yet, though) that will be addressed as soon as humanly possible (getting to the bottom of this should be their #1 priority right now). Then again, the 6-core SB-E got disappointing scaling vs the 2500K as well, so it could be down to poor threading in Win7/programs in general. While it is still a fail on the power consumption front (until later revisions at the very least), I'm convinced it will get a lot better with time, just like Phenom I, and I hope AMD addresses the poor threading performance long before Piledriver (which could well be the game changer if this is done).


----------



## TheRockMonsi

Can anybody point me to where I can read up on the FX-4100 specifically - namely, how well it OCs and such? All I've seen is the 8- and 6-core variants, but I'd like to see how the FX-4100 does.


----------



## Am*

Quote:


> Originally Posted by *TheRockMonsi;15366141*
> Can anybody point me to where I can read up on the FX-4100 specifically - namely, how well it OCs and such? All I've seen is the 8- and 6-core variants, but I'd like to see how the FX-4100 does.


That's what I want to know as well. There are reviews of the 4100, and the 4170 (a better revision, I think) will come out in about 2 months, but no one has tested overclocking on them yet. I'm hoping they can hit 5.5GHz and beyond.


----------



## Redwoodz

Quote:


> Originally Posted by *cjc75;15343205*
> I can already tell you that Win8 will be another Vista.
> 
> Microsoft releases 1 good OS, then 1 bad OS, then 1 good OS, then 1 bad... and so on. Microsoft is just like Star Trek movies in this regard, always releasing 1 good film, then 1 bad, then 1 good, then 1 bad, and so on...
> 
> Generally this only applies to Microsoft's normal OSes and not their server stuff, and it follows a tradition going all the way back to Windows 3.0/3.1, which was revolutionary. That was followed by Win95, which, while a great leap in GUI, wasn't much better; then by Win98, which was great; then by WinME, which was so horrible it only lasted 3 months before MS had it yanked from store shelves; then WinXP, and we all know how great and awesome XP was in its prime; then Vista, then Win7... and as the cycle goes, Windows 8 is next in line to be the next horrible OS.
> 
> I foresee Win8 as Microsoft's first step into the mainstream mobile device market, and it won't do so well for the average desktop gamer; even more so for the average desktop user who's used to the current Windows GUI layout and is suddenly forced to relearn it all, all over again, with Win8's totally redesigned GUI.
> 
> I don't think so!
> 
> Windows 7 is just in its prime; many businesses are only just now starting to switch over to it, and some are still on Windows XP, as mine is!
> 
> So no...
> 
> Bulldozer will not pick up with Windows 8, because I foresee that not many people will rush to upgrade to Windows 8... the majority of us will want to stick with Windows 7, and Bulldozer will forever be handicapped by that.



Funny, I don't seem to be experiencing any of those things since I've been running Windows 8 the last 2 weeks. I absolutely love the new changes, other than that it's just like W7.
OK guys, I was reading through another BD review, and as I was going through the benchmarks I started noticing BD really wasn't doing that badly; in fact, in several gaming tests it beats the i7. Of course the usual Intel-favored benches were more of the same, but BD always did better than the 1100T. This review was done by someone who knows his stuff, Gabriel Torres over at Hardware Secrets. You should also read his explanation of HTT 3.0 in another article. Take a look:
http://www.hardwaresecrets.com/article/AMD-FX-8150-vs-Core-i5-2500K-and-Core-i7-2600K-CPU-Review/1402

http://www.hardwaresecrets.com/article/Everything-You-Need-to-Know-About-the-HyperTransport-Bus/19


----------



## Xyxox

Quote:


> Originally Posted by *Redwoodz;15366870*
> Funny, I don't seem to be experiencing any of those things since I've been running Windows 8 the last 2 weeks. I absolutely love the new changes, other than that it's just like W7.
> OK guys, I was reading through another BD review, and as I was going through the benchmarks I started noticing BD really wasn't doing that badly; in fact, in several gaming tests it beats the i7. Of course the usual Intel-favored benches were more of the same, but BD always did better than the 1100T. This review was done by someone who knows his stuff, Gabriel Torres over at Hardware Secrets. You should also read his explanation of HTT 3.0 in another article. Take a look-
> http://www.hardwaresecrets.com/article/AMD-FX-8150-vs-Core-i5-2500K-and-Core-i7-2600K-CPU-Review/1402
> 
> http://www.hardwaresecrets.com/article/Everything-You-Need-to-Know-About-the-HyperTransport-Bus/19


If you read the Conclusions in the review you link, you get this:

"We can summarize the AMD FX-8150 in one word: "disappointment." We expected much more from this eight-core CPU based on the highly anticipated "Bulldozer" architecture.

The FX-8150 was faster than the Core i5-2500K in only a few situations, and the performance difference was not so high as to justify the higher price you will have to pay to bring this new AMD processor home. So, unless you are a die-hard AMD fanboy, we think it is hard to recommend this CPU. The Core i5-2500K is cheaper and provides a higher overall performance, and is the CPU we recommend for the user looking for the best price/performance ratio in the USD 200 - USD 220 price range. And if you really want performance, you can pay a little more and get the Core i7-2600K."


----------



## Redwoodz

Quote:


> Originally Posted by *Xyxox;15367102*
> If you read the Conclusions in the review you link, you get this:
> 
> "We can summarize the AMD FX-8150 in one word: "disappointment." We expected much more from this eight-core CPU based on the highly anticipated "Bulldozer" architecture.
> 
> The FX-8150 was faster than the Core i5-2500K in only a few situations, and the performance difference was not so high as to justify the higher price you will have to pay to bring this new AMD processor home. So, unless you are a die-hard AMD fanboy, we think it is hard to recommend this CPU. The Core i5-2500K is cheaper and provides a higher overall performance, and is the CPU we recommend for the user looking for the best price/performance ratio in the USD 200 - USD 220 price range. And if you really want performance, you can pay a little more and get the Core i7-2600K."


That may be true for someone buying a new rig, but what about the folks who could drop in the new CPU without any other expense? I doubt you even read the article and compared the numbers.


----------



## Buffal0

Quote:


> Originally Posted by *Vagrant Storm;15343042*
> LOL... this has pretty much been proven to be a hoax. There is no registry patch. I was really hoping that site would have been taken down by now.


It has now, LOL. Your wish came true.

"ELITE CREW" has defaced the website and replaced everything. LOL LOL


----------



## Xyxox

Quote:


> Originally Posted by *Redwoodz;15367146*
> That may be true for someone buying a new rig, but what about the folks who could drop in the new CPU without any other expense? I doubt you even read the article and compared the numbers.


I read all 18 pages.

If you are not looking to build a new rig but only to update a current AMD-based rig, you might gain some value from BD. BD beats the 1100T in several benchmarks, but also performs below the 1100T in a few. If your current rig has a lower-performing CPU than the 1100T, you can definitely gain some value, but you'll have to weigh the value gained against upgrading to an 1100T instead, given the extra $90 for the FX-8150. If you're currently running an 1100T, you'll have to weigh the pros and cons of pulling out your 1100T and dropping in an FX-8150 for a price tag of ~$280 (according to the article).


----------



## J.M.D

I would definitely wait for the revisions. No doubt.

And I am sure AMD is gonna come through with flying colors.

All I gotta say is: "Just wait and see the REAL BULL...!!"


----------



## djriful

Quote:


> Originally Posted by *J.M.D;15367964*
> I would definitely wait for the revisions. No doubt.
> 
> And I am sure AMD is gonna come through with flying colors.
> 
> All I gotta say is: "Just wait and see the REAL BULL...!!"


Next up double rainbows?


----------



## kweechy

Quote:


> Originally Posted by *Redwoodz;15367146*
> That may be true for someone buying a new rig, but what about the folks who could drop in the new CPU without any other expense? I doubt you even read the article and compared the numbers.


Why would you do a review based on that?

Review site's job:
- Show the performance numbers.

Your job:
- Figure out if it's worth it.

If a $240 8150 is better for you than a $180 2500k + $140 mobo, then that's cool! That's your decision to make though, the review site just needs to provide you with the information to make that choice.


----------



## Choopy!

I am kind of sad to see Bulldozer do so poorly in the benchmarks. It really isn't THAT poor a performer, but the comparison against Sandy Bridge's price and performance just makes it look that much worse. Hopefully a revision will make it a better option in the future.


----------



## SirWaWa

bulldozer is a bunch of bull
finally got to say that


----------



## Nocturin

Quote:



Originally Posted by *Choopy!*


I am kind of sad to see Bulldozer do so poorly in the benchmarks. It really isn't THAT poor a performer, but the comparison against Sandy Bridge's price and performance just makes it look that much worse. Hopefully a revision will make it a better option in the future.


Your edit makes much more sense.

Try to leave the original comment with an EDIT: flag, though.


I'm also hoping for a revision. Benchmarks aren't everything, so I'll be waiting...

>.>
<.<

and watching...

Quote:



Originally Posted by *SirWaWa*


bulldozer is a bunch of bull
finally got to say that


I'll show you some bull...


----------



## BlackVenom

Quote:



Originally Posted by *djriful*


Next up double rainbows?


I think I heard about a triple rainbow the other day...


----------



## soth7676

Quote:


> Originally Posted by *BlackVenom;15372923*
> I think I heard about a triple rainbow the other day...


Is that a triple rainbow with or without the unicorns??

Sent from my T-Mobile G2 using Tapatalk


----------



## Schmuckley

It really IS that poor of a performer. Wow... just... really unbelievably bad!


----------



## B NEGATIVE

http://www.xtremesystems.org/forums/showthread.php?276114-8120-after-1-week&p=4977413#post4977413
Seems like it's not all bad...


----------



## Nocturin

Quote:


> Originally Posted by *B NEGATIVE;15379968*
> http://www.xtremesystems.org/forums/showthread.php?276114-8120-after-1-week&p=4977413#post4977413
> Seems like it's not all bad...


pssst.... it's fake....it was _obviously_ GPU bound


----------



## Schmuckley

seems to me like netburst was better..lol..matter of fact..i may just have to investimagate that


----------



## Nocturin

Quote:


> Originally Posted by *Schmuckley;15380276*
> seems to me like netburst was better..lol..matter of fact..i may just have to investimagate that


oooo that could be fun to debate!


----------



## Hazzeedayz

Quote:


> Originally Posted by *Schmuckley;15380276*
> seems to me like netburst was better..lol..matter of fact..i may just have to investimagate that


In your CPU-Z validation... how the HELL did you get a 965 to 4.4GHz?
There's no way that is stable.


----------



## J.M.D

Quote:


> Originally Posted by *Hazzeedayz;15380657*
> In your CPU-Z validation... how the HELL did you get a 965 to 4.4GHz?
> There's no way that is stable.


Why not? I am able to get a 4525 MHz stable clock in CPU-Z with my 955, provided you have clocked the other cores down!


----------



## B NEGATIVE

Quote:


> Originally Posted by *Nocturin;15380256*
> pssst.... it's fake....it was _obviously_ GPU bound


These guys have chips in hand and are benching, so where is YOUR chip, and where are YOUR benches?
1280x1024 is not GPU bound.
Unless you have something to add?
Thought not....


----------



## Clairvoyant129

Quote:


> Originally Posted by *B NEGATIVE;15379968*
> http://www.xtremesystems.org/forums/showthread.php?276114-8120-after-1-week&p=4977413#post4977413
> Seems like it not all bad....


Quote:


> Originally Posted by *Nocturin;15380256*
> pssst.... it's fake....it was _obviously_ GPU bound


You realize that Cinebench score at 4.7GHz with an 8-core CPU is a piece of crap, right? It's not decent at all. An Intel quad-core can easily reach 10 at those clock speeds.

Sent from my iPhone


----------



## Nocturin

Quote:


> Originally Posted by *B NEGATIVE;15381388*
> These guys have chips in hand and are benching, so where is YOUR chip, and where are YOUR benches?
> 1280x1024 is not GPU bound.
> Unless you have something to add?
> Thought not....


Hook... Line...
Quote:


> Originally Posted by *Clairvoyant129;15381444*
> You realize that Cinebench score at 4.7GHz with an 8-core CPU is a piece of crap, right? It's not decent at all. An Intel quad-core can easily reach 10 at those clock speeds.
> 
> Sent from my iPhone


and sinker...

You guys really need to READ the first line of text in my sig.
----

Oh and Clairvoyant,








I did some research, and no Intel quad-core hits the 10s in CB 11.5.

Highest [email protected] w/250gts was @ 9.25.
Gross over-estimation much?

To throw a wrench in your logic, a 2600K @ stock (3.4) w/470GTX scored 9.35.
Seems to me that the GPU in CB is more important than clock speed...









FX could be much better, but it's not (yet). No need to exaggerate.
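The score-trading above would be easier to follow with a scaling sanity check. As a rough sketch (the per-core scaling factor here is purely an assumed number, not anything from the reviews or posts in this thread), you can estimate what a multi-core Cinebench-style score "should" look like from a single-thread score:

```python
def expected_multi_score(single_thread: float, cores: int,
                         scaling: float = 0.85) -> float:
    """Back-of-envelope multi-core estimate: the first core contributes its
    full single-thread score; each additional core contributes a `scaling`
    fraction of it (losses from shared caches, memory bandwidth, and turbo
    backing off as more cores load up)."""
    return single_thread * (1 + (cores - 1) * scaling)

# A chip scoring ~1.5 single-threaded, as a quad vs. as an "8-core":
print(expected_multi_score(1.5, 4))  # ~5.3
print(expected_multi_score(1.5, 8))  # ~10.4
```

The point of the toy model: a multi-threaded score only tells you something about the architecture once you factor out core count and clocks, which is why single-chart comparisons keep going in circles here.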


----------



## axipher

Quote:


> Originally Posted by *Fuell;15274952*
> Way to quote a small fraction of a post out of context to make a useless comment... BD is not an epic fail, unless you guys think a 2500K is utter garbage for threaded workloads... there are areas where BD simply shines, especially compared to PII, and then there are games and lightly threaded tasks...
> 
> But go ahead and keep the flame train going... Talking about the architecture's ups and downs is one thing... but to call "outright fail" on a chip that CAN best a 2500K consistently in some workloads, that's funny...
> 
> (I would like to point out that in no way am I saying BD is "great" or better than a 2500K... Just that there is far too much negative commenting going on and it's simply beyond reason...)


These are my thoughts as well. I'd also like to add that comparing my 955 at its OC wall of 3.95 GHz to my 8150 at its OC wall of 4.76 shows that AMD has improved over the last generation in threaded apps. I can get higher scores in almost everything I do, from games, to 3D model renders, to video encoding.

For fun I ran the 8150 at 1 core per unit and it still performed better than my 955.

I like to believe that Bulldozer is very far from a fail at all. It's a new architecture; it still has time to mature, just like the Phenom I maturing into the Phenom II.


----------



## Don Karnage

Quote:


> Originally Posted by *Nocturin;15382006*
> 
> I did some research, and no intel quad core hits the 10's in CB 11.5


[ame]http://www.youtube.com/watch?v=RRfyjrwrxM8&feature=youtube_gdata_player[/ame]

10.09 @ 5Ghz good for you with a 2600K?


----------



## Xyxox

Quote:


> Originally Posted by *Don Karnage;15382947*
> http://www.youtube.com/watch?v=RRfyjrwrxM8&feature=youtube_gdata_player
> 
> 10.08 @ 5Ghz good for you with a 2600K?


That vid pretty much says it all.


----------



## Nocturin

Quote:


> Originally Posted by *Don Karnage;15382947*
> http://www.youtube.com/watch?v=RRfyjrwrxM8&feature=youtube_gdata_player
> 
> 10.08 @ 5Ghz good for you with a 2600K?


Yes, that proves that it can hit the 10's @ 5ghz, thank you for providing proof.

But what about the original context, or claim, put forth by Clairvoyant?
Quote:


> An Intel quad core can easily reach 10 at those clock speeds.


clock speed = 4.7GHz?

can you find one for that?

I love it when people take things out of context. Meh.

OT:

I do not care for CB scores; they mean nothing to me in my usage. It does prove that BD's IPC could stand a large upgrade, though. BD to PD is only 10%? That's not the improvement I'm looking for when I'm shopping for a system.

Le Sigh.


----------



## Nocturin

Quote:


> Originally Posted by *axipher;15382866*
> These are my thoughts as well. I'd also like to add that comparing my 955 at its OC wall of 3.95 GHz to my 8150 at its OC wall of 4.76 shows that AMD has improved over the last generation in threaded apps. I can get higher scores in almost everything I do, from games, to 3D model renders, to video encoding.
> 
> For fun I ran the 8150 at 1 core per unit and it still performed better than my 955.
> 
> I like to believe that Bulldozer is very far from a fail at all. It's a new architecture; it still has time to mature, just like the Phenom I maturing into the Phenom II.


I agree.
This goes for the quoted text in your post as well.
It seems to be a very common thing to do for certain individuals around here.


----------



## Steak House

Quote:


> Originally Posted by *Nocturin;15383668*
> *I did some research, and no intel quad core hits the 10's in CB 11.5*


That's a 2600K Hitting 10's...


----------



## Iceman23

Quote:


> Originally Posted by *Nocturin;15383668*
> Yes, that proves that it can hit the 10's @ 5ghz, thank you for providing proof.
> 
> But what about the original context, or claim, put forth by Clairvoyant?
> 
> clock speed = 4.7GHz?
> 
> can you find one for that?
> 
> I love it when people take things out of context. Meh.


But a clock speed of 4.7GHz was his context, not yours. You clearly said no Intel quad-core hits 10s in Cinebench, which is obviously completely false. Nice try.


----------



## Nocturin

Squirrel moment much?


----------



## GTR Mclaren

[ame]http://www.youtube.com/watch?v=1kd4dvLJQP4&feature=feedu[/ame]

same performance.

inb4 "linus is a noob, I will only believe other review sites (that show BD as fail)"


----------



## Wishmaker

All of a sudden, Cinebench does not matter because the FX 8150 does not perform. Until BD, Cinebench was the only valid benchmark according to the AMD folk. People have been in denial since Conroe. They cannot accept the reality even when they see how this chip performs!


----------



## Xyxox

Quote:


> Originally Posted by *GTR Mclaren;15385001*
> http://www.youtube.com/watch?v=1kd4dvLJQP4&feature=feedu
> 
> same performance.
> 
> inb4 "linus is a noob, I will only believe other review sites (that show BD as fail)"


Excellent video and gives hope for a competitive CPU market.


----------



## Nocturin

need MOAR DATA!


----------



## mad0314

Quote:


> Originally Posted by *Wishmaker;15385057*
> All of a sudden, Cinebench does not matter because the FX 8150 does not perform. Until BD, Cinebench was the only valid benchmark according to the AMD folk. People have been in denial since Conroe. They cannot accept the reality even when they see how this chip performs!


BD changed a lot of things. Suddenly, power consumption per core matters, GPU bottlenecked benchmarks matter for CPU performance, and CPUs need drivers.


----------



## Steak House

Quote:


> Originally Posted by *Xyxox;15385191*
> Excellent video and gives hope for a competitive CPU market.


How is a test - that has nothing to do with the CPU - an excellent video that gives hope for a competitive CPU market?


----------



## Derp

Quote:


> Originally Posted by *GTR Mclaren;15385001*
> http://www.youtube.com/watch?v=1kd4dvLJQP4&feature=feedu
> 
> same performance.
> 
> inb4 "linus is a noob, I will only believe other review sites (that show BD as fail)"


GTR, stop drinking the AMD Kool-Aid. A handful of review sites were obviously told to benchmark BD processors in GPU-bottlenecked situations; you are clinging to a few GPU benchmarks while standing up for a processor that has simply failed... Other sites that weren't being biased actually benchmarked processors.

Again, no more Kool-Aid, please.

Thank you.


----------



## konspiracy

Quote:


> Originally Posted by *Steak House;15387174*
> How is a test - that has nothing to do with the CPU - an excellent video that gives hope for a competitive CPU market?


You are still thinking in the wrong direction.
Bulldozer is a weaker quad when it comes to floating-point calculations, but in everything but floating point it rocks.
The problem is that x86 work is a lot of floating-point work.
So when it comes down to it, AMD wants the floating-point work to go to the GPU, where it excels, and it wants the CPU to do the stuff the GPU can't.
I mean, think about it: GPUs are made for floating-point calculations. AMD is trying to move all floating-point data to the GPU.
Bulldozer is like a rough Corvette. Sure, it's not a Lamborghini with pure horsepower like the 2500K, BUT it is ahead of its time. If you can't understand that then I don't know what to tell you; you're just in denial. Intel is going a pure FP route while AMD is saying "hey, we have GPUs for that" and trying to conquer other things.

The problem happens to be that everything is made to handle a lot of FP calculations. AMD is moving forward while Intel is saying "hahaha, you can't jump 10 years ahead." Just like the 64-bit days.

AMD isn't supposed to beat Intel. It's supposed to push them.
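The heterogeneous-compute idea konspiracy is arguing for boils down to a scheduling decision. A toy sketch of that decision (my own illustration, not AMD's actual design; the 4x GPU speedup and the transfer penalty are assumed numbers): route a task to the GPU only when its floating-point share is large enough that the speedup outweighs the cost of moving the data over.

```python
def pick_device(fp_ops: int, int_ops: int,
                gpu_fp_speedup: float = 4.0, transfer_cost: float = 0.2) -> str:
    """Toy offload heuristic: estimate relative runtime on each device and
    pick the cheaper one. All costs are in arbitrary 'CPU op' units."""
    total = fp_ops + int_ops
    cpu_time = total  # the CPU handles both kinds of work at unit cost
    # The GPU runs FP work faster, gains nothing on integer work, and pays
    # a fixed transfer penalty proportional to the workload size.
    gpu_time = fp_ops / gpu_fp_speedup + int_ops + transfer_cost * total
    return "gpu" if gpu_time < cpu_time else "cpu"

print(pick_device(fp_ops=900, int_ops=100))  # FP-heavy -> gpu
print(pick_device(fp_ops=100, int_ops=900))  # integer-heavy -> cpu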


----------



## Schmuckley

Oy vey... on paper these chips look good, but in real life Windows takes at least 5 more seconds to load and everything is slower, no matter how much it's OC'd (4.8 atm). I mean, if it'd reach 8 GHz it might be semi-OK. Maybe there IS something conflicting in the BD chips. Whatever the case, it gets shelved tomorrow; I'm GONNA have something better than that to work with. Either shelved, or taken out to the driveway and pounded with a sledgehammer. P4 was faster.


----------



## Xyxox

Quote:


> Originally Posted by *Steak House;15387174*
> How is a test - that has nothing to do with the CPU - an excellent video that gives hope for a competitive CPU market?


It showed that in real world gaming applications of the chip, it performs as well as a 2500K (meaning, most gaming rigs will bottleneck on their GPU before CPU bottlenecks ever become a problem). This will keep those who refuse to ever buy Intel still buying the new chips from AMD.

What that means is that the chip will still sell well enough that AMD will be able to continue research and development in the CPU arena which means there is still hope for a competitive CPU market.

It also means that Sandy Bridge will remain an affordable CPU option.

Edited to answer the following:
Quote:


> Originally Posted by *Derp;15387306*
> GTR, stop drinking the AMD Kool-Aid. A handful of review sites were obviously told to benchmark BD processors in GPU-bottlenecked situations; you are clinging to a few GPU benchmarks while standing up for a processor that has simply failed... Other sites that weren't being biased actually benchmarked processors.
> 
> Again, no more Kool-Aid, please.
> 
> Thank you.


You might want to check the Linus Tech Tips videos from recent days. NCIX is offering some pre-built and pre-OCed Bulldozer based rigs for sale (with the ASUS board and the GPU he used in his test video). Obviously Linus was not instructed to benchmark BD processors in a GPU bottlenecked situation. He took matters into his own hands because he has a vested interest in BD performing as well as the 2500K.


----------



## Steak House

Quote:


> Originally Posted by *konspiracy;15387351*
> You are still thinking in the wrong direction.
> Bulldozer is a weaker quad when it comes to floating-point calculations, but in everything but floating point it rocks.
> The problem is that x86 work is a lot of floating-point work.
> So when it comes down to it, AMD wants the floating-point work to go to the GPU, where it excels, and it wants the CPU to do the stuff the GPU can't.
> I mean, think about it: GPUs are made for floating-point calculations. AMD is trying to move all floating-point data to the GPU.
> Bulldozer is like a rough Corvette. Sure, it's not a Lamborghini with pure horsepower like the 2500K, BUT it is ahead of its time. If you can't understand that then I don't know what to tell you; you're just in denial. Intel is going a pure FP route while AMD is saying "hey, we have GPUs for that" and trying to conquer other things.
> 
> The problem happens to be that everything is made to handle a lot of FP calculations. AMD is moving forward while Intel is saying "hahaha, you can't jump 10 years ahead." Just like the 64-bit days.
> 
> AMD isn't supposed to beat Intel. It's supposed to push them.


Crossfire Numbers Anyone?










Now you know the truth. AND THAT'S INTEL'S LAST GENERATION OF CHIPS. The Phenom looks better than BD!


----------



## Steak House

Quote:


> Originally Posted by *Xyxox;15387444*
> It showed that in real world gaming applications of the chip, it performs as well as a 2500K (meaning, most gaming rigs will bottleneck on their GPU before CPU bottlenecks ever become a problem).












What happens with the next generation of MORE POWERFUL Graphics Cards?

Kepler In 3 Months? 7000 AMD Series This Year?


----------



## Xyxox

Quote:


> Originally Posted by *Steak House;15387661*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What happens with the next generation of MORE POWERFUL Graphics Cards?


That's when the BD-based server chips should be ready for market, which is clearly where the module-based architecture should do well (if it will do well in any application), since the server market runs multi-threaded applications. This keeps AMD in the CPU game and alive, so long as NVIDIA doesn't blow their new generation of GPUs out of the water.

It does get a bit tricky there, though, because the base fault that causes the performance issues may well be the way the cores share resources in the modules. If the server chips are incapable of overcoming this issue and there is even the slightest faltering in the GPU market, then AMD could be in serious trouble. Their stock closed at $4.54 today and has been on a slide for a while.


----------



## Steak House

Quote:


> Originally Posted by *Xyxox;15387720*
> That's when the BD-based server chips should be ready for market, which is clearly where the module-based architecture should do well (if it will do well in any application), since the server market runs multi-threaded applications. This keeps AMD in the CPU game and alive, so long as NVIDIA doesn't blow their new generation of GPUs out of the water.
> 
> It does get a bit tricky there, though, because the base fault that causes the performance issues may well be the way the cores share resources in the modules. If the server chips are incapable of overcoming this issue and there is even the slightest faltering in the GPU market, then AMD could be in serious trouble. Their stock closed at $4.54 today and has been on a slide for a while.












So you think anyone running the new cards will need a server based BD?










OK, IF YOU SAY SO!


----------



## konspiracy

Quote:


> Originally Posted by *Steak House;15387580*
> Crossfire Numbers Anyone?
> 
> Now you know the truth. AND THAT'S INTEL'S LAST GENERATION OF CHIPS. The Phenom Looks Better


You still don't get it.

Right now a good CPU can do a lot of FP calculations.
Bulldozer is a weak quad with a lot of extra baggage when it comes to FP calculations.

Are you trolling? Nobody even tries to combat my reasoning, because they are too narrow-minded to think that, hell, maybe we should leave FP calculations to the GPU, considering that's what they are designed for... why build a CPU that is great at FP calculations when a GPU can do it 4 times faster? That's what AMD's goals are. Let the GPU do what it was designed for.


----------



## Xyxox

Quote:


> Originally Posted by *Steak House;15387873*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So you think anyone running the new cards will need a server based BD?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OK, IF YOU SAY SO!


Try to keep up.

I'm saying that once the new cards have come out, the BD market will have moved on to sales in server chips and the loss of the GPU bottleneck won't be as meaningful to overall CPU sales for AMD.


----------



## Steak House

Quote:


> Originally Posted by *konspiracy;15387893*
> Nobody even tries to combat my reasoning


Yeah, well I don't want to "combat your reasoning" so we'll just have to agree to disagree.


----------



## Steak House

Quote:


> Originally Posted by *Xyxox;15387933*
> Try to keep up.
> 
> I'm saying that once the new cards have come out, the BD market will have moved on to sales in server chips and the loss of the GPU bottleneck won't be as meaningful to overall CPU sales for AMD.


Oh OK - so AMD doesn't really care about selling CPUs to desktop users! Now I get it, and actually - That makes a hell of a lot of sense now!

Thanks Xyxox!

If they cared about selling to us desktop users - the chip might have been worth something.


----------



## Schmuckley

Dem chips is worth NOTHING imo. Being that Intel has jacked up 1155 mobo prices in light of the Bulldozer... er... garbage... I do believe I'm going with a 1090T or Zosma. I really regret spending $200 on a Sabertooth when I could have had a better board for $130; I'll probably get that board, too. AM3+ ready? Ready for what? NetBurst was actually FASTER. Ooo... been thinking about a Northwood/Prescott build with an SSD. Hmmm... who wants a Sabertooth? (Barely used.) $140. FX-4100... $65 (half price)... about what it's actually worth.


----------



## Digitallag

Fail dozer is fail. I have lost all faith in amd.


----------



## Xyxox

Quote:


> Originally Posted by *Steak House;15388017*
> Oh OK - so AMD doesn't really care about selling CPUs to desktop users! Now I get it, and actually - That makes a hell of a lot of sense now!
> 
> Thanks Xyxox!
> 
> If they cared about selling to us desktop users - the chip might have been worth something.


You still aren't keeping up.

They care about selling these chips right now. They won't care so much by the time the new GPUs are released because the focus will be on sales of new BD based server chips.

And bottom line, desktop based enthusiast chips are a small part of overall CPU sales.


----------



## Steak House

Quote:


> Originally Posted by *Xyxox;15388118*
> And bottom line, desktop based enthusiast chips are a small part of overall CPU sales.


I can see why based on BD.


----------



## mad0314

Quote:


> Originally Posted by *Xyxox;15387444*
> It showed that in real world gaming applications of the chip, it performs as well as a 2500K (meaning, most gaming rigs will bottleneck on their GPU before CPU bottlenecks ever become a problem). This will keep those who refuse to ever buy Intel still buying the new chips from AMD.
> 
> What that means is that the chip will still sell well enough that AMD will be able to continue research and development in the CPU arena which means there is still hope for a competitive CPU market.
> 
> It also means that Sandy Bridge will remain an affordable CPU option.
> 
> Edited to answer the following:
> 
> You might want to check the Linus Tech Tips videos from recent days. NCIX is offering some pre-built and pre-OCed Bulldozer based rigs for sale (with the ASUS board and the GPU he used in his test video). Obviously Linus was not instructed to benchmark BD processors in a GPU bottlenecked situation. He took matters into his own hands because he has a vested interest in BD performing as well as the 2500K.


Yes, it will sell. That does not make it any better.

Quote:


> Originally Posted by *konspiracy;15387893*
> You still don't get it.
> 
> Right now a good CPU can do a lot of FP calculations.
> Bulldozer is a weak quad with a lot of extra baggage when it comes to FP calculations.
> 
> Are you trolling? Nobody even tries to combat my reasoning, because they are too narrow-minded to think that, hell, maybe we should leave FP calculations to the GPU, considering that's what they are designed for... why build a CPU that is great at FP calculations when a GPU can do it 4 times faster? That's what AMD's goals are. Let the GPU do what it was designed for.


Unfortunately, AMD does not have anywhere near the influence that Intel has for this shift to happen. If Intel went to something like that, you can bet that it would happen very quickly (relatively, at least).
Quote:


> Originally Posted by *konspiracy;15387351*
> You are still thinking in the wrong direction.
> *Bulldozer is a weaker quad when it comes to floating-point calculations*, but in everything but floating point it rocks.
> *The problem is that x86 work is a lot of floating-point work.*
> So when it comes down to it, AMD wants the floating-point work to go to the GPU, where it excels, and it wants the CPU to do the stuff the GPU can't.
> I mean, think about it: GPUs are made for floating-point calculations. AMD is trying to move all floating-point data to the GPU.
> Bulldozer is like a rough Corvette. Sure, it's not a Lamborghini with pure horsepower like the 2500K, BUT it is ahead of its time. If you can't understand that then I don't know what to tell you; you're just in denial. Intel is going a pure FP route while AMD is saying "hey, we have GPUs for that" and trying to conquer other things.
> 
> The problem happens to be that everything is made to handle a lot of FP calculations. AMD is moving forward while Intel is saying "hahaha, you can't jump 10 years ahead." Just like the 64-bit days.
> 
> AMD isn't supposed to beat Intel. It's supposed to push them.


That is without a doubt THE worst analogy I have ever read.

That aside, it doesn't matter if Bulldozer will perform 10X better in 2015. If it performs badly now, why would you buy it now? AMD does not have anywhere near the leverage that Intel has, so any optimizations toward that arch are going to be slow, very slow. I am willing to bet that you can buy an SB now, and by the time you are ready to upgrade again, software-side optimizations will still be nowhere near the optimal state for BD.

And the bolded part is the state that we are in at this point in time. It makes BD not a good buy.
Quote:


> Originally Posted by *Xyxox;15387933*
> Try to keep up.
> 
> I'm saying that once the new cards have come out, the BD market will have moved on to sales in server chips and the loss of the GPU bottleneck won't be as meaningful to overall CPU sales for AMD.


Who cares about AMD? We should care about us, and as it stands *TODAY* BD is not a good buy.


----------



## Nocturin

-.-


----------



## rusky1

Quote:



Originally Posted by *Nocturin*


-.-


o___O


----------



## Derp

Quote:



Originally Posted by *Xyxox*


You might want to check the Linus Tech Tips videos from recent days. NCIX is offering some pre-built and pre-OCed Bulldozer based rigs for sale (with the ASUS board and the GPU he used in his test video). Obviously Linus was not instructed to benchmark BD processors in a GPU bottlenecked situation. He took matters into his own hands because he has a vested interest in BD performing as well as the 2500K.


He is in a biased position to get people to buy from the store he works at.

"Hai guys, Lunus here. We will be selling these new pre-overclocked 8 core bulldozer systems soon! But don't hold your breath because as the reviews have proven over and over this CPU is a piece of garbage, so please don't give us your business. We are going to assemble, overclock and stress test a total of 50 of these systems and we want them to sit in the back room collecting dust."

Are you following me here? In his position he can't be honest and say something like that.

BIASED. PERIOD.


----------



## Xyxox

Quote:



Originally Posted by *mad0314*


Yes, it will sell. That does not make it any better.

Who cares about AMD? We should care about us, and as it stands *TODAY* BD is not a good buy.


As long as it sells now, there will be two players in the CPU market and that means there is still competition. Admittedly one of the two makes a better product at the current time, but as long as there are two there is a possibility that one will overcome the current leader.


----------



## Nocturin

Quote:



Originally Posted by *Derp*


He is in a biased position to get people to buy from the store he works at.

"Hai guys, Lunus here. We will be selling these new pre-overclocked 8 core bulldozer systems soon! But don't hold your breath because as the reviews have proven over and over this CPU is a piece of garbage, so please don't give us your business. We are going to assemble, overclock and stress test a total of 50 of these systems and we want them to sit in the back room collecting dust."

Are you following me here? In his position he can't be honest and say something like that.

BIASED. PERIOD.


He was biased, for sure, but his information corresponds with all the other reviews out there. Still a sad day in AMD land.


----------



## B NEGATIVE

Quote:



Originally Posted by *konspiracy*


You still don't get it.

Right now a good CPU can do a lot of FP calculations.
Bulldozer is a weak quad with a lot of extra baggage when it comes to FP calculations.

Are you trolling? Nobody even tries to combat my reasoning, because they are too narrow-minded to think that, hell, maybe we should leave FP calculations to the GPU, considering that's what they are designed for... why build a CPU that is great at FP calculations when a GPU can do it 4 times faster? That's what AMD's goals are. Let the GPU do what it was designed for.


This guy, amongst others, just goes from BD thread to BD thread trolling for attention.
Don't give it to them.


----------



## swindle

Man, this 2500k is awesome.

Should have done this AGES ago.


----------



## Schmuckley

Quote:



Originally Posted by *swindle*


Man, this 2500k is awesome.

Should have done this AGES ago.


Hmm... try this: install a program, delete it, then right-click the unused desktop icon. lol


----------



## Schmuckley

Wheee... I can make an Athlon II X4 get higher scores than this. Matter of fact, I just read a thread where a Llano build looked MUCH better.


----------



## black96ws6

I posted this in another thread however I think it deserves repeating here:

It's almost as if there's something inherently wrong with the architecture of Bulldozer.

For example, even in one of the supposed few "good" reviews of it, such as HardOCP's (which is really just a GPU limited review), a lot of people apparently missed the Civilization 5 benchmarks in that article:

Sandy Bridge CRUSHES the 8150 in Civ 5, which is very strange, since Civ is one of the more well multi-threaded games. You'd think an "8 core chip" clocked that high would perform a LOT better:

Quote:



What you see above is no mistake, we ran the test multiple times on the AMD FX-8150 system to make sure we didn't screw something up. The AMD FX-8150 consistently produced an average framerate of 40 FPS each time we tried it. *Both the Intel i7 2600K and 2500K thoroughly blew the AMD FX-8150 out of the water by providing a whopping 42% faster performance with the Intel i7 2600K.*



Quote:



*When we overclocked the AMD FX-8150 to 4.6GHz* we experienced a significant performance increase in this game, 17% faster than it was at stock settings. However, even with 17% faster performance, *it was still not able to reach up to Intel Core i7 2600K and 2500K stock performance in this game. *


http://www.hardocp.com/article/2011/...mance_review/4










Those are two alarming paragraphs from HardOCP that should make any gamer think twice about going with the 8150 as a gaming machine...


----------



## TheBlademaster01

Quote:



Originally Posted by *Derp*


He is in a biased position to get people to buy from the store he works at.

"Hai guys, Lunus here. We will be selling these new pre-overclocked 8 core bulldozer systems soon! But don't hold your breath because as the reviews have proven over and over this CPU is a piece of garbage, so please don't give us your business. We are going to assemble, overclock and stress test a total of 50 of these systems and we want them to sit in the back room collecting dust."

Are you following me here? In his position he can't be honest and say something like that.

*BIASED. PERIOD.*


Blasphemy, one can only be biased against AMD you know...


----------



## AMD2600

I want to see some benches where BD is using 1866mhz ram and with the CPU-NB overclocked.


----------



## Schmuckley

NB @ 2500-ish, same with HT, but only 1600 RAM; it's not playing nicey-nice with my RAM.


----------



## Derp

Quote:


> Originally Posted by *AMD2600;15406387*
> I want to see some benches where BD is using 1866mhz ram and with the CPU-NB overclocked.












There. Source is Sin0822's review.

Still fails.


----------



## Steak House

Quote:


> Originally Posted by *black96ws6;15406059*
> I posted this in another thread however I think it deserves repeating here:
> 
> It's almost as if there's something inherently wrong with the architecture of Bulldozer.
> 
> For example, even in one of the supposed few "good" reviews of it, such as HardOCP's (which is really just a GPU limited review), a lot of people apparently missed the Civilization 5 benchmarks in that article:
> 
> Sandy Bridge CRUSHES the 8150 in Civ 5, which is very strange, since Civ is one of the more well multi-threaded games. You'd think an "8 core chip" clocked that high would perform a LOT better:
> 
> http://www.hardocp.com/article/2011/10/11/amd_bulldozer_fx8150_gameplay_performance_review/4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Those are two alarming paragraphs from HardOCP that should make any gamer think twice about going with the 8150 as a gaming machine...


WOW - ^^^^ That's real world and that's bad!


----------



## 2010rig

Quote:



Originally Posted by *Steak House*


WOW - ^^^^ That's real world and that's bad!


Anandtech further supports these findings.










I don't get why people are using GPU-bound scenarios to say Bulldozer is "good", when even a dual-core i5 or a Phenom II quad would get the exact same results. When testing a CPU, it should be tested in CPU-bound scenarios to see what it's truly capable of.



















Quote:



In many cases, AMD's FX-8150 is able to close the gap between the Phenom II X6 and Intel's Core i5 2500K. Given the right workload, Bulldozer is actually able to hang with Intel's fastest Sandy Bridge parts. We finally have a high-end AMD CPU with power gating as well as a very functional Turbo Core mode. Unfortunately the same complaints we've had about AMD's processors over the past few years still apply here today: in lightly threaded scenarios, Bulldozer simply does not perform. *To make matters worse, in some heavily threaded applications the improvement over the previous generation Phenom II X6 simply isn't enough to justify an upgrade for existing AM3+ platform owners*. AMD has released a part that is generally more competitive than its predecessor, but not consistently so. *AMD also makes you choose between good single or good multithreaded performance, a tradeoff that we honestly shouldn't have to make in the era of power gating and turbo cores.*
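The CPU-bound vs. GPU-bound argument running through this whole thread comes down to one diagnostic: rerun the benchmark at a higher resolution and watch the framerate. A hedged sketch of that rule of thumb (the tolerance threshold is my own assumption, not from any of the linked reviews):

```python
def classify_bottleneck(fps_low_res: float, fps_high_res: float,
                        tolerance: float = 0.05) -> str:
    """If the framerate barely moves when resolution (i.e. GPU load) goes up,
    the CPU was the limiter all along; if it drops noticeably, the GPU was.
    Only a CPU-bound test actually separates one CPU from another."""
    if fps_high_res >= fps_low_res * (1 - tolerance):
        return "cpu-bound"
    return "gpu-bound"

print(classify_bottleneck(fps_low_res=120, fps_high_res=118))  # cpu-bound
print(classify_bottleneck(fps_low_res=120, fps_high_res=70))   # gpu-bound
```

By this test, the 1280x1024 runs people keep citing cut both ways: if the FX-8150 and the 2500K tie there, the result says more about the graphics card than about either CPU.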


----------

