
me1212

join:2008-11-20
Pleasant Hill, MO

New benchmarks for the 8350 and 3570k/3770k. Weird results.

Not sure how he got those results, when most other benches get different (opposite, for the most part) results.

»www.youtube.com/watch?v=eu8Sekdb-IE


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
Can you summarize for those of us behind the great wall?

me1212

join:2008-11-20
Pleasant Hill, MO
Basically, in streaming (XSplit) the 8350 won, and for a lot of other games (most he did, including Metro 2033) the 8350 won by a decent bit, nearly 10 fps or so. The same GPU was used for the benches.


Juke Box
His Word Never Fails
Premium
join:2001-01-29
Proverbs 3
Reviews:
·WOW Internet and..
·Knology
reply to me1212
Nice find.

I have that very processor and it runs well. But then, who really admits they made a bad investment, even if they did?

Now I'm not going to say I didn't seriously think about purchasing the latest Intel 3770 when I was shopping around. I almost did, but I took the plunge and decided to try out the FX-8350 with the Asus M5A99X. Both the processor and the motherboard got just above-average reviews, and I did see several complaints about the motherboard, but I figured I would challenge the odds for the cost of a return and refund. Needless to say, I have had these two items running for a couple of weeks with no issues.

I also have two Gigabyte 7870s in CrossFire. I do not have them overclocked.
--
Oh, praise the one who paid my debt;
And raised this life up from the dead.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
reply to me1212
I got a chance to watch this today from home, and I am surprised, as he is contradicting every other site out there.

But he's using a 7870 as the GPU? He should have been using a top-end GPU like the 680 or 7970, so he could determine how much the CPU is limiting the rest of the rig. I'm not sure why he used a mid-range GPU.

But testing Black Mesa Source? When you are trying to measure differences between hundreds of FPS, the difference is going to come down to coding or driver implementation, not CPU prowess.

I would be interested in an independent peer review of his findings from AnandTech or Tom's Hardware, so they have a chance to update their own numbers or refute his.

This is not enough for me to change to recommending the FX-8350--I would bet this has more to do with the mid-range GPU than with the CPUs.

What I really would be interested in is an OC matchup between the FX-8350 and the i5-3570K on the same cooler, like an H100i.

Also, the biblical doubting reference is Thomas, not Judas. lol.
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.

me1212

join:2008-11-20
Pleasant Hill, MO
The 7870 puzzled me too. I know he has a 680 in his office he could have used, but he did give away the 7970. Still, even a mid-range GPU shouldn't make a difference like that. Not that I'm an Intel fanboy; heck, I use AMD's more-core CPUs on my code-compiling box.

I'm guessing he did Black Mesa Source because it's free.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
It's still at odds with what everyone else has gotten in their tests. There are three possibilities:

1) His testing is "better" in terms of consistency and methodology than all the other review sites out there.
2) There is a conspiracy by everyone else to push Intel CPUs even though they are subpar.
3) His testing results are flawed.

I am going with #3 if they can't be verified by anyone else out there.
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.

me1212

join:2008-11-20
Pleasant Hill, MO
I think it may be 3 as well. Based on what I've read on his site, he didn't do a reinstall when he switched the CPU and motherboard, and he did the AMD first. He just switched the hardware and installed the drivers.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
lolwut? He just stuck the SSD into a completely new system?

me1212

join:2008-11-20
Pleasant Hill, MO
Either that, or he swapped out the CPU and motherboard, possibly the RAM too. Either way, why would one do that?


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
He's cutting corners, I guess.

I could see pulling out the GPU, as that would make no difference, but he shouldn't be moving the SSD between systems--I'm surprised the OS install even worked, and didn't detect new hardware, shut down, and ask for a new license key.

If he is going to cut corners, he has no business making claims or publishing benchmarks. The goal in testing is to remove as many variables as possible. That usually means a fully-functioning, fully-patched system running the latest stable drivers.

Shifting hardware around like that only increases the entropy he's introducing into his systems.
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.

me1212

join:2008-11-20
Pleasant Hill, MO
If he wasn't using a legit copy of Windows 7, it may not have asked him for a new key.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
That's speculation at this point. I'll just mark these results as "suspect" and move along.

me1212

join:2008-11-20
Pleasant Hill, MO
I'm not trying to say he was; I'm just saying that some cracked copies don't ask.


Octavean
Premium,MVM
join:2001-03-31
New York, NY
kudos:1
reply to me1212
If you install Windows 7 without a key you can evaluate it for up to 120 days without activation. If you don't activate it won't prompt you for a key after a hardware change.

Windows 8 install media wants to force activation directly after the OS install and requires a key during the install. I hear you can get around this and install to evaluate as well, but it takes some work...

me1212

join:2008-11-20
Pleasant Hill, MO
reply to me1212
For what it's worth, here's a review done in direct response to the first one I posted. I'm at college right now, so I haven't had time to watch it all yet.

»www.youtube.com/watch?feature=pl···YV8Djt7k


NoAxeToGrind

@comcastbusiness.net
I understand the criticisms, in the chain and in the second video (though I disapprove of the attitude and name-calling in the second video), and agree that the original test is not rigorous and really doesn't say which is the "better" CPU. Viewed in light of the criticism, I think the first video only shows that if actual game performance is your main concern, and you can't afford a high-end GPU (and don't plan to upgrade the component parts), you can save money by choosing the 8350. This strikes me as intuitive -- if the system is GPU-bound, having a "better" gaming CPU will not provide much better gaming performance, if any.

I think what would be more interesting is to test which GPUs you would need before the CPU makes a difference. That would be much more helpful for a gamer trying to build a system to get the best bang for the buck. You could then figure out "I can spend the $X difference in MB/CPU on this GPU upgrade" or "I can spend the $Y difference in GPUs on the Intel" and get better performance. Unfortunately, I suspect that nobody is going to spend the money to do that kind of testing.


Ghastlyone
Premium
join:2009-01-07
Las Vegas, NV
kudos:5
reply to me1212
So to save money, you go for the AMD, which generates more heat and uses a lot more power?


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
And is more expensive with less power for gaming?


NoAxeToGrind

@comcast.net
More expensive? A quick search prices $200 for the FX-8350 v $330 for the 3770k. Last I heard, $200 is less than $330. You can pay for a lot of electricity for $130. Do you have figures for the electricity differential? How many hours of gameplay do you need before you use a $130 differential in electricity? If it's $.10 per hour (and I'm guessing it's a lot less) then that's 1300 hours of gameplay. I'm thinking that's 2 years of gameplay for me. I'm guessing the differential is actually less than $.01 per hour. That's probably 4 or 5 times the life of the system.
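The hourly-cost guess in the paragraph above is easy to sanity-check with a back-of-envelope script. This is a sketch only: the 48 W draw difference is taken from the 125 W vs. 77 W TDP figures cited in this thread (TDP is a rough proxy for actual draw, not a measurement), and the $0.12/kWh electricity rate is an assumption.

```python
# Back-of-envelope: how long until the CPU power-draw difference
# eats a $130 price difference? TDP delta and $/kWh are assumptions.
TDP_DELTA_W = 125 - 77       # FX-8350 vs. Intel TDP, watts (rough proxy for draw)
PRICE_PER_KWH = 0.12         # assumed electricity rate, $/kWh
PRICE_GAP = 130.0            # CPU price difference, $

cost_per_hour = TDP_DELTA_W / 1000 * PRICE_PER_KWH   # $ per gaming hour
hours_to_break_even = PRICE_GAP / cost_per_hour

print(f"${cost_per_hour:.4f}/hour")          # ~$0.0058/hour, under a penny
print(f"{hours_to_break_even:,.0f} hours")   # ~22,569 hours of gameplay
```

Under these assumptions the sub-penny-per-hour guess holds: at a couple of hours of gaming a day, the electricity difference would take decades to close a $130 price gap.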

More power for gaming? If both systems are limited by the GPU in real gameplay, the extra computing power is wasted. If the GPU is only going to give you 50fps in your game with either chip, how are you better off spending another $130 for the same framerate?

The point is that if your budget means that you're going to be GPU limited either way, maybe the $130 puts you into a better GPU and gets you better performance for the same dollars for the system as a whole with the FX-8350. If that's the case, you're better off putting the money into the GPU instead of the CPU.

Admittedly, you can get the 3570k for about $220 or so; but if history is any guide, the FX-8350's price will fall faster than the 3570k's. $20 may or may not make a difference in the GPU, but $50 probably will. Would you take a 3570k with a GT 610 over an 8350 with a GT 620? Sure would. What about a 3570k/GT 610 over an 8350/GT 640? Maybe, maybe not - tell me what the actual framerates are. 3770k/GT 610 vs. 8350/GT 650? I'd take the 8350/GT 650.

The point is, the Intels have more bang; but depending on the circumstances, the FX-8350/GPU combination for the same money may have more bang for the buck. If your budget means that your GPU is not going to be top of the line, and either CPU is going to be limited by the GPU's performance, then get the best GPU you can for the difference to maximize your bang for the buck.

So to answer your questions: Yes. If the FX-8350/better GPU combo gets better framerates on my game than the same priced Intel/worse GPU combo, I'll take the FX-8350.


Ghastlyone
Premium
join:2009-01-07
Las Vegas, NV
kudos:5

1 recommendation

said by NoAxeToGrind :

More expensive? A quick search prices $200 for the FX-8350 v $330 for the 3770k. Last I heard, $200 is less than $330. You can pay for a lot of electricity for $130. Do you have figures for the electricity differential? How many hours of gameplay do you need before you use a $130 differential in electricity? If it's $.10 per hour (and I'm guessing it's a lot less) then that's 1300 hours of gameplay. I'm thinking that's 2 years of gameplay for me. I'm guessing the differential is actually less than $.01 per hour. That's probably 4 or 5 times the life of the system.

Most people, even the ones reviewing the FX-8350 right on Newegg, compare it directly to the i5 3570K, not the 3770K. We're talking a 15-20 dollar difference, and a shit ton more power usage and heat.

You might brush 125W of power usage under the rug, compared to a measly 77W on the Intel, but I sure don't. And that's only at stock clocks and speeds. You can squeeze a lot more overclocking out of the 3570K on stock voltage than you can out of that 8350.

Why would you opt for hardware that's similar in price, gets worse gaming performance, uses more power, and creates a lot more heat?

said by NoAxeToGrind :

More power for gaming? If both systems are limited by the GPU in real gameplay, the extra computing power is wasted. If the GPU is only going to give you 50fps in your game with either chip, how are you better off spending another $130 for the same framerate?

The point is that if your budget means that you're going to be GPU limited either way, maybe the $130 puts you into a better GPU and gets you better performance for the same dollars for the system as a whole with the FX-8350. If that's the case, you're better off putting the money into the GPU instead of the CPU.

By that logic, you could just skimp on the CPU completely to save money and offset it with 680 SLI or a CrossFire equivalent?


NoAxeToGrind

@comcastbusiness.net
I addressed your arguments re: the 3570k in my post -- if the $20 is not going to make a difference in the GPU, then go with the better CPU. That said, I will add: if the GPU is maxed out without overclocking the CPU, I don't see that overclocking is going to help with respect to gameplay. If you're Barry Bonds (the CPU) trying to hit home runs in a little-league ballpark (the GPU), you don't really need the steroids (overclocking); they don't add anything until you get to a big-league park (a high-end GPU). Most casual gamers will never notice the difference in power consumption and heat generation. They will notice the difference in framerates.

said by Ghastlyone:

By that logic, you could just skimp on the CPU completely to save money and offset it with a 680 SLI or Crossfire equivalent?

That's not what I'm saying. I'm saying you get the best bang for your buck in gaming applications by buying the best GPU you can afford with the least expensive CPU that can give you the max performance out of that GPU. That applies whether you are talking about a choice between the 8350 and 3770k or between the 3570k and 3770k. You may want a better CPU for other applications, and that's fine; but then we're not talking purely about gaming. I don't care if the CPU is Intel or AMD; the question is: can it give me the max performance out of the GPU it is paired with? If the answer is yes, and you can get more GPU with the AMD or the cheaper Intel, go with the AMD or the cheaper Intel. If the answer is no, then you need to shift dollars from the GPU budget to the CPU budget.
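The rule in the post above -- framerate is capped by whichever component is the bottleneck, so put spare dollars wherever the cap is -- can be sketched as a toy search over CPU/GPU combos under a fixed budget. Every price and fps ceiling below is a hypothetical placeholder for illustration, not benchmark data:

```python
from itertools import product

# Hypothetical (price, fps ceiling) pairs -- illustrative numbers only,
# not measurements of these parts.
cpus = {"FX-8350": (190, 90), "i5-3570K": (220, 110)}
gpus = {"GT 640": (100, 60), "GT 650": (130, 75)}
BUDGET = 320  # total dollars for CPU + GPU

def effective_fps(cpu_fps, gpu_fps):
    # The slower component is the bottleneck for the whole system.
    return min(cpu_fps, gpu_fps)

best = None
for (cpu, (cpu_price, cpu_fps)), (gpu, (gpu_price, gpu_fps)) in product(
        cpus.items(), gpus.items()):
    if cpu_price + gpu_price <= BUDGET:
        combo = (effective_fps(cpu_fps, gpu_fps), cpu, gpu)
        if best is None or combo > best:
            best = combo

print(best)  # (75, 'FX-8350', 'GT 650') -- cheaper CPU + better GPU wins here
```

With these made-up numbers the cheaper-CPU/better-GPU combo comes out ahead at the same total price, which is exactly the GPU-bound scenario being argued. Swap in real benchmark ceilings and the search may just as well pick the Intel.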


I AM
Premium
join:2010-04-11
Ephrata, PA
kudos:4
reply to NoAxeToGrind
I went AMD with my build at the time: an FX-6100, with a cheaper mobo as well, so that I could CrossFire two cards. Looking back now, I wish I had gone Intel.

Outside of gaming, a lot of software just works better with Intel than it does with AMD.


NoAxeToGrind

@comcastbusiness.net
said by I AM:

I went AMD with my build at the time: an FX-6100, with a cheaper mobo as well, so that I could CrossFire two cards. Looking back now, I wish I had gone Intel.

Outside of gaming, a lot of software just works better with Intel than it does with AMD.

Fair enough: Intel may be a better choice over AMD for non-gaming / non-budget reasons. If so, then go Intel.

But, gaming only, let me ask you -- does the 6100 max out your GPUs? What GPUs do you have, versus the GPU you would have had to get with an Intel build for the same dollars? How would that have affected your framerates?

If your alternate Intel build would give you similar framerates for the same price, then Intel gets the nod because of the non-gaming issues. If the AMD gives better framerates in gaming, then you have to evaluate whether the performance loss in gaming is worth the performance gain in non-gaming apps.

I don't see that I'm saying anything controversial. I'm just saying you have to look at the whole system for the dollars spent. For gaming, the GPU, not the CPU, should be the limiting factor. I'm saying, strictly gaming, to get the best bang for your buck, get the best GPU you can afford, with the least expensive CPU that will let you take full advantage of that GPU.

Of course, if you are planning to upgrade your GPU when you can afford it, or if better GPUs come out, then you DO need the best CPU you can afford to take advantage of the potential GPU growth. But that's a different consideration. The typical gamer, as opposed to the hardware enthusiast, will replace the whole system when he can afford better or it becomes obsolete, rather than upgrade components.


Gordo74
Premium
join:2003-10-28
Monroeville, PA

1 recommendation

Your logic is flawed.

Why buy a $150 top-of-the-line AMD CPU when a $110 Intel i3 beats it in every benchmark? Why not take the extra $40 and put THAT in a better GPU?


NoAxeToGrind

@comcastbusiness.net
said by Gordo74:

Why not take the extra $40 and put THAT in a better GPU?

That's EXACTLY what I'm saying - ignoring non-gaming issues, take the dollars saved on the CPU and put them into the better GPU, provided the CPU can get the max performance out of the GPU. I don't buy the system to run benchmarks, I buy it to run games. If the extra CPU performance will not show up in my gameplay because the GPU is the limiting factor, I'm better off getting a better GPU. From a strictly gaming perspective, it doesn't matter whether the cheaper processor is AMD or Intel; what matters is whether the cheaper processor can get the maximum performance out of the better GPU.


Octavean
Premium,MVM
join:2001-03-31
New York, NY
kudos:1
reply to NoAxeToGrind
Meh,….

I’m with Ghastlyone and Krisnatharok on this one,…

From Microcenter, I can buy:

Intel Core i7 3770K ~$229.99 (although I have seen it as low as ~$219.99)
Intel Core i5 3570K ~$189.99
AMD FX-8350 ~$189.99

Going over budget by ~$30 to $40 isn't that big of a deal to me. I just don't think such a small amount should be allowed to affect the performance or hardware choices.

There is a very old expression one of my past professors used to use:

"Pennywise and pound foolish..."

If going ~$30 to $40 over budget is going to break you or force a hardware change, why not just wait on the build until it doesn't?


Gordo74
Premium
join:2003-10-28
Monroeville, PA
reply to NoAxeToGrind
said by NoAxeToGrind :

said by Gordo74:

Why not take the extra $40 and put THAT in a better GPU?

That's EXACTLY what I'm saying - ignoring non-gaming issues, take the dollars saved on the CPU and put them into the better GPU, provided the CPU can get the max performance out of the GPU. I don't buy the system to run benchmarks, I buy it to run games. If the extra CPU performance will not show up in my gameplay because the GPU is the limiting factor, I'm better off getting a better GPU. From a strictly gaming perspective, it doesn't matter whether the cheaper processor is AMD or Intel; what matters is whether the cheaper processor can get the maximum performance out of the better GPU.

And you're contradicting yourself.

The i3-3220 outperforms ALL of the AMD chips save for one, which is ~$150. The i3 is $120. So by your logic, you would always get the i3.


El Quintron
Resident Mouth Breather
Premium
join:2008-04-28
Etobicoke, ON
kudos:4
Reviews:
·TekSavvy Cable
·TekSavvy DSL
reply to Octavean
said by Octavean:

If ~$30 to $40 over budget is going to break you or force a hardware change, why not just wait on the build until such time as it doesn’t have said effect.

Another thing to consider is that the AMD chip uses almost 50% more power... so, with that in mind, how long is your $40 saving going to last you?

Six months?

It's not worth it for an inferior chip by any stretch.
--
Support Bacteria -- It's the Only Culture Some People Have


NoAxeToGrind

@comcastbusiness.net
reply to Octavean
said by Octavean:

From Microcenter, I can buy:

Intel Core i7 3770K ~$229.99 (although I have seen it as low as ~$219.99)
Intel Core i5 3570K ~$189.99
AMD FX-8350 ~$189.99

Going over budget by ~$30 to $40 isn't that big of a deal to me. I just don't think such a small amount should be allowed to affect the performance or hardware choices.

There is a very old expression one of my past professors used to use:

"Pennywise and pound foolish..."

If going ~$30 to $40 over budget is going to break you or force a hardware change, why not just wait on the build until it doesn't?

I think $30-$40 is about the difference between an HD 6670 and an HD 7750. You'd rather have a 3770K with a 6670 than a 3570K with a 7750? I think you'd get about half the framerate. Who's pennywise and pound foolish?