Ghastlyone (Premium Member, join:2009-01-07, Nashville, TN) to me1212

Re: New benchmarks for the 8350 and 3570k/3770k. Weird results.

So to save money, you go for the AMD, which generates more heat and uses a lot more power?

Krisnatharok (PC Builder, Gamer; Premium Member, join:2009-02-11, Earth Orbit)

And is more expensive with less power for gaming?

NoAxeToGrind (Anon, @comcast.net)

More expensive? A quick search shows $200 for the FX-8350 vs. $330 for the 3770k. Last I heard, $200 is less than $330. You can pay for a lot of electricity with $130. Do you have figures for the electricity differential? How many hours of gameplay do you need before you burn through a $130 differential in electricity? If it's $.10 per hour (and I'm guessing it's a lot less), then that's 1,300 hours of gameplay. I'm thinking that's 2 years of gameplay for me. I'm guessing the differential is actually less than $.01 per hour, which pushes the break-even past 13,000 hours, probably 4 or 5 times the life of the system.
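
If you want to sanity-check that arithmetic, here's a rough sketch in Python; the per-hour figures are just the guesses above, not measurements:

# Hypothetical break-even: hours of gameplay before the cheaper CPU's extra
# electricity use eats up the purchase-price savings.
def break_even_hours(price_savings_usd, extra_cost_per_hour_usd):
    return price_savings_usd / extra_cost_per_hour_usd

print(break_even_hours(130, 0.10))  # 1300.0 hours at the worst-case $.10/hr guess
print(break_even_hours(130, 0.01))  # 13000.0 hours at the $.01/hr guess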

More power for gaming? If both systems are limited by the GPU in real gameplay, the extra computing power is wasted. If the GPU is only going to give you 50fps in your game with either chip, how are you better off spending another $130 for the same framerate?
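
Put that another way, as a toy model (the fps caps below are invented for illustration, not benchmarks):

# Toy bottleneck model: the framerate you see is set by the slower component,
# so extra CPU headroom doesn't show up once the GPU is the limit.
def observed_fps(cpu_fps_cap, gpu_fps_cap):
    return min(cpu_fps_cap, gpu_fps_cap)

print(observed_fps(cpu_fps_cap=140, gpu_fps_cap=50))  # pricier CPU: 50 fps
print(observed_fps(cpu_fps_cap=110, gpu_fps_cap=50))  # cheaper CPU: still 50 fps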

The point is that if your budget means that you're going to be GPU limited either way, maybe the $130 puts you into a better GPU and gets you better performance for the same dollars for the system as a whole with the FX-8350. If that's the case, you're better off putting the money into the GPU instead of the CPU.

Admittedly, you can get the 3570k for about $220 or so; but if history is any guide, the FX-8350 price will fall faster than the 3570k price. $20 may or may not make a difference in the GPU, but $50 probably will. Would you take a 3570k with a GT 610 over an 8350 with a GT 620? Sure would. What about a 3570k/GT610 over 8350/GT640? maybe, maybe not - tell me what the actual framerates are. 3770k/GT610 v 8350/GT650? I'd take the 8350/GT650.

The point is, the intels have more bang; but depending on the circumstances, the FX-8350/GPU combination for the same money may have more bang for the buck. If your budget means that your GPU is not going to be top of the line, and either CPU is going to be limited by the GPU performance, then get the best GPU you can for the difference to maximize your bang for the buck.

So to answer your questions: Yes. If the FX-8350/better GPU combo gets better framerates on my game than the same priced Intel/worse GPU combo, I'll take the FX-8350.

Ghastlyone (Premium Member, join:2009-01-07, Nashville, TN)

1 recommendation

said by NoAxeToGrind :

More expensive? A quick search shows $200 for the FX-8350 vs. $330 for the 3770k. Last I heard, $200 is less than $330. You can pay for a lot of electricity with $130. Do you have figures for the electricity differential? How many hours of gameplay do you need before you burn through a $130 differential in electricity? If it's $.10 per hour (and I'm guessing it's a lot less), then that's 1,300 hours of gameplay. I'm thinking that's 2 years of gameplay for me. I'm guessing the differential is actually less than $.01 per hour, which pushes the break-even past 13,000 hours, probably 4 or 5 times the life of the system.

Most people, even the ones reviewing the FX-8350 right on Newegg, compare it directly to the i5-3570K, not the 3770K. We're talking a 15-20 dollar difference, and a shit ton more power usage and heat.

You might brush 125W of power usage under the rug, compared to a measly 77W on the Intel, but I sure don't. And that's only at stock clocks and speeds. You can squeeze a lot more overclocking out of the 3570K on stock voltage than you can out of that 8350.
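
For scale, here's a back-of-the-envelope sketch of what that 48 W gap works out to per gaming hour; the electricity rate is an assumption, not a figure from anyone in this thread:

# Rough cost of the TDP gap, assuming both chips run flat out at rated TDP
# (which overstates real gaming draw) and an assumed $0.12/kWh electricity rate.
watt_gap = 125 - 77               # 48 W difference at stock
rate_per_kwh = 0.12               # assumed rate
cost_per_hour = (watt_gap / 1000) * rate_per_kwh
print(round(cost_per_hour, 4))    # ~0.0058, i.e. roughly half a cent per hour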

Why would you opt for any hardware that's similar in price, gets worse gaming performance, uses more power and creates a lot more heat?
said by NoAxeToGrind :

More power for gaming? If both systems are limited by the GPU in real gameplay, the extra computing power is wasted. If the GPU is only going to give you 50fps in your game with either chip, how are you better off spending another $130 for the same framerate?

The point is that if your budget means that you're going to be GPU limited either way, maybe the $130 puts you into a better GPU and gets you better performance for the same dollars for the system as a whole with the FX-8350. If that's the case, you're better off putting the money into the GPU instead of the CPU.

By that logic, you could just skimp on the CPU completely to save money and offset it with a 680 SLI or Crossfire equivalent?

NoAxeToGrind (Anon, @comcastbusiness.net)

I addressed your arguments re: 3570k in my post -- if the $20 is not going to make a difference in the GPU, then go with the better CPU. That said, I will add -- if the GPU is maxed out without overclocking the CPU, I don't see that overclocking is going to help with respect to gameplay. If you're Barry Bonds (CPU) trying to hit home runs in a little league ballpark (GPU), you don't really need the steroids (overclocking); they don't add anything until you get to a big league park (high end GPU). Most casual gamers will never notice the difference in power consumption and heat generation. They will notice the difference in framerates.
said by Ghastlyone:

By that logic, you could just skimp on the CPU completely to save money and offset it with a 680 SLI or Crossfire equivalent?

That's not what I'm saying. I'm saying you get the best bang for your buck on gaming applications by buying the best GPU you can afford with the least expensive CPU that can give you the max performance out of that GPU. That applies whether you are talking about a choice between the 8350 and 3770k or between the 3570k and 3770k. You may want a better CPU for other applications you are doing, and that's fine; but then we're not talking purely about gaming. I don't care if the CPU is Intel or AMD, the question is: can it give me the max performance out of the GPU it is paired with? If the answer is yes, and you can get more GPU with the AMD or cheaper Intel, go with the AMD or cheaper Intel. If the answer is no, then you need to shift dollars from the GPU budget to the CPU budget.
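
As a sketch of that selection rule, something like the following; every part name, price, and fps cap here is made up purely for illustration:

# Toy "bang for the buck" picker: framerate is capped by the slower part, so
# take the best GPU that fits the budget plus the cheapest CPU that keeps up.
cpus = [("cheap_cpu", 110, 90), ("mid_cpu", 200, 120), ("fast_cpu", 330, 140)]
gpus = [("low_gpu", 100, 50), ("mid_gpu", 170, 80), ("high_gpu", 230, 110)]

def best_build(budget):
    best = None
    for cpu_name, cpu_price, cpu_cap in cpus:
        for gpu_name, gpu_price, gpu_cap in gpus:
            if cpu_price + gpu_price > budget:
                continue
            fps = min(cpu_cap, gpu_cap)   # the slower part sets the framerate
            if best is None or fps > best[0]:
                best = (fps, cpu_name, gpu_name)
    return best

print(best_build(400))  # (90, 'cheap_cpu', 'high_gpu') -- the cheaper CPU frees budget for the bigger GPU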

I AM (Premium Member, join:2010-04-11, Ephrata, PA) to NoAxeToGrind

I went AMD with my build at the time. FX-6100. Cheaper mobo as well, so that I could Crossfire two cards. Now looking back, I wish I had gone Intel.

Outside of gaming a lot of software just works better with Intel than it does with AMD.

NoAxeToGrind (Anon, @comcastbusiness.net)

said by I AM:

I went AMD with my build at the time. FX-6100. Cheaper mobo as well, so that I could Crossfire two cards. Now looking back, I wish I had gone Intel.

Outside of gaming a lot of software just works better with Intel than it does with AMD.

Fair enough: Intel may be a better choice over AMD for non-gaming / non-budget reasons. If so, then go Intel.

But, gaming only, let me ask you -- does the 6100 max out your GPUs? What GPUs do you have versus the GPU you would have had to get with an intel build for the same dollars? How would that have affected your framerates?

If your alternate Intel build would give you similar framerates for the same price, then Intel gets the nod because of the non-gaming issues. If the AMD gives better framerates gaming, then you have to evaluate whether the performance loss on gaming is worth the performance gain on non-gaming apps.

I don't see that I'm saying anything controversial. I'm just saying you have to look at the whole system for the dollars spent. For gaming, the GPU, not the CPU, should be the limiting factor. I'm saying, strictly gaming, to get the best bang for your buck, get the best GPU you can afford, with the least expensive CPU that will let you take full advantage of that GPU.

Of course, if you are planning to upgrade your GPU when you can afford it, or if better GPUs come out, then you DO need the best CPU you can afford to take advantage of the potential GPU growth. But that's a different consideration. The typical gamer, as opposed to the hardware enthusiast, will replace the whole system when he can afford better or it becomes obsolete, rather than upgrade components.

Gordo74 (Premium Member, join:2003-10-28, Pittsburgh, PA)

1 recommendation

Your logic is flawed.

Why buy a $150 top of the line AMD CPU when a $110 Intel i3 beats it in every benchmark? Why not take the extra $40 and put THAT in a better GPU?

NoAxeToGrind (Anon, @comcastbusiness.net)

said by Gordo74:

Why not take the extra $40 and put THAT in a better GPU?

That's EXACTLY what I'm saying - ignoring non-gaming issues, take the dollars saved on the CPU and put them into the better GPU, provided the CPU can get the max performance out of the GPU. I don't buy the system to run benchmarks, I buy it to run games. If the extra CPU performance will not show up in my gameplay because the GPU is the limiting factor, I'm better off getting a better GPU. From a strictly gaming perspective, it doesn't matter whether the cheaper processor is AMD or Intel; what matters is whether the cheaper processor can get the maximum performance out of the better GPU.

Octavean (MVM, join:2001-03-31, New York, NY) to NoAxeToGrind

Meh,….

I’m with Ghastlyone and Krisnatharok on this one,…

From Microcenter, I can buy:

Intel Core i7 3770K ~$229.99 (although I have seen it as low as ~$219.99)
Intel Core i5 3570K ~$189.99
AMD FX 8350 for ~$189.99

Going over budget by ~$30 to $40 isn’t that big of a deal to me. I just don’t think such a small amount should be allowed to affect the performance or hardware choices.

There is a very old expression one of my past professors used to use.

“Pennywise and pound foolish,…. “

If ~$30 to $40 over budget is going to break you or force a hardware change, why not just wait on the build until such time as it doesn’t have said effect?

Gordo74 (Premium Member, join:2003-10-28, Pittsburgh, PA) to NoAxeToGrind

said by NoAxeToGrind :

said by Gordo74:

Why not take the extra $40 and put THAT in a better GPU?

That's EXACTLY what I'm saying - ignoring non-gaming issues, take the dollars saved on the CPU and put them into the better GPU, provided the CPU can get the max performance out of the GPU. I don't buy the system to run benchmarks, I buy it to run games. If the extra CPU performance will not show up in my gameplay because the GPU is the limiting factor, I'm better off getting a better GPU. From a strictly gaming perspective, it doesn't matter whether the cheaper processor is AMD or Intel; what matters is whether the cheaper processor can get the maximum performance out of the better GPU.

And you're contradicting yourself.

The i3-3220 outperforms ALL of the AMD chips save for one which is ~$150. The i3 is $120. So by your logic, you would always get the i3.

El Quintron (Cancel Culture Ambassador; Premium Member, join:2008-04-28, Tronna) to Octavean

said by Octavean:

If ~$30 to $40 over budget is going to break you or force a hardware change, why not just wait on the build until such time as it doesn’t have said effect?

Another thing to consider is that the AMD chip uses almost 50% more power... so with that in mind how long is your $40 saving going to last you?

Six months?

It's not worth it for an inferior chip by any stretch.

NoAxeToGrind (Anon, @comcastbusiness.net) to Octavean

said by Octavean:

From Microcenter, I can buy:

Intel Core i7 3770K ~$229.99 (although I have seen it as low as ~$219.99)
Intel Core i5 3570K ~$189.99
AMD FX 8350 for ~$189.99

Going over budget by ~$30 to $40 isn’t that big of a deal to me. I just don’t think such a small amount should be allowed to affect the performance or hardware choices.

There is a very old expression one of my past professors used to use.

“Pennywise and pound foolish,…. “

If ~$30 to $40 over budget is going to break you or force a hardware change, why not just wait on the build until such time as it doesn’t have said effect?

I think $30-$40 is about the difference between an HD 6670 and an HD 7750. You'd rather have a 3770k with a 6670 than a 3570k with a 7750? I think you'll get about half the framerate. Who's pennywise and pound foolish?

NoAxeToGrind (Anon) to El Quintron

I give up. Either nobody here can read or you're all Intel fanboys. I don't advocate the AMD chips; I say the cheapest chip, Intel or AMD, that can make use of the better GPU for the same total dollars. If you're too blinded by your chip prejudice to make that distinction, I'm wasting my time.

Krisnatharok (PC Builder, Gamer; Premium Member, join:2009-02-11, Earth Orbit) to NoAxeToGrind

said by NoAxeToGrind :

More expensive? A quick search shows $200 for the FX-8350 vs. $330 for the 3770k. Last I heard, $200 is less than $330. You can pay for a lot of electricity with $130. Do you have figures for the electricity differential? How many hours of gameplay do you need before you burn through a $130 differential in electricity? If it's $.10 per hour (and I'm guessing it's a lot less), then that's 1,300 hours of gameplay. I'm thinking that's 2 years of gameplay for me. I'm guessing the differential is actually less than $.01 per hour, which pushes the break-even past 13,000 hours, probably 4 or 5 times the life of the system.

More power for gaming? If both systems are limited by the GPU in real gameplay, the extra computing power is wasted. If the GPU is only going to give you 50fps in your game with either chip, how are you better off spending another $130 for the same framerate?

The point is that if your budget means that you're going to be GPU limited either way, maybe the $130 puts you into a better GPU and gets you better performance for the same dollars for the system as a whole with the FX-8350. If that's the case, you're better off putting the money into the GPU instead of the CPU.

Admittedly, you can get the 3570k for about $220 or so; but if history is any guide, the FX-8350 price will fall faster than the 3570k price. $20 may or may not make a difference in the GPU, but $50 probably will. Would you take a 3570k with a GT 610 over an 8350 with a GT 620? Sure would. What about a 3570k/GT610 over 8350/GT640? maybe, maybe not - tell me what the actual framerates are. 3770k/GT610 v 8350/GT650? I'd take the 8350/GT650.

The point is, the intels have more bang; but depending on the circumstances, the FX-8350/GPU combination for the same money may have more bang for the buck. If your budget means that your GPU is not going to be top of the line, and either CPU is going to be limited by the GPU performance, then get the best GPU you can for the difference to maximize your bang for the buck.

So to answer your questions: Yes. If the FX-8350/better GPU combo gets better framerates on my game than the same priced Intel/worse GPU combo, I'll take the FX-8350.

Mr. "No Axe to Grind", the FX-8350 is not remotely in the same ballpark as the i7-3770k.

The i5-3570K is $190 at Microcenter, $230 at Newegg. The 8350 is $190 at Microcenter, $200 at Newegg. So yeah, same price-range, and one is vastly superior to the other. Heck, when the 8350 was released, reviewers remarked that AMD has finally matched the performance of the Phenom II x6. Intel, meanwhile, has continued marching forward, hitting successive home-runs with the i5-2500k and i5-3570k.

And using the GT 610, 640, and 650 as examples paired with any of these processors is just lol.

BTW I hate false impartiality.

El Quintron (Cancel Culture Ambassador; Premium Member, join:2008-04-28, Tronna) to NoAxeToGrind

said by NoAxeToGrind :

I think $30-$40 is about the difference between an HD 6670 and an HD 7750. You'd rather have a 3770k with a 6670 than a 3570k with a 7750? I think you'll get about half the framerate. Who's pennywise and pound foolish?

What you're saying is absurd; no one who's in the market for a 3570k or an FX-8350 is even looking at a 3770k. Those chips aren't even aimed at the same audience.

The real comparison here is the 3570k vs. the FX-8350, and it's a no-brainer for the 3570K: it's the same price, performs better, and draws less power, making it not only the better performer but also the one with the lower cost of ownership, electricity-wise.

me1212 (Member, join:2008-11-20, Lees Summit, MO · Google Fiber) to Krisnatharok

said by Krisnatharok:

Heck, when the 8350 was released, reviewers remarked that AMD has finally matched the performance of the Phenom II x6.

You've hit on something there, something very important yet very sad when you think about it. For gaming (not sure about anything else, honestly haven't looked), AMD STILL hasn't outdone the Thuban Phenom II x6 or even the x4 -- heck, I hear the x4s overclock better, so technically they would be better for gaming -- and it's how many years after release?

I don't mean to bash AMD, as I own a Phenom II x6 1045T. I use it in my Linux box to compile C/C++ code for school; it compiles code wonderfully and hasn't had any problems with anything I've thrown at it. It even compiles GZDoom in under 5 minutes, and that's the heaviest gaming I do on that thing. For $200 it's probably the best thing I could have gotten at the time. The x6s were great chips, sure, but this many years later AMD needs to have something waaay better out.

I want AMD to succeed in gaming so we can have real competition, but I'm not sure if they do, to be honest.

Krisnatharok (PC Builder, Gamer; Premium Member, join:2009-02-11, Earth Orbit)

Bud, we're in the same boat. I like Intel's newest chips (although less enthused that Haswell will only be a 10% increase in power over Ivy Bridge), but I love value and efficiency more. I really wish I could justify an FX-4170 or 8350. But I can't. AMD needs to get its butt in gear and put its brand-new executives to work so they can become competitive against Intel outside of just the $120 pricepoint (where the FX-4170 matches the i3-3220, with potential to OC).

me1212 (Member, join:2008-11-20, Lees Summit, MO · Google Fiber)

Wow, only 10%? Is it at least across the board, or are they saying it's going to be like Ivy Bridge was, with 10% max but more like 4% for everyday usage? Even if it's across the board, Haswell won't have 20% over Sandy Bridge, which IIRC is what Sandy Bridge had over its predecessor.
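
Just to put numbers on how those per-generation gains stack up (using the rough 4% and 10% figures being tossed around here, not actual benchmarks):

# Compounding the rough per-generation gains mentioned above.
ivy_over_sandy = 0.04       # "more like 4% for everyday usage"
haswell_over_ivy = 0.10     # "only a 10% increase"
total = (1 + ivy_over_sandy) * (1 + haswell_over_ivy) - 1
print(f"{total:.1%}")       # ~14.4% over Sandy Bridge; ~21% even if both jumps were 10%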

I'm beginning to wonder if my 2500K won't last me a good 7 or more years at this rate; heck, I still haven't overclocked it as far as it can go.

The gamer part of me likes how this will let me keep my hardware longer, but the computer scientist part of me is concerned that we may not get a big jump for a long time.

Ghastlyone (Premium Member, join:2009-01-07, Nashville, TN)

said by me1212:

Wow, only 10%? Is it at least across the board, or are they saying it's going to be like Ivy Bridge was, with 10% max but more like 4% for everyday usage? Even if it's across the board, Haswell won't have 20% over Sandy Bridge, which IIRC is what Sandy Bridge had over its predecessor.

I'm beginning to wonder if my 2500K won't last me a good 7 or more years at this rate; heck, I still haven't overclocked it as far as it can go.

The gamer part of me likes how this will let me keep my hardware longer, but the computer scientist part of me is concerned that we may not get a big jump for a long time.

It's going to be similar to Sandy Bridge---->Ivy Bridge.

Basically, upgrading from Sandy Bridge to an Ivy Bridge was/is a waste of money.

Same will probably be said about IB--->Haswell.

And yeah regarding your 2500K lasting another 7 years, there are articles out there talking about how we're starting to hit diminishing returns with CPUs.

Krisnatharok (PC Builder, Gamer; Premium Member, join:2009-02-11, Earth Orbit)

Thank you, AMD, for failing to put competitive pressure on Intel. I hoped for leaps more in line with Moore's Law once tri-gate transistors were fully implemented.

Haswell should still be a nice increase over my aging i7-920, however.

me1212 (Member, join:2008-11-20, Lees Summit, MO · Google Fiber) to Ghastlyone

said by Ghastlyone:

And yeah regarding your 2500K lasting another 7 years, there are articles out there talking about how we're starting to hit diminishing returns with CPUs.

I am certain we are hitting diminishing returns with silicon. I have no idea what the next material we make chips out of will be, but it should be better than what we have now.

Still, even with diminishing returns it's possible for them to make something worth upgrading to; even if it doesn't have huge IPC increases, we can get other things: more PCIe lanes, more RAM (DDR4), etc. But like Krisnatharok said, no competition does this to us.
said by Ghastlyone:

It's going to be similar to Sandy Bridge---->Ivy Bridge.

Basically, upgrading from Sandy Bridge to an Ivy Bridge was/is a waste of money.

Same will probably be said about IB--->Haswell.

*sigh* Nearly as useless to upgrade from SB -----> Haswell then.

me1212 (Member) to Krisnatharok

It most certainly will be, even if the 920 is OC'd. But the i7-920 came out in Q4 of '08; it will have taken them over 4 years by the time Haswell comes out to make something worth upgrading to.

Ghastlyone (Premium Member, join:2009-01-07, Nashville, TN) to Krisnatharok

said by Krisnatharok:

Thank you, AMD, for failing to put competitive pressure on Intel.

No shit ^ Same goes for putting pressure on Nvidia on certain GPUs. I'd love to see some real competition on the GPU front.
said by Krisnatharok:

Haswell should still be a nice increase over my aging i7-920, however.

That should be a nice upgrade for you.

Ghastlyone (Premium Member, join:2009-01-07, Nashville, TN) to me1212

said by me1212:

said by Ghastlyone:

And yeah regarding your 2500K lasting another 7 years, there are articles out there talking about how we're starting to hit diminishing returns with CPUs.

I am certain we are hitting diminishing returns with silicon. I have no idea what the next material we make chips out of will be, but it should be better than what we have now.

Still, even with diminishing returns it's possible for them to make something worth upgrading to; even if it doesn't have huge IPC increases, we can get other things: more PCIe lanes, more RAM (DDR4), etc. But like Krisnatharok said, no competition does this to us.
said by Ghastlyone:

It's going to be similar to Sandy Bridge---->Ivy Bridge.

Basically, upgrading from Sandy Bridge to an Ivy Bridge was/is a waste of money.

Same will probably be said about IB--->Haswell.

*sigh* Nearly as useless to upgrade from SB -----> Haswell then.

I'll be happy as shit if I can get another 2-3 years out of my 3570K. If all I have to do is a GPU upgrade, maybe a larger SSD? That's cool with me.

me1212 (Member, join:2008-11-20, Lees Summit, MO · Google Fiber)

Don't get me wrong, my good sir, I'd love for my hardware to last me for years. Heck, if my 2500K can get at least to Skylake I'll be happy; sure, I'll need to upgrade my GPU in the meantime, but that's life. However, if it takes longer than the tick after Skylake, I'll be a bit worried about the amount of time it's taking to make progress.

I guess nothing can keep making groundbreaking strides forever. Maybe ARM + graphene is the future. I'd be okay with an ARM desktop; that would actually be kinda cool.

Octavean (MVM, join:2001-03-31, New York, NY) to NoAxeToGrind

said by NoAxeToGrind :

said by Octavean:

From Microcenter, I can buy:

Intel Core i7 3770K ~$229.99 (although I have seen it as low as ~$219.99)
Intel Core i5 3570K ~$189.99
AMD FX 8350 for ~$189.99

Going over budget by ~$30 to $40 isn’t that big of a deal to me. I just don’t think such a small amount should be allowed to affect the performance or hardware choices.

There is a very old expression one of my past professors used to use.

“Pennywise and pound foolish,…. “

If ~$30 to $40 over budget is going to break you or force a hardware change, why not just wait on the build until such time as it doesn’t have said effect?

I think $30-$40 is about the difference between an HD 6670 and an HD 7750. You'd rather have a 3770k with a 6670 than a 3570k with a 7750? I think you'll get about half the framerate. Who's pennywise and pound foolish?

You can force those constraints on yourself but I typically don’t buy / build systems with that kind of thinking.

“Pennywise and pound foolish” was used by my professor in the context of buying and using products/tools that were cheaper in price but either wouldn’t last as long or would be inadequate (possibly necessitating buying again prematurely). This was for a requisite Mechanical Drawing class for all Engineering students. He was trying to say it was better to spend more and get an excellent quality product than to cheap out.

So I would consider buying something you really don’t want and/or something inadequate in order to make a budget “Pennywise and pound foolish”.

I also think you might be missing the point, because I also suggested as an option abstaining from buying anything until such time as one can afford to buy what they really wanted or a better product (if it’s otherwise necessary due to budget constraints).

Where is it written that one must compromise, especially the way you suggest? Where is it written that one can’t wait a week or more to allow for more in the budget especially for a scant ~$30 to ~$40,….?

Krisnatharok (PC Builder, Gamer; Premium Member, join:2009-02-11, Earth Orbit)

Who the hell pairs a 3570k with a 7750 in the first place? That's a sub-$100 GPU.

If that's the GPU you're buying, you should be using the i3-3220 or even the Intel Pentium G2120 or weaker.

Ghastlyone (Premium Member, join:2009-01-07, Nashville, TN)

Might as well just use the HD 4000 onboard graphics at that point. lol

demir (Premium Member, join:2010-07-15, USA) to NoAxeToGrind

In about 5 months, there isn't going to be a low or mid-range GPU market because Haswell is dropping.

So, even if this guy's findings are real, nobody really cares now because they are all buying better GPUs for gaming anyway --- and when Haswell drops, then it really doesn't matter, because the low to mid range will be all about the CPU.