me1212
join:2008-11-20
Lees Summit, MO
·Google Fiber

me1212 to Ghastlyone

Member


Re: New benchmarks for the 8350 and 3570K/3770K. Weird results.

said by Ghastlyone:

And yeah regarding your 2500K lasting another 7 years, there are articles out there talking about how we're starting to hit diminishing returns with CPUs.

I'm certain we're hitting diminishing returns with silicon. I have no idea what the next material we make chips out of will be, but it should be better than what we have now.

Still, even with diminishing returns it's possible for them to make something worth upgrading to. Even if it doesn't have huge IPC increases, we can get other things: more PCIe lanes, more RAM (DDR4), etc. But like Krisnatharok said, no competition does this to us.
said by Ghastlyone:

It's going to be similar to Sandy Bridge---->Ivy Bridge.

Basically, upgrading from Sandy Bridge to an Ivy Bridge was/is a waste of money.

Same will probably be said about IB--->Haswell.

*sigh* Nearly as useless to upgrade from SB -----> Haswell then.
me1212

me1212 to Krisnatharok

Member

It most certainly will be, even if the 920 is OC'd. But the i7 came out in Q4 of '08, so it will have taken them over four years by the time Haswell comes out to make something worth upgrading to.

Ghastlyone
Premium Member
join:2009-01-07
Nashville, TN

Ghastlyone to Krisnatharok

Premium Member

said by Krisnatharok:

Thank you, AMD, for failing to put competitive pressure on Intel.

No shit ^ Same goes for putting pressure on Nvidia on certain GPUs. I'd love to see some real competition on the GPU front.
said by Krisnatharok:

Haswell should still be a nice increase over my aging i7-920, however.

That should be a nice upgrade for you.
Ghastlyone

Ghastlyone to me1212

Premium Member

said by me1212:

said by Ghastlyone:

And yeah regarding your 2500K lasting another 7 years, there are articles out there talking about how we're starting to hit diminishing returns with CPUs.

I'm certain we're hitting diminishing returns with silicon. I have no idea what the next material we make chips out of will be, but it should be better than what we have now.

Still, even with diminishing returns it's possible for them to make something worth upgrading to. Even if it doesn't have huge IPC increases, we can get other things: more PCIe lanes, more RAM (DDR4), etc. But like Krisnatharok said, no competition does this to us.
said by Ghastlyone:

It's going to be similar to Sandy Bridge---->Ivy Bridge.

Basically, upgrading from Sandy Bridge to an Ivy Bridge was/is a waste of money.

Same will probably be said about IB--->Haswell.

*sigh* Nearly as useless to upgrade from SB -----> Haswell then.

I'll be happy as shit if I can get another 2-3 years out of my 3570K. If all I have to do is a GPU upgrade, and maybe a larger SSD, that's cool with me.
me1212
join:2008-11-20
Lees Summit, MO
·Google Fiber

me1212

Member

Don't get me wrong, my good sir, I'd love for my hardware to last me for years. Heck, if my 2500K can last at least until Skylake I'll be happy; sure, I'll need to upgrade my GPU in the meantime, but that's life. However, if it takes longer than the tick after Skylake, I'll be a bit worried about the amount of time it's taking to make progress.

I guess nothing can keep making groundbreaking strides forever. Maybe ARM + graphene is the future. I'd be okay with an ARM desktop; that would actually be kinda cool.

Octavean
MVM
join:2001-03-31
New York, NY

Octavean to NoAxeToGrind

MVM

said by NoAxeToGrind :

said by Octavean:

From Microcenter, I can buy:

Intel Core i7 3770K ~$229.99 (although I have seen it as low as ~$219.99)
Intel Core i5 3570K ~$189.99
AMD FX 8350 for ~$189.99

Going over budget by ~$30 to $40 isn't that big of a deal to me. I just don't think such a small amount should be allowed to affect the performance or hardware choices.

There is a very old expression one of my past professors used to use:

“Pennywise and pound foolish...”

If ~$30 to $40 over budget is going to break you or force a hardware change, why not just wait on the build until such time as it doesn't have said effect?

I think $30-$40 is about the difference between an HD 6670 and an HD 7750. You'd rather have a 3770K with a 6670 than a 3570K with a 7750? I think you'll get about half the framerate. Who's pennywise and pound foolish?

You can force those constraints on yourself but I typically don’t buy / build systems with that kind of thinking.

“Pennywise and pound foolish” was used by my professor in the context of buying and using products/tools that were cheaper in price but either wouldn't last as long or would be inadequate (possibly necessitating buying again prematurely). This was for a Mechanical Drawing class required of all Engineering students. He was trying to say it was better to spend more and get an excellent-quality product than to cheap out.

So I would consider buying something you really don't want and/or something inadequate in order to make a budget “Pennywise and pound foolish”.

I also think you might be missing the point, because I also suggested, as an option, abstaining from buying anything until such time as one can afford what they really wanted, or a better product (if it's otherwise necessary due to budget constraints).

Where is it written that one must compromise, especially the way you suggest? Where is it written that one can't wait a week or more to allow for more in the budget, especially for a scant ~$30 to ~$40?

Krisnatharok
PC Builder, Gamer
Premium Member
join:2009-02-11
Earth Orbit

Krisnatharok

Premium Member

Who the hell pairs a 3570K with a 7750 in the first place? That's a sub-$100 GPU.

If that's the GPU you're buying, you should be using the i3-3220 or even the Intel Pentium G2120 or weaker.

Ghastlyone
Premium Member
join:2009-01-07
Nashville, TN

Ghastlyone

Premium Member

Might as well just use the HD 4000 onboard graphics at that point. lol

NoAxeToGrind
@comcastbusiness.net

NoAxeToGrind to El Quintron

Anon

I give up. Either nobody here can read or you're all Intel fanboys. I don't advocate the AMD chips; I say the cheapest chip, Intel or AMD, that can make use of the better GPU for the same total dollars. If you're too blinded by your chip prejudice to make that distinction, I'm wasting my time.
demir
Premium Member
join:2010-07-15
usa

demir

Premium Member

In about 5 months, there isn't going to be a low- or mid-range GPU market, because Haswell is dropping.

So even if this guy's findings are real, nobody really cares now, because they're all buying better GPUs for gaming anyway --- and when Haswell drops it really won't matter, because the low to mid range will be all about the CPU.

El Quintron
Cancel Culture Ambassador
Premium Member
join:2008-04-28
Tronna

El Quintron to NoAxeToGrind

Premium Member

said by NoAxeToGrind :

I give up. Either nobody here can read or you're all Intel fanboys. I don't advocate the AMD chips; I say the cheapest chip, Intel or AMD, that can make use of the better GPU for the same total dollars. If you're too blinded by your chip prejudice to make that distinction, I'm wasting my time.

You're making an invalid comparison on all fronts; that's why your point has no traction. On top of which, you keep mentioning GPUs no one who games would buy.

I'm willing to accept that there's a competitive AMD chip out there; it just isn't any of the chips you've mentioned.

Krisnatharok
PC Builder, Gamer
Premium Member
join:2009-02-11
Earth Orbit

Krisnatharok

Premium Member

This has already been mentioned. At the $120 price point, the FX-4170 is equivalent to the i3-3220, and has OC potential to boot, if you don't care about the huge power waste to get there and the aftermarket cooler it will require.

At any other price point, Intel reigns supreme.

End of discussion, unless you bring facts with you.

El Quintron
Cancel Culture Ambassador
Premium Member
join:2008-04-28
Tronna

El Quintron

Premium Member

said by Krisnatharok:

This has already been mentioned. At the $120 price point, the FX-4170 is equivalent to the i3-3220, and has OC potential to boot, if you don't care about the huge power waste to get there and the aftermarket cooler it will require.

But when all costs are factored in, that still makes it a more expensive chip.

So the only benefit (cost effectiveness) is negated by all the hidden costs.
me1212
join:2008-11-20
Lees Summit, MO

me1212

Member

True, but if it's the best you can afford at the time, then some people may get it. Especially if you live near a Microcenter, where you get $40 off a motherboard when you buy an FX chip.

El Quintron
Cancel Culture Ambassador
Premium Member
join:2008-04-28
Tronna

El Quintron

Premium Member

said by me1212:

True, but if it's the best you can afford at the time, then some people may get it. Especially if you live near a Microcenter, where you get $40 off a motherboard when you buy an FX chip.

Even so, you'd really have to make a logical leap to justify any type of real-world savings.

It's not like people can't like impractical things; it's just that there's no immediate apparent benefit to getting the above-mentioned AMD chip.
me1212
join:2008-11-20
Lees Summit, MO
·Google Fiber

me1212

Member

said by El Quintron:

said by me1212:

True, but if it's the best you can afford at the time, then some people may get it. Especially if you live near a Microcenter, where you get $40 off a motherboard when you buy an FX chip.

Even so, you'd really have to make a logical leap to justify any type of real-world savings.

It's not like people can't like impractical things; it's just that there's no immediate apparent benefit to getting the above-mentioned AMD chip.

Sadly, in today's world, not really; people see lower price + $40 off a mobo and jump on it. Usually it's unknowledgeable first-time buyers, yes, but sadly it still happens.

That said, I wouldn't buy one, save for maybe the 6-core one, and even that would only be for code compiling, since Intel doesn't have an affordable 6-core CPU. Granted, I have a Phenom II X6 that I use for that now, so even that wouldn't happen for 4 or so years, when there will no doubt be a better 6-core on the market.
me1212

me1212

Member

So, the first guy did another video, this time with a 670, and OC'd both processors. I have to say I can kinda see this one maybe being legit; I mean, one is clocked 500MHz higher than the other, but both performed very well and more traded blows than anything this time.

»www.youtube.com/watch?v= ··· 7kDGSRfc

Krisnatharok
PC Builder, Gamer
Premium Member
join:2009-02-11
Earth Orbit

Krisnatharok

Premium Member

Man, I am tempted to buy both CPUs and do my own controlled tests just to put this controversy to bed.

How hard is it to run the game through a prerecorded loop or test to gauge frame rates?
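A controlled test like that mostly comes down to logging per-frame render times over a fixed loop and summarizing them; average FPS alone hides stutter, which is why 1% lows matter. A minimal sketch in Python (the frame-time numbers are invented for illustration, not measurements from either chip):

```python
def summarize(frame_times_ms):
    """Return (average FPS, 1% low FPS) for a list of frame times in ms."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # "1% low": average FPS over the slowest 1% of frames.
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

# Hypothetical log: 99 smooth 16 ms frames plus one 40 ms hitch.
frames = [16.0] * 99 + [40.0]
avg, low = summarize(frames)
print(f"avg: {avg:.1f} FPS, 1% low: {low:.1f} FPS")  # the hitch drags the 1% low to 25 FPS
```

FRAPS or a demo-playback benchmark produces exactly this kind of frame-time log; the summary math is the easy part, the hard part is keeping the loop repeatable between chips.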

Octavean
MVM
join:2001-03-31
New York, NY

Octavean to me1212

MVM

said by me1212:

said by El Quintron:

said by me1212:

True, but if it's the best you can afford at the time, then some people may get it. Especially if you live near a Microcenter, where you get $40 off a motherboard when you buy an FX chip.

Even so, you'd really have to make a logical leap to justify any type of real-world savings.

It's not like people can't like impractical things; it's just that there's no immediate apparent benefit to getting the above-mentioned AMD chip.

Sadly, in today's world, not really; people see lower price + $40 off a mobo and jump on it. Usually it's unknowledgeable first-time buyers, yes, but sadly it still happens.

That said, I wouldn't buy one, save for maybe the 6-core one, and even that would only be for code compiling, since Intel doesn't have an affordable 6-core CPU. Granted, I have a Phenom II X6 that I use for that now, so even that wouldn't happen for 4 or so years, when there will no doubt be a better 6-core on the market.

I've come across ~$40 motherboards for Intel processors as well, so that takes some of the thunder away from the AMD build price advantage.
me1212
join:2008-11-20
Lees Summit, MO

me1212

Member

Microcenter also has $40 off when you buy AMD FX chips, as well as ~$40 motherboards for them.
me1212

me1212 to Krisnatharok

Member

I hear ya, man; if I had the money I would do the same thing. Not to mention he isn't even testing at the same clock speeds.

Ghastlyone
Premium Member
join:2009-01-07
Nashville, TN

Ghastlyone

Premium Member

What are the clock speeds he's testing at? I can't watch the video just yet.

Krisnatharok
PC Builder, Gamer
Premium Member
join:2009-02-11
Earth Orbit

Krisnatharok to me1212

Premium Member

said by me1212:

Not to mention he isn't even testing at the same clock speeds.

That wouldn't matter much. The 8350 and 3570K have different TDPs and varying headroom, and the chip architectures are so different that running them at the same clock speed would not do anything towards removing a variable, except to prove that Ivy Bridge's design and architecture is far superior to Piledriver's.

It goes back to the "MOAR COREZ" and "MORE GHZ" myths. That's also why the i3-3220 is such a fantastic little chip, despite being dual-core.
me1212
join:2008-11-20
Lees Summit, MO

me1212 to Ghastlyone

Member

4.5GHz for the 3570K, 5.01GHz (or something like that, just a tiny bit over 5GHz) for the 8350.
me1212

me1212 to Krisnatharok

Member

I know it would just remove that variable, and that's what I want to see, because each chip overclocks differently. Some 3570Ks can hit 5GHz, and some 8350s can only hit 4.5GHz. I'd like a more apples-to-apples comparison than what we've gotten.
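Short of same-clock runs, you can get a rough clock-for-clock number from results like these by dividing each score by the clock it was achieved at. A sketch with invented FPS figures, purely to show the arithmetic (the real numbers would have to come from the video):

```python
def per_ghz(score, clock_ghz):
    """Benchmark score normalized per GHz - a crude clock-for-clock proxy."""
    return score / clock_ghz

# Hypothetical CPU-bound results, NOT taken from the video.
fps_3570k, clock_3570k = 90.0, 4.5
fps_8350, clock_8350 = 85.0, 5.0

r_intel = per_ghz(fps_3570k, clock_3570k)  # 20.0 FPS per GHz
r_amd = per_ghz(fps_8350, clock_8350)      # 17.0 FPS per GHz
print(f"clock-for-clock advantage: {r_intel / r_amd:.2f}x")  # 1.18x
```

The caveat is that this only approximates the architecture gap if the test is actually CPU-bound and scales linearly with clock, which games often don't.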

Krisnatharok
PC Builder, Gamer
Premium Member
join:2009-02-11
Earth Orbit

Krisnatharok to me1212

Premium Member

said by me1212:

4.5GHz for the 3570K, 5.01GHz (or something like that, just a tiny bit over 5GHz) for the 8350.

Assuming he absolutely maxed out both under the same CPU cooler, that's a fair comparison. Everyone knows IB doesn't OC as well as SB, but it has PCIe 3.0 and USB 3.0, is faster clock-for-clock, etc.
me1212
join:2008-11-20
Lees Summit, MO
·Google Fiber

1 edit

me1212

Member

True, if he did do that then it's 100% a fair comparison; however, most people I've seen using an H100 (it looks like he was using one) can get 4.7GHz on the 3570K. I suppose he could have just gotten a chip that doesn't OC as well; it happens.

Anyway, what I meant was I'd like to see how they perform at the same speed, just because I think it would be cool to see. Sure, most of them won't have the same limit in the real world, but I think it would still be fun to see what would happen.

EDIT: found a couple things that supposedly show what the average OC for these chips is, for multiple types of cooling even.

»hwbot.org/hardware/proce ··· 5_3570k/
»hwbot.org/hardware/proce ··· fx_8350/

C0deZer0
Oc'D To Rhythm And Police
Premium Member
join:2001-10-03
Tempe, AZ

C0deZer0 to me1212

Premium Member

I find this rather interesting... most of all, though, his video shows that the biggest differences were when playing while streaming your video output, with the same results. I suppose in this case, how XSplit handles streaming a live game running just seems to be handled in a friendlier fashion on an FX-8350 than on any of the Ivy Bridge i7s.

I just know that, depressingly, I really wish AMD's boards supported SLI proper. That, and for the time being, none of theirs have implemented PCIe 3.0 yet. Z77 and Ivy Bridge boards all seem feature-lacking as well... it's a case where, most of the time, you can't have an Intel board that "has it all" unless you go with an X79-based board. Even there, you're still contending with the fact that Intel has yet to make an Ivy Bridge-E chip, there's no conclusive support for PCIe 3.0 on an X79-based platform, and - knowing Intel - they'll release a new chipset to go with a (maybe) IB-E that would require replacing the motherboard to go that route.

I just know that, for me, I try to get boards that are feature-complete, simply because I almost always end up using every available port; maybe not right away, but if it has it, I'll eventually end up using it. About the only things I haven't really used much at all on my old 680i SLI board are the second NIC (for all but configuring a few wireless bridges - and even then, for maybe all of five minutes?) and the onboard audio, once I got my X-Fi.

Octavean
MVM
join:2001-03-31
New York, NY

Octavean

MVM

Modern ATI/AMD video cards report themselves as PCIe 3.0 on X79 boards, and modern nVidia cards don't, due to drivers. Presumably you can force PCIe 3.0 operation on X79 / nVidia video card combos:

»hardforum.com/showthread ··· =1700629

»forums.geforce.com/defau ··· 19-2012/

The nVidia excuse seems to be due to inconsistencies with respect to X79 motherboards and some Sandy Bridge-E processors.

I might give the "force-enable-gen3.exe" solution a try since I went from an HD 6870 to a GTX 670, but I suspect the difference between 5GT/s PCIe 2.0 and 8GT/s PCIe 3.0 isn't a problem that really needs fixing at this point. So for me, if it isn't broken, I'm not going to try and fix it. Unless there is a noteworthy performance improvement due to PCIe 3.0 support "now" on a single GTX 670, then meh... and I suspect it's meh.
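The raw numbers back up the "meh": PCIe 2.0 runs 5 GT/s with 8b/10b encoding, while PCIe 3.0 runs 8 GT/s with the leaner 128b/130b encoding, so the usable bandwidth of an x16 slot roughly doubles. A quick sketch of the arithmetic:

```python
def x16_bandwidth_gbs(gt_per_s, encoding_efficiency):
    """Usable GB/s of an x16 PCIe slot: GT/s * encoding efficiency / 8 bits, times 16 lanes."""
    return 16 * gt_per_s * encoding_efficiency / 8.0

gen2 = x16_bandwidth_gbs(5.0, 8 / 10)     # 8b/10b encoding
gen3 = x16_bandwidth_gbs(8.0, 128 / 130)  # 128b/130b encoding
print(f"PCIe 2.0 x16: {gen2:.2f} GB/s, PCIe 3.0 x16: {gen3:.2f} GB/s")
```

That works out to 8 GB/s versus about 15.75 GB/s, and a single GTX 670 doesn't come close to saturating even the former, which is why forcing Gen3 on X79 is unlikely to show up in frame rates.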

I still question whether or not Intel will release a new chipset for Ivy Bridge-E. I'd personally like to see Intel add Thunderbolt support to X79 boards (or a new chipset) but they would either have to drop the Intel video requirement (unlikely IMO) or add Intel video to newer CPUs / boards.

C0deZer0
Oc'D To Rhythm And Police
Premium Member
join:2001-10-03
Tempe, AZ

C0deZer0

Premium Member

But that's just it... I thought it was the CPU that governed whether you could use PCIe 3.0 or 2.0 on Intel, since that's the word from just about every motherboard maker out there. And since there's no Ivy Bridge-E to know for sure, we can't verify whether these X79 boards really can do PCIe 3.0 or are just saying they can.

And just my opinion, but I fully expect Intel to change socket and chipsets again if they ever do bother to make an Ivy Bridge-E.

That is one thing that has bothered me about the current state of Z77 boards out there. It seems that to get even one Thunderbolt port, you suddenly lose a lot of available ports of every other type. FireWire is usually a casualty, as is at least half the SATA and USB ports that boards without Thunderbolt provide. That's why, as it stands, there really isn't a single Z77 board I'd like enough to replace even my old 680i SLI.