
Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12


Very Specific Math-Based PCIe 3.0 Question

I'm wondering how much of a performance loss is caused by operating two 7970s in CrossFire at 2560x1440 when at 8x versus 16x.

I am trying to figure out the unknown coefficient of performance at given resolutions so I can predict how much of a difference PCIe 3.0 makes compared to 2.0.

The data is here: »i.imgur.com/4jAHC.png (source: BF3 with two 680s, settings are ultra or high).

Tested resolutions:
1920x1080 = 2,073,600 pixels
5760x1080 = 6,220,800 pixels

My resolution:
2560x1440 = 3,686,400 pixels

Frames Per Second (Looking at just Ultra) - PCIe 2.0 vs. 3.0
1920x1080
Low: 60 vs 68 (13.33%)
High: 172 vs. 190 (10.47%)
Average: 113.3 vs. 122.6 (8.23%)

5760x1080
Low: 32 vs. 53 (65.63%)
High: 69 vs. 97 (40.58%)
Average: 52.4 vs. 78.6 (50.01%)

Is there a way to take this data and calculate what the performance would be at 2560x1440, and then the difference between PCIe 2.0 and 3.0 at that resolution?
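One straightforward way to sketch it is a linear interpolation on total pixel count between the two tested resolutions. This is just a model, not the thread's actual spreadsheet; the real scaling curve is unknown, and the only inputs are the resolutions and average gains quoted above:

```python
# Linear interpolation of the average PCIe 3.0-over-2.0 gain by total
# pixel count, using the two tested resolutions as endpoints.
# Assumes the gain scales linearly with pixels, which is only a rough
# model of the real bottleneck behavior.

def interpolate_gain(pixels, p_lo, gain_lo, p_hi, gain_hi):
    """Linearly interpolate the percentage gain at a given pixel count."""
    t = (pixels - p_lo) / (p_hi - p_lo)
    return gain_lo + t * (gain_hi - gain_lo)

p_1080 = 1920 * 1080   # 2,073,600 px, measured avg gain  8.23%
p_eyef = 5760 * 1080   # 6,220,800 px, measured avg gain 50.01%
p_1440 = 2560 * 1440   # 3,686,400 px, the unknown

gain_1440 = interpolate_gain(p_1440, p_1080, 8.23, p_eyef, 50.01)
print(round(gain_1440, 1))  # prints 24.5
```

Note that interpolating the raw average FPS values instead (about 89.6 vs. 105.5 FPS on the same model) gives a smaller figure, roughly 18%, so the answer depends on which quantity you assume scales linearly with pixel count.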
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
In other words, if the given increase (3.0 over 2.0) is 8.23% at 1920x1080, but is 50.01% at 5760x1080, how much will it be at 2560x1440?

Stupid math problems.


kvn864

join:2001-12-18
Sun City, AZ
kudos:1
Chris, a very quick comparison just in my head shows something in the line of 15-20%.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
Is that linear or logarithmic?


kvn864

join:2001-12-18
Sun City, AZ
kudos:1
Linear. Based on an average of 45-60 FPS being good and playable, those numbers won't matter all that much. I mean, 45 vs. 48 FPS, does it matter?


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
How does this look?

It looks to be 26.2% more FPS at 1440p, assuming:
• An 8.23% increase, on average, at 1920x1080 (122.6 vs. 113.3 FPS)
• A 50.01% increase, on average, at 5760x1080 (78.6 vs. 52.4 FPS)
• BF3 on Ultra Settings
• Twin GTX 680s (platform unknown)


kvn864

join:2001-12-18
Sun City, AZ
kudos:1
26% is the ideal scenario. Are you looking into getting a new mobo or GPU? Honestly, I think either way isn't worth your cash.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
I'm upgrading my PC this year. Trying to decide between an LGA 1155/1150 build (PCIe 3.0 x8/x8) or an LGA 2011 build (PCIe 3.0 x16/x16), and adding a second 7970.

26% is a fairly sizeable upgrade. I think I'll go with LGA 2011. The i7-3820 is $230 from Microcenter, which is exactly the same price as an i7-3770K, and only $40 more than an i5-3570K.

I know the motherboard will be more expensive, but I think whatever I pay extra is worth the performance gain.


kvn864

join:2001-12-18
Sun City, AZ
kudos:1
Is your 7970 underperforming now on the 27" you have?


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
Yes. I get 30-45 FPS average, sometimes dipping under 30 FPS. Keep in mind I tend to play the latest games (Crysis 2, Skyrim w/mods, The Witcher 2, etc.).

1440p is a bitch, even if it is beautiful. If I had to do it again, I'd get a 24" 1080p monitor, working up to three of them. I don't see the added strain on the GPU as worth the extra pixels at 1440p.

There is the chance my i7-920 is starting to be the bottleneck, so I'll probably upgrade the CPU/mobo and see where I stand before buying a second 7970.


kvn864

join:2001-12-18
Sun City, AZ
kudos:1
Really interesting. Good quick summary of 1440p; I see what you are doing now. Then yes, I agree with LGA 2011, and most importantly with adding the second GPU. That should move you into 60-70 FPS territory.
But I would add the second 7970 to your existing setup first and see what happens; I think you will see something close to what you would see with 2011.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
Yeah, that's why I caveated my question the way I did, because otherwise I'd get some uninformed, self-described "enthusiast" saying "durr, get a 3770K, it's all the CPU you need" (it has already happened on Reddit).

Thanks for the confirmation of what I was thinking, I believe I'll pick up an i7-3820 and an LGA 2011 mobo.

I've never really looked at the X79 mobos; I always assumed I'd go the mainstream route with my next build.

Does anyone have any favorites?


pnjunction
Teksavvy Extreme
Premium
join:2008-01-24
Toronto, ON
kudos:1
reply to Krisnatharok
Hmm, I have been thinking about getting a new GPU and a 1440p monitor this year, but if this type of performance scaling is common in games, then it seems I'll be limited by the PCIe 2.0 in my LGA 1156 system even if the CPU holds up.

At that point I might as well sell my whole system (i7-860, 8 GB, 6970) and build from scratch when Haswell and the next-gen GPUs are out. That, or hold on to the whole thing until next year... pretty sick of my ancient, cheap 24" TN panel though.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
Go for a quality 24" or 27" 1080p monitor, and work your way up to three of them. 1080p is much kinder on a GPU than 1440p (which has almost twice as many pixels).


pnjunction
Teksavvy Extreme
Premium
join:2008-01-24
Toronto, ON
kudos:1
reply to Krisnatharok
Linear approximation looks OK. My guess would be that the real curve lies below the linear fit, as in it only ramps up steeply as the bus becomes more of a bottleneck. That makes 26% a worst case; IMO it's probably not quite that bad.


pnjunction
Teksavvy Extreme
Premium
join:2008-01-24
Toronto, ON
kudos:1
reply to Krisnatharok
Well, I don't see how working up to 3x 1080p (I prefer 1920x1200, but close enough) is any easier on the GPU than the 1440p. Also, there are really no in-between steps to work through; two monitors for gaming doesn't really work, so you'd pretty much just jump from one to three at some point.

I'm not a fan of multi-monitor for gaming anyway; I would just want the most I can get from a single monitor. How much time do people really spend looking at the side monitors in those triple setups?


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
It's not; I was just saying that I'd rather go full-blown 3x 1080p than be stuck at a single 1440p. If you don't plan on upgrading monitors again, a single 1440p does fine, but do realize it has almost twice the pixels of a 1080p monitor and plan your GPU solution accordingly.


kvn864

join:2001-12-18
Sun City, AZ
kudos:1
The other day I was at Fry's reading webpages on 1440p. Realizing that my eyes aren't getting any better (and this is with the regular reading glasses I use on my 24"), I saw that I would struggle with text if I got the 27". Also, now I see that gaming will require an upgrade. I am still running a 5870 and it still keeps up with the 1920x1200 24" I have. However, I think I would enjoy a new 1080p 27", and that is what I have decided to look into in the future. Not sure if I need/want 3D; there is still much room to research before I purchase, and they are relatively inexpensive. Thanks for the feedback, Chris.


bunsenburner
Cinematic Immunity

join:2005-10-11
Charleston, SC
said by kvn864:

I am still running 5870 and it still keeps up with 1920X1200 24 I have.

My GTX680 is somewhere in the middle of the country on an RMA extravaganza (curse you solar activity, neutrinos, cosmic radiation and/or some other such video card destroying invisibility!) and I'm missing it sorely as the 5870 I took out of the drawer to tide me over is getting positively hammered at 1920x1200. Skyrim, GTA IV, Crysis 2, electric boogaloo, et alia, are dropping frames faster than Sally Goodhead's Teflon Jr. prom dress.

And if that were not enough, certain ATi cards do not play well with ACD monitors during POST and I must use the HD3000 in a well overclocked 2600k to access the "BIOS Within The Bowels©" (an inspired, if not stinky, Asus innovation) of a P8Z77-V Deluxe. Yowza!

Man! All this science, I just don't understand...


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
That sucks man, what company made your 680?


bunsenburner
Cinematic Immunity

join:2005-10-11
Charleston, SC
said by Krisnatharok:

That sucks man, what company made your 680?

EVGA. Things break, it happens, not to worry. They are a wonderful company to deal with. No hold time on the phone, and I mean one ring, no questions asked, very polite, and instant RMA approval. They seem to take customer satisfaction quite seriously and were sincerely apologetic about the card failure.

Asus, on the other hand...


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
I've had awesome experiences with EVGA. Did you pay for the cross-ship? It's worth it, IMO.


bunsenburner
Cinematic Immunity

join:2005-10-11
Charleston, SC
I didn't. Would have only made a few days difference. As long as Nvidia keeps designing good cards, I'll be using EVGA again.


Moos
Tequilablob
Premium
join:2008-12-11
Salt Lake City, UT
kudos:3
Reviews:
·Comcast
reply to Krisnatharok
Have you seen this article yet? If you're interested in some more data and not already aware of it, they do some pretty good, unbiased testing of PCIe 3.0 vs. 2.0. They test quite a few games and video card setups.

»www.hardocp.com/article/2012/07/···KE6XC2ms

Like you said, your i7-920 is most likely bottlenecking you in the games you listed. I only say this because I experienced the same thing with my 2500K. When I overclocked it to 4.4 GHz I saw an FPS jump of about 10 in BF3. I run Skyrim modded out at around 45 FPS on a 7950 @ 2560x1440, and I also have two other 1080p monitors running in the background. I do have my 7950 overclocked to a modest 1100 MHz. I think the days of an i5 being all you ever need for gaming are coming to an end, especially with games like Crysis 3, Star Citizen, and The Witcher 3 on the horizon.

I went through much of the same process you're going through now when deciding between the LGA 2011 platform and a Z77 upgrade. What I found was that, for gaming, the upgrade costs did not justify the performance gains unless you're planning on running 3 or 4 SLI/CrossFire cards. I've elected to do nothing until Haswell comes out, but I am toying with adding a second 7950 on my current Z68 motherboard just for fun.

Another thing I have found from reading other people's tests (no testing of my own, so it's somewhat opinion that I tend to agree with) is that PCIe 3.0 is approximately 2 times faster than PCIe 2.0. So basically one PCIe 2.0 slot @ 16x is equivalent to one PCIe 3.0 slot @ 8x. But I have not yet seen any data showing that a single card can saturate a PCIe 2.0 16x slot. From other people's experiences/testing, I have concluded that there does not seem to be a meaningful performance difference between Z77 and X79 for gaming until you start running 3 cards in SLI/CrossFire. Two cards in SLI/CrossFire seem to show negligible or very small gains on X79 setups vs. Z77.

If you're dead set on upgrading now, my recommendation would be to grab a Z77 setup and save yourself some money (unless you're going to run 3 or 4 cards). I have elected to wait until Haswell myself due to the very minor upgrade in going from Sandy to Ivy. If you can hold out till Haswell, it would probably be best. My bet is Haswell will be out before Ivy Bridge-E (who knows if they will skip Ivy-E altogether?).

That was my thought process when looking at LGA 2011 chips, anyway. I'm curious to hear what other people have found too.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
Ugh, that article is comparing two- and three-way GPUs on LGA 1155, which means the only time you get 16x lanes--on PCIe 2.0 or 3.0--is in a single-card configuration.

The whole article is then wasted by comparing mainstream chipset to mainstream chipset.

No one I know is asking if their purchasing decision is between a Z68 or a Z77 motherboard. The *real* question at hand is whether X79 (being able to run dual GPUs at 16x/16x) is worth the premium over Z77 (running dual GPUs at 8x/8x). The article doesn't even approach the benefit of running two GPUs at full speed.

To put it in perspective, a Sandy Bridge build running three GPUs is going to run at 8x/8x/4x @ PCIe 2.0, which is equivalent in bandwidth to PCIe 3.0 running half as many lanes.

No enthusiast worth their salt is going to run high-end GPUs on such a bottlenecked board.

They really should compare X58 (16x/16x @ 2.0) to Z77 (8x/8x @ 3.0) to X79 (16x/16x @ 3.0) if they want someone to take the article seriously.


Moos
Tequilablob
Premium
join:2008-12-11
Salt Lake City, UT
kudos:3
Reviews:
·Comcast
Here's some more testing done with the X79 motherboards.

»www.behardware.com/art/imprimer/850/

PCIe 3.0 does have a higher transfer rate, which is a no-brainer: PCIe 3.0 @ 16x has a transfer rate of roughly 16 GB/s vs. 8 GB/s for PCIe 2.0 @ 16x. From what I know there is not a single card that hits 8 GB/s, but I haven't seen the actual numbers. I'll see what I can find. What we need to know is the transfer rate of the 7970 (or 680).

It is noted that PCIe 3.0 uses a more complex coding scheme, and I would imagine it's more efficient than that of PCIe 2.0, which might explain the small gains from just a direct swap from 2.0 to 3.0.
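For what it's worth, the published PCIe signaling figures (standard spec numbers, not data from this thread) bear this out: 2.0 signals at 5 GT/s with 8b/10b encoding (20% overhead), while 3.0 signals at 8 GT/s with 128b/130b encoding (about 1.5% overhead), so per-lane bandwidth roughly doubles. A quick back-of-the-envelope check:

```python
# Per-lane bandwidth in GB/s: transfer rate (GT/s) times encoding
# efficiency (payload bits / total bits on the wire), divided by
# 8 bits per byte.
def lane_gb_s(gt_per_s, payload_bits, total_bits):
    return gt_per_s * (payload_bits / total_bits) / 8

pcie2_x16 = lane_gb_s(5.0, 8, 10) * 16     # 8b/10b encoding
pcie3_x16 = lane_gb_s(8.0, 128, 130) * 16  # 128b/130b encoding

print(pcie2_x16, round(pcie3_x16, 2))  # prints 8.0 15.75
```

So the often-quoted "16 GB/s" for 3.0 @ 16x is really about 15.75 GB/s, and the near-doubling comes from both the higher signaling rate and the much lighter encoding overhead.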

I'm also intrigued by whether there is a noticeable gain with, say, a CrossFire setup on X79 vs. Z77.


Moos
Tequilablob
Premium
join:2008-12-11
Salt Lake City, UT
kudos:3
Reviews:
·Comcast
reply to Krisnatharok
said by Krisnatharok:

Ugh, that article is comparing two- and three-way GPUs on LGA 1155, which means the only time you get 16x lanes--on PCIe 2.0 or 3.0--is in a single-card configuration.

The whole article is then wasted by comparing mainstream chipset to mainstream chipset.

No one I know is asking if their purchasing decision is between a Z68 or a Z77 motherboard. The *real* question at hand is whether X79 (being able to run dual GPUs at 16x/16x) is worth the premium over Z77 (running dual GPUs at 8x/8x). The article doesn't even approach the benefit of running two GPUs at full speed.

To put it in perspective, a Sandy Bridge build running three GPUs is going to run at 8x/8x/4x @ PCIe 2.0, which is equivalent in bandwidth to PCIe 3.0 running half as many lanes.

No enthusiast worth their salt is going to run high-end GPUs on such a bottlenecked board.

They really should compare X58 (16x/16x @ 2.0) to Z77 (8x/8x @ 3.0) to X79 (16x/16x @ 3.0) if they want someone to take the article seriously.

The point I took from the review is that a system with a Sandy Bridge CPU using PCIe 2.0 is not showing signs of bottlenecking when compared to an Ivy Bridge using PCIe 3.0, even in multi-GPU setups and Eyefinity situations. I'm curious why you think there would be a performance increase from going to X79 and quadrupling the available bandwidth, when doubling it with the Z77 PCIe 3.0 16x setup does not show any significant improvement. I'm sure the more expensive processor and quad-channel memory will make the overall performance of the system better, resulting in experience overall, but I guess I'm missing where 32 lanes of PCIe 3.0 will have an effect when 16 lanes don't (unless you're running quad-card setups at crazy high resolutions).

Those have been my thoughts on the subject, and if I'm way off, please fill me in on what I'm misunderstanding. I tried to find some actual data today on the transfer rates a single 7970 would pull across the PCIe lanes when bogged down, but I was unable to find anything meaningful other than FPS comparisons.

On a side note, the Z68 and Z77 motherboards that specify (2 x PCIe 2.0 x16) actually do have them, even though the CPUs only output a total of 16 lanes. When running these boards in CrossFire/SLI, the CPU outputs 16 lanes of data with 8 lanes intended for each GPU. This data is fed into a PLX chip that spreads each set of 8 lanes into smaller packets, which are sent across 16 lanes to each card. It adds some input lag to your system, but it takes care of any negative effects caused by only using 8 of the 16 lanes on the GPU. Those boards also cost more than most. If a Z77 board's specs say something like "PCI Express 3.0 x16," then I agree it works as you explained above.


Moos
Tequilablob
Premium
join:2008-12-11
Salt Lake City, UT
kudos:3
Reviews:
·Comcast
said by Moos:

resulting in experience overall,

Grr. Proofreading never works; I don't know why I try. I meant to say "resulting in a better overall experience."