
LowInfoVoter
Vote early, vote often, vote democrat.
join:2007-11-19
USA

LowInfoVoter

Member

Do x16 video cards use that much bandwidth?

having a discussion with a friend, he asserts that a pci-e 2.0 x16 video card uses only as much bandwidth as an x8 card does. he believes buying an x16 is just a gimmick.

i have a little experience in understanding these things, but i want to know for sure what the situation is.

here's my card:

»www.geforce.com/hardware ··· ications

is the "bandwidth" that the link talks about related to the bandwidth this site explains?

»www.hardwaresecrets.com/ ··· ress/190
Phillip
I Need A Nap
join:2004-12-21
Hatboro, PA

Phillip

Member

Your buddy is pretty much right; the only time the PCIe bandwidth is an issue is when the card has multiple GPUs or you are running multiple cards.

Here is a good article about it; he also mentions two other sites that have good info on PCIe bandwidth.

»www.pugetsystems.com/lab ··· nce-518/
asdfdfdfdfdf
Premium Member
join:2012-05-09

asdfdfdfdfdf to LowInfoVoter

Premium Member

to LowInfoVoter
"here's my card:

»www.geforce.com/hardware/desktop···ications

is the "bandwidth" that the link talks about related to the bandwidth this site explains?

»www.hardwaresecrets.com/printpag···ress/190"

No, memory bandwidth (accessing the card's own memory) is not the same thing as the pci-e bus bandwidth.

x16 or x8 or pci-e version doesn't matter much. Is there a specific concern you have, for example whether a 9800gtx is hindered by a specific motherboard? We are often asked generic questions when someone is really trying to make a decision about something specific and it is better to talk specifics whenever possible.

LowInfoVoter
Vote early, vote often, vote democrat.
join:2007-11-19
USA

LowInfoVoter

Member

no; my specific concern is understanding the usage of x16 on a video card.

so if the site i linked said that a 2.0 x16 slot can offer 8GB/s of bandwidth and 4GB/s for a 2.0 x8 slot, how do i determine how much bandwidth this (or any) card uses in total?
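As a side note on those numbers: the 8GB/s and 4GB/s figures follow directly from each PCI-E generation's per-lane signaling rate and line encoding. A quick sketch of the arithmetic (constants taken from the PCI-E specs, not from this thread; these are theoretical per-direction ceilings, not what any card actually uses):

```python
# Theoretical PCI-E link bandwidth: per-lane signaling rate x encoding
# efficiency x lane count. These are upper bounds; real transfers also
# lose some throughput to packet headers and flow control.

GENERATIONS = {
    # version: (transfers/s per lane, encoding efficiency)
    "1.0": (2.5e9, 8 / 10),     # 8b/10b encoding
    "2.0": (5.0e9, 8 / 10),     # 8b/10b encoding
    "3.0": (8.0e9, 128 / 130),  # 128b/130b encoding
}

def link_bandwidth_gb_s(version, lanes):
    """Theoretical per-direction bandwidth in GB/s for a PCI-E link."""
    rate, efficiency = GENERATIONS[version]
    return rate * efficiency * lanes / 8 / 1e9  # bits -> bytes -> GB

for version in GENERATIONS:
    for lanes in (8, 16):
        print(f"PCI-E {version} x{lanes}: {link_bandwidth_gb_s(version, lanes):.1f} GB/s")
```

This reproduces the figures from the hardwaresecrets link: 2.0 x16 gives 8.0 GB/s, 2.0 x8 gives 4.0 GB/s, and 1.0 x16 also gives 4.0 GB/s, which is why "1.0 x16" and "2.0 x8" get used interchangeably in scaling tests.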

a_large_rock
join:2003-08-02
L6H0H3

1 edit

a_large_rock to LowInfoVoter

Member

to LowInfoVoter
Is PCI-E 16x a gimmick? No. Do videocards need a PCI-E 16 slot? No. Does a PCI-E 8x slot slow down a videocard? Depends on the card, and what PCI-E revision.

The fastest dual gpu cards out there are capable of saturating a pci-e 1.0 8x, pci-e 1.0 16x, and pci-e 2.0 8x... but not a pci-e 2.0 16x, or pci-e 3.0 8x or 16x.

Most new cards will saturate a pci-e 1.0 x8 slot, but not a 2.0 x8.

Your 9800 most likely won't see any degradation of performance using pci-e 1.0 8x.

There aren't any specs published by manufacturers that you can just look at to see how much bandwidth the card will use in a real-world situation.

LowInfoVoter
Vote early, vote often, vote democrat.
join:2007-11-19
USA

LowInfoVoter

Member

how do i determine how much bandwidth this (or any) card uses in total?
asdfdfdfdfdf
Premium Member
join:2012-05-09

asdfdfdfdfdf to LowInfoVoter

Premium Member

to LowInfoVoter
"how do i determine how much bandwidth this (or any) card uses in total?"

You can't.

I'm confident in saying, though, that a 9800gtx isn't going to saturate a v 2.0 x16 or x8 bus.

LowInfoVoter
Vote early, vote often, vote democrat.
join:2007-11-19
USA

LowInfoVoter

Member

For the purpose of my learning and understanding, and since I don't have your experience, that's not useful to me.

Anyone know how to make this determination from a technical aspect?

CylonRed
MVM
join:2000-07-06
Bloom County
·Metronet

CylonRed

MVM

Well - you have 2 people who have said you can't determine it, including that there are no specs from the manufacturer for real-world situations.

asdfdfdfdfdf specifically has been around here and is very technical. Before he was a registered user he posted as an anon for quite a while. He knows what he is talking about...

LowInfoVoter
Vote early, vote often, vote democrat.
join:2007-11-19
USA

LowInfoVoter

Member

I don't doubt the validity of their advice, but it seems to me that if they're so knowledgeable, it should be a trivial matter to educate me on the specifics? It's not like this is magical witchcraft, it's ones and zeros.

CylonRed
MVM
join:2000-07-06
Bloom County
·Metronet

CylonRed

MVM

Again - as noted already: "how do i determine how much bandwidth this (or any) card uses in total?"

"You can't."

AND

"There aren't any specs published by manufacturers that you can just look at to see how much bandwidth the card will use in a real-world situation."

They know based on experience - not by calculating the numbers, as the data from the manufacturer is not available.

LowInfoVoter
Vote early, vote often, vote democrat.
join:2007-11-19
USA

LowInfoVoter

Member

Unbelievable. Experience isn't reliable enough to suffice as a technical answer, valid as it may be.

googleit
@69.118.94.x

googleit

Anon

Then go on Google and Google it; you can read everyone who did FPS comparisons to see at what point there was a performance drop and used that as the benchmark for saturation of the PCIe link... and maybe write to the review sites and ask them to give you a hard number and argue with them.
asdfdfdfdfdf
Premium Member
join:2012-05-09

asdfdfdfdfdf to LowInfoVoter

Premium Member

to LowInfoVoter
It's not that I am trying to keep information from you, or that I'm too lazy or arrogant to help. There is simply no reliable way for us to calculate or know this, and it will vary a great deal depending on a raft of variables, most of which we have no good data on. It really isn't a trivial matter, and even if you could come up with a reliable worst-case number, it wouldn't answer the question you should actually be concerned about with a gaming card: at what point, and to what extent, will it affect performance?

There is a much more reliable way to determine whether this is an issue, and that is gaming benchmarks.

I found this old link, which should give some confidence in what I said. It is based on the 9800gx2, which is essentially a dual-gpu version of the card you have. Granted, the clock speeds are lower, but that shouldn't invalidate the general conclusion: if the 9800gx2 isn't showing a problem, your 9800gtx+ won't be showing a problem.

»www.tomshardware.com/rev ··· -10.html.

Even more recent pci-express scaling benchmarks, of more powerful single gpu cards, show similar results.

I am not trying to be difficult. It is simply the case that this is the best we can do.

Tirael
BOHICA
Premium Member
join:2009-03-18
Sacramento, CA

1 edit

Tirael to LowInfoVoter

Premium Member

to LowInfoVoter
If you want a technical answer, I will give you one.

Your friend is right and wrong at the same time. Two graphics cards with all other specifications identical (same GPU, memory, power requirements, TDP, and PCI-E revision), but with different PCI-E lane widths (x16 and x8), will use the same amount of bandwidth in gaming. This was already shown by the links Phillip provided, including the Puget Systems article.

How your friend is wrong

There are no (to my knowledge) graphics cards that fulfill this requirement. So, to say that "an x16 card is pointless" has no basis in reality, because there are no modern graphics cards (again, to my knowledge) that are anything other than x16.

However, the important thing to remember is this: the data sent across the PCI-E link has to saturate the narrower link (x8) before the wider one (x16) offers any benefit. Since computer games do not saturate the PCI-E link, you really see no advantage/disadvantage between the two.

When you see a slowdown, it has to do with one or many things: 1) the GPU (the actual processing unit on the card) not being able to keep up with the amount of data needing to be processed, 2) vRAM saturated, 3) slow CPU, 4) slow HDD, 5) slow RAM, and/or 6) all of the above. Still, when you actually do saturate an x8 link, x16 becomes advantageous. Among the reasons, the most important is bandwidth.

When you talk about the PCI-E bus and its performance, you really mean its capabilities and bandwidth. Is an x16 card more capable than an x8 card? Absolutely. Why? Because it has the capability to handle more bandwidth than an x8 link. Do software/game developers fully utilize the x16 capabilities? No.

Testing the bandwidth of the PCI-E lanes with regard to a GPU requires writing a program specific to the GPU itself. Nvidia and AMD use different types of GPU cores (CUDA for Nvidia, Graphics Core Next for AMD). Luckily, for Nvidia GPUs, someone wrote a program.

This program will only show you the actual bandwidth difference between an x8/x16 card and PCI-E revisions (1.0/1.1/2.0/2.1/3.0). There is no real translation to consumer level, real world performance, because (again) the types of applications that most people run on GPUs do not saturate the entire lane.
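The idea behind a tool like that is simple even if the CUDA plumbing isn't: copy a large buffer across the link, time it, and divide. A minimal sketch of the same measurement in plain Python; it times host RAM copies rather than host-to-GPU transfers, so the number it prints reflects memory bandwidth, not the PCI-E link. Only the structure of the measurement is the point:

```python
import time

def measure_bandwidth(transfer, payload, repeats=10):
    """Time repeated transfers of `payload` and report the average MB/s.

    `transfer` stands in for the host-to-device copy a real CUDA tool
    would perform; here we just copy bytes in host RAM for illustration.
    """
    start = time.perf_counter()
    for _ in range(repeats):
        transfer(payload)
    elapsed = time.perf_counter() - start
    total_mb = len(payload) * repeats / 1e6
    return total_mb / elapsed

payload = bytes(64 * 1024 * 1024)  # 64 MB buffer of zeros
rate = measure_bandwidth(lambda buf: bytearray(buf), payload)
print(f"Average bandwidth in MB/s: {rate:.1f}")
```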

Further reading on this topic can be found here.

P.S. It was nearly 3 AM when I wrote this, so if something is FUBAR, I will edit it to make it more clear after I sleep.

Edit (after some sleep): JoelC707, thanks for the information. I knew some of the older chips were produced as x8 (and still sold), but I didn't know any recent GPUs had x8 variants. You learn something new every day. Also, the program I linked shows host-to-device and device-to-host concurrent (read: simultaneous, bi-directional) bandwidth.
JoelC707
Premium Member
join:2002-07-09
Lanett, AL

JoelC707

Premium Member

said by Tirael:

There are no (to my knowledge) graphics cards that fulfill this requirement. So, to say that "an x16 card is pointless" is devoid of any basis in reality because there are no modern graphics cards (again, to my knowledge) that are anything other than x16.

I know you added the qualifier "to my knowledge" to this, but I just wanted to put out there that there have been PCIe x1 cards available. They're mostly older chips (AMD 4000/5000 series, nvidia 8400/GT 610), and from what I remember, many of them advertised 1GB+ of vRAM but actually shared system RAM to get those numbers (which is ironically where the PCIe bandwidth would really matter). Basically they were only for people who had early SFF computers with limited slot and card space.

Interestingly enough, I did come across an oddball set of cards while looking on Newegg to see if these x1 cards were still produced/sold. There are apparently PCIe x8 cards available. They're all based on the GT 720, and a couple of them actually use an x8 edge connector (the others use an x16 connector, but I suspect the latter half of it is disconnected).

This actually brings up another point for LowInfoVoter. When is PCIe x16 really x16? At first glance, a card with an x16 edge connector may appear to be x16, but it could be x8 or x4. It's probably cheaper for the manufacturer to simply use a stock x16 blank than an x8 or x4 blank, especially on something like a video card whose intended home is an x16 slot. That's not to say an x16 card can't fit (and most likely run) in a smaller slot, which means you could potentially do some investigation yourself. If you have a slot on your board that is wired as x8 or lower, try the video card there and see if you even notice the difference. You could even get an x16 card to work in an x1 slot if the slot has an open back with no component obstructions for the remaining edge connector, or is actually an x16 slot wired as x1.
dave
Premium Member
join:2000-05-04
not in ohio

dave to LowInfoVoter

Premium Member

to LowInfoVoter
I know nothing about video cards, but there's an obvious analogy: will my application run faster if I connect it to gigabit Ethernet rather than 100 Mb/s Ethernet?

The answer is 'it depends'. The factors are: (1) were you saturating the 100 Mb/s link in the first place, (2) do you have enough extra processing power (at both ends!) to use the extra bandwidth, and (3) is your application arranged so it can supply enough work to keep the network pipe filled?

I suppose the same considerations apply to video cards. Bits is bits. Communications is communications.
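dave's three factors amount to taking a min() over the stages of the pipeline. A toy model (function name and numbers are mine, purely illustrative, using 100 Mb/s ~ 12.5 MB/s and gigabit ~ 125 MB/s):

```python
def effective_throughput(link_mb_s, supply_mb_s, consume_mb_s):
    """A pipeline runs at the speed of its slowest stage."""
    return min(link_mb_s, supply_mb_s, consume_mb_s)

# Upgrading a 100 Mb/s link (~12.5 MB/s) to gigabit (~125 MB/s) only pays
# off if the application was already filling the old pipe:
print(effective_throughput(12.5, 8.0, 200.0))    # app supplies 8 MB/s: 8.0, link not the bottleneck
print(effective_throughput(125.0, 8.0, 200.0))   # still 8.0 after the upgrade
print(effective_throughput(125.0, 80.0, 200.0))  # a faster app can now reach 80.0
```

The same shape applies to a video card: widening the PCI-E link only moves the needle if the link was the limiting stage.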

Tirael
BOHICA
Premium Member
join:2009-03-18
Sacramento, CA

Tirael to JoelC707

Premium Member

to JoelC707
[screenshot: SS of GPUs]
I decided to run the program to give you an idea of what it does. I also wanted to show you the difference between single vs SLI/crossfire GPU bandwidth. Above is a screenshot of PrecisionX so you can see the type (and number) of GPUs I have installed. Using two GTX 770s on the X79 platform (with PCI-E 3.0), this is what the program gives me:

System specifications
MSI XPOWER II Big Bang
Intel i7-3930k (PCI-E 3.0 enabled through registry)
Windows 8.1
GTX 770 SLI (PCI-E 3.0 x16)

E:\Downloads\EVGAPrecisionX15>concBandwidthTest.exe
usage: concBandwidthTest.exe deviceID deviceID...
 
## Test 1, using GPU 0 (GTX 770)
 
E:\Downloads\EVGAPrecisionX15>concBandwidthTest.exe 0
Device 0 took 588.792175 ms
Average HtoD bandwidth in MB/s: 10869.709668
Device 0 took 529.417114 ms
Average DtoH bandwidth in MB/s: 12088.766735
Device 0 took 1118.879761 ms
Average bidirectional bandwidth in MB/s: 11440.013886
 
## Test 2, using GPU 1 (GTX 770)
 
E:\Downloads\EVGAPrecisionX15>concBandwidthTest.exe 1
Device 1 took 579.526978 ms
Average HtoD bandwidth in MB/s: 11043.489342
Device 1 took 536.446045 ms
Average DtoH bandwidth in MB/s: 11930.370371
Device 1 took 1106.809448 ms
Average bidirectional bandwidth in MB/s: 11564.772979
 
## Test 3, using GPUs 0 and 1 (GTX 770 SLI)
 
E:\Downloads\EVGAPrecisionX15>concBandwidthTest.exe 0 1
Device 0 took 609.175659 ms
Device 1 took 605.729431 ms
Average HtoD bandwidth in MB/s: 21071.774408
Device 0 took 529.386719 ms
Device 1 took 529.211304 ms
Average DtoH bandwidth in MB/s: 24182.928893
Device 0 took 1136.735718 ms
Device 1 took 1134.876465 ms
Average bidirectional bandwidth in MB/s: 22539.073655
 

I ran the program with and without PCI-E 3.0 enabled. However, it requires a registry hack to do so on my CPU (the i7-39xx series is 3.0-compliant but does not have the 3.0 certification), and I do not want to be constantly restarting my computer today. If you can take my word for it (I forgot to save the results), I got about half the bandwidth with PCI-E 2.0 enabled.
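As a sanity check on those results: the reported rate times the reported time should recover the amount of data each run moved. Working backwards from the single-GPU figures above (assuming the tool reports an average rate over the same transfer it timed):

```python
# Cross-check the figures above: reported MB/s x elapsed seconds should
# recover the amount of data each run moved.

runs = [
    # (elapsed ms, reported MB/s), copied from the single-GPU tests
    (588.792175, 10869.709668),  # GPU 0, host-to-device
    (529.417114, 12088.766735),  # GPU 0, device-to-host
    (579.526978, 11043.489342),  # GPU 1, host-to-device
    (536.446045, 11930.370371),  # GPU 1, device-to-host
]

for ms, mb_per_s in runs:
    print(f"data moved: {mb_per_s * ms / 1000:.1f} MB")
```

Every run works out to the same ~6400 MB, so the tool appears to move a fixed 6.4 GB payload and time it; the numbers are internally consistent.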

DarkLogix
Texan and Proud
Premium Member
join:2008-10-23
Baytown, TX

DarkLogix to asdfdfdfdfdf

Premium Member

to asdfdfdfdfdf
said by asdfdfdfdfdf:

"how do i determine how much bandwidth this (or any) card uses in total?"

You can't.

I'm confident in saying, though, that a 9800gtx isn't going to saturate a v 2.0 x16 or x8 bus.

Well, while you can't tell how much it's using in normal use, there are CUDA programs that can stress the link and determine the max bandwidth the card is capable of using. Such programs were used to show that a GTX 580 (IIRC) could saturate a PCIe 3.0 x16 bus, as proof that a modded driver could make it use 3.0 when the NVidia driver was locking it to 2.0.

But in normal use, and even normal CUDA app use, even a GTX Titan Black or a Quadro K6000 won't saturate the link.
DarkLogix

DarkLogix to Tirael

Premium Member

to Tirael
said by Tirael:

do saturate an x8 lane

A minor point: it's not an x8 lane, it's 8 PCIe lanes.
I.e., an x8 electrical slot is normally x16 physical with 8 lanes.

Tirael
BOHICA
Premium Member
join:2009-03-18
Sacramento, CA

Tirael

Premium Member

said by DarkLogix:

A minor point: it's not an x8 lane, it's 8 PCIe lanes.
I.e., an x8 electrical slot is normally x16 physical with 8 lanes.

Thanks. I was tired.

DarkLogix
Texan and Proud
Premium Member
join:2008-10-23
Baytown, TX

DarkLogix to Tirael

Premium Member

to Tirael
said by Tirael:

However, it requires a registry hack to do so on my CPU (since the i7-39xx series is 3.0 compliant, but does not have the 3.0 certification),

So did you leave 3.0 enabled? if it were me I would.

aurgathor
join:2002-12-01
Lynnwood, WA

1 recommendation

aurgathor to LowInfoVoter

Member

to LowInfoVoter
said by LowInfoVoter:

how do i determine how much bandwidth this (or any) card uses in total?


You're being told misinformation -- it's doable, and it's actually fairly simple.

Just rent or buy one of these and hook it up.
quote:
The Summit T3-16 Protocol Analyzer captures, decodes and displays PCIe 3.0 protocol traffic and data rates for x1, x2, x4, x8, x16 lane widths
aurgathor

aurgathor to JoelC707

Member

to JoelC707
said by JoelC707:

When is PCIe x16 really x16? At first glance, a card with an x16 edge connector may appear to be x16 but it could be x8 or x4. It's probably cheaper for the manufacturer to simply use a stock x16 blank than it is to use an x8 or x4 blank, especially on something like a video card where it's intended home is an x16 slot. That's not to say an x16 card can't fit (and most likely run) in a smaller slot, which means you could potentially do some investigation yourself.

x16 cards do not usually fit into lesser physical slots without a modification to the card or the slot. While I've seen other cards (i.e. controllers) using bigger connectors than what they were wired for, I have yet to see any video card that was wired for fewer lanes.

BTW, the reason for x8 cards is servers that have x8 slots but no x16 slot.

Tirael
BOHICA
Premium Member
join:2009-03-18
Sacramento, CA

Tirael

Premium Member

said by aurgathor:

You're being told misinformation -- it's doable, and it's actually fairly simple.

The program was linked in my post. Buying a prohibitively expensive analyzer is ridiculous (unless it is your job).
said by DarkLogix:

So did you leave 3.0 enabled? if it were me I would.

Yes. I only reverted the change to do the test of PCI-E 2.0. I just forgot to save them. Herp.

aurgathor
join:2002-12-01
Lynnwood, WA

aurgathor

Member

said by Tirael:

Buying a prohibitively expensive analyzer is ridiculous (unless it is your job).

That's why I mentioned renting. And I put the smiley there for a reason.

DarkLogix
Texan and Proud
Premium Member
join:2008-10-23
Baytown, TX

DarkLogix to aurgathor

Premium Member

to aurgathor
said by aurgathor:

x16 cards do not usually fit into lesser physical slots without a modification to the card or the slot.

While true, I've seen some lesser slots that are built with the end open instead of solid, which allows a larger card to be put in.

And of course there are those riser cables for litecoin mining.
JoelC707
Premium Member
join:2002-07-09
Lanett, AL

JoelC707 to aurgathor

Premium Member

to aurgathor
said by aurgathor:

x16 cards do not usually fit into lesser physical slots without a modification to the card or the slot.

Usually, no, they don't. You would have to modify the slot, but there are motherboards out there with an open-back slot (which may or may not have a rear support for the trailing edge of the card, though most do if they have an open-back slot). Honestly though, the more common method is just using an x16 slot and only wiring it for x8/x4/etc.
said by aurgathor:

While I've seen other cards (i.e. controllers) using bigger connectors than what they were wired for, I have yet to see any video card that were wired for less lanes.

It may actually have something connected to the pins but here's Newegg's list of x8 video cards. Four in total, all say x8 but only two of them actually use an x8 connector. »www.newegg.com/Product/P ··· %20x%208

aurgathor
join:2002-12-01
Lynnwood, WA

aurgathor

Member

said by JoelC707:

but there are motherboards out there with an open back slot (which may or may not have a rear support for the trailing edge of the card, though most do if they have an open back slot). Honestly though, the more common method of doing that is just using an x16 slot and only wiring it for x8/x4/etc.

While I haven't seen any, I don't doubt that there are slots with open backs -- it saves you quite a bit of work, and the possibility of destroying a perfectly good mobo if your hand slips. (How do I know that?)

Wiring slots for fewer lanes is pretty common and makes perfect sense -- as long as it's well documented.

It may actually have something connected to the pins but here's Newegg's list of x8 video cards. Four in total, all say x8 but only two of them actually use an x8 connector.

About 3 or 4 years ago I spent quite a bit of time searching for an x8 card, but I could find only one (Colorgraphics), and that would've cost much more than I was willing to pay.
JoelC707
Premium Member
join:2002-07-09
Lanett, AL

JoelC707

Premium Member

said by aurgathor:

About 3 or 4 years ago I've spent quite a bit of time searching for an x8 card, but I could find only one (Colorgraphics) and that would've cost much more than I was willing to pay for one.

To be honest, that was the first time I had ever seen an x8 video card. I only knew of x16 and x1 models. It doesn't surprise me that you couldn't find any previously lol.