The GeForce GTX 980 has 2048 CUDA cores, 128 TMUs, and 64 ROPs. The card is equipped with 4 GB of GDDR5 memory on a 256-bit interface, giving it the same 224 GB/s of bandwidth as the GTX 770. Compared to Kepler parts, Maxwell-based graphics cards arrive with relatively higher clock speeds: the GTX 980 has a base clock of 1126 MHz and a boost clock of 1216 MHz.
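That 224 GB/s figure follows directly from the bus width and the memory data rate; a quick back-of-the-envelope check (assuming the usual GDDR5 bandwidth formula and the card's reported 7,000 MHz effective memory speed):

```python
# Sanity check of the 224 GB/s bandwidth figure.
# Assumption: GDDR5 at 7.0 Gbit/s effective per pin ("7,000 MHz" effective).

bus_width_bits = 256          # memory interface width
effective_rate_gbps = 7.0     # Gbit/s per pin

# bandwidth = (bus width in bytes) * per-pin effective data rate
bandwidth_gb_s = (bus_width_bits / 8) * effective_rate_gbps

print(bandwidth_gb_s)  # 224.0 GB/s, same as the GTX 770
```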
The biggest news here is that the GTX 980 has a TDP of only 165 W.
165 W power draw. Simply amazing. I'm curious how cool and quiet this new set of cards is going to run.
quote:NVIDIA GeForce GTX 980 and GTX 970 Pricing Revealed
Apparently, NVIDIA is convinced that it has a pair of winners on its hands, with its upcoming GeForce GTX 980 and GTX 970 graphics cards, and is preparing to price them steeply. The GeForce GTX 980 is expected to start at US $599, nearly the same price as the GeForce GTX 780 Ti. The GTX 970, on the other hand, will start at US $399, danger-close to cannibalizing the GTX 780.
Personally I think I'd probably be more in the market for a GTX 960 assuming the pricing is sub ~$300 USD. I guess I am just cheap like that.
All the rumors flying around the web for weeks now (especially on Reddit) about the 980 having a price tag of only $399 were asinine.
That just wasn't happening, for any number of reasons. I figured all along it would be a $600+ card, just like every other top-tier model on release day.
Anyway, I guess some Asus models actually have a 0 dB fanless mode; the fans don't even kick on until you hit 65C. So badass.
That seems like a pretty slick GPU, especially given how much they were able to cut power consumption. The only things I don't like are the 256-bit interface and the 64 ROPs.
AMD's Pirate Islands flagship, the R9 390X, is reported to have 4424 Streaming Processors, 264 Texture Units, 96 ROPs, and a 512-bit interface. That means the R9 390X will be a 4K-munching beast!
Power consumption is good and all, but if 4K takes off like I think it will (and AMD seems to think it will), AMD is going to rule the 4K resolution for quite some time; simply put, two R9 390Xs in Crossfire will be the only setup able to max out 4K buttery smooth.
Nvidia is taking a big gamble here: they are banking on power consumption and lower TDP being selling points, and on 4K not being common for another two years.
AMD, on the other hand, is betting 4K will become much more common on PC in the next 12 to 18 months. They are betting that 4K is where to invest.
Honestly, it's very interesting watching AMD's and Nvidia's strategies moving forward, and it will be very interesting to see who comes out on top this go-around. I do think what Nvidia has been able to do power-consumption-wise with Maxwell is impressive.
A couple things...
Considering the vast majority of PC gamers are still gaming at sub-1080p resolutions, I wouldn't bank on most of them magically upgrading to 4K monitors in the next few years.
I consider myself a pretty hardcore PC hardware enthusiast, and I can tell you I have zero desire to upgrade from my 1440p to 4K yet. It's just not happening. Maybe in a couple of years or so.
Another thing: I think people are putting too much emphasis on the 256-bit memory bus of this 900 series. The memory is factory-clocked at 7,000 MHz (effective), and those 64 ROPs are going to destroy at higher resolutions. GK110 doesn't even have that many.
These Maxwell cards will be hitting 1,400-1,500 MHz overclocks easily.
I'm still on 1080p, and I see 1440p as nothing more than a stepping stone; it won't be the new standard like 1080p was. 4K will be the new standard, and prices are already dropping.
You can get a 50-inch 120 Hz 4K television for 700 bucks. Compare that price to when 1080p first came out and it is very favorable. If the past is any indicator, 4K screens will be significantly cheaper in the next 18-24 months as manufacturing-process tweaks bring costs down, just like all other electronics, putting them well within reach of the mainstream.
Most of the people on 1080p now will be on 4K in the next 2-3 years. I myself will be on 4K in that time frame as well.
You will need something like the R9 390X to run 4K at a buttery-smooth 60 fps, maybe with post-process AA (PPAA). I'm not sure the current specs of the GTX 980 could manage 60 fps at 4K, let alone with any PPAA. I may be wrong, but 4K is a lot of pixels, and the specs on the R9 390X look much beefier: 96 ROPs on the 390X vs. 64 on the 980 is a pretty stark difference.
However, that doesn't tell the whole story. Nvidia may still come out ahead, not only because of lower power requirements, but also because if their drivers squeeze everything out of Maxwell while AMD leaves performance on the table, the spec gap may not even matter in the long term.
I am content to sit back and wait and see, but to me it seems like AMD is going full steam ahead into the 4K realm with the R9 390X; whether that will pan out for them is yet to be determined.
Very interesting. Can't wait to see how they overclock and how many it takes to drive a 4K monitor at medium settings. I wonder what AMD will do to counter this.
This is a somewhat terrible comparison. The R9 390X is AMD's top-end single-GPU card, while the GTX 980 isn't top of the line for Nvidia. If this follows the 700-series release pattern, you will see a new Titan-like card and a 980 Ti, which will blow these cards away.
Doesn't 4K above 30 Hz require DP 1.3? Do the 980s and the 390X have DP 1.3 on them?
DP 1.3 is the new version; that'll run 4K @ 60 Hz. It was literally just released this week, too. It hasn't been confirmed yet whether the 970/980 have 1.3 support, but these cards are supposed to have HDMI 2.0.
OK, I did a bit of digging and found the following on Nvidia's site:
said by Nvidia: Industry Support for 4K
There are two ways of delivering 4K content: HDMI and DisplayPort.
A) HDMI
The current HDMI 1.4 standard only has bandwidth for 4K at 30 Hz. HDMI 2.0 will add support for 4K at 60 Hz; details should become clearer in late 2013. There are a few TVs on sale today that support 4K @ 30 Hz through HDMI.
B) DisplayPort 1.2
DisplayPort can support 4K @ 60 Hz using Multi-Stream Transport (MST). The graphics card provides signals for multiple displays, but these are multiplexed on a single cable. Computer monitors such as the ASUS PQ321Q 31.5-in 4K 60 Hz Tiled Monitor take input using this format.
I'm going to assume that this means that not all DP 1.2 setups support 4K@60Hz out of the box.
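The 30 Hz vs. 60 Hz split in that quote falls straight out of link-bandwidth arithmetic. A rough sketch (pixel rates ignore blanking intervals, so real requirements run a bit higher; link payloads assume the standard 8b/10b encoding, i.e. 80% efficiency):

```python
# Why HDMI 1.4 tops out at 4K @ 30 Hz while DP 1.2 can carry 4K @ 60 Hz.

def video_bitrate_gbps(w, h, hz, bpp=24):
    """Uncompressed video payload in Gbit/s (blanking ignored)."""
    return w * h * hz * bpp / 1e9

uhd_30 = video_bitrate_gbps(3840, 2160, 30)   # ~5.97 Gbit/s
uhd_60 = video_bitrate_gbps(3840, 2160, 60)   # ~11.94 Gbit/s

hdmi_14_payload = 10.2 * 0.8       # ~8.16 Gbit/s usable
dp_12_payload = 4 * 5.4 * 0.8      # ~17.28 Gbit/s usable (4 lanes of HBR2)

print(uhd_30 < hdmi_14_payload)    # True: HDMI 1.4 manages 4K @ 30 Hz
print(uhd_60 < hdmi_14_payload)    # False: but not 4K @ 60 Hz
print(uhd_60 < dp_12_payload)      # True: DP 1.2 has the headroom
```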
EQ
If I recall correctly...
DisplayPort 1.2 was capable of supporting 4K (3840x2160) at 60 Hz; however, the scalers of early 4K monitors like the Asus PQ321Q and its Sharp equivalent, as well as the Dell 24" UltraSharp UP2414Q, were incapable of 60 Hz at that resolution. Therefore they had to use two of these scalers, which necessitated MST.
MST was implemented in something of a hacked-together approach: drive two of these otherwise insufficient scalers at a lower resolution (1920x2160 @ 60 Hz + 1920x2160 @ 60 Hz) in order to obtain the desired 3840x2160 at 60 Hz.
New scalers were later developed that allowed for SST (single-stream transport) at 3840x2160 and 60 Hz via DisplayPort 1.2, sidestepping the need for multiple scalers and MST.
Keep in mind that 4K monitors that use MST are susceptible to screen tearing down the center, where the virtually stitched-together halves meet, due to sync issues between the two halves of the logical screen. As far as I know there is no fix for this; it's just a quirk of the design.
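The tiling trick above is easy to sanity-check with pixel-rate arithmetic (assuming, as described, that the per-scaler pixel rate was the bottleneck; blanking is ignored):

```python
# Why splitting 4K into two MST tiles helped early scalers:
# each tile carries exactly half the pixel rate of the full stream.

full_4k = 3840 * 2160 * 60     # 497,664,000 pixels/s for 4K @ 60 Hz
one_tile = 1920 * 2160 * 60    # 248,832,000 pixels/s per 1920x2160 tile

# Two half-width tiles reconstruct the full 4K/60 image, while each
# scaler only has to process half the pixel rate.
print(one_tile * 2 == full_4k)  # True
```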
That seems to speak pretty well of DP 1.2; what kind of improvements are coming with 1.3, then?
At this point I'm only asking because it seems like DP 1.2 would cover most gaming/high end needs for the foreseeable future, but I'm curious now.
quote:DisplayPort version 1.3 was released on September 15, 2014.[17] The new standard increases overall transmission bandwidth to 32.4 Gbit/s with the new HBR3 mode featuring 8.1 Gbit/s per lane (up from 5.4 Gbit/s with HBR2 in version 1.2), totalling 25.92 Gbit/s with overhead removed. This bandwidth allows for 5K displays (5120x2880 px) in RGB mode, and 8K television displays at either 7680×4320 (16:9, 33.18 megapixels) or 8192×4320 (~17:9, 35.39 megapixels) using 4:2:0 subsampling. The bandwidth also allows for two 4K (3840×2160 px) computer monitors at 60 Hz in 24-bit RGB mode using Coordinated Video Timing, a 4K stereo 3D display, or a combination of 4K display and SuperSpeed USB 3.0 as allowed by DockPort. The new standard features HDMI 2.0 compatibility mode with HDCP 2.2 content protection. It also supports VESA Display Stream Compression, which uses a visually lossless low-latency algorithm, to offer increased resolutions and color depths, and reduced power consumption.[18]
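The figures in that quote hang together if you run the numbers (assuming the usual 8b/10b line-code overhead; pixel rates again ignore blanking, which a reduced-blanking timing keeps small):

```python
# Checking the quoted DP 1.3 numbers.
lanes = 4
hbr3_per_lane = 8.1                  # Gbit/s raw per lane (HBR3)
raw = lanes * hbr3_per_lane          # 32.4 Gbit/s total
payload = raw * 0.8                  # 25.92 Gbit/s after 8b/10b overhead

def video_gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9    # no blanking included

five_k = video_gbps(5120, 2880, 60)       # ~21.23 Gbit/s -> fits in 25.92
two_4k = 2 * video_gbps(3840, 2160, 60)   # ~23.89 Gbit/s -> also fits

print(round(payload, 2), round(five_k, 2), round(two_4k, 2))
```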
So the Dell 5K presumably uses two DP 1.2 ports to patch together support.
As for gaming... it isn't really necessary for gaming for the most part. That isn't what's driving this type of hardware at this point; it's likely professional applications.
Oh yeah, if I didn't game and just worked with Blender or similar stuff all day, I would likely already have a 30"+ 4K screen. The screen space alone would be amazing for such things.
It's amazing that this card, on a reference board with no overclock, is beating 780 Tis, and for cheaper. Who knows what set of drivers they're using, too.
One other thing: it's been confirmed that the 700 series is discontinued.
Already? They're barely a year old.
Barely a year and a half.
I'm just laughing because for months now, people on forums everywhere have been saying that they can't wait for the price drops on 770s and 780s.