KrisnatharokPC Builder, Gamer Premium Member join:2009-02-11 Earth Orbit |
[Tech] AMD confirms new flagship card
said by WCCFTech: AMD Officially Confirms New Radeon Flagship R9 390X Ultra-Enthusiast Graphics Card In All Likelihood
AMD officially confirms for the first time ever the existence of a new Radeon flagship, in all likelihood the Fiji XT-based Radeon R9 390X. The unreleased flagship graphics card in question was running the Showdown demo on the Oculus Rift Crescent Bay. We exclusively told you four days ago that AMD was going to use the graphics card to run demos at GDC, and in fact they did.
AMD's recently announced LiquidVR set of technologies was demoed on this unreleased flagship Radeon GPU. The virtual reality demo is dubbed Showdown and uses Unreal Engine 4. It involves a huge robot that's causing all kinds of mayhem in the city around you: explosions, flying debris and a car being flipped over, all in a slow-motion, immersive 360-degree experience. Unfortunately this is all that's been officially revealed by AMD so far, but it shouldn't be long before AMD launches the GPU now that they've begun to use it in public demos. In fact, only three weeks ago we exclusively told you that AMD has sufficient inventory of the R9 390X to begin demoing it publicly.
Back to the Radeon graphics card in question. From here on, all the information that you will see concerning the R9 390X is based on unconfirmed but very genuine leaks. The R9 390X graphics card will allegedly be based on AMD's upcoming flagship Fiji XT GPU. The graphics card will be the first ever to feature stacked high bandwidth memory, or HBM for short.
The card will allegedly feature a hybrid Hydra liquid cooling unit similar to the one on the company's current dual-GPU flagship and the fastest graphics card in the world, the R9 295X2. Fiji XT will allegedly feature 4096 stream processors, 4GB of HBM VRAM, and a 4096-bit wide memory interface for a whopping 640 GB/s of bandwidth, nearly three times as much memory bandwidth as the GTX 980.
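For anyone who wants to sanity-check those rumored numbers, peak memory bandwidth is just bus width times per-pin data rate. A quick sketch (the ~1.25 Gbps HBM per-pin rate here is inferred from the leaked 640 GB/s figure, not a confirmed spec):

```python
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) * per-pin rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * gbps_per_pin / 8

# Rumored Fiji XT: 4096-bit HBM interface at ~1.25 Gbps per pin
fiji = peak_bandwidth_gbs(4096, 1.25)    # 640.0 GB/s
# GTX 980 for comparison: 256-bit GDDR5 at 7 Gbps per pin
gtx980 = peak_bandwidth_gbs(256, 7.0)    # 224.0 GB/s
print(f"{fiji:.0f} GB/s vs {gtx980:.0f} GB/s ({fiji / gtx980:.2f}x)")
```

640 / 224 works out to about 2.86x, so "nearly three times as much" checks out.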
Already supposed to be better than a Titan X. |
|
El QuintronCancel Culture Ambassador Premium Member join:2008-04-28 Tronna |
said by Krisnatharok:Already supposed to be better than a Titan X. I don't want to enter into a GPU brand-bashing rant, so I'll keep my team green criticism short. The whole Titan launch is a fiasco; as mentioned before, it's a cut-down workstation card that's being sold as a gaming card. I'm curious as to why Nvidia is continuing down this road. It seems like a foolish endeavor. /Rant With that out of the way, I'm very excited for the 390X. I think we're finally entering territory where you can get decent gaming performance on a 4K monitor with a single card. I'm going to wait and find out for sure, but if this is the case, then I'm going to be trading up in fairly short order. EQ |
|
AdaliciaOm Nom Nom join:2009-10-13 Lincoln, NE |
The Titan is like...just a status thing I believe. It is fun to say you have one and for a brief period it is insanely strong, but before long something else comes out that puts it to shame for a fraction of the price. |
|
BlockgorillaSarcasm is my native tongue join:2010-02-11 Wichita, KS |
to Krisnatharok
I'll be upgrading from my OC 7950 as soon as this comes out, I cannot wait. |
|
Ghastlyone Premium Member join:2009-01-07 Nashville, TN |
to Krisnatharok
I might have to look at these new cards when they release for my next upgrade.
Power consumption, thermals, noise, and video driver support are still main concerns of mine though.
Do I want more performance? Sure.
But I don't want cards inside my system that run at 95C constantly, sound like jet engines, and take a 1200W PSU to run. |
|
|
to Krisnatharok
My balls are ready. I may switch. |
|
El QuintronCancel Culture Ambassador Premium Member join:2008-04-28 Tronna |
to Adalicia
said by Adalicia:before long something else comes out that puts it to shame for a fraction of the price. That's what's really depressing about it, and the reason why I never got one. It really isn't worth the money; it should probably be priced two or three hundred dollars less at launch. At that price point it would be fairly priced, but nothing more. EQ |
|
|
to Krisnatharok
Rumors: the R9 395X2 is 8GB HBM, 8192-bit. The only cards that will use HBM are the R9 385X (2GB), 390 (4GB), 390X (4GB) and 395X2 (8GB). All the others are rumored to be using GDDR5. » www.unlockpwd.com/amd-ra ··· 2-rumor/ |
|
KrisnatharokPC Builder, Gamer Premium Member join:2009-02-11 Earth Orbit
2 recommendations |
to El Quintron
Titan, in my mind, has the same stigma associated with Alienware--it shows the owner has more money than good sense or knowledge. The top enthusiast card almost always has more power (or is 90% of it) at about half the price, and AMD cards have more vram. |
|
El QuintronCancel Culture Ambassador Premium Member join:2008-04-28 Tronna |
to FizzyMyNizzy
said by FizzyMyNizzy:The only card that will use HBM are R9-385X(2GB), 390(4GB), 390X(4GB) and 395x2(8GB) What's the benefit of HBM over GDDR5? (I could look this up, but a quick explanation would benefit the thread) |
|
Ghastlyone Premium Member join:2009-01-07 Nashville, TN |
to Krisnatharok
So...any guesses on what this new 390x price tag will be?
I'm thinking $699.99 at launch. |
|
AdaliciaOm Nom Nom join:2009-10-13 Lincoln, NE |
to FizzyMyNizzy
What I find interesting is that they're still only going with 2-4 GB with the exception of their highest end card. |
|
|
to Krisnatharok
Or like those people with those Killer NICs, heh. |
|
El QuintronCancel Culture Ambassador Premium Member join:2008-04-28 Tronna |
to Krisnatharok
said by Krisnatharok:Titan, in my mind, has the same stigma associated with Alienware--it shows the owner has more money than good sense or knowledge. True, but I think at that price Nvidia gets the lion's share of the blame. It's one thing if you're rich, you buy a Lamborghini, and you either can't drive it properly or there are no roads that can accommodate its maximum speed; it's another entirely to sell a Honda Civic for 100K. The former performs as advertised; the latter is an outright ripoff. I think the Titan is an actual ripoff in most cases. |
|
Ghastlyone Premium Member join:2009-01-07 Nashville, TN |
I bought 2 GTX 780's for nearly the same price as a single Titan, and they completely stomp that card in performance. That extra 3gb of VRAM on the Titan means absolutely fuck all in the overall scheme of things. |
|
El QuintronCancel Culture Ambassador Premium Member join:2008-04-28 Tronna |
No kidding. The Titan has lots of space to put textures it doesn't have the power to process. |
|
KearnstdSpace Elf Premium Member join:2002-01-22 Mullica Hill, NJ 1 edit |
to Krisnatharok
Titan is great if you use it for CUDA rendering. It's still cheaper than a Quadro of a similar class while still being capable of gaming. However, if you do not do 3D modeling as a hobby, it's just a status symbol, because no game will use Titan levels of VRAM before the normal gaming cards reach that much VRAM.
With Blender Cycles I have actually maxed out a generation-1 Titan. Should it die, or should the time for an upgrade come, it's getting replaced by a mainstream gaming card though.
The original Titan had a purpose: no card had half its VRAM, so it was good for people who do hobby stuff and gaming. Now it's a status symbol, and the Titan is holding Nvidia back. They will not make a numbered card with more VRAM just to protect sales of the TITAN, just like how they deoptimize OpenGL performance to protect the Quadro.
Though I tend to be an Nvidia fan (mostly because of EVGA being really reliable), I think ATM ATI has the advantage in power and memory bandwidth. Nvidia is doing better in the thermals though.
I do think GPUs from both vendors need a shakeup though. Like the Intel Core 2 and i# series CPUs, they need to start finding ways to improve efficiency per clock rather than just adding more raw power, as GPUs really are getting power hungry and really hot.
I think we will see a bunch of 300 series cards from ATI though. Likely 350-390X. And for those with a nuclear reactor as a PSU... 390X2 |
|
|
KrisnatharokPC Builder, Gamer Premium Member join:2009-02-11 Earth Orbit |
to Ghastlyone
said by Ghastlyone:That extra 3gb of VRAM on the Titan means absolutely fuck all in the overall scheme of things. And if you still need tons of VRAM, there's this: » www.newegg.com/Product/P ··· 14202144
Two of those mean 8 GB of VRAM and commensurate render power to complement it. |
|
2 edits |
to El Quintron
said by El Quintron:said by FizzyMyNizzy:The only card that will use HBM are R9-385X(2GB), 390(4GB), 390X(4GB) and 395x2(8GB) What's the benefit of HBM over GDDR5? (I could look this up, but a quick explanation would benefit the thread) Note: HBM (High Bandwidth Memory) in short:
» wccftech.com/amd-20nm-r9 ··· n-gddr5/
quote: HBM was originally developed in tandem by both SK Hynix and AMD to replace GDDR5 as the new standard for bandwidth-hungry processors such as GPUs. GDDR5 performance scaling has slowed down dramatically and grown exponentially more expensive in the last few years. Faster GDDR5 modules and wider memory interfaces were only going to take us so far; a new standard had to replace the aging technology.
SK Hynix is the maker of HBM for AMD: » www.skhynix.com/gl/produ ··· info.jsp |
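To put numbers on the quote above, the problem with GDDR5 is the per-pin speed you'd need as bandwidth targets climb. A rough illustration using the rumored figures from earlier in the thread (the bus widths and the 640 GB/s target are assumptions taken from the leaks, just for the arithmetic):

```python
def required_gbps_per_pin(target_gbs: float, bus_width_bits: int) -> float:
    """Per-pin data rate (Gbps) needed to hit a bandwidth target on a given bus width."""
    return target_gbs * 8 / bus_width_bits

# To reach the rumored 640 GB/s on even a very wide 512-bit GDDR5 bus,
# every pin would have to run at 10 Gbps -- well past shipping GDDR5:
print(required_gbps_per_pin(640, 512))   # 10.0
# A 4096-bit stacked HBM interface gets there at a leisurely 1.25 Gbps per pin,
# which is where the power and signaling headroom comes from:
print(required_gbps_per_pin(640, 4096))  # 1.25
```

Wide-and-slow beats narrow-and-fast here: pushing pins ever faster is what made GDDR5 scaling expensive.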
|
KearnstdSpace Elf Premium Member join:2002-01-22 Mullica Hill, NJ |
to Krisnatharok
One upside is that, with Bitcoin mining having moved on to more custom platforms, and given the general instability and hostility of the Bitcoin scene... maybe the 300 series of cards will not skyrocket in price due to people buying them in bulk to mine with.
I mean, it will not be cheap at first, as no new toy is, but it should come down quicker. The mining craze kept the 290 at high prices for a while. |
|
El QuintronCancel Culture Ambassador Premium Member join:2008-04-28 Tronna |
to FizzyMyNizzy
Noted, makes sense that they would move to this over GDDR5 |
|
1 edit
2 recommendations |
to FizzyMyNizzy
said by FizzyMyNizzy:quote: HBM was originally developed in tandem by both SK Hynix and AMD to replace GDDR5 as the new standard for bandwidth hungry processors such as GPUs. GDDR5 performance scaling has slowed down dramatically and grown exponentially more expensive in the last few years. Faster GDDR5 modules and wider memory interfaces were only going to take us so far, a new standard had to replace the aging technology.
AMD went with HBM. nVidia went with USB flash drives. |
|
Nanaki (banned)aka novaflare. pull punches? Na join:2002-01-24 Akron, OH |
to El Quintron
If it was a workstation card, that seems right. 3D graphics workstation cards always have crap tons of RAM to load textures, but for the most part those textures are never rendered in real time, meaning you do not run an animation sequence with textures loaded up. Another thing a workstation card needs is very high-res output, higher than anyone will run a game. You need that res for when you zoom way, way in on a model to find and remove unneeded polys, verts, etc., and also to make sure you do not have two polygons with overlapping edges, which will cause artifacts and just an overall bad appearance. |
|
Nanaki |
to Kearnstd
said by Kearnstd:However if you do not do 3D modeling as a hobby, it's just a status symbol because no game will use Titan levels of VRAM before the normal gaming cards reach that much VRAM. +1 on that, and further, if you do 3D modeling work for money, yeah, Titans and their like are what you'll be using. Cards like the Titan have always had way more video RAM than their gaming counterparts. 256 MB and even 512 MB video cards go way back into the mid/late 90s. My old workstation had a 128 MB video card running 3D Studio MAX 2.5 and later 3. I flat out used every drop of that RAM more than a few times, heh. BTW, back then, gaming on a card like that was, uh, painful; you were lucky to hit 30fps on low settings. I can't remember the card it had now, but it was ATI. |
|
Savious Premium Member join:2012-03-05 Billings, MT |
to Krisnatharok
said by Krisnatharok:Titan, in my mind, has the same stigma associated with Alienware--it shows the owner has more money than good sense or knowledge. The top enthusiast card almost always has more power (or is 90% of it) at about half the price, and AMD cards have more vram. I take offense to this. I own an Alienware. I didn't buy it thinking I was getting the best machine for the money. I got mine because I had always wanted one as a teen, it wasn't overly expensive, and it's a trustworthy brand. If I had any issues, it was a phone call and a few days' wait to be fixed. Unbox and plug in, no hassle. That being said, I'm going to build my own PC here in a week or two. At the time, getting the AW was a perfect choice for my needs. |
|
KrisnatharokPC Builder, Gamer Premium Member join:2009-02-11 Earth Orbit |
said by Savious:it wasn't overly expensive, and it's a trustworthy brand You're wrong on both points. Ever since Dell bought Alienware, it's been a shit brand, and waaaay overpriced. |
|
Nanaki (banned)aka novaflare. pull punches? Na join:2002-01-24 Akron, OH |
2015-Mar-5 5:34 pm
I don't ever remember them not being overpriced. They were a joke on hardware forums: 3k for a gaming rig that you could build yourself for 1500 to 1700. |
|
BlockgorillaSarcasm is my native tongue join:2010-02-11 Wichita, KS |
deleted for derpitude |
|
Ghastlyone Premium Member join:2009-01-07 Nashville, TN |
to Krisnatharok
Alienware's top-end models are ridiculously overpriced. The higher end you go, the higher the premium you're paying. Their lower-end stuff, while still overpriced, isn't that bad.
I still wouldn't buy one though, because in the end you can always build something better that's less expensive, with a much better warranty. |
|
1 edit |
to Krisnatharok
Also, AMD FreeSync Drivers Coming March 19
» wccftech.com/amd-freesyn ··· march-19
quote: AMD just announced that Catalyst drivers with FreeSync support will officially debut on March 19. Project FreeSync is AMD's effort to bring variable refresh rate monitors to market through industry standards and by working with established ASIC and monitor manufacturers. AMD proposed the Adaptive-Sync standard earlier in the year to the VESA body, which has since adopted it and incorporated it into DisplayPort 1.2a.
Adaptive-Sync capable monitors solve three distinct issues in games. The first issue is tearing; tearing occurs mainly whenever the frame rate exceeds the refresh rate of the monitor. The second issue is somewhat related to the first, as stuttering can occur if the frame rate exceeds or falls behind the refresh rate. The third issue is input lag, which occurs when you enable V-Sync to get rid of tearing and stuttering. So before variable refresh rate monitors existed, irrespective of whether they were G-Sync or FreeSync enabled, you had to choose between either tearing and stuttering or latency.
AMD: AMD is very excited that monitors compatible with AMD FreeSync™ technology are now available in select regions in EMEA (Europe, Middle East and Africa). We know gamers are excited to bring home an incredibly smooth and tearing-free PC gaming experience powered by AMD Radeon™ GPUs and AMD A-Series APUs. We're pleased to announce that a compatible AMD Catalyst™ graphics driver to enable AMD FreeSync™ technology for single-GPU configurations will be publicly available on AMD.com starting March 19, 2015. Support for AMD CrossFire™ configurations will be available the following month in April 2015.
FreeSync can support any range of refresh rates, for example 24Hz-144Hz, 24Hz-90Hz or even 24Hz-240Hz, and depending on the monitor maker they can opt for whatever range they want.
So you can have all sorts of FreeSync monitors that span from the very high-end 4K models to the 120Hz/144Hz "gaming" 1440p monitors and the more affordable 90Hz 1080p solutions, which means that you will more easily find something that fits your needs and budget. There are currently 11 different FreeSync monitors which are already available or will be available by the end of the month. Some of the ones already available right now in Europe include the BenQ XL2730Z, LG Flatron 34UM67 and Acer Predator XG277HU.
FreeSync Capable Monitors at CES 2015 - WCCFTech
BenQ XL2730Z: 27 inch, 2,560 × 1,440, 144Hz
Acer XG277HU: 27 inch, 2,560 × 1,440, 144Hz
Nixeus NX-VUE24: 24 inch, 1,920 × 1,080, 144Hz
ViewSonic VX2701mh: 27 inch, 1,920 × 1,080, 144Hz
LG Electronics 29UM67: 29 inch, 2,560 × 1,080, 75Hz
LG Electronics 34UM67: 34 inch, 2,560 × 1,080, 75Hz
Samsung UE590: 23.6 inch, 3,840 × 2,160, 60Hz
Samsung UE590: 28 inch, 3,840 × 2,160, 60Hz
Samsung UE850: 23.6 inch, 3,840 × 2,160, 60Hz
Samsung UE850: 28 inch, 3,840 × 2,160, 60Hz
Samsung UE850: 31.5 inch, 3,840 × 2,160, 60Hz
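As a toy model of what those VRR ranges mean in practice (purely an illustration, not AMD's actual driver behavior; real drivers handle the below-minimum case with tricks like frame doubling):

```python
def panel_refresh_hz(fps: float, vrr_min_hz: float, vrr_max_hz: float) -> float:
    """Toy variable-refresh model: inside the VRR window the panel refreshes in
    lockstep with the GPU's frame rate (no tearing, no V-Sync input lag);
    outside the window it pins to the nearest bound."""
    return max(vrr_min_hz, min(fps, vrr_max_hz))

# A 24Hz-144Hz monitor from the examples above:
print(panel_refresh_hz(60, 24, 144))   # 60 -- matched to the frame rate, smooth
print(panel_refresh_hz(200, 24, 144))  # 144 -- capped at the panel maximum
print(panel_refresh_hz(10, 24, 144))   # 24 -- below the window, pinned to minimum
```

The wider the window a monitor maker opts for, the less often you fall back to the old tearing-vs-latency tradeoff.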
Rumors: Pretty sure the R9 360/360c should be out soon.
» www.unlockpwd.com/amd-ra ··· 2-rumor/
I think the R9 360/360x will be listed in the March 19th drivers. But we will see. |
|