
ImpldConsent
Under Siege
Premium
join:2001-03-04
Mcdonough, GA
Reviews:
·AT&T U-Verse
·magicjack.com

AMD Kaveri - Computing Revolution?

AMD launches Kaveri processors, aimed at starting a computing revolution

said by Venturebeat:
AMD claims that its GPUs are much more powerful than Intel’s. For example, AMD said that its A10-7850K chip is 24 percent faster than the system performance of the higher-priced Intel Core i5-4670K chip. It says its graphics performance is 87 percent better than Intel’s, and its compute performance is 63 percent better than Intel’s.
Interesting.
--
That's "MISTER" Kafir to you Mr. Munafiq


Anonymous_
Anonymous
Premium
join:2004-06-21
127.0.0.1
kudos:2
Reviews:
·Time Warner Cable

4 edits
said by ImpldConsent:

AMD launches Kaveri processors, aimed at starting a computing revolution

said by Venturebeat:
AMD claims that its GPUs are much more powerful than Intel’s. For example, AMD said that its A10-7850K chip is 24 percent faster than the system performance of the higher-priced Intel Core i5-4670K chip. It says its graphics performance is 87 percent better than Intel’s, and its compute performance is 63 percent better than Intel’s.
Interesting.

Quad cores are old news from 2006.
There has been zero improvement since, except for updated versions of the same thing:
65nm > 45nm > 32nm > 22nm, still only 4- or 6-core chips.

(I don't count Bulldozer as having 8 cores, since it's just a version of Hyper-Threading Technology; otherwise my Pentium 4 with HT counts as a dual core.)

By now I expected to have at least 24 or 32 cores, or more.

A dedicated GPU with a large buffer is still better than an IGP.

I guarantee a dedicated GPU using the same GPU chip as the A10-7850K will outperform it,

since it does not use the system RAM, bottlenecking the computer.

A normal OS such as Windows 7 will use about 3GB of RAM just browsing the web.

VRAM disabled since I have an SSD.

Also using 102MB out of the 512MB video buffer.

I have 8GB installed and a 9600M with 512MB, BTW.

Aranarth

join:2011-11-04
Stanwood, MI
Reviews:
·Frontier Communi..

1 recommendation

said by Anonymous_:

Quad cores are old news from 2006.
There has been zero improvement since, except for updated versions of the same thing:
65nm > 45nm > 32nm > 22nm, still only 4- or 6-core chips.

(I don't count Bulldozer as having 8 cores, since it's just a version of Hyper-Threading Technology; otherwise my Pentium 4 with HT counts as a dual core.)

UHHH.... NO not quite.

Hyperthreading allows a single pipeline to service two threads at once. This keeps the pipeline working even if one of the threads stalls because of a cache miss or a broken loop. Normally the pipeline would have to be flushed in order for the CPU to start working on a new thread.

What AMD is doing is using TWO integer pipelines fed by a common front end and sharing the FPU. Sort of like merging two cores together and keeping what can be shared to save power and die space. Absolutely NOT the same as hyper-threading.
The problem with what AMD is trying to do is that you need a much beefier front end, schedulers, etc. This latest version really beefs up the front end over what they had with the previous generation of this family.
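
A toy way to see why filling stall cycles helps (a rough Python sketch with made-up numbers for workload size, stall chance, and stall length; a real pipeline is nothing like this simple):

import random

random.seed(42)

WORK = 1000          # instructions per thread (made-up workload)
STALL_CHANCE = 0.2   # chance an instruction misses cache
STALL_CYCLES = 10    # cycles the thread then waits

def serial(threads=2):
    # One thread at a time: every stall idles the whole pipeline.
    cycles = 0
    for _ in range(threads):
        for _ in range(WORK):
            cycles += 1
            if random.random() < STALL_CHANCE:
                cycles += STALL_CYCLES
    return cycles

def smt(threads=2):
    # Threads share one pipeline: while one waits on a miss,
    # another keeps issuing, so fewer cycles go to waste.
    left = [WORK] * threads
    busy_until = [0] * threads
    cycles = 0
    while any(left):
        cycles += 1
        for t in range(threads):
            if left[t] and busy_until[t] <= cycles:
                left[t] -= 1
                if random.random() < STALL_CHANCE:
                    busy_until[t] = cycles + STALL_CYCLES
                break  # one issue slot per cycle
    return cycles

print("one thread at a time:", serial(), "cycles")
print("SMT, shared pipeline:", smt(), "cycles")

The SMT run finishes in noticeably fewer cycles even though the total work is identical; that utilization win is all hyper-threading claims to deliver. AMD's module approach instead duplicates the integer pipelines themselves.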

VRAM cannot be disabled, since it stands for VIDEO RAM. I assume you are really talking about the swap file.

Also, Win7 works best with 3GB+ of RAM, but that does not mean it is actually using ALL of it at any one time for just web browsing, unless you have a huge number of tabs open.

You are correct that a dedicated GPU will outperform the GPU included with this chip, but that is like saying a semi-truck engine will outperform the little four-banger in my Fiesta. They are designed for two completely different things. The GPU in this chip is designed to give you OK base performance with good battery life. A dedicated GPU is USUALLY designed to give you uncompromised performance. In this case the built-in GPU works EXTREMELY well compared to anything that Intel currently has built into their chips, and that is exactly what it is designed to do.

You might want to take the time to actually research and learn about these things before making such statements.


Dissembled

join:2008-01-23
Indianapolis, IN
reply to ImpldConsent
It took a lot of reading through a lot of technical mumbo-jumbo that I don't quite find all that interesting to find this fairly well-worded conclusion:

"In the broader sense however, Kaveri doesn't really change the CPU story for AMD. Steamroller comes with a good increase in IPC, but without a corresponding increase in frequency AMD fails to move the single threaded CPU performance needle. To make matters worse, Intel's dual-core Haswell parts are priced very aggressively and actually match Kaveri's CPU clocks. With a substantial advantage in IPC and shipping at similar frequencies, a dual-core Core i3 Haswell will deliver much better CPU performance than even the fastest Kaveri at a lower price."

ref: »www.anandtech.com/show/7677/amd-···7850k/16


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
In terms of CPU performance only, yes.


Anonymous_
Anonymous
Premium
join:2004-06-21
127.0.0.1
kudos:2
Reviews:
·Time Warner Cable
reply to Aranarth
said by Aranarth:

said by Anonymous_:

Quad cores are old news from 2006.
There has been zero improvement since, except for updated versions of the same thing:
65nm > 45nm > 32nm > 22nm, still only 4- or 6-core chips.

(I don't count Bulldozer as having 8 cores, since it's just a version of Hyper-Threading Technology; otherwise my Pentium 4 with HT counts as a dual core.)

VRAM cannot be disabled, since it stands for VIDEO RAM. I assume you are really talking about the swap file.

Also, Win7 works best with 3GB+ of RAM, but that does not mean it is actually using ALL of it at any one time for just web browsing, unless you have a huge number of tabs open.

Obviously I'm talking about virtual memory, because video cards use GDDR[2,3,4,5] and VRAM has not been used in video cards since the '90s.

Also, the CPU is the biggest power hog in the computer. In an older laptop I had, I swapped the chip from a 65nm part to a 45nm part and netted an additional 30 min of battery life, and the CPU went from 1.5GHz/2MB to 2.4GHz/6MB (2.6GHz Turbo mode).
--
Live Free or Die Hard...

Aranarth

join:2011-11-04
Stanwood, MI
Reviews:
·Frontier Communi..
said by Anonymous_:

Obviously I'm talking about virtual memory, because video cards use GDDR[2,3,4,5] and VRAM has not been used in video cards since the '90s.

The trouble is that the swap file, while being virtual memory, is not considered to be RAM; therefore you should not call it VRAM. Also, GDDR etc. is still referred to as VRAM in some circles, since "graphics" and "video" are interchangeable.
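
For what it's worth, the OS itself keeps the two pools separate (a quick check, assuming the third-party psutil package is installed):

import psutil

# Physical RAM and the swap/page file are tracked separately;
# "virtual memory" spans both, which is why the terms get confused.
print(psutil.virtual_memory())  # installed RAM and its usage
print(psutil.swap_memory())     # the swap/page file, not RAM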

said by Anonymous_:

Also, the CPU is the biggest power hog in the computer. In an older laptop I had, I swapped the chip from a 65nm part to a 45nm part and netted an additional 30 min of battery life, and the CPU went from 1.5GHz/2MB to 2.4GHz/6MB (2.6GHz Turbo mode).

Depending on what you are actually doing, the video card can now use MORE watts than the CPU; i.e., you can pair a 150-watt CPU with a 300-watt video card quite easily. The point behind this chip is to provide a low-power video card with a low-power CPU while still maintaining a decent amount of performance.

In this case they are taking an equivalent amount of processing power that would have used about 200 to 250 watts two or three years ago (video card and CPU) and folding it into a single chip that uses 65 watts max. When you look at it from this point of view, that is quite impressive from an efficiency standpoint. This chip is NOT intended to blow the doors off of a standalone CPU with a dedicated high-power video card. This is about efficiency, not brute power.
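
A quick back-of-the-envelope on that (the 200-250W figure is the post's own estimate, and "equivalent performance" is assumed rather than benchmarked):

# Sanity check on the efficiency claim above.
for old_watts in (200, 250):
    print(f"{old_watts}W -> 65W: {old_watts / 65:.1f}x less power")
# 200W -> 65W: 3.1x less power
# 250W -> 65W: 3.8x less power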

Since you mention a laptop, this chip will provide you with even more power savings and better graphics performance. Exactly what you were trying to achieve by swapping out the CPU.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
reply to Anonymous_
said by Anonymous_:

Obviously I'm talking about virtual memory, because video cards use GDDR[2,3,4,5] and VRAM has not been used in video cards since the '90s.

Also, the CPU is the biggest power hog in the computer. In an older laptop I had, I swapped the chip from a 65nm part to a 45nm part and netted an additional 30 min of battery life, and the CPU went from 1.5GHz/2MB to 2.4GHz/6MB (2.6GHz Turbo mode).

As Aranarth pointed out, it's not obvious, and a paging file is not VRAM; VRAM refers to the memory available to the graphics card, discrete or not. That usage is due to the prevalence of integrated GPUs (like the new Kaveri APUs) that use system memory for the video system.

Nowadays, the CPU is rarely the biggest power hog in a computer. Even high-end Core i7 chips have a TDP of 77W (i7-3770K), and desktop parts go down to 35W (Pentium G3220T). Graphics cards, meanwhile, commonly have TDPs between 100 and 300W. You will rarely size a power supply around which CPU is in your rig, but you will always factor in what type of graphics cards, and how many.
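
To put rough numbers on that (a sketch only; the flat 100W allowance for the board, drives, and fans and the 1.5x headroom factor are rule-of-thumb assumptions, not a standard):

def suggest_psu_watts(cpu_tdp, gpu_tdps, other=100, headroom=1.5):
    # Sum the component TDPs plus a flat allowance, then leave
    # headroom so the PSU runs at partial load.
    return round((cpu_tdp + sum(gpu_tdps) + other) * headroom)

print(suggest_psu_watts(77, [250]))  # i7-3770K + one ~250W card -> 640W
print(suggest_psu_watts(77, []))     # same CPU, no discrete GPU -> 266W

The GPU term dominates the total: the card, not the CPU, decides the PSU.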
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.


signmeuptoo
Bless you Howie
Premium
join:2001-11-22
NanoParticle
kudos:5
reply to ImpldConsent
Here's the thing:

My money-smart BIL and I were shopping for my nephew's Christmas present last fall. BIL had decided that it should be a computer, and naturally, since their house is small and crowded and for general common-sense reasons, a laptop made the most sense. Nephew is an early teenager. As with most teens, he's obsessed with gaming, but BIL, being money-wise, having a tight budget, and owning a lot of home-built computers (he's an engineer and each computer has a specific job), decided that we needed to keep the budget LOW and not choose a $2,000 gaming laptop. Money's tight, folks, and we ain't rich.

I advised him that our boy will want to game on the laptop, so we still needed to keep that in mind. Our consensus was: OK, a budget system, a used/refurb that can handle entry-level gaming. Choosing between AMD systems with stronger graphics processing and Intel systems with stronger general processing, the AMD solutions (we're talking a $300 used laptop) won out.

There are a LOT of consumers that will buy this AMD solution; it is a great idea. Processing power on MOST systems today is more than adequate for general use, but graphics power still has welcome room for improvement.
--
Join Teams Helix and Discovery. Rest in Peace, Leonard David Smith, my best friend, you are missed badly! Rest in peace, Pop, glad our last years were good. Please pray for Colin, he has ependymoma, a brain cancer, donate to a children's Hospital.


Anonymous_
Anonymous
Premium
join:2004-06-21
127.0.0.1
kudos:2
reply to Krisnatharok
Um, my video card has a TDP of 35 watts and the CPU is 35 watts TDP.

laptop...


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
said by Anonymous_:

Um, my video card has a TDP of 35 watts and the CPU is 35 watts TDP.

laptop...

So why are you opining about desktop processors, then?
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.


Anonymous_
Anonymous
Premium
join:2004-06-21
127.0.0.1
kudos:2
Reviews:
·Time Warner Cable
said by Krisnatharok:

said by Anonymous_:

Um, my video card has a TDP of 35 watts and the CPU is 35 watts TDP.

laptop...

So why are you opining about desktop processors, then?

I have desktops too.
Guess you never looked at benchmarks: a 4-core i7 with HT vs. an 8-core AMD chip.

The AMD 8-core still gets whipped by a 4-core.
AMD just wanted to claim being first to 8 cores in the desktop market; that is about it.

I had a server with dual quad-core Xeons (8 physical cores) before; that system was close to the performance of an i7 920 with HT on, just right below it, except for the memory bandwidth (DDR2 registered ECC bandwidth sucked). That's one of the reasons why I parted out the system.

Hyper-Threading can be equivalent to so-called "physical" cores.

Looks like PassMark agrees with me, describing them as "logical cores per physical":

AMD FX-8350 Eight-Core (Average CPU Mark)
Description: Socket: AM3+, Clockspeed: 4.0 GHz, Turbo Speed: 4.2 GHz, No of Cores: 4 (2 logical cores per physical)

Intel Core i7-3770 (Average CPU Mark)
Description: Socket: LGA1155, Clockspeed: 3.4 GHz, Turbo Speed: 3.9 GHz, No of Cores: 4 (2 logical cores per physical), Max TDP: 77 W
Other names: Intel(R) Core(TM) i7-3770 CPU @ 3.40GHz
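
One way to see the same logical-vs-physical split on your own box (again assuming the third-party psutil package, installed with "pip install psutil"):

import psutil

logical = psutil.cpu_count(logical=True)    # what the OS schedules on
physical = psutil.cpu_count(logical=False)  # physical cores

print(f"{logical} logical / {physical} physical = "
      f"{logical // physical} logical cores per physical")

Note that what counts as a "physical" core on an FX chip depends on how the OS reports the modules, which is exactly the point in dispute here.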
--
Live Free or Die Hard...


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
Comparing Bulldozer to Core i7 chips is completely outside the scope of the Kaveri APU discussion.

pandora
Premium
join:2001-06-01
Outland
kudos:2
Reviews:
·ooma
·Google Voice
·Comcast
·Future Nine Corp..
reply to ImpldConsent
As a shopper, I'm hoping Intel has competition for desktop CPU/GPU/APU solutions. I checked »www.anandtech.com/show/7677/amd-···7850k/14 and note that on many synthetic tests the Kaveri beats Intel's 3770K and 4770K.

This means Intel has to release a new desktop CPU with an integrated GPU comparable to Kaveri or cede high-end desktops to AMD.

I wonder if Intel cares enough about desktops anymore to try to compete in this area.

Good luck AMD, if Intel doesn't respond, next build, I'll take a look at you!
--
Congress could mess up a one piece jigsaw puzzle.


moaripc

@optonline.net
Since when do "high end" desktops use IGPUs?

With high-end desktops, I would think you care about CPU performance and get a discrete GPU for the video side, not go APU.

APUs are for low-end towers and notebooks; that's the real area iGPUs come into play: high-"power" graphics using little battery. The mobile market needs iGPUs. High-end towers? Unless AMD gets good IPC, I can't see using them for high-end or gaming. Yes, a lot of games are going multi-threaded, but the few I get time to play use one thread.

At least AMD is putting some pressure on Intel; I just wish it was on the CPU performance side.

pandora
Premium
join:2001-06-01
Outland
kudos:2
Reviews:
·ooma
·Google Voice
·Comcast
·Future Nine Corp..
said by moaripc:

Since when do "high end" desktops use IGPUs?

I have been playing GW2 at 30-40 FPS at 1080p since May on an Intel 3770K. That's the only GPU-intensive game I play, and it works great.
--
Congress could mess up a one piece jigsaw puzzle.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
High-end desktops don't use iGPUs. APUs will at least allow the bulk of mid-range desktops to go without discrete GPUs.
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.

pandora
Premium
join:2001-06-01
Outland
kudos:2
Reviews:
·ooma
·Google Voice
·Comcast
·Future Nine Corp..
said by Krisnatharok:

High-end desktops don't use iGPUs.

For now; give it a few years. Personally I am sick and tired of expensive, power-hungry GPUs.
--
Congress could mess up a one piece jigsaw puzzle.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
The whole definition of "high-end" means discrete GPU.

There is no way you can push 200-300 watts through the CPU on top of the CPU's own draw.

But I don't think we need iGPUs at the high-end discrete level to call this a success.

By AMD's definition, providing CPU performance that is "good enough" for most mainstream uses, plus GPU performance that can play current games (e.g. BF4) at 1080p @ 30 fps, is a landmark achievement.
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.

pandora
Premium
join:2001-06-01
Outland
kudos:2
Reviews:
·ooma
·Google Voice
·Comcast
·Future Nine Corp..
said by Krisnatharok:

The whole definition of "high-end" means discrete GPU.

I used to accept that definition; no longer. Rendering 1080p well won't be beyond an integrated GPU in the near future.
--
Congress could mess up a one piece jigsaw puzzle.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
said by pandora:

Rendering 1080p well won't be beyond an integrated GPU in the near future.

You haven't read up on Kaveri then, because it has achieved that goal already.
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.


DarkLogix
Texan and Proud
Premium
join:2008-10-23
Baytown, TX
kudos:3
reply to pandora
said by pandora:

said by Krisnatharok:

The whole definition of "high-end" means discrete GPU.

I used to accept that definition; no longer. Rendering 1080p well won't be beyond an integrated GPU in the near future.

1080p isn't high-end and hasn't been for some time.
1080p w/3D was a target for a little while, but based on CES, 3D is being left behind by the TV makers.

4K is where it's at now, and while an iGPU can do 1080p easily, it's not going to push 4K anytime soon; and when it does, 16K will be the new target.

said by Krisnatharok:

The whole definition of "high-end" means discrete GPU

QFT

High-end means pushing the limits, and 1080p hasn't been pushing the limits for a few years now.

4K-3D might be something next year, but I doubt it. You have to double the pixel count to do 3D and quadruple it to go from 1080p to 4K, so a 4K-3D screen would have eight times the pixels of a 1080p-2D screen.
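
The arithmetic checks out (taking "4K" to mean UHD, 3840x2160, and counting stereoscopic 3D as two full views):

p_1080 = 1920 * 1080      # 2,073,600 pixels
p_4k = 3840 * 2160        # 8,294,400 pixels

print(p_4k / p_1080)      # 4.0 -> 4K is 4x the pixels of 1080p
print(p_4k * 2 / p_1080)  # 8.0 -> 4K-3D is 8x a 1080p-2D screen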
--
semper idem
1KTzRMxN1a2ATrtAAvbmEnMBoY3E2kHtyv


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
1080p is industry standard now.


DarkLogix
Texan and Proud
Premium
join:2008-10-23
Baytown, TX
kudos:3
said by Krisnatharok:

1080p is industry standard now.

= baseline
--
semper idem
1KTzRMxN1a2ATrtAAvbmEnMBoY3E2kHtyv


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
It takes considerably more graphics horsepower to drive above 1080p. I don't even know if a single 290X or 780 Ti can drive 1440p or 1600p at ultra/max settings. Anything above 1080p will likely take multiple GPUs in a Crossfire/SLI setup.
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.


DarkLogix
Texan and Proud
Premium
join:2008-10-23
Baytown, TX
kudos:3
Well, my GTX 570 is doing 1152p just fine.

For gaming, yeah, there's a huge jump in needed power after 1080p, but for, say, video display, I bet you won't have an issue with a 4K YouTube video.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
YouTube =/= gaming. No one buys a 4K monitor to watch YouTube upscaled from 720 or 1080.

And your 570 (don't you have SLI anyway?) can't drive BF4 at max, nor any of the games set to be released this year. I know this because my BIL has a GTX 570 and a 2048x1152 screen. But even 1152p has 14% more pixels than 1080p; 1440p has 78% more, and 1600p has nearly twice as many (97.5% more).
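
Those percentages check out (assuming 2048x1152, 2560x1440, and 2560x1600 panels against 1920x1080):

base = 1920 * 1080
for name, w, h in [("1152p", 2048, 1152),
                   ("1440p", 2560, 1440),
                   ("1600p", 2560, 1600)]:
    print(f"{name}: {w * h / base - 1:+.1%} more pixels than 1080p")
# 1152p: +13.8%   1440p: +77.8%   1600p: +97.5%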

Going above 1080p costs considerably more, both in terms of the display and an extra GPU (at this point you're looking at a minimum of $300 per unit).
--
Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.


Ghastlyone
Premium
join:2009-01-07
Las Vegas, NV
kudos:5
reply to Krisnatharok
said by Krisnatharok:

It takes considerably more graphics horsepower to drive above 1080p. I don't even know if a single 290X or 780 Ti can drive 1440p or 1600p at ultra/max settings. Anything above 1080p will likely take multiple GPUs in a Crossfire/SLI setup.

Metro LL and Crysis 3 are the only two games I couldn't fully max out at 1440p with a single 780. Other than that, everything else I could run with pretty much max settings at 60fps. Even BF4 at ultra settings with 4xAA was pegged at 60fps.

Adding the 2nd 780 (I know it's complete overkill) is pretty much pointless in most games. It made no noticeable difference in BF4. I updated my video drivers one day and SLI got disabled afterwards. I played BF4 for hours and never even noticed SLI wasn't enabled, lol.


Ghastlyone
Premium
join:2009-01-07
Las Vegas, NV
kudos:5
reply to DarkLogix
said by DarkLogix:

Well, my GTX 570 is doing 1152p just fine.

Playing WoW on max settings?

I would hope so.

Fire up Metro Last Light when you get the chance. I'd be amazed if you managed more than 20-30fps.


DarkLogix
Texan and Proud
Premium
join:2008-10-23
Baytown, TX
kudos:3

2 edits
reply to Krisnatharok
said by Krisnatharok:

YouTube upscaled from 720 or 1080.

Who said upscale?
YouTube does have a small selection of true 4K videos.

And I was saying gaming on ultra is for high-end, not for integrated GPUs (iGPUs/APUs);
high-end is for dedicated video cards.

If it can do 4K YouTube videos (not upscaled) then it's more than powerful enough for normal users, but that's not a scale to judge high-end by.

High-end = gaming on ultra.
--
semper idem
1KTzRMxN1a2ATrtAAvbmEnMBoY3E2kHtyv