
me1212

join:2008-11-20
Pleasant Hill, MO

1 edit

Steam survey says PCs used for gaming are getting "worse"

Not really a surprise. With so many games being developed for consoles and then ported to PC, and said consoles being weaker than mid-range PCs, an uber high-end PC isn't needed right now to play games.

That said, I have to wonder if the writer of the article knows what he's talking about regarding dual vs. quad cores. He equated dual-core CPUs with being worse than quad cores just because they have fewer cores, mentioning that dual cores were on the rise while quads weren't growing as fast as in years past. He also mentioned Intel HD integrated GPUs (the lowest-end HD 2000 is about on par with a GeForce 7600) as being on the rise.

The thing is, the Sandy Bridge and Ivy Bridge Intel CPUs are the ones that have Intel HD, and those SB/IB chips are freaking beastly. Even a Sandy Bridge dual-core Celeron is better for gaming (bang:buck) than AMD's highest-end FX-8150 octo-core Bulldozer. So you get a dual-core CPU that's better for gaming than anything AMD can put out (save for *maybe* a 960T OC'd to 4.2GHz, but those aren't made anymore, and most people don't overclock because they buy prebuilts), plus the equivalent of a 7600 on board.

While not exactly high-end stuff, it's enough to play every Valve game and a multitude of other PC games, whether from Steam, GOG.com, Games for Windows Live, or DOSBox; there's a lot available to lower-end PCs. You don't need an uber high-end rig to game, just one that gets the job done.

link: »www.pcgamesn.com/article/steam-h···y-slower

EDIT: readability.


Dissembled

join:2008-01-23
Indianapolis, IN

Re: Steam survey says PCs used for gaming are getting "worse"

The economy?


TK421
Premium
join:2004-12-19
Canada
reply to me1212
said by me1212:

You don't need an uber high-end rig to game, just one that gets the job done.

I totally agree with that statement.

The article said the survey shows a rise in integrated graphics, dual-core CPUs, and low-end GPUs, but goes on to suggest the reason is simply that Steam is getting installed on more laptops than ever before. Makes a lot of sense to me.

me1212

join:2008-11-20
Pleasant Hill, MO
said by TK421:

The article said the survey shows a rise in integrated graphics, dual-core CPUs, and low-end GPUs, but goes on to suggest the reason is simply that Steam is getting installed on more laptops than ever before. Makes a lot of sense to me.

I kinda have to laugh at them calling SB dual cores low end, though. With people using Bulldozer chips in big desktop builds, and even the freaking SB/IB Celeron beating them as gaming CPUs, dual cores aren't as low end as people seem to think. Heck, the i3, a dual core, is the best bang:buck gaming CPU on the market, handling pretty much any game thrown at it when paired with a good GPU.

You bring up a good point though: no one ever said they were using the laptops for games, just that they had Steam installed on them. People may be using Steam as an IM client, or using laptops at college to take advantage of the school's high-speed network to download games they'll play on a bigger PC at home. Granted, with the indie market being as successful as it is and having low system requirements, an HD 2000 plus an SB CPU is more than enough for those games, save for Minecraft, which runs like crap on Intel HD anything.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
reply to me1212

Re: Steam survey says PCs used for gaming are getting "worse"

I disagree with the title wholeheartedly. Gaming PCs are not getting "worse" (which would suggest an absolute decline in overall power); rather, PC technology has progressed to a point where "good enough" is cheap and plentiful.

A corollary to this is that many new games are not able to take advantage of the most powerful GPUs--and are a very far cry from using all the CPU power of the latest and greatest (Tom's Hardware flat-out says there are harsh diminishing returns on the useful power you get out of a CPU that costs more than $220--that's pretty damn cheap considering the Intel line-up). This is because the current generation of gaming consoles is pushing 6-8 years old, and PC gaming tends to be held back by console hardware.
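
To put numbers on that diminishing-returns claim, here's a toy fps-per-dollar calculation (a sketch whose figures are invented purely for illustration, not pulled from Tom's Hardware):

    #include <cstdio>

    // Toy price/performance comparison. All numbers are hypothetical,
    // made up to illustrate diminishing returns above the mid-range.
    int main() {
        struct Cpu { const char* name; double price_usd; double avg_fps; };
        const Cpu cpus[] = {
            {"$120 dual core", 120.0, 55.0},
            {"$220 quad core", 220.0, 70.0},
            {"$560 hex core",  560.0, 74.0},  // barely faster in games
        };
        for (const Cpu& c : cpus)
            std::printf("%-15s %4.0f fps  %.3f fps/$\n",
                        c.name, c.avg_fps, c.avg_fps / c.price_usd);
        return 0;
    }

The fps-per-dollar column falls off a cliff past the mid-range chip, which is the whole argument in a few lines of arithmetic.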

We should see a big uptick in required hardware once the next generation of gaming consoles is released. There are exceptions, of course: The Witcher 2 (when it was released it brought my GTX 580 to its knees, and still does with post-processing enabled), Crysis 2 + mods, etc.

But many of the most popular games out there (WoW, LoL, Minecraft, etc.) don't require super-powerful rigs to play and enjoy, so people aren't shelling out $2000+ for new rigs--they're building $800-1200 rigs, or buying laptops and using them as their primary gaming machines.
--
If we lose this freedom of ours, history will record with the greatest astonishment, those who had the most to lose, did the least to prevent its happening.

me1212

join:2008-11-20
Pleasant Hill, MO
I doubt the next gen of consoles will change much. The Wii U is going with a Radeon 48x0-class GPU, the 720 is supposedly gonna have a 6670-class part, and the PS4 something similar. The economy's not that great; Sony already had to sell the PS3 at a loss for years and isn't doing well financially right now, so they're trying to make the PS4 cost less, meaning lower-end parts. The original Xbox sold at a loss and the 360 finally made a profit, but with Nintendo having done so freaking well with the Wii and its gimmicky games, the 720 is going to ship with Kinect 2 as its "primary" controller, so that's gonna be a gimmick fest. As for Nintendo and the Wii U... I don't think anyone outside of Nintendo knows wtf is going on there.

Not that that's necessarily a bad thing. This gen, PC devs by and large moved to consoles, so the PC got crappy ports instead of the other way around as in generations past. Maybe this next gen will change that, since PCs will have better hardware; hopefully, with more and more people buying PCs, we see that happen anyway.

$220 gets you the 2500K/3570K, and really that's the best thing Intel has for gaming; the i7's Hyper-Threading isn't worth the extra $100, since games don't use it.

You are spot on with WoW/LoL; they're popular, and WoW is f2p until level 20 I think, which is gonna entice people to buy PCs if they wanna try it. However, if you play Minecraft on the new Intel chips you need a discrete GPU; the Intel HD series doesn't play nice with Minecraft. AMD and Minecraft get along fine, though. It's not necessarily a bad thing that people aren't spending what they used to on gaming PCs. The newer CPUs don't cost as much as the older ones did (i.e. i5 vs. Q6600), so there's no reason whatsoever to spend more than you need to.

demir
Premium
join:2010-07-15
usa

4 edits
reply to me1212

Re: Steam survey says PCs used for gaming are getting "worse"

Computers are doing more with less --- this has always been the case.

An onboard GPU on the CPU is all you need to play a ton of games really, really well. Maybe not the most intensive or up-to-date games, and maybe not at the highest detail settings --- but at least they're playable on a much wider range of computers.

WoW can be played at 1680x1050 on good quality settings at 30 fps . . . on a Sandy Bridge CPU (with no discrete card).

Some of the more intensive games aren't playable yet, but again, I think Haswell / Broadwell will be game changers in that department --- and the first is only 9 months away.

There's also Trinity from AMD --- whoever can put together an integrated solution, make it cheap, and make it play any game on reasonable settings will be on top for the next couple of years.

me1212

join:2008-11-20
Pleasant Hill, MO
AMD's Trinity (as well as Intel's HD 4000, since they're about equal) is wonderful for laptop gaming. It opens up a whole new world of games to cheap onboard GPUs.

I heard that Haswell is aiming to have an onboard GPU that beats a GTX 260, but that was like six months ago, so that may have changed. Still, that's freaking amazing for onboard graphics; that's enough to play Skyrim on a 768p screen at lower settings. Onboard GPUs are advancing at an astounding pace, and I can't wait to see what the next few years hold. I doubt we'll see a day in the near future where dedicated GPUs are obsolete, but it's nearly to the point where, unless you're a really hardcore gamer, you don't need one. It's mind-boggling.

demir
Premium
join:2010-07-15
usa
I think it's really exciting, because the CPU makers are going to have the low-end market --- and perhaps the mid-range as well.

I wonder if Nvidia is going to make enough money solely on high-end GPUs to stay in the market in the years to come . . .

AMD is positioned particularly well --- and Intel doesn't really have much experience with discrete cards (maybe they'll buy Nvidia).


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
said by demir:

AMD is positioned particularly well --- and Intel doesn't really have much experience with discrete cards (maybe they'll buy Nvidia).

AMD won't be around much longer, imo. I'd be happy to be proved wrong, but execs are fleeing the company, and it recently received F's in most major areas from investors.

Nvidia may come full circle by moving its Tegra chips out of the mobile phone/tablet market and starting to compete in the console and ultra-portable markets.
--
If we lose this freedom of ours, history will record with the greatest astonishment, those who had the most to lose, did the least to prevent its happening.

IamGimli

join:2004-02-28
Canada
kudos:2
reply to me1212
said by me1212:

The thing is, the Sandy Bridge and Ivy Bridge Intel CPUs are the ones that have Intel HD, and those SB/IB chips are freaking beastly. Even a Sandy Bridge dual-core Celeron is better for gaming (bang:buck) than AMD's highest-end FX-8150 octo-core Bulldozer. So you get a dual-core CPU that's better for gaming than anything AMD can put out (save for *maybe* a 960T OC'd to 4.2GHz, but those aren't made anymore, and most people don't overclock because they buy prebuilts), plus the equivalent of a 7600 on board.

That's not even considering the fact that only a handful of games are natively designed to run on more than two cores.
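
For anyone wondering what that looks like in practice, here's a minimal sketch of how a typical engine of this era splits its work (a hypothetical example, not any real engine's code):

    #include <atomic>
    #include <thread>

    std::atomic<bool> running{true};

    // Many engines hard-code exactly two threads -- one for simulation,
    // one for rendering -- regardless of how many cores the CPU has,
    // so cores three and four of a quad sit mostly idle.
    void simulate() { while (running.load()) { /* update game state */ } }
    void render()   { while (running.load()) { /* draw current frame */ } }

    int main() {
        std::thread sim(simulate);
        std::thread draw(render);
        // ... the game runs until the player quits ...
        running.store(false);
        sim.join();
        draw.join();
        return 0;
    }

Adding cores doesn't help until the engine itself is rewritten to spread that work across more threads.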

demir
Premium
join:2010-07-15
usa
reply to Krisnatharok
Regardless, they're positioned better to weather the loss of the low- and mid-range discrete graphics market, because they have an onboard GPU AND they also make discrete cards --- no other company has both.

Intel doesn't know how to make discrete GPUs, and Nvidia doesn't know the first thing about CPUs.

AMD *could* mop the floor, although I'm not holding my breath. I would like to see a situation where as many companies as possible are competing --- if AMD goes away, that's not good for the consumer.


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
said by demir:

and Nvidia doesn't know the first thing about CPUs.

How's that again?

»forwardthinking.pcmag.com/cell-p···stack-up

»www.forbes.com/sites/greatspecul···segment/

»www.tomshardware.com/reviews/tra···5-2.html
--
If we lose this freedom of ours, history will record with the greatest astonishment, those who had the most to lose, did the least to prevent its happening.

demir
Premium
join:2010-07-15
usa
When I see Nvidia with 16-18% of the CPU market, I'll start worrying.


DarkLogix
Texan and Proud
Premium
join:2008-10-23
Baytown, TX
kudos:3
Why worry?
If Nvidia goes after Intel, then maybe Intel will feel some heat and not hold back as much (i.e., we could have 8-core i7s out now).


Krisnatharok
Caveat Emptor
Premium
join:2009-02-11
Earth Orbit
kudos:12
reply to demir
It's a *could* at this point, but especially with AMD's decline, Nvidia has a lot of unrealized potential if they port Tegra 3 and its successors to computers.

But AMD is in serious trouble, especially if Piledriver fails to be at least equivalent to Ivy Bridge and Steamroller doesn't deliver a strong follow-up. That, or they need to get included in the next generation of consoles.
--
If we lose this freedom of ours, history will record with the greatest astonishment, those who had the most to lose, did the least to prevent its happening.

demir
Premium
join:2010-07-15
usa

1 edit
reply to DarkLogix
Because once you get market share up to that level, it's sort of the tipping point before mainstream commercial success.

It's not anywhere close currently, so needless to say, the articles Kris posted are pretty much speculative.

me1212

join:2008-11-20
Pleasant Hill, MO

1 recommendation

reply to Krisnatharok
I really hope Nvidia does well with Tegra 3; it seems very interesting, and more competition is always better. Plus, ARM seems cool.

AMD has never really been the better of the two (Intel vs. AMD); AMD's been the budget market. Its top-of-the-line 8-core chips cost about as much as a 2500K, its quad cores about as much as an i3, with the 6-cores in the middle. Sure, the IPC can't hold a candle to Intel's, but the average consumer doesn't care about that; most of them see moar coars as better.

AMD does have at least two consoles in the bag: the Wii U and Xbox 720 are going to use AMD GPUs, and Sony has expressed interest in AMD APUs. Even Apple has expressed interest in APUs before, but apparently AMD cannot make as many as Apple wants in a given time frame. That does not bode well for AMD.

If anyone cares, this is what the Tegra 3 is capable of as far as gaming goes:

»www.youtube.com/watch?v=lBl-goBrWno


DarkLogix
Texan and Proud
Premium
join:2008-10-23
Baytown, TX
kudos:3
reply to demir
said by demir:

Because once you get market share up to that level, it's sort of the tipping point before mainstream commercial success.

It's not anywhere close currently, so needless to say, the articles Kris posted are pretty much speculative.

And you have something against Nvidia doing so?

Even at AMD's peak it never really worried Intel, and Intel needs something to worry about or they'll get lazy.


TruSm0ke

join:2005-07-21
Michigan
Reviews:
·Comcast
reply to me1212
I just think that because of the success of consoles, developers are creating games with a "console first, PC second" mentality. This mentality carries an attitude that games have to be dumbed down graphically in order to be playable on older-gen hardware such as consoles.

Because the game was created for consoles, its port over to PC also includes dumbed-down graphics, and/or graphics that merely run well enough on slower hardware. PC gaming becomes "it runs well enough," and games stop pushing the boundaries in order to accommodate as wide a range of customers/users as possible.

In my opinion, though, some positives have come from this. Developers have had to contemplate game design much differently than they used to. I've come across so many more games than I used to that were just genuinely fun, rather than needing in-your-face, drop-dead gorgeous graphics or snazzy tessellation. I think some developers have adopted the mentality that "the game needs to be fun first (mechanics, controls, likeability, etc.), above all... and the graphics just need to accommodate this fun gameplay or be original enough to leave a lasting impression on the player."

What pops into mind, for instance, is TF2. TF2 doesn't have amazing graphics, or even impressive graphics for that matter. But it's a game that put importance on being fun first, and then tied in original graphics and original characters that people would remember and become attached to.

Gaming in general just doesn't mean having to spend large amounts of money upgrading to faster hardware anymore. Games are now being created with the versatility to play on medium or low-end hardware, reaching a much broader audience that doesn't care about the latest and greatest tech. The majority just want to play a game that's fun, and most will pay for that. It's no secret that console gaming has become an extremely huge and lucrative business. We PC users are no longer the hot ticket for gaming companies; the bite-sized games created these days reflect just that.


Blockfire
Sarcasm is my native tongue

join:2010-02-11
Wichita, KS
kudos:1
Reviews:
·AT&T DSL Service
reply to DarkLogix
said by DarkLogix:

Even at AMD's peak it never really worried Intel, and Intel needs something to worry about or they'll get lazy'er(?).

fixed that for you

me1212

join:2008-11-20
Pleasant Hill, MO
reply to TruSm0ke
Yeah, we don't need uber high-end parts to get very good graphics anymore; we have nearly reached a "good enough" point. Sure, we *can* get better graphics, but a) the consoles couldn't play them, and b) it would require more hardware expense than it's worth.

The consoles may have held back PC development a bit, but that's for the best, I feel. Sure, we don't have amazing photorealistic graphics yet, but we also don't have to buy a new rig every two years. A Q/E6600 overclocked and a GTX 460 (or equivalent), even with only PCIe 1.0 and DDR2, is enough to play games at better-than-medium settings at 1080p today. To play at just native res without really caring about the other settings, not much is needed. And that's partially because console development is the main focus; being 'held back' by consoles has also saved us money since we don't have to upgrade, so that's always a good thing.

Like you said, though, it has made the devs think about gameplay more: TF2, the indie scene, Portal 2, Alice: Madness Returns. Those are not the most graphically advanced games, but they all have a wonderful graphic style (especially Alice) and good gameplay. I personally like that devs are finally starting to focus on gameplay over graphics again.


Blockfire
Sarcasm is my native tongue

join:2010-02-11
Wichita, KS
kudos:1
Reviews:
·AT&T DSL Service
reply to me1212
I think we are at a point where it's not the hardware that needs the next big breakthrough; I believe it's the output devices, like monitors. It seems like most any game can be written to at least perform decently on a laptop.

Until we have some sort of breakthrough on monitors, we'll be in a good situation for not needing a thoroughbred GPU.


TruSm0ke

join:2005-07-21
Michigan
Reviews:
·Comcast

1 edit
@Blockfire - I just want to add that I think the breakthrough on monitors you've mentioned is twofold.

1) Better, larger monitor specs than we have now

And perhaps just as important as #1, or even more so ...

2) Lowering of prices

When it comes to output devices like monitors, though, they're being held back by an unlikely source: big TV and Hollywood. These two content giants indirectly dictate what resolutions become "standard," and at this point 1080p is the golden child. That explains why in just the last couple of years there has been such a huge boom in 1080p devices. Most people now have 1080p monitors and TV sets without even thinking twice about it.


Blockfire
Sarcasm is my native tongue

join:2010-02-11
Wichita, KS
kudos:1
Reviews:
·AT&T DSL Service
What I meant by a breakthrough in visual output devices is some sort of new type of monitor or display. I think GPUs are at a point where it wouldn't take much of one to get good framerates on most any monitor at an acceptable resolution.

I'm thinking VR, holographic displays, etc.


Moos
Tequilablob
Premium
join:2008-12-11
Salt Lake City, UT
kudos:3
Reviews:
·Comcast
reply to me1212
Try playing BF3 online with a low-end computer and see how that works out... Sure, there are a lot of games out there designed for lower-end machines/consoles, because that appeals to a larger concentration of people; the #1 goal of most companies is to make money, so of course they're going to appeal to the masses. But as far as gaming PCs getting worse, I totally disagree. Kris nailed it: the technology has advanced so far that you can build a pretty badass setup for $1000. When the next wave of games comes out in the near future, I imagine it will test the mettle of many PCs.

I also agree that if AMD does not get their act together they are going to be in for a fun ride.

Kearnstd
Space Elf
Premium
join:2002-01-22
Mullica Hill, NJ
kudos:1
reply to TruSm0ke
said by TruSm0ke:

@Blockfire - I just want to add that I think the breakthrough on monitors you've mentioned is twofold.

1) Better, larger monitor specs than we have now

And perhaps just as important as #1, or even more so ...

2) Lowering of prices

When it comes to output devices like monitors, though, they're being held back by an unlikely source: big TV and Hollywood. These two content giants indirectly dictate what resolutions become "standard," and at this point 1080p is the golden child. That explains why in just the last couple of years there has been such a huge boom in 1080p devices. Most people now have 1080p monitors and TV sets without even thinking twice about it.

Much to my dismay, when I wanted to add a second monitor I found that 1920x1200 is near impossible to find in a similar screen size range now.
--
[65 Arcanist]Filan(High Elf) Zone: Broadband Reports

demir
Premium
join:2010-07-15
usa
reply to DarkLogix
No, I have nothing against Nvidia doing so. I might change my opinion about them knowing CPUs if they're able to grab market share.

Mister_E

join:2004-04-02
Etobicoke, ON
Reviews:
·Bell Sympatico

1 edit
reply to Krisnatharok

said by Krisnatharok:

Nvidia may come full circle by moving its Tegra chips out of the mobile phone/tablet market and starting to compete in the console and ultra-portable markets.



Ouya?

Ouya on Wikipedia



C0deZer0
Oc'D To Rhythm And Police
Premium
join:2001-10-03
Tempe, AZ
reply to me1212
Firstly, the only reason I would ever have Intel video on a new build is that their solution is apparently superior to everything else for video de/en/transcoding. It's much the same reason I even bothered to put up with the X1600 Pro until it tanked on me: at the time, its video processing was better than anything else out there. That's the only reason SB/IB Intel video would exist in my system; otherwise, it would have no place in any of my builds.

Intel video is treated with disgust for a reason. Heck, even the current HD 3000/4000 series in SB/IB chips can't run some of the games used for benchmarking on sites like AnandTech and Tom's Hardware, while the AMD Fusion chips at least have a GPU that is more proper and more capable. Still a bit behind the desktop parts in raw power, but far better than what Intel has been putting out.

As far as his dual-core vs. more-cores argument goes, I can sort of see where the author is coming from. More cores usually means better multitasking and being able to do more with the system. But if a fast enough dual core is already good enough to handle the most intensive/demanding game you're worried about, and the system in question is built for that, then go with it. I know I would love to build myself a gaming workhorse around something like an SB-E and GTX 6*0 SLI, but money won't go that far. As it is, I'm still grateful that my main system has been as useful as it is for this long.

I just wish there were more 16:10 displays of a suitable resolution. It's like monitor makers got lazy and decided to shove 16:9 on everything. Well, here's hoping that Apple's high-DPI displays finally start showing up in other products soon. Though I'm not holding out hope... one reason it takes so long for stuff like that is that when Apple first designs a new tech with a company it sources parts from, they tend to buy basically all the stock that passes QA, leaving little to none for any competitors. So it'll be a good while before we start seeing print-quality displays from everyone else.

I get that there are people who love the hell out of Minecraft, but I just don't see it... and maybe part of that is that no other game I've played has felt like it was actively attacking my eyeballs with its eyesore graphics. Which is strange... I have no problem playing 8-bit classics on the NES, or even the odd Atari/Coleco game here and again. But Minecraft just hurts my eyes... literally.
--
Because, f*ck Sony