said by trparky:
I seriously don't like integrated graphics. I've always gone with a discrete graphics card, even in notebooks. Leave graphics up to the people who know graphics and do it well, nVidia.
I'm not fond of it either, but AMD managed to do something that reasonably works with their Fusion chips.
What I still don't like is how Intel can call theirs actual graphics at all, and how they continue to market these as gamer-worthy chips, when independent review sites demonstrate rather clearly that even their current top-end iGPU can't run a third of the games used in an entry-level graphics benchmark. And when you read the fine print, the tested settings that produced playable framerates amounted to running something like vanilla WoW on low or minimal settings, when...
• vanilla WoW is nowhere near as graphically intensive as even the current build of WoW, and
• if they wanted to impress, they could instead try running it against something intentionally taxing, like RAGE or Metro 2033.
If anything, it comes off as contempt from Intel toward those who try to defend PC gaming as a viable platform. It certainly doesn't help when someone new to the platform, who doesn't know any better, buys one of these pre-builts with integrated graphics and finds it can barely run a two-year-old PC game, much less match the graphical quality and speed of a current-gen console. They get turned off, and instead of upgrading as needed, they swear off PC gaming altogether, go with a console, and post "PC sucks for gaming" on forums like this one.--
Because, f*ck Sony