I think we may be looking at this the wrong way.
What drives advancement in hardware? For years, new software requirements outpaced existing hardware; that dynamic is what still drives the graphics card market.
Today, though, most reasonably modern hardware is more than powerful enough to run what people actually want to run.
Take the modem as an example. In the early '90s, a 14.4k modem had enough horsepower to connect to a text-only BBS. When we decided we wanted to download the racy GIFs (or the latest warez), a 56k modem was the only way to go. At the peak of AOL's run, the 1M DSL/cable modem was starting to show its limits. We now have 15M broadband, but we're envious of the folks who can afford the 50M Ultimate packages.
For years, one of the "must do" steps when buying new software was to check the "Minimum Requirements" and "Recommended Requirements" on the box, because we knew the latest software might bring our desktop to its knees. When is the last time you did that?
So where did software development go? It moved onto other platforms: the game console, the smartphone, and now the tablet. That's where many of the top application developers have gone.
Until "The Next Big Thing" arrives for the desktop computer, the must-have application that demands more speed, memory, and power, there probably won't be much change in the platform.
The question is: what will that be?
Smoke 'em if you got 'em