Well, MS could have put a better-performing GPU in the console to begin with; remote compute isn't going to help much with that in my eyes. Virtualization has a very long way to go, and this is another step in a great direction. I have a dedicated VDI for work with the same specs, on paper, as my work laptop. The laptop performs great; the VDI makes me want to rip my hair out from frustration. Graphics and input lag are the issues that keep me telling the powers that be that a VDI is not for the general user population. Moving codec encoding to specialized hardware that can scale is an old concept applied in a helpful new way.
At some point a company has to just jump in. There will always be a better GPU. Anything released in tech is already outdated by the time it's approved and manufactured. You can say they should have put in double the RAM, or a super-powerful GPU, and it would have cost $700. So at some point you have to say "this is good enough to develop for over 8 years, for cheap enough that people will buy it at launch."
Sony had better RAM this go-round. But last gen, MS had the better system as a whole, and both did just fine. It's not like two competing companies are going to put out identical hardware; one will always be slightly different/better. But the developers are going to go with the common denominator so their games look good across both platforms, so they can sell them. It's a bit moot at this point.
Sort of on topic…Parcells has been amused recently by a new Microsoft cloud commercial. In it, the MS cloud is touted as the technology behind the Lotus F1 team's car, which is amusing because the Lotus car is terrible on both the power and reliability fronts…someone clearly didn't think this ad through, methinks.
This was the key concept behind OnLive: you could play all the latest and greatest games on your 8-year-old computer because very little was being processed on your system. All you needed was the bandwidth for fast screen refreshes and controller I/O. But overall it was kind of a shitty experience, and that's why it never really took off. I think they almost closed up shop, actually; maybe they sold it off to some other company, can't remember for sure. Either way, the moral of the story is it wasn't good enough for primary game delivery.
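The bandwidth side of that OnLive model is easy to ballpark. A minimal sketch of why compression is non-negotiable for game streaming; all figures here are my own illustrative assumptions, not OnLive's actual numbers:

```python
# Rough numbers on streaming a rendered frame instead of rendering locally.
# Resolution, fps, and compression ratio are illustrative assumptions.

def raw_mbps(width, height, fps, bits_per_pixel=24):
    """Bandwidth of an uncompressed video stream, in megabits/second."""
    return width * height * bits_per_pixel * fps / 1e6

raw = raw_mbps(1280, 720, 60)   # uncompressed 720p at 60 fps
compressed = raw / 200          # assume a ballpark ~200:1 H.264 ratio

print(f"raw: {raw:.0f} Mbit/s, compressed: ~{compressed:.1f} Mbit/s")
```

Uncompressed 720p60 works out to over a gigabit per second, which is why the whole scheme lives or dies on the encoder, and why moving that encoding onto specialized hardware (as mentioned above) matters so much.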
Due to the One going back on the 'must have broadband' requirement, I don't think we'll see cloud-based compute reach its full potential unless games release requiring broadband. I guess we do have that now, though; Titanfall required broadband since it was a multiplayer-only game. But even if the internet requirement still existed, bandwidth lag would relegate most of what you could use cloud compute for to minor background processes. Still, that would offload some percentage of work from the on-board GPU/CPU, freeing it for more intensive local computing. Whether or not that's enough to bridge the gap with the PS4 remains to be seen, and we're certainly not going to see it in anything but the top tier of games.
Reminds me that I need to buy or borrow an SNMP-capable switch. I want to see if broadband is really even required for Titanfall.
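If you do get that switch, the math on the counters is simple. A sketch assuming you're polling the standard IF-MIB::ifInOctets counter (a 32-bit Counter32 that wraps at 2^32) with a tool like snmpget; the sample values below are made up:

```python
# Turn two SNMP interface-counter samples into a throughput figure.
# IF-MIB::ifInOctets is a Counter32, so it wraps back to zero at 2^32;
# the modulo below handles a single wrap between polls.
# The sample octet counts and interval here are invented for illustration.

COUNTER32_MAX = 2**32

def throughput_bps(octets_t0, octets_t1, interval_s):
    """Average bits/second between two ifInOctets samples."""
    delta = (octets_t1 - octets_t0) % COUNTER32_MAX
    return delta * 8 / interval_s

# e.g. 12.5 MB transferred over a 10 s polling interval -> 10 Mbit/s
print(throughput_bps(1_000_000, 13_500_000, 10))
```

Poll the port Titanfall's traffic crosses every few seconds during a match and you'd get a decent picture of whether it actually needs broadband-class throughput or just a low-latency trickle.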