Last week, my favoured gaming news site, VGC, asked former US PlayStation boss Shawn Layden whether he thought the pursuit of more powerful consoles was still the way to go for the video games industry. His answer was not what I expected.
“We’ve done these things this way for 30 years, every generation those costs went up and we realigned with it. We’ve reached the precipice now, where the centre can’t hold, we cannot continue to do things that we have done before … It’s time for a real hard reset on the business model, on what it is to be a video game,” he said. “We’re at the stage of hardware development that I call ‘only dogs can hear the difference’. We’re fighting over teraflops and that’s no place to be. We need to compete on content. Jacking up the specs of the box, I think we’ve reached the ceiling.”
This surprised me because it seems very obvious, but it’s still not often said by games industry executives, who rely on the enticing promise of technological advancement to drum up investment and hype. If we’re now freely admitting that we’ve gone as far as we sensibly can with console power, that does represent a major step-change in how the games industry does business.
So where should the industry go now?
Even using AI for environments is demanding: plenty of open-world games will have you looking at the exact same patch of grass and the exact same trees, and generating more varied scenery takes power. Getting ray-traced lighting effects to work at lifelike settings, running at 4K and at least 90fps, will also take a lot of computing horsepower. The price of doing these things is still prohibitive, but game developers will absolutely try them in the future, so console hardware will have to keep up.
What I can see console manufacturers doing is taking a more passive approach: waiting for the hardware to develop on its own and using off-the-shelf parts, instead of paying to have custom hardware developed.