This generation's console processors are utterly imbalanced

Work on the eighth generation of consoles began some time in 2010/11. The seventh generation was an odd split of CPU architectures and GPU vendors, but the eighth generation would be very different. Do note this gets pretty technical - so I'll assume you have a fair understanding of the CPU and GPU industries.

### GPU had to be AMD Radeon

Around that time, AMD's GPU division was firing on all cylinders. The Radeon HD 5870 came to market a full half year before NVIDIA's GTX 480. The GTX 480 featured a massive die - a full 60% larger than the HD 5870's. Despite the much larger die, the later release, a much higher price tag and higher power consumption, the GTX 480 beat the HD 5870 by only about 10%. It was an unmitigated disaster. That, along with NVIDIA's poor reputation for collaborating with OEMs, meant the GPUs for the Wii U, Xbox One and PlayStation 4 were always going to be AMD Radeon.
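Just to put that lopsidedness in numbers, here's a quick back-of-the-envelope in Python using only the rough ratios quoted above (normalized figures, not exact die sizes or benchmark scores):

```python
# Rough performance-per-area comparison, normalized to the HD 5870.
# The 1.6x die size and 1.1x performance figures are the approximations quoted above.
hd5870_area, hd5870_perf = 1.0, 1.0
gtx480_area, gtx480_perf = 1.6, 1.1

print(f"HD 5870 perf/area: {hd5870_perf / hd5870_area:.2f}")  # 1.00
print(f"GTX 480 perf/area: {gtx480_perf / gtx480_area:.2f}")  # ~0.69
# Fermi delivered roughly 30% less performance per mm^2 - before even
# accounting for the higher price and power consumption.
```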

### No good options on the CPU front

On the other hand, AMD's Bulldozer CPU core was as much of a disaster as NVIDIA's Fermi (GTX 480). IBM's PowerPC architecture had gone well out of vogue, too. Nintendo still stuck with it, but for the higher performance PlayStation and Xbox consoles, PowerPC was no longer good enough. ARM was picking up, but at the time it was still firmly a low performance, low power architecture. 

That left AMD's Jaguar low-power cores, and Intel, of course. Intel would have been the ideal solution, but it's fair to assume they would have charged a bomb. Also, going with an all AMD solution would mean a single SoC, and a far lower cost for console manufacturers. 

With no better option, both Sony and Microsoft gambled on Jaguar cores. Jaguar was actually pretty good for its purpose - for use in cheap notebooks and tablets. Indeed, it was superior to Intel's Atom at the time. However, it was far behind Bulldozer's performance, let alone Intel's Bridge architectures. So why not Bulldozer? Simple - it was too power hungry for a console. 

Each Jaguar core was pretty low performance, so both X1 and PS4 went with 8 cores, hoping to compensate for low quality with high quantity. 

### TSMC's 20nm canceled

Up to the late 2000s, Moore's law was working like clockwork. The first sign of trouble was when the 40nm process was delayed. While NVIDIA struggled more than AMD, it was clear advancement in fabrication technologies was slowing down. 

28nm was delayed by over a year, only showing up in 2011. TSMC - who manufactured GPUs for both AMD and NVIDIA at the time - were adamant 20nm would still launch by 2013. 

It never happened. It's fair to assume that both the PS4 and X1 were scheduled to be made at 20nm. With 20nm's cancelation, the dies now had to be fabricated at 28nm. As a result, the Jaguar CPUs could only be clocked at a lowly 1.6 GHz on the PS4, and a still inadequate 1.75 GHz on the X1. These should have been 2 GHz.
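The deficit in numbers (the 2 GHz target is the assumption stated above, not an official figure):

```python
# Clock speed shortfall caused by the fallback from 20nm to 28nm.
target_ghz = 2.0                 # assumed original target
ps4_ghz, x1_ghz = 1.6, 1.75      # shipping clocks

print(f"PS4 CPU: {ps4_ghz / target_ghz:.0%} of target")   # 80%
print(f"X1  CPU: {x1_ghz / target_ghz:.1%} of target")    # 87.5%
```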

### Complexity in games grinds to a halt 

So why is a weak CPU a big problem? Well, faster GPUs do mean better graphics, but actual complexity in gameplay is largely bound by the CPU. A faster CPU means more realistic simulations, larger and more complex worlds, a greater number of characters, more reactive characters, more detailed particle effects and so on. Over time, much of the CPU workload has been shifted to the GPU's compute units, but a lot is still fundamentally bound by the CPU.
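A minimal sketch of what that means in practice: each frame, the CPU has to update every agent and object within a fixed time budget, so the amount of "stuff" a game can simulate scales almost directly with CPU speed. The per-agent cost below is an illustrative assumption, not a measured number.

```python
# Illustrative: how many AI agents fit into a 30 fps frame budget,
# assuming a fixed (hypothetical) CPU cost per agent update.
FRAME_BUDGET_MS = 1000 / 30      # ~33.3 ms per frame at 30 fps
COST_PER_AGENT_US = 20           # hypothetical cost of one AI/physics update

def max_agents(cpu_speedup: float) -> int:
    """Agents that fit in the frame budget on a CPU 'cpu_speedup' times faster."""
    cost_ms = (COST_PER_AGENT_US / 1000) / cpu_speedup
    return int(FRAME_BUDGET_MS / cost_ms)

print(max_agents(1.0))   # baseline Jaguar-class CPU -> ~1666 agents
print(max_agents(2.0))   # a CPU twice as fast -> ~3333 agents at the same frame rate
```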

Of course, it's not just the console manufacturers that were caught out - game developers were too. 

A prime example is Assassin's Creed Unity - the first true next-gen game in the franchise. It had a very ambitious crowd simulation - never seen before in a game. They overestimated the consoles' CPU capabilities by a long shot. When it released, frame rates tanked hard, dropping to unplayable levels in the low 20s.

https://www.youtube.com/watch?v=clQfCP3NFuc

Note that the Xbox One consistently had an advantage over the PlayStation 4, despite featuring a weaker GPU. The reason is obvious - it's a critical CPU bottleneck, and the X1 does of course have a slightly faster CPU.
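The bottleneck logic is simple: the CPU and GPU work largely in parallel on a frame, so the frame rate is capped by whichever takes longer. A toy model with made-up millisecond figures, just to show the shape of the problem:

```python
# Toy model: the frame rate is limited by the slower of the two processors.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

# Hypothetical Unity-like scenario: CPU overloaded, GPU with headroom to spare.
print(f"{fps(cpu_ms=45, gpu_ms=28):.0f} fps")  # ~22 fps
print(f"{fps(cpu_ms=45, gpu_ms=20):.0f} fps")  # a faster GPU changes nothing: still ~22 fps
print(f"{fps(cpu_ms=41, gpu_ms=28):.0f} fps")  # a slightly faster CPU (like the X1's) -> ~24 fps
```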

The next game, Syndicate, saw a dramatic paring back of complexity. Origins improves on that somewhat, but it's still nowhere near as sophisticated as Unity. Yes, we are going backwards! But for a good reason - both Syndicate and Origins hold the 30 fps mark consistently. Those severe dips in frame rate were solved by making the games less complex.

And finally, we come to the central problem - **While games this generation are looking better and better, they are not getting any more sophisticated.**

### Xbox One X and PlayStation 4 Pro maintain status quo 

The move to a mid-cycle refresh has been largely positive. With the cancelation of 20nm, TSMC jumped straight from 28nm to 16nm. Console manufacturers took advantage of this by introducing much more powerful consoles. 

However, it was only the GPU and memory that received dramatic boosts - the CPU remained the same! Sure, there was a clock boost of around 30%, and some other minor improvements, but that pales in comparison to the 300% increase in GPU power and memory bandwidth the Xbox One X saw. If that doesn't already sound bizarrely lopsided, you can see it for yourself -

https://en.wikichip.org/w/images/a/ad/scorpio_engine_die_shot_%28annotated%29.png

This is an actual die shot of the Xbox One X processor, the Scorpio Engine. The shader array, ROPs, frontend - it's all GPU. The GDDR5 memory controllers are mostly saturated by the GPU too. The CPU? Yeah, those dinky little green things out there in the corners. It's a hilarious visual that really shows off the scale of the problem.
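In raw numbers, using only the figures above (a ~30% CPU clock boost versus a quoted 300% increase, i.e. roughly 4x, in GPU power and memory bandwidth):

```python
# How lopsided the Xbox One X upgrade was, per the approximate figures above.
cpu_uplift = 1.3   # ~30% clock boost
gpu_uplift = 4.0   # "300% increase" = roughly 4x GPU power and memory bandwidth

print(f"The GPU grew about {gpu_uplift / cpu_uplift:.1f}x more than the CPU did")  # ~3.1x
```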

### What about the PC?

In the past, major studios would release games exclusively on PC - because the consoles weren't powerful enough to fulfil their vision. Stuff like Crysis, The Witcher 1 or Half-Life 2.

By the way, have you seen some of the destructibility and openness of Crysis 1? A whole decade later, there's not a single first person shooter that can match it. Stuff like Wildlands or Far Cry 4 simply aren't as reactive and dynamic. 

Since the early 2010s, no one has really wanted to gamble with a big-budget PC exclusive. So they do the obvious thing - make a game only as complex as the lowest common denominator - the consoles - will allow.

That said, Star Citizen looks like it might finally push the boundaries of gaming. PUBG was a pretty innovative concept too, built for the PC. Come to think of it, it's the only truly innovative game this generation, conceptually from a gameplay perspective. Of course, the game blew up in popularity, and they couldn't refuse Microsoft's generous offer to release it on Xbox One and X. The result? A vicious CPU bottleneck that drops frame rates down to an abysmal 15 fps in the launch areas! 

https://www.youtube.com/watch?v=iKFKSDudWIE&t=0s

### It's a 30 fps world

Another unfortunate side effect of the weak CPUs is that it gets really hard to get up to 60 fps. The Xbox One X and PlayStation 4 Pro, with their vastly superior capabilities, have only succeeded in increasing resolution. Many had hoped that the X1X with its 4x GPU horsepower would allow 60 fps gaming, but it simply won't happen. To achieve that, the X1X would have needed an over 2x uplift in CPU performance, but it ended up with only a 1.3x increase. It does allow for a more stable 30 fps, but that's about it.
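The arithmetic behind that, assuming a game whose CPU work just fills the 33.3 ms budget at 30 fps on the base console:

```python
# Why a 1.3x CPU uplift can't get a CPU-bound 30 fps game to 60 fps.
base_cpu_ms = 1000 / 30      # assume CPU work fills the whole 33.3 ms budget
target_ms   = 1000 / 60      # 16.7 ms budget at 60 fps

needed = base_cpu_ms / target_ms            # speedup required
actual = 1.3
ceiling_fps = 1000 / (base_cpu_ms / actual)

print(f"Speedup needed for 60 fps: {needed:.1f}x")           # 2.0x
print(f"Frame rate ceiling at 1.3x: {ceiling_fps:.0f} fps")  # ~39 fps
```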

So, the X1X lets you game at 4K, but there's no option to stick to HD but game at 60 fps. The GPU is well capable of that feat, but the CPU wouldn't allow it. In fact, the GPU is capable of 1440p at 60 fps - an option I'm sure most gamers would choose over 2160p @ 30 fps. 
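A crude way to see why the GPU isn't the limiting factor: 1440p at 60 fps actually pushes slightly fewer pixels per second than 2160p at 30 fps (a rough shading-cost proxy that ignores per-frame fixed costs):

```python
# Pixels per second: 2160p @ 30 fps vs 1440p @ 60 fps (crude GPU-load proxy).
def mpix_per_sec(width: int, height: int, fps: int) -> float:
    return width * height * fps / 1e6

print(f"2160p @ 30 fps: {mpix_per_sec(3840, 2160, 30):.0f} Mpix/s")  # ~249
print(f"1440p @ 60 fps: {mpix_per_sec(2560, 1440, 60):.0f} Mpix/s")  # ~221
# A GPU that can render 4K at 30 fps has the pixel throughput for 1440p at 60 fps;
# it's the CPU that can't keep up.
```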

Yes, there are some game engines like Frostbite and idTech 6 that have figured out how to do 60 fps on this generation of consoles, but these remain a rarity. 30 fps has become the norm. 

### Zen to the rescue? 

Both Sony and Microsoft have begun work on the next-gen consoles now. 

After nearly a decade, AMD finally has a fantastic CPU architecture. Zen not only catches up to Intel, it's even superior in some ways. Either way, it's an order of magnitude better than the archaic Jaguar cores. 

The GPU equation has gotten dicier, however. Unlike in 2010, NVIDIA does offer a better GPU architecture than AMD today, though for consoles I believe AMD is still very competitive. I expect both Sony and Microsoft to stick with AMD.

The next-gen consoles will almost certainly be fabricated at 7nm. We know Zen 2 is releasing on 7nm in 2019. It's a fair bet that the next-gen consoles will feature Zen 2 cores and be manufactured at 7nm.

We can expect a pretty decent 2x-3x uplift on the GPU front over the Xbox One X. But the CPU might end up being nearly 10x as powerful! That'll finally bring things into balance. Of course, this is all speculation - we'll have to wait and see how the next-gen consoles turn out in 2020.

Game developers will finally be able to create more innovative and sophisticated games, and not just better-looking games. Larger, more dynamic worlds. Worlds where a lot more objects react realistically to your actions. Populated with dozens of interactable NPCs. Multiplayer formats with hundreds of players. Far more realistic simulation and gameplay mechanics. More complex storytelling devices. Fresh and innovative types of games that no one has thought of. So on and so forth... From there on, the end goal is in-game AI that generates unique stories and characters for each player.

Let's see how Star Citizen turns out. Maybe that'll give us a glimpse of what could have been...