Why are PS5 and Xbox just AMD APU now? What happened to the days of wild, custom hardware architectures?
Console exclusives were developed for one specific piece of hardware and platform, so developers used to optimize games specifically for consoles' exotic architectures. It's a well known fact that hardware limitations are a good thing when it comes to video games, because they force developers to be creative and optimize their work.
And all those consoles have hardware limitations.
PC won.
standardization always happens in any industry. you shouldn't look for uniqueness in industrial goods.
Jesus christ, what were the snoys thinking with that design
Let them have fun. It's fine.
no it isn't. looks like a fricking car.
And...?
it's not a fricking car and I don't want to see that chinky shit in my living room
Buy a pc then you fricking consoomer.
>not liking a product makes me a consoomer
weird can be good. the ps5 is not good.
>weird can be good. the ps5 is not good.
It's selling well enough.
Boo hoo. Consoles have often had weird aesthetics. Go cry a little more.
It's a tribute to Chad Warden's signature popped collar you dipshit.
Without him, PlayStation would have died with the PS triple. Show some respect.
they aren't. they're mobos with the GPU memory soldered on; a PC iGPU uses DDR5, not GDDR6.
the cpu die is probably monolithic, but don't take my word for it; there are either one or two dies. the gist is that the memory interface is different: a PC has separate DIMM system memory, while the PS5/XSX don't. That actually hurts latency a lot, since GDDR6 is built for high bandwidth, not for random access.
All they really did was shave off the need for a PCIe southbridge on the mobo. By connecting the GPU directly to the CPU you save resources, since consoles aren't expandable beyond a single extra drive.
because making cutting-edge CPUs and GPUs costs billions of dollars. why the frick would you invest that money when you can buy off the shelf?
x86 is just as powerful as any other high end architecture, cheap to modify and has good gpu options. why would anyone use anything but x86?
>x86
>has good gpu options
absolute moron
Consoles are going to cease to exist. Gaming is going back to where it started
>Gaming is going back to where it started
The tabletop?
yes, the desktop
PC
Yes, the oscilloscope.
>What happened to the days of wild, custom hardware architectures?
They had to keep costs manageable while staying competitive. Sony couldn't afford another Cell fiasco, and PowerPC (or even POWER) was no longer suitable for the task, so the basic IBM PC configuration was the solution. This might change in the future with ARM, if they can keep their temps under control and manage to find ways to do backwards compatibility.
>What happened to the days of wild, custom hardware architectures?
In those days people were trying to solve a specific problem: how do we get a graphics processor into people's homes? While the arcades had more leeway in their designs, thanks to the amount of money they made and their semi-modular approach, a console had to be built at a price point the average customer would be willing to pay. That led to many different strategies for which hardware features to include, at a time when no standard parts existed. Which meant custom chips, custom architectures and custom software.
But by the time the Xbox entered the market, the problem of getting graphics processing into people's homes had been solved, with standards around hardware and software appearing in the PC world, leading to mass production of compatible parts. And once adoption of those standards ramped up, custom solutions couldn't compete on price. What's the point of reinventing OpenGL-style rendering hardware when you can buy a cheaper one that's better? Or a custom storage solution when discs and SSDs already do it better? And so on.
The Wii U happened. Nintendo tried to make a console the old-fashioned way, sourcing a very custom 3-core PPC750 CPU from IBM and a custom prototype GPU from AMD. By the time those two custom dies were available in the volume needed to launch a console, AMD's own semi-custom SoC division had managed to deliver a more powerful single-chip solution to Sony and Microsoft.
Unsurprisingly, for its next system Nintendo simply went out and bought a single-chip solution from Nvidia.
Nobody in the modern era is going to use anything other than a single-chip solution for a low-end system like a console.
Even low-end gaming PCs like Valve's Steam Deck are built around a semi-custom system-on-chip.
> buy cheap ass smart-TV Nvidia chip for cents
> deliver the most successful Nintendo hardware ever
custom hardware architectures were the reason the PS3 had no games. sure, it's also why the PS2 had so much protection from competitors, but it's too risky a gamble for how big video game consoles are these days
>What happened to the days of wild, custom hardware architectures?
Such as...?
you know, when they used weird obscure chips like the 6502 or the 68000
Those chips weren't actually that weird back then. They had been popular in arcade machines, so games were often already written for them, and they were significantly cheaper than the Intel x86 chips IBM had sourced for its personal computers. Systems were built from a collection of chips that were usually all available from parts catalogs, and if a system was successful enough they might consolidate them into a single die, or a few dies, to really crank up production.
Today AMD64 hardware isn't really any more expensive than anyone else's CPU architectures, and AMD can provide capable GPU hardware as part of the same die.
>obscure chips
Admittedly, I don't have extensive experience with writing, say, a 6502 emulator, but from what I know it was a popular chip back in the day.
And the 6502 derived from the 6800, which was also pretty popular.
The thing is, the only console with a truly obscure architecture that I can think of was the PS3, and *maybe* the PS2.
In that case, Sony obviously took a risk by deliberately locking developers into their ecosystem. It didn't work out and Sony had to pivot, but that was a one-off thing.
>the only console with a truly obscure architecture that I can think of was the PS3
It literally had a PPC chip with some garbage welded on top.
>with some garbage welded on top
Well, yes, I was referring to SPEs
The PPE wasn't quite a PPC970, and the PPE+SPE core configuration used in the PS3 was extremely unusual, even compared to the Xbox 360's three-core PPE-based design.
IBM during that time was essentially trying to come up with a way to make more low-cost chips to hold on to more mainstream business, but it was ultimately unsuccessful.
same reason the Switch is just a reject tablet SoC
consoles are just PCs or tablets
>developers used to optimize games specifically for consoles exotic architectures
you mean developers would fail literally for years to figure it out until the very end of the lifecycle
R&D is expensive
porting is expensive
off-the-shelf alternatives to AMD APUs don't really exist any more.
If you look back at 7th-generation gaming consoles, you'll see they're all based on PowerPC. Standardization is nothing new
>they're all based on PowerPC.
Even those have some key differences beyond the core ISA.
The PS3 had wild SIMD coprocessors doing most of the work, while the 360 was tri-core with custom SIMD instructions. Both were in-order designs, lacking the out-of-order execution that had been standard for years, and both differed hugely in microarchitecture from the Wii. The Wii's design later evolved in its own direction with the Wii U, which still sucked ass at vector-heavy code but whooped the PS3 and 360 in branchy integer code. And this isn't even touching on the GPUs.
That wasn't really an effort at standardization so much as Nintendo already using the PPC750, while IBM had spare capacity for PPE production because they were losing Apple's business for the PPC970.