Why wasn't it ever utilized to its fullest.. WHY? WHY??

  1. 4 weeks ago
    Anonymous

    hard to program for
    making the game run better does not result in more sales, especially in an era where everyone started preordering games and guaranteed sales were already baked into the budgets for the next projects

    • 4 weeks ago
      Anonymous

      This. Weird, almost DSP-like coprocessors that have to access memory through DMA were not a good fit for existing programming practices.
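      The DMA-driven model being described can be sketched in plain C. This is only an illustrative sketch: `dma_get`/`dma_put` are hypothetical stand-ins for the SPU's MFC transfer commands, modelled here with `memcpy`, and `scale_all` is an invented example job.

```c
#include <assert.h>
#include <string.h>

#define CHUNK 4

/* Hypothetical stand-ins for the MFC DMA primitives. On the real hardware
 * these are asynchronous transfers between main RAM and the SPU's 256 KB
 * local store; plain memcpy here just models the data movement. */
static void dma_get(float *local, const float *main_mem, int n) {
    memcpy(local, main_mem, (size_t)n * sizeof *local);
}

static void dma_put(float *main_mem, const float *local, int n) {
    memcpy(main_mem, local, (size_t)n * sizeof *local);
}

/* SPU-style job loop: pull a chunk into "local store", compute on the
 * local copy only (an SPU cannot dereference main memory directly), then
 * push the result back out. Real code would double-buffer so the next
 * chunk's DMA overlaps with the current chunk's compute. */
void scale_all(float *data, int n, float k) {
    float local[CHUNK];
    for (int base = 0; base < n; base += CHUNK) {
        int len = (n - base < CHUNK) ? n - base : CHUNK;
        dma_get(local, data + base, len);   /* fetch chunk into local store */
        for (int i = 0; i < len; i++)       /* compute touches local copy only */
            local[i] *= k;
        dma_put(data + base, local, len);   /* write result back to main RAM */
    }
}
```

      The friction anons mention is exactly this: every access to main memory has to be phrased as an explicit chunked transfer like the above, which ordinary single-address-space code was never written to do.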

    • 4 weeks ago
      Anonymous

      >making the game run better does not result in more sales
      Literally this. The entire industry crashed itself by obsessing over movie games and graphical fidelity.

      • 4 weeks ago
        Anonymous

        Those movie games don't run better though.

    • 4 weeks ago
      Anonymous

      >hard to program for
      Absolute BS. Please, have a look at the PS2 chip(s) schematics and come back to me with your half-assed pseudo-knowledge. The PS3 was infinitely easier to develop for; the issue? Developers' outright refusal to even pretend to try to implement any kind of multithreading.

      • 4 weeks ago
        Anonymous

        >crashing in a plane from 8000ft will almost certainly end in death. therefore crashing in a car is perfectly safe
        something in your logic doesn't add up

    • 4 weeks ago
      Anonymous

      >hard to program for
      not really difficult but sony's licensed developers who were pro at x86 or other chips got btfo by it. making matters worse was sony's shitty sdk.

      >Developers outright refusal to even pretend to try to implement any kind of multithreading.
      let's face the facts: they didn't know how.

      >how good is the Blu-ray movie image on PS3, do they have image-enhance magic or something going on

      it's amazing, anon.

      > coping
      didn't watch

      >It was basically an Emotion Engine with a better* SDK.
      >Game devs were giving up on purpose-built game engines, Unity was starting to take off, UE3 was good enough for mid-sized games, middleware was used for everything from filesystem APIs to physics simulation.

      this chad knows what's up

      >was a terrible system
      was a great system.
      >DICE didn't need to do a whole presentation explaining how they handled deferred shading on the Xbox 360
      apparently they did because of the massive differences between systems. nice work, dunning kruger.

      • 4 weeks ago
        Anonymous

        >apparently they did because of the massive differences between systems. nice work, dunning kruger.

        Xbox and PC didn't require a novel approach. The PS3's "great architecture" required more work for the same results.

        All else being equal the SPUs would have made it more capable, but all else was not equal in the console realm.

        When Xenos launched in 2005 it was a generation ahead of anything that was out. There wouldn't be PC graphics cards with unified shaders until the GeForce 8 series at the end of 2006 and the Radeon HD 2000 series in 2007. ATI gave Microsoft a bleeding-edge GPU for the time.

    • 4 weeks ago
      Anonymous

      Yeah, on consoles as long as you hit 60 fps you are fine, but on PC that's not true anymore.

      If you make your game run on a potato, of course more people can buy the game because IT WILL RUN.
      If you make your game require more and run like shit even on newer hardware, it's going to affect your sales (Dragon's Dogma 2, for example).
      Running a game at 1080p, 2K, or 4K requires a lot of performance optimization on any machine.
      And then for some people, having the game run at 60 fps might not be enough.

      • 4 weeks ago
        Anonymous

        Why is id the only company that gets this right?
        Doom and Eternal look great and run on potatoes.

  2. 4 weeks ago
    Anonymous

    Yes, it was.

    • 4 weeks ago
      Anonymous

      The Last of Us was crazy for the time but I'm not so sure.

  3. 4 weeks ago
    Anonymous

    it was designed for video games
    people are still writing software for the C64, and some of it would blow your mind. They explicitly set out to create a spectacle using the hardware, not a full playable video game
    I guess if you have every core running at max, then you've utilised it to its fullest, just like if you cover every inch of a canvas with paint, you've utilised all of it

    • 4 weeks ago
      Anonymous

      It was designed for SIMD processing. Toshiba had no interest in gaming when they joined the alliance. They were working on TVs with Cell BEs. Their plans were many screen PIP for TVs as well as simulated 3D for non-3D content. IBM also used it in some supercomputers. Roadrunner at Los Alamos, for example.

  4. 4 weeks ago
    Anonymous

    It's the Nintendo 64 curse. We weren't ready for it.

    • 4 weeks ago
      Anonymous

      Just wait until you learn that AVX-512 heavily accelerates PS3 emulation, and was also dropped from consumer chips after being branded a power-virus instruction set.

      Devil is in the details. N64 had slow RAM and complicated graphics, while the PS3 had explicitly parallel vector coprocessors.
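      The connection is width: each SPU instruction operates on a 128-bit register (four floats at once), which maps directly onto x86 SSE/AVX lanes. A rough sketch of emulating one SPU-style multiply-add across a buffer (plain SSE intrinsics here, not actual emulator code; `vec_madd` is an invented name, and `n` is assumed to be a multiple of 4 for brevity):

```c
#include <assert.h>
#include <xmmintrin.h>  /* SSE: 128-bit registers, same width as an SPU register */

/* out[i] = a[i] * b[i] + c[i], four lanes per step, mirroring how a
 * single SPU multiply-add instruction processes a whole 128-bit vector. */
void vec_madd(float *out, const float *a, const float *b,
              const float *c, int n) {
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        __m128 vc = _mm_loadu_ps(c + i);
        _mm_storeu_ps(out + i, _mm_add_ps(_mm_mul_ps(va, vb), vc));
    }
}
```

      Wider vector units (AVX-512 is 512-bit) let an emulator batch or pipeline more of this work per instruction, which is why the ISA matters so much for RPCS3-style emulation.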

      • 4 weeks ago
        Anonymous

        only because PS3 emulators suck ass and try to run SPE instructions on the CPU instead of the GPU like they should

        one single Streaming Multiprocessor in the modern Ada Lovelace architecture has the equivalent of 16 Cell SPEs' worth of compute, whereas all modern AVX-512 CPUs at best can simulate 2 SPEs

        • 4 weeks ago
          Anonymous

          if you emulate cell on the GPU, CPU emulation will steal resources from GPU emulation and it'll be more work to keep everything synchronised tightly

        • 4 weeks ago
          Anonymous

          SPEs connect to the same memory space as the PPE. Emulating that with whatever intermediate representation over PCIe would be hell.

          PPC isn't very "off the shelf", though it very well tried to be. And MIPS was a little further than that.

          Naturally, because people were trying to experiment with what worked and what didn't. Though I guarantee that for however esoteric the hardware is/was, some poor developer(s) toiled over figuring it all out and how to use it for a game. That kind of thing pisses off your publishers as it increases development time, cost, and risk.
          Even by 2000-2003, the industry was coalescing around the idea of the graphics API to abstract away what's going on under the hood. The GameCube used OpenGL-like APIs, the Xbox used DirectX, and the PS2 unofficially gave up and used RenderWare to make it easy.

          • 4 weeks ago
            Anonymous

            >PPC isn't very "off the shelf", though it very well tried to be
            There was a point when Apple was using it in all their computers.

          • 4 weeks ago
            Anonymous

            A chipset’s ISA isn’t all that there is to it. The PS3 and the iMac G5 both had PPC-based CPUs but were otherwise vastly different.

  5. 4 weeks ago
    Anonymous

    First- and second-party devs, and third-party devs like Kojima Productions did use it to the fullest. That being said, it was notoriously difficult to optimize for due to its massively parallel architecture, like the PS2 before it but worse.

    • 4 weeks ago
      Anonymous

      It's not massively parallel. GPGPU is massively parallel. Cell is massively moronic.

  6. 4 weeks ago
    Anonymous

    Fine, but only because you insisted
    >*launches icbm with nuclear warhead*
    WITNESS THE POWER OF THE CELL

  7. 4 weeks ago
    Anonymous

    Because it wasn't even finished; the original design was never achieved.

  8. 4 weeks ago
    Anonymous

    Watch some PS3 versus 360 framerate comparisons, and you'll see that they usually give very close results. Except the 360 was much easier to code for, and didn't cost Microsoft $400M to design a super customized chip. So you'll realize the Cell was a waste of time.

    • 4 weeks ago
      Anonymous

      Both of them had a PowerPC CPU that was probably doing much of the work even on the PS3. The X360 just had more of those regular cores. Some crappier PS3 titles probably didn't even bother with the SPE co-processors.
      The PS3's GeForce 7800 was also kind of old, being a pre-unified shaders design that was absolutely obsoleted within a year of the console's release by the 8800.
      I recall reading somewhere that because making any kind of game logic run on the co-processors was so hard, some games just used them to compute post-processing effects to take some load off the GPU.

      • 4 weeks ago
        Anonymous

        >obsoleted within a year of the console's release by the 8800
        Bruh it was obsoleted before it was released.
        >PS3: November 11th 2006
        >GeForce 8800 GTX: November 8th 2006

        • 4 weeks ago
          Anonymous

          >PS3: November 11th 2006
          >GeForce 8800 GTX: November 8th 2006
          Imagine Sony CEO's face hearing this 3 days before the PS3 launch, fricking kek

        • 4 weeks ago
          Anonymous

          Huh, I thought the 8800 came out in 2007. Or that's when you could (maybe) buy one, at least.

          Yeah. The 7800 was basically a high-end Direct3D 9.0c card. That spec had come out in 2004. The 8800 is much closer to a modern GPU, minus the AI shit and whatever.

      • 4 weeks ago
        Anonymous

        >I recall reading somewhere that because making any kind of game logic run on the co-processors was so hard, some games just used them to compute post-processing effects to take some load off the GPU.
        That came later in the console's lifecycle.
        Naughty Dog used the SPEs for anti-aliasing.
        Insomniac had a backface-culling algorithm that was faster than the GPU's.
        Polyphony Digital somehow managed to do tessellation entirely in software running on 2006 hardware.
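        The backface-culling trick boils down to a sign test on each triangle's screen-space winding; a minimal scalar sketch (plain C for clarity, not Insomniac's actual SPU code, which batched these tests with SIMD):

```c
#include <assert.h>

typedef struct { float x, y; } vec2;

/* Twice the signed area of the projected triangle, via the 2D cross
 * product. Positive means counter-clockwise winding on screen. */
static float signed_area2(vec2 a, vec2 b, vec2 c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

/* With counter-clockwise front faces, a negative signed area means the
 * triangle faces away from the camera. Rejecting it on the CPU/SPU side
 * saves the GPU from setting up and rasterising it at all. */
int is_backfacing(vec2 a, vec2 b, vec2 c) {
    return signed_area2(a, b, c) < 0.0f;
}
```

        Cheap per-triangle tests like this are exactly the kind of embarrassingly parallel, streamable work the SPUs were good at, which is why offloading culling from the RSX paid off.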

    • 4 weeks ago
      Anonymous

      >and didn't cost Microsoft $400M to design a super customized chip
      It did; it was nowhere near what Sony spent, but people joked about both teams working in the same building and stealing each other's homework. The advantage the 360 had was that it was designed with a GPU and unified memory from the very start; Sony expected the Cell with its SPEs to be performant enough without a GPU and scrambled to add one near launch when third-party devs flipped the frick out.

  9. 4 weeks ago
    Anonymous

    PS3: 1 normal PowerPC core + 7 custom shit-to-program-for cores
    Xbox 360: 3 normal PowerPC cores
    That is all

  10. 4 weeks ago
    Anonymous

    what are you expecting it to be able to do?

    • 4 weeks ago
      Anonymous

      60 fps in gta v

  11. 4 weeks ago
    Anonymous

    I miss the days when consoles had their own bespoke exotic chip architectures. Something special got lost when the industry shifted to off-the-shelf x86 / ARM chipsets.

    • 4 weeks ago
      Anonymous

      You're not gonna believe what shelf the PS3 CPU fell out of. Or the PS2. Or the Xbox. Or the SNES. Or the NES. Or the 3DO.

      • 4 weeks ago
        Anonymous

        Could you be a little less cryptic?

        • 4 weeks ago
          Anonymous

          They are all off the shelf.
          The shelves just became unpopular. The only popular shelves left are x86 and ARM. You know nothing about exotic chip architectures if you're complaining that "the industry shifted to off the shelf." There are chips so exotic they only exist as FPGA.

          • 4 weeks ago
            Anonymous

            I just mean that in the 90s and 00s, the hardware and developer tools needed to run advanced 3D games was far less standardized and homogenized than it is today, and that meant each company had more room to experiment with different ideas when it came to processors and RAM and buses and rendering and even shit like surround sound. You knew that every new console's hardware would at least be interesting to analyze and discuss, because it wasn't just white-label PC or mobile hardware under the hood.

      • 4 weeks ago
        Anonymous

        >SNES
        >NES
        Didn't these use some shady Ricoh clones of the MOS 6502 / 65816 CPUs?
        Think the Game Boy's SoC used a weird Sharp CPU that sort of resembled the Intel 8080 and the Zilog Z80, but wasn't exactly either of those. It may have originally been used in appliances like air conditioning units.
        IP laws and patents were different back then.

  12. 4 weeks ago
    Anonymous

    how good is the Blu-ray movie image on PS3, do they have image-enhance magic or something going on

  13. 4 weeks ago
    Anonymous

    but they did

  14. 4 weeks ago
    Anonymous

    It was basically an Emotion Engine with a better* SDK.
    Game devs were giving up on purpose-built game engines, Unity was starting to take off, UE3 was good enough for mid-sized games, middleware was used for everything from filesystem APIs to physics simulation.

  15. 4 weeks ago
    Anonymous

    Yes. Look at late-era PS3 games squeezing all they could out of a processor with far too little RAM.

    It was just a shitty PowerPC core with 7 128-bit (actually two ganged 64-bit) SIMD units. Sure, if you learned to code for the SPUs you could get amazing performance, but they're just SIMD cores at the end of the day. Hard to do most work on, and not really suited to a gaming system. They would be good for number-crunching clusters, but the Cell got beaten there over a decade and a half ago.

  16. 4 weeks ago
    Anonymous

    It was. Just not for games.
    Look up "Condor Cluster"

  17. 4 weeks ago
    Anonymous

    it was designed to be difficult to develop for unironically

  18. 4 weeks ago
    Anonymous

    It was difficult to develop for and had some pretty bad memory bottlenecks. I remember multiple developers complaining about how bad the memory bottlenecks were and how shit the PS3 architecture (not the processor but the hardware topology) was.

  19. 4 weeks ago
    Anonymous

    Imagine if the Cell money went to NVIDIA instead.

    • 4 weeks ago
      Anonymous

      >8800 GTX about 2.5 times as fast as the previous gen flagships
      Yeah, that's a pretty good summary of just how good the GeForce 8800 was.
      The X360's GPU wasn't bad, but I don't think it was quite that good. It was somewhere between the X1950 and the later Radeon HD cards. It also had one little issue: it could run hot enough to physically destroy the console's motherboard.

  20. 4 weeks ago
    Anonymous

    >Why wasn't it ever utilized to its fullest.. WHY? WHY??
    GPUs are better than it at the workloads it was good for

    • 4 weeks ago
      Anonymous

      https://i.imgur.com/wSsUnp1.jpeg

      >Why wasn't it ever utilized to its fullest.. WHY? WHY??

      If you want a PERFECT example of why the PS3 was a terrible system, just look at the hour long GDC presentation on how DICE used the Cell's SPUs to handle Battlefield 3's deferred shading:

      It was an excellent use of the SPU's strengths and was a very impressive bit of software engineering on DICE's part.

      Why is this a mark against the PS3 you might ask?

      DICE didn't need to do a whole presentation explaining how they handled deferred shading on the Xbox 360 because they SIMPLY RAN IT ALL ON THE XBOX 360'S STRONGER ATI GPU. All that work on the PS3 version of BF3 and it ultimately looked and performed the same as the Xbox version.

      • 4 weeks ago
        Anonymous

        tbh the Cell could've led to a pretty good revolution in things like physics and game AI if developers had allowed themselves the luxury of aiming for multi-modal, graphics-second programming. There's a whole host of absolutely batshit gameplay ideas that come to mind when imagining 7 SPUs at your disposal. I guess it was too good for the consumer market.

        • 4 weeks ago
          Anonymous

          >revolution in things like physics
          GPUs led and still lead that revolution. CPUs stopped being viable for physics in 2008.

  21. 4 weeks ago
    Anonymous

    Sony initially intended the PS3 to have a DUAL CELL setup but early in SDK development realized it was going to be an absolute nightmare. Supposedly a version of the following tech demo was rendered using two CELLs: https://www.youtube.com/watch?v=Q9cqeYSJo9w
    Once Sony realized it would be a mess they crawled to nVidia and begged them for any chip they could spare. nVidia slapped together a shitty 8800-equivalent chip in the RSX (the first batch of which, we now know, were defective trash), and Sony paid retail to get them in time (this is crazy expensive and a portion of why the PS3 cost so much). Meanwhile, on the SDK front, with the entire graphics setup swapped out, the SDK development team was way behind schedule even for an easy-to-program-for architecture, let alone something as complex as the CELL. And since the SDK team was behind on getting a working 1.0 version done, devs were even later getting their hands on it to start work. All of this led to a cascade of delays and headaches trying to get developers to use the power of the CELL for anything but hardcoded functions.

    TL;DR: Sony goofed.

    • 4 weeks ago
      Anonymous

      7800*

  22. 4 weeks ago
    Anonymous

    It was. It was the worst version of IBM's Power architecture and it was the nail in IBM's coffin when it came to CPU manufacturing. Proved they had nothing more to provide to the industry and were fresh out of ideas.

    • 4 weeks ago
      Anonymous

      Yet IBM still has POWER as well as z/Architecture

      • 4 weeks ago
        Anonymous

        I haven't seen a Power processor in a datacenter in 15 years anon.

        • 4 weeks ago
          Anonymous

          I left mine in your daughter's bedroom

  23. 4 weeks ago
    Anonymous

    I got one of these bad boys for $35 today and I think it's one of the ones I can put custom firmware on.

    • 4 weeks ago
      Anonymous

      nice, real gem of tech

  24. 4 weeks ago
    Anonymous

    But it totally was, Naughty Dog squeezed every bit of performance out of that piece of overengineered shit anyone could have. Yes, even the SPEs. The results were mediocre.

    • 4 weeks ago
      Anonymous

      that's kind of the problem with advancing technology. when Naughty Dog overwrote pre-loaded Playstation 1 libraries to get enough memory to render Crash in enough poly-frames, it gave them a clear aesthetic edge over their competitors who were playing in-bounds of the console limits. but 8+0.5 > 8 is a much bigger difference than 64+0.5 > 64. eventually it just became not worth it to squeeze every penny out of something. sure, with modern hindsight after 20 years of advanced algorithms and nolife hobbyists you could probably do a lot more with PS2 hardware than was ever done for-profit. but what does that get you other than a single youtube video?

  25. 4 weeks ago
    Anonymous

    >PowerPC with a bunch of funky coprocessors
    That's why.

  26. 4 weeks ago
    Anonymous

    Daily reminder on why the PPC core was shit

    • 4 weeks ago
      Anonymous

      what little this image actually says makes it sound based though
      did OOP (OOPs I just wrote a shit program on purpose haha and now the compiler won't save me!) kill the ps3?

      • 4 weeks ago
        Anonymous

        nocoder kys

        • 4 weeks ago
          Anonymous

          I just don't code like a corpo drone
          sequential multi-core in-order cpus are how modern computers should work

          • 4 weeks ago
            Anonymous

            i can see you haven't kys nocoder
            nocoder kys

          • 4 weeks ago
            Anonymous

            kys Black person
            the cell is a white man's processor and only Black folk hate it

          • 4 weeks ago
            Anonymous

            I'm not going to say you're wrong, but even so the Cell PPE was shit.
            If you actually read past the fact that the PPE was In-Order, you find that bad decisions were made in the design of it regardless.

    • 4 weeks ago
      Anonymous

      Note that "Jaguar" here refers to AMD's chips used on PS4 and XBone, not Atari's console.

  27. 4 weeks ago
    Anonymous

    I'm happy to wait however many decades it takes for the PS3 to find a second life like the N64 has at the moment with the likes of James Lambert and Kaze Emanuar

  28. 4 weeks ago
    Anonymous

    >nobody knew how to use obscure thing that sony made a pain to use
    A mystery

  29. 4 weeks ago
    Anonymous

    The Cell was utilized much more than most realize. Obviously the PS3 used a Cell processor. The IBM Roadrunner supercomputer used over 12,000 Cell processors along with AMD Opterons. There were Cell PCIe add-in cards and Cell blade servers. There were also several well-known PS3 cluster projects. There are still options to run Linux on Cell processors, primarily focused on the PS3.

    Ultimately the Cell was only a single-core PPC processor with multithreading. The "Synergistic Processing Element" coprocessors were interesting but infamously a pain. The Cell was utilized more than most processors.

    https://en.wikipedia.org/wiki/Cell_(processor)
    https://en.wikipedia.org/wiki/IBM_Roadrunner
    https://en.wikipedia.org/wiki/PlayStation_3_cluster

  30. 4 weeks ago
    Anonymous

    Basically a newer Atari Jaguar. Coolass support chips but nobody uses them.

  31. 4 weeks ago
    Anonymous

    Dunno, it seems to have been used well in this configuration: https://en.wikipedia.org/wiki/PlayStation_3_cluster
    Seems to me the CELL CPU was used up to its potential. The GPU was anemic for its time imo and was already at its limit.

  32. 4 weeks ago
    Anonymous

    Imagine wasting your time optimizing your AAA game for a chip that only exists in the PlayStation 3, where Sony drops support a few years later, making your AAA exclusive to the PS3 with no chance of a port to PS4, Xbox, or PC.

    • 4 weeks ago
      Anonymous

      That's basically what all companies were doing. Except it didn't stop them from porting to other platforms.
