• magic_lobster_party@kbin.run
    link
    fedilink
    arrow-up
    91
    arrow-down
    1
    ·
    4 months ago

    It is hard. PS3 has incredibly specialized hardware. Even game developers had trouble making games for it at the time because it’s so arcane.

    • cm0002@lemmy.world
      link
      fedilink
      arrow-up
      68
      arrow-down
      9
      ·
      edit-2
      4 months ago

      Nah, that’s still a bunch of bull, they designed it and have all the documentation. They know all of its functionality, hidden or otherwise, its “undocumented” functions, its quirks, the very ins and outs of it. They probably still have the original designers on staff. They have far more knowledge and experience of their own design than any game developer.

      And yet RPCS3, an open-source PS3 emulator based on reverse-engineered research, is able to achieve decent playability on most games.

      Not to mention, they’re a multi-billion dollar company, don’t make excuses for them.

      • magic_lobster_party@kbin.run
        link
        fedilink
        arrow-up
        47
        ·
        edit-2
        4 months ago

        AFAIK, the documentation isn’t the main problem. I’m pretty sure PS3 is quite well understood.

        The problem is how to translate the code to a typical x86 architecture. The PS3 uses a very different architecture with a big focus on its own special way of doing parallelism. It’s not an easy translation, and it must be done at great speed.
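
        For a rough idea of what the translation problem looks like, here’s a purely illustrative sketch (the tiny guest instruction set is made up for the example and has nothing to do with RPCS3’s actual code): the naive approach is to interpret each guest instruction one at a time, and that per-instruction dispatch overhead is exactly what an emulator has to eliminate to hit real-time speeds.

        ```cpp
        // Toy interpreter loop for a made-up guest ISA (illustrative only).
        // A real PS3 emulator has to do the equivalent for PPU and SPU code,
        // fast enough for real-time games, which is why interpreters get
        // replaced with recompilers that emit native x86 blocks instead.
        #include <cstdint>
        #include <cstdio>
        #include <vector>

        enum class Op : uint8_t { LoadImm, Add, Halt };

        struct Insn {
            Op op;
            uint8_t dst, src; // guest register indices
            int32_t imm;      // immediate value for LoadImm
        };

        struct GuestCpu {
            int32_t regs[8] = {};
        };

        // One switch dispatch per guest instruction: simple, but far too slow
        // for a console whose games expect the hardware's native throughput.
        void interpret(GuestCpu& cpu, const std::vector<Insn>& program) {
            for (const Insn& i : program) {
                switch (i.op) {
                    case Op::LoadImm: cpu.regs[i.dst] = i.imm;            break;
                    case Op::Add:     cpu.regs[i.dst] += cpu.regs[i.src]; break;
                    case Op::Halt:    return;
                }
            }
        }

        int main() {
            GuestCpu cpu;
            std::vector<Insn> program = {
                {Op::LoadImm, 0, 0, 40},
                {Op::LoadImm, 1, 0, 2},
                {Op::Add,     0, 1, 0},
                {Op::Halt,    0, 0, 0},
            };
            interpret(cpu, program);
            std::printf("r0 = %d\n", cpu.regs[0]); // prints: r0 = 42
        }
        ```

        (Broadly speaking, recompiling blocks of guest code to native x86 - plus years of per-game fixes - is how RPCS3 gets its speed; the sketch above only shows the problem, not their solution.)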

        The work on RPCS3 is incredible, but it took them more than a decade of optimizations to get where they are now. Wii U emulation got figured out relatively quickly in comparison, even though it uses similar specs to the PS3.

      • Buddahriffic@lemmy.world
        link
        fedilink
        arrow-up
        8
        ·
        4 months ago

        There can be a lot of subtle changes going from one uarch to another.

        E.g., C/C++ for x64 and ARM both use a coprocessor register to store the pointer to thread-local storage. On x64, you can offset that address and read it from memory in a single atomic operation. On ARM, you first need to load it into a core register, and only then can you read from the offset address in memory. This makes accessing thread-local memory on ARM more complicated to do in a thread-safe manner than on x64, because you need to be sure you don’t get pre-empted between those two instructions, or one thread can end up with another’s thread-local memory pointer. Some details might be off; it’s been a while since I dealt with this issue. I think there was another thing that had to line up perfectly for the bug to happen (like having it happen during a user-mode context switch).
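
        A minimal sketch of the access being described, at the source level (the instruction sequences in the comments are my assumption of typical codegen; the exact output depends on the compiler and TLS model):

        ```cpp
        // Illustrative only. The interesting part is the generated code, not the C++:
        // - x86-64 keeps the TLS base in the FS segment register, so a local-exec
        //   access can compile down to a single instruction like  mov eax, fs:[offset]
        // - AArch64 first reads TPIDR_EL0 into a general register (mrs) and then
        //   loads from [reg + offset] -- two instructions with a window between them.
        // That window only matters if work can migrate between OS threads in the
        // middle of it (e.g. fibers / user-mode scheduling), as hedged above.
        #include <cstdio>

        thread_local int per_thread_counter = 0;

        int bump() {
            // A compiler will typically emit one of the sequences described above here.
            return ++per_thread_counter;
        }

        int main() {
            std::printf("counter = %d\n", bump()); // prints: counter = 1
        }
        ```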

        And that’s an example for two relatively similar uarchs. I’m not familiar with Cell, but I understand it to be far more different from x64 than ARM is. Sure, they’ve got all the documentation and probably still even have the collective expertise such that everything is known by at least someone without needing to look it up, but those individuals might not have the same understanding of the x64 side of things to see the pitfalls before running into them.

        And even once they experience various bugs, they still need to be debugged to figure out what’s going on, and there’s potential that the solution isn’t even possible in the paradigm used to design whatever go-between system they were currently working on.

        They are both Turing complete, so there is a 1:1 functional equivalence between them (i.e., anything one can do, the other can). But that doesn’t mean both can do it as fast as each other. An obvious example of this: desktops with 2024 hardware and desktops with 1990 hardware also have that 1:1 functional equivalence, but the more recent machines run circles around the older ones.

        • cm0002@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          4 months ago

          That’s all understandable for a startup or a young company. But this is Sony, a multi-billion-dollar electronics company with many MANY released devices and software projects under its belt.

          If they had taken things seriously, invested the proper funding, and pulled in the appropriate personnel, they would have had no problem getting something out that could beat RPCS3 in a year, tops.

          They tried to just slap something together as (what someone around here commented a while back) a minimum-value-add product and shove it out the door. Any claims of “It’s just too hard” they try to make are nothing but cover, AFAIC, now that people are starting to call them out on it.

          • Buddahriffic@lemmy.world
            link
            fedilink
            arrow-up
            3
            ·
            4 months ago

            “It’s too hard” really means “we don’t think the benefit we’d gain is worth the resources and effort it would take to get there”.

      • UltraGiGaGigantic@lemm.ee
        link
        fedilink
        English
        arrow-up
        7
        ·
        4 months ago

        For-profit corporations should not be trusted to preserve our culture. They would happily delete everything if it made them 1 dollar.

      • मुक्त@lemmy.ml
        link
        fedilink
        arrow-up
        4
        arrow-down
        1
        ·
        4 months ago

        Not to mention, they’re a multi-billion dollar company, don’t make excuses for them.

        They pay someone handsomely to make excuses for them.

      • Gork@lemm.ee
        link
        fedilink
        arrow-up
        3
        ·
        4 months ago

        I’ve worked at companies where the documentation was either non-existent, not digitized, or very poor in quality. Add 10+ years to that when nobody is left at the company who worked on the original project and it can cause this exact level of frustration.

      • dgbbad@lemmy.zip
        link
        fedilink
        English
        arrow-up
        3
        ·
        4 months ago

        Most of the games I’ve played on RPCS3 look way better and run much smoother than they did on the console itself. And no long wait times to load into the console OS save menu; saving was nearly instant. So good.

    • bolexforsoup@lemmy.blahaj.zone
      link
      fedilink
      arrow-up
      28
      ·
      4 months ago

      Yes it is hard, and that was their damn fault. I can’t believe they expected developers to have to program which processors take which loads with such granularity. Unbelievably stupid.

    • Cyth@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      ·
      4 months ago

      I don’t think it being hard is really the issue. Sony is a billion dollar multi-national corporation and they don’t get any benefit of the doubt whatsoever. Is it hard? Maybe it is, but maybe they should have thought of what they were going to do in the future when they were designing this. As was pointed out elsewhere, volunteers making an open source emulator are managing it so Sony not wanting to, or being unable to, isn’t an excuse.

  • Redcuban1959 [any]@hexbear.net
    link
    fedilink
    English
    arrow-up
    30
    arrow-down
    1
    ·
    4 months ago

    I remember how some PS3 models had, like, the entire PS2 hardware inside them and could run both PS1 and PS2 games.

    • Mnemnosyne@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      11
      ·
      4 months ago

      This is why PS3 is the last PlayStation that I owned, and I didn’t even buy it retail.

      After they discontinued the backwards compatible model I sought out and bought one secondhand, and swore never again to buy a PlayStation product unless they release one on which I can play all my PlayStation games all the way back to 1.

      • RippleEffect@lemm.ee
        link
        fedilink
        English
        arrow-up
        2
        ·
        4 months ago

        What PlayStation 1 games did you particularly enjoy? I have a small handheld that I’m looking for games for.

    • Zipitydew@sh.itjust.works
      link
      fedilink
      arrow-up
      5
      ·
      4 months ago

      That model in the picture is one of them. I don’t think all the fat PS3s could, but nearly all of them could. That was why they were chonky.

      • lichtmetzger@discuss.tchncs.de
        link
        fedilink
        arrow-up
        7
        ·
        edit-2
        4 months ago

        Some of them did it partly in software, though - and they were less compatible. The European FAT models all worked like that.

        Sadly, the fully backwards-compatible models are all ticking time bombs unless you get the RSX chip replaced with a later model. It’s a problem with the underfill on the chip, which resulted in the YLOD - basically Sony’s variant of the Red Ring of Death.

        I have an early FAT model and it still runs stable, but I’m afraid to use it because I know it will fail eventually if I do. It does look sexy asf though!

        • Zipitydew@sh.itjust.works
          link
          fedilink
          arrow-up
          4
          ·
          4 months ago

          Thanks for the heads up. Recently took mine out of storage. Setting up a game room now that my kids are old enough to be trusted with it. Did an SSD upgrade the other day. Will look into the chip issue.

    • diannetea@lemmy.ml
      link
      fedilink
      arrow-up
      3
      ·
      4 months ago

      We had a backwards-compatible PS3, and I was incredibly sad when I discovered I couldn’t play Um Jammer Lammy on it. Something in the software fucked up the timing so you just couldn’t press the buttons at the correct time, and it sucked.

      We gave away the PS3 at some point, but I still have Um Jammer Lammy. I’ve considered buying a PS2 for it but haven’t bothered so far. The disc is probably in horrible shape by now anyway; it’s just semi-sentimental, so I keep it.

    • Mountain_Mike_420@lemmy.ml
      link
      fedilink
      arrow-up
      2
      ·
      4 months ago

      Yeah, this was such a bait-and-switch tactic on Sony’s part. I remember, a few years after they stopped producing backwards-compatible models, getting into arguments with people who still thought the PS3 could play PS2 games. Yes, the first model (and maybe more) could, but then they took out the chips that did it in order to save money. Really lame, Sony. I even had a friend buy one without realizing this till he tried to play his old games.

      • Redcuban1959 [any]@hexbear.net
        link
        fedilink
        English
        arrow-up
        1
        ·
        4 months ago

        I think the other models would emulate PS2 games, but most of the time the emulation was really bad. I think GTA LCS is almost unplayable on the PS3 because of it.

  • Codex@lemmy.world
    link
    fedilink
    arrow-up
    25
    ·
    4 months ago

    I always wondered about the legacy of the Cell architecture, which seems to have gone nowhere. I’ve never seen a developer praise it, and you can find devs who love just about every silly weird computer thing. Like, surely someone out there (emu devs?) has respect for what Cell was doing, right?

    I’ve never understood it. Multicore processors already existed (the X360 had a triple-core processor, oddly), so I’m not clear on what going back to multiple CPUs accomplished. Cell cores could act as FPUs too, right? The PS3 didn’t have a dedicated GPU, right?

    Such a strange little system, I’m still amazed it ever existed. Especially the OG ones that had PS2 chips in them for backwards compatibility! Ah, I miss my old PS3.

    • invertedspear@lemm.ee
      link
      fedilink
      arrow-up
      16
      ·
      4 months ago

      It was very experimental; that’s really the reason Sony went with it. And it was at the genesis of multi-threaded processing, so the jury was still out on which way things would go.

      Your description of it is a little wrong, though: it wasn’t multiple CPUs, at least not in the way that would traditionally be thought of. It was a single dual-threaded CPU core (the PPE) with 6 usable “supporting cores” (the SPEs), so not full-on CPUs. Kind of like an early stab at octo-core processors back when dual core was becoming popular and quad core was still being developed.

      I remember that the ability to boot Linux was a big deal too, and a university racked 8 PS3s together into basically a 64-core supercomputer. I’m actually sad that didn’t go further; the raw computing power was there, we just didn’t really know what to do with it besides experiment.

      Honestly, I think someone had a major breakthrough in multi-core single-package processors shortly after the PS3 launch that killed this. Cell was just a more expensive way to get true multi-threaded processing, and a few years later it was cheaper to just buy a conventional processor with lots of cores.

      Maybe in a different timeline we’re all running Cell processors in our daily lives.

      • Codex@lemmy.world
        link
        fedilink
        arrow-up
        2
        ·
        4 months ago

        Ah, that sounded familiar as you described it! Thanks for the correction and context! I’d forgotten how early into multicore we still were. Well, that also explains why it doesn’t have particular fans, then: it’s “basically” “just” parallel programming (which people still don’t understand!).

        Yeah, the university running a PS3 cluster was fun news! I recall there being a brief run on the devices as people thought there’d be sudden academic demand for them as supercomputers. I think you could run Folding@home on them as a screensaver? Which (if I remember right) kind of made the PS3 the biggest research computing cluster around for a while!

    • addie@feddit.uk
      link
      fedilink
      arrow-up
      12
      ·
      4 months ago

      The PS3 most certainly had a separate GPU - it was based on the GeForce 7800 GTX. Console GPUs tend to be a little faster than their desktop equivalents, as they share the same memory. Rather than the CPU having to send e.g. model updates across a bus to update what the GPU is going to draw in the next frame, it can change the values directly in the GPU memory. And of course, the CPU can read the GPU framebuffer and make tweaks to it - that’s incredibly slow on desktop PCs, but console games can do things like tone mapping whenever they like, and it’s been a big problem for the RPCS3 developers to make that kind of thing run quickly.

      The Cell cores are a bit more like the ‘tensor’ cores that you’d get on an AI chip than a full-blown CPU core. They can’t speak to RAM directly, just exchange data between themselves - the CPU needs to copy data in and out of them, and also schedule any jobs that must run on them; they can’t do that themselves. They’re also a lot more limited in what they can do than a main CPU core, but they are very, very fast at what they can do.

      If you are doing the kind of calculations where you’ve a small amount of data that needs a lot of repetitive maths done on it, they’re ideal. Bitcoin mining or crypto breaking for instance - set them up, let them go, check in on them occasionally. The main CPU acts as an orchestrator, keeping all the cell cores filled up with work to do and processing the end results. But if that’s not what you’re trying to do, then they’re borderline useless, and that’s a problem for the PS3, because most of its processing power is tied up in those cores.
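
      A loose sketch of that orchestration pattern on ordinary hardware (purely illustrative: the buffer size, the kernel, and the thread-per-chunk dispatch are all made up, and real Cell code would keep every SPE busy with double-buffered DMA transfers instead of running one chunk at a time):

      ```cpp
      // Host-side analogue of the "main CPU orchestrates, small cores crunch" pattern.
      // Each chunk of work must be copied into a small private buffer (standing in
      // for an SPE's limited local store), processed there, and copied back out.
      #include <algorithm>
      #include <cstddef>
      #include <cstdio>
      #include <functional>
      #include <thread>
      #include <vector>

      constexpr std::size_t kLocalStore = 4096; // pretend "local store", in elements

      // The "SPE" job: pure number crunching on its private buffer only.
      void spe_kernel(std::vector<float>& local) {
          for (float& x : local) x = x * 2.0f + 1.0f;
      }

      int main() {
          std::vector<float> data(1 << 20, 1.0f); // big buffer in "main RAM"

          // "Orchestrator" loop: copy in, dispatch, wait, copy out. All of this
          // bookkeeping is the orchestration cost described above.
          for (std::size_t off = 0; off < data.size(); off += kLocalStore) {
              const std::size_t n = std::min(kLocalStore, data.size() - off);
              std::vector<float> local(data.begin() + off, data.begin() + off + n);
              std::thread worker(spe_kernel, std::ref(local)); // start the job
              worker.join();                                   // wait for completion
              std::copy(local.begin(), local.end(), data.begin() + off);
          }
          std::printf("data[0] = %f\n", data[0]); // prints: data[0] = 3.000000
      }
      ```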

      Some games have a somewhat predictable workload where offloading makes sense. Got some particle effects - some smoke where you need to do some complicated fluid-and-gravity simulations before copying the end result to the GPU? Maybe your main villain has a very dramatic cape that they like to twirl, and you need to run the simulation on that separately from everything else that you’re doing? Problem is, working out what you can and can’t offload is a massive pain in the ass; it requires a lot of developer time to optimise, when really you’d want the design team implementing that kind of thing; and slightly newer GPUs are a lot more programmable and can do the simpler versions of that kind of calculation both faster and much more in parallel.

      The Cell processor turned out to be an evolutionary dead end. The resources needed to work on it (expensive developer time) just didn’t really make sense for a gaming machine. The things that it was better at are things that it just wasn’t quite good enough at - modern GPUs are Bitcoin monsters, far exceeding what the Cell can do, and if you’re really serious about crypto breaking then you probably have your own ASICs. Lots of identical, fast CPU cores are what developers want to work with - it’s much easier to reason about.

      • ShinkanTrain@lemmy.ml
        link
        fedilink
        English
        arrow-up
        2
        ·
        4 months ago

        So what you’re saying is that Cell 2 is gonna bring back cool fluid and cloth simulation 🙏

    • biscuitswalrus@aussie.zone
      link
      fedilink
      arrow-up
      6
      ·
      4 months ago

      I knew of a datacentre that had hundreds of PS3s for rendering fluid simulations and other such things that, at the time, were absolutely cutting-edge tech. I believe F1 and some early 3D Pixar stuff was rendered on those farms. But like all things, technology marched on; FPGAs and CUDA have taken that space.

      Cell definitely was heavily used by specialist/niche industries, though.

      I wonder if I can find you a link that explains it better than the rumours I heard from staff who used to work in those datacentres.

      Hmm, hard to find commercial applications - individuals might have blogged about it - but here’s what I’m talking about: https://en.m.wikipedia.org/wiki/PlayStation_3_cluster

    • OutsizedWalrus@lemmy.world
      link
      fedilink
      arrow-up
      3
      ·
      4 months ago

      I think the application of it was wrong.

      You basically had game devs that wanted to build cross-platform easily. PC, Xbox, and Nintendo used standard architectures, while the PS3 was unique.

      That basically meant you had to develop for the PS3 as an entirely separate game from the other major systems.

  • deltapi@lemmy.world
    link
    fedilink
    arrow-up
    26
    arrow-down
    4
    ·
    4 months ago

    Xbox One plays a number of 360 games fine.

    Apple used QuickTransit for their PPC apps during the Intel migration, to great success.

    I guess Sony just didn’t want to pay the emulator tax?

    • drcobaltjedi@programming.dev
      link
      fedilink
      arrow-up
      37
      arrow-down
      2
      ·
      4 months ago

      The Xbox One/Series consoles run a good number of 360 games despite the fact that the 360 uses PowerPC and the newer consoles are x86.

      Sony is out here getting shown up by RPCS3, which has about 70% of their listed games working perfectly fine, thanks to hobbyists reverse-engineering the PS3.

    • boonhet@lemm.ee
      link
      fedilink
      arrow-up
      2
      ·
      4 months ago

      Apple did the same again with their ARM migration and in my experience it worked great. I believe Microsoft also has a solution for running x86 software on ARM.

      • SkunkWorkz@lemmy.world
        link
        fedilink
        arrow-up
        3
        ·
        edit-2
        4 months ago

        But Apple’s solution isn’t pure software emulation; the SoC has special hardware inside to make the translation a lot faster.

        • deltapi@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          4 months ago

          The original Rosetta, which emulated PPC on x86, is directly comparable to the situation of PS3 games on PS4 hardware. I was able to play Halo CE for Mac on x86 with Rosetta and it felt native.

          The point is that this isn’t a limitation of technology, this was a decision on Sony’s part.

    • XeroxCool@lemmy.world
      link
      fedilink
      arrow-up
      2
      ·
      4 months ago

      There are some weird online connection issues on the 360 that occur with certain modern routers - you get dropped randomly from the game. Annoyingly, the emulated 360 on the One doesn’t skirt around the issue. It was annoying for Borderlands, but it made Left 4 Dead worthless on anything besides easy.

  • macniel@feddit.org
    link
    fedilink
    arrow-up
    20
    arrow-down
    2
    ·
    edit-2
    4 months ago

    The PS3 is the epitome of “idiots admire complexity […]”: it was needlessly complicated with its Cell architecture.

    • The Quuuuuill@slrpnk.net
      link
      fedilink
      English
      arrow-up
      16
      ·
      edit-2
      4 months ago

      There are design decisions that I really don’t understand why Sony made. They do, however, make the PS3 the ideal piece of hardware if you’re wanting to build an ad hoc supercomputer.

    • slacktoid@lemmy.ml
      link
      fedilink
      English
      arrow-up
      3
      ·
      4 months ago

      I would argue it’s what you get when you build hardware without any consideration for the people writing the software. Which is just as much the epitome of a kind of silly.

    • taanegl@beehaw.org
      link
      fedilink
      arrow-up
      1
      ·
      4 months ago

      I think the world has learned from this, since we’re abstracting and decoupling much more than before, as well as developing new tooling and modernising old tooling all the time to lower that barrier to entry.

      Shout-outs to the game devs who had to deal with this shit for 3 years straight; their keyboards were probably salty from all the crying, their rubber duckies all crumpled and deflated.

  • SpookyGenderCommunist [they/them, she/her]@hexbear.net
    link
    fedilink
    English
    arrow-up
    15
    arrow-down
    1
    ·
    edit-2
    4 months ago

    I saved up and bought a reasonably beefy Mini PC, and turned it into an emulation console with Batocera. PS3 emulation runs like an absolute dream. But who needs backwards compatibility, when we can resell you the same game from 15 years ago, again, at full price???

    • MazonnaCara89@lemmy.mlOP
      link
      fedilink
      arrow-up
      2
      ·
      4 months ago

      Despite my hate for Microsoft, it is important (sometimes) to acknowledge and appreciate the good they do, such as offering backwards compatibility for games for free. You don’t have to repurchase a game if you already have the original media: insert it in the new console and it downloads the game. Yes, it’s maybe an online-only solution, but it’s far better than what Nintendo and Sony do.

  • The Quuuuuill@slrpnk.net
    link
    fedilink
    English
    arrow-up
    11
    ·
    4 months ago

    Emulating a processor with a unique set of properties, including infinite scalability, is hard. You can’t just put an emulation layer on top of x86 like you can with a processor whose instruction set is a subset of x86’s.

    • Dudewitbow@lemmy.zip
      link
      fedilink
      arrow-up
      1
      ·
      4 months ago

      You can to some extent. It’s not like you couldn’t throw an emulator designed for one architecture onto one with a subset of it - as already shown on the PS4, for example, where you could throw Dolphin and Cemu onto a PS4 running Linux (not that it would run nicely, but it’s possible).

      It’s only harder if you’re trying to do it in the base OS, because the base OS is usually lacking a graphics API, rather than it being a hardware issue that presents problems. It’s why people are jokingly saying the Xbox Series may be able to run PS3 soon, because dev mode was updated with Mesa, which includes support for both OpenGL and Vulkan. And alien hardware isn’t always the issue, given that random devices are capable of running Sega Saturn, which, like the PS3, had extremely unique hardware of its own.

      • aaaaaaadjsf [he/him, comrade/them]@hexbear.net
        link
        fedilink
        English
        arrow-up
        8
        arrow-down
        1
        ·
        4 months ago

        The SPEs on the Cell processor are actually pretty good at rendering graphics. In a lot of late-gen exclusive PS3 games you can see that the developers utilised them more and more for graphics rendering. So the plan was to have the SPEs on both Cell processors do all the graphics.

  • Destide@feddit.uk
    link
    fedilink
    English
    arrow-up
    9
    ·
    4 months ago

    My 60 GB one died (RIP) after an ex left it on all night with a Sonic collection paused.

    • Cadeillac@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      4 months ago

      Sounds like it was heading out regardless, unless there was a massive memory leak or something. I’ve been leaving consoles on for days since the SNES. Not that I recommend it, but it really shouldn’t be a problem.

      • pulverizedcoccyx@lemmy.ca
        link
        fedilink
        arrow-up
        2
        ·
        4 months ago

        I wouldn’t recommend that for Call of Duty: Warzone on PS4. A perfectly clean, barely used, and quite new PS4 Pro - fans screaming at the top of their lungs while I’m just sitting at the main menu. I eventually shut it off because it sounded so crazy. Never experienced that with any game before or since.

        • Cadeillac@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          4 months ago

          Oh I don’t doubt it, and I don’t really recommend it for anything but a server. All modern consoles should have an automatic shutdown from overheating after the 360 fiasco. Yes, I understand that doesn’t include the PS3, just kinda rambling at this point I guess. Another thing is that people seem to love squeezing consoles into closed compartments so they can sit there and suffocate in their own heat. Also, cigarette smoke will destroy a console.

          • pulverizedcoccyx@lemmy.ca
            link
            fedilink
            arrow-up
            2
            ·
            4 months ago

            I ended up putting Lego under the corners for additional airflow and repasted the APU - that’s about the best I can do. She’s worth it, firmware 9.0.

            • Cadeillac@lemmy.world
              link
              fedilink
              English
              arrow-up
              2
              ·
              4 months ago

              This guy fucks. Yeah, fortunately I was an Xbox guy and my GoW Pro sat offline forever. I haven’t tested it since I recovered it from a house fire a few nights ago. 🤞

  • I Cast Fist@programming.dev
    link
    fedilink
    arrow-up
    2
    ·
    4 months ago

    Honest question: can’t they just ask a chip foundry to make a new batch of the components, with even better miniaturization today? The original used a 90nm process, while later versions of the console used 45nm; nowadays, I think even if they opted for 20-25nm for cost savings, it’d still work fine.

  • Koof_on_the_Roof@lemmy.world
    link
    fedilink
    arrow-up
    2
    arrow-down
    1
    ·
    4 months ago

    The whole point of bringing out a new generation of hardware is to make it work in much better ways than the last one. By default, it is not going to run the older generation of games because it doesn’t work in the same way. Now, they could spend a lot of effort making it able to play the old games and work the old way, but what is their incentive to do that, compared to, say, starting work on the next generation or releasing the console earlier?