are APU's the future?

  1. 1 month ago
    Anonymous

    The future of what?

    • 1 month ago
      Anonymous

      as in... are they gonna be a serious alternative to traditional gpus

      • 1 month ago
        Anonymous

        They've been the norm ever since laptops became the corporate standard

      • 1 month ago
        Anonymous

        A slab of silicon with less die area and lower power envelope is never going to outcompete a larger slab of silicon with a higher power envelope when both are made on the same process

        • 1 month ago
          Anonymous

          its over

        • 1 month ago
          Anonymous

          So uhh, wouldn't adding more CPU to the CPU be better than adding a GPU to it, though? Why do we even need a GPU on CPUs when there is literally a dedicated PCIe slot for that?

          • 1 month ago
            Anonymous

            How are you going to spend those xtors for the CPU? The arch is what it is. You can only add more cores, which doesn't benefit all CPU workloads. Adding more GPU cores almost always provides a direct uplift to GPU performance.
            Integrated graphics continually raises the bar for what constitutes entry level in the market, and it lets laptops have the best possible battery life and performance by not needing a secondary discrete GPU, which pulls 20-50W by itself even for a low-tier one.

      • 1 month ago
        Anonymous

        well they already are for the ultra low end. Remember when Nvidia made the 710 or 1030? That tier of card is useless now. They aren't going to go higher than midrange though. I personally don't care to run 24 pins of power to my CPU

      • 1 month ago
        Anonymous

        they are in the budget range and they might be in the future when physical distance becomes a bottleneck

  2. 1 month ago
    Anonymous

    So like... Integrated GPU but it sounds cooler

    • 1 month ago
      Anonymous

      They are the current; all of AMD's mobile chips since 2011 have been APUs.

      In practice, yes. When AMD first came out with Llano and then Trinity there was actually a meaningful distinction between their APUs and Intel's competing chips with an IGP. AMD focused on close integration of the two parts: they wanted to improve data locality and cut the number of clock cycles it took to share data between them, to accelerate parallel compute workloads. They designed some neat fabric and busses to do so, and eventually opened it all up to the industry via the HSA consortium or whatever the frick it's called now.
      Those features are the de facto standard employed by everyone now.
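
      For reference, the practical payoff of that integration is fine-grained shared virtual memory: CPU and GPU can dereference the same pointer with no staging copies. A minimal OpenCL 2.0 sketch of the idea, assuming an SVM-capable driver; platform selection and error checks are trimmed and the buffer size is arbitrary:

      /* Fine-grained SVM sketch: one allocation, visible to both CPU and GPU
         at the same virtual address, no clEnqueueWriteBuffer staging copies.
         Assumes an OpenCL 2.0 driver with fine-grain buffer SVM support. */
      #define CL_TARGET_OPENCL_VERSION 200
      #include <CL/cl.h>
      #include <stdio.h>

      int main(void) {
          cl_platform_id plat;
          cl_device_id dev;
          clGetPlatformIDs(1, &plat, NULL);
          clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
          cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);

          /* One shared allocation instead of separate host and device buffers. */
          float *data = (float *)clSVMAlloc(ctx,
              CL_MEM_READ_WRITE | CL_MEM_SVM_FINE_GRAIN_BUFFER,
              1024 * sizeof(float), 0);

          for (int i = 0; i < 1024; i++)   /* plain CPU write, no map/unmap */
              data[i] = (float)i;

          /* A kernel would consume the very same pointer via
             clSetKernelArgSVMPointer(kernel, 0, data); -- no copy back either. */
          printf("data[42] = %.1f\n", data[42]);

          clSVMFree(ctx, data);
          clReleaseContext(ctx);
          return 0;
      }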

      • 1 month ago
        Anonymous

        >Llano and then Trinity
        Oh man those are some names I haven't heard in a long time

      • 1 month ago
        Anonymous

        What exactly do they offer that my intel UHD graphics can't do? I can already feed 3 4k monitors at 60Hz at once.

  3. 1 month ago
    Anonymous

    I don't like his new voice actor

  4. 1 month ago
    Anonymous

    The present, my homie

  5. 1 month ago
    Anonymous

    Wouldn't it be easier to just make GPUs that can be used as CPUs? Just add branch prediction/uop cache/whatever and call it a day. Would also eliminate the need for special GPU computing drivers

    • 1 month ago
      Anonymous

      A GPU cannot do what a CPU can do. The actual executing units per core/SM/CU/EU/whatever terminology the vendor uses are massively different from the ALUs in a high performance CPU core.

    • 1 month ago
      Anonymous

      you know there are reasons we switched from slotted CPUs to whatever we have today. Also, a single GPU core is a lot weaker than a single CPU core.
      >inb4 parallelize
      most programs are inherently synchronous
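
      To put a rough number on that: Amdahl's law caps the speedup from N parallel lanes at 1/(s + (1-s)/N) when a fraction s of the work is serial. A quick back-of-the-envelope in C; the serial fractions below are made-up illustrations, not measurements:

      /* Amdahl's law: speedup = 1 / (s + (1 - s) / N) for serial fraction s
         and N parallel lanes. Fractions below are illustrative, not measured. */
      #include <stdio.h>

      static double amdahl(double s, double lanes) {
          return 1.0 / (s + (1.0 - s) / lanes);
      }

      int main(void) {
          const double fractions[] = {0.05, 0.30, 0.70};   /* hypothetical programs */
          for (int i = 0; i < 3; i++) {
              double s = fractions[i];
              printf("serial %2.0f%%: 16 lanes -> %5.1fx, 1024 lanes -> %5.1fx, cap %5.1fx\n",
                     s * 100.0, amdahl(s, 16), amdahl(s, 1024), 1.0 / s);
          }
          return 0;
      }

      Even infinite lanes can't beat 1/s, which is why throwing GPU-style width at a mostly-serial program goes nowhere.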

  6. 1 month ago
    Anonymous

    >High Latency

    • 1 month ago
      Anonymous

      apu in the loo made that image

  7. 1 month ago
    Anonymous

    nah, but you're right that they're becoming gpu heavier, see apple, the console apus, strix halo and even intel's new gpu tiles

    • 1 month ago
      Anonymous

      They're all only as GPU heavy as the foundry process and power budget allows.

  8. 1 month ago
    Anonymous

    They were the future about 15 years ago.

  9. 1 month ago
    Anonymous

    >high latency
    This is a selling point?

    • 1 month ago
      Anonymous

      i just typed apu and chose the first non webp image;
      I HATE WEBP

  10. 1 month ago
    Anonymous

    The only reason AMD started calling it "APU" was because a Swiss patent troll who owned the brand "Fusion" threatened to sue them

    • 1 month ago
      Anonymous

      Fusion was a concept, not ever intended to be product branding

      • 1 month ago
        Anonymous

        That didn't stop them

  11. 1 month ago
    Anonymous

    im down with this. most people don't use discrete gpus anyway. if we had a 35% faster Lenovo Legion Go I'd replace my desktop with it.

  12. 1 month ago
    Anonymous

    >are APU's the future?
    frog memes got kinda stale

  13. 1 month ago
    Anonymous

    fm2/fm2+ was home to literally the only desktop APUs that made sense. Frick this gay earth, I want a 4790 + 1060 equivalent in one deal and I want it for less than $200 total including mobo and ram

  14. 1 month ago
    Anonymous

    M3 Max is an APU

    • 1 month ago
      Anonymous

      Nobody but AMD makes APUs, spergoid. That's like saying Ford makes AMG racecars

  15. 1 month ago
    Anonymous

    All this does is shorten the transmission line length between the CPU and GPU, which allows for a higher clock speed on those lines. It's not magic, it's just a system-on-a-chip.

    • 1 month ago
      Anonymous

      They pioneered GPU snooping of CPU memory pointers, a flat address space between CPU and GPU, shared virtual memory addressing, and a ton of other stuff

  16. 1 month ago
    Anonymous

    >imbibed

  17. 1 month ago
    Anonymous

    Yes.
    Dedicated GPUs have been ruined by crypto and AIs.

    I for one welcome our new weak integrated GPU overlords. Maybe this will force game devs to focus on something besides graphics fidelity. Plus, the power draw of modern high-end GPUs is getting ridiculous.

    • 1 month ago
      Anonymous

      >Maybe this will force game devs to focus on something besides graphics fidelity.
      Devs don't even need to tone down graphics fidelity too much. Even in an iGPU-only world it could remain mostly the same, if devs just went back to giving a single shit about optimization and using proper art direction to achieve good visuals, which they don't anymore.
      Especially now with UE5, shit's getting ridiculous. Indieshit devs are releasing UE5 games with PS2 or early PS3 tier graphics that demand something on the level of an RTX 3070 or RTX 4060 to run without shitting the bed. Even worse, the graphics settings in most indieshit UE5 games just make the graphics worse with zero improvement in performance - you can run it with everything on ultra or make it look like a GBA game by setting everything to low and resolution scaling to 0%, and it will run like shit either way.

  18. 1 month ago
    Anonymous

    Yes

    • 1 month ago
      Anonymous

      beat me to it

    • 1 month ago
      Anonymous

      based.
      literally clicked just to see if this was the first post. shame it wasn't.

  19. 1 month ago
    Anonymous

    i thought they got rid of apu

    • 1 month ago
      Anonymous

      no, they are going the Intel way now
      next gen of APUs will suck 245W to get you the performance of a 65W gtx1050 + 65W i3 12100, at 399 dollars

      • 1 month ago
        Anonymous

        The Zen3 Rembrandt APU at 35w in a laptop was faster than a desktop 1050ti

        • 1 month ago
          Anonymous

          i.e. none of the gaming capabilities with none of the productivity of CUDA, nvenc, etc.

        • 1 month ago
          Anonymous

          >The Zen3 Rembrandt APU at 35w in a laptop

          you mean this APU?

          • 1 month ago
            Anonymous

            Fake news. System not named, probably has cTDP set above default state.

          • 1 month ago
            Anonymous

            Hence "going the intel way" you moron

          • 1 month ago
            Anonymous

            >one review site, who didn't even name the laptop being tested, set cTDP higher than what it shipped as
            >hurrr amd so power hungry
            Except not.

          • 1 month ago
            Anonymous

            >who didn't even name the laptop being tested
            https://www.techspot.com/review/2419-amd-ryzen-9-6900hs/
            some zephyrus laptop see for yourself

          • 1 month ago
            Anonymous

            This site explicitly states that they tested the chip with cTDP set to both 45w and 75w
            Did you actually post that picture without knowing anything about the source, you third world shitskin idiot?

          • 1 month ago
            Anonymous

            idgaf, nobody cares about your $1300 laptop beating a 9 year old desktop gpu
            weird "flex" you got there pal.

          • 1 month ago
            Anonymous

            >oh I was a complete moron and proved myself wrong
            >welll a hurr durr durrr
            Great effort. Good post.

          • 1 month ago
            Anonymous

            wrong

            https://www.computerbase.de/2022-03/amd-ryzen-6900hs-test-rembrandt-asus-rog-g14/3/

            it's unclear how they tested the graphics part on your shit website in your dogshit language. i provided a legit link, not some German blog sponsored by AMD.

          • 1 month ago
            Anonymous

            >reee stop making fun of me!
            >I don't understand that computerbase is one of the biggest review sites in the world and measures power directly with four probes and two scopes
            >t-t-they're sponsored!
            >reee!
            Very cool spacing too, redditor

  20. 1 month ago
    Anonymous

    I mean a top of the line CPU on the same die or bridged with a top of the line GPU (and maybe some HBM) seems like it would be game changing

    • 1 month ago
      Anonymous

      HBM is not worth the costs over consumer grade memory like GDDR6, the cost and complexity of interposers means that the only application is for commercial uses.

      https://i.imgur.com/fowLMOG.jpg
      >are APU's the future?

      Anyone answering anything other than "copackaged accelerators are the future" has no idea what they are talking about. Chiplet reality is upon us: serializers have now rapidly outpaced traditional memory bus interconnect speeds, and beachfront becomes an issue for interconnect on very large chips when moving enough data on and off chip to justify their processing power, not to mention the yield losses as you push die size up. Many smaller chips connected with on-package fabrics have already proven themselves in AI workloads. I recall some of the AI system-on-chip packages being almost a square meter for a single package.

      This was always going to be the case, but manufacturers and engineers stayed away from it because it adds complexity and drives costs up. At some point you hit a wall where your yields tank because of die size and you run out of ways to actually get signals on and off your gigantic monolithic chip.

      • 1 month ago
        Anonymous

        >HBM is not worth the costs over consumer grade memory like GDDR6, the cost and complexity of interposers means that the only application is for commercial uses.
        Currently, sure. But we're talking about the future.

        • 1 month ago
          Anonymous

          >the future
          That's fair but I don't see why in the future you would not yet again go with GDDR7 or GDDR8 or however long graphics memory can stay as a single bus architecture with no interposer.

          There obviously comes a time when memory requirements will outpace these methods, no matter how fast you can clock the memory or how much you can reasonably pull across the bus without retiming or other shenanigans. But unlike memory, which would need other chiplets to intermediate the system, copackaged accelerator chiplets are already here and solve a lot of the issues of scale present in modern systems, so that trend will certainly only continue in the future.

          >yes but let's do on-die memory as well to fix the memory bandwidth problem.

          On-die HBM solves bandwidth but not scale. The two biggest use cases for on-die memory right now are switching chips for big IP routers and AI processing chips. Neither carries large on-chip memory (usually 16GB or less; 4-8GB is very common, and only because the smallest bulk order size HBM comes in is often 4GB). This sounds like "just enuff", but considering how dogshit most devs are and how bloated consumer OSes are, 16GB to a consumer desktop in 4 years will look like 4GB does to a current one.

    • 1 month ago
      Anonymous

      Ever heard of a little company named apple?

      • 1 month ago
        Anonymous

        no

  21. 1 month ago
    Anonymous

    >are APU's the future?

    Yes much future sar

  22. 1 month ago
    Anonymous

    yes but let's do on-die memory as well to fix the memory bandwidth problem.

    • 1 month ago
      Anonymous

      That's what the lazy, greedy ones do; it's not the actual solution.

      • 1 month ago
        Anonymous

        Memory on package lets you have a wide I/O solution without ever worrying about the mobo maker jacking up prices for running those extra traces and doing extra testing for signal integrity.
        It's going to be the way the entire market turns eventually, whether we like it or not. And of course there will be no savings ever passed on to you.

  23. 1 month ago
    Anonymous

    The 8700G is legitimately impressive. I'd love to have had a cheap desktop PC with one as a teen, something my parents could afford that could play a bunch of modern games at 1080p30.
    An APU with on-die RAM and 3D V-Cache would be really close to an endgame part

  24. 1 month ago
    Anonymous

    Yes. Not a gaymer, but from following tech it seems like current mainstream integrated hardware can play most games at 1920x1080 at medium quality settings at good, playable frame rates. As the base-level performance keeps rising, it will become harder and harder to justify spending money on an external GPU. At some point external GPUs are going to become a niche enough thing that economies of scale won't work for their development, both in hardware but also on the game development side of things, i.e. why spend money on art for a small % improvement in visual effects for a smaller and smaller % of the market that can take advantage of it.

  25. 1 month ago
    Anonymous

    Will IQfy ever learn how to write?

    • 1 month ago
      Anonymous

      Is this considered a standard grammatical rule by linguistic scholars/big-wigs? Personally, I detest both uses and will default to not using any apostrophe in such cases whatsoever when possible. If needed, I type things out fully, or use other applicable terms. I feel like casual dialogue in texting only makes these kinds of problems worse. I almost want to just have a fricking dip pen at this point.

  26. 1 month ago
    Anonymous

    what is an APU?

    • 1 month ago
      Anonymous

      testing

  27. 1 month ago
    Anonymous

    Imbibed? Wtf it's a real word!

  28. 1 month ago
    Anonymous

    in the future nothing will be upgradable, you will own nothing. Everything will be on the CPU die: your memory, your GPU, and obviously your CPU. Soon enough your PC will just be a phone with a cooler and a just-as-locked-down OS

    Enjoy your future goyim, you created this

  29. 1 month ago
    Anonymous

    the next step is putting all the memory on the same chip as the CPU as well, interwoven with the instruction/logic circuitry.

    • 1 month ago
      Anonymous

      DRAM on package would become ubiquitous long before large high density DRAM LLC is on die

      • 1 month ago
        Anonymous

        i have no idea what that means nerd, I just know that von neumann had other plans. maybe we will get all sorts of other experiments. maybe some analog computing too.

    • 1 month ago
      Anonymous

      No. GPUs are very different from CPUs. Because of the differences (such as emphasizing massive parallelism, no speculation, SPMD), there really isn't that much of a cost to having the GPU as separate logic from the CPU core, even if it is in the same package, or just a special (isolated) group of logic on the same die. AVX-512 is the result of Intel trying to integrate lots of parallelism into a standard CPU core, I think first with the Xeon Phi. AVX-512 is neat, but it is very, very different from a GPU and ultimately did not unseat GPUs from their throughput throne.
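
      For context, this is roughly what "parallelism bolted onto a normal CPU core" looks like: one AVX-512 instruction touches 16 floats at a time, but it still runs inside an ordinary speculating core rather than a GPU's SPMD machinery. A tiny sketch, assuming an AVX-512 capable CPU and -mavx512f; the array sizes are arbitrary:

      /* Minimal AVX-512 sketch: one instruction adds 16 floats per iteration.
         Compile with -mavx512f on an AVX-512 capable CPU; sizes are arbitrary. */
      #include <immintrin.h>
      #include <stdio.h>

      int main(void) {
          float a[64], b[64], c[64];
          for (int i = 0; i < 64; i++) { a[i] = (float)i; b[i] = 2.0f; }

          for (int i = 0; i < 64; i += 16) {          /* 16 lanes per operation */
              __m512 va = _mm512_loadu_ps(&a[i]);
              __m512 vb = _mm512_loadu_ps(&b[i]);
              _mm512_storeu_ps(&c[i], _mm512_add_ps(va, vb));
          }

          printf("c[10] = %.1f\n", c[10]);            /* expect 12.0 */
          return 0;
      }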

  30. 1 month ago
    Anonymous

    They've been a reality for ages though?

    • 1 month ago
      Anonymous

      >ESL from some GMT+3 shithole

      • 1 month ago
        Anonymous

        Yes, English is my second language and I'm not from one of the "golden billion" countries. Why does it make you upset and what are you going to do about it?

  31. 1 month ago
    Anonymous

    Apu > pepe

  32. 1 month ago
    Anonymous

    They already dominate laptops, which is how most people do computing today.

    For desktops, there are more problems. iGPs are bottlenecked by memory bandwidth, and mainstream motherboards will struggle to overcome that.

    • 1 month ago
      Anonymous

      Memory bandwidth isn't really an issue at the moment; memory scaling brings very little compared to raising the IGP's power limit.
      DDR5's rank structure provides better effective bandwidth than DDR4 at the same transfer rate, and being able to hit 100GB/s is pretty high up there for just 12CU.
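
      That 100GB/s figure is just bus width times transfer rate, for what it's worth. A quick sketch; the memory configs below are generic examples, not any particular SKU:

      /* Peak theoretical bandwidth = (bus width in bytes) x (transfers per second).
         Configs below are illustrative examples, not tied to any specific product. */
      #include <stdio.h>

      static double peak_gbps(int bus_bits, double mtps) {
          return (bus_bits / 8.0) * mtps / 1000.0;   /* MT/s -> GB/s */
      }

      int main(void) {
          printf("128-bit DDR5-6400   : %.1f GB/s\n", peak_gbps(128, 6400));   /* ~102 */
          printf("128-bit LPDDR5X-7500: %.1f GB/s\n", peak_gbps(128, 7500));   /* ~120 */
          printf("256-bit LPDDR5X-8000: %.1f GB/s\n", peak_gbps(256, 8000));   /* ~256 */
          return 0;
      }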

      • 1 month ago
        Anonymous

        Apple and AMD don't seem to agree. Mx chips spend a lot of silicon to get a wide and fast bus, and Strix Halo will be doing the same.

        • 1 month ago
          Anonymous

          What a meaningless moron post.
          If AMD wanted Phoenix/Hawk to have a 256bit memory PHY they would have it. The 12CU IGP doesn't need it. The halo SKU in the Strix Point family, which is still just a rumor, is alleged to have a 256bit PHY because of its alleged 40CU IGP. More than triple the executing units in the IGP, only doubling the memory PHY.
          Mainstream Strix is still a normal 128bit PHY and is 16CU, so no, AMD thinks you're a fricking shit eating moron who needs to stop pretending to be an authoritative source on anything, poopskin

          • 1 month ago
            Anonymous

            >lower product segments are still fine with lower bandwidth
            yes, and?

          • 1 month ago
            Anonymous

            12CU fine with dual channel DDR5/LPDDR5
            16CU fine with dual channel DDR5/LPDDR5
            If 12CU isn't massively memory bottlenecked with current memory, and you can still add 4 more CU without it just wasting transistors and power, then memory bandwidth still isn't an issue
            You not understanding this is just absolute comedy

          • 1 month ago
            Anonymous

            16CU is still nothing compared to Apple iGPs or where AMD want their high end SKUs to be. This is the direction the market is moving.

          • 1 month ago
            Anonymous

            The only Apple parts which outperform AMD's APUs also draw more power under sustained load
            You're making a nonsense comparison

  33. 1 month ago
    Anonymous

    I mean sub-$200 GPUs are a dead market. The only things they have to offer are APUs, used GPUs, the RTX 3050 6GB, and the Arc A380.

  34. 1 month ago
    Anonymous

    >APU
    more like
    APOO
    gottem!

  35. 1 month ago
    Anonymous

    >iGPU with a new name
    lol
    lmao

    • 1 month ago
      Anonymous

        IGPUs are usually on the motherboard, not on the same die as the CPU.

      • 1 month ago
        Anonymous

        Maybe if you're moronic, or stuck in the late 90s.

  36. 1 month ago
    Anonymous

    Good, but I don't think most games are built with APUs in mind, and newer games push your PC components to the max just to load into the menu, so there are probably still lots of incremental improvements needed before APUs completely replace the two-component CPU and GPU setup

  37. 1 month ago
    Anonymous

    the only advantage is the cpu and gpu only take one space?

  38. 1 month ago
    Anonymous

    they've been "the future" for almost two decades now.

  39. 1 month ago
    Anonymous

    Nah Nvidia will sabotage it.

    • 1 month ago
      Anonymous

      NVIDIA has plenty of CUDA money to cover

  40. 1 month ago
    Anonymous

    Good enough to kill low-end GPUs and perhaps even the lower end of the midrange, but they still can't beat decade-old enthusiast GPUs
