What do anons think of the rx 7000 leaks?

  1. 2 years ago
    Anonymous

    >2X RTX 3090 performance for $1099
    seems legit

    • 2 years ago
      Anonymous

      They say that same shit every generation and it only turns out to be 30 to 40% faster

      • 2 years ago
        Anonymous

        6900XT is about 2x 5700XT
        they'll do it again (half of it will be due to higher TDP)

        • 2 years ago
          Anonymous

          Yeah I remember all the FUD posting about how the 6900xt wouldn't be twice as fast "just" because it has twice the cores. That they can't get 2x scaling.
          ofc the TDP is higher as well, and there were architecture improvements, but both were on the same 7nm node.
          RDNA3 has architecture improvements as well to make up for the losses in using chiplets.

          >3x 8-pin power
          >12-pin power
          what the actual frick
          before long you'll need a dedicated circuit for your computer with how insane things are getting

          >2x more powerful card uses 1.5x the power
          I don't see the problem as long as they're not 600w nvidia housefires.

          If the 7950 is 2x the perf of the 6950xt and uses 475W instead of 335W, that's still about a 40% perf:watt improvement (quick sketch below).

          Not like you can't undervolt. Sacrifice 10% performance to save 25% power or so. They sadly can't use 350W cards to compete with Nvidia's 600W cards. They need to push them closer to 500W to compete.

          We've already seen this happen with... was it the 5600XT or something?
          They initially were going to be efficient cards. But Nvidia pushed out housefires so the AMD cards had to raise TDP to barely be much lower but perform better. Nvidia is the one driving this massive power consumption escalation and only regulators can stop it since consumers are stupid.
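
          A minimal sketch of that arithmetic in Python, taking the rumored 2x perf and 475W figures at face value (leak numbers, not confirmed):

          perf_old, power_old = 1.0, 335.0   # 6950XT baseline
          perf_new, power_new = 2.0, 475.0   # rumored 7950: 2x perf at 475W
          ppw_gain = (perf_new / power_new) / (perf_old / power_old) - 1
          print(f"perf/watt improvement: {ppw_gain:.0%}")  # ~41%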

          • 2 years ago
            Anonymous

            why would regulators stop it? just don't buy a fricking TITAN card from nvidia or amd and you're done

          • 2 years ago
            Anonymous

            They could make them put yearly power cost stickers on the box like appliances have.

            It's not just titan cards lmao?
            The 3060ti is fricking 200W. People said Vega56 was a housefire when it used that much.
            The 1060 was a 120W card. GTX 980 was 145W.
            Nvidia is going insane pushing TDPs up like this.

            >They initially were going to be efficient cards. But Nvidia pushed out housefires so the AMD cards had to raise TDP to barely be much lower but perform better. Nvidia is the one driving this massive power consumption escalation and only regulators can stop it since consumers are stupid.
            based anon I hate the revisionism taking place. Nvidia has been doing this for a decade now.

            Not really. It's namely been since the 2000 series then again with the 3000 series.
            GTX 960 and 1060 were both 120W.

            The 6900xt was twice the performance of the 5700xt for more than twice the price. It wasn't even a gen on gen improvement. If they're going to "do it again" then they'll double it for 2k, not 1100.

            AMD is fine with high prices as long as Nvidia doesn't want to compete on prices either.
            Intel Arc should at least drive down prices in the <$300 range.

          • 2 years ago
            Anonymous

            >3060ti
            thats less than the 5700XT for the same or better performance.

          • 2 years ago
            Anonymous

            Weird way of saying the 3060ti is simply better.

          • 2 years ago
            Anonymous

            >They initially were going to be efficient cards. But Nvidia pushed out housefires so the AMD cards had to raise TDP to barely be much lower but perform better. Nvidia is the one driving this massive power consumption escalation and only regulators can stop it since consumers are stupid.
            based anon I hate the revisionism taking place. Nvidia has been doing this for a decade now.

        • 2 years ago
          Anonymous

          The 6900xt was twice the performance of the 5700xt for more than twice the price. It wasn't even a gen on gen improvement. If they're going to "do it again" then they'll double it for 2k, not 1100.

          • 2 years ago
            Anonymous

            Nice goalpost moving moron

        • 2 years ago
          Anonymous

          Higher tdp, but it's rumored that they can clock up to or over 3GHz, and judging by the 6950xt and ps5 apu, I'd say that rumor is legitimate

        • 2 years ago
          Anonymous

          >comparing a high-end card to a mid-range card
          what the frick u doing

          if we had a 5800XT then the 6800XT/6900XT would be 50% stronger

          • 2 years ago
            Anonymous

            See

            They are both the best they have to offer. This image and all of the rumors point towards flagships from both companies being 2x larger. Unless you think nvidia and amd are going to massively frick up performance scaling, there is no reason to doubt a nearly 2x performance increase gen over gen, flagship vs. flagship.

          • 2 years ago
            Anonymous

            5700XT was RDNA1 flagship stupid homosexual. RDNA2 flagship is 2x faster, RDNA3 will have another +50% perf/watt and more cores as well.

          • 2 years ago
            Anonymous

            Best to compare with perf-per-area.
            6600XT is 1.11x the speed of the 5700XT with 0.94x the die size. About 1.2x PPA.
            However, the PPW is far higher: consuming 0.71x the power leads to about 1.5x PPW.
            PPA gains are all from clock speeds as IPC is the same outside of DX12U stuff or some cache edge cases.
            Considering the lower ALU count of the 6600XT, it really is nice.
            Meanwhile Navi 21 and Navi 10 have identical PPA, IC and DX12U stuff is less area efficient than more cores and more memory controllers.
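
            Same math in Python for anyone who wants to check it, using the ratios quoted above as given:

            speed, area, power = 1.11, 0.94, 0.71  # 6600XT relative to 5700XT
            ppa = speed / area    # perf per area, ~1.18x
            ppw = speed / power   # perf per watt, ~1.56x
            print(f"PPA: {ppa:.2f}x, PPW: {ppw:.2f}x")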

          • 2 years ago
            Anonymous

            And naturally larger GPUs hit diminishing returns in performance; scaling gets worse and clocks/power scale worse.

          • 2 years ago
            Anonymous

            Best to compare with perf-per-area.
            6600XT is 1.11x the speed of the 5700XT with 0.94x the die size. About 1.2x PPA.
            However, the PPW is far higher: consuming 0.71x the power leads to about 1.5x PPW.
            PPA gains are all from clock speeds as IPC is the same outside of DX12U stuff or some cache edge cases.
            Considering the lower ALU count of the 6600XT, it really is nice.
            Meanwhile Navi 21 and Navi 10 have identical PPA, IC and DX12U stuff is less area efficient than more cores and more memory controllers.

            You're really good at saying nothing relevant to what the other anon is saying

            https://twitter.com/Kepler_L2/status/1537375541686329345
            This is the most rational lineup.

            >twitter link
            aint clicking that shit homie, this is an image board, post screenshots or gtfo

          • 2 years ago
            Anonymous

            The context is the 5700XT is a midrange GPU and the 6900XT is a proper flagship.
            A 5900XT would probably be 50% stronger than the 5700XT at a somewhat smaller die size than the 6900XT.

          • 2 years ago
            Anonymous

            5700XT is the RDNA1 flagship, your imaginary gpus are irrelevant.

          • 2 years ago
            Anonymous

            >5700XT is the RDNA1 flagship, your imaginary gpus are irrelevant.
            No it's not, that's what the 900 moniker is for u stupid c**t. The 590 Polaris was also cancelled

          • 2 years ago
            Anonymous

            >IESLB
            5700XT is the RDNA1 flagship, your imaginary gpus are irrelevant. Stay mad braindead zoomer.

          • 2 years ago
            Anonymous

            Look, the point is that if RDNA3 doubles RDNA2's performance, it is far more impressive than RDNA2 doubling RDNA1.
            And it is $400 vs $1000 ffs.

          • 2 years ago
            Anonymous

            +50% better perf/watt already means that a 200W RDNA3 card will be at least 50% faster than a 200W RDNA2 card. And this is without more cores, higher clocks, and more cache.

          • 2 years ago
            Anonymous

            >5700XT is the RDNA1 flagship,
            Correct
            >your imaginary gpus are irrelevant. Stay mad braindead zoomer.
            Incorrect.

            The context is the 5700XT is a midrange GPU and the 6900XT is a proper flagship.
            A 5900XT would probably be 50% stronger than the 5700XT at a somewhat smaller die size than the 6900XT.

            >The context is the 5700XT is a midrange GPU and the 6900XT is a proper flagship.
            Correct.

            The 5700xt is the flagship gpu for rdna1 the same way the 480/590 were flagships for Polaris. That said, a flagship is usually the largest and most powerful vessel in its fleet (although there are exceptions), so a flagship in gpu generations is simply the most powerful in that generation at that time. Therefore, while a gpu like the 480 was technically a flagship, it was at best mid-range, while the 5700xt could comfortably be considered upper mid-range, but most definitely not high-end. Mostly, these classifications arise from the actual chips and more specifically their dimensions. In Nvidia's case it's also quite convenient that they are somewhat consistent in naming their chips.

            +50% better perf/watt already means that a 200W RDNA3 card will be at least 50% faster than a 200W RDNA2 card. And this is without more cores, higher clocks, and more cache.

            Well that's an idealized way of computing things.
            >And this is without more cores higher clocks and more cache.
            I'm very certain that the 50% efficiency improvement they claim factors in clock speed changes and most definitely the cache. More cores will be part of the equation as well.

            >Nvidia is going insane pushing TDPs up like this.
            And it will be called a house fire. I'm already collecting material to make memes.
            t.v56 owner

            Chiplet design + better node + higher wattage. 2x performance is entirely reasonable.

            this. Mostly the chiplet design allows for truly huge GPUs. Just compare: the 6900xt has 5120 stream processors, the 7900xt allegedly has double. If clock speeds are somewhat lower but the cores have more ipc through u-arch improvements and more cache, we're easily looking at 2x performance. If the overall tdp rises by 33% (+100% for double cores - potentially lower clocks - u-arch voltage reduction - process node improvements), that would be a 50% improvement in efficiency, as the sketch below spells out. Now amd could also increase clock speeds for some designs and sacrifice some of the efficiency.
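
            The efficiency arithmetic in that last bit, spelled out in Python (both inputs are the speculative figures above, not measurements):

            perf_ratio = 2.0   # assumes doubled cores actually scale to 2x performance
            tdp_ratio = 1.33   # assumed +33% board power
            print(f"{perf_ratio / tdp_ratio - 1:.0%} better perf/watt")  # ~50%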

          • 2 years ago
            Anonymous

            t. malding zoom

          • 2 years ago
            Anonymous

            >RX 7970 XT (Navi31 XTX) 48 WGP/96 CU/12288 FP32, 24 GB GDDR6, 384 MB IC
            >RX 7950 XT (Navi31 XT) 42 WGP/84 CU/10752 FP32, 24 GB GDDR6, 192 MB IC
            >RX 7900 XT (Navi31 XL) 35 WGP/70 CU/8960 FP32, 20 GB GDDR6, 160 MB IC
            >RX 7800 XT (Navi32 XTX) 32 WGP/64 CU/8192 FP32, 16 GB GDDR6, 128 MB IC
            >RX 7800 (Navi32 XT) 28 WGP/56 CU/7168 FP32, 16 GB GDDR6, 128 MB IC
            >RX 7700 XT (Navi32 XL) 24 WGP/48 CU/6144 FP32, 12 GB GDDR6, 96 MB IC
            >RX 7600 XT (Navi33 XT) 16 WGP/32 CU/4096 FP32, 8GB GDDR6, 64 MB IC
            >RX 7600 (Navi33 XL) 14 WGP/28 CU/3584 FP32, 8GB GDDR6, 64 MB IC
            I'll personally add that price is probably $400-$2000 and perf relative to a 6900XT is 0.9x to 2.2x+.
            Those are just safe personal estimates.

    • 2 years ago
      Anonymous

      Possible, real world price will be 2K € then

      lol no, that's not a hard thing to believe, AMD has already doubled themselves gen over gen and they're likely gonna do it again.

      I expect nvidia to double as well, seeing as SS8 was a terrible node for clocks/power and they're adding a giant cache to ada lovelace. So if a 3080 keeps up with 6900xt despite no LLC, it's gonna be a game changer for the 4080.

      Depends if they really boast that many shaders + more clock + more cache
      I expect the top model to be 2x 6900 XT, the main question is what it will cost and if I can obtain one
      Maybe I have to scalp me one
      If I can get even a midrange NVIDIA for MSRP I will buy and resell it after a few weeks; if the mining hype is nearly as horrible as last time there may be a 50 or 100% price hike
      At this point it's very unlikely the top model will be under 1K so I hope the richgays can buy me one
      This time I also don't need a GPU on launch so desperately

    • 2 years ago
      Anonymous

      >$479 for entry level

      Guess I'll be sticking to Polaris for a while.

      • 2 years ago
        Anonymous

        That obviously isn't entry level. Smaller die entry level designs always launch later.

      • 2 years ago
        Anonymous

        You know the $280 rx 6600 is 2.5x faster than your rx 480, right? Also uses less power.

        Can someone explain to me like I'm fricking brain damaged why the memory bus/bandwidth seems to be getting smaller each gen?

        It's expensive.
        Uses die space.
        Faster memory lets you get away with a narrower bus for a given bandwidth.

        Your question is like asking
        >why didn't they just put a 1024bit bus on 10 year old GPUs and make them 3x larger dies to make them twice as fast?
        But since RDNA3 seems to have 64MB cache+64bit bus chiplets, they may feasibly be able to actually do 512bit and larger buses from now on, which will get much better scaling.

        It's cost.
        As dies get larger for more compute and such, they try to save die space on smaller memory bus and shit like that.
        It's easier to save money on compression and shit for higher effective bandwidth and other optimizations and use faster memory than it is to make the gpu itself weaker in exchange for more die space reserved to memory bus.
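
        The trade-off is just peak bandwidth = per-pin data rate x bus width. A quick Python sketch (the 18 Gbps/192-bit combo is illustrative, but 14 Gbps/256-bit is the 5700XT config):

        def bandwidth_gbs(data_rate_gbps, bus_width_bits):
            # peak bandwidth in GB/s = Gbps per pin * number of pins / 8 bits per byte
            return data_rate_gbps * bus_width_bits / 8

        print(bandwidth_gbs(14, 256))  # 448 GB/s on a wide bus
        print(bandwidth_gbs(18, 192))  # 432 GB/s: faster memory, narrower bus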

        Still makes no sense launching it before the flagship or XT versions. AMD does sandbag, but sandbagging to this point is beyond stupid. Nobody wants the slower product.

        My guess is it's a way to increase prices without it feeling as bad.
        You get a 7700 that's faster and uses less power than the 6700xt. It costs the same but it's still better.
        Then they can add a 7700xt later that matches the 3090/6900xt that's say $549 and drop the 7700 price a bit to $449 or so and they seem like the good guy.

        Fact is that GPU prices are going up. They're always going up. Inflation exists.
        >gtx 970 was $329
        >gtx 1070 was $449 (/w fake $379 msrp that the FE and no other cards on launch sold for)
        >rtx 2070 was $599 (/w fake $499 msrp that the FE and no other cards on launch sold for)
        Nvidia has used fake MSRPs which no cards were ever sold at since pascal to hide the rising MSRPs. AMD will probably use repositioning of SKUs in the stack to obfuscate rising MSRPs.

        Since Nvidia appears to be releasing housefires, it gives AMD room to release power efficient GPUs on launch and then respond with their own housefire XT versions afterwards.

        They doubled the shaders, clocks, vram and added a bunch of cache. The nickname of the chip is Big Navi. Oh yeah look at the price and how much power it draws. 200% is the bare minimum.

        5700xt is 225w. 6900xt is 300w. That's not double.

        • 2 years ago
          Anonymous

          >You know the $280 rx 6600 is 2.5x faster than your rx 480, right? Also uses less power.
          It's 60% faster you moron. It's also not $280 but ~ $300. Also 2.5x faster implies 350% you ungodly abomination, which is even more delusional.

          That obviously isn't entry level. Smaller die entry level designs always launch later.

          480 was a midrange card for $229 msrp. Whatever abomination you get nowadays for an inflation adjusted price of $265 is a joke considering it has been 6 years. Case in point, the aforementioned rx 6600 costs 300 bucks and is 60% faster; a fricking vega 56 is as fast and went for ~220 bucks back in 2019. Comparing GPU prices to anything but pre-cough levels is moronic beyond belief.

          • 2 years ago
            Anonymous

            How are you so wrong?
            Why don't you look up benchmarks that show how wrong you are?
            You're just going by tflops or something stupid.

            And no, 2.5x is +150%. You're terrible at math and need to stop calling others "moron" when you're the most moronic person in this thread.

          • 2 years ago
            Anonymous

            >And no, 2.5x is +150%
            2.5x performance is 250%, which is +150% faster.
            2.5x faster is +250%, which is 350% of baseline performance.
            "Faster" is comparative, which means you are comparing it to the baseline of 100% in this case. Learn some fricking basic grammar you imbecile.
            >You're just going by tflops or something stupid.
            I went with userbenchmark numbers fully aware of them being a shit source, although i was only aware of major problems with the cpu scores of the site - until now. I concede that my numbers are therefore incorrect - except of course the aforementioned comparative grammar nitpicking, i still claim you are in the wrong there.
            For the most part, userbenchmark is ranking the v56 as faster than it actually is, a lie that i - as an owner of said card - went along with.
            Still, from what other sources i browsed through, it looks like the 6600 is approximately "only" twice as fast as an rx 480.
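
            The two readings side by side in Python, for anyone still confused:

            base, x = 100.0, 2.5
            print(f"2.5x the performance = {base * x:.0f}% of baseline = +{base * (x - 1):.0f}%")
            print(f"2.5x faster (strict reading) = +{base * x:.0f}% = {base * (x + 1):.0f}% of baseline")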

          • 2 years ago
            Anonymous

            Uselessbenchmark has been into seething overdrive over AMD for at least 2-3 years. It's what happens when you're a coping mentally ill intelvidia fanboy in charge of maintaining a benchmarking site.

          • 2 years ago
            Anonymous

            I have to clarify, i mistakenly used the "effective speed" shit they list at the top instead of the actual results. It's the main flaw of the site and i didn't pay attention like a moron, so i fell for their bs. Really have to update my userbenchmark scraper to use my new scraping library and pay some jeets a few bucks so i can finally host an alternative front-end for this shitshow of a site.

          • 2 years ago
            Anonymous

            >2.5x performance is 250%, which is +150% faster.
            >2.5x faster is +250%, which is 350% of baseline performance.
            >"Faster" is comparative, which means you are comparing it to the baseline of 100% in this case.

          • 2 years ago
            Anonymous

            A very convincing argument anon. Very cool.

        • 2 years ago
          Anonymous

          >You know the $280 rx 6600 is 2.5x faster than your rx 480, right?
          In what israeli multiverse is this true?

          • 2 years ago
            Anonymous

            This one.
            Multiple reviewers show the same thing.

          • 2 years ago
            Anonymous

            Reality doesn't matter

          • 2 years ago
            Anonymous

            sorry but i only believe the stats from the nvidia and amd official websites

          • 2 years ago
            Anonymous

            That's 2x not 2.5x, also the cheapest RX 6600 where I live is $100 more than what I paid for my RX 480 back in early 2017. To be fair the exchange rate was slightly better back then, but only by about 10%.

          • 2 years ago
            Anonymous

            Nope it's more than 2x.
            Go back to elementary school.

          • 2 years ago
            Anonymous

            It also costs double the price so why bother. The card is only good for 1080p anyway; at everything else it shits the bed.

          • 2 years ago
            Anonymous

            >costs double
            >gets 2.5x the performance for the same power draw
            You're actually arguing to get a used card that is worse perf:price and perf:watt. You're an idiot.

    • 2 years ago
      Anonymous

      https://www.digitaltrends.com/computing/amd-may-be-working-on-an-even-better-rdna-3-gpu/

      • 2 years ago
        Anonymous

        A Pro Duo of Navi 32 is the best fit if it exists.
        But honestly, AMD are just going full deflect mode at this time, whether there truly is a 2GCD card on one package is for time to tell.

        • 2 years ago
          Anonymous

          what are u talking about, radeon has a superior arch to the dead-end arch nvidia has to offer
          it is very simple
          look at die size
          compare two
          look at power drawn
          compare two

    • 2 years ago
      Anonymous

      >7900/7950
      >2~2.2 times the SPs of 6900XT
      >Arch+Node gains
      >higher freq. mem
      I mean its not impossible.

  2. 2 years ago
    Anonymous

    what moron did this garbage come from? 512bit 7970 is a dead giveaway that this is fake when you look at the rest of the chart

    • 2 years ago
      Anonymous

      another thing is that these morons keep putting 2x performance jumps for nvidia/amds leaks
      last time it actually happened was with the gtx 1000 series

      • 2 years ago
        Anonymous

        lol no, that's not a hard thing to believe, AMD has already doubled themselves gen over gen and they're likely gonna do it again.

        I expect nvidia to double as well, seeing as SS8 was a terrible node for clocks/power and they're adding a giant cache to ada lovelace. So if a 3080 keeps up with 6900xt despite no LLC, it's gonna be a game changer for the 4080.

        • 2 years ago
          Anonymous

          5700 XT and 6900 XT were not equivalent products for their respective generations moron

          • 2 years ago
            Anonymous

            They are both the best they have to offer. This image and all of the rumors point towards flagships from both companies being 2x larger. Unless you think nvidia and amd are going to massively frick up performance scaling, there is no reason to doubt a nearly 2x performance increase gen over gen, flagship vs. flagship.

        • 2 years ago
          Anonymous

          They doubled the shaders, clocks, vram and added a bunch of cache. The nickname of the chip is Big Navi. Oh yeah look at the price and how much power it draws. 200% is the bare minimum.

      • 2 years ago
        Anonymous

        We already have 50% increased perf per watt going from RDNA2 to RDNA3 from AMD's own claims. Also considering that it's going to draw 50% more power, you're looking at a 2.25x increase in performance.
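
        That 2.25x is just the two ratios multiplied; a one-liner, with both inputs being claims/rumors rather than measurements:

        perf_per_watt, power = 1.5, 1.5  # AMD's claimed gain, rumored power increase
        print(f"{perf_per_watt * power}x performance")  # 2.25x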

      • 2 years ago
        Anonymous

        RDNA3 unironically has a shot at doing it again
        node shrink plus architectural efficiencies add up

      • 2 years ago
        Anonymous

        AMD has rdna3+node shrink+higher power, while Nvidia is moving to a much more efficient node in tsmc n4 and they're cranking up the power. It's believable that they'll both have a x2 increase in their flagship cards

      • 2 years ago
        Anonymous

        Chiplet design + better node + higher wattage. 2x performance is entirely reasonable.

    • 2 years ago
      Anonymous

      The memory controller and cache are separate chiplets.

      another thing is that these morons keep putting 2x performance jumps for nvidia/amds leaks
      last time it actually happened was with the gtx 1000 series

      It's more than twice the cores and they say it's a 50% perf/watt increase.
      6950xt is 335W TDP and the 7950 looks to be 400-475W.

      5700 XT and 6900 XT were not equivalent products for their respective generations moron

      6900xt has double the cores of the 5700xt and almost exactly double the performance.
      5120 vs 2560.

      Kopite7kimi said the new Ada would do 2.4x 3090

      Highly unlikely. It's a 60% core increase or so and power isn't doubled either.

      why would the 7800 XT have less VRAM than the 6800 XT

      Cost

      >7970
      why are they doing this

      >he doesn't remember THE most legendary GPU of all time

      • 2 years ago
        Anonymous

        >The memory controller and cache are separate chiplets.
        Then why also do the 2x 5120 monolithic setup according to the OP?

        I believe the original rumors a lot more than this moronation. 32gb 512 bit is going to be above and beyond. I *highly* doubt they're going to waste the die space on n31 for the extra IF links they'd need for that full bandwidth.

        • 2 years ago
          Anonymous

          Radeon Pro RDNA3 is 768 bit, anon.
          512 bit is just a cut down 3x 192 bit instead of 2x or 4x.

          >top die for anything less than $2000
          Fake

          Top die isn't listed as under $2000. It probably is $2000 or $1899 at best.

          yeah and every clickbaiting moron on twitter is hyping the new gens as 3x faster, not even MCM Rdna3 is going to get more than 2x

          More than tripling cores, higher TDP, and a node shrink, can certainly get 2x or more even with the losses of infinity fabric.

          • 2 years ago
            Anonymous

            >Radeon Pro RDNA3 is 768 bit, anon.
            >512 bit is just a cut down 3x 192 bit instead of 2x or 4x.
            okay, nostaseronx.

            chnk moron just wish you'd frick off already

      • 2 years ago
        Anonymous

        srs question. Where the frick do I learn more about GPU architecture/featuresets

        • 2 years ago
          Anonymous

          Unironically start at wikipedia, for example https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units
          Every generation has a microarchitecture article link. Go to them and look at the linked sources. NVIDIA publishes whitepapers, AMD has open docs too.

          • 2 years ago
            Anonymous

            Locuza https://www.youtube.com/channel/UCaFk_ygFCffeQhYouGCcAkQ
            Coreteks
            https://www.youtube.com/channel/UCX_t3BvnQtS5IHzto_y7tbw

            Thx anons

        • 2 years ago
          Anonymous

          Locuza https://www.youtube.com/channel/UCaFk_ygFCffeQhYouGCcAkQ
          Coreteks
          https://www.youtube.com/channel/UCX_t3BvnQtS5IHzto_y7tbw

  3. 2 years ago
    Anonymous

    people are going to be massively disappointed when next gen gpus and cpus aren't the massive leap they thought they would be. they have to truly wait for zen 5 and rdna4.

    • 2 years ago
      Anonymous

      Nah nvidia will do just fine even if all they do is die shrink ampere, slap on more L3 cache and call it a day. Just going from samshit 8nm to tsmc 4n would be a 50% performance gain

      • 2 years ago
        Anonymous

        yeah and every clickbaiting moron on twitter is hyping the new gens as 3x faster, not even MCM Rdna3 is going to get more than 2x

  4. 2 years ago
    Anonymous

    three 8 pin and one 12 pin power connection needed? does it come with its own nuclear power plant too?

    • 2 years ago
      Anonymous

      it's not an nvidia card and that's 3x 8-pin OR 1x 12-pin

  5. 2 years ago
    Anonymous

    why would the 7800 XT have less VRAM than the 6800 XT

  6. 2 years ago
    Anonymous

    >128bit bus
    >3080

  7. 2 years ago
    Anonymous

    Probably going to get a R9 7900x+7900xt

  8. 2 years ago
    Anonymous

    >7970
    why are they doing this

    • 2 years ago
      Anonymous

      The simulation is looping. The new mainstream laptop chip from AMD is codenamed Mendocino, just like the old based Intel Celeron.

      • 2 years ago
        Anonymous

        >The simulation is looping
        More like mommy su is taking a fat shit on intel and rewriting over all their history

    • 2 years ago
      Anonymous

      Because it's the 10 year anniversary

  9. 2 years ago
    Anonymous

    Cool I found a good uv oc for my 2080ti and 6900xt to last Aussie summer until new years
    Gonna get a 7950 black
    I love both these gpus but vr just brings them to their knees and rt performance is meh

  10. 2 years ago
    Anonymous

    Kopite7kimi said the new Ada would do 2.4x 3090

    • 2 years ago
      Anonymous

      It will not. It's the same basic architecture as Ampere and adds only 80% more cores. The best they can hope for is 1.8x a 3090. It may be 2.4x better in RTX which is a shit benchmark.

      • 2 years ago
        Anonymous

        >It's the same basic architecture as Ampere
        Kopite7kimi literally said it's not, and he's the source of leaks everyone is using, so I'm not sure where you got that from

        • 2 years ago
          Anonymous

          >A few more tensor cores and a node shrink count as a new architecture
          Even RDNA3 isn't really a new architecture like RDNA was. More of a reshuffling of RDNA2 with minor restructuring to support MCM.

          • 2 years ago
            Anonymous

            RDNA2/3 is mostly the same as RDNA1, they just put a shitload of SRAM on the die to act as cache. That's the only reason why it's so fast.

          • 2 years ago
            Anonymous

            It also clocks higher. 5700 XT + 35% higher boost clock = 6700 XT.

          • 2 years ago
            Anonymous

            >just posting moronic asspulls when reference manuals and programming guides exist
            Infinity Cache offsetting the VRAM hit rate is not what makes RDNA2 fast or power efficient.

  11. 2 years ago
    Anonymous

    the only things they've got right are the memory specs, release date, 7700XT/7800XT/7900XT and the L3 cache.
    AMD is looking to do dual dies early next year as the initial models will all be single die... and reusing the 7950 and 7970 model numbers is pure wank, but would be cool.

  12. 2 years ago
    Anonymous

    >top die for anything less than $2000
    Fake

  13. 2 years ago
    Anonymous

    How many CUDA cores?

    • 2 years ago
      Anonymous

      >using CUDA
      Lol, get back to work, wagie.

  14. 2 years ago
    Anonymous

    don't forget anons, Nvidia is getting off the glorified 10nm that Samsung likes to call 8 and is heading straight to node parity with AMD this time around
    honestly concerned for AMD that they didn't demolish Ampere given rdna2 was like 2 nodes ahead

    • 2 years ago
      Anonymous

      Rdna2 already smacks Ampere in raster in 1080p. The saving grace for Nvidia is they have gddr6X while rdna2 doesn't.

    • 2 years ago
      Anonymous

      Nvidia is getting the node advantage of 4nm and it's going to be slower and run hotter.
      AMD is going to be 6nm and 5nm.

      • 2 years ago
        Anonymous

        >AMD is going to be 6nm and 5nm.
        5nm, and it's basically the same thing as "4nm", they're on the same platform. N4 is just an alternative to the enhanced N5. Like N6 and N7+ there's little difference in performance and power.

        • 2 years ago
          Anonymous

          nope, it's both 6nm and 5nm.
          Navi 33 is probably going to be 6nm, Navi 34 is probably going to be monolithic 6nm

  15. 2 years ago
    Anonymous

    i bought a rx 6600 yesterday
    really great card
    idle/desktop use only 5 watts of power
    max 100w power
    3d mark 11 performance benchmark 27990 points

    when you have an old graphics card don't wait, upgrade now.

    • 2 years ago
      Anonymous

      >when you have an old graphic card dont wait , upgrade now
      No.

  16. 2 years ago
    Anonymous

    trust me tm in 4 months amd will release card 30% faster than 3090 for one quarter of price ! totally real , will happened !

  17. 2 years ago
    Anonymous

    >implying it will get assembled before Taiwan gets absorbed into communist China

  18. 2 years ago
    Anonymous

    Bog standard 30% performance increase, but without any affordable options. I think I don't give a shit.

    • 2 years ago
      Anonymous

      You will never see anything "affordable" again. There's too much money to be made in high end and not enough TSMC capacity to waste on economy options.

    • 2 years ago
      Anonymous

      >twice the ALUs per CU
      >higher clocks
      >higher CU count than prior designs
      >can do 2 64 wavefronts concurrently
      hurrr its only 30% faster!

      • 2 years ago
        Anonymous

        If the prices are accurate and the thing slotting into the 6700xt's price is 30% faster than the 6700xt, then yeah, it's only a 30% improvement.

        • 2 years ago
          Anonymous

          There's no 6700XT there; the 6800XT is the lowest equivalent. The cheapest 6800XT on newegg is $729 while the 7700 is listed at $479

      • 2 years ago
        Anonymous

        At the <$1000 end, yeah it looks to only be around 30% faster.
        It's just better price:performance scaling into the high end.

        Maybe it'll actually wind up being the 7600 instead of the 7700.
        Just hard to imagine them releasing a 7600 that's close to 3090 perf. The die is probably 350-450mm^2.

        So what's the power consumption like?

        Better than Nvidia.

  19. 2 years ago
    Anonymous

    Are the power reqs going to be insane like nvidia's? Cost of ownership is starting to become a thing to consider for these cards.

    • 2 years ago
      Anonymous

      It's literally a non issue unless you're a miner homosexual

      • 2 years ago
        Anonymous

        or a neet playing 24/7

      • 2 years ago
        Anonymous

        I mean if you play vidya for 6 hours a day on a high end machine, you're looking at like $20+ a month just for your PC.
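
        Rough sketch of that estimate in Python, assuming a ~500W system draw and $0.20/kWh (both placeholder numbers, your rates will vary):

        watts, hours_per_day, price_per_kwh = 500, 6, 0.20
        monthly_kwh = watts / 1000 * hours_per_day * 30
        print(f"${monthly_kwh * price_per_kwh:.2f}/month")  # ~$18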

        • 2 years ago
          Anonymous

          >6 hours a day
          You mean like a NEET?

          • 2 years ago
            Anonymous

            stop over sleeping and you'll have time to game

      • 2 years ago
        Anonymous

        It is a problem for thermal management, and the main reason why 300W+ TDP SKUs are unmarketable deadweight.
        You can't get around the laws of physics. Nvidia and AMD are just vainly trying to keep up the illusion of leapfrogging each generation.

    • 2 years ago
      Anonymous

      Higher than current gen, but not as insane as Nvidia.

  20. 2 years ago
    Anonymous

    >7970
    KINO, it was my first GPU when I was 11 lol, used it myself for 5 years then passed it on to a friend and now it's two friends later and still going strong, just had to change one of the fans on it
    legendary gpu, all of my childhood memories were made on it
    AMD really got me in the feels with this one

  21. 2 years ago
    Anonymous

    So what's the power consumption like?

    • 2 years ago
      Anonymous

      Requires an electrical substation.

      • 2 years ago
        Anonymous

        And a fire department

        • 2 years ago
          Anonymous

          not true. my 3070 runs very cool under full load. never above 59°C.

          maybe don't buy the cheapest PowerColor card

        • 2 years ago
          Anonymous
          • 2 years ago
            Anonymous

            >no driver
            kek

  22. 2 years ago
    Anonymous

    EVERY. FRICKING. GEN.

  23. 2 years ago
    Anonymous

    >5700 XT
    >256bit
    >$399 launch price

    >6700 XT
    >192bit
    >$479 launch price

    >not even XT
    >128bit
    >same launch price as the already overpriced prev-gen XT

    The way things are progressing the 8700 will manage to be 64bit and have a launch price of $649

    Remember the good old days before crypto was a thing?
    Mid-range was actually mid-range; Cards were readily available; Prices were fluctuating between 200 and 300 bucks; You didn't need a power plant to run a high end system... too bad those days are long gone and never to return.

  24. 2 years ago
    Anonymous

    I see they adopted the Nvidia model of releasing a 3090 ti super deluxe edition now

    • 2 years ago
      Anonymous

      >I see they adopted the Nvidia model of releasing a 3090 ti super deluxe edition now
      Did you forget dual gpu cards and the 7970 GHz 6GB or the 6950xt that just came out?

      • 2 years ago
        Anonymous

        He's most likely a zoomer.

        • 2 years ago
          Anonymous

          >He's most likely a zoomer.
          Zoomers don't use 9yo gpus? They're around a 1060 and cheaper

  25. 2 years ago
    Anonymous

    >3x 8-pin power
    >12-pin power
    what the actual frick
    before long you'll need a dedicated circuit for your computer with how insane things are getting

    • 2 years ago
      Anonymous

      I only see it getting worse since they'll eventually be unable to shrink the node.

      • 2 years ago
        Anonymous

        There are already factories being built in Japan that will use gallium to make 2nm chips.

        • 2 years ago
          Anonymous

          Gallium is way too rare for widespread use like silicon is.

        • 2 years ago
          Anonymous

          Really? Source? I feel like this would be big news.

    • 2 years ago
      Anonymous

      >what the actual frick
      These are top end cards pushing the architecture as far as it can go. These architectures are incredibly power efficient if you're okay settling at RTX 3080-ish levels of performance.

    • 2 years ago
      Anonymous

      First off it's either 3x 8-pin or 1x 12-pin, but yea power consumption is getting out of hand

      • 2 years ago
        Anonymous

        Stop looking for power efficiency at the highest end you morons. No one expects a Titan to be power efficient besides you two.

        • 2 years ago
          Anonymous

          The more expensive cards tend to have better fps:watt, actually.
          I'm sure someone has a graph.

        • 2 years ago
          Anonymous

          If the top end cards are pushing 600w then that shit will trickle down and we'll have 500w 4080s you moron

          • 2 years ago
            Anonymous

            1X 8-pin for the 7700 which is equivalent to a 3080. What space age technology do you think AMD has to deliver 500W over a single 8-pin?

          • 2 years ago
            Anonymous

            I was talking about Nvidia. Also the new ATX 3.0 spec can deliver 600W per connection

          • 2 years ago
            Anonymous

            Nobody cares about a $479 7700 AMD shill. Shoo shoo.

          • 2 years ago
            Anonymous

            >no one cares about a 3080 for half the price with better RT performance
            moron

          • 2 years ago
            Anonymous

            B-but muh COODA and muh dee el ess ess

    • 2 years ago
      Anonymous

      >before long
      seems you're still living in the good old days

      • 2 years ago
        Anonymous

        not possible for amerimutts

        • 2 years ago
          Anonymous

          It is, just gotta have some electrical work done beforehand :^)

          • 2 years ago
            Anonymous

            >nooo my landlord wont let me get a 220v outlet for my gaymerinos, how dare you require such high power GPUs

          • 2 years ago
            Anonymous

            Why the frick would I ask them for permission, just get it done.

            3070 has worse performance. 3080ti and 3090 literally where do I get those from? I have to wait for the 4000 series and hope the prices are acceptable and the availability exists.

            See if anyone is interested in swapping their 3080 for your 6900, you might get lucky.

          • 2 years ago
            Anonymous

            >swapping their 3080 for your 6900

            Is that really worth it though? the 3080's VR performance is more or less the same. I guess I'll be able to play Dragon's Dogma, but I feel like I should wait for the 4000 series then sell this when I get a new card. Hopefully the prices aren't fricked and you can actually get one, but who knows, because I live in yurop after all.

          • 2 years ago
            Anonymous

            You can buy and install a small 220v transformer the size of a coffee machine at your house. I have one for my AC.

  26. 2 years ago
    Anonymous

    I have a 7970 already, it's pretty good

  27. 2 years ago
    Anonymous

    I really regret getting a 6900xt because the drivers and VR performance are awful, so I will not fall for it again.

    • 2 years ago
      Anonymous

      Then sell it and buy a 3070 or whatever you can with the money you get

      • 2 years ago
        Anonymous

        3070 has worse performance. 3080ti and 3090 literally where do I get those from? I have to wait for the 4000 series and hope the prices are acceptable and the availability exists.

    • 2 years ago
      Anonymous

      Really? I'm extremely happy with my 6900XT. 1:1 3090 rasterization performance and better Linux drivers for less than half the price of the 3090 after markup.

      I got my 6900XT for MSRP ($999). A 3090 at the time cost $2600 and nowadays it's still almost double at $1950

      • 2 years ago
        Anonymous

        VR performance on AMD is shittier due to devs only supporting Nvidia since they dominate the market, but they're fine for regular gayming

        • 2 years ago
          Anonymous

          I have an Index running at 120hz supersampled and I have no idea wtf you're talking about anon

          • 2 years ago
            Anonymous

            I'm talking about the quest and all the new meme VR devices being shilled lately

          • 2 years ago
            Anonymous

            I have a Reverb G2 and I'm trying to play simracing games. Obviously an AMD card would be enough for your VR chat on an index.

          • 2 years ago
            Anonymous

            I'm playing Onward/Pavlov/Elite Dangerous/Into the Radius/Sairento/Project Cars 2 bro
            >works on my machine

          • 2 years ago
            Anonymous

            Yeah you're playing easy to run games on the best VR headset currently. Good job. Happy for you.

          • 2 years ago
            Anonymous

            my rx 580 runs pavlov and onward no problem, you're just an idiot that doesn't actually play VR

          • 2 years ago
            Anonymous

            moron

      • 2 years ago
        Anonymous

        Linux sure, but I don't use it. I only got it because it was what's available shortly after release and the prices weren't jacked up yet. I was looking for a 3080 but couldn't get one, so I just sniped the first thing I could. Can't play Dragon's Dogma well, since it's a DX9 game, recording stutters no matter what settings or software I use, VR performance is literal dogshit for what I want to do at least. Before that I had a 1660ti and I could play Dragon's Dogma and record with no issues. Why can't I with a 6900xt? I only upgraded because I wanted to get into simracing in VR and my poorgay friend needed a card, so I gave him the 1660ti, otherwise I'm literally stuck here playing HOMM3 on a 6900xt. What a waste. Frick AMD.

        VR performance on AMD is shittier due to devs only supporting Nvidia since they dominate the market, but they're fine for regular gayming

        It's actually the bus speed, but whatever.

  28. 2 years ago
    Anonymous

    >7800XT
    >192b Bus
    They pulled that one out of their arse.

    • 2 years ago
      Anonymous

      sorry sweetie, if you want more bus you gotta buy the 7950 XT TI super 16GB rev 2.0

    • 2 years ago
      Anonymous

      gddr6x and 50% more cache should somewhat make up for it.

      >starting at $500
      i do not fricking believe they profit more by excluding half a billion of people from buying their GPUs

      Hundreds of millions of people don't buy GPUs.
      # of GPUs sold in 2021 was 50 million iirc. A lot of those were to miners, I think well over 15 million.

      • 2 years ago
        Anonymous

        >Hundreds of millions of people don't buy GPUs.
        but they CAN buy GPUs, i don't see why they want to miss out on this big of a market

        • 2 years ago
          Anonymous

          Exactly, only a fool wouldn't realize the goldmine that has descended
          I've ordered several GPUs for when crypto goes back up and mining skyrockets and I'm going to buy more every time I get paid
          Given the previous bull run, it's only a matter of time before I can double my money

        • 2 years ago
          Anonymous

          6600 is perfectly affordable for a billion people.
          2.5x the perf of a gtx 1060.
          All that 99% of people need.
          People not buying it comes down to choice and ignorance.

          • 2 years ago
            Anonymous

            it's as expensive as the 1070 was when it came out, and a lot of people would rather buy a second hand gpu that matches the performance for 1/2 of the price. $400 really is not a price everyone can afford
            >2.5x the perf of a gtx 1060.
            2.5x faster than a 5 year old GPU and 1.5x as expensive
            >6600 is perfectly affordable for a billion people.
            i was obviously talking about the bottom 1-3 billion people, especially people in developing countries. this is a huge market now and will be an even bigger one in the future. i understand that the margins they can make are much smaller but i really don't understand why they are deciding to just abandon them

            Exactly, only a fool wouldn't realize the goldmine that has descended
            I've ordered several GPUs for when crypto goes back up and mining skyrockets and I'm going to buy more every time I get paid
            Given the previous bull run, it's only a matter of time before I can double my money

            wouldn't buying coins be more profitable?
            anyway, you probably know a thing or two about mining: where the frick are all of those post-mining polaris/1060s? i thought no one was going to buy them because muh mining destroyz gpuz!! but even the ones without video output are still more expensive than when they were new

          • 2 years ago
            Anonymous

            People who don't know how inflation works should honestly kill themselves.
            The 1070 was $449 msrp when you ignore the fake MSRP.
            $449 adjusted for inflation is almost $600, twice as much as an RX 6600, you COLOSSAL FRICKING MORON.
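
            Back-of-envelope version in Python, assuming ~4.5% average yearly inflation over the six years since the 1070 launched (the rate is a rough assumption, not official CPI):

            msrp_2016, years, rate = 449, 6, 0.045
            adjusted = msrp_2016 * (1 + rate) ** years
            print(f"${adjusted:.0f}")  # ~$585, vs ~$280 for an RX 6600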

          • 2 years ago
            Anonymous

            we don't live in the same country so the details are going to vary, but you still managed to miss my point quite well. i'm impressed by how fricking moronic you are, i bet your parents are proud of the massive imbecile they've raised

          • 2 years ago
            Anonymous

            The MSRPs are announced in USD, fricktard.
            Just because you pinoys steal GPUs and sell them cheaper doesn't make that the price.

          • 2 years ago
            Anonymous

            you're not very bright are you

          • 2 years ago
            Anonymous

            >1070 was $449 msrp when you ignore the fake MSRP

            >>gtx 1070 was $449 (/w fake $379 msrp that the FE and no other cards on launch sold for)
            I’m still using a basic b***h ASUS non-FE GTX 1070 I bought at MSRP. No one forces you to buy the quadruple stacked fan 5X OC FROZR RGB Hyper Ultra MAX cards that add 20MHz to the base clock. And even if that’s the case now, just buy an FE.

          • 2 years ago
            Anonymous

            >where the frick are all of those post-mining polaris/1060s
            Many are for sale in the channels I frequent at good prices, but you won't find these guys selling on ebay or other public sale sites
            It's a buyers market, but I don't think it'll last more than a couple months

          • 2 years ago
            Anonymous

            Crypto GPUs are expensive because of the people who own them. They're just buttholes. Also about 80% of Ethereum was mined from China at peak, and it's still in the 70s percent range, so they're triple buttholes.

  29. 2 years ago
    Anonymous

    >starting at $500
    i do not fricking believe they profit more by excluding half a billion of people from buying their GPUs

  30. 2 years ago
    Anonymous

    Damn, even GPUs are getting blacked nowadays.

  31. 2 years ago
    Anonymous

    >7950 Blacked
    >7970 Blacked

    why make a fake pic and insert an obvious lie, is it some sort of filter for low iq gaymen?

    • 2 years ago
      Anonymous

      AMD has a tradition of using "Black Edition" as a designation for fully unlocked SKUs. Although I think it just means power in this context, because being able to OC is the norm in the GPU world.

  32. 2 years ago
    Anonymous

    I bought an AMD GPU once, I will never make that mistake again

  33. 2 years ago
    Anonymous

    AMDchads I kneel....

  34. 2 years ago
    Anonymous

    >7970 black
    >the hd 7000 series was released 10 (ten) years ago

    • 2 years ago
      Anonymous

      AMDCHADS WE'RE GOING HOME

  35. 2 years ago
    Anonymous

    >low-end GPU at almost $500
    They can frick off.

    • 2 years ago
      Anonymous

      Now look at Nvidia's

    • 2 years ago
      Anonymous

      There are no low end cards on that chart.

      No, the lowest part of the chart does not mean low end. Same as how the Geforce 16 and 20 series have no low end models.

  36. 2 years ago
    Anonymous

    mental illness thread

    • 2 years ago
      Anonymous

      And yet you're here, curious.

  37. 2 years ago
    Anonymous

    >7950
    >7970
    SOVL

  38. 2 years ago
    Anonymous

    I had a gts 250 and recently upgraded to a gtx 660. Laugh all you want, I'm happy with it. I can play games I wanted to play I never could before. It's a great feeling.

    • 2 years ago
      Anonymous

      Kid... go to Fiverr or DeviantArt. Draw some coom art and you'll be able to at least afford a 1080 ti like a normal human being.

  39. 2 years ago
    Anonymous

    Bought a Steam Deck and I don't need more.
    >7950 Blacked Edition
    >Eleven-hundred Petrodollars
    >It's not even the top end SKU
    Holy Jesus Lord have mercy, what the frick are you doing Lisa?

    • 2 years ago
      Anonymous

      >a Steam Deck
      Don't remind me

    • 2 years ago
      Anonymous

      Waiting for Steam Deck 2 here.
      Really needs a replaceable OLED screen, 20% larger battery, and to be like at least 15% lighter.

      original 7970 is usable even to this day

      eh, there's a lack of features that holds it back. It's still well below a RX 580 or GTX 1060 and the power consumption is AWFUL. It uses 100w minimum with 2 monitors

      >Completely fake

      Big RDNA3 will likely have 40-50% at best over Big RDNA2 in most cases. It'll only be much faster at ray-tracing stuff, perhaps about on par with Ampere but falling short of Ada.
      Most of the micro-architecture improvements will be focused on ray-tracing stuff.
      >MAH MULTI-DIE CHIP
      We will have to go back to multi-card/GPU rendering again for this to be feasible. CF/SLI support has been neglected for nearly 10 years.

      lmao. You really think 3x the cores, a 30%+ higher TDP, and a node shrink are only going to be 50% faster.
      You're delusional.
      Just a node shrink alone would be a 30% performance increase.

      • 2 years ago
        Anonymous

        >lmao. You really think 3x the cores, a 30%+ higher TDP, and a node shrink are only going to be 50% faster.
        >You're delusional.
        >Just a node shrink alone would be a 30% performance increase.
        You are delusional kiddo. The days of easy gains have been long over since Kepler. The wall of diminishing returns has been scaling higher since.
        300W+ chips are non-sellers for the vast majority of the market.
        This fake leak implies that multi-chip packages are coming in the consumer space (CF/SLI are coming back)

        • 2 years ago
          Anonymous

          Nuuuuuu we just need to reach below 1 nanometer architecture and then the secrets of the universe will unlock for us!! /sarcasm

          Realistically, what will happen when we go below 1nm?

          • 2 years ago
            Anonymous

            There is a lot of room for area and feature size scaling yet, and tens of billions of dollars are being spent ensuring these future processes continue to yield substantial improvements

  40. 2 years ago
    Anonymous

    original 7970 is usable even to this day

    • 2 years ago
      Anonymous

      not true, now that we live in the always-online, constantly-pushing-updates world of 2022 every game has implicitly gotten harder to run as time goes on.

  41. 2 years ago
    Anonymous

    >Completely fake

    Big RDNA3 will likely have 40-50% at best over Big RDNA2 in most cases. It'll only be much faster at ray-tracing stuff, perhaps about on par with Ampere but falling short of Ada.
    Most of the micro-architecture improvements will be focused on ray-tracing.
    >MAH MULTI-DIE CHIP
    We would have to go back to multi-card/GPU rendering for this to be feasible. CF/SLI support has been neglected for nearly 10 years.

    • 2 years ago
      Anonymous

      >Most of the micro-architecture improvements will be focused on ray-tracing stuff.
      Why talk out of your ass when compiler patches already detailed RDNA3? Why do you choose to be a frickup moron who posts worthless embarrassing nonsense? 128 ALUs per CU up from 64. Literally a doubling of execution resources that have nothing to do with ray tracing. Dumbfrick.

      • 2 years ago
        Anonymous

        >Theory-crafting this hard without official white papers
        Keep setting yourself up for more disappointment by buying into the hype.

        • 2 years ago
          Anonymous

          >a compiler commit is "theory crafting"
          Gay moron.

          • 2 years ago
            Anonymous

            >kiddie getting told this hard and doesn't want his dreams get dashed

    • 2 years ago
      Anonymous

      >RDNA3 will likely have 40-50% at best over Big RDNA2 in most cases
      You posted the same cope when RDNA2 was about to launch lmao.

  42. 2 years ago
    Anonymous

    The 3060 is already "let me guess" tier for me. I just wanted a 2060S with HDMI 2.1, and got it. Nugames suck and it runs nugames fine and older games fantastically.

  43. 2 years ago
    Anonymous

    >shills get increasingly irate as anons would rather talk about nvidia in an amd thread: the thread

    • 2 years ago
      Anonymous

      no one cares about Nvidia anymore

  44. 2 years ago
    Anonymous

    >Next Gen is all 400W+ housefires on all sides
    Yawn. I'll stay on my 1080 until they decide to make actual efficiency improvements and they can give me more than double my current performance at 200W or less.

    • 2 years ago
      Anonymous

      Get a 7700 then

    • 2 years ago
      Anonymous

      RDNA3 improves perf/watt by over 50% compared to RDNA2.

    • 2 years ago
      Anonymous

      Isn't an RX 6600 like 30% faster than a 1080 while using half the power?
      I know it's 2.5x the performance of the 1060 while only using 20 watts more power.

      So the 7600 should be closer to 60% faster.
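      Taking those numbers at face value, here's a minimal perf-per-watt sketch in Python (the board-power figures are rough assumptions, not measurements):

          # Assumed typical board power: GTX 1080 ~180 W, RX 6600 ~132 W.
          gtx1080_perf, gtx1080_w = 1.0, 180.0
          rx6600_perf, rx6600_w = 1.3, 132.0  # "about 30% faster", per the claim above

          ppw_gain = (rx6600_perf / rx6600_w) / (gtx1080_perf / gtx1080_w)
          print(f"RX 6600 vs GTX 1080 perf/W: {ppw_gain:.2f}x")  # ~1.77x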

  45. 2 years ago
    Anonymous

    did anything ever come of those leaked drivers a couple weeks ago? did they actually fix amd's shitty opengl performance on windows?

  46. 2 years ago
    Anonymous

    Brave of AMD to re-release the HD7970 over a decade later.

  47. 2 years ago
    Anonymous

    Can someone explain to me like I'm fricking brain damaged why the memory bus/bandwidth seems to be getting smaller each gen?

    • 2 years ago
      Anonymous

      Providing very high bandwidth from DRAM is costly, both in dollars and in power consumption.
      AMD's solution to this is what they call Infinity Cache. It is essentially a large slab of L3 SRAM used to lessen the need for off-die dedicated video memory like HBM or GDDR6/X, all of which are built from DRAM that is orders of magnitude slower than on-die SRAM.
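      As a back-of-envelope illustration of why the cache helps, assume some fraction of memory requests hit the on-die SRAM (a Python sketch; the bandwidth and hit-rate numbers are illustrative assumptions, not official figures):

          def effective_bandwidth(dram_gbs, cache_gbs, hit_rate):
              # Average bandwidth seen by the GPU when hit_rate of accesses
              # are served by the on-die cache and the rest go out to DRAM.
              return hit_rate * cache_gbs + (1.0 - hit_rate) * dram_gbs

          # Illustrative 6900 XT-class numbers: 512 GB/s GDDR6,
          # ~2 TB/s L3, ~58% hit rate at 4K.
          print(effective_bandwidth(512, 2000, 0.58))  # ~1375 GB/s

      That's how a modest 256-bit GDDR6 bus can behave like a much wider one, as long as the working set mostly fits in the cache.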

  48. 2 years ago
    Anonymous

    I can't afford the nuclear reactor required to operate a 2023+ home computer

  49. 2 years ago
    Anonymous

    Fake. No company launches the entry level before the flagship. Also, 8GB of VRAM when they already have the 6700 with 12 is dumb.

    • 2 years ago
      Anonymous

      7700 is mainstream, not entry level. Words have meanings

      • 2 years ago
        Anonymous

        It still makes no sense to launch it before the flagship or XT versions. AMD does sandbag, but sandbagging to this extent would be beyond stupid. Nobody wants the slower product first.

        • 2 years ago
          Anonymous

          They're first to market with chiplet GPUs, so they're hitting issues that nobody has seen before. It takes time to iron those out before release.

  50. 2 years ago
    Anonymous

    >No 200-300$ stuff
    Not interested

    • 2 years ago
      Anonymous

      Buy 6600 then you poorgay.

      • 2 years ago
        Anonymous

        >200-300$ stuff
        >Posts about 350$ garbage
        kys

        • 2 years ago
          Anonymous

          >turd worlder
          Not my problem.

  51. 2 years ago
    Anonymous

    Don't care
    I just want a 6600 to drop to 250€

    • 2 years ago
      Anonymous

      >he doesn't know

      • 2 years ago
        Anonymous

        Look I'm yuropean, I don't know where to look. All I find is these for like 329€.

  52. 2 years ago
    Anonymous

    >release marketing numbers
    >but it's a super secret leak 😉

  53. 2 years ago
    Anonymous

    This is without a doubt one of the most moronic threads I've seen on IQfy.
    Both the OP and most posters are wrong about every word they've posted.
    AMD has been on a huge leaker trolling campaign and it has truly gotten out of hand. N31/32 are single GCD, with chiplet MCDs which contain the L3 cache and memory controllers on N6; the GCDs are N5HPC. N33 is monolithic N6.
    RDNA3 has double the ALUs per CU/WGP, so double per unit. RDNA3 is always 4 WGP/SA, unlike N21/22's 5 WGP/SA; the 4-WGP config is better for PPW and PPA.
    RDNA3, like Zen4, clocks much higher than its predecessor; 3GHz is comfortable.
    RDNA3 is arguably as big a change overall as GCN>RDNA1, though the perf benefits will be larger.
    In general, N33 is 2 SE, N32 is 4 SE, N31 is 6 SE. The PPW gain is >50%, meaning N33 is just over 50% and quite a bit less efficient than the rest of the lineup due to the node.
    The uarch changes are going to benefit a plethora of things, whether that is RT, raster, or compute; it is all so much more advanced than RDNA1/2. Software is getting really refined in prep for this launch, with various fixes to longstanding weaknesses, and the full compute stack will be supported at launch.
    Infinity Cache is larger, with 32+ MB and a 64-bit MC per MCD, 1 MCD per SE.
    AMD has plenty of options for segmentation and yield management: die sizes are reasonable, the nodes yield brilliantly, and the packaging is advanced but rather cheap all things considered. AMD will be more efficient on the same node for the first time since goddamn Terascale 2. The peak perf crown is most likely a wash, but one side reaches peak perf at far less power.
    AMD has the more competitive product; it is their 5870 reborn.
    And well, perhaps there is that chance that we see well above 2x performance this time around, in which case this is their best card since the true GOAT, the legendary R300. To surpass even the almighty G80's ~2.2x generational leap: can it really be done today?
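    For what it's worth, the bus and cache arithmetic that config implies is easy to check (a minimal Python sketch; the SE counts and per-MCD figures are the leak's claims above, not confirmed specs):

        MCD_BUS_BITS = 64  # one 64-bit memory controller per MCD (claimed)
        MCD_L3_MB = 32     # "32+ MB" of L3 per MCD, so treat it as a floor

        # 1 MCD per shader engine on the chiplet parts; N33 is monolithic.
        for die, se_count in {"N32": 4, "N31": 6}.items():
            print(f"{die}: {se_count * MCD_BUS_BITS}-bit bus, "
                  f">= {se_count * MCD_L3_MB} MB Infinity Cache")
        # N32: 256-bit bus, >= 128 MB; N31: 384-bit bus, >= 192 MB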

    • 2 years ago
      Anonymous

      Impressive. I might be able to buy a $350 4060ti if amd delivers.

      • 2 years ago
        Anonymous

        Well, if one thing is true, AMD wants to increase gross margins with each generation.
        Each tier will cost a bit more than the last due to rising wafer costs; the main thing is AMD will probably add 1-2 more tiers above what they currently offer on the high end, while basically abandoning the lowest 1-2 tiers.
        AMD is supply constrained; they would price more reasonably if they had the volume to sell.
        Of course, we have the normal prep work happening on both sides: both the 6X50XT and the 3090 Ti exist to make the next gen look better than it otherwise would, whether in perf-per-dollar or perf-per-watt. Nvidia will win this generation on the simple virtue that AMD does not have room to make as many cards, and it will stay that way for another 3-4 years.
        But AMD can claim a win via consoles and any CPU with an iGPU.

    • 2 years ago
      Anonymous

      okay but if they don't bring back the 7970 I'm not interested

  54. 2 years ago
    Anonymous

    It would be badass if they did a run of cards with HBM again. I miss my Radeon VII.

    • 2 years ago
      Anonymous

      HBM is elegant, but big cache + GDDR is a tad better for most client stuff, though it does use more power.
      It is also more economical now that AMD can use EFB instead of CoWoS, but the demand for HBM is nuts; keep it for HPC, that is where it shines.

    • 2 years ago
      Anonymous

      Reserved for dickyDNA.

  55. 2 years ago
    Anonymous

    Nvijeet cope. RDNA2 was just the beginning.

    • 2 years ago
      Anonymous

      >shit drivers
      Sad!

      • 2 years ago
        Anonymous

        >shit cards
        Maybe next time nvijeet))

  56. 2 years ago
    Anonymous

    rocm won't work

  57. 2 years ago
    Anonymous

    https://twitter.com/Kepler_L2/status/1537375541686329345
    This is the most rational lineup.

    • 2 years ago
      Anonymous

      >RX 7970 XT (Navi31 XTX) 48 WGP/96 CU/12288 FP32, 24 GB GDDR6, 384 MB IC
      >RX 7950 XT (Navi31 XT) 42 WGP/84 CU/10752 FP32, 24 GB GDDR6, 192 MB IC
      >RX 7900 XT (Navi31 XL) 35 WGP/70 CU/8960 FP32, 20 GB GDDR6, 160 MB IC
      >RX 7800 XT (Navi32 XTX) 32 WGP/64 CU/8192 FP32, 16 GB GDDR6, 128 MB IC
      >RX 7800 (Navi32 XT) 28 WGP/56 CU/7168 FP32, 16 GB GDDR6, 128 MB IC
      >RX 7700 XT (Navi32 XL) 24 WGP/48 CU/6144 FP32, 12 GB GDDR6, 96 MB IC
      >RX 7600 XT (Navi33 XT) 16 WGP/32 CU/4096 FP32, 8 GB GDDR6, 64 MB IC
      >RX 7600 (Navi33 XL) 14 WGP/28 CU/3584 FP32, 8 GB GDDR6, 64 MB IC
      I'll personally add that price is probably from $400-$2000 and perf relative to a 6900XT is 0.9-2.2+?x
      Those are just safe personal estimates.

      42 to 48 is too small of a jump on the high end.
      I don't think the cache bump would be enough to make it scale.

      It's pretty pointless to use chiplets unless you're getting good scaling.
      Why Ryzen was so good is that the 16-core actually scaled 2x over the 8-core (at least if you also had better binning or a higher TDP).
      If you don't scale the bus with the cores while scaling the cores hard, it's dumb.

      Also, it's 16384 cores maximum. Why the FRICK would they only do 12288 for the 7970?

      The bad timeline is that the high end is actually 8192 max and the 2-GPU card is Radeon Pro only, but I've heard for years that's not true.
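      The FP32 numbers in that list all follow from the rumored layout of 2 CUs per WGP and 128 ALUs per CU (a quick Python check; the per-CU figure is the leak's claim, not a confirmed spec):

          def fp32_lanes(wgp, cus_per_wgp=2, alus_per_cu=128):
              # Total FP32 ALUs for a given WGP count under the rumored layout.
              return wgp * cus_per_wgp * alus_per_cu

          print(fp32_lanes(48))  # 12288, matching the 7970 XT row
          print(fp32_lanes(64))  # 16384, the full-die maximum mentioned above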

  58. 2 years ago
    Anonymous

    YOU SEE IN MY HEADCANON 5999XT IS ON PAR WITH 6900XT

    • 2 years ago
      Anonymous

      >YOU SEE IN MY HEADCANON 5999XT IS ON PAR WITH 6900XT
      6900xt guy here, what's this? My GPU isn't even a flagship anymore now that the XTXH and 6950XT are a thing.

      >IESLB
      5700XT is the RDNA1 flagship; your imaginary GPUs are irrelevant. Stay mad, braindead zoomer.

      What's wrong, dickhead? I'm 32, my first ATI GPU was a 9600 two decades ago, don't pull age on me, Black person

      • 2 years ago
        Anonymous

        XTXH was a nonexistent meme. 6950XT is the flagship, but that came out later; it's more like RDNA 2.5 since it also uses different memory.

        • 2 years ago
          Anonymous

          Figured as much, it's literally just an unlocked BIOS.
          Gave up OC after a month, put it back to stock, and I'm enjoying 250W 3080 perf.

          >THIS FANFICTION RDNA1 CARD IS ON PAR WITH 6900XT BRO
          He mad
          I dunno why anyone would want an RDNA1 GPU; it was literally a beta test for RDNA2, which was a stepping stone to RDNA3 MCM.

          • 2 years ago
            Anonymous

            rdna1 is already 3 years old

          • 2 years ago
            Anonymous

            >rdna1 is already 3 years old
            Good, it was just as bad as the Radeon VII.
            Hope it gets memory-holed.

            >It's also 6x cost
            And whose fault is that?
            Fricking miners, greedy AMD, TSMC

        • 2 years ago
          Anonymous

          I guess the HD7970 Toxic 6GB is GCN 1.5 then.

          • 2 years ago
            Anonymous

            1.488

  59. 2 years ago
    Anonymous

    THIS FANFICTION RDNA1 CARD IS ON PAR WITH 6900XT BRO

  60. 2 years ago
    Anonymous

    >7950 BLACKED
    dropped

    • 2 years ago
      Anonymous

      I'm not sure if this is ironic /misc/ posting and I've been baited or if you're actually moronic. Kudos.

      • 2 years ago
        Anonymous

        Knowing the state of poltroons it was probably actual moronation.

  61. 2 years ago
    Anonymous

    Where's my RDNA3D?

    • 2 years ago
      Anonymous

      This would be really cool. Just a few GB of Infinity Cache would make for a monstrous card.

      • 2 years ago
        Anonymous

        >a few GB of SRAM
        Anon, you're drunk

        • 2 years ago
          Anonymous

          The 6900xt already has 128MB. The 7970xt will supposedly have 512MB.

          • 2 years ago
            Anonymous

            >this one card had 128MB
            >therefore several GB is totally feasible

          • 2 years ago
            Anonymous

            The cache is on the chiplets: a 64-bit bus and 64MB of cache per chiplet.
            8 of those chiplets = a 512-bit bus and 512MB of cache.

            384MB and a 384-bit bus are more likely though, at least for a consumer card.

          • 2 years ago
            Anonymous

            Imagination is the only limit.

          • 2 years ago
            Anonymous

            1488MB when

  62. 2 years ago
    Anonymous

    damn the 7700 sounds kinda based if true. i'd consoom if cryptocels are gonna keep getting fricked, probably not gonna happen though unfortunately

  63. 2 years ago
    Anonymous

    Here's your nvidia gpu bro.

  64. 2 years ago
    Anonymous

    >you're... le poor
    why do amerimutts think this is a legitimate argument outside of their kaiji-tier dystopia?

    • 2 years ago
      Anonymous

      Cope

      • 2 years ago
        Anonymous

        not a cope + it is your problem since during the climate disaster hundreds of millions of thirdiechads are going to enter your c**t without permission you fricking suburbanoid subhuman

        • 2 years ago
          Anonymous

          Nice projection, coping turd worlder.

          • 2 years ago
            Anonymous

            gpt-tier response

  65. 2 years ago
    Anonymous

    I just wish they'd fix their fricking drivers.
    I don't care about gayming, but for the things a GPU can help with besides that (like AI or video editing), Nvidia is usually more worth it since their drivers are actually supported.
    AMD should improve their drivers and step up their marketing, because I've never met someone with an AMD GPU (even if they're more worth it for gayming).

    • 2 years ago
      Anonymous

      Suddenly everyone on the planet is a video editor.

  66. 2 years ago
    Anonymous

    I did hear that top RDNA3 is intended to be 6x a 5700XT.
    That lines up with the ALU count being 6x higher; process, clocks, and uarch get the scaling to that level.
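    As a sanity check on that multiplier, the shader-count arithmetic alone gets close (assuming the rumored full 64-WGP die at 128 ALUs per CU):

        # 5700 XT: 2560 FP32 lanes. Rumored full RDNA3 die: 64 WGP x 2 CU x 128 ALU.
        print((64 * 2 * 128) / 2560)  # 6.4x the ALUs of a 5700 XT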

    • 2 years ago
      Anonymous

      It's also 6x cost

      • 2 years ago
        Anonymous

        Pay up goy.

      • 2 years ago
        Anonymous

        rarted

  67. 2 years ago
    Anonymous

    hi guys, TDPlet here
    should i buy one of those intel "k" cpus marketed as 125w or a non k marketed as 65w
    will the k result in higher temperatures?

    • 2 years ago
      Anonymous

      Yes. Buy the 65w

  68. 2 years ago
    Anonymous

    I'm just waiting for the rt performance

  69. 2 years ago
    Anonymous

    I don't like the lower L3 cache and VRAM for the mid-low end. Same with the bus width.

    Also the MSRP is getting out of hand.

  70. 2 years ago
    Anonymous

    Wake me up when there's a 1440p-capable card at <$300 with more VRAM than an RX 480.
    It won't happen and I'll stay salty about it. Might buy something used if current-gen cards get dirt cheap, like <$100, in the coming couple of years. All I need is something capable of running Elden Ring at 1440p, but I don't want to overpay for something that'll be outdated really fast due to lack of VRAM (Ampere) or RT performance (RDNA2).

    • 2 years ago
      Anonymous

      They are really gimping sub $400/300 GPUs on purpose.
      >NVIDIA/AMD : "What are you gonna do about it? not buy a GPU?"

  71. 2 years ago
    Anonymous

    I'm not buying a graphics card using over 250w

  72. 2 years ago
    Anonymous

    [...]

    >replies with more projection
    No more (You)s for NPCs.

    • 2 years ago
      Anonymous

      >look look i found a new reddit word to spam

  73. 2 years ago
    Anonymous

    If you round up these prices and change the $ to a € you get what these will cost in Europe.

  74. 2 years ago
    Anonymous

    [...]

    >NPC IESLB
    I accept your concession.

    • 2 years ago
      Anonymous

      >more buzzwords
      >tries to force a "concession" like a debateBlack person
      >calls others npcs
      why do amerimutts lack self-awareness? is there something in the burgers that causes this degree of moronation?

  75. 2 years ago
    Anonymous

    [...]

    >NPC IESLB 2: electric boogaloo

    • 2 years ago
      Anonymous

      kys dbsshitter

      • 2 years ago
        Anonymous

        Vacate discorddit

  76. 2 years ago
    Anonymous

    >IQfy posters struggle with basic maths
    Impressive. This is the power of CS.

    • 2 years ago
      Anonymous

      We had the same issue with Ryzen 7000 and the workload-speed comparison Lisa Su showed against a Ryzen 9 5950X: people tripped over the terminology of faster vs. quicker and units of time. Same issue here. Kind of sad for self-professed tech followers, but it's not surprising since a lot of IQfy is underage.

  77. 2 years ago
    Anonymous

    If these are chiplet GPUs then neat.
    If these are crossfired GPUs on the same pcb then lol, lmao even.

    • 2 years ago
      Anonymous

      Chiplet

    • 2 years ago
      Anonymous

      Proper chiplet, single device.

  78. 2 years ago
    Anonymous

    I don't care, I'm on a 1660 with a thin wallet
