What move could AMD make against Nvidia that would essentially be “going for the jugular”?

  1. 1 month ago
    Anonymous

    They'd have to be able to reach it first.

  2. 1 month ago
    Anonymous

    slash prices by $150 across the board

    • 1 month ago
      Anonymous

      AMD has tried undercutting Nvidia in the past and it gained them nothing. People will still buy Nvidia cards anyway and justify it to themselves with one of a list of reasons (ray tracing, drivers, features, "stability", etc).

      • 1 month ago
        Anonymous

        They don't undercut enough is the problem.
        $50-100 off Nvidia's pricing for similar performance cards is not a huge incentive.

      • 1 month ago
        Anonymous

        you forgot the biggest reason by a mile, especially on this board: bragging rights

      • 1 month ago
        Anonymous

        >AMD has tried undercutting Nvidia in the past and it gained them nothing.

        Because the quality matched the pricing.

        • 1 month ago
          Anonymous

          >Because the quality matched the pricing.
          ROFL, nope. This is pure rationalization from brand-gays. Neither company is your friend. They only care about the bottom line for their shareholders.

          • 1 month ago
            Anonymous

            yeah but the quality matches the pricing

          • 1 month ago
            Anonymous

            This. And hell, it's not just features but efficiency too. The performance per watt difference isn't even funny.

          • 1 month ago
            Anonymous

            Nope, the Nvidia platform has its own laundry list of issues. Brand-gays (mainly the whales) are too blinded by the sunk cost fallacy to see it objectively.
            The simple truth is that both platforms are robust and work for 95% of use cases without too much hassle. Most of the so-called problems are really user error, i.e. morons overclocking their hardware or running insufficient cooling and power for their platform.

          • 1 month ago
            Anonymous

            >insufficient cooling
            Not the user's fault a card doesn't come with a good enough cooler.

          • 1 month ago
            Anonymous

            Yes, the end-user is at fault if they do not provide adequate cooling for their chassis and power supply to keep the discrete GPU from shitting itself at load.

          • 1 month ago
            Anonymous

            don't put a blower on it if you don't want it to blow

          • 1 month ago
            Anonymous

            PSU, sure, but unless the user does something insane like cutting off their air supply completely, a discrete GPU is supposed to keep itself cool with the provided cooler. If it fails to do so, it's a faulty product.

          • 1 month ago
            Anonymous

            You would be shocked how many morons don't provide adequate chassis cooling for their 200W+ GPU SKU inside a mid-tower and wonder why their shit crashes after 30 or so minutes of runtime.

            >don't put a blower on it if you don't want it to blow

            >moron doesn't understand that GPU still radiates heat from the backside into the chassis and is often next to CPU socket.

          • 1 month ago
            Anonymous

            >doesn't understand that GPU still radiates heat from the backside into the chassis and is often next to CPU socket.
            only if the blower blows enough
            if your gpu can't cope with its shitty blower ofc it's going to radiate you fricking moron
            that's still a problem with the blower at the end of the day

          • 1 month ago
            Anonymous

            >Nope, the Nvidia platform has its own laundry list of issues. Brand-gays (mainly the whales) are too blinded by the sunk cost fallacy to see it objectively.
            >The simple truth is that both platforms are robust and work for 95% of use cases without too much hassle. Most of the so-called problems are really user error, i.e. morons overclocking their hardware or running insufficient cooling and power for their platform.

            If another person's choice on GPUs causes you this much distress, you need to get off of here.

          • 1 month ago
            Anonymous

            >This much projection from brand-gays being called out
            Like clockwork

          • 1 month ago
            Anonymous

            For not being a "brand-gay" you sure do have a hard-on for Nvidia.

          • 1 month ago
            Anonymous

            >Doubling down on the projection

          • 1 month ago
            Anonymous

            Stay poor.

          • 1 month ago
            Anonymous

            >Triples down on the projection
            Holy shit, I didn't know you were that dumb. You realize it's poor-gay gaymers that get high-end GPUs as a status symbol among their peers?

          • 1 month ago
            Anonymous

            Me need CUDA.

          • 1 month ago
            Anonymous

            Protip: 95% of "MAD CUDA" IQfy are larping gayming-gays who have never used or seen CUDA code in their lives.
            Actual CUDA-gays don't waste time on silly gayming SKUs.

          • 1 month ago
            Anonymous

            >Actual CUDA-gays don't waste time on silly gayming SKUs.
            I'm a student and it lets me practice at home AND game. Bang for the buck goes to Nvidia.

          • 1 month ago
            Anonymous

            You will get to use actual hardware once you go professional and get rid of the crappy gayming nonsense.

          • 1 month ago
            Anonymous

            No you won't

          • 1 month ago
            Anonymous

            It really does bug you doesn't it?

          • 1 month ago
            Anonymous

            wrong, they're AI coomers, more specifically poobrained idiots using a specific stable diffusion UI that only supports NVIDIA cards
            even the llmgays are very aware that AMD cards have a CUDA equivalent because the main framework people use for llms has supported HIP since pretty early on

          • 1 month ago
            Anonymous

            also i would like to contest the claim that no one uses gaming cards for professional shit
            both the AMD and NVIDIA builds the tinygrad dev is selling as mid-range AI research boxes use 6 gaming cards

          • 1 month ago
            Anonymous

            and also i've been personally doing amateur compute dev with my 6900xt which is still arguably professional-ish
            currently working on using persistent warp scheduling to implement a cache-aware matrix multiplication with the RDNA2 dot product instructions in assembly, in the hopes that i'll be faster than the rocBLAS/libTensile implementations, which are generated rather than manually tuned assembly
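            for the curious, a rough HIP C++ sketch of just the persistent-workgroup part (not the tuned RDNA2 assembly, and the 16x16 tiling and fp32 accumulation are assumptions):

                #include <hip/hip_runtime.h>

                // persistent scheduling: launch a fixed number of workgroups and let
                // each one loop over output tiles instead of launching one workgroup
                // per tile; a workgroup stays parked on its CU, which is what makes
                // reasoning about cache reuse across tiles possible
                __global__ void persistent_sgemm(const float* A, const float* B,
                                                 float* C, int N, int tiles_total) {
                    int tiles_per_row = N / 16;  // assumes N divisible by 16
                    for (int t = blockIdx.x; t < tiles_total; t += gridDim.x) {
                        int row = (t / tiles_per_row) * 16 + threadIdx.y;
                        int col = (t % tiles_per_row) * 16 + threadIdx.x;
                        float acc = 0.0f;
                        for (int k = 0; k < N; ++k)
                            acc += A[row * N + k] * B[k * N + col];
                        C[row * N + col] = acc;
                    }
                }
                // launched as e.g. persistent_sgemm<<<num_CUs, dim3(16, 16)>>>(...)
                // so the grid matches the hardware rather than the problem size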

      • 1 month ago
        Anonymous

        They're not equal cards. All those things you listed are things consumers see and care about. Realistically, AMD cards are worth maybe half what nvidia's are

      • 1 month ago
        Anonymous

        >nothing. AMD undercuts Nvidia, people just buy nvidia shit. AMD outperforms nvidia, people just buy nvidia shit. AMD has more features than nvidia, people just buy nvidia shit. Nvidia has inserted itself so completely in the minds of the average gayman homosexual that it is a hopeless situation.

        This is absolutely correct. Plus, when AMD achieves something (RT performance on par with Nvidia's previous gen, for example), suddenly Nvidia's previous-gen RT performance, which was a major selling point, is now shit, and only the NEW generation of RT performance matters.

        >They're not equal cards. All those things you listed are things consumers see and care about. Realistically, AMD cards are worth maybe half what nvidia's are

        gays like this are why we can't have nice things.

        • 1 month ago
          Anonymous

          >Nope, the Nvidia platform has its own laundry list of issues. Brand-gays (mainly the whales) are too blinded by the sunk cost fallacy to see it objectively.
          >The simple truth is that both platforms are robust and work for 95% of use cases without too much hassle. Most of the so-called problems are really user error, i.e. morons overclocking their hardware or running insufficient cooling and power for their platform.

          nah I just used to have an amd gpu and it was the worst experience I had in my life
          this taught me that there is more to gpus than price, performance and features
          I got what I paid for. I'm never going to buy from a company that has less than 10% of the user base.

          • 1 month ago
            Anonymous

            *less than 20%

          • 1 month ago
            Anonymous

            I've been using AMD GPUs since they were still ATI; the only consistent issues I ever had were with Vega, and those were solved by using two dedicated PCIe 8-pin adapters instead of a daisy chain.

        • 1 month ago
          Anonymous

          >nothing. AMD undercuts Nvidia, people just buy nvidia shit. AMD outperforms nvidia, people just buy nvidia shit. AMD has more features than nvidia, people just buy nvidia shit. Nvidia has inserted itself so completely in the minds of the average gayman homosexual that it is a hopeless situation.
          This lmao every gaymer is a fricking moron

          • 1 month ago
            Anonymous

            >AMD outperforms nvidia
            Only when using twice as much power.

          • 1 month ago
            Anonymous

            Not true at all, you fool. Both camps have had their bouts of efficient and power-hungry silicon over the years. The vast majority of gaymers didn't care at all.

      • 1 month ago
        Anonymous

        All of that noise comes from poor-gay Nvidiots. They are seething that AMD gave up on the price wars.

      • 1 month ago
        Anonymous

        Return of the $500-$750 flagship GPUs. Return of the sweet spot strategy.

        AMD exists to keep the price of NVIDIA GPUs low ever since CUDA secured NVIDIA the position of having the superior package and feature set. The actual hardware is really good but is held back by the features.

        • 1 month ago
          Anonymous

          >The actual hardware is really good but is held back by the features.
          tsmt
          t. actually uses the compute package

          • 1 month ago
            Anonymous

            Good on you if you can actually get full use out of their cards; for 3d-gays this usually is not the case.

          • 1 month ago
            Anonymous

            it was an uphill fricking battle
            i run headless ie just opencl
            and to make it a thing i have to use a specific fedora distro (bc i suck at loonix things)
            (but also 1.5-2 years ago it was the only solution to get an rx 590 to actually do what it said on the box. this or installing arch bc of course archgays have a package that just werks)

            today the situation is slightly better it seems.
            at least the poor techie who deals with it finally got enough time to properly test rocm so now we finally know which distros and which kernels are supported
            (i used techie in singular bc from his interactions on the amd forum it appeared that it's one sole guy who does the forum AND bugfixes/maintains the rocm package)
            (what we got is workarounds anyway bc we're supposed to use very specific distros with very specific kernels to make compute work. amd's penny-pinching is a bad fricking joke and the mother of all footguns. bc their hardware is really good indeed)

        • 1 month ago
          Anonymous

          AMD is also moronic. There is a project that makes cuda work on amd, and they funded it for a bit but then stopped because they said there was no interest in supporting cuda, even though it actually ran faster than cuda on nvidia hardware. That software also isn't just alpha software; you can use it right now in real-world applications

          • 1 month ago
            Anonymous

            >There is a project that makes cuda work on amd and they funded it for a bit but then stopped because they said there is no interest in supporting cuda, even though it actually ran faster than cuda on nvidia hardware
            HOLY FRICK
            I knew they planned on actually addressing the lack of CUDA support head on in addition to OpenCL but was this really the case?

          • 1 month ago
            Anonymous

            It's called ZLUDA but I haven't seen definitive benchmarks supporting that anon's claim

      • 1 month ago
        Anonymous

        Because amd are inept lol, it's not a justification

    • 1 month ago
      Anonymous

      And what good will it do them? Poorgays already go with AMD, non-poorgays won't get inferior hardware, even if it's free.

      • 1 month ago
        Anonymous

        >And what good will it do them?
        There's a hell of a lot more poors than rich people.
        By seriously cutting prices, instead of the token $50 they're doing, they'd sell more, they would get more market share, more public goodwill, more organic shilling, and I have a suspicion they'd also get more profits.

        • 1 month ago
          Anonymous

          It's not like they are selling some essential product. GPUs are a luxury item already, so people buying these aren't exactly destitute, and you can squeeze more out of them. Lowering the prices too much just fricks with your margins.

          Also the goodwill would get somewhat balanced out by the PR disaster of being a budget brand, which they already kinda have. It's haunting their CPUs too despite these actually being very competitive – I have friends who don't even bother considering their products because of the "cheap option" reputation, which people tend to associate with cheap quality and some small issues here and there.

          Short of magicing up a MUCH better GPU at a similar price, I just don't see much hope for them.

          • 1 month ago
            Anonymous

            >GPUs are a luxury item already, so people buying these aren't exactly destitute
            GPUs have BECOME a luxury product. Again. They were a luxury when they appeared in the 90s, but afterwards became more affordable with the wide product ranges. Every kid that wanted a pc could get a *50 level card with his allowance, or a *60 with some effort, leaving the *70 and *80 for the enthusiast with money to spare.
            Now those segments of the young who aren't working yet have no option to buy anything.

          • 1 month ago
            Anonymous

            >Now those segments of the young who aren't working yet have no option to buy anything.
            And these folk get a pretty solid integrated GPU that can play basically anything, so really not that much changed. Unless you want to crank up the details or go with 4k, dedicated GPUs aren't a must have anymore.

          • 1 month ago
            Anonymous

            >integrated GPU
            These are still garbage: dynamic 1080p, shaky 60FPS, lowest settings in 2018 games. And games are LESS optimized now, so there goes your theory.
            Game developers will ALWAYS use more power as an excuse to optimize less.

          • 1 month ago
            Anonymous

            Wrong, they are sufficient for the mainstream audience, and that is where the money is made for developers, via microtransactions.
            Chasing the high-end graphics market for games is a fiscal fool's errand.
            The most profitable and lucrative genres can easily operate and are playable with a mere iGPU.
            Discrete GPUs are not some sacred cow immune to the monster of miniaturization. Nvidia and AMD saw the writing on the wall years ago and have been taking measures to make gaming graphics not their bread and butter.

          • 1 month ago
            Anonymous

            You're an actual moron with zero foresight.

          • 1 month ago
            Anonymous

            >old-gay IQfytard who is still living in denial
            Sorry that your PCMR high-end rig is becoming an anachronism, which will become more apparent as the decade progresses.
            Ultra portables and cloud stuff will become the future for gayming.

        • 1 month ago
          Anonymous

          Cutting prices risks devaluing your brand.
          And buying share with price drops can end poorly - just look at Japan of the 90's - attempting to buy market share destroyed their economy.

    • 1 month ago
      Anonymous

      My boomer father doesn't even know what AMD is despite being an electrical engineer. It has always been Intel/NVIDIA for him.

  3. 1 month ago
    Anonymous

    Making their CUDA alternative functional. Intel is already beating them to it though with OneAPI I think, so it’s not obvious if it’d actually work. Making CUDA run on AMD cards would also be a big help but I think NVIDIA added some cucked shit to the CUDA license that disallows running compatibility layers after ZLUDA was released

    • 1 month ago
      Anonymous

      That's still playing catch-up, though.
      They need to develop their own gimmick that's exclusive to their cards. Something like what they did with their CPU division, with Infinity Fabric allowing for drastically more cores at the time, or 3D V-Cache for massively more cache and performance in games.
      Maybe a feature like hardware-level backlight strobing for motion clarity.
      Or a hardware-accelerated antialiasing solution to replace TAA, similar to ClearType subpixel rendering.
      Or a texture upscaler that works dynamically, so the closer the texture is in view, the higher resolution it gets, so you'll never see its pixels stretched out.
      Or automatic level of detail, that lowers rendered polygon count the further objects are in a scene, allowing for massive scenes without performance hits.
      I don't know, they need something new, and never seen before.

      • 1 month ago
        Anonymous

        Gaming wise, an alternative to TAA that isn’t unpleasant and still runs decently would be nice, but a lot of games now rely on its smoothing effects artistically to save resources. Not sure how appealing backlight strobing would be.
        Everything else you mention is primarily software, and is basically what UE5's Nanite does.
        At the moment they seem to be trying to focus on straight raster performance and the gaming market for their consumer cards, but the cards are still priced too high to be competitive with NVidia solely based on that.

        • 1 month ago
          Anonymous

          >Not sure how appealing backlight strobing would be.
          It's something on the same level as 4K, HDR, high refresh or OLED black levels.
          It can impress some people enough that they find it hard to go back after experiencing it in person.
          With the right marketing, it could be the next big thing, as right now it's just being sought in niche enthusiast circles.

          • 1 month ago
            Anonymous

            Normies can't even tell the difference between 30FPS and 60FPS, let alone give a shit about motion clarity. Only a small minority of esports wannabes and IQfy autists care about 360Hz meme monitors.

      • 1 month ago
        Anonymous

        >That's still playing catch-up, though.
        So AMD hardware isn't THAT bad, the main issue is no CUDA. As anon said, AMD needs to get all the software out there (most of which uses CUDA) to work on their shit "out of the box" - and in fact this is the issue with ALL NVIDIA competitors. Honestly, as soon as someone figures out how to do an openCUDA, similar to SGI's OpenGL (originally IRIS GL), we're golden.

        If CUDA is solved, AMD can do whatever to get an edge. Easiest? Just fill the damn thing up with more RAM, RAM which NVIDIA is intentionally cucking their hardware with. For example, allow installation of motherboard-type DDR RAM up to 128GB and you've got a killer app right there. There's also opportunities to get the CPU in on the action since AMD does those also.

        The only thing AMD could do if CUDA is technically infeasible would be to just give a huge pile of money to, say, the stable diffusion and/or llama.cpp folks and get them to build AMD-specific superfeatures, particularly for windows (largest user base)

        • 1 month ago
          Anonymous

          >If CUDA is solved, AMD can do whatever to get an edge. Easiest? Just fill the damn thing up with more RAM, RAM which NVIDIA is intentionally cucking their hardware with. For example, allow installation of motherboard-type DDR RAM up to 128GB and you've got a killer app right there. There's also opportunities to get the CPU in on the action since AMD does those also.
          AMD should also use their advances in X3D cache or HBM on GPUs more. There were good reasons why Vega cards flew off the shelves despite performing pretty poorly in games.

        • 1 month ago
          Anonymous

          They already did a similar thing with SSG, although that was SSD.

    • 1 month ago
      Anonymous

      A 1st party inference runtime for Linux and Windows would be amazing.

    • 1 month ago
      Anonymous

      Why not simply make a stable architecture for these vector processors to run whatever the frick I want? I wonder why Xeon Phi failed, it had the right idea. It worked as a linux computer and appeared to the host os as a point-to-point 40G ethernet link.

  4. 1 month ago
    Anonymous

    AMD could make a better card than offerings from Nvidia and I still wouldn't buy it lol, frick AMD

    • 1 month ago
      Anonymous

      This but the opposite, I'll buy AMD because their shit has always worked better for Linux.

      • 1 month ago
        Anonymous

        >apt install nvidia-driver
        Works on my machine for the past 15 years

      • 1 month ago
        Anonymous

        >This but the opposite, I'll buy AMD because their shit has always worked better for Linux.

        I'll buy Nvidia because their shit works better now.

        • 1 month ago
          Anonymous

          ill agree with you when explicit sync support for wayland is added to the drivers in may

          • 1 month ago
            Anonymous

            >its only good at all if it matches MY niche use case

          • 1 month ago
            Anonymous

            its not that niche really but i don't mind waiting

          • 1 month ago
            Anonymous

            Good is subjective, but most can't accept that

  5. 1 month ago
    Anonymous

    Best GPU selling for $600 on release. Then come out with some kind of REAL AI optimization tool for game engines so developers can lean on that instead of upscaling cancer. Basically, make Nvidia irrelevant for consumers who don't care about AI porn (meaning most of them).

  6. 1 month ago
    Anonymous

    Lisa is Jensen's b***h, literally. They are related by blood.

  7. 1 month ago
    Anonymous

    A time machine

  8. 1 month ago
    Anonymous

    Completely open source one of their latest GPUs so that autists can fix all the bugs and crashes

    > https://github.com/geohot/7900xtx

  9. 1 month ago
    Anonymous

    they need to beat or at least match nvidia in terms of features.
    freesync is a poor man's gsync, fsr looks horrible compared to dlss (plus most games just choose dlss), amd cards perform great at 1440p and below but have an issue where their performance drops noticeably at 4k compared to nvidia's alternatives (happens with the 6000/7000 cards)

  10. 1 month ago
    Anonymous

    Gaming wise, there is nothing they can do, realistically. The majority of people will keep buying Nvidia.
    On the compute side they could team up with Intel and present a viable alternative to CUDA
    t. AMD fanboy

    • 1 month ago
      Anonymous

      >On the compute side they could team up with Intel and present a viable alternative to CUDA
      Which would be a decade-long process
      the captcha distracted me and I forgot to finish my post

    • 1 month ago
      Anonymous

      >Gaming wise, there is nothing they can do, realistically

      BS. If their features (ie. encode, streaming, ml upscaling) were on par with nvidia's and driver/game issues were gone, they would win back significant market share in gaming.

      • 1 month ago
        Anonymous

        what's wrong with amd encoding? Yes I have seen that fsr is worse than dlss visually

      • 1 month ago
        Anonymous

        I have encountered 2 driver issues with amd.
        Starfield was borked on linux with vega/polaris cards and I couldn't apply my undervolt with windows iot ltsc but in that case I blame the os

  11. 1 month ago
    Anonymous

    make a CUDA competitor that's compatible out of the box with all major professional creative software, while also undercutting prices for the same performance (and having no driver problems)
    and then they'd also need to compete on AI accelerators
    neither of these things is likely to happen, the latter is certainly not going to happen
    they're not going to compete in the next five years or probably even in the next decade, so they need to be doing RnD for something that'll come to fruition one or two decades from now (much like CUDA is nearly 20 years old at this point)

    maybe something like optical computing, some real bell labs shit

  12. 1 month ago
    Anonymous

    take over the low end with CHEAP APUs, the 8700G at a cheap price would take over the budget market.

    • 1 month ago
      Anonymous

      APUs are a bell-curve product
      they are for office computers and enthusiasts, either building teeny tiny computers or overclocking for overclocking's sake
      you get more performance with a $100 used gpu and a cheaper cpu.

      • 1 month ago
        Anonymous

        Yeah but why would you spend 100 bucks for slightly less crappy 1080p performance? And since the market is moronic a serious step up would be 300ish bucks, so making this even less appealing by further boosting APUs sounds like a decent plan.

  13. 1 month ago
    Anonymous

    >What move could AMD make against Nvidia that would essentially be “going for the jugular”?
    come out with a GPU that wasn't provably worse than NVIDIA's similarly priced options

  14. 1 month ago
    Anonymous

    Making a product stack that beats Nvidia's and selling it for cheaper. And they should do it in the AI winter, to break Nvidia's back.

  15. 1 month ago
    Anonymous

    Everyone ITT talking about gaming features, not a stable and just as fast/capable ML solution.

    • 1 month ago
      Anonymous

      AMD has no hope in AI. Their software stack is garbage. Even if they produce more cost effective hardware, it will only ever be used via some cuda compat layer.

  16. 1 month ago
    Anonymous

    idk they could not totally suck a wall of dicks or something

  17. 1 month ago
    Anonymous

    Fully document the hardware and allow direct hardware access. Graphics drivers and standard apis were a mistake.

  18. 1 month ago
    Anonymous

    in the gaming market: make a super solid mid-tier GPU at half the current pricing scheme. Ideally low power draw (since Lisa Su has said they want to go for efficiency with their next gen), and rock stable drivers – they basically already have that.

    If they really forfeit the high end next time round, they need to severely undercut Nvidia's offerings in the mid-range. Not $100, it needs to be way more drastic, so they can suck up that market share and make Nvidia's mid-range stay on the shelves (and also make their high-end prices seem even more ridiculous in comparison).

    As for the AI market: I don’t think they can do anything there at all.

    • 1 month ago
      Anonymous

      >rock stable drivers
      HAHHAHAHAHAHAHAHAHHAHAHAHAHAHAHAHHAHAHAAHAHAHHAHAHAHAHAHHAHAHAHAHAHHAHAHAHAH

    • 1 month ago
      Anonymous

      AMD needs solid $200 and $300 offerings that don't gimp on vram/bus, the $200 performance has to start at 7800XT and the $300 has to beat it by at least 15%. There are a lot of people who don't give a shit about software features, and the reason they aren't buying low end AMD right now is because they know it's as much of a price scam as Nvidia's product stack.

      • 1 month ago
        Anonymous

        >amd just has to lose money

  19. 1 month ago
    Anonymous

    I have a 7900 XTX and it utterly mogs everything Nvidia offers up to the flagship RTX 4090 all at a fraction of the price. I'm getting better frames than Nvidia users and I spent less for it which let me invest more into beefing up other components in my rig. I'm so happy I got it. With Radeon I'm also getting better software and the AIB options for Radeon are top tier. Plus I don't have to deal with the heavily proprietary anti-consumer tactics Nvidia pulls. There is a reason why console makers have partnered with AMD for their hardware.

    • 1 month ago
      Anonymous

      >There is a reason why console makers have partnered with AMD for their hardware.
      It's much cheaper for a market where price is THE priority? Though yeah, going by Apple's experience with them, Nvidia seems like a pain even when it comes to b2b.

    • 1 month ago
      Anonymous

      How's generative AI performance with it? Have you tried SD or ollama?

  20. 1 month ago
    Anonymous

    40 gb vRAM + 4090 cuda performance + costs half as much as 4090

  21. 1 month ago
    Anonymous

    AMD RTG already conceded the discrete GPU market years ago because the discrete GPU market is undergoing demand destruction as we speak. iGPUs are simply becoming good enough for the masses that aren't chasing 200FPS+, 4K+ and RT rendering.

    • 1 month ago
      Anonymous

      >iGPUs are simply becoming good enough
      Well I'll be damned, but this is somewhat true: my work laptop with an i7-1165g7 and iris xe plays games the same if not better than a discrete rx6400

  22. 1 month ago
    Anonymous

    AMD has an entire monopoly on x86 consoles
    Nvidia has an entire monopoly on CUDA

  23. 1 month ago
    Anonymous

    releasing a good gpu

  24. 1 month ago
    Anonymous

    24GB VRAM cards for $800 or less.
    Drivers that don't shit the bed for VR.

  25. 1 month ago
    Anonymous

    They could try making superior cards. Worked with their cpus.

  26. 1 month ago
    Anonymous

    Bombing Nvidia facilities

  27. 1 month ago
    Anonymous

    They could have tried to compete with RT and DLSS 2 gens ago instead of just holding on to the hope that it would go away.

    • 1 month ago
      Anonymous

      Why bother? There is no real money in it to justify R&D cost.
      Nvidia is just using gaming as a springboard for their ML/general compute ambitions.

  28. 1 month ago
    Anonymous

    If bitnet trained with sign gradient descent and integer math by some miracle turns out to be competitive, it's going to completely change the hardware landscape.

    That's probably the best chance to displace NVIDIA.
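
    A toy C++ sketch of why that maps onto much simpler hardware (the clamp range and step size are assumptions, not BitNet's actual recipe): the update needs only the sign of the gradient, and with ternary weights the forward pass needs no multiplier at all.

        #include <cstdint>
        #include <algorithm>

        // sign-SGD step on integer weights: no floating point anywhere
        void sign_sgd_step(int8_t* w, const int32_t* grad, int n, int step) {
            for (int i = 0; i < n; ++i) {
                int s = (grad[i] > 0) - (grad[i] < 0);  // sign(grad[i])
                w[i] = (int8_t)std::clamp((int)w[i] - step * s, -127, 127);
            }
        }

        // with weights quantized to {-1, 0, +1}, a dot product degenerates
        // into adds and subtracts
        int32_t ternary_dot(const int8_t* w, const int8_t* x, int n) {
            int32_t acc = 0;
            for (int i = 0; i < n; ++i)
                acc += (w[i] > 0) ? x[i] : (w[i] < 0) ? -x[i] : 0;
            return acc;
        }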

  29. 1 month ago
    Anonymous

    They already kind of did that with Ryzen, and still are.

    All that's left is to find a way to emulate PhysX just to make games a bit prettier.

    • 1 month ago
      Anonymous

      >emulate
      why bother, physx has been open source for 3 major versions, including the CUDA parts
      you can just port it to HIP, the windows SDK is out
      i've looked into it and the only hard part is converting the build system

  30. 1 month ago
    Anonymous

    finally fix rocm

  31. 1 month ago
    Anonymous

    Nvidia GPUs are for legitimate morons. Everyone at my datacenter who gushes over that shit is an Applegay.

    • 1 month ago
      Anonymous

      im still a noob compute-wise
      but with all the shit i had to go through to get compute on my gpu, when my kernels got terminated i almost pulled the trigger on going ngreedia
      despite the cause not being anywhere near gpu-related
      (it was the OOM killer because i didnt properly estimate the amount of ram my program was actually using.
      but thats the exact point:
      i didnt think it could be anything else than some bullshit issue i have no chance of fixing bc im running a frankenstein's driver on a distro that it's not supposed to run on)

    • 1 month ago
      Anonymous

      It only makes sense that Applegays would pick the other "superior" product in markets where Apple isn't an option.

  32. 1 month ago
    Anonymous

    Make ROCm suck less/work with Intel on some common framework, 32-48+GB VRAM cards cheaper than the 4090.

  33. 1 month ago
    Anonymous

    The GPU market is in a pathetic state.
    First the crypto mining craze, which was then followed by the AI stuff.

    AMD is no better than Nvidia.
    Their current lineup is all but a scam.
    You either go big or might as well just buy the new PS5.
    The rest of the cards are not worth it.

    • 1 month ago
      Anonymous

      >might as well just buy the new PS5.
      Buying an overpriced GPU is one thing, but buying a DRM box with no games?

  34. 1 month ago
    Anonymous
    • 1 month ago
      Anonymous

      Nah, it is just the alternative, period. AMD RTG SKUs are not really cheap either. The only cheap options are old previous-generation stock or iGPUs.

    • 1 month ago
      Anonymous

      Why does this homosexual wear earrings

      • 1 month ago
        Anonymous

        because he gets pussy unlike you

        • 1 month ago
          Anonymous

          I mean, I could marry an ugly bawd who loves my wallet and my citizenship, but why would I?

          • 1 month ago
            Anonymous

            because you'll die lonely and childless that's why

    • 1 month ago
      Anonymous

      I hate this fricking lesbian, someone please corrective rape her

      • 1 month ago
        Anonymous

        He's getting raped by his Asian wife daily I bet.

  35. 1 month ago
    Anonymous

    Let's talk about Intel now. Do their new datacenter GPU accelerators stand a chance? Will Musk start competing on the gpu question?

    • 1 month ago
      Anonymous

      We don't need to be worrying about something else.

  36. 1 month ago
    Anonymous

    Real problem is this: lack of any incentive to upgrade to, say, some $500 card. WTF should I upgrade anything when, for one, I don't play new games, and two, the hardware I've got now handles what I do just fine. "Oh look at me I've got this new $600 card" Nope. That kinda thing isn't my style. If I had $600 to blow on a GPU I'd take that money instead and toss it into my brokerage account or savings.

  37. 1 month ago
    Anonymous

    Nothing they can do to win instantly. They need to just keep doing what they're doing.
    Keep the AMD64 architecture relevant to keep Nvidia locked out of the PC SOC market.
    Push Linux gaming where they have the driver advantage.
    Sell more capable SOCs that push discrete GPUs up market.
    Gain access to a greater percentage of OEM sales.

    The Steam Deck is now AMD's single highest rated identifiable product in the Steam Survey.
    They just need about 4 or 5 more successes on that level to really disrupt the market.

  38. 1 month ago
    Anonymous

    64GB VRAM GPU

  39. 1 month ago
    Anonymous

    >make CUDA run on AMD cards
    >highend cards have 64GB by default, but offer models with 80GB or more
    >completely open source their drivers

  40. 1 month ago
    Anonymous

    >all this nonsense about making CUDA run on AMD cards
    stop being moronic
    every single CUDA project can easily be converted to HIP
    the only ones that can't are the ones using msbuild as a build system
    now unfricking some of HIP's performance issues and bugs and providing better access to AMD specific features on the other hand would be actual improvements
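
    for reference, this is the entire scale of a typical conversion, a minimal saxpy sketch (assuming hipcc; the kernel is source-identical and the host API is a 1:1 rename that the hipify scripts do mechanically):

        #include <hip/hip_runtime.h>

        // __global__ kernel: identical between CUDA and HIP
        __global__ void saxpy(int n, float a, const float* x, float* y) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) y[i] = a * x[i] + y[i];
        }

        int main() {
            const int n = 1 << 20;
            float *x, *y;
            hipMalloc((void**)&x, n * sizeof(float));        // was cudaMalloc
            hipMalloc((void**)&y, n * sizeof(float));
            saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // launch syntax unchanged
            hipDeviceSynchronize();                          // was cudaDeviceSynchronize
            hipFree(x); hipFree(y);                          // was cudaFree
            return 0;
        }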

    • 1 month ago
      Anonymous

      moron. there's no point in converting the CUDA projects to HIP if HIP doesn't even work.
      the reason people care about making CUDA run on AMD is because a crappy compatibility layer is already better than ROCm after years of development

      • 1 month ago
        Anonymous

        HIP literally works fine outside of fricking weird edge cases like the firmware bug the tinygrad dev ran into you fricking idiot
        fixing it would be substantially more productive than completing ZLUDA especially because ZLUDA is running on top of HIP, the only thing that you're actually benefiting from is the PTX to AMDGPU ASM transpiling

        • 1 month ago
          Anonymous

          and by "fine" i mean mediocre
          it works but it doesn't work well
          optimizing rocblas and miopen and improving clang would be considerably less stupid than working on ZLUDA especially due to the legal issues

  41. 1 month ago
    Anonymous

    Nvidia are greedy little israelites when it comes to VRAM, there's a massive amount of demand for high VRAM graphics cards. All Nvidia needs to do is make a CUDA knockoff and sell cards with high amounts of VRAM.

    AMD is a pretty shit company though, the only reason they still exist is because Intel somehow is just as dumb, so it let AMD secure a decent CPU market share. Nvidia is the only competent one among the major semiconductor companies, which is why they're so successful.

    • 1 month ago
      Anonymous

      Itanium was supposed to be the killing blow against AMD, but it put Intel on the wrong foot.

      They also tried to kill Nvidia by taking them out of the motherboard market, and using x86 cores for GPGPU but Nvidia got very lucky that they were able to survive as a discrete GPU maker because Intel's onboard graphics were so bad, and their x86 GPUs just didn't work.

      AMD for their part lucked into being the only company that could provide both a AMD64 CPU and a competent GPU in a single package and built that into an ability to produce semi-custom chips in volume even if they didn't get a big piece of the Windows OEM market.

      • 1 month ago
        Anonymous

        >Itanium was supposed to be the killing blow against AMD, but it put Intel on the wrong foot.
        Wrong, IA64 was meant to be Intel's entry to the HPC/big iron market and kill off RISC platforms like DEC Alpha and SPARC. It has nothing to do with x86. Itanium was kinda DOA because the market it was geared towards was dying out. x86 evolved to become RISC-like.
        >They also tried to kill Nvidia by taking them out of the motherboard market,
        Nvidia never cared for the motherboard division. It just used its Xbox classic contract to spin off some motherboard chipset platforms on the AMD front. It quickly evaporated once AMD got ATI's chipset division and spun it as their in-house platform.
        >Nvidia got very lucky that they were able to survive as a discrete GPU maker because Intel's onboard graphics were so bad, and their x86 GPUs just didn't work.
        The real reason is that iGPUs weren't good enough for 3D graphics at the time, but Nvidia saw the writing on the wall and had been planning to migrate elsewhere in the long term.
        >Nvidia is the only competent one among the major semiconductor companies, which is why they're so successful.
        Nah, they got lucky a couple of times and managed to weather PR disasters with minimal damage. The ML/AI bubble bursting is going to be a rude wake-up call for them.

        • 1 month ago
          Anonymous

          >IA64 was meant to be Intel's entry to HPC/big iron market
          >It has nothing to do with x86
          The thing it has to do with x86, is that unlike x86 AMD had no license for Itanium. So if Intel had succeeded in shifting customers over to it as the 64bit platform, it would have ultimately forced AMD out, or at the very least restricted them to just the legacy 32bit market.

          >Itanium was kinda DOA because the market was geared towards was dying out. x86 evolved to become RISC-like.
          True, but that was AMD's doing with AMD64. If it hadn't been a success, the industry would look a lot different today.

          >Nvidia never cared for the motherboard division.
          That seems like a post-hoc 'sour grapes' reasoning. It was a big deal at the time. Especially for Nvidia to get a contract with Apple. For a while there the motherboard chips with onboard graphics seemed like they would take over the mainstream market because OEMs will always prefer a solution that means buying less chips.

          With Nvidia blocked from selling 2 chip solutions AMD picked up a lot of market space at the bottom in part because they became uniquely equipped to supply low-end hardware with the fewest number of chips including somewhat competent GPUs. But Nvidia managed to hang on to the mainstream with their 50 and 60 series parts thanks to OEM relationships they had built in part with the motherboards.

          Nvidia also got really lucky that AMD misjudged the market a bit in moving to RDNA right when AI started to take off, but RDNA made sense for AMD because while GCN and Vega specifically had been intended as high-end GPGPU solutions, they were mostly used in low-end hardware. So AMD made RDNA optimized for low-end hardware, and limited CDNA to data center hardware. If AMD were to scale up some more accessible CDNA hardware it might really help them out, especially given their Linux market advantages, but that would be a niche business.

          • 1 month ago
            Anonymous

            >The thing it has to do with x86, is that unlike x86 AMD had no license for Itanium. So if Intel had succeeded in shifting customers over to it as the 64bit platform, it would have ultimately forced AMD out, or at the very least restricted them to just the legacy 32bit market.
            Wrong, IA64 had nothing to do with x86 from the start. It was just Intel trying to get into the big iron space and fight against competing RISC chips. There may have been long-term plans to distill the architecture down to the desktop environment at some point in development, but those quickly evaporated.

            >True, but that was AMD's doing with AMD64. If it hadn't been a success, the industry would look a lot different today.
            Nah, IA64 was killed by x86 in general when it evolved to become RISC-like.
            >That seems like a post-hoc 'sour grapes' reasoning
            It was nothing more than an offshoot of their R&D from the Microsoft collaboration on the Xbox Classic. It only took off when Nvidia seized the small niche away from mediocre/subpar chipset vendors in the AMD camp. That bridge closed once AMD acquired ATI's chipset division. Nvidia tried going the Intel camp but their platform was inferior and gimmicks were simply not enough to make up for the difference (SLI support).
            Nvidia threw in the towel and went elsewhere.

          • 1 month ago
            Anonymous

            >but quickly evaporated.
            Because of AMD64. Intel really could have succeeded in their plan if AMD64 hadn't happened. It's hard to predict what the world would have been like if Itanium had been a success.
            No more IBM compatible Personal Computers and instead a market entirely owned by Intel?
            Or Intel making themselves irrelevant?

            >but their platform was inferior and gimmicks were simply not enough
            More like Intel blocked them (and everyone else) from making chipsets.

            >Nvidia threw in the towel and went elsewhere.
            They had to. Because they couldn't make chips for that market anymore. It no longer existed.

          • 1 month ago
            Anonymous

            > Intel really could have succeeded in their plan if AMD64 hadn't happened
            Nah, Itanium was a flop. It was killed by Intel's own x86 offerings. Intel would have independently developed its own version of "x86-64" after realizing that EPIC was an interesting but silly idea.
            >More like Intel blocked them (and everyone else) from making chipsets.
            Nah, Nvidia didn't want to pay whatever fees/royalties to renew their licensing agreement with Intel for chipsets, and saw their platform didn't sell well enough to justify the expense.

          • 1 month ago
            Anonymous

            what are the actual differences between CDNA and RDNA out of curiosity, aside from CDNA having proper MMA ops instead of RDNA3's macro-like WMMA op

          • 1 month ago
            Anonymous

            The TLDR summary is:
            CDNA is the continuation of full-power GCN compute units, minus all the graphics specific hardware.
            RDNA halves certain CU functionality and glues two CUs together to share resources, but keeps everything graphics specific.

            The upside for gaming, RDNA delivers much better performance per watt in a smaller package.

            >Nothing. The CUDA/AI gap is so big, it's not surmountable. Nvidia will be challenged by someone else on a different front in the AI hardware war, not AMD.

            I don't know, Aldebaran seems pretty well positioned, and AMD may be uniquely able to take advantage of the combination of CDNA + AMD64.

          • 1 month ago
            Anonymous

            oh okay so CDNA is still exclusively wave64 and i assume SIMD64, yea i could see how that's better
            that'd be nice tbh, the weird 1/2 CU shit means i have to look into really low level scheduling shit to park a compute kernel on a SIMD32

    • 1 month ago
      Anonymous

      >All Nvidia needs to do is make a CUDA knockoff and sell cards with high amounts of VRAM.
      did you mean AMD?
      AMD already has a CUDA knockoff, it's called HIP, and they've got a reimplementation of the entire CUDA SDK in HIP, it's called ROCm
      the differences between the two are extremely minimal to the degree that they've neglected to properly support AMD specific features in favor of making it more like CUDA
      the only major difference is functions that start with cu and cuda start with hip and that's because it's also a wrapper over CUDA and its libraries
      it's missing mappings for a couple cuda builtins to AMD opcodes as well like the nibble dot products

      it's got bugs and performance issues but they are seemingly dedicating more effort to it, slowly

      still, no one uses it despite the fact that supporting it is extremely trivial; llama.cpp does it using a few macros while other projects just use the provided conversion scripts
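
      a sketch of that macro approach (the gpu* names are illustrative, not llama.cpp's actual macros): one header aliases the API and the rest of the code is written once against the neutral names

          // gpu_compat.h
          #ifdef USE_HIP
            #include <hip/hip_runtime.h>
            #define gpuMalloc             hipMalloc
            #define gpuMemcpy             hipMemcpy
            #define gpuMemcpyHostToDevice hipMemcpyHostToDevice
            #define gpuDeviceSynchronize  hipDeviceSynchronize
            #define gpuFree               hipFree
          #else
            #include <cuda_runtime.h>
            #define gpuMalloc             cudaMalloc
            #define gpuMemcpy             cudaMemcpy
            #define gpuMemcpyHostToDevice cudaMemcpyHostToDevice
            #define gpuDeviceSynchronize  cudaDeviceSynchronize
            #define gpuFree               cudaFree
          #endif
          // kernels and host code then use the gpu* names and build with
          // either hipcc -DUSE_HIP or nvcc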

  42. 1 month ago
    Anonymous

    They could probably start by making a product that was actually competitive against nvidia.

    • 1 month ago
      Anonymous

      The 3000 series got its ass kicked by the 6000 series so hard that it scared Nvidia into making the 4090 with melting connectors

    • 1 month ago
      Anonymous

      Gaymurs want GeForce. That's it. You could show them all the graphs in the world, and people will still ignore you and buy GeForce.

      • 1 month ago
        Anonymous

        Gamers seem to want AMD more. Everyone is buying PS5 and Steam Deck. Nobody is happily chanting yes i want a 4070 or a prebuilt with Nvidia.

        • 1 month ago
          Anonymous

          Doesn't help that Intel's handheld entry is vastly inferior because they can't just throw more power at it.

  43. 1 month ago
    Anonymous

    make an x86 APU with 256-bit memory and actual graphics performance, making a 4050/4060 in normie laptops pointless.

    So basically Strix Halo.

  44. 1 month ago
    Anonymous

    >but CUDA
    Isn't ZLUDA able to emulate CUDA stuff?

    • 1 month ago
      Anonymous

      ZLUDA is just a runtime CUDA-to-HIP translation layer
      it's not as big as people who don't actually keep up with software development think, it isn't necessary for 99% of things
      pytorch already supports HIP natively, for example
      apparently some of the CUDA SDK libs are faster than some of the ROCm libs and that's the only real substantial benefit

      it might be nice for 64-bit physx support (if that actually works, mind) without any effort but that's about it

      • 1 month ago
        Anonymous

        >ZLUDA is just a runtime CUDA-to-HIP translation layer
        *and a PTX to AMDGPU ASM transpiler
        i think there are some other obscure compiled projects where it's relevant but the vast majority of relevant CUDA projects are open source and can or already do support HIP

  45. 1 month ago
    Anonymous

    Nothing. The CUDA/AI gap is so big, it's not surmountable. Nvidia will be challenged by someone else on a different front in the AI hardware war, not AMD.

    • 1 month ago
      Anonymous

      the "gap" is significantly smaller than you think it is now
      look at any actually up to date benchmarks for inference on RDNA3

      and it only existed for consumer cards to begin with, lmao

      • 1 month ago
        Anonymous

        the gap is getting smaller and it WILL close due to Microsoft and Google getting into the software stack, though not sure it'll happen for this gen of AMD GPUs. The current driver design won't work.

        >and it only existed for consumer cards to begin with, lmao
        Yeah because all this LLM madness is being trained on geforce cards not A100, H100s, etc
        "lmao"

  46. 1 month ago
    Anonymous

    Train up a really high quality AI model for porn (both photographic and drawn) designed to only work with AMD GPUs, with no tedious workarounds needed to make it generate NSFW content. If they can make it generate animated content so much the better.

    Coomers (aka everyone) will buy AMD cards in droves. It'll be a slaughter, like VHS vs Betamax.

  47. 1 month ago
    Anonymous

    The only move they could make is to sell their GPU line of business to intel.
    Intel is very experienced in making dev tooling, which AMD has been messing up for 10 years now.

  48. 1 month ago
    Anonymous

    AMD should just make full-on AI shit or buy Tenstorrent. The only reason people are eating Nvidia slop is that it's all that is widely available, but Nvidia is pulling an Itanium frick-up like Intel did, with spaghetti-coding software compiler teams of thousands.

  49. 1 month ago
    Anonymous

    That's not how competition between two powerful semi-monopolies works.

  50. 1 month ago
    Anonymous

    Because even the top tier AMD cards are expensive and if I'm going to spend that much money, I might as well get the actual good one.

  51. 1 month ago
    Anonymous

    >Revolutionary AI Chips: AMD could develop game-changing AI chips to outperform Nvidia.

    >Market Expansion: Expanding market share in data center AI processors and emerging segments.

    >Cloud Chip Rivalry: Collaborating with cloud providers to offer competitive alternatives.

    >Deepening Partnerships: Strengthening alliances with cloud giants like Amazon and Microsoft.

  52. 1 month ago
    Anonymous

    Something better than Blackwell

  53. 1 month ago
    Anonymous

    >AMD vs NVidia
    >no ARM
    Y'all aren't seeing much of what's up and coming in the market, are you?

  54. 1 month ago
    Anonymous

    sell their GPU side to a competent company

  55. 1 month ago
    Anonymous

    Amd generally and historically makes worse, less supported cards than nvidia. If i am spending around 600 dollars to get a nice card to build a new computer, i would not give a frick about trying to save 50 dollars to get a card from a brand with a worse reputation, especially when i am spending 1.5k on a new build. 50 bucks is a drop in the bucket.

    Now, if amd cards were 150 dollars cheaper, "maybe" i take the risk.

    • 1 month ago
      Anonymous

      Nvidia outproduces AMD discrete cards more than 10 to 1. In the NT market they can always drop prices to respond to AMD and stay in that 'just spend more and buy Nvidia' price range.

      The massive undercut on discrete hardware for NT users isn't a viable business plan for AMD.
      The only way they can undercut Nvidia are SOCs in Linux based systems.

  56. 1 month ago
    Anonymous

    make good drivers for real this time and lower prices

    • 1 month ago
      Anonymous

      it's weird how nvidia and amd refuse to make good drivers.

  57. 1 month ago
    Anonymous

    There is nothing they can do. Gaymers care about brands.

    However in the crypto market, maybe.

  58. 1 month ago
    Anonymous

    Easy: get their shit together with their compute stack. It has to support all their hardware and it has to do it on day one. No more promises of having it working at some point and then releasing a half-assed solution that they'll abandon a couple years later cause there's a new shit that this time is gonna work, we swear.

    And then put a lot of memory on the cards, and see nvidia burn.

  59. 1 month ago
    Anonymous

    >go for the jugular
    nothing, they could build super high end consumer/workstation apus though since nvidia can't compete on that front

  60. 1 month ago
    Anonymous

    Continue making competitive but reasonably priced cards and keep doing what they are doing with their CPUs. Stay gaming-oriented, that's how Nvidia got as big as they are.

    t. upgraded GPU recently

    • 1 month ago
      Anonymous

      >she fell for the amd meme AND uses windows
      Bro really thought she had something here

      • 1 month ago
        Anonymous

        That screencap is slightly old, I now have windows 11 pro.

        >"bro really thought she had something here"
        Talks like a zoomer and is a tinkering troon to boot. Unlike you, I have work and gaming to do, not tinkering with my OS. I'll consider Linux when it's not going to waste my time over my debloated windows.

  61. 1 month ago
    Anonymous

    >I'll consider Linux when it's not going to waste my time over my debloated windows
    apparently using ntlite to make a debloated iso, running shady post install scripts, changing registries and all the settings, activating windows, disabling all the legacy networking shit, and then finally disabling updates is easier than just installing a stable linux distro. enjoy doing the same humiliation ritual when you inevitably have to reinstall windows or update it for security patches
