Will it ever be good?

  1. 4 weeks ago
    Anonymous

    Battlemage is coming out later this year. We'll see where things go when it releases.

    • 4 weeks ago
      Anonymous

      Not any time soon and I don't see Pat caring about the desktop GPU market enough to tolerate another failed generation. They'll refocus their GPU efforts on the enterprise market.

      Battlemage is already too late. It was designed to compete with the current generation of AMD/Nvidia cards, and yet the next one will be launching around the same time Battlemage does. We already know that the top end is a <=225W part, which means we also already know a best case scenario for its performance (if they could match Nvidia's efficiency, which they won't). Unless it's absurdly cheap, it doesn't stand a chance. Which loops back around to the fact that Pat isn't going to keep throwing money into a bottomless pit in a effort to gain market share. He's cut every other failing part of Intel's business, yet people are desperate to believe he'll persist in a futile effort to challenge two decades of brand loyalty, when dumb gaymers will just buy Nvidia anyway as they always have.

      • 4 weeks ago
        Anonymous

        *an effort

      • 4 weeks ago
        Anonymous

        >They'll refocus their GPU efforts on the enterprise market.
        This is already happening. They're trying to commoditize graphics-accelerated remote desktop.

      • 4 weeks ago
        Anonymous

        >They'll refocus their GPU efforts on the enterprise market.

        That was the end goal all along. Entering the GPU market in the consumer space first was to help them iron the wrinkles out.

      • 4 weeks ago
        Anonymous

        Nvidia isn't going to be adding more performance for the same amount of money so even the a770 will still be competitive against the 5000 series.

      • 3 weeks ago
        Anonymous

        More market share = more awareness and hype = more money from investors. Hyping up and fleecing moronic gaymers is a huge part of what kickstarted Nvidia to where it is now. Hype and awareness is what makes Elon Musk so rich.
        Pretty useful considering Intel isn't really known as a GPU company.

      • 3 weeks ago
        Anonymous

        The whole point is to kickstart enterprise usage by making people aware that Intel is here and does work, albeit falling short of the incumbents in various areas. If you watch the recent L1Tech video on the Intel GPU server he got to play with, you'd know this is where the big money is. The fact that they have SR-IOV vGPUs working without onerous subscription fees, and that some big customer is already using them for that purpose, bodes well for Intel's GPU niche in enterprise.
        They do "care" about the consumer GPU market in a token manner, but only as far as catching up to parity with where Nvidia is. If it doesn't work out, they will, like AMD, hype up repurposed or recycled enterprise or older GPU designs to sell to the consumer market. At least it will keep the market honest, unlike the duopoly people currently have. Pat has already cut the consumer GPU team as far as he can without damaging his enterprise ambitions, so there's not much further for the consumer side to fall.

        • 3 weeks ago
          Anonymous

          they care about the consumer gpu market insofar as they want to be competitive with amd's igpus

          • 3 weeks ago
            Anonymous

            I was a little shocked by how bad the release was, considering how big and monopolistic Intel has been for decades. They have had talented folks at some points, off and on, so that's no excuse. They had an iGPU, albeit a terrible one, which for some reason they never moved on from for over a decade, until AMD started the APU lines. It also seems like they just rested on their laurels in enterprise until AMD came out with Epyc and Threadripper, which have been the better option across the board there. I think they were expecting to just strongarm the market like they did when AMD had the Opteron series, as that's their MO. The comments about an enterprise focus are a given; at best they want a piece of that plus the Quadro/FireGL market. Consumer is an afterthought. They are quite lucky they still have such a large market share considering their CPUs have been the lesser option since at least 2018 in enterprise and consumer. I think the only thing going for them is that the schedulers are terrible in any OS you use, and software being outsourced to subpar outfits across the board helps them too. If the USA and Europe were still merit-based, they would likely be a quarter to half of what they are today, even if the company survived at all.

        • 3 weeks ago
          Anonymous

          >Pat has already cut the consumer GPU team as far as he can without it doing damage to his enterprise ambitions so there's not much lower the consumer GPU side will face.
          That's too bad, I was really hoping they'd keep the consumer GPU side a bit less segregated, unlike NVIDIA and AMD, and offer relatively inexpensive GPUs that weren't so (needlessly/software) crippled with regards to FP32/64 and such, and not be so israeli about RAM either. They could really break into the scene that way, get a ton of mindshare, and have the best shot at doing something about the CUDA monopoly. But then again, it is Intel we're talking about. They clearly have no cohesive long-term vision, or at least not the will to actually see it through. People can say what they will about NVIDIA, but they had a vision and have steadily kept pushing it for nearly two decades. Their current bonanza profits aren't from mere chance.

    • 4 weeks ago
      Anonymous

      2 more weeks
      trust the plan.

      https://i.imgur.com/KaGMGuB.jpg

      >Will it ever be good?

      should have focused on apu instead

    • 4 weeks ago
      Anonymous

      DOA

      • 4 weeks ago
        Anonymous

        Because it doesn't have "Nvidia" on the box.
        My godbrand > your consumer products.

    • 4 weeks ago
      Anonymous

      Gamer words is a bad sign.

      • 3 weeks ago
        Anonymous

        >Gamer words is a bad sign.
        It's a good sign actually, if you don't care about gaming, buy an overpriced NVIDIA GPU.

    • 3 weeks ago
      Anonymous

      frick off moron. intel dGPUs don't even have a chance at being good until celestial

      >They'll refocus their GPU efforts on the enterprise market.
      >Battlemage is already too late.

      this fricker knows. intel lost enterprise CPU share to AMD, but more importantly to ARM, since gayMAN and China can basically design their own server CPUs now. Intel knows AMD is incompetent as frick with GPUs, so targeting AMD GPUs in enterprise is basically low-hanging fruit.

      • 3 weeks ago
        Anonymous

        >gayMAN and China can basically design their own server CPUs now
        Google surely doesn't, yet anyway

        • 3 weeks ago
          Anonymous

          because GCloud is still a fricking joke compared to AWS and even Azure and Google is a fricking joke at hardware. but they're still working on it obviously since ARM already did 90% of the heavy lifting with Neoverse cores.
          https://www.theregister.com/2023/02/14/google_prepares_its_own_custom/

          • 3 weeks ago
            Anonymous

            >Google is *understood* to be developing its own ARM chips
            If it's happening, it hasn't landed in a single DC yet
            Ampere servers are in production tho

  2. 4 weeks ago
    Anonymous

    Probably. Nvidia is intentionally pricing themselves out of the consumer market and AMD isn't trying, which leaves market share open to a "fledgling" consumer gaming GPU brand. If I recall correctly, Alchemist had actual hardware issues that held it back. And even then, their drivers are massively better than they were at launch. If Intel keeps this momentum up, they could easily overtake Nvidia everywhere but the super high-end space (4090/Titan "I literally make too much money and buy things in order to feel anything/I'm REALLY bad with money/I have a severe porn addiction" tier). I actually plan on replacing my 3080 with a mid/high-tier Celestial or Druid (it entirely depends on whether or not I care about any games that need a better GPU) as long as it's reasonably priced like Alchemist and plays nice with emulators.

    • 4 weeks ago
      Anonymous

      >4090/Titan
      >"I have a severe porn addiction" tier
      is there anything outside of AI text/image gen and VR video/VaM I could utilize a high tier card for?
      t. I have a severe porn addiction

      • 4 weeks ago
        Anonymous

        To flex on other people and feel bad about spending large amounts of money on things you're only going to be using to waste your time

        • 4 weeks ago
          Anonymous

          Pretty close to the Janny lifestyle right there.

        • 4 weeks ago
          Anonymous

          you buying shit to impress people on IQfy? are you moronic?

          • 4 weeks ago
            Anonymous

            Moderation shelters public masturbation, so he'd be in good company here.

      • 3 weeks ago
        Anonymous

        You aren't missing out on anything. The best coomer models are based on SD 1.5, which can only generate 512x512 images.

      • 3 weeks ago
        Anonymous

        raytraced voxel engines written in C with Vulkan

    • 3 weeks ago
      Anonymous

      Massively better drivers coming from non-working isn't saying much, given how low that bar was. Kinda surprising they even released a product that was essentially useless and non-functional to such a large degree.

      • 3 weeks ago
        Anonymous

        This is not bad considering the faulty architecture. Perfectly reasonable as a budget/1080p GPU for a large number of people.
        The real question at this point is if they didn't physically goof up with Battlemage.

  3. 4 weeks ago
    Anonymous

    Linus Tech Tips™ told me this is trash so no.

    • 4 weeks ago
      Anonymous

      this, he also said race mixing is good!

      • 3 weeks ago
        Anonymous

        His kid also killed the family cat. I don't hate any other YouTube channel more than LTT.

        • 3 weeks ago
          Anonymous

          when I google it it says the nanny he hired killed the cat

    • 3 weeks ago
      Anonymous

      He actually said it was good

    • 3 weeks ago
      Anonymous

      Based

      • 3 weeks ago
        Anonymous

        Verily.

    • 3 weeks ago
      Anonymous

      Linux Snipped Tips

  4. 4 weeks ago
    Anonymous

    only reason im not using one is their shit performance/watt

  5. 4 weeks ago
    Anonymous

    how many cuda cores does it have

    • 3 weeks ago
      Anonymous

      The a770 has 512

  6. 4 weeks ago
    Anonymous

    How can you frick up a videocard so bad?

    • 4 weeks ago
      Anonymous

      they outsourced the software end to russian devs, lol lmao
      big tech never learns

      • 3 weeks ago
        Anonymous

        >outsourced to russian devs
        doubt

  7. 4 weeks ago
    Anonymous

    Depends on the state of Linux drivers

  8. 4 weeks ago
    Anonymous

    they'll have serviceable igpus so amd doesn't completely sweep them in the laptop market but that's about it

  9. 4 weeks ago
    Anonymous

    Anon, they're already good. I bought an A380 for $90 and have been using it as a second gpu for capturing 4k 60fps AV1 gameplay and it just frickin werks. Would love to use it in a dedicated jellyfin server or something too.

    • 3 weeks ago
      Anonymous

      I've been thinking about a 380 or 580 specifically for AV1. Well, that and the fact I'm still on a 1060 3gb kek. I don't game much, and when I do it's 1080p. So either one would be better than what I have now.

      • 3 weeks ago
        Anonymous

        >I bought an A380 for $90 and have been using it as a second gpu for capturing 4k 60fps AV1 gameplay and it just frickin werks
        Please god tell me this works for Linux without some unholy setup

        • 3 weeks ago
          Anonymous

          It does, you just need to get the drivers working for your desired distro, but I know for a fact it just werks on Ubuntu.

    • 3 weeks ago
      Anonymous

      So you don't need a separate computer to do that?
      How do most games determine which GPU to use if they don't have a selector?

    • 3 weeks ago
      Anonymous

      It'll be better than it currently is, but good? Its competition is split between incompetents (AMD) and grandmasters who under-VRAM and overprice their hardware rather than offering variants of their workstation GPUs at decent prices for AI slop.
      You'd think Intel wouldn't have such a difficult time getting ARC to work. They've been making GPUs for well over a decade; this is just a matter of making stronger, dedicated ones, and they could easily build off what they already do for integrated GPUs.

      >they're already good
      >I own one that I use as a media encoder and not for graphics at all
      Wow yeah what a frickin' testimonial you've got there, dipshit. Though that does give a pretty nice use case for a new product: dedicated encoder cards.
      I have an A380. It struggles to run Final Fantasy X of all fricking games, but that might be a shitty port, no resizable BAR (6th gen intel cpu + board), and PCIe 3.0 working against it. For its primary use case, encoding and decoding, it works a damn sight better on a 4K HDR display than the GT1030 it replaced.

  10. 4 weeks ago
    Anonymous

    I have an Arc A770 17G LE in my second PC. Compared to what the drivers were like when it was released, it has come a long way and is significantly better than one would expect. Sure, it's on par with a 3060 in performance, but for the price at the time of $550 AUD (roughly USD $360) and with how nice it looks, it's a damn good card. However, you NEED to overclock it, or it's a wasted card.
    Hopefully Battlemage turns out to be great, but who the frick knows at this stage.

    • 4 weeks ago
      Anonymous

      >17G
      I meant 16GB. Speedtying.

    • 4 weeks ago
      Anonymous

      I'm looking forward to Battlemage too, daily drive the A770 for work (don't ask) and it works well for all my dogfooding tasks. If prices don't come down on AMD and NVidia I'll get one for daily use at home too, it's the only reasonably priced graphics card now.

      • 4 weeks ago
        Anonymous

        >daily drive the A770 for work (don't ask)
        I want to ask

        • 4 weeks ago
          Anonymous

          I work on a large legacy live service. Players run it on all kinds of hardware you don't expect. I got volunteered for the A770.

          • 4 weeks ago
            Anonymous

            everquest?

          • 4 weeks ago
            Anonymous

            What it is doesn't really matter, but DX9 is still somewhat supported, so Shader Model 2.0's limitations are in full force. The A770 is interesting because it uses a translation layer for DX9 games, I think it was DXVK. Either way it works great for what I work on, performance is more than adequate.

      • 4 weeks ago
        Anonymous

        >daily drive the A770 for work
        do you work at intel

    • 3 weeks ago
      Anonymous

      >Had a 16gb a770 since launch, it's been good.

      I'm interested in ARC cards and been considering getting an A770 for my next upgrade, though I will probably wait for the next generation.

      One thing I've been curious and concerned about is API support. How do ARC cards and Intel drivers fare in API support for games? Vulkan, OpenGL, DX12, DX11, older games using DX10, DX9, and older versions of OpenGL, do they work and run well? What about emulators, are there any issues with software such as PCSX2, RPCS3, Xenia, Dolphin, Cemu, etc? Windows user, will only be doing 1080p gaming with a 60 fps target.

      • 3 weeks ago
        Anonymous

        >Good
        Vulkan, DX12, DX9
        >In progress
        DX11 - they're almost done with the DX11 upgrade
        >Bad
        OpenGL

        I've only used PCSX2 and RPCS3 on Windows and they work fine.
        I haven't heard any commitment from Intel about OpenGL, and they say the next set of driver updates after DX11 is DX12. Which makes me doubt that they'll put in any serious effort.

        • 3 weeks ago
          Anonymous

          >Bad OpenGL
          Huh, does this also apply to Linux?

          • 3 weeks ago
            Anonymous

            it's the implementation of their opengl driver, why wouldn't it?

          • 3 weeks ago
            Anonymous

            Well because Intel is known for putting a lot of effort into their opensource Linux OpenGL drivers

          • 3 weeks ago
            Anonymous

            Yes.
            Intel only cares about compute on Linux.
            The gaming performance is simply awful.

        • 3 weeks ago
          Anonymous

          Thanks for the reply anon
          >OpenGL
          >Bad
          Why is it bad, is it only poor performance, or are there stability / compatibility issues?
          OpenGL performance has always been a weakness for AMD too, since the ATI days. At least on Windows.

          Interesting to know that Vulkan performance is good. All my Nvidia cards always sucked at Vulkan. I mean, it works, but performance is usually at least slightly worse than DX12 or DX11.
          I'm a bit concerned about DX9 because of this:

          >The A770 is interesting because it uses a translation layer for DX9 games, I think it was DXVK.

          even though I've been using DX9 less and less, and even current nVidia drivers and cards have issues with many DX9 titles that require me to resort to DXVK, DXGL or dgVoodoo.

  11. 4 weeks ago
    Anonymous

    it's not that great on linux yet. many games don't start with either driver, raytracing only works in quake2rtx, xess only works with dp4a

  12. 4 weeks ago
    Anonymous

    It will be the best when Intel kneecaps Radeon and Nvidia on their processors.

    • 4 weeks ago
      Anonymous

      Pretty sure that would be noticed very quickly once someone compares systems with Intel CPUs vs AMD CPUs; AMD makes some very good gaming CPUs right now, so it would be hard for Intel to get away with that.

  13. 4 weeks ago
    Anonymous

    Reddit says it's better than Nvidia and I should buy it.

    • 3 weeks ago
      Anonymous

      >Reddit says it's better than Nvidia and I should buy it.
      Reddit also said you need to cut your dick, would you do that as well?

      • 3 weeks ago
        Anonymous

        I would die for Reddit.

  14. 4 weeks ago
    Anonymous

    IQfy* says it's not a 4090 and lacks features that I and most people never use anyway, so it's bad.
    * Shills and game developers and publishers with no talent.

  15. 4 weeks ago
    Anonymous

    Had a 16gb a770 since launch, it's been good.
    From a pure consumer ethics standpoint: AMD doesn't even try, and nvidia will fleece consumers for every penny they have.
    Intel shit out a 16gb midrange card for a relatively low price and has been dumping effort into salvaging its potential since.
    I trust that product more. And it's been a good experience since, it does everything I need it to.

  16. 4 weeks ago
    Anonymous

    https://gitlab.freedesktop.org/drm/xe/kernel/-/issues/234
    already abandoned by intel

    • 4 weeks ago
      Anonymous

      God I hate gitlab. GitHub is shitty as hell too, but their issue tracker is much easier to read through.

      • 3 weeks ago
        Anonymous

        What's difficult about gitlab? Brainlet

    • 4 weeks ago
      Anonymous

      >be shitel
      >literally only market for their GPU is a replacement for the buggy as hell AMDGPU driver in Linux (with a mostly functional userland at least).
      >implement all this effort to support mostly the same-ish shit for a platform that Nvidia already dominates (windows) and then cuck out over likely business-related idiocy about muh IP

      I legitimately hope Intel GPUs fail. I ended up buying a shitty AMD GPU because even the Alder Lake integrated GPU is completely fricking worthless on Linux. Maybe Intel pulled their head partially out of their ass and fixed a lot of it since then, but reading this and the mesa anv driver issues, it sounds like a lot is lacking.

      Who the frick is running this shit show? Can Intel fire that idiot?

      • 4 weeks ago
        Anonymous

        i915 is stable, but in terms of gaming on it, it's roughly where AMDGPU was in the Vega era. Xe is going to be their equivalent rewrite and it's going to take time before it is remotely close to AMD. And your iGPU worked even back then, but it's practically a requirement to run bleeding edge kernels to get new hardware working without issues on Linux. Arc has no issues with day-to-day tasks; gaming and AI are just where it's better than AMD but not Nvidia.

        • 4 weeks ago
          Anonymous

          No, it did not work then and I only use upstream kernels from kernel.org.
          It flickered on and off like cheap garbage.
          It's still inexcusable given that WDDM and the DX12 driver just works on Windows and the code you're seeing in Xe and i915 is probably just a partitioned view of it.

      • 3 weeks ago
        Anonymous

        >i915 is stable but in terms of gaming on it, it's roughly where AMDGPU was in the Vega era.
        >I ditched my A770 because the Linux drivers kept sucking.
        >it's not that great on linux yet.
        >Depends on the state of Linux drivers

        >people unironically pretending that Intel or literally any company gives a shit about L*nux users
        Every company always assigns like 2 pajeets to work on Linux support and lets the freetard "contributors" take up the rest of the slack

        • 3 weeks ago
          Anonymous

          Intel is one of the biggest contributors to the kernel.

        • 3 weeks ago
          Anonymous

          Sure, but AMD and NVidia have better pajeets assigned.

        • 3 weeks ago
          Anonymous

          And thus what I said is still true.
          The only market for this GPU is mass OEM slop (which is already eaten by iGPUs anyhow) or GNU + Linux.
          Intel is full of morons and your post just reaffirms my stance they're moronic.

          • 3 weeks ago
            Anonymous

            Linux users are not a serious market for any hardware device in the world.
            There are only a tiny number of hardware manufacturers that actually offer first class, high quality driver support for Linux (rather than just letting the freetards work things out themselves). Brother printers is one I can think of.

          • 3 weeks ago
            Anonymous

            >Linux users are not a serious market for any hardware device in the world.
            And thus why Intel's shitty GPU failed and will continue to fail.
            Keep re-affirming what I already said I guess.

          • 3 weeks ago
            Anonymous

            >they failed because muh mighty and prosperous users of Linux didn't buy them
            linux users are poor and don't buy anything new anyway.
            marketing any hardware product towards linux users is a mistake

          • 3 weeks ago
            Anonymous

            stop projecting your poverty onto others, wincuck.

          • 3 weeks ago
            Anonymous

            lol
            lmao
            desktop linux has no market share. enterprise is not buying anything that isn't nvidia. you will always be a cuckold

    • 3 weeks ago
      Anonymous

      Wow, that's impressive. Here I was concerned about AMD having driver bugs that last a year after launch, but then there's Intel straight up half-dropping one driver and half-finishing a totally different one.

  17. 4 weeks ago
    Anonymous

    Intel arc more like
    Intel ACK

  18. 4 weeks ago
    Anonymous

    It may never be good. But at least it will never be bad

  19. 4 weeks ago
    Anonymous

    What?
    If ur buying a graphics card rn and ur budget is south of $350 and you dont buy an A770 ur a fricking moron
    Unless ofc u desperately need the nonexistent ray tracing performance pushed out by the 4060 ti

    • 3 weeks ago
      Anonymous

      >t. gaymer child
      Nvidia is the only choice that exists when it comes to real work like renders or ML

      but yes for gayming arc is fine

      • 3 weeks ago
        Anonymous

        >nvidia is the only choice except for 98% of users
        k?

  20. 4 weeks ago
    Anonymous

    They certainly have supported the Alchemist cards with pretty good updates. If Battlemage is priced well and Nvidia backs off from the low/mid end, we might actually have a decent competitor against AMD.

  21. 4 weeks ago
    Anonymous

    Wait for Xe squared
    if that also sucks then no probably never gonna work out

  22. 4 weeks ago
    Anonymous

    >In summary, Intel ARC GPUs are promising but come with caveats. Keep an eye on the latest coverage to make an informed decision.

    tl;dr - not today, but maybe someday

  23. 4 weeks ago
    Anonymous

    No.
    It has the same issue as AMD, where older stuff runs like crap or doesn't run at all on it.

  24. 4 weeks ago
    Anonymous

    doa

  25. 4 weeks ago
    Anonymous

    Geohot says AMD GPUs are dogshit and that even Intel has better documentation and closer bare-metal access to the hardware.

    AMD firmware is just shit, they aren't even trying, and they're trying to ride the AI train with their AI Ryzens when those are dogshit.

    • 4 weeks ago
      Anonymous

      >even Intel has better documentation
      pretty sure intel is known to have good documentation

    • 4 weeks ago
      Anonymous

      >Geohot says AMD GPUs are dogshit
      For ML compute pegged for tens of hours.
      They're fine for gayming.

    • 3 weeks ago
      Anonymous

      > Intel has better documentation and closer bare metal access to hardware.

      Elaborate. I was under the impression that AMD was very good in terms of having open technologies. Hell, they even publish the ISA of their GPU

  26. 4 weeks ago
    Anonymous

    I ditched my A770 because the Linux drivers kept sucking.

  27. 4 weeks ago
    Anonymous

    old games suck on it

  28. 4 weeks ago
    Anonymous

    One day it's gonna be better than AMD, considering how badly AMD competes against Nvidia.

  29. 4 weeks ago
    Anonymous

    Desperate need for AMD and Intel to step up.

    That shitty cucked 4060Ti is still good relative to its competition, which it should not be.

  30. 4 weeks ago
    Anonymous

    no its shit

  31. 4 weeks ago
    Anonymous

    Well intel is owned by israelites so no lmfao i still don't understand why people continue to give this company that is literally based in tel aviv

    • 4 weeks ago
      Anonymous

      I only root for Intel in the GPU space in hopes of them introducing some actual competition.
      In reality, I know Nvidia will abandon consumers and Intel will just take their place as the big anti-consumer israelite.
      AMD simply doesn't care. They don't even reel in prices to actual inflation levels (Imagine if the 7900XTX launched at $750) to be less israeli.

      • 4 weeks ago
        Anonymous

        I understand where you are coming from dude it doesn't making buying gpu's any easier especially when all company's are basically israelited

        • 4 weeks ago
          Anonymous

          Exactly. GPUs are a racket. Have been since about 2016/2017.

          • 4 weeks ago
            Anonymous

            I fricking miss ATi, it was actually a decent company. Such a shame what happened to them.

      • 4 weeks ago
        Anonymous

        Honestly the 7900XTX was probs the only card worth buying from AMD, the rest of the lineup was pretty mediocre

    • 3 weeks ago
      Anonymous

      >product BAD
      >not due to price or any actual issue with the product
      >product BAD because I don't like the ethnicity of people involved somewhere along the line
      Grow up.

  32. 4 weeks ago
    Anonymous

    Battlemage running on the xe drivers will have much better support from day one than the older cards did.
    It will probably be pretty usable on Linux for most people shortly after release.
    Hopefully intel keeps solid support going for battlemage and avoids the numerous issues the first gen cards had

    • 4 weeks ago
      Anonymous

      Keep dreaming dude

  33. 4 weeks ago
    Anonymous

    Anyone know if the Arc cards will be TSMC forever, or will Intel switch to their own fabs at a later generation?

    • 4 weeks ago
      Anonymous

      I work at Intel. I work in the fab. I make Meteor Lake n stuff.

      I have no idea what the answer to your question is, they don't tell us lowly tech b***hes these things

      • 4 weeks ago
        Anonymous

        >I've heard that battlemage and possibly celestial have been booked at tsmc
        >TSMC is on the roadmap for the foreseeable future up to Nova Lake

        Thanks anons, that's a shame. I was hoping Intel would fab them in-house to lower the risk of GPU shortages, but oh well.

        • 4 weeks ago
          Anonymous

          Intel isn't constrained by TSMC right now; they need people to actually buy their GPUs to even have a shortage, and that isn't happening. No one outside of insane individuals like myself is using them, and we pay a price for it, like me not being able to run Dragon's Dogma 2 and needing to use my Steam Deck to power through it, getting motion sickness at a silky smooth 15 to 20 FPS dipping into the single digits. But the fact that Intel is getting its shit together faster than AMD on the AI front makes me hopeful that Intel knows they have a niche and a way to attack Nvidia's fortress on AI and other fronts, so it will eventually be worth buying an Arc GPU for more than just saving money on a good deal.

          • 4 weeks ago
            Anonymous

            *run Dragon's Dogma 2 on Linux, my bad.
            It works fine on Windows, and even got a Day 1 driver.

    • 4 weeks ago
      Anonymous

      I've heard that battlemage and possibly celestial have been booked at tsmc

    • 4 weeks ago
      Anonymous

      They want to move everything back eventually, I would imagine, but as of January, TSMC is on the roadmap for the foreseeable future up to Nova Lake, which is 2026, with the N2 node for at least part of it.
      https://www.techpowerup.com/318454/intel-reportedly-selects-tsmcs-2-nanometer-process-for-nova-lake-cpu-generation

  34. 4 weeks ago
    Anonymous

    It's awesome for content creation and stable diffusion
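    For anyone curious what "stable diffusion on Arc" actually looks like, a minimal sketch along these lines should work, assuming you've installed diffusers plus intel-extension-for-pytorch (which is what exposes the "xpu" device); the model name and prompt are just placeholders:

        # Minimal SD 1.5 sketch on an Arc card via Intel's PyTorch extension.
        # Assumes intel-extension-for-pytorch and diffusers are installed and the
        # driver stack exposes the XPU device; model and prompt are placeholders.
        import torch
        import intel_extension_for_pytorch as ipex  # noqa: F401  (registers the "xpu" device)
        from diffusers import StableDiffusionPipeline

        assert torch.xpu.is_available(), "Arc GPU not visible to PyTorch"

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
        )
        pipe = pipe.to("xpu")  # move the weights onto the Arc card

        image = pipe("a photo of a graphics card on a desk",
                     num_inference_steps=25).images[0]
        image.save("out.png")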

  35. 4 weeks ago
    Anonymous

    >Will it ever be good?

    I hope so. Even though I have an Nvidia GPU (Nvidia is currently the only practical choice for 3D rendering, sadly), I want Battlemage to break the ice. I bought an A770 back in March last year but had to return it less than a week later because the hardware was defective (first shaders would bug out, and then later games wouldn't launch at all), so I'm hoping they'll iron out a lot of the hardware kinks when BM comes out later this year.

  36. 4 weeks ago
    Anonymous

    Hopefully. AMD has grown complacent as what is functionally just another Nvidia market segment.
    A usable GPU that is futureproof, reasonably priced, has usable vGPU (without enterprise frickery) and other features would be nice.

  37. 4 weeks ago
    Anonymous

    If Intel doesn't give up on it. That is always a possibility.

  38. 4 weeks ago
    Anonymous

    if intel doesn't disrupt the gpu market you better hope you won't need them anymore because nvidia is gonna charge an arm and a leg for one and amd will just keep raising prices to match them while offering a subpar service

  39. 4 weeks ago
    Anonymous

    Will consider Battlemage for my leenuks workstation. I'm confident Intlel will deliver proper leenuks drivers near release time

    >captcha: DP00t8

  40. 4 weeks ago
    Anonymous

    I don't know. Alchemist was priced really low and, while shit, at least had hardware AV1 support and QuickSync Video. Using an A380 as a 6GB media accelerator card in my jellyfin server has been brilliant. Faster-than-realtime HEVC to AV1 transcoding made it a breeze to compress my library.
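    For reference, the batch transcode is roughly the sketch below, assuming an ffmpeg build with QSV support and the Arc card exposed under /dev/dri; the directories and quality settings are placeholders, so adjust them for your own library:

        # Rough batch HEVC -> AV1 transcode sketch using the Arc's QSV engines.
        # Assumes ffmpeg was built with QSV/VPL support; paths and quality are placeholders.
        import subprocess
        from pathlib import Path

        SRC = Path("/media/library")   # hypothetical source directory
        DST = Path("/media/av1")       # hypothetical output directory

        for f in sorted(SRC.glob("*.mkv")):
            cmd = [
                "ffmpeg",
                "-hwaccel", "qsv", "-c:v", "hevc_qsv",    # decode HEVC on the Arc
                "-i", str(f),
                "-c:v", "av1_qsv", "-preset", "veryslow",  # encode AV1 on the Arc
                "-global_quality", "28",                   # ICQ-style quality target
                "-c:a", "copy",                            # pass audio through untouched
                str(DST / f.name),
            ]
            subprocess.run(cmd, check=True)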

    Battlemage doesn't look like it will even have that niche. I don't think they'll come anywhere near Nvidia at gaming, but depending on pricing they might compete in the poorgay mid and bottom range with shitty drivers niche space. In terms of AI, oneAPI has zero support outside of Adobe, though obviously Intel can deliver a better product than ROCm in 2 weeks of work. Not gonna beat CUDA ever, but maybe they can rattle the throne.

    The real kicker is that intel's gpu department is run by Raja Poodori. You'd have to be a fricking insane amnesiac moron to pretend that that streetshitting curry munching pajeet with shit on crusty fingernails will deliver a good product. Especially now that he's had a couple of year of ranjeeting the department.

    • 4 weeks ago
      Anonymous

      Raja Sir jumped ship over a year ago.

      • 4 weeks ago
        Anonymous

        Oh shit, nice. Did intel do layoffs yet to unpajeetify the department?

        • 3 weeks ago
          Anonymous

          no their driver team is still outsourced to india

  41. 3 weeks ago
    Anonymous

    they should focus on fixing their trash cpus

    • 3 weeks ago
      Anonymous

      they make the fastest desktop cpu, and their cpus have lower idle power consumption. plus, hybrid architecture is brilliant. so, not sure what youre on about

      • 3 weeks ago
        Anonymous

        >CPUs are fine, just a node behind
        >I'd rather have an Intel CPU than deal with boot times, melting CPUs or whatever else PC tuner larpers like to ignore.

        It's always funny to see people who bet on the wrong horse try to justify it to themselves.

        • 3 weeks ago
          Anonymous

          Intel was literally the better value lol
          $544 14900k + half price high-end Z690 board
          what am I supposed to regret?

          • 3 weeks ago
            Anonymous

            You are arguing with a pajeet, anon. You need to keep in mind that $544 is not just one wealth for them. It's two whole wealth and a half.

        • 3 weeks ago
          Anonymous

          >actually pretending AMD processors are better
          We're long past 14nm+++++++++++++++ 4C8T days at intel. Don't attach RE-l to such low quality posts.

          • 3 weeks ago
            Anonymous

            Even in those days you exclusively wanted Intel and Nvidia for reliability and gaming FPS. Even the much-nostalgia'd Athlon 64 was notably inferior to Intel's Pentium 4 processors. Sure they ran hot, but that was largely overblown by Anandtech and the AMD-biased press. I've been a loyal Intel/Nvidia gamer since 2001 and they've never steered me wrong, unlike the shit I hear from AMD customers (Bulldozer and drivers come to mind). I have never and will never buy an AMD product. Too much trouble and too little functionality for the negligible price-to-performance increase, if there even is one.

          • 3 weeks ago
            Anonymous

            >AMD-biased press
            >never bought an amd product
            >spouts shit about things he has no clue about
            many such cases

          • 3 weeks ago
            Anonymous

            Oh hey, it's Userbenchmark mod

          • 3 weeks ago
            Anonymous

            >I've been a loyal Intel/Nvidia gamer
            >fanboyism
            Opinion disregarded.

    • 3 weeks ago
      Anonymous

      CPUs are fine, just a node behind
      I'd rather have an Intel CPU than deal with boot times, melting CPUs or whatever else PC tuner larpers like to ignore.

      • 3 weeks ago
        Anonymous

        >boot times
        Dear heavens, how can we possibly live with a one-time 2-minute boot to configure XMP profiles? It's over...

        • 3 weeks ago
          Anonymous

          It's not just the initial config, it's any boot, and MCR is basically a prayer that entropy hasn't kicked in enough to invalidate the previous training, with the risk of BSODs or data corruption.
          It's super hilarious that AMD users are suffering through HDD-tier boot times because AMD is too busy talking out of both sides of their damn mouth.

          • 3 weeks ago
            Anonymous

            >HDD
            Windows itself has sucked on HDDs since 8. Not my problem because SSDs are cheap now.

          • 3 weeks ago
            Anonymous

            I never said they are actually booting off HDDs, but the memory training is bringing back multi-minute boot times as if they had HDDs. Literal minutes just having the UEFI do its bullshit.

          • 3 weeks ago
            Anonymous

            That happens once, not every startup.

          • 3 weeks ago
            Anonymous

            It can happen literally every boot.
            Memory context restore is supposed to help with this exact issue, but it comes with its own set of issues: despite AMD basically endorsing 6000MHz EXPO memory at every turn, not all of their CPUs can run it reliably without retraining constantly.

          • 3 weeks ago
            Anonymous

            It happens when your power goes out or you completely unplug it. You're being an illiterate fanboy.

          • 3 weeks ago
            Anonymous

            I'm on a 1700x booting off an SSD, with 4 spinners and another SSD. My pc has always booted in under 30 seconds.

  42. 3 weeks ago
    Anonymous

    It already is lol. Easiest GPU to use on linux too.

  43. 3 weeks ago
    lorry

    Yes one day it will be good.
    And on that day they will kill the project because upper management got bored and wants everyone to chase the next big hype.

  44. 3 weeks ago
    Anonymous

    >A380 master race
    I bought this card when it was released and i've never had an issue with it, matter of fact i had more issues with amd gpus than arc

    anybody that tells you arc isn't good is a lying homosexual

  45. 3 weeks ago
    Anonymous

    >Will it ever be good?
    In three years, sure
    In five years, it might even be usable on Linux

  46. 3 weeks ago
    Anonymous

    can somebody give me a qrd on these, I wanted to get one for my old amd fx shitbox but read somewhere they only worked with intel cpu/mobo

    • 3 weeks ago
      Anonymous

      you need ReBAR support in your mobo to fully utilize these GPUs
      check if your mobo can be modded with https://github.com/xCuri0/ReBarUEFI as it's highly unlikely it supports ReBAR natively out of the box.
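      if you're not sure whether ReBAR actually took effect after flashing, a quick-and-dirty check on Linux is to look at the GPU's BAR sizes. Something like the sketch below works (it assumes lspci is installed and treats a multi-GB memory window as the sign that ReBAR is active):

          # Quick-and-dirty check for whether Resizable BAR is active on an Intel GPU:
          # with ReBAR enabled, one of the card's memory regions should be several GB
          # instead of the usual 256M window. Assumes lspci is installed.
          import re
          import subprocess

          out = subprocess.run(["lspci", "-v", "-d", "8086:"],  # Intel vendor ID
                               capture_output=True, text=True).stdout

          for block in out.split("\n\n"):
              if "VGA compatible controller" not in block and "Display controller" not in block:
                  continue
              sizes = re.findall(r"Memory at .*\[size=(\d+[KMG])\]", block)
              print(block.splitlines()[0])
              print("  BAR sizes:", ", ".join(sizes) or "none reported")
              print("  ReBAR looks", "active" if any(s.endswith("G") for s in sizes) else "inactive")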

  47. 3 weeks ago
    Anonymous

    Eventually it will. After watching the Intel man on gamers nexus explain with passion how drivers work I have full faith in Intel.

    I will be buying a battlemage card to replace my 6750xt.

  48. 3 weeks ago
    Anonymous

    ONE DAY, THIS WILL RUN WELL ON LINUX
    AND ON THAT DAY, I SHALL BUY IT

  49. 3 weeks ago
    Anonymous

    you can flash it to become a workstation card and unlock the workstation features that are locked for gaymers.

    • 3 weeks ago
      Anonymous

      Not quite yet; we don't know how to unlock SR-IOV on the card, plus some other stuff, but that's the only real hindrance normal people would be interested in. It may already be something people have done, but I think it's being sat on because Battlemage may release soon, and either you want this unlock to kill two birds with one stone, or you want to wait until Battlemage is announced before you "kill" the older product.

  50. 3 weeks ago
    Anonymous

    their design is pretty comfy, i liked it so much more than all the AMD variants. nvidia still has some pretty cool designs but their AIB designs look like shit tho
    but these still aren't coming to asia anytime soon so wtv, i dont care

  51. 3 weeks ago
    Anonymous

    I unironically want to buy some Intel stonks. Sure, they're shit right now, but I feel like they're where AMD was in the 2010s. I want to slap myself for not buying in back then, I was just a click away and backed out.

    • 3 weeks ago
      Anonymous

      i've been buying for a few months, that shit gonna fly in 2, 3 years imo

  52. 3 weeks ago
    Anonymous

    The A770 was already good. They are cheap cards, but I managed to run Fallout 4 at 4K with settings at "high", running between 40 and 60 fps. I was surprised by the performance and how much it increases every time they release a driver update. It makes me wonder how much unlocked potential there still is, how good is the hardware actually?

  53. 3 weeks ago
    Anonymous

    Thoughts on a580 for a 1700x?

    • 3 weeks ago
      Anonymous

      if you only game on windows and your mobo supports rebar it's a good choice

      >I was surprised by the performance and how much it increases every time they release a driver update.

      all the leaks before release pointed to them targeting 3070 performance

      • 3 weeks ago
        Anonymous

        that's what it appears to be able to do

      • 3 weeks ago
        Anonymous

        msi b450 tomahawk. Mobo shouldn't be the issue, but I'm concerned the CPU may be. Don't really play more than Insurgency and Diablo 3. Want it for the AV1 encoding, otherwise I'd get a 6500/6600. Anything higher and the processor becomes a huge bottleneck.

    • 3 weeks ago
      Anonymous

      I hate intels naming scheme. Why is the a580 more powerful than an a770?

      • 3 weeks ago
        Anonymous

        The A770 is significantly better. But we're now at a point where waiting might be worth it, no matter what brand you're considering.

        • 3 weeks ago
          Anonymous

          Intel is the only true competitor. Amd and nvidia ceos are related. So Intel is the only one I'm willing to buy.

          • 3 weeks ago
            Anonymous

            I don't blame you. Weaker graphics cards, but the pricing is more in line with real inflation rates and the company doesn't openly abuse GPU buyers and then brag about it to investors at every opportunity.

      • 3 weeks ago
        Anonymous

        The 580 is stronger?

        • 3 weeks ago
          Anonymous

          Nope, even the 750 is stronger than the 580.

          >Why is the a580 more powerful than an a770?

          That anon either had a brain fart, is heavily misinformed, or was looking for (you)'s

          • 3 weeks ago
            Anonymous

            That's what I assumed, given the naming structure usually means higher model number = more performance (and money). Anon's post made absolutely no sense, hence the pic. Thanks for confirming.

  54. 3 weeks ago
    Anonymous

    never

  55. 3 weeks ago
    Anonymous

    I have an Arc A750 in my Proxmox server running with passthrough; it has absolutely been a game changer for transcoding, encoding and rendering videos and even 3D models. It works great for that; it even outperformed at transcoding the 3050 Ti I had previously bought just for the server.

    It was a hassle to set up the passthrough, because the drivers needed to be downloaded from an obscure repo they had certainly forgotten existed, but it just werks.
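    If anyone else goes this route, a small sanity check inside the VM before pointing jellyfin at it is to confirm a DRM render node exists and belongs to an Intel device. A sketch using the standard Linux sysfs layout (nothing Proxmox-specific; paths may differ on exotic setups):

        # Sanity check inside the guest: is there a render node, and is it Intel (vendor 0x8086)?
        from pathlib import Path

        nodes = sorted(Path("/dev/dri").glob("renderD*"))
        if not nodes:
            raise SystemExit("no render nodes: passthrough or guest driver not working")

        for node in nodes:
            vendor_file = Path("/sys/class/drm") / node.name / "device" / "vendor"
            vendor = vendor_file.read_text().strip() if vendor_file.exists() else "unknown"
            label = "Intel" if vendor == "0x8086" else vendor
            print(f"{node} -> vendor {label}")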

  56. 3 weeks ago
    Anonymous

    Intel needs to kill off this failed venture and try to salvage their cpu business. Never had to use Process Lasso with my amd processors.

  57. 3 weeks ago
    Anonymous

    Bmage is already known to be a 2x unit increase over Alch. A best-guess estimate puts it (the full-size A770 successor) somewhere between 4070 and 7900XT level. Other details are just rumor, which is odd considering how close to release it should be by now. Recent early leak results are about as reliable as claiming your self-built car makes 400HP before you've even put the engine together.

    Battlemage will be a price-class competitor with soon to be last gen GPUs and 5060Ti/S / 8700XT cards.

    • 3 weeks ago
      Anonymous

      Buttmage DOA in other words

    • 3 weeks ago
      Anonymous

      You're basing your opinion off of early "leaks" that are assuming Battlemage will have a screwed up architecture exactly like Alchemist.
      Let's do the right thing by not pretending we know what's going to happen.

      • 3 weeks ago
        Anonymous

        Any reason not to believe so? If there really were hardware flaws that came to light later on, it's very much likely that it was too late for revisions for Battlemage and thus Celestial will be the first chance to correct things.
        Either way I really hope Intel doesn't just throw in the towel, I really want them to succeed because Nvidia is too expensive and AMD is trash.

        • 3 weeks ago
          Anonymous

          That's a completely unrealistic train of thought and you know it. They've known the problems with Alchemist since mid/late 2022 and it's not like Battlemage is going into mass production in a week. If Intel builds a bad track record over the next couple of generations, I'd agree that they should be removed from your list of options, but they haven't even released their second generation and drivers for their first generation are going pretty darn well. It remains to be seen how far into the future their driver support for older GPUs will go, so this is still shaky ground, but they've earned the benefit of the doubt for now.

        • 3 weeks ago
          Anonymous

          They knew. Raja actually shared in a video that they wrote microbenchmarks, so they knew what the shortcomings in Alchemist were at least by the release of the A770/750, when they shared it with us.

          The main issue we know about is the GPU's memory subsystem and how it is handled from hardware to software, which makes sense since they were moving from how iGPUs work to how proper dGPUs do, plus the fact that the driver needs ReBAR. There may be other things they didn't share with us, but that is the big one they admitted publicly.
          I think they now know definitively what needs to change for Celestial, but they would've known enough internally to change things about Battlemage, since it had already started to be fabbed when people touring Intel's Malaysia fab last August saw, but weren't allowed to photograph, the Failure Lab where they had B10 dies.
          https://twitter.com/aschilling/status/1696468861430690218

  58. 3 weeks ago
    Anonymous

    It would have been the best-selling midrange card for older systems if not for the fricking thing losing 90% of its performance if your system doesn't have ReBAR

  59. 3 weeks ago
    Anonymous

    It could be good enough just like AMD but Nvidia will never lose its dominant position as long as they have the software moat around their GPUs, and the competition isn't catching up anytime soon.
    Prepare yourself for at least a decade of incremental improvements from nvidia trickling down from their AI R&D

  60. 3 weeks ago
    Anonymous

    c'mon intel i trust you
    >Verification not required.

  61. 3 weeks ago
    Anonymous

    With the driver updates, it is good. Especially for the price. Or is this another case of IQfy just glancing at the Linus release-day video and that's that?

  62. 3 weeks ago
    Anonymous

    They should have partnered with AMD to compete with nvidia

    • 3 weeks ago
      Anonymous

      Intel partnered with AMD before but AMD as usual fricked up drivers which is why the cooperation stopped.

      • 3 weeks ago
        Anonymous

        Intel should do the drivers

        • 3 weeks ago
          Anonymous

          Intel has never had good graphics drivers. They've had ones just functional enough to not crash when surfing the webs, but the minute you try to play a game everything goes souf.

      • 3 weeks ago
        Anonymous

        >Intel should do the drivers

        People keep parroting the AMD drivers meme, but Catalyst at its worst was over a decade ago. I also find it hilarious to suggest Intel should do the drivers considering the recent GPU launch, on top of the catastrophes of the Intel Management Engine and various other driver and software issues lmao.

  63. 3 weeks ago
    Anonymous

    I run an A770 on my linux desktop; it is great. I'll buy a Battlemage when it comes out. But I predict that discrete graphics will only exist for 10 more years.

  64. 3 weeks ago
    Anonymous

    No.
