Why did Intel start making GPUs?

  1. 1 month ago
    Anonymous

    Money.

  2. 1 month ago
    Anonymous

    They're winning.

  3. 1 month ago
    Anonymous

    SoCs with slightly impressive but still lukewarm specs are about the only viable market left.

  4. 1 month ago
    Anonymous

    because their CPUs don't sell anymore

    • 1 month ago
      Anonymous

      They always did

      Intel's quarterly revenue from the Consumer Client + DC segments alone is 2x AMD's entire quarterly revenue including GPUs lol. Absolute delusion if you think their CPUs aren't selling.

      • 1 month ago
        Anonymous

        really?

  5. 1 month ago
    Anonymous

    Because the market is overpriced as frick.
    I would if I could.

  6. 1 month ago
    Anonymous

    They already know how to make GPUs after decades of making integrated graphics; it's a no-brainer really.

  7. 1 month ago
    Anonymous

    They've made iGPUs for a long time and thought they could make it big in the dedicated market.
    Also to compete with AMD APUs.

  8. 1 month ago
    Anonymous

    Why not?

  9. 1 month ago
    Anonymous

    AI

  10. 1 month ago
    Anonymous

    GPU compute is profitable.
    The Arc architecture (and Raja's Vega) is optimised primarily for compute. They've also been putting a tonne of effort into oneAPI devtools (a Python-side sketch follows below).

    Also, the second most used parallel compute hardware right now is Intel Xeons, so they want fully Intel solutions to offer businesses interested in GPU compute.

    Gaming GPUs are needed for mindshare since business moves slower than consumers.

    • 1 month ago
      Anonymous

      He’s working for Intel now and not at AMD because he made a shitty GPU at AMD. Now he’s making a shitty GPU for Intel. Circle of life. Arc isn’t on the same planet as NVIDIA in terms of compute, either.
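
    For what the oneAPI devtools look like from Python, here is a minimal sketch enumerating SYCL devices with Intel's dpctl package. The exact properties printed come from dpctl's documented API and should be treated as assumptions, not something verified in this thread.

      # Enumerate SYCL devices via Intel's dpctl (oneAPI Python stack).
      # Assumes dpctl and a SYCL runtime are installed.
      import dpctl

      for dev in dpctl.get_devices():
          # Each SyclDevice reports its backend (e.g. level_zero, opencl),
          # its device type (cpu/gpu), and its marketing name, so an Arc
          # card shows up here alongside the host CPU.
          print(dev.backend, dev.device_type, dev.name)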

  11. 1 month ago
    Anonymous

    The same reason they’re getting into pay-to-play fabbing — the market is super hungry for more capacity, and Intel is positioned to serve it. Too bad they couldn’t bring on a more talented engineering team; the group will end up getting ditched like SSDs were, but they’ll at least mix things up some before that. Replacing Raja is off the table, though; Intel will keep the shitter to the end.

  12. 1 month ago
    Anonymous

    backdooring more goyim

  13. 1 month ago
    Anonymous

    The pandemic GPU shortage, plus the high prices from the crypto craze then and the AI hype cycle now, gave them an opening to enter the lower-end GPU market.

    • 1 month ago
      Anonymous

      Pandemic and crypto are BS excuses for soiface consumers. In reality the Asian foundries (TSMC and Samsung) screwed the pooch and either failed at or purposefully restricted yields, which kept cutting-edge nodes at a high price. THAT’S why GPUs cost so much now. Intel, which had just invested in a major EUV fab line, was positioned to exploit this gap.

    • 1 month ago
      Anonymous

      >AI hype cycle gave them an opening to enter the GPU lower end market
      This makes no sense. AI runs on CUDA. Only torch has very limited support for AMD ROCm on Linux. Arc doesn't run CUDA, and I haven't heard that they're making their own alternative to it. Even if they did, I wouldn't expect any support from either torch or tensorflow (TF doesn't even support ROCm, so what makes you think they'd implement Intel's theoretical alternative?).
      Your post is what happens when somebody who doesn't know what they're talking about wants to give his opinion regardless

      • 1 month ago
        Anonymous

        >AI
        AI also runs on OpenVINO.
        >I haven't heard that they're making their own alternative to it
        oneAPI / SYCL
        >Torch
        https://github.com/intel/intel-extension-for-pytorch
        >Tensorflow
        https://www.intel.com/content/www/us/en/developer/articles/guide/optimization-for-tensorflow-installation-guide.html
        (a minimal usage sketch follows at the end of this thread)

        Your post is what happens when somebody who doesn't know what they're talking about wants to give his opinion regardless

        • 1 month ago
          Anonymous

          Are you seriously comparing shitty XPU optimizations to CUDA models on the GPU?

        • 1 month ago
          Anonymous

          All this shit does is make the CPU models slightly faster; this is not training a model on the GPU.
          We’re talking orders of magnitude slower. Just train a model on a free Google Colab card to see the difference for yourself kek

          • 1 month ago
            Anonymous

            NTA, I run LLMs on an Arc A750 8GB easily; Stable Diffusion works like a charm.

          • 1 month ago
            Anonymous

            Stable Diffusion XL models, which suck less, require 10 GB of VRAM minimum.

          • 1 month ago
            Anonymous

            And? Offload to CPU; a bit slower but unnoticeable. For $200 it's more than bang for the buck, or just buy the 16GB card for $300 lol (sketches of both follow this thread).
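
    To make the intel-extension-for-pytorch link above concrete, here is a minimal sketch of moving a model onto an Arc card. The toy Linear model is just an illustration standing in for an LLM, and a working XPU driver stack is an assumption; the code falls back to CPU if none is found.

      import torch
      import intel_extension_for_pytorch as ipex  # registers the "xpu" device

      # Toy model; device placement works the same way for a real LLM.
      model = torch.nn.Linear(128, 64).eval()
      data = torch.randn(32, 128)

      if torch.xpu.is_available():      # Arc / XPU driver stack present?
          model = model.to("xpu")
          data = data.to("xpu")
          model = ipex.optimize(model)  # apply IPEX kernel optimisations

      with torch.no_grad():
          print(model(data).shape)      # torch.Size([32, 64])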
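
    And the "offload to CPU" trick, sketched with Hugging Face diffusers: the model id is the public SDXL base checkpoint, while XPU-aware offload in your diffusers/accelerate versions is an assumption to verify.

      import torch
      from diffusers import StableDiffusionXLPipeline

      pipe = StableDiffusionXLPipeline.from_pretrained(
          "stabilityai/stable-diffusion-xl-base-1.0",
          torch_dtype=torch.float16,
      )
      # Keep only the active sub-module on the GPU and stream the rest
      # from system RAM -- the "bit slower but unnoticeable" trade-off
      # that lets SDXL fit on an 8GB card. Recent diffusers accept a
      # device argument; "xpu" targets Arc (an assumption, as above).
      pipe.enable_model_cpu_offload(device="xpu")
      pipe("a red panda wearing sunglasses").images[0].save("out.png")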

  14. 1 month ago
    Anonymous

    The Linley Group, along with other major semiconductor industry researchers and analyst firms, has been forecasting GPU demand to skyrocket since the late 2010s due to additional uses, especially in data centers. Intel decided to chip in.

  15. 1 month ago
    Anonymous

  16. 1 month ago
    Anonymous

    You mean the browser, right?

  17. 1 month ago
    Anonymous

    'Cause they lost in the CPU market.

  18. 1 month ago
    Anonymous

    Because they've been making integrated GPUs for almost two decades. And if we're counting microprocessors for "graphics", it's been like 50 years.

  19. 1 month ago
    Anonymous

    because they watched NVIDIA become one of the biggest companies on Earth by market cap
