How long does it realistically have left?

  1. 2 weeks ago
    Anonymous

    5 years maybe? Intel and AMD will fall behind massively. Other processor companies which focus on ARM will take over the market.

    • 2 weeks ago
      iToddler

      x86 isn't going away. x86 is the cockroach of microprocessors.

    • 2 weeks ago
      Anonymous

      >Other processor companies
      Let's see, we have
      >NVIDIA, minor interest in media and gaming platforms
      >Apple, no interest outside their products, minor market share in PC
      >Qualcomm/MediaTek/Samsung/Huawei, no interest in PC or server and/or embargoed
      >Amazon/Google/etc., no interest outside their own servers
      >Various companies targeting embedded devices, no interest outside it
      >New startup, not scaling up in five years
      Who exactly is going to take over "the market?"

      • 2 weeks ago
        Anonymous

        umm welll, uhhh, RISC-V in 2 weeks?

      • 2 weeks ago
        Anonymous

        Obviously Nvidia. They tried to buy ARM, but you morons cope

      • 2 weeks ago
        Anonymous

        Qualcomm has repeatedly said recently that they want to expand into laptops and PCs.

      • 2 weeks ago
        Anonymous

        Google's netbooks own school market share and are what Gen Alpha zoomers will think of as a PC. The NSA's front company absolutely has an interest, you glowie shill

    • 2 weeks ago
      Anonymous

      RISC-V will take over in Asia (Google sees it as an enemy, so they removed support from Android)

    • 2 weeks ago
      Anonymous

      hi elon

    • 2 weeks ago
      Anonymous

      >ARM
      I think you meant RISC-V

    • 2 weeks ago
      Anonymous

      x86 is the hardware equivalent of those old legacy systems still running on COBOL. It will outlast every single one of us.

    • 2 weeks ago
      Anonymous

      What will realistically happen is that desktop computers will die out, and that will be the death of x86 on consumer devices. Most kids don't use a computer at all except for school work, and those laptops will switch to ARM if they haven't already, since they only need to run a browser anyway. Kids do everything on their phones.

  2. 2 weeks ago
    Anonymous

    > be iMolesters
    > m1 gets released
    > INDUSTRY IS MOVING TO ARM, BOYS! ARM IS THE FUTURE!
    > m2 gets released
    > YEP! TWO MORE WEEKS, EVERYONE IS MOVING TO ARM! X86 IS FINISHED
    > m3 gets released
    > I..iit..it. has good power savings! nothing consumes less battery power than m3!
    > m4 gets announced
    > yeah, battery something something.
    my sides in fricking orbit. x86 has another 40 years at this rate. maybe longer. either way, it'll still be around long after ipedophiles in California move to another processor.

    • 2 weeks ago
      Anonymous

      When the Qualcomm exclusivity deal dies, you may see more ARM from all the major players.

      • 2 weeks ago
        Anonymous

        if you think qualcomm has some form of exclusivity, a company that designs its own hardware and pays license fees/royalties just like everyone else, then you are a dumbfrick moron.

        5 years maybe? Intel and AMD will fall behind massively. Other processor companies which focus on ARM will take over the market.

        see:

        > be iMolesters
        > m1 gets released
        > INDUSTRY IS MOVING TO ARM, BOYS! ARM IS THE FUTURE!
        > m2 gets released
        > YEP! TWO MORE WEEKS, EVERYONE IS MOVING TO ARM! X86 IS FINISHED
        > m3 gets released
        > I..iit..it. has good power savings! nothing consumes less battery power than m3!
        > m4 gets announced
        > yeah, battery something something.
        my sides in fricking orbit. x86 has another 40 years at this rate. maybe longer. either way, it'll still be around long after ipedophiles in california move to another processor.

        ARM has been around since 1986/1987 and has taken over from the z80, 6502 etc. as a microcontroller. that's it. you know what's even more depressing for itoddlers? variants of the z80 and 6502 are still produced. ARM couldn't even replace nearly 50 year old CPUs lmao.

        • 2 weeks ago
          Anonymous

          By Qualcomm exclusivity, that anon was referring to reports that Qualcomm has a secret deal with Microsoft to be the only company providing chips for the ARM version of Windows. That deal, if it exists, would need to expire before other potential ARM chip makers (MediaTek, Samsung, Nvidia, etc.) could make ARM chips for Windows.

          That said, with Windows I am quite skeptical that using ARM will provide all that much gain. Sure, Apple laptops with ARM chips get really good battery life for their performance, but I question how much of that is due to the CPU being ARM, and how much of that is due to Apple having full control of all the hardware & software in their machines.

          • 2 weeks ago
            Anonymous

            Thank you, Anon.

            The deal was for Microsoft Windows.

            We'll see, around May 20th. That's when the first hardware announcements may arrive.

          • 2 weeks ago
            Anonymous

            >Sure, Apple laptops with ARM chips get really good battery life for their performance
            Not "for their performance"
            On any load you'll see identical perf per watt to x86 contemporaries.

            >but I question how much of that is due to the CPU being ARM
            Not at all.
            >and how much of that is due to Apple having full control of all the hardware & software in their machines.
            It's because it's a phone SoC with an unlocked power limit basically.
            Compare package power to core power and you'll see. Pic related, an idling DESKTOP CPU from 5 years ago. Laptops have 1-2W of motherboard power consumption on top of this. Apple's shit is so integrated onto a highly efficient die that they instead have 100mW.
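
            If you want to reproduce that package-versus-core comparison on Linux, the powercap (RAPL) sysfs counters expose both numbers. A minimal sketch in C, assuming the usual intel-rapl:0 (package) and intel-rapl:0:0 (core) zone paths; the layout varies by machine and recent kernels may require root to read the counters:

            /* Rough package-vs-core power check via the Linux powercap (RAPL)
             * sysfs interface. Zone paths below are assumptions for a typical
             * Intel-style layout and may differ on your machine. */
            #include <stdio.h>
            #include <unistd.h>

            static long long read_uj(const char *path) {
                FILE *f = fopen(path, "r");
                long long uj = -1;
                if (f) { if (fscanf(f, "%lld", &uj) != 1) uj = -1; fclose(f); }
                return uj;                     /* energy counter in microjoules */
            }

            int main(void) {
                const char *pkg  = "/sys/class/powercap/intel-rapl:0/energy_uj";
                const char *core = "/sys/class/powercap/intel-rapl:0/intel-rapl:0:0/energy_uj";

                long long p0 = read_uj(pkg), c0 = read_uj(core);
                sleep(5);                      /* sit idle for a few seconds */
                long long p1 = read_uj(pkg), c1 = read_uj(core);

                /* Average watts over the interval; the gap between the two is the
                 * "out of core" draw (memory controller, fabric, I/O, ...). */
                printf("package: %.2f W\n", (p1 - p0) / 5.0 / 1e6);
                printf("cores:   %.2f W\n", (c1 - c0) / 5.0 / 1e6);
                return 0;
            }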

          • 2 weeks ago
            Anonymous

            >On any load you'll see identical perf per watt to x86 contemporaries.
            >t. never checked those specs
            ARM dominates mobile precisely because of perf/watt.

          • 2 weeks ago
            Anonymous

            >>t. never checked those specs
            Sure did.
            >ARM dominates mobile precisely because of perf/watt.
            Not at x86 level performance clearly!

            >only screencaps the cpu power consumption
            >doesn't show the SoC constantly drawing 8 watts at idle
            kek

            >doesn't show the SoC constantly drawing 8 watts at idle
            You're almost getting it. The point I was trying to make is that the architecture is irrelevant: Apple has designed their chips to have very little "out of core" consumption, and the boards have no chipset, SSD, or RAM modules, just power regulation, bare NAND, and WiFi.
            That's how you get the low idle usage that translates into battery life, be it ARM or x86.

          • 2 weeks ago
            Anonymous

            >muh one cherry picked graph
            There are papers written on this. Some comparisons are even, others are clearly ARM wins. Spend some time on Google.

          • 2 weeks ago
            Anonymous

            >only screencaps the cpu power consumption
            >doesn't show the SoC constantly drawing 8 watts at idle
            kek

      • 2 weeks ago
        Anonymous

        It isn't going anywhere. Every time ARM PCs have been tried, they flopped.

        If and only if the Snapdragon X doesn't flop.

    • 2 weeks ago
      Anonymous

      Companies won't switch until there are no replacement parts

    • 2 weeks ago
      Anonymous

      >> I..iit..it. has good power savings! nothing consumes less battery power than m3!
      >> m4 gets announced
      >> yeah, battery something something.
      Where's the lie tho?

      • 2 weeks ago
        Anonymous

        Nobody gives a shit about muh battery.

        • 2 weeks ago
          Anonymous

          The new non-x86-64 MacBooks have 2-3 times the battery life of the average Windows laptop. I think even normies will care about a difference that big.

  3. 2 weeks ago
    Anonymous

    Two more weeks

  4. 2 weeks ago
    Anonymous

    They invented 64-bit extensions. Was that the right direction, or should we go back, or invent yet another extension?

    • 2 weeks ago
      Anonymous

      Well, AMD did. Intel created x86 itself.

  5. 2 weeks ago
    Anonymous

    Is there a standard way of booting and enumerating the devices on an ARM-based system yet?

  6. 2 weeks ago
    Anonymous

    There isn't a real desktop competitor, so not for a long time

  7. 2 weeks ago
    Anonymous

    40-50 years, conservatively.
    I've been hearing morons claim that <x> ISA is going to kill x86 at least monthly since the mid-1980s. First it was 68k, then it was a series of RISC architectures (MIPS and Alpha were the frontrunners), then a flirtation with VLIW (such as Itanium).
    Now it's ARM. When you get bored of being wrong about that, it'll probably be RISC-V. Then, who knows. Either way, x86 will outlive every single compsci rube ITT.

  8. 2 weeks ago
    Anonymous

    Not much. While some morons are clinging to their obsolete ISA, I am decades ahead of them. Not an itoddler either, I'd rather die than pay Crapple a penny.

  9. 2 weeks ago
    Anonymous

    Considering the 8051, Z80 and 6502 are still going strong, perhaps never

    • 2 weeks ago
      Anonymous

      What the hell uses that old shit?

      • 2 weeks ago
        Anonymous

        Kid, go smash (another) one of your toys in a fit of autistic rage, and notice the epoxy blob on the circuit board that falls out: it's almost certainly one of those.

        • 2 weeks ago
          Anonymous

          I don't own any

      • 2 weeks ago
        Anonymous

        calculators, industrial machines, cars, tractors.

        • 2 weeks ago
          Anonymous

          Huh? Most PCMs use ARM and MIPS

      • 2 weeks ago
        Anonymous

        planes and shit I think

      • 2 weeks ago
        Anonymous

        There's more to the world than laptops and game consoles. Older ISAs like that are perfect for a device that only needs to do the most basic arithmetic and nothing more. Unless you have an x86 calculator I don't know about...

        • 2 weeks ago
          Anonymous

          Who the hell uses a calculator? Last time I was even tempted to buy one it used ARM

  10. 2 weeks ago
    Anonymous

    As soon as it can compete with $500 desktop PCs in terms of gaming performance.

    OH and have an upgradable CPU/GPU/SSD/motherboard/power supply.

  11. 2 weeks ago
    Anonymous

    RAM is a tricky one. Maybe you could get away with making it soldered, but at least in 2024 we can all unanimously agree that 16GB of RAM should be the BARE MINIMUM.

    • 2 weeks ago
      Anonymous

      >maybe you could get away with making it soldered
      nope. nobody is interested in soldered ram. only dumbfrick morons.
      >we can all unanimously agree that 16GB of RAM should be the BARE MINIMUM.
      absolutely. it still makes me laugh reading about that dangerously low iq monkey working for apple's marketing who tried to claim 8gb was equal to x86 with 16gb of ram because apple used compression or some shit in their virtual memory setup. apple fricking knows their 8gb machines are absolute dogshit, but they're the only versions of their machines selling in large numbers because nobody (including itoddlers) has the money for the 16gb versions.

      • 2 weeks ago
        Anonymous

        Soldered RAM placed very close to the CPU does lead to faster RAM. That's how apple got 400GB/s. It would definitely help for gaming/AI workloads.

        That said, Nvidia/AMD might be able to create dedicated graphics cards with upgradable VRAM in the future...

        • 2 weeks ago
          Anonymous

          meanwhile in reality.. m1/m2/m3 gets destroyed by x86 systems using ddr4 and ddr5. the speed advantage hasn't helped the m series, considering it keeps being destroyed in benchmarking. maybe 2tb/s will do the trick? 3tb/s?

        • 2 weeks ago
          Anonymous

          >Soldered RAM uber close to the CPU does lead to faster RAM
          If you mean higher bandwidth (higher transfer rates), then yes, but most if not all client workloads don't require that much bandwidth. There is a reason why neither AMD nor Intel provides a consumer CPU with more than 2 RAM channels: outside a few niche workloads, nothing really uses the bandwidth, and those that do are willing to pay more for it.
          Maybe there is some advantage in latency, but that's it.
          The real reason they solder memory is because they want their customers to pay whatever price they set for it. Same story with storage.
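
          To put rough numbers on both posts: peak DRAM bandwidth is just bus width times transfer rate, which is why a very wide soldered LPDDR bus advertises huge figures while two DIMM channels do not. A back-of-the-envelope sketch in C; the 512-bit LPDDR5-6400 configuration is an assumption roughly matching Apple's advertised 400GB/s-class parts, not a measured spec:

          /* Peak theoretical DRAM bandwidth = bus width (bytes) * transfer rate.
           * The configurations below are illustrative assumptions, not measurements. */
          #include <stdio.h>

          static double peak_gbs(int bus_bits, double mega_transfers) {
              return (bus_bits / 8.0) * mega_transfers * 1e6 / 1e9;   /* GB/s */
          }

          int main(void) {
              /* Typical desktop: 2 channels x 64 bit, DDR5-5600 */
              printf("dual-channel DDR5-5600: %6.1f GB/s\n", peak_gbs(2 * 64, 5600));
              /* Wide soldered LPDDR5-6400 on a 512-bit bus (M1 Max-class SoC) */
              printf("512-bit LPDDR5-6400:    %6.1f GB/s\n", peak_gbs(512, 6400));
              return 0;
          }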

      • 2 weeks ago
        Anonymous

        I'm definitely fine buying a laptop with 4gb of ram as long as I can upgrade to 32gb and it's not very expensive.

        • 2 weeks ago
          Anonymous

          yes. any laptop you can upgrade is better than the ARM memeshit people keep mindlessly shilling.

      • 2 weeks ago
        Anonymous

        Hopefully CAMM style memory takes off - it lets you fit low power RAM really close to the CPU and is nearly as fast as soldered RAM, but is trivial for people to upgrade - you just need to remove a few screws to change the module.

        • 2 weeks ago
          Anonymous

          When and why did Megahertz get replaced with Megatransfers as the unit of measurement? Is there some moronic controversy about Hertz I'm unaware of?

          • 2 weeks ago
            Anonymous

            Since DDR was a thing in the 00s?
            https://www.kingston.com/en/blog/pc-performance/mts-vs-mhz

          • 2 weeks ago
            Anonymous

            That's cool and all, but
            >clearly modern article
            >use of megaturds over megahertz has only been a widespread thing in the past few months

          • 2 weeks ago
            Anonymous

            Gaymers are moronic and don't understand what "clock" is, let alone "rising and falling edge", which is how DDR achieves its "double data rate"
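
            For anyone genuinely confused by the units: the number on the module is transfers per second, and because DDR moves data on both clock edges, the actual I/O clock is half of it. A tiny sketch of the arithmetic, using DDR4-3200 as the example:

            /* "DDR4-3200" means 3200 MT/s driven by a 1600 MHz I/O clock:
             * one transfer on the rising edge, one on the falling edge. */
            #include <stdio.h>

            int main(void) {
                double io_clock_mhz = 1600.0;           /* what "MHz" really refers to */
                double mts = io_clock_mhz * 2.0;        /* double data rate -> 3200 MT/s */
                double gbs = mts * 1e6 * 8.0 / 1e9;     /* one 64-bit (8-byte) channel */
                printf("%.0f MHz clock -> %.0f MT/s -> %.1f GB/s per channel\n",
                       io_clock_mhz, mts, gbs);
                return 0;
            }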

          • 2 weeks ago
            Anonymous

            It was discovered Hertz diddled little girls, he's been cancelled. Megaturds it is.

          • 2 weeks ago
            Anonymous

            Do you think 1800 or 3600 will sell more?

        • 2 weeks ago
          baritone

          calling it right fricking now: it's DOA
          because it's cheaper to make non-upgradable e-waste "computers". it's a good thing, but no manufacturer will choose it because they'd still have to plan ahead to even use the modules

    • 2 weeks ago
      Anonymous

      >16GB of RAM
      >BARE MININUM
      Skill issue

      Posted from my Thinkpad T60 with 4GB of RAM

  12. 2 weeks ago
    Anonymous

    I like x86 assembly more than Arm

  13. 2 weeks ago
    Anonymous

    As long as OEMs keep writing shitty ARM firmware probably a long time. Intel's low power CPUs are actually really nice.

  14. 2 weeks ago
    Anonymous

    Why would fewer instructions (RISC can't do operations directly on memory) be better?

    • 2 weeks ago
      Anonymous

      A simpler instruction decoder can decode instructions faster. CPUs internally work like that anyway, so it's not as if having memory-to-memory operations gives you more power; it's just that the CPU hides the loads/stores from you while also being more complicated for no reason.

    • 2 weeks ago
      Anonymous

      RISC isn't "fewer instructions." It's not even actually simpler instructions, though that's a side effect. It's...
      - Uniform instruction encoding.
      - Load/Store architecture.
      - Large register file.
      - All or almost all instructions should have uniform throughput, ideally single cycle.

      The first 3 support the last point, and also make superscalar dispatch easier (fewer dependencies). RISC is good for 1.5-2x performance, all other factors being equal. But not really more, and other factors are not always equal.

      To your specific point: an op that directly modifies memory stalls the pipeline while waiting for memory access. If load/store are separate instructions, then the ALU or FPU pipeline keeps running and the LSU handles memory. Which is what happens under the hood with x86 since the Pentium, for the same reasons. But that means circuitry has to be dedicated to making sure the op can really be broken up and executed out of order, along with circuitry to break everything down into "micro-ops."

      tl;dr - ARM will continue to dominate phones, x86-64 will continue to ship, Apple will continue to act like their M chips are super chips.
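
      To make the load/store distinction concrete, here is the same one-line C function rendered both ways. The assembly in the comments is illustrative rather than the output of any particular compiler run, but it is the shape you would typically get on x86-64 versus AArch64:

      /* One C statement, two ISA renderings (illustrative, not compiler output):
       *
       *   x86-64 (memory-destination form; the core cracks it into micro-ops):
       *       addl   $1, (%rdi,%rsi,4)        ; load + add + store under the hood
       *
       *   AArch64 / typical RISC (load/store architecture, explicit steps):
       *       ldr    w8, [x0, x1, lsl #2]     ; load counts[i]
       *       add    w8, w8, #1               ; add in a register
       *       str    w8, [x0, x1, lsl #2]     ; store it back
       */
      void bump(int *counts, long i) {
          counts[i] += 1;
      }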

      • 2 weeks ago
        Anonymous

        Could architectures switch to making the asynchronous nature of memory access/possibly other slow operations explicit? Like you could have one instruction to start a memory op, and another that waits for it to finish, so you could encode the first op, do a bunch of other work, then execute the second op only when you need it?

        • 2 weeks ago
          Anonymous

          That's basically what's happening with RISC where memory access is separate and explicit. Except there's not a middle "wait for" instruction. If an add is dependent on two loads to registers, then the CPU tries to dispatch other ops until the data is ready for that add.

        • 2 weeks ago
          Anonymous

          Out-of-order processors figure those things out at runtime. In the late '90s and early '00s, VLIW and EPIC designs tried to make it explicit and force compilers to figure it out rather than the silicon. For the most part it was a disaster.
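
          Mainstream ISAs never did grow a general "start the load" / "wait for the load" instruction pair, but compilers do expose prefetch hints, which give you roughly that split in software while the out-of-order core handles the real overlap. A small sketch using the GCC/Clang builtin; the prefetch distance of 16 elements is an arbitrary illustration:

          /* Software prefetch: start pulling a future element toward the cache,
           * keep doing useful work, then touch the data later. __builtin_prefetch
           * is only a hint; the hardware is free to ignore it. */
          #include <stddef.h>

          long sum_with_prefetch(const long *a, size_t n) {
              long total = 0;
              for (size_t i = 0; i < n; i++) {
                  if (i + 16 < n)
                      __builtin_prefetch(&a[i + 16]);  /* "start the memory op" early */
                  total += a[i];                       /* the "wait" happens here, implicitly */
              }
              return total;
          }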

      • 2 weeks ago
        Anonymous

        This anon gets it. Most of IQfy's understanding of RISC (and of x86, for that matter) is incredibly cursory, and at least 30 years out of date.

  15. 2 weeks ago
    Anonymous

    50 years; the king is not going anywhere. There is no successor that has a complete set of instructions needed for heavy 3D graphics

    • 2 weeks ago
      Anonymous

      >needed in heavy 3D graphics
      Lmao are you still in the 90s? This isn't the 3DNow! era. "Heavy 3D graphics" are processed almost entirely on the GPU. There's also nothing special or unique about x86 SIMD that makes it more suited to 3D graphics.

  16. 2 weeks ago
    Anonymous

    Considering that they're already removing legacy native instructions, it's not going anywhere.

    • 2 weeks ago
      Anonymous

      >they're already removing legacy native instructions
      No! Now I no longer know what programs are doing!

  17. 2 weeks ago
    Anonymous

    I give it at least a decade.

  18. 2 weeks ago
    Anonymous

    x86 is objectively a good instruction set. the most important instructions are often encoded in a single byte, and in today's world cache is king.
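
    For reference, several of the encodings that claim is leaning on really are a single opcode byte. The list below is a small sketch from the public opcode map (register-operand forms; REX prefixes add a byte only when r8-r15 are involved), declared as data rather than anything meant to run:

    /* A few x86-64 instructions whose entire encoding is one opcode byte. */
    static const unsigned char push_rax = 0x50;  /* push rax            */
    static const unsigned char pop_rax  = 0x58;  /* pop  rax            */
    static const unsigned char ret_near = 0xC3;  /* ret                 */
    static const unsigned char nop_1b   = 0x90;  /* nop (xchg eax, eax) */
    static const unsigned char leave_op = 0xC9;  /* leave               */
    static const unsigned char cdq_op   = 0x99;  /* cdq (sign-extend)   */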

    • 2 weeks ago
      Anonymous

      No... x86 had so many damn problems. It wasn't close to "good" until the 386, and it really needed x86-64. The stack-based FPU registers are still shit, and compilers spit out vector ops for scalar floating point math for that reason. I wish they had ditched that in x86-64, or introduced a new set of FPU ops with a big, normal register file.

      • 2 weeks ago
        Anonymous

        >n-no x86 really sucks i read it on a child porn imageboard
        >proceeds to rant hysterically about x87, which nobody has cared about in decades since we use sse now
        At least you outed yourself in one post or less.

        • 2 weeks ago
          Anonymous

          >n...n...nooooo
          >your example is valid but you're still wrong!
          Holy fricking COPE. If you weren't a moron you would know why I mentioned the 386 and x86-64 ISA changes. Because segmented memory was shit. 16-bit was shit. And the whole damn thing was register starved until x86-64.

          x86-64 is OK, but x86 started out as shit. That's because the ISA was actually dictated by a terminal manufacturer. IBM just happened to grab it because Intel had good manufacturing. And IBM went with a cheaper, more hobbled version because why not? 68K was absolutely superior, as were all the RISC architectures. That's why those were used in workstations.

      • 2 weeks ago
        Anonymous

        >x86 wasn't good until 30 years ago
        k

        • 2 weeks ago
          Anonymous

          x86 wasn't good until AMD64, and it's still subpar. "Doing the same shit but faster" obviously has a scaling limit

          • 2 weeks ago
            Anonymous

            >it's bad ok
            >only competition is mobileshit ARM which is complete shit

          • 2 weeks ago
            Anonymous

            ARM in phones and macshit is consoomer slop, it's not built for performance. Server ARM CPUs are way more competitive

          • 2 weeks ago
            Anonymous

            >estimated scores
            >synthetic benchmark
            >2017
            >worse than intel in over half the tests
            >it's all random shit

          • 2 weeks ago
            Anonymous

            >obviously has a scaling limit
            intel hands typed this post

      • 2 weeks ago
        Anonymous

        That's why SSE and later have scalar instructions, you moron.
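
        Concretely, on an x86-64 SysV target plain double math compiles to the scalar SSE2 forms operating on XMM registers, and x87 only shows up for long double. A minimal sketch, assuming GCC or Clang at -O2; the commented assembly is the typical output, not a guarantee about any specific compiler version:

        /* The SysV ABI passes and returns doubles in XMM registers, so the
         * compiler emits the scalar SSE2 add, not x87:
         *
         *     addsd  xmm0, xmm1      ; scalar double add in a vector register
         *     ret
         */
        double scalar_add(double a, double b) {
            return a + b;
        }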

    • 2 weeks ago
      Anonymous

      It really isn't. It used to be good; i386 was pretty nice. But it's been extended and extended so much that it's a hideous disaster now. amd64 was the final tipping point where x86 went to total shit. AMD fricked us all: they traded away our future for a short-term advantage.

  19. 2 weeks ago
    Anonymous

    Centuries

  20. 2 weeks ago
    Anonymous

    After APX lands on consumer Intel/AMD CPUs it's over for ARM

  21. 2 weeks ago
    Anonymous

    It's not going away any time soon. Windows still dominates the industry and developers don't offer ARM releases for popular software.

  22. 2 weeks ago
    Anonymous

    AMD saved it with Ryzen, and they saved it with the 64-bit version in 2003. Intel was planning to make consumer versions of the Itanium.

  23. 2 weeks ago
    Anonymous

    maybe if apple released a $200 pc, it could take over

  24. 2 weeks ago
    Anonymous

    Where else am I going to run my LLMs if not on x86?
    Apple M* chips excluded.
    Even in the cloud, x86 is the only option for GPUs and LLMs.

  25. 2 weeks ago
    Anonymous

    more than you

  26. 2 weeks ago
    Anonymous

    >x86 sucks because ~~*intel*~~ is back to margin of error-esque generational gains
    i love my 7950X

  27. 2 weeks ago
    Anonymous

    Just look at IPv4 and you'll have your answer

  28. 2 weeks ago
    Anonymous

    Nothing ever changes. x86 will stay in desktops, laptops and servers and ARM in mobile and embedded.

  29. 2 weeks ago
    Anonymous

    I give it until 2049

  30. 2 weeks ago
    Anonymous

    Depends. If they strip out all the legacy bullshit, make them x64-only, and simplify the shit out of them... maybe another 20 years?

    At the current trajectory, more like 10 years or less before widespread use starts to shrink.

    • 2 weeks ago
      Anonymous

      The "legacy" stuff is such a minor part of the cpu. The vast majority of what bloats out the cpu is still availble in x64 mode.

      • 2 weeks ago
        Anonymous

        I'm just going by what Intel claimed, which is that modern processors have a lot of unnecessary complexity because they still have 16-bit and 32-bit backwards compatibility, real mode, and some other bullshit.

        Although if they were to eliminate the 8-bit, 16-bit, and 32-bit registers and all the associated instructions, that would render tons of software unusable, and at that point you might as well just switch to ARM.

        • 2 weeks ago
          Anonymous

          64-bit mode still has most of the complexity of x86. They could drop x87 and some of the segmentation features but most of the complexity is still there. The only way to simplify it is to design a new architecture. It could have all the same features of x86 but a lot cleaner and easier for the CPU and software.

          • 2 weeks ago
            Anonymous

            Some software needs 80-bit floats, so even in a 64-bit-only x64 CPU, x87 is still needed.

          • 2 weeks ago
            Anonymous

            LOL, really? I'm not doubting you, but what the frick uses Intel's lame-ass 80-bit unit? Why couldn't they use a 128-bit float?
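
            For what it's worth, the main place you still meet the 80-bit unit today is long double on x86-64 Linux toolchains, where GCC and Clang map it to the x87 extended format; 128-bit floats only exist there as software types (e.g. __float128), since no x86 FPU computes binary128 in hardware. A small check, assuming that toolchain:

            /* long double on x86-64 SysV targets is the 80-bit x87 extended format:
             * a 64-bit mantissa, padded out to 16 bytes in memory. */
            #include <stdio.h>
            #include <float.h>

            int main(void) {
                printf("sizeof(long double) = %zu\n", sizeof(long double));  /* 16 (padded) */
                printf("LDBL_MANT_DIG       = %d\n", LDBL_MANT_DIG);         /* 64 bits */
                printf("DBL_MANT_DIG        = %d\n", DBL_MANT_DIG);          /* 53 bits */

                /* 1e-18 is below double precision but above extended-precision epsilon. */
                printf("double:      1.0 + 1e-18 != 1.0 ? %d\n", (1.0  + 1e-18 ) != 1.0 );
                printf("long double: 1.0 + 1e-18 != 1.0 ? %d\n", (1.0L + 1e-18L) != 1.0L);
                return 0;
            }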

        • 2 weeks ago
          Anonymous

          Except that's bullshit. A 386 did all of those things in 275,000 transistors, which is a rounding error on modern CPU designs.

  31. 2 weeks ago
    Anonymous

    I'd say 10 years at least, 20 at best
    (in the mainstream, of course)

  32. 2 weeks ago
    Anonymous

    I jumped off that ship a while ago. Gamers will be the last holdout

  33. 2 weeks ago
    Anonymous

    Might be a question for the linuxdorks, but could you make an OS and other applications run on a GPU only? Could a GPU be tweaked to run other hardware and stuff? Kind of a IQfy tourist and I'm an idiot, but is it possible to have a relatively low-(horse)power CPU and let the GPU process anything needing more than the simplest shit?

    • 2 weeks ago
      Anonymous

      GPU hardware is undocumented and can only be accessed via proprietary drivers and shitty, barely-fit-for-purpose APIs. So no, you can't run GPU-only; you have to run proprietary crapware on the CPU to control these pieces of shit.

  34. 2 weeks ago
    Anonymous

    [...]

    I'm a White dude happy to own AMD64 computers running Linux. I hate what Apple does, and thus don't own or use any of their products :^)

  35. 2 weeks ago
    Anonymous

    [...]

    That entire rant is probably the most effective way to get some 12 year old to think you're cool and to pretend he wants a shitty discontinued Thinkpad that his mom won't buy, but sapient adults just think you're a 12 year old yourself.
