technology will get a lot better when this piece of shit architecture kicks the bucket

  1. 2 months ago
    Anonymous

    this architecture is the only thing that makes technology relevant

    • 2 months ago
      Anonymous

      this architecture is in the most irrelevant products timmy

      • 2 months ago
        Anonymous

        such as all supercomputers that perform meaningful calculations while ARM is only used for playing gacha games and watching tiktok?

        • 2 months ago
          Anonymous

          supercomputers run on risc-v
          try again

          • 2 months ago
            Anonymous

            it was real in your head

          • 2 months ago
            Anonymous

            https://learn.saylor.org/mod/page/view.php?id=27185
            >As of November 2019, all supercomputers on TOP500 are 64-bit, mostly based on CPUs using the x86-64 instruction set architecture (of which 474 are Intel EMT64-based and 6 are AMD AMD64-based). The few exceptions are all based on RISC architectures.

          • 2 months ago
            Anonymous

            those exceptions are all experimental and use like x10 more energy while not giving you x10 faster computation too

          • 2 months ago
            Anonymous

            >All
            >Exceptions
            You're a real special one

          • 2 months ago
            Anonymous

            the exception is RISC, and all RISC wastes energy. a datacenter full of RISC CPUs might make even a nuclear power plant engineer worried about the power draw.

          • 2 months ago
            Anonymous

            Idk what's up with RISC-V, or if it's an implementation issue vs. an ISA issue. As a general rule a RISC ISA will give you more performance per watt, which means you can run faster or run the same speed using less energy. This assumes all other factors are equal of course. ARM is successful in phones and tablets precisely because of its low power consumption.

            Internally x86 cores break their instructions down into RISC-like instructions or micro-ops. This has been true since the Pentium. So all other factors being equal an x86 core will burn more energy on the circuitry for its sophisticated decoding stage.

          • 2 months ago
            Anonymous

            no it won't, I literally looked it up on google, fastest RISC-V is x6 faster than fastest x64 but takes x10 more power, therefore x10 x64's will beat RISC-V at performance per watt.

          • 2 months ago
            Anonymous

            Performance does not scale linearly with power.

          • 2 months ago
            Anonymous

            low IQ babble. six x64s use less power to do the same work a single RISC-V would, how can you cope with this?

          • 2 months ago
            Anonymous

            RISC-V is not the entirety of RISC. Do performance/watt stats on
            SPARC
            MIPS
            PowerPC
            ARM
            PA-RISC
            Alpha

          • 2 months ago
            Anonymous

            >cope cope cope
            consider that the only thing that matters is pure performance per watt when you're not limited to a 5V battery

          • 2 months ago
            Anonymous

            not really, you need to factor in manufacturing price as well. also, how much do you pay extra for proprietary ARM/x64 bullshit?

          • 2 months ago
            Anonymous

            manufacturing prices are negligible unless you're a goomer who replaces hardware every 5 minutes

          • 2 months ago
            Anonymous

            you think no license fees doesn't translate into more computing power for the same money?

          • 2 months ago
            Anonymous

            I don't remember paying any loicense fees for using x64

          • 2 months ago
            Anonymous

            it's as free as macOS is free. its price is factored in, in the sense that if anyone else wanted to manufacture the CPUs they'd pay license fees.
            ARM has fees I think.

          • 2 months ago
            Anonymous

            I'd rather pay bigger fees for better hardware that saves energy than pay x10 higher bills every month for the rest of my life

          • 2 months ago
            Anonymous

            >x10 higher bills

          • 2 months ago
            Anonymous

            yes x10 more power usage is indeed x10 higher bills

          • 2 months ago
            Anonymous

            are you implying RISC-V will draw 10x the energy of x86 when matured and is competitive?

          • 2 months ago
            Anonymous

            >when matured
            2 more weeks

          • 2 months ago
            Anonymous

            so you indeed are a dishonest homosexual

          • 2 months ago
            Anonymous

            delete your shitty thread and never make one again until RISC-V matures so we don't have to see you ever again

          • 2 months ago
            Anonymous

            i think i'm going to make daily riscv threads. and you'll appear in every one of them. and get mad.

          • 2 months ago
            Anonymous

            The linked email is troubling. If RISC-V really is "watered down Alpha" then it's a poor choice. ARM or Power would be much better.

            Alpha tried to eliminate any complexity which would impact pipelines and clock speeds. And it worked in that sense. Alpha chips regularly had the highest clock speeds. But even when they were the fastest executing code it was by margins smaller than their clock advantage. There were just too many examples of Alpha needing a half dozen ops for two on other RISC ISAs.

            Power/PowerPC (Performance Optimization With Enhanced RISC) was RISC, but with better balance in the instruction set. ARM is better balanced as well.

          • 2 months ago
            Anonymous

            motherfricker is your computer the only appliance in your home or did someone steal your braincells

          • 2 months ago
            Anonymous

            thankfully my microwave doesn't use RISC-V that draws x100 more power than my laptop without including the other power draws it may or may not have.

          • 2 months ago
            Anonymous

            AMD and Intel have some licensing agreements on x86, so you are paying for it one way or another, whichever you buy.

          • 2 months ago
            Anonymous

            I don't know, how much money would it cost to have a RISC-V system with comparable performance to dual CPU EPYC system?

          • 2 months ago
            Anonymous

            >won't do the comparisons
            >accuses others of cope
            Most of the processors from the ISAs on that list shipped for workstations. Go be a moron somewhere else.

          • 2 months ago
            Anonymous

            >but muh single example!!!
            There's more than one RISC ISA. If RISC-V is power hungry that sounds like a RISC-V problem. ARM is not. PowerPC was not. (POWER kind of was but that's because IBM was always designing for workstations and they didn't give a frick about watts, i.e. the 970 or "G5".)

          • 2 months ago
            Anonymous

            ARM is not power hungry because it's made for phones and doesn't compute shit

          • 2 months ago
            Anonymous

            Go compare Geekbench scores for something like an M3. ARM can easily scale up in performance, and when we see high-performance examples they have lower power consumption than x86. That will generally be true for a RISC ISA vs a CISC ISA, all other factors being equal. It doesn't always hold (Alpha, RISC-V), but it usually does (all the rest).

          • 2 months ago
            Anonymous

            >muh geekbench scores
            lol I will just go walk into my local data center and ask those boomers what CPU they use

          • 2 months ago
            Anonymous

            So I suppose Amazon's Graviton4, Ampere's Altra + AmpereOne and nVidia's Grace CPUs all do not exist?

          • 2 months ago
            Anonymous

            kek
            RISC-V is possibly THE most inefficient architecture there is
            It was lambasted by people writing actual HPC code

            https://gmplib.org/list-archives/gmp-devel/2021-September/006013.html

            >I believe that an average computer science student could come up with a better instruction set than Risc V in a single term project.

            Will RISCgays ever recover?

          • 2 months ago
            Anonymous

            the biggest problem of RISC-V is that it being faster doesn't matter if it hogs power
            >buy x10 more x86_64 CPU's to compute x10 faster for x10 more power
            >RISC that's only x6 faster than one of those hogs same amount of power
            RISCsisters have no future

          • 2 months ago
            Anonymous

            Total RISC-Victory.

            HPC is a meme from the last century, distributed computing is the future.

          • 2 months ago
            Anonymous

            [...]
            [...]

          • 2 months ago
            Anonymous

            >It is, more-or-less a watered down version of the 30 year old Alpha ISA after all.
            >(Alpha made sense at its time, with the transistor budget available at the time.)
            Ouch. OK, so it is the ISA. Alpha hit very high clock speeds but its performance margin over competing CPUs was always slim for the reason he gives. Fast clock but less efficient instructions. Power ISA is open now, should fricking use that instead if they want RISC.

          • 2 months ago
            Anonymous

            lol you wish risctroony

            kek
            RISC-V is possibly THE most inefficient architecture there is
            It was lambasted by people writing actual HPC code

            https://gmplib.org/list-archives/gmp-devel/2021-September/006013.html

            >I believe that an average computer science student could come up with a better instruction set than Risc V in a single term project.

            Will RISCgays ever recover?

          • 2 months ago
            Anonymous

            [...]

            It's a giant risk though

          • 2 months ago
            Anonymous

            What games do supercomputers have?

          • 2 months ago
            Anonymous

            why do people in this board always make up shit like this? are there really this many bots?

  2. 2 months ago
    Anonymous

    I like reviving and reusing old computers and you made me sad.

  3. 2 months ago
    Anonymous

    You tell em, Einstein, you zoomer homosexual

  4. 2 months ago
    Anonymous

    lol you wish risctroony

  5. 2 months ago
    Anonymous

    Good morning saaaaaars
    Maybe one day they'll make ARM chipsets in India bhaiiiiiii

  6. 2 months ago
    Anonymous

    two more weeks until x86 disappears

  7. 2 months ago
    Anonymous

    List of things I'll never use:
    >risc
    >arm
    >gnome3+
    >wayland

    • 2 months ago
      Anonymous

      >systemd

    • 2 months ago
      Anonymous

      >systemd

      >pulseaudio
      >dbus
      >UEFI

    • 2 months ago
      Anonymous

      >systemd

      [...]
      >pulseaudio
      >dbus
      >UEFI

      all of the above + pipewire

    • 2 months ago
      Anonymous

      >waaaaaaaaah people shouldn't be allowed software outside walled gardens
      ARM should be made illegal.

      God how moronic you are, the ISA has nothing to do with the software, you can run Linux on ARM or RISC-V and configure it however the frick you want.
      Is console wars the only thing you can do?

      • 2 months ago
        Anonymous

        I just wrote SIMD code for x64 the other day and then I remembered that ARM exists and ARMtoddlers may not be able to run it.

      • 2 months ago
        Anonymous

        Two problems are legacy software and knowledge. It takes time to learn a new ISA and time to optimize for it. Billions of programs in the world so it's not as simple as
        >hurr switch to arm durr just install arm linux bro

  8. 2 months ago
    Anonymous

    >waaaaaaaaah people shouldn't be allowed software outside walled gardens
    ARM should be made illegal.

  9. 2 months ago
    Anonymous

    tf uses x86 in 2024? it's all amd64

  10. 2 months ago
    Anonymous

    >but muh architecture!!!
    I like RISC designs, but a ton of the world's software runs on x86. So no, it's not going any where.
    >but muh performance! muh performance per watt!!!
    A few percentage points of speed aren't worth trashing all that software. Besides, if we stopped hiring pajeets our computers would be 100x faster. Trash the web stack and start over with something better, without pajeets, and they would be 1,000x faster. I realize that last one isn't going to happen because of all those web pages, but we could fire all the pajeets for a 100x speed boost.

    picrel: PowerPC G3 really was twice as fast as Pentium II and it didn't matter. Nice for Mac users, but not enough to make PC users trash their software.

    • 2 months ago
      Anonymous

      Most software should be trashed anyway, x86 has no reason to exist in 2024 other than muh backwards compatibility, you could say the same thing for operating systems like Windows or UNIX, or for a lot of programming languages.
      The culture of backwards compatibility is the biggest cancer in the field of technology, we could be a spacefaring civilizations without short-sighted normalgay-pandering homosexuals like you.

      • 2 months ago
        Anonymous

        I can't take this post seriously when I know that it was written by a Sabagebu troon

      • 2 months ago
        Anonymous

        Let me guess: UNIX Haters Handbook? Lisp machines?

      • 2 months ago
        Anonymous

        Let me guess: UNIX Haters Handbook? Lisp machines?

        duality of pedophiles

        • 2 months ago
          Anonymous

          >sees picture of little girl
          >OMG PEDO!!!
          The mind of a rustroon.

      • 2 months ago
        Anonymous

        Compatibility on x86 is one of the redeeming qualities it has and it's responsible for all the things you entirely take for granted on Windows, Mac and Linux.
        Imagine a world without compatibility. Oh wait, you don't have to imagine: just look at the Android phone space, where you can't install any OS on any phone, because things like a "BIOS" are an x86 convention you take for granted.

        • 2 months ago
          Anonymous

          >Imagine a world without compatibility
          Ok, I am imagining automated robots dismantling Mercury for the purpose of building a Dyson swarm around the sun.

          >The culture of backwards compatibility is the biggest cancer in the field of technology, we could be a spacefaring civilizations without short-sighted normalgay-pandering homosexuals like you.
          Sure, we'd be much better off reinventing the wheel every 5 years.

          >we'd be much better off reinventing the wheel every 5 years
          Unironically yes, you'd see true innovation instead of marketing make-believe like "AI", with time all the major problems we have to deal with right now would be solved and computing would converge towards a more stable cycle of reinventing every century or so, instead we are stuck with garbage from the 70s/80s and we have homosexuals saying "ackthually that's a good thing", absolute state.

          • 2 months ago
            Anonymous

            >with time all the major problems we have to deal with right now would be solved
            There would be far less time for solving major problems because everyone would be too busy solving old problems again.

      • 2 months ago
        Anonymous

        >The culture of backwards compatibility is the biggest cancer in the field of technology, we could be a spacefaring civilizations without short-sighted normalgay-pandering homosexuals like you.
        Sure, we'd be much better off reinventing the wheel every 5 years.

        • 2 months ago
          Anonymous

          correct
          the answer is to lobotomize anyone over 25
          nothing holds back tech more than boomers fearing change

          • 2 months ago
            Anonymous

            how about we change your state from alive to unalive, moronic zoom zoom.

  11. 2 months ago
    Anonymous

    >people that don't even know what an ISA is are complaining about an ISA
    It never ends

    • 2 months ago
      Anonymous

      none of the morons ITT complaining about anything at all ever designed a CPU, and even if they did, no serious company ever took their shitty design and made it real

  12. 2 months ago
    Anonymous

    when open GPU

  13. 2 months ago
    Anonymous

    You will never be a gaming processor

  14. 2 months ago
    Anonymous

    iToddler BTFO

  15. 2 months ago
    Anonymous

    I'm on the fence about this.

    On one hand, you're right, the instruction set is antiquated and we have better solutions.

    On the other we've been writing software that runs on this architecture for DECADES. You can't just declare it dead and buried until you have an adequate way to salvage the universe of software that already exists for it. This isn't even about preservation, it's about transitioning to smarter hardware without losing existing functionality.

    • 2 months ago
      Anonymous

      Linux distros already support like 5 different architectures just fine.

    • 2 months ago
      Anonymous

      Software is disposable. This idea that we need to keep our code around only appeared in the 90s. Before then you wrote something with the understanding you'd throw it out later. Programming would be much better if we went back to that.

      • 2 months ago
        Anonymous

        Then write stuff to replace old stuff.

        Thing is no one's doing that for legacy applications. They want to throw the baby out with the bathwater ("you don't NEED CD burning software in 2024, frick that piece of junk and move on"). So I have to rely on older software that still runs on my current hardware if that hardware can still do the task.

        • 2 months ago
          Anonymous

          This is literally a trivial example, ISO 9660 is braindead simple. Generate your volume descriptors, generate your path tables, skip the unused first ~32k bytes (16 sectors of 2048), write your volume descriptors, write your path tables, organize your directory entries and then write your file sectors. It should take you less than a day to write your own. I don't even think software that takes years to write should be kept around in perpetuity. Linux, for example, should have been retired ages ago and replaced with something fresh from the ground up.

          • 2 months ago
            Anonymous

            nta, i am writing an iso9660 cli tool right now, we are connected on the intergalactic moron spectrum

          • 2 months ago
            Anonymous

            >Linux, for example, should have been retired ages ago and replaced with something fresh from the ground up.

          • 2 months ago
            Anonymous

            9front unironically convinced me of this. It's virtually a complete modern system that's fully usable despite being substantially simpler than any Unix-derived system, including OpenBSD. Working with namespaces, immutable systems, building a fileserver, remote resources, etc. is all natural and comes easily. People who say "Linux can do anything Plan 9 can" are missing the fricking point. Even something as simple as graphics is a tacked-on construct that fills you with mental exhaustion from how overly complicated and dogshit it all is. When it all clicks, you realize at once that most of the problems we have in computing, especially ones relating to complexity, are entirely owed to the fact that we've lost our stomach for throwing out old shit. The systems we use today are still ostensibly 1970s timesharing systems at their core (yes, even fricking Windows, though it goes to great lengths to hide it)

          • 2 months ago
            Anonymous

            If your way was right the market would simply vote in favor of rewrites. let's take postgres for example: why would you wanna use something else? It works, and if it ain't broke don't fix it.
            People are constantly rewriting, but if no one uses their software I guess it's just not as good, until at some point the scale tips and it is better than the old version. this has a natural pace dictated by the market; people choose what to use

          • 2 months ago
            Anonymous

            >muh market
            Capitalism is a broken system (just like democracy); prices are manipulated and inflated by marketing and the media and do not reflect how useful or important something is. just look at "AI".

          • 2 months ago
            Anonymous

            Whew lad before we get all political on a tech thread, throw out your notion of labels mattering and look at it simply:

            >humans are fundamentally the only predators humans have
            >all systems are shit because of such
            "liberalism"
            "democracy"
            "capitalism"
            None of that shit matters. What happens in a system won't really change.

            >prices are manipulated and ...
            Yeah buddy, go dig around for how markets work in Chinese-type state sponsored "capitalism" and see how it's just turtles the whole way down.

          • 2 months ago
            Anonymous

            That's a very naive way of looking at things. The market isn't a rational superbeing driven by optimal choices or high efficiency. It's driven by status quo and perception, neither of which are based in concrete metrics. On one hand, as software companies build up and collapse over time, the practices and institutional knowledge don't disappear or even get reliably rejected. It's a crapshoot as to what transfers between institutions because the mapping between institutional practices and institutional performance is completely beyond human capacity to reason about. It's not a matter of some choice at this level being the right one or the wrong one, which leads to some given outcome. Outside of some very constrained contexts, it's an irrelevant question to begin with. Institutional success is mediated, generally, by social factors. If I have an alternative software system, my success isn't predicated on whether or not it's actually any good. It's predicated on whether my clientele has the perception that it's their best option at a macroscopic operational level, even though there's virtually no connection to begin with.

            What I speak to is the moment-to-moment experience which is endured at a microscopic operational level. While there IS a connection to the macroscopic level here, it may as well not exist. If you've worked as an engineer in the corporate world all you'll ever know is perpetual frustration and burn-out because of how terrible this shit all is. The market is completely blind to these kinds of consequences, among others, which is exactly why it's a useless metric to begin with. "If it works, but it ain't broke, don't fix it" is exactly the kind of mentality which drives a slow, virtually imperceptible degeneration and eventual catastrophic failure.

          • 2 months ago
            Anonymous

            Very good post thank you anon

          • 2 months ago
            Anonymous

            That was just an example.

            For something more theoretical imagine there's an archive format that only works with a proprietary piece of software from a company that went out of business fifteen years ago and the only binaries you can find on Internet Archive from 2001 and says it last supported Windows 2000.

            You download it and it runs because, despite your CPU being years newer, it can still execute instructions from software written decades ago, falling back on the accumulated layers of x86's ever-expanding scope.

            If you didn't have some way to execute that software because the hardware platform moved on you'd end up in a situation where you need to reverse engineer that archive format to unpack it, which would be horrible if all you need are some files in it and intend to repack it in a more compatible archive. No one's going to bother porting something obscure and situational to a better architecture so it's either hope your system can virtualize a platform running on a different CPU architecture or get to crackin'.

          • 2 months ago
            Anonymous

            Problem is that even in that scenario, the cost of losing those files (assuming that they're even important) is not even close to the consequential cost of never throwing old shit out, which is massively underrecognized and understood, even by people who suffer greatly from it as a result. See my post here on the actual consequences of eternal legacy support as a deleterious mechanism in society:

            Almost like the competency crisis is starting to reach critical mass. The competency crisis, which is primarily caused by the fact that our systems have gotten too complex. What drives the complexity? Common rhetoric would have you believe that it's inherent to the domain, but it's not. The complexity is caused by compounding layers of abstraction, as more and more capability get bolted on to the same legacy systems and foundational concepts that date back to the 1970s, and the conflict between these foundations and what we want out of them grows exponentially. Conflict that is resolved through... more abstraction. And eventually the abstractions begin to conflict with each other, and you start to need more abstractions to handle that. The complexity explodes exponentially.

          • 2 months ago
            Anonymous

            The psychology evident in this post...hell, in OP's post...is why we have too many layers of abstraction.

            >the consequential cost of never throwing old shit out
            What do you think we would gain by throwing x86 out? Let's say that tomorrow the gov mandates that no new x86 CPUs can be made. Let's say all the governments of the world agree. Everyone has 1 year to replace all x86 hardware with something else. Then what?

            ARM would likely win in the marketplace. Our machines would be a little faster and use a little bit less power, assuming optimized code, which is not a safe assumption. Would software engineering be easier? No. Would software have less bugs or be more secure? Nope. The opposite as devs struggle to replace old and tested code. What would we gain?

          • 2 months ago
            Anonymous

            >What do you think we would gain by throwing x86 out?
            x86 is the least of all problems. And I agree with you that throwing it out without changing anything about our culture will just result in a sidegrade.
            >Our machines would be a little faster and use a little bit less power
            They wouldn't. The "ARM is more power efficient" thing is a meme. When you actually use Apple silicon for serious work, this is immediately apparent.

            I don't like x86 frankly. But if we were to throw it out, the ideal path is to replace it with something new. Something which is still socketed and still uses a BIOS/UEFI + ACPI. Moving to architectures which are principally SoC ecosystems isn't even a sidegrade, it's a strict downgrade and immensely undesirable.

            [...]
            You're both right; not only are your theories not mutually exclusive, they actually fuel each other: more complexity requires more high-IQ, motivated people to manage it, and current trends in demographics and society exacerbate the problems that arise from backwards-compatibility/legacy-induced complexity. now shake hands and kiss.
            [...]
            What we need is a culture radically different from the current culture of "backwards compatibility at all costs", yes throwing out x86 won't change things by itself but it would be a step in the right direction, investing in alternative ISA like RISC-V would be another step, investing in OS research would be another, etc.
            nta

            Well put.

          • 2 months ago
            Anonymous

            >>Our machines would be a little faster and use a little bit less power
            >They wouldn't. The "ARM is more power efficient" thing is a meme. When you actually use Apple silicon for serious work, this is immediately apparent.
            ARM chips score higher per watt. Whether or not any particular Apple implementation does is another issue since they are throwing a lot more on those chips than just ARM cores.

            >I don't like x86 frankly. But if we were to throw it out, the ideal path is to replace it with something new.
            What? What do you think you'll gain? Microprocessor engineers have been studying this shit for decades. The groups that figured out RISC at IBM, Stanford, and Berkeley were the last step forward of the kind you're dreaming of. VLIW failed. RISC succeeded, but not by such a great margin that it warranted throwing away x86. Backwards compatibility still held the market back when PowerPC was literally twice as fast. No one has exceeded that, and the difference is usually less.

        • 2 months ago
          Anonymous

          It's not even legacy shit like CD burning. We are going backwards on software in terms of performance, efficiency, stability, and security. To use an example almost everyone has to deal with: modern office suites suck ass compared to early 2000s Office.

          There is a TON of shit for which there is no current replacement.

          • 2 months ago
            Anonymous

            Almost like the competency crisis is starting to reach critical mass. The competency crisis, which is primarily caused by the fact that our systems have gotten too complex. What drives the complexity? Common rhetoric would have you believe that it's inherent to the domain, but it's not. The complexity is caused by compounding layers of abstraction, as more and more capability get bolted on to the same legacy systems and foundational concepts that date back to the 1970s, and the conflict between these foundations and what we want out of them grows exponentially. Conflict that is resolved through... more abstraction. And eventually the abstractions begin to conflict with each other, and you start to need more abstractions to handle that. The complexity explodes exponentially.

          • 2 months ago
            Anonymous

            >Almost like the competency crisis is starting to reach critical mass. The competency crisis, which is primarily caused by the fact that our systems have gotten too complex.
            It's caused by:
            >decreases in the pool of available >avg IQ people
            >increases in the pool of available <avg IQ people
            >government demanding the selection of people based on anything other than merit
            >mass misallocation of financial resources in both the private and public sectors
            >the implosion of western, particularly U.S., educational systems, both primary and secondary
            >higher education becoming a paper mill
            >labor markets flooded by people who don't even have U.S. paper mill degrees, but entirely fraudulent foreign degrees
            >people dropping out and not trying due to offshoring, H1B, inflation, mass migration, and the general perception that western governments hate their citizens and everything is falling apart
            >the corporate need to introduce "change" whether it's better or not because it drives upgrades which = profits

            >The complexity is caused by compounding layers of abstraction, as more and more capability get bolted on to the same legacy systems and foundational concepts that date back to the 1970s
            Oh for frick's sake, take your "unix haters amirite?" book and burn it. I tried getting through one chapter, just one lousy chapter of that stupid thing and had to stop because I was hurting myself from pounding my head on the desk. It's that stupid and the author is a whiney little child.

            The fundamentals of EE were worked out before the 1970s and emerge from the properties of electronic systems and combinational logic. We do have too many layers of abstraction today, but not for the reasons you believe.

          • 2 months ago
            Anonymous

            >decreases in the pool of available >avg IQ people
            >increases in the pool of available <avg IQ people
            Nope. A common rhetorical point used to explain the competency crisis, rather than the actual cause which is runaway systemic complexity. The pools of both are greatly increased btw.

            Most of the rest of these are just common unsubstantiated political rhetoric used by talking heads to great effect. But it doesn't matter because it's fundamentally not exclusive to the US nor even commercial software. H1B visa issuance in the 2000s quite literally has nothing to do with the fact that the graphics architecture of Unix systems was complete dogshit in the 1980s and we're still stuck with it. It has nothing to do with the fact that Microsoft's NT, Linux, Mach, etc. are ostensibly based on the principles of ancient timesharing mainframes.

            >higher education becoming a paper mill
            Non-graduate-level education always was a paper mill, and you've sorely misunderstood its role in society if you ever thought otherwise.
            >Oh for frick's sake, take your "unix haters amirite?" book and burn it. I tried getting through one chapter, just one lousy chapter of that stupid thing and had to stop because I was hurting myself from pounding my head on the desk. It's that stupid and the author is a whiney little child.
            I have no fricking clue what the frick you're on about, I'm talking from my own hands-on experience both professional and academic and the inductive truths I've unraveled from that experience.

            >The fundamentals of EE were worked out before the 1970s and emerge from the properties of electronic systems and combinational logic.
            EE has nothing to do with the way that concepts like graphics, IPC, network transparency, etc. and the APIs and flow used to leverage these things are treated as user-facing abstractions. You're all over the place here dude.

          • 2 months ago
            Anonymous

            >decreases in the pool of available >avg IQ people
            >increases in the pool of available <avg IQ people
            >Nope.
            Declining avg IQ in the U.S. is documented.

            >Most of the rest of these are just common unsubstantiated political rhetoric used by talking heads to great effect.
            LOL which one was false? Go ahead, I'm listening.

            >the graphics architecture of Unix systems was complete dogshit in the 1980s
            LMFAO dude...quit sucking Simson Garfinkel's and Steven Strassmann's wieners. Just fricking stop, OK? It's disgusting.

            >ostensibly based on the principles of ancient timesharing mainframes.
            A modern desktop or phone actually makes heavier demands on CPU time-sharing than those old mainframes did.

            >I have no fricking clue what the frick you're on about
            Sure you don't. Every day there's a thread on IQfy where some b***h whines about UNIX and C and brings up that book and "muh 70s mainframes." But you're not him and you've never heard of the book, you're just pushing the same debunked points, right?

            >EE has nothing to do with the way that concepts like graphics, IPC, network transparency, etc.
            Stopped reading right there. Look, burn the book, maybe read a book on computer architecture, stop wasting IQfy's time.

          • 2 months ago
            Anonymous

            >Declining avg IQ in the U.S. is documented.
            NTA but IQ decline in the US is just severely declining academic standards. Before the decline, the nonwhite population in the US was still increasing.

          • 2 months ago
            Anonymous

            >Before the decline, the nonwhite population in the US was still increasing.
            Yes, but now it's increasing via <80 IQ "refugees" from all over the Earth. Before it was births + immigrants from Mexico.

          • 2 months ago
            Anonymous


            You're both right: not only are your theories not mutually exclusive, they actually fuel each other. More complexity requires more high-IQ, motivated people to manage it, and current demographic and social trends exacerbate the problems that arise from backwards-compatibility/legacy-induced complexity. Now shake hands and kiss.

          • 2 months ago
            Anonymous

            The psychology evident in this post... hell, in OP's post... is why we have too many layers of abstraction.

          • 2 months ago
            Anonymous

            >the consequential cost of never throwing old shit out
            What do you think we would gain by throwing x86 out? Let's say that tomorrow the gov mandates that no new x86 CPUs can be made. Let's say all the governments of the world agree. Everyone has 1 year to replace all x86 hardware with something else. Then what?

            ARM would likely win in the marketplace. Our machines would be a little faster and use a little less power, assuming optimized code, which is not a safe assumption. Would software engineering be easier? No. Would software have fewer bugs or be more secure? Nope. The opposite, as devs struggle to replace old and tested code. What would we gain?

          • 2 months ago
            Anonymous

            nta
            What we need is a culture radically different from the current culture of "backwards compatibility at all costs". Throwing out x86 won't change things by itself, but it would be a step in the right direction. Investing in alternative ISAs like RISC-V would be another step, investing in OS research another, etc.

          • 2 months ago
            Anonymous

            >What we need is a culture radically different from the current culture of "backwards compatibility at all costs"
            But you didn't answer: what do you think we will gain? Literally millions of man-hours of working code gone... for what? What do we get?

    • 2 months ago
      Anonymous

      >the instruction set is antiquated and we have better solutions.
      Intel and AMD have competed pretty well with it for decades. Yes, yes, RISC can be faster and/or more power efficient, but not by leaps and bounds. You would have to have a permanent order-of-magnitude performance increase to justify trashing decades of developed, tested software. (Of course, if you had that, you could emulate said software at speed.) As is, at any given point in time RISC is 1.5-2x better, if that, because there have been times when x86 was just as fast.

      • 2 months ago
        Anonymous

        I'll remind you that once Intel went Curry, their improvements stagnated. This forced Apple onto ARM and had the added benefit of not paying the Intel tax.

  16. 2 months ago
    Anonymous

    actually it will get a lot worse. we'll move to SoC systems, and with zero standardization of any kind it'll effectively lock you into a much tighter box of what you're capable of doing with your own hardware. Here's a fun exercise for you: try to bootstrap a Forth system on a random SBC from scratch. Make your own bootloader, etc. Then take that code and port it to another random SBC. See how much of your own code you end up throwing out.

  17. 2 months ago
    Anonymous

    x265 and GTA V benchmarks say otherwise.

  18. 2 months ago
    Anonymous

    i dont know bro, i like being able to swap just my cpu or just my gpu, you know, things that arent possible in arm by design.

    • 2 months ago
      Anonymous

      OP wasn't saying ARM has to be the pivot off x86, just that x86 needs to move towards deprecation.

      • 2 months ago
        Anonymous

        ARM is the only other prominent architecture tho, how will you kill x86 without something to replace it with? I don't care what they replace it with as long as hardware freedom stays the same

    • 2 months ago
      Anonymous

      The CPU being in a socket has nothing to do with the ISA.

      • 2 months ago
        Anonymous

        So it's just an assembly set change then, right? When I think of an architecture change I think of shit like what ARM pulls off. Tbh I don't really know that much

        • 2 months ago
          Anonymous

          why don't you elaborate on what you mean more clearly?

        • 2 months ago
          Anonymous

          The design of the physical hardware is up to the manufacturer. If you want an ARM workstation you can buy one.

          • 2 months ago
            Anonymous

            Huh, alright i need to read more about this. I thought a key aspect of arm was that everything was supposed to be one whole bundle, but I guess thats only for phones. Ty anon

          • 2 months ago
            Anonymous

            ARM is just an instruction set. Everything above that is somebody else's decision.

          • 2 months ago
            Anonymous

            reduced instructions won't help in a 3D-heavy environment. enjoy calculating those PI digits

      • 2 months ago
        Anonymous

        Socketed consumer hardware for ARM will never exist. ARM ServerReady has existed for a long time and nobody uses it outside of very expensive commercial hardware that isn't available to you. The future will be shitty black box SoCs with proprietary hardware extensions and terrible hardware bugs. The sentiment and norms of the market have already been set, and they will never change because that's not how these things work. When x86 dies, so too will UEFI + ACPI in consumer hardware.

        >x86-64 is prefix hell and you could get a 10x by removing all the ambiguous and legacy instructions while still being cisc for instruction throughput. intels new cope is to add MORE PREFIXES

        AMD64 is one of the few synthetic formal languages I can think of, which is actually kind of cool.

        • 2 months ago
          Anonymous

          >synthetic formal languages
          what does this mean? As in it has a lot of agglutination?

          • 2 months ago
            Anonymous

            That it has morphology in general, though it doesn't really translate to the same kind of morphology you see in natural languages. Things like being able to specify pointer arithmetic and do complex loads in a single instruction, change addressing modes, etc.

            These are both single instructions; they have the same number of operands:
            MOV EAX, ECX
            MOV DWORD PTR [EAX + 2 * EBX], ECX

            The second one doesn't get assembled into multiple instructions or anything, it's not an expression. The microcode looks pretty much the same, albeit with a more complex set of routing going on. Does that make sense?
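            The scaled-index store in the second MOV maps directly onto ordinary array indexing in C. A minimal sketch (the single-instruction claim reflects typical x86-64 codegen, not something verified here, and `store_scaled` is a made-up name):

```c
/* Scaled-index addressing as plain C: the address computed is
 * base + sizeof(int) * i. An x86-64 compiler can typically emit this
 * whole store as one instruction, e.g. mov dword ptr [rdi + 4*rsi], edx,
 * while a classic load/store RISC needs a shift, an add, then a store. */
void store_scaled(int *base, long i, int value) {
    base[i] = value;
}
```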

  19. 2 months ago
    Anonymous

    x86-64 is prefix hell and you could get a 10x by removing all the ambiguous and legacy instructions while still being cisc for instruction throughput. intels new cope is to add MORE PREFIXES
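    The "prefix hell" is visible at the byte level: a decoder has to skip a variable number of prefix bytes before it even reaches the opcode. A minimal sketch in C (the function names are made up; the prefix byte values and the sample encodings mentioned below are real x86-64):

```c
#include <stdbool.h>
#include <stddef.h>

/* Legacy prefixes: operand/address-size overrides, LOCK, REP/REPNE,
 * and segment overrides. */
static bool is_legacy_prefix(unsigned char b) {
    switch (b) {
    case 0x66: case 0x67:             /* operand-size, address-size */
    case 0xF0: case 0xF2: case 0xF3:  /* LOCK, REPNE, REP */
    case 0x2E: case 0x36: case 0x3E:  /* CS, SS, DS overrides */
    case 0x26: case 0x64: case 0x65:  /* ES, FS, GS overrides */
        return true;
    default:
        return false;
    }
}

/* Count prefix bytes (legacy + REX) before the opcode. 0x40-0x4F is a
 * REX prefix only in 64-bit mode; in 32-bit code those same bytes are
 * INC/DEC instructions, which is part of what makes decoding ambiguous. */
size_t count_prefixes(const unsigned char *insn, size_t len) {
    size_t n = 0;
    while (n < len && (is_legacy_prefix(insn[n]) ||
                       (insn[n] >= 0x40 && insn[n] <= 0x4F)))
        n++;
    return n;
}
```

    The same logical MOV grows as prefixes stack up: `89 D8` (mov eax, ebx) has none, `48 89 D8` (mov rax, rbx) carries a REX.W prefix, and `66 89 D8` (mov ax, bx) carries an operand-size override.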

    • 2 months ago
      Anonymous

      >x86-64 is prefix hell and you could get a 10x by removing all the ambiguous and legacy instructions
      That's what RISC does and it's not 10x. Never was.

      >while still being cisc for instruction throughput
      >cisc
      >for instruction throughput
      Anon I...

    • 2 months ago
      Anonymous

      You really can't remove much without making it incompatible with all existing software. What you are proposing is basically making a new ISA from scratch.

  20. 2 months ago
    Anonymous

    I've been hearing about the woes of x86 for over 25 years.
    That's an incredible amount of time for an obviously trash ISA to still be relevant.

  21. 2 months ago
    Anonymous

    explain why? we've heard the usual arguments, but I'm personally unconvinced. As a matter of fact, I suspect most people who make this statement don't know basic things like the fact that most x86-64 CPUs released today are almost exactly the same in the backend as ARM and RISC chips: the ISA gets translated into similar "stuff" for the backend to process either way.

    knowing that, why do you think ditching this ISA would improve technology as a whole? why would it be better to "ditch" it instead of just having ARM and RISC desktop CPUs and motherboards be more commonplace? this isn't a zero-sum game.

  22. 2 months ago
    Anonymous

    When acidburn said that RISC architecture was gonna change everything she didn't mean in a good way.

  23. 2 months ago
    Anonymous

    [...]

    >shifts entire industry to ARM
    >this is what macgays actually believe
    Go back to reading your email, Nancy.

  24. 2 months ago
    Anonymous

    It did years ago, we mostly use X86-64 nao.

    • 2 months ago
      Anonymous

      8086 gays blown the frick out. How will 16-bit ever recover?

  25. 2 months ago
    Anonymous

    Honest question: how? x86 code is still pretty dense, and it has atomics semantics I'd argue are sensible defaults. What exactly does the competition do better? RISC-V's most compelling case is being "simple" and patent-unrestricted. ARMv8 is basically racing to gather a lot of the same weirdness as x86: ARM now has a CAS instruction (sounds pretty cisc to me...) and a bunch of weird addressing modes.
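    The CAS point is easy to make concrete with C11 atomics. A sketch (the codegen claims in the comments are what compilers commonly emit, not verified here):

```c
#include <stdatomic.h>

static _Atomic long counter = 0;

/* Lock-free increment via a compare-and-swap retry loop. On x86-64 the
 * compare-exchange typically maps to a single `lock cmpxchg`; older ARM
 * spells it as an ldxr/stxr loop, and ARMv8.1 LSE added a one-instruction
 * CAS, which is the "sounds pretty cisc" jab above. */
void increment(void) {
    long old = atomic_load_explicit(&counter, memory_order_relaxed);
    while (!atomic_compare_exchange_weak_explicit(
            &counter, &old, old + 1,
            memory_order_seq_cst, memory_order_relaxed)) {
        /* CAS failed: `old` was refreshed with the current value; retry. */
    }
}
```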

    • 2 months ago
      Anonymous

      Its main production facilities aren't located outside of China.

  26. 2 months ago
    Anonymous

    MFW Motorola 68k architecture could have been used by IBM's original PC, and is almost the REVERSE name of x86. Was it done on purpose?

  27. 2 months ago
    Anonymous

    [...]

    any lincuck distro from even booting with T2
    bootcamp?

  28. 2 months ago
    Anonymous

    [...]

    >Say it with me now:
    >THANKYOUBASEDAPPLE
    Whatever you're on, I want some.
