Parallelism on the same machine isn't real.

  1. 2 months ago
    Anonymous

    are you saying modern CPUs don't have multiple individual processors or what is your point here?

    • 2 months ago
      Anonymous

      I'm saying you can't access RAM in parallel
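
      A rough way to probe that claim: have several threads each stream through their own slice of a buffer much larger than cache and see whether wall-clock time keeps dropping as threads are added. If the memory bus is the shared bottleneck, the speedup flattens. A minimal sketch with arbitrary buffer size and thread counts (the helper name sum_with_threads is made up for illustration):

      ```cpp
      #include <chrono>
      #include <cstdint>
      #include <iostream>
      #include <numeric>
      #include <thread>
      #include <vector>

      // Sum a big buffer with nthreads threads, each streaming through its own chunk.
      // If RAM bandwidth is the real bottleneck, the speedup flattens well below nthreads.
      std::uint64_t sum_with_threads(const std::vector<std::uint64_t>& buf, unsigned nthreads) {
          std::vector<std::uint64_t> partial(nthreads, 0);
          std::vector<std::thread> workers;
          const std::size_t chunk = buf.size() / nthreads;
          for (unsigned t = 0; t < nthreads; ++t) {
              workers.emplace_back([&, t] {
                  const std::size_t begin = t * chunk;
                  const std::size_t end = (t + 1 == nthreads) ? buf.size() : begin + chunk;
                  partial[t] = std::accumulate(buf.begin() + begin, buf.begin() + end,
                                               std::uint64_t{0});
              });
          }
          for (auto& w : workers) w.join();
          return std::accumulate(partial.begin(), partial.end(), std::uint64_t{0});
      }

      int main() {
          std::vector<std::uint64_t> buf(1u << 26, 1);   // ~512 MiB, far larger than any cache
          for (unsigned n : {1u, 2u, 4u, 8u}) {
              auto start = std::chrono::steady_clock::now();
              std::uint64_t sum = sum_with_threads(buf, n);
              auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                            std::chrono::steady_clock::now() - start).count();
              std::cout << n << " threads: " << ms << " ms (sum=" << sum << ")\n";
          }
      }
      ```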

      • 2 months ago
        Anonymous

        but you can access the L1 and L2 caches individually, and if you write cache-efficient code your program will be 1000x faster
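
        For what anon presumably means by cache-efficient: walk data in the order it's laid out in memory so each cache line gets fully used before it's evicted. A rough illustration comparing row-major and column-major traversal of the same matrix (the size is arbitrary and the exact speedup depends on the machine):

        ```cpp
        #include <chrono>
        #include <cstdint>
        #include <iostream>
        #include <vector>

        int main() {
            constexpr std::size_t N = 8192;                 // 8192 x 8192 int32 = 256 MiB
            std::vector<std::int32_t> m(N * N, 1);

            auto time_sum = [&](bool row_major) {
                auto start = std::chrono::steady_clock::now();
                std::int64_t sum = 0;
                for (std::size_t i = 0; i < N; ++i)
                    for (std::size_t j = 0; j < N; ++j)
                        // Row-major walks memory sequentially; column-major jumps N*4 bytes
                        // per access and misses the cache on nearly every load.
                        sum += row_major ? m[i * N + j] : m[j * N + i];
                auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                              std::chrono::steady_clock::now() - start).count();
                std::cout << (row_major ? "row-major:    " : "column-major: ")
                          << ms << " ms (sum=" << sum << ")\n";
            };

            time_sum(true);
            time_sum(false);
        }
        ```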

        • 2 months ago
          Anonymous

          still not parallel. Shit needs to be loaded into that memory somehow. Lx cache isn't 100MB

          • 2 months ago
            Anonymous

            It literally is parallel you webshitter clown.

          • 2 months ago
            Anonymous

            No it isn't.

          • 2 months ago
            Anonymous

            >try to make "true parallel" code according to OP
            >put it on multiple machines
            >they need to occasionally coordinate with a clock service
            >clock service can service 10000 requests per second, but not actually at the same exact time
            >my algorithm isn't actually "true parallel" despite being on separate machines

          • 2 months ago
            Anonymous

            Just post that shit in a queue. No need to coordinate shit.
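
            In the spirit of that suggestion: if workers only hand results through a queue, the mutex inside it is the only ordering anyone needs, and no clock service is involved. A minimal sketch (the WorkQueue name, worker count, and job payload are made up):

            ```cpp
            #include <condition_variable>
            #include <iostream>
            #include <mutex>
            #include <optional>
            #include <queue>
            #include <thread>
            #include <vector>

            // A minimal thread-safe queue: producers push, consumers pop, and nobody
            // needs a shared clock -- the mutex orders the hand-offs.
            template <typename T>
            class WorkQueue {
            public:
                void push(T item) {
                    { std::lock_guard<std::mutex> lk(mu_); q_.push(std::move(item)); }
                    cv_.notify_one();
                }
                // Returns std::nullopt once the queue is closed and drained.
                std::optional<T> pop() {
                    std::unique_lock<std::mutex> lk(mu_);
                    cv_.wait(lk, [&] { return !q_.empty() || closed_; });
                    if (q_.empty()) return std::nullopt;
                    T item = std::move(q_.front());
                    q_.pop();
                    return item;
                }
                void close() {
                    { std::lock_guard<std::mutex> lk(mu_); closed_ = true; }
                    cv_.notify_all();
                }
            private:
                std::mutex mu_;
                std::condition_variable cv_;
                std::queue<T> q_;
                bool closed_ = false;
            };

            int main() {
                WorkQueue<int> queue;
                std::vector<std::thread> workers;
                for (int w = 0; w < 4; ++w) {
                    workers.emplace_back([&queue, w] {
                        while (auto job = queue.pop())
                            std::cout << "worker " << w << " handled job " << *job << "\n";
                    });
                }
                for (int i = 0; i < 16; ++i) queue.push(i);
                queue.close();
                for (auto& t : workers) t.join();
            }
            ```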

      • 2 months ago
        Anonymous

        good thing the CPU doesn't get bottlenecked by RAM for most things, since a crap ton of workloads are bottlenecked by cache, storage I/O, or the network instead

        • 2 months ago
          Anonymous

          This is what Python devs actually believe

          • 2 months ago
            Anonymous

            name an everyday, popular, basic use case where the software is severely limited by the fact that it can't "access RAM in my definition of what parallel is"

          • 2 months ago
            Anonymous

            >everyday usage
            This doesn't need anything more than a 100 MHz CPU. Webshitters ruined the web with UI frickery, microservices, docker containers, etc.

          • 2 months ago
            Anonymous

            non sequitur to the parallelism discussion

          • 2 months ago
            Anonymous

            >fricking webshitters and their...
            notes
            >microservices, fricking up my parallelism

          • 2 months ago
            Anonymous

            literally every program you use
            read/write ops are the slowest instructions unless the data is already in L1

        • 2 months ago
          Anonymous

          >good thing the cpu doesnt get bottlenecked by ram
          lmao you can't be serious, that's by far the most common bottleneck. A cache miss that goes all the way to RAM costs a few hundred cycles, compared to the 4-40 instructions you were probably going to run on that data. Literally 10-100x slower.

          Unless you're just moving data from A to B without doing some significant degree of processing, RAM is uber slow.
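
          One way to see the gap anon is describing: read the same amount of data once in sequential order (prefetcher-friendly) and once in a shuffled order (a cache miss on nearly every access), then compare timings. A rough sketch with arbitrary sizes:

          ```cpp
          #include <algorithm>
          #include <chrono>
          #include <cstdint>
          #include <iostream>
          #include <numeric>
          #include <random>
          #include <vector>

          int main() {
              constexpr std::size_t N = 1u << 25;             // 32M entries, ~256 MiB each array
              std::vector<std::size_t> idx(N);
              std::iota(idx.begin(), idx.end(), 0);
              std::vector<std::uint64_t> data(N, 1);

              auto time_walk = [&](const char* label) {
                  auto start = std::chrono::steady_clock::now();
                  std::uint64_t sum = 0;
                  for (std::size_t i : idx) sum += data[i];   // same work, different access pattern
                  auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                                std::chrono::steady_clock::now() - start).count();
                  std::cout << label << ms << " ms (sum=" << sum << ")\n";
              };

              time_walk("sequential: ");                      // indices in order: streams through RAM
              std::shuffle(idx.begin(), idx.end(), std::mt19937_64{42});
              time_walk("random:     ");                      // shuffled indices: a miss per access
          }
          ```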

      • 2 months ago
        Anonymous

        *GNAM*

      • 2 months ago
        Anonymous

        >still not parallel. Shit needs to be loaded into that memory somehow. Lx cache isn't 100MB
        each CPU has its own registers and, as the other anon said, you can partition the cache.
        once the data is loaded, each CPU performs its operations independently of the other CPUs, making it parallel by definition, you fricking turd wrangler
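
        A concrete version of that partitioning: split the data into per-thread slices up front, and after the split each thread only touches its own slice until the final join. A minimal sketch (the square_in_parallel name, thread count, and data size are arbitrary):

        ```cpp
        #include <algorithm>
        #include <cstddef>
        #include <iostream>
        #include <thread>
        #include <vector>

        // Partition the data so each thread (ideally each core) owns a disjoint slice.
        // After the split, each thread only reads and writes its own slice; join() is
        // the only synchronization.
        void square_in_parallel(std::vector<double>& data, unsigned nthreads) {
            std::vector<std::thread> workers;
            const std::size_t chunk = data.size() / nthreads;
            for (unsigned t = 0; t < nthreads; ++t) {
                const std::size_t begin = t * chunk;
                const std::size_t end = (t + 1 == nthreads) ? data.size() : begin + chunk;
                workers.emplace_back([&data, begin, end] {
                    for (std::size_t i = begin; i < end; ++i)
                        data[i] *= data[i];                 // purely local work on this slice
                });
            }
            for (auto& w : workers) w.join();
        }

        int main() {
            std::vector<double> data(1'000'000, 3.0);
            const unsigned n = std::max(1u, std::thread::hardware_concurrency());
            square_in_parallel(data, n);
            std::cout << data.front() << " ... " << data.back() << "\n";  // 9 ... 9
        }
        ```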

        • 2 months ago
          Anonymous

          not just registers; on most modern CPUs the L1 and L2 caches are per-core as well
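
          Per-core L1/L2 is also why thread-private data should sit on its own cache lines: two threads bumping counters that merely share a line keep stealing that line from each other (false sharing), even though they never touch the same variable. A rough sketch of the effect, with made-up struct names and iteration counts:

          ```cpp
          #include <atomic>
          #include <chrono>
          #include <cstdint>
          #include <iostream>
          #include <thread>

          // Two counters that typically land on the same cache line vs. two counters
          // padded onto separate lines. Each thread touches only its own counter.
          struct Packed { std::atomic<std::uint64_t> a{0}, b{0}; };
          struct Padded {
              alignas(64) std::atomic<std::uint64_t> a{0};
              alignas(64) std::atomic<std::uint64_t> b{0};
          };

          template <typename Counters>
          long run(Counters& c) {
              auto start = std::chrono::steady_clock::now();
              std::thread t1([&] { for (int i = 0; i < 50'000'000; ++i) c.a.fetch_add(1, std::memory_order_relaxed); });
              std::thread t2([&] { for (int i = 0; i < 50'000'000; ++i) c.b.fetch_add(1, std::memory_order_relaxed); });
              t1.join();
              t2.join();
              return std::chrono::duration_cast<std::chrono::milliseconds>(
                         std::chrono::steady_clock::now() - start).count();
          }

          int main() {
              Packed packed;
              Padded padded;
              std::cout << "same cache line:      " << run(packed) << " ms\n";  // cores fight over the line
              std::cout << "separate cache lines: " << run(padded) << " ms\n";  // no interference
          }
          ```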

          • 2 months ago
            Anonymous

            not that the 32MB of shared L3 on even a $50 used R5 3600 is a small amount to keep instructions or data in, anyway

          • 2 months ago
            Anonymous

            L3 cache is shared

          • 2 months ago
            Anonymous

            yes, that's what I said

      • 2 months ago
        Anonymous

        why not? I put 4 sticks in and I should be able to access all 4 in parallelogram

      • 2 months ago
        Anonymous

        That's what dual channel is for.

  2. 2 months ago
    Anonymous

    Works on my machine

  3. 2 months ago
    Anonymous

    Oh shit, my Pentium 4 is a parallel processor?

    • 2 months ago
      Anonymous

      it's an illusion. If it weren't, it would be able to run two or more operating systems at the same time without a VM. Raw metal.

    • 2 months ago
      Anonymous

      no, it's a perpendicular processor.

  4. 2 months ago
    Anonymous

    https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect

  5. 2 months ago
    Anonymous

    Works on my machine

  6. 2 months ago
    Anonymous

    You can run multiple processes on the same machine.

    • 2 months ago
      Anonymous

      and they execute sequentially
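
      A quick way to check that claim: run a fixed amount of pure CPU work on one worker and then split it across four, and compare wall-clock time. Threads are used below as a stand-in for processes; on a multi-core machine, the four-way split finishing roughly 4x sooner means the work really did execute concurrently rather than being time-sliced. A rough sketch with arbitrary iteration counts:

      ```cpp
      #include <chrono>
      #include <cstdint>
      #include <iostream>
      #include <thread>
      #include <vector>

      // CPU-bound busy work with no memory traffic, so RAM can't be the bottleneck.
      std::uint64_t spin(std::uint64_t iters) {
          std::uint64_t x = 1;
          for (std::uint64_t i = 0; i < iters; ++i)
              x = x * 6364136223846793005ULL + 1442695040888963407ULL;
          return x;
      }

      long time_ms(unsigned nthreads, std::uint64_t total_iters) {
          auto start = std::chrono::steady_clock::now();
          std::vector<std::thread> workers;
          for (unsigned t = 0; t < nthreads; ++t)
              workers.emplace_back([&] {
                  volatile std::uint64_t sink = spin(total_iters / nthreads);  // keep the work from being optimized away
                  (void)sink;
              });
          for (auto& w : workers) w.join();
          return std::chrono::duration_cast<std::chrono::milliseconds>(
                     std::chrono::steady_clock::now() - start).count();
      }

      int main() {
          constexpr std::uint64_t kIters = 1'000'000'000ULL;
          std::cout << "1 thread:  " << time_ms(1, kIters) << " ms\n";
          std::cout << "4 threads: " << time_ms(4, kIters) << " ms\n";  // ~4x faster if truly parallel
      }
      ```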

  7. 2 months ago
    Anonymous

    Meds

  8. 2 months ago
    Anonymous

    >webshitter doesn't know what cache is
