What comes next after reaching the physical limits of silicon?

  1. 2 years ago
    Anonymous

    Apple's M2 Silicon.

    • 2 years ago
      Anonymous

      autist

  2. 2 years ago
    Anonymous

    Specialized chips obviously.
    Already seeing it in raytracing, AI, stuff like that. Also software optimizations.

    • 2 years ago
      Anonymous

      I do think we may see FPGA cards in addition to our CPUs and GPUs. What CUDA does for compute, FPGAs can do to CUDA.

      Obviously we need a lot of software before FPGA cards for consumers become sensible, but the benefits could be huge.

      • 2 years ago
        Anonymous

        Yeah I see the potential for FPGA units in consumer use but not for a long time I don't think. It's still far away from being economical/applicable to daily uses.

        • 2 years ago
          Anonymous

          Actual FPGA chips aren't that expensive. I'm having a harder time coming up with applications though. The best I can think of is stuff like decoding video formats that are invented after the chip is made. But GPGPU seems more applicable for that.

      • 2 years ago
        Anonymous

        >Yeah I see the potential for FPGA units in consumer use but not for a long time I don't think. It's still far away from being economical/applicable to daily uses.

        FPGAs are dead in the water for general purpose compute with how absolute shit and vendor/version specific the tooling is.

        • 2 years ago
          Anonymous

          >Yeah I see the potential for FPGA units in consumer use but not for a long time I don't think. It's still far away from being economical/applicable to daily uses.

          Yeah, FPGAs today are basically where GPUs were in the mid '90s. They existed, they kicked ass, but you had to write your software specifically for one model. Corporations did some pretty impressive stuff with the GPUs of the time, just like corporations today do some pretty impressive stuff with FPGAs (Bing replaced entire racks of servers with FPGA cards).
          Give them some standardisation and common libraries, the way OpenGL and DirectX did for GPUs, and I could see them becoming more viable.

          • 2 years ago
            Anonymous

            Spot on. They have potential but we're 10 or 20 years out.

      • 2 years ago
        Anonymous

        >FPGA cards
        I hope not.
        I wouldn't mind an FPGA socket on the MB, but in my opinion the expansion-card model is pretty aged. I'd rather have a GPU socket as well. You could have a new motherboard standard with the CPU socket on top, GPU socket in the middle, and FPGA socket on the bottom, and end up with much slimmer and more efficient computers than we have today, with effective server-like front-intake, rear-exhaust ventilation using standard heatsinks. The benefit of ATX is easy expansion with cards, but the only cards 99% of people use today are GPUs anyway. It made more sense back when a computer wasn't usable with fewer than five cards. Better to move that to the MB. Most don't even use SATA any more; two M.2s and high-speed Ethernet are enough thanks to NAS.

      • 2 years ago
        Anonymous

        >FPGA cards
        It's doubtful that another 'AIB' other than GPUs will ever enter the consumer market.
        There has, however, been a push towards making some parts of the CPU reconfigurable (Intel CSA) to reduce power usage and to speed up some algorithms.
        Besides that, FPGAs are being coupled with memory in servers for in-situ processing, no discrete board needed.

      • 2 years ago
        Anonymous

        I dunno what FPGAs will do considering the MiSTer can only really emulate a PSX/Saturn.

        • 2 years ago
          Anonymous

          FPGAs can be programmed to mimic any kind of accelerator circuit, which means the most common ones like encryption and raytracing will remain on our CPUs and GPUs, while more niche ones can go on the FPGA. Horrendously inefficient code can be AI-optimised into a weird circuit, and FPGAs can be used to implement them immediately. For example:
          "Hey Siri, find all my photos of clowns" uses the AI accelerator on your GPU to process natural language, and an FPGA circuit to extremely rapidly classify and pick out the clowns in your images. FPGAs are CUDA on steroids.

  3. 2 years ago
    Anonymous

    they’re going to encode software functionality into the hardware directly completely undoing the philosophical progress we’ve made towards general computing. rather than abandoning the transistor model, directly taking heed of their own propagandisms “computer science isn’t about computers”, they’ll continue to stack SIMD processors on SIMD processors until every liter of oil and every kg of coal in NA is being used to compute dot products.

    • 2 years ago
      Anonymous

      this... is so true

  4. 2 years ago
    Anonymous

    biochips

    • 2 years ago
      Anonymous

      How did they cut it open without accidentally cutting into the brain? And with no iron filings from the cutting, either.

  5. 2 years ago
    Anonymous

    Maybe MESO?

  6. 2 years ago
    Anonymous

    no more electron apps

    • 2 years ago
      Anonymous

      sorry. Baseddevs are everywhere and they are too addicted.

  7. 2 years ago
    Anonymous

    Huge-scale cloud computing farms and 5G will remove the need for smaller, more powerful silicon

    • 2 years ago
      Anonymous

      Only correct answer in this thread. Everyone else is moronic

      • 2 years ago
        Anonymous

        There will always be applications for owned hardware

        • 2 years ago
          Anonymous

          Like what? I want to believe but I don't see it.

          • 2 years ago
            Anonymous

            Video games immediately come to mind
            Also on the enterprise scale, companies just aren't going to want to depend on a provider for everything. They just aren't. If that were the case we wouldn't still have salesmen, accountants, legal teams, etc. They'd all be contracted out

          • 2 years ago
            Anonymous

            >Video games immediately come to mind
            Idiots aren't going to care about the horrible latency.
            One thing that brings me hope though is how utterly unprofitable these services are. I believe Spotify cut the revenue of the music industry in half.

          • 2 years ago
            Anonymous

            >Idiots aren't going to care about the horrible latency
            Yes, this is actually why stadia managed to completely shut out Sony and Microsoft and why Google is the biggest name in gaming

          • 2 years ago
            Anonymous

            RDR2 on PS4 already has something like 350 milliseconds of input lag. Stadia failed because it was a shit service; the real threat is things like Game Pass.
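
            Back-of-the-envelope for why streaming stacks even more on top (every number below is a rough assumption, not a measurement):

              # rough cloud-streaming latency budget; all figures are assumptions
              local_pipeline_ms = 100  # engine + display latency you'd eat on local hardware anyway
              network_rtt_ms    = 30   # player <-> datacenter round trip
              encode_ms         = 8    # video encode on the server
              decode_ms         = 8    # video decode on the client
              jitter_buffer_ms  = 20   # buffering to smooth out network hiccups

              streamed_ms = local_pipeline_ms + network_rtt_ms + encode_ms + decode_ms + jitter_buffer_ms
              print(f"local: {local_pipeline_ms} ms, streamed: {streamed_ms} ms")  # ~100 ms vs ~166 ms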

          • 2 years ago
            Anonymous

            >companies just aren't going to want to depend on a provider for everything

          • 2 years ago
            Anonymous

            >he says as every company switches over to cloud computing
            It happened at my megacorporation job. It'll happen to yours.

          • 2 years ago
            Anonymous

            What does your company do?

          • 2 years ago
            Anonymous

            Retail. All computer units in over 5000 stores nationwide were forced to switch to cloud point-of-sale units overnight. Corporate now micromanages every store's time sheets and everything, since it's all tracked by them now. Formerly it was all up to the discretion of an individual franchise.
            Also they fricking suck, but that goes without saying. We went like 2 weeks with no ability to print anything at all.

          • 2 years ago
            Anonymous

            That must have been a hell of an on-call week.

    • 2 years ago
      Anonymous

      >input latency and bitstarved video streams are the future
      i hate the antichrist

  8. 2 years ago
    Anonymous

    your mom

  9. 2 years ago
    Anonymous

    molecular processors
    optical/photonic processors

  10. 2 years ago
    Anonymous

    don't worry, they'll find a way to make it faster while the code runs less efficiently, like it always has since the 2000s

  11. 2 years ago
    Anonymous

    silicon 2
    or maybe silicon B

  12. 2 years ago
    Anonymous

    Silicone

  13. 2 years ago
    Anonymous

    Step 1: Instead of using electricity between components of a CPU, use light. Use light switching rather than transistors.
    Step 2: Remove the solids in the CPU, since light travels about 1.5x slower in a physical medium than in a vacuum. This will also avoid the excess heat.
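
    For the vacuum part, the gain is just v = c/n; a quick check with made-up chip dimensions (the 30 mm path length is an assumption):

      # v = c / n: light in a solid medium (n ~ 1.5) is about 1.5x slower than in vacuum (n = 1)
      c = 299_792_458   # m/s, speed of light in vacuum
      n_solid = 1.5     # typical refractive index of a solid optical medium
      path_m = 0.03     # assume a 30 mm optical path across the chip

      t_vacuum = path_m / c
      t_solid  = path_m / (c / n_solid)
      print(f"vacuum: {t_vacuum * 1e12:.0f} ps, solid: {t_solid * 1e12:.0f} ps")  # ~100 ps vs ~150 ps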

  14. 2 years ago
    Gon

    [...]

    [...]

    Rocks.

  15. 2 years ago
    Anonymous

    a frickton of optimisation in both hardware and software
    and bloat, a lot of bloat
    hopefully the need for a good backend for all the bloat will lead to the rise of open source projects, but I wouldn't count on that
