>bro my 2500k is totally fine
>i'll just buy an entry level GPU
>it's totally fine, bro

  1. 2 years ago
    Anonymous

    the human eye can't even see past 24fps

    • 2 years ago
      Anonymous

      there's a reason movies and TV are mostly in 24FPS, moron

      based

      • 2 years ago
        Anonymous

        >cinematic
        >game
        Pick one.

      • 2 years ago
        Anonymous

        >I'm a moronic coping troglodyte who cannot adapt to new things and everything has to be slow for me to comprehend
        >The post
        I fricking hate boomers.

      • 2 years ago
        Anonymous

        >Order 1886 dev
        Fake news.

      • 2 years ago
        Anonymous

        The Hobbit being 48 fps was cool, but it wasn't the main focus of the movie. A lot of the animations were jarring due to not being proper 48 fps, and something about the camera FOV felt off too, too narrow maybe. I wish all movies were in 240fps

  2. 2 years ago
    Anonymous

    Just don't play at 4k. 1080p is fine.

    • 2 years ago
      Anonymous

      4k will give you better results than this, you brainlet

      • 2 years ago
        Anonymous

        Frick off idiot

  3. 2 years ago
    Anonymous

    >manbaby uses neural net hardware to play games

  4. 2 years ago
    Anonymous

    there's a reason movies and TV are mostly in 24FPS, moron

    • 2 years ago
      Anonymous

      the human eye can't even see past 24fps

      [...]
      based

      That's a brainlet moronic normie take. It comes from people with the delusion that, since the TV in their living room shows pseudo-60FPS or real 60FPS, that "obviously" isn't immersive.

      Their toddler brains don't understand that if their movies had also been high FPS all through their childhood, they would identify high FPS with "immersive" too.

      Even interpolation is better (and real high FPS better still); if interpolation produces artifacts, it's because the original low-FPS source already had abrupt, obnoxious cuts anyway (see the sketch after this thread).

      • 2 years ago
        Anonymous

        and something else, ironically: high FPS is PRECISELY what is more immersive. that's because "high fps" is what real life actually does.

        so forget your moronic purely-conditioned attachment to low FPS from your childhood and get used to high FPS.

        it doesn't take long anyway; within 1 or 2 movies you're easily used to it; some GPUs do it for almost 'free' anyway.

      • 2 years ago
        Anonymous

        and something else, ironically: high FPS is PRECISELY what is more immersive. [...]

        I'm worried that we'll never get truly latency/blur/motion artifact free vidya (~4000FPS) because of people like that.

    • 2 years ago
      Anonymous

      Yes, it was to save money on film stock in the 1920s.
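
A note on the interpolation argument in the thread above: the simplest possible interpolator just blends two adjacent frames, while real TV and GPU interpolators add motion estimation on top, which is where the artifacts come from. A minimal, purely illustrative Python sketch of the blend-only idea (not any particular product's algorithm):

    import numpy as np

    def blend(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
        """Return an in-between frame at time t in [0, 1] by linear blending."""
        mixed = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
        return mixed.astype(frame_a.dtype)

    a = np.zeros((4, 4), dtype=np.uint8)      # dark source frame
    b = np.full((4, 4), 200, dtype=np.uint8)  # bright source frame
    print(blend(a, b, 0.5)[0, 0])             # inserted halfway frame -> 100

Doubling a 24FPS source to 48FPS this way means generating one in-between frame for every pair of real ones; an abrupt cut in the source has no meaningful in-between, which is exactly the artifact case the post describes.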

  5. 2 years ago
    Anonymous

    >video games

    • 2 years ago
      Anonymous

      this
      get a real adult hobby like killing prostitutes

      • 2 years ago
        Anonymous

        >killing them
        >not enslaving them and forcing them to mow your lawn

        • 2 years ago
          Anonymous

          >Allowing a random prostitute to steal a homeowner's greatest joy

        • 2 years ago
          Anonymous

          >Allowing a random prostitute to steal a homeowner's greatest joy

          >mowing your lawn and not letting the wild flowers grow to help replenish the population of wild bees
          town ordinances and HOAs can get fricked. if you want me to trim my lawn, you better come out with a fricking yardstick.

  6. 2 years ago
    Anonymous

    111 fps at 1% low is fine, as long as there aren't jitters from bad frame times. You made a compelling argument for people to hold off upgrading even longer.

    • 2 years ago
      Anonymous

      >111 fps
      Shit dude, I'm happy if I hit a stable 30fps on my 4670... at 720p... on low
      might finally upgrade this year to something semi-decent like a used rx 470 or 1060

      • 2 years ago
        Anonymous

        Try to get a cheap second-hand rx 6600 (xt); it won't cost that much more (depends on your market) and is much more efficient with more performance

    • 2 years ago
      Anonymous

      >111 fps at 1% low is fine, as long as there aren't jitters with bad frame times.

      >lows 50% of the average

      you know it's a fricking mess of frametime pacing.
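
For readers unfamiliar with the metric being argued about: "1% low" summarizes the slowest frames in a capture, which is why lows at half the average point to frametime pacing problems even when the average looks high. A minimal sketch of one common definition (average FPS over the slowest 1% of frames; benchmarking tools differ, and some report the 99th-percentile frame time instead), with made-up numbers:

    def one_percent_low(frame_times_ms: list[float]) -> float:
        """Average FPS over the slowest 1% of captured frames."""
        worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
        n = max(1, len(worst) // 100)                 # the worst 1% of samples
        return 1000.0 / (sum(worst[:n]) / n)

    times = [9.0] * 98 + [25.0, 31.0]  # mostly ~111 fps, plus two spikes
    print(f"average: {1000 * len(times) / sum(times):.0f} fps")  # ~107 fps
    print(f"1% low:  {one_percent_low(times):.0f} fps")          # ~32 fps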

  7. 2 years ago
    Anonymous

    >3060 ti
    >ryzen 5 2600
    do i really need to upgrade in the cpu department? i would go for the 13600k when it releases but i wanna sit on my current cpu for as long as possible. would it be okay to use it until like 2025?

    • 2 years ago
      Anonymous

      No, upgrade right now, both cpu and gpu.

      • 2 years ago
        Anonymous

        i literally just bought the 3060 ti. stop trolling me.

    • 2 years ago
      Anonymous

      Why switch to craptor lake instead of getting a drop-in upgrade like a 5700x?

      • 2 years ago
        Anonymous

        because my mobo is a b450 prime, plus it was budget in 2019 too. i don't know why i shouldn't just buy a better mobo that will also support ddr5 in the future

    • 2 years ago
      Anonymous

      GPU is still good
      get a 5700X or a 5800X3D

      • 2 years ago
        Anonymous

        the 5700x is worse than the 12600k or 13600k, plus idk if i wanna buy a*d stuff anymore

    • 2 years ago
      Anonymous

      >would it be okay to use it until like 2025?
      what do you think is going to happen in 2025
      do you plan on smashing your CPU or something

      • 2 years ago
        Anonymous

        no, but i'm wondering if it'll play games and stuff until then

        • 2 years ago
          Anonymous

          people are playing games on 2600k
          you'll be fine

    • 2 years ago
      Anonymous

      1600 here, get yourself a 5800x and be done with it.

  8. 2 years ago
    Anonymous

    games are for shildren

  9. 2 years ago
    Anonymous

    Anyone playing games after becoming 16 years old should just be killed off.

    • 2 years ago
      Anonymous

      shitting on kids in FPS games as an adult is one of the best parts of PC gaming

  10. 2 years ago
    Anonymous

    >modern AAA games
    >good
    lol
    lmao

    Actual good games don't need powerful hardware

  11. 2 years ago
    Anonymous

    oh no! how will my 60hz monitor cope?!

  12. 2 years ago
    Anonymous

    >8 year gap in technology
    >only 2x framerate increase in a dead, hack filled, chingchong appeasing, f2p game

    CPU improvements are over, aren't they?

  13. 2 years ago
    Anonymous

    now try it with the ~~*fixes*~~ for fake exploits turned off

    • 2 years ago
      Anonymous

      >the
      got a 7w 4c pentium laptop for shitposting and it was an awesome machine for what it was until the "fixes". now i'm hitting 90%+ just browsing, while before i'd almost never go beyond 20%

      • 2 years ago
        Anonymous

        mitigations=off
        Thank me later.

    • 2 years ago
      Anonymous

      >still using BIOS pre-fixes
      >meltdown and spectre fix disabled in windows using Inspectre software

      never measured the performance, i just did it for the feels
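
Context for the thread above: mitigations=off is a real Linux kernel boot parameter that disables the Spectre/Meltdown-class mitigations (it goes on the kernel command line, typically via the bootloader config), and InSpectre is a Windows utility that toggles the equivalent fixes there. On Linux you can check what the kernel is currently applying before and after any change; a minimal sketch reading the kernel's sysfs interface:

    from pathlib import Path

    # Each file in this directory names a CPU vulnerability and describes
    # whether and how the running kernel mitigates it.
    vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")

    for entry in sorted(vuln_dir.iterdir()):
        print(f"{entry.name}: {entry.read_text().strip()}")

Whether the recovered performance is worth reopening those holes is exactly the trade-off these posts are gambling on.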

  14. 2 years ago
    Anonymous

    Imagine playing games in 2022

  15. 2 years ago
    Anonymous

    How old is PUBG? Why test with that... Hasn't anything new come out? That shit was already pretty old last time I bought a GPU (1070, evga)
    I'm starting to get scared that I'm gonna build a new PC and find there's nothing new to play. Hardware accel in Photoshop can only justify so much

  16. 2 years ago
    Anonymous

    I have a 1080ti paired with my 4790. At 1440p there's almost no bottleneck.

  17. 2 years ago
    Anonymous

    you won't notice a difference if you're at 144hz, btw.

    the 4770 is still doing fine here.

    • 2 years ago
      Anonymous

      >moron thinking average is more important than 1%

      • 2 years ago
        Anonymous

        a newer processor is better for less jumpy fps, of course, but the old one is still usable.

  18. 2 years ago
    Anonymous

    You'll have to pry my 3770K out of my cold, dead hands. Delidded it, replaced the IHS-to-die TIM with liquid metal, put my gigantor fricking Thermalright Silver Arrow (the first one) back on there, and run an all-core turbo of 4.4 GHz. I don't game that often anymore, and I'd already maxed out Z77 with 32 GB RAM back in ~2014 for video editing. Nothing I do on there even comes close to making me want to upgrade.

    • 2 years ago
      Anonymous

      yet a 4770 would stomp it at stock in AVX2

      • 2 years ago
        Anonymous

        >yet a 4770 would stomp it at stock in AVX2
        Considering Ivy Bridge didn't have AVX2 support at all?
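
The correction above is accurate: AVX2 arrived with Haswell (the 4770), while Ivy Bridge (the 3770K) tops out at AVX. A quick way to confirm what a given chip reports on Linux, sketched by reading /proc/cpuinfo:

    # Prints whether the running CPU advertises the avx and avx2 feature flags.
    with open("/proc/cpuinfo") as f:
        tokens = f.read().split()

    for feature in ("avx", "avx2"):
        print(f"{feature}: {'yes' if feature in tokens else 'no'}")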

  19. 2 years ago
    Anonymous

    All those years/price hikes and there's only a 37% increase? lol

  20. 2 years ago
    Anonymous

    It’s actually shocking how little CPU improvement we’ve had over the last 10 years. It’s intel’s fault, right?

  21. 2 years ago
    Anonymous

    Not even in a thousand years should we use PUBG as a benchmark. Not ever.

  22. 2 years ago
    Anonymous

    people who don't upgrade their hardware don't care about framerates because they don't care about their computer
