the human eye can't even see past 24fps
based
>cinematic
>game
Pick one.
>I'm a moronic coping troglodyte who cannot adapt to new things and everything has to be slow for me to comprehend
>The post
I fricking hate boomers.
>Order 1886 dev
Fake news.
The Hobbit being 48 fps was cool, but it wasn't the main focus of the movie. A lot of the animations were jarring due to not being proper 48 fps, but something about the camera FOV felt off too, too narrow maybe. I wish all movies were in 240fps
Just don't play at 4k. 1080p is fine.
4k will give you better results than this, you brainlet
Frick off idiot
>manbaby uses neural net hardware to play games
there's a reason movies and TV are mostly in 24FPS moron
That's a brainlet moronic normie take. It comes from people with the delusion that since their TV in their living room shows pseudo-60FPS or real 60FPS then "obviously" that's not immersive.
Their toddler brains don't understand that if their movies had also been high FPS throughout their childhood, they would identify high FPS with "immersive" too.
Even interpolation is better (and real high FPS is better still); if interpolation produces artifacts, it's because the original low-FPS source already had abrupt, obnoxious cuts anyway.
and something else, ironically: high FPS is PRECISELY what is more immersive. that's because "high fps" is what real life actually does.
so forget your moronic purely-conditioned attachment to low FPS from your childhood and get used to high FPS.
it doesn't take long anyway; within 1 or 2 movies you're easily used to it; some GPUs do it for almost 'free' anyway.
I'm worried that we'll never get truly latency/blur/motion artifact free vidya (~4000FPS) because of people like that.
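Not a real motion engine, just a rough numpy sketch of what "generate an in-between frame" means at its crudest (a straight 50/50 blend; actual TVs do motion-compensated interpolation, and the frame shapes/values here are made up). It shows where the artifacts come from: when the two source frames straddle a hard cut, the synthesized frame belongs to neither shot.

```python
# Toy illustration of frame interpolation at its crudest: a 50/50 blend
# between two consecutive frames. Real TV motion engines are motion-
# compensated, but the failure mode is the same: across a hard cut there
# is no true in-between frame, so anything synthesized is an artifact.
import numpy as np

def blend_midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Return a synthetic frame halfway between frame_a and frame_b."""
    mid = 0.5 * frame_a.astype(np.float32) + 0.5 * frame_b.astype(np.float32)
    return mid.astype(np.uint8)

if __name__ == "__main__":
    a = np.full((720, 1280, 3), 30, dtype=np.uint8)   # dark frame (end of shot 1)
    b = np.full((720, 1280, 3), 220, dtype=np.uint8)  # bright frame (start of shot 2)
    mid = blend_midframe(a, b)
    print(mid[0, 0])  # [125 125 125] -> a ghost frame that belongs to neither shot
```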
Yes, it was to save money on film stock back in the 1920s.
>video games
this
get a real adult hobby like killing prostitutes
>killing them
>not enslaving them and forcing them to mow your lawn
>Allowing a random prostitute to steal a homeowner's greatest joy
>mowing your lawn and not letting the wild flowers grow to help replenish the population of wild bees
town ordinances and HOAs can get fricked. you want me to trim my lawn you better come out with a fricking yardstick.
111 fps at 1% low is fine, as long as there aren't jitters with bad frame times. You made a compelling argument for people to not upgrade for longer.
>111 fps
Shit dude I'm happy if I hit stable 30fps on my 4670 ... at 720p .... on low
might finally upgrade this year to something semi decent like a used rx 470 or 1060
Try and get a cheap second-hand rx 6600 (xt), it won't even cost that much more (depends on your market) and it's much more efficient with more performance
>111 fps at 1% low is fine, as long as there aren't jitters with bad frame times.
>lows 50% of the average
you know it's a fricking mess of frametime pacing.
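For anyone unsure what these numbers actually are: a rough sketch of how average FPS and 1% lows are typically pulled out of a frame-time log. The exact method varies between benchmark tools; this is just one common convention, and the sample numbers are invented to match the figures in the thread.

```python
# Rough sketch: compute average FPS and "1% low" FPS from a list of
# frame times in milliseconds. Not any specific tool's exact algorithm.
def fps_stats(frame_times_ms: list[float]) -> tuple[float, float]:
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = sorted(frame_times_ms, reverse=True)        # slowest frames first
    one_percent = worst[: max(1, len(worst) // 100)]    # slowest 1% of frames
    low_fps = 1000.0 / (sum(one_percent) / len(one_percent))
    return avg_fps, low_fps

# Mostly-smooth run with occasional 9 ms hitches: average ~225 fps but
# 1% lows ~111 fps, i.e. lows at roughly half the average, which is the
# "mess of frametime pacing" being complained about here.
times = [4.4] * 990 + [9.0] * 10
print(fps_stats(times))  # roughly (224.9, 111.1)
```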
>3060 ti
>ryzen 5 2600
do i really need to upgrade in the cpu department? i would go for 13600k when it releases but i wanna sit on my current cpu for as long as possible. would it be okay to use it until like 2025?
No, upgrade right now, both cpu and gpu.
i literally just bought the 3060 ti. stop trolling me.
Why switch to craptor lake instead of getting a drop-in upgrade like a 5700x?
because my mobo is a b450 prime plus and it was budget in 2019 too, i don't know why i shouldn't just buy a better mobo that will support ddr5 as well in the future
GPU is still good
get a 5700X or a 5800X3D
5700x is worse than 12600k or 13600k plus idk if i wanna buy a*d stuff anymore
>would it be okay to use it until like 2025?
what do you think is going to happen in 2025
do you plan on smashing your CPU or something
no but will it play games and stuff until then
people are playing games on 2600k
you'll be fine
1600 here, get yourself a 5800x and be done with it.
games are for children
Anyone playing games after becoming 16 years old should just be killed off.
shitting on kids in FPS games as an adult is one of the best parts of PC gaming
>modern AAA games
>good
lol
lmao
Actual good games don't need powerful hardware
oh no! how will my 60hz monitor cope?!
>8 year gap in technology
>only 2x framerate increase in a dead, hack filled, chingchong appeasing, f2p game
CPU improvements are over aren't they?
now try it with the ~~*fixes*~~ for fake exploits turned off
>the
got a 7w 4c pentium laptop for shitposting and it was an awesome machine for what it was until the "fixes". now i'm hitting 90%+ just browsing, while before i'd almost never go beyond 20%
mitigations=off
Thank me later.
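mitigations=off is a Linux kernel command-line parameter; if you want to see what it actually changed, the kernel exposes the current mitigation state under sysfs. A minimal sketch (Linux only, standard sysfs path):

```python
# Quick check (Linux only) of which CPU vulnerability mitigations the
# running kernel applies. After booting with mitigations=off, most of
# these entries report "Vulnerable" instead of "Mitigation: ...".
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")

for entry in sorted(vuln_dir.iterdir()):
    print(f"{entry.name:25s} {entry.read_text().strip()}")
```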
>still using BIOS pre-fixes
>meltdown and spectre fix disabled in windows using Inspectre software
never measured the performance, i just did it for the feels
Imagine playing games in 2022
How old is PUBG? Why test with that... Hasn't anything new come out? That shit was already pretty old last time I bought a GPU (1070, evga)
I'm starting to get scared that I'm gonna build a new PC and find there's nothing new to play. Hardware accel in Photoshop can only justify so much
I have a 1080ti paired with my 4790. At 1440p there's almost no bottleneck.
you won't notice a difference if you are at 144hz, btw.
the 4770 is still doing fine here.
>moron thinking average is more important than 1%
newer processor is better for less jumpy fps, of course, but it's still usable.
You'll have to pry my 3770K out of my cold, dead hands. Delidded it, replaced the IHS-Die TIM with liquid metal, put my gigantor fricking Thermalright Silver Arrow (the first one) back on there and run an all-core turbo of 4.4 GHz. I don't game that often anymore and I'd already maxed out Z77 with 32 GB RAM back in ~2014 for video editing. Nothing I do on there even comes close to making me want to upgrade.
yet a 4770 would stomp it at stock in AVX2
>yet a 4770 would stomp it at stock in AVX2
Considering Ivy Bridge didn't have AVX2 support at all?
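Easy enough to check what your own CPU advertises; a quick sketch reading /proc/cpuinfo (Linux only, other OSes need a different method):

```python
# Crude check (Linux only) of whether the CPU advertises AVX2.
# Ivy Bridge (3770K) reports only "avx"; Haswell (4770) and later
# also report "avx2" in the flags line of /proc/cpuinfo.
def cpu_flags() -> set[str]:
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
print("AVX: ", "avx" in flags)
print("AVX2:", "avx2" in flags)
```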
All those years/price hikes and there's only a 37% increase? lol
It’s actually shocking how little CPU improvement we’ve had over the last 10 years. It’s intel’s fault, right?
Not even in a thousand years should we use PUBG as a benchmark. Not ever.
people who don't upgrade their hardware don't care about framerates because they don't care about their computer