We've reached the point of diminishing returns when it comes to resolution. It's already hard to see any difference between 2k and 4k, and going from 4k to 8k will probably be indistinguishable in most cases. Same with frame rates. I know a lot of pro alpha gamers will disagree, but going above 120fps has no effect in 99% of games.
So what's next then? Some new technology like real raytracing? Or just keep pushing resolution and frame rates for the VR market?
Soldered GPUs
It's been over 10 years since I last thought "man, I wish GPUs were more powerful so we could have awesome graphics!" The fact is that graphics today are absolute dogshit for the most part, and it's not because of weak GPUs. It's because of lazy publishers who push out crap and don't give a rat's ass, because drones will buy it anyway.
This.
Keep pushing resolution and frame rates for the VR market. I'm not getting a VR headset until we have 4-8k per eye at 120fps.
I have a PS4, and NFS: Heat looks almost as good as the Unreal Engine 5 Matrix demo for PS5. It's all about adding the right textures to the game; GPU technology hit a wall 10 years ago.
Will we get better rasterization performance?
Particles
Naturally
>We've reached the point of diminishing returns when it comes to resolution. It's already hard to see any difference between 2k and 4k,
>posts picture showing how viable polygon count has increased as gpus have improved
tech illiterate spotted
>>>>>>>Wrong board
I want real time water caustics, glass deformation, ocean crest simulations, volumetric smoke affected by the wind, for each tiny rock and pine needle on the ground to have a defined shape, shadow and physics, lush forests with gorgeous subsurface scattering, detailed tree bark with procedural destruction, deformable mud, and enough cores to power the AIs of thousands of little animals.
yeah the main one would be AI I think
Like it should have like a million AI neural cores to power each AI or something
Most games won't need more than a million AIs I think
No but seriously AI is a huge issue.
I saw a zombie Unreal tutorial and the most it could handle was 50 zombies at a time.
NVidia might end up having to add another card to the market for the sole purpose of AI calculations kek
Or at least make gaming GPUs come with on-board dedicated AI cores, with all the fancy heuristics, etc.
Open world is also an issue, because developers have to rely on tricks to pull it off.
The computer is not capable of rendering and keeping track of a huge map in real time.
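For what it's worth, the "tricks" in question are mostly chunk streaming: keep only the map cells around the player resident and evict the rest, so memory stays bounded no matter how big the world is. A minimal sketch (all names and numbers here are made up for illustration, not from any real engine):

```python
# Minimal sketch of grid-based world streaming: only chunks within a
# radius of the player are kept loaded; everything else is evicted.

CHUNK_SIZE = 64      # world units per chunk (illustrative)
LOAD_RADIUS = 2      # chunks kept resident in each direction

def chunk_of(x, z):
    """Map a world position to integer chunk coordinates."""
    return (int(x // CHUNK_SIZE), int(z // CHUNK_SIZE))

def wanted_chunks(player_x, player_z):
    """Set of chunk coords that should be loaded for this position."""
    cx, cz = chunk_of(player_x, player_z)
    return {(cx + dx, cz + dz)
            for dx in range(-LOAD_RADIUS, LOAD_RADIUS + 1)
            for dz in range(-LOAD_RADIUS, LOAD_RADIUS + 1)}

def update_streaming(loaded, player_x, player_z):
    """Return (to_load, to_unload) given the currently loaded set."""
    wanted = wanted_chunks(player_x, player_z)
    return wanted - loaded, loaded - wanted
```

Each frame the engine applies the returned deltas, so the resident set stays a constant 5x5 window regardless of total map size.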
Eh, I think that can stay the way it is for the time being
You mean like the Physics processing unit?
Here is the video
?t=965
It's actually 100 zombies
Still, that limitation bothers me more than the graphics. The graphics are good enough. But I mean, it's not like the AI used on the zombies in that tutorial was even all that sophisticated.
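On the crowd point: a common trick for hundreds of agents is a flow field, where one breadth-first search runs from the target and every zombie just walks downhill on the resulting distance map, so per-frame pathfinding cost doesn't grow with the agent count. A rough sketch, assuming a simple walkable/wall grid:

```python
# Sketch of flow-field pathfinding for large crowds: ONE BFS from the
# target, then every agent follows the distance gradient. Per-frame
# cost is one grid traversal, independent of the number of agents.
from collections import deque

def flow_field(grid, target):
    """grid: 2D list, 0 = walkable, 1 = wall. Returns a distance map."""
    h, w = len(grid), len(grid[0])
    dist = [[None] * w for _ in range(h)]
    ty, tx = target
    dist[ty][tx] = 0
    q = deque([target])
    while q:
        y, x = q.popleft()
        for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
            if (0 <= ny < h and 0 <= nx < w
                    and grid[ny][nx] == 0 and dist[ny][nx] is None):
                dist[ny][nx] = dist[y][x] + 1
                q.append((ny, nx))
    return dist

def step_toward_target(dist, y, x):
    """One agent step: move to the neighbour with the smallest distance.
    Assumes the agent stands on a reachable, walkable cell."""
    best = (y, x)
    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
        if (0 <= ny < len(dist) and 0 <= nx < len(dist[0])
                and dist[ny][nx] is not None
                and dist[ny][nx] < dist[best[0]][best[1]]):
            best = (ny, nx)
    return best
```

With this, stepping 100 or 1000 zombies is just 100 or 1000 cheap table lookups per frame; the expensive part is amortized across the whole horde.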
The only one you couldn't have today is
>for each tiny rock and pine needle on the ground to have a defined shape, shadow and physics
It's all a question of priorities.
Why did they slope her face more in the last version? To make her look more chinese?
To give her more of a "hero chin" without actually making the chin bigger. It also makes her look a little more mixed-race, but I don't think there was ever a board meeting that said "make her look more Chinese".
It's just whoever the artist was trying to make her look a little more like a poster girl for the specific setting and theme of that specific game. Remember, it was the Tomb Raider where they went "frick it dude, let's make it super grim, like have Lara get impaled on a spike through her throat on camera, let's put in a rape scene, yeah bro this is gonna be so dark and gritty", so it's pretty obvious they'd change her design up a little to match.
I think the 2013 game based her face on a real model
OG lara croft has a worse hairline than me lol
Raytracing is big because it enables developers to get the photorealism and effects they want (and can already mostly accomplish) without requiring so many compromises on dynamism. It's not difficult to make a game look "good" on mid-range hardware if barely anything in the scene actually moves or changes and you can bake the expensive calculations ahead of time. Developers have proved that if they can't have both, they will pick graphics nearly every time, so a technology that promises both means that games will hopefully become more interactive again.
That's why Frostbite's lighting and destruction tech was so impressive when it came out. It enabled the visuals AND the interactivity, because it didn't require as much precomputation.
Lara Croft will become troony
Apparently dick swinging will be the next big thing in gpus
I think we'll have YouTube videos with stable frame rates by 2030
>It's already hard to see any difference between 2k and 4k, going from 4k to 8k will probably be indistinguishable in most cases.
These resolution arguments with no distance/size taken into account are mouth-breathing brainlet tier. Why do morons keep making them? 4k is visibly better than 1440p on a 32-inch panel at normal viewing distance (100-150cm).
>Same with frame rates. I know a lot of pro alpha gamers will disagree, but going above 120fps has no effect for 99% of games.
Another basement moron tier argument. FPS arguments with no monitor lag/response time taken into account are useless. A strobed 240hz OLED or a theoretical 240hz CRT would certainly benefit from >120fps.
>So what's next then? Some new technology like real raytracing? Or just keep pushing resolution and frame rates for the VR market?
HDR has arguably made a bigger difference than most graphics advancements in the last couple of years. Also, I'd rather have good games than muh eye candy with loot box cancer.
>These resolution arguments with no distance/size taken into account are mouth breathing brainlet tier
Can you not read, moron? That's why I said MOST CASES and brought up VR. Everyone knows the closer your eyeballs are to a screen, the more you're going to see the pixels. You aren't smart for figuring out what 5 year olds instinctively know. Only mouth-breathing tards need it autistically spelled out for them in every conversation. For a 32" panel at 100-150cm, the difference between 2k and 4k is negligible and 4k to 8k is indistinguishable. When strapping a screen <5mm from your eyeballs (like in VR), obviously 4k to 8k will be more noticeable. Do you need every viewing distance/monitor size combo listed, or are you smart enough to figure out the normal use cases for monitors?
>FPS arguments with no monitor lag/response time taken into account are useless.
Lag is mostly a peripherals issue, we're talking about graphics cards here. Try to keep up dipshit.
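Insults aside, both sides of this are really arguing about pixels per degree (PPD). A quick estimate for the 32-inch / 100-150 cm case both posts cite, assuming a 16:9 panel and the common rule of thumb that 20/20 vision resolves roughly 60 PPD (numbers are ballpark, not a vision-science claim):

```python
# Pixels per degree for a 16:9 panel, given diagonal size (inches),
# horizontal resolution, and viewing distance (cm). Rule of thumb:
# detail much beyond ~60 PPD is hard for 20/20 vision to resolve.
import math

def ppd(diagonal_in, horizontal_px, distance_cm):
    width_cm = diagonal_in * 2.54 * 16 / math.hypot(16, 9)  # panel width
    pixel_pitch = width_cm / horizontal_px                  # cm per pixel
    one_degree = 2 * distance_cm * math.tan(math.radians(0.5))
    return one_degree / pixel_pitch

# 32" panel at 120 cm: 1440p gives ~76 PPD, 4K gives ~113 PPD.
# Both already sit past the ~60 PPD rule of thumb, which is exactly
# why the 1440p-vs-4K difference is arguable rather than obvious.
```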
>The higher resolution and framerate, the better.
If you can't tell a difference (and they've done tests and shown people can't), then it's literally a marketing gimmick. Congrats, you just outed yourself as a brainless coomsoomer.
>Can you not read moron? That's why I said MOST CASES and brought up VR.
Most cases for 4k screens have been pretty viable up until now, with the notable exception being phones.
>Everyone knows the closer your eye balls are to a screen the more you're going to see the pixels. You aren't smart for figuring out what 5 year olds instinctively know.
If it's so simple why don't you include PPI/density? The equation is incomplete without all factors.
>Only mouth breathing tards need it autistically spelled out for them in every conversation.
Only mouth breathing morons make moronic generalizations.
>For a 32" panel at 100-150cm the difference between 2k and 4k is negligible and 4k to 8k is indistinguishable.
Get your eyes checked gramps.
>Do you need every viewing distance/monitor size combo listed or are you smart enough to figure out the normal use cases for monitors.
Define "normal"
>Lag is mostly a peripherals issue, we're talking about graphics cards here. Try to keep up dipshit.
Nice dodge there sweaty. Why do you ignore response times?
>If you can't tell a difference (and they've done test and shown people can't) then its literally a marketing gimmick.
Source: my ass
>the NPC cattle can't see the difference so there is no difference
idiot
>He fell for the 4k meme
Probably some proprietary Nvidia bullshit.
At least Nvidia is the only one actually doing shit. AMD's crap is a lame marketing gimmick that doesn't work and only exists so they can technically say they have a competing product, and Intel can't even get their cards out, even though they promised their software-side tech works on any GPU.
The only good open thing we have is FreeSync, and even that is implemented in the VESA VRR standard now.
The higher resolution and framerate, the better. You just think it's diminishing returns since you can't compare properly.
Human vision has finite temporal and spatial resolution, so diminishing returns are a point of fact. The marginal utility of higher resolutions and framerates must decrease.
Hopefully just efficiency of cards. Graphics, for me, haven't done anything significant enough to warrant the constant masturbation over detail. If they can make a 3080ti with the same power draw as a 3060, then I'd be impressed.
>We've reach the point of diminishing returns
Eh. The quality difference between those last two is bigger than between the first two if you ask me.
I'm not a pro alpha gamer and I noticed the difference between 120 to 144. Didn't even do them side by side either, so it was just "wow this is smoother"
Why is #3 so angry?
I believe that by 2035 we will see a renderer where the primary "path tracing" is done entirely by an ML model with some minor input from an actual path tracer.
It will produce results better than current offline path tracers and will run in real time.
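The reason an ML-heavy renderer is plausible: Monte Carlo path tracing error only shrinks as 1/sqrt(N) with samples per pixel, so real-time budgets of a few samples are hopelessly noisy and learned reconstruction has to fill the gap. A toy demonstration of the scaling, with a uniform random number standing in for one traced path:

```python
# Monte Carlo error scaling: averaging N noisy samples gives an error
# proportional to 1/sqrt(N). Here a uniform random variable (true mean
# 0.5) stands in for "trace one random light path and get radiance".
import random

def mc_estimate(n_samples, rng):
    # average of n noisy "path" samples for one pixel
    return sum(rng.random() for _ in range(n_samples)) / n_samples

rng = random.Random(0)
# 100x more samples should shrink the average error about 10x.
err_4spp   = [abs(mc_estimate(4, rng) - 0.5)   for _ in range(1000)]
err_400spp = [abs(mc_estimate(400, rng) - 0.5) for _ in range(1000)]
avg_err_lo = sum(err_4spp) / len(err_4spp)      # noisy real-time budget
avg_err_hi = sum(err_400spp) / len(err_400spp)  # offline-ish budget
```

That 1/sqrt(N) wall is why today's real-time ray tracing already leans on denoisers, and why pushing even more of the reconstruction into a learned model is a natural next step.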
The only current game with good graphics is that UE5 Matrix city demo, and even then the NPCs don't look real; they look out of place and weird. There's also apparently no foliage in the demo, so perhaps all that technology only works for static meshes. Video game graphics today are honestly pretty bad.
>Or just keep pushing resolution and frame rates for the VR market?
Yes. We're not even close to diminishing returns. Even a step from like 8K to 16K would be a big improvement in VR.
Bros I don't want to use a tv for a computer
Nothing, really. At this moment the bottleneck is not the graphics cards but the cost of asset production, so unless 3D tools get better we won't see much improvement in computer graphics any time soon. The time it takes to do things like UV unwrapping or weight painting is directly related to the polycount, and it gets more complicated the more detailed the 3D model is.
>We've reach the point of diminishing returns when it comes to resolution
It's not just that, but there is this underlying myth that more resolution or effects make games better. They don't; even simplistic graphics from the 8 to 16 bit console era are still satisfying to watch, especially when you consider that the underlying games are inherently more fun and rewarding to play.
But what is really the selling point for something like a 3090 over a 1080 Ti? GayTracing??? So we need games to appear more "realistic"? Why? I don't need realistic games, I want FUN games that are well-distinguished from 'reality' yet in the realm of imagination could be a real world.
Nintendo's first-party games are a good example of this: they know how to select the right color palette and have solid art direction, such that the style of the game becomes instantly identifiable. When games try to be "realistic" they also become generic and homogenized. Look at World of Warcraft compared to the mountain of Asian clones of it... even Guild Wars 2. GW2 tries to have a "realistic" look with excessively detailed characters; WoW sticks to its original art style and, in more modern versions of the game, builds upon it without ever trying to make the characters appear "too real".
Dilate videotroony
Ray tracing is the biggest graphical upgrade since dynamic lighting was introduced, and it took the industry almost 10 years to fully adopt that, so it'll probably take the same amount of time this time. Meaning, fully featured, proper-looking ray tracing running at decent resolutions at 60 fps won't really be fully implemented until the next console generation.
I tried playing tomb raider
It's pretty much an eye candy game; the gameplay is kind of meh.
The remakes are pretty meh. The first one was OK; the second one was actually pretty good with its survival mechanics and actually having an open world as opposed to a linear hub world. The most recent one was fricking awful. It's such an easy game that there is literally a setting that lets you turn off having Lara speak the answers to a puzzle out loud, and even with it off she still audibly says shit like "maybe that lever will open this gate somehow". It's literally "go through door A to win the game. If you go through door B you get a cookie because you thought outside of the box! There are no other doors." And Lara, at every single set of doors A and B, will say "I think there is a cookie behind door B".
The old Womb Raider games are kino though: not overly deep, but certainly a ton of fun. That's sort of what a good game should be anyway. Look at PoE: that shit is as deep as you could imagine, but is it fun? Not really.
Trying to make it too real was a mistake, can we have our distinct computer graphics back?
I think the next big thing is going to be things like cloth/foliage/hair physics.
Especially with hair; it's still not at the point where you can get realistic hair moving naturally.
>like cloth/foliage/hair physics.
>what is nvidia hairworks
Cloth physics in particular are pretty damn good in most modern games, I find. Hair not so much, but it's certainly a lot better than you think.
Foliage is the one thing I agree with. I'd love to see a method of rendering massive amounts of long grass without using those flat boxes that fold over when you walk on them, something that can actually simulate grass in full without a big performance hit, like ray tracing but for grass. Imagine a grass-rendering core built onto the GPU like the RTX cores, "touch grass edition".
Fully raytraced games.
To get full raytracing with the kind of fidelity we have now using rasterization cheats, and have it run at an acceptable framerate, you're looking at a decade easily.
IMO ray tracing is a waste of resources when Lumen now exists and there is even more optimization to be made.
Raytracing is the next logical step forward, when the processing power exists to do it.
>We've reach the point of diminishing returns when it comes to resolution. It's already hard to see any difference between 2k and 4k, going from 4k to 8k will probably be indistinguishable in most cases
Wrong
The point of maximum resolution is where pixels are so small we don't need to do anti-aliasing anymore, and 4k isn't that point
8k might be
The next step is realtime raytracing
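A toy illustration of that anti-aliasing point: with one sample per pixel, a pixel crossed by an edge snaps to fully covered or fully empty, which is exactly the stair-stepping AA exists to hide; more samples per pixel (or, equivalently, smaller pixels) approximate the true partial coverage. The numbers below are made up for illustration:

```python
# One pixel crossed by a vertical edge that truly covers 37% of it.
# Sampling on an n x n grid inside the pixel approximates coverage;
# n = 1 can only ever answer "all" or "nothing" (aliasing).
def coverage(samples_per_axis, edge_x=0.37):
    n = samples_per_axis
    hits = sum(1 for i in range(n) for j in range(n)
               if (i + 0.5) / n < edge_x)   # sample lies left of the edge
    return hits / (n * n)

# coverage(1) can only return 0.0 or 1.0; coverage(8) returns 0.375,
# close to the true 0.37, which shades the edge pixel smoothly.
```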
>Some new technology like real raytracing?
RTX is that, it just needs to become faster. which it won't, at least by much, but at least the algorithms are catching up.
>It's already hard to see any difference between 2k and 4k
lol wtf. If you can't tell the difference you may need real medical attention anon. I'm dead serious. I wouldn't play a game at 2k if you paid me
This is the kind of opinion you have when your personality boils down to the internet
>120hz has no effect on 99% of games
I disagree.
Lighting / raytracing is the future.
realtime cooming
How many troons here can pass tbh
Tree leaves are still infinitely flat. We haven't even reached the point where games can have actual 3D leaves; in fact most games still only put a few leaves or blades of grass in, making every environment look like winter or the desert.
Hair still has a long way to go.
So far there are very few well-done female characters with long hair that isn't a ponytail.
Drivers that work. I am sick of not being able to run PyTorch on my AMD card, sick of statically linked drivers, sick of OpenCL not working, sick of Vulkan randomly failing.
Particles and voxels