To play kino like Octopath Traveler 2? Nah, I'm good.
Yes and no
>ROG homie
what do you want to tell us?
>195x127
Using a GTX 1080 today is the equivalent of using a GTX 280 in 2016.
That's 1 (one) gig of VRAM. I don't think you could even play RuneScape on that thing.
That's what's funny to me: GPU tech has kept climbing. The RTX 4090 has around 76 billion transistors compared to the 12 billion in the GTX 1080 Ti. And yet the 1080 Ti is still perfectly adequate for virtually any game here 8 years later.
What's even stranger is that we're now several years into a new console generation, one that was a giant leap in performance, and yet games still rarely stress even an ancient GPU.
>the 1080 Ti is still perfectly adequate for virtually any game here 8 years later.
that's just a lie
it chokes and buckles at 4k120 on ultra. don't even start lmao
Play a NES game at 4K, it means nothing.
4k is a meme. I've yet to be convinced to play on anything other than 1366x768 on my 1080p monitor
The ultimate meme is 4K on consoles: on a 55-inch TV, for example, you'd need to be sitting at most about 3 feet away for your eye to actually resolve 4K; any farther and the perceived detail starts dipping toward 1080p.
In other words, pretty much nobody playing on a console is actually seeing the 4K their games output.
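For the curious, that 3-foot figure falls out of the standard ~1 arcminute-per-pixel rule of thumb for 20/20 vision. A minimal sketch, assuming that acuity figure and a 16:9 panel (real eyes vary):

```python
import math

def resolvable_distance_ft(diag_in: float, h_px: int, acuity_arcmin: float = 1.0) -> float:
    """Farthest viewing distance (feet) at which an eye with the given
    acuity can still resolve individual pixels on a 16:9 panel."""
    width_in = diag_in * 16 / math.hypot(16, 9)   # horizontal size of the screen
    pixel_pitch_in = width_in / h_px              # size of one pixel
    theta = math.radians(acuity_arcmin / 60)      # one arcminute in radians
    return pixel_pitch_in / math.tan(theta) / 12  # inches -> feet

print(f'55" 4K:    sit within {resolvable_distance_ft(55, 3840):.1f} ft')  # ~3.6 ft
print(f'55" 1080p: sit within {resolvable_distance_ft(55, 1920):.1f} ft')  # ~7.2 ft
```

Past roughly 7 feet on a 55-inch panel, even 1080p is already at the eye's limit by this metric, so 4K buys nothing from the couch.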
this generation is the funniest one so far. supposedly the one where consoles would finally reach pc 4k60, but for a much lower cost.
games now run (UP TO) 4k with dynamic resolution scaling (the game runs 99% of the time at a lower resolution), at 30 fps with uneven frame pacing (devs nowadays can't even into that), OR you can run at 4k (upscaled from 480p) and no ray tracing to run at (UP TO) 60 fps (with constant drops into the 50s and stutters).
I'm not a manchild that takes sides in platform "wars" but anyone buying the latest console for any reason other than console exclusive games is a moron.
idk what year you think it is but both ps5 and xbox run 4k120
sure, for games from the previous generation or 2d/simplistic artstyle games. now find me a triple (or quadruple) A slop released in the last few months that doesn't fit my description.
I'll start with some counter-examples off the top of my head:
alan wake 2, dragon's dogma 2, the latest final fantasy VII whatever it's called, stellar blade (for this last one I take back the stuttering and frame pacing issues as it's reportedly well optimized, pretty rare these days)
get glasses soon habibi
I play all my games in 1600x1200 so my regular GTX 1080 plays everything out today at max settings
I'm assuming you're trolling, because you can declare virtually any card "chokes" at some arbitrary super-high setting.
The fact is, though, most people out there still game at 1080p 60Hz, and as long as a GPU can manage that adequately on high settings it's still very relevant even now.
I game on a LG c2 OLED
4k
120 hz
7900xtx
Let me guess, you need more?
I game on a samsung 1080p TN monitor
768p
60 hz
gtx 750ti
Let me guess, you must consoom and need more?
Developers fell for the 4K resolution and raytracing memes over visual fidelity.
4k isnt a meme. it's love. it's life
My 1080 (non TI) runs DMC 5 at 1440p ultra above 100 fps and that's the most demanding game I play
Maybe for AI I could use more VRAM
I'm not interested in most post 2020 triple-A games anyway
>Asus
Buy an ad
I must thank you for making me aware of the existence of this semen demon.
Now I associate Asus with this succubus and it'll only make me buy more Asus products
>250w 1080ti
>a 3050 that runs off the 75W slot with no additional power gets you 80% of its performance
yeah, it's obsolete.
I need at least 16GB VRAM.
for rendering creative work, yes I need more
I need less 🙂
It's such an obviously outdated card but I still have yet to find something it can't run comfortably on high settings at 1080p.
The most annoying thing is that it's hard to move on from it without a substantial upgrade. You'd need at least a 4070/Super, plus a higher-resolution monitor if you still own a 1080p one, but then you'd have a card that basically gives you 1080 Ti performance at 1440p and will be outdated once 4k becomes the norm (if it ever does).
4090 might really be the only worthy upgrade to the 1080 Ti.
>can run decent LLMs fast
>can generate SD images with upscaling all in vram
>can game at 1440p comfortably
>can transcode multiple video streams
>doesn't need pcie gen 1000 bandwidth
>doesn't need a second psu
Amazing card, it's cost me <$10/month and still does everything I need.
>4090 might really be the only worthy upgrade to the 1080 Ti
This is my dilemma. Factoring in inflation, the 1080 Ti would've cost about the same as a 4080 today but continues to be relevant, while the 4080 will be aged out in a couple of years. The 4090 is 50% more expensive, but you can bet your ass it won't stay relevant 50% longer than the 1080 Ti did. If it were 4080-priced it would be a worthy upgrade.
The problem is real inflation is much higher than the government admits to.
It's honestly like 40-50% since 2020.
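The launch MSRPs are easy to check ($699 for the 1080 Ti in 2017, $1199 for the 4080); the cumulative inflation figure is the assumption doing all the work, so here's a sketch with both the official-ish number and the higher one claimed above:

```python
# Quick check of the "1080 Ti would cost 4080 money today" claim.
# The MSRPs are real launch prices; both inflation rates are assumptions.
LAUNCH_1080TI = 699  # USD, March 2017
MSRP_4080 = 1199     # USD, launch

for label, rate in [("official-ish CPI, 2017->now", 0.28),
                    ("the 40-50% figure claimed above", 0.45)]:
    adjusted = LAUNCH_1080TI * (1 + rate)
    print(f"{label}: $699 then ~= ${adjusted:.0f} now (4080 MSRP: ${MSRP_4080})")
# official-ish: ~$895, short of 4080 money
# 45%:          ~$1014, much closer to the claim
```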
I think the 3090 is gonna be the immortal card. AI dialog, voice, frame and story generation is just gonna get better and better. We're gonna see a resurgence of sli builds with a dedicated AI card paired with a normal render card.
When I can buy 2 3090s for the price of a 4090 and cheaper than a 5090 there's no reason to get something newer until AMD gets their head out of their ass or Nvidia starts making dedicated AI cards like PhysX used to be.
>This is my dilemma
You and me both, I really want to get a 4090 but I know I'd have to buy a matching 4k monitor and it's going to tear my wallet's butthole in half.
If you play on 1440p I could see the 4090 being basically immortal but for 4k res it's a bit of a gamble.
Meant for this
>can run decent LLMs fast
>can generate SD images with upscaling all in vram
Is it really that good in those fields? I've been having a kneejerk impulse to upgrade from my trusty 1080ti to a 4070tiS or 4080S, but even at 1440p my 1080ti still runs everything I want it to just fine. Messing with AI shit has been calling to me, but if the 1080ti can still hang in that field too, then I might just stick with it for a few more years. I've already repasted it and put on an aftermarket cooler so it could easily end up lasting me 10+ years.
Depends on who you ask. I'm fine playing around with 8 or 13b models which run at 20t/s. It's not as good as the 70b models but still entertaining and it can write regex for me. You can still run big models with enough ram but you're looking at 2t/s which is just too slow for me.
SD is a little slower. Depending on the model and step count you're looking at about 5 seconds per image with no upscaling. With upscaling it's almost 1 minute per image but I run batches of 60 when I leave the house so it's not so bad.
11GB vram is much better than 8. 16 is a bit better than 11, 24 is much better than 16. It's not worth going from 11 to 16 but going up to 24 would be.
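Those tiers track what actually fits on the card. A back-of-envelope sketch, assuming 4-bit quantized weights and ~20% overhead for KV cache and activations (both assumptions; real usage depends on runtime and context length):

```python
def vram_gb(params_b: float, bits_per_weight: float = 4.0, overhead: float = 1.2) -> float:
    """Rough rule-of-thumb VRAM footprint for running an LLM:
    weights at the given quantization plus ~20% for KV cache and
    activations. Real usage varies with context length and runtime."""
    weight_bytes = params_b * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 2**30

for size in (8, 13, 70):
    print(f"{size:>2}B @ 4-bit: ~{vram_gb(size):.1f} GB")
# ~4.5 GB, ~7.3 GB, ~39 GB
```

By that math an 8-13B model fits in the 1080 Ti's 11GB, which matches the 20 t/s experience above, while 70B spills out of even a 24GB card and has to offload to system RAM, hence the 2 t/s.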
>Depends on who you ask. I'm fine playing around with 8 or 13b models which run at 20t/s. It's not as good as the 70b models but still entertaining and it can write regex for me. You can still run big models with enough ram but you're looking at 2t/s which is just too slow for me.
That sounds promising. Most of what you said is Greek to me since I'm only vaguely familiar with what can be done with LLMs, but it sounds like it shouldn't be impossible to set up a small bot to talk to about random cringe stuff.
>SD is a little slower. Depending on the model and step count you're looking at about 5 seconds per image with no upscaling. With upscaling it's almost 1 minute per image but I run batches of 60 when I leave the house so it's not so bad.
Five seconds per image is plenty, considering I used to generate them one at a time on NovelAI and it was good enough for me. I'm already a drawgay so it wouldn't need to do heavy lifting anyway.
>11GB vram is much better than 8. 16 is a bit better than 11, 24 is much better than 16. It's not worth going from 11 to 16 but going up to 24 would be.
Good to know. It sounds like I should just save my money unless I want to sell a kidney for a 4090 or roll the dice on a used 3090, as far as AI is concerned. Thanks for the informative post.
Yes, act 3 of Baldur's Gate 3 gave me 15 fps on my 1080, and Llama 2 didn't fit in the VRAM. I had to buy a used 3090 even though I didn't need one
yes.
>not my EVGA 3090 XC3 24GB
bin it. now that thing will last me til 2030.
Still rocking my EVGA 1080ti FTW3, would go for a 4090 if EVGA made them.
me too fren
I got their 3090, love me some 8-pin cables
I have a RX6600 and it's more than enough for my 1080p gaming.
>a RX
>if I just lower my settings the card is still le good!!!!!!!
unfortunately no amount of coping and lowering settings will give you more vram.
Will its DVI port become more valuable for vintage monitors or whatever?
Nah, the DVI on the GTX 980 is far more valuable since it also carries an analog VGA signal.
The 1080 is part of the first generation that dropped native analog monitor support.
The 6800 XT was the best GPU for the money. 16GB means longevity. Raytracing is too demanding unless you have a 4090, and I don't use AI slop anyway. The 6800 XT gives me 120+ FPS in raster at 3440x1440 HDR, which is the actual game changer. If I want upscaling I just use XeSS Quality, which is good enough. The GPU has a whole 128MB of cache, which also makes it future-proof and is more than a 7800 XT has
My GTX 980 is perfect for everything I need: games that are actually good, not AAAss slop
Can you emulate gamecube and ps2 games on a CRT with one? Neat if so
Definitely. With ease