Right where the red arrow is is when they brought out the "Hur dur not a gaming card" GTX Titan line, which was really just a thinly veiled market test to see how many people were willing to pay 2x the price for a top-end card that wasn't much faster than the GTX-numbered flagship. The answer was a shit-tonne, and now we reap the rewards I guess
They've created a huge canyon between affordable cards and high end. This marks the line between 4K running well or not. AMD could easily disrupt this and create xx70 level cards with more VRAM.
2 years ago
Anonymous
The market adapted though, and clearly many many people are okay with buying $1000 GPUs. People are buying $700-$1500 phones. People buy $1500-$3000 laptops. A high end enthusiast GPU even at $2000 has enough of a consumer base that Nvidia isn't hurting for pricing them as such.
2 years ago
Anonymous
>many many people are okay with buying $1000 GPUs. People are buying $700-$1500 phones. People buy $1500-$3000 laptops.
All of this would be fine if not for housing prices also being exorbitant all while many jobs still don't pay more than $20 an hour, which if you ask me should be the bare minimum wage considering how bad things are.
2 years ago
Anonymous
This is a new era. Everyone puts shit on credit cards.
2 years ago
Anonymous
That would be good, but I don't have as much faith in AMD after they started kiking as well with 5600X launch and later RX 6X50 refresh.
He's Taiwanese
I hate to admit it, but for a while there from 2015-2020 I thought that both Nvidia and AMD were Taiwanese companies since their CEOs were both from Taiwan
There are some subtle differences which may not be observable to western eyes. The nuance is really lost if you aren't aware of it.
Lisa Su is what is known as a Real Chink where as Jensen Huang is a Fake asiatic.
2 years ago
Anonymous
>Lisa Su is what is known as a Real Chink where as Jensen Huang is a Fake asiatic.
Please enlighten me
2 years ago
Anonymous
Jensen is "Asian." Looks more in place in California than any Asian country. Might as well be Hawaiian. If you saw him walking around Taiwan you'd think "what the frick is he doing here?" Disgusting replicant morphology, probably an Alien. Fake asiatic.
Lisa is a proper squinty eye, can talk to her baba in her ancestor's tongue. If you saw her in Taiwan you'd think "oh, there goes a cute old woman, glory to Taiwan, we are the real China." Real Chink. Based.
2 years ago
Anonymous
>asiatic
KOREAN
I FRICKING KNEW IT, KOREANS ARE THE israeliteS OF ASIA
>Over 400 bux for a fricking shit tier 106 chip
>Almost 700 for a 104 chip, used to be 499 at most
>1.3k for a 102, used to start at 699
Nvidia can go frick themselves, back in the Fermi days you used to get a fricking 100 chip for 500 bux. Cheeky wankers
>also didn't reviews say that even with 12GB the 3060 wasn't performing as it should
NO x060 card will perform "as it should"; it's a weak GPU with weak RAM
Yeah, but it'll still be the lowest end GPU out there and unable to keep up with all the minimum specs that games will have in 8 years
2 years ago
Anonymous
That's his problem not ours
2 years ago
Anonymous
just play games like 5 years after they come out. you'll save thousands on gpus
2 years ago
Anonymous
global recession which might snowball into a depression, further breakdown of trade relations between China and the West, and focus on the new cold war with Soviet 2.0 will make sure that in 8 years there will be no AAA games or anything requiring more than a GT 1030
2 years ago
Anonymous
I wouldn't mind that as long as I get my free gibs
>inflating core count by lumping half-precision RTX cores into it
>no bus and bandwidth info
Yeah, they're STILL not letting anything under a x070 beat the 1080 ti
I don't even play AAA shit and stick to MP and JP titles, I dunno why I'm even looking at top of the line GPUs lol. The 4000 gen is an easy skip, especially for 1440p gamers.
Wtf makes a Ti card a Ti card? Why not just use the in-between numbering, 5? How is the difference between a 4050 Ti and a 4050 4GB of RAM? It should just be 4050 and 4055 then
jesus fricking christ, over 300 for a 4050ti
i legit hope am5 apu really deliver interesting shit now that's on ddr5 because fricking hell, entry level is rip
Supposedly the upcoming Phoenix APU with an RDNA3 IGP has 1536 ALUs. Given what some compiler patches submitted by AMD show, each CU is now 128 ALUs. Still 6 WGPs, 2 CUs per WGP. That is essentially 24 RDNA2 CUs. Pretty tremendous even compared to current Rembrandt APUs with up to 12 RDNA2 CUs.
Allegedly this Phoenix APU will require a moderately sized L3 as IF to make up for the bandwidth, and the L2 also increased, but there's no doubt about the hardware specifics here; the details are already completely listed. They have to have some way to feed all that or it wouldn't be worth the transistors.
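The ALU counts in the leak above multiply out consistently; here is a back-of-envelope check, keeping in mind all the RDNA3 figures are rumors from the post, not confirmed specs:

```python
# Rumored Phoenix IGP config from the post: 1536 ALUs total, 128 ALUs per
# RDNA3 CU (per the compiler patches), 6 WGPs with 2 CUs each. The RDNA2
# figure of 64 ALUs per CU is the known baseline for comparison.

RDNA3_ALUS_TOTAL = 1536
RDNA3_ALUS_PER_CU = 128   # leaked/speculative
RDNA2_ALUS_PER_CU = 64    # known RDNA2 value

rdna3_cus = RDNA3_ALUS_TOTAL // RDNA3_ALUS_PER_CU        # 12 CUs = 6 WGP x 2
rdna2_equivalent = RDNA3_ALUS_TOTAL // RDNA2_ALUS_PER_CU # "24 RDNA2 CUs" worth

print(rdna3_cus, rdna2_equivalent)
```

So 12 physical CUs carrying the ALU throughput of 24 RDNA2 CUs, double a 12-CU Rembrandt, which is exactly the "pretty tremendous" jump the post describes.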
>rtx 4060
>a 3070 refresh
>msrp 70 bucks lower
Man, the x60 class got cucked. Or maybe we got used to having it good with the 1060. 8GB also seems insufficient. 1060 had as much VRAM as the previous top card.
So two or even three combined flagships from a few gens ago? Is this why they killed CrossFire and SLI? So you have to go with their housefires instead of building your own multi-GPU housefire with 90% of the performance at a fraction of the cost?
Efficiency has constantly improved in huge strides; you just don't see it in every SKU because the highest-tier cards don't have perf/watt as a target. Enthusiast hardware is all about ultimate performance, efficiency be damned. You can still lower power limits and see insane perf/watt yourself.
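The point about power limits can be sketched with made-up but typical numbers; cutting the limit by a third usually costs far less than a third of the performance, so perf/watt goes up:

```python
# Illustrative only: the fps and wattage figures below are hypothetical,
# chosen to mirror the usual pattern when you drop a GPU's power limit.

def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

stock = perf_per_watt(fps=100.0, watts=450.0)   # hypothetical stock behavior
limited = perf_per_watt(fps=90.0, watts=300.0)  # hypothetical -33% power limit

print(f"stock: {stock:.3f} fps/W, limited: {limited:.3f} fps/W")
```

Raw fps drops 10%, but efficiency jumps from ~0.22 to 0.30 fps/W, which is why the top SKUs look so wasteful despite the underlying silicon improving.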
I guess I'm stuck in the time when 560, 660, 960 came with cheap prebuilts. Nowadays there's a 1650 Super in them. The x60 class became simultaneously weak, expensive and a power hog.
Fun fact:
From late 2007 until 2015, mid to low-high end cards had an MSRP range from $80 to $350 tops. Examples: the 8800 GT for $350 or less, and the HD 4670 1GB for $60 to $80.
https://www.techpowerup.com/gpu-specs/radeon-hd-4670.c234
https://www.techpowerup.com/gpu-specs/geforce-8800-gt.c201
Over $500 was considered pretty insane and $1000+ was unheard of. Until you can again buy a new mid-range $80 GPU, or a GPU that uses 75 watts or less, I would just keep not buying ever again.
Over $1000 and 1000watts is simply idiocy no matter what.
this tbh
the card with the highest end gpu of a generation used to be $250-350 and the only thing above that was a dual gpu housefire for $500 that no sane person bought
i'm never gonna buy a >200W card. the summers are already hot enough tbh
>i'm never gonna buy a >200W card. the summers are already hot enough tbh
Honestly shocked how cool the 3060 Ti runs right now under full load (under 60c) even with a Ventus 3X cooler. Pushing the fan speed past 40% doesn't seem to do much though.
gpu coolers have become pretty good compared to say a decade ago, yeah
still, regardless of how good the cooler is all of the heat will get dumped into my room
I love how they've just said frick you if you have an older CPU. Time to buy a new cpu, mb, probably ram just to do gpu accelerated video encoding of xyz new codec
Why is everyone acting like the secondary market doesn't exist? With current post-pandemic QC issues seemingly here to stay, à la HDD prices post-flood, and the lack of components causing manufacturers to take all sorts of design shortcuts, going refurb/B-stock/fleabay/local-buy-sell-trade is probably safer and more reasonable, at worst a wash, and all you penny-pinching poors save a few $/eurobucks.
Anyway, why the frick do you zoomzooms need an RX80000 for anyway? To play some half-assed buggy shit like Cyberpunks2098 at 8k where it still looks like a GTA5 mod from 2017 but needs a 40-core CPU?
These days I only play visual novels and the occasional game, so rather than buying a full desktop I might just buy a gaming laptop; apparently the new Intel Arc A730M (laptop) matches a 3070 (laptop) in performance.
Plus it comes with 12GB of VRAM.
I want to permanently move to Linux since I fricking hate Win 11 and just barely tolerate 10.
It's carefully placed at 12GB to not handle 4K too well. Forces you to go up if you want to max games out. This makes the 4060 Ti a better buy for lower res.
Is it possible to extrapolate the rough performance of the cards with this data? E.g. what card would be needed for overall consistent 4K 60FPS minimum?
Give me one reason not to FOMO in on a 3060 Ti as a eurogay right now if I've been waiting 8 years, even if I'll be paying $650+ for a non-housefire cooler model.
You're ruining the fricking market just like the real estate market. Wait for GPUs to go down in price when crypto bottoms out 1.5 years from now and get yourself the mining version of a card.
Linus did a good job letting everyone know you can buy a 1060 6GB for $60. I think you can even find the mining version of a 3060 Ti in 6 months, or right now, but you've got to wait for the prices to go down in 1 year.
>just like the real estate market
Anon, not everyone can just go live with their parents, many people had to move to other cities where they can actually find a job. They had no other real choice even if you took into account how much they'd save in rent.
Nobody is going to be able to buy 4000 series cards when they come out. Don't hold out for something you have no hope of attaining in a reasonable time frame.
>$900 MSRP if you want a 16GB framebuffer
I wish AMD would get their ROCm drivers sorted out already.
I just want a consumer GPU with a 16GB framebuffer so I can build a single machine that I can use for both playing games and for work.
But as it is NVIDIA will keep the size of their framebuffers artificially low to squeeze people like me.
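As a rough illustration of why 16GB matters for compute work (the model size below is a hypothetical example, not any specific workload from the thread):

```python
# Even the weights alone of a mid-sized model eat most of a gaming card's
# framebuffer. 7B parameters is an arbitrary illustrative size.

def weights_vram_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """VRAM for the weights alone, in GiB (fp16 = 2 bytes per parameter)."""
    return n_params * bytes_per_param / 2**30

seven_b = weights_vram_gib(7e9)  # ~13 GiB in fp16, weights only
print(f"{seven_b:.1f} GiB")

# Activations, gradients and framework overhead come on top of this, which
# is why 8-12 GB gaming cards are cramped for anything beyond inference.
```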
People flabbergasted by pricing need to understand the foundry side of things has changed since the time of cheap GPUs.
Almost all companies dropped out of the race, and now if you want to make a top-tier GPU you need to go to TSMC, who already sold first dibs to Apple and who takes orders from a dozen other chip design companies. Their smallest-nm nodes are expensive.
There's also Samsung's foundries, but they're definitely behind TSMC, so you can use them for cheap but risk the competition using TSMC.
New chip foundries cost tens of billions, need to be planned years in advance, and can't quickly react to chip demand. There's also the ASML bottleneck on the machines that make the chips: all the ~$150 million machines they build are backordered for years to come because some of their supply lines are limited and can't react quickly.
I'm surprised Nvidia even makes consumer GPUs anymore. They get all their money from data centers and selling GPUs to cryptofarms, the desktop PC market is a frivolous novelty at this point.
Why are people here continually shocked at how graphics card prices are outpacing inflation and size and TDP are following suit? The flagship today is in no way comparable to the flagship 10 years ago. Graphics fidelity, resolution AND refresh rate targets are all increasing simultaneously.
Graphics fidelity isn't increasing. Games look like dogshit apart from that Matrix city demo. The Last of Us Part II on PS4 looked way better than most games on the PS5 despite the extra power. It's clear that graphics are plateauing and only the biggest studios can afford to make a game with state-of-the-art graphics, the limiting factor being development time and not graphics hardware. I guess you can throw the extra horsepower at placebo like higher refresh rates and higher resolutions, because it isn't actually being used to make prettier games.
Fidelity increasing is precisely why games are looking worse and worse. The developers have increasingly focused on making games that look realistic* instead of looking good.
yup. they don't realize that it just makes their gaymes look shit and uncanny and that most people want games with good art direction and gameplay rather than the newest soi tracing. vidya is art, and in art it doesn't matter whether your brush is the most expensive or your painting is the most realistic. it's easier to do than just making a good game though so we will keep seeing this shit
Muh ebin grafics is still a thing, but wasteful, because it's people 3d scanning things and not optimizing them, unreal engine is leading the way with 60k polygon boulders.
UE5 is providing dev monkeys with a fool proof way of increasing fidelity. They can just turn their Lumen and Nanite On and have more grafix. So, at least there is some improvement there.
>vidya is le bad
literally why
most of you are imageboard dwellers with cum crust on your keyboards, half of the userbase does not even know how to install linux
why is there such a memetic push against video gaymes
how is playing a game worse than browsing IQfy.org/g/
2 years ago
Anonymous
This
2 years ago
Anonymous
I browse IQfy for 30 minutes every morning. It's like reading the news.
Then I get ready for my job.
>YOU WASTE 30 MINS EVERYDAY
Yes. And you waste hours and hours of your day on Vidya. >I HAVE SELF CONTROL ANON
You're participating in a golobohomosexual consumer thread, no, you don't have self control. Cope and dilate manlet.
2 years ago
Anonymous
>It's like reading the news.
Not when you're arguing with other shitposters it isn't (in a thread about gaming hardware, no less. But SURE, you don't play games).
Also, your last post was well over 30 minutes ago, so you're full of shit anyway.
2 years ago
Anonymous
Lol. I have some extra time since my wife walked my dog, I have about 11 extra minutes this morning. You really are coping pretty hard here man. I recommend you install Linux so you'll be less tempted to consume vidya.
2 years ago
Anonymous
>I have about 11 extra minutes this morning
Your first post in this thread was an hour ago, anon. That means you're twenty minutes late from work.
Except for the part where you're probably a NEET.
2 years ago
Anonymous
Yes. Because I wake up, 30 mins of IQfy, then go to work. You are really grasping at straws here. I'm up at 6:25; it's 7:54 right now and I don't work till 8:30. I'm really enjoying watching neets try to tell me that I'm the neet, so please continue. Also GUIX is a good distro if you're trying to beat vidya addiction.
2 years ago
Anonymous
>Yes. Because I wake up, 30 mins of IQfy, then go to work.
And by 30 minutes you apparently mean an hour and a half.
2 years ago
Anonymous
Yeah you're doing a good job keeping me entertained between pages of my book. Thanks.
Parabola is another good distro for vidya addicts too.
2 years ago
Anonymous
he probably has an hour long cager commute
2 years ago
Anonymous
b***h homie I'm literally at the office rn
I just RDP to my home computer on one screen of my work computer and do work on the other
simple as
2 years ago
Anonymous
his wage cage doesn't allow untrusted connections
2 years ago
Anonymous
I see what you mean, but jobs typically are a waste of time if you are not animated by a lack of money. Worse than browsing IQfy tbh
2 years ago
Anonymous
i agree that they're mostly a waste of time but how the frick do you expect people to live with no money
>x60 for $430
Not falling for the israeli trick. Maybe when there's a game that actually requires more than a 1070 to run at 1080p prices won't be so moronic. $300 used to be high end 10 years ago, they can eat my dick.
Why are GPUs so expensive now? Things used to be priced reasonably back before the 10 series. Is it the VRAM ballooning the cost, or is there something else
>2008
Those gpus had frick all ram and transistors tho
Back when the dollar was worth like $3 now?
>Back when the dollar was worth like $3 now?
Try 4x. Street-price inflation on things like fuel and food has doubled in the last month, and that's up 4x from 3 years ago
2 years ago
Anonymous
So? Generational improvements don't warrant a price increase; they are just raising their margins. Just look at car prices: if you correct for inflation, they actually got cheaper compared to the 90s.
2 years ago
Anonymous
>they are just raising their margins
True. This is objectively true, Nvidia is open and honest about it actually. Consumers don't care though. People just keep buying them all up.
We're probably going to reach a point where integrated graphics cover the basic needs of everyone, and the cheapest discrete GPUs will be over $300.
The GTX 260 retailed for like $450, what are you even talking about?
2 years ago
Anonymous
the 8600GT, mid tier, you could play literally anything at 768p/med or high with decent FPS, 100 buckaroos
2 years ago
Anonymous
The 8600GT launched in 2007 and the MSRP was $200 but retailed for like $250. It wasn't even the real mid-ranger of that generation, that was the 320MB 8800GTS. You had the 8800 GTX and 8800 Ultra (and the 640MB 8800 GTS) above it, placing it at the x060 segment.
2 years ago
Anonymous
maybe in 2007 it did
I got mine for $100 back in 08
2 years ago
Anonymous
It quickly dropped in price and got the 216SP upgrade, which is unlikely to happen in today's market.
You know I have no reason to buy this.
I will upgrade once AV1 support is fully adopted and perhaps they do something similar to the NVENC encoder, or find a way to make it do exactly that.
Am I crazy?
I can play every game I enjoy at max setting on my laptop 3080 system.
In layman's terms they are 2070 Super 8GB vs. 3070 8GB. The latter being clearly planned obsolescence given that 2 years ago the 3070 was already criticized for having low VRAM. You'll probably be better off with a 12GB 7600 XT if that ever comes to fruition.
>You'll probably be better off with a 12GB 7600 XT if that ever comes to fruition.
but AMD cards suck because OpenGL stuff on Windows sucks with AMD cards.
The only game worth playing on PC this year is Modern Warfare II and that one isn't graphically intensive at all. Lol at buying a megapint GPU when a 3060Ti will do the needful
elden ring runs on a literal rx 560
we'll probably see gotta go fast have higher reqs, but i'd say the recommended requirements aren't going above the 1660/2060
Only certain boards allow spoiler tags, friendo. And to ensure you do it right, just highlight your text in the quick reply window and hit Ctrl+S to auto-apply the tags.
>the OP pic is pure speculation, too.
Oh, it is? I fricking hope so. If anything in the 4000 series has less than 10GB of VRAM (outside of the 4050), I'm going to cry.
>less than 10GB of VRAM (outside of the 4050), I'm going to cry
The "low" end cards from both AMD and Nvidia will have 8GB of VRAM, so expect the 4060 and the 7600/7700 to be 8GB cards
Just get a console lol imagine paying that much when the PS5 gets better performance at a lower price than those shitty cards lol pctards never cease to amaze me.
>muh [insert arbitrary number] gpu used to cost [insert arbitrary number]
>i can't believe it
Who cares? They're all just made up numbers that mean nothing. The only thing that matters is how many frames I get for my money. I don't care if last year it was called GOY1000 and this year they released KYS300 when the performance is the same but costs like 30% less.
>but muh number is smaller
>entry level now costs more than high end 10 years ago
XX50 cards used to be throwaway dogshit that was on par with an iGPU. If it's gonna be on par with a 3060/Ti, is it really entry level when you're getting 80-100+ fps in AAA games at 1440p for $250? This "entry product" can now deliver an optimal gaming experience for 99% of people. There is no universal rule that if the GPU has 60 in the name, it has to be $200. GPU technology has changed entirely in its target applications and now does a lot more than just play games. It's essentially a completely different product with a completely different target market and priorities; playing games is just something it can also do, an extra feature.
Regardless, these prices make no sense. If a 4070 = 3080, then you're literally getting the same performance for $20 less than the 3080 MSRP. Unless they're pricing them against current inflated prices, which would be extremely moronic. If a 4070 is gonna be closer to a 3080 Ti/3090, then $679 would be a really good price. The hope is that you're not a moronic consoomer who just buys things for the number on the box.
I need a new GPU for Blender so I'll take the RTX 4090 Ti.
name THREE (3) games i need 16GB of VRAM for
Blender unironically uses a lot of VRAM.
GAMES homie, idgaf about your gay 3d programs
not sure if you know this but gtx gpus are for video games
these are rtx homie
Cyberpunk 2077
Star citizen
Revenge Porn Simulator
>Cyberpunk 2077
lol
>Star citizen
lmao
>Revenge Porn Simulator
??? rofl
>lol
kek
>lmao
bur
>??? rofl
hehe XD
U mad bro?
did star citizen actually happen i thought they ran out of money
what the frick is a revenge porn simulator
>what the frick is a revenge porn simulator
Its when you deep fake a woman, and use her as your VR sex doll.
I don't think you need a high end GPU to do that
Got it, I need to drop thousands of dollars so I can play a game that doesn't exist.
>Got it, I need to drop thousands of dollars so I can play a game that doesn't exist.
That has been the whole point of high-end computers for the past 30 years. When games are designed so the bottom 10% can play it and the bottom 40% can play it well, that means everyone else wasted money on a high-end system.
dogshit, the list.
Nobody plays Cyberpunk or Scam Citizen bro
People pan the camera around their ship in Star Citizen between crashes, and that takes an awful lot of GPU
Half-Life: Alyx on max
Skyrim at any resolution above 1600x900
Total War: Warhammer 3 on medium settings
Any Paradox game after 2015
Fallout 4 at any resolution above laptop tier
Corruption of Champions 2
Emulated Duck Hunt
Modded Skyrim, as always. Black folk will retexture an apple and use 16k textures because frick you
get a VR headset
Every VR game
Escape from Tarkov
Stalker with ultrahobomod
TESVI and Starfield will probably need 20GB VRAM minimum for mods
Skyrim at 1440p and 4k will max out 16GB VRAM and strategy games like Total War will always want more memory
Most recent titles and especially ports are also VRAM hogs of varying degrees, the only ones immune to this are the yearly AAA releases where they're far more optimized and can run on toasters
Skyrim needed GTX 680 at MINIMUM to get choppy 20 fps with grass mods in 2012. I can already see the xx90 being the "you need this to get into serious modding" card, and it's mainly because of the 24GB VRAM that gets maxed out at 4K.
>TESVI and Starfield will probably need 20GB VRAM minimum for mods
That's like saying a game will need 20GB of VRAM if you make it fill the buffer with garbage.
Skyrim mods only need stupid amounts of VRAM if you don't optimize them, modders will use raw 8K textures for something that won't occupy 5% of your screen if you let them.
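The "raw 8K textures" complaint is easy to put numbers on; one uncompressed texture versus a block-compressed copy (plain arithmetic, not measurements from any game):

```python
# Footprint of a single 8192x8192 texture: uncompressed RGBA8 at 4 bytes
# per texel vs. BC7 block compression at 1 byte per texel.

def texture_mib(width: int, height: int, bytes_per_texel: float) -> float:
    return width * height * bytes_per_texel / 2**20

raw_8k = texture_mib(8192, 8192, 4)  # uncompressed RGBA8: 256 MiB
bc7_8k = texture_mib(8192, 8192, 1)  # BC7-compressed:      64 MiB

print(raw_8k, bc7_8k)
```

A quarter-gigabyte of VRAM for one uncompressed 8K texture is how a handful of unoptimized retextures blows through a 16GB card.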
imagine STILL playing skyrim. grow the frick up Black folk
At this point Skyrim is just a platform, people make entire games on it.
Imagine not having VR and modding it
I'll most likely buy a 4080 for MSRP or a 3090 if its under 600 €, if I get the 4080 I won't even unpack it and continue to use my 6900 XT, then scalp the shit out of it
I doubt there will be another way to obtain a nice high end GPU
Richgays can simply pay me for upgrading its not like they don't have too much money anyway
Far Cry 6 does require that much for the HD texture pack, and even without that you can't really run it with less than 8GB vram (at 6GB it will eventually refuse to load in textures at mipmaps higher than like 128px)
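The mip-clamping behavior described above is just mipmap arithmetic: each level is a quarter the size of the one above, so refusing to stream levels larger than ~128px collapses the per-texture cost. A generic sketch (not Far Cry 6's actual streaming code):

```python
# Memory of a mipmap chain for a square RGBA8 texture, optionally clamped
# so only mip levels at or below max_dim are resident.

def mip_chain_bytes(base, bytes_per_texel=4, max_dim=None):
    total, dim = 0, base
    while dim >= 1:
        if max_dim is None or dim <= max_dim:
            total += dim * dim * bytes_per_texel
        dim //= 2
    return total

full = mip_chain_bytes(4096)                  # full chain of a 4K texture
clamped = mip_chain_bytes(4096, max_dim=128)  # only the 128px-and-below mips

print(full / 2**20, clamped / 2**20)  # ~85.3 MiB vs ~0.08 MiB
```

That thousand-fold gap is why a starved card still runs, it just shows you blurry 128px mips everywhere.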
>$300 for a low end card
frick this gay earth
New cards aren't for the low end anymore. You have the used market to fill that demand. It's what you get for being poor.
They sell new low-end cards under other branding so as not to interfere with the quality brands. GT 1030, now GTX 1630. They don't want to market their lowest cards alongside their performance cards.
Also APUs are becoming usable for basic games. Or buy an xbox
What does SM stand for?
APUs?
>What does SM stand for?
Streaming Multiprocessor. It's the basic block that Nvidia defines their GPU designs by, like the CU (Compute Unit) for AMD. Each CU or SM contains the ALUs, cache, scalar processor, RT accelerators, and various other things. A complete GPU is comprised of numerous CUs/SMs.
>APUs?
Accelerated Processing Unit. AMD marketing term for their chips with integrated graphics.
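Those SM counts multiply straight into the advertised "core" numbers. For example, the RTX 3090's GA102 enables 82 SMs with 128 FP32 ALUs each, which is where the marketing figure of 10496 CUDA cores comes from:

```python
# The advertised CUDA core count is just SMs x FP32 ALUs per SM.

SMS_ENABLED = 82        # RTX 3090: GA102 with 2 of its 84 SMs fused off
FP32_ALUS_PER_SM = 128  # Ampere: two 64-wide FP32 datapaths per SM

cuda_cores = SMS_ENABLED * FP32_ALUS_PER_SM
print(cuda_cores)  # 10496
```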
To be fair that's only about $45 above what low end cards cost in say, 2015, when adjusted for inflation. You should be more angry at how quickly the dollar is losing value
liberals don't understand economics, you know this anon.
said the unaware liberal midwit
Cut the price by 30% on all of them, otherwise I dont want to become the product of israelite
This.
>meanwhile 6900xt cost 800usd and is as fast as the 4070 3090
You're welcome, nvidiots
I'm waiting until the 5000 series is replaced by the 6000 series in 4 years' time. I did the same wait with my 2080 Ti, and a 6900 XT will last till then at least.
I suppose the 4080 releases in September like last time.
homie you need more?
>4080
>3090 Ti
>4070
>3090
>3080
>4060 Ti
>3070 Ti
>4060
>3070
>4050 Ti
>3060 Ti
>4050
>3060
>3050
Yes. I need to be able to buy the newest high end card for $500. You know, like I was able to do 10 years ago with the GTX 680 instead of settling for middling 60 TI garbage. What is even the point in selling high end graphics cards if no one can feasibly buy them? The Titan was a mistake.
Also Twitch streamers. You know for a fact they wouldn't be selling $1000+ cards if every streamer wasn't constantly rushing out to have the best setup possible because money is no object to them. And that's not even mentioning fricking crypto. I'm glad that shit is sinking like the Titanic.
>And that's not even mentioning fricking crypto. I'm glad that shit is sinking like the Titanic.
You must actually be moronic if you think crypto is going anywhere just because it's following the same downward trend as the rest of the market.
>downward trend
bruh it fricking crashed lmao
still up 800% from just a few years ago
you are dense if you still can't see the overall trend
GPUs haven't been that price since my 7900 GT almost two decades ago, mate. I really don't see it happening anymore, and those cards were rudimentary.
I'd say a $500 high-end GPU just isn't possible today: the RAM is 50 bucks, the cooler and fans are at least another 50, the PCB alone and all the shit on it is about 150, then bulk manufacturing, R&D and shit eat the rest.
So a $1k card costs about 500 bucks to develop and manufacture, and you've got to do that every 2-3 years or less.
It's why GPUs are frick huge and basically more complicated, heavier, more expensive and more powerful than a CPU, RAM, mobo and SSD combined.
Welcome to Biden's america fricker.
>$429 for a xx60 level card
oops gotta add board partner tax
>$500 for a xx60 level card
lmao. still gonna sell out.
>900 dollars for a 4080
conflicted emotions...
>Remember when GTX 780 was 400 dollars
>3080 was only 800 dollars
>Also 10%+ inflation over the last year
Are Nvidia justified in their price increases over the past 10 years or are they just milking the ETH mining/chip shortage at this point
>Are Nvidia justified in their price increases over the past 10 years or are they just milking the ETH mining/chip shortage at this point
Do you even need to ask?
The price increase at the RTX introduction should already cover the current inflation. This is greed.
Absolute top of the line GPU cost me $300 in 1999. Adjusted for inflation that would still only be $520 today.
Workstation GPU or the equivalent of 1080 TI ?
zoom zoom
Probably a TNT2 Ultra.
Reminder for all inflationgays that NVIDIA flagships have, for the better part of 20 years, always been priced in the $500-$650 range, with some outliers.
It's only after AMD stopped competing at the high end, after they had their Bulldozer moment in GCN, that NVIDIA was able to set the pricing structure however they wanted. Now all AMD does is slot into the pricing structure set by NVIDIA, because their graphics division has so little financial horsepower that they cannot manufacture enough units to offset the lower profit margin that comes with a lower ASP, even as their sales and market share continue to dwindle, since everything has been devoted to CPUs.
Right where the red arrow is was when they brought out the "Hur dur not a gaming card" GTX Titan line, which was literally just a thinly veiled market test to see how many people were willing to pay for a top-end card that wasn't much faster than the GTX-numbered flagship but cost 2x the price. The answer was a shittonne, and now we reap the rewards I guess
They marketed it as a prosumer card yet they still gimped its compute functionality roflmao
They've created a huge canyon between affordable cards and high end. This marks the line between 4K running well or not. AMD could easily disrupt this and create xx70 level cards with more VRAM.
The market adapted though, and clearly many many people are okay with buying $1000 GPUs. People are buying $700-$1500 phones. People buy $1500-$3000 laptops. A high end enthusiast GPU even at $2000 has enough of a consumer base that Nvidia isn't hurting for pricing them as such.
>many many people are okay with buying $1000 GPUs. People are buying $700-$1500 phones. People buy $1500-$3000 laptops.
All of this would be fine if not for housing prices also being exorbitant all while many jobs still don't pay more than $20 an hour, which if you ask me should be the bare minimum wage considering how bad things are.
This is a new era. Everyone puts shit on credit cards.
That would be good, but I don't have as much faith in AMD after they started kiking as well with 5600X launch and later RX 6X50 refresh.
hasnt the 5600x been $250 for like a year
I paid €300 for a 7800 GT in 2005, and that was the middle card of the high-end class back then.
brb, gonna check early life on black jacket man
He's Taiwanese
I hate to admit it, but for a while there from 2015-2020 I thought that both Nvidia and AMD were Taiwanese companies since their CEOs were both from Taiwan
There are some subtle differences which may not be observable to western eyes. The nuance is really lost if you aren't aware of it.
Lisa Su is what is known as a Real Chink where as Jensen Huang is a Fake asiatic.
>Lisa Su is what is known as a Real Chink where as Jensen Huang is a Fake asiatic.
Please enlighten me
Jensen is "Asian." Looks more in place in California than any Asian country. Might as well be Hawaiian. If you saw him walking around Taiwan you'd think "what the frick is he doing here?" Disgusting replicant morphology, probably an Alien. Fake asiatic.
Lisa is a proper squinty eye, can talk to her baba in her ancestor's tongue. If you saw her in Taiwan you'd think "oh, there goes a cute old woman, glory to Taiwan, we are the real China." Real Chink. Based.
>asiatic
KOREAN
I FRICKING KNEW IT, KOREANS ARE THE israeliteS OF ASIA
I paid 400 for my 2070 what the frick is this shit?
I paid 450€ for my 2070 super 2 years ago. Stay mad
>$900 for an 80 series
I'm just not buying any GPUs unless they are old and used. Nvidia and AMD can get their money from the suckers.
>Over 400 bux for a fricking shit tier 106 chip
>Almost 700 for a 104 chip, used to be 499 at most
>1.3k for a 102, used to start at 699
Nvidia can go frick themselves, back in the Fermi days you used to get a fricking 100 chip for 500 bux. Cheeky wankers
>Dibs on the 4050 Ti
based on that chart it has the worst price/performance apart from the $1000+ cards
They are NEVER gonna give the XX60s more than 8gb are they?
But the 3060 and the 2060 12GB already exist.
The 4060 Ti or 4070 would be the way to go, maybe the 4080. Now, what about the power draw?
Fricking Nvidia only giving the 3070 8GB. Also, didn't reviews say that even with 12GB the 3060 wasn't performing as it should?
>also didnt reviews saying that even with 12gb the 3060 wasnt performing as it should
NO x060 card will perform "as it should"; it's a weak GPU with weak RAM
Not buying a new GPU until energy prices come down
useless without rop counts
TDP???
Please don't buy any of these, climate change is bad enough and is only going to get worse, we cannot support this any longer
You just know these fricking israelites will only release 4090s and 4080s at launch, with the lower cards coming in 6 months later.
Not buying a new GPU until there's one that's 5x faster than what I currently have and costs at most $250
lol
He'll get that in like 7-8 years m8
I'm a patient man
Yeah, but it'll still be the lowest end GPU out there and unable to keep up with all the minimum specs that games will have in 8 years
That's his problem not ours
just play games like 5 years after they come out. you'll save thousands on gpus
a global recession which might snowball into a depression, a further breakdown of trade relations between china and the west, and focus on the new cold war with soviet 2.0 will make sure that in 8 years there will be no AAA games or anything requiring more than a gt1030
I wouldn't mind that as long as I get my free gibs
AHHHHHHHHH make a low profile card please.
>AHHHHHHHHH make a low profile card please.
Not to worry anon, they're making the GTX 1630 just for you.
Hope you like GTX 1050 level performance.
>GTX 1630
Worse than a 6500xt, yet I'm sure morons will lap them up
At least it's Turing and has an open source driver
>GTX 1630
>Worse than a 6500xt
It's worse than a $139 1050 Ti from 2016
The entry level GPU market is dead
I was thinking of getting a 3060 Ti, should I wait for 4060 Ti? Or frick both of those and go for a 12GB 3060?
May as well stick to a GT 1030.
>I was thinking of getting a 3060 Ti, should I wait for 4060 Ti
Only if you want to pay $200 more
Yeah, thought so.
>$899
When I built my first pc you could build a high end gaming pc for $900, with an overkill one at $1000+, sad to see how low we've fallen.
>$100 jump in MSRP after the 2000 series' $100 jump
frick nvidia
>inflating core count by lumping half-precision RTX cores into it
>no bus and bandwidth info
Yeah, they're STILL not letting anything under a x070 beat the 1080 ti
>what is RTX3060Ti
An 8GB mistake
3060 cope.
Last time, during Kepler, the Tis usually used the same GPU as the non-Ti
Now the Tis use the same GPU as an upper-SKU non-Ti
I don't even play AAA shit and stick to MP and JP titles. I dunno why I'm even looking at top-of-the-line GPUs lol. The 4000 gen is an easy skip, especially for 1440p gamers.
Those are some expensive space heaters
Which one is worth changing my RX570 for?
any of them also what cpu do you have
Wtf makes a Ti card a Ti card? Why not just use the in-between numbering 5? How is the difference between a 4050 Ti and a 4050 4GB of RAM? It should just be 4050 and 4055 then
Good question. What does ti even mean.
Titanium.
Instead of gold or platinum or any other ultra gamer tier descriptor to make it sound special
jesus fricking christ, over 300 for a 4050ti
i legit hope the am5 apus really deliver interesting shit now that they're on ddr5 because fricking hell, entry level is rip
You’re comparing apu to dedicated gpu? Wtf
lrn2read Black person, there's no comparison here, just a statement
The APUs will be RX 570 levels of performance, and even that might be optimistic.
Supposedly the upcoming Phoenix APU with an RDNA3 IGP has 1536 ALUs. Given what some compiler patches submitted by AMD show, each CU is now 128 ALUs. Still 6 WGP, 2 CU per WGP. That is essentially 24 RDNA2 CUs. Pretty tremendous even compared to current Rembrandt APUs with up to 12 RDNA2 CUs.
Allegedly this Phoenix APU will require a moderately sized L3 acting as Infinity Cache to make up for the bandwidth; the L2 also increased. But there's no doubt about the hardware specifics here, the details are already completely listed. They have to have some way to feed all that or it wouldn't be worth the transistors.
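The ALU arithmetic above is easy to sanity-check; a minimal sketch, where the per-CU ALU counts are assumptions taken from the rumors above, not confirmed specs:

```python
# Rumored Phoenix IGP layout (assumed figures from the leaks above)
wgp = 6                   # workgroup processors
cu_per_wgp = 2
alus_per_rdna3_cu = 128   # per the alleged AMD compiler patches
alus_per_rdna2_cu = 64    # RDNA2 baseline for comparison

total_alus = wgp * cu_per_wgp * alus_per_rdna3_cu
rdna2_equiv_cu = total_alus // alus_per_rdna2_cu

print(total_alus)      # 1536
print(rdna2_equiv_cu)  # 24, vs 12 RDNA2 CUs in Rembrandt
```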
>rtx 4060
>a 3070 refresh
>msrp 70 bucks lower
Man, the x60 class got cucked. Or maybe we got used to having it good with the 1060. 8GB also seems insufficient. 1060 had as much VRAM as the previous top card.
All I want is 1080ti or higher performance at a lower power draw. Is that too much to ask?
No, friend.
Take the intelpill
That would be the 4060 if they didn't cuck it with 8GB VRAM.
Get AMD. I'm sure they will have a 12GB faster lower TDP card.
I have a feeling the 5000 series will mostly have the same VRAM counts as this, except the 5060 will move up to 10 or 12, and the 5090 might get 32.
>NVIDIA RTX 4080 Rumored To Feature 420 W TDP
https://www.techpowerup.com/295614/nvidia-rtx-4080-rumored-to-feature-420-w-tdp
Ahahahaha.
So two or even three flagships of a few gens ago combined? Is this why they killed CrossFire and SLI? So you have to go with their house fires instead of building your own multi-GPU house fire with 90% equivalent performance at a fraction of the cost?
How come we have barely progressed in efficiency? RTX 4060 has TDP of 220W while 1080 Ti has 250W. It's just one bump faster at 2080 Ti levels.
Compare to 2060 vs 980 Ti: 160W vs. 250W.
Efficiency has constantly improved in huge strides; you just don't see it in every SKU because the highest-tier cards don't have perf/watt as a target. Enthusiast hardware is all about ultimate performance and throwing power at the problem. You can still lower power limits and see insane perf/watt yourself.
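If you want to try that yourself on an NVIDIA card, the driver lets you cap board power without touching clocks; a sketch, assuming Linux with `nvidia-smi` available and a card that supports software power limits (the 150 W figure is just an example, check your card's supported range first):

```shell
# Show the min/max/default power limits for GPU 0
nvidia-smi -i 0 -q -d POWER

# Enable persistence mode, then cap board power at 150 W (needs root)
sudo nvidia-smi -i 0 -pm 1
sudo nvidia-smi -i 0 -pl 150
```

Perf/watt usually improves sharply as you pull the limit down, since the top of the voltage/frequency curve is the least efficient part.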
Yeah, because the 4060 is so enthusiast...
I guess I'm stuck in the time when 560, 660, 960 came with cheap prebuilts. Nowadays there's a 1650 Super in them. The x60 class became simultaneously weak, expensive and a power hog.
>420 W
lol
My name is jensen huang and I am proud to announce that NVENC is removed from everything that isn't the 4090 ti
Fun fact:
From late 2007 until 2015, mid to low-high end cards had an MSRP range from $80 to $350 tops. Examples: the 8800 GT for $350 or less and the HD 4670 1GB for $60 to $80. https://www.techpowerup.com/gpu-specs/radeon-hd-4670.c234 https://www.techpowerup.com/gpu-specs/geforce-8800-gt.c201 Over $500 was considered pretty insane and $1000+ was unheard of. Until you can buy a new mid-range $80 GPU, or a GPU that uses 75 watts or less, I would just keep not buying ever again.
Over $1000 and 1000watts is simply idiocy no matter what.
this tbh
the card with the highest end gpu of a generation used to be $250-350 and the only thing above that was a dual gpu housefire for $500 that no sane person bought
i'm never gonna buy a >200W card. the summers are already hot enough tbh
>i'm never gonna buy a >200W card. the summers are already hot enough tbh
Honestly shocked how cool the 3060 Ti runs right now under full load (under 60c) even with a Ventus 3X cooler. Pushing the fan speed past 40% doesn't seem to do much though.
gpu coolers have become pretty good compared to say a decade ago, yeah
still, regardless of how good the cooler is all of the heat will get dumped into my room
we are not getting a $150 entry gpu again, are we?
Entry level GPUs used to be under $100. Now this market is served by integrated graphics
I love how they've just said frick you if you have an older CPU. Time to buy a new cpu, mb, probably ram just to do gpu accelerated video encoding of xyz new codec
Progress waits for no poorgay
its even funnier that there are people who run to their defense and are willing to pay 2000 dollars for a gpu HAHA
>150 dollars
Maybe a GTX 1630 if it's actually real and not a second GT 1020/1010 situation.
MSRP prices haven't been a thing since like 2016. If you want realistic pricing instead of the lies on that table then add 35% minimum to the cost.
Frick these prices imagine paying $250 msrp for a 4gb card in 2015+7
Why is everyone acting like the secondary market doesn't exist? With current post-pandemic QC issues seemingly here to stay, ala HDD prices post-flood, and the lack of components causing manufacturers to take all sorts of design shortcuts, going refurb/b-stock/flebay/local-buy-sell-trade is probably safer and more reasonable, at worst a wash, and all you penny-pinching poors save a few $/eurobucks.
Anyway, why the frick do you zoomzooms need an RX80000 anyway? To play some half-assed buggy shit like Cyberpunks2098 at 8k, where it still looks like a GTA5 mod from 2017 but needs a 40-core CPU?
Not many people want to buy used and abused GPUs
These days I only play visual novels and the occasional game, so rather than buying a full desktop I might just buy a gaming laptop; apparently the new Intel Arc A730M (laptop) matches a 3070 (laptop) in performance.
plus it comes with 12gb of vram.
I want to permanently move to linux since I fricking hate win 11 and just barely tolerate 10.
How can they justify raising prices so much?
People were willing to pay scalper prices so they'll be willing to pay this
Because man baby gamers. Go outside, do something useful.
Why are these israeli c**ts making me pay $2k dollars for a fricking graphics card? i used to build entire computers for $2k dollars.
/waitingfor3090pricetodropfurther/ here. I just want my 24Gb so I can use neural networks for two weeks and then drop it.
I'm not paying more than $500 for an xx70 card
Bot yourself a founder's edition then.
I'll be getting a $100 coupon from INTCEL for their GPUs.
Sorry Novidya but you're not needed any more.
$150 more for the x060 Ti model? Why is this allowed?
Because inflation and frick you goy
Guess I'll stick with my GTX 1070 for another couple years.
> 4070 is still cucked with 12 GB of VRAM
Hopefully RDNA3 delivers because frick that.
It's carefully placed at 12GB to not handle 4K too well. Forces you to go up if you want to max games out. This makes the 4060 Ti a better buy for lower res.
Is it possible to extrapolate the rough performance of the cards with this data? E.g. what card would be needed for overall consistent 4K 60FPS minimum?
Give me one reason not to FOMO in on a 3060 Ti as a eurogay right now if I've been waiting 8 years, even if I'll be paying $650+ for a non-housefire cooler model.
You're ruining the fricking market, just like the real estate market. Wait for GPUs to go down in price when crypto bottoms out 1.5 years from now, and get yourself the mining version of a card.
Linus did a good job letting everyone know you can buy a 1060 6GB for $60. I think you can even find the mining version of a 3060 Ti in 6 months, or even right now, but you've got to wait 1 year for the prices to go down.
>Linus did a good job letting everyone know you can buy a 1060 6GB for 60$
If you wouldn't waste your time on watching shitty tech youtubers, maybe you could afford a high-end GPU.
>just like the real estate market
Anon, not everyone can just go live with their parents, many people had to move to other cities where they can actually find a job. They had no other real choice even if you took into account how much they'd save in rent.
>you can buy a 1060 6GB for 60$
Source? A new gen GPU sounds nice, but I just need something that works at all.
Nobody is going to be able to buy 4000 series cards when they come out. Don't hold out for something you have no hope of attaining in a reasonable time frame.
The general market went down 90% within a few days?
I just got myself a 4k120hz OLED TV
Guess I'll get the 4080 or the Ti, but the Ti probably releases later
>$900 MSRP if you want a 16GB framebuffer
I wish AMD would get their ROCm drivers sorted out already.
I just want a consumer GPU with a 16GB framebuffer so I can build a single machine that I can use for both playing games and for work.
But as it is NVIDIA will keep the size of their framebuffers artificially low to squeeze people like me.
all the prices are +$200 at the very least
This pricing is peak judaism.
But leaks always get the price wrong, moreso than the specs, so we'll see.
People flabbergasted by pricing need to understand the foundry side of things has changed since the time of cheap gpus.
Almost all companies dropped out of the race, and now if you want to make a top-tier GPU you need to go to TSMC, who already sold first dibs to Apple and takes orders from a dozen other chip design companies. Their smallest-nm nodes are expensive.
There's also Samsung's foundries, but they're definitely behind TSMC, so you can use them for cheap but risk the competition using TSMC.
New chip foundries cost tens of billions, need to be planned years in advance, and can't quickly react to chip demand. There's also the ASML bottleneck, who make the chipmaking machines. All the 150-million machines they make are backordered for years to come because some of their supply lines are limited and can't react quickly.
I'm surprised Nvidia even makes consumer GPUs anymore. They get all their money from data centers and selling GPUs to cryptofarms, the desktop PC market is a frivolous novelty at this point.
>$430 (actually $600) for the base XX60 model
A waste of everyone's time, again.
>paid 149.99 for my gtx 1060 6gb in 2016
>still just werks
let me guess, you NEED more
I just want a sub 200W 1080Ti equivalent.
Get a 3070 and underclock it then.
3060 Ti
Why are people here continually shocked at how graphics card prices are outpacing inflation and size and TDP are following suit? The flagship today is in no way comparable to the flagship 10 years ago. Graphics fidelity, resolution AND refresh rate targets are all increasing simultaneously.
Graphics fidelity isn't increasing. games look like dogshit apart from that matrix city demo. the last of us part II on ps4 looked way better than most games on the ps5 despite the extra power. it's clear that graphics are plateauing and only the biggest studios can afford to make a game with state of the art graphics, the limiting factor being development time and not graphics hardware. I guess you can throw the extra horsepower at placebo like higher refresh rates and higher resolutions, because it isn't actually being used to make prettier games.
Fidelity increasing is precisely why games are looking worse and worse. The developers have increasingly focused on making games that look realistic* instead of looking good.
yup. they don't realize that it just makes their gaymes look shit and uncanny and that most people want games with good art direction and gameplay rather than the newest soi tracing. vidya is art, and in art it doesn't matter whether your brush is the most expensive or your painting is the most realistic. it's easier to do than just making a good game though so we will keep seeing this shit
Muh ebin grafics is still a thing, but it's wasteful, because it's people 3D scanning things and not optimizing them; Unreal Engine is leading the way with 60k-polygon boulders.
UE5 is providing dev monkeys with a fool proof way of increasing fidelity. They can just turn their Lumen and Nanite On and have more grafix. So, at least there is some improvement there.
Sorry mr. novideo that is too much I will NOT be buying your goy card today.
>MUST CONSOOOOOOOOOOOOOOOOOM
>ME PLAY HECKIN VIDYA ON LE HECKIN RTX!
Grow up manlets.
>vidya is le bad
literally why
most of you are imageboard dwellers with cum crust on your keyboards, half of the userbase does not even know how to install linux
why is there such a memetic push against video gaymes
Because you're wasting your short time on this planet.
When you play video games, if you don't feel like you're wasting time, or feel guilty you're obviously under developed.
how is playing a game worse than browsing IQfy.org/g/
This
I browse IQfy for 30 minutes every morning. It's like reading the news.
Then I get ready for my job.
>YOU WASTE 30 MINS EVERYDAY
Yes. And you waste hours and hours of your day on Vidya.
>I HAVE SELF CONTROL ANON
You're participating in a golobohomosexual consumer thread, no, you don't have self control. Cope and dilate manlet.
>It's like reading the news.
Not when you're arguing with other shitposters it isn't (in a thread about gaming hardware, no less. But SURE, you don't play games).
Also, your last post was well over 30 minutes ago, so you're full of shit anyway.
Lol. I have some extra time since my wife walked my dog, I have about 11 extra minutes this morning. You really are coping pretty hard here man. I recommend you install Linux so you'll be less tempted to consume vidya.
>I have about 11 extra minutes this morning
Your first post in this thread was an hour ago, anon. That means you're twenty minutes late from work.
Except for the part where you're probably a NEET.
Yes. Because I wake up, 30 mins of IQfy, then go to work. You are really grasping at straws here. I'm up at 6:25, its 7:54 right now I don't work till 8:30. I'm really enjoying watch neets try to tell me that I'm the neet, so please continue. Also GUIX is a good distro if you're trying to beat vidya addiction.
>Yes. Because I wake up, 30 mins of IQfy, then go to work.
And by 30 minutes you apparently mean an hour and a half.
Yeah you're doing a good job keeping me entertained between pages of my book. Thanks.
Parabola is another good distro for vidya addicts too.
he probably has an hour long cager commute
b***h homie I'm literally at the office rn
I just RDP to my home computer on one screen of my work computer and do work on the other
simple as
his wage cage doesn't allow untrusted connections
I see what you mean, but jobs typically are a waste of time if you are not animated by a lack of money. Worse than browsing IQfy tbh
i agree that they're mostly a waste of time but how the frick do you expect people to live with no money
you already waste your time slaving away for some bucks every day, why not waste your time enjoying something for once
>x60 for $430
Not falling for the israeli trick. Maybe when there's a game that actually requires more than a 1070 to run at 1080p prices won't be so moronic. $300 used to be high end 10 years ago, they can eat my dick.
>3060 has 12gb
>4060 has 8gb
What in the absolute frick?
>3060 has 12gb
>3060 ti has 8gb
>3070 has 8gb
>3070 ti has 8gb
The 3060 was a fluke.
>hbm will never make it to consumer cards again
Why even live
Its not needed yet.
Disagree. The main thing holding back gpus is bandwidth as shown by consistent increases in performance
Vega competing with only 4gb begs to differ
Competing with what? A 1650 Super?
because turns out it doesn't help shit
if 4080 was 4060ti price i would buy
I've been waiting for so long to upgrade from my shitty, nearly ten year old R9 280X (which is just a rebadged 7970 from 2012)
When will I be able to get a decent card for $299 that isn't a housefire?
>When will I be able to get a decent card for $299 that isn't a housefire?
Never again.
$299 from 2013 is worth $371 today.
What? Your wage hasn't increased by the same amount? How curious!
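That $299 → $371 conversion is just a CPI ratio; a minimal sketch, with approximate annual-average US CPI-U index values as the assumed inputs (exact results depend on which months you compare):

```python
# Approximate US CPI-U index values (assumptions for illustration)
cpi_2013 = 233.0
cpi_2022 = 289.1

def inflation_adjust(price, cpi_then, cpi_now):
    """Scale a historical price by the ratio of CPI index values."""
    return price * cpi_now / cpi_then

print(round(inflation_adjust(299, cpi_2013, cpi_2022)))  # 371
```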
>270x came out mid-late 2013
fug, this hurts bro....
don't buy in the top then boomer
then what do you care if it's up or down
Dibs on the 4090ti for 4k gaming
power requirements leaked yet?
if the 4060 takes 220w i wonder what they'll do with the 4050
Why are GPUs so expensive now? Things used to be priced reasonably back before the 10 series. Is it the VRAM ballooning the cost, or is there something else
I rember when a $100 GPU was considered mid-tier.
Back when the dollar was worth like $3 now?
2008
>2008
Those gpus had frick all ram and transistors tho
>Back when the dollar was worth like $3 now?
Try 4x. Street-price inflation: fuel and food have doubled in the last month, and that's up 4x from 3 years ago
So? Generational improvements don't warrant a price increase, they are just raising their margins. Just look at car prices, if you correct for inflation they actually got cheap when compared to the 90s.
>they are just raising their margins
True. This is objectively true, Nvidia is open and honest about it actually. Consumers don't care though. People just keep buying them all up.
We're probably going to reach a point where integrated graphics cover the basic needs of everyone, and the cheapest discrete GPUs will be over $300.
The GTX 260 retailed for like $450, what are you even talking about?
the 8600GT, mid tier, you could play literally anything at 768p/med or high with decent FPS, 100 buckaroos
The 8600GT launched in 2007 and the MSRP was $200 but retailed for like $250. It wasn't even the real mid-ranger of that generation, that was the 320MB 8800GTS. You had the 8800 GTX and 8800 Ultra (and the 640MB 8800 GTS) above it, placing it at the x060 segment.
maybe in 2007 it did
I got mine for $100 back in 08
It quickly dropped in price and got the 216SP upgrade, which is unlikely to happen in today's market.
You know I have no reason to buy this.
I will upgrade once AV1 support is fully adopted, and perhaps they do something similar with the NVENC encoder, or find a way to make it do exactly that.
Am I crazy?
I can play every game I enjoy at max setting on my laptop 3080 system.
The 4000 series will most likely have hardware AV1 encoding, because Jetson Orin for cars already has it.
Just waiting for more adoption; it would be amazing if they can do something with NVENC to make it do that instead.
What do you mean? Obviously that AV1 encoder will work via NVENC interfaces.
I'm not knowledgeable enough on the subject but I do believe you.
Hopefully they can put it on older cards.
Looks like shit. Why wouldn't I just get a 3090?
Just ordered the 3070 Ti TUF for 600 euro. Am I moronic?
RTX 4080 IS A MID RANGE CARD NOW WOW.
RTX2060 is still enough and FRICK You Jensen.
fellow slavbro?
How the frick is this computer using 2.5gb of ram without even a fricking window system?
>$679 for the 4070
>12GB VRAM
It's over.
i hope the 4050 ti is much faster than my 1060 6gb
The 3050 is like 25% faster than the 6gb 1060. I'm sure the 4050ti will slap the shit out of it.
The 3050 is already about as fast as a 1070 so the 4050 Ti will probably be 1080 tier
It's a 2070 Super.
So about as strong as a PS5? Should be just fine for 1080p then.
no source, fake and gay
Alright lads, based on OP:
Would you rather get an 8GB 4050 Ti or an 8GB 4060?
In layman's terms they are a 2070 Super 8GB vs. a 3070 8GB. The latter is clearly planned obsolescence, given that 2 years ago the 3070 was already criticized for having low VRAM. You'll probably be better off with a 12GB 7600 XT if that ever comes to fruition.
>You'll probably be better off with a 12GB 7600 XT if that ever comes to fruition.
but AMD cards suck because opengl stuff on windows sucks with AMD cards.
Won't have dlss though
>dlss
only 4xxx series will have it?
The only game worth playing on PC this year is Modern Warfare II and that one isn't graphically intensive at all. Lol at buying a megapint GPU when a 3060Ti will do the needful
I personally want to play Elden Ring, the Saint's Row reboot and Sonic Frontiers.
elden ring runs on a literal rx 560
we'll probably see gotta go fast have higher reqs, but i'd say the recommended requirements aren't going above the 1660/2060
wasting all that money on gaygit gaymen and accelerated soiputing
HE FRICKED UP WITH THE SPOILER TAG TWICE EVERYONE LAUGH AT HIM AHAHAHAHAAH
Please be gentle
Only certain boards allow spoiler tags, friendo. And to ensure you do it right, just highlight your text in the quick reply window and hit ctrl+s to auto-apply the tags.
I don't even play AAA "games", I just would like to play VR games from time to time on something other than low graphic settings
Are there any benchmarks yet for machine learning / video encodes?
No, the OP pic is pure speculation, too.
>the OP pic is pure speculation, too.
Oh it is? I fricking hope so. If anything in the 4000 series has less than 10gbs of VRAM (outside of the 4050), I'm going to cry,
>less than 10gbs of VRAM (outside of the 4050), I'm going to cry,
The "low" end cards on both AMD and nvidia will have 8gbs vram, so expect the 4060 and the 7600/7700 to be 8gb cards
Bump.
>250€ gpu now has mrsp of the 80‘s line
What a time to be alive
I always forget how cool pc gaming can be. So much better than consoles
How long after release will they be safe to buy?
As in the morons will have beta tested them and they got no issues
Just get a console lol imagine paying that much when the PS5 gets better performance at a lower price than those shitty cards lol pctards never cease to amaze me.
It just got no games!
Isn't the PS5 like $500 if you can find it and it's basically a 2070?
What're the power draws?
300w+ minimum
good luck getting one in the first 6 months
>good luck getting one before xmas
fixed that for you
>muh [insert arbitrary number] gpu used to cost [insert arbitrary number]
>i can't believe it
Who cares? They're all just made up numbers that mean nothing. The only thing that matters is how many frames I get for my money. I don't care if last year it was called GOY1000 and this year they released KYS300 when the performance is the same but costs like 30% less.
>but muh number is smaller
>entry level now costs more than high end 10 years ago
XX50 cards used to be throwaway dogshit that was on par with an iGPU. If it's gonna be on par with a 3060 / Ti, is it really entry level when you're getting 80-100+ fps in AAA games @ 1440p for $250? This "entry product" can now deliver an optimal gaming experience for 99% of people. There is no universal rule that if the gpu has 60 in the name, it has to be $200. GPU technology has changed entirely in its target applications and now does a lot more than just play games. It's essentially a completely different product with a completely different target market and priorities; playing games is just something it can also do, an extra feature.
Regardless, these prices make no sense. If a 4070 = 3080, then you're literally getting the same performance for $20 less than 3080 MSRP. Unless they're pricing them against current inflated prices, which would be extremely moronic. If a 4070 is gonna be closer to a 3080 Ti/3090, then $679 would be a really good price. The hope is that you're not a moronic consoomer who just buys things for the number on the box.