Does 4gb of vram really cost $300?
Excuse me what the frick?! They promised 24GB and I think a card had 36GB.
That's the 3090 coming next month
>twice as fast
Not fricking buying. They promised triple the speed AT LEAST.
It even says right there 2-4x faster
Are they mentally moronic they can't even keep their lies straight?
2 for raster, 3 for rt, 4 for sim. at least that's what he said. i just wanna know how fast it makes ai porn
>2 for raster
I'll believe it when I see it. Even their cherry picked tests with no units comparing RT to DLSS+RT don't show 2x
in real world game tests, it's probably just going to be 50 percent faster than the 30 series equivalent, at best. if you guys think that's worth paying a thousand dollars plus for, then be my guest, I guess.
/v/ermin come out of the woodworks again
>Nvidia's 16GB card costs more than 3x more than Intel's 16GB card
Oof.
The top end Intel GPU couldn't even beat a 3060 ti.
honestly if they get pytorch running on it, it'll be a good deal
I'd rather buy 2 used 3080s instead.
You can't SLI them, Nvidia made sure you can't do things like this anymore.
outside of gayms you DO NOT need SLI to use multiple gpus on the same render
Or did you really think people are mentioning intel for anything other than rendering?
SLI/NVLink is able to merge the VRAM pools of cards, so it absolutely has uses outside of gayming.
Thats like comparing a 3.0 Porsche to a 3.0 Renault.
That's like comparing a 2022 3.0 Porsche to a 2001 Toyota Corolla
I like Renault though
>car analogy
>Thats like comparing a 3.0 Porsche to a 3.0 Renault.
Even at a lower displacement the Renault is superior.
>2GHz
Is this thing also on 14nm++++++?
Intel is using TSMC 6nm for the ARC GPUs.
Also, 2.1GHz is fast for a GPU, the RX 6900XT clocks at the same speed.
And clock speed doesn't matter as much as core count on GPU.
>RAM is the sole metric defining a GPU's power
Big brained take. Did you buy the FX5200 too? It has 128mb of VRAM. That HAS to be as good as a TI4600...
The 3080 16 GB and """3080""" 12 GB are completely different GPUs.
It probably has more cores like the 1060 3gb vs 1060 6gb, and 3080 10gb vs 3080 12gb.
The 3080 10gb was a pretty bad card relative to the time.
The leakers were right.
it's not just the VRAM, the 12gb variant has fewer cuda cores.
i thought people would know this by now after 1060 3gb / 6gb
You're being tricked. The 12GB variant uses the AD104 GPU, the 16GB variant uses AD103. This will have significant performance implications. The 12GB is basically a renamed 4070, but they named it 4080 so they can slap a $900 price tag on it.
So they almost doubled the price of the $500 tier card? That's even worse.
Just wait until 4060 comes out and reviews.
>4060 comes out
for $700
Yes and? As long as the performance is worth $700 who cares. I don't know why you people think NVIDIA is still after the low end / mid market. The naming scheme means nothing anymore. Just wait for reviews.
>being this moronic
A 70 chip is supposed to have the previous 90's performance for the same $500. What's the point of releasing a new gpu if you're just gonna move the prices up instead of the performance? Might as well not have an -80 series at all, and just release a 90tixxxxxxzwv version instead.
4080 12GB is really a 4070Ti, which was $600 or whatever, yes.
I was posting for months in /pcbg/ that it's likely that Nvidia will announce very high MSRPs simply to get people to buy up the 3000 series cards.
Then they can go and drop the MSRPs by $150-$200 after 3000 series inventory clears up.
>I was posting for months in /pcbg/ that it's likely that Nvidia will announce very high MSRPs simply to get people to buy up the 3000 series cards.
>Then they can go and drop the MSRPs by $150-$200 after 3000 series inventory clears up.
But does it matter if you can still get 3080s for $700 and less when the 4080 would still be $900 after the price drop? Or maybe no price drop due to inflation? You still save money.
$700 is still too much for a 3080. It's a 2 year old GPU. 12GB one should be $600 or less.
I'll laugh at anyone paying $700 for one when the RX 7800 costs that much and destroys it.
2x in a single game, MS flight sim. Two other games in their own cherry picked example were like 1.53x and 1.6x
>$700 is still too much for a 3080. It's a 2 year old GPU. 12GB one should be $600 or less.
It's worth what the market demands. And you will not see it for cheaper than $700 unless you buy used from ebay.
>it's worth what the market demands
Get out with that plebe nonsense. An informed market would demand Nvidia stop being such ridiculous c**ts. Costs 'em ~$400 to build a high end GPU, yes this means a xx90 class.
Now frick off.
Poorgays are irrelevant. Corporate usage is what matters.
the market is fricking stupid in 90 percent of the cases and you fricking know it. just because the "market" does the right thing now and then, that doesn't mean that the "market" is some kind of infallible god.
>Then they can go and drop the MSRPs by
HAHAHAHA
NVidia will increase prices instead
I don't know what normalgays would do but given the new information about RTX 4000 I will still not buy new RTX 3000 at the current prices.
My current plan is to buy RTX 3000 second-hand which is going to help NVIDIA at most indirectly in clearing their RTX 3000 inventory.
I wonder if long-term a GPU market similar to Thinkpads will emerge:
Businesses/miners buy new GPUs for computation and then dump their inventory on the second-hand market a few years later where it gets picked up by consumers.
>I wonder if long-term a GPU market similar to Thinkpads will emerge
This does not exist everywhere, in western Europe older computers tend to go to eastern Europe instead of feeding a local second hand market. In France at least the refurbished market is very bad with prices way too high to be of any interest.
About 5 years ago there seemed to be plenty of reasonably-priced refurbished Thinkpads on German ebay but I didn't keep track of the market since then.
>I didn't keep track of the market since then.
For myself at work, we decided to extend laptop life from 3 years to 4 or 5 before sending them to brokers, mostly because of the absurd delivery times. Also I've been contacted by a few refurbishing companies offering me computers at competitive prices and in stock.
That's gotta be a super cut down chip. GIGABlack personisraelite marketing.
4080 12gb is just 4070 rebranded
Stop deadnaming, chud
I'm a tourist from another board. can someone give me a quick rundown on that thing and why it is outrageous?
It's a giant shit on their customers, from the meager increase in performance compared to the increase in base price, and low VRAM capacity.
Nvidia increased the prices of the new video cards to about 1.7x of the previous generation, it's probably the biggest GPU price hike in history (officially, technically during the height of the mining craze cards were even more expensive)
What's even more baffling, considering they have inventory build-up after mining crash AND lots of mining cards hitting the secondary market.
Charging those prices with botched VRAM and inferno-tier power consumption while there's an energy crisis going on is going full moron.
AMD already said they're going for performance-per-Watt leadership this generation. Let's see what they deliver.
>considering they have inventory build-up after mining crash
they'll try to spread rumors about mining cards having cancer. the 40 series launch will also probably have extremely limited availability until most of the old stock is cleared.
Their shareholders got drunk on the fat products and want to continue it as long as possible.
Consoomers kept buying inventory at inflated prices even for gayming purposes.
>why it is outrageous
because IQfy is infested with "people" that don't have jobs and live on welfare and mom's allowance well into their thirties, and they get unreasonably upset at things that grown adults can purchase, but that they can't because it's bigger than mom's christmas budget this year
It's the inverse. People who have jobs know the value of money. Schoolers whose parents buy stuff for them go "olololo fair price, nvidia rules".
I meant real jobs, not your cashier job where you need to count pennies.
$1200? Frick it, I hate money. Time to preorder...
4080 16GB is AD103 (already not really a 4080 tbqh), 4080 12GB is AD104 (normally this would be 4070 and 4070 Ti, possibly also 4060 Ti). They made the "real" 4080 almost twice as expensive, and the same for what should have been a 4070. People complained that 3090 cost too much compared to similar performing 3080, well now Nvidia made that difference smaller by 2x-ing the price of all other cards. And this is before AIBs reveal real prices, Founder's Edition won't be found anywhere like it was with Ampere.
you got owned by their marketing
it's a different core configuration
basically the 4070 but called the 4080 (12GB)
It's not just VRAM....
DO NOT BELIEVE THAT THE 12GB 4080 IS A 4080
IT IS A FRICKING DIFFERENT CHIP
THAT SHIT IS A 4070 IN DISGUISE
shut it down.
It's not fricking right It's a different chip they should NOT be allowed to do that shit.
Fricking disgusting. I'll make that thread every day to make sure people know you are not getting a 4080 if you buy the 12GB card.
You would have to unironically do that on reddit and youtube comments.
it literally is
they renamed the 4070 last minute
DOA
AMD just let it slip that their RDNA3 cards will likely be able to boost near 4Ghz
I have a 1000W PSU so my 3090 runs very comfily, but the 4090 will give me power anxiety. Holy shit, 660W?
>I have a 1000W PSU so my 3090 runs very comfily, but the 4090 will give me power anxiety. Holy shit, 660W?
ignoring the so-called transient spikes, afaik 660W is the max possible configuration in BIOS
so a non-OC'ed version will be 450W, but OC'ed could be up to 500W or higher, depending on brand and model
Babby's first mid-tier silicon branded as high-end SKU.
Nvidia did this back with the Kepler and Maxwell launches. Most people didn't even bat an eye at the time because they were able to compete against AMD's high-end offerings.
If this is true why did titan come out a year after the 680 launch, why would nvidia sandbag and deny money for an entire year? I don't buy it.
Titan was a Quadro reject/excess stock (GK110) that Nvidia tried to sell to gamers with more money than sense.
When Hawaii came out, that's when Nvidia unleashed Big Kepler onto the market (GeForce 780 Ti/Titan Black).
Nvidia did in fact sandbag for a year because they could, and made big $$$$. GK104 silicon was much cheaper to make than its Tahiti counterparts.
lol you cant even keep the brands straight, no titan wasn't a quadro reject. GK110 didn't come to any pro line until AFTER titan came out. Seriously I don't believe that nvidia would have only barely competed with the 7970 instead of also just releasing titan/690/branding gk104 as 670 to maintain performance dominance.
Yes, the Kepler Titan was a Quadro/Tesla GK110 reject, hell that was the main reason why GPGPU/professional gays were obsessed with them back in the day (the FP64 portion was intact). They were practically the same hardware for far less $$$$. Nvidia realized that they made a big oops and ended up cannibalizing their actual Quadro/Tesla line-up at the time instead of just enticing gayming-gays with deep pockets.
That's why the next generations were binned much harder and ended up being more of an "early access/sneak peek" of the consumer-tier Big Maxwell and Big Pascal cards.
The Titan brand eventually outlived its usefulness and became the new "xx90" tier.
frick off homosexual bot, at least pick a response that pretends like it can read.
>Yes, the Kepler Titan was a Quadro/Telsa GK110 reject
No, Titan came out in February 2013, the K6000 didn't come until July 2013. Quadros were literally reject Titans, and this is still all a year after Kepler's actual launch with the 680.
Kepler Titans have always been Quadro rejects/excess stock. Quadros were commercially released later because they involved more validation than silly gayming SKUs.
Keep seething that Nvidia's marketing have been bamboozling you.
people also weren't paying $900 for a 70 series card
That's because Nvidia didn't think the market would tolerate it at the time, but crypto-mining insanity proved otherwise.
There are more than enough gaymers in the high-end market that don't bat an eye at current price points.
Nvidia and AMD aren't your friends. Their only concern is shareholders. The shareholders have a vested interest in keeping revenue streams at their current levels. Any significant dropoff that results from not taking advantage of the market will result in lawsuits from the shareholders.
>4080 is actually a 4070
>4070 will be 4060
>4060 will be 4050
ahahahha fricking israelitevidia
Then what 4050 will be? Or it won't exist at all?
3060ti rebranded lmao
>RTX 3660
Will the lower memory bus on the 4080 12GB be detrimental?
no, the lower bandwidth, number of cuda cores and memory amount will
16GB VRAM should be the absolute minimum for a high-end card.
>shit tier vram which means this performance increase is meaningless at higher resolutions
>lower memory bus across the board
literally who are these cards even for? who is trying to game at 500 fps at 1440p???
4080 16gb seems like the worst deal here
It's so expensive that at that point you might as well go for the 4090
>pay 33% more over 12GB for 30% more SMs
>or add another 33% for 4090 for 68% more SMs
?
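For the undecided, here's a quick price-per-SM comparison in Python, using the announced MSRPs and SM counts (60/76/128) for the three cards:

```python
# Back-of-the-envelope price per SM for the announced Ada lineup.
# MSRPs and SM counts as announced: 4080 12GB (60 SMs), 4080 16GB (76), 4090 (128).
cards = {
    "4080 12GB": (899, 60),
    "4080 16GB": (1199, 76),
    "4090": (1599, 128),
}

dollars_per_sm = {name: price / sms for name, (price, sms) in cards.items()}
for name, value in dollars_per_sm.items():
    print(f"{name}: ${value:.2f} per SM")
# The 4090 ends up with the lowest price per SM of the three.
```

Which is exactly the upsell: the 16GB 4080 is the worst value on paper, so anyone doing this math lands on the 4090.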
>It's so expensive that at that point you might as well go for the 4090
And that's exactly what Nvidia is trying to do here.
And 24GB is absolutely worth it for Art 2.0
suggested PSU is 850W, does it take into account the rest of the system? I have an 860W PSU but I'm already running a 5900X. I really want to join in on the AI art thing.
I wouldn't use that without at least a 1000w
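As a sanity check, here is a rough power budget in Python; every figure is a ballpark assumption rather than a measured draw, but it shows why 860W is cutting it close once transient spikes are counted:

```python
# Rough PSU budget check. All figures below are ballpark assumptions,
# not measured numbers.
gpu_board_power = 450   # 4090 stock power limit per the announcement
cpu_peak = 145          # 5900X under heavy all-core load (assumption)
rest_of_system = 100    # motherboard, RAM, drives, fans (assumption)
spike_factor = 1.5      # rule-of-thumb multiplier for transient GPU spikes

sustained = gpu_board_power + cpu_peak + rest_of_system
worst_case = int(gpu_board_power * spike_factor) + cpu_peak + rest_of_system

print(f"sustained ~{sustained} W, transient worst case ~{worst_case} W")
```

Sustained load fits in 860W, but a transient spike plus CPU load can exceed it, which is why 1000W is the safer recommendation.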
$1200? They're nuts. This shouldn't cost that much money. $500 and $600 sure, but $900 and $1200 is just way too high. Selling my Nvidia stock today...
It shouldn't cost $500 either. Before the GPU mining craze, mid-tier GPUs cost $200. Mining is dead now and yet the prices haven't gone back to what they were; in fact they only keep climbing higher.
Nvidia is about to get a wakeup call. Nobody is going to buy these.
*drops support for 30 series right after 40 series comes out and intentionally gimps 30 series drivers*
nothen pursonnel m8
Yeah if they do this, it'll hurt them even further.
they don't care homie, they are in the business of making money nothing else matters and you can pretend like the 40 series won't make them a profit but it obviously will
nvidia will never lose
they will keep israeliteing you
and they will keep winning
pick up a new hobby because enthusiast hardware is dying
Desktop PCs are dying then. Yeah I think it's time. Lots of people aren't going to buy these. Who the hell are they targeting?
why do you think their showcase only talked about gaming for like 5 minutes? the rest of it is about AI and machine learning because that's where the bulk of their money is going to come from
They're about to get hacked a second time.
Rich gamers
That's such a tiny niche. All 8 YouTubers I'm sure will love it.
Even then companies are going to look elsewhere to save money. I don't think Nvidia realized the beast they just woke up.
lol look where, you moronic coping Black person? everything needs cuda up the ass. gamers might actually have an alternative in amd. enterprise does not
Even the enterprises who already have 30 series cards will be holding off. Mark my words, it's too damn expensive.
lol you dumb Black person. this is peanuts for enterprise. when they're paying 300k to some homosexual to write text on a computer, spending an extra 5k on his gpus isn't going to make any difference
you're just a buttblasted gaymer top kek
A buttblasted gamer who has connections. This price will not last.
time to upsell the companies
>you'll need multimultimultimulti GPUs
they are not looking at companies that are using 30 series man. Content creators and video editors are not their target
Doesn't matter, Nvidia is a target now.
>Even then companies are going to look elsewhere to save money.
Fricking where? A company will gladly fire 20% of its workforce if it means they get a new AI chip from nvidia that does their job anyway
Companies and they always will from now on. AI, ML, Deep Learning, Cars, Digital Twins etc. This is for them and nvidia is for them.
I've been reading that for more than 10 years, but we only see Nvidia, Intel and Steam getting richer and richer, and we all know where all their profit comes from.
Ask me again in five years.
My part of the company has 350 Nvidia GPUs across two locations for ML.
most likely the whales. the dudes who earn pretty good money and who will never have wives n kids and only live to be able to spend their money as fast as they get it in order to feel some sense of happiness and fulfillment in their empty lives.
They always do this (hell they already did with the 20xx when the 30xx came out despite all the scalping and inability to purchase at MSRP) and it has never once hurt them. Know why? Because end users aren't the target audience, it's always been the companies that rely on their proprietary software like game companies. Even if you don't use shit like raytracing or hairworks, developers still scramble out of their way to lick some boots and let nVidia walk all over them in order to get such features implemented into their games.
Not sure, many novidya fanboys out there.
Hey, at least there is hope in ayymd or, God forbid, intlel arc.
>PC gaming is fricked at this point, im voting with my wallet. Just wait™ or an RX 6600 maybe? It's probably the only card at what seems to be a valid MSRP here.
You can thank """""""AI""""""" for that
There are no mid-tier GPUs anymore, last gen is supposed to be the new mid-tier GPU, they even made it clear on their slides.
Mid tier does exist it's just priced like high tier shit now
that 4080 12GB is a mid tier card with a moronic $900 price tag
What mid tier GPU was $200 USD MSRP right before the mining craze?
>Before the gpu mining craze mid-tier gpu's cost $200.
no they didn't
they were 350-4000
the frick are you smoking
400*
You could get a 980ti for like $400.
>mid-tier
>$4000
This is your brain on consoom. Did you sell your car for this?
Yawn see
You're so tiring
heres a crazy theory for you
there is no bubble thats gonna pop
it will never shrink down
everyone is just selling it with what they can get away with
they see people buying their overpriced shit, they are gonna keep charging more
same goes for housing market
your kids wont be able to buy property
blame the moronic fricking cattle for that. they've always been spineless fricking fricks.
i'm waiting for the 60 series
Frick novideo and their greed. israelitesen is a crook.
The RTX 4080 12GB is an XX70 card relabeled so AIBs can sell their overstock of the 3XXX series.
Lol at these people, they're gonna lose tons of money just because they're greedy as frick. Everyone and their mothers are gonna buy second hand RTX 30X0 cards for 1/3rd of that price; people are not gonna give a frick about them coming from miners at those prices.
3080 Ti and 3080 12GB were made to price-hike the 3080, so basically the price hike already happened, Nvidia wasn't making any 3080 10GB lately. This pricing is not surprising at all. Naming 4070 "4080" is fricking nasty though.
if you are a gamer it's unironically cheaper to just buy a console
We got rid of minerBlack folk, but how do we stop the AI companies bros?
by bombing their offices
The only hope is that their research costs billions and results in literally nothing but failures. If their drug discovery models are successful they'll print money for the rest of their lives.
They're going to get hacked. Hard. They just painted a giant target on their backs with this price.
They all harvest their data from private individuals and organizations without permission. If you bring the copyright violations to light they'll likely get raped in court
You need 24GB or more VRAM to do the sort of work an AI company needs to do, so most of the consumer GPUs will not be affected.
No, they rebranded the 4070 to "4080 12GB" at the last minute. It's a different chip than the 4080 16GB
No anon it actually costs $600
Your monopoly Australian money does not matter, we're talking in USD since everyone can do the conversion to their own currency regardless of where they live. Convert to AUD yourself and stfu
Is Nvidia moronic? GPU mining is officially dead now. They're never gonna sell these shits.
My GTX 960 has 2GB of VRAM and it still works FINE.
Lol
>450W
>quiet mode
looks almost exactly the same as my current 3090 that im returning tomorrow
>looks almost exactly the same as my current 3090 that im returning tomorrow
Jesus
what is this? 3090ti vs 4090?
>what is this? 3090ti vs 4090?
BBC VS bwc VS sac
motherboard breaker not a double taker
>capitalism 101
So... I should just buy a 3080Ti right? This absolutely isn't worth it.
THANK YOU BASED NVIDIA!
>MSRP 3080 - 699
>MSRP 4080 - 1199
So what is the reason for a $500 price hike beyond "cuz we can"?
there isn't one
NVIDIA does not want to sell you GPUs.
this. they're smart and playing the long game of going datacenter/server only in the future. they don't want to fight in price wars now that Intel is joining the GPU market
>rebranding cards in the very same generation and using such a hideous naming scheme
frick njudea, without their fancy-named stream processors they'd be nowhere, with how hostile they are about everything
#not_buying
Even if your shitcoin dumps to the ocean floor.
The RX 6400 is $80, the RTX 3050 is $150.
Yes, it does.
imagine paying more for cards with higher power consumption 😀
i thought pc consumers are smart......
>i thought pc consumers are smart......
used to be, but the pc users from 15 years ago are not the same as the pc users today, completely different species
double the performance
double the power draw
double the price
Nah, it is more like double the performance for a 10-20% price increase over their predecessors.
Power consumption is somewhat higher at load, not double. It is still more energy efficient.
double performance when you turn on interpolation to double frames
axaAXXAXAAXAXXAXAAXAXXAAX
You idiot, that is where the whole 3-4X figures come from. Nvidia marketing wants the masses to focus on it because it is much more impressive in their eyes than a straight 60-100% gain.
Jesus, we haven't seen generational gains like that since Kepler to Maxwell.
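The interpolation math is easy to sketch. Here is a toy calculation in Python (the 1.7x raster uplift is an assumed illustrative figure, not a benchmark result) showing how frame generation turns a modest raw gain into a big slide number:

```python
# How a real raster gain becomes a "3-4x" marketing figure once frame
# interpolation is counted. The raster uplift is an assumption, not a benchmark.
base_fps = 60.0      # last-gen card, native rendering
raster_gain = 1.7    # assumed raw uplift, in line with "50-70% faster" guesses

rendered_fps = base_fps * raster_gain   # frames actually rendered
displayed_fps = rendered_fps * 2        # frame generation shows one interpolated
                                        # frame per rendered frame
marketing_multiplier = displayed_fps / base_fps
print(f"{marketing_multiplier:.1f}x faster (marketing)")
```

So a 1.7x card gets advertised as 3.4x, because half the displayed frames were never rendered at all.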
>turn off poollss3.3589
>turn off shittracing
>turn off 983258 other buzzwords
>oh it is 15% performance increase
every single time
go be israelite elsewhere
you're probably not even gonna see a hundred percent increase. probably max 50 percent. wait for the 3080 vs 4080 benchmarks to come out, then we'll see what's what.
12GB 4080 is really the 4070 that they were originally going to launch with. Entirely different chip and smaller memory bus than the 16GB 4080.
>You will never be a 4080
>You might call yourself a 4080
>Having less VRAM doesn't make you a 4080
>Your GPU is AD104
>Your memory bus is too small
>Everybody sees you are a 4070 dressed up as a 4080
>being poor
300 dollars?
You can't afford 600 dollars??
1200 dollars is a fair price!
Call me back when you have my 2400 dollars
I-I'm sorry, Mr. Jenstein. I-I'll have your money right away, s-sir...
YOU WILL PAY $1000 FOR THE 407... I MEAN 4080 GOYIM!!!
Well it's either no money or no driver. Choose wisely.
don't you have 1400 phones? just get the gpu to play mobile ports
970 all over again, plenty of morons will buy that "4080" too
good. i took novideo to court and won $1200 over the 970 debacle. let's do it again!
It's nothing like the 970
>1060 3/6GB
>3080 10/12GB
>Now this
This is the 3rd time Nvidia has pulled this. In the other 2 cases the performance difference was like 5-10%, now it looks to be more like 30%
imagine unironically shilling for njudea that wont even present a graph with real world performance
they talked more about parking cars than any other shit
just deal with it njudea does not give a shit about average consoomer
it's not just 4GB of VRAM, it has 30% more CUDA cores
No, but they are Nvidia, they have monopolized the industry... what can you do?
>all these people itt complaining
what is TDP though? is 650w true?
It's 450W but some cards will be higher.
No, they just know that many people won't buy AMD.
The 5700XT and 5600XT were great but people didn't buy them even when the 2000 series was WORSE performance/$ than Pascal was.
4070 will be what should have been the 4060Ti, yes
And 1000+ SPUs.
Kek, they are selling the 4070 as a 4080. Guess RDNA3 is a flop and AMD is going to match prices, else they would not be this moronic.
Can't be worse than rdna2 which is already good.
So the 4080 12 GB is a 4070 in disguise...
Will the 4070 be one tier below then?
They're gonna call the 10GB card a 4070? Is it even coming out next year? What would they charge for it? $699?
when they say twice as fast, is that with or without interpolation? I'd appreciate an answer
with.
Look at the specs. This isn't a 4080. It is a completely different card where the 4GB of VRAM is a rather minor difference compared to everything else.
It is a literal 4070 in everything except name. MAYBE you can call it a 4070ti if you want to pretend Nvidia doesn't frick you over, but it is never a 4080.
Nvidia's biggest mistake was not calling the 16GB card a "4080 Ti", because the 16GB "version" has nothing in common with the other "4080".
It's like selling two completely different cars under the same name, with the only marketed difference being the acceleration. Just imagine a Ford and a Toyota being sold as the "Ford 12" and "Ford 16". That's where Nvidia is with the "4080" name.
in europe, a 3080 has cost 1300 dollars for the last couple of years. and everyone fricking knows that the 3080 was supposed to be the mid-range model. a mid-range gpu used to cost 400 dollars in europe only 6-7 years ago and even less pre-2011. until i can buy another gpu for 500 dollars maximum, i ain't buying a new gpu, so i guess gaming is dead for me now.
Why are they branding evidently distinct cards the same 4080 changing only the amount of RAM?
This is obviously deceptive.
What do you think the reaction would be if they charged $899 for a 4070?
There are two reasons:
>There are still 3xxx series that they want to sell
>They want to sell them at their current prices
It's really just them going full israeli at this point.
The original prices of the 4070 and 4080 were probably lower. NVIDIA's solution: change the naming scheme and increase prices. The same 4070 goes up in price out of thin air with a magic number change.
How worth it will it be to upgrade to one of these 4000 series cards coming from a 2070? Using 1440p ultrawide so hard to even get 60fps on a lot of games at good settings.
It does for nvidia's userbase
I really want to believe AMD won't frick up this massive opportunity, but then I think of the 6750 XT they tried to shill for the past 6 months
Shut up and buy our $900 70 series, I mean 80 series cut down card.
Stop being poor
Gamers are being oppressed by Nvidia...
Will AMD save the day?
Since when did morons conclude that vram is a main metric for gaming performance.
it's the only big number on the box, how else would you tell which is faster?