>those RTX 4000 prices
Thanks, Mr. Leather Jacket Man, but we'll take the GPU market from here.
It's HER turn! Bring on the RED POWER, girl!
>her
It's Satoru Iwata after faking his death and transitioning.
Lmfao
Lacist.
What are you talking about?
Jensen and Lisa are the exact same person.
The jacket works wonders for swag
they're closely related
https://www.techtimes.com/articles/253736/20201030/fact-check-nvidia-ceo-uncle-amds-dr-lisa-su.htm
you do know they're blood relatives, right? they're literally from the same family. also, what are the odds of that?? how come two people from the same family both get jobs in the same type of companies? how does this happen? nepotism?
>how does this happen? nepotism?
Lol. Just good genes. Lisa Su got big at IBM, then moved to AMD for CPUs. Even at IBM she was pushing for chiplet design.
That's why GPUs have only now started to get competitive. Once AMD picked up Radeon they ran it into the ground. Now they're building it back up.
Lisa Su actually got to her position via her big brain. She literally saved AMD
The HD7970 was one of their best cards ever
>HD7970 released 2011
>ATI - 2006 (acquisition completed); 2010 (branding phased out)
HD7970 was the beginning of the end; they bled talent, or at least innovation, from then on.
kekw
go back
you know, when people "transition", they usually don't change their entire face. i know the meme of all chinks looking alike is a funny one, but 900-come-on-now...
>talking about Korean and Japanese people
>calling them chinks
they arr the same
>Lisa Su
>Korean
all slanty-eyed people are chinks, bruh. don't start getting smart with me.
left is jomon, right is yayoi.
Wait so they're not the same person?
Do you think her feet been polished with stones? Do she smell like lilac n shit?
>Here's your new Radeon, Anon
Still better than 4090.
ah, so it's a condom for my futanarigf
Come on bro!
That was clearly photoshopped to make it look smaller with less fans.
LARGEEEEEEERRRRRR.
>It's not smart, it's what you do when you have an inferior product. Shave $100 off the price and the hapless drones would eat it up.
It's exactly what Sony did in 2013 iirc and it worked marvelously. And they did have the superior product. They basically killed MS that day.
Who would win in battle?
It was smart for AMD to let Nvidia go first. It gives AMD the advantage of being able to readjust their RX 7000 prices in advance.
It's not smart, it's what you do when you have an inferior product. Shave $100 off the price and the hapless drones would eat it up. But now even amdrones would probably buy cheap amperes/moronna2.
That's fricking moronic and you're fricking moronic.
If you have the inferior product then you want to get out ahead and start selling them before your competition lets everyone know they've got a better product. Then all you can do is slash prices, but in the meantime you've at least made some sales at full price.
Which is exactly what it seems like NVIDIA is doing. Zero solid numbers given, just a hand-wavy 2-4x (under extremely specific conditions). My guess is actual performance is ~30-50% better when not relying on DLSS or RT.
When?
RDNA 3 reveal is on Nov 3
Just release an efficient rasterization card for 300-400. They'd gain an absurd amount of market share.
300 bucks. 3060 Ti performance. All it would take, but no one can or wants to do it.
Bro the 3060ti should only have cost $270. And that was 2 years ago.
The brass balls of Nvidia, to announce a new product line and at the same time market last gen's cards too. Listing the same 3060 MSRP as it had at launch 18 months ago. This is what happens when a company has too much market share.
True. But Nvidia is also confident AMD won't have anything to match and from the presentation and advances with ray tracing and dlss I can see why Nvidia thinks they can continue to overcharge.
It's really solid software and tech that AMD in no way can match up to.
If you want the best GPU and feature set it's still Nvidia. And they know you'll pay for it even though there are millions of used GPUs on the market.
Everyone is now hoping AMD will deliver, but it will be, like always, Another Massive Disappointment.
RDNA2 was great. AMD has consistently been delivering since 2019, or even arguably 2017.
>RDNA2 was great.
>Lost to 30 series in Ray-tracing
>tried to shill FSR but got fricked and tried to shill 2.0
IF 7000 is going to be better than 4000 in RT and will have something close to DLSS 3, maybe it will be great
>Lost to 30 series in Ray-tracing
No one cares. You don't play video games. RT adds no value. RDNA2 won where it matters: Price to performance, power efficiency and VRAM.
is that why AMD's marketshare is in the gutter?
true
>lost to Novidya in something that's an explicit Novidya gimmick and an AMD afterthought
Enjoy managing a simulation of your Amazon warehouse or whatever, Jensen.
>muh ray tracing
Rumao
Ppl still care about this shit?
>omg guis 10% better shadow with half the frame rate
Cope.
AMD should end up beating Nvidia on price alone, and the 6000 series was a solid competitor outside of 4k. Considering the level of improvement the 6000 series was, I don't think it's unreasonable to think that RDNA3 could work out.
Ray tracing destroys game performance; it's pointless to enable ray tracing if you aren't paying 2k dollars for a GPU (and at that point you would want to play at 4k, which the current GPUs can't handle with RTX enabled either way).
The only reason to buy nvidia now is if you want their meme AI garbage
if you want 4k with everything maxed out while having DLSS 3 turned on and full raytracing, then even the 40 series won't be able to pull that off. screenshot this, because dollars to doughnuts, the 40 series won't be able to pull consistent 70 frames per second in the most demanding games.
>Gaytracing
why yes i'd like to halve my framerate for a gimmick
DLSS 3 doesn't even support the 3000 series lol, massive self sabotage from novideo.
If you care more about rasterization amd is already the better buy today.
I wish it had more gpgpu support but I can't be assed to care about upscaling videogames.
>Gay tracing
Who cares
>FSR
Good open source alternative that can be implemented everywhere
>DLSS 3
Ah yes, using AI to interpolate fake frames instead of having a higher framerate.
Say whatever you want to say but Frame Interpolation is fundamentally BAD, don't try to shill that shit here on IQfy.
It is the only way to reach HFR.
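The frame-interpolation complaint above can be made concrete with a toy sketch. This is naive 50/50 blending, not how DLSS 3 actually works (it uses optical flow and game motion vectors), but it shows the structural point: a generated frame is derived entirely from already-rendered frames, so it cannot contain input that arrived between them.

```python
# Toy frame interpolation: blend two frames 50/50.
# DLSS 3 is far more sophisticated (optical flow + motion vectors),
# but the generated frame is still computed from existing frames,
# which is why interpolated frames don't reduce input latency.

def blend_frames(prev_frame, next_frame):
    """Return the naive midpoint frame between two frames of pixel values."""
    return [(a + b) / 2 for a, b in zip(prev_frame, next_frame)]

rendered = [[0, 0, 0, 0], [8, 4, 2, 0]]        # two real frames
fake = blend_frames(rendered[0], rendered[1])  # one interpolated frame
print(fake)  # [4.0, 2.0, 1.0, 0.0]
```

The perceived frame rate doubles, but the game still only sampled your mouse on the real frames.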
>Ray-tracing
>RDNA2 was great
RDNA2 owner here, NO.
Nvidia is moronic but AMD has started to get greedy too, so expect prices to stay near Nvidia's.
whats the problem
AMD Drivers. Video decoding makes the browser flash white, sometimes. ROCM is shite in SD.
> ROCM is shite in SD.
i don’t see the issue, runs fast enough and amd cards have plenty of ram
I've had less trouble with AMD drivers on Windows than Nvidia.
But regardless, I use Linux and only use Windows to play incompatible games.
I'm a RDNA2 owner too. What's the problem? Been serving me right.
Nvidiots keep harping on RT performance being the only metric that matters even if their own cards choke on it and are forced to disable it to keep-up FPS.
>RDNA2 was great.
Right, correction: RDNA2 is still great. 6950 XT is the king of 1440p.
my ryzen 3700x almost outperforms my 5700xt in stable diffusion
cpu is 3.5s/it
gpu is 3.3s/it
5700XT is RDNA1, moron.
When you run SD on the AMD card, does the GPU peg at 100% usage?
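Those s/it numbers are easy to sanity-check by timing the sampler loop yourself; a minimal sketch (the `fake_step` workload is a hypothetical stand-in for one diffusion step, not real SD code). When a 5700 XT at 3.3 s/it barely beats a 3700X at 3.5 s/it, that usually points at a CPU fallback path rather than real GPU execution.

```python
import time

def seconds_per_iteration(step_fn, steps=5):
    """Time a step function and return the average seconds per iteration."""
    start = time.perf_counter()
    for _ in range(steps):
        step_fn()
    return (time.perf_counter() - start) / steps

# Hypothetical stand-in for one diffusion sampler step:
def fake_step():
    sum(i * i for i in range(100_000))

print(f"{seconds_per_iteration(fake_step):.3f} s/it")
```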
it'd be basically the same performance and $50 cheaper than Nvidia, then everyone will continue to buy Nvidia anyway because drivers. rinse, repeat. money speaks
yep, welcome to the real world, sonny boi. the politicians lied to you when they told you competition would be good for your wallet.
AMD needs a DLSS alternative. Some AI softwares too
This. Nvidia are scummy but future proof, and if the gaming market can't find a good use for their overpriced technologies the commercial market will. Meanwhile AMD are trying to catch up in the gaming market and have nothing beyond that. Ideologically they live in 2015.
>future proof
their new version of dlss doesn't work on ampere, how is that future proof
Cry harder
Future proof as in actively advancing features that have a tangible commercial application X years from now, instead of just trying to beat their competitor in a gaming benchmark like it's 2005.
incredible, the siemens ad actually works on people buying gaming gpus.
datacenter applications are all about tflops and efficiency, not gimmicks
That's why Tensor cores are a gimmick, right?
they're great if you need a lot of fp16 compute, it just doesn't sound as cool and revolutionary if you put it like that
Except I'm not buying them because I realize I'm not the target audience. Which is the entire point I was making. Nvidia have the better business leverage currently because they're putting more focus on a gigantic world of commercial applications, while AMD are still chasing after the limited desktop gaming market with seemingly no plan B for any other application. Just because my next card will be AMD doesn't mean AMD Radeon are currently set up better as a business.
AMD is doing fine in HPC with CDNA
Better buy the card capable of simulating your job as an Amazon warehouse employee then. Enjoy meeting your digital twin!
RAW FPS is and always will be the only real metric to go on and not some meme tech like DLSS.
>23FPS with goytracing
See this shit
They will barely match the high end on raster or price/perf and RDNA3 isn't even out yet.
Gigacope. Benchmarks haven't come in yet and people are already making wild claims that 4080 = 6900 XT (More like 3080 = 6900 XT)
it probably could be made to work, but then you would most likely get shitty DLSS performance, and would you want that? why would anyone want that?
But it does. It's a very specific feature that doesn't.
No it doesn't.
https://www.theverge.com/2022/9/20/23362990/nvidia-dlss-3-0-demonstration-ada-lovelace-graphics-cards-upscaling-technology
Yes it does
what's reflex?
>Alongside our new GeForce RTX 30 Series GPUs, we’re unveiling NVIDIA Reflex, a revolutionary suite of GPU, G-SYNC display, and software technologies that measure and reduce system latency in competitive games (a.k.a. click-to-display latency). Reducing system latency is critical for competitive gamers, because it allows the PC and display to respond faster to a user’s mouse and keyboard inputs, enabling players to acquire enemies faster and take shots with greater precision.
Some latency bs
>Ideologically they live in 2015
Good. They should ideologically live in 2005. Make a card that can push a ridiculous amount of pixels as fast as possible, and nothing more.
FSR 2.1 isn't too terrible, it's not on top but it's not like it is leagues apart from DLSS 2
The real question is if chiplets are a thing this generation and if they get a new video engine and fix ray tracing performance next generation. Since the nodes are the same, it will be a good time to see how behind AMD is to Nvidia. The fact that they only have raster performance locked down and the drivers are still lower quality than Nvidia except on Linux doesn't bode that well.
>AMD needs a DLSS alternative. Some AI softwares too
I think AMD has made good ground with FSR. I think they should carry on the algorithm route. Just like they did with freesync.
It's free for everyone to use and far easier to implement.
Nvidia does amazing work but they do it to flex. They seem to do things the hard way rather than the easy way.
Fricken grandma's been on the roids
Why would you waste money on an inferior piece of technology?
AMD doesn't give a shit about their high-end GPUs. They make bank on CPUs and console GPUs. High-end GPUs are a waste of silicon to them. They'll allocate little supply and set high prices.
I have a 6900XT and won't upgrade this gen.
RT performance is fine imo, the only games I use it with are Quake 2 and Doom Eternal though, soon Portal and maybe Morrowind, we'll see how that RTX mod support turns out on linux.
Intel is pretty close to Nvidia in relative rt performance, RDNA should also catch up quite a bit.
DLSS is a bit nicer than both XeSS and FSR2, but it's hardly game-changing and you see now how the proprietary nature means it'll turn worthless on a whim when Nvidia wants to market a new gen.
i believe in amd supremacy
The final solution to the nvidia problem
Big oof, how did amd not catch this?
shut up israelite. It's intended.
Imagine being antisemitic in 2022... I'm so sorry for you.
dabbin those spinning R's on u
Press R to disagree
AMD literally dunking on nJudea
>Imagine being antisemitic in 2022
Imagine being so sheltered and ignorant that you're not!
Wild guess, but isn't it rather because it's being recorded on a phone?
it's when you take a video and the framerate happens to match the rotation in a certain way (quarter turn between every frame or something). similar to those videos with a helicopter where the rotor doesn't appear to turn while they fly. you can't see that live, it'll just be blurry.
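The stroboscopic effect described above is simple arithmetic: the camera only samples at the video frame rate, so whole turns between frames are invisible, and anything past half a turn reads as backwards rotation. A small sketch:

```python
def apparent_rotation(revs_per_sec, fps):
    """Fraction of a turn a rotor *appears* to advance per video frame.

    Classic wagon-wheel / stroboscopic aliasing: the camera samples at
    `fps`, so full turns between frames are invisible, and more than
    half a turn is perceived as spinning backwards.
    """
    frac = (revs_per_sec / fps) % 1.0
    return frac - 1.0 if frac > 0.5 else frac

print(apparent_rotation(60, 30))  # 0.0 -> rotor looks frozen on video
print(apparent_rotation(59, 30))  # ~ -0.033 -> appears to spin backwards
```

Which is why a fan (or the spinning R logo) can look frozen on a phone recording but just blurry to the naked eye.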
>Big oof
go back kiddo
damn, if the graphics card was a little longer and took up the entire case it would look very aesthetic.
wtf I love AMD now
No price will compensate for the lack of CUDA
Sadly this. My 970 still plays games fine but what I really want is more waifu iterations per second.
>implying the cheapest amd gpu wont be $490
deluded
4090 $ 1,999
4080 $ 1,699
4070 $ 1,199
Australian dollars
Australian prices have not been released.
Expect 1.5x or even 2x increase.
Yes they have.
They are rebranding themselves as a luxury brand.
>starting
Nvidia lost it; even the 4090 is cut down as frick
>not a single 48GB vram card
Gay as frick
Even 32 GB would have been great for the top end. Maybe in 2 more years™.
>Even 32 GB would have been great for the top end. Maybe in 2 more years™.
Man I hope rdna 4 has more vram
16gb on my 6900xt isn't gonna be enough
leaks show that 7900XT has 24GB vram
>32 GB Vram
FOR WHAT!??
Do you have any idea how fricking meaningless VRAM is for gaming once you reach a certain threshold? The only spec that tells you jack shit about raw performance is tflops, which Nvidia advertised poorly, so brainlets see 16GB>12GB gigashits and think we need more
>for gaming
No, thanks, I have a job. 32GB is for compute.
What games? There isn't a single game worth playing that you can't play on a 10x0. The new cards are for AI.
>3080Ti consistently higher 1% lows
top kek, is this a general phenomenon or did this guy do something weird with his setup
>same rasterization performance but minimal gimmicks (and open source anyway)
>excellent Linux drivers baked right into the kernel
>cards that aren't overvolted out the wazoo
>more sanely priced
Yeah, I'm thinking AMD's back.
Freetards are a different breed, I swear.
>DLSS now locked out to anyone but the people who will take out a loan to buy the latest housefire cards because Nvidia knows they have nothing else going for them
Absolutely no technical reason why it shouldn't work on at least the 3000 series, but hey, Nvidia knows they need to pull Intel israelite tricks if they want to be seen as the "best" by morons, regardless of whether the product catches fire or not.
>DLSS now locked out to anyone but the people who will take out a loan to buy the latest housefire cards because Nvidia knows they have nothing else going for them
Do Nvidia train their deep learning in every available DLSS version for compatible games, or only in the most recent one? In other words, can you also play a DLSS 3 compatible game with DLSS 2, or only with DLSS 3?
Nobody knows because DLSS 3 doesn't exist yet
DLSS 3 is frame interpolation, not upscaling. It isn't even the same technology as 2
Yeah but if a developer pays to implement it, can't you train your thing for both technologies simultaneously?
>they combined optical flow with the tensor cores for bullshit motion interpolation and this somehow justifies overpriced cards
BRAVO NVIDIA
NVIDIA defines the GPU market. They can do whatever they want. If you want that to change, stop buying their fricking products.
How do I do that when I'm Stably Diffusing waifus?
>inb4 ROCM
Maybe they should get it working properly then.
This. Between stable diffusion and the upcoming RTX Remix porn mods, Nvidia has the coomer niche cornered.
AMD has to step up their game.
what doesn't work properly?
>Maybe they should get it working properly then
It works properly. Stop spreading shill talking points.
The best part is that the 4080 12GB might just barely match the 6900XT 16GB right now, which has more VRAM and is currently being sold at $700
It's both, it's just DLSS 2 with frame interpolation added after upscaling.
TAA still there? Into the trash it goes.
>more sanely priced
I'll believe it when I see it.
>can't run AI
equivalent product will be $50 cheaper best case scenario while having worse ray tracing performance and fsr is still shit
what a bargain, mama su saves us yet again
b-but muh FOSS drivers
Correct. AMD has shareholders to answer to.
Still have my never obsolete GTX 1080.
Same here, with a 6700K. I replaced DLSS with FSR 2.1 in Cyberpunk 2077, and now I know why Nvidia didn't allow DLSS on the GTX 1000 series.
>tfw not a ramen-eating wagie and can afford Nvidia
Feels so good, bros.
I once bought a motorcycle on craigslist for less than the cost of a 4090. I am not willing to pay more for a GPU than I did a vehicle I rode for 5 years.
when is AMD's event??????
Rumor has it RDNA3 goes to 4 GHz
The reality is AMD GPUs have a long way to go before being able to match Nvidia in features, when you buy an Nvidia GPU you will pay the Jensen Tax™ but it's undeniable that you can do a lot more with it.
Sure if you are only interested in games and pushing more pixels without any trickery and graphical gimmicks AMD is fine but for everything else they are behind.
FSR needs more work to compete with DLSS, AMD's media engine is not as good, especially their encoders, and when you factor in AI, SDKs, libraries and CUDA it becomes clear that they are far apart. Hell, leather jacket man spent 90% of the presentation talking about self-driving cars, simulation systems, modelling systems, development frameworks, a full Nvidia software stack. Does AMD have an answer for any of that? It's easy to see why Nvidia GPUs are gobbled down by enterprise and academia, while AMD ones are basically nonexistent.
>FSR needs more work to compete with DLSS
Who cares? You don't buy a $2000 GPU to upscale games.
>AMD media engine is not as good
It's fine. Not quite as good, but the difference doesn't matter unless you're a professional streamer. For offline transcoding or professional work you would be laughed out of the room for suggesting GPU encoding.
>and when you factor in AI, SDKs, libraries, CUDA
Irrelevant for consumer cards.
High-concentration cope.
High-concentration truth. Facts don't care about your feelings.
>Irrelevant for consumer cards
Then why are the best consumer LCZero and Stable Diffusion setups based on CUDA?
You don't buy a GPU to generate anime breasts for a week before you get bored.
But you don't even have a job, you stinky NEET.
Trying to insult me doesn't change reality. I don't understand why you deny the obvious truth. Seriously
I didn't deny anything. You don't need the PRO(tm) features because you don't have a job.
I know, as of today it also does that frame interpolation thing that shitty TVs do.
VMAF: https://github.com/Netflix/vmaf
AMF: AMD Advanced Media Framework
NVENC: NVIDIA Encoder
QSV: Intel Quick Sync Video
Thanks.
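All four acronyms in that list are reachable through ffmpeg; a sketch that only builds the command lines (assuming an ffmpeg build with libvmaf and the relevant vendor encoder compiled in, which is not guaranteed on stock packages):

```python
def vmaf_command(reference, distorted):
    """Build an ffmpeg command that scores `distorted` against `reference`
    with Netflix's VMAF metric (needs an ffmpeg built with libvmaf)."""
    return [
        "ffmpeg", "-i", distorted, "-i", reference,
        "-lavfi", "libvmaf", "-f", "null", "-",
    ]

def hw_encode_command(src, dst, encoder="h264_nvenc"):
    """Build a hardware H.264 encode command; swap the encoder name for
    h264_amf (AMD AMF) or h264_qsv (Intel Quick Sync) by vendor."""
    return ["ffmpeg", "-i", src, "-c:v", encoder, dst]

print(" ".join(vmaf_command("source.mp4", "encode.mp4")))
print(" ".join(hw_encode_command("source.mp4", "out.mp4", "h264_amf")))
```

Encoding the same clip with each vendor's encoder and running VMAF against the source is roughly how the "AMD's encoder is worse" comparisons get made.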
Yes. Nvidia is for the PRO(tm) experience. The "A" in AMD is for Amateur.
If your understanding of DLSS is that it just 'upscales games', you might be more moronic than you initially appeared.
What the frick is a VMAF, a AMF, a NVENC or a QSV?
You don't buy a gpu to upscale games at all.
People that use DLSS are the same people that use FXAA and say it's the best thing since bread, being so clouded that they don't even see the smear it makes when anything moves or the extreme lack of detail and shimmering that is introduced.
>People that use DLSS are the same people that use FXAA and say it's the best thing since bread
People that play one game with a poor TAA implementation and play at 1080p and think that this is a blanket situation for every game with TAA are even more stupid.
>makes when anything moves or the extreme lack of detail and shimmering that is introduced.
Not a problem if you aren't playing at 1080p. DLSS has nothing to do with TAA as well. In many games running DLSS with no sharpening or TAA is preferable to many engines shitty default TAA implementation. Learn the difference between the two technologies before you talk shit about other people.
TAA can't be implemented well; it's frame-averaging smoothing, shit by design. Worse than motion blur.
If you don't see how blurry TAA makes things you need to get glasses, I'm serious. Even at 4k it destroys all form of sharpness.
>Learn the difference between the two technologies before you talk shit about other people.
Your reading comprehension is as good as your taste in anti-aliasing technologies.
>DLSS has nothing to do with TAA as well.
It is literally TAAU but using a neural network model to control the upsampling as opposed to a simple filter.
>TAA
The frick is TAA?
Anti-aliasing at 4K and higher is simply not required.
The problem here is that no upscaler gives a pixel-perfect, identical image, and what's worse, journos are shilling for it. "look how well it feeds us 1080p sources, we don't even notice" is the new tech meme. People stopped caring about raw perf and a crisp image; what's even the point of 8K then.
>professional streamer
Name one big streamer that cares about encoding.
Linus Sebastian.
He's not even a small twitch streamer. Big youtuber shill, but his twitch streams barely pass 300 people on average.
>Irrelevant for consumer cards.
Which is why these features are present in said consumer cards and why a lot of profesionals and companies buy them. Huh.
What the frick happened to the days of 300 or so for low, 400-500 for mid and 500-600 for high end cards? I think I paid around 500 or so for my RTX 2070 in 2019(?) and I'm content with keeping that until prices return to what I would consider normal (if they ever do).
Increasing production costs with node shrinks, inflation and greed.
Mainly greed.
do you have any idea how much inflation the western countries have amassed these last couple of years? do you have any idea how little our money is actually worth right now? our dear leaders work their asses out of their pants trying to get the inflation "under control" but it's not really working they way they want it to. this is just the beginning, wait a couple of more years and the great reset will begin in earnest. when you all go nutzo and kill all your politicians, don't forget to also kill all the israelites behind them. especially the rothschild israelites.
Yeah, remember how much you could've bought for a dollar in the 1930s? These times are long gone and will never return. Inflation never reverses, and if it does, it's even worse.
>400 series
>a massive fire hazard
>4000 series
>a gigantic fire hazard
I'm starting to see a pattern here with the number 4.
>I'm starting to see a pattern here with the number 4.
> asians leave out the number 4 out of superstition due to it sounding like death
GeForce 4 series was kino, though.
Novidya will always be housefire garbage in real world performance.
>VRAMlet
Into the trash IQfyermin
In Austria they capped electricity to 10ct/kWh. Heating with a 4090 is cheaper than burning gas.
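The electricity-vs-gas claim is simple arithmetic; a sketch with the capped 10 ct/kWh electricity price from the post (the gas price and boiler efficiency below are assumed placeholder figures, not from the thread):

```python
# Cost of one kWh of *heat*, electricity vs gas.
# Resistive heating (a GPU dumping watts into the room) converts
# electricity to heat at effectively 100% efficiency; a gas boiler
# loses some heat up the flue.

ELECTRICITY_EUR_PER_KWH = 0.10  # Austrian price cap cited above
GAS_EUR_PER_KWH = 0.15          # assumed placeholder price
BOILER_EFFICIENCY = 0.90        # typical boiler ballpark (assumption)

cost_electric_heat = ELECTRICITY_EUR_PER_KWH / 1.0
cost_gas_heat = GAS_EUR_PER_KWH / BOILER_EFFICIENCY

print(f"electric: {cost_electric_heat:.3f} EUR/kWh of heat")
print(f"gas:      {cost_gas_heat:.3f} EUR/kWh of heat")
# Under these assumptions, heating with the GPU is indeed cheaper.
```

Whether it actually holds depends entirely on the local gas price exceeding ~9 ct/kWh.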
Memes aside how are the AMD drivers at this moment of time?
I've been reading it continually improves but how does it compare to Nvidia nowadays when it comes to both AAA and indie games?
>Memes aside how are the AMD drivers at this moment of time?
On windows?
>functional, but worse video encoding performance, and the lack of day-1 game drivers is a gamble
on Linux?
>Quite good, baked right into the kernel so it should just work. Tends to run cooler and be more performant than on Windows.
>Lack of day 1 game ready drivers
AMD fixed this issue in 2019 or earlier.
>5500 XT and 6800 XT user
this is a terrible situation. amd can make cards 10 to 20% cheaper compared to nvidia and it will still be at a moronic price.
The redemption is coming.
Five hundred and ninety nine US dollars
It will be more like
>We will adjust our prices according to yours
>Oh nononono, we were going to charge only $1000 for the RX 7950
>Now it is just $1500, $100 cheaper than yours.
AMD really needs to do something about CUDA. They need something open source to replace the CUDA black box NVIDIA is using.
>They need something open source to replace the CUDA black box NVIDIA is using.
They already have that, it's just that it isn't as good as the original thing.
Either AMD prices their new GPU line very aggressively, or it's still a no-brainer to get an Nvidia GPU. Their drivers also suck, so yeah, if they don't even have good pricing AMD GPUs will have nothing going for them.
>it's just that it isn't as good as the original thing.
And never will be. On top of their poor support, slow driver releases, and lack of libraries, they've made it clear that RDNA and any consumer cards are secondary to their CDNA enterprise cards.
CUDA is already deeply entrenched in all professional ML
It's over. You'd have more luck getting corporate off windows
Where is your CUDA equivalent again?
https://github.com/RadeonOpenCompute
Can I run PyTorch on it on Windows?
Officially only on Linux atm, but I think you might be able to get it running on Windows with WSL.
>WSL
So only on Linux
Though to be fair the ROCm source has many indications that Windows support is in active development.
Dunno. But Windows is a toy OS.
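For what it's worth, ROCm builds of PyTorch reuse the `cuda` device API, so most CUDA-targeted scripts run unmodified; `torch.version.hip` is set instead of `torch.version.cuda`. A guarded sketch (the classifier is a hypothetical helper, not a PyTorch API):

```python
# ROCm builds of PyTorch expose the GPU through torch.cuda, so
# "is this CUDA or ROCm?" is answered by the version attributes.

def backend_name(cuda_version, hip_version):
    """Classify a PyTorch build from its version attributes (helper
    for illustration; pass torch.version.cuda / torch.version.hip)."""
    if hip_version:
        return "ROCm"
    if cuda_version:
        return "CUDA"
    return "CPU-only"

try:
    import torch
    print(backend_name(torch.version.cuda,
                       getattr(torch.version, "hip", None)),
          "| GPU available:", torch.cuda.is_available())
except ImportError:
    print("torch not installed; a ROCm build would report:")
    print(backend_name(None, "5.2.0"))
```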
>doesn't work on the latest Ubuntu LTS
t-thanks, AMD
>Where is your CUDA equivalent again?
All the software I use is OpenCL optimized.
They'll cut Nvidia by 10% and leave any possible market share gain off the table.
It's been like this since forever; if you think otherwise you're simply deluded.
Is it time for Massive Noavi?
she should really have that growth on her left cheek removed, it looks malignant
That's a headset microphone.
y..you're a headset microphone
For you.
gpus are getting bigger and bigger. what the frick is this shit. i dont want to buy a cupboard and a powerplant to run a 40xx card
>i dont want to buy a cupboard and a powerplant to run a 40xx card
then you simply don't want top of the line performance, which is ok
RDNA3 is going to be a disappointment for desktop, what I'm looking forward to is the mobile sector. We're about to get some real potent handheld and laptop APUs with almost 1060-tier performance using only 15W.
Yep, this is where iGPUs begin their path of demand destruction while discrete GPUs move upmarket.
AMD will have the same price increase as Nvidia did because of TSMC, stop being poor
no it won't. AMD is using N5, not N4, and they're using half the die area NVIDIA is, plus N6 for cache; they can charge less and still make high margins.
and yes NVIDIA is just gouging.
Save us GPU chinkmother.
Can someone validate the claim in this video? Also, this was published in January, and I believe AMD had an update in August that boosted Machine Learning performance using DirectML. So, is AMD a viable option for ML now?
No, AMD is in no way close to being viable for GPU compute. They hardcode their Python dependency for ROCm to 3.8, while at the same time not shipping Python with ROCm.
Their level of incompetence is astonishing; this is something a beginner could fix in a day.
>They hardcode their Python dependency for ROCm to 3.8, while at the same time not shipping Python with ROCm.
In 2022, you download and unpack the required Python version via Ansible/Puppet/Salt yourself and put it in /usr/local/bin and lib, together with running update-alternatives. Or venv. Or use package manager packages if versioned. Why ship something that will for sure become vulnerable, when people install updated packages by themselves all the time?
ROCm supports "Ubuntu LTS" (what exactly this means is not mentioned), and the latest LTS comes with Python 3.10. Due to the hardcoded dependency on 3.8 they can't support the latest "Ubuntu LTS". The code itself has no issues running on 3.10, it's a packaging bug they can't fix for almost a year.
Let that sink in – the company you expect to "enable AI" and give you tools and drivers for GPU compute can't figure out how to change a stupid dependency... IN A YEAR.
>inb4 Jammy has only been out for 6 months
https://github.com/RadeonOpenCompute/ROCm/issues/1612
Probably 18.04. VMware Greenplum supports 18.04, every corporate thing like Citrix supports 18.04 (hope they do, they were on 16.04 a year ago). If 20.04, that's a step up for sure already.
> Let that sink in
Dude, I swim in that shit, don't even start.
Thats a man
MA'AM
IT'S MA'AM!
*hits you with a bat*
PC hardware is becoming a status symbol, like owning the latest iPhone.
people will just buy the 4090 to say "look, I got the 4090", just to play at 1080p 60 FPS
for a tiny subset of rich people. most of them will still get nice cars and watches like they used to.
the 40 series will be an absolute disaster. miners don't want them (the perf per dollar and watt are abysmal, mining is likely dead anyway after ETH finally walked the walk) and scalpers have already taken too many losses on the 30 series to try again so soon. your favorite tech youtuber and streamer will get one, but absolutely everyone else will skip this and maybe get a used 3090 instead for a quarter of the price and 80% of the performance.
It's likely AMD would have considered launching the 7900XT at 1500 if Nvidia had chosen 2000 for the 4090, but they went more aggressive.
There's also no chance the 4080 price sticks, it's ridiculous, once 3080 stock clears they'll have to release a Super series to rectify prices.
Still buying a 4090. Still running textual inversions for SD. Sorry seething neets
the 4090 is the only one you should buy if you have the need, both 4080 skus are just jensen pissing in your face
Could be. I'm not poor so i just need VRAM for the future of art
You homosexuals still think AMD is going to save you after what happened with Zen 3.
RDNA3 is the Zen2 of Radeon
It's in their own interest to gain back market share. They have a chance right now to decimate the 4080 in price/performance, which is arguably more important than the performance crown
Is amd releasing new cards soon? if so when?
November 3rd
frick
nvidia on windows, radeon on linux
drivers are the most important thing
>here's your shitty drivers anon
No thanks, even if RDNA3 was 30% faster than 4090, I still wouldn't buy it because I don't want to deal with AMD headaches.
>le bad drivers
why do redditors parrot this
I had an AMD card for 4 years, the mediocre driver shit is true. Their drivers simply aren't as good, their driver team isn't as big as Nvidia's. Why do simple facts make you people seethe so much? Who cares?
If you think AMD has bad drivers then you haven't seen nvidias shit yet
I had an Nvidia card for 4 years, the mediocre driver shit is true. Their drivers simply aren't as good, their driver team isn't as big as AMD's. Why do simple facts make you people seethe so much? Who cares?
I ran Nvidia cards from 2015-2019 (970 and 2060) and had several instances of driver issues, mainly blue screens and broken Freesync. I've run AMD since 2020 (5500 XT and 6800 XT) and my only issue is AMD Software will sometimes close randomly when adjusting settings. So both have issues, less severe on the AMD side.
>Freesync
oh I see you're just lying
Same. The "Nvidia is more reliable" meme is just Nvidia marketing; both my laptop's Nvidia card and my 2070 have had shitty problems with bluescreens and suspend
Maybe I'll just stick to integrated graphics, buy an Xbox, and get one of those keyboard/mouse adaptors. GPU pricing has gone completely insane over the last 6 years.
yeah, I was kinda thinking along those lines a few months ago, but then again I play way too much Paradox shit like HoI4 and EU4 for a console to make a lot of sense. In the end I went for an RTX 3070 Ti system and hope for the best. Now they say my PC should arrive Friday next week, so it seems like the waiting never ends
RDNA 2 in every Zen 4 Ryzen has me curious.
Get an NH-P1 and NVMe drives in RAID 0, and you'll have a pretty insane workstation.
AyyMD will raise their prices as well just because they can, duopoly sux.
No fast video encoding, no DLSS, into the trash it goes. I'm a proud oculus link cuck.
FSR 2.0 is pretty great.
and it runs on everything so it doesn't leave GTX 980ti/1000 series chads to rot
now post it in motion with all the dogshit ghosting
lol you think they can pump out 100 times more GPUs this time and for a "fair" price?
The prices will be as "fair" as the new X670E boards. Open your butthole for the 7900 XT starting at $1,299
Kinda glad I only spent 1k on my 6900xt
If RDNA 3 pricing is stupid I'll wait for RDNA 4, since it will have to compete with a shrewder Nvidia and Intel
I just want a PURE RASTER CARD FOR FRICK'S SAKE! I want an x70-class card for $350-400 like the good old days. Even Shintel is doing meme ray tracing shit, so they can't have low prices.
So go get a 6800xt?
Apparently the 4090 can do one 512x512 SD image per second
7700XT $500
Some drones are unironically writing this after the AM5 prices. LMAO
The AM5 7XXX desktop CPU prices are mostly reasonable, except at the lower end. Also, how AMD decides to price RDNA3 will depend on how aggressively they want to take market share from Nvidia. If they believe they can sell through everything they make at Nvidia-equivalent prices minus $50, AMD might decide to do that. But if they are focused on stealing GPU market share this generation, I wouldn't be surprised to see somewhat aggressive pricing to make the 4080 and the "4080" look worse.
For aggressive market-share accumulation you need stock. Every GPU stat that exists shows AMD can't even pump out 1/20 of the stock that Nvidia has. They would need to pump out millions more GPUs this time around.
I'm not holding my breath
I don't think they'll compete with the 4090; they never make the best GPU in any given cycle.
Maybe they'll be competitive with Nvidia's last gen lol
What's everyone's favorite pejorative for Nvidia? I'm between Njudea and Shitvidia
? They are the White People of GPUs, there's no slur. Meanwhile AMD are the pajeets
Novideo
Their fan bois are Nvidiots
In order for AMD to win
>meet Nvidia performance in trad rasterization
>keep improving FSR
>keep improving their hardware encoding
>fix their fricking Windows drivers and finally pay a team to do day-1 game driver updates.
You can sidestep the driver issue if you're running Linux. That leaves it to them just staying the course and focusing on pure performance and efficiency without wasting time on motion interpolation gimmicks or other consoomer shit.
The AMD 7000 GPUs are going to be on the 5nm process, right? And isn't Nvidia on 4nm, which is more expensive?
And RDNA 3 should have at least a few models that are chiplet based, which can help a lot with costs as well.
I'm not saying AMD will be the good guy and pass real, major savings onto the customer, but there is a shred of possibility, right? I'm still not getting my hopes up because the GPU market has been absolute cancer for 6 years.
A 7900 XT at $999 with roughly 75% of the 4090's performance is all they need.
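The value math behind that post can be sketched in a few lines. The $999 and $1,599 prices and the 75% figure are the thread's hypotheticals, not benchmarks:

```python
# Hypothetical price/performance comparison: a $999 card delivering
# ~75% of the performance of a $1,599 4090 (4090 relative perf = 1.00).
# All numbers are illustrative guesses from the thread, not measurements.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance delivered per dollar spent."""
    return relative_perf / price_usd

rtx_4090 = perf_per_dollar(1.00, 1599)
hypothetical_7900xt = perf_per_dollar(0.75, 999)

# The cheaper card wins on value if this ratio comes out above 1.0.
value_ratio = hypothetical_7900xt / rtx_4090
print(f"value ratio vs 4090: {value_ratio:.2f}x")
```

At those assumed prices the $999 card delivers roughly 20% more performance per dollar, which is the whole "all they need" argument: lose the halo benchmark, win the value chart.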
Why do you think it'll only be 75%? The 6900XT matched the 3090 in a lot of things.
720p gaming doesn't count, Chang.
Here's your 4k benchmark bro.
>"""4K"""
>3840x1600
>Low
I got you bro. Notice that this only has up to the 6800XT
>I'm not saying AMD will be the good guy and pass real, major savings onto the customer, but there is a shred of possibility, right?
eh. A little bit
>the GPU market has been absolute cancer for 6 years.
You can thank the whole "le pc master race XD" meme marketing strategy. When PC gaming was just "pc gaming" and the ugly duckling compared to consoles, we actually had it good and didn't know better. It was actually worth playing on PC because there was no status attached to it.
I'm very aware of the PC Mustards and the damage they did. The R9 290 was my first GPU. $320 on Newegg after a $60 rebate. This talk about a mid tier GPU costing $500 or more is moronic.
TSMC's 4N isn't 4nm, it's a refined 5nm node.
RDNA3 is 5nm and 6nm, the cache is 6nm and the GCD is 5nm.
7600XT is monolithic
Don't care, I'm buying an Arc A770
I just wish PyTorch supported oneAPI (Intel's equivalent to CUDA) so I could run Stable Diffusion on it.
Just use the ONNX models
The AMD prices will also be shit, won't they...
It's over
It's what we call "competitive".
Competitive to whom? You think the average Joe has $1k lying around to spend on GPUs, homosexual.
You didn't get the joke, try now.
>competitive
As in they compete with each other at being shit.
nV clearly doesn't care what AMD does; they play their own game, and AMD also won't lower the price if they can sell high.
To another competitor, duh. Ask the same question about a wage "compensation".
AMD doesn't do their own thing. They are hyper-aware of the current markets and will adjust their prices accordingly. They don't seem too interested in disrupting the markets. We need the Chinese for that. The GPU market has not seen its Android phones yet.
> comparing silicon chips to assembled micro computers
Maybe you mean "have not seen their Mediatek yet."
yeah, enjoy your 50 bucks discount. man, competition sure is grrrreat!
AMD really has us stuck between a rock and a hard place. I'm actually stressing over this a bit. Maybe I'll just buy an Intel CPU and use their decent integrated graphics instead.
honestly, without a VRAM bump, I don't see the point. I have a 3080 and I have yet to use DLSS. Tried it in some games, but unless I put it on the highest quality the gain is lackluster, and if I put it on lower quality, to actually get a proper boost, the game looks like shit.
>Want a GPU that won't empty my wallet
>Also really want good ray tracing
Used it is.
RDNA 3 will handily beat Ampere in RT
Not that Anon but I hope so, I'd love to get a slightly cut down top tier die and live the R9 290 dream I couldn't afford all those years back.
>R9 290
The only 1080p 60FPS card you ever needed for 7 years.
GCN was so good
RDNA3 beating Ampere in RT is all I really care about.
Ampere has good ray tracing perf in blender. That's all I need.
The 5800X3D is a good upgrade. It's 2.5-5x faster than your current CPU in games. Massive upgrade, and they've been on sale lately, plus DDR4 is cheap.
Since you lasted on a 4790K so long, I don't think getting the last-gen DDR4 platform will matter much for you. It sounds like you don't want to bother upgrading often, or you'd already have moved to Ryzen when the 3000 series came.
If you DO plan to upgrade more often, then DDR5 and Ryzen 7000 would be worth it since, as long as your board doesn't break, you can use the same board 5 years later.
>3070/TI
One of the worst offerings Nvidia has. The RX 6800 non-XT is a lot better.
The 3070 is only like 8% better than the 3060 Ti and it costs like 30% more. If you must get Nvidia, try to grab the $400 3060 Ti from Best Buy when it's in stock. That's a huge upgrade over your GTX 980, which is worse than the 6500 XT.
You could also wait for RX 7000 series. The 5800X3D should be powerful enough to power it at 1440p.
>one of the games I play doesn't recognize the recommended driver
>another one of the games I play crashes with the optional driver
Also what the frick is this
Laptop or desktop?
They are related you know.
None of the reddit threads ever mention this detail.
I don't have an account or I would stir that shit up
>I don't have an account or I would stir that shit up
You'd get permabanned for it within a few days at most.
nvidia and amd are family
they are not going to fight each other on prices
Homosexuals will go out and get it anyway. They will always go against their self-interest, when the fact is that NOT buying this shit is in their self-interest. Do severe damage to Nvidia so they learn a lesson.
Literally me anytime a new patch drops for a game
Literally me anytime a new standard drops for C++
Who pays for this?
the same people who had a 3090ti and are now getting a 4090
General question:
I was looking to refresh my PC, but the 4000 series sounds like cancer and Intel's Raptor Lake platform sounds like a mixed bag.
If I'm currently using a system with an i7 4790K and a GTX 980, which is now obviously a little old and power-hungry, is a 12th-gen Intel CPU and a 3070/Ti a good match to last me ~5 years of light gaming?
(I just want something that can do 1440p and still be crammed into a small case for a HTPC/couch)
If you don't care about 4k or raytracing then it's more than enough.
How about this?
Sorry I know it's not the thread for it, but thanks for the pointers.
Why the frick would you put a 5800X3D against a 3060 Ti? Pairing a high-end CPU with a low-to-mid-range GPU is a complete waste of money. The CPU will not be bottlenecking you unless you're playing lots of computationally expensive strategy games.
https://cpu.userbenchmark.com/Compare/AMD-Ryzen-7-5800X3D-vs-AMD-Ryzen-5-5600X/m1817839vs4084
Save your money and upgrade your SSD. Don't be a 500GB cuck; get at least 1TB. Also, you're in Oz, so go to MWave, they're having an end-of-generation sale atm. You'll get better deals across your entire spend and it will be a single delivery.
if you don't want power-hungry, don't go current-gen Intel
Get a 5800X3D. It's the best gaming CPU you can buy right now and should easily last you that long. The 3070 is alright, but the 6800 XT is around the same price and should be better than it in most areas. DLSS is very overrated. Upscaling in general is overrated.
anyone else jerk it to Lisa Su? or just me?