AMD literally can't sell their GPUs, Complete Nvidia dominance lads
https://www.pcgamesn.com/amd/radeon-gpu-sales-nosedived
AMD should legit just kill the Radeon division or spin them off into their own company to save their own financials and maybe it'd actually prompt them to make something that can survive in the market against Nvidia.
>AMD should stop competing with NVIDIA
lol lmao frick off glowie
Competing is a very strong word, I feel.
They are mostly just, well... existing. Somewhere in Nvidia's wake. Far, far behind.
That's all they need to do. Nvidia gouges buyers with overpriced crap like the 4060 Ti (the bare minimum for a graphics card today, and overpriced as hell), and they have no answer at the budget/mid-range.
Why are you pretending AMD is any better? Before the price drops basically nothing was worth it on AMD's side. It's better now but it took a year and a half to get there. AMD could have easily swept this generation if they weren't so greedy from the onset.
Why do you type like a homosexual? You type like a frickin' upper class bug-person who makes fluff pieces for the latest atrocity that needs to be backed by "public opinion".
They aren't competing. They're just burning money. AMD dGPU is worthless. They should shove all their money into APUs now handhelds are taking off.
>stop
They would need to start in order to stop.
They aren't competing with NVIDIA with their dedicated GPUs though? The shit is just a waste of money for them.
They should just focus on integrated GPUs and pick up the budget buyers who don't want to spend $200+ on a GPU.
Strange, I can't get a 6800 xt here anymore and my next card will be a 7900 xtx with a 7800x3d. Total AMD dominance
>7900 xtx with a 7800x3d
Based. God-tier combo. You can run any game maxed out with that setup and use the savings on a juicier monitor or other beefier components. Nvidiots don't know the meta.
I'm not a kid, I want to use my GPU for a privacy-focused LLM and to create deepfakes
>AMD expects to sell $4 billion of a single GPU part this year
>the best ever results for an AMD GPU
yeah it turns out they aren't wasting any more silicon on gaymers lol
>yes please give me a more expensive 4060 that is actually worse than a 3060
>look at the aislop you can make now tho
even more israeli then Intelaviv
lol. Lmao even. Nobody cares about poors who buy XX60 GPUs
Lol, even the XX90 of the current gen is gimped; it has a lower % of CUDA cores enabled than even previous XX80 Ti GPUs
>Intelaviv
nice
which AMD GPU for vidya, in the $200 range?
for windows mostly
used rx6600
used? I got one for $180 brand new. you could probably find a used 6700 at that price
They're doing great, chud. Cope
Gaming GPUs are about 2/3rds of what they make on desktop CPUs...
Wouldn't call that negligible; it's worth fighting for.
Shifting focus away from the high end toward the mid- to low range sounds like a safe strategy. Usually the lower-end chips are recycled and updated former top-end parts, or derived from the current top end, so I'm wondering how changing that pipeline will play out.
If the rumours are right AMD's next gen is promising, 7900XTX performance at 7800 prices and (hopefully) wattage.
there are rumors that the next gen will have less bandwidth than a 7800xt, i don't see it as an improvement
Can we get 6800XT perf for 200 bucks instead?
>Can we get 6800XT perf for 200 bucks instead?
Lol. Gamers have shown they'll pay anything for a gpu
4060 is kinda there with Frame Gen.
>down 48%
holy shit
considering how small their market share already was this is very impressive
nvidia will probably also see a decline
we're entering a recession, and graphics cards specifically purchased for gaming is consumer discretionary, which has been getting hammered the past two months
the drop is impressive, but people aren't spending as much as they were a year ago by a large margin
>Consumer discretionary
It’s amazing how so many people forget about this.
I’ve seen industries in discretionary spending justify price hiking their own products because of inflation.
Lol, it’s the opposite sweety.
It’s farmer chud and healthcare stacies who make off with the bag during an inflationary cycle.
Entertainment and Luxury always get crushed.
I swear if your average Hollywood gay understood how much inflation destroys their bottom line they’d all become Austrian Economists overnight.
The Radeon division is responsible for >10% of AMD's revenue with a single SOC they designed for Sony
Design once, sell millions. I don't think they're gonna divest that easy money
This. The desktop GPU market is a side hustle that keeps them up to date on the tech they'll eventually package into SOCs and sell for billions to the console manufacturers.
Aside from Nintendo, who uses nvidia and sells more units.
for a much lower price soc
Sony was about 3.6 billion in revenue for Radeon in 2023 compared to entire Tegra division revenue in 2023 of 1.45 billion and most of that is in automotive
wow it's as if purposefully gimping cards makes them a less attractive option. who knew
sorry guys, it's because i gave my $200 to Nvidia instead, because AMD doesn't make nice 75W cards with more than 4 lanes
>AMD literally can't sell their GPUs
Sure they can, all they have to do is sell at the price the product is actually worth.
What's that? They won't do that because Nvidia sells at insane markups and AMD wants some of that financially illiterate gamer money too? Then I guess they can go bankrupt and burn in hell.
Pretty much this. They used to be great budget cards now they are just slightly cheaper than Nvidia cards
this is why intel has to succeed with battlemage and beyond.
nvidia is focusing on datacenter ewaste for machine learning nonsense, and amd wants their share of that too, with crippled hardware full of bugs (Epyc CPUs that segfault after long uptimes, MI accelerators with bugged firmware you can't use, etc)
we need a company that focuses on the coomsumer market; it's a golden opportunity for intel while the other two are focused elsewhere.
I can see a world where intel dominates the consumer diy market, especially if they bring SR-IOV and other features like that.
Newsflash, the discrete GPU market is undergoing demand destruction. iGPUs are taking over what was once the mainstream discrete by being good enough.
Only a vocal minority actually cares about 300 FPS, 4K, and RT rendering. The masses are content with 720/1080p at 30 FPS without any fancy RT. Game developers and publishers make much more money via gacha and microtransactions than by sinking $100+ million into a glorified graphical tech demo.
I'd be satisfied with stable 1080p 60fps performance on games without needing a premium card, but it seems devs can't even do that.
Thank god I bought a 3080Ti but it's kind of moronic that you need at minimum that to enjoy games nowadays.
the naming scheme is fricking terrible.
I've mostly checked out of all that (I've permanently moved to a gaming laptop as my main device) and I have no idea what a BlasterX3DXTX5600 is supposed to be.
I know what a 4090 is though.
The naming scheme is actually not that bad; there are just too many different SKUs in each CPU tier. There are 6 different 5600 SKUs out there, and everything could be simplified to just 2 or 3.
There are almost always two digits wasted being 00.
The XT and XTX could be xx60 and xx80, but then it might sound too similar to Nvidiot's lineup, idk.
>first digit = generation
>the rest of the digits = power level
>sometimes xt added to further differentiate the power level, just like ti or super
The naming scheme is the same as Nvidia or Intel, are you moronic?
>The naming scheme is the same as Nvidia
indeed it's no different than a 4070 Super Ti
>first digit = generation
>the rest of the digits = power level
>super and ti added to further differentiate the power level
Yep
Both Nvidia and AMD GPUs have a similar naming scheme:
Nvidia:
3090Ti
3 = generation
9 = tier
Ti = sub-tier
AMD:
7900XT
7 = generation
9 = tier
XT = sub-tier
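As a toy sketch, that shared pattern can even be parsed mechanically. This is purely illustrative code (an assumption for the sake of the point, not any vendor's actual scheme), and plenty of real SKUs don't fit it:

```python
import re

# One digit of generation, one digit of tier, padding zeros, optional
# sub-tier suffix -- the breakdown described above. Real lineups like the
# GTX 16 series or the GRE break this simple scheme.
NVIDIA = re.compile(r"(?P<gen>\d)0(?P<tier>\d)0\s*(?P<sub>Ti|Super)?$")
AMD = re.compile(r"(?P<gen>\d)(?P<tier>\d)00\s*(?P<sub>XTX|XT|GRE)?$")

def parse(name: str):
    for pattern in (NVIDIA, AMD):
        m = pattern.match(name)
        if m:
            return (int(m["gen"]), int(m["tier"]), m["sub"])
    return None  # doesn't fit the simple scheme (e.g. "1650")

parse("3090Ti")  # (3, 9, 'Ti')
parse("7900XT")  # (7, 9, 'XT')
```

Note that "1650" falls through to None, which is exactly the Turing weirdness complained about below it.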
GTX 1650
Turing was a weird era. 16 and 20 are both the same generation but they wanted a clearer indicator than GTX/RTX branding that the 16 series didn't have RT functionality.
AMDs numbers are bigger so they are better than Nvidia.
Well maybe they should just price their GPUs reasonably and massively undercut Nvidia. But I guess Lisa's price colluding agreement with her cousin wont allow that.
Maybe RTG would do better if AMD gave it more than $21.37 of funding per year
Their cards are a bit too expensive, I think. They're "cheaper" than the very expensive cards NVIDIA sells, but they come with a bunch of asterisks, and they're not cheap enough to be amazing value despite their shortcomings. They could just... reduce the price, and sales would definitely go up because they'd have unbeatable value.

They launched the 7900 XTX at a $1000 MSRP, or even more in other places (1100EUR+ in the EU, for instance). That is still a fricking expensive video card. NVIDIA released an even more expensive 4080 at $1200 MSRP, but the 7900 XTX comes with concessions: shit RT performance, a worse alternative to DLSS, a worse video encoder, a worse and less supported GPGPU stack, and so on.

That is not acceptable in a $1000 product. Nobody opens their wallet ready to spend $1000 on a graphics card and at the same time accepts all the lacking performance and features compared to the $1200 alternative from the competition. The price is just too high for a product that cannot actually go toe to toe with the competitor in every scenario; that isn't a price point that excuses severe performance weaknesses (RT) or missing features.
AMD cards are very competitive in standard rasterised graphics, but Nvidia hurts them on features: DLSS has a bit of an edge even if FSR is catching up, and they are generations behind in ray tracing.
And despite hardly anyone apparently using raytracing the RT benchmarks make AMD look really, really bad.
Yes, that is exactly the problem, they are only competitive in rasterization, in RT they are woefully behind and they're still behind in features even though it's not as bad. That's not good enough on a $1000 product. I'm convinced people could happily accept that at a much lower price. $1000 still feels like a premium even if the competition is even more expensive and people aren't going to be understanding of compromises like the modern Radeon cards have when buying a premium product.
Ideal price structure for AMD right now would be:
x600 - $225
x700 - $300
x800 - $400
x900 - $650
It's basically the only way they can survive
>the only way for them to survive is losing money on every product they sell
>the only way for them to survive is having the consumer subsidizing their inability to put out a product that can actually compete
If they are losing money selling an objectively worse card at a lower price they can either offer a better product or an actual lower price.
As opposed to NOT sell their products?
they are competitive in the only thing that matters
dlss, frame generation, and rtx are all gay meme shit for morons
The encoder is actually OK rn. IIRC even H264 is usable at this point
Their encoder is shit, H265 is the current standard, YT and Twitch are going to support AV1.
Meanwhile on current AMD drivers
https://www.pcgamesn.com/amd/adrenalin-24-4-1-gpu-driver-helldivers-2
Fricking bugs on games released months ago, hell even their fix for Helldivers 2 isn't a full fix.
Youtube already has AV1 support and some videos use it, it works just fine with AMD on Fedora 40.
I literally posted AMD's own known issues notes and it's the encoder, not decoder.
Gay tracing is moronic and FSR is better than DLSS
>Gay tracing is moronic
yes
>and FSR is better than DLSS
lolno
PCGAMESN is literally a shitpost website
bump
IQfy bros I like that Radeon is cheaper but aren't Cuda cores really useful, I remember a few times getting locked out of software for not having Cuda cores and shit made me sad
It's not that much cheaper. They price fix with Nvidia. Nvidia can price gouge because Radeon are a joke.
Ever since 4800 series and Phenom and AMD buying ATI they have constantly alternated between CPUs being bad and AMD being carried by graphics cards, or graphics cards being bad and AMD being carried by CPUs.
I'm almost certain if AMD didn't buy ATI, ATI would have released their shit generation (the 400 series when all they had was a mid range card for like $250) and died, and AMD would have released FX/Bulldozer and died.
yeah bro, 2000 dollar GPUs are still too cheap, I'm thinking 4000 or 5000
fricking moron
Zoomers prefer marketing and gimmicks over functionality.
The only functionality of Radeon is rasterization. How does that help me with other tasks? ROCm/HIP is a fricking joke; I've tried it with a 6700 XT and a 7900 XTX just to get disappointed. I'm seeing more reasons to buy Arc instead of Radeon, even with its current flaws.
Don't you mean functoomer?
AMD's pricing just doesn't make any sense and they always change the prices
Our tech forum had to ban PCGamesN because its SEO spam.
All computer parts are getting autistic tbqh go back to basics man. Ford didn’t randomly change the Mustang to the rqx 45690 with rhi rasterraptor anti corriisive ai gel. It’s still just Mustang.
Should have just named stuff Radeon Target Radeon Pointmaster Radeon Throttle go with the on blank theme but don’t just name everything numbers and letters and crap
I bet it has nothing to do with Lisa Su being related to Jensen Huang.
I think this is a fair point. AMD GPUs started to suck with her. I have serious doubts they are actually, seriously competing against each other.
Instead of killing it off, they should document the hardware and allow direct access. People would flock to whichever manufacturer did this, graphics apis suck.
I miss ATI
Hey AMD, if you're listening - make a passively cooled GPU, I'll buy 20 of them, 3 for my household and 17 for my friends
They're already going this route via iGPUs. Discrete GPUs are regressing back to a professional/hobbyist-only niche.
>Apple iGPU
I don't wanna hear it. Discrete passive GPU, or nVidia it is. I heard they got a nice passive 3050.
>Discrete GPUs are regressing back to their professional/hobbyist only niche.
I don't care and I don't wanna hear it
Once red you will never go red again, the only thing they are good at is making decent CPU's but I bet even Nvidia could make them better if they cared.
Nvidia tried twice and failed with both Denver and Carmel
Designed a core worse even than ARM so they decided to buy ARM and failed at that too. Now they license bog standard ARM Neoverse V2 CPU cores and try to differentiate with marketing
>Once red you will never go red again
Speak for yourself homosexual.
After 1080-2080Ti having a 7900 XTX is such a pleasure.
you know what has an Nvidia CPU? the Nintendo Switch, and its absolute shit
You're moronic and so is that article. AMD have been gaining market share in terms of GPU sales over the past year. Nobody gives a shit about moronic gaymers. AMD are selling their GPUs elsewhere.
Are you lads ready for $2899 RTX 5090 and $1899 RTX 5080?
I'm literally using a 4080 and I think the problem is just the pricing.
RT is a fricking meme still. It's the bit you flip on your games to make them look worse and run worse. It fricking sucks. If you're running LLMs or other AI shit on your PC you should have a nice day.
In terms of what matters for games they're not bad. FSR is fricking garbage, but all temporal AA and frame generation suck ass and you shouldn't be using them. Neither should developers.
They just cost too much. The 7900XTX at the top of the pile should be $700, max. If the 7900GRE was $350 or something like that, it'd be the easiest pick in the world over the incredibly, incredibly bad NVIDIA lineup under the 4080 series (which is also a ripoff, I owned myself getting this thing). Cut the prices or perish.
>frame gen sucks
>rt is a fricking meme still
How to immediately know somebody is lying and doesn't actually have a 4080
Frame gen sucks. It looks bad, creates graphic artifacts, and introduces input lag.
You are a moron.
Sorry you were wrong, feel free to sulk and maybe think before posting next time
Frame Gen sucks, Temporal AA sucks. Both exist to make games look worse and play worse. Some of the worst stuff to happen to the GPU space in recent years in terms of actual image output. We're reduced to smearing shit across the screen because despite having obscenely powerful CPUs and GPUs the AAA industry has ground itself into a paste and cannot complete a proper dev cycle to make games run properly on them without it.
Basically the equivalent of what happened with CPUs and Memory with normal software, where after reasonably decent 4-core processors and 8GB+ RAM became the standard for even walmart special laptops all software basically became piece of fricking shit Electron apps developed by mentally stillborn web developers and not actual programmers. There was a glorious few years of software running well until we decided to make 90% of the apps we run 200,000 javascript iframes, rendered by google's hacked up version of slowkit
Anybody who says they can see the artifacts or feel the input lag from frame gen is larping and regurgitating shit they've read by coping poorgays. Nobody who has used frame gen thinks this
RT also looks incredible in certain cases. It's useless in a lot of games but when its done right it's amazing, and once again anybody saying RT is a meme has never played the right game on good hardware
I've used framegen and I think this.
RT generally is unnoticeable or looks worse and isn't worth the performance hit. Even in Phantom Liberty, which actually was visually designed with raytracing in mind for the assets and areas looks...okay? Kinda better? The puddles look cool? Definitely worth an extra $600 of GPU and docking 40-50FPS.
Ray Reconstruction is fricking awful, it adds even more artifacts.
RT/Ray Reconstruction/Frame Gen are awesome, I love everything in motion having weird artifacts and nothing looking right
This doesn't look like the latest patch. I haven't seen this at all with it. Any artifacts are rare and minor comparatively.
>Kinda better? The puddles look cool? Definitely worth an extra $600 of GPU
This segment of your post resonates with my soul.
So what's the best bang for buck AMD GPU these days, I'm looking for something with 12-16GB VRAM
7900GRE is pretty good for what it is. You may be able to get it for $500 flat off and on.
AMD gave me a glitch free experience on bleeding edge linux so its worth the price in that department at least
>could easily undercut nvidya's prices for people who don't want to generate stable diffusion porn
>jacks their prices up anyway
AMD fricked themselves. Either they got greedy or their cards cost way too much to produce for the value.
7000 series has 1 Radeon product that doesn't waste wafer allocation from MI300 or Zen and that's the shitty rx 7600 on 6nm.
everything else they would rather make something else rather than price radeon aggressively again
>chasmic gap
https://www.islandnet.com/~egbird/dict/dict.htm
>well understood for some time
Well known, or well understood?
>Suffice to say, Radeon GPU sales have nosedived
Suffice to say (comma) this article wasted my time.
amd gpus will always be the coping poorgay's gpu
hyundai vs mercedes
They throw some heavy punches where Nvidia prices themselves out of the market. Nvidia is too busy trying to upsell you and it obviously works because nobody is buying their "budget" options.
I use SVP; the people who tell me they hate it because frame interpolation causes artifacts are usually using Radeon.
https://www.svp-team.com/
Everything they don't have, they call it a gimmick kek
Heres what I need AMD to do for me to start using them
1 - make either a GUI or CLI that allows forcing AA on Linux
they had this up till 2013. then some gay dev intentionally removed it cuz hes a fricking moron. then a bunch of people just said "use that jiminez shit which is absolutely fricking horrendous" or "just tell xorg to render at 4x the res then scale down", another garbage solution.
seriously, they cant get such a basic required feature of a GPU driver as forcing AA working. theyre fricking moronic. Meanwhile my Nvidia works perfectly on Linux, allows me to set AA, allows me to set watt limits, temp limits, set fan speed, all easily done thru scripting.
Theyre the perfect example of a company going open source purely for convenience, to get free labor from the OSS community. They do not give a frick about the ethics of it. They intentionally removed a feature because some idiots tried to use it incorrectly. Since when does FOSS remove features useful to many people to accommodate morons? Sounds like a Mac mentality
They also recently effectively disabled setting a watt limit, because it "could damage the hardware". Best you can do is 154W limit on a 164W card. Any lower is disabled "for your safety".
Surely that can be disabled or bypassed
Only by patching and recompiling the kernel.
Yeah, I used to set my 164W card to 120W for very cool and quiet operation and to prolong its lifespan. But AMD is more interested in making people damage their shit so they have to "upgrade". Somehow, overclocking, overvolting, and higher power limits are deemed safe and are enabled.
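For context, a sketch of where that limit lives on Linux, assuming the usual amdgpu hwmon sysfs layout. The card/hwmon indices are assumptions that vary per machine, so the actual (root-only) write is left commented out:

```shell
# Hedged sketch, not a supported AMD tool: the amdgpu driver exposes the
# board power cap through hwmon sysfs, in microwatts.
WATTS=120
MICROWATTS=$((WATTS * 1000000))   # power1_cap is written in microwatts
echo "target cap: ${MICROWATTS} uW"
# Requires root. The driver clamps writes between power1_cap_min and
# power1_cap_max -- that min is exactly the ~154 W floor complained about:
# echo "$MICROWATTS" > /sys/class/drm/card0/device/hwmon/hwmon0/power1_cap
```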
For the first time in years I'm not sure my bext gpu is going to be an AMD one...
>For the first time in years I'm not sure my bext gpu is going to be an AMD one...
I was so happy in 2012-2013 with AMD. Performance on the open source driver was fine, and it let me force AA. But since 2013 Ive been Nvidia and have not had a single issue other than being forced to use specific kernels. I tried an AMD once since 2013, but forcing AA went from being difficult (reverting the dumb patch) to being essentially impossible. So I had hoped for improvements to the situation or at least be able to still revert the patch, only to find out things got way worse.
Seems like Nvidia is starting to open up their driver too, so even the open source aspect of AMD is soon going to be no longer a benefit over Nvidia
Nvidia isn't opening their driver. But explicit sync will make nvidia usable on wayland.
absolutely atrocious. I am currently using a 1080 Ti, which I think is a 250W card, and I run it at 150W, and get like 80% the performance. Absolutely worth it for me and I have no idea how this could be less safe. It is literally more safe. I have an aggressive fan speed script which adjusts based on temperature. I idle at like 45C, and rarely hit 60C. Theres so little temp cycling on my GPU its amazing
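A fan script like the one described can be as simple as a piecewise-linear curve. A hypothetical sketch (the hwmon read/write is omitted because sysfs paths differ per machine and driver, and the curve points are made up):

```python
def fan_pwm(temp_c: float) -> int:
    """Map GPU temperature (C) to a PWM duty cycle (0-255) by linear
    interpolation between curve points. Curve values are illustrative."""
    points = [(45, 64), (60, 128), (80, 255)]  # (temp C, pwm)
    if temp_c <= points[0][0]:
        return points[0][1]   # idle floor below the first point
    if temp_c >= points[-1][0]:
        return points[-1][1]  # pin to full speed above the last point
    for (t0, p0), (t1, p1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return round(p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0))

# On a real system the temperature would come from something like
# /sys/class/hwmon/hwmonN/temp1_input (millidegrees C) and the result
# written to pwmN -- those paths are assumptions and vary per driver.
```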
>1 - make either a GUI or CLI that allows forcing AA on Linux
xrandr --output DisplayPort-0 --scale 2x2
Forces the only good AA type (SSAA).
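Spelled out, that trick renders the desktop at a multiple of the native mode and lets RandR downscale on scanout. The output name and base mode below are assumptions; check your own `xrandr` output:

```shell
# Supersampling via RandR: render at FACTOR x the native mode, downscale
# on scanout. "DisplayPort-0" and 1920x1080 are assumptions.
W=1920; H=1080; FACTOR=2
echo "framebuffer: $((W * FACTOR))x$((H * FACTOR)) -> displayed at ${W}x${H}"
# X11 only (Wayland compositors ignore xrandr transforms):
# xrandr --output DisplayPort-0 --scale ${FACTOR}x${FACTOR}
```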
power consumption is more important to the greater part of the world
AMD doesn't have anything power-efficient GPU-wise except APUs, and they need to market this better, because APUs will replace dedicated GPUs for 99% of the population in the next couple of years.
So... with Hawk Point and Strix Point etc. I expect AMD to improve their efficiency game and end up everywhere; the bottom line is they'll take out Nvidia's role in everything but server AI and dev super rigs. Even in consumer AI, AMD will take out NVDA just on the basis of market proliferation (sorry Taiwan, AMD is selling out to China too)
AMD should open sores their hardware
>raytracing is a meme
And yet between the 7900xtx and 4080 I know which one will still be on charts and used for benchmark comparisons in three years.
It is a meme because the vast majority of the genres people play don't really take advantage of it, and it rapidly becomes a short-lived novelty that gets disabled by frames-per-second and frametime junkies.
it's a meme
>OH MY SCIENCE!
>THAT PUDDLE OF WATER IS ACCURATELY REFLECTING LIGHT BACK AT ME?!
>OH MY GOD I'M GOING INSAAAAAAAAAAAAAAAAAANE
it's basically unnoticeable unless you're really looking. baked lighting is just more tractable unless your scene is like a gorillion reflective mirror-like surfaces and you got a lot of dynamic wub wub lighting or some shit.
>AMD should legit just kill the Radeon division or spin them off into their own company to save their own financials
This has to be the most moronic fricking thread of the year on this website. It's like I'm actually surrounded by mouth breathing children.
they made 2 good gpus that are basically a slightly better 6800xt and the rest was overpriced garbage like nvidia but without their gimmicks
>the low to mid-range gpu manufacturer wants to sell only high-end except their high-end can't compete
>crypto makes them get wienery
>"Hey, what if we just stop making low to mid-range cards? Then people would have no choice BUT to buy them!"
>crypto crashes
>low to mid-range sales nosedive
why are corporations literally moronic.
> Apple dumps Intel CPUs
> Apple also dumps AMD dGPUs
> best GPUs out there are from Nvidia and Apple
> best CPUs are from Apple and AMD
> Nvidia will dominate the CPU market soon though
it's over for Intel, but AMD still has hope since they're pretty close to Nvidia in raw GPU performance, but they need to catch up with their software and AI side.
>Apple
>best at anything beside battery life
The M3 Max GPU is neck and neck with the mobile 4090, and it wins on AI-related workloads because of the massive VRAM available
lpddr5 is slower than gddr6x
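Per-transfer, sure, but bus width matters as much as memory type. A back-of-the-envelope check, using commonly cited spec figures that are assumptions here, not measurements:

```python
def peak_bandwidth_gbs(bus_bits: int, gt_per_s: float) -> float:
    """Theoretical peak bandwidth: bus width in bytes x transfer rate."""
    return bus_bits / 8 * gt_per_s

# Assumed configs (illustrative): a very wide LPDDR5 bus narrows the gap
# to a conventional-width GDDR6 one, without closing it.
m3_max = peak_bandwidth_gbs(512, 6.4)        # ~410 GB/s
mobile_4090 = peak_bandwidth_gbs(256, 18.0)  # ~576 GB/s
```

Which is why the AI-workload comparison above hinges on total capacity and bandwidth rather than on which memory standard is on the package.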
>Wahhhhhh my Nvidia card is too expensive, AMD needs to compete so I can buy Nvidia for cheaper!!!
Based AMD abandoning PC gaming in favor of AI and console gaming.
They're only gonna sell 4B in AI accelerators this year and maybe 3B in consoles. Chump change. Gamers could give them so much more.
they should make it so you can use their GPUs for AI if they are so desperate
AMD's entire range of offerings is $100-$250 overpriced. They would have decimated Nvidia this generation if they weren't so greedy (just like Nvidia).
If only they discounted their Radeon GPUs gamers would go wild for them and not wait for Nvidia to discount their products by $100-200 too.
>claim you're going to compete on margins instead of volume
>do neither
They really need to fix their install base by discounting before it's all over.
margin's still up
it's almost like finance homosexuals know more than gaymers wanting to buy discounted Nvidia parts
I think they have to get FSR up to par with DLSS
both look awful
FSR is not competition for DLSS but for NIS, Nvidia's upscaler without AI algorithms. NIS is still better than FSR, and its last update is from 2022. AMD can't compete even with that yet; they're a joke.
https://github.com/NVIDIAGameWorks/NVIDIAImageScaling/releases
how could it ever be competition though? FSR is also available on every Nvidia card anyway, so Nvidia wins by default with DLSS
Even XeSS is better than FSR now, Radeon software is a joke
I use my GPU for AI tasks, I use an nVIDIA GPU right now because CUDA is faster for what i do. I will buy an Intel, MediaTek, Apple, AMD, whatever GPU if it gives me better performance for less money. But what I truly want is dedicated AI accelerator hardware.
I want every single brand loyalist and gaymer poster to be shot
them youtubers really tried to tell us the 7800XT was a great card and a great deal
then like usual the Radeon morons release the 7900GRE and make the youtubers look like morons because it's basically the same price. AMD's releasing and pricing strategy for this entire generation has been a mess and that's saying a lot because NVIDIA's has hardly been stellar
>7900GRE
You have only yourself to blame, or your govt.
7900GRE only exists because of muh evil China export bans
What? It isn't like the 4090D. They can't do AI shit and aren't on export control lists. AMD created it for its own regional segmenting. Just like the old Ryzen 3500X or the current 7500F. Muh evil gubmint has nothing to do with it
>News from the actual store week ago
AMD is selling better than Nvidia
>News from who gives a frick
AMD can't sell shit
>>News from a specialty german store that stocks 80% AMD because it's cheaper from some homosexual on twitter
AMD is selling better than Nvidia
>News from AMD's own financial reporting
AMD can't sell shit
>AMD should kill Radeon
KYS
when someone like geohot calls you out for being moronic, you know you must be a really big moron.
>PCGamesN
This thread didnt get deleted for spam?
People actually seriously discussed their content here?
We have banned that place on multiple reputable tech sites.
YOU are all fricking stupid beyond belief.
https://www.amd.com/en/newsroom/press-releases/2024-4-30-amd-reports-first-quarter-2024-financial-results.html
this isn't IQfy buddy
Anon we also banned Currytechforums on Blind yet you morons still discuss them here.
Who cares what you indians do
go talk about labja malak from your approved sources list
wow, imagine that gaming revenue is lower at both AMD and Nvidia. Maybe something called an embargo has been part of the reason?
which embedded devices use amd gpus?
consoles
see
sub 80 iq take
if aymmd cans their gpus, ngreedia is gonna sell theirs starting at 2k usd
>if aymmd cans their gpus, ngreedia is gonna sell theirs starting at 2k usd
And yet Nvidia keeps massively hiking their prices every new gen and what AMD does? Also hikes their prices massively.
It's almost like AMD is so irrelevant that they don't really play any role in regulating Nvidia's prices anymore
Nvidia just proved how fricking weak and cucked gamers are and AMD is picking the same fruits, they'd be moronic not to
This, it's the same deal as when scalpers were running rampant. It's scummy behavior to sell something for way more than it's worth, but it's infinitely WORSE to line the pockets of those scummy salesmen because you want to brag online about how your life is totally changed by a 5% performance increase that's only actually noticeable in like 2 games. I hate Nvidia and AMD for trying to fleece me, but I hate the people paying them to do it even more.
AMD is unstable, inferior technology for neural-network inference. Even though many libraries support some form of inference with it, it's risky to build on top of because of the lack of support.
I don't care about ray tracing, I don't care about AAA games, and I don't care about AI generation projects. Please, tell me why I should go with Nvidia? Thank you.
in my experience the Navi GPUs have a sometimes broken DX9 and earlier implementation for old games
which can often be worked around by using dxvk.
this doesn't really apply if you try to play old games on linux, that seems to work better
>I don't care about Ray Tracing,I don't care about AAA games and I don't care about AI generating projects.
Then why do you even need to own a gpu?
The $150 3070ti I got off ebay works ok.
Make a 1080p / card that's cheap and cheerful. Work out your BOM, add on manufacturing costs, distribution, etc., add on a fair margin for profit and stop trying to have people's fricking eyes out. Aim for low and the mid tier market. Why throw money away on R&D when you can't compete.
Doubt the R&D cost for that card would be any lower. And with high end shit, you can at least reuse fricked parts for the low-end.
That's what I mean, stop the high-end R&D. I've got Nvidia 1660 and 2060 cards, and as I don't have 4K monitors, I'd pay £300 for a decent 1080p card. Frick, both cards will probably do me for a few more years.
But high-end R&D trickles down. Low-end shit probably costs as much to develop, but you also have much less room to cover for it.
If your high end GIGADILDO 7999 Plus Ultra doesn't sell well, you still got GIGADILDO 7800 and 7650. If your TINYDILDO 7500 doesn't sell amazingly well, you're fricked and stuck with all of the costs.
>AMD should legit just kill the Radeon division
instead of giving up, why not just make a modular cube system, where you can add as many cubes as you want, meaning you have a cpu cube and a gpu cube, but also addon cubes like vram for the gpu. All working together through magic.
Modularity is inefficient.
>Modularity is inefficient.
what if the cubes were forged in the large hadron collider?
I have my doubts it'd be of any help.
imagine if particles from beyond our reality were infused into the very fabric of computing hardware
kek. there's a couple of theories that would support that
We've already known about this for years. See
and similar reports.
The only reason I buy AMD is the fact that I use Linux and don't want to deal with the struggle that comes with Nvidia's proprietary drivers or configuring the reverse engineered, open source alternative. I also don't need CUDA for anything, and if I ever did, I'd rather wait for oneAPI.
Why do people buy NVIDIA cards so much more? In my small yurop country AMD cards are so much cheaper it's insane, like comparable AMD/NVIDIA cards on benchmarks will be at least 100 euros cheaper on AMD's side. Is NVIDIA cheaper in the US?
They are just way better at everything, while also consuming less power, making them cheaper in the long run regardless of the original price.
Mindshare and better/manipulative marketing. Nvidia has at least on paper had the same consistent naming scheme for 10 generations. It's so consistent they rely on no one noticing the x60 to x80 series shrinking every gen or using smaller dies for an entire generation like the 600 series and only releasing the big dies next gen. It's become muddied somewhat with the Ti and Super branding. I don't want to go into a rant, but there's also a lot of accusations of paid reviews in the past.
AMD only recently had the resources to invest in Radeon, and when they did have an edge over Nvidia in the Volcanic Islands era, they did a terrible job marketing their cards, and the stupid naming schemes that didn't align with architecture and all the rebadges didn't help. Who the frick knows which is better between a GRE, XT, XTX, and no suffix, and how it compares to lower tiers?
You know how Nvidia waved its hand and made boomers piss and shit themselves until its stock was ten gorillion dollars?
Yeah they used the same black magic marketing on normalgay millennials, who now brandgay for their cards.
90% of Nvidia shills online are actually Internet cafe Asians who think they're hot shit because they got to rent an Nvidia computer once.
Why don't more Chinese people use Traditional Chinese? I figured they would want to use that more than the simplified due to their Chinese pride
The Chinese have a long history of trying their hardest to kill all the Chinese. This resulted in them murdering their own language, with simplified Chinese being created specifically to stamp out the old one to support the new regime.
Think 1984, but instead of newspeak they changed the writing system, and it was in 1956.
Huh, didn't realize they tried to erase their whole language. Truly dystopian over there.
Hong Kong and Taiwan still use the old characters.
The rx6600 was more efficient, faster, and cheaper than its competition at the time, but it still catches flak from the Nvidia fans. It has worked fine for me for years, with possibly the best gnu/Linux drivers. I don't get it - my only cope is that they must use Windows or maybe they do AI crap.
It consumed less power than 3060 and probably was more efficient for raster gayming but both cards are over the top for that anyway at 1080p, while RT switched the performance picture a lot, and memes like DLSS gave 3060 even more advantages. And that's just gayming. AMD got no answer for Cuda.
Nvidiots are the epitome of brainless brand-gays coping with Stockholm syndrome
They are just mad that Nvidia and AMD RTG are no longer interested in doing a price war.
TPU has 3060 at 0.77x performance per watt of rx 6600 in gaming. This efficiency difference? Doesn't matter. TPU has the 7900 xtx at 0.91x the efficiency of 4080 Super in gaming. This efficiency difference? Does matter.
That's just how it works AMD shills. Get used to it.
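For what it's worth, the perf-per-watt figure is just average FPS divided by board power, and the relative number is one card's ratio over the other's. Quick sketch (the FPS and wattage values below are placeholders for illustration, not TPU's actual review data):

```python
# Performance-per-watt comparison, same arithmetic review sites use.
# FPS and board-power numbers are hypothetical placeholders.
def perf_per_watt(avg_fps: float, board_watts: float) -> float:
    return avg_fps / board_watts

rx_6600  = perf_per_watt(100.0, 130.0)   # hypothetical: 100 FPS at 130 W
rtx_3060 = perf_per_watt(110.0, 185.0)   # hypothetical: 110 FPS at 185 W

# efficiency of the 3060 relative to the 6600
print(round(rtx_3060 / rx_6600, 2))  # -> 0.77
```

with those made-up inputs the 3060 lands at 0.77x, i.e. raw FPS can be higher while efficiency is still worse.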
Most of the complaining about that card is because AMD fricked up again by having 100W idle draw with certain monitors. And it still isn't fixed for all monitor combinations. Some Nvidia cards had problems with this too, but it seems far less common and only impacted multi-monitor setups afaik.
>TPU has 3060 at 0.77x performance per watt of rx 6600 in gaming.
Without RT, maybe. Reviewers are actually pushing AMD by trying to make it "fair" and ignoring features NV has. Imagine they did the same shit for CPUs: "no, sorry, we don't count results with turbo mode, only stock clocks".
>we find the RX 6600 XT beating NVIDIA's RTX 3060 easily, with 14% better performance
>RTX 3060 and 3060 Ti are roughly twice as fast in raytracing due to additional hardware units. Even last generation's Turing RTX 2080 which offers similar non-RT FPS runs faster here
AMD just isn't a serious company.
RTX 3060
> Samsung 8nm
RX 6600 XT
> TSMC 7nm
also, performance per watt doesn't scale linearly
RTX 4080 Super
> TSMC 4N
RX 7900 XTX
> TSMC 6nm and TSMC 5nm
no, the rtx 4080 uses 5nm
Curious. That's not what Nvidia says in the Ada whitepaper. Weird how random losers on IQfy think they know more about Nvidia products than Nvidia.
Looks like this made the nvidia gaymers seethe lol maybe they'll look at some nice puddles at 30fps on their rtx 3060 to feel better
I guarantee that if AMD ever catches up in raytracing to Nvidia, there will be some new Nvidia exclusive meme feature used as an excuse to not buy Radeons.
Honestly good, frick AMD. Frick their shitty snakeoil drivers and frick their lack of support for ROCM
aren't AMD iGPUs better than Nvidia's iGPUs? ik Nvidia makes the better dGPU, but as far as iGPUs go, like the Radeon 780m, aren't those better than what Nvidia has to offer?
>Nvidias iGPU
lol
Nvidia GeForce MX series... morons
>GeForce MX series
trash reply from a trash person
I'd rather be trash than moronic
that's still a separate chip, so a dGPU, bigger moron
>Nvidias iGPU's
huh
>he doesn't post on IQfy from his Nintendo Shield with Nvidia iGPU
>Nintendo Shield
switch and shield use the same soc, betting they were having a bit of a giggle typing that
fair enough. i concede there are nvidia igpus and the question was legitimate
>Consoles using Radeon graphics
>Smartphones using Radeon graphics
>Radeon graphics cards are both cheaper and offer better performance than overpriced Nvidia slop
>AMD cAn'T cOmPeTe
look i'm not an amd fan.
i was during the duron/thunderbird and early k8/athlon x2 period.
they simply are unreliable, but cheap. as a student that was a big plus.
ati was a fricking master. i loved their 9700pro, probably their last good card.
meanwhile every console and most portable shit is running amd/ati, so they're fine.
intel is fricking up as well.
so as consumers we're all fricked.
if nvidia were to release a reliable x86/64 cpu, which they could, holy shit...
rx 590 owner here
>unreliable
no
a b***h and a half to get working, yes
but once they do, they're rock solid
they must have cut a deal with ngreedia or something
both ceos are cousins after all, no?
(dont mind the bg, im posting from a pos laptop rn)
Their GPUs suck, but their CPUs destroy intel's offerings at this point. And this talk of AM5 being "unreliable/unstable" is horseshit. The platform doesn't require overclocking RAM because of the better cache, but stupid kiddies do it anyway then complain of platform instability.
That or they get a crash in Linux and don't care to look up that it's literally the intel wifi driver crashing it or something because Asus did something moronic.
>but their CPUs destroy intel's offerings at this point
Bit late when ARM is taking over.
I still have hopes that Riscv starts popping off in the coming years
>two chinese companies who will never produce anything for the global market
>two dozen Indian VC funding scams and then one that Keller is working for
Any day now it'll be great
What can I say, Riscv is open sourced while Arm is not. I will always root for the open source alternative, and even if only the Chinese get to use it, I would still be happy for them.
The basic ISA is open, the various implementations people make are not. There is not a single proposed high performance RISC-V design that lacks proprietary IP blocks
Frick NoVidya GoyWorks.
Frick NoVidya GayForce.
Frick NoVidya GoySync.
NoVydia is slytherin green israelite poison. AMD is griffyndor red freedom blood.
Frick glowies. Frick nVirgins.
ngl, based
but truth be told,
aymd deserve a good kick up their arse too
The GPU market for end consumers is in a pathetic state.
Both current gen AMD and Nvidia GPUs are not worth their money.
Will Intel save it?
Nah, it is regressing back to hobbyist/professional only roots.
iGPUs are taking over the mainstream space by being good enough.
They literally dont need to.
It could literally be their pet project and ps5/xsx/steam deck/ryzen/epyc still rake in the dough.
I'm guessing they can't even get cuhrazy on their gpus since consoles and handhelds also run on radeon, and they don't want shareholders to learn radeon gpus are exploding left and right because of some wacky innovation they did.
>radeon gpus are exploding left and right
cute headcanon, woodscrew nvidiot
i like my rx 6600
>Make budget cards to win the market Nvidia neglected
>It's all pcie x8, so older machines at Gen 2/3 run bad
>Remember not rock solid driver experience with amd all my life
I just keep buying Nvidia, like come on amd stop fricking up
x8 at Gen2/3 is more than enough for lower end cards. The problem is that they're actually x4.
Gen 2 x8 is a problem. Gen 3.0 may be just fine, but Gen 2.0 defo isn't
Perhaps, but then you'd be combining Sandy Bridge era chipsets with modern GPUs, at which point it's like wtf are you even doing Black person
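rough math on why the lane count and generation matter: usable bandwidth is just the per-lane rate times the lane count. Quick sketch (the per-lane figures are the standard PCIe spec numbers after encoding overhead, 8b/10b for Gen 2 and 128b/130b for Gen 3/4):

```python
# Back-of-envelope PCIe bandwidth per direction: per-lane throughput
# (GB/s, after encoding overhead) times lane count.
PER_LANE_GBPS = {2: 0.500, 3: 0.985, 4: 1.969}

def pcie_gbps(gen: int, lanes: int) -> float:
    return PER_LANE_GBPS[gen] * lanes

print(round(pcie_gbps(2, 8), 1))  # Gen 2 x8 -> 4.0 GB/s, can pinch a modern card
print(round(pcie_gbps(3, 8), 1))  # Gen 3 x8 -> 7.9 GB/s, fine for low-end cards
print(round(pcie_gbps(3, 4), 1))  # Gen 3 x4 -> 3.9 GB/s, why x4 cards choke on old boards
```

so an x8 card on Gen 3 still gets most of what it needs, but an x4 card on an older board is down to Gen 2 x8 territory.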
I wish GPUs weren't needed.
>7900xtx will drop in price
Yesssssss AMD sucks stop buying AMD oooOOoOoOoooooo
They are stopping production of it shortly.
>monopoly without any competition good
you honestly believe this, do you? christ on a fricking cross...
It's not but AMD gave up on competing a while ago, short of NV fricking up for a few generations like intcel did, there is just no way back.
and yet they're sold out everywhere in my state and on most websites
This news is from the past; they're probably selling well now that they had to drop prices due to poor sales. Waitgays are eating good at the moment, with the 7900XT going as low as $680.
>average gamer
>building PC
>has to choose between DLSS and non DLSS with more power consumption and less stable drivers
>chooses RTX
SHOCKING